HC2 Summer School 2013

The HC2 (Human Computer Confluence) project emerged from various European research initiatives devoted to fundamental and strategic research on how the emerging symbiotic relation between humans and ICT (Information and Communication Technologies) can be grounded in radically new forms of sensing, perception, interaction and understanding.

The HC2 project just organised its 2nd summer school to share scientific knowledge and experience among participants, stimulate interdisciplinary dialogue, and provide further opportunities for co-operation within the study domains of Human Computer Confluence. This 3-day event (17-19 July) was hosted in Paris by IRCAM (Institut de Recherche et de Coordination Acoustique/Musique, one of the partners of the HC2 project) and had a strong focus on audio and musical applications of HC2.

The summer school included invited lectures by experts in the field, a round-table discussion and practical workshops where participants engaged in hands-on HC2 group projects. It gathered 27 students working on 4 projects:

  • Discussion and prototyping of an EEG application for the consumer market
  • Hybrid musical instrument and Internet of Things
  • Embodied Interaction and Sound: Body and Object Sonic Interaction
  • Interface for a historified smart city

Starlab and Neuroelectrics were in charge of the first workshop, 'Discussion and prototyping of an EEG application for the consumer market'. We were lucky to have an interdisciplinary group of musicians, composers, engineers and psychologists, who decided to develop a neurofeedback application aimed at adults in which sound and EEG sonification played a crucial role.

In classical neurofeedback paradigms, the levels of the end user's cognitive states (attention, relaxation, mental workload and so on), measured from his/her brain waves, are in most cases displayed visually and/or transformed into game commands. The goal for the end user is to learn how to voluntarily alter these patterns, gaining control over them.
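
As a rough illustration of that closed loop, here is a minimal Python sketch: a simulated band-power reading stands in for the real EEG pipeline and a print statement stands in for the display, so every name and constant below is hypothetical rather than taken from any actual system.

```python
import random
import time

def read_band_power():
    # Stand-in for a real EEG band-power estimate in [0, 1]; in a real
    # system this would come from the acquisition/processing pipeline.
    return random.random()

def present_feedback(level):
    # Stand-in for the visual or auditory presentation of the level.
    print(f"feedback level: {level:.2f}")

for _ in range(20):               # a short run instead of an endless loop
    state = read_band_power()     # e.g. a relaxation or attention index
    present_feedback(state)       # the user perceives it and adapts
    time.sleep(0.25)              # update a few times per second
```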

The group members noted that in most neurofeedback/biofeedback applications the feedback the user receives is mostly visual, so tasks that require the user to close his/her eyes, such as relaxation or meditation, are difficult to train. They believe that transforming EEG patterns into complex sounds and music can deliver richer feedback that is at the same time easy to interpret. During the workshop they used the 8-channel version of Enobio. NIC (the Enobio software application) streamed the measured EEG in real time to a Simulink/Matlab application in charge of extracting two different state levels (sketched in code after the list):

  • Relaxation: measured as the power in the alpha band in the occipital area.
  • Concentration (active, busy thinking): measured as the power in the low beta band in the frontal area.
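
For a concrete picture, here is an illustrative band-power extraction in Python. The team's actual implementation was in Simulink/Matlab, and the sampling rate, window length and band edges below are assumptions for this sketch, not the workshop's exact values.

```python
import numpy as np
from scipy.signal import welch

FS = 500  # sampling rate in Hz (an assumption for this sketch)

def band_power(x, fs, lo, hi):
    """Mean power of a 1-D EEG segment between lo and hi Hz (Welch PSD)."""
    f, psd = welch(x, fs=fs, nperseg=min(len(x), 2 * fs))
    mask = (f >= lo) & (f <= hi)
    return psd[mask].mean()

def state_levels(occipital, frontal, fs=FS):
    relaxation = band_power(occipital, fs, 8, 12)    # alpha band, occipital channel
    concentration = band_power(frontal, fs, 13, 20)  # low beta band, frontal channel
    return relaxation, concentration

# Example with synthetic data: two seconds of noise per channel.
rng = np.random.default_rng(0)
occ, fro = rng.standard_normal(2 * FS), rng.standard_normal(2 * FS)
print(state_levels(occ, fro))
```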

Another Enobio channel, attached with a sticker electrode to the cheek, detected when the user was smiling. The concentration, relaxation and smiling levels were sent over TCP/IP to Max, a visual programming language for music and multimedia development.
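
The wire format is not described here, so the following is a minimal, hypothetical sketch of such a TCP link: one plain-text message per update, assuming a Max patch is listening on the given host and port. The address, port and message layout are all invented for illustration.

```python
import socket

MAX_HOST, MAX_PORT = "127.0.0.1", 7400  # assumed address/port of the Max patch

def send_levels(sock, relaxation, concentration, smiling):
    # One plain-text, semicolon-terminated message per update; the format
    # is made up for this sketch, not the workshop's actual protocol.
    msg = f"levels {relaxation:.3f} {concentration:.3f} {int(smiling)};\n"
    sock.sendall(msg.encode("ascii"))

with socket.create_connection((MAX_HOST, MAX_PORT)) as sock:
    send_levels(sock, relaxation=0.72, concentration=0.31, smiling=True)
```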

The streamed relaxation level was mapped to the volume of an 'om' sound: when a high relaxation level was detected, the volume was low, showing the user that he/she was in a state of deep relaxation; otherwise the volume was high, nudging the user to relax. The active-concentration level was used to modify the pitch of a predefined song, so the user could tell whether he/she was gaining or losing concentration just by listening to the changing pitch. And just for fun, a drop-falling sound was played whenever the user smiled. The audio feedback could be played independently for each extracted state or all together.
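
Put as code, the three mappings might look like the Python sketch below. The inverse-volume idea, the pitch ratio and the smile trigger follow the description above, but the curve shapes and constants are assumptions, since the real mappings lived in the Max patch.

```python
def om_volume(relaxation):
    # More relaxation -> quieter 'om' drone; less -> louder (inverse mapping).
    return max(0.0, min(1.0, 1.0 - relaxation))

def song_pitch_ratio(concentration, depth=0.5):
    # Map a concentration level in [0, 1] to a playback-rate ratio around 1.0,
    # so rising concentration is heard as rising pitch.
    return 1.0 + depth * (concentration - 0.5)

def smile_event(smiling):
    # Trigger the drop-falling sample whenever a smile is detected.
    return "play drop-sample" if smiling else None

print(om_volume(0.8), song_pitch_ratio(0.9), smile_event(True))
```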

The project was successfully presented at the end of the summer school, including a demo of the developed prototype. Congratulations to the team for such wonderful work!
