neuroelectrics

Audio neurofeedback

Today, instead of writing about something I’ve worked on or something I already know, I want to do something different: I want to learn about something together with you, the readers. There’s a particular application I don’t know much about, and I’d like to learn more about it. I thought that writing down what I find while actually searching would be useful to others as well as to me. Of course, there is a bit of a delay between me finding things and you reading about them, but that’s only because of the publishing lag and because, along the way, I try to filter out any findings that don’t hold up.

Anyway, let’s get down to business. At the last HC2 summer school, David and I were in charge of tutoring one of the working groups. Our mission was purely one of support: the participants would choose which application they wanted to develop, and we would provide help, suggestions and guidance. Our group was supposed to use the Enobio EEG sensor, and, taking up the gauntlet thrown by my last post, they decided to develop an auditory feedback application. At that point I realized that I knew very little about auditory neurofeedback, and I wanted to know more: whether there actually are people doing it, the advantages of auditory feedback versus the more common visual feedback, which applications it suits, and so on.

Photo credits: Wikimedia Commons

Sometimes the motivation for using auditory feedback is that, for a given application, other feedback modalities are undesirable or simply unavailable. For example, there are scenarios in which the subject should keep their eyes closed; in that case, the usual visual feedback is not an option. In other scenarios, visual input is deliberately excluded because it is believed to interfere with the subject’s response or attention. Published works that chose auditory feedback because of such constraints tend to focus on the lower-frequency bands, like alpha or theta: those bands show higher power while subjects keep their eyes closed, so if you want to monitor them well, you should discard visual feedback. In this work, Saxby and Peniston present an auditory neurofeedback training based on alpha/theta monitoring to treat alcoholism. In this other work, McKnight and Fehmi present an auditory neurofeedback training based on the synchronization of alpha bands across different areas to improve attention. Last but not least, in this work, Butnik presents an application to treat ADHD using auditory feedback.
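None of the works cited above publish their processing pipelines here, so just to make the idea concrete, here is a minimal, hypothetical Python sketch of what an alpha-band auditory feedback loop could look like: estimate the power of an EEG epoch in the alpha band (8-12 Hz, an assumed definition) with the Goertzel algorithm, then map that power to the pitch of a feedback tone. The band limits, the choice of Goertzel, and the power-to-pitch mapping are all my illustrative assumptions, not details taken from those papers.

```python
import math

def goertzel_power(samples, fs, freq):
    # Power of the signal at the DFT bin nearest `freq` (Goertzel algorithm).
    n = len(samples)
    k = round(n * freq / fs)
    w = 2.0 * math.pi * k / n
    coeff = 2.0 * math.cos(w)
    s_prev = s_prev2 = 0.0
    for x in samples:
        s = x + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    return s_prev2 ** 2 + s_prev ** 2 - coeff * s_prev * s_prev2

def band_power(samples, fs, lo_hz, hi_hz):
    # Total power over the band, sampled in 1 Hz steps.
    return sum(goertzel_power(samples, fs, f) for f in range(lo_hz, hi_hz + 1))

def feedback_pitch(alpha_power, max_power, base_hz=220.0, span_hz=440.0):
    # Illustrative mapping: more alpha power -> higher feedback tone.
    level = max(0.0, min(alpha_power / max_power, 1.0))
    return base_hz + span_hz * level

# Demo on a synthetic one-second "EEG" epoch: a pure 10 Hz (alpha) sine.
fs = 256
epoch = [math.sin(2 * math.pi * 10 * t / fs) for t in range(fs)]
alpha = band_power(epoch, fs, 8, 12)    # assumed alpha band: 8-12 Hz
beta = band_power(epoch, fs, 18, 25)    # assumed beta band: 18-25 Hz
```

A real system would of course read epochs from the EEG amplifier in a loop, smooth the power estimate over time, and synthesize the tone continuously; the point here is only the power-to-sound mapping at the core of such protocols.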

In other cases, auditory feedback is chosen not because other kinds of feedback cannot be used, but because its own characteristics make it specifically desirable: the human auditory sense performs very well in the background while we carry out another task, and it also has very good frequency resolution. This is the case of the work of Vernon et al., who present a protocol to improve cognitive performance while reading; there, auditory feedback is a great option because visual tasks and auditory inputs combine well in the human brain.

Photo credits: Xanetia

With these published works on auditory feedback, I think it is safe to conclude that audio neurofeedback is a reality, and an interesting field within the area of neurofeedback applications. Indeed, some works developed as early as the 1970s and 1980s already showed the convenience of auditory feedback. Together with modern examples like the ones discussed above, audio neurofeedback has proved itself (at least to me!) a promising area worth keeping an eye on.

Do you know of any other audio neurofeedback applications? Do you want to share them with me and the rest of the readers? This is just getting started!
