
The rise and decline of EEG: where are we now?

Electroencephalography (EEG) is a brain-state monitoring technique that allows neural activity to be recorded non-invasively. Since the first human EEG recording by Hans Berger in 1924, the technique has come into widespread use in clinical diagnosis and in the research community [1] (besides other not-so-conventional uses, see this post by Alejandro Riera). Widespread? EEG has not been equally attractive or useful throughout its history: its clinical and experimental usage reached a high point around 1970–1980. After that, the community seemed to lose interest in the method, and it was not until the start of the 2000s that the downhill trend stabilized.


In its early days, EEG analysis was based on the manual tracing and visual inspection of the oscillatory activity. Through visual inspection, researchers and clinicians had to infer diagnoses and the brain mechanisms behind population trends. But as soon as the technology allowed, the analysis and processing of EEG signals shifted towards automation. For instance, the first ink-writing amplifier was developed in 1932, and since then, tools for the automatic monitoring and analysis of neural signals have only improved. During this period, the processing power of computers also started to increase, and the use of signal processing transformations such as the Fourier or Hilbert transforms, as well as event-related potentials (introduced in this post), spread throughout the EEG community. However, researchers and clinicians soon realized that EEG is far too complex and variable to be handled in clinical environments with such rudimentary techniques. By the 1980s, experimentation with EEG had started to drop, and the automation that was expected to reach clinical procedures never materialized.
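To make the frequency-domain analysis mentioned above concrete, here is a minimal Python sketch of how band power is typically estimated from a single EEG channel using NumPy and SciPy. The signal is synthetic and the sampling rate and band limits are illustrative assumptions; with a real recording you would load one channel of data instead.

```python
# Minimal sketch: estimating alpha-band (8-12 Hz) power from one EEG channel
# in the Fourier domain. The signal below is synthetic; replace it with a
# real channel sampled at `fs` Hz.
import numpy as np
from scipy.signal import welch

fs = 250                                    # assumed sampling rate (Hz)
t = np.arange(0, 10, 1 / fs)                # 10 s of data
eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.randn(t.size)  # 10 Hz "alpha" + noise

# Welch's method gives a smoothed power spectral density (PSD) estimate.
freqs, psd = welch(eeg, fs=fs, nperseg=2 * fs)

# Sum the PSD over the alpha band (times the bin width) to get band power.
alpha = (freqs >= 8) & (freqs <= 12)
alpha_power = np.sum(psd[alpha]) * (freqs[1] - freqs[0])
print(f"Alpha band power: {alpha_power:.3f}")
```

The same approach, repeated per channel and per frequency band, yields the kind of spectral features that dominated automated EEG analysis for decades.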

While all this information can be gathered from various books and articles [1, for instance], these trends and changes of attitude towards EEG can now be quantified through computational linguistics. A particularly interesting tool for this is the Google Ngram database, which contains the frequency counts of words from books published between 1500 and 2008 across several text corpora (English, French, German…). By analyzing the English corpus, we can see that the aforementioned rise and decline of EEG is reflected in the change in bibliographical references to the technique, visualized in Figure 1. After the 1980s, EEG saw a general disinterest reflected by a large decrease in bibliographical references, which by 2008 had fallen back to the levels observed in 1968 (see Figure 1, adapted from the Google Ngram Viewer; a similar trend is observed for the keywords ‘electroencephalogram’, ‘electroencephalography’, and analogous words).

Figure 1. Prevalence of the text keywords in the Google Books database (in ‰, adapted from Google Ngram Viewer)
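These frequency curves are easy to retrieve and plot programmatically. The sketch below queries the JSON endpoint behind the Ngram Viewer, which is an undocumented interface that may change or be rate-limited; the corpus identifier and query parameters are assumptions chosen for illustration, and a manual export from the Ngram Viewer works just as well.

```python
# Minimal sketch: plotting the yearly relative frequency of "EEG" in the
# English corpus. books.google.com/ngrams/json is the undocumented endpoint
# behind the Ngram Viewer and is not an official, stable API.
import requests
import matplotlib.pyplot as plt

params = {
    "content": "EEG",
    "year_start": 1940,
    "year_end": 2008,
    "corpus": "en-2019",   # assumed corpus identifier
    "smoothing": 3,
}
resp = requests.get("https://books.google.com/ngrams/json", params=params)
resp.raise_for_status()
series = resp.json()[0]["timeseries"]   # one relative frequency per year

years = range(params["year_start"], params["year_start"] + len(series))
plt.plot(list(years), series)
plt.xlabel("Year")
plt.ylabel("Relative frequency")
plt.title("'EEG' in the Google Books English corpus")
plt.show()
```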

To understand the decrease in the prevalence of EEG in the bibliography, consider that, despite the efforts over those decades, the automation of EEG analysis during the 1970s and 1980s was largely limited to frequency-domain analysis. Machine-learning approaches were in their infancy (see Figure 1), as was the computational power available for such analysis. But this has been changing: as our mobile phones reach a computing power comparable to 2010 laptops, the computing power and methodology applied to EEG analysis evolve with them. Nowadays, the use of machine learning and computational intelligence approaches for EEG analysis is becoming prominent in the neuroscience research community (as already introduced by Aureli Soria-Frisch in this post). The methodology applied to EEG analysis is finally starting to account for the broad spatio-temporal complexity of the signals and, in turn, the complexity of the underlying brain.
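As an illustration of what such a machine-learning pipeline can look like, the sketch below trains a simple classifier on band-power-style features with scikit-learn. The feature matrix and labels are random placeholders standing in for features extracted from real recordings, and the linear support vector machine is just one of many reasonable choices.

```python
# Minimal sketch: cross-validating a classifier on per-trial EEG features.
# X is (n_trials, n_features); here it is random data standing in for
# features such as per-electrode alpha/beta band power from a real dataset.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_trials, n_features = 200, 32
X = rng.normal(size=(n_trials, n_features))   # placeholder features
y = rng.integers(0, 2, size=n_trials)         # placeholder labels (2 classes)

# Standardize features, then fit a linear support vector machine.
clf = make_pipeline(StandardScaler(), SVC(kernel="linear"))

# 5-fold cross-validated accuracy; about 0.5 is expected on random data.
scores = cross_val_score(clf, X, y, cv=5)
print(f"Mean accuracy: {scores.mean():.2f}")
```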

However, a renewed rise in the clinical use of EEG is yet to be seen, as many of these state-of-the-art methodologies do not seem to be reaching clinicians. Can we change the course of this flat presence of EEG in the bibliography? Can we extend the applicability of EEG in clinical settings?

There is still a lot of work to do, involving research and development in the medical and clinical domains, but we will be watching closely and working to see how far we can bring EEG towards benefiting clinical practice.

References:

  1. Niedermeyer, E., & da Silva, F. L. (Eds.). (2005). Electroencephalography: basic principles, clinical applications, and related fields. Lippincott Williams & Wilkins.
  2. National Clinical Guideline Centre (January 2012). The Epilepsies: The diagnosis and management of the epilepsies in adults and children in primary and secondary care (PDF). National Institute for Health and Clinical Excellence. pp. 21–28.
