A couple of weeks ago I took part in a neuro-unconference event for Brain Awareness Week called “Hackers Cerebrals” (“Brain Hackers”). The idea was to talk about control, or loss of control, from the point of view of what neuroscience tells us.
Usually when we talk about this we make a point of saying that of course our technology cannot be used for “mind control” (some even feel the need to mention it in their FAQ), and this is true in the sense that most people mean. But what about in a more general sense?
We accept that recreational drugs and pharmacological treatments can affect our self-control; see a recent article by Paul Bloom on this. He mentions how levodopa for Parkinson’s can lead to compulsive gambling, and how Rohypnol, a sedative, has hypnotic and amnesiac effects. And of course there’s alcohol, which has long been known to reduce inhibition and increase risk-taking.
Now, if we look at how transcranial current stimulation (tCS) is being explored as a potential intervention for depression, addiction, cognitive impairment and other disorders of the mind, it’s not a huge leap to think of it in the same way as the examples above. As we learn how to stimulate the brain to, for example, treat depression, we are modulating brain activity in qualitatively much the same way as we are with levodopa or other medications. Maybe we are not so far from, say, controlling our moods (emotion regulation), as suggested by a recent paper from Widge & Moritz. The idea there, demonstrated in rats and therefore still very far from applications in humans, is that we could design a closed-loop BCI–tCS system that allows its owner to self-regulate brain function and therefore emotional state. Interestingly, while you “take” control of your emotional state you are accepting that it can in fact be controlled, and that you are essentially at the mercy of hidden processes.
Bloom’s article is actually about reason and whether or not we are in control of our decision-making processes or, as some have suggested, are simply good at rationalising after the fact: that we are “biochemical puppets”.
An important implication of this is the notion of moral responsibility, and this seemed to be very much on people’s minds at Hackers Cerebrals. If you can alter someone’s mental state so easily, or even potentially “fix” it when it has gone wrong, does this somehow reduce their moral responsibility for their actions?