Dr. Ramses Alcaide: Co-founder of Neurable, Bringing Neurotech to Headphones – Ep. 52

Who, What, Why?

Dr. Ramses Alcaide is the co-founder and CEO of Neurable, a company creating wearable brain control interfaces.

Their flagship product is a pair of headphones that responds to your brainwaves while you wear them like any other pair. Suffice it to say, few people out there are as close to the bleeding edge of what’s to come as Dr. Alcaide, so today we talk about how he was able to build his company, follow his passion, find his “why”, and shape the future of human technology along the way.

Full Audio Conversation:

Neurable Links:

If you enjoy the show, please rate it 5 stars on Apple Podcasts, subscribe, and leave a nice review!

EPISODE HIGHLIGHTS & RESEARCH NOTES:

3:14 – “How do we bring that level of capability that you typically see in the laboratory into an everyday system, leveraging that IP that we created at the University of Michigan? Building upon it over the last 10 years, we’ve been able to get to just that, essentially.”

7:35 – “And as I saw this gap in really understanding the brain data and what we could do there, that’s when I said like, I really need to get into neuroscience. I need to work with some of the brightest minds.”

11:36 – “There was this one kid that we were working with, and it wasn’t until he was 15 that they realized he was actually fully cognitively capable. But now he had missed all the developmental milestones that a child goes through in development. So there is no way to make him a functional member of society. And so how many of these children are we missing out on their potential because we’re not able to understand them, right?”

16:09 – “So what if we could just use the signal processing pipeline to boost up some of those signals? What could we achieve? And so in 2015, we spun the company out from the University of Michigan and we started Neurable, and that’s where the company really began.”

19:54 – “So Neuralink does require surgery. And the way that we kind of see Neuralink versus what we’re doing at Neurable is the difference between getting a prosthetic leg and getting a pair of shoes.”

22:49 – “I can’t go into that yet, because we’re going to be doing a big reveal on it. But it’s just as simple as using Siri, but without having to essentially say anything out loud. Which is really cool, because as you actually think or do these commands, your brain activates your entire self, and we’re able to pick that up in secondary ways and use that as a control modality.”

25:24 – “I foresee brain computer interfaces becoming mainstream. So right now, millions of hearables are being sold every single day. Hearables being earbuds, headphones, etc. And just like Bluetooth is in every one of those headphones, I believe that neurotechnology is going to be in every one of those devices in the near future.”

28:53 – “I know how well you’re going to react to this test or this reaction time task before you even know. And reaction time is tied directly to attention. So the more attentive you are, the better your reaction time is going to be.”
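
To make the attention-and-reaction-time idea concrete, here is a minimal, hypothetical sketch: it computes a crude attention proxy (a beta/theta band-power ratio, one common EEG heuristic) from simulated data and correlates it with reaction times. This is purely illustrative and is not Neurable’s actual pipeline; the sample rate, band edges, and data are all assumptions.

```python
# Hypothetical sketch: relate an EEG-derived attention proxy to reaction time.
# NOT Neurable's method; beta/theta ratio is just one common heuristic.
import numpy as np
from scipy.signal import welch
from scipy.stats import pearsonr

FS = 250  # assumed sample rate in Hz

def attention_index(eeg_window):
    """Return a simple attention proxy: mean beta power / mean theta power."""
    freqs, psd = welch(eeg_window, fs=FS, nperseg=FS * 2)
    theta = psd[(freqs >= 4) & (freqs < 8)].mean()
    beta = psd[(freqs >= 13) & (freqs < 30)].mean()
    return beta / theta

# Simulated data: 50 trials of 2-second single-channel EEG plus reaction times.
rng = np.random.default_rng(0)
windows = rng.standard_normal((50, FS * 2))
reaction_times = rng.uniform(0.2, 0.6, size=50)

scores = np.array([attention_index(w) for w in windows])
r, p = pearsonr(scores, reaction_times)
print(f"attention index vs. reaction time: r={r:.2f}, p={p:.3f}")
```

On real recordings, a negative correlation (higher attention, faster responses) would be the pattern the quote describes.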

32:40 – “And then we went to Boston, and in two days we raised our first $2 million.”

41:18 – “We foresee a future where in the next couple of years, the next version of headphones that you wear will have neurotech inside them. And then from there our goal is, how do we make that software as seamless as possible? How do we remove any type of calibration step? How do you just grab the headphones and put them on and then they just work, right? And we’re really close to getting there.”

50:56 – “It was all worth it. Especially for the team that we have here. Like, they’re just so incredible. They’re really the ones to give credit to everything at this point. You know, it’s like, yeah, it started out with this mission for me as a child. It went to the tech we built at the University of Michigan, but everything has evolved so much in the last five years since we built the team out—they’re wholly responsible.”

55:58 – “Your parents might tell you to go do something else… don’t listen to that. Just find what it is that makes you happy. There’s not enough time. Life is both too long and too short for you to do something you don’t want.”

RESEARCH NOTES ON NEUROTECHNOLOGY:

  1. Potentials of neurotechnologies

https://brain.ieee.org/topics/neurotechnologies-the-next-technology-frontier/

The greatest potential of neurotechnologies is in their ability to alleviate human suffering through enabling better treatments for mental and neurological disorders, movement disorders and sensory disorders. Innumerable people could benefit from treatments for as-yet unsolved neurological disorders like Alzheimer’s Disease and multiple sclerosis, as well as psychiatric disorders like bipolar disorder and phobias. 

Beyond medical applications, neurotechnologies have the potential to elevate human experience and functioning in other ways. For example, these technologies could enhance human learning ability, boost physical performance, and enable efficiencies like brain-controlled devices.

  2. Can be used for treatment of various brain disorders (e.g. Parkinson’s Disease, Alzheimer’s, epilepsy, major depressive disorder) but might have risks

https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5733340/

Neurotechnology-based interference with brain activity can be very effective, allowing for successful treatment of brain disorders. This approach complements traditional (mostly pharmaceutical) treatment methods, and it often leads to a substantial improvement in quality of life. However, one has to understand that these interventions change the brain and its functions—either as a desired result of therapy, or as an unwanted side effect. In extreme cases, interventions in the brain can transiently or irreversibly alter a patient’s personality and character. 

  3. Clinical applications of neurotechnologies offer alternatives to pharmaceutical approaches and devices for diseases that have been fatal.

https://link.springer.com/article/10.1007/s12152-019-09406-7

If the natural sensors of the human body are degenerated by disease or decoupled by traumatic injury, muscles or organs no longer receive neural input. Why not use technical electrical signals to write information into the nervous system? If diseases of the nervous system or other organs lead to changes in the neural information flow, why not overwrite these “wrong” patterns with ones that alleviate the symptoms of those diseases?

  4. A fundamental tool for going beyond human limits – Davide Valeriani, senior research officer at University of Essex’s BCI-NE Lab

https://www.theguardian.com/technology/2018/jan/01/elon-musk-neurotechnology-human-enhancement-brain-computer-interfaces

“Everyone who makes decisions wears the EEG cap, which is part of a BCI, a tool to help measure EEG activity … it measures electrical activity to gather patterns associated with confident or non-confident decisions. We train the BCI – the computer basically – by asking people to make decisions without knowing the answer and then tell the machine, ‘Look, in this case we know the decision made by the user is correct, so associate those patterns to confident decisions’ – as we know that confidence is related to probability of being correct. So during training the machine knows which answers were correct and which ones were not. The user doesn’t know all the time.

“I hope more resources will be put into supporting this very promising area of research. BCIs are not only an invaluable tool for people with disabilities, but they could be a fundamental tool for going beyond human limits, hence improving everyone’s life.”
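
As a rough illustration of the training scheme Valeriani describes (label each decision trial by whether the answer was correct, as a proxy for confidence, and let a classifier learn the associated EEG patterns), here is a minimal sketch on synthetic features. The feature count, the LDA classifier, and the random data are assumptions, not the Essex lab’s actual setup.

```python
# Minimal sketch of confidence decoding: EEG features per decision trial are
# labelled by answer correctness (a proxy for confidence) and a classifier
# learns the mapping. Synthetic stand-in data; illustrative choices only.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(42)

n_trials, n_features = 200, 32                    # e.g. band-power features per channel
X = rng.standard_normal((n_trials, n_features))   # stand-in for real EEG features
y = rng.integers(0, 2, size=n_trials)             # 1 = correct (confident), 0 = not

clf = LinearDiscriminantAnalysis()
scores = cross_val_score(clf, X, y, cv=5)
print(f"cross-validated accuracy: {scores.mean():.2f}")

# At test time, clf.fit(X, y) followed by clf.predict_proba(new_features)
# would yield a confidence estimate for a decision whose correctness is
# unknown to both the machine and the user.
```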

  5. Why neurotechnology is scary:

https://www.politico.eu/article/machines-brain-neurotechnology-neuroscience-privacy-neurorights-protection/

Rafael Yuste, neurobiologist: “The problem is what these tools can be used for.” There are some scary examples: researchers have used brain scans to predict the likelihood of criminals reoffending, and Chinese employers have monitored employees’ brainwaves to read their emotions. Scientists have also managed to subliminally probe for personal information using consumer devices.

“We have on the table the possibility of a hybrid human that will change who we are as a species, and that’s something that’s very serious. This is existential.” 

With brain implants in particular, experts say it’s unclear whether thoughts would be induced, or originate from the brain, which poses questions over accountability. “You cannot discern which tasks are being conducted by yourself and which thoughts are being accomplished by the AI, simply because the AI is becoming the mediator of your own mind” – Marcello Ienca, bioethicist at ETH Zurich

  6. Potential of neurotechnology in the mobile communication industry

https://lsspjournal.biomedcentral.com/articles/10.1186/s40504-017-0050-1

Several leading companies, including Apple and Samsung, are incorporating neurogadgets into the accessory assortments of their major products. For instance, iPhone accessories such as the XWave headset already allow users to plug directly into compliant iPhones and read brainwaves. Meanwhile, prototypes of next-generation Samsung Galaxy Tabs and other mobile or wearable devices have been tested for control by brain activity via EEG-based BCI (Powell, Munetomo, Schlueter, and Mizukoshi 2013). In light of these trends, Yuan and colleagues predicted that neurodevices will gradually replace the keyboard, the touch screen, the mouse and the voice command device as humans’ preferred ways to interact with computers (Yuan, Hsieh, and Chang 2010).

  7. Ethical issues:

https://www.scientificamerican.com/article/the-ethical-challenges-of-connecting-our-brains-to-computers/

What if someone were to face employment discrimination because the algorithms that power a neurotech application used for hiring misinterpret his or her neurodata? What if a criminal gets a hold of the previous or current neurodata of the secretary of defense and steals top secret information? Ethical concerns increase when we are not just monitoring someone’s neurodata but also interpreting it, decoding the person’s thoughts—with implications for accuracy and mental privacy.

One tricky aspect is that most of the neurodata generated by the nervous system is unconscious, meaning it is very possible to unknowingly or unintentionally provide neurotech with information that one otherwise wouldn’t. So, in some applications of neurotech, the presumption of privacy within one’s own mind may simply no longer be a certainty.

  8. Issues with at-home neurotechnology:

https://www.technologynetworks.com/neuroscience/articles/privacy-in-the-brain-the-ethics-of-neurotechnology-353075

Anna Wexler, assistant professor of medical ethics and health policy, on issues in direct-to-consumer neurotechnologies:

“The first way is as an immediate adverse reaction that is measurable, like a burn on the skin beneath where an electrode was placed.” The second issue is more fundamental, and harder to assess. “The second part of the harm issues is unintended negative consequences outside of safety, so things like potential effects on cognition if someone uses the device very frequently.” While the presence or absence of a charred chunk of hair after an excessive round of unregulated stimulation is easily quantifiable, assessing how someone’s subjective experience of reality might be altered by devices – especially those that explicitly aim to help people by tweaking their mood – is far harder, and there have not yet been any conclusive studies on this, Wexler explains.

  9. Capability of neurotechnology and its implications:

https://link.springer.com/article/10.1007/s11948-020-00218-0

Reading the mind, like reading a book, implies that the mind is potentially open to view. This would mark a radical departure from conventional accounts of one’s mind as accessible only to oneself. In a mind-reading context, one person might gain access to another’s ideas, thoughts, intentional, emotional, or perceptual states, or their memories. This might be done with or without permission. It could offer the promise of exciting new modes of communication, self-expression, and mutual understanding. Often the stuff of science fiction, this prospect can have alarming dimensions concerning who might have access to the mind, as well as implications for how persons might be judged.

For example, since 2013 it has been known that detection of a specific type of signal (the ‘P300’ wave) can play a role in ‘spying’ on brain activity to extract confidential information. This can be done with subliminal cues, perhaps to gain information predicting personal beliefs. Researchers constructed a game and recorded the brain activity of its players. These signals could be processed to elicit details about bank PINs and related private information without the game player knowing (Ienca et al. 2018).
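
For intuition, the sketch below shows the basic P300 principle on entirely synthetic data: average the EEG epochs time-locked to each candidate item (here, PIN digits) and pick the one whose average shows the largest positive deflection around 300 ms. This is not the actual attack from Ienca et al.; the sample rate, analysis window, and waveform shape are illustrative assumptions.

```python
# Toy P300 demonstration on synthetic data: the recognized item evokes a
# positive deflection ~300 ms after presentation, which epoch averaging
# makes visible above the noise.
import numpy as np

FS = 250                            # assumed sample rate (Hz)
epoch = np.arange(0, 0.8, 1 / FS)   # 0-800 ms after each stimulus
rng = np.random.default_rng(1)

def simulate_epoch(is_target):
    noise = rng.standard_normal(epoch.size) * 2.0
    if is_target:  # recognized item adds a P300-like bump peaking near 300 ms
        noise += 5.0 * np.exp(-((epoch - 0.3) ** 2) / (2 * 0.05 ** 2))
    return noise

digits = list(range(10))
secret = 7                              # the "private" digit in this simulation
avg_p300 = {}
for d in digits:
    trials = np.stack([simulate_epoch(d == secret) for _ in range(40)])
    erp = trials.mean(axis=0)                   # averaging suppresses noise
    window = (epoch > 0.25) & (epoch < 0.45)    # 250-450 ms window
    avg_p300[d] = erp[window].mean()

guess = max(avg_p300, key=avg_p300.get)
print(f"digit with strongest P300-like response: {guess}")
```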

  10. Possible effect of neurotechnology on decision making

https://www.nature.com/articles/d41586-019-02214-2

For neuroethicists, one concern is that inserting a decision-making device into someone’s brain raises questions about whether that person remains self-governing, especially when these closed-loop systems increasingly use AI software that autonomously adapts its operations. In the case of a device for monitoring blood glucose that automatically controls insulin release to treat diabetes, such decision-making on behalf of a patient is uncontroversial. But well-intentioned interventions in the brain might not always be welcome. For instance, a person who uses a closed-loop system to manage a mood disorder could find themselves unable to have a negative emotional experience, even in a situation in which it would be considered normal, such as a funeral. “If you have a device that constantly steps up in your thinking or decision-making,” says Frederic Gilbert, an ethicist who studies brain–computer interfaces (BCIs), “it might compromise you as an agent.”
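
To make the closed-loop concern concrete, here is a toy sketch of a device that senses a biomarker and adjusts its own stimulation with no human in the loop, in the spirit of the glucose/insulin analogy above. The setpoint, gain, and sensing model are invented for illustration and do not correspond to any real implant.

```python
# Toy closed-loop controller: sense a biomarker, compare to a setpoint,
# and adjust stimulation autonomously. All names and numbers are invented.
import random

SETPOINT = 0.5        # desired biomarker level (arbitrary units)
GAIN = 0.8            # proportional gain

def read_biomarker(stimulation):
    """Stand-in for sensing: higher stimulation pushes the biomarker up."""
    return 0.3 + 0.4 * stimulation + random.uniform(-0.05, 0.05)

stimulation = 0.0
for step in range(10):
    level = read_biomarker(stimulation)
    error = SETPOINT - level
    stimulation = max(0.0, stimulation + GAIN * error)  # device decides alone
    print(f"step {step}: biomarker={level:.2f}, stimulation={stimulation:.2f}")
```

The ethical point is that every adjustment in the loop happens without the user’s involvement, which is exactly the kind of delegated decision-making Gilbert flags.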

  11. BCI and AR/VR

https://www.frontiersin.org/articles/10.3389/fnhum.2020.00144/full

“BCIs, together with AR/VR, offer the possibility for immersive scenarios through induced illusions of an artificially perceived reality that can be utilized not only in basic BCI research but also in many fields of application.”

  12. People are worried about brain chip implants that boost a person’s natural abilities

https://www.pewresearch.org/fact-tank/2016/09/26/americans-wary-of-using-chip-implants-to-boost-brain-power-for-the-healthy/

Some 54% of U.S. adults foresee a future where computer chips will routinely be embedded in our bodies. But as with other kinds of potential human enhancements, a recent Pew Research Center survey found that more Americans are worried about the idea of an implanted brain chip (69%) than are enthusiastic (34%). And a minority of U.S. adults – 32% – would want this implanted device for themselves.

Many participants categorized brain chip implants as unnecessary for a healthy person. One person described the idea of brain chips for enhancement as “cosmetic surgery for the brain.” Another said: “I feel like it’s low value. I feel like it’s just going to be another thing that increases our vanity. … It just feels shallow to me.”

https://bigthink.com/health/neurotechnology/

“If we start tinkering with the brain, if we start changing it… What does that mean? Are we about to fundamentally change what it means to be human? And if so, are we okay with that?” – Nita Farahany, Professor of Philosophy and Law, on how technologies and brain-computer interfaces are starting to become part of mainstream society.

  13. Rise of Neurotechnology Calls for a Parallel Focus on Neurorights

https://www.scientificamerican.com/article/the-rise-of-neurotechnology-calls-for-a-parallel-focus-on-neurorights/

The Chilean bill sets out to protect the right to personal identity, free will, mental privacy, equitable access to technologies that augment human capacities, and the right to protection against bias and discrimination. The landmark bill would be the first of its kind to pioneer a regulatory framework which protects human rights from the manipulation of brain activity.

https://www.iberdrola.com/innovation/neurotechnology

In the near future, private companies could have access to the vast amounts of data that neurotechnological devices will yield, and these rights would focus on the protection of such personal information and the prevention of risks arising from it, such as epigenetic modifications, neuropsychiatric complications or its use for biohacking.
