Almost all of us have experienced this feeling, but most of us probably don’t know what to call it. I’d never really thought about it myself until I started hearing about new neuropathy projects built on AI.
When you’re “in the zone,” when you feel completely embedded in some activity, when you get really passionate about something that you’re doing in the moment – psychologists often call that “flow,” and it’s a useful construct for researching things like attention, mind/body states, and the treatment of conditions like ADD and ADHD.
BB, writing in Wired magazine, describes it this way (this quote has been widely cited elsewhere):
“The ego falls away. Time flies. Every action, movement, and thought follows inevitably from the previous one, like playing jazz. Your whole being is involved, and you’re using your skills to the utmost.”
So yeah, you may know what that feels like.
I was listening to A.J. Keller give a talk about a new project called, well, Neuropathy. It struck me that this is exactly the type of thing that will bring more transparency to the study of our brains, in an age when we’re also working on creating artificial intelligence. So we’re working on intelligence from both ends.
“(We can) create a future where … we can lift someone out of depression, and unleash a very empathetic person into the world,” he said. “It’s this really big opportunity, where there are almost a billion people across the world that are living with mental health issues – every single time I go and talk, a person comes up to me and talks about how their family member has this affliction, how (they’ve) tried all of these different medications, maybe psychedelics worked, but there’s really this whole clouded understanding of what is going to be successful in treating a lot of these people. And really … you talk to a psychologist, and the big issue is that we’re poking around … the average consumer doesn’t have the tools to actually see what’s going on inside their mind.”
In detailing how to use technologies to learn more about the human brain, Keller put a big focus on sound and its impact.
“(We’re) leveraging reinforcement learning to essentially predict what sounds a person needs to hear, in order to shift their brain activity,” he said. “And the brain is incredibly interesting. When a person attenuates to, or listens to, a specific sound, their brain starts to amplify that sound. And when the brain amplifies that, it creates cascading rhythms. And you can actually learn to predict what sounds you need to put into someone’s ears in order to elicit this.”
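To make the “predict what sounds a person needs to hear” idea concrete, here’s a minimal sketch of one way such a loop could work – a simple epsilon-greedy bandit that learns which sound produces the biggest shift in a measured brain signal. Everything here is hypothetical: the sound names, the simulated EEG response, and the bandit approach are my illustration, not a description of Keller’s actual system.

```python
import random

# Candidate "sounds" the system can play (hypothetical names).
SOUNDS = ["rain", "binaural_40hz", "pink_noise", "piano"]

def simulated_band_power_shift(sound):
    # Stand-in for a real EEG measurement: each sound has a fixed
    # average effect, plus noise. Purely invented numbers.
    base = {"rain": 0.1, "binaural_40hz": 0.6, "pink_noise": 0.3, "piano": 0.2}
    return base[sound] + random.gauss(0, 0.05)

def choose_sound(values, epsilon=0.1):
    # Epsilon-greedy: usually play the best-known sound,
    # occasionally explore a random one.
    if random.random() < epsilon:
        return random.choice(SOUNDS)
    return max(SOUNDS, key=lambda s: values[s])

def run(trials=500, seed=0):
    random.seed(seed)
    values = {s: 0.0 for s in SOUNDS}  # estimated effect per sound
    counts = {s: 0 for s in SOUNDS}
    for _ in range(trials):
        s = choose_sound(values)
        reward = simulated_band_power_shift(s)  # "observe the brain"
        counts[s] += 1
        values[s] += (reward - values[s]) / counts[s]  # incremental mean
    return max(SOUNDS, key=lambda s: values[s])

print(run())  # typically "binaural_40hz" with this simulator
```

A real system would replace the simulator with live EEG features and likely use a far richer model, but the shape of the loop – play, measure, update, repeat – is the same.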
Now, here’s something else interesting that he touched on. Before I get into it, think about the “10x developer” or “10x programmer,” that “unicorn” who gets the work of ten other people done. Is he or she perhaps experiencing more flow?
Keller described it like this:
“When they sit down and start working, when they get into this flow state, the amount of gamma activity across their head increases linearly based on the complexity of the task, (and what) we also found is that if they’re in a more anxious state, they’re not able to get focused, and they just become more anxious. So if you try to amplify this brain activity that is required to be focused at the wrong time, you can actually make the person feel more uncomfortable. But if you instead intervene, and the AI system learns that … we need to increase their mindfulness, and take them off of this anxiety level before we enter them into a flow state, now, all of a sudden, we have the ability to help that person be able to focus.”
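The ordering logic Keller describes – don’t try to amplify focus-related activity while someone is anxious; lower the anxiety first, then induce flow – can be sketched as a tiny decision rule. The state fields, thresholds, and intervention names below are all my own illustrative assumptions, not the real system’s:

```python
from dataclasses import dataclass

@dataclass
class BrainState:
    anxiety: float  # 0.0 (calm) .. 1.0 (highly anxious), hypothetical scale
    focus: float    # 0.0 (scattered) .. 1.0 (deep focus), hypothetical scale

def next_intervention(state, anxiety_threshold=0.5):
    # Anxiety gates everything: amplifying focus rhythms on an anxious
    # person would, per the quote, make them more uncomfortable.
    if state.anxiety > anxiety_threshold:
        return "mindfulness"      # bring anxiety down first
    if state.focus < 0.7:
        return "flow_induction"   # now it's safe to amplify focus
    return "maintain"             # already in flow; just sustain it

print(next_intervention(BrainState(anxiety=0.8, focus=0.2)))  # mindfulness
print(next_intervention(BrainState(anxiety=0.2, focus=0.3)))  # flow_induction
```

The point isn’t the thresholds – a learned system would infer those – but the gating: the same stimulation is helpful or harmful depending on the state it lands in.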
Keller then talked about the power of customized meditations and interventions delivered through sound alone, rather than more invasive means.
“If you play the right song for a person at the right time, they have this transformational shift very quickly,” he said, noting that these interventions are easy to implement, partly because the FDA favors non-invasive treatments and services.
And that seemed to be a major part of his elevator pitch for this interesting tech, along with the idea that wearables shouldn’t just be passive trackers.
“That’s really where a lot of the wearables today have fallen short, is that they’re sort of just tracking but they’re not being predictive in the moment,” he said, explaining how the new application might affect sleep states, and generally improve mental “flow.”
This one seems poised to take off – what do you think? We’ve been seeing a lot of good work in this sector. Stay tuned!