We are excited to be participating in this year's ISS R&D Conference!
We'll be presenting recent research, conducted in collaboration with the UND/NASA Human Spaceflight Laboratory, Fordham University, and Cognionics, that evaluates new capabilities for assessing crewmember neural and cognitive performance during a 14-day simulated Lunar/Martian mission in the Inflatable Lunar/Martian Habitat Analog at the University of North Dakota.
Intheon gave several presentations at the 7th International BCI Meeting, held in Asilomar, California, on May 21-25. In one of these, "From N=1 to N=Everyone: Scalable Technologies for Multi-Brain Computer Interfacing", Tim Mullen gave a live demonstration of a "hyperbrain": real-time computation of neural synchrony across groups of individuals, all done with mobile devices and NeuroScale.
We're excited to collaborate with the Human Spaceflight Laboratory at the University of North Dakota, Fordham University, and Cognionics on a research project for Inflatable Lunar Martian Analog Habitat Mission V, testing the feasibility of using wearable EEG and specially designed cognitive assessment protocols to characterize and predict changes in cognitive performance during Lunar/Martian missions.
Intheon was invited to be a Featured Beta Startup at the Collision tech conference held in New Orleans on May 1-3, where we showcased our NeuroPype desktop signal processing application and previewed the capabilities of our NeuroScale cloud platform. We received a lot of positive feedback and excitement from attendees about what we have been building and have in the works as we pioneer the first cloud-scalable platform for 'anytime, anywhere' neural state decoding!
Having fun at the Intheon offices with a GearVR-capable application that integrates heart, muscle/gesture, and cortically localized brain activity, processed in the cloud with NeuroScale and powering an interactive mobile VR game rendered on a phone. Wearing a Cognionics Quick-30 headset to measure EEG, EMG, and ECG signals, we took turns activating or deactivating the orb using EMG while levitating it with our EEG activity by focusing our attention. (Game dev by Tim Omernick.) One more step towards "anytime, anywhere" BCI applications!
Our CEO, Tim Mullen, recently gave an invited keynote talk at the Founders Keynote Session of the IEEE Systems, Man, and Cybernetics conference in Budapest, Hungary, discussing our mission to power the future of "anytime, anywhere" neurotechnology.
This clip shows the first demonstration of wireless dry EEG brain activity mapping and directed connectivity analysis with real-time interactive 3D visualization in a standard web browser on a smart phone. This represents an important step towards enabling powerful new pervasive applications of EEG, including live remote monitoring of brain activity (neurotelemetry) for clinical or consumer applications.
Dr. Tim Mullen gave a Keynote address on “Brain Computer Interfaces: Present and Future” at CES 2016 (IEEE International Conference on Consumer Electronics). In this video, he discusses our vision for pervasive brain-computer interfaces with IEEE CESoc TV.
This video demonstrates a real-time wearable brain-computer interface (BCI) capable of detecting whether a car driver (here, Intheon CDO Nima Bigdely-Shamlo) heard an unusual sound interspersed among regular sounds played over the car speakers. This is the first demonstration of a real-time, dry wireless EEG BCI system operating in a moving vehicle with real-time computation in the cloud (NeuroScale).
Intheon CTO Christian Kothe is in San Diego wearing a 21-channel Neuroelectrics Enobio EEG cap. The data is streamed live through the NeuroScale platform, where we remove artifacts and compute a real-time map of cortical brain activity and brain network connectivity (multivariate Granger causality). The resulting data is streamed to San Francisco, where Intheon CEO Dr. Tim Mullen visualizes Christian's 3D brain maps in real time on his cell phone.
This clip shows how a brain-computer interface — in this case a multiplayer, mobile brain-controlled game ("neurogame") called Tractor Beam — can be powered by the NeuroScale cloud platform and used on planes and trains (or anywhere else you have Internet). Intheon team members (Mullen, McCoy, Bigdely-Shamlo, Ward) were on their way from San Diego to San Francisco, on May 5th 2015, to present at the NeuroGaming conference.
Demonstrating the “Glass Brain” at Mozart & the Mind 2014 in an epic rhythm experience with Mickey Hart, Adam Gazzaley, Christine Stevens, Bill Walton, and 700 friends! The Glass Brain — powered by the Intheon team’s software — is the world’s first interactive, real-time, high-resolution visualization of an active human brain (cortical brain activity and connectivity), designed specifically for Virtual Reality. Here Intheon CEO Tim Mullen (right) is flying through the live brain of Grateful Dead drummer Mickey Hart (left) in Oculus VR. Mickey is wearing a 64-channel wearable EEG system developed by Cognionics Inc.
Demonstrating the “Glass Brain” — the Intheon team's interactive, real-time, high-resolution visualization of an active human brain — at the GTC2014 Keynote with Adam Gazzaley. Visit http://intheon.io/projects/ for more details on the Glass Brain and other Intheon projects!