Apple is developing technology that lets users control iPhones and other devices with their brain signals. The initiative includes a collaboration with Synchron, a maker of brain-computer interfaces (BCIs).
The Wall Street Journal reported Apple is creating ways for third-party BCI devices, such as Synchron’s Stentrode, to interact with its products.
Pioneering Brain Implant Integration
Synchron’s Stentrode device is central to this effort. A stent-like implant, the Stentrode is placed in a vein near the brain’s motor cortex through a minimally invasive procedure. Its electrodes read brain signals, which Apple aims to translate into commands for selecting and manipulating interfaces on iPhones, iPads, Macs, and the Apple Vision Pro.
According to The Wall Street Journal, this technology intends to help people with conditions like amyotrophic lateral sclerosis (ALS) interact with their devices. Synchron has implanted its Stentrode in 10 people since 2019.
Early Tests Show Promise and Challenges
Mark Jackson, an ALS patient in Pittsburgh, is an early tester of the Stentrode implant. He has used the technology to control his iPhone, iPad, and Vision Pro headset, and described virtually visiting the Swiss Alps with the headset connected to his implant. However, he noted that Synchron’s technology is still in early development and lacks key features.
For instance, the implant cannot yet move a cursor, so navigation is slower than typical device interaction. To broaden BCI support, Apple plans to introduce a new protocol for Switch Control in iOS 19 and visionOS 3 this fall, enabling users with BCIs to control their devices without physical movement.
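To give a rough sense of what switch-style BCI control involves, the short Swift sketch below maps decoded brain-signal "intents" to discrete actions such as selecting or advancing to the next on-screen item. It is purely illustrative: the intent names, the mapping, and the types are assumptions for this example, not Apple's Switch Control protocol or Synchron's actual decoder output.

```swift
// Hypothetical decoded intents a BCI driver might emit.
// The real Switch Control BCI protocol has not been published;
// these names are illustrative only.
enum BCIIntent {
    case dwell   // sustained signal, treated here as "select"
    case pulse   // brief signal, treated here as "advance"
    case rest    // no actionable signal
}

// Switch-style actions comparable to what Switch Control offers today.
enum SwitchAction {
    case select
    case moveToNextItem
    case none
}

// A toy mapping layer; in practice the operating system,
// not an app, would own this translation.
func action(for intent: BCIIntent) -> SwitchAction {
    switch intent {
    case .dwell: return .select
    case .pulse: return .moveToNextItem
    case .rest:  return .none
    }
}

// Example: a short stream of decoded intents scanning a menu.
let decoded: [BCIIntent] = [.pulse, .pulse, .dwell]
for intent in decoded {
    print(action(for: intent))
}
```

Because there is no cursor, control in this model happens by stepping through items one at a time and confirming a selection, which is why Jackson finds navigation slower than conventional input.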
Expanding Accessibility Features
Apple’s work on BCI integration is part of a larger push for new accessibility features. The company announced that iOS 19 will significantly improve its Personal Voice feature. This tool allows users at risk of losing their speech to create a synthetic voice that sounds like them.
The updated process will require users to record only 10 phrases, with processing completed in under a minute, a substantial improvement over the 150 phrases and overnight processing previously required.
Apple says the resulting voice will also sound smoother and more natural. More broadly, researchers believe BCIs will revolutionize device interaction for people with severe physical limitations.