A middle-aged man left paralyzed in all four limbs by a cervical spinal cord injury is able, using only his thoughts, to precisely steer a smart wheelchair around his neighborhood, command a robot dog as an 'extension of his body' to fetch food deliveries, and even take part in online data annotation tasks. This achievement marks new progress in the second invasive brain-computer interface clinical trial carried out by the team of Zhao Zhengtou and Li Xue at the Center for Excellence in Brain Science and Intelligence Technology of the Chinese Academy of Sciences, in collaboration with Fudan University Huashan Hospital and partner enterprises. The trial represents a significant leap for brain-computer interface technology: from controlling a cursor on a two-dimensional screen to interacting with the three-dimensional physical world.
China's invasive brain-computer interface technology is advancing rapidly. In March this year, the teams of Zhao Zhengtou and Li Xue, together with Fudan University Huashan Hospital, successfully conducted China's first prospective clinical trial of an invasive brain-computer interface. A participant who lost all four limbs in a high-voltage electrical accident 13 years ago used a domestically developed invasive brain-computer interface system to achieve 'thought control' of electronic devices; he can now skillfully play racing games, chess, and more. With this, China became the second country in the world, after the United States with Elon Musk's Neuralink, to bring invasive brain-computer interface technology to the clinical trial stage.
Initial Control of Embodied Intelligent Robots
In June this year, the team began their second invasive brain-computer interface clinical trial. The participant, Mr. Zhang, became quadriplegic in 2022 after a spinal cord injury caused by a severe fall. More than a year of rehabilitation brought no improvement, leaving only his head and neck mobile. Within 2 to 3 weeks of having the brain-computer interface system implanted, Mr. Zhang was able to control computer cursors, tablets, and other electronic devices with his thoughts, reaching the same behavioral level as the participant in the first invasive trial. To further enhance users' ability to interact with their environment, the research team introduced new technologies that expand the brain-computer interface's application scenarios from two-dimensional screens to the three-dimensional physical world. The system now allows users to operate smartphones and computers by thought at speeds approaching those of typical users, and gives them initial control of embodied intelligent robots.
Experts noted that the second clinical study achieved a series of breakthroughs in key technologies. For neural information extraction, the team developed a high-ratio, high-fidelity neural data compression technique that inventively combines features such as 'spike-band power', 'interspike interval', and 'spike count'. The resulting hybrid decoding model extracts useful signals efficiently even in noisy environments, improving overall brain-control performance by 15%-20%. To address instability caused by noise, lighting, electromagnetic interference, and fluctuations in patients' physiological and psychological states in real-world settings, the team introduced 'neural manifold alignment' technology, which extracts stable low-dimensional features from dynamic high-dimensional neural signals and thereby improves the decoder's environmental adaptability and cross-day stability.
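As a rough illustration of the feature types named above and of manifold alignment in the abstract, the Python sketch below computes spike-band power, spike count, and a crude interspike-interval proxy per time bin, then aligns a later session onto a day-zero low-dimensional space with PCA plus Procrustes. The sampling rate, band edges, bin width, threshold, and alignment method are assumptions for illustration, not the team's published pipeline.

```python
# Illustrative sketch only: generic versions of the named feature types and a
# PCA + Procrustes stand-in for manifold alignment. Sampling rate, band edges,
# bin width, and threshold are assumptions, not the team's parameters.
import numpy as np
from scipy.signal import butter, sosfiltfilt

FS = 30_000      # assumed sampling rate (Hz)
BIN_MS = 20      # assumed feature bin width (ms)

def spike_band_features(raw, fs=FS, bin_ms=BIN_MS, thresh_sd=-4.5):
    """Per-bin spike-band power, spike count, and a crude interspike-interval proxy."""
    sos = butter(4, [300, 3000], btype="bandpass", fs=fs, output="sos")
    x = sosfiltfilt(sos, raw)                           # spike-band (300-3000 Hz) signal
    bin_len = int(fs * bin_ms / 1000)
    n_bins = len(x) // bin_len
    x = x[: n_bins * bin_len].reshape(n_bins, bin_len)

    power = (x ** 2).mean(axis=1)                       # spike-band power
    thresh = thresh_sd * np.median(np.abs(x)) / 0.6745  # robust negative threshold
    below = (x < thresh).astype(int)
    counts = np.diff(below, axis=1).clip(min=0).sum(axis=1)  # threshold crossings = spike count
    isi = np.where(counts > 0, (bin_ms / 1000) / np.maximum(counts, 1), np.nan)  # crude mean ISI (s)
    return np.column_stack([power, counts, isi])

def align_manifold(day0, dayN, n_dims=10):
    """Project both sessions onto day-0 principal axes, then Procrustes-align day N."""
    # day0, dayN: (n_bins, n_features) matrices of stacked multichannel features.
    mu0 = day0.mean(axis=0)
    _, _, Vt = np.linalg.svd(day0 - mu0, full_matrices=False)
    P = Vt[:n_dims].T                                   # low-dimensional axes ("manifold")
    a = (day0 - mu0) @ P
    b = (dayN - dayN.mean(axis=0)) @ P
    U, _, Wt = np.linalg.svd(b.T @ a)                   # orthogonal Procrustes solution
    return a, b @ (U @ Wt)                              # day-N features mapped into day-0 space
```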
Real-time Parameter Tuning Without Interrupting Operation
In addition, the team revolutionized the system's calibration process by developing 'online recalibration' technology. Decoding parameters can now be fine-tuned in real time during everyday use, with no need to interrupt operation for dedicated calibration sessions; this keeps system performance consistently high and gives users a 'the more you use it, the smoother it gets' experience. Response speed is one of the core metrics for brain-computer interfaces: the conduction delay of natural human neural circuits is about 200 milliseconds. By adopting a custom communication protocol, the research team compressed end-to-end system latency, from signal acquisition to command execution, to under 100 milliseconds. This is below the physiological delay, making control feel even smoother and more natural for patients.
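One generic way to picture online recalibration (an assumption for illustration, not the team's disclosed algorithm) is a linear decoder whose weights receive a small recursive-least-squares update whenever the user's intended output can be inferred during normal use, so no separate calibration block is needed:

```python
# Illustrative sketch only: a linear decoder with recursive least-squares (RLS)
# updates applied during normal use, as a generic stand-in for online recalibration.
import numpy as np

class OnlineLinearDecoder:
    def __init__(self, n_features, n_outputs=2, forget=0.995):
        self.W = np.zeros((n_features, n_outputs))  # decoding weights (features -> command)
        self.P = np.eye(n_features) * 1e3           # inverse feature covariance estimate
        self.forget = forget                        # forgetting factor < 1 tracks signal drift

    def decode(self, x):
        """Map one feature vector to a 2-D velocity command (cursor, wheelchair, arm...)."""
        return x @ self.W

    def recalibrate(self, x, y_intended):
        """One RLS step toward an estimate of what the user intended the output to be."""
        x = x[:, None]                              # column vector
        Px = self.P @ x
        gain = Px / (self.forget + x.T @ Px)        # Kalman-style gain
        err = y_intended - (x.T @ self.W).ravel()   # intention-prediction error
        self.W += gain @ err[None, :]
        self.P = (self.P - gain @ Px.T) / self.forget

# Usage: decode every feature bin; whenever the intended direction can be inferred
# (e.g. the user is clearly homing in on a known target), apply one recalibration step.
dec = OnlineLinearDecoder(n_features=96)
x_t = np.random.randn(96)                           # placeholder neural feature vector
command = dec.decode(x_t)
dec.recalibrate(x_t, y_intended=np.array([1.0, 0.0]))
```

In a scheme like this, the forgetting factor is what lets the decoder track slow drift in the neural signal during ordinary use rather than in a dedicated calibration session.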
Zhao Zhengtou said that the breakthroughs in the second invasive brain-computer interface trial span every dimension: from two-dimensional to three-dimensional, from virtual to physical, and from basic control to integration into daily life. Mr. Zhang described the experience: "It's like controlling a character in a video game. I don't have to deliberately think about which direction the joystick should go; the movement follows my intentions naturally. The signal transmission is stable and there is little delay." A third patient, also paralyzed from the neck down, underwent surgery in October this year; in less than two months, he has already learned to control a robotic arm to help himself drink water.