Somewhat unceremoniously, Facebook recently provided an update on its brain-computer interface project. In a paper published in the journal Nature Communications, a team of researchers described a prototype system capable of reading and decoding study subjects’ brain activity while they speak.
A set of machine learning algorithms equipped with phonological speech models learned to decode specific speech sounds from the data and to distinguish between questions and responses.
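To make the idea concrete, here is a minimal, hypothetical sketch of that kind of decoding task: a classifier trained to tell question utterances from answer utterances using simulated cortical features. The electrode count, feature choice, and model are illustrative assumptions, not the researchers’ actual pipeline, which relied on high-density brain recordings and phone-level speech models.

```python
# Hypothetical sketch: classify "question heard" vs. "answer spoken"
# from simulated neural features. Not the study's actual method.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Simulate band-power features: 200 utterances x 64 electrodes.
# (Both numbers are assumptions chosen for illustration.)
n_utterances, n_electrodes = 200, 64
X = rng.normal(size=(n_utterances, n_electrodes))
y = rng.integers(0, 2, size=n_utterances)  # 0 = question, 1 = answer

# Shift a few simulated electrodes so the two classes are separable,
# standing in for task-related differences in real neural activity.
X[y == 1, :8] += 1.0

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0
)

clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print(f"held-out accuracy: {clf.score(X_test, y_test):.2f}")
```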
High-resolution brain-computer interfaces, or BCIs for short, are predictably complicated: they must be able to read neural activity precisely enough to pick out which groups of neurons are performing which tasks.