A team of researchers from École Polytechnique Fédérale de Lausanne (EPFL) developed an artificial intelligence (AI) tool, called "CEBRA", that can interpret a rodent's brain signals in real time and reconstruct the video the mouse is watching. The scientists measured and recorded the rodents' brain activity in two ways: with electrode probes inserted into the brain's visual cortex, and with optical probes in genetically engineered mice whose neurons glow green when firing and transmitting information.
The researchers trained CEBRA on movies watched by mice alongside their real-time brain activity, teaching the model to associate specific brain signals with particular frames of a movie. When given new brain activity from a mouse watching a slightly different clip, the algorithm predicted, in real time, which frame the mouse was seeing. The researchers then assembled those predicted frames into a reconstructed film of their own.
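The core idea described above, learning an association between activity patterns and frames, then matching new activity against them, can be sketched in a few lines. This is a hypothetical toy illustration using synthetic data and a simple nearest-neighbor lookup, not the authors' actual CEBRA pipeline (which learns a latent embedding with contrastive learning); all names and numbers below are invented for the example.

```python
import numpy as np

# Toy sketch of decode-by-matching (NOT the real CEBRA method):
# pair one neural-activity vector with each movie frame during
# "training", then predict the frame for new activity by finding
# the closest stored pattern. All data here is synthetic.

rng = np.random.default_rng(0)

n_frames = 100   # frames in the hypothetical training movie
n_neurons = 30   # recorded units (invented for this sketch)

# Fabricated structure: each frame evokes a characteristic pattern,
# and any recording of it is that pattern plus small noise.
frame_patterns = rng.normal(size=(n_frames, n_neurons))
train_activity = frame_patterns + 0.1 * rng.normal(size=(n_frames, n_neurons))

def predict_frame(activity: np.ndarray) -> int:
    """Return the index of the training frame whose stored activity
    pattern is closest (Euclidean distance) to the new activity."""
    dists = np.linalg.norm(train_activity - activity, axis=1)
    return int(np.argmin(dists))

# New recording while the mouse "watches" frame 42 again:
new_activity = frame_patterns[42] + 0.1 * rng.normal(size=n_neurons)
print(predict_frame(new_activity))
```

Stringing the predicted frame indices together over time is what would yield a reconstructed movie, which is the step the EPFL team performed with far richer learned embeddings.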
This research is not the first of its kind; PetaPixel recently reported on researchers at Osaka University who used the Stable Diffusion model to reconstruct high-resolution images from brain activity, and scientists at Radboud University have developed technology that translates a person's brainwaves into photographic images. Breakthroughs like these could eventually advance neuroscientific studies and deepen our understanding of visual perception.
