Blurry images reconstructed from human brain activity reveal the movie clips test subjects had just watched.
For the reconstructed brain videos, the team drew from a separate library of 18 million seconds of YouTube video clips selected at random. If the brain activity measurements are good and the model is accurate, their process should find the clip from the library most similar to the video actually viewed.
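The matching step can be sketched as a nearest-neighbor search: an encoding model predicts the brain response each library clip would evoke, and the decoder picks the clip whose predicted response best correlates with the activity actually measured. The linear model, feature dimensions, and function names below are illustrative assumptions, not the team's actual implementation.

```python
import numpy as np

def decode_clip(measured_activity, library_features, encoding_weights):
    """Return the index of the library clip whose predicted brain
    response correlates best with the measured activity pattern."""
    # Hypothetical linear encoding model: clip features -> voxel responses.
    predicted = library_features @ encoding_weights   # (n_clips, n_voxels)
    # Correlate each predicted response with the measured one.
    z_meas = (measured_activity - measured_activity.mean()) / measured_activity.std()
    z_pred = (predicted - predicted.mean(axis=1, keepdims=True)) \
             / predicted.std(axis=1, keepdims=True)
    scores = z_pred @ z_meas / measured_activity.size
    return int(np.argmax(scores))

# Toy demo: 5 candidate clips, 10-dim features, 50 voxels (all made up).
rng = np.random.default_rng(0)
W = rng.normal(size=(10, 50))        # hypothetical fitted encoding weights
library = rng.normal(size=(5, 10))   # feature vectors of 5 library clips
activity = library[3] @ W + 0.1 * rng.normal(size=50)  # noisy response to clip 3
print(decode_clip(activity, library, W))  # identifies clip 3
```

The real study worked at a vastly larger scale (18 million seconds of clips) and averaged the best-matching clips rather than returning a single winner, which is why the reconstructions come out blurry.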
The result was a set of blurry, ghostly continuous videos approximating what the subjects were watching.
“You’re reconstructing a movie that they saw using other movies that they didn’t actually see,” Gallant said. This counterintuitive approach is key to proving the decoder works: because the viewed clips are absent from the library, a faithful reconstruction cannot be a simple lookup of stored footage.