Just published is a research article on the use of Pink Floyd’s music to analyse the brain (https://journals.plos.org/plosbiology/article?id=10.1371/journal.pbio.3002176). From the analysis, AI was able to reconstruct part of Another Brick In The Wall, Part 1, from brain waves alone. In itself, you might wonder what the point of that is. The hope is that technology will eventually exist that can use AI to determine what someone who is non-verbal is trying to say. This would have huge implications for assisting such people, which is why the breakthrough is seen as so significant.

The audio that was created is very muddy and hard to make out, but there are parts that (knowing the song) you can clearly recognise, such as the “All in all…” lyrics. Whilst there has already been work on translating brain activity into words, the ability for AI to recognise musical elements could result in much more accurate “translation” within brain–computer interface (BCI) applications, conveying the emotion behind words and phrases – key to proper comprehension of people’s thoughts and feelings.

The analysis used 29 volunteers, each of whom had epilepsy and, during a procedure (iEEG), had 2,668 electrodes placed onto the brain’s surface. Some 347 of these were on the part of the brain used for music processing. The team behind the work were Pink Floyd fans, so used their music for the research.

A huge amount of detail on the work can be found here (https://doi.org/10.1371/journal.pbio.3002176) for those who want to dig deeper.
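For the curious, the general idea behind this kind of decoding can be sketched very simply: a regression model learns a mapping from electrode activity to the audio spectrogram, one model per frequency band, and the predicted spectrogram is then turned back into sound. The toy code below illustrates that idea with ridge regression on synthetic data – the electrode counts, shapes, and noise levels are illustrative assumptions, not the paper’s actual pipeline.

```python
import numpy as np

# Toy sketch of spectrogram decoding from neural activity.
# Synthetic data only: shapes and noise are illustrative assumptions.

rng = np.random.default_rng(0)

n_samples, n_electrodes, n_freq_bins = 500, 347, 32  # 347 = music-related electrodes in the study
X = rng.standard_normal((n_samples, n_electrodes))   # neural features per time window
true_W = 0.1 * rng.standard_normal((n_electrodes, n_freq_bins))
Y = X @ true_W + 0.05 * rng.standard_normal((n_samples, n_freq_bins))  # "spectrogram"

def ridge_fit(X, Y, alpha=1.0):
    """Closed-form ridge regression: W = (X'X + alpha*I)^-1 X'Y."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + alpha * np.eye(d), X.T @ Y)

# Train on the first 400 windows, evaluate on the held-out remainder.
W = ridge_fit(X[:400], Y[:400])
Y_pred = X[400:] @ W

# Per-frequency-bin correlation between predicted and actual spectrogram
corrs = [np.corrcoef(Y_pred[:, k], Y[400:, k])[0, 1] for k in range(n_freq_bins)]
print(f"mean held-out correlation: {np.mean(corrs):.2f}")
```

In a real pipeline the predicted spectrogram would then be inverted back to audio, which is why the reconstructions sound so muddy: any detail the regression fails to capture is simply lost from the waveform.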