Watch the piano-playing robot developed by a leading AI lab that can also read human emotions
23 January 2023, 17:31 | Updated: 23 January 2023, 17:40
A scientific laboratory in China on the frontline of Artificial Intelligence development has created a robot that can play the piano, and read the human emotions of its audience.
In December 2022, photos and videos of a robot playing the piano in a restaurant in eastern China began spreading across the internet.
No explanation was provided for why this robot had seemingly taken on the job of resident pianist at this Hangzhou eatery, but the limited details of what this advanced android artist could do were impressive.
The robot, named Xiaole, is reportedly able to play the piano using a high-precision visual perception system, meaning it can locate the correct keys to press on the piano by itself.
Built with multiple joints in its arms, the humanoid robot is able to complete everyday tasks and activities, such as playing the piano, and can also reportedly move its waist, head, and feet in order to reach all 88 keys of the instrument.
However, a notable addition to all these incredible abilities is that Xiaole is able to recognise human faces. Not only that, but it is also able to read its audience’s emotions – possibly scoping out what listeners thought of its performance.
But why would a piano-playing robot need to also read and somewhat understand its audience?
Well, Xiaole was created by China’s Zhejiang Lab, a research institute based in Hangzhou that specialises in Artificial Intelligence. The robot’s first somewhat public outing was in fact at an employee ‘family day’, where it performed for guests visiting the scientific campus in November 2022. A month later, the robot was moved to a local restaurant.
Notably, Xiaole isn’t the first piano-playing robot created by Zhejiang Lab. In 2021, the research institute showcased an android that played the piano after analysing the audience’s character. Sound familiar?
Accordingly, Xiaole’s ability to read human emotions may build on what was learned from this first robot, and potentially has ties to how humans make, play, and understand music.
Human expression is ultimately at the heart of all music-making, and it’s interesting to see this understanding echoed by the scientists working at the cutting edge of Artificial Intelligence development.
Regardless of how impressive the robot’s mechanical skill and command of the keyboard are, the research and development into helping AI understand emotions and apply them to its own tasks and skills may unlock a whole new genre of machine-learned music.