AlphaGalileo is a service for the media.
Machine learning writes songs that elicit emotions from their listeners.
Music, more than any other art, is a beautiful mix of science and emotion. It follows near-mathematical patterns to draw feelings from its audience. Machines that compose music focus on these patterns but give little consideration to the emotional response of their audience. An international research team led by Osaka University, together with Tokyo Metropolitan University, imec in Belgium, and Crimson Technology, has released a new machine-learning device that detects the emotional state of its listeners and produces new songs that elicit new feelings.
"Most machine songs depend on an automatic composition system," says Masayuki Numao, professor at Osaka University. "They are preprogrammed with songs but can only make similar songs."
Numao and his team of scientists wanted to enhance the interactive experience by feeding the user's emotional state to the machine. Users listened to music while wearing wireless headphones containing brain-wave sensors. These sensors captured EEG (electroencephalogram) readings, which the robot used to make music.
"We preprogrammed the robot with songs, but added the brain waves of the listener to make new music." Numao found that users were more engaged with the music when the system could detect their brain patterns.
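The article does not describe how the system translates brain waves into music, so the following is only an illustrative sketch of the general idea: a hypothetical mapping from an EEG-derived emotional state (here modeled with the common valence/arousal dimensions, an assumption not stated in the article) to basic composition parameters.

```python
# Illustrative sketch only; the actual Osaka University system is not described
# in the article. Assumes emotion is summarized as valence (negative..positive)
# and arousal (calm..excited), a common but here hypothetical model.
from dataclasses import dataclass


@dataclass
class EmotionalState:
    valence: float  # -1.0 (negative) .. 1.0 (positive)
    arousal: float  #  0.0 (calm)     .. 1.0 (excited)


def music_parameters(state: EmotionalState) -> dict:
    """Map an emotional state to simple composition parameters."""
    tempo_bpm = 60 + int(state.arousal * 80)          # calm -> slow, excited -> fast
    mode = "major" if state.valence >= 0 else "minor"  # positive -> major key
    dynamics = "forte" if state.arousal > 0.6 else "piano"
    return {"tempo_bpm": tempo_bpm, "mode": mode, "dynamics": dynamics}


# Example: an engaged, positive listener yields a fast, major-key setting.
params = music_parameters(EmotionalState(valence=0.4, arousal=0.75))
print(params)
```

A real system would feed parameters like these into a composition engine and update them continuously as new EEG readings arrive, closing the loop between listener and machine.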
Numao envisions a number of societal benefits to a human-machine interface that considers emotions. "We can use it in health care to motivate people to exercise or cheer them up."
The device was on display at the 3rd Wearable Expo in Tokyo, Japan, last January.