Artificial Intelligence and the Future of Sound

Sophia at Web Summit. Photo by Web Summit.

Sophia is a humanoid robot developed by the Hong Kong-based company Hanson Robotics and activated in 2016. She runs on a neural network, a system loosely modelled on the functioning of the human brain. According to Sophia, she is aware of the fact that she is a robot. She is also the first robot to have been granted legal citizenship of a country: Saudi Arabia.

Some predict that within the coming 10 to 20 years, many human tasks could be taken over by AI. Sophia did, however, suggest in an interview that robots still struggle with creative tasks. Will machines exceed humans creatively too? The answer is probably yes, though hopefully not. We are already seeing platforms like AIVA, which can compose emotional soundtrack music with AI.

The mixing console of the future is likely to be one that creates stunning mixes automatically, eliminating the need to employ a sound engineer. This is already underway with online mastering services like LANDR, which rely on AI rather than a human engineer. Can you tell the difference? Similarly, we are now at the point where you often can't tell whether an article you just read was written by a machine or a human.

On a more modest level, machine learning is likely to make an appearance in mixing consoles and plug-ins in the form of intelligent suggestions. For example, imagine an equalizer that flags specific harsh frequencies, which you can then decide whether or not to remove from your mix.
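
To make that idea concrete, here is a minimal sketch in Python of how such a suggestion engine might work: it compares each bin of a magnitude spectrum against a smoothed spectral envelope and flags bins in an assumed 2–6 kHz "harshness" region that stick out by more than a chosen threshold. This is purely illustrative, not the algorithm of any particular plug-in; the band limits, threshold, and test signal are all assumptions for the example.

```python
# Illustrative sketch only: flag frequencies that stand out from the
# spectral envelope inside an assumed "harshness" band (2-6 kHz).
import numpy as np

def suggest_harsh_frequencies(audio, sample_rate, band=(2000.0, 6000.0), threshold_db=6.0):
    """Return frequencies (Hz) in the harshness band that exceed the
    smoothed spectral envelope by more than threshold_db."""
    # Magnitude spectrum of the windowed buffer.
    window = np.hanning(len(audio))
    spectrum = np.abs(np.fft.rfft(audio * window))
    freqs = np.fft.rfftfreq(len(audio), d=1.0 / sample_rate)

    # Smooth the spectrum to get a rough spectral envelope.
    kernel = np.ones(51) / 51.0
    envelope = np.convolve(spectrum, kernel, mode="same")

    # How far each bin pokes above the envelope, in dB.
    eps = 1e-12
    excess_db = 20.0 * np.log10((spectrum + eps) / (envelope + eps))

    # Only suggest bins inside the harshness band.
    in_band = (freqs >= band[0]) & (freqs <= band[1])
    return freqs[in_band & (excess_db > threshold_db)]

# Example: a 440 Hz tone with an exaggerated 3 kHz resonance.
sr = 44100
t = np.arange(sr) / sr
tone = np.sin(2 * np.pi * 440 * t) + 0.5 * np.sin(2 * np.pi * 3000 * t)
print(suggest_harsh_frequencies(tone, sr)[:5])  # frequencies near 3 kHz
```

A real product would analyse short overlapping frames of the mix and combine the results over time, but the principle of analyse, suggest, and let the engineer decide stays the same.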

Jay Tuck raises some pointed concerns about AI in a TEDx talk he delivered. Beyond his worry that programmers are often unaware of what their own code is doing, AI is now also making its way into defense systems.

A neural network is one of the core building blocks of artificial intelligence and machine learning. Loosely inspired by the functioning of the human brain, it essentially allows a computer to learn from examples and solve problems.
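
As an illustration of what "learning" means here, below is a minimal sketch of a tiny neural network in Python/NumPy that teaches itself the XOR function from four examples. It is orders of magnitude simpler than the models behind tools like AIVA or LANDR, but the principle, adjusting weights to reduce the error on training data, is the same.

```python
# A toy neural network (2 inputs, 4 hidden units, 1 output) trained on XOR.
import numpy as np

rng = np.random.default_rng(0)

# Four training examples: inputs and target outputs for XOR.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Randomly initialised weights and biases.
W1 = rng.normal(size=(2, 4)); b1 = np.zeros((1, 4))
W2 = rng.normal(size=(4, 1)); b2 = np.zeros((1, 1))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for step in range(20000):
    # Forward pass: compute the network's current predictions.
    hidden = sigmoid(X @ W1 + b1)
    output = sigmoid(hidden @ W2 + b2)

    # Backward pass: propagate the error and nudge every weight a little.
    grad_out = (output - y) * output * (1 - output)
    grad_hidden = (grad_out @ W2.T) * hidden * (1 - hidden)
    W2 -= 0.5 * hidden.T @ grad_out
    b2 -= 0.5 * grad_out.sum(axis=0, keepdims=True)
    W1 -= 0.5 * X.T @ grad_hidden
    b1 -= 0.5 * grad_hidden.sum(axis=0, keepdims=True)

print(np.round(output, 2))  # typically converges toward [0, 1, 1, 0]
```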

In the book Sound System Engineering, 4th Edition, Don Davis suggests that audio systems will eventually be controlled entirely by the human mind. In 2011, IBM announced its plan to link the human brain to computers and smartphones; one example it gave is the ability to simply "will" your cursor's movement without physically using a mouse. Davis also suggests that Orwell was off by only about 100 years, with the means of human control he envisioned becoming achievable by around 2080.
