Elon Musk, CEO of Tesla Motors and co-founder of PayPal, argues that a new generation of machine learning algorithms is a serious threat to humanity.
Musk says his early investment in DeepMind, now owned by Google, was made so that he could keep an eye on the company's projects. This form of responsible investment sends a strong message. Musk, like many others, is worried that artificial intelligence could bring the Terminator films to life. Stephen Hawking believes that dismissing films such as Johnny Depp's Transcendence as just science fiction could be the worst mistake in history.
These kinds of comments leave some computer scientists rolling their eyes. We are a long way from the kind of artificial intelligence portrayed on the big screen. Today's algorithms don’t exhibit the independent intelligence needed to realise these sci-fi fears. At present they are a clever way to solve well-defined problems: optimising your inbox, creating more enjoyable video games or improving fitness levels. Amongst the tools developed to do this, it is machine learning that has captured most imaginations.
Machine learning algorithms can find trends and generate predictive models for large, heterogeneous data sets in short time frames, where traditional data analysis would struggle. They find the categories that best fit your emails, depending on which ones you normally read, ignore or reply to. The best programmes then hide an email or prompt you to respond based on this information. They are called learning algorithms because they improve their best-fit models when new data - or new emails - come along. Far more than simply screening your inbox, machine learning algorithms are being used to create intelligent personal AI assistants, such as Viv Labs' current project, Viv, which aims to provide more joined-up predictive and response capabilities than current systems such as Siri or Google Now.
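The email example above can be sketched in a few lines of code. The following is a minimal, illustrative naive Bayes classifier, not any product's actual implementation; the labels ("respond", "ignore") and word-list inputs are hypothetical. The key point it demonstrates is the "learning" step: each new labelled email updates the model's counts, so its best-fit predictions improve as data arrives.

```python
from collections import defaultdict
import math


class EmailClassifier:
    """Toy naive Bayes sketch of an email-sorting learner (illustrative only)."""

    def __init__(self):
        # word_counts[label][word] = times `word` appeared in emails with `label`
        self.word_counts = defaultdict(lambda: defaultdict(int))
        self.label_counts = defaultdict(int)

    def learn(self, words, label):
        # Incremental update as each new labelled email arrives --
        # this refitting on new data is what makes it a "learning" algorithm.
        self.label_counts[label] += 1
        for w in words:
            self.word_counts[label][w] += 1

    def predict(self, words):
        total = sum(self.label_counts.values())
        vocab = {w for counts in self.word_counts.values() for w in counts}
        best_label, best_score = None, float("-inf")
        for label, n in self.label_counts.items():
            # log prior + log likelihood with add-one smoothing
            score = math.log(n / total)
            denom = sum(self.word_counts[label].values()) + len(vocab)
            for w in words:
                score += math.log((self.word_counts[label][w] + 1) / denom)
            if score > best_score:
                best_label, best_score = label, score
        return best_label


clf = EmailClassifier()
clf.learn(["meeting", "agenda", "reply"], "respond")
clf.learn(["sale", "discount", "unsubscribe"], "ignore")
print(clf.predict(["quarterly", "meeting"]))  # words overlap "respond" emails
```

A production system would of course use far richer features and models, but the loop is the same: observe a labelled example, update the model, predict on the next email.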
This Hot Topics event was a roundtable discussion for those working on machine learning technology, investment and policy.
For further information, please contact us at: [email protected].