March 24th, 2017

Across the mix of fascinating talks and panels at this year’s SXSW, one topic consistently permeated discussions. From autonomous cars and health care to filmmaking and design, the future of Intelligence, both Artificial (AI) and Human (HI), dominated the conversation as the biggest trend to emerge from the conference.

There’s been much buzz recently about the approaching Fourth Industrial Revolution, an era with the potential to drastically improve the way we live, work and connect through technologies that fuse the physical, digital and biological. But to get there, we’ll need to push beyond our current technological limitations, and AI will be a key driver in that evolution.

The State of AI: It’s Not Artificial Intelligence, It’s Augmented Intelligence

From SXSW’s Abby (chatbot) to Amazon’s Alexa (virtual assistant) to IBM’s Watson (cognitive platform), the current state of AI is focused on helping humans achieve a specific task rather than replacing them. And while some speakers expressed dystopian visions of the future and warnings of potential abuse, especially considering the current political climate, there were also positive predictions of how AI will drive growth and creativity and “inspire 1,600-plus startups and $12 billion in funding by 2020.”

Several recent advances have driven the rise of AI: the explosion of data, a massive increase in computational power and research breakthroughs in deep learning. Most AIs are based on programs called neural networks, a form of machine learning inspired by the biology of our brains. Using this approach, an AI can learn to perform a particular task, like playing chess, through trial and error. But it cannot learn another game without overwriting its chess-playing skills, a problem known as “catastrophic forgetting”. While Google’s DeepMind is bringing us a step closer to learning like humans and building complex skillsets, we’re still years away from Artificial General Intelligence (AGI).
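To make catastrophic forgetting concrete, here is a minimal, purely illustrative sketch (not DeepMind’s method), assuming Python and scikit-learn: a small neural network is trained on one synthetic task, then trained only on a conflicting one, and its accuracy on the first task collapses because the new weights overwrite the old.

```python
# Purely illustrative sketch of catastrophic forgetting (not DeepMind's method).
# A small neural network learns synthetic task A, is then trained only on a
# conflicting task B, and loses most of its task A accuracy in the process.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)

def make_task(weight):
    """Synthetic two-class task; 'weight' changes the labelling rule."""
    X = rng.normal(size=(1000, 20))
    y = (X[:, 0] + weight * X[:, 1] > 0).astype(int)
    return X, y

X_a, y_a = make_task(weight=1.0)    # task A
X_b, y_b = make_task(weight=-1.0)   # task B, which conflicts with A

net = MLPClassifier(hidden_layer_sizes=(32,), max_iter=300, random_state=0)
net.fit(X_a, y_a)
print("Task A accuracy after training on A:", net.score(X_a, y_a))

# Keep the learned weights (warm_start) but now train only on task B.
net.set_params(warm_start=True)
net.fit(X_b, y_b)
print("Task A accuracy after training on B:", net.score(X_a, y_a))  # drops sharply
print("Task B accuracy after training on B:", net.score(X_b, y_b))
```

DeepMind’s recent work tackles exactly this: its elastic weight consolidation technique slows learning on the weights that matter most for earlier tasks, letting a network pick up new skills without wiping out old ones.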

Illustration via Google DeepMind

Ray Kurzweil, Google’s Director of Engineering, predicts that computers will have human-level intelligence by 2029, with the ‘Singularity’ following in 2045. The godfather of futurism discussed AI’s current limitations at SXSW, stating, “We will make a great leap forward when AIs can read comic books.” That will signify that they’ve attained the human ability to infer action and meaning between two points by linking together a history of experiences or stories.

AI as a Tool for Creativity

While agencies like AKQA and JWT are experimenting with AI as a tool to assist creatives, others are investing further. After debuting the first AI-created short film, ‘Eclipse’, at the Saatchi & Saatchi New Directors showcase in Cannes last summer, Team One recently launched an AI Lab as a dedicated space to experiment and brainstorm.

IBM’s Watson shows how AI can be used as a tool to create art. It provided analysis, insight and inspiration for Grammy Award-winning producer Alex Da Kid to write a song [Cognitive Music], and more recently it helped create the first ‘thinking sculpture’, inspired by Gaudí and developed with Watson’s cognitive technology for the Mobile World Congress in Barcelona, Spain.

The consensus at SXSW appears to be that we haven’t yet figured out how to get machines to truly harness our creativity and emotion. During the SXSW session ‘Humans, Machines and the Future of Industrial Design’, Philippa Mothersill of MIT’s Media Lab expressed a similar opinion, and along with IDEO’s Jason Robinson, the panel posed the question: How do we teach computers to think creatively for themselves?

Since many current AIs use a voice interface for interaction, many AI-related sessions focused on language. Mothersill, however, is particularly focused on exploring ways for computers to design objects that are expressive and communicate through their physical attributes. She has created tools like the EmotiveModeler, an Emotive Form Design CAD Tool that “integrates our unconscious understanding and emotive perception of shapes into a CAD tool that allows both novice and expert designers to use only descriptive adjectives and emotions to design objects whose forms communicate emotive character.”
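As a rough illustration of the adjective-to-form idea (this is not the EmotiveModeler’s actual code, and the adjectives, parameters and weights below are invented), a tool along these lines might blend a baseline shape with small parameter adjustments contributed by each descriptive word:

```python
# Hypothetical sketch of mapping descriptive adjectives to shape parameters.
# This is not the EmotiveModeler's implementation; the adjectives, parameters
# and weights are invented purely to illustrate the general idea.

BASELINE = {"taper": 0.0, "edge_sharpness": 0.5, "curvature": 0.5, "symmetry": 1.0}

ADJECTIVE_EFFECTS = {
    "aggressive": {"edge_sharpness": +0.4, "taper": +0.3, "curvature": -0.2},
    "calm":       {"edge_sharpness": -0.3, "curvature": +0.3},
    "playful":    {"symmetry": -0.4, "curvature": +0.2},
}

def describe_form(adjectives):
    """Blend the baseline parameters with each adjective's effects (clamped to [-1, 1])."""
    params = dict(BASELINE)
    for word in adjectives:
        for key, delta in ADJECTIVE_EFFECTS.get(word, {}).items():
            params[key] = max(-1.0, min(1.0, params[key] + delta))
    return params

# A designer types only words; the tool returns the geometry to build.
print(describe_form(["aggressive", "playful"]))
```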

We have a ways to go before AI fully replaces creatives, but according to Kurzweil’s predictions about the future of art and AI, we’re progressing towards a very interesting future where we will merge with AI.

Keeping Up with the Machines: Human Intelligence

If AI is the future, then we still have one frontier to conquer: the human brain. Many of the most fascinating panels included a neuroscientist sharing their expert perspective on everything from advertising to athletics.

Along with many in Silicon Valley, entrepreneur Bryan Johnson believes “neurotechnology” could be the next big thing, and that to tackle the biggest problems facing humanity today, our brains will need an upgrade. That is why he founded Kernel, a startup “building advanced neural interfaces to treat disease and dysfunction, illuminate the mechanisms of intelligence, and extend cognition.” Kernel’s goal is to give humans the opportunity to co-evolve alongside machines and perhaps, eventually, outperform them. The startup is focusing initially on gaining a deeper understanding of the brain, then moving towards augmenting it so that we can interface directly with computers while also becoming smarter and healthier.

Kernel’s work on memory implants and brain interfaces will not arrive in the very near future; obstacles include the need for invasive brain surgery and the fact that electronics can irritate brain tissue and stop working after a while. But even if the full vision sits far off, there are many fascinating projects and products already moving us towards a more intelligent future.

Controlling the Machines… and Our Emotions

Futurist and author Richard Yonck posited a very interesting question during the SXSW session ‘The Future of Emotional Machines’: What happens when our machines understand and react to our emotions? Beginning with the premise that emotion is the most basic and natural form of communication, Yonck believes it will be at the heart of how we will soon work with and use computers in the coming age of artificial emotional intelligence, an era in which our technologies are able to read, interpret, predict and even influence our emotions.

We will see an ecosystem of emotionally aware products and services develop, giving rise to new opportunities in the resulting ‘emotion economy’. But to take the next giant step in the relationship between humans and technology, we will need the appropriate tools.

Speaking on ‘Brain Wearables’ at SXSW, Tan Le, founder of EEG wearables company Emotiv, demonstrated how we are equipping ourselves to control our machines and environments with our minds, while also collecting data to improve the most sophisticated machine in the universe: the brain.

The audience was treated to a live demo that hinted at the tool’s potential. Using the device, a volunteer succeeded in moving a robotic ball with the power of thought alone. According to Le, “There is no theoretical limit for the range of complex ideas our thoughts can produce, so the applications are only limited by the imagination of the researchers who work on it.”

There are two major categories for this technology:

1. A man/machine interface to control devices and our physical world with our thoughts. A machine learning algorithm searches for and maps repeated patterns; from there, the EEG system recognizes the familiar brainwave patterns for specific commands and translates them into instructions for our devices (a minimal sketch of this idea follows the list).

2. A monitoring tool that collects and analyzes brain signals so we can train and improve ourselves. The ability to read and analyze this data will help us figure out how to become happier, calmer and more focused.
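Here is a minimal, hedged sketch of the first category, assuming Python and scikit-learn. It is not Emotiv’s SDK, and the feature vectors and command names are synthetic stand-ins; the point is simply that a classifier learns the repeated patterns a user produces for each mental command, then translates new readings into device instructions.

```python
# Hypothetical sketch of category 1 (not Emotiv's SDK): learn a user's repeated
# brainwave patterns per mental command, then map new readings to commands.
# The 'EEG' features below are synthetic stand-ins for real band-power data.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(42)
COMMANDS = ["neutral", "push", "lift"]   # hypothetical device commands

def record_training_session(command_idx, trials=60, channels=14):
    """Stand-in for features recorded while the user repeatedly imagines one command."""
    signature = rng.normal(size=channels)                        # that command's pattern
    X = signature + rng.normal(scale=0.5, size=(trials, channels))
    y = np.full(trials, command_idx)
    return X, y

sessions = [record_training_session(i) for i in range(len(COMMANDS))]
X = np.vstack([s[0] for s in sessions])
y = np.concatenate([s[1] for s in sessions])

clf = RandomForestClassifier(random_state=0).fit(X, y)

def to_device_command(reading):
    """Translate one new feature vector into a device instruction."""
    return COMMANDS[clf.predict(reading.reshape(1, -1))[0]]

# A fresh reading drawn near the 'push' pattern should be recognized as 'push'.
sample = sessions[1][0][0] + rng.normal(scale=0.1, size=14)
print("Detected command:", to_device_command(sample))
```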

The Mill NY’s Executive Creative Director Rama Allen and a team of creative technologists created a VR experience with this concept in mind. STRATA is a responsive VR experience designed to teach us to calm and focus our minds. Based on biofeedback techniques, the visuals within STRATA respond and react to the user’s physiological and neurological data, which in turn helps create awareness of their autonomic nervous system. The goal is to use our own biometrics as a controller: by calming ourselves, we levitate upwards through five fantastical worlds.
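As a rough sketch of that biofeedback loop (this is not The Mill’s STRATA code; the metric, thresholds and update rule below are hypothetical), a calm score derived from the user’s readings could drive altitude, rising while the user stays calm and sinking back when they don’t:

```python
# Hypothetical biofeedback loop in the spirit of STRATA (not the actual code):
# a calm/focus score computed from the user's readings drives their altitude.
# The metric, thresholds and update rule are invented for illustration.

def calm_score(heart_rate_bpm, eeg_alpha_power):
    """Crude stand-in metric: lower heart rate and stronger alpha waves -> calmer."""
    hr_component = max(0.0, min(1.0, (90.0 - heart_rate_bpm) / 40.0))
    return 0.5 * hr_component + 0.5 * max(0.0, min(1.0, eeg_alpha_power))

def update_altitude(altitude, heart_rate_bpm, eeg_alpha_power, dt=1.0):
    """Rise while calm (score above 0.5), drift back down while agitated."""
    score = calm_score(heart_rate_bpm, eeg_alpha_power)
    velocity = (score - 0.5) * 2.0          # -1 (agitated) .. +1 (calm)
    return max(0.0, altitude + velocity * dt), score

# Simulated readings as the user gradually relaxes: the altitude climbs.
altitude = 0.0
for hr, alpha in [(88, 0.2), (80, 0.4), (72, 0.6), (65, 0.8), (62, 0.9)]:
    altitude, score = update_altitude(altitude, hr, alpha)
    print(f"calm={score:.2f}  altitude={altitude:.2f}")
```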

The Intelligent Revolution is Coming

This year’s SXSW conference painted a fascinating vision of a future where the merging of man and machine will drive the next phase in human evolution. In contrast to previous phases in human history, this change will be driven by our own ideas, creativity and progress. Given the current state of the world and the fear of ‘future robot overlords’, it’s easy to see why that may seem frightening, and how the technology could be abused. On the other hand, given its potential to solve many of mankind’s issues, from the environment to the way we work, the opportunity may just be too great to ignore.