We have all seen the films, read the comics or been awed by the prophetic books, and from them we think we have developed a pretty good understanding of artificial intelligence. So if artificial intelligence is represented by, say, 2001: A Space Odyssey’s HAL 9000, or the talking operating system you fall in love with in Her, why have an entire edition of BLINK dedicated to it? The answer, of course, is that that’s not all there is.
The truth is that “artificial intelligence” is a broad term that refers to much more than what has been popularized in pop culture... and it’s a branch of science that will have irreversible impact on the marketing world. Before we get into how, here is a short glossary of terms and some background information that will help you get the most out of this issue.
Terms you need to know
Deep Learning or Machine Learning
A way of talking about the algorithms computers use to build their own models from example input. It describes a computer that can move beyond explicitly scripted code and learn from experience, just as humans do. Deep learning is also sometimes referred to simply as neural networks.
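To make "building a model from example input" concrete, here is a minimal sketch in Python. The data and the one-parameter model are illustrative assumptions, not anything from the article: instead of hard-coding the rule y = 2x, the program infers the multiplier from examples.

```python
# Hypothetical example data: each pair is (input, observed output),
# secretly following the rule y = 2x.
examples = [(1, 2), (2, 4), (3, 6), (4, 8)]

w = 0.0  # the model's single learned parameter, starting from scratch
for _ in range(200):              # pass over the examples many times
    for x, y in examples:
        prediction = w * x
        error = prediction - y
        w -= 0.01 * error * x     # nudge w to shrink the error

print(round(w, 2))  # the learned multiplier, close to 2.0
```

The point of the sketch is that the rule never appears in the code; it emerges from repeated small corrections against the examples, which is the essence of machine learning.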
Artificial Intelligence
This is a term that covers a lot of ground.
Here are a few ways to think about it:
Ambient intelligence: the digital capability or data that comes from our surroundings. It describes moments where the physical environment interacts intelligently and unobtrusively with people. It can relate to the Internet of Things and informed objects, or to unintended data sources, such as using weather forecasts to inform planning.
Automated intelligence: intelligence gained by automating processes. It resembles specialized artificial intelligence when a computer takes over those processes and performs them faster and more cost-efficiently.
Augmented intelligence: human intelligence boosted by sensors and other digital capabilities, for example making better decisions based on tracked behavior.
Natural Language Processing
A type of artificial intelligence that focuses on enabling machines to read and understand human language with the aim of enhancing communication between person and machine.
Neural Networks
Algorithms created to simulate the way layers of neurons in the brain work, making learning in computers possible. Simple neural networks are limited to a hundred or so neurons, usually organized in a single layer. Advanced neural networks, such as the one behind Google Search, can simulate the effects of billions of neurons and are usually organized in hierarchical layers.
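The layered structure described above can be sketched in a few lines of Python. This is a toy illustration with made-up weights, not a trained model: two inputs feed a hidden layer of two simulated neurons, whose outputs feed a single output neuron.

```python
import math

def neuron(inputs, weights, bias):
    # weighted sum of inputs, squashed to a value between 0 and 1
    total = sum(i * w for i, w in zip(inputs, weights)) + bias
    return 1 / (1 + math.exp(-total))

inputs = [0.5, 0.8]                          # two input signals
hidden = [neuron(inputs, [0.4, 0.6], 0.1),   # hidden layer: 2 neurons
          neuron(inputs, [-0.3, 0.9], 0.0)]
output = neuron(hidden, [1.2, -0.7], 0.2)    # output layer: 1 neuron

print(output)  # a single number between 0 and 1
```

Stacking many such layers, with the weights learned from data rather than hand-picked, is what turns this toy into the deep networks behind modern systems.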
Virtual Assistants (VA) or Virtual Private Assistants (VPA)
Handy information aggregators that are increasingly useful and supportive. Android has Google Now, Apple has Siri, Windows has Cortana, Baidu has Duer and Facebook has M. This is the current artificial intelligence battlefield. It’s predicted that VPAs will have a huge impact on marketing and could fundamentally change consumer behavior and the consumer journey.
Law of Accelerating Returns
Coined by Ray Kurzweil in 1999, this formula describes how certain types of progress, including human progress, are exponential: they move quicker as time goes by, because more advanced societies are able to progress at a faster rate than less advanced ones.
Turing Test
Proposed by British mathematician Alan Turing to provide a satisfactory operational definition of intelligence and to assess whether a computer can pass for a human. In the test, an interrogator questions both humans and computers; the computer passes only if the interrogator cannot determine who or what is responding.
Exponential Growth
Growth that multiplies by a constant factor at each step rather than increasing by a constant amount. Starting from 1 and doubling, it takes just 30 steps to reach 1,073,741,824. The increase is slow at first, but the last step alone adds more than 500 million (536,870,912 for the detail-oriented).
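The arithmetic in the definition above is easy to verify directly: start at 1 and double 30 times.

```python
# Reproduce the doubling sequence from the definition above.
value = 1
for step in range(30):
    previous = value   # remember the value before the final doubling
    value *= 2

print(value)             # 1,073,741,824 after 30 doublings
print(value - previous)  # 536,870,912 added by the last step alone
```

The last step contributes as much as all the previous steps combined, which is why exponential trends feel slow for a long time and then suddenly overwhelming.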
Singularity
Introduced in a technological context by science fiction author and futurist Vernor Vinge in 1987: “It’s a problem we face every time we consider the creation of intelligences greater than our own. When this happens, human history will have reached a kind of singularity—a place where extrapolation breaks down and new models must be applied—and the world will pass beyond our understanding.” Ray Kurzweil predicts we will reach the singularity in 2045.
Moore’s Law
In 1965, Intel co-founder Gordon Moore extrapolated that computing would dramatically increase in power and decrease in relative cost at an exponential pace. The law describes the decades-long trend in computer hardware whereby the number of transistors on an integrated circuit doubles roughly every two years.
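As a rough worked example of what "doubling every two years" compounds to, here is a sketch assuming the commonly cited starting point of Intel's first microprocessor (the 4004, about 2,300 transistors in 1971); the starting figures are an assumption for illustration, not from the article.

```python
# Moore's observation as simple arithmetic: double every two years.
transistors = 2300                 # assumed 1971 starting count
for year in range(1971, 1991, 2):  # ten doublings across 20 years
    transistors *= 2

print(transistors)  # over two million: roughly a thousandfold in two decades
```

Ten doublings multiply the count by 1,024, which is why two decades of this trend turns thousands of transistors into millions.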
Robot
The physical embodiment of artificial intelligence, in either the narrow or the super-intelligent sense.
People you need to know
RAY KURZWEIL
Inventor, thinker, author and futurist. Described by The Wall Street Journal as “the restless genius,” Kurzweil is known for his many inventions (including the first CCD flatbed scanner and print-to-speech reading machine for the blind) and his 30-year track record of accurate technological predictions. He has received 20 honorary doctorates as well as honors from three US presidents. Kurzweil has also authored seven books, five of which have been best sellers. He is currently a director of engineering at Google, heading up a team developing machine intelligence and natural language understanding. Kurzweil believes that humans should have a positive outlook on AI.
ELON MUSK
CEO of Tesla Motors, entrepreneur and visionary, Musk is worried that artificial intelligence will destroy humanity. To protect our civilization, he has donated $10 million toward research that will keep artificial intelligence safe, and he advocates proactively confronting its dangers.
NICK BOSTROM
Author and professor of philosophy at Oxford University, Bostrom is the founding director of the Future of Humanity Institute. He is the author of more than 200 publications, including the academic book “Superintelligence: Paths, Dangers, Strategies”, a New York Times best seller. Bostrom regards artificial intelligence as a “transgenerational, global existential risk”.
STEPHEN HAWKING
Physicist and cosmologist. Hawking, the subject of the film “The Theory of Everything”, advocates a careful approach to artificial intelligence development and expects that “computers will overtake humans with artificial intelligence at some point within the next 100 years.”