Artificial Intelligence (Semi-) Simply Defined

Sinu
Sep 12, 2017

In Stanley Kubrick’s epic 1968 sci-fi film 2001: A Space Odyssey, astronaut Dave Bowman asks the rogue supercomputer HAL to let him back inside the spacecraft. HAL responds, “I’m sorry, Dave. I’m afraid I can’t do that.”

Fast forward to today, when Artificial Intelligence (AI) is no longer science fiction but a reality. In a Slate article, Adam Elkus, a Ph.D. student in computational social science at George Mason University and a fellow at New America’s Cybersecurity Initiative, tested how Alexa would respond to a reference to the sci-fi film. He writes: “When you say ‘HAL, open the pod bay doors,’ Alexa responds by not only mimicking the first part of HAL’s response — she also reminds you that she is not HAL and we’re not in space.” Check out Elkus’ video of Alexa’s “snippy” interaction and see if you don’t have flashbacks to 2001.

Alexa may have been programmed with a few pithy comebacks, but AI is still in its infancy and is expected to surge over the next few years. According to a new report from Tractica, interest in implementing AI systems is growing among companies and institutions around the world, and the revenue generated from the direct and indirect application of AI software is predicted to grow from $1.4 billion in 2016 to $59.8 billion by 2025.

But what exactly is Artificial Intelligence, anyway?

AI, broadly defined as an information system inspired by a biological one, is an umbrella term that covers multiple technologies, such as machine learning, deep learning, computer vision, natural language processing (NLP), machine reasoning, and strong AI. In simple terms, AI is detailed programming that allows us to teach machines to simulate human activity.

Most modern forms of AI are built on artificial neural networks, the foundation for learning. The concept of neural networks goes back to the 1950s, the same decade in which a workshop at Dartmouth College marked the beginning of AI as a field of research. In a nutshell, these networks are a way of structuring a computer so that it resembles a brain: a web of interconnected, neuron-like nodes. Individually these nodes are dumb, answering only extremely basic questions, but collectively they can tackle difficult problems. More importantly, with the right algorithms, they can be taught.
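
To make the idea concrete, here is a minimal sketch of such a web of nodes: a tiny two-layer network, written in Python with NumPy, that teaches itself XOR, a classic toy problem no single node can solve. The 2-4-1 layout and the training loop are illustrative choices for this sketch, not how production systems are built.

```python
import numpy as np

rng = np.random.default_rng(0)

# Four example inputs and their XOR labels.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Two layers of "dumb" nodes: 2 inputs -> 4 hidden nodes -> 1 output.
W1, b1 = rng.normal(size=(2, 4)), np.zeros(4)
W2, b2 = rng.normal(size=(4, 1)), np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 1.0  # learning rate
for _ in range(10_000):
    # Forward pass: each node sums its inputs and squashes the result.
    hidden = sigmoid(X @ W1 + b1)
    output = sigmoid(hidden @ W2 + b2)

    # Backward pass: nudge every connection to shrink the error.
    grad_out = (output - y) * output * (1 - output)
    grad_hid = (grad_out @ W2.T) * hidden * (1 - hidden)
    W2 -= lr * hidden.T @ grad_out
    b2 -= lr * grad_out.sum(axis=0)
    W1 -= lr * X.T @ grad_hid
    b1 -= lr * grad_hid.sum(axis=0)

print(np.round(output.ravel(), 2))  # approaches [0, 1, 1, 0]
```

Individually, each node only weighs and squashes numbers; it is the repeated nudging of the connections between them that lets the web as a whole learn the pattern.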

There are two common terms that are key to understanding the different technologies that use AI:

  • Machine Learning: Humans teach the machine to perform a specific task, and new software makes it easier for the machine to keep learning as new data accumulates and is analyzed. An example is when Gmail accurately identifies and automatically filters spam out of your inbox (a minimal sketch of this idea follows this list).
  • Deep Learning: Built on complex mathematics, it is designed to function much like the human brain. Instead of relying on task-specific algorithms, its methods are based on learning representations of the data. Over time, scientists have stacked layers of mathematical neural networks, giving machines higher functionality and a better capacity to predict outcomes from data sets. Whether it’s autonomous vehicles, better movie recommendations from Netflix, or more advanced preventive health care, deep learning holds the most promise for the next generation of breakthroughs.
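
To ground the machine learning bullet, here is a hedged sketch of a spam filter in the same spirit as the Gmail example, using scikit-learn’s Naive Bayes classifier. The four toy emails are invented for illustration; Gmail’s actual filter is far more sophisticated and proprietary.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB

# A made-up training set of labeled emails.
emails = [
    "win a free prize now", "cheap meds limited offer",      # spam
    "meeting moved to 3pm", "lunch with the team tomorrow",  # ham
]
labels = ["spam", "spam", "ham", "ham"]

# Turn each email into word counts, then fit the model on the labels.
vectorizer = CountVectorizer()
model = MultinomialNB()
model.fit(vectorizer.fit_transform(emails), labels)

# The model generalizes to messages it has never seen.
tests = ["claim your free prize", "see you at the meeting"]
print(model.predict(vectorizer.transform(tests)))  # ['spam' 'ham']
```

No one wrote a rule saying “prize means spam”; the model inferred it from the labeled examples, and it keeps improving as more labeled mail accumulates.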

Because AI has become such a buzzword these days, it helps to understand what actually is, and is not, AI. Looking at the variety of ways we interface with software makes it easier to distinguish basic programming algorithms, which may use incredible amounts of data and raw processing to simulate some AI outcomes, from actual AI software designed to learn on its own.

  • Email. Your computer software simply syncs with a server’s software to exchange information.
  • Smartphone Gaming Apps. Self-contained game apps work directly on your phone, independent of an outside server or cloud.
  • Social Media Phone Apps. Most social media apps interface with an enormous cloud infrastructure.
  • Pokémon Go. Distinct from self-contained phone games, Pokémon Go is an augmented reality app. The app is constantly connected to a massive cloud infrastructure while also using a combination of phone inputs (e.g., GPS, camera) and cloud processing to augment your experience and integrate it with your immediate surroundings, whether by creating virtual PokéStops or placing an animated character right into your environment through your camera.
  • Chess. When IBM’s Deep Blue beat world chess champion Garry Kasparov, it wasn’t because the computer was “thinking.” It had “memorized” tens of thousands of chess games and used raw processing power to beat Kasparov. It was a defining moment in the computer-versus-human-brain story, but it was still not deep learning AI.
  • Evolving Software. This is software programmed with basic instructions that learns to improve itself.
  • Autonomous Driving. Driving requires significant pattern recognition, making deep learning the only realistic short-term path to autonomous driving. Whether it is recognizing a pedestrian stepping off the sidewalk, monitoring speed, or telling a green light from a red one, a self-driving car must be able to take in the input, process its significance, and issue a response command within milliseconds to be effective (see the sketch after this list).
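
As a rough illustration of that sense-process-act loop, here is a hypothetical Python sketch. Every function and value below is an invented stand-in; a real self-driving stack involves sensor fusion, trained perception models, and planning far beyond this.

```python
import time

def read_sensors():
    # Stand-in for camera, GPS, and radar input; returns dummy values.
    return {"light": "red", "pedestrian_ahead": False, "speed_mph": 28}

def decide(state):
    # Process the significance of the inputs and pick a command.
    if state["pedestrian_ahead"] or state["light"] == "red":
        return "brake"
    if state["speed_mph"] < 30:
        return "accelerate"
    return "hold"

# Each cycle must finish in milliseconds for the car to react in time.
for _ in range(3):
    start = time.perf_counter()
    command = decide(read_sensors())
    elapsed_ms = (time.perf_counter() - start) * 1000
    print(f"{command} (decided in {elapsed_ms:.3f} ms)")
```

In a real car, the decide step is where deep learning does the heavy lifting, turning raw camera pixels into “pedestrian” or “red light” fast enough for the loop to stay within its time budget.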

AI research has evolved slowly since it began in the ’50s, but has heated up over the past few years with companies such as Google, Facebook and Microsoft making huge strides.

In fact, AI is being built into many of the tech tools and services we use every day. Virtual assistants are becoming more commonplace in people’s homes and are “learning” to better recognize vocal commands. Alexa regularly adds new “skills,” from calling an Uber to ordering toilet paper. iPhone users see AI daily as emoji keyboard predictions and facial recognition both improve. Facebook is using AI to root out terrorist communications across its worldwide network.

“Artificial intelligence has applications and use cases in almost every industry vertical and is considered the next big technological shift, similar to past shifts like the industrial revolution, the computer age, and the smartphone revolution,” says Tractica research director Aditya Kaul.

Meanwhile, AI remains a heavily debated topic, with Stephen Hawking, Bill Gates, and Elon Musk calling for it to be regulated and monitored so that it does not slip beyond human control like HAL in 2001: A Space Odyssey.

Sinu is a technology managed service provider with offices in New York City and Washington DC. www.sinu.com