Monday, December 9

Cloudy with a chance of neurons: The tools that make neural networks work

Machine learning is really good at turning pictures of normal things into pictures of eldritch horrors. (credit: Jim Salter)

Artificial Intelligence—or, if you prefer, Machine Learning—is today's hot buzzword. Unlike many buzzwords that have come before it, though, this stuff isn't vaporware dreams: it's real, it's here already, and it's changing your life whether you realize it or not.

A quick overview of AI/ML

Before we go too much further, let's talk quickly about that term "Artificial Intelligence." Yes, it's warranted; no, it doesn't mean KITT from Knight Rider, or Samantha, the all-too-human unseen digital assistant voiced by Scarlett Johansson in 2013's Her. Aside from being fictional, KITT and Samantha are examples of strong artificial intelligence, also known as Artificial General Intelligence (AGI). On the other hand, artificial intelligence—without the "strong" or "general" qualifiers—is an established academic term dating back to the 1955 proposal for the Dartmouth Summer Research Project on Artificial Intelligence (DSRPAI), written by Professors John McCarthy and Marvin Minsky.

All "artificial intelligence" really means is a system that emulates problem-solving skills normally seen in humans or animals. Traditionally, there are two branches of AI: symbolic and connectionist. The symbolic approach relies on traditional rules-based programming, in which a programmer explicitly tells the computer what to expect and how to deal with it. The "expert systems" of the 1980s and 1990s were examples of symbolic (attempts at) AI; while occasionally useful, it's generally considered impossible to scale this approach up to anything like real-world complexity.
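To make the symbolic approach concrete, here is a minimal sketch of a rules-based classifier in the spirit of those expert systems. The function name, the rules, and the animal "facts" are all invented for illustration; a real expert system would encode thousands of such hand-written rules, which is exactly why the approach fails to scale.

```python
# A toy "expert system": every rule is written explicitly by a programmer.
# Nothing is learned from data; the system only knows what it was told.

def classify_animal(facts: set) -> str:
    """Apply hand-written if/then rules to a set of observed facts."""
    if "feathers" in facts:
        return "bird"
    if "fur" in facts and "meows" in facts:
        return "cat"
    if "fur" in facts:
        return "mammal"
    # Any situation the programmer didn't anticipate falls through here,
    # which is the core scaling problem of symbolic AI.
    return "unknown"

print(classify_animal({"fur", "meows"}))   # cat
print(classify_animal({"scales", "swims"}))  # unknown
```

Note that the connectionist branch (neural networks, the subject of this article) inverts this: instead of a programmer writing the rules, the system infers them from examples.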
