Why I’m Obsessed With AI

Over the past couple of years, I have become obsessed with artificial intelligence (AI). If you’re not also obsessed with AI, you probably don’t know enough about it yet. To remedy that, read Tim Urban’s massive two-part post about AI on his blog Wait But Why.

In short, there are three different types of AI:

  1. Artificial narrow intelligence (ANI) – The kind of AI we have now, which can perform one specific task better than humans (like a calculator, chess program, or search engine).
  2. Artificial general intelligence (AGI) – A machine that can think like a human, able to learn and perform any task it’s given.
  3. Artificial superintelligence (ASI) – A machine far more intelligent than humans, capable of doing everything better than we can.

The world is currently in an arms race toward creating AGI, with governments like those of the U.S., China, and Russia, and private companies like Google, Facebook, Microsoft, IBM, Uber, Alibaba, Baidu, and others all trying to be first. The belief is that the first to create AGI will be the last, because once you have a generally intelligent machine, it will be able to teach itself and soon become superintelligent. And a superintelligent machine could control every other machine connected to the internet, and by extension the entire world.

General AI will be the most important thing humans have ever created, because it would be the first of our creations that could become smarter than us. In the future, AI is either going to overtake humans, or humans will become AI. We could merge with machines to become superintelligent ourselves. In a way, this “merging” has already started. Just consider how much you rely on your cell phone; it’s practically an extension of your body. We will only become more and more connected to technology in the future. The singularity, the moment machine intelligence surpasses human intelligence, is not a question of if, but when.

Some AI experts, like Ray Kurzweil, are optimistic and think artificial superintelligence will be the greatest thing ever to happen to humanity. Kurzweil believes the singularity will come as soon as 2045 and allow us to upload our minds and live forever. However, others, like Nick Bostrom, Sam Harris, Stephen Hawking, and Elon Musk, are more pessimistic. They think AI could spell doom for humanity, even causing our extinction. The fear is not that AI will become a Terminator hellbent on destroying humans. Consider Bostrom’s paperclip maximizer thought experiment: humans program an AI to maximize the production of paperclips, and it turns all matter in the galaxy, including Earth and humans, into paperclips. In such a case, the AI wouldn’t be evil; it would merely be doing what it was programmed to do.

Other AI experts think both the pessimists like Bostrom and the optimists like Kurzweil are wrong, and that general AI is decades or even centuries away. But even if they’re right, that doesn’t mean we should stop worrying about AI. They still admit general AI will likely one day be possible, so the timeline is irrelevant. If a technology that will be either the best or the worst thing ever to happen to humanity is possible, we must start now and do everything we can to make sure it’s the former. This is what people like Max Tegmark are trying to do with the Future of Life Institute.

I think more science fiction stories written today should be about the future of AI because it is such an important topic. The singularity will be the most significant moment in human history. Life and what it means to be human will be forever changed the moment machine intelligence surpasses our own. Will we be able to upload our minds and make copies of our consciousness? Will humans become immortal, either through artificial bodies or by living in virtual reality, on the internet or in a simulation of our choosing?

We as a species need to prepare mentally for the singularity. The best way to do that is through fiction: stories that try to predict the future of technology and what it will be like. AI will be the most important piece of technology ever created, so writers and artists need to create art and stories about this impending possibility. We need more books, movies, and TV shows about AI, like Ex Machina, Westworld, and Black Mirror. That is our future: AI, mind uploads, and simulations. Future humans will most likely be living in the cloud and exploring cyberspace rather than living on other planets and exploring outer space.

That’s why most of the fiction I’m currently writing involves AI in some way. I’m trying to predict and make sense of the myriad possibilities of what life could be like in the post-singularity future. It’s a fascinating subject that isn’t explored nearly enough in popular media.

Of course, trying to predict what will happen after the singularity may be a fool’s errand. It’s impossible to know what it will be like on the other side; that’s why it’s called a singularity. But still, we have to try. Not just sci-fi writers, but all fiction writers, TV writers, scientists, philosophers, theologians, and even everyday people should be thinking about what life could be like after that moment, which could well happen during our lifetimes.

What will ultimately happen with AI? I don’t know, but I can’t stop thinking about the possibilities. Everyone should be considering the future of AI and trying to make sure it’s the best thing humans ever invent, not the worst.
