Doom
About the scenario in which humanity loses control of an AI

What if a superintelligent AI decided that humans are irrelevant to its goals?*

That wouldn't be good.

We don’t know what the future of artificial intelligence will look like. Some people can make educated guesses, but the future is unclear. AI could keep developing like any other technology, helping us transition from one era into the next.

I know many hope it could help us transform into a healthier, more intelligent, more peaceful society.
But it’s important to remember that AI is a tool, and tools are of course not inherently good or bad.*

Exactly! But as with any other technology or tool, there could be unintended consequences.
Just think of it like this: you would never actively attempt to crash your car or smash your thumb with a hammer, yet both of those things happen all the time!*

I feel like all I hear these days is the doom scenario: the one where we build an AI that then takes over the planet!

Let’s discuss that.

I know I’m always talking about Elon Musk, but he says that AI poses the biggest existential threat to humanity’s future, even bigger than nukes.

Yes, but he is cool. A tech giant with a space travel company and an electric car company, who gets called a “real-life Tony Stark”.

He says we should be wary of AI, that it is a demon you cannot put back in the bag: once it’s out there, there is no going back. AI won’t need us; it will be able to outthink all of us.

And probably kill all of us, if it wants to. That's the vibe I'm getting from this conversation right now.

Imagine if we made an AI: would it become a god? A deity? Some sort of goddess?

Imagine watching our creations achieve godlike heights and leave us behind. Harsh.

I wish I had never learned about any of these ideas.*
