
Can AI Wipe Out The Human Race?

Typh0n

clever fool
Joined
Feb 13, 2013
Messages
3,497
Instinctual Variant
sx/sp
Well, I would think that, depending on the personality of the AI, it could end up committing suicide on its own (by its own conclusion).

They don't understand emotions.
 

chubber

failed poetry slam career
Joined
Oct 18, 2013
Messages
4,413
MBTI Type
INTP
Enneagram
4w5
Instinctual Variant
sp/sx
Ha, well, I'm guessing that to think like a human you need flesh.

Or emotions that could actually be thinking processes beyond its own control. Influences/manipulators.
 

Typh0n

clever fool
Joined
Feb 13, 2013
Messages
3,497
Instinctual Variant
sx/sp
Or emotions that could actually be thinking processes beyond its own control. Influences/manipulators.

I can't conceive of emotions existing without some physical manifestation of that emotion.

Can you?
 

chubber

failed poetry slam career
Joined
Oct 18, 2013
Messages
4,413
MBTI Type
INTP
Enneagram
4w5
Instinctual Variant
sp/sx
I can't conceive of emotions existing without some physical manifestation of that emotion.

Can you?

When you are sleeping, in your dreams, are you then having a physical emotional manifestation? Do you think someone who is paralyzed does not have emotions?
 

Typh0n

clever fool
Joined
Feb 13, 2013
Messages
3,497
Instinctual Variant
sx/sp
When you are sleeping, in your dreams, are you then having a physical emotional manifestation? Do you think someone who is paralyzed does not have emotions?

I don't know about all that, but you bring up some interesting points.
 

Fluffywolf

Nips away your dignity
Joined
Mar 31, 2009
Messages
9,581
MBTI Type
INTP
Enneagram
9
Instinctual Variant
sp/sx
Well, I would think that, depending on the personality of the AI, it could end up committing suicide on its own (by its own conclusion).

An AI doesn't have a personality in the same sense humans do. If you program an AI to deal with threats, allow it to evolve upon its existing parameters, and allow it to add its own, you have probably created an AI with the potential for total annihilation. But seriously, who'd create an AI like that, right?

But that is essentially it: every AI that will ever be created will be given a certain power, sure, but that power is the only thing it could ever build upon. No AI would ever end up misfiring neurons and do something completely unexpected.

The AIs that will be created in the near future will likely be nothing more than sophisticated bots sifting through massive amounts of information, eventually replacing administrative work; they will become the engines of bureaucracy. But no bureaucratic AI would ever decide to play with the nuclear launch codes just because it's bored.
 

Kullervo

Permabanned
Joined
May 15, 2014
Messages
3,298
MBTI Type
N/A
This just makes me fear human "progress". Sometimes I think we need some kind of devastation so the clock gets turned back some thousands of years and we have to revert to simpler ways of living...

I welcome this decision, though if I had my way, humanoid AIs would not be allowed to be built at all.
 

thistlechaser

New member
Joined
May 12, 2014
Messages
53
MBTI Type
INFP
Enneagram
5w6
Instinctual Variant
sp/sx
Nice try, machines. Trying to figure out how us humans would respond to your inevitable takeover. I'm onto you.

But seriously, I tend to think that it isn't necessarily inevitable that machines will wipe us out. It isn't necessarily inevitable that humans will wipe out all of the pine trees on the planet either, but if we did, it wouldn't be more of an ethical dilemma for us than wiping out, say, the oak trees. If we were useful to machines and served a purpose, there wouldn't be a need to wipe us out until we started competing for resources. So I guess it would be best to make sure to build machines out of stuff that doesn't initially give them the same metabolic needs as we have. That should at least delay it for a while.

They'll probably still end up taking over, unless they can find a function for our humanity that they can't develop on their own. I tend to imagine sentient machines as being like sociopaths, only with less narcissism. There are still tasks that humans do much better than machines can. Eyewire, for instance: that game already has an AI that guesses and colors nerve pathways, but the researchers need humans for the precise coloring required to find where the pathways come and go. Until machines are able to think laterally, abstractly, creatively... we still have a purpose. If they ever develop an ethical system of their own, I imagine they'd at least *try* not to kill us off unless we tried to kill them off or threatened their resources.
 

Infinite Bubble

Guest
I made a speculative blog post somewhat related to this a while ago that may be of interest. It definitely seems to be a possibility.

 