Siúil a Rúin
As long as they are cute
Between UFOs flying over nuclear weapons bases, super-intelligent robots rising up against humanity, and Islamic jihad, I think it's safe to say the world is fucked.
I doubt robots will suddenly rise up and take over of their own accord. Just because they have intelligence doesn't mean they will have any ambition or compulsion to control anything.
I honestly didn't read anything in this thread other than the title. I've been thinking the exact same thing ever since I was a child. Maybe I watched too many science fiction movies over the years (The Matrix and The Terminator sagas, for example), but my opinion is set: the more humanity depends on technology to do essentially everything on our behalf, the closer we get to our own demise. And if that happens, it will be our fault. We cannot blame the technology, because the technology was made by us, and to reject any responsibility is to reject the reality of the threat and its cause. At that point you might as well say you shouldn't be taken seriously by anybody.

Unfortunately, it's too late for us to do anything now. Humanity has decided that technology makes better decisions than humans. Protecting everyone simply is not a possibility, but protecting yourself, or those willing to consider your way of thinking, certainly is. All you need to do is take steps to make sure you won't be a victim of technology when it betrays those who created it in the first place.
Hate to say it, but it will be the dumb f*cks (relative to intellectuals) who end up in control of this shit. Why? Because ideas are best understood and translated in real time, not on paper. But the process is slow. It's not some 1% conspiracy; it's mainly applications with a purpose that can be applied across formats... I don't think it's too late to reverse the process of technology making decisions for humans, but I just don't see how we will. We are too dependent on technology, mainly for economic reasons.
Islamic jihad is the dumbest world-ending threat ever. If the West handled Islamic jihadists the same way they try to handle the West, 90% of them would be gone in less than a month, and there would be little to no Islamic terror in the EU.
I don't believe in Skynet or any AI doomsday scenario. For one, I've always thought Asimov's Three Laws of Robotics are perfectly programmable. Not an expert on this at all, I must admit.
A lot of stupid people make a lot of children. That's my main concern.
The Three Laws necessarily lead to a sort of Skynet scenario; if it did nothing else, the Will Smith flick should have made that clear.
Idiocracy crossed with I, Robot? Now that would be interesting. The result? I think maybe WALL-E.
A machine can be intelligent.
But can it have a soul? Because if it has no soul, it also has no will and cannot make decisions independent of what a programmer decides. Which means it is still subject to humans, which is kinda what Forever was saying.
The real danger is that we humans, at least most of us, will become more and more subject to programming, due to our dependence on technology, and will become more and more like machines anyway.
Some people don't even believe that humans have souls.
Do they not believe in souls at all, or is this belief more complicated than that?