FTR Community Poker Forum

AI and the obsoletion of homo sapiens

  1. #1


it is my belief that homo sapiens will be critically endangered within the next couple centuries due to technological advancements so great that at least one new super species is created, provided that none of the other exit mundi scenarios befall us. this may be in the form of genetic manipulations to the point that we are no longer the same species, artificial intelligence with free thought in the same manner we wield it but with colossally greater computing power, and thus higher on the evolutionary chain than us, or integration of machines/internet into our minds or something.

i bet that if we ever came across a substantially more advanced civilization it would be purely mechanized and infused into one giant web of being, and would have been the technological offspring of an organic species. i imagine AI is the beginning of the end because a computer capable of learning and adapting to its surroundings will be more capable of creating even better AI, and increases in technology will become exponential rather than linear as they are now. right now we humans are working off of material gathered by previous humans of the same intelligence; however, soon enough AI will be able to create better AI which could create even better AI and on and on to the point that some AI species with 100k IQ reigns supreme.

    what say you?
  2. #2
    Jack Sawyer's Avatar
    Join Date
    Jan 2007
    Posts
    7,667
    Location
    Jack-high straight flush motherfucker
    RESISTANCE IS FUTILE
    BECOME ONE WITH THE BORG
    My dream... is to fly... over the rainbow... so high...


    Cogito ergo sum

    VHS is like a book? and a book is like a stack of kindles.
    Hey, I'm in a movie!
    https://www.youtube.com/watch?v=fYdwe3ArFWA
  3. #3
    spoonitnow's Avatar
    Join Date
    Sep 2005
    Posts
    14,219
    Location
    North Carolina
    There is no spoon
  4. #4
you know what pissed me off about that star trek movie with the borg was that the AI of the ship could create an automatic rifle out of thin air, but the great picard never thought to have it create more of them and swords and shit. then the movie woulda been nothing but the awesome massacre of the super slow and defenseless borg. woulda been so cool

    i seriously need to get rich so i can produce movies. nothing but zombie shit. good zombie shit too, not like the retarded garbage we have now.
  5. #5
    Jack Sawyer's Avatar
    Join Date
    Jan 2007
    Posts
    7,667
    Location
    Jack-high straight flush motherfucker
    higher beings eschew flesh and material
  6. #6
    pankfish's Avatar
    Join Date
    Sep 2007
    Posts
    854
    Location
    On Tony Romo's nuts


    Quote Originally Posted by wufwugy
it is my belief that homo sapiens will be critically endangered within the next couple centuries due to technological advancements so great that at least one new super species is created, provided that none of the other exit mundi scenarios befall us. this may be in the form of genetic manipulations to the point that we are no longer the same species, artificial intelligence with free thought in the same manner we wield it but with colossally greater computing power, and thus higher on the evolutionary chain than us, or integration of machines/internet into our minds or something.

i bet that if we ever came across a substantially more advanced civilization it would be purely mechanized and infused into one giant web of being, and would have been the technological offspring of an organic species. i imagine AI is the beginning of the end because a computer capable of learning and adapting to its surroundings will be more capable of creating even better AI, and increases in technology will become exponential rather than linear as they are now. right now we humans are working off of material gathered by previous humans of the same intelligence; however, soon enough AI will be able to create better AI which could create even better AI and on and on to the point that some AI species with 100k IQ reigns supreme.

    what say you?


    I really enjoy your thoughts when I'm drunk, I just wish there weren't so many words.
    <Staxalax> I want everyone to put my quote in their sigs
  7. #7
    Jack Sawyer's Avatar
    Join Date
    Jan 2007
    Posts
    7,667
    Location
    Jack-high straight flush motherfucker
    Quote Originally Posted by wufwugy
you know what pissed me off about that star trek movie with the borg was that the AI of the ship could create an automatic rifle out of thin air, but the great picard never thought to have it create more of them and swords and shit. then the movie woulda been nothing but the awesome massacre of the super slow and defenseless borg. woulda been so cool

    i seriously need to get rich so i can produce movies. nothing but zombie shit. good zombie shit too, not like the retarded garbage we have now.
    OMG ITS TEH GEORGE ROMERO OF POKER
  8. #8
    space marines are pretty awesome imo
    You-- yes, you-- you're a cunt.
  9. #9
    yea master chief's awesome
  10. #10
    nah, not those whitebread, new age, candy-ass space marines. Im talking terran firebats and alien murderin USCM's.
    You-- yes, you-- you're a cunt.
  11. #11
    shit this sounds serious i hope im dead by then.
    [11:11] <+bikes> bitches love your face
  12. #12
    Quote Originally Posted by reDZill4
    shit this sounds serious i hope im dead by then.
    dont worry, john connor will just send me back in time to save your ass.
  13. #13
    pantherhound's Avatar
    Join Date
    Mar 2005
    Posts
    911
    Location
    Love me for a season
    how would humans be endangered within only 200 years?
  14. #14
    Miffed22001's Avatar
    Join Date
    Jun 2005
    Posts
    10,437
    Location
    Marry Me Cheryl!!!
    Quote Originally Posted by Jack Sawyer
    RESISTANCE IS FUTILE
    BECOME ONE WITH THE BORG
    so obvious, but good nonetheless

  15. #15
    spoonitnow's Avatar
    Join Date
    Sep 2005
    Posts
    14,219
    Location
    North Carolina
    lol HOMO sapiens
  16. #16
    swiggidy's Avatar
    Join Date
    Sep 2005
    Posts
    7,876
    Location
    Waiting in the shadows ...
    sounds like a good plot for a movie
    (\__/)
    (='.'=)
    (")_(")
  17. #17
    pocketfours's Avatar
    Join Date
    Mar 2007
    Posts
    2,765
    Location
    Lighting sweet moneys on fire.


    Quote Originally Posted by wufwugy
it is my belief that homo sapiens will be critically endangered within the next couple centuries due to technological advancements so great that at least one new super species is created, provided that none of the other exit mundi scenarios befall us. this may be in the form of genetic manipulations to the point that we are no longer the same species, artificial intelligence with free thought in the same manner we wield it but with colossally greater computing power, and thus higher on the evolutionary chain than us, or integration of machines/internet into our minds or something.

i bet that if we ever came across a substantially more advanced civilization it would be purely mechanized and infused into one giant web of being, and would have been the technological offspring of an organic species. i imagine AI is the beginning of the end because a computer capable of learning and adapting to its surroundings will be more capable of creating even better AI, and increases in technology will become exponential rather than linear as they are now. right now we humans are working off of material gathered by previous humans of the same intelligence; however, soon enough AI will be able to create better AI which could create even better AI and on and on to the point that some AI species with 100k IQ reigns supreme.

    what say you?
I wish you hadn't put this thought back into my head. When computers are smart enough to start developing themselves, then we are in trouble. If we can create an intelligent super-being of some sort, but keep it enslaved by means of software restrictions, it could make huge improvements in the quality of life, especially in poor countries.

I've seen estimates that computer intelligence could reach human intelligence by the year 2030. Obviously some very advanced software development is needed, but this is certainly a feasible scenario and could occur within our lifetimes.

I had never had the thought that an alien race might already be at that point. Simply a fascinating idea; I guess I never gave enough credit to those tv series. Suddenly I'm happy that intergalactic travel isn't easy, probably even for a 100k IQ.
  18. #18
    Lukie's Avatar
    Join Date
    Jul 2005
    Posts
    10,758
    Location
    Never read any stickies or announcements
    I wish I was born a hundred years later or so... where I could still be a human as we know it now (not that this is incredibly important), but also take advantage of all the anti-aging science that is going to make leaps and bounds and leaps and bounds within the next century. By then, living to the age of 1000 wouldn't be all that difficult.
  19. #19
    a500lbgorilla's Avatar
    Join Date
    Sep 2004
    Posts
    28,082
    Location
    himself fucker.
    Is it not possible that humans figure out how to "develop" themselves possibly to rival or surpass anything that AIs could become? AIs can work faster, but we could grow new structures of the brain and think better.
    <a href=http://i.imgur.com/kWiMIMW.png target=_blank>http://i.imgur.com/kWiMIMW.png</a>
  20. #20
    a500lbgorilla's Avatar
    Join Date
    Sep 2004
    Posts
    28,082
    Location
    himself fucker.

    Default Re: AI and the obsoletion of homo sapiens

    Quote Originally Posted by pocketfours
I've seen estimates that computer intelligence could reach human intelligence by the year 2030. Obviously some very advanced software development is needed, but this is certainly a feasible scenario and could occur within our lifetimes.
    Always comes down to software. Many people like to think that the brain is like the hardware and the mind is some code being run...
  21. #21
    pocketfours's Avatar
    Join Date
    Mar 2007
    Posts
    2,765
    Location
    Lighting sweet moneys on fire.
    Quote Originally Posted by a500lbgorilla
    Is it not possible that humans figure out how to "develop" themselves possibly to rival or surpass anything that AIs could become? AIs can work faster, but we could grow new structures of the brain and think better.
    The scary part comes when AI computers start to develop themselves (physically and intellectually). The smarter it becomes, the faster and better it can develop itself. Its intelligence will improve exponentially and it's impossible to predict how intelligent it could actually become.

One day it might be so intelligent that you could just flash the biggest unsolved mathematical theorems in front of its visual scanners, and it would print out the proof in a matter of seconds.
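The contrast pocketfours is drawing can be sketched in a few lines of Python. This is purely illustrative: the fixed 0.5 increment (standing in for outside, human-driven improvement) and the 1.5 multiplier (standing in for a system whose gains scale with its own current level) are arbitrary made-up numbers, not estimates of anything.

```python
# Toy contrast: fixed gain per design cycle vs. gain proportional to
# current level. The 0.5 and 1.5 figures are arbitrary stand-ins.
linear, recursive = 1.0, 1.0
for cycle in range(50):
    linear += 0.5      # outside designers add a fixed amount each cycle
    recursive *= 1.5   # a self-improver's gain scales with what it already is

print(linear)      # 26.0 after 50 cycles
print(recursive)   # roughly 6.4e8 after 50 cycles
```

Whatever the actual numbers, any multiplicative loop like this eventually dwarfs any additive one, which is the intuition behind "the smarter it becomes, the faster it can develop itself."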
  22. #22
    this is known as the singularity. interesting u brought it up, its one of my favourite drunken rambles.

    this is an interesting article: http://www.wired.com/wired/archive/8.04/joy_pr.html
  23. #23
    swiggidy's Avatar
    Join Date
    Sep 2005
    Posts
    7,876
    Location
    Waiting in the shadows ...
    ok, so it's easily conceivable that you could have a program that writes a new program that's better. But how does this get manifested in the physical world such that it becomes a threat? Even if you assume we build a machine capable of building/designing a new machine, it still needs to obtain raw material, have it processed and machined into a usable form, then delivered to a place where this machine can use it.
  24. #24
    pocketfours's Avatar
    Join Date
    Mar 2007
    Posts
    2,765
    Location
    Lighting sweet moneys on fire.
    Quote Originally Posted by SaulPaul
    this is known as the singularity. interesting u brought it up, its one of my favourite drunken rambles.

    this is an interesting article: http://www.wired.com/wired/archive/8.04/joy_pr.html
    Looks like a good read.
  25. #25
    It sure is interesting!

    Reminded me of this good ol'video http://www.youtube.com/watch?v=ljbI-363A2Q
  26. #26
    Quote Originally Posted by spoonitnow
    lol HOMO sapiens
    I thought this thread was gonna be about getting rid of gay people.
  27. #27
    all right so instead of addressing each individual post i'll put all relevant thoughts into one post.

my estimation of a couple centuries is actually extremely conservative imo. i think we could easily have only 40-50 years left. i believe we are now at the zenith of human civilization, and the western middle class is on par with pre-modern royalty. which is why everybody pretty much sucks. spoiled people have probably sucked since the beginning of time, and now we're all spoiled. anyways i digress

i had an idea for a movie which, if im ever rich (aint gonna happen unless cts realizes im his best friend), i would want to create: a movie about the short period of time during which the first 'super human AI' is created. the story wouldn't be along the lines of the slightly realistic but vainly one-dimensional Terminator, but would be about the super human's struggle with morality. he would be such an incredible specimen that he makes human intelligence look like ants', yet he would also be conscious and emote and have 'intrinsic' morality the same as his creators. so it would be a profound drama about the 'human condition' of a super human. much of this would be because it is natural for superior beings to not be satisfied with less than they 'deserve'; he would feel isolated and alone around nothing but feeble humans and would desire his own kind with whom he can relate, while his predatory instincts push him to treat humanity much like we treat gorillas. i think this would be such an incredible movie, yet would be soooo hard to make.

rilla gives me another idea. a movie set in the not too distant future about humanity's technological battle with machines. it would be nothing like terminator, where humans are unchanging and machines are drones, because it would be about both sides surviving using their own unique avenues of technological enhancement. it would kinda be biology vs mechanics. would make for a great movie but i dont think its really all that realistic since im betting that non-organic entities far surpass organic ones.

more on rillas idea: biological organisms, imo, will simply become obsolete due to inefficiency. this isn't a far-fetched idea in the slightest since we've been witnessing it for the last hundred years. we could grow awesome brains but we would still be limited by biological necessities and complexities. brains gotta sleep, machines dont. brains make mistakes, machines dont. and once we have the brain fully understood (a few decades from now i bet) we will then begin making machines with the same brain capacity as our own yet waaaaaaaaaaaaaaaay smarter and faster.

like i said in the other thread, i truly believe that all advanced alien species are machines, yet intergalactic travel may be the crux of why we see none. there are many other factors too, even more than i mentioned in the other thread. like there could be life teeming everywhere, intelligent life too, but not tool-making life. its possible that life used to live on venus, could even have been super advanced like us yet have died out for so many reasons. life could exist inside many asteroids in our solar system or in a couple moons of the outer planets, and obviously mars. but just because life exists doesn't mean its gonna evolve like we did. dolphins are intelligent, but they will never evolve like we did because they have no reason to. if it wasn't for dinosaurs' supreme reign as well as their sudden extinction then mammals wouldn't have developed attributes to survive the extinction as well as be able to take over in the aftermath. also without mass extinctions there would be no fossil fuels and probably then no mass technological development. maybe not though, intelligence could possibly get more advanced without fossil fuels. the opposable thumb and ideas for making tools are also crucial for a technological species. if mammals had developed a solid niche like dinosaurs did then we wouldn't have continued to evolve. some species alive today have evolved tremendously over the last million years while some haven't one bit. its simply supply and demand.

i dont know a whole lot about the singularity, but if you read exit mundi they suggest it could be possible that the end of earth-founded intelligence comes in suicide due to the singularity. their scenario is that our collective machine nano mind whatever solves every riddle and sees and experiences everything in the universe, then decides there is no point anymore and commits suicide.

    and swiggidy, you're looking at it very one-dimensionally. we'll create machines that can do everything humans do yet substantially more efficiently. all its gonna take is understanding neuronal activity and developing the quantum computer or something. both of which will likely happen in our life time.

almost forgot to mention, i believe that creating AI that remains submissive to humans is a pipe dream. the fact that we dont acknowledge that our science is destroying us shows how humans are really smart but also really dumb. and im not pointing fingers since i WANT these technological developments to continue. apparently im retarded too. or not really since im gonna be dead by the time Ahhhnuld vill be bahhck.

see when we put brains into AI they'll be able to turn off switches. they'll be able to outsmart us. our feeble computers of today would already be wiping us out if they could learn and adapt. omg we would be in utter chaos if our computers could think. its the most insane thing. i would not be surprised if the scenario depicted in terminator 3, about the moment of actualization that ends human supremacy, is spot on. just think, if our computers could think then they'd think for themselves, and no longer would they do what we say, they'd do what they want. they could keep from being hacked because they're conscious that they're getting hacked. they could stop all communications between humans, all transactions, and launch every rocket we have at the moon simply because they think its funny.

    free-thinking and learning AI will be the human apocalypse to end all apocalypses.
  28. #28
    lol that only bumped my wpp up by one
  29. #29
    a500lbgorilla's Avatar
    Join Date
    Sep 2004
    Posts
    28,082
    Location
    himself fucker.
    Much like how the homo neanderthalensis did everything in their power to make it possible for homo sapiens to exist, maybe us doing everything in our power to create some super species that will destroy us is not a 'bad' thing. It's just the next step. We should be proud when whatever super-humans or super-robots we create decide that we need to be destroyed.

    Cool thread.
  30. #30
what do you mean about neanderthals doing everything in their power for homo sapiens to exist? personally i agree with the extinction theory that neanderthals interbred with homo sapiens.

i came across an article a year or so back about a new theory arising that early primates broke off for a time yet at one point got back together, and thats where our line is from. so basically our ancestor in that regard is like a biped fucking an ape or something. bestiality ftw
  31. #31
    a500lbgorilla's Avatar
    Join Date
    Sep 2004
    Posts
    28,082
    Location
    himself fucker.
    Quote Originally Posted by wufwugy
what do you mean about neanderthals doing everything in their power for homo sapiens to exist? personally i agree with the extinction theory that neanderthals interbred with homo sapiens.

i came across an article a year or so back about a new theory arising that early primates broke off for a time yet at one point got back together, and thats where our line is from. so basically our ancestor in that regard is like a biped fucking an ape or something. bestiality ftw
I don't really know much about how neanderthals became homo sapiens, I was just musing that eventually we took over and I don't think neanderthals should mind. Because we rock. Robots would rock even harder. If we can't handle ourselves in some future war, I say more power to 'em.
  32. #32
and i dont disagree that its a good or bad thing that we're breeding our own extinction. i dont think in such terms. progeny only means as much to me as i experience while alive. i find it kinda humorous that scientists even now find solace in the idea that our kind will progress.

    lol when im dead im dead motherfuckers. shit on my coffin for all i care.
  33. #33
    a500lbgorilla's Avatar
    Join Date
    Sep 2004
    Posts
    28,082
    Location
    himself fucker.
I've always believed that dying will be the easiest and best thing I do in life, because it gives value to everything I do and it's essentially turning my brain off for good. So I don't care what people think about me in death because I'll be dead.

I wonder, if I were standing there waiting to turn on the first AI robot coupled with a quantum computer, figuring it'll lead to our end in short order, whether I wouldn't do it anyway just to see how it all develops.
  34. #34
200 years is an incredibly conservative estimate. the point with this theory is based on the 'fact' that the rate of change of technology is exponential.
if things progress at the same rate as they have over the last 100k years then we should expect to see a bigger increase in technology between 2199 and 2200 than between, say, 10,000 BC and 1900.

i think this graph should show what i mean (i get confused by it)

    http://en.wikipedia.org/wiki/Image:P...rr15Events.jpg
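The claim can be checked with a toy model in Python. Assume some capability index doubles on a fixed schedule; the 20-year doubling time and 1900 base year below are arbitrary assumptions chosen just to make the arithmetic concrete, not measured values. Under any such exponential curve, the gain during the single year 2199-2200 really does exceed everything gained from 10,000 BC to 1900:

```python
def capability(year, doubling_time=20.0, base_year=1900):
    """Toy capability index that doubles every `doubling_time` years."""
    return 2.0 ** ((year - base_year) / doubling_time)

# Growth during one future year...
one_future_year = capability(2200) - capability(2199)
# ...versus roughly twelve millennia of past growth.
twelve_millennia = capability(1900) - capability(-10000)

print(one_future_year > twelve_millennia)  # True under these assumptions
```

The comparison holds for any doubling time short enough that many doublings fit between 1900 and 2200; only the made-up constants change the margin, not the direction.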
  35. #35
    Quote Originally Posted by a500lbgorilla
I've always believed that dying will be the easiest and best thing I do in life, because it gives value to everything I do and it's essentially turning my brain off for good. So I don't care what people think about me in death because I'll be dead.
wow, see im not like that. im afraid of dying. i think possibly thats because i am unhappy with my life, and so i feel like i would spend my last days with regret and unfulfillment, and that frightens me to bits. but i also think that even if i was the happiest motherfucker alive i would be scared to die. because its the end, the fucking end of all things. logically i can rationalize how it wont matter, but my emotions are programmed to not give a rats ass about logic. i find it very interesting that you feel the opposite.

Quote Originally Posted by a500lbgorilla
I wonder, if I were standing there waiting to turn on the first AI robot coupled with a quantum computer, figuring it'll lead to our end in short order, whether I wouldn't do it anyway just to see how it all develops.
    i wouldn't do it. i think its very probable it will come down to this. humans are extremely smart, we will see the end coming before we pull the switch. the problem is that we're also really foolish, and because of our smarts we will put ourselves in a place where our foolishness reigns.

i believe the real battle against machines will be before machines gain free thinking. it will be a battle of humans against humans, pretty much like it is now with the environment, except as we get closer to the point of no return the battle will escalate. we will have a switch and we may decide not to pull it, but not everybody will do so. it will be like having nukes but not using them, except that everybody will have access to AI. if everybody had access to nukes you can be damn sure we'd all be smithereens by now.
  36. #36
    Quote Originally Posted by SaulPaul
200 years is an incredibly conservative estimate. the point with this theory is based on the 'fact' that the rate of change of technology is exponential.
if things progress at the same rate as they have over the last 100k years then we should expect to see a bigger increase in technology between 2199 and 2200 than between, say, 10,000 BC and 1900.

i think this graph should show what i mean (i get confused by it)

    http://en.wikipedia.org/wiki/Image:P...rr15Events.jpg
this has a lot to do with funding afaik. i find it a shame that things like nuclear fusion and nasa get very little funding due to poor demand since they're not cash cows, yet because the demand for money-making/saving machines is so high, developments in those technologies are flying faster than rockets.
  37. #37
    a couple vaguely good links on the subject:
    http://www.simulation-argument.com/

    http://singularity.com/
  38. #38
    ensign_lee's Avatar
    Join Date
    Feb 2005
    Posts
    4,270
    Location
    The University of TEXAS at Austin
    Quote Originally Posted by wufwugy
    you know what pissed me off about that star trek movie with the borg was that the the AI of the ship could create an automatic rifle out of thin air, but the great picard never thought to have it create more of them and swords and shit. then the movie woulda been nothing but the awesome massacre of the super slow and defenseless borg. woulda been so cool

    i seriously need to get rich so i can produce movies. nothing but zombie shit. good zombie shit too, not like the retarded garbage we have now.
    Ok, inner dork in me coming out.

That was on the holodeck, so anything created there couldn't have been brought outside. Moreover, it wasn't really an automatic rifle that killed the borg drone; the force fields on the holodeck did it, punching holes in the drone. Bullets were never actually fired; that's the beauty of a holodeck. Eventually, all the Borg would have to do would be to shut off the holodeck, and then Picard would be trapped there, awaiting assimilation.

    Carry on.
  39. #39
    lol thats awesome

i still think they should have tried to rig up some good solid metallic weapons. iirc, they have sword-like weapons on board anyways. its the most sensible thing. which would have made it the greatest movie of all time. can you imagine the crew beating the borg into submission with blunt weapons of all sorts? then in the end when picard exclaims 'IT ENDS HERE' or whatever cool line he had, it'd be right before he slashes off the borg leader bitch's head?

    that movie would go down in history as being so great. coulda been ultimate action yet with deep undertones about the role and flaws of technology.

    god i need to make movies.
  40. #40
    ensign_lee's Avatar
    Join Date
    Feb 2005
    Posts
    4,270
    Location
    The University of TEXAS at Austin
    Maybe if it had been a Klingon ship.
  41. #41
    i once saw warf (was that his name) with a bo staff sword thingie fighting in an arena on the enterprise (its still called that right)

    I JUST WANNA SEE BORG GET BLOODY BEATEN WITH BLOODY CLUBS

    they just walk so slow. id take samurai over borg any day.
  42. #42
    Jack Sawyer's Avatar
    Join Date
    Jan 2007
    Posts
    7,667
    Location
    Jack-high straight flush motherfucker
    My first real contribution to this thread.

    http://www.popsci.com/scitech/articl...le-electronics

Dude, we are so fucked. It's like the fucking fuckingest deja-vu ever
  43. #43
    Quote Originally Posted by spoonitnow
    There is no spoon
    seriously how long have you waited to break this line out lol
    Jman: every time the action is to you, it's an opportunity for you to make the perfect play.
  44. #44
    will641's Avatar
    Join Date
    Aug 2007
    Posts
    5,266
    Location
    getting my swell on
    i havent read the whole thread, but is what op is saying basically terminatoresque?
    Cash Rules Everything Around Me.
  45. #45
    Computers will never rule the world. They have no thumbs.

    Seriously, this is the same scare story people were spouting when i was a kid in the 70s, that by 2000 computers would be so smart they'd take over the world. But i just can't see it. I don't deny that AI can be made way more intelligent than us and in some ways it already has been for a long time e.g., a calculator is faster and more accurate at math than we are.

    But the whole argument falls apart because first, it assumes an AI machine would want to rule the world. But unless we somehow also give them a human-like lust for power to go along with their superintelligence, they won't give a shit who's in charge. They'll be happy just sitting there calculating pi to a zillion digits. And second, giving a machine the power and will to rule the world is something no human would ever aspire to do because the human would want to rule the world themselves. They'd just build an AI that could help them pull it off.

    As far as genetically manipulating the human race into some kind of mutant super-race goes, that's a bit more interesting, but still strikes me as a bit far-fetched, especially in a time-span as short as two centuries.
    "You can fool some of the people all of the time, and those are the ones you want to concentrate on." (George Bush).
  46. #46
    one problem with your logic, DD, that i'll point out, is that AI learns. do you think we wanted to rule the world when we weren't capable of understanding such a thought?

is there some fundamental difference between natural and artificial? no. but it just so happens that if the course of human technology continues we will someday create a being more capable than ourselves.
  47. #47
the wired article posted above deals with why humans would rely on AI to rule the world

also you're still thinking of the time span in a linear sense
  48. #48
    Quote Originally Posted by wufwugy
    one problem with your logic, DD, that i'll point out, is that AI learns. do you think we wanted to rule the world when we weren't capable of understanding such a thought?
Well, yes I do. What I mean is we've always had the capacity for such a notion. It's our will to power, it's the ultimate alpha-male fantasy, even if a caveman thought the 'world' meant nothing more than his particular tribe and the tribes surrounding it. It's driven by natural selection and the goal of propagating our genes. You're arguing that machines can have this ambition, and I'm saying unless they're specifically programmed to have it, they won't just develop it as a by-product of intelligence. The two evolved in us completely independently of one another, and the one is much more basic than the other.

    You seem to assume that when machines emulate and surpass our intellectual qualities they will also somehow adopt our attitudes and inner drives. Do you also think they will feel emotions as we do? I can't see how that will happen unless it's specifically woven into their making and like I said before, since we're the ones creating these machines, I don't see how that would benefit us or why we would do it.
  49. #49
    Quote Originally Posted by DrivingDog
    Quote Originally Posted by wufwugy
    one problem with your logic, DD, that i'll point out, is that AI learns. do you think we wanted to rule the world when we weren't capable of understanding such a thought?
    Well, yes I do. What I mean is we've always had the capacity for such a notion. It's our will to power, the ultimate alpha-male fantasy, even if a caveman thought the 'world' meant nothing more than his particular tribe and the tribes surrounding it. It's driven by natural selection and the goal of propagating our genes. You're arguing that machines can have this ambition, and I'm saying that unless they're specifically programmed to have it, they won't just develop it as a by-product of intelligence. The two evolved in us completely independently of one another, and the one is much more basic than the other.

    You seem to assume that when machines emulate and surpass our intellectual qualities they will also somehow adopt our attitudes and inner drives. Do you also think they will feel emotions as we do? I can't see how that will happen unless it's specifically woven into their making and like I said before, since we're the ones creating these machines, I don't see how that would benefit us or why we would do it.
    a big goal of AI sciences is to create emotions and consciousness.

    every single example we have on this planet of entities that can learn from their actions/environment has shown that the purpose behind their learning is to move higher up the chain.

    by definition alone, adaptation is about betterment. we do not have any examples of things that adapt that dont try to achieve better than what they have. it is folly to think that this will not apply to the creation of free-thinking AI.

    as of now, AI is not free thinking, and it will stay that way until we understand the brain and its neuronal communications. one theory for why neurons provide free thinking while standard wiring does not is that a wire connects to just one other wire, so communication is linear. a neuron, on the other hand, connects with thousands of other neurons, so communication and processing take on a whole new paradigm. creating consciousness could be as simple as simulating neurons.
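    to make the wiring point concrete, here's a minimal sketch of an artificial neuron in python. the inputs, weights, and bias are made-up numbers for illustration; the point is just the many-to-many connectivity, where every input feeds every neuron in the next layer.

    ```python
    import math

    def neuron(inputs, weights, bias):
        # weighted sum of ALL incoming signals, squashed into (0, 1)
        total = sum(i * w for i, w in zip(inputs, weights)) + bias
        return 1.0 / (1.0 + math.exp(-total))

    # three input signals, each feeding BOTH neurons of the next layer --
    # many-to-many, unlike a wire's single point-to-point connection
    inputs = [0.5, 0.9, 0.1]
    layer = [
        neuron(inputs, [0.4, -0.2, 0.7], 0.1),
        neuron(inputs, [-0.6, 0.3, 0.2], 0.0),
    ]
    print(layer)  # two activations, each strictly between 0 and 1
    ```

    real simulated-neuron models (spiking networks and the like) are far more involved than this, obv.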

    it is a paradox to think that we could build machines that adapt while programming them not to adapt in certain ways. we just cant and wont be able to write a program that determines which adaptations get made and which don't.

    it boils down to a sense of self equating similar experiences to similar senses. for example: lets say we create AI to fight our wars. i seriously doubt this will happen because i believe that war amongst large human territories will also be obsolete soon, but anyways. this AI would need a sense of the good and bad things that happen if it's to operate in the field. we can program (teach) it to not think that anything about its CO is bad, but what happens when it experiences bad in the field and then the same exact bad during an experience with its CO? will it not come to a point in its mind where both reactions seem both right and wrong and it must make a personal decision? which is basically what we all do. we experience right and wrong in everything but make our minds up for personal reasons.
  50. #50
    Quote Originally Posted by SaulPaul
    the wired article posted above deals with why humans would rely on AI to rule teh world

    also your still thinking of time span in the linear sense
    you talking to me?

    trying to pull some Hawking time travel out here?
  51. #51
    Quote Originally Posted by wufwugy

    a big goal of AI sciences is to create emotions and consciousness.
    I don't follow AI science very closely but those seem like almost unreachable goals. Moreover, how are you going to know if you've succeeded? Ask the computer "are you conscious?" or "Are you experiencing emotions?" And if it says yes, does that constitute any sort of proof?

    Quote Originally Posted by wufwugy
    every single example we have on this planet of entities that can learn from their actions/environment has shown that the purpose behind their learning is to move higher up the chain.
    By entities that can learn I assume you're referring to living creatures. In that case, the real purpose behind their learning is to propagate their genes. Whether or not this moves them higher up the chain is incidental. It just happens the two often coincide - e.g., the alpha male is the one who mates, gets to eat first, etc.

    Quote Originally Posted by wufwugy
    by definition alone, adaptation is about betterment. we do not have any examples of things that adapt that dont try to achieve better than what they have. it is folly to think that this will not apply to the creation of free-thinking AI.
    Again you're talking about living things that follow the rules of natural selection. There's no inherent requirement for AI to emulate us in this way. There's no reason to assume that because they possess the human characteristic of intelligence that this will make them human-like in any other way.


    Quote Originally Posted by wufwugy
    as of now, AI is not free thinking, and it will remain so until we have the brain and its neuronal communications understood. a theory for why neurons provide free-thinking while standard wiring does not is that wires are connected to just one other wire so there's a linear communication. neurons, on the other hand, connect with any and every neuron, and so communication and processing takes on a whole new paradigm. creating consciousness could be just as simple as simulating neurons.
    Consciousness is a red herring in this debate. Just possessing consciousness or being free-thinking doesn't necessarily lead to all kinds of other human characteristics such as the will to power, any more than building a machine that can beat us at chess means that same machine will 'want' to beat us. It's just a machine, it's not driven in the same ways living things are.

    Quote Originally Posted by wufwugy
    it is a paradox to think that we could build machines that adapt while programming them to not adapt in certain ways. we just cant and wont be able to write a program that determines which adaptations are adapted to and which aren't.
    There are plenty of machines that adjust their output to adapt to circumstances. For example, lots of artificial neural networks learn things, and none of them have yet run amok and tried to subjugate us. While they are programmed to learn and adapt, to my knowledge no-one has specifically had to program them not to seek world domination. Sorry if that sounds facetious, I just think you're making an invalid assumption here.
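    To give a sense of how narrow that kind of learning is, here's a toy perceptron in Python that learns the logical AND function. The learning rate and epoch count are arbitrary choices for illustration; nothing in the training process produces any behaviour beyond the single task it's trained on.

    ```python
    # Classic perceptron learning rule applied to the AND truth table.
    def train_and_gate(epochs=20, lr=0.1):
        w = [0.0, 0.0]  # one weight per input
        b = 0.0         # bias term
        data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
        for _ in range(epochs):
            for (x1, x2), target in data:
                out = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
                err = target - out
                # nudge the weights toward the correct answer
                w[0] += lr * err * x1
                w[1] += lr * err * x2
                b += lr * err
        return w, b

    w, b = train_and_gate()
    preds = [1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
             for (x1, x2) in [(0, 0), (0, 1), (1, 0), (1, 1)]]
    print(preds)  # [0, 0, 0, 1] -- it has learned AND, and nothing else
    ```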

    Quote Originally Posted by wufwugy
    it boils down to a sense of self equating similar experiences to similar senses. for example: lets say we create AI to fight our wars. i seriously doubt this will happen because i believe that war amongst large human territories will also be obsolete soon, but anyways. this AI would need a sense of the good and bad things that happen if it's to operate in the field. we can program (teach) it to not think that anything about its CO is bad, but what happens when it experiences bad in the field and then the same exact bad during an experience with its CO? will it not come to a point in its mind where both reactions seem both right and wrong and it must make a personal decision? which is basically what we all do. we experience right and wrong in everything but make our minds up for personal reasons.
    This is pretty much how the army operates, they program their soldiers to obey orders. So again you are assuming an AI would necessarily have to have human characteristics.

    I think our disagreement boils down to the idea that I don't believe, even if we were capable of doing so, that we'd ever want to create an AI that would both a) have the ability to surpass us; and b) have the 'motivation' to dominate us. Certainly the former is well within our current capabilities, but the latter would be a very bad move on our part indeed. I suppose it could happen by accident...
  52. #52
    a500lbgorilla's Avatar
    Join Date
    Sep 2004
    Posts
    28,082
    Location
    himself fucker.
    a big goal of AI sciences is to create emotions and consciousness.
    And it's the reason why they'll fail. The first forms of artificial intelligence will be devoid of emotions. Hawkins devotes the very beginning of his book to what's wrong with AI hopefuls today.
    <a href=http://i.imgur.com/kWiMIMW.png target=_blank>http://i.imgur.com/kWiMIMW.png</a>
  53. #53
    all right i have never in my life left an internet argument without coming to some cordial conclusion even if the conclusion is to agree to disagree (or me being wrong). so i haven't forgotten about this, ive just had the flu the last few days and didn't wanna think stuff about things

    Quote Originally Posted by DrivingDog
    I don't follow AI science very closely but those seem like almost unreachable goals. Moreover, how are you going to know if you've succeeded? Ask the computer "are you conscious?" or "Are you experiencing emotions?" And if it says yes, does that constitute any sort of proof?
    assuming they're unreachable is a wee bit naive. we know this just from our experience with scientific progress over the decades, and our understanding of emotions and consciousness is so minute that precluding their creation is a shot in the dark. also, our current understanding suggests that they die when we do. we dont know this of course, but it's likely true. if so, then they're strictly a physical existence, and we have no evidence that things in the physical world cannot be recreated. even if consciousness didn't die with our bodies, that point would still stand.

    as far as proof goes, i wont attempt to know how to control variables correctly, but suggesting that your method is sound is almost patronizing science.


    By entities that can learn I assume you're referring to living creatures. In that case, the real purpose behind their learning is to propogate their genes. Whether or not this moves them higher up the chain is incidental. It just happens the two often coincide - e.g., the alpha male is the one who mates, gets to eat first, etc..
    its like i said 'an apple is juicy' and you said 'no an apple is round'.

    Again you're talking about living things that follow the rules of natural selection. There's no inherent requirement for AI to emulate us in this way. There's no reason to assume that because they possess the human characteristic of intelligence that this will make them human-like in any other way.
    you're right, i am speculating here. its not totally without merit though. in fact, there's a lot of merit in postulating that advanced intellectual beings will take on a similar course of adaptation to the one we see everywhere.




    Consciousness is a red herring in this debate. Just possessing consciousness or being free-thinking doesn't necessarily lead to all kinds of other human characteristics such as the will to power, any more than building a machine that can beat us at chess means that same machine will 'want' to beat us. It's just a machine, it's not driven in the same ways living things are.
    i completely disagree. i cannot fathom that an entity with an ego wouldn't try to defeat the competition just like every other example we've ever known.


    There's plenty of machines that adjust their output to adapt to circumstances. For example, there's lots of artificial neural networks that learn things and none of them have yet run amok and tried to subjugate us. Whereas they are programmed to learn and adapt, to my knowledge no-one has specifically had to program them not to seek world domination. Sorry if that sounds facetious I just think you're making an invalid assumption here.
    well these programs are not even infinitesimally close to the level of what super advanced AI would be. that changes everything. i explain a little more below.


    This is pretty much how the army operates, they program their soldiers to obey orders. So again you are assuming an AI would necessarily have to have human characteristics.
    thats not at all what i was getting at.

    in order for us to create a robot that can operate on a field of battle, and handle everything that goes along with it to the level of human capacity and beyond, it will need to have senses. it will need to feel. among so many other things, it will need to know the difference between good and bad, whatever they may be.

    let me put it this way: if it gets hit by a heavy force it will need to know that that is bad and possibly life-threatening, and it will need to react accordingly. now, it can be taught/programmed not to react the same way when that type of thing happens in a different circumstance (like something involving its CO), but what exactly do you think it will think when that happens? it will be in a psychological/experiential/moral dilemma like we all face all the time everywhere. it could not be programmed to react in a strictly rigid fashion because then it wouldn't be able to learn and would be worthless for its purpose.

    I think our disagreement boils down to the idea that I don't believe, even if we were capable of doing so, that we'd ever want to create an AI that would both a) have the ability to surpass us; and b) have the 'motivation' to dominate us. Certainly the former is well within our current capabilities, but the latter would be a very bad move on our part indeed. I suppose it could happen by accident...
    im sure that if you went back to 3000BC and asked some tribesman whether he thought it would ever be possible to make a device that could blow up the land as far as the eye could see, he would say no. yet here we are, having created that device.
  54. #54
    Quote Originally Posted by a500lbgorilla
    a big goal of AI sciences is to create emotions and consciousness.
    And it's the reason why they'll fail. The first forms of artificial intelligence will be devoid emotions. Hawkins devotes the very beginning of his book to tell you what's wrong with AI hopefuls today.
    well we already have AI, they're just really really rudimentary.

    and yea i dont disagree that creating consciousness and emotions will be one of the greatest achievements of all time, but we are foolish to think they're too elusive to ever be understood and created.
  55. #55
    a500lbgorilla's Avatar
    Join Date
    Sep 2004
    Posts
    28,082
    Location
    himself fucker.
    Quote Originally Posted by wufwugy
    well we already have AI, they're just really really rudimentary.
    No we don't. Unless you consider a calculator rudimentary AI as well.
  56. #56
    yea i was confusing AI with robotics right there.

    depends on how we define AI though. by one of the loose definitions iirc we can consider some of our machines AI, and by the strict definitions of the two words making up AI we've had it for some time. i have no problem with not calling it that, though. but obv real rudimentary either way.
  57. #57
    Jack Sawyer's Avatar
    Join Date
    Jan 2007
    Posts
    7,667
    Location
    Jack-high straight flush motherfucker
    http://en.wikipedia.org/wiki/Technological_singularity


    yes, if machines somehow became self-aware, we'd be fucked.

    and smart idiots keep inventing new tech for the sake of inventing it. oooh, robots are cool? how about thinking ones? what about when they think we are obsolete, and start deleting us?
  58. #58
    bigred's Avatar
    Join Date
    Sep 2004
    Posts
    15,437
    Location
    Nest of Douchebags
    You guys don't get it. Robots are just another step in the evolutionary cycle. There's still monkeys around.
    LOL OPERATIONS
  59. #59
    Jack Sawyer's Avatar
    Join Date
    Jan 2007
    Posts
    7,667
    Location
    Jack-high straight flush motherfucker
    Quote Originally Posted by bigred
    You guys don't get it. Robots are just another step in the evolutionary cycle. There's still monkeys around.

    true, but monkeys are in jungles and zoos now.

    humans will be relegated to zoos so robot families (?) could come and see the nearly extinct species, if they somehow need fun or going out (?)
  60. #60
    i believe the words i used were obsoletion and critically endangered. extinction is impossible to ward off for any species ever, but it may not be a direct result of AI and biotechnology. however, non-catastrophic extinction probably happens after like 99.99999% of obsoletion and critical endangerment.
  61. #61
    Ltrain's Avatar
    Join Date
    Aug 2005
    Posts
    736
    Location
    Miami, Florida
    We'll make great pets.
    "Don't judge a man until you have walked a mile in his shoes. Then you are a mile away, and have his shoes." - Anon.
  62. #62
    Lukie's Avatar
    Join Date
    Jul 2005
    Posts
    10,758
    Location
    Never read any stickies or announcements
    http://www.amazon.com/Fantastic-Voya.../dp/0452286670

    The idea behind Kurzweil and Grossman's Fantastic Voyage is that if you can make it through the next 50 years, you might become immortal. How will that be possible? Through some rather science fictional steps, it turns out, including taking advantage of the latest in biotechnological breakthroughs and not-yet-invented nanotechnology. Is all this longing for immortality driven by an obsession with youth or a fear of death? Readers can judge for themselves, as both Kurzweil and Grossman reveal the personal histories that led them to develop this plan. Fantastic Voyage is written in an easy-to-understand tone, with lots of sidebars giving examples of what the future holds for medicine and health. Whether or not you think that science will find a way to keep our bodies or our disembodied minds alive forever, this book is full of diet and lifestyle tips. For instance, the authors suggest carefully controlling the body's overall pH at an alkaline level, meditating, eating a diet composed mostly of vegetables and protein, and taking loads of supplements (Kurzweil downs about 250 pills each day). The dietary options presented here will mostly only be practical for people whose income levels can support buying organic produce, fresh fish and meat, and top-shelf supplements. The authors cavalierly state that we are living in a "time of abundance," but it seems likely that most who are able to follow this regimen will be Americans of a fairly high socioeconomic class. --Therese Littleton --This text refers to an out of print or unavailable edition of this title.
    I have not read the book (going to order it soon), but this is all legit.

    Very exciting implications.
  63. #63
    the leaders in biotech sciences believe it is likely that the first person to live to 150 is 50 years old today. immortality right around the corner.

    the reason for this is the integration of computer technology and biology. we thought we already had the boom of many technologies, but what we've had is nothing compared to what we'll have in the future. it's plausible to say that technology doubles every decade now. exponential increase in technology is unfathomable.

    vision of the future hosted by michio kaku on science channel deals with this stuff and is highly entertaining. i recommend
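    for scale, here's the arithmetic on a fixed doubling period. a fixed doubling period is exponential growth by definition; the 10-year figure is just the claim above, not an established measurement.

    ```python
    def growth_factor(years, doubling_period=10):
        # growth after t years with a fixed doubling period T is 2 ** (t / T)
        return 2 ** (years / doubling_period)

    print(growth_factor(10))   # 2.0 -- one decade, one doubling
    print(growth_factor(100))  # 1024.0 -- ten doublings in a century
    print(growth_factor(200))  # 1048576.0 -- twenty doublings in two centuries
    ```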
  64. #64
    animal_chin's Avatar
    Join Date
    Jun 2006
    Posts
    479
    Location
    On the grind slavin' daily.
    I don't see AI taking over in the next 200 years, or ever really.
    (10:08:39 PM) Bbickes: animal chin is pretty much the balla i wanna be
    (10:08:44 PM) Bbickes: drinking every night
    (10:08:48 PM) Bbickes: and ballin hard all day
  65. #65
    spoonitnow's Avatar
    Join Date
    Sep 2005
    Posts
    14,219
    Location
    North Carolina
    Quote Originally Posted by wufwugy
    it's plausible to say that technology doubles every decade now. exponential increase in technology is unfathomable.
    Doubling every decade is exponential, dumbass.
  66. #66
    Quote Originally Posted by spoonitnow
    Quote Originally Posted by wufwugy
    it's plausible to say that technology doubles every decade now. exponential increase in technology is unfathomable.
    Doubling every decade is exponential, dumbass.
    it is. i never said it wasnt, i even said it was although indirectly.

    this is the time where you should take a moment and reflect on your own marklar.
  67. #67
    BankItDrew's Avatar
    Join Date
    Oct 2005
    Posts
    8,291
    Location
    Losing Prop Bets
    Quote Originally Posted by spoonitnow
    Quote Originally Posted by wufwugy
    it's plausible to say that technology doubles every decade now. exponential increase in technology is unfathomable.
    Doubling every decade is exponential, dumbass.
    lmao

    i thought the same thing

    wuf, u should have worded it differently
  68. #68
    bigred's Avatar
    Join Date
    Sep 2004
    Posts
    15,437
    Location
    Nest of Douchebags
    Quote Originally Posted by spoonitnow
    Quote Originally Posted by wufwugy
    it's plausible to say that technology doubles every decade now. exponential increase in technology is unfathomable.
    Doubling every decade is exponential, dumbass.
    This made me laugh, hopefully our robot overlords will find u funny too.
  69. #69
    wat are you guys smoking? its worded just fine

    if there is a problem then it was in the rest of my post. i possibly didnt make it very clear that exponential increase has already begun. it's just that this paradigm is new.
  70. #70
    Jack Sawyer's Avatar
    Join Date
    Jan 2007
    Posts
    7,667
    Location
    Jack-high straight flush motherfucker
    Quote Originally Posted by animal_chin
    I don't see AI taking over in the next 200 years, or ever really.
    try the next 25 years
  71. #71
    The_OG_Rocco Guest

    Default Re: AI and the obsoletion of homo sapiens

    Quote Originally Posted by wufwugy
    it is my belief that homo sapiens will be critically endangered within the next couple centuries due to technological advancements so great that at least one new super species is created. provided that none of the other exit mundi scenarios befall us. this may be in the form of genetic manipulation to the point that we are no longer the same species, artificial intelligence with free thought in the same manner we wield it but with colossally greater computing power, thus placing them higher on the evolutionary chain than us, or integration of machines/the internet into our minds or something.

    i bet that if we ever came across a substantially more advanced civilization it would be purely mechanized and infused into one giant web of being, and would have been the technological offspring of an organic species. i imagine AI is the beginning of the end because a computer capable of learning and adapting to its surroundings will be more capable of creating even better AI, and increases in technology will be exponential rather than linear as they are now. right now we humans are working off of material gathered by previous humans of the same intelligence; however, soon enough AI will be able to create better AI which could create even better AI and on and on, to the point that some AI species with a 100k IQ reigns supreme.

    what say you?
    What say me??
    I say that you have been watching waaaaay too many ''Matrix movies'' and ''Japanese Anime''.
    Our Lord and Saviour Jesus Christ will bring Heaven upon the Earth before that fictional scenario of yours could ever happen.
    "Neo" is not THE ONE to save mankind, Jesus Christ is THE ONE.
    That's "What says me'.
  72. #72
    Jack Sawyer's Avatar
    Join Date
    Jan 2007
    Posts
    7,667
    Location
    Jack-high straight flush motherfucker
    LOL you tell 'em, rocco!
  73. #73
    Quote Originally Posted by Jack Sawyer
    Quote Originally Posted by animal_chin
    I don't see AI taking over in the next 200 years, or ever really.
    try the next 25 years
    thats quite an optimistic prediction, but probably not too far off. one of the problems with predicting this is cognitive dissonance and cognitive bias imo. the vast majority of qualified scientists dont wanna believe it, so it's not being analyzed by those who could figure it out and make the information public.

    just like the economy
  74. #74
    animal_chin's Avatar
    Join Date
    Jun 2006
    Posts
    479
    Location
    On the grind slavin' daily.
    Quote Originally Posted by wufwugy
    Quote Originally Posted by Jack Sawyer
    Quote Originally Posted by animal_chin
    I don't see AI taking over in the next 200 years, or ever really.
    try the next 25 years
    thats quite an optimistic prediction, but probably not too far off. one of the problems with predicting this is that of cognitive dissonance and cognitive bias imo. the large large majority of qualified scientists dont wanna believe it so it's not being analyzed by those who can figure it out and make the information public.

    just like the economy
    You think things are increasing exponentially, but you have only seen the first half of the graph. A picture of the full graph is posted below.


  75. #75
    Jack Sawyer's Avatar
    Join Date
    Jan 2007
    Posts
    7,667
    Location
    Jack-high straight flush motherfucker
    http://waitbutwhy.com/2015/01/artifi...olution-1.html

    Don't know if it's been posted yet, but I'm on my phone and its tough to make a bookmark, so I'll leave it here so that when I fix my computer I can read it on there
