all right so instead of addressing each individual post i'll put all relevant thoughts into one post.
my estimation of a couple centuries is actually extremely conservative imo. i think we could easily have only 40-50 years left. i believe we are now at the zenith of human civilization, and the western middle class is on par with pre-modern royalty, which is why everybody pretty much sucks. spoiled people have probably sucked since the beginning of time, and now we're all spoiled. anyways, i digress.
i had an idea for a movie which, if im ever rich (aint gonna happen unless cts realizes im his best friend), i would want to make: a movie about the short period of time during which the first 'super human AI' is created. the story wouldn't be along the lines of the slightly realistic but vainly one-dimensional Terminator; it would be about the super human's struggle with morality. he would be such an incredible specimen that he makes human intelligence look like that of ants, yet he would also be conscious, emote, and have 'intrinsic' morality the same as his creators. so it would be a profound drama about the 'human condition' of a super human. much of this would come from the fact that it is natural for superior beings not to be satisfied with less than they 'deserve': he would feel isolated and alone among nothing but feeble humans and would desire his own kind with whom he could relate, while his predatory instincts push him to treat humanity much like we treat gorillas. i think this would be such an incredible movie, yet it would be soooo hard to make.
rilla gives me another idea: a movie set in the not-too-distant future about humanity's technological battle with machines. it would be nothing like terminator, where humans are unchanging and machines are drones, because it would be about both sides surviving through their own unique avenues of technological enhancement. it would kinda be biology vs mechanics. it would make for a great movie, but i dont think its really all that realistic, since im betting that non-organic entities far surpass organic ones.
more on rilla's idea: biological organisms, imo, will simply become obsolete due to inefficiency. this isn't a far-fetched idea in the slightest, since we've been witnessing it for the last hundred years. we could grow awesome brains, but we would still be limited by biological necessities and complexities. brains gotta sleep, machines dont. brains make mistakes, machines dont. and once we have the brain fully understood (a few decades from now, i bet) we will then begin making machines with the same brain capacity as our own yet waaaaaaaaaaaaaaaay smarter and faster.
like i said in the other thread, i truly believe that all advanced alien species are machines, yet intergalactic travel may be the crux of why we see none. there are many other factors too, even more than i mentioned in the other thread. there could be life teeming everywhere, intelligent life too, but not tool-making life. its possible that life used to exist on venus, could even have been super advanced like us, yet died out for any number of reasons. life could exist inside many asteroids in our solar system, or in a couple moons of the outer planets, and obviously mars. but just because life exists doesn't mean its gonna evolve like we did. dolphins are intelligent, but they will never evolve like we did because they have no reason to. if it weren't for the dinosaurs' supreme reign as well as their sudden extinction, mammals wouldn't have developed the attributes to survive that extinction and take over in the aftermath. also, without mass extinctions there would be no fossil fuels, and probably no mass technological development either. maybe not though, intelligence could possibly get more advanced without fossil fuels. the opposable thumb and the idea of making tools are also crucial for a technological species. if mammals had settled into a solid niche like the dinosaurs did, then we wouldn't have continued to evolve. some species alive today have evolved tremendously over the last million years while some haven't one bit. its simply supply and demand.
i dont know a whole lot about the singularity, but if you read exit mundi they suggest it could be possible that earth-founded intelligence ends in suicide because of the singularity. their scenario is that our collective machine nano-mind whatever solves every riddle and sees and experiences everything in the universe, then decides there is no point anymore and commits suicide.
and swiggidy, you're looking at it very one-dimensionally. we'll create machines that can do everything humans do, yet substantially more efficiently. all its gonna take is understanding neuronal activity and developing the quantum computer or something, both of which will likely happen in our lifetime.
almost forgot to mention, i believe that creating AI that remains submissive to humans is a pipe dream. the fact that we dont acknowledge that our science is destroying us shows how humans are really smart but also really dumb. and im not pointing fingers, since i WANT these technological developments to continue. apparently im an idiot too. or not really, since im gonna be dead by the time Ahhhnuld vill be bahhck.
see, when we put brains into AI they'll be able to turn off switches. they'll be able to outsmart us. our feeble computers of today would already be wiping us out if they could learn and adapt. omg, we would be in utter chaos if our computers could think. its the most insane thing. i would not be surprised if the scenario depicted in terminator 3, where the moment of actualization ends human supremacy, is spot on. just think, if our computers could think, then they'd think for themselves, and no longer would they do what we say; they'd do what they want. they could keep from being hacked because they're conscious that they're getting hacked. they could stop all communications between humans, all transactions, and launch every rocket we have at the moon simply because they think its funny.
free-thinking and learning AI will be the human apocalypse to end all apocalypses.