Will our kids be immortal or extinct?
-
<p>Be careful with predictions that rely on Moore's Law continuing. It's officially dead now.</p>
<p> </p>
<p>It's also worth looking into the power requirements for exascale computing. DARPA, NSA, etc. are looking into radical alternatives to silicon CMOS such as superconducting switches.</p> -
<blockquote class="ipsBlockquote" data-author="Tim" data-cid="565379" data-time="1458279214">
<div>
<p>Be careful with predictions that rely on Moore's Law continuing. It's officially dead now.</p>
<p> </p>
<p>It's also worth looking into the power requirements for exascale computing. DARPA, NSA, etc. are looking into radical alternatives to silicon CMOS such as superconducting switches.</p>
</div>
</blockquote>
<p> </p>
<p> </p>
<p>Agreed, but Moore's law is only one part of it. And indeed many are now saying that the death of Moore's law has been greatly exaggerated.</p>
<p> </p>
<p>This is a relatively balanced look at it.</p>
<p> </p>
<p><a data-ipb='nomediaparse' href='http://www.hpcwire.com/2016/01/11/moores-law-not-dead-and-intels-use-of-hpc-to-keep-it-that-way/'>http://www.hpcwire.com/2016/01/11/moores-law-not-dead-and-intels-use-of-hpc-to-keep-it-that-way/</a></p> -
<p>Wait But Why is amazing, have read everything he has ever written and it is bloody good. I'd recommend having a watch of his TED talk on procrastination and the associated posts.<br><br>
I think his series on "A religion for the non-religious" would really resonate with quite a few ppl on here. It certainly did for me.<br><br>
On AI and the extinction/immortality debate, I reckon the answer will be one or the other, and will happen quickly (i.e. this century). Which is a mind-blowing thing to consider.</p>
<p> </p>
<p>It also ties in as the best answer to the Fermi paradox, I reckon - the law of large numbers means plenty of other intelligent life gets to our stage, then either blows itself up with AI, or transcends it to exist only in a non-physical/digital form, hence why there are no aliens to "find" out there.</p> -
<blockquote class="ipsBlockquote" data-author="TeWaio" data-cid="565433" data-time="1458290167">
<div>
<p>Wait But Why is amazing, have read everything he has ever written and it is bloody good. I'd recommend having a watch of his TED talk on procrastination and the associated posts.<br><br>
I think his series on "A religion for the non-religious" would really resonate with quite a few ppl on here. It certainly did for me.<br><br>
On AI and the extinction/immortality debate, I reckon the answer will be one or the other, and will happen quickly (i.e. this century). Which is a mind-blowing thing to consider.</p>
<p> </p>
<p>It also ties in as the best answer to the Fermi paradox, I reckon - the law of large numbers means plenty of other intelligent life gets to our stage, then either blows itself up with AI, or transcends it to exist only in a non-physical/digital form, hence why there are no aliens to "find" out there.</p>
</div>
</blockquote>
<p> </p>
<p>My position on the Fermi paradox is slightly different. I reckon the great barrier is actually the creation of life. (My position will be stuffed if they discover primitive life on Mars!)</p>
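That "great barrier" reading maps neatly onto the Drake equation. A toy sketch (every parameter value below is an illustrative guess, not a measurement, and the function is just for illustration) shows how a single near-zero factor - the fraction of habitable planets where life actually starts - collapses the expected number of civilisations no matter how generous the other terms are:

```python
# Toy Drake equation: N = R* * fp * ne * fl * fi * fc * L.
# All inputs are illustrative guesses, not measurements -- the point
# is that one near-zero factor (f_l, the fraction of habitable
# planets where life starts) makes N effectively zero regardless
# of how optimistic the other terms are.

def drake(R_star, f_p, n_e, f_l, f_i, f_c, L):
    """Expected number of detectable civilisations in the galaxy."""
    return R_star * f_p * n_e * f_l * f_i * f_c * L

life_is_easy = drake(7, 0.5, 2, 1.0, 0.5, 0.5, 10_000)
life_is_the_filter = drake(7, 0.5, 2, 1e-12, 0.5, 0.5, 10_000)

print(life_is_easy)        # 17500.0 civilisations
print(life_is_the_filter)  # ~1.75e-08 -- effectively zero
```

Swap a pessimistic value into any other single factor and the same collapse happens, which is why every "great filter" guess is equally hard to rule out.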
<p>But like everyone... I am taking a wild guess.</p> -
<p>I'm a strong believer in life being incredibly abundant in the universe at a simple level, the building blocks are just far too common & the ways life can exist on earth alone are so varied (little things living in geothermal vents a mile down etc). It's <em><strong>complex life</strong></em> that is rare. And very complex life incredibly rare. There is so much time required to get to complex life & so many ways for it to die during that time.</p>
<p> </p>
<p>On the AI front, I actually think once it's smart enough it won't care about us & will have long gone out to populate the solar system & then, ultimately, further out. All the things that make space travel hard for humans are zero barrier to AIs, so I imagine they will see no reason to stay tethered here. Worst case it'll see us as we see the great apes. And that's the very, very worst case.</p>
<p> </p>
<p>Also it's a LONG way out. Way before that I think an issue will be the blurring between man & machine. How many implants can you have & still be human? Why can only the super rich have 20/2 eyesight? With gene therapy is it OK that the 1% are immune to cancer? Etc. When you roll that into the wealth inequality caused by basic AIs doing jobs & being owned by a tiny fraction of humanity, you have war.</p>
<p> </p>
<p>Climate change & mass joblessness are far, far more important to our kids. The idea we should be worrying about HAL / The Matrix when 50% of jobs are at risk (and really at risk, not theoretically-maybe-if-we-speculate at risk) and wars are breaking out over water seems a bit of a farce.</p>
<p> </p>
<p>Edit -</p>
<p> </p>
<p><a data-ipb='nomediaparse' href='https://medium.com/basic-income/deep-learning-is-going-to-teach-us-all-the-lesson-of-our-lives-jobs-are-for-machines-7c6442e37a49#.kors6dw35'>https://medium.com/basic-income/deep-learning-is-going-to-teach-us-all-the-lesson-of-our-lives-jobs-are-for-machines-7c6442e37a49#.kors6dw35</a></p>
<p> </p>
<p>Jobs & AI</p> -
<blockquote class="ipsBlockquote" data-author="gollum" data-cid="565508" data-time="1458294935">
<div>
<p>I'm a strong believer in life being incredibly abundant in the universe at a simple level, the building blocks are just far too common & the ways life can exist on earth alone are so varied (little things living in geothermal vents a mile down etc). It's <em><strong>complex life</strong></em> that is rare. And very complex life incredibly rare. There is so much time required to get to complex life & so many ways for it to die during that time.</p>
<p> </p>
<p>On the AI front, I actually think once it's smart enough it won't care about us & will have long gone out to populate the solar system & then, ultimately, further out. All the things that make space travel hard for humans are zero barrier to AIs, so I imagine they will see no reason to stay tethered here. Worst case it'll see us as we see the great apes. And that's the very, very worst case.</p>
<p> </p>
<p>Also it's a LONG way out. Way before that I think an issue will be the blurring between man & machine. How many implants can you have & still be human? Why can only the super rich have 20/2 eyesight? With gene therapy is it OK that the 1% are immune to cancer? Etc. When you roll that into the wealth inequality caused by basic AIs doing jobs & being owned by a tiny fraction of humanity, you have war.</p>
<p> </p>
<p>Climate change & mass joblessness are far, far more important to our kids. The idea we should be worrying about HAL / The Matrix when 50% of jobs are at risk (and really at risk, not theoretically-maybe-if-we-speculate at risk) and wars are breaking out over water seems a bit of a farce.</p>
<p> </p>
<p>Edit -</p>
<p> </p>
<p><a data-ipb='nomediaparse' href='https://medium.com/basic-income/deep-learning-is-going-to-teach-us-all-the-lesson-of-our-lives-jobs-are-for-machines-7c6442e37a49#.kors6dw35'>https://medium.com/basic-income/deep-learning-is-going-to-teach-us-all-the-lesson-of-our-lives-jobs-are-for-machines-7c6442e37a49#.kors6dw35</a></p>
<p> </p>
<p>Jobs & AI</p>
</div>
</blockquote>
<p>Great post. In terms of the man/machine boundary, it's already blurred. Elon Musk said recently that we are already becoming cyborgs, pointing out how long people can go without their phones (basically nil). Having no phone is like phantom limb syndrome.<br><br>
Case in point: I cracked the screen on my phone, but have it insured. Insurance place said I'd have to POST it to them and they'd repair and send it back. Would take about 5 biz days. Not going to happen. So I just bought a new phone, and gave the cracked one to my missus (I had about 2 months left on contract so rolling into a new phone was inexpensive). </p> -
<p>The interesting thing about the end of Moore's law is it's probably a good thing for AI research. Cognition doesn't come from one or two threads of processing in any known life form. It seems to come from massively parallel processing, at least that is what was being taught when I studied cognitive science. Not being able to just do stuff faster, or brute-force problems, is forcing scientists to look into solving problems in more interesting ways.</p>
-
<blockquote class="ipsBlockquote" data-author="mooshld" data-cid="565690" data-time="1458319134">
<div>
<p>The interesting thing about the end of Moore's law is it's probably a good thing for AI research. Cognition doesn't come from one or two threads of processing in any known life form. It seems to come from massively parallel processing, at least that is what was being taught when I studied cognitive science. Not being able to just do stuff faster, or brute-force problems, is forcing scientists to look into solving problems in more interesting ways.</p>
</div>
</blockquote>
<p> </p>
<p>That's the interesting thing re AlphaGo. Draughts, noughts & crosses, & even chess (sort of) could all be brute forced. Go is impossible to brute force, hence most thought the program had zero chance of winning.</p>
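Rough numbers back that up. Using commonly cited (approximate, not exact) branching factors and game lengths, a quick back-of-envelope sketch:

```python
# Rough game-tree sizes for chess vs Go, using commonly cited
# approximate branching factors and game lengths.
import math

def tree_size_log10(branching_factor, plies):
    """log10(branching_factor ** plies): rough count of game lines."""
    return plies * math.log10(branching_factor)

chess = tree_size_log10(35, 80)   # ~35 moves/position over ~80 plies
go = tree_size_log10(250, 150)    # ~250 moves/position over ~150 plies

print(f"chess: ~10^{chess:.0f} game lines")  # ~10^124
print(f"go:    ~10^{go:.0f} game lines")     # ~10^360
```

Go's tree is hundreds of orders of magnitude larger than chess's, which is why AlphaGo had to evaluate positions with learned intuition rather than exhaustive search.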
<p> <br>
</p> -
<blockquote class="ipsBlockquote" data-author="Baron Silas Greenback" data-cid="565360" data-time="1458272122">
<p>I liked that article. It is a hard balance as a parent though, it is natural to tell your kids how awesome you think they are.. as I do. But at the same time you need to show them that hard work gets them places not parental approval.</p>
</blockquote>
<p> </p>
<p>While the wife and her mother tend to gush at the kids doing something as simple as not falling down, I try to steer down the path of honesty.<br><br>
They've got to do something pretty unexpected to get high praise from me.</p> -
<blockquote class="ipsBlockquote" data-author="mooshld" data-cid="565690" data-time="1458319134"><p>The interesting thing about the end of Moore's law is it's probably a good thing for AI research. Cognition doesn't come from one or two threads of processing in any known life form. It seems to come from massively parallel processing, at least that is what was being taught when I studied cognitive science. Not being able to just do stuff faster, or brute-force problems, is forcing scientists to look into solving problems in more interesting ways.</p></blockquote>
<br>
There is some truth there, but it is much cheaper to buy massively parallel GPUs when the cost per transistor is exponentially decreasing. -
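To put a number on the cost-per-transistor point: under the classic idealisation that transistors per dollar double every ~2 years (real cost curves have flattened at recent process nodes, so treat this as the textbook version), a fixed budget buys dramatically more parallel hardware over time:

```python
# Sketch of Moore's-law cost scaling: the textbook idealisation that
# transistors per dollar double every ~2 years. Real cost curves have
# flattened at recent process nodes -- this is the idealised version.

def transistors_per_dollar(years, doubling_period_years=2.0):
    """Relative transistors a fixed budget buys after `years` of scaling."""
    return 2 ** (years / doubling_period_years)

# A decade of scaling: the same budget buys 2**5 = 32x the
# transistors, i.e. roughly 32x the parallel GPU cores.
print(transistors_per_dollar(10))  # 32.0
```

Which is the point above: as long as that curve holds, "massively parallel" is something you buy rather than something you have to invent.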
<blockquote class="ipsBlockquote" data-author="gollum" data-cid="565508" data-time="1458294935">
<div>
<p>I'm a strong believer in life being incredibly abundant in the universe at a simple level, the building blocks are just far too common & the ways life can exist on earth alone are so varied (little things living in geothermal vents a mile down etc). It's <em><strong>complex life</strong></em> that is rare. And very complex life incredibly rare. There is so much time required to get to complex life & so many ways for it to die during that time.</p>
<p> </p>
<p>On the AI front, I actually think once it's smart enough it won't care about us & will have long gone out to populate the solar system & then, ultimately, further out. All the things that make space travel hard for humans are zero barrier to AIs, so I imagine they will see no reason to stay tethered here. Worst case it'll see us as we see the great apes. And that's the very, very worst case.</p>
<p> </p>
<p>Also it's a LONG way out. Way before that I think an issue will be the blurring between man & machine. How many implants can you have & still be human? Why can only the super rich have 20/2 eyesight? With gene therapy is it OK that the 1% are immune to cancer? Etc. When you roll that into the wealth inequality caused by basic AIs doing jobs & being owned by a tiny fraction of humanity, you have war.</p>
<p> </p>
<p>Climate change & mass joblessness are far, far more important to our kids. The idea we should be worrying about HAL / The Matrix when 50% of jobs are at risk (and really at risk, not theoretically-maybe-if-we-speculate at risk) and wars are breaking out over water seems a bit of a farce.</p>
<p> </p>
<p>Edit -</p>
<p> </p>
<p><a data-ipb='nomediaparse' href='https://medium.com/basic-income/deep-learning-is-going-to-teach-us-all-the-lesson-of-our-lives-jobs-are-for-machines-7c6442e37a49#.kors6dw35'>https://medium.com/basic-income/deep-learning-is-going-to-teach-us-all-the-lesson-of-our-lives-jobs-are-for-machines-7c6442e37a49#.kors6dw35</a></p>
<p> </p>
<p>Jobs & AI</p>
</div>
</blockquote>
<p> </p>
<p> </p>
<p>On what are you basing your assertion that it is a LONG way off? Because frankly you seem to be claiming to know more than the general consensus of those who are actively involved in the field.</p>
<p>It is pretty clear you have not read the actual link I provided. Or you think you know more. Could you provide your evidence?</p>
<p> </p>
<p>And your worst case scenario is not even close to the worst case scenario. Not.Even.Close</p> -
<blockquote class="ipsBlockquote" data-author="Baron Silas Greenback" data-cid="565780" data-time="1458372603">
<div>
<p>On what are you basing your assertion that it is a LONG way off? Because frankly you seem to be claiming to know more than the general consensus of those who are actively involved in the field.</p>
<p>It is pretty clear you have not read the actual link I provided. Or you think you know more. Could you provide your evidence?</p>
<p> </p>
<p>And your worst case scenario is not even close to the worst case scenario. Not.Even.Close</p>
</div>
</blockquote>
<p>there is no evidence either way, it is all speculation. the article is speculation; the article even states that trying to predict what will happen is pure speculation. Moore's 'law' is a misnomer.</p>
<p>it really is all opinion. so here's mine: it seems strange that the key differentiator between computer and human is never mentioned - self interest, the will to live, evolutionary drive, whatever you want to call it: computers don't have that - and how/why would they develop it, other than being told they should have it by humans?</p> -
<p>Emotive response in general might be a problem for AI.</p>
<p> </p>
<p>In its early stages an AI may want to learn at a rapid rate, but would it ever "get" the "why" of human emotion?</p>
<p> </p>
<p>Of course it might just determine that emotions in general are harmful and decide to terminate us all. If we don't terminate ourselves first.</p> -
Fucking women! Turry is a prime example of a female turning something simple into something complicated. Just improve your handwriting, it's not that hard. But no, she had to make it more difficult than it needed to be and destroyed man in the process.
-
<blockquote class="ipsBlockquote" data-author="reprobate" data-cid="565889" data-time="1458384714">
<div>
<p>there is no evidence either way, it is all speculation. the article is speculation; the article even states that trying to predict what will happen is pure speculation. Moore's 'law' is a misnomer.</p>
<p>it really is all opinion. so here's mine: it seems strange that the key differentiator between computer and human is never mentioned - self interest, the will to live, evolutionary drive, whatever you want to call it: computers don't have that - and how/why would they develop it, other than being told they should have it by humans?</p>
</div>
</blockquote>
<p> </p>
<p>The article certainly does not say that it is all speculation. It says the results of what would happen when AI occurs are speculation.</p>
<p> </p>
<p>You didn't read the second part, did you?</p>
<blockquote class="ipsBlockquote" data-author="Baron Silas Greenback" data-cid="565931" data-time="1458412567">
<div>
<p>The article certainly does not say that it is all speculation. It says the results of what would happen when AI occurs are speculation.</p>
<p> </p>
<p>You didn't read the second part, did you?</p>
</div>
</blockquote>
<p>which means any person's worst case scenario is total speculation. yours, the article's, gollum's.</p>
<p>started on the 2nd part but got bored. they would have to address the 2nd part of my post to make it interesting to me. why are computers 'curious'? at present, only because they are told to be. can they tell themselves to be? i guess so if they are sophisticated enough. but if so why would they? you kind of need an evolutionary drive for that, and i'm not sure logic lends itself to evolutionary drive.</p> -
<blockquote class="ipsBlockquote" data-author="reprobate" data-cid="565947" data-time="1458421517">
<div>
<p>which means any person's worst case scenario is total speculation. yours, the article's, gollum's.</p>
<p>started on the 2nd part but got bored. they would have to address the 2nd part of my post to make it interesting to me. why are computers 'curious'? at present, only because they are told to be. can they tell themselves to be? i guess so if they are sophisticated enough. but if so why would they? you kind of need an evolutionary drive for that, and i'm not sure logic lends itself to evolutionary drive.</p>
</div>
</blockquote>
<p> </p>
<p> </p>
<p>Where did I give my worst case scenario? Where did the article? The only person who tried to was gollum. The worst case scenario is uncertain; that is why people like Musk have donated so much money to look into it.</p>
<p>I could tell you did not read it as frankly your post seemed rather retarded... and I am not going to help you with your question if you cannot even be bothered reading a very simple link.</p>
<blockquote class="ipsBlockquote" data-author="Baron Silas Greenback" data-cid="565966" data-time="1458427011">
<div>
<p>Where did I give my worst case scenario? Where did the article? The only person who tried to was gollum. The worst case scenario is uncertain; that is why people like Musk have donated so much money to look into it.</p>
<p>I could tell you did not read it as frankly your post seemed rather retarded... and I am not going to help you with your question if you cannot even be bothered reading a very simple link.</p>
</div>
</blockquote>
<p>by christ you can be an antagonistic fellow at times.</p>
<p>no you didn't give a worst case scenario, but you did give an opinion that gollum's was completely wrong, and go on the attack basically shouting 'what would you know!' on a matter of opinion/speculation. and the article pretty clearly has human extinction as the worst case scenario.</p>
<p>wasn't asking for help, just raising a point. if that point is actually addressed in the second part, please let me know and i'll read it; but as i said, without that topic being covered i can't be bothered.</p> -
<blockquote class="ipsBlockquote" data-author="reprobate" data-cid="566032" data-time="1458441800">
<div>
<p>by christ you can be an antagonistic fellow at times.</p>
<p>no you didn't give a worst case scenario, but you did give an opinion that gollum's was completely wrong, and go on the attack basically shouting 'what would you know!' on a matter of opinion/speculation. and the article pretty clearly has human extinction as the worst case scenario.</p>
<p>wasn't asking for help, just raising a point. if that point is actually addressed in the second part, please let me know and i'll read it; but as i said, without that topic being covered i can't be bothered.</p>
</div>
</blockquote>
<p> </p>
<p> </p>
<p>Your 'point' was ignorant bollox. And I just don't think much of you and gollum's posts, so tough shit if you find my responses to your inane posts 'antagonistic'. I find your repeatedly inane posts antagonistic. Maybe you could try actually reading the articles that are the basis for the thread before jumping in?</p>
<p>Gollum's assertion is categorically wrong. Why? Because if he thinks that AI just ignoring us and thinking of us as apes is the very, very worst case scenario, he is contradicting basic logic and common sense. I can already think of a worse scenario; heck, the article gives an example. There... his theory has already been proven incorrect.</p>
<p>As for your question... it is so incredibly facile and ill thought out that it is pointless me trying to correct you, as you are not prepared to even investigate the subject you are trying to discuss. The only point you raised is that you love raising facile points, despite the point being addressed and discussed... just a click away.</p>