

If there is a slow takeoff, meaning AI intelligence increases more slowly than governments can legislate it, this might become a matter of public debate. However, if computing power is still doubling every 30 months when AI reaches general intelligence, that rate seems like a lower bound on how fast intelligence could increase; the upper bound is unknown. The entire planet could change in a matter of minutes. Under these faster singularity takeoff scenarios, the future of the planet will be decided by the first company to make the breakthrough, not by a government. Nick Bostrom's book Superintelligence has a long, multi-chapter discussion of how to choose the AI's utility function ahead of the takeoff. The solution is not simple, which is why there are entire organizations trying to solve the problem.

It seems ironic that many companies and scientists funded by government or industry are working hard, some dedicating their whole careers to something that seems likely to lead to the end of the human race, and no one seems too concerned about it. Of course, if this happens slowly enough, there is a chance we'll be able to stop and think at some point. But there is a real chance that this will snowball fast. This is crazy.