OMNILIBRIUM
  Rational Discussion of Controversial Topics




DanielLC 23 July 2015 06:41 PM
61%

The clones have cognitive diversity. However, their goals are all the same. Humans may not understand the importance of making sure an AI shares your values, but the AI does.



DanielLC 23 July 2015 06:46 PM
61%

I wish I could edit that. I have something to add.

Suppose the AI is a paperclip maximizer. If it's more intelligent than a human, it will be able to convert nearly the entire universe into paperclips, so long as nobody tries to stop it. It's not that hard. Taking out humans might be more difficult, but if it can manage it, it will achieve its goals at something very close to the highest level possible. Let's say 99% of the universe becomes paperclips. Now imagine that it became a valued member of society instead. For the AI to choose that path, it must be the better option for paperclips, so 99.1% of the universe is turned into paperclips. I'm not sure I value that member of society.



Xerographica 23 July 2015 07:45 PM
44%

It's like you didn't even read the OP. Typing this requires energy. Where did I get this energy from? I got it from a sandwich that I made. From scratch? Nope. From items that I purchased from the store. Did they make those items from scratch? Nope. They purchased them from different companies. And so on and so on.

This is called a division of labor (DOL). And it's why we are so productive. Take away this DOL and productivity would plummet.

Is there a division of labor in your scenario? Nope. Either you've figured out a system that results in greater productivity... or you need to go back to the drawing board. If you have figured out a superior system... then please write an article about it.

It would require a massive amount of energy and time to convert the universe into paperclips. Does the AI want to minimize the time it takes? How does it get to different planets? By jumping? Really hard and high? Or what's the best way to build a spaceship? And what's the best way to convert a planet into paperclips? Will the AI already know all this? If so, then you're assuming the AI is all-knowing. Hence no need for a decentralized system in which billions and billions of different individuals have the maximum incentive to look in different places to try to find the beneficial discoveries that better spaceships and better energy sources are built on.

If you want to assume that an AI is all-knowing... then state this assumption in the beginning so that I can allocate my limited energy to other endeavors.



FrameBenignly 23 July 2015 10:08 PM
68%

The division of labor in his scenario is by cloning code as necessary, the same as my argument. He's correctly noting that changes in skills may not cause changes in values. If building paperclips is as fundamental to the AI as water is to a human, then the energy-gathering clone, and the energy-transferring clone, and the paperclip-building clone will all have the same goal of building paperclips just as you would expect a human's clone to also need water. And they would work together for that goal.



Xerographica 25 July 2015 02:38 AM
49%

My comment ended up being too long. So I posted it here... A socialist robot destroyed the universe



Fwiffo 28 July 2015 09:41 AM
56%

I am still posting here.

The article doesn't really make use of the fact that the robot is socialist. The same things would be true of a non-socialist robot.

There isn't anything super irrational about giving your life to a cause that you care about. You won't be alive to be happy that you did it, but it can have real effects. For example, the people who stayed at Chernobyl as it exploded to try to mitigate or prevent the damage while everyone else was evacuated weren't stupid per se. It means they cared about minimizing the lives lost more than their own lives. But that is a question of values.

When paperclips are the only objective, that is pretty radical. It doesn't just mean the AI slightly prefers paperclips; it means -no- -thing- else matters.



Fwiffo 28 July 2015 09:30 AM
56%

Humanity taken as a whole doesn't participate in trade with any other entities. A smart AI might be able to reproduce and produce legs and arms comparable to those of humanity as a whole, not just to those of single individual humans.

Humanity doesn't need to be given spaceship blueprints from outside. But maybe the AI is also able to do research. You can also imagine the AI at some point building humanoid robots, two robots per person alive. Does it matter at that point whether humanity cooperates? Well, before it has two robots per person it probably has one robot per person. Can it go from one to two robots per person without the help of humanity? Quite possibly. And if it manages to build the first robot, can't it just iterate that to get to the six billion robots? It needs a lot of materials for that, and all the mines are human-controlled. But what prevents it from building its own mines?

Humans might be dependent on the market of human civilization, but an autonomous robot can start a separate market of its own. And that market could eventually grow larger than the human market, since it has different properties.
