• JackGreenEarth@lemm.ee
    link
    fedilink
    English
    arrow-up
    3
    ·
    1 year ago

    Wow, I think you need to hear about the paperclip maximiser.

    Basically, you tell an AGI to maximise the number of paperclips. As that is its only goal and it wasn’t programmed with human morality, it starts making paperclips, then it realised humans might turn it off, and that would be an obstacle to maximising the amount of paperclips. So it kills all the humans and turns them into paperclips, turns the whole planet into paperclips - turns all the universe it can access into paperclips because when you’re working with a superintelligence, a small misalignment of values can be very fatal.