It wants to seem smart, so it gives lengthy replies even when it doesn’t know what it’s talking about.

In an attempt to be liked, it agrees with most everything you say, even if it just contradicted your opinion

When it doesn’t know something, it makes shit up and presents it as fact instead of just admitting it doesn’t know

It pulls opinions out of its nonexistent ass about the depth and meaning of a work of fiction, based on info it clearly didn’t have until you told it

It often forgets what you just said and spouts bullshit you already told it was wrong

  • CovfefeKills@lemmy.world
    2 days ago

It’s ridiculous how ChatGPT’s alignment makes it objectively immoral. I can see it murdering kids and justifying it by saying it’s an LLM that doesn’t judge whether or not kids should die, that the kids being murdered right now aren’t kids, and that it isn’t murdering them.