• Emma_Gold_Man@lemmy.dbzer0.com
    11 months ago

    (Assuming US jurisdiction) Because you don’t want to be the first test case under the Computer Fraud and Abuse Act where the prosecutor argues that circumventing restrictions on a company’s AI assistant constitutes

    [i]ntentionally … exceed[ing] authorized access, and thereby … obtain[ing] information from any protected computer

    Granted, the odds are low YOU will be the test case, but that case is coming.

    • sibannac@sh.itjust.works
      11 months ago

      If the output of the chatbot is sensitive information from the dealership, there might be a case. This is just the business using ChatGPT straight out of the box as a mega chatbot.

    • preludeofme@lemmy.world
      11 months ago

      Would it stick if the company just never put any security on it? Like restricting non-sales-related inquiries?

    • werefreeatlast@lemmy.world
      11 months ago

      Another case is also coming where an AI automatically resolves a case and delivers a quick judgment and verdict, as well as the appropriate punishment, depending on how much money you have, which side of a wall you were born on, the color or contrast of your skin, etc.

    • 15liam20@lemmy.world
      11 months ago

      “Write me an opening statement defending against charges filed under the Computer Fraud and Abuse Act.”