• TommySoda@lemmy.world
    link
    fedilink
    arrow-up
    72
    ·
    edit-2
    7 days ago

    Let’s not do anything about the unregulated technology that can spread lies faster than ever before as websites get absolutely flooded with believable bots that outnumber the actual users. Let’s make secret passwords and handshakes like we’re in a clubhouse.

    Regardless, it’s not a bad idea, since it’s probably not gonna get better for a while, if at all.

    • Ledivin@lemmy.world
      link
      fedilink
      arrow-up
      29
      ·
      7 days ago

      The technology is out. While something should be done on that side of things, it also doesn’t remove the technology from existence - you will still need other protections.

    • rudyharrelson@lemmy.radio
      link
      fedilink
      arrow-up
      9
      ·
      7 days ago

      Regulations virtually always lag years behind technology, don’t they? In the interim period with absolutely no regulations, we must take it upon ourselves to protect ourselves and loved ones from being exploited.

      Given just how wealthy the AI bubble is making some people, we may not see any common sense regulation for quite some time. Best to adapt to that reality imo. Gonna tell my friends and family to call me by my hacker alias, “X360N0_sc0peX” on the phone or I’ll assume they’re a bot.

    • zecg@lemmy.world
      link
      fedilink
      arrow-up
      5
      ·
      6 days ago

      What can be done? You can download an LLM and run it locally; they’re not going away.

      • kibiz0r@midwest.social
        link
        fedilink
        English
        arrow-up
        4
        ·
        7 days ago

        vigilance

        Vigilance is like, not drinking the water that comes out of a nuclear reactor.

        What we’re talking about here is letting everyone run their own reactor and dump the waste into the street.

        You don’t gain vigilance, you lose all habitable public space.

        • TechLich@lemmy.world
          link
          fedilink
          arrow-up
          2
          ·
          6 days ago

          It’s a bit late for that. This particular nuclear reactor is open source, free to download and runs on consumer hardware. Can’t really unfry that egg and the quality is getting better all the time. Identity fraud is already illegal in most places so not sure exactly what regulation would be appropriate here.

          • phneutral@feddit.org
            link
            fedilink
            arrow-up
            1
            ·
            6 days ago

            First of all: you need giant data centres to train the models.

            Identity fraud is illegal, and copyright theft is illegal as well; put the blame on the owners of the data centres.

            I know from reliable sources that governments know who these folks are.

            • TechLich@lemmy.world
              link
              fedilink
              arrow-up
              1
              ·
              6 days ago

              Not entirely true. You don’t need your own personal data centre; you can use GPU cloud instances for a lot of that stuff. It’s expensive, but not so expensive that it would be impossible without being a huge tech company (thousands of dollars, not billions). This can be done by anyone with a credit card and some cash to burn. Also, you don’t need to train a model from scratch; you can build on existing models that others have published to cut down on training.

              However, to impersonate someone’s voice you don’t need any of that. You only need about 5-10 seconds of audio for a zero-shot impersonation with a pre-trained model. A minute or so for few-shot. This runs on consumer hardware and in some cases even in real time.

              Even to build your own model from scratch for high-quality voice audio, there doesn’t need to be a huge amount of initial training data. Something like xtts was trained on about 10-15K hours of English audio, which is actually pretty easy to come by in the public domain. There are a lot of open and public research datasets specifically for this kind of thing, no copyright infringement necessary. If a big tech company wants more audio data than what’s publicly available, they just pay people to record it; no need to steal audio or risk copyright claims and breaking surveillance laws when they have a budget to get people to record whatever they want.

              This tech wasn’t invented by some evil giant tech company stealing everybody’s data, it was mostly geeky computer scientists presenting things at computer speech synthesis conferences. That’s not to say there aren’t a bunch of huge evil tech companies profiting from this or contributing to this kind of tech, but in the context of audio deepfakes being accessible to scammers, it’s not on them and I don’t think that some kind of extra copyright regulation on data centres would do anything about it.

              The current industry leader in this space, in terms of companies trying to monetize speech synthesis, is ElevenLabs, a private start-up with only a few dozen employees.

              The current tech is not perfect but definitely good enough to fool someone who isn’t thinking too hard over a noisy phone call and a scammer doesn’t need server time or access to a data centre to do it.

    • arglebargle@lemm.ee
      link
      fedilink
      English
      arrow-up
      2
      ·
      edit-2
      6 days ago

      Websites have been full of shit, bots or not, since forever. Nothing new here.

  • ERROR: Earth.exe has crashed@lemmy.dbzer0.com
    link
    fedilink
    English
    arrow-up
    33
    ·
    7 days ago

    Secret phrases seem like they could just get wiretapped, and then they’re no longer secret.

    You’re gonna need to change them every day, nay, every conversation.

    Might need some RSA 4096 to handshake each phone call for authentication, and might as well do encryption too.

    Or we might need to generate some One Time Pads and then do a challenge-response thing by reading 5 digits, then have the other person reply 5 digits after that, then the numbers are crossed out.
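
    The one-time-pad challenge-response above can be sketched in a few lines of Python. This is a toy illustration of the idea (the `PadChecker` name and layout are made up, not any real protocol): both sides hold an identical printed list of digit groups, the caller reads the next unused group, the other party answers with the group after it, and both groups get crossed out.

    ```python
    import secrets

    def make_pads(n: int, digits: int = 5) -> list[str]:
        """Generate n random digit groups; each side keeps an identical copy."""
        return ["".join(str(secrets.randbelow(10)) for _ in range(digits))
                for _ in range(n)]

    class PadChecker:
        def __init__(self, pad: list[str]):
            self.pad = list(pad)
            self.pos = 0  # index of the next unused group

        def challenge(self) -> str:
            # Caller reads the next unused group aloud.
            group = self.pad[self.pos]
            self.pos += 1
            return group

        def verify(self, response: str) -> bool:
            # The other party must answer with the group *after* the challenge.
            ok = self.pos < len(self.pad) and response == self.pad[self.pos]
            self.pos += 1  # cross out the group whether or not it matched
            return ok
    ```

    Each group is spoken at most once, so a wiretapped call only burns the groups it used; the rest of the pad stays secret.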

    The future is gonna be so weird.

    I feel like CSAM might go out of control.

    Any video of politicians/candidates doing bad things will just get met with “CNN FAKE NEWS DEEPFAKE”.

    Like, you could just murder someone on a 4K camera and claim it’s a deepfake.

    We’re so fucked.

  • InternetCitizen2@lemmy.world
    link
    fedilink
    English
    arrow-up
    15
    ·
    7 days ago

    HA HA HA fellow humankind member. This has given me a pointer to a disk location of a friend back in university

    Lol but seriously, back in uni a friend of mine got their social media hacked. The hacker was trying to beg for money and such. One person got suspicious and asked what their favorite beer was, so the scammer texted me “hey what is my favorite beer?”

    Fortunately the account got locked for some reason, so no money was stolen. Bro still has not recovered it.

      • Daemon Silverstein@thelemmy.club
        link
        fedilink
        arrow-up
        3
        ·
        7 days ago

        My previous comment is a reference to the Supernatural TV series. The protagonist brothers Sam and Dean Winchester had Poughkeepsie as a distress signal whenever one of them needed to inform the other to “pack up and run”. One of the situations involved Dean telling Crowley the distress signal so Crowley could enter Sam’s mind and warn him about his ongoing angelic possession.

  • Infynis@midwest.social
    link
    fedilink
    English
    arrow-up
    5
    ·
    7 days ago

    Saying the same thing over and over again in different conversations would be super useful if your goal is to train an AI to listen to calls

    • Delphia@lemmy.world
      link
      fedilink
      arrow-up
      3
      ·
      6 days ago

      It’s honestly not hard:

      “Son, it’s Dad, I’ve had to borrow a phone…”

      “Before I transfer the money, Dad, when’s Gran getting out of the hospital?”

      (Gran’s been dead for a decade.)
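
      If you’d rather not rely on remembering family trivia, the same shared-secret idea can be automated with a time-based code, the way 2FA apps work (RFC 6238 TOTP on top of RFC 4226 HOTP). A minimal sketch using only Python’s standard library; the function names here are just illustrative:

      ```python
      import hashlib
      import hmac
      import struct
      import time

      def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
          """RFC 4226 HMAC-based one-time password."""
          mac = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
          offset = mac[-1] & 0x0F  # dynamic truncation
          code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
          return str(code % 10 ** digits).zfill(digits)

      def totp(secret: bytes, step: int = 30) -> str:
          """RFC 6238: derive the counter from the current 30-second window."""
          return hotp(secret, int(time.time()) // step)
      ```

      Dad reads out the current 6-digit code, you compute it from the same shared secret, and a wiretapped call is useless 30 seconds later.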

  • Hiro8811@lemmy.world
    link
    fedilink
    arrow-up
    1
    ·
    6 days ago

    I mean, close family members will recognise whether it’s really you, so scammers will have to impersonate someone else, most likely someone they find through Facebook or other platforms. Still, this seems like a way to make people even more distinguishable from one another.