We’ve preliminarily found that TikTok is in breach of the DSA for its addictive design.

There are important indicators of compulsive use of the app that TikTok has disregarded in its risk assessment.

Key features of the addictive design:

  • Infinite scroll
  • Autoplay
  • Push notifications
  • Highly personalised recommender system

We consider that TikTok needs to change the basic design of its service.

TikTok can now exercise its right to examine and reply to the finding.

https://link.europa.eu/9fTwmJ

  • mitram@lemmy.pt · 2 days ago

    Good, but aren’t all corporate social media platforms implementing these same dark patterns? Is anyone aware of investigations into Meta or Google platforms?

    • Edna (dey/sie)@feddit.org · 2 days ago

      It would be so cool if the EU forbade big tech from using these patterns. I think they should also add more restrictions on advertisements, since that money is what’s driving so much of this.

  • plyth@feddit.org · 1 day ago

    If proven, these failures would constitute infringements of Articles 34(1), 34(2), 35(1), 28(1), 39(1), and 40(12) of the DSA.

    https://ec.europa.eu/commission/presscorner/detail/en/ip_24_926

    Article 34:

    https://dsa-library.com/article/34/

    1. Providers of very large online platforms and of very large online search engines shall identify, analyse and assess any systemic risks in the Union stemming from the design, functioning or use, including manipulative or exploitative use, of their services or related technological systems, or from the specific characteristics of the content disseminated on their services, in particular:

    (a) the dissemination of illegal content through their services;

    (b) any actual or foreseeable negative effects for the exercise of fundamental rights, in particular the fundamental rights to respect for private and family life, freedom of expression and information, the prohibition of discrimination and the rights of the child;

    (c) any actual or foreseeable negative effects on civic discourse and electoral processes, and public security;

    (d) any actual or foreseeable negative effects in relation to gender-based violence, the protection of public health and minors and serious negative consequences to the person’s physical and mental well-being.

    2. When conducting the risk assessments pursuant to paragraph 1, providers of very large online platforms and of very large online search engines shall take into account, in particular, how the following factors influence the systemic risks referred to in that paragraph:

    (a) the content moderation systems of the provider, including algorithmic decision-making and content recommendation systems;

    (b) the terms and conditions of use;

    (c) systems for selecting and presenting advertisements, where applicable;

    (d) data-related practices of the provider.