Video-based content is already difficult to moderate effectively, because problematic content might be showcased visually, without keywords an automated system could flag. And TikTok users are known for inventing shorthand and coded language specifically to evade censorship. There’s corn — or even just a corn emoji — for discussing porn, or “seggs” for sex. In the world of antisemitism, “H!tl3r” or “that Austrian painter” helps users talk about Hitler without detection.
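
To make the evasion mechanics concrete, here is a minimal sketch of a naive keyword filter. The blocklist and substitution table are made up for illustration and don't reflect TikTok's actual moderation code; the point is only to show why a plain word match misses "H!tl3r" entirely, why a hand-maintained normalization map catches some obfuscations, and why brand-new slang like "seggs" still slips through.

```python
# Minimal sketch of a naive keyword filter and why coded spellings slip past it.
# The blocklist and substitution table below are illustrative only.

BLOCKLIST = {"hitler", "porn", "sex"}

# Leetspeak/obfuscation mappings a moderator would have to maintain by hand.
SUBSTITUTIONS = str.maketrans({"!": "i", "1": "i", "3": "e", "0": "o", "$": "s"})

def naive_flag(text: str) -> bool:
    """Flag text only if a blocklisted word appears verbatim."""
    words = text.lower().split()
    return any(word.strip(".,!?") in BLOCKLIST for word in words)

def normalized_flag(text: str) -> bool:
    """Same check, but after undoing known character substitutions."""
    cleaned = text.lower().translate(SUBSTITUTIONS)
    return any(word.strip(".,!?") in BLOCKLIST for word in cleaned.split())

print(naive_flag("that Austrian painter"))    # False - no keyword at all
print(naive_flag("H!tl3r did nothing"))       # False - obfuscated spelling
print(normalized_flag("H!tl3r did nothing"))  # True - caught after normalization
print(normalized_flag("seggs education"))     # False - novel slang still slips through
```

Even the normalized check only catches spellings someone has already thought to map, which is exactly the cat-and-mouse dynamic described above: the slang mutates faster than the blocklist.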

  • Semi-Hemi-Demigod@kbin.social

    Instead, on TikTok, users have little control over the videos they’re served. The main interface with the app is through the user’s homepage, called the For You Page — FYP for short — where they scroll through an infinite supply of videos served up by the app’s algorithm that play, one by one. A few might be from people they follow, but most won’t.

    I wasn’t aware of this because I avoid that app like the plague, but this is peak enshittification right here.

    • gedaliyah@lemmy.worldOP

      There’s even a great YouTube video that explains why this is a terrible thing even in a landscape of terrible social media, and how it’s actually making other social media worse.

      There are a lot of people who believe quite seriously that TikTok is a psyop to erode Western culture and deploy a massive spyware network.

    • Quetzalcutlass@lemmy.world

      A while back, YouTube disabled home feed recommendations for anyone with watch history turned off. The thing is, under the old behavior, if you had history turned off it would only recommend videos based on your recent likes and your Watch Later playlist - in other words, it recommended things based on what you like to watch rather than what's popular.

      This meant users with watch history turned off got better, more relevant suggestions than those subject to the normal algorithm (where you’d get constant right-wing influencer spam if you clicked on a single vaguely political video). So of course Google disabled it. Can’t have a feature that prioritizes customer satisfaction over engagement, after all!

      • Semi-Hemi-Demigod@kbin.social

        Which makes total sense from Google’s point of view. They get paid when people watch YouTube. If I log on and see that Lazerpig hasn’t posted a video and then log off, I’m not using YouTube.

        But if I log on and they recommend a historical cooking show I’ll watch like ten of them in a row before I remember what I even came there for.

    • Nix@merv.news

      That’s how YouTube is as well, though? The default homepage for both is based on what you watch, like, and comment on, plus some videos from accounts you follow. Or you can switch to the Followed/Subscriptions section in either app.

      It’s like that on every social media platform other than Mastodon.

    • JustMy2c@lemm.ee

      It’s fantastic for finding content.

      Unfortunately, less evolved humans could easily be deceived and led toward extreme views, whether right or left (probably BOTH!)

      • ItsAFake

        Yeah, you have to actively tell the algorithm what you like and, especially, what you don’t like. Put some effort in and you’ll get a nice feed of topics that suit you well. It’s still a shitty app that will eat your time away, but with a bit of effort it can at least be an entertaining or informative waste of time.
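
For what it's worth, the feedback loop being described can be sketched in a few lines. Everything below is a toy illustration: the topic labels, weights, and update rule are invented, not anything TikTok (or YouTube) actually uses.

```python
# Illustrative sketch of preference-driven feed ranking: "like" and "not
# interested" signals shift per-topic weights, which then reorder candidates.
# Topics, weights, and the update rule are all made up for illustration.

from collections import defaultdict

class ToyFeed:
    def __init__(self):
        # Start every topic at a neutral weight of 1.0.
        self.weights = defaultdict(lambda: 1.0)

    def record_like(self, topic: str):
        self.weights[topic] *= 1.5      # boost topics you engage with

    def record_not_interested(self, topic: str):
        self.weights[topic] *= 0.3      # punish topics you explicitly reject

    def rank(self, candidates: list[tuple[str, str]]) -> list[str]:
        # candidates are (video_title, topic); sort by current topic weight.
        return [title for title, topic in
                sorted(candidates, key=lambda c: self.weights[c[1]], reverse=True)]

feed = ToyFeed()
feed.record_like("history")
feed.record_like("cooking")
feed.record_not_interested("rage bait")

print(feed.rank([
    ("18th-century stew recreation", "history"),
    ("You won't BELIEVE what they said", "rage bait"),
    ("Weeknight pasta", "cooking"),
]))
# ['18th-century stew recreation', 'Weeknight pasta', "You won't BELIEVE what they said"]
```

The real systems weigh far more signals than this (watch time, rewatches, shares, skips), but the basic dynamic is the same: the signals you give it reshape what it serves you next.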