• ocassionallyaduck@lemmy.world · 6 months ago

    The cat's out of the bag on this. Trying to ban it may be enforceable for now, because the models are mostly hosted online and compute-intensive.

    But in 2028, when you can train your own model and generate images locally without burning a server farm? That has to happen for ML to keep growing and catch on.

    Welp. Then there is infinite fake child porn, because you cannot police every device and model.

    Because of how tech companies have handled this technology, this is not an “if” scenario. It is guaranteed now.

    • Asafum@feddit.nl · 6 months ago

      > Because you cannot police every device and model.

      FBI: “Challenge accepted. Hey Microsuck, let’s collaborate on a windows feature that records everything you do.”

      Microsuck: “one step ahead of you. We already have it.” (Not a joke.)

      • Karyoplasma@discuss.tchncs.de · 6 months ago

        You cannot force people to use Micro$oft. But I’m sure it would only increase their market share, because it will be mediatized in a way that depicts operating systems that don’t invade your privacy as morally evil, since good guys have nothing to hide. Kinda like how pleading the Fifth got reframed in the public mind as a silent admission of having committed a crime.

    • TheObviousSolution@lemm.ee · 6 months ago

      I remember when they tried to do the same with CRISPR. Glad that didn’t take off and remained largely limited to industry and academia. But then again, Wuhan …