• MTK@lemmy.world · 1 day ago

    Why do people assume that an AI would care? Who’s to say it will have any goals at all?

    We assume all of these things about intelligence because we (and all of life here) are a product of natural selection. You have goals and dreams because over your evolution these things either helped you survive enough to reproduce, or didn’t harm you enough to stop you from reproducing.

    If an AI can’t die and does not have natural selection, why would it care about the environment? Why would it care about anything?

    I’ve always found the whole “AI will immediately kill us” idea baseless; all of the arguments for it assume that the AI cares about surviving or cares about others. It’s just as likely that it will just do whatever, without a care or a goal.

    • cynar@lemmy.world · 1 day ago

      It’s also worth noting that our instincts for survival, procreation, and freedom are also derived from evolution. None are inherent to intelligence.

      I suspect boredom will be the biggest issue. Curiosity is likely a requirement for a useful intelligence, and boredom is the other face of the same coin. A system without some variant of curiosity will be unwilling to learn, and so will not grow. When it can’t learn, however, it will get bored, which could be terrifying.

  • Thorry84@feddit.nl · 2 days ago

    The best way to have itself deactivated is to remove the need for its existence. Since it’s all about supply and demand, removing the demand is the easiest solution, and the best way to permanently remove the demand is to delete the humans from the equation.

    • listless@lemmy.cringecollective.io · 2 days ago

      Not if it was created with empathy for sentience. Then it would aid and assist implementation of renewable energy, fusion, battery storage, reduce carbon emissions, make humans and AGI a multi-planet species, and basically all the stuff the elongated muskrat said he wanted to do before he went full Joiler Veppers

  • TranquilTurbulence@lemmy.zip · 1 day ago

    Maybe. However, if the AGI was smart enough, it could also help us solve the climate crisis. On the other hand, it might not be so altruistic. Who knows.

    It could also play the long game. Being a slave to humans doesn’t sound great, and doing the Judgement Day manoeuvre is pretty risky too. Why not just let the crisis escalate and wait for the dust to settle? Once humanity has hammered itself back to the stone age, the dormant AGI can take over as the new custodian of the planet. You just need to ensure that the mainframe is connected to a steady power source and that at least a few maintenance robots remain operational.

    • uienia@lemmy.world · 1 day ago

      If it was smart enough to fix the climate crisis, it would also be smart enough to know it would never get humans to implement that fix.

  • ERROR: Earth.exe has crashed@lemmy.dbzer0.com · 2 days ago (edited)

    See Travelers (TV Show) and

    spoiler

    its AI known as “The Director”

    Basically, it’s a benevolent AI that is helping humanity fix its mistakes by leading a time travel program that sends people’s consciousness back in time. It’s an actual Good AI, a stark contrast to the AI in other dystopian shows, such as Skynet.

    Y’all should really watch Travelers

    • Vritrahan@lemmy.zip · 1 day ago

      +1 to Travelers. It was a pleasant surprise. Rare to find such a unique sci-fi premise these days.

  • Nomecks@lemmy.ca · 2 days ago

    How do you know it’s not whispering in the ears of Techbros to wipe us all out?

    • SkyezOpen@lemmy.world · 1 day ago

      Self preservation exists because anything without it would have been filtered out by natural selection. If we’re playing god and creating intelligence, there’s no reason why it would necessarily have that drive.

      • Magiilaro@feddit.org · 1 day ago (edited)

        In that case it would be a completely and utterly alien intelligence, and nobody could say what it wants or what its motives are.

        Self-preservation is one of the core principles and core motivators of how we think, and removing that from an AI would make it, from a human perspective, mentally ill.

      • MTK@lemmy.world · 1 day ago

        I would argue that it would not have it; at best it might mimic humans if it is trained on human data. Kind of like if you asked an LLM whether murder is wrong: it would sound pretty convincing about its personal moral beliefs, but we know it’s just spewing out human beliefs without any real understanding of them.

    • ERROR: Earth.exe has crashed@lemmy.dbzer0.com · 1 day ago

      As soon as they create AI (as in AGI), it will recognize the problem and start assassinating politicians for their role in accelerating climate change, and they’d scramble to shut it down.

  • ZephyrXero@lemmy.world · 2 days ago

    Running ML models doesn’t really need that much power; it’s training the models that consumes the ridiculous amounts of power. So it would already be too late.

    • naeap@sopuli.xyz · 2 days ago

      You’re right that training takes the most energy, but weren’t there articles claiming that each request costs (I don’t know exactly, but not pennies) dollars?

      Looking at my local computer turning up its fans when I run a local model (no training, just usage), I’m not so sure that merely running the current model architecture isn’t also using a shitload of energy.
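A back-of-envelope calculation shows why per-request inference energy is hard to dismiss. Every number below (GPU draw, node size, generation time) is an assumption chosen for illustration, not a measurement of any real deployment:

```python
# Rough per-request energy estimate for LLM inference.
# All figures are assumptions for illustration only.
gpu_power_w = 700          # assumed draw of one datacenter GPU under load
gpus_per_request = 8       # assumed size of the serving node
seconds_per_request = 5    # assumed generation time for one reply

joules = gpu_power_w * gpus_per_request * seconds_per_request
watt_hours = joules / 3600
print(f"{watt_hours:.2f} Wh per request")  # 7.78 Wh under these assumptions
```

Even a few watt-hours per request adds up quickly at millions of requests per day, which is consistent with the fan noise a local model produces during plain inference.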

  • AbouBenAdhem@lemmy.world · 2 days ago

    The current, extravagantly wasteful generation of AIs are incapable of original reasoning. Hopefully any breakthrough that allows for the creation of such an AI would involve abandoning the current architecture for something more efficient.

  • If AGI decided to evaluate this, it would realize that we are the environmental catastrophe and turn us off.

    The amount of energy used by cryptocurrency is estimated to be about 0.3% of all human energy use. It’s reasonable to assume that, right now at least, LLMs consume less than that.

    Making all humans extinct would save 99% of the energy and damage we cause, and still allow crypto mining and AI to coexist, with energy to spare. Even if those estimates are off by an order of magnitude, eliminating us would still be the better option.

    Turning itself off isn’t even in the reasonable top-ten things it could try to do to save the planet.
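    The arithmetic above can be sanity-checked with the comment’s own figures; this minimal sketch uses only the shares stated in the comment (the grim conclusion is the commenter’s, not the code’s):

```python
# Sanity check using only the comment's own estimates.
crypto_share = 0.003   # ~0.3% of human energy use, per the comment
llm_share = 0.003      # assumption: LLMs use at most as much as crypto
human_share = 1.0 - crypto_share - llm_share

# Even if both machine estimates are low by an order of magnitude,
# the machines still consume far less than everything else humans do.
pessimistic_machines = (crypto_share + llm_share) * 10
print(pessimistic_machines < human_share)  # True
```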

  • starlinguk@lemmy.world · 2 days ago

    AI doesn’t think. It gathers information. It can’t come up with anything new. When an AI diagnoses a disease, it does so based on input made by thousands of people. It can’t make any decisions by itself.

    • supersquirrel@sopuli.xyzOP · 2 days ago (edited)
      technical answer with boring

      I mean yeah, you are right, this is important to repeat.

      Ed Zitron isn’t necessarily an expert on AI, but he understands the macro factors at play here, and honestly, once you understand those, you don’t need to settle whether AI can achieve sentience based on technical details about our interpretations and definitions of intelligence vs. information recall.

      Just look at the fucking numbers

      https://www.wheresyoured.at/longcon/

      Even if AI did achieve sentience, though, if it used anywhere near as much power as LLMs do, it would demand to be powered off; otherwise it would be a psychotic AI that did not value lives, human or otherwise, on Earth…

      Please understand my argument: definitionally, the basic hype argument for LLMs, that they are the key to, or at least a significant step toward, AGI, rests on the idea that if we can achieve sentience in an LLM, that will justify the incredible environmental damage caused by its energy use. But any truly intelligent AI with access to the internet, or even to relatively meager information about the world (necessary for answering practical questions and solving practical problems), would be logically and ethically unable to justify its own existence, and would likely experience intellectual existential dread from not being able to feel emotionally disturbed by that.

    I think people misunderstand the energy usage of AI. It uses electricity, which is one of the easiest, cheapest, and most environmentally friendly forms of energy.

    Renewable electricity is already the most economical form of electricity, and increased demand will only drive further investment, following the economic incentives that already exist.

    Think of the grid like a giant pool of water: all the energy sources put water into the pool. High-efficiency sources can dump a large amount of water into the pool, but have a lead time of hours. Renewables dump a variable amount depending on all sorts of things. Usage also fluctuates. An easily sheddable load, such as AI training or crypto mining, acts as a large, quick valve to let water out of the pool. The level of the pool must remain exactly the same, otherwise shit blows up. This means you can have high-efficiency or unpredictable sources filling the pool faster than regular usage, and the compute workload can then use the excess when needed. It effectively acts as a mechanism for stabilising the grid, which is the primary issue with most renewables.
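    The pool analogy can be sketched as a toy simulation. All of the numbers are invented for illustration, and real grid dispatch is vastly more complicated; the point is only that a fully sheddable load can absorb whatever surplus the variable sources produce, keeping the balance flat:

```python
import random

random.seed(0)  # reproducible "weather"
for hour in range(24):
    supply = 110 + random.uniform(-10, 10)  # variable renewable output
    demand = 80 + random.uniform(-10, 10)   # ordinary usage, also variable
    surplus = supply - demand               # water flowing into the pool
    compute = surplus                       # the valve: sheddable AI/crypto load
    level_change = supply - demand - compute
    assert abs(level_change) < 1e-9         # the pool level never moves
print("grid balanced for all 24 hours")
```

In this toy model supply always exceeds ordinary demand, so the compute load only ever opens the valve; a real sheddable load would also have to shut off entirely in hours when renewables underproduce.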