“There’s no way to get there without a breakthrough,” OpenAI CEO Sam Altman said, arguing that AI will soon need even more energy.

    • JDubbleu@programming.dev
      10 months ago

      In fairness, the computing world has seen unfathomable efficiency gains, pushed further still by the sudden adoption of ARM. We are doing our damnedest to make computers faster and more efficient, and we’re doing a really good job of it, but energy production hasn’t seen anywhere near those gains over the same period. The sudden widespread adoption of AI, a very power-hungry tool (because it’s essentially emulating a brain in a computer), has caused a spike in the energy needed by computers that are already getting more efficient as fast as we can make them. Meanwhile, energy production isn’t keeping up with the same rate of innovation.

      • Womble@lemmy.world
        10 months ago

        The problem there is the paradox of efficiency (the Jevons paradox): making something more efficient often ends up using more of it, not less, because the increase in use stimulated by the greater efficiency outweighs the reduction in input per use.
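        The rebound effect above can be sketched with a toy calculation (all numbers are made up for illustration):

        ```python
        # Hypothetical numbers, a minimal sketch of the Jevons paradox:
        # efficiency doubles, but demand stimulated by cheaper compute triples.

        energy_per_query_before = 1.0   # arbitrary energy units per query
        queries_before = 100

        energy_per_query_after = 0.5    # 2x efficiency gain
        queries_after = 300             # usage grows faster than efficiency

        total_before = energy_per_query_before * queries_before  # 100.0
        total_after = energy_per_query_after * queries_after     # 150.0

        # more efficient per query, yet more total energy consumed
        print(total_after > total_before)  # True
        ```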

      • Blóðbók@slrpnk.net
        10 months ago

        It’s not so much the hardware as it is the software and utilisation, and by software I don’t necessarily mean any specific algorithm, because I know they give much thought to optimisation strategies when it comes to implementation and design of machine learning architectures. What I mean by software is the full stack considered as a whole, and by utilisation I mean the way services advertise and make use of ill-suited architectures.

        The full stack consists of general purpose computing devices with an unreasonable number of layers of abstraction between the hardware and the languages used in implementations of machine learning. A lot of this stuff is written in Python! While algorithmic complexity is naturally a major factor, how it is compiled and executed matters a lot, too.
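        The interpreter overhead being described can be seen even in the standard library, where the same computation is much slower in a pure-Python loop than in a C-implemented built-in (a rough sketch; real ML frameworks similarly dispatch Python calls to compiled kernels):

        ```python
        import timeit

        data = list(range(10_000))

        def python_loop_sum(xs):
            # each iteration goes through the interpreter's bytecode loop
            total = 0
            for x in xs:
                total += x
            return total

        t_loop = timeit.timeit(lambda: python_loop_sum(data), number=200)
        t_builtin = timeit.timeit(lambda: sum(data), number=200)  # C-implemented

        print(f"interpreted loop: {t_loop:.4f}s, built-in (C): {t_builtin:.4f}s")
        ```

        Same result, very different energy cost per answer; that gap is what extra abstraction layers buy you.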

        Once AI implementations stabilise, the theoretically most energy-efficient way to run them would be on custom hardware made to run only that code, with the code written at the lowest possible level of abstraction. The closer we get to the metal (or the closer the metal gets to our program), the more efficient we can make it. I don’t think we take bespoke hardware seriously enough; we’re stuck in a mindset of everything being general-purpose.

        As for utilisation: LLMs are not fit for, or even capable of, dealing with logical problems or anything involving reasoning based on knowledge; they can’t even reliably regurgitate knowledge. Yet, as far as I can tell, this constitutes a significant portion of their current use.

        If the usage of LLMs was reserved for solving linguistic problems, then we wouldn’t be wasting so much energy generating text and expecting it to contain wisdom. A language model should serve as a surface layer – an interface – on top of bespoke tools, including other domain-specific types of models. I know we’re seeing this idea being iterated on, but I don’t see this being pushed nearly enough.[1]
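        A minimal sketch of that "language model as interface" idea: a router hands each request to a bespoke domain tool instead of asking a text generator to improvise an answer. (Everything here is hypothetical; a real system would use the model itself to classify intent rather than keyword matching.)

        ```python
        import math

        def calculator_tool(expression: str) -> str:
            # bespoke tool: exact arithmetic, no text generation involved
            # (eval is sandboxed here only enough for a sketch)
            return str(eval(expression, {"__builtins__": {}}, {"sqrt": math.sqrt}))

        def knowledge_tool(query: str) -> str:
            # stand-in for a curated, domain-specific knowledge base
            facts = {"boiling point of water": "100 °C at 1 atm"}
            return facts.get(query, "not found")

        def route(request: str) -> str:
            # toy intent classification; the LLM would sit here as the
            # surface layer, parsing natural language into tool calls
            if any(ch.isdigit() for ch in request):
                return calculator_tool(request)
            return knowledge_tool(request)

        print(route("2 + 2"))                   # "4"
        print(route("boiling point of water"))  # "100 °C at 1 atm"
        ```

        The point is that the expensive generative model only translates between the user and the tools; the answers come from components built to be correct.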

        When it comes to image generation models, I think it’s wrong to focus on generating derivative art/remixes of existing works instead of on tools to help artists express themselves. All these image generation sites we have now consume so much power just so that artistically wanting people can generate 20 versions (give or take an order of magnitude) of the same generic thing. I would like to see AI technology made specifically for integration into professional workflows and tools, enabling creative people to enhance and iterate on their work through specific instructions.[2] The AI we have now are made for people who can’t tell (or don’t care about) the difference between remixing and creating and just want to tell the computer to make something nice so they can use it to sell their products.

        The end result in all these cases is that fewer people can live off of being creative and/or knowledgeable while energy consumption spikes as computers generate shitty substitutes. After all, capitalism is all about efficient allocation of resources. Just so happens that quality (of life; art; anything) is inefficient and exploiting the planet is cheap.


        1. For example, why does OpenAI gate external tool integration behind a payment plan while offering simple text generation for free? That just encourages people to rely on text generation for all kinds of tasks it’s not suitable for. Other examples include companies offering AI “assistants” or even AI “teachers”(!), all of which are incapable of even remembering the topic being discussed 2 minutes into a conversation. ↩︎

        2. I get incredibly frustrated when I try to use image generation tools because I go into it with a vision, but since the models are incapable of creating anything new based on actual concepts I only ever end up with something incredibly artistically compromised and derivative. I can generate hundreds of images based on various contortions of the same prompt, reference image, masking, etc and still not get what I want. THAT is inefficient use of resources, and it’s all because the tools are just not made to help me do art. ↩︎

      • fidodo@lemmy.world
        10 months ago

        It’s emulating a ridiculously simplified brain. Real brains have orders of magnitude more neurons, but beyond that, they evaluate those neurons completely asynchronously, have far more complicated connective structure, and use multiple methods of communication between neurons, some of which are incredibly subtle and hard to detect.

        To really take AI to the next level I think you’d need a completely bespoke processor that can replicate those attributes in hardware, but it would be a very expensive gamble because you’d have no idea if it would work until you built it.

    • variants@possumpat.io
      10 months ago

      This dude Al is the new Florida Man, wonder if it’s the same Al from Married with Children

    • LemmyIsFantastic@lemmy.world
      10 months ago

      Some of the smartest people on the planet are working to make this profitable. It’s fucking hard.

      You are dense and haven’t taken even a look at simple shit like Hugging Face. Power consumption is about the biggest topic you’ll find with anyone in the know.

      • brbposting@sh.itjust.works
        10 months ago

        Some of the smartest people on the planet are working to make this profitable. It’s fucking hard.

        [Take a look at] Hugging Face. Power consumption is about the biggest topic you find with anyone in the know.

        ^ fair comment