• sunzu2@thebrainbin.org · 1 month ago

    Well they do need to make a profit per unit to justify selling them lol

    Gaining market share while taking an L is not really a viable strategy for hardware markets; maybe for consoles, or for selling data-mining “smart” devices.

    • maniii@lemmy.world · 1 month ago

      I wonder how Microsoft feels about Xbox and other things. Gaining market share by throwing money at it is an absolutely viable strategy for companies that can afford to do it.

      AMD doesn't need to price-cut to compete here. If AMD could make OpenCL and similar projects more successful than CUDA and Nvidia (which requires time, money, and investment in people and talent), then AMD might have a viable alternative to compete with in 2 or 3 years.

        • maniii@lemmy.world · 29 days ago

          At the expense of employee benefits, shareholder value/payouts, the public interest, and competition. Everyone and everything lost money, except the M$ board and C-suite.

    • orcrist@lemm.ee · 29 days ago

      Tell that to Amazon. If you can build a monopoly, you win in the medium run.

  • schizo@forum.uncomfortable.business · 1 month ago

    It’s not just price, at least for me.

    It’s also the fact that FSR is worse than DLSS, that AMF is worse than NVENC, that their ray-tracing performance is not even close, that AFMF isn’t as good as DLSS frame generation, that the drivers aren’t as stable, and so on and so on and so on…

    The whole product is just… not strictly equivalent, and the price difference isn’t the reason that I don’t really look too hard at AMD cards.

    If AMD gets to equivalency with FSR, AMF, and AFMF that’d make their cards FAR more compelling than a $100 lower price tag would.

    • Lucy :3@feddit.org · 1 month ago

      FSR is still pretty good, especially considering it works on almost any GPU. DLSS does not, not by a long shot.

      Also, Linux.

    • Dudewitbow@lemmy.zip · 1 month ago

      The problem on the Nvidia front is that VRAM capacities are hitting the midrange GPUs to the point that they may actually lose said features. In particular, with the 4070 Ti and slower, VRAM usage gets to the point where the user may not be able to use all the features at once and has to enable them selectively, because each feature has its own VRAM cost attached to it.

      Outside of the 4060 Ti 16GB, you have to spend $800 on the 4070 Ti Super to get 16GB of VRAM.
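
      The per-feature VRAM cost can be pictured as a toy budget. Every number below is a hypothetical placeholder, not a measurement:

```python
# Toy sketch: each feature carries its own VRAM cost, so on a smaller
# card the budget runs out before all features fit at once.
# All numbers here are made-up placeholders, not measured values.

def fits_in_vram(card_gb, game_baseline_gb, feature_costs_gb):
    """Greedily enable features that fit after the game's baseline usage."""
    remaining = card_gb - game_baseline_gb
    enabled = []
    for name, cost in feature_costs_gb.items():
        if cost <= remaining:
            enabled.append(name)
            remaining -= cost
    return enabled

# Hypothetical per-feature costs in GB
features = {"frame_generation": 1.5, "ray_tracing": 2.0, "high_res_textures": 3.0}

print(fits_in_vram(12, 8.0, features))  # on 12GB, not everything fits
print(fits_in_vram(16, 8.0, features))  # 16GB leaves headroom for all three
```
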

    • grandma@sh.itjust.works · 1 month ago

      I bought an AMD card even though NVIDIAs upscaling is much better. With the added raw performance for the same price, I’m not going to need to rely as much on upscaling. It starts making less and less sense the higher your budget goes though.

    • vithigar@lemmy.ca · 1 month ago

      But consider that if you get a more powerful card at the same price, you don’t need as much upscaling or frame generation. FSR being slightly worse is irrelevant if you can run the game at native resolution.
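
      For scale: both FSR and DLSS “quality” modes render internally at roughly 1/1.5 of the output resolution per axis (the widely documented quality-mode scale factor), so hitting the target frame rate natively means skipping that cut entirely:

```python
# Internal render resolution for common upscaler scale factors.
# 1.5x per axis is the commonly documented "quality" mode factor for
# both FSR and DLSS; "performance" mode is 2.0x.

def internal_resolution(out_w, out_h, scale):
    # each axis is divided by the per-axis scale factor
    return int(out_w / scale), int(out_h / scale)

print(internal_resolution(3840, 2160, 1))    # native 4K
print(internal_resolution(3840, 2160, 1.5))  # "quality" mode
print(internal_resolution(3840, 2160, 2))    # "performance" mode
```
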

      AMF being worse than NVENC is certainly true, but in my opinion that barely matters. If you care about quality you should use CPU encoding no matter which one you have, and if you just want to capture video locally you can crank up the bitrate where the differences become negligible.

      As for ray tracing there’s no counter argument there. Nvidia is better, AMD doesn’t match them. If you want to do anything with heavy ray tracing AMD is basically a non-starter. Though I do think it’s adequate for games with light ray tracing.

        • vithigar@lemmy.ca · 29 days ago

          Their AVC encoding is pretty bad, yes. Simple solution there if you have an AMD card: Don’t use it.

          If you’re streaming use x264, it’ll look better than either AMD or Nvidia hardware encoding at streaming bitrates. If you’re recording locally use HEVC or AV1 which AMD does much better with than AVC.
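
          That advice maps fairly directly onto encoder selection in a tool like ffmpeg. A minimal sketch (the bitrates are illustrative, and `libsvtav1`/`hevc_amf` assume an ffmpeg build with SVT-AV1 and AMF support):

```python
# Sketch: choose ffmpeg video-encoder arguments by use case.
# Streaming -> software x264 (best quality at constrained bitrates);
# local recording -> HEVC/AV1, where AMD's hardware encoders hold up
# far better than their AVC one. Bitrate/CRF values are illustrative.

def encoder_args(use_case, hardware=None):
    if use_case == "streaming":
        # CPU encode; the bitrate cap comes from the streaming service
        return ["-c:v", "libx264", "-preset", "veryfast", "-b:v", "6000k"]
    if use_case == "recording":
        if hardware == "amd":
            # AMD's hardware HEVC encoder (via AMF)
            return ["-c:v", "hevc_amf", "-b:v", "40000k"]
        # software AV1 as a vendor-neutral fallback
        return ["-c:v", "libsvtav1", "-crf", "30"]
    raise ValueError(use_case)

print(" ".join(encoder_args("streaming")))
print(" ".join(encoder_args("recording", hardware="amd")))
```
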

      • schizo@forum.uncomfortable.business · 1 month ago

        But consider that if you get a more powerful card at the same price, you don’t need as much upscaling or frame generation. FSR being slightly worse is irrelevant if you can run the game at native resolution.

        I’m on a 3080, and if I’m getting 40fps in a title at settings I’m happy with (which is ending up more common than I’d like), not even a 7900 XTX is going to give me the 90fps I’d much prefer. And, lest you think I’m being vastly unfair, I’ll also say there are no Nvidia cards that will do so either. And yes, this is entirely dependent on your resolution, but the ultrawide I’m quite fond of is essentially the same pixel count as 4k144, which is a lot of pixels to attempt to draw at once.
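
        The pixel-count comparison roughly checks out, assuming a 5120×1440 super-ultrawide (an assumption; the comment doesn’t name the exact panel):

```python
# Pixels per frame: 5120x1440 super-ultrawide vs. 3840x2160 (4K).
ultrawide = 5120 * 1440  # 7,372,800 pixels
uhd_4k = 3840 * 2160     # 8,294,400 pixels

# The ultrawide pushes roughly 89% of 4K's pixel count per frame.
print(ultrawide, uhd_4k, round(ultrawide / uhd_4k, 3))
```
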

        The only way to get there (at least until the 5090 shows up, I guess?) is to do some sort of upscaling. And, frankly, FSR is, subjectively, not ‘slightly worse’ but rather such an artifact-y mess (at least in the games I’m playing) that I’d rather have 40fps than deal with how ugly it makes everything.

        XeSS is a lot better, and works fine on AMD cards, but until FSR gets a lot cleaner, or everything starts supporting XeSS, DLSS is still the best game in town.

        As for NVENC, you’re absolutely right, unless you’re using it for streaming and have a hard cap on upper bitrates because you’re not Twitch royalty. I’ll admit that’s an edge case most people don’t have, or even need to consider, but if you do need low-bitrate streaming and don’t want to deal with x264 doing it in software, well, it’s NVENC or sub-par quality from AMF. I’m honestly surprised they haven’t invested time in fixing the one real use case that hardware encoding still has (real-time encoding at low bitrates), but I suppose someone somewhere has an Excel sheet showing that the market that cares about it is too small to be worth spending time on.

        • vithigar@lemmy.ca · 1 month ago

          If there are no nvidia cards that can run your game at 90fps, not even the 4090, then you’re using ray tracing I assume? In which case I’ve already agreed. The gap is too large, and a product tier offset in AMD pricing isn’t going to make up for that gap. My comments about FSR vs DLSS in this scenario assume a superior performance baseline for AMD, where you’re comparing no FSR to DLSS “quality”, or maybe FSR “quality” to DLSS “performance”. AMD would need to tank their prices to an absurd degree to close that gap when ray tracing is involved.

          As for why AMD haven’t put more time into their encoder, I have a suspicion they were banking on people moving away from AVC to HEVC or, more recently, AV1. Their HEVC and AV1 encoders are much closer in quality to nvidia than their AVC encoder, and clearly have more attention paid to them. Hell, even as far back as Polaris cards AMD’s HEVC encoder was even faster than their AVC, while also looking better.

    • hperrin@lemmy.world · 28 days ago

      Other than ray tracing, those are all gimmicky. You should buy the card that can run the games you want to play at the resolution you want to play them at. During the RTX 3000 vs RX 6000 generation, AMD had substantially better price to performance for everything except ray tracing. Now, that’s changed, and AMD is a much less appealing deal.

  • festus@lemmy.ca · 1 month ago

    The big reason I switched back to Nvidia was because I wanted to play with some local AI models, and doing that with AMD cards was quite difficult at the time (I think it’s improved a little, but still isn’t straightforward).

    • BaroqueInMind@lemmy.one · 1 month ago

      I’ve tried to run a few minimal 8B and even 1.3B models on AMD cards and they are such trash that my CPU can run them faster. Why did they code xformers in Python to be compatible only with Nvidia drivers?
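
      For reference on why model size matters here: a rough weight-memory estimate (ignoring KV cache and activations) is just parameter count × bytes per parameter:

```python
# Approximate memory needed for model weights alone,
# ignoring KV cache and activation memory.

def weights_gb(n_params, bytes_per_param=2):  # 2 bytes = fp16/bf16
    return n_params * bytes_per_param / 1024**3

print(round(weights_gb(8e9), 1))    # 8B model in fp16: ~14.9 GB
print(round(weights_gb(1.3e9), 1))  # 1.3B model in fp16: ~2.4 GB
```

An 8B model in fp16 already exceeds the VRAM of most midrange cards, which is why quantized weights (or CPU offload) come into play.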

  • MyOpinion@lemm.ee · 1 month ago

    They are almost always cheaper, but going that low is a great way to lose money.