2023 was the year that GPUs stood still: A new GPU generation did very little to change the speed you get for your money.

  • 𝒍𝒆𝒎𝒂𝒏𝒏@lemmy.dbzer0.com · 1 year ago

    Hands up if you/someone you know purchased a Steam Deck or other computer handheld, instead of upgrading their GPU 🙋‍♂️

    To be honest I stopped following PC hardware altogether because things were so stagnant outside of Intel’s Alder Lake and the new x86 P/E cores. GPUs that would give me a noticeable performance uplift from my 1060 aren’t really at appealing prices outside the US either, IMO.

    • givesomefucks@lemmy.world · 1 year ago

      It’s diminishing returns.

      We need a giant leap forward to show a noticeable effect now.

      Like, if a car’s top speed was 10 mph, a 5 mph increase is fucking huge.

      But getting a supercar to top out at 255 instead of 250 just isn’t a huge deal. And you wouldn’t notice unless you were testing it.

      So even if they keep increasing power at a steady rate, the end user is going to notice it less and less every time.

      • Xiaz@lemmy.world · 1 year ago

        We had hardware making massive leaps for years. Problem is, devs got used to hardware having enough grunt to overcome a lack of optimization. Now we’ve got shit coming out that barely holds 60+ on 4080s and requires FSR or DLSS as a band-aid to get back to playable framerates.

        If you’ve got a 30-series card, or a 7000-series from AMD, you don’t need to look for a more performant card; you need devs to put in time for polish and optimization before launch, and not 6 months down the line IF the game is a commercial success.

        • anlumo@lemmy.world · 1 year ago

          Hell, Cyberpunk 2077 dropped 10-20fps with the last patch on my 4090, and the devs don’t care enough to fix it.

          Cities Skylines 2 aims for only 30fps, and it can’t even hit that on my pretty good gaming PC.

          • veng@lemmy.world · 1 year ago

            A fix that worked for me on Cyberpunk dropping in performance after that patch - turn everything to low, restart the game, then change settings back to what they were.

            • anlumo@lemmy.world · 1 year ago

              Yeah, with that trick it went from 50fps to 90fps on everything turned to max. Thank you so much!

          • Encrypt-Keeper@lemmy.world · 1 year ago

            Cities Skylines 2 is really bad because, given how poorly it runs on your 4090, you’d expect a meager 1060 not to run it at all, but on the contrary I’ll probably get about the same performance as you. It’s like the game just… isn’t capable of taking advantage of your better card.

            • anlumo@lemmy.world · 1 year ago

              One thing that’s very apparent is that with more traffic the simulation slows down while the framerate doesn’t (so all cars go in slow motion, even though I’m at 3x speed). This means it’s severely CPU-limited.

              I don’t know how multithreaded their simulation is, I have a 5950X with 32 hardware threads. Maybe an upgrade to the new generation of Ryzen CPUs that are going to come out around February could help.
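
              For illustration, here’s a minimal sketch of a fixed-timestep game loop (hypothetical code, not anything from CS2): the simulation advances in fixed ticks while rendering runs as fast as it can, so when the CPU can’t fit all the ticks into a frame, game time falls behind wall-clock time and everything moves in slow motion even though the framerate looks fine.

              ```python
              import time

              TICK_RATE = 60           # simulation ticks per second the game wants to run
              DT = 1.0 / TICK_RATE     # fixed timestep per tick
              MAX_TICKS_PER_FRAME = 4  # cap so rendering never stalls completely

              def simulate(world, dt):
                  # stand-in for the CPU-heavy, largely serial agent/traffic update
                  time.sleep(0.02)     # pretend a tick costs 20 ms, i.e. more than DT

              def render(world, alpha):
                  pass                 # GPU work; stays cheap when the game is CPU-bound

              def game_loop(world):
                  accumulator = 0.0
                  previous = time.perf_counter()
                  while True:
                      now = time.perf_counter()
                      accumulator += now - previous
                      previous = now

                      ticks = 0
                      while accumulator >= DT and ticks < MAX_TICKS_PER_FRAME:
                          simulate(world, DT)   # advance game time by exactly DT
                          accumulator -= DT
                          ticks += 1

                      if ticks == MAX_TICKS_PER_FRAME:
                          # Couldn't keep up: drop the leftover time. Game time now
                          # advances slower than real time (slow motion), while the
                          # renderer keeps drawing at full framerate.
                          accumulator = 0.0

                      render(world, accumulator / DT)
              ```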

              • Encrypt-Keeper@lemmy.world · 1 year ago

                Generally speaking, the simulation running behind the scenes in simulation games is single-threaded. You’re always better off with a higher clock speed; those extra threads just won’t be utilized.

                • anlumo@lemmy.world · 1 year ago

                  Well, that will get harder and harder to achieve, since CPUs are getting more cores but not getting much faster these days.
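
                  A quick Amdahl’s-law calculation (with made-up numbers) shows why more cores help so little when most of a simulation tick is serial:

                  ```python
                  def amdahl_speedup(parallel_fraction: float, cores: int) -> float:
                      """Ideal speedup when only part of the work can spread across cores."""
                      serial = 1.0 - parallel_fraction
                      return 1.0 / (serial + parallel_fraction / cores)

                  # Hypothetical tick where 30% of the work parallelizes cleanly:
                  for cores in (1, 4, 16, 32):
                      print(f"{cores:>2} cores -> {amdahl_speedup(0.30, cores):.2f}x")
                  # Even 32 threads only reach ~1.41x, whereas a 20% higher clock
                  # would give 1.2x on everything, serial parts included.
                  ```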

      • ugjka@lemmy.world · 1 year ago

        The money is in AI chips for datacenters; I think regular consumers will mostly just be getting dinner’s leftovers.

        • CoreOffset@lemm.ee · 1 year ago

          I’m not entirely sure about AMD but NVIDIA certainly seems keen on the AI market to the point that they don’t really care about the consumer gaming market anymore.

    • Uninvited Guest@lemmy.ca · 1 year ago

      From 2020 on I planned to build a new gaming PC. Bought an ITX case and followed hardware releases closely… and then got disillusioned with it all.

      Picked up a Steam Deck in August of 2022 and couldn’t be happier with it. The ITX case is collecting dust.

      • theangryseal@lemmy.world · 1 year ago

        I game exclusively on my Steam deck these days.

        I absolutely love it. I dock it and use the desktop as my standard pc too. It does everything I need it to do.

        • bnjmn@lemm.ee · 1 year ago

          Same here! I was worried I wouldn’t use it (I haven’t been gaming on PC much) but I actually game on it much more than on PC

    • CoreOffset@lemm.ee · 1 year ago

      To be honest I stopped following PC hardware altogether because things were so stagnant

      That’s exactly what happened to me as well.

      It’s not exciting at all to pay attention to mediocre launches of expensive products. The GPU in my gaming PC is several generations old at this point, but I don’t really care. There are still plenty of good games, including ones I have yet to purchase, that will run fine on it, so I’m just going to hold tight. I’m not going to give my money to terribly optimized games or games that require high-end hardware.

      The more expensive PC gaming becomes, the less high-end hardware really matters. I think developers and publishers know they need to target the average consumer, because they need to sell these games in volume. If the average gamer is playing on older and/or lower-end hardware, then they need to serve that market. There aren’t enough 4090 buyers to sell the volume they need to make money. Hell, at these prices I’m not sure there are even enough 4070 or even 4060 level buyers to do that. Tons of people lost interest and aren’t buying into this, even if you still see posts online of people purchasing new GPUs.

      I waited out the crypto market and I don’t have problems waiting longer.

    • daq@lemmy.sdf.org · 1 year ago

      I’m surprised so many people are cross-shopping, tbh. I briefly considered a Steam Deck, but the specs are barely enough to play at 1080p, so it’s completely useless to me when docked, and a purely portable device with a tiny screen and gamepad carries very little value for me personally.

      I ended up getting an eGPU enclosure for my laptop and grabbing a 1080 Ti from a friend who didn’t need it anymore. I’m able to play D4 at 4K on medium settings.

      Even if I had to buy a GPU like I was originally planning, ~$800 total to play in 4K on a 43" screen with a mouse and keyboard is a completely different experience from anything the Xbox or Steam Deck offer.

  • just_change_it@lemmy.world · 1 year ago

    Given technological progress and efficiency improvements, I would argue that 2023 is the year the GPU ran backwards. We’ve been in a rut since 2020… and arguably since the 2018 crypto explosion.

    • Vash63@lemmy.world · 1 year ago

      Nah, 2022 was when it was running backwards far more. 2023 was a slight recovery, but still worse than 2021.

        • Throw a Foxtrot@lemmynsfw.com · 1 year ago

          No, it’s a datum - about how people feel.

          Performance numbers are easy to find. The prices have not been great and the 4060 is held back by its reduced memory bandwidth, but it’s a performance increase nevertheless. The flagship product, the one that shows what is currently possible in terms of GPU power, did show remarkable improvement in top performance.

          I’m more salty about AMD not supporting AI workloads on their consumer GPUs. Yes, ROCm exists and it will work on quite a few cards, but officially it’s not supported. This is a major reason why Nvidia is still the only serious player in town.
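
          For anyone curious about their own card: a ROCm build of PyTorch exposes AMD GPUs through the usual torch.cuda API, so a quick probe looks roughly like the sketch below (whether a given consumer card actually works can depend on the ROCm version and, unofficially, on the HSA_OVERRIDE_GFX_VERSION workaround).

          ```python
          import torch

          print("PyTorch:", torch.__version__)
          print("HIP/ROCm build:", torch.version.hip)       # None on CUDA-only builds
          print("GPU visible:", torch.cuda.is_available())  # ROCm devices appear via the cuda API

          if torch.cuda.is_available():
              print("Device:", torch.cuda.get_device_name(0))
              x = torch.randn(1024, 1024, device="cuda")
              print("Matmul OK:", (x @ x).shape)            # smoke test that a kernel actually runs
          ```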

          • fruitycoder@sh.itjust.works · 1 year ago

            Yeah, AMD just seems like it doesn’t want to market AI on consumer hardware for devs. They have a Ryzen chip line with built-in dedicated “NPU”s now, but honestly the disconnect between AI for the GPUs and the focus on Windows, even for development, just makes it feel clunky.

            • Throw a Foxtrot@lemmynsfw.com · 1 year ago

              OK, I thought it was common knowledge, but maybe I should specify.

              Datum is the singular form of data. Data is a collection of many single data points. If you have ten thousand anecdotes, they do in fact become statistically significant.

  • Jin@lemmy.world · 1 year ago

    I wanted to upgrade my 1060 for the longest time to something like the 3080. But due to demand and price hikes, I waited… the 40 series got released and the prices stayed high.

    So I just gave up, I got a steam deck and PS5 instead.

    • nova_ad_vitum@lemmy.ca · 1 year ago

      A lot of people did this. The GPU market for gaming might have actually shrunk. You would think Nvidia would panic, but due to AI chip demand their stock is at an ATH, and no company changes course or reevaluates what they’re doing when shareholders are lining up to suck their dicks, so… no end in sight. Meanwhile AMD doesn’t seem to want to even try to make a play for market share.

      • Jin@lemmy.world · 1 year ago

        Technically AMD does have more market share when you think about all the devices that have AMD in them, like the PlayStation, Xbox, Steam Deck and other handhelds.

        But yeah, Nvidia doesn’t care about gaming anymore. If I had to pick a GPU today, I would pick AMD, because 6-8 GB of VRAM on Nvidia cards isn’t enough and AMD is better on Linux.

        • veng@lemmy.world · 1 year ago

          If you want to do any game streaming though (e.g. with Sunshine/Moonlight), Nvidia is still miles ahead.

          • fruitycoder@sh.itjust.works · 1 year ago

            What are some issues AMD is having there? The Sunshine pages show both AMD and Intel support now, so I assumed they were good to go.

            • veng@lemmy.world · 1 year ago

              The issue is down to encoding performance; Nvidia performs a LOT better with comparable GPUs.

              With that said, H.265 is okay from what I’ve seen, but for any devices you’re streaming to that only take H.264, even a 1060 will stream better than a 6750 XT, etc.

          • daq@lemmy.sdf.org · 1 year ago

            I recently experimented with both of those on AWS and they are completely unusable atm. At least not over WAN and with the GPU mounted to a device you don’t have complete control over.

            Does streaming work any better over LAN?

            • Encrypt-Keeper@lemmy.world · 1 year ago

              Nvidia GameStream is incredibly robust over the internet. Nothing else even comes close. The latency is incredibly low and the video quality is awesome, with almost no compression artifacts like competitors often suffer from. A buddy and I used to stream our home PCs to the office when it was slow, and even with both of us streaming at the same time the performance was great. If you weren’t playing a twitch shooter you could honestly hardly even tell it was streamed. This was over a meager 100 Mbps connection too.

              The next closest alternative is Parsec, which can manage very low latency, at the expense of significant compression artifacts if your connection isn’t rock solid or your CPU isn’t the fastest.

              Steam Link streaming is a very distant 3rd, and I actually found that critical components of many games simply did not work. For example, Unity games where you adjust your camera by holding a button while dragging the mouse just would not work.

  • trag468@lemmy.world · 1 year ago

    Still rocking a 1080. I don’t see a big enough reason to upgrade yet. I mostly play PC games on my Steam Deck anyway. I thought Starfield was going to give me a reason. Cyberpunk before that. I’m finally playing Cyberpunk, but the advanced haptics on PS5 sold me on going that route over my PC.

    • Kit@lemmy.blahaj.zone · 1 year ago

      I just “upgraded” from a GTX 1080 to an RTX 4060 Ti 16GB, but only because I was building a PC for my boyfriend and gave him the 1080. I’m really not seeing a noticeable difference in frame rate at 1440p.

    • ATDA@lemmy.world · 1 year ago

      Yeah, I keep waiting for a good deal to retire my 1080 Ti.

      Guess I could go for a 3060 or something, but the 40 series will probably leave my old CPU behind.

    • Yokozuna@lemmy.world · 1 year ago

      1080 gang rise up.

      But seriously, my 1080 does fine for most things, and I have a 2K 144Hz monitor. It’s JUST starting to show its age, as I can’t blast everything on high/ultra anymore and have to turn down the biggest fps-guzzling settings.

    • barsoap@lemm.ee · 1 year ago

      CP77, at least before the upgrade (haven’t checked since then), ran perfectly… acceptable on my 4G 5500 XT. Back when I bought it (just before the price hikes) it was the “RX 590 performance but fewer watts and RDNA” option; the RX 590 hit the market in 2017. And I’m quite sure that people still rocking it are, well, still rocking it. Developers might be using newer and fancier features, but I expect they’ll continue to support that class of cards for quite a while; you don’t want to lose out on millions of sales because millions don’t want to pay for overpriced GPUs. All the while you can get perfectly fine graphics with those cards; if you look back, pretty much all 201x titles hold up well nowadays.

      Due to ML workloads I’ve been eyeing the Arc (cheapest way to get 16G, and it’s got some oomph), but honestly so far I couldn’t get myself to buy an Intel product that isn’t a NIC; it would break a life-long streak. A system RAM upgrade is definitely in the pipeline, though, DDR4 has gotten quite cheap. It’s gotten to the point where I’d recommend 64G simply because 32G sticks are the cheapest per GB (and you probably have two memory controllers).

    • kaitco@lemmy.world · 1 year ago

      How was that change? I’m thinking of doing the same, but it requires a power supply update too, so I’m on the fence.

      • gravitas_deficiency@sh.itjust.works · 1 year ago

        Fwiw, I’ve been running a 3080 FE for nearly 3 years now and it’s still more than enough to run basically anything I care to on max settings (or close to it) at 2.5K. Got it through Best Buy, so I paid list price (but it was a massive pain in the ass to actually snag one through their queueing system). It was pricey, but it was a HUGE perf uplift, since I was coming from a GTX 1070 as well.

  • HeyJoe@lemmy.world · 1 year ago

    As someone who upgraded from a 2016 GPU to a 2023 one, I was completely fine with this. Prices finally came down and I got the best card 2023 offered me, which may not have been impressive for this generation but was incredible compared to what I came from.

    • DacoTaco@lemmy.world · 1 year ago

      And how much did you pay for the 2016 card, what range was it in, and what is the new card’s cost and range?

      Overall, GPUs have been a major ripoff, despite these upgrades giving good performance boosts.

      • HeyJoe@lemmy.world · 1 year ago

        I believe about $300 for an AMD RX 480 (great card and still going strong). This time I had a bit more money and wanted something more powerful. I went with the AMD 7800 XT Nitro ($550), which I got on release day. Sure, it’s not top of the line, but it has played pretty much everything I throw at it with all settings set to max while still maintaining 60fps or above. I have an ultrawide monitor with a max resolution of 5120x1440, which is what most games play at, and everything still runs fine. It’s almost crazy to me that this card would be considered mid-range.

        • highenergyphysics@lemmy.world · 1 year ago

          That’s about equal to a 3070 Ti. What are you playing at max settings and 60fps at 32:9 1440p on that? Because either you are straight up lying or being intentionally misleading by selecting a very narrow range of games.

          • HeyJoe@lemmy.world · 1 year ago

            I can assure you I am not lying. I do use FSR or XeSS, which helps a ton with performance, along with FreeSync enabled on my monitor. Cyberpunk 2077 is probably one of the most taxing games I play; I use XeSS with that one and everything else set to max (without ray tracing, of course), and I get just under 60fps in most areas and over 60fps in buildings. I’ll attach a pic of the in-game test it can perform.

            Cyberpunk 2077 results

    • CalcProgrammer1@lemmy.ml · 1 year ago

      I’ve been very happy with my Arc A770, it works great on Linux and performs well for what I paid for it.

      • barsoap@lemm.ee · 1 year ago

        Have you tried ML workloads? Put differently: how is compatibility with stuff that expects CUDA/ROCm? Because the A770 is certainly the absolute cheapest way to get 16G nowadays.

        • CalcProgrammer1@lemmy.ml · 1 year ago

          No, I don’t use any ML stuff or really anything that uses GPU compute at all. I just use it for gaming and other 3D applications.

    • Kit@lemmy.blahaj.zone · 1 year ago

      I’m so glad that Intel has stepped into the GPU space, even if their cards are weaker. More competition will hopefully light a fire under NVidia to get their shit together.

  • aluminium@lemmy.world · 1 year ago

    I finally upgraded my GTX 970 to a used RTX 3080 for 300€. The difference, at least for me, for the same 300€ was insane.

  • DrPop@lemmy.ml · 1 year ago

    I just don’t see the point in upgrading every new release anyway, or even buying the most expensive one. I’ve had my Gigabyte RX 570 for several years and I can play Baldur’s Gate 3 at full settings with no issues. Maybe I haven’t tasted 120 fps, but I’m just happy I can play modern games. When it comes time to get a new graphics card, which may be soon since I am planning to build my wife’s PC, maybe then I’ll see what’s going on with the higher-end ones. Maybe I’m just a broke ass though.

    • cyberpunk007@lemmy.world · 1 year ago

      Yeah, the problem I landed in was not anticipating how hard it would be to push my new monitor: ultrawide 2.5K resolution at 144Hz. I can’t do Cyberpunk at full res above 60fps, and that’s with DLSS enabled and not all settings at max.

      2070S
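
      For a sense of scale, here’s a rough pixel-throughput comparison (assuming the “2.5K ultrawide” is 3440x1440):

      ```python
      # Back-of-the-envelope pixel rates; 3440x1440 assumed for the "2.5K ultrawide".
      modes = {
          "1920x1080 @  60 Hz": (1920, 1080, 60),
          "3440x1440 @  60 Hz": (3440, 1440, 60),
          "3440x1440 @ 144 Hz": (3440, 1440, 144),
      }
      base = 1920 * 1080 * 60
      for name, (w, h, hz) in modes.items():
          rate = w * h * hz
          print(f"{name}: {rate / 1e6:6.0f} Mpx/s ({rate / base:.1f}x 1080p60)")
      # 3440x1440 @ 144 Hz pushes ~5.7x as many pixels per second as 1080p60,
      # which is why DLSS/FSR end up doing so much of the work.
      ```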

  • Paddzr@lemmy.world · 1 year ago

    I had to buy a 3070 Ti at a scalped price. Ended up paying £700 for it. I hate myself for it, but the prices didn’t shift for months after, and my GTX 1080 kicked the bucket. No way in hell am I buying anything this gen. My wife’s 1080 is going for now; maybe we’ll get a 5080 if it’s not a rip-off.

        • DacoTaco@lemmy.world · 1 year ago

          That’s only Nvidia though. AMD seems to still be trying to compete with Nvidia one way or another.

          • filister@lemmy.world · 1 year ago

            I wouldn’t say so; they also seem to have abandoned the gaming segment and nowadays are more or less playing ball with NVIDIA while trying to improve their AI stack so that they can get a bigger chunk of the data centre business.

            • TheGrandNagus@lemmy.world · 1 year ago

              I don’t think that’s true at all. Let’s go back a while.

              We had Polaris, a mid-range 2016 architecture that was sold for years as a mid-range and then low-end card.

              They also had the Vega cards, which were compute-focused and not particularly great at gaming.

              Following that, they had the 5700 series. Decent gaming cards.

              After that, the 6000 series. Right up there with Nvidia, and taking into consideration the die size, performance, and comparatively generous VRAM, you could argue they were the better gaming cards, despite losing in RT.

              The 7000 series is pretty much like the 6000, except slightly further behind the 4090, albeit for half the real-world price due to AI demand pushing the already crazy 4090 prices even higher.

              Idk, to me it seems AMD is more competitive in gaming now than they have been for a long time.

            • Buffalox@lemmy.world · 1 year ago

              Absolutely, AMD is very focused on datacenter/AI now. They just presented their next-gen AI system, the MI300X, which made AMD stock go up significantly, and on the CPU side their server CPU Epyc is where the big money is at.
              That said, AMD is still into gaming hardware because they work with both Sony and Microsoft on making new consoles; what we get on the desktop from AMD is probably mostly derived from that on the GPU side.

                • Buffalox@lemmy.world · 1 year ago

                  Yes, that was a very impressive win. Intel/Nvidia has usually been the preferred solution when power efficiency is important.
                  But now AMD is competing well in that segment too.

  • Buffalox@lemmy.world · 1 year ago

    So how about the 2½ years from 2016 to 2018 between the Nvidia GTX 1080 Ti and the RTX 2080?
    I think the headline should say A Year, not THE year.

  • AlpacaChariot@lemmy.world · 1 year ago

    What’s everyone’s recommendation for a cheap AMD GPU to use with Linux? I was recently looking at a Radeon RX 580; I know there are much better cards out there, but the prices are about double (£350-400 instead of £180). I’d mostly be using it to play games like the remastered Rome: Total War.

    • bazsy@lemmy.world · 1 year ago

      There are some used options, e.g. 5700 XTs are really cheap because many of them were mining cards. For new cards there aren’t many options; the RX 6600 has relatively good value, but it’s only worth it if efficiency or features like hardware video codecs are important to you.

      • AlpacaChariot@lemmy.world · 1 year ago

        Is there any issue with buying a card that was previously used for mining?

        When you say RX 6600, do you mean that one specifically or the range including the 6600 XT etc.? I don’t have a good handle on what the real-world differences between the variants are.

        • Hitchie_Rawtin@lemmy.world · 1 year ago

          Is there any issue with buying a card that was previously used for mining?

          If used by a home user who didn’t know what they were doing, they might have run it hotter for much longer than a typical gamer would, so the thermal paste might need a redo.

          If used by some miner doing it even quasi-professionally or as a side gig, I’d much prefer it over a second-hand card from any typical gamer: most miners have kept the voltage/temps low and taken care of the card far better than a gamer, who might be power cycling regularly and definitely thermal cycling even more regularly.

        • bazsy@lemmy.world · 1 year ago

          No, there isn’t any more risk buying a mining card than any other used card. In both cases you should use a platform/marketplace with buyer protection options. Maybe one additional step is checking the VBIOS when testing.

          The non-XT is the best value of the 6600 family, but depending on local pricing the 6600 XT, 6650 XT and even the 7600 could make sense. Just keep in mind that these are all in the same performance class. Some charts show the mentioned GPUs.

    • TheGrandNagus@lemmy.world · 1 year ago

      6600XTs seem to be going for around £200, often £180 even (used, eBay).

      If you’d prefer new, you can get a 6650XT for £240. A 6650XT will be 6% faster than a 6600XT.

      It’s double the performance of a 580, uses less power, will be supported longer, etc.

    • tabular@lemmy.world · 1 year ago

      Been waiting for a good deal to replace the RX 480 in my sister’s rig. I think they announced that RX 400/500/Vega GPUs will only get security driver updates now, and only for a while; I assume that applies to Linux too. An RX 580 will play many games at 1080p 60fps, but not the modern demanding ones (maybe not even at low settings).

      Rumors say next-gen AMD isn’t targeting the high end; maybe we’ll get another 480-style price-to-performance king 🤞. Then again, with AI as the new crypto, who can say.

    • majestictechie@lemmy.fosshost.com · 1 year ago

      Same. I’ve been looking at an AMD upgrade for my Linux machine. Have been looking at the 6700 XT, which is about £330 for a 12GB GPU. If someone can think of anything better I’d like to know.