Hey, this may be a stupid question. I am considering buying a GPU and am torn between NVIDIA and AMD. I know AMD works better on Linux in general, but I am curious to follow NVIDIA's advancements with the new open-source kernel modules and stuff… I don't know if it is worth picking team green over team red. Also, performance will typically be better with NVIDIA for compute and things like that.

P.S.

Yes, this is related to the previous post I made here.

  • Lettuce eat lettuce@lemmy.ml · 4 months ago

    Unless you are a power user who is confident in your ability to troubleshoot weird/esoteric issues and bugs, just go AMD.

    If there aren’t any specific features you need from Nvidia, like CUDA for CAD/Render workloads, AMD is going to have a higher chance of #JustWorking and will give you awesome gaming performance.

    I’ve got a 6700XT paired with a 5800X3D running Nobara Linux for my main gaming rig. Love it to death, runs everything butter smooth.

    For instance, Deep Rock Galactic maxed settings at 1080p, I don’t ever see it dip below about 160FPS, and most of the time it’s between 180-210, which feels amazing on my 240Hz monitor.

    In defense of Nvidia, things are wayyy better than they were even 2-3 years ago, and the majority of folks, especially with older Nvidia GPUs, seem to have a pretty decent experience on Linux.

    That being said, I would estimate that roughly 75% of the posts I see from users who are having really odd/random issues with Linux have an Nvidia GPU.

    • Telorand@reddthat.com · 4 months ago

      Only recent issue I’ve seen from AMD folks is VRR problems via HDMI. No idea if that affects Nvidia users, but I’d imagine it’s a small subset of AMD users experiencing that.

      • jrgd@lemm.ee · 4 months ago

        The VRR problems are specifically related to either monitors not supporting FreeSync over HDMI, or the user expecting HDMI VRR to work at HDMI 2.1 specs (>4K@60Hz, or equivalent bandwidth negotiation requirements). I would concur that "a small subset of users" is accurate for the use cases where this becomes a problem.
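        For intuition on where those bandwidth requirements come from, here is a rough back-of-the-envelope sketch (the 20% blanking overhead and the 18/48 Gbps link maxima are approximations of the HDMI 2.0/2.1 limits, not exact negotiated figures):

```python
# Rough uncompressed video bandwidth estimate: pixels * refresh * bits-per-pixel,
# padded by ~20% for blanking intervals. Real link budgets differ slightly.
def video_gbps(width, height, hz, bpp=24, blanking=1.2):
    return width * height * hz * bpp * blanking / 1e9

HDMI_2_0_GBPS = 18.0  # approximate HDMI 2.0 TMDS maximum
HDMI_2_1_GBPS = 48.0  # approximate HDMI 2.1 FRL maximum

uhd60 = video_gbps(3840, 2160, 60)    # ~14.3 Gbps -> fits HDMI 2.0
uhd120 = video_gbps(3840, 2160, 120)  # ~28.7 Gbps -> needs HDMI 2.1

print(f"4K@60Hz:  {uhd60:.1f} Gbps (fits HDMI 2.0: {uhd60 <= HDMI_2_0_GBPS})")
print(f"4K@120Hz: {uhd120:.1f} Gbps (needs HDMI 2.1: {uhd120 > HDMI_2_0_GBPS})")
```

        4K@60Hz fits comfortably within HDMI 2.0, while anything above needs the 2.1 FRL link, which is exactly where the problems start.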

      • Petter1@lemm.ee · 4 months ago

        It will install fine and will use the open-source nouveau driver by default. After install, you can search for the driver-configuration app and install the proprietary NVIDIA driver from there via a GUI.

      • Lettuce eat lettuce@lemmy.ml · 4 months ago

        I think you have to manually install the Nvidia drivers. If you search “drivers” in the Cinnamon launcher, they have a system app to download and install them.

  • GolfNovemberUniform@lemmy.ml · 4 months ago

    I don’t think the new open-source modules will help bring those features to Linux. NVIDIA isn’t interested in making its monopolistic features reverse-engineerable.

    • LeFantome@programming.dev · 4 months ago

      Are you mixing two concepts?

      The Open Source kernel modules will work with the proprietary Linux drivers which have all the features.

      There will also be Open Source drivers which do lack features. However, “Linux” still has the features via the proprietary drivers.

  • data1701d (He/Him)@startrek.website · 4 months ago

    AMD, unless you’re actually running AI/ML applications that need a GPU. AMD is easier than Nvidia on Linux in every way except ML and video encoding (although I’m on a Polaris card that lost ROCm support [which I’m bitter about], and I think AMD cards have added a few video codecs since). In GPU compute, Nvidia holds a near-dictatorship, one which I don’t necessarily want to engage in.

    I haven’t ever used an Intel card, but I’ve heard it seems okay. Anecdotally, graphics support is usable but still improving for gaming. Although Intel’s AI ecosystem is immature, I think Intel provides special TensorFlow/Torch modules or something, so with a bit of hacking (but likely less than AMD) you might be able to finagle some stuff into running. Where I’ve heard these shine, though, is video encoding/decoding: I’ve heard of people buying even the cheapest card and having a blast in FFmpeg.

    Truth be told, I don’t mess with ML a lot, but Google Colab provides free GPU-accelerated Linux instances with Nvidia cards, so you could probably just go AMD or Intel for the best usability and do your AI work in Colab.

      • data1701d (He/Him)@startrek.website · 4 months ago

        I’ve heard of people coercing even my graphics card, an RX 580. However, I avoid generative AI for ethical reasons and also because Microsoft is trying to shove it down my throat. I really hope that copyright law prevails and that companies will have to be much more careful with what they include in their datasets.

  • bastion@feddit.nl · 4 months ago

    I must say, I switched to a system with AMD and there’s no going back for me. If Linux is going to be your daily driver, it’s absolutely AMD.

  • Kongar@lemmy.dbzer0.com · 4 months ago

    Single person’s data point:

    I’ve had numerous GPUs; I’ve been all over the map for years. Sometimes AMD sucks, sometimes Nvidia sucks. Right now I’m rocking a 4090, and it’s working better in EndeavourOS than I’ve ever seen Nvidia work on Linux. (I’ve always had problems with Nvidia cards: screen tearing, stuttering, and general installation issues.)

    But honestly, those complaints have been resolved at least with my distro. I think both brands are in a good spot right now. I think you’re safe to buy whatever floats your boat.

    IMO

    • thingsiplay@beehaw.org · 4 months ago

      Good to know (I use EndeavourOS too, BTW). I think it’s also important to mention whether you use Wayland.

      I think you’re safe to buy whatever floats your boat.

      It is not that simple! For example, OpenCL was problematic with AMD, which prevented it from being used in applications. Installing the ROCm driver as an alternative can be problematic on AMD too; it solves one issue but brings another. I just recently got OpenCL working on AMD, thanks to a new experimental implementation in Rust. My point is, the OP really should research before buying, because depending on the use case one option is better than the other.
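      As a concrete way to do that research on a machine you already have, here is a small hypothetical sketch that only checks which GPU compute stacks are importable from Python (the package names are common community ones, chosen for illustration; a real check would also verify that a device is actually visible to each stack):

```python
import importlib.util

# Map a friendly label to the Python package that fronts each compute stack.
# These are common community packages, used here purely as an availability probe.
COMPUTE_STACKS = {
    "CUDA/ROCm via PyTorch": "torch",
    "OpenCL via PyOpenCL": "pyopencl",
    "SYCL/oneAPI via dpctl": "dpctl",
}

def probe_compute_stacks(stacks=COMPUTE_STACKS):
    """Return {label: bool} saying whether each stack's package is installed."""
    return {label: importlib.util.find_spec(mod) is not None
            for label, mod in stacks.items()}

if __name__ == "__main__":
    for label, present in probe_compute_stacks().items():
        print(f"{label}: {'installed' if present else 'missing'}")
```

      Running something like this before and after driver changes makes it obvious which use cases a given card actually covers on your distro.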

      • Telorand@reddthat.com · 4 months ago

        I love these “AMD outlier cases,” because it tempers my own expectations.

        I have a 3060 Ti and want to upgrade to an AMD card in five-ish years, but it’s nice not to be surprised, and to know beforehand that it’s not necessarily going to be perfect or better than my Nvidia experience.

  • thingsiplay@beehaw.org · 4 months ago

    Depends on what you want to do. If you need CUDA for certain applications, for example, it’s better to use Nvidia. Do you have a G-Sync monitor? Nvidia. Currently Nvidia does not work well with Wayland if you want to game. Nvidia is also better at ray tracing, if that is important to you. The open-source kernel modules for Nvidia don’t matter at all, because the driver is still closed source and basically nothing changes. I believe HDMI is better supported on Nvidia because of the closed-source driver; the HDMI Forum does not like open source, so HDMI is a bit limited on AMD. I would recommend using DisplayPort anyway, but that might not be an option for every monitor.

    Also, in my experience it was a pain to use Nvidia, not only because of problems here and there (under X11 back then), but also because Flatpak downloaded multiple versions of the driver, since each program depended on a certain version of the Nvidia driver. Each driver was over 300 MB, so it adds up after 6 versions and updating over and over again.

    I don’t know the current state of Nvidia, to be honest, because I switched to AMD. So it comes down to what card is available to you at what price, and what you want to do. If you don’t know and have to ask, I would say AMD is a safe bet. Buy into Nvidia only after research, and only if you really need certain features.

    • Empricorn@feddit.nl · 4 months ago

      Do you have a G-Sync monitor? Nvidia.

      This is really only relevant for older/lower-power GPUs, right? I think if you can easily game at high frame rates on modern games, you don’t need G-Sync.

      • thingsiplay@beehaw.org · 4 months ago

        This has nothing to do with older or lower-power GPUs. G-Sync is the best possible way to play games if you can use it. Higher framerates are not a replacement for VRR technology.

        • Telorand@reddthat.com · 4 months ago

          Thanks for the rec. I’m still on Windows on my main rig, but I’m transitioning to Linux very soon, once I have all my ducks in a row.

          I have a G-sync 240Hz monitor, and it’s far superior to using V-sync. Good to know I’ll still get the most out of it with the card I have.

          • thingsiplay@beehaw.org · 4 months ago

            If you already have a working graphics card, then you can try to use it first. My last Nvidia card was a 1070, and I switched to AMD just last year. Because my monitor is a little old, it only has G-Sync and no FreeSync, meaning I lost the ability to use VRR. If you already have a monitor and GPU, my recommendation is to use those and see how things go before buying new and expensive hardware. As a side note, I’m a huge emulation fan, and old consoles and arcades have weird sync rates, in which case VRR like G-Sync is optimal. But that’s just a side note.
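            To put a number on those weird sync rates: without VRR, a fixed-rate display accumulates drift against the source and has to drop or duplicate a frame at regular intervals. A quick sketch, using the commonly cited NTSC NES refresh rate as an example:

```python
# Without VRR, a fixed 60 Hz display showing a ~60.0988 Hz source (NTSC NES)
# accumulates one full frame of drift periodically, felt as a visible stutter.
def stutter_interval_s(source_hz, display_hz):
    """Seconds between dropped/duplicated frames on a fixed-rate display."""
    drift_per_second = abs(source_hz - display_hz)  # frames of drift per second
    return 1.0 / drift_per_second

interval = stutter_interval_s(60.0988, 60.0)
print(f"One stutter roughly every {interval:.1f} s")  # roughly every 10 s
```

            VRR sidesteps this entirely by letting the display follow the source rate.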

            You can also dual-boot Windows and Linux, meaning you choose which operating system to run at boot time. Then you have a little backup on one side that you can jump back to if needed, and the new experiment on the other. I assume you will do something wrong, and it might even require reinstalling Linux; maybe not, but you should always be prepared for the worst-case scenario.

            • Telorand@reddthat.com · 4 months ago

              I’ve been doing trial installs in VMs before I make the full switch, so I know what to expect when I do the bare metal install. It’s not exactly a 1:1 analog, but it’s given me some good expectations and allowed me to iron out some requirements (like a dumb VPN client my work requires).

              I plan to go full Linux and just have vfio pass the graphics card to a Windows VM for the few times I need it. Most of my programs have a Linux counterpart or equivalent, and the remaining few I don’t need.

    • monobot@lemmy.ml · 4 months ago

      If you need CUDA for certain applications in example, its better to use Nvidia.

      Depends on budget. PyTorch works nicely on ROCm, and for me the bigger constraint is available VRAM rather than GPU speed. AMD seems to have cheaper VRAM: comparing their cheapest 16 GB cards, AMD is 33% cheaper than Nvidia where I live, and a few months ago there was a card 45% cheaper. Huge savings on a limited budget.
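      To illustrate why VRAM is the constraint, here is a rough sketch of how much memory the model weights alone need at different precisions (the 7B parameter count is illustrative; activations, optimizer state, and framework overhead come on top):

```python
def weights_gib(n_params, bytes_per_param):
    """GiB needed just to hold the model weights at a given precision."""
    return n_params * bytes_per_param / 2**30

# A hypothetical 7B-parameter model at different precisions:
n = 7_000_000_000
for label, nbytes in [("fp32", 4), ("fp16", 2), ("int8", 1)]:
    gib = weights_gib(n, nbytes)
    print(f"{label}: {gib:.1f} GiB -> fits a 16 GiB card: {gib < 16}")
```

      At fp32 the same model already overflows a 16 GiB card, which is why the cheaper-per-gigabyte option can matter more than raw speed.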

  • LeFantome@programming.dev · 4 months ago

    The problem with this question is that most NVIDIA owners will have experience based on a very different stack than you will experience.

    NVIDIA and Wayland have had very big problems that have only recently been resolved. If you are using a very up-to-date distribution then you will have a great experience ( see other comments here about EndeavourOS for example ). If you have a distribution that does not have the latest, there will probably be issues.

    AMD has been the clear go-to choice for Linux for years. It is still a safe bet. The safest bet based on history. That said, AMD does have issues as well and with the NVIDIA issues now resolved it is not as clear cut. NVIDIA may actually be the better choice.

    • wallmenis@lemmy.one (OP) · 4 months ago

      I am using it for gaming on Windows. I will dual-boot with a different OS on separate drives. For Linux, I want something stable that won’t crash on Wayland.

      • thejevans@lemmy.ml · 4 months ago

        You have to decide what is more important to you: Linux compatibility or ray tracing and CUDA? There are other differences, but those are the big ones.

        • just_another_person@lemmy.world · 4 months ago

          The only difference is CUDA. AMD does better ray tracing from what I’ve seen, and FSR is more performant than DLSS in most cases, though DLSS may have an edge in image quality, which is subjective.

    • thejevans@lemmy.ml · 4 months ago

      I have a 7900XTX and I use a DisplayPort to HDMI adapter to get HDMI 2.1. I can use 4K@120Hz and HDR on my LG OLED TV just fine with that setup. The only real limitation is 3 display outputs vs 4 if I could use the HDMI out for what it is meant for.

      • kusivittula@sopuli.xyz · 4 months ago

        i use an lg oled tv as a monitor too and was planning to get an amd gpu. i looked at these adapters; only a couple of different ones are available where i live, but they had horrible reviews. what brand is yours?

  • sovietknuckles [they/them]@hexbear.net · 4 months ago

    I know AMD works better on linux in general but I am curious to follow the NVIDIA advancements as they go with the new open source kernel modules and stuff…

    How is it open source? In the history of the whole repository, there were 11 merged PRs in 2022 (when the project began), and no merged PRs after, even though lots of PRs have been submitted since then. There has never been an issue-fixing PR merged, and no issues or PRs are submitted by the maintainers of the project.

    A maintainer explains their workflow:

    Because we will be sharing this code with our proprietary driver, we won’t be developing in the open for now. So far, our strategy is to apply proposed changes to our internal code base, merge pull requests on github, and then do one NVIDIA github commit per driver release (and because the internal code base also contains the change, the release-time commit should not revert the merged pull request). It is not a great workflow, but we’re trying to navigate the constraints as best we can.

    All of their commits are tagged versions, none of which tell you in words what they did or what changed. As the maintainer says, they still do their actual development internally, and the GitHub repository does not contain that incremental work. Because the commits are releases only, there are only 66 commits on the main branch from May 2022 to the latest commit/release 2 weeks ago.

    So whatever benefit you were hoping to get from Nvidia’s kernel modules being open source probably is not there.

    • Markaos@lemmy.one · 4 months ago

      How is it open source?

      How is it not? Open source doesn’t mean you have to accept other people’s code. And it is perfectly valid to only dump code for every release, even some GNU projects (like GCC) used to work that way. Hell, there’s even a book about the two different approaches in open source.

      So whatever benefit you were hoping to get from Nvidia’s kernel modules being open source probably is not there.

      It allowed the actual in-tree nouveau kernel module to take the code for interacting with the GSP firmware, which enables changing the GPU clock speed; in other words, no more being stuck at the lowest possible frequency like with the GTX 10 series cards. Seems like a pretty decent benefit to me.

  • TomBombadil [he/him, she/her]@hexbear.net · 4 months ago

    I started my Linux time on a 1080 and never really had issues. Never really knew I was supposed to, haha. Now I’m on an AMD card that works great but actually took a bit more setup on my part to work perfectly.

    So who knows

  • xarexyouxmadx@lemmy.world · 4 months ago

    I went with Nvidia and I never really had any issues on Linux with it (I only use Linux).

    AMD might be better on Linux out of the box, in the sense that you can install whatever distro and it’s going to work, while with Nvidia some distros will require you to install the drivers yourself or tinker one time in the terminal. But it’s really not that big of a deal.

    But go with whatever you think is best for you.

  • thegreenguy@sopuli.xyz · 4 months ago

    If NVIDIA is significantly better value than AMD for your use case, go team green. If not, I’d go team red; personally, I wouldn’t buy NVIDIA just because one day it might be better.