• BURN@lemmy.world

    This is a net win. Now they won’t be recommended to everyone trying to do hardware comparisons. The bias in their results has pretty much made them worthless as a source since Ryzen released.

    • IronKrill@lemmy.ca

      Unfortunately the subscription appears to be for their benchmarking tool only, not for website access.

  • MataVatnik@lemmy.world

    Aren’t these the people that straight up manipulated data to make AMD look worse than Intel or something wild?

    • HarkMahlberg@kbin.social

      Yeah that’s the guy. Hilarious to see he thinks his garbage biased opinion is worth any amount of money.

    • TheGrandNagus@lemmy.world

      Yup. They’ve always done it, both on the CPU and GPU side, but especially on the CPU side.

  • R0cket_M00se@lemmy.world

    Aren’t these the same guys that have a stick up their ass about AMD and/or get paid by Intel to be biased?

  • AtmaJnana@lemmy.world

    I know the admins have unquestionable integrity (they certainly pretend as much) so surely they are going to retroactively pay every user who contributed their benchmarks for free. Right? When should I expect my first royalty check?

  • Faceman🇦🇺@discuss.tchncs.de

    Ah, yes, the guy who obviously put a lot of money into Intel shares, or tried to short AMD, right around the launch of the Zen architecture, and got so butthurt about his poor investment that he started actively falsifying data and writing ridiculous, unhinged reviews of AMD products to make Intel look better.

  • nukul4r@feddit.de

    Hopefully this will hurt them to the point where they go out of business. Just look at their review of the 5800X3D; it’s so unreal.

    • AlpacaChariot@lemmy.world

      The 5800X3D has the same core architecture as the 5800X but it runs at 11% lower base and 4% lower boost clocks. The lower clocks are in exchange for an extra 64MB of cache (96MB up from 32MB) and around 40% more money. For most real-world tasks performance is comparable to the 5800X. Cache sensitive scenarios such as low res. canned game benchmarks with a 3090-Ti ($2,000 USD) benefit at the cost of everything else. Be wary of sponsored reviews with cherry picked games that showcase the wins, conveniently ignore frame drops and gloss over the losses. Also watch out for AMD’s army of Neanderthal social media accounts on reddit, forums and youtube, they will be singing their own praises as usual. Instead of focusing on real-world performance, AMD’s marketers aim to dupe consumers with bankrolled headlines. The same tactics were used with the Radeon 5000 series GPUs. Zen 4 needs to bring substantial IPC improvements for all workloads, rather than overpriced “3D” marketing gimmicks. New PC builders have little reason to look further than the $260 12600K which, at a fraction of the price, offers better all round performance in gaming, desktop and workstation applications. Users with an existing AM4 build should wait just a few more months for better performance at lower prices with Raptor Lake or even Zen 4. The marketers selling expensive “3D” upgrades today will quickly move onto Zen 4 (3D) leaving unfortunate buyers stuck on an overpriced, 6 year old, dead-end, platform. [Mar '22 CPUPro]

      Jesus

      • digdug@kbin.social

        What’s scary is that I think the owner of userbenchmark actually believes that statement. Which might explain how he’s so out of touch that he thinks his own crap doesn’t stink and deserves to be locked behind a subscription. I’m just sad that there might be a not insignificant number of people that pay for it.

        • Faceman🇦🇺@discuss.tchncs.de

          I’m certain he must’ve lost a lot of money betting against AMD on the stock market right around the time of Zen 1, and he never got over it.

      • pivot_root@lemmy.world

        The real Neanderthal social media account is the one writing that review.

        Instruction and data caches have a real, tangible benefit. Although there is a point of diminishing returns, more L3 cache is absolutely worth a 10% clock speed trade-off for consumer systems. Fetching memory from the bus is an order of magnitude slower than fetching from cache, and the processor has to perform other work or stall while it’s waiting for that.
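
        A rough back-of-the-envelope sketch of that trade-off in Python (the clocks, hit rates, and latencies below are made-up illustrative numbers, not measurements of any real chip):

        ```python
        # Toy average-memory-access-time (AMAT) comparison: a slightly slower core
        # with a higher L3 hit rate vs. a faster core with a lower hit rate.
        # All numbers here are illustrative assumptions, not benchmark results.

        def effective_ns_per_access(clock_ghz, hit_rate, l3_hit_cycles=45, dram_ns=80):
            """Average time per memory access: L3 hits cost cycles, misses go to DRAM."""
            cycle_ns = 1.0 / clock_ghz
            return hit_rate * l3_hit_cycles * cycle_ns + (1.0 - hit_rate) * dram_ns

        # Hypothetical "more cache, ~10% lower clocks" part vs. "less cache, higher clocks" part.
        big_cache = effective_ns_per_access(clock_ghz=4.1, hit_rate=0.95)
        small_cache = effective_ns_per_access(clock_ghz=4.55, hit_rate=0.85)

        print(f"big cache part:   {big_cache:.1f} ns per access on average")
        print(f"small cache part: {small_cache:.1f} ns per access on average")
        # With these made-up numbers the extra cache hits more than pay for the
        # lost clock speed in memory-bound code, which is roughly the X3D argument.
        ```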

        But, knowing the bias of the reviewer, they’re probably running DDR4 at 5200 MT/s (2000 over JEDEC specs) on their Intel systems to make up for the lack of cache while thinking, “just buy a more expensive processor and RAM, you brain-dead cretins.”

        • 8ender@lemmy.world

          I mean it’s kinda amazing that there’s someone looking at a 14th gen Intel CPU sucking back 200+ watts, while it gets spanked by a 7800X3D running at 65 watts, and thinking “AMD is hurting consumers”. That’s some next level shit.

          • pivot_root@lemmy.world

            Well said. The only thing hurting consumers is the reviewers omitting information or spreading misinformation.

      • brick@lemm.ee

        Ok so I am about to build a new rig, and looking at the specs the X3D does seem less powerful and more expensive than the regular 7950.

        While I completely agree that this guy seems extremely biased and that he comes off like an absolute dickbag, I don’t think the essence of his take is too far off base if you strip off the layers of spite.

        Really, it seems like the tangible benefit of the X3D that most people will realize is that it offers similar performance with lower energy consumption, and thus lower cooling requirements. Benchmarks from various sources seem to bear this out as well.

        It seems like a chip that in general performs on par with the 7950x but with better efficiency, and if you have a specific workload that can benefit from the extra cache it might show a significant improvement. Higher end processors these days already have a fuckton of cache so it isn’t surprising to me that this doesn’t benchmark much better than the cheaper 7950x.

        • Neshura@bookwormstory.social

          Why are you talking about the 7950? The review is about the 5800X3D, and when it released, AM5 and the Ryzen 7000 chips didn’t exist yet.

          Seems a bit silly to say the (launch) review is right and then use a piece of hardware that didn’t exist at the time as proof.

          How about you compare the 5800X3D to a 5800X and a 5900X instead?

          • brick@lemm.ee

            I was comparing the 7950X and the 7950X3D because those are the iterations that are available right now, and they’re what I have personally been comparing, as I mentioned. I apologize if I wasn’t clear enough on that point.

            My point was that the essence of the take, which I read to be, “CPUs with lower clocks but way more cache only offer major advantages in specific situations” is not particularly off base.

            • Neshura@bookwormstory.social

              I still fail to see how comparing an AM5 chip is in any way, shape, or form a good addition to a discussion of an objectively terrible review of a late addition to the AM4 product family. What you say might be true… for AM5. Which is not the subject of the review everyone is talking about. Nor is anybody except you talking about what X3D currently offers; we’re all talking about a review that, at the time it was written, was horribly researched, full of bias and false facts.

              You coming in and suddenly talking about the 7950X/X3D adds nothing of value to the topic at hand, because the topic at hand isn’t “Is X3D worth it”, it’s specifically “look at how badly UserBenchmark twisted this 5800X3D review”.

              • brick@lemm.ee

                So sorry to interrupt your circlejerk about this guy’s opinion on 3D V-Cache technology with a tangentially related discussion about 3D V-Cache technology, here on the technology community.

                I fully understand the point you’re trying to make here, but just as you think my comments added nothing to the discussion, your replies to them added even less.

    • Neato@ttrpg.network

      The only reason I can think of for a site to do this is that they were already about to go under. This will absolutely tank them, as there are free alternatives.

    • agent_flounder@lemmy.world

      Also watch out for AMD’s army of Neanderthal social media accounts on reddit, forums and youtube, they will be singing their own praises as usual.

      Wat

      Fellow AMD Neanderthal Army soldiers: any idea when I get my cool uniform and …paycheck?

      Zen 4 needs to bring substantial IPC improvements for all workloads, rather than overpriced “3D” marketing gimmicks.

      … the AMD Ryzen 7 5800X3D performs reasonably consistently under varying real world conditions.

      Uhh… Aren’t… Aren’t these two statements kinda contradictory?

      • kaitco@lemmy.world

        Not if you remember that the writers are being paid by Intel. Then, it all comes together.

      • Nilz@sopuli.xyz

        Uhh… Aren’t… Aren’t these two statements kinda contradictory?

        No no, you see: it performs reasonably consistently under varying real world conditions, but for a CPU to truly shine it needs to handle all workloads, including unrealistic synthetic ones.

      • ChicoSuave@lemmy.world

        You’re expecting rationality from someone who just made crazy statements because their feelings are hurt.

  • jonesy@aussie.zone

    I’m sure there are niche users for whom paying the price of admission is acceptable, but for myself, and I assume a vast number of other users, when I’m comparing hardware performance I’m already checking reviews on multiple other sites, so this will only mean I don’t bother to check their site.

    I haven’t visited their site in a long time though, so I’m not sure what value-adds they offered that might make the price more palatable.

    • conciselyverbose@kbin.social

      None.

      The actual “single core” and “multi-core” scores were basically fine last I was aware, but they went so far into apeshit meltdown over the fact that AMD was offering better value than Intel with Ryzen that it undermined everything else. (The value lead has gone back and forth since, but AMD is the reason I could get 16 real cores, capable of demanding single-core loads too, for $500 a couple of years ago, not too long after Intel was selling 6 cores for more than that.)

      Anyways, UB’s owner didn’t like that AMD had good shit, so he kept changing the “gaming/desktop/whatever” grade formulas to tilt the comparisons toward Intel using more and more hilarious mechanisms. It started with a reasonable “games don’t really benefit past 4/6/8 cores” de-emphasis of super-high core counts, which hadn’t really been an issue before, but it quickly degraded into obviously cheating hard by whatever means necessary to punish AMD, with even worse diatribes in the descriptions to match.
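
      For a sense of how that kind of tilting works, here is a purely hypothetical weighted-score sketch (the weights and sub-scores are invented for illustration; this is not UserBenchmark’s actual formula): the same raw benchmark data produces a different headline winner once the weighting shifts.

      ```python
      # Hypothetical "effective speed" score: a weighted blend of sub-scores.
      # Weights and sub-scores are invented; they are NOT UserBenchmark's real numbers.

      def effective_speed(single, multi, w_single, w_multi):
          return w_single * single + w_multi * multi

      # Made-up normalized sub-scores for two fictional CPUs.
      cpu_a = {"single": 100, "multi": 95}   # fewer cores, higher single-core score
      cpu_b = {"single": 90,  "multi": 140}  # many more cores

      for label, (w_s, w_m) in [("balanced", (0.5, 0.5)), ("single-core heavy", (0.9, 0.1))]:
          a = effective_speed(cpu_a["single"], cpu_a["multi"], w_s, w_m)
          b = effective_speed(cpu_b["single"], cpu_b["multi"], w_s, w_m)
          winner = "A" if a > b else "B"
          print(f"{label:18s} -> A={a:.1f}, B={b:.1f}, winner: CPU {winner}")
      # Same inputs, different winner, purely from how the blend is weighted.
      ```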

  • casmael@lemm.ee

    Right so clearly userbenchmark is trash, but where could one, hypothetically, go, to, hypothetically, compare the performance of various cpus, hypothetically?

  • notannpc@lemmy.world

    Of all the things to try and monetize with a subscription…

    Who’s more brain damaged, the site owner or the people that actually pay for it?

    • GladiusB@lemmy.world

      That doesn’t actually work. My system can tell what memory I have, but you can’t? Fuck that website.

      • aidan@lemmy.world

        My system can tell what memory I have, but you can’t?

        Wait, I’m confused about this. Are you upset the site can’t tell what RAM you have? Because I’m pretty certain there is no way a site could tell that.
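
        For what it’s worth, that’s the distinction: a web page on its own can’t enumerate your RAM, but a locally installed client (which is what their benchmark tool is) can. A minimal sketch of the local side using the third-party psutil library (nothing to do with UserBenchmark’s actual code):

        ```python
        # What a locally running program can read that a web page cannot.
        # Requires the third-party psutil package (pip install psutil).
        import psutil

        mem = psutil.virtual_memory()
        print(f"Total RAM: {mem.total / 2**30:.1f} GiB")
        print(f"Available: {mem.available / 2**30:.1f} GiB")

        # DIMM-level details (module model, speed, slot layout) need platform-specific
        # queries on top of this, e.g. WMI on Windows or dmidecode on Linux; a browser
        # exposes none of that beyond a coarse navigator.deviceMemory bucket at best.
        ```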

  • HuddaBudda@kbin.social

    Why on earth do they have a monthly subscription for something people maybe use once every one or two years?

    Who is actually going to pay a Netflix-style sub to see marginally bad data that often?

    Like, Netflix I understand if you cannot help yourself.

    But is there a band of computer nerds out there, that I don’t know about, that wants play-by-play updates on how a graphics card is performing compared to others? On a monthly basis?

    • Danitos@reddthat.com

      Being a number nerd, I can see the appeal of something like this (extremely bad data quality aside), or at least I do frequently visit OpenBenchmarking.org (a similar concept to UserBenchmark, but open source).

      I also know one person who is obsessed with constantly buying/selling parts for their PC, and who for whatever reason still uses UB after I told them how shit it is.

      My guess is that this will also resonate with some Intel fanboys.

      All of this is more the exception than the rule, but they only need a handful of people subscribing to generate more profit than before.