Hard to believe it’s been 24 years since Y2K (2000). It feels like we’ve come such a long way, but this decade started off very poorly with one of the worst pandemics the modern world has ever seen, and technology in general is looking bleak in several ways.

I’m a PC gamer, and it looks like things are stagnating massively in our space. So many gaming companies are incapable of putting out a successful AAA title, because people are either too poor or don’t want to play yet another live-service AAA disaster like nearly every one released lately: Call of Duty, Battlefield, and almost anything Electronic Arts or Ubisoft puts out is either a failure or undersells. So many gaming studios have been shuttered or are being shuttered, and Microsoft is basically one member of an oligopoly with Sony and a couple of other companies.

Hardware is stagnating. Nvidia is putting the brakes on its next line of GPUs; we’re not going to see huge gains in performance anymore because AMD hasn’t caught up yet, so Nvidia has no reason to innovate. They’re just going to sell the top cards of their next line for $1,500 a pop with a 10% increase in performance, rather than the 50 or 60% we really need. We still don’t have the capability to play games in full native 4K at 144 Hz; that’s at least a decade away.

Virtual reality is on the verge of collapse because Meta is basically the only real player in that space; they have a near-monopoly, with the Valve Index the only real alternative. Pico from China is on the verge of developing something incredible as well, and Apple just revealed a mixed-reality headset, but the price is so extraordinary that barely anyone has one, so use isn’t very widespread. We’re again a decade away from seeing anything really substantial in terms of performance.

Artificial intelligence is really, really fucking things up in general, and the discussions about AI look almost as bad as the news about the latest election in the USA. It’s so clowny, ridiculous, and over-the-top hearing any news about AI. The latest news is that OpenAI is going to go from a non-profit to a for-profit company, after they promised they were operating for the good of humanity and broke countless laws stealing copyrighted material, supposedly for the public good. Now they’re just going to snap their fingers and morph into a for-profit company. So they can basically steal anything copyrighted, claim it’s for the public good, and then swap to a for-profit model whenever they like. It doesn’t make any sense, and it just looks like they’re going to be a vessel for widespread economic poverty…

It just seems like there are a lot of bubbles about to burst all at the same time, and I don’t see how things can possibly get better for a while now.

  • schizo@forum.uncomfortable.business (+27/−1) · 3 months ago

    Well, that’s the doomer take.

    The rumors are that the 80-series card is 10% faster than the 90-series card from last gen. That’s not a ‘10%’ improvement; assuming the prices are the same, that’s more like a 40% improvement. I think a LOT of people don’t realize how shitty the 4080 was compared to the 4090 and are vastly mis-valuing that rumor.
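
    Back-of-envelope, treating the rumor as given and assuming the 4080 lands around 78% of a 4090 (that relative-performance number is my assumption, not an official figure):

    ```python
    # Rough gen-on-gen math with assumed relative performance (4090 = 1.0).
    perf_4080 = 0.78  # assumption: 4080 ~ 78% of a 4090
    perf_5080 = 1.10  # rumor: 10% faster than a 4090

    # If the 5080 slots into the 4080's price point, the card-for-card
    # uplift is measured against the 4080, not the 4090:
    uplift = perf_5080 / perf_4080 - 1
    print(f"5080 vs 4080: +{uplift:.0%}")  # ~ +41%
    ```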

    I’d also argue the ‘GAMES MUST BE ULTRA AT 4K144 OR DON’T BOTHER’ take is wrong. My gaming has moved almost entirely to my ROG Ally, and you know what? Shit is just as fun and way more convenient than the 7700X/3080 12GB desktop, even if it’s 1080p low and not 1440p120. If the only thing the game has going for it is ‘ooh, it’s pretty’, then it’s unlikely to be one of those games people care about in six months.

    And anyways, who gives a crap about AAAAAAAAAAAAA games? Indie games are rocking it in every genre you could care to mention, and the higher budget stuff like BG 3 is, well, probably the best RPG since FO:NV (fight me!).

    And yes, VR is in a shitty place because nobody gives a crap about it. I’ve got a Rift, a Rift S, a Quest, and a Quest 2, and you know what? It’s not interesting. It’s a fun toy, but it has zero sticking power, and that’s frankly due to two things:

    1. It’s not a social experience at all.
    2. There’s no budget for the kind of games that would drive adoption, because there’s no adoption to justify spending money on a VR version.

    If you could justify spending the kind of money that would lead to having a cool VR experience, then yeah, it might be more compelling, but that’s been tried and nobody bought anything. I will say that Beat Saber is great, but one stellar experience will not sell anyone on anything.

    And AI is this year’s crypto, which was last year’s whatever, and it’s bubbles and VC scams all the way down and pretty much always has been. Tech hops from thing to thing that they go all-in on because they can hype it and cash out. Good for them, and be skeptical of shit, but if it sticks it sticks, and if it doesn’t it doesn’t.

    • Trainguyrom@reddthat.com (+4) · 3 months ago

      I’d also argue the ‘GAMES MUST BE ULTRA AT 4K144 OR DON’T BOTHER’ take is wrong.

      Some of the best games I’ve played have graphics that’ll run on a midrange GPU from a decade ago, if not just integrated graphics.

      Case in point, this is what I’m playing right now: [screenshot]

    • fuckwit_mcbumcrumble@lemmy.dbzer0.com (+2) · 3 months ago

      The 5080 is rumored to be 10% faster than the 4090, but also to use 90% of its power. While performance has made a normal generational leap, power consumption has gone up to match, leaving you with a much smaller actual improvement.
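
      Quick math on that, using the rumored figures plus some assumed baselines (4080 at roughly 0.78× a 4090’s performance and ~320 W, 4090 at ~450 W; those baselines are my assumptions):

      ```python
      # How much of the 5080's uplift over the 4080 is just extra power?
      perf_gain  = 1.10 / 0.78 - 1         # ~ +41% faster than a 4080
      power_gain = (0.90 * 450) / 320 - 1  # ~ +27% more power than a 4080
      eff_gain   = (1 + perf_gain) / (1 + power_gain) - 1

      print(f"raw perf: +{perf_gain:.0%}")
      print(f"power:    +{power_gain:.0%}")
      print(f"perf/W:   +{eff_gain:.0%}")  # ~ +11%: the 'actual' improvement
      ```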

      • schizo@forum.uncomfortable.business (+2) · 3 months ago

        Power consumption numbers like that are expected, though.

        One thing to keep in mind is how big the die is and how many transistors are in a GPU.

        As a direct-ish comparison, there’s about 25 billion transistors in a 14900k, and 76 billion in a 4090.

        Big die + lots and lots of transistors = bigly power usage.

        I wouldn’t imagine that the 5000-series GPUs are going to be smaller or have fewer transistors, so I’d expect this to be in the ‘die shrink lowers power usage, but more transistors increase power usage’ zone.
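
        Crude per-transistor math with those counts (the power figures are my assumptions: ~253 W for a 14900K at its turbo limit, ~450 W for a 4090):

        ```python
        # Watts per billion transistors, very roughly.
        chips = {
            "14900K": (25e9, 253),  # (transistors, assumed watts)
            "4090":   (76e9, 450),
        }
        for name, (xtors, watts) in chips.items():
            print(f"{name}: {watts / (xtors / 1e9):.1f} W per billion transistors")
        # The 4090 is actually more efficient per transistor (~5.9 vs ~10.1);
        # it just has three times as many, so total draw balloons.
        ```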

        • fuckwit_mcbumcrumble@lemmy.dbzer0.com (+1) · 3 months ago

          You can also get big power consumption from turning up the voltage and cranking the clock speeds well past their efficient zone. You see that right now with most 40-series cards, where turning the clock speeds down a smidge gives you huge power savings at almost no loss in performance.
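
          That tracks with the usual dynamic-power rule of thumb, P ∝ V² · f. A sketch with made-up but representative numbers:

          ```python
          # Dynamic power scales roughly with voltage^2 * frequency.
          def rel_power(volts, ghz, v0=1.10, f0=2.8):
              """Power relative to an assumed stock point of v0 volts at f0 GHz."""
              return (volts / v0) ** 2 * (ghz / f0)

          tuned = rel_power(0.95, 2.7)  # hypothetical -14% voltage, -4% clocks
          print(f"power: {tuned:.0%} of stock")       # ~72%
          print(f"clock: {2.7 / 2.8:.0%} of stock")   # ~96%
          ```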

          Cost per mm² of die space has only gone up with each new process node these last 10 years, so unless you’re paying big money, don’t expect a big chip.

        • Vik@lemmy.world (+1) · edited · 3 months ago

          Conversely, the Apple silicon products ship huge, expensive dies fabbed on leading TSMC processes, which sip power relative to contemporaries. You can have excellent power efficiency on a large die at a specific frequency range, more so than on a smaller die clocked more aggressively.
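
          Same V² · f rule of thumb as above, applied to the wide-and-slow trade. Hypothetical numbers, assuming throughput scales with units × clock and that halving frequency lets voltage drop down the V/f curve:

          ```python
          # Two hypothetical designs with equal throughput (units * GHz = 32).
          def power(units, volts, ghz):
              return units * volts ** 2 * ghz  # dynamic power ~ N * V^2 * f

          narrow = power(units=8,  volts=1.20, ghz=4.0)  # small die, clocked hard
          wide   = power(units=16, volts=0.85, ghz=2.0)  # big die, clocked gently

          print(f"wide die: {wide / narrow:.0%} of the narrow die's power")  # ~50%
          ```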

          • schizo@forum.uncomfortable.business (+1) · 3 months ago

            You’re not wrong (and those are freaking enormous dies that have to cost Apple a goddamn fortune to make at scale), but, like, it also isn’t an Apples-to-Apples comparison.

            nVidia/Intel/AMD have gone for the maximum-performance, fuck-any-heat/noise/power-usage path. They haven’t given a shit about low-power optimizations or invested in designs that are more suited to low-power implementations (an M3 Max will pull ~80 W if you flog the crap out of it, so let’s use that number). IMO the wrong choice, but I’m just a computer janitor that uses the things, I don’t design them.

            Apple picked a uarch that was already low power (fun fact: ARM was so low power that the first test chips would run off the board’s standby power and would boot BEFORE they were actually turned on) and then focused on making it as fast as possible with as little power as possible: the compute cores came from the mobile side before being turned into desktop chips.

            I’m rambling, but: until nVidia and the x86 vendors prioritize power usage over raw performance (which AMD did with Zen 5, and you saw how that shit spiraled into a fucking PR mess), you’re going to get next year’s die shrink with more transistors, using the same power for slightly better performance. It’s entirely down to design decisions, and frankly, x86 (and to some degree nVidia) have painted themselves into a corner by relying on process node improvements (which are very rapidly going to stop happening) and modest IPC uplifts to stay ahead of everyone else.

            I’m hoping Qualcomm does a good job staying competitive with their ARM stuff, but it’s also Qualcomm and rooting for them feels like cheering on cancer.

            • Vik@lemmy.world (+2) · edited · 3 months ago

              This outlines several issues; a key one is outbidding Apple for wafer allocation on leading processes. Apple primarily sells such high-margin products that I suppose they can go full send on huge dies with no sweat. Similarly, the 4090’s asking price was likely directly related to its production cost: a chunky boy with a huge L2$.

              I like the way Mike Clark frames the challenges in semiconductor engineering as a balancing act between area, power, frequency, and performance (IPC); e.g., a chip that’s twice as fast but twice the size of its predecessor is not considered progress.
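
              That framing boils down to normalizing performance by area (and power). A trivial illustration with made-up numbers:

              ```python
              # Perf-per-area as a crude 'progress' check (hypothetical chips).
              old_perf, old_area = 1.0, 100  # baseline, 100 mm^2
              new_perf, new_area = 2.0, 200  # twice as fast, twice as big

              print(old_perf / old_area == new_perf / new_area)  # True: no progress
              ```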

              I wish ultra-efficient giga-dies were more feasible, but it’s kind of rough when TSMC has been unmatched for so long. I gather Intel’s diverting focus to 18A, and I hope that turns out well for them.

              I’m not sure that ARM as an ISA (or even RISC generally) is inherently more efficient than CISC today, particularly when we look at Qualcomm’s latest efforts in notebooks; it’s more that Apple has extremely proficient designers and benefits from vertical integration.

    • astropenguin5@lemmy.world (+1) · 3 months ago

      A little bit of pushback on the VR front: sure, there aren’t many massive publishers driving it forward, but I would wholeheartedly argue that it can very much be a social experience and that it offers experiences that are damn near impossible to get anywhere else. Three games immediately come to mind:

      VRChat (obviously): literally an entirely social game, with a pretty large community of people making things for it, from character models to worlds, because that’s what drives the game. There is a massive scene of online parties, raves, hangouts, etc. that bring people together across the whole world, in a medium more real than any flat game because of the custom models, the worlds, and the relative abundance of people using full-body tracking to show off, dance, and interact with each other.

      VTOL VR: this is still fairly social, in that you can play with friends or people online, but the main draw for me is the level of immersion in flying you can get. You have fully interactable cockpits that you operate with your real hands (depending on your controller/hand tracking), and it’s all pretty realistic. It’s just impossible to have the same level of experience without VR.

      Walkabout Mini Golf: I was pretty skeptical of this game when my friends wanted to play it; it’s literally just a mini golf sim. The thing is, the ability to play mini golf with friends who live across the country or the world is amazing, and the physics of swinging your controller/hands the same way as in real mini golf is so special.

      It is still quite expensive to get really good gear, and that is definitely the current biggest hurdle. It may forever be a smaller community due to the space/tech/cost requirements to make the experience truly incredible, but for me, even just on a Quest 2 in my room without a lot of fancy stuff, it is still interesting and something special. A lot of people really do care a lot about VR, and even if there are far fewer of them than conventional gamers, it should not be entirely discounted. And I personally think that while it probably won’t ever replace flat-screen gaming, it is an entirely different kind of experience and has at least a decent future ahead.

      • schizo@forum.uncomfortable.business (+2) · 3 months ago

        Fair points on VR games being fairly social. I was thinking more of the in-person social experience, which still involves some portion of the people present stuffing their faces into a headset and wandering off into their own world.

        IMO, this is something that AR/MR stuff could do a great job of making more social by adding the game to the world, rather than taking the person out of the world and into the game. But, of course, this also restricts what kinds of games you can do, so it’s probably only a partial solution and/or improvement on the current state of affairs.

        I also agree that it’s way too expensive still, and probably always will be because the market is, as you mentioned, small.

        PCVR is pretty much dead despite its proponents running around declaring that it’s just fine, like it’s a Monty Python skit. And the tech for truly untethered headsets is really only owned by a single (awful) company, and only because the god-CEO thinks it’s a fun thing to dump money on, which means it’s subject to sudden death if he retires/dies/is ousted/has to take time off to molt/has enough shareholder pressure put on him.

        Even then, it’s only on its second generation (the original Quest was… beta, at best) and is expensive enough that you have to really have a reason to be interested, rather than it being something you could just add to your gaming options.

        I’d like VR to take off and the experiences to more closely resemble some of the sci-fi worlds that feature or take place in a virtual-reality world, but honestly, I’ve thought that would be cool for like 20 years now, and we’re only very slightly closer than we were then; we just have smaller headsets and somewhat improved graphics.