• circuitfarmer@lemmy.sdf.org · 1 day ago

    12GB of VRAM in 2024 just seems like a misstep. Intel isn’t alone in that, but it’s really annoying they didn’t just drop at least another 4GB in there, considering how much more attractive it would have made this card.

      • circuitfarmer@lemmy.sdf.org · 1 day ago

        The industry as a whole has really dragged ass on VRAM. Obviously it keeps their margins higher, but for a card targeting anything above 1080p, 16GB should be mandatory.

        Hell, with 8GB you can run out of VRAM even at 1080p, depending on what you play (e.g. flight sims).
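
        Rough numbers on why that happens (everything below is an illustrative assumption, not a measurement from any particular game): the 1080p framebuffer itself is tiny; it’s high-res textures and streamed scenery that eat VRAM.

        ```python
        # Back-of-envelope sketch of why 8 GB can run out even at 1080p.
        # All figures are assumed for illustration, not measured from a game.

        def mib(nbytes):
            return nbytes / 2**20

        # The 1920x1080 render target itself, at 4 bytes per pixel: tiny.
        framebuffer = 1920 * 1080 * 4

        # One uncompressed 4K texture (4096x4096, RGBA8), plus ~1/3 for mipmaps.
        texture_4k = 4096 * 4096 * 4 * 4 // 3

        # A flight-sim-style scene streaming ~80 such textures, plus ~1 GiB of
        # geometry, terrain, and miscellaneous buffers (assumed figure).
        scene = 80 * texture_4k + 2**30

        print(f"framebuffer:    {mib(framebuffer):6,.0f} MiB")
        print(f"one 4K texture: {mib(texture_4k):6,.0f} MiB")
        print(f"scene total:    {scene / 2**30:.1f} GiB of an 8 GiB card")
        ```

        Texture compression changes the exact figures, but the ratio is the point: the render target is rounding error next to the assets.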

    • sugar_in_your_tea@sh.itjust.works · 1 day ago

      I doubt it would cost them a ton either, and it would be a great marketing tactic. In fact, they could pair it with a release of their own LLM that’s tuned to run on those cards. It wouldn’t get them a foothold in the commercial AI space, but it could get your average gamer interested in playing with it.
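
      For a sense of scale (rough weights-only math with assumed model sizes, nothing Intel has announced), here’s roughly what fits in 12GB vs 16GB: even a 7B-parameter model at fp16 blows past 12GB, while 16GB would open up 13B at 8-bit.

      ```python
      # Rough weights-only VRAM math for local LLMs (assumed model sizes,
      # not Intel specs). KV cache and runtime overhead add more on top.

      def weights_gib(params_billions, bits):
          return params_billions * 1e9 * bits / 8 / 2**30

      for params in (7, 13, 34):
          for bits in (16, 8, 4):
              need = weights_gib(params, bits)
              print(f"{params:>2}B @ {bits:>2}-bit: {need:5.1f} GiB"
                    f" -> 12GB: {'ok' if need < 12 else 'no'},"
                    f" 16GB: {'ok' if need < 16 else 'no'}")
      ```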

      • Da Bald Eagul@feddit.nl · 10 hours ago

        It wouldn’t cost much, but this way they can release a “pro” card with double the VRAM for 5x the price.

        • sugar_in_your_tea@sh.itjust.works · 6 hours ago

          I doubt they will. Intel has proven to be incompetent at taking advantage of opportunities. They missed:

          • mobile revolution - waited to see if the iPhone would pan out
          • GPU - completely missed the crypto mining boom and COVID supply crunch
          • AI - nothing on the market

          They need a compelling GPU, since the market is moving away from CPUs as the high-margin product in PCs and the datacenter. If they produced an AI-capable chip at reasonable prices, they could get real-world testing before they launch something for datacenters. But no, it seems like they’re content missing this boat too, even when the price of admission is only a higher-memory SKU…