Intel Debuts Arc Battlemage Discrete Graphics Cards (hothardware.com)

MojoKid writes: Intel officially revealed its next generation discrete graphics cards, code-named Battlemage, this morning. There are two midrange cards in the series so far, branded Arc B580 and Arc B570; details of future higher-end B700 series cards are currently unknown. The graphics architecture for Battlemage is Xe2, which debuted in the iGPU on Lunar Lake Core Ultra 200V mobile processors earlier this year.

Arc B580 is paired with 12GB of GDDR6 memory operating at an effective data rate of 19Gbps over a 192-bit interface, and its average GPU clock should hover around 2,670MHz. The Arc B570 is based on the same slice of silicon but scales things down with 10GB of GDDR6 memory operating at the same speed as the B580, connected over a narrower 160-bit interface. The B570's average GPU clock will also be lower, in the 2,500MHz range. Performance-wise, Intel is projecting that Arc B580 will be about 10% faster than an NVIDIA GeForce RTX 4060 on average while being priced at $249 USD, undercutting the GeForce RTX 4060 substantially and offering 4GB more onboard graphics memory. Arc B580 cards are due to arrive in the market this month, with Arc B570 arriving in January 2025 at $219 USD.
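
(Back-of-the-envelope: the quoted figures work out to a peak memory bandwidth of 19 Gbps × 192 bits ÷ 8 = 456 GB/s for the B580 and 19 Gbps × 160 bits ÷ 8 = 380 GB/s for the B570. That is derived from the stated data rate and bus width, not taken from Intel's spec sheet.)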


Comments Filter:
  • by cfalcon ( 779563 ) on Tuesday December 03, 2024 @11:37AM (#64987955)

    While I generally don't hold out any hope that the performance will be as good as claimed (or that games will optimize for it), will Intel have great Linux driver support for it? Historically that has been something Intel is pretty good at.

    • Re: (Score:1, Funny)

      by Anonymous Coward

      I'm sure all seven Linux customers will be thrilled. You can't find seven Linux users who agree on anything. No matter what, they will find something to bitch about.

    • by jmccue ( 834797 )
      Actually, I am hoping these are open, as opposed to what Nvidia does. That alone will get my business. Right now I avoid Nvidia due to their licensing.
      • Right now I avoid Nvidia due to their licensing.

        Wouldn't it be nice to be able to do business with everyone?

        Instead of having to avoid companies that think their customers are just dumb rubes and use every trick in the book to rip them off or scam them.

      • "Open" does not mean good support. Good support means releasing drivers rapidly to fix issues as they arrive which is especially important in the gaming world where the graphics card driver is responsible for a whole world of fixes and workarounds that would otherwise cause your game to crash to desktop (or worse).

    • https://www.reddit.com/r/Intel... [reddit.com]

      Intel is working on an entirely new driver package for future graphics products (the Xe driver). It's a bit of an unknown how well that will work.

    • will Intel have great Linux driver support for it? Historically that has been something Intel is pretty good at.

      Unless you want to use Intel WiFi in AP mode...

    • by mjwx ( 966435 )

      While I generally don't hold out any hope that the performance will be as good as claimed (or that games will optimize for it), will Intel have great Linux driver support for it? Historically that has been something Intel is pretty good at.

      Performance doesn't have to be great... it just has to be good enough and priced competitively. The market is still crying out for decent low end graphics cards.

      Of course, I have little faith in Intel to be able to do "good enough" these days.

  • Intel has had a rough year but their graphics efforts have gotten a lot better. They can't afford another debacle.
    I bet this will be good. We'll all know when the numbers are posted.

    • by klashn ( 1323433 )

      If you don't care for support, it will certainly suffice.
      Intel's main goal is to develop IP for themselves, then productize some of their efforts, but they don't do it for the customer's benefit.

  • Some competition in the GPU arena would be nice but I'm not holding my breath.

    • For now it seems Intel may be competing with AMD at the low to middle parts of the market. If this GPU has the performance that Intel claims, it will be a decent GPU for less than $300.
      • If this GPU has the performance that Intel claims

        If history is any indication, it will not.

        It may provide pretty good video encoding rates, so it might have a niche there.

        The real question is, will it even offer better performance than AMD integrated graphics? At this price point, that's what it's competing with, not real discrete GPUs.

    • by gweihir ( 88907 )

      Me neither. This would be their, what, 4th failure to make decent GPUs?

    • I'm really excited Intel is competing so that I can buy CUDA cards for cheaper.

      *The paradox Intel and AMD face in the dGPU market.

  • It seems like Intel is so late to the GPU race that even if they catch up in performance (which it appears they have), they still carry decades of mindshare as the maker of plain-jane business graphics barely adequate for Zoom calls.

    • by Sique ( 173459 )
      Intel is not late to the GPU race. Remember the i740, the first GPU supporting the AGP port? That was back in 1998.

      Intel's graphics efforts were just lagging so far behind in performance that they dropped out of sight immediately. Starting with the i752, Intel integrated its GPUs into its chipsets, so you never got to see them as standalone products again.

      • The point being, for decades a motherboard with an Intel chipset included Intel graphics, which weren't useful for much of anything other than basic display. They may have done a good job over the years of integrating poor graphics into their chips, but it was still poor graphics that most people replaced [1]. Moreover, until recently, they didn't appear to have any interest in improving the quality of their GPUs, seemingly content to provide basic business graphics that anyone with a GPU-intensive need ...

      • None of what Intel did back in the late 90s relates to their dGPU efforts today. Where they're lacking is in dGPUs and the associated drivers. They have over a decade's worth of experience with small iGPUs. Translating that into performant dGPUs and compute accelerator cards has proven to be very difficult for them.

        • I agree. I think it was simply a market that Intel did not pursue, and that's fine. Not everyone can do everything, and Intel's core competency was CPUs and the accompanying chipsets. I just question how successful they can be getting into the dGPU market this late in the game. But truthfully, I admire the effort.

          • I agree. I think it was simply a market that Intel did not pursue, and that's fine.

            It isn't. They made multiple attempts, and those attempts all failed so badly that they hurt consumer confidence in Intel. They pursued that market, and that market ran away from them as quickly as it could.

            • I didn't know that. My only exposure to Intel graphics is their integrated stuff. I wonder what the problem was. My imperfect understanding of a GPU is a processor that does multiple floating point operations in parallel, very fast. Intel has decades and decades of experience in conventional processors. I wonder why they were not able to leap that particular chasm.

              So, one could say that this is merely the latest of several attempts. I still think it's too late to get into the market but who knows?

              • by Sique ( 173459 )
                It's a little more than just floating point. Even more important are matrix multiplications. Computer graphics is all about 4-vectors and 4x4 matrices, which you can use for geometry setup, for pixel color setup, and for texture setup. Per rendering pipeline, you have several so-called transform & lighting (T&L) units to process the input and generate the output. The modern descendants of those T&L units are programmable, and that's why they are so important for AI: you can program them to be neural nodes instead of rendering ...
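
                As a minimal sketch of that math in plain C++ (real engines use a library such as GLM rather than hand-rolled arrays): a row-major 4x4 matrix applied to a homogeneous 4-vector, here a translation by (2, 3, 4).

                #include <cstdio>

                // One "transform" step: out = m * v, with m a row-major 4x4 matrix
                // and v a homogeneous 4-vector (x, y, z, w).
                void transform(const float m[4][4], const float v[4], float out[4]) {
                    for (int row = 0; row < 4; ++row) {
                        out[row] = 0.0f;
                        for (int col = 0; col < 4; ++col)
                            out[row] += m[row][col] * v[col];
                    }
                }

                int main() {
                    // Translation by (2, 3, 4): the last column carries the offset,
                    // and the trailing 1 in the vector is what makes it apply.
                    const float translate[4][4] = {
                        {1, 0, 0, 2},
                        {0, 1, 0, 3},
                        {0, 0, 1, 4},
                        {0, 0, 0, 1},
                    };
                    const float point[4] = {1, 1, 1, 1};
                    float p[4];
                    transform(translate, point, p);
                    std::printf("(%g, %g, %g, %g)\n", p[0], p[1], p[2], p[3]);  // (3, 4, 5, 1)
                    return 0;
                }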
              • It's never too late to get into the market if you have a compelling product. Nvidia was in fact a latecomer (not this late, to be fair), and the TNT, followed by hardware T&L in the GeForce 256, got them caught up quick. The thing is, Intel wasted money on bullshit like buybacks instead of R&D investment like Nvidia, and it takes fundamental research to make fundamental improvements.

      • The problem is they keep losing interest. The Iris Pro 5200 graphics in the i5-4570R I had in a mini PC was entirely adequate. It came out in 2013; I bought the PC in 2014.

        Several years later that GPU still outperformed the HD 630. So for five years Intel sat on their hands and did nothing but put money in the CEO's super yacht fund.

    • by DrXym ( 126579 )

      They certainly do, but I expect their plan will be to produce a line of solid, affordable graphics cards until the market sees them as an equal or better choice to the equivalent AMD / NVidia cards. They're probably already offering better bang for the buck, but I wonder how they do with backwards compatibility; they probably have to do some gnarly hacks for older versions of DirectX to make old games work.

      • That would actually be ok. The only thing I care about is how well it accelerates rendering in Adobe Photoshop and Lightroom. A cheaper alternative to the monster gaming video cards (which probably have features that Adobe CC doesn't use) would be welcome.

    • Yeah that's true: would you rather pay $300 for an Nvidia GPU, or $300 for an Intel GPU? Unless the benchmarks are significantly better for the Intel GPU, you're getting the Nvidia GPU. Also, their branding sucks; the packaging looks like corporate schoolhouse style.

      Cool that it has Raytracing support though. I'll give them that.
      • Yeah that's true: would you rather pay $300 for an Nvidia GPU, or $300 for an Intel GPU? Unless the benchmarks are significantly better for the Intel GPU, you're getting the Nvidia GPU. Also, their branding sucks; the packaging looks like corporate schoolhouse style.

        Cool that it has Raytracing support though. I'll give them that.

        At this moment, the reason I'd get the Nvidia is that they have had a presence for a significant time and I have experience with Nvidia cards. Whereas my only experience with Intel graphics was the built-in graphics that everyone replaces first thing. Also, as someone else pointed out, Intel apparently has a record of losing interest in the market. So a $300 investment in something that may be a dead end? I need GPU acceleration for my job. Let someone else take those chances.

  • Any compute support or are these just for gaming?

    • These are consumer GPUs for the low to middle end. They are not meant for compute support.
      • Maybe so, but with 12GB at a decent price, they don't necessarily have to be the fastest.

      • Every serious 3D engine has probably moved from OpenGL to Vulkan, which to me counts as a "compute" API. So even if you're targeting games only, you'll still need a compute-capable GPU.

        Even OpenGL (since about 2004) can do a lot of compute besides placing pixels on a screen, and so the distinction between "graphics" and "compute" capabilities is quite arbitrary. If you released a GPU that can only do OpenGL-level graphics (which would be business suicide anyway), there are multiple ways of using OpenGL for ...

      • Re:OpenCL? CUDA? (Score:4, Insightful)

        by thegarbz ( 1787294 ) on Tuesday December 03, 2024 @02:45PM (#64988367)

        These are consumer GPUs for the low to middle end. They are not meant for compute support.

        Be that as it may, compute is rapidly turning into a standard consumer workload. I have multiple pieces of software on my PC for image editing, video editing, and other "non-technical" things which are CUDA accelerated. These days simply hitting full screen on a youtube video will load the GPU via CUDA or similar. It's no longer some niche for techheads, much like 3D hardware acceleration isn't just for gamers (used to draw the OS desktop these days).

      • They are not meant for compute support.

        Perhaps not but I would expect that there are quite a few people, particularly in STEM fields, who would like to test and develop code on their local GPU before uploading jobs to large compute facilities. If you can't support OpenCL then you are losing all of them as potential customers, not to mention that I suspect Intel are aiming at the compute market, even if they are not there yet, and having a base of users already acclimatised to your products via development would be very helpful.
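
        As a rough sketch of where that local workflow starts (assuming an OpenCL SDK and a GPU runtime are installed; the file name and build line are illustrative, e.g. g++ clprobe.cpp -lOpenCL), here is a minimal probe of what compute devices the card exposes:

        #include <CL/cl.h>
        #include <cstdio>

        int main() {
            // Enumerate OpenCL platforms (Intel, AMD, NVIDIA, ... runtimes).
            cl_platform_id platforms[8];
            cl_uint num_platforms = 0;
            if (clGetPlatformIDs(8, platforms, &num_platforms) != CL_SUCCESS || num_platforms == 0) {
                std::fprintf(stderr, "No OpenCL platforms found\n");
                return 1;
            }
            for (cl_uint p = 0; p < num_platforms; ++p) {
                char pname[256] = {0};
                clGetPlatformInfo(platforms[p], CL_PLATFORM_NAME, sizeof(pname), pname, nullptr);

                // List the GPU devices each platform exposes.
                cl_device_id devices[8];
                cl_uint num_devices = 0;
                if (clGetDeviceIDs(platforms[p], CL_DEVICE_TYPE_GPU, 8, devices, &num_devices) != CL_SUCCESS)
                    continue;  // this platform has no GPU devices

                for (cl_uint d = 0; d < num_devices; ++d) {
                    char dname[256] = {0};
                    cl_ulong mem = 0;
                    clGetDeviceInfo(devices[d], CL_DEVICE_NAME, sizeof(dname), dname, nullptr);
                    clGetDeviceInfo(devices[d], CL_DEVICE_GLOBAL_MEM_SIZE, sizeof(mem), &mem, nullptr);
                    std::printf("%s: %s (%llu MB)\n", pname, dname,
                                (unsigned long long)(mem / (1024 * 1024)));
                }
            }
            return 0;
        }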

    • by udif ( 32355 )

      Intel has oneAPI.
      https://www.intel.com/content/... [intel.com]
      The A770 was unique because it had 16GB memory at a street price of $279 and sometimes even less. I really hope we'll see a B7xx with at least 16GB RAM.
      Here is llama accelerated on Intel HW:
      https://github.com/intel-analy... [github.com]
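
      For a sense of what targeting oneAPI looks like in practice, here is a minimal SYCL vector-add sketch (not a tuned kernel). It runs on whatever device the default selector finds, which would be the Arc card if its runtime is installed, and assumes the oneAPI DPC++ compiler, e.g. icpx -fsycl vadd.cpp, with the file name being illustrative.

      #include <sycl/sycl.hpp>
      #include <iostream>
      #include <vector>

      int main() {
          constexpr size_t n = 1 << 20;
          std::vector<float> a(n, 1.0f), b(n, 2.0f), c(n, 0.0f);

          sycl::queue q;  // default device selection (GPU if available)
          std::cout << "Running on: "
                    << q.get_device().get_info<sycl::info::device::name>() << "\n";

          {
              // Buffers hand the host vectors to the runtime for the kernel's lifetime.
              sycl::buffer<float> ba(a.data(), sycl::range<1>(n));
              sycl::buffer<float> bb(b.data(), sycl::range<1>(n));
              sycl::buffer<float> bc(c.data(), sycl::range<1>(n));

              q.submit([&](sycl::handler& h) {
                  sycl::accessor A(ba, h, sycl::read_only);
                  sycl::accessor B(bb, h, sycl::read_only);
                  sycl::accessor C(bc, h, sycl::write_only);
                  // One work-item per element: c[i] = a[i] + b[i].
                  h.parallel_for(sycl::range<1>(n), [=](sycl::id<1> i) {
                      C[i] = A[i] + B[i];
                  });
              });
          }  // buffer destructors wait for the kernel and copy results back to the host

          std::cout << "c[0] = " << c[0] << " (expected 3)\n";
          return 0;
      }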

  • by thygate ( 1590197 ) on Tuesday December 03, 2024 @12:29PM (#64988063)
    So is OpenGL 1.1 finally working without bugs on Intel GPUs now?
  • I initially bought my 6800 GPU for use as an eGPU, which Intel Arc sucks at. However, had I built a desktop from the start, the Intel cards would often have been the value champions.

    • by Targon ( 17348 )

      Now that almost all new AMD CPUs have graphics on them, is there actually a value to getting Intel discrete graphics? If you want better graphics performance, then an AMD or NVIDIA video card will give you better value, because 5 percent above integrated graphics holds no value.

      • Now that almost all new AMD CPUs have graphics on them, is there actually a value to getting Intel discrete graphics?

        Intel has also included an integrated GPU for the longest time... and I was actually pretty surprised recently at how serviceable the integrated GPU actually is.

        I've got an MSI Katana laptop that includes both the integrated Intel graphics from the 12th-gen i7, as well as an RTX 4050. It's not winning any performance contests, but it plays my games just fine. I had it switched to the Intel integrated graphics and was playing Mass Effect Legendary Edition, 1920x1080, ultra-everything. It slowed down a bit during ...

  • It'll be interesting to see the benchmarks. Nvidia is treading a little too close to monopoly for now, so I've got an AMD card in my gaming PC (RX 6650 XT).

    Intel actually had stellar video transcoding/encoding performance even in the last gen, so I'm using an Arc A310 in my Jellyfin/media server, as it chews through H.265 encoding like crazy.

    I'll wait and see how the benchmarks play out, but my gut feeling is that even if you don't want to support Nvidia, Intel still probably won't compete with even AMD.

    • by jonwil ( 467024 )

      Intel claims the Arc B580 will be better than both the RTX 4060 and the RX 7600 while also being cheaper than both (when talking MSRP, at least). If that's true (and the drivers are good), that will make Battlemage a compelling option for gamers on a budget.

      • by AmiMoJo ( 196126 )

        The only caveat is that their support for older games isn't as good as AMD or Nvidia. If you like older stuff then it's hit and miss.

  • I have deployed a few A380 cards for decoding and general office applications and, for the price, they have great performance. One card is running 35 security cameras smoothly on a huge TV at 40% GPU usage and 10% CPU (running at 2160, not 4K). Without the card the CPU would pin at 100% and stutter dramatically. The A380 is only $120. I remember paying 2 grand for a card that would run SolidWorks in the '90s, and it was still crap. I hope this "B" series lives up to the hype, as it's nice to have choices and ...
  • It's the same old mediocre crap that PC builders have put in their machines over the past 25 years in order to be able to put a shiny sticker on the case.

  • By that I mean: if you run an AMD CPU, are you better off with an AMD GPU, and similarly if you run an Intel CPU (that hasn't shorted out yet), are you better off running with one of these new Xe2 Intel GPUs?

    I haven't followed the consumer GPU stuff for some time and only play old games, hence asking the question. I vaguely remember something about "resizable BAR" (um...phrasing) but can't remember if that's specific to a CPU brand or not.
  • If I were Intel and had a fairly ok graphics card, I would just slap 40GB and 80GB and 128GB on it and go for the LLM enthusiasts that will pay for a card that is slower than NVidia but has vastly more RAM and is affordable.

    In fact, I'm not sure why AMD hasn't done that.

    • My observation from what experience I have so far is that 16GB is fairly roomy if you have a low-end GPU, because as you need more RAM you also need more processing power.
