Intel Enters the PC Gaming GPU Battle With Arc

Dave Knott writes: Intel is branding its upcoming consumer GPUs as Intel Arc. The new Arc brand will cover both the hardware and the software powering Intel's high-end discrete GPUs, spanning multiple hardware generations. The first of those generations, previously known as DG2, is expected to arrive under the codename "Alchemist" in Q1 2022. Intel's Arc GPUs will be capable of mesh shading, variable rate shading, video upscaling, and real-time ray tracing. Most importantly, Intel is also promising AI-accelerated super sampling, which sounds like Intel has its own competitor to Nvidia's Deep Learning Super Sampling (DLSS) technology.
  • by Kokuyo ( 549451 ) on Monday August 16, 2021 @11:06AM (#61697451) Journal

    ...this thing is competitive with AMD, let alone Nvidia. Certainly not from the get-go.

    • by Opportunist ( 166417 ) on Monday August 16, 2021 @11:13AM (#61697465)

      Depends. Is it going to be available?

      • Is it going to have open source drivers?

        • First I need a chance to get the hardware before I worry about whether the software is what I want.

        • Exactly 0.003% of any customer base cares about that.

          • Exactly. Somebody wake me up when SolidWorks and MasterCam run on *nix.

          • Yeah, "0.003%".... literally all computers, servers, mobile systems, etc, ... except for a few silly desktop PCs from the beforetime.

            Also, PROTIP: Professional rendering servers all run Linux. Pixar, for example. Because using Windows for professional work is just a sad joke. Professionals customize. They are not consumers. But they have a lot of money to spend.

            • Why exactly would I want a dedicated GPU in a server computer when practically all CPUs now have some kind of display driver capability baked in?

              I give you the Pixar argument, but I hope we can agree that Pixar is a VERY small market segment and very likely hunting in a very different league than the one we're talking about here.

        • git log linus/master..next/master -- drivers/gpu/drm/i915 says yes.
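
          That command lists the commits touching Intel's open-source i915 DRM driver that are queued in linux-next but not yet merged into Linus's mainline tree, i.e. active upstream driver work. A minimal sketch of checking it yourself, assuming a local kernel clone with remotes named "linus" and "next" (the remote names are just a local convention):

              # One-time setup: point remotes at mainline and linux-next
              git remote add linus https://git.kernel.org/pub/scm/linux/kernel/git/torvalds/linux.git
              git remote add next https://git.kernel.org/pub/scm/linux/kernel/git/next/linux-next.git
              git fetch linus && git fetch next

              # i915 commits in linux-next that have not yet reached mainline
              git log linus/master..next/master -- drivers/gpu/drm/i915

          A steady stream of commits there is a reasonable proxy for active open-source driver support.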

    • Oh, I dunno... the fragmentation of the market could lead to some serious innovation in the field of rendering a gun swaying back and forth on your screen.

    • Depends on the market. Sure, these will not compete with flagship GPUs like the RTX 3090 or 6900 XT, but they'll probably do okay against budget "I just need a GPU" cards like the GTX 1050.
    • If the drivers work, I'd definitely consider it competitive for myself.
    • Why not?

      Intel is not a rinky-dink little chip maker, a couple of grad students trying their first stint outside of academia, going against big companies whose R&D budgets exceed the amount of money they could collect in a decade. Intel is a big company, and they have been making GPUs, good-enough GPUs; they just weren't good enough for high-end gaming.

      Also, sometimes if you're the established brand, you get stuck with the issue that you have to do things the old way, in order to...

      • Why not?

        Intel is not a rinky-dink little chip maker, a couple of grad students trying their first stint outside of academia, going against big companies whose R&D budgets exceed the amount of money they could collect in a decade. Intel is a big company, and they have been making GPUs, good-enough GPUs; they just weren't good enough for high-end gaming.

        Most of us can remember them releasing shitty graphics cards, so yeah, I have no clue if they'll do it right this time, but I am underwhelmed. Also, their SSD offerings have been pretty underwhelming. When both came out, I kept reading articles pretending they were much more interesting than they sounded at the time... and ultimately turned out to be. I have no clue if the publications were paid to write glowing articles about unreleased products or it was just a slow news day and "exciting new market for Intel...

        • The infamous i740 graphics card.

        • by Junta ( 36770 )

          To reinforce your point on storage: it's an excellent example where they had a genuinely strong technology (phase-change memory) that was impractical because they priced it out of the market. They also had all sorts of weird, convoluted modes to use it as "almost RAM", either transparently (at horrible performance) or with an application micromanaging it (no developers are bothering, because it's more cost-effective just to hit up NAND flash over PCIe; it may not be as good, but it's close enough...

      • by Junta ( 36770 )

        Intel is a resourceful company, but their track record for going outside their core competency has not been that stellar. Even when they have big ideas, they can't seem to get partners to enable them. There's a revolving door of "hot new product in a field Intel hasn't previously been in" eternally displacing last year's experiments that fizzled out due to hardware limitations, inadequate software enablement, and/or just poor handling on the business side of things.

        Their GPUs historically have been "good enough"...

        • Even when they have big ideas, they can't seem to get partners to enable them.

          Eh, they have a reputation for screwing their partners; that's probably why nobody is all that enthusiastic about stepping up.

      • by gweihir ( 88907 )

        Intel has failed before. Several times in fact.

    • by darkain ( 749283 )

      depends on your definition of "competitive"

      The overwhelming majority of consumers are not purchasing flagship GPUs; that's mostly just something for people to wave their e-peens over. The large body of consumers buy mid-tier GPUs. Look at the Steam hardware surveys: the top of the list will always be the xx60/xx70 GPUs from Nvidia (and the equivalent AMD cards). Intel doesn't need a flagship GPU to beat out Nvidia and AMD to be a success in the market; they just need to have a price/performance-competitive mid-range...

    • You're not familiar with Intel Corp's history. Intel still has a huge financial advantage over AMD, and decades ago had a huge manufacturing/marketing advantage over AMD. Every so often, Intel would put out a hugely mediocre, if not markedly inferior, product. But everyone (manufacturers) would buy its inferior product, because at the end of the day, it's "how many semiconductors you can order at price X". AMD may have had the superior performance/value CPU, but it was a boutique manufacturer, and couldn't...

      • What AMD accomplished by buying ARM

        Ooops, I meant Nvidia buying ARM.

        • I guess Nvidia needed a non-useless GPU. Mali is scum from the bottom of the barrel, but at least it has semi-usable drivers these days; that's a huge step up over what Nvidia had before.

          New AMD requires opaque firmware blobs (boo!) but at least on the CPU side their drivers work OK. Thus, AMD is the only game in town for gaming[1] GPUs; let's see how DG2/Arc will do.

          [1] For normal-people games, that is.

          • New AMD requires opaque firmware blobs (boo!) but at least on the CPU side their drivers work OK. Thus, AMD is the only game in town for gaming[1] GPUs; let's see how DG2/Arc will do.

            Huh?
            By what fucking metric are nVidia GPUs useless? Except for the fact that you can't fucking get them, because they're gone the second they hit the shelves, a problem not unique to them.

      • no one wants to buy desktop CPUs anymore

        70 million units/year is far from "no one", and three times that for laptops. The correct point would be "fewer are buying desktops", though that downtrend has stabilized over the last few years.

        Meanwhile, the datacenter server market continues to expand exponentially, the result being that Intel+AMD revenue continues to increase.

    • Competitive where? Note they are competing with NVIDIA's GT 1030 here. How much have you researched GPUs too crappy to be used in gaming PCs?

    • by Luckyo ( 1726890 )

      The most popular GPU on Steam by far is the GTX 1060. The second most popular is the GTX 1050 Ti. Third is the GTX 1650.

      With current availability of GPUs, all they need to deliver is something that is better than those three while available and reasonably priced.

    • by gweihir ( 88907 )

      Well, what Intel has had so far indicates "Intel high-end" = "below the competition's low end". I'll believe they have anything better when I see it, not before. Of course, we have these utterly crazy prices now, so there may be a market even for low-end discrete graphics.

  • Weird video (Score:5, Informative)

    by EnsilZah ( 575600 ) <EnsilZahNO@SPAMGmail.com> on Monday August 16, 2021 @11:22AM (#61697501)

    Why show off gameplay from a few not particularly resource intensive games, released years ago?
    I could play PUBG on my 8-year-old computer; nothing to brag about.
    Looks like they're basically saying 'we can do the bare minimum expected.'

    • Considering the FUD campaign Intel launched against the Apple M1, Intel stretching the meaning of "gaming" is not surprising.
      • Oh, have the iBelievers moved on to calling anything questioning their delusion a "FUD campaign" by Intel now?

        Weird. I always thought I was an AMD guy... But if you say so...

        Call me when you get adequate cooling to prevent heat throttling and a real keyboard though. :)

        • Oh, have the iBelievers moved on to calling anything questioning their delusion a "FUD campaign" by Intel now?

          Did you actually evaluate Intel's points, or is your Apple hate so strong that you did not even look at Intel's claims and immediately dismissed any criticism? Fortunately others [engadget.com] have looked at Intel's claims and found problems [bgr.com]. Among the issues: cherry-picking non-standard benchmarks, selectively switching the models used in comparisons, and complaining that Macs failed certification tests designed only for Intel PCs. Did you do this? I am guessing not.

          Call me when you get adequate cooling to prevent heat throttling and a real keyboard though.

          Come back when you can stop showing your ignorance as you assume my...

    • by Ecuador ( 740021 )

      Why show off gameplay from a few not particularly resource intensive games, released years ago?

      Hopefully not because they are still trying to load textures through that slow AGP interface [wikipedia.org] :D

    • The caption is "See Intel Arc graphics pre-production silicon in action."

      Pre-production is the keyword. I expect the driver isn't fully coded, and they needed to demo an older game that still looks good but that the new driver can support.

    • Looks like they're basically saying 'we can do the bare minimum expected.'

      That's exactly what they are saying. If you're expecting an RTX 3090 competitor then you're in the wrong market here. Intel is competing against the GT 1030 here. The lowest of the low end.

      You're seeing the bare minimum because that's their target market. If you are a gamer then this is of no interest to you.

      • by mjwx ( 966435 )

        Looks like they're basically saying 'we can do the bare minimum expected.'

        That's exactly what they are saying. If you're expecting an RTX 3090 competitor then you're in the wrong market here. Intel is competing against the GT 1030 here. The lowest of the low end.

        You're seeing the bare minimum because that's their target market. If you are a gamer then this is of no interest to you.

        This... and whilst competition is a good thing, it is still up to Intel to make a decent gaming graphics chip. They've done well producing a cheap, low powered one.

        The best-case scenario I can see is that Intel makes a chip that is not quite as good as AMD/NVIDIA's lower-end offerings, think below-3060 territory, but can price it cheaper with a lower power draw. For almost a decade now I've bought NVIDIA Optimus laptops (NVIDIA GPU for gaming + integrated Intel graphics for long battery life; you switch between the two...

    • Maybe it's a $50 GPU.
    • by xwin ( 848234 )
      The market for cheap but capable GPUs is 100 times larger than the market for powerful GPUs capable of playing modern 3D games. I have an NVIDIA GPU in my work laptop but I never use it, because I do not play games at work. I use the integrated Intel GPU 99.9% of the time. If this GPU can support ML inference and training at some decent level, it can seriously upset the NVIDIA and AMD market. NVIDIA and AMD will sell most of their cards due to the crypto-mining craze, but for end users of laptops and desktops it does not matter...
    • Well, to be fair, they do demonstrate they can run Crysis... :)
  • Honestly, if Intel just puts their effort into making sure their cards don't get diverted to the resale/crypto-mining markets and actually puts their cards into the hands of gamers, I'd give them a try. Neither AMD nor NVIDIA is doing right by the loyal gaming customers that made them what they are. They either can't or won't care who buys their products, or make any meaningful effort to prevent the secondary market.

    I am sure Intel would feel the same way... if you can sell out of your hardware, who cares?

    • You're grasping at straws if you think a business is going to be dedicated to customers rather than dedicated to profit.

      • by Luckyo ( 1726890 )

        When you go into a market you're new to, the focus is often on market share rather than profit. Profit comes once decent market share is secured.

      • by Z80a ( 971949 )

        There are many benefits to your profits from dedicating yourself to customers, or at least appearing to do so:
        word of mouth, brand loyalty, support from game publishers...
        There's a reason why, since the early '80s, game companies have kept the "console war" alive and thriving.
        Why pay billions to manufacture buzz about your product when you can have it for free?

        • The perception of dedication to customers is good, but good customer service and customer loyalty are a side effect of doing good business, not the ultimate goal.

          Even fiercely loyal customers can't save a failing business. Harley-Davidson customers tattoo its logo on themselves by the thousands, but that company is still suffering from financial problems that make its future uncertain.

          • by Z80a ( 971949 )

            Well yes, but you still have to give your "shills" something to talk about.
            Cryptominers will not convince people to buy your product and will ditch you the moment it's no longer profitable, probably wrecking your sales by selling off the cards they got for cheap.

    • by Rhipf ( 525263 )

      How exactly do you guarantee that cards get into the hands of gamers?
      Unless you go to each purchaser's home and watch them put the card into their computer and then watch them play games, there really isn't much a company can do to make sure that graphics cards get into the hands of gamers.
      It is easy to say I am a gamer and then use the graphics card for whatever I like that is not gaming related.

      cash in hand ready to buy from whoever can put a card with an MSRP price tag in my hands.

      There is your problem right there. As long as there are more people willing to pay more than MSRP than there are...

      • by spitzak ( 4019 )

        How exactly do you guarantee that cards get into the hands of gamers?

        Possibly by somehow making the cards unusable for mining. It could be designed so that all results are filled triangles, i.e. you can't get single numerical calculations out of it without paying for the overhead of drawing multiple results. I have no idea, but it seems plausible.

        • Possibly by somehow making the cards unusable for mining.

          Crypto-mining relies on the ability of a chip to perform certain mathematical computations quickly and efficiently, which unfortunately is exactly the strength GPUs need to render graphics. No, not all mathematical functions are performed better by GPUs than CPUs, but the kinds required by crypto-mining are among them. Essentially you are asking a GPU to be worse at math.

          • by spitzak ( 4019 )

            I certainly don't know enough about this to state whether it is plausible, but what I was thinking was that the GPU is designed in such a way that it produces a large number of correlated results at once. These would be designed for graphics purposes, and the hope is that graphics would be able to use almost the full capabilities of the chip, while somebody doing other calculations would get the answer they want plus hundreds of irrelevant correlated results, and thus only be able to use a fraction of the capabilities.

            • GPUs already produce correlated results at once. This is the basis of both graphics and crypto-mining. But your other suggestion would be terrible. CPUs and GPUs are tools, and they should work as intended: they should compute whatever calculations are thrown at them accurately. Asking a chip to deliberately produce erroneous results is asking for trouble, as the GPU/CPU does not need to know what the computation results are for. For this reason, GPUs are used in supercomputers, where some of the computations...
            • Please stop. If you want to make comments that make sense, at least learn a little more about the field you're commenting in.
      • I don't know how they guarantee they get their cards into gamers' hands. But then again, I'm not the corporation with a product to sell.

        And you're right. Having principles, like not paying scalpers and encouraging their behavior, or not buying a thermally abused card from a cryptominer once it's no longer profitable to run and encouraging their behavior, means that I have to wait to get the deal I want on the terms I want.

        It's not wrong for me to want a "new player" in the discrete graphics card market to at least...

        • by Rhipf ( 525263 )

          My point was, how do you know the person buying the card is a gamer?
          The purchaser could be a gamer, a videographer, a photographer, a spreadsheet user, an Internet browsing addict, a cryptominer, etc. (yes not all of these need powerful graphics cards but that won't necessarily stop people from thinking they need the latest and greatest card anyway).
          It is easy enough to say you are a gamer at time of purchase and then use the card for anything but gaming.

          There is nothing wrong with "wait[ing] to get the deal I want on the terms I want"; you just have to realize that there may never be a seller that is willing to meet your terms. Adding a new player to the game won't necessarily help any.

          • My point was, how do you know the person buying the card is a gamer?
            The purchaser could be a gamer, a videographer, a photographer, a spreadsheet user, an Internet browsing addict, a cryptominer, etc. (yes not all of these need powerful graphics cards but that won't necessarily stop people from thinking they need the latest and greatest card anyway).
            It is easy enough to say you are a gamer at time of purchase and then use the card for anything but gaming.

            There is nothing wrong with "wait[ing] to get the deal I want on the terms I want"; you just have to realize that there may never be a seller that is willing to meet your terms. Adding a new player to the game won't necessarily help any.

            Yeah I know that "paying MSRP" is a pipe dream. Such an unrealistic expectation. Totally not possible to ever again get a video card without paying a scalper. You're a joke. The whole point of my post was that "If Intel does try to change the market it would be a good thing for the market". The point of your post seems to be "Nobody will ever sell you a video card for MSRP again so if you can't figure out how Intel can fix the market, then you'll just have to buy from a scalper, LOL."

            • by Rhipf ( 525263 )

              No, as I stated, the point of my post was HOW DO YOU KNOW THE PERSON YOU ARE SELLING TO IS A GAMER.
              Unless there is some kind of hardware limitation that can make the cards useless for cryptocurrency mining, then all the miners need to do is say they are gamers but use the card for mining. The problem with adding hardware limitations is that those limitations will most likely also impede gaming performance, and then gamers will complain that the cards suck for gaming.

  • by jacks smirking reven ( 909048 ) on Monday August 16, 2021 @11:56AM (#61697677)

    From the early reviews I have seen of the OEM-only Xe card that is already deployed, it seems like Intel may focus on workstation/encoding tasks first, and the card puts up decent numbers for those applications. Gaming I imagine will be mediocre at first, and maybe for a long while, but it might be welcome competition for the very expensive Quadro and Radeon Pro lines.

    Overall I hope Intel can expand on this in the future, having a triopoly would be preferable to the current market for GPUs.

  • by thegarbz ( 1787294 ) on Monday August 16, 2021 @12:51PM (#61697897)

    Just try Googling "Intel Arc".

    What's the very first result: https://ark.intel.com/ [intel.com]
    Along with the very helpful:
    "Showing results for intel ark
    Search instead for intel arc"

    When even Google thinks you are confusing your own product with something else on your site, what hope have you got?

    • That’s just Google being terrible as usual and giving you popular results instead of accurate results. Search for Devuan and Google thinks it was a typo for Debian. It made me realize what an awful name the project picked.

      • Re-read what you just wrote. Google has determined that Intel has a popular result for "arc" called "ark" and offers links to it. That should tell you more than enough about Intel's naming choice.

        "Hey Bob, have you heard of Intel's new product Arc?"
        "Ark isn't a product Bill, it's a platform for looking up Intel products."
        "No Bob, that's Arc with a C!"
        "Bill, you've been off the rails since your state legalised weed. Ark is clearly spelled with a K!"

        I for one am interested in what product arc the Arc will follow.

    • just add -ark
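
      (Google's minus operator excludes results containing a term, and quoting forces the exact phrase, so something like:

          "intel arc" -ark

      should keep the Ark product-database pages out of the results.)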

      • Is that arc with a c or ark with a k? I mean the button is there to search for arc with a c instead, but the instinctive reaction will be "oh Intel's website returned hundreds of results for ark with a k, clearly I must have misspelled it!"
        Followed shortly after with "I thought Arc was a GPU, why am I getting CPU results!"

        • Google runs a clustering algorithm on your search terms and includes nearby terms in the results, depending on how strongly correlated they are. In Google's infinite wisdom they decided you can't turn this off, sorry. Never mind that the amount of effort that went into the clustering algorithm completely dwarfs the effort that would be needed to make their search interface not suck. Google: ground zero for self-appointed smart people.

          • Google runs a clustering algorithm on your search terms and includes nearby terms in the results

            And this is a *good* thing, as humans, when searching, are often not quite sure what they are looking for, and especially when something is more complex they frequently misspell it or use an incorrect term.

            There's nothing wrong with their search interface. Search engines are a dime a dozen, and there's a reason Google has the market share it does. Not suiting *your* wants (I dare not use the word needs) doesn't make it sucky.

    • Google search often ignores your search term and replaces it with something it thinks is what you wanted. Ignoring half of your search terms is just stupid.

      If you find you are clicking "Search instead for..." more often than you think you should, start trying other search engines.
      • Google search often ignores your search term and replaces it with something it thinks is what you wanted. Ignoring half of your search terms is just stupid.

        I know what Google is doing. Google thinks Intel has thousands of pages entitled Ark and that must be what the user was looking for when they wrote Arc. Are they wrong? Does Intel in fact not have thousands of pages with a name that sounds phonetically identical and is a single letter away from another product they just announced?

        It's not Google in the wrong here. But I for one look forward to following Intel Arc's product arc on their Ark pages.

  • I've come to limit my involvement with Intel products to CPUs, and of course any other equipment I might own that inadvertently contains Intel components unbeknownst to me. This was a direct result of their foray into cable modem components and firmware, namely Puma. It was such a shitshow for so long, and for many it continues to be. Just a plain buggy, bad piece of firmware that they hummed and hawed about for several years before announcing fixes, no doubt in response to the Puma class action suits presented...
