Intel Enters the PC Gaming GPU Battle With Arc
Dave Knott writes: Intel is branding its upcoming consumer GPUs as Intel Arc. This new Arc brand will cover both the hardware and software powering Intel's high-end discrete GPUs, as well as multiple hardware generations. The first of these, known previously as DG2, is expected to arrive under the codename "Alchemist" in Q1 2022. Intel's Arc GPUs will be capable of mesh shading, variable rate shading, video upscaling, and real-time ray tracing. Most importantly, Intel is also promising AI-accelerated super sampling, which sounds like Intel has its own competitor to Nvidia's Deep Learning Super Sampling (DLSS) technology.
Re:Intel vs. NVIDIA -- antitrust perspective (Score:5, Informative)
Why? Buying Arm was removing a competitor; deciding to start making a new product is adding a competitor to a market.
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
AMD's acquisition of ATI brought in a huge influx of engineering talent. Alternatively, ATI could have been left to be crushed by nVidia.
Re: (Score:1)
Re: (Score:2)
Someone is forgetting 3DFX and S3. At a minimum.
Re: (Score:2)
True that. Add SGI's Reality coprocessor.
Re: (Score:1)
Re: (Score:2)
That, and there is little evidence that Intel is going to be successful against Nvidia. Microsoft, which had dominance in operating systems and office software, tried to compete against the Apple iPod with the Microsoft Zoom. The Zoom, while it had some success, never really got anywhere; I expect because the portable music player market got absorbed into the touchscreen smartphone market.
Re: Intel vs. NVIDIA -- antitrust perspective (Score:5, Funny)
Maybe it failed because no one could remember what it was called.
Re: (Score:2)
Wasn't it "Zune"....
I think it was...
Re: (Score:2)
Re: (Score:2)
I swear it was just the thought of Ballmer squirting that killed Zune.
Re: (Score:2)
Pretty sure the brown colour didn't help.
Re: (Score:2)
Zune HD used an Nvidia Tegra APX 2600, derived from an earlier PortalPlayer design (used in earlier Apple iPods). It was also used in the disastrous Microsoft Kin mobile phone.
Re: (Score:2)
NV and AMD can't keep dGPUs on the market. They keep selling out. As long as Intel's product can successfully run DirectX 9/11/12 applications it will find buyers.
Re: (Score:2)
Yeah, assuming it gets produced. Intel doesn't know how to fab GPUs, so they'll be sharing fab space with AMD at TSMC.
Of course you're right: if they can deliver a 50% uplift over Polaris from 5 years ago at ~$200, it will sell like hotcakes while still being massively profitable.
Re: (Score:2)
That and there is little evidence that Intel is going to be successful against Nvidia.
What makes you say that? Intel already has a 68% market share for GPUs, and NVIDIA only has 15%. NVIDIA is a small fish in the drawing-pixels-on-screen market.
Not everything is about running Crysis Remastered with RTX on. This market is quite price sensitive, and even a crap product, if priced correctly, can net you a huge portion of the GPU market. Like 68% of it.
Re: (Score:2)
What is the reasoning again? Intel is changing their strategy with an existing product. Intel has been making integrated GPUs for a long time. Now they are separating those GPUs and selling them as a product. How is that anticompetitive for Intel to modify their own existing product?
NVidia is buying out another company which some of their rivals and competitors rely upon. There is concern for anticompetitive behavior.
Re: (Score:2)
nvidia tried to buy an existing vendor.
intel created its own from scratch.
There's no way... (Score:3)
...this thing is competitive with AMD let alone Nvidia. Certainly not from the get-go.
Re:There's no way... (Score:5, Insightful)
Depends. Is it going to be available?
Re: (Score:3)
Is it going to have open source drivers?
Re: (Score:2)
First I have to have a chance to have the hardware before I wonder whether the software is what I want.
Re: (Score:3)
Exactly 0.003% of any customer base cares about that.
Re: (Score:1)
Exactly. Somebody wake me up when SolidWorks and MasterCam run on *nix.
Re: (Score:1)
Yeah, "0.003%".... literally all computers, servers, mobile systems, etc, ... except for a few silly desktop PCs from the beforetime.
Also, PROTIP: Professional rendering servers all run Linux. Pixar, for example. Because using Windows for professional work is just a sad joke. Professionals customize. They are not consumers. But they have a lot of money to spend.
Re: (Score:2)
Why exactly would I want a dedicated GPU in a server when practically all CPUs now have some kind of display capability baked in?
I'll give you the Pixar argument, but I hope we can agree that Pixar is a VERY small market segment, and very likely hunting in a different league than the one we're talking about here.
Re: (Score:3)
git log linus/master..next/master -- drivers/gpu/drm/i915 says yes.
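For anyone who wants to try the parent's check without a kernel tree handy, here is a toy repo demonstrating the `A..B -- path` syntax it relies on. (The real check needs a Linux kernel checkout with remotes for mainline and linux-next; the local branch name `linus` below just stands in for mainline.)

```shell
# Build a throwaway repo with an i915-like path and two commits.
set -e
workdir=$(mktemp -d)
cd "$workdir"
git init -q repo
cd repo
git config user.email demo@example.com
git config user.name demo
mkdir -p drivers/gpu/drm/i915
echo 'int a;' > drivers/gpu/drm/i915/a.c
git add -A && git commit -qm "i915: add a.c"
git branch linus            # pretend this snapshot is mainline
echo 'int b;' > drivers/gpu/drm/i915/b.c
git add -A && git commit -qm "i915: add b.c"
# Commits reachable from HEAD but not from 'linus' that touch the path,
# i.e. driver work queued but not yet merged to mainline:
git log --oneline linus..HEAD -- drivers/gpu/drm/i915
```

Only the second commit is printed, which is exactly why a non-empty listing on the real tree indicates active open-source driver development.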
Re: (Score:2)
Oh I dunno.. the fragmentation of the market could lead to some serious innovation in the field of rendering a gun swaying back and forth on your screen.
Re: (Score:2)
Re: (Score:2)
The 1060 is the most used GPU on Steam. That market you identified, or thereabouts, is the biggest one.
Re: (Score:2)
Re: (Score:3)
Why not?
Intel is not some rinky-dink little chip maker, a couple of grad students on their first stint outside of academia, going up against big companies whose R&D budgets exceed the money they could collect in a decade. Intel is a big company, and they have been making GPUs, and they have been good-enough GPUs; they just weren't good enough for high-end gaming.
Also, sometimes if you're the established brand, you get stuck with the issue that you have to do things the old way, in order to
They have a history of GPU/storage underwhelms... (Score:3)
Why not?
Intel is not some rinky-dink little chip maker, a couple of grad students on their first stint outside of academia, going up against big companies whose R&D budgets exceed the money they could collect in a decade. Intel is a big company, and they have been making GPUs, and they have been good-enough GPUs; they just weren't good enough for high-end gaming.
Most of us can remember them releasing shitty graphics cards, so yeah, I have no clue if they'll do it right this time, but I am underwhelmed. Their SSD offerings have been pretty underwhelming too. When both came out, I kept reading articles pretending they were much more interesting than they sounded at the time... and ultimately ended up being. I have no clue if the publications were paid to write glowing articles about unreleased products or it was just a slow news day and "exciting new market for In
Re: (Score:2)
The infamous i740 graphics card.
Re: (Score:2)
To reinforce your point on storage: it's an excellent example where they had genuinely strong technology (phase-change memory) that was impractical because they priced it out of the market. They also have all sorts of weird, convoluted modes to use it as 'almost RAM', either transparently (at horrible performance) or with an application micromanaging it (no developers are bothering, because it's more cost-effective just to hit up NAND flash over PCIe, and while it may not be as good, it's close enough
Re: (Score:3)
Intel is a resourceful company, but their track record for going out of their core competency has not been that stellar. Even when they have big ideas, they can't seem to get partners to enable them. There's a revolving door of 'hot new product in a field Intel hasn't previously been in' eternally displacing last years experiments that fizzled out due to hardware limitations, inadequate software enablement, and/or just poor handling on the business side of things.
Their GPUs historically have been 'good eno
Re: There's no way... (Score:3)
Even when they have big ideas, they can't seem to get partners to enable them.
Eh, they have a reputation for screwing their partners, that's probably why nobody is all that enthusiastic about stepping up.
Re: (Score:2)
Intel has failed before. Several times in fact.
Re: (Score:2)
depends on your definition of "competitive"
The overwhelming majority of consumers are not purchasing flagship GPUs; that's mostly just something for people to wave their e-peens over. The large body of consumers buys mid-tier GPUs. Look at the Steam hardware surveys: the top of the list will always be the xx60/xx70 GPUs from Nvidia (and equivalent AMD). Intel doesn't need a flagship GPU that beats Nvidia and AMD to be a success in the market; they just need to have a price/performance-competitive mid-ra
Re: (Score:2)
You're not familiar with Intel Corp's history. Intel still has a huge financial advantage over AMD, and decades ago had a huge manufacturing/marketing advantage over AMD. Every so often, Intel would put out a hugely mediocre, if not markedly inferior, product. But everyone (manufacturers) would buy its inferior product, because at the end of the day, it's "how many semiconductors can you order at price X". AMD may have had the superior performance/value CPU, but it was a boutique manufacturer, and couldn'
Re: (Score:2)
What AMD accomplished by buying ARM
Ooops, I meant Nvidia buying ARM.
Re: (Score:2)
I guess Nvidia needed a non-useless GPU. Mali is scum from the bottom of the barrel, but at least it has semi-usable drivers these days; that's a huge step up over what Nvidia had before.
New AMD requires opaque firmware blobs (boo!), but at least on the CPU side their drivers work ok. Thus, AMD is the only game in town for gaming[1] GPUs; let's see how DG2/Arc will do.
[1] For normal-people games, that is.
Re: (Score:1)
New AMD requires opaque firmware blobs (boo!), but at least on the CPU side their drivers work ok. Thus, AMD is the only game in town for gaming[1] GPUs; let's see how DG2/Arc will do.
Huh?
By what fucking metric are nVidia GPUs useless? Except for the fact that you can't fucking get them, because they're gone the second they hit the shelves- a problem not unique to them.
Re: (Score:2)
If Intel succeeds with its buyout of SiFive, they will also need a competitive GPU to go along with a RISC-V CPU.
Dunno. A friend works on Intel GPU drivers (but he also did parts of Nouveau in the past, which suggests the former might likewise be made of explodium :p). I keep bugging him to borrow my Unmatched to test+port the drivers, without success so far. I guess more booze is needed. :p
Re: (Score:2)
no one wants to buy desktop CPUs anymore
70 million units/year is far from "no one", and three times that for laptops. The correct point would be "fewer are buying desktops", though that downtrend has stabilized over the last few years.
Meanwhile, the datacenter server market continues to expand exponentially, with the result that Intel+AMD revenue continues to increase.
Re: (Score:2)
Competitive where? Note they are competing with NVIDIA's GT 1030 here. How much have you researched GPUs too crappy to be used in gaming PCs?
Re: (Score:2)
The most popular GPU on Steam by far is the 1060. Second most popular is the 1050 Ti. Third is the 1650.
With current availability of GPUs, all they need to deliver is something that is better than those three while available and reasonably priced.
Re: (Score:2)
Well, what Intel has shown so far indicates "Intel high-end" = "below the competition's low end". I'll believe they have anything better when I see it, not before. Of course, we have these utterly crazy prices now, so there may be a market even for low-end discrete graphics.
Weird video (Score:5, Informative)
Why show off gameplay from a few not particularly resource intensive games, released years ago?
I could play PUBG on my 8 year old computer, nothing to brag about.
Looks like they're basically saying 'we can do the bare minimum expected.'
Re: (Score:2)
Re: (Score:1)
Oh, have the iBelievers moved on to calling anything questioning their delusion a "FUD campaign" by Intel now?
Weird. I always thought I was an AMD guy... But if you say so...
Call me when you get adequate cooling to prevent heat throttling and a real keyboard though. :)
Re: (Score:2)
Oh, have the iBelievers moved on to calling anything questioning their delusion a "FUD campaign" by Intel now?
Did you actually evaluate Intel's points, or is your Apple hate so strong that you did not even look at Intel's claims and immediately dismissed any criticism? Fortunately, others [engadget.com] have looked at Intel's claims and found problems [bgr.com]. Among the issues: cherry-picking non-standard benchmarks, selectively switching the models used in comparisons, and complaining that Macs failed certification tests designed only for Intel PCs. Did you do this? I am guessing not.
Call me when you get adequate cooling to prevent heat throttling and a real keyboard though.
Come back when you can stop showing your ignorance as you assume my
Re: (Score:2)
Why show off gameplay from a few not particularly resource intensive games, released years ago?
Hopefully not because they are still trying to load textures through that slow AGP interface [wikipedia.org] :D
Re: (Score:2)
The caption is "See Intel Arc graphics pre-production silicon in action."
"Pre-production" is the keyword. I expect the driver isn't fully coded, and they needed to demo against an older game that still looks good but that the new driver can already support.
Re: (Score:2)
Looks like they're basically saying 'we can do the bare minimum expected.'
That's exactly what they are saying. If you're expecting an RTX 3090 competitor, then you're in the wrong market here. Intel is competing against the GT 1030 here. The lowest of the low end.
You're seeing the bare minimum because that's their target market. If you are a gamer, then this is of no interest to you.
Re: (Score:2)
Looks like they're basically saying 'we can do the bare minimum expected.'
That's exactly what they are saying. If you're expecting an RTX 3090 competitor, then you're in the wrong market here. Intel is competing against the GT 1030 here. The lowest of the low end.
You're seeing the bare minimum because that's their target market. If you are a gamer, then this is of no interest to you.
This... and whilst competition is a good thing, it is still up to Intel to make a decent gaming graphics chip. They've done well producing a cheap, low-powered one.
The best-case scenario I can see is that Intel makes a chip that is not quite as good as AMD/NVIDIA's lower-end offerings, think below-3060 territory, but can price it cheaper with a lower power draw. For almost a decade now I've bought NVIDIA Optimus laptops (NVIDIA GPU for gaming + Intel iGPU for long battery life; you switch between the two
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
AMD and NVIDIA need this (Score:1)
Honestly, if Intel just puts in the effort to make sure their cards don't get diverted to the resale/crypto-mining markets and actually puts their cards into the hands of gamers, I'd give them a try. Neither AMD nor NVIDIA is doing right by the loyal gaming customers that made them what they are. They literally can't, or won't, care who buys their products, or make any meaningful effort to prevent the secondary market.
I am sure Intel would feel the same way... if you can sell out of your hardware, who cares
Re: (Score:2)
You're grasping at straws if you think a business is going to be dedicated to customers rather than dedicated to profit.
Re: (Score:2)
When you go into a market you're new to, the focus is often on market share rather than profit. Profit comes once decent market share is secured.
Re: (Score:2)
There are many benefits for your profits by dedicating yourself to customers or at least appearing to do so.
Word of mouth, brand loyalty, support from game publishers...
There's a reason why since the early 80's game companies keep the "console war" alive and thriving.
Why pay billions to manufacture people talking about your product when you can have it for free?
Re: (Score:2)
The perception of dedication to customers is good, but good customer service and customer loyalty are a side effect of doing good business, not the ultimate goal.
Even fiercely loyal customers can't save a failing business. Harley-Davidson customers tattoo its logo by the thousands, but that company is still suffering from financial problems that make its future uncertain.
Re: (Score:2)
Well yes, but you still have to give something to your "shills" to talk about.
Cryptominers will not convince people to buy your product, and they will ditch you the moment it's no longer profitable, probably wrecking your sales by selling off the cards they got, for cheap.
Re: (Score:2)
How exactly do you guarantee that cards get into the hands of gamers?
Unless you go to each purchasers home and watch them put the card into their computer and then watch them play the games there really isn't much a company can do to make sure that graphics cards get into the hands of gamers.
It is easy to say I am a gamer and then use the graphics card for whatever I like that is not gaming related.
cash in hand ready to buy from whoever can put a card with an MSRP price tag in my hands.
There is your problem right there. As long as there are more people willing to pay more than MSRP than there a
Re: (Score:2)
How exactly do you guarantee that cards get into the hands of gamers?
Possibly by somehow making the cards unusable for mining. They could be designed so that all results are filled triangles, i.e., you can't get single numerical calculations out of them without paying the overhead of drawing multiple results. I have no idea, but it seems plausible.
Re: (Score:2)
Possibly by somehow making the cards unusable for mining.
Crypto-mining relies on a chip's ability to perform certain mathematical computations quickly and efficiently, which unfortunately is exactly the strength that lets GPUs render graphics. No, not all mathematical functions are performed better by GPUs than by CPUs, but the kinds required by crypto-mining are. Essentially, you are asking a GPU to be worse at math.
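To make that concrete, here is a toy Python sketch of the proof-of-work loop miners run (illustrative only, not real mining code; the header bytes and the easy target are made up for the demo). Each nonce can be tried independently of every other one, which is exactly the embarrassingly parallel workload shape that GPUs are built for:

```python
import hashlib

def mine(header: bytes, target: int, max_nonce: int = 1_000_000):
    """Try nonces until the double-SHA256 of header+nonce falls below target."""
    for nonce in range(max_nonce):
        digest = hashlib.sha256(
            hashlib.sha256(header + nonce.to_bytes(8, "little")).digest()
        ).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce, digest
    return None

# A very easy target (top 8 bits of the 256-bit hash must be zero),
# so the demo succeeds after a few hundred tries on average:
result = mine(b"toy-block-header", 1 << 248)
```

There is no shortcut through the loop; the only way to find a winning nonce faster is to run more of these independent hash attempts in parallel, which is why "make the GPU bad at mining without making it bad at graphics" is such a hard design target.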
Re: (Score:2)
I certainly don't know enough about this to state whether it is plausible, but what I was thinking was that the GPU is designed in such a way that it produces a large number of correlated results at once. These would be designed for graphics purposes, and the hope is that graphics work could use almost the full capability of the chip, while somebody doing other calculations would get the answer they want plus hundreds of irrelevant correlated results, and thus only be able to use a fraction of the capability.
Re: (Score:2)
Re: (Score:2)
Re: (Score:1)
I don't know how they guarantee they get their cards into gamers' hands. But then again, I'm not the corporation with a product to sell.
And you're right. Having principles like not paying scalpers and encouraging their behavior, or buying a thermally abused card that's no longer profitable to run from a cryptominer and encouraging their behavior means that I have to wait to get the deal I want on the terms I want.
It's not wrong for me to want a "new player" in the discrete graphics card market to at
Re: (Score:2)
My point was, how do you know the person buying the card is a gamer?
The purchaser could be a gamer, a videographer, a photographer, a spreadsheet user, an Internet browsing addict, a cryptominer, etc. (yes not all of these need powerful graphics cards but that won't necessarily stop people from thinking they need the latest and greatest card anyway).
It is easy enough to say you are a gamer at time of purchase and then use the card for anything but gaming.
There is nothing wrong with "wait[ing] to get the dea
Re: (Score:1)
My point was, how do you know the person buying the card is a gamer?
The purchaser could be a gamer, a videographer, a photographer, a spreadsheet user, an Internet browsing addict, a cryptominer, etc. (yes not all of these need powerful graphics cards but that won't necessarily stop people from thinking they need the latest and greatest card anyway).
It is easy enough to say you are a gamer at time of purchase and then use the card for anything but gaming.
There is nothing wrong with "wait[ing] to get the deal I want on the terms I want" you just have to realize that there may never be a seller that is willing to meet your terms. Adding a new player to the game won't necessarily help any.
Yeah I know that "paying MSRP" is a pipe dream. Such an unrealistic expectation. Totally not possible to ever again get a video card without paying a scalper. You're a joke. The whole point of my post was that "If Intel does try to change the market it would be a good thing for the market". The point of your post seems to be "Nobody will ever sell you a video card for MSRP again so if you can't figure out how Intel can fix the market, then you'll just have to buy from a scalper, LOL."
Re: (Score:2)
No, as I stated, the point of my post was HOW DO YOU KNOW THE PERSON YOU ARE SELLING TO IS A GAMER.
Unless there is some kind of hardware limitation that can make the cards useless for cryptocurrency mining then all the miners need to do is say they are gamers but use the card for mining. The problem with adding hardware limitations is that those limitations will most likely also impede gaming performance and then gamers will complain that the cards suck for gaming.
Good option for workstations (Score:3)
From the early reviews I have seen of the OEM-only Xe card that has already shipped, it seems like Intel may focus on workstation/encoding tasks first, and the card put up decent numbers for those applications. Gaming, I imagine, will be mediocre at first and maybe for a long while, but it might be welcome competition for the very expensive Quadro and Radeon Pro lines.
Overall I hope Intel can expand on this in the future, having a triopoly would be preferable to the current market for GPUs.
Braindead name (Score:5, Funny)
Just try Googling "Intel Arc".
What's the very first result: https://ark.intel.com/ [intel.com]
Along with the very helpful:
"Showing results for intel ark
Search instead for intel arc"
When even Google thinks you are confusing your own product with something else on your site, what hope have you got?
Re: (Score:2)
That’s just google being terrible as usual and giving you popular results instead of accurate results. Search for Devuan and google thinks it was a typo for Debian. It made me realize how the project picked that awful name.
Re: (Score:2)
Re-read what you just wrote. Google has determined that Intel has a popular result for "arc" called "ark" and offers links to it. That should tell you more than enough about Intel's naming choice.
"Hey Bob, have you heard of Intel's new product Arc?"
"Ark isn't a product Bill, it's a platform for looking up Intel products."
"No Bob, that's Arc with a C!"
"Bill, you've been off the rails since your state legalised weed. Ark is clearly spelled with a K!"
I for one am interested in what product arc the Arc will follow.
Re: (Score:2)
just add -ark
Re: (Score:2)
Is that arc with a c or ark with a k? I mean the button is there to search for arc with a c instead, but the instinctive reaction will be "oh Intel's website returned hundreds of results for ark with a k, clearly I must have misspelled it!"
Followed shortly after with "I thought Arc was a GPU why am I getting CPU results!"
Re: (Score:2)
Google runs a clustering algorithm on your search terms and includes nearby terms in the results, depending on how strongly correlated. In Google's infinite wisdom they decided you can't turn this off, sorry, never mind that the amount of effort that went into the clustering algorithm completely dwarfs the effort that would be needed to make their search interface not suck. Google, ground zero for self-appointed smart people.
Re: (Score:2)
Google runs a clustering algorithm on your search terms and includes nearby terms in the results
And this is a *good* thing, as humans, when searching, are often not quite sure what they are searching for, and especially when something is more complex they frequently misspell it or use an incorrect term.
There's nothing wrong with their search interface. Search engines are a dime a dozen, and there's a reason Google has the market share it does. Not suiting *your* wants (I dare not use the word needs) doesn't make it sucky.
Re: (Score:2)
If you find you are clicking "Search instead for.." more than you think you should start trying other search engines.
Re: (Score:2)
Google search often ignores your search term and replaces it with something it thinks is what you wanted. Ignoring half of your search terms is just stupid.
I know what Google is doing. Google thinks Intel has thousands of pages entitled Ark and that must be what the user was looking for when they wrote Arc. Are they wrong? Does Intel in fact not have thousands of pages with a name that sounds phonetically identical and is a single letter away from another product they just announced?
It's not Google in the wrong here. But I for one look forward to following Intel Arc's product arc on their Ark pages.
Puma, anyone? (Score:2)
I've come to limit my involvement with Intel products to CPUs, and of course any other equipment I might own that inadvertently contains Intel components unbeknownst to me. This was a direct result of their foray into cable-modem components and firmware, namely Puma. It was such a shitshow for so long, and for many it continues to be. Just a plain buggy, bad piece of firmware that they hummed and hawed about for several years before announcing fixes, no doubt in response to the Puma class-action suits presen