Intel To Enter Bitcoin Mining Market With Energy-Efficient GPU (pcmag.com) 52
Intel is entering the blockchain mining market with an upcoming GPU capable of mining Bitcoin. From a report: Intel insists the effort won't put a strain on energy supplies or deprive consumers of chips. The goal is to create the most energy-efficient blockchain mining equipment on the planet, it says. "We expect that our circuit innovations will deliver a blockchain accelerator that has over 1,000x better performance per watt than mainstream GPUs for SHA-256 based mining," Intel's General Manager for Graphics, Raja Koduri, said in the announcement. (SHA-256 is a reference to the mining algorithm used to create Bitcoins.)
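For readers who haven't followed mining: "SHA-256 based mining" means repeatedly double-SHA-256 hashing a candidate block header until the result falls below a difficulty target. A minimal Python sketch with a toy header and a deliberately easy target (not real Bitcoin parameters):

    import hashlib
    import struct

    def meets_target(header: bytes, target: int) -> bool:
        """Bitcoin-style proof of work: double SHA-256 of the block header,
        read as a little-endian integer, must fall below the target."""
        digest = hashlib.sha256(hashlib.sha256(header).digest()).digest()
        return int.from_bytes(digest, "little") < target

    # Toy example: try nonces until a hash below the (easy) target appears.
    # Real mining uses an 80-byte header and a vastly smaller target.
    prefix = b"example block header"   # stand-in for version/prev hash/merkle root/time/bits
    target = 1 << 243                  # easy target so the loop finishes quickly
    nonce = 0
    while not meets_target(prefix + struct.pack("<I", nonce), target):
        nonce += 1
    print("found nonce:", nonce)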
News of Intel's blockchain-mining effort first emerged last month after the ISSCC technology conference posted details about an upcoming Intel presentation titled: "Bonanza Mine: An Ultra-Low-Voltage Energy-Efficient Bitcoin Mining ASIC." ASICs are chips designed for a specific purpose, and also refer to dedicated hardware to mine Bitcoin. Friday's announcement from Koduri added that Intel is establishing a new "Custom Compute Group" to create chip platforms optimized for customers' workloads, including for blockchains.
Sounds more impressive than it is (Score:1)
bitcoin mining moved away from GPGPUs and onto ASICs a long time ago. Why bother with a totally energy-efficient GPGPU if you can get even better performance-per-watt from an ASIC that's been optimised to do this one thing, run lots and lots of SHA256 operations in parallel, for, oh, going on a decade now?
Re: (Score:3)
Straw-man comparison...definitely. Something Intel has done for decades now. It's why places like Tom's Hardware and Anandtech were initially so useful. They'd tell you the truth and would test lots of use cases.
Re: (Score:1)
Re: (Score:2)
If you're just going by what msmash led off with:
instead of digging into the underlying article, it's entirely reasonable to be (mis)led into thinking that Intel is trying to bring GPUs back to Bitcoin mining. As a miner, I had my own skepticism of the assertion, which was borne out by TFA. I bought some of the earliest ASIC miners back in 2013 (still have my Butterfly Labs 7-GH/s cubes that shipped in place of
Remember when GPUs were for (Score:2)
Re: (Score:2)
Re: Remember when GPUs were for (Score:2)
Re: (Score:2)
I use my new GPU for games. It's not that you can't get them. They're just not cheap. So poors can't have new ones.
Some people think this is ruining PC gaming. I don't know. PC gaming has a lot of depth; older games are still excellent so you can have all the fun you can stand with an older GPU. You just can't max out the latest AAA titles.
And, gaming GPU or not, people are still buying PCs because they're working remotely, so when all the new fab capacity comes online in 2023 there will still be a m
Re: (Score:3)
It used to be that a decent video card that would run any game at good-enough settings ran $150, give or take $25. That's more than doubled, and it's terrible.
I don't need to build a new computer (thankfully) but I've priced out the components and it's pretty much a horrible time to build anything. I used to be able to build a good-enough system for around $700. If you don't need a video card and stick with an SSD, you can still do this. Add the video card and the latest storage type and you are adding another $550 to t
Re: (Score:2)
You could just reuse your video card, if it's still capable of doing what you need it to do. My last upgrade was from a Core i5 4690K to a Ryzen 7 3800X. I replaced the CPU, motherboard, and memory and reused the other components: case, power supply, Blu-ray burner, M.2 SSD...a
Re: (Score:2)
This won't help. They'll just get more hardware (Score:1)
Re: (Score:2)
Maybe not: if there's more hardware, it's harder to win BTC, so you make less money, which would make revenue go down.
Re: (Score:3, Informative)
Re: (Score:1)
Actually, it's a bit of a game of chicken: when the value of Bitcoin drops due to market conditions, miners who are no longer mining profitably shut their rigs down. The mining difficulty level has no way of knowing what the present value of Bitcoin is, or what people are paying for electricity; it only knows the total network hashrate and adjusts accordingly. It is up to the individual miners to decide whether or not it is currently profitable to run their rigs.
Releasing mining hardware that is mo
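To make the parent's point concrete, the protocol never sees prices or power bills; every 2016 blocks it simply rescales the target based on how long the previous blocks took to arrive. A simplified sketch (real consensus code works on the compact "bits" encoding, and the constants here are approximate):

    def retarget(old_target: int, actual_timespan_s: float) -> int:
        """Sketch of Bitcoin's difficulty retarget, run every 2016 blocks:
        scale the target by how long those blocks actually took versus the
        intended two weeks, clamped to a factor of 4 in either direction."""
        expected = 2016 * 600                       # two weeks at one block per 10 minutes
        span = min(max(actual_timespan_s, expected / 4), expected * 4)
        new_target = old_target * int(span) // expected
        MAX_TARGET = (1 << 224) - 1                 # roughly the minimum-difficulty target
        return min(new_target, MAX_TARGET)

    # If hashrate doubles, blocks arrive in ~half the time, so the target halves
    # (difficulty doubles) at the next retarget and revenue per hash falls back.
    halved = retarget(1 << 200, 2016 * 300)
    print(halved == (1 << 200) // 2)   # True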
Reading is fundamental (Score:2)
They're not making a GPU to mine Bitcoin, it's an ASIC. Read the article.
market grab (Score:3)
Seems like a good way to sell a lot of sub-par GPUs and get into position to grab market and mind share. Nvidia should not have locked out bitcoin miners... a sale is a sale (as long as bitcoin mining is legal).
Re: (Score:2)
Also a pile of lies from Intel (Score:2)
Comparing their product to a GPU in an ASIC race is disingenuous at best. If Intel were marketing to non-Bitcoin crypto-miners, then comparing against other GPUs would have validity.
But the real big lie is that it'll save electricity. Crypto-mining difficulty automatically scales to the level of competing resources: the more compute per dollar, the more compute miners will throw at it, and the more that's thrown at it, the higher the difficulty climbs. The result is an ever-growing need for electrical power.
The headline has all
Re: (Score:1)
Re: (Score:2)
From the article... "Intel also emphasized the blockchain accelerator isn't a multi-purpose GPU, but an ASIC." Which I assume means a purpose-built chip.
Reusing failed GPUs is something Nvidia does sometimes ("floor sweeping"), but in this case I think there would be too much unused silicon area: parts not needed to do Bitcoin's specific calculations.
Nvidia knows their core market has been gamers, and they don't want to piss them off. If you already have plenty of demand, then I can understand trying t
Any claim of caring about the planet GONE ! (Score:1)
Re: (Score:1)
What a disgusting move. What an evil company.
What a disgusting socialist. What an evil lie.
Re: (Score:1)
Re: (Score:1)
Re: (Score:2)
Well, I'm not buying Intel ever again,
.. or ASRock
.. or Biostar
.. or MSI
.. or Gigabyte
.. or Samsung
Goddamnit, now the situation is getting a bit ridiculous.
Isn't there any major brand out there that has any principles left?
Re: (Score:1)
This move puts Intel pretty close to the bottom.
So it sucks at graphics? (Score:2)
No surprise. Intel simply cannot do graphics well, and they cannot get anybody who can to work for them.
Also, proof-of-work? At this time? What evil greedy scum!
just pay $$$ more for the mining unlock key! (Score:2)
just pay $$$ more for the mining unlock key!
Mr. Gelsinger wanted 50B from taxpayers... (Score:2)
to advance which manufacturing technologies ? Oh yeah, so we can have domestic casino support. Maybe he can find a job with Boeing, I've heard they have a management team he might be compatible with.
Reminds me a bit of AES NI (Score:2)
While there was actual value in implementing the base AES instruction loop in pure silicon, there were some detractors to the idea of doing so. There are always a few folks who nay-say. I digress, but the point is that the AES-NI instructions in Intel hardware (found in the i5/i7/i9 parts...not the i3 though) boost overall AES encryption/decryption performance 200x over the best hand-optimized software implementation written in assembly language. On Intel CPUs, the result is extremely fast execution of the
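As a rough illustration of how software picks that acceleration up, here is a hedged Python sketch using the third-party `cryptography` package, which delegates AES to OpenSSL and therefore to AES-NI whenever the CPU advertises it; the /proc/cpuinfo check is a Linux-only heuristic, not a guarantee the accelerated path is taken:

    import os
    from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

    def cpu_has_aesni() -> bool:
        """Linux-only heuristic: look for the 'aes' flag in /proc/cpuinfo."""
        try:
            with open("/proc/cpuinfo") as f:
                return " aes " in f.read().replace("\n", " ")
        except OSError:
            return False

    key = os.urandom(32)            # AES-256 key
    iv = os.urandom(12)             # 96-bit GCM nonce
    encryptor = Cipher(algorithms.AES(key), modes.GCM(iv)).encryptor()
    ciphertext = encryptor.update(b"benchmark me") + encryptor.finalize()
    print("AES flag present:", cpu_has_aesni(), "| tag:", encryptor.tag.hex())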
I can hear teeth grinding in the PR department (Score:3)
It's going to be hard to paint the company as eco-friendly while churning out chips whose sole purpose is mining imaginary money.
Re: (Score:2)
It's going to be hard to paint the company as eco-friendly while churning out chips whose sole purpose is mining imaginary money.
I think it's more "if Intel is getting involved, this Bitcoin mining craze must be over".
Good news for those who want a reasonably priced GPU for gaming... I mean the Bitcoin craze being over, because I doubt Intel's entry into the market will be any good.
Hope Blockchain Crashes Like a Cheap Airplane (Score:2)
Not efficient. (Score:2)
It doesn't matter if it only needs a picowatt or a terawatt to compute a hash; zero times X is still zero! You waste power and accomplish nothing at all. Everything about this is increasing pollution and honestly, it should be stopped.
It's not a GPU. (Score:4, Informative)
Re: (Score:2)
It's a press release with quoted GPU comparisons. That means Intel have supplied these numbers. The blame goes to Intel.
Re: (Score:2)
Already announced (Score:2)
This was already announced three weeks ago [slashdot.org].
In this dupe, you could have at least linked to the official post [slashdot.org] by Intel's VP Raja M. Koduri.
TFS is wrong: Is not a GPU, is an ASIC (Score:2)
While Intel is about to release a GPU to the market, the part aimed at cryptomining is an ASIC, not a GPU...
THIS IS A DUPE (Score:2)
You already carried this story Slashdot!
And it's an ASIC, not a GPU!
1000x performance at launch (Score:2)
No end to the greenwashing of bitcoin. (Score:2)
There's no end to the greenwashing of bitcoin.
Even if you had the most efficient chips ever, that would just mean you were wasting a little less energy doing something fundamentally stupid and useless. If we'd just ban this nonsense the problem would be solved and we might even find something useful to do with all that computing power. /P.
The standard Intel marketing (Score:1)
The standard Intel marketing of a future product - over promise and under deliver.
Intel, show us your confidence in this product (Score:2)
If Intel really thinks this chip is better than ASICs then they should prove it by deferring the payment until the buyer has actually accrued enough bitcoin to pay for the device. Let's call it a deferred payment plan. Intel would certainly move a *lot* of product that way.
The only question then is how much electricity does it really use per bitcoin? You ought to be able to calculate out to the very day that the device pays for itself, taking into account all the electricity used by it. The bill then
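For what it's worth, that payback arithmetic is simple enough to sketch. The numbers below are made up purely to show the calculation, and it ignores difficulty growth, pool fees, and resale value:

    def payback_days(price_usd, hashrate_ths, power_w,
                     btc_usd, network_difficulty, elec_usd_per_kwh,
                     block_reward_btc=6.25):
        """Hypothetical break-even estimate for a miner -- illustrative only."""
        # Expected network hashes per block ~= difficulty * 2^32; one block ~600 s.
        network_hps = network_difficulty * 2**32 / 600
        miner_hps = hashrate_ths * 1e12
        btc_per_day = (miner_hps / network_hps) * 144 * block_reward_btc  # ~144 blocks/day
        revenue_per_day = btc_per_day * btc_usd
        power_cost_per_day = power_w / 1000 * 24 * elec_usd_per_kwh
        margin = revenue_per_day - power_cost_per_day
        return price_usd / margin if margin > 0 else float("inf")

    # Made-up example figures, purely to show the arithmetic:
    print(round(payback_days(price_usd=2000, hashrate_ths=100, power_w=3000,
                             btc_usd=40000, network_difficulty=26e12,
                             elec_usd_per_kwh=0.10)), "days to break even")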