
Intel To Enter Bitcoin Mining Market With Energy-Efficient GPU (pcmag.com) 52

Intel is entering the blockchain mining market with an upcoming GPU capable of mining Bitcoin. From a report: Intel insists the effort won't put a strain on energy supplies or deprive consumers of chips. The goal is to create the most energy-efficient blockchain mining equipment on the planet, it says. "We expect that our circuit innovations will deliver a blockchain accelerator that has over 1,000x better performance per watt than mainstream GPUs for SHA-256 based mining," Intel's General Manager for Graphics, Raja Koduri, said in the announcement. (SHA-256 refers to the hash function at the heart of Bitcoin's mining algorithm.)

News of Intel's blockchain-mining effort first emerged last month, after the ISSCC technology conference posted details about an upcoming Intel presentation titled "Bonanza Mine: An Ultra-Low-Voltage Energy-Efficient Bitcoin Mining ASIC." ASICs are chips designed for a single, specific purpose; in this context the term refers to dedicated Bitcoin-mining hardware. Friday's announcement from Koduri added that Intel is establishing a new "Custom Compute Group" to create chip platforms optimized for customers' workloads, including blockchains.
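What "SHA-256 based mining" (mentioned above) means concretely: repeatedly double-hashing an 80-byte block header while varying a nonce until the digest falls below the network's difficulty target; performance per watt is simply how many of those hashes a chip can do per joule. A minimal Python sketch of that inner loop follows, with made-up placeholder header fields and an artificially easy target rather than real chain data:

    # Minimal sketch of the SHA-256 mining loop: double-SHA-256 an 80-byte
    # block header while varying the nonce, until the digest (read as a
    # little-endian integer) drops below the target. All field values here
    # are made-up placeholders, and the target is set artificially easy.
    import hashlib
    import struct

    def double_sha256(data: bytes) -> bytes:
        return hashlib.sha256(hashlib.sha256(data).digest()).digest()

    def mine(version, prev_hash, merkle_root, timestamp, bits, target,
             max_nonce=1_000_000):
        for nonce in range(max_nonce):
            header = struct.pack("<L32s32sLLL", version, prev_hash, merkle_root,
                                 timestamp, bits, nonce)   # 80-byte header
            digest = double_sha256(header)
            if int.from_bytes(digest, "little") < target:
                return nonce, digest
        return None, None

    # Toy run: placeholder header fields, easy target (real mining never is).
    nonce, digest = mine(version=0x20000000, prev_hash=b"\x00" * 32,
                         merkle_root=b"\x11" * 32, timestamp=1644620000,
                         bits=0x1d00ffff, target=1 << 240)
    print(nonce, digest.hex() if digest else None)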

  • by Anonymous Coward

    Bitcoin mining moved away from GPGPUs and onto ASICs a long time ago. Why bother with a totally energy-efficient GPGPU if you can get even better performance-per-watt from an ASIC that's been optimised to do this one thing (run lots and lots of SHA-256 operations in parallel) for, oh, going on a decade now?

    • Straw-man comparison...definitely. Something Intel has done for decades now. It's why places like Tom's Hardware and Anandtech were initially so useful. They'd tell you the truth and would test lots of use cases.

    • Had you bothered to read the article, you'd know they're making an ASIC. I guess it's better to be lazy, open your pie-hole, and come off like an idiot...
      • by ncc74656 ( 45571 ) *

        If you're just going by what msmash led off with:

        Intel is entering the blockchain mining market with an upcoming GPU capable of mining Bitcoin.

        instead of digging into the underlying article, it's entirely reasonable to be (mis)led into thinking that Intel is trying to bring GPUs back to Bitcoin mining. As a miner, I had my own skepticism of the assertion, which was borne out by TFA. I bought some of the earliest ASIC miners back in 2013 (still have my Butterfly Labs 7-GH/s cubes that shipped in place of

  • Games and content creation? Now, after a newcomer finally gets a chance to compete with the big two, they blatantly boast about mining efficiency. Whatever gets the cash flowing, I guess.
    • by slazzy ( 864185 )
      I know a few gamers who like to do some mining with their high-end GPU when they're not playing games, so sometimes it can be both.
    • by Tailhook ( 98486 )

      I use my new GPU for games. It's not that you can't get them. They're just not cheap. So poors can't have new ones.

      Some people think this is ruining PC gaming. I don't know. PC gaming has a lot of depth; older games are still excellent so you can have all the fun you can stand with an older GPU. You just can't max out the latest AAA titles.

      And, gaming GPU or not, people are still buying PCs because they're working remotely, so when all the new fab capacity comes online in 2023 there will still be a m

      • It used to be that a decent video card that would run any game at good-enough settings ran $150, give or take $25. That's more than doubled, and it's terrible.

        I don't need to build a new computer (thankfully), but I've priced out the components and it's pretty much a horrible time to build anything. I used to be able to build a good-enough system for around $700. If you don't need a video card and stick with an SSD, you can still do this. Add the video card and the latest storage type and you are adding another $550 to t

        • by ncc74656 ( 45571 ) *

          If you don't need a video card and stick with an SSD, you can still do this. Add the video card and the latest storage type and you are adding another $550 to the system because the damn video card that used to cost $150 is now $375.

          You could just reuse your video card, if it's still capable of doing what you need it to do. My last upgrade was from a Core i5 4690K to a Ryzen 7 3800X. I replaced the CPU, motherboard, and memory and reused the other components: case, power supply, Blu-ray burner, M.2 SSD...a

    • These are Intel GPUs we're talking about here; they've never been useful for graphics, so with the Bitcoin shift they may finally have found something they're useful for.
  • Bitcoin's value is fundamentally supported by the cost of electricity used to mine it. Creating more power-efficient hardware will just cause more hardware to be purchased until the same amount of power is used.
    • Maybe not: if there's more hardware out there, it's harder to win BTC, so each miner makes less money, which would make revenue go down.

      • Re: (Score:3, Informative)

        by Stickboy75 ( 1868986 )
        No... the difficulty algorithm adjusts to total compute power. You'll have an edge if you upgrade your hardware faster than everyone else, but that edge will be fleeting.
    • Actually, it's a bit of a game of chicken: when the value of Bitcoin drops due to market conditions, miners who are no longer mining profitably shut their rigs down. The difficulty adjustment has no way of knowing the present value of Bitcoin or what people are paying for electricity; it only sees the total network hashrate and adjusts accordingly (a rough sketch of the retargeting rule follows this thread). It is up to the individual miners to decide whether or not it is currently profitable to run their rigs.

      Releasing mining hardware that is mo
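For the curious, a rough Python sketch of the retargeting rule the comments above describe: every 2016 blocks, difficulty is rescaled by how far the actual elapsed time strayed from the two-week target, clamped to a factor of four either way; BTC price and electricity cost never enter into it. The starting difficulty in the example is illustrative, not taken from the article.

    # Rough sketch of Bitcoin's difficulty retargeting: every 2016 blocks,
    # difficulty scales by how far the actual elapsed time deviated from the
    # two-week target, clamped to 4x in either direction. Only block
    # timestamps matter; BTC price and electricity cost are invisible to it.
    TARGET_SECONDS = 2016 * 10 * 60   # 2016 blocks at one block per 10 minutes

    def retarget(old_difficulty: float, actual_seconds: float) -> float:
        ratio = TARGET_SECONDS / actual_seconds
        ratio = max(0.25, min(4.0, ratio))    # protocol clamps the adjustment
        return old_difficulty * ratio

    # Example: hashrate doubled, so the last 2016 blocks arrived twice as fast
    # and difficulty doubles at the next retarget (illustrative numbers).
    print(retarget(old_difficulty=26.7e12, actual_seconds=TARGET_SECONDS / 2))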

  • They're not making a GPU to mine Bitcoin, it's an ASIC. Read the article.

  • by backslashdot ( 95548 ) on Friday February 11, 2022 @05:39PM (#62260339)

    Seems like a good way to sell a lot of sub-par GPUs and get into position to grab market and mind share. nVidia should not have locked out Bitcoin miners... a sale is a sale (as long as Bitcoin mining is legal).

    • GPUs haven't been used to mine bitcoins for years. Bitcoin mining requires ASICs that can't be used as GPUs.
    • Comparing their product to a GPU in an ASIC race is disingenuous at best. If Intel were marketing to non-bitcoin crypto-mining then comparing against other GPUs would have validity.

      But the real big lie is that it'll save electricity use. Crypto-mining algorithms automatically scale to the level of competing resources. The more compute per dollar the more compute miners will throw at it. The more thrown at it the harder it scales. Result is an ever-growing need for electrical power.

      The headline has all

    • I disagree. nVidia eventually would have to return to selling GPUs. Pretty sure they want the computer gaming market intact. It's already such that people are still limping along with the same GPU they've had for years. It can't sustain itself this way for much longer. Lifelong computer gamers are going to jump ship to consoles (already happening).
    • From the article... "Intel also emphasized the blockchain accelerator isn't a multi-purpose GPU, but an ASIC." Which I assume means a purpose-built chip.

      Reusing failed GPUs is something Nvidia does sometimes ("floor sweeping"), but in this case I think there would be too much unused silicon area: parts not needed to do Bitcoin's specific calculations.

      Nvidia knows their core market has been gamers, and they don't want to piss them off. If you already have plenty of demand, then I can understand trying t

  • What a disgusting move. What an evil company.
    • by Anonymous Coward

      What a disgusting move. What an evil company.

      What a disgusting socialist. What an evil lie.

    • by Misagon ( 1135 )

      Well, I'm not buying Intel ever again,
      .. or ASRock
      .. or Biostar
      .. or MSI
      .. or Gigabyte
      .. or Samsung

      Goddamnit, now the situation is getting a bit ridiculous.

      Isn't there any major brand out there that has any principles left?

      • No-one is perfect, no company is perfect, so I think it comes down to working out the best- and the worst-behaving companies when making buying decisions.

        This move puts Intel pretty close to the bottom.
  • No surprise. Intel simply cannot do graphics well, and they cannot get anybody who can to work for them.

    Also, proof-of-work? At this time? What evil greedy scum!

  • Just pay $$$ more for the mining unlock key!

  • To advance which manufacturing technologies? Oh yeah, so we can have domestic casino support. Maybe he can find a job with Boeing; I've heard they have a management team he might be compatible with.

  • While there was actual value in implementing the base AES instruction loop in pure silicon, there were some detractors to the idea of doing so. There are always a few folks who naysay. I digress, but the point is that the AES-NI instructions in Intel hardware (found in the i5/i7/i9 chips... not the i3, though) boost overall AES encryption/decryption performance 200x over the best hand-optimized software implementation written in assembly language. On Intel chipsets, the result is extremely fast execution of the
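For illustration only (the 200x figure above is the commenter's claim, not measured here): a short Python sketch using the third-party cryptography package, whose OpenSSL backend transparently dispatches to AES-NI when the CPU supports it. The key, nonce, and buffer size are throwaway values, and this is a rough throughput eyeball rather than a rigorous benchmark.

    # Exercises the hardware-accelerated AES path: the "cryptography" package
    # delegates to OpenSSL, which uses AES-NI automatically when available.
    # Key/nonce are throwaway values; this is not a rigorous benchmark.
    import os
    import time
    from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

    key = os.urandom(32)                       # AES-256 key
    nonce = os.urandom(16)                     # CTR-mode initial counter block
    payload = os.urandom(64 * 1024 * 1024)     # 64 MiB of random data

    encryptor = Cipher(algorithms.AES(key), modes.CTR(nonce)).encryptor()
    start = time.perf_counter()
    ciphertext = encryptor.update(payload) + encryptor.finalize()
    elapsed = time.perf_counter() - start

    print(f"AES-256-CTR over 64 MiB: {elapsed:.3f}s "
          f"({len(payload) / elapsed / 1e6:.0f} MB/s)")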

  • by ZiggyZiggyZig ( 5490070 ) on Friday February 11, 2022 @06:25PM (#62260495)

    It's going to be hard to paint the company as eco-friendly while churning out chips whose sole purpose is mining imaginary money.

    • by mjwx ( 966435 )

      It's going to be hard to paint the company as eco-friendly while churning out chips whose sole purpose is mining imaginary money.

      I think it's more "if Intel is getting involved, this Bitcoin mining craze must be over".

      Good news for those who want a reasonably priced GPU for gaming... I mean the Bitcoin craze being over, because I doubt Intel's entry into the market will be any good.

  • Thanks for helping make an $800 video card cost $2200 :P
  • It doesn't matter if it only needs a picowatt or a terawatt to compute a hash; zero times X is still zero! You waste power and accomplish nothing at all. Everything about this increases pollution and, honestly, it should be stopped.

  • It's not a GPU. (Score:4, Informative)

    by RightSaidFred99 ( 874576 ) on Friday February 11, 2022 @06:33PM (#62260525)
    Article and replies mentioning a GPU are written by dunces. It's a fucking ASIC. Not everything that accelerates Bitcoin is a GPU.
  • This was already announced three weeks ago [slashdot.org].

    In this dupe, you could have at least linked to the official post [slashdot.org] by Intel's VP Raja M. Koduri.

  • While Intel is about to release a GPU to the market, the part aimed at cryptomining is an ASIC, not a GPU...

  • You already carried this story, Slashdot!

    And it's an ASIC, not a GPU!

  • But over the coming weeks and months that performance gain will be gradually chipped away at by firmware upgrades to fix all the new critical vulnerabilities Intel has invented.
  • There's no end to the greenwashing of bitcoin.

    Even if you had the most efficient chips ever, that would just mean you were wasting a little less energy doing something fundamentally stupid and useless. If we'd just ban this nonsense, the problem would be solved and we might even find something useful to do with all that computing power.

  • The standard Intel marketing of a future product: over-promise and under-deliver.

  • If Intel really thinks this chip is better than ASICs, then they should prove it by deferring the payment until the buyer has actually accrued enough bitcoin to pay for the device. Let's call it a deferred payment plan. Intel would certainly move a *lot* of product that way.

    The only question then is: how much electricity does it really use per bitcoin? A GPU ought to be able to calculate, out to the very day, when the device pays for itself, taking into account all the electricity used by it. The bill then
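The back-of-envelope version of that payback calculation, sketched in Python below; every input (hardware price, hashrate, power draw, electricity rate, BTC price, network hashrate) is a made-up placeholder, so plug in real quotes before drawing any conclusions.

    # Break-even estimate for a miner: expected BTC/day is your share of the
    # network hashrate times the daily block subsidy; subtract electricity and
    # divide the hardware cost by what's left. All inputs are placeholders.
    BLOCKS_PER_DAY = 144          # ~one block every 10 minutes
    BLOCK_REWARD_BTC = 6.25       # block subsidy as of early 2022

    def payback_days(hw_cost_usd, hashrate_ths, power_kw, elec_usd_per_kwh,
                     btc_price_usd, network_ths):
        btc_per_day = (hashrate_ths / network_ths) * BLOCKS_PER_DAY * BLOCK_REWARD_BTC
        revenue_per_day = btc_per_day * btc_price_usd
        power_cost_per_day = power_kw * 24 * elec_usd_per_kwh
        margin = revenue_per_day - power_cost_per_day
        if margin <= 0:
            return None               # never pays for itself at these prices
        return hw_cost_usd / margin

    days = payback_days(hw_cost_usd=3000, hashrate_ths=100, power_kw=3.0,
                        elec_usd_per_kwh=0.10, btc_price_usd=42000,
                        network_ths=200_000_000)
    print(f"break-even in ~{days:.0f} days" if days else "never breaks even")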
