Intel AI Businesses

Intel Aims To Take on Nvidia With a Processor Specially Designed for AI (fastcompany.com)

An anonymous reader shares a report: In what looks like a repeat of its loss to Qualcomm on smartphones, Intel has lagged graphics chip (GPU) maker Nvidia in the artificial intelligence revolution. Today Intel announced that its first AI chip, the Nervana Neural Network Processor, will roll out of factories by year's end. Originally called Lake Crest, the chip gets its name from Nervana, a company Intel purchased in August 2016, taking on the CEO, Naveen Rao, as Intel's AI guru. Nervana is designed from the ground up for machine learning, Rao tells me. You can't play Call of Duty with it. Rao claims that ditching the GPU heritage made room for optimizations like super-fast data interconnections allowing a bunch of Nervanas to act together like one giant chip. They also do away with the caches that hold data the processor might need to work on next. "In neural networks... you know ahead of time where the data's coming from, what operation you're going to apply to that data, and where the output is going to," says Rao.
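
A rough way to picture Rao's point about dropping caches (an illustrative sketch, not Nervana's actual design): because a neural-net layer's shapes and operations are fixed before it ever runs, the full sequence of data movements can be enumerated up front as a static schedule, so there is nothing for a cache to guess about.

```python
# Illustrative sketch only: for a dense layer y = W @ x with fixed dimensions,
# every operand's source and destination are known before execution starts,
# so a compiler can emit an explicit schedule of moves instead of relying on caches.

def static_schedule(in_dim, out_dim, tile=4):
    """Enumerate the data movements for a dense layer, tile by tile."""
    schedule = []
    for row in range(0, out_dim, tile):
        for col in range(0, in_dim, tile):
            schedule.append(("load_weights", f"W[{row}:{row+tile}, {col}:{col+tile}]"))
            schedule.append(("load_activations", f"x[{col}:{col+tile}]"))
            schedule.append(("multiply_accumulate", f"y[{row}:{row+tile}]"))
    schedule.append(("store_output", f"y[0:{out_dim}]"))
    return schedule

# The whole access pattern is known ahead of time -- no cache misses to guess about.
for step in static_schedule(in_dim=8, out_dim=8):
    print(step)
```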
  • by werld ( 102206 ) on Tuesday October 17, 2017 @05:07PM (#55385813)
    The thing about AI is this: I play video games, and boy, the AI sure does suck. It has always sucked, and every year I think it's going to get better, but it never does. If we can't program AI well in video games, then I'll say we're a lot further off than the projections I've been seeing. But what the hell do I know?
    • by Anonymous Coward

      AI doesn't sell games, so money isn't spent making it better. Also, it is usually given only a very small budget for how much processing it can take up, lest it get in the way of what actually does sell games, like moar pixels and moar framerates.

    • by Quince alPillan ( 677281 ) on Tuesday October 17, 2017 @05:49PM (#55386067)

      AI in video games is very different from the AI these chips are meant to run. The AI in video games sucks for three reasons:

      1. If the AI were too good, players would quit because the game is too hard. Certain games are specifically designed to be extremely hard, but this isn't the norm. In a first-person shooter, this would be the equivalent of fighting against an aimbot. (Aimbots are a great example of AI - they're designed to be perfect or to use information that the player doesn't normally have.) It becomes a game balance and design decision to make the AI imperfect.

      2. Complex AI can be very computationally intensive. In some algorithms there's a tradeoff in computation time between "good enough" and "perfect." When a good AI has a lot of variables to weigh in a complex game, it takes a lot of processor cycles to work out a response that makes things harder for you (a rough sketch of this tradeoff follows after this comment). Take Civilization, for example, which takes far longer between turns at the end of a scenario than at the beginning because there are more variables to consider.

      3. Given 1 and 2, the easiest way to implement a balanced AI is to cheat: give the computer advantages that the players don't have, or hardcode certain behaviors. Those behaviors can be easy for humans to learn and beat, even when the computer is cheating.

      The processors in this article are new chips designed to do the math for a certain type of complex calculation, by making assumptions about the kind of calculation being done and taking some shortcuts, with the drawback that they can't perform other calculations as fast.
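
      As a rough illustration of the tradeoff in point 2 (a hypothetical sketch, not the commenter's code): a depth-limited minimax on tic-tac-toe plays better the deeper it searches, but the number of positions it has to examine grows quickly.

```python
# Depth-limited minimax on tic-tac-toe: deeper search plays better but costs
# more work. In a complex game the same tradeoff is far more severe.

WIN_LINES = [(0, 1, 2), (3, 4, 5), (6, 7, 8),
             (0, 3, 6), (1, 4, 7), (2, 5, 8),
             (0, 4, 8), (2, 4, 6)]

def winner(board):
    for a, b, c in WIN_LINES:
        if board[a] != ' ' and board[a] == board[b] == board[c]:
            return board[a]
    return None

def minimax(board, player, depth, counter):
    counter[0] += 1                       # count every position examined
    w = winner(board)
    if w == 'X':
        return 1
    if w == 'O':
        return -1
    if ' ' not in board or depth == 0:    # draw, or cut off and guess "even"
        return 0
    scores = []
    for i, cell in enumerate(board):
        if cell == ' ':
            child = board[:i] + player + board[i + 1:]
            scores.append(minimax(child, 'O' if player == 'X' else 'X', depth - 1, counter))
    return max(scores) if player == 'X' else min(scores)

# X (to move) has a forced win here, but only a deep enough search can see it.
position = "XO  X   O"
for depth in (1, 3, 5):
    visited = [0]
    score = minimax(position, 'X', depth, visited)
    print(f"depth={depth}: score {score:+d}, positions visited {visited[0]}")
```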

      • A fourth reason: AI is hard to build, and takes a long time. It took the AlphaGo team several years of work to build up the algorithms they used for Go. It may take them several more years to build a decent StarCraft AI.

        What company is going to delay releasing the game for several years while the AI gets built?
      • by Kjella ( 173770 )

        1. By far most game AIs are not "learning" in any sense of the word, and since learning is essential to intelligence, calling them AI is really a misnomer. They do what they do; once you've found a flaw in the algorithms, you can exploit it over and over again and it'll never change or improve (see the toy sketch below).
        2. There's no such thing as "perfect play" in any game more complex than tic-tac-toe. Even chess computers only do "good enough".
        3. Cheating is orthogonal to AI; there's nothing inherent to AI about using information the player doesn't have.
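
        As a toy illustration of point 1 (hypothetical code, not taken from any actual game): a scripted guard built as a fixed state machine reacts to the same stimulus the same way every time, so once a player finds the flaw, the exploit never stops working.

```python
# A scripted guard "AI" with no learning: the same stimulus always produces
# the same response, so a discovered exploit keeps working forever.

class ScriptedGuard:
    def __init__(self):
        self.state = "patrol"

    def react(self, event):
        # Fixed transition table -- nothing here ever updates from experience.
        transitions = {
            ("patrol", "hear_noise"): "investigate",
            ("investigate", "see_player"): "attack",
            ("investigate", "timeout"): "patrol",
            ("attack", "lose_sight"): "patrol",
        }
        self.state = transitions.get((self.state, event), self.state)
        return self.state

guard = ScriptedGuard()
for _ in range(3):                        # the exploit never stops working
    print(guard.react("hear_noise"))      # -> investigate
    print(guard.react("timeout"))         # -> patrol (player sneaks by every time)
```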

      • by Anonymous Coward

        There was a third-party AI mod for Quake II that was actually good:
        * the developer went to some lengths to "hide" info from it that it shouldn't have
        * it included variable error; you could set it to be perfectly "aimbot" accurate, to exhibit random inaccuracy, or to simulate human inaccuracy (it starts off inaccurate and gets better with successive shots - sketched below)
        * it learned which areas, routes and even weapons were more effective for a given map and given opponent(s)
        * you could set its health and damage modifier to "cheat" levels
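
        A hypothetical sketch of the "simulated human inaccuracy" idea described above (not the mod's actual code): the bot's aim error starts large, tightens with each consecutive shot at the same target, and resets when the target changes.

```python
# Hypothetical aim-error model: error shrinks over successive shots at the
# same target, never reaching zero, and resets when the bot switches targets.

import random

class BotAim:
    def __init__(self, initial_error=12.0, improvement=0.6, floor=0.5):
        self.initial_error = initial_error    # degrees of error on the first shot
        self.improvement = improvement        # error multiplier per follow-up shot
        self.floor = floor                    # the bot never becomes a perfect aimbot
        self.current_target = None
        self.error = initial_error

    def aim_offset(self, target):
        if target != self.current_target:     # new target: start sloppy again
            self.current_target = target
            self.error = self.initial_error
        offset = random.uniform(-self.error, self.error)
        self.error = max(self.error * self.improvement, self.floor)
        return offset

bot = BotAim()
for shot in range(5):
    print(f"shot {shot + 1} at 'player_a': offset {bot.aim_offset('player_a'):+.2f} deg")
print(f"switch target: offset {bot.aim_offset('player_b'):+.2f} deg")
```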

        • Which bot was that?

          I know Steve Polge did the ReaperBot for Quake 1 ...

          Anyone know which bot for Quake 2 this was?

      • Regarding #1: Does the game allow you to use objects against the AI's intelligence? For example, finding a mirror and using it to reflect a "you" while you shoot it from another angle? Or simply shooting at something above it and flattening the damned thing?
  • by ClickOnThis ( 137803 ) on Tuesday October 17, 2017 @05:42PM (#55386005) Journal

    For Intel's sake, this had better rock, or else it's DOA.

    I'm guessing you'd need to purchase a specialized motherboard with accompanying chipset to use one of these. Whereas GPUs can just plug into slots that most motherboards have already.

    GPUs, like cassette tapes, may be with us for a while before something else comes along that competes well enough with them in cost and utility to make switching a no-brainer.

    • by dj245 ( 732906 )

      For Intel's sake, this had better rock, or else it's DOA.

      I'm guessing you'd need to purchase a specialized motherboard with accompanying chipset to use one of these. Whereas GPUs can just plug into slots that most motherboards have already.

      GPUs, like cassette tapes, may be with us for a while before something else comes along that competes well enough with them in cost and utility to make switching a no-brainer.

      Slots come and go. Just in my (young) lifetime I have seen 4 different slot standards for graphics cards (ISA, PCI, AGP, and PCI-E) just for the consumer market, plus a bunch of other ones of varying popularity for the server market. If something better comes along and doesn't get mired in a patent fight, the slot will change again.

    • I see your point, but these specialized GPUs tend to cost much more than a motherboard. To get rid of all the serious bottlenecks, you'll probably be buying a lot of hardware anyway; it's not easy. Check out this device [nvidia.com]. When people are spending $100k USD on machine learning setups, they're willing to get custom motherboards.
    • For Intel's sake, this had better rock, or else it's DOA.

      Odds are it is already DOA. Intel being Intel, they get the itch every now and then and feel the need to capture some high-yield revenue stream other than the x86 family.

      Over the years they have done it all. FPGAs? Embedded controllers? RISC processors? Switch chips? InfiniBand? GPUs? The list is endless. From this perspective "neural net" is nothing new.

      Start with some tech acquisition, run up a bunch of hype, do some trade shows or even TED talks. At the end of the day it doesn't run Windows, so it quietly gets shelved.

  • by Anonymous Coward

    Everyone starts talking about games and PCs.

    This is about AI. We just had a discussion recently over an article ridiculing the energy cost of Nvidia's approach to self-driving cars. Chips like this are why that article was ridiculous. Nvidia's current approach is a developmental solution. Purpose-made solutions like this will rule the AI market and, within a few short years, use several orders of magnitude less energy to perform the same work.

  • We don't have real AI, so you can't have an AI chip. It's just another microprocessor. STOP THE HYPE.
  • Nervana is designed from the ground up for machine learning, Rao tells me. You can't play Call of Duty with it.

    One could also read that as "We're trying to develop AI, but it's still so primitive that you can't even make a machine-learned bot for a contemporary game with it yet." ;)
