Jim Keller Resigns from Intel, Effective Immediately (anandtech.com)

Intel has published a news release on its website stating that Jim Keller has resigned from the company, effective immediately, for personal reasons. From a report: Jim Keller was hired by Intel two years ago as Senior Vice President of Intel's Silicon Engineering Group, after a string of successes at Tesla, AMD, Apple, AMD (again), and PA Semiconductor. As far as we understand, Jim's goal inside Intel was to streamline much of the product development process on the silicon side, as well as to provide strategic platforms through which future products could be developed and optimized for market. We also believe that Jim Keller has had a hand in Intel's manufacturing processes, as well as a number of future products. Intel's press release states that he is leaving the position on June 11th for personal reasons, but that he will remain with the company as a consultant for six months to assist with the transition.
  • Intel is fucked? and time to buy AMD??

    • Re: (Score:2, Insightful)

      by Anonymous Coward

      Intel is fucked?

      When a press release says someone is leaving for "personal reasons", that is corp-speak for being pushed out.

      Intel has serious problems. They are firing the guy who failed to fix things.

      There is little innovation at Intel. They are stuck at 10nm while TSMC (AMD's fab) is going from 7nm to 5nm.

      Intel has 110k employees. AMD has about 12k. Yet AMD is running circles around them.

      and time to buy AMD??

      AMD's stock price has doubled in the last year, but it is likely still a good buy.

      • buy both stock and hardware

      • by Junta ( 36770 ) on Friday June 12, 2020 @02:06PM (#60176404)

        When a press release says someone is leaving for "personal reasons", that is corp-speak for being pushed out.

        Eh, that's not a given. I've known of at least one case where that was due to a cancer diagnosis. Another time someone had a heart scare and decided to retire because they had enough money to just not bother anymore. In another scenario they got more money at another place but were going to take a hiatus from working to enjoy themselves a bit between jobs.

        It is certainly possible that he got pushback, and given what I know about Intel culture I would not be *surprised* if they forced out the 'outsider' trying to fix things, despite that being the whole reason for poaching him in the first place. However, I know enough legitimate backstories behind these announcements to genuinely not assume anything.

      • AMD's stock price has doubled in the last year, but it is likely still a good buy.

        Their P/E very much says otherwise

        • Their P/E very much says otherwise

          AMD's PE is about 90 and Intel's is 11.

          So the market is predicting that AMD is going to gain a lot of market share at the expense of Intel.

          That is a reasonable assumption.
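
          (Back-of-the-envelope arithmetic, using the rough multiples quoted above: at a constant share price, AMD's earnings would have to grow by a factor of about 90 / 11 ≈ 8 before its P/E fell to Intel's. That is the scale of future growth the market is pricing in.)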

      • When a press release says someone is leaving for "personal reasons", that is corp-speak for being pushed out.

        Heh. So what happened? Did he get caught posting something raaaaaayyyyciiiiiisssss on Twitter or Facebook?

      • by orlanz ( 882574 )

        Intel is stuck at 14nm, and at this point it's looking like they will probably just drop 10nm and move on to ramping up 7nm.

        They have lots of 10nm fab lines, but since around 2016 they haven't been able to get the yields they were supposed to. It's been so bad that they had to back-port 10nm designs to 14nm chips and keep making them. This is how we got Kaby, Whiskey, & Comet Lakes. Even though they are running 14nm at full throttle, they can't seem to meet demand. The 10nm Cannon was delayed till 2

        • by orlanz ( 882574 )

          Sorry, I meant to say 7nm; they should die-shrink to 7nm, not 5nm. By moving away from 14 & 10, they can focus on 7 for mainline and 5 for next gen.

        • Can't they just order dies from TSMC to get by and get some breathing room to hire some experts and get EUV properly going?

          I bet they tried, but because they left scorched earth everywhere in the past, TSMC told them to fuck off and die. ;)

          • Can't they just order dies from TSMC to get by

            No. Intel's process is very different from what TSMC uses. Intel was first to high-k metal gates, using hafnium rather than plain SiO2 as the gate dielectric, and its design rules don't match TSMC's.

            TSMC can't fab Intel chip designs as-is, and vice versa; moving a design over would mean a full re-port.

          • Intel is the last major Integrated Device Manufacturer (IDM) in the world. Their rise to success was based in no small part on their ability to produce advanced foundry nodes tailored to their own CPU designs. The two went hand in hand. Moving to a standard foundry model would be a major shift in Intel's design strategy. If they give up on their own foundries, those foundries are mostly dead.

            Near as anyone can tell, Intel's future CPU designs still rely on the IDM model up through 2022/2023 or so. Willow

        • by sphealey ( 2855 )

          No one could have fixed that problem in two years though. I suspect this has more to do with Apple's expected announcement that it is moving to A-Series chips for the Macintosh.

        • Just an FYI, but Kaby/Whiskey/Coffee/CometLake were never 10nm "designs". They're just Skylake, repeated ad nauseam. You're thinking of Rocket Lake, which isn't even out yet. Rocket Lake appears to be Tiger Lake (more specifically, Willow Cove cores) backported to 14nm. Though we aren't 100% sure yet what Rocket's really going to be like. Other than big and hot.

          Cannonlake is *not* a die shrink of anything. It is an entirely different uarch. It supports AVX-512 (among other things) and displays maybe

      • by gweihir ( 88907 )

        Intel is fucked?

        When a press release says someone is leaving for "personal reasons", that is corp-speak for being pushed out.

        So they are firing the one guy with the proven skills they desperately need to make their stuff stop sucking so badly? That means they not only have a problem of incompetence, they have one of arrogance and lack of insight.

        It will be interesting to see at what level they finally survive. I would not be surprised if Intel is the ElCheapo choice in 10 or 20 years. Because, let's be realistic: AMD CPU tech was always superior; what gave Intel the edge was their own fabs with a vastly better process (over now

      • This was not Jim Keller's fault or doing. He has only been with the company for 2 years, while it takes about 5 years to design a CPU, fab it, and actually bring it to market. It's the fault of the CEO and his predecessor from 11 years ago.

        Intel was so thrilled with the 1st-gen i7 in 2009 that they killed the internal i9, laughed at AMD, put their feet up to chill, and did nothing for 8 freaking years! Intel ignored the cell phone and tablet markets. The netbook in 2009 was the first sign of a demand for ul

      • by dddux ( 3656447 )
        Aren't they stuck at 14nm++ actually?
    • Intel cannot buy AMD (Score:5, Informative)

      by Laxator2 ( 973549 ) on Friday June 12, 2020 @12:34PM (#60175946)

      The original agreement between Intel and IBM demands that as long as x86 is around there have to be 2 vendors.
      That is why Intel wanted very much to kill x86 when they pushed the EPIC (remember Itanium?) architecture in the late 90's and early 00's.
      With the shackles of x86 removed, Intel would have been free to be the monopolist they always dreamed of.
      Even today Intel is still hurting after AMD released x86-64 and made the EPIC architecture irrelevant.
      Of course, the fact that Intel decided to simplify their hardware manufacturing process and move all the complexity into the software and the compiler did not help EPIC take hold.
      Running x86 software in emulation mode was dog-slow.
      But it was killed mostly by developers who now had access to a 64-bit architecture without having to put up with Intel's "I'll sweep my dirt under your rug and you'll thank me for it" attitude.

      • I guess the OP meant it’s time for people to buy AMD stock.
      • by dfghjk ( 711126 )

        "That is why Intel wanted very much to kill x86 when they pushed the EPIC (remember Itanium ?) architecture in the late 90's and early 00's."

        It is not why. Also, EPIC was developed cooperatively with HP, so it was unlikely to reap the benefits your absurd theory depends on.

        Intel did not desire to "kill x86"; they desired to position themselves for post-x86, just as they did with the i960 before it. Remember the 960? It was Intel's 32-bit architecture back when they had no intention of doing 32-bit x86. /. his

        • Too bad that I cannot dig up the old articles from around '97 or so. They were mostly on paper anyway, so I could not include a URL here.
          The way Intel wanted to move ahead with x86 and EPIC was by creating effectively two tiers. The x86 was going to be limited on purpose, and that is what the original Celeron was: a hamstrung Pentium II.
          For server workloads (the .com bubble was only starting to inflate at the time) and workstations, everyone was going to be forced to use Itaniums.
          I remember a quote from an artic

        • The real reason why Intel wanted a non-x86 CPU had nothing to do with technical superiority or performance. It was to kill AMD.

          IBM forced Intel to license x86 to AMD as a second source. Patents also expire, after 20 years. And Intel couldn't trademark "8086" because the courts ruled it was just a number (which is also why we got "Pentium" instead of "586"). "Itanium", on the other hand, is a trademarkable word. BOOM.

      • x86 isn't as big of a deal as it used to be.

        Given that much of what we do today is processed with interpreted languages (JavaScript, Python) and byte-coded languages (Java, .NET), the need for a particular CPU isn't such a big deal anymore.

        Our method of computing has changed as well. Back in the 1980's we had to have the disk (that we bought at the store) with the .EXE file to run on our platform. We had to fuss about whether there was a compatible program for our system. If you had an IBM Compatible DOS Co

        • It's still a big deal considering how much is invested in x86 platforms and software binaries.

          Greenfielding a new platform on ARM is an obvious option, but just cross-compiling an existing platform to a new CPU is a lot more complicated than that. Maybe easy for simple, console-output CLIs but less simple for GUI-driven operating systems.

          If it were that easy, I'd suggest there would have been a mass migration to custom ARM platforms already; AMD would be dead or just another ARM designer, and Intel would be

          • Even Apple hasn't done it, and they seem like the most likely candidate

            Rumor is that Apple's next MacBook will be ARM.

            ARM means longer battery life, more cores on a die (or more room for cache), and no more high priced Intel CPUs.

            Linux runs well on ARM. There is also a Windows 10 port.

            If Apple's transition is successful, expect a stampede.

            • I'm guessing the year of ARM on the desktop will also be the year of the Linux on desktop!

              • I'm guessing the year of ARM on the desktop will also be the year of the Linux on desktop!

                It may take more than a year, but billions of people got their intro to computing from mobile, where Linux already dominates. So moving to Linux on desktops and laptops will come naturally to them, especially if the same apps work on both platforms.

                • by bn-7bc ( 909819 )
                  Well, you partly opened a can of worms there; let me finish it for you. Linux, as in the Linux kernel, is dominant on mobile through Android. But a lot of people in the Linux community are not happy about Android, specifically its locked-down (often proprietary) userland, so a lot of the time they don't count Android as Linux. I'm not sure what I think about it: in a strict sense Android is Linux, as it runs the Linux kernel, but does that necessarily mean you can compile and run a random Linux application on it, w
                • Billions of people got their exposure to electricity by turning on a light switch, but this doesn't mean they're going to become electricians.

                  The vast majority of people who "got their intro to computing from mobile" are plain old users who use a tiny subset of applications that have kitchen-appliance-level simplicity. I guess Linux on desktops (by which I mean a desktop GUI computing paradigm, not the physical box per se) could succeed if it delivers the same kitchen-appliance-simple applications, but that

        • If client Java (or similar) had taken off more, I'd agree with you. The ideal situation for multiplatform proliferation would be people running HTML5 + WebASM through multiplatform browser software. Java isn't going to get the job done, nor is any other JIT language (apparently). Fact is that neither scenario is really playing out that way right now. ISA and hardware platform still matter. You have to do a lot of work to get your application running on PC, Android, iOS/MacOS, non-Android Linux, Linux f

      • Actually, EPIC was an attempt to simplify CPUs so they could crank the clock speeds and gain lots of performance. It's easy to forget that, all else being equal, clock speed is the BEST way to improve performance. It always is. A 10GHz single-core i9 would *trounce* the current multi-core i9s on almost every workload.

        Modern CPUs spend a lot of time trying to schedule things and eke parallelism out of the code. EPIC was supposed to move all that to the compiler. It was a good idea. A great idea, even. And it

        • …is that writing code that lends itself to parallelism is difficult.

          Only in C-likes.

          In e.g. Haskell, it is a given and trivial for most code. (The non-I/O majority of the code.)
          For large-scale parallelism, you can basically just say "parallelize this list". (Lists can contain function closures and I/O actions and anything, really. Even like running a piece of monadic code once for each of many definitions of what the semicolon means, in parallel, to speak in C-like speak.)

          You have nice things like "thread sparks".
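
          A minimal sketch of what "parallelize this list" looks like in GHC Haskell, assuming the standard parallel package (Control.Parallel.Strategies); `expensive` is just a hypothetical stand-in workload:

              import Control.Parallel.Strategies (parList, rdeepseq, using)

              -- Hypothetical stand-in for some pure, CPU-bound work per element.
              expensive :: Int -> Integer
              expensive n = sum [1 .. fromIntegral n * 100000]

              main :: IO ()
              main = do
                -- Evaluate every list element in parallel, each to normal form;
                -- the GHC runtime schedules the resulting sparks across cores.
                let results = map expensive [1 .. 64] `using` parList rdeepseq
                print (sum results)

          Build with ghc -threaded -O2 and run with +RTS -N to let the runtime use all your cores.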

        • I see, so hardware acceleration bad, moving to software only with slower hardware good. Explain?

    • by im_thatoneguy ( 819432 ) on Friday June 12, 2020 @01:07PM (#60176098)

      I don't understand how people still don't see the Jim Keller pattern. He likes to spend 2 years or so at a place fixing things. He air-drops in, puts a team on track, and then bails for the next interesting challenge.

      When Keller left AMD, people thought that was the end of AMD. When Keller left Broadcom, people thought that was the end of Broadcom. When Keller left Apple, people thought that was the end of Apple. When Keller left AMD again, people thought that was the end of AMD. When Keller left Tesla, people thought that was the end of Tesla. Now that Keller is leaving Intel, people think that's the end of Intel.

      Meanwhile you have a string of great products he had a hand in: AMD Athlon, Apple A4-A5, AMD Zen, Tesla Autopilot Hardware 3.

      It often takes as much work to finish something from 90% to 100% as it does to go from 0% to 90%. He's clearly just not the type of person interested in bug squashing.

      • by Sertis ( 2789687 )
        Well, he's getting pretty old; considering how sudden this is, I wouldn't be surprised if health problems are catching up.
      • It often takes as much work to finish something from 90% to 100% as it does to go from 0% to 90%. He's clearly just not the type of person interested in bug squashing.

        The Pareto principle [wikipedia.org] (roughly: 80% of the effects come from 20% of the causes) says that the last 20% of development takes 80% of the work.

      • Exactly. As they say:
        The first 90% of the work takes the first 90% of the time. The other 10% of the work takes the other 90% of the time! :)

      • by gweihir ( 88907 )

        I see the pattern, but I think the problems Intel has are too large for the 2-year fix. I may be wrong on that. Of course, Keller does CPU architecture, not manufacturing processes, so he may have fixed, or initiated the fixes for, that side of things. But basically, Intel would need to give up their fabs; they clearly cannot compete anymore in that area.

        • Exactly. When he left those other companies, we had already seen new products launching with his fingerprints all over them. The only question was whether they could keep going in his absence. Where are those products here? What step forward has Intel taken in the last two years? What great trajectory are we worried they won’t be able to maintain in his absence? Nothing.

          I’m inclined to think he said enough is enough.

      • He spent three years at AMD, and it was well known what he was working on there: Zen and K12. One was shelved (K12) and the other has been iterated upon by the team he built (along with Mike Clark) into the success that we now know as Zen 2 in its many variants (Rome, Matisse, Renoir).

        It took five years for Keller's work to finally reach market.

        At Intel, nobody REALLY knows what he was working on. Best guess was he was with the Core team, working on the successor to Golden Cove (Ocean Cove, or something else

    • Nah. Intel will be screwed only when AMD starts properly supporting its customers and developers. When TensorFlow runs on AMD GPUs, and laptop makers can basically reuse a reference design from AMD and focus on quality and production yields, then both Intel and Nvidia will be in trouble. However, as of today AMD doesn't understand the importance of supporting developers and the importance of application engineering. What I am curious about is whether spending more money on these two things will not bring
  • by Anonymous Coward

    #MeToo

    #BLM

    There's no virtue signaling involved (yet) so probably #MeToo

  • Wow... (Score:1, Interesting)

    by Anonymous Coward

    You people are incredible!

    Sex crimes? Discrimination? Some other malfeasance?

    It's likely just a bad medical diagnosis with a shortened time left on the clock, and he doesn't want to waste it sitting in meetings.

    Give him a break already.

    • by Anonymous Coward

      Corporations are people, people who run corporations are the devil. Remember that.

    • Comment removed based on user account deletion
      • Actually there's a 100% chance of anyone who spent time twiddling around with hardware dying, period. I think that needs to be investigated!
        I mean, sure, some people who never touched hardware die too - but I think that's just anecdotal!

  • by Vandil X ( 636030 ) on Friday June 12, 2020 @12:42PM (#60175978)
    Intel's lack of swift innovation caused Apple to seriously invest in and develop ARM architecture for future Apple hardware, likely eliminating its use of Intel processors.

    When you lose Apple, someone's head has to roll.
    • True, someone has to take the blame, but this person was hired only two years ago to try to fix problems that leadership knew existed. And the problem is that they can't get their 10nm process to produce enough chips. Despite working on this line for 4 years now, they cannot get their yield up enough to sell consumer CPUs. They have to make enterprise CPUs, as far as I know, because of the low yields.
    • Intel's lack of swift innovation caused Apple to seriously invest in and develop ARM architecture for future Apple hardware, likely eliminating its use of Intel processors.

      When you lose Apple, someone's head has to roll.

      So what does it mean to lose Apple, twice?

      https://www.fiercewireless.com... [fiercewireless.com]

      • by leptons ( 891340 )
        It means Apple is a fickle bitch.
        • It means Apple is a fickle bitch.

          You misspelled:

          Hubris (x86 vs mobile devices) and Incompetence (utter failure to produce a 5G modem)

          1/10

    • If that were the reason, it would be incredibly funny because Keller was one of the people who worked at Apple to develop their own ARM SoCs. The original and the first few generations of iPhones just used standard ARM designs. Their chips have come a long way since Keller left, but it would still be amusing to think that his past accomplishments were so good that they kicked off something that would get him fired in the future.

      It's more likely that he's resigned for some other reason, but if this were t
    • Intel's lack of swift innovation

      What has Intel to do with Swift innovation?

    • or AMD with MFN deals killing Apple deals for lower costs

    • by leptons ( 891340 )
      Apple doesn't have enough market share for it to be too much of a problem for Intel if they dropped the CPU platform for ARM.
    • Jim Keller wasn't responsible for any of that. The earliest any of his designs will hit market would be 2023.

  • With his job history, he may very well have the best bird's-eye view of most of the cutting-edge technology that exists in the U.S. Maybe his next step will be to start his own company and build ...stuff.

    • by PPH ( 736903 )

      he may very well have the best bird's-eye view of most of the cutting-edge technology that exists in the U.S.

      I thought TFS said he worked at Intel.

      • When you say TFS, did you mean the title? The first line of the summary:

        Jim Keller was hired by Intel two years ago as Senior Vice President of Intel's Silicon Engineering Group, after a string of successes at Tesla, AMD, Apple, AMD (again), and PA Semiconductor.

        Or are you pointing out that Intel has all of that already?

  • by beheaderaswp ( 549877 ) * on Friday June 12, 2020 @01:34PM (#60176290)

    Given the background of what is going on in the industry, this is at least "interesting".

    I can't foretell the death of Intel, though. They are being chopped to pieces by a million performance- and security-related paper cuts.

    Keller has moved around a lot, though, so I'm not sure his move means what certain people think it means. He might just want to lie on the beach for a while with a Margarita.

    Not everything "means" something. Sometimes... it's about the drank on the beach.

    • by gweihir ( 88907 )

      Possibly. Keller may just have done all he could and got bored. The main problem Intel has is that their manufacturing process sucks and that they will not have others make their CPUs. Keller probably cannot do anything in this area. I do hope that the time when Intel plays fast and loose with security in new designs will be over.

  • Comment removed based on user account deletion
    • by gweihir ( 88907 )

      Well, if he tried to fondle anybody, Intel would be well advised to drop a ton of money on that person and encourage the fondling.

  • I have noticed this tendency, lately, to not state the actual reasons for anything, but to say "security reasons" or "personal reasons" or something like that instead, which is clearly bullshit, and to expect the listener to act like he was just given reasons and should shut up and stop asking now.

    It always reminds me of that Idiocracy court scene, where one lawyer goes "We found this ..., like, *psssh* ... /eeevidence/ ... that [...]".

    Whenever I see "reasons" I read it as "no reasons given, by the sleazy lying piece of

  • Gotta ask if Keller got cancelled for PC failure.
