Intel Fights For Its Future (mondaynote.com)

An anonymous reader shares a post: The Smartphone 2.0 era has destroyed many companies: Nokia, Blackberry, Palm... Will Intel be another victim, either as a result of the proposed Broadcom-Qualcomm combination, or as a consequence of a suicidal defense move? Intel sees the Qualcomm+Broadcom combination as an existential threat, an urgent one. But rather than going to the Feds to try and scuttle the deal through a long and uncertain process, Intel is rumored to be "working with advisors" (in plainer English, the company's Investment Bankers) on a countermove: acquire Broadcom. Why the sudden sense of urgency? What is the existential threat? And wouldn't the always risky move of combining two cultures, employees, and physical plants introduce an even greater peril?

To begin with, the threat to Intel's business isn't new; the company has been at risk for more than a decade. By declining Steve Jobs' proposal to make the original iPhone CPU in 2005, Intel missed a huge opportunity. The company's disbelief in Apple's ambitious forecast is belied by the numbers: More than 1.8 billion iOS devices have been sold thus far. Intel passed on the biggest product wave the industry has seen, bigger than the PC. Samsung and now TSMC manufacture iPhone CPUs. Just as important, there are billions of Android-powered machines, as well. One doesn't have to assume 100% share in the smartphone CPU market to see Intel's gigantic loss.

  • As usual another stupid article full of line noise instead of anything intelligent to say.

    Incidentally, if Intel is "fighting for its future" by making huge profits in a variety of areas then why the hell is Qualcomm -- the effective monopolist in smartphone wireless devices and also a huge player in smartphone SoCs -- even conceivably a target of a takeover? Why the hell isn't Qualcomm about to buy out Intel if Intel is so behind the curve and Qualcomm is supposedly so great?

    Let's not forget how this ...

    • by TWX ( 665546 ) on Monday March 12, 2018 @10:40AM (#56246723)

      The article (or at least the Slashdot synopsis) may be editorializing excessively, but Intel is threatened by the shrinking of the personal computer industry.

      Consider that when PCs rose to prominence there were lots of architectures. Even after Wintel and Motorola/Apple dominated personal computers at home, business computing still had other architectures (MIPS and Alpha immediately come to mind), to the extent that Microsoft felt the need to port their business OS to those platforms rather than force x86.

      That model, where all software had to be compatible with x86/AMD64, is ending. The gatekeepers for software on new devices (Apple's and Google's respective repositories) require that software work on their devices almost without regard to the underlying CPU, and the 'cloud' model and various other virtual machine models further abstract the software developer away from the physical hardware. We might again see a proliferation of architectures. Intel has reigned supreme because it was difficult to port software or to write software that runs on everything, but if that has changed then suddenly it doesn't matter what actual CPU is in the phone or tablet or even server; it'll just work when it's time for the software to run.

      That's the threat to Intel's business-model, a loss of near-monopoly on processors because new devices don't need Intel's processors.

      • by Bert64 ( 520050 )

        Those other architectures were forced into the high-end niche, and eventually died out...
        The same is happening to Intel now: it is being forced into the high-end niche. ARM chips are suitable for an increasing number of day-to-day tasks, and only people with specialised requirements currently need the higher-performance Intel chips.
        Fast-forward a few years: the increased sales volume of ARM chips provides more development money, and ARM starts overtaking Intel in performance too.

        • by mikael ( 484 )

          It's not just ARM. There are lots of startups coming up with various chips to do face and emotion recognition, posture recognition, motion recognition and all sorts of basic vision processing that would form a single visual circuit in a mammalian brain. These are being designed using machine-learning techniques and don't require the double-precision floating-point processing typically supplied by Intel chips. Even the fluid dynamics people were realizing that machine-learning techniques were helping ...

    • The same reason why AOL was able to buy TimeWarner in 2000. What matters is apparent value not actual value.

      Smartphones are the hot product, so Qualcomm has more apparent value than Intel, which focuses on boring (but profitable) PC and server memory and CPUs.

      Yeah, it is stupid, yeah it doesn't make sense; but that is how stock markets work. A company not making quite as much money as predicted is the same as that company losing money.

  • iPhone CPUs? (Score:3, Insightful)

    by 110010001000 ( 697113 ) on Monday March 12, 2018 @10:09AM (#56246579) Homepage Journal
    You are attaching too much importance to the iPhone CPUs (and Android) market. It is doubtful the margins are high on those, especially since Apple has multiple manufacturers. That is like saying Apple missed out on making Android phones because there were so many of them out there. You don't want to enter a cutthroat low-margin market.
    • You don't want to enter a cutthroat low-margin market.

      That would be fine if Intel's markets weren't shrinking. However, Intel can't maintain itself on its current markets, as they are all shrinking in favor of Mobile and, to a lesser extent, Cloud. Those are both areas where Intel is not terribly strong.

      You would think that Intel and Cloud would go hand in hand, but that isn't necessarily true.

      • However, Intel can't maintain itself on its current markets, as they are all shrinking in favor of Mobile and, to a lesser extent, Cloud.

        Well Intel supplies an awful lot of those CPUs for the cloud so I don't think that worries them so much. Mobile is an issue for them because that is definitely where the growth is. The biggest threat to Intel is that they have so much of their revenue and profit tied up in the X86 platform. If software and PC makers continue to migrate away from X86 it's going to hurt Intel badly sooner or later.

        • However, Intel can't maintain itself on its current markets, as they are all shrinking in favor of Mobile and, to a lesser extent, Cloud.

          Well Intel supplies an awful lot of those CPUs for the cloud so I don't think that worries them so much. Mobile is an issue for them because that is definitely where the growth is. The biggest threat to Intel is that they have so much of their revenue and profit tied up in the X86 platform. If software and PC makers continue to migrate away from X86 it's going to hurt Intel badly sooner or later.

          Even cloud providers are looking to move to a different platform, and many are involved in OpenPower (https://openpowerfoundation.org/membership/current-members/) to find better hardware in terms of cost per watt for cloud loads where floating-point or integer computational performance may not matter as much.

    • by tomhath ( 637240 )
      No doubt they would like to be a bigger player in the phone market, but a good company looks for next year's opportunity, not last year's. They're in many things besides CPUs.
    • by shess ( 31691 )

      You are attaching too much importance to the iPhone CPUs (and Android) market. It is doubtful the margins are high on those, especially since Apple has multiple manufacturers. That is like saying Apple missed out on making Android phones because there were so many of them out there. You don't want to enter a cutthroat low-margin market.

      Also, what if Apple had gone with Intel for their CPU, and then failed because Intel's CPUs sucked batteries dry? Or because having the same CPU in desktops, laptops, and mobile devices led Apple down the obvious path of cross-platform compatibility, and that sucked batteries dry? Or if Apple wanted to gradually take over more of the system to customize it to better serve their needs, and Intel said "F. U."? I think Apple was probably lucky that Intel turned them down.

    • You don't want to enter a cutthroat low-margin market.

      In general, yes, I agree.

      BUT. Semiconductor manufacturing is a very capital-intensive industry. If there isn't enough volume in the high-margin game to keep your multi-billion-dollar factories occupied, you will not be able to justify the construction of another. And then you've lost the game, because your cheap-shit competitor will eventually surpass the manufacturing technology that allowed you to charge a premium in the first place. At that point, your best option is to contract out your manufacturing ...

    • You don't want to enter a cutthroat low-margin market.

      As one of the largest players in the CPU space, you absolutely do want to do that.

      First of all, low-margin does not mean NO margin, and a billion of anything at low margin is still a lot of money.

      Secondly, there is a lot of great R&D opportunity in a challenging space that you are giving up to some other company. You can sit around all day designing new processors or features, but until they come into contact with real-world uses and needs, your design will ...

    • You are attaching too much importance to the iPhone CPUs (and Android) market. It is doubtful the margins are high on those, especially since Apple has multiple manufacturers. That is like saying Apple missed out on making Android phones because there were so many of them out there. You don't want to enter a cutthroat low-margin market.

      That might be true if your only concern is margin and not survival. I would say that a case is being made that PCs are slowly dying and being replaced with smaller devices, many of which do not and will not run Intel CPUs. I would say it's the same problem that Sun Microsystems faced. AMD, Intel, and Cyrix were all fighting in the x86 market, with Intel coming out on top and AMD relegated to second class. Sun stayed out of the consumer market completely and failed to innovate in the server market. These days ...

      • by DarkOx ( 621550 )

        Right, Sun is probably the best example. They *thought* they were better off in the high-margin, high-end microcomputer market. The problem is that while the margins might have been better, the market was shrinking, and it underwent a rapid shrink; too rapid for Sun to respond to effectively once Intel (and compatibles) got good enough for a lot of those jobs. Nearly a century before, you see the opposite with FoMoCo. There were tens if not more little automakers around the Great Lakes region (proximity to exi ...

    • You are attaching too much importance to the iPhone CPUs (and Android) market. It is doubtful the margins are high on those, especially since Apple has multiple manufacturers. That is like saying Apple missed out on making Android phones because there were so many of them out there. You don't want to enter a cutthroat low-margin market.

      The same process happened in the 1980s, when Intel overtook the mainframe market, and folks said the same thing. Like it or not, laptops and (even more so) desktops are retreating to areas that genuinely require high-performance hardware. As productivity (LibreOffice, MS Office, etc.) and financial applications (Quicken) get better on mobile (Google Apps - Docs, Spreadsheet, Presentation; O365; Quicken is already on iOS and Android), the need for even a laptop will go away entirely for the everyday user ...

  • by Quakeulf ( 2650167 ) on Monday March 12, 2018 @10:18AM (#56246621)
    Intel dies.
    • by Anonymous Coward

      and intel can't create a power-efficient cpu without cheating the process and leaving exploitable holes in them.

  • But rather than going to the Feds to try and scuttle the deal through a long and uncertain process, Intel is rumored to be "working with advisors" (in plainer English, the company's Investment Bankers) on a countermove: acquire Broadcom.

    Is that what we're hoping for now? Or is it simply that we expect that to happen and are shocked when it doesn't?

  • by Anonymous Coward

    Intel chose not to worry about ARM.

    Intel is getting its breakfast eaten by ARM.
    Intel is getting its lunch nibbled on by Qualcomm/AMD
    AMD is now tasting Intel's Dinner

    Intel BOTCHED the Spectre and Meltdown patches. To the point that I will not apply those microcode patches, and I will seriously consider AMD for the next build.

    Intel seems to be doing the MSFT QA Principle, "let your customers be the beta-testers", except they have moved into the "Alpha" phase.

    No thanks, I want a CPU that works and that is secure.

  • by gweihir ( 88907 ) on Monday March 12, 2018 @10:27AM (#56246665)

    That monopoly is ironically called the AMD64 architecture today. This comes with a number of problems. Intel managed to keep AMD small even after AMD (not Intel; they did not have the skills) came up with the only viable 64-bit extension to the x86 architecture and, for a while, also had the fastest CPUs. AMD engineering in the CPU space has basically always been significantly superior to Intel's, except for raw speed. Meltdown and Spectre have now nicely illustrated what Intel did to get that speed. And AMD's weakness is over, with a brand-new architecture that is very well designed indeed, while Intel has nothing. It helps to understand that Intel is not actually a CPU company; they are a memory company and have struggled with CPUs since they began making them. AMD, on the other hand, came from signal processors to x86 and _is_ a CPU company. This nicely explains Intel's incompetence, incredible as it sounds. They do not have the right culture.

    One other instance of that problem is also that while AMD can do extreme customization of their CPUs since the FX generation, Intel is completely incapable in this space. And just look how long it took Intel to get the memory controller into the CPU after AMD did it.

    Intel also never managed to come up with anything x86 that was suitable for a smartphone. AMD did not even try, because they understand CPUs and knew this architecture is not suitable for that field. But they went one step further: they have server processors that include ARM cores. So AMD has real experience in that field, but Intel is, again, lost. Yet AMD is far smaller and does not need the smartphone market to survive, while Intel likely does. And they messed it up.

    My take is that Intel finally realized, after much delay, that they managed to screw themselves, in addition to their customers.

    • by jeff4747 ( 256583 ) on Monday March 12, 2018 @10:48AM (#56246767)

      AMD (not Intel, they did not have the skills) not only came up with the only viable 64 bit extension to the x86 architecture

      You're leaving out a rather important detail: Intel didn't try to create a 64 bit extension to x86.

      Instead, Intel tried to use Itanium and IA-64 to replace x86 and all the cruft in it that had built up over the years. Intel thought people would only buy Itanium for servers, since a 64-bit address space wasn't very useful for desktops at the time. So they priced their chips high.

      AMD countered with 64 bit extensions to x86 and cheaper chips.

      Cheaper won.

      • Yes, but it didn't help that Itanium was a honking disaster in the marketplace, (apart from HP, who tied themselves to that turkey...)

        • by swb ( 14022 )

          I think they thought they were going to be sitting in the catbird seat. I think Itanium was supposed to replace PA-RISC *and* PC server processors. So they would have a long-haul future with a new CPU for both PC server and workstation/midrange markets and would be able to start picking off Sun's business, too.

          I'm sure there was some MBA math involved that also took into account getting a share of licensing revenue for the chip patents, too.

          You have to admit that looking back it didn't seem like a terribl

      You contradicted yourself here. On one hand you said x86 is "cruft"; on the other hand, you admit it's cheap to make x86 chips. There really isn't cruft in x86; it's a perfectly usable design. It's not hard or expensive to implement, any more than ARM is. x86 instruction encodings can be weird, but not being aesthetic doesn't make them a performance problem or hard to implement on chip. Instruction encodings are things compilers need to be concerned with, not app programmers, anyway.

        • There really isnt cruft in x86, its a perfectly useable design

          We've figured out better ways to make CPUs than x86, and one of those ways is better instruction sets. Those instruction sets reduce the complexity of creating a chip. x86 can't do that because it has to remain compatible.

          In other words, x86 is carrying along things that are not as useful as they could be, and maintaining them is a limitation. Also known as cruft.

      • by gweihir ( 88907 )

        AMD (not Intel, they did not have the skills) not only came up with the only viable 64 bit extension to the x86 architecture

        And if you actually believe that, then you are stupid. Of course they tried. They just never had anything good enough to go public with.

      • by Kjella ( 173770 )

        Intel didn't try to create a 64 bit extension to x86.

        Yeah, Intel was trying to pull an IBM-introduces-MCA move: get rid of the competition and move everyone over to their shiny new platform, where they held all the essential patents.

        Instead, Intel tried to use Itanium and IA-64 to replace x86 and all the cruft in it that had built up over the years.

        That's one way of putting it. Somebody at Intel managed to do a huge sell-in of compiler optimization and profiling as the future, and so they created "Explicitly Parallel Instruction Computing" (EPIC), which gave extremely fine-grained control to the compiler. The problem is that there's a balance between run-time optimization ...

      • by Junta ( 36770 )

        Cheaper was not the only, or even necessarily the primary reason.

        Intel forgot the whole reason they ultimately dominated the market: backwards compatibility. Every product launch, no matter how revolutionary, had a *massive* catalog of functional software to go with it.

        Intel had enough hubris to think they could spin up an ecosystem from scratch. AMD proved that to be an incorrect strategy.

        Also, the chips were plagued with performance issues, prominent among them placing more demand on memory bandwidth ...

      • AMD countered with 64 bit extensions to x86 and cheaper chips.

        ...and faster.

        Working in the cash-rich oil and gas industry in the early 2000s, SGI tried to make a machine [wikipedia.org] using the processors. A couple of customers bought them, but realized after just a little testing that they were horrendous for single-threaded tasks and even crippled for larger ones. We ran some benchmarks in-house and couldn't find anything the machines were good at -- other than the huge memory footprint (3 TB of ...

    • History (Score:4, Informative)

      by DrYak ( 748999 ) on Monday March 12, 2018 @10:52AM (#56246801) Homepage

      Now, Intel also did never manage to come up with anything x86 that was suitable for a smartphone.

      Worse.
      They never managed to come up with anything specifically running the x86 instruction set that was suitable for a smartphone.
      They used to have a decent Intel-manufactured CPU running the ARM instruction set, but somehow managed to abandon the market and sell it off, just when ARM was getting even more relevant thanks to smartphones, routers and IoT.

      Search for "Intel StrongARM" and "Intel XScale".

      Note that, according to Wikipedia, Intel is still in possession of the ARM license it acquired when it bought StrongARM.
      So even after selling XScale off to Marvell, they could still start a new line of ARM cores *now*, after having come to the realization that the Atom doesn't scale down as much as they would have liked (it isn't well suited to smartphones and routers) and that its x86 compatibility makes absolutely no sense in those markets (Seriously, nobody is going to run legacy Windows code on a smartphone)

      AMD did not even try, because they understand CPUs and knew this architecture is not suitable for that field. But they went one step farther: They have server processors that include ARM cores.

      I'm still hoping that, next to the ARM light-weight servers that they are targeting, these ARM cores will eventually also evolve to some high range phablets and dev boards.

      • (Seriously, nobody is going to run legacy Windows code on a smartphone)

        Wait, that sounds like a great idea to me. Why not run all our windows apps and games on a smartphone? There are still things that haven't been ported. Sure, it is not good for regular use. But if I could for example quickly check something in a Visual Studio project or make a small edit to an image in Photoshop on my phone it would be really great.

      • Re:History (Score:5, Interesting)

        by epine ( 68316 ) on Monday March 12, 2018 @03:32PM (#56248391)

        Now, Intel also did never manage to come up with anything x86 that was suitable for a smartphone.

        I went through a short, thirty-year obsession with all things microarchitecture. The appalling stupidity of accepted memes in this space I'll surely carry to my lonely grave.

        Crufty x86: here's how it broke down.

        First, about 50% of the original cruft drank the shrink-me fluid, and shrank down so small you can barely see it now (e.g. some extra microcode entries in a rarely used, unpopular spiral annex of the instruction decode table for misbegotten 286-era CISC call gates.) Jesus, people, exponential happens.

        Second, about 25% of the cruft turned out to not nearly be so crufty as legend would have it. The RISC camp soils itself over the read-modify-write instruction group. But generating a complex address once (yes, x86 does complex address generation within the context of a single instruction) rather than twice alleviates substantial pressure on the address look-aside unit. It's also a very handy and compact addressing mode for minor stack spill (e.g. function variables that don't quite manage to stay in registers all the time). With a 30% instruction encoding density advantage over the original ARM32, you need many fewer transistors in your i-cache to achieve the same i-cache hit ratio. The bigger your caches, the more free transistors to apply elsewhere. x86 is still a bit short on registers despite rmw, but you gain a bunch of this back on lighter context switches, so it's not a complete write-off.

        The other 25% is an eternal pain in the ass. Here's how the PITA component breaks down. The majority of it has little impact on peak throughput at all, but it comes at a thermal efficiency cost. The thermal cost is mostly irrelevant if you are sucking juice from a wall socket, and your processor is not hitting the thermal wall. The other side of this is a hideous sunk-cost in the engineering trickery required to pull this off (for a company the size of Intel, however, hideous is mostly peanuts, and nice barrier to entry you've got there, shame if a different device category became prominent).

        A minority of the PITA aspects of the instruction set are just permanently a PITA. Deep OOO requires extensive hazard detection, and x86 has hazards up the wazoo (many partial register writes, and seventeen different flavours of flag register update subsets). This costs silicon, this costs power, this costs cycle time, this costs pipeline stages. Lose, lose, lose.

        Considering the architecture is now 40-years old, that's not exactly a resounding F on the old report card, by a sane grader.

        Because of aspects like instruction decode alignment (with those blasted variable prefix bytes) and extremely complex hazard detection x86 is just always going to produce twice as much heat arriving at the same result 20% faster than any reasonable design that was originally power conscious.

        I suspect most of this fixed thermal inefficiency resides in the front end and not the back end. Meaning that an alternative x86 instruction set could be devised (somewhat more drastically different than Thumb-2 vs. Thumb) with vastly more efficient instruction decode (thermally) and vastly fewer implicit scheduling hazards. Caches, register sets, dispatch pipelines, retirement unit, memory ports, execution units, these could all remain the same. Perhaps the only register you'd want to muck with is the flags register, and maybe you'd trash the ability to write to AH (though you'd probably keep partial register writes to AL to handle common byte operations).

        [*] Fifteen years ago, the ugly details of this stuff was more in my head, so my examples predate AMD64, but mutatis mutandis.

        Maybe by doing so you'd even close the gap enough to compete with ARM. But: a huge redevelopment and validation cost (what, me validate?), another substantially different code generation mode for every major compiler, another by

        • I wasn't arguing whether the x86 or the ARM microarchitecture is the "ultimate best one ever".

          The parent was just pointing out that Intel never had a good CPU for smartphones.

          I'm pointing out that it's worse: they used to have one (an ARM-based one) but managed to sell it off at the wrong time.

          The microarchitecture is only relevant to making the "never had a good CPU for smartphones" sentence true.
          They never had an x86 one.
          They actually had a good CPU for smartphones (which happens to be an ARM one, but that's completely orthogonal ...

        • Long, well informed comment with references, jokes, and asides? 5 digit UID? Brings me back to the glory days of /. when I was browsing using my dad's account...
    MIPS/joule is also an AMD fail. Otherwise, I agree. It's also a big reason Intel failed in the portable market. Fast, but HOT. ARM beats both in that niche, which took over more of the market than Intel and AMD could easily cope with. Not all questions have only two possible answers.
    • Re: (Score:2, Interesting)

      by Anonymous Coward

      AMD did try to create a low power x86 with DSP extensions appropriate for such a market. It was via a skunkworks company named Stexar.

      Unfortunately, the ATI acquisition and contemporary price crash on x86 at the time made it look undesirable to continue development. In hindsight, a major lost opportunity as the smartphone market took off very shortly afterwards.

    Intel didn't do a 64-bit extension at first because they were invested in Itanium. They did have the skills to extend x86 to 64 bits; it's not a very difficult thing to do. Itanium never caught on; it proved too difficult for compilers to generate well-optimized code for it. Backwards compatibility won out.

      You are correct that Intel played dangerous games and sold a defective product in order to give themselves a speed advantage over AMD, which did the right thing by its customers.

  • Intel had a Wintel phone some years ago that was actually really quick and responsive. Plenty of power to multitask and do whatever on your phone. However, ARM continues to utterly destroy x86 on power consumption.

    Now it may be too little too late, unless they are somehow able to get that consumption better and maybe move toward tablets/phablets.

  • by pablo_max ( 626328 ) on Monday March 12, 2018 @10:35AM (#56246697)

    Seriously, Intel needs to get out of the mobile chipset game because they are pretty shit at it.
    I have been in the mobile certification business for a long time. We do tens of thousands of tests on the protocol stack and hardware layers of modules integrating these chipsets. Intel-based products are always a pain. Their support is crap too. Most say, OK... never again with Intel. We will use Marvell or something. QC tends to be 4 or 5 times the price, so it often doesn't make sense for high-volume, low-cost stuff.

    Anyhow.... they started way too late in this game and missed the boat.

  • by Anonymous Coward

    The bulk of the Nokia engineers are now working at HMD Global, a.k.a "the new Nokia", still in Finland, and their new line of Nokia Android phones and feature phones have generated ~80 million sales in their first 12 months alone.

    Far cry from having been "destroyed".

  • by Gravis Zero ( 934156 ) on Monday March 12, 2018 @10:54AM (#56246813)

    Intel has a long history of anti-competitive behavior. One need only search "Intel anti-competitive behavior" or see their Wikipedia page [wikipedia.org] to recognize that it's persistent and ongoing. Yes, they have brought advances to the semiconductor field, but they have always behaved in the most unethical manner possible to subvert the competition.

    I look forward to the rise of AMD.

    • by Nemyst ( 1383049 )
      If you think Intel's demise is at the hands of AMD, you're deluding yourself. Intel's scared because the entire x86 market is shrinking and they don't have a presence in other markets. AMD is in the exact same spot while being substantially smaller and dragging along a seriously hurt GPU division. I'd be delighted to see more competition in the x86 space and Ryzen will certainly help, but that's not what Intel's concerned about here.
  • by sjbe ( 173966 ) on Monday March 12, 2018 @10:54AM (#56246817)

    And wouldn't the always risky move of combining two cultures, employees, and physical plants introduce an even greater peril?

    Depends on how they handle it. If they operate the acquired company as a stand alone entity (sort of like how Berkshire Hathaway operates) then the cultures don't really have to mix much at all and that can work fine. Mixing company cultures is a serious challenge but it's not always required.

    I think Intel's biggest challenge is that they've been a de-facto monopoly for so long that they seem to have forgotten how to compete in areas where they don't dominate. It's always a risk for a company with one big cash cow that they just milk it to the exclusion of all else. The biggest risk to Intel is software makers leaving the X86 platform, which is where the vast majority of their revenue comes from. They make some money from IoT and flash memory and security, but these are about 12% of their revenue and 7% of their profit combined.

  • by lamer01 ( 1097759 ) on Monday March 12, 2018 @11:07AM (#56246887)
    Intel has a stranglehold on server CPUs. AMD is making a comeback there but the ARM camp does not have compelling enough solutions in that space. Low power is great but it's not everything.
  • Wasn't Broadcom already eaten by Avago?
  • by ravrazor ( 69324 ) on Monday March 12, 2018 @12:32PM (#56247345)
    Intel stock is up 50% in the last 12 months (to $50), and they made about $63 billion in revenue in 2017.

    I think they're doing okay.
    • by Nemyst ( 1383049 )
      The headline is obviously sensationalized, but Intel's always been pretty forward looking in their planning. They have to act now if they want to face challenges 5 years down the line as ARM takes more and more marketshare. When you're dealing with CPU designs and fabs, you can't turn on a dime.
  • by boa ( 96754 ) on Monday March 12, 2018 @01:53PM (#56247867)

    5G is coming soon with big promises about speed and availability. Always-online netbooks are already a thing. Maybe the next generation of netbooks will use a non-Intel CPU to save both production costs and power?

    If people replace their PCs with a new Internet-enabled device(netbook, glorified cell phone, or something entirely new), sales of Intel CPUs will drop. A lot. It may be the death of both Intel CPUs and Windows OS.

    All hail Android? All hail ARM?

  • by foxalopex ( 522681 ) on Monday March 12, 2018 @02:01PM (#56247927)

    Intel isn't likely to go away anytime soon. They have some of the most advanced chip fabrication facilities in the world. Even if they didn't make x86 processors, other manufacturers would be lining up to get their chips fabricated at Intel's plants. They've also expanded into other areas, although the x86 market remains their cash cow.

    Now the only grudge I've got against Intel is their massive anti-competitive behaviour against AMD in the past but nowadays they're one of the few companies that provide full opensource access to their GPUs and they generally do produce excellent mobile laptop chips.

  • by roc97007 ( 608802 ) on Monday March 12, 2018 @02:19PM (#56248031) Journal

    This is a real question. I don't have anything against Intel, and my current workstation has Intel Inside.

    Does Intel have anything that plays well in the phone/tablet market? My understanding is that Qualcomm and/or Samsung don't own the market just because they were there first, but because their products are designed specifically for the application, whereas Intel's offerings in that arena all appear to be relatively low-power x86 chips. Key term being "relatively". Like Microsoft's early struggles with handheld devices, trying to shoehorn a desktop OS into something with a 4-inch screen, Intel appeared to be trying to leverage existing designs in a market where they weren't appropriate.

    I could be missing something, but it seems like Intel's largest current issue is that they make the best possible processor for an increasingly smaller market, and don't make anything particularly appropriate for the most aggressively expanding markets. An issue they share to a certain extent with Microsoft.

    It'll be interesting to see what happens should Intel acquire Broadcom. I think there's a good chance -- maybe 40% that after acquisition Intel will drop or severely de-emphasize Broadcom's SoC products in favor of one of their lower power laptop x86 processors. And fail miserably at it.

  • by Darkness Of Course ( 4617959 ) on Monday March 12, 2018 @05:28PM (#56248979)
    There was exactly zero chance of Intel making iPhone CPUs. It was never on the table. Intel wasn't in the business of fabbing for the mobile market.

    Otellini was smoking something when he made that claim. Absolutely nobody else in the company believed it. The market was too small, and the IP was wrong (as in, Intel was on the wrong end of it). Just like everybody else, Intel/Otellini didn't think Apple could cook up enough business to change Intel's business model, which was a complete vertical slice of IP/process/CPU/motherboards/servers, and it was making quite a bit of coin doing it. Becoming just the company that makes Apple-designed CPUs for phones? No viable business model in that.

    Maybe a fever dream left a vague unease behind, but it wasn't even a possibility. Intel never made the short list.
