Intel Fights For Its Future (mondaynote.com) 175
An anonymous reader shares a post: The Smartphone 2.0 era has destroyed many companies: Nokia, Blackberry, Palm... Will Intel be another victim, either as a result of the proposed Broadcom-Qualcomm combination, or as a consequence of a suicidal defense move? Intel sees the Qualcomm+Broadcom combination as an existential threat, an urgent one. But rather than going to the Feds to try and scuttle the deal through a long and uncertain process, Intel is rumored to be "working with advisors" (in plainer English, the company's Investment Bankers) on a countermove: acquire Broadcom. Why the sudden sense of urgency? What is the existential threat? And wouldn't the always risky move of combining two cultures, employees, and physical plants introduce an even greater peril?
To begin with, the threat to Intel's business isn't new; the company has been at risk for more than a decade. By declining Steve Jobs' proposal to make the original iPhone CPU in 2005, Intel missed a huge opportunity. The company's disbelief in Apple's ambitious forecast is belied by the numbers: More than 1.8 billion iOS devices have been sold thus far. Intel passed on the biggest product wave the industry has seen, bigger than the PC. Samsung and now TSMC manufacture iPhone CPUs. Just as important, there are billions of Android-powered machines, as well. One doesn't have to assume 100% share in the smartphone CPU market to see Intel's gigantic loss.
Typical idiocy clickbait from the "editors" (Score:2, Informative)
As usual another stupid article full of line noise instead of anything intelligent to say.
Incidentally, if Intel is "fighting for its future" by making huge profits in a variety of areas then why the hell is Qualcomm -- the effective monopolist in smartphone wireless devices and also a huge player in smartphone SoCs -- even conceivably a target of a takeover? Why the hell isn't Qualcomm about to buy out Intel if Intel is so behind the curve and Qualcomm is supposedly so great?
Let's not even forget how this
Re:Typical idiocy clickbait from the "editors" (Score:5, Insightful)
The article (or at least the Slashdot synopsis) may be editorializing excessively, but Intel is genuinely threatened by the contraction of the personal computer industry.
Consider that when PCs rose to prominence there were lots of architectures. Even after Wintel and Motorola/Apple dominated personal computers at home, business computing still had other architectures (MIPS, and Alpha immediately come to mind) to the extent that Microsoft felt the need to port their business OS to those platforms, rather than to force x86.
The end of the model in which all software has to be compatible with x86/AMD64 changes things. The gatekeepers for software on new devices (Apple's and Google's respective repositories) require that software work on their devices almost without regard to the underlying CPU, and the 'cloud' model and various virtual machine models may abstract the software developer away from the physical hardware to the point that we might again see a proliferation of architectures. Intel has reigned supreme because it was difficult to port software or to write software that runs on everything, but if that has changed, then suddenly it doesn't matter what CPU is actually in the phone or tablet or even server; it'll just work when it's time for the software to run.
That's the threat to Intel's business-model, a loss of near-monopoly on processors because new devices don't need Intel's processors.
Re: (Score:2)
Those other architectures were forced into the high-end niche, and eventually died out...
The same is happening to Intel now: it is being forced into the high-end niche, where ARM chips are suitable for an increasing number of day-to-day tasks and only people with specialised requirements currently need the higher-performance Intel chips.
Fast forward a few years: the increased sales volume of ARM chips provides more development money, and ARM starts overtaking Intel in performance too.
Re: (Score:2)
It's not just ARM. There are lots of startups coming up with various chips to do face and emotion recognition, posture recognition, motion recognition and all sorts of basic vision processing that would form a single visual circuit in a mammalian brain. These are being designed using machine learning techniques and don't require the double/floating point precision processing that are typically supplied by Intel chips. Even the fluid dynamics people were realizing that machine learning techniques were helpin
Re: (Score:2)
Android and iOS are limited in some senses, but that hasn't stopped massively widespread adoption of them either, and that includes on devices more than just phones. In some ways, what was old is new again. For the longest time the actual processing was done on a computer remote to the user, and the user's terminal was basically a dumb device that didn't really do that much. In some ways the Web has made that happen again. Admittedly not as thoroughly since there is client-side stuff going on, but by an
Re: (Score:3)
In some ways, what was old is new again.
I think we're returning to the idea of a "Home Computer" vs. a "Personal Computer / Workstation". Android and iOS really simplify things down. To me this is like the return of "Home Computers": the C64, the Apple //, etc. With those systems you either switched them on and booted right into ROM-based BASIC, or popped in a cartridge or floppy and started your application. Very limited functionality, but very simple to use. "Personal Computers", which are more like workstations, are significantly more powerful, and flexible,
Re: (Score:2)
I would say the categories can be extended to:
smart TV, smartphone, tablet, netbook, office workstation, gaming laptop/desktop PC, rack-mounted servers, PC server, engineering laptop/desktop PC.
Even smartphones and tablets are more powerful than older gaming consoles like the Ultra 64 or the PlayStation 3. A gaming PC with multiple screens and SLI cards is more powerful than an office "workstation". Engineering workstations can have dual-socket boards with quad-SLI and 40+ core Xeon chips.
Re: Typical idiocy clickbait from the "editors" (Score:2)
The Amiga and the Atari ST lacked the market-share over the sum of decades that both Wintel and Apple enjoyed. They aren't unimportant in the history of computing but when discussing the totality of the proliferation of personal computers they don't matter.
Re: (Score:2)
AMD had the best CPU back then, and it was cheaper too. The problem is that to actually use the Athlon you were stuck with garbage like VIA's KT133 chipset. It's no wonder the OEMs fled in terror to boring but safe choices like Intel's 440BX chipset.
Re: (Score:3)
The same reason why AOL was able to buy TimeWarner in 2000. What matters is apparent value not actual value.
Smartphones are the hot product, so Qualcomm has more apparent value than Intel, which focuses on boring (but profitable) PC and server CPUs and memory.
Yes, it's stupid; yes, it doesn't make sense; but that is how stock markets work. To the market, a company not making quite as much money as predicted is the same as a company losing money.
amd epyc has the pci-e for storage without (Score:2)
AMD EPYC has the PCIe lanes for storage without needing to cross-flash/reflash a controller to IT mode or use lots of PCIe switches.
Re: (Score:2)
Don't forget that it would also give Intel a massive market share in WiFi radios. Every Mac uses a Broadcom radio, and many Windows PC OEMs ship Intel PRO/Wireless for vPro and a Broadcom alternative for non-vPro. There are a few out there using Atheros (including Dell), but the early versions of those were rather lumpy, and the fix was usually to throw a $30 Intel radio into the slot where the Atheros used to be.
But guess what? Atheros is Qualcomm. So if all of this comes to pass, it all becomes Intel.
iPhone CPUs? (Score:3, Insightful)
Re: (Score:3)
You don't want to enter a cutthroat low-margin market.
That would be fine if Intel's markets weren't shrinking. However, Intel can't maintain itself on its current markets, as they are all shrinking in favor of Mobile and, to a lesser extent, Cloud. Those are both areas where Intel is not terribly strong.
You would think that Intel and Cloud would go hand in hand, but that isn't necessarily true.
Most revenue from X86 (Score:2)
However, Intel can't maintain itself on its current markets, as they are all shrinking in favor of Mobile and, to a lesser extent, Cloud.
Well Intel supplies an awful lot of those CPUs for the cloud so I don't think that worries them so much. Mobile is an issue for them because that is definitely where the growth is. The biggest threat to Intel is that they have so much of their revenue and profit tied up in the X86 platform. If software and PC makers continue to migrate away from X86 it's going to hurt Intel badly sooner or later.
Re: (Score:2)
However, Intel can't maintain itself on its current markets, as they are all shrinking in favor of Mobile and, to a lesser extent, Cloud.
Well Intel supplies an awful lot of those CPUs for the cloud so I don't think that worries them so much. Mobile is an issue for them because that is definitely where the growth is. The biggest threat to Intel is that they have so much of their revenue and profit tied up in the X86 platform. If software and PC makers continue to migrate away from X86 it's going to hurt Intel badly sooner or later.
Even cloud providers are looking to move to a different platform, and many are involved in OpenPower (https://openpowerfoundation.org/membership/current-members/) to find better hardware in terms of cost per watt for cloud workloads where floating point or integer computational performance may not matter as much.
Threats to Intel (Score:2)
I don't think ARM or AMD are going to be the next big threat to the Intel monopoly.
AMD isn't and likely won't be a threat. Intel makes more in profit than AMD does in revenue. AMD is unfortunately a rival in name only and they operate at a significant cost disadvantage to Intel because Intel is vertically integrated and AMD isn't. I'd like to see AMD doing better but if you look at the financial statements of both companies (and I have) you'll quickly conclude that AMD is trying to diversify away from competing with Intel because it's a game they cannot win. They've been trying and fa
Re: (Score:2)
Virtualization in concentrated data centers can reduce physical hardware needs to a degree but this has a lower bound and doesn't continue indefinitely
I think the worry is that the lower bound is too low to justify the expensive fabs Intel has invested in. There is a threshold below which Intel chips would cease to be a high-volume but lucrative cash cow and begin to be just another medium-volume chip that cannot justify its own fab and high R&D costs. Once any chip architecture comes rolling out of the same fab, where is the performance advantage? What will happen to their margins?
Re: (Score:2)
Re: (Score:3)
You don't count motherboard chipsets, network interface controllers and integrated circuits, flash memory, graphics chips, embedded processors and other devices related to communications and computing. [wikipedia.org] as "much"?
Re: (Score:2)
You are attaching too much importance to the iPhone CPUs (and Android) market. It is doubtful the margins are high on those, especially since Apple has multiple manufacturers. That is like saying Apple missed out on making Android phones because there were so many of them out there. You don't want to enter a cutthroat low-margin market.
Also, what if Apple had gone with Intel for their CPU, and then failed because Intel's CPUs sucked batteries dry? Or because having the same CPU in desktops, laptops, and mobile devices led Apple down the obvious path of cross-platform compatibility, and that sucked batteries dry? Or if Apple had wanted to gradually take over more of the system to customize it to better serve their needs, and Intel had said "F. U."? I think Apple was probably lucky that Intel turned them down.
Re: (Score:2)
You don't want to enter a cutthroat low-margin market.
In general, yes, I agree.
BUT: semiconductor manufacturing is a very capital-intensive industry. If there isn't enough volume in the high-margin game to keep your multi-billion-dollar factories occupied, you will not be able to justify the construction of another. And then you've lost the game, because your cheap-shit competitor will eventually surpass the manufacturing technology that allowed you to charge a premium in the first place. At that point, your best option is to contract out your manufacturing,
As the largest player, that's where you should go (Score:3)
You don't want to enter a cutthroat low-margin market.
As one of the largest players in the CPU space, you absolutely do want to do that.
First of all, low-margin does not mean NO margin, and a billion of anything at low margin is still a lot of money.
Secondly, that is a lot of great R&D opportunity in a challenging space that you are giving up to some other company. You can sit around all day designing new processors or features, but until a design comes into contact with real-world uses and needs, your design will
Re: (Score:2)
You are attaching too much importance to the iPhone CPUs (and Android) market. It is doubtful the margins are high on those, especially since Apple has multiple manufacturers. That is like saying Apple missed out on making Android phones because there were so many of them out there. You don't want to enter a cutthroat low-margin market.
That might be true if your only concern is margin and not survival. I would say that a case is being made that PCs are slowly dying and being replaced with smaller devices many of which do not and will not run Intel CPUs. I would say it's the same problem that Sun Microsystems faced. AMD, Intel, and Cyrix were all fighting on the x86 market with Intel coming out on top and AMD relegated to 2nd class. Sun stayed out of the consumer market completely and failed to innovate in the server market. These days lo
Re: (Score:2)
Right, Sun is probably the best example. They *thought* they were better off in the high-margin, high-end microcomputer market. The problem is that while the margins might have been better, the market was shrinking, and it underwent a rapid shrink; too rapid for Sun to respond to effectively once Intel (and compatibles) got good enough for a lot of those jobs. Nearly a century before, you see the opposite with FoMoCo. There were tens if not more little automakers around the Great Lakes region (proximity to exi
Re: (Score:2)
You are attaching too much importance to the iPhone CPUs (and Android) market. It is doubtful the margins are high on those, especially since Apple has multiple manufacturers. That is like saying Apple missed out on making Android phones because there were so many of them out there. You don't want to enter a cutthroat low-margin market.
The same process happened in the 1980s when Intel overtook the mainframe market, and folks said the same thing. Like it or not, laptops and (even more so) desktops are retreating to the areas that genuinely require high-performance hardware. As productivity applications (LibreOffice, MS Office, etc.) and financial applications (Quicken) get better on mobile (Google Docs, Sheets, and Slides; O365; Quicken is already on iOS and Android), then the need for even a laptop will go away entirely; for the every-day user it will be
Intel lies, Intel spies... (Score:3)
Re: (Score:1)
And Intel can't create a power-efficient CPU without cheating on the process and leaving exploitable holes in it.
Was that a hope? (Score:2)
But rather than going to the Feds to try and scuttle the deal through a long and uncertain process, Intel is rumored to be "working with advisors" (in plainer English, the company's Investment Bankers) on a countermove: acquire Broadcom.
Is that what we're hoping for now? Or is it simply that we expect that to happen and are shocked when it doesn't?
Intel is getting nibbled on (Score:1)
Intel chose not to worry about ARM.
Intel is getting its breakfast eaten by ARM.
Intel is getting its lunch nibbled on by Qualcomm/AMD
AMD is now tasting Intel's dinner.
Intel BOTCHED the Spectre and Meltdown patches, to the point that I will not apply those microcode patches and will seriously consider AMD for my next build.
Intel seems to be following the MSFT QA principle, "let your customers be the beta-testers", except they have moved into the "Alpha" phase.
No thanks, I want a CPU that works and that is secure
Intel relies on a monopoly (Score:5, Interesting)
That monopoly is, ironically, called the AMD64 architecture today. This comes with a number of problems. Intel managed to keep AMD small after the last time AMD was ahead, when AMD (not Intel, which did not have the skills) not only came up with the only viable 64-bit extension to the x86 architecture but for a while also had the fastest CPUs. AMD engineering in the CPU space has basically always been significantly superior to Intel's, except for raw speed. Meltdown and Spectre have now nicely illustrated what Intel did to get that speed. And AMD's weakness is over, with a brand-new architecture that is very well designed indeed, while Intel has nothing. It helps to understand that Intel is not actually a CPU company; it is a memory company and has struggled with CPUs since it began making them. AMD, on the other hand, came to x86 from signal processors and _is_ a CPU company. This nicely explains Intel's incompetence, incredible as it sounds. They do not have the right culture.
Another instance of that problem: AMD has been able to do extreme customization of its CPUs since the FX generation, while Intel is completely incapable in this space. And just look how long it took Intel to get the memory controller into the CPU after AMD did it.
Intel also never managed to come up with anything x86 that was suitable for a smartphone. AMD did not even try, because they understand CPUs and knew the architecture is not suitable for that field. But they went one step farther: they have server processors that include ARM cores. So AMD has real experience in that field, but Intel is, again, lost. Yet AMD is far smaller and does not need the smartphone market to survive, while Intel likely does. And they messed it up.
My take is that Intel has finally figured out, with much delay, that they managed to screw themselves in addition to their customers.
Re:Intel relies on a monopoly (Score:5, Insightful)
AMD (not Intel, they did not have the skills) not only came up with the only viable 64 bit extension to the x86 architecture
You're leaving out a rather important detail: Intel didn't try to create a 64 bit extension to x86.
Instead, Intel tried to use Itanium and IA-64 to replace x86 and all the cruft in it that had built up over the years. Intel thought people would only buy Itanium for servers, since a 64-bit address space wasn't very useful for desktops at the time. So they priced their chips high.
AMD countered with 64 bit extensions to x86 and cheaper chips.
Cheaper won.
Re: (Score:2)
Yes, but it didn't help that Itanium was a honking disaster in the marketplace, (apart from HP, who tied themselves to that turkey...)
Re: (Score:2)
I think they thought they were going to be sitting in the catbird seat. I think Itanium was supposed to replace PA-RISC *and* PC server processors. So they would have a long-haul future with a new CPU for both PC server and workstation/midrange markets and would be able to start picking off Sun's business, too.
I'm sure there was some MBA math involved that also took into account getting a share of licensing revenue for the chip patents, too.
You have to admit that looking back it didn't seem like a terribl
Re: (Score:2)
You contradicted yourself here. On one hand you say x86 is "cruft"; on the other hand, you admit it's cheap to make x86 chips. There really isn't cruft in x86; it's a perfectly usable design. It's not harder or more expensive to implement than ARM. x86 instruction encodings can be weird, but not being aesthetic doesn't make them a performance problem or hard to implement on chip. Instruction encodings are something compilers need to be concerned with, not app programmers, anyway.
Re: (Score:2)
There really isnt cruft in x86, its a perfectly useable design
We've figured out better ways to make CPUs than x86, and one of those ways is better instruction sets. Those instruction sets reduce the complexity of creating a chip. x86 can't do that because it has to remain compatible.
In other words, x86 is carrying along things that are not as useful as they could be, and maintaining them is a limitation. Also known as cruft.
Re: (Score:2)
AMD (not Intel, they did not have the skills) not only came up with the only viable 64 bit extension to the x86 architecture
And if you actually believe that, then you are stupid. Of course they tried. They just never had anything good enough to go public with.
Re: (Score:3)
Intel didn't try to create a 64 bit extension to x86.
Yeah, Intel was trying to pull an IBM and its MCA architecture: get rid of the competition by moving everyone over to a shiny new platform where they held all the essential patents.
Instead, Intel tried to use Itanium and IA-64 to replace x86 and all the cruft in it that had built up over the years.
That's one way of putting it. Somebody at Intel managed to do a huge sell-in of compiler optimization and profiling as the future and so they created "Explicitly parallel instruction computing (EPIC)" which gave extremely fine detailed control to the compiler. The problem is that there's a balance between run-time optimization b
Re: (Score:2)
Cheaper was not the only, or even necessarily the primary reason.
Intel forgot the whole reason they ultimately dominated the market: backwards compatibility. Every product launch, no matter how revolutionary had a *massive* catalog of functional software to go with it.
Intel had enough hubris to think they could spin up an ecosystem from scratch. AMD proved that to be an incorrect strategy.
Also, the chips were plagued with performance issues, prominent among them placing more demand on memory bandwidth
Re: (Score:2)
AMD countered with 64 bit extensions to x86 and cheaper chips.
...and faster.
Working in the cash-rich oil and gas industry in the early 2000s, SGI tried to make a machine [wikipedia.org] using the processors. A couple of customers bought them, but realized after just a little bit of testing that they were horrendous for single-threaded tasks and even crippled for larger ones. We ran some benchmarks in-house and couldn't find anything the machines were good at, other than the huge memory footprint (3 TB of
History (Score:4, Informative)
Intel also never managed to come up with anything x86 that was suitable for a smartphone.
Worse.
They never managed to come up with anything specifically running the x86 instruction set that was suitable for a smartphone.
They used to have a decent Intel-manufactured CPU running the ARM instruction set, but somehow managed to abandon the market and sell the line off, just as ARM was getting even more relevant thanks to smartphones, routers, and IoT.
Search for "Intel StrongArm" and "Intel XScale".
Note that, according to Wikipedia, Intel still possesses the ARM license it acquired when it bought StrongARM.
So even after selling XScale to Marvell, they could still start a new line of ARM cores *now*, after having come to the realization that the Atom doesn't scale down as much as they would have liked (it isn't well suited to smartphones and routers) and that its x86 compatibility makes absolutely no sense in those markets (seriously, nobody is going to run legacy Windows code on a smartphone).
AMD did not even try, because they understand CPUs and knew this architecture is not suitable for that field. But they went one step farther: They have server processors that include ARM cores.
I'm still hoping that, next to the ARM light-weight servers that they are targeting, these ARM cores will eventually also evolve to some high range phablets and dev boards.
Re: (Score:2)
(Seriously, nobody is going to run legacy Windows code on a smartphone)
Wait, that sounds like a great idea to me. Why not run all our windows apps and games on a smartphone? There are still things that haven't been ported. Sure, it is not good for regular use. But if I could for example quickly check something in a Visual Studio project or make a small edit to an image in Photoshop on my phone it would be really great.
Better battery (Score:2)
Well, running the software on dedicated hardware (or in a full-blown VM somewhere in the cloud, as several gaming solutions do) and streaming to your phone still beats everything in terms of battery life vs. performance.
(Also, you can turn the PC on and off remotely; no need to have it run all the time. That's the whole point of Etherwake, or of newer lights-out management technologies like Intel ME and IPMI, which even provide VNC remote access. But sadly they often also provide tons of exploitable bugs.)
E
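The Etherwake mechanism mentioned above is simple enough to sketch. Below is a minimal Python illustration of the Wake-on-LAN "magic packet"; the MAC address shown is a placeholder, and real use requires a NIC with WoL enabled in its firmware.

```python
import socket

def build_magic_packet(mac):
    # A Wake-on-LAN "magic packet" is 6 bytes of 0xFF followed by the
    # target NIC's MAC address repeated 16 times (102 bytes total).
    mac_bytes = bytes.fromhex(mac.replace(":", "").replace("-", ""))
    if len(mac_bytes) != 6:
        raise ValueError("expected a 6-byte MAC address")
    return b"\xff" * 6 + mac_bytes * 16

def send_magic_packet(mac, broadcast="255.255.255.255", port=9):
    # The packet is conventionally sent as a UDP broadcast on port 9
    # (or 7); the sleeping NIC matches on the payload, not the headers.
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
        s.sendto(build_magic_packet(mac), (broadcast, port))

# Placeholder MAC, not a real machine:
# send_magic_packet("00:11:22:33:44:55")
```

This is the whole protocol; tools like etherwake do essentially nothing more, which is why WoL itself has no authentication and lights-out management grew the richer (and buggier) interfaces mentioned above.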
Re:History (Score:5, Interesting)
I went through a short, thirty-year obsession with all things microarchitecture. The appalling stupidity of accepted memes in this space I'll surely carry to my lonely grave.
Crufty x86: here's how it broke down.
First, about 50% of the original cruft drank the shrink-me fluid, and shrank down so small you can barely see it now (e.g. some extra microcode entries in a rarely used, unpopular spiral annex of the instruction decode table for misbegotten 286-era CISC call gates.) Jesus, people, exponential happens.
Second, about 25% of the cruft turned out to not nearly be so crufty as legend would have it. The RISC camp soils itself over the read-modify-write instruction group. But generating a complex address once (yes, x86 does complex address generation within the context of a single instruction) rather than twice alleviates substantial pressure on the address look-aside unit. It's also a very handy and compact addressing mode for minor stack spill (e.g. function variables that don't quite manage to stay in registers all the time). With a 30% instruction encoding density advantage over the original ARM32, you need many fewer transistors in your i-cache to achieve the same i-cache hit ratio. The bigger your caches, the more free transistors to apply elsewhere. x86 is still a bit short on registers despite rmw, but you gain a bunch of this back on lighter context switches, so it's not a complete write-off.
The other 25% is an eternal pain in the ass. Here's how the PITA component breaks down. The majority of it has little impact on peak throughput at all, but it comes at a thermal efficiency cost. The thermal cost is mostly irrelevant if you are sucking juice from a wall socket, and your processor is not hitting the thermal wall. The other side of this is a hideous sunk-cost in the engineering trickery required to pull this off (for a company the size of Intel, however, hideous is mostly peanuts, and nice barrier to entry you've got there, shame if a different device category became prominent).
A minority of the PITA aspects of the instruction set are just permanently a PITA. Deep OOO requires extensive hazard detection, and x86 has hazards up the wazoo (many partial register writes, and seventeen different flavours of flag register update subsets). This costs silicon, this costs power, this costs cycle time, this costs pipeline stages. Lose, lose, lose.
Considering the architecture is now 40-years old, that's not exactly a resounding F on the old report card, by a sane grader.
Because of aspects like instruction decode alignment (with those blasted variable prefix bytes) and extremely complex hazard detection x86 is just always going to produce twice as much heat arriving at the same result 20% faster than any reasonable design that was originally power conscious.
I suspect most of this fixed thermal inefficiency resides in the front end and not the back end. Meaning that an alternative x86 instruction set could be devised (somewhat more drastically different than Thumb-2 vs. Thumb) with vastly more efficient instruction decode (thermally) and vastly fewer implicit scheduling hazards. Caches, register sets, dispatch pipelines, retirement unit, memory ports, execution units, these could all remain the same. Perhaps the only register you'd want to muck with is the flags register, and maybe you'd trash the ability to write to AH (though you'd probably keep partial register writes to AL to handle common byte operations).
[*] Fifteen years ago, the ugly details of this stuff was more in my head, so my examples predate AMD64, but mutatis mutandis.
Maybe by doing so you'd even close the gap enough to compete with ARM. But: a huge redevelopment and validation cost (what, me validate?), another substantially different code generation mode for every major compiler, another by
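The decode-alignment problem described above can be shown with a toy model (this is not real x86 or ARM encoding, just an illustration of the serial dependency): with a fixed-width ISA every instruction boundary is known up front, so many instructions can be handed to parallel decoders at once, while with a variable-length ISA the length of instruction N must be determined before the start of instruction N+1 is even known.

```python
def boundaries_fixed(stream, width=4):
    # Fixed-width ISA: all boundaries are trivially computable in
    # parallel from the stream length alone.
    return list(range(0, len(stream), width))

def boundaries_variable(stream):
    # Variable-length ISA: finding each boundary requires (partially)
    # decoding the previous instruction first, a serial chain that real
    # x86 front ends break with extra predecode/length-marking hardware.
    offsets, i = [], 0
    while i < len(stream):
        offsets.append(i)
        i += stream[i]  # toy rule: first byte encodes the length
    return offsets

# Three toy "instructions" of lengths 2, 5, and 1:
code = bytes([2, 0xAA, 5, 1, 2, 3, 4, 1])
print(boundaries_fixed(bytes(8)))    # [0, 4]
print(boundaries_variable(code))     # [0, 2, 7]
```

The trade-off cuts both ways, as the parent notes: the variable-length stream packs more work into fewer i-cache bytes, but the decoder pays for it in silicon and power.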
Not architecture (Score:2)
I wasn't arguing whether the x86 or the ARM microarchitecture is the "ultimate best one ever".
The parent was just pointing out that Intel never had a good CPU for smartphones.
I'm pointing out that it's worse: they used to have one (an ARM-based one) but managed to sell it off at the wrong time.
The microarchitecture is only relevant to making the "never had a good CPU for smartphones" sentence true.
They never had an x86 one.
They actually had a good CPU for smartphones (which happened to be an ARM, but that's completely orthogon
Re: (Score:2)
Re: (Score:2)
Re: (Score:2, Interesting)
AMD did try to create a low power x86 with DSP extensions appropriate for such a market. It was via a skunkworks company named Stexar.
Unfortunately, the ATI acquisition and contemporary price crash on x86 at the time made it look undesirable to continue development. In hindsight, a major lost opportunity as the smartphone market took off very shortly afterwards.
Re: (Score:2)
Intel didn't do a 64-bit extension at first because they were invested in Itanium. They did have the skills to extend x86 to 64 bits; it's not a very difficult thing to do. Itanium never caught on; it proved too difficult for compilers to generate well-optimized code for it. Backwards compatibility won out.
You are correct that Intel played dangerous games and sold a defective product in order to give themselves a speed advantage over AMD, which did the right thing by its customers.
It's funny (Score:2)
Intel had a Wintel phone some years ago that was actually really quick and responsive. Plenty of power to multitask and do whatever on your phone. However, ARM continues to utterly destroy x86 on power consumption.
Now it may be too little, too late, unless they somehow get power consumption down and move toward tablets/phablets.
Re: It's funny (Score:2)
Intel CPUs are in $60 Windows tablets; they are in the tablet market, just not the Android tablet market.
Intel should forget about Mobile (Score:4, Interesting)
Seriously, Intel needs to get out of the mobile chipset game because they are pretty shit at it.
I have been in the mobile certification business for a long time. We run tens of thousands of tests on the protocol stack and hardware layers of modules integrating these chipsets. Intel-based products are always a pain, and their support is crap too. Most say OK, never again with Intel; we will use Marvell or something. QC tends to be 4 or 5 times the price, so it often doesn't make sense for high-volume, low-cost stuff.
Anyhow, they started way too late in this game and missed the boat.
How is Nokia destroyed, exactly? (Score:1)
The bulk of the Nokia engineers are now working at HMD Global, a.k.a "the new Nokia", still in Finland, and their new line of Nokia Android phones and feature phones have generated ~80 million sales in their first 12 months alone.
A far cry from having been "destroyed".
Re: (Score:2)
People who were employees of Nokia are not Nokia.
Shed no tears for them. (Score:4, Informative)
Intel has a long history of anti-competitive behavior. One need only search "Intel anti-competitive behavior" or see their Wikipedia page [wikipedia.org] to recognize that it's persistent and ongoing. Yes, they have brought advances to the semiconductor field, but they have always behaved in the most unethical manner possible to subvert the competition.
I look forward to the rise of AMD.
Operate standalone (Score:3)
And wouldn't the always risky move of combining two cultures, employees, and physical plants introduce an even greater peril?
Depends on how they handle it. If they operate the acquired company as a stand alone entity (sort of like how Berkshire Hathaway operates) then the cultures don't really have to mix much at all and that can work fine. Mixing company cultures is a serious challenge but it's not always required.
I think Intel's biggest challenge is that they've been a de facto monopoly for so long that they seem to have forgotten how to compete in areas they don't dominate. It's always a risk for a company with one big cash cow that they'll just milk it to the exclusion of all else. The biggest risk to Intel is software makers leaving the x86 platform, which is where the vast majority of their revenue comes from. They make some money from IoT, flash memory, and security, but these are about 12% of their revenue and 7% of their profit combined.
This article is nonsense (Score:3)
Re: (Score:2)
ARM camp does not have compelling enough solutions in that space.
Check out these benchmarks.
https://blog.cloudflare.com/arm-takes-wing/ [cloudflare.com]
Looks like the situation is rapidly deteriorating for Intel.
I don't know, we've seen RISC vs CISC before (Score:2)
Acquire? (Score:2)
Re: (Score:2)
After Avago bought Broadcom, it changed its name to Broadcom.
"fighting"? really? (Score:5, Insightful)
I think they're doing okay.
Re: (Score:2)
And "now" seems almost like a desperate attempt at catch-up. I think Intel should have made their move a few years ago.
Re: (Score:2)
ARM is low performance, there is no threat to intel's main markets.
5G and "always online" netbooks as a threat? (Score:3)
5G is coming soon with big promises about speed and availability. Always-online netbooks are a thing already. Maybe the next generation of netbooks will use a non-Intel CPU to save both production costs and power?
If people replace their PCs with a new Internet-enabled device (netbook, glorified cell phone, or something entirely new), sales of Intel CPUs will drop. A lot. It may be the death of both Intel CPUs and the Windows OS.
All hail Android? All hail ARM?
Re: (Score:3)
I feel the same way, but people seem to be happy with their cell phones, data cap or not. BTW, in some countries, like Finland, there's no data cap. If 5G delivers what it promises, throughput should not be an issue. And Wi-Fi isn't dead yet either :)
Intel's Future is pretty solid (Score:3)
Intel isn't likely to go away anytime soon. They have some of the most advanced chip-fabrication facilities in the world. Even if they didn't make x86 processors, other manufacturers would be lining up to get their chips fabricated at Intel's plants. They've also expanded into other areas, although the x86 market remains their cash cow.
Now, the only grudge I've got against Intel is their massive anti-competitive behaviour against AMD in the past, but nowadays they're one of the few companies that provide full open-source access to their GPUs, and they generally produce excellent mobile laptop chips.
Can Intel even play in this market? (Score:4, Interesting)
This is a real question. I don't have anything against Intel, and my current workstation has Intel Inside.
Does Intel have anything that plays well in the phone/tablet market? My understanding is that Qualcomm and/or Samsung don't own the market just because they were there first, but because their products are designed specifically for the application, whereas Intel's offerings in that arena all appear to be relatively low-power x86 chips. Key term being "relatively". Like Microsoft's early struggles with handheld devices, trying to shoehorn a desktop OS into something with a 4-inch screen, Intel appeared to be trying to leverage existing designs in a market where they weren't appropriate.
I could be missing something, but it seems like Intel's largest current issue is that they make the best possible processor for an increasingly smaller market, and don't make anything particularly appropriate for the most aggressively expanding markets. An issue they share to a certain extent with Microsoft.
It'll be interesting to see what happens should Intel acquire Broadcom. I think there's a good chance -- maybe 40% -- that after the acquisition, Intel will drop or severely de-emphasize Broadcom's SoC products in favor of one of their lower-power laptop x86 processors. And fail miserably at it.
No truth to iPhone cpu myth (Score:3, Insightful)
Otellini was smoking something when he made that claim. Absolutely nobody else in the company believed it. The market was too small, and the IP was wrong (as in, Intel was on the wrong end of it). Just like everybody else, Intel/Otellini didn't think Apple could cook up enough business to change Intel's business model, which was a complete vertical slice of IP/Process/CPU/MB/Servers, and Intel was making quite a bit of coin doing it. Becoming just the company that makes Apple-designed CPUs for phones? There was no viable business model in that.
Maybe a fever dream left a vague unease behind, but it wasn't even a possibility. Intel never made the short list.
Re: (Score:1)
The PC market isn't dying, it's just changing architectures away from the one Intel has dominated.
However, thanks to the Internet, cellular companies wanting data transfer, and businesses wanting to offer everything "aaS", they still have a large market server-side, though they're now missing out on many typical consumers.
Re: (Score:2)
I think there's an argument to be made that it is ossifying. CPU and other architecture capabilities have risen to such a level in the last decade that it has disrupted the upgrade cycle. We're still running workstations we bought in 2009, and they even run Win10 (though not the latest creators update, but who cares). They do fine for browsing, document editing and the like, and now we simply replace them as they die, which doesn't actually happen all that often.
Now maybe my company is pushing the envelope
Re:Strange article (Score:5, Insightful)
Because the PC market is dying.
Only if you ask the stock market. Stable demand without growth is called a business (though shareholders tend not to care about that). Replacement cycles are long, but nothing has supplanted the PC.
Re: (Score:2)
The PC market is shrinking fast, and what has replaced it is a plethora of smart products: the smartphone, the smart TV and tablet, and a cheap notebook for the students in the family. For business, smart terminals are simply easier to manage, with no pesky USB ports or accessible disk drives of whatever type. The desktop has been replaced in the vast majority of instances, and the market is shrinking back to power users, the core, who hate M$ and only barely put up with them. And that is killing Intel, because everyone is holding off upgrade
Re:Strange article (Score:4, Interesting)
For business, smart terminals, simply easier to manage
Thin client vs thick client has been a tick-tock ever since the first mainframes entered the business market.
Desktops are inevitably doomed but they can stretch out the next few decades
That's as close to living as we've ever had... Decades are an eternity in computing.
Re:Strange article (Score:5, Insightful)
PC market is shrinking fast
No, it isn't. PC sales are dropping slowly and steadily, but PC market penetration has not changed dramatically; people just upgrade more rarely.
Re: (Score:2)
Many don't get replaced at all...
A lot of people bought a PC to access the internet because there was little choice at the time. Many of those people have since moved on to tablets, gaming consoles, and smartphones, so while they still have a PC, it is probably gathering dust, won't be replaced if it dies, and probably doesn't get used much, if at all.
Re: (Score:2)
...people just upgrade more rarely.
Product Activation is what did me in. I used to rebuild my PC every 6 months, just because I could. After XP changed the game and I had my Windows license deactivated twice after upgrades (both times requiring me to call Microsoft and beg to use my PC again), I just stopped upgrading my hardware and learned to live with what I have.
No, I'm not going to switch to Linux (which I've been trying to do for 15 years). Yes, these days I can just use a pirated Windows if necessary. However, the point is that u
Re: (Score:3)
PC market is shrinking because they failed to solve the problems necessary to make them relevant in a portable device world.
A home PC should be like a furnace - rarely physically interacted with but fully integrated into the home. Every fixed screen in my home should be dumb, they should all run off a single PC. All "smart devices" should simply be interfaces that use the PC's hardware to execute/control their functions. Smartphones/tablets/portable devices should have a power saving mode that enables th
Re: (Score:2)
they should all run off a single PC. All "smart devices" should simply be interfaces that use the PC's hardware to execute/control their functions
For vendors... they keep more control if they use their own cloud, AND end users don't have to worry about replacing an expensive single PC to restore all those functions when it fails/dies
Re: (Score:2)
Vendors have an interest in not having to provide their own cloud. That is an ongoing expense for a one-time purchase. Those who want it for control are usually up to no good. Those who want it for data really don't need the cloud service; they just need the software to phone home telemetry and perform updates, things that can be standardized for user privacy & security. The reality is that it's not a sustainable business model to run IoT devices without a subscription service.
On the expense issu
Re: (Score:2)
Vendors have an interest in not having to provide their own cloud. That is an ongoing expense for a one-time purchase.
No... it's probably a small expense for most products, and an eventual opportunity to get recurring revenue out of their customers -- either by starting to bill a new subscription (The Cloud excuse helps facilitate a "Rental model" for license to use the hardware and software --- Being cloud-based usually means additional revenue opportunities for the provider or more options to furt
Re: (Score:2)
Subscription services sound nice, there's just not the income to support that very widely. People also can't afford the rate of obsolescence that would be required to sustain such a model either. We're already seeing that in the smartphone market and that's essentially what happened to the PC market. Buy small/cheaper devices that are "good enough" instead of an expensive/complicated/bulky PC.
I'm certainly not going to buy a "smart coffee maker" with a subscription & planned obsolescence when I can buy a dumb one without.
Re: (Score:2)
I'm certainly not going to buy a "smart coffee maker" with a subscription & planned obsolescence when I can buy a dumb one without.
The "Smart" coffee makers are called Keurig, and they're pretty popular --- some of the latest models use DRM (Digital Rights Management) technology in the form of a chip in their manufactured coffee pods to discourage/prevent using 3rd party or generic pods.
Re: (Score:2)
Vendors have an interest in not having to provide their own cloud. That is an ongoing expense for a one-time purchase
They love forcing their own cloud. They can force your devices to quit working so you'll buy the next new thing. If you had direct interface, you could use the thing pretty much forever.
Re: (Score:2)
Ummm, that's a very stupid argument, I hope you realize.
Windows beat Linux because you can't play heavy games on Linux, and that's the only reason. You sure as hell can't play those games on a smartphone. First, it's running Linux, and second, it doesn't have the power. It will never have the power, because energy consumption goes up with the square of the CPU's abilities, and batteries can't increase in capacity that fast.
Second, you can't do any decent word processing on a phone or tablet; they're too prone to han
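The "square of the CPU's abilities" claim above is roughly the standard CMOS dynamic-power approximation, P = C * V^2 * f: higher clock speeds typically require higher supply voltage, so power grows super-linearly with performance. A minimal sketch of that relation, with purely illustrative numbers (not from any real chip's datasheet):

```python
def dynamic_power(c_farads, volts, hertz):
    """Standard CMOS dynamic-power approximation: P = C * V^2 * f,
    where C is switched capacitance, V is supply voltage, f is clock frequency."""
    return c_farads * volts**2 * hertz

# Illustrative figures only (assumed, not measured):
low = dynamic_power(1e-9, 0.8, 1.0e9)   # modest clock at low voltage
high = dynamic_power(1e-9, 1.2, 2.0e9)  # 2x clock, which needs 1.5x voltage

# Doubling frequency while raising voltage 1.5x costs 2 * 1.5^2 = 4.5x the power,
# which is the basic reason peak-performance designs lose badly on battery life.
print(high / low)  # -> 4.5
```

This is why ARM cores tuned for low voltage and frequency can undercut x86 designs tuned for peak performance: the power cost of performance is super-linear, not proportional.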
Re: (Score:3)
but nothing has supplanted the PC
If you think about this, it is not a defensible statement. Sure, PCs still have a large collection of niches that nothing has supplanted - but you are ignoring the huge number of niches which have disappeared. You probably have a computer in your pocket right now with approximately the same power as a late-80s Cray. It has almost entirely wiped out the social aspect of the PC - email, IM, web forums, video and music sharing, etc. The PC games market [newzoo.com] is slowly losing ground to mobile.
Will there always be a m
Re: (Score:2)
Since the argument was against the idea that the market for the PC is dying entirely, I wouldn't even consider all the people that never would have had a PC if they had an alternative. It's back to its original niche.
If the definition of PC is x86 compatibility, the market may eventually go away. If the definition is full-power, full-size personal computing device, I can't yet imagine a future like that.
Re: (Score:3)
. It's back to its original niche.
The problem for us (and for Intel) is that niche is a small fraction of what they are sized for. Their multi-billion dollar fabs sitting idle is a financial disaster. Intel is already doing some contract manufacturing, but that's a tough game with many experienced competitors. Yes, I think you are right that there will always be a market for workstations, but I think we're going to see a slow drift towards what is becoming the new standard in commodity hardware. Most of us will use "PCs" with repurposed mob
Re: (Score:2)
If the definition of PC is x86 compatibility, the market may eventually go away. If the definition is full-power, full-size personal computing device, I can't yet imagine a future like that.
This! My tablet isn't even close to replacing my x86 machine. I need a keyboard, mouse, multi-windows, file-system with user-created directories, etc. It might not have to be compatible with my current Wintel machines, but it has to have similar capabilities.
Re: (Score:2)
Most smartphones support bluetooth keyboards...
Many can be docked to a larger screen, so you have the same device whether you're mobile or in a fixed location -- best of both.
For many use cases a smartphone is "good enough", typing may be slower but many people aren't very proficient typists anyway. There are also various speech to text options which have improved a lot in recent years.
Re: (Score:2)
Whether you personally do it or not is beside the point - the market has gone that way.
Re: (Score:2)
No, we'd need to go from devices with walled gardens with a 30% cut on app purchases to a device with an open (but secure) market.
Re: (Score:2)
For anything commercial, for this to work, we need to convince commercial developers to go from selling $1000 specialized vertical-market applications to selling $1 mobile applications.
I think they'll use the same OS and programs that they've always used, but the chips will change to whatever is currently offering the best bang for the buck. Windows can run fine on ARM now, and in 10 years I believe it will be a lot more common if Intel can't adjust. Autodesk, Microsoft, and Adobe can still sell their big commercial programs compiled for a different architecture. My Sony TV is already an all-in-one computer running Android TV... it's not a big stretch to see it being sold with upgr
Re:Strange article (Score:4, Interesting)
I am thinking this might be planted news by Intel to justify their acquisition, which would otherwise be rejected: a major monopoly, already fined for abusing that monopoly, expanding it further.
Re: (Score:2)
From what I've heard from engineers, there is very little or no overhead to support x86 on a chip. It's an infinitesimal part of the design. It's basically one of those urban legends that was cooked up in the '80s and isn't relevant any more. We've been in this situation many times before: people hear something and keep on repeating it even though things have changed. I've looked at both the x86 and ARM ISAs, and there's not much difference in complexity. We are in a post-RISC era. Even the so-called RISC
Re: (Score:2)
Steve Jobs wanted Intel CPUs in his iPhone but his engineers did not. Anyone who's familiar with the differences between ARM and x86 would know that an Intel powered smartphone was not a good idea.
I don't think that's what the proposal was. Apple has tried different device prototypes with Intel CPUs but they didn't work out. I think the Apple proposal was that Intel manufacture ARM CPUs for Apple.
Re: (Score:2)
You really wonder why they would turn this down?
From what I understand, it would have been a very different way of doing business. Apple wanted Intel to make CPUs that Intel didn't design. While Intel has made and makes ARM processors in small volumes, the numbers Apple projected would have made Intel a chip foundry like TSMC, GlobalFoundries, and Samsung, more invested in making other people's ICs. It would be like asking Ford to make GM and Honda cars. During WWII, all the big American automakers made Willys-designed Jeeps for the war effort, but they don't normall