End of Intel-Pin-Compatible CPUs?
sonamchauhan writes ""Intel, Via bury the hatchet," proclaims this news.com article. The settlement reportedly allows Via to build Intel-pin-compatible CPUs for three more years, but Via must cease pin compatibility after that."
This settlement apparently closes out 27 existing lawsuits.
fr1st ps0t #2 (Score:5, Insightful)
Well, pin compatibility isn't the issue I'd be concerned with, but opcode compatibility.
-uso.
Re:umm (Score:3, Insightful)
Having a fanless computer is really nice
The Register article, chipsets (Score:2, Insightful)
Now this brings up the question of what the chipset clause means for the industry. I know I have VIA chipsets on my Athlon boards, and it seems likely that VIA will keep producing these, but what about the Intel market? Does this mean there will be one player fewer in that market in five years? It's a rather long time; perhaps the current hardware model will be obsolete by then? Mini-ATX + integrated systems + Palladium (TCPA, was that what it was called?).
Re:About these pin-compatible CPUs... (Score:4, Insightful)
Not a big deal. (Score:4, Insightful)
Also, if you look at Via's upcoming and beta (www.mini-itx.com) products, it's quite obvious that they are aiming at the pseudo-embedded type market. People want very small and low-cost mainboard/CPU combos to make specialty-type computers such as MP3 jukeboxes, DivX players, email machines, and MAME consoles. For most of these types of applications, the system requirements don't change as quickly. A dedicated MP3/Ogg machine will continue to be just as useful 10 years from now. You might upgrade it to make it smaller or add 9.1 whiz-bang-super-THX sound, but not being able to replace the CPU doesn't matter.
0.02
Re:Intel Hate (Score:4, Insightful)
Re:Irrelevant (Score:5, Insightful)
Please define "lousy". Do you mean that it requires more clock to reach a certain level of performance (which is what many typically mean)? If so, how does this make it lousy if what you're measuring is the "complete" performance and not just, say, "efficiency"? Intel made a very conscious design choice when they went the super-deep-pipeline, high-clock route. Which has more "wow": the fact that you can ramp the clock rates up quicker, or that you can get more done with the clock that you have? Isn't this similar to engines, where you have one camp that likes big cubes and massive torque vs. the camp that likes high efficiency and high RPMs? They both have their pluses and minuses, and it really depends on the application.
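The tradeoff above comes down to back-of-the-envelope arithmetic: overall throughput is roughly clock rate times instructions per clock (IPC), so a deep-pipeline/high-clock design and a wide/high-IPC design can land in the same place. A toy sketch (the numbers below are made up for illustration, not measured figures for any real chip):

```python
# Toy model: throughput ~ clock (GHz) x IPC, in billions of instructions/sec.
# Both parameter sets are hypothetical; the point is the product, not the parts.
def throughput(clock_ghz: float, ipc: float) -> float:
    return clock_ghz * ipc

deep_pipeline = throughput(3.0, 1.0)  # NetBurst-style: high clock, lower IPC
wide_core = throughput(1.5, 2.0)      # P6-style: lower clock, higher IPC

print(deep_pipeline, wide_core)  # both 3.0 -- same work per second
```

Neither route is "lousy" by this measure alone; they just buy the same product with different factors.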
The only thing keeping Pentium-line procs afloat is marketing at this point.
But don't you think that Intel "plays the market"? By this I mean their processors have the price/performance ratio that they currently do because the market allows them to. It would appear that Intel could certainly afford to drop the price of their chips quite considerably if they wanted to, but this would be very damaging to the bottom line in shareholders' eyes for no real benefit. So Intel continues to price its chips higher than anyone else, because it makes their pocketbooks fat. If push came to shove, they could do a LOT of damage to the clones while still being able to survive.
Re:Irrelevant (Score:3, Insightful)
I went to build a machine about 4 years ago. A top-of-the-line P3 was $600 or so, so I picked up my 450MHz K6-2 for a little under $100. It wasn't faster by any stretch of the imagination, but it played games just fine coupled with the video card I could afford because of the savings. It wasn't beating Intel, but it was cheap and reliable (that machine now resides with my parents, doing everything they want just fine).
I bought a 1GHz Athlon for about $200 or so. The 1GHz P3s cost more than twice as much, and were outperformed in nearly every respect. Those were AMD's glory days (starting there and progressing through to the Northwood P4s). AMD outperformed whatever Intel threw at them for about half as much. It was a no-brainer. The deficit increased even more with the P4, which was only close when paired with RDRAM. There was an ever-so-brief period a little over a year ago where a lot of retail PC companies (the Compaqs and HPs of the world) were actually shipping their higher-end units with Athlons. I considered that great, because so few Joe Six-Packs knew the AMD name, and seeing that "trusted" companies (and not just screwdriver shops) used them went a good way in spreading the word about AMD.
And then Intel got serious. They slashed the prices of their chips and released their 512KB L2 cache processors. I bought a 2.4GHz P4 a couple of weeks ago for $160. The Athlon 2400+ was $130. These are two processors that would literally be neck-and-neck in almost any situation. If the Intel processor was $250, we'd definitely be talking AMD time, but it wasn't. It was $30 more, a number that could easily be made up in any number of other areas. Plus, it was nice to forget about VIA's 4-in-1 crap (although the nForce stuff looks pretty nice). Now, I built a 2000+ for my brother about a month ago, and for stuff in that range ($80 for the proc, $70 for the nForce1 board), AMD still rules, but I honestly hope the Hammer seriously kicks ass if AMD wants to stay in this business.
Re:And may the market... (Score:3, Insightful)
> pin-outs, and blow non-conforming hardware right the fsck off.
Hehe, the funny part about that is Intel pretty much defined the standard pinouts, so if they choose to change it, guess what, that change is pretty much the standard.
So before and after the change, you think everyone should blow off every CPU maker that isn't Intel?
Well, screw that, parent poster. I'm sticking with AMD myself; you can keep your overpriced, underpowered, DRM-enabled "standard pinout" Intel CPUs to yourself!
Re:Cheap solution for VIA (Score:2, Insightful)
Re:VIA, not Via... (Score:3, Insightful)
Re:VIA, not Via... (Score:3, Insightful)
This is just common sense on the part of journalists -- if they could get away with it, companies would insist that their name must always be in inch-high distinctive letters in bright colors. And all of their products, too.
Re:fr1st ps0t #2 (Score:2, Insightful)
You can tell I posted in a hurry, so I'll restate this more clearly: an x86 chip is still x86 no matter how it plugs into the mobo. It's not an issue that they're no longer going to be pin-compatible with an x86 chip -- someone else says they make their own mobos, so they can make mobos for their own CPUs. No big deal.
But I have yet to see a *real* push away from x86. Just as well, because I am going to give up my dear old DOS when you pry it from my cold dead hands!!!
-uso.
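One way to see the "x86 is x86" point above: software identifies a CPU by its instruction-set architecture, not by its socket. A minimal sketch using Python's standard library (the exact string reported depends on your OS and machine):

```python
import platform

# platform.machine() reports the instruction-set architecture the OS sees
# (e.g. "x86_64" or "i686"); it has no notion of which socket or pinout
# the chip uses. Pin-compatible or not, the software-visible ISA is the same.
arch = platform.machine()
print(arch)
```

So a VIA part that drops Intel's pinout but keeps the x86 opcode set looks identical to this code.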
Re:i can only hope... (Score:3, Insightful)
Re:i can only hope... (Score:3, Insightful)
Pin compatibility is not for end-users (Score:5, Insightful)
The main reason it's desirable for Via to build CPUs pin-compatible with Intel's specification is that it shortens the development time and cost of a motherboard. It's easier and cheaper for the M/B manufacturer to design the board's layout if the signals are in the same place, because a re-layout of a M/B is very expensive in both time and money (in some cases full development can run upwards of several hundred thousand dollars).
Additionally, there are chipsets that can support both Intel and Via CPUs (most notably some SiS SoC designs), making it even easier to make a M/B, but this is not necessarily related to having CPUs interchangeable within one socket. Sharing a socket is of little to no use to end users anyway, because Intel and Via CPUs are aimed at different market segments.
Remember the whole Slot-1/Slot-A fiasco? Intel developed the PII with a slot connector, and used patents/copyrights/trademarks/whatever to prevent AMD or any other CPU manufacturer from making pin-compatible CPUs. AMD then developed the Athlon to use exactly the same connector, although with a different electrical specification and pin definition. This move was aimed at easing manufacturers' development and time-to-market efforts, never at giving power to the end user.
I can't believe nobody has mentioned this, and everybody is easily misled into thinking this issue is not an important one. Maybe this shows just how little hardware development we have in the West.
Re:Tualatin owned. (Score:1, Insightful)
The P4 was designed to be highly scalable to high clock speeds, which is something it does well. Not to mention that the design that saw the light of day is missing a lot of the features the engineers originally envisioned for it, such as an L3 cache.
The Centrino would likely not scale as well as the P4. It's also got shared heritage with the P4 in some ways, and makes several concessions to speed in the name of lowering power usage.
Still, the P4's the fastest thing out there right now, lousy design or not.