aftk2 writes "According to news.com, chip maker Transmeta, current home of Linux creator Linus Torvalds, has canned 40% (200 people) of its workforce and has shifted its goals toward obtaining profitability in 2003. No word on whether there were any penguins seen leaving the building."
Laying people off is not folding. I'm hard pressed to think of any company that hasn't gone through at least one round of layoffs in its history. BTW, as an AC it is hard for us to know who you are and just where to find your "new years(sic) forecast."
Granted their code-morphing and use of VLIW had some interesting concepts, and their power consumption was perfect for laptops, but there just wasn't much of a market for what they developed. Had some of the bigger players (Dell, etc.) actively pushed Transmeta chips on the market, they might have made some money.
I for one am not sad this happened... they had some good ideas, but nothing insanely great.
I agree: None of their ideas were "insanely great" as you so well put it. The problem as I see it was too many promises, late releases, and the simple fact that they bragged a bit too much too early.
They should have based their business model on a company that is well liked, as opposed to a company like Microsoft. Notice the similar business tactics? The difference is, Microsoft is big enough to pull it off.
Seriously though, the ideas they had could very well be worthwhile. While I would hate to see Transmeta fold, at least the ideas and the technology are out there, and would likely be sold should the company fail completely. Supposing a company like AMD got ahold of Transmeta's research and knowledge base...a veteran company might be able to market such a product better than Transmeta has.
The way I see it, Transmeta will either pull through, or the technology will get passed on. Sad to say, but it's win/win for the industry...and only hard times ahead for Transmeta.
Granted their code-morphing and use of VLIW had some interesting concepts, and their power consumption was perfect for laptops, but there just wasn't much of a market for what they developed.
Nah, there just wasn't much of a market for how they were selling what they had developed.
Now if instead they had, say, concentrated on making development platforms . . . . heh.
Can you imagine sitting down at a machine that is a Sun, PowerPC, and x86 all in one?
That is (was?) the true promise of Transmeta and shoving the chips into laptops was just plain silly. Bleck.
(Actually I was just kind of hoping for an uber emulation machine myself, hehe. Think the next generation of MAME. :-D )
Yes, I think that they could have done it, or may already have done it.
Really, this is software, and the only thing that would limit them is some stupid restriction in the hardware (which they have not made); this means that the Transmeta laptop, now with a ROM update, is a 64-bit machine (-;
The only reason they have not done it is that they don't want to steal AMD's thunder.
If Transmeta gets their system working like a SoC, and all you have to do is wire up the PHY of a net/USB/LCD/IDE HD, then I think they will really take off.
Vendors are sick to death of chipsets and chips; they just want a nice x86 System on a Chip.
I know nothing about the feasibility issues involved, but it seems like that kind of idea could be extremely useful. I'm thinking a machine with a CPU capable of running multiple instruction sets simultaneously coupled with a VM-type operating system that allows you to bootstrap virtual machines of various architectures.
Unfortunately, their push was toward the mobile market, so they appear to have put more effort into power consumption than into performance, and I don't think they have even tried to get a Crusoe processor running multiple instruction sets simultaneously yet. So anything along those lines that we would see anytime soon would probably not be better than just buying two machines of different architectures, and I doubt many companies perceive much of a need for a machine capable of handling 3, 4, or 5 instruction sets, which is probably where the cost of purchasing such a machine would start to be justified.
There's also the possibility of using it as a testing machine for software being developed for CPU architectures that haven't had fully functional prototypes come off the line yet, but that wouldn't provide nearly enough business to keep a company going...
I think game developers would only love it if the performance could be increased by quite a bit. The clock rate on the fastest Crusoe processor is 800 MHz, and although I don't know the architecture of anything past a 586 very well, I seriously doubt anything Transmeta makes could do as much in a single clock cycle as a G4 or a Pentium 4.
A look at the system requirements for most of the games I see my little brother playing suggests that there's quite a ways to go before the technology would be popular with the game industry.
There isn't a demand for their product. Transmeta is technology searching for a market, instead of the other way around. That's always a bad place to be.
Intel is able to produce products that satisfy the low-power x86 market. If consumers *really* demanded lower-power chips than those already on the market, Intel could easily devote more resources to creating those products and win on reputation/control of the arch/pricing/deals/you name it. For instances where a product didn't need an x86 chip, one could simply get an appropriately spec'd microcontroller from Motorola and be done with it.
No word on whether there were any penguins seen leaving the building.
That was the punishment for not being profitable. Not only were the 200 employees fired, but they were forced to wear penguin suits as they were escorted out.
I'm somehow reminded of the Stonecutters episode of The Simpsons. They were going to fire Linus, but then realized that he's responsible for half of their company's press.
CEO: Remove the Penguin suit of shame!
Linus: Woohoo!
CEO: Present the Penguin suit of triumph.
(Minions carry out a suit twice as big)
Linus: Doh!
A lot of smaller companies focus more on obtaining market share. Plus remember all the dot-com Superbowl commercials? I think they are pretty much intended for branding and establishing market identity, sometimes at great cost.
So technically these companies may have to make a conscious effort to focus on profits rather than marketing and growth...
> and has shifted its goals toward obtaining profitability in 2003.
Talk about short attention spans.
Of course I know the "shifted goal" is to obtain profitability (that's the whole point of the post). I'm questioning why that wasn't a main goal from the beginning. Geez!
> Our goal is to be profitable./Our goal is to be profitable in 2003.
I don't need to quote "profitable in 2003", my point was that their goal should have been to be profitable from the start of the business or even earlier. You can disagree with that, but you're wasting your time "fixing" quotes.
Their goal was to have a bottom-line accounting profit in 2003. Many businesses, especially businesses that require many resources to be developed, only wish not to operate at a terrible loss as the business grows during its first few years of operation. Long-term profit is a goal for every company, but to suggest that short-term profit is a goal for every company neglects the fact that companies must invest to expand and to counteract depreciation.
It is important to be profitable, but few businesses are at the very beginning, because you have to make a product, let people know about it, and build your factory before you get a sale. There are a few businesses that start out profitable, but they are usually quite small. I assume they always planned to eventually be profitable, but setting a date like this means that they are willing to forgo the possibility of larger future profits for some profits in the near term. Usually it's a sign that the capital invested in you is running out, and until you show the owners of that capital that they have some chance of recovering their investment, they aren't giving you more.
What were the initial goals??? And here I thought the goal of all businesses was to make money.
Business (especially big business) forgot that in the 90s. There's an old mantra that says, "Make a little profit every day." Many businesses in the tech sector spend huge amounts of money when they start up, then find themselves playing catch-up for the rest of their (short) lives. This is IMO what happened with the dot-coms.
Myself, I understand there is a need to sink money into developing technologies early in their lives so you can control how they mature and then control how you bring them to market so that the result is best for your company. With that said, if you're constantly borrowing money to make payroll while waiting on that next big break to come along, you're never going to make it.
Disclaimer: I am not a Fortune 500 company's CEO, an economics teacher, or (currently) a heavy investor in the market. Take everything I say with a grain of salt.
In "The Practice of Management" (1954 but still very relevant) Peter Drucker says that the goal of business is to obtain customers. Profitability follows from correctly managing towards this goal. It's weird but when I first read this it really struck me that he's right. From customers comes profit if the business is correctly managed and it's the customers you need to get if you want the profit.
I can see that C# has its failings, but they are pretty nitpicky failings compared to C. The use of pointers by itself renders whole programs unreadable unless one wants to spend a long time understanding the implications of every line of code.
To say that implicit type conversion in C is well understood and clear, and that in C# it isn't, is just a joke. C has an incredibly complex time converting between types, made all the more complex because types can vary between underlying machine architectures.
Despite the "celebrity" factor Linus brought to the place, their product just never panned out. It was a good idea, and hopefully some larger company will buy up their proprietary technology, but I don't see how Transmeta on their own ever could have made a run at capitalizing the chips in an already severely swamped market -- the barriers to entry were just too high.
Still, having been laid off twice last year, I wish all the former Transmetites the best. I hope Linus is able to find an interesting job after Transmeta folds -- otherwise, my company could use a good code jockey...
Why are all those articles so negative? Right now it looks like they overcame the production difficulties and are moving ahead with new customers that are actually building notebooks with Crusoe CPUs. The U1 from Sony is already the highest-selling notebook in Japan, and the Fujitsu P series is also selling well in the USA. With the upcoming HP tablet and the OQO, I would say they aren't doing so badly.
Transmeta never made much sense to me. People buy laptops for portability. They buy new laptops for speed. Most could give a rip about battery life or heat. Transmeta did not offer enough speed, and their ability to run in a sealed case (heat) would have only been nice for medical or industrial niches; but only if every other component on the system could be made water-tight and shock-proof as well. Their morph-code technology wasn't as desired; look at how many people don't even flash their BIOS.
You, sir, use a laptop as a portable, not a laptop.
Battery life is *more* important than processor speed, to me. Were I in the market for a new lappytop, I'd want something that I could use for a several hour stretch in the park, in the car, or just wherever the feng shui is best for writing.
Once it can run the word processor and MP3 player at once, at a speed I don't cringe at, I'm happy & the rest is just gravy.
How could you mention feng shui, brag about your belief in a western mythology and yet get modded as insightful?
Probably because those things were irrelevant, and the posting was actually insightful.
I'm using a Crusoe-based laptop right now. It weighs about 3 pounds, gets about 7 hours of battery life in real life use. I may have bought it a little early (before they get heavily discounted if the Crusoe is discontinued), but maybe I've got a collector's item here!
The fact is, it's nice to have a laptop small enough to carry with me all the time. It's about the speed of a 500 MHz PIII, and that's fast enough for just about anything I'd want to do on it. I used it to build the Windows distribution of a well-known statistical software package [r-project.org]. If I'd had a machine that was 3 times faster, I probably wouldn't have had it with me to do that build.
battery life and heat are interrelated, and have significant influence on the portability and speed of a laptop.
heat is what battery charge ends up as, so these are obviously directly related.
The portability of a laptop is largely influenced by its weight, and to a lesser degree, size. The battery is one of the heaviest and largest single components in a laptop (after the screen). So, a processor that draws significantly less power for a given level of performance allows the use of a significantly smaller and lighter battery pack, resulting in a more portable computer.
The performance of a laptop is, of course, determined by the performance of its processor. In a laptop, heat dissipation and battery life often conspire to force a practical limit on the processor. A more efficient processor design, demanding less power and therefore dissipating less heat, allows a faster processor to be used in a given machine.
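To make that relationship concrete, here is a minimal sketch in C using invented, round numbers (not measured Crusoe or Pentium figures): runtime is just battery capacity divided by total draw, so cutting CPU power either stretches runtime or lets you ship a smaller, lighter pack for the same hours.

    #include <stdio.h>

    /* Hypothetical numbers for illustration only -- not measured figures. */
    int main(void)
    {
        double battery_wh    = 40.0;  /* battery capacity, watt-hours        */
        double rest_of_sys_w = 8.0;   /* screen, disk, RAM, chipset, etc.    */
        double cpu_hot_w     = 10.0;  /* a conventional mobile CPU           */
        double cpu_cool_w    = 2.0;   /* a low-power CPU in the Crusoe class */

        printf("hot CPU:  %.1f hours\n", battery_wh / (rest_of_sys_w + cpu_hot_w));
        printf("cool CPU: %.1f hours\n", battery_wh / (rest_of_sys_w + cpu_cool_w));

        /* Or hold runtime constant and shrink (and lighten) the battery: */
        double target_hours = 2.2;
        printf("battery for %.1f h with the cool CPU: %.0f Wh\n",
               target_hours, target_hours * (rest_of_sys_w + cpu_cool_w));
        return 0;
    }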
The one reason I am not buying a laptop: the short battery life!
I need at least 8 hours or more; I can't cope with 3-5 hours, come on. It means you can't go out camping for a day or two or three and use your laptop for some time, unless you take loads of spare batteries with you.. Laptops SUCK! damn it..
Their morph-code technology wasn't as desired; look at how many people don't even flash their BIOS.
The real problem with code morphing is: why would anyone pay for morphing code to x86 when x86 processors are so widely available? Now if they'd had the ability to code morph x86 to native *and* the ability to code morph the instruction set of (say) the Java Virtual Machine to native, then maybe we'd be talking. All the advantages of Java, but executing at native speed, plus compatibility with all x86 applications, and maybe SPARC too. Then they could have simply sold the company to Sun for its next attempt at thin-client desktops. But what was the point of code morphing to only one target? This is something the VCs should have asked before investing a single dollar.
Do the folks from Transmeta try to find jobs with chip manufacturers like Intel, AMD, IBM, or Texas Instruments; or, do they look for software jobs? Who got canned; which division? I looked for more information, and couldn't find any.
...which is getting longer and longer in the Valley. It's getting scary. It doesn't matter how hotshit you are, there are ten guys ahead of you who will do an adequate job for 60% of your salary.
Added to which, these workers are inflexible. Most wouldn't think of doing something other than programming or hardware, which makes their job search even harder. Programmers will have to start looking at Barnes & Noble as an employment opportunity, not just a place to browse for tech books.
$180mm of cash and near-cash, and $25mm of burn this quarter (+/- depending how you count), gives them a life expectancy of two years. I suspect we'll see a catalyst one way or the other before then though.
NEW YORK (CNN) - Intel Corp. Tuesday said it will eliminate roughly 4,000 jobs in the second half of the year after reporting a second-quarter profit that fell short of recently reduced estimates.
Executives of the world's largest chipmaker also provided a cautious outlook for the third quarter and the remainder of the year, as large corporations continue to curtail their information technology spending amid economic uncertainty.
Why didn't Slashdot report this news item, hhmmmmm?
The biggest insult is that, at least when I pulled up the story (this may be a random advertisement), the ad that is almost as big as the story is for Gateway's new line of laptops featuring, you guessed it, the P4 from, you guessed it, Intel.
The trouble with Transmeta was that the feature people wanted was fine-grained power management, not software translation into microcode. Transmeta was first with fine-grained power management, but as soon as it became clear that people cared about that, everybody else (i.e. Intel and AMD) started doing it, and Transmeta lost the only advantage it had.
Transmeta's "code morphing" turned out to be another Really Neat Computer Architecture Idea that Doesn't Matter. It goes to the graveyard with stack machines, tagged-word machines, capability machines, dataflow machines, single-instruction multiple-datastream machines, hypercube machines, and Forth machines. Each of those has been made to work, built, and sold. Few people have ever seen any of then, but they all did exist as working commercial hardware at one time or another. None of them had enough of an advantage over vanilla architecture to survive.
The same thing will probably happen to Intel's Itanium, which, even within Intel, is considered a marginal idea.
In a way, it's sad. We're stuck with vanilla architecture like x86 and vanilla languages like C. There are many better approaches, but none better enough that the pain of conversion is worth it.
Stack machines... you mean, like the Java Virtual Machine? Or PostScript printers? Nope, nobody using that idea any more. Actually, a number of the neat old ideas in computers turn out to be great for something somewhat different, or somewhat later. Stack machines are a great idea if you don't know how many registers you have. For real machines running compiled software, this is stupid; but for a virtual machine or for document-formatting instructions, it's great.
In any case, it's not neat ideas that sell machines, it's solved problems. Code morphing is a great idea, and it'll be really big as soon as someone wants to do something that it's good for.
Stack machines... you mean, like the Java Virtual Machine? Or PostScript printers? Nope, nobody using that idea any more.
No, I think the OP meant that nobody is building actual stack-based processors anymore. The JVM is a virtual machine, and PostScript printers contain an interpreter running on a conventional processor (usually a RISC chip). Other than Sun's brief fling with the Java processor, stack machines have pretty much died.
Code morphing is a great idea, and it'll be really big as soon as someone wants to do something that it's good for.
Code morphing is a great idea indeed. But it already is in use in any JIT emulator or virtual machine. Crusoe is basically a very power-efficient processor running an x86 JIT emulator.
The big unanswered question is whether VLIW was a good idea or not...
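To illustrate what "an x86 JIT emulator" amounts to mechanically, here is a minimal, self-contained sketch in C of the translate-once, cache, and reuse loop that any dynamic binary translator shares. The translate_block/exec_block stubs here are invented placeholders, not Transmeta's actual code-morphing interfaces.

    /* A minimal sketch of the translate-once-then-cache loop at the heart of
     * any dynamic binary translator.  translate_block() and exec_block() are
     * hypothetical stand-ins for the real code-morphing machinery. */
    #include <stdint.h>
    #include <stdio.h>

    typedef struct { uint32_t guest_pc; void *native; } tb_entry;
    #define CACHE_SLOTS 1024
    static tb_entry cache[CACHE_SLOTS];

    static void *translate_block(uint32_t pc)   /* stub: pretend to translate */
    {
        printf("translating block at 0x%x\n", (unsigned)pc);
        return (void *)(uintptr_t)(pc | 1);
    }

    static uint32_t exec_block(void *native)    /* stub: pretend to execute   */
    {
        return ((uint32_t)(uintptr_t)native & ~1u) + 16;  /* "next block"     */
    }

    static uint32_t run_block(uint32_t pc)
    {
        tb_entry *e = &cache[(pc / 16) % CACHE_SLOTS];
        if (e->native == NULL || e->guest_pc != pc) {   /* cold: translate once */
            e->guest_pc = pc;
            e->native   = translate_block(pc);
        }
        return exec_block(e->native);                   /* hot: reuse the cache */
    }

    int main(void)
    {
        uint32_t pc = 0x1000;
        for (int i = 0; i < 8; i++)   /* first pass: translations happen      */
            pc = run_block(pc);
        pc = 0x1000;
        for (int i = 0; i < 8; i++)   /* second pass: every block is cached   */
            pc = run_block(pc);
        return 0;
    }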
X18 [colorforth.com] is an example of a modern-day stack machine (also a Forth machine -- forth and stacks are pretty much the same thing). It's in contrast to the register-based design of traditional chips.
C was the right solution at the time - a simple language, which made tools development easier, and it offered solid performance.
It's still the only way to go for most performance-intensive applications, regardless of the ridiculous and false claims that competing languages have better average performance.
I won't debate the virtues of having a small language with weak (heck, non-existent) type checking for systems programming. There are some bones to pick with C in terms of syntax and semantics, however, that could have been avoided from day one.
First, = vs. ==. It would have made more sense to use something like <- for assignment, or a keyword like "eq" for equality. Instead, we have the silly convention of writing "(SomeConstant == SomeVariable)" in conditions just in case we forget to hit the = key twice. (Very stupid mistake? Yeah, but I've done it, and I do know better.)
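A tiny demonstration of the mistake being described (the names state and READY are invented for the example; gcc with warnings enabled will flag the first form, which helps):

    #include <stdio.h>

    int main(void)
    {
        int state = 0;
        const int READY = 3;

        /* Intended: compare.  Actually: assigns 3 to state, and the condition
         * is then the value 3, which is always true. */
        if (state = READY)
            printf("oops: runs every time, state is now %d\n", state);

        /* The convention from the comment above: put the constant first, so
         * the same typo (a single '=') becomes a compile error, because you
         * cannot assign to READY. */
        state = 0;
        if (READY == state)
            printf("not printed in this run\n");
        return 0;
    }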
The C preprocessor. Probably no other piece of code has been more abused than /lib/cpp. Granted, macros are a cheap way to generate code, but the implementation is fraught with traps (try nesting macros and accidentally introducing a syntax error in one. Yum.). Not to mention what an unscrupulous developer can do with the "#undef" directive. Besides, using macros to define constants is silly (I use enum wherever I can for that reason).
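One classic trap, for instance, is a function-like macro that evaluates an argument twice. MAX here is the usual textbook example, not anything from a specific codebase:

    #include <stdio.h>

    /* Looks like a function, but the arguments are pasted in textually,
     * so any argument with a side effect can be evaluated more than once. */
    #define MAX(a, b) ((a) > (b) ? (a) : (b))

    int main(void)
    {
        int i = 0, j = 5;
        int m = MAX(i++, j);   /* expands to ((i++) > (j) ? (i++) : (j)) */
        printf("m=%d i=%d\n", m, i);   /* m=5, i=1: i++ evaluated once   */

        i = 10;
        m = MAX(i++, j);       /* now i++ > j, so i++ is evaluated twice */
        printf("m=%d i=%d\n", m, i);   /* m=11, i=12: surprise           */
        return 0;
    }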
Vague non-standardized data type sizes. Only chars have a defined size; everything else is up in the air. How many times have you been stung by using an "unsigned short" on another architecture, only to realize the size changed on you? (If you write kernel or driver code, it's probably happened to you.) And how many times have you had to deal with someone else's (e.g. your) implementation of types like "U16" or "unsigned32", just because the language forgot to include it?
And don't get me started on "long long". Grrr..
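For what it's worth, C99's <stdint.h> addresses exactly this complaint with fixed-width types, though (assuming a compiler current at the time of this discussion) you may still be stuck hand-rolling the U16/unsigned32 typedefs the parent describes. A minimal illustration:

    #include <stdio.h>
    #include <stdint.h>   /* C99; older compilers need hand-rolled typedefs */

    int main(void)
    {
        uint16_t flags = 0xFFFFu;  /* exactly 16 bits on every platform      */
        int64_t  big   = 1;        /* the portable spelling of "long long"   */

        /* "long" is whatever the platform says it is... */
        printf("sizeof(long)     = %u (varies by architecture)\n",
               (unsigned)sizeof(long));
        /* ...the fixed-width types are not. */
        printf("sizeof(uint16_t) = %u, sizeof(int64_t) = %u\n",
               (unsigned)sizeof(flags), (unsigned)sizeof(big));
        return 0;
    }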
Don't get me wrong: if you view C as glorified assembler, it's a great language. But some cheesy semantics do allow for abuse, misuse, and neglect by the careless, and unnecessarily so.
In a way, it's sad. We're stuck with vanilla architecture like x86 and vanilla languages like C. There are many better approaches, but none better enough that the pain of conversion is worth it.
I disagree. There is a better approach to software construction and execution that can bring at least an order of magnitude improvement in reliability and productivity. We are in the middle of a crisis because there is something fundamentally wrong in the way we develop software. The root cause of the problem is as old as Lady Ada and Charles Babbage. It is the old practice of using the algorithm as the basis of software construction. Fortunately we don't have to live with it.
This is a golden opportunity for Transmeta (or any struggling chip and software company) to redefine software engineering and computing as we know it. They can do the right thing and leave Microsoft, Intel and AMD in the dust. Details at the links below.
C is not vanilla. C is more like the genome of the vanilla plant, that you could grow to produce beans (insert Java joke here) except you'd probably have a rogue pointer somewhere that would give the damn thing the blight.
The trouble with Transmeta was that the feature people wanted was fine-grained power management, not software translation into microcode.
Depends on your point of view. I consider controlling the CPU core voltage and clock speed as rather coarse grained.
Transmeta's "code morphing" turned out to be another Really Neat Computer Architecture Idea that Doesn't Matter.
No, it's a major advance in instruction decoding. Conventional IA-32 decoders are hardwired into transistors. They do perform extremely well, but at a very high cost in power and die area, and because they're hardwired it's difficult to design complex behavior. Hardwired logic giveth, but hardwired logic also taketh away. From a power point of view, they're very coarse grained: decoding instructions full-bore, or turned off. (And they have no sense of importance: code that runs for a microsecond every ten minutes is treated the same as the inner loop of a rendering algorithm.)
The Transmeta instruction decoder, on the other hand, is extremely fine-grained. It gives code the amount of attention it deserves. For rarely-executed code, the decoder wastes little power and does a suboptimal decoding. The more frequently code is executed, the more power the decoder burns to optimize it. Especially frequent decodings are cached so they can be used later with zero decoding. (Which hardwired decoders have a lot of trouble doing.)
The Transmeta decoder also has the potential for really neat tricks. E.g., you could put multiple ALUs and FPUs in the CPU, but leave them completely turned off except when a heavy-duty computational algorithm truly needs them. Ditto for power-hungry L1/2 caches. When your decoder is firmware, you can afford to try all sorts of things, and just not use them if it's too hard to get right, but complexity like that gives nightmares to the designers of hardwired decoders.
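As a rough sketch of that "attention proportional to execution frequency" idea, the toy program below keeps a per-block counter and only escalates to a heavier translation once a block proves hot. The levels, thresholds, and retranslate() stub are invented for illustration; this is not Transmeta's actual firmware design.

    /* Hypothetical sketch of frequency-driven re-optimization; the levels,
     * thresholds, and retranslate() stub are invented for illustration. */
    #include <stdint.h>
    #include <stdio.h>

    enum opt_level { INTERPRET, QUICK_TRANSLATE, FULL_OPTIMIZE };

    struct block_info {
        uint64_t       exec_count;
        enum opt_level level;
    };

    static void retranslate(struct block_info *b, enum opt_level lvl)
    {
        b->level = lvl;                    /* stand-in for a real translation */
        printf("re-translating at level %d after %llu runs\n",
               (int)lvl, (unsigned long long)b->exec_count);
    }

    static void on_block_executed(struct block_info *b)
    {
        b->exec_count++;
        /* Rarely run code never earns more than a cheap translation... */
        if (b->level == INTERPRET && b->exec_count > 50)
            retranslate(b, QUICK_TRANSLATE);
        /* ...only genuinely hot code gets the expensive, power-hungry pass. */
        else if (b->level == QUICK_TRANSLATE && b->exec_count > 10000)
            retranslate(b, FULL_OPTIMIZE);
    }

    int main(void)
    {
        struct block_info b = { 0, INTERPRET };
        for (int i = 0; i < 20000; i++)
            on_block_executed(&b);
        return 0;
    }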
It goes to the graveyard with... single-instruction multiple-datastream machines...
You mean like IA-32?;-)
The same thing will probably happen to Intel's Itanium, which, even within Intel, is considered a marginal idea.
The latest Itanium 2 benchmarks look pretty good. With some more tweaking, I won't be surprised if the I2 will give the largest available computational throughput per CPU. I think technically it can be made to work well. The question is, how many people need absolute peak performance per CPU, die area, power consumption, and bus width be damned? I suspect that's a small market. Most people in the data serving and technical computing markets can just slap some more CPUs in their cluster if it's too slow. (And you can reasonably contemplate putting 500 Crusoes in a rack, and tossing the rack into some random warehouse. Equivalent Itanium power would be vastly more expensive and require a carefully-engineered cooling system.)
Those just reuse existing hardware with the registers split up a little differently - 8 8-bit operands instead of one 64 bit one, or 2 32-bit floats instead of one 64-bit one. Big-scale SIMD, like the Connection Machine, where one operation was applied to 1024 data items simultaneously, doesn't turn out to be that useful. Beating the problem into a form where you can apply the all-at-once SIMD hammer turns out to be more trouble than it is worth.
Capability machines...
The capability machine people have a terribly hard time explaining how their idea works. I've known several people involved in such work, including Norm Hardy, and I had hopes for EROS, but it's not going anywhere.
The problem is writing applications that can effectively use these complex security architectures. Capabilities are a low-level mechanism; you have to build a system and a policy on top of them, and demonstrate that the policy is secure. This is hard.
Agreed that data flow machines exist down where the work gets done inside the CPU. The Pentium Pro and its successors are all dataflow machines deep inside. But the data flow architecture isn't exposed to the programmer.
There are some exotic architectures in game consoles, the PS2 being the fanciest one.
Graphics and signal processing pipelines often have weird machines. But those are special-purpose applications where some tiny inner loop is executed over and over.
Let me preface this by saying I have never used or seen a Transmeta product. And in that, I think, is the problem.
Going back a few years, I remember the buzz surrounding Transmeta. 'There is this company that's developing something ground-breaking... and Linus Torvalds is working there!' If memory serves me, investors--any investor--would give their eye teeth to just be able to put money into something 'groundbreaking' being worked on by Torvalds.
Then we finally saw what it was. A chip. Oh. . . Well, what makes the chip so special? It uses less power. Oh. . . Does that change anything for us? Sure, your laptop batteries will last a little longer, and if you run a server your electric bills might be a little lower. Oh.
Of course, I'm not a programmer, nor do I work on hardware, but for me this was a letdown after so much hype, second only to learning that 'It' was nothing more than a scooter that was hard to tip over.
That was two years ago, and despite the fact that there is some benefit to the otherwise ho-hum technology, where is it? I buy a lot of computers, and I don't even know where to buy a Transmeta equipped machine (then again, I've never really looked, and have never been given a good reason to look).
So, again, this seems obvious. A company pours a big chunk of change into a product that never sees the light of day on a mainstream store shelf... a product that I quickly forget about and am only reminded from time to time on Slashdot stories.
I suppose, scanning the posts, that there are a handful of gee-whiz products out there (albeit not in the United States) with a Crusoe chip, but I just don't see them, or see any reason to spend the extra money on them.
And so Transmeta starts laying off people. It just seems to be the next logical choice.
Transmeta just didn't give a big enough difference to matter. I'm on my Thinkpad T23 right now and it gives me over 4 hours of battery life doing normal work with no extra power management running. I don't spin the drives down, dim the screen, or any of that. All while using the wireless NIC built in. It would do even better if I used the power management options.
That's very good for most people since this is a "normal" notebook with a fast CPU (P3 1.13), plenty of RAM, big disk (48GB), nice screen (1400x1050), and a DVD/CD-RW.
Intel just laid off thousands of workers, following a reduction in workforce by attrition of a few thousand others. It's an advantage if a firm can be flexible enough to lay off a good portion of its workers during a downturn in the market for their products. Transmeta is selling to the same markets as AMD and Intel. Being able to adjust their labor costs more flexibly in this period might be a demonstration of what, in the longer term, turns out to be an advantage.
And as long as Linus is there, all you suckers will line up to work for them again in a year or two when their market comes back and they rehire. You're not gonna stay away just because you or some friends of yours once got laid off, are ya?
I would like to point out that if the workers owned the means of production, this wouldn't have happened.
But in a company like Transmeta - or indeed any high tech company whose value is its intellectual property - the workers do own the means of production. The company's product is the solidified thoughts of its professional staff. The staff own their own brains and their own educations and their own imaginations. If you've got those, what do you really need to be productive? A PC and a desk, and I bet most of them own those too. It's not like in old-style industries where the value of a business was in its physical assets.
Sadly, dot.communism doesn't protect you from the fundamental law of economics, which is that every participant in a viable economic system must produce at least as much as they consume. At the end of the day, Transmeta simply didn't sell anything that anyone else wanted to buy.
Code morphing with only one target implemented (Intel) offers nothing above buying an actual Intel chip. And as mentioned, the power savings advantage is something others have jumped on very quickly, so there's little to differentiate it. (Although laptops using Transmeta still seem to have battery life ratings beyond the competition.)
Have they ever stated any intention to implement another target for the code morphing? Being able to have the same computer be a Mac or a PC (or a Sparc) would be far more compelling, and is what I had hoped the original story was all about. Is that just not lucrative? Do they not have the resources to pull it off? Was the Transmeta chip designed too much with Intel in mind, so that PPC or Sparc emulation isn't possible???
Its biggest advantage seems to have gone completely by the wayside.
My impression when Crusoe was announced, was that the x86 instruction set was very important to them -- it's a much easier thing to emulate efficiently. A RISC instruction set (as in PowerPC) is much more difficult -- since now you're translating from RISC to RISC (since the internal instruction set is more-or-less RISC as well). You can decompose CISC instructions efficiently, but there's nothing to decompose with RISC -- the instructions are already simple.
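A toy illustration of that asymmetry (invented structures, nothing like a real decoder): a single x86 memory-operand instruction such as add [ebx+8], eax naturally breaks into several RISC-like micro-ops (address calculation, load, add, store), whereas a RISC add is already just the middle step, so a translator starting from RISC has nothing left to decompose.

    /* Simplified, hypothetical illustration of decomposing one CISC
     * instruction, "add [ebx+8], eax", into RISC-like micro-ops. */
    #include <stdint.h>
    #include <stdio.h>

    struct guest_state { uint32_t eax, ebx; uint32_t mem[64]; };

    static void exec_add_mem_reg(struct guest_state *s)  /* add [ebx+8], eax */
    {
        uint32_t addr = s->ebx + 8;          /* micro-op 1: address calc  */
        uint32_t tmp  = s->mem[addr / 4];    /* micro-op 2: load          */
        tmp += s->eax;                       /* micro-op 3: add           */
        s->mem[addr / 4] = tmp;              /* micro-op 4: store         */
    }

    int main(void)
    {
        struct guest_state s = { .eax = 5, .ebx = 0 };
        s.mem[2] = 10;                       /* word at guest address 8   */
        exec_add_mem_reg(&s);
        printf("mem[8] = %u\n", (unsigned)s.mem[2]);   /* prints 15       */
        return 0;
    }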
The other potential seemed to be that they'd create different cores with different optimizations -- the first one, Crusoe, being power-efficient, another one could be optimized towards floating point, another to integer operations, etc. But that hasn't happened.
Alternate architectures would be interesting -- at least PPC. In a Mac, it could allow efficient Windows emulation... but that seems like less and less of an issue, as portable applications usually mean web-based, and non-web applications usually have Mac alternatives. At least, I don't think Apple is enthusiastic about Windows emulation, and without Apple PPC is useless, since they won't have MacOS. Other non-x86 architectures don't seem important -- there's little software available for ARM or SPARC that won't be ported to x86 if there's demand.
It's like with languages -- if you know English, learning a second language is no longer that important. Transmeta started out learning the English of the instruction sets -- x86 -- and there's little incentive to learning other languages. Even if some programs started out with different machine languages, they all learn to speak x86 eventually.
It would have been interesting to have a 4-way system running code-morphing chips that could target multiple CPUs and a meta-OS that would allow you to run multiple OSs (Mac OS, Windows, BSD, Sun, etc) on the same machine at the same time.
I'm not sure who would want one other than cross-platform developers, but it would have been interesting.
I think it would be a big mistake to count Transmeta out any time soon. I say that not because I'm a penguin-loving Linus-worshipper. To the contrary, I primarily use Microsoft development tools, and when I'm feeling giddy about Unix I use FreeBSD. The only Linux boxes around here belong to paying customers.
So why not write off Transmeta? Simply put, they're working their way into the product channel. Transmeta does have a very low-power chip--and that Transmeta technology is at the core of an emerging form of hardware: the smarter embedded system. Don't think "desktop replacement"--think "death to the PLC."
What's a PLC? Programmable Logic Controllers are tiny CPUs that appear in all sorts of specialty uses: controllers, valves, automated-just-about-anything. They're cheap, they're generally very reliable--and they have almost zero memory, have very limited functionality, and require programmers who demand significant coin. When you try to add a feature to an embedded application you will typically get a response on the order of "that will take--at least--200 bytes of memory. And we only have 68 bytes left. So what feature do you want to drop to do this?"
Coming soon, to a factory floor near you... The Palm OS, WinCE, and the Transmeta chips are going to change all that. Handhelds and rugged semi-embedded handhelds are appearing in larger numbers--with gigabytes of flash storage, and 128 MB of RAM. Skip counting bytes--add all the features you want. Connectivity? They have 802.11 already embedded, along with USB, serial ports, etc., etc., etc. Some of the vendors I've browsed recently include InfoCater [infocater.com] and SyntegraTech [syntegratech.com]; they're both distributors for Tablets, WebPads, and handhelds that run with WinCE or Midori Linux. Very, very cool stuff.
Laying off 20% of your staff may be painful--but it is not the same thing as shutting the doors. For example, note that VA Software is still around....
Don't think "desktop replacement"--think "death to the PLC."......
Programmable Logic Controllers are tiny CPUs that appear in all sorts of specialty uses: controllers, valves, automated-just-about-anything.....
The Palm OS, WinCE, and the Transmeta chips are going to change all that....
I have very, very strong doubts about this. Industrial automation people are in general very conservative (for some good reasons, sometimes). The reason that they tolerate PLCs is that PLCs are rock-solid. In a lot of cases, the task a PLC controls is really simple but critical (e.g. if the nuclear reactor is going to melt down, push all the goddamned control rods right into it!!!).
Many chemical processing plants already had modern-looking control rooms, with goodies like big touch-screen CRTs, a decade ago; they do not really care about money. In many cases, the SCADA system and the nice GUI frontend just read data from the PLC... Once upon a time, I did some contract work for a beer brewery. During one of the presentations, I forgot to explicitly mention in the first slide that we wouldn't touch their PLCs if we installed the proposed software sensor module. I did see the technical manager's face change colour; he wanted to kick us out...
Back when Apple switched the Mac from 68k to PPC, they did better than expected keeping the old 68k code going, but they did so by using slow, clunky software emulators. Imagine if they could have just had one chip that ran both instruction sets! Since Jobs has finally reached the point where he doesn't immediately shoot down the idea of switching to x86 (in a recent interview: "We like having options..."), maybe they should check into this as a way of keeping PPC going. It would solve many of their potential spin problems:
Not emulating PPC on x86. After hyping the superiority of PPC over x86 for so long, they'd be insane to use an x86-based architecture to do the emulation they would absolutely need to support.
The "x86" Mac would not just be a pretty PC clone. Running MacOSx86 on Apple hardware would have a tangible advantage over running it on generic PC hardware: the ability to run all the current PPC based Mac software at reasonable speeds. Not a big deal for current x86-ers who just wanna dump Windows, but it would be crucial for their current customers.
On x86 hardware, but not Intel hardware. Given some historic biases, this might be a bigger deal than it should be. Suggesting AMD instead doesn't seem to help.
Lots of options for their appliance/"digital hub" ideas. Imagine if they could use the same CPU in everything from a multi-CPU PowerMac server down to a set-top box or handheld?
Unfortunately, my curmudgeon side says this all makes too much sense to ever become reality.
I was asking the question, but they wouldn't sell chips to me!
I always thought that it was a strange business model to develop something pretty cool and then lock it up and sell it only to restricted developers. Surely they should have set the price based on demand and how many of the suckers they could actually make.
There are jobs up in the East Bay, my brother. They might not pay as well (although my new job actually pays a small bit more), but it's work.
Take my advice: find someplace and do your best to hide out for the next two years, even if it's not doing something groundbreaking. Remember: there's always the next time around the bubble.
Sorry. Seriously, though, if you're not rooted in too firmly, there are jobs out there if you're willing to move to them, especially in places with big defense companies (San Diego, various east coast cities, etc).
If you're stuck in the boonies, though, you've got problems -- the lack of fallback jobs is why I passed on a very well-paying job with a startup in Madison, WI, even though I really needed the work at the time and have family there. If you get laid off and are a niche-type worker, you're in trouble.
The second line is good advice. Get a job. Any job. Don't matter how shitty. Hang on to it for dear life. Go to school. Hope that the next bubble will take advantage of your new skills.
Me, I'm working on mainframes and going to school to learn as much about AI as I can. I'm praying that my experience won't overshadow my education once my thesis is done. But if it does, at least I can program on mainframes.
Happened to me in March - May. Small company, big project, bad management. The thing was, we weren't alerted to the fact that payroll would be "late" (sometimes 2 weeks) until the very end of the day on payday. The last straw was finding out my family's health insurance, for which I paid $500/mo after taxes, had been canceled April 1. I found this out May 5, or so.
What a miserable experience. But we came out of it very well. My co-worker is working at a stable company for more pay (personal connection got the interview) and I'm working for my former client for much more pay, benefits, and equity (not options; equity in a profitable company). While I was sweating out the collapse of the old company I had very little hope going forward. Looking back, it was a great opportunity.
Didn't you listen to Bush about Kyoto? The American way of life is not up for negotiation. You have to keep spending, I am afraid, not saving. That would be un-American, and unpatriotic.
Transmeta does not have to die. They need to focus on the two biggest problems in the computer industry: unreliability and low productivity. If they can come up with a solution that can bring at least an order of magnitude improvement in both productivity and reliability, they can kick both Intel's and Microsoft's asses.
There is something fundamentally wrong with the way we develop software and the way we design our CPUs to execute the software that we develop. There is something rotten at the heart of software engineering. It has to do with the old practice of using the algorithm as the basis of software development.
We need a new software construction paradigm, one which is based on signals. Transmeta has the golden opportunity to do something real cool and save lives in the process. More can be found at the links below.
This isn't linus. His last name is spelled "Torvalds" NOT "Thorvals". Just a troll begging for attention.
Unless he got locked out of his previous account and couldn't get back in (because the e-mail address is way out of date) so he had to create a new account. If you look at the account info, notice that this was the first time it was used, and subsequent postings do nothing to suggest that this was a troll.
5 years ago or so? IBM and Sun are arch-enemies; HP is usually considered just "that printer company", not as worthwhile an opponent (at least if you believe McNealy).
Microsoft profits have doubled, and sales have soared. Don't think they're going anywhere soon. Where is this 'End of the Microsoft Era' Jon Katz told us about two years ago?
And the idiots who believed him and threw money his way should have looked at the financials, or at least waited until customers showed up. They didn't go public until well after the peak, so there shouldn't have been the mad rush for shares that locked you out at a reasonable price if you didn't get in in the first five minutes. Transmeta had a total of 3,817,000 in revenues for the nine months prior to going public, and 76,670,000 in expenses for the same period! People oversubscribed the IPO to buy a 1/127,752,858th of that at $21.00; even worse, they could have sold it for 40-something a share at the end of the first day! Your share of revenues (sales) would be about 3 cents. This was an excellent candidate for another round of venture funding.
There is an old saying from Wall Street: bulls make money, bears make money, pigs get butchered. Perhaps we should continue to heed that when the next big thing gets hot.
linus wasn't lying... (Score:5, Funny)
Re:linus wasn't lying... (Score:2)
Stop setting fire to it, then.
Re:I predicted this 7 months ago (Score:3)
Who didn't??? (Score:2)
Not surprising... (Score:2)
Re:Not surprising... (Score:4, Insightful)
Re:Not surprising... (Score:4, Insightful)
x86-64 out before AMD (Score:2)
regards
john jones
Re:Not surprising... (Score:3, Interesting)
Re:Not surprising... (Score:2)
Re:Not surprising... (Score:2)
Their final humilation (Score:4, Funny)
Re:Their final humilation (Score:2)
Shifted its goals? (Score:4, Funny)
Re:Shifted its goals? (Score:2, Funny)
Re:Shifted its goals? (Score:2)
Re:Shifted its goals? (Score:2)
Don't you remember? Their goal was something along the lines of "destroy Intel and all things x86!"
Re:Shifted its goals? (Score:2, Interesting)
Re:Shifted its goals? (Score:2)
I realize attention spans are getting shorter.... (Score:2)
Re:I realize attention spans are getting shorter.. (Score:2)
Re:I realize attention spans are getting shorter.. (Score:2)
Our goal is to be profitable./Our goal is to be profitable in 2003.
I will eat lunch./I will eat lunch now.
I will die./I will die of old age.
I will pay you./I will pay you when I'm good and ready.
We will reform corporate accounting./We will reform corporate accounting when Hell freezes over.
Need more examples?
Re:I realize attention spans are getting shorter.. (Score:2)
Re:I realize attention spans are getting shorter.. (Score:2)
Re:I realize attention spans are getting shorter.. (Score:2, Insightful)
Oh grow up. (Score:2)
Look, we all misread things. Jumping through hoops like this to avoid admitting a mistake is childish.
Now that would be a neat trick!
Re:Shifted its goals? (Score:2)
Re:Shifted its goals? (Score:2)
no need to update his resume (Score:4, Funny)
ACCOMPLISHMENTS: Linux
Re:no need to update his resume (Score:4, Funny)
ACCOMPLISHMENTS: Linux
"Sorry Son - you're going to have to have a better resume than that if you want to work at Microsoft."
Re:Anders Hejlsberg (Score:2)
Re:Anders Hejlsberg (Score:2)
There's a shock. (Score:5, Interesting)
Re:There's a shock. (Score:2)
Re:Linus got stock from Redhat (Score:2)
Industry Wide (Score:2, Informative)
I think it was like 4,000.
A smaller percentage but still significant.
Increased revenue by 82% from Q1 (Score:2, Interesting)
Re:Increased revenue by 82% from Q1 (Score:2)
Goals of the company (Score:2)
Re:Goals of the company (Score:4, Insightful)
Re:Goals of the company (Score:2)
portability, performance & battery life are li (Score:3, Informative)
Oh, wait. This is a troll, isn't it. Oh well.
Re:Goals of the company (Score:2)
Re:Goals of the company (Score:2)
Re:Goals of the company (Score:3, Insightful)
IBM (Score:3, Interesting)
- A
Overflow of workers go where? (Score:2)
The unemployment line... (Score:2)
Added to which, these workers are inflexible. Most wouldn't think of doing something other than programming or hardware, which makes their job search even harder. Programmers will have to start looking at Barnes & Noble as an employment opportunity, not just a place to browse for tech books.
How long have they got? (Score:4, Interesting)
Intel Fires 4000 Employees 2002-07-16 (Score:2, Insightful)
Re:Intel Fires 4000 Employees 2002-07-16 (Score:5, Insightful)
Because Linus doesn't work for Intel, silly.
Biggest Insult (Score:2)
Falling Tech (Score:2)
And more useful to job seekers.
Re:Falling Tech (Score:2)
Uh, where have you been? There are plenty of stories about that company [microsoft.com] isn't there?
Transmeta - the Power Management Company (Score:5, Insightful)
Transmeta's "code morphing" turned out to be another Really Neat Computer Architecture Idea that Doesn't Matter. It goes to the graveyard with stack machines, tagged-word machines, capability machines, dataflow machines, single-instruction multiple-datastream machines, hypercube machines, and Forth machines. Each of those has been made to work, built, and sold. Few people have ever seen any of then, but they all did exist as working commercial hardware at one time or another. None of them had enough of an advantage over vanilla architecture to survive.
The same thing will probably happen to Intel's Itanium, which, even within Intel, is considered a marginal idea.
In a way, it's sad. We're stuck with vanilla architecture like x86 and vanilla languages like C. There are many better approaches, but none better enough that the pain of conversion is worth it.
Re:Transmeta - the Power Management Company (Score:2)
Re:Transmeta - the Power Management Company (Score:3, Insightful)
In any case, it's not neat ideas that sell machines, it's solved problems. Code morphing is a great idea, and it'll be really big as soon as someone wants to do something that it's good for.
Re:Transmeta - the Power Management Company (Score:3, Insightful)
No, I think the OP meant that nobody is building actual stack-based processors anymore. The JVM is a virtual machine, and PostScript printers contain an interpreter running on a conventional processor (usually a RISC chip). Other than Sun's brief fling with the Java processor, stack machines have pretty much died.
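"Stack machine realized as software on a conventional processor" is exactly what a bytecode interpreter is, and a minimal sketch shows how little hardware support it needs. This is a toy instruction set invented for illustration, not real JVM or PostScript bytecode.

    /* A toy stack machine implemented in software on a conventional CPU,
     * the way the JVM or a PostScript interpreter is.  Toy opcodes only. */
    #include <stdio.h>

    enum { OP_PUSH, OP_ADD, OP_PRINT, OP_HALT };

    int main(void)
    {
        /* program: push 2, push 3, add, print, halt */
        int program[] = { OP_PUSH, 2, OP_PUSH, 3, OP_ADD, OP_PRINT, OP_HALT };
        int stack[64], sp = 0, pc = 0;

        for (;;) {
            switch (program[pc++]) {
            case OP_PUSH:  stack[sp++] = program[pc++];        break;
            case OP_ADD:   sp--; stack[sp - 1] += stack[sp];   break;
            case OP_PRINT: printf("%d\n", stack[sp - 1]);      break;
            case OP_HALT:  return 0;
            }
        }
    }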
Code morphing is a great idea, and it'll be really big as soon as someone wants to do something that it's good for.
Code morphing is a great idea indeed. But it already is in use in any JIT emulator or virtual machine. Crusoe is basically a very power-efficient processor running an x86 JIT emulator.
The big unanswered question is whether VLIW was a good idea or not...
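The parallel to a JIT is easy to illustrate: the core trick, in Crusoe as in any dynamic translator, is a translation cache keyed by guest code address, so a block is translated once and thereafter re-executed natively. The structure below is a purely illustrative sketch with invented names, not Transmeta's real data layout.

    /* Minimal sketch of a JIT-style translation cache: translate a guest
     * block at most once, then reuse the cached native version. */
    #include <stddef.h>
    #include <stdint.h>
    #include <stdio.h>

    typedef void (*native_block_fn)(void);   /* a translated, directly executable block */

    typedef struct {
        uint32_t        guest_pc;   /* address of the guest (x86) basic block */
        native_block_fn native;     /* host code produced by the translator */
    } tcache_entry_t;

    #define TCACHE_SIZE 1024
    static tcache_entry_t tcache[TCACHE_SIZE];
    static size_t tcache_used;

    static void dummy_block(void) { puts("running translated block"); }

    /* Stand-in for the expensive step: decoding and optimizing a guest block. */
    static native_block_fn translate_block(uint32_t guest_pc)
    {
        printf("translating block at 0x%08x (slow path)\n", guest_pc);
        return dummy_block;
    }

    static native_block_fn lookup_or_translate(uint32_t guest_pc)
    {
        for (size_t i = 0; i < tcache_used; i++)          /* hit: reuse earlier work */
            if (tcache[i].guest_pc == guest_pc)
                return tcache[i].native;

        native_block_fn fn = translate_block(guest_pc);   /* miss: pay the cost once */
        if (tcache_used < TCACHE_SIZE)
            tcache[tcache_used++] = (tcache_entry_t){ guest_pc, fn };
        return fn;
    }

    int main(void)
    {
        /* The second call to the same guest address hits the cache. */
        lookup_or_translate(0x00401000)();
        lookup_or_translate(0x00401000)();
        return 0;
    }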
Re:Transmeta - the Power Management Company (Score:2)
C doesn't deserve the bad rap (Score:2)
Its still the only way to go for most performance-intensive applications, regardless of the ridiculous and false claims that competing languages have better average performance.
Oh yes it does (Score:2, Insightful)
I won't debate the virtues of having a small language with weak (heck, non-existent) type checking for systems programming. There are some bones to pick with C in terms of syntax and semantics, however, that could have been avoided from day one.
And don't get me started on "long long". Grrr..
Don't get me wrong: if you view C as glorified assembler, it's a great language. But some cheesy semantics do allow for abuse, misuse, and neglect by the careless, and unnecessarily so.
Re:Transmeta - the Power Management Company (Score:2)
I disagree. There is a better approach to software construction and execution that can bring at least an order of magnitude improvement in reliability and productivity. We are in the middle of a crisis because there is something fundamentally wrong in the way we develop software. The root cause of the problem is as old as Lady Ada and Charles Babbage. It is the old practice of using the algorithm as the basis of software construction. Fortunately we don't have to live with it.
This is a golden opportunity for Transmeta (or any struggling chip and software company) to redefine software engineering and computing as we know it. They can do the right thing and leave Microsoft, Intel and AMD in the dust. Details at the links below.
Project COSA [gte.net]
Bertie Bott's Every Architecture CPUs (Score:2)
C is not vanilla. C is more like the genome of the vanilla plant, that you could grow to produce beans (insert Java joke here) except you'd probably have a rogue pointer somewhere that would give the damn thing the blight.
Re:Transmeta - the Power Management Company (Score:2)
The Transmeta instruction decoder, on the other hand, is extremely fine-grained. It gives code the amount of attention it deserves. For rarely-executed code, the decoder wastes little power and does a suboptimal decoding. The more frequently code is executed, the more power the decoder burns to optimize it. Especially frequent decodings are cached so they can be used later with zero decoding. (Which hardwired decoders have a lot of trouble doing.)
The Transmeta decoder also has the potential for really neat tricks. E.g., you could put multiple ALUs and FPUs in the CPU, but leave them completely turned off except when a heavy-duty computational algorithm truly needs them. Ditto for power-hungry L1/2 caches. When your decoder is firmware, you can afford to try all sorts of things, and just not use them if it's too hard to get right, but complexity like that gives nightmares to the designers of hardwired decoders.
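The "spend more effort on hotter code" behaviour described above is essentially tiered recompilation, and it can be sketched with nothing more than an execution counter per block. The tier names and thresholds below are invented for illustration; the real firmware's heuristics are not public in this detail.

    /* Sketch of tiered re-optimization driven by execution counts.
     * Thresholds and tier names are invented for illustration only. */
    #include <stdint.h>
    #include <stdio.h>

    enum tier { TIER_INTERPRET, TIER_QUICK_TRANSLATE, TIER_OPTIMIZE };

    typedef struct {
        uint32_t  exec_count;
        enum tier tier;
    } block_profile_t;

    /* Called each time a guest block is about to run; decides how much
     * translation effort that block has earned so far. */
    static enum tier choose_tier(block_profile_t *p)
    {
        p->exec_count++;

        if (p->exec_count >= 1000 && p->tier < TIER_OPTIMIZE)
            p->tier = TIER_OPTIMIZE;          /* hot: burn power on a good translation, cache it */
        else if (p->exec_count >= 10 && p->tier < TIER_QUICK_TRANSLATE)
            p->tier = TIER_QUICK_TRANSLATE;   /* warm: cheap, suboptimal translation */

        return p->tier;                       /* cold code stays interpreted, costing little power */
    }

    int main(void)
    {
        block_profile_t hot_loop = { 0, TIER_INTERPRET };
        for (int i = 0; i < 1200; i++)
            choose_tier(&hot_loop);
        printf("after 1200 executions, tier = %d\n", (int)hot_loop.tier);
        return 0;
    }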
You mean like IA-32?
Re:Transmeta - the Power Management Company (Score:2)
Those just reuse existing hardware with the registers split up a little differently - 8 8-bit operands instead of one 64-bit one, or 2 32-bit floats instead of one 64-bit one. Big-scale SIMD, like the Connection Machine, where one operation was applied to 1024 data items simultaneously, doesn't turn out to be that useful. Beating the problem into a form where you can apply the all-at-once SIMD hammer turns out to be more trouble than it is worth.
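For what "registers split up a little differently" looks like in practice, here is a small example using SSE2 intrinsics (128-bit registers rather than the 64-bit MMX ones mentioned above, but the same packed-operand idea): sixteen byte-wide additions issued as one instruction.

    /* Small-scale SIMD: one instruction operating on many packed operands
     * held in a single wide register (SSE2 here, same idea as MMX). */
    #include <emmintrin.h>
    #include <stdint.h>
    #include <stdio.h>

    int main(void)
    {
        uint8_t a[16] = { 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16 };
        uint8_t b[16] = { 10, 10, 10, 10, 10, 10, 10, 10,
                          10, 10, 10, 10, 10, 10, 10, 10 };
        uint8_t r[16];

        __m128i va = _mm_loadu_si128((const __m128i *)a);
        __m128i vb = _mm_loadu_si128((const __m128i *)b);
        __m128i vr = _mm_add_epi8(va, vb);        /* 16 byte-wide adds in one instruction */
        _mm_storeu_si128((__m128i *)r, vr);

        for (int i = 0; i < 16; i++)
            printf("%d ", r[i]);
        printf("\n");
        return 0;
    }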
Capability machines...
The capability machine people have a terribly hard time explaining how their idea works. I've known several people involved in such work, including Norm Hardy, and I had hopes for EROS, but it's not going anywhere. The problem is writing applications that can effectively use these complex security architectures. Capabilities are a low-level mechanism; you have to build a system and a policy on top of them, and demonstrate that the policy is secure. This is hard.
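The low-level mechanism itself is simple enough to sketch, which is part of the point: a capability is just an unforgeable pairing of an object reference with the rights its holder has, and every operation is checked against those rights. The toy model below is purely illustrative; real capability systems enforce unforgeability in the kernel or hardware, not in user code, and the hard part is composing millions of such checks into a whole-system policy you can reason about.

    /* Toy model of a capability: an object reference bundled with rights.
     * Illustrative only; not EROS's or any real system's actual design. */
    #include <stdbool.h>
    #include <stdio.h>

    enum { CAP_READ = 1 << 0, CAP_WRITE = 1 << 1 };

    typedef struct {
        void    *object;  /* the resource this capability designates */
        unsigned rights;  /* what the holder may do with it */
    } capability_t;

    static bool cap_allows(capability_t cap, unsigned needed)
    {
        return (cap.rights & needed) == needed;
    }

    int main(void)
    {
        int file_contents = 42;
        capability_t read_only = { &file_contents, CAP_READ };

        printf("read allowed:  %s\n", cap_allows(read_only, CAP_READ)  ? "yes" : "no");
        printf("write allowed: %s\n", cap_allows(read_only, CAP_WRITE) ? "yes" : "no");
        return 0;
    }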
Agreed that data flow machines exist down where the work gets done inside the CPU. The Pentium Pro and its successors are all dataflow machines deep inside. But the data flow architecture isn't exposed to the programmer.
There are some exotic architectures in game consoles, the PS2 being the fanciest one. Graphics and signal processing pipelines often have weird machines. But those are special-purpose applications where some tiny inner loop is executed over and over.
Why to an outsider this seems obvious (Score:3, Interesting)
Going back a few years, I remember the buzz surrounding Transmeta. 'There is this company that's developing something ground-breaking... and Linus Torvalds is working there!' If memory serves me, investors--any investor--would give their eye teeth just to be able to put money into something 'groundbreaking' being worked on by Torvalds.
Then we finally saw what it was. A chip. Oh. . . Well, what makes the chip so special? It uses less power. Oh. . . Does that change anything for us? Sure, your laptop batteries will last a little longer, and if you run a server your electric bills might be a little lower. Oh.
Of course, I'm not a programmer, nor do I work on hardware, but for me this was a letdown after so much hype, second only to learning that 'It' was nothing more than a scooter that was hard to tip over.
That was two years ago, and despite the fact that there is some benefit to the otherwise ho-hum technology, where is it? I buy a lot of computers, and I don't even know where to buy a Transmeta equipped machine (then again, I've never really looked, and have never been given a good reason to look).
So, again, this seems obvious. A company pours a big chunk of change into a product that never sees the light of day on a mainstream store shelf... a product that I quickly forget about and am only reminded from time to time on Slashdot stories.
I suppose, scanning the posts, that there are a handful of gee-whiz products out there (albeit not in the United States) with a Crusoe chip, but I just don't see them, or see any reason to spend the extra money on them.
And so Transmeta starts laying off people. It just seems to be the next logical choice.
Too late... (Score:2)
That's very good for most people since this is a "normal" notebook with a fast CPU (P3 1.13), plenty of RAM, big disk (48GB), nice screen (1400x1050), and a DVD/CD-RW.
Intel must be jealous (Score:2)
And as long as Linus is there, all you suckers will line up to work for them again in a year or two when their market comes back and they rehire. You're not gonna stay away just because you or some friends of yours once got laid off, are ya?
___
zerg (Score:3, Insightful)
Re:zerg (Score:2)
But in a company like Transmeta - or indeed any high tech company whose value is its intellectual property - the workers do own the means of production. The company's product is the solidified thoughts of its professional staff. The staff own their own brains and their own educations and their own imaginations. If you've got those, what do you really need to be productive? A PC and a desk, and I bet most of them own those too. It's not like in old-style industries where the value of a business was in its physical assets.
Sadly, dot.communism doesn't protect you from the fundamental law of economics, which is that every participant in a viable economic system must produce at least as much as they consume. At the end of the day, TransMeta simply didn't sell anything that anyone else wanted to buy.
Penguins vs. Cockroaches (Score:2)
I always thought Penguins were like cockroaches in that "Once they are in....they never leave."
Now if they could just survive a nuclear holocaust they would be all set.
Code Morphing (Score:4, Interesting)
Have they ever stated any intention to implement another target for the code morphing? Being able to have the same computer be a Mac or a PC (or a Sparc) would be far more compelling, and is what I had hoped the original story was all about. Is that just not lucrative? Do they not have the resources to pull it off? Was the Transmeta chip designed too much with Intel in mind, so that PPC or Sparc emulation isn't possible???
Its biggest advantage seems to have gone completely by the wayside.
Re:Code Morphing (Score:4, Insightful)
The other potential seemed to be that they'd create different cores with different optimizations -- the first one, Crusoe, being power-efficient, another one could be optimized towards floating point, another to integer operations, etc. But that hasn't happened.
Alternate architectures would be interesting -- at least PPC. In a Mac, it could allow efficient Windows emulation... but that seems like less and less of an issue, as portable applications usually mean web-based, and non-web applications usually have Mac alternatives. At least, I don't think Apple is enthusiastic about Windows emulation, and without Apple PPC is useless, since they won't have MacOS. Other non-x86 architectures don't seem important -- there's little software available for ARM or SPARC that won't be ported to x86 if there's demand.
It's like with languages -- if you know English, learning a second language is no longer that important. Transmeta started out learning the English of the instruction sets -- x86 -- and there's little incentive to learning other languages. Even if some programs started out with different machine languages, they all learn to speak x86 eventually.
Re:Code Morphing (Score:2)
I'm not sure who would want one other than cross-platform developers, but it would have been interesting.
Do not count these guys out (Score:5, Interesting)
Hi All!
I think it would be a big mistake to count Transmeta out any time soon. I say that not because I'm a penguin-loving Linus-worshipper. To the contrary, I primarily use Microsoft development tools, and when I'm feeling giddy about Unix I use FreeBSD. The only Linux boxes around here belong to paying customers.
So why not write off Transmeta?
Simply put, they're working their way into the product channel. Transmeta does have a very low-power chip--and that Transmeta technology is at the core of an emerging form of hardware: the smarter embedded system. Don't think "desktop replacement"--think "death to the PLC."
What's a PLC?
Programmable Logic Controllers are tiny CPUs that appear in all sorts of specialty uses: controllers, valves, automated-just-about-anything. They're cheap, they're generally very reliable--and they have almost no memory, very limited functionality, and require programmers who demand significant coin. When you try to add a feature to an embedded application, you will typically get a response on the order of "that will take--at least--200 bytes of memory. And we only have 68 bytes left. So what feature do you want to drop to do this?"
Coming soon, to a factory floor near you...
The Palm OS, WinCE, and the Transmeta chips are going to change all that. Handhelds and rugged semi-embedded handhelds are appearing in larger numbers--with gigabytes of flash storage, and 128 MB of RAM. Skip counting bytes--add all the features you want. Connectivity? They have 802.11 already embedded, along with USB, serial ports, etc., etc., etc. Some of the vendors I've browsed recently include InfoCater [infocater.com] and SyntegraTech [syntegratech.com]; they're both distributors for Tablets, WebPads, and handhelds that run with WinCE or Midori Linux. Very, very cool stuff.
Laying off 20% of your staff may be painful--but it is not the same thing as shutting the doors. For example, note that VA Software is still around....
Re:Do not count these guys out (Score:3, Informative)
Many chemical processing plants have had modern-looking control rooms with goodies like big touch-screen CRTs for a decade; they do not really care about money. In many cases, the SCADA system and the nice GUI frontend just read data from the PLC... Once upon a time, I did some contract work for a beer brewery. During one of the presentations, I forgot to mention explicitly in the first slide that we wouldn't touch their PLCs when installing the proposed software sensor module. I did see the technical manager's face change colour; he wanted to kick us out...
YES! (Score:4, Funny)
Still a Perfect Match for Apple? (Score:2, Interesting)
Since Jobs has finally reached the point where he doesn't immediately shoot down the idea of switching to x86 (In a recent interview: "We like having options..."), maybe they should check into this as a way of keeping PPC going.
It would solve many of their potential spin problems:
Unfortunately, my curmudgeon side says this all makes too much sense to ever become reality.
How Transmeta can turn a profit. (Score:4, Funny)
2) ???
3) Profit!
End of the line for embedded nirvana? (Score:2)
Re:Transmeta is the answer to a question (Score:2, Insightful)
I always thought that it was a strange business model to develop something pretty cool and then lock it up and sell it only to restricted developers. Surely they should have set the price based on demand and how many of the suckers they could actually make.
Oh well, another .bomb in the making
Re:national semi? (Score:2)
so even the National Semi techs might start turning out decent chips soon... THEN where will you get them?!
Re:Fuck (Score:2)
Take my advice: find someplace and do your best to hide out for the next two years, even if it's not doing something groundbreaking. Remember: there's always the next time around the bubble.
Re:Fuck (Score:2)
I just keep telling myself, now is not the time to panic.
Re:Fuck (Score:2, Offtopic)
Sorry. Seriously, though, if you're not rooted in too firmly, there are jobs out there if you're willing to move to them, especially in places with big defense companies (San Diego, various east coast cities, etc).
If you're stuck in the boonies, though, you've got problems -- the lack of fallback jobs is why I passed on a very well-paying job with a startup in Madison, WI, even though I really needed the work at the time and have family there. If you get laid off and are a niche-type worker, you're in trouble.
Re:Fuck (Score:2)
Re:Fuck (Score:2)
Re:Fuck (Score:2)
Me, I'm working on mainframes and going to school to learn as much about AI as I can. I'm praying that my experience won't overshadow my education once my thesis is done. But if it is, at least I can program on mainframes.
Re:F*** (Score:3, Interesting)
What a miserable experience. But we came out of it very well. My co-worker is working at a stable company for more pay (personal connection got the interview) and I'm working for my former client for much more pay, benefits, and equity (not options; equity in a profitable company). While I was sweating out the collapse of the old company I had very little hope going forward. Looking back, it was a great opportunity.
Re:Welcome to the Bush Economy (Score:2)
Transmeta Does Not Have to Die (Score:2)
There is something fundamentally wrong with the way we develop software and the way we design our CPUs to execute the software that we develop. There is something rotten at the heart of software engineering. It has to do with the old practice of using the algorithm as the basis of software development.
We need a new software construction paradigm, one which is based on signals. Transmeta has the golden opportunity to do something real cool and save lives in the process. More can be found at the links below.
Project COSA [gte.net]
Re:No big deal (Score:2)
Re:No big deal (Score:2)
Anybody who is anyone reads slashdot!
I can hear the whispers of the word "slashdot", from a thousand geeks, roll across the plains.
This cylix fellow, I'm amazed he reads slashdot... he is way too famous!
Re:No big deal (Score:2, Insightful)
This isn't linus. His last name is spelled "Torvalds" NOT "Thorvals". Just a troll begging for attention.
Unless he got locked out of his previous account and couldn't get back in (because the e-mail address is way out of date) so he had to create a new account. If you look at the account info, notice that this was the first time it was used, and subsequent postings do nothing to suggest that this was a troll.
-a
Re:financial results?? (Score:2)
Re:financial results?? (Score:2)
Re:Monopoly (Score:2)
5 years ago or so? IBM and Sun are arch-enemies; HP is usually considered just as "that printer company", not as worthwhile an opponent (at least if you believe McNealy).
Re:Monopoly (Score:2)
Re:Ditzel should be behind bars (Score:2, Informative)
GNUbank (Score:2)