
Comment Re:Intel's political marketing has always been bad (Score 1) 17

It's a mixture. Intel licensed their designs to AMD for a while so IBM could use AMD as a second source. Later they became competitors. There's no evidence of "reverse engineering" - which isn't even a bad thing (reverse engineering is what you do if you want to create a 1:1 compatible version of a product without copying it: you create, as best you can, documentation of how something should work, then use that documentation to create a design) - or of stealing it. And why would they steal it and then reverse engineer it? Rather, at some point when they stopped getting licenses, AMD just... made their own version based on Intel's public specs. As have a number of companies, using various degrees of reverse engineering, including NEC, Chips and Technologies, Cyrix, VMT, VIA Technologies, and even IBM.

Furthermore, the chip in your PC right now, be it Intel's or AMD's, is mostly an AMD design, with some legacy Intel design crufted on. That's right: AMD, not Intel, came up with x86-64, the 64-bit architecture most of us have been using since the mid-2010s. And Intel licensed it from them. It's AMD's technology now.

Does that mean Intel are the good guys after all? No, this is corporate bullshit. Neither AMD nor Intel are inherently good or bad. Intel foisted some pretty awful CPU architectures on the world before coming up with a non-mediocre one in the form of the 80386 (cue the idiot I argued with the other day who'll claim the 8086 is a modern CPU, works the way modern CPUs do, and doesn't have a ridiculous architecture - you're still wrong!). They didn't know what they were doing after Federico Faggin left to found Zilog, but they had the market dominance, mostly through mindshare, to get their CPUs everywhere.

AMD were responsible for the bulk of the "runs a little hot" CPU wars of the late 1990s/early 2000s, pushing power-sucking, cooling-system-overworking CPUs to try to beat Intel's performance... but then Intel decided to ape them until the Core architecture, so Intel's not a good guy there either.

Both have made mistakes and tried to paper over them. Both have fired people who didn't deserve it. Both are, ultimately, sociopathic corporations.

Unlike Motorola. Wish they still made CPUs. ;-) 68000 FTW!

Comment Huh (Score 3, Insightful) 17

> Maybe it's because AMD stock sits around $196 while Intel hovers near $41,

What? This is what passes for financial literacy these days? Do they think that the stock price of two equal companies is equal?

Maybe Berkshire Hathaway Inc, stock price $716,299.99 at the time of writing, can buy both of them, and use the money in the couch to buy Apple? I mean, if that's how the stock market works...

For those who really do think this is a thing, look up "market capitalization". Market cap divided by the number of shares gives the share price, and it's the market cap that's considered the stock market valuation of a company. AMD does have a higher market cap, at $355B to Intel's $253B, but those numbers are roughly 1.4x apart, not nearly 5x.
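A quick back-of-the-envelope check makes the point. This is just a Python sketch using the figures quoted above (snapshots from the comment, not live data):

```python
# Rough sanity check using the figures quoted above (market caps and share
# prices are taken from the comment, not live data).
amd_cap, intel_cap = 355e9, 253e9      # market capitalizations
amd_price, intel_price = 196.0, 41.0   # share prices

cap_ratio = amd_cap / intel_cap        # ~1.4x: the real size comparison
price_ratio = amd_price / intel_price  # ~4.8x: meaningless on its own

# Share price is just market cap spread over however many shares exist:
amd_shares = amd_cap / amd_price       # ~1.8 billion shares
intel_shares = intel_cap / intel_price # ~6.2 billion shares

print(f"cap ratio {cap_ratio:.2f}x, price ratio {price_ratio:.2f}x")
# prints: cap ratio 1.40x, price ratio 4.78x
```

The price ratio looks dramatic only because Intel has far more shares outstanding; comparing per-share prices across companies tells you nothing about their relative size.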

Comment Re:I love... (Score 2) 57

There's at least some evidence on some level that the C-suite class actually believes all this bullshit. Hence the mandates forcing people to use AI and giving people bad performance reviews if they don't use it.

This isn't to say there haven't also been a lot of redundancies blamed on AI that would have happened anyway - I've said as much myself - but certainly we've had plenty of cases where the C-suite have assumed that AI can fill in the gaps.

I think once the AI bubble pops, rehiring skilled workers will be a necessity for those companies that didn't bankrupt themselves.

Comment Re:Bodes ill for Wikipedia (Score 1) 53

Wikipedia doesn't have an algorithm and it's not trying to make you angry or in any other way manipulate your mental state. It's a rabbit hole because knowledge is naturally a rabbit hole. People enjoy learning new things, despite the best efforts of the culture warriors to convince you otherwise.

Nothing about this lawsuit has anything to do with Wikipedia and you have to seriously misread it to think it does.

Comment Re:If only (Score 4, Informative) 94

> Funny how that once-in-a-life-time switch to work from home, didn't stick, and all the corporate morons wanted to go back to the office, because they don't care about fuel and energy costs.

I think the theory that a lot of this was about forcing people out has some truth to it. They're psychopaths, but for some reason it's easier on them if they don't have to decide who gets made redundant and can pretend the employee made the decision themselves.

There's also some truth that external investors, who had a lot of money tied up in commercial real estate, were demanding RTO policies.

The one thing I don't buy are the excuses they gave. For the most part, WFH resulted in substantial productivity gains for the businesses that implemented it properly. It's unfortunate but the reality is that most businesses do not seem to prioritize the needs of the business over the need to stay in line with what they think people want to hear.

Comment Re:Wozniak - the real reason for Apple (Score 1) 55

Once again, Jack Tramiel gets no credit despite pushing computers into more hands than anyone else...

(Yes, not an engineer, but neither was Jobs. Jobs was always management and marketing, whatever his background might have been.)

I think the reality is that multiple people turned a hobby into a market and phenomenon. The reason Jobs gets so much focus is, I'm guessing:

1. Like Tramiel, Sinclair, Chris Curry/Hermann Hauser, and the largely forgotten names behind the TRS-80, he pushed for home computers to reach a wider audience than just electrical engineers - but he charged more and aimed at the upper middle classes, from which our decision makers and journalists come.

2. He was one of the few people with power to recognize the importance of the GUI work Xerox was doing and had Apple invest in that and produce the first computer aimed at the audience I just mentioned that had a WIMP interface.

3. He didn't leave.
  - Sinclair quit computing after the Z88.
  - Tramiel was fired from Commodore for stupid reasons, was up against heavy competition from his old employer at Atari, and eventually wound down the latter, not seeing it as having a future (and, to be honest, being at retirement age anyway). As a result, most Millennials and younger people have never heard of him. (If you're reading this and haven't, go read about him; he's a fascinating person. Engineers loved him. Marketers admired him. And people in business with him - dealerships, suppliers, etc. - hated him...)
  - The Acorn people faded from view, kicked around upper management at Olivetti and then founding other interesting companies - which, alas, happened just as the entire computing establishment had decided that only IBM PC clones mattered. Their relevance disappeared.
  - Anyone know the TRS-80 people? Regardless, Radio Shack was more a traditional corporation anyway; if there was a Mr/Ms TRS-80, they didn't have much influence once Radio Shack hitched itself to the IBM PC clone thing.

So that left Jobs, who came back to a still-active, still-not-bankrupt, still-selling-non-PCs Apple. And that gave him a far bigger boost in the public eye than those who had effectively left the industry because their companies could no longer operate doing anything interesting in the IBM PC clone world.

Jobs was not as big a figure pre-comeback. I knew of him, I remember reading the reports of him being fired from Apple in Personal Computer World, but these were inside-baseball type stories. He was no bigger in that story than the person who fired him, John Sculley. The fact that Jobs was the founder of NeXT was mentioned, but it was very much "Former Apple executive creates impressive workstation". The articles would inevitably explain who Jobs was and why he was fired from Apple.

Over time, his rep grew. But remember, it didn't happen during his first stint as a major Apple executive.

Comment Re:Here we go again.... (Score 0) 118

> Unfortunately they chose to change the UI for change's sake

I'm about 90% convinced they introduced the Ribbon for anti-trust reasons. Here's a change in UI that cannot be fully cloned by competitors (they'd have to make their own custom Ribbons with a custom, non-Word layout to avoid falling foul of copyright), and which hampers users' ability to transfer their skills from Word to, say, WordPerfect (or even from Word to Excel).

Look at the timings, with development occurring at a time when Microsoft had just wiggled out of a substantial anti-trust suit that threatened to break it up into application and operating system companies, and released at a time they were being forced to open up their file formats in Europe, and it explains perfectly why they introduced that user-hostile garbage when they did.

Comment Re:They don't want to make other OSes more attract (Score 5, Informative) 118

They're not. Electron apps are not accessed via a browser. While it's true you can easily port an Electron app to GNU/Linux, that's also true of a .NET app (which, let's be honest, is likely what they're talking about here; I doubt they're going back to C++ for everything).

The real advantage of Electron is that you can use most of the same code and assets for a website as for an Electron application, which is useful; but given how ridiculously inefficient Electron is, that isn't much of a justification for using it. Over the last 15 years, most desktop operating systems' UIs have been debased by increasingly inconsistent designs, making them harder to use, and a huge amount of that has been chasing some superficial "web" design that doesn't really exist - at least, not in a form that stands still.

My sense of this:

Microsoft is in a panic. Almost everything different between Windows 10 and Windows 11 is disliked, from the centralized logins to the AI-with-everything. On top of this, RAM prices are sky high, meaning the bloat is rapidly becoming a problem. What they've realized is that they have to do a full overhaul of Windows 11, and one part of that is to stop using technologies like Electron where they shouldn't be used. They could reduce Windows' memory footprint to Windows 7 levels, and make their code more reliable and less dependent on third-party libraries and APIs, by eliminating a rather absurd example of abstraction-for-abstraction's-sake from their development stack.

This might even be good news.

Comment Re:Insider perspective: AI helps with amnesia only (Score 1) 66

> The point being...AI doesn't tangibly save time. It might save a bit under some circumstances, but not enough to justify layoffs.

Agreed with all of the above, but my even bigger concern with the idea of changing programming to babysitting electronic code writers, and doing the same for other parts of the business, is we're losing knowledge. Actively destroying knowledge indeed.

If Luddites were in charge of the world, they could do nothing more effective for their cause than promote AI. AI means nobody understands what the code is doing, and it reduces the number of people who know how code can work in general. In a decade it'd only take a few well-placed supply chain breakages and we could be looking at anything from a severe recession to the complete collapse of society. I'm not kidding. Businesses being run this way are setting themselves up as places where nobody from the CEO to the janitor has any idea how the business works.

This is so unbelievably fucked up, and even more so when you consider that the advocates of these technologies didn't simply introduce them, iPhone-style, to an excited set of consumers, but actively forced them on everyone, trying to go from 0 to 60 in the space of 2-3 years. Why? Why the hell wouldn't you give it a chance to prove or disprove itself first?

Because you (Altman et al) know it's not what it's supposed to be, perhaps?
