Even being above average means you're surrounded by (relative) idiots. Hell, just stay informed about world events, history, literature, and then stand there in disgust as all people can talk about is the latest episode of "Naked and Afraid". This is by no means a recent thing either; every generation throughout history has repeated the same sorry story.
This second case is also somewhat of a nautical term. The captain of a ship and its Chief Engineer can be standing on the bridge, and the Chief Engineer may report the ship to be "lost", meaning it is sinking uncontrollably.
Also, when a ship sinks you only have the position where it slipped below the surface; you don't necessarily know how it traveled on the way to the bottom. More importantly, prior to GPS, ship positions weren't necessarily that accurate. Wrecks are often considered lost until someone has eyes (real or synthetic, e.g., side-scan sonar) on them, which is what seems to be happening here.
Scuttled naval vessels sometimes become artificial reefs that greatly support the food chain for local fisheries. This can have a positive, long-term economic effect.
As for live-fire testing: laboratory testing and mockups are one thing, but how a missile performs against an actual ship is something else. What is the cost of an anti-ship weapon system that turns out to be ineffective against modern ships? Sadly, real ships are a necessity for such testing.
Run Windows VMs and keep adding them until the boxes are under some level of resource contention (3:1 or 4:1 vCPU:pCPU). If you don't see a difference, I'd be highly curious about your workloads and configuration.
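As a rough illustration (the host specs and VM sizes below are hypothetical, not from the post), the overcommit ratio being described is just the total vCPUs allocated across all VMs divided by the host's physical cores:

```python
# Sketch: compute the vCPU:pCPU overcommit ratio for a virtualization host.
# All numbers here are made-up examples for illustration.

def overcommit_ratio(vm_vcpus, physical_cores):
    """Total vCPUs allocated across VMs divided by physical cores on the host."""
    return sum(vm_vcpus) / physical_cores

# e.g. a dual 8-core host (16 pCPUs) running a dozen 4-vCPU Windows VMs
vms = [4] * 12          # 48 vCPUs allocated in total
pcores = 16
ratio = overcommit_ratio(vms, pcores)
print(f"{ratio:.0f}:1")  # 48 / 16 = 3:1 -- the low end of where contention shows
```

Once the ratio climbs into the 3:1 to 4:1 range, VMs start waiting on physical cores and per-VM performance differences between CPUs become visible in the workload.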
This is exactly correct. I myself replaced a SQL Server cluster that was using boxes with dual 12-core AMD procs with one using dual 4-core Xeons a couple years ago. Performance and responsiveness went way up while the bill to Microsoft dropped massively.
I was a solid AMD enthusiast from the original Athlons all the way up until about 5 years ago. They went from huge underdog to reigning champion for a long time while the marketing guys ran Intel's product offering into the ground with everything from Northwood to Prescott and all the stuff in between. But the landscape has shifted for AMD. They've simply gone downhill. As of the last couple of years, I can no longer justify buying AMD procs at work and I'd already switched at home. That AMD could boast significantly more cores was the last leg they had to stand on in the server market; now they're a has-been.
I sincerely hope they recover and blow past Intel as they've done in the past. I think that's healthier for the market and I think we all win when that competition heats up. But at this point, there's little to justify their existence in the server space and the market share numbers reflect that (dropping from >25% share to ~3%).
"He pushed them aside by killing development systems (VB6,FoxPro)"
Except VB6 and FoxPro were never really developer tools. They were tools for non-developers to get basic programming tasks done; that was exactly why they were created. Things like Visual C++ were always the tools targeted at professional developers.
"the Win32 API, to slowly become more irrelevant with endless layers of cruft built on top"
The Win32 API became irrelevant because developers moved on from it. Why on earth would you want to keep an API that's now decades old in its design and origins as your primary development target when no one is using it? That just doesn't make any sense. It doesn't make sense to maintain something that's inefficient to develop with, and I say this as someone who cut my teeth on Windows development with C and the Win32 API many years ago. I have fond memories of it, but I've no idea why you'd give a shit about it in this day and age.
Your argument is classic of someone incapable of dealing with change, which, in the world of technology, is probably one of the least desirable traits you can have. And you can take the "layers of cruft" argument even further: using C with the Win32 API is itself a layer of cruft built on top of doing everything in pure assembly, which is a layer of cruft on top of doing everything directly in machine code, and fuck the APIs.
"Ballmer wasn't bad; his jumping around on stage shouting "Developers!" showed that he knew what the true value of Windows was: the external developers who wrote Win32 code for retail products or company-internal developers."
Except no one's actually done that for the best part of two decades now. Even before
"However, his middle-empire stage was a shift to focusing on selling to enterprise customers. This isn't a bad things by itself, but by taking his eye off the "Developer!" ball and focusing elsewhere, he guaranteed that plenty of developers went elsewhere."
So what? The enterprise became more important, and fat client applications gave way to web applications. Ballmer doubled Microsoft's profits during his tenure, so it looks like his change of focus was exactly what the majority of businesses and developers needed. The fact that there's a handful of luddites bemoaning the decline of the Win32 API is meaningless, because you're such an irrelevance in the grand scheme of things; most people can deal with change and follow necessary trends, even if you cannot. That's not a problem with Ballmer or Microsoft, that's a problem with you. You can't blame Ballmer's willingness to pursue necessary change for your own inability to change.
People shifted to Java because it was a paradigm that gained a lot of hype in both business and academic circles and had a 6 year headstart on
"It's interesting to see how Nadella is shifting the focus again and broadening it (Windows 10 on Raspberry Pi, for example). Time will tell if Nadella is simply being an anti-Ballmer or if this glasnost is signs of a more fundamental shift in the way Microsoft does business. I hope it's the latter."
It's neither. It's a continuation of the status quo, or did you completely miss that Ballmer also pushed Windows 8 on ARM? This increase in scope of platform support, and move to open sourcing of APIs started well within Ballmer's years. Nadella is just continuing what was already started. It's not a change in direction, it's business as usual as it has been for some time now.
I keep hearing that, and every time I look for factual information on it I end up at a dead end: dead sites, non-existent papers, or articles in non-scholarly journals. And considering I've been digging through this off and on for the last 20 years and always end up in the same state, that leads me to believe it's simply being used as a hyped-up bit of propaganda. I'm not saying there aren't extinctions; I'm saying they're not at the level people claim.
You know, much like the end of the world, or claims that global warming would leave the earth with no ice caps by 2000 (said in the early 70s and again in the 90s), or that the Arctic Ocean would be free of ice by 2010 (early 80s), or that New York City would be like Ft. Lauderdale by 1995 (said in the late 60s).
Compaq had to reverse engineer the PC BIOS using engineers who had never looked at the BIOS. These engineers wrote a spec that a separate set of engineers then had to implement.
That's not how it worked. The first team absolutely looked at the BIOS to create that spec. It's the second team, the one that implemented the spec, that had never seen the BIOS.
There are quite a few games you can't buy from Australia at all (i.e., refused classification). My friends who live there usually PayPal me the money, and I fire off either a code to them or gift the game.
This is a night and day difference with respect to reverse engineering...
No, it isn't. They had to go further out of their way to dance around that issue in order to make a legal clone.
The half of the clean-room effort that does the implementation are the ones making the clone; they don't see source code, disassemblies, etc. The other half, doing the reverse engineering to develop the specification, have to discover the *intent* of the original developers with respect to functionality. That discovery is easier when you have their commented source code rather than a disassembly of a binary.
The dancing you refer to is for non-clean-room scenarios, where the developer implementing the compatible, non-infringing clone has access to the original copyrighted code. And that dance occurs regardless of whether he/she is working from a binary disassembly or commented source code. Lawyers literally look at the code and say these ten or so lines in the new code are too similar to these ten or so lines in the original. Disassembly and source have this same problem. Source still has the advantage of better divining the original intent, so having the source is also a win in the non-clean-room scenario.
...and the fact that IBM didn't want a compatible BIOS to be produced does not change this.
It changes this part:
Compaq et al were able to create clones because the IBM PC was an open platform.
No, it didn't. The fact that IBM provided source code to all PC programmers as a way of documenting the BIOS API actually made things simpler despite that desire. If IBM had wanted to act in a manner more consistent with that desire, so as to hamper Compaq et al, it would have simply given PC programmers the registers used for input/output parameters and the interrupts to invoke for each API call, as was done with DOS.
...the fact is those working on a compatible BIOS had the IBM source code with comments to work from
... they clean-room reverse engineered it.
A clean-room design involves *two* teams: a dirty team that reverse engineers the original and writes a specification for a compatible device, and a clean team that does the actual implementation using only the provided specification. The "wall" is between these two teams; the implementation team has no contact other than the specification.
The dirty half of the team had a much easier time creating the specification given that they had commented source code. This source code, widely distributed by IBM to PC programmers, was the BIOS API documentation. That is a night-and-day difference with respect to reverse engineering, and the fact that IBM didn't want a compatible BIOS to be produced doesn't change it.
Open listings like this were *the* documentation on how to use the BIOS API calls.
Compaq et al were able to create clones because the IBM PC was an open platform.
Wow, you know nothing about what happened, do you? Are we really already at the point where people don't have any idea how "locked down" the PC was when it first came out? Have we already forgotten? Oh, you misread a Wikipedia article.
Wrong. Are you under the mistaken impression that "open" means the source code is also free to reuse and distribute? It does not, contrary to how the FSF would like to redefine "open". The fact remains that the IBM PC BIOS was open: PC developers had access to the source code. This source code was part of the documentation IBM provided to PC programmers so that they could call the BIOS API; the comments in the source code were the API spec. We weren't using pirated copies, we were using official copies provided by IBM.
The fact that IBM was open with the source code and the specification team had access to commented source rather than disassembled binaries was a great advantage. Keep in mind that this source code listing was official IBM documentation on how to use the BIOS. IBM intended it to be viewed by PC programmers so that they could make use of BIOS API calls.