I have one word for you: Babbage.
Umm. Should I even bother to point out how many things are wrong with what you said?
PhysX itself is used by developers. The card being capable of running it is fairly irrelevant. "Do you want shiny extra visual-only effects? Yes/No". PhysX GPU acceleration is fairly useless, and even many Nvidia users turn it off to save framerate.
You have a video card, which you presumably use for a game. If PhysX is used by the game, good for you, unless of course it uses PhysX in software only (as the overwhelming majority of PhysX licensees do). If it doesn't, or it's unsupported, have you somehow lost something?
You're effectively working from the expectations that (a) 'everything' uses PhysX, and (b) 'everything' must support PhysX.
You're effectively (and naively) blaming Nvidia for the lack of PhysX on AMD/ATI cards, when AMD itself hasn't been interested in it, hasn't developed its own comparable libraries, and let the Havok deal fall through. AMD has done, as far as I can tell, absolutely nothing to compete or interoperate in this area, not even encouraged open source to do the work for them (as they're now doing for UNIX drivers).
The ultimate defense against libel is the truth; the ultimate defense against a charge of anti-competitive practices is the competitor's own incompetence and inability to stay afloat. Nvidia had much (but not everything; NEC was the bigger factor there) to do with the demise of 3dfx, and absolutely nothing to do with the decline of AMD's competitive quality and its image in the eyes of the consumer. If more people bought AMD cards, then AMD would have a greater market share, logically. Nvidia hasn't sabotaged them in any way whatsoever, not even a teensy bit, nor does it hold the lion's share of the GPU market.
Why not also blame Nvidia for not porting PhysX to Linux, MacOS, FreeBSD, and Solaris while you're at it? Come on. This is the real world, not 'every company does what you want before you want it' land.
The market for PhysX is specifically developers, not you, the end user. Why are you so hell-bent on getting a few extra graphical effects from PhysX? That's what it's used for in most cases, and the most cases are... what, MAYBE 10 games ever released that directly support hardware PhysX on the GPU? And another 10 that supported the Ageia PPU, but not an Nvidia card?
Market share is also a non-issue for you, the end user. After all, do you get PhysX acceleration in a Havok game? No. Does it matter how much market share Havok or Bullet Physics has, if the game you like uses it? No.
You're... kinda wrong about MS and monopoly. They were ruled a monopoly for anti-competitive practices and various licensing deals they made to hurt any competition. It has nothing to do with Apple 'not being available on x86'. That's... very, very strange. Apple backed IBM's PowerPC after they backed the Motorola 68k. That Mac OS wasn't available on x86 was simply because Apple didn't want it to be, just as they don't want it available for general non-Apple computers today. Microsoft had nothing to do with that, and even periodically developed software for MacOS/PPC. They even developed a version of Windows NT 3.51 for PPC (which didn't hurt Apple at all).
Nvidia is far from a monopoly in the graphics market, with a mere 29% of the market and AMD 17%. Intel dominates with 44%. At least according to JPR, whom I'm sure you distrust as well: http://jonpeddie.com/press-releases/details/amd-soars-in-q209-intel-and-nvidia-also-show-great-gains/
Could AMD's loss of market share (and I seem to recall that Nvidia and ATI were fairly neck and neck in the Geforce 2/3 era) be due to their own problems which they have yet to resolve, despite ATI/AMD merger? Maybe possibly? If people don't trust you and your product to work particularly well, people will be less likely to put down $100+ merely for you to disappoint them. The door swings both ways.
Everyone knows that MS Windows is the main host of botnets, zombies, and general malware on the Internet. Hardly a month passes without Microsoft patching yet another "critical vulnerability". Unfortunately there are reasons why MS Windows is more vulnerable than, e.g., MacOS, Unix, FreeBSD, or Linux. For one thing, MS Windows (until Vista) was never designed from the ground up for multi-user operation; security was always tacked on as an afterthought. The architecture of MS Windows, with its myriad add-ons (that tend to carry out _system_ tasks) and the (deliberately) tight coupling between MS Windows and MS applications, conveniently makes for multiple points of attack, and once a process is suborned by an attacker there is nothing in the MS Windows architecture designed to contain it or stand in its way. That's why we see so many infected Windows PCs on the Internet.
Oh yes, there are those who hold that, e.g., Linux would suffer the same level of compromise had it the same level of penetration on the desktop, but the fact that about 60% of all Internet traffic is handled by Linux machines (which are far less often compromised) pleads against that. It's not exposure that does it but architecture (and the quality of administration, but that's another issue).
So that being the case, what would benefit Microsoft more than to be able to cast doubt on tales of machines being infected and taken over as "Probably pirated copies; legal Windows versions are protected by MS security updates."?
That would give Microsoft a good reply when called out over the insecurity of MS Windows (e.g. when a large organization is considering what OS it should use in the next 10 years).
What do you think? Might I be anywhere near the mark?
I'm from a small org, fully embracing the leading edge.
But I can see the following scenario:
1) Org has a large internal app written for IE6 only. It can't be upgraded, so users are forced to keep IE6 on their workstations.
2) Org's IT admins are well aware of the security problems IE6 forces them to work around.
3) Roll out the Chrome plugin, and set things up so everything *but* the internal site uses Chrome.
Installing IE upgrades makes it difficult to keep an IE6 and IE-latest deployment side by side in a 'supported' fashion (unless MS has a 'supported' way of doing this?).
Using the Chrome plugin lets the Org upgrade the browser to something maintained & more secure on their deployment, while allowing the archaic app to work as expected.
Good for you! My high school was pretty high-tech too. We had an IBM PC in 1979 (OK, technically it was an IBM 5150. They didn't start calling it a PC until later), and a computer science teacher, Alan Schulz, who was one of the "boys from Boca" that invented the damned thing. He got us a computer lab (with Apple ]['s of course) and a variety of other machines (does anybody really remember the Timex Sinclair or TI99/4a?). They made him teach math most of the time because computer science wasn't some serious business endeavor back then. He owned the local Apple store.
He taught me a lot about basic science: don't accept anything as a "magic black box". Start with an understanding of the transistor and how transistors build into gates and logic. Proceed to an understanding of machine language -- especially comparison and branch operations. When you know how such things are done at the electrical level, it does amazing things for the persistence of your understanding of the rest of it, and for your ability to detect bullshit. Having struggled through a course where we had to write useful applications in 8-bit opcodes, written in pencil on paper in binary, I learned some things I'm unlikely to forget. Doing so as the only member of a four-person team to produce anything useful, I learned other lessons that still give daily service. A few years ago I went back, and some of the apps I wrote are still in daily use, though heavily modified of course.
As a historical note, the student computer society (BUHSCCIOBBDT) had fundraisers and bought some stock - IBM, Microsoft and Apple among others. It did quite well.
Logic diagrams, Venn diagrams, and other primitives are still as useful as they ever were. APL is still a write-only language. BASIC is still good for quick mock-ups of what a program will be when you've written it in a real language. Tape still sucks for bandwidth. Ada is still easy to sell and gruesome to program in. Game programming is still about balance between challenge and reward. GOTO is still flamebait. Programmers still play D&D (or some modern equivalent) in high school. Applications are still data structures + algorithms. To be honest, a lot of the stuff I learned then and in years following is now worthless (SNOBOL anyone?) but I'm doing better than some because my excursions from Assembler, C, and C++ have been recreational at most. I've collected scores of languages the way some people collect Happy Meal toys and discovered the same thing such collectors have: 90% of stuff that's manufactured is junk to stuff a landfill with.
I was also fortunate to be in school with folks like Robert Toth and Vince Sherart, who were great minds well ahead of their time. From your post I'm guessing that you're also surrounded by folks who will persist and do well.
Let me put this another way. In every field there's a ton of fakers who subsist by getting in with buzzword proficiency or an MCSE cert and rise to middle management through meeting management. These people serve the purpose of preventing excess productivity, which believe it or not is a socially useful goal. You don't have to be one of those. You can get ahead by knowing how to do stuff. If you proceed in your education from understanding the first causes to the prime forces, then when you have to deal with one of these jerks you can cut him off at the knees by pointing out the things he doesn't know, and in the process make your work environment more fun to be in. As a bonus it's fun to watch them wilt.
I had to take touch typing lessons from 6-9th grade and I'll be damned if any of them stuck. I've been two finger typing now for 30+ years and all those hours of class time were a waste.
The person who had thought they'd seen a gunman in the neighborhood had actually seen a Bungie employee carrying a replica Halo rifle back to the studio's offices, Bungie community director Brian Jarrard told me. Recognizing there was no longer an emergency, officers advised Bungie officials to transport the gun more discreetly in the future.
Note that the article says the employee was 'carrying' the weapon, and that police advised Bungie to be more discreet in 'transporting' the replica. So although there are no guarantees, the article certainly implies that the replica was just being carried.
Me, I think the police should have advised the individual who called in not to be such a candy ass in the future. My personal, biased, unscientific risk assessment tells me we suffer far more from excess paranoia than we do from random shootings. I acknowledge that random shootings are a real problem in the U.S., but I think the paranoia we live under is a much bigger problem.
Particularly as this relates to 'imaginary property', I see no reason why the buyers, who bought in good faith, should be deprived of their purchases.
If we accept that the rights holder didn't ever give permission to the sales, their wishes have been ignored when offering the items for sale but should their rights be given more weight than those of the customers? I see no reason why this should be so.
You seem to be working from an assumption of some sort. How about we consider the taxman coming to your place of business to conduct an audit. He finds that you owe 7 million dollars in taxes, instead of the 4 million dollars that you claimed. Can you argue that the numbers aren't off by an order of magnitude? "But, sir, my numbers are only wrong by about 75%, this isn't fraud!" Or, we can work those same numbers backward - "But, sir, my numbers are only off by about 43% - of course it's not fraud!" Good luck with that, huh?
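To make that arithmetic concrete, here's a quick sketch of the two percentages using the hypothetical audit figures above (note that worked backward, the exact figure is about 43%, not 50%):

```python
# Hypothetical audit figures from the example above.
claimed = 4_000_000   # what the business reported owing
actual = 7_000_000    # what the auditor actually found

shortfall = actual - claimed              # $3,000,000 understated
pct_of_claim = shortfall / claimed        # relative to what was claimed
pct_of_actual = shortfall / actual        # relative to what was owed

print(f"{pct_of_claim:.0%} of what was claimed")  # 75% of what was claimed
print(f"{pct_of_actual:.0%} of what was owed")    # 43% of what was owed
```

Same three million dollars either way; the only thing that changes is which number you divide by, which is exactly how "it's only off by X%" gets gamed.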
The numbers are fraudulent, plain and simple. As others have pointed out, anyone in the scientific field(s) would be laughed out of academia for submitting such flawed numbers and such flawed reasoning.
"In fact, unless they only surveyed people WITH internet access,"
BTW - TFA specifically says that everyone in the survey had internet access.
There is no line of work on planet earth where people are permitted to do such obviously fraudulent math. If similarly flawed mathematics were applied to a construction job by a bunch of backwoods hillbillies, they would soon be out of business.
You simply cannot justify the numbers with any sort of logic. Attempting to do so is an exercise in fraud.
OK, even the SUMMARY contains a sentence that says the roads wouldn't need plowing in the winter because they heat themselves to automatically melt any snow accumulation.
I strongly suspect the author of that statement has never seen a cold day in Minneapolis or Ottawa... where the temperature dips to almost -30 at night... and you don't see the sun for days.
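For a sense of scale, here's a back-of-envelope sketch of the energy needed just to melt a light snowfall on one lane-kilometre of self-heating road starting from -30 degrees C. The snow depth, density, and lane width are assumed illustrative values, and heat lost to the cold air is ignored entirely:

```python
# Back-of-envelope: energy to clear 5 cm of fresh snow from one
# lane-kilometre of self-heating road, starting at -30 degrees C.
# The snow depth, density, and lane width are assumed values.
LATENT_HEAT_FUSION = 334_000  # J/kg to turn 0 C ice into 0 C water
SPECIFIC_HEAT_ICE = 2_100     # J/(kg*K)

snow_depth = 0.05             # m of accumulation (assumed)
snow_density = 100            # kg/m^3, light fresh snow (assumed)
area = 1000 * 3.5             # m^2: one 3.5 m lane, 1 km long (assumed)

mass = snow_depth * snow_density * area   # kg of snow on the lane
warm_up = mass * SPECIFIC_HEAT_ICE * 30   # J to raise -30 C snow to 0 C
melt = mass * LATENT_HEAT_FUSION          # J to actually melt it
total_kwh = (warm_up + melt) / 3.6e6      # 1 kWh = 3.6e6 J

print(round(total_kwh))  # ~1930 kWh per lane-km, before any heat losses
```

That's roughly two megawatt-hours per lane-kilometre for a single light snowfall, and it ignores convective heat loss to -30 air, which in practice would dwarf the melting energy itself.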