Wrong! It's "By the way iFarted."
* Not having to deal with endless driver updates
* Consoles "just work". No maintenance, no constant fiddling with the system.
Oh, come on now. "The console has a new upgrade. Upgrade now or be forever offline."
"This game has an update too. Apply now or be forever alone."
How many people complain about Steam forcing them to upgrade?
It's not a case of "upgrade now or be forever offline". You can always choose to upgrade at a later date.
There are very valid reasons for either a console manufacturer or a game publisher to refuse to accept non-updated clients. The first that comes to mind is cheating in online games: you patch the problem and refuse to allow unpatched clients. Game publishers and console makers both have a vested interest in removing cheaters from their ecosystem, because cheating spoils the game for everyone else, which generally means gamers will spend their money elsewhere.
Most MMO games on PCs have similar policies: you must keep your game updated, or you can't play. "Oh, you wanted to play today? Sorry, you have to download this 2.9 GiB patch first. See you tomorrow."
Gaming ecosystem issues aside, think of the upgrade process on the various platforms:
From a user's perspective:
* With a console, you get a popup window telling you it will install updates -- both firmware and software. Users don't have to keep track of anything; a single button press on the controller manages the entire update process. One click, 15-20 seconds, no problems at all.
* Touchscreen phones & tablets are another very popular gaming platform, and they are also painless to use and upgrade. Whether it's an OS, driver, or application upgrade, users automatically receive a notification and can tap once to apply it.
Now, let's look at the way a user would have to update a driver on the PC:
* Know exactly what hardware is in your machine
* Determine whether you need to update the driver (from news of some sort)
* Launch the web browser
* Visit the driver site
* Find and download the correct driver
* Open the download folder
* Unzip the driver
* Run the driver installer
All of that requires dozens of mouse clicks, several different programs, and several minutes of time.
I'm not saying that there's anything wrong with PC gaming. I'm only saying you can't gloss over the additional complexity and overhead gamers have to deal with on a PC.
After 20 years of being a hardcore PC gamer, my tune towards consoles has changed dramatically. Consoles have become a perfectly adequate platform for gaming, and have always been a lot less trouble to use.
"Better" is a relative term
The following are things console users don't have to deal with:
* Variable hardware platforms - different CPUs, GPUs, memory, disks, etc.
* This gives developers the ability to tune very tightly to the hardware, instead of having to support everything.
* Users don't have to mess with "detail" settings to tune for performance & appearance.
* No constant upgrade march (that ends up being more expensive than buying several consoles)
* Not having to deal with endless driver updates
* Consoles "just work". No maintenance, no constant fiddling with the system.
* I have yet to hear of malware on a console
If you're willing to pay the maintenance and administrative overhead that you have to deal with on a PC, then sure, it's fine.
Consoles, however, just work and don't require constant fiddling. It's a gaming appliance. You plug it in, it works. No advanced knowledge required.
I'm not arguing about whether it's right or wrong.
I'm just pointing out that in a number of ways, forcing someone to decrypt an encrypted disk isn't that dissimilar from compelling a defendant to unlock a combination safe. In both cases, it unlocks a trove of data and can give the prosecution evidence to help ruin the defendant's life.
You're correct to a degree, but...
Local governments (including city, county, and state) are far more sovereign than most non-Americans realize. The US is more tightly integrated than the EU, but one principle is the same: allowing for, and preserving, differences that are valued by an individual state.
Nevada and California are different states, just as Germany and France are different states. Each has its own culture, laws, traditions, and so forth, but they share a common currency and cede some control to their respective unions.
With such a government, you have an escalation process that must be followed before you make a ruling that affects everyone.
It's probably more like the court ordering you to unlock a safe when they have a warrant to search its contents.
That will be a tough one to prove
Obstruction of justice? It'd be hard for the prosecution to screw that one up.
As they already appear to have enough evidence to incarcerate him for possession of CP, it's hard to see what good pissing off the judge and adding further penalties would do.
The far better course of action is to comply with the court's orders, then argue that any data obtained is inadmissible as evidence because the order to decrypt it violated his 5th Amendment right against self-incrimination. Then he can try to get the conviction thrown out entirely.
As the FBI has already shown it is proficient enough to crack one of the guy's disks, there's good reason for the accused to worry they'll decrypt more data.
Anything the FBI decrypts on their own will be used to screw him to the wall.
Honestly, I'd prefer to contest all the way to the Supreme Court that being forced to decrypt a disk is a violation of the 5th Amendment.
I'd rather have a meaningful 5th Amendment as my shield than a faith in my ability to misunderstand and misuse encryption.
Such systems exist.
Even Dell sells them.
ARM is quite capable of competing with Intel, but it is no magic bullet.
CPU core power usage is only part of the overall system power usage.
What you care about is the number of web requests served per second. You can have a fast quad-core x86_64 chip do the job, or you can have many more (considerably slower) ARM cores do it.
In the end, the number of requests/second is similar, as is the power consumption.
You can't get around the laws of physics - there is a minimum amount of energy required to perform an operation. Intel chips use more power because they perform more operations per second.
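A back-of-the-envelope sketch of that equivalence (all throughput and power figures here are hypothetical, chosen only to illustrate the trade-off):

```python
# Hypothetical figures: one fast quad-core x86_64 chip vs. many slower ARM nodes.
x86 = {"req_per_sec": 40_000, "watts": 80}            # assumed
arm = {"req_per_sec": 1_250, "watts": 2.5, "n": 32}   # per node, assumed

arm_total_rps = arm["req_per_sec"] * arm["n"]
arm_total_w = arm["watts"] * arm["n"]

# Both platforms end up serving a similar number of requests per watt.
print(f"x86_64: {x86['req_per_sec'] / x86['watts']:.0f} req/s per watt")
print(f"ARM:    {arm_total_rps / arm_total_w:.0f} req/s per watt")
```

With these assumed numbers both come out to 500 req/s per watt - the point being that total work per joule, not per-core power, is what matters.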
75% of the performance for 25% of the power compared to x86
The problem is they don't provide anything approaching that sort of efficiency.
I've had the, um, privilege of benchmarking a few of the new up-and-coming ARM server systems and chips. It's pretty neat to be able to have four quad-core servers, each with 4GB of memory, pulling a total of 40W or so. That's a great system for a web server farm.
The problem is when you compare throughput per watt for high-performance computing. For a few workloads, the new ARM systems compare favorably, giving a small edge in work done per watt. That advantage is usually less than 5%.
The best case for the ARM systems, under the workloads most ideal for them: 105% of the performance for the same power draw.
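It's worth doing the arithmetic on how far that measurement falls short of the marketing claim quoted above:

```python
# "75% of the performance for 25% of the power" implies a 3x perf/watt advantage.
claimed = 0.75 / 0.25

# Best case actually measured: 105% of the performance at equal power draw.
measured = 1.05 / 1.00

print(claimed, measured)  # 3.0 vs. 1.05
```

A claimed 3x advantage versus an observed 1.05x - nowhere near the promised efficiency.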
If your workload doesn't scale to a large number of cores easily, or has a large amount of inter-process communication, the current ARM systems are hopeless. Even with a supposedly high-performance backplane in a chassis hosting around 40 ARM nodes, it was soundly trounced by one x86_64 node.
Note this is for a cluster system, so many x86_64 and ARM nodes are being used for the benchmark.
One x86_64 node can handle the workload of 40+ ARM nodes. Granted, you can fit 40+ ARM nodes in a 3U chassis, but that's still lower overall compute density than 1U x86_64 nodes. Simply throwing more cores at the problem doesn't necessarily give you the gains you'd expect.
While ARM is theoretically capable of better performance/watt, it's impossible to get anything resembling theoretical in a supercomputing application. Any advantage ARM has in performance/watt is eaten up by the overhead of having to use so many (slower) cores. Very few workloads scale linearly as you throw in more cores, as various overheads (MPI, network, etc.) decrease the overall efficiency dramatically.
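The scaling penalty described above can be sketched with Amdahl's law (the 5% serial fraction is an illustrative assumption, not a measured value):

```python
def speedup(n_cores, serial_frac):
    """Amdahl's law: speedup over one core when serial_frac of the work is serial."""
    return 1.0 / (serial_frac + (1.0 - serial_frac) / n_cores)

# Assume a workload that is 5% serial: 10x the cores buys less than 2x the speedup,
# which is why many slow cores lose to a few fast ones.
for n in (16, 160):
    print(n, round(speedup(n, 0.05), 1))
```

With these assumptions, 16 cores give about a 9.1x speedup while 160 cores give only about 17.9x - and real MPI/network overheads make the picture worse still.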
Currently, you can't use ARM for memory-hungry applications, as you'll hit the 4 GB limit. 64-bit ARM is promising, but it's not yet for sale.
The best performance per watt for supercomputing workloads is still found in accelerators, such as GPUs or Intel's Xeon Phi.
ARM is very promising for many datacenter type workloads, where there are a large number of unrelated, independent processes, such as a farm of web servers. (Any database backend is, however, a different matter).
While slightly OT (as it's a non-supercomputing application): What if you want to run an application that uses Java server-side? Forget it. The current ARM JVMs (OpenJDK's and Oracle's) appear to lack a JIT; the only way to get Java to show similar performance per watt between ARM and x86_64 is to disable the JIT on x86_64. This is largely a software issue, but until it's fixed, forget about Java on an ARM server.
Read the actual report and see if you really think a few chemicals could really do what you suggest -- keep the temperature steady and glowing hot for 100 hours. If so, that would be amazing... especially since the weight of the reactor did not change!
I don't just think a few chemicals could do what I'm suggesting - I know they can.
The "glowing hot" reaction was, by their own admission, a very short-term reaction, meant to "prove" that the device can generate a lot of heat - enough, in fact, to melt its steel and ceramic. That was an earlier test, and had no part in either of the tests that lasted around 100 hours. In fact, the paper gives no indication of the duration of the "glowing hot" test at all.
The 100-hour "long term" test in their report used an entirely different device, purposely run cooler. There's no evidence it's even the same design.
Chemical reactions can easily be used to melt the steel and ceramics used in the E-Cat. Thermite will melt through steel, concrete, and a few feet of dirt underneath. Thermite is self-oxidizing; the reaction is Fe2O3 + 2 Al -> 2 Fe + Al2O3. Note that the resulting compounds aren't gaseous and therefore won't exhaust into the atmosphere, so the weight before and after the reaction is the same. That's only one of many reactions that can produce the same effect.
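A quick mass balance over the thermite reaction shows why the weight doesn't change (using standard atomic masses):

```python
# Standard atomic masses in g/mol.
Fe, O, Al = 55.845, 15.999, 26.982

reactants = (2 * Fe + 3 * O) + 2 * Al   # Fe2O3 + 2 Al
products = 2 * Fe + (2 * Al + 3 * O)    # 2 Fe + Al2O3

# Nothing gaseous escapes, so reactant and product masses are identical.
print(round(reactants, 3), round(products, 3))
```

Both sides come to about 213.651 g/mol per mole of Fe2O3 - the atoms are merely rearranged, so a scale reads the same before and after.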
When the test is prepared beforehand, and/or you're not allowed to monitor the whole process, it's quite easy for a charlatan to adulterate the test.
Generating a lower level of heat for 100 hours isn't particularly difficult either, though not necessarily by chemical means this time.
Notable is the fact that Rossi would not let anyone disconnect the power cables, instead demanding they use an ammeter to "prove" the cables weren't drawing any power.
I've seen such a demonstration by an EE professor - the working meter read zero current. The purpose was to demonstrate that you really should understand how a meter works before inferring anything from its readings. Otherwise, you end up with a meter reading zero on a circuit carrying enough electricity to kill. Rossi could easily be using one of several methods to deliver power through the wire such that an ammeter still shows zero.
Putting too much trust in a measuring instrument you don't understand has been the source of a lot of scientific embarrassment over the centuries.
Similarly, Rossi will not allow any chemical analysis after the fact that would prove his claims (such as analysis for the type of copper isotopes that would result from the proposed reaction). Trying to claim that it would give away his "trade secret" catalyst is a joke: If he ever plans on selling this thing commercially, he will be required by law to provide an MSDS which details exactly what is in the catalyst.
Even Coca-Cola has to list its ingredients on the can. The difference is the laws are a bit more lenient towards "natural and artificial flavorings" in foods than they are towards industrial reagents.
I stand corrected.
TOTP is still very much outside the realm of Kim's patent.
Minor fix: Replace "Rossi does none of this" with "Rossi does all of this".
You obviously have no experience in the realm of charlatans.
In every single case of too-good-to-be-true power sources, there is a gimmick to make the results seem legitimate to the untrained.
It's little more than a magician's trick of misdirection, playing on the ignorance of the audience. There's no real magic in a magician's act; misdirection and suspension of disbelief are their bread and butter.
There's a reason charlatans won't allow people who know what they're doing to examine their apparatus. It has nothing to do with "trade secrets", and everything to do with the fact that experienced chemists, engineers, and physicists generally find the gimmick and expose their fraud in minutes.
There are a lot of ways to heat up the reactor chemically. If you honestly believe that "it can't be done conventionally," then you've utterly failed chemistry.
Simply heating a lithium-ion battery pack over 185 C starts an unstoppable chain reaction that quickly reaches a couple thousand degrees. Surround a LiPo battery with a chunk of steel, short-circuit it, and it will reach a temperature over 185 C. Then you just wait for the steel to slowly heat up. Create an array of a few of these, add copper to equalize the heat in your bank of LiPo batteries, and voila - you've got a heat source of well over 500 watts that can last for hours, or even days if you set it up right.
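Rough numbers make the point (the energy densities and bank size below are assumptions for illustration, not measurements of any actual device):

```python
# Assumptions: a LiPo cell stores ~0.7 MJ/kg electrically, and thermal
# runaway (electrolyte combustion) can release roughly 2x that again.
mj_per_kg = 0.7 * 3      # total heat per kg of cells, MJ (assumed)
mass_kg = 10.0           # hypothetical hidden battery bank
power_w = 500.0          # target heat output from the text

total_j = mj_per_kg * 1e6 * mass_kg
hours = total_j / power_w / 3600
print(round(hours, 1))   # roughly 11.7 hours of 500 W
```

Under these assumptions, a modest 10 kg bank of cells sustains 500 W for about half a day - no nuclear reactions required.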
As much as I'd like to believe that Rossi has somehow found a way to generate abundant, clean, and cheap energy, his actions are identical to those of a garden-variety charlatan: grandstanding before customers and the press, staged "demonstrations", and refusing to let experienced, external experts examine his device.
Rossi does all of this. The smell of charlatan coming off the guy is overpowering. His behavior is identical to that of a classic con artist. As the saying goes, you don't have to eat the whole turd to know it isn't chocolate.
Have you seen the price of an E-Meter these days?
After having actually read the patent, it looks like Google Authenticator, for example, is in the clear.
The patent states that the following must occur:
1.) User inputs a password
2.) Authenticating device receives the password from #1, generates a password, and sends this new password out-of-band to an external device. (Pager, phone, etc)
3.) Person then reads the password from the device
4.) Person inputs the new password into their computer
5.) Computer sends second password over to authenticating device.
6.) Authenticating device finally grants access.
Google Authenticator works differently.
1.) User inputs a password
2.) User inputs the password read from the device
3.) BOTH are sent over the network to the authenticating computer at the same time.
4.) Authenticating computer grants access.
Note that Google Authenticator does not generate the 'multi-factor' password after receiving the first password from the user.
The multi-factor password is generated on the device (pager, phone, etc.) every X seconds, independent of any login attempt.
It's an entirely different mechanism.
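For contrast, Google Authenticator implements TOTP (RFC 6238): the phone and the server each derive the code locally from a shared secret and the current time, so the authenticating computer never sends anything to the device. A minimal sketch (the secret below is the RFC 6238 test key, not a real one):

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, t=None, step=30, digits=6):
    """Derive a TOTP code from a base32 shared secret and the current time."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int((time.time() if t is None else t) // step)
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                                # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 6238 test vector: ASCII secret "12345678901234567890", T = 59 s, 8 digits.
print(totp("GEZDGNBVGY3TQOJQGEZDGNBVGY3TQOJQ", t=59, digits=8))  # -> 94287082
```

Because both sides run this same derivation independently, there is no out-of-band message from the authenticating computer to the device at all - the step the patent requires.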
Which means that my already low opinion of this guy is now lower, as he's descended into obvious patent troll territory.