I mean, it's free (GPL), open source (C++ with plugins in Lua), and there are no paid accounts. Why bother looking into Minecraft when we can just build it ourselves, in a better and more original way?
But this is not UEFI Secure Boot; it's a completely different thing.
Then they will not be Windows 8 certified, may not affix a "Windows 8" WHQL sticker, and may not advertise their systems together with any Microsoft logo.
You remove it (or never have it to begin with, if you are a hardware vendor) and put your own platform key on it. For examples of how to do so, google James Bottomley's blog.
You do not need to disable UEFI in order to boot a different OS; you only need to disable Secure Boot.
You can disable Secure Boot and still boot multiple OSes (with UEFI, which almost all the major distros now support). You can then add a second key and re-enable Secure Boot, and dual boot any OS you want with Secure Boot enabled.
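For the curious, here is a rough sketch of what "put your own platform key on it" looks like, following the efitools workflow James Bottomley describes on his blog. File names and the certificate subject are made up for illustration; the efitools and firmware steps are shown as comments because they need the efitools package installed, root access, and the machine's firmware in Setup Mode.

```shell
# 1) Generate a self-signed certificate to act as your own Platform Key (PK).
#    (Names like PK.key/PK.crt and the CN are illustrative, not mandated.)
openssl req -new -x509 -newkey rsa:2048 -subj "/CN=My Platform Key/" \
    -keyout PK.key -out PK.crt -days 3650 -nodes

# 2) Convert the certificate to an EFI signature list and sign the variable
#    update with its own key (commands from the efitools package):
#
#    cert-to-efi-sig-list -g "$(uuidgen)" PK.crt PK.esl
#    sign-efi-sig-list -k PK.key -c PK.crt PK PK.esl PK.auth
#
# 3) With the firmware in Setup Mode (no PK enrolled), write the new PK,
#    which takes the platform out of Setup Mode with you holding the key:
#
#    efi-updatevar -f PK.auth PK
```

After that, anything you sign with keys chained to your own KEK/db entries will boot with Secure Boot enabled, which is how the dual-boot-with-your-own-key scenario above works in practice.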
And to answer my own question: All the Apple fanbois care, obviously.
Microsoft were fined for a reason. Who cares that google complained? They make a browser... this is sooooo non-news.
If everyone runs their WiFi APs open.
The problem is that enforcing public disclosure by the organization itself is equivalent to self-incrimination. Think about that for a second. Do you really want to put that in law? In the US, it would be thrown out immediately as unconstitutional.
As I posted before, the guidelines mention explicit timelines that should be followed: 60 days for software, 6 months for hardware.
Most likely scenarios for Security, Dick:
1) Criminality. Failure to ensure funding from reputable companies forces these folks into blackmail or abuse of disclosure process. Eventually, they end up behind bars.
2) Corrective collective: Companies never give out freebies, but well-behaved security researchers have far more fun not being chased by police and get all the chicks. This creates a role model. You should see Bruce Schneier at rave parties.
Two thoughts on your message:
1) you must hate yourself.
2) the Dutch will still love you.
The guidelines (Dutch PDF) have a whole chapter outlining the responsibilities of the organization receiving a disclosure. They include guidelines for solving the issues (60 days for software, 6 months for hardware), reporting progress back to the discloser, and allowing a discloser to report the vulnerability to a larger audience via the NCSC (government). Combined, these guidelines are an effective tool for security researchers to play by the rules and, together with others, put pressure on companies.
Researchers are encouraged to disclose to the NCSC as well, which means many security experts will be able to put pressure on companies that fail to fix vulnerabilities according to these rules.
The documents create a neutral middle-man organization that can mediate between companies refusing to cooperate and disclosers. It effectively puts irresponsible companies directly in the line of sight of the government and thus legal action. What's not to like?
Being a native Dutch speaker, I read the entire guidelines in Dutch. They include disclosure terms to encourage companies to rapidly fix issues (60 days) and to make agreements with the discloser about the disclosure.
This is already common and rather well-accepted practice. So, in essence, the document encourages public disclosure. Any company that wishes to ignore the vulnerability will have their asses handed to them anyway, so this guideline actually helps: security researchers can use it to show companies that they are acting in good faith, as long as companies play by the same rules.
So personally, I highly encourage governments to do something like this.
This Dutch variant is interesting in the sense that it creates a possible middle man that can mediate and monitor the disclosure. This protects disclosers, and puts more pressure on companies to abide by these standards. Not the other way around.