Most of the time I don't WANT to save. I'd be damned pissed if my disk was writing shit I didn't want it to.
Why is the file's type (meta data for the file) any more important than a music file's bit rate, or a jpeg's compression ratio? All of these things are just meta data. They should be held as meta data behind the scenes, not conflated with a file's name.
The name is also meta data.
Your move, dipshit.
Anyone who's ever updated a Wi-Fi enabled GoPro knows about this.
When I last did it, the website gave me 2 methods for doing the update - the dummy version where you give them your serial, network name, and password and they spit out the file with the plaintext Wi-Fi password for you, and the not-so-dummy version where you handle your own shit. I don't know if that's changed, but the end result is the same - most users send and receive plaintext network passwords to GoPro and anyone who wants to can update their GoPro Wi-Fi password by booting it with that (modified) update file in the root directory of the SD card.
Further, who gives a fucking shit? The range on the GoPro's Wi-Fi is so short that someone within Wi-Fi range is a few steps away from physical access anyway, and you only ever use the Wi-Fi when you're actively using the GoPro - you would know immediately when someone connected to it and fucked with it.
I can read and understand C.
C++ on the other hand is an indecipherable bizarro language that makes only half sense 90% of the time and only 90% sense the other half of the time.
It's worse than Old English vs. English.
That moment when you realize that fervor, hysteria, and blue balls all describe the same condition.
Those things are not the same at all.
OS pirates far outnumber Steam pirates.
The Star Wars reference was wrong.
My "fear mongering" was not.
Within weeks of my post, we saw just how ill-prepared the US was for Ebola. We saw failing protocols for medical workers, an utter lack of protocols for medical workers in other cases, an inability to quarantine infected people, an inability to effectively track exposed people, etc.
I was proven very fucking right. The post I was responding to said such a thing was impossible in a nation such as the USA.
Keep on tryin', though!
Star Wars took place long, long time ago. So what is this 1973 you are talking about?
The quote is "A long time ago in a galaxy far, far away....".
The top 3 pieces of advice, ever:
3: Think about what you want to do before you do it.
2: Use the proper tool for the job.
That link talks about 11.5, makes claims with no evidence, and admits to using the ICC compiler.
We KNOW the ICC compiler is rigged. http://yro.slashdot.org/story/...
Here's one where the cheating was exposed when leaked benches for new Macs surfaced before Cinebench had been updated to take them into account.
Those scores require a bit of context, though. The 32-bit build of Geekbench uses x87 code, for starters, so it isn’t optimized for any of the other instruction set extensions that Westmere-EP or Ivy Bridge-EP support. Getting close to Apple’s claim of doubled floating-point performance requires software compiled with the AVX flag. John Poole, the founder of Geekbench, posted several other reasons why the next-gen and previous-gen Mac Pros might be separated by such a narrow margin.
The leaked result was run using the free 32-bit build of Geekbench on a pre-release build of OS X Mavericks. Switching over to the paid 64-bit build of the benchmark adds SSE support, though that’s still a pre-Pentium 4 extension. Tab between the 32- and 64-bit runs on Xeon X5675-based systems and you’ll find that the SSE-capable build averages 14%-better performance.
Curious as to how the very same 12-core Xeon E5-2697 V2 compared in Windows, I ran my own test on a 64-bit build of Geekbench and scored in excess of 30,000 points—more than 25% faster than the leaked number. The individual sub-tests showed both Xeon E5-based platforms trading blows in the integer and floating-point components, but clearly a more real-world comparison was needed in order to establish the new Xeon's performance in a workstation environment. Fortunately, I have the upcoming Xeon E5-2697 V2, the upcoming Core i7-4960X, an existing eight-core Xeon E5-2687W, and a Core i7-3970X.
This kind of thing isn't exactly a revelation. Benchmarks have been tainted by Intel and the ICC for ages. The real problem is that a lot of actual software is as well, so in the end the artificially-gimped performance reflected in the benchmarks translates to actual usage. Even among fairly-compiled programs Intel's parts typically maintain an IPC advantage, but it's nowhere near a degree that would justify the cost difference. Add in nVidia's moneyhatting and GameWorks bullshit, and you've got AMD taking it from both ends. This sort of thing should piss you off regardless of what brand you prefer because it stifles competition, increases prices, and retards progress.
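For anyone who hasn't followed the ICC saga: the compiler's runtime dispatcher keyed optimized code paths off the CPUID vendor string rather than the actual feature flags. A deliberately crude sketch of that pattern (vendor strings are real; the function and path names are mine, and a real dispatcher would query CPUID instead of taking a string):

```cpp
#include <string>

// Simplified illustration of vendor-string dispatch: only "GenuineIntel"
// gets the vectorized path, regardless of whether the CPU actually
// advertises SSE/AVX support. An honest dispatcher would test the CPUID
// feature bits instead of the vendor ID.
std::string pick_code_path(const std::string& cpuid_vendor) {
    if (cpuid_vendor == "GenuineIntel")
        return "sse4";       // fast, vectorized routines
    return "baseline";       // AMD, VIA, etc. get the slow generic code
}
```

Run the same binary on an AMD chip with identical SSE4 support and it silently takes the slow path — which is exactly why ICC-compiled benchmarks need that disclosure.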
The button is bloat. It bloats the UI by default. It bloats the overall package.
The code to insert the button into the UI by default is bloat.
The code to pop up a message telling people how great it is is bloat.
The code to support the standard is bloat.
The "standard" itself is bloat.
I want a build of Firefox without this shit or any of the shitty recent additions like it. Disabling it (or however much of it you can) isn't a fix because it's still resident, still takes up storage space, still needs to be updated when there are security issues, still wastes time and money in development, etc.
They don't fucking understand what people liked about Firefox.
Hint: Fast, flexible, extensible, lean, secure, reliable.
Over the past few years it's gotten relatively slower, less flexible, fatter, less secure (largely as a result of all the new insecure bloaty shit they add in), and less reliable.
Hell, just last month I had to do a complete wipe of Firefox because the fucking internal database was broken as shit after some update and the "Awesome Bar" wasn't pulling from any history past the date it broke (even though all the subsequent history entries were present). I can no longer tell Firefox to remember my browsing history but not to remember the download history. I can't tell Firefox to tie into the standard cert store on any OS, making site-wide management a pain in the ass (you have to use their own half-working, half-documented command line tool and maintain a separate copy of all certs) and making trust pinning nearly impossible. I can't do half the shit I used to be able to do with Firefox because some fucking asshat at Mozilla decided that I didn't need to.
It was on my toolbar with an annoying popup as of 35.0.1.
I believe this shit was introduced in the last version. And yes, it's absolutely useless fucking shit that shouldn't be in a fucking browser.
This is NOT the same thing. In fact, it's closer to the exact opposite. Infants are born with still-developing immune systems, and honey contains botulinum spores, which are capable of germinating in subjects with weak or still-developing immune systems. This is a proven risk that presents itself *every time* an infant eats honey, and there's a ton of downside if the risk is realized. On the other side of the coin, there's absolutely no upside to feeding an infant honey while their immune system is still developing the ability to attack these spores.
False. Infants like honey. There's your upside.