Well, you must also know the HTML entities, even in plain text mode... writing æøå doesn't work, but &aelig;&oslash;&aring; works. In this case &micro; doesn't work though. And I think all languages have Unicode support good enough to strip control characters and shit if you're not lazy. My impression was that it was more to sabotage the ASCII "art" than anything else.
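For what it's worth, both tricks are a couple of lines in most languages; a quick Python sketch (the regex range is my own pick of what counts as a control character, keeping tab and newline):

```python
import re

# Non-ASCII text as numeric character references - the entity trick above
refs = "æøå".encode("ascii", "xmlcharrefreplace").decode("ascii")
print(refs)  # &#230;&#248;&#229;

# Stripping control characters (keeping tab, newline, carriage return)
CONTROL = re.compile(r"[\x00-\x08\x0b\x0c\x0e-\x1f\x7f]")
print(CONTROL.sub("", "hello\x07world"))  # helloworld
```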
Well, sometimes they make convenient little assumptions about the write amplification and other things in coming up with that number. It's also the number they use for warranty claims, so it may not reflect the kind of endurance you'd normally expect. The latest trick is to basically use part of your drive as a semi-permanent SLC cache and only write it to MLC/TLC NAND later, if ever, so what you actually get will depend on your usage pattern. If you just keep on rewriting a small file it'll probably never leave SLC at all, while if you use it as a scratch disk, filling it up with large files and emptying it, you'll hit the MLC/TLC hard. The rating is just to give consumers who don't want an in-depth look something to relate to.
Personally my first thought was: if they can deliver us an MLC drive at 45 cents/GB, doesn't that mean they should be able to deliver us an SLC drive at 90 cents/GB? That's not outrageously much, it's considerably faster, and it should have all the endurance you'll ever need. That said, TechReport got 3 (out of 6) consumer drives they've written >1 PB to, so I'm guessing most drives fail from something other than NAND exhaustion. And I don't reinstall my OS disk every day... I just checked and I've used up 50 of my 3000 P/E cycles after 150 days of 24x7 running, so at this rate it should take 25 years.
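Sanity-checking my own numbers above (a trivial projection, assuming the wear rate stays linear):

```python
# Linear projection of SSD lifetime from observed P/E wear
rated_cycles = 3000   # rated endurance of the drive
used_cycles = 50      # cycles consumed so far
days_elapsed = 150    # 24x7 uptime

years = (rated_cycles / used_cycles) * days_elapsed / 365
print(f"projected lifetime: {years:.1f} years")  # ~24.7 years
```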
I know people who turn on their computer maybe 2-3 hours a day on average, just streaming, no heavy media usage. Any SSD will last them forever; it's all about $/GB. Now if you want a guess: they said 5000 P/E -> 3000 P/E (60%) going from 25nm to 20nm MLC, so I'm guessing 3000 * 0.6 = 1800 P/E for 16nm. And TLC is probably more like 500 P/E, though this drive doesn't use that.
How is this an advantage to anyone who plans ahead? I suppose if you wrote your original application in Objective-C and weren't thinking about cross platform support, then fine. But if you're planning on supporting both platforms why don't you just go completely cross platform and use C?
In other words pretty much exactly what some tried to say when Google first launched Chrome, except for OSS zealots who were blinded by their Mozilla support and the "don't be evil" slogan.
For Google open source is not a goal, it's a tool. Google funded Mozilla to run a browser war by proxy; as an open source, non-profit organization Mozilla could get massive support from organizations and volunteers that Google never could, and a much higher tolerance of bugs and broken functionality. And I mean that both with respect to internal bugs and to web sites broken by MSIE-only code. As a means to an end - pushing a standards-compliant web for Google to profit from - it was a success.
With Android Google again used open source as a battering ram against an entrenched monopoly, this time against Apple in smart phones. Once again a host of unlikely allies - pretty much everyone except Apple and Nokia, really - jumped on board along with the open source rah-rah and low cost clone manufacturers looking to get a free ride. That you could have things like CyanogenMod and get root on your phone was new - even though some manufacturers blocked that it was a step up from the all-closed platforms.
I'm not saying those are bad things, but those mutually beneficial interests come to an end. Once we've been released from the old stranglehold, Google wants to make a new one with themselves in control. I don't think I can make a catchy acronym for it like embrace-extend-extinguish but it goes something like commodify-bundle-obsolete:
1. Commodify the functionality through open source
2. Bundle it with Google APIs/services
3. Let the open source version languish in obsolescence
Search results are still a major driver of Google's revenue. The default search engine is set by your browser, and the default browser is set by the platform, so from their perspective pushing both Android and Chrome makes perfect sense - if you're using a Google product you'll never be pointed anywhere but a Google service. Chrome is also a vital part of that "all-or-nothing" bundle Google is selling to make companies use Google Play, which is now their second cash cow.
Firefox is no longer a partner against MSIE, they're a threat against the OHA bundle. If you can take AOSP and install Firefox with no further strings attached, that fills one of the many pieces you'd need to replace. The fewer alternatives you have, the more power Google has over the Android ecosystem. If you're still stuck in the mindset where MSIE had 95% market share you'll fail to see that your one-time ally is no longer on your team. They're on their own team, as every for-profit company eventually ends up being.
Possibly. But the short-term social disruption would not be something I'd like to witness.
And since the 'short-term' in this case is probably 'a generation or two', I'd have to be a witness. (Or dead.)
All it takes is a couple of people who 'aren't infected, just look' (there are a few days of little-to-no symptoms) to bribe some official to get on some plane or past a border check. We're a significantly more interconnected world today than even a hundred years ago - you don't need rats to spread things widely.
It's not a pandemic - yet. But it wouldn't take much for it to be one, and it would be major.
Now you know why I started to learn Mandarin a few years ago - yeah, I've accepted that I won't get far on Danish alone, and more people know Mandarin/Hindi/Spanish than English.
Native speakers, yes. But whether you're in China or India or Spain, the most popular second language is English. Functionally you're much better off, because at almost any tourist destination you'll find somebody speaking English, while Mandarin is great if you go to China and pretty much useless everywhere else. English has a presence in Europe (UK + EU really), North America (USA), Asia (India), Oceania (Australia) and Africa (several former colonies). I'm not going to argue the moral side of colonialism, just say that practically it's the only language with global reach.
That's kinda impressive - from experience, there aren't all that many Americans that "do English well".
The quality of the English version is what it is. The quality of the non-English version is what it is plus all that was lost in translation; it's certainly not going to be better. The worst is when they move standard shortcuts around, for example in MS Office all English versions have Ctrl-F as Find and Ctrl-B as Bold. In Norwegian Ctrl-F = Bold (Fet) and Ctrl-B = Find (Finn), and I absolutely hate it every time. And yet in the interest of sanity they do keep other English shortcuts like Ctrl-S = Save (Lagre), even though that makes no sense in Norwegian. Never mind that when you're working with code or databases there is no Norwegian C# nor SQL, so it all ends up rather Norwenglish when you try.
Don't get me wrong, I'm fond of my language when it comes to identity and culture. But when it comes to communication, having global terminology and one way of doing things makes everything so much simpler. Yes, there's a whole lot of "English" speakers out there, but any semblance of a common tongue beats trying to use translators. It's something of a first world issue though, as 16% of the world is still illiterate in their first language, but I hope that in 100 years you could talk to at least half the world's population in one language.
Try stealing one and claiming the US government doesn't own them - I think you'll find yourself in jail. Any takers who'd like to bet otherwise? I think in practice this is resolved already: what you bring back to Earth is yours. The fun part is that nobody has mining rights; if you find a big gold vein there's nothing stopping another country/company from dropping a mining rig right next to yours.
It takes a long time to compute the size of 20 files when a division by 1000 takes 300-odd cycles on a 10kHz machine. It doesn't take such a long time when a right shift by 10 takes 1 cycle.
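To make the difference concrete (Python for brevity - the point is the operation, not the language):

```python
size_bytes = 51200

# "KB" via a right shift by 10: a single cheap instruction even on
# ancient hardware, because it just drops the low 10 bits
print(size_bytes >> 10)    # 50

# "kB" via decimal division: a long multi-cycle routine back then
print(size_bytes // 1000)  # 51
```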
This must be the most clueless post about the 1000/1024 divide so far. It never had anything to do with the computer's performance; it's that when you build a digital computer a lot of things will be powers of two, because what you can address with n bits will be 2^n. Physical memory, memory pages, caches, buffers, floppy and hard drive sectors - all the "microunits" in the computer are powers of two. Hint: no actual hard drive gives you 1MB = 1000000 bytes, because that's not divisible by 512; in reality they give you 1954*512 = 1000448 bytes so they don't underdeliver. Actually make that divisible by 4096 for modern HDDs with 4K (as in 4096, not 4000) sectors.
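That no-underdelivery rounding is just a ceiling division; a sketch with both sector sizes (the 512-byte figure is from above, the helper name is my own):

```python
def advertised_mb(sector_size, mb=1_000_000):
    """Smallest whole number of sectors that still delivers >= 1 decimal MB."""
    sectors = -(-mb // sector_size)      # ceiling division
    return sectors, sectors * sector_size

print(advertised_mb(512))   # (1954, 1000448) - matches the number above
print(advertised_mb(4096))  # (245, 1003520)
```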
There is a single reason why computer scientists usurped the prefix kilo, and that is because they needed to describe "one thousand and twenty-four bytes" - or multiples of that - very, very often. They needed a shorter name; they never needed the unit "1000 bytes", and so "one kilobyte" became their shorthand for 1024 bytes. And unless you're really good at doing math in your head, tell me how much is seven kilobytes exactly? (And if you answer 7000 I'll slap you.) We still say 512GB of RAM. Nobody wants to say 549.755813888 GB of RAM - multiply that by a billion and that's how many bytes it is. It's not some nice, round number.
Either way you're going to run into some f*cked up conversions if you mix GiB and GB, which I'll use now for clarity. If you have 512GiB of RAM (hey, servers do) and load 512GB from disk, how much of your RAM have you used up? Now while you're calculating that, this other person who uses a GiB system says: so that was ~477 GiB, so ~35 GiB free? Or you have to say you have 549.8 (rounded) GB RAM and use exactly 512 GB. Of course in reality file sizes are probably a rather random size, so you'll have two long floating point numbers. At least with base 2 you just have one, because you have exactly 512 GiB RAM.
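Spelled out, since the point is how ugly the mixed-unit bookkeeping gets:

```python
GIB = 2**30
ram_gib = 512                    # 512 GiB of RAM
loaded_bytes = 512 * 10**9       # 512 GB loaded from disk

used_gib = loaded_bytes / GIB
print(f"used {used_gib:.1f} GiB, free {ram_gib - used_gib:.1f} GiB")
# used 476.8 GiB, free 35.2 GiB

# Or express the RAM in decimal GB instead - equally ugly
print(ram_gib * GIB / 10**9)     # 549.755813888
```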
And when you do have base 2 numbers, multiplication/division gives other nice base 2 numbers: 10 MiB split into 2 KiB chunks = 5120 chunks. 10.485760 MB / 2.048 KB = how much? It's a lot uglier when your numbers are 2^n values, which again they will be a lot of the time - at least far more often than base 10, as long as you're working with the computer itself and not business data or whatever. If you for example want to make something fit in L3 cache to optimize an algorithm, the numbers will be in base 2. You can't "bugfix" your way out of that.
Well, I seem to be both right and wrong: it's 42 bits from the camera, but it's losslessly compressed, so an actual RAW file is still around 70 MB/photo (listed under cons) and the card does hold 7,000+ photos.
So the highest-MP camera I could find in a normal store is 40 Mpix (Pentax 645D) * 14 bit RAW = 70MB/picture. So good for 7,000+ photos. Or the Panasonic HC-X1000 4K/24 & UHD/60p camera just released, at 150 Mbps = 7-8 hours of continuous recording. But I suppose it's good for when you want to carry 10 BluRays in your phone. Whoops, wrong format, not microSDXC. I guess there's a niche for this since they made it, but I kinda fail to see the target market, unless it's the "give me the biggest and best you got" crowd.
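Spelling out the arithmetic (assuming a 512 GB card, which is what the 7-8 hour figure implies):

```python
CARD_MB = 512_000                 # 512 GB card, in decimal MB

# Pentax 645D-style RAW: 40 Mpix at 14 bits per pixel
raw_mb = 40e6 * 14 / 8 / 1e6
print(raw_mb, CARD_MB // raw_mb)  # 70.0 MB/photo, ~7314 photos

# Panasonic HC-X1000 at 150 Mbps
hours = 512 * 8e9 / 150e6 / 3600
print(round(hours, 1))            # ~7.6 hours of continuous recording
```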
Well, if you want Kickstarter to be your go/no-go decision maker then you can't wait so long that you're already pot committed, as they'd say at the poker table. If you're half finished and your Kickstarter fails, what do you do? Throw away all that work and start over on something else? Try to salvage it and publish something, even if it has lackluster appeal? Not to mention you must then go it alone - and if you already know you can finish it alone, do you really need Kickstarter? My impression is that Kickstarter works best when your "selling points" aren't your product but your reputation and history. I donated fairly big to the Musopen project because there was quite a bit of history to show that yes, they're serious about creating free music but lacked the funds to do it. I'd be very wary of people with just Photoshop and PowerPoint skills.
Well, even if you got most people to agree this is a bad thing, there's also the small problem of what line of action would ultimately make things better and not be horribly much worse in the meantime. It'd be an awful shame if it ended up being "meet the new boss, same as the old boss" as they too are corrupted by the establishment - now you have a third party whose politics you feel are nuts as well. Or perhaps I should say that the other way around: chances are they'd have to sell out to get to power, otherwise they'd be fought tooth and nail every step of the way. There would be a massive gloom and doom campaign about all the horrors that would bring, and people get scared. Even if what they have isn't fair, the world isn't unicorns and rainbows and it could get a whole lot worse.
For example, many of the people in the Arab Spring toppled a secular dictator in the hope of freedom and democracy and ended up with extreme religious fundamentalists taking over the country. Granted, it probably wouldn't come to that here, but even if we take the Civil War that ended slavery and whatnot, there were a whole lot of injured or dead soldiers and civilians - people who had lost someone, or all their belongings or property - who'd rather have wished they'd let the Confederacy go. It's easy to point out the flaws in the old system. It's hard to make people believe in change. It's much harder still to make change. And no, I'm not trying to point out one politician in particular; they all tend to backtrack on their promises once they get into office. What is going to make people believe you're actually going to be different?
I also think you have to consider Google's risk/reward here. They wanted to pass the test, and most of all they did not want to get involved in any accidents - even one where the car was driving technically correctly but "unhumanly", as that would be used in no small amount of FUD. They passed, they got their license to drive, they got the PR, and the news that they couldn't drive a roundabout two years ago is nothing compared to the bad PR they'd get for crashing in a roundabout.
And the railroad thing was just policy, it can't be that much harder than crossing a priority road. Or maybe they just hadn't bothered to implement that logic, since almost all crossings in central areas have signals/gates. I just checked here in Norway and all public roads now have that, there are ~3100 unsecured crossings but all on private roads.