I pay $45/month for Comcast's 25Mbps Internet (which may be called Performance Plus or Blast depending on the market I guess). I get 30/6 on speedtests. I also get free HBO Go because that $45/month includes a tv box for basic cable that is still shrink-wrapped sitting in the corner somewhere.
Flash is useless on my 192dpi laptop. Everything is tiny, or sometimes the content only fills the top-left 25% of the box. Adobe never seems to care -- https://bugbase.adobe.com/inde...
If you're self-employed, have investment income, or claim asset depreciation, you probably already do your taxes with a real CPA. If you aren't already, you probably should.
UTF-16 is terrible, yes, but Windows does support it properly. I'm sure naive programmers write bad code by assuming UCS-2, where every character is 2 bytes, but surrogate pairs -- e.g. the emoticons at U+1F600 - U+1F64F -- work just fine.
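For anyone curious, the surrogate-pair math is simple enough to sketch in a few lines of Python (the constants come straight from the UTF-16 spec; U+1F600 is just a convenient example):

```python
# Sketch: how a codepoint above U+FFFF becomes a UTF-16 surrogate pair.
cp = 0x1F600               # GRINNING FACE emoticon
v = cp - 0x10000           # leaves a 20-bit value
high = 0xD800 + (v >> 10)  # lead surrogate: top 10 bits
low = 0xDC00 + (v & 0x3FF) # trail surrogate: bottom 10 bits
print(hex(high), hex(low)) # 0xd83d 0xde00

# Python's built-in codec produces the same pair:
assert '\U0001F600'.encode('utf-16-be') == b'\xd8\x3d\xde\x00'
```

Code that assumes UCS-2 sees those as two separate 2-byte "characters", which is exactly the bug being described.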
And by "out of luck" I was referring to possible future codepoints above U+10FFFF. UTF-16 can only reach that high by using surrogate pairs; it has no way to represent anything higher, whereas UTF-8 could easily be extended with 5- and 6-byte sequences.
I'm pretty sure most font systems already DO do this. In fact, this was the reason I rooted my Android phone: I wanted to change the font-fallback order so that certain kanji would display with a Japanese font instead of a Chinese one. An example is http://jisho.org/kanji/details... which is drawn completely differently in Chinese fonts, to the point where Japanese readers would not recognize the symbol, yet both are supposed to be represented by the same codepoint, because they're considered the same character.
But anyway, fonts and display aren't a character-set encoding issue. It doesn't matter how you represent the glyph on disk or in memory; if your fonts are all missing a rendering for the character, you're going to see a placeholder box no matter what.
The official spec limits UTF-8 to U+10FFFF to help it play nice with UTF-16, so no 5- or 6-byte sequence is valid anymore. There aren't any characters defined above U+10FFFF yet anyway. But if those ranges are ever defined in the future, it would be easy for programs using UTF-8 to utilize those characters. If you use UTF-16 like Windows does, you'd be out of luck.
The font issue is a silly thing to worry about. The same thing can be said of ASCII and Windows-1252. I'm sure lots of early fonts, and probably even some you find today, claim to support all glyphs in Windows-1252 but are missing the Euro sign at codepoint 0x80, because it was added later on. Even for a small character set capped at 256 characters, things change over time, and fonts don't always keep up.
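You can see that late addition right in modern codec tables; a quick Python check, purely as an illustration:

```python
# Windows-1252 maps byte 0x80 to the Euro sign, which was only added to the
# charset in a later revision -- fonts made before that simply lack the glyph.
euro = bytes([0x80]).decode('cp1252')
print(euro, hex(ord(euro)))  # € 0x20ac

# Plain ASCII stops at 0x7F, so the same byte isn't even valid there:
try:
    bytes([0x80]).decode('ascii')
except UnicodeDecodeError:
    print('0x80 is not ASCII at all')
```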
The answer is UTF-8. It's pretty much the de-facto character set now. It's backwards compatible with ASCII, and it could easily be extended in the future to support possible U+200000 - U+7FFFFFFF codepoints, since the original UTF-8 specification included that range anyway.
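The byte-length progression is easy to see for yourself; a quick Python sketch using a handful of sample characters:

```python
# Sketch: UTF-8 byte lengths today (RFC 3629 caps sequences at 4 bytes).
for ch in ('A', '\u00E9', '\u4E2D', '\U0001F600'):
    b = ch.encode('utf-8')
    print(f'U+{ord(ch):04X} -> {len(b)} byte(s): {b.hex()}')
# U+0041 -> 1 byte(s): 41
# U+00E9 -> 2 byte(s): c3a9
# U+4E2D -> 3 byte(s): e4b8ad
# U+1F600 -> 4 byte(s): f09f9880

# The original spec also defined 5-byte (111110xx) and 6-byte (1111110x)
# lead bytes reaching U+7FFFFFFF; re-allowing them would be straightforward.
```

Note how the ASCII character comes out as its plain one-byte ASCII value -- that's the backwards compatibility.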
An important point is to not mess things up and end up with CESU-8 like MySQL did. There are completely valid 4-byte UTF-8 characters, so don't create some special alternate "UTF-8" by artificially capping it at a max of 3 bytes per character.
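To make the difference concrete, here's a rough sketch of what CESU-8 does to a 4-byte character (`cesu8` is a hypothetical helper written for illustration, not MySQL's actual code):

```python
# Rough sketch of CESU-8 for a supplementary character: each half of the
# UTF-16 surrogate pair is pushed through the 3-byte UTF-8 pattern
# separately. This helper only handles characters above U+FFFF, where
# every UTF-16 code unit is a surrogate.
def cesu8(ch):
    out = bytearray()
    units = ch.encode('utf-16-be')
    for i in range(0, len(units), 2):
        cu = int.from_bytes(units[i:i + 2], 'big')
        out += bytes([0xE0 | (cu >> 12),          # 1110xxxx
                      0x80 | ((cu >> 6) & 0x3F),  # 10xxxxxx
                      0x80 | (cu & 0x3F)])        # 10xxxxxx
    return bytes(out)

print(cesu8('\U0001F600').hex())           # eda0bdedb880 -- 6 bytes
print('\U0001F600'.encode('utf-8').hex())  # f09f9880 -- the real 4-byte UTF-8
```

Those `ED A0-BF ...` sequences encode surrogates directly, which real UTF-8 forbids outright, so the two forms don't even round-trip through a strict decoder.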
I've hiked in the backcountry for a week in Airplane mode with MyTracks recording just fine the whole way. MyTracks can also save your KML or other formats to your SD card for easy access last I checked.
Mozilla's refusal to implement PPAPI and Adobe's refusal to fix bugs in the NPAPI version of Flash are going to cause a lot of problems in the future. For example, right now, the NPAPI version of Flash can't handle HiDPI properly (Retina displays, or Windows at non-96dpi). Websites that use Flash for video or text display are already unusable if you have a 192-dpi screen, and it's only going to get worse as HiDPI becomes more common and people start using even higher-DPI devices.
Meanwhile, the same websites display just fine in both Chrome (PPAPI Flash) and IE (ActiveX Flash).
Time and time again, they just refuse to properly implement CSS3 gradients.
Version after version, no progress on https://code.google.com/p/chro... at all
See http://slashdot.org/comments.p... from version 38.
It's pretty clear at this point: use Firefox or IE10+ if you want good HTML5/CSS3 support. Chrome only cares about what benefits Google and its ability to advertise to you.
I think Firefox is the only good browser, and the only one people should be using. It renders the best, has the best adblock support, and is as secure and privacy-respecting as possible.
As a web developer, when all I care about is how the site renders, I want people to be using Firefox, or at least IE10+. Using Chrome or Safari is like using IE9: it sort of works with modern HTML5/CSS3 design, but degrades to a crappier, sub-par look due to missing support for the CSS3 features I want to use.
But I guess Chrome is a lot better than IE8 and below -- though at that point we might as well start comparing against Netscape 4, so it's best to just forget about anything that old.
That's a great excuse: IE6 sucked, so Chrome 38 might as well still suck. Yeah, CSS3 support is scattered all over the place, but there's a small set of really useful core features that just about every browser supports, and that are particularly useful for webpages. Pretty much everything in that small list is supported by IE10+, Chrome 10+, and FF 4+. Sometimes support requires vendor prefixes, but it still works. Except gradients on Chrome: as of version 38 you still can't make basic angled striped patterns for backgrounds, or smoothly blend two colors over a large distance.
And sorry, if you're talking about security, let's talk about privacy too. Google is at the point where I'd rather trust Microsoft with my personal information, so that's a huge strike against Chrome -- and I'm not really trying to advocate IE here. Firefox is pretty much where people want to be, especially given how much better adblock support is there.