Quartz Extreme is just the compositing part - like Xgl. Quartz 2D Extreme is the actual "draw widgets on the GPU" part
No, it isn't. Seriously, I've not seen any real evidence of this.
Move a window. See how the window lags behind the cursor? (Or the cursor lags behind the window, I don't remember.) That's one. Now scroll in either Chrome or Firefox. Less than 60fps. Enable compositing - now you've lost vsync with any kind of accelerated video (unless you enable that ugly hack in the Intel drivers, which cuts performance in half). And that's just off the top of my head.
If anything it's one of the best performing systems out there
Even the current developers of Xorg disagree with you
On some cards with the "glamor" driver, all 2D operations are done on the graphics card using shaders. Never mind the EXA and XAA systems, which also used the older 2D acceleration hardware.
That's exactly ONE driver - Intel. And glamor is a HACK. It does a double-reacharound to do what wayland does by default (as does any other sane windowing system), while adding the X protocol cruft with all its stupid extensions on top.
Yes it does and you're just making shit up.
No, I'm not. First it was XAA, which did shitall. Then it was RENDER, which supposedly did what glamor does now. Oh, and let's not forget about EXA, UXA and the other three-letter acronyms that were supposed to "fix it". I read Keith's blog regularly; I remember his benchmarks for Intel from when he worked on SNA and glamor (too lazy to link to his benchmark, but you can use Google). And let's get this straight: all those "acceleration" extensions had exactly one purpose - going AROUND the X protocol, because it was designed for LINES, not bitmaps.
So using shaders to do 2D stuff is a "hack" in your book? How does that make any sense whatsoever?
No, doing 2D in shaders is exactly what you want. Gluing that to the X protocol is patently stupid and counterproductive. It's a stopgap until the Linux desktop adopts wayland.
Did OS X get some hardware-accelerated features first? Yes, but it didn't get hardware-assisted rendering of any sort first.
Actually, it did.
OS X 10.4 (Quartz 2D Extreme) - 2005
Windows 7 (Direct2D) - 2009
Linux - about now-ish, and STILL not fully adopted. Unless you can point me to a data source that claims otherwise. And no, hacks like glamor (which got released a fucking month ago!) or Xgl don't count (the latter doesn't really matter anyway, since it was just compositing)
While I hate systemd (Gentoo user here), it only takes a minute of playing around with Weston to notice that we should've been using this for a long time. Xorg IS old, slow, buggy and deprecated by all standards.
And this is exactly why OS X is superior in that regard. They've had GPU-assisted rendering (first compositing, then full UI rendering) since what, 2005? I remember it was around 10.4 on PPC. We have 10.10 now. Windows only figured it out on Vista and up. Linux STILL doesn't have it (Xgl was just compositing), unless you count hacks like glamor.
X11 is neither antiquated nor stale
Which is why the Linux world is so eagerly jumping on the wayland/Mir bandwagon, right?
TextMate would be one fine example that I use; you also have Sublime (granted, it's multiplatform) and Coda if web development is your cup of tea.
And then you can have pretty much ANY of the Linux text editors if you wish, as GP said.
Very well, let them. Nobody in Michigan can afford one anyway.
Because it was a really complex architecture, which even today's PCs can't emulate correctly and accurately
Too bad two out of the three OSes you mentioned aren't "locked down"
Too bad Chrome is becoming less of a browser and more of an operating system in itself. The emacs of web browsers if you will.
Not to mention it has become unbearably slow as of a while ago. For me, every time a website starts doing some DOM operations, it just stops dead in its tracks, does that, then resumes rendering. Very noticeable when scrolling. So much so that I switched to Safari for the time being. I still run Chrome in the background for the apps (Hangouts, Play Music). I wish they'd just fix it.
Which is why I wrote "not ready yet"
Not to mention the whole
The original WRT54GL had a cult following, but in retrospect it was a pretty poor OSS router. The wifi driver was binary-only and heavily tied to Broadcom's kernel tree. It was a start, however.
Nowadays we have OpenWRT, which IMO is the pinnacle of SOHO router software - up-to-date kernel, upstream OSS drivers, and a kickass config system, all contained in a ~6MB firmware file.
Now, to answer the question: you want to stick to Atheros/Qualcomm-Atheros chips and make sure the router is supported by OpenWRT. If you have those two things, you absolutely can't go wrong.
My suggestion is most TP-Link stuff (except for the newer Archer C series - it's just not ready yet), or the Atheros-based Netgear stuff (WNDR3700v2 or WNDR3800 if you can still get them). Stay the f*** away from Linksys and D-Link; Asus seems nice, but they keep using Broadcom chips, which are extremely poorly supported by OSS software.
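For reference, the config system in question is UCI: every setting lives in plain-text files under /etc/config, editable by hand or via the `uci` command. A minimal sketch of a wireless config, with the radio name, SSID and key all made up for illustration:

```
# /etc/config/wireless (illustrative values only)
config wifi-device 'radio0'
	option type 'mac80211'
	option channel '11'

config wifi-iface
	option device 'radio0'
	option mode 'ap'
	option ssid 'example-ssid'
	option encryption 'psk2'
	option key 'changeme'
```

The nice part is that one syntax covers network, firewall, DHCP and wifi alike, so scripting the whole router config is trivial.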
Macs don’t ship XQuartz by default anymore (since 10.5, if memory serves right)
Aren’t they also the ones who limit their firmware updates only to customers who have support contracts? I guess you get what you pay for..
I'm a network engineer myself, so I get the points you're making, but you have to realize: to Nintendo, these are TOYS, not software you have to perpetually support. It has a shelf life, which ran out some time ago. Online wasn't even that big a part of the console anyway (compared to what you have with Xbox Live or PSN), so it doesn't matter if it fades into obscurity.
Oh, and neither the console nor the games could be updated. It's different on the 3DS now.
The DS didn't really have an operating system. Each game would ship with the "driver" for the wifi chip, and yes, the game would talk directly to the chip. No space (or need) for a network stack.
Since when does MTV have music back on it?