I also found OpenWRT to be closer to a standard Linux distro, with a package manager and a fully modifiable filesystem. Much easier to work with than DD-WRT, IMHO.
I switched to a TP-Link N900 with OpenWRT. I've found it very stable.
As someone else pointed out: "Seven leading Internet firms"... and AOL.
Who's still using AOL, or still paying for it and actually using their service? I'm sure I read somewhere that a large percentage of their subscribers are unaware that they no longer need an AOL subscription to get online via broadband.
Also, how are companies supposed to filter the web effectively if everything is HTTPS? DNS filtering is, in general, too broad a brush. We may not like having our web access filtered, but companies have a legal duty to ensure employees aren't exposed to questionable material, even if it's on someone else's screen. Companies have been sued for allowing this to happen.
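To spell out why domain-level filtering is such a blunt instrument: with HTTPS a filter only sees the hostname (from the DNS lookup or the TLS SNI), not the full URL path, so blocking happens per host, all or nothing. A minimal sketch, with hypothetical domains and block lists of my own invention:

```python
# Illustrative only: made-up domains and rules, not a real filtering product.
BLOCKED_DOMAINS = {"questionable.example"}           # domain-level (DNS/SNI) rule
BLOCKED_URLS = {"https://mixed.example/bad-page"}    # path-level rule: needs to see the URL

def dns_filter(url):
    """Domain filtering: all-or-nothing per host (all a filter sees under HTTPS)."""
    host = url.split("//", 1)[1].split("/", 1)[0]
    return host in BLOCKED_DOMAINS

def url_filter(url):
    """Path filtering: precise, but with HTTPS the path is encrypted away."""
    return url in BLOCKED_URLS

# A domain block also takes out every innocent page on the same host:
print(dns_filter("https://questionable.example/harmless-faq"))  # True (over-blocked)
# A path rule would be precise, but an HTTPS filter never gets to apply it:
print(url_filter("https://mixed.example/good-page"))            # False
```

The trade-off in one line: HTTPS pushes filtering from the second function up to the first.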
Your heart bleeds for them, doesn't it.
"Sauce for the goose" and "screw them" leap to mind (and maybe a boo hoo).
I've had to deal with MS locking out Unix-based platforms for the last couple of decades; they've had this coming and deserve tons more!
Many real-world applications use RHEL as a workstation OS because the ABI/API doesn't change, it's supported, and there are no major changes through the life-cycle. This is why Red Hat specifically sells a Workstation version of RHEL. See also the existence of Scientific Linux (a RHEL clone) and why it is used on workstations at every major particle accelerator.
Too many people view Linux through the prism of their home-machine needs. Professional users need things like support, stability, regression-tested updates, directory services, NFS (probably secured, even if only to satisfy an auditor), and interoperability with (sadly) MS systems (AD, Exchange and even (the horror) SharePoint). Bleeding-edge functionality is worth nothing against stability.
When someone (many people, actually) says "use Fedora", that is exactly why this isn't the year of the Linux desktop. No company wants to reinstall its desktop estate every six months, retest all its apps every six months and retrain its users every six months. No software vendor wants to retest its software on a new release every six months.
The RHEL approach (a seven-year life cycle) is correct for most users. Google is wrong not to support this, but that's probably more to do with Google not really having to care about corporate Linux desktop users (a pretty small base, really).
Maybe it will happen, but it seems not so likely this time...
These guys just can't see any way that MS could lose in a market they enter, just as they have almost never lost for the last 30 years.
Others sharing this delusion would be Nokia and former HP management (who ditched their own, better tablet OS).
1/ The wavelength of light is much larger than the structures used in modern chips, so optical circuitry can't be as dense as modern electronics.
2/ When you turn off an optical signal it gets turned into heat (i.e. the "transistor" goes black); that's not true in electronics, where there is no current flow when a transistor is off. This would make a theoretical optical device run hotter than an electronic one, hurting density yet again.
Optics and optical processing have their place (especially in processing communications data), but for high-density processing these two issues are serious obstacles.
Press Release is here:
Detailed timelines are here: