No. Just no. That is flat-out incorrect. Windows got a lock on the desktop because you bought it with every computer whether you used it or not; Joe Blow the secretary or the old-school executive did not *PREFER* it to other options, s/he typically did not understand there was any alternative. And because MS has always been willing to use its position today to acquire or destroy any company that might get in its way tomorrow, of course.
"I once read a great take on organization. If you have more than ten of something, you probably need another level for ease of use, be it files in a folder, icons in a start menu, etc. I took the time to redesign my start menu in windows, and boy I and anyone else could find right where any program was, quickly."
Aren't you glad that the system *allows* you to do this manually, instead of insisting on hiding all the details and just giving you an unchangeable 'view' that enables only the most commonly used options so as not to confuse you?
By whom? Since when?
"Try setting up Kmail and you would know what I mean. "
Haven't used it lately, but I don't remember it being much different from more common GUI email apps. What are you getting at?
"The KDE developers are aware of it and now they are working on making KDE UI simpler. "
When I think of GNOME, which was once somewhat useful and usable before its developers started talking like this, a shiver runs down my spine.
"KDE usability team lead Thomas Pfeiffer prefers a layered feature exposure so that users can enjoy certain advanced features at a later stage, after they get accustomed to the basic functionality of the application. He quotes the earlier (pre-Plasma era) vision of KDE 4: "Anything that makes Linux interesting for technical users (shells, compilation, drivers, minute user settings) will be available; not as the default way of doing things, but at the user's discretion."
Ugh. *Minute user settings* are actually very important to many non-technical users. This does sound like GNOME, unfortunately.
As sites, one by one, go insane, I quit going to them.
The nice thing is, the internet is still very useful without them.
If you are tired of facebook bling and mindrot, if you are looking for the informative web that we used to have, you have only to open your eyes. Turn off ecmascript, and when you hit an address that refuses to return a web-page, just hit your back button and go somewhere else.
It's a good thing in a way. I used to have to spend some time reading to figure out that a site was worthless. Now I just notice right off that it isn't actually a webpage, and save some time.
Exactly why it needs to be nuked from orbit.
"Umm...Objective-C is the ONLY [good] way (besides Swift, which you'd hate even more) to write software for iOS devices, and the best language for programming Macs."
Neither of which is a good reason to use it, but it's actually a great language despite the failed attempt to defend it; it was the one thing on his list that did not fit.
"However, some folks still wear mullets and pine for the trash-80..."
And some of us use computers for practical reasons, rather than as fashion accessories.
In the case of Russia, the ability to obtain non-vetted or embarrassing information (like news of the invasion of Ukraine) constitutes an emergency.
"Before, you'd have the same concept spread into a dozen different systems, each only doing part of that functionality."
Which is exactly how it should be.
PID1 only needs a small subset of those capabilities to do its job. And because it is PID1, because everything that comes after has to rely on it, it's essential that it be well behaved and stable. Therefore it is essential that it have only the required set of capabilities, and that absolutely nothing else be added to or linked into it.
Other things can and should be done by other systems, not concatenated together and poured into PID1 where an error can bring the house down.
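To make the point concrete: the kernel reserves exactly one duty for PID1 that nothing else can take over, reaping orphaned children so they don't linger as zombies. Here is a minimal Python sketch of that duty (illustrative only, not real init code; a real init would loop forever, handle signals, and supervise services):

```python
import os

def reap_one_child():
    """Block until any child exits, then reap it: the one duty
    the kernel reserves for PID 1 that cannot be delegated."""
    try:
        return os.waitpid(-1, 0)   # -1 means "any child"; returns (pid, status)
    except ChildProcessError:
        return (None, None)        # nothing left to reap

# Demo: fork a child that exits with code 7, then reap it.
child = os.fork()
if child == 0:
    os._exit(7)                    # child process: exit immediately
pid, status = reap_one_child()     # parent: collect the exit status
exit_code = os.waitstatus_to_exitcode(status)
```

Everything beyond this (service management, logging, device events) can run as ordinary, restartable processes supervised from here, which is exactly why piling it all into PID1 buys nothing and risks everything.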
Right now there are 3,000 dead from Ebola. The Spanish Flu killed tens of millions worldwide just 100 years ago and civilization carried on, so I'd say there's no worries there.
He always posted this schizoid stuff. Just ignore him.
No way, man. From the ruins of Baltimore to the nuclear wastes of upstate NY... Mega-City 1.
An oversimplification. The US, UK, and allies variously broke many cipher systems throughout WWII. Still, the US benefited from this.
What if the Germans were using, say, Windows, Android phones, SSL, Gmail, Yahoo, and Skype, instead of Enigma machines?
I presume you wouldn't say it was "wrong" of the United States to crack the German and Japanese codes in WWII...
This isn't so much a law enforcement question as a question of how to do SIGINT in the modern digital world, but given the above, and given that intelligence requires secrecy in order to be effective, how would you suggest the United States go after legitimate targets? Or should we not be able to, because that power "might" be able to be abused -- as can any/all government powers, by definition?
This simplistic view that the only purpose of the government in a free and democratic society must be to somehow subjugate, spy on, and violate the rights of its citizens is insane, while actual totalitarian and non-free states, to say nothing of myriad terrorist and other groups, press their advantage. And why wouldn't they? The US and its ever-imperfect system of law is not the great villain in the world.
Take a step back and get some perspective. And this is not a rhetorical question: if someone can tell me their solution for how we should be able to target technologies that are fundamentally shared with innocent Americans and foreigners everywhere while still keeping such sources, methods, capabilities, and techniques secret, I'm all ears. And if you believe the second a technology is shared it should become magically off-limits because power might be abused, you are insane -- or, more to the point, you believe you have some moral high ground which, ironically, would actually result in severe disadvantages for the system of free society you would claim to support.
Is that the new word for 'adult?'
But you do nonetheless. My current machine was bought for one reason, price, and lacks it. When I've built my own systems in the past I have always used it. Scoping out parts to build a new one, I see the price of sane memory has gotten even further out of line than I remember.
This is one aspect of a market where the buyer does not understand the product well enough to make intelligent choices. If computer buyers understood the technology, at least 70% of them would insist on ECC, and as a result economy of scale would have eliminated the price premium long ago. Instead, manufacturers continue to skimp a few pennies on the RAM by default, creating an economy of scale advantage in the other direction, which only reënforces the bad allocation and ensures it continues.
Instead of ECC memory they should call it 'sanity-checking memory.' Maybe then people would understand what it is enough to realize they want it. But since no one in particular stands to make a windfall by doing it, no one promotes it.
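The "sanity check" is just stored redundancy that lets the memory controller detect a flipped bit and name which bit it was. A toy Python illustration using a Hamming(7,4) code, the conceptual ancestor of the SECDED codes real ECC DIMMs use (real modules protect 64 data bits with 8 check bits, not 4 with 3):

```python
def hamming74_encode(d):
    """Encode 4 data bits as 7 bits: positions 1..7 hold p1,p2,d1,p3,d2,d3,d4."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4          # parity over positions 1,3,5,7
    p2 = d1 ^ d3 ^ d4          # parity over positions 2,3,6,7
    p3 = d2 ^ d3 ^ d4          # parity over positions 4,5,6,7
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_correct(code):
    """Return (data bits, syndrome). Syndrome 0 means no error;
    otherwise it is the 1-based position of the single flipped bit."""
    c = list(code)
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s3
    if syndrome:
        c[syndrome - 1] ^= 1   # flip the bad bit back
    return [c[2], c[4], c[5], c[6]], syndrome
```

Flip any single bit of an encoded word and the syndrome pinpoints and repairs it; that automatic repair on every read is the whole pitch, and it is exactly the kind of benefit a buyer can't want if nobody ever explains it.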
The UK hasn't started killing off Scots yet, so the comparison is somewhat premature.