Also, you can use your mouse with elinks. Even right-clicking works: open in a new tab, background downloads.
The ribbon really isn't the same thing as a flat look. The two are totally independent of one another. Take a look at this screenshot, for example: Windows 7 with visual styles turned off—no doubt familiar to anyone who's managed a recent Windows Server or used RDP. It's still full of the newfangled conveniences you loathe, despite being cast in traditional 90s bezels.
This is the point where the holier-than-thou crowd says you should know all the hotkey combinations for everything if you want to be efficient. The shortcuts themselves never changed, but the menus that used to display them next to each command are gone, which makes them a nightmare for anyone who wants to learn them in post-ribbon Word.
Almost all of the biggest offenders were iOS apps, so if you never had an iPhone you were spared the vast majority of incidents where skeuomorphism caused problems. Ideally, you're right, skeuomorphism should be helpful, but many designers used it to create the illusion of quality by borrowing images and textures from physical objects that they perceived as being valuable. Here is a thorough breakdown of the nausea of the era.
It's a little sharper than that: the current generation of interface designs was a direct reaction to the previous decade's tradition of absurd skeuomorphism. The moment Steve Jobs died, Apple did an about-face and started following Microsoft's Win8/Modern/Metro UI lead. It may look like a step backward to those who came from the Windows 2000 and Gnome 2 era, since there's a loss of visual cues, but the flatness of current interfaces is way better than what the classics became in the post-Windows XP era: bloated, overdesigned pseudo-real objects cluttered with mismatched shadows and conflicting perspective angles. You couldn't tell what was a button there, either! At least now there's consistency and a return to the actual use of design guidelines.
That said, there are still a lot of cases where literacy in idioms dominates: for example, the largely inexplicable convention of swiping sideways on a list to reveal 'delete' or 'edit' buttons in mobile apps. That's probably where you and the UX designers run into the most difficulty. But two decades ago, every "how-to-use-a-computer" class targeted at seniors started with how to operate a mouse—so, as I think you've already recognized, it's important to try to take these things with a grain of salt, and recognize that no one is completely objective when it comes to understanding the culture of computer operation.
The question is: what should they do for a refresh? They've been waiting for processors from Intel, but it almost looks like the bad old days of the PPC at the moment, with Intel dialling right back on improvements. I mean, an i7 from five years ago is still a pretty good chip, all things considered. It's hard to sell new computers to people who don't need them, and I know from my history of Macs that three years is far too short a time for me to get maximum value out of them. More like six, in fact. My current laptop is two years old, I consider it virtually brand new, and I won't be looking to upgrade it for quite some time to come.

Apart from bumping the RAM and putting in a new HDD to replace a failed one, all my Macs have been virtually sealed units, so I don't mind the current state: with the lack of upgradability comes reliability. I've had problems with machines in the past where I needed to reseat the RAM to get them to behave, but that's not the case any more. Dead HDD? A built-in SSD solves that, and at 500GB it is plenty big enough when allied to external storage as needed.

As for the design? Why mess with a classic just because a few years have gone by? I like that I can buy a new Mac and in a few years it will still look and generally act like a new Mac (a few minor cosmetic features may differ, but overall it looks the same). That may not excite people who constantly want new stuff, but I like it. I certainly don't like PCs, which change models frequently and become hard to maintain because the specific parts are no longer made for that model, and I don't like Windows, which is a ghastly mess that doesn't know if it is a tablet or a desktop; at least the few things macOS has picked up from iOS are subtle, and I don't really use them anyway.

Maybe people are refreshing their PCs after holding off due to Windows 8 and finally accepting Windows 10, but for Mac users who just got Sierra there's still no need to upgrade unless the machine is really old.
Not all of us have forgotten; that is still one of the reasons why I do not buy anything from Amazon.
Damn, you have a lower UID than me. Hey, on Slashdot, that's a pretty good test of credibility!
Seriously, I get bothered a lot by mixed-model software, as it's very hard to keep straight what the programmer intends. It's worse for the compiler: the notion that compilers magically work it all out is just that, a notion. Stacks don't manage themselves, and optimization, which has always been one of the black arts, isn't simplified by ad-hoc paradigm mixes.
When it comes to using a single paradigm, always choose the one that makes robust code the easiest to write. Forth is still used because there's a lot of hardware programming that is far, far easier with stack operations than procedures or objects. No matter what the current generation of whippersnappers think.
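To make the stack-operations point concrete, here's a toy RPN evaluator, a sketch in Python rather than real Forth, with a tiny hypothetical word set (real Forth has a dictionary of hundreds). It shows why "do this, then this" hardware-style sequences fall out naturally once the stack is explicit:

```python
# Toy RPN (Forth-style) evaluator -- a sketch, not real Forth.
# Words operate directly on an explicit data stack.

def rpn(program):
    """Evaluate a whitespace-separated RPN program; return the final stack."""
    stack = []
    binops = {"+": lambda a, b: a + b,
              "-": lambda a, b: a - b,
              "*": lambda a, b: a * b}
    for word in program.split():
        if word == "dup":                     # Forth DUP: copy top of stack
            stack.append(stack[-1])
        elif word == "swap":                  # Forth SWAP: exchange top two
            stack[-1], stack[-2] = stack[-2], stack[-1]
        elif word in binops:                  # pop two operands, push result
            b, a = stack.pop(), stack.pop()
            stack.append(binops[word](a, b))
        else:                                 # anything else is a literal
            stack.append(int(word))
    return stack

print(rpn("3 4 + dup *"))   # (3 + 4) squared -> [49]
```

No procedures, no objects, no named temporaries: the intermediate value just sits on the stack until the next word consumes it, which is exactly the shape of a lot of register-poke hardware code.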
So, the question reduces to this. Is there a class of problem that is better done with aspects? If so, you need it the same way you need Forth. And even Ada.
I travel a lot and never use roaming. Most of my stuff comes over the network anyway so I just make sure I have plenty of data. Last time I visited the UK I bought a SIM from 3 for £20 from a machine which came with unlimited calls, text and data. What I didn't realise at the time was it would also work almost anywhere in the world. When I went over to Denmark it connected to 3-DK and worked fine there, Sweden, yep, USA it switched to T-mobile and then I ended up in NZ and it connected to 2degrees. The SIM only worked for 4 weeks but boy did it work.
I want him to roll in the additions from Cilk++, Aspect-Oriented C++ and FeatureC++, the mobility and personalisation capabilities of Occam Pi, the networking extensions provided by rtnet and GridRPC, full encryption and error correction code facilities, everything in Boost, and a pointless subset of features from PL/1.
If you're going to do it all, might as well do it in style.
Seriously, though, Aspects would be nice.
PGP with a normal email client does nothing to protect your "metadata", i.e. who you are, who you communicate with, the subject line, date, etc. All you can do is use TLS/SSL and hope that the email servers communicate with each other encrypted without NSA backdoors (i.e. they have a copy of the TLS/SSL private key).
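A minimal Python stdlib sketch of that point (the addresses and subject are made up, and the ciphertext is a placeholder, not real PGP output): even when the body is fully encrypted, the serialized message that crosses the wire still carries From, To and Subject in the clear.

```python
# Sketch: PGP encrypts only the body. The RFC 5322 headers travel as
# plaintext unless each SMTP hop is itself protected by TLS.
from email.message import EmailMessage

msg = EmailMessage()
msg["From"] = "alice@example.org"             # placeholder address
msg["To"] = "bob@example.net"                 # placeholder address
msg["Subject"] = "quarterly numbers"          # visible to every relay
msg.set_content("-----BEGIN PGP MESSAGE-----\n"
                "...ciphertext placeholder...\n"
                "-----END PGP MESSAGE-----\n")

wire = msg.as_string()                        # what actually crosses the network
print("Subject: quarterly numbers" in wire)   # True: the metadata is plaintext
```

Anyone tapping the link, or any relay in the path, reads the headers without touching the PGP layer at all.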
Ok, I missed that option and probably others off my list. Gets complicated fast and only a full time research assistant has a hope of mapping it all out.
Yes there is. It's not a right-left test, but there's a near-perfect match between gender and specific neurological features. In numbers far higher than chance would predict, people who think of themselves as female have female brains in structural and functional studies. Likewise, people who believe themselves male have male brains.
I try not to get too annoyed at dogmatic statements, but unless I specifically defer, I have a comprehensive archive of published literature from high-standing sources. Don't rip on me unless you know either my interpretation is wrong (it happens) or you plan on publishing a peer-reviewed rebuttal on each particular of relevance.
The first of those has happened a few times. Let's see if you can bring it up into double digits. Feel free, but remember that you're dealing solely with the articles' facts and my interpretation. Where I used other sources, pick any peer-reviewed paper that covers the same basic aspect of brain development (e.g. that neuron type is indicated by its chemical transmitter, not hardwired into the genome). It doesn't matter if it is the one I used or not. Falsify it. Better yet, falsify it and get the scientist or magazine to retract it for further work.
Ok, you should now be at the point where you accept the data sets I used. That just leaves two options. If the seat of the mind is in the brain, then a female brain must have a female mind, regardless of Y chromosomes, appendages and birth certificates.
The only other option is to falsify that, to argue that the mind is independent of brain. If you choose this, please choose to announce it at a medical school outside the brain surgery department after a very taxing practical, shortly before exams. Contrary views are nothing to worry about.
Finally, you can just let the basis and the chain of reasoning stand, but then you have to accept the conclusion.
Let me know your preference.
Manning should get a full pardon and a medal of honour. She has done more for this country than Biden ever did, and that was after a forcible deployment against the regimental doctor's orders.
The worst thing Manning is truly guilty of is exploiting severe violations of DoD regulations by the unit she was in. Those violations, not her actions, are what compromised national security, as did Manning's superior officers. Those people were under strict orders not to deploy the severely mentally ill into Iraq and to withdraw clearance from such folk, but violated those orders in order to look good. That is a serious crime. A crime they, not Chelsea, are guilty of.
Under DoD regulations, computers holding top secret information may NOT be secured by just a password and may NOT support USB devices. I was working for the military when they did the cutover from passwords to passwords plus Class III digital certificate on a smartcard. The USB restriction has been there more-or-less from the introduction of USB, as it violates Rainbow Book standards requiring enforceable multi-level security.
I should not have to point this out on Slashdot, half the three digit IDers were probably involved in writing the standards! And the rest know all this because we had to look the bloody stuff up to get the NSA's SELinux working!
She was also under orders, remember, to ensure that no war crime was concealed by the military. Concealing a war crime, even if that's your sole involvement, is a firing squad offence under international law. Has been since the Nuremberg Trials. Nor is it acceptable to be ordered to carry out such a cover-up. You are forbidden from obeying such orders on pain of death.
Those are the rules. The U.S. military's sole defence is that nobody is big enough to enforce them. If someone did, the U.S. population would be noticeably smaller afterwards. We know that because of Manning.
But Manning's service doesn't end there. Military philosophers, tacticians and strategists will be poring over those notes for decades, running simulations to see when, where and how the U.S. was eventually defeated in Afghanistan and Iraq. They will compare actions carried out with the military philosophies the U.S. officially abandoned in favour of modern theories. They will search for ways in which the new approaches worked and where they should have stuck with the traditional.
Because modern computers can run millions, even billions, of tactical simulations in just a few hours, it is certain that, inside of a decade, someone will have done this and published a book on where the military went wrong and where the Taliban and Iraqi army went wrong as well. This core material allows for that.
These wars may turn out to be our Sun Tzu Moment, when through cataclysmic defeats at the hands of, essentially, barbarians (and make no mistake, they're defeats), a systematic analysis of all that went wrong will be conducted in order to produce a guide on how to have things guaranteed to go right.
Without Manning's data, this couldn't happen. Direct footage, real-time tactical information, logistics, international political interactions: there's enough there to actually do that.
I'd prefer it to be us, because nothing stops the next terror group that forms from performing the same study. Historically, it has been shown that a smart army can defeat a confident opponent with superior technology and ten times the numbers, or with inferior technology and a hundred times the numbers. There's no reason to assume these are hard limits.
If it is us that figures it out, the Pentagon (still fixated on Admiral Poindexter and his psychic warriors) won't be involved; it'll be people on the outside with more nous and fewer yes-men. And for that, Manning deserves the highest reward.
Besides, it'll annoy the neoconservatives, and that alone is worth its weight in gold-pressed latinum.
Adding features does not necessarily increase functionality -- it just makes the manuals thicker.