The number of shit guzzling conspiracy theorists here is truly shocking.
Change the "a bunch" into "alot". That's even better.
"The last company that makes lethal injection drugs, decides to stop doing it. In fact Justice Alito referred to this in recent cases - guerrilla warfare by these companies. Right. So the last company that has been providing drugs for execution, says to the Government, we are no longer going to help you out when it is time to execute somebody in Terre Haute. Can -- are they thwarting a lawful death sentence by doing that, and can they therefore be compelled under the All Writs Act to re-import something that is held abroad or release something from existing stock or actually manufacture the drug anew?"
"Burden of proof?" That's just pompous.
You're perfectly welcome to strong-arm your way into the design lab of some major phone company and look at all their failed, cracked, dented, bulky, ugly prototypes. If you can. Then line them up next to the one they shipped, and quiz a collection of bystanders on which phone they would prefer to purchase.
Your conspiracy theory does not explain this:
iOS 5: Apple introduces a dynamic scrubbing feature in their music player: while dragging side-to-side, you gain finer-grained control by dragging your finger up.
iOS 9: Apple turns the scrub bar in the music player into a tiny, barely visible pinprick on a line, with both ends extended to the absolute edges of the screen, so the bar is both much harder to see AND much harder to use.
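The iOS 5 behavior is worth spelling out, because it's genuinely clever. A minimal sketch, in Python pseudocode rather than anything Apple shipped: the farther your finger moves vertically away from the scrub bar, the smaller the fraction of horizontal movement that gets applied to the playhead. The tier thresholds and rates below are illustrative guesses, not Apple's actual values.

```python
# Variable-speed scrubbing sketch (illustrative values, not Apple's).
# Vertical distance from the scrub bar selects a speed tier.
SCRUB_TIERS = [
    (0,   1.00),   # finger at the bar: full-speed scrubbing
    (50,  0.50),   # ~50 pt away: half speed
    (100, 0.25),   # quarter speed
    (150, 0.10),   # fine scrubbing
]

def scrub_rate(vertical_offset_pt: float) -> float:
    """Horizontal-movement multiplier for a given vertical finger offset."""
    rate = SCRUB_TIERS[0][1]
    for threshold, tier_rate in SCRUB_TIERS:
        if vertical_offset_pt >= threshold:
            rate = tier_rate
    return rate

def apply_drag(playhead_s: float, dx_pt: float, dy_pt: float,
               seconds_per_pt: float = 1.0) -> float:
    """Advance the playhead by a horizontal drag, scaled by scrub rate."""
    return playhead_s + dx_pt * scrub_rate(dy_pt) * seconds_per_pt
```

The point of the design: one gesture gives you both coarse and fine control, without shrinking the bar or hiding it.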
But oh hey, it leaves more room for the cover art! The glooorious cover art that I mostly ignore because I'm listening to music and my eyes are elsewhere.
I lay this directly at the feet of Jony Ive and his design team, glorifying the cleanness of the appearance ABOVE the cleanness of the usage. Do you even use your own fucking products any more, guys? Or just lathe them into different shapes and stare at them with your chins on your fingers, hoping that the appearance of competence is what matters?
No kidding. You wanna answer emails - light on attachments - the iPad is fine. You wanna do anything else business-related, you need a "real computer".
I think there's something a lot of people are missing when they pursue a tablet, or even a laptop that converts into a tablet. A basic fact that just never occurs to them, but ends up highly influencing their level of satisfaction with the product:
A trackpad is more energy-efficient for a human than a touchscreen is.
If you are working in 20 minute intervals, OR pointing at things on a screen about the width of your hand, a touchscreen is fine. But if you are working for eight hours a day and you need to point at things on a large screen, a trackpad is MUCH more efficient.
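The screen-size point can be made more precise with Fitts's law, which models pointing time as a function of target distance and width; this is my framing, not something from the thread, and the constants below are made-up illustrations.

```python
import math

def fitts_time_ms(distance_mm: float, width_mm: float,
                  a_ms: float, b_ms: float) -> float:
    """Fitts's law: T = a + b * log2(D/W + 1).
    a = fixed cost of starting a movement, b = device-dependent slope."""
    return a_ms + b_ms * math.log2(distance_mm / width_mm + 1)

# Made-up illustrative constants: reaching up to touch a large upright
# screen has a high fixed cost (whole-arm movement), while a trackpad
# moves only fingers and wrist.
trackpad_ms    = fitts_time_ms(300, 10, a_ms=100, b_ms=150)
touchscreen_ms = fitts_time_ms(300, 10, a_ms=600, b_ms=120)
```

Over an eight-hour day of pointing actions, that per-target fixed cost compounds, which is the "energy efficiency" gap in a nutshell.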
The touchscreen aspect of the Surface Pro is fun for the casual user, but a needless gimmick for anyone doing professional work. Microsoft is just treating these two user groups as a converged set, and selling one product to both. An acceptable tactic, as the SUV has shown in the auto industry. It won't be the fastest sedan or the toughest truck - but that hardly matters to the segment of buyers who can only afford one vehicle anyway.
Speaking in practical terms, the point is not actually to make it completely impossible for anyone to access data they "shouldn't". The point is to raise the cost of access - in effort - beyond the point where it's economically viable to go after any but the biggest targets. In short: So a dedicated group could compromise you with a computing cluster... So what?
If you don't find that reasoning palatable, consider this:
Try and think of an estimate, in dollars for services rendered, to do the following:
1. Have someone steal your phone
2. Disassemble the phone and read out the contents of the flash RAM and the secure enclave (in the latter case by dropping the chip into an acid bath and manually reading the state of the bits off the traces - yes, it can be done) (remember, the password only permutes another, much longer key in the enclave)
3. Pass this info to a good-sized computing cluster
4. Dig actionable intel out with some good forensic software
Now compare that dollar cost to what you might pay some local thug to:
1. Hit you with a brick until you give up the password, or in the case of Touch ID, wrestle your finger onto your own device.
If the cost of scenario A is higher than the cost of scenario B, then problem = "solved".
Unfortunately for you, even if you come up with some epic convoluted method to render BOTH scenarios totally unfruitful, as long as scenario B works _some_ of the time they will try it _anyway_. And you will probably end up dead.
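The economics above fit in a few lines. A back-of-the-envelope sketch; every dollar figure is a made-up illustration, not a price list:

```python
# Cost of scenario A: forensic attack on the device itself.
scenario_a = {
    "steal_phone":          2_000,
    "decap_and_read_chips": 250_000,  # acid bath + reading enclave bits
    "cluster_time":         50_000,   # brute-forcing the derived key
    "forensic_software":    10_000,
}

# Cost of scenario B: rubber-hose cryptanalysis.
scenario_b = {
    "local_thug_with_brick": 500,
}

cost_a = sum(scenario_a.values())
cost_b = sum(scenario_b.values())

# The defense "works", economically, when cost_a exceeds what your data
# is worth to the attacker. But note the asymmetry: as long as cost_b
# stays this cheap, scenario B gets tried whether or not it succeeds.
security_is_economic = cost_a > cost_b
```

Which is exactly why "raise the cost of access" protects you from dragnets and opportunists, not from someone standing next to you with a brick.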
Buglers are such assholes... First they wake me up at the crack of dawn, then they crack my encryption...
Oh I see what you're doing. You're trying to say that this big anonymous h4x0r group is actually our own federal government attacking our own most successful tech industry companies, to try to get them to completely abandon the technology they're using to defend themselves from such attacks.
That's way, waaaaay stupider than what I originally thought you were saying. Because it is prima facie self-contradictory.
I suggest you go outside, sit in a nice sunny park for a while, and breathe some fresh air. That tinfoil hat isn't doing you any good at all.
Oh those wacky stooges! nyuk nyuk nyuk
What will they demand next!
Wait, they haven't demanded anything.
... but hackers and geeks stopped being the arbiters of computer-based culture at least ten years ago. Our pedantic definitions don't mean squat to a public that uses them for their own purposes.
Besides, "cracker" was already turning unfashionable when Hackers came out in '95, and that movie gave it a shove out the door. And you just can't say it out loud in the South without getting weird looks.
Congratulations! And I mean that sincerely. The next step in your transcendence is: Abandon Slashdot. It is the corner gas station.
A very _wealthy_ "niche group" of developers, too.
I noticed you completely failed to mention Java, which was hailed as the total cross-platform solution for a while until web browsers started crapping on it, and now it's synonymous with Android programming and would be decisively in the rear-view mirror of the tech industry by now, if not for that. What's funny is, Google almost had to choose it by default. What were they going to use instead? C# from Microsoft? ObjC from Apple? What else are you going to implement an entire OS in that isn't 20 years old?
If you're placing bets that Swift will dry up and blow away because Apple is due - any day now - to do the same, you're probably a little TOO old-school. You know what will die before Swift dies? In terms of popularity and profitability? C#, because its fate is tied almost entirely to Microsoft. And that's not going to die for quite a while.
Text input via voice is garbage for anything you don't already do in direct, live conversation with another human. Instant example: Mispronounce something, then try to correct it. What we need is a novel pointing device. My idea of the future tech involved: Very, very f*%^ smart radar, bounced off your skull, that tracks the location of your tongue in your mouth.
For a long time, these things will need to NOT have an obvious camera on them. The cultural zeitgeist is against it. They'll just have to do augmented reality some other way.
They sold me a phone with the RAM SOLDERED IN?
Good grief, next you'll be telling me that I can't swap out the L2 cache in my CPUs any more...
The smartphone market is NOT simply a larger version of the desktop or laptop computing market. The priorities of the consumers making it up are quite different. You're comparing apples and
And seriously, if you think the iPhone 6 or the MacBook Air is "mid to low end tech"
Oh hang on,
Computers are unreliable, but humans are even more unreliable. Any system which depends on human reliability is unreliable. -- Gilb