Excellent responses. Makes my EE-heart thump a bit faster reading things like this and Eben certainly delivered. All credit due... BRAVO!
I've been doing this IT thing for a long time. A very long time.
I don't think there is an IT expert/admin on Slashdot who would attest that--if given the job to engineer/configure an email server for Secretary of State (much less, merely private citizen) Clinton--this server was in any way designed or implemented properly. Not for security, not for compliance, nothing.
So... am I to believe that Hillary Clinton is so woefully incapable of finding a competent IT engineer/admin? Here is ALL OF SLASHDOT. Am I to believe that? Because, if so, she's woefully incompetent for ANY governmental position; I don't believe she should be in any position of power that directly impacts me or my freedom. And anyone who supports her at this point, in this community, given what is so obvious to see about her character and her intentions, either has to be insane or has to be seen as complicit in her and her "party's" power grab. It is that simple.
As documented in the book AppleDesign [https://amzn.com/1888001259], Apple's Industrial Design Group prototyped just such machines in the mid-to-late '80s and early '90s. Too bad that Apple has failed to use its own in-house design history for inspiration. I really expected to eventually see a Mac mini that incorporated the stacking concept; instead, they basically emasculated it.
With "cloud computing", the Internet of Things, and concepts like the Intel NUC, Arduino, and Raspberry Pi already upon us, modular computing makes a lot of sense. When users have much more personalized devices like iPhones and iPads, big "contained" boxes of computers don't make much sense. Instead, clusters of computing and storage, invisibly inter-meshed with cloud resources, will take over and deliver "computing" much as electricity is delivered through a mesh of power plants big and small, solar panels, batteries, and generators. As with power, the cost and bandwidth of the delivery pipes compared with localized need will determine the "network".
Same here. The 128Kbps MAX is a joke unworthy of a $5-a-month "penalty". I especially like how AT&T tries to wrap the same $5 fee as Verizon's "Safety Mode" as "We're not doing what Verizon is doing. We don't charge you!" Uh huh. And both of them sit on this 128Kbps thing like a badge of honor. 128Kbps is all but unusable. I'd like to see Randall Stephenson do a John Legere and live for a week on 128Kbps data. If he can do it for a day, I'll get off my high horse. (He wouldn't be able to, so, no worries there.) At least Verizon says 128Kbps is what you should expect; AT&T pulls a mealy-mouthed "128Kbps max". Like... what? 2Kbps? 1Kbps? How 'bout we drop into the "human readable bitstream" speeds... Are those possible, acceptable speeds, AT&T? Jerks. 2 steps forward, 3 steps back. Always with these guys.
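For anyone who hasn't lived on a throttled connection, here's a quick back-of-the-envelope sketch of why 128Kbps is "all but unusable" (the 2 MB page weight is my own assumption, roughly typical for a 2016-era web page):

```python
# Back-of-the-envelope: how long does a ~2 MB web page take at 128 Kbps?
THROTTLE_KBPS = 128        # the carriers' throttled "max" speed
page_mb = 2.0              # assumed average page weight (my assumption)

bits = page_mb * 8 * 1_000_000             # megabytes -> bits
seconds = bits / (THROTTLE_KBPS * 1000)    # bits / (bits per second)
print(f"{page_mb} MB at {THROTTLE_KBPS} Kbps is about {seconds:.0f} seconds")
```

That's over two minutes per page load, best case, before any protocol overhead. "Usable" indeed.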
Go back through the transcripts of EVERY Quarterly call and Keynote/Product/WWDC speech Tim Cook has given since he took over the helm from Steve Jobs...Cook has said, in pretty much the SAME TERMS, EXACTLY this same line. Every. Single. Time.
And what have we gotten?
* iPhones with bigger screens: Something the Android manufacturers had been doing for a few years before Apple, and something that would have been trivial for Apple's engineers to do. (In fact, we know from various reports and Isaacson's Jobs biography that Apple had been experimenting with MANY screen sizes for years.) Chassis gets thinner, gains an unsightly camera bulge that Apple would have laughed at a year before if it had shown up on a Samsung, battery life stays pretty much the same: inadequate.
* iPads with a smaller screen, and a bigger screen: see above
* an iPad Pencil: neat. (Had a pressure sensitive Wacom in 1994, so can't really get THAT excited. Everything old is new again, I suppose.)
* the Apple Watch: another response to another nascent Android-industry first; a product that, a year after launch, still does nothing appreciably better (and a lot worse) than the Android-ecosystem units.
* a Mac Pro: a "pro" computer that debuted to long manufacturing delays, replacing a "pro" computer Apple hadn't bothered to update for 3 years, and which itself hasn't been updated in over 18 months. Uh huh.
* Retina 27" iMacs: neat. Expensive.
* Retina 21" iMacs with 5400rpm 2.5" spinning drives, glued shut, and no expandable RAM: Uh huh.
* the one-port MacBook: charge your laptop or charge your battery-life-barely-decent iPhone. But not both. Charge your laptop or use an external monitor/projector. But not both. Or, buy this $80 dongle that weighs a quarter as much as the whole laptop. Uh huh. Oh, and EXPENSIVE. MUCH more $$$ than the ChromeBooks that K-12 is now buying... didn't the MacBook use to be an Ed Market target??
* a Mac Mini: same as the last, minus $100, with a 5400rpm 2.5" spinning drive and no expandable RAM. Slower than the previous year's model. Uh huh. Now 18+ months old.
* no new Cinema Display. In fact, no new Display from Apple in several years. Despite improvements in Thunderbolt. (Oh, yeah, reminds me about that Mac Pro again.) What should I plug my MacBook Pro into again when I'm IN the office? Ahhh, a crap-ass HP or Dell monitor, gotcha.
* Bugs. Bugs. Bugs. Bugs. Bugs. At least 4 iterations of iOS and OS X that have each taken nearly 6 months to reach an acceptable level of "stability". Yet many people STILL can't seem to get Mail on OS X to display their messages correctly. Or notify them of new messages correctly. Or show messages in the correct folders. Or even show messages at all. Because email is "new", I guess. iOS updates that brick brand new iPhones' radios. iOS updates that disable hardware features. iOS updates that disable Wi-Fi. Bugs. A lot of bugs.
* Swift: cool. World can't have enough languages.
* Apple Music: I think I've seen this service before.
* iCloud: I think I've seen this service before. (Oh, and before you ask...NO, you can't merge your old iTunes account with your new iCloud account yet.)
* Apple Pay: don't know a single user who actually, uh, uses it.
* new versions of iTunes. Yeah. I'm just going to slowly walk away now...
I'm sure I'm missing something. But I really DO await all this magic sparkle fairy unicorn dust that Tim Cook is expecting Apple to fart out later this year. And next year's magic sparkle fairy unicorn dust fart will even be better! No doubt. Because he SAYS so.
Meanwhile, I spent 2 hours of my life today troubleshooting various Apple bugs for clients that Apple blames on every thing but Apple. Known issues. Apple software. Clearly...CLEARLY...not "a bug". "You're holding it wrong." Riiiiiight. I've been an Apple user since 1983. I sold Apple gear from 1989 until 1996. I worked for Apple in the early '90s. I've been working on Apple gear since. I have a hint for you all: that magic sparkle fairy unicorn dust they're farting out? Sometimes, at best, it is just hot air. You should HOPE it is just hot air. The downside is much worse.
(Look up the word 'hubris' in the dictionary.)
Not excited for the 4-inch iPhone, but glad it is "back" in the lineup. I was disappointed that Apple stupidly went bigger--after having spent so much time talking shit about big devices--without also hewing to their prior conviction that ~4" was the "best" size. Apple either HAS design credibility or it does not. Human hands haven't changed significantly since the iPhone 5 shipped, and their iPhone design philosophy was either right or wrong. Can't have it both ways. So this is rectifying a mistake; can't get excited about that. Plus, no 3D Touch, which sucks.
Jack saying that 140 characters is staying might, I'm wondering, be one of those deft Steve Jobsian maneuvers where you say what people are listening for, but aren't actually saying what you're planning (and thus don't actually ANSWER the question).
I wouldn't mind Twitter sticking to 140 characters for tweets as they appear in the Feed. In fact, I tweeted Jack my suggestion:
- 140-character Tweets would stay. You could continue to tweet 140 characters at a time, OR the 140-character tweet could also be a Summary Tweet that expands out into a Super Tweet.
- A Super Tweet would be 500 characters (something like that). The Summary Tweet would show in the Timeline with a "more..." expander widget. Users could choose to subscribe to a Feed or an Expanded Feed, and that would determine how much of the Tweet the feed is sent. If you subscribe to an Expanded Feed, you get the 500-character tweets in your timeline without having to make a call-back to Twitter to "load more". For normal feeds, you'd have to wait while the tweet expands. 140 characters can just be too limiting a lot of the time, and anywhere from 200 to 500 would be a welcome expansion. The user would be able to edit the Summary Tweet "part" of a Super Tweet before sending.
- Finally, Twitter should support a Long-form Note. This wouldn't be that difficult: basically, some standardized manner of linking to a blog post or article, which might be hosted at Twitter or not. The nice thing is that they could implement Notes the way Jack has already been using OS X's Notes app: use an image representation and attach it to the Summary Tweet/Super Tweet that links to it.
That doesn't mean 140 has to go away; just as Jack says, it will stay. But that wouldn't have to mean Twitter can't expand the kinds of Tweets as well.
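The Summary/Super Tweet scheme above could be sketched as a tiny data model (all names, limits beyond 140/500, and the "[more...]" rendering are my own hypothetical choices, not anything Twitter has announced):

```python
# Hypothetical sketch of the Summary Tweet / Super Tweet proposal.
from dataclasses import dataclass
from typing import Optional

SUMMARY_LIMIT = 140   # classic tweet / Summary Tweet limit
SUPER_LIMIT = 500     # assumed Super Tweet limit

@dataclass
class Tweet:
    summary: str                     # <= 140 chars, shown in normal feeds
    body: Optional[str] = None       # Super Tweet text, sent to Expanded Feeds
    note_url: Optional[str] = None   # optional link to a Long-form Note

    def __post_init__(self):
        if len(self.summary) > SUMMARY_LIMIT:
            raise ValueError("summary exceeds 140 characters")
        if self.body is not None and len(self.body) > SUPER_LIMIT:
            raise ValueError("body exceeds 500 characters")

    def render(self, expanded: bool) -> str:
        # Expanded Feed subscribers get the full Super Tweet inline;
        # normal feeds get the summary plus a "more..." expander.
        if expanded and self.body:
            return self.body
        return self.summary + (" [more...]" if self.body else "")
```

The point is that the Feed-vs-Expanded-Feed choice lives entirely on the subscriber side; the tweet itself just carries both representations.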
My thought too. Nothing stopping him from taking any other iPhone 5c, setting Auto-Erase to on, and proving his hacking team's prowess on YouTube for the world to see.
(The exception is that the court order doesn't actually reveal what specific iOS version the iPhone is running. The FBI strongly implies it is running iOS 9.something, but doesn't clearly say so. Which I find a bit suspicious; they spewed out a lot of other info about the device.)
I await reading the FBI/DHS memo detailing the alert on the threat of the Sweet Meteor of Death hitting the 2016 Super Bowl. I mean, come on, the odds are about the same as for this.
I don't have a problem with Windows 10. Overall I like it, much better than 8, and clearly more "futuristic" than 7. Free is a great price.
I am also very impressed with the "new" Microsoft under Satya Nadella. The company has done things I'd NEVER have imagined they'd do, GOOD things... SMART things. Windows 10 being FREE was one of those things. There have been a few rocky issues, some high-profile like the OneDrive storage space snafu. But overall, I've been impressed. The open source initiatives are just mind-blowing coming from Microsoft.
But this thing RIGHT HERE... THIS has been a fucking mess. Abject "What the fuck??" failure. First of all, people have stuff to get done, and small businesses often work on cycles. This thing is happening RIGHT SMACK IN THE MIDDLE of Tax Season in the US. Any idea how rickety the software that runs tax prep is? Trust me, this stuff isn't Win7 material. There are A LOT of small, independent tax preparers in the US. A LOT. And they all use Windows. And they're all getting nagged like crazy right now. I know, I'm getting the calls. They're not the only ones. QuickBooks Pro users, CRM users, and the list goes on. They can't afford this, not now, and they're not on Windows Home... they PAID for a Pro product to support OTHER "pro" software which is more important to their income stream.
It is bigger than that, even. Because Microsoft is nagging people running Win7 on hardware that just maybe SHOULD NOT be on Win10. Core Duo CPUs, Intel chipsets without driver support. And there is no opt-out. No way to even say, "Hey, thanks for the offer, Microsoft, but I'm just going to let this hardware which is running just fine on Win7 die with Win7." There is NO WARNING that Win10 will be incompatible with networking and wireless drivers, so that users' laptops will disconnect from the network after sleeping EVERY. SINGLE. TIME. There are NO WARNINGS that touchpads won't have similar levels of driver support, so people used to touch-tapping and driver-cobbled 2-finger dragging lose that. Nope. Nothing. And no way to simply say "This equipment just isn't ready and probably never will be... thanks, but please stop nagging me." And those aren't little-known vendors, mind you: that's Intel! Synaptics! Broadcom!
And worst of all: Microsoft is pushing this upgrade onto sometimes-ancient hardware, the vast majority of it running on the backs of 5-year-old, 5400rpm spinning platters from the sub-terabyte generation, WHICH HAVE NEVER, EVER--NOT ONCE--been backed up. Suuuure, you get that 30-day restore window. Yeeeeeaaaaaah. Good luck with that. More spinning and intensive read/writes to sectors never tested or touched.
So, WHAT THE FUCK, Mr. Nadella? Why? Just let users, especially Windows Pro users on older hardware, have a reprieve. Make it a year. Make it two. I don't care. But YOUR CUSTOMERS need the option to permanently stop the incessant nagging. You owe them THAT MUCH RESPECT for their business.
AI won't be our biggest problem, it will merely be a stepping stone. The biggest problem facing humanity is the collapse of the informational time line. In other words, data time travel.
"WTF?" I hear you saying. "Whacko." OK, OK. But hear me out... Einstein et al. are pretty sure that moving matter across spacetime, especially backwards along the time line, is unlikely without some pretty extreme technology. AKA likely impossible. However, at the quantum level, moving information may not be that difficult, thanks to quantum entanglement surviving "time displacement" (maybe even black hole event horizons). Surely AI will help to accelerate the research into these areas. And that will culminate in the ability to communicate with the future. And the future being able to communicate with the past. All that needs to be done is to construct the "radio". The future will do the rest and send back blueprints for improvement. Even if the humans aren't willing to do it over a shorter future span, the computers would likely have little emotional concern about doing it... after all, what is time to them but energy burnt toward a computational goal (one that has probably already been computed in the future)? Once the channel of communication is open, it will be as leveling as the Internet across "space" today.
So, the "Singularity" defined as the merging of human and AI is less likely to be as impactful as a "Singularity" defined as the complete crushing of the Time Line. All of human knowledge, nay ALL knowledge--human and AI--will suddenly be known, instantaneously (or very nearly). Parallel computing across both space AND time. Short of the Sweet Meteor of Death, of course.
[I have no interest in voting for a socialist as President. Just not my politics. Also there is also NO WAY I'd vote for Hillary Clinton. NO WAY. But...]
After all the political snafus and screw-ups that the Democrats have been involved with in the past 30 years, one thing is clear: NO ONE ever gets fired. Ever.
So: Bernie Sanders helmed a campaign that FIRED someone. I humbly submit that if you're trying to decide between the two, and don't want more of the same from this f'd-up political system, Bernie should DEFINITELY get your vote.
When I first read this, gotta admit, I kinda thought "Whaaaa?"
But it hit me. This will be an EXCELLENT opportunity for Yahoo! (YAHOO!? Yahoo!? Yahoo?) to finally move past an 800lb elephant standing in the corner...right over there...IN THE CORNER!
C'mon. "Yahoo!" was fun back in the '90s. That was the internet then. But it doesn't carry the same reverence now. And this move provides a chance. If Yahoo were named something... BETTER... maybe some of its issues would be different. Sure, getting a DOMAIN NAME is going to be a pain in the ass, but they do still have some money. Whether it's prepping for acquisition or trying to move forward, moving away from that name has been a (mostly) unmentioned problem for a long while now.
And with competition like BING!, Google/Alphabet, "-i-" Cloud, and DuckDuckGo, I think they CAN do better.
(However they need to NOT bring the folks into the decision-making circle who worked on the logo redesign; clearly they're idiots.)
The wording in the initial post is incorrect; Lightning is NOT "functionally compatible" with USB Type-C. And AC, by your definition, carrier pigeon and telegraph are "functionally compatible" with what we're discussing. Lightning is not even physically compatible with the Type-C implementation of USB 3.1, which is but one of several communications systems Type-C carries over its 24 pins; Lightning offers only USB 3.0 channel compatibility. (USB 3.1 uses two USB 3.0 channels and requires 8 conductors on its own.)
Basically what law enforcement and prosecutors have done for 100 years
They are called computers simply because computation is the only significant job that has so far been given to them.