Most schools (and corporations, even) with private networks require you to re-register your adapter when your MAC changes. So it wouldn't really work.
That's what I'm saying. Cheers!
That happened to me with a 10' $5 cable from Monoprice. Out of desperation after I broke it the other day, I bought a 3' $5 cable from the bargain bin in the cell phone accessories department at a big store I usually don't go to... last resort, I know, but it couldn't wait until the next day. And that bargain-bin cable works just like the OEM cable.
UI design is good when it enables you to be efficient at a task, and this one does. I agree there. However, I think his point is more of a UX point: how do you learn other than just hitting things to see what they do? There's no manual, no alternative noob mode with text, no hover-for-tooltip, and no help file. Users shouldn't have to enlist the aid of another user.
User thoughts might include: Is this going to break something? Can I undo it? I'd better not touch it. How do you Google an icon? (You don't.) Why is that there? Where is the menu? Where is the list of things I can do here? I don't want to break anything. It's scary/frustrating because I don't know how to use it, and there's no guidance.
Eventually, enough of these negative/confusing emotions pile up and the thought "Ah well, back to iPhone" forms. I've seen it firsthand with several people who tried to switch but went back because they kept having thoughts like that throughout the return period on their Androids.
So you see, it's a fairly basic UX misstep that could be avoided with some sort of hint. It's really unfortunate that it happens all over the Android ecosystem, because I like Android and want to see it succeed... but they're still shooting themselves in the foot.
It's when you get two Cheetos that have fused together during manufacturing.
You jest, but I'm sure I've rolled my face around on my keyboard and produced a Perl script that does that.
Yup, especially restaurant menus, which are *always* in PDF. It's frustrating when you're on mobile and just want to see the menu before you commit a large party with diverse dietary restrictions to going somewhere.
It's like your sig says.
Trolling is a art.
Go ahead, try to sing "Jumpin' Jack Flash" accurately without looking at a lyrics sheet.
I dare you.
or "Louie Louie"
The good: Visually impressive. The sound was excellent. The 3D was tastefully done and not gimmicky. The special effects didn't overreach, and I wasn't sitting there irritated by lens flare overload. Good Stan Lee cameo. The extra scene at the end of the credits (like in every Marvel movie) left some interesting loose ends.
The bad: It seems like they cut some minor plot points and various explanations/reveals so they could fit more action into the time allotted. At multiple points during the movie, I thought to myself, "What is this and why didn't they introduce it?" Maybe it's a movie for people more familiar with Thor's comic book history.
I bet they will put some of the things I wanted in an extended director's cut later. So I'll probably watch for that in the stores in a few months.
Summary: It's a pretty interesting fantasy/action movie, and it's very appealing to the senses. Don't leave until ALL the credits are over. The fact that I *wanted* additional exposition is a good sign; I'm just a little bit dissatisfied. Worth it for a matinee showing at least.
Sweet, ANOTHER "standard".
Did we really need it?
And how do you propose to know if any particular compiler or library is or isn't compromised?
Promotion in that position? What's that look like?
I hereby promote you to the rank of "Corporal Chair-napper over by the nukes"...
I ran into this a few months ago. I bought my player back when most models had all the outputs. In the past few years, though... almost all of them have dropped the analog ports.
I was shopping for a Blu-ray player with someone who had a decent CRT TV with no HDMI port. We went to various big-box stores - Target, Best Buy, Walmart, Sears - plus local home theater stores, and among all of them there was exactly one model of Blu-ray player on offer with more than an HDMI port - and it was out of stock all over the place. It was coincidentally one of the most expensive models. They're available, but they're hard to get.
The funny thing about the story is that this person has an SDTV, only has (and only needs) 3G cell phone data service for their Internet usage, rents from Blockbuster and Redbox all the time, and needed a BD player because those stores' DVD selections are shrinking as their BD collections grow. It's gotten difficult to find DVDs to rent or buy, but the BD selection is really great. So there are cases out there, you know?
Oh man, I had to deal with Zip drives... Those things were horrible. You never knew when one would just stop working. CLICK CLICK CLICK... there goes that copy of my work.
Look at any motherboard manufacturer and you'll probably find an expensive motherboard that comes equipped with Thunderbolt.
I did. There's usually one model at the top end of their lineup, for which you pay a premium, yet it isn't the best you can get; it's positioned in a strange place. It's not enthusiast gear for anyone other than a Thunderbolt enthusiast, since the other models have better features, and it's too expensive for mainstream use. So it's basically a niche product and they know it, but they offer it just to be able to say they offer it.

System builders other than Apple aren't going "hey, buy Thunderbolt!" because the products just aren't there to be marketed for Windows or Linux or whatever. You've gotta have a whole ecosystem (of sorts) built to push it, and Apple makes their own. Nobody else does. Not Sony, not Samsung, nobody you'd imagine would have consumer credibility with higher-end stuff. The middle of the market, like Dell and Lenovo, isn't doing it either. Down at the bottom, it's not offered because it's expensive to license.
Thunderbolt is almost exclusively a 2012 Apple thing. Even people who HAVE Thunderbolt ports wind up wishing they had an extra USB port instead. I know I do and that's the sense I get from plenty of other people. (I have not polled a statistically significant audience.)
USB caught on because it was supported by a zillion cheapo devices and delivered on the whole Plug and Play promise early. I mean, I know people who called the port itself "Plug and Play." It was that ubiquitous. USB 2 kept the momentum going because USB had brand recognition at that point. USB 3 is still just "USB" as far as consumers know, and it tends to work wherever you find a USB port - including on Apple machines, which have USB ports right next to the Thunderbolt ones. And now that USB 3 and eSATA can be combined on one port... why do you need something else for external disks, which was one of the things Thunderbolt was supposed to be good at?
Thunderbolt is only available on expensive devices, and there was a lot of licensing kerfuffle in its infancy while USB 3 was being developed, so USB 3 stole its thunder. It got stuck in the Apple premium-price corner, where only people with that kind of money are buying it, while the USB 2/3 versions of the same products (and their less-expensive lower-end brethren) are available in large supply to satisfy a market much bigger than just Apple's. And you usually have a USB port right there next to your Thunderbolt port, reminding you that you don't have to pay the premium to order a Thunderbolt device when you can just walk into a brick-and-mortar store and buy the USB version.
I see the problem... It's a great idea, but they're trying to make something stick that doesn't have what it takes in a very competitive market. They should have gone with USB 3.0, or made Thunderbolt much more accessible to manufacturers early on. If they had played their cards differently, it would be a very different story today. To disrupt a market, you have to create something everyone in that market can get their hands on.