I'm not a kid by any stretch, but Rock Band did let me live the rock star lifestyle in a small way. Toward the end of the craze, a local radio station had a contest up at Lake Tahoe where the grand prize was $400 worth of bottle service at one of the fancier nightclubs at Harveys casino. My boyfriend and I went up on stage with two random guys we met that night to fill out the band and won with a rendition of Aqualung. A couple weeks later the four of us went up there and got absolutely blasted. Just like rock stars!
First step? Don't call them clueless right off the bat.
Destroyed. At the end of every season, the CG models for Babylon 5 assets were deleted according to contract requirements with the Prime Time Entertainment Network, which distributed the show. Probably as an asset-reduction move for financial reasons, in that proto-CG era of production.
I think the missing key in the smart home options most people can actually afford is reliable voice control. I know Google's acquisition of Nest (and whatever Apple gets around to doing) will make a big difference here, but I can already say that I'd be a lot happier with my "smart" lighting if I had:
A: More money for more components such as light switches and socket replacements.
B: Voice controls that were as responsive and reasonably reliable as the Amazon Echo, which gets it right a surprisingly large amount of the time.
Last year, I picked up a Wink Hub and four "TCP Connected" brand (which is a horrible name for obvious reasons) daylight LED bulbs to see how dipping my toes into home automation would work out, and it really has been a seriously mixed result, just like the author of the original article says. I'm using a very simple setup: two lights in my home office, and one light in the rear of the living room. The only "smart" part I have set up is a group to let me control the office lights all at once.
And it's really not all that stable. The TCP Connected bulbs actually require a home gateway and an online service to control, and Wink ties into that. When that service is glitchy, commands may or may not go through. There's no apparent reliable activity confirmation in the protocols from what I can tell, so the software never knows if a device is actually on or off. A fairly simple schedule I have set up dims my lights for a period before bed, and then turns them off later. This usually works, but not always. It's also supposed to turn them back on, and it fails to do that about half the time.
Is the problem the TCP bulb integration? Is it Wink? Is it the signal in my house? Is it a bug? There's no way to tell for sure, and these systems just aren't bulletproof enough to rely on yet. But is it a nice step? Absolutely.
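To make the missing piece concrete: what I'd want from these hubs is a confirm-and-retry loop, where the software reads back the device's actual state instead of assuming the command landed. Here's a minimal sketch of that idea; the `FlakyBulb` class and its `send`/`query` methods are entirely hypothetical stand-ins for a hub API, not anything Wink or TCP actually expose.

```python
import random

class FlakyBulb:
    """Hypothetical stand-in for a cloud-connected bulb: commands
    sometimes get silently dropped by the online service."""
    def __init__(self, drop_rate=0.3):
        self.on = False
        self.drop_rate = drop_rate

    def send(self, on):
        # Simulate the command occasionally getting lost in transit.
        if random.random() >= self.drop_rate:
            self.on = on

    def query(self):
        # The read-back a reliable protocol would need to provide.
        return self.on

def set_with_confirmation(bulb, on, retries=5):
    """Send the command, then verify the device actually changed state,
    retrying a few times before surfacing the failure."""
    for _ in range(retries):
        bulb.send(on)
        if bulb.query() == on:
            return True   # state confirmed
    return False          # give up loudly instead of guessing

random.seed(0)
bulb = FlakyBulb()
print(set_with_confirmation(bulb, True))
```

With a 30% drop rate and five retries, the odds of an unconfirmed failure drop below a quarter of a percent, and the key point is that failure becomes visible rather than silent, so a schedule could alert you instead of just leaving the lights on.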
The big thing I feel I should do in my particular case, though, is replace the light switches so I don't always have to pull out a smartphone or tablet. Is it a pain to do that? Yes and no. It's more of a pain than it should be for something advertised as super simple, because of the process the article mentions: unlocking a device, loading the app, swiping to the control you need, and then hitting said control.
The prices can definitely be appealing, but once you realize that a light switch is going to be $50, it adds up.
This isn't a testing fault. I'm sure they tested the hell out of it. Dozens if not hundreds of QA people sat in cubes for months, maybe years, testing bits of this game as it got produced. And I'm sure that many of them wrote up really detailed, well reasoned explanations of just how broken it was in every single way that people are counting today.
And nobody cared, because the game had to launch before the holiday season of 2014. Thousands of jobs and millions upon millions of dollars were at stake.
It isn't that nobody tested, it's that nobody really cared.
That is pretty much the only recourse people have. And it won't work until lots of people try to use it, loudly, repeatedly, but politely. The fact that Ubisoft is already making excuses actually makes it seem like people might have a better case this time than in most, because it's not just going into a cone of silence.
It's a case where the developers looked at the raw numbers for the system that was coming and said "Wow! We're going to have almost three times the cores, sixteen times the RAM, and so much more GPU!" and then jacked the engine demands up to a level that probably shouldn't have been reached until a few more years into the platform's life cycle. It took years and years for the Xbox 360 and PS3 to be understood well enough to create things like the GTA V engine. Now, possibly in part because of the switch to essentially PC hardware, they find themselves working with hardware that was until recently considered second-tier to console hardware in general; on top of that, these consoles use AMD parts that have always been second-tier in the PC market.
This really should not have been that bad. They're overreaching, and that's basically the fundamental problem. Wait a few years, and games that try to pull off what Unity does will be successful and well optimized, but right now they're still working out just what the hardware is capable of. It's just too bad for the customers who get screwed while inadvertently helping Ubisoft and other developers learn how this hardware can be put to use.
Anyone that still pre-orders anything that has a launch day review embargo, after the "Aliens: Colonial Marines" disaster, that's who.
This is a side effect of what happens when game franchises become more profitable than movie franchises. Once the flow of money starts on a game with a budget in the tens of millions like the Call of Duty, Assassin's Creed, or Grand Theft Auto franchises to name just a few - there comes a point of no return where you finish what you started, because you've sunk millions upon millions into something that just turned out *wrong*, like this. Just like Warner Brothers couldn't put the Green Lantern movie on hold and rewrite it to not suck, Ubisoft backed themselves into a corner.
Nobody had the balls or the power to say "Wait a minute, we're overreaching. Let's scale this back to something that will actually run." Instead, they launch a buggy, bad game because they're already into the marketing campaign alone for tens of millions of dollars. It's so much worse for consumers than a flop of a movie, because you're spending $60+ on the cost of entry, and when the reviewers are embargoed there's just no way to tell if you're going to get screwed. Thank the big-budget productions and stock market demands for this kind of disaster.
Every time I see something like this, or a botched Call of Duty release, I get a *little* less annoyed with Valve for not saying a word about Half-Life 3/Ep. 3. They're private. They can take the time without investors freaking out.
Let's also take into account that Ubisoft had to know something was up, because the pre-release copies they gave game reviewers came with an embargo that lasted 17 hours into the release date. I'm not surprised at all to see this, though I'm admittedly surprised it's quite as large a problem as it is. When they announced the system requirements, I winced. I know that a game engine designed for the current consoles is finally going to demand a lot more horsepower than last year's titles, but a GTX 680 as the minimum specification? Someone screwed up the engine design, plain and simple.
As of last night, I actually have a license from IBM to run V5R2 on an older AS/400 system I purchased through Craigslist. I prodded the giant, it woke up just a little tiny bit and managed to decide that giving a hobbyist a license for an obsolete version of the OS/400 platform wasn't going to kill anyone.
It's my hope that I'll be able to help prove that there are more people like me, and indeed people far more talented and curious than me, to show IBM that there's some value for them in opening up access to at least the older platforms for enthusiastic hobbyists. The AS/400 platform is an incredibly neat system, and it shows that IBM really does have a niche that nobody else can touch. I've never used AIX, but would love to check that out as well. I hope that some time in the future, I'm not a one-off case when it comes to hobbyists getting an actual license.
But your comment was well timed for me, because I wonder if IBM might be coming around as an institution and realizing that the mindshare gap they have is a problem that it's worth investing a little bit of time and effort in fixing.
Gah, I really wish this article had come up after I had been awake for a while at least. Time for coffee and letting the page refresh in case I can organize my thoughts just a little tiny bit more coherently.
Numbers that you can't even comprehend. Any system that uses Windows software on non-upgradeable hardware. Medical devices that require specific levels of precision and predictability.
Have you ever looked closely at medical devices? I work with some systems less than five years old that cost close to $100,000 and they run Windows XP. Should they be replaced? No, not just because the OS beneath the application layer is old. I'm probably the only person in the office that knows it's an XP machine, which helps with security. Sometimes you can't just upgrade.
Science is what happens when preconception meets verification.