That's a fair point.
Still, there's plenty of room for the
Indeed. I'm too busy struggling to stay almost not quite embarrassingly behind on front-end buzzword compliance, and now this? I'd have no idea what it was if I wasn't friends with a devops specialist. Ditto Chef, Hadoop, and a few other extremely specific buzzword compliant "concepts" tech writers whisper about in worshipful tones.
I kinda miss the era in which a general computing proficiency was possible. Specialization used to be for insects.
I don't know how a controller reports itself to the system, but it probably makes sense to take a balanced approach - if it's possible to sniff an x-box controller then present the appropriate menus; if it's unrecognized then the oldschool List Of Options makes sense - while you can account for a good number of popular variables full coverage just doesn't seem to be economically feasible, especially for small developers.
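The "balanced approach" above can be sketched in a few lines: if an enumerated input device matches a known gamepad, show its button glyphs; otherwise fall back to the oldschool list of options. The device names and the `KNOWN_PADS` table here are hypothetical placeholders, not a real driver API - just the shape of the idea.

```python
# Hypothetical lookup table: recognized pad names -> menu glyph style.
KNOWN_PADS = {
    "xbox 360 controller": "xbox_glyphs",
    "xbox one controller": "xbox_glyphs",
    "sony dualshock 4": "playstation_glyphs",
}

def choose_controls_menu(device_names):
    """Pick a menu style from whatever input devices were enumerated."""
    for name in device_names:
        style = KNOWN_PADS.get(name.lower())
        if style is not None:
            return style           # recognized pad: show its button glyphs
    return "generic_key_list"      # unrecognized or no pad: plain list of options
```

So `choose_controls_menu(["Xbox 360 Controller"])` gets the x-box menus, while an empty or unrecognized device list degrades gracefully to the generic list - covering the popular cases without pretending full coverage is feasible.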
Keyboards present their own problems - the French, for example, don't use WASD - the keys in those positions on the AZERTY layout are ZQSD. I use a Mac keyboard on my PC - it was easier to just bring it along than to retrain muscle memory for a Windows layout. I haven't had issues with games since screenshot functionality was added to Steam (F12). My control/alt/"windows" keys are laid out differently... and I have F13-F15 and no "print screen" key. I'm an edge case on Windows but that's a standard Mac layout, and a variable to account for with multi-system ports.
Oddly, back when I played EVE Online I was able to map drone controls to F13-F15 on the PC... but the Mac port, with a Mac keyboard, didn't see the F13-F15 keys. I haven't tried to map those keys on other games, as the majority of my game time these days is keyboard and mouse with the left hand in the general area of WASD.
The advantage of the keyboard is that the basic shape doesn't change much - the sizing and spacing of the meta keys between Windows and Mac keyboards and localization differences aside, it makes sense (to me, anyway) to separate the keycaps from the keys themselves. Localization and the occasional rogue Dvorak layout seem to be the biggest issues. There are a few weird split "hacker" or "ergonomic" designs but I haven't seen one in the wild since around 2003.
If you went with a keyboard graphic for a controls menu I think any design would be acceptable so long as it fits with the rest of the game's UX - I wouldn't expect a representation of a factory-fresh bondi blue Apple USB keyboard in Metro 2033, though a banged-up IBM Model M missing a few keys would fit right in (and could be used to, say... indicate that the Windows key is unmappable - just an empty socket). I like Half-Life 2's implementation - Valve uses a custom font for UI icon display (all the guns/weapons are font characters) - I don't know if their keyboard representation is done the same way but it makes sense that it would be. There are a few basic shapes/sizes for keys and a full board could be assembled with a small number of glyphs/objects.
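The "small number of glyphs" point is easy to demonstrate: treat each key as a (shape, label) pair drawn from a tiny set of standard widths, the way an icon font would. The shape names and the sample bottom row below are illustrative, not any particular game's asset list.

```python
# A handful of standard keycap widths (in "key units") covers most boards.
KEY_SHAPES = {"1u": 1.0, "1.5u": 1.5, "2.25u": 2.25, "6.25u": 6.25}

def row_width(row):
    """Total width, in key units, of a row of (shape, label) keys."""
    return sum(KEY_SHAPES[shape] for shape, label in row)

# A typical bottom row assembled from just three of the four shapes.
bottom_row = [("1.5u", "Ctrl"), ("1u", "Win"), ("1.5u", "Alt"),
              ("6.25u", "Space"), ("1.5u", "Alt"), ("1u", "Fn"),
              ("1.5u", "Ctrl")]
```

Swap the labels and you have a French board, or a Mac board, or a Model M with an empty socket where the Windows key would be - same four glyph shapes throughout.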
I think that in a roundabout way we've summarized why consoles are attractive for developers - they're fixed, comparatively slow-moving targets.
Not hardcoding names/functions is an excellent idea; I'd totally accept seeing "Press Start" on the title screen on a console or if I had a controller with a "start" button plugged into the machine.
Re: keys - this is a UX issue with a number of different approaches. The keybinding menu on PC games is typically a lengthy list of $function [ $keybind ] - FORWARD [ W ], CROUCH [ SHIFT ], USE [ E ], etc. Some games present the keybind menu as a list, some break it into sections ("exploration," "combat," "utility," etc). I have yet to see one that shows the keyboard as a graphic in the way I've seen some games show the controller, as a graphic or technical drawing with clearly defined labels. The experience of displaying and remapping keybinds has plenty of room for improvement - it doesn't seem to get much love in large part because many people don't change the defaults, or when they do it's once or twice and that's it... or they use a controller. The existing editing interface is similar across most games I've seen allow for it; as it stands now it's low-hanging fruit for UX development.
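The $function [ $keybind ] list above, sectioned and remappable, is simple to model - and keeping the defaults around fixes the "what was the old keybind?" complaint without a full reset. Section and function names here are illustrative, not from any specific game.

```python
# Illustrative default bindings, grouped into sections.
DEFAULTS = {
    "exploration": {"FORWARD": "W", "CROUCH": "SHIFT", "USE": "E"},
    "combat":      {"FIRE": "MOUSE1", "RELOAD": "R"},
}

class Keybinds:
    def __init__(self, defaults):
        self.defaults = defaults
        # Deep-ish copy so remapping never clobbers the defaults.
        self.current = {s: dict(b) for s, b in defaults.items()}

    def remap(self, section, function, key):
        self.current[section][function] = key

    def menu_lines(self):
        """Render the familiar sectioned $function [ $keybind ] list."""
        lines = []
        for section, binds in self.current.items():
            lines.append(section.upper())
            for func, key in binds.items():
                default = self.defaults[section][func]
                note = "" if key == default else f"  (default: {default})"
                lines.append(f"  {func} [ {key} ]{note}")
        return lines
```

After `remap("exploration", "FORWARD", "UP")` the menu shows `FORWARD [ UP ]  (default: W)` - the old binding stays visible instead of being lost until a global reset.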
WASD is so omnipresent that it's considered a solved problem; knowledge of what those keys do is assumed to the point where they aren't even covered in tutorials anymore. Half-Life 2 did a good job of relaying controls to the player in-game - while a game that did this sort of intuitive and timely reveal with its menu system might not win any awards for it, the work would not go unnoticed.
Re: Big Picture - I'm not in the living room demographic; I don't know how many people have a TV-sized display hanging off their PC. I do know that text two feet away and text eight feet away need to be different sizes to appear the same relative size to the human eye, regardless of the number of pixels on the screen. Fallout 3 and Fallout New Vegas are the best examples I can think of offhand that account for this, albeit via aftermarket mods. Darnified UI is a mod that, among other things, changes the font size, making it possible to fit a lot more text on screen without scrolling. I think SkyUI does the same thing, though I have "before" and "after" experience with F3/FNV and have never played Skyrim without mods.
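The two-feet-vs-eight-feet point is just geometry: for text to subtend the same visual angle, its physical height has to scale linearly with viewing distance. A quick back-of-the-envelope check (the distances and angle are illustrative numbers, not UI guidelines):

```python
import math

def text_height(distance_in, angle_deg):
    """Physical height of text subtending angle_deg at a given distance."""
    return 2 * distance_in * math.tan(math.radians(angle_deg) / 2)

desk = text_height(24, 0.5)    # ~2 ft away: desktop monitor
couch = text_height(96, 0.5)   # ~8 ft away: living-room TV
# couch / desk == 96 / 24 == 4: the TV text needs 4x the physical height
# to look the same size, regardless of how many pixels either screen has.
```

Which is exactly why a UI designed for the couch feels like a billboard at a desk, and a desk UI is unreadable from the couch.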
Basically, it boils down to polish - some games feel right, some don't. There's no need to go totally overboard in one way or the other; I think game developers could stand to make fewer assumptions about their target demographic (assuming controller, assuming Big Picture, etc - if you're building a PC port assume there's at least a few people out there who use the things as something other than glorified skinner boxes).
In Bethesda's case the policy seems to be "let the community figure it out." Skyrim has SkyUI, which really takes advantage of the real estate (and cursor). Just allowing for mods and having well-documented tools is a step in the right direction - the people that are bent out of shape enough to do something about it will, and the people that don't mind aren't going to be looking for UI mods anyway.
JC2 handles fine on the PC - the fact that on first run it loaded to a "Press Start" screen felt like sloppy QA more than anything else. Apparently the game plays fine with a controller - why the game expects one instead of checking to see if one is plugged in is beyond me. Psychonauts actually has x-box controls in the game menu screens - it would be cool if those weren't there by default and popped up when you're playing with a controller but as-is the x-box menu screens really drive home the fact that the PC version is a port.
Borderlands is fine once you get used to it - I came in from the S.T.A.L.K.E.R. series so the fact that the BL games are very arcade-like, to the point that an "INSERT COIN" message on death would not be out of place, was pretty jarring. Like the Metro series it doesn't need a particularly deep or complex UI.
I wouldn't mind some developer out there remembering that ten percent of the population is left-handed and making handedness a character option, but that's ultimately up to the animators and not a console -> PC issue. Ever watch a lefty use a bolt-action rifle?
As to interface improvements, in the general sense... allowing for key remapping is a pretty standard feature - to the point where having hard-mapped keys that can't be remapped can be pretty jarring (F1/F2/F3 in Fallout 3 / New Vegas, for example). Allowing for customization without clearly explaining why the keys are where they are by default can cause difficulties as well - Dishonored allows for remapping but the default layout is fairly organic once you're used to it... muck around with one or two keybindings and suddenly RSI is through the roof... and there's no menu indicator of what the old keybind was, short of resetting everything to default.
The real console -> PC issue seems to be font sizing and use of real estate more than anything else - PC users are typically sitting closer to smaller higher resolution screens whereas console users are typically sitting further back, looking at larger, lower resolution displays.
I bought my PC to build and render 3d environments and assets for my webcomic. The fact that Steam was the second thing I installed and the Orange Box was the first PC game pack I bought is gravy - if the hardware can't be used for productivity I have no use for it and I'm not wasting money on it. Entertainment is a secondary function.
That said, the fact that Borderlands, Skyrim, Just Cause 2 etc. are all running barely-localized console interfaces makes me feel like the PC is a second-class citizen, shoved to the back of the line in favor of consumer hardware that can't do anything else.
Bing is also better on old hardware and marginal connections... even in Chrome. I have a 2009 Shuttle box and a megabit DSL link and Bing just kind of appears. Faster hardware improves things a bit but Google services just seem to assume infinite bandwidth - the lack of throttling on Google Drive makes it useless and OH GLOB I'M RANTING.
tldr; Bing is faster than Google - and loads immediately on those occasions when Chrome's address bar is horking like it has a hairball. It's not as drastic as the difference between Amazon (just loads) and Newegg (takes forever) but it's there.
I'm fine with twitter being a self-contained thing. "News" "reports" that consist of screen after screen of embedded tweets with "analysis" along the lines of "he said.... she said.... OH NO THEY DIDN'T!" is a waste of clock cycles, electricity, photons, and calories.
Until it runs on something other than Windows it's already locked-in.
Humor is pretty subjective - I set "funny" at a -1 and found that it improved the quality of the comments I was reading enormously. Funny posts are still in the mix, just not at the default intensity. This was a lot more of an issue a decade or so back, when the Slashdot Effect was a real threat to websites and the site was practically my home page. These days the sort of shoot-from-the-hip snark that swamped the comments section "above the fold" has migrated over to Reddit, where, depending on the subreddit and subject, I may have to hit the Page Down key up to half a dozen times to get past the jokes, one-liners, and associated snark - all upvoted far past the point where my downvote would have any meaningful impact. While that's where the fun is for a lot of people, I really like being able to de-prioritize that sort of commentary or toss it out entirely - the fact that Slashdot allows for a degree of user-controlled comment display means we're not as subject to groupthink... and I'm far more likely to use my modpoints to upvote deserving comments than I am to spitefully downvote "funny" posts that don't mesh with my sense of humor.
Slashdot's comments are upvoted/downvoted in a more granular fashion than any other site out there and comment display can be skewed by user preferences - I penalize "funny" posts and really wish I could do the same on Reddit. The best the rest of the internet has managed to implement is a Nero-style upvote/downvote system, which puts the same weight on puns and one-liners as it does on trolls and insightful responses.
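The "funny at -1" idea generalizes neatly: each comment carries a base score and a tag, each reader applies personal per-tag modifiers plus a display threshold. This is a sketch of the general mechanism, not Slashdot's actual moderation algorithm - tag names and thresholds are illustrative.

```python
# A reader's personal per-tag modifiers: penalize Funny, boost Insightful.
MY_MODIFIERS = {"funny": -1, "insightful": +1}

def effective_score(base, tag, modifiers=MY_MODIFIERS):
    """Base moderation score adjusted by the reader's per-tag preference."""
    return base + modifiers.get(tag, 0)

def visible(comments, threshold=2, modifiers=MY_MODIFIERS):
    """Filter (base_score, tag, text) tuples by the personalized score."""
    return [text for base, tag, text in comments
            if effective_score(base, tag, modifiers) >= threshold]
```

A +3 Funny post still shows up (it scores 2 after the penalty), a +2 Funny one-liner drops below the threshold, and a +1 Insightful reply gets promoted - jokes stay in the mix, just not at the default intensity, which is exactly what a flat Nero-style upvote/downvote can't express.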
Commenting in general is ripe for disruption - if Disqus upgraded from upvote/downvote to something along the lines of the system Slashdot has had since the 90s it would change the Comments section overnight.
Part of my old job (in a museum Exhibits department) was upgrading interactives and videos from the 80s and 90s to modern equipment - that included "transferring" laser discs the old fashioned way - plugging one of the still-working players from the floor directly into the capture hardware.
The thing is, I was transferring LD to DVD, which is actually a step *down* in quality. Kind of but not quite like how VHS is a step down from Beta (which I also dealt with).
The great thing about standards is there's so many of them!
Anyone who's used Apple software for more than five years has been burned by forced format obsolescence - ClarisWorks, AppleWorks, old QuickTime codecs, the PICT format, SimpleText, Font Suitcases, the list goes on. And on. And that's just *one* platform and set of formats off the top of my head. I lose data to software "upgrades" so often that it's the single biggest determining factor in my upgrade cycle and a huge determining factor in the uptake and use of new software. We aren't heading for a digital dark age - we're in one already.
This is where Comcast building wifi hotspots into their cable modems becomes pretty damned insidious - how long until devices like this are "pre-authorized" to automatically connect to the mothership through any available wireless connection?
Imagine if a Samsung TV automatically phoned home through your neighbor's Comcast wifi/modem link not because you enabled it but because Samsung had paid Comcast to allow its devices through. And of course this behavior is on by default, and blocking it, thanks to some timely lobbying, is now a violation of the first amendment (or something equally deranged-but-feasible vis-a-vis corporate personhood).
Over the years I've learned that I can rely on two factors when it comes to games - word of mouth and development staff. Somebody who knows me and knows what I like probably isn't going to recommend something outside of that sphere (or if they do it's due to incomplete information, for a laugh, or for reasons unrelated to gameplay), and if I like a game or series of games it's usually a good indicator that I'm going to like whatever the people that made that game work on next - usually but not always.
I agree on "multiple reviewers per game" - different reviewers have different priorities and play styles and that can subtly skew impressions. While I find Rock Paper Shotgun reporting on FPS games to be solid and reliable, my impression of their review of Just Cause 2 - which I read after playing the game - was "Did we play the same game? o_O"
Real programs don't eat cache.