That sounds like you spend a lot of time using a lot of broken software.
No software is shipped perfect.
[logs] will simply look like "some linux scrolling by" or "the matrix". Those users will instantly be made fearful of the application
You claim that the availability of logs necessarily induces fear. I'd like to see evidence of this.
my PC is connected to my TV. I did it because I already had the PC and the TV and wanted a DVR that [obeys me]
So when you want to use that TV-connected PC for web surfing, word processing, or something similar, where do you put the mouse and keyboard? And how does gaming or other use affect the DVR functionality?
And if the head of household is unwilling to buy an Xbox, there's no gaming at all, so what's your point?
My point is that far more gamers are willing to buy an Xbox 360 and connect it to a TV than to buy a PC and connect it to a TV.
If you can point to anywhere where I claimed this was a free of cost solution
The presumed free of cost solution, from the gamer's side (not the developer's side), is to use the console that one presumably already bought sometime in the past seven years. But as the article points out, the console makers make it cost-prohibitive for a small developer to gain access to those gamers who already bought a console.
I always thought the point of playing games like that with others was the bonding experience and which game you play, and on what system you play it, was kind of a secondary factor.
But you still have to get past the steps of 1. finding games to play and 2. setting up a system compatible with those games in the first place. The small selection of PC-compatible major-label local multiplayer games isn't quite compelling enough to get past step 1, let alone step 2.
It's not like I'm planning to lead a revolution in the gaming market, so I really only tend to focus on personal solutions.
I'm approaching this from the point of view of small video game developers. The article and other articles point out that the tools and approval fees to port even a completed Windows game to Xbox 360 can cost tens of thousands of dollars per year. A lot of small developers can't afford this entry barrier.
The reason you put up with the hoop jumping is that the overall gain in cleanliness and layout of the UI dominates the very rare occurrence of wanting to see this information.
Perhaps it's because I'm a geek, but I disagree that it's a "very rare occurrence". Say the progress dialog has a show/hide button for this tail display, placed next to the random number generator labeled "estimated remaining time". How exactly does removing that button produce an overwhelming "overall gain in cleanliness and layout of the UI"? What the button does is provide an incentive to keep the "estimated remaining time" display honest.
Even if the ETAs are increasing rather than decreasing because of the slowdowns you mention, they will still be reassured that the process hasn't frozen.
If a time remaining display ends up fluctuating between (say) 1 minute and 1 hour depending on what step the process is on, the user gets the impression that the estimation is uselessly inaccurate. In this case, showing the title of the current step assures the user that the time remaining display isn't just wired up to display random numbers to placate the user.
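To make the fluctuation concrete: one way to keep the estimate from swinging between a minute and an hour is to weight elapsed time by each step's expected share of the total work, rather than assuming every step proceeds at the same rate. Below is a minimal sketch of that idea; the step names, shares, and function name are all made up for illustration, not any real installer's API.

```python
# Hypothetical per-step ETA: weight elapsed time by each step's expected
# share of the total work instead of assuming one uniform rate.
# Step names and share values below are invented for this example.
STEP_SHARES = {"unpack": 0.10, "copy": 0.70, "configure": 0.20}

def eta_seconds(completed_steps, current_step, current_fraction, elapsed):
    """Estimate remaining seconds from the weighted fraction of work done."""
    done = sum(STEP_SHARES[s] for s in completed_steps)
    done += STEP_SHARES[current_step] * current_fraction
    if done == 0:
        return None  # nothing measured yet; no basis for an estimate
    return elapsed * (1.0 - done) / done

# Halfway through "copy" after 90 s of work: 45% of the weighted work is
# done, so roughly 110 s should remain.
print(eta_seconds(["unpack"], "copy", 0.5, 90.0))
```

Showing the current step's title alongside a weighted estimate like this tells the user *why* the number moved, instead of leaving them to conclude it's random.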
Ideally, the program would write a log file containing the title and completion timestamp of each step, and it would send that log file to the developer to help improve the estimation in the next version. But I imagine that a lot of users aren't willing to enable that, out of a phobia about applications that "phone home". Do you check the "Customer Experience Improvement Program" box (or other publishers' counterparts) when you install software? Showing the title of the current step gives the user something to talk about in reviews even if the user chooses not to share the log file, as a form of indirect customer feedback.
The PC doesn't fill the role of [local multiplayer gaming box]
There's no reason that it can't though.
I agree with you that there's no technical reason that it can't, but tradition is still a reason. So is the cost of buying a second PC to put next to the TV, and the time and trouble to make sure the operating system and antivirus on the PC that is kept next to the TV are updated. Other users agree: "I'm not putting together a living room PC rig just for one game"; "No PC in my living room, thanks"; "I don't want to hook a computer up to my TV".
shiny new game with multiplayer support
These tend not to be ported to the PC at all (such as any fighting game other than Street Fighter IV), or when they are ported to the PC, the local multiplayer part is cut out in favor of an emphasis on online multiplayer.
Having a wide selection of ROMs to choose from is nice too.
But where will customers (legally) get these ROMs? If I recommend copyright infringement to other people, I incur secondary liability (MGM v. Grokster).
The option is there though, costs roughly the same as a console, with the addition that it is slightly more complex
Geeks like you, me, and the rest of the Slashdot population don't power the economies of scale in the video game market; the median user does. And the median user will stop at "slightly more complex". The kind of people who play video games socially (the real life sense, not the Facebook sense) tend to be the kind of people who want things to be push-button easy.
These days, TVs have VGA inputs and video cards have HDMI outputs.
I am aware that a PC's VGA, DVI, or HDMI output can be used with a TV's VGA or HDMI input.
Somewhere in that mix is your answer.
How so? The fact that a PC can output to a TV is no help if the head of household is unwilling to build or buy an additional PC just for the living room. Nor does it help if there aren't enough professionally made games designed to take advantage of the extra controllers that a PC connected to a TV's VGA or HDMI input tends to have.
Meanwhile, replacing dedicated 'media centers' with a PC is becoming more common.
I'd like to see evidence that you're right and FunkSoulBrother's estimate is outdated, because several Slashdot regulars have repeatedly claimed otherwise: "Most non-geek people simply have no desire to hook up their computer to their TV" and "Nobody wants to attach their PC to their TV".
I can't remember the last time an installer failed for anything other than out of disk space and the network failing.
So how should the end user distinguish a step that's doing something that inherently takes a long time, such as a large transfer through the network, from something that's taking far longer than it should, such as a transfer through the network running at an unexpectedly low throughput or unexpectedly high latency? Or a large write vs. a write to a nearly full, overly fragmented file system? Or distinguish an installer that just happens to take a long time on a specific step on a specific machine from a completely frozen installer? There are a lot of things that can interfere with an accurate ETA that the installer won't know about until it gets to that step.
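One way an installer could tell "inherently slow" apart from "frozen" is to reset a stall timer whenever *any* forward progress is reported, however small, rather than putting a fixed deadline on each step. Below is a minimal sketch of that idea; the class name, parameters, and injectable clock are all invented for illustration.

```python
import time

class ProgressWatchdog:
    """Distinguish a step that is merely slow from one that has stalled:
    any forward progress, however small, pushes the stall deadline out.
    Illustrative sketch; names and defaults are made up."""

    def __init__(self, stall_timeout=30.0, clock=time.monotonic):
        self.stall_timeout = stall_timeout
        self.clock = clock            # injectable so the demo below can fake time
        self.last_progress = clock()
        self.done_so_far = 0

    def report(self, done_so_far):
        # e.g. bytes copied or received so far on the current step
        if done_so_far > self.done_so_far:
            self.done_so_far = done_so_far
            self.last_progress = self.clock()

    def stalled(self):
        return self.clock() - self.last_progress > self.stall_timeout

# Demo with a fake clock: slow-but-moving is fine; no movement is not.
now = [0.0]
wd = ProgressWatchdog(stall_timeout=30.0, clock=lambda: now[0])
wd.report(1024)             # a trickle of bytes arrives at t=0
now[0] = 20.0
print(wd.stalled())         # False: slow, but still within the timeout
now[0] = 40.0
print(wd.stalled())         # True: 40 s with no progress at all
```

This is exactly why showing the current step's title matters: the watchdog can only say "nothing is moving", and the title tells the user *which* transfer or write isn't moving.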
Don't be irreplaceable; if you can't be replaced, you can't be promoted.