At some point, the hassle of keeping old junk running, putting up with how slow it is, dealing with failing electronics, and so forth isn't worth it.
I have 17 Pentium 3-class systems in my basement in a render farm. Sure, it's neat to have so many systems. But for my purpose, a single $300 quad-core box literally has more compute power, more memory, more memory bandwidth, and uses way less electricity. Plus you don't have to maintain a billion systems. And it takes up less space. And there's no heat problem. I haven't replaced the pile yet, because I'm not doing that much 3D lately, but I will, and it will be awesome to be rid of so much clutter. I also have a bunch of Sun boxes. They were fun to get working, but they use too much power, and it's an absolute hassle to fiddle with them, maintain software on several platforms, and so forth. My free time is valuable; I don't want to waste it doing menial maintenance on crappy hardware.
Off-brand low-end consumer gear is barely designed to last 3 years, let alone keep working past its expected life. Most of that 802.11b gear is pretty limited in what it can do, and barely worked when it was new. It's not like you can install dd-wrt and turn them into a mesh.
The best-case scenario is probably hooking up somebody who has no wireless and no resources, like your local church or whatever. If it breaks, meh, they had low expectations to begin with. It may not even be worth doing that though, because a lot of older consumer routers break when subjected to the network behavior of newer versions of Windows: they can't handle the TCP window scaling that's enabled by default, and it's a support chore to dink around with the settings on every machine that comes along in a non-enterprise environment.
Bottom line is that old junk starts costing you more to use than buying new stuff would.
Yeah, I went to get firmware updates for some older Sun hardware I wanted to fix up and ran into this too. Got the same "support contract is required for firmware updates" crap.
My 11 Sun machines are headed to the dumpster as a direct result of this policy. It's incredibly stupid. It's not like Sun is winning any new customers these days anyway, and now they'll bleed out the few they had. Obviously their intention is to kill off their hardware business, because no one in their right mind would decide to implement a policy like this.
Their model is a vertical market with Machiavellian control over customers. That model broke in the 1990s. It's ridiculous to keep hanging on to it like a moldy wet blanket.
Actually, "the computer", and by extension "the internet", would be just another device to control as far as these interfaces go. In the various cybernetic monkey experiments, the robotic parts ultimately become an extension of the body; a two-way digital communication link to the outside world would probably be adapted to in the same way. Eventually the user would sort of "think at" the link to be able to read and write information. It would work a lot like I/O completion ports and device registers, which is how current drivers communicate with current devices. It's a bit of a challenge to work out a protocol for meaningful communication, but this part does get bits in and out.
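To make the completion-port analogy concrete, here's a toy sketch of that pattern in Python: a "driver" posts requests to a device-like worker and reaps results from a completion queue, without blocking on any individual request. All the names here are made up for illustration; it's an analogy, not an actual driver model.

```python
import queue
import threading

# Toy model of the I/O-completion-port pattern: requests go in one
# queue, completions come back on another, in whatever order they
# finish.
request_q = queue.Queue()
completion_q = queue.Queue()

def device_worker():
    """Stand-in for the device: services requests, posts completions."""
    while True:
        req_id, payload = request_q.get()
        if req_id is None:          # shutdown sentinel
            break
        # "Process" the request (here: just echo it back, uppercased).
        completion_q.put((req_id, payload.upper()))

t = threading.Thread(target=device_worker, daemon=True)
t.start()

# The "driver" side: fire off several requests, then reap completions.
for i, msg in enumerate(["read", "write", "status"]):
    request_q.put((i, msg))
request_q.put((None, None))

results = {}
for _ in range(3):
    req_id, data = completion_q.get()
    results[req_id] = data
t.join()
```

The point of the pattern is that neither side waits on the other synchronously, which is roughly what a "think at the link" interface would need.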
This class of device is quite close to the minimum requirements for a direct neural interface (assuming the infection issues get worked out and so forth). I would not want to be a beta tester, though.
Video and image streams require a lot of bandwidth to transmit, but audio streams are nowhere near as intense. It won't take too many generations of the technology to get bandwidth rates high enough to sustain a data stream the size of a voice stream; the brain will pretty much automagically learn to interpret the data.
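To put rough numbers on the gap (uncompressed, back-of-the-envelope figures, just for scale):

```python
# Telephone-quality voice: 8 kHz sample rate, 8 bits per sample.
voice_bps = 8_000 * 8                 # 64,000 bit/s

# Standard-definition video: 640x480 pixels, 24-bit color, 30 fps.
video_bps = 640 * 480 * 24 * 30       # ~221 Mbit/s

# Uncompressed video needs thousands of times the bandwidth of voice.
ratio = video_bps / voice_bps         # 3456x
```

Compression narrows the gap considerably, but the orders of magnitude are why a voice-sized stream is the plausible near-term target.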
It is a long way from something like the Matrix, with total override of perception, though.
Sorry, not quite... sort of on the 120Hz, no on the "free" stereo 3D from the console. Most "120Hz" TVs actually just interpolate the intermediate frames in their processing chip. They can't actually display real incoming data at 120Hz; they just fake it.
"All it requires" for free stereo 3D as described above is about 2x the work. Game console hardware already has to drop resolution to 720p to maintain 30fps for the usual level of detail in games these days. Make the graphics subsystem render 2x the frames in the same time, and at best it's going to manage 15fps. Sure, you get to re-use some of the setup work (mostly moving data into the graphics subsystem) from the first angle, but the graphics subsystem still has to do the matrix transforms to render the scene, as well as run the shaders, which are likely not designed to look good from multiple angles, and may involve destructive calculations.
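The frame-budget arithmetic behind that claim is simple enough to write down (worst case, assuming nearly the whole budget is render work):

```python
# Frame budget at the console's target framerate.
target_fps = 30
frame_budget_ms = 1000 / target_fps        # ~33.3 ms per frame

# Rendering a second eye view roughly doubles the render work
# (some scene setup is shared, so reality is a bit better than this).
stereo_frame_ms = 2 * frame_budget_ms      # ~66.7 ms
stereo_fps = 1000 / stereo_frame_ms        # back down to 15 fps
```

Shared setup and cheaper second-view tricks claw some of that back, which is why "at best" 15fps is the honest way to state it.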
Usually in games there are distinct tasks that are parallel in nature, but within a given task there is an inherent degree of serialization. Most games have a simulation thread to update the game world state, and a separate render thread that periodically draws a frame of the current state at, say, 30fps or so, depending on system load. But the render thread doesn't actually do the drawing; instead it gathers up the required 3D models, textures, scene data and whatever else, and feeds them to the GPU to draw. Rendering is embarrassingly parallel. Unfortunately, snapshotting the game world state to determine what to draw in a given frame is inherently serial.

Recently, shaders have added the capability to do additional manipulation of the data on the GPU itself, which is faster because it avoids the I/O bandwidth bottleneck (talking to the slower main system bus). Anyway, a free core on the system does precisely nothing for you unless the game architecture is designed to parallelize the render thread to some degree, which most engines do not do. Instead, that core should be used for better AI or whatever. While game consoles are designed to have more bandwidth between the graphics subsystem, main memory, and the CPU, it's still the bottleneck, just not as bad as on PC hardware.
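The sim-thread/render-thread split described above can be sketched in a few lines; this is a toy model (the "GPU submit" is just recording the snapshot, and the timing numbers are arbitrary), but it shows where the serial part lives:

```python
import threading
import time

state_lock = threading.Lock()
world_state = {"tick": 0}
frames_submitted = []
stop = threading.Event()

def simulation():
    """Advances the game world state; serial with respect to the state."""
    while not stop.is_set():
        with state_lock:
            world_state["tick"] += 1
        time.sleep(0.001)

def render():
    """Snapshots the world (serial), then hands the snapshot to the 'GPU'."""
    while not stop.is_set():
        with state_lock:                 # the inherently serial step
            snapshot = dict(world_state)
        # The drawing itself could be farmed out in parallel from here.
        frames_submitted.append(snapshot)
        time.sleep(0.01)                 # roughly fixed frame cadence

threads = [threading.Thread(target=simulation), threading.Thread(target=render)]
for t in threads:
    t.start()
time.sleep(0.1)
stop.set()
for t in threads:
    t.join()
```

Everything past the snapshot is the embarrassingly parallel part; everything inside the lock is what a spare core can't help with.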
There are other engine changes you would have to build to do this as well, so it's hardly free, and that's not even thinking about memory constraints in the graphics subsystem, or whatever requirements there are for a 3D signal to the 3D TVs (presumably some sort of alternating-frame format at a higher framerate; I haven't really looked into it lately).
It's gonna be a couple generations of the 3D display technology before the kinks are worked out, too. The next generation of consoles will probably support it reasonably well. That's still 5-7+ years out though, which is about the same timeframe that the no-glasses 3D TV tech is anticipated to appear. That means several years of crappy, rapidly changing tech are in store.
Existing deployment tools from Microsoft already do this. You need the WAIK (Windows Automated Installation Kit), which is a free download from Microsoft.
You need to create a generalized image. If you get all the required drivers for all your hardware into the driver store, the drivers will be found during install. You can also deploy a generalized image from PXE boot using WDS (Windows Deployment Services)...
There are a few caveats around drivers that aren't designed properly for Sysprep, and applications that aren't designed with Sysprep in mind, but otherwise it's quite slick. You can script the installation of these exceptions to occur later during deployment using unattend.xml and RunSynchronous commands. You can also supply your license key in the unattend.xml file.
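A fragment along these lines in unattend.xml handles both the scripted exceptions and the license key (trimmed for readability; the script path and the key are placeholders, and some component attributes are omitted):

```xml
<unattend xmlns="urn:schemas-microsoft-com:unattend"
          xmlns:wcm="http://schemas.microsoft.com/WMIConfig/2002/State">
  <settings pass="specialize">
    <!-- Run a script for the apps/drivers that don't survive Sysprep. -->
    <component name="Microsoft-Windows-Deployment">
      <RunSynchronous>
        <RunSynchronousCommand wcm:action="add">
          <Order>1</Order>
          <Path>C:\deploy\install-exceptions.cmd</Path>
        </RunSynchronousCommand>
      </RunSynchronous>
    </component>
    <!-- Supply the product key here instead of typing it at each box. -->
    <component name="Microsoft-Windows-Shell-Setup">
      <ProductKey>XXXXX-XXXXX-XXXXX-XXXXX-XXXXX</ProductKey>
    </component>
  </settings>
</unattend>
```

You then point Sysprep at it with something like `sysprep /generalize /oobe /shutdown /unattend:unattend.xml` before capturing the image.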
About 90% of all Windows deployments are sysprepped by OEMs or by corporate IT folks....
Please read the documentation, the tools are quite flexible.
Somebody ought to cross ball point pens with coat hangers so that the pens will multiply instead of disappear.