I have never played it, but have enjoyed other Civ-style games. Should I play the original, Alien Crossfire, or both? Opinion on the latter seems mixed; half loves it and the other half thinks it unbalances the game.
> and be the syfy movie of the week.
Unlikely. Syfy prefers animals as the villains of its Saturday-night original movies, not people.
Now, if it turns out that a shark or octopus (or, even better, both) stole the cobalt-60, then you'll have the network's attention. Expect Sharktopus II: Nuclear Boogaloo any week now.
No, he is referring to the $30 plan with 100 minutes and unlimited text and data.
If you want to become a professional software developer as opposed to being locked into IT support, the Master's program at the University of Chicago sounds ideal for you. It is specifically designed for those with little or no formal programming experience before beginning the degree.
And very successfully so, too. With a circulation of more than 8 million copies monthly, it is the third-largest magazine in the United States.
Ah yes, the man who routinely wins by a two-thirds margin despite his district not having voted for a GOP president since 1984 is disliked by his constituents.
I figured this out on the day in 2003 when I first tried out OS X. I've been using Linux since 1995 and had tried every available desktop: CDE, KDE, GNOME, Enlightenment (the horror!).
I still use Linux as a server, but for a Unix-like desktop that actually works and runs a lot of applications, OS X is it. Period.
People accept glasses for watching 3D movies in theaters because they are there for the experience of watching a film on a giant screen with other people while eating popcorn and drinking soda. The same goes for other specific, controlled environments, like 3D CAM in an office; people accept it as part of the experience (or job in this case).
3D in the home will never succeed until and unless glasses are not needed. It doesn't matter whether the glasses are disposable or expensive, or if today's multiple competing standards congeal into one. No one will accept needing to constantly put on and take off 3D glasses to watch TV. Period.
This wouldn't be the first wristwatch from HP. The company sold the HP-01 from 1977 to 1980. It was a calculator watch that was very advanced for its time (at $750, it should have been!).
> "[W]ould it kill people to put a reasonable amount of content on pages?"
Yes, the twin demons of page views and banner ad impression counts would indeed kill people. Many non-tech news sites offer a "Single page" or "View all" or "Print" option, but tech-oriented sites generally don't, with a few exceptions like Wired or C|Net.
I am surprised the parent said he could do 1080i without VDPAU.
Playing MPEG-2 high-definition streams (whether from over-the-air or FireWire) is easy. To oversimplify, video playback involves 1) decoding the compressed video signal and 2) rendering, or displaying, it. As mentioned, my Pentium 4 was fast enough to decode MPEG-2 streams in 2005, and the Xv hardware-assisted renderer (usable from Linux via any Nvidia or Intel video card/chipset made in the past many, many years) quite nicely displayed the video with the more-than-decent Bob 2X deinterlacer. The resulting 50-70% CPU usage I saw is perfectly adequate for a box that doesn't do anything else, and of course the usage would be less with a faster CPU. Before VDPAU, software decoding and Xv render is what the vast majority (I'd guess 95%) of MythTV users used for high-definition playback.
Decoding high-definition h.264 video (such as that produced by the Hauppauge HD-PVR, which shipped in May 2008) is much more difficult. My Pentium 4 was able to just barely play 720p 6Mbps h.264 recordings, but no more; people on mythtv-users were reporting in mid-2008 that the fastest Core 2 Duo boxes were just barely adequate to play 13Mbps (the best quality, more or less indistinguishable from the original) HD-PVR recordings, and sometimes were overstretched even then. In other words, MythTV users were beginning to create recordings they could not play back!
VDPAU has the video card handle everything. The card itself, not the CPU, decodes both MPEG-2 and h.264 streams and renders the resulting video using excellent deinterlacers. Given the dilemma the HD-PVR created, VDPAU (which arrived in late 2008/early 2009) could not have come at a better time.
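Outside MythTV itself, the decode/render split described above is easy to see in MPlayer's command-line options. This is just an illustrative sketch (the filename is hypothetical, and MythTV configures the same choices through its playback profiles rather than flags):

```shell
# Software decode + Xv render: the CPU does the decoding; the video card
# only scales and displays. This is the pre-VDPAU approach described above.
# (yadif stands in here for a bob-style software deinterlacer.)
mplayer -vo xv -vf yadif recording.ts

# Full VDPAU: the card decodes MPEG-2 or h.264 *and* renders it, using the
# card's own deinterlacers (deint=4 selects the temporal-spatial mode).
# The trailing comma in -vc lets MPlayer fall back to software decoders.
mplayer -vo vdpau:deint=4 -vc ffmpeg12vdpau,ffh264vdpau, recording.ts
```

The point of the second form is that both steps move off the CPU, which is why CPU usage collapses from 50-70% to a few percent.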
From what I understand, there's still no adequate Xv support with ATi cards, and I don't know whether current ATi Linux drivers have finally solved this; most sensible people on mythtv-users just throw up their hands and buy a $30 Nvidia card.
> At some point the reduction in power costs will justify a switch to something like the Revo.
Correct. I do have a Kill-a-Watt and made those calculations a while ago. This is a key reason why I say I wouldn't start with a Pentium 4 today even if I could buy one new (which I can't).
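The Kill-a-Watt arithmetic is simple enough to sketch. Every number below is an assumption for illustration, not one of my actual measurements:

```shell
# Back-of-the-envelope payback calculation for replacing a Pentium 4 box
# with an ION-based Revo. All figures are assumed, not measured.
P4_WATTS=150        # assumed draw of a Pentium 4 box running 24/7
REVO_WATTS=25       # assumed draw of an ION-based Aspire Revo
RATE=0.12           # assumed electricity price, $/kWh
REVO_PRICE=200      # assumed cost of the new box, $

# Energy saved per year, in kWh
YEARLY_KWH=$(awk "BEGIN { printf \"%.0f\", ($P4_WATTS - $REVO_WATTS) * 24 * 365 / 1000 }")
# Dollars saved per year
YEARLY_SAVINGS=$(awk "BEGIN { printf \"%.2f\", $YEARLY_KWH * $RATE }")
# Years until the new box pays for itself
PAYBACK_YEARS=$(awk "BEGIN { printf \"%.1f\", $REVO_PRICE / $YEARLY_SAVINGS }")

echo "Saves ${YEARLY_KWH} kWh/yr (\$${YEARLY_SAVINGS}/yr); payback in ${PAYBACK_YEARS} years"
```

With those assumed numbers the payback is about a year and a half, which is why the switch eventually justifies itself for a box that runs around the clock.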
What is holding me back? 1) Inertia, since my frontend/backend continues to work 24/7 without any issues. As the saying goes, why change what isn't broken? 2) More to the point, I am waiting until the ION platform supports the Advanced 2X deinterlacer. Once it does in a $200 Revo-like form factor--hopefully soon--I'll buy one, but until then I'll stick with the Pentium 4.
> Yeechang fails to mention that [a P4 3GHz] is roughly the sweet spot for a mythtv frontend.
Yes, it was indeed the sweet spot when I bought it more than four years ago. I certainly wouldn't buy a new P4 today, even if it were possible. I'd get an ION-based Aspire Revo for $200-300; that's clearly today's sweet spot.
My larger point stands; most people wouldn't expect that a box that was state of the art five years ago would still be adequate for recording and playing 1080i and even 1080p high-definition video, but it is.
> Just set up a plain old linux box and it'll work even with the plain jane VESA driver. Now you can do all this binary NVIDIA driver and XVmc and VDPAU or whatever for even better performance, but it'll "just work" on a stock plain old linux install.
I am not aware of a single case of successful high-definition video playback with MythTV using Xorg's stock VESA driver, and the folks at mythtv-users would certainly want to hear of one. For high-definition playback some type of hardware acceleration, whether partial (as with Xv) or full (as with VDPAU) is required. Some people are successful with OpenChrome/Unichrome, or Intel video, or even ATi, but the vast, vast, vast majority of MythTV users use Nvidia cards and thus its binary drivers.
> You can spend more money on an even faster system for myth. But its just money down the drain
I am happy with my P4 frontend/backend because it meets my needs. Again, though, I'd not choose a P4 if I were building a new MythTV box today. In any case, prices have decreased, not increased; there simply was no equivalent back then to today's inexpensive ION boxes.
I've been using a Pentium 4 3.0GHz-powered box as a MythTV frontend/backend for more than four years. It often records four high-definition over-the-air or FireWire MPEG-2 streams while playing back another.
For the first three years I used an Nvidia video card with Xv output to play the recordings at very good quality with 50-70% CPU usage. A year ago I moved to VDPAU, which gives me even better playback with under 5% CPU usage, and will do the same with h.264 recordings (generated by the Hauppauge HD-PVR, for example). Thanks to VDPAU, there's every possibility I'll be able to use the Pentium 4 box for another four years.
> Isn't Madonna already 60?
. . . No way. Wow. You're kidding, right?!?