Theora was based on one of On2's earliest codecs. VP6 and VP7 have been far more successful and even serve as Flash's video codecs. If Google is acquiring On2, it could mean they're looking to open up the formats that have made Flash the media player of choice.
(Sorry, this is somewhat offtopic, but it was the first thing I thought of when I saw the comparison between Windows XP and Windows 7.)
I once saw someone here on Slashdot argue that Microsoft should not have shipped a 32-bit version of Vista, opting instead to push only the 64-bit version. While it seemed like an odd statement at the time (despite the fact that my home XP machine had an AMD64 processor), I find myself agreeing with it on Windows 7.
As it stands today, 32-bit Windows is quickly becoming too small for many business and industrial uses, and it's very affordable to build a high-performance home machine with more than 4GB of RAM. (Case in point.) In fact, with intensive web applications and sophisticated desktop tools (yeah, some of them are bloated) chewing up more memory than ever before, it just doesn't make sense to get anything less than 4GB (or 3GB, if you're running 32-bit Windows, since that's about all it can use) except for a few edge cases.
Unfortunately, Windows has been lagging on the 64-bit front. By treating it as a sort of bastard child (like all their non-i386 NT versions), Microsoft ensured that hardware manufacturers wouldn't make an effort to support 64-bit Windows in a non-server environment. That's frustrating, as I've started bumping up against that once-unimaginable 4GB barrier.
In an attempt to turn this into a slightly more useful conversation rather than a one-sided rant, I was wondering if I could get some opinions on using virtualization as a solution. Given Windows' poor track record as a 64-bit OS, I've been thinking about running a 64-bit Unix and virtualizing 32-bit Windows for backward compatibility. I've already had some success virtualizing Windows 7 on a MacBook, and have even been able to get desktop integration working. (Quite spiffy, that. Though the two interfaces occasionally confuse my wife. She's the primary user of Windows, needing support for some specialized programs with no real alternatives available.)
Does anyone here have experience with setting up a system like this? Do you use Xen, VMware, Sun VirtualBox/OpenxVM, or some other solution? What do you use as your primary OS? Linux has come a long way, but the upgrade treadmill is still frustrating, especially with the seemingly regular ABI changes. Does anyone use [Open]Solaris x86_64 as a host? Do you have 3D graphics completely disabled, or have you found a good way to give all OSes solid and reliable access to the underlying graphics card? Do you bother with mounting virtual shared drives to move data between the OSes, or do you have a home NAS for storing data? (I'm leaning toward a NAS myself.)
Just a few thoughts, anyway. Thanks in advance for experiences & suggestions!
Wasn't Wolfenstein, released in 1992, the first game with 3D graphics?
Not even close. Wolfenstein wasn't even the first raycaster game. It was preceded by Catacomb 3-D (also by id), which itself was preceded by Hovertank 3D (also by id).
Before those were even a twinkle in Carmack's eye, we had MIDI Maze (1987) and the Star Wars arcade game (1983), just to name a few. There were tons of attempts at 3D games before Carmack. He merely popularized the first-person shooter genre and made 3D graphics the standard.
The problem here is in trying to patent a trade secret rather than an invention. Patents are intended to cover inventions: real, working gizmos that operate in a specific fashion. Trade secrets cover processes and information that provide a competitive advantage.
In this case, the two are getting mixed up. The company may have a device to detect certain attributes (which IS patentable), but the fact that the attributes can be measured in order to draw conclusions is inherently unpatentable. If someone else develops a machine for measuring the attributes that works differently from yours... well... tough noodles.
All that can be done is to keep the information a secret. By keeping it secret, it is legally viewed as a "trade secret" which can be contractually protected when sharing with interested parties.
Disclaimer: I am not a lawyer, but I did stay at a Holiday Inn Express once.
I happen to agree with the GP, and I've written tons of games in the past 40 years. Here's my Atari 2600 version of Lunar Lander:
Run it through an emulator like Stella to play.
I later ported the game to Flash, but it's not quite as fun as the 60Hz 2600 version. However, you can play it on a Wii! (Use S for thrust if you're on a PC.)
Evidence is presented in courtrooms, not to journalists. Saying, "there's no evidence" is basically a fancy way of saying, "the court has not yet heard the case".
For now we have allegations. They will be proven or disproven in court.
FTFA: "The suit alleges white officers post on and moderate the privately operated site, Domelights.com, both on and off the job."
That was one variation of the term "computer". But as my old-fashioned flight computer can attest, slide rules were often referred to as computers as well.
It's interesting listening to some of the vets from WWII. They'll often talk about their "trajectory computers" or their "bombing computers" or their "landing computers". To the modern ear, it sounds like they're talking about early electronic machines. Yet these references are just specialized slide rules used to "compute" results for a set of measurable inputs.
The Saturn V could lift more than double the shuttle's cargo capacity
I addressed this above. The Space Transportation System has better power output, but it has to waste it on carrying a giant airplane into space. The Saturn V was less powerful, but far more flexible. Put whatever you want on top and it gets to space. That often meant the Apollo capsule/command module/lander/moon equipment combo with sufficient velocity to reach lunar orbit, but it also occasionally meant a huge hulk of steel and solar panels like Skylab.
The Saturn V boosters were detuned as well.
I'm not talking about detuning. I'm talking about reducing engine output once maximum dynamic pressure is reached. If the SRBs maintained maximum thrust, they'd push the shuttle beyond its structural limits.
The propellant grain has an 11-point star-shaped perforation in the forward motor segment and a double-truncated-cone perforation in each of the aft segments and aft closure. This configuration provides high thrust at ignition and then reduces the thrust by approximately a third 50 seconds after lift-off to avoid overstressing the vehicle during maximum dynamic pressure (Max Q).
What you're referring to is the resonance problem caused by vibration of the F-1 engines, i.e. the "pogo" effect. As I recall, this issue is currently the biggest challenge facing the Ares I stack. The Space Shuttle was vulnerable to some pogo effect, but adding dampers to the LOX feed lines was sufficient to suppress it.
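To see why thrust gets cut around Max Q, it helps to look at dynamic pressure, q = ½ρv². The toy model below is my own illustration, with a made-up constant-acceleration ascent and a simple exponential atmosphere, not actual shuttle trajectory data; it just shows q rising, peaking, and falling as air density drops faster than speed builds:

```python
import math

RHO0 = 1.225  # sea-level air density, kg/m^3
H = 8500.0    # atmospheric scale height, m (simple exponential model)

def dynamic_pressure(velocity_ms: float, altitude_m: float) -> float:
    """q = 1/2 * rho * v^2, with density falling off exponentially."""
    rho = RHO0 * math.exp(-altitude_m / H)
    return 0.5 * rho * velocity_ms ** 2

# Crude constant-acceleration ascent (made-up numbers): v = 20t, alt = 10t^2
samples = [(t, 20 * t, 10 * t ** 2) for t in range(0, 121, 10)]
qs = [dynamic_pressure(v, alt) for _, v, alt in samples]
peak_t = samples[qs.index(max(qs))][0]
print(peak_t)  # 30 -- q peaks partway up, then falls despite rising speed
```

Holding thrust down through that peak keeps structural loads within limits; once q starts falling on its own, full thrust is safe again.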
Pedantic nit: they used slide rules back then.
Yeah, except they often referred to them as "computers". At least until the "electronic" variety became popular. So shush, you.
Replacing titanium structure with aluminum, for example.
Interesting. I would have pointed to the heat shield instead. Reinforced carbon-carbon was nearly invincible and was used for the leading edges of the Space Shuttle. But as a cost-saving measure, they came up with that screwy tile system instead. It saved a ton on development, but it cost them later on.
And oh man, did it ever cost them.
By artificial gravity, I assume you mean using rotation to produce centrifugal force?
Correct. While we usually think of "artificial gravity" as some sort of sci-fi graviton thingy, von Braun used the term to describe the effect of a rotating wheel in space.
That means that to get a full G of apparent gravity, you need a station with a radius of nearly 225 meters. Obviously, you could probably make do with less than a full G.
The original proposal by von Braun and Willy Ley was a three-deck rotating wheel with a diameter of 76 meters. Rotation would have been 3 RPM, providing artificial gravity of about one-third Earth normal. Since the effects of weightlessness were not known at the time, I believe von Braun intended the gravity to make the station more operationally efficient rather than to meet the health needs of the crew.
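The arithmetic behind those figures is just centripetal acceleration, a = ω²r. A quick sanity check (note: the 2 RPM spin rate for the full-g case is my assumption — the ~225 m figure implies it, but it isn't stated above):

```python
import math

G = 9.80665  # standard gravity, m/s^2

def spin_gravity(radius_m: float, rpm: float) -> float:
    """Apparent gravity at the rim of a spinning station, in g's (a = w^2 * r)."""
    omega = rpm * 2 * math.pi / 60  # convert RPM to rad/s
    return omega ** 2 * radius_m / G

def radius_for(g_level: float, rpm: float) -> float:
    """Radius needed for a given apparent gravity at a given spin rate."""
    omega = rpm * 2 * math.pi / 60
    return g_level * G / omega ** 2

# von Braun's wheel: 76 m diameter (38 m radius) at 3 RPM
print(round(spin_gravity(38, 3), 2))  # 0.38 -- roughly the 1/3 g quoted

# Radius for a full g at an assumed 2 RPM
print(round(radius_for(1.0, 2)))      # 224 -- the "nearly 225 meters" above
```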
I just don't see that being likely until we have a more efficient way than rockets to get material into space
You have to remember that they had the power of the Saturn V at their disposal. No weight was too heavy! No craft too large! And with the Nova drawings on the board, it was only a matter of time before mankind was the master of his solar system!
Of course, the fact that NASA was spending a fairly sizable chunk of the federal budget on space exploration was lost on these engineers. There was not going to be a Nova, the Saturn V was seen as too expensive, and their ideas for a space station were simply too grand.
First, your numbers for the shuttle are flat-out wrong; you forgot to account for the thrust from the SRBs. Second, your numbers for the Saturn V are missing. Third, the F-1 and the SSMEs are not comparable: the F-1's analog is the SRB, and the SSME's analog is the J-2. Look them up and you'll find that the shuttle is WAY more powerful on a per-engine basis.
Here are some corrected numbers:

Saturn V:
Thrust: 34.02 MN
Mass: 3,038,500 kg
Thrust-to-weight ratio: 1.19:1

Space Shuttle:
Mass: 2,030,000 kg
Thrust-to-weight ratio: 1.5:1
As you can see, the shuttle has roughly a third more thrust for its weight than the Saturn V. This is more than sufficient to accomplish the liftoff goals. The SRBs are actually shaped internally to REDUCE thrust during flight to prevent overstressing the shuttle hardware. The idea is to get up to Max Q as quickly and smoothly as possible, then throttle back until the thickest part of the atmosphere is cleared.
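For anyone who wants to check the arithmetic: thrust-to-weight is just liftoff thrust divided by liftoff weight. The Saturn V figures below come from the numbers above; the shuttle's liftoff thrust (~30.1 MN, two SRBs plus three SSMEs at sea level) is my own ballpark, since it isn't quoted here, so the computed ratios differ a little depending on which thrust figures you plug in:

```python
G = 9.80665  # standard gravity, m/s^2

def thrust_to_weight(thrust_n: float, mass_kg: float) -> float:
    """Liftoff thrust divided by liftoff weight (mass * g)."""
    return thrust_n / (mass_kg * G)

# Saturn V numbers from the post; shuttle thrust is my own rough figure.
saturn_v = thrust_to_weight(34.02e6, 3_038_500)
shuttle = thrust_to_weight(30.1e6, 2_030_000)

print(round(saturn_v, 2))  # 1.14
print(round(shuttle, 2))   # 1.51
print(f"{(shuttle / saturn_v - 1) * 100:.0f}% more thrust per unit weight")
```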
There's a reason why the cosmonauts always like hitching a ride on the shuttle. As launch vehicles go, it's a really nice ride both on the way up and on the way down.
Compare apples to apples, please. The SSMEs are analogous to the J-2. (SSMEs are more powerful, BTW.) The F-1 analog is the SRB engines.
If you want to get off the ground in a hurry, the SRBs will happily flatten you to a pancake. In comparison, the Saturn V barely lumbered off the pad.
It's not ironic at all. NASA made an economic misstep by developing the shuttle. The economics of launch vehicles favor the inline stack, with smaller boosters for man-rated vehicles and larger boosters for cargo. Never the twain shall meet.
In the absence of a clear need for a space station as a rendezvous point, taking a step backward to more sophisticated capsules is how you get back on track for economic success.