Saboteur Launch Plagued By Problems With ATI Cards

An anonymous reader writes "So far there are over 35 pages of posts asking why EA released Pandemic Studios' final game, Saboteur, first to the EU on December 4th and then, knowing full well it did not work properly, to the Americas on December 8th. They have been promising a patch, which is apparently now in QA testing. This is not a small bug: roughly 90% of users with an ATI video card running Windows 7 or Windows Vista report the game crashing after the title screen. Since ATI's market share is nearly equal to Nvidia's, and the ATI logo adorns the front page of the Saboteur website, releasing the game in its current state looks like quite a large mistake."

Comment: Free-floating input data (Score 1)

The actual problem comes down to the input data fed into these models. As TFA says, they backed out the correlation value that would reproduce the observed market prices of CDSes.

This is kind of common in the financial business: you assume that "the market" has already taken all known facts into account when setting those prices, so you can calibrate a single parameter (the correlation, in this case) that stands in for all that assumed knowledge. If your model doesn't explain all the data, you simply add more parameters.
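To make the calibration step concrete, here is a minimal, hypothetical sketch (in Python) of backing an "implied" parameter out of a market quote. The pricing function is a made-up stand-in -- real CDS/CDO pricers are far more involved -- and every name in it is an illustrative assumption, not the model from TFA; the only point is the shape of the procedure: find the parameter value that makes the model reproduce the observed price.

def model_price(rho: float) -> float:
    """Hypothetical pricing model: maps a correlation parameter to a price."""
    return 100.0 * (1.0 - 0.4 * rho)             # monotonically decreasing in rho

def implied_correlation(market_price: float,
                        lo: float = 0.0, hi: float = 1.0,
                        tol: float = 1e-8) -> float:
    """Bisection search for the rho that makes model_price(rho) match the quote."""
    f_lo = model_price(lo) - market_price
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        f_mid = model_price(mid) - market_price
        if abs(f_mid) < tol:
            return mid
        if (f_lo > 0) == (f_mid > 0):            # same sign: the root lies above mid
            lo, f_lo = mid, f_mid
        else:                                    # sign change: the root lies below mid
            hi = mid
    return 0.5 * (lo + hi)

quote = 88.0                                     # an observed market price
rho = implied_correlation(quote)
print("implied correlation:", round(rho, 4))     # ~0.30 for this toy model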

I see two problems with this.

The first one is that many times "the market" actually doesn't know about the future, making your calibrated parameters reflect a collective subjective opinion instead of a tangible reality.

The second one is that, as markets mature, many players end up using the same models. This not only leads to a single failure mode for the whole market, but can also produce a free-floating (unstable) feedback system if no tangible (real-world) inputs are at hand: you price things with model M using parameters that were deduced from prices via the same model M. When everyone does that without looking at the surrounding reality at all, there is no "collective market knowledge" left, just a bunch of lemmings running towards the cliff edge.
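A deliberately crude illustration of that feedback loop, again with made-up numbers and the same toy model as above: each day everyone calibrates the parameter to yesterday's quote, prices with it, and quotes the result back (plus a bit of idiosyncratic noise). Because calibration and pricing invert each other, nothing external anchors the price, and it simply random-walks away from wherever it started.

import random

def price(rho: float) -> float:
    return 100.0 * (1.0 - 0.4 * rho)             # toy model M

def calibrate(market_price: float) -> float:
    return (1.0 - market_price / 100.0) / 0.4    # exact inverse of price()

random.seed(1)
p = 88.0                                         # starting quote
for day in range(250):                           # roughly a trading year
    rho = calibrate(p)                           # fit the parameter to yesterday's quote
    p = price(rho) + random.gauss(0.0, 0.5)      # reprice with the same model, plus noise
print("after a year the quote has drifted to", round(p, 2))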

Comment: HPC and time sharing - failure (Score 1)

At my work, we maintain several Windows clusters for financial derivatives valuation.

We can't really move all of them to Linux (no matter how much we would like to), because some of the calculations are implemented using MS-only technologies like ActiveX (yes, you read that right) and .NET (the last time we checked, the Mono runtime was roughly 5-6 times slower than Microsoft's .NET implementation for our code).

When we recently needed to upgrade the Windows clusters, we had to move from 2-CPU/1-core to 2-CPU/4-core machines, since that was all that was being sold. What we've observed is that Windows (Server 2003) cannot share CPU time fairly when there are more active threads than available cores. Because of this, we see a lot of variance in the overall calculation time when the clusters are heavily loaded.

The same tests on SLES 9-based (yes, 9) Linux clusters with similar hardware did not suffer from this problem: CPU time was divided equally among all threads. And that is with a four-year-old kernel that doesn't even have the newer Completely Fair Scheduler.
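For what it's worth, here is a minimal, hypothetical sketch (in Python, not our actual valuation code) of the kind of fairness test behind that observation: start more CPU-bound workers than there are cores and compare how long each one takes. On a scheduler that shares time fairly, the spread between the fastest and slowest worker stays small; a large ratio means some workers were being starved.

import multiprocessing as mp
import os
import time

def burn_cpu(n: int) -> float:
    """A fixed chunk of CPU-bound work; returns the wall-clock time it took."""
    start = time.perf_counter()
    x = 0
    for i in range(n):
        x += i * i                               # keep the core busy
    return time.perf_counter() - start

def main() -> None:
    cores = os.cpu_count() or 1
    workers = cores * 2                          # deliberately oversubscribe the CPUs
    with mp.Pool(processes=workers) as pool:
        times = pool.map(burn_cpu, [20_000_000] * workers)
    print(workers, "workers on", cores, "cores")
    print("fastest %.2fs, slowest %.2fs, ratio %.2f"
          % (min(times), max(times), max(times) / min(times)))

if __name__ == "__main__":
    main()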
