Comment Re:Why Amiga? (Score 2, Interesting) 289

I call bullshit. Even some of the most simple Flash games would be impossible to re-create on a (then) mid-range Amiga.

The Amiga would struggle even with a 'match-3' game where any case arose that the grid full of symbols all had to fall down at the same time. You've got to remember that the Amiga didn't have enough graphical horsepower to move even a 16-colour 320x256 screen full of objects around at 50 or 60fps. Oh, it could move the entire screen around as one object, but the Blitter couldn't shift actual pixels around that fast.
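Some rough back-of-the-envelope arithmetic on that claim. The numbers below are my own illustration, not measured specs — in particular the blitter throughput budget is an assumption, and a real "cookie-cut" blit's cost depends on the blit mode:

```python
# Back-of-the-envelope: cost of redrawing a full 16-colour 320x256 screen
# of objects every frame. The blitter budget below is an illustrative
# assumption, not a measured figure.
WIDTH, HEIGHT = 320, 256
BITPLANES = 4                       # 16 colours = 2^4 bitplanes

bytes_per_frame = WIDTH * HEIGHT * BITPLANES // 8    # 40,960 bytes
fps = 50
bytes_per_second = bytes_per_frame * fps             # ~2 MB/s for one plain copy

# Drawing objects is not a single copy: a "cookie-cut" blit reads the
# object, its mask and the background, then writes the result, so each
# written byte costs several memory accesses rather than one.
accesses_per_write = 4
effective_demand = bytes_per_second * accesses_per_write   # ~8.2 MB/s

ASSUMED_BLITTER_BUDGET = 4_000_000   # bytes/s, purely illustrative
print(effective_demand > ASSUMED_BLITTER_BUDGET)     # demand exceeds budget
```

Even with generous assumptions, filling the whole playfield with moving objects every frame blows past the memory bandwidth available.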

Now try doing Warzone Tower Defense, or *any* of the physics-based games where graphic objects undergo rotation. The Amiga had no built-in support for rotating graphics. It could be kludged in software, but that was usually limited to demoscene stuff. Brian The Lion was the only commercial game to implement full-speed rotating graphics. Well, Turrican 3 might have too (on small objects), but I may be misremembering.
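For anyone who hasn't done it: software rotation means touching every destination pixel with the CPU. A minimal sketch of the standard reverse-mapping approach (this is a generic illustration, not any particular game's code):

```python
import math

def rotate_bitmap(src, angle_deg):
    """Rotate a small 2D bitmap (list of rows) about its centre using
    reverse mapping: for each destination pixel, find the source pixel
    it came from. Illustrates the per-pixel CPU cost of rotation."""
    h, w = len(src), len(src[0])
    cy, cx = (h - 1) / 2, (w - 1) / 2
    a = math.radians(angle_deg)
    cos_a, sin_a = math.cos(a), math.sin(a)
    dst = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            # Inverse rotation maps destination coords back into the source.
            sx = cos_a * (x - cx) + sin_a * (y - cy) + cx
            sy = -sin_a * (x - cx) + cos_a * (y - cy) + cy
            sxi, syi = round(sx), round(sy)
            if 0 <= sxi < w and 0 <= syi < h:
                dst[y][x] = src[syi][sxi]
    return dst

# An asymmetric L-shape, rotated 90 degrees:
bmp = [[1, 0, 0],
       [1, 0, 0],
       [1, 1, 1]]
print(rotate_bitmap(bmp, 90))
```

Two multiplies, two adds and a bounds check per pixel, with no hardware assist: that's why full-speed rotation was demoscene territory on a 68000.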

The game Rotox was based entirely on a top-down rotating vector playfield, but the framerate was fairly poor.

The only area where the PC falls down in 2D gaming is that there is absolutely no hardware support for detecting per-pixel collisions between objects. You either iterate through the objects pixel by pixel using the CPU, or you do bounding-box, bounding-circle or ever more complex bounding-polygon stuff.
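The usual CPU-side approach is the two-stage test described above: a cheap bounding-box rejection first, then a per-pixel mask overlap only over the intersection. A minimal sketch (generic illustration, not any engine's actual API):

```python
def bbox_overlap(a, b):
    """a, b: (x, y, w, h). Cheap first-pass rejection test."""
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    return ax < bx + bw and bx < ax + aw and ay < by + bh and by < ay + ah

def pixel_collision(pos_a, mask_a, pos_b, mask_b):
    """Per-pixel test over the overlapping region of two sprite masks.
    Masks are lists of rows of 0/1 (1 = opaque pixel)."""
    ax, ay = pos_a
    bx, by = pos_b
    a_box = (ax, ay, len(mask_a[0]), len(mask_a))
    b_box = (bx, by, len(mask_b[0]), len(mask_b))
    if not bbox_overlap(a_box, b_box):
        return False
    # Intersection rectangle in world coordinates.
    x0, x1 = max(ax, bx), min(ax + a_box[2], bx + b_box[2])
    y0, y1 = max(ay, by), min(ay + a_box[3], by + b_box[3])
    for y in range(y0, y1):
        for x in range(x0, x1):
            if mask_a[y - ay][x - ax] and mask_b[y - by][x - bx]:
                return True
    return False

diag = [[1, 0],
        [0, 1]]
anti = [[0, 1],
        [1, 0]]
print(pixel_collision((0, 0), diag, (1, 1), diag))  # True: pixels meet at (1,1)
print(pixel_collision((0, 0), diag, (1, 1), anti))  # False: boxes overlap,
                                                    # opaque pixels never coincide
```

Real implementations pack the masks into machine words and AND them 32 or 64 pixels at a time, but the structure is the same.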

Comment Re:IBM PCs compared extremely poorly with Amigas (Score 1) 289

Whilst I agree with everything you say, the Amiga OS was also an insecure hell. OK, as coders we got absolute control over everything, but had the Amiga 'won', the whole OS would have had to go through a total re-write to implement a whole lot of protection in order to prevent a gross malware bloom.

The Amiga died just around the time I developed an almost complete knowledge of its hardware and became fluent in 680x0 assembly. A state of coding Nirvana I have never been able to achieve on the PC, much to my dismay.

Submission + - Benchmark Reviews Caught Red Handed Over 'Review' (podgamer.com)

An anonymous reader writes: Benchmark Reviews were thoroughly caught out recently when it turned out that a 'review' for a $1300 office chair was a near bare-faced reproduction of a number of press releases provided by the manufacturer. Forum goers investigated the strange wording of the review and laid their findings bare. When an investigative journalist e-mailed Benchmark Reviews asking for citations of studies mentioned in the review, what was their response? Dig up the personal details of the journalist, including address and phone number, and post them publicly online in a "Hall of Shame".

Comment Re:Remember the LOLAMO (Score 1) 197

Your reply has nothing to do with basing a purchasing decision on a value-for-money process.

*any* card can be a lemon. Top-end, low-end, mid-range.

I get your point that top-end cards go through rigorous QA and are thus less likely to be defective, but to base a purchasing decision on this criterion alone is pure madness. Top-end cards are enormously overpriced, however you want to measure it ($ per fps, FLOPS per $). This is because supply is limited and the market will bear it. Everyone expects to pay a premium for the kudos of having the fastest card. The law of diminishing returns is vividly evident in the graphics-card marketplace.

Sensible people will offset the incredibly small risk of purchasing a lemon against the real and immediate lower purchase cost of a mid-range (or just not top-end) card.
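To make the $-per-fps point concrete, here's the shape of the curve. The prices and benchmark numbers below are entirely hypothetical, picked only to illustrate the diminishing-returns pattern, not taken from any real review:

```python
# Hypothetical figures for illustration of $-per-fps diminishing returns.
cards = [
    ("low-end",   90,  45),   # (tier, price in $, average fps)
    ("mid-range", 200, 80),
    ("top-end",   500, 110),
]
for tier, price, fps in cards:
    print(f"{tier}: ${price / fps:.2f} per fps")
```

The mid-range card costs a bit more per frame than the low-end one; the flagship costs roughly double per frame. That premium buys kudos, not value.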

Comment Re:Remember the LOLAMO (Score 1) 197

Absolute rubbish advice. If a card doesn't work right through defective manufacturing or design, you're due a free repair, replacement or full refund.

Nobody likes getting a lemon, but that's no reason to spend stupid amounts of money on a flagship card from *any* generation.

Getting a previous-generation flagship is just as dumb an idea. Not only will you miss out entirely on technological advances (DX11 features, for example), but energy usage will be woefully inefficient compared to a low- or mid-range card from the current generation.

If you want an 'always' rule: always buy the card in the very middle of the current generation, or the one below that if your budget doesn't permit. This would mean a 465 from Nvidia (though it is likely to be discontinued, or price-dropped, very soon).

I speak from experience. I'd never paid more than 100GBP for a graphics card until I decided to buy a 256MB 7800GTX some years ago, when it was the fastest card available. >300GBP and I've got a card which is an OK performer (Half-Life 2 at 1920x1200 with decent quality), but nowhere near as capable as a 100GBP card of today. I should have just spent 100GBP at the time and then upgraded twice more between then and now.

I was thinking of putting an ATI 5670 into a new computer, this year. For various reasons, I want a single-slot card. I'm somewhat hoping that at least one OEM can create a single-slot cooling solution for Nvidia's 460, because it's a much better proposition, performance-wise.

Comment Re:not the highest resolution: 8k super hi-vision (Score 1) 204

Having looked at some of the YouTube sample clips, 4k video presented me with a few issues:

1. My CPU, GPU and drivers combination were not capable of sustaining full rate playback.*
2. One clip suffered corruption every few seconds.
3. Whilst I have plenty of constant, uncontended bandwidth, YouTube does not.**
4. I don't have a 4k display.

OK, the last one is pretty obvious. Mind you, you need HDMI 1.4 to transport 4k video from your computer to your 4k screen. Even then, HDMI 1.4 supports that resolution at a maximum of 24fps (i.e., not really suitable for your Windows desktop).

I don't think 100Mbps is needed for 4k video at the consumer level. Current encoders can squeeze 1080p24Hz into, e.g. 5.1Mbps average over the length of a movie with very good quality. 4k video contains 4.2667 times as many pixels as 1080p. Very unscientifically, this means less than 22Mbps to transport a high quality 4k video stream.
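The scaling arithmetic above, spelled out (the 4.2667 ratio comes from DCI 4K, 4096x2160, rather than the 4x of 3840x2160):

```python
# Naive per-pixel scaling of a good-quality 1080p bitrate up to 4k.
pixels_1080p = 1920 * 1080            # 2,073,600
pixels_4k    = 4096 * 2160            # DCI 4K: 8,847,360
ratio = pixels_4k / pixels_1080p      # ~4.2667

bitrate_1080p = 5.1                   # Mbps, good-quality movie average
naive_4k_bitrate = bitrate_1080p * ratio
print(round(ratio, 4), round(naive_4k_bitrate, 1))   # 4.2667 21.8
```

If anything this overestimates: encoders tend to get *more* efficient per pixel as resolution rises, so sub-22Mbps is a comfortable ceiling.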

Of course, consumers are happy with shitty quality digital video, as witnessed by the many cable-TV channels using low bitrate encoding which nobody seems to complain about (at least, not enough for cable companies to worry).

YouTube is not exactly cable-TV quality, so I'm sure they would get away with 5Mbps for 4k video.

*2.2GHz Core2Duo, Nvidia 8700M GT, latest drivers.

**I have 20Mbps internet; one of the 4k test videos constantly paused during playback due to buffer underruns.

Comment Re:not the highest resolution: 8k super hi-vision (Score 1) 204

1920x1080 is not the absolute last word on things. 1920x1080 is *not* enough for everybody. More screen-space is always welcome.

My old phone had a 640x360 display. New phones are available with 800x480. The iPhone 4 has a 960x640 resolution screen and that's only a 3.5" display!

I would like to enjoy as many pixels per square centimeter on my TV and laptop as I do on my phone. Consider that old laptops had 800x600 displays; then came technological advances through 1024x768 and 1280x1024, then widescreen from 1280x800 up to a current maximum of 1920x1200 for laptop displays, with 2560x1440 and 2560x1600 for very expensive external monitors. Technological advances now seem to apply solely to mobile devices. In fact, we're going backwards ever so slightly: 2560x1600 was last year's top resolution, and the same monitor family this year sports 2560x1440 at the top of the range. Pixel density is still not great, because going beyond 1920x1200 means a 27" screen, minimum.
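The density gap is easy to quantify. Using pixels per inch (the conventional linear measure; the comparison works out the same as per-square-centimeter) — note the 17" laptop size is my assumption for the sake of the example:

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch from resolution and diagonal screen size."""
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(960, 640, 3.5)))    # iPhone 4, 3.5" display -> ~330 ppi
print(round(ppi(1920, 1200, 17)))   # laptop panel (17" assumed) -> ~133 ppi
print(round(ppi(2560, 1440, 27)))   # 27" desktop monitor -> ~109 ppi
```

The phone packs roughly three times the linear pixel density of the best laptop panels, i.e. nearly ten times the pixels per unit area.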

Perhaps there is no market demand? 1920x1200 is a nice resolution on a laptop. Still considered luxury, for the most part. People even seem happy with 1024x600 on their netbooks.

1920x1080 seems to be the new 1920x1200 for laptop displays and external monitors from anywhere just short of 23" all the way up to 27". If we're going to stick with 16:9 as our ratio, I'd like to see us progress from 1920x1080 to 2560x1440, 3200x1800 and then 4000x2250.

Imagine that on your 17" laptop. Heck, why not go crazy and have 6000x3375? We could do away with horrible, blurry, anti-aliased text on our displays.

Sadly, I think progress will come in the form of OLED (or a variant). That will at least get rid of refresh lag, which really sucks, but everyone kinda just ignores it because those of us who remember (or still use) CRTs know that an LCD's clarity outweighs a CRT's short-persistence phosphor blur.

Comment Re:Why I prefer physical media (Score 1) 232

*When* a *digital* *download* game *from* Steam *gets* corrupted (read: *Never*)*,* you click "repair cache" and it *repairs* itself. You console dudes are idiots *[no comma]* and this *is* why you buy consoles. You have zero *intelligence*.

Seriously, if you're going to bash an entire collection of people by calling them stupid, don't fuck up your post in stupid ways.

Comment Re:Why I prefer downloads (Score 1) 232

Steam is good because it's more convenient and is a better overall experience than downloading pirate copies.

I have no love for DRM or systems which use it, but Steam is absolutely the best way for me to reward developers for a job well done. If some of that money also goes to Valve, then that's cool too. I really like Valve games.

I don't know if the devs get more or less reward than if I'd bought a copy of their game from a retail store, but I don't care. Store-bought games mean I have to deal with these crappy, outdated and overly-fragile things called DVDs. Plus, if I can't at least have a really nice box and manual... forget it. Little DVD cases and a PDF of the manual really don't get my impulse-purchase juices flowing.

If Steam dies tomorrow and I can no longer play the games I've paid for? Well, that's why I also support piracy as a distribution mechanism.

Comment Re:Fill 'er up! (Score 1) 431

Some LCDs are utter rubbish for watching video, but that doesn't mean people won't buy them as an upgrade to an existing CRT.

The majority of consumers would rather buy a 52" plasma with a 1024x768 resolution than a high-end, full-HD, 37" LCD/LED TV at the same price. Bigger is not better, but it does impress their friends.

The /. crowd are (hopefully) an exception: I'd rather buy an expensive 32" TV than a cheap 42" TV, because it will be better quality.

Heck, this is why I've (still) got a 24" Dell LCD monitor which set me back a month's pay.
