United States

Rupert Murdoch Publishes North Korean Flash Games 186

Posted by Soulskill
from the wonder-if-they're-paywalled dept.
eldavojohn writes "You might recall back in June when it was noted that North Korea was developing and exporting flash games. Now, the isolated nation is apparently home to some game developers being published by a subsidiary of News Corp. (the games include Big Lebowski Bowling and Men In Black). Nosotek Joint Venture Company is treading on thin ice in the eyes of a few academics and specialists who claim the Fox News owner is 'working against US policy.' Concerns center on the potential influx of cash producing better programmers, whose skills could then be leveraged into cyberwarfare capabilities. Nosotek said that 'training them to do games can't bring any harm.' The company asserts its innocence, though details on how much of the games' development took place in North Korea are sparse. While one of the poorest nations in the world could clearly use the money, it remains to be seen whether hard-line opponents like the United States will treat Nosotek (and parent company News Corp.) as if they're fostering the development of computer programmers inside the DPRK. The United Nations stipulates only that cash exchanged with companies in the DPRK cannot go to companies and businesses associated with military weaponry or the arms trade. Would you feel differently about Big Lebowski Bowling if you knew it was created in North Korea?"
The Almighty Buck

On the Expectation of Value From Inexpensive Games 102

Posted by Soulskill
from the all-about-the-washingtons dept.
An article by game designer Ian Bogost takes a look at what type of value we attach to games, and how it relates to price. Inspiration for the article came from the complaint of a user who bought Bogost's latest game and afterward wanted a refund. The price of the game? 99 cents. Quoting: "Games aren't generally like cups of coffee; they don't get used up. They don't provide immediate gratification, but ongoing challenge and reward. This is part of what Frank Lantz means when he claims that games are not media. Yet, when we buy something for a very low price, we are conditioned to see it as expendable. What costs a dollar these days? Hardly anything. A cup of coffee. A pack of sticky notes. A Jr. Bacon Cheeseburger. A lottery ticket. Stuff we use up and discard. ... I contend that iPhone players are not so much dissatisfied as they are confused: should one treat a 99-cent game as a piece of ephemera, or as a potentially rich experience?"

Comment: Re:Oh Noes! (Score 1) 583

by addie macgruer (#26144271) Attached to: Microsoft Knew About Xbox 360 Damaging Discs
Water boils at 100 deg C at one atmosphere only. It will boil at around 70 deg C at the top of the tallest mountains - you can't make good coffee on Everest - and its boiling point rises with pressure up to the critical point (374 deg C at 218 atm), beyond which it exists as a supercritical fluid, showing both liquid and gas character. High temperature is normal for good coffee: preparation of good espresso forces hot water through the grounds under pressure to extract the most flavour from the beans, for instance. Not that I'm suggesting McDonald's prepare good coffee.
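As a rough check on those numbers, here is a small Python sketch of my own (not from the comment) using the Clausius-Clapeyron relation with a constant enthalpy of vaporisation; the ~0.33 atm figure for the Everest summit is an assumption for illustration.

```python
# Rough Clausius-Clapeyron estimate of water's boiling point vs. pressure.
# Assumptions (mine, not the comment's): constant enthalpy of vaporisation
# of ~40.7 kJ/mol and ideal-gas behaviour - fine for a ballpark figure.
import math

R = 8.314          # J/(mol K), gas constant
L_VAP = 40_660.0   # J/mol, enthalpy of vaporisation of water (approx.)
T_REF = 373.15     # K, boiling point at the reference pressure
P_REF = 101_325.0  # Pa, 1 atm

def boiling_point(pressure_pa: float) -> float:
    """Estimate boiling temperature (K) at a given pressure (Pa)."""
    inv_t = 1.0 / T_REF - (R / L_VAP) * math.log(pressure_pa / P_REF)
    return 1.0 / inv_t

# ~0.33 atm near the summit of Everest -> roughly 70 deg C
print(boiling_point(0.33 * P_REF) - 273.15)
```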

Comment: Re:Experts please explain something (Score 1) 179

by addie macgruer (#23074806) Attached to: Nvidia Physics Engine Almost Complete
Rendering a world in 3d requires you to do *a lot* of very simple maths. OpenGL breaks the operations down into two steps; DirectX is essentially the same.

"Per vertex": the 3d world is made out of triangles. The camera position and viewpoint is expressed as a matrix, and subject to a matrix inversion, a simple mathematical transform. Every corner of every triangle (vertex) is expressed as a matrix, and then multiplied by the inverted camera matrix. The product is the (x,y) position on your screen, together with the depth (distance).

Matrix maths is easy: it's a series of multiplications followed by a series of sums. A general-purpose CPU has to step through the instructions for each vertex every time, which is time consuming. A specialised GPU has circuitry which does this maths, and only this maths, and so can process many vertices at once at considerable speed.
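As a toy illustration of that per-vertex step, here's a short Python/NumPy sketch of my own (not from the comment): it assumes an identity camera (view) matrix, builds an OpenGL-style perspective projection, multiplies a homogeneous vertex by it, and divides by w to get screen-space x, y and depth. The specific values are arbitrary.

```python
# Toy "per vertex" stage: multiply each homogeneous vertex by a combined
# view-projection matrix, then divide by w to get screen-space x, y and depth.
# The field of view, aspect ratio, and vertex are illustrative values only.
import numpy as np

def perspective(fov_y: float, aspect: float, near: float, far: float) -> np.ndarray:
    """OpenGL-style perspective projection matrix (column-vector convention)."""
    f = 1.0 / np.tan(fov_y / 2.0)
    return np.array([
        [f / aspect, 0.0, 0.0, 0.0],
        [0.0, f, 0.0, 0.0],
        [0.0, 0.0, (far + near) / (near - far), 2.0 * far * near / (near - far)],
        [0.0, 0.0, -1.0, 0.0],
    ])

def project_vertex(vertex: np.ndarray, view_proj: np.ndarray) -> np.ndarray:
    """Return (x, y, depth) in normalised device coordinates."""
    clip = view_proj @ np.append(vertex, 1.0)   # homogeneous matrix-vector multiply
    return clip[:3] / clip[3]                   # perspective divide

view_proj = perspective(np.radians(60), 16 / 9, 0.1, 100.0)  # identity view assumed
print(project_vertex(np.array([1.0, 2.0, -5.0]), view_proj))
```

The GPU does exactly this multiply-and-divide for every vertex, which is why dedicated circuitry pays off.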

"Per fragment": if you were rendering to the screen, you could think of this as per-pixel, but you could rerender each pixel many times to anti-alias, or you could render to a file, or do 3d tricks like Valve's Portal by rendering alternative views elsewhere, first.

Once you've converted your triangles into what they look like on the screen, you need to colour them in. Old-school 3d graphics (late '80s) might use a single colour for each one, but we've come to expect more. Texture rendering is easy maths: load the texture from memory, interpolate how far along the texture the part of the triangle you want is, and put that colour on the fragment you're rendering. General-purpose CPUs can do this, but rendering the entire world takes a lot of time. As a bonus, each "fragment" is a separate bit of maths, so lots of "pixel shaders" (specialised GPU circuits) can work on fragments separately and pool their results for speed.
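A minimal Python/NumPy sketch of that texture-interpolation step (mine, not from the comment): bilinear sampling of a texture at fractional (u, v) coordinates. The 2x2 texture and the coordinates are made-up values; a real GPU does this in fixed-function hardware, not in Python.

```python
# Toy "per fragment" texturing: sample a texture at fractional (u, v) with
# bilinear interpolation - the "interpolate how far along the texture" step.
import numpy as np

def sample_bilinear(texture: np.ndarray, u: float, v: float) -> np.ndarray:
    """Return an interpolated texel colour for texture coords in [0, 1]."""
    h, w = texture.shape[:2]
    x, y = u * (w - 1), v * (h - 1)
    x0, y0 = int(np.floor(x)), int(np.floor(y))
    x1, y1 = min(x0 + 1, w - 1), min(y0 + 1, h - 1)
    fx, fy = x - x0, y - y0
    top = (1 - fx) * texture[y0, x0] + fx * texture[y0, x1]
    bottom = (1 - fx) * texture[y1, x0] + fx * texture[y1, x1]
    return (1 - fy) * top + fy * bottom

# Illustrative 2x2 RGB texture: red, green / blue, white.
texture = np.array([[[255, 0, 0], [0, 255, 0]],
                    [[0, 0, 255], [255, 255, 255]]], dtype=float)
print(sample_bilinear(texture, 0.25, 0.75))  # one fragment's colour
```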

GPUs have really, really, fast local memory compared to CPUs. It's optimised for reading, not writing, as textures don't need to be changed that often. General purpose CPUs need to check whether multiple cores are using the same memory, as they are quite likely to write and change it in general operation, slowing things down. Also, GPUs have specialised circuits for texture maths, and only this maths, which lets them do it faster.

In addition, some other fragment tricks (e.g. if you blend in a bit of grey with increasing distance, you get a "fog" effect; if you add some white depending on the angle of the triangle, it looks shiny and reflective) can have specialised bits of circuitry on the GPU. Not hard maths, but you can do it faster if that's all you can do.
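For the fog trick specifically, here's a tiny illustrative sketch (exponential fog; the fog colour, density, and distances are arbitrary values I've picked, not from any real engine):

```python
# Toy fragment "fog" blend: mix the fragment colour toward grey with distance,
# the kind of cheap per-fragment trick described above.
import numpy as np

FOG_COLOUR = np.array([128.0, 128.0, 128.0])  # mid grey

def apply_fog(colour: np.ndarray, distance: float, density: float = 0.05) -> np.ndarray:
    """Exponential fog: more distant fragments blend further toward the fog colour."""
    fog_factor = np.exp(-density * distance)        # 1 = no fog, 0 = all fog
    return fog_factor * colour + (1 - fog_factor) * FOG_COLOUR

red = np.array([255.0, 0.0, 0.0])
for d in (1.0, 10.0, 50.0):
    print(d, apply_fog(red, d))
```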

GPUs now also tend to have provision for managing your triangle lists and whatnot which lets them crank out the maths faster than if they had to wait on the CPU.

That clear things up?
