Comment Re:This is a sad day for the tech world (Score 1) 1027

That's basically what Apple does, though. Pay $100/year for developer status and run whatever damn program you want on your iPhone. As taxation for developer status goes, that's an incredibly low sum - game console development runs in the thousands per seat and the math on keeping MSVC++ up to date works out to far more than that per year (admittedly MS has a passable free-for-hobbyists option in Visual Studio Express).

Sure, it's not "free" and that tends to chafe the hardcore open source nerd set, but if you can afford an iOS device in the first place, the gateway to tinkering shouldn't be particularly onerous.

Comment Re:OpenGL a thing of the post (Score 2) 98

Either it's a troll, or loufoque is a bit detached from reality - but this does bring up an interesting point: a lot of what people are looking into these days in terms of rendering is voxels drawn using polygons. Minecraft? Those tiles are basically voxels rendered as uniform convex hulls, which lends itself to some amazing efficiency.
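To make the efficiency point concrete, here's a toy sketch (mine, not anything from Minecraft's actual code): with a uniform voxel grid, the only faces you ever need polygons for are the ones bordering an empty cell, so interior geometry costs nothing.

```python
# Sketch: why Minecraft-style voxel grids render cheaply - only faces
# bordering an empty cell need polygons; interior faces are culled for free.

def exposed_faces(solid):
    """Count renderable faces in a set of solid voxel coordinates.

    solid: set of (x, y, z) integer tuples marking filled cells.
    A face is emitted only when the neighboring cell is empty.
    """
    neighbors = [(1, 0, 0), (-1, 0, 0), (0, 1, 0),
                 (0, -1, 0), (0, 0, 1), (0, 0, -1)]
    count = 0
    for (x, y, z) in solid:
        for (dx, dy, dz) in neighbors:
            if (x + dx, y + dy, z + dz) not in solid:
                count += 1
    return count

# A lone cube exposes all 6 faces; two adjacent cubes hide the shared pair.
print(exposed_faces({(0, 0, 0)}))             # 6
print(exposed_faces({(0, 0, 0), (1, 0, 0)}))  # 10
```

A solid 3x3x3 chunk of 27 voxels needs only 54 faces, not 162 - that's the kind of win the comment is gesturing at.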

This is even more interesting from a technical perspective - stretching isosurfaces across voxel terrain to create a truly malleable world.

Comment Re:Feature Bloat? (Score 2) 98

I'm a newbie at this stuff, but here goes:

"single-rendering-pass order-independent transparency" - let's say I have three translucent objects at roughly the same depth, with parts of object A in front of and behind parts of the others (and maybe the same is true for objects B and C as well). Figuring out the correct draw order is absolute fucking murder, and there still isn't a generalized approach for anybody but the most advanced of the most advanced - things like dual depth peeling, or building convex hulls out of all the translucent geometry in the scene. Core API support for dealing with this issue would be a godsend and is about 10 years overdue for ALL graphics APIs.

Neat fact: the PowerVR-based GPU used by the iPhone/iPad uses a tile-based rendering method in which (I am told) this problem generally doesn't arise.
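For a feel of how an order-independent approach can work at all, here's a toy resolve pass in the spirit of weighted blended OIT (grayscale, single pixel, made-up weights - a sketch of the idea, not any API's actual implementation). The trick is that sums and products are commutative, so the result can't depend on draw order:

```python
# Sketch of a weighted-blended order-independent transparency resolve.
# Grayscale, one pixel, illustrative weights - not production code.

def composite_wboit(fragments, background=1.0):
    """Resolve translucent fragments without sorting them.

    fragments: list of (color, alpha, weight) tuples.
    accum and revealage are built from commutative sums/products,
    so any draw order of the fragments yields the same pixel.
    """
    accum = sum(c * a * w for (c, a, w) in fragments)
    accum_w = sum(a * w for (c, a, w) in fragments)
    revealage = 1.0
    for (_, a, _) in fragments:
        revealage *= (1.0 - a)  # how much background still shows through
    avg = accum / accum_w if accum_w else 0.0
    return avg * (1.0 - revealage) + background * revealage
```

It's an approximation (the per-fragment weights stand in for true depth ordering), which is exactly why people keep asking for real core-API support instead.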

"Capturing GPU-tessellated geometry and drawing multiple instances of the result of a transform feedback to enable complex objects to be efficiently repositioned and replicated;" Easier to quickly render massive crowds, forests, and procedural cities.
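A CPU-side caricature of what that buys you (my sketch, not the GL API): the template mesh is captured once, and each instance contributes only its per-instance data instead of a full copy of the geometry.

```python
# Sketch of instanced replication: one captured template mesh,
# many cheap per-instance placements (here just 2D translations).

def instantiate(base_vertices, instance_offsets):
    """Replicate one template mesh at many positions.

    base_vertices: list of (x, y) tuples for the template.
    instance_offsets: list of (dx, dy) per-instance translations.
    Real instanced rendering keeps base_vertices on the GPU and
    applies the per-instance transform in the vertex stage.
    """
    return [(x + dx, y + dy)
            for (dx, dy) in instance_offsets
            for (x, y) in base_vertices]
```

A forest of 10,000 trees submits one tree plus 10,000 offsets, rather than 10,000 trees - hence the crowds and procedural cities.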

"Modifying an arbitrary subset of a compressed texture, without having to re-download the whole texture to the GPU for significant performance improvements;" Shaders not requiring four fucking separate mask textures all dancing on the head of a pin to pull off a simple effect? Yeah, I'll take that. Could probably also have some nice gains in procedural content variation.
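The reason partial updates matter for compressed textures specifically: DXT/S3TC-style formats store the image as independent 4x4-pixel blocks, so only the blocks a dirty rectangle touches ever need to cross the bus. A quick sketch of that block math (mine, illustrative only):

```python
# Sketch: which 4x4 compression blocks does a dirty pixel rect touch?
# DXT/S3TC-style formats compress independent 4x4 blocks, so these are
# the only blocks a partial update needs to re-send to the GPU.

BLOCK = 4

def blocks_to_upload(x, y, w, h):
    """Return the set of (bx, by) block coordinates covering a dirty rect.

    Without partial updates, changing one pixel means re-uploading the
    whole texture; with them, only these blocks move.
    """
    bx0, by0 = x // BLOCK, y // BLOCK
    bx1, by1 = (x + w - 1) // BLOCK, (y + h - 1) // BLOCK
    return {(bx, by)
            for bx in range(bx0, bx1 + 1)
            for by in range(by0, by1 + 1)}
```

A 2x2 scribble straddling a block corner touches four blocks; the other thousands in a large atlas stay put.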

"Packing multiple 8 and 16 bit values into a single 32-bit value for efficient shader processing with significantly reduced memory storage and bandwidth, especially useful when transferring data between shader stages." Massive performance gains for any sort of post-processing work, basically.
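Here's what that packing looks like in plain Python (a sketch of the idea behind GLSL-style packUnorm4x8, not the spec's exact rounding rules): four normalized floats collapse into one 32-bit word, quartering the bandwidth between shader stages.

```python
# Sketch of packing four normalized [0,1] floats into one 32-bit word,
# in the spirit of GLSL's packUnorm4x8 (exact spec rounding not claimed).

def pack_unorm4x8(r, g, b, a):
    """Pack four [0,1] channels into a single 32-bit integer."""
    def to_byte(v):
        return int(round(max(0.0, min(1.0, v)) * 255.0))
    return (to_byte(r)
            | (to_byte(g) << 8)
            | (to_byte(b) << 16)
            | (to_byte(a) << 24))

def unpack_unorm4x8(word):
    """Inverse: recover the four normalized channels."""
    return tuple(((word >> shift) & 0xFF) / 255.0
                 for shift in (0, 8, 16, 24))
```

One 32-bit fetch instead of four - which is why post-processing chains, with their many small intermediate values, benefit most.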

Comment Re:Feature Bloat? (Score 2) 98

They did that already. As of OpenGL 3.1, the only non-deprecated rendering method is vertex buffer objects.

There are a lot of things OpenGL could do to make itself more accessible: better-supported cross-platform utility libraries, three or four shortcut commands that set the various glEnable() states that 95% of new developers actually care about, streamlining the eyebrow-raising pile of mipmap generation options, and massively simplifying the entire process of setting up a vertex buffer object...

Honestly, what OpenGL needs isn't fewer features, but rather for the features most people want to use to be placed front and center with extremely simple, well-documented data formatting rules and optimized, efficient helper functions. Microsoft might have been Slashdot's Great Satan for a long time, but they do listen to the sort of developers they're hungry for, and DirectX is one of the better examples of that.
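As an example of the "simple, well-documented data formatting rules" part: the fiddliest bit of VBO setup is usually interleaving vertex attributes into the exact byte layout the GPU expects. A small sketch of what a helper for that could look like (my own illustration - not an existing OpenGL or GLUT function):

```python
# Sketch of the kind of helper the comment wishes OpenGL shipped:
# interleave per-vertex attributes into one byte buffer with the
# layout a VBO upload expects (pos: 3 floats, normal: 3 floats,
# uv: 2 floats; stride 32 bytes).

import struct

def interleave_vertices(positions, normals, uvs):
    """Pack parallel attribute lists into one interleaved byte buffer."""
    buf = bytearray()
    for p, n, t in zip(positions, normals, uvs):
        buf += struct.pack("<3f3f2f", *p, *n, *t)
    return bytes(buf)
```

Getting the stride and offsets wrong here is a classic newbie failure mode, which is exactly why front-and-center helpers would lower the barrier.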

Comment Re:Ugh (Score 5, Insightful) 334

The basic issue at hand is that the majority of people don't have time for anything more than "it just works." What they want is appliance computing, and that's what App stores enable. This is the reason Apple has had so much success lately, and why they won't ever be loved by Slashdot. Personally, I'm happy to roll my own OpenBSD kernels for my media server and firewall at home, but when it comes to my phone I'll take Steve Jobs' walled garden. I don't have the time for anything else, and I really need my phone to "just work".

True general-purpose computing exists on the desktop and will continue to do so - but the consequences of that model will be continued security issues far in excess of the walled garden's, compatibility issues due to a functionally infinite number of hardware configurations to support, and abandonment by any developers unwilling to tolerate piracy/off-label usage of their applications [some might say 'good riddance' to the latter, but there's an awful lot of money and talent in that pool that will be spent making the walled gardens more attractive].

As far as the open source and freedom-to-code communities go, they can either approach this with ineffectual wailing and gnashing of teeth, or they can resolve to make this work for them. How? By building compelling services that are free-as-in-speech on general-purpose computers, and charging nominal fees for viewers targeting closed platforms, the proceeds from which are used to fund further development. I suspect we're about to witness a period of brutal natural selection in which the greater software ecosystem culls out those who refuse to embrace and leverage the new environment.

We'll find out, either way.
--Ryv

Comment Re:ROI (Score 3, Insightful) 854

The problem with that line of argument, which I'm sympathetic to personally, is that the rough numbers I'm describing are (give or take 5%) reflected across every major FPS/action title in the past several years.

Quality and engaging stories are critical to good base sales and customer satisfaction, but you'd be surprised by how little impact they have on player completion rates.

The solution taken by the better studios in the industry - and I apologize, as judging from the responses I seem to have presented my point poorly - is not to phone in the ending, but rather to shorten the experience while maintaining consistent quality throughout.

I think a lot of people don't realize that the levels you see in, say, Modern Warfare 2 cost literally millions of dollars to make, and the debate regarding optimal running time is still very much in progress.

--Ryv

Comment ROI (Score 5, Informative) 854

(All opinions expressed herein may not reflect the views of my employer - in fact, we try to avoid falling into this trap - but it's a pretty prevalent attitude in the industry right now.)

I work as a game designer on big-budget shooters for a living, so here's my take:

Game companies are consciously making the decision to do this for two reasons:
1) Easier games have broader markets: by increasing the likelihood and rate at which the user receives validation, we increase sales. And, much more importantly:

2) It's unusual for more than 50% of the people who beat the first level of your game to beat the last level. Money spent on later levels is generally money wasted, and shortening the experience altogether is a function of the increasing development cost per hour of gameplay and ROI of even having more than 10 hours of content at all. If 95% of the people who bought the game complete the first level (as tracked by developers through achievement systems) but only, say, 35-40% finish the game, that necessarily influences how you invest your limited development funds.
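The budgeting logic falls straight out of the arithmetic. With made-up but representative numbers (not figures from any actual title), dividing each level's cost by the fraction of buyers who ever reach it shows how fast late content gets expensive per player who actually sees it:

```python
# Illustration of the ROI argument with invented numbers: equal spend
# per level, but a thinning audience makes late levels cost far more
# per player who actually reaches them.

def player_weighted_cost(level_costs, completion_rates):
    """Dollars of content spend per player reaching each level.

    level_costs: dollars spent building each level.
    completion_rates: fraction of buyers who reach each level.
    """
    return [cost / rate for cost, rate in zip(level_costs, completion_rates)]

costs = [2_000_000] * 4               # hypothetical flat $2M per level
rates = [0.95, 0.75, 0.55, 0.35]      # hypothetical drop-off curve
print(player_weighted_cost(costs, rates))
```

Under those invented numbers the final level costs nearly three times as much per reached player as the first - so either the ending gets cheaper, or the game gets shorter.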

--Ryv

Comment Re:Meh (Score 1) 348

Speaking as someone who builds his own machines and rolls his own BSD kernels... the MacBook Air is pretty awesome for certain uses. Specifically: it is, by far, the best subway-commute laptop I've ever had. Perfect balance of screen/keyboard size, extremely low weight, and it runs Minecraft wonderfully smoothly (especially if you install a third-party SSD in lieu of Apple's traditionally slow ones). Even after two years of extremely heavy usage, it has more than enough battery life for Minecraft at max screen brightness, or just coding in Eclipse, for the 45-minute commute and return trip.

That having been said, the latest upgrade is a disappointment due to the identical processor. I would happily pay full price for a new one, right now, if they'd tolerate .15" greater thickness and .2 pounds greater weight to give the thing a *real* heatsink and fan. The cooling issues mean you get about 15-30 seconds of 720p YouTube video before the stuttering kicks in - obviously that's not going to be an issue on the limited connectivity of a subway commute, but it's unacceptable at home.

It's not perfect, and it's not for everyone, but within certain niches it really shines. It's also probably the closest you can get to an iPad that you can code on without rolling your own iOS IDE.

--Ryv

Comment Re:Glass, glass everywhere (Score 3, Informative) 324

I've already conducted this test twice, unintentionally, with the new iPhone, sans bumper (I generally use one, so during two separate incidents I butterfingered the new glass). Two six-foot falls onto marble with zero protection, both times landing flat, face down, not on an edge. Not so much as a scratch either time.

The plural of anecdote is not data, but after my experiences I'm somewhat skeptical of any claims about reduced fracture strength with the new glass. It's difficult to imagine a worse scenario that still falls within the confines of everyday wear-and-tear.

--Ryvar

Comment Re:Interested to know... (Score 2, Interesting) 282

It's *possible* that the very slight short circuit of a user's palm is playing havoc with the frequency calibration system. This would also neatly explain why people are more often reporting that the signal gradually falls off over several seconds rather than instantly.

If that's the case, then Apple *might* be able to retool the frequency calibration code to ignore the mild short circuit.

In all likelihood, the answer is to ship all future iPhone 4s with a very thin layer of clear resin (nail polish works wonders on the existing ones) over the external metallic surfaces.

--Ryvar

Comment Re:The mac (Score 5, Insightful) 253

Honestly, what Apple have done isn't so much listening to developers' requests as it is fulfilling those requests to the greatest extent possible *without compromising user experience*.

Not compromising user experience, even potentially, appears to be their guiding principle, and it's served them well. Slashdot will never love Apple because Slashdot isn't the target market. I, like a lot of people who swear by the iPhone, actively want appliance computing when it comes to a smartphone. I actively want the walled gardens of the Xbox 360, PS3, App Store, Wii, and even Steam, because these things substantially reduce malware and/or cheaters. I understand that it is fundamental to the basic principles of a Turing machine that they can never eliminate these things (i.e. virtual machines, etc.), merely reduce them to a level unlikely to affect me. But in practice that's all I need, much like how in practice I only *need* 256-bit TLS for securing online purchases.

The antagonism seen towards Apple on Slashdot is due to the fact that it's an explosively growing market segment that isn't targeted at the core Slashdot demographic. It implies that the world is moving on from them, and nobody likes to hear that.

--Ryv

Comment Re:All that negativity about the IPhone (Score 3, Interesting) 484

It's a good phone, but it's not made for tinkering with, which is going to prompt a lot of hate on a site whose primary demographic is people who love to tinker with things.

As an iPhone developer I'm very happy with Apple's walled garden, but maybe this is because my 9-5 is game development, where all the biggest platforms are walled gardens. I get an industry standard cut of the profits, there's a minimum of casual piracy of my work, the development environment is first rate and extremely cheap ($100! Mind-bogglingly cheap to someone who comes from an industry where engine licenses run in the low millions, and the standard 3D modeling package is $3500), and the hardware platform is standardized enough to make it easy to work with.

I can't imagine trying to develop for Android, where the hardware is going to be all over the place. That's all well and good for beefy PCs, but for an embedded system? How could you possibly optimize sufficiently for a multi-target mobile platform and still turn software around quickly enough to be profitable?

Ultimately people's preferences are going to reflect how and why they use their phones, and for developers it will reflect their target demographic. Slashdot will never love the iPhone because it isn't *for* them, which suggests that they aren't the most important people out there - and that's a message nobody likes to receive.

--Ryvar
