On the Subject of OpenGL 2.0
zendal writes "The danger with pixel shaders and vertex shaders is that there is no standard for programmability of graphics hardware. A schism has formed within DirectX between the competing demands of GPU makers Nvidia and ATI. Noted analyst Jon Peddie gives THG an exclusive first look at a White Paper on how OpenGL 2.0 is trying to bring stability and open standards to programmable graphics and GPUs."
The Standard is always long to come (Score:5, Interesting)
It was the same with SuperVGA (took about 2 years), Internet protocols (still ongoing; the W3C is struggling for standards), and now OpenGL and DirectX.
OpenGL 2.0 seems pretty much like the definitive solution...
Re:The Standard is always long to come (Score:3)
Re:The Standard is always long to come (Score:1)
"The choice of APIs used was guided by the criterion of simplicity, stability and portability. In that spirit, OpenGL is used for 3D rendering what enables high compatibility with all OpenGL compliant 3D cards (every usable 3D accelerator nowadays) and high stability due to OpenGL's long tradition of quality."
Croteam uses OpenGL.
Re:The Standard is always long to come (Score:1)
That makes their planned Linux port of Serious Sam 2 more feasible. (It is planned - beat the Windows version and you'll see somebody credited for it in the... uh... credits.)
Re:The Standard is always long to come (Score:2)
Linux port of SS2? Didn't know about that one. I know Ryan Gordon (formerly of Loki) is doing the port of the first Serious Sam game, with the full support of Croteam. Details are hard to find, but take a look at http://icculus.org/cgi-bin/finger/finger.pl?user=icculus&section=ssam [icculus.org]
Re:The Standard is always long to come (Score:1)
Croteam uses OpenGL, but has also added DirectX8 support since.
Re:The Standard is always long to come (Score:2)
Re:The Standard is always long to come (Score:5, Interesting)
Rarely do you see something on Slashdot that contains as much truth as that statement. Microsoft focuses their best development efforts on free products designed to crush other people's standards. OpenGL has been a continuing thorn in their side, and their ferocious work on Direct3D is aimed at obtaining the complete dominance they're used to in the gaming market. John Carmack has (almost singlehandedly) prevented them from doing this, and the ensuing competition has left consumers and game developers with... two really good standards. I could almost feel good about this, if it weren't for the fact that id is competing with a monopoly, and is succeeding only because they remain privately owned and hold their market presence through sheer programming prowess.
If only we had someone like Carmack to write Office software for Linux.
Re:The Standard is always long to come (Score:2)
Makes you wonder if it's just a matter of time before there are extensions to D3D that make something like the Quake engine redundant as a licensed technology. The quality of the Quake games has been less and less important compared to the Quake-engined games. Look at Half-Life/Counter-Strike, Medal of Honor, Day of Defeat, etc. They are the real games, not Quake III Team Arena. Doom 3 might be a great seller, but I don't see id veering from their pattern of game design...
Hopefully the federal government will be able to keep enough control on MS to keep such a thing from happening... who knows.
Re:The Standard is always long to come (Score:1)
Re:The Standard is always long to come (Score:1)
Re:The Standard is always long to come (Score:1)
I guess it depends on what you mean by professional. I was also referring to Max, which uses both OpenGL and Direct3D, although it also includes its own custom Heidi driver. And last time I checked (which was quite a long time ago), Martin Hash's 3D applications had switched to Direct3D, but again I don't know if you consider those professional either. Maya, Softimage, Lightwave, and Houdini of course are OpenGL based. There are also a lot of smaller applications I wouldn't label consumer (they are expensive and not useful for consumers) but more professional that use Direct3D, but they are almost not worth mentioning.
> And they don't much care about OpenGL 2.0, because they have little use for pixel and vertex shaders.
Actually, I'm seeing more game developers moving to Maya, and if that is the case, you will see a lot more interest in using the 2.0 features for developing game content. It's not so important for feature-film kind of work, but for game development it is important. Newtek may be interested in it; in some of their newer versions of Lightwave (6.x and up) they were demonstrating a lot more OpenGL features and doing tricks that game developers use in their editors, more so than I have seen from other packages. It's nice to see lens flares and fog when you're animating a scene, instead of having to render a few frames to see what it looks like.
> They really just want high performance fixed-function transformation (no, your new GeForce 4 can't compete well here), fast Gouraud-shaded triangles and blazingly quick antialiased lines.
Last I heard, the Wildcat line of cards was pretty dominant in this area. But at the same time it depends on what you are doing: game development, TV, or film. For film they usually want the best quality and will want something that can push the detail levels really high. TV and game cinematics sit a little lower, although some push as hard as some of the low-end films. With the real-time aspects of games, you're dealing with 800 polys for low-end and 80k for high-end characters (depending on the platform and/or hardware), and just about any consumer card can handle that.
Re:The Standard is always long to come (Score:1)
~Eric
Absolutely (Score:3, Insightful)
It's especially problematic for graphics standards. GKS, which was hideously based on ideas from pen plotters, still dominated much of the 1980s. OpenGL has been pretty good, but it's stuck in some 1980s ideas. For example, the strict ordering of primitives makes sense in a world of bit blasters where double-buffering and Z-buffering are expensive, but it makes little sense on modern hardware and, even worse, makes it impossible to implement some of the better modern techniques (such as hierarchical global scanline algorithms). Some day, we're going to have systems that cost less than a million dollars that can do real-time ray-tracing, radiosity, and other solutions of the Rendering Equation.
Not really (Score:2)
Actually, OpenGL (which was very much an almighty standard) was famous for being very forward-looking. Rather than ratifying commonly-available functionality, it defined a standard for hardware companies to aspire to and work towards. The actual hardware came later; for a while, many OpenGL core features were available only on extremely expensive SGI big-iron.
Incidentally, OpenGL's role in defining the long-term goal for hardware has now been usurped, not by OpenGL 2.0 or D3D, but by RenderMan.
Re:Not really (Score:1)
I thought Renderman stuff was done primarily in software using render farms. Could you explain what you mean? I'm curious.
-Kevin
Re:Not really (Score:2)
Re:DirectX and speed (Score:3, Funny)
(methinks the editor left in a few comments)
Re:DirectX and speed (Score:1, Troll)
ADD SOMETHING INTELLIGENT
^H^H^H^H
blib
...
Damn the lameness filter! It wouldn't let me post this comment because it had too many allcaps. Maybe this rant at the end will get it through...despite making my posting even more lame.
What? Again... (Score:1)
Is it me or is this a standard marketing technique by a company that wants domination?
interesting point at the end (Score:5, Insightful)
A most interesting point is right at the end of the article:
One of the key points stressed by the ARB is that the "open" needs to go back into OpenGL. The group has pledged that all ideas submitted for OpenGL, if adopted, are then open for use and not licensable as IP.
So, they won't pull a "Rambus" here... hopefully.
Re:interesting point at the end (Score:4, Insightful)
And the next sentence is an important explanation:
what we're seeing is a recognition
W3C take note! The same goes for internet standards. If it can't be used by everyone, then it's not a standard, it's proprietary. Anyone who wants to make/use something proprietary is free to do so, but standards bodies should never impose them.
-
Re:interesting point at the end (Score:1)
The problem with OpenGL on Windows... (Score:3, Informative)
...has always been that driver support is buggy. nVidia is notoriously bad at this; their DirectX drivers are quite stable, but OpenGL blue screens left and right (especially with a lot of detail in the scene graph). I always wondered why they even bothered to include OpenGL support in their drivers, although I suppose with such a major standard they have pretty much no choice.
Now, with OpenGL 2.0, if they have to support three different APIs, isn't driver quality going to suffer even more? Oh well, ATI has been getting a lot better recently, I guess we can always switch to them. :-)
---Crash Windows XP with just a simple printf! [zappadoodle.com]
Re:The problem with OpenGL on Windows... (Score:3, Interesting)
THEN, maybe Microsoft and the OpenGL group can try to come to terms and maybe bring DirectX compatibility closer to OpenGL (or vice versa) and have a single standard.
It sounded like Microsoft wanted to come into compliance with OpenGL before, but dropped it because the OpenGL group moved too slowly. (Insert your own M$ conspiracy theory here, but I suspect they really honestly tried, and if a good opportunity arose, would come back and try again.)
Re:The problem with OpenGL on Windows... (Score:3, Insightful)
always wondered why they even bothered to include OpenGL support in their drivers
If I were a video card driver writer, I'd be concentrating on making the existing one more stable instead of contemplating why we even bother producing it. Doesn't the PS2 (PlayStation) use OpenGL? Macs use OpenGL. What do SGI machines use? DirectX may be most popular among Windows games programmers, but it's not the only industry standard worth following. If software developers for PCs had concentrated purely on DirectX, we would never have had some of the amazing games that originated in the console market, since consoles never had DirectX support to start with.
Re:The problem with OpenGL on Windows... (Score:1)
I've been under the impression that video cards come out so frequently now that almost no time is put into the drivers. After the driver is released, some minor bug fixes may happen, but that's all. The majority of the work then begins on the new card that will be just around the corner...
Re:The problem with OpenGL on Windows... (Score:2)
Re:The problem with OpenGL on Windows... (Score:2)
Re:The problem with OpenGL on Windows... (Score:4, Informative)
"Blue Screens" are caused by a fault in the Kernel or something writing to memory it's not meant to be writing to.
This is almost correct. Blue screens are caused by a fault in *kernel mode* (Ring 0 on Intel architecture), which is not equivalent to "in the kernel." WDM drivers [amazon.com] (like the nVidia graphics drivers), as well as all NT drivers [amazon.com] and in fact the entire USER and GDI subsystem [amazon.com] (since NT4), run in kernel mode. None of these components are technically the kernel. Btw, wild pointer writes are a kind of "fault in kernel mode."
Assuming normal user processes can only write to their own memory space, then it is a fault of the kernel.
No argument there. See also this page [zappadoodle.com]. But as I already pointed out, the nVidia driver runs in kernel mode, not user mode, so this argument is not relevant.
Sure, Open GL might be buggy,
OpenGL can't be buggy, it's just a specification. nVidia's implementation is buggy, like I said. This is especially apparent considering that the blue screen errors have the name of nVidia's kernel mode driver in them.
but it's your Windows kernel that's causing the blue screen.
Again, confusing "the Kernel" with "kernel mode." Hey, I hate Windows as much as the next guy, but that's no reason to post incorrect technical information about it and hope nobody will realize you're blowing smoke out your ass. Next time, do a little more research [amazon.com] first.
---Windows 2000/XP stable? safe? secure? 5 lines of simple C code say otherwise! [zappadoodle.com]
Re:The problem with OpenGL on Windows... (Score:3, Informative)
That's true, but I already admitted I was wrong about that completely irrelevant (to the original post) detail in this reply [slashdot.org] to this helpful comment [slashdot.org]. Thanks for pointing it out again though, it *was* stupid of me to post that and then bitch about someone else doing the exact same thing later in the thread.
---Windows 2000/XP stable? safe? secure? 5 lines of simple C code say otherwise! [zappadoodle.com]
Re:The problem with OpenGL on Windows... (Score:1)
No, it's straight to the metal. You set registers on the GS, and DMA packets to the GIF.
Now, there is a high-level library called ps2gl, and it is OpenGL-like (similar to Mesa); i.e., most OpenGL source code will compile without any problems.
The problems with ps2gl are two-fold:
a) some OpenGL features will *always* be dog slow on the PS2, so they aren't supported (and probably never will be), e.g. the lack of a stencil buffer
b) Only the basic OpenGL commands are implemented / supported.
I'm not aware of any shipping games using ps2gl. It's much faster (performance-wise) to just write custom VU1 TnL microcode.
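For the curious, "basic OpenGL commands" here essentially means classic immediate-mode GL. A trivial sketch (plain OpenGL 1.x, nothing ps2gl-specific; just the sort of code an OpenGL-like wrapper aims to accept unchanged):

    #include <GL/gl.h>

    /* Classic immediate-mode drawing: one Gouraud-shaded triangle.
       Assumes a current GL context set up by the windowing layer. */
    void draw_triangle(void)
    {
        glBegin(GL_TRIANGLES);
        glColor3f(1.0f, 0.0f, 0.0f); glVertex3f(-1.0f, -1.0f, 0.0f);
        glColor3f(0.0f, 1.0f, 0.0f); glVertex3f( 1.0f, -1.0f, 0.0f);
        glColor3f(0.0f, 0.0f, 1.0f); glVertex3f( 0.0f,  1.0f, 0.0f);
        glEnd();
    }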
Summary:
If your game runs exclusively on PCs (Windows, Mac, Linux), then OpenGL is great, since you don't need to abstract the rendering layer.
If your game runs on PCs and one (or more) consoles, OpenGL, sadly, isn't any advantage, since OpenGL support on consoles is terrible. (OpenGL on Xbox? Yeah, right.)
-=-=- Posting as an Arrogant Coward* so I don't potentially invalidate any NDAs -=-=-
* Yes I know AC really means Anonymous Coward, but from half of the posts around here, it sure doesn't seem like it.
Doesn't the PS2 (playstation) use OpenGL? (Score:1)
Apparently the GameCube libraries look remarkably similar to OpenGL, though.
Re:The problem with OpenGL on Windows... (Score:1)
Nvidia's OpenGL drivers are my "gold standard", and it has been quite a while since I have had to report a problem to them, and even their brand new extensions work as documented the first time I try them. When I have a problem on an Nvidia, I assume that it is my fault. With anyone else's drivers, I assume it is their fault. This has turned out correct almost all the time. I have heard more anecdotal reports of instability on some systems with Nvidia drivers recently, but I track stability separately from correctness, because it can be influenced by so many outside factors.
Read the rest of the plan for yourself to see his comments on ATI's drivers.
Note the comment about anecdotal reports of instability.
Re:The problem with OpenGL on Windows... (Score:1)
I admit, I haven't played my "blows up every time" test case (Descent 3) for a couple months, so it's possible the newest drivers actually won't crash. The DirectX component of the nVidia drivers is definitely rock solid. Looks like it's time to download the newest ones and try again :-) Thanks for the link btw.
Re:The problem with OpenGL on Windows... (Score:1)
Re:The problem with OpenGL on Windows... (Score:1)
The nvidia drivers have their bugs, sure, but in my experience (graphics programmer), blue screens are very rare. Perhaps something is wrong with your hardware.
Also, OpenGL has no scene graph. I think you don't know what you're talking about.
Re:The problem with OpenGL on Windows... (Score:1)
Wrong.
Could you please be more specific? I'm not wrong about it bluescreening, that's for sure. :-)
OpenGL has no scene graph. I think you don't know what you're talking about.
You got me there. I'm not a graphics programmer by any stretch of the imagination, I must have confused it with something else. This isn't relevant to my point about the drivers crashing though. I apologize for misusing a technical term, all I meant was that it tended to crash more when the scenes got more complex.
Perhaps something is wrong with your hardware.
Based on the number of comments in this thread insisting there must be something wrong with my hardware, I think I'll have to go with the majority opinion. There's probably something wrong with my hardware. Probably the power supply, actually, I just upgraded it, I'll have to try those nVidia OpenGL drivers again. :-)
Windows 2000/XP stable? safe? secure? 5 lines of simple C code say otherwise! [zappadoodle.com]
Re:The problem with OpenGL on Windows... (Score:5, Informative)
What exactly leads you to say otherwise? Presumably personal experience, rather than just a desire to trash nVidia, but compared to what? Given that 3D game luminaries have repeatedly stated they prefer nVidia's OpenGL drivers to those from ATI or (shudder) Matrox, that really only leaves the few remaining "professional space" vendors (SGI, 3DLabs), and I can't imagine they're universally perfect either.
Perhaps your perspective needs widening? Or perhaps you're running into the same bug over & over and have not bothered to notify nVidia about it? (or perhaps they just think it too isolated a case to get a high priority)
Re:The problem with OpenGL on Windows... (Score:3, Interesting)
I have heard bad things about nVidia on Linux. Part of the problem was a bug in certain AMD motherboards that got fixed in the kernel two months ago. (AMD motherboards in the sense that they work with AMD CPUs; AMD doesn't make motherboards itself.) I think the problem was probably publicized more because people don't like the closed-source driver.

I don't remember hearing bad things about nVidia on Windows.
Re:The problem with OpenGL on Windows... (Score:1)
I'm still using the 21.83 drivers and will continue to do so until I hit a problem in a game that is fixed by a later driver.
Re:The problem with OpenGL on Windows... (Score:1)
Re:The problem with OpenGL on Windows... (Score:1)
So developers really wouldn't be supporting three different APIs as you might suspect.
Re:The problem with OpenGL on Windows... (Score:1)
This puzzles me. Aren't there a large number of engineers originally from SGI at nVidia (e.g. Mark Kilgard and the like) who used to be the original OpenGL "gurus"? You would think they'd know OpenGL cold?
Re:The problem with OpenGL on Windows... (Score:1)
You would think they'd know OpenGL cold?
That seems to be the consensus here. Maybe it's a hardware problem, not a software problem. But then, why doesn't DirectX crash?
---Windows 2000/XP stable? safe? secure? 5 lines of simple C code say otherwise! [zappadoodle.com]
Re:The problem with OpenGL on Windows... (Score:2)
Total FUD. NVIDIA has been quite good about OpenGL support. I've played several of the first-person shooters (Quake, SoF etc.) using NVIDIA hardware and haven't had a single OpenGL related problem. Also, a person I have a lot of contact with at work uses SolidWorks (a nice 3D CAD package for Windows) which is exclusively OpenGL display with both consumer and professional level NVIDIA cards under Win2K. He hasn't had a single card-related crash, and he has some very complex models.
You should also read some of Carmack's recent .plan updates [webdog.org]. If anyone knows how to stress an OpenGL implementation, it's him. A direct quote: 'Nvidia's OpenGL drivers are my "gold standard", and it has been quite a while since I have had to report a problem to them'. That is stellar praise from an ISV. You should also give NVIDIA credit for its great OpenGL support under Linux and MacOS X.
In short, if you're seeing blue screens using the NVIDIA OpenGL drivers, you very likely have a hardware problem. Otherwise, they are very good.
Now, with OpenGL 2.0, if they have to support three different APIs, isn't driver quality going to suffer even more?
How so three different APIs? There will be OpenGL and Direct3D. What other one do you count?
Oh well, ATI has been getting a lot better recently, I guess we can always switch to them. :-)
Good luck...and do read Carmack's .plan carefully first.
299,792,458 m/s... not just a good idea, it's the law!
Re:The problem with OpenGL on Windows... (Score:1)
extensions good. opengl good.. directx??? (Score:4, Insightful)
And we all know MS wants DirectX to rule them all. OpenGL works, and is an open standard by definition. Extensions in there make life interesting certainly, but you pretty much know what you're getting into when you try NV_texture_rectangle or NV_texture_shader. (Hint: the NV stands for nVidia.) Sure, you can find out in DirectX if the hardware supports XYZ before you call it, but I find the naming convention of OpenGL a bit more coder-friendly. It's readily obvious when you're trying something that's not supported across the specification.
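(For non-GL coders: the check is a one-liner against the driver's extension string. A minimal sketch using only core OpenGL 1.x calls; note that in the extension string the name carries a GL_ prefix, e.g. GL_NV_texture_rectangle:)

    #include <string.h>
    #include <GL/gl.h>

    /* Nonzero if 'name' appears in the driver's extension string.
       A plain substring test; a robust version would match whole
       space-delimited tokens to avoid prefix collisions. */
    int has_extension(const char *name)
    {
        const char *ext = (const char *)glGetString(GL_EXTENSIONS);
        return ext != NULL && strstr(ext, name) != NULL;
    }

    /* Later, with a current GL context:
       if (has_extension("GL_NV_texture_rectangle")) { vendor path } */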
Re:extensions good. opengl good.. directx??? (Score:3, Insightful)
Regardless, the extensibility of OGL is a double-edged sword. It really does make features board-specific, which is not the point of OGL. On the other hand, allowing these extensions does drive the future of the standard. It lets everyone throw what they've got out in public and see what sticks.

Personally, I'm glad that OGL has huge gaps between standards updates, unlike DX. After all, it *is* a standard, and should be relatively static. Each new version of the standard should be absolutely, positively a good standard. Anything missing can be used as an extension in the meantime and added to the next version 4-6 years down the road. This is the strongest point of OpenGL vs. DirectX: there is a controlling body of many companies, rather than one main controller (MS). Bringing together the experience of companies like MS, SGI, NVIDIA, ATI, etc. not only makes the standard better but adds a level of comfort for the users of the standard.
Basically, my point is that OGL standards *should* take a long time to finalize. Everyone seems to forget that standards should be able to last a long time. OGL is now on v1.2/1.3 after an evolution of around 10 years, while DX is nearing DX9 since, what, 1995 or so?
Re:extensions good. opengl good.. directx??? (Score:1)
No. (Score:2)
As an OpenGL developer, I can truly say: extensions SUCK ASS. I've been keeping up with extensions with my DemoGL library for some time now, but it's a battle you can't win. There is no consistency, no one clear API; instead there are a few: nVidiaGL (with the NV extensions) and ATIGL (with the ATI(X) extensions). Oh, and some drivers support OpenGL 1.2, others support OpenGL 1.3... Yeah, nice and all. (Not.)
And we all know MS wants DirectX to rule them all. OpenGL works, and is an open standard by definition.
OpenGL is a standard, but not 'open'. You don't have anything to say about what OpenGL will be in the next version. The ARB does, but they are also limited, since nVidia and ATI are always ahead of them, and because what they offer in proprietary extensions is _THE_ stuff to use in bleeding-edge 3D graphics, using an 'older' ARB standard is not the way to stay ahead of the pack of competitors.
Extensions in there make life interesting certainly, but you pretty much know what you're getting into when you try NV_texture_rectangle or NV_texture_shader.
Oh, do you? What if I have an ATI Radeon 8500? These extensions are not available; I have to use ATI's syntax. But what's worse: you have to rely on the card manufacturer's documentation for these extensions. I don't know if you've tried to figure out how to do cubemapping and compressed textures using nVidia's docs, but it's a pain, to say the least.
With DirectX, there is one clear manual, one clear API, and not a zillion codepaths to write to support OpenGL 1.1, 1.2, 1.3, nVidia's extensions, ATI's extensions, etc.
Re:No. (Score:2, Insightful)
Sorry, I disagree about the mess with OpenGL.
I think the OpenGL API is exceptionally clean, and generally stable. Yes, there are core code paths in OpenGL that are not accelerated by everyone, nor in every driver. However, I find the general implementation far better for developing real 3D applications. 99% of the time I am not developing for other hardware; I'm working to get a concept rolling at 15fps on whatever I'm developing on. It's the same concept John Carmack uses to develop his next-generation game engines: push the boundary till it crawls, because the timeframe it takes to refine the product into a shippable solution will unfailingly see a shift in feature support. This means you make it WORK at 15fps or even slower, then, over the next year, design code paths that optimize for existing hardware.
Both Direct3D and OpenGL require 'special' code paths to handle the tons of strange hardware out there; there is no escaping that fact. Therefore the various application code paths must be designed to switch over to alternative paths, primarily for optimization purposes.
Otis, you know from DemoGL how hard it is to handle all the tons of alternative hardware out there through OpenGL, but have you done the same with Direct3D? I find the CAPS structures in Direct3D far more difficult to write support code for, because I end up dealing with the problem of supporting various BITS of Direct3D which may kill my code dead, instead of reliably knowing that OpenGL's core code path always works, albeit slowly. Another interesting consideration: once even a hundred users demand that your application work correctly because it looks cool, expect the next driver iteration to support the code path better. ATI has been highly responsive to my requests in the past, and so has nVidia.
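(Concretely, the CAPS dance looks something like this. A hypothetical sketch with structure and flag names as I recall them from the DX8 headers, so treat the details as illustrative rather than gospel:)

    #include <windows.h>
    #include <d3d8.h>

    /* Ask the device what it can do, then test one capability bit.
       'device' is assumed to be an already-created IDirect3DDevice8*. */
    int supports_cubemaps(IDirect3DDevice8 *device)
    {
        D3DCAPS8 caps;
        if (FAILED(IDirect3DDevice8_GetDeviceCaps(device, &caps)))
            return 0;
        return (caps.TextureCaps & D3DPTEXTURECAPS_CUBEMAP) != 0;
    }

Multiply that by dozens of bits, any one of which can kill a code path dead, and you have the problem described above.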
Extensions are there mainly to optimize code paths for specific purposes. The ability to transfer frame buffer data back into texture data allows nearly everything to be accomplished in multiple passes, even if extremely slowly. Only some very specialized pixel operations absolutely require cube mapping, generalized shaders, or bump support. However, they are generally cosmetic, and should be treated as dress-up accelerated features. E.g., grab a Toy Story DVD and watch the 'behind the scenes' stuff; it's amazing to see how the visual process works. Animate simple blocky characters, apply lighting and moderate texture detail, drop in finer meshes, dress up the scene, and polish. That's how programming game engines works too.
Exclusive first look? (Score:5, Informative)
Re:Exclusive first look? (Score:2)
Some don't want Standards (Score:2, Insightful)
>stability and open standards to
>programmable graphics and GPUs
The problem is that there are quite a few companies out there that do not want open standards, because proprietary control gives them a competitive edge over other companies, end users, and organizations, even if open standards would actually help them as well. (And you know who you are.)
Lots of programmable processors (Score:4, Interesting)
A vertex processor, a fragment processor, and pack and unpack processors are going to be supported.
Re:Lots of programmable processors (Score:2)
All this will be done in software (although fragment processing is notoriously slow to do in software), but hardware already exists that does programmable vertex & fragment processing. It wouldn't surprise me if programmable pack/unpack hardware also existed on modern GPUs, and was just waiting for an API to expose it.
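(For flavor, here is roughly what driving the programmable vertex processor might look like under the proposed 2.0 object model. The entry-point names follow the shader-object interface 3DLabs has been circulating; take them as provisional until the spec is final:)

    #include <GL/gl.h>

    /* Hypothetical pass-through vertex program text. */
    static const char *vertex_source =
        "void main() { gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex; }";

    void setup_vertex_processor(void)
    {
        /* In practice the entry points would be fetched through the
           usual extension mechanism until headers catch up. */
        GLuint vs = glCreateShader(GL_VERTEX_SHADER);
        glShaderSource(vs, 1, &vertex_source, NULL);
        glCompileShader(vs);

        GLuint prog = glCreateProgram();
        glAttachShader(prog, vs);
        glLinkProgram(prog);
        glUseProgram(prog); /* subsequent draws run the programmable path */
    }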
How about a new version of X windows with OpenGL? (Score:4, Insightful)
So Xlib would have it incorporated, and it would be much faster than the current approach of building it on top of Xlib and extensions.
my bet is string based (Score:2)
You had to license the Nvidia solution from them, and the ARB did not want to get into that ugly mess. Only after ATI was given time and incentive (from MS, to do DirectX 8.1) did Nvidia change their terms, and even then it's a kludge.

I quite like the ATI solution; right from the start they made it so that anyone could implement it. Yay for standards (-;
regards
john jones
Re:my bet is string based (Score:2)
I thought the ATI command method looked cleaner, but Carmack says it was "massively more painful" [bluesnews.com], and prefers nVidia's string-based approach.
It's true that nVidia did want a licence (protecting their IP, yadda yadda) which may have slowed adoption, but come to think of it, there isn't even ONE NV_* extension supported by ATI anyway, even the useful ones like NV_texture_rectangle (and yes, the ATI hardware does support it, they just refuse to expose it under Windows until it becomes "officially" supported), so I kinda doubt it really slowed down anything except ARB adoption (which is happening now in OpenGL 2.0, the way it should be).
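(For readers who haven't seen it, "string-based" means the vertex program is literally a text string handed to the driver. A minimal sketch against NV_vertex_program; this is approximately the canonical transform-and-pass-through-color example from nVidia's extension docs:)

    #include <string.h>
    #include <GL/gl.h>
    #include <GL/glext.h>  /* NV_vertex_program tokens */

    /* c[0]..c[3] will track the concatenated modelview-projection matrix. */
    static const char program[] =
        "!!VP1.0\n"
        "DP4 o[HPOS].x, c[0], v[OPOS];\n"
        "DP4 o[HPOS].y, c[1], v[OPOS];\n"
        "DP4 o[HPOS].z, c[2], v[OPOS];\n"
        "DP4 o[HPOS].w, c[3], v[OPOS];\n"
        "MOV o[COL0], v[COL0];\n"
        "END\n";

    void bind_nv_program(GLuint id) /* id obtained from glGenProgramsNV */
    {
        glLoadProgramNV(GL_VERTEX_PROGRAM_NV, id, (GLsizei)strlen(program),
                        (const GLubyte *)program);
        glBindProgramNV(GL_VERTEX_PROGRAM_NV, id);
        glTrackMatrixNV(GL_VERTEX_PROGRAM_NV, 0,
                        GL_MODELVIEW_PROJECTION_NV, GL_IDENTITY_NV);
        glEnable(GL_VERTEX_PROGRAM_NV);
    }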
Cunning move by 3DLabs (Score:4, Insightful)
Otherwise, I think it's a good idea. It'd be nice to see OpenGL keeping up with (or even outshining) DirectX...
it'll be like MPEG ! (Score:1, Offtopic)
Re:it'll be like MPEG ! (Score:1, Offtopic)
Re:it'll be like MPEG ! (Score:1)
DivX is used by lots of people for its good compression and high quality, oh, and the fact that you can use it with pretty much any video player in Windows or Linux/Unix.

QuickTime, on the other hand, might be technically better, but it's not used as often, since you can't get decent players or rippers/compressors for it....
Go figure
Re:it'll be like MPEG ! (Score:1)
Open GL is much better than Direct X (Score:1, Redundant)
Scrap OpenGL, start over.... (Score:3, Insightful)
Modern hardware support is critical; interfacing with what seems to be a diminishing number of compatible drivers is critical. Keeping the spec out of MS control is critical.

There are MANY options for a ground-up rewrite of OpenGL supporting CURRENT hardware; working with the hardware vendors directly is the key.

I understand this is not an undertaking to be taken lightly. I have been working on options and looking for cross-platform alternatives for a couple of months now; there are several promising alternatives, and I hope to present these shortly.

OpenGL's time is over, it seems. MS is working with vendors explicitly to limit their support, there remain major differences of opinion in its developer base, and schisms seem to be forming in its goals.
Re:Scrap OpenGL, start over.... (Score:2)
What they're planning to do is have a set of "compatibility" functions for people still using the OpenGL 1.3 standard, but make a "Pure OpenGL 2.0" subset of those commands, plus some new commands, to completely replace the functionality of OpenGL 1.3.
So, in the end, they've done almost what you've recommended in this message, but still with the OpenGL name on it. IIRC NVidia and other graphics corps are behind this as well.
Your arguments are valid, but don't be so quick to scrap the car when all it needs is a tire change and a few engine block repairs.
Re:Scrap OpenGL, start over.... (Score:2)
I don't see many signs of this happening, however. Huge amounts of drawing code have been built into toolkits (like Qt and KDE and the GNOME libs, and on Windows into MFC and various DLLs) that should really be in the graphics interface, and the people who know how they work are not willing to cooperate and move all this work to a more sensible place.
Excuse me? (Score:2, Funny)
Uh...isn't that what OpenGL did in the first place?
This is a technology that's been around for years and isn't version 9 yet.
Save your time... (Score:1, Flamebait)
OpenGL's shot at victory (Score:2, Insightful)
On the other hand, if this OpenGL extension craziness continues on as it has been, the project might collapse into a tangled and unsalvageable mess. The OpenGL standards people have one shot at doing this right, and whether or not they pull it off will determine their long term success or failure.
gfx programming. (Score:2)
plan. [bluesnews.com]
An interesting read about the ATI 8500 vs. the GF4.
He also says: "Do not buy a GeForce4-MX for Doom."
Re:gfx programming. (Score:1)
Extensions (Score:4, Insightful)
Basically, they all disappear.

Some have already become part of the standard. Some are added to the standard in OpenGL 2. Some just disappear altogether.

But the large majority of them are not needed anymore when you have programmability, memory management, OpenGL objects, etc.

To me that means that OpenGL 2 is way more flexible. Flexible enough that we won't need as many extensions in the future.

And that's pretty cool.

(BTW: Brian Paul is a member of the ARB. He wrote on the Mesa list that he hasn't been following the OpenGL 2 process very closely, but that he expected they would probably want him to write a free implementation.)
Problems with programmability in the graphics board (Score:2)
There's the artistic question of whether textures should be drawn or programmed. In the film industry, everybody except Pixar mostly draws, while Pixar writes RenderMan shaders for everything. (The whole OpenGL vertex/pixel shader thing is basically a lightweight version of RenderMan). Artists would rather have more texture memory than programmability.
A whole language for writing data pack and unpack functions is overkill. Pack/unpack doesn't do that much. It's not like the graphics board is going to unpack a JPEG image. It's just converting bitplane order, depth, and such. OpenGL already has quite a number of modes for texture storage; this just handles the people who wanted their favorite storage mode supported. A more declarative mechanism would have been more appropriate.
Re:Problems with programmability in the graphics board (Score:3, Informative)
They are used extensively in film graphics. All other major renderers, not just RenderMan, have shader languages: e.g. vMantra, Maya, LightWave, etc.
Shaders do not "replace" texture maps. One of the most-used functions in a shader is to look up a given UV coordinate in a texture map and use the resulting color to control the shader. In fact most of the shaders we write involve manipulating texture maps, which were (as you said) painted by hand. We can do much more interesting things with textures than just using them to color the surface!
I agree about the pack/unpack mess. I think all useful image formats could be described by these items: number of bits per sample (limited to powers of 2), number of samples per pixel, delta between each pixel (so they can be further apart than the number of samples or you can trivially mirror it with negative numbers), delta between each line (allows a "window" to be cut out of a larger image, allows flipping upside-down, and allows 90 degree rotations by adjusting both deltas). There is no need to describe what the samples are, that can be determined from the count and what function you are calling, we can insist on RGBA order for normal images.
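(The proposal above is concrete enough to write down. A hypothetical descriptor, nothing more than the four items listed turned into a struct:)

    /* Hypothetical image-format descriptor, per the proposal above. */
    struct image_format {
        int bits_per_sample;    /* limited to powers of 2: 8, 16, 32... */
        int samples_per_pixel;  /* e.g. 4 for RGBA */
        int pixel_delta;        /* distance between pixels; may exceed
                                   samples_per_pixel; negative mirrors */
        int line_delta;         /* distance between lines; negative flips
                                   vertically; adjusting both deltas cuts
                                   a window or rotates by 90 degrees */
    };

Four integers instead of a programmable unpack processor: that's the declarative alternative being argued for.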
A whole lot of misconceptions (Score:2, Interesting)
Now don't take this as a diatribe against DX. I use it frequently, and for some projects it's the thing to use. It's easy to develop quick apps, prototypes in particular, and obviously with the Xbox it's the thing to do.
As for whether a generic shader language should go into OpenGL 2, well, everyone has their opinion on that. Personally, I think it's a bad, bad, bad idea. Any pixel or vertex shader language that is implemented now will be out of date in short order. Anyone who has used these shader languages knows how crippled they are. Ergo, you'd implement these standards in OpenGL 2 and everyone would use the vendor-supplied extensions instead anyway. How does that improve OpenGL?
DX has made many mistakes with regard to implementing these kind of features. They're barely used in one version of D3D and become white elephants in subsequent versions.
Fundamentally, the OpenGL standard is an API core. It only supports a minimal core level of functionality that can reasonably be expected to persist over time. Everything else should be an extension. It's not like it's hard to figure extensions out, after all. The vendors do supply documentation, and the OpenGL repository maintains a list of all extension documentation. Then you can freely recognize which extensions are garbage and not use them, rather than be saddled with them for a decade.
John Bible
Bioware
Re:what about directX (Score:1)
OpenGL is just all-around better than DirectX. It's more portable, it's faster, and so on. Its only problem is that it's not as popular. Here, where the only book store in town is Barnes & Noble (not that that's bad) and there are maybe only 2 people interested in programming in a general 50-mile radius, there are a limited number of books on OpenGL available at any given moment (no programming books at my library, already checked). Other than that, I have yet to find something not good about OpenGL. It has a higher learning curve than DirectX does (so I hear), but it's worth it.
My thoughts on DirectX are: first, it's Microsoft's. That's not bad; Bill Gates isn't too stupid. But it tries to take way too much out of things. It's like a high-level language. Like BASIC. It does so much per command that it's completely unoptimizable. A smart programmer could do what DirectX can do, only better.

I first started programming for TI calculators. My expertise was in the TI-89 and TI-92+. There was one major problem with them: games required a kernel which came with "libraries" that everyone used. It had basic routines in it, like printing sprites on the screen, printing words on the screen (I kid you not, even though TI made a ROM call to do that), even pausing the calculator (even though _ROM_CALL_051 was exactly that). The code was choppy and very unoptimized. If the programmer of a game made his own routines, he could design them to fit exactly what he needed, without other junk that's useless to him. I don't know how to use DirectX, but here's a general idea along those lines. Say you have networking code that will send 10 bytes of data to the computer at the other end while in the game. You have exactly what you want, and it does it fast, efficiently, and with only what you need. But using DirectX's network code, it requires that you send another 100 bytes of junk that is useless to either side. Like I said, DirectX is easier to learn, but that just runs further into the problem of the calculators I mention above. So, why bloat your code any more than you need to? Then again, you don't have to worry about it as much, since all computers have both. Still, it helps with optimization to write your own code as much as possible.
That should sum up my views on both of them. Of course, OpenGL might have some of the same problems that I mentioned with DirectX but, I don't know of any. Anyways, have fun with OpenGL!
3DLabs (Score:2)
I actually think standardizing stuff like vertex and pixel shaders is the wrong thing to do right now. There are too many differences between the major implementations, so smoothing them over in a universal API would be costly and inelegant, and the feature sets are changing very rapidly (new texture modes every 6 months!)... It would be smarter to wait a year or two, when things will have settled down a bit, and then write the standard in stone.
3DLabs are in a different market. (Score:3, Informative)
3DLabs are actually (I believe) the best-selling maker of professional graphics cards; they're not a wannabe by any stretch of the imagination.
himi
Re:comments? (Score:1)
I will. In typical Tom fashion, this article uses a lot of words to express a small bit of information. From my reading, I get that OpenGL 2.0 is taking a long time to finalize, and the OpenGL ARB is trying to standardize a lot of the extensions without hitting copyright/Intellectual Property/etc issues. A decent read, but it needs to be re-written by someone who isn't so wordy.
RagManX