EA Announces Multi-Title Unreal Engine 3 License

An anonymous reader writes to mention a Gamasutra article about a surprising announcement from EA: they have licensed Unreal Engine 3 for a series of next-generation titles. "The brief announcement states that EA 'employs a variety of engines, tools and technologies to best serve the needs of each game and development team', but raises interesting issues regarding the Criterion-authored Renderware engine, purchased by EA in 2004 alongside the Burnout developer, and its intended global EA rollout."


  • by Anonymous Coward on Sunday August 20, 2006 @10:57AM (#15944043)
    There used to be a time when your engine defined what you could do in a game, and the engine you chose had a massive impact on the quality of game you could produce; I think those days are long gone. If you discount certain cutting-edge graphical techniques, there are few (gameplay-modifying) features implemented in the Unreal 3/Doom 3 engines that could not be done in an open source engine written in Java.

    Personally, I think it is time someone focused on building an open source Java framework designed around splitting a game engine into its smaller components (Graphics, Physics, Scripting and AI); this would allow smaller, more focused open source projects to exist, which should produce higher quality results (a rough sketch of such a split appears after the comments below).
  • by Dr. Spork ( 142693 ) on Sunday August 20, 2006 @11:14AM (#15944095)
    I guess if there is anything here to wonder at, it's that this game-engine consolidation did not gather steam sooner. Maybe EA, which has been vacuuming up small game companies, wanted its newly acquired employees to keep a brief sense of independence. But if you're a game company that cranks out dozens of games a year, almost all of them 3D in some way, it makes sense to standardize. I would guess their intention with Renderware is to make it a very modular, clean and optimized game engine, so that its core can be used across all the lines of EA games. This will make many of the back-end people in EA's recent acquisitions redundant. The people who remain will "generate content" for THE game engine.

    I'm not sure whether this is bad or good. I was thinking it might make future games feel generic, but then I thought... more generic than now? Let's hope not. But maybe the generic feel of today's FPSes comes from oft-reused game engines that are not quite flexible enough, so the player "recognizes" the engine underneath. Maybe in the future they will fix that.

  • by perkr ( 626584 ) on Sunday August 20, 2006 @11:43AM (#15944176)

    You are probably right that games will end up being written in an easier language than C++, with critical, difficult-to-write components such as AI and graphics split out as separate modules.

    However, it seems hard to separate AI from, for instance, physics. For an AI to be smart, it has to know how the physics component works, no? I mean, is game development going to end up like BizTalk, hehe (components basically "brokering" over XML)? :) (A sketch of what such a separation might look like appears after the comments below.)

    Also, for games to have an "edge", creativity has been needed across all these diverse areas. If the components are open source, that would be cool, because then the devs are free to bring an expert onto the team and hack the render engine to do whatever they want in order to impress the audience. But if we end up with several locked-down or hard-to-change components, that might hurt creativity in games. Basically it would all be script programming.

  • by Anonymous Coward on Monday August 21, 2006 @07:27AM (#15947510)
    This is simply not true. The modern x86 is designed for, and great on, general code, but if you have a loop you want to optimize to the limit, you cannot get the same clock-for-clock performance on an x86 as you can on, say, a PPC; the gap is somewhere between 20% and 50%. This is partly down to the shortage of architectural registers (it does matter when you can't express your algorithm without inserting several dozen extra micro-ops, and those stack pushes are not optimized away). The other big reason is that the PPC is predictable, while what's going on inside an x86 is anybody's guess. You can't really optimize for x86; Abrash's best work is pure trial and error.

    The x86 may be "here to stay" in the desktop or server PC (er, until you want more than 2GB of RAM), but it's dead in the water in every other CPU-based industry. All that x86-to-RISC translation costs a fortune in die size, power consumption and maximum throughput efficiency. The only reason anyone buys an x86-compatible processor today is (duh) to run x86-compatible code. If there is no legacy code to cope with, no one uses x86. Microsoft made that mistake and lost a lot of money and market placement as a result.

    Anyway, the x86 ISA is a 32-bit ISA, and it can't survive past the current generation of 32-bit CPUs, so your point is about five years too late. The death of the x86 ISA is right around the corner. It will live on in software emulation, just as the 6809 did.
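
A rough Java sketch of the component split the first comment proposes, with Graphics, Physics, Scripting and AI behind small interfaces and a thin core that only wires them together. Every name here is hypothetical, not taken from any existing engine:

    // Hypothetical component interfaces; each could be its own open source project.
    interface Graphics  { void render(World world); }
    interface Physics   { void step(World world, double dtSeconds); }
    interface Scripting { void runEventHandlers(World world); }
    interface AI        { void think(World world, double dtSeconds); }

    // Shared game state the components operate on (entities, terrain, etc.).
    class World { }

    // The engine core just wires the components together in a fixed-step loop.
    class Engine {
        private final Graphics graphics;
        private final Physics physics;
        private final Scripting scripting;
        private final AI ai;
        private final World world = new World();

        Engine(Graphics g, Physics p, Scripting s, AI a) {
            graphics = g; physics = p; scripting = s; ai = a;
        }

        void tick(double dtSeconds) {
            scripting.runEventHandlers(world);
            ai.think(world, dtSeconds);
            physics.step(world, dtSeconds);
            graphics.render(world);
        }
    }

Because each subsystem only sees the others through these narrow interfaces, a smaller, more focused project could ship just the Physics or AI piece and still plug into the same core.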
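
And a minimal sketch of the separation perkr asks about: the AI does not need to know how the physics component works internally, only which queries it exposes, for example line-of-sight and reachability checks. Again, all names are hypothetical:

    // Narrow query interface the physics component exposes to the AI.
    interface PhysicsQueries {
        boolean hasLineOfSight(Vec3 from, Vec3 to);
        boolean canStandAt(Vec3 position);
    }

    // Simple immutable 3D vector used by the queries.
    class Vec3 {
        final double x, y, z;
        Vec3(double x, double y, double z) { this.x = x; this.y = y; this.z = z; }
    }

    // The AI only calls the queries; it never touches the physics internals.
    class GuardAI {
        private final PhysicsQueries physics;

        GuardAI(PhysicsQueries physics) { this.physics = physics; }

        // Chase the player only if the guard can see them and the spot is reachable.
        boolean shouldChase(Vec3 guardPos, Vec3 playerPos) {
            return physics.hasLineOfSight(guardPos, playerPos)
                && physics.canStandAt(playerPos);
        }
    }

Whether that interface is a plain Java call or something "brokered" over XML is then an implementation detail of the physics component.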

