Comment Re:STAAAAAHP! (Score 2) 84

It's getting much closer. Most asm.js demos show C++ compiled into JavaScript running at only about half the performance of native C++ (and getting faster). That's the difference between 30fps and 60fps if all your code were JavaScript. WebCL, on the other hand, runs at almost exactly OpenCL speeds... so for GPU-accelerated apps (depending on whether JavaScript or WebCL is your primary bottleneck) you could get nearly native performance.

SmallPtGPU, from the testing I did a while ago, runs at almost the same speed whether driven by WebCL from JavaScript or by OpenCL from C++.
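The speed asm.js gets comes from its restricted, statically-typed subset of JavaScript. A minimal hand-written sketch of that style (real asm.js is emitted by compilers like Emscripten, and this tiny module is purely illustrative):

```javascript
// A module in the asm.js style. The "use asm" pragma plus the |0 and +x
// coercions pin every value to int32 or double, which is what lets an
// engine compile the body ahead of time instead of interpreting it.
// In engines without asm.js validation it still runs as plain JavaScript.
function MathModule(stdlib) {
  "use asm";
  var sqrt = stdlib.Math.sqrt;

  // Length of a 2D vector; every operand is typed as double.
  function length(x, y) {
    x = +x;                      // coerce parameter to double
    y = +y;
    return +sqrt(x * x + y * y); // result is also a double
  }

  return { length: length };
}

var mod = MathModule(globalThis);
console.log(mod.length(3.0, 4.0)); // 5
```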

Comment Re:Missing the point? (Score 2) 84

Some want to use the same algorithms OpenGL and DirectX do... and those APIs are still there for them.

Some do not. A good example is Epic Games, who in 2008 predicted that "100% of the rendering code" for Unreal Engine 4 would be programmed directly for the GPU. The next year they found the cost prohibitive, so they stuck with DirectX and OpenGL at least a while longer. Especially for big production houses, if there is a bug or a quirk in the rendering code, it would be nice to be able to fix the problem directly rather than hack in a workaround.

Comment Re:STAAAAAHP! (Score 4, Interesting) 84

Actually, I look at web browsers as an art platform. They are programmed against a set of open standards, which gives any person or organization the tools to build support for the content that is relevant to society. A video game designed in web standards could be preserved for centuries by whoever deems it culturally relevant.

For once, we have a gaming platform (besides Linux and BSD) which allows genuine, timeless art. If the W3C, or an industry body like it, creates an equivalent pseudo-native app platform... then great. For now, the web is the best we have.

Comment Is it supposed to? (Score 1) 1

I am not really sure Valve intends to compete with the next generation of consoles. To me, Steam Machines seem to be filling the niche left behind by "Media Center Extenders": using the PC ecosystem to widen a market currently dominated by Rokus and invaded by consoles. They will overlap with consoles, and may even render them redundant, but I do not see Steam Machines as aimed at consoles.

Submission + - Ask Slashdot: Can Valve's Steam Machines Compete Against New Xbox, PS4? (slashdot.org) 1

Nerval's Lobster writes: Valve has announced SteamOS, Steam Machines, and a Steam controller — the components necessary for it to create a viable living-room gaming experience. Valve’s strategy with these releases seems pretty clear: create a platform based on openness (SteamOS is a Linux-based operating system), in contrast to the closed systems pushed by console rivals such as Sony and Microsoft. If Valve chooses to release "Half-Life 3" in conjunction with its Steam Machines' rollout, it could help create further buzz for the system, given the years' worth of pent-up demand for the next chapter in the popular FPS saga. But can Valve's moves allow it to actually compete against Nintendo, Microsoft, and Sony on equal terms? What do you think?

Submission + - Software Rendering Engine GPU-Accelerated by WebCL

Phopojijo writes: OpenGL and DirectX have been the dominant real-time graphics APIs for decades. Both are catalogs of functions which convert geometry into images using predetermined mathematical algorithms (scanline rendering, triangles, etc.). Software rendering engines calculate colour values directly from the fundamental math. Reliance on OpenGL and DirectX could diminish when GPUs are used as general "large batches of math" solvers which software rendering engines offload to. Developers would then be able to choose the algorithms which best suit their project, even natively in web browsers with the upcoming WebCL.
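The "colour values directly from the fundamental math" idea can be shown with a toy per-pixel kernel of the kind a WebCL work-item would run: intersect a view ray with a sphere and shade the hit from the geometry alone, with no fixed-function API in between. All names here are illustrative, not from any real engine:

```javascript
// Software-rendering kernel: compute one pixel's colour purely from math.
function shadePixel(px, py, width, height) {
  // Map the pixel to [-1, 1] screen space (orthographic camera for simplicity).
  const x = (px / width) * 2 - 1;
  const y = (py / height) * 2 - 1;

  // Unit sphere at the origin, viewed along -z.
  // The ray at (x, y) hits the sphere only if x^2 + y^2 <= 1.
  const d2 = x * x + y * y;
  if (d2 > 1) return [0, 0, 0];      // miss: black background

  const z = Math.sqrt(1 - d2);       // z of the front hit point
  // Lambertian shading with the light along +z: brightness is just
  // the surface normal's z component at the hit point.
  return [z, z, z];
}

// The centre pixel looks straight at the sphere and is fully lit.
console.log(shadePixel(50, 50, 100, 100)); // [1, 1, 1]
```

In a WebCL version, this function body would become an OpenCL C kernel executed once per pixel across the whole framebuffer.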

Submission + - Multi-Display Gaming Artifacts Shown with AMD, 4K Affected Too (pcper.com)

Vigile writes: Multi-display gaming has really found a niche in the world of high-end PC gaming, starting when AMD released Eyefinity in 2009 in three-panel configurations. AMD expanded to six-screen options in 2010, and NVIDIA followed shortly thereafter with a similar multi-screen solution called Surround. Over the last 12 months or so, GPU performance testing has gone through a sort of revolution as the move from software measurement to hardware capture measurement has taken hold. PC Perspective has done testing with this new technology on AMD Eyefinity and NVIDIA Surround configurations at 5760x1080 resolution and found some substantial anomalies in the AMD captures. The AMD cards exhibited dropped frames, interleaved frames (jumping back and forth between buffers) and even stepped, non-horizontal vertical sync tearing. The result is a much lower observed frame rate than software like FRAPS would indicate, and these problems will also appear on the current top-end dual-head 4K PC displays, since they emulate Eyefinity and Surround for setup.
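The kind of analysis hardware capture enables can be sketched simply: given the timestamp at which each frame actually reached the display, flag frames whose scanout gap spans two or more refresh intervals, i.e. a vsync was missed and a frame was effectively dropped. The threshold and names here are illustrative, not PC Perspective's actual tooling:

```javascript
// Flag dropped frames from capture timestamps (milliseconds).
function findDroppedFrames(timestampsMs, refreshMs = 1000 / 60) {
  const dropped = [];
  for (let i = 1; i < timestampsMs.length; i++) {
    const gap = timestampsMs[i] - timestampsMs[i - 1];
    // Allow 50% slack over one refresh interval before calling it a drop.
    if (gap > refreshMs * 1.5) dropped.push(i);
  }
  return dropped;
}

// Frame 2 arrives ~33 ms after frame 1 on a 60 Hz panel: one vsync missed.
console.log(findDroppedFrames([0, 16.7, 50.0, 66.7])); // [2]
```

Software overlays like FRAPS timestamp frames at submission, before the display pipeline, which is exactly why they miss these drops.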

Submission + - AMD releases 13.8 beta driver to implement frame pacing support

Vigile writes: Over the past year AMD has been getting hammered over its CrossFire technology and the issues the multi-GPU scaling solution has with frame pacing — the ability to present frames evenly to the user and create a smooth gaming experience. As new tools have become available to evaluate the performance of graphics solutions (like capture cards and overlays), the battle between CrossFire and NVIDIA's SLI has taken on new life. After denying for quite some time that the problem existed, AMD has put out the first beta driver that implements a software frame pacing solution to produce animation more evenly from CrossFire configurations. PC Perspective has done extensive testing with the Catalyst 13.8 beta and found that it has basically solved the single-screen pacing problems. More trouble remains for AMD, though, as they still need to find a way to fix Eyefinity and 4K displays, which are excluded from this driver's improvements.
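The core idea behind software frame pacing can be sketched in a few lines: instead of presenting each frame the instant a GPU finishes it, delay presentation so the frame-to-frame interval tracks a running average. This is a conceptual illustration only, not AMD's actual driver algorithm:

```javascript
// Given the times (ms) at which frames become ready, compute paced
// presentation times: never before the frame is ready, and never
// earlier than one smoothed interval after the previous presentation.
function paceFrames(readyTimesMs) {
  const presented = [readyTimesMs[0]];
  let avgInterval = 0;
  for (let i = 1; i < readyTimesMs.length; i++) {
    const rawInterval = readyTimesMs[i] - readyTimesMs[i - 1];
    // Exponential moving average of recent frame intervals.
    avgInterval = avgInterval === 0
      ? rawInterval
      : avgInterval * 0.8 + rawInterval * 0.2;
    presented.push(Math.max(readyTimesMs[i], presented[i - 1] + avgInterval));
  }
  return presented;
}

// Two GPUs finishing in bursts (5 ms, then 28 ms gaps) get evened out.
console.log(paceFrames([0, 5, 33, 38, 66]));
```

The trade-off is a small amount of added latency in exchange for steadier animation, which is why pacing is a policy decision in the driver rather than a free win.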

Submission + - ASUS PQ321Q Monitor Brings Multi-Stream Tiled Displays Forward (pcper.com)

Vigile writes: While 4K displays have been popping up all over the place recently at noticeably lower prices, one thing that limits them all is a 30 Hz refresh rate. Sony is selling 4K consumer HDTVs for $5000 and newcomer SEIKI has a 50-in model going for under $1000, but they all share that trait — HDMI 1.4 supporting 3840x2160 at 30 Hz. The new ASUS PQ321Q monitor is a 31.5-in 4K display built on the same platform as the Sharp PN-K321, and it uses a DisplayPort 1.2 connection capable of MST (multi-stream transport). This allows the screen to include two display heads internally, showing up as two independent monitors to the PC, which can then be merged into a single panel via AMD Eyefinity or NVIDIA Surround. Thus, with dual 1920x2160 60 Hz signals, the PQ321Q can offer 3840x2160 at 60 Hz for a much better viewing experience. PC Perspective got one of the monitors in for testing and review and found that while there were some hurdles during initial setup (especially with NVIDIA hardware), the advantage of a higher refresh rate made the 4K resolution that much better.
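A quick back-of-the-envelope pixel-rate check shows why 4K60 needs DisplayPort 1.2 MST rather than HDMI 1.4. Real links also carry blanking overhead and line coding, so these are raw active-pixel rates, illustrative only:

```javascript
// Raw active pixels per second for a given mode.
const pixelRate = (w, h, hz) => w * h * hz;

const hdmi14MaxRate = pixelRate(3840, 2160, 30); // HDMI 1.4's 4K ceiling
const fullRate      = pixelRate(3840, 2160, 60); // what 4K60 actually needs
const perStream     = pixelRate(1920, 2160, 60); // one MST tile of the PQ321Q

console.log(fullRate === 2 * perStream); // true: two 1920x2160@60 tiles cover 4K60
console.log(perStream <= hdmi14MaxRate); // true: each tile fits in a "normal" stream
```

Splitting the panel into two half-width 60 Hz tiles is exactly what lets existing Eyefinity/Surround plumbing stitch them back into one logical 4K60 display.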

Comment Re:Really? (Score 1) 316

Some quarters they make a lot of money, other quarters they lose a lot of money; net is pretty near zero over the whole console life-cycle.

Had they not wasted so much money, and had they worked on an open platform instead, they would have had steady profits almost the entire time.

Comment Re:Really? (Score 1) 316

Not only that, but that is roughly $2000 of license fees (~$10/game, plus $50/yr over 10 years, plus "tons of peripherals and crap", which I'll conservatively put at $500) that you would not have needed to pay if you didn't game on a console.
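Making the comment's rough tally explicit: the per-game licence cut, subscription price, and peripheral spend are the comment's own figures, while the game count needed to reach ~$2000 is back-solved here and is an assumption, not something the comment states.

```javascript
// The comment's figures, totalled. The library size is a back-solved assumption.
const licencePerGame = 10;      // ~$10 platform fee baked into each game's price
const subscription   = 50 * 10; // $50/yr online fees over a 10-year console life
const peripherals    = 500;     // controllers, cables, "crap"
const games          = 100;     // assumed library size needed to hit the total

const total = licencePerGame * games + subscription + peripherals;
console.log(total); // 2000
```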

And once your consoles break and are out of support... all that money has nothing to run on.

Not only is it unprofitable for Microsoft and Sony... it's also a bad deal for customers, like you, who overpaid for disposable content.

Submission + - Vastly improved Raspberry Pi performance with Wayland

nekohayo writes: While Wayland/Weston 1.1 brought support to the Raspberry Pi merely a month ago, work has recently been done to bring true hardware-accelerated compositing capabilities to the RPi's graphics stack using Weston. The Raspberry Pi foundation has made an announcement about the work that has been done with Collabora to make this happen. X.org/Wayland developer Daniel Stone has written a blog post about this, including a video demonstrating the improved reactivity and performance. Developer Pekka Paalanen also provided additional technical details about the implementation.

Submission + - Console Manufacturers Want the Impossible?

Phopojijo writes: Consoles have not really been able to scale profitably over the last decade or so. Manufacturers sacrifice capital to gain control over their market share and, even with the extended lifespan of this recent generation, cannot generate enough revenue from that control to make it worthwhile. Have we passed the point where closed platforms can be profitable, and will we need to settle on an industry body, such as the W3C or Khronos, to fix a standard that companies manage slices of and compete within?

Submission + - Snags on the Road to WWW History (bbc.co.uk)

Rambo Tribble writes: The BBC reports that Cern's effort to recreate the original World Wide Web has run into difficulties. It appears no one kept adequate backups, and passwords have been lost (can you imagine?). The public is being asked to help, and one early page from 1991 has already been recovered from the NeXT machine of American Paul Jones. Can you help?
