+ - Khronos Group Announces Vulkan to Compete Against DirectX 12

Submitted by Phopojijo
Phopojijo (1603961) writes "The Khronos Group has announced the Vulkan API for compute and graphics. Its goal is to compete against DirectX 12. It has some interesting features, such as queuing to multiple GPUs and an LLVM-based bytecode for its shading language to remove the need for a compiler from the graphics drivers. Also, the API allows graphics card vendors to support Vulkan with drivers back to Windows XP "and beyond"."

Comment: Re:Wut? (Score 1) 42

by AllynM (#48975733) Attached to: Mobile G-SYNC Confirmed and Tested With Leaked Driver

1. That is a false claim - Gamenab didn't even cite the correct FPGA model when he made that DRM claim.
2. G-Sync is actually good down to 1 FPS - it adaptively inserts additional redraws in between frames at rates below 30, so as to minimize the possibility of judder (an incoming frame arriving during an already started panel refresh pass). FreeSync (in its most recently demoed form) reverts back to the VSYNC setting at the low end. Further, you are basing the high end of G-Sync only on the currently released panels. Nothing states the G-Sync FPGA tops out at 144.
3. I use the word 'experience' because it is 'my experience' - I have personally witnessed most currently shipping G-Sync panels as well as the FreeSync demo at this past CES. I have also performed many tests with G-Sync. Source: I have written several articles about this, including the one linked in this post.
5. I believe the reason it is not yet released is because Nvidia wants to be able to correctly cover more of the range (including the low range / what happens when the game engine hitches).
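Point 2 can be sketched numerically. The following is a toy model of the low-framerate redraw strategy described above - the panel limits and the decision logic are my illustration, not NVIDIA's actual firmware:

```python
import math

# Hypothetical panel limits, chosen for illustration.
PANEL_MIN_HZ = 30    # slowest refresh the panel can self-sustain
PANEL_MAX_HZ = 144   # fastest refresh the panel supports

def plan_refreshes(frame_interval_s):
    """Given the time until the next game frame, return how many times the
    previous frame should be redrawn so that no single refresh interval
    exceeds the panel's minimum-rate limit (1 / PANEL_MIN_HZ seconds)."""
    max_interval = 1.0 / PANEL_MIN_HZ
    if frame_interval_s <= max_interval:
        return 0  # within the variable-refresh window: no extra redraws
    # Insert enough repeats that each sub-interval fits the panel's range.
    return math.ceil(frame_interval_s / max_interval) - 1
```

At 20 FPS (50 ms per frame) this inserts one repeat, splitting the wait into two 25 ms intervals - each equivalent to a legal 40 Hz refresh - so the panel is never caught mid-scan when the next real frame arrives.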

Comment: Re:its Nvidia FREESYNC (Score 1) 42

by AllynM (#48971971) Attached to: Mobile G-SYNC Confirmed and Tested With Leaked Driver

Gamenab stumbled across the leaked driver and tried to use it to spread a bunch of conspiracy theory FUD. I hope most people here can correctly apply Occam's razor, as opposed to the alternative: that he supposedly designed those changes, that they made it into an internal driver build that was inadvertently leaked, and that the build just happened to apply to the exact laptop he already owned.

ExtremeTech picked apart his BS in more detail: http://www.extremetech.com/ext...

Comment: Re:Wut? (Score 1) 42

by AllynM (#48971895) Attached to: Mobile G-SYNC Confirmed and Tested With Leaked Driver

1. The FPGA *was* required for the tech to work on the desktop panels it was installed in.
2. FreeSync (as I've witnessed so far), as well as the most recent adaptive sync, cannot achieve the same result across as wide a refresh rate range as G-Sync currently can.
3. Nvidia could 'make it work', but it would not be the same experience as can be had with a G-Sync module, even with an adaptive sync panel (as evidenced by how the adaptive sync panel in this laptop intermittently blanks out at 30 FPS or when a game hitches).
4. ...
5. The driver was not a release driver, and the experience it gives was never meant to be called 'G-Sync'. It was meant to be internal.

Conclusion - Adaptive sync alone is not the same experience you can currently get with a real G-Sync panel, which is why any possible future G-Sync that does not need a module is not yet a real thing.

+ - NVIDIA GTX 970 Specifications Corrected, Memory Pools Explained->

Submitted by Vigile
Vigile (99919) writes "Over the weekend NVIDIA sent out its first official response to the claims of hampered performance on the GTX 970 and a potential lack of access to 1/8th of the on-board memory. Today NVIDIA has clarified the situation again, this time with some important changes to the specifications of the GPU. First, the ROP count and L2 cache capacity of the GTX 970 were incorrectly reported at launch (last September). The GTX 970 has 56 ROPs and 1792 KB of L2 cache, compared to the GTX 980's 64 ROPs and 2048 KB of L2 cache; previously both GPUs were claimed to have identical specs. Because of this change, one of the 32-bit memory channels is accessed differently, forcing NVIDIA to create 3.5GB and 0.5GB pools of memory to improve overall performance for the majority of use cases. The smaller, 500MB pool operates at 1/7th the speed of the 3.5GB pool and thus lowers total graphics system performance by 4-6% when it comes into play. That occurs only when games request more than 3.5GB of memory, which happens only in extreme combinations of resolution and anti-aliasing. Still, the jury is out on whether NVIDIA has answered enough questions to temper the fire from consumers."
Link to Original Source
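A back-of-the-envelope sketch of how the two memory pools described above combine. The pool sizes and the 1/7th-speed figure come from the summary; the even-spread bandwidth model is my own simplification (the real-world impact depends on which data lands in the slow pool, hence NVIDIA's smaller 4-6% figure):

```python
# Figures from the submission above; the blending model is illustrative only.
FAST_POOL_GB = 3.5
SLOW_POOL_GB = 0.5
SLOW_SPEED_FACTOR = 1 / 7  # slow pool runs at 1/7th the fast pool's rate

def relative_bandwidth(allocated_gb):
    """Average per-byte bandwidth (fast pool = 1.0) if an allocation is
    spread evenly across the memory it occupies."""
    if allocated_gb <= FAST_POOL_GB:
        return 1.0  # everything fits in the full-speed pool
    slow_gb = min(allocated_gb, FAST_POOL_GB + SLOW_POOL_GB) - FAST_POOL_GB
    total_gb = FAST_POOL_GB + slow_gb
    return (FAST_POOL_GB * 1.0 + slow_gb * SLOW_SPEED_FACTOR) / total_gb
```

In this naive model a full 4GB allocation averages roughly 89% of full-speed bandwidth; the driver mitigates this in practice by steering hot data into the fast pool.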

+ - NVIDIA Responds to GTX 970 Memory Issue->

Submitted by Vigile
Vigile (99919) writes "Over the past week or so, owners of the GeForce GTX 970 have found several instances where the GPU was unable or unwilling to address memory capacities over 3.5GB despite having 4GB of on-board frame buffer. Specific benchmarks were written to demonstrate the issue and users even found ways to configure games to utilize more than 3.5GB of memory using DSR and high levels of MSAA. While the GTX 980 can access 4GB of its memory, the GTX 970 appeared to be less likely to do so and would see a dramatic performance hit when it did. NVIDIA responded today saying that the GTX 970 has "fewer crossbar resources to the memory system" as a result of disabled groups of cores called SMMs. NVIDIA states that "to optimally manage memory traffic in this configuration, we segment graphics memory into a 3.5GB section and a 0.5GB section" and that the GPU has "higher priority" to the larger pool. The question that remains: should this affect gamers' view of the GTX 970? If performance metrics already take the different memory configuration into account, then I don't see the GTX 970 declining in popularity."
Link to Original Source

+ - How we'll know whether BICEP2 was right about gravitational waves

Submitted by StartsWithABang
StartsWithABang (3485481) writes "The Big Bang takes us back to very early times, but not the earliest. It tells us the Universe was in a hot, dense state, where forming neutral atoms was impossible due to the incredible energies of the Universe at that time. The patterns of fluctuations that are left over from that time give us insight into the primordial density fluctuations that our Universe was born with. But there’s an additional signature encoded in this radiation, one that’s much more difficult to extract: polarization. While most of the polarization signal that’s present will be due to the density fluctuations themselves, there’s a way to extract even more information about an even earlier phenomenon: gravitational waves that were present from the epoch of cosmic inflation! Here's the physics of how that works, and how we'll find out whether BICEP2 was right or not."

+ - Using naval logbooks to reconstruct past weather—and predict future climate-> 1

Submitted by Lasrick
Lasrick (2629253) writes "What a great idea. The Old Weather Project uses old logbooks to study the weather patterns of long ago, providing a trove of archival data to scientists who are trying to fill in the details of our knowledge about the atmosphere and the changing climate. 'Pity the poor navigator who fell asleep on watch and failed to update his ship’s logbook every four hours with details about its geographic position, time, date, wind direction, barometric readings, temperatures, ocean currents, and weather conditions.' As Clive Wilkinson of the UK's National Maritime Museum adds, 'Anything you read in a logbook, you can be sure that it is a true and faithful account.'

The Old Weather Project uses citizen scientists to transcribe and digitize observations that were scrupulously recorded on a clockwork-like basis, and it is one of several projects that climate scientists are using to create 'a three-dimensional computer simulation that will provide a continuous, century-and-a-half-long profile of the entire planet’s climate over time'--the 20th Century Reanalysis Project. Data is checked and rechecked by three different people before entry into the database, and the logbook measurements are especially valuable because they were compiled at sea. Great story."

Link to Original Source

+ - Interviews: Ask Warren Ellis a Question

Submitted by samzenpus
samzenpus (5) writes "Warren Ellis is an acclaimed British author of comics, novels, and television who is well known for his sociocultural commentary. The movies Red and Iron Man 3 are based on his graphic novels. In addition to numerous other comic titles, he created a personal favorite, Transmetropolitan. Ellis has written for Vice, Wired UK and Reuters on technological and cultural matters, and is co-writing a video project called Wastelanders with Joss Whedon. Warren has agreed to give us some of his time to answer any questions you may have. As usual, ask as many as you'd like, but please, one per post."

+ - Micron SSDs with MLC/SLC Conversion Technology Tested

Submitted by Vigile
Vigile (99919) writes "Earlier this month Micron announced a technology called Dynamic Write Acceleration in the new M600 SSD models that is able to switch NAND flash between MLC and SLC (multi- and single-level cell) modes on the fly in order to improve performance in low-cost solid state drive implementations. In short, a new and empty M600 SSD will have its dies in SLC mode. While the SSD will appear to the user at its rated capacity, the actual flash capacity in SLC mode is half of what it would be if all dies were in MLC mode. As the SSD is filled past 50% capacity, the controller intelligently switches dies from SLC to MLC, shuffling data around as necessary in the background to briefly empty a given die before switching its mode. In PC Perspective's testing, though, the hardware was very inconsistent in write speeds, even at the same capacity fill levels, and would often run at a much lower throughput level than expected. Read speeds are not affected by the DWA feature. While interesting in theory, it appears the dynamic flipping technology needs a bit more work."
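The mode-switching bookkeeping described above can be sketched as follows. The die count, per-die capacity, and the greedy conversion policy are assumptions for illustration, not Micron's actual firmware behavior:

```python
# Hypothetical drive geometry for the sketch (not the M600's real layout).
MLC_DIE_GB = 16               # capacity of one die in MLC mode (assumed)
SLC_DIE_GB = MLC_DIE_GB // 2  # SLC stores one bit per cell: half the capacity
NUM_DIES = 16                 # 256 GB rated (all-MLC) capacity

def dies_needed_in_mlc(user_data_gb):
    """Minimum number of dies that must run in MLC mode so that the total
    physical capacity covers the stored data, keeping as many dies as
    possible in the faster SLC mode."""
    for mlc_dies in range(NUM_DIES + 1):
        slc_dies = NUM_DIES - mlc_dies
        if mlc_dies * MLC_DIE_GB + slc_dies * SLC_DIE_GB >= user_data_gb:
            return mlc_dies
    return NUM_DIES
```

With all dies in SLC mode this hypothetical drive physically holds 128 GB - exactly half its 256 GB rating - which is why conversions must begin once the drive fills past 50%.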

+ - GeForce GTX 980 and GTX 970 Bring Unseen Power Efficiency->

Submitted by Vigile
Vigile (99919) writes "Launching today is a new GPU from NVIDIA, along with two new graphics cards that utilize it. GM204, the second chip released based on the Maxwell architecture, brings an incredibly high level of power efficiency to high-end enthusiast level graphics cards. The GeForce GTX 980, reviewed by PC Perspective, with 2048 CUDA cores, a 256-bit memory bus, 4GB of GDDR5 running at 7.0 GHz and a base clock over 1100 MHz, is able to outperform cards like the GeForce GTX 780 Ti and the AMD Radeon R9 290X and will sell for $549. Perhaps most impressive is the power draw difference — the GTX 980 uses 130 watts LESS POWER than the R9 290X under a full load. The GTX 970, with 1664 CUDA cores, the same memory configuration and a base clock of 1050 MHz, runs at even lower power, outperforming the Radeon R9 290 while using 80 watts less power, and has an MSRP of just $329. Faster GPUs using less power — it's pretty impressive. New features of the GTX 900 series include MFAA (multi-frame AA), Dynamic Super Resolution and full DX12 feature set support. And the fact that we were able to overclock the GTX 980 to nearly 1500 MHz doesn't hurt either."
Link to Original Source

+ - AMD Releases new Tonga GPU, Lowers 8-core CPU to $229

Submitted by Vigile
Vigile (99919) writes "AMD looks to continue addressing the mainstream PC enthusiast and gamer with a set of releases into two different component categories. First, today marks the launch of the Radeon R9 285 graphics card, a $250 option based on a brand new piece of silicon dubbed Tonga. This GPU has nearly identical performance to the R9 280 that came before it, but includes support for XDMA PCIe CrossFire and the TrueAudio DSP technology, and is FreeSync capable (AMD's response to NVIDIA G-Sync). On the CPU side AMD has refreshed its FX product line with three new models (FX-8370, FX-8370e and FX-8320e) with lower TDPs and supposedly better efficiency. The problem, of course, is that while Intel is already sampling 14nm parts, these Vishera-based CPUs continue to be manufactured on GlobalFoundries' 32nm process. The result is smaller-than-expected performance boosts and efficiency gains."

+ - Intel Core i7-5960X Brings 8 Haswell Cores to Enthusiasts->

Submitted by Vigile
Vigile (99919) writes "Today Intel released its updated E-class, enthusiast platform based on Haswell, known previously as just Haswell-E. The Core i7-5960X Extreme Edition CPU is an 8-core processor (addressing 16 threads with HyperThreading) that doubles core count over mainstream Haswell parts and jumps past the 6-core parts in previous E-class platforms. That not only translates into dramatic performance increases in highly threaded applications like rendering and encoding; Haswell-E is also the first consumer platform to integrate a quad-channel DDR4 memory controller, with frequencies starting at 2133 MHz. The top two tiers of Haswell-E processors also include 40 lanes of PCI Express 3.0, while the lower cost Core i7-5820K will be limited to six cores and 28 lanes of PCIe. New motherboards based on the new X99 chipset are required as well, and they include additional connectivity options like 14 USB ports and 10 SATA 6.0 Gbps channels. Clearly this is the fastest consumer platform tested but, as with all E-class releases, the cost is higher. The Core i7-5960X will set you back $999, and expect to pay at least $500 for a motherboard and 4 DIMMs of the new DDR4 as well."
Link to Original Source
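The quad-channel DDR4-2133 controller's theoretical peak bandwidth is simple arithmetic, assuming standard 64-bit DDR channels:

```python
# DDR4-2133 moves 2133 million transfers per second, 8 bytes per transfer
# on each 64-bit channel; four channels run in parallel.
CHANNELS = 4
TRANSFERS_PER_SEC = 2133e6   # DDR4-2133: 2133 MT/s
BYTES_PER_TRANSFER = 8       # one 64-bit channel

peak_gb_s = CHANNELS * TRANSFERS_PER_SEC * BYTES_PER_TRANSFER / 1e9
```

That works out to about 68 GB/s of theoretical peak bandwidth, before real-world controller and access-pattern efficiency losses.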

Comment: Re:Cheaper drives (Score 4, Insightful) 183

by Vigile (#47663157) Attached to: Solid State Drives Break the 50 Cents Per GiB Barrier, OCZ ARC 100 Launched

No, no particular technical difficulty, just another step in gradually falling prices. We have seen drives hit $0.39/GB as well with standard Amazon.com pricing. The Crucial M550 (a bit faster) is at $407 for 1TB model today, for example: http://amzn.to/1kBpIs1
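For reference, the arithmetic behind that price point, using the $407 / 1TB Amazon figure quoted above:

```python
# Price-per-gigabyte check for the 1TB Crucial M550 at $407.
price_usd = 407.0
capacity_gb = 1000                        # decimal GB, as drives are marketed
capacity_gib = capacity_gb * 1e9 / 2**30  # ~931.3 GiB of binary capacity

per_gb = price_usd / capacity_gb    # ~$0.41 per GB
per_gib = price_usd / capacity_gib  # ~$0.44 per GiB
```

Either way it lands comfortably under the 50-cents-per-GiB barrier from the headline.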
