Intel

Intel Adopts USB-C Connector For 40Gbps Thunderbolt 3, Supports USB 3.1, DP 1.2 148

Posted by timothy
from the ok-that's-pretty-cool dept.
MojoKid writes: The high-speed Thunderbolt interface standard, which is used for everything from hyper-fast external storage solutions to external graphics cards, has been slow to take off. You can blame the high-priced Thunderbolt peripherals and the uber-expensive cables (at least when compared to your garden-variety USB cables). For most people, USB 3.0 is "good enough," and making a huge investment in the Thunderbolt ecosystem has been reserved for those in the professional video editing arena. However, Intel is looking to change all of that with Thunderbolt 3. Thunderbolt 3 once again doubles the maximum bandwidth, this time jumping from 20Gbps to a whopping 40Gbps. While that is impressive in its own right, the truly big news is that Thunderbolt 3 is moving away from the Mini DisplayPort connector and is instead adopting the USB-C connector. As a result, Thunderbolt will also support USB 3.1 (currently spec'd at 10Gbps) and can optionally provide up to 100W of power (in compliance with the USB Power Delivery spec) to charge devices via USB-C (like the recently introduced 12-inch Apple MacBook).
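For a rough sense of what those link rates mean in practice, here's a back-of-the-envelope comparison. It's a sketch only: it uses the raw signaling rates quoted above, ignores protocol overhead, encoding, and device-side bottlenecks, and the 100GB payload is a made-up example.

```python
# Theoretical transfer times at the raw link rates quoted above.
# Ignores protocol overhead, encoding, and device-side limits.

GBPS = 1e9  # bits per second

links = {
    "USB 3.0":       5 * GBPS,
    "USB 3.1":      10 * GBPS,
    "Thunderbolt 2": 20 * GBPS,
    "Thunderbolt 3": 40 * GBPS,
}

payload_bits = 100 * 8e9  # a hypothetical 100GB file

for name, rate in links.items():
    print(f"{name:>13}: {payload_bits / rate:6.1f} s to move 100GB")
```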
Graphics

Intel Releases Broadwell Desktop CPUs: Core i7-5775C and i5-5675C 100

Posted by timothy
from the chips-and-chips dept.
edxwelch writes: Intel has finally released its Broadwell desktop processors. Featuring Iris Pro Graphics 6200, they take the integrated graphics crown from AMD (albeit at three times the cost). However, they are not as fast as the current Haswell flagship processors, and they will soon be superseded by Skylake, due later this year. Tom's Hardware and Anandtech have the first reviews of the Core i7-5775C and i5-5675C.
GUI

Cinnamon 2.6: a Massive Update Loaded With Performance Improvements 132

Posted by timothy
from the also-delicious dept.
jones_supa writes: The Linux Mint team has just announced that the Cinnamon 2.6 desktop environment is stable and ready to download. It is a big update. Load times have been greatly improved, and unnecessary calculations in the window-management code have been dropped, leading to a 40% reduction in the number of CPU wakes per second. Other improvements include a screensaver that does more than just lock the screen, panels that can be removed or added individually, a much clearer System Settings panel, a cool new effect for windows, and a brand-new plugin manager for Nemo. Linux Mint users will receive the new Cinnamon as an update by the end of the month.
Graphics

NVIDIA's GeForce GTX 980 Ti Costs $350 Less Than TITAN X, Performs Similarly 152

Posted by timothy
from the most-of-the-price-of-a-chromebook-pixel dept.
Deathspawner writes: In advance of the rumored launch of AMD's next-generation Radeon graphics cards, NVIDIA has decided to pull no punches and release a seriously tempting GTX 980 Ti at $649. It's tempting both because the performance gained more than makes up for the extra $150 it costs over the GTX 980, and because it comes really close to the performance of the TITAN X while costing $350 less. AMD's job might just have become a bit harder. Vigile adds: The GTX 980 Ti has 6GB of memory (versus 12GB for the GTX Titan X), but PC Perspective's review shows no negative side effects from the drop. This implementation of the GM200 GPU uses 2,816 CUDA cores rather than the 3,072 cores of the Titan X, but thanks to higher average Boost clocks, performance between the two cards is identical. Hot Hardware has another equally positive, benchmark-laden review.
Microsoft

Windows 10 RTM In 6 Weeks 288

Posted by timothy
from the but-apple's-had-10-for-years dept.
Billly Gates writes: Ars Technica has the scoop on a new build with less-flat icons and confirmation of a mid-July release date. While Microsoft is in a hurry to repair the damage done by Windows 8, the next question is: is it ready for prime time? Neowin has a list of problems already reported by Microsoft and its users with this latest release, including Wi-Fi and sound not working without a reboot, and complaints about tiles and apps not working in the new Start menu.
Android

NVIDIA SHIELD Android TV Reviewed: Gaming and Possibly the Ultimate 4K Streamer 54

Posted by timothy
from the ensmallening-rocks dept.
Earlier this week, NVIDIA officially launched its SHIELD Android TV set-top device, which offers far more horsepower than something like a Roku or Apple TV (closer to an average game console) at an affordable $199 price tag. MojoKid writes: What's interesting is that it's powered by NVIDIA's Tegra X1 SoC, which features a Maxwell-derived GPU and eight CPU cores: four ARM A57 cores and four A53s. The A57 cores are 64-bit, out-of-order designs with multi-issue pipelines, while the A53s are simpler, in-order, highly efficient designs. Which cores are used depends on the particular workload being executed at the time. The 256-core Maxwell GPU has the same programming capabilities and API support as NVIDIA's latest desktop GPUs. In standard Android benchmarks, the SHIELD pretty much slays any current high-end tablet or smartphone processor in graphics; in standard compute workloads it is about on par with Samsung's octa-core Exynos while handily beating an octa-core Qualcomm Snapdragon. What's also interesting about the SHIELD Android TV is that it's not only an Android TV device with movie and music streaming services like Netflix, but it also plays any game on Google Play, with serious horsepower behind it. The SHIELD Android TV is also the first device certified for Netflix's Ultra HD 4K streaming service.
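The A57/A53 split works like ARM's big.LITTLE scheme: the OS migrates work between the fast cluster and the efficient cluster depending on load. Here's a toy sketch of the idea; this is not NVIDIA's or Android's actual scheduler, and the 0.6 threshold and task names are invented for illustration.

```python
# Toy big.LITTLE-style core selection: light work runs on the efficient
# in-order A53 cluster, heavy work on the out-of-order A57 cluster.
# The threshold and tasks are illustrative, not Tegra X1's real policy.

BIG_CLUSTER    = ["A57-0", "A57-1", "A57-2", "A57-3"]  # fast, power-hungry
LITTLE_CLUSTER = ["A53-0", "A53-1", "A53-2", "A53-3"]  # slower, efficient

def pick_cluster(estimated_load: float) -> list[str]:
    """Route a task by its estimated CPU demand (0.0 to 1.0)."""
    return BIG_CLUSTER if estimated_load > 0.6 else LITTLE_CLUSTER

for task, load in [("idle UI", 0.05), ("video decode", 0.3), ("game physics", 0.9)]:
    cluster = pick_cluster(load)
    print(f"{task:>12} (load {load:.2f}) -> {cluster[0][:3]} cluster")
```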
Graphics

Dell Precision M3800 Mobile Workstation Packs Thunderbolt 2, Quadro, IGZO2 Panel 133

Posted by samzenpus
from the running-the-numbers dept.
MojoKid writes: Dell recently revamped its M3800 to better entice graphic designers, engineers, and other high-end users who often work in the field with a true mobile workstation: one that's sufficiently equipped to handle professional-grade workloads yet thin and light to boot. Dell claims the M3800 is the "world's thinnest and lightest 15-inch mobile workstation," and at 4.15 pounds, it could very well be. ISV certifications matter for workstation-class machines, so the M3800 gets its pixel-pushing muscle from an NVIDIA Quadro K1100M GPU with 2GB of GDDR5 memory. Other notable specs include an Intel Core i7-4712HQ quad-core processor, 16GB of DDR3L memory, and a 256GB mSATA SSD. One new addition to the M3800 is a Thunderbolt 2 port, with transfer speeds of up to 20Gbps, that allows for simultaneous viewing/editing and backing up of raw 4K video. Finally, the M3800 is equipped with a 3840x2160 native resolution IGZO2 display, which equates to a 60 percent increase in pixel count over a current-gen MacBook Pro with Retina display. Performance-wise, the M3800 holds up well in standard productivity workloads, though as you can imagine, it excels most in graphics rendering throughput.
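A quick sanity check on that 60 percent figure, assuming the comparison point is the 15-inch Retina MacBook Pro's 2880x1800 panel:

```python
# Verify the pixel-count claim, assuming the 15-inch Retina
# MacBook Pro (2880x1800) as the comparison point.

m3800  = 3840 * 2160   # 8,294,400 pixels
retina = 2880 * 1800   # 5,184,000 pixels

print(f"M3800 has {m3800 / retina - 1:.0%} more pixels")  # -> 60%
```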
Graphics

Epic's VR Demo Scene For the GTX 980 Now Runs On Morpheus PS4 Headset At 60 FPS 35

Posted by timothy
from the blit-blit-bloop-bleep dept.
An anonymous reader writes: Originally created as an Unreal Engine 4 demo scene to push the limits of VR-capable graphics on the Oculus Rift 'Crescent Bay' prototype headset, Showdown is now running flawlessly at 60 FPS on Morpheus, Sony's PS4 VR headset. The demo was previously only able to run at Oculus' 90 FPS target VR framerate on the NVIDIA GTX 980, a GPU that costs nearly $200 more than the PS4 itself. To the delight of UE4 developers, the performance improvement comes from general optimizations to UE4 on PS4 rather than optimizations specific to Showdown.
The Media

WSJ Crowdsources Investigation of Hillary Clinton Emails 231

Posted by timothy
from the tag-this-story-recursive dept.
PvtVoid writes: The Wall Street Journal now has a page up that encourages readers to sift through and tag Hillary Clinton's emails on Benghazi. Users can click on suggested tags such as "Heated", "Personal", "Boring", or "Interesting", or supply their own tags. What could possibly go wrong? I'm tagging this story "election2016."
Handhelds

Asus ZenFone 2 Performance Sneak Peek With Intel Z3580 Inside 108

Posted by timothy
from the doesn't-work-with-google-fi-though dept.
MojoKid writes: Asus has finally made its ZenFone 2 available for sale in the US. It's an Intel-powered smartphone running Android Lollipop that's compatible with AT&T, T-Mobile, and other cellular networks that use GSM technology, such as Straight Talk, MetroPCS, and Cricket Wireless. The device packs a quad-core Intel Atom Z3580 (2.3GHz) with PowerVR G6430 graphics and 4GB of RAM, along with Intel 7262 and Intel 2230 modem tech, a 5.5" Full HD screen, a 13MP rear camera, dual-SIM support, and 802.11ac Wi-Fi. The high-end model can be had for only $299, unlocked. A $199 version with 2GB of RAM and a slightly slower Intel Atom Z3560 is also available. In the benchmarks, the ZenFone 2 offers competent though middling performance, but considering how aggressively Asus has priced it, it's sure to grab attention at retail from consumers looking for a phone without a contract commitment.
AMD

AMD Details High Bandwidth Memory (HBM) DRAM, Pushes Over 100GB/s Per Stack 98

Posted by timothy
from the lower-power-higher-interest dept.
MojoKid writes: Recently, a few details of AMD's next-generation Radeon 300-series graphics cards have trickled out. Today, AMD publicly disclosed new information about the High Bandwidth Memory (HBM) technology that will be used on some Radeon 300-series and APU products. Currently, a relatively large number of GDDR5 chips are necessary to offer sufficient capacity and bandwidth for modern GPUs, which means significant PCB real estate is consumed. On-chip integration is not ideal for DRAM, because it is not size- or cost-effective on a logic-optimized GPU or CPU manufacturing process. HBM, however, brings the DRAM as close as possible to the logic die (GPU). AMD partnered with Hynix and a number of other companies to help define the HBM specification and design a new type of memory chip with low power consumption and an ultra-wide bus, which was eventually adopted by JEDEC in 2013. They also developed a DRAM interconnect called an "interposer" along with ASE, Amkor, and UMC. The interposer allows DRAM to be brought into close proximity with the GPU and simplifies communication and clocking. HBM DRAM chips are stacked vertically, and "through-silicon vias" (TSVs) and "bumps" connect one DRAM chip to the next, then to a logic interface die, and ultimately to the interposer. The end result is a single package on which both the GPU/SoC and the High Bandwidth Memory reside. 1GB of GDDR5 memory (four 256MB chips) requires roughly 672mm² of board space; because HBM is vertically stacked, the same 1GB requires only about 35mm². The bus on an HBM chip is 1024 bits wide, versus 32 bits on a GDDR5 chip. As a result, the High Bandwidth Memory interface can be clocked much lower yet still deliver more than 100GB/s, versus 25GB/s for GDDR5. HBM also runs at a significantly lower voltage, which equates to lower power consumption.
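The bandwidth math follows directly from bus width times per-pin data rate. Here's a sketch with illustrative per-pin rates (roughly 1Gbps for first-generation HBM, 7Gbps for fast GDDR5); actual parts vary, which is why these figures land near, rather than exactly on, the 100GB/s and 25GB/s quoted above.

```python
# Per-device bandwidth = bus width (bits) * per-pin data rate / 8.
# Per-pin rates below are illustrative; real parts vary.

def bandwidth_gb_s(bus_bits: int, gbps_per_pin: float) -> float:
    return bus_bits * gbps_per_pin / 8

print(f"HBM stack, 1024-bit @ ~1 Gbps/pin: {bandwidth_gb_s(1024, 1.0):.0f} GB/s")
print(f"GDDR5 chip,  32-bit @ ~7 Gbps/pin: {bandwidth_gb_s(32, 7.0):.0f} GB/s")
```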
Windows

How Windows 10 Performs On a 12-inch MacBook 241

Posted by Soulskill
from the burning-questions dept.
An anonymous reader writes: As Microsoft prepares for the launch of Windows 10, review sites have been performing all sorts of benchmarks on the tech preview to evaluate how well the operating system will run. But now a computer science student named Alex King has made the most logical performance evaluation of all: testing Windows 10's performance on a 2015 MacBook. He says, "Here's the real kicker: it's fast. It's smooth. It renders at 60FPS unless you have a lot going on. It's unequivocally better than performance on OS X, further leading me to believe that Apple really needs to overhaul how animations are done. Even when I turn Transparency off in OS X, Mission Control isn't completely smooth. Here, even after some Aero Glass transparency has been added in, everything is smooth. It's remarkable, and it makes me believe in the 12-inch MacBook more than ever before. So maybe it's ironic that in some regards, the new MacBook runs Windows 10 (a prerelease version, at that) better than it runs OS X."
Graphics

A Look At GTA V PC Performance and Image Quality At 4K 72

Posted by timothy
from the so-you're-saying-more-is-better dept.
MojoKid writes: Rockstar's Grand Theft Auto series has been wildly successful for many years now, offering some of the edgiest storylines, gameplay tactics, and objectives the gaming industry has ever seen. With psychopathic main characters, you are left in the depraved communities of Los Santos and Blaine County to walk a path few would dare choose in real life. And it's rather entertaining, of course, that you're tasked with leaving a virtual world worse off than you found it, consequences be damned. But what does it take to run GTA V at 4K (3840x2160) resolution? This article takes a look at that, as well as how the game scales across multiple NVIDIA GeForce GTX 980 GPUs, along with some screenshots that examine image quality at Ultra HD resolution. It's safe to say one strong, high-end GPU will get the job done, but two in SLI are better, of course, if you want to max out all image-quality settings.
Graphics

Oculus Rift Hardware Requirements Revealed, Linux and OS X Development Halted 227

Posted by Soulskill
from the sad-penguin dept.
An anonymous reader writes: Oculus has selected the baseline hardware requirements for running their Rift virtual reality headset. To no one's surprise, they're fairly steep: NVIDIA GTX 970 / AMD 290 equivalent or greater, Intel i5-4590 equivalent or greater, and 8GB+ RAM. It will also require at least two USB 3.0 ports and "HDMI 1.3 video output supporting a 297MHz clock via a direct output architecture."

Oculus chief architect Atman Binstock explains: "On the raw rendering costs: a traditional 1080p game at 60Hz requires 124 million shaded pixels per second. In contrast, the Rift runs at 2160×1200 at 90Hz split over dual displays, consuming 233 million pixels per second. At the default eye-target scale, the Rift's rendering requirements go much higher: around 400 million shaded pixels per second. This means that by raw rendering costs alone, a VR game will require approximately 3x the GPU power of 1080p rendering." He also points out that PC graphics can afford a fluctuating frame rate — it doesn't matter too much if it bounces between 30-60fps. The Rift has no such luxury, however.
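Binstock's figures check out as straight width x height x refresh arithmetic. (The ~400M number comes from rendering above the panel's native resolution at the default eye-target scale, for which the quote gives no exact dimensions.)

```python
# Reproduce the shaded-pixel arithmetic from Binstock's quote.

def pixels_per_second(width: int, height: int, hz: int) -> int:
    return width * height * hz

print(f"1080p @ 60Hz: {pixels_per_second(1920, 1080, 60) / 1e6:.0f}M pixels/s")  # ~124M
print(f"Rift @ 90Hz:  {pixels_per_second(2160, 1200, 90) / 1e6:.0f}M pixels/s")  # ~233M
```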

The last requirement is more onerous: Windows 7 SP1 or newer. Binstock says development for OS X and Linux has been "paused" so the company can focus on delivering content for Windows. They have no timeline for returning to the less popular platforms.
Space

How SpaceX and the Quest For Mars Almost Sunk Tesla Motors 126

Posted by Soulskill
from the rocket-bucks-versus-car-bucks dept.
braindrainbahrain writes: Elon Musk and his rocket company are well known to Slashdotters. This article and book excerpt tell the story of the creation of SpaceX and how it almost sank Musk's other company, Tesla Motors. Musk recalls, "I could either pick SpaceX or Tesla or split the money I had left between them. That was a tough decision. If I split the money, maybe both of them would die. If I gave the money to just one company, the probability of it surviving was greater, but then it would mean certain death for the other company." But then, at the last moment, years of work at SpaceX finally paid off: "[O]n Dec. 23, 2008, SpaceX received a wonderful shock. The company won a $1.6 billion contract for 12 NASA resupply flights to the space station. Then the Tesla deal ended up closing successfully, on Christmas Eve, hours before Tesla would have gone bankrupt. Musk had just a few hundred thousand dollars left and could not have made payroll the next day." Also, it turns out the inspiration for SpaceX was the idea of sending mice to Mars.
Intel

Intel NUC5i7RYH Broadwell Mini PC With Iris Pro Graphics Tested 80

Posted by timothy
from the why-pay-for-big-any-more? dept.
MojoKid writes: In addition to ushering in a wave of new notebooks and mobile devices, Intel's Broadwell microarchitecture has also found its way into a plethora of recently introduced small-form-factor systems like the company's NUC platform. The new NUC5i7RYH is a mini PC packing a Core i7-5557U Broadwell processor with Iris Pro graphics, which makes it the most powerful NUC released to date. The 5th-gen Core i7 CPU (dual-core, quad-thread) can turbo up to 3.4GHz, and the system offers an Iris Pro 6100-series integrated graphics engine, support for dual-channel memory, M.2 and 2.5" SSDs, 802.11ac Wi-Fi, and USB 3.0. NUCs are generally barebones systems, so you have to build them up with a drive and memory before they can be used. The NUC5i7RYH is one of the slightly taller NUC systems that can accommodate both M.2 and 9.5mm 2.5" drives, and all NUCs come with a power brick and VESA mount. With a low-power dual-core processor and on-die Iris Pro 6100-series graphics, the NUC5i7RYH won't offer the same kind of performance as systems equipped with higher-powered processors or discrete graphics cards, but for everyday computing tasks and casual gaming, it should fit the bill for users who want a low-profile, out-of-the-way tiny PC.
Graphics

The Decline of Pixel Art 175

Posted by Soulskill
from the one-palette-at-a-time dept.
An anonymous reader writes: Blake Reynolds, lead artist for a pair of popular mobile games, has put up a post about the decline of pixel art in games. He decries the current state of "HD fetishism" in the industry, saying that games with great pixel art get needlessly marked down in reviews for their pixelation, while games that have awful — but high-res — art get glowing praise. He walks through a number of examples showing how pixel art can be well done or poorly done, and how it can be extremely complex despite the lower resolution. But now pixel artists are running into not only the expectation of high-definition content, but technological obstacles as well. "Some devices blur Auro [their game]. Some devices stretch it. Some devices letterbox it. No matter how hard I worked to make the art in Auro as good as I could, there's no way a given person should be expected to see past all those roadblocks. Making Auro with higher-resolution art would have made it more resistant to constantly-changing sizes and aspect ratios of various devices." Reynolds says his studio is giving up on pixel art and embracing the new medium, and recommends other artists do the same. "Don't let the medium come between you and your audience. Speak in a language people can understand so that they can actually see what makes your work great without a tax."
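The blurring and stretching Reynolds describes typically happens when a device scales pixel art by a non-integer factor with bilinear filtering. One common defense is integer-factor, nearest-neighbor scaling plus letterboxing. Here's a minimal sketch using Pillow; "sprite.png" and the 1920x1080 target are hypothetical placeholders.

```python
# Integer-factor, nearest-neighbor upscaling: a common way to keep
# pixel art crisp on arbitrary screens. Remaining space is letterboxed.
# "sprite.png" and the target resolution are placeholder examples.

from PIL import Image

def integer_scale(img: Image.Image, screen_w: int, screen_h: int) -> Image.Image:
    """Upscale by the largest whole-number factor that fits the screen."""
    factor = max(1, min(screen_w // img.width, screen_h // img.height))
    size = (img.width * factor, img.height * factor)
    return img.resize(size, Image.NEAREST)  # no interpolation, no blur

art = Image.open("sprite.png")
integer_scale(art, 1920, 1080).save("sprite_big.png")
```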
Security

GPU Malware Can Also Affect Windows PCs, Possibly Macs 49

Posted by samzenpus
from the protect-ya-neck dept.
itwbennett writes: A team of anonymous developers who recently created a Linux rootkit that runs on graphics cards has released a new proof-of-concept malware program that does the same on Windows. A Mac OS X implementation is also in the works. The problem the developers are trying to highlight lies not with the operating systems, such as Windows or Linux, nor with the GPU (graphics processing unit) vendors, but rather with existing security tools, which aren't designed to scan the random access memory used by GPUs for malware code.
Security

Proof-of-Concept Linux Rootkit Leverages GPUs For Stealth 67

Posted by Soulskill
from the jellyfish-eating-penguins dept.
itwbennett writes: A team of developers has created a rootkit for Linux systems that uses the processing power and memory of graphics cards instead of CPUs in order to remain hidden. The rootkit, called Jellyfish, is a proof of concept designed to demonstrate that running malware entirely on the GPU is a viable option. Such threats could be more sinister than traditional malware programs, according to the Jellyfish developers, in part because no tools exist to analyze GPU malware.
AMD

AMD Outlines Plans For Zen-Based Processors, First Due In 2016 166

Posted by samzenpus
from the check-it-out dept.
crookedvulture writes: AMD laid out its plans for processors based on its all-new Zen microarchitecture today, promising 40% higher performance-per-clock from the new x86 CPU core. Zen will use simultaneous multithreading to execute two threads per core, and it will be built using "3D" FinFETs. The first chips are due to hit high-end desktops and servers next year. In 2017, Zen will combine with integrated graphics in smaller APUs designed for desktops and notebooks. AMD also plans to produce a high-performance server APU with a "transformational memory architecture," likely similar to the on-package DRAM being developed for the company's discrete graphics processors. This chip could give AMD a credible challenger in the HPC and supercomputing markets, and it could also make its way into laptops and desktops.