bigwophh writes: Micron just launched its new 9100 Series NVMe solid state drives today, which come in multiple flavors and form factors. The Micron 9100 PRO series targets read-centric environments, while the 9100 MAX targets mixed-use cases. Capacities for the drives range from 800GB on up to 3.2TB, though all of the drives are outfitted with a similar Microsemi 16-core / 16-channel controller and 16nm Micron MLC NAND flash memory. The fastest drives in the series are rated for peak sequential read / write throughput of 3GB/sec and 2GB/sec, respectively. In testing, the drives generally outpace Intel's SSD DC P3700 series drive, but can't catch Intel's higher-end SSD DC P3608 in some read tests, though the Micron drives did outpace Intel's flagship in some write tests and can easily hit that peak 3GB/sec bandwidth number.
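To put those rated figures in perspective, here is a quick back-of-envelope sketch (assuming the decimal TB/GB units drive vendors typically quote) of how long a full sequential fill of the top-capacity drive would take at its rated write speed:

```python
# Back-of-envelope arithmetic for the rated figures above.
# Assumes decimal units (1 TB = 1e12 bytes, 1 GB/s = 1e9 bytes/s),
# as storage vendors typically quote.

def full_drive_write_seconds(capacity_tb: float, write_gb_s: float) -> float:
    """Seconds to sequentially fill the drive at its rated write speed."""
    capacity_bytes = capacity_tb * 1e12
    bytes_per_sec = write_gb_s * 1e9
    return capacity_bytes / bytes_per_sec

secs = full_drive_write_seconds(3.2, 2.0)
print(f"{secs:.0f} s (~{secs / 60:.1f} min)")  # 1600 s (~26.7 min)
```

In other words, even the largest 3.2TB model could theoretically be filled end-to-end in under half an hour at its 2GB/sec rated sequential write speed.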
bigwophh writes: Artificial intelligence has been a divisive topic as of late, and while we may never electronically mimic the human brain or its capabilities in their entirety, many of the world's brightest minds are now capable of creating rather convincing systems that are beginning to learn sophisticated concepts. Some are even capable of learning on their own. IBM, for example, is working hard on its Watson cognitive computing platform. However, the company faces a challenge in keeping Watson’s massive data stores fresh, with the world's devices producing some 2.5 exabytes of data every day – and the world's total data expected to reach a whopping 44 zettabytes by the year 2020. To keep up with the information overload, IBM announced late last year that it was adding NVIDIA's Tesla K80 processing engines to its neural network. Those high-performance compute GPUs are playing a key role in Watson's cognitive computing development, especially in terms of natural language processing capabilities. Watson is now more capable and human-like, especially when encapsulated in a robot body. At NVIDIA's GPU Technology Conference (GTC), Rob High, an IBM fellow, vice president, and CTO for Watson, introduced attendees to a robot powered by Watson. During the demonstration, the Watson-infused robot naturally responded to queries just like a human would, using not only speech but subtle movement. A dance routine even made it into the mix. The underlying AI at work can also get a read on people’s emotions and mood through movement and analysis of their speech patterns.
bigwophh writes: In Q4 2016, Intel will release a follow-up to its Skylake processors named Kaby Lake, which will mark yet another 14nm release. That's a bit odd, for a couple of reasons. The big one is the fact that this chip may not have appeared had Intel's schedule kept on track. Originally, Cannonlake was set to succeed Skylake, but Cannonlake will instead launch in 2017. That makes Kaby Lake neither a tick nor a tock in Intel's release cadence. When released, Kaby Lake will add native USB 3.1 and HDCP 2.2 support. It's uncertain whether these chips will fit into current Z170-based motherboards, but considering the fact that there's also a brand-new chipset on the way, we're not too confident of it. However, the so-called Intel 200 series chipsets will be backwards-compatible with Skylake. It also appears that Intel will be releasing Apollo Lake as early as late spring, which will replace Braswell, the lowest-powered chips in Intel's lineup destined for smartphones.
bigwophh writes: Last year, Elon Musk hinted at a new product that Tesla Motors was working on in its research lab. What Musk described seemed creepy at the time, especially considering that he had just recently shown off “The D” at an evening press event. “By the way, we are actually working on a charger that automatically moves out from the wall and connects like a solid metal snake,” said Musk. We didn’t think much else about this intriguing contraption given the precious few details that Musk provided at the time. But fast forward seven months and we now have video of the serpent-like charger in action.
bigwophh writes: 14nm Broadwell processors weren’t originally destined for the channel, but Intel ultimately changed course and launched a handful of 5th Generation Core processors based on the microarchitecture recently, the most powerful of which is the Core i7-5775C. Unlike all of the mobile Broadwell processors that came before it, the Core i7-5775C is a socketed, LGA processor for desktops, just like 4th Generation Core processors based on Haswell. In fact, it’ll work in the very same 9-Series chipset motherboards currently available (after a BIOS update). The Core i7-5775C, however, features a 128MB eDRAM cache and integrated Iris Pro 6200 series graphics, which can boost graphics performance significantly. Testing shows that the Core i7-5775C's lower CPU core clocks limit its performance versus Haswell, but its Iris Pro graphics engine is clearly more powerful.
bigwophh writes: Machine learning has helped a multitude of different technologies become a reality, including emotion-detection. Most examples to date have been rather simple, such as being able to detect a smile or a frown. But with today's super-fast computers, and even mobile devices, we're now able to detect emotion with far greater accuracy and nuance. Facial recognition expert Rana el Kaliouby recently gave a talk at TED to highlight just how accurate emotion-detection has become, and depending on your perspective, the result is either amazing or downright scary. To accurately detect someone's emotion, Rana's software detects eight different factors, among them frowning, disgust, engagement, and raised eyebrows. Research with this software has revealed a couple of interesting findings: in the United States, for example, women are 40% more likely to smile than men. The technology is ultimately destined for software that will detect the user's emotion and react accordingly.
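The factor-based approach described above can be sketched as a weighted scoring over detected facial cues. The factor names and weights below are purely hypothetical illustrations, not el Kaliouby's actual model:

```python
# A toy sketch of factor-based emotion scoring, in the spirit of the
# system described above. The factor names and weights here are
# hypothetical illustrations, not the real product's model.

# Per-emotion weights for a few observable facial factors.
WEIGHTS = {
    "joy":     {"smile": 1.0, "raised_eyebrows": 0.2, "frown": -0.8},
    "sadness": {"frown": 0.9, "smile": -0.7},
    "disgust": {"nose_wrinkle": 1.0, "smile": -0.5},
}

def score_emotions(factors: dict) -> dict:
    """Combine detected factor intensities (0..1) into per-emotion scores."""
    return {
        emotion: sum(w * factors.get(f, 0.0) for f, w in fw.items())
        for emotion, fw in WEIGHTS.items()
    }

# One video frame's detected factor intensities.
frame = {"smile": 0.9, "raised_eyebrows": 0.3}
scores = score_emotions(frame)
print(max(scores, key=scores.get))  # joy
```

A real system would learn such weights from labeled video rather than hand-tuning them, but the shape of the computation — many weak facial cues combined into an emotion estimate — is the same.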
bigwophh writes: AMD announced its new Radeon R9 and R7 300 series of graphics cards earlier this week, and while they are interesting, they're nowhere near as titillating as the upcoming flagship AMD GPU: Fiji. Fiji will find its way into three distinct products this summer: the Radeon R9 Nano, Radeon R9 Fury, and the range-topping (and water-cooled) Radeon R9 Fury X. And we can’t forget other upcoming variants like the dual-Fiji board that was teased at E3. While full reviews are still under embargo, the official specifications of the R9 Fury X have been revealed, along with an array of benchmark scores comparing the GPU to NVIDIA’s GeForce GTX 980 Ti. Should the numbers AMD has released jibe with independent reviews, the Fury X looks very strong.
bigwophh writes: A study published in the journal Computers in Human Behavior suggests that watching videos of cats may be good for your health. The study pinged nearly 7,000 people and asked them how viewing cat videos affected their moods. Of those surveyed, over a third (36 percent) described themselves as a "cat person" and about two-thirds (60 percent) said they have an affinity for both dogs and cats. Survey subjects reported feeling less anxious, sad, or annoyed after watching cat videos, including times when they viewed the videos while at work or trying to study. They also reported feeling more energetic and more positive afterwards. There may have been some guilt from putting off work or studying to watch Internet videos, but the amusement they got from seeing the antics of cats more than made up for it.
bigwophh writes: Intel gathered a number of its OEM and software partners together in New York this past week to showcase the latest technologies and innovations surrounding the company's RealSense 3D camera. From new interactive gaming experiences to video collaboration, 3D mapping and gesture controls, the front-facing RealSense technology holds real promise that could someday reinvent how we interact with PCs. The F200 RealSense camera module itself integrates a depth sensor and a full-color 1080p HD camera together with standard components like dual array mics; combined with an SDK, an on-board processing engine, and third-party software, the camera module can sense numerous environmental variables. In the demos that were shown, RealSense was used to create an accurate 3D map of a face in a matter of seconds, track gestures and respond to voice commands, interact with a game, and remove backgrounds from a video feed in real-time for more efficient video collaboration.
bigwophh writes: Asus announced the super-slim Zenbook UX305 during the IFA trade show in Berlin in September. The machine will be available in two models, one with a 1920x1080 IPS display and one with a QHD+ display that boasts a native resolution of 3200x1800. They’re both built around the power efficient Intel Core M processor, which was designed for ultra-thin, fanless form factors. Intel’s Core M offers some significant advances both in terms of power consumption and performance, which enables many of the design features found on the 12.3mm thin UX305. The Core M 5Y10 in the Asus Zenbook UX305 is complemented by 8GB of RAM and a 256GB SSD, and this is one of the few ultrabooks to feature a matte display. All told, the machine put up some decent numbers in the benchmarks and battery life was excellent.
bigwophh writes: NVIDIA’s Android-based, portable gaming system and media streaming device, originally known as Project SHIELD, was a big hit at CES. NVIDIA has since dropped "Project" from the name and it appears the device is about ready to ship. If you’re unfamiliar with SHIELD, it is essentially a game controller with a built-in, flip-up 5” multi-touch screen. It is powered by NVIDIA’s own Tegra 4 quad-core SoC (System-on-Chip) with ARM A15 CPU cores, 72 GPU cores, 2GB of RAM, 16GB of internal storage, 802.11n 2x2 MIMO Wi-Fi, Bluetooth 3.0, and GPS support, among a number of other features. In addition to offering an array of Tegra-optimized games, part of SHIELD’s allure is the ability to wirelessly stream games and other media from a GeForce GTX-powered PC to any TV connected to SHIELD. Pricing for the device is set at $349 and pre-sales begin on May 20.
bigwophh writes: AMD has just announced hUMA, an acronym for heterogeneous Uniform Memory Access, which will arrive with the company’s Kaveri APU architecture sometime later this year. To understand hUMA, we should first talk about the UMA (Uniform Memory Access) and NUMA (Non-Uniform Memory Access) architectures prevalent today. UMA refers to how the processing cores in a system view and access memory: in a UMA-capable architecture, all processing cores share a single memory address space. NUMA architectures, however, require data to be managed across multiple heaps with different address spaces for CPUs and GPUs, which adds programming complexity. Things come full circle with hUMA though, with both the CPUs and GPUs accessing the entire memory space with bi-directional coherency. AMD claims that programming for hUMA-enabled platforms should ease software development and potentially lower development costs as well. The technology is supported by mainstream programming languages like Python, C++, and Java, and should allow developers to more simply code for a particular compute resource with no need for special APIs.
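The programming-model difference can be illustrated with a toy sketch (plain Python, no real GPU involved): with separate CPU and GPU heaps, data must be staged into device memory and copied back, while a hUMA-style unified heap lets both sides operate on the very same buffer:

```python
# Toy sketch of the programming-model difference described above.
# This is plain Python simulating the data flow, not actual GPU code.

def gpu_double(buf):
    """Stand-in for a GPU kernel: doubles every element in place."""
    for i in range(len(buf)):
        buf[i] *= 2

def numa_style(cpu_heap):
    """Separate address spaces: explicit host->device and device->host copies."""
    gpu_heap = list(cpu_heap)   # host -> device copy
    gpu_double(gpu_heap)        # kernel runs on the device-side copy
    return list(gpu_heap)       # device -> host copy

def huma_style(shared_heap):
    """Unified, coherent address space: the kernel takes the CPU's pointer."""
    gpu_double(shared_heap)     # operates on the shared buffer directly
    return shared_heap          # no staging copies needed

print(numa_style([1, 2, 3]))     # [2, 4, 6]
data = [1, 2, 3]
print(huma_style(data) is data)  # True: same buffer, zero copies
```

Both paths compute the same result; what hUMA removes is the pair of staging copies (and the bookkeeping of keeping two heaps in sync), which is exactly the programming complexity the blurb refers to.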
bigwophh writes: Samsung has officially released its next-generation flagship smartphone for the U.S. market today, the Galaxy S 4. Samsung's choice of components and its implementation of Android 4.2.2 quite simply make the Galaxy S 4 the fastest Android smartphone tested to date. The 1.9GHz quad-core Snapdragon 600 SoC at the heart of the Galaxy S 4 put up some excellent benchmark scores, and during real-world use, the phone is just plain fast. The 1920x1080 Full HD, 441 PPI screen on the Galaxy S 4 is also very nice, and despite the phone’s high performance, battery life is solid too. The Galaxy S 4 also sports 2GB of RAM, a 13MP camera, an IR blaster, and a number of other additional sensors and updates that allow for some interesting features like hand gesture interface controls.
bigwophh writes: Though the product has obviously been in development for quite some time, AMD's new Radeon HD 7990 is still an exciting release. Despite the relative age of its GPUs, the 7990 has the potential to be the fastest graphics card to hit the market yet. NVIDIA’s been firing on all cylinders with its GeForce GTX 600 series products, but AMD has steadily wrung out additional performance from its GPUs through updated drivers and refreshed products, clocked slightly higher than their predecessors. The Radeon HD 7990 offers double the number of transistors, stream processors, texture units, ROPs, and memory of AMD’s previous flagship Radeon HD 7970. That’s because the Radeon HD 7990 is outfitted with two of the same GPUs and the same 3GB of frame buffer memory for each (6GB total). In the benchmarks, AMD's new dual-GPU card bests NVIDIA's GeForce GTX 690 and even a pair of GeForce GTX 680 cards in SLI under most workloads.
bigwophh writes: "A few weeks back, the PC enthusiast community was abuzz after news broke that AMD’s Radeon HD 7000 series would remain “stable throughout 2013” and would be the company’s focus “for quite some time”. The wording of the initial news made it sound like AMD wouldn’t be releasing any new GPUs for the rest of the year and that it would further differentiate its offerings with only new game and software bundles. Well, here we are, about a month out from that news, and AMD is ready with a new mainstream graphics card, which features a brand new GPU, codenamed Bonaire. Today AMD is taking the wraps off of the Radeon HD 7790, a mainstream GPU designed to fill the gap in the company’s current graphics card lineup between the Radeon HD 7770 and HD 7850 and challenge NVIDIA’s GeForce GTX 650 Ti."