
Comment: Re:all (Score 1) 104

by default luser (#48197141) Attached to: Which Android Devices Sacrifice Battery-Life For Performance?

Turn on Power Save mode and turn off Wi-Fi? Check to see if you have any CPU-heavy applications, and force-quit them while not in use?

Also, the amount of power used depends entirely on how strong a signal you are getting. For example, my Galaxy S4 typically uses 30-35% battery per day on a normal home/work day, but this last weekend I went up to the middle of nowhere, PA. The house barely got 3G at one bar, and because of the shit signal my phone was down to 30% every night. If your place of work is inside a large building, it can play havoc with your signal.

Comment: Re:Yes yes yes (Score 1) 405

They are "undervalued" in your eyes. In reality, they get no value because there is no more growth, as evidenced by the rise in Japan's base unemployment since the early 1990s.


While other countries have employment ups and downs, Japan's has been on a one-way downward spiral because there is no more real growth. Since the 1980s, manufacturing has moved to Korea and China, and there's not enough of a service and tech industry to cover the loss.

So yeah, for the last 20 years those people who have jobs are treated well, and the young people get the finger. Same thing is beginning to happen here in the United States, although not as massive a movement. The Japanese companies are going to either adapt, or crater and then be reborn as fast-and-lean pension-less workhouses like the US.

Comment: Re:How much! (Score 1) 405

This only works for simple stuff like clothing, furniture, dishware and other branded crap. Anything with a visible tie to the team (prominent logo), not too gaudy, and with some basic functionality understandable by the average idiot is a guaranteed sale.

Most people won't buy anything as complicated as a Surface just because they see NFL players using them. There's no SIMPLE need that the Surface satisfies, and then it's also rather expensive, and thanks to poor naming and a late arrival to market, it's lost in a sea of tablets/convertibles. There's also no actual NFL branding on the device, just product placement; so as soon as they see it, most people just automatically assume it's an iPad anyway.

Anything complex to use is best not sold directly to the NFL audience. I mean, nobody went out to buy Motorola radio headsets (or even their cell phones) just because they saw them every week for 13 years on the NFL sidelines, but they sure as hell buy the team-branded warmups that everyone is wearing. Electronics sponsorship for the NFL is more about letting everyone know your company still exists, not about pushing specific products - much like putting your name on a stadium.

Comment: There's more to it than just that (Score 2) 65

by default luser (#47560515) Attached to: $299 Android Gaming Tablet Reviewed

You have to want a better streaming experience than Valve's Steam already offers for free (and you can buy a Windows tablet for the same price, and Valve is expected to support Android and iOS soon). With Steam you can use whatever system and whatever video card you want to stream to and from - you can even go wired Ethernet to get around the inevitable problems of streaming games over wireless.

If you go Shield, the tablet price is just the beginning: you have to have a mid-range GeForce card purchased within the last 2 years ($120+ if you don't already have one), the controller + stand ($100), and of course a suitable dual-band router runs at least $70 (most people use the crap one that came with their internet install).

In all, you could be on the hook for anywhere from $400 all the way up to $600. That's getting DANGEROUSLY close to the same price as an entry-level gaming PC, so again the need just doesn't present itself there.
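The rough math behind that range, using the prices quoted above (2014 ballpark figures from this comment, not current data):

```python
# Shield streaming cost estimate, using the approximate prices above.
tablet = 300            # Shield tablet
controller_stand = 100  # controller + stand

# Extras you may already own:
gpu = 120               # mid-range GeForce, if you need one
router = 70             # dual-band router, if you need one

best_case = tablet + controller_stand                 # already own GPU + router
worst_case = tablet + controller_stand + gpu + router # need everything

print(best_case)   # 400
print(worst_case)  # 590
```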

Comment: Re:I can already do this on Android... (Score 1) 42

You'll notice in the second half of my post that I called the separate controller + tablet + stand gaming method CLUMSY, and praised the original Shield for being all-in-one. This is because I agree with you that this sort of gaming is untenable, and I actually gave up and bought a 3DS about 6 months back to get my portable gaming fix.

Touchscreens suck for gaming!

Also, you'll notice I suggested the WIRED 360 controller because it's not nearly as complicated as pairing a wireless Bluetooth controller. Just plug it in and play your games...or you can sit here all day and complain about Android breaking your favorite Bluetooth controller.

Comment: Re:I can already do this on Android... (Score 1) 42

The K1 has *vastly more GPU power* than any other ARM SoC out there.

And your point is?

Games are being produced for the mainstream-to-high-end SoCs currently out there, so an SoC with twice the power brings nothing useful to the table. And streaming PC games requires almost no local GPU power, so that's another questionable-value case.

Then there's also the concern about whether K1 will throttle under load, since this tablet doesn't have a fan like last year's Shield did. You'll notice that NONE of today's reviews go into that level of detail, and that's probably no accident.

Comment: I can already do this on Android... (Score 1) 42

With something as simple as a USB OTG adapter and a wired Xbox controller.

Believe it or not, most modern Android games already support the Xbox controller, and if you're gaming on such a tiny screen, believe me, a wireless controller is NOT a necessity (you'll be 1-2 feet away at most). You can buy a $200-ish entry-level Android tablet that can handle games just fine, and reuse the Xbox controller you undoubtedly already have. So why would you spend $300 + $100 for the same thing with an Nvidia label on it?

I'm actually disappointed that they didn't stick with the concept of the original Shield, and deliver another handheld gaming system; that alone is the only thing you cannot find in the Android world. If they insist that you use a clumsy tablet setup involving a screen prop and a separate controller, then why do we have to buy the tablet, screen prop and controller FROM THEM?

Comment: Promises... (Score 1) 64

by default luser (#47457491) Attached to: Coming Soon(ish) From LG: Transparent, Rollup Display

Also, there's the unavoidable problem of display clarity. Right now screens are on a flat substrate, so each pixel is aligned with the next one, which reproduces an image accurately. But what happens when you have an unrolled display sitting on your desk, or held in your hand? It will inevitably have varying degrees of curve along its length, and possibly more complex crumples, resulting in poor image accuracy. Fixing that will require some clever sensors embedded in the display along with some expensive signal processing, and that fix will STILL cost you resolution.

Then when you consider that LG's current flexible displays have poor color rendition and contrast, along with piss-poor resolution, you realize how much of a lost cause this is. I cannot see myself giving up the best qualities of modern displays so that they break a little less often, and can fit in a smaller pocket.

Comment: Re:Moore's Law (Score 1) 143

by default luser (#47310839) Attached to: Researchers Unveil Experimental 36-Core Chip

In your glass tower, yes.

In the real world, not so much.

Here is an example of one of the world's most optimized pieces of software: x264. It's also one of the few real-world loads that can take advantage of multiple processors and SSE. So how much speedup did this incredible piece of software see from AVX2, which DOUBLED the width of the integer pipelines?

FIVE PERCENT! Yup, that's it!

All that work for so very little improvement, because in the REAL WORLD data does not align on perfect AVX2 boundaries, and data fetch is as much of a hindrance as the actual processing of that data. Read more about WHY this is the best that could be done here, if you don't mind paying for SCRIBD.
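To put numbers on it, this is just Amdahl's Law at work: if only a small slice of the runtime is actually limited by integer SIMD width (the rest being data fetch, branching, and scalar work), doubling that width barely moves the total. A rough sketch - the 10% SIMD-bound fraction is my illustrative guess, not a measured x264 profile:

```python
# Amdahl's Law: overall speedup = 1 / ((1 - p) + p / s)
# p = fraction of runtime that benefits, s = speedup of that fraction.
def amdahl(p, s):
    return 1.0 / ((1.0 - p) + p / s)

# Assume ~10% of the runtime is bound by integer SIMD width, and AVX2
# doubles throughput for exactly that part (s = 2).
speedup = amdahl(0.10, 2.0)
print(round((speedup - 1.0) * 100, 1))  # 5.3 -> ~5% overall improvement
```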

Parading around test results from something like Passmark is just self-delusion. It only verifies that the features do in fact work, and these tests tend to work directly from cache on small data sets that are usually not branch-heavy. It gives a score based on MIPS, but does not account for the fact that most real software can't actually make use of these features at speed.

And when they increase the vector size yet again to 512 bits wide in a year, it will once again be a limited real-world improvement, because optimizing real loads is hard, and auto-vectorizing arbitrary loads is an even harder problem to solve. So Intel keeps adding new features, and each adds about 5-7% (real world). I don't see how you get above 3x from those puny performance increases without deluding yourself.

Comment: Re:Moore's Law (Score 1) 143

by default luser (#47306865) Attached to: Researchers Unveil Experimental 36-Core Chip

Which uses Passmark, which is a simple corner-case number-crunching bonanza. Pure AVX2 or FMA without any real-world qualifiers, restrictions or branching? Sure, we've got that!

And even with that, you're still off. The performance improvement with Haswell per-core is less than 5x. See here:[]=2020&cmp[]=1127

So, in the unoptimized case the performance improvement is 2-3x, and in the embarrassingly-parallel case the speedup is 4-5x. But then, if you had such an embarrassingly-parallel case, you'd just port it to OpenCL and be done with it. Haswell is for all those hard-to-optimize compute cases.

Comment: Re:Moore's Law (Score 1) 143

by default luser (#47298799) Attached to: Researchers Unveil Experimental 36-Core Chip

Absolutely not true.

The Core 2 Duo is approximately 2x faster clock-for-clock than the Pentium 4, and the current Haswell core is barely 40% faster than that (assume a ~7% per-clock speedup for every core revision since). That gets you somewhere in the 2x-3x performance improvement range for Haswell, barring corner cases that can easily leverage AVX/FMA (most real-world use cases show small improvements).
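The compounding works out like this - counting five core revisions between Core 2 and Haswell is my assumption, chosen to match the ~40% figure above:

```python
# Compound per-clock improvement, using the figures in this comment:
# Core 2 is ~2x the Pentium 4 per clock, and each core revision
# since adds ~7% per clock.
core2_vs_p4 = 2.0
per_revision = 1.07
revisions = 5  # assumed count, Core 2 -> Haswell

haswell_vs_core2 = per_revision ** revisions   # ~1.40, i.e. "barely 40%"
haswell_vs_p4 = core2_vs_p4 * haswell_vs_core2

print(round(haswell_vs_core2, 2))  # 1.4
print(round(haswell_vs_p4, 2))     # 2.81 -> in the 2x-3x range
```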

Intel proved that they could do a whole lot better than the Pentium 4, but your performance improvement factor is off by half!

Comment: This affects our entire industry (Score 1) 142

by default luser (#47282597) Attached to: Will 7nm and 5nm CPU Process Tech Really Happen?

Because whatever you do in the computing world, you are affected by processing power and cost. Growth in these areas drives both new hardware and the new software to go with it, and any hit to that growth will mean a loss of jobs.

Software (what most of us here create) usually gets created for one of two reasons:

1. Software is created because nobody is filling a need. Companies may build their own version if they want to compete, or a company may contract a customized version if they see increased efficiency or just have a process they want to stick to. There used to be a lot of unfulfilled need out there, but that demand has been largely sated in the 21st century.

2. Software is created because a company desires increased performance/new features (the basic need is filled; this is a WANT). Once a new processor/feature becomes available, you either wedge it into existing code, or, if it's a massive enough improvement, you create entirely new software enabled by the new level of performance-per-dollar.

Without continued growth, the industry is in danger of cratering, because there's only so much processor architecture optimization you can do on the same process node, and the same goes for optimized libraries on the software side. In addition, brand-new industries enabled by cost reductions (e.g. the digital FMV explosion in the 1990s, or the movement to track your every move in the 2000s) will no longer be so common, and that will again force people to look elsewhere for employment.

Software engineers won't disappear, but they will be culled. The industry has not had to deal with that yet in its entire history, so it will be painful. I'm hoping they can hold this off for as long as possible!
