
Comment Re:Here's what's wrong (Score 1) 134 134

I'm nearly 100% certain both displays are IPS - one is just anti-glare while the other is glossy + touch. The linked review would certainly have noticed otherwise, but instead it essentially just says the more expensive one is rubbish...
(FWIW, the real distinction is TN vs. IPS - both of these are types of TFT.)

Comment Re:Yes (Score 1) 1067 1067

FWIW, there are indeed languages where division by zero doesn't result in an error - typically languages which can't throw exceptions. GLSL integer division gives an undefined result, for instance, as does the same thing in OpenCL. One reason, of course, is that these languages are meant to operate on vectors in hardware, and even if you could, you wouldn't want to throw an exception just because one element of a vector is bad when the rest of it is fine. Of course, unless you really don't care about the result in that case, you'd better make sure you recognize those zeros and handle them somehow (for instance, after doing the division, replace the result with zero or whatever if the divisor was zero). The d3d10 shading language, OTOH, gives you a defined result, but not zero (0xffffffff, for both quotient and remainder).
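A minimal sketch of that "patch the bad lanes afterwards" idea, written as plain host-side C++ rather than actual shader code (on a GPU you'd typically divide unconditionally and select the replacement, but in C++ that would be undefined behaviour, so this branches instead):

    #include <array>
    #include <cstdint>

    // Element-wise integer division: divide where legal, substitute 0
    // (or whatever sentinel you prefer) where the divisor is zero.
    std::array<int32_t, 4> safe_div(const std::array<int32_t, 4>& num,
                                    const std::array<int32_t, 4>& den)
    {
        std::array<int32_t, 4> out{};
        for (std::size_t i = 0; i < 4; ++i)
            out[i] = (den[i] != 0) ? num[i] / den[i] : 0;
        return out;
    }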

Comment Re:See nothing that says this is x86 (Score 1) 128 128

Well, that Q6600 has a 105W TDP - this new cpu has a 2W SDP (ok, that's a marketing BS number; intel doesn't disclose the TDP, but it should be around 4W or so). And that includes the gpu (which isn't included in the Q6600's figure, and this gpu would be WAY faster than the one in the chipset the Q6600 came with, at least if that was an intel chipset).
But yes, in absolute cpu performance terms this can't touch a Q6600. However, compared to other tablet cpus it should do ok (compared to the broadwell Core-M chips some managed to cram into similarly sized fanless devices it will still lose, but at least it shouldn't have as many thermal issues). Something like the iPad Air 2 could also be close in performance (or beat it, depending on the task).
So, comparing it to desktop-class cpus isn't really fair, and against chips of the same class it should do ok. (I don't really trust those preliminary benchmark numbers, though by all accounts they could be accurate - cpu-wise this is essentially a die-shrunk baytrail atom, so probably roughly the same performance with somewhat reduced power draw, though the graphics should be much improved - it now has intel Gen8 graphics with 16 EUs instead of Gen7 graphics with just 4 EUs, though clocked lower.)

Comment Re:Easier to support than OpenGL 4.x (Score 1) 52 52

If it runs on Haswell, it should run on Ivy Bridge and Bay Trail Atoms as well. Bay Trail Atoms and Ivy Bridge are the same graphics generation (Gen7), and Haswell is only minimally different (Gen 7.5), sharing nearly all driver code. Sandy Bridge is definitely different, though since Khronos says everything supporting GLES 3.1 and up should be able to support it (meaning even features such as geometry shaders have to be optional), I guess it should be good enough. I'm not convinced anyone is actually going to write a driver for it, though - but who knows?

Comment Re:Transactional Memory support (Score 1) 189 189

That is not quite correct: Haswell (or any cpu which supports AVX2) can do gathers, but not scatters. I agree this can be useful. I'm not sure how much faster it'll actually be in practice compared to using multiple loads, though. Note that ever since SSE4.1 you typically don't need any shuffles to achieve the same thing with multiple loads, since you can use pinsrx instructions to fill up those vectors directly - so your concerns about increased register usage are unfounded. And the gather instruction is microcoded in the cpu, so to the execution core it still looks like multiple single loads anyway.
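To illustrate, a rough sketch (compile with something like -msse4.1 -mavx2; the function names are mine, the intrinsics are the standard ones):

    #include <immintrin.h>

    // SSE4.1: four scalar loads inserted with pinsrd - no shuffles needed,
    // and no extra registers beyond the destination.
    static __m128i gather4_sse41(const int* base, const int idx[4])
    {
        __m128i v = _mm_cvtsi32_si128(base[idx[0]]);   // movd
        v = _mm_insert_epi32(v, base[idx[1]], 1);      // pinsrd
        v = _mm_insert_epi32(v, base[idx[2]], 2);
        v = _mm_insert_epi32(v, base[idx[3]], 3);
        return v;
    }

    // AVX2: a single (microcoded) vpgatherdd doing the same thing.
    static __m128i gather4_avx2(const int* base, const int idx[4])
    {
        __m128i vindex = _mm_loadu_si128(reinterpret_cast<const __m128i*>(idx));
        return _mm_i32gather_epi32(base, vindex, 4);   // scale = sizeof(int)
    }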

Comment Re:Valve finds Intel's driver to be great. (Score 1) 159 159

I don't know how well it would scale up. One thing is for sure, though: they'd need to scale more than just the execution units (which is all they vary for now).
Oh, and your scaling numbers are a bit off. intel only has 6-16 EUs, but these are 8-wide. So if they wanted a chip comparable to a high-end nvidia or amd card, they'd only need somewhere below 200 EUs (they also run at a somewhat higher frequency), not 1600 (which would be insane). Likewise, for a decent performance card ~100 EUs would be enough.
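Rough arithmetic behind that (the 1536/800 shader counts are just illustrative figures for high-end/performance cards of that era, nothing official):

    1536 shader cores / 8 lanes per EU = 192 EUs   (high-end, i.e. "below 200")
     800 shader cores / 8 lanes per EU = 100 EUs   (performance card)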

Comment Re:"moving irresistibly"? (Score 0) 673 673

This is the fastest GPU (well, either that or a similarly fast one from AMD, the HD7850M/HD7870M) you can fit into such a case (and, like other similarly sized notebook chassis, it already struggles to maintain full clocks under high cpu+gpu load).
I don't consider this underpowered. Sure, gaming at the panel's full resolution is a no-go, but neither the high display resolution nor the MacBook itself is intended for that. If you want to game, just use a lower resolution with upscaling (which costs some performance too, but not that much) - though why any serious gamer would even consider a MacBook completely escapes me.
There's a reason the GeForce 680M (and Radeon HD7970M) are only found in big and bulky gaming notebooks.

Comment Re:The Raspberry PI is currently underpowered (Score 1) 95 95

There's no way a 1GHz Cortex-A8 is ~4 times faster than a 700MHz ARM11. A factor of 2 would be a better estimate (looking only at the cpu; it also has twice the memory, and I'm not sure about the gpu - if it can be used at all, it will obviously be better than a plain framebuffer driver...).
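Back-of-the-envelope, using ARM's commonly quoted per-clock figures (so take with a grain of salt):

    ARM11 (ARM1176): ~1.25 DMIPS/MHz x  700 MHz ≈  875 DMIPS
    Cortex-A8:       ~2.0  DMIPS/MHz x 1000 MHz ≈ 2000 DMIPS

That's roughly a 2.3x difference, nowhere near 4x.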

Comment Re:Years off? (Score 1) 386 386

"50 shader cores" vs. 800 is not quite a valid comparison. The 800 number is totally unconfirmed and based on the rumor that the chip will be "similar to rv770", personally I believe people read a bit too much into that "similar" by deducing it will have the same number of shader units (if I'd have to guess I would say it's closer to lower performance level chips, like 480 shader units). Plus, if you count the cores like that, the X360 really has 192 shader cores, not 50 (3x16x4), though granted the ones in the Wii U (no matter the number) should be more flexible.
So I have some doubts it will be "phenomenally more capable" (some cautious statements from Nintendo about being able to match the PS3/360 in graphics add to that, as does the tiny form factor), though it might indeed be somewhat better.

Comment Re:It might be worse than that. . . (Score 1) 234 234

The reactors were not 1000MW electrical - units 2 and 3 were 784MW, unit 1 was 460MW, so the thermal output was about 2x2.2GW and 1.3GW respectively.
The 7% figure for decay heat is only true immediately after shutdown; after an hour (roughly when the tsunami hit) it's already down to about 1.5% (and there was some limited cooling after that thanks to battery backup). So you're really only looking at something like 20-30MW per reactor, which is not THAT much. It looks easy enough to use some portable generator to get a pump going (a few hundred kW of electrical power should probably be enough), but apparently it didn't work out that way...
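The arithmetic is just the percentage above applied to the thermal figures (and it keeps falling over the following hours):

    0.015 x 1.3 GW thermal ≈ 20 MW   (unit 1)
    0.015 x 2.2 GW thermal ≈ 33 MW   (units 2 and 3)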
The GP's suggestion to rely on the generators/turbines for cooling by just shutting down to self-sustaining power levels sounds extremely risky to me, though. The tsunami certainly flooded not only the diesel generators but other areas as well. If the diesels didn't work after the tsunami, I have some doubts the turbines did (not to mention there could have been short circuits or even direct earthquake-related damage). We never really heard whether there was damage to those parts, if anyone even knows.

Comment Re:Time to buy all new chipsets! (Score 1) 163 163

It's not like intel reduced the number of memory channels. This chip is for the mainstream desktop platform, which currently uses LGA1156 and also has "only" 2 memory channels (I put "only" in quotes because for desktop workloads, the 16GB this allows is plenty). When the Sandy Bridge derived chips for the other platforms come out (I wouldn't know when), they will continue to have the same number of memory channels as those platforms currently do (which is 3 for enthusiast/workstation).
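(Presumably the 16GB figure comes from 2 channels x 2 DIMM slots per channel x 4GB, the largest common unbuffered DIMMs at the time - that breakdown is my assumption, not something intel states.)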

Comment Re:Multiple factual errors and dubious statements. (Score 1) 204 204

ddr3 memory could potentially make a difference in graphics performance, though. Dual- vs. single-channel ddr2 definitely made a difference with the desktop gma 950 (which the atom n4xx graphics is based on), though considering the gpu is clocked lower than the old desktop gma 950, the difference might not be as big. Not that this matters much in the end - compared to modern gpus it will still be very slow.
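For a sense of scale, rough peak-bandwidth numbers (assuming the usual DDR2-667 and DDR3-1066 speeds of those platforms - my assumption, not from the article):

    single-channel DDR2-667:   64 bit x  667 MT/s ≈  5.3 GB/s
    dual-channel   DDR2-667:  128 bit x  667 MT/s ≈ 10.7 GB/s
    single-channel DDR3-1066:  64 bit x 1066 MT/s ≈  8.5 GB/s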

Comment Re:Damages? (Score 1) 258 258

In theory at least, public buildings should be quake-proof - I'm not quite sure to what degree, but earthquakes with a magnitude above 4 certainly aren't unexpected in that area. In fact, there was a roughly magnitude 6.5 earthquake at Basel - not much of the city was left after that one, but it was in 1356.
There were, however, 2500 damage reports, which explains the 9 million. That's not really much per report; even repairing a superficial visible crack in plaster quickly costs quite a bit. And yes, there are strong doubts that all of the damage was actually caused by that earthquake...

Comment Re:OMG, I brought this up with them (Score 1) 314 314

TDP numbers from both AMD and intel are based on the cpu running at full clock (amd also publishes numbers for the lower p-states; I'm not sure about intel). So no, there's no way it should draw 57W when it has a TDP of 35W. Now, in theory you're right: TDP does not have to be the absolute maximum a cpu can put out. However, all measurements I've seen indicate it's pretty much impossible to exceed TDP with any workload (IIRC, not even with Core2MaxPerf, a tool specifically designed to cause maximum power consumption).
Even if you somehow exceeded the cooling system's capability, you wouldn't need to downclock much. Since these cpus also drop voltage when downclocking, going from 2.5GHz to 2GHz already cuts the power requirement by half or so (that's just a guesstimate, but you get the idea).
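The guesstimate follows from dynamic power scaling roughly with frequency times voltage squared; with made-up but plausible voltages:

    P ~ f x V^2
    (2.0 / 2.5) x (1.00 V / 1.20 V)^2 ≈ 0.8 x 0.69 ≈ 0.56

So a bit more than 40% off, in the "half or so" ballpark.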
That said, after reading some of the links, it seems it's actually a problem with cooling the chipset, plus an overly aggressive bios with way-too-low thermal thresholds for throttling. Dunno if that's a simple bios bug or whether there's some reason they do it (there could, for instance, be components whose temperature isn't monitored but which dell knows get too hot). In any case, this is indeed clearly broken, and IMHO an obvious case for getting a replacement part (or a refund if dell can't fix it). If dell isn't going to do something about it, some lawyers are probably going to have a lot of fun with this...

Comment Re:OMG, I brought this up with them (Score 1) 314 314

This is untrue. Well-built notebooks should not have any thermal problems running core 2 duo cpus at full throttle. Certainly the ones I've seen don't have any problems (at least not at normal ambient temperatures). Most notebook cpus are rated at 25W or 35W these days (and their real consumption is actually lower); for thin-and-light notebooks there are also 17W and 10W versions. I think the last time notebooks had widespread performance problems due to throttling was back in the days of the Pentium 4 Mobile (the most useless mobile cpu ever built), which was rated (IIRC) at 45W (and actually drew more than that).
However, most notebooks will indeed get annoyingly noisy if you do number crunching or anything else that runs the cpu at full tilt. And battery runtime is certainly going to suffer a lot.
Notebooks usually have cooling systems with heatpipes, so the cooling area is actually quite large considering the size of the machines.
That said, I don't know why those dells throttle so much. Normal SpeedStep certainly doesn't clock down that far; it could be some serious thermal issue (if the heatsink/heatpipe doesn't make proper contact with the cpu, for instance), though I haven't read the pdf, so I can't say whether that's really the problem (there are certainly tools out there to read out the cpu temperature...).
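If you want to check that yourself on a Linux box, something like this reads the kernel's thermal zones via sysfs (zone 0 isn't necessarily the cpu package - check its "type" file first; this is just a sketch):

    #include <fstream>
    #include <iostream>

    int main()
    {
        // Value is reported in millidegrees Celsius.
        std::ifstream f("/sys/class/thermal/thermal_zone0/temp");
        long millideg = 0;
        if (f >> millideg)
            std::cout << "temp: " << millideg / 1000.0 << " C\n";
        else
            std::cout << "couldn't read thermal_zone0\n";
        return 0;
    }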

"It might help if we ran the MBA's out of Washington." -- Admiral Grace Hopper

Working...