Comment: Re:Transactional Memory support (Score 1) 189

by mczak (#43886739) Attached to: Intel Haswell CPUs Debut, Put To the Test
That is not quite correct: Haswell (or any cpu which supports AVX2) can do gather, but not scatter. I agree this can be useful. I'm not sure how much faster it'll actually be in practice compared to using multiple loads, though, since the instruction is microcoded in the cpu, so to the execution core it will still look like multiple single loads. (Note that ever since sse4.1 you typically don't need any shuffles to achieve the same thing with multiple loads, as you can use pinsrx instructions to fill up those vectors directly, hence your concerns about increased register usage are unfounded.)
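As an illustration, here's a minimal sketch of both approaches with compiler intrinsics (the function names and the float-gather variant are mine, purely for illustration):

```c
#include <immintrin.h>

/* AVX2: one gather instruction, microcoded internally into several loads. */
__m128 gather_avx2(const float *base, __m128i idx)
{
    return _mm_i32gather_ps(base, idx, 4);   /* scale: 4 bytes per float */
}

/* SSE4.1: the same result via scalar loads plus insertps (the float
   counterpart of pinsrx) -- no shuffles and no extra registers needed. */
__m128 gather_sse41(const float *base, const int idx[4])
{
    __m128 v = _mm_load_ss(&base[idx[0]]);                   /* lane 0 */
    v = _mm_insert_ps(v, _mm_load_ss(&base[idx[1]]), 0x10);  /* lane 1 */
    v = _mm_insert_ps(v, _mm_load_ss(&base[idx[2]]), 0x20);  /* lane 2 */
    v = _mm_insert_ps(v, _mm_load_ss(&base[idx[3]]), 0x30);  /* lane 3 */
    return v;
}
```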

Comment: Re:Valve finds Intel's driver to be great. (Score 1) 159

by mczak (#41199801) Attached to: Valve Finds Open Source Drivers To Be Great
I don't know how well it would scale up. One thing is for sure though: they'd need to scale more than just the execution units (which is all they do for now).
Oh, and your scaling numbers are a bit off. intel has only 6-16 EUs, but these are 8-wide. So if they wanted a chip comparable to a high-end nvidia or amd card, they'd only need somewhat below 200 EUs (they also run at a somewhat higher frequency), not 1600 (which would be insane). Likewise, ~100 EUs would be enough for a good performance card.
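To put rough numbers on it (taking a GTX 680-class part with its 1536 32-bit ALU lanes as the high-end reference, which is my assumption): 1536 lanes / 8 lanes per EU = 192 EUs, before even accounting for intel's higher clock. A midrange card with ~800 lanes works out to ~100 EUs the same way.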

Comment: Re:"moving irresistibly"? (Score 0) 673

by mczak (#41062747) Attached to: Sealed-Box Macs: Should Computers Be Disposable?
This is the fastest GPU (well, either that or a similarly fast one from AMD, HD7850M/HD7870M) you can fit in such a case (and, like other similarly sized notebook chassis, it already struggles to maintain full clocks under high cpu+gpu load).
I don't consider that underpowered. Sure, for full-res gaming it's a no-go, but neither the high display resolution nor the MacBook itself is intended for that. If you want to game, just use a lower resolution with upscaling (which costs some performance too, but not that much), though why a serious gamer would even consider a MacBook completely escapes me.
There's a reason the GeForce 680M (and Radeon HD7970M) are only found in big and bulky gaming notebooks.

Comment: Re:The Raspberry PI is currently underpowered (Score 1) 95

by mczak (#40693205) Attached to: Debian Derivative Optimized for the Raspbery Pi Released
There's no way a 1GHz Cortex-A8 is ~4 times faster than a 700MHz ARM11. A factor of 2 would be a better estimate (looking only at the cpu; it also has twice the memory, and I'm not sure about the gpu, but if it can be used it will obviously beat a plain framebuffer driver...).
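A rough sanity check using ARM's published DMIPS/MHz figures (~1.25 for the ARM11, ~2.0 for the Cortex-A8): 700MHz x 1.25 = 875 DMIPS vs. 1000MHz x 2.0 = 2000 DMIPS, i.e. about 2.3x, nowhere near 4x.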

Comment: Re:Years off? (Score 1) 386

by mczak (#37827368) Attached to: Next-Gen Game Consoles Still Years Off
"50 shader cores" vs. 800 is not quite a valid comparison. The 800 number is totally unconfirmed and based on the rumor that the chip will be "similar to rv770", personally I believe people read a bit too much into that "similar" by deducing it will have the same number of shader units (if I'd have to guess I would say it's closer to lower performance level chips, like 480 shader units). Plus, if you count the cores like that, the X360 really has 192 shader cores, not 50 (3x16x4), though granted the ones in the Wii U (no matter the number) should be more flexible.
So I have some doubts it will be "phenomenally more capable" (some cautious statements from Nintendo about being able to match PS360 in graphics add to that, as does the tiny form factor) though it might indeed be somewhat better.

Comment: Re:It might be worse than that. . . (Score 1) 234

by mczak (#36073738) Attached to: Chain Reactions Reignited At Fukushima
The reactors were not 1000MW electrical - units 2 and 3 were 784MW and unit 1 was 460MW, hence thermal output was about 2.2GW for each of the larger two and 1.3GW for unit 1.
The 7% figure for decay heat is only true immediately after shutdown; after an hour (roughly when the tsunami hit) it's already down to around 1.5% (and there was some limited cooling after that thanks to battery backup). So you're really only looking at about 20MW or so per reactor, which is not THAT much. It looks easy enough to use some portable generator to get a pump going (a few hundred kW of electrical power should probably be enough), but apparently it didn't work that way...
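If you want to check those percentages yourself, here's a rough sketch using the well-known Way-Wigner decay heat approximation (the 2.2GW figure and the assumed operating time are illustrative, not plant data):

```c
#include <stdio.h>
#include <math.h>

/* Way-Wigner approximation for decay heat after shutdown:
   P/P0 = 0.0622 * (t^-0.2 - (t + T)^-0.2), with t = seconds since
   shutdown and T = seconds of prior operation. The 2.2GW thermal
   rating and ~3 years at power are illustrative assumptions. */
int main(void)
{
    const double P0 = 2.2e9;   /* thermal power before scram, in W  */
    const double T  = 1.0e8;   /* ~3 years of operation, in seconds */
    const double t[] = { 1.0, 3600.0, 86400.0 };  /* 1s, 1h, 1 day */

    for (int i = 0; i < 3; i++) {
        double frac = 0.0622 * (pow(t[i], -0.2) - pow(t[i] + T, -0.2));
        printf("t = %6.0fs: %.2f%% of full power = %5.1f MW\n",
               t[i], 100.0 * frac, P0 * frac / 1e6);
    }
    return 0;   /* ~6% right after shutdown, ~1% (20-odd MW) after 1h */
}
```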
Though the GP's suggestion to rely on the generators/turbine for cooling by just shutting down to self-sustaining power levels sounds extremely risky to me. The tsunami certainly flooded not only the diesel generators but other areas as well. If the diesels didn't work after the tsunami, I have some doubts the turbines did (not to mention there could have been short circuits or even direct earthquake damage). We never really heard whether there was damage to these parts, if anyone even knows.

Comment: Re:Time to buy all new chipsets! (Score 1) 163

by mczak (#33585622) Attached to: Intel Unveils 'Sandy Bridge' Architecture
It's not like intel reduced the number of memory channels. This chip is for the mainstream desktop platform, which currently uses LGA1156 and also "only" has 2 memory channels (I put "only" in quotes because for desktop workloads the 16GB this allows - 2 channels x 2 DIMMs x 4GB - is plenty). When Sandy Bridge derived chips for the other platforms come out (I wouldn't know when), they will continue to have the same number of memory channels as those platforms currently do (which is 3 for enthusiast/workstation).

Comment: Re:Multiple factual errors and dubious statements. (Score 1) 204

by mczak (#32060602) Attached to: Blurring Lines — Dual Core Atom To Lift Netbooks
ddr3 memory could potentially make a difference in graphics performance, though. Dual- vs. single-channel ddr2 definitely made a difference with the desktop gma 950 (which atom n4xx graphics is based on), though considering the gpu is clocked lower than the old desktop gma 950, the difference might not be as big here. Not that it matters much in the end, of course; compared to modern gpus it will still be very slow.
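Rough peak bandwidth numbers to illustrate (assuming the typical speed grades of the era, which is my guess): single-channel ddr2-667 is 667MT/s x 8 bytes = ~5.3GB/s, single-channel ddr3-800 gets ~6.4GB/s, and the dual-channel ddr2-667 of a desktop gma 950 board was ~10.7GB/s - all of it shared between cpu and gpu.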

Comment: Re:Damages? (Score 1) 258

by mczak (#30453820) Attached to: Swiss Geologist On Trial For Causing Earthquakes
In theory at least, public buildings should be quake-proof - not quite sure to what degree, but earthquakes with a magnitude above 4 certainly aren't unexpected in that area. In fact there was a magnitude 6.5 earthquake at Basel - not that much of the city was left standing afterwards - but that was in 1356.
There were however 2500 damage reports, which explains the 9 million - that's only about 3600 per report on average, and it quickly costs quite a bit to, for instance, repair some superficial visible crack in plaster. And yes, there are strong doubts that all of the reported damage was actually caused by that earthquake...

Comment: Re:OMG, I brought this up with them (Score 1) 314

by mczak (#30287866) Attached to: Dell Defect Turning 2.2GHz CPU Into 100MHz CPU?
TDP numbers from both AMD and intel are based on the cpu running at full clock (amd also publishes numbers for the lower p-states, not sure about intel). So no, there's no way it should draw 57W when it has a TDP of 35W. Now, in theory you're right that TDP does not have to be the absolute maximum a cpu can output. However, all measurements I've seen indicate it's pretty much impossible to exceed TDP with any workload (IIRC not even with Core2MaxPerf, a tool specifically designed to cause maximum power consumption).
Even if you somehow exceeded the cooling system's capability, you wouldn't need to downclock a lot. Since these cpus drop voltage when downclocking, going from 2.5GHz to 2GHz already cuts power requirements by half or so (that's just a guesstimate, but you get the idea).
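To make that guesstimate concrete, a quick sketch using the usual dynamic power relation P ~ C x V^2 x f (the voltages are made-up but typical p-state values):

```c
#include <stdio.h>

/* Back-of-the-envelope for dynamic power, P ~ C * V^2 * f.
   The voltage values below are assumptions, not Dell/intel specs. */
int main(void)
{
    const double f1 = 2.5e9, v1 = 1.20;   /* full clock       */
    const double f2 = 2.0e9, v2 = 1.00;   /* one p-state down */
    double ratio = (f2 / f1) * (v2 * v2) / (v1 * v1);
    printf("relative power after downclock: %.0f%%\n", ratio * 100.0);
    return 0;   /* prints ~56% -- roughly the "half or so" above */
}
```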
That said, after reading some of the links, it seems to be a problem with cooling the chipset, plus an overly aggressive bios with way too low thermal thresholds for throttling. Dunno if that's a simple bios bug or there's some reason they do it (they could for instance have some components whose temperature isn't monitored but which dell knows get too hot). In any case, this is indeed clearly broken, and IMHO an obvious case for getting a replacement part (or a refund if dell can't fix it). If dell isn't going to do something about it, some lawyers are probably going to have a lot of fun with this...

Comment: Re:OMG, I brought this up with them (Score 1) 314

by mczak (#30278428) Attached to: Dell Defect Turning 2.2GHz CPU Into 100MHz CPU?
This is untrue. Well-built notebooks should not have any thermal problems running core 2 duo cpus at full throttle. Certainly the ones I've seen don't (at least not at normal ambient temperatures). Most notebook cpus are rated at 25W or 35W these days (and their real consumption is actually lower); for thin and light notebooks there are also 17W and 10W versions. I think the last time notebooks had widespread performance problems due to throttling was back in the days of the Pentium 4 Mobile (the most useless mobile cpu ever built), which was rated (IIRC) at 45W (and actually drew even more than that).
However, most notebooks will indeed get annoyingly noisy if you do number crunching or anything else that runs the cpu at full tilt. And certainly, battery runtime is going to suffer a lot.
Notebooks usually have cooling systems with heatpipes, so the cooling area is actually quite large considering the size of the machine.
That said, I don't know why those dells throttle so much. Normal SpeedStep certainly doesn't clock down that far; it could be some serious thermal issue (if the heatsink/pipe doesn't make contact with the cpu, for instance), though I haven't read the pdf, hence I can't say whether that's really the problem (there certainly are tools out there to read out the cpu temperature...).

Comment: netgear wnr3500l? (Score 1) 376

by mczak (#30252174) Attached to: Home Router For High-Speed Connection?
Ok, this one was dissed for being advertised as an "open source router". However, I looked at the specs, and of all the cheapie routers this one actually seems to have the best hardware. It's got an apparently quite fast cpu (broadcom 4718 at 480MHz, supposedly a mips 74k core, said to be much faster than the older broadcom 470x chips), 8MB flash, and 64MB ram. It might not be open source, but it should run dd-wrt... For what it's worth, netgear advertises it with 350Mbit wan-to-lan throughput; make of that number what you will...

Comment: Re:Yay! Re-badged 9800GT FTW! (Score 1) 208

by mczak (#30139562) Attached to: NVIDIA Ships Decent DX10 Graphics Card For Under $100
You're quite wrong here. I think you're confusing the GT240 with the GTS240, which is an OEM-only deal and indeed pretty much a rebadged 9800GT (I won't disagree that the naming is silly). The GT240 is based on an entirely new chip (GT215, on a 40nm process) instead of the old and trusted G92(b) (65/55nm) which was used in the 8800GT/9800GTX/GTS250 (and more). It can also do DX10.1 - something neither G92 based cards nor GT200 based ones (GTX260 and friends) can do.

Comment: Re:Binary blob ... eh? (Score 1) 461

by mczak (#29180845) Attached to: Linux Port For id's Tech 5 Graphics Engine Unlikely

From TFA, it seems that Carmack believes it would be hard to get the necessary performance without using the NVidia drivers. It would be somewhat surprising to me if it weren't possible to get it running acceptably on anything else, even if the game does use a lot of advanced features - but if Carmack says so!

Well, if it absolutely requires new OpenGL features/extensions, chances are they aren't present in mesa (on which all open source drivers are based) yet. So if he's coding for OpenGL 3.1, plus maybe even a couple of extensions (for example geometry shaders, which are only core GL in 3.2), then it certainly wouldn't run at the moment. That said, this shouldn't stop id from doing a port. First, the open source drivers never really matched the performance of the closed ones (and glsl made things worse), so that's nothing new, and some people were (and still are) quite willing to use the binary drivers. Plus, a port might actually help get those features implemented - if no app uses them anyway, the people working on the open source drivers in their free time probably don't see much reason to do any work on them. Can't blame him, though, if the business case for a linux port just isn't there; that's not really his fault.
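For illustration, testing for such a feature at startup looks roughly like this (a sketch assuming a GL 3.0+ context is already current; the extension name is the real ARB geometry shader one, the fallback function is hypothetical):

```c
#define GL_GLEXT_PROTOTYPES
#include <GL/gl.h>
#include <GL/glext.h>
#include <string.h>

/* Returns 1 if the driver exposes the named extension.
   glGetStringi needs a current GL 3.0+ context. */
static int has_extension(const char *name)
{
    GLint n = 0;
    glGetIntegerv(GL_NUM_EXTENSIONS, &n);
    for (GLint i = 0; i < n; i++) {
        const char *ext = (const char *)glGetStringi(GL_EXTENSIONS, i);
        if (ext && strcmp(ext, name) == 0)
            return 1;
    }
    return 0;
}

/* e.g.: if (!has_extension("GL_ARB_geometry_shader4"))
             use_simpler_render_path();   // hypothetical fallback */
```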

Comment: Re:still no multithreaded h.264 decoding (Score 1) 176

by mczak (#27141097) Attached to: FFmpeg Finally Releases Long-Awaited Version 0.5
Depends on the encoding options, really. Baseline profile (I think apple movie trailers use that exclusively) should be fine, but there's no way you're going to decode high profile 1080i50 content on a X2 4200+ with ffmpeg (if it's not slice-based). Even in mplayer's benchmark mode (-nosound etc.) I was only able to reach something like 70% of realtime with some selected 1080i clips (that was without a deinterlacer; 1080p24 might be slightly better, but it's not going to be fast enough either). Oh, and that was a 2.6GHz X2, so faster than yours... You _could_ of course use CoreAVC (it's possible to get it to work in linux media players), which is a bit faster even on single cores and of course pretty much twice as fast on dual cores.
