Comment Re:Time to buy all new chipsets! (Score 1) 163

It's not like Intel reduced the number of memory channels. This chip is for the mainstream desktop platform, which currently uses LGA1156 and also "only" has 2 memory channels (I put "only" in quotes because for desktop workloads, the 16GB this allows is plenty). When the Sandy Bridge-derived chips for the other platforms come out (I wouldn't know when), they will keep the same number of memory channels those platforms currently have (which is 3 for enthusiast/workstation).
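For what it's worth, the 16GB works out like this, assuming the usual 2 DIMM slots per channel and 4GB unbuffered DIMMs (the DIMM size is my assumption, not something from the article):

# Rough max-memory estimate for a dual-channel desktop board.
# 2 DIMMs per channel and 4GB per DIMM are assumptions for illustration.
channels = 2
dimms_per_channel = 2
gb_per_dimm = 4
print(channels * dimms_per_channel * gb_per_dimm)  # 16 (GB)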

Comment Re:Multiple factual errors and dubious statements. (Score 1) 204

DDR3 memory could potentially make a difference in graphics performance, though. Dual- vs. single-channel DDR2 definitely made a difference with the desktop GMA 950 (which the Atom N4xx graphics is based on), though considering the GPU is clocked lower than the old desktop GMA 950, the difference might not be that big. Not that this matters much in the end; compared to modern GPUs it will still be very slow.
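The bandwidth difference is easy to ballpark (the speed grades below are just typical examples, not necessarily what these boards ship with):

# Peak memory bandwidth = transfer rate * 8 bytes * channels.
# Speed grades are illustrative, not taken from a specific board.
def peak_bw_gb_s(mt_per_s, channels):
    return mt_per_s * 1e6 * 8 * channels / 1e9

print(peak_bw_gb_s(667, 1))   # single-channel DDR2-667:  ~5.3 GB/s
print(peak_bw_gb_s(667, 2))   # dual-channel DDR2-667:   ~10.7 GB/s
print(peak_bw_gb_s(1066, 1))  # single-channel DDR3-1066: ~8.5 GB/s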

Comment Re:Damages? (Score 1) 258

In theory at least, public buildings should be quake-proof - I'm not quite sure to what degree, but earthquakes with a magnitude above 4 certainly aren't unexpected in that area. In fact there was a magnitude ~6.5 earthquake at Basel in 1356 - not much of the city was left after that.
There were however 2500 damage reports, which explains the 9 million. That's not really much per damage report; it quickly costs quite a bit to repair even a superficial visible crack in plaster, for instance. And yes, there are strong doubts that all of the damage was actually caused by that earthquake...
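Per report, that really isn't a lot of money:

# Average payout per damage report (figures from the comment above).
total_damages = 9_000_000
reports = 2_500
print(total_damages / reports)  # 3600 per report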

Comment Re:OMG, I brought this up with them (Score 1) 314

TDP numbers from both AMD and Intel are based on the CPU running at full clock (AMD also publishes numbers for the lower P-states, not sure about Intel). So no, there's no way it should draw 57W when it has a TDP of 35W. Now, in theory you're right, TDP does not have to be the absolute maximum a CPU can draw. However, all measurements I've seen indicate it's pretty much impossible to exceed the TDP with any workload (IIRC, not even with Core2MaxPerf, a tool specifically designed to cause maximum power consumption).
Even if you somehow did exceed the cooling system's capability, you wouldn't need to downclock a lot. Since these CPUs drop voltage when downclocking, going from 2.5GHz to 2GHz already cuts power requirements roughly in half (that's just a guesstimate, but you get the idea).
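The guesstimate comes from dynamic power scaling roughly with frequency times voltage squared; with made-up but plausible voltages:

# Dynamic power ~ C * V^2 * f. The voltages here are illustrative guesses,
# not actual P-state values for any particular CPU.
def relative_power(freq, volts, ref_freq=2.5, ref_volts=1.25):
    return (freq / ref_freq) * (volts / ref_volts) ** 2

print(relative_power(2.0, 1.05))  # ~0.56, i.e. roughly half the power at 2GHz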
That said, after reading some of the links, it seems to be a problem with cooling the chipset, plus an overly aggressive BIOS with way too low thermal thresholds for throttling. Dunno if that's a simple BIOS bug or whether there is some reason they do it (they could for instance have some components whose temperature isn't monitored but which Dell knows get too hot). In any case, this is indeed clearly broken, and IMHO an obvious case for getting a replacement part (or a refund if Dell can't fix it). If Dell isn't going to do something about it, some lawyers are probably going to have a lot of fun with this...

Comment Re:OMG, I brought this up with them (Score 1) 314

This is untrue. Well-built notebooks should not have any thermal problems running Core 2 Duo CPUs at full throttle. Certainly the ones I've seen don't have any problems (not at normal ambient temperatures, at least). Most notebook CPUs are rated at 25W or 35W these days (and their real consumption is actually lower); for thin-and-light notebooks there are also 17W and 10W versions. I think the last time notebooks had widespread performance problems due to throttling was back in the days of the Mobile Pentium 4 (the most useless mobile CPU ever built), which was rated (IIRC) at 45W (and actually drew more than that).
However, most notebooks will indeed get annoyingly noisy if you do number crunching or anything else that runs the CPU at full tilt. And certainly, battery runtime is going to suffer a lot.
Notebooks usually have cooling systems with heatpipes, so the cooling area is actually quite large considering the size of the machines.
That said, I don't know why those Dells throttle so much. Normal SpeedStep certainly doesn't clock down that far; it could be some serious thermal issue (if the heatsink/heatpipe doesn't make proper contact with the CPU, for instance), though I haven't read the PDF, hence I can't say if that's really the issue (there are certainly tools out there to read out the CPU temperature...).

Comment netgear wnr3500l? (Score 1) 376

OK, this one was dissed for being advertised as an "open source router". However, I looked at the specs, and of all the cheapie routers this one actually seems to have the best hardware. It's got an apparently quite fast CPU (a Broadcom 4718 at 480MHz, supposedly a MIPS 74K core said to be much faster than the older Broadcom 470x chips), 8MB flash and 64MB RAM. It might not be open source, but it should run DD-WRT... For what it's worth, Netgear advertises it with 350Mbit WAN-to-LAN throughput; make of that number what you will...

Comment Re:Yay! Re-badged 9800GT FTW! (Score 1) 208

You're quite wrong here. I think you're confusing the GT240 with the GTS240, which is an OEM-only deal and indeed pretty much a rebadged 9800GT (I won't disagree that the naming is silly). The GT240 is based on an entirely new chip (GT215, on a 40nm process) instead of the old and trusted G92(b) (65/55nm) used in the 8800GT/9800GTX/GTS250 (and more). It can also do DX10.1 - something neither G92-based cards nor GT200-based ones (the GTX260 and friends) can do.

Comment Re:Binary blob ... eh? (Score 1) 461

From TFA, it seems that Carmack believes it would be hard to get the necessary performance without using the NVidia drivers. It's somewhat surprising to me if it wouldn't be possible to get it running acceptably on anything else, even if the game does use a lot of advanced features - but if Carmack says so!

Well, if it absolutely requires new OpenGL features / extensions, chances are they aren't present in Mesa (on which all the open source drivers are based) yet. So if he's coding for OpenGL 3.1, plus maybe even a couple of extensions (for example geometry shaders, which are only core GL in 3.2), then it certainly wouldn't run currently. That said, this shouldn't stop id from doing a port. First, the open source drivers never really achieved the performance of the closed ones (and GLSL made things worse), so that's nothing new, and some people were (and still are) quite willing to use the binary drivers. Plus, a port might actually help get these features implemented - if no app uses them anyway, the people working on the open source drivers in their free time probably don't see much reason to work on them. Can't blame him though if the business case for a Linux port just isn't there; that's not really his fault.
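If you want to see what your installed driver actually exposes, something like this does the trick (assuming glxinfo is installed; the extension names are just examples of what a GL3-level engine might want, not id's actual requirements):

# Dump the GL version string and check a couple of example extensions via glxinfo.
import subprocess

out = subprocess.run(["glxinfo"], capture_output=True, text=True).stdout

for line in out.splitlines():
    if line.startswith("OpenGL version string"):
        print(line)

for ext in ["GL_ARB_geometry_shader4", "GL_ARB_uniform_buffer_object"]:
    print(ext, "present" if ext in out else "missing")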

Comment Re:still no multithreaded h.264 decoding (Score 1) 176

Depends on the encoding options, really. Baseline profile (I think the Apple movie trailers use that exclusively) should be fine, but there's no way you're going to decode high profile 1080i50 content on an X2 4200+ with ffmpeg (if it's not slice-based). Even in mplayer's benchmark mode (nosound etc.) I was only able to reach something like 70% of realtime with some selected 1080i clips (that was without a deinterlacer; 1080p24 might be slightly better, but it's not going to be fast enough either). Oh, and that was a 2.6GHz X2, so faster than yours... You _could_ of course use CoreAVC (it's possible to get it to work in Linux media players), which is a bit faster even on single cores and of course pretty much twice as fast on dual cores.
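Scaled down to a 4200+ it looks even worse (assuming decode speed scales roughly with clock, and that the 4200+ runs at 2.2GHz - both of which are simplifications on my part):

# Crude linear-in-clock scaling of the benchmark result above.
realtime_fraction = 0.70  # measured on the 2.6GHz X2
print(realtime_fraction * 2.2 / 2.6)  # ~0.59 of realtime on a 2.2GHz X2 4200+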

Comment still no multithreaded h.264 decoding (Score 3, Informative) 176

Multithreaded h.264 decoding is what I'm missing. There is still only slice-based multithreading support, which doesn't work with 95% of the content out there, which means you can't get realtime decoding of full HD content on an A64 X2 (Core 2 CPUs are probably fast enough even with one core, at least the faster ones). The ffmpeg-mt branch fixes this; I wonder when it will be merged (it still seems to be a bit buggy).

Comment Re:Why oh why... (Score 1) 123

If you pair Atom with the Poulsbo chipset, it certainly won't require a fan (I'm not quite sure in which environments it could run even without a heatsink). Even paired with a 945GSE instead (but not the 945GC), it doesn't really require a fan either. Not sure about Nvidia's Ion platform...

Comment Re:Check out the patent (Score 1) 603

For the 3-minute charge of this thing, though, I doubt just installing a new breaker would do. I'm not sure about the US, but I don't think you can actually get that much power delivered to your home at low voltage. Charging such a battery in 3 minutes would require _over 1 MW_! That would mean something like 3x1600A at 230V, and twice that at 110V. Needless to say, that would require some serious cabling... Of course, for industrial use you can get a 20kV connection, which should easily do... If you were content with a 1-hour charge of this battery, though, you'd only need about 3x80A at 230V, which electric power companies do provide (though most single-family houses here are limited to less).
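Quick back-of-the-envelope (the pack size is inferred from the 3x80A/1-hour figure above, not from any spec of the actual battery):

# Three-phase 230V charging math. Battery size is inferred, not a real spec.
volts, phases = 230, 3
battery_kwh = volts * 80 * phases / 1000      # ~55 kWh if 3x80A fills it in 1 hour
power_kw = battery_kwh / (3 / 60)             # ~1104 kW for a 3-minute charge
amps_per_phase = power_kw * 1000 / (volts * phases)
print(battery_kwh, power_kw, amps_per_phase)  # ~55.2, ~1104, ~1600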

Comment Re:Hardware of Software Problem? (Score 2, Informative) 111

Jumpers are not really used a lot these days. They cost extra and are clumsy to handle (you need to open the case). You are right that it would be really good if some precautions were taken so that no accidental writes happen (for instance, requiring some special command sequence that is hard to trigger accidentally), but often those EEPROM chips just have a simple serial interface, and reading and writing work almost exactly the same. A couple of years ago you could easily overwrite the EEPROM of Hauppauge TV cards (though there wasn't much information in there, just the exact model IIRC, which was needed to set things up fully correctly) - a bug very similar to this one.
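To illustrate how symmetric read and write are on such a part, here is a minimal sketch for a generic 24Cxx-style I2C EEPROM using the Linux smbus bindings (bus number and device address are made up for the example):

# Reading and clobbering a simple serial EEPROM differ by exactly one call.
import smbus

bus = smbus.SMBus(1)   # hypothetical I2C bus
EEPROM_ADDR = 0x50     # typical 24Cxx-style address, assumed here

value = bus.read_byte_data(EEPROM_ADDR, 0x00)         # read the byte at offset 0
bus.write_byte_data(EEPROM_ADDR, 0x00, value ^ 0xFF)  # overwriting it is just as easy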
