Comment Re:Err (Score 1) 220

The new "bulldozer" architecture from AMD is a disaster, in just about every way. They're terrible. The charts clearly show that.

For what it's worth, AMD was, like any other technology company working on significant changes, under pressure to get a release out the door. Early releases are more technology previews than completed products. It takes several generations of incremental updates for a new architecture to reach its performance potential, and when it comes to CPUs, relatively minor changes can have a significant impact on performance. Look at the difference between the original first-generation Bulldozer CPUs and Piledriver, the first point upgrade.

To be fair, the first releases of the Bulldozer microarchitecture were not impressive performance-wise. That does not justify slamming the architecture entirely; it's immature technology that isn't there yet, and we have yet to see what it's capable of.

Comment Re:Doubtful. (Score 2) 110

That is disappointing. I was hoping for some amusing inconsistencies in the spec, and it turns out to be just a few optional elements included to support ancient packages, which the standard recommends you don't actually implement!

Is this really the reason the entire standards organisation is denigrated, and that this format is said to be impossible to implement? That is pretty lame. Why does everyone worry about compatibility tags that date back to the Windows 3.1 days when the ODF spec neglected to document the spreadsheet functions at all? If you are looking for an impossible-to-implement standard, that would be a more likely candidate.

The whole point of standards is to provide compatibility between different implementations. If MS Word produces documents that qualify under the OOXML specification but rely on the deliberately vague parts referring to older document formats, no other implementation would be able to process such documents properly.

Comment Re:That means we lefties (Score 1) 258

Further, very few tools existed in historical times where handedness mattered at all. A wrench, a hammer, or a spear has no handedness. Only much later were tools invented to meet the needs of the majority of users, which is why there was a tendency to put the controls on power tools on the right.

The whole thesis mistakes cause for effect, suggesting the tools and games we invented had something to do with what made us what we are. Whether our ancestors threw the spear or picked the berry right- or left-handed couldn't have mattered at all.

The mention of a wrench here is a perfect example of how easy it is to overlook tendencies that favor one hand over the other. Yes, the tool itself can be used in either hand, but the direction in which nuts and screws are wound typically favors right-handed people. Similarly, scissors with no ergonomic shape to favor one hand typically have the top blade on the right, which makes the cut much easier to see if the scissors are held in the right hand.

I'm not intimately familiar with every tool mankind has used throughout history, but I'd be cautious about making statements like "very few tools existed in historical times where handedness mattered at all".

Comment Interlacing? (Score 1) 291

As many readers have pointed out, the article is light on technical details. Based on the demonstration video in the article, it looks to me like they've come up with an algorithm to interlace models -- that is, to load and process 3D models in non-linear order. If that's the case, they could be producing simplified objects (with only enough polygons to render a pixel-perfect image at a given resolution and viewing distance) from incredibly detailed models on the fly.

At 5:07 into the video is a demonstration of how games typically show a simpler version of an object from far away and switch to a more detailed version once the object comes closer. Their algorithm could be producing these dumbed-down models on the fly, not limited to two levels of detail but offering countless different variations.

The reason I call this interlacing is that it reminds me of how web sites with JPEG images would load over a slow dial-up modem back in the day. At first a blurry or pixelated version of the image was displayed, and once the rest of the data arrived (much later) the image reached its full quality.
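To make the level-of-detail idea concrete, here is a minimal sketch of conventional distance-based model switching, the technique described above. The cutoff distances and the number of levels are invented for illustration; whatever the algorithm in the article actually does has not been published, so this only shows the discrete-LOD approach it would replace or generalize.

    /* Minimal sketch of distance-based level-of-detail selection.
     * The thresholds and level count are hypothetical. */
    #include <stdio.h>

    /* Pick a mesh index: 0 = full detail, higher = progressively simplified. */
    static int pick_lod(double distance, const double *cutoffs, int levels)
    {
        for (int i = 0; i < levels - 1; i++) {
            if (distance < cutoffs[i])
                return i;
        }
        return levels - 1;
    }

    int main(void)
    {
        const double cutoffs[] = { 10.0, 50.0, 200.0 }; /* metres, made up */
        const int levels = 4;
        const double distances[] = { 5.0, 30.0, 120.0, 500.0 };

        for (int i = 0; i < 4; i++)
            printf("distance %.0f m -> LOD %d\n",
                   distances[i], pick_lod(distances[i], cutoffs, levels));
        return 0;
    }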

Comment Re:First exosceleton post (Score 3, Interesting) 135

Given that all the stupid computers in hospitals are running Windows, this threat is actually already there, and it does not seem to have caused many problems so far. Still, I'm very anxious about these things becoming more and more popular.

With a year's experience working in a public hospital purchasing office, I found (without searching) many critical security flaws in the processes. For example, the European Procurement Announcement agency regularly sends catalogues of EU-wide procurements on CDs that require Windows and autorun to function. The CD would start a web server off the disc and launch Internet Explorer to interface with it. In other words, we regularly executed programs from CDs we received by mail in very simple (and easy to replicate) packaging, on the same computers we used to make very expensive purchases (believe me, hospital equipment isn't cheap).

Comment Re:Why really does Apple behave this way? (Score 3, Informative) 432

They behave the way they do because they are control freaks. They want absolute control over their platform. Their ultimate vision is that they'll be the source of all your media, all your apps, etc. They'll dictate how you consume stuff. Such a setup would be, needless to say, very profitable.

As for why they can get away with it, well I'd say there are two reasons:

1) Fanboyism/zealotry. Apple has long had a following of people for whom it can do no wrong, more or less. A non-trivial number of these people are in the press (Macs are big in prepress work). They just love Apple and everything it does. So when something bad comes out, they find ways to rationalize it away, or ignore it.

2) For many Apple buyers these days, Apple is not a technology company but a fashion company. They largely won't admit it, but they buy the products as fashion accessories. They are the "cool" products to own, and they're purchased on that basis alone. Whatever restrictions and costs come with that are OK because they want to be cool. I see the same thing these days with fixed-gear bikes. They are in with college kids (I work on campus and bike to work). They buy brand-new, surprisingly expensive, fixed-gear bikes. This, of course, makes them harder to ride uphill, but that's fine with them because fixed gear is cool and road or mountain bikes are not.

or 3) Their market share is low enough that they don't face antitrust investigations for monopolistic behavior.

Comment Re:At least they tell you.. (Score 2, Interesting) 248

People who bought iPads and iPods on the understanding that they could be used without having their location information gathered and shared now find that they *must* allow this information to be gathered and shared. (I suppose you could try not updating iTunes, but then you would also have to skip OS upgrades -- they come bundled with iTunes upgrades -- and be prepared to be locked out of the App Store; and since Apple's use of DRM prevents third parties from putting apps on your devices, you're fundamentally abandoning any hope of loading any code, even third-party code, onto your iPad or iPod.)

Sounds like a class action lawsuit waiting to happen.

Comment Re:Here's a question (Score 1) 776

Had I gotten into an accident and someone looked at the black box, it would show the same thing. "Umm, he took his foot off the gas and then floored it, repeating. Probably drunk or distracted."

Not if they have any sensible sampling frequency on the gas pedal. Under no circumstances should it be possible to go from 'idle' to 'floored' without samples from the transition between the two extremes. It would be easy to recognize a faulty sensor when its input shows changes that are impossible to produce mechanically.
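A plausibility check along those lines is straightforward. Below is a minimal sketch, with an invented sample period and slew limit, that flags consecutive pedal readings which jump farther than the mechanism could physically move in one sample.

    /* Sketch: flag pedal readings that change faster than the pedal can move.
     * The sample period and slew limit are hypothetical. */
    #include <stdio.h>
    #include <stdlib.h>

    #define MAX_DELTA_PER_SAMPLE 20  /* percent of travel per 10 ms sample, assumed */

    int main(void)
    {
        /* 0 = idle, 100 = floored; a real log would come from the black box */
        int samples[] = { 0, 0, 100, 100, 0, 100 };
        int n = sizeof(samples) / sizeof(samples[0]);

        for (int i = 1; i < n; i++) {
            int delta = abs(samples[i] - samples[i - 1]);
            if (delta > MAX_DELTA_PER_SAMPLE)
                printf("sample %d: jump of %d%% in one period -> suspect sensor\n",
                       i, delta);
        }
        return 0;
    }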

Comment Re:Sure... (Score 2, Interesting) 460

While the USB memory key (in this example) could have low-level software to snoop your data, how are they going to get it? Is the USB key going to open a TCP/IP or UDP connection back to their servers without my firewall noticing that a new application is trying to connect? Is my virus scanner going to flag that something suspicious is coming out of the key without my interaction?

Just because the cases are not obvious doesn't mean there is no potential for exploit.

Keyboards see a lot of raw sensitive data: usernames and passwords, often even accompanied by the direct URLs where the credentials apply. Now, the keyboard obviously wouldn't be able to open a TCP/IP or UDP connection to upload the data, but it could sneak time-encoded hints about pre-recorded data into your typing. While you type, the keyboard firmware could introduce miniature delays that would go unnoticed by the typist but would in turn influence the timing of packets sent by an SSH session. Such an attack wouldn't require decrypting the SSH session, and it would pass completely unnoticed through all your intrusion detection systems and firewalls. The practicality of such an attack can be questioned, but it demonstrates a non-obvious avenue.
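To show what "time-encoded hints" could look like, here is a toy sketch of the encoding side only: one pre-recorded byte turned into per-keystroke delays. The delay values are made up and this is not real keyboard firmware; it's only meant to illustrate how little bandwidth such a channel needs.

    /* Illustration only: encode one captured byte as extra per-keystroke delays.
     * Delay values are invented; no real firmware or SSH traffic is involved. */
    #include <stdio.h>

    int main(void)
    {
        unsigned char secret = 0x5A;      /* pre-recorded byte to leak */
        int base_ms = 2, mark_ms = 8;     /* hypothetical delays, below human notice */

        for (int bit = 7; bit >= 0; bit--) {
            int one = (secret >> bit) & 1;
            int delay = base_ms + (one ? mark_ms : 0);
            printf("keystroke %d: add %2d ms  (bit = %d)\n", 7 - bit, delay, one);
        }
        return 0;
    }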

The closest equivalent I can think of for a USB memory dongle would be firmware that could recognize, say, JPEG images in FAT file systems. Any information the firmware recognizes as interesting could be steganographically watermarked into your images by the time you pull them off the dongle. In such a case, any image you upload online that came from that dongle could contain sensitive information and you'd have no idea you uploaded it.
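As a toy illustration of the watermarking idea, the sketch below hides one byte in the least-significant bits of a raw pixel buffer. Real JPEG watermarking would have to operate on the compressed (DCT) data rather than raw pixels, so treat this purely as a demonstration of how small the change to the carrier can be.

    /* Toy example: hide one byte in the low bits of eight pixel values,
     * then recover it. Not JPEG-domain steganography; illustration only. */
    #include <stdio.h>

    int main(void)
    {
        unsigned char pixels[8] = { 200, 13, 77, 250, 34, 90, 128, 66 };
        unsigned char secret = 'A';

        /* Embed the byte, most significant bit first. */
        for (int i = 0; i < 8; i++) {
            unsigned char bit = (secret >> (7 - i)) & 1;
            pixels[i] = (unsigned char)((pixels[i] & 0xFE) | bit);
        }

        /* Recover it to show the data survives in the carrier. */
        unsigned char out = 0;
        for (int i = 0; i < 8; i++)
            out = (unsigned char)((out << 1) | (pixels[i] & 1));

        printf("recovered: %c\n", out);
        return 0;
    }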

Comment Re:Why so ?!? (Score 1) 143

Also, most OSes are LLP64 or LP64, meaning that the default "int" is still 32 bits. Thus code recompiled for 64 bits will tend to use approximately the same amount of data as the original 32-bit code.

True, but the executable code tends to grow: 64-bit instruction encodings are generally longer, resulting in larger binaries and more code to fetch from memory. As an example, the ARM instruction set is 32-bit by default, but ARM processors support a 16-bit Thumb mode. Programs typically execute more slowly in Thumb mode, but the smaller code often gives a net gain in completion time when the program is loaded from slow flash/ROM. This is a typical set-up in many embedded systems.
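The parent's LLP64/LP64 point is easy to check by printing the type sizes. On an LP64 system (most 64-bit Unix) the snippet below shows int staying at 4 bytes while long and pointers grow to 8; on LLP64 (64-bit Windows) long stays at 4 as well.

    /* Print the basic type sizes to see which data model the compiler targets. */
    #include <stdio.h>

    int main(void)
    {
        printf("int:    %zu bytes\n", sizeof(int));
        printf("long:   %zu bytes\n", sizeof(long));
        printf("void *: %zu bytes\n", sizeof(void *));
        return 0;
    }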

Comment Re:Ideology? (Score 1) 521

Linus will only think about moving from GPLv2 if he thinks it's necessary or beneficial, not because some pen-pusher, pundit or journo tells him to.

He couldn't move to another license even if he wanted to. The license works both ways: you are allowed to copy, modify and redistribute the project under the terms of the GPLv2, but any contributions sent to Linux must also adhere to the same terms. Thus the contributions are GPLv2, and without written consent from all contributors, changing the terms (the license) is out of the question. He could, of course, change the license on the bits he wrote himself, since he owns the copyright to them, but those account for only a small fraction of the whole kernel.

Comment Re:I have had the opposite results (Score 1) 907

Dial that speedstep down WAY low when unplugged.

Actually, forcing everything down is not the most efficient way to save power, at least when it comes to CPU frequency governing. Dropping the speed does indeed decrease power draw, but dropping to 50% won't cut the power draw in half, while it does double the calculation time and prevents the CPU from entering deep sleep states. It's often more power-efficient to throttle up for the calculation, finish as fast as possible, and then enter a sleep state.
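A back-of-the-envelope comparison makes the point. The wattages below are invented; real numbers depend on the CPU's voltage/frequency curve and its sleep states, but the shape of the argument is the same: halving the frequency roughly doubles the run time without halving the power, and it keeps the CPU out of deep sleep.

    /* "Race to idle" sketch with made-up power figures. */
    #include <stdio.h>

    int main(void)
    {
        double window = 2.0;  /* seconds available before the next task */

        /* Full speed: finish in 1 s at 20 W, then sleep at 0.5 W for the rest. */
        double e_fast = 1.0 * 20.0 + (window - 1.0) * 0.5;

        /* Half speed: the same work takes 2 s at 12 W (power does not halve
         * with frequency) and the CPU never reaches a deep sleep state. */
        double e_slow = 2.0 * 12.0;

        printf("race to idle: %.1f J\n", e_fast);  /* 20.5 J */
        printf("half speed:   %.1f J\n", e_slow);  /* 24.0 J */
        return 0;
    }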
