Comment: Privacy? Yeah, we've heard of it. (Score 1) 80

"Bob, we're glad you Intel boys have finally come around".
"Hi Jeff. Yeah, well we just about got the marketing boys convinced enough to run with it. They managed to find an angle that flies pretty well. Gets us off the hook and gets your boys into a heck of a lot of servers!"
"Hey, it's a win-win so far as I'm concerned. Wish it could have been sooner though, but what with all this pressure from the purse-holders, we couldn't bankroll it for you".
"Times are getting tough, huh? It wasn't that long ago you NSA boys had infinitely deep pockets".
"The times we live in, Bob. What can you do?".
"You need any help with the rootkit code? Some of our wizards are pretty good"
"Thanks, Bob, but we already have it covered".

Comment: Re:Give 'em your Kool-Aid (Score 1) 226

by VinylPusher (#46684767) Attached to: Should Microsoft Give Kids Programmable Versions of Office?

Throughout various jobs and companies over the past [many, many] years, I've used VBA *extensively* to automate or otherwise add polish to Excel and Word. Being able to use VBA to a semi-pro level opens up some unique opportunities for advancement in many office-based jobs.

I have hardly used Visual Studio within the workplace.

It depends what job you're in, but there are a lot more 'standard' office/admin workers than there are software developers. Not everyone wants to break into software development, but if you can demonstrate some unique capabilities within the office... assuming your employer isn't a jerk, you can do very well from it.

Comment: Re:It's not dead, it's fun! (Score 1) 405

by VinylPusher (#38470086) Attached to: Is Overclocking Over?

My server spends /some/ time idle, but a lot of the time it is handling 4 Java processes.

These processes do low-latency transactional work. Each tick of a job is typically so small that it doesn't trigger a frequency step-up of the CPU, and Windows doesn't schedule them very well, making the single threads hop between CPU cores and between physical CPUs.

It all conspires to make performance suck unless I keep the CPUs running at full speed.

Yes, I've tried setting processor affinity. It's of limited use.
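For what it's worth, pinning can be scripted as well. Here's a minimal, illustrative sketch (Python, with hypothetical core numbers) of building the affinity bitmask that APIs like Windows' SetProcessAffinityMask expect:

```python
# Illustrative sketch: building the bitmask used to pin a process to
# specific cores, so a latency-sensitive thread stops hopping between
# CPUs. The core numbers below are hypothetical.

def affinity_mask(cores):
    """Bit N set in the mask means core N is allowed, which is the
    convention used by e.g. Windows' SetProcessAffinityMask."""
    mask = 0
    for core in cores:
        mask |= 1 << core
    return mask

# Pin to cores 2 and 3 -> binary 1100 -> 0xC
print(hex(affinity_mask([2, 3])))  # 0xc
```

On Linux the same thing is a one-liner, `os.sched_setaffinity(0, {2, 3})`; on Windows you can pass the mask to `start /affinity` or to SetProcessAffinityMask via ctypes. As noted above, though, affinity alone only gets you so far.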

Comment: Re:Maybe, maybe not... (Score 1) 405

by VinylPusher (#38470066) Attached to: Is Overclocking Over?

Your great, my great and everyone else's great may be entirely different ;)

I've been waiting for true curved surfaces in games since... well, since there were 3D cards that did both 2D and 3D.

Improved lighting, reflections (including reflected light/shadow casting), caustics and fluid motion... these are the things I want to see in games. They're also all things that don't require much more effort from modelers and artists but /do/ require more horsepower from GPUs.

Comment: Re:Maybe, maybe not... (Score 1) 405

by VinylPusher (#38469574) Attached to: Is Overclocking Over?

Would you prefer that we take e.g. the new ATI 7xxx series GPUs, set that as a "this is fast enough and has all the features we'll ever need" benchmark, and focus any future advancement on making GPUs that are exactly the same, only using less power?

The 'need' for advancement comes not just from making players feel more immersed; it's also about meeting the requirements of HPC in order to provide realtime data visualisation and rapid number crunching. A lot of drug research is done via simulation of protein interactions, which lends itself perfectly to running on GPUs (hence the Folding@home project).

In 10 years' time, most of us will be enjoying our Super-HD screens. Some of us may be enjoying our Super-HD, QLED, 48-bit colour, 120Hz, 3D, high-dynamic-range screens. Some of us may even have three of them in a multi-monitor config. That could potentially mean a GPU having to drive a (virtual) display resolution of 12288x2560, which is a minimum 180MB frame buffer needing to be refreshed 120 times per second.

That works out to roughly 22GB/s of scan-out bandwidth, versus today's reasonable maximum of around 0.35GB/s. It will require some advancements in (GDDR) memory technology as well as obvious improvements to GPU performance. If we maintain the current pace of advancement, this may be possible within the next 5 years.
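The back-of-envelope arithmetic, assuming 6 bytes per pixel for 48-bit colour and a 120Hz refresh:

```python
# Bandwidth estimate for the hypothetical triple-monitor setup above.
width, height = 12288, 2560   # three 4096x2560 panels side by side
bytes_per_pixel = 6           # 48-bit colour
refresh_hz = 120

frame_bytes = width * height * bytes_per_pixel
print(frame_bytes / 2**20)    # 180.0 -> the 180MB frame buffer

bandwidth = frame_bytes * refresh_hz
print(bandwidth / 1e9)        # ~22.6 GB/s just to refresh the display
```

And that's scan-out alone, before the GPU does any actual rendering work per frame.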

Comment: Re:Most people don't understand that it's a bad id (Score 1) 405

by VinylPusher (#38458456) Attached to: Is Overclocking Over?

Perhaps they are content to derive £800+ from CPUs that run at a nominal 3.33GHz for now.

If you think marketing and pricing strategies are so simple, perhaps you ought to tell us more about how you could improve Intel's market position and profitability by making some simple changes to the way they do things?

Comment: Re:Maybe, maybe not... (Score 1) 405

by VinylPusher (#38458360) Attached to: Is Overclocking Over?

I've only recently swapped my 7800GTX 256MB for a GT440 (a nice GDDR5 version).

The new card uses half the energy and performs quite a lot faster. When was the 7800GTX brand new? I paid >£300 for it, that much I can remember.

Next upgrade will be when I can buy a card requiring no more than 75W (i.e., no external PCIe power) yet runs slightly faster than a GTX580.

IMHO, for games to become more visually appealing they need to ditch the idea of polygons altogether, be comprised of voxels or truly curved surfaces, and implement at least some elements of ray-traced lighting and material properties.

Fairly sure this will be happening within 20 years.
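As a toy illustration of the ray-tracing part (not anyone's actual engine code): the basic primitive for rendering a truly curved surface without polygons is an analytic ray-sphere intersection test, solving a quadratic instead of walking triangles:

```python
import math

def ray_sphere(origin, direction, center, radius):
    """Distance along a normalised ray to the nearest hit on the
    sphere, or None on a miss. Solves |o + t*d - c|^2 = r^2 for t."""
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    dx, dy, dz = direction
    b = 2 * (ox * dx + oy * dy + oz * dz)
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4 * c          # a == 1 for a normalised direction
    if disc < 0:
        return None               # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2
    return t if t >= 0 else None

# Ray from the origin along +z toward a unit sphere centred at z=5:
print(ray_sphere((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0))  # 4.0
```

The surface comes out perfectly curved at any zoom level; the cost is doing that maths per ray, per object, which is exactly the horsepower problem.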

"Pull the trigger and you're garbage." -- Lady Blue