Comment Re:Goes along with my poll: (Score 2) 144

I definitely saw that in my undergraduate experience. I'd say a good 90% of my peers never went the extra mile on anything; if it wasn't going to be on an exam, you can bet they wouldn't bother studying it. When it came time to collaborate with them on projects, all they did was drag the serious students down. It was so frustrating by the time I graduated, but fortunately I had a really nice professor who worked with me to publish two papers on my independent study.

I really hope the slackers don't wind up with these mythical $130k a year jobs. I know I'll never be in a position to earn that much, because I'm more interested in research and theory. Which, to be honest, is what CS should really be about - these generic programming jobs are more or less software engineering, which has its own curriculum.

Comment $130k a year?! (Score 1) 144

That's ridiculous. None of the jobs that interest me offer anything near that much, even for senior management. I guess he is tacitly implying that you should find a job that will bore the hell out of you but pays well.

I'm not buying that. As a summa cum laude graduate, I want a job that challenges me. I could have settled for software engineering or even some Mickey Mouse IT degree if I cared about salary. Honestly, I think computer science is lumped in too much with software engineering these days.

Comment Re:Not buying it. (Score 1) 238

Yeah, it's definitely not the norm at the moment. However, GPGPU is gaining traction very fast.

It really comes down to the application: algorithms still have to be re-tooled to run well on stream-processor architectures, and the languages that have cropped up around the new hardware (e.g. CUDA / OpenCL) introduce challenges for legacy software as well. Some applications will never benefit from billions of simple simultaneous threads as much as they would from fewer, more capable hardware threads.
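To illustrate the kind of re-tooling I mean, here is a minimal sketch in plain C (the function names are made up for illustration, not from any real API). The serial version walks the array in one loop; the stream-friendly version expresses the same work as an independent per-element function, which is exactly the shape a CUDA or OpenCL kernel wants, one thread per element:

#include <stddef.h>

/* Serial formulation: one CPU thread iterates over the whole array. */
void saxpy_serial(size_t n, float a, const float *x, float *y)
{
    for (size_t i = 0; i < n; ++i)
        y[i] = a * x[i] + y[i];
}

/* Stream-processor formulation: the loop disappears and each index becomes
 * an independent work item.  On a GPU this body would be launched as a
 * kernel with n threads; here it is an ordinary per-element function,
 * just to show the shape of the transformation. */
void saxpy_element(size_t i, size_t n, float a, const float *x, float *y)
{
    if (i < n)
        y[i] = a * x[i] + y[i];
}

The hard part in real code is that most interesting algorithms do not decompose this cleanly, which is why the re-tooling is non-trivial.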

We'll probably see a mix of all three architectures in the coming years. You have to pick the right tool for the job, and the jobs are just as diverse as the hardware we're discussing.

Comment Blessing in disguise (Score 1) 605

I had two papers published during my undergraduate years because my writing was head and shoulders above every other student in my department. Granted, I also put more work into my research than anyone else, but the deciding factor that made the head of the department eager to publish undergraduate research was the clarity of my writing. Needless to say, when it came time to apply to graduate school, having two papers published as an undergraduate was one hell of a plus.

At the graduate level, thankfully, the story is a little different. The SATs did not always have an essay section, but the GREs have had one for as long as I can remember. Multiple-choice tests, even ones as sophisticated as the GREs and SATs, do not give any indication of a student's ability to organize their OWN original thoughts. It always struck me as odd that the SATs did not include any sort of writing when I took them.

In any case, the quality of students at the undergraduate level is really to blame here. By the time you get to graduate school, academia filters out most of the dumbasses (except in the case of basket weaving and MBA programs).

Comment Re:HURD not founded in 1983 by RMS (Score 1) 274

By "work started on the operating system," he is referring to the user-land side of things: basically, most of the command-line stuff you interact with in a GNU/Linux installation. GNU is essentially the user land and Linux is the kernel; the marriage of the two gives you the most widely used Linux desktop platform.

Comment Re:Why should I bother? (Score 1) 274

OS X has a hybrid microkernel that mostly powers a desktop platform (true server implementations exist). I did a year's worth of research on kernel-level memory allocation in the context of real-time systems, and OS X's kernel design surprisingly came out head and shoulders above Linux and (not surprisingly) Win32, and in many scenarios better than specialized kernels such as VxWorks.

Unfortunately I never had a chance to extend my research to QNX, which has a true microkernel. But microkernel performance penalties are largely overrated, and desktop platforms that have adopted the design have found clever ways around the issue :)

Comment Re:Replacement console?? (Score 1) 592

If it is anything like the current 360 architecture, the content is actually licensed partly to the storage medium, partly to the console, and partly to the Live account.

You can do license transfers between two of the three online, and use a storage device to transfer files between drives, but I do not think transferring between Live accounts will ever see the light of day :)

You wind up having to re-download or re-rip most software after installing a new hard drive and migrating previously installed games, just to satisfy the license system that is ALREADY in place. They definitely have a better solution for replacing hardware than Nintendo, which requires you to send your old Wii / Wii U back to the factory to transfer software licenses :)

Comment Re:Always on = !on (Score 1) 592

Yeah, the game will play, but it will force you to disconnect from Xbox Live for the entire time you play it. I do not do this often, since 360 game updates are MUCH smaller than PS3 updates and usually download in a matter of seconds. You can even continue to earn achievements while playing an un-patched game.

I think the forced disconnection from Xbox Live is just an anti-cheating mechanism, which is silly for games that do not even use online services :)

Comment Re:Touch typing is taught in elementary school! (Score 1) 183

I would have figured, with all the time kids waste on Facebook, Twitter, texting, etc. these days, that any internet-connected teen would already have this skill set whether it was taught in school or not :)

I learned to touch type on IRC in the mid-'90s, long before I decided to take up programming as a profession. I almost certainly would not have gotten involved in programming before learning to type.

Comment Re:As a professional, I would say... (Score 1) 183

Really?

I have never met anyone with an interest in computer programming who could not already locate any key on the keyboard by muscle memory. I always assumed it was a skill people would already have acquired before deciding to jump into software design. Like, a natural progression from crawling to walking and running.

But if this was a bad assumption, then by all means :) I cannot imagine writing C code without a firm knowledge of where to find {, |, ~, etc...

Comment Re:Unlimited Supply of Laptops? (Score 4, Informative) 215

You can almost certainly re-program it using a JTAG interface... Samsung can do this at the factory if you return it to them. JTAG is not intended for consumer use, though. My old university had a JTAG probe and several adapters to interface with various hardware vendors' proprietary interfaces - without them we would have had several multi-thousand-dollar bricks in our hardware lab :)

I would hope that Samsung would have the decency to admit a flaw in their design and provide the reprogramming free of charge, but ...

Comment Re:memo to hardware producers (Score 2) 215

Yeah, not to mention the latency involved in writing to EEPROM. You would pretty much have to do it asynchronously to keep it from becoming a bottleneck, which then defeats the purpose, since there is no guarantee that the last log message(s) completely written to UEFI were actually the last messages generated. I implemented something similar in a custom VxWorks boot loader, which wrote boot status to on-board flash memory, but it did so synchronously at a limited number of application-defined checkpoints.
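The idea was roughly like the sketch below (plain C, with the names, the address, and flash_write_sync standing in as placeholders for the board's real blocking flash-write primitive, not actual VxWorks calls). Because status is only written at a handful of well-defined points, the slow synchronous write never sits in a hot path, and whatever stage you read back after a hang was fully stored before execution moved on:

#include <stdint.h>

#define BOOT_STATUS_FLASH_ADDR 0x0000u  /* reserved status region (placeholder) */

enum boot_stage {
    BOOT_STAGE_RESET = 1,
    BOOT_STAGE_RAM_OK,
    BOOT_STAGE_IMAGE_LOADED,
    BOOT_STAGE_KERNEL_ENTRY
};

/* Placeholder for the board support package's blocking flash-write call. */
extern int flash_write_sync(uint32_t addr, const void *buf, uint32_t len);

/* Record a checkpoint synchronously: the boot loader does not continue
 * until the write returns, so a stage read back after a later hang was
 * committed to flash before anything after the checkpoint executed. */
int boot_checkpoint(enum boot_stage stage)
{
    uint32_t record = (uint32_t)stage;
    return flash_write_sync(BOOT_STATUS_FLASH_ADDR, &record, sizeof record);
}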

I do not like the idea of EEPROM being written constantly whenever the kernel spits out a message. You are spot on; this will inevitably wear the memory out :-\

Comment Re:Erosion (Score 4, Interesting) 51

Do not forget that volcanism and liquid water were also once factors in weathering. There is no life that we know of to speed up erosion, so it is possible that drilling only a few cm will reveal geologic history on a different timescale than the equivalent depth on Earth.

Granted, the top layer, which is all we have studied up until now, will be nothing exciting (likely layers of dust deposited over millennia), but unexposed layers have a lot of historical potential. They may even be old enough to portray Mars during a more interesting period, perhaps when it still had a respectable magnetic field and atmosphere.

Comment Re:Rats, already upgraded (Score 1) 266

Apple really doesn't care much for legacy support; Rosetta is not even installed on Mac OS X by default anymore. They constantly make minor changes to their core OS APIs that require writing separate code paths or using separate languages altogether. The OS X window-system interface in my graphics engine is 4x as long as the Microsoft Windows interface. And now, to compile the engine as a 64-bit binary on OS X, I have to write an Objective-C wrapper to use parts of the OS that are no longer accessible from the 64-bit C APIs.

It has been this way for most of Apple's history and it is frustrating for software developers and consumers alike.
