Comment: Re:Cell (Score 1) 338

I really appreciated the Cell BE too. I hope that architecture, with cache coherence added (local stores are a pain to manage), becomes more common. Have you taken a look at Texas Instruments' KeyStone II? It's ARM + crazy DSPs. No one seems to have really noticed it, though. http://www.ti.com/dsp/docs/dsp...

Comment: Maybe Perl is just "complete," not dying. (Score 4, Informative) 547

by Arakageeta (#48102729) Attached to: Goodbye, World? 5 Languages That Might Not Be Long For This World

"Perl is an excellent candidate, especially considering how work on Perl6, framed as a complete revamp of the language, began work in 2000 and is still inching along in development."

This does not imply that Perl is on its way out. I don't use the language myself (I despise it, personally), but I know many who use it on a daily basis. It is still a go-to language for many programmers (albeit ones who may no longer be in their 20s) who need to quickly hack together a test harness for a larger system. It could merely be that Perl is "complete" for the applications where it is useful, and further revision is no longer necessary.

Also, I'd hardly say that C++ is on its way out, even though C++11 took so long to be ratified.

+ - This wearable robot will give your hand two extra fingers

Submitted by rtoz
rtoz (2530056) writes "Researchers at MIT have developed a robot that enhances the grasping motion of the human hand. This wrist-wearable robot gives the hand two extra fingers.

The robotic fingers sit at either side of the hand — one outside the thumb, and the other outside the little finger.

A control algorithm enables it to move in sync with the wearer's fingers to grasp objects of various shapes and sizes.

With the assistance of these extra fingers, we can grasp objects that are usually too difficult to hold with a single hand."

Comment: Re:Good scholarship - tenure (Score 4, Interesting) 325

by Arakageeta (#47181245) Attached to: Fixing the Humanities Ph.D.

There is a new problem that comes with reliance on adjuncts. Departments rarely monitor the performance of instruction themselves. Departments make decisions on re-hiring or firing an adjunct based upon student reviews and evaluations. Left without recourse, adjuncts are perversely incentivized to teach easy classes and give out high marks---this helps ensure good reviews. (It also continues the trend in grade inflation.) Adjunct professors cannot challenge their students without risking being fired.

Comment: You can come back with half the pay and no benefit (Score 4, Insightful) 325

by Arakageeta (#47181167) Attached to: Fixing the Humanities Ph.D.

My girlfriend recently graduated with a PhD in history from a department ranked 11th by US News. She's won a number of nationally recognized awards. She still can't find a tenure-track job. She was hired as a visiting professor at a university for this past year. Pay was around $40k with benefits. She got great reviews from her students, so the university offered to re-hire her as an adjunct with the same workload (teaching four classes a semester)... but at *half* the pay and *without* benefits. Her pay and benefits were better as a graduate student! She politely declined the offer. Being valued so little by the same world that qualified you is hard to endure.

Comment: Beef already high and dairy is climbing (Score 2) 397

by Arakageeta (#46795877) Attached to: Beer Price Crisis On the Horizon

Recent CNN report on the prices of beef and dairy: http://money.cnn.com/2014/04/1...

This will increase the cost to farmers too. That gets passed on to consumers. But perhaps we're all just commenting on the obvious: Production cost of X increases. The production cost of any product Y directly (or transitively) dependent upon X will also increase (or the value/quality of Y may decrease to compensate).

Comment: Re:Unobtainium (Score 1) 208

by Arakageeta (#45875567) Attached to: Intel's Knights Landing — 72 Cores, 3 Teraflops

What made Cell a nightmare to program for was the SPU's local store. The local store is great for performance, but a pain to program since the programmer had to explicitly move data back and forth between main memory and the local store (hardware designers back then all thought compilers could solve their problems for them--see Itanium). MIC is cache coherent. All memory references are snooped on the bus(es). MIC programmers don't have to worry about what's loaded in memory and what is not. An instruction merely has to dereference a memory address, and the MIC hardware will be happy to go fetch the needed data for you, automagically. It was not so with Cell.

Comment: Re:Programmability? (Score 2) 208

by Arakageeta (#45868103) Attached to: Intel's Knights Landing — 72 Cores, 3 Teraflops

It's not entirely syntactical. Shared memory is exposed to the CUDA programmer (e.g., __syncthreads()). CUDA programmers also have to be mindful of register pressure and the L1 cache. These issues directly affect the algorithms used by CUDA programmers. CUDA programmers have control over very fast local memory---I believe that this level of control is missing from MIC's available programming models. Being closer to the metal usually means a harder time programming, but higher performance potential. However, I believe NVIDIA has made CUDA pretty programmer friendly, given the architectural constraints. I'd like to hear the opinions of MIC programmers, since I have no direct experience with MIC.

Comment: Continued PREEMPT_RT development & NVIDIA supp (Score 2) 201

by Arakageeta (#45693129) Attached to: Under the Hood of SteamOS

I'm glad to see SteamOS has picked up PREEMPT_RT. I hope they stick with it. The PREEMPT_RT developers recently reported that they lacked the manpower to continue development (https://lwn.net/Articles/572740/). Maybe Valve can contribute money or manpower?

Also, since NVIDIA is keen to support SteamOS, NVIDIA must officially support PREEMPT_RT. NVIDIA's driver support for PREEMPT_RT has always been spotty. At best, hacks to the driver's GPL layer were required to make it work. I hope those days are over. NVIDIA has really improved their Linux driver over the years in order to better serve the Android and HPC markets. PREEMPT_RT support should make it even better (PREEMPT_RT can often uncover pre-existing bugs).

Comment: choppy slashdot (Score 1) 488

by Arakageeta (#44915557) Attached to: Ask Slashdot: Is iOS 7 Slow?

The choppiest site I've visited on my 4S with iOS7 is slashdot's mobile site. The background of each story is "active" in the sense that when I thumb-down to scroll, the story's background dims to grey, and the regular white background returns when I lift my thumb. Combining that dimming effect with scrolling really makes for a choppy experience!

Comment: Does ultra low-latency really matter? (Score 1) 177

by Arakageeta (#44271045) Attached to: Ask Slashdot: Low-Latency PS2/USB Gaming Keyboards?

I am very skeptical of the marketing claims for low-latency human input devices like gaming mice and keyboards. I understand the usefulness of special device configuration (e.g., macro buttons), but does a mouse really need to be polled every 1ms (like Razer mice)? In driving tests, the reaction time of a prepared driver is on the order of 750 to 1000ms (http://www.tandfonline.com/doi/abs/10.1207/STHF0203_1#.UeGmimR4a04 --- sorry for the paywall). Obviously, driving is not gaming, but let's suppose a gaming reaction time is half that: 375ms to 500ms. Now compare two mice: one polls every 1ms and the other every 10ms. Against a base reaction time of 375ms, the worst-case difference is under 3%; against 500ms, about 2%. Are low-latency input devices really where we should be optimizing a player's performance? Does it matter all that much? Wouldn't it be better to focus on things such as network latency and possibly even OS schedulers?

I admit, I am not a serious gamer and I don't invest heavily in gaming equipment. I would be very interested in hearing an objective opinion from a gamer. Does an input latency of 10ms really matter? If so, do you have objective data that can rule out the placebo effect?
