Comment: Re:Calculation was flawed (Score 1) 127

by raddan (#46635689) Attached to: State Colleges May Offer Best ROI On Comp Sci Degrees
Not to mention: many UVA grads likely stay in Virginia, and Stanford grads likely stick around in Silicon Valley (e.g., 100% of the Stanford grads that I know). The cost of living in Silicon Valley is dramatically higher than in Virginia. E.g., the cheapest condo in Palo Alto listed on Zillow is priced at $548,000 (which is > $300k above the already insane appraisal value), and for that, you get 679 square feet. When I worked in Mountain View as an intern, my housing was (fortunately!) covered by my employer, but my boss ended up taking a job elsewhere because he and his wife simply could not afford anything more spacious than an RV. If you don't adjust the salary to the cost of living, your study is fundamentally flawed.
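
To make the point concrete, here's a back-of-the-envelope sketch (all numbers entirely made up, purely for illustration) of what normalizing by a cost-of-living index would look like:

using System;

class ColAdjustedSalary {
    static void Main() {
        // Hypothetical figures -- not from the study.
        double uvaSalary = 85000, virginiaIndex = 1.00;
        double stanfordSalary = 120000, bayAreaIndex = 1.80;
        Console.WriteLine($"UVA, adjusted:      {uvaSalary / virginiaIndex:F0}");
        Console.WriteLine($"Stanford, adjusted: {stanfordSalary / bayAreaIndex:F0}");
    }
}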

Comment: Re:Guarantee (Score 1) 716

by raddan (#46225673) Attached to: Ask Slashdot: Should Developers Fix Bugs They Cause On Their Own Time?
We don't need to certify programmers, we need to certify programs. I'm not sure that certification for programmers would provide any extra benefit other than maybe being a prior on whether you think the programmer can get the job done or not (and I'm not a Bayesian, so...).

On the other hand, many properties of programs themselves can and should be verified. A great deal of current programming language research is devoted both to improving the capabilities of automatic program verification and to designing languages more amenable to verification. Functional languages, for instance, rule out entire classes of bugs present in imperative languages. People complain that they're hard to understand. Maybe. I'd argue they're hard to understand if you're the kind of person who doesn't care whether your program is correct.
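
To make "rule out entire classes of bugs" concrete, here's a minimal C# sketch (a stand-in for what functional languages give you by default) showing how immutable data makes a whole family of shared-mutation bugs unwritable:

using System;
using System.Collections.Immutable;  // built into modern .NET; a NuGet package on older frameworks

class ImmutabilityDemo {
    static void Main() {
        var prices = ImmutableList.Create(10, 20, 30);
        // "Updating" returns a new list; the original cannot be changed out from
        // under any other code holding a reference to it.
        var discounted = prices.SetItem(0, 5);
        Console.WriteLine(string.Join(", ", prices));      // 10, 20, 30
        Console.WriteLine(string.Join(", ", discounted));  // 5, 20, 30
    }
}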

Comment: Re:"So who needs native code now?" (Score 1) 289

by raddan (#45772467) Attached to: Asm.js Gets Faster

Unless and until some unforeseen, miraculous breakthrough happens in language design, GCd languages will always be slower when it comes to memory management. And because memory management is so critical for complex applications, GCd languages will effectively always be slower, period.

This isn't true. Have a look at Quantifying the Performance of Garbage Collection vs. Explicit Memory Management. The take-away is that GC'd languages are only slower if you are unwilling to pay an extra memory cost, typically 3-4x the footprint of your explicitly-managed program. Given that GC gives you safety from dangling-pointer dereferences for free, I think that's a fair tradeoff for most applications (BTW, you can run the Boehm collector on explicitly-managed code to identify pointer safety issues).

Comment: Re:Maximum precision? (Score 1) 289

by raddan (#45772395) Attached to: Asm.js Gets Faster
I was being glib. Just nitpicking on the phrase "maximum precision". Sorry, it's a bad habit developed from working around a bunch of pedantic nerds all day.

Thanks for the pointer about native ints, although I can't seem to find any kind of authoritative reference about this. This guy claims that asm.js converts these to native ints (see Section 2.3: Value Types), but his link seems to be talking about the JavaScript runtime, not the asm.js compiler. If you have a reference, I'd appreciate it if you'd send it along.

Comment: Maximum precision? (Score 4, Informative) 289

by raddan (#45763197) Attached to: Asm.js Gets Faster
Let's just open up my handy JavaScript console in Chrome...

(0.1 + 0.2) == 0.3
false

It doesn't matter how many bits you use in floating point. It is always an approximation. And in base-2 floating point, the above will never be true.

If they're saying that JavaScript is within 1.5x of native code, they're cherry-picking the results. There's a reason why the people who care insist on a rich set of numeric datatypes.
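
For comparison, here's a minimal sketch in a language with a richer numeric menu, using C#'s base-10 decimal type (which has its own limits; it just isn't binary floating point):

using System;

class NumericTypes {
    static void Main() {
        Console.WriteLine(0.1 + 0.2 == 0.3);     // False: binary doubles
        Console.WriteLine(0.1m + 0.2m == 0.3m);  // True: base-10 decimal
    }
}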

Comment: Re:Numerical computation is pervasive (Score 5, Interesting) 154

by raddan (#45730195) Attached to: 'Approximate Computing' Saves Energy
Not to mention floating-point computation, numerical analysis, anytime algorithms, and classic randomized algorithms like Monte Carlo methods. Approximate computing has been around for ages. The typical goal is to save computation, with the savings nowadays expressed in terms of asymptotic complexity ("Big O"). Sometimes (as is the case with floating point), the tradeoff is necessary to make the problem tractable at all (e.g., numerical integration is much cheaper than symbolic integration).
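
As a concrete example of the precision-for-work tradeoff, a toy Monte Carlo sketch in C# that estimates pi: fewer samples means less computation and a rougher answer.

using System;

class MonteCarloPi {
    static double Estimate(int samples, Random rng) {
        int inside = 0;
        for (int i = 0; i < samples; i++) {
            double x = rng.NextDouble(), y = rng.NextDouble();
            if (x * x + y * y <= 1.0) inside++;  // point landed inside the quarter circle
        }
        return 4.0 * inside / samples;
    }

    static void Main() {
        var rng = new Random(42);
        foreach (int n in new[] { 1000, 100000, 10000000 })
            Console.WriteLine($"{n,9} samples: pi ~ {Estimate(n, rng)}");
    }
}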

The only new idea here is applying approximate computing specifically to trade precision for lower power consumption. The research has less to do with new algorithms and more to do with new applications of classic ones.

Comment: Re:Cross language - what .Net gets right (Score 1) 286

by raddan (#45598185) Attached to: The Challenge of Cross-Language Interoperability
Believe it or not, CIL (or MSIL in Microsoft-speak), the bytecode for .NET, is an ECMA standard, and implementations of both .NET JIT'ers and standard libraries exist for practically all modern platforms, thanks to Mono. So I'd say: "competition for portable applications". Really! Just take a look at Gtk#. As a result, there are numerous applications for Linux written in .NET languages (e.g., Banshee). Having written huge amounts of code in both JVM languages (Java, Scala, JRuby, and Clojure) and .NET languages (F# and C#), I would take .NET over the JVM for a new project any day.
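
For the skeptics, a minimal Gtk# sketch (assumes the gtk-sharp assemblies are installed; on Mono, something like `mcs -pkg:gtk-sharp-2.0 Hello.cs` should build it):

using Gtk;

class Hello {
    static void Main() {
        Application.Init();
        var win = new Window("Hello from Gtk#");
        win.SetDefaultSize(250, 100);
        win.DeleteEvent += delegate { Application.Quit(); };  // close button exits the main loop
        win.ShowAll();
        Application.Run();
    }
}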

Also, to pre-emptively swat down this counter-argument: while the Mono people and Microsoft may have had some animosity in the past, it is most definitely not the case any more. Most of the Mono people I have spoken to (yes, in person) say that their relationship with Microsoft is pretty good.

Build systems and dependency management for the JVM are their own mini-nightmare. .NET's approach isn't perfect but compared to [shudder] Ant, Maven, Buildr, SBT, and on and on and on... it largely just works.

Comment: Re:Cross language - what .Net gets right (Score 3, Informative) 286

by raddan (#45598011) Attached to: The Challenge of Cross-Language Interoperability
P/Invoke, the other interop mechanism alluded to by the poster, is substantially faster than COM interop. I spent a summer at Microsoft Research investigating ways to make interop for .NET faster. There are maybe 20 or so cycles of overhead for P/Invoke, which is practically free from a performance standpoint. In addition to having its own [reference-counting] garbage collector, COM has extensive automatic-marshaling capabilities. These things make interop easy, but they add substantial overhead compared to P/Invoke. On the other hand, P/Invoke is somewhat painful to use, particularly if you want to avoid marshaling overheads and play nice with .NET's [tracing] garbage collector and managed type system. P/Invoke will often happily accept your ginned-up type signatures and then fail at runtime. Ouch.
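
For anyone who hasn't seen it, a P/Invoke declaration is just an extern method plus an attribute (Windows-only sketch; note that the runtime trusts whatever signature you write, which is exactly the fail-at-runtime pitfall above):

using System;
using System.Runtime.InteropServices;

class PInvokeDemo {
    // The declared signature must match the native export; nothing checks it for you.
    [DllImport("kernel32.dll")]
    static extern ulong GetTickCount64();

    static void Main() {
        Console.WriteLine($"Milliseconds since boot: {GetTickCount64()}");
    }
}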

Coming from the Java world, I was totally blown away by what .NET can do. I can't speak for Microsoft myself, but I would be very surprised if .NET were not going to stick around for a long time. With the possible exception of Haskell's (GHC's), the .NET runtime is probably the most advanced managed runtime available to ordinary programmers (i.e., non-researchers). And, with some small exceptions (BigInteger performance... GRRR!), Mono is a close second. What the Scala compiler is capable of squeezing out of the poor little JVM is astonishing, but Scala's performance is often bad in surprising ways, largely due to workarounds for shortcomings in the JVM's type system.

Comment: Re:overly broad then overly specific definition (Score 1) 318

by raddan (#45492789) Attached to: On the subject of robots ...
I think the key distinction is that a robot is autonomous to some degree. It needs to make use of techniques from AI. I.e., it learns.

As someone who dabbles in AI techniques to solve problems in my own domain (programming language research), I've found that AI solutions tend to have the quality that the algorithms producing them are extremely general. For example, a robot that can manipulate objects may not even possess a subroutine that tells it how it should move its hands; often, it learns these things by example instead. It "makes sense" of the importance of these actions through the use of statistical calculations, or logical solvers, or both. Since information in this context is subject to many "interpretations", these algorithms often most closely resemble search algorithms! If a programmer provides anything, it's in the form of "hints" to the algorithm (i.e., heuristics). To an outsider, it's completely non-obvious how "search" and "object manipulation" are related, but when you frame the problem that way, you get weird and sometimes wonderful results. Most notably, autonomy. Sadly, you also sometimes get wrong answers ;)
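
To illustrate what "generic search plus heuristic hints" means, here's a toy best-first search sketch (C#, .NET 6+ for PriorityQueue; the domain is deliberately trivial). The only domain-specific pieces are the successor function and the heuristic:

using System;
using System.Collections.Generic;

class BestFirst {
    // Generic greedy best-first search: all domain knowledge enters through
    // 'successors' (what moves exist) and 'heuristic' (the programmer's hint).
    static bool Search<T>(T start, Func<T, bool> isGoal,
                          Func<T, IEnumerable<T>> successors,
                          Func<T, double> heuristic, out T goal) {
        var frontier = new PriorityQueue<T, double>();
        var seen = new HashSet<T> { start };
        frontier.Enqueue(start, heuristic(start));
        while (frontier.TryDequeue(out var node, out _)) {
            if (isGoal(node)) { goal = node; return true; }
            foreach (var next in successors(node))
                if (seen.Add(next)) frontier.Enqueue(next, heuristic(next));
        }
        goal = default(T); return false;
    }

    static void Main() {
        // Toy domain: reach 20 from 0 using +1 or +3 steps, guided by distance to 20.
        if (Search(0, n => n == 20, n => new[] { n + 1, n + 3 }, n => Math.Abs(20 - n), out int goal))
            Console.WriteLine($"Reached goal state: {goal}");
    }
}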

If your washing machine could go collect your dirty laundry and wash it without your help, I'd call it a laundry robot. Particularly if you could tell it something like "please do a better job cleaning stains", and it could figure out what you meant by that. Note that a Roomba doesn't know anything about your house until you turn it on the first time.

Comment: Re:Fixed-point arithmetic (Score 4, Interesting) 226

by raddan (#45487017) Attached to: Ask Slashdot: How Reproducible Is Arithmetic In the Cloud?
Experiments can vary wildly with even small differences in floating-point precision. I recently had a bug in a machine learning algorithm that produced completely different results because I was off by one trillionth! I was being foolish, of course, because I hadn't used an epsilon for my FP comparisons, but you get the idea.
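
This is what "use an epsilon" means in practice, as a minimal sketch (the right tolerance is problem-specific):

using System;

class EpsilonCompare {
    // Compare within a tolerance instead of with ==.
    static bool NearlyEqual(double a, double b, double eps = 1e-9) =>
        Math.Abs(a - b) <= eps * Math.Max(1.0, Math.Max(Math.Abs(a), Math.Abs(b)));

    static void Main() {
        double a = 0.5, b = 0.5 + 1e-12;        // "off by one trillionth"
        Console.WriteLine(a == b);              // False
        Console.WriteLine(NearlyEqual(a, b));   // True, with a sane tolerance
    }
}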

But it turns out that even if you're a good engineer and you are careful with your floating-point numbers, the fact remains: floating point is approximate computation. And for many kinds of mathematical problems, like dynamical systems, this approximation changes the result. One of the founders of chaos theory, Edward Lorenz, of Lorenz attractor fame, discovered the problem by truncating the precision of FP numbers from a printout when he was re-entering them into a simulation. The simulation behaved completely differently even though the difference was only in the thousandths. That was a weather simulation. See where I'm going with this?
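
You don't need a weather model to see it. The same sensitivity shows up in a one-line chaotic map (a sketch, not Lorenz's actual system): two starting points a trillionth apart end up on completely different trajectories within a few dozen steps.

using System;

class Sensitivity {
    static void Main() {
        // Logistic map at r = 3.9 (chaotic regime); y starts 1e-12 away from x.
        double x = 0.4, y = 0.4 + 1e-12;
        for (int i = 1; i <= 60; i++) {
            x = 3.9 * x * (1 - x);
            y = 3.9 * y * (1 - y);
            if (i % 15 == 0)
                Console.WriteLine($"step {i,2}: x={x:F6}  y={y:F6}  |x-y|={Math.Abs(x - y):E2}");
        }
    }
}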
