Comment Scala, Haskell (Score 3, Interesting) 897

If you want to learn something new without throwing away all your Java experience, you might try Scala. I've heard good things about it, though I have no personal experience with it. As functional languages go, I prefer Haskell [1] as my default problem-solving language. You might have trouble finding a Haskell job, but it will teach you things that will be relevant in other languages.

Erlang is an interesting language. I view it as kind of a one-trick pony, but for distributed systems I've not seen anything better.

[1] Learn You a Haskell for Great Good!

Comment Re:I'll give ya half credit (Score 1) 368

Define wealthy for me. Then prevent that definition from getting broader and broader as cash-strapped governments seek to acquire more money.

The definition of "wealthy" that I think is most useful at present is: any person who makes sufficient income from capital gains to be relatively unaffected by variations in the income tax. (Specifically excluded from this are those who are unaffected by changes in income tax on account of having no job, or having such a low income that their tax rate is close to zero.)

If the capital gains tax were made comparable to the income tax, then we'd perhaps need a more specific definition with a dollar amount attached.

Comment Re:Wait... (Score 1) 117

Why can't it just shut down one of the two normal cores, and run the other core at a highly reduced rate to get the same power savings?

I'm not an expert at hardware design, but I'd guess that the energy savings from reducing the clock on a high-speed chip aren't all that dramatic. If you have a 1.5 GHz chip, it has to be designed around circuits that can reach a stable state in less than a nanosecond. A chip clocked at a third the speed can use longer wires and more complex circuits, and can probably use lower voltages because it has a lot more time between clock cycles. The optimal design for the slower chip may differ considerably from the optimal design for the fast chip. Similarly, the fast cores might be simpler if they don't have to be capable of running at a slower clock.
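A quick way to see why dropping the clock alone doesn't buy much (a back-of-envelope sketch in Haskell; the capacitance, voltages, and cycle count are invented for illustration, not numbers from any real part): in the usual dynamic-power approximation, frequency cancels out of the energy for a fixed workload, so the real savings come from the lower voltage that a slower design can get away with.

    -- Dynamic power ~ C * V^2 * f; for a fixed workload of k cycles, time = k / f,
    -- so energy per task = C * V^2 * f * (k / f) = C * V^2 * k. Frequency drops out;
    -- the savings come from voltage. (All constants below are made up.)
    energyPerTask :: Double -> Double -> Double -> Double
    energyPerTask cap volts cycles = cap * volts * volts * cycles

    main :: IO ()
    main =
      print ( energyPerTask 1e-9 1.2 1e9   -- 1.2 V design -> ~1.44 J per task
            , energyPerTask 1e-9 0.9 1e9 ) -- 0.9 V design -> ~0.81 J per task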

Additionally, I've seen plenty of benchmarks where a chip with a higher power draw that can finish a task quickly and drop back to a low-power idle mode is actually more energy efficient than a lower-power chip that takes longer to get the same task done.
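To make the race-to-idle point concrete, here's a toy calculation (all wattages and times are made up for illustration, not measurements from any real chip). What matters is total energy over the window, including whatever else has to stay powered while the task runs:

    -- Toy race-to-idle comparison over a 3-second window; all numbers invented.
    -- cpuW: active CPU power; platformW: everything else kept awake by the task;
    -- busy: seconds until the task finishes; sleepW: whole-system sleep power.
    energyOver3s :: Double -> Double -> Double -> Double -> Double
    energyOver3s cpuW platformW busy sleepW =
      (cpuW + platformW) * busy + sleepW * (3 - busy)

    main :: IO ()
    main = do
      let fast = energyOver3s 2.0 1.0 1.0 0.05  -- 2 W core, done in 1 s -> 3.1 J
          slow = energyOver3s 0.4 1.0 3.0 0.05  -- 0.4 W core, takes 3 s -> 4.2 J
      print (fast, slow)  -- the hungrier core wins on total energy here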

I'd guess that the slow core is designed for tasks that really aren't CPU-constrained at all, but which might have real-time requirements, such as logging GPS coordinates or accelerometer readings.

Comment Re:What's the point? (Score 1) 184

Reflections and shadows are easy from an implementation point of view, but they aren't "free" from a performance point of view, and as MidoriKid noted, rays don't all cost the same - with a good acceleration structure, you usually do approximately log(N) ray intersection tests per ray, where N is the number of polygons. There are also problems with very large amounts of independently moving geometry, since rebuilding the acceleration structure is generally O(N log N).

In practice, ray tracers slow down as you increase the amount of screen space taken up by complex objects, but they tend to be fairly insensitive to the total amount of geometry in the scene, whereas graphics cards tend to be limited by the total number of polygons in the scene.
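As a rough illustration of that insensitivity (a back-of-envelope sketch; the per-pixel and per-polygon cost model and the constants are my own assumptions, not numbers from any real renderer):

    -- Ray tracing: ~1 primary ray per pixel, ~log2 N intersection tests per ray.
    -- Rasterization: work roughly linear in the number of polygons N.
    -- (Illustrative only; ignores shading, secondary rays, culling, etc.)
    rayTests :: Int -> Int -> Int -> Double
    rayTests width height n = fromIntegral (width * height) * logBase 2 (fromIntegral n)

    rasterWork :: Int -> Double
    rasterWork = fromIntegral

    main :: IO ()
    main = do
      let n = 1000000 :: Int                     -- one million polygons
      print (rayTests 1920 1080 n)               -- ~4.1e7 intersection tests
      print (rayTests 1920 1080 (10 * n))        -- only ~1.2x more for 10x the geometry
      print (rasterWork n, rasterWork (10 * n))  -- 10x the work for 10x the geometry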

Space

Exoplanet Reports Exaggerated 55

The Bad Astronomer writes "The reports of the first direct picture of an exoplanet are misleading. The real news is that an image of a probable exoplanet taken in 2008 using a telescope in Hawaii has been confirmed — it's a planet. In fact, exoplanets have been directly imaged before; the first was in 2005. More images of other planets were released in 2008. To be specific: this new planet is the first to be directly imaged orbiting a sun-like star using observations made from the ground. That's actually still quite a technological achievement, but don't be misled by hyperbolic headlines."

Comment Re:First 7.5 minutes watched (Score 1) 321

I had wondered if I was just ignorant, since the Petrov story sounded like something I should have heard about somewhere. I suppose that's the risk of making a show with a target audience that includes a lot of trivia nerds - it's hard to come up with surprising true stories that they haven't already heard.

(I am familiar with a very similar story about a Soviet submarine armed with nuclear torpedoes during the Cuban Missile Crisis. An American destroyer was dropping depth charges on it, but the crew decided not to fire because launching required the unanimous consent of three particular officers, and only two agreed.)

Comment Re:First 7.5 minutes watched (Score 1) 321

I watched the whole thing and thought they did a pretty decent job. Some of the camera work could have been better, but I'm willing to overlook that given their constraints. I like the characters. None of the roles seems forced in just to advance the plot (though the interactions between the DHS guy and his assistant seemed rather clichéd). The "nuclear battery" thing is real technology (called a radioisotope thermoelectric generator), and Stanislav Petrov appears to have been a real person (the Wikipedia article is probably too old to have been made up by the creators of the show). For me, it's too early to tell whether the show is likely to be a good one, but the first episode is promising.

Stanislav Petrov

RTG

Comment Re:Simple answer (Score 5, Insightful) 321

One viewer that really loves Firefly and will buy the DVDs is worth more revenue than a viewer who kills an evening watching Dancing with the Stars because they're bored and then forgets about it forever.

I don't know that that's really true. The point of network TV isn't to sell DVDs, it's to sell commercials. If Ford runs a commercial, and viewers go out and buy Fords, the show is a success, regardless of whether the viewers were really enjoying the show.

It may be that if a person really likes a show, they're more likely to think highly of its advertisers, but I think the networks are really more interested in attracting the maximum number of eyeballs, and the more gullible those viewers are, the better.

-jim

Education

Visual Network Simulator To Teach Basic Networking? 138

unteer writes "I am a US Peace Corps volunteer currently teaching a computer technician course at a technical college in Kenya. My students have all completed the Kenyan equivalent of high school and have been accepted into a program where they give a year of nation-building non-military service in return for a technical education. My students' course load includes an introduction to computer networking, and this is where my problem lies. Do any of you know of a visual network simulator that can create an interactive network map that allows me, the instructor, to manipulate various components of a network, including the physical media, routing configuration, and which applications are being used to submit data? An example would be to have a visual of the differences between mail traffic and web traffic, and be able to show how the configuration of a wireless network might be different from a wired network. I know this may seem silly, but visuals of all this are critical to getting ideas across. It doesn't even have to be technically accurate, but rather just pictorially accurate, possibly just labeling the various components correctly. Also, it would be highly preferable if it ran on Linux, as I teach using FOSS only."

Comment Re:Oh god.. (Score 1) 659

I gave up in disgust after looking at the first question. "Legitimate" psychological tests don't ask you to self diagnose; they ask a large number of concrete questions that can be used to infer psychology.

I had a similar reaction to the test. (I didn't bother to read the article... it sounds like that was a good plan on my part.) The test didn't seem like it measured empathy so much as whether I self-identify as an empathetic person.

I don't think my own system of empathy works at the same level as what the test was trying to measure. I really don't think very much about what circumstances are like for other people. Rather, I've adopted a set of rules for what constitutes fair and decent behavior between human beings, and it bothers me when those rules are violated. Similarly, it bothers me when it becomes popular to think of some group of people as sub-human in some way (whether they are Mexicans, Palestinians, Republicans, whatever...). Maybe I've lost some of my empathy by reducing it to a set of axioms, or maybe my empathy is still there in full force and just does its work subconsciously. It might even be possible to test this in some way, but I'm pretty sure the test linked from the article doesn't measure anything meaningful whatsoever.

Comment Re:Alarmist much? (Score 2, Interesting) 175

Thanks for your insight; I too thought the article was a bit over the top. Even supposing that the USPTO does raise fees, the article took it as a foregone conclusion that they would raise them across the board. The patent process already has cheaper fees for individuals and small businesses, and there's no reason to assume they wouldn't continue and/or expand that. I'm in favor of higher fees if it means we get higher-quality patents and prices don't go up for small organizations.

Comment huge fines are not the solution (Score 1) 175

A potential $100,000 fine just for filing a bad patent would stop most small businesses or independent inventors from filing at all, even if the patent were valid -- it's way too much risk. I certainly don't have $100,000 lying around.

Ultimately, it's the patent office's job to determine whether patents are valid or not. The applicant ought to do due diligence to check whether their idea is original, but realistically it's not possible to know every invention that has ever been thought of in the whole history of the human race. Also, infringement is fairly subjective -- it isn't always possible to predict which way the courts are going to rule.

If you want there to be fewer bad patents, there is an easier solution: simply make it cheaper and easier to challenge an existing patent. When you file a patent and the office grants it, the patent office is effectively saying, "this patent represents a unique idea significantly different from anything that has been invented before." That's the claim they make after you pay them a couple hundred dollars. It should be even simpler to ask them, "does such-and-such invention infringe on such-and-such patent?", and they should be able to come up with an answer for a cost similar to that of filing a patent. As far as I understand it, such a simple determination currently requires a very expensive lawsuit.

Comment language safety vs programmer ability (Score 1) 192

I don't know if I agree; Haskell programmers tend to be demographically more experienced (it's the only language I know of where it seems that the median programmer has, or is working on, a PhD), but I would also trust a relatively inexperienced programmer to write fairly good code in Haskell, especially if they used an existing web framework like HappStack. Static typing and well-defined libraries go a long way towards making it hard to do the wrong thing. This is one of the things I find compelling about Haskell - you don't have to be some kind of awesome programmer to be able to write solid applications.

I feel like web security is a bunch of switches that you have to know to turn off - training can help, but it's better if those switches are not just off by default, but hard to turn on. For instance, I used HappStack to write a simple web app and never worried about escaping strings. Once things were working properly, I thought, "OK, now let's fix all the security holes I've left lying around and do some proper string escaping." I discovered, when I tried to enter HTML tags in one of the input forms, that the HTML library was escaping the strings for me, and I didn't even know it.
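For what it's worth, here's a minimal sketch of that escaping-by-default behavior, using blaze-html (my assumption here; I don't remember exactly which HTML library my HappStack setup pulled in, so treat this as illustrative rather than as HappStack's own API):

    import Text.Blaze.Html5 (p)
    import Text.Blaze.Html (toHtml)
    import Text.Blaze.Html.Renderer.String (renderHtml)

    main :: IO ()
    main = do
      let userInput = "<script>alert('xss')</script>"
      -- toHtml escapes the string, so the markup below is inert when rendered;
      -- it prints something like: <p>&lt;script&gt;...&lt;/script&gt;</p>
      putStrLn (renderHtml (p (toHtml userInput)))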

There is, of course, a steeper learning curve, and you won't find a pile of books to tell you how to use HappStack, like you would if you were using a more mainstream framework. This contributes to sampling bias; the mediocre or poorly motivated programmers are likely to give up, so it would be difficult to test my hypothesis.
