
Comment Different is OK, but it has to work. (Score 1) 1040

I have a netbook with Ubuntu 11.04. I didn't mind Unity at first, but then I quickly realized that a) it is designed to prevent me from working the way I want to and b) it's really buggy.

I won't elaborate on the second point. If you already know what I'm talking about, then fine; if not, then it must either be working fine for you, or you aren't using Unity at all. Regarding the first point, though -- there seems to be someone out there with some sort of religion about not having two windows of the same application open at once. Years back, there was a big controversy about Nautilus not opening a new window when you double-clicked on a directory. I didn't pay much attention because I don't use Nautilus. More recently, I've been using gedit quite a lot to write code (take that, vim and emacs users!), but every now and then I want to have two windows open so I can look at more than one file at once. Can't do it. If I try to open another gedit window, it just creates a new tab in the existing window. Maybe there's some way to turn this off, but it seems like a pretty bad default to have an application go out of its way to prevent a user from ever looking at two files at the same time.

Fast forward to Ubuntu 11.04. Now, the application launcher thing on the left has some default launch buttons that can be deleted and replaced with the applications I actually use (goodbye, word processor; hello, terminal). These launcher buttons have strange behavior, though -- if the app is already running, they just highlight the window of the already-open app instead of opening a new window. I'm not convinced that's a good idea even for a word processor, but for terminals it's just unusable.

To be fair, it is possible to open multiple terminals if you don't use the launcher, but instead click on the magnifying glass with a plus in it, which (contrary to every GUI I've ever used before) isn't actually a zoom button, but rather is used to search for applications. Searching for "terminal" and launching it from there allows you to open more than one window.

I feel like the party line about Unity is that "It may be different from what you're used to, but Unity is a significantly better, more usable interface and it will be worth a short learning curve." In actuality, it seems like the real story is, "Working with multiple windows is too messy. We've fixed that bug by not allowing you to use multiple windows. If this upsets you, then you're not Ubuntu's target audience."

I haven't tried the newest Ubuntu. If it isn't any better, I think it might be time to start looking for a better distro.

Comment Re:Well, that's a clever tactic. (Score 1) 386

Quite true. As I understand it, members of the United States military are subject to the Uniform Code of Military Justice, and we typically require soldiers to have immunity from local laws as a condition of maintaining a military base on foreign soil. In the case of Iraq, they, as a sovereign nation, decided not to grant that immunity, so we're leaving.

I'm not so sure what the situation is for contractors. I believe there were a few embarrassing incidents where private security personnel killed civilians and, because they weren't subject to either local laws or the UCMJ, there was no way to charge them with a crime. I don't know whether Iraq has withdrawn that immunity from contractors since then.

Comment Re:why I use Haskell (Score 1) 338

I'm not sure exactly what you're trying to say, but it sounds like something along the lines of "We already have for loops and spinlocks and they work. Why would anyone want anything else?", so I'm just going to assume that's what you meant.

Haskell doesn't use for loops because they don't make any sense in a pure-functional execution model. "You want to cause a side effect N times? What is this 'side effect' that you speak of? What do you mean the body of the for loop doesn't have a value?" Haskell does have a sort of for loop for monadic code called forM_, if that's what you really want.

Ignoring the restrictions of the Haskell model of computation, if we take a look at the for loop construct, there are a few things in its favor and a few that aren't so great. It is simple, and it is possible to express a surprising variety of common patterns with the for loop, thus freeing the programmer from learning a larger number of special-purpose constructs (like map, fold, zip, etc...) that you might use in a language like Haskell. On the downside, though, for loops are rather verbose, they tend to be lousy at abstracting away the details of iterating over data structures, and the compiler often has to be fairly conservative about how it optimizes the loop. It may be a "solved problem", but technology isn't just about running with the first solution that happens to work. Sometimes it's worthwhile asking if we're using the best abstractions. I'm sure AltaVista thought search was a "solved problem" before Google showed up.
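
To make that concrete, here's a small sketch (my own toy example, not code from anywhere in particular) showing the monadic forM_ loop next to the map/fold style you'd normally reach for in Haskell:

    import Control.Monad (forM_)

    -- An imperative-style "loop", for when you genuinely want a side
    -- effect performed once per element.
    printAll :: [Int] -> IO ()
    printAll xs = forM_ xs print

    -- The same kind of work done without side effects: map and foldr
    -- replace the loop body and the accumulator variable.
    sumOfSquares :: [Int] -> Int
    sumOfSquares = foldr (+) 0 . map (^ 2)

    main :: IO ()
    main = do
      printAll [1, 2, 3]
      print (sumOfSquares [1 .. 10])  -- 385

Nothing here that a for loop couldn't do, obviously; the point is just that the iteration details live in map and foldr instead of being restated in the body of every loop.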

As for spinlocks, they certainly do work, but they're pretty easy to misuse. Common mistakes include forgetting to grab a lock in the first place before reading or writing some shared variable, and getting into deadlocks by acquiring multiple locks in an inconsistent order. You can't make either of those mistakes in STM, and it comes with the added benefit that it's possible to take two existing atomic operations and combine them into a single atomic operation without having to rewrite the original two.
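
For example, that composition looks roughly like this (a minimal sketch using the standard Control.Concurrent.STM interface; the account/transfer names are just made up for illustration):

    import Control.Concurrent.STM

    -- Two atomic operations written independently...
    withdraw :: TVar Int -> Int -> STM ()
    withdraw acct n = readTVar acct >>= \bal -> writeTVar acct (bal - n)

    deposit :: TVar Int -> Int -> STM ()
    deposit acct n = readTVar acct >>= \bal -> writeTVar acct (bal + n)

    -- ...composed into a single atomic transfer, without touching either one.
    transfer :: TVar Int -> TVar Int -> Int -> IO ()
    transfer from to n = atomically (withdraw from n >> deposit to n)

    main :: IO ()
    main = do
      a <- newTVarIO 100
      b <- newTVarIO 0
      transfer a b 30
      mapM_ (\v -> readTVarIO v >>= print) [a, b]  -- 70, then 30

Try doing that with two lock-protected operations and you're right back to worrying about lock ordering.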

This is assuming you use STM, but in Haskell you don't have to. If you like spinlocks, then you'll be right at home with the MVar interface. Alternatively, it is possible to parallelize pure code without using locks at all, using par, parMap, pseq, etc... The neat thing about going the pure route is that you get parallel code that is deterministic. Pure functions are guaranteed to always return the same value given the same arguments, even if the function is implemented internally as a parallel computation. It's these sorts of invariants that make Haskell programs rather easier to reason about. If you know a certain class of bug won't happen, you don't have to always be on your guard against it.
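
Here's a quick sketch of the pure route -- just the classic parallel fib toy using par and pseq from the parallel package, not code from anything real:

    import Control.Parallel (par, pseq)

    fib :: Int -> Integer
    fib n = if n < 2 then fromIntegral n else fib (n - 1) + fib (n - 2)

    -- Spark the first recursive call while evaluating the second.  The
    -- answer is the same whether or not the runtime actually runs them
    -- in parallel, which is what makes this deterministic.
    parFib :: Int -> Integer
    parFib n
      | n < 25    = fib n
      | otherwise = x `par` (y `pseq` (x + y))
      where
        x = parFib (n - 1)
        y = parFib (n - 2)

    main :: IO ()
    main = print (parFib 35)
    -- Build with ghc -O2 -threaded and run with +RTS -N to use more cores.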

Comment why I use Haskell (Score 1) 338

I use Haskell for personal projects and use C at work. The reasons I use Haskell when I have a choice are that a) I like the typechecker to catch my mistakes, b) the language is performant enough for what I want to do, c) I can get a lot done with not very much code, d) the language is aesthetically appealing and e) using Haskell is fun. If "d" and "e" were the only reasons, I wouldn't use Haskell. I'm lazy, and if I just want to get something done, I'll usually do it the easiest way I know how. For many problems, Haskell is the easiest way I know.

I don't expect people who haven't used Haskell to understand this. I didn't, until I had used the language for a while and found to my surprise (writing my first large Haskell program) that restricting file access to the IO monad not only didn't restrict me in any significant way, but actually made the design of the program cleaner than what I would have come up with had that restriction not been in place. "Clean design" may sound like an aesthetic issue, but for some reason cleanly designed programs seem to consistently work better than programs that aren't cleanly designed. (I'm not claiming that writing in Haskell is the only way to achieve clean design, just that the language seems to encourage it.)
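
As a trivial illustration (a toy, not a claim about how any of my real programs are laid out): the interesting logic lives in pure functions, and IO only shows up in a thin wrapper that reads and writes the files. The file names below are placeholders.

    import Data.Char (toUpper)

    -- The actual logic is a pure function; its type guarantees it can't
    -- touch the filesystem, the network, or anything else.
    shout :: String -> String
    shout = map toUpper

    -- File access is confined to a small IO wrapper around the pure core.
    shoutFile :: FilePath -> FilePath -> IO ()
    shoutFile inPath outPath = readFile inPath >>= writeFile outPath . shout

    main :: IO ()
    main = shoutFile "input.txt" "output.txt"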

Some programs I've written in Haskell are the Glome raytracer (available on Hackage), a web app, and an environmental simulator. I would consider these, if not "real world" programs, then at least representative of real world programs. I could have written them in other languages... but I didn't. Glome is the most complicated piece of software I've ever written by far (meaning that it does a lot, not that it is internally complex), and I don't think it's a coincidence that it was written in the most powerful language I know.

Comment Re:What about pre-rendering everything? (Score 1) 91

I don't think pre-rendering all possible images would be practical (except in a game like Myst where movement is confined); however, when we get into the realm of global illumination, it does make a certain amount of sense to do the ambient lighting computation (which is the same regardless of camera position) up front. In a sense, this is what modern games already do -- the lighting effects are essentially painted on the walls as textures. I think the areas where there's room for improvement are that global illumination ought to respond to changes in the scene (opening a door or window, for instance), and that these calculations could be made part of the game engine rather than an artist's tool, so that procedurally generated content could benefit from the same high-quality lighting effects as static scenes produced by level designers.

Comment We can already do that without exotic hardware (Score 1) 91

I'm a big fan of real-time ray tracing, but this doesn't sound all that exciting, considering that about three years ago I was able to play a real-time ray-traced game on a middle-of-the-road laptop. Resolution and framerate weren't great, but it was playable. The game I refer to is Outbound, an open-source game based on the Arauna engine.

It's great that this is on Intel's radar, but whenever Intel demonstrates some new real-time ray-tracing demo that requires a cluster of machines, or some other kind of expensive, specialized hardware, I just think they've kind of missed the boat. We can already do that sort of thing on an ordinary desktop. (The linked site is down, so perhaps there's more to this announcement than the Slashdot summary would lead me to believe.)

Comment Re:Why you want to reconfigure on the fly. (Score 1) 108

So why don't they build systems like this?

I think it's mostly a matter of the toolchain being different and very, very primitive. My impression of Verilog (from a few trivial experiments) is that it's the sort of language you might expect someone to have designed 40 years ago. Also, many idioms are different. Some things may be easier, but programmers tend to take for granted that they can make random memory accesses -- they don't expect to need to implement a memory controller to do so.

Comment the-keyboard-is-the-computer form factor (Score 1) 339

It may sound strange, but sometimes it's actually pretty nice to have the computer built into the keyboard, with no screen. For several years I used an old laptop I bought cheap from a friend because the screen was cracked. I took the screen off and just plugged it into a monitor and used it like that -- it's nice to have a computer that's portable but that you can use with a full-size screen, without the built-in screen getting in the way.

Comment land use (Score 3, Insightful) 320

If I remember correctly, a couple of the proposed crops for making cellulosic ethanol are switchgrass and miscanthus, and they both grow fine without human intervention. Switchgrass is native to North America. My understanding is that either crop could be used on land that isn't actively being farmed for food crops or that is "resting" for a few years as part of a normal crop rotation cycle.

Comment ad hoc routing, flooding (Score 2) 314

Strange as it may seem, the parent is right -- a lot of modern ad-hoc routing algorithms don't automatically keep their routing tables up to date. Instead, they flood the network with a "where is so-and-so" message when they need to send a message to a host they don't already know about. As the reply is flooded back from the destination node, every other node learns how to reach it, and the path is built up by the forwarding nodes in the reply, so that when it finally gets back to the initiator, it knows the full path to the destination. Data packets are not flooded, only route requests and replies. This is how DSR works. AODV works a little differently, but I don't remember the details.

By contrast, OLSR is a link-state protocol rather than a distance-vector protocol -- every node tries to keep a current map of the whole network. This is expensive for large networks, but it's reasonably efficient for small ones, like you might see popping up in a disaster area to re-establish local communication. The nice thing about OLSR is that it runs at the IP layer, so it doesn't have any kind of weird hardware dependency -- it's easy to set up on all kinds of computers (Linux, Windows, WRT54Gs...), or at least that was the state of things a couple of years ago when I was using it.
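
If it helps, here's a toy sketch of the route-request idea in Haskell -- not DSR itself, just a sequential simulation over a made-up adjacency list, but it shows how flooding a request builds up a source route as it spreads:

    import qualified Data.Map as Map
    import Data.Map (Map)

    type Node    = String
    type Network = Map Node [Node]

    -- A made-up network: each node and its radio neighbors.
    network :: Network
    network = Map.fromList
      [ ("A", ["B", "C"]), ("B", ["A", "D"]), ("C", ["A", "D"])
      , ("D", ["B", "C", "E"]), ("E", ["D"]) ]

    -- Flood a route request outward from the source, breadth-first.
    -- Each pending request carries the (reversed) path it has taken, so
    -- the first one to reach the destination is a complete source route.
    discoverRoute :: Network -> Node -> Node -> Maybe [Node]
    discoverRoute net src dst = go [[src]] [src]
      where
        go [] _ = Nothing
        go (path@(here : _) : rest) seen
          | here == dst = Just (reverse path)
          | otherwise   =
              let fresh = filter (`notElem` seen)
                                 (Map.findWithDefault [] here net)
              in go (rest ++ [n : path | n <- fresh]) (seen ++ fresh)
        go (_ : rest) seen = go rest seen

    main :: IO ()
    main = print (discoverRoute network "A" "E")  -- Just ["A","B","D","E"]

In the real protocols the nodes do this concurrently and cache what they learn, but the shape of the idea is the same.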

I have used OLSR in small networks of wireless routers (running OpenWRT) and laptops, and it seems to work well. I haven't done any large-scale testing, but some people have.

Comment Re:A challenge to the complainers... (Score 3, Insightful) 257

Here are some things that Virgin Mobile isn't doing:

  • shutting off a user's connection as soon as they reach the cap
  • throttling or blocking on a per-application (e.g., BitTorrent and YouTube) basis
  • charging a large fee for exceeding the quota

Network capacity is a finite resource. It looks like Virgin Mobile is dealing with that in the most reasonable way. Good for them.

Comment What I want (Score 1) 459

Here's what I want -- a netbook with an e-ink-style, non-backlit (or optionally backlit) screen that's readable in direct sunlight and can be swiveled around so it can function as either a tablet or a laptop, and with a non-locked-down OS. I want one device that I can use to write code, read books, browse the web, and connect to a projector to display slides, and do all of that on a long battery life.

What I've just described is essentially the OLPC XO, which would be quite tempting if the keyboard were easier to use, and maybe the screen were a bit bigger. I would be happy with a black-and-white screen, especially if it had an external VGA, DVI, or HDMI port for connecting a color display. I am surprised no one is making such a device.

Maybe I ought to just buy a netbook and a Kindle, but it seems like a waste not to combine the best features of both into a single device.
