
Comment Re:I've never understood the UNIX world's fascinat (Score 1) 267

Why use something not really made for that?

It's simpler to use something already built and tested, with known strengths and weaknesses, multiple mostly-compatible implementations available, tool support, and plenty of books and trained personnel to choose from, than to use a much simpler solution that is less well understood, or worse, one that I have to design and implement myself.

Or, to put it another way, why do the Chinese and Indians do business with each other in English when Esperanto would suffice?

Comment Re:Adapt (Score 1) 626

Thanks for the tips. I'll take a look at lazy-lock. I don't know if the font-locking code even terminates on the problem files and don't have the patience to find out, so caching wouldn't help.

Just out of curiosity, since you're well-qualified to prognosticate, what do you foresee for Emacs? Will a more modern Lisp-based editor eventually displace Emacs, or will Emacs continue to eat its young until something totally different kills it?

Comment Re:Adapt (Score 1) 626

What I meant by "RAM is cheap" is that on modern systems you'll only start swapping under pathological circumstances. The GP poster was worried that running two jobs concurrently would turn the disk into a bottleneck. His objection was framed around jobs with heavy disk I/O, but I also wanted to address the question of swapping, which was a valid concern back in the days when switching from task #1 to task #2 could mean swapping in a bunch of code and data for task #2. These days, #2's code and data would be in memory somewhere, a much quicker trip than disk.

And I disagree with the title of this thread - Linux (the kernel at least) is quite well prepared for multicore chips.

That seems to be the case to me, too. It's the applications that drop the ball. Emacs can get hung opening a large .cpp file if the macros confuse the parser used by the syntax highlighter. Why isn't that done in a separate thread so I can make my changes and close the file while the syntax highlighter flails in the background?

Comment Re:Adapt (Score 2, Interesting) 626

Short answer: only one thing I mentioned involved disk I/O, RAM is cheap, and application frameworks typically limit the number of jobs being run at one time.

If there's really a performance need to serialize tasks involving disk I/O, then go ahead and serialize them. Eclipse, the application framework I'm most familiar with, makes this straightforward: just define a scheduling policy that allows only one job to run at a time and apply that policy to all your disk I/O jobs. Other jobs will continue to be scheduled and run according to the default policy or whatever other policy you specify -- might as well get some work done while you're waiting for the I/O to complete.
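
To make that concrete: Eclipse expresses this with scheduling rules on its Job API, but the same idea can be sketched in plain `java.util.concurrent` (a stand-in I'm using for illustration, not Eclipse's actual classes). A single-threaded executor acts as the exclusive policy for disk I/O jobs, while everything else runs on a normal pool:

```java
import java.util.concurrent.*;

public class SerializedIoDemo {
    public static void main(String[] args) throws Exception {
        // Stand-in for an exclusive scheduling rule: every disk I/O job
        // goes through this single-threaded executor, so at most one
        // runs at a time and they never fight over the disk.
        ExecutorService diskJobs = Executors.newSingleThreadExecutor();

        // The "default policy": other jobs run concurrently on a pool.
        ExecutorService otherJobs = Executors.newFixedThreadPool(4);

        Future<String> scan  = diskJobs.submit(() -> "scanned project");
        Future<String> index = diskJobs.submit(() -> "indexed files"); // runs after scan
        Future<Integer> sum  = otherJobs.submit(() -> 2 + 2);          // runs concurrently

        System.out.println(scan.get());
        System.out.println(index.get());
        System.out.println(sum.get());

        diskJobs.shutdown();
        otherJobs.shutdown();
    }
}
```

The point is that serializing the I/O jobs costs you nothing on the CPU-bound side: the pool keeps getting work done while the I/O queue drains in order.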

Comment Re:Adapt (Score 4, Insightful) 626

But most computing in the world is done using single-threaded processes which start somewhere and go ahead step by step, without much gain from multiple cores.

Yeah, I agree. There are a few rare types of software that are naturally parallel or deal with concurrency out of necessity, such as GUI applications, server applications, data-crunching jobs, and device drivers, but basically every other kind of software is naturally single-threaded.


Sarcasm aside, few computations are naturally parallelizable, but desktop and server applications carry out many computations that can be run concurrently. For a long time it was normal (and usually harmless) to serialize them, but these days it's a waste of hardware. In a complex GUI application, for example, it's probably fine to use single-threaded serial algorithms to sort tables, load graphics, parse data, and check for updates, but you had better make sure those jobs can run in parallel, or the user will be twiddling his thumbs waiting for a table to be sorted while his quad-core CPU is "pegged" at 25% crunching on a different dataset. Or worse: he sits waiting for a table to be sorted while his CPU is at 0% because the application is trying to download data from a server.

Your example of building construction is actually a good example in favor of concurrency. Construction is like a complex computation made of smaller computations that have complicated interdependencies. A bunch of different teams (like cores) work on the building at the same time. While one set of workers is assembling steel into the frame, another set of workers is delivering more steel for them to use. Can you imagine how long it would take if these tasks weren't concurrent? Of course, you have to be very careful in coordinating them. You can't have the construction site filled up with raw materials that you don't need yet, and you don't want the delivery drivers sitting idle while the construction workers are waiting for girders. I'm sure the complete problem is complex beyond my imagination. By what point during construction do you need your gas, electric, and sewage permits? Will it cause a logistical clusterfuck (contention) if there are plumbers and electricians working on the same floor at the same time? And so on ad infinitum. Yet the complexity and inevitable waste (people showing up for work that can't be done yet, for example) is well worth having a building up in months instead of years.
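
The delivery-and-assembly part of that analogy is exactly the classic producer/consumer pattern. Here's a toy sketch (names like `stagingArea` and `girder-N` are mine, purely for illustration): a bounded queue is the staging area, so the site can't pile up with materials nobody needs yet, and the crew blocks when it's empty, just like workers waiting on a delivery:

```java
import java.util.concurrent.*;

public class SteelPipeline {
    public static void main(String[] args) throws Exception {
        // The staging area: bounded, so deliveries back off (block)
        // when the site already holds more steel than it can use.
        BlockingQueue<String> stagingArea = new ArrayBlockingQueue<>(2);

        // Delivery crew (producer): drops off girders concurrently
        // with assembly, blocking whenever the staging area is full.
        Thread delivery = new Thread(() -> {
            try {
                for (int i = 1; i <= 5; i++) stagingArea.put("girder-" + i);
                stagingArea.put("DONE"); // sentinel: no more deliveries
            } catch (InterruptedException e) { Thread.currentThread().interrupt(); }
        });
        delivery.start();

        // Construction crew (consumer): installs girders as they
        // arrive, sitting idle (blocking) when the area is empty.
        int installed = 0;
        for (String item = stagingArea.take(); !item.equals("DONE");
                item = stagingArea.take()) {
            installed++;
        }
        delivery.join();
        System.out.println("installed " + installed + " girders");
    }
}
```

The bounded queue is the coordination you mention: it keeps the drivers from burying the site in raw materials and keeps the crew from spinning when there's nothing to install.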

Comment Re:WTF? (Score 1) 230

Complete human beings with enlightenment and wisdom would never find the reasoning you mention to be convincing or tempting. Because they are complete, they would not be suckered into this type of false dichotomy. They would have none of the personal vulnerabilities on which the deception of false ideas is built.

False ideas don't require personal vulnerabilities. Any attempt to engage with the world produces false ideas, because we are ignorant and fallible beings.

Personal wisdom and enlightenment can reduce one's vulnerability to manipulation, but it does not answer political questions. It can tell me why I'm scared, but if my fear is based on the prospect of an undesirable outcome, spiritual wisdom cannot tell me whether the outcome is plausible or likely. Even if you reject fear as a basis for thought and action, you will still find yourself agreeing with it quite often. For instance, assuming you have not yet achieved perfect enlightenment, you are probably scared of injecting heroin. You were manipulated into this fear by information and media provided by people who want you to be scared of injecting heroin. Spiritual wisdom allows you to realize that you are scared and manipulated, but it does not answer the question of whether or not it is a good idea to inject heroin.

In general, there's no way to dodge the necessity of examining everything on its merits, and no way to get out of the catch-22 that all the information you consume is produced, directly or indirectly, by people who care about what you do and believe. Rejecting all self-interested manipulation would mean rejecting almost all human interaction. It would certainly make it impossible to learn anything about politics. As for fear, you can reject fear, but you can't simply say, "Fear is on one side of the issue, so I must be on the other." The presence of fear is informative, but it is not that informative.

To go back to the example at hand, wisdom cannot advise me to ignore the issue, nor can it resolve the issue one way or the other. After all, the bogeyman prospect is a story of how the current legal system is vulnerable to exploitation and how that will affect American business. Wisdom helps, but not the kind of wisdom you're talking about. The issue must be approached by seeking information, reading opposing viewpoints, and discussing it.

Comment Re:Yup (Score 1) 230

You're wrong about the user interface. Remember how everyone complained, "I don't mind the iPhone as a product, but I can't stand how people who own them are constantly taking them out and playing with them just to show them off?" That's what I thought, too, until I got one and started pulling it out to read news on the web every time I had to wait in line for a few seconds. It was a completely new experience after years of button and stylus phones. Other devices made it possible many years ago, but the iPhone was the first device that made it pleasant enough that large numbers of people actually bothered.

Anecdotal evidence: I had a Nokia n800, which I thought was a really neat device, but it never seemed worth the hassle. I carried it on and off for several weeks, and then it started gathering dust. I didn't use any mobile device's browser (neither the n800 nor the ones in my various phones) more than once a month until I got my iPhone, then suddenly I was browsing the web away from home several times a day. The difference was usability (especially one-handed usability) and the slim form factor.

By now some slick, usable Blackberries are on the market, but I bet they account for a small percentage of RIM's subscribers. You have a point about tethering, but there's a difference between a handful of traveling businessmen tethering when they don't have any other option and a bunch of young people watching YouTube all day and not even bothering to look for wifi.

Comment Re:WTF? (Score 1) 230

I think you missed the fact that I was disagreeing with your explanation of people's behavior. People have little imagination, and for most people, almost any impulse takes them in the direction of conformity. They can find group identity and ready-made identities on the right or on the left, through worship of corporations or enmity toward them. (But you know that already.)

What the corporations are doing is slightly more subtle than just offering an opportunity for conformism. Their reasoning even allows people who distrust corporations to believe that trusting corporations over consumers is the least bad option.

Comment Re:WTF? (Score 1) 230

Actually, this mentality comes from disenchantment with the legal system that is carefully cultivated by businesses to give themselves a legal leg up on consumers. If you convince people that the legal system is unable to decide consumer complaints justly according to their merits, then logically, there are only two choices: trust the corporations' word on everything or allow them to be torn apart by jealous parasites.

So, if you make people cynical about lawsuits by individuals, people see every consumer complaint as a threat to the production of all the food, services, and cool stuff that we currently enjoy. That is, a threat to capitalism and all we know as good.

Companies are happy to rely on the legal system to regulate relations among themselves when they can't get along, of course. Then they gang up on consumers to exclude them from the system because they don't have to rely on lawsuits to hold consumers to their word -- that's what credit reporting services are for.

Frankly, I'd love to see our ridiculous liability system restored to some kind of sanity and credibility. Then corporations will have to face more public responsibility. These days, when a company gets walloped in court for blatant fraud and dishonesty, people don't take it very seriously because business interests make sure there's a steady stream of ridiculous personal injury lawsuits in the news. I have to admit they have a point, but they don't invest billions in cultivating our cynicism just as a public service.

Comment Re:Yup (Score 1) 230

I bet the average BlackBerry user consumes far less bandwidth than the average iPhone user. The iPhone is a media device; if you don't want media and web browsing, there's no reason to buy it. Many people with little interest in media and web browsing own Blackberries for purely business reasons. Plus, the iPhone's slick UI means people consume more bandwidth simply because it's more convenient. The lack of friction in the UI means iPhone users will start browsing the web at the drop of a hat, just because they're bored. Most people wouldn't access the web on a Blackberry without a reason -- at least that's true for a lot of older and cheaper Blackberry devices that account for many of RIM's subscribers.

Comment Re:Yup (Score 1) 230

I agree, but AT&T has always been known for mediocre cell service. If you aren't known for being good at something, there's no point in being better than adequate. Everywhere I go, the AT&T network seems barely adequate (though the data bandwidth can be excellent in off-peak hours). I wouldn't know, but I assume it takes careful management to achieve such a consistent level of mediocrity. And they know that consistent mediocrity means that any aberration from normal conditions (such as an event that attracts an unusual number of people to one place) results in terrible service or no service at all. I've learned not to rely on my AT&T service. I'm looking forward to switching back to Verizon as soon as I can get a Verizon iPhone or something similar. Verizon has always hung their reputation on the quality of their network, so they have an incentive to keep it from crapping out all the time.

Comment Re:the workaround is bad design (Score 0) 421

Short version: "We're sorry we changed something that worked and everyone was used to, but hey -- it's compliant with a standard." If this were Microsoft, we'd give them a healthy helping of humble pie, but because it's Linux and the magic word "POSIX" gets used, I'm sure we'll forgive them for it.

I think what we've learned is that there's a bug in the POSIX standard, and Ext4 exploits the bug to deliver high measured performance in a way that is actually bad for users. So it's a benchmark hack on top of a flawed spec -- all in all, a shit sandwich for users.

That's not to say that Ext4 is bad technology. It sounds like it will deliver on its performance promises on systems that run well-written, failure-resistant software. It just won't work with the software that desktop users currently use. It will take a while for this to get sorted out, and we have to moderate our expectations from "everyone switches to ext4 and gets an automatic speed boost" to "wait and see; desktop users might not benefit from it anytime soon."
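
For anyone wondering what "failure-resistant" means in practice here: the classic safe way to replace a config file is write-to-temp, flush to disk, then rename, and the step ext4's delayed allocation punishes you for skipping is the flush. A minimal sketch in Java NIO (the filenames are made up; `force(true)` plays the role of fsync):

```java
import java.io.IOException;
import java.nio.ByteBuffer;
import java.nio.channels.FileChannel;
import java.nio.charset.StandardCharsets;
import java.nio.file.*;

public class AtomicSave {
    // Replace `target` with `data` so that a crash leaves either the
    // old contents or the new ones -- never a zero-length file.
    static void save(Path target, byte[] data) throws IOException {
        Path tmp = target.resolveSibling(target.getFileName() + ".tmp");
        try (FileChannel ch = FileChannel.open(tmp,
                StandardOpenOption.CREATE, StandardOpenOption.WRITE,
                StandardOpenOption.TRUNCATE_EXISTING)) {
            ch.write(ByteBuffer.wrap(data));
            // The step many desktop apps skip: force the data blocks to
            // disk *before* the rename, so delayed allocation can't end
            // up with the new name pointing at never-written data.
            ch.force(true);
        }
        // rename() is atomic on POSIX filesystems: readers see old or new.
        Files.move(tmp, target, StandardCopyOption.ATOMIC_MOVE,
                StandardCopyOption.REPLACE_EXISTING);
    }

    public static void main(String[] args) throws IOException {
        Path f = Files.createTempDirectory("demo").resolve("prefs.conf");
        save(f, "old settings".getBytes(StandardCharsets.UTF_8));
        save(f, "new settings".getBytes(StandardCharsets.UTF_8));
        System.out.println(Files.readString(f));
    }
}
```

The catch, of course, is that doing this for every one of the hundreds of little files a desktop session rewrites is exactly the performance hit the ext4 defaults were trying to avoid.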
