Comment Re:Beyond the theoretical limit (Score 2) 223

There's a theorem in imaging that says you cannot focus a light source to create a beam any more intense than at the surface of what is emitting the light. A consequence of this is that you cannot heat something to hotter than the surface temperature of the sun by concentrating sunlight in any way, even if you had a lens the size of the solar system. The spot size that you get will just keep getting bigger.

That's true, but it only applies to imaging optics. Non-imaging optics -- hyperbolic concentrators being one of the more common cases -- are not subject to this limitation. If I'm remembering right -- it's been about 25 years -- there was pioneering work done at the University of Chicago in the late 80's using hyperbolic concentrators to achieve concentrations considerably above those at the surface of the sun. This doesn't violate the Second Law, because you only get the amount of light that falls into the collector, minus losses due to absorption and scattering. There are limits to non-imaging concentrators, too, but those revolve around the refractive index of the reflector material, which limits the range of useful hyperbolic profiles and thus the level of concentration achieved. Back when I was paying closer attention to this area, the highest-performing concentrators used corundum, which is a tad pricey for large-scale work.
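For anyone who wants the numbers: the following is just the textbook sine-brightness (étendue) bound, not anything specific to the Chicago work, so treat it as a back-of-the-envelope sketch. A concentrator whose output terminates in a medium of refractive index $n$ and which accepts light over the sun's angular radius $\theta_s \approx 4.7\ \mathrm{mrad}$ can reach at most

$$C_{\max} = \frac{n^2}{\sin^2\theta_s} \approx \frac{n^2}{(4.7\times10^{-3})^2} \approx 46{,}000\,n^2$$

so a concentrator in air tops out around 46,000 suns, while one terminating in corundum (sapphire, $n \approx 1.76$) can in principle reach roughly 140,000. That's why the refractive index of the material sets the ceiling.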

Comment Worst /. article ever? (Score 1) 257

First, this is just flamebait.

Second, the only people who want everything to be done in one language are those clueless zealots everyone finds an excuse not to hang out with after work.

Third, even if one language completely dominated the niche category of handheld consumer devices, it would mean nothing outside that niche.

Slow news day?

Comment Close, but no banana (Score 4, Insightful) 344

'I suspect that the majority of users are more likely to be satisfied with KDE 4.6 than GNOME 3.'

I'm certain that the majority of users are likely to wish developers would stop fucking with the interface they're already comfortable and familiar with and find something more useful to do with their time.

Comment Re:Not really surprising (Score 1) 249

Not exactly correct. Take WordStar, for example, and compare it to any modern program.
Fonts? Yeah, right -- you were lucky if the screen could display bold and italics.
Graphics? What?
Spell checker? It was a separate program you ran.

A more informative comparison might be made between Office 97 and Office 2010. The overwhelming majority of the features in Office 2010 were already present in Office 97, but I could comfortably run Office 97 on a 100 MHz Pentium with 8 megs of RAM and have several other programs, including Photoshop, open at the same time with little or no noticeable lag. The resource requirements have grown much, much faster than the functionality.

Comment Not really surprising (Score 4, Insightful) 249

Surprisingly, it was enough to run databases, word processors and complex, professional software. Today's iPad is equipped with 512MB of RAM (roughly one thousand times more), and some reviewers complain it's a bit on the low side.

This is not surprising at all. The general trend over the intervening three decades has been to trade runtime efficiency for faster development. The result is applications that are often less responsive than their primitive predecessors, which were written in hand-coded assembly language. Moreover, because most users -- especially corporate users -- only upgrade their software when they replace their machines, and often replace their machines because a new package demands more hardware, there's a feedback effect between hardware and software vendors: less efficient, resource-hogging software drives hardware sales, which in turn drive the sales of new licenses for established software. As application categories mature -- when was the last time you saw a new word processor or spreadsheet feature worth paying for an upgrade? -- this becomes the only driver of substantial new sales.

Software has to get worse for both industries to maintain their desired growth rates. And because technical users ceased to be the majority of users decades ago, the industry has largely managed to get away with it. I had hoped FOSS would reverse this trend, since FOSS is largely free of market pressures, but the Free Software folks could never sully themselves by making end-user-friendly software, and the Open Source folks were bent on imitating the very corporations they despised. Ergo, you can have Microsoft Office hog your resources, or have OpenOffice.org hog your resources, or you can use emacs or vim to write your documents in LaTeX. The user gets screwed either way, profits continue as normal for Intel, Apple, and Microsoft, and FOSS remains a minor player in userspace.

Comment Source check (Score 1) 343

It's worth pointing out that the author of this study is an associate professor of communications at a very small, upscale women's college in Boston with a grand total of one graduate-level science program, namely behavior analysis. Said program's website proudly announces two recent studies, "Task Analysis, Correspondence Training, and General Case Instruction for Teaching Personal Hygiene Skills" and "Learning to Ride: Pedaling Made Possible Through Positive Behavioral Interventions".

The short version is that questioning the author's credentials, and whether he even has the resources to conduct a meaningful study in this area, is fair game.

Comment Re:A comment on Fark sums this up perfectly (Score 1, Insightful) 343

One has to wonder whether it's not as big a problem as is advertised or whether men just have that little value in modern society.

It has more to do with our being at the tail end of a period when spousal abuse of women was considered normal and was tolerated, so there was a big push to alter its depiction in the media to discourage it. Now that times have changed some, albeit not enough yet, one side effect of abused women finally being able to come forward and seek help is that the much smaller number of men abused by their wives are beginning to come forward as well. They're still mostly below the radar, but as public awareness of the problem grows, expect a backlash against thoughtless media depictions of it.

There are always loud reactionaries associated with any kind of attempt to deal with social problems, but for the vast majority of reasonable people, it's not an us-versus-them thing. It's just a matter of directing limited resources at the most serious -- or just most visible -- problems first.

Comment Re:Design consequences into the game (Score 2) 343

Also, dying in a game should be a bit more painful. You lose all your gear and you start at the first level -- that's how it was when I played growing up. They didn't have a "save" feature.

Thank you so much! I've been waiting patiently for forty years to find out what "get off my lawn" would sound like coming from our generation, and you have surpassed my wildest expectations. Never did I imagine it would come in the form of something like, "Listen sonny, when I was your age, we didn't have save points. We had to pump in more quarters. Uphill. Both ways. In the snow."

Comment Re:"Superdecoherence" (Score 1) 101

Since for some algorithms the computational power is exponential in the amount of quantum memory, you can do "significant" stuff without a lot of memory.

That may be so, but I have a feeling that they'll still need to be able to implement at least three registers to accomplish anything, and they haven't quite made it up to being able to implement a single short int. The idea of quantum computing has a lot of potential, but so does holographic memory, and they've been promising results there since the 1960's. When you're fighting entropy on as many fronts as they are, there's good reason to be pessimistic.
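To put the parent's "exponential" point in numbers -- this is just standard quantum-information bookkeeping, not anything specific to the experiment in the article -- an $n$-qubit register is described by a state vector in a Hilbert space of dimension

$$\dim\mathcal{H} = 2^n$$

so the 16 qubits of that single short int would already span $2^{16} = 65{,}536$ complex amplitudes, and three such registers would mean 48 qubits, or about $2.8\times10^{14}$. The catch is that each added qubit is also one more thing that has to be kept coherent, which is exactly the entropy fight in question.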

Comment Re:Another example of an obnoxious long-term mania (Score 1) 444

My main objections to PHP are its excessive memory consumption, its lack of even optional type discipline, its lack of optional variable predeclaration along the lines of Perl's use strict, its amazingly crappy error reporting, and its kludgy lexical scoping rules. To be fair, other than that, it's not anywhere near as awful as people like to say, and used intelligently, it's actually pretty decent. I'm mostly just tired of using it and tired of writing web apps -- I'd rather be writing desktop applications -- so I'm probably a little overly sensitive to its flaws. So yeah, I bitch about it because I use it a lot.

Whether it's faster than Perl really depends on what you're doing with it. PHP arrays can operate like Perl hashes and lists, which is convenient, but it also means that every array element carries both a scalar offset and an associative key regardless of how you're actually using it, and that (along with some other metadata) consumes unreasonable quantities of memory. In a local, single-user app, that's not necessarily a disaster, but in a data-intensive, high-traffic web app, it can be a real pain to manage memory use. It can also lead to some really frustrating bugs when an associative array key happens to be a literal numeric value, though PHP handles dynamic typing a lot better than JavaScript does.
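A minimal sketch of the numeric-key pitfall, for the curious. Nothing here is project-specific -- the variable names are made up, and the behavior shown is standard PHP 5 array semantics:

<?php
$a = array();
$a[0]      = 'positional';   // plain integer offset
$a['name'] = 'associative';  // string key living in the same array
$a['10']   = 'surprise';     // a decimal numeric string is silently cast to int 10

var_dump(array_keys($a));        // int(0), string "name", int(10)
var_dump($a[10] === $a['10']);   // bool(true): both land on the same element

// The memory side of the complaint: every element carries full bucket
// metadata, so 100,000 plain integers cost far more than the raw data.
$before = memory_get_usage();
$big = range(1, 100000);
printf("%.1f MB used\n", (memory_get_usage() - $before) / 1048576);
?>

On the PHP 5 engines of the era, that last figure comes out around an order of magnitude larger than the raw integers themselves would need, which is the overhead being complained about.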

Comment Another example of an obnoxious long-term mania (Score 1) 444

This one goes all the way back -- at least -- to defining begin and end macros in C that resolve to { and } for the benefit of people who thought Pascal was the shit. And the answer, substituting the appropriate languages, still is that if you think Pascal is the shit, use Pascal. Don't try to disguise C as Pascal.

With the JVM, things are a little different, since the JVM itself is a platform, albeit one designed with a particular language in mind. Writing a JVM backend for languages other than Java actually makes sense. Butchering Java to make it vaguely resemble some other language is just as pointless, counterproductive, and obnoxious as trying to make C more like Pascal.

I've been coding since the 70's and have used most of the major languages from FORTRAN onwards, along with the design methodologies that have risen and fallen with them. Aside from choosing a language at the appropriate level -- some tasks are better handled in Java or Perl or Python than in C or assembly language -- the main rational reason to prefer one language over another is the codebase and workgroup you're working with. You'll get more done with a bunch of Java programmers and an ass-ton of legacy Java code by continuing to code in Java than you will by starting over in some other language.

Almost everything else is just pointless churn generated by people who fell in love with a particular language or methodology (often the first one they mastered) and then developed a mania for getting everyone else to admire their hobby horse as much as they do. Thanks, but no thanks. We already have religion for that kind of irrational, aggressive stupidity, and you can see how well that has worked out. I don't claim to be immune to the instinct: my particular favorite is C, but it's been more than a decade since I used it professionally. These days, I use PHP (ugh), C# (not bad, even if it is from MS), a bit of Perl, and an unbelievably large amount of my current least favorite, ECMAScript. They all do the job they're called on to do. The associated development tools make a much bigger difference than the language itself.

These days, the blind enthusiasm is for Ruby and Python. Big fucking deal. The language advocates were just as full of shit when it was Java, Perl, C++, C, BASIC, and COBOL. (Okay, they were especially full of shit about BASIC and COBOL.) They'll be just as full of shit in two or three years, when Ruby begins to decline and some other Algol derivative with minor syntax tweaks and cute jargon catches the fancy of the next crop of undergrads -- who think they're being revolutionary because they don't yet have the experience to know they're getting hot and bothered about reinventing something from ten or twenty years ago. Meanwhile, the actual work of trying to make software more useful to actual users will continue, and the software that succeeds in that role will almost certainly not have a cute name beginning with the first letter or two of the implementation language.

Comment Re:Standard Apple Operating Procedure (Score 1) 298

Add in a higher-than-average up-front cost for the device. iOS users need to speak out on this. Stop paying more for less!

If iOS users were interested in freedom (either as in beer or as in speech), they wouldn't be iOS users in the first place. They have iPhones and iPads because they like the product and/or because it's fashionable in their peer group. Big deal. And if you don't like the product, there's a workaround: buy something else.

That's not intended to be a defense of Apple. I don't like their products or their business model. But despite the hype, they're not going to drive all the alternatives out of business. This is especially true in the phone market, which is so fragmented that being the biggest player there is like being the fattest kid in an elementary school, but it's also true of the personal computer market. Steve Jobs' delusions of grandeur may lead him to believe that he's taking over the world, but his narcissistic giggles can be safely ignored. There are more choices for users now than there were before the days of the Microsoft monopoly.

The short version? "Don't feed the trolls" applies to bored Slashdot editors as much as it does to commenters.

Comment Re:DeVry is very expensive (Score 2) 557

Allowing unqualified students into a classroom simply because they can pay for it has the reverse effect of "a rising tide raises all ships" -- 2 or 3 (or 8 or 10) students in a classroom of 25 who don't have the prerequisite knowledge to be there cause NO END of distractions and problems for both the teacher AND the qualified students in the room.

I'm at an age and a point in my career where I could go back to school to study something that interests me for its own sake, and this is exactly why I won't even consider doing it. There's no reason for me to spend an entire semester on material I could teach myself in six weeks just so a bunch of undermotivated assholes can have some slow-motion hand-holding while constantly questioning whether each new item is going to be on the test and whether it has any "real world" utility. The gratification of seeing them rack up debts that dwarf the meager income their putative education will eventually earn them just isn't enough to make up for the irritation of listening to them mouth-breathe.
