


Comment Re:Perl's a mess (Score 1) 379

The problem is that people look at Perl - without having learned it - and say "unreadable!"
That really is the kind of circular thought only stupid people can achieve - "I dunno Perl, so I can't read Perl, so I don't know Perl, so I think it's unreadable (...)"
Now, anybody seriously considering reading large arrays into Perl can ask the bioinformatics guys how Perl is working out for them, or choose the Perl Data Language, which seems good enough for some guys in an astrophysics department.

Comment Re:Wait, what? (Score 1) 379

When you say "Perl is scary" I take it you mean that you don't like TIMTOWTDI.
I have often pondered this. It seems a simple design has its advantages - you quickly pick up the simple rules, and away you go. Then there's the engineering aspect. Java and Python might be better for the larger software houses, in the sense that they facilitate things for the "code monkey". There's nothing wrong with that job position, although the term is derogatory. Eiffel, for example, is a well-designed language that takes that approach very explicitly (they even say there's no room for the "language guru" in Eiffel).
But Perl is no more scary than C++. C++ is large, complex, and full of hidden gotchas. That never stopped it from being used at a very large scale.
I really feel most arguments against Perl are lazy and not well thought out.
Let's say Perl is "complex", the way Common Lisp is "large and complex". I tend to think these complex languages are the languages of the experts, the power users. The language bends and twists to the expert's desires. This lets them achieve great productivity.
The point being, there's a learning curve to Perl (not much, it takes reading the Camel book) and other "complex" languages, but there's a pot at the end of the rainbow in personal satisfaction and productivity.

Comment Re:Wait, what? (Score 1) 379

Perl and Perl's hacker community have, when you think about it, pulled off amazing feats, bending Perl to take whatever shape they wanted/needed.
People wanted OOP - Perl's closures allowed that. Perl's OOP is better than most would think. (Read: Object-Oriented Perl).
Want to program in functional-style? They proved you could do a lot of FP stuff in Perl. (Read: Higher-Order Perl).
They even went ahead and gave Perl a Meta-Object Protocol, sort of CLOS-style (CLOS = Common Lisp Object System, which some would argue is the most advanced out there).
And Perl is pretty fast compared to Python or Ruby, with a huge number of libraries (CPAN).
So Perl is pretty successful and has held its own as one very flexible language. The fact that the language did not change much, and yet achieved so much, is really a testament to its happy, fortuitous design.
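The closure trick behind Perl's OOP and FP styles can be sketched quickly. Here is a minimal illustration of the closure-as-object idiom, written in Python purely for compactness (Perl closures capture lexicals the same way; all names here are invented for the example):

```python
# Minimal sketch of the closure-as-object idiom: the captured variable
# plays the role of private instance state, and the returned functions
# act as the object's methods. (Illustrative only; names are made up.)
def make_counter(start=0):
    state = {"count": start}   # private state, visible only to the closures

    def increment(step=1):
        state["count"] += step
        return state["count"]

    def value():
        return state["count"]

    return {"increment": increment, "value": value}

c = make_counter()
c["increment"]()      # -> 1
c["increment"](2)     # -> 3
print(c["value"]())   # prints 3
```

The same captured-state pattern is what the books cited above build on, step by step, up to full object systems.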

IMHO some mistakes were made in not supporting things that are not Perl per se, but would be crucial in a modern programming language:
- Perl is a PITA for C interoperability, and a lot of Python these days is, in fact, about using a deeper C layer (why not more Lua, then?). C is crucial for speed (that means number-crunching and graphics), for software re-use, and it's *the* lingua franca of software.
- Some sort of officially-sanctioned GUI toolkit should've been adopted. Again, this isn't Perl per se, but having multi-platform GUI support could've secured a better position in the desktop space (though most Perl people work at the data center/server level).
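The "deeper C layer" point is easy to demonstrate. Here is a small sketch using Python's ctypes to call libc directly; it assumes a Unix-like system where `find_library("c")` can locate libc, and the same principle applies to any FFI:

```python
# Sketch of the "thin scripting layer over C" pattern: call a C library
# function directly through a foreign-function interface.
# Assumption: a Unix-like system where find_library("c") locates libc.
import ctypes
import ctypes.util

libc = ctypes.CDLL(ctypes.util.find_library("c"))
libc.strlen.restype = ctypes.c_size_t       # declare the C return type
libc.strlen.argtypes = [ctypes.c_char_p]    # and the argument types

print(libc.strlen(b"hello"))  # 5
```

When this kind of call is cheap and well supported, the scripting language becomes a convenient front end to decades of existing C code.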

And then there's Perl6... Perl6 was supposed to be great - its design is great - but IMHO the developers made the mistake of forgetting the lessons of the past, forsaking an advanced functional language for the implementation of complex language features - this despite Audrey Tang's effort - and then got caught in a labyrinth they can't seem to get out of. This move might have been caused by pure prejudice against Haskell, or sheer stubbornness; I don't know.

Whatever the reasons (and to be fair, one got tired of following the slow progress of Perl6), letting Pugs die when Audrey had done so much by herself just didn't seem rational at all. Maybe it was, maybe it wasn't, but if the history of programming languages serves as a guide, the odds were in favor of a Haskell (or SML, or OCaml, etc.) implementation, because the advanced features Perl6 was aiming for were, to my knowledge, never decently implemented with the usual bag of C-oid tools (the only thing that comes to mind is Mathematica, which was very buggy in its initial version compared to Maple). Smalltalk, Lisp, etc. all seem to build some sort of interpreter or language kernel and grow the language from there. It doesn't seem that was what the Perl6 implementors were doing. It's almost as if they were emulating the Java developers - and we know how long it took them to implement a modicum of advanced language features over there...

Comment Re:WAAAAAAAAAY too little, too late. (Score 1) 175

I agree with him. Paypal has a terrible track record, and even a moderately successful fundraiser can cause them to seize funds. Since the shelter isn't a business with a track record of sales, it could take significant effort to get access to those funds. And since Paypal isn't a regulated bank, there's no recourse other than taking them to court.

Paypal is convenient, and is only worth the risk if you can afford to lose the transaction.


Submission + - Has the bazaar model gone insane? (acm.org)

synthespian writes: In a scathing recent ACM Queue column called A Generation Lost in the Bazaar, a review of the current "bazaar" situation in open source Unixen, Poul-Henning Kamp — of FreeBSD fame — writes: "At the top level, the FreeBSD ports collection is an attempt to create a map of the bazaar that makes it easy for FreeBSD users to find what they need. In practice this map consists, right now, of 22,198 files that give a summary description of each stall in the bazaar... Also included are 23,214 Makefiles that tell you what to do with the software you find in each stall... the map helpfully tells you that if you want to have www/firefox, you will first need to get devel/nspr, security/nss, databases/sqlite3, and so on. Once you look up those in the map and find their dependencies, and recursively look up their dependencies, you will have a shopping list of the 122 packages you will need before you can get to www/firefox.
Here is one example of an ironic piece of waste: Sam Leffler's graphics/libtiff is one of the 122 packages on the road to www/firefox, yet the resulting Firefox browser does not render TIFF images...
That is the sorry reality of the bazaar Raymond praised in his book: a pile of old festering hacks, endlessly copied and pasted by a clueless generation of IT "professionals" who wouldn't recognize sound IT architecture if you hit them over the head with it. It is hard to believe today, but under this embarrassing mess lies the ruins of the beautiful cathedral of Unix, deservedly famous for its simplicity of design, its economy of features, and its elegance of execution."
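The recursive lookup Kamp describes is essentially a transitive-closure walk over the dependency map. A sketch of that "shopping list" computation (toy package data, not the real ports tree; Python for illustration):

```python
# Sketch of the recursive dependency lookup Kamp describes: starting
# from one port, walk the map until the full "shopping list" is known.
# The dependency map below is a toy example, not real FreeBSD ports data.
deps = {
    "www/firefox": ["devel/nspr", "security/nss", "databases/sqlite3"],
    "security/nss": ["devel/nspr"],
    "devel/nspr": [],
    "databases/sqlite3": [],
}

def shopping_list(port, dep_map):
    """Return every package needed before `port` can be built."""
    needed, stack = set(), [port]
    while stack:
        current = stack.pop()
        for dep in dep_map.get(current, []):
            if dep not in needed:       # visit each package only once
                needed.add(dep)
                stack.append(dep)
    return sorted(needed)

print(shopping_list("www/firefox", deps))
# ['databases/sqlite3', 'devel/nspr', 'security/nss']
```

With the real ports map this same walk is what turns one requested package into the 122-package list Kamp complains about.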

Has the bazaar model become unmanageable? We all remember when Debian just crumbled under its own weight, when its packagers (they're called "developers" in Debianland) just couldn't keep up with the exponential explosion of packages. It took money — i.e., Shuttleworth's money — to get Debian rolling again (except now it's called Ubuntu). But that entails both parasitizing a distro and too much human work. While that seems to confirm the old adage that money will get you anything, it doesn't really look like it's a real technical solution. FreeBSD and Ubuntu/Debian remain the open source Unixen with the largest collection of Userland goodies, and are the prime victims of bazaar dependency-hell.
Do you agree with PHK's view? Do we need to go "old school" and take more responsibility for the design of our code? Has the bazaar become, perhaps, the mirror image of the NIH syndrome? Or maybe we need to update our old ways, reaching for newer cutting-edge tools that can analyze and automate intelligently, such as SAT solvers, Abstract Interpretation error-checking, Formal Concept Analysis-based tools for better dependency analyses [pdf], and non-destructive (referentially transparent) updating tools — all mostly absent from the developer's radar? Or do we do both?


Submission + - Magnetic brain stimulation - what can it do for you?

An anonymous reader writes: Magnetic brain stimulation is an up-and-coming technology, and its purveyors seek to apply it to new diseases nearly as fast as they can be diagnosed. The latest comer to the party is Israel-based Brainsway, which recently got the green light from the FDA to treat depression with their new, deeper-penetrating electromagnet. Dumping up to 8000 amps into their coils and driving it home with more than a kilovolt, any of these machines would make one heck of a wireless phone charger. It should come as little surprise that they could have an effect on your brain. The question we must first demand be answered is — what exactly do they do?

Submission + - Dangerous Remote Linksys 0-Day Root Exploit (net-security.org)

Orome1 writes: "DefenseCode researchers have uncovered a remote root access vulnerability in the default installation of Linksys routers. They contacted Cisco and shared a detailed vulnerability description along with the PoC exploit for the vulnerability. Cisco claimed that the vulnerability was already fixed in the latest firmware release, which turned out to be incorrect. The latest Linksys firmware (4.30.14) and all previous versions are still vulnerable."

Submission + - Java vulnerability could take 2 years to fix, despite Oracle's patch (networkworld.com)

An anonymous reader writes: After the Department of Homeland Security's US-CERT warned users to disable Java to stop hackers from taking control of users' machines, Oracle issued an emergency patch on Sunday. However, HD Moore, chief security officer of Rapid7, said it could take two years for Oracle to fix all the security flaws in the version of Java used to surf the web; that timeframe doesn't count any additional Java exploits discovered in the future.

"The safest thing to do at this point is just assume that Java is always going to be vulnerable," Moore said.


Submission + - PC Shipments Decline as Windows 8 Fails to Ignite Market (ibtimes.co.uk)

DavidGilbert99 writes: "PC manufacturers are in trouble. Despite the launch of Microsoft's most radical re-think of its operating system in years, and the launch of dozens of thin-and-light Ultrabooks, people are simply not buying PCs.

Tablets are the big problem, with Gartner analyst Mikako Kitagawa saying that when tablets first appeared in 2010, consumers were expected to buy them as companion devices for their main laptop or desktop PC; instead, tablets are replacing those devices."


Submission + - Theory of cometary panspermia gaining weight (buckingham.ac.uk)

Carpespain writes: Fossil diatoms (a kind of phytoplankton) have been discovered in a carbonaceous meteorite that fell in the North Central Province of Sri Lanka on 29 December 2012. The new data provides strong evidence to support the theory of cometary panspermia.


Submission + - How the Internet Makes the Improbable Into the New Normal 1

Hugh Pickens writes: "A burglar gets stuck in a chimney; a truck driver in a head-on collision is thrown out the front window, lands on his feet, and walks away; a wild antelope knocks a man off his bike; a candle at a wedding sets the bride's hair on fire; someone fishing off a backyard dock catches a huge man-size shark. Now Kevin Kelly writes that in former times these unlikely events would be private, known only as rumors, stories a friend of a friend told, easily doubted and not really believed — but today they are on YouTube, seen by millions. "Every minute a new impossible thing is uploaded to the internet and that improbable event becomes just one of hundreds of extraordinary events that we'll see or hear about today," writes Kelly. "As long as we are online — which is almost all day many days — we are illuminated by this compressed extraordinariness. It is the new normal." But when the improbable dominates the archive to the point that it seems as if the library contains only the impossible, the "black swans" don't feel as improbable. "To the uninformed, the increased prevalence of improbable events will make it easier to believe in impossible things," concludes Kelly. "A steady diet of coincidences makes it easy to believe they are more than just coincidences.""

Submission + - Ask Slashdot: What Does It Take to Become a Software Engineer? 2

Jeheto writes: "I've always had an interest in IT, and now I'm at the point where I can choose my career. I'm currently a freshman college student trying to decide between Electrical Engineering and Computer Science as a major. I'm about to take my A+ certification, I have a few years of high-school-level training in electronics theory/soldering, and I know just the smallest bit of Python from working with the MIT OpenCourseWare program. I'm equally interested in networking, electrical engineering, and programming. My question to Slashdot is: what should I learn?"
