Books

Remembering Sealab 138

An anonymous reader writes "'Some people remember Sealab as being a classified program, but it was trying not to be,' says Ben Hellwarth, author of the new book Sealab: America's Forgotten Quest to Live and Work on the Ocean Floor, which aims to 'bring some long overdue attention to the marine version of the space program.' In the 1960s, the media largely ignored the efforts of America's aquanauts, who revolutionized deep-sea diving and paved the way for the underwater construction work being done today on offshore oil platforms. It didn't help that the public didn't understand the challenges of saturation diving; in one comical exchange, a telephone operator initially refused to connect a call between President Johnson and aquanaut Scott Carpenter (who sounded like a cartoon character, thanks to the helium atmosphere in his pressurized living quarters). But despite being remembered as a failure, the final incarnation of Sealab did provide cover for a very successful Cold War spy program."
Biotech

UCSD Researchers Create Artificial Cell Membrane 54

cylonlover writes with an excerpt from a Gizmag article: "The cell membrane is one of the most important components of a cell because it separates the interior from the environment and controls the movement of substances in and out of the cell. In a move that brings mankind another step closer to being able to create artificial life forms from scratch, chemists from the University of California, San Diego (UCSD), and Harvard University have created artificial self-assembling cell membranes using a novel chemical reaction. The chemists hope their creation will help shed light on the origins of life." The full paper is available in the Journal of the American Chemical Society (behind a paywall).

Comment Re:I really don't get the point of this... (Score 1) 416

the iPad is an AWESOME device for textbooks; reading about dinosaurs and having animations, or being able to have interactive parts, is incredibly cool.

Well, yes... except in this case the animation and interactivity parts would all be lies, because nobody has ever seen a dinosaur move, much less interacted with one.

For that matter, no one has seen a dinosaur in any form other than fossilized skeletal remains. All the illustration and sculpture we see featuring dinosaurs with flesh on them is also fiction. Even some of the assembled skeletons have turned out to be wrong. It's fiction based on our best reasoned guesses about how the animals probably looked, but fiction nonetheless.

If such fiction is well-reasoned, then it is not without value.

Comment Re:Year o' the Linux Tablet (Score 1) 86

I'm kind of surprised nobody's mentioned this aspect, because it seems sort of obvious to me.

HP can make a *lot* of money selling, for lack of a better term, "open" tablets.

I am a bit skeptical about that... I mean, tinkerers can be very enthusiastic about a product that's fun to hack on - but there aren't necessarily enough of them to make it worth manufacturing a tablet for them...

Comment Re:Webos is beyond repair (Score 1) 86

The whole project lost its focus: Palm used to be a neat piece of software for a PDA. It was not bloated with a file system, with Flash, and with all this other junk that has become the primary focus of Apple.

I think it's worth bearing in mind that their OS design was a great fit for the kind of devices they could produce in the late 90s. These days, the capabilities of a handheld machine are a lot better, so what would have been "bloat" thirteen years ago is now pretty reasonable in terms of the functionality provided vs. the resources consumed.

Democrats

Meet the Strange Bedfellows Who Could Stop SOPA 231

jfruhlinger writes "In a political environment that's become very strongly defined by partisan lines, the SOPA debate has offered an unexpected ray of hope: the two main Congressional opponents of the bill are Ron Wyden, an Oregon Senator deemed a 'hardcore liberal' and Darrell Issa, a California Representative who is one of the Obama Administration's fiercest critics. (There are both Ds and Rs in favor of the bill, too.)" (Read more below.)

Comment Re:Great idea! (Score 1) 938

Many cellphone bans carve out an exception for hands-free use, but I've always felt that's not because hands-free is actually safer - it's because a ban on hands-free calls would be impossible to enforce. (If someone can back up or refute my assumption, mod him up.)

Either that, or someone argued that hands-free devices made cell calls in the car safer, so they could sell more hands-free calling devices. :)

Comment Re:existing tools and suggestion (Score 1) 276

The use cases, options, and interfaces are different for searching programming language source files, XML files, and other text. ... Trying to force them into a single command line tool makes little sense to me.

You make a good point, but I think a tool that forces the different data types into a single mould could still be useful, even if it can't provide all the functionality that a specialized tool would.

Comment Re:Perl to wa chigau no da yo! Perl to wa! (Score 1) 276

Right, but what is there not already a parser for in CPAN? And if you are handy with perl, what kind of comparison is difficult?

I couldn't say, honestly. :) So write a Perl script that recognizes the input file types, chooses the correct module, implements some kind of matching-rule syntax, performs the comparison with the chosen module, and adds a plugin system so people can support more file types without modifying your script - and yes, you've pretty much got bgrep.

I think at that point you're beyond "Perl's capabilities" and into the realm of "capabilities of things you can implement in Perl".
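That dispatch-plus-plugins design fits in a few dozen lines. Here is a minimal sketch in Python rather than Perl - every name and the toy parsers are hypothetical illustrations, not bgrep's actual interface:

```python
# Hypothetical sketch (none of these names come from the real bgrep): a
# registry maps file extensions to parser plug-ins, and one generic search
# routine walks whatever parse tree the chosen plug-in produces.
PARSERS = {}

def parser_for(*extensions):
    """Plug-in hook: register a parser function for some file extensions."""
    def register(func):
        for ext in extensions:
            PARSERS[ext] = func
        return func
    return register

@parser_for(".csv")
def parse_csv(text):
    return [row.split(",") for row in text.splitlines()]

@parser_for(".ini")
def parse_ini(text):
    pairs = {}
    for line in text.splitlines():
        if "=" in line:
            key, _, value = line.partition("=")
            pairs[key.strip()] = value.strip()
    return pairs

def structured_search(filename, text, predicate):
    """Dispatch on file type, then collect the leaf values matching predicate."""
    tree = PARSERS[filename[filename.rfind("."):]](text)

    def walk(node):
        if isinstance(node, dict):
            for child in node.values():
                yield from walk(child)
        elif isinstance(node, list):
            for child in node:
                yield from walk(child)
        else:
            yield node

    return [leaf for leaf in walk(tree) if predicate(leaf)]

print(structured_search("conf.ini", "name = bgrep\nmode = fast", lambda v: "b" in v))
# -> ['bgrep']
```

New file types plug in by registering another parser; the search logic never changes - which is exactly the plugin property the parent post describes.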

Comment Re:Terrible idea (Score 2) 276

This violates so many rules of the Unix philosophy that I don't even know where to begin...

I'll take this on. It's a subject that is of particular interest to me.

First of all, you have to consider whether it even matters that a tool violates "rules" of the "Unix philosophy". I mean, seriously, why assume that some system design ideas cooked up 30-40 years ago are necessarily the One True Path? Because "those who do not understand Unix are condemned to reinvent it, poorly"? What if the designers in question do understand Unix? Or what if <gasp> they actually have some ideas that surpass those of Doug McIlroy, ESR, K&R, and so on?

Second, how does one account for tools like Perl? By many accounts it is one of the greatest Unix tools ever created. By combining the functionality and syntax of several useful tools, incorporating a rich regexp syntax, and binding it together with a general-purpose programming language, it can be a very versatile and effective tool. But it runs afoul of various "rules" as well: (I will use a star to mark the rules I don't particularly agree with)

  • "Write programs that do one thing and do it well" (Doug McIlroy's summary of the philosophy, first clause)
  • "Clarity is better than cleverness" (ESR, second rule* - I think there are times when it's worth having a compact notation with a steep learning curve.)
  • "Design programs to be connected to other programs." (ESR, third rule - I would argue that Perl encompasses as much functionality as it can to avoid having to connect to other programs - to avoid outside dependencies, to eliminate the problem of communicating with other processes, and to stabilize and simplify the interface to that functionality.)
  • "Design for simplicity: add complexity only where you must" (ESR, fifth rule... Though it could be argued that this is exactly how the design of Perl evolved.)
  • "Programmer time is expensive; conserve it in preference to machine time." (ESR 13th rule - Perl runs afoul of this if you accept the idea that Perl code is particularly hard to maintain. A language with a clearer syntax would, presumably, conserve programmer time.)
  • "Use shell scripts to increase leverage and portability." (Gancarz, 7th rule. I would argue that Perl scripting exists largely as a way to avoid solving problems in the shell language.)

Perl's biggest "violation", which it shares with other scripting languages, is that first one: "do one thing and do it well." Perl, Python, etc. are perfectly capable of doing a fork/exec or popen or loading a .so or whatever - but generally, if there's a piece of functionality that people want to have in those languages, they re-implement it as a native library for those languages.

Why do we accept so blatant a violation of what may rightly be considered the Unix philosophy? Because it works. It's useful. So a better question is: why is violating such an important "rule" apparently necessary to create such a useful tool?

There are various reasons. First, any reliance on an outside program is a maintenance issue. If your script is written for GNU find, for instance, and you move it to a system that has some other implementation of find, it may not work. Things can change from revision to revision as well. Second, it actually makes it easier to access the functionality, since you don't have to deal with writing out a stream of values and/or reading back a stream of results - when you call a Perl module, everything is neatly packaged into a (usually) synchronous call/result function interface, and presented as native Perl data.

Perl could be a contentious example - but I chose it because to me, it and other scripting languages are examples of people bypassing the shell environment rather than augmenting it. I would go so far as to say that it's a symptom of a failure of the Unix shell environment. Much of that failure has to do with these two facets of the "Unix philosophy", which are at odds with each other:

  • "Write programs to work together" (McIlroy, second clause)
  • "Write programs to handle text streams, because that is a universal interface." (McIlroy, third clause*)

There is nothing intrinsic about text streams that make them a "universal interface". They are only "universal" in the sense that we already have tools in place to deal with them. And yet, look at all the file formats that aren't based on some kind of text stream... Media files, for instance.

Building all our tools around the assumption that we can't make assumptions about the format of our data means that it's harder than it ought to be to "write programs to work together". Scripts may have to incorporate multiple parsing/serializing steps (and the tools need to provide functionality to support this). Alternatively, people can bypass the obstacles that stand in the way of coupling different programs together with pipes and line-processing text tools by ignoring McIlroy's first clause: go the Perl route, and use a tool that incorporates all the functionality into one environment so you don't need to make programs work together.

Grep has issues with data blocks as well. "With regular expressions, you don't really have the ability to extract things that are nested arbitrarily deep," Weaver said.

If your data structures are so complex that diff/grep won't cut it, they should probably be massaged into XML, in which case you can use XSLT off the shelf. It's already customizable to whatever data format you're working with.

Massaging your data is a problem at least as complex as bgrep/bdiff's proposed method of using plug-ins to generate parse trees and operating on parsed data. The only difference is that the representation for the parsed data (process-native data structures in RAM vs. XML text being passed over a pipe) is less efficient.
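The nesting limitation Weaver describes is easy to demonstrate: a regex built from character classes stops at the first closing delimiter, while a ten-line counting scan recovers the whole nested group. A minimal Python illustration (toy input, hypothetical helper name):

```python
import re

text = "f(g(x, h(y)), z)"

# A classic regex stops at the first ')': the nested group is truncated.
shallow = re.search(r"\(([^)]*)\)", text).group(0)
print(shallow)  # -> (g(x, h(y)

def balanced(s, start):
    """Return the substring from the '(' at `start` through its matching ')'."""
    depth = 0
    for i in range(start, len(s)):
        if s[i] == "(":
            depth += 1
        elif s[i] == ")":
            depth -= 1
            if depth == 0:
                return s[start:i + 1]
    raise ValueError("unbalanced input")

# A short depth-counting scan recovers the whole group, however deeply nested.
deep = balanced(text, text.index("("))
print(deep)  # -> (g(x, h(y)), z)
```

This is exactly the gap a parse-tree-based tool fills: the "plugin" supplies the balancing logic once, instead of every user fighting the regex.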

With [operational data in block-like data structures], a tool such as diff "can be too low-level," Weaver said. "Diff doesn't really pay attention to the structure of the language you are trying to tell differences between." He has seen cases where diff reports that 10 changes have been made to a file, when in fact only two changes have been made, and the remaining data has simply been shifted around.

No, 10 changes have been made.

That's kind of a narrow view. The aim here is to find the relevant differences in the file, which is often a more useful piece of information (for the user or for the computer) than a byte-level diff result. There are bound to be cases where you really do need to see the exact difference in the byte sequences of two files - but there are also cases where it's useful to get more contextually-relevant information.
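The "shifted data" effect is easy to reproduce with Python's difflib, a line-oriented differ. In this toy example only one line is genuinely new, but the plain diff flags five:

```python
import difflib

# Toy "files": only "epsilon" is genuinely new in the second version; the
# other lines have merely been shifted around.
old = ["alpha", "beta", "gamma", "delta"]
new = ["gamma", "delta", "alpha", "beta", "epsilon"]

# A plain line diff flags every displaced line as an addition or removal.
changed = [line for line in difflib.unified_diff(old, new, lineterm="")
           if line[:1] in "+-" and line[:3] not in ("+++", "---")]
print(len(changed))  # -> 5

# A comparison that understands the data as an unordered collection reports
# only the one real change.
print(sorted(set(new) - set(old)))  # -> ['epsilon']
```

Whether "5 changed lines" or "1 new item" is the right answer depends entirely on what the user is asking - which is the point: both views are legitimate, and a structure-aware tool lets you choose.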
