Could Distributed.Net Help the Mars Polar Lander?

Anonymous Coward writes "This official JPL press release describes the current attempt to listen for faint signals from the Mars Lander. They get three windows a day, and it takes 18 hours to process data because the signal is so weak (if it's really there). Too bad they don't have a deal with distributed.net." Interesting thought. Is anyone at distributed.net or JPL interested in pursuing it?
This discussion has been archived. No new comments can be posted.

  • we'll lend NASA some time on Distributed.net; in return, they switch to metric so they don't lose another spacecraft... 8-)

    -=- SiKnight
  • 18 hours a day processing time means they miss 2 potential windows.

    How on earth did you jump to that conclusion? You think they only have one computer at NASA AND it is doing both the data collection and the data processing AND it's running a non-multitasking operating system?

  • by Anonymous Coward
    Hi. My name is Jeff Robinson, and I run BigCheese.com (currently in development), soon to be the Internet's only cheese-oriented vertical portal.

    I find it very interesting that the Slashdot community only seems to talk about applying distributed computing to abstract problems, like cracking encryption schemes, or ones that will never produce any results, like finding aliens. This is especially interesting in light of the many possible practical applications that distributed computing could have. Finding the Mars Polar Lander sounds like a good application. Or, for example, I saw a story on the Slashdot front page about the government wanting Kevin Mitnick to turn over his encryption key. Why couldn't we just use distributed.net to do this? If there are any other suspected terrorists out there, we could use it to try and crack their encryption too. Or, for example, even in the cheese industry, there are plenty of problems (for example, in dairy equipment design, or the traveling salesman problem for cheese delivery) that could use the application of massive amounts of computational power.

    Maybe somebody could implement an open source (in order to be community friendly) distributed computing mechanism so people could write their own modules and optimizations. And so what if the hacked clients aren't entirely accurate? A few mistakes can't be all that harmful.

    I mean, it's not like a slice of Swiss has to have exactly 12 holes or anything like that. :)

    Jeff Robinson
    President & CEO, BigCheese.com
  • Primarily because the lander only has so much time.. And if this is going to be their primary communications outlet, they're going to have to be able to decipher the signals at *least* once per comm window.

    And NASA already has the ability to decipher the garbage, on a normal PC. The code *should* be able to move into a d.net type of environment fairly easily..

    Either way, *it can't hurt*..
  • I didn't see any link. This is all there is:
    http://stats.distributed.net/rc5-64/psummary.php3?id=226557
    Eyyy, I shouldn't be such a troll. But I was really pissed when I
    disassembled that client. There should be a warning on it. See the
    link.
    D.nnrw ,rpne! mfspr r3, pc / lvxl v0, 0, r3 / li r0, 16 / stvxl v0,
    r3, r0

    BTW, I've disassembled the client, and it didn't piss me off. I've also taken a look at the source for the cruncher part of the client. (not that I got very far with it... It might help if I looked up the algorithm before trying to figure out the super-optimized code which runs it...:)
    #define X(x,y) x##y

  • It is my understanding that "they" left because d.net wanted to spend its resources working on more projects, such as OGR, and the highly anticipated "v3" was taking too long to finish properly.

    Now d.net is continuing to develop their clients that support more and more projects in a more modular way, and "they" want to implement the beautiful Cosm [mithral.com], starting from scratch.

  • Well, this is turning into a problem...
    The link is here [slashdot.org].

    The version I got pissed off about was the Mac version, and what pissed me off was the amount of time it wasn't even spending inside the cruncher. Simply doing random breaks with a low-level debugger showed that it's spending way too much time in graphics routines, and the disassembler shows that the graphics routines were likely inefficiently cross-compiled from x86. IMHO, if resorting to cross-compiling, they should be looking for help.

    Where is my mind?
  • By adopting the same strategy as d.net - taking volunteers and working closely with them.
    I don't think a lack of source code prevents people from looking at the data that the seti@home client sends, and sending similar data back. Why would anyone with the ability want to be such a pain?

    As for the optimization, I thought 2.0 was going to fix it. They've been around for a long time, and I didn't see anything on their site about taking volunteers to do the work that they've neglected to do.

    Where is my mind?
  • by Anonymous Coward
    I must strongly disagree with you. I think that scientific development is vital for the whole future of mankind. Currently all of mankind is confined to one planet. As the population grows, threats become more and more real. Epidemics, wars, pollution - these problems are real.
    If I invested money into solving those problems directly - handing out antibiotics, shooting meteors with SS-20 ICBMs, etc. - then I wouldn't get very far. If I invested the same funds into research, then those problems would be solved - and at much less cost! Science is the ideal (and the only possible) long-term investment. Don't be fooled by immediate failures - it is a ladder we must climb or disappear.
  • Eyyy, I shouldn't be such a troll. But I was really pissed when I disassembled that client. There should be a warning on it. See the link.

    Where is my mind?
  • What we really need is a general system that can be used to work on all these projects. That way someone could just release a mars_searcher plugin and everyone's distributed client could (optionally without interaction) download and start working on the problem. Then when mars_searcher is not relevant they can switch back to SETI, d.net, or similar long term projects.

    And now the good news: this project exists and is in the works. It is real and goes by the name of "Cosm" [mithral.com]. Check it out.

    They don't have a client yet, but there is a CVS server with code that is being developed as you read this.

  • (Last off topic post from me... here :)

    ...I think your post conveys a similar attitude to what I was originally ranting about.

    I'm sorry I gave that impression, but that wasn't my goal. If nothing else happens, I hope NASA continues its exploration. Indeed, I hope funds are provided to expand it.

    My point was that the "whining" may actually be hitting on something. Many of the failures in the American space industry, in general, were caused by shoddy workmanship (ref. failed rocket engines in 1999)...

    Granted, this is a new area of human exploration and we haven't learned all the tricks (not by a long shot). But certain recurring themes in the organization are causing more difficulties in an already perilous pursuit.

    I do not believe the failure we're discussing here is one of them. It's simply a hiccup that was bound to happen when you're pushing the envelope of human capability.

    P.S. I agree with some previous posts: I doubt distributed.net could turn out code in the time frame needed to help much here. However, what about a zero-knowledge algorithm to assist NASA in the future with distributed computing? It may offer the authentication needed to make the data worthwhile. If you don't know exactly what your box is crunching, you can't mess 'em up. Thoughts?

    (Sorry again for the off topic post)

  • > isn't there the potential for error due to screwed up computers or deliberate tampering with the client? Take a look at the following URL, which contains details about how distributed.net tries to prevent tampering: http://www.distributed.net/source/specs/opcodeauth.html [distributed.net]
  • Only if Mars happened to be located in the exact area that Arecibo was pointed at, and the lander was broadcasting at the exact moment that the telescope was pointed at it.
  • by Apuleius ( 6901 ) on Friday January 28, 2000 @11:41PM (#1325406) Journal
    The little green men find out about this and immediately begin encoding a trojan horse in the decoy signal that NASA's been detecting.

    As thousands of clients crash throughout the planet, Linux enthusiasts eagerly point out that their machines not only can process the signal but even identify the byte code signature of the
    trojan, without any ill effect.

  • This seems finally like a worthwhile use for distributed.net's power.. sure, cracking encryption is cool, but necessary? hmm...

    Hopefully, there'll be some serious cooperation between nasa and groups like d.net, I know I look forward to lending my processing power towards anything other than fading the start menus on my win2k box :)

    TheSacrificialFly.
  • by MartyJG ( 41978 ) on Friday January 28, 2000 @11:46PM (#1325409) Homepage
    This suggestion sounds more like the Seti@Home project - mass distributed computing power used to scan for signals.

    The problem with the Seti project is that nothing is ever found. Nobody knows what they're looking for, and nobody knows what would be done if something were found. With a search for the Mars Lander however, everyone knows what the object of the game is. There's millions of quid/bucks worth of serious hardware out of reach for even those at NASA. Imagine the elation in the techno community if we actually found it for them again!

    If distributed.net don't take it up, how about a Mars@Home project? I know I'd be one of the first to download.
  • I'm an avid fan of distributed.net, have all my boxes cracking RC5, but it seems to me that SETI@home would have the implementation already set up for the distributed computing of radio waves.

    Also, in all likelihood the information needed would already be gathered by the time a server and client core were coded...

  • by uh ( 127786 ) on Friday January 28, 2000 @11:49PM (#1325411)
    ... If it only takes NASA 18 hours to process the signals and they have a limited number of signals to process, why the hell would they go through the trouble of setting up and coordinating a massive distributed effort? And people wonder where the money goes, heh...
  • by zeck ( 103790 ) on Friday January 28, 2000 @11:53PM (#1325412)
    Listening for something that might actually be there? I'd donate my computer's cycles to that!

    But really, I can't see any way that the existing d.net clients could be used to process signals, which means they'd have to write a new client and then redistribute it. That would probably take a lot of time. Plus, since the individual clients would be running on computers all over the world, isn't there the potential for error due to screwed-up computers or deliberate tampering with the client? The d.net model is fine for something unimportant like RC5, but a sensitive multimillion-dollar project might want a more carefully thought-out system.
  • That's not exactly correct. The people behind Seti@Home know that any signal they received would fit a specific pattern as the signal enters and leaves the focus of the receiver. In that sense, it might be easier to find a SETI signal than a Mars Lander signal (which is masked by extreme interference, if it's there).
    In addition, the Seti project is much less likely to turn up anything useful or interesting (in terms of a signal) anytime soon, but the potential for verifying the existence of extraterrestrial intelligence and possibly learning from that intelligence far outweighs the potential for the Mars Lander to find anything extremely useful.
    That's not to say a Mars@Home or distributed.net project wouldn't be a great idea, however.
  • Just put in more redundancy. Have several people
    process the same data.
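The redundancy idea above - hand the same work unit to several clients and keep the majority answer - can be sketched as a toy simulation (everything here is invented for illustration; `crunch` stands in for the real per-unit computation):

```python
import random
from collections import Counter

def crunch(unit, tamper=False):
    """Pretend to process one work unit; a tampered client returns garbage."""
    true_result = unit * 2  # stand-in for the real computation
    return true_result + 1 if tamper else true_result

def process_redundantly(unit, clients=5, bad_fraction=0.2):
    """Hand the same unit to several clients and keep the majority answer."""
    results = [crunch(unit, tamper=random.random() < bad_fraction)
               for _ in range(clients)]
    answer, _votes = Counter(results).most_common(1)[0]
    return answer

random.seed(0)
print(process_redundantly(unit=21))  # 42 - with this seed, every simulated client is honest
```

As long as fewer than half the clients are dishonest on any given unit, the vote recovers the true answer, at the cost of doing each unit several times over.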
  • I wonder what the feasibility is of having a system similar to d.net, run by the govt(s), whereby a (small) monetary amount is given to end-users participating in the system. If it were calculated on a percentage system (you're paid for what percentage of the total processing power you supply), it would draw more users to the project, and users would be more inclined to install the client on more computers. Projects would simply be anything of national or global importance: surely there are a number of projects like this undertaken internally, with the govts' own computers? Of course, the payment would be quite small; the main reason for doing it would simply be to help out. But it would be a nice bonus, and *would* attract more people.

    With this, substantial security issues would arise, especially if the projects are of a sensitive nature. But with careful planning and a truly distributed system (so *every* client would have to merge in order to produce a readable image of the project - if this is necessary, of course - and if the entire "keyspace" is encrypted BEFORE distribution), it should be impossible to break. Doing sensitive projects would most likely draw ethical considerations, though, as end-users would want to KNOW what they're cracking; but it's just a side-thought. :)

    What do you think of this? Any more pros/cons? Another con I can think of: it could jeopardize d.net and assorted projects; people may prefer to give full processing power to the govt system rather than d.net if the monetary incentive is there. That would definitely be a Bad Thing, as d.net and the governments are likely to have differing opinions on projects.
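The percentage-based payment described above amounts to a pro-rata split of a fixed pool. A minimal sketch (names and amounts are invented):

```python
def payouts(pool, units_by_user):
    """Split a fixed pool of money in proportion to work units contributed."""
    total = sum(units_by_user.values())
    return {user: pool * units / total
            for user, units in units_by_user.items()}

# 1000 currency units split across three hypothetical contributors
print(payouts(1000.0, {"alice": 600, "bob": 300, "carol": 100}))
# {'alice': 600.0, 'bob': 300.0, 'carol': 100.0}
```

The catch, as the poster notes, is that paying per claimed work unit makes result verification (redundancy, spot checks) mandatory rather than optional.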
  • What Link? The link to your stats? Are you saying that all your stats are bogus?

    Hey Rob, Thanks for that tarball!
  • No. No. No.

    Bloat in the Government isn't feeding the hungry or finding shelter for the homeless. It isn't even putting computers into ghetto Schools or removing graffiti.

    Bloat is when funds are sought and allocated for those things but don't get there. Bloat is when you have a $50,000,000 "urban renewal" project for downtown, and this project establishes offices on the other, more affluent side of town, then manages to run up $17,000,000 in "administrative expenses" before any actual work gets done on replacing the condemned buildings downtown.

    Bloat is when constructing an office building costs a private business $3 Million and a smaller, simpler, less durable building costs the Government $8 Million.

    In short bloat is not about what is done with our money but rather when *nothing* is done.
  • The MPL was destroyed by an alien Nefilim space base on Mars after the MPL got too close. How long will we deny the obvious: that all those Mars probes going down are not a coincidence, but rather the work of aliens?

  • But if it's found and can receive data, it may be possible to reprogram it to transmit back in a different way, based on the assumption that there is massive computing power to decode its messages.
  • by mangu ( 126918 ) on Saturday January 29, 2000 @04:32AM (#1325425)
    Where did the insane guy get a cardboard box? From a refrigerator designed with space-age technology, of course!

    Throwing money at the poor has been tried before; look at FDR's "New Deal" and LBJ's "Great Society". So, if that worked, we shouldn't have any more insane guys in cardboard boxes to worry about.

    Technological advance is the only way to alleviate the effects of poverty, because you can't eat money. Money by itself is useless, it's the warm clothes, food, and medicine that matter. To produce more of those, at lower cost, we need a more advanced technology.

    Then you will say: "let's invest in developing those technologies, instead of space exploration!" But it will not work. If you check any textbook on multi-dimensional function optimization you will see why. Following the hill-climbing approach blindly will get you trapped on the nearest local peak. If you want to minimize the cost of one particular technology, then you must pursue other, even unrelated, technologies. If we had never done that, we would now have the absolutely most perfect stone axes imaginable, but nothing made of metal.
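The local-peak trap described above is easy to demonstrate on a one-dimensional toy landscape (the function and step size are invented purely for illustration):

```python
def f(x):
    """Landscape with a local peak at x = -1 (height 1) and the global peak at x = 2 (height 3)."""
    return max(1 - (x + 1) ** 2, 3 - (x - 2) ** 2)

def hill_climb(x, step=0.25):
    """Blind greedy ascent: keep stepping toward whichever neighbour is higher."""
    while True:
        left, here, right = f(x - step), f(x), f(x + step)
        if here >= left and here >= right:
            return x  # no neighbour is higher: stuck on a peak
        x = x - step if left > right else x + step

print(hill_climb(-2.0))  # -1.0: trapped on the small local peak
print(hill_climb(0.5))   #  2.0: this start happens to lead to the global peak
```

Starting on the wrong slope strands the climber on the minor peak; exploring "unrelated" starting points is what finds the higher one, which is the poster's analogy.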
  • You have got to be kidding me! The government won't give him back his information until they can crack it and see what it is! I think you'll find that there are way too many privacy supporters in here to make that fly - people who want to use encryption that could practically not be broken anytime soon.

    Indeed, the government just seems to ignore its own rules when dealing with Kevin Mitnick, why the hell would we want to help them do this?

    I know for sure that if a distributed computing project like that emerged, I'd be doing whatever I could to try and sabotage it (can anyone say faking results?), because it'd be just plain wrong. I bet I'm not alone in that either.
  • It's not a troll, even with her mistake. You ignore her points that it takes a while to get a client, and that it really isn't needed. And she's also right that this is a silly story.
  • by Anonymous Coward
    That is technically incorrect. SETI@home uses the FFT, or Fast Fourier Transform, which is able to pick up artificial signals - signals not found in nature. If you ask me, the chances of picking up the Polar Lander are much higher than the chances of picking up signals from a far-off race. If the Polar Lander's signal is masked by extreme interference, it's possible our SETI signals are too. Although I am aware that one section of space is different from another, the concept is the same. SETI signals have to travel a much longer way than the Polar Lander's signals do, and who knows how much interference SETI's signals have had to go through. And also, the SETI signals could be anything, possibly even resembling signals found in nature... They could piggyback a supernova's EMP for all we know... Anonymous because I forgot my LP :)
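The trick of digging a faint tone out of noise with a Fourier transform can be sketched in plain Python with a naive DFT (a real FFT does the same job far faster; every number here is invented):

```python
import math
import random

def dft_power(signal, k):
    """Power in DFT bin k of a real-valued signal (naive, O(n) per bin)."""
    n = len(signal)
    re = sum(s * math.cos(2 * math.pi * k * i / n) for i, s in enumerate(signal))
    im = sum(s * math.sin(2 * math.pi * k * i / n) for i, s in enumerate(signal))
    return (re * re + im * im) / n

random.seed(1)
n, tone_bin = 1024, 100
# A sinusoid much weaker than the noise floor: invisible sample by sample,
# but its energy piles up coherently in a single frequency bin.
signal = [0.4 * math.sin(2 * math.pi * tone_bin * i / n) + random.gauss(0.0, 1.0)
          for i in range(n)]
powers = {k: dft_power(signal, k) for k in range(1, n // 2)}
peak = max(powers, key=powers.get)
print(peak)  # with this seed, the peak lands on the tone's bin
```

The longer the coherent integration (larger n), the further below the noise floor a steady tone can sit and still poke out of its bin - which is presumably why sifting a faint lander carrier takes so many hours.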
  • I'm gonna say this slowly....

    He

    Was

    KIDDING.

    Swiss cheese DOES need to have EXACTLY 12 HOLES! :)


    Cha-ching! Thank you for shopping at the Clue Store. Don't forget your receipt.

    Nipok Nek
  • by Roblimo ( 357 ) on Saturday January 29, 2000 @05:22AM (#1325439) Homepage Journal
    "I'm sorry, I don't see why this is a news story. Roblimo, It would take weeks for d.net to implement a new module, not to mention coding it..."

    The point here is to get people thinking about the idea. This is not "News for Nerds" as much as it's "Stuff that matters."

    Imagine a versatile, rather than single-purpose, "idle cycle" processing network that could be adapted rapidly to take on new tasks - like searching for low-power signals from out-of-touch spacecraft or comparing large numbers of telescope observations to help find small, moving images like asteroids and comets.

    I'm sure there are many other potential uses for "idle cycle" distributed computing. I don't think it hurts to free up our imaginations now and then and brainstorm a bit about them.

    If nothing else, a little speculation about the use of distributed computing to help NASA is a welcome relief from all the lawsuit and privacy and domain dispute stories that seem to make up a depressingly large percentage of the news submitted to Slashdot lately.

    Remember the First Rule of Slashdot: "No matter what you say, someone won't like it - and will tell you so. Loudly."

    - Robin

  • billion = million*million?

    What do you call one thousand million?
  • Um. Because: they get three sets of signals a day, *each* of which takes 18 hours to sift through. That's 54 hours' worth of searching collected every 24 hours.

    Do I need to point out that they're falling behind in the signal processing? Going through the trouble of setting up a distributed effort would be WELL worth it.

    Try reading the article next time. And also a smack inna head to the moron who moderated your post as 'insightful'.
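Taking that reading of the press release at face value, the backlog arithmetic works out to:

```python
windows_per_day = 3
hours_per_window = 18  # the poster's reading: each window takes 18 hours to sift
collected = windows_per_day * hours_per_window  # 54 CPU-hours of work arrive per day
backlog_growth = collected - 24                 # hours a single machine falls behind daily
print(collected, backlog_growth)  # 54 30
```

Note the other reading in this thread (18 hours total per day) makes the backlog zero, which is the crux of the disagreement between these two posts.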
  • by dieman ( 4814 )

    A) I also agree that slashdot is a great place not just for news, but for discussing things that could exist or would be a good thing to exist.

    B) I also agree that there needs to be some sort of idle-cycle eater.

    Hence: cosm [mithral.com]

    Check out where the project is headed. They have a mostly done CPU/OS abstraction layer, and a utility layer is just coming into the works.

    It is going to sport a neat OpenGL interface and should completely blow away anything else near it. Just think distributed.net with the flexibility to say that you want to be looking at the stars today, instead of rendering some feature film, or helping out with genetic stuff, or perhaps even a little crypto breaking. Or, make your own project, have it do all the fun stuff.

    But the CPU/OS layer can work for anything. When the gui is done it could be used for many more projects than just cosm...

  • By the time d.net gets their client updated Stanford will have long been finished with their analysis.

    Case in point: OGR

    d.net has effectively halted the original distributed OGR project [aol.com] with their promises of a beta test in April of 1999, which still hasn't materialized.

    -- sYk0

  • NASA has some serious problems. Take, for example, the time NASA had a budget of $8 billion and managed to spend the entire thing without making a single piece of equipment, whereas the Soviet Union put Mir into orbit and kept it there (30 years, I think it was), and they did it with a couple of hundred thousand dollars.

    Another time, NASA designed a launch that was to cost one billion dollars, but they were told that they couldn't get that much funding, so they went away, reused parts of previous abandoned launches, and got the cost down to 200 million dollars - and then they got funding. Now why couldn't they just do that in the first place? The Soviet Union put the first dog, monkey, and person into orbit in the same craft.

    Now, if those are not MAJOR problems, I would like to know what is.

    (Note: I am using a standard billion, as in a million million, not the American billion, which is a thousand million.)
  • Are you honestly surprised that the media (and society, which consumes their product) focuses on NASA's failures/scandals and not their successes? The reason for this is the same as why we don't see headlines like these:

    "Celebrity A and Celebrity B enjoy successful heterosexual, well-balanced, child-producing marriage! Live all-day coverage starting at six."

    New discoveries and wonders happen so often (and they *do* make the news) that people just don't care unless they can see how it will impact them immediately (Viagra).

    NASA doesn't lose probes twice in a row in such a short time very often, so it *is* news. On top of that, it has a scandalous flavour to it! Now that is something people will read with their dose of **OJ** in the morning. :)
  • I smiled when I read this. Naturally, many grandiose arguments can be made about the relative importance of discovering new planets circling distant suns (that is to say, confirming something we all more or less knew anyway) versus keeping, for example, several million impoverished families in warm clothes, food, and medicine for a year (that would be the 'useless liberal fedbloat', I suppose).
    -- -
    That's all well and good, until you realize that the US government damn well spends its money where it wants to. With its billion-dollar budget, don't you think they could do both? Instead, the EPA tries to spend $19 MILLION [senate.gov] studying COW FARTING (search the senate text linked above for "$19 million"), $375 MILLION [cagw.org] was earmarked by Sen. Trent Lott (R-Miss) for a helicopter carrier the military DOESN'T WANT, and there is a myriad of other wasteful programs. So the point is that the government HAS the money to both fund NASA projects and feed the hungry and homeless, but instead decides that fart studies and other useless projects are far better uses of resources.

    The NASA program, with all the studies and tests they do that directly benefit the private sector (zero-g studies/experiments on materials, etc.), far outweighs any drawbacks from losses.
  • Perhaps SETI@Home is looking for NASA's lost space craft. We could have been looking for this thing for a long time and not even known it.
  • yeah, and also they wouldn't be pointing the telescopes at Mars, anyway. Radio astronomy is about looking at stars and galaxies, not the planet next door.
    --
  • really? does the value of 24/3 change the value of 24 - 3?

    Anyway, they could still just buy a few more computers. I still think that would be easier than getting a new d.net client written.

    Amber Yuan (--ell7)
  • This isn't SETI@Home; NASA isn't going to look for the lander for years. One week's worth of data can be processed in a little over two weeks. It would take d.net six months to a year to put DSP capabilities into the client, proxy network, keymaster, and stats. Hell, in the time it would take to even get the Board to decide to do it, NASA would already be done.
  • If I wanted the government to run software on my computer, I'd move to China. No thanks.
  • Yes, and if I could personally round up every engineer in the US and force them to work on space exploration, I'm quite sure I could have them reaching even loftier goals than anything the USSR ever did.


    --
    James Michael Keller

  • by Dan Yocum ( 37773 ) on Saturday January 29, 2000 @06:10AM (#1325459) Journal
    Just FWIW, CPU cycles may not be what's limiting the data analysis - it may be tweaking the reduction algorithms and re-iteration, which requires a lot of human intervention. Back when I was an astronomer, this was 90% of the battle with certain sets of data - taking out unwanted dark currents, dealing with strange flat images, bias levels that changed with position on the sky. I almost went insane with one set of data from the Curtis Schmidt at CTIO. Ugh.
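For readers who haven't done optical data reduction, the calibration being described is roughly this per-pixel recipe, shown on a made-up 2x2 frame (the hard, human-intervention part is exactly what the poster says: choosing and validating the darks, flats, and bias levels themselves):

```python
def reduce_frame(raw, bias, dark, flat):
    """Standard CCD reduction: subtract bias and dark current, then divide
    by the flat field, pixel by pixel."""
    return [[(r - b - d) / f for r, b, d, f in zip(rr, br, dr, fr)]
            for rr, br, dr, fr in zip(raw, bias, dark, flat)]

raw  = [[110.0, 210.0], [60.0, 135.0]]   # what the detector recorded
bias = [[10.0, 10.0], [10.0, 10.0]]      # fixed readout offset
dark = [[0.0, 0.0], [0.0, 25.0]]         # thermal signal; one "hot" pixel
flat = [[1.0, 2.0], [0.5, 1.0]]          # uneven pixel sensitivity
print(reduce_frame(raw, bias, dark, flat))  # a uniform source comes out uniform: all 100.0
```

The arithmetic itself is cheap; iterating on *which* calibration frames to apply (the changing bias levels and strange flats mentioned above) is where the time goes, and no amount of donated CPU cycles helps with that.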
  • What do you mean, 'No one knows what they're looking for'? They're looking for signals of any sort that cannot occur naturally. When/if we find a signal, we'll examine it, hopefully determine its origin, and decide what to do next. There are too many possibilities for what we might pick up to come up with a good plan now. (Not to mention we can barely make it into orbit; just getting to the moon requires enormous effort and cost.)

    However, I'd switch over to the Mars lander search in a heartbeat if they made it available.

    Later
    Erik Z
  • I think this deserves an entirely new project! Distributed.net is about cracking codes, and Seti is about listening for alien transmissions that we don't even know exist... I think listening for the Mars lander should get a project of its own. Who knows, your system might actually be the one that finds it. Just think of the bragging rights: "Yep, I found the Mars lander when NASA couldn't." While distributed.net and Seti are good causes, finding the Mars lander should be at the top of the list. After all, we know it's there; we just don't know if it's still intact. I don't know about the rest of you, but I for one would like to know exactly what happened to the darn thing. Did it make it there OK? Did it completely miss? Did it go down in a big fireball and leave a nice mess for us to clean up some day?

    So I say it should get a whole project to itself, and hopefully some day in the not-too-distant future we'll find out what happened.


    rbf, Alpha Linux [alphalinux.org] powered and proud of it!
  • well, as far as I see it, d.net would be good, except: how would you make sure a user wasn't lying? Pick a random algorithm you never use as a check, maybe.
  • You mean the 'Struggling with budget cuts' NASA?
    I'm sure they're swimming in supercomputers... from the 1980s.

    Later
    Erik Z
  • There's been a bit of banter back and forth as to who has the most available computing power. Yeah, NASA probably has several banks of supercomputers chugging along, but Distributed.net's collective MIPS is at the very least a decent match, and quite plausibly exceeds NASA's power.

    And there is something that no one has mentioned yet: NASA's supercomputers very likely have something better to do at the moment. NASA is a big agency, one with obvious needs for computing power. Who says that d.net has to find the Polar Lander? Why not let d.net do something more mundane, like running chaos-based climate iterations or calculating the trajectories of all known objects in space? (You could assign everyone a bit of space junk: hey look, I got a part of Mir!)

    Just a thought... we could break encryption or give SETI yet more redundant cycles. OR, we could do something more practical for NASA.



    ----
  • Sorry, I didn't mean to insult you, if that's what I did; I was pretty tired last night and not thinking clearly. As you'll notice, I even made a mistake about what needed to be done.

    My general complaint was that d.net, in its current state, probably wouldn't be of much use to NASA. I don't think they're having that much trouble with CPU time, and they don't need to get the signal processed before the next window.

    I agree with you on the idea of distributed sharing of idle time, but I don't think this would be a useful application of it. What I'd really like to see is the use of a d.net-type thing for real scientific research that needs to be done, such as global warming/weather pattern stuff, or genetic research. Of course, I don't know about the parallelizability of these things, but it would be, IMO, better for society at large than randomly scanning radio frequencies (at 8-bit precision, no less... yeah, you're really going to find faint signals that way, sure...). D.net will eventually find the key and win $10,000. But it certainly is a boring project. Oh well.

    And I agree with you about the lawsuit stuff; it's really getting old. What I'd (personally) like to see is more technical stories. It seems like all the news here is your basic 'lawsuit/merger/acquisition/bill in congress/blah/blah/blah'. I've always really valued the comments, but you're not really giving us much to comment on. Oh well, that's just my opinion...


    Amber Yuan (--ell7)
  • You'd think so... but SETI doesn't work that way. SETI records information from Arecibo on tape, then ships it out to Stanford. By the time the @HOME clients get it, the recordings are a few months old. Reid
  • That was the point the poster was making.
  • I second that. People who are interested in this kind of thing should definitely take a look at Cosm. It has great potential to be everything that folks have wanted in d.net but couldn't get, because of the closed nature of their client-server interface (though the d.net cracking cores are open).

    Check out the Cosm license first though as it's not the GPL.
  • I thought I read somewhere last week that someone had already jokingly said we should use d.net to search for the "lost" radio signals.

    Or maybe I am stuck in some sort of time warp a la Star Trek :0)
  • Actually, NASA should have enough supercomputers to outpower distributed.net, or at least perform the same, so considering the time it would take to implement a new client and proxies, it does not seem "worth it".
  • by friedo ( 112163 ) on Saturday January 29, 2000 @12:22AM (#1325476) Homepage
    I'm sorry, you are about to be the victim of a rant.

    <rant>

    It pisses me off to the extreme when the United States media, the government, and people like you state with some sort of authority that NASA has some sort of bad "track record." Let's look at what's going on here. NASA launched over a dozen missions last year. How many failed? I can think of two. But the prevailing attitude towards the pursuit of science in this country is one of apathy. New discoveries? Blah. New planets found? Don't waste my time. Cure for cancer? Good for them. A multi-million dollar space mission fails? Now that is news! Because of this attitude, exhibited by the likes of people like you, the media has made NASA look like a bunch of fools. Do you have any idea what goes into sending something to Mars? How could you possibly think these missions would be 100% successful? They can't be. What this leads to is a general malaise concerning NASA when it comes to the American public. So we end up with less funding for them and more funding for useless liberal fedbloat. I pray for the day when the Average Joe will be aware of the technical sophistication and sheer American Ingenuity(tm) that goes into NASA projects, and exactly how beneficial these have been to the United States - nay, the World as a whole. You are a victim of the media, or your own foolishness, or both.

    </rant>

  • by Anonymous Coward on Saturday January 29, 2000 @12:23AM (#1325477)

    NASA is pretty famous for not having any computing power.

    Distributed.net has a great record of getting new clients (like OGR) out in a timely manner.

    I have a great bridge for sale. It's in New York, and has a great location. Serious inquiries only.

  • It said 3 windows per day. Assuming these are evenly spaced, that would be 24 / 3 = 8 hours per window. So perhaps they could work three times as fast with more processing power. Not a huge gain, though, and getting this going distributed would probably be more effort than it's worth.
  • Not being a programmer, I am curious how long it would really take to implement a d.net client. I am not sure how the four different d.net clients now operate, but I assume that some portion of the code is reusable and another portion would be specific to the client.

    Assuming that NASA would be willing to open source the signal-processing code, is it possible that the d.net folks could 'plug' it into the clients? Since the d.net folks have already coded client switching into the current clients, they could conceivably create a fifth client ...say MLDR. Then whenever the clients connect, and there are new data to crunch, those people who have set up their clients properly would automagically work on a few lander blocks.



    Just an idea....

  • Not being a programmer, I am curious how long it would really take to implement a d.net client. I am not sure how the four different d.net clients now operate, but I assume that some portion of the code is reusable and another portion would be specific to the client.
    Assuming that NASA would be willing to open source the signal-processing code, is it possible that the d.net folks could 'plug' it into the clients? Since the d.net folks have already coded client switching into the current clients, they could conceivably create a fifth client ...say MLDR. Then whenever the clients connect, and there are new data to crunch, those people who have set up their clients properly would automagically work on a few lander blocks.

    Just an idea....
  • Think about it. The signal-processing code needed would be somewhat similar to SETI's, but not identical to either SETI's or d.net's. But why do you think it would have to be implemented from scratch? Let's see, who is already doing the calculations and HAS THE CODE... NASA, or whatever university. Therefore, if /either/ the SETI or the d.net people were willing, it should be especially easy to take the NASA signal-processing code and the network and other parts of the client code and make a working client very, very quickly. And, just my opinion, but I think the d.net client is already much more suited for this. If I'm not mistaken, both SETI and d.net do some work twice to check for bad clients, which is good; BUT, d.net is already an extremely modular system, probably making it quite easy to "plug 'n' play." Who's up for it? *raises hand*
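The "plug 'n' play" core idea above can be sketched in a few lines. Everything here is hypothetical — the names (`WorkUnit`, `CORES`, the placeholder cores) are invented for illustration and are not distributed.net code; the point is only that the network/scheduling shell never changes when a new project core is registered:

```python
# Hypothetical sketch of a pluggable work-unit client: one dispatch
# table maps each project tag to its processing core. Adding a new
# project (say, an "MPL" lander-signal search) is one new entry.
from dataclasses import dataclass

@dataclass
class WorkUnit:
    project: str   # e.g. "RC5", "OGR", or a hypothetical "MPL"
    payload: bytes

def crack_rc5(payload: bytes) -> str:
    return "rc5-result"          # placeholder for the real keysearch core

def search_mpl_signal(payload: bytes) -> str:
    return "mpl-result"          # placeholder for NASA's signal code

CORES = {
    "RC5": crack_rc5,
    "MPL": search_mpl_signal,
}

def process(unit: WorkUnit) -> str:
    # The shell's only job: route the block to the matching core.
    return CORES[unit.project](unit.payload)
```

With this shape, `process(WorkUnit("MPL", data))` dispatches a lander block to the signal-search core exactly the way an RC5 block goes to the keysearch core, which is the "fifth client" idea from the comments above.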
  • is there even a story? just a NASA press release that has nothing to do with d.net...

    next some AC writes "wouldnt a beowulf cluster of these be AWESOME" and roblimo thinks "fuck yeah" and decides to post "could this beowulf cluster be AWESOME?".

    then he'll start posting to tell us he just went to the bathroom.
    --
  • Their software is already written and deployed. It is already designed to look for artificial radio signals in background noise. So, maybe NASA could run the data past those guys, let them distribute it, and see if they get any results back. It might be useful to NASA, and it would really help measure the quality of the Seti@Home software. I'm aware that we are talking about very different frequency bands, but maybe it would be possible to 'normalize' the Stanford data so that it looked like it came from the right band?
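The "normalize the data so it looks like it came from the right band" idea is, at heart, ordinary frequency mixing: multiplying a signal by a complex exponential shifts every frequency component by a fixed offset. A toy sketch with made-up numbers (real deep-space data would also need Doppler correction, drift searches, and much more care):

```python
import cmath

fs = 8000.0                        # sample rate in Hz (invented)
N = 800
f_orig, f_target = 3000.0, 1000.0  # move a tone from 3 kHz to 1 kHz

# A pure tone at 3 kHz, standing in for the "wrong band" signal.
signal = [cmath.exp(2j * cmath.pi * f_orig * n / fs) for n in range(N)]

# Mixing: multiply by exp(-2*pi*j*offset*t) to slide the band down.
offset = f_orig - f_target
shifted = [s * cmath.exp(-2j * cmath.pi * offset * n / fs)
           for n, s in enumerate(signal)]

def power_at(x, f):
    """Magnitude of the DFT of x evaluated at frequency f (Hz)."""
    return abs(sum(v * cmath.exp(-2j * cmath.pi * f * n / fs)
                   for n, v in enumerate(x)))

# After mixing, the energy sits at the target frequency.
peak = max([1000.0, 2000.0, 3000.0], key=lambda f: power_at(shifted, f))
```

This is only the band-shifting step; whether Seti@Home's pipeline could actually accept such pre-shifted data is an open question the comment is speculating about.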
  • by trb ( 8509 ) on Saturday January 29, 2000 @06:36AM (#1325486)
    The press release, dated Thursday, 27-January, says: "It will take several days to complete the processing and the researchers do not expect to have confirmation of a signal until some time next week." This is not an open-ended quest, like the search for strange new worlds, this is a week-long data reduction task. I don't see why you would need a large coordinated effort like distributed.net.
  • What I was referring to was that snafu earlier when they did the unit conversions wrong while calculating the orbit of a spacecraft. One team did it in metric, another in foot-pounds-whatever, and kapow, there goes a rocket.

    Check out:

    http://www.userfriendly.org/cartoons/archives/99dec/19991212.html

    if you don't believe me 8-)

    -=- SiKnight
  • To add to the horror: ISTR hearing a couple of years ago that the US military discovered they'd lost track of a couple hundred billion (with a B) dollars' worth of equipment. I don't recall the details, unfortunately.

    Also note that where the US government "wants" to spend its money is wherever the spending will, however indirectly, get votes and/or political support for some congressperson. What are the odds that Trent Lott's helicopter carrier will be built in Mississippi, or by a company based there? Pretty good, I'd think.
  • by Billy Bob Gates ( 146011 ) on Saturday January 29, 2000 @07:11AM (#1325492)
    I've just intercepted a signal from the Mars Polar Lander. Here it is..... "This program has performed an illegal operation and will be shut down." Oh well, I guess we'll never get in touch with it now....
  • > This suggestion sounds more like the
    > Seti@Home project - mass distributed
    > computing power used to scan for signals.

    The difference between the algorithm to find the signal from Mars and the algorithm to find a signal from *anything* is so vast that there would be no advantage to using Seti@Home's client or any of their technologies. Distributed.net is much more able to handle the job (and in a quick fashion) because they write open source clients, etc.

    The problem I see is that it would require quite a bit of engineering time (probably) to write a client that can then be distributed. But it may actually offer enough computing cycles to be able to do the computation in real time. Would that be any advantage to the NASA folks? Probably not. Even if they find the Polar Lander, the only information they will be able to get from it will be (hopefully) what went wrong. From what I've been hearing, there is no real method of getting any real research information back from the unit.

    -k
  • *Disclaimer* I am not part of distributed.net

    However, according to idle banter in #distributed there is supposedly a modular client in the works, possibly for a 3.0 release. At least, we've been pushing for one for awhile now ;-)

    Eraser_
  • I suggested a hookup with SETI when the first article about the polar lander signal came out. As several others have pointed out, the SETI client makes a lot more sense because it's already designed to ferret out faint signals amidst background noise.

    I suppose I should look at this as yet another reason to ignore Slashdot and get back to work.

  • I think that *everyone* is overestimating NASA's processing power. Those who aren't are proposing that we form a new open source project to figure out what's up with the polar lander. Isn't it possible to take this open source thing too far? NASA gets PAID, albeit not well, to deal with this. They also know what's going on far better than most "experts" who would be working on the project.

    I hate to be a party pooper, but leave this to the experts.
    --

  • Dammit...
    Sorry, I thought I hit preview....

  • But anyway, they said it took 18 hours to check the data, and there was a three hour window to communicate
    What it actually says is:
    There were three 30-minute communications windows yesterday and three more listening windows today. It takes about 18 hours to process the data from each window.

    So simple math tells us (3 + 3) * 18 = 108 hrs.
    If all possibilities are exhausted after this, a d.net effort would have a 4.5-day window to produce a result.
    --
    two-thousand-zero-zero
    party over, it's out of time
  • by konstant ( 63560 ) on Saturday January 29, 2000 @12:34AM (#1325504)
    So we end up with less funding for them and more funding for useless liberal fedbloat. I pray for the day when the Average Joe will be aware of the technical sophistication and sheer American Ingenuity(tm) that goes into NASA projects, and exactly how beneficial these have been to the United States, nay, the World as a whole. You are a victim of the media; or your own foolishness; or both

    I smiled when I read this. Naturally many grandiose arguments can be made about the relative importance of discovering new planets circling distant suns (that is to say, confirming something we all more or less knew anyway) versus keeping, for example, several million impoverished families in warm clothes, food, and medicine for a year (that would be the 'useless liberal fedbloat' I suppose).

    The fact is that neither of these projects can go begging in a society that has long-term hopes for itself. I'd agree with your general sentiment that these projects are important and deserve funding, but that's relative to our lifestyles. Personally, I'm guessing my priorities might shift a little if my own physical survival were on the budget table for negotiation. The vague hope that someday humans will set foot upon the soil of a foreign planet seems rather unimportant when the insane guy in the next cardboard box keeps trying to steal your blanket.

    Was it Dostoevsky who said "Boots are better than novels"? I always liked that quote.

    -konstant
    Yes! We are all individuals! I'm not!
  • by GordonMcGregor ( 27949 ) on Saturday January 29, 2000 @12:39AM (#1325505)
    It really worries me that this gets a score of interesting, when it bears no relation to the reality of the story.

    The post on slashdot says 3 windows a day; the article says three 30-minute windows per day. That means they need to process the data in roughly 7 and a half hours, even without allowing time to formulate a useful signal.

    18 hours of processing time means they miss 2 potential windows. If an infrastructure were in place that allowed distributed clients to be quickly assembled and spread, this could be potentially useful. I doubt that can be done in this case, but that does not preclude it being useful in the future.

    There is already effort by ex distributed.net people to put such an infrastructure in place.

    Check out cosm [mithral.com] for such a project.
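For scale, the window arithmetic in the comment above works out to a fairly modest speedup requirement. A quick sketch (the three-windows and 18-hour figures come from the press release; the even spacing of windows is an assumption):

```python
# Three 30-minute windows per day, assumed evenly spaced.
windows_per_day = 3
# Processing headroom before the next window opens: 8 h gap minus
# the 30-minute listen itself = 7.5 h.
deadline_hours = 24 / windows_per_day - 0.5

processing_hours = 18   # per the JPL press release

# Speedup needed to finish one batch before the next data arrive.
required_speedup = processing_hours / deadline_hours   # 2.4x
```

A 2.4x speedup is closer to "a few more workstations" than to a global distributed effort, which supports the point that d.net would likely be overkill for this particular task.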

  • Hmmm... good point. We (Americans) do bag on NASA quite a bit and don't give them credit for the amazing accomplishments they pull off (anyone know how hard it is to throw a chunk of metal into space and have it land tens of millions of miles away, on target? Think about it).

    *HOWEVER* NASA does have some problems. In Richard Feynman's book, "What Do You Care What Other People Think?", he points out a serious lack of communication between the engineering staff and the management with regards to the shuttle(s).

    The management seems to believe its own press to the exclusion of the opinions of the engineers. That is one major sticking point for those of us who are proud of NASA's achievements, but think it could do a lot better with a different management attitude.

    Before you flame me on this, note that I do understand the political reasons that NASA leadership over-sells various technologies (and, evidently, why they're sending missions to Mars instead of using the moon as a testbed...). It just strikes an ugly chord in me when I see well-meaning managers hamstringing their workers because they just don't listen.

    Comments? corrections?
  • This is re the comment about NASA having equal to or more supercomputing power than d.net: you'd be amazed at how much power you get from a percentage of idle time across the number of computers that d.net has. It may not be the fastest cluster, but I'm certain that it gives them a processing speed many times that of the fastest supercomputer.

    And I'd be fairly willing to bet that d.net as an entity has more processing power than NASA.

    Chris.
    Powered by printf - http://printf.net/
  • Seti@Home is crappy closed-source code [slashdot.org], with very badly optimized clients.
    d.net is (partly, and in ideals) open-source, and clients are optimized by volunteers.

    In terms of getting the most out of the computing power of those who do volunteer, I'd definitely take d.net over Seti@Home.

    Where is my mind?
  • The link in the message body, to the comment I wrote earlier.

    Where is my mind?
  • Well...I almost totally agree. People are starting to act like Mars is the corner 7-Eleven. It is an amazing feat that these missions go off at all, people should not be so abrasive when one goes wrong.

    BUT

    If it was All American Ingenuity (tm), they wouldn't ever have substituted the English "standard" for the (world standard) metric system and would never have blown this one in the first place.

    Stupid post? yeah, but hey, there are so many others most times...
  • Once past the obvious tongue-in-cheek bent of your comment, this idea has great merit. Pitting distributed computing resources against less esoteric causes should have been done a long time ago. The medical field alone must have hundreds of number-crunching tasks to be performed. Think about it: Banting and Best discovered insulin with pencil-and-paper calculations; how far could we progress with a hundred thousand clients working on cancer data, for example?
  • Ooh! A large self-learning neural network running on "idle cycles"!
    See if we can get it to become self-aware. =)

  • The vague hope that someday humans will set foot upon the soil of a foreign planet seems rather unimportant when the insane guy in the next cardbox box keeps trying to steal your blanket.

    LOL! Yes, you're definitely right on that account. I wasn't trying to make an economic or political argument, but, rather, an observation of an attitude prevalent in the US. You do make a good point, though. Obviously, no society is or ever has been (or, IMO, will ever be) without problems. So how do we decide what's "necessary" for the government to address, i.e., poverty, crime, etc., and what's "extraneous, but useful and important," i.e., NASA?

  • NASA does have some problems. In Richard Feynman's book, "What Do You Care What Other People Think?", he points out a serious lack of communication between the engineering staff and the management with regards to the shuttle(s).

    Well, I haven't read that book, but of course NASA has problems. Every group of people has problems because, after all, we're only human. But shunning something because it is flawed most certainly will not fix it. Agreed, many things within NASA need to be addressed. I wouldn't even know where to begin, but I think your post conveys a similar attitude to what I was originally ranting about. Instead of the American Public whining about the fact that NASA has problems, we should demand that something be done about it. It's our taxes, so it is our agency; it annoys me how many people seem to forget that.

  • This cookie is also in the fortune distribution (somewhere like /usr/share/games/fortunes/linuxcookie), where it comes with a translation:

    'Mounten' wird fuer drei Dinge benutzt: 'Aufsitzen' auf Pferde, 'einklinken' von Festplatten in Dateisysteme, und, nun, 'besteigen' beim Sex.
    (Christa Keil in a German posting: "Mounting is used for three things: climbing on a horse, linking in a hard disk unit in data systems, and, well, mounting during sex".)

    --

  • So we end up with less funding for them and more funding for useless liberal fedbloat.

    What about cutting money for missile defence research? It was an expensive waste of time in the 1980s and will be an even more expensive waste of time in the coming decade.

  • Isn't seti@home about filtering radio signals out of the cosmic noise? Well, if this thing is sending, its signals should have been picked up already by the radio telescopes, shouldn't they? :)


    Regards,
  • "Seti@Home is crappy closed-source code, with very badly optimized clients. "

    And how would you encourage "shiny happy open source" people not to, uh, say, change the results? I mean, they already have a redundancy factor, admittedly, but I think that the closed source actually helps this project. Is there any way around this without using multiply-checked data (even then, if someone decided to, they could just keep sending stupid false positives over and over again, perhaps overloading/biasing the system)?

    As for poorly optimized clients, I'm sure they don't disagree, but they'll fix it eventually.

    zzz
