
Security Software Conflicts with AJAX?

ithyus needs help with the following: "My employer is running an e-commerce site that, until recently, our customers were quite happy to use. With increased traffic to the site we decided to implement AJAX to try to reduce the load on our database servers. In doing so, our customers have experienced all kinds of problems with security/privacy software such as Norton and McAfee. It seems that no matter what we do we can't make these programs happy. Bigger companies such as Google have documented workarounds for some of them, but we wouldn't be able to keep our docs current with all the software that's presently out there. I'd really like to know how Slashdot's readers have handled these issues. Since security programs don't appear to be compatible with the emerging features of the Internet, do you simply suggest that the customer disable the offending software, or do you opt to offer some support for the more popular ones? Are those really the only two options? How do you justify your method?"
  • In general, you want to work around users' environments, not the other way around. That means if something you're using isn't compatible with a large number of your visitors' systems, you either work around it or use something else. Then your justification becomes "because it works". If you tell everyone to uninstall their antivirus software to use your site, I suspect you'll lose a substantial number of visitors.

    If you're resorting to using AJAX only to ameliorate your DB load, you may wish to try more conservative methods that will work on all client machines, such as optimizing your queries, first.
    • If you're resorting to using AJAX only to ameliorate your DB load, you may wish to try more conservative methods that will work on all client machines, such as optimizing your queries, first.

      Unless you can predict the future, you can't "optimize queries" to only get the data the user is going to want. Random crap advice of "optimize your queries" is worthless and unhelpful.

      Getting data on demand is how the web is going to work. I'd suggest you stick to the standards, and the users will complain to the anti-virus vendors.
      • > Unless you can predict the future, you can't "optimize queries"
        Do you even know what he meant by "optimize queries"?

        The actual data retrieved by a client is pretty much irrelevant. It mostly doesn't matter which data the user is going to want. It only matters how they get the data. No (sane) web site allows visitors to run custom SQL queries, so there are a limited number of queries that are performed. Each of those can be examined to determine whether or not it is being done in the most efficient way possible.
        • Please don't quote me out of context.

          What I said was:

          Unless you can predict the future, you can't "optimize queries" to only get the data the user is going to want.

          Very often a website doesn't present all the data that the user could request. Fetching the data on demand can result in database improvements that "optimized queries" can't touch, because much of the time you don't have to fetch the data at all.

          And yes, I do know what it means, I do it for a living.
  • by conJunk ( 779958 ) on Monday June 05, 2006 @07:26PM (#15476432)
    ummm.... You answered your own question, didn't you? The link to Google's "workarounds" seems to be the answer. These aren't workarounds; they're specific steps for authorizing your site with the AV software. Just make your own document similar to Google's and direct customers to it.
  • by cjsnell ( 5825 )
    Explain to me how creating more HTTP requests by using AJAX is going to decrease your database load.

    Oh, while we're at it, explain how sheep's bladders may be employed to prevent earthquakes.
    • I don't use ajax, but I'm guessing the logic goes somewhat along the lines of this: A page refresh to update a single control might require 5 database queries in order to completely populate the page's controls, whilst an ajax request to update a single control requires 1.
    • Yes, it's very cool. AJAX prepopulates all data required for all possible actions on the page, saving round trips to the server. The fact that the user will only perform one of those N actions, and all the other data is thrown away, is a minor detail.

      Those AJAX-bashers don't seem to understand that web servers don't scale horizontally, whereas database servers can be scaled by throwing more cheap boxes in the data center. And we live in a strange world where delivering a single huge page is faster than serving many small ones.
      • That's basically wrong.
        It's easy to scale web servers by adding more of them, with a load-balancer (such as a Cisco CSS, and/or a software layer like mod_jk) to distribute requests appropriately.
        It's hard to scale a database, especially one that gets updated, because distributed writable databases are hard to do, and even distributed read-only replicas get tricky, especially if they must be up to date.
        In any web services application you really want to avoid hitting the database if you can, by using simple data caching.
        • It's funny how what I said was pretty much the inverse of what is correct, huh?
        • Maybe he doesn't have to add a distributed database; add a database server to host the authorization system tables on one machine, maybe another to serve the read-mostly tables, and if they need it, a third for the read-write workhorse tables like orders and inventory. That way he could even pick and choose the database servers individually based on their particular strengths and weaknesses. There have been times when I've done that by accident, which makes bug-bashing interesting.

          Maybe the real question should
      • Yes, it's very cool. AJAX prepopulates all data required for all possible actions on the page, saving round trips to the server. The fact that the user will only perform one of those N actions, and all the other data is thrown away, is a minor detail.

        Does everyone really wear goatees in your dimension?
      • Parent, I wish I could mod you "Funny/Informative Satire". For those people who can't reverse your satire:

        It is relatively difficult to add more database servers. It is relatively easy to add more webservers. Any caching structure you could possibly use in AJAX you could definitely use on your webserver - and many more, because you have power over those servers.

        You MIGHT use AJAX to reduce the load on your webserver in some cases, but if it reduces load on your DB server, you didn't have caching set up right in the first place.
      • database servers can be scaled by throwing more cheap boxes in the data center.


        Sorry, it's just that when you sai... HAHAHAHHAHA!

        Umm, yeah, what was I saying? Oh yeah: databases don't scale horizontally, and Oracle clusters are a pain in the ass. You could always do some sort of caching to unload the db, but AJAX is good for improving perceived interactivity.

    • Re:Eh? (Score:5, Informative)

      by Bogtha ( 906264 ) on Monday June 05, 2006 @08:40PM (#15476774)

      Explain to me how creating more HTTP requests by using AJAX is going to decrease your database load.

      Simple. Let's take Slashdot moderation as an example. Last time I saw it, it included a drop-down for each comment, and the ability to submit your moderation for all comments. When the form is submitted, the user-agent transmits the moderation status for each comment to the server, and reloads the entire page. This entails not only wasting bandwidth (by transmitting all comment statuses instead of only those that have been altered), but also a high cost because even if you only moderate one comment, a page with potentially hundreds of comments has to be sent back to you.

      A moderation system that uses Ajax to submit moderations, on the other hand, sends only one status for only one comment, and doesn't have to reload the page with hundreds of comments, because all it needs is a simple success or failure flag in return. Thus, if you moderate five comments, you might make five requests, but those requests are tiny compared with the single massive request that the non-Ajax version needs to make.

      In the more general case, it may very well be that some database queries simply don't need to be made in most cases, but do in a minority of cases upon certain user interactions. In these cases, without Ajax, you are stuck performing the queries preemptively for all users, instead of only in the minority of cases where it is needed.

      Thinking "more HTTP queries == worse performance" is an incredibly superficial analysis and neglects many important factors.
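To make the bandwidth point concrete, here is a rough JavaScript sketch of the two approaches. The data shapes and field names are invented for illustration, not taken from Slashdot's actual code.

```javascript
// Classic form submit: the body carries a score for EVERY comment on the page.
function classicPostBody(moderations) {
  var parts = [];
  for (var cid in moderations) {
    parts.push("cid_" + cid + "=" + encodeURIComponent(moderations[cid]));
  }
  return parts.join("&");
}

// Ajax-style submit: one tiny request body per comment whose score changed.
function ajaxPostBodies(original, edited) {
  var bodies = [];
  for (var cid in edited) {
    if (edited[cid] !== original[cid]) {
      bodies.push("cid=" + cid + "&score=" + encodeURIComponent(edited[cid]));
    }
  }
  return bodies;
}
```

With a page of hundreds of comments and one changed score, the classic body grows with the page while the Ajax version stays a few dozen bytes, and no full page comes back in response.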

      • by Mr Z ( 6791 )

        Thank you. At the same time, background refreshes when the user isn't paying attention, such as GMail or the Yahoo Beta do, could generate many queries that the user will never notice but the DB has to endure. There needs to be a way to prioritize direct UI interactions over background activity, and/or optimize the server for these background refreshes. Random idea: you could have the server generate a parallel disposable mini-database of "things that have changed since the currently active sessions last asked".

        • Re:Eh? (Score:3, Informative)

          by GoRK ( 10018 )
          The cost of reloading some or all of a webpage (beyond the obvious bandwidth needed) depends a great deal on the way the application is coded.

          In GMail, for instance, it would make sense for open sessions not to query the user's mailbox each time the requestor wants a list of (new) messages, but to move to an approach where a new message being delivered to the mailbox notifies the session that there is a new message. I really see no reason that GMail should be querying for the list of messages more than once per session.
          • by Mr Z ( 6791 )

            Disclaimer: I am not a database expert. :-)

            This is why I mentioned the "session cache" idea. Once the server farm knows a session is active, any transactions that come in that modify the state the session might be interested in can go to a separate "things that happened since you last asked" database. That database can be really small, and discarded when convenient. The cost of discarding it is that you need to do a full query on the main database.

            With this sort of approach, you can stick to a classic request/response model most of the time.
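A very rough sketch of that "things that happened since you last asked" structure, with all names invented; a real implementation would live server-side and be keyed per topic or per session.

```javascript
// Disposable change log: writes are mirrored here, and an active session
// polls only this log instead of re-querying the main database. If the log
// is discarded, the fallback is one full query against the main database.
function makeChangeLog() {
  var changes = [];
  return {
    record: function (change) { changes.push(change); },   // on each write
    since: function (cursor) { return changes.slice(cursor); }, // cheap poll
    cursor: function () { return changes.length; }         // "where am I?"
  };
}
```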

            • by GoRK ( 10018 )
              Well, you can sort of fake a push model with AJAX in that you can have the server start sending data that is processed as it comes in 'live', without having to continually request updates. This is often the type of thing that is done (and is favorable) with file upload progress bars, chat applications, and the like. The session cache is the right idea, yes, except generally it doesn't really have to be a full database. Depending on the application, data may be copied to a database table stored in RAM or a separate lightweight store.
      • In the case of Slashdot moderation, I think you're forgetting something. Working through a page of comments, reading each one, and assessing whether the existing moderation seems fair and accurate takes a significant amount of time. In that time it is entirely likely that others have also moderated the same page.

        I'm sure that all moderators will have been caught at least once by seeing a particular comment end up with an inappropriate score as a result. Conscientious moderators will probably then decide that a refresh is needed before moderating further.
        • 1. If we were going to do /. moderation in AJAX, we might change how it worked. Instead of modding an existing score up or down, you'd vote for what you think the true score is. Since this wouldn't be dependent on the existing score, you wouldn't need to refresh anything.

          2. The time when you want to reload the page isn't *after* you've moderated, it's before. I.e., refreshing the page after you moderate only helps you for the next moderation, not the current moderation. AJAX could offer you a button that refreshes the scores in place.
      • What happens to the piled-up requests if you get a connectivity loss or something? Discarded?
      • Why not just use some JavaScript and only submit changed values? That means a max of 5 key/value pairs per page, at least for most users who don't use lynx or w3m.
      • but also a high cost because even if you only moderate one comment, a page with potentially hundreds of comments has to be sent back to you.

        Which shouldn't actually hit the database: maintain an LRU cache of all stories with a size of around 50 and an expiry of about 1 minute. Each box that holds this cache loads a list of comments once per story per minute at most. Run four or five instances of the cache and point the webservers at it. Presto! No slammed database.
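A minimal sketch of such an expiring cache in JavaScript; the size and TTL come from the comment above, and the function names are made up. The injectable clock is only there to make the behaviour easy to test.

```javascript
// Tiny cache with a max size and a time-to-live. Entries older than ttlMs
// are treated as misses; the oldest key is evicted when the cache is full.
function makeTtlCache(maxEntries, ttlMs, now) {
  now = now || function () { return Date.now(); };
  var entries = {};   // key -> { value, stamp }
  var order = [];     // insertion order, for crude oldest-first eviction
  return {
    get: function (key) {
      var e = entries[key];
      if (!e || now() - e.stamp > ttlMs) return null; // miss or expired
      return e.value;
    },
    put: function (key, value) {
      if (!(key in entries)) order.push(key);
      entries[key] = { value: value, stamp: now() };
      while (order.length > maxEntries) {             // evict the oldest
        delete entries[order.shift()];
      }
    }
  };
}

// Per the comment: makeTtlCache(50, 60 * 1000) keyed by story id, with the
// webservers pointed at a few instances of it instead of at the database.
```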

    • How about this:

      A shopping cart application where everything but the price and inventory levels are cached out to the web servers as static content. Then a small piece of javascript goes and pulls the inventory and the price and feeds them to the controls on the page.

      Before AJAX, the entire page would be dynamically generated, meaning the database would have to supply the product description and all other content on the page. Now that information can reside in the database, but the app server first looks to the cache.
      • Many web-based applications already do something better; my employer owns at least one.

        The basic technique is to create a derived database access class(es) that include simple caching logic. The whole app then uses that derived class for DB access.

        When the cache-aware class(es) are used, the developer indicates whether or not cached data is desired, and optionally what the specific timeout should be. When a call to this "caching DB class" is made, the SQL that would be sent to the database is instead first checked against the cache.
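A hypothetical sketch of that caching DB wrapper, using the SQL string itself as the cache key. The names and signature are invented, not the poster's actual code.

```javascript
// Wrap a raw query function with opt-in, per-call caching. Callers say
// whether cached rows are acceptable and may override the timeout.
function makeCachingDb(runQuery, defaultTtlMs) {
  var cache = {}; // SQL string -> { rows, stamp }
  return {
    query: function (sql, useCache, ttlMs) {
      ttlMs = ttlMs || defaultTtlMs;
      if (useCache) {
        var hit = cache[sql];
        if (hit && Date.now() - hit.stamp < ttlMs) return hit.rows;
      }
      var rows = runQuery(sql); // only reaches the real database on a miss
      cache[sql] = { rows: rows, stamp: Date.now() };
      return rows;
    }
  };
}
```

Because the whole app goes through the derived class, hot read-mostly queries stop hitting the database at all once they are warm.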

        • Sounds like a very cool implementation. The primary driver for us was applications that on average don't see very much usage, but on certain days can see that usage spike by up to 1000-2000% (think registration-type stuff).

          So the site tends to be fairly static by the time of the crush, but we need to serve as many pages as possible in a matter of 5-10 minutes. For us, static html with only a few of the pages making any db calls at all allowed us to handle the crush and scale amazingly well.

          The diversity of
  • No way out (Score:1, Redundant)

    by unity100 ( 970058 ) *
    First, Ajax is not going to reduce your database load if you do not use it like a cache on the remote client PC for already-performed query results. And even in that case, if you use query caching (in MySQL, for example), the most common queries will already have been cached, so they won't incur as much load; they will be pulled out of the cache without an extra query. Even if you go for the "use it as a cache to store the content on the remote client PC" route, then there is the matter of uploading lots of content up front.
    • Java is already something that the anti-virus guys and people are wary about; it is widely exploited to plant a plethora of stuff on visitors' PCs.

      I'm somewhat arguing with this point, but also legitimately curious. There are really only a few very insignificant examples of Java-applet-based viruses, all of which use an outdated JVM. Your statements make it sound otherwise. Do you have proof of "widely exploited" holes in the Java sandbox? I doubt it, but I'd appreciate being proven wrong.

      For stand
      • I think he meant JavaScript.
      • No, there is no exploit in the technical meaning as we know it, i.e. the usage of holes and bugs to accomplish something. I used the word in its dictionary meaning.

        I was referring to the planting of trojans, viruses, etc. on clients' PCs through the use of legitimate commands and routines, as on many porn sites: 5 popup windows open, they spawn another 10 popup windows and redirect to another 5, and even though you're loaded with patches, ZoneAlarm, Kaspersky anti-virus, closed DCOM, and so on, you get infected.
      • Yeah, this is correct; I meant JavaScript. Late was the hour of my posting.
  • by Anonymous Coward
    I believe that before AJAX is fully deployed, HTTP will need to be rewritten to allow much more streamlined updating. I would also have a look at Norton and other virus detectors. I use AVG, but asking your clients to use other software isn't feasible. Is it possible to go back to the old way? If it isn't broken, I wouldn't fix it. I have read some topics from message board software deploying AJAX techniques, and some large forums turn off many of the AJAX features because they just aren't resource-savvy.
  • by Siergen ( 607001 ) on Monday June 05, 2006 @08:01PM (#15476616)
    Over the years I have had 3 on-line merchants ask me to disable or uninstall my network security software to access their site. I immediately stopped shopping at each of them. They were not selling anything worth the risk of being connected to the Internet with no protection, and I doubt that you are selling anything worth that risk either...
    • On the other side, obviously your "network security software" was always correctly written and configured, by people who understand the subtleties of how all the common network protocols work?
    • Some customers are MORE vulnerable because they use McAfee or Norton. They either don't realize that their subscription has expired, they expect protection from obvious emailed viruses that slip past the scanner, or the scanner itself introduces a critical flaw in the system, as Symantec's often does. Not a year goes by without a critical Symantec security software flaw.

      I'm not saying that telling a customer to uninstall or disable Norton is the right way, but there are worse things you could do.
  • You know... (Score:5, Insightful)

    by NeoThermic ( 732100 ) on Monday June 05, 2006 @08:12PM (#15476657) Homepage Journal
    There is a quick and simple answer to this one. Detect whether the AJAX is not working (or let the user specify that they do not wish to use AJAX), and send a document that achieves the same end results as the AJAX version, but (and wait for it) without AJAX.

    If you are designing programs that can potentially be used by many thousands of users, you cannot afford to write programs that only cater to those who are willing to play by *your* rules. A good few of them will refuse and use other software.

    • Yeah, I do this. While it adds a small level of complexity and a tiny extra hit at the initial page load, you can save the status of that in the session and send pages accordingly.
  • by spyrral ( 162842 ) on Monday June 05, 2006 @08:20PM (#15476689) Journal
    As a web developer who tries to make good use of Ajax-style techniques, I find this very troubling.

    I'm always seeing articles about AJAX security issues, and they always puzzle me. AJAX is just another way of sending HTTP requests to the server from the browser. If you're able to write secure server-side scripts already, then you should have no trouble writing AJAX responders. How do these security apps decide that these particular HTTP requests from the browser are "bad"?
    • I'd guess most likely they are completely disabling ActiveX in IE.

      IE only supports AJAX through ActiveX (this is changing in version 7, but that won't be widely deployed for a while).

    • You said it yourself:

      AJAX is just another way of sending http requests to the server from the browser.

      Unfortunately this alternate method makes use of a couple of potentially harmful mechanisms: client-side scripting and (in IE) ActiveX. Additionally, it is not uncommon to see AJAX requests and responders bend or break the rules of HTTP as well, which can cause packet-inspecting firewalls some grief. Sure, you can easily code up a little shopping site where you click 'show price' to load the price via AJAX, but
      • You're telling me that IE's XHR violates the HTTP protocol? I'm shocked, sir!
      • Can you explain this:

        Additionally, it is not uncommon to see AJAX requests and responders bend or break the rules of HTTP also which can cause packet-inspecting firewalls some grief.

        I can see XMLHttpRequest breaking HTML data (as in using XML parsed/rendered through JavaScript as opposed to preformatted HTML), but HTTP? How?

        The same headers are sent as though you're doing a regular GET by typing a URL in the browser; therefore sessions, cookies, authentication/authorization, etc. remain the same.
          The same headers are sent as though you're doing a regular GET by typing a URL in the browser...

          That all depends on how the application is written, doesn't it? I can hook into Apache and make it violate any part of HTTP that I want to with little effort. As hard as it might be to believe that programmers sometimes take shortcuts, when coding these thin AJAX responders, some authors do actually neglect to send proper headers or any headers at all! Shockingly, some even neglect to send XML entirely!

          The XMLHt
          • That all depends on how the application is written, doesn't it?

            Not really. The web app developer can only assume that AJAX HTTP requests are following the rules, and as such, security on the server side is important, as it should be regardless of what issued the HTTP request in the first place.

            Shockingly, some even neglect to send XML entirely!

            XML isn't really a requirement; you could use JSON, or even preformatted HTML if you'd like, but you'd lose the built-in DOM parser that most JavaScript interpreters provide.
  • by Bogtha ( 906264 ) on Monday June 05, 2006 @08:30PM (#15476742)

    The kinds of things security software disables should be non-essential anyway. For instance, disabling ActiveX in Internet Explorer will stop you from using XMLHttpRequest, but that throws an error that you can catch, so your fallback behaviour for non-JavaScript users can be used.
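A minimal sketch of that catch-and-fall-back pattern; the function name is invented.

```javascript
// Try to construct an XMLHttpRequest; in IE6 that means ActiveX, which
// security software often disables. Catching the error lets the page fall
// back to its non-JavaScript behaviour instead of breaking.
function createXhrOrNull() {
  try {
    if (typeof XMLHttpRequest !== "undefined") return new XMLHttpRequest();
    return new ActiveXObject("Microsoft.XMLHTTP"); // the IE6 spelling
  } catch (e) {
    return null; // ActiveX disabled: use the plain fallback path
  }
}

// Usage sketch (in a browser):
// var xhr = createXhrOrNull();
// if (!xhr) { form.submit(); return; } // graceful degradation to a form post
```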

    Whenever I see somebody complaining about software interference with web applications, it's virtually always because they've cut corners and neglected to code appropriate fallback behaviour when browsers don't support a particular feature. Unfortunately, it's impossible to give you specific advice because you've unhelpfully neglected to mention anything specific at all about the problems you are having.

    As somebody else mentioned, if your goal is to reduce load on your databases, then this can be achieved through other means. For instance, caching (both page fragments, and HTTP caching) can significantly reduce load if most of your transactions are reads that apply to multiple users.

    • For all we know the guy has a fallback in place. He just wants to know how to avoid having to use it. That's the thing about fallbacks, they're a fall. Back. Best only for worst case scenarios, and a good designer wants to plan for those scenarios, but also try to avoid them.
  • by Fencepost ( 107992 ) on Tuesday June 06, 2006 @12:47AM (#15477658) Journal
    I suggest they replace it with something else. Almost anything else, really.

    Norton is the only thing I've ever seen decide that outbound DNS queries were *all* suspicious and should be silently blocked.
  • I don't get it... (Score:3, Insightful)

    by Turmio ( 29215 ) on Tuesday June 06, 2006 @02:19AM (#15477848) Homepage
    ...why and how would "implementing AJAX" cause problems with your "security/privacy software such as Norton and McAfee"? Can you please explain? As far as I know, AJAX is just a friendly word for a technique where a web browser sends regular HTTP requests dynamically, based on actions the user performs through a web-based user interface, so that certain aspects of the user interface change according to the performed action without fully reloading the page. In what way would these kinds of HTTP requests/responses be distinct, from the point of view of "security/privacy software", from the requests the browser performs when initially loading the page and the resources needed to render it?

    And why do you think that "implementing AJAX" would magically reduce the load on your database servers? The load on a database server is purely dependent on the data you are storing and the queries you're doing in order to access it. Sure, you may be able to avoid certain queries hitting the database by firing the queries only after the user does something in the web interface. But on the other hand, this is no magic bullet. If you don't know what you're doing, you might find yourself in a situation where your web and database servers are hammered with, say, a 10x increase in requests/queries.
    • by mdfst13 ( 664665 )
      If you click the Google link, it gives examples. E.g., Norton Internet Security's ad-blocking software blocks Google, and cookie-blocking software blocks the Google cookie that it uses for persistence. I didn't read through the rest, but it's all at the linked page if you want to know.

      In terms of AJAX reducing DB queries, it works as follows. In traditional web apps, you often have data that gets reloaded repeatedly. For example, a list of products. The user clicks on a
      • Even with "perfect" caching, in a traditional web app model, you still have to build the product list again (web server load), and you have to re-send all that data to the customer (network load). With a smart ajax application, when the user clicks back on the product list, they're shown the list immediately, without even a web request having to go out to your server.

        So even with the best possible server-side caching scenario set up, using AJAX and related techniques to implement client-side caching can be a big win on top of it.
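A toy sketch of that client-side caching, with invented names: the product list is kept in a page-level variable, so returning to it costs no request at all.

```javascript
var productListCache = null; // page-level, lives as long as the page does

// fetchList would do the Ajax round trip; render redraws the list control.
function showProductList(fetchList, render) {
  if (productListCache !== null) {
    render(productListCache); // no HTTP request, no DB query at all
    return false;             // served from the client-side cache
  }
  productListCache = fetchList(); // one round trip, first time only
  render(productListCache);
  return true;
}
```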
  • If getting ActiveX controls to work is a problem, you could fall back on loading your results by JavaScripting into an IFrame. Granted, it's much more difficult to implement "POST" handlers that way, but with the right library, most of those differences can be hidden away. The main thing you lose is the automatic XML parsing, but if you're using AJAX as an RPC mechanism, you should try using JSON instead. The JavaScript DOM implementation ain't the greatest.
  • My first thought was, "You use Microsoft." AJAX is HTML, CSS, JavaScript, XML, and XSLT. One can throw in a sprinkle of XMLHttpRequest; but by applying KISS, one will find XMLHttpRequest is, at best, redundant. The reader should at this point start to notice that this is all client-side handling; basically, the server spoon-feeds the client. If one ponders how to talk to the server, consider the POST and GET options for a <form ...> tag. If one wonders about those who would listen, consider the u
  • This comment will only be relevant if I guessed your intent from your problem definition.

    When you say you are using Ajax to lighten your DB load, it implies you might be using Ajax to request content from servers other than the source server your content comes from (cross-site scripting).

    If that is the case, you can certainly expect your clients' antivirus systems to (rightfully) give you a headache.

    You simply should not be doing that, and until something like JSONRequest (proposed fo
  • If instead of going for full-blown AJAX you just do some simple dynamic HTML tricks, like hiding/unhiding various sections of the page, you can in many cases get very responsive pages without going back to the server so often. For example, going back to the list of items from the specific item detail can simply be a matter of flipping the style to hidden and back on two different page elements. Ditto for browsing hierarchies (folder/subfolder, etc.): you can make a static HTML list of the fully unfolded hierarchy and toggle visibility.
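The hide/unhide trick can be sketched in a few lines; the element handles are passed in, and the ids in the usage comment are hypothetical.

```javascript
// Flip the detail view in and out of sight with no server round trip:
// both elements are already in the page, only their visibility changes.
function showDetail(listEl, detailEl) {
  listEl.style.display = "none";    // hide the item list
  detailEl.style.display = "block"; // reveal the already-loaded detail
}

function showList(listEl, detailEl) {
  detailEl.style.display = "none";
  listEl.style.display = "block";
}

// In a page, something like:
// showDetail(document.getElementById("items"),
//            document.getElementById("item-42"));
```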
  • How does AJAX reduce DB load?

    Enlighten us.
  • At my work, we've chosen the XForms route, using Chiba, for a recent product. It's not that surprising that I use it, because I was one of the editors of the 1.0 spec a long time ago, but it has progressed even in my absence ;-) and it does fit many of the needs people describe here, in particular security and accessibility.

    We write our dynamic markup in XHTML+XForms, following W3C standards (including nascent accessibility standards), and then use Chiba server-side in Tomcat to translate it into HTML4 and JavaScript.
  • I don't think many of the people who commented here have actually used AJAX. One person's complaint was "Explain to me how creating more HTTP requests by using AJAX is going to decrease your database load." The answer is two-part: 1) It decreases overall load and bandwidth used by reducing the number of full pages sent; not having to re-render a page significantly reduces the number of queries required. 2) It reduces database load in particular by running a greater number of low-load/low-complexity queries instead of a few heavy ones.
