

AJAX Applications vs Server Load?
Squink asks: "I've got the fun job of having to recode a medium-sized (500-1000 users) community site from the ground up. For this project, gratuitous use of XMLHttpRequest appears to be in order. However, with all of the hyperbole surrounding AJAX, I've not been able to find any useful information regarding server load [Apache + MySQL] when using some of the more useful AJAX applications, such as autocomplete. Is this really a non-issue, or are people neglecting to discuss this for fear of popping the Web 2.0 bubble?"
Cache, cache and CACHE (Score:5, Informative)
Re:Cache, cache and CACHE (Score:4, Informative)
Astounding Conclusions (Score:2)
I'm sorry, what?
Re:Astounding Conclusions (Score:1, Funny)
Re:Astounding Conclusions (Score:2)
isn't it just so common that the most common things are the most common ones used
anyway, i suggest you forget about php in the first place. ajax will do much better if you have persistence on the server side. so if you're old-school, go java; if you're new and ambitious, try python or the dotnet thingy (i'm just emotionally not able to use the latter, but i would recommend the first two).
simplest example? 2 users type in E with 2 seconds difference; java and python applications can cache this in their own memory
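The idea above, sketched in JavaScript purely for illustration (the poster means a persistent Java or Python process; queryDatabase and the TTL are invented):

    // Hypothetical in-process cache shared across requests in a persistent
    // server process -- the thing PHP's share-nothing model lacks a natural
    // home for.  Maps prefix -> { results, storedAt }.
    var prefixCache = {};
    var TTL_MS = 60 * 1000;   // refresh cached prefixes every minute

    function lookup(prefix, queryDatabase) {
        var hit = prefixCache[prefix];
        if (hit && (new Date().getTime() - hit.storedAt) < TTL_MS) {
            return hit.results;            // second user typing "E" lands here
        }
        var results = queryDatabase(prefix);   // the expensive path
        prefixCache[prefix] = { results: results, storedAt: new Date().getTime() };
        return results;
    }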
bad advice: try BENCHMARK BENCHMARK BENCHMARK (Score:2, Informative)
STEP 1: develop a set of benchmarks.
STEP 2: adjust something
STEP 3: see if it improves your benchmarks. If not, roll it back. REPEAT STEP 2
How can you possibly improve your app if you can't even tell when you've improved it? PHP accelerators may or may not help (actually I would recommend AVOIDING PHP because of the difficulty in dealing with persistent compiled PHP code). MySQL query cache may or may not help either.
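A minimal sketch of step 1 on the client side, assuming nothing beyond XMLHttpRequest (the URL and sample count are placeholders; a real benchmark would also drive load with ab, wget, or similar):

    // Time n sequential requests to one endpoint and report the average.
    function benchmark(url, n, done) {
        var total = 0, i = 0;
        function one() {
            var started = new Date().getTime();
            var sep = url.indexOf("?") >= 0 ? "&" : "?";
            var xhr = new XMLHttpRequest();
            xhr.open("GET", url + sep + "nocache=" + started, true);
            xhr.onreadystatechange = function () {
                if (xhr.readyState === 4) {
                    total += new Date().getTime() - started;
                    if (++i < n) one(); else done(total / n);
                }
            };
            xhr.send(null);
        }
        one();
    }

    benchmark("/search?q=test", 50, function (avgMs) {
        alert("average response: " + avgMs + " ms");
    });

Re-run it after every change from step 2; if the number goes up, roll the change back.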
Re: (Score:3, Informative)
not bad advice (Score:2)
This experience holds valid for most web applications (AJAX or not), as anyone who has worked on any large web application can attest. Creative use of caching has been shown time and again to be the most effective way to reduce server load. (For some reason, spitting out a byte array is faster than calling a database and building a document with the results.)
I'm curious how "you can use benchmarks to *identify* which parts of the app are the slowest". This could be done by
Re: (Score:2)
Re:not bad advice (Score:2)
As always, it depends (Score:5, Interesting)
Other AJAX concepts actually make things faster. I've been implementing a forum that never reloads. When you write an entry and press the submit button, an XmlHTTP request is sent containing the new post and the id of the last received post. The reply contains all new posts, which are then appended to the innerHTML of the content div-tag. Less CPU time is spent regenerating virtually identical pages over and over, and less data is sent over the network.
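A rough sketch of that mechanism (the endpoint, field names, and HTML-fragment reply format are assumptions, not the poster's actual code):

    // On submit: send the new post plus the id of the last post we already
    // have; the server replies with every newer post as an HTML fragment.
    function submitPost(text, lastId) {
        var xhr = new XMLHttpRequest();
        xhr.open("POST", "/forum/post", true);
        xhr.setRequestHeader("Content-Type", "application/x-www-form-urlencoded");
        xhr.onreadystatechange = function () {
            if (xhr.readyState === 4 && xhr.status === 200) {
                // Append only the new posts; the rest of the page is untouched.
                document.getElementById("content").innerHTML += xhr.responseText;
            }
        };
        xhr.send("body=" + encodeURIComponent(text) + "&last=" + lastId);
    }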
Re:As always, it depends (Score:3, Funny)
I've been implementing a forum that never reloads.
I have one of those. I also have a server admin that never responds. Not good.
Re:As always, it depends (Score:1)
Build a dictionary of commonly-searched-for terms; those are the only ones which appear in the autocomplete. Cache the list, and change the AJAX call so it references a static list (basically a two-layer alphanumeric hashing structure).
The only problem is that all the lame-o ajax frameworks cropping up don't offer that type of flexibility without being considered bloatware. I did finally manage to implement it us
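The static-list trick might look roughly like this (the /ac/ file layout is my guess at the "two-layer alphanumeric" structure):

    // Instead of querying the database, fetch a pre-generated static file
    // keyed by the first two characters of the input, e.g. /ac/c/a.txt for
    // "ca...".  Apache serves it like any other file, so the database never
    // sees the request.
    function fetchSuggestions(term, callback) {
        if (term.length < 2) return;
        var a = term.charAt(0).toLowerCase();
        var b = term.charAt(1).toLowerCase();
        var xhr = new XMLHttpRequest();
        xhr.open("GET", "/ac/" + a + "/" + b + ".txt", true);
        xhr.onreadystatechange = function () {
            if (xhr.readyState === 4 && xhr.status === 200) {
                callback(xhr.responseText.split("\n"));  // one term per line
            }
        };
        xhr.send(null);
    }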
Appropriate AJAX? (Score:2)
Quite interesting, but I'd have my reservations about an all-AJAX forum. IMHO forums shouldn't break the REST behaviour of the traditional web model in very many places. I want the navigation buttons to work, and I want to be able to bookmark the URL and feel confident that when I visit that URL it will contain the posting that was there before. Yes, there are ways around this in AJAX, but to fix what it inherently breaks takes more effort than I think should be necessary.
Depends (Score:5, Insightful)
There isn't any useful information out there because it all depends on what you are doing.
Take a typical web application for filling in forms. One part of the form requires you to pick from a list of things, but the list depends on something you entered elsewhere in the form. In this instance, you might put the choice on a subsequent page. That's one extra page load, and needless complication for the end user. Or, you can save the form state, let them pick from the list, and bring them back to the form. That's two extra page views and saving state. Or, you can use AJAX, and populate the list dynamically once the necessary information has been filled in. That's no extra page views, but a (usually smaller) JSON or XML load.
In this instance, using AJAX will usually reduce server load. On the other hand, something like Google Suggest will probably increase server load. Without knowing your application and its common use patterns, it's impossible to say. Even the exact same feature can vary between two applications: autocomplete can reduce server load when it reduces the overall number of searches, but that depends on the types of searches people are doing, how often they make mistakes, how usual it is for people to search for the same thing, and so on.
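The dependent-list case from the parent might be wired up like this (the URL, element ids, and JSON shape are all invented; eval() stands in for JSON parsing, as was typical before JSON.parse was widespread):

    // When the "state" field changes, fetch just the matching options and
    // rebuild the "city" select -- no extra page view, no saved form state.
    function onStateChange(stateCode) {
        var xhr = new XMLHttpRequest();
        xhr.open("GET", "/cities?state=" + encodeURIComponent(stateCode), true);
        xhr.onreadystatechange = function () {
            if (xhr.readyState !== 4 || xhr.status !== 200) return;
            var cities = eval("(" + xhr.responseText + ")"); // e.g. ["Akron", ...]
            var select = document.getElementById("city");
            select.options.length = 0;                       // clear old options
            for (var i = 0; i < cities.length; i++) {
                select.options[i] = new Option(cities[i], cities[i]);
            }
        };
        xhr.send(null);
    }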
Delay the autocomplete (Score:5, Informative)
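Delaying means not firing a request on every keystroke: wait until typing pauses, then ask once. A minimal sketch (the 300 ms figure is arbitrary, and fetchSuggestions/showSuggestions are hypothetical helpers):

    // A fast typist generates one request instead of ten.
    var pending = null;
    document.getElementById("search").onkeyup = function () {
        var term = this.value;
        if (pending) clearTimeout(pending);
        pending = setTimeout(function () {
            fetchSuggestions(term, showSuggestions);
        }, 300);
    };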
Re:Delay the autocomplete (Score:1)
Really... (Score:2, Interesting)
For this project, gratuitous use of XMLHttpRequest appears to be in order.
All this hyperbole surrounding AJAX is just that - hyperbole. I dunno exactly what your requirements are, but the first thing you can do to ensure that AJAX XML requests don't bog down your system is to decide whether you really need all that fancy AJAX stuff. It's neat stuff, but the majority of web apps can still be done the conventional way, without wasting time on AJAX code.
Re:Really... (Score:2)
It is futile to claim that there is no important difference between Google Maps and the mapping web applications that preceded it, because the difference is screamingly obvious.
Re:Really... (Score:2, Insightful)
For every website/application you can name that was made fantastic by the use of AJAX, it is possible to list at least ten that didn't need it and only have it to try to cash in on the latest fad.
The GP's point was a valid one: it is important that people sit down and work out whether AJAX will actually benefit their application.
Despite all the crap being spouted about AJAX, it is NOT some magic wand that works for every given situation.
Re:Really... (Score:1, Troll)
Prove you are not engaging in your own hyperbole. Name ten web applications that are using AJAX that don't need to.
I love the smell of stagnation in the morning!
Re:Really... (Score:1)
'Prove you are not engaging in your own hyperbole. Name ten web applications that are using AJAX that don't need to.'
You made my task easier by using the word 'need': since we had the following kinds of services before the tools of AJAX (specifically the first A) became widely used, we can assume that the following don't *need* AJAX:
I'll leave it at that because your response (if there is one) will no doubt be 'oh but I meant name ten applications that use AJAX and it doesn
Re:Really... (Score:1)
Re:Really... (Score:2)
So it is reasonable to conclude from this list that the rule "sit down and decide whether your application needs AJAX" would have forbidden AJAX to Gmail, Google maps, and Digg.
What a lossy rule.
Re:Really... (Score:1)
Not really. Of the three examples, Google Maps is the only one that uses AJAX in a manner that provides major benefits over a traditional implementation.
My point remains that developers should be asking themselves why they need to use AJAX and whether it will provide any real benefit. Often the answer is not only no, but also that an AJAX implementation might actually detract from the application (be it in performance or user experience).
As a developer, arguing that this question isn't important because
Re:Really... (Score:2)
Or they should just create the application in the "legacy" way, then check if there is any area of the current application that could use AJAX (or other advanced Javascript technique) to improve usability, comfort or response time (user-wise) and layer it on top of the existing and working application.
This is the principle behind the Progressive Enhancement philosophy, and it allows your application to work fine in any and every context, be it your local nerd's text browser, your mom's Internet Explorer, y
Re:Really... (Score:5, Insightful)
Of the three examples, Google Maps is the only one that uses AJAX in a manner that provides major benefits over a traditional implementation
Now hold on. Gmail is another perfect example of how AJAX can help. Say I have an inbox with 50 emails and I want to trash, archive, or otherwise do something to one message/thread that would involve it being removed from my view. The rest of the inbox (not to mention all the other peripheral UI elements) shouldn't change. In the old way we'd re-request all this tremendous information (say ~95% of the UI) that didn't even change! And this is even more obvious when you remember that each seemingly tiny, simple piece of the UI (say a message line) may use a bunch of HTML (not to mention scripts, CSS, etc.) behind the scenes to make it look/feel a certain way. In the AJAX version we'd just have to add some scripting to remove that DOM element from the page and send a simple, say 0.5KB, HTTP message like "[...]deleteMsg.do?msgid=x[...]" to the server. You still have to suffer the TCP round-trip latency (but less so), but the difference can be dramatic, no?
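A sketch of what the poster describes (the deleteMsg.do path comes from the comment itself; the element ids and everything else are invented):

    // Tell the server, then surgically remove one row from the DOM.
    // Nothing else on the page is re-requested or re-rendered.
    function deleteMessage(msgId) {
        var xhr = new XMLHttpRequest();
        xhr.open("GET", "/deleteMsg.do?msgid=" + msgId, true);
        xhr.onreadystatechange = function () {
            if (xhr.readyState === 4 && xhr.status === 200) {
                var row = document.getElementById("msg-" + msgId);
                row.parentNode.removeChild(row);  // the other 49 rows stay put
            }
        };
        xhr.send(null);
    }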
Re:Really... (Score:1)
It seems to vary. Often Gmail will still find the need to display its custom red loading bar in the corner of the screen, and you have to pause and wait for it.
Sure, this wait time is nowhere near as long as I remember say Hotmail traditionally was, but on most connections it is quite possible that Gmail might actually be as slow as or slower than a 'clean' old traditional implementation.
Importantly the keyword in my statement was 'major'. It's definitely arguable that Gmail provides benefits over the t
Re:Really... (Score:3, Insightful)
Have you used gmail? Have you used any other web mail applications? I've used gmail, hotmail (the old, pre-ajax one), yahoo mail (the old, pre-ajax one), hushmail, squirrel mail, open webmail and outlook web access all fairly heavily at one time or another. gmail and OWA really stand out from that crowd. gmail uses AJAX, and OWA is conceptually very similar.
Re: (Score:2)
Glass House (Score:2)
Speaking of fads, I note that as of 02:06 EST, www.houseofzeus.com fails to validate as XHTML 1.0 Transitional.
Re:Glass House (Score:1)
Re:Glass House (Score:2)
Re:Glass House (Score:1)
Have a look at the bottom of the page: neither the blog software nor the template in use was created by me. I haven't changed the doctype from its original state and I don't intend to.
If bad doctypes and abuse of standards really get to you that much, I suggest you stop using computers now.
Re: (Score:2)
Re:Ummm, cool it. (Score:2)
Re:Ummm, cool it. (Score:1)
What site feature is gained by escaping them? Run the validator over Digg or Slashdot and you will see the same errors (because of Slashdot's referrer block on w3c.org you will need to use the save as html option of your browser).
While we're at it, you mightn't have noticed, but the ampersands weren't what was blocking the page from validating. Under XHTML 1.0 Transitional that only generates a warning.
The error is because my chosen blogging software doesn't add an alt attribute when it converts [img][/img] to proper HTML.
Re:Ummm, cool it. (Score:2)
Reliability. Sure, you can get away with ?foo=x&bar=y nine times out of ten, but then you use something like ?foo=x&bar=y&trade=z and links start breaking in some software but not others.
Escaping isn't there just for the hell of it. It has a purpose. When you ignore that purpose and hope all software that will encounter your code will compensate for your mistake, you are bound to have things break some of the time.
Re:Ummm, cool it. (Score:1)
Reliability. Sure, you can get away with ?foo=x&bar=y nine times out of ten, but then you use something like ?foo=x&bar=y&trade=z and links start breaking in some software but not others.
They weren't in a URL. They were in body text, hence no breaking links.
'The rules for escaping ampersands are uniform across all variants of XHTML. Not sure why you think the rules vary between Transitional and Strict; the differences between those two document types are structural in nature, not syntactical.'
Re:Ummm, cool it. (Score:1)
The rules for escaping ampersands are uniform across all variants of XHTML. Not sure why you think the rules vary between Transitional and Strict; the differences between those two document types are structural in nature, not syntactical.
For interest's sake I went and shoved an unescaped ampersand back into the body text of the page and ran it through the validator again to check I wasn't seeing things.
http://www.houseofzeus.com/filez/fails.jpg [houseofzeus.com]
As it stands I never said anything to suggest the XHTML
Re:Ummm, cool it. (Score:2)
When you said that "Under XHTML 1.0 Transitional that only generates a warning." you seemed to be implying that it being just a warning had something to do with it being XHTML Transitional. I guess I read too much into that sentence.
While it's
Re:Ummm, cool it. (Score:1)
Re:Ummm, cool it. (Score:2)
What personality feature is enabled by failing to escape uppity snidery?
I was going to help you... (Score:2)
Every AJAX call is a request (Score:3, Interesting)
Now that isn't quite true as you only reload part of the page.
The common example is Google Suggest. Instead of a list of searches, let's try a list of products. Suppose you have a database of 1000 products and 5 users hitting it via AJAX. If you just did a select on each keystroke it would be really bad: at least 5 database hits per second. In the old environment it would have been 1 hit every second (assuming it took 5 seconds to fill in the form). So in this case you've increased your database load by more than 5 times (worse if you used LIKE instead of = in the SQL).
To get around this you have a number of options. Here are some of the ideas I've seen:
1. Add columns to the product table containing the first 1, 2, 3, and 4 characters of the name, and index those columns. This means you can use = instead of LIKE, which is faster.
2. Hard-code the products in an array (see the sketch after this list).
3. Use hash files. E.g., create 10 hash files for different lengths, then check the length of the input, load the correct hash file, and look up the key.
The basic concept is to do some kind of work up front to decrease the overhead of each call, because you'll end up having a lot more requests.
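Option 2 in miniature (productNames and the prefix-only match are assumptions; a build step would generate the array):

    // Everything ships with the page once; each keystroke filters locally,
    // so the per-keystroke cost to the server is zero.
    var productNames = ["anchor", "anvil", "axe" /* ...generated at build time */];

    function localMatches(prefix) {
        var out = [], p = prefix.toLowerCase();
        for (var i = 0; i < productNames.length; i++) {
            if (productNames[i].indexOf(p) === 0) out.push(productNames[i]);
        }
        return out;
    }

This obviously only scales to catalogs small enough to ship to every visitor.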
Re:Every AJAX call is a request (Score:2)
where name like 'ab%'
still properly uses the index and is efficient; and if it weren't you could do:
where name > 'ab ' and name < 'ab}' or something
Sam
Re:Every AJAX call is a request (Score:1)
I'm thinking when the server starts up maybe initialize some arrays or temp files that get updated when the database gets updated (though this seems to violate DRY and could cause concurrency issues).
Re:Every AJAX call is a request (Score:2)
What you are suggesting is not good for the database at all!
A decent database (even MySQL) can do LIKE matches on indexes.
The trick, as others have said, is to use caching. Lots of caching.
You may even want to put a Squid (or Apache 2.2) in front of your web server to catch these little requests and keep them away from your database.
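For a Squid or Apache cache in front to actually help, the little requests have to be plain GETs with explicit freshness information. A sketch of the origin side in JavaScript (Node-style, purely to show the headers; lookupSuggestions is invented, and a PHP header() call or Apache mod_expires would do the same job):

    // Answer autocomplete lookups with a freshness lifetime so an upstream
    // Squid/Apache cache can serve repeated queries without touching us.
    var http = require("http");

    http.createServer(function (req, res) {
        var results = lookupSuggestions(req.url);    // hypothetical helper
        res.writeHead(200, {
            "Content-Type": "application/json",
            "Cache-Control": "public, max-age=60"    // identical queries hit the cache
        });
        res.end(JSON.stringify(results));
    }).listen(8080);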
Decide what you really need (Score:5, Informative)
It's going to be tempting to use a lot of AJAX, especially if it sounds fun. In reality, though, you should be considering user experience, since this is a community site. Don't use an AJAX call where someone might expect a page refresh.
With that said, it's best to try to cache frequently accessed items in memory (regardless of whether you're doing AJAX calls). ASP.NET does a good job of this--I don't know what you're programming in, but definitely find out how to cache so that you don't have to read the database all the time. This reduced our database server load from 55% to 45% upon implementation (it's separate from the web server).
To specifically answer your question, the speed of AJAX is mostly perceived. Yes, you'll reduce calls, but at the sacrifice of having to code things twice: once for users with JS, once for those without. Use it in places where it's senseless to reload an entire page, for example, opening a nested menu. Searches that aren't done by keyword are good as well. As has been said above, delay a server request until the user is done typing so that you can reduce calls. Remember, it's still a hit on your server; it just doesn't have to get all the rest of the crap on the page.
To reduce bandwidth, use JSON instead of XML, and only pass the headers that you need to into the AJAX call. To reduce server strain, cache frequently accessed database calls/results. Also, other non-AJAX javascript can help reduce calls, such as switching between "tabs" with some display:none action instead of reloading a page.
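The JSON-versus-XML point in code, given a completed XMLHttpRequest named xhr (the payload shape is invented; eval() was the usual JSON route before JSON.parse was common):

    // XML: walk the response DOM node by node.
    var users = xhr.responseXML.getElementsByTagName("user");
    var nameFromXml = users[0].getAttribute("name");

    // JSON: one eval and you have plain JavaScript objects.
    var data = eval("(" + xhr.responseText + ")");
    var nameFromJson = data.users[0].name;

The JSON payload is usually also smaller on the wire, which is the bandwidth argument above.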
The answer is not gratuitous AJAX; the answer is thinking through how people will most commonly use your site, and making those parts easiest (so users don't have to redo things, thereby wasting your server capacity/bandwidth). Take things that shouldn't have to refresh the page and make them work using javascript, AJAX or not. Depending on how crappily things are coded now, you should see between a 15% and 35% reduction in server load and database calls.
Re:Decide what you really need (Score:2)
Why not?
I expect to fail a course and die a smelly virgin, but if I pass the final and get laid, I won't complain.
With that said, it's best to try to cache frequently accessed items in memory (regardless of whether you're doing AJAX calls). ASP.NET does a good job of this
Sorry, I don't think that's where you should be focussing. Cache frequently accessed items in client memory. In the javascript. Oh, and he's probably not using ASP.NET if h
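What caching in client memory might look like (a plain object as the memo table; fetchSuggestions is a hypothetical XHR helper):

    // Remember answers already fetched during this page's lifetime,
    // so repeated lookups never leave the browser.
    var clientCache = {};

    function cachedLookup(query, callback) {
        if (clientCache.hasOwnProperty(query)) {
            callback(clientCache[query]);            // zero network traffic
            return;
        }
        fetchSuggestions(query, function (results) {
            clientCache[query] = results;
            callback(results);
        });
    }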
Re:Decide what you really need (Score:2)
If you think the browser parses XML for you, it would be coherent to think that the browser parses and interprets JavaScript for you.
As you will finally process the retrieved data in the JavaScript interpreter, the only comparison applies to:
Re:Decide what you really need (Score:2)
Why does everybody advocating JSON skip a few steps when describing how it is processed?
JSON is data supplied in the form of Javascript instructions. Using it goes like this:
You neglected to mention the fact that JSON information doesn't magically find its way into your code, it
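Spelled out, the steps go something like this (showPosts and the payload shape are invented):

    // 1. The XHR completes and hands you a string of JavaScript source.
    var text = xhr.responseText;        // e.g. '{"posts": [...]}'
    // 2. You execute that source.  eval() is the step that gets glossed
    //    over -- it runs whatever the server sent, so trust your server.
    var data = eval("(" + text + ")");
    // 3. Only now do you have objects you can actually use.
    showPosts(data.posts);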
Re:Decide what you really need (Score:2)
"Fast" and "DOM Access" should never belong to the same phrase. Ever. Not without a negation somewhere anyway.
Re:Decide what you really need (Score:1)
Why not?
I expect to fail a course and die a smelly virgin, but if I pass the final and get laid, I won't complain.
How about the fact that it breaks the back/forward buttons and means you cannot add the page to favourites/bookmarks? Small issues for some, but they could be damn irritating for people who bookmark a page only to have the bookmark resolve back to the original page.
With that said, it's best to try to cache frequently accessed
Re:Decide what you really need (Score:2)
A basic search function that can quickly and efficiently (the focus of this conversation) return relevant results without needing to refresh the browser can be a huge win.
Re:Decide what you really need (Score:1)
Sorry, I don't think that's where you should be focussing. Cache frequently accessed items in client memory. In the javascript.
That's quite possibly the stupidest idea I've heard today. How the heck do you expect anyone to get data in the first place? It's not going to magically appear in the client's cache. I sure hope you mean doing it both ways, if anything, although I don't see any benefit to caching anything on the client-side...if you're working with data, chances are good it's going to be entered
Re:Decide what you really need (Score:2)
Likewise, don't give us any crap about 10 million people having JavaScript disabled unless you can give us some verification.
Re:Decide what you really need (Score:1)
The following URL has some information from a counter service that seems to have aggregated their data: http://www.thecounter.com/stats/2005/September/javas.php [thecounter.com]. I admit, this data is skewed because the service is probably not tracking users by IP (which is flawed anyway). However, if they're getting that many people with JS disabled and they have a pretty small slice of the pie, then it's obvious there are plenty more. The percentage is probably pretty accurate in either case.
If you're not using AJAX in
Re: (Score:2)
Re:Decide what you really need (Score:2)
Is that supposed to be an argument for, or against?
In the real world, who cares about Opera and Safari users? The real world where until about 3 months ago, few people even cared about Firefox users.
Re:Decide what you really need (Score:2)
The only place where speed matters like this is the server. Client side, the difference is a one-time occurrence, and it's a matter of microseconds. Compared to download time, processing time is negligible. Compared to the speed at which the user interacts wit
Coded reasonably, most likely less load (Score:5, Interesting)
The happy medium is somewhere in between. Come up with functions that return just the right amount of data, including sufficient contextual data to not require another call. For a contacts-type app you would provide functions to read and write an entire user record at a time, as well as a function to obtain a list of users, with all the columns required to display them, in a single call. You will generally find it more efficient in bandwidth and client-side processing to tailor the remote functions towards the UI that needs them, fetching or uploading just the required data for a particular application screen or view. Once you have a decent remote function architecture you will no doubt have considerably less server traffic, since practically only raw data makes the trip anymore.
could be less load (Score:3, Insightful)
An example is an application that displays a list of cities in a state after a user selects the state.
1. If you send ALL the data to the client at once, it's a large file transfer and takes a long time. This produces a heavy load all at once.
2. If you coded it to refresh the whole page after the selection, then it is a smaller initial load, but on the 'refresh' you are sending the whole page plus the new data.
3. If you use AJAX, you only have to send the initial small request (not the heavy load) and then the second request for the part of the page that needs updating.
Between #2 and #3, #3 is better because it reduces the second hit on the server and network, as it does not have to resend parts of the page that have already been sent. Between #1 and #3 you will actually have more hits on the server, but #3 will result in less data being sent across the network.
The biggest problem with #2 is that sometimes refreshing a whole page (onchange) confuses users. Yes, this may sound weird, but I have had people tell me this.
The biggest problem with #3 is that if the server request fails, you must code for this, and if you don't, the user may not know what happened. Also, how do you handle a retry on something like this? (See the sketch below.)
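One answer to the retry question, sketched (the 5-second timeout, 3 attempts, and status element are arbitrary choices):

    // Retry a GET a few times with a timeout, then tell the user
    // instead of failing silently.
    function requestWithRetry(url, onSuccess, attemptsLeft) {
        var xhr = new XMLHttpRequest();
        var timer = setTimeout(function () { xhr.abort(); }, 5000);
        xhr.open("GET", url, true);
        xhr.onreadystatechange = function () {
            if (xhr.readyState !== 4) return;
            clearTimeout(timer);
            if (xhr.status === 200) {
                onSuccess(xhr.responseText);
            } else if (attemptsLeft > 1) {
                requestWithRetry(url, onSuccess, attemptsLeft - 1);
            } else {
                document.getElementById("status").innerHTML =
                    "Update failed; please try again.";
            }
        };
        xhr.send(null);
    }

    requestWithRetry("/cities?state=OH", updateCityList, 3);  // hypothetical usage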
Since it's a webapp... (Score:5, Interesting)
Make some simple test scripts using something like wget, and capture the response time with PasTmon or ethereal-and-a-script, one test for each transaction type, while at the same time measuring cpu, memory and disk IO/s.
At loads that wget or a human user will generate, 1/response time equals the load at 100% utilization of the application (not 100% cpu!), so if the average RT is 0.10 seconds, 100% utilization will happen at 10 requests per second (TPS).
For each transaction type, compute the CPU, Memory, Disk I/Os and network I/Os for 100% application utilization. That becomes the data for your sizing spreadsheet.
If you stay below 100% load when doing your planning, you'll not get into ranges where your performance will dive into the toilet (:-))
--dave
This is from a longer talk for TLUG next spring
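The arithmetic above as a tiny helper (the 0.10 s figure is the poster's example; the 70% planning margin is an assumption of mine, not part of the parent's method):

    // Utilization-law sizing: max throughput of the application is 1/RT.
    function maxTps(avgResponseSeconds) {
        return 1 / avgResponseSeconds;
    }

    var ceiling = maxTps(0.10);   // 0.10 s average RT => saturates at 10 TPS
    var planned = 0.7 * ceiling;  // stay well below 100% so RT doesn't dive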
Re:Since it's a webapp... (Score:1)
That's just wrong. You mixed up bandwidth and latency.
Re:Since it's a webapp... (Score:2)
This is from Raj Jain's "The Art of Computer Systems Performance Analysis", Chapter 33 (Operational Laws).
--dave
Webapp Benchmarking (Score:1)
Your post confirmed some hunch I've had for a while. I've been trying to figure out a way to measure CPU, I/O, RAM and other resources on our web servers to get reasonable application benchmarking data. I've been told it's next to impossible, and I've had the hardest time finding any info on this type of benchmarking on the net (beyond gut feel).
It's a two-way street (Score:5, Interesting)
I'm in the latter stages now of my first serious professional project using AJAX-style methods. In my experience so far, it can go either way in terms of server load versus a traditional page-by-page design. It all depends on exactly what you do with it.
For example, autocompletions definitely raise server load as compared to a search field with no autocompletion. Using a proper autocomplete widget with proper timeout support (like the Prototype/Scriptaculous stuff) is a smart thing to do; I've seen home-rolled designs that re-checked the autocomplete on every keystroke, which can bombard a server at the hands of a fast typist and a long search string. But even with a good autocomplete widget, the load will go up compared to not having it. That's the trade-off. You've added new functionality and convenience for the user, and it comes at a cost. Many AJAX enhancement techniques will raise server load in this manner, but generally you get something good in return. If the load gets too bad, you may have to reconsider what's more important to you: some of those new features, or the cost of buying bigger hardware to support them.
On the flip side, proper dynamic loading of content can save you considerable processing time and bandwidth in many cases. Rather than loading 1,000 records to the screen in a big batch, or paging through them 20 at a time with full page reloads for every chunk, have AJAX code step through only the records a user is interested in without reloading the containing page: big win. Or perhaps your page contains 8 statistical graphs of realtime activity in PNG format (the PNGs are dynamically generated from database data on the server side). The data behind each graph might potentially update as often as every 15 seconds, but more normally goes several minutes without changing. You can code some AJAX-style scripting into the page to do a quick remote call every 15 seconds to query the database's timestamps to see if any PNGs would have changed since they were last loaded, and then replace only those that need updating, only when they need to be updated. Huge savings versus sucking raw data out of the database, processing it into a PNG graph, and sending that over the network every 15 seconds as the whole page refreshes just in case anything changed.
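The graph-refresh pattern described above, roughly (the endpoint, the id scheme, and the JSON reply are all invented):

    // Every 15 s, ask only for timestamps; re-fetch just the PNGs that changed.
    var lastSeen = {};   // graph id -> last-modified timestamp we've rendered

    setInterval(function () {
        var xhr = new XMLHttpRequest();
        xhr.open("GET", "/graphs/timestamps", true);
        xhr.onreadystatechange = function () {
            if (xhr.readyState !== 4 || xhr.status !== 200) return;
            var stamps = eval("(" + xhr.responseText + ")"); // {"cpu": 11320, ...}
            for (var id in stamps) {
                if (stamps[id] !== lastSeen[id]) {
                    lastSeen[id] = stamps[id];
                    document.getElementById(id).src =
                        "/graphs/" + id + ".png?ts=" + stamps[id]; // cache-buster
                }
            }
        };
        xhr.send(null);
    }, 15000);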
Re:It's a two-way street (Score:2)
A new tier? (Score:1)
Re:A new tier? (Score:2)
You mean like the Presentation Layer of the OSI model? The one below Application (second from the top)?
-Charles
Separation of Services (Score:2, Insightful)
To serve up the webpage, I think you should go with a static HTTP server if you can. If you can't, I would use a different server because
Cash cash and money (Score:5, Informative)
People talked about RSS web server loads versus advertising revenue about 2 years ago on slashdot, so I hardly think people are that stupid.
Also, if every page is (at best, which I doubt in your case) 50KB, and each AJAX call is 500 bytes, decide whether that AJAX call saved an entire page refresh (from your site, a page is probably 120KB; with ads, customer pages can be 200KB...).
So, the initial download even at worst (or best) would be 50KB and each call 500 bytes, so you can see the overhead percentage is small; and if that call SAVED a refresh then you have saved 49.5p, which is good for half a pint on Fridays between 12 and 2 at the Little Willow on Hidge Street.
good day.
Web 2.0 is... (Score:2)
Paul Moore
Web 2.0 is made of
Ian Nisbet
Web 2.0 is made entirely of pretentious self serving morons.
Max Irwin
Web 2.0 is made of
Jeramey Crawford
- and a load of other things, see http://www.theregister.co.uk/2005/11/11/web_two_point_naught_answers/ [theregister.co.uk]
I read those quotes (Score:2)
I like the one about pretentious self-serving morons and 600 million unwanted opinions.
Web0.002 is like the web, only with a lower signal:noise ratio.
Does anyone find the fucktarded way ingaydget puts every fucking ke
mod parent up (Score:4, Interesting)
Because they are so dumb it all looks non-obvious to them; 1 click ordering is so dumb nobody bothered doing it, but hey- the customer (dolt) likes it, so as Amazon were the first senseless idiots to actually do it they get to patent it!
Sam
Re:I read those quotes (Score:2)
Nothing beats a little intelligent, well thought out criticism.
Beware Accessibility Issues (Score:3, Insightful)
This isn't directly related to your question, but it's something that most people experimenting with "AJAX" seem to be overlooking. It's too easy to fall into the trap of using XMLHttpRequest to do everything just because you can, but by doing that you are restricting yourself to the small set of browsers that actually support this stuff. That doesn't include many of the phone/PDA browsers that are becoming more common.
Also worth noting is that changing the DOM can cause confusion for users of aural browsers or screen readers. In some cases this doesn't cause a major problem; for example, if you have a form page where choosing your country then changes the content of the "select region" box that follows, the user will probably be progressing through the form in a logical order anyway, and so the change, just as in a visual browser, won't be noticeable. However, a comment form which submits the comment using XMLHttpRequest and then plonks some stuff into the DOM will probably not translate too well to non-visual rendering, as the user would have to backtrack to "see" the change.
Of course, depending on the application this may not matter. Google Maps doesn't need to worry about non-visual browsers because maps are inherently visual (though it doesn't actually use AJAX anyway!). Google Maps would be useful on a PDA, however. I'm not saying "avoid AJAX at all costs!" but please do bear in mind these issues when deciding where best to employ it. Most of the time it really isn't necessary.
Smart Use of Client Side is key (Score:3, Insightful)
There are a lot of good points posted in here. Caching on the client and on the server are two big things for a good application that uses the XHR. A good database design is also key if you do not want to use LIKE, which slows down the search. See Ajax in Action [manning.com], as discussed on Slashdot here [slashdot.org]: in chapter 10, the project shows how to limit postbacks from an auto-suggest by using the client side efficiently. The basic idea is to examine the results returned; if the count is under a certain number, it uses JavaScript regular expressions to trim down the dataset instead of hitting the server. Plus there is a limit on the number of results returned, so it speeds up response time.
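That chapter-10 trick might look like this in outline (the threshold, the names, and the assumption that the new term extends the old one are mine, not the book's code):

    // If the last server answer was small enough, narrow it locally with a
    // regular expression instead of posting back on the next keystroke.
    var lastResults = null;
    var LOCAL_THRESHOLD = 30;

    function suggest(term, callback) {
        // Only valid when "term" extends the query that produced lastResults
        // and contains no regex metacharacters.
        if (lastResults && lastResults.length < LOCAL_THRESHOLD) {
            var re = new RegExp("^" + term, "i");
            var trimmed = [];
            for (var i = 0; i < lastResults.length; i++) {
                if (re.test(lastResults[i])) trimmed.push(lastResults[i]);
            }
            callback(trimmed);
            return;
        }
        fetchSuggestions(term, function (results) {  // hypothetical XHR helper
            lastResults = results;
            callback(results);
        });
    }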
One thing I cannot get through people's minds enough when I do my talks is that an Ajax app is not going to behave like a client-based app on the web. The main reason is network traffic getting in the way of your request. Imagine a dial-up user in India with your server sitting in the United States. The request is going to have to travel to the other side of the world and back at the slow speeds of dial-up. Testing on your localhost is going to look great until you get on an outdated shared server hosting multiple applications under a full network load. Yes, we are talking small requests pinging the server, but 1000 users with a 10-letter word could mean death if you designed the system badly!
I love XHR (cough, Ajax), but you need to look at what you are dealing with. The design of an XHR app can kill you if you do not think it out fully.
My 2 cents,
Eric Pascarello
Coauthor of: Ajax In Action
Re:Smart Use of Client Side is key (Score:2)
Ummmm...no, it is not. UUCP is dead, thank God. The request is going to travel from the dial-up user's computer to his local ISP at the slow speed. After that, it gets routed on the rest of the Internet. Last I checked, that isn't using dial-up connections. The last link is going to be back to the user at dial-up speeds but
Re:Smart Use of Client Side is key (Score:1)
The whole point is you need to realize that there is a difference between dialup, DSL, Cable, and localhost that a lot of developers tend to forget. I have seen people ask why it was so fast in development and sluggish in production. That is why I brought up the point.
Some people think that slapping an XHR on a page is going to be a beam of light from the skies to end all of their troubles. Ends up it can
Re:Smart Use of Client Side is key (Score:2)
Re:Smart Use of Client Side is key (Score:2)
Re:Smart Use of Client Side is key (Score:2)
AJAX not the answer (Score:3, Insightful)
AJAX's reliance on ECMAScript seems like a shaky foundation at best. I imagine debugging ECMAScript can be quite clunky, and even if tool support might solve this problem at some point, there is no guarantee that browsers will interpret ECMAScript the same way; it seems like an embrace-and-extend waiting to happen.
I will not venture too far into the dynamically vs. statically typed language discussion other than stating that personally I prefer statically typed languages.
I get the impression AJAX is a quick-and-dirty solution to a problem that requires something more advanced.
It seems like AJAX is an attempt to overcome the shortcomings of thin clients using the technology that had the widest market penetration, without considering whether the technology was the appropriate tool for the job.
I am afraid that we will have to live with AJAX for a long time. A tragedy similar to VHS's victory over Betamax, where an inferior technology beat a superior one.
I wonder if something like a next-generation X-server browser plugin or a thick-client Java framework might not have been better suited for the job. I can't help but feel like AJAX is somehow trying to force a square peg through a round hole.
Use javascript instead of AJAX (Score:1)
Interesting (Score:1)
http://www.devx.com/asp/Article/29617 [devx.com]
http://www.port80software.com/200ok/archive/2005/04/29/393.aspx [port80software.com]
My opinion: I would think folks haven't done enough apps to know what is what, and the only people saying much are going to be the Web 2.0 folks themselves (unlikely to own up to it quite yet) or the few folks like these sitti
It had been discussed (Score:1)
Basically, a server that is used to handle X number of customers making a request every 2-3 minutes will get a multiple of that, because the requests are coming in much more frequently.
You will need to tune the server for a much higher throughput (more listeners/threads/workers) to deal with AJAX.
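On Apache 2 with the worker MPM, "more listeners/threads/workers" translates to knobs roughly like these (the values are illustrative guesses; tune them against measurements, not this sketch):

    # httpd.conf (worker MPM): raise concurrency for many small AJAX hits
    <IfModule mpm_worker_module>
        StartServers          4
        ServerLimit          16
        ThreadsPerChild      64
        MaxClients         1024     # ServerLimit * ThreadsPerChild
        MaxRequestsPerChild   0
    </IfModule>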