
Top Methods for Faster, Speedier Web Sites
  1. Make fewer HTTP requests
    Reducing 304s with Cache-Control headers
  2. Use a CDN
  3. Add an Expires header
    Caching with mod_expires on Apache
  4. Gzip components
  5. Put CSS at the top
  6. Move JS to the bottom
  7. Avoid CSS expressions
  8. Make JS and CSS external
  9. Reduce DNS lookups
    Use a static IP address; serve static content from a single subdomain.
  10. Minify JS
    Refactor the code; compress it with the Dojo compressor.
  11. Avoid redirects
    Use internal redirection with mod_rewrite; when you must redirect externally, do it correctly with a 301.
  12. Remove duplicate scripts
  13. Turn off ETags
    In .htaccess: FileETag None and Header unset ETag
  14. Make AJAX cacheable and small
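Several of the server-side items above (3, 4, and 13) can be combined in a single .htaccess fragment. This is a sketch, not a drop-in config: it assumes mod_expires, mod_deflate, and mod_headers are loaded, and the MIME types and one-year lifetime are illustrative.

```apache
# Item 3: add an Expires header so static files are cached long-term
<IfModule mod_expires.c>
    ExpiresActive On
    ExpiresByType text/css        "access plus 1 year"
    ExpiresByType text/javascript "access plus 1 year"
    ExpiresByType image/png       "access plus 1 year"
</IfModule>

# Item 4: gzip text components
<IfModule mod_deflate.c>
    AddOutputFilterByType DEFLATE text/html text/css text/javascript
</IfModule>

# Item 13: turn off ETags
FileETag None
<IfModule mod_headers.c>
    Header unset ETag
</IfModule>
```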

mod_rewrite Fix for Caching Updated Files

Web developers sometimes append a query string like file.ext?v=004 as ad-hoc version control to force visitors to fetch updated files. This is terrible: some caches and proxies won't cache URLs with query strings at all. Instead, link to apache-003.css and set it to be cached forever; when you change the file, you just change the links to apache-004.css. That eliminates millions of bandwidth- and resource-robbing If-Modified-Since requests.
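The entry doesn't show the rule itself. One common way to implement this without keeping a copy of every version on disk is to let mod_rewrite strip the version number, so apache-003.css, apache-004.css, and so on all map to the single real file apache.css. A sketch, reusing the journal's filenames (the rewrite pattern is an assumption, not the author's rule):

```apache
# Map versioned links (apache-003.css, apache-004.css, ...) back to
# the one real file on disk (apache.css). Bumping the number in your
# HTML busts every cache without copying or renaming files.
RewriteEngine On
RewriteRule ^(.+)-[0-9]+\.(css|js)$ $1.$2 [L]

# The versioned URLs can then be cached "forever" (assumes mod_expires)
<IfModule mod_expires.c>
    ExpiresActive On
    ExpiresByType text/css "access plus 1 year"
</IfModule>
```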

Serve a Page for Every HTTP Status Code (all 57)

WOW: I served a page for every single HTTP status code and saved the headers and content for each one.

There are 57 recognized HTTP status codes in the latest version of Apache, but chances are you will only ever see or hear about three to six of them. By digging through the Apache source code I figured out a neat little hack that lets anyone set up Apache to send ANY HTTP status code for specific requests. I then performed requests against all 57 of my status-code-spitting pages and saved the headers and source for each one, so you can use that page as a literal map for designing CGI, Perl, and PHP scripts with custom headers. Now you can find out what each of those 57 status codes does, and copy them. All of this is possible if you have .htaccess set up.
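The entry doesn't reproduce the hack itself, so the following is only a guess at one documented way to get the same effect: mod_rewrite's R flag accepts any status code, and for codes outside the 3xx range Apache drops the substitution and simply answers with that status. The URL paths below are illustrative.

```apache
# Sketch (not necessarily the author's hack): the R flag of
# RewriteRule accepts arbitrary status codes. For non-3xx codes
# the substitution is ignored and Apache returns the status as-is.
RewriteEngine On
RewriteRule ^status/402$ - [R=402,L]   # 402 Payment Required
RewriteRule ^status/410$ - [R=410,L]   # 410 Gone
RewriteRule ^status/507$ - [R=507,L]   # 507 Insufficient Storage
```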


Instruct Search Engines to come back to site after you finish working on it

What do you think Googlebot and other Search Engines do when they try to reach your site while you are tinkering with it?

A nifty SEO tip: get search engine bots to check back every hour until you finish working on the site, then tell them you are done.

What if you could conveniently tell Googlebot and the other bots that you are working on the page, but you would like them to come back in, oh, say, an hour? I found out how to do it, and now I'm sharing it with you.
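The entry teases the technique without naming it. The standard mechanism for this is answering crawlers with 503 Service Unavailable plus a Retry-After header, which tells well-behaved bots the outage is temporary and when to return. A sketch in .htaccess, assuming mod_rewrite and mod_headers; the one-hour delay and /maintenance.html page are illustrative:

```apache
# While working on the site, answer every request with 503 and
# ask bots to retry in an hour (3600 seconds).
RewriteEngine On
RewriteCond %{REQUEST_URI} !^/maintenance\.html$
RewriteRule ^ - [R=503,L]
ErrorDocument 503 /maintenance.html
Header always set Retry-After "3600"
```

Remove the fragment when you're done; the next crawl then sees a normal 200 and the bots know you are finished.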
