Microsoft.com Makes IE8 Incompatibility List 358

nickull writes "Microsoft is tracking incompatible Web sites for its upcoming Internet Explorer 8 browser and has posted a list that now contains about 2,400 names — including Microsoft.com. Apparently, even though Microsoft's IE8 team is doing the 'right' thing by finally making IE more standards-compliant, they are risking 'breaking the Web' because the vast majority of Web sites are still written to work correctly with previous, non-standards-compliant versions of IE."
  • Google.com?! (Score:5, Interesting)

    by kramulous ( 977841 ) * on Thursday February 19, 2009 @06:00PM (#26922405)

    I'm no web developer, but how can google.com be on that list as well? It is one of the simplest websites around: a text field, a few links and a bit of JavaScript.

    How the hell can a web browser that, let's face it, is probably going to be the dominant web browser, not render that?

    No wonder the general population gets pissed off with 'the computer's not working again'. These days I tell them that I don't know Windows. I'm going to have to start walking around with an Ubuntu live USB.

  • Re:Options (Score:5, Interesting)

    by nmb3000 ( 741169 ) <nmb3000@that-google-mail-site.com> on Thursday February 19, 2009 @06:07PM (#26922503) Journal

    What if we could just define which rendering engine to use in pages, e.g. IE7 or IE8 in a meta tag...

    Oh if we only could! [msdn.com]

    Watching the development of IE8, the team is taking great pains to make sure that site authors and owners have a say in how their pages are rendered with respect to IE's new standards compliance. You can use either a META tag or an HTTP header to tell IE8 to use the new rendering engine (the default) or to fall back to IE7's behavior. Companies can also specify compatibility options using GPOs, which should help keep older intranet sites working.

    I think it's a pretty good tradeoff between pushing for modern standards and not "breaking the web". Yes, it is largely IE's fault that there are so many non-conforming sites out there, but compatibility is important regardless, especially for "offline" sites which cannot be fixed easily or cheaply (CD help files, embedded web servers, etc.). At least by making the new rendering mode the default, it will encourage standards compliance (or at least IE's [admittedly improving] version of it).
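For reference, the opt-in described above looks like this; the values are from Microsoft's documented X-UA-Compatible switch, but treat this as a sketch:

```html
<!-- render this page with the IE7 engine even when viewed in IE8 -->
<meta http-equiv="X-UA-Compatible" content="IE=EmulateIE7" />
```

The same switch can also be sent as an HTTP response header (X-UA-Compatible: IE=EmulateIE7), and values like IE=8 or IE=edge opt into the newest engine instead.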

  • Re:Options (Score:5, Interesting)

    by duguk ( 589689 ) <dug@frag[ ].uk ['.co' in gap]> on Thursday February 19, 2009 @06:14PM (#26922577) Homepage Journal
    I was testing my *new* site in IE8 yesterday; I'm using the "<!--[if IE]>" conditional-comment syntax. Works great in IE6 and IE7.

    Doesn't work at all in IE8, so I clicked the little compatibility mode button. It rendered as IE7, but ignored the compatibility markup, totally breaking everything! What's the point of an IE7 compatibility mode if it ignores IE7-specific markup?

    That's totally useless when it comes to testing IE8, and hover/dropdown menus still don't work correctly, however you try to do them.

    Disclaimer: Works perfectly in Firefox/Safari/Chrome/Opera/IE6/IE7, just not bloody IE8.
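For anyone who hasn't used them, the conditional comments being discussed look like this (a sketch; the stylesheet file name is made up):

```html
<!--[if lte IE 7]>
  <link rel="stylesheet" href="ie7-fixes.css" />
<![endif]-->
```

Note that a bare "[if IE]" condition matches IE8 as well, whereas "[if lte IE 7]" does not, which matters when IE8 renders a page written against the looser form.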
  • by spuke4000 ( 587845 ) on Thursday February 19, 2009 @06:21PM (#26922675)
    The real story is not that microsoft.com is on the list, it's all the other sites. Ostensibly this is a list of sites that are not standards compliant, which IE8 will treat as non-standard so they display correctly. But if you check the list [zdnet.com] you'll find wikipedia.org, google.com, mozilla.com(!!). Are these sites really non-compliant? Or is IE8 just incompatible with them?
  • by Dionysus ( 12737 ) on Thursday February 19, 2009 @06:27PM (#26922757) Homepage

    Validating Google.com [w3.org]. I don't think Google ever tried to be compliant.

  • by iYk6 ( 1425255 ) on Thursday February 19, 2009 @06:36PM (#26922839)

    It's funny you mention that. I have always been amazed at Google's capacity for error. In 4 lines of HTML, on the very simple page you mention, Google has managed to fit 65 errors and 8 warnings. Sibling poster has a link to the w3c validator.

  • by TBerben ( 1061176 ) on Thursday February 19, 2009 @06:43PM (#26922965)
    According to the W3C validator Mozilla.org [w3.org] passes with 1 warning, Wikipedia.org [w3.org] passes with flying colours but Google.com [w3.org] fails miserably with 65 errors.
  • Re:Options (Score:3, Interesting)

    by duguk ( 589689 ) <dug@frag[ ].uk ['.co' in gap]> on Thursday February 19, 2009 @06:51PM (#26923079) Homepage Journal
    I absolutely agree; for some nice-looking drop-down menus, however, it's impossible to avoid the "if ie" tag. I wish I didn't have to use them (and wish I didn't have to sacrifice decent code for looks and design).

    As I mentioned originally, though, this is for a drop-down menu using ul/li and the :hover pseudo-class. I can't find any way to do this that works better and avoids using JavaScript. The "if ie" tag I'm using is non-essential and the site still works without it; to make it look its best, though, it's hard to avoid.

    At least it's better than some of the old CSS tricks with invalid code - the "if ie" syntax is W3C-valid code.
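A minimal sketch of the ul/li hover menu being described, assuming markup like <ul id="nav"><li><a href="#">Item</a><ul>...</ul></li></ul> (the id is hypothetical); the reason the IE-only workaround exists is that IE6 only applies :hover to links:

```css
/* hide submenus until their parent item is hovered */
#nav li ul { display: none; }
/* reveal on hover; IE6 ignores :hover on li, hence the conditional-comment fallback */
#nav li:hover ul { display: block; }
```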
  • by Anonymous Coward on Thursday February 19, 2009 @06:58PM (#26923167)

    In the list, there's "mozilla.com"... but...

    w3.org mozilla [w3.org]

    Maybe Microsoft didn't want "microsoft.com" to be all alone...

  • by psyclone ( 187154 ) on Thursday February 19, 2009 @07:08PM (#26923271)

    IE8 passes ACID 2:

    http://blogs.msdn.com/ie/archive/2007/12/19/internet-explorer-8-and-acid2-a-milestone.aspx [msdn.com]

    But as of September, IE8 was lagging on the ACID 3 test:

    http://www.anomalousanomaly.com/2008/03/06/acid-3/ [anomalousanomaly.com]

    The closer they all get to standards (any standards) the better.

  • by gazbo ( 517111 ) on Thursday February 19, 2009 @07:09PM (#26923283)
    That is a truly shocking email. And if it were 10 years ago it may even be slightly relevant.
  • Re:Options (Score:5, Interesting)

    by Hurricane78 ( 562437 ) <deleted&slashdot,org> on Thursday February 19, 2009 @07:10PM (#26923297)

    No it is not. You apparently never tried to program a real web application to work in that thing.

    It contradicts its own rules, based on random things like race conditions between the first execution of JavaScript in an <IFRAME> and the end of the page rendering routine.
    Been there, seen it, circumvented stuff like that in anything from 2 minutes to no less than two weeks of hard debugging.

    In the matrix of IE, you only have to remember one thing: There is no standard.
    Everything can change, and change back in the blink of an eye, for no reason at all.

    I fear that to be a Trident developer, you must be a genius to understand that mess, and crazy to stand it, at the same time.

  • Re:Google.com?! (Score:2, Interesting)

    by psyclone ( 187154 ) on Thursday February 19, 2009 @07:22PM (#26923417)

    If it is raw bandwidth on the main page and the search results page they are concerned about, Google could store their inline CSS and JavaScript externally. The browser and proxy cache savings would be more than enough to make up for adding the doctype and standards-compliant code.

    Personally I don't think Google cares enough to make their documents W3C standards compliant. But as long as everyone's browser works on it, no end user will care. (Someone writing a rendering engine might be annoyed.)
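The change being suggested is just lifting the inline blocks out into separate, cacheable files (the file names here are hypothetical):

```html
<!-- instead of repeating inline <style> and <script> blocks in every response: -->
<link rel="stylesheet" href="/main.css">
<script src="/main.js"></script>
```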

  • by Anonymous Coward on Thursday February 19, 2009 @07:26PM (#26923453)

    They were at some point, if I remember correctly.
    But since they added in all the JavaScript functionality, things got "broken".

    I can forgive most of the errors in there, but this one in particular is pretty damn shocking in my eyes:
    Line 3, Column 2223: an attribute value must be a literal unless it contains only name characters
    ...om/maps?hl=en&tab=wl" onclick=gbar.qs(this) class=gb1>Maps a href="http:
    Really?! Usually a ; would end that, but since a bracket with a space was found, that is taken as the end of the value. Seriously, that's just crazy; it makes my brain cells scream.
    In fact, all the attributes not being in quotes are pretty damn bad.

    Surprised they haven't fixed that; it's kinda saddening seeing Google leave it like that.
    Fixing it wouldn't add much to the download size; their site is already tiny, and even a 56k connection could handle it reasonably well.
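The fix the validator is asking for is just quoting the attribute values; unquoted, a value ends at the first space, which is what produces the error quoted above (the href stays elided as in the validator output):

```html
<!-- invalid: an unquoted value may only contain name characters -->
<a href=... onclick=gbar.qs(this) class=gb1>Maps</a>
<!-- valid -->
<a href="..." onclick="gbar.qs(this)" class="gb1">Maps</a>
```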

  • Re:Options (Score:5, Interesting)

    by commodoresloat ( 172735 ) on Thursday February 19, 2009 @09:06PM (#26924219)

    This is exactly right. I don't know why, but this company seems to be doing everything ass-backwards and still getting away with it. I work at a very large organization, and a lot of Office documents get sent back and forth over email. Most people have not "upgraded" to the latest version of Office (2007/8). The few who have send everything in the new XML format (docx etc.), which is not compatible with older versions. This is annoying as hell when I have to explain that Word is incompatible with Word, or Excel is incompatible with Excel. Thankfully there are tools on the Microsoft site that can convert these documents, but there is no reason people should have to jump through these hoops. Even worse, these programs have expiration dates -- just today I tried to open a docx document and was told the program had expired. I had to go to the MS website and download a minor point upgrade to the converter program (the link was hidden on a page [microsoft.com] that was mostly about Microsoft Messenger). Then I ran the program and it told me to quit Entourage, Word, and Excel - each of which had about 10 windows open - just so I could update this external application. Even as I'm typing this I just realized there is yet another minor point update on the website, so I'll need to upgrade to 1.0.2 now. What a nightmare.

    Here's another example of this sort of nonsense -- if you own MS Office 2004 for OS X, it has been updated to 11.5.3. But you can't just update from version 10 to version 11.5.3 in one swoop. If you installed Office years ago and kept it up to date it's a minor nuisance but if you're installing Office 2004 on a new computer, you need to use AutoUpdate like 15 times to get it up to date, one point upgrade at a time. Seriously, who has time for this nonsense? And who thinks up this crap?

  • by Bill Dog ( 726542 ) on Thursday February 19, 2009 @10:19PM (#26924679) Journal

    While testing a socket helper class I was writing about a year and a half ago, I noticed that the Google homepage's entire direct content (i.e. excluding content like their logo, which the browser fetches in a separate request (and which will be cached for visits thereafter)) always arrived in a single TCP/IP packet. I assumed that this was on purpose, by the following reasoning:

    • This bypasses the possibility of rendering a partially downloaded web page. A user who sees part of the page but can't yet use it will likely think it's due to slowness on that web site's end. (I.e., they'll be mad at Google.)
    • If there's any delay in loading that first packet, even a delay on the web site's end, users see nothing at all yet, so they're likely to assume they're just experiencing a slow connection for some indeterminate reason, and assign blame to their ISP or the Internet in general.

    So if all of Google's main-page content still fits within that roughly 1500-byte limit, then they probably are indeed dropping characters here and there and violating standards, so long as it still renders properly, to maintain that snappy response we're used to when going to Google. In other words, I think Google's characteristically spartan home page was not only about the look, but the look and feel. Pretty smart.
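The single-packet reasoning above can be sanity-checked with simple arithmetic (a sketch: 1500 bytes is the standard Ethernet MTU, and the 20-byte IP and TCP headers assume no options):

```javascript
// maximum TCP payload per Ethernet frame: MTU minus IP and TCP headers
const MTU = 1500;
const IP_HEADER = 20;   // IPv4 header, no options
const TCP_HEADER = 20;  // TCP header, no options
const MSS = MTU - IP_HEADER - TCP_HEADER; // 1460 bytes of payload

// would a given HTML body arrive in a single segment?
function fitsInOnePacket(html) {
  return Buffer.byteLength(html, 'utf8') <= MSS;
}
```

So the directly served content would have to stay around 1460 bytes, which is consistent with the byte-shaving the parent describes.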

  • by dave562 ( 969951 ) on Thursday February 19, 2009 @10:19PM (#26924683) Journal
    I've been using IE8 for a couple of months and have been staying on top of the beta releases. The browser is pretty much worthless unless I put it into "compatibility mode". It doesn't work with my banking sites. It doesn't work right with Gmail, even in compatibility mode. It doesn't work on Slashdot. It barely works anywhere. So either a good portion of the internet isn't coded to standard, or the IE8 interpretation of the standard is borked.
  • Re:Options (Score:4, Interesting)

    by Firehed ( 942385 ) on Friday February 20, 2009 @02:19AM (#26925935) Homepage

    Why not try changing your "if ie" to "if lte ie7" and stop confusing the hell out of the poor thing? Unless you've done some bizarre javascript (please tell me you're using one of the plethora of fully-cross-browser libraries!), this shouldn't really be an issue. At least not a significant one - it may not be pixel-perfect, but easily close enough. My brief testing in IE8 has it rendering stuff just as well as Firefox or Safari.

    I realize that it's not always (read: almost never) an option with CSS, but it's far better if you can avoid browser-specific conditions by other means. For instance, you can check if a recent JS/DOM method exists (getElementsByClassName, for instance), use it if so, otherwise revert to your fallback/ugly/slow code. If/when the browser gets the method in question (not that it's at all likely, but what if MS patched some of the flaws in IE6/7?), your code will automatically use the better version without you having to touch it after the fact, and no browser sniffing.
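The feature-detection pattern being suggested looks roughly like this (a sketch; getByClass and the class-walking fallback are hypothetical names, not from any particular library):

```javascript
// Prefer the native method when the browser provides it; otherwise fall back
// to scanning every element and matching the class token by hand.
function getByClass(root, cls) {
  if (root.getElementsByClassName) {
    return Array.prototype.slice.call(root.getElementsByClassName(cls));
  }
  var matches = [];
  var all = root.getElementsByTagName('*');
  for (var i = 0; i < all.length; i++) {
    if ((' ' + all[i].className + ' ').indexOf(' ' + cls + ' ') !== -1) {
      matches.push(all[i]);
    }
  }
  return matches;
}
```

If a future browser (or patched IE) gains getElementsByClassName, callers automatically get the fast path with no browser sniffing, which is exactly the point the parent makes.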

  • by Anonymous Coward on Friday February 20, 2009 @02:54AM (#26926077)

    "Now, why would Linux users want to go to the Windows Update site anyway?"

    Are you kidding? It drives me bat-guano nuts when I CAN'T download Windows updates/patches from LINUX/Firefox. Just because I have one machine with, say, Vista/MS Office, why should I have to use IE and, FFS, pass some stupid WGA/OGA test just to download patches for things that shouldn't have been broken in the first place?

    * Because I'm a sysadmin for multiple machines.

    * Because I prefer to use LINUX/UNIX when at all possible.

    * Because it is very reasonable to use ANOTHER PC to download updates/fixes for machines that ARE NOT on / online / available / working. Burn them to CD or copy them to the LAN or flash disc and you can update them when locally convenient. Actually most "critical" PCs aren't connected to the internet AT ALL by organizational security policy, hence they have to have their updates pulled from another machine.

    * Because the time when I *most* want to see / download Windows Updates / security alerts / bulletins / patches is *exactly* when my Windows boxes are in danger of being 0wn3d by the unpatched remote code execution vulnerability of the month and I don't DARE connect them to the internet until the problems are identified / analyzed / understood / patched. Typically a lot of Windows-based malware actively PREVENTS you from updating / patching the box or its virus definitions, et al. If a Windows host is infected/vulnerable, you have to worry about it being susceptible to more / initial infections by bringing it online, especially via the IE browser.

    * Because for instance Microsoft's download center helpfully offers downloads like monthly CD ISO-image security updates or various other tools / documents in ISO image format. Oops: MS Windows HAS no official built-in capability WHATSOEVER to burn Microsoft's own ISO images to CD/DVD or to extract/mount them. Whereas if I download them on UNIX I'll have them burning in about 20 seconds, and the images loopback-mounted for sharing over SAMBA/CIFS to the LAN, using perfectly standard built-in utilities.

    * Because for instance Microsoft has no built-in capability to do things like MD5/SHA1/GPG-verify the various downloads for which hashes / signatures are available, whereas it takes about 10 seconds with standard tools on UNIX.
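The UNIX-side verification being described is a one-liner with coreutils (a sketch; the patch file name is made up, and in practice you would check against the hash Microsoft publishes rather than one you generated yourself):

```shell
# pretend this is a downloaded patch (hypothetical file name)
echo "patch payload" > WindowsUpdate-KB000000.exe
# record its SHA-1, then verify it the way you would against a published hash
sha1sum WindowsUpdate-KB000000.exe > patch.sha1
sha1sum -c patch.sha1   # prints "WindowsUpdate-KB000000.exe: OK" on success
```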

    * Because even on NTFS with Windows you typically run into stunningly brain-dead limitations like 128-character path names, and a lot of the download/filesystem utilities are pretty bad about preserving file/directory creation/modification times. So if I'm trying to be organized and actually store information about WHERE/WHEN I've downloaded a given update, I need UNIX tools/filesystems for best success. This is relevant since [thank you, Microsoft!] they typically have no good / simple way, based on filename or standard metadata, to identify WHAT revision/version/platform a given patch is for, or even necessarily what KB/issue it is relevant to. You can end up with a lot of brain-damaged "SETUP.EXE" downloads from Microsoft and you'll forever be wondering "What's that?" "Why do I want it?" "Is it even the most recent version?", hence you need to manage the files in the filesystem, which, as aforementioned, is much more difficult on FAT32/NTFS/Windows than LINUX.

    * Because typically you don't find standard tools like download managers / bandwidth-control utilities et al. on Windows, though of course they're available as 3rd-party tools. Firefox, wget, curl, et al. are better on UNIX than Windows.

    e.g.: /home/sysadmin/2009-01-30-Microsoft/Windows_7_Beta_7000.0.081212-1400/download.microsoft.com/download/6/3/3/633118BD-6C3D-45A4-B985-F0FDFFE1B021/EN/7000.0.081212-1400_client_en-us_Ultimate-GB1CULXFRE_EN_DVD.ISO ...illustrates nicely the problems with (a) ISO images, (b) 128 character path limits, (c) preserving metadata information about the date/source of the download that just doesn't happen on Windows, et. al.
