
Firefox Going the Big and Bloated IE Way?

abhinav_pc writes "Wired is carrying an article pondering whether Firefox has become big and bloated, much like IE. As the browser's popularity has risen, the interest in cramming more features into the product has as well. Slowdowns and feature creep have some users asking for a return to the days of the 'slim and sexy' Firefox. 'Firefox's page-cache mechanism, for example, introduced in version 1.5, stores the last eight visited pages in the computer's memory. Caching pages in memory allows faster back browsing, but it can also leave a lot less memory for other applications to use. Less available RAM equals a less-responsive computer. Firefox addresses this issue somewhat, setting the default cache lower on computers with less than a gigabyte of RAM. Though the jury is still out on where the perfect balance between too many and too few features lies, one truth is apparent: The new web is pushing our browsers to the limit.'"
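
For reference, the back-cache behavior described in the summary is exposed as a hidden preference and can be tuned through about:config or a user.js file. A minimal sketch, assuming the pref name of the Firefox 1.5/2.0 era (the value shown is illustrative, not a recommendation):

    // Pages kept rendered in memory for fast Back/Forward navigation.
    // -1 = let Firefox decide from installed RAM (up to 8); 0 disables the cache.
    user_pref("browser.sessionhistory.max_total_viewers", 2);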
  • Very nice FUD (Score:5, Insightful)

    by The Bungi ( 221687 ) <thebungi@gmail.com> on Thursday May 17, 2007 @04:47PM (#19169021) Homepage
    Wow, I actually RTFA, and nowhere in there does it say that Firefox is becoming as "bloated" as Internet Explorer. Nope, it says it's becoming as bloated as Seamonkey. Oh the horror. The article is also (as usual) not kind to Firefox regarding the slowness and insane memory consumption it suffers from, which thousands of fanboys have spent the past three years desperately denying for some weird reason. To be fair, I use FF and I don't care about the memory problem, but that doesn't mean it's not there.

    Disingenuous FUD aside, I can't for the life of me imagine how IE could be "bloated". It never had much functionality to begin with.

    Kudos to Bashdot. Even the current Digg submission [digg.com] doesn't mention IE at all.

  • well (Score:3, Insightful)

    by mastershake_phd ( 1050150 ) on Thursday May 17, 2007 @04:48PM (#19169051) Homepage
    Caching pages in memory allows faster back browsing, but it can also leave a lot less memory for other applications to use.
     
    The amount of RAM used for caching pages could be set by the user in the options. I think most Firefox users could handle that.
  • Firefox=Mozilla? (Score:5, Insightful)

    by Aeron65432 ( 805385 ) <agiambaNO@SPAMgmail.com> on Thursday May 17, 2007 @04:49PM (#19169063) Homepage
    More than anything it's reminding me of Mozilla, now known as SeaMonkey. The reason I switched from Mozilla to Firefox was because I wanted a smaller, more nimble browser. I didn't want an RSS reader, e-mail, IRC, etc. packaged together. Firefox hasn't integrated all of those yet, but it's moving in that direction and I don't like it.
  • Opera! (Score:5, Insightful)

    by Romwell ( 873455 ) on Thursday May 17, 2007 @04:49PM (#19169073)
    I guess it's time now for people to look into Opera, which seems to be able to keep the balance. I think software should not be discriminated against on the basis of not being FOSS.
  • by Paradigm_Complex ( 968558 ) on Thursday May 17, 2007 @04:50PM (#19169087)
    Firefox has an awesome ability to add things on very effectively. I don't understand why they don't keep fx slim, with all the proposed additional features as external (and hence optional) add-ons. Perhaps the not-so-computer-literate can use the bloated-up version of fx so they don't have to figure out how to use add-ons (I'm still amazed at how computer illiterate people can be), but leave a streamlined version for us techies to add on options as we choose.
  • by Foofoobar ( 318279 ) on Thursday May 17, 2007 @04:53PM (#19169165)
    Aside from that, the ongoing issue of Web 2.0 apps and JavaScript in multiple tabs using the same shared namespace and overwriting variable names still hasn't been highlighted by the security community. As AJAX and web-based applications become more prominent, end users will find more and more applications breaking other applications.
  • by Paradigm_Complex ( 968558 ) on Thursday May 17, 2007 @05:00PM (#19169331)
    If people want to bloat up their browser, I don't see how that's your problem. Now, when the browser comes bloated so that you can't slim it down without spending a good bit of time cutting chunks out of the code, it becomes a problem. If they took things like spellcheck out - slimming the base fx - and allowed me to choose whether I want it in or not, that'd be nice. But what do you care if I want to put a bigjillion plugins on?
  • Re:Opera! (Score:5, Insightful)

    by glwtta ( 532858 ) on Thursday May 17, 2007 @05:01PM (#19169345) Homepage
    I think software should not be discriminated against on the basis of not being FOSS.

    And I think it should. Guess that's why different things matter to different people.
  • by eln ( 21727 ) on Thursday May 17, 2007 @05:01PM (#19169357)
    That's really the problem, I think. Firefox was originally supposed to be just a great browser, as opposed to the bloat that was Mozilla, with additional functionality being provided by Add-ons. Now, though, the development direction seems to be to take the best of the extensions and incorporate them into the main product. It might be better to keep the browser as it is, and then release a separate bundle with Firefox + the most popular add-ons. That way, people that want the slim browser they switched to Firefox for in the first place can have it, while the Firefox team can still have a download that will allow them to crow about all of the great features Firefox has.
  • become? (Score:5, Insightful)

    by nanosquid ( 1074949 ) on Thursday May 17, 2007 @05:02PM (#19169379)
    I don't think Firefox ever was such a lean or efficient browser. It's also buggy and the developers don't seem to care much about Linux or MacOS (bad profile support, inefficient graphics, etc.). Opera and Konqueror both seem better written and better designed.

    I still use Firefox. Why? Because Firefox works well enough, it's up-to-date, compatible, and, most importantly, has tons of useful extensions.

    I hope the Firefox developers will be able to clean up their act, but unless it gets a lot worse, I'm sticking with Firefox, because, on balance, it's still the best browser there is.
  • by edwdig ( 47888 ) on Thursday May 17, 2007 @05:03PM (#19169389)
    Firefox was only leaner than Mozilla back when it was called Phoenix and had only the bare minimum UI necessary to be a web browser.

    Mozilla never was slow (at least not after it reached the point that it was good enough to consider using as your standard browser) and really wasn't a memory hog. That perception came about from the people who really didn't want an integrated email program, but absolutely refused to choose "Browser only" when the installer asked what they wanted.

    Around the time the name changed from Phoenix to Firebird, the two browsers were about on par. By the time the name changed to Firefox, it was already more bloated than Mozilla. The project goals moved more towards grabbing attention than being lean.

    If Mozilla had just made a theme that blended in to the OS (Classic doesn't do a good enough job of it) and put a link on the download page to an installer that only had the browser included, there never would have been a need for Firefox.
  • Re:well (Score:2, Insightful)

    by mastershake_phd ( 1050150 ) on Thursday May 17, 2007 @05:05PM (#19169441) Homepage

    The amount of RAM used for caching pages could be set by the user in the options. I think most Firefox users could handle that.
    Sure, for geeks. But if we want people to stop using IE we must provide a credible alternative.

    There should definitely be an option to tell Firefox to use less than n megabytes of memory, and let firefox figure it out, instead of setting the memory limit through the number of undo levels per tab.


    Firefox would have to default to something, but that doesn't mean you shouldn't be able to change the default amount.
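
    For what it's worth, the cap being asked for already exists as hidden prefs, even if it never made it into the Options dialog. A user.js sketch, assuming the pref names of that era (the values are only examples):

        // In-memory content cache, in KB; -1 means size it automatically from RAM.
        user_pref("browser.cache.memory.enable", true);
        user_pref("browser.cache.memory.capacity", 16384);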
  • Re:is it time (Score:5, Insightful)

    by thePsychologist ( 1062886 ) on Thursday May 17, 2007 @05:07PM (#19169461) Journal
    Actually, perhaps it's time to totally rethink the internet. Browsers today are bloated partly because websites are bloated.

    The majority of websites could do with a simple and less cluttered layout like Google's, for instance. Compare it to Yahoo and you'll see that Yahoo has a bunch of "advanced features" like in-page tabs and whatnot. Lots of this extra junk you'll find around the web is JavaScript that chooses CSS based on the browser and that displays advertisements. Lots of it is just poor use of HTML, often from WYSIWYG programs. More features in the language means more junk on the website; more junk on the website means more junk in the browser.
  • by OffTheLip ( 636691 ) on Thursday May 17, 2007 @05:07PM (#19169475)
    To paraphrase an often-heard comment about EMACS way back when, "EMACS isn't an editor, it's a lifestyle". Hopefully Firefox isn't headed down the same path.
  • Re:Opera! (Score:2, Insightful)

    by Paradigm_Complex ( 968558 ) on Thursday May 17, 2007 @05:07PM (#19169481)
    I do. Other people can use it - fine - but if I can't see the source, I don't know what's in it and I'm not very trusting. For all I know Opera is grabbing and selling information such as my web history. I know what fx does with the passwords it stores - I can see the code. How do I know Opera doesn't use it to log into my gmail account? I can watch what's going in and out of my ethernet and wireless card, but even so Opera could be using some undocumented "feature" of a closed-source operating system to make sure I don't see it. I'm not trying to convert others to F/OSS too actively, but I'm pretty dedicated to the idea. Firefox still has a long way to go before it falls enough for me to seriously consider a closed source browser. Hopefully someone will fork fx and fix these issues - or if not, I can. Because, you know, it's open source.
  • by Anonymous Coward on Thursday May 17, 2007 @05:08PM (#19169493)
    You say you have two gigabytes of RAM like it means something. DDR? DDRII? 333? 400? 667? 800? Boosting technology on your motherboard? CPU to match?

    I looked into my RAM recently: I have two 1 GB sticks running in dual channel, and I know for a fact many people are getting less use out of 4 gigabytes of RAM. I'm just saying, it's a big number, but there is an even bigger amount of leeway.

    Back on topic, the RAM topic is a bad one because IMHO any browser should be able to still run very easily with 512 MB of RAM, and run okay on 256 MB - if it's set up correctly.
  • IE is bloated? (Score:4, Insightful)

    by ThinkFr33ly ( 902481 ) on Thursday May 17, 2007 @05:09PM (#19169505)
    There are many, many things you can criticize IE for... but being bloated doesn't really seem like one of them. If you RTFA, they compare the growing bloat not with IE, but with Mozilla.

    True, 3rd party add-ons for IE can bring it to a crawl, but that's not IE's fault. The same problem exists in any browser that supports extensibility via a plugin model.

    I use Firefox on XP because it's safer than IE, certainly not because it's less bloated. Firefox consistently uses far more RAM (I have several screen shots of Firefox using 1.5GB+ of RAM with *no* plugins enabled and just one tab open), dies a painful death due to poor integration with things like Flash (100% CPU Flash advertisements, anyone?), or simply just crashes.

    On Vista I use IE 7 w/Protected Mode. Why? Well, again, because it's safer. But it also has the benefit of returning me to the days when a browser didn't use 2x the RAM of Photoshop. Imagine that.
  • The Wrong Question (Score:5, Insightful)

    by Jeremy_Bee ( 1064620 ) on Thursday May 17, 2007 @05:09PM (#19169517)
    All over the web today there are stories about Firefox's (supposed) bloat, but no actual facts on whether it is or is not actually "bloated." Since "bloat," to most people, apparently means the state of a program having more features than is necessary, it's hard to see how the average user would ever be able to definitively answer this question. The question is probably better phrased as "Are you having major performance problems with Firefox 2.0?"

    I don't know how the file size (the other definition of "bloat") of a Firefox installation compares with other browsers, but it doesn't seem like an overly large file to download. It also seems to me that when I check my Firefox preferences it actually has a very basic, simple feature set similar to what's available in almost every other browser. If the feature set is roughly the same as other browsers, how can it rightly be called "bloated?"

    I think the problem with Firefox is one of performance, not "bloat" per se. I run Firefox on a Mac with only a single extension and a single theme. My computer is relatively new, the OS is up to date, it has a gig and a half of RAM and a fast video card. On this machine Firefox is as slow as molasses. It takes ages to start and ages to load a page. It also crashes (a lot!).

    I use Firefox because of AdBlocker and because, as bad as it is, it's still the best there is on the Mac right now. This will likely change in October when the new Safari comes out, so this summer's Firefox 3.0 release will have to be extremely, extremely good just to keep the same market share IMO.
  • OS Level Control? (Score:3, Insightful)

    by hattig ( 47930 ) on Thursday May 17, 2007 @05:10PM (#19169535) Journal
    Why can't the OS, when it sees that it is running out of memory, send a signal/message/henchman to applications and tell them that if they have the ability to give up some memory (i.e., caches, etc), then do so, to keep the system happy. There could be several levels of urgency in the request as well, like "yeah, dude, just thinking here, yeah, could you ease up a little on the memory, cheers!" through to "Sieg Heil! Deine Memory, SCHNELL!!".
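
    Joking aside, what's being described is a memory-pressure notification, and while there is no universal OS facility for it, the application side is easy to sketch. A minimal POSIX-flavoured Python sketch, assuming (purely hypothetically) that the OS or an admin pokes the process with SIGUSR1 when memory gets tight:

        import signal

        # Anything rebuildable that the app holds "just in case", e.g. a page cache.
        discardable_cache = {}

        def ease_up_on_memory(signum, frame):
            # The polite "could you ease up a little on the memory" request:
            # drop whatever can be recomputed or re-fetched later.
            dropped = len(discardable_cache)
            discardable_cache.clear()
            print(f"memory-pressure signal received, dropped {dropped} cached items")

        # Hypothetical convention: SIGUSR1 means "please give some memory back".
        # Different signals (or payloads) could encode the urgency levels above.
        signal.signal(signal.SIGUSR1, ease_up_on_memory)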
  • Comment removed (Score:5, Insightful)

    by account_deleted ( 4530225 ) on Thursday May 17, 2007 @05:11PM (#19169571)
    Comment removed based on user account deletion
  • Re:Opera! (Score:5, Insightful)

    by Timesprout ( 579035 ) on Thursday May 17, 2007 @05:13PM (#19169597)
    Yes, some of us are prepared to use the best tool for the job rather than blindly follow FOSS.
  • Re:is it time (Score:5, Insightful)

    by Anonymous Coward on Thursday May 17, 2007 @05:17PM (#19169677)
    As someone who lives out in the sticks, and pays $100/month for a 1.5MBit 802.11 connection, I say no. Keep the web as plain old HTML. Limit flash (And other plugins) to things like embedded video, NOT AS THE ACTUAL WEBPAGE.

    There's still a lot of people out there who are limited to dialup, satellite, or some other jerry-rigged internet connection.
  • by Movi ( 1005625 ) on Thursday May 17, 2007 @05:20PM (#19169759)
    256MB? Holy crap! I don't understand why people think we need at least 512MB to run anything decently! In 2001 I was running on 64MB, and I can remember I could run a web browser (granted it was IE, but nevertheless!), Winamp and some other stuff. And people _expected_ it to run smoothly with only 64MB! I know it's been 6 years since then, Moore's law and such, but I still wonder - why these insane hardware requirements? Notice that Opera for Symbian must run with 8MB of RAM, and it has to share. And there's no virtual RAM, so swapping is not an option. This of course doesn't count Flash. Right now both of my boxes have 1GB of RAM, and I don't plan on upgrading that number anytime soon - I don't play games (consoles are for that, and my GameCube has about 48MB combined too!), I don't run VMs and I don't even have a swap partition - it never got touched anyway.
  • Re:Opera! (Score:3, Insightful)

    by OrangeSpyderMan ( 589635 ) on Thursday May 17, 2007 @05:20PM (#19169763)
    I do. Other people can use it - fine - but if I can't see the source, I don't know what's in it and I'm not very trusting.

    I call bullsh1t on this. You've reviewed all the source of all the programs you use? Stop this argument, please - it's not a real reason to choose one over the other unless you're actually willing to go through the source of every one of them. I doubt you have the time, and if you do, you should do something better with it :)
  • by markov_chain ( 202465 ) on Thursday May 17, 2007 @05:24PM (#19169823)
    It seems that there is a magical responsiveness threshold which humans tolerate, and as the processing power and memory sizes grow, the applications follow along, staying just below that threshold. Usually the reasons are increasing amounts of shared libraries and scripting languages, which allow us to build more application per unit programmer time. We get more features and modern applications, at the expense of a sluggish environment.

    This performance penalty is perhaps hard to notice. The easiest way to experience it is to run some old applications; they absolutely scream on modern hardware, to the point that the instant response becomes almost worth the loss of extra features. This is probably why things like xfce prosper.

  • Bad excuse (Score:2, Insightful)

    by mirshafie ( 1029876 ) on Thursday May 17, 2007 @05:27PM (#19169879)
    Every time the subject of Firefox's sluggishness and memory-slaughtering habits comes up, someone tries to excuse it with the fact that a few MINOR features have been added over the years. What were the last big news items for Firefox? Phishing filter, better search management, incorporated RSS. The truth is that Firefox has had memory and speed problems since the 1.x versions. At the very least, nobody can deny it for the 1.5+ versions. At the same time, other projects seem to be able to add features without their browsers eating such inexcusable amounts of RAM and virtual memory. Konqueror and Opera both do LOADS better, and both have all the functionality that you should expect from a browser (that is to say, much more than Firefox has out of the box). Actually it's hard for me to believe that Firefox is so popular among tech people. Whenever I'm at a Windows computer, I naturally use IE7, since it beats Firefox with its little piggy eye closed.
  • rethink the OS (Score:4, Insightful)

    by goombah99 ( 560566 ) on Thursday May 17, 2007 @05:27PM (#19169881)
    Is there not some way that operating systems can manage caches for applications such that certain datasets can be marked for opportunistic caching? That is, mark it as "keep a copy of this in any free space, but you can discard it if real memory is needed."
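
    Something close to this already exists at the virtual-memory level: a process can tell the kernel that a region is nice-to-have rather than needed via madvise(). A small Python sketch of the idea -- the cache contents here are made up, and MADV_FREE is only exposed on some platforms, hence the guards:

        import mmap

        CACHE_BYTES = 4 * 1024 * 1024
        cache = mmap.mmap(-1, CACHE_BYTES)   # anonymous memory used as a throwaway cache
        cache[:18] = b"rebuildable pixels"   # pretend this is an expensively rendered page

        # MADV_FREE: "reclaim these pages only if you actually need the memory".
        # If the kernel does reclaim them, later reads come back zeroed, so the
        # application must be able to detect that and rebuild the data.
        if hasattr(cache, "madvise") and hasattr(mmap, "MADV_FREE"):
            cache.madvise(mmap.MADV_FREE)
        elif hasattr(cache, "madvise") and hasattr(mmap, "MADV_DONTNEED"):
            cache.madvise(mmap.MADV_DONTNEED)   # cruder: hand the pages back right away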
  • Re:Opera! (Score:2, Insightful)

    by Mad Merlin ( 837387 ) on Thursday May 17, 2007 @05:28PM (#19169915) Homepage

    Proprietary software is not practical, especially in the long term. The best tool for the job must also be practical.

  • Re:Opera! (Score:1, Insightful)

    by Anonymous Coward on Thursday May 17, 2007 @05:28PM (#19169931)
    Yeah, that's fair. Compare Opera, four years ago, to Firefox, today.

    I guess the trouble with comparing Opera to Firefox four years ago is that Firefox didn't exist four years ago (not as Firefox, anyway).

    Firefox on Linux crashes occasionally when I try to leave pages with an embedded Google Video object or multiple embedded YouTube videos. It freezes and I have to kill it. The most recent time this happened was yesterday.
  • by Anonymous Coward on Thursday May 17, 2007 @05:38PM (#19170117)
    Go look at the source code to Gecko, the rendering engine behind Firefox, Seamonkey, Thunderbird and other projects. In short, it's a mess.

    Part of the problem is the foolish complexity of it. Their whole XPCOM idea sounds nice in theory. But then you actually go to implement it in C++, and it becomes a pile of crap. Soon enough, difficult tasks start to become hard, the damn near impossible tasks can't be done, and nobody really has a good idea of what large portions of the codebase actually do. That's not the way to create an efficient rendering engine. You'll end up with memory leaks galore and excessive CPU consumption, just as we've witnessed with Firefox.

    Although it's unlikely to happen now, the best thing for them to have done would have been to throw out most of the code released by Netscape, rather than rewriting a lot of it (at the same low-quality level) in the following years. Then they could have re-implemented it using a natively compiled implementation of Standard ML. One benefit of this would have been an elimination of the memory leaks that we hear too much about today, thanks to the garbage collection of SML. Additionally, functional languages are well suited to parsing (i.e., of HTML, XHTML, etc.) and language implementation (i.e., JavaScript), more so than C++.
  • by Anonymous Coward on Thursday May 17, 2007 @05:38PM (#19170127)
    if I can't see the source, I don't know what's in it and I'm not very trusting.

    Most F/OSS users just don't want to pay for software, and that's OK. (BTW, Opera is 100% free and has been for years now.)

    But please don't try to sell me the old canard that you scrutinize every line of source for all the F/OSS that you use - I'm not buying. My guess is that there is a FAR greater likelihood of some clever programmer slipping something by you in one of your beloved plug-ins than of Opera ever doing anything untoward (a for-profit company has a lot more to lose from bad PR on its primary product than some reprobate teenager in Bakersfield or Bratislava). A little degree of paranoia is healthy only if directed correctly.

    For me, Opera (although not perfect) stands head and shoulders above IE and several inches above FF in terms of performance, security, utility and functionality.
  • by Anonymous Coward on Thursday May 17, 2007 @05:45PM (#19170281)
    Face it, folks - Phoenix was about slim, slim, slim. Firefox is NOT - it's about best-of-breed. For all the talk of bloat, take one look at things like the default toolbar layout (simple and streamlined), preferences (compare it to Mozilla or IE), and menu layout (again, dare to compare). Internal things like memory management and feature support may have increased, but the user interface has remained streamlined and efficient. THAT is what is important to the vast majority of users.

    As to the parent post, let's see now:

    RSS Support:
    • Web Feeds (RSS)
    • Live Titles
    • Live Bookmarks

    I could easily see removing RSS support. Firefox's implementation is nothing an extension couldn't do, and do much better. It's a joke for handling more than a handful of feeds, and stifles development of third-party extensions. Gee, and we used to complain about competing against built-in programs...

    Security:
    • Pop-up Blocker
    • Phishing Protection
    • Automated Update

    Can you honestly say a browser should be shipped without these, or even an option to not install them? Especially for the popup blocker - are you insane, or have you simply forgotten what the popup-infested web was like? Phishing protection is unobtrusive and useful, as is auto-update.

    Miscellaneous:
    • Spell Checking
    • Integrated Search
    • Search Suggestions
    • Session Restore
    • Accessibility

    Integrated search was one of the highlights of Mozilla ages ago, and is now a standard feature in every single browser. Firefox/Mozilla did a particularly good job by adopting an existing open format (from Apple's Sherlock) rather than reinventing the wheel. Search suggestions are the latest evolution of that (primarily thanks to Google Suggestions, if I'm not mistaken). Spell check is marginal - many operating systems offer their own - but I don't see how a third-party extension could improve upon it. Accessibility is just critical for those who need it. Session Restore I'm torn on, as many extensions handled it, but not necessarily well. I see that as the Firefox team deciding to take all of the lessons learned from the third parties, and do it right (much like Apple did with iTunes 1.0).

    Bloat is only a problem if it hinders program development, maintenance, execution, or usability. The examples given here don't generally meet those criteria. Most of the features here are simple, self-contained, unobtrusive, and likely have low code and memory footprints.
  • by mcrbids ( 148650 ) on Thursday May 17, 2007 @05:53PM (#19170429) Journal
    I don't see any reason why all of those things are integrated and not separate add-ons. And that list gets bigger with each new version.

    For a seminal work that explains this concept to the intellectually unenlightened: Bloatware and the 80/20 myth. [joelonsoftware.com] It's not that bloated, slow software is preferred, exactly, it's simply that so-called "bloat" features are actually an advantage.

    I personally like that FF has automated updates. I noticed the spell-checker after an update, and think it's kinda nice, although my spelling is generally pretty good. The popup blocker is quite nice. The other features I just don't care about, and I've never noticed any particular performance decrease on my dual-core, 2 GB RAM laptop. Thus, for me, this "bloat" is something I either like or don't mind.

    Other people may think an RSS reader is DA SHIZNIT! Some people lean hard on the anti-phishing features. And they will find bloat just as tasteful as I do. Go ahead - read the article I linked to, and then think about it. Of the functionality, what 20% do you want? And, is that the same 20% that everybody else wants? There's the reason for your bloat.

    Want just a browser and only a browser? It's open source code, dude. You are welcome to create a fork and do whatever you like with it.
  • Re:Opera! (Score:3, Insightful)

    by JordanL ( 886154 ) <jordan,ledoux&gmail,com> on Thursday May 17, 2007 @05:55PM (#19170465) Homepage
    Then the entire paradigm is based on the (flawed) assumption that a general set of eyes is inherently better than the hired set of eyes that any given closed source project employs.

    The issue you have is that you don't trust the set of eyes, but the process is fundamentally the same if you do not review the source code yourself. You are trusting someone else to assure you nothing is wrong, and confusing motive with action.

    The people who try to claim that FOSS is better than closed source because you can't be sure the evil corporate gremlins aren't stealing your soul are grasping at straws. They don't understand the fundamental benefits of either closed source or FOSS, and IMHO they are doing a disservice to OSS by promoting something that is neither a reasonable benefit of OSS nor an inherent difference it has.
  • Re:Opera! (Score:2, Insightful)

    by OrangeSpyderMan ( 589635 ) on Thursday May 17, 2007 @05:58PM (#19170543)
    We don't have to review the source of all the programs we use to gain the "transparency" benefit of Open Source or Free Software. The idea is "many eyes", not "my eyes".

    Oh so Open Source is all about relying on someone else to review the code for you? Please tell me how this is different from a software house paying QA guys to do it, apart from the fact they're paid professionals :)

    I've said it once, and I'll say it again, you're trusting someone you don't know to check it's ok, and if that's the case, there is NO difference whether it's Open Source or not - so stop using it as an argument. There are many more arguments a lot better than this for using FOSS, it simply doesn't need BS like this.
  • Re:Opera! (Score:3, Insightful)

    by shish ( 588640 ) on Thursday May 17, 2007 @05:59PM (#19170571) Homepage

    You can save your specious arguments for an audience that will buy them. We don't have to review the source of all the programs we use to gain the "transparency" benefit of Open Source or Free Software. The idea is "many eyes", not "my eyes".

    Seeing as most people who look at firefox's source code go blind soon after, I'm not buying this.

    Yes, hundreds of thousands of people can in theory look at the source; but then all of them think that someone else will do it, and nobody actually *does*. (Not counting full time mozilla employees, since MS has full time IE employees, and they don't count towards the "many eyes" effect)

  • by Anonymous Coward on Thursday May 17, 2007 @06:08PM (#19170707)
    There are a *lot* of printing bugs that cause the browser to get stuck in an infinite loop. Fix those! Please! Please!

    I work on a web application where people print a lot and they cause the browser to crash all the time. You have to go to the task manager to kill the firefox process.

    Don't add another feature until that is fixed, please.
  • Re:-1 troll (Score:2, Insightful)

    by moderatorrater ( 1095745 ) on Thursday May 17, 2007 @06:10PM (#19170747)
    It stores the rendered page, not the HTML, which is why it takes so much memory. 500 KB is small for a rendered page.
  • All software... (Score:3, Insightful)

    by Rix ( 54095 ) on Thursday May 17, 2007 @06:13PM (#19170807)
    Expands until it can read your mail.
  • Yes, it was. (Score:3, Insightful)

    by SanityInAnarchy ( 655584 ) <ninja@slaphack.com> on Thursday May 17, 2007 @06:24PM (#19170997) Journal
    I was there.

    In the early, early days of Firefox, Internet Explorer was pretty slow and bloated. Most of its snappiness came from being "part of the OS". (Or I was a deluded fanboy then, maybe...)

    So, on Windows, you had the choice between Netscape (which was big and bloated), or Mozilla if you were smart (which was also big and bloated), or IE (which was bloated, for a browser). Mozilla was not so terribly bloated, except for the fact that it was a browser/mail/news/irc/dev platform/kitchen sink, and not just a browser. So, if you needed web and email, Mozilla was fine, but if you just needed web, IE was faster.

    At the time, I believe Opera was somewhat buggy, and still cost money. I am not sure whether Konqueror existed or not; I only fairly recently became aware of KDE as being better than GNOME in just about every way (at least, as a desktop environment).

    So, the Phoenix project was started. I used that from maybe 0.6, and it was good. A bit unstable, yes, but it would come back up in 2 seconds. And on the machines of the time, that was pretty damned impressive. It only seemed to be getting smaller and lighter. If anything was slow/buggy about it, it was that Phoenix required the full Mozilla sources, but the existence of Phoenix was actually cleaning up quite a bit of Mozilla.

    And yeah -- Phoenix vs Mozilla was amazingly dramatic. Considering that Windows at the time sucked so much (at least for me) that I'd have used Linux even if it meant using Netscape 4.0, Mozilla was kind of OK. But Phoenix just kicked ass.

    Now it's Firefox, though, which has sort of just become a word, and lost its meaning. I know why they changed the name, but still, Phoenix was cool -- the beast that was Mozilla (stomping on IE) had died, but from its ashes, Phoenix rose and became Firebird, something that could fly on its own, with no concern for IE at all...

    So, where'd all that go?

    Well, some of it's memory leaks. Some of it's almost by design -- note that Firefox uses Gecko for EVERYTHING. Firefox doesn't just embed Gecko, it IS a Gecko app. The menus, config options, the entire UI is coded in XUL, which is basically XML + JavaScript, with some C++ libraries. (Correct me if I'm wrong here.) Firefox itself was an AJAX app before AJAX even had a name. (And so was Mozilla.)

    That's another part of it, but it's not really the whole picture.

    Extensions, I think, are what kills it. The more extensions you add, the more likely you are to break something. At the same time, extensions are what sold it. There's still two that I miss dearly, now that I mostly use Konqueror -- Adblock (the real Adblock is so much better than Konqueror's adblock) and UnPlug (lets you download anything normally viewed via browser plugins, including YouTube videos as FLV files). For a while, you could even get Thunderbird as an extension -- it was called something else at the time, I think -- and you can still get Sunbird as a Firefox or Thunderbird extension.

    Extensions are the killer feature of Firefox, and they are also what kills Firefox performance.

    I think it could have been a bit better. I know part of it is bad/buggy extensions, but I imagine part of it is also that extensions are written in XUL/JavaScript. I mean, yes, that enables them -- it's easy to transition from web developer to Firefox extension hacker -- but I do wonder, occasionally, if we could do better, starting from scratch. Konqueror is right out (though we might borrow KHTML or Gecko for a while), but maybe something written in, say, Python, or LISP, or some good language with a really solid design? Maybe a killer app for its platform, so that people start making Python faster to make their browser faster? (If you think Python is fast enough, you're deluded -- why does the GIL still exist in these days of multicore processors?)
  • by evilviper ( 135110 ) on Thursday May 17, 2007 @06:27PM (#19171047) Journal

    Firefox was only leaner than Mozilla back when it was called Phoenix and had only the bare minimum UI necessary to be a web browser.

    I did several benchmarks at the time, and even way back then it was only nominally faster or lighter on RAM. The myth of Firefox being lean and fast is complete marketing.

    If Mozilla had just made a theme that blended in to the OS (Classic doesn't do a good enough job of it) and put a link on the download page to an installer that only had the browser included, there never would have been a need for Firefox.

    IMHO, Firefox has only ever had two things going for it beyond Mozilla and Seamonkey: a more customizable interface, and per-user extensions/add-ons. And that's traded off against things like a horrible user-preferences page that's only getting worse with time, lack of an editor, etc., etc.
  • by ASBands ( 1087159 ) on Thursday May 17, 2007 @06:33PM (#19171163) Homepage

    I once attempted to create a page-rendering engine, starting with XHTML. Eventually, I got a decent-working rendering engine. Unfortunately, anytime there was an error (even a minuscule one), my engine would completely fail. I can't even begin to imagine the hell Gecko goes through to render a site like MySpace [w3.org]. I've often thought about a better way to implement a rendering engine, but most involve fixing the web developer's crappy code before attempting to render it, which is not possible in most cases. In C++, you can't compile with an error. Perhaps development software that isn't Notepad (my software of choice) should add a validation service in the same way Visual Studio 2005 does.

    The internet: We have the tools to rebuild it, but we don't want to spend a lot of money.
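
    The strict-versus-forgiving tradeoff described above is easy to demonstrate with two stock parsers. A small Python illustration (standard library only, nothing to do with Gecko): an XML parser refuses malformed markup outright, while a tag-soup HTML parser extracts what it can and carries on:

        import xml.etree.ElementTree as ET
        from html.parser import HTMLParser

        broken = "<p>unclosed paragraph<p>another one"   # no closing tags anywhere

        # The draconian approach: one error and you get nothing at all.
        try:
            ET.fromstring(broken)
        except ET.ParseError as err:
            print("strict XML parser refused to render:", err)

        # The tag-soup approach: make whatever sense of it you can.
        class TagCounter(HTMLParser):
            def __init__(self):
                super().__init__()
                self.start_tags = 0

            def handle_starttag(self, tag, attrs):
                self.start_tags += 1

        soup = TagCounter()
        soup.feed(broken)
        print("forgiving HTML parser still found", soup.start_tags, "start tags")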

  • by SEMW ( 967629 ) on Thursday May 17, 2007 @06:34PM (#19171171)
    Why does slimming Firefox down necessarily mean removing features? Opera can do pretty much all of the things you quoted and much, much more besides (email client, BitTorrent client, customizable to an extent that would need about 15 different FF extensions to emulate, etc.) -- and it still manages to be slimmer than Firefox -- a smaller download (4.7 vs 5.7MB), faster to start, more responsive, a smaller memory footprint, etc.
  • Re:Opera! (Score:3, Insightful)

    by Bogtha ( 906264 ) on Thursday May 17, 2007 @06:37PM (#19171235)

    Sometimes the best tool for the job depends on how far into the future you are looking. Free Software advocates are more pragmatic than you think. You just need to stop thinking about what works today and start wondering about what will work tomorrow. Mark Pilgrim wrote a couple [diveintomark.org] of decent articles [diveintomark.org] about the kinds of problems proprietary software can cause.

    Now I don't use Opera for anything other than testing, so I don't know what kinds of risks that particular software exposes you to. What I do know is that staying in control of your computer is a decent policy to stick to, and Opera would have to be significantly better than Firefox or Konqueror for me to use it. That's not being "blind", as you put it, it's being sensible in exercising caution.

  • by Anonymous Coward on Thursday May 17, 2007 @06:39PM (#19171261)
    Your renderer did what most renderers years ago should have done: failed outright upon errors. That would have been essentially the same as a C++ compiler not emitting anything upon encountering a syntax error. Unfortunately, the early browser developers, mainly at Netscape and Microsoft, decided to try to handle such shit input. And so today we have crap like MySpace.

  • Re:well (Score:2, Insightful)

    by DittoBox ( 978894 ) on Thursday May 17, 2007 @07:01PM (#19171671) Homepage
    Yes because my mom, my dad, my uncle, a dozen or so of my computer illiterate co-workers and my grandma can do it.

    Making the assumption that "my users are geeky enough to overcome my development laziness; I'll just make them change a bunch of caching settings once it's shipped" is just as stupid. This is why a lot of (FL)OSS doesn't get off the ground: arrogant developers who expect way too much from their users.

    I love (FL)OSS just as much as the next slashdotter, but if you're going to make your software usable only via arcane knowledge, don't whine and complain that 99% of the populace hasn't caught on yet. They won't. I don't mind software that's written for the dummy, as long as it can be changed so that non-dummies can use it too.

    But please don't expect your user-base to be 99% non-dummies while you market it to a world full of dummies.
  • by aichpvee ( 631243 ) on Thursday May 17, 2007 @07:17PM (#19171837) Journal
    Too little, too late? And who's going to come along and sink their ship in that year and a half? If Opera were going to do it, they'd have done it by now. Maybe Konqueror could be a contender if it goes multiplatform (anywhere that runs KDE, plus Windows and maybe a native Mac port) with KDE4/Qt4.

    Other than that there really isn't anyone to take their place. On Windows I highly doubt that you'll see many converts going back to IE, even if Microsoft somehow makes it stop sucking with IE8, which I guarantee won't happen anyway.
  • by Anonymous Coward on Thursday May 17, 2007 @07:24PM (#19171951)
    "That made this old coder wanna cry. My first Mac had only 512 kilobytes (kilo - not mega) but that was enough for me to write GUI applications with.

    Kids these days don't know how to write code."

    Uh, huh. And "back in the day" could you run a VM, host two operating systems, and run a web browser? All in 512K? Didn't think so. Old folks, always living in the past.
  • Re:Opera! (Score:3, Insightful)

    by bfields ( 66644 ) on Thursday May 17, 2007 @07:24PM (#19171961) Homepage

    Then what's the difference? Closed source software has "many eyes," too. They just happen to be paid by someone.

    Right. The same "someone", usually. And a "someone" whose interests are not those of users.

    This is the same reason we usually trust a result published in a peer-reviewed journal more than one reported by a corporation with only internal peer review, even when the resources available for the internal peer review may be excellent.

    It's the difference between "here's our results, here's how we got them, feel free to try yourself and see if you come to the same conclusions" and "here's our results, our best people checked them, honest!"

  • by Anonymous Coward on Thursday May 17, 2007 @08:03PM (#19172441)
    Hopefully Firefox isn't headed down the same path.

    What!? You don't hope that Firefox will be a super-good browser in 30 years?

    I'm still editing DocBook documents (well, really "books" actually) using emacs + nxml (XML mode supporting realtime RelaxNG validation). I also happen to write .txt files, shell scripts, etc. under emacs. The only time I'm not under emacs I'm under IntelliJ IDEA (best IDE ever made for *any* language on *any* platform). Of course I'm using IntelliJ IDEA with a (custom modified) emacs plugin.

    Oh and btw, maybe emacs was a pig 12 years ago but nowadays on my Core 2 Duo with 4 GB of ram I can't really notice that ;)

    What's your point about emacs? Isn't it a Good Thing (TM) that a bunch of real hackers, more than 30 years after it was created, can still beat the crap out of any other IDE for most tasks? And once I find something way better, I switch... like, say, to IntelliJ IDEA. So don't call me an emacs fanboy. I'm simply being realistic about what is more productive and what is not.

    Of course, now that /. has become the base of clueless MS fanboys who think that Visual Studio is all the shit, it's easy to get modded +4 Insightful with a completely off-base comment like yours.
  • by eugene ts wong ( 231154 ) on Friday May 18, 2007 @12:10AM (#19174603) Homepage Journal

    On the other hand, the fact that those early versions of Mosaic, Netscape, IE, etc. would do something with broken code instead of refusing to display it meant that the barriers to entry were a lot lower. It vastly increased the pool of people who could create web pages, and the talent pool.
    That's part of the problem, not a benefit of the choice we made. We have PDF and HTML. Both do 2 different jobs poorly. Firms don't hire Aunt Tilley to drag and drop a brochure. Why do they insist on dragging and dropping a web site? It's absurd. Lowering the bar doesn't improve the pool. Do you drag and drop your way to a better Linux kernel?

    A raised bar doesn't automatically equal programmers making web sites. A quality web site is made by a technical person who understands text and design.
  • by Overly Critical Guy ( 663429 ) on Friday May 18, 2007 @12:35AM (#19174793)
    A lot of people reminisce without thinking through the differences between then and now. A lot of us remember the days of 8MB, 16MB, and 64MB of RAM being enough for our needs, but we now take the following for granted:

    1.) New rendering paradigms in the operating system that require more resources, like resolution independence, vector graphics, and hardware acceleration of window textures in Quartz and Avalon.
    2.) In the same vein, screen resolutions and color depths have increased.
    3.) Sound cards are operating at higher frequency and bit rates, and multiple speaker systems are not uncommon.
    4.) Today's audio and video codecs are higher quality but more resource-intensive.
    5.) Convenience services like metadata file indexing, spellchecking, garbage collection, automatic network configuration, automatically updating RSS feeds, background system snapshots (e.g., System Restore), automatic file defragmentation ala Mac OS X, and more.
    6.) Today, I bet you commonly have 20 or 30 browser tabs open at times, maybe more. Five years ago, you might have had only five or ten open. Before that, you only browsed with one or two windows open at a time. And websites back then used lower quality JPEGs and GIFs, while today we have high-resolution, high-quality PNGs and JPEGs and high-quality video clips running through Flash and Quicktime.

    We have a lot more things running at once that all add up, and to have all these things running smoothly enough for a responsive user interface, it takes a lot of resources allocating precious cycles at every opportunity. Your 48MB GameCube doesn't have to run a general purpose operating system, and its specs are set in stone so that developers can specifically optimize for it to extreme degrees that desktop applications relying on high-level APIs and cross-platform compatibility can't afford.
  • by jgrahn ( 181062 ) on Friday May 18, 2007 @01:34AM (#19175145)

    I agree -- halfway. Had early web browsers been strict about errors, we wouldn't have so much broken code out there, and cross-browser compatibility would be solely a matter of which features are supported -- not which set of error-correcting assumptions you expect.

    Right. Well put.

    On the other hand, the fact that those early versions of Mosaic, Netscape, IE, etc. would do something with broken code instead of refusing to display it meant that the barriers to entry were a lot lower. It vastly increased the pool of people who could create web pages, and the talent pool. Sure, some people have both artistic talent and programming ability, or have the resources to team up. But can you imagine a web built solely by programmers?

    Nonsense. If you can write a buggy HTML document, you can also write a compliant one. You don't suddenly need a bloody programmer! Especially not if the browsers themselves (or external validators) had given reasonably helpful error messages, which they would have.

  • Re:Opera! (Score:3, Insightful)

    by TheLink ( 130905 ) on Friday May 18, 2007 @04:13AM (#19175899) Journal
    Many eyes?

    Haha, so 6 billion monkeys will be able to spot a security problem and submit a bug report? Remember it has to be done before the software is exploited (whether secretly or not).

    Sure, many users can spot a UI problem, but only a very few people can and will spot security problems, and fewer will bother to actually report them to the right channels. Plenty of evidence for that - gaping holes in open source that were not spotted and fixed till years later.

    There's too much crap code out there for everyone to look at it. Hackers will just pick a target, find exploits and exploit them; when they run out, they pick another. And only a few people actually go around fixing the bugs (or writing good code in the first place).
  • by Anonymous Brave Guy ( 457657 ) on Friday May 18, 2007 @08:20AM (#19177107)

    In a moment I'll talk about my views on rewriting large code bases, but first I'll say that I'm glad I wasn't the only one who was with the GP poster up until the SML advocacy, and then disagreed. Even given the neat way that functional languages tend to model parsing problems, web browsers do a lot more than parse HTML and CSS files. In fact, I'd go as far as to say that this is the easiest problem they solve. Systematically resolving the layout rules in arbitrarily complex cases is a somewhat difficult problem, given the way those rules are expressed in CSS. And of course, web pages are no longer the static things they used to be: today's browsers need to cope with scripts moving the goalposts arbitrarily, maintaining the integrity of the display as much as possible during lengthy downloads of large pages or after AJAXified updates, etc. etc. It's far from clear that a language like SML offers better support for these naturally concurrent operations than the many alternatives.

    I do disagree with the parent post on one fundamental point, though:

    In general, throwing out an existing code base is rarely a good idea. Practically speaking there's rarely a code base so bad that no part of it can be salvaged. Even when things are rewritten, it's almost always the overall structure that's just refactored by a lot of copy pasting.

    I'll see your reuse dogma, and raise you my "plan to throw one away" dogma. :-)

    Actually, I don't cite this as some sort of dogmatic adherence to ConceptsTryingToSoundMoreCleverThanTheyAre at all. Rather, I happen to agree with the principle based on practical experience. In general, software design is difficult, and few people are good at it. Even those who are rarely have the good fortune to know exactly what their design will be called upon to do a few years later, and will inevitably allow more flexibility (and commensurate overhead) in some places than is really needed, while making some things unnecessarily strict and thus making later changes more difficult than they might have been.

    It's been my experience that in long-term projects, far too many managers aren't willing to throw out a whole module, subsystem, or even product, because of popular wisdom that anything they replace it with will just have bugs of its own. I believe this is a mistake because, again speaking only from my own experience, a high proportion of bugs originate in special or boundary cases. According to my reasoning above, a new project built from scratch with no prior experience will rarely get an overall design that automatically avoids these completely. Discipline is rarely good enough on software projects to allow for this and ensure that new requirements are integrated into a clean overall design rather than bolted on; indeed, in a commercial environment, this may not be realistic given short term deadlines and the typical management and marketing pressures. However, over time, such bolted-on special cases will tend to build up. They start to interact, they don't always get properly documented, and new people on the project team either don't know about them or at best don't know all the original reasoning behind them, making safe maintenance difficult.

    Sometimes, this problem is manageable, particularly if your project leadership consistently take a long-term view and give maintenance and testing the priority they deserve. But usually, IME, the problem reaches a certain critical mass where the costs of ongoing development of a code base full of dubiously documented special cases outweigh the costs of stopping to clean things up.

    As an additional, very practical concern, tools and programming techniques are always developing. Over the sort of timescales we're talking about here, it's entirely possible that more effective tools will have been created, or more effective techniques discovered, that could solve the underlying problem much more effectively in a different way.

    Thus, s

  • by Anonymous Brave Guy ( 457657 ) on Friday May 18, 2007 @08:26AM (#19177177)

    Your renderer did what most renderers years ago should have done: failed outright upon errors. That would have been essentially the same as a C++ compiler not emitting anything upon encountering a syntax error.

    Which would make perfect sense, except that the person running the C++ compiler probably wrote the C++ code they're putting through it, or at least has direct access to it so if something doesn't work they can fix it. For how many of the web pages you visit regularly did you write the HTML and CSS?

    Unfortunately, the early browser developers, mainly at Netscape and Microsoft, decided to try to handle such shit input.

    Following the established user interface principle that when things go wrong, you don't make it the user's fault.

    And so today we have crap like MySpace.

    Where "crap" presumably means a hugely popular service used regularly by a bazillion people?

    Technical details and web standards and browser workarounds and so on are just means to an end. That end is getting web sites that people want to use onto their computers so they can use them. The means matter exactly up to the point that they help to do this, and no further.

  • by arth1 ( 260657 ) on Friday May 18, 2007 @09:05AM (#19177511) Homepage Journal
    Firefox started out with the goal to be leaner. This goal was not reached.

    Before you people mod me down for stating this, and before you mod the Firefox apologist up, please do a comparison between any two concurrent release versions of Firefox and plain Seamonkey. The Firefox version has always had a bigger footprint than Seamonkey. Yes, really. Try it, dammit!

    And also keep in mind that Seamonkey builds on the Gecko engine that Firefox uses, and not, like some people seem to think, the Mozilla codebase with proprietary code going all the way back to Mosaic.
    The big difference is that Seamonkey follows the Mozilla suite paradigm of separating out the major pieces and allowing them to be installed or not as per the user's preference, while Firefox became an "Everything but the kitchen sink" project, where "kitchen sink" equals e-mail. This despite the intentions to be lean. Things included with Firefox have been stripped from Seamonkey, because if a user wants to install "Browser only", that's what the user should get -- not fifty different built-in "helper" apps that may or may not assist with certain types of browsing.

    Both are great browsers, but they are directed toward different audiences. If you want the leaner version, try a Seamonkey "browser only" install before assuming that it's going to be big and bloaty. You may be in for a surprise.

    --
    *Art
  • That's part of the problem, not a benefit of the choice we made.

    I disagree -- both with you and the post you replied to. It should be classified neither as a problem nor as a benefit. It's a consequence, and with it come both good and bad things. The bad is that we have so many horribly coded web sites. The good is that we have much greater interest in the internet today than raising the bar would have allowed. Interest in the internet was a prerequisite to the development of e-commerce. I don't think the pool of academics and technology enthusiasts would have been sufficient to entice Amazon, eBay and the like to set up shop.

    ... Firms don't hire Aunt Tilley to drag and drop a brochure. Why do they insist on dragging and dropping a web site? It's absurd. Lowering the bar doesn't improve the pool. Do you drag and drop your way to a better Linux kernel?

    You have cited capabilities required for business and technology. Aunt Tilley isn't interested in making brochures for a firm, nor is she interested in developing OS kernels. She wants to put together a small web site with pictures of friends and family, and maybe a blog area where she and her friends can talk about her adventures with her 14 cats. In order to get that, she has to pay for an internet connection, and her friends have to pay for one too. This greatly increases the number of subscribers, which ultimately increases competition (in the ISP market) and lowers costs. Do you remember how dial-up used to cost $30 - $40? Aunt Tilley and her friends are the ones who created a market big enough for competition to drive the costs down. Furthermore, once online, Aunt Tilley's friends stumbled across some of the experimental online shopping sites and started the uptake of e-commerce. If you had told Aunt Tilley that she had to use a text editor to develop her website and had to make sure each and every tag was valid and closed properly, do you think she would have been persistent enough to do it anyway? Not likely.


    Now, having said all of that -- it's perfectly reasonable to expect any web authoring tool to generate compliant code (ahem, Microsoft???), and it's also reasonable to expect commercial and large social sites to at least run their code through a validator.

  • Nonsense. If you can write a buggy HTML document, you can also write a compliant one.

    Sorry, but you overestimate the ability of much of the population. Many people building web pages do not fully understand what they are doing. They copy and paste other code, and when it looks right to them in Internet Explorer, they feel they are done. They don't know their HTML code is buggy, and they wouldn't be able to fix it if they did. This type of person isn't stupid. They just do not have the interest or inclination for technology that you have. I'm sure they have other skills that would amaze you.

"Protozoa are small, and bacteria are small, but viruses are smaller than the both put together."

Working...