
Comment Re:I've never seen a 32 bit Vista, 7, or 8 install (Score 1) 209

> For the Mozilla team to say there will "never" be a
> 64-bit build for Windows

Which is something no one at Mozilla ever said. But don't bother reading what they actually said; just read the lies lazy reporters spouted instead.

What Benjamin said is that there are no plans to ship a final 64-bit product in the next several months.

Comment Re:IonMonkey, JagerMonkey, TraceMonkey, SpiderMonk (Score 2) 182

The problem is that as runtimes evolve, the compiled format changes. Furthermore, the end result of the compilation depends on the exact processor the user has and, at least in SpiderMonkey, on things like the location of the Window object in memory.

Not only that, but the final compiled version is unsafe machine code, so a browser couldn't trust a web page to provide it anyway.

So pages wouldn't be able to provide a final compiled version no matter what. They might be able to provide bytecode of some sort, but again the bytecode format browsers use is not fixed (assuming one exists at all; V8 doesn't have a bytecode), and the compilation of JS to bytecode would have to be replaced by some sort of bytecode verifier for security reasons. So there may not even be much of a performance win from the switch.
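To make the pointer problem concrete, here's a minimal C++ sketch (hypothetical, not actual SpiderMonkey code; the GlobalObject type is an illustrative stand-in) of why a JIT's output is tied to one process on one machine:

    #include <cstdint>
    #include <cstdio>

    // Stand-in for the page's global (Window) object.
    struct GlobalObject {
        int someSlot = 42;
    };

    int main() {
        GlobalObject* window = new GlobalObject();

        // A real JIT would emit machine instructions here; the point is
        // the constant those instructions embed: the absolute address of
        // the window object in this particular process.
        uintptr_t bakedInAddress = reinterpret_cast<uintptr_t>(window);
        std::printf("embedded constant: 0x%llx\n",
                    static_cast<unsigned long long>(bakedInAddress));

        // On another machine, or even another run of this process (ASLR,
        // heap layout), that address is meaningless, so the emitted code
        // bytes can't be reused -- let alone trusted if a web page
        // supplied them.
        delete window;
        return 0;
    }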

Comment Re:IonMonkey, JagerMonkey, TraceMonkey, SpiderMonk (Score 5, Informative) 182

A short summary:

1) TraceMonkey turned out to have very uneven performance. This was partly because it type-specialized very aggressively, and partly because it didn't deal well with very branchy code due to trace-tree explosion. As a result, when it was good it was really good (for back then), but when it hit a case it didn't handle well it was awful. JaegerMonkey was added as a way to address these shortcomings by having a baseline compiler that handled most cases, reserving tracing for very hot type-specialized codepaths.

2) As work on JaegerMonkey progressed and as Brian Hackett's type inference system was being put in place, it turned out that JaegerMonkey + type inference could give performance similar to TraceMonkey, with somewhat less complexity than supporting both compilers on top of type inference. So when TI was enabled, TraceMonkey was switched off, and later removed from the tree. But keep in mind that JaegerMonkey was designed to be a baseline JIT: run fast, compile everything, no fancy optimizations.

3) IonMonkey exists to handle the cases TraceMonkey used to do well. It has a much slower compilation pass than JaegerMonkey, because it does more involved optimizations. So most code gets compiled with JaegerMonkey, and then particularly hot code is compiled with IonMonkey.

This is a common design for JIT systems, actually: a faster JIT that produces slower code and a slower JIT that produces faster code for the cases where it matters.
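A minimal sketch of that two-tier scheme (hypothetical, not actual SpiderMonkey code; the 1000-call threshold is made up):

    #include <cstdio>
    #include <string>
    #include <unordered_map>

    enum class Tier { Baseline, Optimized };
    struct CompiledCode { Tier tier = Tier::Baseline; };

    class TieredJit {
        std::unordered_map<std::string, CompiledCode> code_;
        std::unordered_map<std::string, int> callCount_;
        static constexpr int kHotThreshold = 1000;  // made-up number

    public:
        void call(const std::string& fn) {
            int count = ++callCount_[fn];
            auto it = code_.find(fn);
            if (it == code_.end()) {
                // First call: fast, unoptimized compile (JaegerMonkey's role).
                code_[fn] = CompiledCode{Tier::Baseline};
                std::printf("%s: baseline compile\n", fn.c_str());
            } else if (it->second.tier == Tier::Baseline &&
                       count >= kHotThreshold) {
                // Hot code: slow, optimizing recompile (IonMonkey's role).
                it->second.tier = Tier::Optimized;
                std::printf("%s: optimizing recompile after %d calls\n",
                            fn.c_str(), count);
            }
            // ...then run the compiled code for fn.
        }
    };

    int main() {
        TieredJit jit;
        for (int i = 0; i < 1500; ++i) jit.call("hotLoop");
        jit.call("runOnce");
        return 0;
    }

Real engines make the hotness decision with call and loop counters along these lines, though the thresholds and dispatch machinery are far more involved.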

https://blog.mozilla.org/dmandelin/2011/04/22/mozilla-javascript-2011/ has a bit of discussion about some of this.

Comment Re:64bit (Score 1) 224

The fact that there is no 64-bit MSVC compiler that can produce 32-bit binaries has certainly been a problem for a number of people. It means that doing PGO on a large codebase that is being compiled into a 32-bit binary runs the toolchain out of address space (a 32-bit process only gets 2-4GB of address space to work with). Both Mozilla and Google have run into this, for example; in Google's case the result was them not using PGO at all.

Comment Re:Windows being the laughing stock of the OS worl (Score 1) 224

Compiling is easy in a vacuum.

Fixing all the bugs introduced by the different compiler that you haven't worked around yet, then fixing all the issues due to the 64-bit plug-ins (especially Flash) having a different set of problems than the 32-bit ones, then fixing any remaining issues due to Windows-specific code making dumb assumptions about the sizes of things is a different matter altogether.
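As a hedged illustration of that last point (not code from Mozilla's tree): Win64 uses the LLP64 model, where long stays 32 bits while pointers grow to 64, so code that stashes a pointer in a 32-bit field (DWORD, LONG, int) silently truncates it:

    #include <cstdint>
    #include <cstdio>

    int main() {
        int x = 0;
        void* p = &x;

        // Win32-era code often stored pointers in 32-bit fields.
        // On a 64-bit build the upper 32 bits are silently dropped.
        uint32_t stored =
            static_cast<uint32_t>(reinterpret_cast<uintptr_t>(p));
        void* recovered =
            reinterpret_cast<void*>(static_cast<uintptr_t>(stored));

        std::printf("original:  %p\nrecovered: %p\n", p, recovered);
        return 0;
    }

On a 32-bit build the two lines print the same address; on a 64-bit build they generally don't, and dereferencing the recovered pointer would be a crash or worse.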

Which is why 64-bit nightlies _existed_. They just never worked that well, on average.

Then the question becomes whether to keep making (and testing, which puts even more load on the test infrastructure) builds that no one plans to ship to actual end users anytime in the next 6+ months. That's what the discussion was really about: does Mozilla keep spending time keeping these builds limping along even though they don't have the time to make them actually tier-1, or do they just stop doing them for now and start again when they have the resources to actually do it right?

Comment Re:Mac OS X 10.5 (Leopard) (Score 1) 137

Supporting multiple versions of OS X at the same time takes much more effort than doing the same on Windows, because Microsoft usually bends over backwards to not break compat, while Apple will go out of its way to break it.

Combined with the smaller user base on Mac and the faster OS update cycle of Mac users, this means that dropping support for old OS X versions is a much simpler call than dropping support for old Windows versions: they're more work to support, and far fewer users are on them.

For perspective, about half of Mozilla's Windows users are still on WinXP (which approximately matches the overall fraction of Windows users on WinXP), while the fraction of Mac users on 10.5 was 10% and falling rapidly when support was dropped.

Comment Re:Memory hog? (Score 2) 302

Ah, 350 MB in Task Manager would match the ~400 MB resident metric from about:memory.

And again, about 100 MB of that is not even Firefox itself...

For the rest, the basic problem is that web sites are doing a _lot_ of JS, as are the extensions you have installed. So they're using a lot of memory for all those JS objects. :(

It would be interesting to see how much memory other browsers use on that set of sites, for what it's worth.
