
Comment: Re:Virtual Desktops (Score 2) 1002

by sd.fhasldff (#36145562) Attached to: Do Developers Really Need a Second Monitor?

Most multi-monitor users I've seen don't sit 6 feet from their cluster of monitors to allow them simultaneous (non-peripheral) viewing of multiple monitors. On the other hand, I can switch between two virtual desktops in a fraction of a second.

The only time a second monitor, IMHO, is an improvement over a virtual desktop is when you can use your peripheral vision to monitor some live output.

Of course, most Windows users (even developers) are so glued to their mice that switching desktops would be a time-consuming affair.

As for the comparison to "tabbing between windows", I find that ridiculous. Perhaps inflammatory (apologies...), but I really do. It presupposes a complete "Microsoft Windows" view of the world, where every application runs in exactly one window and all windows are inherently either maximized or minimized. It's not uncommon to see Linux developers with a multitude of windows open and visible at the same time.

You can't easily "tab between" groups of specifically positioned and sized windows. (Note, I said *easily*).

Comment: Virtual Desktops (Score 4, Interesting) 1002

by sd.fhasldff (#36145202) Attached to: Do Developers Really Need a Second Monitor?

My opinion is that this is largely a consequence of how the Maximize functionality works (and has worked).

My money is on the complete lack of virtual desktops on Microsoft's platform.

Yes, there are third-party apps that add the capability, but I don't know a single Windows developer who uses them. On the other hand, I don't know a single Linux developer who DOESN'T use them... (now watch Slashdot provide countless counterexamples).

Developing on a system without virtual desktops *or* a second (at least) monitor is a huge pain in the ass.

Medicine

+ - Metabolically Engineered Plants Produce New Drugs->

Submitted by fergus07
fergus07 (1145927) writes "Scientists have been engineering new genes into plants for a number of years in an effort to expand on naturally occurring medicinal compounds. Now chemists at MIT have gone one step further, using an approach known as metabolic engineering to alter the series of reactions plants use to build new molecules, thereby enabling them to produce unnatural variants of their usual products."

Comment: Re:Distributed search engines failed (Score 1) 378

by sd.fhasldff (#34074870) Attached to: Is Google Polluting the Internet?

It's ironic that Grub returns nothing but sites that want to sell me jerseys when I search for an NFL player, whereas the top Google links are always the player profile pages from NFL.com and other major sports sites, and the player's Wikipedia entry. Add to that the easy access to "news" for the player and there's little question which search engine is the more useful.

Google works because their ranking system works. If it stopped working well, they would lose market share very quickly.

Comment: Re:...And one generation behind on HTML5 (Score 2, Informative) 341

by sd.fhasldff (#33287566) Attached to: Firefox 4 Will Be One Generation Ahead

h264 isn't going to be a practical problem for the vast majority of users, since Firefox can just use a system codec (non-Windows users would have to make sure they have one, of course).

As for JS speed, Mozilla are very ardent in their speed claims, so it's hard not to believe they have something to back it up. It's difficult for users and external testers to figure out exactly how fast they are, despite the code being open source, because the Moz team is pursuing several parallel tracks to increase JS speed: "fat-val", the "tracer JIT" and the "method JIT". Each is currently significantly faster than the "normal" version, but there hasn't been any public testing of a build that combines all three.
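For context on why a tracing JIT helps at all: it shines on hot, type-stable loops like the toy example below. This is purely illustrative (my own sketch, not one of Mozilla's actual benchmarks):

```javascript
// A tight numeric loop: every iteration uses the same types, so a
// tracing JIT can record one pass through the loop and compile that
// path straight to native code after a few warm-up iterations.
function sumSquares(n) {
  var total = 0;
  for (var i = 0; i < n; i++) {
    total += i * i;
  }
  return total;
}

var t0 = Date.now();
var result = sumSquares(1000000);
console.log("sumSquares(1000000) = " + result +
            " in " + (Date.now() - t0) + "ms");
```

A method JIT instead compiles whole functions regardless of loop behavior, which is why the two approaches can plausibly be combined rather than competing.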

Mozilla claim they'll be faster than everyone else and while they may be scuppered by new advances from Google and Opera, it seems reasonable that they will at least be faster at launch than where everyone else is now. That alone would ensure "next-generation JS performance".

Where they trail Chrome is in "use speed". Chrome starts and shuts down a lot faster -- and I think that's going to be a problem for Firefox moving forward (more than it already is).

Comment: Re:Keyword: fast*ER* ... sometimes (Score 1) 222

by sd.fhasldff (#32907166) Attached to: Mozilla's New JavaScript Engine Coming September 1

They are complementary only in the way ice cream and fish are complementary. They don't seem to overlap, but you don't want fish with your ice cream.

Since bunratty's reply provides a technical description of why they are truly complementary (possibly even orthogonal), I'm going to believe that argument over your rather weak ice cream simile.

Comment: Re:Keyword: fast*ER* ... sometimes (Score 4, Informative) 222

by sd.fhasldff (#32901266) Attached to: Mozilla's New JavaScript Engine Coming September 1

I think you're missing the point of what is being benchmarked. Mozilla hasn't released benchmarks of their new JS engine with both "method" and "tracer" JIT combined. They are being evolved separately, but are (according to Moz) complementary. Thus, we don't know how far they actually are from their goal yet.

Check out http://www.arewefastyet.com/ for benchmarks and a description.

From what I can gather from the associated bug report, the "fatval" optimizations are also not applied to the portions of JS code that are traced... which would imply that the better a job the tracing engine does, the less often the "fatval" optimizations apply.

The result is that an unknown "free" speed increase is waiting in the wings. What the magnitude of this increase is... well, that's the question, isn't it?

Does 1 September seem like a really tight deadline? Yes, it sure does, but more in terms of stability and robustness than of hitting a specific speed milestone.

Comment: Doubled in 4 hours (Score 1) 290

by sd.fhasldff (#32092768) Attached to: The Humble Indie Bundle

About 4 hours later, total sales have roughly doubled:

- Total raised $103,758
- Average contribution $7.96
- Number of contributions 13038
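Those three figures are mutually consistent, as a quick check confirms:

```javascript
// Sanity-check the reported Humble Indie Bundle figures.
var total = 103758;        // dollars raised
var contributions = 13038; // number of contributions

var average = total / contributions;
console.log("$" + average.toFixed(2)); // "$7.96", matching the reported average
```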

I can't help but wonder how long this thing has been running. The article claims "7 days", but considering the current timer state (6 days, 19 hours, 35 minutes remaining) and the article's timestamp (posted 5 hours ago), that appears unlikely to be entirely accurate.

Comment: Updating apps without updating OS (Score 1) 319

by sd.fhasldff (#32082758) Attached to: Next Ubuntu Linux To Be a Maverick

There are a few options here:

1) Enable the backports repository (and perhaps even -proposed, if you're desperate).

2) Check whether the app devs make a package available for download.

3) Download the source and compile yourself ;)
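For option 1, enabling backports boils down to one line in /etc/apt/sources.list. The release codename below is just an example; substitute your own:

```
deb http://archive.ubuntu.com/ubuntu maverick-backports main restricted universe multiverse
```

After an apt-get update, a backported version can then be pulled in explicitly with apt-get install -t maverick-backports <package>.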

It's worth noting that there's a simple reason why new versions of apps are not supported on old versions:

Dependencies.

Take a look at pretty much any package for Ubuntu and note how many other packages it depends on. In the Windows world (or indeed the Mac world), the vast majority of those would be bundled at compile time instead of dynamically linked. The problem with the inclusion model (aside from bulky programs) is that you don't get security updates applied centrally. Not that the link model is perfect -- you noted its biggest (IMHO) weakness.
