Comment Minimum mass (Score 1) 89

5.4 Earth masses is m sin i, where i is the inclination of the orbit to the plane of the sky. The true mass could therefore well be greater than 5.4 Earth masses: it could be a Neptune or, in the rare case of a nearly face-on inclination, something even more massive.

This is a limitation of the radial velocity method, which was used in this detection. With transits (where you watch the star dim as the planet passes in front of it) you already know the inclination---it is 90 degrees to high accuracy---so once you have both a transit and a radial velocity curve, you know the true mass.
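
To make the geometry concrete, here is a minimal Python sketch (the 5.4 Earth-mass figure is the article's; the inclinations are just illustrative):

import math

def true_mass(m_sin_i, inclination_deg):
    # i = 90 degrees is edge-on (the transiting geometry); as i approaches
    # 0 (face-on), sin i approaches 0 and the implied true mass blows up.
    return m_sin_i / math.sin(math.radians(inclination_deg))

for i in (90, 60, 30, 10):
    print(f"i = {i:2d} deg -> true mass = {true_mass(5.4, i):5.1f} Earth masses")

At i = 30 degrees the same signal corresponds to 10.8 Earth masses, and at i = 10 degrees to about 31---nearly twice the mass of Neptune.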

Comment Re:I seem to remember... (Score 5, Insightful) 275

What happened to all the penguins---are they no longer on Slashdot? How about these reasons to like Dropbox over MS, Google, and the others:

- Linux client
- Follows symlinks
- Automatic infinite version history (for a fee)
- LAN syncing for faster speed
- Bandwidth controls
- Automatic full resolution photo uploading from mobile
- Sync that just works

It's not all about the price, ya know. Some of us like quality too. I currently have 24GB of free Dropbox storage, which I got through a special promotion, and it has always worked flawlessly and never let me down.

Comment Re:Ready in 30 years (Score 5, Informative) 305

What a load of bull. Fusion only actually occurs in the core of the Sun. The temperature there is 15 million kelvin and the central density is 160,000 kg/m^3. That is a power density fucking orders of magnitude above decomposing manure. The numbers you get are from averaging over the entire Sun, which is misleading, because only a tiny central region is hot enough for fusion.
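
A back-of-the-envelope Python sketch of that averaging effect (taking fusion to be confined to roughly the inner quarter of the Sun by radius, which is an approximation; the point is the volume ratio, not the exact numbers):

import math

L_SUN = 3.8e26    # solar luminosity, W
R_SUN = 6.96e8    # solar radius, m
CORE_FRAC = 0.25  # assumed fusing-core radius as a fraction of R_SUN

v_sun = (4.0 / 3.0) * math.pi * R_SUN ** 3
v_core = v_sun * CORE_FRAC ** 3  # volume scales as the cube of the radius

print(f"whole-Sun average power density: {L_SUN / v_sun:5.2f} W/m^3")   # ~0.27
print(f"fusing-core average:             {L_SUN / v_core:5.2f} W/m^3")  # ~17

So the whole-Sun average understates the fusing region by a factor of about 60, and standard solar models put the power density at the very center higher still, around 275 W/m^3.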

10+ years on Slashdot, and in the past few years it has really been taken over by amateurs. Every hard physics / astronomy article is filled with patently false nonsense modded up to +4. Our collective intelligence has been decreasing, friends.

Please know what you are doing before you mod up an incorrect comment... a quick Wikipedia check would set you straight, folks.

Comment New scientist story leaves out a lot (Score 1) 127

These BICEP2 guys didn't back-pedal of their own accord, friends---how about citing the much more senior and respected people, such as WMAP guru Spergel, who already DID the joint Planck analysis and showed them how hasty they had been? This is pretty poor reporting on New Scientist's part.

BICEP2 were a bunch of young upstarts riding into town with guns a-blazing. The sheriff came down and told them to calm down, boys, calm down.

Comment Thinking inside the box (Score 2) 33

I'm all in favor of spending money on space exploration, but the way I see it, Mars represents a point of diminishing returns. In the true spirit of exploration, we should begin looking at other interesting environments, such as drilling into Europa or Enceladus. This obsessive focus on Mars is a boon for Mars experts, but it has a real cost in terms of delayed progress toward understanding other solar system and deep space targets.

Space exploration missions will inspire audiences and yield side-benefits no matter where they go. Why not spread what little wealth there is and look towards bolder, more exciting targets?

Here's another well-argued perspective on my point:

http://www.theonion.com/articl...

Comment Re:Why it matters (Score 5, Interesting) 293

In general relativity, traversable wormholes *do* require negative mass (or energy density), for sure. Outside the tiny amounts available via the Casimir effect, the negative energy needed for wormholes and warp drives can yield causality violations, and causality is the last thing you'll pry from a physicist's cold, dead hands. So while it may be fun to speculate about such things, they lie squarely within the realm of science fiction for now.
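
For the record, the standard statement (due to Morris and Thorne) is that holding a traversable wormhole throat open requires "exotic" matter violating the null energy condition, which all ordinary classical matter obeys:

T_{\mu\nu} k^\mu k^\nu \ge 0 \quad \text{for every null vector } k^\mu

A throat needs stress-energy with T_{\mu\nu} k^\mu k^\nu < 0 somewhere, i.e. some observer there measures a negative energy density.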

To post on a news site that the galactic black hole "may be a wormhole" is like posting a headline saying that extraterrestrial aliens "may currently be among us." Both ideas are exciting. Both ideas are remotely within the realm of possibility. And both are so unlikely that they would readily be dismissed by all except those who are credulous or who like to drum up sensationalism for its own sake.

It's sensationalism for nerds.

Comment Re:Shuttleworth is a lunatic. (Score 1) 63

If "fixing toolkits" is all that is needed, why hasn't anybody done it? How about yourself?

As far as I can tell, everyone who has tried to speed up X over the network has taken a different route. Maybe once you actually look at the toolkit code, it's not that easy---otherwise it would have been done ages ago.

"But yes, you need a fast network with today's bloated desktops."

"Bloated desktop"? Hardly. Try putting up a 10-year-old Tcl/Tk-based GUI that lets you manipulate 2D graphics by live-changing brightness and contrast, over a 12 Mbps downstream connection. That is my use case, and it is ridiculously slow---unusable.

Comment Re:Shuttleworth is a lunatic. (Score 1) 63

Change is not bad. X11 needs change. Perhaps they should have worked with X11, but from what I hear the X11 authors themselves didn't want to keep it.

X11 is broken in one serious way: window forwarding over the network is slow. Pathetically slow. Maybe people who only ever forward terminals, and whose entire use of a computer involves manipulating ASCII characters, are fine with this, but those of us who work with graphics of any kind find that third-party hacks are required to make X usable over a network. Witness the abomination that is NoMachine NX, where they had to basically rewrite X11 so that it would work fast over a slow network connection.

X11 may have many plusses, but this inability to do fast networking is just stupid and needs to change---whether via updating X11, Wayland, Mir, or what have you.

Comment Re:Shuttleworth is a lunatic. (Score 1) 63

You sound like someone who's gotten burnt by installing non-long-term-support Ubuntu in a production environment. That was your error, really; the non-LTS releases get minimal support, so installing them in an infrastructure-critical environment is pure silliness.

I've never had an issue with LTS releases... I have a machine that's been continuously updated since Ubuntu 8.10 (non-LTS), and the thing has miraculously upgraded with zero hitches via 8.10 -> 9.10 -> 10.04 -> 12.04, soon to be 14.04. This is a 2008 Mac Pro, to boot, so I've enjoyed remarkable support in terms of Mac-specific drivers; even the Broadcom wifi worked out of the box.

Shuttleworth has inspired many people, and while he's made mistakes, he's not afraid to push boundaries and make disruptive changes, because that's what keeps things moving. The formula of disruptive change in non-LTS releases and stability in LTS works great. Just stick to the .1 point releases of LTS Ubuntu (12.04.1 and the like) and you'll have a stable system.

Comment Re:Question seems to be already answered. (Score 1) 480

If Stallman thinks that it is a "reasonable" proposition to force companies to put their source code in escrow, he is further removed from reality than I thought. It is more reasonable to leave things the way they are.

Here is the "reasonable" fix he proposes, in his own words:

So I proposed that the Pirate Party platform require proprietary software's source code to be put in escrow when the binaries are released. The escrowed source code would then be released in the public domain after 5 years. Rather than making free software an official exception to the 5-year copyright rule, this would eliminate proprietary software's unofficial exception. Either way, the result is fair.
