Comment Re:Concerns about online voting (Score 1) 139

I agree with your concerns about vote-selling. But I do think it's possible to design a system that makes selling your vote worthless, though it would be less convenient than normal online voting and would still involve some physical visits to the polling place, just not as often. Basically, every N years you would visit the polling place in person and draw any number of random numbers on slips of paper or similar. You choose one of these random numbers, copy it to a new slip of paper, and submit that slip in a sealed envelope just as you would a normal paper ballot. That random number is now a pseudonym that can be used to submit online votes for the next period. The other numbers are decoys that make selling such numbers harder.

When voting online, each vote would be tagged with your pseudonym. Crucially, the system would give the same response whether or not the number matches a registered one: "your vote has been received" or similar. If somebody is looking over your shoulder, you can simply type a wrong code; your vote will be invalid, but it will look just as correct to the person observing you.
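To make the constant-response idea concrete, here is a minimal Python sketch (all names and numbers are invented; this is not from any real voting system):

    # The acknowledgement is identical whether or not the pseudonym is
    # registered, so a shoulder-surfer learns nothing from the reply.
    REGISTERED = {"483912", "175550"}  # pseudonyms drawn at the polling place
    BALLOT_BOX = []

    def submit_vote(pseudonym, ballot):
        if pseudonym in REGISTERED:
            BALLOT_BOX.append((pseudonym, ballot))  # only valid votes stored
        # Invalid pseudonyms are silently dropped; the reply never varies.
        # (A real system would also need the lookup to be constant-time,
        # and to reject duplicate votes per pseudonym per election.)
        return "Your vote has been received."

The reply-never-varies property is the whole shoulder-surfing defense; everything else is bookkeeping.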

Since you can draw an arbitrary number of random numbers at the polling place, but only one is registered, selling these numbers should be worthless too: you have no way of proving that the number you're selling is the one you actually registered, and a buyer can't ask for "all of them" because he can't know how many numbers you drew. Though... people are lazy, so many would probably only keep the one they actually used, I guess.

Of course, this does nothing to solve the problem of botnets etc., which I think is the really scary one: it could put a lot of political power in the hands of botnet operators and those who buy their services.

Comment Mod parent up! (Score 1) 215

Great suggestion, and much better than the one in the discussion thread linked from the summary (refusing to expand names starting with -). I don't think your suggestion would break anything, and it would eliminate the problem. It should be the default.

Comment Re:Gravity? (Score 1) 112

But since it's much smaller, the surface gravity is much greater (you can get much deeper into its gravitational well before reaching its surface). The Sun has a surface acceleration of 275 m/s^2, or about 28 g. This white dwarf would have a surface acceleration of 3.33 Mm/s^2, or 3.3e5 g, more than ten thousand times higher. Attempt no landings there.
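For anyone who wants to check the arithmetic, surface gravity is just g = GM/r^2. A quick sketch, assuming roughly one solar mass packed into an Earth-sized radius (typical white dwarf figures; the exact mass and radius for this particular star may differ):

    G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
    M_SUN = 1.989e30   # solar mass, kg
    R_SUN = 6.957e8    # solar radius, m
    R_EARTH = 6.371e6  # Earth radius, m

    def surface_gravity(mass, radius):
        return G * mass / radius**2  # Newtonian g = GM/r^2

    print(surface_gravity(M_SUN, R_SUN))    # ~274 m/s^2, about 28 g
    print(surface_gravity(M_SUN, R_EARTH))  # ~3.3e6 m/s^2, about 3.3e5 g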

Comment Mod parent down (Score 1) 184

In 1983, the US Army Institute for Professional Development considered TV and radio to be the second and third most effective means of propaganda, after face-to-face interaction. (Nowadays the internet and social media would probably be high on the list too.)

What did you check to determine that TV is a "piss-poor way to attempt to shape society"? Did you look at yourself and go "I don't feel like I've had my opinions or feelings manipulated by TV, so it clearly doesn't work"? Many (most) people don't think they are affected by advertising either, yet advertising is so profitable that it is almost everywhere. Our minds have known vulnerabilities, such as appeal to emotion and the subconscious association of unrelated things, and of course these are being exploited. Sadly, we can't patch these vulnerabilities the way we can with software, but it at least helps to be aware of the ways we're being influenced.

Comment Re:Falling funding: Why fusion stays 30 years away (Score 1) 135

Now on top of this all, the energy density of a fusion system is *tiny*, so you need to build *enormous* reactors.

And that's where it falls apart. There is simply no way, under any reasonable development line, that the cost of building the plant, and servicing its debt, can possibly be made up by the electricity coming out. PV, one of the worst power sources in terms of cents/kWh, is currently running at about 15 to 20 cents/kWh. A fusion reactor almost certainly cannot be built that will produce power at under ten times that cost.

I think your post as a whole was very interesting, especially the numbers, but this last section suddenly became very handwavy. Do you have numbers for this part too? Some of what you say here is quite surprising, especially the energy density part: am I completely mistaken in remembering that high power per unit of land area was one of the advantages of fusion? By volume, ITER will produce about 500 MW from 840 m^3 of plasma, or 0.6 MW/m^3. Area-wise, the ITER facilities will cover 40 hectares, which gives a power footprint of 1.25 kW/m^2, comparable to coal. And it's unlikely that a reactor with 10 times the power would need ten times the facility area, so that number should improve for larger reactors, especially ones that aren't full-blown research campuses.
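The back-of-envelope arithmetic, in Python, for anyone who wants to check it:

    P = 500e6           # ITER design fusion power, W
    V = 840             # ITER plasma volume, m^3
    A = 40 * 10_000     # 40 hectares of facilities, in m^2

    print(P / V / 1e6)  # ~0.6 MW/m^3 volumetric power density
    print(P / A)        # ~1250 W/m^2, i.e. ~1.25 kW/m^2 site footprint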

The main difficulty of a tokamak is plasma confinement, and this is a problem that benefits from the square-cube law: heat losses scale with the plasma's surface area while fusion power scales with its volume, and the curvature of the magnetic field decreases with size. So a larger reactor could run at higher densities and higher temperatures than a small one, and would be more efficient overall. The power density would go up with reactor size.
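A one-line illustration of the scaling (pure geometry, nothing reactor-specific):

    k = 2               # double every linear dimension
    print(k**3)         # volume (fusion power) grows 8x
    print(k**2)         # surface area (losses) grows only 4x
    print(k**2 / k**3)  # the loss-to-power ratio halves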

As you say, an important question is what the final cost per kWh will be. Here you basically pull a number of about 200 cents/kWh out of thin air. How do you arrive at that number? What size and type of reactor is it based on? (As a side note, solar power does not look like one of the worst power sources cost-wise in this table.)

I'll also note that while progress has been slow compared to the first optimistic guesses, it is still being made.

Comment Falling funding: Why fusion stays 30 years away (Score 5, Informative) 135

It's common to hear someone say that "fusion power was 30 years away in the seventies, it's 30 years away now, and it will stay 30 years away" or similar, and sadly there is some truth to that (though perhaps it's less now: the estimated date for DEMO, the first full-scale power plant, is 2033). I think one of the reasons is that funding keeps decreasing, far below the optimistic projections of the 70s. The MIT fusion project made this graph to illustrate: https://i.imgur.com/sjH5r.jpg

It's a bit like downloading a file where the download keeps making progress but the estimated time left stays put, because the download speed keeps dropping. I've had that happen a few times, and it requires an exponentially falling download speed. With fusion, the situation isn't quite that bad, but when you consider the funding levels people were imagining before, it isn't surprising that they thought we would have fusion power by the year 2000.
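(A quick sanity check of the "exponentially falling" claim: let R(t) be the data remaining. If the estimated time left R/v is pinned at a constant T, the speed must be v = R/T, so dR/dt = -R/T, which gives R(t) = R(0) * e^(-t/T). The remaining work, and hence the speed, really does decay exponentially.)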

One interesting way of putting this is to say that fusion power isn't a constant amount of time away, but about 50 billion dollars of funding away. To put those 50 billion dollars in context, fossil fuels have received 594 billion dollars in subsidies in the USA since 1950. So partly fusion is difficult, and partly we're not trying very hard.

Comment Re:Free stuff sucks (Score 1) 221

Lets be honest, if copyright did not exist and people were all expected to give everything away for free and starve

It's hard to be honest about what would happen if copyright didn't exist, because we don't know. But we do know that the absence of copyright doesn't mean one has to "give everything away for free and starve". Many alternative means of compensation have been proposed, but we haven't seriously tested any of them, which is why we can't say how they would compare to the current copyright system in the quality and quantity of work produced.

My favorite alternative means of compensation is up-front payment, like we have almost everywhere else in society. This model has gained quite a bit of steam lately through Kickstarter. Basically, the author asks for the full payment for his work before he performs it, rather than extracting it gradually over the years afterwards. The author creates a Kickstarter page detailing his plan for, say, a new book, with some information about what it would be about, and states the price he wants for writing it (say 50,000€), possibly with some stretch goals (a bonus chapter at 100,000€, for example). Potential readers then choose how much money they want to pledge. Once the pledges reach the author's price, he gets the money and starts working. If too much time passes without the goal being reached (the time limit is commonly 90 days on Kickstarter), the potential readers get their money back, and the author must try some other approach.
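A toy Python sketch of that all-or-nothing logic (all names and numbers invented, not Kickstarter's actual code):

    GOAL = 50_000                        # author's asking price, EUR
    DEADLINE_DAYS = 90                   # campaign length
    pledges = {"alice": 25, "bob": 100}  # backer -> pledged amount

    def settle(pledges, days_elapsed):
        total = sum(pledges.values())
        if total >= GOAL:
            return "funded: charge the backers, pay the author"
        if days_elapsed > DEADLINE_DAYS:
            return "failed: nobody is charged, author tries another approach"
        return "still open: keep collecting pledges"

    print(settle(pledges, days_elapsed=30))  # "still open: ..."

The key design point is that backers risk nothing on a failed campaign, and the author only commits to the work once it is fully paid for.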

The advantage of this approach is that since the author has already been paid before doing the work, he does not need to control copying: copies are free and can be shared freely. The more copies are shared, and the more people who enjoy his work, the easier it becomes to gather money for the next one. What is today called piracy would become free advertising.

The disadvantage of this system is that it will be hard for unknown authors to find people willing to fund them. Their first book would probably need to be written for free in order to attract enough interested readers for this approach to work. On the other hand, authors already write their first book for free under the current system (they need something to show a publisher in order to get a deal), so this is not a serious disadvantage.

Projects of more than $1,000,000 are regularly funded through Kickstarter, and more than 50,000 projects have been funded during the 4 years since its founding. So a Kickstarter-inspired model of up-front payment really looks like it could work, and I think it's worth a try.

Comment Re:i applaud the effort (Score 2) 66

I recommend youtube-dl. It's an easy-to-use open source command-line tool for downloading videos from YouTube and many other sites, and it's in the package repositories of most Linux distributions. I usually start a download (youtube-dl link-to-video-page) and then immediately point mplayer at the in-progress file. So there's no delay compared to watching in the browser, but seeking is much faster, and you get to use a decent player. And if the connection is slow, you just wait a bit.
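For example (the URL is a placeholder; by default youtube-dl writes to a .part file while the download is in progress and renames it when done):

    youtube-dl 'https://www.youtube.com/watch?v=some-video-id'
    mplayer ./*.part    # start watching while the download continues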

If you prefer not to use a command-line tool, there are Firefox extensions that do this kind of thing, like NetVideoHunter and DownloadHelper, but they are a bit sleazy and support fewer sites, I think. They also can't be automated the way youtube-dl can.

Comment Re:Armchair astrophysics question (Score 1) 129

From the Wikipedia article:

Estimating the exact rate at which GRBs occur is difficult, but for a galaxy of approximately the same size as the Milky Way, the expected rate (for long-duration GRBs) is about one burst every 100,000 to 1,000,000 years. Only a small percentage of these would be beamed towards Earth. Estimates of rate of occurrence of short-duration GRBs are even more uncertain because of the unknown degree of collimation, but are probably comparable.

So they are mostly seen very far away, simply because of volume effects: the number of galaxies out to a distance r grows roughly as r^3, so there are far more galaxies beyond 100 Mly than within 100 Mly, for example.

Comment Re:Andromeda Rules (Score 1) 129

In theory, all we need to do is find a black hole and look, with unreasonably high resolution and sensitivity, at a point slightly more than 0.5 black hole radii above its horizon (i.e. slightly outside 1.5 Schwarzschild radii from the center). The black hole acts as a lens, and at that point light is deflected by 180 degrees, letting us look back at ourselves and see the Earth as it was 2d years ago, where d is the distance to the hole in light years. In fact, by looking even closer to that point, you reach a point where light is deflected by 540 degrees, giving an even fainter and more distorted image of ourselves from a few minutes earlier (that light had to make an extra loop around the hole), and so on ad infinitum. In practice, even the first image would be so faint that it probably wouldn't contain a single photon, and it would be washed out by the noise from the black hole's environment (and everything between us and it) even if it did. But it's a fun thought experiment.
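To put rough numbers on it, here's a sketch using Sgr A*, the Milky Way's central black hole, purely as an example (mass and distance are approximate textbook values):

    G = 6.674e-11           # gravitational constant, m^3 kg^-1 s^-2
    c = 2.998e8             # speed of light, m/s
    M = 4.1e6 * 1.989e30    # Sgr A*: ~4 million solar masses, in kg
    d_ly = 26_000           # distance to Sgr A*, light years (approx)

    r_s = 2 * G * M / c**2  # Schwarzschild radius: ~1.2e10 m
    print(1.5 * r_s)        # the 180-degree deflection happens just outside this
    print(2 * d_ly)         # ~52,000 years: how far back the "mirror" looks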

Comment Re: far enough (Score 1) 129

Certainly not the flooding of the Mediterranean that was recorded in Sumerian legends and thence made it's way into Christian myths?

Isn't it unrealistic that the Zanclean flood, which ended the Mediterranean's latest dry phase 5.33 million years ago (about 2 million years before Australopithecus afarensis evolved), should be recorded in Sumerian legends? Perhaps you're thinking of the Black Sea deluge, which might have occurred somewhere between 7400 BC and 5600 BC (if it happened at all).

Comment Re:Nothing to stop the errors creeping in (Score 1) 200

I have also seen articles decline in quality, but I wonder how big a problem this is. What fraction of articles does it apply to? What fraction of articles is improving at the same time? And when an article declines, does it stabilize at some quality level, and if so, what level? Perhaps something more complicated is even going on, like a slow overall increase in an article's quality with significant short-term fluctuations, much like global temperatures. I read Wikipedia extensively, both at work and for fun, and at least in the topics I visit, the average article quality is very high. And in my fields of expertise, the error rate is also very small. I think this indicates that either the fraction of articles that decline in quality is very small, or the level at which their quality stabilizes is very high.

It's a fundamental problem for them, but one which they can do little about without changing their most basic policies.

I think it's also the reason for Wikipedia's success: more articles recruit more editors, which leads to more articles, and so on. Its predecessor, Nupedia, was written more according to your wishes, but because of its strict focus on experts and quality, it never got the network effects going that have driven Wikipedia's enormous growth. Wikipedia's success, both in the number of articles and in the quality it has achieved (and its quality is, on average, pretty good), is quite the miracle. If you had asked me, or most others, whether "an encyclopedia anyone can edit" could work, I think the answer would have been that it would get bogged down in trolling and sabotage. I guess most people are more constructive than we give them credit for, and the "armies of editors" approach seems to be a very good strategy.

Lately, Wikipedia's balance seems to have shifted away from the initial inclusionism ("allow imperfect and incomplete articles; someone, not necessarily the same person, will improve them and add sources later") towards deletionism ("if an article isn't good enough yet, delete it; if information isn't sourced yet, delete it"). While the intention behind this is good, namely more reliable articles, I think it might be counterproductive: aggressive deletion policies probably hurt editor recruitment, and hence shrink the pool from which expertise can be drawn. I speculate that part of the slowdown in Wikipedia's growth over the last few years might be due to this deletionist trend, though the fact that many important topics already have articles is probably more important.
