Comment Collectives = individuals (Score 1) 591

Individual consumers (as opposed to collective market forces) have decided

What does "collective market forces" mean, if not the aggregate (e.g. the sum) of individual forces? (Likewise for decisions.)

So, what you're saying is that you lament the fact that there wasn't enough piracy? 'Cause if there had been, that would be a collective market force, right? Or maybe you're against the state-protected temporary monopoly called copyright because it (like any other monopoly) prevents market forces from doing their thing?

Or did you mean a political (rather than market) force, i.e. that the masses should (at great transaction and information costs to each individual, more than it's worth to most of them) get together and counter-lobby the politicians, fighting back against the "copyright industry"? Sadly, widely dispersed small interests tend not to defeat concentrated big interests.

TL;DR: I call your bluff. "Collective" just means many individuals.

Comment Note the assumption about power boundaries (Score 1) 413

Most browsers will lock down cross domain requests.

The assumption in this philosophy is that all the URLs below a particular domain are owned by the same party.

If geocities.com/~userfoo/*.html sources geocities.com/~userbar/*.js, should the JS files be trusted?

The other side of the assumption, that each party is limited to one domain, causes inefficiency: if I own foo.com and bar.com and I want foo.com/baz.html to source qux.js, I can't have it be bar.com/qux.js --- it must be foo.com/**/qux.js. And if bar.com/*.html also wants qux.js, I must put a copy of qux.js on bar.com. (I guess virtual hosting and sym/hardlinking come to the rescue here, and I can stop whining about my tiny, tiny js files being stored twice OH TEH NOESES...)
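
A minimal sketch of the origin comparison at issue, assuming we reduce each URL to the (scheme, host, port) triple browsers compare for cross-domain requests; the example URLs are the hypothetical GeoCities-style ones from above:

```python
from urllib.parse import urlsplit

def origin(url: str) -> tuple:
    """Return the (scheme, host, port) triple the same-origin policy compares."""
    parts = urlsplit(url)
    # Fall back to default ports: 80 for http, 443 for https.
    port = parts.port or {"http": 80, "https": 443}.get(parts.scheme)
    return (parts.scheme, parts.hostname, port)

# Paths play no role: two users' directories on one host share an origin...
assert origin("http://geocities.com/~userfoo/page.html") == \
       origin("http://geocities.com/~userbar/evil.js")

# ...while two domains owned by the same party do not.
assert origin("http://foo.com/baz.html") != origin("http://bar.com/qux.js")
print("origins key on (scheme, host, port), not on path or ownership")
```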

Comment Warning: too much aggregation (Score 1) 173

The problem [is] that Wikipedia doesn't want the crap in the first place.

There is no "Wikipedia wants"; it's not a person---that's the whole problem: people disagree about the direction Wikipedia should go in.

and if the foundation is unsound, then the roof cannot be.

I cannot find any meaning in this that is relevant to what we're talking about. True, if the servers crash all day, even the best-written article is only going to be moderately useful. But bad articles don't somehow infect good articles. They don't even draw editing effort away (under my system)---the editing effort they attract couldn't be put elsewhere.

[Will be edited, most edits reduce quality, need editorial oversight]

So? What's wrong with letting the few volunteers who want to write about some topic maintain those articles for themselves, at whatever level of quality they will bear?

lack of volunteer hours to maintain the article at the intended level of quality.

Intended by whom? And why do they get to dictate terms to others?

What my solution aimed at was giving the deletionists what they want (stars burning twice as bright but half as many), to the extent they can make people voluntarily contribute to that end, while at the same time giving the inclusionists what they want (the blooming of a thousand flowers) without detracting from the quality of the narrower core of high-quality articles the deletionists want.

I haven't seen you argue that this is unobtainable, nor why it isn't a decent compromise between the wishes of the people involved. I've seen you take the side of the deletionists (as I understand them) without really saying why, just asserting that it's "what Wikipedia wants". Am I misunderstanding the deletionists here? If not, care to explain why the compromise is bad?

Comment That makes a zero-sum assumption (Score 3, Insightful) 173

editors time is not [cheap] [...] Given the number of pages that I regularly see that have tags months and years old indicating that they need sources, formatting, etc... I'd say Wikipedia is in the midst of an unrecognized crisis in this regard.

Your argument assumes that editor time can be freely shifted from one article to another. If I'm very interested in anime and manga (and nothing else), I'm not going to start editing articles about voting theory or cladistics and the tree of life, or whatever---I don't have the interest, and/or I don't have the knowledge. A similar argument has been applied to free software contributors: people do what they're going to do, and you can't boss volunteers around.

To some extent, people care about Wikipedia in general; to that extent, you can transfer editor work-hours between articles. I think the policy that maximizes the use of both flexible and non-flexible volunteer labor is to direct the flexible labor to wherever the marginal return is greatest, taking the supply of non-flexible labor as fixed. Concretely: use a bug tracker or ticket system and auto-fill it with entries like "Most visited [citation needed]" and "Most often viewed [flag:foobar]" (sketched below). That way, flexible volunteer labor gets directed to where it's useful, the seldom-viewed stuff can coexist and be crap, and no one will care because no one reads it anyway. Everyone gets to have their cake and eat it too.
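
A minimal sketch of that auto-filled queue, with made-up inputs: a `views` map of article to monthly page views and a `flags` map of article to maintenance tags. Ranking tickets by traffic sends flexible labor where the readership payoff is greatest:

```python
views = {"Anime": 120_000, "Voting theory": 15_000,
         "Cladistics": 9_000, "Obscure band": 40}
flags = {"Anime": ["citation needed"],
         "Voting theory": ["cleanup"],
         "Obscure band": ["citation needed", "cleanup"]}

# Auto-fill the ticket queue: one ticket per (article, flag), highest-traffic first.
tickets = sorted(((views[a], a, f) for a, fs in flags.items() for f in fs),
                 reverse=True)
for v, article, flag in tickets:
    print(f"{v:>8} views/mo  {article}  [{flag}]")
# The 40-views/mo article sinks to the bottom: it can coexist and be crap
# without drawing any flexible labor away from the high-traffic pages.
```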

Comment What?? (Score 1) 769

There are also benefits [...] "peak uranium"

The benefit of not using uranium is that you save yourself the trouble of changing to something else when uranium runs out? Doesn't not using uranium just move that cost closer (and thus make it more expensive, since the real interest rate is positive*)?

* Basically, if we postpone the transition, we can spend the intervening time preparing ourselves better, making it require less work, i.e. be cheaper.
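
A toy discounting calculation (the 2% real rate and the unit cost are made-up numbers for illustration): the same transition cost shrinks in present-value terms the further out you push it, which is the sense in which pulling it closer makes it more expensive.

```python
def present_value(cost: float, rate: float, years: float) -> float:
    """Discount a cost incurred `years` from now back to today at `rate`."""
    return cost / (1 + rate) ** years

r = 0.02  # assumed real interest rate
for t in (10, 30, 60):
    print(f"transition in {t:>2} years: present value = {present_value(1.0, r, t):.3f}")
# transition in 10 years: present value = 0.820
# transition in 30 years: present value = 0.552
# transition in 60 years: present value = 0.305
```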

Or am I completely misunderstanding you?

Comment A correction (Score 1) 203

Someone please correct me if I'm mistaken.

While you are correct that computers are deterministic, there are ways to generate pseudo-random numbers based on cryptography, where the "figure out the algorithm" step is essentially the same as breaking the cryptography.

(Actually what you figure out is not the algorithm---which can be publicly known---but a secret input, i.e. a secret key and/or seed.)

So while you are correct in principle, it is possible to make numbers that look so random that their pattern is, in practice, undetectable.
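
A minimal sketch of the idea (a homegrown illustration, not any standardized construction): run HMAC-SHA256 over a counter under a secret key. The algorithm is public; predicting the output without the secret key amounts to an attack on HMAC.

```python
import hmac, hashlib, os

secret_key = os.urandom(32)  # the secret seed; without it the stream looks random

def csprng_blocks(key: bytes, n: int):
    """Yield n pseudo-random 32-byte blocks: HMAC-SHA256(key, counter)."""
    for counter in range(n):
        yield hmac.new(key, counter.to_bytes(8, "big"), hashlib.sha256).digest()

for block in csprng_blocks(secret_key, 3):
    print(block.hex())
# Fully deterministic given the key (same key -> same stream), yet predicting
# the stream without the key is as hard as breaking HMAC-SHA256.
```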

Comment Econ is corrupt? (Score 1) 61

I can name several econ Nobel laureates who, in my not overly well-informed opinion, have made a genuine net positive contribution to the world.

Ever heard of game theory and Nash equilibria? That'd be John Forbes Nash. How about Vickrey auctions---they might add a little honesty to the world and help people allocate goods more efficiently. How about Kenneth Arrow, who proved that social decision-making processes will always have flaws (so we can stop looking for the perfect one and start discussing trade-offs)? Or how about Daniel Kahneman, for reminding economists of the danger of their foolish assumptions about human rationality? ;-)
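
For the Vickrey case, a minimal sketch (bidder names and valuations are made up): the highest bidder wins but pays the second-highest bid, which is exactly what makes bidding your true valuation a dominant strategy.

```python
def vickrey(bids: dict) -> tuple:
    """Sealed-bid second-price auction: highest bidder wins, pays 2nd-highest bid."""
    ranked = sorted(bids, key=bids.get, reverse=True)
    winner, runner_up = ranked[0], ranked[1]
    return winner, bids[runner_up]

bids = {"alice": 100, "bob": 80, "carol": 60}  # hypothetical true valuations
winner, price = vickrey(bids)
print(f"{winner} wins and pays {price}")  # alice wins and pays 80
# Shading her bid below 100 can only cost alice the item; it never lowers the
# price she pays --- hence reporting her true valuation is a dominant strategy.
```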

But of course, I'm eager to learn, so if econ is corrupt, please enlighten me as to how.

Comment The effects of random voting (Score 1) 304

Plus, those who wouldn't have voted would be voting randomly and skew the results.

If they vote uniformly at random, they vote for every option in equal numbers (in expectation). Taken over the whole body of indifferent people, that amounts to expressing no preference for any particular option. So if all indifferent people truly vote randomly, their behavior expresses their viewpoint perfectly: indifference.
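
A toy simulation of this, with made-up numbers (1,000 informed voters split 60/40 for A, plus 100,000 indifferent voters choosing uniformly at random): the random ballots add the same expected count to every option, so there is no systematic skew.

```python
import random

random.seed(1)  # reproducible toy run
tally = {"A": 600, "B": 400}  # 1,000 informed voters, 60/40 for A

for _ in range(100_000):  # indifferent voters picking uniformly at random
    tally[random.choice(["A", "B"])] += 1

print(tally)
# Each option gains ~50,000 votes in expectation, leaving A's expected
# 200-vote lead intact; the random ballots add noise to the margin but
# favor no option.
```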
