Comment the falsifiable universe (Score 1) 458

Well, here we are not talking about knowledge that has no immediate application. We are talking about knowledge that, by definition (the unobservable beyond the universe), will never have any application.

I vote for the observable universe to be named the falsifiable universe, the notion of the universe that experimental physicists inhabit in mind and body.

I've long regarded the unobservable universe as akin to an analytic continuation.

Analytic continuation often succeeds in defining further values of a function, for example in a new region where the infinite series representation that initially defined it becomes divergent.

Far from being useless, these continuations are tremendously useful in suggesting new ways to approach the mathematics of the original function.
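The textbook case makes the point concretely. A minimal sketch, assuming nothing beyond the geometric series:

```latex
% Inside the unit disk the series converges and defines f:
f(z) = \sum_{n=0}^{\infty} z^n = \frac{1}{1-z}, \qquad |z| < 1.
% The closed form on the right is analytic on all of
% \mathbb{C} \setminus \{1\}, so it extends f far beyond the region
% where the series itself diverges; e.g. it assigns f(2) = -1 even
% though 1 + 2 + 4 + 8 + \cdots is hopeless as a sum.
```

The values outside the disk are "unobservable" from the series' point of view, yet they constrain and illuminate the ones inside it.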

So we have two things here: the falsifiable universe, and its intellectually stimulating analytic continuation.

If you're burning through chalk and pencils on an exponential growth curve, you'll soon give the analytic continuation some terse symbols, such as i, and by the human psychology of oft-masturbated terse symbols, you'll come to regard it as being as real as any other symbol dripping down from the Matrix.

If you're lucky, at some point the unobservables all cancel out, and you're left with an insight into the falsifiable universe, arrived at through a mathematical wormhole. Mathematics folds in on itself in mysterious ways, no quantum particles required, so far as I've been able to tell.

The first requirement of a falsifiable universe is the state of being causally connected. If the falsification process is embedded in the falsifiable universe, there are additional requirements: you're dealing at least with a self-falsifiable universe. Falsification, it turns out, itself sits pretty far up the food chain.

Here's a good gig. Posit some primitive element amenable to nearly limitless analytic continuation, such that it can never be shown that there does not exist a continuation capable of collapsing back through some miracle of symbolic reduction to a testable statement about the falsifiable universe.

Congratulations. You have now made it permanently impossible to tell whether you're doing physics or not. It's important that the math is in some way highly constrained and very difficult, or it becomes immediately obvious that the playground exceeds the project.

If the constraints are difficult enough that you can tell the difference between the really smart people and the really, really smart people, and an Ed Witten or two comes along from time to time to humiliate the really, really smart people you've at least got the foundations in place for a credible intellectual discipline. If not physics, at least it's a sport.

It's just too bad that most of the people doing quantum-cosmic analytic continuation pass themselves off as physicists. Different rules, different discipline, whether or not they share the first twenty years of the same education. You can tell there's a lot of strain over this because the string people mutter the word "testable" as often as Microsoft mutters the word "innovation"—and to equal effect.

If we had a nice standard of elegance E and a proven theorem stating that all theories of physics more elegant than E are necessarily true, we could mend the house.
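Stated with a straight face, the hoped-for theorem would read something like this (the threshold E and the elegance functional are, of course, inventions for the sake of the joke):

```latex
% Hypothetical elegance criterion: a threshold E and a functional
% \mathcal{E} ranking theories, such that
\exists\, E \;\; \forall\, T :\quad
  \mathcal{E}(T) > E \;\Longrightarrow\; T \text{ holds in the falsifiable universe.}
```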

But we haven't yet written down the most surpassingly elegant equation that's actually false as witnessed in the falsifiable universe. Without an objective decision point, it's just a bunch of exceedingly smart guys refusing to kill their darlings.

Comment landscape : consumption :: portrait : production (Score 1) 520

Sometimes you just want to read the bottom of a long article on the top half of a tall display within a screen-maximized Firefox instance. It's Firefox that causes the problem in the first place, making pessimistic assumptions about deployable pixels.

I can hear it whispering churlishly "you should be thankful that content is on the screen at all", never mind that it's forcing me to hold my head at an awkward angle. I suppose a Firefox designer afflicted with use-case blindness could argue that if I don't want to incline my head downwards, I should maximize my viewport to the top half of my tall display.

Wrong.

The peripheral viewing area is extremely valuable when zooming around and regaining your bearings. Excess horizontal area is pretty much useless for anything other than turning your browser window into a strip mall.

Comment Re:Why not just multiple monitors. (Score 1) 520

There was a problem with the above configuration. The part of the portrait display that sits an inch off my desk is not ideal for long-term viewing. I put a Tilda instance down there. It's always on top, but there's still more than enough vertical space above it to hold any web page I'm referencing concurrently.

The problem is that by default, the web page won't continue scrolling past the bottom. I either have to resize the web window (what's the point of a hot-key if you're right back to dragging digital ditches?) or I have to pop Tilda away briefly.

Problem solved with a Stylish tweak.

body:after {
  /* Spacer appended to every page so content can scroll clear of the
     always-on-top Tilda console; the visible text confirms the style fired. */
  content: '**********';
  color: #505050;
  display: block;
  text-align: center;
  font-size: 1vmax;
}

This adds a blank region to the bottom of every page large enough to let me scroll any content above my always-on-top console window. It adds some text just so I know that my script is messing with stuff.

Comment Re:Why not just multiple monitors. (Score 1) 520

The problem that you describe is just an indicator that our software has not yet evolved for this type of display. Solutions to the problems that you have described are sure to pop up as creative individuals start a race toward different solutions.

Yeah, and disappear again just as quickly, as soon as your favourite distro decides the form factor is greener on the other side. What's your crystal-ball prognosis for solutions willing to make a commitment and settle in for the long haul?

It wasn't all bad news. After my perfectly configured window-key window-management accelerators suffered an ignoble fate, I bothered myself to flip my right-hand 21" IPS into portrait orientation. This has been a godsend. In combination with NoSquint, I'm able to size almost every web site so the third-column cruft vanishes into the non-pixel margins. I have twice the normal vertical depth with which to scan the actual text. It's truly glorious.

My left-side panel dates from before the HD craze. It's nearly as big, but closer to square, and despite this—because I'm stubborn to the last drop—the two have identical pixel dimensions.

I wouldn't actually gain much with this 4k display, but I'd consider it for my workstation at the office, which is not my primary work location.

Comment Cantor's libertarian hierarchy (Score 2) 674

Kodak was replaced by a whole slew of companies that make components for digital cameras, cell phones, picture hosting, digital frames, etc.

You actually checked with the Kelley Blue Book or CarProof that the companies making digital phones, etc. aren't sopping up employees already discarded long before the Kodak disgorgement? This is the kind of set mapping that gives libertarianism a bad name: the vague presumption that the new necessarily has greater cardinality than the old.

In this lame conception, when the old industries fade and fail and fling off a finitude, a new industry springs up able to sop up an infinitude, and then the next neonatal industry incumbent (only in California does one encounter a neonatal incumbent) continues the aleph-upmanship and so it goes that progress Cantors along.

Comment use case bigotry genre (Score 1) 128

Nobody wants to spend $300 on a console that ties up your $500 TV while you're using it and buy a few $60 games on top of it, when you can just download a game on your phone that you already have and spend $4 on it.

I don't see the original post. Kinda interesting if there never was one. In any case, whatever its origins, it's a fine example of the use-case bigotry genre.

This is the kind of thing frequently heard expressed by a person riding the special-needs short bus—as in, not comprehending the needs of others worth a damn. The longer one lives life the more one realizes that we are all special needs in some dimension, which is why the fascist unification of consumer sentiment sucks ass.

From my perspective, what the hell else would you do with a $500 television if you subscribe to Telus Optik 50 and you haven't even installed the television modem—as I haven't—because the default content available represents negative value: for every good show one manages to watch, there's an equal amount of cognitive filth to studiously avoid.

Studious avoidance is an expensive activity. Ask any college drop-out. Or read any of the recent science on the will-power muscle, which suggests that the effort expended successfully avoiding the tempting (but awful) TV program is quite likely to show up as inferior decision-making later that evening when you juggle your retirement savings plan.

I suppose that "Nobody" is just a youthful code word for "Nobody who is anybody" after first screening out the educated, the thoughtful, and the literate in order to better isolate the spending demographic of happening now.

Comment the cancer meme (Score 1) 366

When I was growing up in Canada, we were bombarded on television with the slogan: "Cancer can be beaten." Google informs me that the Canadian Cancer Society unveiled this slogan on 2 January 1969.

Blasted with this slogan on TV, even when I was very young I thought it was the dumbest thing I'd ever heard coming from a technological elite. "Cancer isn't just some pathogenic disease, it's an incremental systems malfunction" I used to say to myself.

Turns out the slogan was first invented to help people seek medical attention when they discovered a possible cancer symptom, rather than freaking out and modelling their behaviour after the strong-and-silent-and-dying-inside heroes from the 1950s. By my teenage years, the use of the slogan had shifted mainly toward the appeal for funding fundamental research. This was the only version I knew.

It's not a given that the genetic system needs to decay. But there's a metabolic cost to flawless genetic replication, and evolution seems to have decided that the price is not worth paying: that which makes us immortal saps our youthful vigour. Without youthful vigour, a species risks becoming one of many, many dead-end side branches on the tree of evolution.

In the world of memes, the desire to live forever is cancer. If this meme ever succeeds in achieving its goal, it will prove fatal to the host organism—the human species.

Immortality is stasis. In order for stasis to thrive, any form of vigorous external change (evolution acting on other forms of life) must be thoroughly trampled. Immortality is the nirvana of paranoid jackboots.

Comment one-way certainty (Score 5, Insightful) 573

'If it turned out that Snowden did give information to the Russians or Chinese (or if intelligence assessments show that the leaks did substantial damage to national security, something that hasn't been proved in public), then I'd say all talk of a deal is off—and I assume the Times editorial page would agree.'

This is one of those propositions that can only ever be in the past tense in a single logical state: busted.

These one-way allegations have a way of never dying, or at least not until it's back page news. Meanwhile, they muddy the waters a great deal just hanging there.

Neither is it self-evidently clear that the NSA's voraciousness is separable, to where informed public debate can exist with only one-half of the picture (aka the domestic half).

I think this article translates to: "it's our policy to never grant clemency under any conditions just in case we later discover a game-changing fact".

The option of a conditional clemency is fraught with unsolvable issues. Snowden could attest that he's never actually done any entirely non-clement things, and if it were subsequently learned otherwise, his clemency could be revoked. This would be "clement until proven guilty".

Only, for this to be workable, one would need a way to prove that the NSA never plants leaks of its own information to gain what it dearly wants—have I got a bridge to sell you—as there's no way to prove that a leak originated from Snowden unless the substance of the leak contains information one can verify the NSA never had at that time.

Good luck with that.

And somehow the subtext of all this seems to imply that the NSA's proven snookery (illegitimately authorized as far as the eye can see) should take a back seat to Snowden's unproven snookery (the worst things he might have done).

I don't blame the NSA for the lamentable standards of civic discourse. But neither can the agency hide from their legacy of operating behind a thick smoke screen of democratic false impressions.

Comment Re:Link to Asimov's actual article (Score 1) 385

Of the last twenty TED talks, this one has the most views by nearly a two to one margin over the runner up:

David Steindl-Rast: Want to be happy? Be grateful

I personally found it upbeat yet vacuous. He doesn't specify whether in the topology of his gratitude vector space, there's a primary node where all the gratitude goes in, and no gratitude comes out (presumably due to Hawking radiation, all that gratitude is re-emitted from the fearful symmetry as cosmic love). Asimov, of course, never held the majority standard for spiritual malaise.

I am honorary president of the American Humanist Association, having succeeded the late, great, spectacularly prolific writer and scientist, Dr. Isaac Asimov in that essentially functionless capacity. At an A.H.A. memorial service for my predecessor I said, "Isaac is up in Heaven now." That was the funniest thing I could have said to an audience of humanists. It rolled them in the aisles. Mirth! Several minutes had to pass before something resembling solemnity could be restored.

And yet ... the majority of the world's population continues to itch for any hint of a master honour roll for special snowflakes, no matter how shallowly disguised.

China did it. But yeah, it's really not a problem for first-worlders. Asimov didn't see that coming.

Brave New World was published in 1931. Asimov would have been thoroughly familiar with it. Nineteen Eighty-Four is not the only game in town concerning the control of the masses. First, we have all the drugs. Second, we do have laws forcing parents to turn their children over to the puppy mill of public education, which—along with mass culture—promptly fills their heads with all kinds of garbage that only the most strenuous parental exertion can hope to mitigate.

So you can have a large family, but at some deep level, it's not entirely yours.

It amazes me the number of people attracted to the purity cult concerning the foods they eat (local/non-GMO/vegetarian/unprocessed), who barely blink over the obnoxiousness of the vast majority of the thousands of media impressions we soak in each day, the end result of which is that a billion people cared about two seconds of Janet Jackson's nipple.

We live in a society where it's a permanent, relentless battle to resist the frivolous.

We have this notion of "parental controls". We can keep our children ignorant of how sex functions in the real world (as opposed to the retail world), though this electronic chastity belt is ultimately futile if your child has half a brain. We can pretend we're filtering out violence. Yet most violence is social, and you can really only filter graphic depictions (unless sex is also involved, in which case social aggression is also considered graphic).

What you really want to filter out is not sex or violence, but stupidity, and for this the "parental control" widget has no back-lit chicklet engraved with an undiscoverable hieroglyphic rune. In 90% of MSM political coverage, they're not even trying, to put it kindly.

It was Asimov who postulated the discipline of psychohistory, in which the vacuous can be distinguished from the salient by the vigorous cranking of some vast algorithmic matrix. We've become very, very good at the vigorous cranking of vast algorithmic matrices, yet I have no channel where political figures never intrude on my consciousness unless in the act of making a substantive statement. I don't even want the operatic comedy of "Brownie, you're doing a heck of a job!" Whatever.

Wake me up when it reaches the level of 'Heck of a job, Brownie' calls Bush inattentive 'fratboy'.

That could be riddled with a hundred falsehoods, distortions, and lies but at least it contains testable hypotheses, unlike 99% of the vapid crap that ever came out of a Bush photo op. Yet which one makes the headline news?

Hardly any wonder we're left feeling cynical about copulating with cause.

Point, Asimov.

Comment Re:Yeah right (Score 1) 354

The NSA admit they were wrong? Hell, when has anyone in government admitted they were wrong?

Just off the top of my head:

What McNamara doesn't do is out himself as a sadistic tyrant bent on personal glory, so his book wasn't warmly received.

I can see clearly now... that I was wrong in not acting more decisively and more forthrightly in dealing with Watergate.

Do I need to attribute that?

When is the last time you admitted you never let the facts interfere with a cherished aspersion?

Comment Sherlock's theorem (Score 0) 165

Many security bugs are really failures to implement correctly a requirement of the form "No matter what the input to this program is, it must not do X."

This is a special case of Sherlock's theorem:

Once you have eliminated the disallowed, whatever remains, however unintuitive, must be the robust.

It's far easier to debug a sin of omission than a sin of commission. If a piece of code never performs a disallowed function (e.g. leaking memory, acting on unsanitized user input) then all failures that remain are sins of omission: the program doesn't actually transfer the file requested, out of excessive restraint on some edge case the programmer never even considered.
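A minimal sketch of the asymmetry, with hypothetical names throughout: a deny-by-default filename check for an imagined file server, where the disallowed X (path traversal) is ruled out by construction, so the only bugs left are over-strict rejections.

```python
import re

# Deliberately narrow whitelist: short names of safe characters with a
# known extension. Anything else is refused outright.
ALLOWED = re.compile(r"[A-Za-z0-9_-]{1,64}\.(txt|png|pdf)")

def safe_to_serve(filename: str) -> bool:
    """True only for names matching the whitelist. No matter what the
    input is, traversal sequences and control bytes can never pass."""
    return ALLOWED.fullmatch(filename) is not None

# Sins of commission are ruled out, whatever the input:
assert not safe_to_serve("../../etc/passwd")
assert not safe_to_serve("notes.txt\x00.png")

# But the whitelist is too strict for an edge case nobody considered:
# a perfectly legitimate file with a space in its name (sin of omission).
assert not safe_to_serve("meeting notes.txt")
assert safe_to_serve("meeting_notes.txt")
```

The second pair of assertions is the point: the remaining failure is easy to diagnose (the validator visibly refused), whereas a traversal bug would fail silently and maliciously.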

Well, the programmer needs to get in there and consider the omission in the harsh light of day. Then the specification document needs to be updated.

And questions need to be asked about the user environment when an edge case is tripped three years into a heavy use cycle.

The only way to achieve software up-front with no failure modes and no functional omissions is to massively gold-plate the validation process, and this rarely works anyway.

I'm never happier writing code than when I'm subtracting stupid.

Comment Re:Not very practical (Score 1) 103

So the next time you appear in a photo consider the fact that a simple procedure might reveal who you are with.

Yeah, I hang out all the time in public spaces with KH-11 prosumer cameras concealed behind 1970s ceiling tiles in every room and corridor.

It's this increasingly common tag line on the article submissions that makes Slashdot news for slack-jawed mooncalves.

Comment Re:Not enough, (Score 1) 415

He committed a crime, there were witnesses, AND he confessed. You don't get any more guilty than that.

Not true. You're a hell of a lot more guilty if a victim harmed by your crime steps forward and presses charges. Meanwhile, back at the office, he did as much to protect the safety of his fellow countrymen and soldiers as any venerated war hero known to the public.

Technically, just about every person in Russia sent to the gulags first signed a confession for crimes against the state, if only committed inside their heads.

If that's the standard of guilt, guilt can go fuck itself.

Comment the amazing unicycle with sidecars and yoke (Score 2) 199

I'm sympathetic to PHK, but I could never have written this piece myself without commenting on at least one disadvantage of the Chinese wheelbarrow.

You seem to be stuck with one of three problems:
* using a small wheel that won't easily roll over path obstructions
* having the wheel intrude into the barrow, obstructing tending or shifting the load
* having a large wheel under the barrow with a high center of gravity (what could possibly go wrong?)

The large carts at my nearby Costco are set up so that they won't pivot at the front (only at the middle). This is fine if you can find space to make a 90-degree turn on the spot. It's not at all good for creeping around a tight bend. Moreover, you've got both the front and back ends swinging at the same time—which is the number of places you can visually attend, plus one—so your chances of taking down some rickety display item are fairly decent if you try to wing it.

Furthermore, nothing prevents two people from grabbing different handles on the European wheelbarrow. Also, PHK is wrong about the weight distribution. With a heavy load, it's customary to pile as much as possible up against the lip that protrudes over the front wheel in many front-wheel designs. I'd guess a European wheelbarrow front-loaded with wet clay has about a 4:1 lever arm in vertical displacement of the handle compared to vertical displacement of the load.
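That 4:1 guess is nothing more than similar triangles on the lever, sketched here with distances invented for illustration:

```latex
% Treat the barrow as a rigid lever pivoting on the wheel axle. Assume
% the clay is piled about 0.3 m behind the axle and the hands grip
% about 1.2 m behind it. For a small tilt \theta, vertical
% displacements scale with distance from the pivot:
\frac{\Delta y_{\text{hands}}}{\Delta y_{\text{load}}}
  = \frac{L_{\text{hands}}\sin\theta}{L_{\text{load}}\sin\theta}
  = \frac{1.2\ \mathrm{m}}{0.3\ \mathrm{m}} = 4,
% so the hands travel four times as far and bear roughly a quarter
% of the load's weight.
```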

Wouldn't a Chinese wheelbarrow be something like a small unicycle with saddlebags and a trailer hitch? If you need to clear some brush (where only your wheel fits the path), you've got no way to jack the suspension under the load, either.

And wouldn't it be much harder for short and tall people to share the Chinese design unless it's equipped with some sort of adjustable handle? Somehow I'm just positive that the Chinese design from 1000 [BC|AD] comes replete with ergonomic dongles for the comfort of whatever schlep needs it next.

But then, with a billion identical people growing rice on ten million identically manicured terraces, I'm sure the Chinese design is a total win.