
Comment when search returns models (Score 2) 216

For example, Boost is really sweet when you need to slam together a pile of code and have it working out of the gate with minimal fuss, but if performance is an issue, you can't use it.

Wow, that's just bizarre. I don't know where you get your misinformation, but it's an elite grade of batshit.

The whole point of Boost is that it maintains a certain amount of abstraction without boxing you into a performance corner. Were it not for those conflicting goals, the devilishness of its internal machinery could not be justified.

Template metaprogramming essentially involves expressions converting themselves into a symbolic representation that doesn't resolve into a concrete expression (by means of purely functional transformation at a quasi-syntactic level) until some final result is demanded, at which point the highest-performance code path can be selected based on the actual parameters (more specifically, often exploiting which parameters vary and which are constant or nearly constant).
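
To make that concrete, here is a minimal expression-template sketch, a toy in the spirit of what Boost does internally rather than any actual Boost machinery: operator+ builds a symbolic node instead of computing anything, and the expression collapses into real work only when a result is finally demanded at assignment time.

    #include <cstddef>
    #include <utility>
    #include <vector>

    // Symbolic representation of l + r: building one is free, and no
    // arithmetic happens until someone indexes into it.
    template <typename L, typename R>
    struct Sum {
        const L& l;
        const R& r;
        double operator[](std::size_t i) const { return l[i] + r[i]; }
        std::size_t size() const { return l.size(); }
    };

    struct Vec {
        std::vector<double> data;
        explicit Vec(std::size_t n) : data(n) {}
        double  operator[](std::size_t i) const { return data[i]; }
        double& operator[](std::size_t i)       { return data[i]; }
        std::size_t size() const { return data.size(); }

        // The "final result demanded" moment: one pass, no temporaries.
        template <typename E>
        Vec& operator=(const E& expr) {
            for (std::size_t i = 0; i < size(); ++i) data[i] = expr[i];
            return *this;
        }
    };

    // Constrained to vector-like operands (anything with a .size()),
    // so plain double + double still resolves to the built-in operator.
    template <typename L, typename R,
              typename = decltype(std::declval<const L&>().size())>
    Sum<L, R> operator+(const L& l, const R& r) { return {l, r}; }

    int main() {
        Vec a(1000), b(1000), c(1000), out(1000);
        out = a + b + c;  // fuses into one loop; no intermediate Vec
    }

The payoff: a + b + c never materializes an intermediate vector; a real library selects among code paths at that final moment with far more elaborate trait machinery, which is exactly how the abstraction avoids the performance corner.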

The problem with Boost is similar to what Knuth said about the problem with literate programming.

Literate programming demands a high proficiency with two different skills: formal reasoning and verbal expression. This shrinks the available pool of adherents and adopters. And worse, there's a terrible opportunity cost, because the people out there who have extremely high proficiency in both of these skills are in extremely high demand to take on central roles in large projects where they don't spend their hours bent over literate code.

The kind of environment where Boost can be best exploited for both its abstraction and its performance is going to be wonk-filled boiler-rooms at high frequency trading companies where the cash, the talent, the commitment, and the project duration mesh together. Importantly, the project specification in these environments is often in continuous, long-term evolution as your firm chases whatever edge it thinks it might have in a chaotic, rapidly-shifting market environment. The month you spend poring over low-level optimization gets deployed for a whole week. The month you spend automating your Boost framework to achieve nearly the same performance becomes a permanent code asset (and a competitive asset whenever you find yourself needing once again to run that old play).

Boost is in that category where if you have to ask, you can't cut the mustard. The natural Boost programmers already know who they are. Few of these people toil in the public eye. That's not where this elite, double-barrel skillset tends to land.

The Wolfram language is impossible to assess based on this video. If your application depends on Wolfram "knowledge", how do you know it will continue to meet rigorous specifications the day after tomorrow?

Is there a public regression suite on the contained knowledge against which to assess whether your program is erected on firm or porous soil?

What guarantee does one have that its cleverness or performance characteristics will stay consistent when it matters most?
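
For illustration of what that regression suite might look like (purely hypothetical; the eval callback stands in for whatever actually queries the engine, and nothing here is a real Wolfram API):

    #include <cstdio>
    #include <functional>
    #include <map>
    #include <string>
    #include <vector>

    // query -> the answer the application was built and validated against
    using PinnedKnowledge = std::map<std::string, std::string>;
    using Eval = std::function<std::string(const std::string&)>;

    // Returns every query whose answer has drifted since the pins were
    // recorded, i.e. every spot where the soil turned out porous.
    std::vector<std::string> drift(const PinnedKnowledge& pins,
                                   const Eval& eval) {
        std::vector<std::string> changed;
        for (const auto& [query, expected] : pins)
            if (eval(query) != expected)
                changed.push_back(query);
        return changed;
    }

    int main() {
        PinnedKnowledge pins = {{"D[x^2, x]", "2 x"}};
        // Frozen stand-in evaluator: answers straight from the pins, so
        // nothing can drift. A live engine would be queried here instead.
        Eval frozen = [&](const std::string& q) { return pins.at(q); };
        std::printf("%zu drifted queries\n", drift(pins, frozen).size());
    }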

I suspect the killer application for WooL is prototyping the semantic web. The semantic web has been dragging its feet. Google and Facebook don't wish to become disintermediated. They have one foot on both sides of this fence and their hands cupped over their testicles. Doesn't make for rapid progress.

The Achilles heel of search is that search returns results rather than models. Google is trying to split the difference by having search return interactions. It's an excellent paving stone on the road to a lucrative future purveying OOXML.

If ten minutes of coding within the Wolfram Language embarrasses Google search, we have a winner here of WuLing mammoth proportions.

Comment lipstick and suction cups (Score 4, Insightful) 241

If you're still running 10.6, your computer is either a low-end one from at least 7 years ago, or you've made an intentional choice to remain on 10.6 for some reason

It used to be that low IQ was failing to identify the continuation of some trivial numerical sequence on some trivial test. The new low IQ is use-case blindness, the inability to even hazard a guess at the myriad reasons other people live differently than you do. The ravening mob of blindness promulgators is ever with us. Pity.

Here's my story.

I bought my wife a second generation Core Duo iMac, which I believe has never been upgraded from the original Leopard. I use this computer so rarely (about ten hours per year) that I can barely keep track of which leopard presently holds court.

The computer works—until some piece of software offers to "upgrade" itself, then restarts with a whole new user interface (I'm looking at you, iTunes). Then I'm constantly told the computer doesn't work any more, but the real problem is that she hasn't figured out where all the familiar functions were forcibly relocated.

I'm not willing to sit down at her desk and chase GUI tidbits from point A to point B, so I just told her "don't click upgrade". When something visibly breaks, then I'm willing to sit down and deal with it. Meanwhile I have enough sysadmin on my plate with my own Linux desktop, where I'm heavily invested in ZSH, and my FreeBSD server, where I'm making very heavy use of ZFS. This is where my neural matter wants to go.

I have a very low tolerance for having something trivial I've mastered at the autonomic level yanked back to the center of my attention. It took me close to a decade to cease seething about the relocation of the CTRL key in favour of a CAPS LOCK key that should have been ALT-NUMLOCK or, even better, CTRL-ALT-INSERT. FFS I can type ~50 wpm in ALL CAPS using the right shift key for six of my fingers, alternating to the left shift key for the other two. But guess what? The CAPS LOCK key is more prominent to my left pinkie than ENTER is to my right pinkie. If we normalize the utility of the ENTER key to 100, the utility of the CAPS LOCK key comes out around -1000.

The problem with most upgrades is that it's always more of this father-knows-best groupthink bullshit.

It's a huge project just to figure out what's going to change. The only recourse one has to all these unnecessary relearning cycles is to skip as many releases as humanly possible. I'd be thrilled if XP were the last Microsoft OS I learn how to use in this lifetime. I was an early adopter of Windows 2000 and I stayed there until 2000 went out of support. Later I ended up using XP in a different work environment and I can't name a single thing that improved, except that I had to disable a lot more bling for half a day. Long ago I held out on MSDOS until I could jump straight to Windows NT, which I adopted within weeks of the Intel P6 becoming available. That was a real upgrade, one well worth reprogramming a decade of autonomic habits. I never used any of the shitshow 3.1/95/98 for more than the very occasional hour.

These upgrades change a lot of stuff for extremely dubious benefits. An upgrade is going from UFS to ZFS. That I can buy into. An upgrade is going from System 7 to OS X. On that one I can sell my wife.

What I really want concerning these fairly useless system frobs is the semantic web: searchable metadata describing every user interface action that formerly existed and whether it still exists in the new version, plus a mapping to a more-or-less equivalent version, if such a thing has even been retained. Oh yes, Apple is good at silent castration. Ideally the OS would track which user interface functions have been regularly used, and list out all the things the upgraded user will be instantly forced to relearn. But no. It's sexy. No assistance offered retraining for sexy. That's what sexy means, lover boy.
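
Purely as a sketch of the wish (no OS ships anything like this, and every identifier below is invented), the metadata could be as simple as a table of what changed, cross-referenced against the actions a user actually exercises:

    #include <cstdio>
    #include <map>
    #include <optional>
    #include <string>
    #include <vector>

    struct Migration {
        std::optional<std::string> new_menu_path;  // empty: silently removed
        std::string notes;                         // e.g. "shortcut dropped"
    };

    // Only actions that changed appear in the table; absence means untouched.
    using MigrationMap = std::map<std::string, Migration>;

    // The day-one forced-relearning list: every habitual action that the
    // upgrade relocated or castrated.
    std::vector<std::string> relearn_list(const std::vector<std::string>& habits,
                                          const MigrationMap& upgrade) {
        std::vector<std::string> out;
        for (const auto& id : habits) {
            auto it = upgrade.find(id);
            if (it == upgrade.end()) continue;      // untouched: no retraining
            if (!it->second.new_menu_path)
                out.push_back(id + ": gone (" + it->second.notes + ")");
            else
                out.push_back(id + ": moved to " + *it->second.new_menu_path);
        }
        return out;
    }

    int main() {
        MigrationMap upgrade;
        upgrade["itunes.show_duplicates"] = {std::nullopt, "removed outright"};
        upgrade["itunes.get_info"] = {std::string("Song > Info"), ""};
        for (const auto& line : relearn_list(
                 {"itunes.show_duplicates", "itunes.get_info", "itunes.play"},
                 upgrade))
            std::puts(line.c_str());
    }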

Until such a world exists, small upgrades require leaping over a trust quantum with the 2.7-kelvin cosmic background radiation as the only available heat source.

The other reason to upgrade her OS would be the occasion of replacing her old disk drive with a new SSD, which is long overdue, but there was a small roadblock. I got about 15 minutes into this project before I discovered that suction cups are involved.

Just what I need in my service kit: suction cups.

Two things.

If it works, don't fix it.

If it wants to be a disposable appliance to the very core of its being, let it die in its birthday suit.

Comment Bright Phone (Score 1) 333

I simply don't install applications on Android that ask for abusive permissions, which pretty much puts my phone back into the stone age. I don't need the project right now of installing a root kit, tweaking non-standard security settings, then wondering whether the next glitch is something I have to fix myself.

Net effect, so far as I'm concerned, is that the smart phone has not been invented yet.

I've always considered the Brights movement to be tragically misnamed (almost cringe-worthy) but at this point I'd have no problem carrying around a Bright phone where the device's intelligence was on my side for once.

Comment Sir Isaac Newton's lesser protégé (Score 1) 254

Kurzweil is probably a good deal less bright than Sir Isaac Newton, but also a good deal less crazy, his barmy extrapolation of the singularity notwithstanding. Clearly Google hired the man based on the smartest thing he's accomplished rather than the dumbest thing he espouses.

I've thought about this for a long time, and I'm only 99% convinced Kurzweil is wrong. He holds the record for the most ridiculous thing I've ever heard for which I maintain a non-zero sliver of belief. That said, extropian immortality sure as heck isn't life as we presently know it. Even if he's right, I'm not sure I give a damn about my xeno-species future extropian self.

What's left of me as I presently know myself would be just a little sliver of MSDOS buried somewhere deep inside Windows 8, though that might be just enough to properly enjoy hearing Raymond-prime mutter, "Oh, indeed, my original Raymond self, he was such a twit, wasn't he? Every so often I simulate his ego as a kind of Positronic CPU burn to keep my immortality in good working order, but only when the liquid helium is in copious supply."

His weirdest belief of all is that you can multiply something by a million and it gets a million times better, rather than, more aptly, just a million times different.

Comment finding Jesus behind bars (Score 1) 742

1. More standards compliant

The difference in standards compliance between modern browsers is a supermodel vs an air-brushed supermodel.

The difference between IE5/IE6 and other browsers of the era was a supermodel vs watching your own parents having sex.

One could say the same thing about Visual C++ as well.

The thing about Goldman Sachs is that you never get to ruin the economy twice in exactly the same way. There's relentless pressure to innovate concerning your grand malfeasance. It's so comforting to know that Goldman now goes to church on Sunday mornings and sings the sub-prime anthem.

Only a failed criminal tries the same scheme twice. The key is to make such obscene profits the first time that you can sit tight long enough for the apologists to paper over your track record before hatching your next plan.

The title character is a poor and fatherless teenager growing up in The Bronx. Billy and his friends are in awe of the flashy mobsters in the neighborhood. Dutch Schultz and Otto Berman, based on the real-life mobsters, hire Billy as a gofer and become mentors to him. The gangsters take Billy up to their upstate hideaway, where they are awaiting a trial. Schultz becomes a community leader and converts to Catholicism.

Comment bragging rights from body bags (Score 1) 664

100% perfection in any non-trivial thing (whether hard or soft) is impossible.

This is 100% semantic wankery, because triviality is circularly defined as the magic threshold beyond which bugs become inevitable.

Of course there's an implicit ego frame of reference, because we're all looking for an edge on the margin where the big money lives. They didn't call it a "space race" for nothing. It would be far more pertinent to observe about the human species that bragging rights come from body bags. That's just how we run our affairs on the larger political scales.

I can't stand the intellectual posing that ensues whenever someone espouses the culture of bug mitigation with extreme prejudice. Oh, nothing can ever be perfect—as if that's ever the human standard in anything. Part of this is IQ wanking: the notion that writing bug-free code requires superhuman feats of logical perfection. Successfully reasoning your way out of a wet paper bag has something to do with writing bug-free software, but it's a secondary term at most.

The real key to writing defect-minimized systems is a good understanding of human psychology and mental frailty, keeping notes, and constantly upping your game.

It's a rare piece of software that is more robust than the worst API it programs against. Even if the code behind that API is 100% bug free, you're far from out of the woods, because the API can be designed in such a way as to delegate complexity uphill.

No doubt Opportunity was far from perfect, but it sure as hell sets a god-like bar compared to what passes for quality work in 98% of the make-a-buck sphere.

Comment tanquam ex ungue leonem (Score 1) 82

There are even patents filed which allow identification of individuals by this fingerprint alone.

The government is doing things for which there are even patents? Wow. I had no idea.

Geez, with IPv6 giving every single web client a distinct address, you'd think the NSA would be campaigning behind the scenes to have their carefully curated fat-pipe monopolists ram IPv6 down our collective throats.

And damn, what a surprising patent, with only about a thousand years of prior art.

On 29 January 1697 Newton returned at 4pm from working at the Royal Mint and found in his post the problems that Bernoulli had sent to him directly: two copies of the printed paper containing the problems. Newton stayed up until 4am before arriving at the solutions; on the following day he sent a solution of them to Montague, then president of the Royal Society, for anonymous publication.

He announced that the curve required in the first problem must be a cycloid, and he gave a method of determining it. He also solved the second problem, and in so doing showed that by the same method other curves might be found which cut off three or more segments having similar properties.

Solutions were also obtained from Leibniz and the Marquis de l'Hôpital; and, although Newton's solution was anonymous, he was recognized by Bernoulli as its author; "tanquam ex ungue leonem" (we recognize the lion by his claw).

I guess that cuts both ways.

PS: Notice our fine Slashdot Classic buggering poor Mr l'Hôpital's name into "l'HÃpital".

Comment Re:Finance is a valuable activity (Score 4, Insightful) 712

If you need evidence of how valuable it is, merely look at our recent financial crisis when the flow of money froze up.

That's just about the dumbest thing I've ever heard.

I guess you don't recall the August 1981 air-traffic controllers strike. Most of those wealthy bankers could be replaced by people being paid 10% as much, and after a few years we'd hardly notice the difference, except perhaps that the new lot wouldn't be nearly so adept at screwing the system over.

I guess if you were running NASA you'd pay a billion dollars per o-ring, because, gosh, look at what happens if it won't deform, and the size of the bill if we need to replace the dumb thing. Ten thousand parts at a billion dollars each sure adds up. When you think about it, with each o-ring protecting the safety of a ten-trillion dollar shuttle, a mere billion dollars per ring is a screaming deal, wouldn't you say?

Finance wasn't rocket science until the inmates figured out that astrobucks are a good living. It doesn't need to be rocket science any more than an o-ring needs to cost a billion dollars.

The controllers had Washington by the balls. Big mistake. The bankers have Washington by the carotid artery. We can therefore infer from this that bankers do more important, more productive, more difficult work. Or we can infer that bankers are better at poring over Gray's Anatomy if it serves their personal interests.

Comment delay lines (Score 1) 120

However, in order to get this to work well, you need the transmitted signal to be phase-aligned to within an appreciable fraction of a wavelength. ... Since we are around a gigahertz, that means that the phase of the carrier should be accurate to within a couple hundred picoseconds, max.

Micrel SY89295U
Programmable delay range: 3.2 ns to 14.8 ns in 10 ps increments (2^10 discrete steps).
160 ps rise/fall, less than 2 ps RMS cycle-to-cycle jitter.

That gives you a spatial resolution of about 3 mm within a 3 m pixel on the fine delay; more if you also introduce a coarse delay line (in 10 ns increments). I think the Xilinx DCM gives a step delay on the order of 10 ps in 1024 discrete steps. You've now got 3 mm steps out to 3 km. Note that the linearity of these delay lines is not perfect, so there's some art to it (it's not a simple two-digit number in base 1000), but the worst case step remains small. You might use two DCMs in series plus the CML Micrel part to ensure uniform coverage (one Spartan 3E has four DCMs IIRC). Actually, for a multi-channel base station, you'd need to fabricate an ASIC with a very large number of programmable delay lines, as I imagined it before reading TFA.
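
For a sanity check on those numbers, here's a back-of-envelope sketch of the coarse-plus-fine decomposition; the fine-line figures are the SY89295U numbers above, the 10 ns coarse line is the one just mentioned, and the deliberate-overlap trick for soaking up non-linearity is my assumption rather than anything out of a datasheet:

    #include <cmath>
    #include <cstdio>

    constexpr double C_MM_PER_PS  = 0.2998;   // light: ~0.3 mm per ps
    constexpr double COARSE_PS    = 10000.0;  // coarse line, 10 ns steps
    constexpr double FINE_MIN_PS  = 3200.0;   // fine line's insertion delay
    constexpr double FINE_STEP_PS = 10.0;     // fine step: ~3 mm of path

    struct Setting { int coarse_steps; int fine_steps; };

    // The fine span (11.6 ns) exceeds the coarse step (10 ns), so every
    // target delay is reachable with ~1.6 ns of slack left over to trim
    // out the fine line's imperfect linearity; this is why the setting
    // is not a simple two-digit number in base 1000.
    Setting decompose(double target_ps) {
        int coarse = static_cast<int>((target_ps - FINE_MIN_PS) / COARSE_PS);
        int fine = static_cast<int>(std::lround(
            (target_ps - coarse * COARSE_PS - FINE_MIN_PS) / FINE_STEP_PS));
        return {coarse, fine};
    }

    int main() {
        double target = 8440.0;  // ps: ~2.5 m of extra path to equalize
        Setting s = decompose(target);
        std::printf("coarse=%d fine=%d (~%.1f mm granularity)\n",
                    s.coarse_steps, s.fine_steps, FINE_STEP_PS * C_MM_PER_PS);
    }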

If the phone is 150 m away from a cell transmitter, you can set up a ping-pong loop with a round-trip frequency of 1 MHz, where each end bats the pulse back as fast as possible.

Imagine the phone sends out a coloured packet and two or more base stations pong it back. The phone can ping back on the first received response, or the last, or the n'th response in between. The fastest paths need to be artificially delayed until all paths are equal time. (With multi-pathing, the radio might be able to detect and measure more than one path length per base station.)

It would take a while to achieve the coarse lock-on. Then it needs to be maintained during motion of the mobile end, plus changes in atmospheric conditions, or sway in the buildings you're bouncing off of, if you've used the loudest path instead of the quickest path. The timing fabric is quite doable. The delay line can be anywhere in the ping-pong circuit. The non-radio portions would ideally use fibre, as copper has a temperature-variable c that adds up quickly in the ps regime where lengths of 100 m are involved.
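
The ranging arithmetic itself is trivial; a sketch, with the electronic turnaround latency at each end treated as an assumed, separately calibrated constant:

    #include <cstdio>

    constexpr double C = 2.998e8;  // m/s

    // One-way distance implied by a measured loop frequency, after
    // removing the (assumed, calibrated) turnaround time at both ends.
    double one_way_distance_m(double loop_hz, double turnaround_s) {
        double flight_s = 1.0 / loop_hz - turnaround_s;
        return C * flight_s / 2.0;
    }

    int main() {
        // 150 m each way = 300 m of flight = ~1 us round trip = ~1 MHz,
        // matching the figure above (turnaround neglected here).
        std::printf("%.1f m\n", one_way_distance_m(1.0e6, 0.0));
        // Note: 2 ps of jitter is ~0.3 mm of one-way path ambiguity.
    }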

I can totally see this working, though radio systems at this level are astrobuck black magic.

The software-defined LTE phased array waveform simulation would be an interesting computational problem. They probably do the time extraction with DSP rather than actual delay lines. I'm wondering how much the upstream channel borks total throughput.

Maybe this is the Netflix special. Agility is always the last crow.

Comment WMC is an underused term (Score 1) 401

The "jump the shark" moment for "WMD" was when the surviving Boston bomber was charged with using a WMD.

His improvised kitchen device should have been termed a weapon of mass carnage. Note how the official term focuses more on loss of structure than on human life.

The reason the present example jumps the shark is that while global warming might be a supreme menace, it has not yet, to my knowledge, been successfully weaponized.

In this case, we're really dealing with an Apocalyptic Horseman of Mass Resettlement, if there's a need to be operatic.
