Comment This is about the MATH library, folks (Score 1) 36

Has anyone noticed that this rather underwhelming announcement isn't about the Linux operating system at all, but rather just about the math library that Red Hat Linux includes as part of supporting the C language standard runtime?

I mean, I don't want to dismiss the value of correctly calculating arctangents; I like trigonometric functions as well as the next bloke. But this certification has nothing at all to do with any operating system functions like real-time scheduling, device I/O, file systems, network communications, security, or the myriad other things that make operating system development a complex, fascinating, and ever-evolving discipline. No, it's just about a modest-size lump of software being certified to make correct and error-free calculations of mathematical functions, bringing a moment of well-deserved(?) glory to a little back-corner of computer science that has been well-understood for perhaps 50 years.

It's not a bad thing, of course; it's just not very exciting. Just the thing to justify pumping out a press release during what must otherwise have been a rather slow news month.

Comment use the neighbor's built-in hotspot (Score 2) 344

Get a Comcast account by some other means. Get a directional antenna. Log in to a neighbor's Comcast WiFi router using the default "xfinitywifi" hotspot SSID that comes with every Comcast-provided router. Presto: WiFi for life.

I did this at a vacation property. Because I already had Comcast at home, I didn't need a new account; I just used my existing credentials to log in to my neighbor's hotspot, so there were no additional costs at all. It worked for years without a hitch until I sold the place. (Pro tip: this is also a great way to get WiFi in an apartment.)

This is one of the relatively unheralded great features of Comcast Internet: there are (literally) millions of "xfinitywifi" hotspots all over the country (even in rural areas) available to any Comcast customer. I detest Comcast customer service as much as the next fellow, but this universal hotspot SSID is a great thing.

Alternatively, he could be aboveboard about it and offer to split the Comcast costs with a neighbor. Or even pay the whole monthly fee, like he would have paid otherwise ("Hey neighbor, want free WiFi for life?" seems like it would be hard to turn down). Maybe run direct burial cable through the backyard. Or a point-to-point laser (but that's likely pretty expensive, too).

Maybe this guy isn't a techno-geek himself, but surely he could find one... in Seattle, right? I sense a distinct lack of creativity here, coupled with a tendency toward whining.

Submission + - Possible superconductivity in the brain

time961 writes: Pavlo Mikheenko, a superconductivity researcher at the University of Oslo, has published a paper in the Journal of Superconductivity and Novel Magnetism (abstract only; arxiv pre-print here) suggesting that microtubule structures in pig neurons exhibit evidence of superconductivity, which could provide a mechanism for quantum computation in the brain and help explain its phenomenal information-processing power. The observed effects (at room temperature and standard atmospheric pressure) are claimed to indicate a critical temperature of 2022 +/- 157 K, far higher than the 135 K achieved in other materials under similar conditions.

Interesting, if true.

Comment clever ideas, at what cost? (Score 1) 21

There are some very cool ideas here, particularly the use of hydraulic coupling to measure pressure and vibration, as well as the measurement of thermal properties. It's a very nice sensor for robot fingers.

However, it seems to be targeted solely toward bulk perception, since the fingertip looks smooth and uniform. This is quite different from human fingertips, where the fingerprint ridges provide a significant component of tactile perception, particularly in motion. It's also not obvious why the thermal measurements should correspond to human perception, which depends at least as much on the thermal (and mechanical) properties of what's underneath the material as on the material itself. For example, the sensation of touching a clothed live person's arm is very different from touching a clothed mannequin, even if the clothing materials are identical. But even so, it's a big improvement on florid adjectives.

Unfortunately, the website seems completely devoid of information about ordering their products or services, or of any examples of their measurement results, which looks like a bad sign. It would be a pity if they were aiming to introduce an expensive proprietary standard. Many measurement standards in the physical world (e.g., ASTM) are hugely and disproportionately expensive (not to mention comparatively ancient in technical sophistication--Shore Durometer, anyone?). These costs form a significant barrier for small businesses attempting to introduce a novel product or material. A cynic might say that's precisely why large established enterprises provide financial support for those standards-setting organizations.

Comment Re:Err, not the "birth of time-sharing" (Score 1) 146

JOHNNIAC Open Shop System (JOSS) was another early time-sharing system, demonstrated in 1963. By 1964, the time-sharing idea was becoming widespread.

But, yes, indisputably, Dartmouth gave us BASIC, and like George Washington's proverbial axe (which had both its head and handle replaced multiple times), BASIC remains with us today. At least it's not as harmful as C; BASIC arrays and strings always had bounds-checking.

Comment Economic bias, not just cultural (Score 4, Insightful) 379

As others have observed, older workers tend to want to be compensated for their experience... so they're more expensive.

In a rational hiring world, that might not matter much--they usually deliver greater value, too--but it's often not rational people (or, let's be polite and say, people who could be better-informed) that are making that decision--it's people who want to minimize costs no matter what.

Hire an expensive engineer who really understands the work? Or two young cheap ones who might not? The latter, of course--for the same reason that outsourcing to the third world is so popular despite the incredible hurdles of management and quality. And if the bet fails, and neither of the young'ns can get it done (despite the 80-hour weeks that they can deliver and have come to expect), well, you'll be off to another job by then anyway and no one will know.

It's a vicious cycle: VCs like start-ups that live on ramen noodles because they're cheap to fund, unlike ones that have a real staff and a real track record. And sure, some of those cheap ones will succeed, and they'll get the press (in no small part because they are young), and that will perpetuate the myth that only young folks can innovate, leading the VCs to believe in their own decisions.

I don't see the bias going away. As a general rule, young people are less expensive, more dedicated, more attractive, and just more fun than us old farts. The market wants crap in a hurry, and this is one of the primary reasons it gets it.

Comment Re:Missed opportunity? (Score 1) 17

You might think that larger gates are an inherent advantage, but it's not that simple. To a modest extent the advantage is there, but the counter-effect is also strong: smaller gates present that much less cross-section in which a particle hit can deposit charge or cause damage. In practice, radiation tolerance depends much more on a bunch of other process characteristics, and it is very difficult to predict.

Failover is rarely "simple". There's a lot of code and mechanism, somewhere, to decide when a failure has occurred, determine the kind of failure, apply the applicable recovery procedures, restore what context can be restored, and resume. This is a lot easier to do when you're not also trying to fit into 32KB of flash.

Space computing is very conservative. It is astonishing how much has been accomplished with such simple processors. But advances in the semiconductor art beg to be used, and projects like this could help light the way if not hamstrung by limited architectural choices.

Comment Missed opportunity? (Score 3, Insightful) 17

Arduinos make somewhat more sense than phonesats (Really? We're sending a touch-screen and graphics controller into low earth orbit? Because the boss couldn’t think of any sillier project and had a spare $100K for launch costs?).

But it's hard to understand why a 17-wide parallel configuration of 8-bit microprocessors each having just 2.5KB of RAM makes for a sensible satellite payload processor. Why not something with an architecture more like a Raspberry Pi or BeagleBoard? Not those specific boards necessarily, but a similar, simple one-chip SoC approach and a decent amount of memory. A processor like that could drive a bunch of experiments (more than you can fit in a Cubesat), and have enough room for the software to be comfortable and maybe even maintainable on-orbit.

A SoC-based system would fit in the same low cost profile but could run much more interesting applications. Ardusat feels like a missed opportunity, because it has lots of other things going for it: open source, submission process, international coalition, hobbyist/student focus, etc.

Comment Just be sure your customers acknowledge it (Score 1) 364

Consultants can largely solve this problem by having customers declare explicitly that the work doesn't fall in the realm of taxable services as defined by the ruling.

There's so much ambiguity in the wording that as long as you're not in the crosshairs of being a reseller who supplies expensive software (think Oracle, not so much Windows) in the guise of a (heretofore) non-taxed service, you'll be fine. It's not worth their time to enforce it otherwise.

The key is being creative. Supplying customized Drupal installations? No, you're writing unique software to customer specifications for the customer to use with their existing Drupal platform. And maybe you're supplying training about operation and installation of Drupal systems. And helping them evaluate their business needs that might be met by aforesaid custom software. The ruling (section II) even explicitly exempts "training" and "evaluation". Maybe a small fraction of your business might fall under the ruling, but if that's the case, you just need to make sure it's covered by separate contracts. If there isn't significant money flowing out of your business for (reseller tax-exempt) software that your customers eventually get, it will be pretty challenging for the DOR to argue that your business is taxable... as long as you're smart about how you define the business.

I'm as worried as the next fellow about jackbooted thugs from the government running my business into the ground. However, the reality here is that these are overworked civil servants who are motivated by meeting their goals--and they'll do that by pursuing the cases that the statute is intended to target, because those will be most likely to generate revenue. No bureaucrat wants a lawsuit, they want passive compliance. Maybe ten years from now, it will be different, but if it is, I'd bet it's because the law is expanded (to cover all services, in the name of "fairness"), not because this statute is egregiously misinterpreted.

Comment Embed logging technology in your software (Score 1) 205

By this I mean that you should instrument the code with real, meaningful activity logging, not just some afterthought that grabs a stack trace and some state variables (although you'll want to have that data, too). If you instrument your code with logging that produces readily human-interpretable information about what's going on, the payback is huge, because it makes internal developers' lives easier, and it allows even first-level support folks to do a better job of triage and analysis. It's really important to make it meaningful to the human reader, not just "readable"--an XML representation full of hexadecimal doesn't cut it; it needs to include symbolic names.
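As a rough illustration (using Python's standard logging module; the subsystem, event, and field names here are invented), the difference between a log line a support person can read and one they can't:

```python
import logging

log = logging.getLogger("orders")  # hypothetical subsystem name

def ship_order(order_id: str, warehouse: str, item_count: int) -> None:
    # Meaningful: a symbolic event name plus named, human-readable fields.
    log.info("order.shipped id=%s warehouse=%s items=%d",
             order_id, warehouse, item_count)
    # Opaque: the same event as a hex blob that nobody can triage from.
    # log.info("evt=0x2A state=0x00FF3C21")
```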

Let the users see the logged data easily, if they ask for it, and maybe give them a single knob to turn that controls the level of logging. This will help technically sophisticated users give more useful reports, and it's really helpful in any sort of interactive problem resolution (OK, do X. Now read the last few log messages. Do any of them say BONK?).
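A sketch of that single knob, assuming an arbitrary 0-2 verbosity scale mapped onto the standard Python logging levels:

```python
import logging

# Hypothetical user-facing verbosity knob.
VERBOSITY_TO_LEVEL = {
    0: logging.WARNING,  # default: only problems
    1: logging.INFO,     # normal activity
    2: logging.DEBUG,    # detailed tracing for bug reports
}

def set_verbosity(knob: int) -> None:
    logging.getLogger().setLevel(VERBOSITY_TO_LEVEL.get(knob, logging.WARNING))
```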

It's really useful to include high-resolution time--both clock time and accumulated CPU time--in log messages. This is great for picking up weird performance problems, or tracking down timeouts that cause mysterious hangs. Depending on your architecture and implementation technology, other sorts of "ambient" data (memory usage, network statistics) can be useful here, too.
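One way to get both kinds of time into every message, sketched with only the Python standard library (the format string is just an example):

```python
import logging
import time

class CpuTimeFilter(logging.Filter):
    """Attach accumulated process CPU time to every log record."""
    def filter(self, record: logging.LogRecord) -> bool:
        record.cpu = time.process_time()
        return True

handler = logging.StreamHandler()
handler.addFilter(CpuTimeFilter())
handler.setFormatter(logging.Formatter(
    "%(asctime)s.%(msecs)03d cpu=%(cpu).3fs %(levelname)s %(name)s %(message)s",
    datefmt="%H:%M:%S"))
logging.getLogger().addHandler(handler)
```

A message then shows wall-clock time alongside CPU seconds, so a long gap in the first with no growth in the second points at waiting (I/O, locks, timeouts) rather than computation.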

There's a trade-off between logging by frameworks, mixins, macros, etc., and logging of specific events. The former approach gets comprehensive data, but it often can't provide enough contextual semantic information to be meaningful. The latter approach scatters logging ad-hoc throughout the code, so it's very hard to make any argument for comprehensiveness, but if done properly, it's spot-on for meaningful messages. Usually best to do some of each, and have good control knobs to select.
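A sketch of the two styles side by side, with a decorator standing in for the "framework" half and a hand-placed event for the semantic half (all names invented):

```python
import functools
import logging

log = logging.getLogger("app")

def traced(func):
    # Framework-style: uniform coverage, but it only knows names and arguments.
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        log.debug("call %s args=%r kwargs=%r", func.__name__, args, kwargs)
        return func(*args, **kwargs)
    return wrapper

@traced
def reprice(cart_id: str, discount: float) -> None:
    # Specific event: hand-placed, and it says what happened in domain terms.
    log.info("cart.repriced cart=%s discount=%.2f", cart_id, discount)
```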

Logging can generate a lot of data, so it's important to be able to minimize that burden during routine operation (especially in deployed applications, where there should be a strict limit on the amount of space/time it takes up). But it's also useful (especially when it's configured to generate a lot of data) to have tools that allow efficient ad-hoc review and analysis--an XML tree view, maybe filtered with XSLT, can be easier than a giant text file.
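For the deployed case, a sketch of capping disk usage with the standard library's rotating handler (file name and limits are arbitrary examples):

```python
import logging
from logging.handlers import RotatingFileHandler

# Roughly 5 MB per file, four files total; the oldest data is discarded.
handler = RotatingFileHandler("app.log", maxBytes=5 * 1024 * 1024, backupCount=3)
logging.getLogger().addHandler(handler)
```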

In any complex system, logging is one of the very first things I recommend implementing. After the architecture is settled enough to know what will be some of the meaningful activities and objects to record, bolting in a high-efficiency, non-intrusive logging infrastructure is the very next step. Then comes business logic, user interface, and all the other stuff. Pays for itself many times over.

Comment Re:Historically, NSA have done the opposite. (Score 1) 407

Considering the rest of Coppersmith's work, I have no trouble believing in his genius or that he independently invented differential cryptanalysis. Are you suggesting that he didn't, and instead lied about it 20 years later?

Your post rather mischaracterizes the content of that section of Wikipedia. It is hardly "everyone else's version" that NSA made changes. That section cites both the Senate inquiry and Walter Tuchman (then of IBM) as saying that NSA did not dictate any aspect of the DES algorithm. The Konheim quote ("We sent the S-boxes to Washington...") is an un-referenced comment from Applied Cryptography (which says "Konheim has been quoted as saying..." without saying where or by whom). Schneier goes on to express admiration for IBM's work and how it scooped the rest of the open crypto world for 17 years.

In any case, the important point is that changes were made, whether by IBM alone or in collaboration with NSA, and they unequivocally made the algorithm much better, as opposed to the conspiracy theory that NSA made it worse. The 56-bit key is reasonably commensurate with the security DES actually supplies (against the attacks of the day, secret and otherwise). Now if it had turned out to be weak against linear cryptanalysis, or indeed any other attack of the last 40 years, that would be news--but it's not weak, it's just average, strongly suggesting that no better attacks were known back then.

Comment Re:Historically, NSA have done the opposite. (Score 5, Interesting) 407

Biham and Shamir, Differential Cryptanalysis of the Data Encryption Standard, at CRYPTO '92. They showed that the S-boxes were about as strong as possible given other design constraints.

Subsequently, Don Coppersmith, who had discovered differential cryptanalysis while working (as a summer intern) at IBM during the development of DES in the early 1970's, published a brief paper (1994, IBM J. of R&D) saying "Yep, we figured out this technique for breaking our DES candidates, and strengthened them against it. We told the NSA, and they said 'we already know, and we're glad you've made these improvements, but we'd prefer you not say anything about this'." And he didn't, for twenty years.

Interestingly, when Matsui published his (even more effective) DES Linear Cryptanalysis in 1994, he observed that DES was just average in resistance, and opined that linear cryptanalysis had not been considered in the design of DES.

I think it's fair to say that NSA encouraged DES to be better. But how much they knew at the time, and whether they could have done better still, will likely remain a mystery for many years. They certainly didn't make it worse by any metric available today.

Comment The GSM ciphers are an interesting story (Score 2) 407

I can't find a good reference right now, but I recall reading a few years back the observation that one of the GSM stream ciphers (A5/1?) has a choice of implementation parameters (register sizes and clocking bits) that could "hardly be worse" with respect to making it easily breakable.

This property wasn't discovered until it had been fielded for years, of course, because the ciphers were developed in the context of a closed standards process and not subjected to meaningful public scrutiny, even though they were nominally "open". The implication was that a mole in the standardizing organization(s) could have pushed for those parameters based on some specious analysis without anyone understanding just what was being proposed, because the (open) state of the art at the time the standard was being developed didn't include the necessary techniques to cryptanalyze the cipher effectively. Certainly the A5 family has proven to have more than its fair share of weaknesses, and it may be that the bad parameter choices were genuinely random, but it gives one to think.

Perhaps some reader can supply the reference?

The 802.11 ciphers are another great example of the risks of a quasi-open standardization process, but I've seen no suggestion that the process was manipulated to make WEP weak, just that the lack of thorough review by the creators led to significant flaws that then led to great new research for breaking RC4-like ciphers.
