Comment Treading on their toes (Score 1) 34

Would the NSA's objections to the publication of "The Codebreakers" not, by any chance, refer to some less than flattering comments on the performance of this semi-mythical organization? For instance: "The bitter irony of all that is that, despite all the precautions, NSA has been involved in security breaches more spectacular and more damaging to the free world than any others in the Cold War except those of the atomic spies."

And on NSA's relations with Congress: "This stratagem plays upon Congress' fear and ignorance.", continuing a little further down with "It is much easier not to bother with checking up on NSA. But it must be done. Otherwise the nation jeopardizes some of the very freedom that NSA exists to preserve." I concede my quotes are from the 1996 revised edition, not the original 1967 edition. Still, it is hard to believe that anything in The Codebreakers could have been a technical risk to national security in 1967. A political risk to the people in the intelligence community, perhaps.

But here is a fascinating thought: American cryptography owes a lot to Elizabeth Wells Gallup, a high school principal from Michigan, who had "discovered" a secret message in the works of Shakespeare. A secret message by Bacon, of course. But Mrs. Gallup's theory attracted the attention of the wealthy George Fabyan; Fabyan hired Elizebeth Smith to help investigate it, and Elizebeth attracted (and married) William Friedman. Without that unlikely chain of events, William Friedman would never have entered cryptology, and the course of history could have been very different -- including the course of a few wars.

Comment Re:$225,000 (Score 1) 216

The paper on confocal system performance by Zucker and Price (Cytometry 44, 273-294, 2001) has a few useful data points and references, although it does not refer to two-photon systems. Maybe a later publication by them does.

I can't comment on the figures for two-photon microscopes, but on single-photon systems the ratio of the maximum power output from the objective to the input beam power tends to be depressingly low, often well below 0.1. There are fairly high losses in the microscope optics, especially in the highly corrected high-NA objectives used for confocal imaging. There is also the problem of achieving fairly homogeneous illumination of the entrance pupil of the objective with a more-or-less Gaussian laser beam, and most solutions are wasteful of power.
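
As a rough back-of-the-envelope illustration (the transmission figures below are assumed for the sake of the example, not measurements from any particular instrument), the overall throughput is just the product of the individual losses, with the Gaussian overfill of the entrance pupil handled analytically:

    import math

    # Assumed, illustrative transmission values -- real numbers vary per instrument.
    t_fiber = 0.70   # laser-to-scanhead fibre coupling
    t_scan  = 0.80   # scan optics, dichroics, relay lenses
    t_obj   = 0.75   # high-NA objective at the excitation wavelength

    # Fraction of a Gaussian beam (1/e^2 radius w) passing an aperture of radius a:
    # T = 1 - exp(-2 * a^2 / w^2). Overfilling the pupil (w > a) flattens the
    # illumination but throws away the wings of the beam.
    a_over_w = 0.5   # pupil radius is half the beam radius, i.e. strongly overfilled
    t_pupil = 1 - math.exp(-2 * a_over_w ** 2)

    throughput = t_fiber * t_scan * t_obj * t_pupil
    print(f"pupil transmission: {t_pupil:.2f}")     # about 0.39
    print(f"overall throughput: {throughput:.2f}")  # about 0.17 with these numbers

Add an AOTF, a few more mirrors and a beam expander, and you are quickly below the 0.1 mentioned above.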


Comment Re:$225,000 (Score 2, Informative) 216

A widefield deconvolution system doesn't really need a laser. Probably a lamp coupled by a liquid light guide is the better option for such a system. The excitation is not monochromatic but the illumination of the field is excellent.

Prices for this class of laboratory equipment are rarely put on paper, because you are expected to haggle. There usually is considerable margin for negotiation. Sometimes you can beat them down by as much as a third of the list price, although 10 to 20% is more common.

Why do you want to buy a Tsunami? It's a good laser system, but unless you specialize in optics or physics and are willing to spend a lot of time on tuning the system, it is better to spend your money on a laser with automatic tuning. (A Mai Tai, in Newport's case.) Performance is a bit lower than that of a well-tuned Tsunami, but certainly good enough for most purposes, and the single box is more convenient than a Tsunami plus an external pump laser.

Anyway, femtosecond pulsed laser systems are somewhere in the quarter-million range, but the small solid-state lasers in most confocal microscopes can be had for an order of magnitude less.

Comment Re:$225,000 (Score 1) 216

A bit over $50,000 will buy you a good quality inverted research microscope with a basic set of options for widefield fluorescence imaging, with a sensitive camera and a few objective lenses. If you want to add options, $1,000 will pay for an extra set of optical filters, and roughly $5,000 for a single high-resolution objective. By adding more options you can spend $100,000 on a fairly standard microscope.
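
To make the arithmetic concrete, here is a toy configuration build-up using the figures above; the motorized stage and incubation chamber are invented add-ons with placeholder prices, not quotes from any vendor:

    # Illustrative configuration costing -- placeholder prices, not vendor quotes.
    base_microscope = 50_000                 # inverted stand, camera, basic objectives
    options = {
        "extra filter sets (3 x $1,000)":          3 * 1_000,
        "high-resolution objectives (4 x $5,000)": 4 * 5_000,
        "motorized stage (assumed)":               15_000,
        "incubation chamber (assumed)":            10_000,
    }
    total = base_microscope + sum(options.values())
    print(f"total: ${total:,}")              # $98,000 with these choices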

Around $250,000 is the ballpark for a confocal system, depending on exchange rate fluctuations. (Most manufacturers are German or Japanese, still reflecting the traditional locations of the optical industries.) You would pay two or three times as much to equip it for two-photon imaging or for fancy techniques such as FCS or FLIM. Deconvolution systems are a bit cheaper (less hardware, more software), and they are also preferred for long-term experiments because they are more gentle on the cells under study.

These are expensive tools, and besides, most biologists don't understand microscopy nearly well enough to be let loose on one without assistance and supervision. The systems also need a suitable location (vibration-free, stable temperature, reasonably dark) to work well. So at universities these systems tend to sit in core facilities with a dedicated staff that keeps the instruments in shape, helps the users, and charges the departments back for their use.

Comment Re:Sounds good... (Score 1) 144

In the long term, certainly yes: variants of a disease that are not recognized by the immune system or are resistant to existing treatments will spread again, even from a small core, perhaps one created by accidental mutations. Influenza, of course, manages to do so every year, demonstrating how efficient viruses can be at playing this game. Experience with bacteria and parasites (malaria) is also rather discouraging.

However, how fast that occurs depends on quite a number of factors, including the ability of those mutant viruses to replicate, infect, and cause disease, and of course their geographical spread. Characteristic of many treatment-resistant HIV mutants is that they are less virulent, because their optimal functioning as a virus has been compromised. Resistance mutations are known for all existing antiviral drugs, and it is only a matter of time before these resistant viruses become more frequent, but meanwhile the drugs do increase the life expectancy of most patients by about thirty years. So partial solutions are worth the effort. Imagine for a moment that you had a vaccine that would largely eliminate HIV-1 subtype C, which accounts for over half the infections in the world's poorest countries. That might be worth having, even if it is only a "short-term" solution and the virus eventually reconquers the lost terrain.

Unfortunately, in this case it is not clear from the information I can access what the practical implications are of the 10% of viruses that escape the potential treatment. The summary mentions 90% of HIV-1 strains, but the abstract in Science actually claims 90% of isolates. I can't access the article from here, but probably the authors identified the sequence of the binding site for their antibody and then looked at how frequently it occurs in a database of known HIV sequences, probably the Los Alamos database. That number doesn't tell you much about the viruses in which it does not occur.
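
To make that concrete, here is a toy sketch of the kind of database survey I suspect they did, in Python; the FASTA file name and the binding-site motif are made up for illustration, and a real analysis would of course work on a curated alignment rather than a naive pattern match:

    import re

    # Hypothetical inputs: a file of envelope sequences (e.g. downloaded from the
    # Los Alamos HIV database) and a made-up binding-site motif with wildcard
    # positions. Both are placeholders for illustration only.
    FASTA_FILE = "hiv_env_isolates.fasta"
    MOTIF = re.compile(r"N.{2}T.W")

    def read_fasta(path):
        """Yield (header, sequence) pairs from a FASTA file."""
        header, chunks = None, []
        with open(path) as handle:
            for line in handle:
                line = line.strip()
                if line.startswith(">"):
                    if header is not None:
                        yield header, "".join(chunks)
                    header, chunks = line[1:], []
                else:
                    chunks.append(line.upper())
        if header is not None:
            yield header, "".join(chunks)

    total = hits = 0
    for header, seq in read_fasta(FASTA_FILE):
        total += 1
        if MOTIF.search(seq.replace("-", "")):   # ignore alignment gaps
            hits += 1

    print(f"{hits}/{total} isolates ({100 * hits / total:.1f}%) carry the motif")

A count like that tells you how widespread the site is among known isolates, but, as said, nothing about how fit the escape variants are.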

Comment Evolved Computing (Score 1) 331

Curated Computing sounds like a bad idea to me, because those third parties are making decisions without actually knowing my needs and habits as a user. Less choice is therefore very likely to lead to less relevance as well. This is the kind of computing you get in a big company where a central IT department sets policies and standards for everything, and it generally drives people who try to develop something new or display some creativity into raging fury -- even if the choices being made for you aren't braindead.

I think in the long term devices such as the iPad are going to be a success only if they can be personal enough. In theory a more convenient model could be one in which the system learns from my behavior as a user and adapts accordingly. However, so far this tends to be based on a frequency-of-use approach, which is rather limiting. It isn't much help to the less skilled user, who might never be able to find the right options. And there are potential privacy considerations if it is focused on monitoring the behavior of a single person.

A better mechanism could be a kind of 'Evolved Computing' that works like this: I make myself a member of a peer group, based on common activities and common user interface preferences. I get a software package which may be inherently flexible but complex, perhaps too complex for daily needs. Monitoring the group statistics allows the system's managers to de-emphasize some features and highlight or offer others that might be attractive to this group. As a user, I can be presented with tools that other members of my peer group have found useful, and can adapt or reject them. Another group may have another set of preferences, of course, but each group is offered the relevant subset in its user interface.
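
A minimal sketch of the group-statistics part, with invented feature names and thresholds, might look like this in Python:

    from collections import Counter

    # Invented example data: per-user feature usage within one peer group.
    group_usage = {
        "alice": Counter({"batch_export": 12, "macro_editor": 7, "mail_merge": 1}),
        "bob":   Counter({"batch_export": 9,  "macro_editor": 2}),
        "carol": Counter({"batch_export": 15, "pivot_tables": 4}),
    }

    def pooled_counts(usage_by_user):
        """Sum feature usage over all members of the peer group."""
        pooled = Counter()
        for counts in usage_by_user.values():
            pooled.update(counts)
        return pooled

    def suggested_layout(usage_by_user, highlight_n=2, min_uses=3):
        """Split features into highlighted and de-emphasized buckets."""
        ranked = [feature for feature, n in pooled_counts(usage_by_user).most_common()
                  if n >= min_uses]
        return {"highlight": ranked[:highlight_n], "secondary": ranked[highlight_n:]}

    print(suggested_layout(group_usage))
    # {'highlight': ['batch_export', 'macro_editor'], 'secondary': ['pivot_tables']}

The important difference with curated computing is that these defaults come from my own peer group, and I remain free to adapt or reject them.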

It's nothing really new -- it's traditional user feedback, and the selection mechanism for iPod apps or other extension packages. But it could be done more smoothly and more intelligently.

Comment Finding the causes of disease (Score 1) 103

I think the most useful application of systems like this (not necessarily this one) would be in research to identify the causes of disease. There has been a long-standing suspicion that bacterial or viral infections at least contribute to many diseases, from asthma to depression, but this has been difficult and costly to investigate on a large scale. The most infamous case is chronic fatigue syndrome, long suspected to be the result of a viral infection. Don't forget that it wasn't until 1982 that we figured out that stomach ulcers are mostly caused by bacteria, not by stress or acidic food. The discovery that most cervical cancers are caused by a papillomavirus (of the same family as the viruses that cause warts) came a few years later. The impact on treatment and prevention of these diseases has been huge. A tool that permits clinicians to study the presence of infections, even at low levels, in relatively large numbers of patients might reveal surprising links.

Comment Re:Proof of the tenacity and ingenuity of humanity (Score 0, Troll) 175

Yes, or of over-engineering...

Most modern structures are designed to have a finite life plus some safety margin. That's not just a trick to sell more cell phones and washing machines; it is normal engineering practice to get the right balance of cost. The FAA would even refuse to approve a wing design for an aircraft that did not have a predicted but finite life. If the design life is exceeded, that can be regarded as a bonus, but often it is also considered a sign that the engineers made the thing too expensive, too heavy, or too complicated.

I guess that a life of 2,500 days for a design goal of 90 days can be justified on the grounds that, given the cost of getting it there, a premature failure would have been a great disappointment. On the other hand, maybe we could have added some useful extra sensor to the Rover, reducing its lifespan to "only" 1,000 days but providing it with a means to avoid sand traps...

Comment Re:Why the surprise? (Score 1) 222

I think it was Edmund Burke who pointed out that "it isn't always wise to do what you have a right to do". However, the common law legal system doesn't operate that way. It tends to assume that if you don't exercise your rights, you implicitly abandon them.

This is what makes companies take legal action in such cases, even if they might have sympathy for the project itself. If they do not, it might be used as a precedent next time, when somebody has a project that is less beneficial to Nintendo.

So don't blame them -- write to your Congressman instead.

Comment Serendipity != Luck (Score 1) 51

There is more to scientific serendipity than just luck. A degree of luck is certainly involved, as by definition the process involves observing something one did not plan to see. But then, that is why scientists do research: if we only ever saw what we expected to see, why bother?

But there is an important additional ingredient, and that is the ability to actually absorb the unexpected and to think of a reasonable explanation for it. The ability to give unexpected data a rational interpretation is crucial, because this is what protects good scientists from the cognitive dissonance that makes others close their eyes to the unexpected. Without interpretation, a surprising observation is just that; it may be a coincidence or an experimental error, and is often thrown out.

The most famous example is Alfred Wegener and his theory of continental drift. Mainstream science has been criticised a lot for its scepticism about Wegener's ideas, but Wegener failed to propose a credible mechanism for the motion of the continents -- the concept of plate tectonics arrived fifty years later. Without an explanation, the observation didn't convince, and Wegener was long dead when it was recognized that his intuitive idea had been right.

The reverse is also true: there is a real danger in theory without experimental observation. This is illustrated by the case of the mysterious "N-rays" of a patriotic French scientist, who "discovered" them as a counterweight to the German Roentgen's discovery of X-rays. The N-rays did not exist, but an otherwise very capable scientist proved highly capable of seeing just what he wanted to see.

Comment Yes (Score 1) 605

Yes for local systems, and that is using a wide definition of "developers" -- not just the professional software developers we employ, but also the people in "informatics" who are basically scientists who build or evaluate new software tools for data analysis, and the engineers who write software to control automated systems.

As far as I understand, this is against our own Very-Large-Company policy, and the threat of having administrator rights taken away hovers constantly over our heads (it is now said to be a central IT goal by the end of 2010), but fortunately our local IT managers do understand that we need them. It wouldn't be worth the massive overhead of routing every request for installation of new software across three different continents (I'm not kidding), and some people with difficult-to-replace skills would simply walk out if this were taken away from them.

There are always dire predictions of the disasters that will result from granting mere end-users administrator rights, but incidents are rare and usually easily resolved. Frankly, while it is true enough that many of the people with admin rights don't know that much about system administration, they still know a lot more than the average helpdesk jockey, who is in and out within a year but can apparently be trusted with the power to mess things up thoroughly...

We don't have admin rights for servers on a personal basis, but on servers dedicated to specific purposes some of us have access to service accounts with administrator rights. If a server provides services to multiple people, the IT people are usually unwilling to grant that, and I find that understandable even if I don't entirely agree.

Comment Re:You never discard the data (Score 1) 190

Yes, but that pre-supposes the ability to invent a new theory, because scientists are very unwilling to discard a theory if they have no alternative. After all, having no theoretical framework at all is very uncomfortable.

And there I do think that Dunbar makes a perfectly valid point: any group of specialists who are all of the same mind is very bad at thinking "out of the box" and inventing a new theory. To be able to do that, you need a healthy mixture of different backgrounds and enough dissent to stimulate the debate. Unfortunately scientists often assemble in excessively homogeneous groups, sometimes on their own initiative ("old boys network") and sometimes through deliberate but foolish policy ("center of excellence").

This is part of the reason why publishing and attending scientific congresses are a vital part of scientific activity. I've often noticed that in industry this is regarded as a kind of bonus, a perk that allows scientists to travel to nice places. But it is absolutely essential, even if it means talking things through with direct competitors.

Comment Re:30 IT people in a 500 employee company?! (Score 1) 837

That number of IT people sounds about like our own ratio: around 30 people in a company with 600 employees, of whom about half a dozen are helpdesk staff. In practice that is not enough for us.

Yes, it all depends on what you do. If your company only needs an off-the-shelf accounting system and MS Office, these 2 people should do fine. But we have half a dozen transactional database systems, a number of servers that are running heavyweight scientific data analysis, a commercial web interface which we need to keep running 24/7, very strong legal requirements to maintain privacy and data integrity, and almost daily requests to make changes to applications.

Our IT department therefore includes programmers, database analysts, testers and validation staff, and of course various managers. In practice these 24 extra people boil down to only 1 or 2 skilled people supporting each business-critical system, which means there are serious gaps every time people are on holiday or leave the company... We have to cover part of the gaps by allocating time from IT-skilled people in the business unit.

As for the helpdesk, simply giving everybody the same PC and the same image isn't a viable solution, although sadly our CIO is incompetent enough to think it would be. There is a rather wide gap in requirements between addressing the needs of a manager or his secretary, meeting the more demanding requirements of a bio-informatician wrestling with gigabytes of sequencing data, and setting up a controller for a robot that could easily crack somebody's skull. Not to mention all the internal development of software tools.

However, in fairness, it also depends on how people are organized. Our management has discovered outsourcing, specialization, and centralization: In practice that leads to a bureaucratic merry-go-round in which a simple problem passes through a dozen different mailboxes before somebody does something about it. I've seen trouble tickets that bounced back and forth for two weeks before they landed on the desk of some IT manager, who honestly had to admit that he too couldn't figure out who was responsible for actually solving the problem.

Comment Individuality Matters (Score 1) 837

My experience is from the user side, and I would strongly discourage uniforms for the IT helpdesk, in fact for any helpdesk.

The reality of life is that individuality matters, especially in a person from whom one seeks help: people need to establish personal contact. And when it comes to essential tools such as computers, which can make our workday hell if they don't work properly, trust matters a lot. Putting your helpdesk people in uniform is the first step toward turning them into uniformized drones. It will have a bad effect on employee morale, not just that of the people who wear the uniforms, but also that of the people who call on them.

My advice is to fight it, tooth and nail. Instead, promote rapprochement between the helpdesk people and the business by making sure that when people call, they get the same helpdesk person whenever possible. Don't over-emphasize central helpdesk lines, and by all means avoid trouble-ticket systems -- I've never seen one that didn't succeed in infuriating the end users.
