I commonly work in a clean room too.
I take my phone out of my pocket, wipe it down, and throw it in my tool box before gowning up.
You've never done scientific work for the government.
These are not "meaningless" expenses, and this scale of project is not unusual, there is a real problem here. All of us who do this kind of work, from JSF contractors to small university professors, have to follow the same rules and be audited for the same things. It's understood that things like food and lobbying (!!) are not allowable expenses.
This doesn't necessarily show a lack of ethics, because a normal private contract may allow these things. What it shows is a complete disconnect from the culture of the scientific community. If the people running this are not scientists, and are not used to working on R&D projects, then why are they doing this and why do we think they'll produce useful information?
Moreover, why does everyone else in the multi-billion dollar government R&D market have to follow the rules (or be cut) and it's ok for them to mismanage funds?
I know Windows Phone doesn't have a large market share, but did no one involved with this check whether it's actually a new feature? I've had it on my phone for a long time; it's nothing special at this point. It's on by default under 20% charge. It is a real thing and definitely slows battery drain, and it's definitely better than manually adjusting settings to squeeze out an extra hour of battery life.
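The behavior described is just a threshold rule: a sketch of what that policy looks like, with the 20% default from above (the function names and the background-sync example are illustrative, not any actual phone API):

```python
# Minimal sketch of a battery-saver policy as described: the mode
# engages automatically once charge drops below a default threshold
# (20% here), and background work is throttled while it's on.

DEFAULT_THRESHOLD = 0.20  # on by default under 20% charge

def battery_saver_active(charge: float, threshold: float = DEFAULT_THRESHOLD) -> bool:
    """Return True when saver mode should be engaged."""
    return charge < threshold

def allow_background_sync(charge: float) -> bool:
    # Illustrative consumer of the policy: sync runs only with saver off.
    return not battery_saver_active(charge)
```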
Ok, so retina scans and face recognition don't work well in a clean room because your people should be wearing goggles and a face mask. Also, this is about training, not technology.
I'm assuming you're going beyond the standard card access machines that are already in most clean rooms and are instead trying to track "little" things like wash steps, microscopy review, hot plate use, etc.
Electronic lab notebooks (this used to be a server-workstation kind of thing, but it's tablets now) are great for this. This doesn't need to be very expensive or have custom software. Plus you add the convenience of carrying a clock & timer around with you. If you want to get really fancy, you can have the tablet talk with your computers (I've never seen that done in a lab or clean room, but it's probably out there).
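To make the idea concrete, here is a minimal sketch of the kind of timestamped step logging a tablet notebook buys you; the class and field names are made up for illustration, not any particular ELN product:

```python
# Hypothetical sketch of tablet-side step logging for a clean room
# notebook: each entry records who did what, when, and for how long.
import time
from dataclasses import dataclass, field

@dataclass
class StepLog:
    entries: list = field(default_factory=list)

    def record(self, operator: str, step: str, duration_s: float,
               timestamp=None) -> None:
        """Append one step; timestamp defaults to the current time."""
        self.entries.append({
            "operator": operator,
            "step": step,            # e.g. "wash", "microscopy", "hot plate"
            "duration_s": duration_s,
            "timestamp": timestamp if timestamp is not None else time.time(),
        })

    def by_step(self, step: str) -> list:
        return [e for e in self.entries if e["step"] == step]

log = StepLog()
log.record("alice", "wash", 120.0)
log.record("alice", "hot plate", 300.0)
```

The point is that the record structure is trivial; the hard part is the training to actually log every step.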
You should be able to get all the info you need right now with your regular clean room notebooks and some transcription. If that's not happening, you're simply not keeping records well enough. That's a training problem. The level of record keeping required for good clean room work is very high. Trying to find a technology solution to remove good note taking practice can encourage sloppy work unless all of your tooling is set up for complete automation (in which case, you wouldn't be asking this question...).
Your point in general is good. We really shouldn't be asking anyone to work extra for free. Unfortunately, it's that way in many fields.
It's very difficult to get any job in a competitive or important industry that doesn't require night and weekend work in addition to normal working hours.
Like several other people commenting here, I tried to get out of this situation by starting my own company... where I work nights, weekends and workdays for free. The economy is a tough place right now for anyone not in financial services. I think that's just the bottom line.
Intel is indeed great, technically better than anything else out there and will probably continue to be so. There are several other large companies from telecom to biotech who also have in-house fabs in the USA and they will do great things. But IBM was the last significant stateside fab house that would work on external government contracts and work for small outside users.
The best we have now for small business electronics development or advanced academic work are training clean rooms like the various CNSEs out there, and that's a scary thought.
In four years of work, they've managed to break the "bigger is better" scaling law common to most fusion reactor designs as well as solve the wall material problems common to ALL fusion reactor designs?
Well, that would be something. If only this article told us anything actually useful.
The materials physics of creating a visible light LED was mirrored by what was going on in solid state transistor development. It was a great feat, but followed the work being done in electronics.
Before actual demonstration of a stable blue LED, theorists in the materials physics community thought it was impossible. The process to engineer the bandgaps for blue/UV LEDs was new and unique. It was an example of the optics guys being ahead of the electronics guys in bandgap engineering.
All that said, inclusion of Holonyak could be justified. His work was good. But... James R. Biard (who is also still alive) has a much better claim to the general LED discovery (including the first patent) and would be a much, much better inclusion. For IEEE to do an extensive article on Holonyak but leave out Biard shows that this complaint is a farce.
This award is not about how great LEDs are in general, it's about the quality of physics the blue LED folks did. Appreciate that the award went to guys who did truly great experimental physics.
As a materials physicist, I am very happy with this prize. This is a very important recent discovery to my area of physics. Nobels as "lifetime achievement" awards are disappointing. It's much better to see an award go to someone who can leverage that prestige into new projects.
Good points there. Channeling people into high school education is something I hadn't considered, but would be helpful.
I tend to be more positive about industry than most scientists. I am biased, but I don't mean we should all work for bean counting businessmen. That's just horrible. I mean that those companies that do help lead science and tech development could have a bigger role in the training process (think Intel, SpaceX or JCVI... ok, maybe biotech has an industrial culture problem).
Hubble is a great example. It was built by a coalition of government labs, Lockheed, and Perkin-Elmer as the leading contractors. Universities were in charge of some small systems, got to help set the specifications, review the design and use the tool. That's what I meant by an industry led project (granted Perkin-Elmer really screwed up on Hubble, so there is that).
Ultimately, you're right, more funding and fewer PhDs are necessary. It doesn't all have to be grants. We used to require all defense contractors spend 15% of their budget on basic R&D. That went away with the Cold War, and it was a mistake to get rid of it.
I am a scientist and I have been a postdoc (and government grant manager and industrial scientist). This is not new, but is more new to biology than it is to other fields.
This problem is real. Our best researchers can't find a job and are "sitting on the sidelines." The investment in those folks by the government (i.e. your taxes) is going down the drain the longer they're unable to do meaningful work.
My feeling is that the underlying problem is the insulation of academics from the commercial world. Most science professors don't know what is involved in commercial work, don't know the relevant skills for commercial work, and don't have a network for landing jobs for students in industry. There are far too many professors who don't know how to train their students for anything other than academic work, and some who are adamantly against training their students for jobs outside of academia.
The result is that industry jobs that many PhDs expect to get go instead to people who left school with a BS or MS and received more relevant on-the-job training in industry. The truth is that there are very few jobs where the experience of a modern PhD is more meaningful than 6 years of industrial bench work. The government and academia still hire preferentially by degree, but those folks can't hire enough people to put a dent in the supply.
To fix this problem we need radical changes to the way we pursue science. Some possibilities for the future:
1) Getting a PhD is "for fun." This is the current reality. If we all accept and understand that PhDs have no competitive advantage over MS holders in the marketplace, there is no problem. If we do nothing, this will continue and will eventually make the PhD system obsolete.
2) Control of research direction shifts toward industry (i.e. professors become subcontractors on grants to the likes of Merck and IBM). I doubt many academics would like this, and there would absolutely be problems, but it would generate students with broader skillsets and networks.
3) Control of research shifts back toward government labs. This used to be the way things were. Government labs sat between industry and academia and facilitated movement of people, ideas and funding. Entire funding agencies that supported these labs are gone. Grant managers and review committees used to mostly be active scientists at government labs, that's no longer the case. This would be expensive to get back to and would really be unfair to the foreign scientists making up the majority of our young scientific workforce.
4) Set everyone on the GS scale. Right now you can get a recent grad in his third year of work funded at $60k/year on a grant to a commercial grantee, but it's almost impossible to get more than $25k for that same work done by a "graduate researcher" in academia. (Even if professors want to do right by their employees, they often can't.) So, don't allow any more $20k/year graduate students on grants. Everyone gets paid based on a combination of local cost of living and experience (years & degrees). That's the GS scale (ok, it kind of is). Removing the discount for students would remove free grad school for scientists, but would immediately fix the problem that the best bench scientists can't find jobs.
Whatever happens, the solution is not going to come from inside science. Scientific leaders range from completely disgusted with the human trafficking which is the modern research economy to openly hostile to the idea that this problem needs to be solved. Most people just don't know what to think. There will be no consensus amongst us in science on what, if anything, needs to be done.
Figure out what you really want to do with this. Do you want to understand everything very broadly? Do you want to become a specialist in a particular niche?
If the answer is broad understanding, look up dogvomit's post and take the traditional coursework in the traditional order at your own pace. There are sets of problems, honed over the last 100 years, designed to train people to think like physicists. Then you can go read Einstein and dense particle physics books; that's a lot of fun but probably won't go anywhere. Very, very few physicists contribute generally anymore.
If you want to be a specialist, find a very well defined project you could really dig in to and enjoy. Something like one particular measurement you think you could do on your own. Pick something recent that you like. It's all out there on Google Scholar. Fill in the general physics you need as you go, but you'll probably need more engineering, software (and money) than anything else. Above all, please do get in touch with the people who inspired your work!
If you're able to successfully repeat a set of observations, or just do something that looks at all like what some grad student did 5 years ago, you will make their year by sharing it with them. If you can do that even once, you will be well on your way to contributing meaningfully to their field.
To put the time commitment in perspective, this is the kind of thing a new "generalist" physicist will do for 2-3 years full time while learning their specialty. Unless you really like this and find more than 10 hours a week to do it, it could easily take you 10+ years. That's ok. I'd be thrilled to find out someone outside traditional physics replicated my results from 10 years ago.
I think the majority of the scientific publishing culture and industry is bad for science. That said, this is not a fair criticism. It's entirely reasonable to tell someone you expect to see more data in order to publish and to start a conversation among the editor, reviewers and PI as to what is necessary to prove a point. Research is not a perfect process and does not progress in an orderly, predictable manner. There are going to be typos and blind spots in any paper.
In this case, obviously Nature should not have published in the end. We can't know how that decision was reached unless we see all the correspondence between the editor, reviewers and PI. It would be much more useful to the scientific community to see how the PI managed to convince the reviewers to allow publication, rather than to debate what is really a standard rejection response.
Tissue culture is definitely a thing.
Does that mean it's the only way to gather that information? Would it be possible to develop alternate tools?
Look. Whether you like it or not, we are developing the measurement techniques and hardware that are going to make this antiquated approach to biology obsolete. Feel free to yell and scream while the field passes you by.
It's not fair at all to link opposition to gain-of-function research to an "anti-science" mindset. You should be ashamed that you're resorting to that argument.
This is something which is seriously debated in the pages of serious journals, at scientific conferences and by government program managers. To link valid concerns to an "anti-science" crowd is political bullshit maneuvering.
There is a very real and valid cost/benefit analysis to be done on pursuing this work. As biology catches up to the physical sciences in scope and function, you're going to deal with the same issues we have dealt with (I am a physicist). One of those lessons is that scientists don't get to decide the purpose of our work. It doesn't matter what you write in your paper, or what the program manager tells you the purpose of the work is. It doesn't matter WHY someone does the work, all that matters is WHAT the work is. It's extremely naïve to think an abstract in a research paper can properly define the purpose of a piece of research.
There are experiments and research paths we do not follow because the intellectual benefit does not outweigh the very real possibilities for misuse. You asked how people are expected to validate these hypotheses without the work? Take a page from physical science and learn to use computer modeling and limited experimental work in lieu of full studies. Do some tool development. Don't just throw up your hands and insist this is the only way. It's not.
This will require a cultural change, and there will be lots of hand-wringing over whether new results are valid, but biology will be a more mature field for it.
I've worked for the government in a scientific job, with a lot of IT folks. It was probably the most relaxed atmosphere I've worked in. No expectation to dress better than business casual. No expectation to work overtime. No expectation to really get anything done.
It's that last one that's really the killer. If you're not focused on getting projects done, first and foremost, then you're not going to attract good people.
A good engineer isn't necessary when the jobs at a government office survive only by making the right political and budgetary statements at precisely the right times. With very few exceptions, technical success or failure just doesn't have much influence on your career in the government.
Lastly, 140 engineers will make no difference. The federal government is huge. The office I worked in was a backwater, nearly forgotten location. We had a staff of 5000 people, about half of them engineers and scientists. There are thousands of engineers in the government right now who would love to work on meaningful projects. It's not a lack of talent or manpower that keeps those projects from happening.