Title aside, the ability to code is a workplace requirement, and if you are looking at traveling or working internationally, you aren't going to get very far without a degree.
Some of the "college drop out" success stories are no longer just coders. They are now C-level executives, and different rules apply to them. If you don't have a degree, then in general you won't be eligible for visas to work in other countries.
Regardless of how good you are, without a degree you are restricted to your local geography (country, etc).
I would agree with this too. For that telescope-ish feel, you can get tripod mount attachments for most binoculars. That lets the adult point the binoculars and swap to the kid without too much trouble.
Define actions (instant, daily, weekly alerts) for ranges of CVSS scores http://nvd.nist.gov/cvss.cfm?c...
Track incoming CVEs (http://nvd.nist.gov/download.cfm) and assign CVSS scores specific to your organization. Also have an organization-specific remediation approach.
As you find out who is using what software, use the CVE CPE (http://web.nvd.nist.gov/view/vuln/detail?vulnId=CVE-2014-2168) information to target specific users.
From the blast emails, you could potentially harvest who thinks they may be affected to gather CPE information.
It's going to be a thankless, painful job, so you may as well automate as much as possible.
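The scores-to-actions mapping above is the easy part to automate. A minimal sketch, where the thresholds and tier names are my own assumptions rather than anything NVD defines:

```python
def alert_tier(cvss_score: float) -> str:
    """Map a CVSS base score (0.0-10.0) to an alert action.

    The cut-offs here are hypothetical examples; pick ranges that
    match your own organization's risk appetite.
    """
    if not 0.0 <= cvss_score <= 10.0:
        raise ValueError("CVSS base scores range from 0.0 to 10.0")
    if cvss_score >= 9.0:
        return "instant"   # page someone now
    if cvss_score >= 7.0:
        return "daily"     # roll into the daily digest
    return "weekly"        # weekly summary is enough

print(alert_tier(9.8))  # instant
print(alert_tier(7.5))  # daily
print(alert_tier(3.1))  # weekly
```

Feed each incoming CVE through something like this as it arrives and the routing takes care of itself.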
There will probably be a market for this among tech enthusiasts, but it is highly unlikely to go mainstream. The mainstream benchmark (the iPhone 5s) is 7.6mm thick. According to http://motorolaara.com/2013/10... Ara is probably about 9.3mm - effectively as chunky as a two-year-old device.
What may evolve from this is specialist hardware and specialist configurations.
Some interesting spin-off technologies might be high speed bus interconnects (Thunderbolt 2) and modular and novel hardware configs (3D scanning - Project Tango; e-ink backside - YotaPhone). Ultimately, enabling technology advances is what Google spends its money on these days...
The report doesn't really go into an important measure.
What is the defect density of the new code that is being added to these projects?
Large projects and old projects in particular will demonstrate good scores in polishing - cleaning out old defects that are present. The new code that is being injected into the project is really where we should be looking... Coverity has the capability to do this, but it doesn't seem to be reported.
Next year it would be very interesting to see "new code defect density" as a separate metric - currently it is "all code defect density", which may not reflect whether Open Source is *producing* better code. The report shows that the collection of *existing* code is getting better each year.
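To make the distinction concrete, here is a toy calculation with made-up numbers showing how a blended "all code" density can hide a regression in the newly added code:

```python
def defect_density(defects: int, kloc: float) -> float:
    """Defects per thousand lines of code (KLOC)."""
    return defects / kloc

# Hypothetical project: the figures below are invented for illustration.
# Existing code base: 1000 KLOC with 400 defects left after years of polishing.
existing = defect_density(400, 1000.0)   # 0.40 defects/KLOC
# Code added this year: 50 KLOC carrying 45 fresh defects.
new_code = defect_density(45, 50.0)      # 0.90 defects/KLOC
# The blended number the report actually gives you:
overall = defect_density(400 + 45, 1000.0 + 50.0)
print(round(overall, 3))                 # ~0.424 - looks fine, hides the 0.90
```

The overall density barely moves even though new code is being written at more than twice the defect rate of the old code.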
That is different. My read of GINA is that your health insurance provider is not allowed to use genetic screening to make coverage RISK decisions. As in, they can't force or require you to screen for cancer and then decide that you aren't coverable because of BRCA. Apparently life insurance is not covered by GINA, so that is another issue.
Also note that GINA is an American law. Not global.
The comment I made was about tuning treatment based on genetic information - which is very different. Rather than a cocktail of drugs to suppress and support different side effects and responses, you can use more targeted doses to resolve your direct issue. Warfarin is a good example: too much doesn't help, too little doesn't help. Your genes help identify what your correct dose is.
As Genome Wide Association Studies begin to crack more of the genomic puzzle, there will be tighter and tighter direct correlation between medicine types & doses and the effectiveness of those drugs. As this efficacy increases, it is highly likely that the best insurance coverage will be based on genomic information.
Determining precise doses of a drug and which drug should be used is going to make for much better quality of medicine. I would expect that in a couple of decades people are going to look at the drug practices of today and laugh that we are pretty much throwing darts at the drug dartboard and choosing whatever it lands on.
Opting out of specific tests will be like not wanting X-Rays to see if a bone is broken.
Well, for commodity items I get your point. However, my personal experience is owning a house that has really unusual shelf pegs. Unusual in that they are simply not available. I ended up modelling them and using Shapeways to print them. What I made is up at https://www.shapeways.com/shop....
The cost was about $2 per peg, which is about the same as low-run retail products at Home Depot.
3D printers will make it affordable for extremely low run prints. For spare parts and out-of-production items it removes a lot of obsolescence.
A common definition of science is "knowledge, as of facts or principles; knowledge gained by systematic study."
Science is never stable. There is always layer upon layer of detail that is waiting to be discovered. The "Standing on the Shoulders of Giants" is the underlying concept. Our level of scientific understanding is driven by our current understanding and our needs to go deeper. The knowledge can change and grow based on deeper systematic study.
In the middle ages, when transportation was limited to horse, cart and walking, the naivety of a geocentric universe was sufficient for the time. And for the most part the motion of planets was fairly accurately explained by epicycles. The "science" of the age was sufficient. As travel and migration required more detailed knowledge, the science improved to explain what was seen. New models were formed, and tides, winds and so on became more accurately understood and combined into a deeper understanding.
The beauty of science is that as the foundations of one area are broken down and rebuilt, what replaces them must not only encompass what was there, but also link deeper into the other areas that caused the original science to fail. It doesn't make the previous science and knowledge bad, just incorrect. One can't claim that a model that explained a known phenomenon at that point in history was bad science.
In 40 years' time*, we'll look back at the misguided fools at the start of the 21st century and our futile and plain incorrect approaches to fusion. We may not be there, but we'll probably be dealing with all sorts of funky and interesting materials on the way to getting there.
Those of us who will have children should know that their science *will* be different in a lot of areas than our science. That is a good thing.
* Bonus points for replies that say why I chose the "40 years time".
Alternate line exposure is not new, it is in a lot of current generation sensors. Omnivision, Sony and Toshiba all have sensors out with this capability.
The underlying issue is that when doing alternate line exposure you are getting only half the resolution for each range. DSP and image processing techniques can help smooth out the issues, but you are fundamentally dealing with a half-height dark and a half-height light image. Depending on the alternate-line approach, you also get other funky color fringing issues due to the underlying Bayer pattern, as the article notes.
A good generalized approach is to output a 1/2 resolution image in both dimensions, otherwise you will get a vertical stretch if you keep the horizontal width at full resolution. So it means for a 16 MP camera, you will get only 4 MP HDR images. In a lot of cases this will be more than good enough... But it makes it really difficult to sell and explain to users.
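The half-in-each-dimension arithmetic can be sketched in a few lines. This is a toy grayscale merge, not any vendor's pipeline; the 4x exposure ratio and the simple averaging are assumptions for illustration:

```python
import numpy as np

def merge_alternate_lines(frame: np.ndarray, ratio: float = 4.0) -> np.ndarray:
    """Merge an interleaved-exposure frame into a half-resolution HDR image.

    Assumes even rows hold the short (dark) exposure and odd rows the
    long (bright) exposure, with the given exposure ratio between them.
    """
    short = frame[0::2, :].astype(np.float64) * ratio  # scale dark rows up
    bright = frame[1::2, :].astype(np.float64)
    hdr = (short + bright) / 2.0       # two half-height fields -> one image
    # Halve the width as well so pixels stay square: 16 MP in, 4 MP out.
    return (hdr[:, 0::2] + hdr[:, 1::2]) / 2.0

frame = np.arange(16, dtype=np.uint16).reshape(4, 4)
print(merge_alternate_lines(frame).shape)  # (2, 2): a quarter of the pixels
```

A real pipeline would do weighted blending, demosaicing and clipping recovery instead of a plain average, but the resolution cost falls out of the row/column slicing either way.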
There is usually a good reason that advanced features aren't released/published. A lot of the time it comes down to features being sub-optimal on what is supposed to be a highly polished product.
Both are leaders.
Managers are Organizational Leaders.
"Senior"/"Staff" Engineers/Architects are Technical Leaders.
Different focus, but similar soft leadership skills. They are peers, and should have a similar workload and a similar amount of hassle...
Firstly, I bow to your low 5 digit user number. You are an old hand...
I won't bite at all the points that are worth biting.
The mentoring part is the leadership part of management. When I have an engineer in my team go "wow, I've never done it that way" or "that was inspiring" it is all worth it. For reference, the two quotes were for "techniques for estimation" and "requirements analysis".
The manager's role is to get the team as efficient and effective as possible. This means taking experience (from the ranks) and finding ways to apply it to make their lives easier and more effective.