I can apply buzzwords and promote synergies by empowering individuals to maximize their unique contributions. My team even volunteered overtime during the holiday season, because they were so positive about our project. It wasn't because they were afraid they would be pushed out of their jobs by a CIO who's eager to ship everything he can out of house.
I guess he did okay at the CDC, and hey, if it saves money, great, but who cares? Just do your job already. I'm sure the pay scale isn't that bad, and the benefits are pretty awesome.
When compared to the broad consensus of science, yes. Belief doesn't enter into it; the research is done. Global warming is an established fact, and not just by one paper, but by repeated, peer-reviewed research. Even early skeptics in climate modeling have come to the same conclusions.
I hesitate to call him or others skeptical, as it suggests there is really any room for doubt. There really isn't. The core findings about global warming are established. Covering our ears and shouting "it's not true" won't change a thing.
...the gamma rays would set off a chain of chemical reactions that would destroy the ozone layer in a planet's atmosphere. With that protective gas gone, deadly ultraviolet radiation from a planet's sun would rain down for months or years.
Yeah, because it's impossible that complex life could be protected by a different (better!) kind of UV shield like... water. From my understanding, it's not exactly rare in the universe.
Seriously, a news for nerds site can't get the word electronic correct in an article headline? Amazing editing going on there, Slashdot.
The current state of x86_64 at Intel means that there is no reason to create a 32-bit-only processor; it'd be a huge amount of architectural rework with little benefit.
Now, just because it's 64-bit capable doesn't mean that the OS will be 64-bit. In fact, given the low memory, a 32-bit OS might be the better option. This is all about SoC cost and low margins, which means each bump in memory really adds up. This isn't the same as just putting a denser DIMM in a motherboard.
Also, given the target usage, one would have to argue why you would need more than 2 GB of RAM. Seriously, I get by with that on a Windows 8.1 tablet for basic Office, etc. No reason Sailfish won't do even better.
The Wanam Xposed module lets you set any app to work as a window. Somehow I never actually end up using the multiwindow facility on my phone, though.
Of course, I saw all the expected arguments, and a lot of "but, Microsoft is the exact same company from 20 years ago, so this must be wrong, evil, etc." Well, companies change. Skepticism is good, but evaluating things as they are is good too.
And one of the main drawbacks of the platform, the range of targets it can reach, is starting to be addressed in a real way.
It's a pragmatic decision. Microsoft has already benefited from open source projects (ASP being one example).
I bet that internally at Microsoft, lots of people are happy about this, as they really do think they did great work and this gives them greater visibility.
I think the sources for "some populations on average are smarter than others" are needed. Same with the "superior social ability". I've seen nothing that suggests that either is true and can be attributed to a genetic difference rather than to social or environmental confounders. The consensus is that it's not worth studying. The genomic data shows that any influence on or difference in complex behaviors would be just noise, impossible to measure in the face of strong confounders. The point is that these perceived "differences" between races (a term that many argue has no scientific basis) are in fact incredibly small genetically, despite their outward appearance.
The reason it's not correct to state that different genetic sub-groups might have different intelligence levels is that there is no evidence that there is any significant difference between any population or group overall genetically.
You mention anthropology. Yes, there is an interest in studying how our population grew and spread over the planet. To do this, researchers do sophisticated analyses to detect certain changes and try to model how the population moved.
Here's the problem: you've assumed that these groupings are significant outside of population migration. They aren't. If you take the genome as a whole, these variations are nothing compared to individual variation.
It's not culture that has caused the problem not to be looked into. It has been, significantly, and some people in our culture refuse to accept the results: that our perception of race and racial differences is completely environmental, and that there is no basis whatsoever in science to say that one population is smarter than another.
Again, this is a lot of posturing to try and ignore that as a society, we have systemically placed a certain set of people at a large disadvantage for no reason other than our fears. We only talk about black culture because our history caused us to set apart a population first as property, then as second-class citizens, and then as "different" when convenient to explain why a group is poor or lazy or ambitious or whatever bucket we try to force people into.
Here's the point. Every time genetic differences come up, it's "Are blacks less intelligent?" "Are Asians better in school?" "Are Latinos less motivated?" and so on. All these are dumb questions. But never, never have I seen: "Are whites more prone to discriminate against other groups?" It's still a dumb question, but it doesn't come up, does it?
The summary seems a bit misleading. The main thrust of the page I saw was that the push to replace work with automation can have consequences at a certain level. Does decision making really work well in automation, or does it lead to problems? There's evidence in both camps. As an example, some traders on Wall Street have complained about removing people from the process, arguing that people really do add value at times. And sure, it's hard to imagine a human issuing a massive number of bad orders, but a computer model with a glitch might. But is that enough of a reason to slow things down? Just one example of many.
In my mind, critical thinking does have value, and no, there is nothing in data science, machine learning, etc. that comes even close to what humans can do in that area. There's a big debate in medicine about following best practices and whether just following algorithms would work better. Some note it would reduce unneeded tests and procedures. Others have noted that doctors are actually much better at noticing when something is going really wrong, and that following a script could lead to unnecessary deaths that would have been avoided by relying on clinical judgment. Is there a need for better data? Sure, but can you really automate judgment? And what real value is there in taking the craft out of everything, for humanity as a whole?
The problem is that some people don't think software engineering, programming, coding, whatever requires critical thinking, or that there is a craft or art to programming. And you can increasingly do it that way. Cut and paste, copy from the web, and when things don't work out, post on the web and hope somebody answers.
What is lost is that somebody has to have the skills to figure out what is going wrong, or to see that it can be done better. Where do those answers on the web come from, after all? At some point, somebody has to know how to approach the problem from the fundamentals and solve it, and that's when all those things that we (okay, at least me and my schoolmates) studied in CS come into play.
I'm on a project where they are just throwing idea after idea at a performance problem. Sure, it's tricky, but I realize they have a huge blind spot. They don't know how to attach a low-level debugger to a process, how to monitor OS resources, or even that you can debug something without sources. Sure, it's a Java enterprise application, so that's another layer of hard, but it can be done. Cripes, we had to debug core dumps. I'm glad (thrilled) that I don't have to do it anymore, but the skills that I learned doing it were invaluable.
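For anyone who hasn't worked at that level, here's a rough sketch of the kind of first look I mean, assuming Linux and using the current shell's PID as a stand-in for a real target process:

```shell
#!/bin/sh
# Hypothetical target process; $$ (this shell) stands in for illustration.
pid=$$

# OS-level resource usage, no sources or symbols needed:
grep -E 'Name|VmRSS|Threads' "/proc/$pid/status"

# Count of open file descriptors -- often the first clue in a resource leak:
ls "/proc/$pid/fd" | wc -l

# From here, `gdb -p $pid` attaches a low-level debugger even without
# sources, and on a JVM, `jstack $pid` dumps every thread's stack.
```

None of this replaces a proper profiler, but it's the sort of look you can take without touching the application at all.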
A related aside: the problem is not a lack of better tools; it is not knowing that there are better (or any) tools, or that you can make better tools.
Ballot stuffing is actually pretty easy to protect against, and the method of voting doesn't do anything to change that equation. It's just as easy to telegraph votes on a voting machine versus paper. Also, voter fraud is really risky compared to the payoff. It's easy to get caught. It just takes one election judge to unravel a scheme to defraud. And it's much easier and less risky to disenfranchise voters to effect an outcome; history shows us that.
As to why the counts are off, in most cases, it's confusion over eligibility, not intentional fraud.
Oddly, given that the gridlock is Congressional, one would naturally think to look to Congress for a solution. But this assumes that gridlock is viewed as a problem.
The rule on staying alive as a program manager is to give 'em a number or give 'em a date, but never give 'em both at once.