
Comment Re:Watson is a scientist (Score 1) 235

But there is a strong dogma that genetics is not a factor in the observed disparity in measurable intelligence between sub-Saharan Black Africans and Ashkenazi Jews. This dogma doesn't have any scientific basis that I'm aware of;

The first step in addressing the question scientifically is determining whether the question even makes sense. You have to *establish that the question is valid* before answering it.

A hundred years ago scientists didn't know about DNA and couldn't characterize anyone's genes. They went with what they could observe: skin color, hair, eye shape, etc. And they came up with various compelling three-race and five-race schemes. But we aren't limited the way they were. We can open up someone's genetic black box and characterize his heritage precisely. And when we did that, all those compelling, intuitively obvious schemes fell apart.

The problem with the question you pose is that it makes no sense to lump all Sub-Saharan Africans into one "race". Most of the genetic diversity of the human race is in Sub-Saharan Africa. There are ethnic groups in Africa that have more genetic diversity than all human populations originating outside Africa *combined*. So we can't answer the question you pose, because its very assumptions contradict the facts.

If we were to divide humanity into five "great races", they'd probably end up being five *African* races, with the rest of the world tacked on in various ways. What's more, it would turn out that there were *other*, equally justifiable ways to construct five African races.

Race is like constellations. Humans *will* see patterns in complex, random data. Just because Orion *looks* like an object doesn't mean that the stars in Orion are linked by some process. But boy is that pattern ever compelling.

Comment Re:will be seen as a dig against science (air quot (Score 1) 100

What you need to do is look at the impact factor of each journal -- a measure of how often articles in that journal are cited (the standard two-year IF is the number of citations received this year by articles the journal published in the previous two years, divided by the number of articles it published in those years). What constitutes a "good" IF varies from field to field, so you want to compare the IF of a journal to the leading journals in that field. In this case the IFs for the journals in question are 1.47 (0.3 for a five-year period) and 1.15. By comparison, the ACM Transactions on Intelligent Systems and Technology has an impact factor of 9.39.

There have always been low quality journals, but recently I'm seeing an uptick in pseudo-science advocates like anti-vaxxers and climate change denialists citing "published research" that makes absurdly broad claims. It's important to look up the IF for the journals referenced; they're often predatory journals that function like a "vanity press" for unpublishable papers.

The site http://scholarlyoa.com/ is also very useful. It maintains lists of both predatory journals and predatory publishers in the business of giving a platform to junk scholarship. The Journal of Computational Intelligence and Electronic Systems is not on the list of standalone predatory journals, nor is the publisher American Scientific Publishers on the list of predatory publishers -- yet. Aperito *is* on the list of suspect publishers.

Unfortunately IF isn't infallible. You can't automatically dismiss a paper because it's published in a low IF journal. You have to look at the whole pattern. A new paper making unusual claims is a lot more credible if it's published in a high IF journal like Nature. If it's published in Fred's Research Journal, you have to wait and see whether the paper gets cited by reputable scholars or by papers in more mainline journals.

Comment Re:C++ is C (Score 1) 641

Private and unimplemented is the key, on old compilers.

On new compilers use:

class MyClass {
public:
    MyClass(const MyClass&) = delete;            // copying is a compile error
    MyClass& operator=(const MyClass&) = delete; // so is copy assignment
    // ...
};

Doesn't matter what access specifier you use, because any attempt, from any code, to copy a MyClass will be diagnosed as an error by the compiler.

Comment San Francisco already did this (Score 5, Interesting) 178

San Francisco already did this. Almost all the masonry buildings in SF have been reinforced since the 1989 quake, and now the rules are being tightened on wood buildings. If you've been in an older building in SF, you've probably seen huge diagonal steel braces. That's what it looks like.

All new big buildings meet very tough earthquake standards. The bridges and freeways have been beefed up in recent years. Overpass pillars are about three times as big as they used to be. Two elevated freeways were torn down after one in Oakland failed in the 1989 quake. The entire eastern span of the Bay Bridge was replaced with a new suspension bridge. The western span was strengthened, and there are now sliding joints, huge plates of stainless steel, between the roadway and the towers.

Comment Re: Here come the certificate flaw deniers....... (Score 1) 80

It's really quite different from a password. And you don't know the meaning of "security through obscurity", either. All cryptographic systems rely upon secrets; a well-designed one relies upon a single secret that is impractical to guess. Security through obscurity means relying upon a number of weak secrets, like the details of the methods you're using.

You need to read up on public key cryptography. There's a huge difference between a certificate and a shared password.
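To make the difference concrete, here's a toy RSA-style sketch with textbook-tiny numbers -- nothing like real key sizes, and only to show where the single secret lives. The verifier checks a signature using purely public values:

#include <cstdint>
#include <iostream>

// Square-and-multiply modular exponentiation: (base^exp) mod m.
uint64_t modpow(uint64_t base, uint64_t exp, uint64_t m) {
    uint64_t result = 1;
    base %= m;
    while (exp > 0) {
        if (exp & 1) result = (result * base) % m;
        base = (base * base) % m;
        exp >>= 1;
    }
    return result;
}

int main() {
    // Textbook-tiny RSA parameters (p = 61, q = 53). Illustration only.
    const uint64_t n = 3233; // public modulus
    const uint64_t e = 17;   // public exponent
    const uint64_t d = 2753; // private exponent: the ONE secret

    uint64_t msg = 42;
    uint64_t sig = modpow(msg, d, n);    // signing needs the secret
    bool ok = modpow(sig, e, n) == msg;  // verifying needs only n and e
    std::cout << (ok ? "signature verifies\n" : "signature fails\n");
    return 0;
}

The private exponent d is the only secret; publishing n, e, and the algorithm itself costs you nothing, which is exactly what "no security through obscurity" means.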

Comment Re:The simple truth of the matter (Score 1) 329

Methane photooxidizes with a half life of ~21 days in sunlight at average insolation. It gets released and then breaks down harmlessly and naturally.

... into carbon dioxide and water vapor, eventually, through a chain of several intermediate products. So the presence of methane is *at least* as important as CO2, because that's what it decays into. Nothing in nature just "goes away"; you have to ask what it becomes.
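For the record, the chain runs through formaldehyde and carbon monoxide, among other intermediates; the net result is:

CH4 + 2 O2 -> CO2 + 2 H2O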

The half-life for tropospheric methane is about 8.5 years, by the way (Bergamaschi, P., & Bousquet, P. (2008). Estimating sources and sinks of methane: An atmospheric view. In The Continental-Scale Greenhouse Gas Balance of Europe (pp. 113-133). Springer New York). Photooxidation of CH4 is not a major sink of tropospheric CH4. It does occur in the troposphere in the presence of high concentrations of NO, and may be a contributor to the ozone in urban smog.

Comment The corporate AI (Score 4, Insightful) 417

What I'm worried about is when AIs start doing better at corporate management than humans. If AIs do better at running companies than humans, they'll have to be put in charge for companies to remain competitive. That's maximizing shareholder value, which is what capitalism is all about.

Once AIs get good enough to manage at all, they should be good at it. Computers can handle more detail than humans. They communicate better and faster than humans. Meetings will take seconds, not hours. AI-run businesses will react faster.

Then AI-run businesses will start dealing with other AI-run businesses. Human-run businesses will be too slow at replying to keep up. The pressure to put an AI in charge will increase.

We'll probably see this first in the financial sector. Many funds are already run mostly by computers. There's even a fund that formally has a program on its board of directors.

The concept of the corporation having no social responsibility gives us enough trouble. Wait until the AIs are in charge.

Comment Re: Very much so! (Score 1) 641

the reality is if you are given code to work with and you see i = j * 5, if it's C you know what it does; if it's C++, you don't, regardless of who wrote it.

Nonsense. The only way you don't know what that means in C++ is if the code was written by a complete, drooling idiot.

Seriously, this is the first thing C programmers pull out to criticize C++... but in 23 years of professional C++ programming, I have never seen bizarre arithmetic operator overloading used in practice, except for iostreams, and you get used to that pretty quickly, given that it's been part of the standard library since very early on.

I have a few times seen libraries of mathematical operations, say on vectors or tensors, that made heavy use of arithmetic operator overloading, but it was so they could say "i = j * 5" where i and j are n-dimensional matrices. In those cases, operator overloading not only makes sense, it's a dramatic improvement over C.
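For instance, here's a minimal sketch of that kind of type -- a made-up Vec class, not any particular library's API:

#include <array>
#include <cstddef>
#include <iostream>

// Toy fixed-size numeric vector. Overloading * here mirrors the math
// notation instead of obscuring it.
template <std::size_t N>
struct Vec {
    std::array<double, N> v{};

    // Scalar multiplication, so callers can write "i = j * 5".
    friend Vec operator*(const Vec& a, double s) {
        Vec r;
        for (std::size_t k = 0; k < N; ++k) r.v[k] = a.v[k] * s;
        return r;
    }
};

int main() {
    Vec<3> j{{1.0, 2.0, 3.0}};
    Vec<3> i = j * 5; // scales every component: {5, 15, 25}
    std::cout << i.v[0] << " " << i.v[1] << " " << i.v[2] << "\n";
    return 0;
}

Here "i = j * 5" means exactly what the C programmer expects, just lifted to vectors.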

If you want to criticize C++, there are lots of valid criticisms, mostly around the huge variety of features and the complexity of their interactions, which can get really subtle. And you can criticize many of the insane template metaprogramming constructs (though those can be really useful sometimes, particularly in building up infrastructure that allows the compiler to diagnose all sorts of errors you might make). If you don't like "invisible" stuff, you can criticize the abuses that can be made of constructors and destructors. But operator overloading? Faugh.

Comment Re:I Don't Get It (Score 1) 149

He's implying that developers will specify a complete environment where every DLL available to the application within the environment is exactly what the developer used. There is no DLL hell, because you run what the developer ran, and it doesn't matter if you have seventeen different incompatible versions of (to pick a Windows example everyone's familiar with) mfc42.dll, because things inside the container won't know that you have those DLLs.

In that case, why bother with dynamic linking at all? Why not statically link everything? The effect is essentially the same -- you get exactly what the developer had. You also get no shared code pages -- even if you're using exactly the same library as someone else -- and bloated memory and disk usage since you have your own private copy of everything. Disk may be "cheap," but it's still surprisingly easy to fill up a 16GB eMMC device.

Comment Re:C++ is C (Score 1) 641

always implementing the big three (default constructor, copy constructor and = operator.)

Or, in the case of the last two, intentionally declaring them private and NOT implementing them (or if you have a C++11 compiler, explicitly deleting them), so as to make the class non-copyable. Relatively few classes need to be copyable. Declaring the copy ctor and assignment operator private ensures that client code can't accidentally copy your non-copyable objects. Not implementing them ensures that if class code accidentally copies an instance, you get a link error.
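For illustration, the old idiom looks like this (hypothetical class name):

class Widget {
private:
    // Declared but never defined. Client code that tries to copy gets a
    // compile-time access error; a copy made accidentally inside the class
    // itself slips past the compiler but fails at link time.
    Widget(const Widget&);
    Widget& operator=(const Widget&);
public:
    Widget() {}
};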

Comment I Don't Get It (Score 5, Insightful) 149

Am I getting hopelessly old and unable to appreciate new things, or is this not anywhere near as whoop-de-doo as they're making out?

"You can update transactionally!!" Great. What does that mean? Is it like git add newapp; git commit -a? If so, how do I back out a program I installed three installations ago?

Transactional updates have lots of useful properties: if they are done well, you can know EXACTLY what's running on a particular system, [ ... ]

dpkg -l

You can roll updates back, [ ... ]

dpkg -i <previous_version>

...lets you choose exactly the capabilities you want for yourself, rather than having someone else force you to use a particular tool.

#include <cheap_shots/systemd.h>

Because there is a single repository of frameworks and packages, and each of them has a digital fingerprint that cannot be faked, two people on opposite ends of the world can compare their systems and know that they are running exactly the same versions of the system and apps.

debsums

Developers of snappy apps get much more freedom to bundle the exact versions of libraries that they want to use with their apps.

...Did this guy just say he brought DLL Hell to Linux? Help me to understand how he didn't just say that.

I bet the average system on the cloud ends up with about three packages installed, total! Try this sort of output:

$ snappy info
release: ubuntu-core/devel
frameworks: docker, panamax
apps: owncloud

That's much easier to manage and reason about at scale.

No, it isn't!! What the hell is OwnCloud pulling in? What's it using as an HTTP server? As an SSL/TLS stack? Is it the one with the Heartbleed bug, the POODLE bug, or some new bug kluged in by the app vendor to add some pet feature that was rejected from upstream because it was plainly stupid?

Honestly, I'm really not getting this. It just sounds like they created a pile of tools that lets "cloud" administrators be supremely lazy. What am I missing here?

Comment Re:I don't know if 'profiteer' is the right term (Score 2) 33

Just because *some* or even *most* profit is reasonable, doesn't mean all profit is reasonable.

The term "profiteer" is used for people who put profit above a higher ethical claim; for example a citizen selling arms to an enemy during wartime. It's not that profit per se is unreasonable, but that the citizen has a higher duty of loyalty to his country than to his profits. Likewise people who profit by helping governments undermine civil liberties can reasonably be called "profiteers".

The issue isn't *that* we dislike them. It's *why* we dislike them that makes them profiteers.

Comment Re:Catholic Health (Score 1) 398

Keep in mind that the guy was probably told that he was replacing you because you were incompetent and had a bad work attitude. That's what some of them are told: that they're here because of the low technical skills and poor work ethic of American workers. So why should he listen to your opinion?

Another thing to consider is that your replacement may come from a culture which is not as egalitarian as yours, and at that moment your status was pretty low. In America a junior programmer fresh out of school can tell a senior engineer with twenty years of experience that he's full of shit, and that's something we admire. But in other cultures accepting this kind of behavior is seen as weakness. I've dealt with this firsthand. I once had to take over a troubled programming team full of H1Bs (not my choice; we inherited the team) that had been shipping really bad code. It turned out to be full of terrific talent; it was only the lead programmer who was incompetent.
