Comment Re:Who cares (Score 4, Insightful) 296

So when they put out all the paranoid rhetoric that the US is only out to invade and bomb them, are they really being paranoid?

My drill instructor gave me some useful advice about thirty years ago: if someone says they want to kill you, you should take them seriously. Let's keep in mind that since the late 1950s North Korea has been militant, aggressive, threatening, and destabilizing no matter who was in the White House. Various administrations have tried various sticks and various carrots to get them to change, all to no avail. If the Norks are afraid of external animosity they have only themselves to blame.

Comment Re:People are more worried about jobs (Score 1) 423

And why is such a monopoly present? Two main reasons:

1. It is economically unfeasible for anyone to bring you service other than the existing provider who is piggybacking on prior infrastructure. In this case, you chose a poor place to live. It's not the fault of the company or the taxpayer that it would cost $10 million in fiber to service 100 rural customers and they won't do it because there's no reasonable return on investment. If you don't like it, nobody is putting a gun to your head saying you have to stay there. Some people move because they want more land, or less traffic, or better climate. Internet service is no different.

2. Local politicians protect the monopoly in return for campaign contributions. Vote the fuckers out, problem solves itself.

Comment Re:The game is too one-sided (Score 1) 423

And nothing of value would be lost.

According to you and your set of values. It's rather arrogant of you to foist those on others.

Yes, plenty of "free" works can and do exist. None exist in any form at a scale equivalent to larger projects with correspondingly larger value (value also being subjective, but I'm speaking in generic terms). Suffice it to say, there are amazing creative works that contribute to society in ways that could not practically be accomplished in a "free" manner.

If "free" were the rule instead of the exception, there would be no market for paid content in a capitalist open-market society. That such a market exists demonstrates your conclusion is incorrect.

Comment Re:Committing crime != convicted (Score 0) 384

If you assume criminality for a population you will catch more people from that group in suspicious circumstances, since you know that they are more suspicious.

You could get the exact same results if -- gasp! -- the group in question was actually committing more crimes on average. And FBI crime statistics compiled under the Obama (a minority) administration and under the leadership of Eric Holder (a minority) bear this out. But since you find the conclusion distasteful you disregard it and drum up a completely unsubstantiated conclusion that says it's not that minorities commit more crimes, it's widespread institutional racism in the police and justice system.

Never mind that minorities are over-represented in the police and have solid representation in the justice system, and that it's been that way for decades. Never mind that this country and its "institutional racism" elected a biracial President twice with solid majorities. Nope, it's got to be racism. That racism has to be widespread and overt to cause the massive bias you cite, but it also has to be subtle and hidden because our PC society tolerates nothing and no one that even hints at bias. This complete and utter contradiction deters you not in the slightest, does it?

Since the black population in America is poorer than the white population, and poorer people are more likely to end up committing crimes anyway, you are asking the black population to behave not just a bit but lots and lots better than the white population would in the same circumstances. As a reward for this you offer only the same treatment they should already be receiving. Lovely chap, aren't you?

Now who's being racist? You're claiming that blacks -- who tend to be poor -- shouldn't be expected to live up to the same standards of justice as everyone else, and if they do, they should somehow get more benefits from being law-abiding than whites. And this complete and utter contradiction deters you not in the slightest, does it?

SJWs: if you didn't have double standards, you'd have none at all.

Comment Re:I'm with you (Score 1) 384

I've said it before, but this is what folks mean by "institutionalized racism".

No, this is called "objective analysis of data being used to draw a rational conclusion." If someone is impoverished then giving them a loan for a million-dollar house is a bad idea. Period. Race doesn't come into it. Any human or algorithm, when given this same data but leaving out race or gender, will come to the same conclusion. And that's a good thing because it is a rational, informed conclusion based on reality.

What you don't like is that reality has unpleasant implications for your ideology and worldview. Reality laughs at you for this. So do a lot of other people.

Comment Re: Simple solution (Score 1) 384

Oh, but this is the whole point of outrage. The average, perfectly objective statistical sample is racist.

How dare you! You must be racist for pointing this uncomfortable fact out! You should be shouted down, your friends must be pressured to abandon you, your workplace must fire you, and your family should be ostracized and forced to denounce your racist views! Everyone knows there is no such thing as objective reality! Truth is merely a construction of the oppressor! Reality is whatever the collective decides it to be and any observations to the contrary are proof of bias!

Comment Re: Uh, no. (Score 1) 384

The problem is when the bias exists in reality, not in perception or opinions.

In such cases, reality must be ignored and a cherished fantasy substituted in its place so no one gets offended in their safe space.

The correlation between socioeconomic status and risk of defaulting on a loan is clear, and it would be silly to question it.

No, it is racist to question it, which means you're racist for suggesting it. And your friends are racist too because they like you. And your co-workers are racist because you sit near them. And your boss is racist because he hasn't fired you for being racist. And your dog is racist because he lives with you. And your car is racist because it carries you and your racist viewpoints to the secret volcano lair of the KKK which you obviously worship because you're racist.

Don't deny it! That would only prove your latent racism! And don't argue with me because only a racist would do that!

(sarcasm disengaged)

I'm sure the above looks and sounds depressingly familiar, does it not?

Comment Re:Or rather... (Score 1) 384

Then you're training the AI with the wrong (or at least incomplete) set of data. You wouldn't train the AI with what markers the human used before the machine was around. You would feed it the history of all the loans made, along with the data from each loan that was collected, and the information about whether it was paid, kept current, or defaulted on and the time-frames for those outcomes, as well as whether money was lost or not and how much if they went bad. Then let the AI make its own set of determining factors about what loans are more likely to end up in default, and what the risk vs. reward is for giving out such loans on a scale of minor risk to major risk, and decline the more risky ones that have a high chance of going bad.

And they did exactly that. The outcome? It showed (unsurprisingly) that those of lower socioeconomic status are higher credit risks. That those same people are typically minorities is utterly irrelevant to the algorithm because the algorithm didn't have that data. Now along come some humans with an SJW chip on their shoulders screaming "you're making a racist AI because the outcome highlights an unpleasant facet of reality!" Yes, by all means, let's make an AI that attempts to ignore reality and embrace the SJW fantasy so we can have an AI that makes shitty decisions leading to things like the 2008-2009 financial collapse (see below)! Clearly that's the best way to go, right?

*In 1995 Bill Clinton loosened housing rules by rewriting the Community Reinvestment Act, which put added pressure on banks to lend in low-income neighborhoods. This effectively penalized banks if they didn't make high-risk loans to people who were not creditworthy. Banks protested because they weren't allowed to charge higher interest rates to offset the risks (they were told THAT'S RACIST!). The government stepped in and said it would back the loans, removing the risk to the banks. The banks went nuts (who wouldn't? There was no downside for them) making loans. When it all collapsed, the government had to step in to clean it up, with the taxpayers footing the bill.

So yeah, let's not learn any lesson from that at all. I'm sure it'll all work out if we rig an AI to be just as stupid as the humans who preceded it.
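The training setup the parent describes can be sketched in a few lines. Everything here is hypothetical: the feature names, the synthetic loan history, and the plain gradient-descent logistic model all stand in for whatever a real lender would actually use. The point is structural: race and gender simply aren't columns in the data, so the model cannot weight them.

```python
import math

# Hypothetical historical loans: (income_ratio, years_employed, prior_defaults) -> defaulted?
# Synthetic illustration data only, not real lending records.
history = [
    ((0.9, 1, 2), 1), ((0.8, 0, 3), 1), ((1.1, 2, 1), 1), ((0.7, 1, 2), 1),
    ((3.0, 8, 0), 0), ((2.5, 5, 0), 0), ((4.0, 10, 0), 0), ((2.8, 6, 1), 0),
]

def sigmoid(z):
    z = max(-60.0, min(60.0, z))  # clamp to avoid math.exp overflow
    return 1.0 / (1.0 + math.exp(-z))

def train(data, lr=0.1, epochs=2000):
    """Plain stochastic-gradient-descent logistic regression; returns (weights, bias)."""
    w = [0.0] * len(data[0][0])
    b = 0.0
    for _ in range(epochs):
        for x, y in data:
            p = sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)
            err = p - y  # gradient of log-loss w.r.t. the logit
            w = [wi - lr * err * xi for wi, xi in zip(w, x)]
            b -= lr * err
    return w, b

def default_risk(model, applicant):
    """Predicted probability of default for a new application."""
    w, b = model
    return sigmoid(sum(wi * xi for wi, xi in zip(w, applicant)) + b)

model = train(history)
# Low income, short employment, prior defaults: resembles the defaulters.
risky = default_risk(model, (0.8, 1, 2))
# High income, long employment, clean record: resembles the repayers.
safe = default_risk(model, (3.2, 9, 0))
```

The model's "own set of determining factors" are just the learned weights; declining applications above some risk threshold is then a one-line policy decision on top of `default_risk`.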

Comment Re:Or rather... (Score 1) 384

If the data included any location information it could very well exhibit racial bias as an unintended consequence.

You miss the point entirely. If an objective review of available data says "people on this side of this geographic line default on loans more frequently than people on the other side of the line," that's not racial bias, it is observable, provable fact. You cannot argue your way out of reality by calling reality an "unintended consequence." If the default rates for a given area are higher, then the risk to the lender is higher. There is no other metric that needs consulting to make the decision. Period. End of story.

That people on one side of the line are a different race than those on the other side is completely irrelevant to the decision. And to illustrate that, let's say the situation were reversed and a group of wealthy minorities (think Oprah, Tyler Perry, etc.) lived on one side of that line and a group of impoverished whites (think Honey Boo Boo trailer park types) lived on the other. The algorithm would decide the minorities were the lower credit risk and nobody would think a damned thing about it. Not a single solitary person would scream "but it's racist because it's turning down all the non-minorities!" Everybody would (rightly) say "no, it's turning down people with pathetic credit ratings who are high risks to lenders."

The AI isn't racist. It isn't being taught to be racist. People who are conditioned to see racism anywhere and everywhere regardless of reality are the problem.

Comment Re:Or rather... (Score 2) 384

You're missing what the parent is saying - you can't just tell the AI to ignore race/gender, it's baked into how we talk and act.

No, you're missing the point being discussed, namely an AI that makes loan decisions based on available data. The AI can look at income, credit history, employment, etc. and not know a damn thing about the race or gender of the applicant. Guess what it will find? People of lower socioeconomic status are higher credit risks. Who would've guessed, eh? (that's rhetorical -- anybody who can fog a mirror knows this is true).

That people of lower socioeconomic status are typically minorities is utterly irrelevant to the decision-making algorithm used by the AI. It's only the SJW making the link, claiming the AI is "racist" because the outcomes mean minorities would get turned down more often. NO. The AI isn't racist. It's simply acknowledging an unpleasant reality, something SJWs can't stand, therefore "racist." It's the all-purpose slur of the snowflake generation.
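The "the AI never knows the applicant's race or gender" claim can be made concrete by whitelisting the financial fields before anything reaches the scorer. The field names below are hypothetical; the pattern is the point: demographic attributes physically never pass through to the model.

```python
# Hypothetical applicant record. Only the whitelisted financial fields
# ever reach the scoring function, so it cannot condition on anything else.
FINANCIAL_FIELDS = ("income", "credit_score", "years_employed", "prior_defaults")

def scoring_view(applicant: dict) -> dict:
    """Return only the financial fields; demographic keys are dropped here."""
    return {k: applicant[k] for k in FINANCIAL_FIELDS}

applicant = {
    "income": 42000, "credit_score": 580, "years_employed": 2,
    "prior_defaults": 1,
    "race": "X", "gender": "Y",  # present upstream, stripped before scoring
}

view = scoring_view(applicant)
# → {'income': 42000, 'credit_score': 580, 'years_employed': 2, 'prior_defaults': 1}
```

Whatever model sits downstream of `scoring_view` can only correlate on the financial columns; any demographic skew in its outcomes comes from the underlying financial data, not from the model seeing the attributes.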

Comment Re:Self fulfilling prophecies... (Score 2, Insightful) 384

Ergo they will continue to be riskier and worse off than those in social groups with better evaluations. Rinse and repeat.

There is an obvious solution you're ignoring: how about you loan them the money? Or if you lack sufficient funds, get a group of like-minded individuals together, form a banking institution specializing in loans to these people being rejected by traditional institutions, and see how it turns out.

What? You don't want to risk your own money on such a venture? You can't find others willing to risk theirs? You find your default rates are higher than your institution can sustain? Your business fails?

Funny how reality -- which doesn't give a shit about race -- intrudes on the precious safe space SJWs want to construct.

Comment Re: Or rather... (Score 2, Insightful) 384

So why would a machine learning device ignore a strong factual trend, just because its existence is offensive?

Because SJWs want us to live in a society where anything offensive -- regardless of whether it's hard, provable, objective fact -- must be stamped out. These are the same type of people who burned people at the stake for daring to claim the Earth wasn't the center of the universe, or who destroyed the scientific careers of those who dared claim luminiferous aether wasn't a real thing, or who shunned aeronautics engineers who said the sound barrier could be broken, and so on and so forth. These people want us to live in a world where nothing uncomfortable ever happens and everybody remains fat, dumb, and happy...and utterly ignorant.

Such a concept is repellent. Humans need to be challenged, preferably by each other in a constructive way lest reality catch up with us and do it in a much more destructive fashion.

Comment Re: Or rather... (Score 1, Insightful) 384

However, even if the factors that make minorities more risky are already accounted for, an AI may be biased against them because the training data contained a correlation between race and perceived risk.

There is no such thing as "perceived risk." There is either risk or there is not. If you perceive risk and it is real, it's not merely perceived. If you perceive risk and it is not real, you are in error.

There is this broad SJW initiative to discount reality whenever reality conflicts with what SJWs want to be true. Reality laughs at things like this because it is reality. If poor people have a higher risk of defaulting on a loan, that is simply fact. The algorithm isn't racist for determining that. The fact that a significant fraction of the poor is also a racial minority is irrelevant to the algorithm. Only overly-sensitive SJW humans make that connection and, despite the reality of the situation, want the algorithm to ignore reality and proceed as if the risk didn't exist.

Then, when reality intrudes, the loans default, and the banks go belly up because they were terrified of a civil rights lawsuit if they didn't grant the loan, those same SJWs deny their actions had anything to do with the situation. And those of us who opposed this idiotic reality-denial end up paying the tab. And the SJWs never learn and do it all over again. And again. And again.
