

## Mathematicians Study Effects of Gerrymandering on 2012 Election

Submitted by HughPickens.com
HughPickens.com (3830033) writes "Gerrymandering is the practice of establishing a political advantage for a political party by manipulating district boundaries to concentrate your opponents' votes in a few districts while keeping your party's supporters as a majority in the remaining districts. For example, in North Carolina in 2012, Republicans ended up winning nine out of 13 congressional seats even though more North Carolinians voted for Democrats than Republicans statewide. Now Jessica Jones reports that researchers at Duke are studying the mathematical explanation for the discrepancy. Mathematicians Jonathan Mattingly and Christy Vaughn created a series of district maps using the same vote totals from 2012, but with different borders. Their work was governed by two principles of redistricting: a federal rule requires that each district have roughly the same population, and a state rule requires congressional districts to be compact. Using those principles as a guide, they created a mathematical algorithm to randomly redraw the boundaries of the state's 13 congressional districts. "We just used the actual vote counts from 2012 and just retabulated them under the different districtings," says Vaughn. "If someone voted for a particular candidate in the 2012 election and one of our redrawn maps assigned where they live to a new congressional district, we assumed that they would still vote for the same political party."

The results were startling. After re-running the election 100 times with a randomly drawn nonpartisan map each time, the average simulated election result was 7 or 8 U.S. House seats for the Democrats and 5 or 6 for Republicans. The maximum number of Republican seats that emerged from any of the simulations was eight. The actual outcome of the election, four Democratic representatives and nine Republicans, did not occur in any of the simulations. "If we really want our elections to reflect the will of the people, then I think we have to put in safeguards to protect our democracy so redistrictings don't end up so biased that they essentially fix the elections before they get started," says Mattingly. But North Carolina State Senator Bob Rucho is unimpressed. "I'm saying these maps aren't gerrymandered," says Rucho. "It was a matter of what the candidates actually were able to tell the voters and if the voters agreed with them. Why would you call that uncompetitive?""
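The retabulation step the researchers describe is simple enough to sketch in code. Below is a toy Monte Carlo version: the precinct counts are made up, and a plain random assignment stands in for the population- and compactness-constrained sampler Mattingly and Vaughn actually used.

```python
import random

# Hypothetical precinct-level vote counts (not North Carolina's real data).
precincts = [{"dem": random.randint(200, 800), "rep": random.randint(200, 800)}
             for _ in range(130)]

def random_districting(precincts, n_districts=13):
    """Randomly assign precincts to districts -- a stand-in for the
    constrained map sampler used in the actual study."""
    assignment = [i % n_districts for i in range(len(precincts))]
    random.shuffle(assignment)
    return assignment

def seats_won(precincts, assignment, n_districts=13):
    """Retabulate the same votes under a new map; count Democratic seats."""
    totals = [{"dem": 0, "rep": 0} for _ in range(n_districts)]
    for precinct, district in zip(precincts, assignment):
        totals[district]["dem"] += precinct["dem"]
        totals[district]["rep"] += precinct["rep"]
    return sum(1 for t in totals if t["dem"] > t["rep"])

# "Re-run the election" 100 times, each with a freshly drawn random map.
results = [seats_won(precincts, random_districting(precincts))
           for _ in range(100)]
print(min(results), max(results))
```

The key observation is the one Vaughn makes: the votes never change, only the map does, so the spread of `results` isolates how much the districting alone moves the seat count.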

## Comment: Re:"Perfectly timed"? (Score 0)252

by Coryoth (#48177543) Attached to: Apple's Next Hit Could Be a Microsoft Surface Pro Clone

Seems to me that Apple is playing catch-up in the phablet arena. Apple was late to the party and lost the toehold because of its tardiness.

No, no, you're looking at this all wrong. Apple stayed out of the phablet market until phablets were "cool/hip/trendy". The vast sales Samsung had were merely to unimportant people. Apple, on the other hand, entered the market exactly when phablets became cool, because, by definition, phablets became cool only once Apple had entered the market.

## Comment: Re:I FIND THIS HIGHLY... (Score 1)

by Coryoth (#47958797) Attached to: Science Has a Sexual Assault Problem

Logic is a binary function. Something is in a logical set - or it is not. Being illogical is not a synonym for being mistaken. Degrees of precision are irrelevant for set inclusion. Fuzzy logic is not logic.

Fuzzy logic is logic. So are linear logic, intuitionistic logic, temporal logic, modal logic, and categorical logic. Just because you only learned Boolean logic doesn't mean there aren't well-developed, consistent logics beyond it. In practice, bivalent logics are the exception.
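As a concrete illustration, here are the Zadeh-style fuzzy connectives, one standard way to set up a many-valued logic (a toy sketch of the idea, not a full formal system): truth values are reals in [0, 1], and Boolean logic falls out as the two-element special case.

```python
# Zadeh-style fuzzy connectives: truth values are reals in [0, 1].
def f_and(a, b):
    return min(a, b)

def f_or(a, b):
    return max(a, b)

def f_not(a):
    return 1.0 - a

# Restricted to {0, 1}, these are exactly the Boolean connectives...
assert f_and(1, 0) == 0 and f_or(1, 0) == 1 and f_not(0) == 1

# ...but intermediate truth values compose consistently too, and
# familiar laws such as De Morgan's still hold:
assert f_not(f_or(0.75, 0.5)) == f_and(f_not(0.75), f_not(0.5)) == 0.25
```

Nothing about "set inclusion" breaks here; membership is simply graded rather than crisp.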

## Comment: Re:Some criticism (Score 1)

by Coryoth (#47953649) Attached to: KDE's UI To Bend Toward Simplicity

... a lot of people respond to this by saying the criticisms are stupid, that "if you know what you're doing" then you'll understand what's really going on, etc.

Yes; "if you're just willing to get your hands a little dirty and muck in and learn then you can bend the hugely complicated interface to your needs" they'll say; they'll complain that your just not willing to learn things, and thus it is your fault. Such people will inevitably state that they are "power users" who need ultimate configurability and are (unlike you) willing to learn what they need to to get that.

They will inevitably deride GNOME3 for its complete lack of configurability. Of course they'll gloss over the fact that GNOME3 actually exposes pretty much everything via a javascript interface and makes adding/changing/extending functionality via javascript extensions trivial (GNOME3 even has a javascript console to let you do such things interactively). Apparently actually learning an API and coding completely custom interfaces from myriad building blocks is "too much work". They are "power users" who require a pointy-clicky interface to actually configure anything. Even dconf is "too complicated".

For those of us who learned to "customize our desktop" back in the days of FVWM, via scriptable config files calling Perl scripts and the like, it seems clear that these "power users" are really just posers who want to play at being "super-customised". Almost all the modern DEs do have complete customisation available and accessible; some of them just use a richer (scripting) interface to get such things done.

## Comment: Re:Is the complexity of C++ a practical joke? (Score 1)

by Coryoth (#47673801) Attached to: Interviews: Ask Bjarne Stroustrup About Programming and C++

It's not the features that you stare at with no idea what they do that cause a problem. As you say, a quick look at the manual can help sort those out (though it does add to the overall cognitive load). It's all the potentially subtle things that you don't even realise are features, and so never look up, that cause problems: contrary to first inspection, the code is actually doing something subtly different from what you expect.

## Comment: Re:I disagree (Score 1)

by Coryoth (#47488509) Attached to: Math, Programming, and Language Learning

Math is all about being precise, logical... Communicating exactly one concept at a time. Natural languages do neither.

Except math is almost never actually done that way in practice. Euclid was wonderful, but almost all modern math does not work that strictly (and Euclid really should have been more careful with the parallel postulate -- there's "more than one thing at a time" involved there). Yes, proofs are careful and detailed, but so is, say, technical writing in English. Except for a few cases (check out metamath.org, or Homotopy Type Theory) almost no-one actually pedantically lays out all the formal steps introducing "only one concept at a time".

## Comment: Re: Your Results Will Vary (Score 1)

by Coryoth (#47487341) Attached to: Math, Programming, and Language Learning

Not every programmer deals with these [mathematical] questions regularly (which is why I don't think math is necessary to be a programmer), but if you want to be a great programmer you had better bet you'll need it.

I don't think you need math even to be a great programmer. I do think a lot of great programmers are people who think in mathematical terms and thus benefit from mathematics. But I also believe you can be a great programmer and not be the sort of person who thinks in those terms. I expect the latter is harder, but then I'm a mathematician, so I'm more than ready to accept that I have some bias on this topic.

## Comment: Re:I disagree (Score 3, Insightful)

by Coryoth (#47487263) Attached to: Math, Programming, and Language Learning

Math IS sequencing. So is using recipes. That is how math works.

Math is a language. Just because you can frame things in that language doesn't mean that that language is necessary. Recipes are often in English. English is sequencing (words are a serial stream, after all). That doesn't mean English is necessary for programming (there seem to be many competent non-English-speaking programmers, as far as I can tell).

Disclaimer: I am a professional research mathematician; I do understand math just fine.

## Comment: Re: Your Results Will Vary (Score 1)

by Coryoth (#47487085) Attached to: Math, Programming, and Language Learning

College education wastes countless hours teaching academic stuff that a great majority of programmers will not use on the job, while neglecting critical skills that could be immediately useful in a large .[sic]

Of course there was a time when college education was supposed to be education and not just vocational training.

## Comment: Re:Your Results Will Vary (Score 1)

by Coryoth (#47487063) Attached to: Math, Programming, and Language Learning

I think part of the problem is that "programming" is itself so diverse.

The other part of the problem is that math is so diverse. There's calculus and engineering math with all kinds of techniques for solving this or that PDE; there's set theoretic foundations; there's graph theory and design theory and combinatorics and a slew of other discrete math topics; there's topology and metric spaces and various abstractions for continuity; there's linear algebra and all the finer points of matrices and matrix decompositions and tensors and on into Hilbert spaces and other infinite dimensional things; there's category theory and stacks and topos theory and other esoterica of abstraction. On and on, and all very different, and I can't even pretend to have anything but cursory knowledge of most of them ... and I have a Ph.D. in math and work for a research institute trying to stay abreast of a decent range of topics. The people who actually study these topics in depth are all called "mathematicians", but if you're an algebraic geometer then sure, you're probably familiar with category theory and homological algebra; if you do design theory and graph theory then those seem like the most useful subjects available.

## Comment: Re: Your Results Will Vary (Score 2)

by Coryoth (#47487035) Attached to: Math, Programming, and Language Learning

Calculus is perhaps not the best measure, however. Depending on where you go in the programming field, calculus is likely less useful than some decent depth of knowledge in graph theory, abstract algebra, category theory, or combinatorics and optimization. I imagine a number of people would chime in with statistics, but to do statistics right you need calculus (which is an example of one of the directions in which calculus can be useful for programming).

Of course the reality is that you don't need any of those subjects. They can, however, be very useful to you as a programmer. So yes, you can certainly be a programmer, and even a very successful and productive one, without any knowledge of calculus or, say, graph theory. On the other hand, there may well be times when graph theory, or calculus, or statistics could prove very useful. What it comes down to is whether you are inclined to think that way -- if so, it can be a benefit; if not, it won't be the way you think about the problem anyway.

## Comment: Re:I agree Python (Score 3, Informative)

by Coryoth (#47242155) Attached to: Ask Slashdot: Best Rapid Development Language To Learn Today?

I've gotten a lot of mileage out of Python for cleaning and pre-processing CSV and JSON datasets, using the obviously named "csv" and "json" modules. ... However, if you are doing very much manipulation of tabular data, I'd recommend learning a bit of SQL too.

You may want to look into pandas as a middle ground. It's great for sucking in tabular or CSV data and then applying statistical analysis tools to it. It has a native "DataFrame" object which is similar to a database table, and has efficient merge, join, and groupby semantics. If you have a ton of data then a database and SQL is the right answer, but for a decent range of use cases in between, pandas is extremely powerful and effective.
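For instance, a join-then-aggregate that would otherwise call for SQL looks like this in pandas (the tables and column names here are invented for the example):

```python
import pandas as pd

# Toy tabular data standing in for a cleaned-up CSV dump.
orders = pd.DataFrame({"customer": ["ann", "bob", "ann", "cy"],
                       "amount":   [10.0, 5.0, 7.5, 3.0]})
regions = pd.DataFrame({"customer": ["ann", "bob", "cy"],
                        "region":   ["east", "west", "east"]})

# SQL-style LEFT JOIN on the shared key, then a GROUP BY aggregate.
joined = orders.merge(regions, on="customer", how="left")
by_region = joined.groupby("region")["amount"].sum()
print(by_region)
```

`merge` and `groupby` cover most of what you'd reach for `JOIN` and `GROUP BY` to do, without standing up a database first.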

## Comment: Re:Programming language in 2 hours ? Yeah, right. (Score 1)

by Coryoth (#47242127) Attached to: Ask Slashdot: Best Rapid Development Language To Learn Today?

Because Ruby is my preference and I am more familiar with it, I can tell you that it is in continuous development, and bytecode-compiled versions are available (JRuby, which uses the JVM, and others). I do not know about Python in this respect because I haven't used it nearly as much.

Python has a default implementation, CPython, which compiles Python to a bytecode that it then interprets; there's also Jython, which compiles to the JVM, and IronPython, which compiles to Microsoft's CLR. There's also Cython (which requires extra annotations), which compiles to C and thence to machine code, and Numba, which compiles to LLVM. Finally there's PyPy, a Python JIT compiler/interpreter written in a restricted subset of Python.
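You can watch CPython's bytecode compilation happen with the standard-library `dis` module:

```python
import dis

def add(a, b):
    return a + b

# CPython has already compiled the function body to bytecode;
# dis.dis prints the instructions in human-readable form.
dis.dis(add)

# The compiled code object is also inspectable directly:
print(add.__code__.co_code)      # the raw bytecode, as bytes
print(add.__code__.co_varnames)  # the local variable names
```

The exact instruction names vary by CPython version (e.g. `BINARY_ADD` vs. `BINARY_OP`), which is itself a reminder that bytecode is an internal detail of the implementation, not part of the language.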

## Comment: Re:worthless top five phrases (Score 2)

by Coryoth (#46845653) Attached to: Algorithm Distinguishes Memes From Ordinary Information

So they mined the journal for words and phrases... meh, those aren't memes

They are memes in the sense that the method specifically finds words and phrases that are frequently inherited by papers (where "descendant" is determined by citation links) and that rarely appear spontaneously (i.e. without appearing in any of the papers cited by a paper). An important feature is that their method used zero linguistic information: it didn't bother pruning stopwords or, indeed, doing any preprocessing other than simple tokenisation by whitespace and punctuation. Managing to come out with nouns and complex phrases under such conditions is actually very impressive. You should try actually reading the paper.
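The inheritance-versus-spontaneity idea can be sketched in a few lines. This is a drastically simplified toy (a made-up citation graph and a plain ratio of rates), not the paper's actual propagation score:

```python
# Toy citation graph: paper -> papers it cites, plus each paper's phrases.
# A "meme-like" phrase appears far more often when a cited paper uses it.
cites = {"p1": [], "p2": ["p1"], "p3": ["p1", "p2"], "p4": [], "p5": ["p4"]}
phrases = {"p1": {"quantum teleportation"},
           "p2": {"quantum teleportation"},
           "p3": {"quantum teleportation"},
           "p4": {"the results"},
           "p5": set()}

def propagation_score(phrase):
    """Rate of inheriting a phrase from cited papers, divided by the
    rate of it appearing spontaneously (no cited paper uses it)."""
    inherited = spontaneous = with_src = without_src = 0
    for paper, refs in cites.items():
        if any(phrase in phrases[r] for r in refs):
            with_src += 1
            inherited += phrase in phrases[paper]
        else:
            without_src += 1
            spontaneous += phrase in phrases[paper]
    inherit_rate = inherited / with_src if with_src else 0.0
    spont_rate = spontaneous / without_src if without_src else 0.0
    return inherit_rate / spont_rate if spont_rate else float("inf")

print(propagation_score("quantum teleportation"))  # high: inherited via citations
print(propagation_score("the results"))            # low: appears, but isn't passed on
```

Note that no linguistic knowledge enters anywhere; the score depends only on where phrases occur relative to the citation links, which is exactly why stopword-free output is impressive.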

## Comment: Re:now if only people can stop calling netmemes me (Score 1)

by Coryoth (#46845163) Attached to: Algorithm Distinguishes Memes From Ordinary Information

But the writers of TFA are still misusing the word

Actually no, they are not. By using citations to create a directed graph of papers they are specifically looking for words or phrases that are highly likely to be inherited by descendant documents and also much less frequently spontaneously appear in documents (i.e. not used in any of the cited documents). They really are interested in the heritability of words and phrases.

"Flattery is all right -- if you don't inhale." -- Adlai Stevenson
