Comment: Re:Legitimate concerns (Score 1, Informative) 182

by Space cowboy (#47577907) Attached to: UK Government Report Recommends Ending Online Anonymity

This is a very US-typical way of thinking.

In the UK, it's more of a "where is the harm" approach. If the perceived harm from allowing the speech outweighs the harm of restricting it, it won't be allowed. This is more difficult to administer (it means someone, usually a judge, has to make a decision rather than it just being black and white), but it does make life more pleasant for more people.

Having lived in the UK and the US for over a decade each, I have some perspective on this, and personally I think it's worth it. Worshipping at the altar of "Free Speech At All Costs[*]" is an absolute, and I tend to distrust absolutes.

Simon.

[*] It's not a real absolute in the USA (you can't shout "Fire!" in a crowded theatre there either, for example), but in my experience it's a massively more common mindset among US people than UK people.

Comment: Re:I disagree (Score 1) 241

by Coryoth (#47488509) Attached to: Math, Programming, and Language Learning

Math is all about being precise, logical... communicating exactly one concept at a time. Natural languages do neither.

Except math is almost never actually done that way in practice. Euclid was wonderful, but almost all modern math does not work that strictly (and Euclid really should have been more careful with the parallel postulate -- there's "more than one thing at a time" involved there). Yes, proofs are careful and detailed, but so is, say, technical writing in English. Except for a few cases (check out metamath.org, or Homotopy Type Theory) almost no one actually pedantically lays out all the formal steps, introducing "only one concept at a time".

Comment: Re: Your Results Will Vary (Score 1) 241

by Coryoth (#47487341) Attached to: Math, Programming, and Language Learning

Not every programmer deals with these [mathematical] questions regularly (which is why I don't think math is necessary to be a programmer), but if you want to be a great programmer you had better bet you'll need it.

I don't think you need math even to be a great programmer. I do think a lot of great programmers are people who think in mathematical terms and thus benefit from mathematics. But I also believe you can be a great programmer and not be the sort of person who thinks in those terms. I expect the latter is harder, but then I'm a mathematician, so I'm more than ready to accept that I have some bias on this topic.

Comment: Re:I disagree (Score 3, Insightful) 241

by Coryoth (#47487263) Attached to: Math, Programming, and Language Learning

Math IS sequencing. So is using recipes. That is how math works.

Math is a language. Just because you can frame things in that language doesn't mean that that language is necessary. Recipes are often in English. English is sequencing (words are a serial stream, after all). That doesn't mean English is necessary for programming (there seem to be many competent non-English-speaking programmers, as far as I can tell).

Disclaimer: I am a professional research mathematician; I do understand math just fine.

Comment: Re: Your Results Will Vary (Score 1) 241

by Coryoth (#47487085) Attached to: Math, Programming, and Language Learning

College education wastes countless hours teaching academic stuff that a great majority of programmers will not use on the job, while neglecting critical skills that could be immediately useful in a large .[sic]

Of course there was a time when college education was supposed to be education and not just vocational training.

Comment: Re:Your Results Will Vary (Score 1) 241

by Coryoth (#47487063) Attached to: Math, Programming, and Language Learning

I think part of the problem is that "programming" is itself so diverse.

The other part of the problem is that math is so diverse. There's calculus and engineering math, with all kinds of techniques for solving this or that PDE; there's set-theoretic foundations; there's graph theory and design theory and combinatorics and a slew of other discrete math topics; there's topology and metric spaces and various abstractions for continuity; there's linear algebra and all the finer points of matrices and matrix decompositions and tensors, on into Hilbert spaces and other infinite-dimensional things; there's category theory and stacks and topos theory and other esoterica of abstraction. On and on, all very different, and I can't even pretend to have anything but cursory knowledge of most of them ... and I have a Ph.D. in math and work for a research institute trying to stay abreast of a decent range of topics. The people who actually study these topics in depth are all called "mathematicians", but if you're an algebraic geometer then sure, you're probably familiar with category theory and homological algebra; if you do design theory and graph theory then those seem like the most useful subjects available.

Comment: Re: Your Results Will Vary (Score 2) 241

by Coryoth (#47487035) Attached to: Math, Programming, and Language Learning

Calculus is perhaps not the best measure, however. Depending on where you go in the programming field, calculus is likely less useful than some decent depth of knowledge in graph theory, abstract algebra, category theory, or combinatorics and optimization. I imagine a number of people would chime in with statistics, but to do statistics right you need calculus (which is an example of one of the directions in which calculus can be useful for programming).

Of course the reality is that you don't need any of those subjects. Those subjects can, however, be very useful to you as a programmer. So yes, you can certainly be a programmer, and even a very successful and productive one, without any knowledge of calculus or graph theory, say. On the other hand, there may well be times when graph theory, or calculus, or statistics could prove very useful. What it comes down to is whether you are inclined to think that way -- if so it can be a benefit; if not, it won't be the way you think about the problem anyway.

Comment: Re:Misleading summary (Score 1) 150

Questioning and asking are two completely different things; otherwise one wouldn't "ask a question", one would simply ask or question.

To question something is to doubt the premises that lead to a given statement. To ask something is to enquire about it. When one doubts a conclusion (i.e. questions it), one normally asks to ascertain the veracity of the conclusion. This leads to the construct "to ask a question" as in "to resolve a doubt".

Simon

Comment: Re:I agree Python (Score 3, Informative) 466

by Coryoth (#47242155) Attached to: Ask Slashdot: Best Rapid Development Language To Learn Today?

I've gotten a lot of mileage out of Python for cleaning and pre-processing CSV and JSON datasets, using the obviously named "csv" and "json" modules. ... However, if you are doing very much manipulation of tabular data, I'd recommend learning a bit of SQL too.

You may want to look into pandas as a middle ground. It's great for sucking in tabular or CSV data and then applying statistical analysis tools to it. It has a native DataFrame object, which is similar to a database table and has efficient merge, join, and groupby semantics. If you have a ton of data then a database and SQL is the right answer, but for a decent range of use cases in between, pandas is extremely powerful and effective.
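As a minimal sketch of the kind of thing I mean (the file names and column names here are made up purely for illustration):

import pandas as pd

# Load two CSV files into DataFrames (hypothetical files and columns)
orders = pd.read_csv("orders.csv")
customers = pd.read_csv("customers.csv")

# SQL-style left join on a shared key column
merged = orders.merge(customers, on="customer_id", how="left")

# groupby/aggregate, roughly the equivalent of GROUP BY ... SUM in SQL
totals = merged.groupby("country")["amount"].sum().sort_values(ascending=False)
print(totals.head())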

Comment: Re:Programming language in 2 hours ? Yeah, right. (Score 1) 466

by Coryoth (#47242127) Attached to: Ask Slashdot: Best Rapid Development Language To Learn Today?

Because Ruby is my preference and I am more familiar with it, I can tell you that it is in continuous development, and bytecode-compiled versions are available (JRuby, which uses the JVM, and others). I do not know about Python in this respect because I haven't used it nearly as much.

Python has the default implementation, CPython, which compiles Python to an interpreted bytecode; there's also Jython, which compiles to the JVM, and IronPython, which compiles to Microsoft's CLR. There's also Cython (which uses extra type annotations), which compiles to C and thence to machine code, and Numba, which does JIT compilation via LLVM. Finally there's PyPy, a Python JIT compiler/interpreter written in a restricted subset of Python.
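You can even watch CPython's bytecode compilation happen with the standard dis module (a trivial sketch; the function is just a toy example):

import dis

def add_one(x):
    return x + 1

# Show the bytecode CPython compiled the function to; the exact opcodes
# (LOAD_FAST, RETURN_VALUE, etc.) vary a bit between Python versions.
dis.dis(add_one)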

Comment: Re:worthless top five phrases (Score 2) 38

by Coryoth (#46845653) Attached to: Algorithm Distinguishes Memes From Ordinary Information

So they mined the journal for words and phrases... meh, those aren't memes

They are memes in the sense that they are specifically finding words and phrases that are frequently inherited by papers (where "descendant" is determined by citation links), and rarely appear spontaneously (i.e. without appearing in any of the papers cited by a paper). An important feature is that their method used zero linguistic information: it didn't bother pruning out stopwords, or indeed do any preprocessing other than simple tokenisation by whitespace and punctuation. Managing to come out with nouns and complex phrases under such conditions is actually very impressive. You should try actually reading the paper.
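To give a flavour of the idea (this is NOT the paper's actual scoring formula, just a toy sketch of "inherited via citations" versus "appears spontaneously", with made-up data structures):

# papers: dict paper_id -> set of phrases occurring in that paper
# citations: dict paper_id -> set of paper_ids that paper cites
def meme_ratio(phrase, papers, citations):
    inherited = spontaneous = 0
    for pid, phrases in papers.items():
        if phrase not in phrases:
            continue
        cited = citations.get(pid, set())
        if any(phrase in papers.get(c, set()) for c in cited):
            inherited += 1    # phrase also appears in something this paper cites
        else:
            spontaneous += 1  # phrase shows up "out of nowhere"
    return inherited / ((inherited + spontaneous) or 1)

A phrase with a high ratio propagates along citation links rather than popping up independently, which is roughly the sense in which they call it a meme.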

Comment: Re:now if only people can stop calling netmemes me (Score 1) 38

by Coryoth (#46845163) Attached to: Algorithm Distinguishes Memes From Ordinary Information

But the writers of TFA are still misusing the word

Actually no, they are not. By using citations to create a directed graph of papers, they are specifically looking for words or phrases that are highly likely to be inherited by descendant documents and much less likely to appear spontaneously (i.e. not used in any of the cited documents). They really are interested in the heritability of words and phrases.

Comment: Re:ip over tcp exists. see also PPoE (Score 1) 804

Yeah, but you need the lower-level frames (link layer) to implement the higher-level protocol (TCP) so that you can encapsulate another lower-level protocol within it; you can't implement TCP without any link layer underneath it, is what I was trying to say. Note the "only" qualifying "using TCP" in the post.

Comment: Re:64 GB ECC 32 consumer, pcie vs. sata. compare H (Score 1) 804

What I'm really saying is that Thunderbolt is like a transport-layer protocol, and PCIe, Ethernet, video, etc. are all protocols layered on top of that transport. It's very much like the OSI stack, inasmuch as there's a link-level protocol with service-level protocols building on that basic transport.

I have no experience with PC motherboards so I'm not *sure* what they're doing, but I suspect that they are exposing any PCIe-level protocol traffic as hot-plug PCIe (as the Mac does), and that the OP is misunderstanding what the author of the HTML page he linked to is saying.

Thunderbolt itself is a lower-level protocol, but one that can be addressed directly, which can be useful for particular applications. One example is raw DMA: any Thunderbolt device can DMA into any other device without the CPU getting involved (modulo the conditions I mention above).

I thought the spec comment was a bit odd as well, but I think he might be referring to the fact that the spec (and the hardware) has changed over time. There are several revisions...

Comment: Re:64 GB ECC 32 consumer, pcie vs. sata. compare H (Score 1) 804

Dude, I'm just describing what I see. I have the docs too, for both protocol and controller chips, and I have the code and measurements to prove it.

There's a clear difference in the time taken to process packets once the kernel gets involved, and, within experimental error, that time difference is nicely quantized.

I can't say it any clearer: when the kernel doesn't need to get involved (see above for the criteria), it just doesn't, at least on a Mac. Perhaps the BIOSes Greg is using are not implemented well; I don't know (I have no experience there), but the Mac does it intelligently.

"If it ain't broke, don't fix it." - Bert Lantz
