Comment: Re:We should all like this Bitcoin *concept* (Score 1) 276

by Coryoth (#45618365) Attached to: This Whole Bitcoin Thing Could Be Big, Says Bank of America

You are solely focused on bitcoin as an investment opportunity rather than its intrinsic utility.

Sure, but as far as intrinsic utility is concerned it doesn't matter when I get involved with bitcoin ... well, in fact it does: right now the price instability and general uncertainty mean it is far better not to get involved, wait for all this nonsense to sort itself out, and join the game once everything is settled, stable, and bitcoin is actually being used purely for its intrinsic utility. In other words, it's better for me to ignore it for a few more years at least.

Comment: Re:A link between DPR and an early Bitcoiner (Score 5, Insightful) 172

by Coryoth (#45510155) Attached to: Study Suggests Link Between Dread Pirate Roberts and Satoshi Nakamoto

I think the more interesting part is the fact that we have some decent mathematicians (in this case Adi Shamir among others) setting about pulling the entire bitcoin transaction graph and doing some serious data-mining on it. The reported result sounds like a mildly interesting finding that happened to pop up in the first pass.

Given the advanced tools available these days for graph mining (largely developed for social network analysis among other things) I suspect some rather more interesting results may start coming out soon. What may seem hard to track on an individual basis may fall somewhat more easily to powerful analysis tools that get to make use of the big picture. I bet there's some interesting info on cliques and exchanges that could be teased out by serious researchers with some decent compute power at their disposal. Pseudonymity may be even weaker than you might think.
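To make the idea concrete, here's a toy sketch (all addresses and transactions hypothetical) of one of the standard graph-mining heuristics researchers apply to the transaction graph, common-input-ownership clustering: addresses spent together as inputs of a single transaction are presumed to share an owner, so ownership clusters fall out of a simple union-find.

```python
# Toy sketch of the "common input ownership" heuristic used in
# Bitcoin transaction-graph analysis. All data here is hypothetical;
# real analyses run this over the full blockchain with proper graph
# libraries and combine it with other heuristics.

def cluster_addresses(transactions):
    parent = {}

    def find(a):
        parent.setdefault(a, a)
        while parent[a] != a:
            parent[a] = parent[parent[a]]  # path halving
            a = parent[a]
        return a

    def union(a, b):
        parent[find(a)] = find(b)

    for tx in transactions:
        inputs = tx["inputs"]
        # Every address co-spending in this transaction joins one cluster.
        for other in inputs[1:]:
            union(inputs[0], other)
        for a in inputs:
            find(a)  # register singletons too

    clusters = {}
    for a in parent:
        clusters.setdefault(find(a), set()).add(a)
    return [sorted(c) for c in clusters.values()]

txs = [
    {"inputs": ["addr_A", "addr_B"]},
    {"inputs": ["addr_B", "addr_C"]},
    {"inputs": ["addr_D"]},
]
print(sorted(cluster_addresses(txs)))
# [['addr_A', 'addr_B', 'addr_C'], ['addr_D']]
```

Note how addr_A and addr_C end up linked even though they never appear in the same transaction; transitivity is exactly why pseudonymity erodes faster than people expect.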

Comment: Re:a skeptic says "wow bitcoin is serious ". Hope (Score 2) 167

by Coryoth (#45507051) Attached to: 195K Bitcoin Transaction

Try pricing in Zimbabwean dollars - you'll see the same problem.

Well, you won't anymore, because the Zimbabwean dollar was discontinued and the country now uses the US dollar as its currency: price volatility made continued use of the Zimbabwean dollar as a currency effectively impossible.

Now, Zimbabwe had inflation, not deflation, but the issue of volatility is the same: it makes things ultimately unworkable if it gets too high (even if it moves in a predictable way). When prices change significantly* by the minute and transactions take several minutes to complete, trouble sets in.

* significantly here means, say, a double-digit percentage change in price every minute. Bitcoin is a long way from that currently, but is headed in that direction.

Comment: Re:yet another programming language (Score 1) 168

by Coryoth (#45436039) Attached to: Stephen Wolfram Developing New Programming Language

Being primarily a mathematician and not a computer scientist or engineer I've used Maple, Mathematica, Matlab, Magma and R. I've also programmed in Python, Perl, C, and Java and dabbled in things like Lisp and Haskell.

All the "math" programs on that list are terrible programming languages; they work great as interactive environments for doing (potentially symbolic) computation, but writing code in them? Ugh. If I actually have to write scientific computing code it's going to be in Python using numpy and sympy, or C if I need performance.
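As a rough illustration of that numpy/sympy workflow (the expression here is arbitrary, just for demonstration): derive something symbolically with sympy, then compile it to a vectorized numpy function for the numerical work.

```python
# Minimal sketch: symbolic differentiation with sympy, then lambdify
# to get a fast numpy-backed function for numerical evaluation.
import numpy as np
import sympy as sp

x = sp.symbols("x")
expr = sp.sin(x) * sp.exp(-x)
derivative = sp.diff(expr, x)            # exp(-x)*cos(x) - exp(-x)*sin(x)
f = sp.lambdify(x, derivative, "numpy")  # vectorized numerical version

xs = np.linspace(0.0, 1.0, 5)
print(f(xs))  # evaluates the derivative over the whole array at once
```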

The different math programs all have their strengths and weaknesses: Matlab kicks the crap out of the others for anything numerical or linear algebra related, both for ease of expression and performance; R has far more statistical capability than any of the others -- data frames as a fundamental data type make that clear; Magma is incomparable for the breadth and power of its algebra, none of the others come remotely close; Mathematica and Maple are ... well, rather poor jacks of all trades that do most things but none of them very well.

Comment: Re:Remove CTRL + C as well (Score 1) 729

by Coryoth (#44942087) Attached to: Middle-Click Paste? Not For Long

Especially in an environment like Gnome 3 where the preferred method of working is full screen apps. Drag and drop to what?

I'm not really sure full screen is "the preferred method" in Gnome 3 (I use Gnome 3 and never full-screen apps). Anyway, presuming you want to drag and drop, you can drag to the Activities corner, which takes you to the expose-style overview; from there you can select any window and drop into it. I had never done this until just now, to see if it works, and it does, quite smoothly (hover over a window for a second to have it come to the front if you want to drop at a particular location within the window).

Comment: Re:FUCK OFF (Score 3, Insightful) 729

by Coryoth (#44936319) Attached to: Middle-Click Paste? Not For Long

I try 'desktops' from time to time but they don't really give me much beyond managing windows. you know, the thing that fvwm does well enough and with 1/10 the memory and cpu.

A lot of what 'desktops' provide these days is stuff you don't see immediately: the toolkits, internationalization/localization, canvases, settings centralization and management, advanced font handling, notification plumbing, etc. that most GUI applications make use of these days (from one desktop or another). Presuming you're using apps other than xterm (and perhaps you are not), you are actually making use of most of this stuff; the part of the 'desktop' you're not using is simply the window manager and the panels, which are, ultimately, the tip of the iceberg.

Comment: Re:interesting take. (Score 3, Insightful) 158

by Coryoth (#44391275) Attached to: Mozilla Labs Experiment Distills Your History Into Interests

It could work; it's not sending any data that couldn't be extracted from your history anyway (which they are largely getting now via blanket tracking) so it's not especially detrimental to the user.

Well, depending on how much you are blocking cookies and trying to keep information out of the hands of advertisers and other internet douchebags, you may feel differently.

Mozilla has said this is something you can opt out of, so it's no worse than blocking cookies etc. (and, in fact, is probably easier).

How about you develop tools to keep my information out of the hands of those 3rd parties? Instead they just seem to be looking to become yet another broker of your information.

Looked at the right way, this is almost exactly that. Presume for a moment that it works (a big if) and advertisers take to using this instead of pervasive tracking. Now we're in a place where we have a single central point of data release to advertisers; you can turn it off; you can potentially drop in a plug-in that publishes a hand-crafted/approved list of "interests" instead of mining your history for it; etc. If it works, it gives users more control over their privacy.

The reality is that information is currency these days, and people will mine for this sort of data because it is valuable. You won't have much luck just blocking everything because the incentives to find a way around whatever blocks are put in place are high. So, assuming information is going to be given, trying to give the user more control over what information is handed over seems like a good thing. I doubt this particular plan will actually work, but I expect something along these lines will happen eventually.

Comment: Re:interesting take. (Score 3, Interesting) 158

by Coryoth (#44390731) Attached to: Mozilla Labs Experiment Distills Your History Into Interests

It makes sense if advertising companies were nice people, but please never turn this on by default. Most likely they will just add the info that you supply them to their trove of tracking data.

It could work; it's not sending any data that couldn't be extracted from your history anyway (which they are largely getting now via blanket tracking), so it's not especially detrimental to the user. On the other hand, it is essentially doing, on the client side and ahead of time, the data mining and summarisation that the advertisers would otherwise have to do themselves. Getting your product to do some of your compute work for you may be enough of a carrot to get advertisers to take this in preference to all the raw data collected by pervasive tracking.

Comment: Re:Gawd (Score 5, Insightful) 434

by Coryoth (#44387771) Attached to: Love and Hate For Java 8

And it doesn't mean Java doesn't have serious flaws. There's something deeply ingrained in Java that encourages over-engineering. But every language has its pitfalls.

I don't think there's much in Java the language that encourages over-engineering; it's more the community that surrounds Java. It's in the tutorials, and books, and code examples, and discussion groups. It's in the frameworks and libraries.

The reality is that a "language" is as much shaped by the community that grows up around it as by the actual language itself. Perl doesn't have to be particularly unreadable, but the culture that grew up around Perl in the late 90s, obsessed with cute hacks, fewest keystrokes, and self-created obscurity, created a state where anyone learning Perl was immersed in that culture and came out writing a lot of unreadable stuff. It is my understanding that since many of those programmers left Perl for other languages, Perl has been remade as "Modern Perl", which is largely the same core language, just with a different culture and libraries, and is quite readable.

Conversely, Python can be made quite diabolical (just throw together chains of nested list comprehensions and single-character variables, for example), but because it grew up with a culture of "one obvious way to do it" and readability, most code you'll see tends to eschew such things and strives to read like pseudo-code. Again, there's not that much inherent in the language; it's the cultural conventions surrounding the language that enforce much of that.
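An illustration of the point: the same computation written both ways. Both versions compute the flattened squares of the even numbers in a list of lists; one is legal but diabolical, the other is the conventional style.

```python
data = [[1, 2, 3], [4, 5], [6]]

# Diabolical but legal: nested comprehension, single-letter names.
r = [y * y for x in data for y in x if not y % 2]

# Conventional style: same result, reads like pseudo-code.
squares_of_evens = []
for row in data:
    for value in row:
        if value % 2 == 0:
            squares_of_evens.append(value * value)

print(r == squares_of_evens)  # True
print(squares_of_evens)       # [4, 16, 36]
```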

Java fell in with the Enterprise crowd, and consequently found itself immersed in a culture obsessed with design patterns and over-engineering. Had things gone a little differently, with, say, in-browser applets somehow becoming the primary driving force for Java (let's assume they ran better), then I doubt Java would be known for over-engineering.

Comment: Re:Honesty? (Score 1) 440

by Coryoth (#44344625) Attached to: How Climate Scientists Parallel Early Atomic Scientists

No, spatial variation is treated as spatial variation, but the central limit theorem still applies with respect to the mean temperature over spatial variation. Temporal variation is treated as temporal variation, but when determining the mean over a time period the central limit theorem still applies and gives greater accuracy for more measurements over that period. Etc. Apply a little bit of common sense.

Comment: Re:Honesty? (Score 2) 440

by Coryoth (#44341471) Attached to: How Climate Scientists Parallel Early Atomic Scientists

I still don't know how measurements of climate change can be done in fractions of a degree when the base measurements are done with margins of error sometimes as much as 5-10 degrees. Accumulations of rounding errors alone would seem to indicate that reports should have much larger margins of error on computed values. That is but one of many problems with current observations in climatology.

It's called the Central Limit Theorem. Suppose you have some independent random variable with mean mu and variance sigma squared; CLT says that if you take n observations (X_1, ..., X_n) from the random variable then the sample mean (X_1 + ... + X_n)/n tends to be normally distributed with mean mu and variance sigma^2/n. It is a very well established and formally proven theorem of basic statistics.

Now, how does this apply to large error bars on individual temperature observations and fractions of a degree on global warming estimates? Well, let's start with trying to figure out the temperature on some particular day 100 years ago. We have records of it. Those records are not especially accurate (within 2 degrees of the actual temperature, say). That means the records are a random variable with mean equal to the actual temperature and variance related to the margin of error on the observation (we are randomly a little high, or a little low); let's call that variance e for "error". According to the CLT, if we gather n such records and take their mean, then that mean will have a mean of the actual temperature and a variance of e/n; that means we can actually reduce the margin of error of our estimate by gathering multiple different observations and averaging them. Thus, despite the inaccuracy of any individual measurement, we can have significant accuracy of measurement in aggregate.

And that's a quick summary of how it works; in practice there are more considerations, but there's also more statistical theory that covers those considerations too. Hopefully I've managed to give you at least some idea of how this works though.
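A quick simulation makes this concrete (the true temperature and error figures are invented for illustration): take batches of noisy readings, average each batch, and watch the spread of the batch means shrink like sigma/sqrt(n).

```python
# Simulate noisy thermometer readings around a known true temperature
# and show that averaging n of them shrinks the error as the CLT predicts.
import random
import statistics

random.seed(42)
true_temp = 15.0   # the value we're trying to recover
sigma = 2.0        # per-reading standard deviation ("margin of error")

def batch_mean(n):
    """Mean of n independent noisy readings."""
    return statistics.fmean(random.gauss(true_temp, sigma) for _ in range(n))

# For each batch size, take many batch means and measure their spread.
for n in (1, 25, 400):
    means = [batch_mean(n) for _ in range(2000)]
    print(n, round(statistics.stdev(means), 3))  # roughly sigma / sqrt(n)
```

With n = 400 readings the batch means scatter by only about 0.1 degrees, even though each individual reading is off by about 2: that's the aggregation effect at work.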

Comment: Re:Agreed, it's stupid (Score 4, Insightful) 737

by Coryoth (#44016487) Attached to: Sexism Still a Problem At E3

Pardon my ignorance, but why is it repulsive to see attractive people at product promotion booths?

It's not, and as one of the linked articles pointed out, the ban on booth babes at PAX didn't stop some companies having attractive women there to sell stuff -- the difference being that said women were dressed normally, and actually knew all about what they were selling (that is, they were regular salespeople for the company that happened to be women). If you can't see the difference between that and booth babes then you are part of the problem.

Comment: Re:The Manchurian Candidate (Score 3, Informative) 240

by Coryoth (#43946661) Attached to: Clearing Up Wayland FUD, Misconceptions

There was a time when displays did everything by passing around rendering primitives -- lines, rectangles, black and white bitmap pattern tiles. At that time it made a lot of sense to integrate network at the low level because you had to figure out how to send and decode all those drawing primitives over the wire.

Display technology moved on. Displays became rich and complex and colourful, and different applications had very different needs and took on more and more of the rendering task themselves and simply pushed bitmap buffers to the display system. Now the task of the display system was to mediate and manage and request complex bitmap buffers from the various clients.

At this point remote display became a matter of having the client send (potentially compressed) bitmap buffers -- let the clients do their own rendering. This is how most remote display systems written in the last 15 years do it. Indeed, that is how X does it these days for most applications: the applications do their own rendering via GTK or Qt and Cairo, and X pushes the pixmaps down the wire.

If all you are doing is throwing around bitmap buffers, and the display software is simply mediating and displaying those, then remote display doesn't need a whole lot of thought at the display level -- all it has to do is mediate and display the bitmaps it gets from clients. Now, providing a remoting system to let remote clients get their bitmap buffers to the display when requested ... well, that's still a thing that needs to be done, but the base display software doesn't have to care too much about how that gets done.

Think of it as teasing out the layers in the software. The base layer is pushing pixels to the screen (no matter where the data for those pixels came from, remote or local). That's one job: pixels on screen. Focus on that and do it well. Another job is getting the data that the base layer is going to display to it, and you can worry about remote/local differences in that layer.
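A toy sketch of that layering (all class and method names are hypothetical): the base layer only puts pixel buffers on screen, while producing the buffer, whether shared locally or decompressed off the wire, is entirely the transport layer's problem.

```python
# Toy illustration of separating "pixels on screen" from "where the
# pixels came from". The display layer cannot tell local and remote
# clients apart; only the transport layer knows the difference.
import zlib

class Display:
    """Base layer: put buffers on screen, nothing else."""
    def __init__(self):
        self.screen = {}

    def present(self, window_id, buffer):
        self.screen[window_id] = buffer  # "blit" the raw bytes

class LocalClient:
    """Local transport: the buffer is shared directly."""
    def buffer(self):
        return b"\x00\xff" * 4  # raw pixel bytes

class RemoteClient:
    """Remote transport: decompress what came over the wire."""
    def __init__(self, wire_bytes):
        self.wire_bytes = wire_bytes

    def buffer(self):
        return zlib.decompress(self.wire_bytes)

display = Display()
display.present("term", LocalClient().buffer())
display.present("editor", RemoteClient(zlib.compress(b"\x00\xff" * 4)).buffer())

# Identical bytes reach the display either way.
print(display.screen["term"] == display.screen["editor"])  # True
```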

Comment: Re:The Manchurian Candidate (Score 4, Insightful) 240

by Coryoth (#43943331) Attached to: Clearing Up Wayland FUD, Misconceptions

They have barely given it a second thought because they've established that it can be done in principle, and it isn't the stuff they're working on (which is getting the actual display working cleanly and efficiently). They can worry about it when the fundamentals are pinned down. Do you really want excellent networking for a display system that doesn't display well, or is horribly slow? First things first and all that. As long as nothing they do makes it infeasible or overly complicated, there's no point worrying about it until the very core functionality works as they wish.
