Comment: org-mode in emacs (Score 1) 133

by quarrelinastraw (#45831409) Attached to: Ask Slashdot: Life Organization With Free Software?

I have tried several solutions and find org-mode to be the most useful one. I have it set up in essentially a GTD structure, with TODO items in one big list, a separate list of active projects, and a third list of potential future projects.

org-mode is extremely configurable, which is a definite plus for software you intend to use to organize your life. I recommend the following add-ons as well:

  • org-caldav to synchronize appointments and scheduled TODO tasks with Google calendar. (two-way sync that is!)
  • MobileOrg for Android, which will let your Nexus 7 work with your org-files.

Now you just take text notes in emacs and org-mode does the rest. I happen to write LaTeX and some code, and org-mode also supports literate programming. This is an extremely useful feature, since I can write rich outlines of all my projects in org-mode and have the TODO items placed naturally where they would have gone in my hand-written notes.

For a simple example of how this might work, suppose I'm drafting a research paper and don't remember some detail I don't care about currently. I might write, "Smith first published this theory in the 80s * TODO Look up the year for Smith". org-mode will find these todo items and tell me about them when I look at my todo list. Then when I find the appropriate date and want to insert it, it will take me directly to where I put the TODO item so I can insert it into the draft with minimal effort. In this way, outlines naturally progress toward finished projects, just as GTD wants them to.
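A minimal sketch of what such an outline fragment might look like (the headlines and date are made up):

```org
* Drafting the Smith paper
  Smith first published this theory in the 80s.
** TODO Look up the year for Smith's first publication
   SCHEDULED: <2014-01-10 Fri>
* Potential future projects
** Maybe: follow-up paper on the converse result
```

The TODO headline shows up in the agenda and todo list, and jumping to it from there drops you right back into the draft; the SCHEDULED timestamp is what org-caldav picks up for calendar sync.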

Comment: Re:Still Bad Patents (Score 2) 162

by quarrelinastraw (#45226593) Attached to: Finally, a Bill To End Patent Trolling

To prove that all they should have to do is put twenty programmers on the stand and if ten of them can come up with that solution.

Why is that the standard of obviousness? When it comes to algorithm design, an algorithm can be manifestly obvious to mathematicians and algorithm specialists but not to ordinary programmers. The obviousness criterion should take corporate structure into account, or else it will be profitable to keep a stable of uninformed programmers in the dark while they implement algorithms designed by specialists with a higher level of technical knowledge. Then any algorithm that is obvious to someone with technical knowledge will still be patentable because it's implemented by less informed programmers.

Comment: Re:Still Bad Patents (Score 1) 162

by quarrelinastraw (#45226521) Attached to: Finally, a Bill To End Patent Trolling

Because smart people, who think stuff up, ought to be able to get paid for their ideas.
Right, and they're not because somebody filed an overly broad patent that covered the obvious portions of their clever ideas. So instead of being paid for their ideas and work, they are paying extortionary licensing fees for work they never used when coming up with their ideas.

Patents don't work for massively distributed fields like software development. The solutions are often obvious next steps that are waiting only for hardware advances or other external events. Solutions and algorithms are often simultaneously discovered by multiple parties independently. Patents are not needed to spur or reward innovation in software and processes, but they are useful as ways of forcing competitors to refrain from implementing obvious improvements.

That is why the rules the Supreme Court applies to patents disallow software patents, even though the lower courts have allowed them in certain circumstances, effectively overriding the Supreme Court's rules until the Court decides to hear a case that challenges those lower-court rulings.

Comment: Re:GP says, "you may be right" (Score 1) 150

by quarrelinastraw (#44699589) Attached to: New Zealand Bans Software Patents

I haven't looked carefully at "abstract idea" and how that applies to patents (or doesn't).

Hi, thanks for the response.

I'm a mathematician and I honestly find a lot of the "all algorithms are math" argument to be rather weak for reasons I'd be happy to go into. I think "abstract idea" is the real issue. The best analysis of the situation that I've seen so far is Ben Klemens' "The Rise of the Information Processing Patent", the pdf of which can be found here http://www.bu.edu/law/central/jd/organizations/journals/scitech/volume141/documents/Klemens.pdf.

Comment: Re:Below the line (Score 1) 162

Anyone who is actually voting for wikileaks will likely be well informed and voting below the line anyways.

For those not familiar with Australian voting, we have preferential instant-runoff voting rather than first-past-the-post.

You can either vote "above the line," where you select ONE party, and that party decides how your preferences fall if they don't win a seat, or you can vote "below the line," where you number individual candidates "1, 2, 3.....".

So, effectively, this ensures that anybody voting for Wikileaks is voting their real preferences. Since the above-the-line preference deals are unpopular, voters can't just vote the party line, or else they'll possibly get white supremacists in office. Instead they have to rank the candidates *as they'd actually like to see them governing*, or at least give a more accurate approximation than they otherwise would. Even though Wikileaks wants votes, it can't rationally want votes from people who just follow a party line, since that creates an environment where party politics thrive and transparency is reduced.

That's just an academic point though, I have no idea whether it is relevant to their thought process.
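As a toy illustration of the preference flow described above, here is a minimal single-winner instant-runoff count in Python (the candidate names and ballots are made up; real Senate counting uses the multi-winner single transferable vote with quotas, so this is only the simplest flavour):

```python
from collections import Counter

def instant_runoff(ballots):
    """Each ballot is a preference-ordered list of candidates
    ("below the line" style). Repeatedly eliminate the candidate with
    the fewest first preferences until someone has a majority."""
    candidates = {c for b in ballots for c in b}
    while True:
        # Count current first preferences, skipping exhausted ballots.
        tally = Counter(b[0] for b in ballots if b)
        leader, votes = tally.most_common(1)[0]
        if votes * 2 > sum(tally.values()):
            return leader
        # Eliminate the weakest candidate and strike them from all ballots.
        loser = min(candidates, key=lambda c: tally.get(c, 0))
        candidates.discard(loser)
        ballots = [[c for c in b if c != loser] for b in ballots]

ballots = ([["Wikileaks", "Greens", "Labor"]] * 2
           + [["Greens", "Labor", "Wikileaks"]] * 3
           + [["Labor", "Greens", "Wikileaks"]] * 4)
print(instant_runoff(ballots))  # Wikileaks is eliminated first; its
                                # ballots flow to Greens, who then win.
```

Note that the two Wikileaks voters still decide the outcome: their second preferences push Greens past Labor, which is exactly why ranking candidates honestly matters.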

Comment: Re:Inevitable consequence of unfettered capitalism (Score 1) 255

I think you shouldn't be immune to the observation that, in calling the parent's argument "based on emotion" and "deliberate propaganda," you yourself accuse your opponent of "promoting ... dictatorship and slavery".

Let's take a step back here. We are all geeks and we can approach governance from an engineering point of view. If I read you correctly, you're advocating a somewhat extreme social contract viewpoint in which the only legitimate function of government is to take us out of the state of nature by granting the government as minimal a monopoly on violence and power as possible. This view isn't unreasonable on its face, but there is more to it.

In any society, including human ones, there is a distribution of power and resources that comes from natural survival of the fittest. The basic idea of government is to voluntarily abdicate much of this power to a central authority (where central can mean tribal, state, federal, world, etc.). This creates the natural problem of a single point of attack: any group that can infiltrate the government will be able to use that centralized power for personal gain. This creates the need to impose laws on things like (1) who can serve in government (probably not felons, e.g.), (2) what limits members of government have on their personal influence (e.g., can anybody unilaterally declare war?), (3) limitations on the influence other powerful social actors have on the government (e.g., campaign funds, revolving doors), and many others. The point here is that it's a question of engineering: how do we organize the central authority so that the people have maximal freedom to do anything they want and don't have to fear being pushed around by anyone?

Where you and your opponent disagree most is in determining the strongest threat of government infiltration. The social contract tradition, to which you seem to subscribe, was concerned primarily with landed aristocracies using the power of government to do anything they liked. This was historically opposed by a class of business owners and farmers who favored representative democracy. But in the course of history many things have changed. America has no landed aristocrats, so we no longer worry about that. Instead, it has a class of people who believe that selfishness is a positive virtue, one far more important than telling the truth or helping others. This belief flourishes in America's political structure because it is easy for such people to lie in order to gain votes but then ignore the voters once in office. The parent believes that these people are using government to their advantage, to the detriment of the vast majority of the public.

From this point of view, the "free market" rhetoric is a device to convince the population that there is a moral imperative to turn every part of society into a money-making enterprise. Naturally, if you do this you end up with something very much like what you see in, say, Boardwalk Empire or any other popular depiction of organized crime. In fact, it should be rather obvious that organized crime is only crime because there are laws that prevent businessmen from engaging in certain profitable businesses, such as human trafficking, drug trafficking, war profiteering, and so forth. That raises the question of why any of these things is illegal in the first place.

And, quite obviously, the reason they are illegal is that in a truly free market, labor has the right to organize and the poor have the right to protest and vote. So in a free market, a government will voluntarily implement policies that limit the ways that businesses can exploit others for profit. The only way to get an "ideal" free market of the sort Milton Friedman advocates is to sneak pro-corporate laws onto the books or into the courts since naturally those laws will be opposed by anybody not directly benefited by them.

The mistake you make is that you don't see a free market as a process; you see it as a set of laws. But historically you never get that set of laws except through corruption and autocracy. For example, jobs moved offshore because other countries didn't have the labor protections we had in the US. When labor started organizing in those other countries (as it can in a free economy), the US used military and lethal force to impose autocratic rules that forbade it.

I would suggest educating yourself further on history, political science/philosophy, economics, and the like. If you believe that "social" means "anti-individual," then you have an extremely narrow perspective that I believe you will find wholly unsatisfying once you learn more. I don't mean this to be insulting, but your post reads a little like an iPad gamer claiming to know everything there is to know about gaming. You just kind of want to take the person aside and open his eyes to the wide and fascinating world of things beyond his ken.

Comment: Re:Fearmongering. (Score 2) 407

Hi, I posted the question. Maybe it's worth pointing out that I've used Linux and open source exclusively for well over a decade and have no interest in smearing anybody. I'm positive they have backdoors into closed-source programs, because it has been leaked that they have access to MS exploits before they're fixed, and one of the Snowden slides implied the UK has the ability to break BlackBerry encryption on devices owned by heads of state and diplomats. I assume open source is the *safer* option, but I want to know how safe.

That said, the link you posted *confirms* that US intelligence has tried to put backdoors in encryption libraries! That's really all the information we need. My understanding is that the NSA is far more advanced in cryptography than the FBI. It seems almost negligent for the head of the NSA not to attempt to put backdoors in open encryption libraries. Plus, they've had 13 years since the FBI attempt to learn from its mistakes. So if we haven't heard of the NSA doing it, it's reasonable to wonder whether that's because they're doing it extremely well.

Comment: Re:Historically, NSA have done the opposite. (Score 1) 407

As stated in the submission, although the NSA hardened the algorithm against most attacks, they lobbied to reduce the key length. Specifically, they wanted 48-bit keys instead of 64-bit. Perhaps there was a good reason for this, but on the face of it, shorter keys weaken the algorithm against brute-force attacks. It may simply have been that, at the time, computing power was the NSA's best advantage.
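The practical effect of a shorter key is easy to quantify: each bit removed halves the keyspace, so going from 64-bit to 48-bit keys makes exhaustive search 2^16 times cheaper:

```python
# Each bit of key length doubles the brute-force search space,
# so the cost ratio between the two key sizes is 2**(64 - 48).
keyspace_64 = 2 ** 64
keyspace_48 = 2 ** 48
print(keyspace_64 // keyspace_48)  # 65536
```

In other words, an attacker who can afford to search a 48-bit keyspace needs roughly 65,000 times more work for 64-bit keys, which is exactly the kind of margin that matters to whoever has the most computing power.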

Comment: Re:This is stupid (Score 2) 407

Hi, I wrote the submission. To fearmonger is to exaggerate some threat, using fear to promote some specific end. This question is me asking to what extent caution is justified, so that I as a user can know what to do. I'm sure you can see how those things are extremely different, in fact opposite: one is an attempt to drive action with fear you know is unjustified; the other is an attempt to systematically determine the appropriate amount of caution.

"This is fearmongering" seems inappropriate as a response to a submission that contains only links to Wikipedia documenting known facts and that even goes so far as to call some proponents of this theory paranoid.

That said, thanks for the link.

+ - How worried should we be about NSA backdoors in open source and open standards? 1

Submitted by quarrelinastraw
quarrelinastraw writes: For years, users have conjectured that the NSA may have placed backdoors in security projects such as SELinux and in cryptography standards such as AES. However, I have yet to see a serious scientific analysis of this question, as discussions rarely get beyond general paranoia facing off against a general belief that government incompetence plus public scrutiny makes backdoors unlikely. In light of the recent NSA revelations about the PRISM surveillance program, and the fact that Microsoft tells the NSA about bugs before fixing them, how concerned should we be? And if there is reason for concern, what steps should we take individually or as a community?

History seems relevant here, so to seed the discussion I'll point out the following for those who may not be familiar. The NSA opposed giving the public access to strong cryptography in the 90s because it feared cryptography would interfere with wiretaps. They proposed a key escrow program so that they would have everybody's encryption keys. They developed a cryptography chipset called the "Clipper chip" that gave a backdoor to law enforcement and which is still used in the US government. Prior to this, in the 1970s, the NSA tried to change the cryptography standard DES (the precursor to AES) to reduce its key length, effectively making the standard weaker against brute-force attacks of the sort the NSA would have used.

Since the late 90s, the NSA appears to have stopped its opposition to public cryptography and instead (appears to be) actively encouraging its development and strengthening. The NSA released the first version of SELinux in 2000, 4 years after they canceled the clipper chip program due to the public's lack of interest. It is possible that the NSA simply gave up on their fight against public access to cryptography, but it is also possible that they simply moved their resources into social engineering — getting the public to voluntarily install backdoors that are inadvertently endorsed by security experts because they appear in GPLed code. Is this pure fantasy? Or is there something to worry about here?

Comment: Re:Averages with how much deviation? (Score 1) 186

by quarrelinastraw (#42273337) Attached to: Netflix Ranks ISP Speeds

I'm just a little confused about your wording. I'm not sure what it means for an average to be better or for betterness to apply. I can see at least two reasons to want to know about standard deviation. One is that you want to know about how ISP speeds differ according to other variables, like neighborhood. Another is to see whether the difference in mean ISP speeds is "significant."

If you're a frequentist, then the probability that ISP A and ISP B have different speeds (so that one is larger than the other) is 1, so sigma here is 0. P(A = B) = P(A − B = 0) corresponds to integrating over the degenerate interval [0, 0], which must be 0 (more generally, it's a measure-zero set). But people report sample standard deviation estimates (which are not sigma, but estimates of sigma) to show how precise the measurement is. In this case, since there are so many data points, I would imagine the sample standard deviations aren't that relevant.
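A toy simulation (all numbers made up) shows why the sample standard deviation stops mattering at Netflix-scale sample sizes: the standard error of the difference in means shrinks like 1/sqrt(n), so even a large per-sample spread yields a very precise mean estimate:

```python
import math
import random
import statistics

# Two hypothetical ISPs with true mean speeds 20 and 20.5 Mbit/s and a
# large per-sample standard deviation of 5 Mbit/s.
random.seed(0)
a = [random.gauss(20.0, 5.0) for _ in range(10_000)]
b = [random.gauss(20.5, 5.0) for _ in range(10_000)]

# Standard error of the difference in means: sqrt(s_a^2/n + s_b^2/n).
diff = statistics.mean(b) - statistics.mean(a)
se = math.sqrt(statistics.variance(a) / len(a)
               + statistics.variance(b) / len(b))
print(f"difference = {diff:.2f}, standard error = {se:.3f}")
```

With 10,000 samples per ISP the standard error is around 0.07 Mbit/s, so a 0.5 Mbit/s gap in means is unambiguous even though individual measurements routinely differ by 10 Mbit/s or more.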

If you're more curious about what the average individual might see in terms of speed, then you probably want more info about speed broken up by, say, location. And that is more about giving conditional distributions than about standard deviation.
