Comment Re:Not so sure about the language... (Score 3, Interesting) 216

Indeed, and it appears that this is actually the goal of the project, per the original announcement
  http://blog.stephenwolfram.com...
The scary bit is that many of the "novelties" announced there (e.g. the homogeneous treatment of input, output and data) are actually quite old ideas in the arena of functional programming (Lisp and Scheme are built upon these foundations)... sometimes they work nicely; often you risk ending up with academic exercises.
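To illustrate how old and simple that core idea is, here is a toy Python sketch (purely illustrative, and nothing to do with how the Wolfram Language is actually implemented) in which a program is just an ordinary nested data structure that gets evaluated:

import operator

# The classic Lisp/Scheme idea: code, input and output are all the same kind of data.
OPS = {"+": operator.add, "-": operator.sub, "*": operator.mul}

def evaluate(expr):
    if not isinstance(expr, tuple):       # atoms evaluate to themselves
        return expr
    op, *args = expr                      # a tuple is an application: (op, arg1, arg2, ...)
    return OPS[op](*[evaluate(a) for a in args])

program = ("+", 1, ("*", 2, 3))           # the program is plain data...
print(evaluate(program))                  # ...and its output (7) is data of the same kind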
I myself am not too keen on "revolutionary technologies" which should rather be considered "evolutionary developments" (even when the evolution actually provides something new and useful)!

What is new here should be the integration with a massive database of `facts' and the possibility of performing elaborate queries, relying on `ready-made' algorithms.
This is very convenient and potentially useful but
  a) it has little to do with `programming' per se; it is a programmatic interface to a knowledge-based system (where the knowledge itself also includes the algorithms being requested)
  b) it is opaque, in the sense that there is little control over which code is acting on which data: many of the functions effectively act as black boxes, and it is not straightforward to see how to take control of the system and/or understand what is actually being done in order to produce an answer.

A further remark: the control mentioned in (b) is most of the time not required at all (we just want to get a rough picture of something), but it is essential e.g. for scientific applications.

Comment Not so sure about the language... (Score 4, Insightful) 216

As much as I would like to be impressed, what I see is quite underwhelming: a functional application language with some interface to "facts" and "databases" and a pattern-matching engine might make some analyses easier, but... the principles of the language are mostly what you come to expect if you have seen Lisp or any modern functional language, e.g. Haskell.

I can see it as being useful, but as another commenter pointed out, "FindShortestTour" is a library function (which might be handy), but definitely not an example of how concise the language might be; the same could be said about "EdgeDetect" or the like. The power of the language should be measured by how easily it can be extended or non-trivial algorithms can be implemented in it... not by how many functions are offered (even if that is certainly convenient).
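To make the point concrete, here is a rough Python sketch; find_shortest_tour below is a hypothetical stand-in (not a real API) for a built-in like "FindShortestTour". Calling a ready-made routine is one line in any language; what actually exercises the language is writing even a naive heuristic yourself:

from math import dist

# One line, but only because the work is hidden in a library:
# tour = find_shortest_tour(cities)   # hypothetical built-in

def nearest_neighbour_tour(cities):
    # A hand-written (if naive) greedy TSP heuristic: how easily a language lets
    # you express this is a better measure of its power than its library size.
    remaining = list(cities)
    tour = [remaining.pop(0)]             # start from the first city
    while remaining:
        last = tour[-1]
        nxt = min(remaining, key=lambda c: dist(last, c))
        remaining.remove(nxt)
        tour.append(nxt)
    return tour

print(nearest_neighbour_tour([(0, 0), (3, 4), (1, 1), (5, 0)]))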

Comment Re:LaTex plugin (Score 1) 84

Actually LaTeX was not written by Donald Knuth; it is a macro language built upon TeX by Leslie Lamport (who is also a very remarkable computer scientist, but I suppose he is not who you were thinking of).
Yes, the grammar is horrible, but once the basics of the language are mastered it feels like quite a natural setting in which to write text; if you want to implement a program in it, on the other hand, things are not so "easy". Look at http://stackoverflow.com/questions/2968411/ive-heard-that-latex-is-turing-complete-are-there-any-programs-written-in-late
This being said, the people at http://www.luatex.org/ are doing a really good job of integrating the TeX engine with Lua and exposing its internals, so as to offer a "reasonable" programming language both for tuning the typesetting and for actually implementing algorithms within the documents.

Comment Re:LaTex plugin (Score 1) 84

Not really; if you are a professional mathematician, then you almost certainly have a good and fluent command of LaTeX; as such, the issue of a GUI is hardly relevant (and it feels "natural" to write equations in a certain way).

If, on the other hand, you just want to write some formulas on your web page, then I concur that LaTeX might be the wrong technology; it is also a technology which is not so easy to integrate with GUI tools as soon as what you want to do is non-trivial. (Speaking of which, I would like to point out that most CAS support one form or another of TeX output for their results; the code they produce is "interesting", to say the least, and I always find it more convenient to retype the formula than to copy and paste what is needed.)

Comment Re:LaTex plugin (Score 1) 84

For a human?
  Absolutely nothing: I actually sort of like the example, even if you would need at least

\catcode`/=0 /def/implies{/Leftrightarrow}

somewhere in your file for the "/implies" to work ;))
This is exactly one of the problems I was pointing out: too much flexibility is not so good in this case.

For a computer?
Well, a computer can also do a good job of printing that out; as I said, MathJax does render such an expression (provided you do not use external macros, etc.) within a web page. Actually this is what I use on my web page when I put the abstract of a paper online; ScienceDirect (a service by Elsevier) also makes the full texts of papers available using the same trick. Yet it is fairly clear that this looks more like a stopgap measure than a solution to the problem: the standard for mathematics on the web should not be designed for humans, but rather for ease of parsing and processing (within a DOM) by machines, with "sort of" standard techniques and tools.

Comment Re:LaTex plugin (Score 1) 84

Well, several reasons ...
First of all, LaTeX is a macro language whose goal is to typeset documents, not render web pages: it is based upon the TeX engine and several of its extensions (nowadays I feel very comfortable with LuaTeX and I really enjoy the extra flexibility it provides). Its goal is to compose pages (encapsulating some of the typographic best practices in algorithmic form), not just to write math.
The syntax of TeX math mode is a handy way to write formulas and feels very natural (at least once you get used to it) and comfortable, but it is by no means perfect.
In particular, as soon as you need a non-trivial layout (e.g. for commutative diagrams), it requires you to understand how things will be rendered on the page and/or to write your own custom macros in order to get a decent presentation (nowadays usually with a mixture of the tikz/pgf packages; in olden times it was a matter of building up vboxes and hboxes by hand). Clearly, none of this is acceptable if you are thinking about a standard for web pages like MathML.

I have found the MathJax project to do almost what you want: render mathematical formulas on a page in an "almost portable" way, starting from a description of them in a standard restricted subset of TeX (the subset almost everybody uses, unless there are `special needs'); it solves a real problem and it does its job in a commendable way, but it is by no means a proper solution.

In summary, there are at least these problems to consider:
  1) LaTeX is a document description language (among other things), not just a language for typesetting formulas; here we are dealing with a subset of the TeX math mode operators (e.g. we do not want custom macros)
  2) the typesetting algorithm of LaTeX does not play nicely with the model used by HTML5 and is based on different assumptions
  3) parsing a LaTeX document is actually very different from parsing an XML one: as long as the document is "trivial", the conversion is sort of straightforward (\begin ... \end become opening and closing tags, etc.), but as soon as you want to exploit some of the flexibility afforded by the system things get ugly very quickly.
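To illustrate point 3, here is a rough Python sketch of the "trivial" conversion; it only works as long as the input uses no custom macros or catcode tricks, which is exactly the flexibility that makes the general problem hard (the environment names are arbitrary examples):

import re

def naive_latex_to_xml(src):
    # \begin{env} ... \end{env} becomes <env> ... </env>; nothing else is handled.
    src = re.sub(r"\\begin\{(\w+)\}", r"<\1>", src)
    src = re.sub(r"\\end\{(\w+)\}", r"</\1>", src)
    return src

print(naive_latex_to_xml(r"\begin{abstract}Hello \begin{center}world\end{center}.\end{abstract}"))
# -> <abstract>Hello <center>world</center>.</abstract>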

lg

Comment Re:Certificate Authorities compromised? (Score 1) 527

So what?
An SSL certificate is used just to provide end-to-end encryption, not to protect the storage.
As such, it is sort of pointless to wonder whether the root certificate used by any major provider is known to some federal agency or not... it is much easier to ask the owner of the server for its contents than to intercept the communication.

This being said, it appears that Lavabit used encrypted storage as well, but there is something amiss in the way the protocol was implemented, I fear.
(I have never used their service, so it might be that I am grossly misreading things: corrections would be very welcome!)

Let me explain: as long as encryption and decryption are performed by a remote server, there is no guarantee that data will not be captured (ok... homomorphic encryption might change part of the scenario; unfortunately it is far from practical nowadays and will likely remain so for the next 5-10 years).
There are basically three approaches I can think of:
  1. perform decryption with a custom program on the client: the key is never sent "in clear" and the server just owns a public key to encrypt data as soon as they are received; however there is a window in which the server knows the plaintext (i.e. before writing it down to permanent storage) and might copy it.
[the sensible option is to ask people to use gpg and then rely on public servers, trusting the cryptography]
  2. perform decryption locally in a JavaScript client in the browser. This might actually work, and with the proper setup it is also possible to use public key algorithms
  (basically the user uploads to the server a copy of her private key encrypted with a symmetric algorithm, together with the public key; upon a decryption request the server sends this packet to the JavaScript app, which decrypts it locally; then, once the private key is recovered, it moves on to locally decrypt every single datum stored remotely). There is the same disadvantage as in 1 here, in the sense that the server can copy the data while they are "in clear", but no special client is required. I point out, however, that in this scenario it is possible for the server to serve a compromised JavaScript page which also uploads the secret key as soon as decryption is requested; as such, the attack surface is larger.
  3. perform decryption remotely by providing a symmetric (and/or private) key. Here it is purely a matter of trusting that the server administrators will neither clone the data (which, admittedly, they could also do in scenarios 1 and 2) nor keep a copy of the key as provided. This is the simplest solution, but also the least safe of them all.

In summary: do not trust anybody to do cryptography in your stead (unless you work on homomorphic encryption, of course ;-) ) and least of all to decrypt any data; if you need secure (in the sense of 'secret') mail, require all parties to use client applications that perform the encryption on their own machines and not to delegate it to any third party (third parties might be used to store encrypted data, though).
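As a minimal sketch of that principle in Python (using the third-party `cryptography' package, which is my choice for the example and not something mentioned above; in practice the client-side tool would simply be gpg): the key never leaves the user's machine and the server only ever stores an opaque blob.

from cryptography.fernet import Fernet

# --- on the client ----------------------------------------------------------
key = Fernet.generate_key()            # generated and kept on the client only
box = Fernet(key)
ciphertext = box.encrypt(b"meet me at the usual place")

# --- on the (untrusted) server ----------------------------------------------
stored_blob = ciphertext               # the server sees ciphertext, never the key

# --- back on the client ------------------------------------------------------
plaintext = Fernet(key).decrypt(stored_blob)
assert plaintext == b"meet me at the usual place"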

Comment Re: Mixing the signals (Score 1) 168

The point is not so much that the cipher would be weaker, as that it would be no stronger than using just one of them, and there are some cases where it could actually be as weak as the weaker of the two. For instance, you do not gain anything under a "known plaintext" scenario.

Consider this case: you have an enciphering machine (say E) and you want to recover the keys being used by probing its behaviour with a series of texts (which are either `random' or suitably chosen by you).
If E(m, k1|k2) = B(A(m, k1), k2)
where A, B are your original systems and k1, k2 the respective keys, we might try to mount an attack by targeting the intermediate stream between A and B.
There is a slight security advantage, as a chosen-plaintext attack on E becomes a known-plaintext attack on B (the chosen plaintext is m; the known one is A(m, k1)), but if B is vulnerable the attacker can recover k2 and strip the second layer of encryption. We are then left with attacks against A under a known-plaintext model (which may or may not work). This is a variant of the usual "meet in the middle" approach used against 2DES; if you want a direct parallel, just consider looking for collisions (x, y) satisfying
B^(-1)(E(m, ?), x) = A(m, y)
where "?" denotes an unknown key.
A particular case is when x=y *as a design decision*. If this turns out to be the case (argued as "256 bits should be enough for anybody!" or the like), then it is actually the weaker cipher which matters (and not the stronger one).
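To make the meet-in-the-middle idea concrete, here is a toy Python sketch; A and B are deliberately weak one-byte "ciphers" with tiny key spaces (nothing like real algorithms), but the structure of the attack against the cascade is the same.

def cipher_a(block, k1):
    return bytes(b ^ k1 for b in block)             # toy cipher A

def cipher_b(block, k2):
    return bytes((b + k2) & 0xFF for b in block)    # toy cipher B

def cipher_b_inv(block, k2):
    return bytes((b - k2) & 0xFF for b in block)

def cascade(block, k1, k2):
    return cipher_b(cipher_a(block, k1), k2)        # E(m, k1|k2) = B(A(m, k1), k2)

def meet_in_the_middle(plain, cipher):
    # Tabulate A(m, k1) for every k1, then for every k2 check whether
    # B^(-1)(c, k2) hits the table: ~2*256 trials instead of 256*256.
    forward = {cipher_a(plain, k1): k1 for k1 in range(256)}
    for k2 in range(256):
        middle = cipher_b_inv(cipher, k2)
        if middle in forward:
            yield forward[middle], k2

m = b"attack at dawn"
c = cascade(m, k1=0x5A, k2=0x21)
print(list(meet_in_the_middle(m, c)))               # candidate key pairs, including (0x5A, 0x21)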

Furthermore, it may well be that there are distinguishers for the first cipher under consideration; if that turns out to be the case, an attacker can infer strong statistical properties of the input stream to the second cipher, which could be exploited.

Comment Re: Mixing the signals (Score 1) 168

Nope. Two weak ciphers do not make a strong one, just a mess.
This is not to say that a cryptosystem should not be designed from basic (and rather insecure) primitives suitably chained and iterated: this is actually the case for all modern block ciphers from Feistel-style networks to the AES. The point is that it is not sensible to rely for security on the rather unpredictable interactions between different encryptions and the actual risk is indeed a false sense of security.

A different problem is whether it makes sense to have "replaceable" encryption algorithms, as suggested. In the case of public-key systems this would not be a good idea, as the properties, security parameters and behaviour might be widely different (even in comparable usage scenarios) and unexpected weaknesses might appear. As for block ciphers, they are sort of supposed to be interchangeable (for a given block and key length); however, it has to be considered that a negotiation protocol might always be fooled by an attacker into selecting the "weakest" (in some sense) algorithm.

In short: in cryptography, flexibility can be (and usually is) a liability rather than an advantage. The best course of action is to be able to fully audit a "simple" implementation (and be somehow able to guarantee some security) rather than leave too much room for unsuspected attacks.
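As a toy illustration of the negotiation problem mentioned above, here is a hypothetical Python sketch (the suite names and the protocol are made up): an attacker who can tamper with the offered list forces both honest parties to "agree" on the weakest option.

PREFERENCE = ["strong-cipher", "ok-cipher", "weak-cipher"]   # server's order of preference

def negotiate(client_offer):
    for suite in PREFERENCE:
        if suite in client_offer:
            return suite
    raise ValueError("no common cipher suite")

honest_offer = ["strong-cipher", "weak-cipher"]
tampered_offer = [s for s in honest_offer if s == "weak-cipher"]   # man in the middle strips the strong option

print(negotiate(honest_offer))     # -> strong-cipher
print(negotiate(tampered_offer))   # -> weak-cipher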

Comment Madness (Score 5, Informative) 168

A generalized weakening of cryptography is the last thing I would have expected to follow from the documents about the extensive spying done by the NSA.
While it is true that some algorithms might have been deliberately weakened by the NSA, I doubt this could have been systematic, especially for those algorithms which have been most thoroughly investigated by the cryptological community at large.
  In particular, the NIST-mandated cipher suites, while definitely amenable to some theoretical attacks in some cases, have been independently investigated and, as of today, no effective practical attack is known against AES. I would never trust a 'homemade' algorithm for anything, nor waste time trying to analyse it (cryptography is actually part of my job) unless there were some really compelling reasons for doing so (e.g. interesting mathematics, peer review requests or unusual attack models being considered).
Skein and Twofish are definitely interesting algorithms, and they were also well regarded in the competitions leading to SHA-3 and AES; they are definitely not a bad choice, but choosing them because whatever has been selected by NIST is "tainted" by the NSA (rather than for other architectural or practical considerations) resembles a form of superstition more than anything else.

Comment What programming language? "Go" or "Go!"??? (Score 2) 150

Please: observe that "Go!" and "Go" are two quite different programming languages.
The client appears to be written in "Go", which is the language from Google, but the headline would suggest "Go!" by McCabe & Clark.
I find the same ambiguity in the text... still, I am looking forward to the time when we shall use the full Unicode range in order to have similar-looking, yet entirely different, names. That shall be fun!

Submission + - Extended TeX: past, present, and future (latex-community.org) 1

Hamburg writes: Frank Mittelbach, member of the LaTeX Project and LaTeX3 developer, reviews significant issues with TeX that were already raised 20 years ago. Today he evaluates which issues have been solved, and which still remain open and why.
Example issues are managing consecutive hyphens, rivers of vertical space and identical words across lines, grid-based design, weighted hyphenation points, and overcoming the mouth/stomach separation. Modern engines such as pdfTeX, XeTeX and LuaTeX are considered with regard to solutions of important problems in typesetting.

Comment Re:Emacs (Score 1) 545

I find the combo
emacs with nxhtml and CEDET (for php)
very nice and powerful (the only inconvenience is the startup time with CEDET, but that is what you get if you want semantic analysis performed by the editor).
I seem to remember there should also be something running under Eclipse, though.

regards,
  lg

Comment Re:ECB Mode is totally insecure (Score 1) 101

Writing parallel code is difficult. Writing parallel code which makes sense is even more so. Actually, if you have a quad-core CPU and do ECB instead of CBC, then you can manage a 4x increase in performance... no need to use a GPU!
(The reason is that ECB encryptions can be done in parallel, as each block is independent; for CBC you need to know the ciphertext of the previous block in order to produce that of the current one.)
A counter mode (CTR) might make sense for ecryptfs, but the security analysis is definitely non-trivial to make.
Actually, it is amateurish at best to say that this implementation of ecryptfs "is not a toy"...
(per http://code.google.com/p/kgpu/wiki/IozoneBenchmarkResults )
it is, in fact, something which seriously compromises security.
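To illustrate why ECB parallelises trivially while CBC encryption does not, here is a toy Python sketch; toy_encrypt below stands in for a real block cipher (it is in no way secure) and the rest of the names are made up for the example.

from concurrent.futures import ProcessPoolExecutor

BLOCK = 16

def toy_encrypt(block, key):
    return bytes((b + key) & 0xFF for b in block)   # stand-in for AES; NOT secure

def xor(a, b):
    return bytes(x ^ y for x, y in zip(a, b))

def ecb_encrypt(blocks, key):
    # Every block is independent, so the work can be spread over all cores.
    with ProcessPoolExecutor() as pool:
        return list(pool.map(toy_encrypt, blocks, [key] * len(blocks)))

def cbc_encrypt(blocks, key, iv):
    # Block i needs the ciphertext of block i-1: an inherently serial loop.
    out, prev = [], iv
    for block in blocks:
        prev = toy_encrypt(xor(block, prev), key)
        out.append(prev)
    return out

if __name__ == "__main__":
    data = [bytes([i] * BLOCK) for i in range(8)]
    print(ecb_encrypt(data, key=7))
    print(cbc_encrypt(data, key=7, iv=bytes(BLOCK)))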
