
Comment Re:Ignore the Critics, Research is Necessary (Score 1) 190

Agreed on all points, though I'd also side with femtobyte in that profiteers make horrible scientists. $100 million is peanuts, as the original article notes, but that is only a bad thing if the project operates in complete isolation. If it cooperates with the Connectome Project and other neurological studies, this study could be quite useful. But that is only true if the division of labour is correct. You cannot break a scientific project into N sub-projects at random, even $100 million ones. If everyone got together and discussed who is best placed to do which part, the results could be extremely valuable.

Even more so, when you consider that a 13T MRI scanner capable of handling humans should be online just about now. Since that has already been built, the cost of building it is effectively zero. The resolution achievable from such a scanner, however, should be nothing short of spectacular.

Can you even begin to imagine the advances achievable from a consortium of Connectome researchers, high-end (9.3T and 13T) MRI labs, and this new foundation?

Ok, now that you've imagined it, stop. We're talking politicians, scientists under publish-or-perish rules, get-rich-quick corporations and corrupt "advocacy". There's no possible way any of those involved will be capable of doing what they should do.

Comment Re:What about pictures? (Score 1) 300

Excellent! At this rate, by the time the thread is frozen, we'll have beaten DPI and other newspaper publishing systems. (Ok, ok, I'll be honest, we've already beaten most newspaper publishing systems.)

Not messed with tikz, but will take a look.

The main problem I've had with TeX and its subsystems for vectors is that it's actually very difficult to snap to points, or to define relationships between vectors. Normally, this is a non-issue - TeX's built-in maths has perfectly good precision for most purposes, so provided the functions are defined correctly, you don't get freaky rounding errors or endpoints in the wrong place. There are pathological cases, however, where certain shapes only scale correctly by certain amounts, and you need fiddly conditionals and other hacks. Since most engineering and maths software has had workarounds for almost as long as TeX has existed, and since adding them would be an extension to the syntax (so retaining backwards compatibility, just as LuaTeX is backwards compatible with TeX), there is no reason such solutions couldn't exist in TeX as well.
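From what I can tell, tikz does have named coordinates and paths defined relative to them, which is exactly the kind of point-snapping I mean. A minimal, untested sketch (assumes the calc library for the midpoint syntax):

  \documentclass{article}
  \usepackage{tikz}
  \usetikzlibrary{calc}
  \begin{document}
  \begin{tikzpicture}
    \coordinate (A) at (0,0);            % named anchor points
    \coordinate (B) at (3,1);
    \coordinate (C) at ($(A)!0.5!(B)$);  % C snaps to the midpoint of A--B
    \draw (A) -- (B);
    \draw (A) -- ++(0,2) -- (B);         % relative move from A, then back to B
    \fill (C) circle (1.5pt);            % mark the computed midpoint
  \end{tikzpicture}
  \end{document}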

It may well be that tikz solves 99.9% of all the cases I'm concerned about. If so, great. If not, the system is built to be infinitely extensible. I'll get round to it. Maybe. Or wait for a new package on the TeX archive.

Comment Re:What about pictures? (Score 2) 300

Think it's graphicx. One of the packages, anyway, lets you include PNGs, JPGs, etc. No problem. I include graphics all the time with LaTeX, very few of which are EPS. True, graphics import isn't as clean as I'd like (it's a bugger to remember all the nuances of each graphics format and which package you need to use it with).
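For reference, the graphicx route is about as simple as it gets (filenames here are placeholders):

  \documentclass{article}
  \usepackage{graphicx}   % handles PNG, JPG and PDF under pdflatex
  \begin{document}
  \includegraphics[width=0.8\textwidth]{photo.png}   % placeholder filename
  \includegraphics[scale=0.5]{diagram.jpg}           % placeholder filename
  \end{document}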

I also don't like the fact that vector images require you to master Asymptote, Metapost and an armful of other systems. This can - and should - be massively cleaned up.

So, whilst I agree that TeX has crappy image handling, it's not nearly as bad as you depict.

Comment Re:Old tech, and limited (Score 3) 300

TeX has control elements for describing structure, since structure is a key part of typesetting. Since these elements are macros, they're programmable, although not truly abstract as in XML. About the only thing I can think of that XML can do for document structure that TeX cannot is out-of-order elements, and I'd argue that out-of-order is incompatible with structure.
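To make "structure via programmable macros" concrete - the standard sectioning commands are macros like everything else, and the structure they build is addressable:

  \documentclass{article}
  \begin{document}
  \section{Results}\label{sec:results}   % structural element, itself a macro
  \subsection{Details}
  See Section~\ref{sec:results} for the summary.  % structure can be referenced
  \end{document}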

In database terminology, XML is a key-data pair system. The data can be anywhere in the XML file and you need some sort of key to know where it is and/or when you've found it. (Since XML is not organized, you can't do random access to get at the key. You have to load it in and organize it, in which case it isn't XML, or you have to sequentially search it.)

TeX is a semi-sequential structure, with relationship links between specialized data tables. Again in database terms, it's a set of batch sequential files with crude but useful support for concrete data association. Because it's batch sequential, real-time usage gets hairy. Big deal. Those in the middle of writing should be concerned with the writing. It would be nice if editors had better error-detection, but it's not usually that critical.

Comment Re:Old tech, and limited (Score 3, Informative) 300

Never had any problem writing books in LaTeX. The main difficulty has been in deciding whether I want a modern or medieval structure.

Docbook, on the other hand, I hated. I helped with the writing of a few chapters of the Linux Advanced Traffic Control book, which was abandoned in part because Docbook was such a disgusting system.

XML is useless for typesetting. It's not really that useful for organizing anything, either - you may have used XML-driven databases, but you'll never have used an XML-driven database that had any real performance or serious functionality. (LaTeX doesn't do databases, either, but it doesn't pretend to. It has external engines for databases, which are actually quite nice.)
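A sketch of the kind of external engine I mean - BibTeX being the obvious example, with the data living in its own file (the entry below is just illustrative):

  % refs.bib - a separate data file, maintained outside the document
  @article{knuth84,
    author  = {Donald E. Knuth},
    title   = {Literate Programming},
    journal = {The Computer Journal},
    year    = {1984}
  }

  % paper.tex - LaTeX pulls the data in and formats it
  \documentclass{article}
  \begin{document}
  Knuth made the case for this years ago~\cite{knuth84}.
  \bibliographystyle{plain}
  \bibliography{refs}
  \end{document}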

Web pages? Never had any problem embedding HTML in LaTeX. In fact, I have very very rarely found ANY document style to be LaTeX-incompatible. Load up the correct document type, load up the appropriate stylesheets and you're good. Yes, spiral text is hard. Yes, embedding HDR images can be a pain. Yes, alpha blending isn't that hot. But how often do you use any of these for owner's manuals or contracts?

There are more table classes than I'd really like, and some of the style coding is scruffy, but I challenge anyone to find a genuine, common document type that LaTeX* cannot do as well as or better than any non-TeX word processor, DTP solution or XML-based system. (Non-TeX means you can't compare TeX with Scientific Word, TeXmacs or any other engine that uses TeX behind the scenes.)

(To make it absolutely clear, "as well as or better than" can refer to any one or more parameters. So if I get better-quality output, that's better than. If I can achieve comparable results with cleaner, easier-to-maintain syntax, that's also better than. To win, your solution has to not merely equal but actually exceed what I can do on EVERY parameter, or you have failed to demonstrate something that supersedes.)

A bitcoin to anyone who can do this.

*I am including all dialects of LaTeX here, so LuaLaTeX, PDFTeX, etc, are all things I can consider on my side, as are all WYSIWYG and WYSIWYM editors, Metapost, supplemental services, style sheets, etc. Since this is versus a specific alternative, anything comparable for that specific alternative is fair game for you to use, but you can't mix in other alternatives. It has to be one versus the complete TeX family if you want to prove your point.

Comment Re:TeX for Math (Score 5, Interesting) 300

Well, with WebKit up the proverbial creek these days, a new rendering engine would make sense.

The question would be whether you could create a TeX-alike engine that supports the additional functions required in HTML and can convert any well-formed SGML document into a TeX-alike document. If you could, you could have one rendering engine and subsume HTML and XML entirely within it.

The benefits of doing this? The big drawback of style sheets is that no two browsers agree on units. TeX has very well-defined units that are already widely used. These also happen to be the units industry likes using. Eliminating browser-specific style sheets would be an incredible benefit.
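And those units are exact by definition, not negotiable per renderer:

  % TeX's units are fixed: 1in = 72.27pt = 25.4mm, by definition.
  \documentclass{article}
  \usepackage{geometry}
  \geometry{paperwidth=210mm, paperheight=297mm, margin=1in}
  \begin{document}
  A rule exactly 5cm wide and 0.4pt high:\par
  \rule{5cm}{0.4pt}
  \end{document}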

The big drawback of the Semantic Web is that everyone, their brother, cat and goldfish have designed their own ontologies, none of which interoperate and few of which are any good for searching with SPARQL. LaTeX has a good collection of very standard, very clean methods for binding information together. Because it's standard, you can have pre-existing ontology libraries which can be auto-populated. And because LaTeX is mostly maintained by uber-minds, rather than Facebook interns during their coffee break, those ontologies are likely to be very, very good. Also, microformats will DIE!!!! BWAHAHAHAHAHAHAHAHA!

The big drawback with HTML 5 is that the W3C can't even decide if the standard is fixed, rolling or a pink pony. TeX is a very solid standard that actually exists.

Ok, what's the downside of TeX? There's no real namespace support, so conflicts between libraries are commonplace. I'm also not keen on having a mixture of tag logic, where some tags have content embedded and others have the content enclosed with an end tag. It's messy. Cleanliness is next to Linuxliness.
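To illustrate the mixed tag logic I mean:

  \documentclass{article}
  \begin{document}
  % Some commands take their content as a braced argument...
  \textbf{bold text}
  \section{A heading}
  % ...while others enclose it between explicit begin/end ``tags''.
  \begin{quote}
  A quoted paragraph.
  \end{quote}
  \end{document}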

Parsing client-side is a mild irritant, but let's face it. AJAX is also parsing client-side, as is Flash, as are cascading style sheets, etc, etc. The client is already doing a lot (one reason nobody has a fast browser any more), so changing from one set of massive overheads to another really wouldn't be that much of a pain.

Ok, so if we consider TeX the underlying system, do we need a TeX tag? No. Rather, we would assume that all parts of a document not enclosed by an SGML tag are TeX. This would be a transitory state, since you could then write SGML-to-TeX modules for Apache, IIS and other popularish web servers. The world would then become wholly TeXified, as it should be.

Comment Re:TeX for Math (Score 3, Insightful) 300

Whereas now, people are still not accustomed to seeing correctly typeset documents and are now completely used to vast numbers of typos, malformed web pages, poor indexing via the semantic web, gratuitous XML, excessively long style sheets, browser incompatibilities, Javascript...

Comment Re:wtf? (Score 1) 605

You kill your cash cow IF your cash cow is lame and on life-support (basically the x86) AND you have a substitute cash cow that is so good, so novel and so ready that you can switch to it and gain more customers than you lose AND gain a time advantage over all your competitors of such magnitude that you'll have a very stable market before anyone is in a position to match you.

This does happen in the CPU world from time to time. The problem is that Intel failed to make its new product adequately mature from the start, which is what such a switch needs. If Intel had ditched its internal politics (which are horrible, BTW) and continued developing the Itanium until it got to where it is today BEFORE releasing it, it would have scored big. (The extra development time would also have been considerably shorter than you might expect, as there would have been more designing, more working, less blaming and less avoidance.) Intel LOST a remarkable opportunity to eliminate the x86 market, because they had (and have) much the same culture as the old IBM, NASA and Lockheed Martin - pushing for good headlines rather than good products. I guess it's also the same as the Medieval Catholic Church. In the end, there is only one possible result - a catastrophic collapse in confidence and capability, where you go from market leader to borderline extinction in pretty much an afternoon. Sinkholes, rot and corporate failure all share one thing in common - you see nothing until it's all over.

ARM and MIPS have pretty much total control over the mobile world and the embedded world. However, the PC world isn't going to vanish. It will change, though. Why? Because there'll be more action in the ARM and MIPS toolchains, because people will want programs to work seamlessly across devices (not just talk, but actually run on multiple systems), because manufacturing is cheaper when you've a common base architecture, because software companies don't like supporting multiple architectures. And that means PCs will have to switch to the same architecture as the mobile/embedded world, albeit with performance considerations rather than power.

Comment Re:They could.... (Score 1) 605

There are still a huge number of programs (especially for Windows) for which 32-bit versions are the only versions. This is by no means a good thing - and at this point, there is no excuse for it. (Size may have been a factor at one point, but if you're capable of running 64-bit code, you almost certainly have enough memory to cope with the larger pointer sizes.)

Firefox, for example, is threatening to abandon the 64-bit line. You lose 32-bit support, you lose Firefox support. Now, that might not be such a bad thing (it's bleedin' bloatware, it's slow, the distinction between plugins and extensions is annoying, I hate the menu tree layout, they release major releases too often, there's no UTF-32, and I'm sure there are other problems I could find if I could be bothered, and they should Get Off My Lawn!). However, Chrome and its clones don't have enough plugins yet, Opera isn't the performance giant it was, and there are really no other browsers out there of much significance.

Comment Re:Why would Intel want to kill the x86? (Score 1) 605

I would object to calling Intel's processors "the best". The best at what? I'd pit a MIPS64 CPU against an x64 CPU (equal cores, equal FPUs, equal bus speeds) for servers. I'd consider the T2 to be easier to radiation-proof than comparable Intel offerings. The AMULET processor series is the most original. For bus speeds, HyperTransport is still faster than PCI Express.

That doesn't mean Intel isn't "the best" at XYZ; what it means is that there is no generic "the best". There will ALWAYS be another processor that is better at something else.

Comment Re:evidence-based policy (Score 1) 1106

I have nothing against having the biggest government you could possibly have - 100% of the population actively working on some aspect or aspects of governance - provided that the government working on any given specific aspect is as small as is rational - i.e. not so small that it is essentially run by a single person as a personal dictatorship.

In fact, I have rather more objection to the current state of affairs, where voters and pundits pronounce on issues they lack the information or education to comprehend. Education I've addressed lots of times (we need more of it, it needs to be considerably more intensive, it needs to be considerably more extensive and it absolutely needs to be devoid of creationists and conspiracy theorists), but that only provides the necessary foundation. The only place to acquire experience and current factual information on government is in government itself.

There is one other benefit. It's easier for lobbyists to bribe/coerce/blackmail 300 people into voting specific ways than to bribe 300 million, so spreading governance across the whole population makes that kind of capture far harder.

Comment Re:Not the CPI please (Score 1) 1106

The "ideal minimum wage" would be the total cost of food, healthcare, transport and housing as recommended by the experts in each field for providing a quality and quantity of life such that for the simple majority of people on those wages, it would require a disproportionately large increase in wage to significantly impact that quality or quantity of life.

In other words, they are not suffering actual, measurable harm by being on the minimum wage versus the alternatives (leeching off others, going onto welfare, becoming a survivalist, becoming a member of the criminal class, etc).

(No, not all criminals are "just trying to survive", but anyone who is just trying to survive is going to consider crime a valid alternative to eating roots and berries.)

The "practical minimum wage" is the best available compromise between what wage-slave masters will tolerate and the ideal minimum wage. Wage-slave masters don't give an f about your health, even though it costs more to replace someone than to keep them healthy and functional. They don't give an f about anything beyond their day-to-day profits, the actual economics be damned. Odds are, many are guilty of criminal activity already, so to them one more crime means bugger all.

If you have no minimum wage, these people would pay as close to absolutely nothing at all as they possibly could. Some already pay nothing and use fear and intimidation to ensure there's no trouble. If you raise the minimum wage by anything at all, a few more bosses get added to that number. You can't escape it. But if you increase it by just the right amount, the number of people who gain truly functional lives outnumbers the number who get trampled on further.

Of course, if the US did something rational, like adopt universal "free" healthcare (ie: distribute the cost across everyone), legislate that nobody could be involuntarily homeless, and adopt other rules considered basic human standards by the rest of humanity, then you could actually LOWER minimum wage since there would be fewer expenses that minimum wage would need to cover.

Comment Re:Not an unexpected event.... (Score 1) 626

How do you get 3? At 10k per pop, 1 mil allows 100 sequences to be performed. Divide by 3 sources and they should be able to do 33 sequences each. The cost of sequencing is further reduced if you just want SNPs, as you can use a microarray for that. But sequencing the entire genome is cheap these days. The cost has fallen very rapidly and continues to do so. It really is at the point where garage enthusiasts can perform many of the basic steps and will be able to do at-home SNP discovery within a few years. Of course, at-home work isn't criminology-grade, but just as enthusiasts can do basic work for pocket money, crime labs can easily swallow the cost of a clean room plus an Illumina or Oxford Nanopore sequencer.
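Spelling the arithmetic out (in LaTeX, naturally):

  \[ \frac{\$1\,000\,000}{\$10\,000 \text{ per genome}} = 100 \text{ genomes},
     \qquad \frac{100 \text{ genomes}}{3 \text{ sources}} \approx 33 \text{ genomes per source}. \]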

(Oxford says it can get costs down to $1k per sequence within a year or two. They are also working on sub-$1k disposable sequencers, which may be useful.)
