
Comment Re:Old tech, and limited (Score 3) 300

TeX has control elements for describing structure, since structure is a key part of typesetting. Since these elements are macros, they're programmable, although not truly abstract as in XML. About the only thing I can think of that XML can do for document structure that TeX cannot is out-of-order elements, and I'd argue that out-of-order is incompatible with structure.
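
The point about programmable structure can be made concrete. A minimal LaTeX sketch — the `chapterabstract` environment is made up for illustration — showing a structural element defined as a macro pair that, unlike a bare XML tag, carries its own presentation logic:

```latex
\documentclass{book}

% A hypothetical structural element, defined as a macro pair.
% The begin/end code is programmable, so structure and styling
% live in one place.
\newenvironment{chapterabstract}
  {\begin{quotation}\itshape\noindent\textbf{Abstract.} }
  {\end{quotation}}

\begin{document}
\chapter{Typesetting}
\begin{chapterabstract}
  Structure and presentation, bound in one programmable element.
\end{chapterabstract}
\end{document}
```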

In database terminology, XML is a key-data pair system. The data can be anywhere in the XML file and you need some sort of key to know where it is and/or when you've found it. (Since XML is not organized, you can't do random access to get at the key. You have to load it in and organize it, in which case it isn't XML, or you have to sequentially search it.)

TeX is a semi-sequential structure, with relationship links between specialized data tables. Again in database terms, it's a set of batch sequential files with crude but useful support for concrete data association. Because it's batch sequential, real-time usage gets hairy. Big deal. Those in the middle of writing should be concerned with the writing. It would be nice if editors had better error-detection, but it's not usually that critical.

Comment Re:Old tech, and limited (Score 3, Informative) 300

Never had any problem writing books in LaTeX. The main difficulty has been in deciding whether I want a modern or medieval structure.

Docbook, on the other hand, I hated. I helped with the writing of a few chapters of the Linux Advanced Traffic Control book, which was abandoned in part because Docbook was such a disgusting system.

XML is useless for typesetting. It's not really that useful for organizing anything, either: you may have used XML-driven databases, but you'll never have used one with serious performance or functionality. (LaTeX doesn't do databases either, but it doesn't pretend to. It has external engines for databases, which are actually quite nice.)
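
The "external engines" remark presumably refers to tools like BibTeX; a minimal sketch (the file name `refs.bib` and the entry key are illustrative):

```latex
% refs.bib -- a record in the external database:
% @book{knuth84,
%   author    = {Donald E. Knuth},
%   title     = {The {\TeX}book},
%   publisher = {Addison-Wesley},
%   year      = {1984}
% }

\documentclass{article}
\begin{document}
Knuth's manual \cite{knuth84} is the canonical reference.
\bibliographystyle{plain}
\bibliography{refs}  % pulls records from refs.bib on demand
\end{document}
```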

Web pages? Never had any problem embedding HTML in LaTeX. In fact, I have very very rarely found ANY document style to be LaTeX-incompatible. Load up the correct document type, load up the appropriate stylesheets and you're good. Yes, spiral text is hard. Yes, embedding HDR images can be a pain. Yes, alpha blending isn't that hot. But how often do you use any of these for owner's manuals or contracts?
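
"Load up the correct document type, load up the appropriate stylesheets" amounts to a preamble like this (the package selection is illustrative, not prescriptive):

```latex
\documentclass[11pt,a4paper]{article}  % the document type
\usepackage{graphicx}   % images
\usepackage{hyperref}   % web-style links, as in HTML
\usepackage{booktabs}   % publication-quality tables

\begin{document}
See \href{https://example.org}{this link} for details.
\end{document}
```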

There are more table classes than I'd really like, and some of the style coding is scruffy, but I challenge anyone to find a genuine, common document type that LaTeX* cannot do as well as or better than any non-TeX word processor, DTP solution or XML-based system. (Non-TeX means you can't compare TeX with Scientific Word, TeXmacs or any other engine that uses TeX behind the scenes.)

(To make it absolutely clear, "as well as or better than" can refer to any one or more parameters. So if I get better-quality output, that's better than. If I can achieve comparable results with cleaner, easier-to-maintain syntax, that's also better than. To win, your solution has to not merely equal but actually exceed what I can do on EVERY parameter, or you have failed to demonstrate something that supersedes it.)

A bitcoin to anyone who can do this.

*I am including all dialects of LaTeX here, so LuaLaTeX, PDFTeX, etc, are all things I can consider on my side, as are all WYSIWYG and WYSIWYM editors, Metapost, supplemental services, style sheets, etc. Since this is versus a specific alternative, anything comparable for that specific alternative is fair game for you to use, but you can't mix in other alternatives. It has to be one versus the complete TeX family if you want to prove your point.

Comment Re:TeX for Math (Score 5, Interesting) 300

Well, with WebKit up the proverbial creek these days, a new rendering engine would make sense.

The question would be whether you could create a TeX-alike engine that supports the additional functions required in HTML and can convert any well-formed SGML document into a TeX-alike document. If you could, you can have one rendering engine and subsume HTML and XML entirely within it.

The benefits of doing this? The big drawback of style sheets is that no two browsers agree on units. TeX has very well-defined units that are already widely used. These also happen to be the units industry likes using. Eliminating browser-specific style sheets would be an incredible benefit.
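
TeX's units really are pinned down exactly — 1in is defined as 72.27pt, and 1pt as 65536sp (scaled points, TeX's integer base unit) — so a length means the same thing on every engine. A sketch:

```latex
\documentclass{article}
\newlength{\colwidth}
\setlength{\colwidth}{3.5in}  % 1in = 72.27pt exactly; 1pt = 65536sp

\begin{document}
\noindent
\parbox{\colwidth}{This box is 3.5in wide on every engine,
with no per-renderer interpretation of the unit.}
\end{document}
```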

The big drawback of the Semantic Web is that everyone, their brother, cat and goldfish have designed their own ontologies, none of which interoperate and few of which are any good for searching with SPARQL. LaTeX has a good collection of very standard, very clean methods for binding information together. Because it's standard, you can have pre-existing ontology libraries that can be auto-populated. And because LaTeX is mostly maintained by uber-minds, rather than Facebook interns during their coffee break, those ontologies are likely to be very, very good. Also, microformats will DIE!!!! BWAHAHAHAHAHAHAHAHA!

The big drawback with HTML 5 is that the W3C can't even decide if the standard is fixed, rolling or a pink pony. TeX is a very solid standard that actually exists.

Ok, what's the downside of TeX? There's no real namespace support, so conflicts between libraries are commonplace. I'm also not keen on having a mixture of tag logic, where some tags have content embedded and others have the content enclosed with an end tag. It's messy. Cleanliness is next to Linuxliness.
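
A contrived LaTeX sketch of the flat-namespace problem (both definitions are inlined here for illustration; real conflicts arise between independently written packages):

```latex
% Two packages that both define \result: the second definition
% clobbers the first, because TeX has a single flat namespace
% for control sequences.
\documentclass{article}
\newcommand{\result}{from package A}
\renewcommand{\result}{from package B}  % silent override; plain
                                        % \newcommand here would error

\begin{document}
\result % typesets ``from package B''
\end{document}
```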

Parsing client-side is a mild irritant, but let's face it. AJAX is also parsing client-side, as is Flash, as are cascading style sheets, etc, etc. The client is already doing a lot (one reason nobody has a fast browser any more), so changing from one set of massive overheads to another really wouldn't be that much of a pain.

Ok, so if we consider TeX the underlying system, do we need a TeX tag? No. Rather, we would assume all parts of a document not enclosed by an SGML tag are TeX. This would be a transitory state, since you could then write SGML-to-TeX modules for Apache, IIS and other popularish web servers. The world would then become wholly TeXified, as it should be.

Comment Re:TeX for Math (Score 3, Insightful) 300

Whereas now, people are still not accustomed to seeing correctly typeset documents and are now completely used to vast numbers of typos, malformed web pages, poor indexing via the semantic web, gratuitous XML, excessively long style sheets, browser incompatibilities, Javascript...

Comment Re:wtf? (Score 1) 605

You kill your cash cow IF your cash cow is lame and on life-support (basically the x86) AND you have a substitute cash cow that is so good, so novel and so ready that you can switch to it and gain more customers than you lose AND gain a time advantage over all your competitors of such magnitude that you'll have a very stable market before anyone is in a position to match you.

This does happen in the CPU world from time to time. The problem is that Intel has failed to make a new product of adequate maturity from the start, which is what it needs. If Intel had ditched its internal politics (which are horrible, BTW) and continued developing the Itanium until it got to where it is today BEFORE releasing it, it would have scored big. (The extra time would also have been considerably shorter, as there would have been more designing, more working, less blaming and less avoidance.) Intel LOST a remarkable opportunity to eliminate the x86 market, because they had (and have) much the same culture as the old IBM, NASA and Lockheed Martin - pushing for good headlines rather than good products. I guess it's also the same as the medieval Catholic Church. In the end, there is only one possible result - a catastrophic collapse in confidence and capability, where you go from market leader to borderline extinction in pretty much an afternoon. Sinkholes, rot and corporate failure all share one thing in common - you see nothing until it's all over.

ARM and MIPS have pretty much total control over the mobile world and the embedded world. However, the PC world isn't going to vanish. It will change, though. Why? Because there'll be more action in the ARM and MIPS toolchains, because people will want programs to work seamlessly across devices (not just talk, but actually run on multiple systems), because manufacturing is cheaper when you've a common base architecture, because software companies don't like supporting multiple architectures. And that means PCs will have to switch to the same architecture as the mobile/embedded world, albeit with performance considerations rather than power.

Comment Re:They could.... (Score 1) 605

There are still a huge number of programs (especially for Windows) for which 32-bit versions are the only versions. This is by no means a good thing - and at this point, there is no excuse for it. (Size may have been a factor at one point, but if you're capable of running 64-bit, you're probably working with enough memory to cope with the larger pointer sizes.)

Firefox, for example, is threatening to abandon the 64-bit line. You lose 32-bit support, you lose Firefox support. Now, that might not be such a bad thing (it's bleedin' bloatware, it's slow, the distinction between plugins and extensions is annoying, I hate the menu tree layout, they release major releases too often, there's no UTF-32, and I'm sure there are other problems I could find if I could be bothered, and they should Get Off My Lawn!). However, Chrome and its clones don't have enough plugins yet, Opera isn't the performance giant it was, and there are really no other browsers out there of much significance.

Comment Re:Why would Intel want to kill the x86? (Score 1) 605

I would object to calling Intel's processors "the best". The best at what? I'd pit a MIPS64 CPU against an x64 CPU (equal cores, equal FPUs, equal bus speeds) for servers. I'd consider the T2 to be easier to radiation-proof than comparable Intel offerings. The AMULET processor series is the most original. For bus speeds, HyperTransport is still faster than PCI Express.

That doesn't mean Intel isn't "the best" at XYZ, what it means is that there is no generic "the best". There will ALWAYS be another processor that is better at something else.

Comment Re:evidence-based policy (Score 1) 1106

I have nothing against having the biggest government you could possibly have - 100% of the population actively working on some aspect or aspects of governance - provided that the government working on any given specific aspect is as small as is rational (i.e. not so small that it is essentially run by a single person as a personal dictatorship).

In fact, I have rather more objection to the current state of affairs, where voters and pundits pronounce on issues they lack the information or education to comprehend. Education I've addressed lots of times (we need more of it, it needs to be considerably more intensive, it needs to be considerably more extensive and it absolutely needs to be devoid of creationists and conspiracy theorists), but that only provides the necessary foundation. The only place to acquire experience and current factual information on government is in government itself.

There is one other benefit: it's easier for lobbyists to bribe/coerce/blackmail 300 people into voting specific ways than to bribe 300 million.

Comment Re:Not the CPI please (Score 1) 1106

The "ideal minimum wage" would be the total cost of food, healthcare, transport and housing, as recommended by the experts in each field, for a quality and quantity of life such that, for a simple majority of people on that wage, it would take a disproportionately large pay increase to significantly improve that quality or quantity of life.

In other words, they are not suffering actual, measurable harm by being on the minimum wage versus the alternatives (leeching off others, going onto welfare, becoming a survivalist, becoming a member of the criminal class, etc).

(No, not all criminals are "just trying to survive", but anyone who is just trying to survive is going to consider crime a valid alternative to eating roots and berries.)

The "practical minimum wage" is the best available compromise between what wage-slave masters will tolerate and the ideal minimum wage. Wage-slave masters don't give an f about your health, even though it costs more to replace someone than to keep them healthy and functional. They don't give an f about anything beyond their day-to-day profits, the actual economics be damned. Odds are, many are guilty of criminal activity already, so to them one more crime means bugger all.

If you had no minimum wage, these people would pay as close to absolutely nothing as they possibly could. Some already pay nothing and use fear and intimidation to ensure there's no trouble. If you raise the minimum wage by anything at all, a few more bosses get added to that number. You can't escape it. But if you increase it by just the right amount, the number of people who gain truly functional lives exceeds the number who end up trampled on further.

Of course, if the US did something rational, like adopt universal "free" healthcare (ie: distribute the cost across everyone), legislate that nobody could be involuntarily homeless, and adopt other rules considered basic human standards by the rest of humanity, then you could actually LOWER minimum wage since there would be fewer expenses that minimum wage would need to cover.

Comment Re:Not an unexpected event.... (Score 1) 626

How do you get 3? At $10k per pop, $1 million allows 100 sequences to be performed. Divide by 3 sources and they should be able to do 33 sequences each. The cost of sequencing is further reduced if you just want SNPs, as you can use a microarray for that. But sequencing the entire genome is cheap these days. The cost has fallen very rapidly and continues to do so. It really is at the point where garage enthusiasts can perform many of the basic steps and will be able to do at-home SNP discovery within a few years. Of course, at-home work isn't criminology-grade, but just as enthusiasts can do basic work for pocket money, crime labs can easily swallow the cost of a clean room plus an Illumina or Oxford Nanopore sequencer.

(Oxford says it can get costs down to $1k per sequence within a year or two. They are also working on sub-$1k disposable sequencers, which may be useful.)

Comment Re:God, not this again. (Score 1) 292

The question has no meaning. By using a static spacetime diagram, there is no before or after. Time is merely a spatial dimension in this type of analysis.

Furthermore, "you" have no defined meaning in either of my two scenarios. In the first, simpler, framework, "you" can be either the individual particles in the body, the body as a collective whole, the instantaneous logical state of the brain, the collective logical state over a defined unit of time, or any combination thereof.

The Greek Ship paradox only occurs because you reuse the same label for utterly different aspects of a construct that is simultaneously logical and physical. By using a generic label, you can persuade yourself of almost anything. You must use specifics. And, yes, that means distinguishing object state from object dynamics from object encapsulation.

This is what I mean by uneducated. You lack the understanding necessary to comprehend my first post, you will doubtless fail to understand this one, and you cannot even be bothered to do the basic legwork to comprehend spacetime static waves (far simpler than M-theory, which I guarantee is as complex a model of existence as anyone has managed to achieve).

It is with nonsensical replies such as yours that I end up wondering if eugenics was such a bad concept. I still firmly believe better schooling would fix most examples of stupidity (I think 9 hours/day, compulsory between the ages of 3 and 23, narrow-band streaming per subject should suffice).

Comment Re:God, not this again. (Score 1) 292

Science is not done by straw poll, so the views of most (uneducated, I might add) people are unimportant. What matters is that physicists and mathematicians take the possibility seriously and have published papers on how simulation affects QM.

Something "exists" IFF there is a defined energy matrix superimposed on a defined probability matrix where said matrices cover non-zero, finite space, and non-zero, finite time, and interact with other such matrices of equal or higher number of dimensions.

The universe can also be described as a single object, static in 5D, with all possibilities as branches, that lies at the intersection of two membranes.

Comment Re:Pro Bono Opportunity (Score 1) 626

The cost of sequencing is negligible these days (around $10k for a full genome). Buying a machine isn't cheap - I looked at the cost of one a year or so ago, and they were still multi-million dollar devices. (Why? Because I'm a geek! Having a sequencer of my own would be bloody amazing! Useless, but amazing!) Buying a slot at a lab that can run a full sequence - dirt cheap.

This would, however, mean using REAL data and not the 7-12 markers they currently use for criminology. (NB: Genetic genealogists looking to see if two people are closely related would need to perform in excess of 100 STRs and a dozen or so SNPs - if both are male - PLUS a whole load of markers off the autosomal region, PLUS a full mitochondrial sequencing. And even then, accuracy isn't great and falls off sharply. Nobody in genetics, even those who are experts in the criminology aspect, takes current DNA testing by police seriously. The probability of coincidental matches is too high.)

Comment Re:Not an unexpected event.... (Score 2) 626

Easy. Move to a system that focuses more on rehabilitation, retraining and (when an external element is a factor) removal of the external factors contributing to the criminality. You still isolate from society (the sole benefit of prison), but with a reduced or eliminated punitive element there is no risk of punishing an innocent person who happens to be conjoined to someone who is guilty.
