User Journal

Journal Journal: Transcribing WW1 biography 5

My great-great-grandmother wrote a biography of her three brothers, who were killed in WW1. I'm typing it all into a LaTeX editor and will be adding a family tree, a sketched outline of their lives and newspaper clippings.

A best-seller it ain't, but it may interest a few here as these guys show autistic traits and are geeks from just over a century ago.

User Journal

Journal Journal: Review: Bird of Prey 4

TL;DR version: 80s dystopian techno-horror geekfest with relatively accurate portrayal of cryptography and hacking.

Long version: Pretty much the same as above. It's a low-budget BBC production that scores highly on accuracy of methods, exploits and technology of the era, as far as TV ever does.

The premise: a low-rank civil servant, tracking down bank fraud, discovers a trail of blackmail, corruption by intelligence services, deliberate weaknesses in security and criminal gangs operating with impunity.

By season 2, he's keeping himself alive the same way the Wikileaks journalists did, his wife has what we would call severe PTSD and the body count isn't slowing down.

Given that trauma was barely understood in the 80s, the portrayal of it and of the bouts of temporary insanity is extremely close to what actually happens, again allowing for this being TV drama and not a psychological documentary.

The storyline deals with cryptography, surveillance society, backdoors and institutional corruption. All hot button issues of today. It even covers the inevitable issues of DIY security.

The conspiracy aspect is a trifle OTT but, again, it's TV. It has to be to have a program.

It's geared to nerds, geeks and dystopia lovers, though, rather than the mainstream. I saw more reviews in computer journals than in TV guides.

It's the sort of show that would really need updating to be watchable by modern audiences, but fans of older shows would likely enjoy it.

It wasn't unusual for the time, which is the great thing.

The 80s were a time for really bleak geek television - Codename Icarus (for the younger viewers), Edge of Darkness, Terry Nation's Survivors, Threads - all productions from that decade.

(Even the late 70s had some dark stuff: Blake's 7, The Omega Factor, Day of the Triffids and ATV/Central's Sapphire & Steel were not light watching. You have to go back to the start of the decade and Doomwatch to see a plausible contemporary dystopia.)

The stuff of a thousand bad dreams, these shows.

User Journal

Journal Journal: Teaching history via RPGs 3

There's a new RPG pack under development, called Carved In Stone. Well, it's called an RPG pack, but basically it's a fairly comprehensive history lesson about the Picts that can be used in roleplaying games. This is quite a neat idea and it got me wondering.

There were, at one point, quite a few historical wargames (Britannia, Decline and Fall, etc) but they were mostly about large-scale strategy rather than the history itself (which was mostly an excuse for blowing up other people's counters). History lessons via roleplaying games sounds quite an interesting approach and could be used to cover all kinds of events.

The expansion pack isn't out yet (it's still on Kickstarter), but there's enough information about it to get a good feel for how much depth there is in there. If it's done well, it could be very effective in the same way "...and then the Huns came and beat the sh*t out of the Romans before leaving again" isn't. Unless you're a Hun.

I'd like to get people's views on the use of roleplaying games and which system would be best for such gaming. Rolemaster? Call of Cthulhu? The ever-present Dungeons and Dragons? ("My 20th level mage casts a fireball at the fleeing Scots" sounds ahistorical.)

User Journal

Journal Journal: Consumer Genetics, the current state of play

Ok, so let's start by defining a few terms, as it is obvious from Facebook genetic genealogy groups that people are truly ignorant on the subject. (Not that I believe this is common on Slashdot, where we're all much more knowledgeable.)

First off, most genetic testing is NOT carried out by sequencing all of your DNA, a widespread belief that resulted in outrage on one Facebook group when I pointed that out.

The vast majority of consumer testing is done by SNP genotyping. They look at very specific genetic markers and see whether those markers have changed from one base to another. That's the only type of mutation looked for, and they typically cover only a tiny fraction of the genome.

So we've our first way to group companies: sequencing vs genotyping.

SNPs (single nucleotide polymorphisms) are, as mentioned above, one type of mutation. Another is called STR (short tandem repeat), where a block of DNA is duplicated.

FamilyTreeDNA does both STR and SNP testing, STRs mostly for the Y chromosome. Both can be used for family history.

Most labs, though, use only SNP tests. It's quicker and cheaper than counting repeats but with many of the more interesting ones covered by patents or kept private by other means, there's a lot more secrecy involved.

(Note: This has doubtless led to a lot of unnecessary deaths, as genetic markers indicating a high probability of getting certain forms of cancer are being milked by private companies for profit. Few people get more than one test, so most people won't know if they carry such markers and can't take action in advance.)

So the second piece of jargon is SNP vs STR.

Finally, we come to the different areas of DNA. There are regions that are especially good for ancestral research (mostly non-coding DNA), then there's the exome (where most of the protein coding takes place), there are telomeres (sacrificial buffers at the ends of chromosomes, which have a function in longevity), and so on. I won't list them all.

The Y chromosome is particularly good for ancestry, but only has 9 coding genes left in it. It's possible it will vanish in time, but it seems to be fairly stable for right now.

Most companies test only DNA that is good for ancestral research in the autosomal regions (atDNA, the regions outside the sex chromosomes). This lets you identify anyone who is genetically connected, but because you get (on average) just under 50% of your DNA from each parent (remember, each generation also adds new mutations, DNA that comes from neither parent), the distance you can track back depends on how many markers are tested (very few). Reliability falls off sharply.
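To give a feel for why it falls off, here is a minimal sketch (my own illustration, not any company's method) of the expected fraction of autosomal DNA shared with a single ancestor n generations back, ignoring the handful of new mutations per generation:

```python
# Expected fraction of autosomal DNA inherited from one ancestor
# n generations back is roughly (1/2)**n (ignoring new mutations and
# the lumpiness that recombination introduces in practice).
def expected_share(generations: int) -> float:
    return 0.5 ** generations

for g in range(1, 7):
    print(f"{g} generation(s) back: ~{expected_share(g):.2%}")
# By 6 generations back (a 4x great-grandparent) the expectation is ~1.6%,
# which is why matches beyond a few generations get unreliable when only a
# few hundred thousand markers are tested.
```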

YDNA (Y chromosome DNA) tests only test for paternal ancestry, but if two people have a common paternal-line ancestor, it's a lot more precise once you're past about second cousins. It's popular with anthropologists as it's very good for tracking how men have migrated.

mtDNA (mitochondrial DNA) is only inherited through the maternal line. Again, it's very popular, this time for tracking how women have migrated. There are certain forms of mtDNA that are linked to health benefits and others to genetic diseases, so this one tends to be the most controversial of the ancestral DNA tests. It also changes very slowly, so you don't get high resolution on population movements.

These two (YDNA and mtDNA) tests can tell you a lot about whether societies are open or closed, and whether it was men who travelled to find partners, women, or both. So we can know something of the culture of even long-extinct societies.

The data I have been able to find is for 2019. It shows: MyHeritage tests 702,442 autosomal SNPs, AncestryDNA 637,639, FTDNA 612,272 and 23andMe 630,132. This is out of a total of roughly 3 billion base pairs, so the best test that year looked at about 0.023% of the genome.
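A quick back-of-the-envelope check on that figure, using the marker counts above and a rounded 3-billion-base-pair genome (purely illustrative):

```python
# Sanity check: what fraction of a ~3 billion bp genome do these chips cover?
GENOME_BP = 3_000_000_000

chips = {
    "MyHeritage": 702_442,
    "AncestryDNA": 637_639,
    "FTDNA": 612_272,
    "23andMe": 630_132,
}

for name, snps in chips.items():
    print(f"{name}: {snps / GENOME_BP:.4%} of the genome")
# The largest chip (MyHeritage) works out to roughly 0.023%.
```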

ISOGG produced a comparison chart as well, but it's far older, dating from around 2013.

Since you inherit a random 50% from each parent, the assumption that this is statistically meaningful for such a small fraction of the DNA is questionable. It seems to work adequately, but I'm not sure what the error bars are.

FTDNA also tests up to 111 STRs on regular tests and 600+ STRs for their "BigY" (it depends on the quality of the genetic sample).

Companies that do sequencing sometimes offer partial kits (on the order of tens of millions of SNPs) or full sequencing (which is what the name suggests). These are rarer and more expensive.

Most DNA companies allow you to access the raw data, some only allow it if you pay vast sums of money, and some don't allow you to at all. Always check in advance.

When you download your own data, you can use public databases to search for matches (either for relatives or genetic conditions). The quality of public databases is less controlled, both in terms of privacy and quality of data. However, corporate databases will usually be smaller for both types of data and will also usually not contain data from rivals. If you want broad data sets, public databases are the way to go.
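If you do download your raw data, it's usually just a tab-separated text file of marker calls. Here's a minimal sketch of loading one; the four-column layout (rsid, chromosome, position, genotype) follows the convention several vendors use, and the file name is made up:

```python
# Minimal sketch: read a raw autosomal data export and count genotype calls.
# Assumes the common 4-column, tab-separated layout (rsid, chromosome,
# position, genotype) with '#' comment lines; adjust for your vendor's format.
import csv
from collections import Counter

def load_raw_data(path):
    calls = {}
    with open(path, newline="") as fh:
        rows = csv.reader(
            (line for line in fh if not line.startswith("#")), delimiter="\t"
        )
        for row in rows:
            if len(row) >= 4:
                rsid, chromosome, position, genotype = row[:4]
                calls[rsid] = (chromosome, position, genotype)
    return calls

if __name__ == "__main__":
    calls = load_raw_data("my_raw_data.txt")  # hypothetical file name
    print(f"{len(calls)} markers loaded")
    print(Counter(g for _, _, g in calls.values()).most_common(5))
```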

I've only tested with 23&Me, FamilyTreeDNA, CRI Genetics and Nebula Genomics, so can't tell you anything much about the quality of the other companies.

(Ok, I also tested with uBiome, a microbiome testing company in the US, but they had their computers seized some time back due to fraud. I have no idea what happened to my data on there, or whether there's a way to access it.)

The quality seems to be reasonable for all four.

FTDNA is the most expensive for a lot of things, but has less of a sticker shock than Nebula and gets you more data than 23andMe. It looks like there are a few companies that are better for ancestry, but it's one of the best and the one the Genographic Project used. They're the only ancestral company that gives you STRs, AFAIK, and they give you a much more detailed evaluation of haplogroups than anyone else I've tested with.

Nebula does up to medical-grade (100x coverage) DNA testing, so if you want results a hospital will trust, that's where you part with a vast amount of money.

23&Me is good for a lot of medical stuff and if you want to help with research is probably the best.

CRI Genetics produces a lot of data with much higher reliability than most of the others, but you can't access the raw data and their databases won't be as extensive. However, because you can't access the raw data, you have to test with them to compare against their database.

User Journal

Journal Journal: Tea 7

I have now passed the 30 mark for different black teas. Not fruit, not spice, not herbal, not even green, white or red tea. Just black teas. No, blends like PG Tips and Yorkshire Gold don't count either.

Why so many? Aside from being my current monomania, it's because I'm fascinated by how different they are.

I couldn't tell you the chemistry that makes that difference, nor could I tell you what difference it makes in terms of the various compounds affecting alertness or sedation (tea contains both), in terms of health benefits, or even in the simplest terms of how water is retained in the body.

But I'm determined to find out at least some of this. It'll have to be on my own, as essentially no research is being done on the subject, and I've no idea of what that'll require beyond a very good gas spectrometer (I'm going to have to count molecules, not atoms).

But I think it would be fun to find out, and definitely worth doing as long as I can figure out how to (a) control the parameters, and (b) afford said piece of gear.

User Journal

Journal Journal: Continuation on education 13

Ok, I need to expand a bit on my excessively long post on education some time back.

The first thing I am going to clarify is streaming. This is not merely distinction by speed, which is the normal (and therefore wrong) approach. You have to distinguish by the nature of the flows. In practice, this means distinguishing by creativity (since creative people learn differently than uncreative people).

It is also not sufficient to divide by fast/medium/slow. The idea is that differences in mind create turbulence (a very useful thing to have in contexts other than the classroom). For speed, this is easy - normal +/- 0.25 standard deviations for the central band (ie: everyone essentially average), plus two additional bands on either side, making five in total.

Classes should hold around 10 students, so you have lots of different classes for average, fewer for the bands either side, and perhaps only one for the outer bands. This solves a lot of timetabling issues, as classes in the same band are going to be interchangeable as far as subject matter is concerned. (This means you can weave in and out of the creative streams as needed.)

Creativity can be ranked, but not quantified. I'd simply create three pools of students, with the most creative in one pool and the least in a second. It's about the best you can do. The size of the pools? Well, you can't obtain zero gradient, and variations in thinking style can be very useful in the classroom. 50% in the middle group, 25% in each of the outliers.

So you've 15 different streams in total. Assume creativity and speed are normally distributed and that the outermost speed streams contain one class of 10 each. Starting with speed for simplicity, I'll forgo the calculations and guess that the upper/lower middle bands would then have nine classes of 10 each and that the central band will hold 180 classes of 10.

That means you've 2000 students, of whom the assumption is 1000 are averagely creative, 500 are exceptional and 500 are, well, not really. Ok, because creativity and speed are independent variables, we have to have more classes in the outermost band - in fact, we'd need four of them, which means we have to go to 8000 students.

These students get placed in one of 808 possible classes per subject per year. Yes, 808 distinct classes. Assuming 6 teaching hours per day x 5 days, making 30 available hours, which means you can have no fewer than 27 simultaneous classes per year. That's 513 classrooms in total, fully occupied in every timeslot, and we're looking at just one subject. Assuming 8 subjects per year on average, that goes up to 4104. Rooms need maintenance and you also need spares in case of problems. So, triple it, giving 12312 rooms required. We're now looking at serious real estate, but there are larger schools than that today. This isn't impossible.
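For anyone who wants to poke at those numbers, here's a minimal sketch that simply reproduces the arithmetic above from the same guessed inputs:

```python
# Reproduce the room arithmetic above from the stated assumptions.
import math

classes_per_subject_per_year = 808   # guessed figure from the text
hours_per_week = 6 * 5               # 6 teaching hours/day, 5 days
years = 19                           # pre/playschool through undergraduate (see below)
subjects = 8                         # average subjects per year
spare_factor = 3                     # maintenance plus spare capacity

simultaneous = math.ceil(classes_per_subject_per_year / hours_per_week)  # 27
rooms_one_subject = simultaneous * years                                 # 513
rooms_all_subjects = rooms_one_subject * subjects                        # 4104
rooms_with_spares = rooms_all_subjects * spare_factor                    # 12312

print(simultaneous, rooms_one_subject, rooms_all_subjects, rooms_with_spares)
```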

The 8000 students is per year, as noted earlier. And since years won't align, you're going to need to go from first year of pre/playschool to final year of an undergraduate degree. That's a whole lotta years. 19 of them, including industrial placement. 152,000 students in total. About a quarter of the total student population in the Greater Manchester area.

The design would be a nightmare with a layout from hell to minimize conflict due to intellectual peers not always being age peers, and neither necessarily being perceptual peers, and yet the layout also has to minimize the distance walked. Due to the lack of wormholes and non-simply-connected topologies, this isn't trivial. A person at one extreme corner of the two dimensional spectrum in one subject might be at the other extreme corner in another. From each class, there will be 15 vectors to the next one.

But you can't minimize per journey. Because there will be multiple interchangeable classes, each of which will produce 15 further vectors, you have to minimize per day, per student. Certain changes impact other vectors, certain vector values will be impossible, and so on. Multivariable systems with permutation constraints. That is hellish optimization, but it is possible.

It might actually be necessary to make the school a full research/teaching institution of the sort found a lot in England. There is no possible way such a school could finance itself off fees, but research/development, publishing and other long-term income might help. Ideally, the productivity would pay for the school. The bigger multinationals post profits in excess of 2 billion a year, which is roughly how much this school would cost.

Pumping all the profits into a school in the hope that the 10 uber creative geniuses you produce each year, every year, can produce enough new products and enough new patents to guarantee the system can be sustained... It would be a huge gamble, it would probably fail, but what a wild ride it would be!

Books

Journal Journal: History books can be fun (but usually aren't and this is a Bad Thing) 2

Most people have read "1066 and All That: A Memorable History of England, Comprising All the Parts You Can Remember, Including 103 Good Things, 5 Bad Kings and 2 Genuine Dates" (one of the longest book titles I have ever encountered) and some may have encountered "The Decline and Fall of Practically Everybody", but these are the exceptions and not the rule. What interesting - but accurate-ish - takes on history have other Slashdotters encountered?

Education

Journal Journal: HOWTO: Run an educational system 1

The topic on Woz inspired me to post something about the ideas I've been percolating for some time. These are based on personal teaching experience, teaching experience by siblings and my father at university level, and by my grandfather at secondary school, 6th form college and military academy. (There have been a lot of academics in the family.)

Anyways, I'll break this down into sections. Section 1 deals with the issues of class size and difference in ability. It is simply not possible to teach to any kind of meaningful standard a group of kids of wildly differing ability. Each subject should be streamed, such that people of similar ability are grouped together -- with one and only one exception: you cannot neglect the social aspect of education. Some people function well together, some people dysfunction well together. You really want to maintain the former of those two groups as much as possible, even if that means having a person moved up or down one stream.

Further, not everyone who learns at the same pace learns in the same way. Streams should be segmented according to student perspective, at least to some degree, to maximize the student's ability to fully process what they are learning. A different perspective will almost certainly result in a different stream. Obviously, you want students to be in the perspective that leads them to be in the fastest stream they can be in.

There should be sufficient divisions such that any given stream progresses with the least turbulence possible. Laminar flow is good. There should also be no fewer than one instructor per ten students at a secondary school level. You probably want more instructors in primary education, less at college/university, with 1:10 being the average across all three.

Section 2: What to teach. I argue that the absolute fundamental skills deal in how to learn, how to research, how to find data, how to question, how to evaluate, how to apply reasoning tools such as deduction, inference, lateral thinking, etc, in constructive and useful ways. Without these skills, education is just a bunch of disconnected facts and figures. These skills do not have to be taught directly from day 1, but they do have to be a part of how things are taught and must become second-nature before secondary education starts.

Since neurologists now believe that what is learned alters the wiring, the flexibility and the adult size of the brain, it makes sense that the material taught should seek to optimize things a bit. Languages seem to boost mental capacity and the brain's capacity to be fault-tolerant, so it would seem to follow that teaching multiple languages from different language families would be a Good Thing in terms of architecting a good brain. Memorization/rote-learning seems to boost other parts of the brain. It's not clear what balance should be struck, or what other brain-enhancing skills there might be, but some start is better than no start at all.

Section 3: How to test. If it's essential to have exams (which I doubt), the exam should be longer than could be completed by anyone - however good - within the allowed time, with a gradual increase in the difficulty of the questions. Multiple choice (multiple guess) should be banned. The mean and median score should be 50% and follow a normal distribution. Giving the same test to an expert system given the same level of instruction as the students should result in a failing grade, which I'd put at anything under 20% on this scale. (You are not testing their ability to be a computer. Not in this system.)

Each test should produce two scores - the raw score (showing current ability) and the score after adjusting for the anticipated score based on previous test results (which show the ability to learn and therefore what should have been learned this time - you want the third-order differential and therefore the first three tests cannot be examined this way). The adjusted score should be on the range of -1 (learned nothing new, consider moving across to a different perspective in the same stream) to 0 (learned at expected rate) to +1 (learning too fast for the stream, consider moving up). Students should not be moved downstream on a test result, only ever on a neutral evaluation of some kind.
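As a minimal sketch of how that adjusted score might be computed (my own illustration of the idea, not a worked-out scheme): extrapolate the expected score from the previous three results, then express the actual gain relative to the expected gain on the -1 to +1 range.

```python
# Illustrative only: -1 = no improvement, 0 = improved at the expected rate,
# +1 = improved at (at least) twice the expected rate. The extrapolation and
# clamping choices here are assumptions, not part of the scheme above.
def adjusted_score(history, actual):
    if len(history) < 3:
        return None  # the first three tests can't be scored this way
    s1, s2, s3 = history[-3:]
    expected = 3 * s3 - 3 * s2 + s1        # quadratic extrapolation from 3 points
    expected_gain = expected - s3
    if expected_gain <= 0:
        return None  # trend is flat or falling; needs human judgement
    ratio = (actual - s3) / expected_gain  # 0 = no gain, 1 = on trend, 2 = double
    return max(-1.0, min(1.0, ratio - 1.0))

print(adjusted_score([40, 45, 52], 52))  # no improvement -> -1.0
print(adjusted_score([40, 45, 52], 61))  # exactly on trend -> 0.0
print(adjusted_score([40, 45, 52], 70))  # double the expected gain -> +1.0
```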

Section 4: Fundamentals within any given craft, study or profession should be taught as deeply and thoroughly as possible. Those change the least and will apply even as the details they are intertwined with move in and out of fashion. "Concrete" skills should be taught broadly enough that there is never a serious risk of unemployability, but also deeply enough that the skills have serious market value.

Section 5: Absolutely NO homework. It's either going to be rushed, plagiarized or paid for. It's never going to be done well and it serves no useful purpose. Year-long projects are far more sensible, as they achieve the repeated use of a skill that homework aims for, but in a way that is immediately practical and immediately necessary.

Lab work should likewise not demonstrate trivial stuff, but through repetition and variation lead to the memorization of the theory and its association with practical problems of the appropriate class.

Section 6: Jamie Oliver's advice on diet should be followed within reason - and the "within reason" bit has more to do with what food scientists and cookery scientists discover than with any complaints.

Section 7: Go bankrupt. This is where this whole scheme falls over -- to do what I'm proposing seriously would require multiplying the costs of maintaining and running a school by 25-30 with no additional income. If it had a few billion in starting capital and bought stocks in businesses likely to be boosted by a high-intensity K-PhD educational program, it is just possible you could reduce the bleeding to manageable proportions. What you can never do in this system is turn a profit, although all who are taught will make very substantial profits from such a system.

User Journal

Journal Journal: I don't know which is scarier

That I am old enough to remember where my current .sig came from, or that nobody else is.....! For those who are suffering from a memory lapse, here is the sig: "The world is in darkness. To erase data is to suppress truth; to halt computing is to shackle the mind."

Ok, ok, you're too lazy to google it, so here's the link: Son of Hexadecimal Kid

User Journal

Journal Journal: Automotive Security

According to the Center for Automotive Embedded Systems Security, there are serious security flaws in the existing technology. Not necessarily a big deal, for now, as they observe that the risks are low at the current time. Emphasis on "current". They also state that no crackers have been observed to use the required level of sophistication. Again, emphasis needs to be on "observed".

Yes, it may well be a while before automotive networks reach the point where this is exploited in the wild (at least to any scale), but I would remind you that it took Microsoft from Windows 3.0 through to Windows XP Service Pack 2 to take security even remotely seriously. That's a long, long time. And Microsoft had nothing like the install-base of the car industry.

Further, the qualifications required by most companies to be a system administrator were a good deal steeper than the requirements for a car mechanic, so systems administrators were likely far more familiar with the issues involved. Also, said systems administrators are far more accountable for security issues, since there are plenty of third-party tools that novice users can use to spot malicious software.

The first question is why this even matters. It doesn't affect anyone today. No, but it's guaranteed to affect at least some current Slashdot readers in their lifetime and, depending on how rapidly car networks develop, may affect a significant fraction surprisingly fast. Technology doesn't move at Stone Age speeds any more. Technology advances rapidly and you can't use obsolete notions of progress to determine what will happen next year or over the next decade.

The second question is what anyone could seriously do, even if it was an issue. Not too many Slashdotters own automotive companies. In fact, I doubt if ANY Slashdotters own automotive companies. Well, the validation tools are Open Source. MISRA has a fair few links to members and software packages. In fact, even if developers just developed an understanding of MISRA's C and C++ specifications it might be quite valuable as it would allow people to understand what is being done (if anything) to improve reliability and to understand how (if at all) this impacts security. You don't get reliability for free, there will be some compromises made elsewhere.

User Journal

Journal Journal: Has anyone had problems with DB companies? What therapies work with bosses? 4

I've been having problems with EnterpriseDB. This company maintains the Windows port of Postgres, but I have been finding their customer service... less than satisfactory. This is the second time in, oh, 21 years that I've actually been infuriated by a company. However, to be entirely fair to the business and indeed the sales person, it is entirely possible this was a completely freak incident with no relationship to normal experience. There were all kinds of factors involved, so it's a messy situation all round, but the hard-sell aggressiveness and verbal abuse went way beyond what I have ever experienced from a professional organization in two DECADES.

What I want to know from other Slashdotters is whether this is about on par with the tales of meteorites landing on someone's sofa (which is my personal suspicion) or whether it's a more insidious issue. Please, please, please, do not take one incident as a general rule. I've not seen any article on Slashdot or LWN reporting wider issues with them, which you know perfectly well would have happened had there been a serious, widespread problem - especially with all of the reporting on database issues over recent times and the search for alternatives to MySQL once leading developers defected and major forks arose.

This is, however, a major question. Like it or not, we need databases we can rely on and trust, which means that when they are backed by companies, we need the companies that back them to be honorable. (PostgreSQL itself isn't owned, so I trust the engine itself just fine. The development team is very impressive - and, yes, I do monitor the mailing lists.) Value-added only has any added value if it's valuable.

What is worse, from my perspective, is that my current boss is now treating this as how companies work when reselling Open Source products. His practical experience was being on the receiving end of all this. If we're to take advantage of the freedom (and bloody high quality) provided by the Open Source world, I need to disabuse him of the notion that such companies give hassle and sell grief. Does anyone have any experience doing this?

User Journal

Journal Journal: Save TV for Geeks! 2

A petition calling for the return of perhaps the most important television show since The Great Egg Race is currently running but isn't exactly getting anywhere fast. It is vitally important that intellectually-stimulating shows be encouraged -- the consequence of failure (24 hours of Jersey Shore on all channels) is too horrible to contemplate. Unfortunately, as things stand, that's exactly what we are heading towards. Save your television and your mind before it's too late!

User Journal

Journal Journal: 1-3% of all main-sequence stars have planets?

The venerable BBC is reporting that a survey of light emitted from white dwarfs showed that between 1% and 3% had material (such as silicon) falling into the star on a continuous basis, potential evidence of dead worlds and asteroids. On this basis, the authors of the study speculate that the same percentage of main-sequence stars in the active part of their lives will have rocky matter. This is not firm evidence of actual planetary formation, as asteroids would produce the same results, but it does give an upper bound and some idea of what a lower bound might be for planetary formation.

Aside from being a useful value for Drake's Equation, the rate of planetary formation would be valuable in understanding how solar systems develop and what sort of preconditions are required for an accretion disk of suitable material to form.
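For context, here's a minimal sketch of where such a figure slots into the Drake equation; apart from the 1-3% range, every number below is a placeholder assumption, not a measurement.

```python
# Drake equation: N = R* x fp x ne x fl x fi x fc x L
# Only fp is informed by the survey above (as a rough bound); all other
# values here are placeholder assumptions purely for illustration.
def drake(r_star, f_p, n_e, f_l, f_i, f_c, lifetime):
    return r_star * f_p * n_e * f_l * f_i * f_c * lifetime

for f_p in (0.01, 0.03):  # the 1-3% figure from the white dwarf survey
    n = drake(r_star=7, f_p=f_p, n_e=1, f_l=0.5, f_i=0.1, f_c=0.1, lifetime=10_000)
    print(f"f_p = {f_p:.0%}: N ~ {n:.1f} communicating civilisations")
```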

Because the test only looked for elements too heavy to have been formed in the star, we can rule out the observations being those of cometary debris.

User Journal

Journal Journal: Fireball, but not XL5 3

Four fireballs, glowing blue and orange, were visible last night over the skies of the Carolinas on the southeast coast of the United States, followed by the sound of an explosion described as being like thunder. Reports of hearing the noise were coming in from as far afield as Connecticut. There is currently no word from NASA or the USAF as to what it could be, but it seems improbable that anything non-nuclear the military could put up could be heard over that kind of distance. It therefore seems likely to be a very big meteorite.

The next question would be what type of meteorite. This is not an idle question. The one that slammed into Sudan recently was (a) extremely big, at an estimated 80 tonnes, and (b) from the extremely rare F class of asteroid. If this new meteorite is also from an F-class asteroid, then it is likely associated with the one that hit Sudan. This is important, as it means we might want to be looking very closely for other fragments yet to hit.

The colours are interesting and allow us to limit what the composition could have been and therefore where it came from. We can deduce this because anything slamming through the atmosphere is basically undergoing a giant version of your basic chemistry "flame test" for substance identification. We simply need to look up what metals produce blue, and in so doing we see that cadmium does produce a blue/violet colour, with copper producing more of a blue/green.

Other metals also produce a blue glow and tables of these colours abound, but some are more likely in meteoric material than others. Cadmium exists in meteorites - well, all elements do, if you find enough meteorites - but it exists in sufficient quantity that it could produce this sort of effect. (As noted in the chemmaster link, low concentrations can't be detected by this method; that problem is going to be vastly worsened by the fact that this isn't a Bunsen burner being used and the distance over which you're observing is extreme.)

Ok, what else do we know? The fireballs were also orange. Ureilites, such as the Sudan impactor, contain a great deal of calcium, which burns brick-red, not orange. This suggests we can rule out the same source, which in turn means we probably don't have to worry about being strafed the way Jupiter was by comet Shoemaker-Levy 9 (21 impacts).
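The "look it up" step really is just a lookup. A minimal sketch using only the metals mentioned above (real flame-test tables list many more metals and much subtler shades):

```python
# Tiny flame-test lookup using only the metals discussed above.
# Real tables are far longer, and the colours blur at meteor distances.
FLAME_COLOURS = {
    "cadmium": "blue/violet",
    "copper": "blue/green",
    "calcium": "brick red",
}

def candidates(observed: str):
    """Return metals whose flame colour mentions the observed colour word."""
    return [metal for metal, colour in FLAME_COLOURS.items() if observed.lower() in colour]

print(candidates("blue"))    # ['cadmium', 'copper']
print(candidates("orange"))  # [] -- consistent with ruling out calcium-rich ureilites
```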

What can we say about it, though? Well, provided the surviving fragments didn't fall into the ocean, every meteorite hunter on the planet will be scouring newspaper stories that might indicate where impacts occurred. Meteoric material is valuable, and anything on a scale big enough to be heard across the entire east coast of the US is going to be worth looking for. It had split into four in the upper atmosphere, so you're probably looking at a few thousand fragments reaching ground level, some of which would be worth more than a year's average pay.
