
Comment Re: EBooks (Score 1) 159

PDFs are awkward on e-readers because the format is page-oriented, which makes it a poor fit for smaller screens. That said, a PDF reproduces the printed page with a high degree of fidelity, and that is PDF's biggest strength.

The problem with math ebooks published in AZW format is that they either render equations incorrectly (making them useless), or render them as bitmaps (making them less useful than they should be, and sometimes illegible).

In most cases I'd take an EPUB or AZW over a PDF for reading on a small device, but for math I'd take the PDF, despite its inconveniences. Or better yet, a physical book.

Comment Re:Really? (Score 1) 57

Totally agree - this is sad. I am primarily an iPhone user, and have experience with Android tablets, but I had to use a Windows phone for a while and I ended up liking it better than either. The UI is far less intrusive and needlessly complicated: it just works, as the saying goes. When my Dad wanted a smartphone, despite his fears of being overwhelmed, I got him a Windows Phone and he had no problems at all. Even he was surprised at how easy it was to use - easier than the Jitterbug it replaced. Yet as a power user I never ran into anything I couldn't do on it, and usually more easily than on my iPhone.

I think the tiles setup allows *much* better customization than the wall of icons approach that both Apple and Google went with.

Seems like an example of the market not rewarding a good product, I guess.

Comment Re:EBooks (Score 1) 159

I'm in my 50s and my house is literally full of physical books. Every room is lined with bookcases, most of them stacked two deep, and I've had to put jack posts in my basement to keep the floors from sagging.

Buying new books as ebooks means I don't have to get rid of my old books. It's also nice being able to travel with a generous selection of reading material.

Overall I find the reading experience to be about a wash, but that's a highly personal thing. For pure reading a physical book is better except in low-light conditions, but the search and note taking functions on an ebook are a big plus.

The biggest drawback of ebooks for me is the terrible mathematics typesetting, which is obviously a niche concern, but it's beyond bad: it renders many math ebooks unusable. Often the equations are rendered as low-resolution bitmaps that are close to unreadable; in other cases I've seen equation terms scattered hither and yon across the page. For scientific and technical books I would much prefer a larger, higher-resolution device. It's too bad nothing really fits the bill, because I hate throwing out cases of obsolete technical books every year.

If I had to choose just one format, I'd choose paper. But I find ebooks have their uses.

Comment Re:Pre-PC/Mac era (Score 1) 106

Later when Macs and PCs hit schools, the level of interest kids had in programming or even understanding computers dropped so we ended up with a generation of kids who couldn't do much more than type up a letter in MS Word compared with my generation which were writing hand coded assembly and building robots....Typing up letters and doing spreadsheets is not computing but seems to be all the schools are prepared to teach.

I think you've struck on a set of symptoms of an underlying problem. Like most Slashdotters, you were largely self-taught. Exposure to the 8-bit machines gave you a starting point, and you took the initiative from there.

I'd argue that we've ended up with two generations of kids who have only cursory word-processing and web-browsing skills. Would they have been apt to code if the only available computer required assembly? I doubt a statistically relevant number of them would have, at least not without direct education on the matter - and on that front we find the root cause.

Yesterday, I spoke with my high school English teacher - one of the best my high school had. She left teaching about three years ago, and in our discussions she said that while she would like to go back to subbing, being a TA, or full-time tutoring, she did not want to return to her own classroom again. In her last 2-3 years of teaching, she said, there was a noticeable shift in student attitudes. There were always a number of students (a majority, at times) who wouldn't get excited about Shakespeare no matter what she did, but she at least attempted to make the projects more enjoyable than simple Q&A worksheets (newspaper production, video projects, etc.). By the time she left, it was a fight tooth and nail to get anyone to even go along with the classwork. I'd be hard pressed to find a meaningful number of current high school teachers who would disagree.

Now, let's bring it back to computer class. The nature of "teaching computer" means that there is a need for one of precisely two types of people: teachers who understand computers, or CS/IT folks who know how to teach. Considering the education requirements and "don't get us sued" workshops required for becoming a teacher, as well as the endless grading and meager salaries, adding "technologically adept" to the mix would be incredibly difficult.

My English teacher had a bit of an advantage in that the English language hasn't changed in the past 20 years, and neither have most of the classical works we read. What language do you teach in school today? Do you start with the generally-irrelevant-but-easy-to-teach VB? Do you turn it into The Hunger Games and start with Perl? What happens when the Eclipse-based curriculum you've refined over the past 2-3 years has to start from scratch because the superintendent wants to look modern to parents, and so informs you that you'll be teaching Comp Sci on iPads and the Win7 desktops will be gone by summer?

Even at that, how much time do you devote to programming when the students don't have a meaningful grasp of the file structure? Is it wise to assume programming is more important than using Word, even though there is a far more immediate use case for Word than there is for being able to program PHP? What about the IT side of things - if coding is a good thing to teach, then isn't it also good to teach how to install simple PHP applications like WordPress on a LAMP stack and then secure them? All of this is subject to the question of shelf life on top of it.

All of this applies in reverse to the programmer or sysadmin who decides to go into teaching. The programmer now gets to create lesson plans and grade papers (which gets super tedious when it means reviewing source code and/or checking IT projects), deal with classroom control, and attend continuing education (which has very little to do with what is actually being taught). They get to placate parents who insist they're doing the job wrong because their snowflakes are being trained on the CLI (totally irrelevant now, since nobody uses such ancient things on a Galaxy S8). The superintendent has a crosshair on their back for letting the students work as root on a set of VMs, and no amount of "it's an isolated subnet that cannot reach the internal LAN, and IT has already signed off on the configuration" will calm him down if there's a data breach. And if it's not a way to post selfies on Snapchat, the kids don't care anyway. To deal with all of this for what is likely a five-figure pay cut? That's a tough sell.

The result of both of these scenarios is that we commonly end up with "Word and The Internet" computer classes because teachers commonly don't know much more than that, and it's not uncommon for a teacher who's out of work to apply for 'an opening', even if their teaching certificate says "math".

I'm certain you've heard the cliche "Those who can, do. Those who can't, teach." If there is any truth to that statement, then those who teach computing will do so at a level whose shortcomings are more readily obvious than they might be in other subjects. There is no concern about poor security in English class.

Comment Brand loyalty? Oy. (Score 2) 106

We had a lot of odd minicomputers in my high school, but the one I used most in school was a Digital Equipment PDP-8. You loaded the bootstrap from a paper tape reader, and you loaded the paper tape reader program by switches on the front panel which allowed you to set memory address contents word by word and set the program counter to a particular octal address. Input/output was through a teletype that printed on a roll of paper.
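For readers who never saw a front panel, here's a rough sketch in Python of what that toggle-in procedure amounted to. The octal words below are illustrative placeholders, not the actual DEC loader, and the class names are invented for the illustration; the point is just the mechanics of LOAD ADDR, DEP, and the address auto-increment.

```python
# Toy model of a PDP-8-style front panel: 12-bit words, octal addresses.
# The deposited words are illustrative placeholders, NOT the real DEC
# RIM loader; they only show the shape of the toggle-in procedure.

class FrontPanel:
    WORD_MASK = 0o7777  # PDP-8 words are 12 bits

    def __init__(self):
        self.memory = {}   # sparse core memory: address -> word
        self.address = 0   # current address register

    def load_address(self, addr):
        """Set the address register from the switches (LOAD ADDR)."""
        self.address = addr & self.WORD_MASK

    def deposit(self, word):
        """Store the switch register at the current address (DEP),
        then auto-increment the address, as the real panel did."""
        self.memory[self.address] = word & self.WORD_MASK
        self.address = (self.address + 1) & self.WORD_MASK


def toggle_in(panel, start, words):
    """Deposit a sequence of words starting at `start`."""
    panel.load_address(start)
    for w in words:
        panel.deposit(w)


panel = FrontPanel()
# Placeholder "loader": four 12-bit words deposited starting at 7756 octal.
toggle_in(panel, 0o7756, [0o6032, 0o6031, 0o5357, 0o6036])
print(oct(panel.memory[0o7756]))  # first deposited word
```

After depositing the loader word by word this way, you would set the program counter to the start address and hit RUN, at which point the tiny loader could read the real bootstrap from paper tape.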

I have to say that this primitive hardware was, in its way, as satisfying to work on as the latest Core i7 laptop I'm writing this on -- despite the unreliability of the actual core memory in our building, which was next to a busy subway track. I suppose I did have positive feelings toward DEC, until I got to college and worked in a lab that stored its research data on RK05 disk packs.

In my experience -- which as you can probably tell is by now extensive -- there are two kinds of people: those who adapt readily to new stuff, and those who stubbornly stick with whatever they already know. And the older the cohort you look at, the greater the proportion of stick-with-what-you-knowers you'll find.

So the idea that you'll imprint *kids* on your technology is dubious. Yes you will imprint them, but it won't prevent them from switching to something else.

Comment Re: Well it's easy to show superhuman AI is a myth (Score 1) 255

The human population is composed of experts, with divisions of labor. It is not unreasonable for AI programs to have areas of expertise.

In fact this is not true. The human population is composed of generalists, some of whom have additionally acquired specialized skills through the division of labor.

Comment Re:Well it's easy to show superhuman AI is a myth. (Score 1) 255

Oh, and here's another example I just thought of.

I once read a book by an early aviator on techniques for navigating by landmark from the air. He recounts a number of feats of navigation by what were then called "primitive people", including one account of preparations for a raid by a group of 19th-century teenage Apaches on an enemy village. None of the boys had ever been there, so they sought out an elderly man who'd been there once when he was a boy. He described all the landmarks along the way - e.g., turn south at the hill that looks like such-and-so - a process that took almost two days, because the village in question was almost five hundred miles distant.

Now if a 19th-century Apache had devised an intelligence test, chances are you or I would score as mentally deficient. There's no way I could give turn-by-turn directions to a place I'd visited just once, thirty years ago. And even if I could, the chance that you could simply hear them and then go there without any difficulty is nil. We are simply too unfamiliar with, and unpracticed at, a task that was second nature to them.

Comment Re:Well it's easy to show superhuman AI is a myth. (Score 1) 255

You have to look again at how the tests are devised. Let's say you just invented an intelligence test. How do you know it's any good? You give it to a bunch of people and see if it confirms what you already believe about those people.

This culturally biases the tests in several ways. Let's say your test evaluates verbal and spatial mental performance. Naturally the verbal part will be biased toward not only native speakers of your language, but native speakers of your dialect. Then how do you weight verbal vs. spatial in your net score? That's a cultural assumption. Even if you decide to weight them equally, that's still a weighting: it represents a de facto judgment that neither is more indicative of intelligence than the other.

Then there's the stuff you don't include in your test, for example social reasoning and perception. Inferring other people's mental states and intentions is an extremely important aspect of intelligence, but it is also intrinsically culture-specific. Let's say you ask your neighbor whether you can borrow his car and he tells you it's broken. You know it's not broken. What can you infer from that? It depends on where you live. In the US you'd take it as a sign of disrespect, but in some cultures you'd infer that it would be inconvenient for him to loan you his car. Social perception and reasoning is one of the most important aspects of intelligence, but it is nearly impossible to get a culturally unbiased measure of it.


Australia Wants ISPs To Protect Customers From Viruses 68

An anonymous reader quotes Sophos's Naked Security blog: In a column in The West Australian, Dan Tehan, Australia's cybersecurity minister, wrote: "Just as we trust banks to hold our money, just as we trust doctors with our health, in a digital age we need to be able to trust telecommunications companies to protect our information from threats." A companion news article in the same newspaper cited Tehan as arguing that "the onus is on telecommunications companies to develop products to stop their customers being infected with viruses"...

Tehan's government roles include assisting the prime minister on cybersecurity, so folks throughout Australia perked up when he said all this. However, it's not clear if there's an actual plan behind Tehan's observations -- or if there is, whether it will be backed by legal mandates... Back home in Australia, some early reactions to the possibility of any new government interference weren't kind. In iTWire, Sam Varghese said, "Dan Tehan has just provided the country with adequate reasons as to why he should not be allowed anywhere near any post that has anything to do with online security."

The West Australian also reports Australia's prime minister met telecommunications companies this week, "where he delivered the message the Government expected them to do more to shut dodgy sites and scams," saying the government will review current legislation to "remove any roadblocks that may be preventing the private sector and government from delivering such services."

Comment Honored? How? (Score 2, Interesting) 29

Non-routine deleted data is often the most interesting data of all.

Furthermore, most databases do not actually delete records; they just flag them as "deleted". Such records might actually be overwritten when a compaction run is performed to consolidate free space into larger blocks - if such a run ever happens; the engine might simply reuse the space lazily. So how do we know what Google actually implements, even setting aside the fact that deleted data is often the interesting data?
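A minimal sketch of the soft-delete-plus-compaction pattern described above (in Python, with invented names; real database engines differ in the details, but the flag-then-rewrite shape is the same):

```python
# Toy record store illustrating soft deletes: delete() only sets a flag,
# and the bytes survive until (and unless) compact() rewrites the storage.

class RecordStore:
    def __init__(self):
        self._rows = []  # simulated on-disk rows: [key, value, deleted_flag]

    def insert(self, key, value):
        self._rows.append([key, value, False])

    def delete(self, key):
        """Soft delete: flag the row; the data is still on 'disk'."""
        for row in self._rows:
            if row[0] == key:
                row[2] = True

    def get(self, key):
        """Normal queries skip flagged rows."""
        for k, v, deleted in self._rows:
            if k == key and not deleted:
                return v
        return None

    def raw_rows(self):
        """What a forensic scan of the storage would see, flags and all."""
        return list(self._rows)

    def compact(self):
        """Rewrite storage, dropping flagged rows - only now is the
        deleted data actually gone."""
        self._rows = [row for row in self._rows if not row[2]]


store = RecordStore()
store.insert("alice", "secret")
store.delete("alice")
print(store.get("alice"))      # None: invisible to normal queries
print(len(store.raw_rows()))   # 1: but still physically present
store.compact()
print(len(store.raw_rows()))   # 0: gone only after compaction
```

The gap between the delete() call and the compact() call is exactly the window in which "deleted" data is still recoverable - which is why how (and whether) a vendor compacts matters.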

Comment Re:Well it's easy to show superhuman AI is a myth. (Score 1) 255

in the middle range, 90 - 110 points,

IQ tests are also unreliable at the tail ends, for epistemological reasons.

How do you construct an intelligence test? You start with a collection of reasonable-seeming tests and you have a sample population perform them. You then rank them on test performance and assess whether your ranking confirms your preconceptions. So here's the problem with the tail ends: it's really hard to get a large enough sample of subjects to test the predictive value of your test with people who score three or more standard deviations away from the mean.
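To put rough numbers on how thin those tails are - assuming the conventional normal model with mean 100 and SD 15, which is an assumption of this illustration, not a claim about any particular test:

```python
from math import erfc, sqrt

def fraction_above(z):
    """Upper-tail probability of a standard normal at z standard deviations."""
    return erfc(z / sqrt(2)) / 2

# IQ is conventionally normed to mean 100, SD 15, so IQ 145 is z = 3.
p = fraction_above(3.0)
print(f"fraction of the population above 3 SD: {p:.6f}")   # ~0.00135
print(f"expected such people in a 2,000-person norming sample: {2000 * p:.1f}")
```

With only two or three such subjects in a typical norming sample, there is simply no statistical power to validate the test's predictions out in the tails - which is the epistemological problem.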

So while you can probably make predictions about differences in accomplishments between someone who scores 90 on IQ and someone who scores 110, I don't think you can predict much from a difference in IQ between 150 and 170, other than that people with an IQ of 170 will likely consistently score higher on an IQ test.
