
Comment Re:Three independent teams found bug at same time (Score 2) 138

https://slashdot.org/~110010001000 protested:

It isn't possible all these people independently "discovered" a 20 year old flaw at the same time. Think about it. Google supposedly discovered it six months ago. I don't believe it.

Apparently you haven't heard of steam engine time. If Newton and Leibniz could (more or less) simultaneously, independently invent "the calculus", why can't three disparate security research teams (more or less) simultaneously, independently discover the same security bug?

Note, as another example from a third field, that both Jennifer Doudna's and Feng Zhang's teams (more or less) simultaneously, independently developed the CRISPR gene-editing technique, just a few years ago. This kind of thing happens more frequently than you appear to believe is possible.

Paranoia is its own punishment ...

Comment Re:Facebook Time Well Spent? (Score 1) 112

zifn4b proclaimed:

Facebook is the ultimate time waster. You've exceeded everyone's expectations in that regard. Nothing left to do there.


Social media is the ultimate time waster.

It's not just FB. Twitter (!), Snapchat, Instagram, or even /. They're all designed to capture their users' attention and keep them on-site, so that the corporations that own them can mine the living shit out of their data streams and sell the bulk data they harvest to advertisers, marketers, and basically anyone who's willing to pay for it.

The thing is, though, FB can be as useful or as valueless as you care to make it. I spend, on average, probably 20 minutes a day on FB. I'm a member of several writers' groups there. I've found much of value to me in them, including beta readers, editors, and cover artists who are also members, and documents and links to websites with quite useful advice on things like Amazon keyword selection (a much more subtle and Byzantine discipline than anyone outside of the field might suspect), reviewer blogs, promotional services (of which a handful are worthwhile and a horde are merely scams - or might as well be, for all the good they do the indie authors who employ them), and many other topics that only writers give a damn about.

I also use it to check in with (and up on) old friends, read the latest Bloom County posts (Berkeley Breathed restarted the strip a couple of years ago. It's still every bit as wise, silly, compassionate, and funny as it ever was, too.), occasionally put items up for sale, and so forth. The key, though, is that I have an agenda whenever I go there. I ignore my "feed", because it is the principal time-wasting feature of FB. Instead, I do what I logged in to do, and then leave.

I also use NoScript's ABE feature to keep FB and other social sites from tracking me around the web, and I rely on Better Privacy to help take care of supercookie BLOBs (which is one of the main reasons why I haven't upgraded to the latest version of Firefox - because Mozilla has decided I won't be allowed to use Better Privacy if I do).

It basically comes down to self-control (something that Americans, in particular, are unskilled in and resistant to), and not taking the path of least resistance as an anodyne to boredom. When I have time on my hands, I play guitar, for instance. It's a helluva lot more satisfying than reading my Facebook feed ...

Comment Good news for the rest of us? (Score 5, Insightful) 225

From TFS:

The people who have left were responsible for collecting and analyzing the intelligence that goes into the president's daily briefing.

Daily intelligence briefings for the Chief Executive used to be a vitally important component of policy formulation. Then President Chump was sworn in, and suddenly they became completely irrelevant, because they bored him. He refuses to read or even listen to them, even when they mostly contain brightly-colored graphics, videos, and other visual elements designed to appeal to the functional-illiterate-in-chief. They've also been tailored to avoid topics, such as the latest intelligence on Russian psyops interference in the 2016 election, that push the Orange Oaf's buttons. (Let me point you to an alternative citation, because the Washington Post article may be paywalled for those who don't know how to use private browsing and cookie deletion to get around it.)

Think about how you'd feel if you had dedicated your career to producing detailed, highly-nuanced, daily reports on a whole range of intelligence topics for the most powerful national leader on the planet - only to discover that the new guy is completely uninterested in any information that can't be expressed in crayon drawings and bumper sticker catchphrases. Now throw in civil servant wages, and ask yourself whether that job would be in any way attractive to you.

Yeah - it's like that.

That's why they're leaving ...

Comment You don't know the half of it ... (Score 4, Informative) 294

https://slashdot.org/~fluffernutter observed:

You have to actually make a monorail do something for which there is no alternative transportation. The Vancouver Skytrain is actually the most efficient way to get across the city, so they get 117.4 million passengers in 2010 and 137.4 million in 2016.

We lived in Vegas when the monorail was built. There was, as you might imagine, a lot of coverage of the proposal, the construction of the track, and the grand opening of the line.

Of course, the coverage by the major dailies and the local media was mostly of the cheerleading kind. The alternative weeklies did a better job, but it didn't keep the deal with the Clark County supervisors from being made mostly behind closed doors. (The Strip, proper, lies entirely outside the City of Las Vegas, so the Vegas city planning commission, city council, and mayor had no seat at the table.)

What it boiled down to was that a private, non-profit (!) corporation formed by the casinos where the train actually has stations floated the bond for design and construction, with the voters on the hook to repay it - a typical Vegas kind of backscratching deal. If you didn't kick in, you didn't get to take advantage of the monorail traffic. Of course, since it was the big casinos financing it, one of the conditions they imposed was that it run behind them, so that patrons would have to walk through the gaming floor of each stop on their way to and from the train.

McCarran International Airport management took one look at the proposal and said, "No, thanks." (It would have required McCarran to donate, get permits for, and clear the land across which the track would run, and build a terminal station, too - all at no expense to the hotel-casino operators who would gain the only real benefit from it. I thought McCarran's decision showed surprising common sense, under the circumstances.)

So that's why it doesn't run to the airport - or to the actual Strip - or stop at more than a handful of big casino properties. And, likewise, that's why it's an abysmal failure.

Vegas, baby ...

Comment Re: Surprised they lasted this long. (Score 1) 193

dskoll disagreed:

Not just teenagers. My BF and I love going out to movies and we are in our 50s. Getting rid of communal entertainment spaces will make people even more isolated and less engaged than before, continuing the atrophication of social skills kicked off by smartphones.

I used to love going to movies. Then, beginning in the 1990's, theater chains decided to stop enforcing basic movie-going civility - for fear that they would drive their teenage customers away.

Since then, going to see movies has become a more and more tooth-grindingly irritating experience for me. Finally, about 2007, I decided I'd had more than enough of assholes carrying on conversations with their friends, shouting advice to the characters on the screen, and (the very last straw) taking and even making calls on their cell phones. I'm sure smartphones have made the problem even worse in the intervening time.

I'm sorry to have lost the experience, but I simply can't accept spending however much money a movie ticket now costs to have my experience utterly ruined by narcissistic jackasses who don't know or care how negatively their oafish behavior impacts other patrons. And they're right that it's okay - because theater owners have allowed it to become okay. That sucks balls.

So now my wife and I watch movies and TV shows on the 55-inch flat screen TV in our living room, with our 7.1 sound system handling audio duties (and it's not one of those weenie, little deals with the juicebox-sized surround speakers, either - our mains have 15" woofers, and the surround and rear speakers are bookshelf speakers with 8" woofers that I'd've been happy to have as my mains back in my 20's), and we've learned not to miss being part of a larger audience.

But I definitely remember what it was like to sit in the dark, surrounded by strangers, all of whom were as enthralled as I was by the spectacle on the screen. Hell, I recall attending the Cinerama premiere of 2001: A Space Odyssey in Honolulu (one of only six theaters it played in for its first week in domestic release). That was a freakin' magical experience - as was seeing Star Wars: A New Hope on its first night at the Oakland Paramount Theatre, back before they carved it up into a multiplex. Hell, even seeing Koyaanisqatsi at the UC Theatre in Berkeley was a blast, and that place was always a hole.

Okay - I do miss being part of an audience. But I'll never go back to the multiplex, because I simply can't lose myself in the moviegoing audience experience when every teenage dimwit in the crowd is doing his/her level best to take me out of that experience ...

Comment Re:Lots, actually ... (Score 1) 91

AmiMoJo noted:

Gotham isn't bad either.

I forgot about Gotham. Yes, it veers from canon - but it's a very well-written, well-cast, well-imagined series that does real justice to its huge (and growing) cast of characters.

Thanks for reminding me ...

Comment Re:Surprised they lasted this long. (Score 1) 193

SuricouRaven theorized:

The only reason I think cinemas exist at all is for people who want to watch new releases rather than wait for them to come out on disc.

Have you actually been to a multiplex recently?

The principal reason for cinemas to exist is to provide a place for teenagers to take their dates.

Sure, there are families who come to see PG stuff on weekends, but otherwise it's adolescents all the way down ...

Comment Re:"Average Reader?" (Score 3, Informative) 99

rock_climbing_guy mused:

I'd like to know what they were smoking when they said the average reader reads 12 books a year. How many people read even one?

I believe she says "the average reader" as distinct from "the average person". The average person - at least, the average person in the USA - barely reads at all. (Hardly surprising, given the American education establishment's devotion to the "whole word" approach to teaching new readers.) The average reader, by contrast, probably does read a book a month. They're the folks the Kindle store was created for.

Of course, half of those books are romance novels - the most popular fiction genre by a long margin. Mysteries are next, then science fiction and fantasy. (And there's not a lot of science in most of what gets categorized as science fiction nowadays, either, so lumping it in with fantasy is not necessarily inappropriate.)

Full disclosure: I'm a writer by trade and these details matter to me, so I pay attention to them. Most people couldn't care less.

FWIW - when I was a kid, I'd consume up to 10 novels a day. I was determinedly unathletic in those days - and still am - so I did little else until I reached puberty. Then my reading consumption dropped pretty steeply ...

Comment Lots, actually ... (Score 4, Insightful) 91

CaptainDork snorted:

... try me again next year.

Movies: Colossal, Dave Made a Maze, Atomic Blonde (despite the critics' naysaying), and Me and Earl and the Dying Girl, just for starters. All excellent in their very different ways.

TV: Legion, Dirk Gently's Holistic Detective Agency (season 2 - or "series 2" in Brit-speak - was even crazier than season 1), Your Pretty Face Is Going to Hell, Ken Burns' The Vietnam War, Marvel's The Defenders, The Orville (uneven, and it suffers from some pretty lame scriptwriting, but I expect it to improve in future seasons, as Seth MacFarlane shows always do), and BBC's The Alternativity (I've only seen the doc, not the performance that goes with it), off the top of my head. I'm sure I could think of more, if I tried.

Neal Stephenson's The Rise and Fall of D.O.D.O., and his three-volume masterpiece The Baroque Cycle (not new for 2017, but the best thing he's ever written, IMnsHO). I could go on here, too, but I'm being called away for Xmas brunch.

Cynicism and snarkiness are not nearly as hip - or as entertaining - as you might believe ...

Comment Re: Show me the videos (Score 3, Insightful) 79

Tomahawk demanded:

Point a camera through the lens and show us what it looks like through the glasses, not a rendered-image slideshow.

So I both looked at the Magic Leap web page - which features the "slideshow" you're complaining about - and read the rather long Rolling Stone article to which TFS points.

Yes, I know. Very un-slashdotty of me. I am obviously "not of the Body".

Reading the article first (including the bit where the author, obviously parroting a recorded statement from the interviews he conducted during his tour, talks about "a ray" of photonic computing structures, which makes it plain that he has no fucking clue about chip design and fabrication) greatly helped me to visualize what ML was trying to present on their home page. The page alone was certainly not at all impressive, but the Rolling Stone reporter's description of his experience with the beta ML1 - and especially the interactive quad sound that tracks virtual objects in the headset wearer's field of view - makes it pretty clear that a video "shot through the goggles" wouldn't necessarily convey that experience a whole lot better than the "slideshow" does. It would, however, put a huge demand on their servers, and probably be laggy as all hell the day they announced their forthcoming product, neither of which would be positives from the perspective of a company that's gone from stealth mode to full visibility on the web in a single announcement.

I do recommend the article, despite its shortcomings (some of which are a consequence of the NDA provisions under which the author labored). The product itself, and the technology ML has created to make it possible, are, in fact, potentially game-changing for interactive computing - albeit probably not in the short term. It's pretty clear that the ML1 will be strictly for developers and rich fucks who can afford to drop the price of a collectable guitar on what will essentially be a toy. The second and third generations are where the real effect on general computing will occur (if at all), after the initial capabilities of the device are seriously enhanced and the price drops from nosebleed territory to something at least marginally affordable to the masses.

That said, it seems like a reasonable bet that they'll make it that far. The founder put up a huge amount of his own money to get the company to the point where they had the tech taped down well enough to present it to Google, et al., and they've apparently been pouring cash into ML ever since. It's clearly not a scam, because you don't build a production-level chip fab just to hoodwink the rubes. And ML has constructed such a fab in the basement of their headquarters.

Rainbow's End might not be that far away, after all ...

Comment Re: From whence came the Internet ... (Score 5, Informative) 278

IMightB inquired:

Do people not remember the origin of "The Internet"? It started as a Defense Project to ensure communications in the event of a nuclear war... They opened it up to universities, and then to the public. Back then they did a fairly decent job of being hand-off. It wasn't until they turned over to private corps, that it started to go downhill.

As it turns out, that's a common belief - and it's wrong.

While it's true that a 1962 RAND Corporation white paper authored by Paul Baran theorized that a packet-switched data network could allow military communications to survive a general nuclear war, that was entirely a thought experiment. The Department of Defense filed it away and largely forgot about it.

It wasn't until 1965, after accepting a position at ARPA (as DARPA was then called), that an electrical engineer named Robert W. Taylor first got the idea for what would eventually become first the ARPAnet and then the Internet.

As a condition of the ARPA grants that helped fund their experiments, research teams at three different major research centers were required to install remote terminals at ARPA for their - entirely separate and self-contained - multi-user mainframe systems. These were the first computers to operate interactively, rather than in what mainframers call "batch mode", and to support multiple, concurrent user sessions via dumb terminals with line printers as their "displays". One of Taylor's assignments was to monitor and liaise with the scientists who built and ran this trio of individual experimental systems, and he quickly noticed that something very like what we would think of as newsgroups spontaneously appeared on all three systems. (That is to say, computer scientists who held accounts on all three of these separate, entirely unconnected systems had each decided that something very much like a computer BBS or Usenet-style messaging system would be a useful addition, and had - again, independently - hacked such a tool together, so that the users of each system could communicate with one another in a way that had some degree of persistence and was accessible to that particular machine's entire user community.)

The fact that users on each system had more-or-less-simultaneously decided such a tool was desirable, and had developed code to create it - and we're talking three different sets of code here - without ever communicating with the other two teams greatly interested and excited Taylor. He immediately wondered what would happen if all three systems were physically connected together in a way that would allow their users to communicate not only with each other, but with users on the other two systems, as well. He took that idea to his supervisor, one Charles Herzfeld, who thought it might have merit. Herzfeld asked Taylor to draw up a formal proposal, and committed, sight unseen, to fund it to the tune of a million dollars (which was real money in 1965).

So Taylor wrote a proposal, and with a million bucks to spend on it approached the managers of the three, separate multiuser systems with his idea to interconnect their systems. All three turned him down flat.

Robert W. Taylor was from Texas, where they grow 'em stubborn, so he kept pitching his idea to the three system managers, despite their continued objections that they saw no merit in the proposal, and that they considered it a potentially major distraction from the purposes for which their disparate systems had been created. Eventually he wore them down to the point where two of them agreed to at least test the idea. It took nearly two years from then before all the ducks were duly aligned, the necessary equipment designed and built, and the long-distance, dedicated telephone lines contracted for.

At 22:30 hours on October 29, 1969, the first two nodes of what was dubbed the ARPAnet - at UCLA and the Stanford Research Institute - began exchanging data packets. The first two characters exchanged between them were successfully received. The system crashed when the third was sent.

But there was no turning back from there.

Taylor left DARPA to take a job as director of information systems at the University of Utah - which became the network's fourth node. He went on to be the founding director of computer research at Xerox's Palo Alto Research Center - whose network-centric, GUI-driven Alto computer was the direct inspiration for the Apple Macintosh. (I say that because Steve Jobs signed an NDA to get a look at the Alto in 1979 - and started development on the Macintosh after Xerox chloroformed the Alto project.)

I know these things because of a long phone call Taylor made to me after I said the same thing you did - except I said it in my column in Boardwatch Magazine, to which he subscribed. Throughout that conversation, Mr. Taylor insisted that the network of computers he had envisioned and brought into being while at ARPA was a pure research project, with no connection at all to Paul Baran's thought experiment, other than being based on the packet-switched networking idea Baran had imagined but never actually pursued.

Our conversation sparked me to write what became the cover story for the December, 2000 issue of Boardwatch. You should read it ...
