
Comment Well, that's embarrassing... (Score 5, Informative) 54

Hi folks, I'm Jason, the guy who found this bug.

I feel kind of embarrassed this is on the front page. I like to think that I spend time doing cooler things than reading PHP, let alone the source of random Wordpress plugins. My brother lives at the south pole and has a pretty damn cool blog about it (yay, more linkspam!), but the NASA satellite only flies overhead a few times a day, and bandwidth is pretty limited, so he asked me to help with some maintenance, and in the process I noticed this. But now the Intertubes have me pinned as a Wordpresser, alas. I guess that's just how it goes.

Anyway, my feeling on this is basically, to put it in /. terms -- "Random Wordpress plugin has gaping security hole... news at 11!" If you want a reasonably secure Wordpress rig, it's probably best to stick with plugins and themes put out by Automattic.

It wasn't mentioned in the linked article, so it's worth noting here -- I think the best remediation, until W3 Edge releases a fix (he's on Christmas vacation now or something, I think), is to either disable the plugin entirely or, if that's not a possibility, just disable the object cache and database cache, and then empty all caches. Doing that should at least close this hole.

-- Jason

Comment Q: What in 1 shouldn't lead to 2? (Score 1) 88

1: "The suspect in the spy data theft worked for the NDB, or Federal Intelligence Service, which is part of Switzerland's Defense Ministry, for about eight years."
2: "He was described by a source close to the investigation as a "very talented" technician and senior enough to have "administrator rights," giving him unrestricted access to most or all of the NDB's networks, including those holding vast caches of secret data."

A: "for about eight years" --> "unrestricted access to most or all of [...] vast caches of secret data"

Eight years? That's it? Really?

Comment Some Simple Wiresharking... (Score 5, Informative) 359

It appears to just be PPTP, with the credentials generated dynamically from a JSON HTTP endpoint. There's a required ping URL the client has to hit every 900 seconds in order for the credentials to stay alive. Shouldn't be too hard to make an open source client.
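The credential flow described above could be sketched roughly like this (the endpoint URL and JSON field names here are hypothetical stand-ins -- the real ones would come out of the Wireshark capture):

```python
# Sketch of the observed flow: fetch dynamic credentials from a JSON
# endpoint, then hit the ping URL every 900 seconds to keep them alive.
# CONFIG_URL and the field names are hypothetical placeholders.
import json
import time
import urllib.request

CONFIG_URL = "https://example.invalid/config.json"  # hypothetical
PING_INTERVAL = 900  # seconds, per the observed client behavior

def parse_config(raw_json):
    """Extract PPTP credentials and the keepalive URL from the config blob."""
    config = json.loads(raw_json)
    return config["username"], config["password"], config["ping_url"]

def keepalive(ping_url, interval=PING_INTERVAL):
    """Hit the ping URL periodically so the credentials stay alive."""
    while True:
        urllib.request.urlopen(ping_url).read()
        time.sleep(interval)
```

An open source client would just hand the parsed credentials to a stock PPTP implementation and run the keepalive loop alongside it.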

Interestingly, the JSON config endpoint contains a list of IP address ranges that should be excluded. Haven't started investigating these yet, but here are the ranges if anyone wants to look:

  • 8.34.208.0/20
  • 8.35.192.0/20
  • 64.15.112.0/20
  • 64.233.160.0/19
  • 66.102.0.0/20
  • 66.249.64.0/19
  • 70.32.128.0/19
  • 72.14.192.0/18
  • 74.125.0.0/16
  • 89.207.224.0/21
  • 108.59.80.0/20
  • 108.170.192.0/18
  • 108.177.0.0/17
  • 142.250.0.0/15
  • 172.217.0.0/16
  • 173.194.0.0/16
  • 173.255.112.0/20
  • 193.142.125.0/24
  • 199.192.112.0/22
  • 207.223.160.0/20
  • 208.65.152.0/22
  • 208.117.224.0/19
  • 209.85.128.0/17
  • 216.58.192.0/19
  • 216.239.32.0/19
  • 67.63.48.0/20
  • 77.247.182.224/28
  • 216.185.96.0/19
  • 65.60.0.0/18
  • 69.175.0.0/17
  • 198.143.128.0/18
  • 173.236.48.0/24
  • 184.154.151.0/24

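For anyone poking at these, a quick sketch with Python's stdlib ipaddress module makes it easy to test whether a given address falls inside the excluded set (only a few of the ranges are repeated here for brevity):

```python
# Check whether an address falls in one of the excluded ranges.
# Only a subset of the ranges from the config is shown here.
import ipaddress

EXCLUDED = [
    "8.34.208.0/20", "64.233.160.0/19", "66.249.64.0/19",
    "74.125.0.0/16", "172.217.0.0/16", "209.85.128.0/17",
    # ... remaining ranges from the list above
]

NETS = [ipaddress.ip_network(cidr) for cidr in EXCLUDED]

def is_excluded(addr):
    """Return True if addr falls inside any excluded range."""
    ip = ipaddress.ip_address(addr)
    return any(ip in net for net in NETS)
```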
Comment Don't turn students into drones. (Score 0) 120

At the recommendation of another comment, I just watched the demo video for jupitersis. I was horrified. One unified solution for students, teachers, and parents to check grades, assignments, discipline, schedules, and more.

My objection is that there is something to be learned from the good old-fashioned way. Students need to learn to write down or remember their assignments and to keep track of how they're doing in class by themselves. Maybe that means a motivated student will build his own gradebook. Maybe that means other students will learn to manage a planner. And maybe other, less dull students will develop a memory inside their brains. Whatever the means, this classic exercise of remembering grades and assignments is a cornerstone of childhood education.

And then there's the discipline issue. A screenshot in that demo video showed a nice citation webpage with checkboxes like "[x] gum/food [x] violence [x] talking". This bureaucratic approach only serves to depersonalize discipline and prematurely convert youthful chaps into the office drones that many will become. When students act up, parents and teachers should be aware of it on a personal level, and monitor for any deeper issues that can be dealt with sensitively.

The discipline chart also had boxes for "[x] email parents [x] notify parents on login [ ] notify dean". The automation of informing parents is frightening. There's something to be said about not knowing whether the parents know, when a student gets in trouble at school. There is a healthy guilt associated with it, one that can translate into a productive dinner table conversation. It also educates the student's human sensibilities -- he learns to gauge who is likely to inform whom, and at what time, and becomes sensitive to all the small human factors that make the world go round.

If anything, "solutions" like jupitersis only serve to raise children for a 21st-century bureaucratic totalitarian society; even if that is the state of the world, children should at least have their childhood.

As for the question in TFA, which I believe was about how teachers should manage their grades, each teacher should manage them by themselves. No need to subject teachers to any more needling paperwork. Some teachers like old-fashioned grade books with the nice rows and columns. Other teachers are Excel aficionados. Let the teacher do what's most organic to his or her teaching style.

Moral of the story: technology is not the solution to childhood education.

Submission + - How not to respond to vulnerabilities in your code (launchpad.net)

zx2c4 writes: "I found some gigantic vulnerabilities and wrote 5 different exploits for Calibre's mount helper. The dev was stubborn and belligerent, and over the course of playing whack-a-mole with the half-baked fixes he rolled out, this bug report became an excellent example of how *not* to respond to a security vulnerability. The story made the top of reddit — http://www.reddit.com/r/programming/comments/lzb5h/how_not_to_respond_to_vulnerabilities_in_your_code/ — and bounced around Twitter. Please don't quote this blurb here in the /. article verbatim — I'm sure you can come up with something a bit more articulate. But I felt like Slashdot needed to hear about this"

Comment Server Execution is the Issue (Score 5, Informative) 70

Most quality web hosting provides customers with shell access to the web server, or, in cases where it doesn't, something like PHP is usually installed that allows for arbitrary code execution.

On a web server that hosts a few thousand sites, using the Bing IP search, you can find a list of all the domains. Usually there will be some low-hanging fruit that's easy enough to pluck. Or, if you can't get shell access through a front-facing attack, you can always just sign up for an account with the hosting company yourself.

So once you have shell, it's a matter of staying a few steps ahead of the web host's kernel patching cycle. Most shared web hosting services don't use expensive services like Ksplice and don't want to reboot their systems too often due to downtime concerns. So usually it's possible to pwn the kernel and get root with some script-kiddie-friendly exploit off exploit-db. And if not, no doubt some hacker collectives have repositories of properly weaponized, unpatched 0-day exploits for most kernels. And even if they do keep their kernel up to date and strip out unused modules and the like, maybe they've failed to keep some [custom] userland suid executables up to date. Or perhaps their suid executables are fine, but their dynamic linker suffers from a flaw like the one Tavis found in 2010. And the list goes on and on -- "local privilege escalation" is a fun and well-known art that hackers have been at for years.

So the rest of the story should be pretty obvious... you get root and defeat SELinux or whatever protections they probably don't even have running, and then you have access to their NFS shares of mounted websites, and you run some idiotic defacing script while brute-forcing their /etc/shadow, yada yada yada.

The moral of the story is -- if you let strangers execute code on your box, be it via a proper shell or just via php's system() or passthru() or whatever, sooner or later if you're not at the very tip top of your game, you're going to get pwn'd.

Comment Before anyone gets ahead of themselves... (Score 5, Informative) 283

I'm a good friend of John, the blog post author, and have been working with him throughout this process in trying to unravel Hamstersoft's deceit. I want to make a few things pretty clear:

Yes, they posted a zip of code on a hard-to-find link. But they did something sneaky. They included the very short and trivial C# wrapper around Calibre, but they only included a compiled (well, .NET DLL) binary blob of the bulk of the application code -- the user interface. And of course, since all the heavy lifting is in Calibre itself, this code is the most important part of the application. They went to pains to strip out the source of the UI components and publish them only in compiled form. They even packaged it up in a nice Visual Studio solution, so that you can load it up, hit "compile," and get the software. It looks, at first, like they've complied. But then you dig into the source code actually provided, and it becomes obvious that they haven't provided the majority of the code at all -- only the wrapper code and a few call-outs to the provided compiled DLL.

Cheap trick.

The other thing to take notice of in John's post is that in fact the search engines and Facebook have hardly complied -- there are still search results and Facebook pages for this company. Now, you can debate and troll and bikeshed and argue the validity and ethics of the DMCA all you want, but the fact of the matter is that when the big companies want to use it against the small, it seems to work, but when some OSS devs want to take the case up with giant companies, the response is exceedingly lackluster. (Hopefully, this being on /. will change things...)

The final point to consider is what this all means for the GPL and OSS. Hamstersoft is Russian, so good luck with a lawsuit or anything. But at the very least, shouldn't the OSS community have an army of lawyers willing to work pro bono, or financed by various foundations, for exactly this kind of thing? John mentioned he tried contacting one such organization and was unsuccessful. He's told me that at another point, he got in contact with a lawyer from another place who didn't offer to do any work for him but vaguely suggested he send these notices to Google, Facebook, etc. That's pretty lackluster. I don't want to complain too loudly, but instead I just want to suggest that this issue call our attention to the bigger problem -- what institutions do we have in place, as small OSS devs, to protect OSS software effectively? Do such institutions work? In this case, thus far, they don't seem to be working.

Comment Re:Search engines? (Score 1) 4

Actually that's not true. With the AJAX Crawl Spec, search engines can read AJAX pages.

http://code.google.com/web/ajaxcrawling/docs/getting-started.html

Basically, the crawler rewrites domain.com/#!/someajaxstate to domain.com?_escaped_fragment_=someajaxstate, and then it's the responsibility of the server to render it statically. PhotoFloat uses HtmlUnit to do server-side JavaScript execution. I wrote about it here: http://blog.zx2c4.com/589 .
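A rough sketch of that rewrite, as a crawler might perform it (this is my own illustration of the mapping, not code from the spec):

```python
# Illustrative sketch of the AJAX crawl rewrite: turn the #! fragment
# into the _escaped_fragment_ query parameter a crawler would request.
import urllib.parse

def crawler_url(pretty_url):
    """Rewrite domain.com/#!/state into the _escaped_fragment_ form."""
    base, sep, fragment = pretty_url.partition("#!")
    if not sep:
        return pretty_url  # no hash-bang, nothing to rewrite
    joiner = "&" if "?" in base else "?"
    # Per the spec, special characters in the fragment get %-encoded.
    return base + joiner + "_escaped_fragment_=" + urllib.parse.quote(fragment, safe="")
```

The server then sees a plain query-string request it can answer with statically rendered HTML, without the crawler ever executing JavaScript.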

Submission + - Client-side JavaScript to Replace Server-side HTML (zx2c4.com) 4

zx2c4 writes: "I've recently finished writing a simple photo gallery web application that scans a directory tree of photographs and generates static JSON files and thumbnails. There is then an accompanying web page that consists of a single index.html with some heavy JavaScript that fetches the JSON files and writes the layout of the page. You can navigate various pages and switch between different views, all without loading a different HTML page, because the information is downloaded from the JSON files via AJAX. The app uses hash URLs to mimic navigating through a normal web page. It's all very similar to how GMail works, really. So, we've all seen AJAX used in a low key way at a zillion places around the Internet. But what I'm wondering is — do you suppose that the future of web applications will be in doing all of the page structure in client-side JavaScript, and that servers will only serve up the static index.html/scripts.js/styles.css and then a bunch of (dynamic or static) JSON files? Are the days of having a server dynamically write the actual HTML over? Do you expect to see nothing but JavaScript apps doing all the display for JSON data? Do websites still have a responsibility to display without JavaScript as a requirement, or have we all got to accept that JavaScript is here to stay, and will be in the future responsible for most HTML writing?"
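The server-side half of such an app can be surprisingly small. A toy sketch of the scan-and-generate step (the field names here are illustrative, not the actual PhotoFloat format):

```python
# Toy sketch of the generator: walk an album directory and emit one
# static JSON index that client-side JavaScript can fetch via AJAX.
# The JSON field names are illustrative, not a real gallery format.
import json
import os

def build_album_index(album_dir, out_path):
    """Scan a directory of photos and write a static JSON index."""
    photos = sorted(
        name for name in os.listdir(album_dir)
        if name.lower().endswith((".jpg", ".jpeg", ".png"))
    )
    index = {"album": os.path.basename(album_dir), "photos": photos}
    with open(out_path, "w") as f:
        json.dump(index, f)
    return index
```

Once the JSON is on disk, the server's job reduces to serving static files; all the page assembly happens in the browser.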

Comment Quality? (Score 1) 69

The big question for me is this--

The download link only allows you to get the encoded FLV file. Does this mean they failed to store the originals? And if so, does that mean YouTube would be serving up the old-fashioned, low-quality H.263 FLV encodes? If that's the case, we'd be much better off _not_ using the auto-move service, as YouTube encodes at much higher quality than Google Video did.

Or, did they just not want us to be sucking their bandwidth by allowing us to download the original footage, but they'll happily transfer it in-house over to YouTube?

Anyone have any pointers?
