Submission + - Ask Slashdot: where can I publish my FLOSS-related tutorial to reach the most readers?

d33tah writes: I wrote a tutorial that describes how to get started with a newly popular open source security tool. I put quite a lot of effort into writing it, and then I realized that my blog isn't the right place to publish such an article if I want it to actually reach a bigger audience, unless my few readers start sharing it like crazy. Since I hadn't published it yet, I contacted lwn.net and heard that they're not really interested in tutorials. Which other venues should I try? I don't care about getting paid for the article; I just want attribution.

Comment What's their endgame really? (Score 1) 256

Sometimes I wonder what kind of world the MAFIAA actually imagines. I prefer to assume it's somehow coherent, given how much they must have invested in developing this obviously wrong philosophy. So here's the question: suppose they actually succeeded and Google went bankrupt. Given that their goal right now is obviously to harm the way we look for information, is there any other system they propose in place of the current one?

Comment Don't bother trying Btrfs. (Score 1) 42

"I am using OpenSUSE 13.1 right now with ext4 partitions, and I am pondering migrating to OpenSUSE 13.2 with btrfs, or simply updating the distro with 'zypper dup' and keeping my ext4 filesystems.

If you are using btrfs, what has been your experience? Better performance? As stable as ext4?"

You can't really tell how much disk space you have left (especially if you use compression and snapshots), overfilling the drive can leave you in a situation where you basically can't do anything other than reformat the filesystem, and in my personal experience support for directories with huge numbers of files is much worse than in ext4. My advice: don't bother.
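To illustrate the space-reporting problem: plain df gives you one number, while btrfs tracks data, metadata and system chunks separately, and it is the latter view that tells you when you are about to hit the wall. A rough sketch using standard btrfs-progs commands (exact output varies by version):

# What most tools look at -- one number, nearly meaningless with compression and snapshots:
df -h /

# What btrfs itself reports -- separate data/metadata/system allocations:
sudo btrfs filesystem df /
sudo btrfs filesystem show /

It is entirely possible for metadata to fill up while df still shows gigabytes free, which is exactly the "can't do anything" situation I mean.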

Submission + - Google Code-In 2014 and Google Summer of Code 2015 announced (blogspot.com)

d33tah writes:

A call to all students: if you have ever thought it would be cool to write code and see it make a difference in the world, then please keep reading. We are excited to announce the next editions of two programs designed to introduce students to open source software development, Google Summer of Code for university students and Google Code-in for 13-17 year old students.


Submission + - Internet Census 2014 to come?

d33tah writes: According to the project's website, the Internet Census 2012 researchers are crowdfunding their next internet-wide scan via Bitcoin donations.

"We are working on a vast and ground-breaking census, this time we hope to do it legally. Please help us make this happen by donating bitcoins to: 1tUCEnTyKzWrTBn1tgruSRkfahGUhxHcq"

Internet Census 2012 was the largest complete scan of the entire internet with its results made publicly available. It included traceroute information, port scanning, service and OS fingerprinting, and more.
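For context, the per-host data in the census is roughly what a stock nmap run collects; something like the command below (purely an illustration of the data categories -- the census used its own distributed tooling, not this invocation, and the target here is a documentation address):

# SYN scan of the first 1000 ports, with service/version probes, OS fingerprinting
# and a traceroute:
sudo nmap -sS -sV -O --traceroute -p 1-1000 192.0.2.1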

Comment Re:OS Fingerprints! (Score 1) 32

"Apparently the researchers didn't analyze OS fingerprints at all."

"Did you look into their paper? This is apparently not true. They focused on the ICMP data set, but also looked into others, in particular the service probes that you mentioned. One of their validation sets uses that data set."

Okay, point taken about the service fingerprints, but I still see no mention of the OS fingerprints. If they had looked at the data format that is actually there, they could have gotten much more out of the set. (They would also have found more mess, by the way, since some weird bug destroyed quite a few samples there.)

Comment OS Fingerprints! (Score 3, Interesting) 32

Apparently the researchers didn't analyze OS fingerprints at all. There is some metadata that the original researcher(s) forgot to remove (along with a lot more mess). The service fingerprints are interesting as well. I did a lot of research on this data set, and I have to say that, while messy, it is also a really amazing data set. This article is IMHO biased.

Comment Re:Still abusive (Score 1) 511

Hashing is just not going to help there. The DNS domain space is so small that it could probably be bruteforced offline, not to mention harvested by web crawlers. You can download the rDNS for the whole internet as of 2012 from the Internet Census 2012 data set for free (http://internetcensus2012.bitbucket.org/paper.html). While that is reverse DNS rather than forward DNS, I would expect a very high match rate just by hashing it all. Definitely feasible.
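A rough sketch of the attack, assuming plain unsalted SHA-1 and a hostname list already extracted from the rDNS dump (the file names here are made up):

# Build a hash -> hostname lookup table from the rDNS names:
while read -r host; do
    printf '%s %s\n' "$(printf '%s' "$host" | sha1sum | cut -d' ' -f1)" "$host"
done < rdns_names.txt | sort > rainbow.txt

# Match it against a list of "anonymized" hashes, one per line:
sort hashed_domains.txt | join - rainbow.txt

Even adding a salt only slows this down, since the candidate list is small enough to re-hash once per salt.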
