
Comment Re:Good to see things like this. (Score 1) 135

Map the human genome with a parallel database. The only "downtime" would be sequencing, but query times to test for different factors in a particular splice would be relatively short. The downside would be the amount of space required to group and tie together the relevant data. Something like this would probably be a start, which I still haven't gotten around to releasing in its entirety yet, given that I don't have much free time nowadays.
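As a minimal sketch of the idea above (not the poster's actual system; the shard names and the motif query are made up for illustration), fanning one query out across all shards at once means total query time is bounded by the slowest shard rather than the sum of them:

```python
from concurrent.futures import ThreadPoolExecutor

def find_motif(shard, motif):
    """Scan one shard (e.g. one chromosome's sequence) for a motif and
    return every offset at which it occurs, overlaps included."""
    hits, i = [], shard.find(motif)
    while i != -1:
        hits.append(i)
        i = shard.find(motif, i + 1)
    return hits

def parallel_query(shards, motif):
    """Run the same query against every shard concurrently and keep
    only the shards that actually contain the motif."""
    with ThreadPoolExecutor() as pool:
        results = pool.map(lambda item: (item[0], find_motif(item[1], motif)),
                           shards.items())
    return {name: hits for name, hits in results if hits}
```

With real sequence data each shard would live on its own node, but the fan-out/merge shape is the same.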

Comment Good to see things like this. (Score 2) 135

As a data analyst/software engineer, it makes me glad to see strides like these actually being made to ensure that both data and software will eventually start being designed properly from their inception. To have a single-cluster database with anything more than a few thousand entries is nothing short of incompetence, and I believe anyone who does this should be publicly shamed and flogged. When dealing with excessively large amounts of data, it quickly becomes a necessity to have a parallel database design to ensure that searches aren't hampered by long query times. It genuinely thrills me to see someone else use this kind of design, so when I put out numbers on my end, maybe my results won't seem as fantastical or unbelievable. Even though I don't know you personally, keep up the good work, Todd.

Comment Same process I've been using for about 4 years. (Score 1) 61

http://tot-ltd.org/techinf.html

NSRL is also a pretty good place to get a comprehensive whitelist from. Best of all, the whitelist database is free and is used for forensic file analysis. The only mildly difficult part is keeping up with the release of new malware, but that's why I implement several other databases, including one based on API calls in known hostile applications. The really interesting thing with API groups is that you can identify which family a new piece of malware most likely belongs to. So far, I've had no false positives on whitelisted files checked against the API database. ( http://www.tot-ltd.org/API )
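As a rough sketch of the two-stage approach described above (this is not the poster's actual code; the scoring threshold, family names, and the idea of using Jaccard overlap are assumptions for illustration — NSRL does publish SHA-1 hashes of known files, which is what the whitelist check relies on):

```python
import hashlib

def sha1_bytes(data):
    # SHA-1 hex digest, uppercase, matching the format NSRL publishes
    return hashlib.sha1(data).hexdigest().upper()

def best_family(apis, api_families):
    """Score each known malware family by the Jaccard overlap between the
    sample's API calls and that family's characteristic API group."""
    best, best_score = None, 0.0
    for family, group in api_families.items():
        union = apis | group
        score = len(apis & group) / len(union) if union else 0.0
        if score > best_score:
            best, best_score = family, score
    return best, best_score

def classify(data, apis, whitelist, api_families):
    """Whitelist check first (a hash hit is definitive), then fall back
    to API-group matching to attribute a likely family."""
    if sha1_bytes(data) in whitelist:
        return ("whitelisted", None)
    family, score = best_family(apis, api_families)
    return ("suspect", family) if score > 0.5 else ("unknown", None)
```

In practice the `apis` set would come from parsing the sample's import table, which is omitted here.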
Programming

Journal Journal: TT Livescan Database Milestone

As of today, TT Livescan's databases total approximately 350 million unique definitions and nearly 17 GB in size. In my spare time, I am working on a new version with some performance improvements. There is no set release date, but when that time comes, the announcement will be made.

More information and other projects can be found at http://www.tot-ltd.org

Comment The most troubling aspect... (Score 1) 228

Is the fact that these competing antimalware companies do not openly publish or share detection methods or datasets. This ultimately does little more than give users a false sense of security, no matter which product is being used. What should be done (and what I've been attempting to do for quite some time) is to have a centralized/universal database of definitions; from there, the real competition would be over which person or company can write the most effective *scanner*, thus benefiting the user and weeding out ineffective coding practices, half-baked theories, and groundless conjecture. To illustrate what I'm referring to, here are the datasets I maintain on a fairly regular basis. Keep in mind that 0-F is not an actual URL, but some of the datasets are keyed by single hex characters and sorted accordingly.

http://www.tot-ltd.org/blacklist/0-F/
http://www.tot-ltd.org/whitelist/0-F/
http://www.tot-ltd.org/API
http://www.tot-ltd.org/heuristics.dat
http://www.tot-ltd.org/installation.db
http://www.tot-ltd.org/packer.db
http://www.tot-ltd.org/files-wl/
http://www.tot-ltd.org/files-bl/


In the end, sure, there are several million files, but each specific group is only a few hundred bytes in size, which cuts a lot of overhead and brings individual scan times to near zero with a halfway decent connection. By doing this, a single scan is limited only by your hardware and internet latency.
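The 0-F layout above can be sketched roughly like this (not the actual implementation; the use of SHA-1 digests as keys and the exact URL shape are assumptions based on the directory names shown):

```python
import hashlib

HEX = "0123456789ABCDEF"

def bucket_url(digest, base="http://www.tot-ltd.org/blacklist"):
    # Datasets are sharded by the digest's first hex character (0-F),
    # so a lookup only ever fetches the small bucket it falls in.
    return f"{base}/{digest[0].upper()}/"

def shard(definitions):
    """Split definition hashes into 16 buckets keyed by first hex char,
    keeping each served file small so per-file lookups stay cheap."""
    buckets = {c: set() for c in HEX}
    for d in definitions:
        buckets[d[0].upper()].add(d.upper())
    return buckets

def is_listed(data, buckets):
    """Hash the file's bytes and check only the one matching bucket."""
    digest = hashlib.sha1(data).hexdigest().upper()
    return digest in buckets[digest[0]]
```

The point of the scheme is that membership testing touches one sixteenth of the data (or less, with longer prefixes) per query, which is why scan time ends up dominated by hashing and network latency.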

Comment This shouldn't come as any sort of surprise. (Score 1) 56

It seems that about every other year, some sort of price-fixing scheme is discovered. Considering that the prices of LCD/plasma flatscreens have never really come down that much from when they first hit the market, this kind of news is more of a de facto standard we've come to expect from a capitalist scheme in which every company that churns out profits whines about how poor it is. For the record, I'm not really a fan of HP either, considering that their business model attempts to lock in consumers with half-assed printers and artificially high-priced ink cartridges. As a side note, fining companies that rip off consumers is no way to handle the market in a reasonable, logical manner. If anything, these companies should be forced to pay class-action settlements to anyone who bought their products at artificially high prices. Then again, that might just be rewarding stupidity.

Comment HAH! (Score 1) 179

Wait until they discover the universe *IS* the Higgs boson. Their so-called scientific "theories" and "methods" will be left in utter shambles, in accordance with the prophecy. The stars will fall from the sky and the heavens will be aligned, allowing the opening of The Dark Portal, through which The Ancients will be summoned from the other side of The Great Cosmic Divide, unleashing 10,000 years of darkness upon the land and skies. Only then will The Keeper Of The Threshold be satisfied with a crop most bountiful. Pray that you will be eaten first. PRAY THAT YOU WILL BE EATEN FIRST.
