Comment Re:Wet dogs vs. wet t-shirts (Score 1) 97
They started that study at the same time, but they're still gathering data. I think it might take a while.....
I knew Flash had a certain air of suck about it because of some of the security issues. Then I went to FX's talk at BlackHat US 2010. He released a tool (Blitzableiter, http://blitzableiter.recurity.com/) that essentially does all the file validation for SWF files that Adobe's Flash player completely fails at. I think that maybe I would feel a lot better about Adobe's position if they didn't still have, after just about 10 years, a giant kludge job that they expect us all to freely install in our browsers.
Actually, I just built a low voltage ultra-portable notebook using an X25-V (CULV CPU, no optical drive, 8+ hour battery life). I'm running Linux, so my OS load is under 3 GB right now, and a typical quarter- to half-terabyte drive seems like overkill for a system that only runs productivity apps. I haven't done much battery benchmarking thus far, but the reduction in disk access times has been tangible. For example, even using a low power CPU, my boot times are under 15s to the login screen.
Your setup is a good one, mine is just one that uses an SSD as the sole drive.
In order to establish pretty much unassailable prior art, you may want to file for a provisional patent.
Good luck
Another reason for ordering online is the famous long tail. Niche and esoteric items are much more viable when you're Amazon, not Bob's Corner Furry Bondage Shop (unless you're in NYC, then you can find anything). I've seen a decline in the breadth of tech/computing books at my local big box book stores, which I think is caused by the online availability.
I'm breeding cockroaches to write code. How ironic: bugs will solve bugs.
Well if you're not breeding them to debug, then really you're making bugs to make bugs - which arguably they already know how to do.
I would prefer to use Firefox w/ NoScript for surfing less trusted sites, and Chrome for known legit sites. Given the recent work on CSRF type attacks, it's probably a good idea to do your banking and shopping in a different browser than you do riskier stuff.
I actually find that my best work happens when I've helped get organizations from the fire-fighting mentality to the proactive maintenance mentality. Every place that's gotten fixed, I've left, because it wasn't engaging any more. I think that even if we as a profession have reached a consensus about how things should generally work, that doesn't mean we're all at our best in that mature, well-run organization. Some guys are only good as the lone IT guy, or on a small team; some are only good in a well-structured environment; and people like me are at their best untangling the mess.
Have you looked at releasing your in house app?
For some things that might be possible, it might even be cool if that mechanic were used with a motion controller. I think that it might be harder to come up with a good way to quantify high skill in non-melee classes though.
Having been a dedicated healer in a few MMOs, I can say the skill lies in resource management at least as much as in just keeping people up. Letting a tank get down to 10% health (assuming it wasn't a one-hit kill) is a sign of failure; running out of power is a sign of failure; and in a bad situation, wasting power on healing the wrong guy is a sign of failure.
The AC has a very good list, I'll see if I can add anything to it.
Network diagrams should exist at the network, data link, and physical layers. Only the simplest networks can have all this information on a single diagram and have it be useful. Separate the network drawing from the data link and physical drawings as required, but be sure to leave enough detail to connect the drawings (Visio has a nice linking feature for this). Also keep a spreadsheet or database of assigned networks, IP ranges, and assigned static IPs, including a responsible POC for each entry. Also keep a spreadsheet of all infrastructure devices with model and options documented, along with firmware versions and support contract information. All ports should have a description entry for what they connect to, and the project/request/change identifier that created the connection.
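The IP-assignment register described above can be kept as a simple CSV. Here's a minimal Python sketch; the field names and sample values are my own illustrative assumptions, not any standard schema:

```python
import csv
import io

# Hypothetical schema for an IP-assignment register; field names are
# illustrative examples, not from any standard.
FIELDS = ["network", "ip_range", "static_ip", "device", "poc", "change_id"]

assignments = [
    {"network": "10.10.0.0/24", "ip_range": "10.10.0.10-10.10.0.99",
     "static_ip": "10.10.0.10", "device": "core-sw-1",
     "poc": "jdoe", "change_id": "CHG-1042"},
]

def to_csv(rows):
    """Serialize assignment records to CSV text for the shared register."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=FIELDS)
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()

print(to_csv(assignments))
```

The point of the `change_id` column is the same as the port descriptions: every entry traces back to the request or change that created it.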
System documentation starts with the system name, project, admin, data owner, system specs, and OS and application software name/vendor/version information, as well as support contract information. Then come backup and recovery procedures. After that you have the build procedure, including all configuration changes and scripts. Also include any system standards, i.e. all software added is in
Domain/authentication system documentation should include a description of the permission model and the standard permission and logging settings for all related systems. There should be procedures for credential and access changes that are documented and understood by everyone with administrative privilege. All systems should be built to not share credentials, and critical credentials should be kept in a sealed, tamper-evident envelope in a secure location (typically a safe). Things like root and domain admin passwords can be made by 2 or more people and added to the envelope, so no one person can make changes without an audit trail.
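The two-person rule above can be sketched in a few lines of Python: each admin independently generates half the credential, so neither knows the whole password before it goes into the envelope. The alphabet and lengths here are my own assumptions for illustration:

```python
import secrets
import string

# Character set for the generated halves (an assumption, not a standard).
ALPHABET = string.ascii_letters + string.digits

def make_half(length=16):
    """One admin's contribution, generated without seeing the other half."""
    return "".join(secrets.choice(ALPHABET) for _ in range(length))

half_a = make_half()  # generated by admin A
half_b = make_half()  # generated by admin B

# Combined on paper and sealed in the tamper-evident envelope;
# neither admin alone knows the full credential.
root_password = half_a + half_b
```

Any use of the full credential then requires opening the envelope, which is exactly the audit trail the comment describes.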
Databases should have all the system documentation along with schema information, connection parameters, and rollback procedures. Any configuration made for transaction logging should be documented and scripted where possible (anyone who has had to custom-roll persistent trace logging for MSSQL databases will empathize).
Logging and management systems should have procedures for adding new systems and new metrics. Managed systems should be baselined, using system thresholds where possible.
Patching and patch testing should have procedures and deployment schedules (e.g. MS Patch Tuesday patches should be fully deployed within X days/hours of release; Sun patches will be applied to the dev environment within 24 hours of release and deployed to production after 7 days; etc.)
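Those deployment windows are easy to turn into concrete deadlines. A small sketch, where the window lengths and key names are example values of my own, not recommendations:

```python
from datetime import datetime, timedelta

# Example deployment windows keyed by patch stream; the values are
# illustrative, matching the kind of schedule described above.
WINDOWS = {
    "ms_production": timedelta(days=7),
    "sun_dev": timedelta(hours=24),
}

def deadline(released, window_key):
    """Latest acceptable deployment time for a patch released at `released`."""
    return released + WINDOWS[window_key]

release = datetime(2010, 8, 10, 18, 0)     # an example release timestamp
print(deadline(release, "ms_production"))  # -> 2010-08-17 18:00:00
```

Scripting the deadlines means the schedule lives in the change system rather than in someone's head.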
Whenever possible, use a central system for this information. A SharePoint, Zope/Plone, or even a wiki can make the information accessible. If the support folks use the documentation, it will be maintained. If nobody uses it and no procedures mandate it, it will die. If you have a change management system that enforces documentation updates, then people will use what you've done for years to come.
The point about hoarding is a big one. The amount of address space held by US government entities is huge. I've worked with/in several
IPv6 will create some serious growing pains. We have more than 20 years of World Wide Web and IPv4-with-VLSM experience as an industry. There are a number of things we take for granted in the conventions, and even the protocols, that IPv6 can call into question.
I thought I saw this kind of thing at Blackhat US 2006, as a browser exploit.
The difference is that it's "weaponized" now. We start patching, tracking, and working on sigs when an exploit comes out, but the risk level really goes up when the threat is in the wild, and again when the exploit is packaged. I'm actually surprised that it's not a multi-vector threat, using maybe spam or a lured-browser propagation path. That would give the worm access to the protected interface.
Deficit spending should only be done for things that a) you would be doing anyway, or b) have long-term value. Stimulus money should only go to projects that have an effect on the way
8 Catfish = 1 Octo-puss