It's funny that you describe the situation that you do. I've worked hard with my teams to create a culture that emphasizes that everyone is responsible for every part of making great deliveries on time. The result is that we accrue technical debt like everyone else, but we've been in a pretty good position to choose where that technical debt comes from, and so we keep it as far away from unit tests and core design / infrastructure code as we can, knowing full well that it's seeped into the implementations of individual components. But because we got the design right, continuously improving those component implementations is straightforward.
You're asking the wrong question, as I've learned the hard way. The question you ask your users is not "what do you want", because obviously the answer is invariably "a pony and a cake and a million dollars and world peace." I've had much better success with "tell me about how things work today", which very quickly gets the conversation centered on pain points --- what is, from the user perspective, broken or otherwise less usable and friendly than it should be.
The day you gave up using the wrong tools for the job on a MS platform was a happy one? What a surprise.
I just cannot embrace the mess C++ became anymore. Life is too short to learn C++. Basically I'm using "C with classes" today, without STL, Boost or any of these aberrations.
Out of curiosity, is this desire for self-inflicted misery part of a larger BDSM kinda deal or is it due to willful ignorance of standard practices and idioms that have been well entrenched in the C++ community for the better part of a decade now?
IANAL but Wikipedia thinks there is. It's called barratry ( http://en.wikipedia.org/wiki/Barratry_(common_law) ). There's also http://en.wikipedia.org/wiki/Champerty and http://en.wikipedia.org/wiki/Vexatious_litigation
FWIW, I've found code reviews to be particularly effective in this situation as it forces everyone involved to explain themselves and argue on technical merits and helps frame the discussion in risk/benefit terms. It also shifts the tone so that there is an opportunity for a positive outcome for all: if the "kid" is right, you have an opportunity to learn and improve going forward. If the kid is wrong, he's going to get smacked down by the other engineers -- hopefully he's smart enough that he learns his lesson the first time.
We do this in C++ and Python for all our projects now. All in the repo commit hooks.
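To make that concrete, here's a minimal sketch of the kind of pre-commit style gate you could drop into a repo (a hypothetical illustration, not the poster's actual hooks): it rejects commits whose staged C++/Python files contain tabs or trailing whitespace. Install it as .git/hooks/pre-commit and make it executable.

```python
#!/usr/bin/env python3
"""Minimal pre-commit style gate (illustrative sketch only)."""
import subprocess
import sys

CHECKED_SUFFIXES = (".py", ".cc", ".cpp", ".h", ".hpp")

def style_violations(text):
    """Return a list of (line_number, reason) style problems in `text`."""
    problems = []
    for i, line in enumerate(text.splitlines(), start=1):
        if "\t" in line:
            problems.append((i, "tab character"))
        if line != line.rstrip():
            problems.append((i, "trailing whitespace"))
    return problems

def staged_files():
    """List staged (added/copied/modified) files we care about."""
    out = subprocess.run(
        ["git", "diff", "--cached", "--name-only", "--diff-filter=ACM"],
        capture_output=True, text=True, check=True,
    ).stdout
    return [f for f in out.splitlines() if f.endswith(CHECKED_SUFFIXES)]

def main():
    failed = False
    for path in staged_files():
        with open(path, encoding="utf-8", errors="replace") as fh:
            for lineno, reason in style_violations(fh.read()):
                print(f"{path}:{lineno}: {reason}")
                failed = True
    return 1 if failed else 0

if __name__ == "__main__":
    sys.exit(main())
```

In practice you'd call out to your real linters (clang-format, flake8, etc.) instead of hand-rolling checks, but the shape is the same: git exits the commit if the hook returns nonzero.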
The point is to achieve this: http://www.joelonsoftware.com/articles/Wrong.html -- make things that are wrong be more obviously wrong. Using discipline and coding standards is just one part of the appropriately paranoid developer's defensive programming toolkit.
You seem to be confused. I did not imply that there is anything wrong (inherently or otherwise) with non-profit work, as a manager or in any other capacity. What I did imply is that in this particular case, "non-profit" is a euphemism for "unemployed".
Why do I feel an overwhelming desire to read "non-profit executive" as "unemployed douche bag with too much access to a thesaurus"?
A great deal of the version wrangling you are facing is best done with a tool like Git.
The bigger problem (development discipline) is much harder to fix.
Until someone offers your boss a compelling case demonstrating the educational value of access to Facebook, you block all of it. The purpose of the computers is to be an aid to the school's educational mission.
I do a lot of work with the Axis cameras, although almost always running my own algorithms on their video. They're nice hardware to work with, but I don't think their algorithms perform all that well (and if I'm not mistaken, they're another example of "ObjectVideo Inside", so it's not really their algorithms). Of course, designing better algorithms for the motion detection and whatnot is part of how I pay my bills, so YMMV.
"... then you're not doing it right" pretty much summarizes (to the best of my knowledge) the state of the art when handling exterior conditions with purely visual sensors, particularly when you're relying on crappy source video with no algorithm-in-the-loop auto-iris control (or worse: analog auto-iris) and god only knows how many layers of crappy A2D, D2A, quantization, etc. steps in between. You need other motion sensors that give you more reliable cues, but even with clever placement, you are always going to face a fundamental trade-off between probability of detection and probability of false alarm.
As for motion sensors + dogs, what is stopping them from aiming the detectors to have a deadzone that ends ~3-4ft off the ground?
Depends on lighting conditions, but generally, yes. Which is at least partially why I ended with the question that I did.