I've been thinking a lot about the software methodology religious wars lately. It seems to me that all methodologies have their strengths and weaknesses. Fifteen years in, Agile has given us faster coding, but worse quality. Waterfall was flawed by its overemphasis on architecture and under-emphasis on business. Lean cuts out inefficiency, at the cost of elegance and maintainability. DevOps sacrifices quality and cheapness for the speed of continuous outcomes. And while it's true that the User Interface is everything, because that's the only thing the user sees, Outcomes sacrifices the future for one-off, unmaintainable code.
So here's my solution: WALDO. The ultimate three-to-six-person team: no more than six, and no fewer than three if a couple of guys wear multiple hats.
But these aren't just methodologies; they imply roles on the team. The ideal six-person team consists of:
W- The customer's view of the project should always be waterfall with iterations. They tell us what they think they want, we build it, and they're involved in every iteration. Of course, they don't really know what they want; it takes several iterations before we discover what they want. The W role is the customer herself.
WA- The Waterfall Architect, or perhaps the Waterfall Analyst. This is the guy who is the face of the team to the customer, the single point of contact. On smaller teams he may also be the scrum master, but ideally he should be a master of the models. This person should also be the principal advocate for the customer in scrum meetings.
AL- The real scrum master should be a master of both Agile and Lean. This guy lives in the world of Gantt charts and Excel spreadsheets: keeping both schedule and budget, keeping the team on schedule, and communicating that schedule to the team, the WA, and the W. Daily scrums should keep people on task.
LD- The Lean Developer is a Model First Full Stack Programmer, but is the king of object orientation, maintainability, and reuse. This is the role where you want somebody who excels at data, but who can work in the higher tiers of programming, right up to the User Interface Tier.
DO- The ultimate DevOps guy should be as much artist as programmer, a whiz at the User Interface. This is what the customer will see, so the DO and the WA form a natural Quality Assurance feedback loop for each other. And since the DO consumes data and objects coded by the LD, there's a natural QA feedback loop there as well.
O- The Outcomes guy: QA and Build Engineer rolled into one. This is your build manager, working with the WA and the W to make sure every release happens quickly and accurately, and that beta testing actually occurs to provide data back to the WA.
This is my ideal team: one that ensures you get the best of all methodologies, not the shortsightedness of focusing on only one or two.
Sounds like a great way to enable widespread identity theft to me, since 501(c)(3)s are not likely to be able to afford robust security. If this rule goes into effect, it almost makes me want to turn black hat and acquire the donor SSNs for Planned Parenthood.
- Pushy recruiters with foreign accents so thick you can't understand what they are saying
- Who clearly have not read your resume and only found you on a keyword search
- Who cannot read a map and do not understand "I cannot relocate 500 miles for a temporary job, and in software, all jobs are temporary"
Any Android wizards out there know why I'd get this error from the emulator?
They claim they've read my resume, then they ask for references. Well, references are at the bottom of the resume, so did you really read it?
What is your favorite Data Modeling Software that interfaces with SQL Server?
It looks like Microsoft has dropped Visio for Enterprise Architects, which is what I used the last time I had to do a massive knowledge transfer of a data heavy application.
I've seen it done often enough that this *should* be a readily available control, but I don't seem to be able to find one after my first three rounds of Google searching; I'm probably calling it the wrong thing.
What I want is to be able to configure a user's home page on the website with their choice & order of several widgets.
Anybody know of a great tool for doing this? Worst-case scenario is I roll my own with a three-column table built up from a sub-table off of the users.
Oh yeah, and the mandatory technology for this project is a SQL Server database and Visual Studio.
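The roll-your-own fallback described above can be sketched pretty simply. A minimal Python sketch of the data model (every name here — the fields, the widget labels — is invented for illustration, not from any real schema): one row per placed widget, keyed by user, with a column number and a sort position.

```python
# Sketch of the roll-your-own layout model: a child table off of the
# users, three columns wide. All field and widget names are invented.
from dataclasses import dataclass

@dataclass
class UserWidget:
    user_id: int
    column: int      # 1..3, matching the three-column layout
    position: int    # sort order within the column
    widget: str      # e.g. a control name or route

def layout_for(user_id, rows):
    """Group one user's widgets into the three columns, in display order."""
    cols = {1: [], 2: [], 3: []}
    mine = [r for r in rows if r.user_id == user_id]
    for r in sorted(mine, key=lambda r: (r.column, r.position)):
        cols[r.column].append(r.widget)
    return cols

rows = [
    UserWidget(7, 1, 2, "RecentOrders"),
    UserWidget(7, 1, 1, "Welcome"),
    UserWidget(7, 3, 1, "NewsFeed"),
]
print(layout_for(7, rows))  # {1: ['Welcome', 'RecentOrders'], 2: [], 3: ['NewsFeed']}
```

The same shape translates directly to a SQL Server child table (UserId, ColumnNo, Position, WidgetType) with an ORDER BY on (ColumnNo, Position).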
Yep, a child can survive an abortion, at least long enough to perform vivisection on a being that can feel pain. Which, of course, is what the genocidal maniacs claim is safe, legal, and rare...
For those who doubt that children can be born alive after abortion, here are a whole lot of adults who have, in the last 40 years, been born after attempted murder by their mothers.
The question is: if you had a fast TM (FAST_TM) with a little bit of paper tape, and a slow TM (SLOW_TM) with a lot of paper tape, could you simulate FAST_TM on SLOW_TM? Could you trade memory for speed to get a (close to real-time, even) representation of FAST_TM on SLOW_TM?
I think this gets interesting if you ask for a SLOW_TM with an infinite (ℵ₀) paper tape. As long as you could 'read state' into those tape cells fast enough, you could keep histories of state going back as far as the last time FAST_TM was in a stable/idle state. In an idle state you could start to play catch-up. Between idle-state points you would be at some indeterminate point past the last idle state from which you could play catch-up; in effect you'd probably have a whole stack of possible state points (what is the computational complexity of catching up on this stack, as a function of its size STACKSIZE?). So the load percentage of FAST_TM would determine the average lag time between your successful modellings of it.
This suggests that you could probably get a probabilistic chance of modelling FAST_TM by just adding memory of (O(S) + O(T))·t, where S is the memory required to record one state, t is the time, and O(T) is the amount of memory you spend transitioning from state to state. I'm guessing this would be hard to do on CPUs with little memory, i.e. there's a high constant factor, but once you're past that constant factor it gets relatively easy to do... but then again, maybe it doesn't? I think the lower bounds for Turing machines of certain memory capability in terms of size are very small, and we haven't got a lot of proofs for them.
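The memory-for-time trade above can be made concrete with a toy sketch. Everything here is invented for illustration (the FastTM, the observer, the idle-time "budget"): a fast machine logs every transition into unbounded extra memory, O(S) per entry, and a slow observer replays that log during idle windows, so its lag is exactly the number of transitions still queued.

```python
from collections import deque

# Toy sketch of trading memory for speed: log the fast machine's
# transitions, replay them later. All names here are invented.

class FastTM:
    """A trivial 'fast' machine: each step writes its step count to a cell."""
    def __init__(self):
        self.tape = {}
        self.steps = 0
        self.log = deque()   # unbounded extra memory: one entry per transition

    def step(self):
        self.steps += 1
        cell = self.steps % 8
        self.tape[cell] = self.steps
        self.log.append((cell, self.steps))   # record the transition

class SlowObserver:
    """Replays the fast machine's log during 'idle' time to catch up."""
    def __init__(self, fast):
        self.fast = fast
        self.tape = {}

    def catch_up(self, budget):
        """Consume up to `budget` logged transitions; return remaining lag."""
        while budget > 0 and self.fast.log:
            cell, value = self.fast.log.popleft()
            self.tape[cell] = value
            budget -= 1
        return len(self.fast.log)   # lag = transitions not yet replayed

fast = FastTM()
slow = SlowObserver(fast)
for _ in range(100):                # fast machine runs under load
    fast.step()
lag = slow.catch_up(budget=60)      # one idle window: replay part of the log
print(lag)                          # 40 transitions still queued
lag = slow.catch_up(budget=60)      # a second idle window clears the backlog
print(lag)                          # 0: the tapes now agree
print(slow.tape == fast.tape)       # True
```

The load ratio between the two machines shows up directly: if steps arrive faster than the catch-up budget, the log (and the lag) grows without bound, which is the stack-overflow case discussed below.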
Now here's an idea for a new kind of machine: a FAST_TM simulated on a new kind of SLOW_TM, i.e. where SLOW_TM has a bounded amount of memory proportional to some percentage of possible outcomes. Let's call this SIM_TM_LB. SIM_TM_LB is going to have a lower busy-beaver-like number than a regular TM of its size, because there's a certain percentage of possible outcomes that cannot be simulated (those that overflow the allowed stack). There's going to be some lower limit L, with 1 ≤ L(M) ≤ BB(M), on the largest program the simulated fast CPU can run. Proving what that is would be interesting, because as you expand M (again in relation to the load-average ratio between the two machines, the state-transition memory footprint T, and the FAST_TM state size S) you're also defining L(M) ≤ BB(M), which means you're defining a new kind of number, let's say \psi, that seems to be related to \omega: \psi = \sum_i 1/L(i). Why is \psi important?
It's yet another way of looking at problems where you're dealing with something smarter than you are. It's where you're playing a game with god. Where you're having an argument with an oracle. It expresses all information that you can possibly acquire rather than what your opponent can possibly know. Or does it?
This suggests \psi's parameters are T and S. Are there any others?
Also: what happens if we start allowing stack overflows to traverse from TM to TM? This seems to build a new kind of machine, one that also has potentially really weird properties.
What is wrong with this code? It's like I can have the inner loop execute, or the outer loop execute, but not both.
DECLARE @RC int
DECLARE @BatchMatch varchar(40)
DECLARE @mcount int = 1
DECLARE @dcount int = 1
WHILE @mcount < 13
BEGIN
    WHILE @dcount < 32
    BEGIN
        SET @BatchMatch = RIGHT('00' + CONVERT(varchar(2), @mcount), 2) + RIGHT('00' + CONVERT(varchar(2), @dcount), 2)
        SET @dcount = @dcount + 1
    END
    SET @mcount = @mcount + 1
END
PS: no, I don't care that all months don't have 31 days, but I must cover the months that do.
Update: I had failed to reset the inner loop counter; that's what.
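For anyone following along, here is the fix the update describes, sketched in Python rather than T-SQL so the loop logic stands on its own (the variable names mirror the T-SQL ones): the inner day counter has to be reset at the top of every month iteration, otherwise the inner loop only runs on the first pass.

```python
# Python sketch of the corrected nested loop: reset the inner (day)
# counter inside the outer (month) loop, or the inner loop runs once.
def batch_codes():
    codes = []
    mcount = 1
    while mcount < 13:           # months 01..12
        dcount = 1               # THE FIX: reset the inner counter each month
        while dcount < 32:       # days 01..31 (per the post, every month gets 31)
            codes.append(f"{mcount:02d}{dcount:02d}")
            dcount += 1
        mcount += 1
    return codes

codes = batch_codes()
print(len(codes))           # 12 months * 31 days = 372
print(codes[0], codes[-1])  # 0101 1231
```

In the T-SQL, the equivalent change is moving `SET @dcount = 1` inside the outer `WHILE` loop's `BEGIN...END` block, just before the inner loop.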
The Stupid Fallacy (named/discovered by Chris Riley) is an Argument from Ignorance that includes an extra component:
Instead of merely being an argument that draws a conclusion from the *lack* of knowledge on a topic (and not in a Bayesian-friendly way of enumerating possibilities and going from there, either), you have:
Why don't you like GMOs?
Who knows what chemicals they put in GMOs! They're probably dangerous! Besides, God tells me not to let my precious bodily fluids become tainted by GMOs. You have to believe me, because my beliefs are not subject to logical fallacies, since you and I are both Christians.
Why is this worth keeping around?
Because it's not just ignorance. It's recursive, or close to recursive, ignorance. It's ignorance that requires a disproportionate amount of cognitive surplus to dispel: you basically have to reconstruct an entire worldview relying on evidence rather than 'feelings' or 'blind belief in what my elders said' in order to get your point across.
No, really.
My Linux skills have atrophied. I need to set up 10 workstations today, and I have one done. What is the *easiest* way to clone a partition in Ubuntu 14?
Update: Ghost 4 Linux and LinuxLive USB Creator to the rescue. I haven't had a usable copy of Ghost since floppies ruled the emergency boot sector. Now I'm going to buy a 32GB keychain drive off of Amazon and make sure I am NEVER without a copy of G4L. Drive imaging in an emergency is just too useful a skill to be without.
The first 90% of a project takes 90% of the time, the last 10% takes the other 90% of the time.