Comment Re: Do they have a choice? (Score 1)
neural nets with backpropagation are as deterministic as any other algorithm; if the inputs (and any random seeds) are the same and no one has "improved" the numerical routines, you should get the same result every time you run the training algorithm. yes, you can add stochasticity to NN training to try to improve convergence, but that's true of many algorithms (whether their implementors realize it or not). and running the same trained, frozen neural net on the same input should always give you the same result, unless something weird is happening (as it sometimes does).
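to make that concrete, here's a minimal sketch (just an illustration, not anyone's production setup): a tiny one-hidden-layer net trained with plain backprop in numpy, where the only randomness comes from a seeded generator. the function name train_tiny_net and all the sizes/hyperparameters are made up for the example. run it twice with the same seed and the learned weights come out bit-for-bit identical:

    import numpy as np

    def train_tiny_net(seed, steps=200):
        """Train a one-hidden-layer net with plain backprop; all randomness is seeded."""
        rng = np.random.default_rng(seed)
        X = rng.normal(size=(64, 4))                  # fixed synthetic inputs
        y = (X.sum(axis=1, keepdims=True) > 0) * 1.0  # fixed synthetic labels
        W1 = rng.normal(size=(4, 8)) * 0.1
        W2 = rng.normal(size=(8, 1)) * 0.1
        lr = 0.5
        for _ in range(steps):
            # forward pass
            h = np.tanh(X @ W1)
            p = 1.0 / (1.0 + np.exp(-(h @ W2)))       # sigmoid output
            # backward pass (gradients of mean squared error)
            grad_p = (p - y) / len(X)
            grad_W2 = h.T @ grad_p
            grad_h = grad_p @ W2.T * (1 - h**2)       # tanh derivative
            grad_W1 = X.T @ grad_h
            W1 -= lr * grad_W1
            W2 -= lr * grad_W2
        return W2

    # two runs with the same seed give bit-identical weights
    run_a = train_tiny_net(seed=42)
    run_b = train_tiny_net(seed=42)
    assert np.array_equal(run_a, run_b)  # exact equality, not just "close"

the caveat is hardware- and library-specific: swap in a multithreaded BLAS or GPU kernels whose reductions run in varying order, and exact bit equality can break even with fixed seeds. that's the "something weird" case.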
i understand that in certain cases, we need provably correct results. my point is that "deterministic" is often (but not always) the wrong word, which i think you've illustrated.