That might be stretching the definition of "deterministic". If it depends on implicit state, precise timing, external conditions, etc., I would say it's closer to fitting the definition of "chaotic" than "deterministic".
With perfect observation and recording, it may be possible to see a tornado and work backwards to figure out which butterflies flapped their wings to cause it, but that would still be a long way off from being able to train all butterflies to move only in ways which won't cause tornadoes.
Sure, but you don't need to know which butterflies flapped their wings - you just need to know "if there's a tornado, the house falls down, and we're in tornado country."
In the example above, it seems sufficient, and entirely testable, to conclude: if NFS is flaky for whatever reason, $PATH resolution will skip the directory with the right linker and fall back to the old one; and if you end up calling the old linker for whatever reason, it has different behavior around static initializer order. That's enough to describe the problem, reproduce it, and fix it. No, you're not going to be able to identify which particular packet was lost, but (and I say this as someone who's maintained production systems that rely on cross-continent NFS) you expect that packets can be lost and you design the system to be robust against that. Don't put multiple files with the same name on $PATH, or call the linker by its full path, or something.
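The failure mode is easy to reproduce without any NFS at all. Here's a toy sketch (the /tmp/pathdemo layout and the fake `ld` scripts are invented for illustration): two executables named `ld` sit on $PATH, and when the first directory, standing in for a flaky NFS mount, disappears, lookup silently falls back to the second.

```shell
# Set up two directories, each with its own "ld" on $PATH.
rm -rf /tmp/pathdemo /tmp/pathdemo.gone
mkdir -p /tmp/pathdemo/nfs /tmp/pathdemo/local
printf '#!/bin/sh\necho new-linker\n' > /tmp/pathdemo/nfs/ld
printf '#!/bin/sh\necho old-linker\n' > /tmp/pathdemo/local/ld
chmod +x /tmp/pathdemo/nfs/ld /tmp/pathdemo/local/ld
PATH=/tmp/pathdemo/nfs:/tmp/pathdemo/local:$PATH

ld    # prints: new-linker (the intended one wins while the mount is up)

# Simulate the NFS mount dropping out from under us.
mv /tmp/pathdemo/nfs /tmp/pathdemo.gone
hash -r    # flush the shell's cached command lookup

ld    # prints: old-linker (silent fallback -- no error, different binary)

# One fix: invoke the linker by absolute path, so a missing mount fails
# loudly with "No such file or directory" instead of silently running
# a different binary with different static-initializer behavior.
```

The point is that you don't need packet-level forensics to test this: kill the mount, watch `command -v ld` change, and you've reproduced the bug deterministically.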
And that leaves you a very long way from "something spooky is happening with the computer, so passwords weren't being checked today, who knows."