Hi guys,
Sorry for the late post. Batterman’s paper was quite dense for me, since it was full of concepts I hadn’t encountered before, and it took me a while to digest them.
In this paper Batterman criticises Stone and Ford separately. In this post I am going to mention some points that came to mind while reading his critique of Stone and comparing it with the lecture slides (I may post another note on Batterman’s critique of Ford in the next couple of days).
1- Stone introduces three criteria for a deterministic system: (a) there exists an algorithm which maps a state of the system at any given time to the state at any other time (not probabilistically), (b) a given state is always followed by the same history of state transitions, (c) any state of the system can be described with arbitrarily small nonzero error.
Batterman says condition (c), which is known as well-posedness, is equivalent to the notion of continuous dependence, which says: a solution depends continuously on the initial data if convergence of the initial data entails convergence of the solutions. It is not transparent to me that these two are equivalent. To make it clear, we first need to find out what Stone means by the vague term “describe” in condition (c). We may take him to mean 'a description of the states of the system which uses the algorithm and some other states'. So we may reconstruct condition (c) as: 'any state of the system can be described, with arbitrarily small nonzero error, using the algorithm and some other state of the system'.
Let’s compare this reconstruction with condition (d), which Stone introduces to define the notion of predictability: "any state of the system can be generated from the algorithm with arbitrarily small nonzero error from any other state of the system". These two sentences seem very similar except for their second quantifiers.
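To make the quantifier difference explicit, here is one way to formalise the two readings (this is only my own reconstruction, not Stone's wording; the symbols A for the algorithm and S for the state space are mine):

```latex
% Reconstruction of (c): every state can be recovered from SOME other state
\forall s \in S \;\, \exists s' \in S \;\, \forall \varepsilon > 0 :\quad
  \mathrm{dist}\big(A(s'),\, s\big) < \varepsilon

% Condition (d): every state can be recovered from ANY other state
\forall s \in S \;\, \forall s' \in S \;\, \forall \varepsilon > 0 :\quad
  \mathrm{dist}\big(A(s'),\, s\big) < \varepsilon
```

On this reading the only difference is indeed the second quantifier: ∃ in (c) versus ∀ in (d).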
However, it seems that condition (d) and the reconstruction of condition (c) are far from the precise mathematical definitions of continuous dependence and non-sensitive dependence which, in the paper, are respectively taken to be equivalent to (c) and (d). The mathematical definition of continuous dependence is in the slides, and the mathematical definition of non-sensitive dependence is the following:
∀δ > 0, ∃ε > 0, ∃x0, ∀x1, ∀t:
dist(x0, x1) < ε → dist(φ(x0, t), φ(x1, t)) < δ
One can see that condition (d) is far from this definition, at least in terms of its details: condition (d) makes no mention of a time variable. I don’t know why Batterman doesn’t criticise Stone for this.
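The definition above can be probed numerically. Here is a hedged sketch (the two maps and the thresholds are my own illustrative choices, not from Stone or Batterman): a contracting map keeps nearby orbits close for all time, while the doubling map roughly doubles the separation at each step, which is the signature of sensitive dependence.

```python
def iterate(f, x, t):
    """Apply the map f to x for t steps."""
    for _ in range(t):
        x = f(x)
    return x

contract = lambda x: x / 2          # separation shrinks: non-sensitive
double = lambda x: (2 * x) % 1.0    # separation roughly doubles: sensitive

x0, eps, t = 0.3, 1e-6, 15
x1 = x0 + eps

sep_contract = abs(iterate(contract, x0, t) - iterate(contract, x1, t))
sep_double = abs(iterate(double, x0, t) - iterate(double, x1, t))

print(sep_contract)  # stays below eps at every step
print(sep_double)    # has grown roughly like eps * 2**t
```

For the contracting map one can pick ε = δ in the definition above and the implication holds for all t; for the doubling map no single ε works for all t, which is why it fails the definition.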
2- Stone claims that determinism is a necessary condition for predictability. Batterman provides an example of an indeterministic system which is predictable, to falsify Stone’s claim: "one can make decent predictions about quantum systems". My question is: can one make decent non-probabilistic predictions about quantum systems? If not, then I doubt this counts as a counterexample to Stone’s claim, since I take it that what Stone means by prediction is non-probabilistic prediction.
3- Stone says that "systems which do not admit closed form solutions have this property that the error present in the specification of the initial state can be amplified exponentially". The problem is that Stone doesn’t provide any argument for this claim, and I couldn’t find any criticism of it in Batterman's paper.
4- I pondered for a while on the difference between discontinuous dependence and sensitive dependence, and finally reached the conclusion that, with the (ε, δ)-definition of the mathematical notion of limit in mind, the difference between discontinuous and sensitive dependence is analogous to the difference between (1) a situation in which a discontinuous function f(x) has no limit at all as x approaches some number c, and (2) a situation in which the limit of a function g(x) is infinity as x approaches c.
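The analogy can be illustrated numerically; the particular functions here are my own choices, not from the paper. As x approaches 0, sin(1/x) oscillates forever and has no limit (case 1), while 1/x² diverges to infinity (case 2).

```python
import math

xs = [10 ** (-k) for k in range(1, 6)]  # x values approaching 0

# Case (1): no limit at all -- the values keep oscillating between -1 and 1
oscillating = [math.sin(1 / x) for x in xs]

# Case (2): the limit is +infinity -- the values grow without bound
diverging = [1 / x**2 for x in xs]

print(oscillating)
print(diverging)
```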
5- In the slides it was said that exponentially unstable systems may have continuous dependence, but Batterman says they may have non-sensitive dependence. I was wondering which of these claims is true, and how one can prove either of them using the mathematical definition of exponential instability: dist(φ(x0, t), φ(x1, t)) ≥ dist(x1, x0) · e^(λt)
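Regarding the first claim, a toy example suggests exponential instability and continuous dependence are compatible. The linear flow φ(x, t) = x·e^(λt) (my own example, solving x' = λx, not from the paper) amplifies the initial separation by exactly e^(λt), so it is exponentially unstable; yet for each fixed t, the final separation still shrinks to 0 as the initial separation does, which is continuous dependence.

```python
import math

lam, t = 1.0, 10.0

def phi(x, t):
    """Flow of the linear ODE x' = lam * x."""
    return x * math.exp(lam * t)

x0 = 0.5
# Final separation for shrinking initial perturbations eps
seps = [abs(phi(x0 + eps, t) - phi(x0, t)) for eps in (1e-2, 1e-4, 1e-6)]

# The separation at time t is e**(lam*t) * eps: hugely amplified
# (exponential instability), but it still tends to 0 as eps -> 0
# (continuous dependence at fixed t).
print(seps)
```

Note this does not settle Batterman's claim about non-sensitive dependence, since that definition quantifies over all t at once, where the amplification factor e^(λt) is unbounded.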
P.S: I realised that in the paper Batterman does not assert that Stone's condition (d) is equivalent to non-sensitive dependence, so I was wrong to say that these two are taken to be equivalent throughout the paper (however, he does explicitly claim that condition (c) and continuous dependence are equivalent). He also doesn't say anything about the relation between condition (d) and continuous dependence; he only claims that requirement (d) can be satisfied by exponentially unstable systems. So I was also wrong in claiming that Batterman says exponentially unstable systems may have non-sensitive dependence.