Suppose you have a programming language $\mathcal{L}$ designed to efficiently perform calculations with real numbers. In particular, it allows the definition of (computational, deterministic) functions which take one number as an argument and return a number:
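For instance, such a function might look like the following minimal sketch; Python is used here only as a stand-in, since the problem does not fix a concrete syntax for $\mathcal{L}$ (and Python's float merely approximates the exact reals that $\mathcal{L}$ is assumed to handle):

```python
def square(x: float) -> float:
    # A deterministic function of a single real argument,
    # written as a finite piece of source code: it returns x squared.
    return x * x
```

Any such definition is a finite string of symbols, a fact the solution below relies on.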
Let $f$ be an arbitrary mathematical function, mapping real numbers to real numbers. What is the probability that $f$ can be modeled using a function programmed in $\mathcal{L}$?
Assume that $\mathcal{L}$ can handle real numbers in an arbitrarily large range, with infinite precision. The programmed function is of finite length.
The probability is 0%.
Since every program has finite length, the number of programs that could be written in $\mathcal{L}$ is countably infinite. The functions they compute therefore form a countable set of points $C$ in $\mathbb{R}^{\mathbb{R}}$, the (uncountable) set of all functions from $\mathbb{R}$ to $\mathbb{R}$. Let $P$ be the probability that an arbitrary $f$ lies in $C$. For any integer $n$, a disjoint union of $n$ sets, each as likely as $C$, is still only countable, hence strictly smaller than $\mathbb{R}^{\mathbb{R}}$ and of total probability at most $1$; this shows $nP \le 1$, i.e. $P \le 1/n$, for every integer $n$. Therefore the probability is $P = 0$.
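To make the countability step concrete, here is a minimal Python sketch (the alphabet below is an assumed example, not $\mathcal{L}$'s actual character set) that lists every finite string over a fixed finite alphabet in length-then-lexicographic order. Every program is one of these strings, so the programs, and hence the functions they compute, can be matched up with the natural numbers:

```python
from itertools import count, islice, product

# Assumed example alphabet; any finite character set works the same way.
ALPHABET = "abcdefghijklmnopqrstuvwxyz0123456789_()+-*/=:. "

def enumerate_programs():
    """Yield every finite string over ALPHABET, shortest first.

    Each program text appears exactly once in this listing, so the set of
    programs is countable; discarding syntactically invalid strings only
    makes the set smaller.
    """
    for length in count(0):
        for chars in product(ALPHABET, repeat=length):
            yield "".join(chars)

# Usage: the first few entries of the enumeration.
print(list(islice(enumerate_programs(), 5)))  # ['', 'a', 'b', 'c', 'd']
```

In contrast, $\mathbb{R}^{\mathbb{R}}$ is uncountable (it already contains one constant function per real number), which is exactly the size gap the argument above exploits.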