Complete consistent mathematics
May. 17th, 2010 12:14 pm

Mathematics can be regarded as the Turing-complete and Hilbert-Post-complete (HP-complete) untyped equational theory λβη: every true statement is provable, and adding any false one makes the theory inconsistent. Negation of a mathematical statement can be expressed, for instance, through Church's syntactic sugar for booleans. Mathematics, including calculus free of paradoxes, is thereby claimed to be covered by λβη with the real numbers restricted to the recursive (computable) reals. Under this view the continuum hypothesis makes no sense, since the whole theory is a countable set of statements. In turn, the recursive reals are known to be exactly the real numbers that can be described; no counterexample can be produced.
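To illustrate the point about negation, here is a minimal sketch of Church's encoding of booleans as lambda terms, written as Python closures. The names `TRUE`, `FALSE`, `NOT`, and `to_bool` are my own illustrative choices, not notation from the post:

```python
# Church booleans: a boolean is a two-argument selector.
TRUE = lambda t: lambda f: t    # λt.λf.t  -- selects the first argument
FALSE = lambda t: lambda f: f   # λt.λf.f  -- selects the second argument

# Negation swaps the branches: λb.λt.λf.b f t
NOT = lambda b: lambda t: lambda f: b(f)(t)

def to_bool(b):
    """Decode a Church boolean into a native Python bool (for inspection only)."""
    return b(True)(False)

assert to_bool(NOT(TRUE)) is False
assert to_bool(NOT(FALSE)) is True
```

Negation is thus not a primitive of the theory but an ordinary lambda term, which is what makes it possible to express the negation of a statement inside λβη itself.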
It is significant that the untyped variant has been chosen: any type system eliminates either Turing-completeness (making the resulting system weaker than mathematical language) or the consistency of the theory. Weak Curry-style typing leads to inconsistency, while strong Church-style typing sacrifices Turing-completeness.
Pure untyped extensional lambda calculus (or an equivalent system) is therefore the most powerful consistent theory. This is the reason to consider only real-time lazy (optimal in Lévy's sense) graph reduction machines based on it. One of the simplest ways is to use Micro Lambda Calculus, which is planned to be implemented in Macro Lambda Calculus.
Within such a system, proofs are not merely possible but also efficient. Any two lambda expressions, that is, any two computable functions, are proved extensionally equivalent if they have one and the same normal form. If no normal form exists, they are considered unequal in λβη. (In fact, there is a version of the theory in which all expressions that have no head normal form are identified; this version is HP-complete as well.) A normalizing strategy provides an effective algorithm for finding the normal form whenever it exists. For λβη, one normalizing strategy is the normal-order strategy, also known as the leftmost strategy. It is, however, not normalizing for Distributive Lambda Calculus (a formal term proposed by Vincent van Oostrom to describe the system of Micro Lambda Calculus). A one-step normalizing strategy, slightly different from the normal-order one, was found for this system earlier and later proved to be normalizing by Vincent van Oostrom himself in private correspondence.
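The normal-order (leftmost-outermost) strategy mentioned above can be sketched concretely. Below is a small normalizer for untyped lambda terms, using de Bruijn indices to keep substitution capture-free; the term representation and function names are my own assumptions for illustration, not part of the post:

```python
# Terms: ('var', i) | ('lam', body) | ('app', fun, arg), with de Bruijn indices.

def shift(t, d, cutoff=0):
    """Shift free variable indices in t by d."""
    if t[0] == 'var':
        return ('var', t[1] + d) if t[1] >= cutoff else t
    if t[0] == 'lam':
        return ('lam', shift(t[1], d, cutoff + 1))
    return ('app', shift(t[1], d, cutoff), shift(t[2], d, cutoff))

def subst(t, j, s):
    """Substitute term s for variable j in t."""
    if t[0] == 'var':
        return s if t[1] == j else t
    if t[0] == 'lam':
        return ('lam', subst(t[1], j + 1, shift(s, 1)))
    return ('app', subst(t[1], j, s), subst(t[2], j, s))

def beta(body, arg):
    """One beta-contraction: (λ.body) arg."""
    return shift(subst(body, 0, shift(arg, 1)), -1)

def step(t):
    """One leftmost-outermost reduction step, or None if t is in normal form."""
    if t[0] == 'app':
        f, a = t[1], t[2]
        if f[0] == 'lam':                 # redex at the head: contract it first
            return beta(f[1], a)
        fs = step(f)
        if fs is not None:
            return ('app', fs, a)
        an = step(a)
        return ('app', f, an) if an is not None else None
    if t[0] == 'lam':
        b = step(t[1])
        return ('lam', b) if b is not None else None
    return None

def normalize(t, limit=10000):
    """Iterate leftmost steps; finds the normal form whenever one exists."""
    for _ in range(limit):
        n = step(t)
        if n is None:
            return t
        t = n
    raise RuntimeError('no normal form found within step limit')

I = ('lam', ('var', 0))                   # λx.x
K = ('lam', ('lam', ('var', 1)))          # λx.λy.x
omega = ('app', ('lam', ('app', ('var', 0), ('var', 0))),
               ('lam', ('app', ('var', 0), ('var', 0))))  # diverging term

assert normalize(('app', I, I)) == I
# K I Ω terminates under the leftmost strategy even though Ω has no normal form:
assert normalize(('app', ('app', K, I), omega)) == I
```

The `K I Ω` example shows why the leftmost strategy is normalizing: it contracts the head redex before touching the argument, so the diverging subterm Ω is discarded rather than evaluated.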
Advantages of the chosen way to describe mathematics include the simplicity of fully formal construction of mathematical objects. This allows teaching mathematics as a theory with only a simple grammar, two rules for reducing expressions, and a normalizing strategy for reduction to achieve normal forms where they exist, and without any axioms. In contrast to the usual informal definitions and numerous axioms of elementary school mathematics, the proposed method is likely to be easier for children to understand, since children are known to re-invent grammar rules on their own when learning their first human language.
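As one example of such an axiom-free construction of mathematical objects, natural numbers and addition can be built from Church encodings alone, with arithmetic carried out purely by reduction. This sketch uses Python closures; the names are my own illustrative choices:

```python
# Church numerals: the numeral n applies a function n times.
ZERO = lambda f: lambda x: x                       # λf.λx.x
SUCC = lambda n: lambda f: lambda x: f(n(f)(x))    # λn.λf.λx.f (n f x)
PLUS = lambda m: lambda n: lambda f: lambda x: m(f)(n(f)(x))  # λm.λn.λf.λx.m f (n f x)

def to_int(n):
    """Decode a Church numeral into a native int (for inspection only)."""
    return n(lambda k: k + 1)(0)

TWO = SUCC(SUCC(ZERO))
THREE = SUCC(TWO)
assert to_int(PLUS(TWO)(THREE)) == 5
```

Nothing here is postulated: `2 + 3 = 5` holds simply because both sides reduce to the same normal form, which is the equational notion of proof the post describes.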