Author : Rodrigo Tamiazzo, Full-Stack Developer

Hi to you who have reached this article! I hope you are having a nice day and productive outcomes :) In this text I will attempt something non-trivial: talking about a deep, technical subject in a simple form, accessible to all readers. To get there, I ask you to read with patience. It's not a long text, but you will probably find strange terms or ideas. I encourage you to google them, come back here, and then search again. Hopefully, from this small cut of the whole history, I can transmit the core idea: for a long time people have been trying to solve all (solvable) problems, and society has evolved so much that it looks like we have already scratched those limits.

#### Leibniz's Dream: The Perfect Language

Have you ever programmed in Haskell or any pure functional language? Have you tried to understand what a Monad is? Have you ever read Leibniz's Monadology? ;) I mention all of this to give a glimpse of the importance of Gottfried W. Leibniz (1646) for philosophy, mathematics and computability theory. He is regarded as one of the few universalists, i.e., a genius who spread his knowledge across all areas. In fact, after his dispute with Newton over who invented infinitesimal calculus first (a dispute Newton won politically, backed by the Royal Society in England), it was Leibniz's notation and philosophy that persisted over the centuries. He is also regarded as one of the three great rationalists, together with René Descartes (1596) and Baruch Spinoza (1632).

Leibniz's contributions deserve a whole text of their own, but here I will focus on just one of his concepts: the idea of a Characteristica Universalis. As Wikipedia describes it, it is a "universal and formal language imagined by Gottfried Leibniz capable of expressing mathematical, scientific and metaphysical concepts." His wish was to be able to solve any human problem automatically, through a deterministic language that would compute, without errors, every possible human problem. He died without seeing his intent fulfilled.

#### Hilbert, Formalism and the limits of our knowledge

Let's jump from 1675 to the early 1920s. The mathematical world was big, and not well defined. Mathematicians like Cantor had brought innovative ideas about infinity, Bertrand Russell and Alfred North Whitehead had just written the Principia Mathematica, and until then mathematics was seen as a collection of separate fields: probability, arithmetic, etc. David Hilbert was one of the best mathematicians of his time, and he was sure he could unify mathematics and close its gaps. In 1900 he had posed his famous list of 23 open problems; in the 1920s he proposed his Hilbert Program: a plan to formalize all of mathematics and prove it consistent. Hilbert was a formalist, Formalism being the philosophy of mathematics which holds that a complete and consistent axiomatization of all of mathematics can be achieved by formal methods alone.

What he wasn't counting on was the work of a young Austrian at the beginning of his life as a logician. Kurt Gödel (1906) will be important here for two reasons. The first (and most notorious) is his paper "On Formally Undecidable Propositions of Principia Mathematica and Related Systems" and, consequently, his famous Incompleteness Theorems, which proved that Hilbert's dream could never be achieved: no effectively axiomatized system for arithmetic can be both complete and consistent. While this human limitation may already be shocking for some, let's move on and see how Hilbert and Gödel shaped how computers are today.

The Entscheidungsproblem (or "decision problem") is related to Hilbert's 10th problem (about Diophantine equations). It asks whether it is possible to create a universal machine/algorithm that always answers "yes" or "no" for every well-defined problem. In 1936 Alonzo Church and Alan Turing proved, independently, that it is impossible. The claim that their models capture exactly what is "effectively computable" became known as the Church-Turing thesis. Gödel was also working on these questions: his way of defining "effective computation" used General Recursive Functions. Church's method used only pure functions (a.k.a. the Lambda Calculus), and Turing contributed his model of the Turing machine. What is interesting about these results is that General Recursive Functions, the Lambda Calculus and Turing Machines all turn out to be equivalent! Put more concisely: functions (both pure and recursive) and Turing machines share exactly the same limits of what can be computed.
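To make the lambda-calculus side of this story a bit more concrete, here is a small sketch of Church numerals, where natural numbers are encoded as nothing but pure functions. I'm writing it in Haskell syntax for readability; the names `Church`, `suc`, `plus` and `toInt` are my own illustrative choices, not standard library functions:

```haskell
-- Church numerals: the number n is encoded as "apply a function n times".
-- This is ordinary lambda calculus, written in Haskell syntax.
type Church a = (a -> a) -> a -> a

zero :: Church a
zero _ x = x

suc :: Church a -> Church a
suc n f x = f (n f x)

plus :: Church a -> Church a -> Church a
plus m n f x = m f (n f x)

-- Convert back to an ordinary Int by applying (+1) n times to 0.
toInt :: Church Int -> Int
toInt n = n (+ 1) 0

main :: IO ()
main = print (toInt (plus (suc zero) (suc (suc zero))))  -- prints 3
```

Everything here is just function application, yet addition (and, with more work, every computable function) can be expressed this way; that is the heart of Church's model.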

#### Brouwer's Intuitionism: A New Philosophy for Mathematics / A New View of the World

The beginning of the 20th century was indeed turbulent for the world of logic and mathematics (remember that no computer had been built yet, despite all the existing theory). Despite the limitations exposed, we still needed a foundation for mathematics. Set theory was regarded as a good candidate, and after a few iterations we arrived at what is known today as ZFC set theory, our currently accepted model. ZF stands for Zermelo-Fraenkel (the people behind it) and C stands for the Axiom of Choice (AC). It is in fact very useful and trustworthy, but it still has its limitations. Have you ever heard of the Banach-Tarski paradox? I encourage you to google it; it is a direct consequence of AC. After reading about this paradox, do you still trust the Axiom?

Luitzen Brouwer (1881) was a Dutch mathematician and philosopher. Against the predominance of classical thought and classical logic, he saw the world from a different perspective. ZFC set theory tries to structure mathematics with some polemic and not always well-accepted ideas: for example, the Axiom of Choice, or the Law of the Excluded Middle (LEM), which since Aristotle has stated that either a thing IS or it IS NOT, with no middle ground. Brouwer saw it differently. Building on earlier thinkers like the German idealist Kant, he recognized that all mathematical objects are purely creations of the human mind, with no necessary correspondence to the real world. This may sound strange, and may even guide us toward some form of solipsism, but it is actually much better not to assume things beyond the necessary. We are sure that the mind creates mathematics, yet we still can't use it in all its completeness in the real world (hence the field of 'pure abstract mathematics').
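A concrete way to feel the difference: in intuitionistic logic you can prove that P implies not-not-P, but the converse (double-negation elimination, which is equivalent to LEM) is simply not derivable without classical axioms. A minimal sketch in Lean 4, where the classical direction has to invoke an explicit axiom:

```lean
-- Intuitionistically fine: from a proof of P we can refute ¬P.
theorem intro_dn (P : Prop) : P → ¬¬P :=
  fun p np => np p

-- The converse, ¬¬P → P, cannot be proved constructively;
-- in Lean it requires the classical axiom behind Classical.byContradiction.
theorem elim_dn (P : Prop) : ¬¬P → P :=
  fun nnp => Classical.byContradiction nnp
```

Constructively, a proof of "P or not P" must actually tell you which side holds; LEM asserts the disjunction without providing that information, which is exactly what Brouwer rejected.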

#### Martin-Löf: Dependent Type Theory and Correctness by Construction

As mentioned, for the sake of this article I'll have to skip some important people and ideas around Intuitionism and Constructivism, but I'd like to point out the direct relation of Brouwer's Intuitionism with another extremely important philosophy of mathematics and mind. Constructivism basically states that human beings actively construct their understanding and knowledge internally. This concept was also deeply explored in pedagogy, with the great work of Piaget (internal knowledge constructivism) and Vygotsky (social constructivism).

Per Martin-Löf (1942) is a Swedish mathematician and philosopher. He built upon Brouwer's work and introduced his concept of dependent types in his 1972 paper "An Intuitionistic Theory of Types". Types aren't new: Bertrand Russell used them in set theory to avoid his famous Russell Paradox, and Hindley (1939) and Milner (1934) gave us what is known as the Hindley-Milner type system. But a full dependent type system, when combined with a pure functional language, establishes a Correct by Construction (C.b.C.) system. C.b.C. means that the output code is free of runtime errors (once it compiles). Another way to look at such languages: for every input, the corresponding Turing machine always halts. The system is 100% deterministic.
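To give a taste of what dependent-style types can express, here is a sketch in Haskell (using the GADTs and DataKinds extensions to simulate the idea; `Vec` and `vhead` are illustrative names, not standard library functions): a vector that carries its own length in its type, so that taking the head of an empty vector is rejected at compile time instead of failing at runtime:

```haskell
{-# LANGUAGE DataKinds #-}
{-# LANGUAGE GADTs #-}
{-# LANGUAGE KindSignatures #-}

-- Type-level natural numbers.
data Nat = Z | S Nat

-- A vector indexed by its length: the length is part of the type.
data Vec (n :: Nat) a where
  VNil  :: Vec 'Z a
  VCons :: a -> Vec n a -> Vec ('S n) a

-- vhead only accepts provably non-empty vectors.
-- Writing `vhead VNil` is a compile-time type error, not a runtime crash.
vhead :: Vec ('S n) a -> a
vhead (VCons x _) = x

main :: IO ()
main = print (vhead (VCons (1 :: Int) (VCons 2 VNil)))  -- prints 1
```

In a language like Idris or Agda this style is native and pervasive; the compiler uses such type-level information to rule out whole classes of errors before the program ever runs.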

Coq, Agda and Idris are examples of pure functional languages that adhere to Martin-Löf Type Theory (MLTT). Haskell is a curious case: despite being designed around the Hindley-Milner type system, it has also absorbed influence from MLTT. But even with all of Haskell's goodness and correctness, it can't be considered Correct by Construction by default in the same sense as Coq, Agda or Idris, which adhere to MLTT completely.

#### Total Functions and Total Languages

A total function is a function that returns an output for every single possible input, always terminating. Total functions are important because they cannot throw runtime exceptions. The same concept can be applied to languages: a total language is one in which every computation must terminate and return a value, without any runtime errors. (The price is that a total language cannot be Turing-complete, since it cannot express non-terminating programs.) And it happens that the class of languages above (Coq/Agda/Idris) is designed to produce total programs.
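A minimal sketch of the difference in Haskell (the names `unsafeHead`, `safeHead` and `len` are my own): a function is partial when some input is left uncovered, and one classic way to totalize it is to widen the return type to `Maybe`:

```haskell
-- Partial: no equation for [], so this crashes on the empty list.
unsafeHead :: [a] -> a
unsafeHead (x : _) = x

-- Total: every possible input is covered...
safeHead :: [a] -> Maybe a
safeHead []      = Nothing
safeHead (x : _) = Just x

-- ...and structural recursion (the argument shrinks on every call)
-- guarantees termination.
len :: [a] -> Int
len []       = 0
len (_ : xs) = 1 + len xs

main :: IO ()
main = do
  print (safeHead ([] :: [Int]))  -- prints Nothing
  print (safeHead [1, 2, 3])      -- prints Just 1
  print (len "total")             -- prints 5
```

In a total language the compiler checks exhaustiveness and termination for you; in Haskell, `unsafeHead` still compiles with only a warning.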

#### Conclusions and further thoughts

So, what is the conclusion? Have we reached Leibniz's dream of a "perfect language"? It is actually hard to say, and whenever "the best" is chosen, most people get upset because it wasn't their first option. Are type systems always necessary? Are deterministic languages always necessary? I'd like to recall Rich Hickey's remark on why he made Clojure a dynamically typed language: "It is a feature, not a problem!".

Let's analyze Idris. It looks like a nice candidate for the perfect language: it is general purpose, Correct by Construction, and can be compiled to a variety of targets (JVM/JS/LLVM). You can install it standalone or within the Haskell Platform (just like Agda). The drawback is that there are more concepts to study and implement. Clojure, by contrast, is extremely fast to learn and to build with. I like to see it like this:

1 - Untyped or dynamically typed languages are a joy for scripting. Please try Babashka (or its sibling nbb), you will never come back! We can build full applications too, and although they have far fewer bugs (thanks to immutability, etc.), correctness is not 100% guaranteed without extra validation (such as tests).

2 - Static strong types: I have a hard time justifying these. While they do prevent simple errors, Java or TypeScript still provide a good amount of headache to the developer. Please take my criticism lightly; it's not that they are totally unnecessary, but since you are introducing types anyway, why not aim for 0% runtime errors?
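Even a strong static type system, by itself, does not get you to 0% runtime errors. A small Haskell sketch (the `crashes` helper is mine, purely for demonstration): this program type-checks perfectly and still fails at runtime, because `head` is partial:

```haskell
import Control.Exception (SomeException, evaluate, try)

-- Well-typed, compiles without errors... and crashes when forced.
boom :: Int
boom = head []

-- Force `boom` and report whether it threw a runtime exception.
crashes :: IO Bool
crashes = do
  r <- try (evaluate boom) :: IO (Either SomeException Int)
  return (either (const True) (const False) r)

main :: IO ()
main = do
  b <- crashes
  putStrLn (if b then "runtime error despite the types!" else "no crash")
```

This is exactly the gap that a total, correct-by-construction language closes: there, a definition like `boom` would be rejected at compile time.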

3 - Dependent typed pure functional languages have the highest degree of correctness ever achieved so far: they are used by mathematicians to prove theorems, and they have even been used to prove the correctness of microcontrollers! If I had a system with critical parts (a nuclear plant, a spaceship), I would consider using Idris for its critical core. You can compile it to a .jar file; the less critical parts can then be developed quickly and evolve in Clojure, with everything living together in the same application environment.

While it's hard to crown the best language ever, it really looks like it should be a deterministic, correct-by-construction, total language. Possibly a pure functional language with dependent types. It should compile to all platforms. It should be able to encompass all areas of mathematics. We are living in a fast-paced world where breaking changes happen daily. AGI is around the corner. Other strong candidate foundations for mathematics and computing are being developed (like Cubical Agda, which implements Homotopy Type Theory with the Univalence Axiom).

Did you already know about these theories? I know it can be a dense read, but I hope it gave you a spark of inspiration. If you have any comments or observations, please reach us at hello@flexiana.com. Happy Hacking!

The Category of "Correct by Construction" Languages