
Special issue on ‘Logical frameworks and metalanguages’

Published online by Cambridge University Press: 20 March 2003

GÉRARD HUET
Affiliation:
INRIA-Rocquencourt, France

Extract

There is both a great unity and a great diversity in presentations of logic. The diversity is staggering indeed – propositional logic, first-order logic and higher-order logic belong to one classification; linear logic, intuitionistic logic, classical logic, modal and temporal logics belong to another. Logical deduction may be presented as a Hilbert-style system of combinators, as a natural deduction system, as a sequent calculus, as proof nets of one variety or another, etc. Logic, originally a field of philosophy, turned into algebra with Boole, and more generally into meta-mathematics with Frege and Heyting. Professional logicians such as Gödel and later Tarski studied mathematical models, consistency and completeness, computability and complexity issues, set theory and foundations, etc. Logic became a very technical area of mathematical research in the last half century, with fine-grained analysis of the expressiveness of subtheories of arithmetic or set theory, detailed analysis of well-foundedness through ordinal notations, logical complexity, etc.

Meanwhile, computer modelling developed a need for concrete uses of logic, first for the design of computer circuits, then more widely for increasing the reliability of software through the use of formal specifications and proofs of correctness of computer programs. This gave rise to more exotic logics, such as dynamic logic, Hoare-style logic of axiomatic semantics, logics of partial values (such as Scott's denotational semantics and Plotkin's domain theory) or of partial terms (such as Feferman's free logic), etc. The first actual attempts at mechanisation of logical reasoning through the resolution principle (automated theorem proving) had been disappointing, but their shortcomings gave rise to a considerable body of research, developing detailed knowledge about equational reasoning through canonical simplification (rewriting theory) and proofs by induction (following Boyer and Moore's successful integration of primitive recursive arithmetic within the LISP programming language). The special case of Horn clauses gave rise to a new paradigm of non-deterministic programming, called Logic Programming, which later developed into Constraint Programming, blurring further the scope of logic.

In order to study knowledge acquisition, researchers in artificial intelligence and computational linguistics studied exotic versions of modal logics such as Montague's intensional logic, epistemic logic, dynamic logic and hybrid logic. Some others tried to capture common sense, and modelled the revision of beliefs with so-called non-monotonic logics. For the careful craftsmen of mathematical logic, this was the final outrage, and Girard pronounced his anathema on such "montres à moutarde" (mustard watches).

Type
Special issue on Logical frameworks and metalanguages
Copyright
© 2003 Cambridge University Press