Mobile-code safety becomes increasingly important as such code becomes more prevalent. Proof-carrying code (PCC), proposed by Necula, is a framework for the mechanical verification of safety properties (such as type safety) of machine-language programs by means of a machine-checkable proof. Foundational PCC (FPCC), described by Appel and Felty, is a refinement that gives types a denotational semantics and thus bases all the proofs on a foundational logic.
The semantic model of types in FPCC links machine states with types, enabling us to formalize high-level specifications of machine instructions as abstract instructions and to prove the implementation relation between concrete and abstract instructions. Proofs of implementation relations are organized in an implementation hierarchy. Using a toy machine as illustration, I will also show how to formalize the notion of typability of machine code based on implementation relations and prove type safety theorems (progress and preservation).
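To make the progress/preservation terminology concrete, here is a minimal sketch of a (hypothetical) toy machine in Haskell — not the machine from the talk — with a trivial "typing" judgment and the two safety properties stated as executable predicates:

```haskell
-- A hypothetical toy machine: a program counter and an accumulator,
-- stepping through a fixed instruction list.
data Instr = Inc | Dec | Halt deriving (Eq, Show)

type State = (Int, Int)  -- (program counter, accumulator)

-- One machine step; Nothing means the machine cannot step.
step :: [Instr] -> State -> Maybe State
step prog (pc, acc) = case drop pc prog of
  (Inc : _) -> Just (pc + 1, acc + 1)
  (Dec : _) -> Just (pc + 1, acc - 1)
  _         -> Nothing  -- Halt, or pc out of range

-- Toy typing judgment: the pc points into the program, and the
-- program ends in Halt so control cannot fall off the end.
wellTyped :: [Instr] -> State -> Bool
wellTyped prog (pc, _) =
  not (null prog) && last prog == Halt && pc >= 0 && pc < length prog

-- Progress: a well-typed state either sits at Halt or can step.
progress :: [Instr] -> State -> Bool
progress prog s@(pc, _) =
  not (wellTyped prog s) || prog !! pc == Halt || step prog s /= Nothing

-- Preservation: stepping from a well-typed state stays well-typed.
preservation :: [Instr] -> State -> Bool
preservation prog s =
  not (wellTyped prog s) || case step prog s of
    Nothing -> True
    Just s' -> wellTyped prog s'
```

The real FPCC development proves the analogous statements in a foundational logic; this sketch only shows the shape of the two theorems.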
We have been working on a compiler for Moby for the past two years. In this talk, I will describe the current state of the implementation and discuss a number of interesting technical issues that have come up in the implementation effort. I will talk about the infrastructure for cross-module typechecking and inlining, work on loop optimizations, and support for advanced control structures and concurrency.
The Moby effort is a joint project with Kathleen Fisher (AT&T Research) and we have had help from Stephanie Weirich and Dan Grossman (Cornell).
Point-free relation calculi have been fruitful in functional programming, but in specific applications pointwise expressions can be more convenient and comprehensible than point-free ones. To integrate pointwise with point-free, de Moor and Gibbons [AMAST 2000] give a relational semantics for lambda terms with non-injective pattern matching. An alternative semantics has been given in the category of ideal relations, and also in its associated category of predicate transformers. The predicate transformer model integrates functional programming constructs with the specifications and program constructs of imperative refinement calculus. As an application, pattern terms are used to concisely express update operations on pointer-based data structures, simplifying recent work of Reynolds by eliminating the need for state predicates and non-standard logic.
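The pointwise/point-free contrast the abstract draws can be seen even in plain (functional, non-relational) Haskell; the two hypothetical definitions below compute the same thing, first with bound variables and then as a combinator pipeline:

```haskell
-- Pointwise: bound variables and explicit pattern/comprehension syntax.
sumSquaresPW :: [Int] -> Int
sumSquaresPW xs = sum [x * x | x <- xs]

-- Point-free: a composition of combinators, no bound variables.
sumSquaresPF :: [Int] -> Int
sumSquaresPF = sum . map (^ 2)
```

The paper's setting is relational rather than functional — patterns there may be non-injective, so "functions" become relations — but the stylistic trade-off is the same.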
The ability to extend a programming language with new constructs is a valuable tool. With it, system designers can grow a language towards their problem domain, enhance productivity and ease maintenance. We present an extension to the Java language, called the Java Syntactic Extender (JSE), that allows Java programmers to define new syntactic constructs. The design is based on the Dylan macro system (e.g., rule-based pattern matching and hygiene), but exploits Java's compilation model to offer a full procedural macro engine. In other words, syntax expanders may be implemented in, and so use all the facilities of, the full Java language. This talk will include motivating examples, an implementation overview, and future challenges. The described system is implemented and working as a Java preprocessor.
Joint work with Keith Playford of Functional Objects, Inc.
This talk is based on a paper with the same title, recently accepted for publication in SCP.
One view of this talk is that it is about advanced functional programming: I present and use Arrows (a generalization of Monads) and polytypic programming (type-parametrized programs). But from a different viewpoint it is about embedded domain-specific languages for data conversion and about proving correctness of conversion programs.
Several generic programs for converting values from tree-like container datatypes to some other format, together with their corresponding inverses, are constructed.
Among the formats considered are shape plus contents, compact bit streams, and pretty-printed strings.
The different data conversion programs are constructed using John Hughes' Arrow combinators along with a proof that printing (from a datatype to another format) followed by parsing (from that format back to the datatype) is the identity.
The printers and parsers are described in PolyP, a polytypic extension of the functional language Haskell.
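As a concrete (and hypothetical) instance of the round-trip law the talk proves, here is a printer/parser pair for one fixed datatype in plain Haskell — the talk's versions are polytypic (written in PolyP) and built from Arrow combinators, whereas this sketch uses ordinary functions:

```haskell
-- A sample tree-like container datatype.
data Tree = Leaf Int | Node Tree Tree deriving (Eq, Show)

-- Print a tree to a flat stream of tags plus contents.
printT :: Tree -> [Int]
printT (Leaf n)   = [0, n]
printT (Node l r) = 1 : printT l ++ printT r

-- Parse consumes a prefix of the stream and returns the leftover.
parseT :: [Int] -> Maybe (Tree, [Int])
parseT (0 : n : rest) = Just (Leaf n, rest)
parseT (1 : rest)     = do
  (l, rest')  <- parseT rest
  (r, rest'') <- parseT rest'
  Just (Node l r, rest'')
parseT _ = Nothing

-- The correctness property: printing followed by parsing is the identity.
roundTrip :: Tree -> Bool
roundTrip t = parseT (printT t) == Just (t, [])
```

The polytypic versions generalize `printT`/`parseT` (and the proof of `roundTrip`) over all regular datatypes at once, rather than writing them per type as above.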
Call-by-push-value (CBPV) is a new typed programming language paradigm that, we claim, provides the semantic primitives from which call-by-value and call-by-name are built. Evidence for this claim is found in a wide range of semantics, from machine and big-step operational semantics to denotational models such as Scott semantics, possible worlds, continuations and games.
In this talk, we introduce the CBPV paradigm. Using an example program, we illustrate the slogan "a value is, a computation does". We give a Felleisen/Friedman-style CK-machine and Scott semantics, and show how CBV and CBN type and term constructors decompose into CBPV. We also look briefly at semantics for global store and continuations, by decomposing the Moggi monad T into two parts U and F.
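The type-level decompositions mentioned above can be sketched as Haskell newtypes (hypothetical names; Haskell's laziness collapses the operational distinction, so only the shape of the types is illustrated here):

```haskell
-- U B: a value holding a suspended ("thunked") computation of type B.
newtype U b = Thunk { force :: b }

-- F A: a computation that returns a value of type A.
newtype F a = Return { runF :: a }

-- The CBV function type A -> B decomposes in CBPV as U (A -> F B):
type CBV a b = U (a -> F b)

-- The CBN function type A -> B decomposes as U A -> B:
type CBN a b = U a -> b

-- Moggi's monad T factors through the two parts: T = U . F.
type T a = U (F a)

-- A sample CBV function, and application in the decomposed style:
-- force the thunk, apply it, then run the returner.
cbvSucc :: CBV Int Int
cbvSucc = Thunk (\n -> Return (n + 1))

applyCBV :: CBV a b -> a -> b
applyCBV f x = runF (force f x)
```

In the talk's semantics, `U` and `F` are genuinely distinct type constructors (values vs. computations); the sketch only records how the familiar CBV, CBN, and monadic types arrange them.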