Overview: We discuss the expectations of the course, and give a rough outline of the topics that will be covered. We will also touch on the benefits of the functional paradigm.
Keywords: course structure, course policy, OCaml, functional programming
Required reading: OCaml Programming (OCP) 1.1-1.3: Better programming through OCaml
Additional information: Homework 0 is released. It's not graded but it's important that you complete it so that you're set up to submit the later assignments.
Overview: We look at a couple of ways of interacting with OCaml via UTop. We discuss expressions in OCaml, particularly if-expressions and let-expressions, and begin looking more carefully at evaluation.
Keywords: UTop, values, expressions, evaluation, types, syntax, dynamic semantics, static semantics, if-expressions, let-expressions, let-definitions
Required reading:
Additional information:
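Example: A small illustrative sketch (not taken from the lecture) of a let-definition, an if-expression, and a let-expression, as they might be entered at the UTop prompt:

    (* a let-definition binds a name at the top level *)
    let x = 2 + 3

    (* an if-expression is itself an expression: both branches must have the same type *)
    let sign = if x >= 0 then "non-negative" else "negative"

    (* a let-expression: the binding of y is local to the body after `in` *)
    let z = let y = x * x in y + 1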
Overview: We cover the various ways of defining functions in OCaml, some fundamental and some just convenient. We note the issue of parentheses, which leads to a discussion of the right associativity of function types and the left associativity of function application. This is all connected to type inference, a topic that will (likely) come up later in the course.
Keywords: recursive functions, anonymous functions, (partial) function application, pipelines, labeled and optional arguments, associativity, type inference
Required reading:
Additional information:
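Example: A minimal sketch of the kinds of definitions covered; the names (fact, double, add_one) are our own, purely for illustration:

    (* a recursive function *)
    let rec fact n = if n <= 1 then 1 else n * fact (n - 1)

    (* an anonymous function bound to a name *)
    let double = fun x -> 2 * x

    (* partial application: add 1 is itself a function of type int -> int *)
    let add x y = x + y
    let add_one = add 1

    (* a pipeline: function application associates to the left,
       and |> lets us read the applications in order *)
    let result = 5 |> add_one |> double |> fact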
Overview: We examine unions and products as ways of organizing data in OCaml via variants, tuples and records. We also start studying pattern matching more closely and, in particular, what exactly a pattern is.
Keywords: variants, pattern matching, wildcards, tuples, records, accessors, record updates
Required reading:
Additional information:
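Example: An illustrative sketch of a variant, a record, and pattern matching; the types shape and point are hypothetical, not taken from the lecture:

    (* a variant type: a value is one of these alternatives (a union) *)
    type shape =
      | Circle of float              (* radius *)
      | Rect of float * float       (* width, height *)

    (* a record type: a value has all of these fields (a product) *)
    type point = { x : float; y : float }

    (* pattern matching on a variant *)
    let area s =
      match s with
      | Circle r -> 3.14159 *. r *. r
      | Rect (w, h) -> w *. h

    (* record creation, field access, and functional update *)
    let p = { x = 1.0; y = 2.0 }
    let p' = { p with y = p.x }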
Overview: We look at lists, our first example of a more complex data type, one which can depend on other types (i.e., it is polymorphic). We also discuss the connection between lists and tail recursion. In the reading, we see some useful functions for printing.
Keywords: cons, tail, type constructor, deep pattern matching, unit, printf
Required reading:
Additional information:
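Example: A sketch contrasting a plain recursive list function with a tail-recursive version that uses an accumulator, plus a use of printf:

    (* lists are a polymorphic, recursive data type built from [] and :: (cons) *)
    let rec length lst =
      match lst with
      | [] -> 0
      | _ :: t -> 1 + length t

    (* a tail-recursive version: the accumulator carries the running count *)
    let length_tr lst =
      let rec go acc = function
        | [] -> acc
        | _ :: t -> go (acc + 1) t
      in
      go 0 lst

    (* printing functions return unit *)
    let () = Printf.printf "%d\n" (length_tr [1; 2; 3])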
Overview: We cover algebraic data types (ADTs) in more detail, in particular recursive ADTs and parametric ADTs. This sets us up to start discussing things like evaluation strategies and infinite data types.
Required reading: OCP 3.9: Algebraic Data Types
Keywords: algebraic data types, simple variants, pattern matching, constructors, data-carrying variants, recursive variants, parametric variants, lists, options, results
Additional information:
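Example: A sketch of a parametric, recursive variant (a homemade list type, here called mylist purely for illustration):

    (* a parametric, recursive variant *)
    type 'a mylist =
      | Nil
      | Cons of 'a * 'a mylist

    (* options and results are also just variants in the standard library:
         type 'a option = None | Some of 'a
         type ('a, 'b) result = Ok of 'a | Error of 'b *)
    let rec sum = function
      | Nil -> 0
      | Cons (x, rest) -> x + sum rest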
Overview: We see some examples of more complex algebraic data types like trees, as well as some of their operations.
Required reading:
Keywords: algebraic data types, trees, pre-order, post-order, membership
Additional information:
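Example: An illustrative tree type together with pre-order, post-order, and membership functions (the names are ours, not necessarily the lecture's):

    type 'a tree =
      | Leaf
      | Node of 'a tree * 'a * 'a tree

    (* pre-order traversal: root, then left subtree, then right subtree *)
    let rec preorder = function
      | Leaf -> []
      | Node (l, v, r) -> (v :: preorder l) @ preorder r

    (* post-order traversal: left subtree, right subtree, then root *)
    let rec postorder = function
      | Leaf -> []
      | Node (l, v, r) -> postorder l @ postorder r @ [v]

    (* membership *)
    let rec mem x = function
      | Leaf -> false
      | Node (l, v, r) -> v = x || mem x l || mem x r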
Overview: We introduce the notion of higher-order functions, which can be used to parametrize functions by other functions. We look at two common higher-order functions: map and filter.
Required reading:
Keywords: higher-order functions, first-class values, functions as function parameters, the abstraction principle, map, filter
Additional information:
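Example: A sketch of map and filter written from scratch, showing how a function can be passed as a parameter:

    (* map applies f to every element *)
    let rec map f = function
      | [] -> []
      | x :: t -> f x :: map f t

    (* filter keeps the elements satisfying the predicate p *)
    let rec filter p = function
      | [] -> []
      | x :: t -> if p x then x :: filter p t else filter p t

    let evens_doubled =
      map (fun x -> 2 * x) (filter (fun x -> x mod 2 = 0) [1; 2; 3; 4])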
Overview: We finish our discussion of higher-order functions with folding and the use of higher-order functions for data types other than lists.
Required reading:
Keywords: fold right, fold left, tail-recursion, associativity, tree map, tree fold
Additional information:
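Example: Illustrative definitions of fold_right, fold_left, and a fold over trees; these mirror the standard library's folds but are our own sketches:

    (* fold_right processes the list from the right; it is not tail-recursive *)
    let rec fold_right f lst acc =
      match lst with
      | [] -> acc
      | x :: t -> f x (fold_right f t acc)

    (* fold_left carries an accumulator and is tail-recursive *)
    let rec fold_left f acc = function
      | [] -> acc
      | x :: t -> fold_left f (f acc x) t

    (* the same idea applies to other data types, e.g. a fold over trees *)
    type 'a tree = Leaf | Node of 'a tree * 'a * 'a tree

    (* combine the value at each node with the folds of its subtrees *)
    let rec fold_tree f acc = function
      | Leaf -> acc
      | Node (l, v, r) -> f v (fold_tree f acc l) (fold_tree f acc r)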
Overview: We discuss very briefly the module system of OCaml and the use of monads.
Required reading:
Keywords: options, monads, modules
Additional information:
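Example: A minimal sketch of the option monad packaged as a module; the names OptionMonad and safe_div are hypothetical:

    (* a tiny module packaging the option monad's return and bind *)
    module OptionMonad = struct
      let return x = Some x

      (* bind sequences computations that may fail, without nested matches *)
      let bind opt f =
        match opt with
        | None -> None
        | Some x -> f x
    end

    let safe_div x y = if y = 0 then None else Some (x / y)

    let example =
      OptionMonad.(bind (safe_div 10 2) (fun q -> return (q + 1)))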
Overview: We introduce formal grammars and BNF specifications.
Keywords: formal grammar, context-free grammar, BNF specification, nonterminal symbol, terminal symbol, sentential form, sentence, production rule, derivation, recognizing a sentence, generating a sentence, parse tree
Additional information:
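Example: An illustrative BNF specification of arithmetic expressions (written as a comment), together with one possible OCaml type for its parse trees:

    (* A small BNF specification (nonterminals in <...>, terminals as literals):
         <expr> ::= <expr> + <expr> | <expr> * <expr> | ( <expr> ) | <int>
       A derivation repeatedly replaces a nonterminal in a sentential form
       by the right-hand side of one of its production rules. *)

    (* one possible representation of the parse trees this grammar generates *)
    type expr =
      | Int of int
      | Add of expr * expr
      | Mul of expr * expr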
Overview: We discuss how grammars can be ambiguous, how we might avoid ambiguity, and why we might want to avoid ambiguity.
Keywords: grammatical ambiguity, (reverse) polish notation, parentheses, fixity, associativity, precedence
Additional information:
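Example: A sketch of how an ambiguous grammar admits two parse trees for the same sentence; the type expr and the sentence 1 + 2 * 3 are our own illustration:

    (* with the ambiguous grammar  <expr> ::= <expr> + <expr> | <expr> * <expr> | <int>,
       the sentence 1 + 2 * 3 has two distinct parse trees *)
    type expr = Int of int | Add of expr * expr | Mul of expr * expr

    let parse_a = Add (Int 1, Mul (Int 2, Int 3))  (* * binds tighter: 1 + (2 * 3) *)
    let parse_b = Mul (Add (Int 1, Int 2), Int 3)  (* + binds tighter: (1 + 2) * 3 *)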
Overview: We introduce syntactic sugar into our BNF specifications to make grammars easier to write down. We also take a brief detour to talk about regular grammars and regular expressions, which capture a weaker but ultimately very useful class of grammars.
Keywords: extended BNF, regular grammars, regular expressions
Additional information:
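Example: An EBNF repetition like digits ::= digit { digit } describes a regular language, which can be recognized in one left-to-right pass with no stack; a sketch:

    let is_digit c = '0' <= c && c <= '9'

    (* recognize nonempty digit strings with a single scan of the input *)
    let is_digits s =
      let n = String.length s in
      let rec go i = i >= n || (is_digit s.[i] && go (i + 1)) in
      n > 0 && go 0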
Overview: We begin our study of parsing, starting with lexical analysis, and then thinking a bit about the general parsing problem. We will look at an example of recursive descent, a top-down parsing method.
Keywords: parser generators, lexical analysis, lexeme, token, parsing, recursive descent
Supplemental reading: Much of this part of the course is based loosely on the book Concepts of Programming Languages by Robert W. Sebesta (available online without looking too hard, though we don't explicitly condone this...). We diverge from the book a fair amount, but if you want some additional reading on the topics, this source may be useful.
Additional information:
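Example: A minimal recursive-descent sketch over a hypothetical token type, for the toy grammar <expr> ::= <int> | ( <expr> + <expr> ):

    (* lexical analysis turns characters into tokens (lexemes tagged with their kind) *)
    type token = INT of int | PLUS | LPAREN | RPAREN

    (* recursive descent: each nonterminal becomes a function that
       consumes a prefix of the token list *)
    let rec parse_expr tokens =
      match tokens with
      | INT n :: rest -> Some (n, rest)
      | LPAREN :: rest ->
        (match parse_expr rest with
         | Some (l, PLUS :: rest') ->
           (match parse_expr rest' with
            | Some (r, RPAREN :: rest'') -> Some (l + r, rest'')
            | _ -> None)
         | _ -> None)
      | _ -> None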
Overview: We reiterate some of the material from the last lecture and introduce the notion of Chomsky normal form.
Keywords: lexical analysis, Chomsky normal form, context-free grammars
Additional information:
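Example: A sketch of one way to represent a grammar in Chomsky normal form, where every production rewrites a nonterminal to either two nonterminals or a single terminal (the type names are ours):

    (* productions in Chomsky normal form *)
    type production =
      | Pair of string * (string * string)  (* A ::= B C *)
      | Term of string * char               (* A ::= a *)

    type grammar = { start : string; rules : production list }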
Overview: We introduce a high-level functional interface for building up complex parsers. We see many examples and connect this interface back to our discussion of EBNF grammars.
Keywords: combinator, mapping, sequencing, alternatives, pure, fail, bind, optionals, repetition
Additional information:
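Example: A minimal sketch of a parser-combinator interface; the representation ('a parser as a function on character lists) is one possible choice, not necessarily the one used in lecture:

    (* a parser consumes a prefix of the input and may produce a result *)
    type 'a parser = char list -> ('a * char list) option

    let pure x : 'a parser = fun input -> Some (x, input)

    let fail : 'a parser = fun _ -> None

    (* sequencing: run p, then feed its result to f *)
    let bind (p : 'a parser) (f : 'a -> 'b parser) : 'b parser =
      fun input ->
        match p input with
        | None -> None
        | Some (x, rest) -> f x rest

    (* alternatives: try p, and fall back to q if p fails *)
    let alt (p : 'a parser) (q : 'a parser) : 'a parser =
      fun input ->
        match p input with
        | None -> q input
        | some -> some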
Overview: We look generally at the notion of semantics, distinguishing between static and dynamic semantics. We introduce configurations and reduction rules as the basis of operational semantics.
Keywords: static semantics, dynamic semantics, denotational semantics, operational semantics, abstract machine, configuration, reduction rule, stack-oriented language
Additional information:
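Example: A sketch of a configuration and a single-step reduction function for a hypothetical stack-oriented language (the instruction set here is ours, not the course's):

    (* a configuration pairs the remaining program with the current stack *)
    type instr = Push of int | Add
    type config = instr list * int list

    (* one reduction rule per instruction shape: a single step of the machine *)
    let step (c : config) : config option =
      match c with
      | Push n :: prog, stack -> Some (prog, n :: stack)
      | Add :: prog, x :: y :: stack -> Some (prog, (x + y) :: stack)
      | _ -> None   (* no rule applies: the configuration is stuck or final *)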
Overview: We look more carefully at the notion of a reduction rule and define the notion of a derivation, which verifies that a reduction holds. We also introduce additional reduction rules to be able to reason about multi-step reductions. We see general techniques for building derivations of reductions.
Keywords: operational semantics, configuration, abstract machine, reduction rule, derivation, single-step reduction, multi-step reduction
Additional information:
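Example: One way to think about multi-step reduction is as iterating the single-step relation until no rule applies; a generic sketch:

    (* multi-step reduction: keep applying the single-step relation
       until no reduction rule applies *)
    let rec multi_step (step : 'a -> 'a option) (c : 'a) : 'a =
      match step c with
      | None -> c
      | Some c' -> multi_step step c'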
Overview: We briefly discuss the design of reduction rules. We then introduce a toy stack language with variables (this will be useful for project 1).
Keywords: environment, fetching from the environment, updating the environment
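Example: A sketch of an environment as an association list, with fetching and updating (the representation is one possible choice, not necessarily the project's):

    (* an environment maps variables to values *)
    type env = (string * int) list

    (* fetching from the environment *)
    let fetch (e : env) (x : string) : int option = List.assoc_opt x e

    (* updating the environment: the new binding shadows any earlier one *)
    let update (e : env) (x : string) (v : int) : env = (x, v) :: e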
Overview: We discuss the difference between dynamic and lexical scoping. We see some examples of dynamic scoping and hint at the difficulty of lexical scoping in the context of subroutines.
Keywords: environment, scope, mutability, binding, assignment, lexical scoping, dynamic scoping, let-bindings, binding-defined scope, block-defined scope
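Example: OCaml itself is lexically scoped: a function refers to the bindings in scope where it is defined, not where it is called. A small illustration:

    let x = 1
    let f () = x       (* under lexical scoping, this x is the x above *)
    let x = 2          (* a new binding that shadows the old one *)
    let result = f ()  (* evaluates to 1; under dynamic scoping it would be 2 *)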
Overview: We discuss various methods of parameter passing, in particular call-by-name and call-by-value. We see the benefits and downsides of each method.
Keywords: evaluation strategy, call-by-name, call-by-value, the lambda calculus, call-by-reference
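Example: OCaml uses call-by-value; call-by-name can be simulated by passing a thunk that delays evaluation. A sketch (the names are illustrative):

    (* OCaml is call-by-value: the argument is evaluated before the call,
       so the commented line would diverge even though the argument is unused *)
    let const_zero _ = 0
    let rec loop () = loop ()
    (* let bad = const_zero (loop ()) *)   (* would not terminate *)

    (* call-by-name can be simulated by passing a thunk (unit -> 'a), which
       delays evaluation until the function chooses to force it *)
    let const_zero' (_ : unit -> int) = 0
    let ok = const_zero' (fun () -> loop ())   (* terminates: the thunk is never forced *)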