This page collects the bits and pieces of source code and other material that I created during my studies at Eötvös Loránd University. Expect it to be mostly in Hungarian.
MSc thesis
Compositional Type Checking for Hindley-Milner Type Systems with Ad-hoc Polymorphism: My 2011 MSc thesis. For the software behind it, and other details, please check this page.
Theoretical subjects
Relational Programming
Read this introductory paper to get an idea of what this is all about. I've put some problems and solutions online. If you understand Hungarian, you're in luck because there's an online textbook in two parts.
Formal languages & automata
- The Turtle language family: Family of languages describing geometrical shapes using turtle graphics semantics.
- The SP programming language: A very simple toy language for writing programs solving number theory problems.
Functional programming languages
- Alef: Interpreter for a toy functional language with lazy evaluation and Hindley-Milner type inference
- Dependent types with Agda: Literate Agda source file and presentation slides, in Hungarian
Parallel programming & Grid computing
- Pipelined calculation of a Taylor approximation
- Parallel prefix computation
- Generating rainbow tables for MD5 on the EGEE grid
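For flavour, the parallel prefix problem above admits a classic divide-and-conquer formulation. The following is my illustrative sketch of that scheme (not the code from the assignment): both halves can be scanned independently, then the left half's total is combined into the right half.

```haskell
-- Divide-and-conquer inclusive scan. Requires the operator to be
-- associative; with that, the two recursive calls are independent
-- and could run on separate workers.
parScan :: (a -> a -> a) -> [a] -> [a]
parScan _ []  = []
parScan _ [x] = [x]
parScan f xs  = ls ++ map (f (last ls)) rs
  where
    (l, r) = splitAt (length xs `div` 2) xs
    ls     = parScan f l   -- these two recursive calls are the
    rs     = parScan f r   -- parallelisable work
```

With (+) this computes running sums: parScan (+) [1..8] gives [1,3,6,10,15,21,28,36].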
Parsers & compilers
- Simple backtracking parser: A simple backtracking parser generator, written in Lisp
- Recursive descent LL(1) parser: Another Lisp parser generator macro, this time implementing the recursive descent strategy.
- SP compiler: A very simple compiler for the SP language (see above) that compiles directly to IA-32 assembly.
- Advanced SP compiler and interpreter: A slightly more complex compiler and interpreter for the SP language that does const folding.
- LALR(1) parser generator: Completing the list of Lisp macros for creating parsers using various strategies, this last one implements the LALR(1) optimization of the LR(1) algorithm.
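The constant folding mentioned for the advanced SP compiler works by rewriting the expression tree bottom-up, collapsing any operation whose operands are already constants. Here is a minimal sketch over a hypothetical expression type (SP's actual AST differs):

```haskell
-- Illustrative only: not SP's real AST.
data Expr
  = Lit Int
  | Var String
  | Add Expr Expr
  | Mul Expr Expr
  deriving (Show, Eq)

-- Fold constant subexpressions bottom-up.
constFold :: Expr -> Expr
constFold (Add a b) = case (constFold a, constFold b) of
  (Lit x, Lit y) -> Lit (x + y)
  (a', b')       -> Add a' b'
constFold (Mul a b) = case (constFold a, constFold b) of
  (Lit x, Lit y) -> Lit (x * y)
  (a', b')       -> Mul a' b'
constFold e = e
```

Note that folding happens even under non-constant nodes: constFold (Add (Var "n") (Add (Lit 1) (Lit 2))) becomes Add (Var "n") (Lit 3).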
Algebra and Number Theory
- Twin prime sieving: An application of the Miller-Rabin primality test (Literate Haskell source)
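For reference, the Miller-Rabin test mentioned above can be summarised as follows. This is an independent sketch, not the linked Literate Haskell source; it assumes a fixed witness set, which makes the test deterministic for moderately sized inputs.

```haskell
-- Modular exponentiation by repeated squaring.
powMod :: Integer -> Integer -> Integer -> Integer
powMod _ 0 m = 1 `mod` m
powMod b e m
  | even e    = let h = powMod b (e `div` 2) m in h * h `mod` m
  | otherwise = b * powMod b (e - 1) m `mod` m

-- Write n-1 as 2^r * d with d odd.
decompose :: Integer -> (Int, Integer)
decompose n = go 0 (n - 1)
  where go r d | even d    = go (r + 1) (d `div` 2)
               | otherwise = (r, d)

-- One Miller-Rabin round: witness a fails to prove n composite.
millerRabin :: Integer -> Integer -> Bool
millerRabin n a
  | x == 1 || x == n - 1 = True
  | otherwise            = any (== (n - 1)) (take (r - 1) squares)
  where
    (r, d)  = decompose n
    x       = powMod a d n
    squares = iterate (\y -> y * y `mod` n) (x * x `mod` n)

-- Trial division by the witnesses, then Miller-Rabin rounds.
isPrime :: Integer -> Bool
isPrime n
  | n < 2                       = False
  | n `elem` ps                 = True
  | any ((== 0) . (n `mod`)) ps = False
  | otherwise                   = all (millerRabin n) ps
  where ps = [2, 3, 5, 7]
```

Twin prime sieving then amounts to looking for n with isPrime n and isPrime (n + 2), e.g. 101 and 103.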
Linear and non-linear optimization
- Infeasible interior-point optimizer for linear optimization problems, with Literate Haskell source
Practical subjects
Programming Environments
Programming assignments for the fall semester of 2005-2006:
Programming Languages/C++
The assignment was to write a solver for the Sudoku puzzle. Input format: 81 digits (separated by whitespace) describing the board row by row, with 0 denoting unspecified cells. Output format: the 81 digits of the solution, or 81 zeros if an error occurred (such as a malformed input or an unsolvable puzzle).
To aid testing, I also wrote a small Python script that checks solved Sudoku boards of the format described above.
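The checking logic itself is simple: a solved board is valid iff every row, column, and 3×3 box is a permutation of 1..9. Here is an illustrative sketch in Haskell (the actual script linked above is in Python):

```haskell
import Data.List (sort)

-- A board is 81 digits, row by row (0 would mean an unsolved cell).
type Board = [Int]

-- Read the digits, skipping the whitespace separators.
parseBoard :: String -> Board
parseBoard = map (\c -> fromEnum c - fromEnum '0') . filter (`elem` ['0'..'9'])

chunk9 :: [a] -> [[a]]
chunk9 [] = []
chunk9 xs = take 9 xs : chunk9 (drop 9 xs)

rows, cols, boxes :: Board -> [[Int]]
rows    = chunk9
cols b  = [ [ b !! (r * 9 + c) | r <- [0..8] ] | c <- [0..8] ]
boxes b = [ [ b !! ((br * 3 + r) * 9 + bc * 3 + c) | r <- [0..2], c <- [0..2] ]
          | br <- [0..2], bc <- [0..2] ]

-- Valid iff all 27 units are permutations of 1..9.
validBoard :: Board -> Bool
validBoard b =
  length b == 81 && all ((== [1..9]) . sort) (rows b ++ cols b ++ boxes b)
```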
Programming Languages/IA-32 Assembly
Blog
The case for compositional type checking
23 October 2010 (programming haskell language ELTE) (1 comment)

This post is based on a chapter of the M.Sc. thesis I am writing at ELTE, supervised by Péter Diviánszky.
For my M.Sc. thesis, I've been working on writing a compositional type checker for Haskell 98. The basic idea is to extend Olaf Chitil's compositional type system with ad-hoc polymorphism, Haskell 98's major extension to the Hindley-Milner type system. In this post, I'm showing the motivation behind wanting to go compositional.
A property shared by W and M, the two commonly used algorithms for Hindley-Milner type inference, is that both infer the type of a composite expression by first inferring the type of one subexpression (in some sense, the “first” one) and then using the result while inferring the type of the “next” one. They are linear in the sense that partial results are threaded through the entire type inference.
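To see the order-dependence concretely, here is a deliberately crude sketch, entirely my own illustration rather than W itself: each use of a variable imposes a required type, and a linear checker threads the "current" type through the uses, so the blame falls on whichever conflicting use comes later.

```haskell
-- Each element pairs a use site with the type that site demands for
-- the variable. A linear, W-style traversal adopts the first demand it
-- sees and reports an error at the first later site that disagrees.
checkUses :: [(String, String)] -> Either String ()
checkUses = go Nothing
  where
    go _ [] = Right ()
    go Nothing ((_, t) : rest) = go (Just t) rest
    go (Just t) ((site, t') : rest)
      | t == t'   = go (Just t) rest
      | otherwise = Left ("type error in " ++ site)
```

Traversing the uses of x left to right blames not x; traversing right to left blames toUpper x. The error site is an artefact of traversal order, not of the program.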
The effect of this linearity is that certain subexpressions (those processed earlier) have a greater influence on the typing of the other subexpressions. This is bad because it imposes a hierarchy on the subexpressions that is determined solely by the type checking algorithm, not by the type system; thus, it can lead to misleading error messages for the programmer.
For example, let's take a Haskell function of one argument x whose body is the pair (toUpper x, not x).
There are two ways to typecheck this definition using W: either we first typecheck toUpper x in the context {x :: α}, yielding the type equation α ~ Char, and then check not x in {x :: Char}; or we do it the other way around, first looking at not x and then recursing into toUpper x with the context {x :: Bool}.
GHC, it seems, does the former, resulting in the following error message:
In the first argument of `not', namely `x'
In the expression: not x
In the expression: (toUpper x, not x)
Whereas Hugs 98 does the latter:
*** Expression : toUpper x
*** Term : x
*** Type : Bool
*** Does not match : Char
The problem is that they are both misleading, because there is nothing wrong with either not x or toUpper x by itself. The problem only comes from trying to unify their respective views on the type of x.
A compositional type checker, in contrast, descends into toUpper x and not x using the same context, {x :: α}. The first results in the typing {x :: Char} ⊢ Char, and the second in {x :: Bool} ⊢ Bool (a typing is defined to be not just the type of an expression, but also a mapping of its monomorphic variables to their types). Only afterwards does the checker try to unify these two typings.
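On a toy scale, unifying two typings' views of a shared variable looks like this. This is my own minimal sketch over monomorphic toy types, not the thesis code, which handles full Haskell 98:

```haskell
import qualified Data.Map as M

-- Toy types: just variables and nullary constructors.
data Ty = TVar String | TCon String
  deriving (Eq, Show)

-- A typing pairs a monomorphic environment with a result type,
-- e.g. {x :: Char} |- Char.
data Typing = Typing (M.Map String Ty) Ty
  deriving Show

-- Unify two toy types, returning a substitution or an error.
unify :: Ty -> Ty -> Either String (M.Map String Ty)
unify (TVar a) t = Right (M.singleton a t)
unify t (TVar a) = Right (M.singleton a t)
unify (TCon a) (TCon b)
  | a == b    = Right M.empty
  | otherwise = Left ("Cannot unify `" ++ a ++ "' with `" ++ b ++ "'")

-- Merging two typings unifies their views of every shared variable;
-- for these toy types, collecting the pairwise unifiers suffices
-- (real types would need substitution composition as well).
merge :: Typing -> Typing -> Either String [M.Map String Ty]
merge (Typing env1 _) (Typing env2 _) =
  mapM (uncurry unify) (M.elems (M.intersectionWith (,) env1 env2))
```

Merging {x :: Char} ⊢ Char with {x :: Bool} ⊢ Bool fails on x, even though either typing is fine on its own, which is exactly where the blame belongs.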
This is better because it becomes meaningful to talk about the typing of a subexpression. For the example above, my work-in-progress compositional type checker can report errors with an (IMO) much more helpful message:
(toUpper x, not x)
Cannot unify `Char' with `Bool' when unifying `x':
     toUpper x      not x
     Char           Bool
x :: Char           Bool
Of course, the devil's in the details — but that's what my thesis will be about.