Instructor: Dr. István MIKLÓS
Text: handouts. We will also use selected chapters from Dasgupta, Papadimitriou, Vazirani: Algorithms and from Miklós: Computational Complexity of Counting and Sampling.
Prerequisite: General mathematical experience up to the level of elementary algebra, calculus and combinatorics is expected. Very elementary probability theory is also needed.
Course description: The central, enduring question of computer science is: "how do we compute it?". What has changed over time is what we want to compute. In the 1960s, decision and optimization problems were the centre of interest. Nowadays, in the age of exabytes and the Internet of Things, we want to mine information and knowledge from data, and to teach computers to perform tasks automatically (such as classifying objects). The aim of this course is to give an overview of the most important algorithms invented in the last 60 years, with special emphasis on the similarities between classical and modern approaches. This is an introductory course; however, it becomes demanding quite rapidly.
Topics:
- Basic notions: algorithms, running time, asymptotic behaviour. The O, \Omega and \Theta notations. Growth rates of functions: polynomial, super-polynomial, sub-exponential and exponential growth. A very brief introduction to computational complexity: the P, NP and NP-complete classes.
- Dynamic programming. The money change problem, the knapsack problem. String processing algorithms: edit distance, longest common subsequence. Shortest path algorithms. Algebraic dynamic programming: universal algorithms for decision, counting and optimization.
- Divide and conquer. Karatsuba's fast multiplication. Fast matrix multiplication. Mergesort. The master theorem.
- Sorting algorithms. Bubble sort, heapsort, quicksort.
- Hidden Markov Models: the Viterbi and the Forward algorithms. The Backward algorithm and Baum-Welch training: the Expectation-Maximization algorithm.
- Clustering algorithms: hierarchical and non-hierarchical clustering. UPGMA, Neighbor Joining. The k-means algorithm.
- Markov Chain Monte Carlo: the Metropolis-Hastings algorithm. Gibbs sampling. Parallel tempering. Connection to modern theoretical computer science: hard counting and sampling problems that are easy to approximate. Example: the Karzanov-Khachiyan Markov chain on the linear extensions of partially ordered sets.
- Statistical learning algorithms (if time permits): classification methods, artificial neural networks, support vector machines.
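To give a flavour of the dynamic programming topic above, here is a minimal sketch of the classical edit-distance recurrence. Function and variable names are illustrative, not taken from the course material.

```python
def edit_distance(a, b):
    # dp[i][j] = minimum number of insertions, deletions and substitutions
    # needed to turn a[:i] into b[:j]
    m, n = len(a), len(b)
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        dp[i][0] = i          # delete all i characters
    for j in range(n + 1):
        dp[0][j] = j          # insert all j characters
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0 if a[i - 1] == b[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,        # deletion
                           dp[i][j - 1] + 1,        # insertion
                           dp[i - 1][j - 1] + cost) # match / substitution
    return dp[m][n]
```

The same table, filled with different operations (sum instead of min), counts alignments rather than optimizing them; this is the idea behind the algebraic dynamic programming topic.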
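The divide-and-conquer topic includes Karatsuba's multiplication, which replaces four recursive half-size products with three. A hedged sketch on Python integers (the bit-shift splitting is one possible implementation choice):

```python
def karatsuba(x, y):
    # Multiply two non-negative integers using three recursive products
    # instead of four, giving O(n^{log2 3}) bit operations.
    if x < 10 or y < 10:
        return x * y
    half = max(x.bit_length(), y.bit_length()) // 2
    mask = (1 << half) - 1
    xh, xl = x >> half, x & mask   # x = xh * 2^half + xl
    yh, yl = y >> half, y & mask   # y = yh * 2^half + yl
    a = karatsuba(xh, yh)
    b = karatsuba(xl, yl)
    # one extra product recovers both cross terms: xh*yl + xl*yh
    c = karatsuba(xh + xl, yh + yl) - a - b
    return (a << (2 * half)) + (c << half) + b
```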
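Mergesort appears both as a divide-and-conquer example and among the sorting algorithms; its recurrence T(n) = 2T(n/2) + O(n) is a standard application of the master theorem, giving O(n log n). A short illustrative sketch:

```python
def mergesort(xs):
    # Divide: split the list in half; conquer: sort each half recursively;
    # combine: merge the two sorted halves in linear time.
    if len(xs) <= 1:
        return xs
    mid = len(xs) // 2
    left, right = mergesort(xs[:mid]), mergesort(xs[mid:])
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    return merged + left[i:] + right[j:]   # append the leftover tail
```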
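For the Hidden Markov Model topic, the Viterbi algorithm is itself a dynamic program over hidden states. Below is a generic sketch together with a toy two-state model; the model, its probabilities and all names are illustrative assumptions, not material from the course.

```python
def viterbi(obs, states, start_p, trans_p, emit_p):
    # V[t][s]: probability of the most likely state path that ends in
    # state s at time t and emits obs[0..t]; back[t][s] remembers the
    # predecessor state achieving it.
    V = [{s: start_p[s] * emit_p[s][obs[0]] for s in states}]
    back = [{}]
    for t in range(1, len(obs)):
        V.append({}); back.append({})
        for s in states:
            prob, prev = max((V[t - 1][p] * trans_p[p][s] * emit_p[s][obs[t]], p)
                             for p in states)
            V[t][s], back[t][s] = prob, prev
    # Trace back from the best final state.
    prob, last = max((V[-1][s], s) for s in states)
    path = [last]
    for t in range(len(obs) - 1, 0, -1):
        path.append(back[t][path[-1]])
    return prob, path[::-1]

# Toy model (hypothetical numbers for illustration only)
states = ("Healthy", "Fever")
start_p = {"Healthy": 0.6, "Fever": 0.4}
trans_p = {"Healthy": {"Healthy": 0.7, "Fever": 0.3},
           "Fever":   {"Healthy": 0.4, "Fever": 0.6}}
emit_p = {"Healthy": {"normal": 0.5, "cold": 0.4, "dizzy": 0.1},
          "Fever":   {"normal": 0.1, "cold": 0.3, "dizzy": 0.6}}
```

Replacing max by a sum in the same recursion yields the Forward algorithm, which computes the total probability of the observations.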
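For the clustering topic, a minimal sketch of the k-means heuristic (Lloyd's algorithm): alternate between assigning each point to its nearest centre and moving each centre to the mean of its cluster. The data and parameters below are illustrative.

```python
import random

def kmeans(points, k, iters=20, seed=0):
    # Lloyd's algorithm on points given as tuples of coordinates.
    rng = random.Random(seed)
    centers = rng.sample(points, k)   # initialize with k distinct data points
    for _ in range(iters):
        # Assignment step: each point joins its nearest centre.
        clusters = [[] for _ in range(k)]
        for p in points:
            i = min(range(k),
                    key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centers[c])))
            clusters[i].append(p)
        # Update step: each centre moves to the mean of its cluster
        # (an empty cluster keeps its old centre).
        centers = [tuple(sum(xs) / len(xs) for xs in zip(*cl)) if cl else centers[i]
                   for i, cl in enumerate(clusters)]
    return centers, clusters
```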
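Finally, a hedged sketch of the Metropolis algorithm (the symmetric-proposal special case of Metropolis-Hastings): propose a move and accept it with probability min(1, pi(y)/pi(x)), which makes pi the stationary distribution. The target distribution and proposal below are toy examples chosen for illustration.

```python
import random

def metropolis(target, propose, x0, steps, seed=0):
    # Metropolis sampler with a symmetric proposal kernel: the chain
    # accepts a proposed state y with probability min(1, target(y)/target(x)).
    rng = random.Random(seed)
    x = x0
    samples = []
    for _ in range(steps):
        y = propose(x, rng)
        if rng.random() < min(1.0, target(y) / target(x)):
            x = y
        samples.append(x)
    return samples

# Toy example: sample i in {0,...,9} with probability proportional to i+1,
# proposing a +/-1 step on a cycle (a symmetric proposal).
target = lambda i: i + 1
propose = lambda i, rng: (i + rng.choice([-1, 1])) % 10
```

Under this target the stationary mean is sum(i*(i+1) for i in range(10)) / 55 = 6, so a long run's empirical mean should be close to 6.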