Welcome to the homepage of the lecture Image Compression, Summer Term 2024
Image Compression
Lecture (4h) with exercises (2h)

Motivation: High-resolution image data is becoming increasingly popular in research and commercial applications (e.g. entertainment, medical imaging). In addition, there is a high demand for content distribution via the internet. Due to the resulting increase in storage and bandwidth requirements, image compression is a highly relevant and very active area of research.

Teaching Goals: The course is designed as a supplement to image processing lectures, to be attended before, after, or in parallel with them. After the lecture, participants should understand the theoretical foundations of image compression and be familiar with a wide range of classical and contemporary compression methods.

Contents: The lecture spans the whole evolution of image compression, from the dawn of information theory to recent machine learning approaches. It is separated into two parts: The first half of the lecture deals with lossless image compression. We discuss the information-theoretic background of so-called entropy coders (e.g. Huffman coding, arithmetic coding, ...), talk about dictionary methods (e.g. LZW), and cover state-of-the-art approaches like PPM and PAQ. These tools are not limited to compressing image data, but also form core parts of general data compression software such as BZIP2. Knowledge about entropy coding and prediction is key for understanding classic and contemporary lossless codecs like PNG, GIF, or JBIG. The second part of the lecture is dedicated to lossy image compression techniques. We deal with classic transformation-based compression (JPEG, JPEG 2000), but also with emerging approaches like inpainting-based, fractal, or neural network compression. Furthermore, we consider related topics like human perception and error measures.

For registration and more detailed information, please visit the CMS.
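To give a flavour of the entropy coding topic named above, here is a minimal Huffman coding sketch in Python. It is an illustrative example only and not part of the course material; the function name and the example string are our own choices.

```python
# Minimal illustrative sketch of Huffman coding, one of the entropy coders
# named in the Contents above. The function name and example string are
# hypothetical and not taken from the lecture material.
import heapq
from collections import Counter


def huffman_code(data: str) -> dict:
    """Build a prefix-free code table from symbol frequencies."""
    freq = Counter(data)
    # Each heap entry: (frequency, tie-breaker, {symbol: partial codeword}).
    heap = [(f, i, {sym: ""}) for i, (sym, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    counter = len(heap)
    if len(heap) == 1:  # degenerate case: only one distinct symbol
        return {sym: "0" for sym in heap[0][2]}
    while len(heap) > 1:
        f1, _, t1 = heapq.heappop(heap)
        f2, _, t2 = heapq.heappop(heap)
        # Prepend a bit to every codeword in each merged subtree.
        merged = {s: "0" + c for s, c in t1.items()}
        merged.update({s: "1" + c for s, c in t2.items()})
        heapq.heappush(heap, (f1 + f2, counter, merged))
        counter += 1
    return heap[0][2]


if __name__ == "__main__":
    text = "abracadabra"
    table = huffman_code(text)
    encoded = "".join(table[ch] for ch in text)
    print(table)
    print(f"{len(text) * 8} bits raw vs. {len(encoded)} bits Huffman-coded")
```

Frequent symbols receive shorter codewords, which is the basic principle behind the entropy coders discussed in the first half of the lecture.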
MIA Group