Telecom Paris
Dep. Informatique & Réseaux

J-L. Dessalles

July 2021

IA325/IA703: Algorithmic Information and Artificial Intelligence

Lecturers:     Jean-Louis Dessalles, Pierre-Alexandre Murena, and Etienne Houzé







Objectives

Algorithmic Information Theory (AIT) is based on the mathematical notion of complexity, which was invented some 50 years ago to address issues in machine learning, randomness and proof theory. It derives from a fundamental intuition: complex objects cannot be described by short algorithms. Complexity corresponds to the size of algorithms, not to their speed.
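To make this concrete, the compressed size of a string gives a crude, compressor-dependent upper bound on its description complexity. Below is a minimal Python sketch (an illustration, not course material) that contrasts a regular string with an incompressible one, using zlib as the compressor:

    import random
    import zlib

    def complexity_upper_bound(data: bytes) -> int:
        """Length in bits of a zlib encoding of data: a crude,
        compressor-dependent upper bound on description complexity."""
        return 8 * len(zlib.compress(data))

    regular = b"ab" * 500                                      # output of a very short program
    random.seed(0)
    noisy = bytes(random.randrange(256) for _ in range(1000))  # no regularity zlib can exploit

    print(complexity_upper_bound(regular))  # far below 8000 bits
    print(complexity_upper_bound(noisy))    # close to (or slightly above) 8000 bits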

Creating Artificial Intelligence is one of the greatest challenges in the history of humankind. Programs are said to be "intelligent" because they solve difficult problems, such as playing the game of Go. Unfortunately, Artificial Intelligence is often perceived as no more than that: a collection of brilliant, innovative problem-solving methods. Few people imagine that intelligent behaviour can be universally described in terms of algorithmic information.

There is currently a growing interest in Complexity and AIT for their role in the theoretical foundations of Artificial Intelligence. Moreover, practical approaches to complexity, based on compression or minimum description length, provide efficient tools for machine learning. AIT also plays an important role in mathematics, for instance in setting limits on what a formal theory or an intelligent system can do. More recently, AIT has proved essential for addressing aspects of human intelligence, such as perception, relevance, decision making and emotional intensity.






Content

Topics
Chapter 1.     Description complexity

Complexity measured by code length.
Complexity of integers (see the sketch below).
Conditional Complexity.
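To give a taste of this first chapter, here is a minimal Python sketch (an illustration under my own conventions, not taken from the course): the plain binary code for an integer n ≥ 1 takes floor(log2 n) + 1 bits, and the description complexity of a typical (incompressible) integer is close to this length, up to an additive constant.

    def integer_code_length(n: int) -> int:
        """Bits in the plain binary code for n >= 1: floor(log2(n)) + 1."""
        return n.bit_length()

    for n in (5, 1000, 2**20):
        print(n, integer_code_length(n))   # 5 -> 3, 1000 -> 10, 2**20 -> 21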
Chapter 2.     Measuring Information through compression

Compressibility.
Language recognition through compression (see the sketch below).
Huffman codes - Complexity and frequency.
Zipf’s law.
"Google" distance - Meaning distance.
Chapter 3.     Algorithmic information applied to mathematics 

Incomputability of C.
Algorithmic probability - Algorithmic Information (see the formulas below).
Randomness.
Gödel’s theorem revisited.
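For reference, the textbook definitions behind algorithmic probability (standard notation, not reproduced from the course slides): on a prefix universal machine U, the algorithmic probability of a string x sums the weights of all programs p that output x, and Levin's coding theorem ties it to prefix complexity K, which agrees with the plain complexity C used in this course up to logarithmic terms. In LaTeX:

    m(x) = \sum_{p \,:\, U(p) = x} 2^{-|p|}   % algorithmic probability of x
    K(x) = -\log_2 m(x) + O(1)                % Levin's coding theorem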
Chapter 4.     Machine Learning and Algorithmic Information

Induction - Minimum Description Length (MDL); see the sketch below.
Analogy as complexity minimization.
Machine Learning and compression.
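A hedged sketch of the MDL principle (my own toy setting, not the course's implementation): to model a binary sequence with a Bernoulli parameter p, a two-part code pays some precision bits to state p, plus the Shannon code length of the data given p; the total is minimized at a moderate precision.

    import math

    def data_bits(seq: str, p: float) -> float:
        """Shannon code length of seq under a Bernoulli(p) model."""
        return -sum(math.log2(p) if c == "1" else math.log2(1.0 - p) for c in seq)

    def mdl_score(seq: str, precision_bits: int) -> float:
        """Two-part MDL code length: precision_bits to state p as k/2^precision,
        plus the data bits under the best such p."""
        levels = 2 ** precision_bits
        best_fit = min(data_bits(seq, k / levels) for k in range(1, levels))
        return precision_bits + best_fit

    seq = "1110111011111011"       # 13 ones out of 16
    for b in (1, 3, 6, 10):        # finer parameter grids cost more bits to state
        print(b, round(mdl_score(seq, b), 1))   # total is lowest at moderate precision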
Chapter 5.     Subjective information and simplicity

Cognitive complexity.
Simplicity and coincidences.
Subjective probability & subjective information (see the formulas below).
Relevance.
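As a pointer to the formalism of this last chapter (the definitions come from Dessalles' published simplicity theory; the notation here may differ from the slides): a situation s is unexpected when it is simpler to describe than to generate, and its subjective probability decreases exponentially with unexpectedness. In LaTeX:

    U(s) = C_w(s) - C(s)    % unexpectedness: generation complexity minus description complexity
    p(s) = 2^{-U(s)}        % subjective (ex post) probability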






Slides

Chapter 1 (coding & description complexity)
Chapter 2 (complexity, frequency & compression)
Chapter 3 (complexity & maths)
Chapter 4 (complexity & machine learning)
Chapter 5 (complexity, AI & cognition)






Validation

  1. Answers to questions during the lab sessions are recorded and evaluated.
  2. Students will have to answer a short quiz on the last day.
  3. Students will make a small original contribution (typically, as a continuation of a lab work question) and present it orally.
    This micro-study should emphasize the link with Kolmogorov complexity and Algorithmic Information. You are expected to choose a topic related to K-complexity and to do something for this project (typically, write a small program).


Very soon:

    

Before the end of the course:


Your small report will describe your project and what you found (typically 3 or 4 pages, more if there are images). Don’t forget to include:

All contributions that pass will be grouped together into a document made accessible to all.





