
CS 5713 – Computational Learning Theory (Fall 2023)

Table of Contents

Course Description

Topics in machine learning theory. Learning using membership queries, equivalence queries, version spaces, linear models, decision trees. Probably approximately correct (PAC) learning, Occam algorithms, VC-dimension, sample sizes for distribution-independent learning. Representation issues, proper learning, reductions, intractability, learning in the realizable case, agnostic learning. We explore topics under the broader umbrella of trustworthy machine learning, such as noise models, statistical queries, and adversarially robust learning against poisoning attacks and against adversarial examples. In addition, we expand upon interpretability and explainability aspects, as well as upon fairness concerns. Other topics include distribution-specific learning and evolvability, online learning and learning with expert advice in the mistake bound model, and weak and strong learning (boosting).

[Course Description] [Table of Contents] [Top]

Basic Information

Syllabus

The syllabus is available here.

Time and Location

Tuesdays and Thursdays, 10:30am – 11:45am, Carson Engineering Center 0119.

Contact Information

Please see here.

Office Hours

I will be holding my office hours at the following times.

Tuesdays
3:00pm – 4:00pm, 230 Devon Energy Hall
Thursdays
2:00pm – 3:00pm, 230 Devon Energy Hall
Fridays
10:45am – 11:45am, 230 Devon Energy Hall

Please note that while anyone is welcome during my (Dimitris Diochnos) office hours, students from CS 5713 will have precedence on Fridays, while students from CS 3823 will have precedence on Tuesdays and Thursdays.

Exceptions to the Regular Schedule of Office Hours

If you want to meet me outside of my office hours, please send me an email to arrange an appointment.

[Basic Information] [Table of Contents] [Top]

Homework Assignments

Assignment 1: Announced on Thursday, August 31 (week 2). Due on Thursday, September 14 (week 4).

Assignment 2: Announced on Tuesday, September 19 (week 5). Due on Wednesday, September 27 (week 6).

Assignment 3: Announced on Wednesday, September 27 (week 6). Due on Friday, October 6 (week 7).

[Homework Assignments] [Table of Contents] [Top]

Course Projects

Please have a look on Canvas for the project description.

Candidate Material

Please have a look here for some candidate papers.

[Course Projects] [Table of Contents] [Top]

Computational Learning Theory Resources

Books

The following books are very relevant to our course and are also available for free in electronic format at the following links:

Personal Notes

Papers

Assigned Readings
Optional Readings

[Computational Learning Theory Resources] [Table of Contents] [Top]

Class Log

A log for the class is kept online here.

Week 1

Class 1 (Aug 22, 2023)

Discussion on syllabus and policies.

How this class is changing from previous years.

Discussion of the coupon collector's problem. We derived the expected running time of the process of collecting all $N$ coupons.
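For reference, here is a compact version of the standard expectation calculation (the in-class derivation may have been presented differently). The waiting time for the $i$-th new coupon is geometric with success probability $(N-i+1)/N$, so

\[ \mathbb{E}[T] \;=\; \sum_{i=1}^{N} \frac{N}{N-i+1} \;=\; N \sum_{k=1}^{N} \frac{1}{k} \;=\; N H_N \;=\; \Theta(N \log N). \]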

Mentioned the existence of the following handouts:

Week 2

Discussion on basic terminology and enumeration of simple classes of functions.
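As a reminder of the kind of counting involved, for concepts over $n$ Boolean variables the standard counts are as follows (the exact classes enumerated in class may have differed):

\[ |\{\text{all Boolean functions}\}| = 2^{2^n}, \qquad |\{\text{conjunctions of literals}\}| = 3^n, \qquad |\{\text{monotone conjunctions}\}| = 2^n. \]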

Begin discussion on concept learning and version spaces.

Week 3

Conclusion of concept learning and version spaces.

Beginning of decision trees.

Week 4

Conclusion of decision trees.
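For reference, the standard quantities used when growing a decision tree top-down are the entropy of a labeled sample and the information gain of an attribute (notation here may differ from the in-class presentation):

\[ H(S) = -\sum_{c} p_c \log_2 p_c, \qquad \mathrm{Gain}(S, A) = H(S) - \sum_{v \in \mathrm{Values}(A)} \frac{|S_v|}{|S|}\, H(S_v). \]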

Linear models for classification using perceptrons.
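A minimal sketch of the classic perceptron update rule for labels in $\{-1,+1\}$ (illustrative only; not necessarily the exact variant presented in class):

    import numpy as np

    def perceptron(X, y, epochs=100):
        """Classic perceptron: update the weights on every mistake."""
        w = np.zeros(X.shape[1])
        b = 0.0
        for _ in range(epochs):
            mistakes = 0
            for x_i, y_i in zip(X, y):
                # A point counts as a mistake if it is misclassified (or on the boundary).
                if y_i * (np.dot(w, x_i) + b) <= 0:
                    w += y_i * x_i
                    b += y_i
                    mistakes += 1
            if mistakes == 0:
                break  # data separated: no further updates occur
        return w, b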

Week 5

Introduction to PAC learning - definitions.

Assigned Reading: Started Chapter 2 from the book Foundations of Machine Learning. Equivalently, from the book Understanding Machine Learning, we started discussing the content of Chapters 2 and 3.

Axis-aligned rectangles, learning conjunctions.

Assigned Reading: This is Example 2.4 from the book Foundations of Machine Learning. Equivalently, this is the first example in Chapter 1 in the Kearns-Vazirani book.
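A minimal sketch of the tightest-fit learner from the axis-aligned rectangles example, assuming points in the plane labeled $+1$ (inside the target rectangle) or $-1$ (outside); the names here are illustrative:

    def tightest_rectangle(points, labels):
        """Smallest axis-aligned rectangle containing all positive examples.

        points: list of (x, y) pairs; labels: list of +1/-1 values.
        Returns (x_lo, x_hi, y_lo, y_hi), or None if there are no positive
        examples (in which case the learner predicts negative everywhere).
        """
        positives = [p for p, lab in zip(points, labels) if lab == +1]
        if not positives:
            return None
        xs = [x for x, _ in positives]
        ys = [y for _, y in positives]
        return (min(xs), max(xs), min(ys), max(ys))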

Theorem for learning in the realizable case using a finite hypothesis space.
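For reference, the bound in question (this is Theorem 2.5 in Foundations of Machine Learning): a learner that outputs a hypothesis from a finite class $H$ consistent with the sample achieves error at most $\varepsilon$ with probability at least $1-\delta$ whenever

\[ m \;\ge\; \frac{1}{\varepsilon}\left(\ln|H| + \ln\frac{1}{\delta}\right). \]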

Efficient PAC learnability of conjunctions (using Find-S).

Assigned Reading: Section 1.3 in the Kearns-Vazirani book.
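A minimal sketch of Find-S for conjunctions of Boolean literals, roughly in the spirit of Section 1.3 of the Kearns-Vazirani book (the encoding is illustrative; the in-class version may differ):

    def find_s(examples, n):
        """Find-S for conjunctions over n Boolean variables.

        examples: list of (x, label) pairs, where x is a tuple of n bits (0/1)
        and label is True for positive, False for negative examples.
        Start from the conjunction of all 2n literals and delete every literal
        falsified by some positive example; negative examples are ignored.
        Returns, for each variable, the set of surviving literals ('pos'/'neg').
        """
        literals = [{'pos', 'neg'} for _ in range(n)]
        for x, label in examples:
            if not label:
                continue  # Find-S never updates on negative examples
            for i, bit in enumerate(x):
                # x_i = 1 falsifies the literal "not x_i"; x_i = 0 falsifies "x_i".
                literals[i].discard('neg' if bit == 1 else 'pos')
        # The hypothesis is the conjunction of all surviving literals;
        # variables whose set became empty are dropped entirely.
        return literals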

Week 6

Discussion on version spaces, consistent learners, Occam algorithms and generalization.

Theorem for sample complexity of PAC learning using finite hypothesis spaces (in the realizable case).

Discussed some implications of the theorem that we proved last time; e.g., we improved the sample complexity of PAC learning conjunctions without even arguing about the specific algorithm used for learning. We also saw a specific example of plugging in certain values for ε and δ.
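For instance (the numbers here are illustrative, not necessarily the ones used in class), plugging $|H| = 3^n$ for conjunctions over $n = 10$ variables, $\varepsilon = 0.1$, and $\delta = 0.05$ into the bound $m \ge \frac{1}{\varepsilon}(\ln|H| + \ln\frac{1}{\delta})$ gives

\[ m \;\ge\; \frac{1}{0.1}\left(10\ln 3 + \ln 20\right) \approx 10\,(10.99 + 3.00) \approx 140 \]

examples.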

Assigned Reading: Sections 7.1, 7.2, and 7.3 (up to, but not including, 7.3.1) from Tom Mitchell's book. This is where Mitchell introduces PAC learning and proves the theorem with the sample size for finite hypothesis spaces in the realizable case.

Assigned Reading: From the book Foundations of Machine Learning, Theorem 2.5 corresponds to the Theorem that we proved in class about finite hypothesis spaces in the realizable case and Example 2.6 has the discussion that we did in class for improving the sample size needed for PAC learning conjunctions using Find-S.

Optional Reading: Occam's Razor, by Anselm Blumer, Andrzej Ehrenfeucht, David Haussler, and Manfred K. Warmuth. This is the paper that proved this simple but very important theorem.

Set covering and finding an O(log m) approximation of the optimal cover with a greedy algorithm.

Application of the algorithm to PAC learning conjunctions with few relevant literals: we cover the negative examples (which Find-S would otherwise ignore) using the literals obtained from the solution of Find-S (which initially takes only the positive examples into account).
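A minimal sketch of the greedy set-cover heuristic in its generic form (the reduction from learning sparse conjunctions, with the negative examples as the universe and the literals of the Find-S solution as the covering sets, follows the discussion above):

    def greedy_set_cover(universe, sets):
        """Greedily pick the set covering the most uncovered elements.

        universe: a set of elements; sets: dict mapping a name to a set of elements.
        Returns the names of the chosen sets; the cover found is within an
        O(log |universe|) factor of the optimal cover, assuming one exists.
        """
        uncovered = set(universe)
        chosen = []
        while uncovered:
            best = max(sets, key=lambda name: len(sets[name] & uncovered))
            if not sets[best] & uncovered:
                raise ValueError("the given sets do not cover the universe")
            chosen.append(best)
            uncovered -= sets[best]
        return chosen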

Sep 27, 2023

Due today: Homework 2.

Assigned today: Homework 3.

[Class Log] [Table of Contents] [Top]