Subject description - B0B36JUL

B0B36JUL Julia for optimization and learning
Roles:
Extent of teaching: 1P+3C
Department: 13136
Language of teaching: CS
Guarantors: Adam L.
Completion: KZ
Lecturers: Adam L., Mácha V.
Credits: 4
Tutors: Adam L., Mácha V., Mašková M.
Semester: Z

Web page:

https://juliateachingctu.github.io/Julia-for-Optimization-and-Learning/stable/

Annotation:

The Julia programming language is increasingly recognized for its suitability for numerical computing. The course consists of two parts. The first part presents the basics of Julia. The second part introduces mathematical optimization and its applications in machine learning, statistics and the optimal control of differential equations. While the first part shows the individual concepts of Julia separately, the second part combines them into longer logical sections of code. We explain each application theoretically. Students are encouraged both to write simple functions themselves and to compare them with existing packages. The course ends with a final project: students can either choose a topic connected to their theses or join a Kaggle competition with real data. This course is also part of the inter-university programme prg.ai Minor, which pools the best of AI education in Prague to provide students with a deeper and broader insight into the field of artificial intelligence. More information is available at https://prg.ai/minor.

Study targets:

The course aims to teach students to program in Julia. At the same time, we will show the use of Julia in applied fields.

Content:

The course consists of two parts. The first part (7 lectures) presents the Julia language, shows its basics, and compares it with commonly used languages. While the first part shows the individual concepts separately, the second part (6 lectures) combines them into longer logical sections of code. The examples are based on the theory of optimization, whose use we show in machine learning, statistics and the optimal control of differential equations. We explain each application theoretically. Students are encouraged both to write simple functions themselves and to compare them with existing packages. We hope that this approach leads to the ability to work with existing packages and to a better understanding of how these packages work. Although the second part is focused on applications, it uses the vast majority of the concepts discussed in the first part, so students practice Julia on interesting applications. Emphasis is also placed on working with environments. The second part also tries to show the connections between the applications; we think this connection is important for students. Although the topics of the second part are taught in many other subjects, we are convinced that in many cases we show an extension that is not covered in those lectures, and we believe that our course is a suitable supplement to many of these subjects.

The first lecture is motivational. It shows the knowledge expected after successful completion of the course, presents the advantages and disadvantages of Julia, and compares it with other programming languages such as Python, Matlab, R and C. The following six lectures present Julia. The first three are standard and cover variables, data structures, conditions, loops, functions and methods. The fourth lecture shows useful packages, while the last two cover characteristics specific to Julia, such as its type system, generic programming and environment management.

The last six lectures are devoted to applications. The first shows optimization problems, including the basic gradient methods for optimization. The next three present linear regression, logistic regression and deep learning as optimization problems; in addition to training, these lectures cover data preparation, distribution and visualization. The fifth lecture is devoted to statistics and the sixth to differential equations and optimization in the optimal control of differential equations.

Course outlines:

8. Introduction to optimization. Optimality conditions for optimization with and without constraints. (A minimal gradient-descent sketch in Julia follows this list.)
9. Introduction to linear regression and classification problems. Closed-form solutions and iterative methods.
10. Introduction to neural networks. Types of neural networks. Individual layers. Overfitting.
11. Convolutional layers, neural network structure, stochastic gradient descent.
12. Selected parts of statistics. Regularized linear regression. Spectral decomposition. Monte Carlo sampling. Hypothesis testing. Generalized linear models.
13. Introduction to ordinary differential equations. Relation with spectral decomposition. Optimal control as a combination of ODEs and optimization.
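
To illustrate the basic gradient methods mentioned in the content above, the following is a minimal gradient-descent sketch in Julia with a fixed step length; the function names and parameters are illustrative assumptions, not the course implementation.

    # Minimal gradient descent with a fixed step length (illustrative only).
    # `grad` returns the gradient of the objective at `x`.
    function gradient_descent(grad, x; step = 0.1, maxiter = 100)
        for _ in 1:maxiter
            x = x - step * grad(x)   # move against the gradient
        end
        return x
    end

    # Example: minimize f(x) = x[1]^2 + 2*x[2]^2 with gradient (2*x[1], 4*x[2]).
    grad(x) = [2x[1], 4x[2]]
    x_min = gradient_descent(grad, [1.0, 1.0])   # approaches [0.0, 0.0]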

Exercises outline:

1. Introduction to Julia, advantages and disadvantages compared to Matlab, Python and R. Declarations of variables, basic numerical types, numerical and logical operators. Text strings.
2. Vectors, matrices and multidimensional arrays: basic properties, indexing, application of functions by elements. Other data structures and their properties: Tuple, NamedTuple, Dict.
3. If-elseif-else statements and ternary operator. Simple and nested for/while loops. Basic use of iterators and generators. Overview of the most used iterators: zip, eachcol, eachrow, enumerate. Scopes in loops.
4. Function declarations; mandatory, positional and keyword arguments. Functions as a set of methods. Calling methods according to the number and types of arguments (multiple dispatch). Local and global scope. (See the multiple-dispatch sketch after this list.)
5. Overview of the most used standard libraries: LinearAlgebra, Statistics, Random. Useful packages: Plots.jl for creating graphs, DataFrames.jl for working with tabular data. Packages for interaction with other languages: PyCall, RCall.
6. Composite types and creating a logical hierarchy using abstract types. Parametric types. Inner and outer constructors. Definition of specialized methods for composite and abstract types. (See the type-hierarchy sketch after this list.)
7. Organization of code into scripts and modules. Function import and extending existing code. Organization of installed packages into separate environments. Julia package creation.
8. Gradient visualization. Gradient descent method and step length selection. Projected gradients.
9. Data preparation and visualization. Linear and logistic regression training. Predictions. (See the regression sketch after this list.)
10. Data preparation. Implementation of a simple neural network, including gradient calculation and training.
11. Creating neural networks using the Flux.jl package. Working on GPU.
12. Implementation of LASSO and ridge regression using iterative optimization methods. Analysis of the number of samples required for good approximation quality. The Distributions.jl, HypothesisTests.jl, and GLM.jl packages.
13. Hand-written implementation of the wave equation. Package ODE.jl.
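
As a supplement to exercise 4, the following minimal sketch illustrates multiple dispatch in Julia; the function name describe is a hypothetical example, not course code.

    # One function, several methods; the call is dispatched on the argument type.
    describe(x::Integer) = "an integer"
    describe(x::AbstractFloat) = "a floating-point number"
    describe(x::AbstractString) = "a string"
    describe(x) = "something else"    # fallback method for all other types

    describe(1)        # "an integer"
    describe(1.0)      # "a floating-point number"
    describe("one")    # "a string"
    describe([1, 2])   # "something else"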
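
Relating to exercise 6, here is a minimal sketch of a type hierarchy with abstract and parametric types and specialized methods; the type names are illustrative assumptions only.

    abstract type Shape end                # abstract root of the hierarchy

    struct Circle{T<:Real} <: Shape        # parametric concrete type
        radius::T
    end

    struct Rectangle{T<:Real} <: Shape
        width::T
        height::T
    end

    area(c::Circle) = π * c.radius^2       # specialized methods per type
    area(r::Rectangle) = r.width * r.height

    area(Circle(1.0))       # ≈ 3.1416
    area(Rectangle(2, 3))   # 6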
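
For exercises 9 and 12, a minimal sketch of the closed-form least-squares and ridge-regression solutions on synthetic data, assuming only the standard LinearAlgebra library; it is not the course implementation, which also covers iterative methods.

    using LinearAlgebra

    X = randn(100, 3)                             # synthetic design matrix
    y = X * [1.0, -2.0, 0.5] + 0.1 * randn(100)   # noisy targets

    w_ols = X \ y                                 # least-squares solution
    λ = 0.1
    w_ridge = (X' * X + λ * I) \ (X' * y)         # ridge: (XᵀX + λI) w = Xᵀ y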

Literature:

1. Online lecture notes. https://bit.ly/JuliaML
2. Julia documentation. https://docs.julialang.org/en/v1/manual/documentation/index.html
3. Kochenderfer, M. J. and Wheeler, T. A. Algorithms for optimization. MIT Press, 2019. https://algorithmsbook.com/optimization/
4. Lauwens, B. and Downey, A. B. Think Julia: how to think like a computer scientist. O'Reilly Media, 2019. https://benlauwens.github.io/ThinkJulia.jl/latest/book.html
5. Goodfellow, I., Bengio, Y. and Courville, A. Deep Learning. MIT Press, 2016. https://www.deeplearningbook.org/

Requirements:

The course assumes only knowledge of basic programming and linear algebra.

Keywords:

Julia, programming, optimization, machine learning.

The subject is included in these academic programs:

Program Branch Role Recommended semester

