Student Mathematical Seminar - FJFI/ČVUT



Time and place: Tuesdays at 2 p.m. in T-115, Trojanova 13

The seminar is open to everyone, regular and occasional attendees alike. The course can be registered as SMS1 (winter semester) or SMS2 (summer semester). To obtain credit, you must give one talk and attend a reasonable share of the other talks. Talks are in Czech or English.

Seminar program:



Winter term 2023/24:


7. 11. 2023: Daniel Khol (FJFI ČVUT Praha)
Title: t.b.a.
Abstract: t.b.a.

31. 10. 2023: Věra Kůrková (ICS CAS)
Title: Approximation of classifiers of large data sets by deep ReLU networks
Abstract: Rapid development of experimental research and successful practical applications of deep networks inspire many theoretical questions. In this talk, we will focus on the approximation capabilities of deep ReLU networks, one of the most popular network architectures. We will explore the effect of network depth and number of parameters on the behavior of approximation errors. To obtain probabilistic bounds on approximation errors, we will employ concepts from statistical learning theory (growth function, VC dimension) and high-dimensional geometry (concentration of measure). We will address the dilemma between approximation accuracy and consistency in learning from random samples of data, and discuss the limitations of the approximation capabilities of networks of finite VC dimension in distribution-agnostic settings.
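
As a small side illustration (not part of the abstract) of how depth affects the expressive power of ReLU networks, in the spirit of Telgarsky's depth-separation construction: composing a two-ReLU "tent" layer k times yields a sawtooth with 2^k linear pieces, which a shallow network would need exponentially many units to match. A minimal NumPy sketch:

```python
import numpy as np

def tent(x):
    """One ReLU layer computing the tent map: 2*relu(x) - 4*relu(x - 0.5)."""
    relu = lambda z: np.maximum(z, 0.0)
    return 2 * relu(x) - 4 * relu(x - 0.5)

def deep_sawtooth(x, depth):
    """Compose the tent layer `depth` times; the result has 2**depth linear pieces."""
    for _ in range(depth):
        x = tent(x)
    return x

xs = np.linspace(0.0, 1.0, 9)
print(deep_sawtooth(xs, 3))  # sawtooth oscillating between 0 and 1 on [0, 1]
```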

24. 10. 2023: Pavel Jakš (FJFI ČVUT Praha)
Title: Transformers
Abstract: We will introduce transformers, a neural-network architecture built around the so-called attention mechanism, which in turn rests on representing words as vectors.
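
A minimal NumPy sketch (our illustration, not the speaker's material) of the scaled dot-product attention step at the core of the transformer; the weight matrices Wq, Wk, Wv and all dimensions below are arbitrary toy choices:

```python
import numpy as np

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)  # shift for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def attention(Q, K, V):
    """Scaled dot-product attention: softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)   # pairwise similarities between tokens
    return softmax(scores, axis=-1) @ V

rng = np.random.default_rng(0)
X = rng.normal(size=(5, 8))           # 5 tokens, embedding dimension 8
Wq, Wk, Wv = (rng.normal(size=(8, 8)) for _ in range(3))
out = attention(X @ Wq, X @ Wk, X @ Wv)
print(out.shape)                      # (5, 8): one updated vector per token
```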

17. 10. 2023: David Rendl (FJFI ČVUT Praha)
Title: Deconvolution and image reconstruction
Abstract: The talk will be about deconvolution and image reconstruction. We will describe how images are represented and how blur is modelled, and we will derive several deconvolution algorithms, including the Richardson-Lucy algorithm in two regularized versions. We will also present some tricks for reducing the computational cost. Everything will be accompanied by practical demonstrations. Time permitting, we will show methods for measuring sharpness and for evaluating the results of image reconstruction.
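
For reference, a minimal 1D sketch of the plain (unregularized) Richardson-Lucy update; the regularized variants and speed-up tricks of the talk go beyond this:

```python
import numpy as np

def richardson_lucy(observed, psf, iterations=30, eps=1e-12):
    """Plain Richardson-Lucy deconvolution in 1D.

    Multiplicative update: x <- x * conv(observed / conv(x, psf), psf_flipped),
    where psf is assumed normalized to sum 1.
    """
    estimate = np.full_like(observed, observed.mean())
    psf_flipped = psf[::-1]
    for _ in range(iterations):
        blurred = np.convolve(estimate, psf, mode="same")
        ratio = observed / (blurred + eps)          # eps avoids division by zero
        estimate *= np.convolve(ratio, psf_flipped, mode="same")
    return estimate

# blur a sharp spike with a box PSF, then try to recover its position
signal = np.zeros(64); signal[30] = 1.0
psf = np.ones(5) / 5.0
observed = np.convolve(signal, psf, mode="same")
print(np.argmax(richardson_lucy(observed, psf)))   # should print 30
```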

10. 10. 2023: Václav Klika (FJFI ČVUT Praha)
Title: t.b.a.
Abstract: t.b.a.

3. 10. 2023: Jan Volec (FJFI ČVUT Praha)
Title: Turán-type problems
Abstract: We will present Turán's theorem, a fundamental result of extremal graph theory, together with several diverse techniques for proving it. We will then present a natural generalization of Turán's theorem to higher dimensions and formulate the so-called hypergraph Turán conjecture, one of the most important open problems in extremal combinatorics. For the smallest instance of the hypergraph Turán conjecture (what is the maximum number of triples on a ground set of n points such that the chosen system of triples contains no tetrahedron?), we will mention the best known upper bound and sketch the main idea of its proof.
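
The base case of Turán's theorem is Mantel's theorem: a triangle-free graph on n vertices has at most n^2/4 edges. A tiny brute-force check of this bound for small n (our illustration):

```python
from itertools import combinations

def max_triangle_free_edges(n):
    """Brute-force ex(n, K3): max edges of a triangle-free graph on n vertices."""
    edges = list(combinations(range(n), 2))
    best = 0
    for mask in range(1 << len(edges)):
        chosen = {e for i, e in enumerate(edges) if mask >> i & 1}
        if any((a, b) in chosen and (a, c) in chosen and (b, c) in chosen
               for a, b, c in combinations(range(n), 3)):
            continue  # contains a triangle
        best = max(best, len(chosen))
    return best

for n in range(3, 6):
    print(n, max_triangle_free_edges(n), n * n // 4)  # matches Mantel's bound
```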

Summer term 2022/23:

9. 5. 2023 (Tuesday): Stanislav Hencl (MFF UK, Prague)
Title: Models of nonlinear elasticity: Questions and progress
Abstract: The lecture is an introduction to nonlinear elasticity, its mathematical formulation, and its basic results and open questions.

27. 4. 2023 (Thursday): Jan Haskovec (KAUST, Saudi Arabia)
Title: Functional Differential Equations in Models of Collective Behavior
Abstract: I will give an overview of recent results for models of collective behavior governed by functional differential equations. The talk will focus on models of interacting agents with applications in biology (flocking, swarming), social sciences (opinion formation) and engineering (swarm robotics), where latency (delay) plays a significant role. I will explain that there are two main sources of delay - inter-agent communications and information processing - and show that they have qualitatively different impacts on the group dynamics. I will give an overview of analytical methods for studying the asymptotic behavior of the models and their mean-field limits. Finally, motivated by situations where the finite speed of information propagation is significant, I will introduce an interesting class of problems where the delay depends nontrivially and nonlinearly on the state of the system, and discuss the available analytical results and open problems.
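
As a toy illustration of the processing-delay case (our sketch, not a model from the talk): in the linear consensus system x_i'(t) = xbar(t - tau) - x_i(t - tau), agents reach consensus for small tau but lose it once tau exceeds pi/2, the classical stability threshold of x'(t) = -x(t - tau):

```python
import numpy as np

def delayed_consensus(x0, tau, T=20.0, dt=0.01):
    """Euler scheme for x_i'(t) = mean(x(t - tau)) - x_i(t - tau)."""
    n_delay = max(int(round(tau / dt)), 0)
    xs = [np.array(x0, dtype=float)] * (n_delay + 1)  # constant history on [-tau, 0]
    for _ in range(int(T / dt)):
        x_past = xs[-1 - n_delay]
        xs.append(xs[-1] + dt * (x_past.mean() - x_past))
    return np.array(xs)

x0 = [0.0, 1.0, 4.0, -2.0]
for tau in (0.0, 1.0, 2.0):   # consensus is lost once tau exceeds pi/2
    traj = delayed_consensus(x0, tau)
    print(tau, "final spread:", np.ptp(traj[-1]).round(4))
```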

24. 4. 2023: Martin Kunz (FJFI ČVUT)
Title: The Tale of a Boundary-Layer Problem
Abstract: I will demonstrate the solution with all its steps (and missteps) to the boundary layer problem eps y'' = yy' - y, such that y(0) = 1 and y(1) = -1 and where eps <<1. The journey towards the solution will take us on an excurse through some basics of perturbation theory and asymptotic approximation matching. I will demonstrate visualizations that will give you insight into the reasoning behind the steps taken towards the solution serving as an exhibition for the leverage in problem solving accessible to us through the computational power at our fingertips.
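
For moderate eps the problem can also be attacked numerically; a hedged sketch using scipy.integrate.solve_bvp (our illustration: EPS = 0.1 is chosen so that the solver converges from a straight-line initial guess, while genuinely small eps would need mesh refinement or continuation in eps):

```python
import numpy as np
from scipy.integrate import solve_bvp

EPS = 0.1  # not yet eps << 1, but enough for a layer to start forming

def rhs(x, y):
    # first-order system for eps*y'' = y*y' - y, with y[0] = y, y[1] = y'
    return np.vstack([y[1], (y[0] * y[1] - y[0]) / EPS])

def bc(ya, yb):
    return np.array([ya[0] - 1.0, yb[0] + 1.0])  # y(0) = 1, y(1) = -1

x = np.linspace(0.0, 1.0, 101)
y_guess = np.vstack([1.0 - 2.0 * x, -2.0 * np.ones_like(x)])  # straight-line guess
sol = solve_bvp(rhs, bc, x, y_guess)
print(sol.status, sol.sol(np.array([0.0, 0.5, 1.0]))[0])  # status 0 = converged
```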

17. 4. 2023: David Sychrovský (MFF UK)
Title: t.b.a.
Abstract: t.b.a.

3. 4. 2023: Aleksej Gaj (FJFI ČVUT)
Title: Quantum decision making: an introduction
Abstract: Decision making (DM) is one of the basic disciplines in contemporary AI and ML. DM is a purposeful choice among several alternatives based on the available information. Classical subjective expected utility theory (Savage, 1954) serves as the main paradigm currently in use. Confrontation with the human style of thinking (see the Ellsberg and Allais paradoxes) has stimulated a new line of research towards a quantum version of DM (see Yukalov and Sornette, 2010). In the talk we will discuss how the apparatus of quantum mechanics can be used to modify and improve classical decision theory.

27. 3. 2023: Erin Carson (MFF UK)
Title: Using Mixed Precision in Numerical Linear Algebra
Abstract: Support for floating point arithmetic in multiple precisions is becoming increasingly common in emerging architectures. Mixed precision capabilities are already included in many machines on the TOP500 list and will be a crucial hardware feature in exascale machines. From a computational scientist's perspective, our goal is to determine how and where we can exploit mixed precision computation in our codes. This requires an understanding both of performance characteristics and of the numerical behavior of algorithms in finite precision arithmetic.

After giving an introduction to floating point computation, mixed precision hardware, and current work in mixed precision numerical linear algebra, we present examples that demonstrate what can go wrong if we use low precision blindly. This motivates the need for rigorous rounding error analysis in algorithms used in scientific computing and data science applications.

Understanding the behavior of algorithms in finite precision is necessary not only for illuminating potential dangers, but also for revealing opportunities. As an example of where rounding error analysis can lead to new insights and improved algorithms, we present a general algorithm for solving linear systems based on mixed-precision iterative refinement. From this, we develop a mixed-precision GMRES-based iterative refinement scheme that works for even ill-conditioned systems. We then present recent extensions of this theoretical analysis to least squares problems and practical settings in which approximate and randomized preconditioners are used.
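
A minimal sketch of the textbook pattern behind iterative refinement, assuming the simplest setting (LU factorization in single precision, residuals and updates in double); the GMRES-based scheme of the talk instead solves the correction equation by GMRES preconditioned with the low-precision LU factors. The function name ir_mixed and the random test data are our own toy choices:

```python
import numpy as np
from scipy.linalg import lu_factor, lu_solve

def ir_mixed(A, b, iters=10):
    """Iterative refinement: factorize once in float32,
    compute residuals and accumulate the solution in float64."""
    lu, piv = lu_factor(A.astype(np.float32))            # cheap low-precision LU
    x = lu_solve((lu, piv), b.astype(np.float32)).astype(np.float64)
    for _ in range(iters):
        r = b - A @ x                                    # residual in float64
        d = lu_solve((lu, piv), r.astype(np.float32))    # correction via the f32 LU
        x += d.astype(np.float64)
    return x

rng = np.random.default_rng(1)
A = rng.normal(size=(200, 200)); x_true = rng.normal(size=200); b = A @ x_true
print(np.linalg.norm(ir_mixed(A, b) - x_true) / np.linalg.norm(x_true))
```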

20. 3. 2023: Pavel Jakš (FJFI ČVUT)
Title: Hypercubes and the Sensitivity Conjecture

13. 3. 2023: Magdalena Prorok (AGH University of Science and Technology)
Title: Directed graphs without rainbow triangles
Abstract: One of the most fundamental results in graph theory is Mantel's theorem, which determines the maximum number of edges in a triangle-free graph of order n. Recently a colourful variant of this problem was solved. In this variant we consider k graphs on a common vertex set, thinking of each graph as edges in a distinct colour, and want to determine the smallest number of edges in each colour which guarantees the existence of a rainbow triangle. In this talk we solve the analogous problem for directed graphs without rainbow triangles, either directed or transitive, for any number of colours. The constructions and proofs differ essentially for k = 3 and k >= 4 and for the type of the forbidden triangle.
This is joint work with Sebastian Babinski and Andrzej Grzesik.
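
As a warm-up in the undirected setting (our toy sketch; the talk itself concerns directed graphs), deciding whether three colour classes on a common vertex set contain a rainbow triangle is straightforward:

```python
from itertools import combinations, permutations

def has_rainbow_triangle(G1, G2, G3):
    """Undirected toy version: do three graphs on a common vertex set contain
    a triangle using exactly one edge from each graph (colour)?"""
    vertices = {v for G in (G1, G2, G3) for e in G for v in e}
    for a, b, c in combinations(sorted(vertices), 3):
        sides = [frozenset(s) for s in ((a, b), (b, c), (a, c))]
        if any(all(s in G for s, G in zip(sides, perm))
               for perm in permutations((G1, G2, G3))):
            return True
    return False

as_graph = lambda edges: {frozenset(e) for e in edges}
G1, G2, G3 = as_graph([(0, 1)]), as_graph([(1, 2)]), as_graph([(0, 2)])
print(has_rainbow_triangle(G1, G2, G3))  # True: triangle 0-1-2 is rainbow
```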

6. 3. 2023: Adam Janich (FJFI ČVUT)
Title: word2vec
Abstract: The word2vec method will be presented.

27. 2. 2023: RNDr. Věra Kůrková, DrSc. (ÚI AV)
Title: Some implications of high-dimensional geometry for classification by neural networks
Abstract: Computational difficulties of multidimensional tasks, called the "curse of dimensionality", have long been known. On the other hand, the almost deterministic behavior of some randomized models and algorithms depending on large numbers of variables can be attributed to the "blessing of dimensionality". These phenomena can be explained by rather counter-intuitive properties of the geometry of high-dimensional spaces. They imply concentration of values of sufficiently smooth functions of many variables around their mean values, and the possibility of reducing data dimensionality by random projections. In the lecture, it will be shown how these properties of high-dimensional geometry can be employed to obtain insights into the suitability of various types of neural networks for classification of large data sets. Probabilistic bounds on network complexity will be derived using concentration properties of approximation errors based on the Azuma and McDiarmid inequalities. Consequences for the choice of network architectures will be analyzed in terms of growth functions and VC dimensions of sets of network input-output functions. The general results will be illustrated by examples of deep perceptron networks with various piecewise polynomial activation functions (ReLU, RePU).
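
A quick NumPy illustration (ours, not from the lecture) of one of the phenomena mentioned, dimensionality reduction by random projections: a Gaussian projection to a few hundred dimensions approximately preserves pairwise distances, in the spirit of the Johnson-Lindenstrauss lemma:

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, k = 200, 10_000, 500                 # points, original dim, projected dim
X = rng.normal(size=(n, d))

P = rng.normal(size=(d, k)) / np.sqrt(k)   # Gaussian random projection
Y = X @ P

# compare a few pairwise distances before and after projection
for i, j in [(0, 1), (2, 50), (7, 130), (60, 61), (120, 199)]:
    before = np.linalg.norm(X[i] - X[j])
    after = np.linalg.norm(Y[i] - Y[j])
    print(f"ratio after/before: {after / before:.3f}")   # all close to 1
```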

20. 2. 2023: Jan Vybíral (FJFI ČVUT)
Title: A multivariate Riesz basis of ReLU neural networks
Abstract: We consider the trigonometric-like system of piecewise linear functions introduced recently by Daubechies, DeVore, Foucart, Hanin, and Petrova. We provide an alternative proof, based on the Gershgorin theorem, that this system forms a Riesz basis of L2([0,1]). We also generalize this system to higher dimensions d>1 by a construction which avoids using (tensor) products. As a consequence, the functions from the new Riesz basis of L2([0,1]^d) can be easily represented by neural networks. Moreover, the Riesz constants of this system are independent of d, making it an attractive building block for future multivariate analysis of neural networks.
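
As a much simpler warm-up than the basis in the talk (our sketch): a hat function, the basic building block of piecewise linear approximation on [0,1], is realized exactly by a network with three ReLU units:

```python
import numpy as np

relu = lambda x: np.maximum(x, 0.0)

def hat(x, a, m, b):
    """Hat function supported on [a, b] with peak 1 at m, built from three ReLUs."""
    return (relu(x - a) / (m - a)
            - relu(x - m) * (b - a) / ((m - a) * (b - m))
            + relu(x - b) / (b - m))

xs = np.linspace(0, 1, 11)
print(hat(xs, 0.25, 0.5, 0.75).round(3))  # 0 outside [0.25, 0.75], peak 1 at 0.5
```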




Proposed topics (so far):


Honza Volec:
Decades-Old Computer Science Conjecture Solved in Two Pages
A 53-Year-Old Network Coloring Conjecture Is Disproved
Google Researcher, Long Out of Math, Cracks Devilish Problem About Sets


Honza Vybíral:
Kernel Principal Component Analysis
word2vec
Spherical codes and Borsuk’s conjecture
Optimal asymptotic bounds for spherical designs
Approximation of infinitely differentiable multivariate functions is intractable


Vašek Klika:
Surprises in a Classic Boundary-Layer Problem
Deep Learning: An Introduction for Applied Mathematicians
An Algorithmic Introduction to Numerical Simulation of Stochastic Differential Equations
Period Three Implies Chaos
The chemical basis of morphogenesis