Winter School "Symmetry and Complexity in Mathematics" for undergraduate students, February 1 – 6, 2019

The Winter School is held jointly by the NRU HSE and SPbSU as part of their cooperation in mathematics. The aim of the school is to survey the principal trends in contemporary theoretical mathematics dealing with symmetry and complexity in a broad sense. We invite senior undergraduate students of mathematics and related fields, as well as graduates of BSc programs planning to continue their studies in fundamental mathematics.

When: February 1 – 6, 2019  
Where: HSE Faculty of Mathematics, Moscow, 6 Usacheva ul. 

How to take part in the school: to attend the school, one needs to fill out the form. The strongest applications will be selected, and their authors will be invited to take part in the school.
Deadline: if you do not need a Russian visa to attend the school, the deadline for filling out the form is January 12, 2019, 23:59. If you need a visa, the deadline is December 10, 2018, 23:59.

Financial support: a dormitory is provided for the school participants. All other expenses, including transportation and meals, are paid by the participants from their own funds. The school participants will be able to have lunch in the faculty cafeteria; the estimated price of one lunch is 120 rubles.

Schedule of the school


List of lectures and mini-courses. 

  • Yu.S. Belov (SPbSU). Time-frequency analysis.

    Time-frequency analysis is one of the modern branches of harmonic analysis; it studies time shifts and modulations in function and operator spaces. The first results in this area are due to Weyl, Wigner and von Neumann and appeared in the 1930s, when quantum mechanics was developing rapidly. In the past 20 years interest in this area of analysis has revived, owing to wide applications in information theory and signal analysis. Another reason for the revival is the intensive development of wavelet theory, which is “similar” to time-frequency analysis.

    The aim of the course is to discuss time-frequency analysis from its foundations up to modern results concerning Gabor frames and modulation spaces. Basic knowledge of analysis is desirable for understanding the course.
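
    Not part of the course materials, but a minimal NumPy sketch of the basic object, a discrete short-time Fourier transform built from time shifts and modulations of a window (the signal, window and hop size below are arbitrary choices):

        import numpy as np

        def stft(x, g, hop):
            # correlate x with time-shifted, modulated copies of the window g
            N, L = len(x), len(g)
            frames = [np.fft.fft(x[start:start + L] * g)       # modulations via FFT
                      for start in range(0, N - L + 1, hop)]    # time shifts
            return np.array(frames)                             # rows: time, columns: frequency

        t = np.linspace(0, 1, 1024, endpoint=False)
        x = np.cos(2 * np.pi * (50 * t + 100 * t ** 2))         # toy chirp signal
        spectrogram = np.abs(stft(x, np.hanning(128), hop=32)) ** 2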

  • N.A. Vavilov (SPbSU). Octonionic mathematics.

    Famously, Arnold divided all of mathematics into three parts ("celestial mechanics, hydrodynamics and cryptography"): real, complex, and quaternionic mathematics. For instance, in the language of classical groups this is expressed by the three series of orthogonal, unitary and symplectic groups. However, as Roman Mikhailov observed, “dear V.I. was somewhat mistaken in this matter. In fact, he was completely wrong. Mathematics is not divided into three parts, but into four parts.” Octonionic mathematics, the part not mentioned by Arnold, is at least as important and, in any case, possesses a much more elegant symmetry.

    During the course we will discuss the hierarchy of objects whose existence is related to the octonions, starting with small finite simple groups and the corresponding geometric objects, up to exceptional algebras, symmetric spaces, the Monster, etc. In the main part of the course we will describe both classical and recent constructions of exceptional algebras and groups of Lie type in terms of algebras, forms, combinatorial geometries and special projective varieties.
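
    The octonions themselves can be illustrated quite concretely. Below is a small self-contained Python sketch (not from the course, and using one common sign convention) of the Cayley-Dickson doubling that produces the complex numbers, quaternions and octonions from the reals, together with a brute-force check that octonion multiplication is no longer associative.

        # Cayley-Dickson doubling: a number at the next level is a pair (a, b)
        # of numbers from the previous level, with
        #   conj((a, b)) = (conj(a), -b)
        #   (a, b) * (c, d) = (a*c - conj(d)*b, d*a + b*conj(c))
        # Reals -> complex numbers -> quaternions -> octonions.

        def conj(x):
            return x if isinstance(x, float) else (conj(x[0]), neg(x[1]))

        def neg(x):
            return -x if isinstance(x, float) else (neg(x[0]), neg(x[1]))

        def add(x, y):
            return x + y if isinstance(x, float) else (add(x[0], y[0]), add(x[1], y[1]))

        def mul(x, y):
            if isinstance(x, float):
                return x * y
            (a, b), (c, d) = x, y
            return (add(mul(a, c), neg(mul(conj(d), b))),
                    add(mul(d, a), mul(b, conj(c))))

        def octonion(*c):          # 8 real coordinates -> nested pairs of depth 3
            return (((c[0], c[1]), (c[2], c[3])), ((c[4], c[5]), (c[6], c[7])))

        e = [octonion(*[1.0 if i == j else 0.0 for j in range(8)]) for i in range(8)]

        # Some triple of basis elements has (e_i * e_j) * e_k != e_i * (e_j * e_k):
        bad = [(i, j, k) for i in range(8) for j in range(8) for k in range(8)
               if mul(mul(e[i], e[j]), e[k]) != mul(e[i], mul(e[j], e[k]))]
        print(len(bad) > 0)        # True: octonion multiplication is not associative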

  • V.A. Vassiliev (HSE). Buridan's complexity and discriminants.

    Buridan complexity of a computational problem arises from the need to choose one of several equivalent solutions, for example, when the problem has some symmetry. In terms of algorithms, this complexity means that any algorithm solving the problem must contain a number of conditional jump (branching) operators. This issue rarely arises for a single problem, but it does arise when solving a family of problems depending on a parameter: for example, when designing an algorithm that finds a root of a polynomial equation (the parameters in this case are the coefficients of the equation). The simplest examples are the impossibility of giving a good approximate solution of the complex equation X^2 = A by a continuous function of A, and of solving the real equation X^3 + AX + B = 0 by a continuous function of the real parameters A and B. The scale of this phenomenon, in particular the minimal necessary number of discontinuities of a general solution of a polynomial equation or system of equations, is estimated in terms of the geometry and topology of the corresponding discriminant set, that is, the set of polynomials with coinciding roots. (This set has many other important applications and will be discussed in detail.) In this area there are many questions that are easy to formulate but still open.
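
    A quick numerical illustration of the first example (not from the course): follow a square root of A continuously while A makes one loop around zero; it returns as the other root, so no continuous, branch-free rule can select a solution of X^2 = A for all such A.

        import numpy as np

        root = 1.0 + 0j                                   # start at sqrt(1) = 1
        for theta in np.linspace(0, 2 * np.pi, 2001)[1:]:
            r = np.sqrt(np.exp(1j * theta))               # one of the two square roots of A
            root = r if abs(r - root) < abs(-r - root) else -r   # pick the nearby root
        print(root)    # approximately -1: the branch comes back with the opposite sign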

  • E.A. Hirsch (SPbSU). Complexity of proofs.

    Does every theorem have a short proof? For example, to prove that a system of polynomial equations has a solution in {0,1}, it suffices to present such a solution. And what can we present to prove that a system has no solutions? In the example above one could apply Hilbert's Nullstellensatz (however, the polynomials involved can be very complicated). What can and what cannot be done in the general case? This question is open even in the "easiest" case of propositional logic (where we have only logical variables and connectives); there the existence of a proof system with polynomial-size proofs is equivalent to the equality of the complexity classes NP and co-NP, and the question has been intensively studied since the work of S.A. Cook and R.A. Reckhow (1979), who introduced the formal notion of a propositional proof system. A proof system is a polynomial-time algorithm that verifies proofs: it accepts correct proofs of correct statements and accepts no proofs of false statements.

    Although the question is open in the general case, exponential lower bounds are known for a number of specific proof systems. "Cook's program" for studying the complexity of proofs consists in obtaining exponential lower bounds for increasingly powerful proof systems. The concepts and methods used in this area come from various branches of mathematics; for example, there are proof systems based on geometric principles or on proving the emptiness of a semialgebraic set. In this introductory mini-course we will formulate several proof systems and demonstrate several lower bounds. We will also discuss the connection with the general NP vs co-NP question and the existence of "proofs from The Book", a proof system with the shortest possible proofs.
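
    As a toy illustration of the definition (not taken from the course materials), here is a minimal Python sketch of a verifier for one concrete proof system, resolution refutations of CNF formulas: literals are signed integers, clauses are frozensets, and checking a proof takes time polynomial in its length.

        def is_resolvent(c, a, b):
            # c is a resolvent of clauses a and b if, for some literal x,
            # x is in a, -x is in b, and c == (a - {x}) | (b - {-x})
            return any(-x in b and c == (a - {x}) | (b - {-x}) for x in a)

        def check_refutation(cnf, proof):
            # Polynomial-time verifier: each proof line must be an input clause
            # or a resolvent of two earlier lines; the last line must be empty.
            seen = []
            for c in proof:
                if c not in cnf and not any(is_resolvent(c, a, b)
                                            for a in seen for b in seen):
                    return False
                seen.append(c)
            return bool(proof) and proof[-1] == frozenset()

        # Unsatisfiable CNF (x) & (not x or y) & (not y), encoded as 1, -1 2, -2:
        cnf = [frozenset({1}), frozenset({-1, 2}), frozenset({-2})]
        proof = [frozenset({1}), frozenset({-1, 2}), frozenset({2}),
                 frozenset({-2}), frozenset()]
        print(check_refutation(cnf, proof))   # True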

  • A.L. Gorodentsev (HSE). Exceptional bases and Diophantine approximations.
  • S.L. Kuznetsov (HSE). Substructural logics: from algebra to linguistics.

    Substructural logics are logical systems in which some or all of the structural rules valid in classical logic are omitted. For example, contraction (A -> (A and A)) becomes invalid if we interpret A as a kind of resource that is spent when A is used: in this situation one could have A -> B and A -> C, but not A -> (B and C). Another rule, weakening (A -> (B -> A)), becomes invalid in so-called relevance logics (where we want the premises of an implication to be essentially used in obtaining the goal, and disallow reasoning like "If 2x2=5, then the Volga flows into the Caspian Sea"). Finally, there are non-commutative logics, where "A and B" is not the same as "B and A". In this course we will discuss various substructural systems, their algebraic semantics, and applications to the study of natural language.
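
    In sequent-calculus notation (a standard presentation, not necessarily the one used in the course), the structural rules in question can be written as follows; substructural systems are obtained by dropping some of them:

        Weakening:    from  Γ ⊢ C           derive  Γ, A ⊢ C
        Contraction:  from  Γ, A, A ⊢ C     derive  Γ, A ⊢ C
        Exchange:     from  Γ, A, B, Δ ⊢ C  derive  Γ, B, A, Δ ⊢ C

    Dropping weakening and contraction makes the logic resource-sensitive; dropping exchange makes it non-commutative, so that the order of premises matters.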

  • F.V. Petrov (SPbSU). Concentration of measure.

    Concentration of measure is a phenomenon in probability theory, analysis, and combinatorics: any function of a large number of variables, satisfying fairly general and not too restrictive assumptions, has to be almost constant. A classic example is the following: almost all of the surface area of a high-dimensional sphere is concentrated near the equator. In the 1970s Vitaly Milman found an application of this fact in the local theory of Banach spaces, giving a new proof of the famous Dvoretzky theorem (originally a conjecture of Grothendieck): any centrally symmetric convex body of sufficiently large dimension has an almost spherical central section of a given dimension. Since that time, the idea of concentration of measure has found many striking and effective applications, some of which we are going to discuss.

    Minimal knowledge of analysis and probability theory is desirable for understanding the course.
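
    Not part of the course, but a quick Monte-Carlo sketch of the equator example in Python (sample sizes and dimensions are arbitrary): uniform points on the sphere S^(n-1) are obtained by normalizing Gaussian vectors, and the fraction of them lying within distance 0.1 of the equator {x_1 = 0} approaches 1 as the dimension grows.

        import numpy as np

        rng = np.random.default_rng(0)
        for n in (3, 30, 300, 3000):
            x = rng.standard_normal((10_000, n))
            x /= np.linalg.norm(x, axis=1, keepdims=True)   # project onto the unit sphere
            near_equator = np.mean(np.abs(x[:, 0]) < 0.1)   # |first coordinate| < 0.1
            print(n, near_equator)    # this fraction tends to 1 as n grows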

  • G.I. Olshanski (HSE). Determinantal measures (in Russian).

Questions can be addressed to Andrey Dymov by email: dymov@mi-ras.ru