News: A Russian receives the "mathematical Nobel"

19.08.2010
Stanislav Smirnov. Photo from unige.ch

Among the mathematicians awarded the 2010 Fields Medal, the mathematical counterpart of the Nobel Prize, is the Russian Stanislav Smirnov. The committee of the International Mathematical Union decided to award Smirnov the medal for his work in statistical physics, specifically in percolation theory.


Comments
User kubikrubik33, 19.08.2010 15:53 (#)

The air of Switzerland is healthy and pleasant for Russian mathematicians.

User maxwell, 20.08.2010 08:33 (#)

And not only of Switzerland )))

"Stanislav Smirnov graduated from St. Petersburg State University, after which he left to work abroad. He has worked both in the USA and in many European countries. According to RIA Novosti, Smirnov now teaches at the University of Geneva in Switzerland."

User shveyk, 20.08.2010 12:58 (#)

Congratulations, Stanislav Smirnov! Well done, Stanislav, a bright mind!

Just don't do anything foolish; don't even think of coming to Skolkovo! Don't give in to Surkov's propaganda!

User lapland, 20.08.2010 14:47 (#)

I join in the congratulations and, most importantly, in the previous post: do not return under any circumstances! Skolkovo is a bluff...

User adventurer, 24.08.2010 18:11 (#)

I recommend paying attention mainly to the links; I will give them later.

The H-theorem was introduced by L. Boltzmann; it describes the increase in the entropy of an ideal gas in an irreversible process. The H-theorem follows from considerations of Boltzmann's equation, and it implies the estimate

$$H(f(t)) + \int_0^t \!\int D(f(s,x))\,dx\,ds = H(f(0)),$$

where $H$ stands for the H-functional and $D$ for the associated dissipation functional.

In quantum statistical mechanics (which is the quantum version of classical statistical mechanics), the H-function is $H = \sum_i p_i \ln p_i$, where $p_i$ is the probability that the system is found in the $i$-th state, and the summation runs over all possible distinct states of the system. This is closely related to Gibbs's entropy formula $S = -k \sum_i p_i \ln p_i$. Differentiating with respect to time gives

$$\frac{dS}{dt} = -k \sum_i \left[ \frac{dp_i}{dt} \ln p_i + \frac{dp_i}{dt} \right] = -k \sum_i \frac{dp_i}{dt} \ln p_i,$$

since the probabilities sum to one, so that $\sum_i dp_i/dt = 0$. For an isolated system the jumps make the contributions

$$\frac{dp_\alpha}{dt} = \sum_\beta \nu_{\alpha\beta}\,(p_\beta - p_\alpha), \qquad \frac{dp_\beta}{dt} = \sum_\alpha \nu_{\alpha\beta}\,(p_\alpha - p_\beta),$$

and in this case the reversibility of the dynamics ensures that the same transition constant $\nu_{\alpha\beta}$ appears in both expressions. Pairing the terms for $\alpha$ and $\beta$, each contribution to $dS/dt$ has the form $k\,\nu_{\alpha\beta}(p_\beta - p_\alpha)(\ln p_\beta - \ln p_\alpha)$ and so cannot be negative. Therefore $\Delta S \ge 0$.

This is a very important point, because H is a forerunner of Shannon's information entropy. The discrete counterpart of the quantity H is known as the information entropy, or information uncertainty (with a minus sign); by extending the discrete information entropy to the continuous case, one obtains the so-called differential entropy. The quantity H can also be defined as the integral over velocity space, $H = \int P \ln P \, d^3v$, where $P(v)$ is the probability of finding a system chosen at random from the appropriate microcanonical ensemble. One can also write $H = -\ln G + \mathrm{const}$, where $G$ may be spoken of as the number of classical states. For a system of $N$ statistically independent particles, H is related to the thermodynamic entropy $S$ through $S = -NkH$; according to the H-theorem, $S$ can only increase.

However, if one follows Loschmidt's statements, it should not be possible to deduce an irreversible process from time-symmetric dynamics and a time-symmetric formalism: something must be wrong (Loschmidt's paradox). In a bit more detail: Loschmidt's paradox, also known as the reversibility paradox, is the objection that it should not be possible to deduce an irreversible process from time-symmetric dynamics. The time-reversal symmetry of (almost) all known low-level fundamental physical processes is in conflict with any attempt to infer from them the second law of thermodynamics, which describes the behaviour of macroscopic systems. Both are well-accepted principles in physics, with sound observational and theoretical support, yet they seem to be in conflict; hence the paradox. The criticism was provoked by Boltzmann's H-theorem, which attempted to explain, using kinetic theory, the increase of entropy in an ideal gas from a non-equilibrium state when the molecules of the gas are allowed to collide. One way of dealing with Loschmidt's paradox is to regard the second law as an expression of a set of boundary conditions, in which our universe's time coordinate has a low-entropy starting point: the Big Bang.
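To make the sign argument for $dS/dt$ concrete, here is a minimal numerical sketch (my own illustration, not taken from any of the linked papers): a four-state master equation with symmetric transition rates $\nu_{\alpha\beta} = \nu_{\beta\alpha}$, integrated with a plain Euler step, printing the Gibbs entropy (with $k = 1$) as it goes. The rate values, initial distribution and step size are arbitrary choices for the demonstration.

```python
import numpy as np

# Symmetric transition rates nu[a, b] = nu[b, a] between 4 states
# (arbitrary values, chosen only for illustration).
rng = np.random.default_rng(0)
nu = rng.uniform(0.1, 1.0, size=(4, 4))
nu = (nu + nu.T) / 2.0        # reversibility: same constant both ways
np.fill_diagonal(nu, 0.0)

p = np.array([0.85, 0.10, 0.04, 0.01])   # far-from-equilibrium start

def gibbs_entropy(p):
    """S = -sum_i p_i ln p_i, with Boltzmann's constant set to 1."""
    return -np.sum(p * np.log(p))

dt = 0.01
for step in range(501):
    if step % 100 == 0:
        print(f"t = {step * dt:5.2f}   S = {gibbs_entropy(p):.6f}")
    # Master equation: dp_a/dt = sum_b nu_ab (p_b - p_a)
    dp = (nu * (p[None, :] - p[:, None])).sum(axis=1)
    p = p + dt * dp
```

The printed entropy increases monotonically toward $\ln 4$, the entropy of the uniform distribution over four states, in agreement with the pairing argument above.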
But we have digressed somewhat to the side; better to go back to Boltzmann's H-theorem, since we are talking about Mr. Villani here (i.e. his works): the H-theorem's connection between information and entropy plays, of course, a central role in understanding the phenomenon.

So here goes. Boltzmann's H-theorem can be formulated as follows: the Boltzmann equation models the dynamics (as noted earlier) of rarefied gases via a time-dependent density $f(x,v)$ of particles in phase space ($x$ stands for positions and $v$ for velocities),

$$\frac{\partial f}{\partial t} + v \cdot \nabla_x f = Q(f,f),$$

where $v \cdot \nabla_x$ is the transport operator, while $Q$ is the bilinear collision operator

$$Q(f,f) = \int_{\mathbb{R}^3_{v_*} \times S^2} B\,\bigl[f(v')\,f(v_*') - f(v)\,f(v_*)\bigr]\,dv_*\,d\sigma,$$

with $B = B(v - v_*, \sigma) \ge 0$ the collision kernel. The notation $v'$ and $v_*'$ here means the pre-collisional velocities, while $v$ and $v_*$ stand for the post-collisional velocities. According to this model the entropy $S$ is nondecreasing in time, where one defines

$$S(f) = -H(f) = -\int_{\mathbb{R}^3_x \times \mathbb{R}^3_v} f \ln f \, dx\, dv.$$

If one examines this from the standpoint of hydrodynamics: a hydrodynamic state is a density whose velocity dependence is Gaussian, with scalar covariance,

$$f(x,v) = M_{\rho,u,T}(v) = \frac{\rho(x)\, e^{-|v - u(x)|^2 / (2T(x))}}{(2\pi T(x))^{3/2}}.$$

It depends on three local parameters: the density $\rho$, the velocity $u$ and the temperature $T$ (the temperature measures the variance of the velocity distribution). In short, Boltzmann's theorem states that a positive amount of entropy is produced at time $t$ unless the density $f$ is locally Maxwellian (hydrodynamic), i.e. unless there exist fields $\rho$ (scalar), $u$ (vector) and $T$ (scalar) realizing it. So, according to the theme of our narrative, in its final variant Boltzmann's theorem can be represented in this form:

$$[\,D(f) = 0\,] \iff [\,f(v) = M_{\rho,u,T}(v) \text{ for some parameters } \rho, u, T\,].$$

And now to the following. Let us recall some basic concepts from information theory. The Shannon–Hartley theorem is an application of the noisy-channel coding theorem to the archetypal case of a continuous-time analog communications channel subject to Gaussian noise. The theorem establishes Shannon's channel capacity for such a communication link: a bound on the maximum amount of error-free digital data (i.e. information) that can be transmitted with a specified bandwidth in the presence of the noise interference, under the assumption that the signal power is bounded and the Gaussian noise process is characterized by a known power or power spectral density. The law is named after C. Shannon and R. Hartley.

Physical information refers generally to the information that is contained in a physical system. Its usage in quantum mechanics (i.e. quantum information) is important, for example in the concept of quantum entanglement, to describe effectively direct or causal relationships between apparently distinct or spatially separated particles. Information itself may be loosely defined as "that which can distinguish one thing from another". An amount of information is a quantification of how large a given instance, piece, or pattern of information is, or how much of a given system's information content (its instance) has a given attribute, such as being known or unknown.
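As a small sanity check on the Maxwellian $M_{\rho,u,T}$ written above, here is a numerical sketch of my own (in one velocity dimension, to keep it short): the zeroth, first and centered second moments of $M$ should recover exactly the three local parameters $\rho$, $u$ and $T$ that the hydrodynamic description keeps. The test values are arbitrary.

```python
import numpy as np

# Local parameters (arbitrary test values).
rho, u, T = 1.3, 0.4, 0.8

# 1-D Maxwellian: M(v) = rho * exp(-(v - u)^2 / (2T)) / sqrt(2*pi*T).
v = np.linspace(-12.0, 12.0, 20001)
M = rho * np.exp(-(v - u) ** 2 / (2 * T)) / np.sqrt(2 * np.pi * T)

density = np.trapz(M, v)                          # recovers rho
velocity = np.trapz(v * M, v) / density           # recovers u
temperature = np.trapz((v - velocity) ** 2 * M, v) / density  # recovers T

print(density, velocity, temperature)             # ~1.3, ~0.4, ~0.8
```

In three dimensions the normalization is $(2\pi T)^{3/2}$, as in the formula above; the one-dimensional case shows the same structure, and it is exactly these three moments that survive in a hydrodynamic state.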
Amounts of information are most naturally characterized in logarithmic units.

Physical information and entropy: an easy way to understand the underlying unity between physical (as in thermodynamic) entropy and information-theoretic entropy is as follows. Entropy is simply that portion of the (classical) physical information contained in a system of interest (whether it is an entire physical system, or just a subsystem delineated by a set of possible messages) whose identity (as opposed to amount) is unknown, from the point of view of a particular knower. This informal characterization corresponds both to von Neumann's formal definition of the entropy of a mixed quantum state (the von Neumann entropy) and to Claude Shannon's definition of the entropy of a probability distribution over classical signal states or messages (the information entropy). This may be a little confusing, but it is almost directly related to the theme of our story (Villani's most famous works are associated with it). Furthermore, even when the state of a system is known, we can say that the information in the system is still effectively entropy if that information is effectively incompressible, that is, if there are no known or feasibly determinable correlations or redundancies between different pieces of information within the system. Entropy is a macroscopic property of a system that is a measure of the microscopic disorder within the system; one can say it is an important part of the second law of thermodynamics.

In a nutshell, the Shannon–Boltzmann entropy

$$S = -H = -\int f \ln f$$

quantifies how much information there is in a "random" signal, or a language; here one can regard $f$ as the density of the distribution of the signal. A deterministic language means complete predictability; in that case there is no surprise and no information, and thus $S = -\infty$. (A short numerical illustration of this contrast follows after the lattice Boltzmann remarks below.) When there is a reference measure $\mu$, the proper formula for H is

$$H_\mu(\nu) = \int \rho \ln \rho \, d\mu, \qquad \nu = \rho\,\mu.$$

Let us mention in passing the Fisher information, which quantifies how difficult the density is to reconstruct. Note also that, essentially, all this plays an important role in many problems of "pure" mathematics. From a physical point of view, the entropy functional measures the volume of microstates associated, to some degree of accuracy in macroscopic observables, with a given macroscopic configuration, or observable distribution function. In particular, the concept of entropy implies an observation with some margin of error. The basic question related to entropy is: "How exceptional is the observed configuration?"

Let us say something about entropic lattice Boltzmann models. Thermal lattice Boltzmann methods (TLBM) are a class of computational fluid dynamics (CFD) methods for fluid simulation. Instead of solving the Navier–Stokes equations, the discrete Boltzmann equation is solved to simulate the flow of a Newtonian fluid with collision models such as Bhatnagar–Gross–Krook (BGK). Lattice Boltzmann models of fluid dynamics were introduced in the late 80s and early 90s; since then, they have matured into an important new methodology for computational fluid dynamics.
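Returning for a moment to the Shannon–Boltzmann entropy discussed just above, here is the promised illustration (my own sketch, with arbitrary test distributions): the discrete entropy of a nearly deterministic "language" versus a uniform one, and a numerical check of the Gaussian differential entropy against its closed form $\tfrac{1}{2}\ln(2\pi e \sigma^2)$.

```python
import numpy as np

def shannon_entropy(p):
    """Discrete entropy -sum_i p_i ln p_i, skipping zero bins."""
    p = p[p > 0]
    return -np.sum(p * np.log(p))

# A nearly deterministic "language": almost all mass on one symbol.
p_det = np.array([0.999] + [0.001 / 9] * 9)
# A maximally random one: uniform over 10 symbols.
p_uni = np.full(10, 0.1)
print(shannon_entropy(p_det))   # close to 0: no surprise, no information
print(shannon_entropy(p_uni))   # ln 10 ~ 2.3026: maximal for 10 symbols

# Continuous case: S = -int f ln f for a Gaussian density, computed
# numerically and compared with (1/2) ln(2*pi*e*sigma^2).
sigma = 0.7
x = np.linspace(-10.0, 10.0, 200001)
f = np.exp(-x ** 2 / (2 * sigma ** 2)) / np.sqrt(2 * np.pi * sigma ** 2)
S_num = -np.trapz(f * np.log(f), x)
S_exact = 0.5 * np.log(2 * np.pi * np.e * sigma ** 2)
print(S_num, S_exact)           # agree to several decimal places
```

In the discrete case a fully deterministic source has entropy exactly $0$; in the continuous case a deterministic signal is a Dirac mass, and the differential entropy indeed diverges to $-\infty$, as stated above.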
The most physically motivated is the so-called "entropic" version of the lattice Boltzmann model. This version endows the discrete-velocity kinetic equation with a Lyapunov function in the spirit of Boltzmann's celebrated H-function. From a physical point of view, the Lyapunov function specifies an arrow of time; from a numerical point of view, it ensures the nonlinear numerical stability of the model.
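Here is a minimal sketch of that entropic construction (my own illustration, not taken from the linked papers): on a one-dimensional, three-velocity (D1Q3) lattice, the equilibrium is defined as the minimizer of the discrete H-function $H(f) = \sum_i f_i \ln(f_i / w_i)$ under the local conservation constraints, and a BGK-style relaxation toward that minimizer then cannot increase H at a node, by convexity. The starting populations and relaxation parameter are arbitrary test values.

```python
import numpy as np
from scipy.optimize import minimize

# D1Q3 lattice: discrete velocities c_i and weights w_i.
c = np.array([-1.0, 0.0, 1.0])
w = np.array([1.0 / 6.0, 2.0 / 3.0, 1.0 / 6.0])

def H(f):
    """Discrete H-function: H(f) = sum_i f_i ln(f_i / w_i)."""
    return np.sum(f * np.log(f / w))

def entropic_equilibrium(rho, u):
    """Minimize H subject to sum_i f_i = rho and sum_i c_i f_i = rho*u."""
    constraints = (
        {"type": "eq", "fun": lambda f: np.sum(f) - rho},
        {"type": "eq", "fun": lambda f: np.sum(c * f) - rho * u},
    )
    result = minimize(H, x0=w * rho, constraints=constraints,
                      bounds=[(1e-12, None)] * 3)
    return result.x

# An out-of-equilibrium starting population (arbitrary, positive).
f = np.array([0.30, 0.45, 0.25])
rho, u = f.sum(), (c * f).sum() / f.sum()
feq = entropic_equilibrium(rho, u)

omega = 0.8     # relaxation parameter; 0 < omega <= 1 in this sketch
for step in range(6):
    print(f"step {step}   H = {H(f):.6f}")
    f = f + omega * (feq - f)   # BGK relaxation toward the H-minimizer
```

The printed H decreases monotonically toward $H(f^{\mathrm{eq}})$: with $0 < \omega \le 1$ the post-collision state is a convex combination of $f$ and the H-minimizer, so convexity of H does the rest. It is this Lyapunov property that the entropic models exploit for nonlinear stability; the over-relaxed regime $\omega > 1$ used in practice requires the extra care described in the entropic LBM literature.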

User adventurer, 24.08.2010 20:00 (#)

Forgive me for this typo:

$H(f(t)) + \int_0^t \!\int D(f(s,x))\,dx\,ds = H(f(0))$, where $H$ stands for the H-functional and $D$ for the associated dissipation functional. And Villani has also brought closure to a long-standing question concerning entropy and equilibrium in plasmas (or "ion gases"). What's more, he has made surprising connections between the theory of gas diffusion and an eminently practical problem in economics. But I will not tell about this here.

http://arxiv.org/PS_cache/arxiv/pdf/0904/0904.0187v5.pdf
http://dsfd.physics.ndsu.nodak.edu/cgi-bin/mailman/listinfo/dsfd-announce
http://www.awi.de/fileadmin/user_upload/Research/Research_Divisions/Climate_Sciences/Paleoclimate_Dynamics/Modelling/Lessons/Einf_Ozeanographie/lecture_19_Jan_2010.pdf
http://www.ma.utexas.edu/mp_arc/c/05/05-258.pdf
http://research.nianet.org/~luo/Reprints-luo/1997/HeXY_JSPv88-1997.pdf

User adventurer, 24.08.2010 20:20 (#)

hah

$H(f(t)) + \int_0^t \!\int D(f(s,x))\,dx\,ds = H(f(0))$

User adventurer, 24.08.2010 20:30 (#)

Why? What's the matter?

Apparently there is some very strange technical problem (in that post of mine: 24, 2010, 16:20).

$H(f(t)) + \int_0^t \!\int D(f(s,x))\,dx\,ds = H(f(0))$. But then again, what difference does it make; at this point it is a purely sporting correction.

User adventurer, 24.08.2010 18:14 (#)

These are the links for the comment.

http://www.sciencedirect.com/science?_ob=ArticleURL&_udi=B6X1B-4JXRX6W-3&_user=10&_coverDate=06/01/2006&_rdoc=1&_fmt=high&_orig=search&_sort=d&_docanchor=&view=c&_searchStrId=1436350197&_rerunOrigin=google&_acct=C000050221&_version=1&_urlVersion=0&_userid=10&md5=9c2d5238583d5f45154ce00836cb73c3
http://cmouhot.wordpress.com/2010/06/30/new-preprint-factorization-for-non-symmetric-operators-and-exponential-h-theorem/
http://www.umpa.ens-lyon.fr/~cvillani/cv.html
http://www.ams.org/journals/bull/2004-41-02/S0273-0979-04-01004-3/

User adventurer, 24.08.2010 18:24 (#)

These are the links for the first comment.

http://www.cise.ufl.edu/research/revcomp/physlim/plpaper.html
http://cdsweb.cern.ch/record/521260/files/0110018.pdf
http://arxiv.org/PS_cache/arxiv/pdf/0903/0903.5082v1.pdf

User adventurer, 24.08.2010 20:10 (#)

http://dsfd.physics.ndsu.nodak.edu/cgi-bin/mailman/listinfo/dsfd-announce
http://www.awi.de/fileadmin/user_upload/Research/Research_Divisions/Climate_Sciences/Paleoclimate_Dynamics/Modelling/Lessons/Einf_Ozeanographie/lecture_19_Jan_2010.pdf
http://research.nianet.org/~luo/Reprints-luo/1997/HeXY_JSPv88-1997.pdf
http://arxiv.org/PS_cache/arxiv/pdf/0904/0904.0187v5.pdf

User adventurer, 24.08.2010 18:20 (#)

These links are relevant to Mr. Smirnov
http://www.sciencedirect.com/science?_ob=ArticleURL&_udi=B6VJ2-43P2369-K&_user=10&_coverDate=08%2F01%2F2001&_rdoc=1&_fmt=high&_orig=search&_sort=d&_docanchor=&view=c&_acct=C000050221&_version=1&_urlVersion=0&_userid=10&md5=a63707b5792b91f673b1b34c9546fac0
There you can find the following: (DOI: 10.1016/S0764-4442(01)01991-7); one can open it.
http://www.ams.org/notices/200605/what-is-kesten.pdf
