After welcome addresses by the conference chair, a representative of the government agency in charge of fundamental research, and the rector of the university, the conference started off in a somewhat sombre mood with a commemoration of Roberto Petronzio, a pioneer of lattice QCD, who passed away last year. Giorgio Parisi gave a memorial talk summarizing Roberto's many contributions to the development of the field, from his early work on perturbative QCD and the parton model, through his pioneering contributions to lattice QCD back in the days of small quenched lattices, to his recent work on partially twisted boundary conditions and on isospin breaking effects, which is very much at the forefront of the field at the moment, not forgetting Roberto's role as director of the Italian INFN in politically turbulent times.

This was followed by a talk by Martin Lüscher on stochastic locality and master-field simulations of very large lattices. The idea of a master-field simulation is based on the observation of volume self-averaging, i.e. that the variance of volume-averaged quantities is much smaller on large lattices (intuitively, this would be because an infinitely extended, properly thermalized lattice configuration would have to contain any possible finite sub-configuration with a frequency corresponding to its weight in the path integral, and thus a large enough typical lattice configuration is itself a sort of ensemble). A master field is then a huge (e.g. 256^4) lattice configuration, on which volume averages of quantities are computed; these have an expectation value equal to the QCD expectation value of the quantity in question, and a variance which can be estimated using a double volume sum that is doable using an FFT. To generate such a huge lattice, algorithms with global accept-reject steps (like HMC) are unsuitable, because ΔH grows with the square root of the volume, but stochastic molecular dynamics (SMD) can be used, and it has been rigorously shown that for short enough trajectory lengths SMD converges to a unique stationary state even without an accept-reject step.
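To make the variance estimate concrete, here is a minimal NumPy sketch of the FFT trick for a toy two-dimensional scalar observable (the function name and the 2D setting are mine, chosen for illustration — the actual master-field analysis is of course four-dimensional): the connected two-point function of the observable at all separations is obtained from a single FFT-based autocorrelation, and the variance of the volume average follows by summing it over a ball of separations within which correlations are assumed non-negligible.

```python
import numpy as np

def master_field_error(obs, r_max):
    """Estimate mean and error of the volume average of a local observable
    measured on a SINGLE large ("master-field") configuration.

    obs   : 2D array, obs[x, y] = O(x) on an L1 x L2 periodic lattice (toy case)
    r_max : summation radius; correlations beyond r_max are assumed negligible
    """
    V = obs.size
    mean = obs.mean()
    d = obs - mean
    # Double volume sum via FFT: ifft(|fft(d)|^2)[z] = sum_x d(x) d(x+z),
    # i.e. the connected two-point function C(z) after dividing by V,
    # computed for ALL separations z at O(V log V) cost.
    f = np.fft.fftn(d)
    corr = np.fft.ifftn(f * np.conj(f)).real / V
    # Sum C(z) over the ball |z| <= r_max, using periodic (minimal) distances.
    L1, L2 = obs.shape
    z1 = np.minimum(np.arange(L1), L1 - np.arange(L1))
    z2 = np.minimum(np.arange(L2), L2 - np.arange(L2))
    mask = (z1[:, None] ** 2 + z2[None, :] ** 2) <= r_max ** 2
    var = corr[mask].sum() / V  # variance of the volume average
    return mean, np.sqrt(max(var, 0.0))
```

In practice r_max would be chosen as a few correlation lengths ξ, so that the error of the volume average scales roughly like sqrt(ξ^d/V) — which is the stochastic-locality statement in quantitative form.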

After the coffee break, yet another novel simulation method was discussed by Ignacio Cirac, who presented techniques to perform quantum simulations of QED and QCD on a lattice. While quantum computers of the kind that would render RSA-based public-key cryptography irrelevant remain elusive at the moment, the idea of a quantum simulator (which is essentially an analogue quantum computer), which goes back to Richard Feynman, can already be realized in practice: optical lattices allow trapping atoms on lattice sites while fine-tuning their interactions so as to model the couplings of some other physical system, which can thus be simulated. The models that are typically simulated in this way are solid-state models such as the Hubbard model, but it is of course also possible to set up a quantum simulator for a lattice field theory that has been formulated in the Hamiltonian framework. In order to model a gauge theory, it is necessary to model the gauge symmetry by some atomic symmetry such as angular momentum conservation, and this has been done, at least in theory, for QED and QCD. The Schwinger model has been studied in some detail. The plaquette action for d>1+1 additionally requires a four-point interaction between the atoms modelling the link variables, which can be realized using additional auxiliary variables, and non-abelian gauge groups can be encoded using multiple species of bosonic atoms. A related theoretical tool that is still in its infancy, but shows significant promise, is the use of tensor networks.
This is based on the observation that for local Hamiltonians the entanglement between a region and its complement grows only as the surface of the region, not its volume, so only a small corner of the total Hilbert space is relevant; this allows one to write the coefficients of the wavefunction in a basis of local states as a contraction of tensors, from which classical algorithms that scale much better than the exponential growth in the number of variables that would naively be expected can be derived. Again, the method has been successfully applied to the Schwinger model, but higher dimensions are still challenging, because the scaling, while not exponential, still becomes very bad.
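As a minimal illustration of the tensor-network idea (a generic matrix-product-state decomposition in NumPy, written for this post rather than taken from any code discussed in the talk): any state vector can be rewritten exactly as a train of tensors by sweeping SVDs across the chain, and for states obeying the area law the bonds between the tensors can be truncated to a small dimension with little or no loss.

```python
import numpy as np

def to_mps(psi, N, chi_max=None):
    """Decompose an N-qubit state vector into a matrix-product state
    (a chain of rank-3 tensors) by sweeping left to right with SVDs,
    optionally truncating each bond to chi_max singular values."""
    tensors = []
    rest = psi.reshape(1, -1)                    # (bond dim, remaining dims)
    for n in range(N - 1):
        chi = rest.shape[0]
        rest = rest.reshape(chi * 2, -1)         # split off one physical index
        U, S, Vh = np.linalg.svd(rest, full_matrices=False)
        if chi_max is not None:                  # area-law truncation step
            U, S, Vh = U[:, :chi_max], S[:chi_max], Vh[:chi_max]
        tensors.append(U.reshape(chi, 2, -1))    # (left bond, physical, right bond)
        rest = S[:, None] * Vh                   # carry the rest of the state along
    tensors.append(rest.reshape(-1, 2, 1))
    return tensors

def from_mps(tensors):
    """Contract an MPS back into a full state vector."""
    out = tensors[0]
    for T in tensors[1:]:
        out = np.tensordot(out, T, axes=([-1], [0]))
    return out.reshape(-1)
```

For example, a six-qubit GHZ state, whose Schmidt rank is 2 across every cut, is represented exactly with bond dimension 2, whereas a generic state would need bond dimension 8 at the central cut — this compression is what the classical tensor-network algorithms exploit.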

Staying with the topic of advanced simulation techniques, the next talk was by Leonardo Giusti, speaking about the block factorization of fermion determinants into local actions for multi-boson fields. By decomposing the lattice into three pieces, of which the middle one separates the other two by a distance Δ large enough to render e^{-MπΔ} small, and by applying a domain decomposition similar to the one used in Lüscher's DD-HMC algorithm to the Dirac operator, Leonardo and collaborators have been able to derive a multi-boson algorithm that allows one to perform multilevel integration with dynamical fermions. For hadronic observables, the quark propagator also needs to be factorized, which Leonardo et al. have also achieved, making a significant decrease in statistical errors possible.
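The statistical gain from multilevel integration can be illustrated with a toy model (pure NumPy, no lattice QCD involved; the estimator names are mine): if an observable factorizes into A·B with A and B fluctuating independently once the boundary region is frozen, then averaging the factors separately combines every A measurement with every B measurement, so n samples per level act like n² measurements of the product.

```python
import numpy as np

def naive_estimate(a, b):
    """One-level estimator: average the product over n paired measurements."""
    return np.mean(a * b)

def two_level_estimate(a, b):
    """Two-level estimator: average each factor separately, so that all
    n^2 combinations of A and B measurements contribute to <A*B>."""
    return np.mean(a) * np.mean(b)
```

For fluctuation-dominated observables (e.g. a correlator at large separation), the two-level error falls like 1/n instead of 1/sqrt(n); in practice the gain is limited by the residual coupling of the domains through the boundary region, which is exactly what the determinant and propagator factorizations are needed to control.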

After the lunch break there were parallel sessions, in one of which I gave my own talk and another one of which I chaired, thus finishing all of my duties other than listening (and blogging) on day one.

In the evening, there was a reception followed by a special guided tour of the truly stunning Alhambra (which incidentally contains a great many colourful - and very tasteful - lattices in the form of ornamental patterns).
