Saturday, June 19, 2010

Lattice 2010, Day Five

The day started with plenary sessions again. The first plenary speaker was Chris Sachrajda on the topic of phenomenology from the lattice. Deferring to the dedicated talks on heavy and light quarks, spectroscopy and hadron structure for those topics, he covered a mix of phenomenologically interesting quantities, starting from those that have been measured to good accuracy on the lattice and progressing to those that still pose serious or perhaps even insurmountable problems. The accurate determination of Vus/Vud from fK/fπ and of Vus from the Kl3 form factor f+(0), where both the precision and the agreement with the Standard Model are very good, clearly fell into the first category. The determination of BK is less precise, and there is a 2σ tension in the resulting value of |εK|. Even more challenging is the decay K --> ππ, on which progress is nevertheless being made, whereas the yet greater challenge of nonleptonic B-decays cannot be tackled with presently known methods. Chris closed his talk by reminding the audience that at another lattice conference held in Italy, namely that of 1989 (i.e. when I was just a teenager), Ken Wilson had predicted that it would take 30 years until precise results could be attained from lattice QCD, and that, given that we still have nine years left, we are well on our way.

The next plenary talk was given by Jochen Heitger, who spoke about heavy flavours on the lattice. Flavour physics is an important ingredient in the search for new physics, because essentially all extensions of the Standard Model have some kind of flavour structure that could reveal them through their contributions to flavour processes. On the lattice, "gold-plated" processes with no or one hadron in the final state and a well-controlled chiral behaviour play a crucial role because they can be treated accurately. Still, treating heavy quarks on the lattice is difficult, because one needs to maintain a multiscale hierarchy of 1/L << mπ << mQ << 1/a. A variety of methods are currently in use, and Jochen nicely summarised results from most of them, including, but not limited to, the current-current correlators used by HPQCD, ETMC's interpolation of ratios between the static limit and dynamical masses, and the Fermilab approach, paying special attention to the programme of non-perturbative HQET pursued by the ALPHA collaboration.

The second plenary session started with a talk by Mike Peardon about improved design of hadron creation operators. The method in question is the "distillation" method that has been talked about a lot for about a year now. The basic insight at its root is that we generally use smeared operators to improve the signal-to-noise ratio, and that smearing tends to wipe out contributions from high-frequency modes of the Laplacian. If one then defines a novel smearing operator by projecting on the lowest few modes of the (spatial) Laplacian, this operator can be used to re-express the large traces appearing in correlation functions with smaller traces over the space spanned by the low modes. If the smearing or "distillation" operator is D(t) = V(t)V(t)†, one defines the "perambulator" τ(t,t') = V(t)† M⁻¹(t,t') V(t') that takes the place of the propagator, and reduced operators Φ(t) = V(t)† Γ V(t), in terms of which to write the small traces. Insertions needed for three-point functions can be treated similarly by defining a generalised perambulator. Unfortunately, this method as it stands has a serious problem in that it scales very badly with the spatial volume -- the number of low modes needed for a given accuracy scales with the volume, and so the method scales at least like the volume squared. However, this problem can be solved by using a stochastic estimator that is defined in the low-mode space, and the resulting stochastic method appears to perform much better than the usual "dilution" method.
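To give a flavour of what these small traces look like in practice, here is a minimal numpy sketch of an isovector meson two-point function in the distillation framework. It is purely illustrative and not the speakers' code: the sizes are toy values, and the perambulators and reduced operators are filled with random numbers where a real calculation would compute them from Dirac-operator solves and the stored low modes.

```python
import numpy as np

# Toy sizes; real calculations use many more low modes and time slices.
Nvec, Nspin, Nt = 8, 4, 16
dim = Nvec * Nspin
rng = np.random.default_rng(0)

def rand_mat(*shape):
    # Stand-in for data a real code would compute and store: complex
    # matrices acting on the (low-mode x spin) distillation space.
    return rng.standard_normal(shape) + 1j * rng.standard_normal(shape)

t0 = 0
tau_t_t0 = rand_mat(Nt, dim, dim)   # perambulators tau(t, t0)
tau_t0_t = rand_mat(Nt, dim, dim)   # perambulators tau(t0, t)
phi = rand_mat(Nt, dim, dim)        # reduced operators Phi(t) = V(t)^dagger Gamma V(t)

# Meson two-point function as a small trace over the distillation space
# instead of a large trace over the full volume x spin x colour space:
# C(t) = Tr[ Phi(t) tau(t, t0) Phi(t0) tau(t0, t) ]
corr = np.array([np.trace(phi[t] @ tau_t_t0[t] @ phi[t0] @ tau_t0_t[t])
                 for t in range(Nt)])
```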

The last speaker of the morning was Michele Pepe with a talk on string effects in Yang-Mills theory. The subject of the talk was the measurement of the width of the effective string and the observation of the decay of unstable k-strings in SU(2) gauge theory. By using a multilevel simulation technique proposed by Lüscher and Weisz, Pepe and collaborators have been able to perform these very challenging measurements. The results for the string width agree with theoretical expectations from the Nambu-Goto action, and the expected pattern of k-string decays (1 --> 0, 3/2 --> 1/2, and 2 --> 1 --> 0) could be nicely seen in the plots.

The plenary session was closed by the announcement that LATTICE 2011 will be held from the 10th to the 16th of July 2011 at the Squaw Valley Resort in Lake Tahoe, California, USA.

In the afternoon there were again parallel sessions.

Friday, June 18, 2010

Lattice 2010, Day Four

Today's first plenary session was started by Kazuyuki Kanaya with a talk on finite-temperature QCD. Many groups are determining the transition temperature between the confined and deconfined phases, but since the transition is most likely a crossover in the neighbourhood of the physical point, the value of the "critical" temperature found may depend on the observable studied. There had furthermore been some disagreement even between different studies using the same observables, but those discrepancies seem to have mostly gone away by now.

Next was Luigi Del Debbio speaking about the conformal window on the lattice. The motivation for these kinds of studies is the hope that the physics of electroweak symmetry breaking may originate not from a fundamental scalar Higgs, but from a fermionic condensate similar to the chiral condensate in QCD, arising from a gauge theory ("technicolor") living at higher energy scales, perhaps around 1 TeV. To make these kinds of models viable, the coupling needs to run very slowly. One is then motivated to look for gauge theories having an infrared fixed point. Lattice simulations can help to answer the question of which combinations of Nc, the number of colours, and Nf, the number of fermion flavours, actually exhibit such behaviour. The Schrödinger functional can be used to study such questions, but while there are a number of results, no very clear picture appears to have emerged yet.

The second plenary session of the morning was opened with a talk on finite-density QCD by Sourendu Gupta. QCD at finite density, i.e. finite chemical potential, is plagued by a sign problem because the fermionic determinant is in general no longer real. A number of ways around this problem have been proposed. The most straightforward is reweighting, the most ambitious a reformulation of the theory that manages to eliminate the sign problem entirely. On the latter front, there has been progress in that the 3D XY model, which also has a sign problem, has been successfully reformulated in different variables in which it no longer suffers from a sign problem; whether something similar might be possible for QCD remains to be seen. Other approaches try to exploit analyticity to evade the sign problem, either by Taylor-expanding around zero chemical potential and measuring the Taylor coefficients as susceptibilities at zero chemical potential, or by simulating at purely imaginary chemical potential (where there is no sign problem) and extrapolating to real chemical potential. In this way, various determinations of the critical point of QCD have been performed, which agree more or less with each other. All of them lie in a region through which the freeze-out curve of heavy-ion experiments is expected to pass, so the question of the location of the critical point may become accessible experimentally.
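As a rough illustration of the analytic-continuation idea (a sketch with made-up numbers, not any group's actual analysis), one can imagine measuring some observable at a few purely imaginary chemical potentials, where (μ/T)² is negative and there is no sign problem, fitting it as a polynomial in (μ/T)², and evaluating the fit at real μ:

```python
import numpy as np

# Sketch of analytic continuation from imaginary chemical potential.
# The "measured" values below are fake numbers purely for illustration.
muI_over_T = np.linspace(0.1, 0.8, 8)                # imaginary-mu simulation points
x_imag = -muI_over_T**2                              # (mu/T)^2 is negative at imaginary mu
obs_imag = 1.0 + 0.15 * x_imag + 0.02 * x_imag**2    # pretend lattice measurements

coeffs = np.polyfit(x_imag, obs_imag, deg=2)         # Taylor coefficients in (mu/T)^2

mu_over_T = np.linspace(0.0, 1.0, 11)                # real chemical potentials
obs_real = np.polyval(coeffs, mu_over_T**2)          # continuation to real mu
```

The Taylor-expansion approach works with the same kind of series, but determines the coefficients directly as susceptibilities measured at zero chemical potential.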
The last plenary talk of the morning was given by Takeshi Yamazaki, who talked about a determination of the binding energy of helium nuclei in quenched QCD. The effort involved is considerable (there are more than 1000 different contractions for 4He, and the lattices considered have to be very large, both to accommodate a helium nucleus and to distinguish between true bound states and attractive scattering states), even though the simulations were quenched and the valence quarks used corresponded to a pion mass of about 800 MeV. The study found that helium nuclei are indeed bound.

In the afternoon there were parallel sessions.

Thursday, June 17, 2010

Lattice 2010, Days Two and Three

Yesterday was an all-parallels day, so there are no plenary talks to summarise. In the evening there was the poster session.

The internet connection at the resort does not really have the capacity to deal with 360 computational physicists all reading their email, checking on their running computer jobs, browsing the hep-lat arXiv or writing their blogs at the same time; this may lead to late updates from me, so please be patient.

Today's first plenary session was the traditional non-lattice plenary. The first talk was by Eytan Domany, who spoke about the challenges posed to computational science by the task of understanding the human genome. A large part of his talk was an introduction to the biological concepts involved, such as DNA, chromosomes, genes, RNA, transcription, transcription factors, ribosomes, gene expression, exons, introns, "junk" DNA, regulation networks and epigenetics. These days, it is possible to analyse the expression of thousands of genes in a sample by means of a single chip, and the data obtained by performing this kind of analysis on large numbers of samples (e.g. from different kinds of cells or from different patients) can be seen as an expression matrix with rows for genes and columns for samples. The difficult task is then to use this kind of large data matrix to infer regulation networks or connections between gene expression and phenotypes. Apparently, there are physicists working in this area together with the biologists, bringing in their computational expertise.
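For concreteness, here is a toy sketch of the data layout he described (not Domany's actual analysis, and with random numbers in place of real chip data): an expression matrix with one row per gene and one column per sample, from which one can compute, for example, gene-gene correlations as a crude first step towards guessing which genes might be co-regulated.

```python
import numpy as np

# Toy expression matrix: rows are genes, columns are samples. Real data would
# come from chip measurements; here we just fill it with random numbers.
n_genes, n_samples = 1000, 50
rng = np.random.default_rng(3)
expression = rng.standard_normal((n_genes, n_samples))

# Gene-gene correlation matrix and, for each gene, its most correlated partner.
gene_corr = np.corrcoef(expression)            # shape (n_genes, n_genes)
np.fill_diagonal(gene_corr, 0.0)               # ignore trivial self-correlation
most_correlated_partner = np.argmax(np.abs(gene_corr), axis=1)
```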

The second plenary talk was an LHC status summary given by Slawek Tkaczyk. The history of the LHC is of course well known to readers of this blog; so far, the first data are being analysed to "rediscover" the Standard Model, with the aim of discovering new physics in the not too distant future, but no evidence of e.g. the Higgs or SUSY has been shown (yet?).

The second plenary session was devoted to non-QCD lattice simulations. The first talk was Renate Loll speaking on Lattice Quantum Gravity, specifically on causal dynamical triangulations. This approach to Quantum Gravity starts from the path integral for the Einstein-Hilbert action of General Relativity and regularises it by replacing continuous spacetime with a discrete triangulation. The discrete spacetime is then a simplicial complex satisfying certain additional requirements, and the Wick-rotated path integral can be treated using Monte Carlo techniques. In one phase of the (three-parameter) theory, the macroscopic structure of the resulting spacetime has been found to agree with de Sitter space. Another surprising and interesting result of this approach has been that the spectral dimension associated with the diffusion of particles on the discrete spacetime runs continuously from around 2 at short (Planckian) distances to 4 at large distances.
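To illustrate how a spectral dimension is extracted from a diffusion process (this is only a sketch of the general technique, not the CDT code), one diffuses random walkers, measures the return probability P(σ) as a function of diffusion time σ, and takes d_s = -2 dlnP(σ)/dlnσ. On a flat four-dimensional lattice, as in the toy example below, this gives d_s ≈ 4 at all scales, whereas in CDT the same observable runs from about 2 to about 4.

```python
import numpy as np

# Spectral dimension from the return probability of a simple random walk on a
# flat 4D hypercubic lattice (illustrative only; the statistics at large sigma
# are poor unless many more walkers are used).
rng = np.random.default_rng(1)
D, walkers, steps = 4, 200_000, 200

pos = np.zeros((walkers, D), dtype=np.int64)
P = np.zeros(steps)                                # return probability P(sigma)
for sigma in range(1, steps + 1):
    axis = rng.integers(0, D, size=walkers)        # pick a direction for each walker
    sign = rng.choice((-1, 1), size=walkers)
    pos[np.arange(walkers), axis] += sign
    P[sigma - 1] = np.mean(np.all(pos == 0, axis=1))

sig = np.arange(1, steps + 1)
mask = (P > 0) & (sig % 2 == 0)                    # walkers can only return at even sigma
d_s = -2 * np.gradient(np.log(P[mask]), np.log(sig[mask].astype(float)))
```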

Next was a talk on exact lattice SUSY by Simon Catterall. Normally, a lattice regularisation completely ruins supersymmetry, but theorists have found a way to formulate certain classes of supersymmetric theories (including N=4 Super-Yang-Mills) on a special kind of lattice, giving a local, gauge-invariant action with a doubler-free fermion formulation. This may offer a chance to study quantum gravity by simulations of lattice SUSY via the AdS/CFT correspondence.

In the afternoon there were excursions. I had signed up for the only excursion for which places were still available, which was a tour of a Sardinian winery with a wine tasting. The tour was not too interesting, as everything was very technologically modern, and as somebody said, we can go and look at the LHC if we want to see modern technology. The wines tasted were very nice, though.

Monday, June 14, 2010

Lattice 2010, Day One

Hello from the Atahotel Tanka Village Resort in Villasimius, Sardinia, Italy, where I am at the Lattice 2010 conference.

The conference started this morning with a talk by Martin Lüscher about "Topology, the Wilson flow and the HMC algorithm". It is by now well known in the lattice community that Monte Carlo simulations of lattice QCD suffer from a severe problem with long autocorrelations of the topological charge of the gauge field. This problem affects the HMC algorithm and its variants, which are used in lattice simulations with dynamical fermions, just as much as the simple link-updating schemes (Metropolis, heat bath) that can be used for pure gauge or quenched calculations. The autocorrelation time of the topological charge grows roughly like the fifth power of the inverse lattice spacing a as a is taken to zero. This is a real problem because it indicates the presence in the simulated system of modes that are updated only very slowly, and as a consequence the statistical errors of observables measured in Monte Carlo simulations may be seriously underestimated, because the contribution to the error coming from the long tails of the autocorrelation function that stem from those modes is not properly taken into account.

Martin Lüscher then introduced the Wilson flow, which is an evolution in field space generated by the Wilson plaquette action, and which can in some sense be seen as a sequence of infinitesimal stout link smearings. For the case of an abelian gauge theory, the flow equation can be solved exactly via the heat kernel, and it can be shown that it gives renormalised smooth solutions; for QCD, the same can be seen to hold numerically. Defining a transformed field V(U) by running the Wilson flow for a specified time t0, it can then be shown that the path integral over U is the same as the path integral over V(U) with an additional term in the action that comes from the Jacobian of the transformation and is proportional to g0/a times the integral of the Wilson plaquette action along the flow trajectory. As a goes to zero, the latter term will act to suppress large values of the plaquette. An old theorem of Lüscher's shows that the submanifold of field space with all plaquette values less than 0.067 divides into topological sectors, and hence the probability to be "between" topological sectors decays in line with the suppression of large plaquettes by the g0/a term. This explains the problem seen, but also offers hope for a solution, since one might now try to develop algorithms that make progress by making large changes to the smooth fields V.
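As a toy illustration of why slowly updated modes are dangerous (a sketch with made-up data, not anything from the talk), consider a Monte Carlo history with a long autocorrelation time: the naive standard error of the mean has to be inflated by a factor sqrt(2 τ_int), where τ_int is the integrated autocorrelation time, and that factor can easily be an order of magnitude.

```python
import numpy as np

def autocorrelation(x, max_lag):
    # Normalised autocorrelation function of a Monte Carlo time series.
    x = np.asarray(x, dtype=float) - np.mean(x)
    var = np.mean(x * x)
    return np.array([np.mean(x[:len(x) - t] * x[t:]) / var for t in range(max_lag)])

def tau_int(x, window):
    # Simple fixed-window estimate of the integrated autocorrelation time.
    rho = autocorrelation(x, window)
    return 0.5 + np.sum(rho[1:])

# Toy stand-in for a topological-charge history: an AR(1) process with
# tau_int = 0.5 + phi/(1 - phi), i.e. about 200 for phi = 0.995.
rng = np.random.default_rng(2)
phi_coeff = 0.995
q = np.empty(100_000)
q[0] = 0.0
for i in range(1, len(q)):
    q[i] = phi_coeff * q[i - 1] + rng.standard_normal()

naive_err = np.std(q) / np.sqrt(len(q))
corrected_err = naive_err * np.sqrt(2 * tau_int(q, window=2000))  # roughly 20x larger
```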
This was followed by two review talks. The first was a review of the state of the art in hadron spectroscopy and light pseudoscalar decay constants by Christian Hölbling, who emphasized the reduction of systematic errors achieved by decreasing lattice spacings and pion masses and increasing simulation volumes.

The second review talk of the morning was given by Constantia Alexandrou, who reviewed hadron structure and form factor calculations from the lattice, drawing attention to the many remaining uncertainties in this important area, where in particular the axial charge gA of the nucleon is consistently measured to be significantly lower on the lattice than in nature.

The last plenary speaker of the day was Gregorio Herdoiza, who spoke about the progress being made towards 2+1+1 flavour simulations. The collaborations currently pursuing the ambitious goal of including a fully dynamical charm quark in their simulations are ETMC and MILC. MILC is using the Highly Improved Staggered Quark (HISQ) action to reduce discretisation errors, whereas ETMC is relying on a variant of twisted mass fermions with an explicit breaking of the mass degeneracy for the strange/charm doublet. In the former case, the effects of reduced lattice artifacts are clearly seen, while in the latter case the O(a²) mass splitting between the neutral and charged pion increases with the number of flavours. In either case, a significant effort is necessary to tune the strange and charm quark masses to their physical values, but the effort is definitely well spent if it leads to Nf=2+1+1 predictions from lattice QCD that include all effects of an active charm quark.

In the afternoon there were parallel talks. Two that I'd like to highlight were the talk by Bastian Knipschild from Mainz, who presented an efficient method to strongly reduce the systematic error on nucleon form factors coming from excited-state contributions, and David Adams' talk, in which he presented a generalisation of the overlap operator to staggered fermions that gives a chiral two-flavour theory.