The final day of the conference started with a review talk by Claudio Pica on lattice simulations aiming to chart the fundamental physics beyond the Standard Model. The problem with the SM is to some extent how well it works, given that we know it must be incomplete. One of the main contenders for replacing it is the idea of strong dynamics at a higher energy scale giving rise to the Higgs boson as a composite particle. The most basic "technicolor" theories of this kind fail because they cannot account for the relatively large masses of the second- and third-generation quarks. To avoid that problem, the coupling of the technicolor gauge theory must not run, but "walk" slowly from high to low energy scales, which has given rise to a veritable industry of lattice simulations investigating the β function of various gauge theories coupled to various numbers of fermions in various representations. The Higgs can then be either a dilaton associated with the breaking of conformal symmetry, which would naturally couple like a Standard Model Higgs, or a pseudo-Goldstone boson associated with the breaking of some global flavour symmetry. So far, nothing very conclusive has resulted, but of course the input from experiment at the moment consists only of limits that rule some models out, without allowing for any discrimination between the models that aren't ruled out.
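(As a standard-textbook aside of my own, not taken from the talk: the scale dependence of the gauge coupling g is encoded in the β function, β(g) = μ dg/dμ. In a QCD-like theory the coupling runs quickly with the scale μ, whereas in a "walking" theory β(g) stays close to zero over a wide range of scales, so the coupling remains nearly constant; an exactly conformal theory has an infrared fixed point g* with β(g*) = 0.)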
A specific example of BSM physics, viz. strongly interacting dark matter, was presented in a talk by Enrico Rinaldi. If there is a new strongly-coupled interaction, as suggested by the composite Higgs models, then besides the Higgs there will also be other bound states, some of which may be stable and provide a dark matter candidate. While the "dark" nature of dark matter requires such a bound state to be neutral, its constituents might interact with the SM sector, allowing for the production and detection of dark matter. Many different models of composite dark matter have been considered, and the main constraints currently come from the non-detection of dark matter in direct searches, which place limits on the "hadron-structure" observables of the dark matter candidates, such as their σ-terms and charge radii.
David Kaplan gave a talk on a new perspective on chiral gauge theories, whose lattice formulation has always been a persistent problem, largely due to the Nielsen-Ninomiya theorem. However, the fermion determinant of chiral gauge theories is already somewhat ill-defined even in the continuum. A way to make it well-defined has been proposed by Alvarez-Gaumé et al. through the addition of an ungauged right-handed fermion. On the lattice, the U(1)A anomaly is found to emerge in the limit of vanishing lattice spacing as the remnant of the explicit breaking of chiral symmetry by e.g. the Wilson term. Attempts at realizing ungauged mirror fermions using domain wall fermions with a gauge field constrained to lie near one domain wall have failed, and a realization using the gradient flow in the fifth dimension turns the mirror fermions into "fluff". A new realization along the lines of the overlap operator gives a lattice operator very similar to that of Alvarez-Gaumé et al. by coupling the mirror fermion to a fixed point of the gradient flow, which is a pure-gauge configuration.
After the coffee break, Tony Hey gave a very entertaining, if somewhat meandering, talk about "Richard Feynman, Data-Intensive Science and the Future of Computing", going all the way from Feynman's experiences at Los Alamos to AI singularity scenarios and the security aspects of self-driving cars.
The final plenary talk was the review talk on machines and algorithms by Peter Boyle. The immediate roadmap for new computer architectures shows increases of around 400 times in single-precision performance per node, but only a two-fold increase in interconnect bandwidth, and this imbalance must be taken into account in algorithm design and implementation in order to achieve good scaling behaviour. Large increases in chip performance are to be expected from the three-dimensional arrangement of units, which will allow thicker and shorter copper wires, although there remain engineering problems to solve, such as how to get the heat out of such chips efficiently. In terms of algorithms, multigrid solvers are now becoming available for a larger variety of fermion formulations, leading to potentially large increases in performance near the chiral and continuum limits. Multilevel integration methods, which allow for an exponential reduction of the statistical noise, also look interesting, although at the moment they work only in the quenched theory.
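(A back-of-the-envelope illustration of my own, not a claim from the talk: if the floating-point performance per node grows by a factor of about 400 while the interconnect bandwidth merely doubles, then the number of flops available per byte that can be sent off-node grows by a factor of roughly 200, so implementations will have to communicate far less per unit of computation, e.g. through domain-decomposed solvers, in order to keep scaling well across many nodes.)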
The IAC announced that Lattice 2018 will take place at Michigan State University. Elvira Gamiz, as the chair of the Lattice 2017 LOC, extended an invitation to the lattice community to come to Granada for Lattice 2017, which will take place in the week of 18-24 June 2017. And with that, and a round of well-deserved applause for the organizers, the conference closed.
My further travel plans are of interest only to a small subset of my readers, and need not be further elaborated upon in this venue.