Tuesday, November 21, 2006

Lattice Forecast for 2056

Via Cosmic Variance and BioCurious: New Scientist has some well-known scientists forecast where science will be in 50 years.

A lot of the predictions are of the kind that people made 50 years ago for today: AIs more intelligent than people, permanent colonies on other planets, immortality drugs, contact with alien civilisations. They haven't come true in the past 50 years, and (exponential growth laws notwithstanding) I see no reason why they should come true in the next 50 years. The other kind of prediction seems much more likely to come true: detection of gravitational waves, important discoveries at the LHC, significant progress in neuroscience, solutions for all of the Millennium problems, a firm understanding of dark matter and dark energy, a means to grow human organs in vitro, working quantum computers. And of course, just like nobody 50 years ago predicted the internet or the role of mobile phones in today's world, we should really expect that something completely unexpected will become the leading technology in 50 years.

What really irks me, though, is that there is no forecast from a lattice field theorist. After all, lattice QCD has made huge progress over the past decade, but apparently it isn't sexy enough for New Scientist these days. So here I am going to contribute my own 50-year forecast:

Over the next few decades, parallel computing will make huge advances, with machines that make today's TOP500 look feeble by comparison becoming readily affordable even to smaller academic institutions. As a consequence, large-scale simulations using dynamical chiral fermions will become feasible and will once and for all lay to rest any remaining scepticism regarding the reliability of lattice simulation results.

Predictions of "gold-plated" quantities will achieve accuracies of better than 0.1%, outshining the precision of the experimental results. If the limits of the Standard Model are at all accessible, discrepancies between accurate lattice predictions and experimental results in the heavy quark sector will be a very likely mode of discovering these limits, and will hint at what lies beyond. Lattice QCD simulations of nuclear structure and processes will become commonplace, providing a first-principles foundation for nuclear physics and largely replacing the nuclear models used today.

On the theoretical side, the discovery of an exact gauge dual to quantum gravity will allow the study of quantum gravity using Monte Carlo simulations of lattice gauge theory, leading to significant quantitative insights into the earliest moments of the universe and the nature of black holes.

3 comments:

Anonymous said...

Regarding your prediction of huge advances in lattice QCD with the advent of more powerful computers and algorithms, wouldn't it be more practical, for the time being, if there were a project similar to SETI or Einstein@home to harvest the power of distributed computing for lattice QCD simulations until those ever-faster supercomputers arrive?

And btw, Happy anniversary!

Georg said...

Unfortunately, the kinds of computations that we need to do in lattice QCD are to a large extent bounded by the communication bandwidth between computing nodes. This makes them very different from the kinds of tasks that can be run efficiently by cycle-harvesting, which offers very poor bandwidth and hence is only really suitable for so-called "embarrassingly parallel" problems, where you just hand each node a piece of the problem and it happily crunches away at it, reporting the result when it is done. Lattice QCD simulations require communication between the nodes at each computational step, making LQCD@home infeasible.
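To make that communication pattern concrete, here is a minimal C/MPI sketch (not taken from any production lattice code; the field, the local volume and the step count are made-up illustrative values). Each node owns a chunk of a one-dimensional lattice, and every single update of that chunk first requires a "halo" exchange of boundary sites with the neighbouring nodes:

/* Minimal sketch of nearest-neighbour halo exchange with MPI.
 * Illustrative only: real lattice QCD codes work in 4 dimensions
 * with much larger fields, but the pattern is the same. */
#include <mpi.h>
#include <stdio.h>

#define LOCAL_SITES 1024   /* sites owned by this rank (made-up size) */

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);
    int rank, nprocs;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &nprocs);

    /* local field plus one halo site on each side */
    double phi[LOCAL_SITES + 2];
    double phi_new[LOCAL_SITES + 2];
    for (int i = 0; i <= LOCAL_SITES + 1; i++)
        phi[i] = (double)rank;

    int left  = (rank - 1 + nprocs) % nprocs;
    int right = (rank + 1) % nprocs;

    for (int step = 0; step < 100; step++) {
        /* every step: exchange boundary sites with both neighbours */
        MPI_Sendrecv(&phi[LOCAL_SITES], 1, MPI_DOUBLE, right, 0,
                     &phi[0],           1, MPI_DOUBLE, left,  0,
                     MPI_COMM_WORLD, MPI_STATUS_IGNORE);
        MPI_Sendrecv(&phi[1],               1, MPI_DOUBLE, left,  1,
                     &phi[LOCAL_SITES + 1], 1, MPI_DOUBLE, right, 1,
                     MPI_COMM_WORLD, MPI_STATUS_IGNORE);

        /* nearest-neighbour update: needs the freshly exchanged halos */
        for (int i = 1; i <= LOCAL_SITES; i++)
            phi_new[i] = 0.5 * (phi[i - 1] + phi[i + 1]);
        for (int i = 1; i <= LOCAL_SITES; i++)
            phi[i] = phi_new[i];
    }

    if (rank == 0)
        printf("done after 100 halo-exchange steps on %d ranks\n", nprocs);
    MPI_Finalize();
    return 0;
}

On a cluster these exchanges go over a fast interconnect many times per second; over the internet the latency alone would dwarf the compute time, which is why the SETI@home model does not carry over.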

Markk said...

In regard to computation, what is the ultimate limit: CPU, cross-CPU bandwidth, or memory access? Is memory stressed, or is it really in cache these days?

Why is there so little locality? Naively looking at the general equations makes it seem that there should be a fair amount of locality, but obviously I'm missing something.