Blogging ICHEP 2010

A collective forum about the 35th edition of
the International Conference on High Energy Physics (Paris, July 2010)

## Saturday, July 31, 2010

### Meanwhile in the South: CoGeNT dark matter excluded

Parisians say that il n'y a que Paris. This is roughly true; however, ICHEP'10 in Paris was not the only important conference in France last week. At the same time, down south in Montpellier, there was the IDM conference, where a number of results in dark matter searches were presented. One especially interesting result concerns the hunt for light dark matter particles.

Some time ago the CoGeNT experiment noted that the events observed in their detector are consistent with the scattering of dark matter particles of mass 5-10 GeV. Although CoGeNT could not exclude the possibility that these events are background, the dark matter interpretation was tantalizing, because the same dark matter particle could also fit (with a bit of stretching) the DAMA modulation signal and the oxygen band excess from CRESST.

The possibility that dark matter particles could be so light caught experimenters with their trousers down. Most current experiments are designed to achieve the best sensitivity in the 100 GeV - 1 TeV ballpark, because of prejudice (weak scale supersymmetry) and some theoretical arguments (the WIMP miracle). In the low mass region the sensitivity of current techniques rapidly decreases, even though certain theoretical frameworks (e.g. asymmetric dark matter) predict dark matter sitting at a few GeV. For example, experiments with xenon targets detect scintillation (S1) and ionization (S2) signals generated by particles scattering in the detector. Measuring both S1 and S2 ensures very good background rejection; however, the scintillation signal is the main showstopper to lowering the detection threshold. Light dark matter particles can give only a tiny push to the much heavier xenon atoms, and the experiment is able to collect only a few resulting scintillation photons, if any. Besides, the precise number of photons produced at low recoils (described by the notorious Leff parameter) is poorly known, and the subject is currently fiercely debated with knives, guns, and replies-to-comments-on-rebuttals.

It turns out that this debate may soon be obsolete. Peter Sorensen in his talk at IDM argues that xenon experiments can be far more sensitive to light dark matter than previously thought. The idea is to drop the S1 discrimination and use only the ionization signal. This allows one to lower the detection threshold down to ~1 keVr (it's a few times higher with S1) and gain sensitivity to light dark matter. Of course, dropping S1 also increases the background. Nevertheless, thanks to self-shielding, the number of events in the center of the detector (blue triangles on the plot above) is small enough to allow for setting strong limits. Indeed, using just 12.5 days of aged Xenon10 data, a preliminary analysis shows that one can improve on existing limits for the scattering cross section of a light dark matter particle. Most interestingly, the region explaining the CoGeNT signal (within the red boundaries) seems excluded by a wide margin. Hopefully, the bigger and more powerful Xenon100 experiment will soon be able to set even more stringent limits. Unless, of course, they find something...

## Friday, July 30, 2010

### Random collection of final impressions, and a tentative balance

ICHEP is over. After the last plenary session the few remaining braves streamed out of the auditorium, drained by conference fatigue, and headed back home. I must confess, I found a week-long conference, with six full days packed with presentations, pretty long and tiring. I'm not completely surprised that in the last days not many questions came from the (depleted) audience.

Since this is probably my last entry in this blog, I'll entertain you with a random collection of final impressions, and maybe a tentative balance on the blogging experience itself.

The conference itself

A lot has already been said and written, so let's simply put it this way: the conference was excellent. Superb location (Paris is always Paris), excellent venue (I was just astonished the Palais des Congrès doesn't provide wireless microphones in the smaller rooms; everything else was perfect), very efficient organization (thanks!), and an optimal balance of contents. Ok, the catering was less-than-perfect, but why should we indulge in complaining about the little details? :-)

The LHC has entered the game

Again, not big news, but it's good to repeat it: we are beginning to see the first physics results from the LHC experiments! And even if this is not yet exciting new physics, those times are approaching fast: after more than 20 years of preparation, it's a nice sensation for the whole community.

Experiments vs theory

On the low side, I must say that I found the theory contributions in the first part of the conference a bit isolated. This is probably normal in the context of parallel sessions (and there were anyway good phenomenological contributions in the sessions more oriented to experiment), but as an experimentalist I probably missed the opportunity to learn something really new for me. For instance, I learned from Georg that:
the talks in the lattice session had actually been selected to be accessible and of interest also to people outside the lattice community (in particular there were a number of review talks), so it was a bit of a pity that the talks were attended almost exclusively by lattice theorists.
I agree: pity! Maybe this should have been advertised more? The situation was of course different in the second part of the conference, and I really appreciated some of the more theory-oriented talks in the plenary sessions.

"Sliduments" vs nice talks

The quality of the talks was in general rather good, and of course reached its best in the plenary sessions. I had, anyway, the impression that the non-LHC and non-Tevatron speakers gave the best talks in the parallel sessions. I have a theory, at least for the LHC talks. Nowadays we (the LHC experimental physicists) routinely use slides as a support for documenting our daily work. Most of us have taken the (bad!) habit of packing them with all the information we want to record - information that should really go into a written report - sacrificing the graphical quality, and the effectiveness when used as a visual support for an oral presentation, in favor of a hybrid object that the experts in the field call a slidument. Sure, it's possibly easier to present: one can use the text on the slide as a reminder of what to say, maybe even avoiding rehearsing. Well, the quality of this kind of presentation will definitely be worse, it's guaranteed, and - while sliduments may fit a weekly collaboration meeting - they will certainly not meet the standard needed for a conference. Have a look at the slides of some of the presentations in Wednesday's plenary session, for instance the ones on dark matter or cosmology:

Almost no text, just the few words needed to stress the concept, clear figures, no clutter. Sure, the speaker must know what to say on this slide! Now compare for instance with this one (taken from an ATLAS talk, so that nobody can say I'm trying to blame only our competitors):
No excuse, we still have a lot to learn!

Blogging ICHEP 2010

I am still digesting the experience, and in this sense I'd really appreciate getting some feedback from the readers on this. On my side, I can say it has been interesting to blog a conference - it was a first for me - and to do it in a collective blog, with different voices and styles.

Some of the feedback I got tells me that the blog has been appreciated outside, especially by colleagues who were not attending the conference: apparently it helped them feel connected, more than the webcasts and slides alone can do. It might also have helped the journalists reporting the conference to the media: a blog like this can certainly act as a filter, and help the non-physicist grasp what's important, what gets us excited, and why.

This (semi)official blog of the conference was an experiment, and in this respect the organizers wanted to keep a low profile and verify in the field what the reactions would be. It seems to me that, since the community does seem interested in the format, maybe next time something slightly more ambitious could be tried. For instance, with a bit more organization we could have had some video interviews at the conference (someone did that, and did it very well indeed), a dedicated Twitter stream, and especially more visibility at the conference itself. I had in fact the impression that - at least at the beginning of the conference - a large part of the participants had no idea that this project existed at all. And, since the most interesting and useful part of the blogging experience is the conversation with the readers, this could have been even more fun.

Anyway, I would probably do it again, should the occasion arise. See you in two years in Melbourne?

### The ICHEP Effect

I created a tool that watches how many plots DZERO, CDF, ATLAS, and CMS release as a function of time. Here are the results for this year (each little square is a plot):

I’m going to call that bump in July there the ICHEP effect.
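For the curious, the counting itself is trivial once the release dates are in hand; here is a minimal sketch of how such a tool might bin plots by month (the dates and the `Counter`-based binning are my illustrative guess, not the author's actual code):

```python
from collections import Counter
from datetime import date

# Hypothetical plot release dates, as might be scraped from the
# experiments' public results pages (invented for illustration)
release_dates = [
    date(2010, 3, 14), date(2010, 5, 2), date(2010, 7, 18),
    date(2010, 7, 20), date(2010, 7, 21), date(2010, 7, 22),
]

# Bin the plots by month; the July pile-up is the "ICHEP effect"
per_month = Counter(d.strftime("%Y-%m") for d in release_dates)
for month in sorted(per_month):
    print(month, "#" * per_month[month])
```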

## Thursday, July 29, 2010

### The CMS Momentum Scale And Resolution

While the focus of the international conference in high-energy physics in Paris last week has been on the search for new physics and the precise measurement of standard model quantities, I will offer you today something more technical, but in no way less physics-rich; it was presented in Paris, but with the many parallel sessions it may well have gone unnoticed... What I wish to explain to you is the procedure by means of which the CMS experiment calibrates the scale and resolution of its charged particle momentum measurement.

The dull sound of the topic as stated above should not deceive you: this is really exciting, interesting technology, which allows the measurement of physical quantities with high precision. Since the M in CMS stands for "muon", we certainly care about the precise measurement of muons - and muons are the particles used for the calibration procedure.

What happens when a charged particle leaves ionization deposits ("hits") in the silicon tracking system is that we can reconstruct its trajectory, forming a track. The track is curved in the plane transverse to the beam, because the S in "CMS" stands for "solenoid", a big cylinder that provides a B = 3.8 Tesla magnetic field within its volume. If you know what the Lorentz force is, you might also remember the formula P = 0.3 B R, expressing the proportionality between the momentum of a charged particle and its radius of curvature in a magnetic field (with P in GeV, B in Tesla, and R in meters). This demands that within the CMS solenoid a P = 1.14 GeV muon follow a curved trajectory, which resembles a circle of radius R = 1 meter when observed in the plane transverse to the beam axis, the axis along which the solenoid is symmetric. By measuring the curvature, we determine the transverse momentum!
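As a sanity check of the numbers quoted above, the formula is easy to evaluate (a trivial sketch, not CMS code):

```python
def momentum_gev(b_tesla, radius_m):
    """Transverse momentum (in GeV) of a unit-charge track with the
    given radius of curvature in a uniform field: p = 0.3 * B * R."""
    return 0.3 * b_tesla * radius_m

# A track curving with a 1 m radius in the 3.8 T CMS solenoid
# carries roughly 1.14 GeV of transverse momentum, as in the text:
print(momentum_gev(3.8, 1.0))
```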

Things are always complicated if you want perfection. We of course can measure the position of the silicon hits with extreme accuracy, but alignment and positioning errors may create imperfections in the measurement of the track curvature. We also know the magnetic field with high accuracy, through Hall probes and other means, but imprecisions will affect the momentum measurement. Finally, the amount of material of which the tracking detector is composed affects the trajectory, producing further imprecisions if our map of the material is not perfect.

In the end, all the effects and all the details of the geometry of our detector are encoded in a carefully crafted simulation. With the simulation we can figure out what a 1-GeV track would look like, given our reconstruction and our assumptions about geometry, material, and magnetic field. But we need real data to verify that our model is correct, and to tune it in case it is not!

Real data: we now have it. CMS uses resonance decays to opposite-charge particles for this business: they are easy to identify, have little background, and there are plenty to play with. In particular, we use J/Psi meson decays to muon pairs for some of the checks of the momentum scale and resolution. Other dimuon resonances are also used -there is a large amount of such decays already available in the data so far collected- but here I will only discuss what CMS did with its J/Psi signal.

The dimuon mass spectrum in the vicinity of the nominal J/Psi mass value is shown in the picture below. A large number of signal events is observed. These events can be used to calibrate the momentum scale.

If one looks closely, one observes that the measured mass is very slightly lower than the nominal 3.097 GeV. This is already evidence for a very small underestimation of the momentum scale. To dig further, a simple thing one can do is to divide the J/Psi events depending on the value of the muons' reconstructed momentum or rapidity, measuring the mass in all sub-samples to check whether there is a bias in particular kinematical regions. The bias, of course, would arise from the momentum reconstruction of the individual muons; but if one only measures the mass, which is a quantity constructed from the measurements of two muons, surely only an "average" bias can be detected, right?

Wrong. Each muon from the decay of each J/Psi has a different momentum, travels through different parts of the detector, and is subjected to different reconstruction biases: we can turn these differences to our advantage. What we can do is to assume we know the functional form of these biases, and plug them into a likelihood function.

A further benefit with respect to methods I have seen in the past for the correction of scale biases is that a well-written likelihood function is also capable of extracting the momentum resolution from the same set of data. One just needs to produce a functional form (whose exact shape is suggested by simulation studies) that describes how the resolution on the momentum depends on the track kinematics; then, the likelihood fit will take care of finding the best parameters of the resolution function as well, by comparing the expected lineshape of the resonance with the mass value measured for each particle decay.

The likelihood is very complicated, because it accounts for the dependence of the mass on the muon momenta and the resolutions, and momenta and resolution in turn are functional forms of bias parameters. I know very well the code of this likelihood function, and I can tell you it is not for everybody! So I will abstain for once from finding a suitable analogy, lest I squeeze my brains for the rest of the evening. Let me just say that in the end, the likelihood maximization produces the most likely value of the parameters describing the bias functions, allowing a correction of the bias in the track momentum measurement!
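To make the idea concrete, here is a toy version of such a fit, with everything the real CMS likelihood contains stripped away except the core mechanism: simulated dimuon masses carry a momentum scale bias and a finite resolution, and an unbinned maximum-likelihood fit with a simple Gaussian lineshape (the real fit uses more realistic lineshapes and kinematics-dependent bias functions) recovers both parameters. All numbers here are invented for illustration:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(0)
M_JPSI = 3.097  # nominal J/Psi mass, GeV

# Toy data: a 0.5% downward momentum scale bias and a 30 MeV resolution
true_scale, true_sigma = 0.995, 0.030
masses = true_scale * M_JPSI + true_sigma * rng.standard_normal(5000)

def nll(params):
    """Negative log-likelihood of a Gaussian lineshape centred on the
    biased nominal mass, with resolution as a free parameter."""
    scale, sigma = params
    return -np.sum(norm.logpdf(masses, loc=scale * M_JPSI, scale=sigma))

fit = minimize(nll, x0=[1.0, 0.05], bounds=[(0.9, 1.1), (1e-3, 0.2)])
scale_hat, sigma_hat = fit.x
print(f"scale = {scale_hat:.4f}, resolution = {sigma_hat*1000:.1f} MeV")
```

The maximization (here a minimization of the negative log-likelihood) returns both the scale bias and the resolution from the same sample, which is the benefit described above.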

Maybe it is best to show a couple of figures. The first one below shows the average mass of the J/Psi meson as a function of the pseudorapidity of the muons from its decay. The hatched red line shows the true value of the J/Psi mass; but more meaningful are the crosses, which show what should be measured with a perfect detector, given the fitting procedure (which, I am bound to specify, assumes that the lineshape follows a Crystal Ball form). The crosses are our "target": if we measure a mass in agreement with them, given our fitting procedure to extract the mass, our momentum scale is perfect.

In blue you can see that the mass, before corrections, is biased low, especially at high rapidity. Instead, after the likelihood maximization and the correction procedure, we obtain the purple crosses. The agreement with the black crosses is still not perfect, and the statistics is too poor to detect further small deviations, but the demonstration of the validity of the procedure is clear!

And then, the resolution. This is also a function of rapidity in CMS, due to the way the detector is built and the decay geometry. The figure below shows what resolution we expected to measure as a function of rapidity, from simulated J/Psi decays (in black), given the measurement method.

In red the figure also shows what the true resolution is, from simulated muons that are then compared to reconstructed ones. In blue, the band shows what instead CMS measured. The agreement between data and simulation is encouraging, and the result demonstrates the validity of the method. This functional form and its parameters are extracted from the way the reconstructed masses of J/Psi decays distribute around the nominal mass, accounting for the fact that muons in those events have different rapidity: the likelihood knows all the details, and produces a very complete answer to our question.

I think the method is very powerful and I cannot wait to see it applied to all resonances together, with more data -the different dimuon resonances have different kinematics and produce muons of widely varied momenta, allowing a very complete picture of the calibration and resolution of the CMS detector!

## Wednesday, July 28, 2010

### Final days, brief summary

Tuesday was the day of the quarks -- almost every talk was about QCD or flavour physics.

Unfortunately, the lattice QCD talk perhaps did not make the best of the opportunity to convince a larger HEP audience of the importance of lattice results, since Yoshinobu Kuramashi chose to pass over the many important contributions that lattice QCD can make, and has made, towards flavour physics, and concentrated on the derivation of nuclear properties from lattice QCD instead. This is clearly a very important topic, and replacing phenomenological models with true first-principles predictions from QCD will have an enormous impact on nuclear physics; the present audience of experimental high-energy particle physicists and Beyond-the-Standard-Model theorists might have been more excited to hear about lattice determinations of the decay constants and semileptonic form factors of heavy mesons, however.

The decays of B and Bs mesons are also the field in which the most exciting discrepancies between experimental results and Standard Model predictions keep appearing. While the Ds discrepancy of last year has disappeared in the meantime, there is now some tension in B meson decays, the most significant instance of which (at 3.2σ) is the like-sign dimuon charge asymmetry measured at the Tevatron. However, combining all results from D0 and CDF gives a discrepancy of less than 2σ when compared with the Standard Model, while all individual results are mutually compatible within errors. So perhaps this is again a fluctuation, or else something really weird is going on here. Phenomenologists may have a clearer picture of what that could be; see the post by Jester.
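For readers unfamiliar with how such combinations dilute a single striking result, the standard recipe for independent measurements is an inverse-variance weighted average; a minimal sketch (the input values below are invented for illustration, not the actual D0 and CDF numbers):

```python
def combine(values, errors):
    """Inverse-variance weighted average of independent measurements,
    returning the combined value and its uncertainty."""
    weights = [1.0 / e**2 for e in errors]
    total = sum(weights)
    mean = sum(w * v for w, v in zip(weights, values)) / total
    return mean, total ** -0.5

# Two made-up asymmetry measurements (arbitrary units): one shows a
# large effect, the other is consistent with zero. The combination
# sits in between, with a reduced overall significance.
value, error = combine([-0.70, 0.30], [0.30, 0.40])
print(f"combined: {value:.3f} +- {error:.3f}")
print(f"tension with zero: {abs(value) / error:.1f} sigma")
```

The weighted average pulls the outlier toward the less dramatic result, which is how a 3σ individual measurement can become a sub-2σ combination.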

Today (i.e. Wednesday) started out with a session on neutrinos. Neutrinos have their own version of flavour physics -- neutrino oscillations. The MNS matrix, which is the leptonic analogue of the CKM matrix, is a lot less well-known than its hadronic cousin, however. In particular, it is not clear whether the mixing angle θ13 describing the mixing between the first and third generations is non-zero, although recent results indicate that it likely is. The flavour structure of the neutrino sector might be much richer than the quark one, though, since the existence of sterile neutrinos (i.e. neutrinos not partnered with a charged lepton via the weak interactions) cannot be ruled out at present.

The neutrinos detected at great effort by huge detectors can come from a variety of sources, some of which got their own talks: Alain Bellerive talked about solar neutrinos (i.e. the ones created by nuclear fusion in the Sun), where the Sudbury Neutrino Observatory (SNO) is now beginning to achieve precision measurements, which favour a scenario of large neutrino mixing angles. Tsuyoshi Nakaya presented long-baseline accelerator neutrino experiments (where neutrino beams are created at an accelerator from the leptonic decays of a beam of charged particles), among which OPERA has recently reported the first candidate for a ντ oscillation. Fabrice Piquemal spoke about reactor-based experiments (where the neutrinos come from the reactions in nuclear reactors) and single and double β decays; the tail of the energy distribution of the electrons produced in β decays can impose an upper limit on the mass of the electron neutrino (the limit of 2.3 eV does not seem terribly competitive with other limits, however), and the observation of neutrinoless (or "0ν") double β decay would establish that neutrinos are Majorana particles, identical to their antiparticles.

Today's second session was about the links between particle physics and cosmology: dark matter is one of the big "known unknowns" in our present understanding of the universe. What is known is that it cannot be normal (hadronic) matter, because of limits imposed by Big Bang nucleosynthesis, and that it cannot be neutrinos either. The best candidate would be a WIMP (i.e. a Weakly Interacting Massive Particle), such as the lightest stable superpartner of a Standard Model particle if SUSY exists in nature. The discovery of such a particle at the LHC would thus have significant impact on cosmology as well.

### Trouble with Flavor

Flavor physics gives me a headache. Unfortunately, this sub-field of particle physics is where new particles and interactions are very likely to show up, so it's essential to follow all hints from latest observations. Yesterday at ICHEP Gino Isidori gave a nice theoretical summary of where we stand.

Overall, the standard model explains frustratingly well the multitude of observed transitions between quarks and leptons of different generations. If we extend the standard model with generic non-renormalizable 4-fermion operators, their coefficients are extremely constrained by experiment. The scale suppressing certain flavor-violating operators has to be at least 100 TeV in the bs sector, at least 1000 TeV in the bd sector, and an astounding $10^5$ TeV in the ds (kaon) sector. This means that if any new particles exist at the TeV scale, they had better be very careful not to destroy the approximate flavor symmetries of the standard model, as otherwise they would generate effective 4-fermion operators with large coefficients.

That probably means that at the TeV scale there are no new particles beyond those of the standard model. There is still some hope, however, that the above is not true, and this forlorn hope is fueled by three results that are currently in tension with the standard model predictions. One is the widely discussed D0 measurement of the same-sign dimuon asymmetry, which points to new contributions to $B_s-\bar B_s$ meson mixing at the 3.2 sigma level. The two other, less widely known discrepancies are:
• Various determinations of the beta angle in the unitarity triangle (determined most precisely from $B_d \to J/\psi K_S$ decays, and from the $\epsilon_K$ parameter in kaon mixing) do not agree very well. The current discrepancy is around 2.5 sigma.
• The branching fraction of the $B \to \tau \nu$ decay measured by BaBar and Belle is currently two times larger than the standard model prediction. Given the errors, the current discrepancy with the standard model is around 3 sigma.
These are most likely flukes, unavoidable among such a huge number of measurements, but it won't hurt to keep an eye on those.
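The "unavoidable flukes" argument is easy to quantify: with enough independent measurements, the chance that at least one fluctuates to the 3 sigma level by luck alone becomes large. A back-of-the-envelope sketch (the count of 300 observables is an illustrative guess):

```python
from math import erf, sqrt

def p_two_sided(n_sigma):
    """Two-sided tail probability of a Gaussian beyond n_sigma."""
    return 1.0 - erf(n_sigma / sqrt(2.0))

def p_at_least_one(n_sigma, n_measurements):
    """Chance that at least one of N independent measurements
    fluctuates beyond n_sigma purely by chance."""
    return 1.0 - (1.0 - p_two_sided(n_sigma)) ** n_measurements

# With a few hundred independent flavour observables, a ~3 sigma
# outlier or two is hardly surprising (the probability is over 50%):
print(p_at_least_one(3.0, 300))
```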

Theorists have put forward several models that may fit all up-to-date flavor observables and explain the existing anomalies. For Gino, the favorite is the 2-Higgs doublet model. In this scenario, the Higgs sector contains four additional scalar particles that can mediate flavor-violating transitions. The point is that they do so in a very respectful way, with suppression by small CKM angles and by small quark masses, so that they do not produce excessively large effects even when the new Higgs particles are at the TeV scale. The quark mass suppression leads to the desired pattern where the smallest new contributions come in the kaon sector, while the largest occur in the Bs meson sector. This scenario also predicts new contributions to the $B_s \to \mu \mu$ decay and to the neutron electric dipole moment at the level of the current sensitivity, so fresh tests of this idea are soon to come.

For more details, see the ICHEP'10 slides. As a bonus, a very accurate rendering of theorists waiting for hints of new physics in flavor physics.

### How can there be no questions?

I'm sitting here, on the last day of ICHEP, listening to some excellent summary talks. One of the experimental neutrino talks just went by - excellent (sorry, no link, because I don't have internet as I write this). But there were no questions. I've noticed this for many of the talks at the plenary. Or if there is a question, it is of the "if you doubled your dataset, what would happen to that 2 sigma excess?" variety.

How can this be? There are almost 1000 people watching these talks, many of them experts in the topics being discussed. And no questions? Physics is built on questions! This is how we learn – almost never is something so clear that we don’t need questions! Heck, the whole field is structured around this. We invite experts to our University to give a seminar so we can have their undivided attention for a whole day to ask them questions. We go to workshops so we can ask each other questions and learn. We write papers and then respond to the papers with letters and more papers which are basically a slow version of Q&A. So what is going on here?

I don’t know, of course, however I’m going to take a stab:

• Exhaustion. It is the end of a full week of conference. We've all been talking, discussing, and thinking about these topics for a solid week, and we need a break for our brains to process everything.
• Embarrassment. There are 1000 people here, many of them very important people in the field (i.e. if you ask a bad question, they may think you are stupid, and your chances of funding go down). So you'd better make sure you have a good question before you ask it.
• These are summary talks. The speaker is often presenting a huge amount of information in a very short amount of time. They are likely an expert in the topic they are presenting, but given the volume of information discussed in such a short time, it is not likely they know all the details. The people who are experts on each topic gave talks in the parallel sessions. Indeed, there were lots more questions there than in the plenary sessions.

I’m sure there are other possibilities as well.

## Tuesday, July 27, 2010

### Is there?

I just liked this...
For better pictures see Marco's photostream!

### Back to the future

It's Tuesday and I'm exhausted! I've only just found some time to sit down and think about the future session that happened way in the past (on Saturday - pre-Sarkozy, pre-press conference, pre-dinner - gee, was it only three days ago?!). Anyway, with the network problems already reported by Gordon, the dinner already covered by Katie, and hundreds of reports in media around the world, I hope I am excused. Also, sitting in the press booth isn't exactly supportive of concentrated work, as many people pass by for a chat. I guess I have to educate them to pass by for a chat AND bring a coffee. Since you ask: no sugar but a lot of milk please, thanks!

But now, here goes! Saturday saw a big overview of all the exciting and less exciting future projects for the world of particle physics. I leave it up to your judgement to decide which is which; comments are welcome! I am also arranging them by my brain's personal sorting system and will happily accept comments and corrections - the list is probably not complete.

Particle physics is a global field. You just have to look around the room to notice that people come from all over the place. The big machines that we work on these days are challenging and cost a lot of money, so that no one country could afford to build and host them - all countries have to chip in and work together. The more challenging the technologies become, the more this is the case, and it also takes many years for a machine to evolve from idea to design study to running accelerator. Consequently it may seem strange to the outside world that, while we've only just switched on the LHC and are waiting for discoveries, we are already planning the next generation - but we have to have a variety of options in the drawer that will enable us to make the best choice when the results are there. And then of course there are physics topics that aren't covered by the LHC!

There are several ideas for 'LHC follow-ups'. Two different varieties of LHC upgrades exist - one for more luminosity, and thus higher statistics and safer discoveries, and one for higher energies. Whereas the luminosity upgrade is virtually around the corner, the energy upgrade is an option that's far in the future (around 2030, according to Roger Bailey's talk from Saturday). The discoveries at the LHC will probably dictate whether the higher energies are more interesting, or whether the LHC could be transformed into an electron-proton machine, a 'superHERA', although it went by the name of LHeC in the session, with the e for electron. LHeC would collide the LHC's protons with electrons from a linear collider - an intriguing thought for someone working on the ILC! My imagination already went off into dreamscapes where LHeC and the linear collider would run together on different physics programmes, as the best possible synergy of machines we have yet to see (and believe me, physicists are great at creating synergies and reusing existing machines!). I guess I'll have to talk to a few proper scientists to check whether this is imagination running wild or whether it's actually possible.

Certainly possible, and the most likely next big project, is a linear collider for electron-positron collisions. It'll complement the LHC, and it's only a question of LHC discoveries, again, whether the ILC or CLIC is the machine of choice. While the ILC is basically ready to be built - the Technical Design Report, which means "Here's how we would build it", is due in 2012 - CLIC is a few years behind, with its Conceptual Design Report due next year. When you consider that first collisions from a linear collider could be expected in the 2020s, you start to understand why there is plenty of planning, designing and testing going on around the world!

Then there are b factories, machines that would complement and extend anything that the b physics experiments around the world, like the LHCb experiment at the LHC, find. One is proposed in Italy, and KEK in Japan has just started reconfiguring its KEKB accelerator into - can you guess its name? - SuperKEKB. Funding isn't final, but they are planning to get 40 times more luminosity. The Italian b factory would also be a light source, and it shares its multifunctionality with Fermilab's 'Project X', which can contribute to the ILC, to a possible muon collider and ultimately a possible neutrino factory. Fermilab is busy working on a plan for a muon accelerator program (i.e. a MAP), and muon colliders, though technologically still a big challenge, are also a big topic for machine physicists. Probably something called 'dielectric acceleration' is, too, but I couldn't tell you, as I didn't understand the talk, sorry... but when asked whether there is a plan for a beam delivery system, the speaker laughed and said that he'd like to be asked this question again in 10-15 years -- so I conclude it's not something that will pop up in the next months.

What I missed in almost all of the talks were good, catchy, convincing arguments for why these proposed machines are needed. I am sure there are solid physics cases for all of them, but surely it can't harm to state them again, clearly, understandably, in talks like these that will live on for a while? I'll go hunting for them for a future story in NewsLine, but first I'll go hunt particles in the Grand Rex, at the nuit des particules -- see you there!

### A Spectroscopist's Delight!

While everybody is busy discussing the latest Tevatron results on the Higgs boson searches - is that the light-mass excess the internet was abuzz about, is it consistent with a signal as we expected it, how long will it take to confirm it is not a fluke, etcetera, etcetera, etcetera - I think I have a different plot with which to enthuse you.

If you do not like the figure below, courtesy of the CMS Collaboration 2010, you are kindly requested to leave this blog and spend your time reading something other than fundamental physics. I do not know what will ever make you believe particle physics is beautiful, if not what is shown here.

The figure shows, using a logarithmic scale on both axes, the reconstructed mass of pairs of opposite-charge muon candidates, collected by CMS in its first 280 inverse nanobarns of 7-TeV proton-proton collisions, up until a week ago. Nothing fancy has been done to prettify this graph: these are honest-to-god muon pairs, as Nature (the bitch, not the magazine) has produced them in the core of CMS. True, the intercession of a detector and of reconstruction software was needed to go from ionization clouds to event counts; but this is the absolute minimum of manipulation you can ever expect from particle signals.

Now, what should enthuse you about the graph is the following. The distribution reveals, more clearly than a million words could describe, the structure of all the most important bound states decaying by electroweak interactions into pairs of muons which we can produce in hadron collisions. We immediately spot the Z boson on the far right, and the towering peak of J/Psi mesons; but we also see the Upsilon mesons and, at lower energy, the lighter resonance decays of rho, omega, and phi mesons. What a spectroscopist's delight! This figure is tremendously informative! If we sent it to outer space, without labels or units, no intelligent race could ever mistake its meaning!
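For readers who want to play along at home, the quantity on the horizontal axis is just the relativistic invariant mass of the two muons. Here is a minimal sketch in Python of how it is computed from each muon's transverse momentum, pseudorapidity and azimuthal angle; the kinematic values in the example are illustrative, not CMS data:

```python
import math

MUON_MASS = 0.10566  # GeV

def four_momentum(pt, eta, phi, mass=MUON_MASS):
    """Build (E, px, py, pz) in GeV from collider kinematic variables."""
    px = pt * math.cos(phi)
    py = pt * math.sin(phi)
    pz = pt * math.sinh(eta)
    e = math.sqrt(px**2 + py**2 + pz**2 + mass**2)
    return e, px, py, pz

def invariant_mass(p1, p2):
    """m^2 = (E1 + E2)^2 - |p1 + p2|^2."""
    e = p1[0] + p2[0]
    px = p1[1] + p2[1]
    py = p1[2] + p2[2]
    pz = p1[3] + p2[3]
    return math.sqrt(max(e**2 - px**2 - py**2 - pz**2, 0.0))

# Two back-to-back 45.6 GeV muons reconstruct to roughly the Z mass:
mu_plus = four_momentum(45.6, 0.0, 0.0)
mu_minus = four_momentum(45.6, 0.0, math.pi)
print(invariant_mass(mu_plus, mu_minus))  # ~91.2 GeV
```

Fill a histogram of this quantity for every opposite-charge muon pair in the data and the peaks of the figure appear on their own.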

You also notice that these jewels stand atop a background of unidentified muon pairs. Muons can be produced singly by the weak decay of kaons and pions, for instance, or of more massive states containing bottom and charm quarks. Occasionally, pairs of muons of opposite charge can emerge that do not share the same parent: the frequent production of these uncorrelated pairs creates the significant backgrounds you see in the picture. Note, however, how these backgrounds die out at large dimuon masses: the Z boson is basically background-free, a fact I noted in my previous posting here.

As these pages testify, CMS and ATLAS have presented scores of interesting physics results at ICHEP this week. None of them were groundbreaking; a few were significant advances, though, and many others were simply meant to demonstrate that the experiments are ready for the big challenges: discovering new physics, finding the Higgs, measuring the top mass better than the Tevatron, etcetera. The presented results took about a hundred man-years to produce, and I have a lot of respect for them - not to mention the fact that I did my little bit to contribute. But it is my humble opinion that the graph shown above could well be the one to single out and pin to the bulletin board of every university and institute participating in the LHC experiments!

### Paris est la France, et la France est Paris

If you have lived in France (but not in Paris) for at least a little while, you have certainly learned the hard way that there's no France outside Paris. Or, as the inhabitants of the City of Light prefer to put it, Paris est la France, et la France est Paris.

While there are certainly good reasons to identify the French grandeur with the capital city, there is certainly more to France than just Paris. And I'm not only thinking about the variety of food, which we experienced yesterday evening at the conference dinner, where the excellent food specialties of five different French regions were served. Or rather, which we should have experienced, and which were supposed to be served: in reality the flow of food was definitely not enough to satisfy the appetite of the disappointed conference participants. But that is another story.

This valuable variety (of fine food, nice places, and other excellent things) is rarely recognized by Parisians - that's a fact - but it was rather unfortunate to discover that this point of view seems to be shared by the French President himself. As you might have heard, Nicolas Sarkozy attended the ICHEP conference yesterday to give an "opening" talk (well, the conference had already been open for four days, but still). Now, I really don't want to indulge in commenting on the speech, nor do I dare discuss the subtleties of French research politics. But I cannot refrain from at least noting (at a very superficial level indeed) that both in the announcement of the President's participation, and especially in the President's speech, French excellence in high energy physics seemed to be contained within a circle of 30 km radius centered on the Tour Eiffel.

I can tell you, this has definitely not pleased those ATLAS colleagues of mine working, for instance, in Annecy or Marseille, and I guess the other French physicists working in Grenoble, Clermont-Ferrand, or Strasbourg can't be that happy either. Oh, yes, the press office of the Elysée has finally changed the initial announcement, and now those labs are cited along with the Paris universities and research centers as "universités de province", but I'm still not sure they are at ease with the classification. And, despite the last-minute change on the web site, the President's speech itself was still confined to the "région parisienne"! Are the Paris people really so distracted? I would tend to doubt it, but again, this would lead me to speculate about current French research politics, and I'm certainly not qualified for that. A pity, anyway.

### Physicists gone wild

Most (perhaps all?) physics conferences share the conference dinner tradition. One evening during the conference, all participants are invited to get together to share a meal, usually featuring cuisine from the local area. Last night was the ICHEP banquet (the fancier term ensured that quite a few attendees came in fancier attire than they wore to the sessions). The banquet took place at the Grande Galerie de l'Evolution, and while the gallery, its stuffed and mounted wild animals, and the line to get in and check our laptop bags were certainly grand, the food and drink were another story. The crowds surrounding the tables holding tapas-style food from various areas of France wouldn't have looked out of place on a rugby field.

(Here's a photo of physicists in the wild, during the calm before the food-induced storm.)

But the point of conference dinners isn't really the food; it's talking to old and new colleagues, and that went off without a hitch. The food situation also provided a perfect ice-breaker for conversation, and I heard many a story about banquet disasters from conferences past.

Tonight will see a gathering of a very different sort in Paris: the Nuit des Particules, a particle-physics-themed evening for the public. The event at the Grand Rex theater in Paris starts off with a public lecture by scientist Michel Davier (in French), at which French actress Irene Jacob will also be present, and ends with a showing of the science fiction film Sunshine (in English).

### Day 4: the Higgs is not there. Yet.

But we also know that, if it exists, it is not in the 158-175 GeV mass range. The saga of the Tevatron Higgs talks finally came to an end, certainly matching expectations, at least as far as the show was concerned. As for the scientific part, after all the preliminary steps of the last few days nobody was really expecting any hint of a signal anymore. We mortals might not be able to make fancy statistical combinations by eye, but it was still not too complicated to anticipate that, since there was no excess in any channel or single-detector combination, we should not have expected any dramatic announcement.

I found Ben's talk really excellent: the slides (that, by the way, were kept secret until the very last moment) were very well prepared, and Ben proved to be an excellent presenter. In a sense he was even rather humble, especially if you think about the fuss about a possible signal before the conference (at slide 46: "I'm sorry, this 2 sigma excess is the closest we have to a discovery"!).
Still - but maybe it's just me - I had the feeling that the presentation contained a subtle innuendo. Of course Ben did not dare to say anything that was not scientifically backed, but take for instance slide 36. Ben gently dropped this CDF plot:
and casually commented: "People keep asking how the exclusion plot would look if there existed a Higgs boson. We did the exercise of injecting a 115 GeV SM Higgs boson signal in several channels, and that's what we got". And, guess what, the result is that the exclusion curve jumps up, as if it had a 1 sigma fluctuation over a rather large mass range. Now, doesn't this jump remind you of a feature we saw in another curve? There is a region where the unspeakable dreams and hopes of many live, between a green and a yellow band.

### Monday, brief summary

Since I was having trouble with the WLAN yesterday, I couldn't blog the first plenary session. My colleagues have already written in more detail about the most interesting results presented, so I'll just give a brief summary of my impressions: The LHC experiments are well on their way towards "rediscovering" the Standard Model (they have the low-mass resonances, the J/ψ, Υ, W and Z, and several top candidate events). Nicolas Sarkozy is a politician; it is of course a great honour to have him officially open the conference (in particular when he calls the participants "the hope of the planet"), but it might be unwise to put too much faith in his words regarding research funding. The Tevatron has excluded the Higgs mass range of 158 to 175 GeV and is beginning to also exclude a low-mass Higgs (although the LEP bounds are still stronger there). The rumour about a possible discovery of the Higgs at the Tevatron was just a rumour; the 2σ fluctuation observed is not significant.

A significant discrepancy between theory and experiment was, however, observed at the banquet: theory predicted a stable circulating beam of physicists scattering elastically off several targets representing the culinary traditions of the different French regions. Experimentally, a lot of pile-up events were observed, leading to early depletion of the targets and a failure to achieve saturation for many participants ... clearly, some more effort needs to go into the modelling of such high-density situations in culinary physics.

## Monday, July 26, 2010

### The present and future of the LHC

Today's been a hectic day: between Sarkozy's speech, the press conference, and the lack of wireless internet in the auditorium where the plenary talks are being held, there's been little time to blog. So any time I've been able to grab during coffee breaks has been spent uploading blog entries to my usual haunt, symmetry breaking. There you can read about the 10-year plan for LHC running and whether or not it might affect some Tevatron scientists' hopes to run their accelerator for a further three years. Plus my take on the new measurements reported by the LHC experiments at ICHEP, such as the W cross section measurements at 7 TeV, and the limits ATLAS and CMS have placed on some exotic physics. (Keep in mind that symmetry is for a more general audience; my fellow ICHEP bloggers have covered those results in much more detail here.)

### A fun way to browse ICHEP talks

One of my hobbies is finding better ways to visualize information with modern computers. Those of you who have followed me know about my efforts with DeepZoom. So, ICHEP is being rendered this way as we speak.

If you click on the link above, your browser will load up a very large image using Silverlight (which you'll need to have installed: Windows and Mac computers only, I'm afraid). You'll see about 400 talks all displayed at once. You can then use your mouse and mouse wheel to zoom in to see a particular talk. Your browser should load up the bits of the talk you are looking at. You can click on a slide to bring it full screen and use the back and forward arrow keys to move around.

BUT: you need a decent computer, decent graphics card, and, above all, a decent internet connection. In short, no one at ICHEP can look at this display because the internet connection is not robust enough (I’d add a picture here but I’m at ICHEP and can’t load it up right now!).

The display is updated about once every 4 hours, so as the plenary talks continue they should slowly show up there in that display.

Enjoy!

### Higgs still at large

Finally came the moment we all waited for at this conference:
The Tevatron now excludes the standard model Higgs for masses between 158 and 175 GeV. The exclusion window has widened considerably since the last combination. Together with the input from direct Higgs searches at LEP and from electroweak precision observables, it means that the Higgs is most likely hiding somewhere between 115 and 155 GeV (assuming the Higgs exists and has standard model properties). We'll get you, bastard, sooner or later.

One interesting detail: the Tevatron can now exclude a very light standard model Higgs, below 110 GeV. Just in case LEP missed it ;-) Hopefully, the Tevatron will soon start tightening the window from the low-mass side.

Another potentially interesting detail: there is some excess of events in the $b \bar b$ channel where a light Higgs could possibly show up. The distribution of the s/b likelihood variable (which is some inexplicably complicated function that mortals cannot interpret) has 5 events in one of the higher s/b bins, whereas only 0.8 are expected. This cannot be readily interpreted as the standard model Higgs signal, as then one would also expect events at higher s/b, where there are none. Most likely the excess is a fluke, or maybe a problem with background modeling. But it could also be an indication that something weird is going on that does not fit the standard model Higgs paradigm. Maybe upcoming Tevatron publications will provide more information.
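How unlikely is 5 observed versus 0.8 expected, taken at face value? A back-of-the-envelope sketch, assuming simple Poisson counting and ignoring both the uncertainty on the 0.8-event background estimate and any look-elsewhere effect:

```python
from math import exp, factorial

def poisson_tail(k_obs, mu):
    """P(N >= k_obs) for a Poisson-distributed count with mean mu."""
    return 1.0 - sum(mu**k * exp(-mu) / factorial(k) for k in range(k_obs))

# Probability of seeing 5 or more events when 0.8 are expected:
p = poisson_tail(5, 0.8)
print(p)  # ~0.0014
```

A tail probability of about 0.1% sounds dramatic, but with many bins and channels inspected such a number greatly overstates the real significance, which is why "most likely a fluke" is a fair reading.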

### A Big Day

I was going to post this before anything started this morning, however the ICHEP committee decided not to purchase wifi in the main conference hall, so it took until now for me to have time to post this….

Today is the first day of plenary talks at ICHEP. Up to now it has all been parallel sessions – and there have been a lot of talks. Now everything collapses into one room. And it is a very big room. I don't think I've ever been in a room this big for a physics conference before.

Several things happen today – but two I’m particularly interested in seeing are the final combination of the Tevatron Higgs searches and a speech by the president of France, Sarkozy.

Sarkozy's speech was much better than I was expecting! He and his speech writers had worked together to craft a quite good message. One part was a vigorous defense of why it is a good idea to invest in science now even though there is a financial crisis on. Basically, his message boiled down to: a country cannot ignore the future for the crisis of the present – there is always a crisis of the present. A message I wish more of the legislators in the state of Washington would get. The second half of his message was the details of the investments that France is making. You can't always trust the numbers that politicians give you in settings like this, but one stuck out: a reinvestment in Saclay – one billion euros over the next 10 years.

The second thing I'm looking forward to is the combined results from the Tevatron. You can guess what will happen when you look at the CDF and DZERO Higgs talks (which I'd link to if there was any wireless at all here!). Both experiments are basically excluding the region around 160 GeV on their own, and both have a downward fluctuation at low mass.

Update from the Higgs talk:

Ben’s fashion choice: Tour de France yellow tee shirt along with a suit jacket. Nice!

High mass region: from 158 – 175 GeV is now excluded (about x4 bigger than before).

Low mass region: Starting to exclude the really low mass region as well now. The expected limit is around 1.2 or 1.4 times the SM cross section or so. But the exclusion right now looks like a downward fluctuation, so it will be very interesting to watch the next update 6 months from now. The expected limit is getting quite close to the SM cross section! Sweet!

### Sunday impressions

Sunday was the day off between the parallel and plenary days. I used the free time to walk into Paris from my hotel in Suresnes (a very nice walk through the Bois de Boulogne) and through the city before spending the afternoon in the Louvre (which is of course way too big to explore even in a whole day, so I just took a look at some of the Oriental Antiquities and the Dutch and Italian paintings -- there is so much more beauty there than "La Gioconda", to whom everyone rushes without looking around them).

I was quite impressed by the diversity and level of the street music in Paris once again: at the Pont Neuf there was a man playing Handel's Music for the Royal Fireworks ... on a steel drum! Not exactly an orthodox instrumentation, but listening I thought il bravo Sassone would have been delighted by the adaptation. And just a little further, in the Châtelet métro station, there was a string septet playing Mozart.

Now for Monday and the presidential speech.

### Electroweak Signals From CMS

The ICHEP parallel sessions are over, and it is time for a summary of results. Of course if you are in Paris you will get it from the summary talks, but if you prefer some armchair, remote attendance of the conference, I have collected for you a few meaningful plots.

Here I wish to assemble some of the electroweak physics results produced by CMS in time for ICHEP. The CMS experiment has shown results using up to 280 inverse nanobarns of proton-proton collisions, but for the electroweak measurements - those involving W and Z signals, to be clear - the statistics used amount to up to 200 inverse nanobarns of well-understood data.

It is exciting, and quite pleasing, to see how quickly the results have been produced. The speed at which a collaboration goes from raw data on tape to plots for conferences is, in my opinion, quite an important indicator of the collaboration's confidence in the whole chain - detector, analysis tools, internal scrutiny. And CMS appears to pass this evaluation with flying colours!

So, W bosons are readily produced in proton-proton collisions, as is clear in the distributions below. These show the transverse mass of muon-neutrino systems, in events where a high-momentum muon has been detected, and where the calorimeter is used to measure the imbalance in the energy flow due to the escaped neutrino.

From a comparison of the left and the right panel you can see that the LHC produces more positive W bosons than negative ones (the W contribution is the yellow histogram in both panels, turning on above 40 GeV of transverse mass). Violation of some basic symmetry rule? No, simply the result of the initial state containing more positive-charge quarks than negative-charge ones!
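For reference, the transverse mass plotted in these W distributions is built only from quantities measured in the plane transverse to the beam, since the neutrino's longitudinal momentum escapes unmeasured. A minimal sketch, with illustrative values rather than CMS data:

```python
import math

def transverse_mass(pt_mu, phi_mu, met, phi_met):
    """mT = sqrt(2 * pT(mu) * MET * (1 - cos(dphi))), momenta in GeV.

    pt_mu/phi_mu: muon transverse momentum and azimuth;
    met/phi_met: missing transverse energy (the neutrino proxy) and its azimuth.
    """
    dphi = phi_mu - phi_met
    return math.sqrt(2.0 * pt_mu * met * (1.0 - math.cos(dphi)))

# A typical W -> mu nu topology: muon and missing energy back to back,
# each carrying about half the W mass in the transverse plane.
print(transverse_mass(40.0, 0.0, 40.0, math.pi))  # 80.0 GeV, near the Jacobian edge
```

Histogramming this variable for events with one hard muon gives exactly the yellow W shape described above, with its characteristic edge just below the W mass.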

Please also note how clean the W signal is. These distributions will one day allow us to improve the already excellent precision in the mass of the W boson, plus to perform a host of other detailed studies.

One thing we can do already now, however, is to study the energy of jets recoiling against the W boson. This can be seen in the plot attached below: the recoiling jet transverse energy follows closely the predictions of simulations.

And what about Z bosons? Well, of course they are less frequent - because of the smaller production rate, and because of the smaller branching fraction to electron and muon final states. Still, CMS has produced significant signals already with 200 inverse nanobarns of data. Have a look at the dimuon mass, shown both on a linear (left) and a logarithmic scale (right) in the figure below.

What is significant in the plots is the extremely clean signal these decays provide: backgrounds are totally invisible in a linear scale, and they only appear in the log plot. At the Z mass peak, backgrounds appear to amount to less than a part per mille. Also worth noting is the very good resolution of the detector: the width of the Z boson mass distribution is close to that which the Z naturally has, due to its extremely short lifetime.

A similar signal is visible with electron-positron final states, as shown below:

Again, one notes the extremely clean nature of these events: the QCD background is mostly irrelevant. However, this is an inclusive selection: if one were to look for events with a Z boson and several jets, say, the QCD component would dramatically increase its relative importance. Such considerations will come into play when we search for new physics signals!

With the data shown above, CMS has measured the cross sections of W and Z production in electron and muon final states at 7 TeV, as well as the ratio of W to Z production, a number which can be known with better accuracy than the absolute rates, thanks to the cancellation of several systematic uncertainties. You can find all the measurements on the CMS public web pages. Here I will just flash one last figure, which amiably shows the increase of the vector boson production rate with the center-of-mass energy of the hadron collision.
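Why is the ratio better known than the absolute rates? Both cross sections are event counts divided by the same integrated luminosity, so the luminosity uncertainty (of order 10% in this early data; the exact figure here is an assumption) cancels in the ratio. A schematic illustration with made-up event counts, ignoring efficiency and acceptance systematics for simplicity:

```python
import math

def relative_uncertainty(n_w, n_z, lumi_unc=0.11):
    """Compare the relative uncertainty of an absolute cross section
    (statistics plus luminosity) with that of the W/Z ratio (statistics only)."""
    stat_w = 1.0 / math.sqrt(n_w)          # Poisson statistics on the W count
    stat_z = 1.0 / math.sqrt(n_z)          # Poisson statistics on the Z count
    abs_w = math.hypot(stat_w, lumi_unc)   # absolute sigma_W: luminosity enters
    ratio = math.hypot(stat_w, stat_z)     # W/Z ratio: luminosity drops out
    return abs_w, ratio

# Hypothetical early-data counts: 1000 W and 100 Z candidates
abs_w, ratio = relative_uncertainty(1000, 100)
print(abs_w, ratio)
```

With these made-up counts the absolute W cross section carries about an 11.4% uncertainty, dominated by luminosity, while the ratio is known to about 10.5%, purely statistical; as more data arrive the ratio keeps improving while the absolute rate hits the luminosity floor.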

Note that the blue lines showing the trend of cross section versus energy are broken: having proton versus antiproton or proton versus proton changes the production mechanisms, and thus the rate cannot be strictly compared with the Tevatron and UA1/2 measurements (on the left).

All in all, a rich bounty of measurements, already with 200 inverse nanobarns of data! I drool at the thought of what we will do with three orders of magnitude more data next year!

### Snapshots from Saturday sessions

Not everyone is visiting Paris (but still...)

A supernova exploding in Salle Maillot

Lonely poster session

The trouble with Ultra High Energy Cosmic Rays

Preparing for the President...

## Sunday, July 25, 2010

### Day 3: jets!

Saturday's sessions were not really well attended, if we exclude the one on CP violation, CKM and Rare Decays, which succeeded in packing a lot of people into one of the smaller rooms. Maybe the average ICHEP participant decided to sleep in (faire la grasse matinée), or simply to be a tourist in the City of Light on a very nice and sunny day. Still, in the semi-empty rooms there were lots of interesting things to be learned.

For instance, I was eager to attend the session on Perturbative QCD, Jets and Diffractive Physics: partly because some of the results I have been working on in the last months were presented there, partly because I wanted to know how CMS was doing on the same subject, but mainly because I wanted to know how the LHC experiments are doing on jet measurements. Jets are in fact copiously produced in the 7 TeV LHC collisions, and even with the not-so-huge amount of data collected up to now, ATLAS and CMS are already capable of making nice measurements, and, in a sense, already unique ones.

I was not disappointed: Tancredi, who was presenting the jet measurements for ATLAS, opened his talk with a nice historical reminder: there was a time, nearly 30 years ago, when jets were seen for the first time at a hadronic collider (and presented in Paris!). Those were days when a physicist could get excited about a di-jet event with a 140 GeV invariant mass, produced in hadronic collisions at a center-of-mass energy of 540 GeV. Today the hype is about di-jet events of 2 TeV invariant mass: it seems to me that such a comparison helps put things into a humbling perspective, reminding us how much road has been covered, how much is still to be done, and that we are all standing on the shoulders of giants.

Both ATLAS and CMS had impressive first cross section measurements for single jets and di-jet objects, already binned in different rapidity regions, and up to unprecedented di-jet masses. And the agreement with the NLO QCD calculation is already impressive, even though the data uncertainties are not yet the best possible!

In this particular respect, I was not completely satisfied with the way CMS explained their approach to getting the 5-10% jet energy scale uncertainty they claim. They certainly have several ratio measurements that reduce the impact of the systematic uncertainty on this quantity, but I'm still curious anyway! And since the data uncertainty is still the dominant one in the cross section measurements of both experiments, and it is mainly driven by the jet energy scale, it is a point that will become very relevant as soon as the statistics are large enough to make precise measurements in previously unexplored $p_T$ and $m_{1,2}$ ranges. That moment is certainly not far away!
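To see why the jet energy scale dominates: the inclusive jet spectrum falls steeply, roughly like a power law $p_T^{-n}$ (the index $n \approx 5$ used below is purely illustrative), so a small scale shift moves many events across bin edges and produces a much larger fractional change in the measured cross section. A rough sketch:

```python
def cross_section_shift(jes_shift, spectral_index=5.0):
    """Fractional change in the cross section measured at fixed pT for a
    spectrum falling like pT^-n, when all jet energies scale by (1 + jes_shift).
    For small shifts this is approximately n * jes_shift."""
    return (1.0 + jes_shift) ** spectral_index - 1.0

# A 5% jet energy scale shift inflates to roughly a 28% cross-section
# change for a spectral index of 5:
print(cross_section_shift(0.05))  # ~0.28
```

So a 5-10% scale uncertainty easily translates into tens of percent on the cross section, which is why every question about how that number was derived is worth asking.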

Of course pQCD is a nasty beast, and as soon as one starts to compare one's jet results with some tune of one's preferred Monte Carlo, one can be assured that someone will ask how reliable the chosen tunes are, how well they fit the low-energy data from previous experiments, how well one knows the MC authors... :-) I suspect a human component in this aggressive questioning: like it or not, jets are really the only domain in which the LHC experiments have already overtaken the Tevatron analyses in mass reach. Something both speakers did not fail to remind the audience of, and which might not have pleased everyone!

### Away from Palais des Congrès ?

As usual, when you attend a conference, you tend to forget that there are other things happening and other people living outside from the venue of the conference. This is particularly true when you are abroad, away from the daily routine, and the time and attention spent in the session rooms makes you forget where you are, and whether it is morning, afternoon or night...

However, even if you are not around the Palais des Congrès now, you can follow all the talks on the conference webcast here. This is true for the plenary talks of next week, which will be broadcast live, but you can also access all the parallel talks that took place in the Salle Maillot last week.

In case you want to enjoy your time in Paris a bit (or skip a session, but shhh...), you may still bump into ICHEP 2010 during your walks. Indeed, if you stroll along the Seine and enjoy Paris-Plages, you will certainly notice a physics stand for kids -- with a "trombinosquark" and other fun activities...

And on Tuesday evening, if you enjoy the atmosphere of the "Grands Boulevards", you will see yet another ICHEP-related event. But this time we have moved from the Palais des Congrès to the Grand Rex theatre hall, with the "Night of the Particles": a popular physics lecture and a good movie! Another way of enjoying ICHEP away from Porte Maillot...

### End of Part One

Sunday... Time for a rest for the participants... as well as for the organisers. Obviously, the last three days were quite hectic for me, since I had both kinds of duties (hard to explain to a colleague whom you see only once every two years that you must post first and discuss physics afterwards). Thus it was not really blogging time for me, as we had a few things to sort out here and there.

As expected, I followed the flavour-related sessions quite extensively. Obviously, now that Babar and Belle have intensively studied the Bd and B+ mesons, the Bs meson is the focus of all interest -- which explains the large crowd gathering in the small room. D0 again reported their exciting dimuon asymmetry results at ICHEP, which rely on a clever cancellation of some of their background by combining the dimuon asymmetry with the single-muon one. As reported by Jester, things are getting far less optimistic -- if you like New Physics -- for the phase in Bs mixing, since CDF now agrees with the SM, whereas D0 remains in muddy waters. Let us hope that both collaborations will provide an average very soon...

Fortunately, the other sessions were less overcrowded, but still quite lively. Belle has shown its ability to run nicely at the Υ(5S), yielding interesting results on some nonleptonic decays of the Bs meson after analysing only 1/5 of its sample. This makes the prospects of a super-B factory all the more interesting, and will probably renew our interest in old-fashioned (but still so reliable) studies of SU(3) and U-spin symmetries. On the lighter side of CKM, the KOTO experiment seems to be right on track, with completion of the detector in 2011 to reach a sensitivity of around 10^-9 on Br(K0->π0νν-bar) in 2012. Considering how well we understand this mode theoretically in the Standard Model, this is excellent news. There were also preliminary results from MEG on the μ → e γ decay: a few events can be seen in the signal region -- still compatible with no signal at 90% CL (but the best-fit result corresponds to 3 signal events, which makes you dream that maybe, with increased statistics...).

I popped into other sessions to hear a few interesting talks, like the first results from LHCb (with the first measurements of the b b-bar production cross-section, as well as the first W and Z events), the not-so-surprising non-discovery of the Higgs at CDF/D0, and the related (and maybe underestimated) theoretical uncertainties on the channels used for this search. Unfortunately, our "presidential" change of schedule kept me quite busy outside the session rooms at the time of the new results on neutrinos and cosmic rays, but I still have the plenary talks to learn about them...

Let Part Two begin!

## Saturday, July 24, 2010

### D0 says: neither dead nor alive

This year CP violation in the Bs meson system has made the news, including BBC News and American Gardener. The D0 measurement of the same-sign dimuon asymmetry in B decays got by far the most publicity. Recall that Tevatron's D0 reported a 1 percent asymmetry at the 3.1 sigma level, whereas the standard model predicts a much smaller value. That would suggest a new source of CP violation, perhaps new heavy particles that we could later discover at the LHC.

The dimuon asymmetry is not the only observable sensitive to CP violation in the Bs system. Another accessible observable is the CP-violating phase in time-dependent Bs decays into the J/ψ φ final state. In principle, the dimuons and J/ψ φ are 2 different measurements that do not have to be correlated. But there are (not completely bullet-proof) theoretical arguments that a large deviation from the standard model in one should result in an observable deviation in the other. This is the case, in particular, if new physics enters via a phase in $M_{12}$ (the so-called dispersive part of the mixing amplitude, as opposed to the absorptive part $\Gamma_{12}$), which is expected if the new particles contributing to that amplitude are heavy. The previous, 2-year-old combination of the CDF and D0 measurements displayed an intriguing 2.1 sigma discrepancy with the standard model. CDF updated their result 2 months ago and, disappointingly, it seems perfectly consistent with the standard model. D0 revealed their update today in an overcrowded room at ICHEP. Here is their new fit of the CP-violating phase vs. the width difference of the 2 Bs mass eigenstates:
Basically, D0 sees the same 1.5-sigma-ish discrepancy with the standard model as before. Despite 2 times larger statistics, the discrepancy is neither going away nor growing, leaving us in the dark. Time will tell whether D0 has found hints of new sources of CP violation in nature, or merely hints of complicated systematic effects in their detector.

### Searching for Higgs; finding colleagues

The hot topic at ICHEP is definitely the Higgs search (even President Sarkozy has caught it!). I just posted a few more Higgs-y details elsewhere; here I'll discuss something (slightly) less massive: the interactions between physics bloggers.

While a few of the bloggers knew each other (or of each other) before we joined the ICHEP effort, most of us hadn't met in person. So we gathered yesterday for lunch, to discuss blogging, rumors, and bagels. (Our attempt to eat at a French brasserie was thwarted by dozens of other particle physicists, so we ended up enjoying a very American lunch at the Westside Cafe - something especially appreciated by this expat American, who can't find bagels anywhere near Geneva.)

See if you can spot your favorite ICHEP blogger... It's a lot easier than finding the Higgs!

### Day 2: Higgs at Tevatron, Higgs at LHC, Higgs on the BBC

Friday afternoon I sat again in a relatively packed room for the Higgs session. The effect of the rumor seems to be fading away, but there is still quite a buzz around the Tevatron Higgs searches, especially because our friends are professionally distilling their results at a tantalizing pace, and - in case you missed it - the final Tevatron combination will be shown only on Monday :-)

Every Higgs search channel at the Tevatron has its peculiarities and its points of interest (since I'm working on the $H\to\gamma\gamma$ search at the LHC myself, I was for instance particularly interested by this presentation), but what I always find impressive in these sessions is not the single analyses but their combination.

What is rather clear, in fact, is that neither CDF nor D0 has enough sensitivity and data to see the Higgs (or to claim it does not exist) in a single decay channel. Take for instance the $\gamma\gamma$ channel I was mentioning before: with this channel alone, the Tevatron experiments can today only place a limit on Standard Model Higgs boson production at around 20 times the SM cross section.

On the other hand, this channel can add about 5% sensitivity to the combined SM Higgs limit, and it plays an especially important role in the mass region around 130 GeV. Similarly, dozens of other channels bring their small but important contributions to the global sensitivity. Have a look for instance at the list of Higgs searches that are combined by D0:

or at the impressive combination of the CDF limits for all the channels they are looking at:

Putting all these searches together is industrial work, with a non-negligible effort of standardizing the format of the results, both across the different analysis teams within a collaboration and between the two collaborations. It's something ATLAS and CMS will have to learn to do quickly: as emerged during the session, there already exists a combined ATLAS-CMS effort for the statistical combination of their results, and very recently a first exercise in combining LHC Higgs results was performed, but the road to the current Tevatron expertise and organization is still rather long.
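A back-of-envelope way to see why even a weak channel helps the combination: in a rough Gaussian approximation (a sketch, not the actual CLs machinery the experiments use), expected significances add in quadrature, so expected limits $r_i$ (in units of the SM cross section) combine as $1/r^2 = \sum_i 1/r_i^2$:

```python
import math

def combine_limits(limits):
    """Rough Gaussian intuition for combining independent search channels:
    expected significances add in quadrature, so expected limits r_i
    (in units of the SM cross section) combine as 1/r^2 = sum_i 1/r_i^2.
    A back-of-envelope sketch, not the experiments' CLs procedure."""
    return 1.0 / math.sqrt(sum(1.0 / r**2 for r in limits))

# a strong channel at 1.2 x SM plus a weak one at 20 x SM:
alone = combine_limits([1.2])
combined = combine_limits([1.2, 20.0])  # slightly better than 1.2 alone
```

Even a channel twenty times weaker than the SM cross section nudges the combined limit down a little, which is why the experiments bother combining dozens of them.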

But - don't you know? - the Tevatron combination will only be presented on Monday, so let's stick to the separate D0 and CDF ones. As you can see from the previous plots, CDF was lucky and was able to reach sensitivity to the SM Higgs at 165 GeV on its own. D0 was slightly less lucky, but is nearly there too: congratulations!

The most interesting questions to the Tevatron experiments during the session all revolved around the same subject: how much more data would they need to bring their curves below 1 along the whole mass range? It's certainly a very relevant question: as you can see from the ATLAS and CMS talks in the same session, the LHC experiments will need time before they can reach similar sensitivities, and in the meanwhile the Tevatron would certainly like to keep on taking data as long as possible. This is such a hot subject these days that it has percolated to the media, as the D0 speaker reminded us:
It's certainly not easy to answer: how much would the CDF and D0 sensitivity curves move toward 1 with twice the luminosity they have today? And with three times? Taking into account that to improve the sensitivity by a factor $N$ one needs $N^2$ times the luminosity, they certainly still need quite a lot of additional data. And even if they claim they can improve the analyses further, and maybe include some remote channel still missing from the combination, statistics will play the dominant role. But if I were them, I would certainly try to keep on running as long as I could.
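The scaling quoted above can be sketched in a few lines. The function names are mine, and the $1/\sqrt{L}$ behaviour assumes a statistics-dominated search:

```python
import math

def limit_after_more_data(current_limit, lumi_factor):
    """Expected limit (in units of the SM cross section) after scaling the
    luminosity up by lumi_factor, assuming the statistics-dominated
    1/sqrt(L) behaviour discussed in the text."""
    return current_limit / math.sqrt(lumi_factor)

def lumi_needed_to_reach_sm(current_limit):
    """Luminosity scale factor needed to push the limit down to 1 x SM:
    improving sensitivity by a factor N costs N^2 in luminosity."""
    return current_limit ** 2

# e.g. a channel currently excluding only 2 x SM needs ~4x the data
# to reach the SM cross section itself:
new_limit = limit_after_more_data(2.0, 4.0)
factor = lumi_needed_to_reach_sm(2.0)
```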

### Pessimist In Elafonisos

While a thousand physicists gather in hot Paris and listen to talk after talk, I am confined on a small island in the Mediterranean, trying to relax and gather my ideas for the next few aggressive months of data analysis, a course of subnuclear physics in the fall, and, of course, more reckless rumor-mongering!

Being away from the scene naturally makes me a less well-informed blogger for this site, but it also yields some advantages. No, I am not talking of the blue waters of Elafonisos (see picture), of the fish dinners at tables planted in the sand of a beach, or of the siestas "Greek style", which last from about three in the afternoon to about six thirty in the evening.

I am talking instead of the advantage of having time to think, for once. And to have a global look at where high-energy physics is going in the next decade. The crazy schedule that experiments have forced upon themselves in order to produce as many high-quality results as possible in time for ICHEP left all of us little time to think during the last few months, and those in Paris are still immersed in the same turmoil. Let us instead take a step back and observe HEP from a distance.

After the completion of the picture of elementary fermions with the discovery of the top quark and the verification of the existence of the tau neutrino, many of us had great hopes for the future. The Higgs boson was the next target, sure, but most of my colleagues felt that besides that particle, fantastic new riches were going to be within arm's reach very quickly, with the soon-to-come Run II of the Tevatron and the LHC coming thereafter.

The line of reasoning is simple: the standard model is an effective theory - of this everybody is certain. It can only be valid at the energy regime at which we are testing particle physics today; it cannot be valid at much higher energies. It does not include gravity (and we all think an honest-to-God theory of everything should do that), and it does not "explain" things as we see them, but just allows us to calculate reactions and rates.

Our deeply-rooted Illuminism demands that there be something else in store for us, an apocalypse (I am studying Greek, and so I was reminded recently that "apocalypse" means "revelation")! A revelation of why the masses of fermions are so different from one another, why electroweak symmetry is broken, why neutrinos mix and what causes them to do that. And the place where this something else should reveal itself is... around the corner! It was around the corner ten years ago, it surely is there now!

There was one hint that we did get, just before the last ten years of drought. The observation that neutrinos mix among themselves was taken as the appetizer of a twelve-course dinner to come. Was it true? So far, no. After the appetizer, we have been starving. The Tevatron experiments, Belle, BaBar, and a host of other apparatuses have provided a wealth of new knowledge about the inner workings of the standard model, the details of quantum chromodynamics, etcetera. But nothing was found beyond the picture we had drawn, and which we already knew all too well. We are as much in the dark about what exists beyond as we were ten years ago. One is reminded of that old sentence: "Consistency requires you to be as ignorant today as you were a year ago"...

Is it going to be supersymmetry? Or new generations of matter? New vector bosons, gravitons, extra dimensions? Or still more unexpected things? For many, the hope of a new golden age of particle physics, similar to that of the late fifties of the last century, seemed almost a certainty ten years ago. Is it right to nourish the same enthusiasm today? Is insisting on that enthusiasm the right thing to do, given the large number of students pressing from below to do fundamental physics today?

We keep repeating to ourselves that the LHC is a discovery machine, and that if something is there, ATLAS and CMS will find it. I cannot object to this reasoning - the LHC is an unbelievable project, the most technologically complex endeavour that humanity has produced. But I do not see why new physics should have to hide in the corner of phase space that the Tevatron and the other machines could not yet reach. The LHC is great, but this does not prove it will discover anything: it is simply a non sequitur!

These ten years have seen us exploring large chunks of the parameter space of dozens of new physics models, in vain. I may be a pessimist, but I have the feeling that the LHC will only repeat the play, at a bigger scale.

Four years ago I bet 1000 dollars that we would not see any hint of supersymmetry at the LHC. The original bet targeted 2010 and 10 inverse femtobarns of collisions as the year and conditions for a payoff, one way or the other. Now, with the delays of the machine, we are still forced to wait. According to tentative schedules of running and shutdowns at CERN, the 10 inverse femtobarns of analyzed data that we took as the basis of a conclusive deadline to assess the bet are not going to be collected before 2013. But the bet still stands, and alas, I am as convinced as I was four years ago that the next ten years will be the apocalypse of our short-sightedness.

Will I be happy if I eventually cash in my bet? Of course not! If you know the gambling game of blackjack, you will understand that the 1000 bucks were placed as an "insurance bet". But I do not see a ten coming after the ace.

### Day two -- the lattice track

Friday was the day of the lattice QCD track, which I attended exclusively throughout the day. The talks in the lattice session had actually been selected to be accessible and of interest also to people outside the lattice community (in particular there were a number of review talks), so it was a bit of a pity that the talks were attended almost exclusively by lattice theorists.

Stefan Schaefer started the first session with a review talk about pion physics and progress in simulation algorithms in lattice QCD. The low-energy physics of pions is described by Chiral Perturbation Theory (χPT), which is an effective field theory with a number of "low-energy constants" that need to be determined from the underlying ("high-energy") theory, i.e. from QCD. In order to determine these constants from lattice simulations, one needs to be able to simulate at low pion masses, so that one can establish the link to the domain of validity of χPT. This is where algorithmic developments come in, since ten years ago simulations at those quark masses would have been considered impossible, even given today's computers. The reason is that the effort needed for a lattice simulation grows rapidly with decreasing quark masses; in order to curtail that growth, new algorithms were needed. One such algorithm is the domain-decomposed Hybrid Monte Carlo (DD-HMC) algorithm by Martin Lüscher; other choices include the Rational Hybrid Monte Carlo (RHMC) algorithm. Further improvements have been obtained using "deflated" solvers for the Dirac equation, which separate the low-lying (infrared) modes from the ultraviolet, and thus manage to speed up calculations considerably.
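To give non-lattice readers a flavour of what "Hybrid Monte Carlo" means, here is a toy HMC for a single Gaussian degree of freedom with action $S(q)=q^2/2$ -- a minimal sketch of the molecular-dynamics-plus-Metropolis idea, not of DD-HMC or RHMC themselves:

```python
import math, random

def hmc_sample(n_samples, n_steps=10, eps=0.2, seed=1):
    """Toy Hybrid Monte Carlo for a single variable with action S(q) = q^2/2,
    i.e. a unit Gaussian target. In real lattice QCD, q is the gauge field
    and dS/dq involves the (expensive) fermion determinant."""
    rng = random.Random(seed)
    q = 0.0
    samples = []
    for _ in range(n_samples):
        p = rng.gauss(0.0, 1.0)          # refresh the fictitious momentum
        q_new, p_new = q, p
        # leapfrog integration of Hamilton's equations for H = p^2/2 + S(q)
        p_new -= 0.5 * eps * q_new       # dS/dq = q for this toy action
        for i in range(n_steps):
            q_new += eps * p_new
            if i != n_steps - 1:
                p_new -= eps * q_new
        p_new -= 0.5 * eps * q_new
        # Metropolis accept/reject on the energy violation of the integrator
        dH = 0.5 * (p_new**2 + q_new**2) - 0.5 * (p**2 + q**2)
        if dH < 0 or rng.random() < math.exp(-dH):
            q = q_new
        samples.append(q)
    return samples

samples = hmc_sample(20000)
mean = sum(samples) / len(samples)
var = sum(s * s for s in samples) / len(samples)
# for this action, mean and variance should come out close to 0 and 1
```

The accept/reject step makes the algorithm exact despite the finite integration step; the cost explosion at light quark masses comes from the force computation, which is what DD-HMC and deflation attack.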

Next was Takashi Kaneko speaking about disconnected quark loops. Many quantities are difficult to measure on the lattice, because they involve not only quark flows going from source to sink, but also quark loops attached only to one side, so-called disconnected diagrams. These are very hard to measure, since they require an "all-to-all" quark propagator, i.e. in principle the complete inverse of the Dirac operator on a given gauge background. A number of techniques have been developed to deal with this problem, all of which revolve around the idea of estimating the inverse of the Dirac operator stochastically, with a variety of methods devised to beat down the resulting noise in order not to lose the signal completely.
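The stochastic estimation idea fits in a few lines: with $Z_2$ noise vectors $\eta$, the expectation of $\eta^T A^{-1} \eta$ is $\mathrm{Tr}\,A^{-1}$. In the sketch below (the names and the toy diagonal "Dirac operator" are mine), `apply_inverse` stands in for solving the Dirac equation on a noise source:

```python
import random

def stochastic_trace(apply_inverse, dim, n_noise=200, seed=7):
    """Estimate Tr(A^{-1}) stochastically: for Z2 noise vectors eta,
    E[eta^T A^{-1} eta] = Tr(A^{-1}). 'apply_inverse' plays the role of
    solving the Dirac equation on each noise source."""
    rng = random.Random(seed)
    acc = 0.0
    for _ in range(n_noise):
        eta = [rng.choice((-1.0, 1.0)) for _ in range(dim)]  # Z2 noise
        x = apply_inverse(eta)                               # x = A^{-1} eta
        acc += sum(e * xi for e, xi in zip(eta, x))
    return acc / n_noise

# toy 'Dirac operator': a diagonal matrix, so applying the inverse is trivial.
# (For a diagonal A each Z2 sample is already exact; the noise one fights in
# practice comes entirely from the off-diagonal entries.)
diag = [1.0, 2.0, 4.0, 5.0]
est = stochastic_trace(lambda v: [vi / d for vi, d in zip(v, diag)], len(diag))
exact = sum(1.0 / d for d in diag)
```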

The last talk of the session was Dru Renner talking about the calculation of the leading-order hadronic contribution to the anomalous magnetic moment of the muon from lattice QCD. The anomalous magnetic moment of the muon has a well-known and persistent 3σ discrepancy between theory and experiment; however, the theory error is dominated by the error on the hadronic contributions, which currently are being estimated phenomenologically from the cross-section for $e^+e^- \to$ hadrons. A lattice calculation would provide a first-principles prediction for the hadronic contribution, and the leading hadronic contribution is given by the photon self-energy insertion, which can be measured as a current-current correlator on the lattice. Low-momentum contributions are dominant, however, and on a finite lattice there is a lower bound on momenta, so that the accuracy of the lattice prediction is still limited at present. Nevertheless, its good agreement with the phenomenological number provides a valuable cross-check. Determining the next-to-leading hadronic contribution will involve measuring light-by-light scattering amplitudes on the lattice, which is a completely unsolved problem as yet.

The second session had Owe Philipsen speaking on the QCD phase diagram at finite baryon density. At finite baryon density, i.e. finite baryon chemical potential, the fermionic determinant is in general no longer real and positive, and thus can no longer be interpreted as a probability measure, which prevents direct lattice simulations. A number of methods to work around this problem have been invented: one can Taylor-expand in the chemical potential and measure the Taylor coefficients at zero chemical potential, or one can simulate at a number of imaginary values of the chemical potential (where the determinant is positive) and analytically continue a fit to real values, or one can try to use reweighting techniques. Using a variety of these techniques, a number of groups are now exploring the QCD phase diagram in the region of low chemical potential, which, however, will probably be enough to clarify the existence, nature and location of the QCD critical point in the near future.
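The imaginary-chemical-potential trick can be illustrated with a toy observable that is analytic in $\mu^2$: "measure" it at $\mu^2 < 0$ (imaginary $\mu$), fit, and continue the fit to $\mu^2 > 0$. In practice one fits polynomials in $\mu^2$ to noisy simulation data; the straight-line fit and exact points below are only a sketch:

```python
def fit_line(xs, ys):
    """Least-squares fit of y = a + b*x -- a stand-in for the polynomial
    fits in mu^2 used in actual analytic-continuation analyses."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
        / sum((x - mx) ** 2 for x in xs)
    a = my - b * mx
    return a, b

# 'measurements' at imaginary chemical potential, where mu^2 < 0 is
# accessible to direct simulation (the determinant stays positive there)
mu2_imag = [-0.4, -0.3, -0.2, -0.1]
obs = [1.0 + 0.5 * m2 for m2 in mu2_imag]   # toy observable, analytic in mu^2

a, b = fit_line(mu2_imag, obs)
# analytically continue the fitted curve to real chemical potential, mu^2 > 0
prediction_at_real_mu = a + b * 0.25
```

The catch, of course, is that the continuation is only as trustworthy as the assumed analytic form, which is why the method is limited to small chemical potential.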

George Fleming gave a talk on the subject of "walking" technicolor models and lattice simulations of gauge theories that are not QCD-like. Depending on what the LHC finds, this may become a very very hot topic in the near future, as theorists may need to find a description of possible strong dynamics that exist at the TeV scale and give rise to electroweak symmetry breaking instead of a Standard Model Higgs.

Also related to Beyond-the-Standard-Model (BSM) physics was Philipp Gerhold's talk on the effects of a possible fourth generation on Higgs boson mass bounds. By simulating a Higgs-Yukawa model (i.e. the Higgs and fermions, but no gauge fields), he has derived upper and lower mass bounds on the Higgs in the presence of a fourth generation; the result is that if a fourth generation with a top'-mass of about 700 GeV exists, then the Higgs mass is bounded to lie between about 600 and 900 GeV, or in reverse, that if the Higgs is significantly lighter than that, there cannot be a fourth generation coupling to the Higgs in the same way as the first three generations.

For lunch, I met with some of the other ICHEP bloggers, which was very nice.

In the afternoon, the first session was started by Sinya Aoki with a talk about determining nuclear potentials from QCD, which is the programme of the HAL ("Hadrons to Atomic nuclei from the Lattice") collaboration. The potentials, obtained by measuring Bethe-Salpeter amplitudes on the lattice, extracting the non-local potential from them, and expanding it in a derivative expansion (assuming an energy-independent potential), agree quite well with the qualitative features of the phenomenological models used so far.

Colin Morningstar spoke about the calculation of the excited state spectrum using the "noisy LapH" or "distillation" method, which is a new method for improving the signal-to-noise ratio in lattice measurements of mass spectra. The method is based on realising a smearing operator for the quark fields as a projection on the space spanned by the lowest few eigenmodes of the 3d Laplace operator (hence the name Laplace-Heaviside or LapH), and on introducing a stochastic estimator of the inverse of the Dirac operator from this subspace.
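A minimal sketch of the projection at the heart of LapH (assuming `numpy`; the toy one-dimensional Laplacian below is mine -- the real method uses the 3d gauge-covariant Laplacian on each timeslice):

```python
import numpy as np

def laph_smearing_projector(laplacian, n_modes):
    """Build a LapH-style smearing operator S = sum_{k<n} v_k v_k^T from the
    lowest n eigenmodes of a lattice Laplacian -- a sketch of the projection
    idea, not of the full distillation/stochastic-LapH machinery."""
    w, v = np.linalg.eigh(laplacian)   # eigh returns ascending eigenvalues
    low = v[:, :n_modes]               # keep the lowest (smoothest) modes
    return low @ low.T

# toy 1D periodic lattice Laplacian on 8 sites
n = 8
lap = 2 * np.eye(n) - np.roll(np.eye(n), 1, axis=0) - np.roll(np.eye(n), -1, axis=0)
S = laph_smearing_projector(lap, 3)
# S is a rank-3 projector: S^2 = S and Tr S = 3
```

Because the smearing operator is an explicit projector onto a small subspace, the inverse of the Dirac operator only ever needs to be estimated from that subspace, which is what makes the stochastic estimation tractable.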

This was followed by a presentation of nucleon structure results from ETMC by Constantia Alexandrou. Nucleon structure is quite hard on the lattice -- there are significant systematic effects from the unphysically heavy light quarks used in simulations, and disconnected diagrams also contribute in many cases. As a result, lattice predictions disagree with experiment in many cases, although there is a notable tendency towards the experimental data as more sources of error come under control.

Jose Rodriguez-Quintero gave a talk on determinations of the strong coupling αs. This is a field in which a number of different methods have been used: the Schrödinger functional method pioneered by the ALPHA collaboration, calculations of Wilson loops within lattice perturbation theory as performed by the HPQCD collaboration, the use of moments of current-current correlators matched to continuum perturbation theory (also pursued by the HPQCD collaboration), and methods based on the determination of Landau-gauge propagators as presented by Jose.

The last session of the lattice track started off with Enno Scholz talking about results from 2+1 flavour QCD with domain wall fermions. Domain wall fermions are a formulation of fermions on the lattice that manages to realise chiral symmetry and evade the Nielsen-Ninomiya theorem by introducing a fictitious fifth coordinate in which the fermions are localised on a four-dimensional domain wall. The fifth dimension means that simulations are much more expensive than for other (e.g. Wilson or staggered) fermions that do not maintain the continuum chiral symmetry.

Michele Della Morte gave a review of heavy flavour physics, focussing on work done by the ALPHA collaboration. One important point he made was that in order for heavy flavours to be reliably simulated, fine lattices are needed (the Symanzik effective theory is an asymptotic expansion, and it only works for $ma$ small enough) -- e.g. a<0.08 fm for charm quarks. For bottom quarks, this means that one needs to resort to effective theories such as HQET, which has been studied non-perturbatively by the ALPHA collaboration in the quenched case, with results for Nf=2 forthcoming.

Another review of heavy flavour physics, this time focussing on work done by the HPQCD and FNAL/MILC collaborations, was given by Elvira Gamiz. A lot of work has gone into reducing the errors on $f_{D_s}$, and the previously reported tension between theory and experiment has vanished. Significant progress is also being made in the determination of semileptonic form factors.

The only thing I wasn't happy about in the lattice track was that almost no non-lattice people attended, since the talks had been selected specifically to give an overview of recent progress for the benefit of a wider audience (most of the lattice folks had been to the LATTICE 2010 conference and already knew a lot of these results). So why don't non-lattice people come to lattice talks -- is it that the lattice has a reputation for being highly technical?