Degree projects - computing

Bachelor and Master projects

Computing in particle physics experiments

Modern Particle Physics is also appropriately called High Energy Physics, because it relies on accelerators delivering high-energy particle collisions in order to generate data. Gone are the days when particle physicists climbed mountains with photographic plates, hoping to register rare particles produced in collisions of cosmic rays with atoms in the Earth's atmosphere. Even though such collisions can reach very high energies, they are rare and quite unpredictable. Accelerators allow us to recreate such collisions in a predictable and abundant manner. The Large Hadron Collider (LHC) at CERN is the latest in a line of large particle colliders delivering high-quality research data. Even at lower energies, great discoveries can be made if we can discern anomalies in the data.

These amazing scientific achievements come with a new challenge: the data need to be collected and processed in previously unseen amounts, and very fast. While photographic plates were processed at a slow pace and analysed with the naked eye, modern particle detectors are like huge three-dimensional digital cameras, with resolution reaching microns and producing thousands of "frames" per second. Handling this immense load of data so that scientists can obtain results rapidly requires highly specialised computing, storage and software infrastructure.

Our group contributes to the solution of this data problem in several ways. We develop specialised software that brings together data centres from around the world, helping to create a computing infrastructure that is distributed worldwide. The software is called ARC - Advanced Resource Connector, and the infrastructure is called WLCG - Worldwide LHC Computing Grid. We also contribute to the operation of the Nordic Tier-1 data centre of WLCG, which is a "branch" of CERN in Sweden. And of course, Lund researchers develop software that helps process data from the LHC, as well as model the collisions and detectors at the LHC and the future LDMX experiment.
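To give a flavour of what working against such an infrastructure can look like, here is a minimal, hypothetical sketch that drives the ARC command-line client (the real arcsub and arcstat tools) from Python. The job description file name is a placeholder, and the sketch assumes an ARC client installation that is already configured with a target computing element.

```python
# Hypothetical sketch: submitting a job to an ARC-connected cluster and
# checking its status via the NorduGrid ARC command-line client.
# Assumes an installed and configured ARC client; "analysis-job.xrsl"
# is a placeholder job description, not a real file.
import subprocess

JOB_DESCRIPTION = "analysis-job.xrsl"  # placeholder xRSL job description


def submit_job() -> str:
    """Submit the job with arcsub and return the job ID it reports."""
    result = subprocess.run(
        ["arcsub", JOB_DESCRIPTION],
        capture_output=True, text=True, check=True,
    )
    # On success, arcsub prints the assigned job ID (a URL).
    return result.stdout.strip()


def job_status(job_id: str) -> str:
    """Query the current state of the job with arcstat."""
    result = subprocess.run(
        ["arcstat", job_id],
        capture_output=True, text=True, check=True,
    )
    return result.stdout


if __name__ == "__main__":
    job_id = submit_job()
    print("Submitted:", job_id)
    print(job_status(job_id))
```

In practice the target cluster is chosen through the client configuration or command-line options, and larger workflows submit and track many such jobs programmatically rather than one at a time.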

Diploma projects in computing are typically done in conjunction with one of the experiments, such as ATLAS at the LHC or LDMX, or may concern generic software that is common to all experiments. As the projects often deal with code development, students are expected to be familiar with Linux and at least one programming language. Knowledge of C++ is most beneficial, though Python, C, Java and others are also relevant.

Examples of projects are:

  • Studies of optimal software algorithms and tools for data analysis, including artificial neural networks (see the sketch after this list), programming for new processor architectures, and more
  • Optimisation of existing software, such as the GEANT4 detector simulation toolkit
  • Development of analysis or simulation workflows for distributed computing infrastructures
  • Development of tools to support distributed data processing or storage
  • Machine learning
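As a flavour of the first and last items above, the following is a minimal, self-contained sketch of training a small neural network to separate "signal" from "background" events. The data are synthetic stand-ins for real detector features, and the scikit-learn model is just one of many possible tool choices.

```python
# Toy sketch: classify "signal" vs "background" events with a small
# neural network. The features are synthetic; real analyses use
# carefully chosen detector observables and tuned architectures.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(seed=42)

# Toy "events" with 4 features each (standing in for e.g. energies and
# angles); signal and background come from slightly shifted distributions.
n_events = 10_000
background = rng.normal(loc=0.0, scale=1.0, size=(n_events, 4))
signal = rng.normal(loc=0.5, scale=1.0, size=(n_events, 4))

X = np.vstack([background, signal])
y = np.concatenate([np.zeros(n_events), np.ones(n_events)])

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2)

# A small fully connected network trained to tell the two classes apart.
clf = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=500)
clf.fit(X_train, y_train)
print(f"Test accuracy: {clf.score(X_test, y_test):.3f}")
```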
[Image: Processing LHC data]