The JENA Computing Workshop – Discussing the European strategy for computing

Interview with Gonzalo Merino, director of the Port d’Informació Científica

At the Joint ECFA-NuPECC-APPEC (JENA) Seminar in May 2022 in Madrid, both the plenary presentations and the closed session of funding agency representatives revealed that there is an increased need for discussions on the strategy and implementation of European federated computing at future large-scale research facilities. Therefore, APPEC, ECFA and NuPECC decided to organize a European, cross-community workshop on the strategy of computing. Gonzalo Merino, director of the Port d’Informació Científica, is part of the Organizing Team and will explain the implementation and aims of the workshop.

During the JENA Seminar the status and needs of computing for all three communities were discussed. Can you briefly explain the differences and commonalities?

I actually see more commonalities than differences. Each of the three communities has its own specific research program of course, but I think they show a lot of synergies in their common quest to understand fundamental physics questions such as the nature of dark matter, the origin of the highest-energy cosmic rays or many others. Last year we heard a lot of examples of this, for instance in detectors or accelerator programs. But it is in software and computing that I think the commonalities and potential synergies are most evident. To carry out our research program, we increasingly rely on massive amounts of experimental data as well as complex simulations. In this data-driven era that we live in, our science depends more and more on computing: both on the availability of large computing and data infrastructures, and on our capability to develop and maintain a rich software ecosystem to exploit this increasing complexity. There are several challenges ahead, and they all affect the three communities: handling exabyte-scale datasets while keeping budgets under control, making effective use of increasingly powerful HPC machines and new architectures such as GPUs, incorporating emerging paradigms such as AI or, further into the future, quantum computing into our analyses and, probably above all of them, training and retaining the talented people who are needed to build and maintain all this infrastructure. As for the differences, we could say that there has not been a long tradition of working together on these topics, but I see a change of trend and I am confident that there is an emerging transversal conversation.

How can the workshop contribute to addressing the computing challenge in the coming years?

I think the workshop can be an important catalyst for the cross-community dialogue and work that we need to see in the future. Initiatives like the ESCAPE project planted the seed for this process, which now has to consolidate. There are activities in place, such as EOSC-Future, or others being planned that I hope will stimulate this common development environment. I think that the software and computing challenges are common and very complex, not only for our three communities but also for many others. To me, one of the key aspects to succeed in managing that complexity is to work together and develop common infrastructure.

What format do you plan for the workshop? Will there be talks or discussions?

There will be both talks and discussions, but I hope we will have plenty of the latter. We will first have a number of talks to set the scene, remind us of the main challenges and the evolution of the global landscape. Then, we should have plenty of time for discussing the main issues, guided by experts from different fields organised in various panels. This will happen mostly on the second day of the workshop.

What do you hope to get out of the workshop?

I hope we end our meeting with a clearer vision of the roadmap for jointly developing the future software and computing infrastructure for our communities, and with a strategy to speak with a single voice in the global e-infrastructures conversation. We are of course not alone in this ecosystem, so we will also need to establish connections beyond our fields, with communities such as the photon sciences, life sciences or Earth observation, to name just the most obvious ones that come to my mind.

Gonzalo Merino

Gonzalo Merino is the Director of PIC, a scientific-technological center near Barcelona that specializes in data-intensive research and is jointly operated by CIEMAT and IFAE. PIC collaborates with scientists from different disciplines to develop advanced data handling services. It also provides data preservation and analysis services for the ATLAS, CMS and LHCb experiments at the LHC, the MAGIC telescopes in La Palma, the Euclid ESA satellite mission and others. From 2013 to 2018 he was at the University of Wisconsin–Madison managing the computing for the IceCube neutrino telescope, located at the South Pole. Gonzalo did a PhD in Physics at the UAB analyzing data from ALEPH, one of the four experiments at LEP, the former particle collider at CERN.


Further information