Computing and Astroparticle Physics 1
7-8 OCTOBER 2010, Lyon, France

A workshop entitled ‘Computing for Astroparticle Physics’ was held in Lyon, France at the end of 2010. It was aimed at addressing the challenging computational problems that the ASPERA Roadmap projects will face in data collection, data storage and data mining, drawing on models developed in the neighbouring fields of particle physics (grid and cloud computing, large databases) and astrophysics (virtual observatories, public access). The workshop also covered intelligent distributed data gathering and heterogeneous data fusion, as well as the availability of environmental and geoscience data, and outreach. Fifty people attended the workshop.

Three types of computing examples were presented during the workshop: Gravitational Waves (EGO, LIGO, VIRGO), Cosmic Rays (CTA, MAGIC, AUGER) and Dark Energy (LSST, EUCLID). Hardware (what will be the impact of Graphics Processing Units, or GPUs?) and middleware (Grid and/or Cloud?) were discussed at length in each case.
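To make the middleware question concrete, the sketch below shows what submitting a job through Grid middleware can look like from the user's side, here via the Python API of DIRAC, one workload-management system used in this community (our illustrative choice; the workshop discussed Grid and Cloud middleware generically). It assumes an installed, configured DIRAC client and a valid grid proxy.

    # Minimal sketch: submitting a job via the DIRAC Python API.
    # Assumes a DIRAC client installation and a valid grid proxy;
    # DIRAC is an illustrative middleware choice, not one mandated here.
    from DIRAC.Core.Base import Script
    Script.parseCommandLine(ignoreErrors=True)   # initialise the DIRAC client

    from DIRAC.Interfaces.API.Dirac import Dirac
    from DIRAC.Interfaces.API.Job import Job

    job = Job()
    job.setName("demo_job")                      # illustrative job name
    job.setExecutable("/bin/echo", arguments="hello grid")
    job.setCPUTime(3600)                         # requested CPU time (seconds)

    dirac = Dirac()
    result = dirac.submitJob(job)                # returns an S_OK/S_ERROR dict
    print(result)                                # e.g. {'OK': True, 'Value': <job id>}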
In addition, the astroparticle physics community interfaced with the following two communities:

  • The European Grid Infrastructure (EGI): EGI is a publicly funded e-infrastructure that gives European scientists access to more than 320,000 CPUs and 152 PB of disk space, with which they drive research and innovation within their own research domains. EGI provides compute and workload management services, data access and transfer, data catalogues, storage resource management, and other core services such as user authentication, authorisation and information discovery. Resources are provided by about 350 resource centres distributed across 56 countries in Europe, the Asia-Pacific region, Canada and Latin America.
  • The International Virtual Observatory Alliance (IVOA): the IVOA was formed in June 2002 with a mission to “facilitate the international coordination and collaboration necessary for the development and deployment of the tools, systems and organizational structures necessary to enable the international utilization of astronomical archives as an integrated and interoperating virtual observatory.” The IVOA comprises 20 VO programmes and focuses on the development of standards, encouraging their implementation for the benefit of the worldwide astronomical community. Working Groups with cross-programme membership are constituted in those areas where key interoperability standards and technologies have to be defined and agreed upon. The IVOA also has Interest Groups that discuss experiences of using VO technologies and provide feedback to the Working Groups.
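In practice, IVOA standards such as the Table Access Protocol (TAP) and the ADQL query language let a few lines of code query any compliant archive. A minimal sketch follows, using the pyvo Python client against SIMBAD's public TAP service (an illustrative choice of endpoint and table; any TAP-compliant service would do):

    # Minimal sketch: an IVOA TAP query with ADQL via the pyvo client.
    # The endpoint, table and column names (SIMBAD's 'basic' table) are an
    # illustrative choice; any TAP-compliant archive can be queried this way.
    import pyvo

    service = pyvo.dal.TAPService("https://simbad.u-strasbg.fr/simbad/sim-tap")

    # ADQL cone search: objects within 0.5 deg of the Crab nebula's position.
    query = """
        SELECT TOP 10 main_id, ra, dec
        FROM basic
        WHERE 1 = CONTAINS(POINT('ICRS', ra, dec),
                           CIRCLE('ICRS', 83.63, 22.01, 0.5))
    """
    results = service.search(query)
    for row in results:
        print(row["main_id"], row["ra"], row["dec"])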

Finally, on the topic of linking communities working at the regional level, the examples of HEAVENS (ISDC, Geneva, Switzerland) and the François Arago Centre (APC, Paris, France) were presented, along with a discussion of how such centres can be networked at the national and international level.

  • HEAVENS (High-Energy Astrophysics Virtually ENlightened Sky): provides analysis services for a number of recent and important high-energy missions. Through a simple query, these services perform on-the-fly data analysis to produce images, spectra and light curves for any source, sky position, time interval and energy band, without requiring mission-specific software or detailed instrumental knowledge (a sketch of such a query follows this list). By providing a straightforward interface to complex data and data analysis, HEAVENS makes both the data and the process of generating science products available to scientists and higher education. It also promotes the visibility of high-energy and multiwavelength astrophysics to society at large and encourages the public to explore the data actively.
  • The François Arago Centre: a project of the Astroparticule et Cosmologie (APC) laboratory, supported by the IPGP within the Space Campus of Université Paris Diderot. The centre was founded in 2010 to support space-based experiments facing the challenge of processing steadily growing and increasingly complex data sets. It provides services such as computing facilities, support for data analysis, archiving and distribution, access to the heavy-duty computing facilities at CC-IN2P3, and office space to both ground- and space-based projects with strong French involvement.
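The HEAVENS interface described above is essentially a parametrised query service. The sketch below illustrates what such a request could look like; the endpoint and all parameter names are hypothetical, invented only to show the kind of inputs involved (source, position, instrument, time and energy intervals, product type), and do not reproduce the actual HEAVENS API:

    # Hypothetical sketch of a HEAVENS-style query; the endpoint URL and
    # every parameter name below are invented for illustration only.
    import requests

    params = {
        "object": "Crab",            # source name, or an explicit position:
        "ra": 83.63, "dec": 22.01,   # sky coordinates in degrees (ICRS)
        "instrument": "ISGRI",       # e.g. an INTEGRAL instrument at the ISDC
        "tstart": "2010-01-01",      # time interval
        "tstop": "2010-06-30",
        "emin": 20.0, "emax": 100.0, # energy band in keV
        "product": "lightcurve",     # or "image", "spectrum"
    }
    response = requests.get("https://heavens.example.org/api", params=params)
    response.raise_for_status()
    with open("crab_lightcurve.fits", "wb") as f:
        f.write(response.content)    # science product generated on the fly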

 

Computing and Astroparticle Physics 2
30-31 MAY 2011, Barcelona, Spain

The second computing workshop, also entitled ‘Computing for Astroparticle Physics’, was held in Barcelona, Spain in May 2011 and was organised as a dialogue between Models for Computing and Data Pipelines for Astroparticle Physics projects (“the Modellers”) and Technologies for Data Processing and Computing (“the Technologists”).
A novel feature of this workshop was its two tracks of parallel sessions. The first track, Models for Computing and Data Pipelines, covered the computing models of event-driven, gravitational-wave and dark-energy experiments, with technologists acting as rapporteurs. The second track, Technologies for Data Processing and Computing, held sessions on technologies for pipeline automation and database access, for data storage and for computing, with modellers acting as rapporteurs. Plenary and parallel sessions allowed participants to hear an overview of the current computing models being developed by upcoming astroparticle observatories, including CTA, KM3NeT, Auger, VIRGO/LIGO and LSST. The availability of environmental data relevant to other areas, outreach, and links with existing centres of particle physics and astrophysics were also discussed.

This workshop also addressed the development of the computing model as a tool for the financial and technical planning of an experiment's computing needs, encapsulated in a management document that is periodically revised.
Key elements of a computing model are detailed descriptions of:

  • DATA: Raw and derived data types, sizes and volumes.
  • WORKFLOWS: Raw-to-Science data processing, analysis and simulation.
  • USER ANALYSIS: Often difficult to estimate; the assumptions made should be clearly stated.
  • SOFTWARE PROVISIONING: Including human resources for development within the project and licensing costs of commercial software.
  • OPERATIONS AND SUPPORT: Including platform support; data bookkeeping, movement and placement procedures; and the human resources to support them.

These descriptions should be followed by detailed technical solutions, which in turn lead to budget estimates and to the required levels of personnel, including computing professionals. When dealing with petabytes of data, 10,000 cores and more, a computing model is mandatory for making efficient use of resources, and it will prove key to including computing needs in the planning and funding cycles of the funding agencies. A solid accounting infrastructure that enables regular reporting is essential. Since conditions always evolve during an experiment's lifetime, a formal procedure is also needed to periodically feed requirement changes back into capacity planning and funding, and this procedure should be managed as carefully as the computing model itself. Opportunities for re-use and economy, particularly in software, should be developed. Finally, it should be discussed whether one computing model per project or one per “area” of astroparticle physics is the optimal strategy for the future.
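As a concrete illustration of how such descriptions turn into budget estimates, the sketch below runs the kind of back-of-the-envelope arithmetic a computing model starts from. Every input number is an invented placeholder, not a figure from any of the workshops:

    # Back-of-the-envelope computing-model sizing. All inputs are invented
    # placeholders; a real model would take them from the experiment's design.
    raw_rate_mb_s  = 50.0    # average raw data rate (MB/s)
    duty_cycle     = 0.8     # fraction of the year spent taking data
    derived_factor = 1.5     # derived-data volume relative to raw
    replicas       = 2       # custodial copies of each data set
    years          = 10      # planned data-taking period

    seconds_per_year = 365.25 * 24 * 3600
    raw_pb_per_year = raw_rate_mb_s * duty_cycle * seconds_per_year / 1e9
    total_pb = raw_pb_per_year * (1 + derived_factor) * replicas * years

    cpu_s_per_mb = 2.0       # reconstruction cost per MB of raw data (core-s)
    cores = raw_rate_mb_s * duty_cycle * cpu_s_per_mb  # sustained cores for
                                                       # reconstruction alone

    print(f"Raw data:       {raw_pb_per_year:.2f} PB/year")
    print(f"Total storage:  {total_pb:.1f} PB over {years} years")
    print(f"Reconstruction: ~{cores:.0f} cores sustained")

Simulation and user analysis, the hardest items to estimate, would add to the CPU figure, which is exactly why a computing model should state its assumptions explicitly.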

 

Computing and Astroparticle Physics 3
3-4 MAY 2012, Hannover, Germany

The third ASPERA ‘Computing and Astroparticle Physics’ workshop took place in Hannover, Germany, and focused on hardware and technology, taking a detailed look at the challenging problems of data collection, data storage and data mining. In some cases computing is the bottleneck, so using the best and most appropriate hardware and technology will enable more and better science to be done. The current status of and progress in the existing and future computing models of LSST, AUGER and CTA were presented, but because computing technology is largely driven by non-science market forces, the workshop also involved some of the relevant market leaders, whose technology roadmaps are of great relevance. Industry representatives were invited to present their developments and expectations for high-performance computing. Given the extraordinarily complete review of hardware trends presented at the workshop, it was suggested that any future workshop be organised jointly by the astroparticle physics community and the computing industry.

A consensus was reached on the following aspects of computing in relation to astroparticle physics. Given the nature and scale of astroparticle physics experiments, computing and data processing are essential parts of this interdisciplinary science. Estimated data-taking periods are long, in some cases a decade or more, so the experiments' computing must have continuing support and planning, with technological renewal. Requirements have already been found to be large and continuously growing. Given that computing costs are not negligible, they must be planned for, making “computing models” an essential tool. In this respect, much can be learnt from high-energy physics, particularly from the LHC, although the needs of astroparticle physics experiments are not exactly the same. Finally, data centres currently appear to be in an abrupt transition to resource-efficient mega-centres, and the community should prepare to adapt to these changes.

The final session of the workshop addressed the steps to be taken towards a draft of an ASPERA Computing Whitepaper, which will summarise the conclusions of the three ASPERA workshops. The outline of the paper was discussed and working groups were established to prepare its input.