
A look back at the conference on “the new computing paradigms”
On Wednesday, 8 October 2025, CNRS Informatics organised a conference on new computing paradigms, its thematic focus for this year. Here is a look back at the day, which took place at the CNRS headquarters in Paris.
Antoine Petit, President and CEO of the CNRS, opened the conference by highlighting the growing importance of computing and its presence across all CNRS institutes. Computing now plays a key role in research, particularly with the ever-growing power and ubiquity of computers, and it also raises questions of energy efficiency.
Sustainable computing: past, present and future
The first presentation focused on the challenge of energy efficiency in computing. Anne-Cécile Orgerie, head of the GDS EcoInfo, and Denis Trystram, professor at the Grenoble Institute of Technology and member of the LIG, addressed the question of what an energy-efficient and responsible digital technology would look like. While computer science is often presented as part of the solution to climate problems, it also bears a significant share of the responsibility: in 2021, digital technology accounted for between 2.1% and 3.9% of global carbon emissions. In France, 11% of electricity consumption, or 52 TWh on average, went to digital technology in 2020; including data centres located abroad, consumption rose to an average of 65 TWh that year. By 2050, consumption is expected to increase by 79%, to around 93 TWh. This does not include other externalities such as the water needed to cool servers, which is also increasing year on year.
As a result, solutions are being implemented to try to reduce electricity consumption: low-power processors, optimised communications, hotspot management, and so on. But is this enough? Unfortunately, energy optimisation efforts do not translate into a clear reduction in consumption. Although optimisation reduces the cost of the resources used, it tends to accompany continued growth in digital technology and its uses, and therefore in overall energy consumption (a rebound effect). In other words, the pursuit of efficiency alone is not enough. But how can these systemic and ethical issues be addressed? What does the future hold for computer science, and how can it be made sustainable and environmentally friendly? In her presentation, Anne-Cécile Orgerie addressed these questions, emphasising the important role of researchers in this global challenge.
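The rebound effect described above can be illustrated with some back-of-the-envelope arithmetic (the numbers are purely hypothetical, not taken from the talk):

```python
# Illustrative rebound-effect arithmetic (hypothetical numbers):
# per-unit efficiency improves, yet total consumption still grows with usage.
energy_per_unit = 1.0      # arbitrary energy cost per unit of digital service
units_of_usage = 100       # initial usage level
before = energy_per_unit * units_of_usage

energy_per_unit *= 0.70    # a 30% efficiency gain from optimisation
units_of_usage *= 2        # usage doubles as digital services expand
after = energy_per_unit * units_of_usage

print(before, after)       # 100.0 vs 140.0: consumption rises despite the gain
```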
Quantum computing
The next session focused on one of the research areas of the PEPR Quantum: quantum computing. Yassine Hamoudi, a CNRS researcher at the LaBRI, opened the session by comparing classical circuits with quantum circuits to highlight the notable differences between classical and quantum physics. A classical computer simulates a classical computational model, which is not the case for its quantum counterpart. The latter uses the laws of quantum mechanics to perform calculations on data, but these operations are still too error-prone for the machine to be usable: the culprits are noise (imperfections in the implementation of qubits and operations) and decoherence (uncontrolled loss of quantum properties). No technology currently exists to realise a large-scale computational model based on quantum physics. The challenge is therefore to develop error-correcting codes and build high-quality qubits in order to scale up.
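As a purely illustrative sketch (not from the presentation), the difference between the two models can be seen in how a qubit is described: where a classical bit is 0 or 1, a qubit carries two complex amplitudes, and gates act on those amplitudes. A minimal simulation with NumPy:

```python
import numpy as np

# A single qubit as a 2-amplitude state vector; a classical bit has no superposition.
ket0 = np.array([1.0, 0.0])                    # the |0> state
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate

state = H @ ket0                 # equal superposition of |0> and |1>
probs = np.abs(state) ** 2       # Born rule: probabilities of measuring 0 or 1
print(probs)                     # [0.5 0.5]
```

Simulating n qubits classically requires tracking 2^n amplitudes, which is one intuition for why quantum machines cannot be efficiently emulated at scale.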
Among the promises of quantum computing are, for example:
- Faster resolution of certain problems thanks to quantum algorithms.
- The development of new cryptographic tasks that are impossible to perform with classical computers.
- The improvement of quantum computer networks for more secure communications.
Anthony Leverrier, an Inria researcher at the Inria Paris centre and leader of the NISQ2LSQ project within the PEPR Quantum, then looked in more detail at the error-correcting codes currently being developed. Two types of error commonly affect quantum computations (bit-flips and phase-flips), so a combination of two correction codes is needed to remedy them. The most promising codes today are bosonic codes, developed in particular by the start-up Alice & Bob, and LDPC codes. However, the search for an ideal architecture is still ongoing.
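As a loose intuition for error correction (a classical analogy, not one of the quantum codes discussed above): a 3-bit repetition code protects a bit by majority vote. Quantum codes are far subtler, since qubits cannot simply be copied and both bit-flip and phase-flip errors must be handled, but the redundancy idea carries over.

```python
import random

# Toy classical analogue of error correction: a 3-bit repetition code.
def encode(bit):
    return [bit, bit, bit]

def noisy_channel(bits, p_flip):
    # Each bit is flipped independently with probability p_flip.
    return [b ^ 1 if random.random() < p_flip else b for b in bits]

def decode(bits):
    return 1 if sum(bits) >= 2 else 0   # majority vote

# With a 10% flip rate, an unprotected bit errs ~1000 times in 10,000 trials;
# the repetition code only fails when 2+ of the 3 copies flip (~2.8% of trials).
random.seed(0)
errors = sum(decode(noisy_channel(encode(0), 0.1)) for _ in range(10_000))
print(errors)
```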
Eleni Diamanti, CNRS research director at LIP6 and leader of the PEPR’s QCommTestbed project, concluded this session with a presentation on quantum communication networks. According to her, ‘the world is slowly moving towards an era of quantum connectivity’. To achieve this goal, it is necessary to establish non-local quantum connections and to scale these networks over long distances via repeaters, which relay quantum information. These developments are based on the principle of quantum entanglement, a term describing the phenomenon whereby two particles form a single system and become dependent on each other. Today in France, several connections of this type exist between laboratories in Paris and the Saclay plateau, as well as between these and Nice.
Neuromorphic computing
The conference resumed in the afternoon with a session led by Marwen Belkaid, professor at CY Cergy Paris University, and Elena Ioana Vatajelu, CNRS researcher at TIMA. The topic was neuromorphic computing, which explores new computing architectures that mimic the functioning of the brain.
Marwen Belkaid explained how certain physiological processes, such as economic decision-making, can be modelled in artificial neural networks, themselves inspired by the functioning of biological neurons. More generally, neuromorphic computing builds on various computational principles borrowed from the brain: distributed computing, distributed coding, learning, redundancy, competition and neuromodulation.
Elena Ioana Vatajelu discussed the increase in performance of artificial-intelligence-based computing systems since the 2010s. At the same time, we have witnessed the emergence of connected objects, while the materials needed to manufacture them are becoming increasingly scarce, not to mention their high energy costs. Today, we face a major challenge: achieving high computing performance while limiting energy consumption. Added to this is the immense and ever-growing amount of data to be handled, which in turn requires growth in the memory capacity to store it. Storage also raises security questions, driving research into how best to protect the data.
Information at the heart of molecules: from storage to DNA computing
For this final presentation on new computing paradigms, the conference welcomed Marc Antonini, programme director of the MoleculArXiv PEPR for the CNRS. He also highlighted the incredible amount of data that exists today, estimated at 175 zettabytes* in 2025, a figure that is growing exponentially. To store all this data, numerous data centres have been built and are still being built. However, their proliferation is leading to an increase in environmental costs: carbon emissions, massive water consumption and waste generation during data migration. This is why solutions aimed at improving data storage are currently being developed.
Among the potential candidates is DNA. A molecular storage medium would have a very small footprint: compact, durable and secure. But how does it work? The process envisaged here converts binary information into a quaternary code (one symbol for each of the four nitrogenous bases** of DNA) written into synthesised DNA, which is then encapsulated to protect it and preserve the encoded data. Despite promising density and durability, writing this code is currently very slow. To overcome this difficulty, the exploratory PEPR MoleculArXiv was launched in 2022, with the objectives of storing massive amounts of data on DNA and polymers and accelerating the current DNA write/read cycle by a factor of 100, while reducing its cost by an equivalent factor.
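The binary-to-quaternary conversion can be sketched with a naive mapping of two bits per base (an illustrative toy, not the actual MoleculArXiv encoding, which must also respect biochemical constraints such as avoiding long runs of the same base):

```python
# Naive 2-bits-per-base mapping from binary data to the four DNA bases.
BITS_TO_BASE = {"00": "A", "01": "C", "10": "G", "11": "T"}
BASE_TO_BITS = {base: bits for bits, base in BITS_TO_BASE.items()}

def encode_bytes(data: bytes) -> str:
    """Convert binary data to a DNA strand, 4 bases per byte."""
    bits = "".join(f"{byte:08b}" for byte in data)
    return "".join(BITS_TO_BASE[bits[i:i + 2]] for i in range(0, len(bits), 2))

def decode_strand(strand: str) -> bytes:
    """Recover the original bytes from a DNA strand."""
    bits = "".join(BASE_TO_BITS[base] for base in strand)
    return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))

strand = encode_bytes(b"CNRS")
print(strand)                            # CAATCATGCCAGCCAT
assert decode_strand(strand) == b"CNRS"  # lossless round trip
```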
Nicolas Schabanel, CNRS research director at LIP, concluded this session by presenting the possibility of calculating directly on DNA data. To do this, a computer that calculates with DNA must be built, and a dedicated algorithm developed. In this field, inspiration is drawn from nature by taking systems that work and recreating them as needed. Currently, no fewer than ten laboratories are working on the creation of such a machine.
Computing: a current and future challenge
The day continued with a lecture given by Frédéric Worms, Director of the École Normale Supérieure (ENS) – PSL. He was invited to share his philosophical reflections on calculation, as well as his hypothesis on the evolution of this concept over time.
The day's speakers then took part in a debate moderated by Marian Scuturici, Deputy Scientific Director at CNRS Informatics. Topics included the language of computing paradigms, the limits of current computing, the impact of artificial intelligence on the sector, and the interests of industry.
Finally, Adeline Nazarenko, Director of CNRS Informatics, closed the conference by emphasising the need to highlight this theme. This has recently resulted in the creation of a dedicated GDR (research group), the launch of several PEPRs (priority research programmes) and the publication this year of the book Le Calcul à découvert (Computing Uncovered) by CNRS Éditions.
// More to come in a few days when the videos of the event are online! //
Footnotes:
*1 zettabyte = 10²¹ bytes
**A (adenine), C (cytosine), G (guanine) and T (thymine)
Copyright banner image: © Cyril FRESILLON / IDRIS / CNRS Images


