Update 04/01 — €699 is the new floor price for the MacBook Air M2. It was already available at this price from Boulanger midweek; now it is Darty's turn to offer it, in partnership with Rakuten. To take advantage of the offer, simply enter the code RAKUTEN50 when ordering. The transaction is handled by Rakuten, but delivery is carried out by Darty.
This configuration comes with 16 GB of RAM and 256 GB of storage. An offer not to be missed if you are looking for a Mac at a low price.
Update 26/12 — Boulanger is continuing its double discount on the Midnight MacBook Air M2, bringing it down to just €724, its lowest price yet. The machine is listed at €749, but once it is in the basket an extra €25 discount is applied.
MacBook Air M2 in Midnight. Image MacGeneration.
Launched in 2022, the MacBook Air M2 is very pleasant to use: it is light, silent, fast and long-lasting. Two generations have succeeded it, but the formula has not changed, so it remains perfectly relevant today. 16 GB of RAM is enough for everyday use. The 256 GB of storage may be too little for some, but that can be worked around with an external SSD.
Update 20/12 — Little by little, the MacBook Air M2 is closing in on the psychological €700 mark. In recent days, more and more short-lived offers have appeared between €720 and €750. Today the best one comes from the Rakuten / Darty duo: by entering the code DARTY10, you can get Apple's laptop for €739. It is a configuration with 16 GB of RAM and a 256 GB SSD. The transaction goes through Rakuten, but delivery is handled by Darty. Amazon, for its part, offers the same configuration for €749.
Update 15/12 — In 2026 Mac prices could rise again, but 2026 is still (a little) way off. Suffice it to say that we may not see a MacBook Air at €724 again any time soon! At this price you can get the MacBook Air M2 with 16 GB of RAM and 256 GB of storage from Boulanger. It is, of course, a brand-new model! To get it at this price, remember to enter the code NOEL25.
Update 11/12 — The MacBook Air M2 is on offer today at €749 from Boulanger! It is the same model: 16 GB of RAM and a 256 GB SSD.
Update 09/12 — Since Black Friday, prices have tended to creep back up on some Mac configurations. There are still good deals to be had, though! After being offered for a few days at €799, the MacBook Air M2 with 16 GB of RAM and 256 GB of storage is once again listed at €775. But the real surprise comes from Cdiscount, which has not settled for merely matching that price: the site has fired back with an even more aggressive counter-offer. With the code MBA25, the same MacBook Air M2 drops to €750, quite simply one of the best prices ever seen for this model.
The MacBook Air M4, for its part, is offered at €942.11. It had stayed at €899 for a long time.
Update 3/12 — Amazon has just cut the price of the 16 GB MacBook Air M2 again. It is now offered at €748!
Update 26/11 — Every day the MacBook Air M2 sheds a few more euros. It is now available for €773 on Amazon! To get it at this price, you need to activate the coupon that is offered.
Update 21/11 — The price of the MacBook Air M2 is falling again on Amazon. It is listed today at €798, but Amazon knocks off another €15 at checkout, which brings the MacBook Air M2 down to €783!
Update 14 November, 14:10 — The price of the MacBook Air M2 keeps tumbling: it can currently be had for €773 at Cdiscount. To get it, enter the code POMME25 at the payment step. This is the version with 256 GB of storage and 16 GB of RAM. The machine is sold and shipped by Cdiscount. Don't wait too long, as there is no telling how long the offer will stay online.
Original article — While Apple recently lowered the price of the 13-inch MacBook Air M4 to €1,099, there is still no truly low-cost Mac laptop in the line-up… at least not directly from Apple. Many resellers do, however, still sell the MacBook Air M2 in its variant with 16 GB of RAM and 256 GB of storage. And Amazon is even offering a (small) discount: it is at €798, its lowest price at Amazon¹.
The MacBook Air M2 in Midnight. Image MacGeneration
The machine launched in 2022 at €1,500 (with 8 GB of RAM), and it is still a capable laptop, with long battery life and silent operation, unlike the MacBook Pro M5, for example. The MacBook Air M4 obviously has a more modern, more powerful system-on-chip, but the M2 chip holds its own. It is the black (Midnight) version that is offered at this price, and it has only one flaw: it is (very) prone to fingerprints. For everything else, the MacBook Air M2 remains an excellent machine, especially at this price.
1. Let's be honest: it has been at €799 for a few weeks, but it is still a good deal that often goes unnoticed.
A sports federation wants to ban the term "man of the match", which rewards the best player after each game, deeming it "too exclusionary". The move shocked the panel of Les Grandes Gueules this Tuesday 4 November.
Using a new type of low-power, compact, fluid-based prism to steer the beam in a laser scanning microscope could transform brain imaging and help researchers learn more about neurological conditions such as Alzheimer’s disease.
“We quickly became interested in biological imaging, and work with a neuroscience group at University of Colorado Denver Anschutz Medical Campus that uses mouse models to study neuroscience,” Gopinath tells Physics World. “Neuroscience is not well understood, as illustrated by the neurodegenerative diseases that don’t have good cures. So a great benefit of this technology is the potential to study, detect and treat neurodegenerative diseases such as Alzheimer’s, Parkinson’s and schizophrenia,” she explains.
The researchers fabricated their patented electrowetting prism using custom deposition and lithography methods. The device consists of two immiscible liquids housed in a 5 mm tall, 4 mm diameter glass tube, with a dielectric layer on the inner wall coating four independent electrodes. When an electric field is produced by applying a potential difference between a pair of electrodes on opposite sides of the tube, it changes the surface tension and therefore the curvature of the meniscus between the two liquids. Light passing through the device is refracted by a different amount depending on the angle of tilt of the meniscus (as well as on the optical properties of the liquids chosen), enabling beams to be steered by changing the voltage on the electrodes.
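To get a feel for the numbers, here is a minimal sketch of how a tilted meniscus steers a vertically incident beam, using nothing more than Snell's law. The refractive indices are illustrative assumptions, not the actual liquids used by the team.

```python
import numpy as np

# Assumed indices of the two immiscible liquids (placeholders, not the
# values from the Optics Express paper).
N1, N2 = 1.33, 1.49

def steering_angle_deg(tilt_deg, n1=N1, n2=N2):
    """Deviation of a vertically incident beam crossing a meniscus
    tilted by tilt_deg, from Snell's law n1*sin(i) = n2*sin(t)."""
    theta_i = np.radians(tilt_deg)                   # incidence = meniscus tilt
    theta_t = np.arcsin(n1 / n2 * np.sin(theta_i))   # refraction angle
    return np.degrees(theta_i - theta_t)             # net beam deviation

for tilt in (0, 5, 10, 15):
    print(f"meniscus tilt {tilt:2d} deg -> beam steered {steering_angle_deg(tilt):.2f} deg")
```

The practical point is that the steering angle is set electrically, by the voltages shaping the meniscus, rather than by any moving mechanical part.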
Beam steering for scanning in imaging and microscopy can be achieved via several means, including mechanically controlled mirrors, glass prisms or acousto-optic deflectors (in which a sound wave is used to diffract the light beam). But, unlike the new electrowetting prisms, these methods consume too much power and are not small or lightweight enough to be used for miniature microscopy of neural activity in the brains of living animals.
In tests detailed in Optics Express, the researchers integrated their electrowetting prism into an existing two-photon laser scanning microscope and successfully imaged individual 5 µm-diameter fluorescent polystyrene beads, as well as large clusters of those beads.
They also used computer simulations to study how the liquid–liquid interface moves. When a sinusoidal voltage is used for actuation at 25 and 75 Hz, standing-wave resonance modes occur at the meniscus – a result closely matched by a subsequent experiment that showed resonances at 24 and 72 Hz. These resonance modes are important for enhancing device performance: they increase the angle through which the meniscus can tilt, and thus the range of angles through which optical beams can be steered, which helps minimize distortions when raster scanning in two dimensions.
Bright explains that this research built on previous work in which an electrowetting prism was used in a benchtop microscope to image a mouse brain. He cites seeing the individual neurons as a standout moment that, coupled with the current results, shows their prism is now “proven and ready to go”.
Gopinath and Bright caution that “more work is needed to allow human brain scans, such as limiting voltage requirements, allowing the device to operate at safe voltage levels, and miniaturization of the device to allow faster scan speeds and acquiring images at a much faster rate”. But they add that miniaturization would also make the device useful for endoscopy, robotics, chip-scale atomic clocks and space-based communication between satellites.
The team has already begun investigating two other potential applications: LiDAR (light detection and ranging) systems and optical coherence tomography (OCT). Next, the researchers “hope to integrate the device into a miniaturized microscope to allow imaging of the brain in freely moving animals in natural outside environments,” they say. “We also aim to improve the packaging of our devices so they can be integrated into many other imaging systems.”
Marking 100 years since the advent of quantum mechanics, IYQ aims to raise awareness of the impact of quantum physics and its myriad future applications, with a global diary of quantum-themed public talks, scientific conferences, industry events and more.
You can find out more about the contributions of Indian physicist Satyendra Nath Bose to quantum science; explore weird phenomena such as causal order and quantum superposition; and discover the latest applications of quantum computing.
A century after quantum mechanics was first formulated, many physicists remain undecided on some of the most basic foundational questions. There is no agreement on which interpretation of quantum mechanics is correct; on whether the wavefunction is merely a mathematical tool or a true representation of reality; or on what impact an observer has on a quantum state.
Some of the biggest unanswered questions in physics – such as finding the quantum/classical boundary or reconciling gravity and quantum mechanics – lie at the heart of these conundrums. So as we look to the future of quantum – from its fundamentals to its technological applications – let us hope that some answers to these puzzles will become apparent as we crack the quantum code to our universe.
This October, publisher Michel Lafon unveils a brand-new webtoon: Just Twilight by Kang Ki and Woo Jihye. Better still, the first two volumes are being released simultaneously.
Just Twilight – volumes 1 and 2 released
The collector's edition of volume 1 comes with an exclusive metallic cover, four polaroid-format ex-libris prints, a sheet of stickers and a bookmark!
Joon-yeong, the top student at her high school, is desperately looking for a refuge where she can study, far from the family home she is fleeing. By chance she discovers an abandoned house in the forest, ideal for working in peace. But the place is already occupied: Beom-jin, a classmate with a bad reputation, is hiding there too. A strange cohabitation develops between these two students who seem complete opposites, united by the desire to protect their secret haven. And in this place cut off from the world, something unexpected may well blossom.
Volumes 1 and 2 were released simultaneously on 16 October. The first volume is available for €16.95, while volume 2 is priced at €14.95.
In the long-running series "Since I retired, I no longer know what day it is", I have just been called to order (just as well – thanks, Daniel, and thanks, Ysengrain): I forgot the Open Bar at the start of this month. And it's not the first time! Well, I had a neurological exam to be sure: no cognitive degeneration, and all my memories are fine.
The plant microbiome? The plant holobiont? Little-known concepts, yet crucial for plant health and sustainable agriculture. Ecologist Philippe Vandenkoornhuyse, world-renowned for helping to reveal their role, explains.
Unless you’ve been living under a stone, you can’t have failed to notice that 2025 marks the first 100 years of quantum mechanics. A massive milestone, to say the least, about which much has been written in Physics World and elsewhere in what is the International Year of Quantum Science and Technology (IYQ). However, I’d like to focus on a specific piece of quantum technology, namely quantum computing.
I keep hearing about quantum computers, so people must be using them to do cool things, and surely they will soon be as commonplace as classical computers. But as a physicist-turned-engineer working in the aerospace sector, I struggle to get a clear picture of where things are really at. If I ask friends and colleagues when they expect to see quantum computers routinely used in everyday life, I get answers ranging from “in the next two years” to “maybe in my lifetime” or even “never”.
Before we go any further, it's worth reminding ourselves that quantum computing relies on several key quantum properties. The first is superposition, which gives rise to the basic building block of a quantum computer: the quantum bit, or qubit. A qubit exists as a combination of the 0 and 1 states at the same time and is represented by a probabilistic wave function. Classical computers, in contrast, use binary digital bits that are either 0 or 1.
Also vital for quantum computers is the notion of entanglement, which is when two or more qubits are co-ordinated, allowing them to share their quantum information. In a highly correlated system, a quantum computer can explore many paths simultaneously. This massively parallel processing is how quantum computers may solve certain problems exponentially faster than classical ones.
The other key phenomenon for quantum computers is quantum interference. The wave-like nature of qubits means that when different probability amplitudes are in phase, they combine constructively to increase the likelihood of the right solution. Conversely, destructive interference occurs when amplitudes are out of phase, making it less likely to get the wrong answer.
Quantum interference is important in quantum computing because it allows quantum algorithms to amplify the probability of correct answers and suppress incorrect ones, making calculations much faster. Along with superposition and entanglement, it means that quantum computers could process and store vast numbers of probabilities at once, outstripping even the best classical supercomputers.
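As a concrete illustration of the amplitude arithmetic behind these three ideas, here is a toy state-vector sketch in Python with NumPy. It is not tied to any real quantum hardware or vendor library; it simply shows superposition, interference and entanglement acting on small complex vectors.

```python
import numpy as np

ket0 = np.array([1, 0], dtype=complex)                       # the |0> state
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)  # Hadamard gate

# Superposition: H|0> is an equal mix of |0> and |1>
superposed = H @ ket0
print(np.abs(superposed)**2)          # [0.5 0.5] -> 50/50 measurement odds

# Interference: a second H makes the |1> amplitudes cancel (destructive)
# while the |0> amplitudes reinforce (constructive)
back = H @ superposed
print(np.round(np.abs(back)**2, 12))  # [1. 0.] -> outcome |0> is certain

# Entanglement: CNOT on (H|0>) x |0> gives the Bell state (|00>+|11>)/sqrt(2),
# which cannot be factored into two independent single-qubit states
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)
bell = CNOT @ np.kron(H @ ket0, ket0)
print(np.round(np.abs(bell)**2, 3))   # [0.5 0. 0. 0.5]
```

The same bookkeeping, scaled up to vast numbers of amplitudes, is what quantum algorithms exploit when they amplify right answers and suppress wrong ones.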
Towards real devices
To me, it all sounds exciting, but what have quantum computers ever done for us so far? It’s clear that quantum computers are not ready to be deployed in the real world. Significant technological challenges need to be overcome before they become fully realisable. In any case, no-one is expecting quantum computers to displace classical computers “like for like”: they’ll both be used for different things.
Yet it seems that the very essence of quantum computing is also its Achilles heel. Superposition, entanglement and interference – the quantum properties that will make it so powerful – are also incredibly difficult to create and maintain. Qubits are extremely sensitive to their surroundings and easily lose their quantum state through interactions with the environment, whether via stray particles, electromagnetic fields or thermal fluctuations. This process, known as decoherence, makes quantum computers prone to error.
That’s why quantum computers need specialized – and often cryogenically controlled – environments to maintain the quantum states necessary for accurate computation. Building a quantum system with lots of interconnected qubits is therefore a major, expensive engineering challenge, with complex hardware and extreme operating conditions. Developing “fault-tolerant” quantum hardware and robust error-correction techniques will be essential if we want reliable quantum computation.
As for the development of software and algorithms for quantum systems, there’s a long way to go, with a lack of mature tools and frameworks. Quantum algorithms require fundamentally different programming paradigms to those used for classical computers. Put simply, that’s why building reliable, real-world deployable quantum computers remains a grand challenge.
What does the future hold?
Despite the huge amount of work that still lies in store, quantum computers have already demonstrated some amazing potential. The US firm D-Wave, for example, claimed earlier this year to have carried out simulations of quantum magnetic phase transitions that wouldn’t be possible with the most powerful classical devices. If true, this was the first time a quantum computer had achieved “quantum advantage” for a practical physics problem (whether the problem was worth solving is another question).
There is also a lot of research and development going on around the world into solving the qubit stability problem. At some stage, there will likely be a breakthrough design for robust and reliable quantum computer architecture. There is probably a lot of technical advancement happening right now behind closed doors.
The first real-world applications of quantum computers will be akin to the giant classical supercomputers of the past. If you were around in the 1980s, you’ll remember Cray supercomputers: huge, inaccessible beasts owned by large corporations, government agencies and academic institutions to enable vast amounts of calculations to be performed (provided you had the money).
And, if I believe what I read, quantum computers will not replace classical computers, at least not initially, but work alongside them, as each has its own relative strengths. Quantum computers will be suited for specific and highly demanding computational tasks, such as drug discovery, materials science, financial modelling, complex optimization problems and increasingly large artificial intelligence and machine-learning models.
These are all things beyond the limits of classical computer resource. Classical computers will remain relevant for everyday tasks like web browsing, word processing and managing databases, and they will be essential for handling the data preparation, visualization and error correction required by quantum systems.
And there is one final point to mention: cyber security. Quantum computing poses a major threat to existing encryption methods, with the potential to undermine widely used public-key cryptography. There are concerns that hackers are already storing stolen data in anticipation of future quantum decryption.
Having looked into the topic, I can now see why the timeline for quantum computing is so fuzzy and why I got so many different answers when I asked people when the technology would be mainstream. Quite simply, I still can’t predict how or when the tech stack will pan out. But as IYQ draws to a close, the future for quantum computers is bright.
Modular and scalable: the ICE-Q cryogenics platform delivers the performance and reliability needed for professional computing environments while also providing a flexible and extendable design. The standard configuration includes a cooling module, a payload with a large sample space, and a side-loading wiring module for scalable connectivity (Courtesy: ICEoxford)
At the centre of most quantum labs is a large cylindrical cryostat that keeps the delicate quantum hardware at ultralow temperatures. These cryogenic chambers have expanded to accommodate larger and more complex quantum systems, but the scientists and engineers at UK-based cryogenics specialist ICEoxford have taken a radical new approach to the challenge of scalability. They have split the traditional cryostat into a series of cube-shaped modules that slot into a standard 19-inch rack mount, creating an adaptable platform that can easily be deployed alongside conventional computing infrastructure.
“We wanted to create a robust, modular and scalable solution that enables different quantum technologies to be integrated into the cryostat,” says Greg Graf, the company’s engineering manager. “This approach offers much more flexibility, because it allows different modules to be used for different applications, while the system also delivers the efficiency and reliability that are needed for operational use.”
The standard configuration of the ICE-Q platform has three separate modules: a cryogenics unit that provides the cooling power, a large payload for housing the quantum chip or experiment, and a patent-pending wiring module that attaches to the side of the payload to provide the connections to the outside world. Up to four of these side-loading wiring modules can be bolted onto the payload at the same time, providing thousands of external connections while still fitting into a standard rack. For applications where space is not such an issue, the payload can be further extended to accommodate larger quantum assemblies and potentially tens of thousands of radio-frequency or fibre-optic connections.
The cube-shaped form factor provides much improved access to these external connections, whether for designing and configuring the system or for ongoing maintenance work. The outer shell of each module consists of panels that are easily removed, offering a simple mechanism for bolting modules together or stacking them on top of each other to provide a fully scalable solution that grows with the qubit count.
The flexible design also offers a more practical solution for servicing or upgrading an installed system, since individual modules can be simply swapped over as and when needed. “For quantum computers running in an operational environment it is really important to minimize the downtime,” says Emma Yeatman, senior design engineer at ICEoxford. “With this design we can easily remove one of the modules for servicing, and replace it with another one to keep the system running for longer. For critical infrastructure devices, it is possible to have built-in redundancy that ensures uninterrupted operation in the event of a failure.”
Other features have been integrated into the platform to make it simple to operate, including a new software system for controlling and monitoring the ultracold environment. “Most of our cryostats have been designed for researchers who really want to get involved and adapt the system to meet their needs,” adds Yeatman. “This platform offers more options for people who want an out-of-the-box solution and who don’t want to get hands on with the cryogenics.”
Such a bold design choice was enabled in part by a collaborative research project with Canadian company Photonic Inc, funded jointly by the UK and Canada, that was focused on developing an efficient and reliable cryogenics platform for practical quantum computing. That R&D funding helped to reduce the risk of developing an entirely new technology platform that addresses many of the challenges that ICEoxford and its customers had experienced with traditional cryostats. “Quantum technologies typically need a lot of wiring, and access had become a real issue,” says Yeatman. “We knew there was an opportunity to do better.”
However, converting a large cylindrical cryostat into a slimline and modular form factor demanded some clever engineering solutions. Perhaps the most obvious was creating a frame that allows the modules to be bolted together while still remaining leak tight. Traditional cryostats are welded together to ensure a leak-proof seal, but for greater flexibility the ICEoxford team developed an assembly technique based on mechanical bonding.
The side-loading wiring module also presented a design challenge. To squeeze more wires into the available space, the team developed a high-density connector for the coaxial cables to plug into. An additional cold-head was also integrated into the module to pre-cool the cables, reducing the overall heat load generated by such large numbers of connections entering the ultracold environment.
Flexible for the future: the outer shell of the modules is covered with removable panels that make it easy to extend or reconfigure the system (Courtesy: ICEoxford)
Meanwhile, the speed of the cooldown and the efficiency of operation have been optimized by designing a new type of heat exchanger that is fabricated using a 3D printing process. “When warm gas is returned into the system, a certain amount of cooling power is needed just to compress and liquefy that gas,” explains Kelly. “We designed the heat exchangers to exploit the returning cold gas much more efficiently, which enables us to pre-cool the warm gas and use less energy for the liquefaction.”
The initial prototype has been designed to operate at 1 K, which is ideal for the photonics-based quantum systems being developed by ICEoxford's research partner. But the modular nature of the platform allows it to be adapted to diverse applications, with a second project now under way with the Rutherford Appleton Lab to develop a module that will be used at the forefront of the global hunt for dark matter.
Already on the development roadmap are modules that can sustain temperatures as low as 10 mK – which is typically needed for superconducting quantum computing – and a 4 K option for trapped-ion systems. “We already have products for each of those applications, but our aim was to create a modular platform that can be extended and developed to address the changing needs of quantum developers,” says Kelly.
As these different options come onstream, the ICEoxford team believes that it will become easier and quicker to deliver high-performance cryogenic systems that are tailored to the needs of each customer. “It normally takes between six and twelve months to build a complex cryogenics system,” says Graf. “With this modular design we will be able to keep some of the components on the shelf, which would allow us to reduce the lead time by several months.”
More generally, the modular and scalable platform could be a game-changer for commercial organizations that want to exploit quantum computing in their day-to-day operations, as well as for researchers who are pushing the boundaries of cryogenics design with increasingly demanding specifications. “This system introduces new avenues for hardware development that were previously constrained by the existing cryogenics infrastructure,” says Kelly. “The ICE-Q platform directly addresses the need for colder base temperatures, larger sample spaces, higher cooling powers, and increased connectivity, and ensures our clients can continue their aggressive scaling efforts without being bottlenecked by their cooling environment.”
You can find out more about the ICE-Q platform by contacting the ICEoxford team at iceoxford.com, or via email at sales@iceoxford.com. They will also be presenting the platform at the UK’s National Quantum Technologies Showcase in London on 7 November, with a further launch at the American Physical Society meeting in March 2026.
Due to government shutdown restrictions currently in place in the US, the researchers who headed up this study have not been able to comment on their work.
Laser plasma acceleration (LPA) may be used to generate multi-gigaelectronvolt muon beams, according to physicists at the Lawrence Berkeley National Laboratory (LBNL) in the US. Their work might help in the development of ultracompact muon sources for applications such as muon tomography – which images the interior of large objects that are inaccessible to X-ray radiography.
Muons are charged subatomic particles that are produced in large quantities when cosmic rays collide with atoms 15–20 km high up in the atmosphere. Muons have the same properties as electrons but are around 200 times heavier. This means they can travel much further through solid structures than electrons. This property is exploited in muon tomography, which analyses how muons penetrate objects and then exploits this information to produce 3D images.
The technique is similar to X-ray tomography used in medical imaging, with the cosmic-ray radiation taking the place of artificially generated X-rays and muon trackers the place of X-ray detectors. Indeed, depending on their energy, muons can traverse metres of rock or other materials, making them ideal for imaging thick and large structures. As a result, the technique has been used to peer inside nuclear reactors, pyramids and volcanoes.
As many as 10,000 muons from cosmic rays reach each square metre of the Earth's surface every minute. These naturally produced particles have unpredictable properties, however, and they arrive only from near-vertical directions. This fixed directionality means that it can take months to accumulate enough data for tomography.
Another option is to use the large numbers of low-energy muons that can be produced in proton accelerator facilities by smashing a proton beam onto a fixed carbon target. However, these accelerators are large and expensive facilities, limiting their use in muon tomography.
A new compact source
Physicists led by Davide Terzani have now developed a new compact muon source based on LPA-generated electron beams. Such a source, if optimized, could be deployed in the field and could even produce muon beams in specific directions.
In LPA, an ultra-intense, ultra-short, and tightly focused laser pulse propagates into an “under-dense” gas. The pulse’s extremely high electric field ionizes the gas atoms, freeing the electrons from the nuclei, so generating a plasma. The ponderomotive force, or radiation pressure, of the intense laser pulse displaces these electrons and creates an electrostatic wave that produces accelerating fields orders of magnitude higher than what is possible in the traditional radio-frequency cavities used in conventional accelerators.
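To put a rough number on "orders of magnitude higher", the cold, nonrelativistic wave-breaking limit $E_0 = m_e c \omega_p / e$ is a standard yardstick for the field a plasma wave can sustain. The sketch below evaluates it for an assumed, typical LPA electron density, not a figure quoted for the LBNL experiment.

```python
import numpy as np

# Wave-breaking limit E0 = m_e * c * omega_p / e for a plasma wave.
E_CHARGE = 1.602176634e-19   # electron charge, C
M_E = 9.1093837015e-31       # electron mass, kg
C = 2.99792458e8             # speed of light, m/s
EPS0 = 8.8541878128e-12      # vacuum permittivity, F/m

def wave_breaking_field(n_e_cm3):
    """Wave-breaking field in V/m for an electron density in cm^-3."""
    n_e = n_e_cm3 * 1e6                                  # convert to m^-3
    omega_p = np.sqrt(n_e * E_CHARGE**2 / (EPS0 * M_E))  # plasma frequency
    return M_E * C * omega_p / E_CHARGE

print(f"E0 ~ {wave_breaking_field(1e17)/1e9:.0f} GV/m")  # ~30 GV/m
```

A gradient of this order is consistent with reaching 10 GeV in a 30 cm target (about 33 GV/m on average), whereas conventional RF cavities are limited to roughly a thousandth of that.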
An LPA thus offers all the advantages of an ultra-compact electron accelerator, allowing muon production in a small facility such as BELLA, where Terzani and his colleagues work. Indeed, in their experiment they succeeded in generating a 10 GeV electron beam in a 30 cm gas target for the first time.
The researchers collided this beam with a dense converter target such as tungsten. The target slows the beam, causing it to emit bremsstrahlung, or braking radiation, which interacts with the material to produce secondary products including lepton–antilepton pairs, such as electron–positron and muon–antimuon pairs. Behind the converter target this yields a short-lived burst of muons that propagates roughly along the axis of the incoming electron beam. Thick concrete shielding then filters out most of the secondary products, letting the majority of muons pass through.
Crucially, Terzani and colleagues were able to separate the muon signal from the large radiation background – something that can be difficult to do because of the inherent inefficiency of the muon production process. This allowed them to identify two different muon populations coming from the accelerator: a collimated, forward-directed population generated by pair production, and a low-energy, isotropic population generated by meson decay.
Many applications
Muons can be used in a range of fields, from imaging to fundamental particle physics. As mentioned, muons from cosmic rays are currently used to inspect large, thick objects not accessible to regular X-ray radiography – a recent example being the discovery of a hidden chamber in Khufu's Pyramid. They can also be used to image the core of a burning blast furnace or nuclear waste storage facilities.
While the new LPA-based technique cannot yet produce muon fluxes suitable for particle physics experiments – to replace a muon injector, for example – it could offer the accelerator community a convenient way to test and develop essential elements towards making a future muon collider.
The experiment in this study, which is detailed in Physical Review Accelerators and Beams, focused on detecting the passage of muons, unequivocally proving their signature. The researchers conclude that they now have a much better understanding of the source of these muons.
Unfortunately, the original programme that funded this research has ended, so future studies are limited for the moment. Undeterred, the researchers say they strongly believe in the potential of LPA-generated muons and are working on resuming some of their experiments. For example, they aim to measure the flux and spectrum of the resulting muon beam using completely different detection techniques, based for instance on ultra-fast particle trackers.
The LBNL team also wants to explore different applications, such as imaging deep ore deposits – something that will be quite challenging because it poses strict limitations on the minimum muon energy required to penetrate soil. Therefore, they are looking into how to increase the muon energy of their source.
What is the eufy SoloCam E42 2-camera kit worth?
As you know, at Vonguru we love everything home automation, and eufy always gives us plenty of choice in that department! Today the spotlight is on security, with the new SoloCam E42 2-camera kit. On the menu: 4K UHD resolution, 360° coverage with no blind spots, AI detection and smart tracking!
Let's see what this new kit, supplied with its HomeBase 3, is worth. You will find it priced at €479 (excluding promotions) directly on the brand's website or on Amazon. On with the review!
Unboxing
The box sports the brand's characteristic blue and white colours, with the brand and model name on the front – here, eufy SoloCam E42 2-Cam Kit – along with a few marketing points in list form and some pictograms, plus a visual of the two cameras alongside their HomeBase.
The left side details the absence of any subscription and the local storage of your data, something we really appreciate about eufy, while the right side shows a visual of an outdoor area monitored by the camera, a few features listed in English and a reminder of the app to download, eufy Security.
Technical specifications
Recommended use: Outdoor security
Brand: eufy Security
Model name: Solo Cam E42
Connectivity technology: Wireless
Indoor/outdoor use: Outdoor
Connectivity protocol: WLAN
Mounting type: Wall mount
Video recording resolution: 4K
Colour: White
Number of items: 2
Features
Ultimate sharpness – with true 4K UHD resolution, this camera misses no detail. It can even make out a licence plate from up to 10 m away.
AI detection and smart tracking – the built-in AI instantly detects movement and automatically tracks people, vehicles or other important events within view. False alarms are minimised and your property stays secure.
360° coverage with no blind spots – the wide viewing angle provides complete coverage, minimises blind spots and lets you keep an eye on your doorstep, entrance or garden.
Motion-triggered siren – protect your home with a powerful strobe light, triggered by movement, that scares off unwanted visitors and notifies you instantly of any unusual behaviour.
Peace of mind with SolarPlus 2.0 technology – two hours of direct sunlight are enough to keep the camera running all day: continuous, maintenance-free use in all weather conditions.
No monthly fees – insert a 128 GB microSD card to store your footage privately and avoid subscription fees. Note: microSD card not included.
Box contents
Two wireless security cameras with solar panels.
Two adjustable mounting brackets.
Screws and fittings for wall mounting.
Cables for the initial charge.
HomeBase 3
Region-specific mains plugs for the HomeBase
Connection cable for the HomeBase
Stickers
Installation
Let's start with the installation, both hardware and software, which in both cases holds few surprises for us by now. One reminder, though: check your Wi-Fi range carefully BEFORE drilling into your wall, because yes, you will have to drill.
Preparation:
Charge the cameras via USB-C.
Choose a clear outdoor spot, 2–3 m up, with good sun exposure if you are using the built-in solar panels.
Mounting:
Mount the cameras with the supplied brackets.
Angle them slightly downwards for better coverage and fewer false alerts.
Wi-Fi connection:
Install the eufy Security app and create an account if you don't already have one.
Add the cameras: press the SYNC button, scan the QR code in the app and connect them to 2.4 GHz Wi-Fi.
Settings in the app:
Rename each camera.
Set up detection zones and notifications (people, vehicles, animals).
Choose the resolution and storage (microSD).
Checks:
Verify the live feed, motion detection and night vision.
Test & app
When you think hassle-free outdoor security, you are after a sharp image, reliable alerts, no subscription, and an installation that doesn't turn your wall into a building site, whether you own or rent. At first glance, the eufy SoloCam E42 2-camera kit ticks a lot of those boxes: 4K resolution, solar/battery power, AI detection and an included hub (HomeBase S380) for going further. I already own eufy indoor and outdoor cameras, and I was very keen to test these new models.
Incidentally, since I already had a HomeBase, it still sits in my server rack, nice and cosy alongside its companions.
The installation is covered, so let's talk about everything else. On the image side, eufy does not disappoint. The SoloCam E42's 4K sensor delivers a really excellent picture, both day and night. Details are sharp, colours balanced, and compression does not degrade the stream, even on a standard Wi-Fi network. Colour night vision is also on hand, thanks to a built-in LED spotlight: the camera switches automatically between IR and colour depending on the light, guaranteeing constant visibility.
eufy includes smart motion detection that distinguishes between people, vehicles and animals. No more pointless notifications every time a leaf moves or an insect passes in front of the lens. Accuracy is excellent, especially for a camera that can also operate without a base station. Alerts reach your smartphone quickly, accompanied by a short video stored locally on a microSD card (up to 128 GB). And above all: no subscription is required. Everything is stored and managed on-device.
Another nice surprise: two-way talk. From the app, you can speak directly through the camera's loudspeaker – handy for answering a delivery driver or deterring an intruder. The E42 also packs a siren and a light flash, triggered automatically or manually. An effective combo for scaring off anyone who gets a little too close.
The SoloCam E42's big strength is its autonomy. In normal use the battery lasts several months, and the built-in solar panel keeps it continuously topped up. In practice, even in overcast weather, the camera maintains a stable battery level. For those who don't fancy climbing a ladder every three months, it is a real luxury. For what it's worth, my 4-camera S330 kit has NEVER needed recharging: the panels do the job all the time.
After a few days of use, you forget the system is there. Notifications are relevant, the video feed is quick and the app is perfectly smooth. eufy has struck an excellent balance between ergonomics, performance and peace of mind. One regret, though: these new cameras do not offer dome-style pan-and-tilt coverage, something we hope to see from eufy in the coming months on new models, as it is really the only feature missing from these high-end cameras.
I also like that the cameras integrate the panels directly, as on other models from the brand we have already tested; but to feed all this concentrated technology, more battery capacity is presumably needed, hence the larger panel. A panel this size can also be positioned elsewhere if the chosen spot gets little sun. It is up to you to decide what suits your needs and aesthetic preferences.
Conclusion
The eufy SoloCam E42 2-camera kit ticks all the boxes: 4K image, solar autonomy, installation in a few minutes and secure local storage. It is an ideal solution for anyone who wants to protect their home without burdening themselves with a complex system. Given its 360° rotation, mounting it on a pole, for example, lets you exploit it fully.
In short, this camera kit works really very well – autonomous, discreet, and doing exactly what you expect of it, with no extra fees, a straightforward installation and a well-honed app. The price is nonetheless steep: at €479 excluding promotions, directly from the brand's website or on Amazon, it is not within everyone's reach.
Review of the Withings ScanWatch 2 2025 smartwatch
We told you about it when the brand announced it: the ScanWatch 2 2025 is now available in shops. As a reminder, it is a hybrid smartwatch with 30 days of battery life. Through its app, this model also offers menstrual-cycle tracking, not to mention the various readings for keeping an eye on heart health, daily step count and body temperature, or for following a sleep-quality score.
The ScanWatch 2 2025 is available now for €349.95.
So what is this new ScanWatch like day to day? Find out in this review!
Unboxing
The ScanWatch 2 2025 comes in a white box with the watch pictured on it – a box identical to that of the ScanWatch 2 released in 2023. On the back are a few additional details about compatibility (iPhone, iPad, smartphones running Android 10 or later), along with various technical and recycling information. On the sides, the brand highlights its Withings App.
As soon as you open this little box, you discover the watch seated on a cardboard base, alongside:
a charging dock
and its USB-C to USB-A cable
a setup guide
a promotional card for Withings products
Technical specifications
Test
Getting started
Setting up the ScanWatch 2 2025 for the first time is fairly quick and easy. Put the watch on your wrist, having first downloaded the Withings app (the same one as for the brand's other products). Remember to enable Bluetooth and location on your smartphone, then launch the app. Using the button at the top right of the app, you can set up the watch. In under five minutes, everything is ready to go!
The "hardest" part, if anything, is setting the right time on the watch if it is not perfectly set by default. You then have to turn the hands from within the app to set the time live, but that remains a detail.
One small negative point about the watch out of the box: the strap was slightly marked by the packaging. The mark fades after a few days of wear, but it was still faintly visible after ten days on the wrist. That's a bit of a shame.
Daily use
Day to day, the ScanWatch 2 2025 is fairly comfortable to use and wear. The strap does, however, tend to mark the skin a little, and if I loosen it even slightly, it is no longer tight enough for me. A strap in another material may therefore be worth considering for everyday wear. Then again, this snug fit is also what lets the watch cope with immersion, so it depends on your daily use and whether you want to keep it on in the shower or at the pool.
In use, I noticed that this model tended to irritate my skin, unlike the original ScanWatch. If I wear it for several days in a row without taking it off, I end up with redness and irritation on my wrist (under the case and also under the silicone strap). This makes me think a leather strap may be the way to go if you are affected by the same issue.
On this model, unlike Fitbit watches and trackers for example, the screen remains perfectly readable outdoors, whatever the sunlight!
Otherwise, the ScanWatch 2 2025 offers a good number of interesting features for tracking your overall health. As mentioned above, you can check your daily step count and heart rate, but you can also take an electrocardiogram, check for breathing disturbances during sleep, or measure blood-oxygen levels. Along the same lines, a temperature reading is available in the app: the watch establishes a baseline temperature and can then tell you whether you are below or above that average.
It is an excellent ally for keeping an eye on your health, especially in our modern era of widespread remote working. Even so, treat the readings from the ScanWatch 2 2025 with caution and do not hesitate to see a doctor if in doubt. Technology can fail, so do take good care of yourself.
Through its app, the brand also offers menstrual-cycle tracking. It is a very interesting option for people who do not already have a dedicated app, all the more so since Withings is currently running an offer that includes the Clue app for one year. Personally, it is an app I have used for years, and I find that rather neat. On the Withings side, you can also log your periods and symptoms directly in the app. On top of that, directly on the watch (using the crown), you can record a few details about your cycle – quick and convenient when you don't have time to open the app.
Aesthetically speaking, the Withings smartwatch is very elegant – a far cry from the sporty look of Fitbit watches. Our review unit pairs dark blue tones with a pretty rose-gold accent, not to mention the dial with physical hands, which lends it a decidedly high-end look.
On top of that, should the fancy take you, many other straps are available on the brand's website, letting you change colour as the mood strikes. They are not cheap, however, ranging from €19.95 for the simplest to €49.95 for others (the leather ones in particular). That said, Fitbit does not sell its own for much less!
Battery life and charging
Having used Withings watches for several years now, I can assure you that the one-month battery life keeps its promise. It is really great not to spend your time recharging your watch.
App
The Withings app is genuinely comprehensive and intuitive. By default, on launch you land on a home screen listing most of the recorded data and giving access to notifications: step count, sleep time, average heart rate, weight and so on. Everything can be checked in no time!
Cycle tracking
In the cycle-tracking section, you can follow your periods, log them, and also add any symptoms you may feel using the little "+". The app also shows your current phase (follicular, ovulation, luteal), your fertility window and the dates of your next expected periods. In addition, based on your body temperature, the app is supposed to be able to detect ovulation.
Improve and share
Under "Improve", Withings provides programmes and tips for improving your health, while the "Share" section is more about health monitoring and sharing data with your doctors, among others.
Conclusion
In the end, we once again have a smartwatch packed with features, reminiscent of those found in Fitbit trackers and watches. It is pleasant to wear every day and, not being sporty in style at all, adds a welcome dose of elegance to your wrist. Watch out, though, if you have sensitive skin, which the materials may irritate; in that case, a strap in a natural material such as leather may be worth buying.
Otherwise, this model boasts a long battery life of one month. That is hugely satisfying: you never feel as if you are recharging it every thirty seconds.
We do regret finding the strap of our watch marked straight out of the box. Although the mark fades little by little, it remains visible for a couple of weeks.
Unfortunately, and it is something we note every time, accessories for the ScanWatch 2 2025 are not cheap: count on an extra €20 to €50 for a replacement strap. The watch itself is available for €349.95.
But given the features it packs and its elegance, I can only recommend it. There is no real black mark against this model, apart from a price that stings and, unfortunately, a monthly subscription to add if you want to take full advantage of everything it can offer!
Mana Books release – Persona – Le Livre de Cuisine Officiel
As it does regularly, Mana Books is bringing out new titles themed around manga and video games. Today we are looking at a brand-new recipe book that has just been released: Persona – Le Livre de Cuisine Officiel (the official Persona cookbook) by Jarrett Melendez.
Discover a whole host of recipes in Persona – Le Livre de Cuisine Officiel
The official Persona cookbook will win you over as much as a collector's item as a culinary tribute to the Persona saga…
Dive into the memorable, comforting culinary moments of the Persona games, whether cooking with the members of SEES or lounging on the familiar benches of Café Leblanc! Are your stats high enough to take on the special rainy-day mega beef bowl? Are you ready to face the Cosmic Tower Burger? You will find out as you browse this anthology of iconic recipes from the Persona franchise. From Tatsumi Port Island to Inaba by way of Tokyo, discover the incredible dishes that bring our favourite characters together and give them the strength they need for the battles ahead!
The cookbook is available now in bookshops, priced at €29.90.
If you have already finished Pokémon Z-A and fancy getting back to monster hunting, Digimon Story Time Stranger is made for you!
The Digimon licence is far from dead: it lives on through video games and a new series coming soon. We had the opportunity to test this new instalment on PC, and the experience is worth the detour. Digimon Story Time Stranger is available on PlayStation 5, Xbox Series and PC only.
Monster hunting!
Digimon Story Time Stranger offers, to my taste, a more captivating storyline than its competitors and considerably tougher gameplay. At the start of the game you can choose between three difficulty modes – story, normal and hard – and after finishing the game once you unlock a hardcore mode. Be warned: hardcore mode gives you no respite, and you will need to master the Digimon universe to perfection to get through the boss fights.
When you launch the game, you choose between two protagonists and three starter Digimon of your choice. This does not affect the storyline; I personally went with Yuki Kanan. Digimon Story Time Stranger invites you on an intense adventure across several dimensions and timelines: your objective is to prevent the destruction of the world and to understand what drove the Digimon to fight against humans.
In this instalment you can obtain around 450 Digimon, which takes an enormous amount of time. Monster hunting is very different from a Pokémon game: you have to defeat the creatures of the various dungeons a certain number of times before you can convert them. Conversion is the equivalent of capture, and you have to wait until a creature's stats reach 200% to try for an optimised Digimon. These creatures can digivolve, which is very useful for obtaining more powerful Digimon, although you must meet prerequisites to unlock the form you want. From the same menu you can also regress your Digimon, undoing an evolution to choose a new form – a very useful option, as it lets you adapt your team to your strategy.
Digimon Story Time Stranger offers a main quest of around 40 hours, and double that if you want to complete all the side quests. This new title also includes a Digimon-flavoured card game, the goal being to fill your binder; you will have to beat the various card players to collect every card.
The game offers a little customisation for your character, and with the deluxe edition I got some great costumes!
In story mode, Digimon Story Time Stranger remains fairly easy, even if the boss fights are a little harder. Normal mode is, to my taste, already quite demanding: you will have to grind a great deal of XP to level up your Digimon and pick the best equipment for them. The game also has a skill-management system, and you will need to think about which skill to give each Digimon in order to exploit other creatures' weaknesses.
Digimon Story Time Stranger delivers a rich experience for all monster-collecting fans, thanks to its substantial playtime. The storyline is interesting, and the Japanese voice acting deepens the immersion in this futuristic Tokyo even further. The game is fully translated into French – fortunately, as this simplifies progression and helps you learn its many mechanics. Digimon Story Time Stranger arrives at just the right time, satisfying fans of the series while drawing in a new audience thanks to solid accessibility options. If monster hunting doesn't scare you, I warmly recommend playing it.
When it comes to building a fully functional “fault-tolerant” quantum computer, companies and government labs all over the world are rushing to be the first over the finish line. But a truly useful universal quantum computer capable of running complex algorithms would have to entangle millions of coherent qubits, which are extremely fragile. Because of environmental factors such as temperature, interference from other electronic systems in hardware, and even errors in measurement, today’s devices would fail under an avalanche of errors long before reaching that point.
So the problem of error correction is a key issue for the future of the market. It arises because errors in qubits can’t be corrected simply by keeping multiple copies, as they are in classical computers: quantum rules forbid the copying of qubit states while they are still entangled with others, and are thus unknown. To run quantum circuits with millions of gates, we therefore need new tricks to enable quantum error correction (QEC).
Protected states
The general principle of QEC is to spread the information over many qubits so that an error in any one of them doesn’t matter too much. “The essential idea of quantum error correction is that if we want to protect a quantum system from damage then we should encode it in a very highly entangled state,” says John Preskill, director of the Institute for Quantum Information and Matter at the California Institute of Technology in Pasadena.
There is no unique way of achieving that spreading, however. Different error-correcting codes can depend on the connectivity between qubits – whether, say, they are coupled only to their nearest neighbours or to all the others in the device – which tends to be determined by the physical platform being used. However error correction is done, it must be done fast. “The mechanisms for error correction need to be running at a speed that is commensurate with that of the gate operations,” says Michael Cuthbert, founding director of the UK’s National Quantum Computing Centre (NQCC). “There’s no point in doing a gate operation in a nanosecond if it then takes 100 microseconds to do the error correction for the next gate operation.”
At the moment, dealing with errors is largely about compensation rather than correction: patching up the problems of errors in retrospect, for example by using algorithms that can throw out some results that are likely to be unreliable (an approach called “post-selection”). It’s also a matter of making better qubits that are less error-prone in the first place.
Qubits are so fragile that their quantum state is very susceptible to the local environment, and can easily be lost through the process of decoherence. Current quantum computers therefore have very high error rates – roughly one error in every few hundred operations. For quantum computers to be truly useful, this error rate will have to be reduced to around one in a million, and larger, more complex algorithms would demand rates of one in a billion or even one in a trillion. This requires real-time QEC.
To protect the information stored in qubits, a multitude of unreliable physical qubits have to be combined in such a way that if one qubit fails and causes an error, the others can help protect the system. Essentially, by combining many physical qubits (shown above on the left), one can build a few “logical” qubits that are strongly resistant to noise.
According to Maria Maragkou, commercial vice-president of quantum error-correction company Riverlane, the goal of full QEC has ramifications for the design of the machines all the way from hardware to workflow planning. “The shift to support error correction has a profound effect on the way quantum processors themselves are built, the way we control and operate them, through a robust software stack on top of which the applications can be run,” she explains. The “stack” includes everything from programming languages to user interfaces and servers.
With genuinely fault-tolerant qubits, errors can be kept under control and prevented from proliferating during a computation. Such qubits might be made in principle by combining many physical qubits into a single “logical qubit” in which errors can be corrected (see figure 1). In practice, though, this creates a large overhead: huge numbers of physical qubits might be needed to make just a few fault-tolerant logical qubits. The question is then whether errors in all those physical qubits can be checked faster than they accumulate (see figure 2).
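The intuition behind a logical qubit can be seen in its classical ancestor, the three-bit repetition code. The Python sketch below is a deliberately simplified classical analogue: real QEC schemes such as the surface code infer errors by measuring parity “stabilizers” rather than reading the bits directly, precisely because qubit states cannot be copied or observed without disturbance, but the majority-vote logic is the same.

```python
# Classical three-bit repetition code: spread one logical bit over
# three physical bits, then correct any single flip by majority vote.
import random

def encode(bit):
    return [bit, bit, bit]

def apply_noise(codeword, p):
    # flip each physical bit independently with probability p
    return [b ^ (random.random() < p) for b in codeword]

def decode(codeword):
    return int(sum(codeword) >= 2)   # majority vote

trials, p = 100_000, 0.05
failures = sum(decode(apply_noise(encode(0), p)) != 0 for _ in range(trials))
print(f"raw error rate:     {p:.3%}")
print(f"encoded error rate: {failures / trials:.3%}")   # about 3*p**2, i.e. ~0.7%
```

Encoding fails only when two or more of the three bits flip, so the error rate drops from p to roughly 3p²; the price is the overhead of extra bits, which is exactly the trade-off that makes logical qubits so expensive.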
The illustration gives an overview of quantum error correction (QEC) in action within a quantum processing unit. UK-based company Riverlane is building its Deltaflow QEC stack that will correct millions of data errors in real time, allowing a quantum computer to go beyond the reach of any classical supercomputer.
Fault-tolerant quantum computing is the ultimate goal, says Jay Gambetta, director of IBM research at the company’s centre in Yorktown Heights, New York. He believes that to perform truly transformative quantum calculations, the system must go beyond demonstrating a few logical qubits – instead, you need arrays of at least 100 of them that can perform more than 100 million quantum operations (10⁸ QuOps). “The number of operations is the most important thing,” he says.
It sounds like a tall order, but Gambetta is confident that IBM will achieve these figures by 2029. By building on what has been achieved so far with error correction and mitigation, he feels “more confident than I ever did before that we can achieve a fault-tolerant computer.” Jerry Chow, previous manager of the Experimental Quantum Computing group at IBM, shares that optimism. “We have a real blueprint for how we can build [such a machine] by 2029,” he says (see figure 3).
Others suspect the breakthrough threshold may be a little lower: Steve Brierley, chief executive of Riverlane, believes that the first error-corrected quantum computer, with around 10 000 physical qubits supporting 100 logical qubits and capable of a million QuOps (a megaQuOp), could come as soon as 2027. Following on, gigaQuOp machines (10⁹ QuOps) should be available by 2030–32, and teraQuOp machines (10¹² QuOps) by 2035–37.
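A minimal sketch of the arithmetic these targets imply (the heuristic that N trustworthy logical operations require a per-operation logical error rate of order 1/N is our framing, not Brierley's):

```python
# QuOp targets and the logical error rates they roughly demand
targets = {"megaQuOp": 10**6, "gigaQuOp": 10**9, "teraQuOp": 10**12}
for name, n_ops in targets.items():
    print(f"{name}: ~{n_ops:.0e} operations -> logical error rate below ~{1 / n_ops:.0e}")

# overhead implied by Brierley's first-machine estimate
print(10_000 // 100, "physical qubits per logical qubit")
```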
Platform independent
Error mitigation and error correction are just two of the challenges for developers of quantum software. Fundamentally, to develop a truly quantum algorithm involves taking full advantage of the key quantum-mechanical properties such as superposition and entanglement. Often, the best way to do that depends on the hardware used to run the algorithm. But ultimately the goal will be to make software that is not platform-dependent and so doesn’t require the user to think about the physics involved.
“At the moment, a lot of the platforms require you to come right down into the quantum physics, which is a necessity to maximize performance,” says Richard Murray of photonic quantum-computing company Orca. Try to generalize an algorithm by abstracting away from the physics and you’ll usually lower the efficiency with which it runs. “But no user wants to talk about quantum physics when they’re trying to do machine learning or something,” Murray adds. He believes that ultimately it will be possible for quantum software developers to hide those details from users – but Brierley thinks this will require fault-tolerant machines.
“In due time everything below the logical circuit will be a black box to the app developers”, adds Maragkou over at Riverlane. “They will not need to know what kind of error correction is used, what type of qubits are used, and so on.” She stresses that creating truly efficient and useful machines depends on developing the requisite skills. “We need to scale up the workforce to develop better qubits, better error-correction codes and decoders, write the software that can elevate those machines and solve meaningful problems in a way that they can be adopted.” Such skills won’t come only from quantum physicists, she adds: “I would dare say it’s mostly not!”
Yet even now, working on quantum software doesn’t demand a deep expertise in quantum theory. “You can be someone working in quantum computing and solving problems without having a traditional physics training and knowing about the energy levels of the hydrogen atom and so on,” says Ashley Montanaro, who co-founded the quantum software company Phasecraft.
On the other hand, insights can flow in the other direction too: working on quantum algorithms can lead to new physics. “Quantum computing and quantum information are really pushing the boundaries of what we think of as quantum mechanics today,” says Montanaro, adding that QEC “has produced amazing physics breakthroughs.”
Early adopters?
Once we have true error correction, Cuthbert at the UK’s NQCC expects to see “a flow of high-value commercial uses” for quantum computers. What might those be?
In the arena of quantum chemistry and materials science, genuine quantum advantage – calculating something that is impossible using classical methods alone – is more or less here already, says Chow. Crucially, however, quantum methods needn’t be used for the entire simulation but can be added to classical ones to give them a boost for particular parts of the problem.
Joint effort: In June 2025, IBM in the US and Japan’s national research laboratory RIKEN unveiled the IBM Quantum System Two, the first to be used outside the US. It pairs IBM’s 156-qubit Heron quantum computing system (left) with RIKEN’s supercomputer Fugaku (right) – one of the most powerful classical systems on Earth. The computers are linked through a high-speed network at the fundamental instruction level to form a proving ground for quantum-centric supercomputing. (Courtesy: IBM and RIKEN)
For example, last year researchers at IBM teamed up with scientists at several RIKEN institutes in Japan to calculate the minimum energy state for the iron sulphide cluster (4Fe-4S) at the heart of the bacterial nitrogenase enzyme that fixes nitrogen. This cluster is too big and complex to be accurately simulated using the classical approximations of quantum chemistry. The researchers used a combination of quantum computing (with IBM’s 72-qubit Heron chip) and high-performance computing (HPC) on RIKEN’s Fugaku. This idea of “improving classical methods by injecting quantum as a subroutine” is likely to be a more general strategy, says Gambetta. “The future of computing is going to be heterogeneous accelerators [of discovery] that include quantum.”
Likewise, Montanaro says that Phasecraft is developing “quantum-enhanced algorithms”, where a quantum computer is used, not to solve the whole problem, but just to help a classical computer in some way. “There are only certain problems where we know quantum computing is going to be useful,” he says. “I think we are going to see quantum computers working in tandem with classical computers in a hybrid approach. I don’t think we’ll ever see workloads that are entirely run using a quantum computer.” Among the first important problems that quantum machines will solve, according to Montanaro, are the simulation of new materials – to develop, for example, clean-energy technologies (see figure 4).
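The division of labour Montanaro describes is easy to sketch: a classical optimizer steers the search, calling the quantum computer only for the step it is best at. In the toy Python below, quantum_energy_estimate is a hypothetical stand-in for a real quantum-backend call, simulated classically here:

```python
import math

def quantum_energy_estimate(theta):
    # stand-in for the quantum subroutine (e.g. measuring the energy
    # of a trial state prepared with parameter theta)
    return 1.0 - math.cos(theta)        # toy landscape, minimum at theta = 0

def classical_optimizer(f, theta=2.5, lr=0.3, steps=50, eps=1e-4):
    # plain gradient descent using finite-difference gradients
    for _ in range(steps):
        grad = (f(theta + eps) - f(theta - eps)) / (2 * eps)
        theta -= lr * grad
    return theta

best = classical_optimizer(quantum_energy_estimate)
print(f"optimal parameter: {best:.4f}, energy: {quantum_energy_estimate(best):.6f}")
```

The pattern generalizes: the expensive inner evaluation goes to the quantum processor, while iteration, bookkeeping and post-processing stay classical.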
“For a physicist like me,” says Preskill, “what is really exciting about quantum computing is that we have good reason to believe that a quantum computer would be able to efficiently simulate any process that occurs in nature.”
4 Structural insights
(Courtesy: Phasecraft)
A promising application of quantum computers is simulating novel materials. Researchers from the quantum algorithms firm Phasecraft, for example, have already shown how a quantum computer could help simulate complex materials such as the polycrystalline compound LK-99, which was purported by some researchers in 2023 to be a room-temperature superconductor.
Using a classical/quantum hybrid workflow, together with the firm’s proprietary material simulation approach to encode and compile materials on quantum hardware, Phasecraft researchers were able to establish a classical model of the LK-99 structure that allowed them to extract an approximate representation of the electrons within the material. The illustration above shows the green and blue electronic structure around red and grey atoms in LK-99.
Montanaro believes another likely near-term goal for useful quantum computing is solving optimization problems – both here and in quantum simulation, “we think genuine value can be delivered already in this NISQ era with hundreds of qubits.” (NISQ, a term coined by Preskill, refers to noisy intermediate-scale quantum computing, with relatively small numbers of rather noisy, error-prone qubits.)
One further potential benefit of quantum computing is that it tends to require less energy than classical high-performance computing, whose consumption is notoriously high. If the energy cost could be cut by even a few percent, it would be worth using quantum resources for that reason alone. “Quantum has real potential for an energy advantage,” says Chow. One study in 2020 showed that a particular quantum-mechanical calculation carried out on an HPC used many orders of magnitude more energy than when it was simulated on a quantum circuit. Such comparisons are not easy, however, in the absence of an agreed and well-defined metric for energy consumption.
Building the market
Right now, the quantum computing market is in a curious superposition of states itself – it has ample proof of principle, but today’s devices are still some way from being able to perform a computation relevant to a practical problem that could not be done with classical computers. Yet to get to that point, the field needs plenty of investment.
The fact that quantum computers, especially if used with HPC, are already unique scientific tools should establish their value in the immediate term, says Gambetta. “I think this is going to accelerate, and will keep the funding going.” It is why IBM is focusing on utility-scale systems of around 100 qubits or so and more than a thousand gate operations, he says, rather than simply trying to build ever bigger devices.
Montanaro sees a role for governments to boost the growth of the industry “where it’s not the right fit for the private sector”. One role of government is simply as a customer. For example, Phasecraft is working with the UK national grid to develop a quantum algorithm for optimizing the energy network. “Longer-term support for academic research is absolutely critical,” Montanaro adds. “It would be a mistake to think that everything is done in terms of the underpinning science, and governments should continue to support blue-skies research.”
The road ahead: IBM’s current roadmap charts how the company plans to scale up its devices to achieve a fault-tolerant device by 2029. Alongside hardware development, the firm will also focus on developing new algorithms and software for these devices. (Courtesy: IBM)
It’s not clear, though, whether there will be a big demand for quantum machines that every user will own and run. Before 2010, “there was an expectation that banks and government departments would all want their own machine – the market would look a bit like HPC,” Cuthbert says. But that demand depends in part on what commercial machines end up being like. “If it’s going to need a premises the size of a football field, with a power station next to it, that becomes the kind of infrastructure that you only want to build nationally.” Even for smaller machines, users are likely to try them first on the cloud before committing to installing one in-house.
According to Cuthbert, the real challenge in supply-chain development is that many of today’s technologies were developed for the science community – where, say, achieving millikelvin cooling or using high-power lasers is routine. “How do you go from a specialist scientific clientele to something that starts to look like a washing machine factory, where you can make them to a certain level of performance” – while also being much cheaper and easier to use?
But Cuthbert is optimistic about bridging this gap to get to commercially useful machines, encouraged in part by looking back at the classical computing industry of the 1970s. “The architects of those systems could not imagine what we would use our computation resources for today. So I don’t think we should be too discouraged that you can grow an industry when we don’t know what it’ll do in five years’ time.”
Montanaro too sees analogies with those early days of classical computing. “If you think what the computer industry looked like in the 1940s, it’s very different from even 20 years later. But there are some parallels. There are companies that are filling each of the different niches we saw previously, there are some that are specializing in quantum hardware development, there are some that are just doing software.” Cuthbert thinks that the quantum industry is likely to follow a similar pathway, “but more quickly and leading to greater market consolidation more rapidly.”
However, while the classical computing industry was revolutionized by the advent of personal computing in the 1970s and 80s, it seems very unlikely that we will have any need for quantum laptops. Rather, we might increasingly see apps and services appear that use cloud-based quantum resources for particular operations, merging so seamlessly with classical computing that we don’t even notice.
That, perhaps, would be the ultimate sign of success: that quantum computing becomes invisible, no big deal but just a part of how our answers are delivered.
When a star rapidly accumulates gas and dust during its early growth phase, it’s called an accretion burst. Now, for the first time, astronomers have observed a planet doing the same thing. The discovery, made using the European Southern Observatory’s Very Large Telescope (VLT) and the James Webb Space Telescope (JWST), shows that the infancy of certain planetary-mass objects and that of newborn stars may share similar characteristics.
Like other rogue planets – free-floating planetary-mass objects that do not orbit a star – Cha1107-7626 was known to be surrounded by a disc of dust and gas. When material from this disc spirals, or accretes, onto the planet, the planet grows.
What Victor Almendros-Abad and colleagues discovered is that this process is not uniform. Using the VLT’s X-shooter spectrograph and the NIRSpec and MIRI instruments on JWST, they found that Cha1107-7626 experienced a burst of accretion beginning in June 2025. This is the first time anyone has seen an accretion burst in an object of such low mass, and the peak accretion rate of six billion tonnes per second makes it the strongest accretion episode ever recorded in a planetary-mass object. It may not be over, either: at the end of August, when the observing campaign ended, the burst was still ongoing.
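To put that peak rate in context, a rough back-of-envelope conversion (our own arithmetic, not the team's) turns it into planetary units:

$$\dot{M} = 6\times10^{9}\ \mathrm{t\,s^{-1}} = 6\times10^{12}\ \mathrm{kg\,s^{-1}} \approx 1.9\times10^{20}\ \mathrm{kg\,yr^{-1}} \approx 3\times10^{-5}\,M_{\oplus}\ \mathrm{yr^{-1}},$$

taking one year as $3.15\times10^{7}$ s and $M_{\oplus}\approx 6.0\times10^{24}$ kg. Even at this record pace, the object would need roughly 30 000 years to gather a single Earth mass.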
An infancy similar to a star’s
The team identified several parallels between Cha1107-7626’s accretion burst and those that young stars experience. Among them were clear signs that gas is being funnelled onto the planet. “This indicates that magnetic fields structure the flow of gas, which is again something well known from stars,” explains co-author Scholz. “Overall, our discovery is establishing interesting, perhaps surprising parallels between stars and planets, which I’m not sure we fully understand yet.”
The astronomers also found that the chemistry of the disc around the planet changed during accretion, with water being present in this phase even though it hadn’t been before. This effect has previously been spotted in stars, but never in a planet until now.
“We’re struck by quite how much the infancy of free-floating planetary-mass objects resembles that of stars like the Sun,” says co-author Jayawardhana. “Our new findings underscore that similarity and imply that some objects comparable to giant planets form the way stars do, from contracting clouds of gas and dust accompanied by disks of their own, and they go through growth episodes just like newborn stars.”
The researchers have been studying similar objects for many years and earlier this year published results based on JWST observations that featured a small sample of planetary-mass objects. “This particular study is part of that sample,” Scholz tells Physics World, “and we obtained the present results because Victor wanted to look in detail at the accretion flow onto Cha1107-7626, and in the process discovered the burst.”
The researchers say they are “keeping an eye” on Cha1107-7626 and other such objects that are still growing because their environment is dynamic and unstable. “More to the point, we really don’t understand what drives these accretion events, and we need detailed follow-up to figure out the underlying reasons for these processes,” Scholz says.
It’s Halloween today and so what better time than to bring you a couple of spooky stories from the world of physics.
First up are researchers at the University of Georgia in the US, who have confirmed that six different species of bats found in North America emit a ghoulish green light when exposed to ultraviolet light.
The researchers examined 60 specimens from the Georgia Museum of Natural History and exposed the bats to UV light.
They found that the wings and hind limbs of six species – big brown bats, eastern red bats, Seminole bats, southeastern myotis, grey bats and the Brazilian free-tailed bat – gave off photoluminescence with the resulting glow being a shade of green.
While previous research found that some mammals, like pocket gophers, also emit a glow under ultraviolet light, this was the first discovery of such a phenomenon for bats located in North America.
The colour and location of the glow on the winged mammals suggest it is not down to genetics or camouflage and as it is the same between sexes it is probably not used to attract mates.
Given that many bats can see the wavelengths emitted, one option is that the glow may be an inherited trait used for communication.
“The data suggests that all these species of bats got it from a common ancestor. They didn’t come about this independently,” adds Castleberry, one of the Georgia researchers. “It may be an artifact now, since maybe glowing served a function somewhere in the evolutionary past, and it doesn’t anymore.”
Thread lightly
In other frightful news, spider webs are a classic Halloween decoration and while the real things are marvels of bioengineering, there is still more to understand about these sticky structures.
Many spider species build spiral wheel-shaped webs – orb webs – to capture prey, and some incorporate so-called “stabilimenta” into their web structure. These “extra touches” look like zig-zagging threads that span the gap between two adjacent “spokes,” or threads arranged in a circular “platform” around the web’s centre.
The purpose of stabilimenta is unknown; proposed functions include deterring predatory wasps or birds.
Yet Gabriele Greco of the Swedish University of Agricultural Sciences and colleagues suggest such structures might instead influence the propagation of web vibrations triggered by the impact of captured prey.
Greco and colleagues observed different stabilimentum geometries that were constructed by wasp spiders, Argiope bruennichi. The researchers then performed numerical simulations to explore how stabilimenta affect prey impact vibrations.
For waves generated at angles perpendicular to the threads spiralling out from the web centre, stabilimenta caused negligible delays in wave propagation.
However, for waves generated in the same direction as the spiral threads, vibrations in webs with stabilimenta propagated to a greater number of potential detection points across the web – where a spider might sense them – than in webs without stabilimenta.
This suggests that stabilimenta may boost a spider’s ability to pinpoint the location of unsuspecting prey caught in its web.
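The team's actual model is not reproduced here, but the qualitative effect is easy to demonstrate with a toy mass-spring network: two parallel threads, optionally coupled mid-span by a cross-link standing in for a stabilimentum. All geometry, parameters and the detection threshold below are invented for illustration.

```python
# Toy model: a prey impact on one thread; count how many nodes the
# vibration reaches above a threshold, with and without a cross-link.
import numpy as np

def simulate(cross_link, n=30, k=1.0, dt=0.05, steps=4000, damping=0.002):
    x = np.zeros(2 * n)        # transverse displacements, two chains of n nodes
    v = np.zeros(2 * n)
    peak = np.zeros(2 * n)
    x[0] = 1.0                 # "prey impact" on chain A
    for _ in range(steps):
        a = np.zeros(2 * n)
        for c in range(2):     # nearest-neighbour springs along each chain
            s = c * n
            a[s:s + n - 1] += k * (x[s + 1:s + n] - x[s:s + n - 1])
            a[s + 1:s + n] += k * (x[s:s + n - 1] - x[s + 1:s + n])
        if cross_link:         # "stabilimentum" couples the chains mid-span
            for i in range(10, 20):
                a[i] += k * (x[n + i] - x[i])
                a[n + i] += k * (x[i] - x[n + i])
        v += dt * (a - damping * v)   # semi-implicit Euler step
        x += dt * v
        peak = np.maximum(peak, np.abs(x))
    return int((peak > 0.05).sum())   # nodes where the wave was "detectable"

print("nodes reached without stabilimentum:", simulate(False))
print("nodes reached with stabilimentum:   ", simulate(True))
```

Without the cross-link the pulse stays on the struck thread; with it, vibrations also spread to the second thread, echoing the finding that stabilimenta increase the number of potential detection points.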
If animals are sentient, as research has recently begun to show, how do they respond to death? In a recent book, the biologist Emmanuelle Pouydebat reveals the complex emotions and behaviours that many species display in that situation.
Some time ago I acquired some Leica equipment from the Q range – absolutely extraordinary gear that I'll be telling you about very soon. As it happens, I had the opportunity to spend three days testing the brand-new Leica M EV-1, fitted with an apochromatic 90 mm f/2 Summicron lens. If you don't follow Leica's history: the M range, born in the 1950s, is the line of legendary cameras…
Female university students do much better in introductory physics exams if they have the option of retaking the tests. That’s according to a new analysis of almost two decades of US exam results for more than 26,000 students. The study’s authors say it shows that female students benefit from lower-stakes assessments – and that the persistent “gender grade gap” in physics exam results does not reflect a gender difference in physics knowledge or ability.
The study has been carried out by David Webb from the University of California, Davis, and Cassandra Paul from San Jose State University. It builds on previous work they did in 2023, which showed that the gender gap disappears in introductory physics classes that offer the chance for all students to retake the exams. That study did not, however, explore why the offer of a retake has such an impact.
In the new study, the duo analysed exam results from 1997 to 2015 for a series of introductory physics classes at a public university in the US. The dataset included 26,783 students, mostly in biosciences, of whom about 60% were female. Some of the classes let students retake exams while others did not, thereby letting the researchers explore why retakes close the gender gap.
When Webb and Paul examined the data for classes that offered retakes, they found that in first-attempt exams female students slightly outperformed their male counterparts. But male students performed better than female students in retakes.
This, the researchers argue, discounts the notion that retakes close the gender gap by allowing female students to improve their grades. Instead, they suggest that the benefit of retakes is that they lower the stakes of the first exam.
The team then compared the classes that offered retakes with those that did not, which they called high-stakes courses. They found that the gender gap in exam results was much larger in the high-stakes classes than the lower-stakes classes that allowed retakes.
“This suggests that high-stakes exams give a benefit to men, on average, [and] lowering the stakes of each exam can remove that bias,” Webb told Physics World. He thinks that as well as allowing students to retake exams, physics might benefit from not having comprehensive high-stakes final exams but instead “use final exam time to let students retake earlier exams”.
Earlier this year I met the Massachusetts-based steampunk artist Bruce Rosenbaum at the Global Physics Summit of the American Physical Society. He was exhibiting a beautiful sculpture of a “quantum engine” that was created in collaboration with physicists including NIST’s Nicole Yunger Halpern – who pioneered the scientific field of quantum steampunk.
I was so taken by the art and science of quantum steampunk that I promised Rosenbaum that I would chat with him and Yunger Halpern on the podcast – and here is that conversation. We begin by exploring the art of steampunk and how it is influenced by the technology of the 19th century. Then, we look at the physics of quantum steampunk, a field that weds modern concepts of quantum information with thermodynamics – which itself is a scientific triumph of the 19th century.
This podcast is supported by Atlas Technologies, specialists in custom aluminium and titanium vacuum chambers as well as bonded bimetal flanges and fittings used everywhere from physics labs to semiconductor fabs.
Progression of the Rayleigh–Taylor instability: Initially, an applied force (grey arrows) destabilizes the fluid interface between spin-up (blue) and spin-down (yellow) sodium atoms. The interface then develops nominally sinusoidal modulations where currents counterflowing in the two fluids (coloured arrows) induce vorticity at the interface (black symbols). The modulations subsequently acquire characteristic mushroom or spike shapes before dissolving into a turbulent mixture. (Courtesy: taken from Science Advances 11 (35), 10.1126/sciadv.adw9752, licensed under CC BY-NC)
Researchers in the US have replicated a well-known fluid-dynamics process called the Rayleigh–Taylor instability on a quantum scale for the first time. The work opens the hydrodynamics of quantum gases to further exploration and could even create a new platform for understanding gravitational dynamics in the early universe.
If you’ve ever tried mixing oil with water, you’ll understand how the Rayleigh–Taylor instability (RTI) can develop. Due to their different molecular structures and the nature of the forces between their molecules, the two fluids do not mix well. After some time, they separate, forming a clear interface between oil and water.
Scientists have studied the dynamics of this interface upon perturbations – disturbances of the system – for nearly 150 years, with major work being done by the British physicists Lord Rayleigh in 1883 and Geoffrey Taylor in 1950. Under specific conditions related to the buoyant force of the fluid and the perturbative force causing the disturbance, they showed that this interface becomes unstable. Rather than simply oscillating, the system deviates from its initial state, leading to the formation of interesting geometric patterns such as mushroom clouds and filaments of gas in the Crab Nebula.
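The linear-stability result at the heart of Rayleigh and Taylor's analysis is textbook material: a small sinusoidal ripple of wavenumber $k$ on the interface between a heavy fluid of density $\rho_1$ resting on a lighter one of density $\rho_2$, under an acceleration $g$, grows exponentially,

$$\eta(t) \propto e^{\sigma t}, \qquad \sigma = \sqrt{A\,g\,k}, \qquad A = \frac{\rho_1-\rho_2}{\rho_1+\rho_2},$$

where $A$ is the Atwood number. If the light fluid sits on top ($A<0$), $\sigma$ becomes imaginary and the interface merely oscillates as a stable surface wave.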
An interface of spins
To show that such dynamics occur not only in macroscopic structures but also at the quantum scale, scientists at the University of Maryland and the Joint Quantum Institute (JQI) created a two-state quantum system using a Bose–Einstein condensate (BEC) of sodium (²³Na) atoms. In this state of matter, the temperature is so low that the sodium atoms behave as a single coherent system, giving the researchers precise control over its parameters.
The JQI team confine this BEC in a two-dimensional optical potential that essentially produces a 100 µm × 100 µm sheet of atoms in the horizontal plane. The scientists then apply a microwave pulse that excites half of the atoms from the spin-down to the spin-up state. By adding a small magnetic field gradient along one of the horizontal axes, they induce a force (the Stern–Gerlach force) that acts on the two spin components in opposite directions due to the differing signs of their magnetic moments. This creates a clear interface between the spin-up and the spin-down atoms.
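The underlying force is the standard textbook one (our notation, not the paper's): a magnetic moment $\boldsymbol{\mu}$ in a field $\mathbf{B}$ has energy $U=-\boldsymbol{\mu}\cdot\mathbf{B}$, so a field gradient along $x$ pulls the two spin states, whose moments have opposite sign, in opposite directions:

$$\mathbf{F} = -\nabla U = \nabla(\boldsymbol{\mu}\cdot\mathbf{B}) \;\Rightarrow\; F_{\uparrow,\downarrow} = \pm\,\mu\,\frac{\partial B}{\partial x}.$$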
Mushrooms and ripplons
To initiate the RTI, the scientists need to perturb this two-component BEC by reversing the magnetic field gradient, which consequently reverses the direction of the induced force. According to Ian Spielman, who led the work alongside co-principal investigator Gretchen Campbell, this wasn’t as easy as it sounds. “The most difficult part was preparing the initial state (horizontal interface) with high quality, and then reliably inverting the gradient rapidly and accurately,” Spielman says.
The researchers then investigated how the magnitude of this force difference, acting on the two sides of the interface, affected the dynamics of the two-component BEC. For a small differential force, they initially observed a sinusoidal modulation of the interface. After some time, the interface enters a nonlinear dynamics regime where the RTI manifests through the formation of mushroom clouds. Finally, it becomes a turbulent mixture. The larger the differential force, the more rapidly the system evolves.
Spin up and spin down: Experimental setup showing the vacuum chamber where the sodium atoms are prepared. (Courtesy: Spielman–Campbell laboratory, JQI)
While RTI dynamics like these were expected to occur in quantum fluids, Spielman points out that proving it required a BEC with the right internal interactions. The BEC of sodium atoms in their experimental setup is one such system.
In general, Spielman says that cold atoms are a great tool for studying RTI because the numerical techniques used to describe them do not suffer from the same flaws as the Navier–Stokes equation used to model classical fluid dynamics. However, he notes that the transition to turbulence is “a tough problem that resides at the boundary between two conceptually different ways of thinking”, pushing the capabilities of both analytical and numerical techniques.
The scientists were also able to excite waves known as ripplon modes that travel along the interface of the two-component BEC. These are the equivalent of classical capillary waves – the “ripples” that appear when a droplet hits a water surface. Yanda Geng, a JQI PhD student working on the project, explains that every unstable RTI mode has a stable ripplon as a sibling; the difference is that ripplon modes only appear when a small sinusoidal modulation is added to the differential force. “Studying ripplon modes builds understanding of the underlying [RTI] mechanism,” Geng says.
The flow of the spins
In a further experiment, the team studied a phenomenon that occurs as the RTI progresses and the spin components of the BEC flow in opposite directions along part of their shared interface. This is known as an interfacial counterflow. By transferring half the atoms into the other spin state after initializing the RTI process, the scientists were able to generate a chain of quantum mechanical whirlpools – a vortex chain – along the interface in regions where interfacial counterflow occurred.
Spielman, Campbell and their team are now working to create a cleaner interface in their two-component BEC, which would allow a wider range of experiments. “We are considering the thermal properties of this interface as a 1D quantum ‘string’,” says Spielman, adding that the height of such an interface is, in effect, an ultra-sensitive thermometer. Spielman also notes that interfacial waves in higher dimensions (such as a 2D surface) could be used for simulations of gravitational physics.
Improving the efficiency of solar cells will likely be one of the key approaches to achieving net zero emissions in many parts of the world. Many types of solar cells will be required, with some of the better performances and efficiencies expected to come from multi-junction solar cells. Multi-junction solar cells comprise a vertical stack of semiconductor materials with distinct bandgaps, with each layer converting a different part of the solar spectrum to maximize conversion of the Sun’s energy to electricity.
When there are no constraints on the choice of materials, triple-junction solar cells can outperform double-junction and single-junction solar cells, with a power conversion efficiency (PCE) of up to 51% theoretically possible. But material constraints – due to fabrication complexity, cost or other technical challenges – mean that many such devices still perform far from the theoretical limits.
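One reason practical devices trail the theory is current matching. In a monolithic, series-connected stack the junction voltages add but a single current flows through all of them, so output is capped by the weakest junction; the spectrum therefore has to be split so that every junction photogenerates a similar current. A minimal sketch with entirely invented numbers:

```python
# Series-connected (two-terminal) multi-junction stack:
# current is limited by the weakest junction, voltages add.
def stack_power(junctions):
    # each junction: (photocurrent in mA/cm^2, operating voltage in V)
    current = min(j[0] for j in junctions)
    voltage = sum(j[1] for j in junctions)
    return current * voltage            # power density in mW/cm^2

balanced   = [(11.0, 1.70), (11.0, 1.25), (11.0, 0.60)]  # well-matched currents
mismatched = [(14.0, 1.70), (11.0, 1.25), (8.0, 0.60)]   # same total light, badly split

print(f"balanced stack:   {stack_power(balanced):.1f} mW/cm^2")
print(f"mismatched stack: {stack_power(mismatched):.1f} mW/cm^2")
```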
Perovskites are one of the most promising materials in the solar cell world today, but fabricating practical triple-junction solar cells beyond 1 cm² in area has remained a challenge. A research team from Australia, China, Germany and Slovenia set out to change this, recently publishing a paper in Nature Nanotechnology describing the largest and most efficient triple-junction perovskite–perovskite–silicon tandem solar cell to date.
When asked why this device architecture was chosen, Anita Ho-Baillie, one of the lead authors from The University of Sydney, states: “I am interested in triple-junction cells because of the larger headroom for efficiency gains”.
Addressing surface defects in perovskite solar cells
Solar cells formed from metal halide perovskites have the potential to be commercially viable, thanks to their cost-effectiveness, efficiency, ease of fabrication and ability to be paired with silicon in multi-junction devices. The ease of fabrication means that the junctions can be fabricated directly on top of each other through monolithic integration – which leads to only two terminal connections, instead of four or six. However, these junctions can still contain surface defects.
To enhance the performance and resilience of their triple-junction cell (top and middle perovskite junctions on a bottom silicon cell), the researchers optimized the chemistry of the perovskite material and the cell design. They addressed surface defects in the top perovskite junction by replacing traditional lithium fluoride materials with piperazine-1,4-diium chloride (PDCl). They also replaced methylammonium – which is commonly used in perovskite cells – with rubidium. “The rubidium incorporation in the bulk and the PDCl surface treatment improved the light stability of the cell,” explains Ho-Baillie.
To connect the two perovskite junctions, the team used gold nanoparticles on tin oxide. Because the gold was in a nanoparticle form, the junctions could be engineered to maximize the flow of electric charge and light absorption by the solar cell.
“Another interesting aspect of the study is the visualization of the gold nanoparticles [using transmission electron microscopy] and the critical point when they become a semi-continuous film, which is detrimental to the multi-junction cell performance due to its parasitic absorption,” says Ho-Baillie. “The optimization for achieving minimal particle coverage while achieving sufficient ohmic contact for vertical carrier flow are useful insights”.
Record performance for a large-scale perovskite triple-junction cell
Using these design strategies, Ho-Baillie and colleagues developed a 16 cm² triple-junction cell that achieved an independently certified steady-state PCE of 23.3% – the highest reported for a large-area device. While triple-junction perovskite solar cells have exhibited higher PCEs – with all-perovskite triple-junction cells reaching 28.7% and perovskite–perovskite–silicon devices reaching 27.1% – those were all achieved on 1 cm² cells, not large-area cells.
In this study, the researchers also developed a 1 cm² cell that came close to those records, with a PCE of 27.06%, but it is the large-area cell that is the record breaker. The 1 cm² cell also passed the International Electrotechnical Commission’s (IEC) 61215 thermal cycling test, which exposes the cell to 200 cycles of extreme temperature swings, ranging from –40 to 85 °C. During this test, the 1 cm² cell retained 95% of its initial efficiency after 407 h of continuous operation.
The successful thermal cycling test, combined with the high efficiencies achieved on a larger cell, shows that this triple-junction architecture could have potential in real-world settings in the near future, even though such devices are still far from their theoretical limits.