
New titanium:sapphire laser is tiny, low-cost and tuneable

By: No Author
2 July 2024, 15:54

A compact, integrated titanium:sapphire laser that needs only a simple green LED as a pump source has been created by researchers at Stanford University in the US. Their design reduces the cost and footprint of a titanium:sapphire laser by three orders of magnitude and the power consumption by two. The team believes its device represents a key step towards the democratization of a laser type that plays important roles in scientific research and industry.

Since its invention by Peter Moulton at the Massachusetts Institute of Technology in 1982, the titanium:sapphire laser has become an important research and engineering tool. This is thanks to its ability to handle high powers and emit either spectrally pure continuous wave signals or broadband, short pulses. Indeed, the laser was used to produce the first frequency combs, which play important roles in optical metrology.

Unlike numerous other types of lasers such as semiconductor lasers, titanium:sapphire lasers have proved extremely difficult to miniaturize because traditional designs require very high input power to achieve lasing. “Titanium:sapphire has the ability to output very high powers, but because of the way the laser level structure works – specifically the fluorescence has a very short lifetime – you have to pump very hard in order to see appreciable amounts of gain,” says Stanford’s Joshua Yang. Traditional titanium:sapphire lasers have to be pumped with high-powered lasers – and therefore cost in excess of $100,000.
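
To see why a short fluorescence lifetime forces hard pumping, a rough back-of-the-envelope estimate is sketched below. The cross-section, lifetime and wavelength are representative textbook values for Ti:sapphire, assumed for illustration rather than taken from the article.

```python
# Why Ti:sapphire needs intense pumping: estimate the saturation intensity
# I_sat = h*nu / (sigma * tau). All values are representative textbook
# numbers for Ti:sapphire, assumed for illustration (not from the article).
h = 6.626e-34            # Planck constant (J s)
c = 3.0e8                # speed of light (m/s)

lam = 800e-9             # peak emission wavelength (m)
sigma_em = 3e-19 * 1e-4  # emission cross-section, ~3e-19 cm^2 -> m^2
tau = 3.2e-6             # short upper-state (fluorescence) lifetime (s)

photon_energy = h * c / lam
I_sat = photon_energy / (sigma_em * tau)   # W/m^2

print(f"saturation intensity ≈ {I_sat / 1e7:.0f} kW/cm^2")
# ≈ 260 kW/cm^2: useful gain needs intensities on this scale, which is why
# bulk Ti:sapphire lasers are pumped with watt-class lasers, and why
# confining pump and gain in a micron-scale waveguide lowers the threshold.
```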

Logic, sensing, and quantum computing

If titanium:sapphire lasers could be miniaturized and integrated into chips, potential applications would include optical logic, sensing and quantum computing. Last year, Yubo Wang and colleagues at Yale University unveiled a chip-integrated titanium:sapphire laser that utilized an indium gallium nitride pump diode coupled to a titanium:sapphire gain medium through its evanescent field. The evanescent component of the electromagnetic field does not propagate but decays exponentially with distance from the source. By reducing loss, this integrated setup reduced the lasing threshold by more than an order of magnitude. However, Jelena Vučković – the leader of the Stanford group – says that “the threshold was still relatively high because the overlap with the gain medium was not maximized”.
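
The exponential decay mentioned above has a standard waveguide-optics form, given here for orientation; it is not quoted in the article. The effective index of the guided mode and the cladding index set the decay constant, and the overlap of this decaying tail with the gain medium determines how strongly the pump couples to it.

```latex
% Textbook form of the evanescent tail of a guided mode (not from the article):
% the field outside the core decays exponentially with distance z, at a rate
% set by the mode's effective index n_eff and the cladding index n_clad.
E(z) \;\propto\; e^{-\gamma z},
\qquad
\gamma \;=\; \frac{2\pi}{\lambda}\sqrt{n_{\mathrm{eff}}^{2} - n_{\mathrm{clad}}^{2}}
```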

In the new research, Vučković’s group fabricated their laser devices by creating monocrystalline titanium:sapphire optical resonators about 40 microns across and less than 1 micron thick on a layer of sapphire, using a silicon dioxide interface. The titanium:sapphire was then polished to a smoothness of 0.1 micron using reactive ion etching. The resonators achieved almost perfect overlap of the pump and lasing modes, which led to much less loss and a lasing threshold 22 times lower than in any previous titanium:sapphire laser. “All the fabrication processes are things that can be done in most traditional clean rooms and are adaptable to foundries,” says Yang – who is first author of a paper in Nature that describes the new laser.

The researchers achieved lasing with a $37 green laser diode as the pump. However, subsequent experiments described in the paper used a tabletop green laser, because the team is still working to couple the cheaper diode into the system effectively.

Optimization challenge

“Being able to complete the whole picture of diode to on-chip laser to systems applications is really just an optimization challenge, and of course one we’re really excited to work on,” says Yang. “But even with the low optimization we start with, it’s still able to achieve lasing.”

The researchers went on to demonstrate two things that had never been achieved before. First, they incorporated the tunability so valued in titanium:sapphire lasers into their system by using an integrated heater to modify the refractive index of the resonator, allowing it to lase in different modes. They achieved single mode lasing in a range of over 50 nm, and believe that it should be possible, with optimization, to extend this to several hundred nanometres.
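
For context on what the integrated heater does, the sketch below uses a simple ring-resonator picture with assumed values for the geometry, group index and thermo-optic coefficient; none of these figures comes from the paper.

```python
# Illustrative thermal-tuning numbers for a microresonator laser. Treating
# the ~40 micron resonator as a ring, and assuming a group index and a
# sapphire-like thermo-optic coefficient (assumptions, not reported values).
import math

lam = 800e-9        # operating wavelength (m)
diameter = 40e-6    # resonator size quoted in the article (m)
n_g = 1.8           # assumed group index of the waveguide mode
dn_dT = 1.3e-5      # assumed thermo-optic coefficient (1/K)

L = math.pi * diameter            # round-trip length of the ring
fsr = lam**2 / (n_g * L)          # free spectral range, in wavelength
dlam_dT = lam * dn_dT / n_g       # shift of each resonance per kelvin

print(f"mode spacing ≈ {fsr*1e9:.1f} nm, thermal shift ≈ {dlam_dT*1e12:.0f} pm/K")
# Each resonance moves only picometres per kelvin, but nudging the comb of
# resonances across Ti:sapphire's broad gain band changes which mode lases,
# so coarse tuning over tens of nanometres comes from hopping between modes.
```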

They also performed a cavity quantum electrodynamics experiment with colour centres in silicon carbide using their light source. “That’s why [titanium:sapphire] lasers are so popular in quantum optics labs like ours,” says Vučković. “If people want to work with different colour centres or quantum dots, they don’t have a specific wavelength at which they work.” The use of silicon carbide is especially significant, she says, because it is becoming popular in the high-power electronics used in systems like electric cars.

Finally, they produced a titanium:sapphire laser amplifier, something that the team says has not been reported before. They injected 120 pJ pulses from a commercial titanium:sapphire laser and amplified them to 2.3 nJ over a distance of 8 mm down the waveguide. The distortion introduced by the amplifier was the lowest allowed by the laws of wave motion – something that had not been possible for any integrated amplifier at any wavelength.
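
As a quick arithmetic check (using only the figures quoted above), the amplifier numbers correspond to the following gain:

```python
import math

E_in = 120e-12    # injected pulse energy (J)
E_out = 2.3e-9    # amplified pulse energy (J)
length_mm = 8.0   # length of the waveguide (mm)

gain_db = 10 * math.log10(E_out / E_in)
print(f"net gain ≈ {gain_db:.1f} dB (≈ {gain_db / length_mm:.1f} dB/mm)")
# ≈ 12.8 dB in total, or roughly 1.6 dB per millimetre of waveguide.
```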

Yubo Wang is impressed: “[Vučković and colleagues have] achieved several important milestones, including very low-threshold lasing, very high-power amplification and also tuneable laser integration, which are all very nice results,” he says. “At the end of the paper, they have a compelling demonstration of cavity-integrated artificial atoms using their titanium:sapphire laser.” He says he would be interested to see if the team could produce multiple devices simultaneously at wafer scale. He also believes it would be interesting to look at integration of other visible-wavelength lasers: “I’m expecting to see more results in the next few years,” he says.


How to get the errors out of quantum computing

2 July 2024, 14:26

All of today’s quantum computers are prone to errors. These errors may be due to imperfect hardware and control systems, or they may arise from the inherent fragility of the quantum bits, or qubits, used to perform quantum operations. But whatever their source, they are a real problem for anyone seeking to develop commercial applications for quantum computing. Although noisy, intermediate-scale quantum (NISQ) machines are valuable for scientific discovery, no-one has yet identified a commercial NISQ application that brings value beyond what is possible with classical hardware. Worse, there is no immediate theoretical argument that any such applications exist.

It might sound like a downbeat way of opening a scientific talk, but when Christopher Eichler made these comments at last week’s Quantum 2.0 conference in Rotterdam, the Netherlands, he was merely reflecting what has become accepted wisdom within the quantum computing community. According to this view, the only way forward is to develop fault-tolerant computers with built-in quantum error correction, using many flawed physical qubits to encode each perfect (or perfect-enough) logical qubit.

That isn’t going to be easy, acknowledged Eichler, a physicist at FAU Erlangen, Germany. “We do have to face a number of engineering challenges,” he told the audience. In his view, the requirements of a practical, error-corrected quantum computer include:

  • High-fidelity gates that are fast enough to perform logical operations in a manageable amount of time
  • More and better physical qubits with which to build the error-corrected logical qubits
  • Fast mid-circuit measurements of “syndromes” – the sets of eigenvalues from which classical decoding algorithms can infer which errors have happened in the middle of a computation, rather than waiting until the end (a minimal classical sketch of syndrome decoding follows this list).
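
As a concrete illustration of how syndrome measurements let a decoder infer errors without reading the protected information, here is a deliberately classical sketch using the three-bit repetition code. It is a stand-in for the idea only, not the error-correction scheme of any particular qubit platform.

```python
# Classical sketch of syndrome decoding with the three-bit repetition code.
# Parity checks stand in for stabilizer (syndrome) measurements: they locate
# a single flipped bit without revealing the encoded logical bit itself.
import random

def encode(bit):
    return [bit, bit, bit]                      # logical 0 -> 000, 1 -> 111

def apply_noise(codeword, p=0.1):
    return [b ^ (random.random() < p) for b in codeword]   # independent flips

def measure_syndrome(codeword):
    s1 = codeword[0] ^ codeword[1]              # parity of bits 0 and 1
    s2 = codeword[1] ^ codeword[2]              # parity of bits 1 and 2
    return (s1, s2)

def correct(codeword, syndrome):
    flip = {(1, 0): 0, (1, 1): 1, (0, 1): 2}.get(syndrome)  # inferred error site
    fixed = list(codeword)
    if flip is not None:
        fixed[flip] ^= 1
    return int(sum(fixed) >= 2)                 # majority vote -> logical bit

noisy = apply_noise(encode(1))
print(noisy, "->", correct(noisy, measure_syndrome(noisy)))
# Decoding succeeds whenever at most one of the three bits was flipped.
```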

The good news, Eichler continued, is that several of today’s qubit platforms are already well on their way to meeting these requirements. Trapped ions offer high-fidelity, fault-tolerant qubit operations. Devices that use arrays of neutral atoms as qubits are easy to scale up. And qubits based on superconducting circuits are good at fast, repeatable error correction.

The bad news is that none of these qubit platforms ticks all of those boxes at once. This means that no out-and-out leader has emerged, though Eichler, whose own research focuses on superconducting qubits, naturally thinks they have the most promise.

In the final section of his talk, Eichler suggested a few ways of improving superconducting qubits. One possibility would be to discard the current most common type of superconducting qubit, which is known as a transmon, in favour of other options. Fluxonium qubits, for example, offer better gate fidelities, with 2-qubit gate fidelities of up to 99.9% recently demonstrated. Another alternative superconducting qubit, known as a cat qubit, exhibits lifetimes of up to 10 seconds before it loses its quantum nature. However, in Eichler’s view, it’s not clear how either of these qubits might be scaled up to multi-qubit processors.

Another promising strategy (not unique to superconducting qubits) Eichler mentioned is to convert dominant types of errors into events that involve a qubit being erased instead of changing state. This type of error should be easier (though still not trivial) to detect. And many researchers are working to develop new error correction codes that operate in a more hardware-efficient way.

Ultimately, though, the jury is still out on how to overcome the problem of error-prone qubits. “Moving forward, one should very broadly study all these platforms,” Eichler concluded. “One can only learn from one another.”


Oculomics: a window to the health of the body

By: No Author
2 July 2024, 12:00

More than 13 million eye tests are carried out in the UK each year, making it one of the most common medical examinations in the country. But what if eye tests could tell us about more than just the health of the eye? What if these tests could help us spot some of humanity’s greatest healthcare challenges, including diabetes, Alzheimer’s or heart disease?

It’s said that the eye is the “window to the soul”. Just as our eyes tell us lots about the world around us, so they can tell us lots about ourselves. Researchers working in what’s known as “oculomics” are seeking ways to look at the health of the body, via the eye. In particular, they’re exploring the link between certain ocular biomarkers (changes or abnormalities in the eye) and systemic health and disease. Simply put, the aim is to unlock the valuable health data that the eye holds on the body (Ophthalmol. Ther. 13 1427).

Oculomics is particularly relevant when it comes to chronic conditions, such as dementia, diabetes and cardiovascular disease. They make up most of the “burden of disease” (a factor that is calculated by looking at the sum of the mortality and morbidity of a population) and account for around 80% of deaths in industrialized nations. We can reduce how many people die or get ill from such diseases through screening programmes. Unfortunately, most diseases don’t get screened for and – even when they do – there’s limited or incomplete uptake.

Cervical-cancer screening, for example, is estimated to have saved the lives of one in 65 of all British-born women since 1950 (Lancet 364 249), but nearly a third of eligible women in the UK do not attend regular cervical screening appointments. This highlights the need for new and improved screening methods that are as non-intimidating, accessible and patient-friendly as a trip to a local high-street optometrist.

Seeing the light: the physics and biology of the eye

In a biological sense, the eye is fantastically complex. It can adapt from reading this article directly in front of you to looking at stars that are light-years away. The human eye is a dynamic living tissue that can operate across six orders of magnitude in brightness, from the brightest summer days to the darkest cloudy nights.

The eye has several key structures that enable this (figure 1). At the front, the cornea is the eye’s strongest optical component, refracting light as it enters the eye to form an image at the back of the eye. The iris allows the eye to adapt to different light levels, as it changes size to control how much light enters the eye. The crystalline lens provides depth-dynamic range, changing size and shape to focus on objects nearby or far away from the eye. The aqueous humour (a water-like fluid in front of the lens) and the vitreous humour (a gel-like liquid between the lens and the retina) give the eye its shape, and provide the crucial separation over which the refraction of light takes place. Finally, light reaches the retina, where the “pixels” of the eye – the photoreceptors – detect the light.
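
To see why the cornea is described as the eye’s strongest optical component, the standard two-surface estimate with textbook model-eye values (assumed here for illustration, not figures from this article) gives roughly 43 dioptres of the eye’s total of about 60:

```python
# Refracting power of the cornea from its two surfaces, P = (n2 - n1) / R.
# Radii and refractive indices are textbook model-eye values (assumptions).
n_air, n_cornea, n_aqueous = 1.000, 1.376, 1.336
R_front = 7.7e-3    # anterior corneal radius of curvature (m)
R_back = 6.8e-3     # posterior corneal radius of curvature (m)

P_front = (n_cornea - n_air) / R_front      # ≈ +49 dioptres
P_back = (n_aqueous - n_cornea) / R_back    # ≈ -6 dioptres

print(f"corneal power ≈ {P_front + P_back:.0f} D of the eye's ~60 D total")
# The crystalline lens supplies most of the remaining ~20 D and, unlike the
# cornea, can change shape to refocus on near or distant objects.
```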

1 Look within
(Courtesy: Occuity) The anatomy of the human eye, highlighting the key structures including the iris, the cornea, the lens and the retina.

The tissues and the fluids in the eye have optical characteristics that stem from their biological properties, making optical methods ideally suited to study the eye. It’s vital, for example, that the aqueous humour is transparent – if it were opaque, our vision would be obscured by our own eyes. The aqueous humour also needs to fulfil other biological properties, such as providing nutrition to the cornea and lens.

To do all these things, our bodies produce the aqueous humour as an ultrafiltered blood plasma. This plasma contains water, amino acids, electrolytes and more, but crucially no red blood cells or opaque materials. The molecules in the aqueous humour reflect the molecules in the blood, meaning that measurements on the aqueous humour can reveal insights into blood composition. This link between optical and biological properties is true for every part of the eye, with each structure potentially revealing insights into our health.

Chronic disease insights and AI

Currently, almost all measurements we take of the eye are to discern the eye’s health only. So how can these measurements tell us about chronic diseases that affect other parts of the body? The answer lies in both the incredible properties of the eye, and data from the sheer number of eye examinations that have taken place.

Chronic diseases can affect many different parts of the body, and the eye is no exception (figure 2). For example, cardiovascular disease can change artery and vein sizes. This is also true in the retina and choroid (a thin layer of tissue that lies between the retina and the white of the eye) – in patients with high blood pressure, veins can become dilated, offering optometrists and ophthalmologists insight into this aspect of a patient’s health.

For example, British optometrist and dispensing optician Jason Higginbotham points out that, throughout his career, “many eye examinations have yielded information about the general health of patients – and not just their vision and eye health. For example, in some patients, the way the arteries cross over veins can ‘squash’ or press on the veins, leading to a sign called ‘arterio-venous nipping’. This is a possible indicator of hypertension and hardening of the arteries.”

Higginbotham, who is also the managing editor of Myopia Focus, adds that “Occasionally, one may spot signs of blood-vessel leakage and swelling of the retinal layers, which is indicative of active diabetes. For me, a more subtle sign was finding the optic nerves of one patient appearing very pale, almost white, with them also complaining of a lack of energy, becoming ‘clumsier’ in their words and finding their vision changing, especially when in a hot bath. This turned out to be due to multiple sclerosis.”

2 Interconnected features
(Courtesy: Occuity) Imaging the eye may reveal ocular biomarkers of systemic disease, thanks to key links between the optical and biological properties of the eye. With the emergence of oculomics, it may be possible – through a standard eye test – to detect cardiovascular diseases; cancer; neurodegenerative diseases such as Alzheimer’s, dementia and Parkinson’s disease; and even metabolic diseases such as diabetes.

However, precisely because there are so many things that can affect the eye, it can be difficult to attribute changes to a specific disease. If there is something abnormal in the retina, could this be an indicator of cardiovascular disease, or could it be diabetes? Perhaps it is a by-product of smoking – how can an optometrist tell?

This is where the sheer number of measurements becomes important. The NHS has been performing eye tests for more than 60 years, giving rise to databases containing millions of images, complete with patient records about long-term health outcomes. These datasets have been fed into artificial intelligence (AI) deep-learning models to identify signatures of disease, particularly cardiovascular disease (British Journal of Ophthalmology 103 67; J. Clin. Med. 10.3390/jcm12010152). Models can now predict cardiovascular risk factors with accuracy comparable to the current state of the art. New image-analysis methods are also under constant development, allowing further signatures of cardiovascular disease, diabetes and even dementia to be spotted in the eye.
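
For a sense of what such models do, here is a minimal, purely illustrative sketch. The feature names, the randomly generated stand-in data and the simple logistic-regression model are all assumptions made for demonstration; the studies cited above typically train deep networks directly on retinal images and validate them against real clinical records.

```python
# Illustrative oculomics-style model: predict a cardiovascular risk label from
# features that an image-analysis step might extract from retinal photographs
# (e.g. vessel calibre, arteriole/venule ratio, tortuosity, cup-to-disc ratio).
# The data below are random stand-ins, so the result is meaningless except as
# a demonstration of the workflow.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_patients, n_features = 1000, 4

X = rng.normal(size=(n_patients, n_features))          # stand-in retinal features
risk_score = X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=n_patients)
y = (risk_score > 0).astype(int)                       # stand-in risk labels

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression().fit(X_train, y_train)
auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
print(f"held-out AUC ≈ {auc:.2f}")   # only meaningful with real clinical data
```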

But bias is a big issue when it comes to AI-driven oculomics. When algorithms are developed using existing databases, groups or communities with historically worse healthcare provision will be under-represented in these databases. Consequently, the algorithms may perform worse for them, which risks embedding past and present inequalities into future methods. We have to be careful not to let such biases propagate through the healthcare system – for example, by drawing on multiple databases from different countries to reduce sensitivities to country-specific bias.

Although AI oculomics methods have not yet moved beyond clinical research, it is only a matter of time. Ophthalmology companies such as Carl Zeiss Meditec (Ophthalmology Retina 7 1042) and data companies such as Google are developing AI methods to spot diabetic retinopathy and other ophthalmic diseases. Regulators are also engaging more and more with AI, with the FDA having reviewed at least 600 medical devices that incorporate AI or machine learning across medical disciplines, including nine in the ophthalmology space, by October 2023.

Eye on the prize

So how far can oculomics go? What other diseases could be detected by analysing hundreds of thousands of images? And, more importantly, what can be detected with only one image or measurement of the eye?

Ultimately, the answer lies in matching the measurement technique to the disease: if we want to detect more diseases, we need more measurement techniques.

At Occuity, a UK-based medical technology company, we are developing solutions to some of humanity’s greatest health challenges through optical diagnostic technologies. Our aim is to develop pain-free, non-contact screening and monitoring of chronic health conditions, such as glaucoma, myopia, diabetes and Alzheimer’s disease (Front. Aging Neurosci. 13 720167). We believe that the best way that we can improve health is by developing instruments that can spot specific signatures of disease. This would allow doctors to start treatments earlier, give researchers a better understanding of the earliest stages of disease, and ultimately, help people live healthier, happier lives.

Currently, we are developing a range of instruments that target different diseases by scanning a beam of light through the different parts of the eye and measuring the light that comes back. Our first instruments measure properties such as the thickness of the cornea (needed for accurate glaucoma diagnosis) and the length of the eyeball, which is key to screening and monitoring myopia – an epidemic expected to affect half of the world’s population by 2050. As we advance these technologies, we open up opportunities for new measurements to advance scientific research and clinical diagnostics.

Looking into the past

The ocular lens provides a remarkable record of our molecular history because, unlike many other ocular tissues, the cells within the lens do not get replaced as people age. This is particularly important for a family of molecules dubbed “advanced glycation end-products”, or AGEs. These molecules are waste products that build up when glucose levels are too high. While present in everybody, they occur in much higher concentrations in people with diabetes and pre-diabetes – people who have higher blood-glucose levels and are at high risk of developing diabetes, but largely without symptoms. Measurements of a person’s lens AGE concentration may therefore indicate their diabetic state.

Fortunately, these AGEs have a very important optical property – they fluoresce. Fluorescence is a process where an atom or molecule absorbs light at one colour and then re-emits light at another colour – it’s why rubies glow under ultraviolet light. The lens is the perfect place to look for these AGEs, as it is very easy to shine light into the lens. Luckily, a lot of this fluorescence makes it back out of the lens, where it can be measured (figure 3).

3 AGEs and fluorescence
(Courtesy: Occuity) Scatter plot of lens fluorescence against age. Fluorescence, a measure of advanced glycation end-product (AGE) concentration, rises as people get older. However, it increases faster in diabetes as higher blood-glucose levels accelerate the formation of AGEs, potentially making lens fluorescence a powerful tool for detecting diabetes and pre-diabetes. The chart shows rising fluorescence as a function of both age and diabetic status, taken as part of an internal Occuity trial on 21 people using a prototype instrument; people with diabetes are shown by orange points and people without diabetes by blue points. Error bars are the standard deviation of three measurements. The measurements are non-invasive, non-contact and take just seconds to perform.

Occuity has developed optical technologies that measure fluorescence from the lens as a potential diabetes and pre-diabetes screening tool, building on our optometry instruments. Although they are still in the early stages of development, the first results taken earlier this year are promising, with fluorescence clearly increasing with age, and strong preliminary evidence that the two people with diabetes in the dataset have higher lens fluorescence than those without diabetes. If these results are replicated in larger studies, this will show that lens-fluorescence measurement techniques are a way of screening for diabetes and pre-diabetes rapidly and non-invasively, in easily accessible locations such as high-street optometrists and pharmacists.

Such a tool would be revolutionary. Almost five million people in the UK have diabetes, including over a million with undiagnosed type 2 diabetes whose condition goes completely unmonitored. There are also over 13 million people with pre-diabetes. If they can be warned before they move from pre-diabetes to diabetes, early-stage intervention could reverse this pre-diabetic state, preventing progression to full diabetes and drastically reducing the massive impact (and cost) of the illness.

Living in the present

Typical diabetes management is invasive and unpleasant, as it requires finger pricks or implants to continuously monitor blood glucose levels. This can result in infections, as well as reduce the effectiveness of diabetes management, leading to further complications. Better, non-invasive glucose-measurement techniques could transform how patients can manage this life-long disease.

As the aqueous humour is an ultra-filtered blood plasma, its glucose concentration mimics that of the glucose concentration in blood. This glucose also has an effect on the optical properties of the eye, increasing the refractive index that gives the eye its focusing power (figure 4).

4 Measuring blood glucose level
(Courtesy: Occuity) Plot of glucose concentration against refractive index. The relationship between blood glucose and optical measurements of the eye has been probed theoretically and experimentally at Occuity, whose goal is to create a non-invasive, non-contact measure of blood-glucose concentration for people with diabetes. Occuity has shown that changes in glucose concentration comparable to those observed in blood have a measurable effect on refractive index in cuvettes, and is moving towards equivalent measurements in the anterior chamber.

As it happens, the same techniques that we at Occuity use to measure lens and eyeball thickness can be used to measure the refractive index of the aqueous humour, which correlates with glucose concentration. Preliminary cuvette-based tests are close to being precise enough to measure glucose concentrations to the accuracy needed for diabetes management – non-invasively, without even touching the eye. This technique could transform the management of blood-glucose levels for people with diabetes, replacing the need for repetitive and painful finger pricks and implants with a simple scan of the eye.
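
To get a feel for the size of the optical signal involved, the estimate below uses a literature value for the refractive-index increment of glucose in water; both that value and the “typical change” are assumptions for illustration, not Occuity measurements.

```python
# Rough scale of the refractive-index change caused by a change in glucose
# concentration. The increment ~1.4e-4 per g/L is a literature value for
# glucose in water, used here as an assumption for illustration.
DN_PER_G_PER_L = 1.4e-4       # approximate refractive-index increment
MOLAR_MASS_GLUCOSE = 180.16   # g/mol

def delta_n(delta_mmol_per_L):
    grams_per_L = delta_mmol_per_L * MOLAR_MASS_GLUCOSE / 1000.0
    return grams_per_L * DN_PER_G_PER_L

print(f"Δn for a 1 mmol/L change ≈ {delta_n(1.0):.1e}")   # ≈ 2.5e-5
# Changes of order 1e-5 in refractive index are tiny, which is why the
# cuvette work focuses on pushing the precision of the optical measurement.
```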

Eye on the future

As Occuity’s instruments become widely available, the data that they generate will grow, and with AI-powered real-time data analysis, their predictive power and the range of diseases that can be detected will expand too. By making these data open-source and available to researchers, we can continuously expand the breadth of oculomics.

Oculomics has massive potential to transform disease-screening and diagnosis through a combination of AI and advanced instruments. However, there are still substantial challenges to overcome, including regulatory hurdles, issues with bias in AI, adoption into current healthcare pathways, and the cost of developing new medical instruments.

Despite these hurdles, the rewards of oculomics are too great to pass up. Opportunities such as diabetes screening and management, cardiovascular risk profiling and early detection of dementia offer massive health, social and economic benefits. Additionally, the ease with which ocular screening can take place removes major barriers to the uptake of screening.

More than 35,000 eye exams are carried out in the UK every day, and each one offers an opportunity to catch and reverse pre-diabetes, to spot cardiovascular risk factors and propose lifestyle changes, or to identify and potentially slow the onset of neurodegenerative conditions. As oculomics grows, the window to health is getting brighter.


Satellites burning up in the atmosphere may deplete Earth’s ozone layer

By: No Author
1 July 2024, 10:00

The increasing deployment of extensive space-based infrastructure is predicted to triple the number of objects in low-Earth orbit over the next century. But at the end of their service life, decommissioned satellites burn up as they re-enter the atmosphere, triggering chemical reactions that deplete the Earth’s ozone layer.

Through new simulations, Joseph Wang and colleagues at the University of Southern California have shown how nanoparticles created by satellite pollution can catalyse chemical reactions between ozone and chlorine. If the problem isn’t addressed, they predict that the level of ozone depletion could grow significantly in the coming decades.

From weather forecasting to navigation, satellites are a vital element of many of the systems we’ve come to depend on. As demand for these services continues to grow, swarms of small satellites are being rolled out in mega-constellations such as Starlink. As a result, low-Earth orbit is becoming increasingly cluttered with manmade objects.

Once a satellite reaches the end of its operational lifetime, international guidelines suggest that it should re-enter the atmosphere within 25 years to minimize the risk of collisions with other satellites. Yet according to Wang’s team, re-entries from a growing number of satellites are a concerning source of pollution; and one that has rarely been considered so far.

As they burn up on re-entry, satellites can lose between 51% and 95% of their mass – and much of the vaporized material they leave behind will remain in the upper atmosphere for decades.

One particularly concerning component of this pollution is aluminium, which makes up close to a third of the mass of a typical satellite. When left in the upper atmosphere, aluminium will react with the surrounding oxygen, creating nanoparticles of aluminium oxide (Al2O3). Although this compound isn’t reactive itself, its nanoparticles have large surface areas and excellent thermal stability, making them extremely effective at catalysing reactions between ozone and chlorine.

For this ozone–chlorine reaction to occur, chlorine-containing compounds must first be converted into reactive species – which can’t happen without a catalyst. Typically, catalysts come in the form of tiny, solid particles found in stratospheric clouds, which provide surfaces for the chlorine activation reaction to occur. But with higher concentrations of aluminium oxide nanoparticles in the upper atmosphere, the chlorine activation reaction can occur more readily – depleting the vital layer that protects Earth’s surface from damaging UV radiation.
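
The article does not spell out the chemistry, but the textbook chlorine-activation and ozone-loss steps below give a sense of the catalytic role that particle surfaces play. This is an illustrative scheme from standard stratospheric chemistry; the specific pathways on aluminium-oxide surfaces may differ.

```latex
% Standard chlorine-activation and ozone-destruction steps (textbook scheme):
% reservoir species are converted to reactive chlorine on particle surfaces,
% and each chlorine atom can then destroy many ozone molecules catalytically.
\begin{align*}
\mathrm{ClONO_2} + \mathrm{HCl} &\xrightarrow{\text{particle surface}} \mathrm{Cl_2} + \mathrm{HNO_3}\\
\mathrm{Cl_2} + h\nu &\longrightarrow 2\,\mathrm{Cl}\\
\mathrm{Cl} + \mathrm{O_3} &\longrightarrow \mathrm{ClO} + \mathrm{O_2}\\
\mathrm{ClO} + \mathrm{O} &\longrightarrow \mathrm{Cl} + \mathrm{O_2}
\end{align*}
```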

Backwards progress

The ozone layer has gradually started to recover since the signing in 1987 of the Montreal Protocol – in which all UN member states agreed to phase out production of the substances primarily responsible for ozone depletion. With this new threat, however, Wang’s team predict that much of this progress could be reversed if the problem isn’t addressed soon.

In their study, reported in Geophysical Research Letters, the researchers assessed the potential impact of satellite-based pollution through molecular dynamics simulations, which allowed them to calculate the mass of ozone-depleting nanoparticles produced during satellite re-entry.

They discovered that a small 250 kg satellite can generate around 30 kg of aluminium oxide nanoparticles. By extrapolating this figure, they estimated that in 2022 alone, around 17 metric tons of aluminium oxide were generated by satellites re-entering the atmosphere. They also found that the nanoparticles may take up to 30 years to drift down from the mesosphere into the stratospheric ozone layer, introducing a noticeable delay between satellite decommissioning and eventual ozone depletion in the stratosphere.

Extrapolating their findings further, Wang’s team then considered the potential impact of future mega-constellation projects currently being planned. Altogether, they estimate that some 360 metric tons of aluminium oxide nanoparticles could enter the upper atmosphere each year if these plans come to fruition.
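
Taking the quoted per-satellite yield at face value, a simple consistency check relates the article’s numbers to each other; the linear scaling is the same assumption the extrapolation itself relies on.

```python
# Consistency check using only figures quoted in the article.
oxide_per_satellite_kg = 30.0    # aluminium oxide from one 250 kg satellite
satellite_mass_kg = 250.0
oxide_2022_tonnes = 17.0         # estimated total for 2022
oxide_future_tonnes = 360.0      # projected annual total with mega-constellations

yield_fraction = oxide_per_satellite_kg / satellite_mass_kg
reentries_2022 = oxide_2022_tonnes * 1000 / oxide_per_satellite_kg
reentries_future = oxide_future_tonnes * 1000 / oxide_per_satellite_kg

print(f"oxide yield ≈ {yield_fraction:.0%} of satellite mass")
print(f"≈ {reentries_2022:.0f} equivalent 250 kg re-entries in 2022")
print(f"≈ {reentries_future:.0f} equivalent re-entries per year if the plans proceed")
# ≈ 12% yield, ≈ 570 re-entries in 2022 and ≈ 12,000 per year projected: the
# roughly twenty-fold jump is what drives the concern about delayed ozone loss.
```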

Although these estimates are still highly uncertain, the researchers’ discoveries clearly highlight the severity of the threat that decommissioned satellites pose for the ozone layer. If their warning is taken seriously, they hope that new strategies and international guidelines could eventually be established to minimize the impact of these ozone-depleting nanoparticles, ensuring that the ozone layer can continue to recover in the coming decades.
