Slime mold saves memories without a nervous system

Researchers have shown that the single-celled slime mold Physarum polycephalum saves memories, even though it has no nervous system.

The study, published in the Proceedings of the National Academy of Sciences of the United States of America, was conducted by researchers at the Max Planck Institute for Dynamics and Self-Organization (MPI-DS) and the Technical University of Munich (TUM).

Having memory about the environment is vital for making informed decisions. The concept of memory is traditionally associated with organisms that possess a nervous system.

However, even very simple organisms store information about past experiences to thrive in a complex environment: successfully exploiting nutrient sources, avoiding danger, and evading predators.

Karen Alim, head of the Biological Physics and Morphogenesis group at the MPI-DS in Göttingen and professor for the Theory of Biological Networks at the Technical University of Munich, said: “It is very exciting when a project develops from a simple experimental observation.”

The giant unicellular slime mold Physarum polycephalum consists entirely of interlaced tubes of varying diameters. When the organism encounters a nutrient source, the pattern of thicker and thinner tubes in the network retains a distinct imprint of that source long after feeding.

“Given P. polycephalum’s highly dynamic network reorganization, the persistence of this imprint sparked the idea that the network architecture itself could serve as a memory of the past,” says Karen Alim. However, the researchers first needed to explain the mechanism behind the imprint formation.

Memory of the nutrient location is encoded in the morphology of the network-shaped organism: a nutrient source locally releases a softening agent that is transported by the cytoplasmic flows in the tubular network.

“The gradual softening is where the existing imprints of previous food sources come into play and where information is stored and retrieved,” says first author Mirna Kramar. “Past feeding events are embedded in the hierarchy of tube diameters, specifically in the arrangement of thick and thin tubes in the network.”

“For the softening chemical that is now transported, the thick tubes in the network act as highways in traffic networks, enabling quick transport across the entire organism,” adds Mirna Kramar.

“Previous encounters imprinted in the network architecture thus weigh into the decision about the future direction of migration.”

Tubes that receive a lot of the softening agent grow in diameter at the expense of other tubes, which shrink. In this way, the tubes’ capacities for flow-based transport are permanently upgraded toward the nutrient location, redirecting future decisions and migration. This demonstrates that the nutrient location is stored in, and retrieved from, the network’s hierarchy of tube diameters.
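
To make the mechanism concrete, here is a minimal toy simulation, not the authors’ published model: the tube count, agent doses and rates below are invented for illustration. It shows how tubes that receive more softening agent widen while the rest shrink.

```python
import numpy as np

# Toy reinforcement dynamic: tubes receiving more softening agent widen,
# while the remaining tubes shrink, so the diameter hierarchy ends up
# encoding where the nutrient was. All numbers here are hypothetical.
diameters = np.ones(6)                               # six tubes, equal at start
agent = np.array([1.0, 0.6, 0.2, 0.1, 0.05, 0.05])   # softening agent per tube

for step in range(100):
    growth = 0.01 * agent * diameters       # widening scales with dose received
    shrink = growth.sum() / diameters.size  # the other tubes pay by shrinking
    diameters = np.clip(diameters + growth - shrink, 0.05, None)

print(np.round(diameters, 2))  # tubes nearest the food source are now thickest
```

After a few dozen iterations the tubes with the highest dose dominate the network, which is the kind of “memory” the study reads out of the real organism’s morphology.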

In the study, the scientists identify a flow network’s version of associative memory, very likely of relevance for the plethora of living flow networks as well as for bioinspired design.

“Given the simplicity of this living network, the ability of Physarum to form memories is intriguing. It is remarkable that the organism relies on such a simple mechanism and yet controls it in such a fine-tuned manner,” says Karen Alim.

“These results present an important piece of the puzzle in understanding the behavior of this ancient organism, and at the same time point to universal principles underlying behavior. We envision potential applications of our findings in designing smart materials and building soft robots that navigate through complex environments,” concludes Karen Alim.

Source: Wion news

New catalyst could power next-gen electronics

Unlike LIBs, the reaction pathway in LSBs results in an accumulation of solid lithium sulfide (Li2S6) and liquid lithium polysulfide (LiPS), causing a loss of active material from the sulfur cathode (the positively charged electrode) and corrosion of the lithium anode (the negatively charged electrode). To enhance battery life, scientists are searching for catalysts that can make this degradation efficiently reversible during use.

In a new study published in ChemSusChem, scientists from the Gwangju Institute of Science and Technology (GIST), Korea, report their breakthrough in this endeavor. “While searching for a new electrocatalyst for LSBs, we recalled a previous study we had performed with cobalt oxalate (CoC2O4), in which we had found that charged ions can easily adsorb on this material’s surface during electrolysis. This motivated us to hypothesize that CoC2O4 would exhibit a similar behavior with sulfur in LSBs as well,” explains Prof. Jaeyoung Lee from GIST, who led the study.

To test their hypothesis, the scientists constructed an LSB by adding a layer of CoC2O4 on the sulfur cathode.

Sure enough, observations and analyses revealed that CoC2O4’s ability to adsorb sulfur allowed the reduction and dissociation of Li2S6 and LiPS.

Further, it suppressed the diffusion of LiPS into the electrolyte by adsorbing LiPS on its surface, preventing it from reaching the lithium anode and triggering a self-discharge reaction.

These actions together improved sulfur utilization and reduced anode degradation, thereby enhancing the longevity, performance, and energy storage capacity of the battery.

Charged by these findings, Prof. Lee envisions an electronic future governed by LSBs that LIBs cannot realize. “LSBs can enable efficient electric transportation, such as unmanned aircraft, electric buses, trucks and locomotives, in addition to large-scale energy storage devices,” he observes. “We hope that our findings can take LSBs one step closer to commercialization for these purposes.”

Perhaps it’s only a matter of time before lithium-sulfur batteries power the planet.

Source: Materials provided by GIST (Gwangju Institute of Science and Technology).

Disease-sniffing device that rivals a dog’s nose

Image credit: Bechewy

But it takes time to train such dogs, and their availability and time are limited. Scientists have been looking for ways of automating the amazing olfactory capabilities of the canine nose and brain in a compact device. Now, a team of researchers at MIT and other institutions has come up with a system that can detect the chemical and microbial content of an air sample with even greater sensitivity than a dog’s nose. They coupled this to a machine-learning process that can identify the distinctive characteristics of the disease-bearing samples.

The findings, which the researchers say could someday lead to an automated odor-detection system small enough to be incorporated into a cellphone, are being published today in the journal PLOS ONE, in a paper by Claire Guest of Medical Detection Dogs in the U.K., Research Scientist Andreas Mershin of MIT, and 18 others at Johns Hopkins University, the Prostate Cancer Foundation, and several other universities and organizations.

“Dogs, for now 15 years or so, have been shown to be the earliest, most accurate disease detectors for anything that we’ve ever tried,” Mershin says. And their performance in controlled tests has in some cases exceeded that of the best current lab tests, he says. “So far, many different types of cancer have been detected earlier by dogs than any other technology.”

What’s more, the dogs apparently pick up on connections that have so far eluded human researchers: When trained to respond to samples from patients with one type of cancer, some dogs have then identified several other types of cancer, even though the similarities between the samples weren’t evident to humans.

These dogs can identify “cancers that don’t have any identical biomolecular signatures in common, nothing in the odorants,” Mershin says. Using powerful analytical tools including gas chromatography mass spectrometry (GCMS) and microbial profiling, “if you analyze the samples from, let’s say, skin cancer and bladder cancer and lung cancer and breast cancer — all things that the dog has been shown to be able to detect — they have nothing in common.” Yet the dog can somehow generalize from one kind of cancer to be able to identify the others.

Mershin and the team have spent the past few years developing, and continuing to improve on, a miniaturized detector system that incorporates mammalian olfactory receptors stabilized to act as sensors, whose data streams can be handled in real time by a typical smartphone’s capabilities. He envisions a day when every phone will have a scent detector built in, just as cameras are now ubiquitous in phones. Such detectors, equipped with advanced algorithms developed through machine learning, could potentially pick up early signs of disease far earlier than typical screening regimes, he says, and could even warn of smoke or a gas leak as well.

In the latest tests, the team tested 50 samples of urine from confirmed cases of prostate cancer and from controls known to be free of the disease, using both dogs trained and handled by Medical Detection Dogs in the U.K. and the miniaturized detection system. They then applied a machine-learning program to tease out any similarities and differences between the samples that could help the sensor-based system to identify the disease. In testing the same samples, the artificial system was able to match the success rate of the dogs, with both methods scoring more than 70 percent.
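
As an illustration of what such a machine-learning step can look like (purely a sketch: the data here are synthetic stand-ins, since the study’s actual sensor streams and model are not reproduced in this article), one can train a standard classifier on per-sample sensor feature vectors:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Synthetic stand-in data: 50 "urine samples", 32 sensor channels each.
rng = np.random.default_rng(42)
X = rng.normal(size=(50, 32))
y = rng.integers(0, 2, size=50)   # 1 = cancer, 0 = control (random labels)
X[y == 1] += 0.8                  # give positives a faint shared signature

clf = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(clf, X, y, cv=5)   # 5-fold cross-validated accuracy
print(f"mean accuracy: {scores.mean():.2f}")
```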

The miniaturized detection system, Mershin says, is actually 200 times more sensitive than a dog’s nose in terms of being able to detect and identify tiny traces of different molecules, as confirmed through controlled tests mandated by DARPA. But in terms of interpreting those molecules, “it’s 100 percent dumber.” That’s where the machine learning comes in, to try to find the elusive patterns that dogs can infer from the scent, but that humans haven’t been able to grasp from a chemical analysis.

“The dogs don’t know any chemistry,” Mershin says. “They don’t see a list of molecules appear in their head. When you smell a cup of coffee, you don’t see a list of names and concentrations; you feel an integrated sensation. That sensation of scent character is what the dogs can mine.”

While the physical apparatus for detecting and analyzing the molecules in air has been under development for several years, with much of the focus on reducing its size, until now the analysis was lacking. “We knew that the sensors are already better than what the dogs can do in terms of the limit of detection, but what we haven’t shown before is that we can train an artificial intelligence to mimic the dogs,” he says. “And now we’ve shown that we can do it. We’ve shown that what the dog does can be replicated to a certain extent.”

This achievement, the researchers say, provides a solid framework for further research to develop the technology to a level suitable for clinical use. Mershin hopes to be able to test a far larger set of samples, perhaps 5,000, to pinpoint in greater detail the significant indicators of disease. But such testing doesn’t come cheap: It costs about $1,000 per sample for clinically tested and certified samples of disease-carrying and disease-free urine to be collected, documented, shipped, and analyzed, he says.

Reflecting on how he became involved in this research, Mershin recalled a study of bladder cancer detection in which a dog kept misidentifying one member of the control group as being positive for the disease, even though he had been specifically selected based on hospital tests as being disease-free. The patient, who knew about the dog’s test, opted to have further tests, and a few months later was found to have the disease at a very early stage. “Even though it’s just one case, I have to admit that did sway me,” Mershin says.

The team included researchers at MIT, Johns Hopkins University in Maryland, Medical Detection Dogs in Milton Keynes, U.K., the Cambridge Polymer Group, the Prostate Cancer Foundation, the University of Texas at El Paso, Imagination Engines, and Harvard University. The research was supported by the Prostate Cancer Foundation, the National Cancer Institute, and the National Institutes of Health.

Source: Materials provided by Massachusetts Institute of Technology. Original written by David L. Chandler. Note: Content may be edited for style and length.

Over 140,000 viral species living in the human gut

The paper, published today (18 February 2021) in Cell, contains an analysis of over 28,000 gut microbiome samples collected in different parts of the world.

The number and variety of the viruses the researchers found were surprisingly high, and the data open up new research avenues for understanding how viruses living in the gut affect human health.

The human gut is an incredibly biodiverse environment. In addition to bacteria, many thousands of viruses called bacteriophages, which can infect bacteria, also live there.

It is known that imbalances in our gut microbiome can contribute to diseases and complex conditions such as inflammatory bowel disease, allergies and obesity.

But relatively little is known about the role our gut bacteria, and the bacteriophages that infect them, play in human health and disease.

Using a DNA-sequencing method called metagenomics, researchers at the Wellcome Sanger Institute and EMBL’s European Bioinformatics Institute (EMBL-EBI) explored and catalogued the biodiversity of the viral species found in 28,060 public human gut metagenomes and 2,898 bacterial isolate genomes cultured from the human gut.

The analysis identified over 140,000 viral species living in the human gut, more than half of which have never been seen before.

Dr Alexandre Almeida, Postdoctoral Fellow at EMBL-EBI and the Wellcome Sanger Institute, said: “It’s important to remember that not all viruses are harmful; they represent an integral component of the gut ecosystem. For one thing, most of the viruses we found have DNA as their genetic material, which is different from the pathogens most people know, such as SARS-CoV-2 or Zika, which are RNA viruses. Secondly, these samples came mainly from healthy individuals who didn’t share any specific diseases. It’s fascinating to see how many unknown species live in our gut, and to try and unravel the link between them and human health.”

Among the tens of thousands of viruses discovered, a new, highly prevalent clade — a group of viruses believed to have a common ancestor — was identified, which the authors refer to as the Gubaphage.

This was found to be the second most prevalent virus clade in the human gut, after the crAssphage, which was discovered in 2014.

Both of these viruses seem to infect similar types of human gut bacteria, but without further research it is difficult to know the precise functions of the newly discovered Gubaphage.

Dr Luis F. Camarillo-Guerrero, first author of the study from the Wellcome Sanger Institute, said: “An important aspect of our work was to ensure that the reconstructed viral genomes were of the highest quality. A stringent quality control pipeline, including a machine learning approach, enabled us to mitigate contamination and obtain highly complete viral genomes. High-quality viral genomes pave the way to better understand what role viruses play in our gut microbiome, including the discovery of new treatments such as antimicrobials of bacteriophage origin.”

The results of the study form the basis of the Gut Phage Database (GPD), a highly curated database containing 142,809 non-redundant phage genomes that will be a valuable resource for those studying bacteriophages and the role they play in regulating the health of both our gut bacteria and ourselves.

Dr Trevor Lawley, senior author of the study from the Wellcome Sanger Institute, said: “Bacteriophage research is currently experiencing a renaissance. This high-quality, large-scale catalogue of human gut viruses comes at the right time to serve as a blueprint to guide ecological and evolutionary analysis in future virome studies.”

Source: Wellcome Trust Sanger Institute

Supermassive black holes could form from dark matter

Exactly how supermassive black holes initially formed is one of the biggest problems in the study of galaxy evolution today. Supermassive black holes have been observed as early as 800 million years after the Big Bang, and how they could grow so quickly remains unexplained.

Standard formation models involve normal baryonic matter — the atoms and elements that make up stars, planets, and all visible objects — collapsing under gravity to form black holes, which then grow over time. However, the new work investigates the potential existence of stable galactic cores made of dark matter and surrounded by a diluted dark matter halo, finding that the centres of these structures could become so concentrated that they could also collapse into supermassive black holes once a critical threshold is reached.

According to the model this could have happened much more quickly than other proposed formation mechanisms, and would have allowed supermassive black holes in the early Universe to form before the galaxies they inhabit, contrary to current understanding.

Carlos R. Argüelles, the researcher at Universidad Nacional de La Plata and ICRANet who led the investigation, comments: “This new formation scenario may offer a natural explanation for how supermassive black holes formed in the early Universe, without requiring prior star formation or needing to invoke seed black holes with unrealistic accretion rates.”

Another intriguing consequence of the new model is that the critical mass for collapse into a black hole might not be reached for smaller dark matter halos, for example those surrounding some dwarf galaxies. The authors suggest that this then might leave smaller dwarf galaxies with a central dark matter nucleus rather than the expected black hole. Such a dark matter core could still mimic the gravitational signatures of a conventional central black hole, whilst the dark matter outer halo could also explain the observed galaxy rotation curves.

“This model shows how dark matter haloes could harbour dense concentrations at their centres, which may play a crucial role in helping to understand the formation of supermassive black holes,” added Carlos.

“Here we’ve proven for the first time that such core-halo dark matter distributions can indeed form in a cosmological framework, and remain stable for the lifetime of the Universe.”

The authors hope that further studies will shed more light on supermassive black hole formation in the very earliest days of our Universe, as well as investigating whether the centres of non-active galaxies, including our own Milky Way, may play host to these dense dark matter cores.

Story Source:

Materials provided by the Royal Astronomical Society. Note: Content may be edited for style and length.

Perseverance rover safely touches down on Red Planet

Image credit: Wikipedia

With groundbreaking technology, the Mars 2020 mission launched July 30, 2020, from Cape Canaveral Space Force Station in Florida. The Perseverance rover mission marks an ambitious initiative in the effort to collect Mars samples and return them to Earth.

“This landing is one of those pivotal moments for NASA, the United States, and space exploration globally — when we know we are on the cusp of discovery and sharpening our pencils, so to speak, to rewrite the textbooks,” said acting NASA Administrator Steve Jurczyk. “The Mars 2020 Perseverance mission embodies our nation’s spirit of persevering even in the most challenging of situations, inspiring, and advancing science and exploration. The mission itself personifies the human ideal of persevering toward the future and will help us prepare for human exploration of the Red Planet.”

About the size of a car, the 2,263-pound (1,026-kilogram) robotic geologist and astrobiologist will undergo several weeks of testing before it begins its two-year science investigation of Mars’ Jezero Crater.

While the rover will investigate the rock and sediment of Jezero’s ancient lakebed and river delta to characterize the region’s geology and past climate, a fundamental part of its mission is astrobiology, including the search for signs of ancient microbial life.

To that end, the Mars Sample Return campaign, being planned by NASA and ESA (European Space Agency), will allow scientists on Earth to study samples collected by Perseverance to look for definitive signs of past life using instruments too large and complex to send to the Red Planet.

“Because of today’s exciting events, the first pristine samples from carefully documented locations on another planet are another step closer to being returned to Earth,” said Thomas Zurbuchen, associate administrator for science at NASA. “Perseverance is the first step in bringing back rock and regolith from Mars. We don’t know what these pristine samples from Mars will tell us. But what they could tell us is monumental — including that life might once have existed beyond Earth.”

Some 28 miles (45 kilometers) wide, Jezero Crater sits on the western edge of Isidis Planitia, a giant impact basin just north of the Martian equator. Scientists have determined that 3.5 billion years ago the crater had its own river delta and was filled with water.

The power system that provides electricity and heat for Perseverance through its exploration of Jezero Crater is a Multi-Mission Radioisotope Thermoelectric Generator, or MMRTG. The U.S. Department of Energy (DOE) provided it to NASA through an ongoing partnership to develop power systems for civil space applications.

Equipped with seven primary science instruments, the most cameras ever sent to Mars, and its exquisitely complex sample caching system — the first of its kind sent into space — Perseverance will scour the Jezero region for fossilized remains of ancient microscopic Martian life, taking samples along the way.

“Perseverance is the most sophisticated robotic geologist ever made, but verifying that microscopic life once existed carries an enormous burden of proof,” said Lori Glaze, director of NASA’s Planetary Science Division. “While we’ll learn a lot with the great instruments we have aboard the rover, it may very well require the much more capable laboratories and instruments back here on Earth to tell us whether our samples carry evidence that Mars once harbored life.”

Paving the Way for Human Missions

“Landing on Mars is always an incredibly difficult task and we are proud to continue building on our past success,” said JPL Director Michael Watkins. “But, while Perseverance advances that success, this rover is also blazing its own path and daring new challenges in the surface mission. We built the rover not just to land but to find and collect the best scientific samples for return to Earth, and its incredibly complex sampling system and autonomy not only enable that mission, they set the stage for future robotic and crewed missions.”

The Mars Entry, Descent, and Landing Instrumentation 2 (MEDLI2) sensor suite collected data about Mars’ atmosphere during entry, and the Terrain-Relative Navigation system autonomously guided the spacecraft during final descent. The data from both are expected to help future human missions land on other worlds more safely and with larger payloads.

On the surface of Mars, Perseverance’s science instruments will have a chance to scientifically shine. Mastcam-Z is a pair of zoomable science cameras on Perseverance’s remote sensing mast, or head, that creates high-resolution, color 3D panoramas of the Martian landscape.

Also located on the mast, the SuperCam uses a pulsed laser to study the chemistry of rocks and sediment and has its own microphone to help scientists better understand the properties of the rocks, including their hardness.

Located on a turret at the end of the rover’s robotic arm, the Planetary Instrument for X-ray Lithochemistry (PIXL) and the Scanning Habitable Environments with Raman & Luminescence for Organics & Chemicals (SHERLOC) instruments will work together to collect data on Mars’ geology close-up. PIXL will use an X-ray beam and a suite of sensors to delve into a rock’s elemental chemistry.

SHERLOC’s ultraviolet laser and spectrometer, along with its Wide Angle Topographic Sensor for Operations and eNgineering (WATSON) imager, will study rock surfaces, mapping out the presence of certain minerals and organic molecules, which are the carbon-based building blocks of life on Earth.

The rover chassis is home to three science instruments as well. The Radar Imager for Mars’ Subsurface Experiment (RIMFAX) is the first ground-penetrating radar on the surface of Mars and will be used to determine how different layers of the Martian surface formed over time. The data could help pave the way for future sensors that hunt for subsurface water ice deposits.

Also with an eye on future Mars exploration, the Mars Oxygen In-Situ Resource Utilization Experiment (MOXIE) technology demonstration will attempt to manufacture oxygen out of thin air — the Red Planet’s tenuous and mostly carbon dioxide atmosphere.

The rover’s Mars Environmental Dynamics Analyzer (MEDA) instrument, which has sensors on the mast and chassis, will provide key information about present-day Mars weather, climate, and dust.

Currently attached to the belly of Perseverance, the diminutive Ingenuity Mars Helicopter is a technology demonstration that will attempt the first powered, controlled flight on another planet.

Project engineers and scientists will now put Perseverance through its paces, testing every instrument, subsystem, and subroutine over the next month or two. Only then will they deploy the helicopter to the surface for the flight test phase. If successful, Ingenuity could add an aerial dimension to exploration of the Red Planet, in which such helicopters serve as scouts or make deliveries for future astronauts away from their base.

Once Ingenuity’s test flights are complete, the rover’s search for evidence of ancient microbial life will begin in earnest.

“Perseverance is more than a rover, and more than this amazing collection of men and women that built it and got us here,” said John McNamee, project manager of the Mars 2020 Perseverance rover mission at JPL. “It is even more than the 10.9 million people who signed up to be a part of our mission. This mission is about what humans can achieve when they persevere. We made it this far. Now, watch us go.”

More About the Mission

A primary objective for Perseverance’s mission on Mars is astrobiology research, including the search for signs of ancient microbial life. The rover will characterize the planet’s geology and past climate and be the first mission to collect and cache Martian rock and regolith, paving the way for human exploration of the Red Planet.

Subsequent NASA missions, in cooperation with ESA, will send spacecraft to Mars to gather these cached samples from the surface and return them to Earth for in-depth analysis.

The Mars 2020 Perseverance mission is part of NASA’s Moon to Mars exploration approach, which includes Artemis missions to the Moon that will help prepare for human exploration of the Red Planet.

JPL, a division of Caltech in Pasadena, California, manages the Mars 2020 Perseverance mission and the Ingenuity Mars Helicopter technology demonstration for NASA.

Story Source:

Materials provided by NASA. Note: Content may be edited for style and length.

Global temperatures temporarily raised due to COVID-19 lockdowns

The counterintuitive finding highlights the influence of airborne particles, or aerosols, that block incoming sunlight.

When emissions of aerosols dropped last spring, more of the Sun’s warmth reached the planet, especially in heavily industrialized nations, such as the United States and Russia, that normally pump high amounts of aerosols into the atmosphere.

“There was a big decline in emissions from the most polluting industries, which had immediate, short-term effects on temperatures,” said NCAR scientist Andrew Gettelman, the study’s lead author. “Pollution cools the planet, so it makes sense that pollution reductions would warm it.”

Temperatures over parts of Earth’s land surface last spring were about 0.2-0.5 degrees Fahrenheit (0.1-0.3 degrees Celsius) warmer than would have been expected with prevailing weather conditions, the study found.

The effect was most pronounced in regions that are normally associated with substantial emissions of aerosols, with the warming reaching about 0.7 degrees F (0.37 C) over much of the United States and Russia.

The new study highlights the complex and sometimes conflicting influences of different types of emissions from power plants, automobiles, industrial facilities, and other sources.

While aerosols tend to brighten clouds and reflect heat from the Sun back into space, carbon dioxide and other greenhouse gases have the opposite effect, trapping heat near the planet’s surface and elevating temperatures.

Despite the short-term warming effects, Gettelman emphasized that the long-term impact of the pandemic may be to slightly slow climate change because of reduced emissions of carbon dioxide, which lingers in the atmosphere for decades and has a more gradual influence on climate.

In contrast, aerosols — the focus of the new study — have a more immediate impact that fades away within a few years.

The study was published in Geophysical Research Letters. It was funded in part by the National Science Foundation, NCAR’s sponsor. In addition to NCAR scientists, the study was co-authored by scientists at Oxford University, Imperial College London, and the University of Leeds.

Teasing out the impacts

Although scientists have long been able to quantify the warming impacts of carbon dioxide, the climatic influence of various types of aerosols — including sulfates, nitrates, black carbon, and dust — has been harder to pin down. One of the major challenges for projecting the extent of future climate change is estimating the extent to which society will continue to emit aerosols in the future, and the influence the various types of aerosols have on clouds and temperature.

To conduct the research, Gettelman and his co-authors used two of the world’s leading climate models: the NCAR-based Community Earth System Model and a model known as ECHAM-HAMMOZ, which was developed by a consortium of European nations. They ran simulations on both models, adjusting emissions of aerosols and incorporating actual meteorological conditions in 2020, such as winds.

This approach enabled them to identify the impact of reduced emissions on temperature changes that were too small to tease out in actual observations, where they could be obscured by the variability in atmospheric conditions.
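
Here is a minimal sketch of that paired-simulation logic (not the actual CESM or ECHAM-HAMMOZ runs; the ensemble size, variability and signal below are invented): run matched ensembles with and without the emission reductions, then difference the ensemble means so the weather noise largely cancels.

```python
import numpy as np

# Paired-ensemble differencing: the aerosol signal is small relative to
# weather variability, but averaging over members and differencing the
# two experiments isolates it. All numbers here are hypothetical.
rng = np.random.default_rng(1)
n_members = 20   # ensemble members per experiment
noise = 0.5      # assumed weather variability (deg C)
signal = 0.2     # assumed aerosol-driven warming (deg C)

baseline = rng.normal(0.0, noise, n_members)             # normal emissions
reduced_aerosols = rng.normal(signal, noise, n_members)  # lockdown emissions

print(f"estimated warming: {reduced_aerosols.mean() - baseline.mean():+.2f} deg C")
```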

The results showed that the warming effect was strongest in the mid and upper latitudes of the Northern Hemisphere. The effect was mixed in the tropics and relatively minor in much of the Southern Hemisphere, where aerosol emissions aren’t as pervasive.

Gettelman said the study will help scientists better understand the influence of various types of aerosols in different atmospheric conditions, helping to inform efforts to minimize climate change.

Although the research illustrates how aerosols counter the warming influence of greenhouse gases, he emphasized that emitting more of them into the lower atmosphere isn’t a viable strategy for slowing climate change.

“Aerosol emissions have major health ramifications,” he said. “Saying we should pollute isn’t practical.”

Story Source:

Materials provided by Imperial College London. Original written by Caroline Brogan. Note: Content may be edited for style and length.

Limiting warming to 2 degrees Celsius

Now, the same authors have used their tools to ask: What emissions cuts would actually be required to meet the goal of 2 degrees Celsius of warming, considered a threshold for climate stability and for climate-related risks such as excessive heat, drought, extreme weather and sea level rise?

The University of Washington study finds that emissions reductions about 80% more ambitious than those in the Paris Agreement (an average drop in emissions of 1.8% per year rather than 1% per year) would be enough to stay within 2 degrees. The results were published Feb. 9 in Nature’s open-access journal Communications Earth & Environment.

“A number of people have been saying, particularly in the past few years, that the emissions targets need to be more ambitious,” said lead author Adrian Raftery, a UW professor of statistics. “We went beyond that to ask in a more precise way: How much more ambitious do they need to be?”

The paper uses the same statistical approach to model the three main drivers of human-produced greenhouse gases: national population, gross domestic product per person, and the amount of carbon emitted for each dollar of economic activity, known as carbon intensity. It then uses a statistical model to show the range of likely future outcomes based on data and projections so far.
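
These three drivers combine multiplicatively, in the spirit of the Kaya identity: total emissions equal population times GDP per person times carbon intensity. A quick illustration with made-up round numbers (not figures from the paper):

```python
# Emissions = population x (GDP / person) x (CO2 / GDP).
# The numbers below are illustrative, not from the UW study.
population = 8.0e9           # people
gdp_per_person = 12_000      # dollars per person per year
carbon_intensity = 0.35e-3   # tonnes of CO2 per dollar of GDP

emissions = population * gdp_per_person * carbon_intensity
print(f"{emissions / 1e9:.1f} billion tonnes of CO2 per year")  # ~33.6 Gt
```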

Even with updated methods and five more years of data, now spanning 1960 through 2015, the conclusion is similar to the previous study: Meeting Paris Agreement targets would give only a 5% probability of staying below 2 degrees Celsius of warming.

Assuming that climate policies won’t target population growth or economic growth, the authors then ask what change in the “carbon intensity” measure would be needed to meet the 2 degrees warming goal.

Increasing the overall targets to cut carbon emissions by an average of 1.8% per year, and continuing on that path after the Paris Agreement expires in 2030, would give the planet a 50% chance of staying below 2 degrees of warming by 2100.
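
The compounding arithmetic behind the two pathways is easy to check; here is a short calculation (the 2020-2100 window is an assumption for illustration):

```python
# Compare Paris-level cuts (1% per year) with the study's 2-degree
# pathway (1.8% per year), compounded from 2020 to 2100.
for rate in (0.010, 0.018):
    remaining = (1 - rate) ** (2100 - 2020)   # fraction of today's emissions left
    print(f"{rate:.1%} per year -> {remaining:.0%} of current emissions by 2100")
# Note 0.018 / 0.010 = 1.8: the annual rate of decline is 80% larger.
```

Compounded over 80 years, the 1.8% pathway leaves roughly 23% of current annual emissions by 2100, versus roughly 45% under 1% per year.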

“Achieving the Paris Agreement’s temperature goals is something we’re not on track to do now, but it wouldn’t take that much extra to do it,” said first author Peiran Liu, who did the research as part of his doctorate at the UW.

The paper looks at what this overall plan would mean for different countries’ Paris Agreement commitments. Nations set their own Paris Agreement emissions-reductions pledges. The United States pledged a drop in carbon emissions of about 1% per year until 2026, slightly more ambitious than the average. China pledged to reduce its carbon intensity, or the carbon emissions per unit of economic activity, by 60% of its 2005 levels by 2030.

“Globally, the temperature goal requires an 80% boost in the annual rate of emissions decline compared to the Paris Agreement, but if a nation has finished most of its promised mitigation measures, then the additional decline required now will be smaller,” Liu said.

Assuming that each country’s share of the work remains unchanged, the U.S. would need to increase its goal by 38% to do its part toward actually achieving the 2 degrees goal. China’s more ambitious and fairly successful plan would need only a 7% boost, and the United Kingdom, which has made substantial progress already, would need a 17% increase. On the other hand, countries that had pledged cuts but where emissions have risen, such as South Korea and Brazil, would need a bigger boost now to make up for lost time.

The authors also suggest that countries increase their accountability by reviewing progress annually, rather than on the five-year, 10-year or longer timescales included in many existing climate plans.

“To some extent, the discourse around climate has been: ‘We need to completely change our lifestyles and everything,’” Raftery said. “The idea from our work is that actually, what’s required isn’t easy, but it’s quantifiable. Reducing global emissions by 1.8% a year is a goal that’s not astronomical.”

From 2011 to 2015, Raftery says, the U.S. did see a drop in emissions, thanks to efficiencies in industries ranging from lighting to transportation, as well as regulation. The pandemic-related economic changes will be short-lived, he predicts, but the creativity and flexibility the pandemic has required may usher in a lasting drop in emissions.

“If you say, ‘Everything’s a disaster and we need to radically overhaul society,’ there’s a feeling of hopelessness,” Raftery said. “But if we say, ‘We need to reduce emissions by 1.8% a year,’ that’s a different mindset.”

This research was funded by the National Institutes of Health.

Materials provided by the University of Washington.

Solar system’s most distant known object

Farfarout was first spotted in January 2018 by the Subaru Telescope, located on Maunakea in Hawai’i. Its discoverers could tell it was very distant, but they weren’t sure exactly how far. They needed more observations.

“At that time we didn’t know the object’s orbit, as we only had the Subaru discovery observations over 24 hours, but it takes years of observations to get an object’s orbit around the Sun,” explained co-discoverer Scott Sheppard of the Carnegie Institution for Science. “All we knew was that the object appeared to be very distant at the time of discovery.”

Sheppard and his colleagues, David Tholen of the University of Hawai’i and Chad Trujillo of Northern Arizona University, spent the next few years tracking the object with the Gemini North telescope (also on Maunakea in Hawai’i) and the Carnegie Institution for Science’s Magellan Telescopes in Chile to determine its orbit. They have now confirmed that Farfarout currently lies 132 astronomical units (au) from the Sun, which is 132 times farther from the Sun than Earth is. (For comparison, Pluto is 39 au from the Sun, on average.)

Farfarout is even more remote than the previous solar system distance record-holder, which was discovered by the same team and nicknamed “Farout.” Provisionally designated 2018 VG18, Farout is 124 au from the Sun.

However, the orbit of Farfarout is quite elongated, taking it 175 au from the Sun at its farthest point and around 27 au at its closest, which is inside the orbit of Neptune. Because its orbit crosses Neptune’s, Farfarout could provide insights into the history of the outer solar system.
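
Those two distances allow a back-of-envelope consistency check (my calculation, not from the press release) using Kepler’s third law, which for orbits around the Sun gives the period in years as the semi-major axis in au raised to the power 3/2:

```python
# Semi-major axis is the average of aphelion and perihelion distances;
# Kepler's third law (P^2 = a^3 in years and au) then gives the period.
aphelion, perihelion = 175.0, 27.0   # au, from the orbit described above
a = (aphelion + perihelion) / 2      # semi-major axis: 101 au
period_years = a ** 1.5
print(f"a = {a:.0f} au, period ~ {period_years:.0f} years")  # ~1015 years
```

The result, roughly a thousand years, matches the millennium-long orbit Tholen describes below.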

“Farfarout was likely thrown into the outer solar system by getting too close to Neptune in the distant past,” said Trujillo. “Farfarout will likely interact with Neptune again in the future since their orbits still intersect.”

Farfarout is extremely faint. Based on its brightness and distance from the Sun, the team estimates it to be about 400 kilometers (250 miles) across, putting it at the low end of possibly being designated a dwarf planet by the International Astronomical Union (IAU).

The IAU’s Minor Planet Center in Massachusetts announced today that it has given Farfarout the provisional designation 2018 AG37. The solar system’s most distant known member will receive an official name after more observations are gathered and its orbit becomes even more refined in the coming years.

“Farfarout takes a millennium to go around the Sun once,” said Tholen. “Because of this, it moves very slowly across the sky, requiring several years of observations to precisely determine its trajectory.”

Farfarout’s discoverers are confident that even more distant objects remain to be discovered on the outskirts of the solar system, and that its distance record won’t stand for long.

“The discovery of Farfarout shows our increasing ability to map the outer solar system and observe farther and farther towards the fringes of our solar system,” said Sheppard. “Only with the advancements in the last few years of large digital cameras on very large telescopes has it been possible to efficiently discover very distant objects like Farfarout. Even though some of these distant objects are quite large — the size of dwarf planets — they are very faint because of their extreme distances from the Sun. Farfarout is just the tip of the iceberg of objects in the very distant solar system.”

Story Source:

Materials provided by NSF’s NOIRLab. Note: Content may be edited for style and length.

Create a realistic VR experience

Photo by Sophia Sideri on Unsplash

Virtual reality headsets are becoming increasingly popular for gaming, and with the global pandemic restricting our ability to travel, this technique could also be a cheap and simple way to create virtual tours of tourist destinations.

Conventional 360° photography stitches together thousands of shots as you move around one spot. However, it doesn’t retain depth perception, so the scene is distorted and the images look flat.

While state-of-the-art VR photography that incorporates depth perception is available to professional photographers, it requires expensive equipment, as well as time to process the thousands of photos needed to create a fully immersive VR environment.

Dr Christian Richardt and his team at CAMERA, the University of Bath’s motion capture research centre, have created a new type of 360° VR photography, accessible to amateur photographers, called OmniPhotos.

This is a fast, easy and robust system that recreates high-quality motion parallax, so that as the VR user moves their head, the objects in the foreground move faster than the background.

This mimics how your eyes view the real world, creating a more immersive experience.
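
The size of that parallax effect follows simple pinhole-camera geometry: for a small sideways head movement, a point’s on-screen shift is proportional to the head motion and inversely proportional to the point’s depth. A toy illustration (not the OmniPhotos algorithm; the focal length and distances are invented):

```python
# Image-space shift of a point under a small sideways head movement:
# shift (pixels) = focal length (pixels) * head shift (m) / depth (m).
head_shift_m = 0.05   # 5 cm head movement
focal_px = 800.0      # assumed focal length in pixels

for depth_m in (0.5, 2.0, 10.0):
    shift_px = focal_px * head_shift_m / depth_m
    print(f"object at {depth_m:>4} m shifts ~{shift_px:.0f} px")
# Nearby objects shift the most: the depth cue that flat 360 photos lack.
```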

OmniPhotos can be captured quickly and easily using a commercially available 360° video camera on a rotating selfie stick.

Using a 360° video camera also unlocks a significantly larger range of head motions.

OmniPhotos are built on an image-based representation, with optical flow and scene-adaptive geometry reconstruction, which is customized for real-time 360° VR rendering.

Dr Richardt and his team presented the new system at the international SIGGRAPH Asia conference on Sunday 13th December 2020.

He said: “Until now, VR photography that uses realistic motion parallax has been the preserve of professional VR photographers, using expensive equipment and requiring complex software and computing power to process the images.

“OmniPhotos simplifies this process so that you can use it with a commercially available 360° camera that costs only a couple of hundred pounds.

“This opens up VR photography to a whole new set of applications, from estate agents’ virtual tours of homes to immersive VR journeys at remote tourist destinations. With the pandemic stopping many of us from travelling on holiday this year, this is a way of virtually visiting places that are currently inaccessible.”