Science X Newsletter Tuesday, Nov 26

Dear ymilog,

Be an ACS Industry Insider:

Sign up and get free, monthly access to articles that cover exciting, cutting-edge discoveries in Energy, Environmental Science and Agriculture.

Here is your customized Science X Newsletter for November 26, 2019:

Spotlight Stories Headlines

MOOSE: A platform to create complex multiphysics simulations

Hardening of the arteries: Platelets, inflammation and a rogue protein conspire against the heart

Fast surface dynamics enabled cold joining of metallic glasses

A new framework could aid the search for heavy thermal dark matter

Fire ants' raft-building skills react as fluid forces change

16-million-year-old fossil shows springtails hitchhiking on winged termite

Dinosaur skull turns paleontology assumptions on their head

Fertility treatment, not maternal age, causes epigenetic changes in mouse offspring

AI debate machine argues with itself at Cambridge Union

Leftover grain from breweries could be converted into fuel for homes

Multiple sclerosis linked to variant of common herpes virus through new method

Harvesting fog can provide fresh water in desert regions

Chemical herders could impact oil spill cleanup

Ternary acceptor and donor materials increase photon harvesting in organic solar cells

Investigators narrow in on a microRNA for treating multiple sclerosis

Physics news

A new framework could aid the search for heavy thermal dark matter

Astrophysicists have been searching for dark matter for several decades, but these searches have so far yielded disappointing results. In a recent study, two researchers at Weizmann Institute of Science and the Hebrew University of Jerusalem in Israel have introduced a new theoretical framework outlining a mechanism of elementary thermal dark matter with a mass up to 10^14 GeV.

Fire ants' raft-building skills react as fluid forces change

Fire ants build living rafts to survive floods and rainy seasons. Georgia Tech scientists are studying whether a fire ant colony's ability to respond to changes in its environment during a flood is an instinctual behavior, and how fluid forces shape that response.

Harvesting fog can provide fresh water in desert regions

Fog harvesting is a potential practical source of fresh water in foggy coastal deserts, and current solutions rely on meter scale nets/meshes. The mesh geometry, however, presents a physiologically inappropriate shape for millimeter scale bulk bodies, like insects.

Chemical herders could impact oil spill cleanup

Oil spills in the ocean can cause devastation to wildlife, so effective cleanup is a top priority. One method to clean up oil spills is by burning, which only works if the oil is heavily concentrated in one area. Research from Johns Hopkins University shows the effects of chemical herders, which are agents that may be used to concentrate oil spills, on wave breaking.

Ternary acceptor and donor materials increase photon harvesting in organic solar cells

Organic solar cells are steadily improving as new materials are developed for the active layer, particularly when materials are stacked in a bulk heterojunction design that takes advantage of multiple combined absorption windows to use photons at more parts of the spectrum.

From firearms to fish—following patterns to discover causality

Mathematicians have successfully applied a new, pictorial approach to answer complex questions that puzzle analysts, such as, do media stories on firearm legislation influence gun sales? Cause-and-effect queries like this pop up in various fields, from finance to neuroscience, and objective methods are needed to deliver reliable answers.

Industrial bread dough kneaders could use physics-based redesign

Bakers have been crafting bread for more than 6,000 years with four simple ingredients: flour, salt, water and yeast. Apart from using high-quality ingredients, the kneading process and amount of time the dough is given to rise ultimately determine the bread's quality.

Researchers set new upper limit on neutrino mass

An international team of researchers has used a new spectrometer to find and set an upper limit for the mass of a neutrino. In their paper published in the journal Physical Review Letters, the group describes how they came up with the new limit and why they believe finding it was important.

Injection of magnetizable fluid could extend trauma patients' survival time

Inspired by their use in mechanical systems, Massachusetts Institute of Technology researchers are testing a magnetically-actuated fluidic valve to use in trauma patients suffering from hemorrhage.

Resolving the 'proton radius puzzle'

How do you measure the width of a proton?

New study shows unique magnetic transitions in quasicrystal-like structures

In the world of materials science, many have heard of crystals: highly ordered structures in which atoms are arranged in a tight and periodic manner (the atomic arrangement repeats). But not many people know about quasicrystals, which are unique structures with strange atomic arrangements. Like crystals, quasicrystals are tightly arranged, but what sets them apart is that they can possess a pentagonal (five-fold) symmetry forbidden in periodic crystals, so that the atomic arrangement is highly ordered but not periodic.

Astronomy & Space news

More dark-matter-deficient dwarf galaxies found

A team of researchers with members affiliated with multiple institutions in China has found evidence for more dark-matter-deficient dwarf galaxies. In their paper published in the journal Nature Astronomy, the group describes their study of dwarf galaxies and how they found some they expected to be dominated by dark matter were not.

Black hole nurtures baby stars a million light years away

Black holes are famous for ripping objects apart, including stars. But now, astronomers have uncovered a black hole that may have sparked the births of stars over a mind-boggling distance, and across multiple galaxies.

Scientists inch closer than ever to signal from cosmic dawn

Around 12 billion years ago, the universe emerged from a great cosmic dark age as the first stars and galaxies lit up. With a new analysis of data collected by the Murchison Widefield Array (MWA) radio telescope, scientists are now closer than ever to detecting the ultra-faint signature of this turning point in cosmic history.

Image: Giant magnetic ropes in a galaxy's halo

This image of the "Whale Galaxy" (NGC 4631), made with the National Science Foundation's Karl G. Jansky Very Large Array (VLA), reveals hair-like filaments of the galaxy's magnetic field protruding above and below the galaxy's disk.

Space travel can make the gut leaky

Bacteria, fungi, and viruses can enter our gut through the food we eat. Fortunately, the epithelial cells that line our intestines serve as a robust barrier to prevent these microorganisms from invading the rest of our bodies.

Testing time for MetOp Second Generation

MetOp Second Generation (MetOp-SG) is a follow-on system to the successful MetOp satellites, the last of which launched into its 800 km polar orbit in 2018.

Anisotropic radio-wave scattering in the solar corona

Solar radio emission is produced in the turbulent medium of the solar atmosphere, and its observed properties (source position, size, time profile, polarization, etc.) are significantly affected by the propagation of the radio waves from the emitter to the observer. Scattering of radio waves on random density irregularities has long been recognized as an important process for the interpretation of radio source sizes (e.g., Steinberg et al. 1971), positions (e.g., Fokker 1965; Stewart 1972), directivity (e.g., Thejappa et al. 2007; Bonnin et al. 2008; Reiner et al. 2009), and intensity-time profiles (e.g., Krupar et al. 2018, Bian et al. 2019). While a number of Monte Carlo simulations have been developed to describe radio-wave scattering (mostly for isotropic density fluctuations), not all agree. The present work addresses this important issue both by extending and improving the previous descriptions.

Technology news

MOOSE: A platform to create complex multiphysics simulations

In recent decades, technological advances have opened up exciting new possibilities for research in a variety of fields, including physics. Nonetheless, creating sophisticated simulations to represent or address multiphysics problems using computing resources can still be very challenging.

AI debate machine argues with itself at Cambridge Union

IBM has a Project Debater AI system that can debate humans on complex topics. A recent event to showcase its capabilities turned into pure drama as the machine proceeded to throw AI under the bus as it took both con and pro positions as to whether or not AI is harmful to humans.

A record-setting transistor

Many of the technologies we rely on, from smartphones to wearable devices and more, utilize fast wireless communications. What might we accomplish if those devices transmitted information even faster?

Nuclear reactors with a newly proposed barrier could have withstood Chernobyl and Fukushima

In the aftermath of the most notorious accidents in the history of nuclear energy, at Three Mile Island (1979), Chernobyl (1986) and Fukushima (2011), all three of which became devastating disasters when meltdowns in the reactor core released radiation into the environment, many countries around the world have pledged to phase out nuclear power.

Alibaba shares surge on Hong Kong debut

Chinese online retail giant Alibaba surged Tuesday as it drew back the curtain on a Hong Kong listing the firm described as a vote of confidence in the embattled city.

Firings spark dissent in Google ranks

Google on Monday fired four employees on the grounds they violated data security policies, prompting ire among colleagues concerned it was retaliation for worker organizing.

Saving bats from wind turbine death

Wind energy holds great promise as a source of renewable energy, but some have wondered if measures taken to address climate change have taken precedence over conservation of biodiversity. Wind turbines, for example, kill some birds, and the fatality rate for bats is even higher. Since bats are a crucial part of the ecosystem, helping with pollination, insect management, plant seed dispersal, etc., the high fatality rate is concerning.

Search results not biased along party lines, Stanford study finds

In recent months, questions have arisen about big tech's unparalleled influence over what news and information people see online. Potential political bias and censorship in search engine results are a big part of the conversation. Is the concern well-founded?

Periodic review of the artificial intelligence industry reveals challenges

As part of Stanford's ongoing 100-year study on artificial intelligence, known as the AI100, two workshops recently considered the issues of care technologies and predictive modeling to inform the future development of AI technologies.

Former Tinder CEO Sean Rad accused of secretly recording employees and bosses in new court filing

The multibillion-dollar legal battle between Sean Rad, the co-founder and former chief executive of Tinder, and parent company IAC took a new turn this week when IAC alleged in a new court filing that Rad secretly recorded multiple conversations with Tinder employees and his supervisors, potentially violating California law requiring both parties to consent to being recorded.

Even computer algorithms can be biased. Scientists have different ideas of how to prevent that

Scientists say they've developed a framework to make computer algorithms "safer" to use without creating bias based on race, gender or other factors. The trick, they say, is to make it possible for users to tell the algorithm what kinds of pitfalls to avoid—without having to know a lot about statistics or artificial intelligence.

New safety recommendations for culvert repair released

Communities across the U.S. rely on drainage culverts to keep roadways safe. While these buried structures cross streams and divert water from roadways, many are in need of repair. Unexpected culvert failures can disrupt traffic, damage the environment and nearby property, and can even be fatal.

New research considers future interactions with computer-generated people in virtual reality

Dr. Rachel McDonnell, Assistant Professor in Creative Technologies at Trinity, focuses on the animation of virtual humans for the entertainment industry and virtual reality (VR).

Hydrogen from natural gas without carbon dioxide emissions

Methane pyrolysis will allow for the future climate-friendly use of fossil natural gas. Methane is separated into gaseous hydrogen and solid carbon that is a valuable material for various industry branches and can also be stored safely. This may be a key component of future climate-neutral energy supply. Researchers of Karlsruhe Institute of Technology (KIT) have developed a highly efficient process for this purpose. Together with the industry partner Wintershall Dea, this process will now be further developed for use on the industrial scale.

Xerox launches shareholder fight for control of HP

Xerox said Tuesday it would take its hostile takeover offer for HP to shareholders after the computer and printer maker rejected the $33 billion offer.

Ad business a boon for Amazon but a turn-off for shoppers

Mike Maddaloni went to Amazon knowing exactly what he wanted to buy.

Audi to slash 9,500 jobs in Germany by 2025

German luxury carmaker Audi said Tuesday it planned to slash 9,500 jobs in Germany by 2025 as part of a massive overhaul to help finance a costly switch to electric vehicles.

Report shows high injury rate at Amazon warehouses

Injury rates reported for workers at Amazon warehouses across the United States are more than double the national average, according to a news investigation into workplace conditions at the electronic commerce giant.

Google tensions deepen over firings of 'Thanksgiving Four'

Google on Monday fired four employees on the grounds they had violated data security policies, but the tech titan was accused of persecuting them for trying to unionize staff.

Tesla and Ford trade challenges in macho truck world

Tesla and Ford were in a virtual stare-down on Tuesday in the macho truck world, each claiming their electric pick-up was strongest.

New technology makes internet memes accessible for people with visual impairments

People with visual impairments use social media like everyone else, often with the help of screen reader software. But that technology falls short when it encounters memes, which don't include alternate text, or alt text, to describe what's depicted in the image.

Team develops satellites that fix other satellites

When satellites break, which is surprisingly often, there isn't much you can do about them.

Senate Democrats propose sweeping data privacy bill

Senate Democrats are proposing a broad federal data privacy law that would allow people to see what information companies have collected on them and demand that it be deleted.

UK science engineering company ready to take Purdue heating technology to the market

A novel heating technology based on materials commonly used in the aerospace industry soon may be helping doctors, forensic scientists and automobile manufacturers. Alconbury Weston Limited, a science-engineering company based in the United Kingdom, has licensed carbon fiber technology from Purdue Research Foundation to support industries ranging from research institutes to commercial manufacturers.

Do you know exactly where you are?

We all rely on GPS to tell us where we are and where we're going. The US government's global network of 30+ satellites guides planes, ships, cars, tractors and much more. The latest GPS systems can provide millimeter- to centimeter-level accuracy using advanced equipment and techniques.

Baby Yoda GIFs are back after 'confusion' led to removal

People can send each other animations of Baby Yoda again.

This email is a free service of Science X Network
You received this email because you subscribed to our list.
If you do not wish to receive such emails in the future, please unsubscribe here.
You may manage your subscription options from your Science X profile


Mathematicians Catch a Pattern by Figuring Out How to Avoid It

We finally know how big a set of numbers can get before it has to contain a pattern known as a “polynomial progression.”

Mathematicians are making inroads in the grand effort to find structure 

Some mathematical patterns are so subtle you could search for a lifetime and never find them. Others are so common that they seem impossible to avoid.
A new proof by Sarah Peluse of the University of Oxford establishes that one particularly important type of numerical sequence is, ultimately, unavoidable: It’s guaranteed to show up in every single sufficiently large collection of numbers, regardless of how the numbers are chosen.
“There’s a sort of indestructibility to these patterns,” said Terence Tao of the University of California, Los Angeles.
Peluse’s proof concerns sequences of numbers called “polynomial progressions.” They are easy to generate — you could create one yourself in short order — and they touch on the interplay between addition and multiplication among the numbers.
For several decades, mathematicians have known that when a collection, or set, of numbers is small (meaning it contains relatively few numbers), the set might not contain any polynomial progressions. They also knew that as a set grows it eventually crosses a threshold, after which it has so many numbers that one of these patterns has to be there, somewhere. It’s like a bowl of alphabet soup — the more letters you have, the more likely it is that the bowl will contain words.
But prior to Peluse’s work, mathematicians didn’t know what that critical threshold was. Her proof provides an answer — a precise formula for determining how big a set needs to be in order to guarantee that it contains certain polynomial progressions.
Previously, mathematicians had only a vague understanding that polynomial progressions are embedded among the whole numbers (1, 2, 3 and so on). Now they know exactly how to find them.

Finding Patterns

In 1975, the Hungarian mathematician Endre Szemerédi proved a foundational result about a simpler kind of pattern: the arithmetic progression, in which you begin with one number and keep adding another, as in 2, 5, 8, 11, 14, and so on. First, he said, decide how long you want your arithmetic progression to be. It could be any such pattern with four terms (2, 5, 8, 11), or seven terms (14, 17, 20, 23, 26, 29, 32), or any number of terms you want. Then he proved that once a set reaches some exact size (which he couldn't identify), it must contain an arithmetic pattern of that length. In doing so, he codified the intuition that among a large enough collection of numbers there has to be a pattern somewhere.
“Szemerédi basically said that complete disorder is impossible. No matter what set you take, there have to be little inroads of structure inside of it,” said Ben Green of Oxford.
But Szemerédi’s theorem didn’t say anything about how big a collection of numbers needs to be before these patterns become inevitable. He simply proved that some threshold size exists, beyond which every set contains an arithmetic pattern of the length you’re looking for.
More than two decades later, a mathematician put a number on that size — in effect, proving the second main fact about these arithmetic patterns.
In 2001, Timothy Gowers of the University of Cambridge proved that if you want to be guaranteed to find, say, a five-term arithmetic progression, you need a set of numbers that’s at least some exact size — and he identified just what that size is. (Actually stating the size is complicated, as it involves enormous exponential numbers.)
To understand what Gowers did, you have to understand what mathematicians mean when they talk about the “size” of a set of numbers and the idea of “big enough.”
First, choose an interval on the number line, say 1 to 1,000, or something more random like 17 to 1,016. The endpoints of the interval don’t matter — only the length does. Next, determine the proportion of the numbers within that interval that you’re going to put into your set. For example, if you create a set with 100 numbers from 1 to 1,000, the size of your set is 10% of the interval.
Gowers’ proof works regardless of how you choose the numbers in that set. You could take the first 100 odd numbers between 1 and 1,000, the first 100 numbers that end in a 6, or even 100 random numbers. Whatever your method, Gowers proved that once a set takes up enough space (not necessarily 10%) in a long enough interval, it can’t help but include a five-term arithmetic progression. And he did the same for arithmetic progressions of any length you want.
“After Gowers, you know that if you give me an arithmetic progression of any length, then any subset” of numbers of some specified size must necessarily contain that progression, Peluse said.
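The guarantee is easy to experiment with at a small scale. Below is a minimal brute-force check, in Python, for whether a set contains a k-term arithmetic progression; the function name and example sets are our own illustrations, not anything from the papers discussed.

```python
def has_arithmetic_progression(s, k):
    """Return True if the set s contains a k-term arithmetic progression."""
    s = set(s)
    for start in s:
        # The common difference can never exceed max(s) - start.
        for step in range(1, max(s) - start + 1):
            if all(start + i * step in s for i in range(k)):
                return True
    return False

# A carefully chosen small set can avoid three-term progressions entirely:
print(has_arithmetic_progression({1, 2, 4, 8, 9}, 3))  # False
# But structured examples are easy to find:
print(has_arithmetic_progression({2, 5, 8, 11}, 4))    # True
```

The first set is engineered to dodge three-term progressions; Gowers' result says such engineering must fail once a set takes up enough of a long enough interval.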
Peluse’s work parallels Gowers’ achievement, except that she focused on polynomial progressions instead.
In an arithmetic progression, remember, you choose one starting number and keep adding another. In the type of polynomial progressions studied by Peluse, you still pick a starting value, but now you add powers of another number. For example: 2, 2 + 3^1, 2 + 3^2, 2 + 3^3, 2 + 3^4. Or: 2, 5, 11, 29, 83. (Her progressions also had only one term for each power, a requirement that makes them more manageable.)
These polynomial progressions are closely related to an important type of pattern known as a geometric progression, formed by raising a number to increasingly high powers: 3^1, 3^2, 3^3, 3^4. These appear naturally in many areas of math and physics, and they’ve fascinated mathematicians for millennia. Geometric progressions are rare among even large sets of numbers, but if you tweak a geometric progression slightly, by adding a constant, like 2, to each term, you get a polynomial progression. And these seem to be everywhere.
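That relationship is easy to make concrete: shifting every term of a geometric progression by a constant produces one of the progressions Peluse studied. A small illustrative sketch (the function names are ours, not from the paper):

```python
def geometric_progression(base, length):
    """base^1, base^2, ..., base^length."""
    return [base ** i for i in range(1, length + 1)]

def shifted_polynomial_progression(start, base, length):
    """start, start + base^1, start + base^2, ...: one term per power."""
    return [start] + [start + t for t in geometric_progression(base, length)]

print(geometric_progression(3, 4))              # [3, 9, 27, 81]
print(shifted_polynomial_progression(2, 3, 4))  # [2, 5, 11, 29, 83]
```

The second call reproduces the article's example: adding the constant 2 to the powers of 3 yields 2, 5, 11, 29, 83.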

“You can construct large sets of [numbers] that don’t contain geometric progressions. But if you allow yourself more freedom and shift your geometric progression,” creating a polynomial progression, then large sets seem to be forced to contain them, said Sean Prendiville of Lancaster University, who has collaborated with Peluse on polynomial progressions.
In 1996, Vitaly Bergelson and Alexander Leibman proved that once a set of numbers gets big enough, it has to contain polynomial progressions — the polynomial equivalent of Szemerédi’s work. But once again, that left mathematicians with no idea what “big enough” meant.
Peluse answered that question in a counterintuitive way — by thinking about exactly what it would take for a set of numbers not to contain the pattern you’re looking for.

Fighting Patterns With Patterns

Peluse wanted to determine how big a set needs to be — what percentage of the numbers in an interval it needs to include — in order to ensure that it contains a given polynomial progression. To do that, she had to think about all the ways a set of numbers could avoid containing the progression — and then to prove that past a certain size, even the cleverest avoidance strategies no longer work.
You can think about the problem as a challenge. Imagine that someone asks you to create a set containing half the numbers between 1 and 1,000. You win if the set doesn’t contain the first four terms of a polynomial progression. How would you select the numbers?

Your instinct might be to choose the numbers at random. That instinct is wrong.
“Most sets are in the middle of the bell curve. The number of polynomial progressions they contain is the average number,” Prendiville said. And that average is way more than the zero you want.
It’s similar to how, if you chose a person at random from the world’s population, you’d probably get someone close to average height. If your goal was to find a much rarer 7-footer, you’d need to search in a more targeted way.
So to win the number-picking challenge, you’ll need a more organized way of deciding which ones to include in your set of 500. You might notice, for example, that if you choose only even numbers, you eliminate the possibility that your set contains polynomial progressions with any odd numbers. Progress! Of course, you’ve also increased the likelihood that your set contains polynomial progressions made up of even numbers.
But the point is that by coming up with a structured way of choosing those 500 numbers, you can rule out the possibility that your set will contain certain polynomial progressions. In other words, it takes a pattern to avoid a pattern.
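A toy simulation makes this trade-off visible. The sketch below is illustrative only, not the machinery used in the actual proofs: it counts progressions of the form a, a + b, a + b^2, a + b^3 in a random half of the numbers from 1 to 1,000 versus the set of all even numbers in that range.

```python
import random

def count_shifted_geometric(s, base, length=4):
    """Count progressions a, a + base, a + base^2, ..., a + base^(length-1)
    whose terms all lie in the set s."""
    s = set(s)
    return sum(
        all(a + base ** i in s for i in range(1, length))
        for a in s
    )

random.seed(0)
universe = range(1, 1001)
random_set = random.sample(universe, 500)          # an "unstructured" half
even_set = [n for n in universe if n % 2 == 0]     # a highly structured half

# The random half contains roughly the average number of these patterns.
# The all-even set rules out every progression with odd shifts (base 3)
# but is forced to contain many with even shifts (base 2).
print(count_shifted_geometric(random_set, base=3))
print(count_shifted_geometric(even_set, base=3))   # 0: odd shifts excluded
print(count_shifted_geometric(even_set, base=2))   # 496: even shifts abound
```

Choosing the evens eliminates one family of progressions only to guarantee another, which is the "pattern to avoid a pattern" phenomenon in miniature.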
Peluse set out to prove that after reaching a certain size, even the most cleverly constructed sets still have to contain a given polynomial progression. Essentially, she wanted to identify the critical point at which every time you avoid one instance of a polynomial progression, you back into another, as with the odd and even numbers.
To do that, she had to find a way to quantify just how much structure a set contains.

Measuring Structure

Before Peluse’s paper, many mathematicians had tried to understand when polynomial progressions appear among sets of numbers. The researchers included some of the most accomplished minds in the field, but none of them made significant progress toward figuring out how big a set needs to be in order to contain polynomial progressions of different lengths.
The main impediment was that mathematicians had no idea how to capture, in a precise way, the kinds of structures that might allow a set to avoid containing polynomial progressions. There was one candidate technique. But at the time Peluse started her work, it was completely inapplicable to questions about polynomial progressions.
The technique originated in Gowers’ 2001 work on arithmetic progressions. There, Gowers had created a test, called the “Gowers norm,” which detects specific kinds of structures in a set of numbers. The test generates a single number quantifying the amount of structure in the set — or, to put it differently, it quantifies how far the set is from being a set of random numbers.
“The notion of looking random is not a well-defined mathematical notion,” Green said. Gowers found a way to quantify what this should mean.
A set can be structured to a greater or lesser extent. Sets containing random numbers have no structure at all — and therefore are likely to contain numerical patterns. They have a lower Gowers norm. Sets containing only odd numbers, or only multiples of 10, have a rudimentary structure. It’s easy to prove that past a certain size, sets with these kinds of simple designs also contain different kinds of patterns.
The most difficult sets to work on are those with a very intricate structure. These sets could still look random when in fact they’re constructed according to some kind of very subtle rule. They have a high Gowers norm, and they present the best chance of systematically avoiding patterns as the sets themselves grow larger.
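To see the idea in miniature, the simplest member of this family of tests, the degree-2 norm, can be computed by brute force on a small cyclic group. The sketch below is our illustration of the general idea, not code from any of the papers discussed; it applies the norm to the "balanced" indicator of a set (the indicator minus the set's density, so the value reflects structure rather than size), and for real-valued functions the complex conjugates in the standard definition can be dropped.

```python
import random

def gowers_u2(f, N):
    """Naive fourth power of the U^2 Gowers norm: the average of
    f(x) f(x+a) f(x+b) f(x+a+b) over all x, a, b in Z_N (real-valued f)."""
    total = 0.0
    for x in range(N):
        for a in range(N):
            for b in range(N):
                total += f[x] * f[(x + a) % N] * f[(x + b) % N] * f[(x + a + b) % N]
    return total / N ** 3

def balanced_indicator(A, N):
    """Indicator of A minus its density, so the norm measures structure, not size."""
    density = len(A) / N
    return [(1.0 if x in A else 0.0) - density for x in range(N)]

N = 32
random.seed(1)
random_set = set(random.sample(range(N), N // 2))  # a "structureless" half of Z_N
structured_set = set(range(0, N, 2))               # the even residues: very structured

print(gowers_u2(balanced_indicator(random_set, N), N))      # small
print(gowers_u2(balanced_indicator(structured_set, N), N))  # 0.0625, much larger
```

The random half scores near zero, while the rigidly patterned even residues score exactly 1/16, the maximum possible for a half-density set: the norm detects the hidden rule.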
While Gowers used these techniques to answer questions about arithmetic progressions, they were not applicable to questions about polynomial progressions. This is because arithmetic progressions are evenly spaced, while the terms in polynomial progressions jump wildly from one to the next. The Gowers norms were useful for studying polynomial progressions the way a weed whacker is good for stripping paint from a house: the right general idea, even if it's not suited to the job.
In her new proof, Peluse used the basic idea of the Gowers norm to create a wholly new way of analyzing the kinds of structures relevant to polynomial progressions. She used this technique, called “degree-lowering,” to prove that for the polynomial progressions she was interested in, the only kinds of structures you actually need to worry about are the blunt, simple ones with a low Gowers norm. That’s because polynomial progressions veer so extremely from one term to the next that they inevitably run through subtler numerical obstacles — like a bull crashing through display tables on its way out of a china shop.

Peluse's formula is complicated to state. It involves taking a double logarithm of the length of the initial interval from which you choose the numbers in your set. The minimum size she came up with is also not necessarily the true minimum; future work might find that the critical threshold is even lower. But before her proof, mathematicians had no quantitative understanding at all of when polynomial progressions were guaranteed.
“She’s the first person to show how large sets need to be,” Prendiville said.
Peluse’s proof answers one quantitative question about polynomial progressions. Mathematicians now hope they can use it to answer another — about when polynomial progressions appear in sets composed entirely of prime numbers, which are the most important numbers in mathematics and notoriously resistant to patterns. Before Peluse’s proof, mathematicians had no idea how to approach that question.
“There’s a hope you can import some of the arguments from my paper into the primes setting,” Peluse said.

Quantum Effects at Mega-Scale: The Universe May Have Quantum Properties

An article by Artyom Yurov, Director of the IKBFU Institute of Physics, Mathematics and Information Technology, and Valerian Yurov, an Associate Professor at the Institute, was recently published in the European Physical Journal. The scientists present calculations according to which the Universe may have quantum properties.
Artyom Yurov explained:
“To begin with, let’s remember what quantum physics is. Perhaps it is the most amazing phenomenon known to people. When scientists started studying atoms for the first time, they noticed that everything works “upside down” in the microcosm. For example, according to quantum theory, an electron may be present in several places simultaneously.
Try to imagine your cat simultaneously lying on the sofa and eating from its bowl in the other corner of the room. The cat is not either here or there, but in both places simultaneously. But the cat is there only BEFORE you look at it. The moment you start staring at it, it settles into EITHER the bowl OR the sofa. You may ask, of course: if the cat acts so weird only when we are not observing it, how do we know that it acts this way at all? The answer is simple: math! If we gather statistics on our observations (counting the cases when the cat was on the sofa and the cases when it was near the bowl), the results cannot be explained by assuming the cat was always EITHER near the bowl OR on the sofa. Well, it doesn't work like that with cats, but it works just fine for electrons.
When we observe this particle, it really appears in one place and we can record that, but when we do not observe it, it must be in several places at once. For example, this is what they mean in chemistry classes when they talk about electron clouds. No wonder poor children never understand this. They just memorize … ”
Professor Artyom Yurov. Credit: Immanuel Kant Baltic Federal University

Decoherence Effect

Yes, a cat is not an electron, but why should that matter? Cats consist of elementary particles such as electrons, protons, and neutrons, and all these particles act the same way when measured on the quantum level. So why can't a cat be in two places simultaneously?
And there is another question: what is so magical about our ability to “observe”? When we don’t “observe” an object, it is “smeared” all over the universe, yet the moment we look at it, it gathers in one place! Physicists don’t say “gathered”; they say the “wave function collapsed.” But those learned words simply mean that, as a result of observation, the object is gathered in one place. How are we able to do that?
“Firstly, the answer to these difficult questions appeared at the end of the last century, when the phenomenon of decoherence was discovered. It turns out that any object really is located in many places at once; it is, in a sense, spread throughout the universe. But the moment the object interacts with its environment, even with a single photon, it “collapses.” So there is no mystical ability of observers to cause quantum collapse: the collapse is due to interaction with the environment, and we are simply part of that environment.
Professor Valerian Yurov. Credit: Immanuel Kant Baltic Federal University
Secondly, there is no absolute collapse as such. What happens is this: suppose that before interacting with the environment the object was “smeared” over two places (we say “two places” to simplify; in reality it might be smeared over hundreds of thousands of places). After the interaction, the object spends 99.9999% (with many, many more nines) of the time in one place and only the tiny remainder in the second, so we observe it as being in one and only one place. This happens almost instantaneously, and the bigger the object, the faster the “collapse.” We cannot perceive it or register it with any instrument; such devices simply do not exist, and they cannot be created.”
According to Artyom Yurov, the idea that the Universe has quantum properties was suggested to him long ago by his friend and co-author from Madrid, Professor Pedro Gonzalez Diaz (who, unfortunately, has since passed away).
Prof. Yurov said: “Back then I was skeptical about the idea, because it is known that the bigger an object is, the faster it collapses. Even a bacterium collapses extremely fast, and here we are talking about the whole Universe. But Pedro asked me: “What does the Universe interact with?” and I answered: nothing. There is nothing but the Universe, so there is nothing for it to interact with. That, in principle, allows us to think of it as a quantum object.”

A human being and the facsimiles

However, the impetus for writing a scientific article about the quantum nature of the Universe was not so much Pedro Gonzalez Diaz’s idea as a 2014 publication by Hall, Deckert, and Wiseman, who described these quantum miracles in the language of classical mechanics by adding certain “quantum forces” to it.
That is, each possible location of the object is described as a separate “world,” and these “worlds” are taken to act on each other with real “forces.” It must be said that the idea of “many worlds” has existed for a long time and belongs to Hugh Everett, and the idea of describing quantum effects by introducing additional forces has also existed for a long time and belongs to David Bohm; Hall, Deckert, and Wiseman, however, were able to combine these ideas into a meaningful mathematical model.
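The flavor of this “many interacting worlds” picture can be conveyed with a toy sketch. The Python snippet below is emphatically not the actual Hall-Deckert-Wiseman force law: the pairwise repulsion (EPS divided by the separation) is an assumed stand-in for the quantum force, and the numbers N, EPS, and DT are arbitrary choices. Each “world” is one classical position of the same particle in a harmonic trap; the inter-world repulsion keeps the ensemble of worlds spread out instead of letting them all collapse onto the single classical rest point.

```python
# Toy illustration of "many interacting worlds" (MIW). NOT the actual
# Hall-Deckert-Wiseman force law: the repulsion EPS / separation is an
# assumed stand-in for the quantum force between worlds.
N = 11      # number of worlds (arbitrary choice)
EPS = 0.1   # strength of the inter-world repulsion (assumed)
DT = 0.01   # step size for overdamped (gradient-flow) relaxation

# start the worlds evenly staggered around the trap centre
xs = [0.1 * (n - N // 2) for n in range(N)]

for _ in range(20000):
    forces = []
    for n in range(N):
        f = -xs[n]  # classical harmonic restoring force toward x = 0
        for m in range(N):
            if m != n:
                f += EPS / (xs[n] - xs[m])  # inter-world repulsion
        forces.append(f)
    xs = [x + DT * f for x, f in zip(xs, forces)]

xs.sort()
spread = xs[-1] - xs[0]
print(f"equilibrium spread of the {N} worlds: {spread:.2f}")
```

At equilibrium the worlds settle into a symmetric, spread-out configuration around the classical rest point. In the genuine HDW model the force is instead derived from the quantum potential, so that the density of worlds reproduces the quantum probability distribution.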
“And when Valerian and I saw this work,” says Artyom Yurov, “it seemed to us that its mathematical formalism allows us to look very differently at what Pedro had said back then. The essence of our work is that we took the equation cosmologists use to describe the Friedmann-Einstein Universe, added “quantum forces” according to the Hall-Deckert-Wiseman scheme, and investigated the resulting solutions. We managed to obtain some striking results; in particular, it is possible that some puzzles of cosmology can be illuminated from this unexpected angle. But the most important thing is that such a model is testable.”
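For context, the classical baseline the authors modify can be shown in a few lines. The sketch below integrates only the standard Friedmann equation for a flat, matter-dominated universe (in units where the Hubble constant and matter density parameter are 1), whose analytic solution is a(t) proportional to t to the power 2/3; the paper’s inter-world “quantum force” corrections are not reproduced here.

```python
# Classical flat, matter-dominated Friedmann equation in units H0 = 1:
#   H**2 = (da/dt / a)**2 = a**-3   =>   da/dt = a**-0.5
# Analytic solution: a(t) = (3t/2)**(2/3). The Yurov paper adds
# "quantum force" terms to this equation (omitted in this sketch).
def evolve(a0, t_end, dt=1e-4):
    """Forward-Euler integration of da/dt = a**-0.5, starting on the
    analytic track at t0 = (2/3) * a0**1.5 and running up to t_end."""
    a = a0
    t = (2.0 / 3.0) * a0 ** 1.5
    while t < t_end:
        a += dt * a ** -0.5
        t += dt
    return a

print(round(evolve(0.01, 1.0), 2))  # analytic: (3/2)**(2/3) ≈ 1.31
```

The numerical scale factor tracks the analytic t^(2/3) expansion law, which is the behavior any quantum correction would have to perturb in a testable way.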
It is too early to say where such a formulation of the question may lead. The theory must still be confirmed by experiments (that is, by observations). But it is already clear that the scientists have come close to something that could fundamentally change our understanding of the universe.
Reference: “The day the universes interacted: quantum cosmology without a wave function” by A. V. Yurov and V. A. Yurov, 18 September 2019, The European Physical Journal C.
DOI: 10.1140/epjc/s10052-019-7261-y