Researchers have hypothesized that the universe contains "dark matter," and have also posited the existence of "dark energy." These two hypotheses account for the movement of stars in galaxies and for the accelerating expansion of the universe. But according to a researcher at the University of Geneva (UNIGE), these concepts may no longer be valid, because the phenomena they were invented to explain can be demonstrated without them. This research exploits a new theoretical model based on the scale invariance of empty space.
In 1933, the Swiss astronomer Fritz Zwicky claimed there is substantially more matter in the universe than we can actually see. Astronomers called this unknown matter "dark matter," a concept that was to take on yet more importance in the 1970s, when the U.S. astronomer Vera Rubin invoked this enigmatic matter to explain the movements and speed of the stars.
Scientists have subsequently devoted considerable resources to identifying dark matter in space, on the ground and at CERN, but without success. In 1998, a team of Australian and U.S. astrophysicists discovered the acceleration of the expansion of the universe, earning the Nobel Prize for physics in 2011.
However, in spite of these enormous scientific resources, no theory or observation has been able to characterize this energy, which allegedly overcomes Newtonian gravitational attraction. In short, dark matter and dark energy are mysteries that have stumped astronomers for decades.
The way physicists represent the universe and its history is described by Einstein's equations of general relativity, Newton's universal gravitation and quantum mechanics. The current consensus is that of a Big Bang followed by expansion.
"In this model, there is a starting hypothesis that hasn't been taken into account, in my opinion," says André Maeder, honorary professor in the Department of Astronomy in UNIGE's Faculty of Science. "By that, I mean the scale invariance of empty space; in other words, empty space and its properties do not change following a dilatation or contraction."
Empty space plays a primordial role in Einstein's equations as it operates in a quantity known as a "cosmological constant," and the resulting model depends on it. Based on this hypothesis, Maeder is now re-examining the Standard Model of the universe, pointing out that the scale invariance of empty space is also present in the fundamental theory of electromagnetism.
When Maeder carried out cosmological tests on his new model, he found that it matched observations. He also found that the model predicts the accelerated expansion of the universe without having to factor in dark energy. In short, dark energy may not actually exist, since the acceleration of the expansion is already contained in the equations of physics.
In a second stage, Maeder focused on Newton's law, a specific instance of the equations of general relativity. The law is also slightly modified when the model incorporates Maeder's new hypothesis. Indeed, it contains a very small outward acceleration term, which is particularly significant at low densities. This amended law, when applied to clusters of galaxies, leads to masses of clusters in line with that of visible matter (contrary to what Zwicky argued in 1933).
This means that no dark matter is needed to explain the high speeds of the galaxies in the clusters. A second test demonstrated that this law also predicts the high speeds reached by the stars in the outer regions of galaxies (as Rubin had observed), without having to resort to dark matter to describe them.
Finally, a third test looked at the dispersion of the speeds of the stars oscillating around the plane of the Milky Way. This dispersion, which increases with the age of the relevant stars, can be explained very well using the invariant-empty-space hypothesis, whereas there had previously been no agreement on the origin of this effect.
Maeder's discovery paves the way for a new conception of astronomy that will raise questions and generate controversy. "The announcement of this model, which at last solves two of astronomy's greatest mysteries, remains true to the spirit of science: nothing can ever be taken for granted, not in terms of experience, observation or the reasoning of human beings," concluded André Maeder.
Researchers have uncovered an entirely new approach to cosmology that sheds light on the future of particle physics by showing how the largest possible structure, the curvature of the universe as a whole, can be used as a lens onto the smallest objects observable today: elementary particles.
Niayesh Afshordi and postdoctoral fellow Elliot Nelson of Canada's Perimeter Institute began with the knowledge that space is flat. While there are local wrinkles, they are wrinkles in a flat space, not wrinkles in curved space. The universe as a whole is within one percent of flat. The problem is that it shouldn’t be. The vacuum of space is not empty; it is filled with fields that may be weak but cannot be zero – nothing quantum can ever be zero, because quantum things wiggle.
According to general relativity, such fluctuations should cause spacetime to curve. In fact, a straightforward calculation of how much the vacuum should curve predicts a universe so tightly wound that the moon would not fit inside it.
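The scale of this mismatch can be sketched with a standard back-of-envelope estimate (a textbook illustration, not the authors' own calculation): summing zero-point fluctuations up to a cutoff energy gives a vacuum energy density, which general relativity converts into a curvature radius.

```latex
% Naive vacuum energy density from zero-point fluctuations up to a cutoff E_cut:
\rho_{\text{vac}} \sim \frac{E_{\text{cut}}^{4}}{\hbar^{3} c^{5}}
% Corresponding cosmological constant and de Sitter curvature radius:
\Lambda = \frac{8\pi G}{c^{2}}\,\rho_{\text{vac}}, \qquad
\ell \sim \sqrt{\frac{3}{\Lambda}}
% With E_cut at the Planck scale, rho_vac exceeds the observed value by
% roughly 120 orders of magnitude, and the curvature radius ell shrinks
% far below astronomical scales.
```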
Cosmologists have typically worked around this problem – that the universe should be curved, but looks flat – by assuming there is some antigravity that exactly offsets the tendency of the vacuum to curve. This set of off-base predictions and unlikely corrections is known as the cosmological constant problem, and it has been dogging cosmology for more than half a century.
[Image: an illustration of the entire observable universe by Pablo Carlos Budassi, based on logarithmic maps created at Princeton University and on NASA imagery. It shows the bodies of our solar system with the Sun at the center, the Kuiper Belt, the Oort Cloud, the Perseus arm of the Milky Way, and the Andromeda Galaxy; the outer rim is composed of the Cosmic Microwave Background and a ring of plasma from the Big Bang.]
In their paper, Nelson and Afshordi make no attempt to solve it, but where other cosmologists invoked an offsetting constant and moved on, Nelson and Afshordi went on to ask one more question: Does adding such a constant to cancel the vacuum’s energy guarantee a flat spacetime? Their answer: not quite.
The vacuum is still filled with quantum fields, and it is the nature of quantum fields to fluctuate. Even if they are perfectly offset such that their average value is zero, they will still fluctuate around that zero point. Those fluctuations should (again) cause space to curve – just not as much.
In this scenario, the amount of curvature created by the known fields – the electromagnetic field, for example, or the Higgs field – is too small to be measured, and is therefore allowed. But any unknown field would have to be weak enough that its fluctuations would not cause an observable curvature in the universe. This sets a maximum energy for unknown fields.
A theoretical maximum on a theoretical field may not sound groundbreaking – but the work opens a new window in an unexpected place: particle physics.
A particle, quantum mechanics teaches us, is just an excitation of a field. A photon is an excitation of the electromagnetic field, for example, and the newly discovered Higgs boson is an excitation of the Higgs field. It’s roughly similar to the way a wave is an excitation of the ocean. And just as the height of a breaking wave can tell us something about the depth of the water, the mass of a particle depends on the strength of its corresponding field.
New kinds of quantum fields are often associated with proposals to extend the Standard Model of particle physics. If Afshordi and Nelson are right, and there can be no such fields whose fluctuations have enough energy to noticeably curve space, then there can be no unknown particles with a mass of more than 35 TeV. The authors predict that if there are new fields and particles associated with an extension to the Standard Model, they will lie below that threshold.
For generations, particle physics has made progress from the bottom up: building more and more powerful colliders to create – then spot and study – heavier and heavier particles. It is as if we started from the ground floor and built up, discovering more particles at higher altitudes as we went. What Nelson and Afshordi have done is lower the sky.
There is a great deal of debate in particle physics about whether we should build increasingly powerful accelerators to search for heavier unknown particles. Right now, the most powerful accelerator in the world, the Large Hadron Collider, runs at a top energy of about 14 TeV; a proposed new super accelerator in China would run at about 100 TeV. As this debate unfolds, this new work could be particularly useful in helping experimentalists decide which energy levels – which skyscraper heights – are the most interesting.
The sky does indeed have a limit, this research suggests – and we are about to hit it.
The likelihood of life-threatening comet impacts could increase when the Sun passes through a possible dark matter disk in the Galaxy. Our Solar System orbits the Milky Way’s center, completing a revolution every 250 million years or so. Along this path, it oscillates up and down, crossing the galactic plane about every 32 million years. If a dark matter disk were concentrated along the galactic plane, it might tidally disrupt the motion of comets in the Oort cloud at the outer edge of our Solar System. This could explain possible periodic fluctuations in the rate of impacts on Earth.
Scientists have uncovered possible evidence of this galactic bumpiness in an apparent periodic fluctuation in the rate of large crater-forming impacts—the kind that likely killed off the dinosaurs. The frequency of impact fluctuations closely matches the rate at which the Sun passes through the plane of the galactic disk. However, it hasn’t been clear what element in the disk could be influencing comet trajectories. Two theoretical physicists have put forward a hypothesis that inserts dark matter as the missing piece between Solar System motion and possibly life-threatening comet impacts. In a paper published in Physical Review Letters, Lisa Randall and Matthew Reece from Harvard University suggest that some of the mysterious invisible matter, which makes up 85% of all matter in the Universe, could exist in a thin disk that disturbs the path of certain comets so that they are more likely to collide with our planet.
Comet impact events appear to have played a significant role in shaping Earth’s history, creating craters and possibly causing mass extinctions. Many of these comets come from the Oort cloud, a spherical envelope of icy bodies in the outer edge of the Solar System extending from just outside the orbit of Neptune to halfway to the next nearest star. Because the Oort cloud is so distant from the Sun, it is highly susceptible to perturbations from gravitational forces coming from other bodies. Indeed, there have been some indications that the frequency of impacts (from both comets and asteroids) on Earth oscillates on a timescale of about 25 to 35 million years, which suggests a connection between the dynamics at the outer edge of the Solar System and the comet shower strikes on Earth.
Two hypotheses have been proposed to explain the possible periodicity in comet impacts. One idea involves the gravitational pull of an as-yet-undiscovered distant companion star (called Nemesis) or planet (called planet X) that periodically disturbs comets in the Oort cloud and causes a large increase in the number of comets visiting the inner Solar System, and thus in the frequency of impact events on Earth. Neither Nemesis nor planet X was detected with NASA’s Wide-field Infrared Survey Explorer (WISE) space telescope, effectively ruling out the idea that an object in our Sun’s neighborhood could explain the impact fluctuations.
An alternative hypothesis involves the gravitational influence of the dense galactic disk on the Solar System. Our Sun orbits the Galactic center, taking approximately 250 million years to make a complete revolution. However, this trajectory is not a perfect circle. The Solar System weaves up and down, crossing the plane of the Milky Way approximately every 32 million years, which coincides with the presumed periodicity of the impact variations. This bobbing motion, which extends about 250 light years above and below the plane, is governed by the concentration of gas and stars in the disk of our Galaxy. This ordinary “baryonic” matter is concentrated within about 1000 light years of the plane. Because the density drops off in the vertical direction, there is a gravitational gradient, or tide, that may perturb the orbits of comets in the Oort cloud, causing some comets to fly into the inner Solar System and periodically raising the chances of collision with Earth. The problem with this idea, however, is that the estimated galactic tide is too weak to perturb the Oort cloud strongly enough.
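To see why the plane-crossing timescale lands in the tens of millions of years, one can treat the Sun's vertical motion as a harmonic oscillation in a uniform slab of disk matter. The sketch below is illustrative only; the local density value is an assumed round number, not a figure from the paper.

```python
import math

# Harmonic approximation for the Sun's vertical oscillation through the
# Galactic disk: for a uniform slab of mass density rho, the vertical
# oscillation frequency is nu = sqrt(4 * pi * G * rho).
G = 4.30091e-3               # gravitational constant, pc * (km/s)^2 / M_sun
rho = 0.15                   # assumed local disk density, M_sun / pc^3
PC_PER_KMS_TO_MYR = 0.9778   # 1 pc/(km/s) expressed in Myr

nu = math.sqrt(4 * math.pi * G * rho)                 # frequency, (km/s)/pc
period_myr = (2 * math.pi / nu) * PC_PER_KMS_TO_MYR   # full up-down cycle
crossing_myr = period_myr / 2                          # two crossings per cycle

print(f"full oscillation period: {period_myr:.0f} Myr")
print(f"plane-crossing interval: {crossing_myr:.0f} Myr")
```

With this assumed density the crossing interval comes out in the low tens of millions of years, broadly consistent with the ~32-million-year figure quoted above; the exact number depends sensitively on the adopted local density.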
In their new study, Randall and Reece focus on this second hypothesis and suggest that the galactic tide could be made stronger with a thin disk of dark matter. Dark disks are a possible outcome of dark matter physics, as the authors and their colleagues recently showed. Here, the researchers consider a specific model, in which our Galaxy hosts a dark disk with a thickness of 30 light years and a surface density of around 1 solar mass per square light year (the surface density of ordinary baryonic matter is roughly 5 times that, but it’s less concentrated near the plane). Although one has to stretch the observational constraints to make room, their thin disk of dark matter is consistent with astronomical data on our Galaxy. Focusing their analysis on large (>20km) craters created in the last 250 million years, Randall and Reece argue that their dark disk scenario can produce the observed pattern in crater frequency with a fair amount of statistical uncertainty.
Randall and Reece’s dark disk model is not made of an ordinary type of dark matter. The most likely candidate for dark matter—weakly interacting massive particles (WIMPs)—is expected to form a spherical halo around the Milky Way, rather than being concentrated in the disk. This WIMP dark matter scenario has been remarkably successful in explaining the large-scale distribution of matter in the Universe. But there is a long-standing problem on small scales: the theory generally predicts overly dense centers in galaxies and clusters of galaxies, and it predicts a larger number of dwarf satellite galaxies around the Milky Way than are observed. While some of these problems could be resolved by better understanding the physics of baryonic matter (as it relates, for example, to star formation and gas dynamics), it remains unclear whether a baryonic solution can work in the smallest-mass galaxies (with very few stars and very little gas) where the discrepancies are observed.
Alternatively, this small-scale conflict could be evidence of more complex physics in the dark matter sector itself. One solution is to invoke strong electromagnetic-like interactions among dark matter particles, which could lead to the emission of “dark photons.” These self-interactions can redistribute momentum through elastic scattering, thereby altering the predicted distribution of dark matter in the innermost regions of galaxies and clusters of galaxies, as well as the number of dwarf galaxies in the Milky Way. Although self-interacting dark matter could resolve the tension between theory and observations at small scales, large-scale measurements of galaxies and clusters only allow a small fraction (less than 5%) of the dark matter to be self-interacting. Recently, Randall, Reece, and their collaborators showed that if a portion of the dark matter is self-interacting, these particles will collapse into a dark galactic disk that overlaps with the ordinary baryonic disk.
Did a thin disk of dark matter trigger extinction events like the one that snuffed out the dinosaurs? The evidence is still far from compelling. First, the periodicity in Earth’s cratering rate is not clearly established, because a patchy crater record makes it difficult to see a firm pattern. It is also unclear what role comets may have played in the mass extinctions. The prevailing view is that the Chicxulub crater, which has been linked to the dinosaur extinction 66 million years ago, was created by a giant asteroid, instead of a comet. Randall and Reece were careful in acknowledging at the outset that “statistical evidence is not overwhelming” and listing various limitations for using a patchy crater record. But the geological data is unlikely to improve in the near future, unfortunately.
On the other hand, advances in astronomical data are expected from the European Space Agency’s Gaia space mission, which was launched last year and is currently studying the Milky Way in unprecedented detail. Gaia will observe millions of stars and measure their precise distances and velocities. These measurements should enable astronomers to map out the surface density of the galactic disk as a function of height. Close to the plane, astronomers could then directly see whether there is a “disk within the disk” with much more mass than ordinary baryonic matter can account for. Evidence of such a dark disk would allow better predictive modeling of its effects on comets and on the life of our planet.
Source: Lisa Randall and Matthew Reece, "Dark Matter as a Trigger for Periodic Comet Impacts," Phys. Rev. Lett. 112, 161301 (2014). Published April 21, 2014.
Bitcoin just crashed 50% today, on news that the Chinese government has banned local exchanges from accepting deposits in Yuan. BtC was trading over $1000 yesterday; now it's down to $500 and still falling.
I want Bitcoin to die in a fire: this is a start, but it's not sufficient. Let me give you a round-up below the cut.
Like all currency systems, Bitcoin comes with an implicit political agenda attached. Decisions we take about how to manage money, taxation, and the economy have consequences: by its consequences you may judge a finance system. Our current global system is pretty crap, but I submit that Bitcoin is worse.
For starters, BtC is inherently deflationary. There is an upper limit on the number of bitcoins that can ever be created ('mined', in the jargon: new bitcoins are created by carrying out mathematical operations which become progressively harder as the bitcoin space is explored—like calculating ever-larger prime numbers, they get further apart). This means the cost of generating new Bitcoins rises over time, so that the value of Bitcoins rises relative to the available goods and services in the market. Less money chasing stuff; less cash for everybody to spend (as the supply of stuff out-grows the supply of money). Hint: deflation and inflation are two very different things; in particular, deflation is not simply the opposite of inflation (although you can't have both deflation and inflation simultaneously—you get one disease or the other).
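The hard cap can be illustrated with Bitcoin's published issuance schedule: the block subsidy starts at 50 BTC and halves every 210,000 blocks (roughly every four years), so the total that can ever be mined is a finite geometric sum. A minimal sketch:

```python
# Bitcoin issuance: block subsidy starts at 50 BTC and halves every
# 210,000 blocks, with amounts tracked in indivisible satoshis, so the
# total supply converges to a hard cap rather than growing with the economy.
SATOSHI = 100_000_000        # smallest unit: 1 BTC = 1e8 satoshis
HALVING_INTERVAL = 210_000   # blocks between subsidy halvings

subsidy = 50 * SATOSHI
total = 0
while subsidy > 0:
    total += HALVING_INTERVAL * subsidy
    subsidy //= 2            # integer halving, as in the protocol

print(f"maximum supply: {total / SATOSHI:,.2f} BTC")  # just under 21 million
```

The fixed cap is the source of the deflationary pressure described above: once demand for the currency grows faster than this (eventually zero) issuance, prices denominated in it must fall.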
Bitcoin is designed to be verifiable (forgery-resistant) but pretty much untraceable, and very easy to hide. Easier than a bunch of gold coins, anyway. And easier to ship to the opposite side of the planet at the push of a button.
Libertarians love it because it pushes the same buttons as their gold fetish and it doesn't look like a "Fiat currency". You can visualize it as some kind of scarce precious data resource, sort of a digital equivalent of gold. Nation-states don't control the supply of it, so it promises to bypass central banks.
But there are a number of huge down-sides. Here's a link-farm to the high points:
To editorialize briefly, Bitcoin looks like it was designed as a weapon intended to damage central banking and money-issuing banks, with a Libertarian political agenda in mind—to damage states' ability to collect tax and monitor their citizens' financial transactions. Which is fine if you're a Libertarian, but I tend to take the stance that Libertarianism is like Leninism: a fascinating, internally consistent political theory with some good underlying points that, regrettably, makes prescriptions about how to run human society that can only work if we replace real messy human beings with frictionless spherical humanoids of uniform density (because it relies on simplifying assumptions about human behaviour which are unfortunately wrong).
TL;DR: the current banking industry and late-period capitalism may suck, but replacing it with Bitcoin would be like swapping out a hangnail for gas gangrene.