Forty years ago, proof of the existence of dark matter was thought to be just around the corner. Billions of dollars have since been spent on a wide variety of experiments, led by the world's most prestigious institutes, in an effort to find that proof. Yet, as of today, this grand search remains empty-handed, and its initial confidence has given way to a growing sense of desperation as the likelihood of these particles' existence slips further and further away with each passing year and each failed experiment.
In 1932, a Dutch astronomer by the name of Jan Hendrik Oort noticed an issue with his measurements of local celestial bodies. In essence, he'd observed stars moving faster than our accepted understanding of physics says is possible. These stars were violating the Newtonian relation that ties a body's orbital velocity to the gravitational mass it orbits.
This was a headache for Dr. Oort, beyond any doubt, for a man in his position could not lightly host criticisms of our beloved physical laws, no matter how sure his measurements were. Unbeknownst to him, however, those tedious discrepancies he'd suffered over would not be his burden to bear alone. The problem he'd first touched on has since grown into one of the greatest dilemmas in modern physics.
Expanding on the Issue
The implications of this grand problem might best be appreciated by examining the form of a classic spiral galaxy. Like giant Catherine wheels, the spiral galaxies of our universe spin about a dense central bulge, where most of a galaxy's visible mass is concentrated and where, most physicists now believe, one or more supermassive black holes reside. (The black hole itself, for all its drama, accounts for only a tiny fraction of a galaxy's total mass; it is the combined mass of the system that binds these enormous structures together.) Our classical understanding says that stars outside the central bulge, spaced ever farther out into the spiral arms, should be moving ever slower: orbiting with less and less velocity the greater their distance from the gravitational mass concentrated at the center.
Alas, this is far from the case. As Dr. Oort first observed, and as so many have observed since, no matter how far a star sits from the galactic center, its orbital velocity remains roughly constant, a value shared with the other stars outside its galaxy's central bulge. In other cases, such as galaxies with a fairly uniform distribution of luminous matter, orbital velocity even appears to increase the farther the stars are from their galactic center.
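The mismatch between prediction and observation can be sketched in a few lines of Python. This is a toy model with illustrative round numbers (the 1e11-solar-mass "visible" galaxy and the 220 km/s flat rotation speed are stand-in figures, not fitted data):

```python
import math

G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30       # solar mass, kg
KPC = 3.086e19         # one kiloparsec in metres

def keplerian_velocity(enclosed_mass_kg, radius_m):
    """Newtonian circular-orbit speed: v = sqrt(G * M / r)."""
    return math.sqrt(G * enclosed_mass_kg / radius_m)

# Toy model: treat 1e11 solar masses as if concentrated near the centre,
# a stand-in for a galaxy's visible bulge (illustrative number).
M_visible = 1e11 * M_SUN

for r_kpc in (2, 5, 10, 20, 40):
    v = keplerian_velocity(M_visible, r_kpc * KPC)
    print(f"r = {r_kpc:>2} kpc  ->  predicted v = {v / 1000:6.1f} km/s")

# Newtonian prediction: v falls off as 1/sqrt(r).
# Observation: v stays roughly flat in many spirals, which, rearranging
# to M(r) = v^2 * r / G, implies enclosed mass growing with radius.
v_obs = 220e3          # m/s, a typical flat rotation speed (assumed)
r = 40 * KPC
M_implied = v_obs**2 * r / G
print(f"implied mass within 40 kpc: {M_implied / M_SUN:.2e} solar masses")
```

The point of the sketch: doubling the radius should cut the orbital speed by a factor of sqrt(2), yet measured curves stay flat, so the implied enclosed mass keeps growing well beyond where the starlight runs out.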
Now, the magnitude of this problem cannot be overstated, for its very existence undermines a substantial portion of the physical laws that contemporary science has come to depend on. So, with that in mind, let's clarify at least one of the key problems that arises from the observations above:
In order for these galaxies to maintain their great spiral shapes, the force of gravity emanating from their centers must engage every cosmic body bound to their form essentially without delay, for those smaller, constituent bodies are moving rapidly and interacting gravitationally with a multitude of other bodies in the same manner as the galaxies themselves. In other words, for a star near the tip of a spiral arm to hold its place in one of these enormous formations, the gravity from the supermassive black hole (or holes) at the galaxy's center would not only have to be far greater than what is typically observed, it would also have to reach out and act on that star constantly, with almost no lag-time whatsoever. Otherwise, centrifugal force would hurl it, and any other such stars, out of these galactic formations.
Yet, as far as we can tell, stars are not being hurled out of their respective galaxies, and those galaxies, for the most part, are doing a remarkable job of holding their shapes. So what is going on? It's no secret that gravity is an odd force to contend with: just ask anyone who has tried to measure its fluctuating "constant," or recall (or google) a physics lesson on Newtonian gravity. When calculating the movements and orbits of cosmic bodies under Newton's laws, gravity is treated as acting instantaneously, at effectively infinite speed; otherwise, the equations simply do not work.
“THAT ONE BODY SHOULD ACT UPON ANOTHER THROUGH A VACUUM WITHOUT THE MEDIATION OF ANYTHING ELSE IS SO GREAT AN ABSURDITY THAT NO MAN SUITED TO DO SCIENCE…CAN EVER FALL INTO IT”
— ISAAC NEWTON, IN A LETTER TO RICHARD BENTLEY
If you weren’t previously aware of these—or any of the other—glaring holes that exist in our world’s fundamental understanding of space and time, you might very well be asking yourself why this isn’t a bigger deal or maybe how a problem this big still exists today. The thing is, this is still a very big deal, and this problem does still exist today, but… it also doesn’t.
Eyes Over Here
During the 1960s and 1970s, astronomers Vera Rubin and Kent Ford put forth the strongest evidence yet detailing the array of problems found in the immense gulf between the cosmos we predict and the cosmos we observe. But, at the same time, their work pointed toward the perfect scapegoat for what had been growing into a nightmarish problem for astronomers: a never-before-seen, never-detected material known as dark matter. (Dark energy, a proposed force invoked decades later to explain the universe's accelerating expansion, would eventually join it.) The ambiguity and undetectable nature of these supposed entities allowed them to act as a sort of multi-purpose duct tape for patching or supporting a number of baffling aspects of our observed cosmos.
To be fair, a Swiss astrophysicist by the name of Fritz Zwicky was the first to coin the term dark matter (dunkle Materie) back in 1933, when he ran into the same "missing mass problem" that Dr. Oort had arrived at only a year earlier. At the time, Dr. Zwicky posited that there must be some four hundred times as much invisible mass in our universe as visible, owing to the Coma galactic cluster holding together despite its members' incredibly fast orbits.

Still, it wasn't until Vera Rubin and Kent Ford that the idea of dark matter truly began to pick up speed. Their results indicated, beyond any doubt, that there were missing forces at work in our universe, and their explanation for those forces was just ambiguous enough to act as a gaudy patch for the frightening hole they'd just helped puncture. By the 1980s, many astrophysicists had accepted dark matter, and, not long after, dark energy, as part of our reality. As of today, based on observations of gravitational anomalies like the galactic rotation problem and gravitational lensing, it's believed that less than five percent of our universe is matter in the traditional sense: roughly 5 percent ordinary matter, 27 percent dark matter, and 68 percent dark energy, by current estimates.
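Zwicky's style of argument can be sketched with the virial theorem: for a gravitationally bound cluster, the dynamical mass is of order M ~ sigma^2 * R / G. The numbers below are illustrative round figures for a Coma-like cluster, not Zwicky's original values:

```python
import math

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30     # solar mass, kg
MPC = 3.086e22       # one megaparsec in metres

def virial_mass(sigma_m_s, radius_m):
    """Order-of-magnitude virial mass estimate: M ~ sigma^2 * R / G."""
    return sigma_m_s**2 * radius_m / G

# Illustrative round numbers for a Coma-like cluster (assumptions, not data):
sigma = 1000e3       # ~1000 km/s velocity dispersion of member galaxies
R = 1 * MPC          # ~1 Mpc characteristic radius

M_dyn = virial_mass(sigma, R)
print(f"dynamical mass ~ {M_dyn / M_SUN:.1e} solar masses")

# Spread over ~1000 member galaxies, that is far more mass per galaxy
# than the starlight of a 1930s-era luminosity estimate would suggest.
M_per_galaxy = M_dyn / 1000
print(f"dynamical mass per galaxy ~ {M_per_galaxy / M_SUN:.1e} solar masses")
```

The fast orbits (large sigma) force the dynamical mass up; the cluster's visible light accounts for only a small fraction of it, which is the "missing mass" in a nutshell.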
We've now reached the point in this article where someone might be nodding their head, feeling foolish for ever doubting that those far superior minds we call scientists had sorted out the problems discussed above. Yeah, those were pretty big problems, but of course there's an explanation: the dark stuff does it. Eh, not quite.
Back in the '80s, during dark matter's heyday, those trumpeting the mysterious concept felt sure that humanity was on the brink of its first direct observation of dark matter. Yet, as the years passed, more and more of the experiments launched for the sole purpose of detection turned up nothing. To this day, not a single trace of dark matter has ever been detected. Just this week, the prestigious LUX experiment, based at the Sanford Underground Research Facility in South Dakota, announced that its search had found no trace of dark matter.
I do know, sadly enough, that there are those out there who aren't too concerned with what I'd call the troublesome stretch of time between the start of the hunt for dark matter and the disappointing place it's at now. For those people, I've decided to include an additional piece of information that shows just how shaky our understanding of reality really is. It's something that has become known as "the worst theoretical prediction in the history of physics."
What is it?
Well, let's briefly remind ourselves that dark energy is, among other things, invoked to explain why the universe's expansion is accelerating as absurdly as it is, with dark matter covering the other discrepancies mentioned above. Looking at quantum field theory, the most widely accepted and utilized framework for understanding particles, we find a prediction for something known as zero-point energy, also understood as vacuum energy density: in other words, the energy of empty space. Quantum field theory predicts what this energy should be; in fact, it relies on that prediction to help make other predictions and models.
The problem? The reason behind “the worst theoretical prediction in the history of physics?”
Simple: we've actually measured the quantity that quantum field theory predicts, and the prediction and our measurements disagree by a nauseating degree, on the order of 120 orders of magnitude. It's absurd how great the difference is. An inquisitive soul dredged up this metaphor to put the situation somewhat into perspective: "If we said that our universe only consists of one particle, that crazed statement is still at least ten times more accurate than the quantum field theory prediction."
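The size of the mismatch can be reproduced with back-of-the-envelope arithmetic. A naive quantum-field-theory estimate puts roughly one Planck energy in every Planck volume of vacuum; the measured dark-energy density is around 6e-10 J/m^3. Both figures below are standard order-of-magnitude values, not precision data:

```python
import math

# Physical constants (SI)
hbar = 1.055e-34     # reduced Planck constant, J s
c = 2.998e8          # speed of light, m/s
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2

# Planck-scale quantities
E_planck = math.sqrt(hbar * c**5 / G)   # Planck energy, ~2e9 J
l_planck = math.sqrt(hbar * G / c**3)   # Planck length, ~1.6e-35 m

# Naive QFT vacuum energy density: one Planck energy per Planck volume
rho_qft = E_planck / l_planck**3        # J/m^3

# Observed dark-energy density (rough measured value)
rho_obs = 6e-10                         # J/m^3

print(f"naive QFT estimate : {rho_qft:.1e} J/m^3")
print(f"observed value     : {rho_obs:.1e} J/m^3")
# The exact exponent depends on where the cutoff is placed; the
# discrepancy is commonly quoted as "about 120 orders of magnitude."
print(f"discrepancy        : ~10^{math.log10(rho_qft / rho_obs):.0f}")
```

With a Planck-scale cutoff the ratio lands near 10^123; softer cutoffs give smaller, but still staggering, exponents, which is why the figure is usually hedged as "about 120 orders of magnitude."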
Our understanding of reality is far-less settled than most would believe—just something to keep in mind.
Hunting for Hope
Earlier this year, a new model was put forth to help explain the very discrepancies mentioned in this article. A team from Lawrence Livermore National Laboratory, leading a national group of particle physicists, recently authored a paper, "Direct Detection of Stealth Dark Matter through Electromagnetic Polarizability," in which they make a case for dark matter as a composite particle built from electrically charged constituents that bind into an electrically neutral whole. As a note, this advanced dark matter model might previously have been out of reach for the team had it not been for Livermore's state-of-the-art parallel 2-petaflop Vulcan supercomputer.
In any case, the model the team arrived at is fascinating for a number of reasons. First and foremost, it incorporates electromagnetism into the physics of space. In previous theories, the weak force of gravity acts as the prime mover in the large-scale dances of celestial objects, and when dark matter and/or dark energy are factored in, they exist in those models as yet-undiscovered states of matter interacting with gravity in a still-undetermined way. Without the incorporation of electricity and electromagnetism, as well as the related properties of resonance and frequency, these previous theories wouldn't even appear to have a platform to stand on.
Recent theories regarding plasma cosmology have found the success they have simply because their calculations factor in the fourth and most abundant state of matter: plasma, which wasn't even named until 1928, twenty-three years after Einstein published his special theory of relativity. As mentioned in the Livermore team's paper, extremely high-temperature plasma conditions appear to have pervaded the early universe, and once we understand that plasma can act as a remarkable, at times nearly perfect, conductor, we're given a picture of a dense and electrically friendly universe where charges and interactions can jolt down vast streams of waiting particle fields at nearly, or possibly even at, the speed of light (which, once achieved, exists outside of space and time, at least according to Einstein).
It is important to note that we at MTH don't necessarily agree with this new model, though it does look to be an encouraging bit of progress. If you're familiar with the ramblings we do here, then you may already know that a holographic principle is key to our understanding of reality. So, in relation to the core problem we've been discussing here... well, there would no longer be a problem. Those troublesome, Einstein-violating distances don't exist in a holographic world, and neither do the problems with quantum entanglement. Gravity doesn't have to have an infinite speed, or force the invention of a mysterious form of matter, or even exist, if you can believe that reality is a resonance system manifesting in a holographic void. After all, at the atomic level, we are almost completely empty space.
If you are into alternative theories regarding space and reality in general, browse our framework or confined sonority pages. But, for now, that’s enough digression.
By this point, ideally, it should be clear that gravity cannot stand alone when it comes to holding space together. The real debate this article is meant to spark concerns that invisible, almighty glue we've been made to trust: the still entirely theoretical materials said to make up over 95% of our reality, the stuff that was supposed to have been proven but never was. Hopefully, at the very least, this article will make you second-guess a very considerable aspect of our world that, still today, remains a mystery. That is the point of this piece, and, in general, it is also one of the grander points of this website: to draw attention to the hidden or unspoken mysteries and hopefully illuminate them in the process.
ALL IMAGES IN THIS ARTICLE CAN BE FOUND IN THE PUBLIC DOMAIN OR ARE BEING USED UNDER A CREATIVE COMMONS LICENSE