SpaceTime Series 25 Episode 76
*The Dark Side of the Universe might not be all that dark after all
The accelerating expansion of the universe due to a mysterious quantity called “dark energy” may not be real, according to ongoing research claiming it might simply be an artefact caused by the physical structure of the cosmos.
*CAPSTONE moon probe finally phones home after mysteriously going silent
NASA’s Capstone mission has re-established contact with mission managers two days after suddenly going silent following its separation from its Rocket Lab Photon Spacecraft.
*NASA’s second launch at the Arnhem Space Centre blasts off
NASA has conducted a second launch from the Arnhem Space Centre, sending a suborbital sounding rocket on a mission to study our nearest neighbouring star system, Alpha Centauri.
*The Science Report
A new, more infectious third COVID Omicron wave is on the way.
A diet rich in Vitamin K could lower your risk of atherosclerosis-related cardiovascular disease.
New study confirms true believers are most likely to be radicalised.
Skeptic's guide to junk science in forensics
Listen to SpaceTime on your favorite podcast app with our universal listen link: https://spacetimewithstuartgary.com/listen
For more SpaceTime and show links: https://linktr.ee/biteszHQ
If you love this podcast, please get someone else to listen to it too. Thank you…
To become a SpaceTime supporter and unlock commercial free editions of the show, gain early access and bonus content, please visit https://bitesz.supercast.com/ . Premium version now available via Spotify and Apple Podcasts.
For more podcasts visit our HQ at https://bitesz.com
Your support is needed...
SpaceTime is an independently produced podcast (we are not funded by any government grants, big organisations or companies), and we’re working towards becoming a completely listener supported show...meaning we can do away with the commercials and sponsors. We figure the time can be much better spent on researching and producing stories for you, rather than having to chase sponsors to help us pay the bills.
That's where you come in....help us reach our first 1,000 subscribers...at that level the show becomes financially viable and bills can be paid without us breaking into a sweat every month. Every little bit helps...even if you could contribute just $1 per month. It all adds up.
By signing up and becoming a supporter at the $5 or more level, you get immediate access to over 240 commercial-free, double, and triple episode editions of SpaceTime plus extended interview bonus content. You also receive all new episodes on a Monday rather than having to wait the week out. Subscribe via Patreon or Supercast (you get a month’s free trial with Supercast to see if it’s really for you or not)....and share in the rewards. Details at Patreon www.patreon.com/spacetimewithstuartgary or Supercast - https://bitesznetwork.supercast.tech/
Details at https://spacetimewithstuartgary.com or www.bitesz.com
#space #science #astronomy #news #podcast #spacetime
SpaceTime S25E76 AI Transcript
Stuart: This is SpaceTime Series 25, episode 76, for broadcast on 11th July 2022. Coming up on SpaceTime... New research suggests the dark side of the universe might not be all that dark after all. NASA's CAPSTONE Moon probe finally phones home after mysteriously going silent for two days. And NASA's second launch at the Arnhem Space Centre blasts off to study Alpha Centauri. All that and more coming up on SpaceTime.
VO Dude: Welcome to SpaceTime with Stuart Gary.
Stuart: New research suggests the accelerating expansion of the universe due to a mysterious quantity known as dark energy may not be real after all, at least not according to ongoing research claiming it might simply be an artefact caused by the physical structure of the cosmos. The findings, reported in the Monthly Notices of the Royal Astronomical Society, claim that a fit of Type Ia supernovae to a model universe with no dark energy appears to be slightly better than the fit using the standard dark energy model. The study's lead author, Professor David Wiltshire from the University of Canterbury in New Zealand, says existing dark energy models are based on a homogeneous universe in which matter is evenly distributed. However, as we all know, the real universe is far more complicated. It comprises galaxies, galaxy clusters and superclusters arranged in a sort of giant cosmic web, with sheets and filaments of stars and galaxies surrounding vast, nearly empty voids. Current models of the universe require dark energy to explain the observed acceleration in the rate at which the universe is expanding. Scientists based that conclusion on measurements of the distances to Type Ia supernovae in distant galaxies, which appear to be further away than they should be if the universe's rate of expansion wasn't accelerating. Type Ia supernovae are powerful explosions bright enough to briefly outshine entire galaxies. They're caused by the thermonuclear destruction of a type of star known as a white dwarf, the stellar corpse of a Sun-like star. All Type Ia supernovae are thought to explode at around the same mass, a figure known in astrophysics as the Chandrasekhar limit, which equates to about 1.44 times the mass of our Sun. Now, because they're all exploding at about the same mass, they all explode with about the same level of luminosity.
And this allows astronomers to use them as standard candles to measure cosmic distances across the universe. It's exactly the same as the way you determine how far away the lights in a row of street lights are along a road by how bright each light appears from where you're standing. In physics, it's known as the inverse square law. On a galactic scale, gravity appears to be stronger than scientists can account for using the normal matter of the universe, the material of the standard model of particle physics, which makes up all the stars, planets, buildings and people. To explain their observations, scientists have invented dark matter, a mysterious substance which seems to only interact gravitationally with normal matter. Scientists know dark matter is real because they can see its effect on galaxies, holding them together as they rotate. Without dark matter, galaxies would literally fly apart. They've calculated there must be at least five times as much dark matter as normal matter in the universe to explain science's observations of how galaxies move. But on the even larger cosmic scales of an expanding universe, gravity appears to be weaker than expected in a universe containing only normal matter and dark matter. And so scientists have invented a new force called dark energy, a sort of antigravitational force causing an acceleration in the expansion of the universe since the Big Bang 13.82 billion years ago. Dark energy isn't noticeable at small scales, but it becomes the dominating force of the universe on the largest cosmic scales, almost four times greater than the gravity of normal and dark matter combined. It was the late, great Professor Albert Einstein who first came up with the idea, to explain the problem he was having when he first applied his famous 1915 equations of general relativity theory to the whole universe. You see, like other scientists of the time, Albert Einstein believed the universe was in a steady, unchanging state.
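The standard candle logic can be made concrete with a little arithmetic. If every Type Ia supernova peaks at roughly the same absolute magnitude (about -19.3 is a commonly quoted textbook value; the observed magnitude below is purely an illustrative number, not data from the study), the distance follows directly from the inverse square law via the distance modulus:

```python
def distance_parsecs(apparent_mag, absolute_mag=-19.3):
    """Distance modulus m - M = 5*log10(d / 10 pc), solved for distance d in parsecs."""
    return 10 ** ((apparent_mag - absolute_mag + 5) / 5)

# An illustrative Type Ia supernova observed at apparent magnitude +16.7:
print(f"{distance_parsecs(16.7) / 1e6:.0f} megaparsecs")  # prints "158 megaparsecs"
```

The fainter the supernova appears relative to its known intrinsic brightness, the further away it must be, which is exactly the comparison the dark energy discovery teams made.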
Yet when applied to cosmology, his equations showed that the universe wanted to expand or contract as matter interacts with the fabric of spacetime. Matter tells spacetime how to curve, and spacetime tells matter how to move. To resolve the problem, Einstein introduced a dark energy force in 1917, which he called the cosmological constant. It was simply a mathematical invention, or fudge factor. And it was designed to solve discrepancies between general relativity theory and the best observational evidence of the day, bringing the universe back into that steady state. Of course, years later, astronomer Edwin Hubble discovered that galaxies appear to be moving away from each other, and the rate at which they were moving was proportional to their distance. The further away a galaxy was, the faster it appeared to be receding. With this revelation, Einstein realised his mistake, describing the cosmological constant as the biggest blunder of his life. However, the idea of a cosmological constant has never really gone away. In fact, it keeps reappearing to explain strange observations. In the mid-1990s, two teams of scientists, one led by Brian Schmidt from the Australian National University and Adam Riess, and the other by Saul Perlmutter, independently measured distances to Type Ia supernovae in the distant universe, finding that they appear to be further away than they should be if the universe's rate of expansion was constant. The observations led to the hypothesis that there must be some kind of dark energy antigravitational force at work, which has caused the expansion of the universe to accelerate over the last 6 billion years. But Wiltshire says these observations are based on an old model of expansion that hasn't really changed since the 1920s.
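Hubble's proportionality between distance and recession velocity is simple enough to sketch in a couple of lines. The Hubble constant's exact value is still debated; 70 km/s per megaparsec is used here purely as a round illustrative figure:

```python
def recession_velocity(distance_mpc, hubble_constant=70.0):
    """Hubble's law v = H0 * d, with H0 in km/s per megaparsec."""
    return hubble_constant * distance_mpc

# A galaxy twice as far away recedes twice as fast:
print(recession_velocity(100))  # 7000.0 (km/s)
print(recession_velocity(200))  # 14000.0 (km/s)
```

It was precisely this linear relation that the supernova teams found breaking down at the largest distances, hinting at acceleration.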
You see, back in 1922, Professor Alexander Friedmann used Einstein's field equations to develop a physical cosmology governing the expansion of space in homogeneous and isotropic models of the universe. Friedmann's equations assume an expansion identical to that of a featureless soup with no complicated structures. And this has become the basis for what today is the standard Lambda Cold Dark Matter cosmology used to describe the universe. Wiltshire points out the unfortunate fact that in reality, today's universe is not homogeneous. The earliest snapshot of our universe, captured as the faded afterglow of the Big Bang and called the cosmic microwave background radiation, displays only slight temperature variations caused by differences in density present at the time. Today, the cosmic microwave background radiation has cooled to just under three degrees above absolute zero, and the universe has become a vast cosmic web dominated in volume by empty voids, surrounded by sheets of galaxies and threaded by wispy filaments of stars. Rather than comparing the supernova observations to the standard Lambda Cold Dark Matter cosmological model, Wiltshire and colleagues have used a different model known as timescape cosmology. Now, timescape cosmology has no dark energy. Instead, it includes variations in the effects of gravity caused by the lumpiness of the structure of the universe. Wiltshire says that clocks carried by observers in galaxies differ from the clock that best describes the average expansion once inhomogeneity becomes significant. He says whether or not one infers accelerated expansion then depends crucially on the clock being used. Wiltshire points out that timescape cosmology gives a slightly better fit to the largest supernova data catalogue than Lambda Cold Dark Matter cosmology. But he admits the statistical evidence isn't yet strong enough to definitively rule in favour of one model over the other.
Wiltshire says future missions such as the European Space Agency's Euclid spacecraft will have the power to distinguish between differing cosmological models. And another problem involves science's understanding of Type Ia supernovae. You see, in reality, they're not perfect standard candles. Since timescape cosmology uses a different equation for average expansion, it gives scientists a new way to test for changes in the properties of supernovae with distance, regardless of which model ultimately fits better. A better understanding of all this will increase the confidence with which scientists can use these supernovae to develop precise distance indicators. Wiltshire says answering questions like these will help scientists determine whether dark energy is real or not, an important step in determining the ultimate fate of the universe.
Speaker B: Einstein invented dark energy 100 years ago when he first applied relativity to cosmology, and for a different reason to the one we use it for now. Back then, he mistakenly wanted to exactly balance the self-attraction of matter by some sort of antigravity on larger scales. And that's because he couldn't imagine that the universe had a beginning, and he didn't want it to change with time. But really, not much was known about the universe in 1917. The very idea that the nebulae were galaxies at vast distances was the matter of something called the Great Debate. So he faced a dilemma because, as he well knew, the essence of his theory is that matter tells space how to curve and space tells matter how to move. Which means the natural state of anything with an attractive force, from the point of view of Newton, is that space will naturally want to expand or contract, moving together with matter. It never stands still. So in order to get round that, he put in this extra antigravity force to balance matter, which is universally attractive and generally wants to pull together.
Speaker A: He was never really happy about that, was he? It was always something that bothered him a little bit, as a fudge factor.
Speaker B: And he knew that, yes. The reason he wasn't happy was that a universe with that thing put in as a fudge factor is unstable. So mathematically, it's not a good solution. There are various reasons why you could be happy or unhappy with it, because it's a very deep question in physics, and there are various levels at which you can answer that question.
Speaker A: All that changed when Edwin Hubble realised that galaxies appear to be virtually all moving away from us.
Speaker B: Correct. So that's why he went back on the cosmological constant, because if he didn't need it, then he didn't see a reason to include it. And it's been coming back and forth, in and out of fashion, over many decades, depending on what the observations were. Most of the time, people didn't include it. But there were other times in history, like the late 1960s, when people saw certain observations and suddenly it was fashionable again, and then it went away again, and then it came back. And of course, in the 90s, well...
Speaker B: There are various reasons for knowing that the universe wasn't described by the model that everybody favoured, which was a model with only cold dark matter, in which all the energy density in the universe added up to what's known as the critical density: the density you need so that things will just keep on expanding forever, because if there's too much matter, then things will want to contract. And there are various reasons that theorists have, based on inflationary cosmology, for wanting something which is very close to the critical density. First of all, in the early 90s, there was a lot of evidence that it wasn't such a critical density universe full of ordinary matter. And then the observation of supernovae really clinched it, in the late 1990s, that the expansion of the universe appeared to be accelerating, if you use the simple model of Friedmann, which has been around since 1922.
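The critical density has a simple closed form, rho_c = 3*H0^2 / (8*pi*G). A quick back-of-the-envelope check (H0 = 70 km/s/Mpc is an assumed round value, not a measurement quoted in this interview) shows just how empty a critical density universe is:

```python
import math

G = 6.674e-11         # Newton's gravitational constant, m^3 kg^-1 s^-2
MPC_IN_M = 3.0857e22  # metres in one megaparsec

def critical_density(h0_km_s_per_mpc=70.0):
    """Critical density rho_c = 3 H0^2 / (8 pi G), in kg per cubic metre."""
    h0_per_s = h0_km_s_per_mpc * 1000.0 / MPC_IN_M  # convert H0 to SI units (1/s)
    return 3.0 * h0_per_s ** 2 / (8.0 * math.pi * G)

rho_c = critical_density()
print(f"{rho_c:.1e} kg/m^3")  # about 9.2e-27: a few hydrogen atoms per cubic metre
```

That tiny number is the knife-edge density the inflationary theorists mentioned above wanted the universe to sit on.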
Speaker A: That's basically assuming the universe is the same density all the way through. It hasn't got the structures that we know it has, with filaments and voids.
Speaker B: Yeah, that's exactly right. Einstein, of course, included that. Einstein actually wrote down that sort of model in 1917. It's just that he didn't take the idea of a changing universe seriously. And then Friedmann used the same equations, but just took the idea of expansion and contraction seriously. So, yes, it does assume that the universe expands just as if everything could be put through a blender to make a uniform soup. People do know that the universe is inhomogeneous. But what they've always been assuming is that even though it's inhomogeneous, the average expansion is exactly as if there were no structure. People then always look at it mathematically as if we assume there's no structure, and we just add the structures on as perturbations. And if the perturbation theory breaks down, we do simulations using Newtonian gravity, because it's very difficult to do it with general relativity.
Speaker A: What are the basic ideas for what dark energy is? I know quantum fluctuations, particles popping into and out of existence...
Speaker B: That's one of the current ones. Well, that's an idea that generally doesn't work in itself. There is something known as the cosmological constant problem: if you predict what the vacuum energy should be using some rather naive ideas in quantum field theory, you get something which is out by 120 orders of magnitude. Much too...
Speaker A: Much energy for what we see.
Speaker B: Yes, that's right. The universe just wouldn't be here. But that is done without any understanding of what quantum gravity is. And ultimately, there are holes in Einstein's theory, and we expect that it should be unified with quantum theory, and then we might understand this question better. But what people have generally done is to invent some scalar field; it's typically very easy to add sources of matter, or in this case energy, something which doesn't clump gravitationally. So we should distinguish dark matter from dark energy. Dark matter would be something which doesn't emit light and is different from all the particles that we're familiar with, but which still forms lumps gravitationally, whereas dark energy is a different type of stuff. It's something which doesn't want to form lumps because it's sort of inherently repulsive. And you can write down very imaginative theories of physics in which you add these sorts of things, and there are zillions and zillions of ways you can do that: things called quintessence, for example. You can try and modify the equations of Einstein and imagine gravity is totally different. But when people do that, they still assume the universe is very simple, because, unfortunately, it's only easy to solve differential equations if the universe is simple. And that's, I think, one of the problems sociologically in the field: people can think up all sorts of very crazy, unknown physics, but they still want to solve simple differential equations rather than dealing with some of the harder conceptual and foundational issues in the holes in general relativity which Einstein left. Because you shouldn't think of general relativity as a finished, complete theory. There are really basic questions which Einstein didn't answer.
Those are things I've been trying to think about, questions such as: what is the biggest particle, the biggest object, that can move following a path determined by Einstein's equations? There has to be an answer to that, but those sorts of things weren't put in by Einstein. He also didn't answer on what scale the Einstein equations hold if I cut matter up into arbitrarily large boxes. Einstein's equations say that geometry is proportional to matter, so the matter tells space how to curve. But it assumes that you can describe the matter by a fluid. At some level, though, the fluid description has to break down, especially once structures form and the universe becomes complex. Things like voids and filaments of galaxies certainly are not like particles; they don't have a particle description. So the way in which we describe that matter can be fundamentally different. It's just that because it's a too-hard problem, we just say, let's forget about that, let's just assume that it is behaving like pressureless dust. And then we add all these extra things because we don't want to go and think about the too-hard problem. So a few of us are trying to think about the too-hard problem.
Speaker A: You've been doing this research for at least ten years that I know of, and probably a lot longer. Tell me a little bit about your own views on this matter.
Speaker B: Okay, so I have a long history in gravitational physics, working on everything from higher dimensions and string theory, which I started off with in my PhD in Cambridge, and things like that. And I've looked at all these other sorts of funny theories that people think about: modified gravity theories, theories with dark energy and all that sort of stuff. In fact, I did a lot of that before the supernova observations came out in the 90s, because there was already enough evidence that the older models didn't work. But it is obvious to everybody who looks at data that the universe is inhomogeneous on small scales at late epochs. It started out very smooth at the time of the Big Bang, so it's a reasonable approximation then, roughly, to think that the universe is filled with a smooth matter fluid. But clearly, the universe is very inhomogeneous today. So a number of cosmologists started thinking about those questions in the early 2000s, in particular Thomas Buchert, a German cosmologist now working in France, in Lyon, for the last ten years. He put forward a way of taking averages of Einstein's equations which deals with inhomogeneity but looks for something which is close to a smooth average. What's important in doing that is that you don't necessarily get the equations of Alexander Friedmann back. In general, you're going to get some different equation. If you assume Einstein's equations hold on small scales, the scales on which we actually test them, and then you write down the formalism which takes averages of this, you end up with different equations. These are known as the Buchert equations. So a lot of people were thinking about this, and as soon as you challenge the status quo, there is a lot of debate. Various people weren't happy about this and said, well, there are various problems with this, including how you relate these statistical quantities.
Because that's what he's dealing with, statistical things. How do you relate those statistical quantities to observables? And what is the time slicing here? There are a number of these questions which Buchert's formalism doesn't address directly. So that's where I got interested in the problem, because it's clear that there are many things you can do with the Buchert formalism, but you really need to address some foundational questions to try and answer things such as: how do I relate observations to these statistical things? In all the work that I've done, I've always thought a lot about foundational questions, and so that's when I came in. And I guess the most fun I've had is going back and redoing thought experiments in the way that Einstein did when he was inventing general relativity, to think: if he had all these observations today, how would he have approached this thought experiment? In particular, thinking about what is the difference between motion in a completely empty space, a flat space: how do I distinguish moving in an empty space from being at rest in an expanding space? At some level, those two notions should be indistinguishable. So in doing that, I extended a principle of Einstein's, the strong equivalence principle, to try and answer that question. Because I think with a universe which is inhomogeneous, there have to be simplifying principles to say why the universe appears to be almost homogeneous, even though it's not. And people have conventionally assumed, well, it's just the Friedmann equation. But actually, perhaps it's not the Friedmann equation; perhaps there is a deeper principle in trying to disentangle all of the ideas about distances and redshift. So I added those ingredients to the Buchert equations and ended up with something with which you can sit down and make predictions. And it worked. But it's a very hard job then, redoing the whole of cosmology from first principles.
Speaker A: That's an understatement if ever I've heard one.
Speaker B: What we've done is to look at current supernova data, using a very large catalogue of everything that's available. And there was a big debate when some people started arguing about how statistically significant the signature of acceleration is. But they were just comparing universes which all satisfied the Friedmann equation. So you can have a universe with dark energy, or you can have a universe which is completely empty, unrealistic, of course, because we know there's matter in the universe. But a universe which is completely empty neither accelerates nor decelerates in its expansion. You can imagine a universe like this which expands at a constant rate. And then there's argument about how statistically significant it is, because even after 20 years of data, the statistical significance is marginal, or not marginal, depending on how you treat things. So we decided to go back and look at the data, because last time we tested things in detail, we got different answers depending on which way we reduced the supernova data. That's to say, either my cosmological model, the timescape model, fitted better, or, if you used a different method, then the other model fitted better. And it's clear that there are a lot of things that we don't know in the methods for reducing supernova data, because supernovae are not perfect standard candles. There's a lot of uncertainty. So we used the methods of the people who've been questioning the statistical significance of the data, and we also used the methods of their opponents in the debate. And we found that with the new way of doing things, we get agreement between both ways of reducing the data. And the timescape model fits better, but only by a very tiny amount. So it's nothing to write home about. It's not statistically significant yet.
But what is more important from our point of view is that we can see how the cosmological model that you assume is related to various systematic issues when you reduce the data. So, for example, there's something called the colour parameter. And we've done tests that people wouldn't ordinarily do, such as, from the point of view of my cosmological model, the timescape, you shouldn't have an average expansion law if you're using data on small scales below about 450 million light years. So what we did was not to assume a scale, but to progressively cut data out and see if there's a change in quantities which are supposed to be constant. And lo and behold, there is a change. And the change is roughly at the scale where you'd expect it. And that's independent of the cosmological model: if you're using my model or their model or an empty universe model, you get the same result. And that means that those sorts of things are going to be important for anyone working with supernova data and trying to understand which is the best model of the universe. So at present, because the uncertainties in the way that we use supernova candles are so large, we can't say which model fits better. But with the Euclid satellite mission, combined with other supernova work, we should be able to distinguish the models. In fact, we should be able to test the Friedmann equation itself. So whether my model is the correct way of doing inhomogeneity or some other model is the correct way of doing inhomogeneity, anything which doesn't agree with the Friedmann equation will give a different result in a particular test. So you can test the Friedmann equation itself. And so I have a prediction about the precision of the observational data required in order to see the difference. And there is another backreaction group, with what's called the tardis cosmology; they have a prediction also.
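The progressive-cut test described here, removing nearby data and checking whether a supposedly constant fit parameter drifts, can be sketched with synthetic numbers. Everything below is invented for illustration: the distances, the built-in 450 million light year scale dependence and the colour-like parameter are stand-ins, not the paper's actual data or fitting pipeline.

```python
import random
import statistics

random.seed(1)

# Synthetic sample: (distance in millions of light years, fitted parameter value).
# We build in a small offset below 450 Mlyr to mimic a scale-dependent bias.
sample = []
for _ in range(500):
    d = random.uniform(50, 2000)
    value = 0.10 + (0.05 if d < 450 else 0.0) + random.gauss(0, 0.005)
    sample.append((d, value))

def mean_parameter(min_distance):
    """'Refit' (here, simply average) the parameter using only data beyond the cut."""
    return statistics.mean(v for d, v in sample if d >= min_distance)

# Progressively cut out the nearby data: the parameter drifts until the cut
# passes the built-in 450 Mlyr scale, then stabilises near 0.10.
for cut in (0, 150, 300, 450, 600):
    print(f"cut {cut:>4} Mlyr: parameter = {mean_parameter(cut):.3f}")
```

The point of the exercise, as in the interview, is that a "constant" that changes as you raise the distance cut is flagging a scale-dependent systematic, whatever cosmological model you prefer.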
But in order to actually reach that precision, we've got to understand the supernovae better. And so what we have demonstrated in this latest work is that even if people who are trying to understand dark energy don't accept our model, the very fact that it's not the Friedmann equation we're using can help them to understand the systematic uncertainties, the unknown astrophysics and the selection biases. Because if you only test your data with one model, you can add things and subtract things and do all sorts of things and not realise that actually it's a degeneracy between your cosmological model and some empirical parameter which describes some supernova physics that you don't understand. What I'm hoping is that people will start testing their data, and the way that they reduce the data, using this cosmological model or some other cosmological model which is not Friedmann. And then maybe we've got some better hope of actually determining what dark energy is.
Speaker A: That's Professor David Wiltshire from the University of Canterbury in New Zealand. And this is SpaceTime. Still to come, the CAPSTONE Moon probe finally phones home after mysteriously going silent. And NASA's second launch at the Arnhem Space Centre blasts off to study Alpha Centauri. All that and more still to come on SpaceTime.
Speaker A: There are a lot of relieved scientists tonight, with NASA's CAPSTONE mission re-establishing contact with mission managers two days after mysteriously going silent following its separation from its Rocket Lab Photon spacecraft. The communications loss raised serious concerns about the mission and whether or not a catastrophic failure had somehow occurred. The 25 kilogram CAPSTONE CubeSat was launched last week aboard an Electron booster from Rocket Lab's Māhia Peninsula launch complex on New Zealand's North Island east coast. The Cislunar Autonomous Positioning System Technology Operations and Navigation Experiment, CAPSTONE for short, is designed to test what's known as a near rectilinear halo orbit, basically a Lagrangian gravitational well between the Earth and the Moon, a place where a spacecraft could maintain a stable orbit between the two celestial bodies as the gravitational pull from each of them cancels out. This unique orbital position will eventually become home to the new lunar Gateway space station, which will be used as a base camp and jumping-off point for Artemis missions down to the lunar surface. CAPSTONE was launched into an initial low Earth parking orbit by the Electron rocket, then sent onto a lunar transfer orbit by Photon, gradually increasing its velocity and stretching its orbit into an extreme ellipse around the Earth. Photon then ignited its engine for one final burn, accelerating CAPSTONE, with the aid of the Sun's gravity, to some 39,500 kilometres an hour on a ballistic lunar transfer trajectory before releasing the probe. The burn is sending CAPSTONE on a four-month, 1.55 million kilometre journey away from the Earth, more than three times the distance between the Earth and the Moon, before gravity finally pulls the little spacecraft back into the Earth-Moon system and settles it into the near rectilinear halo orbit.
Now, once in position, CAPSTONE will remain there for the next six months, tasked with verifying the region's stability and orbital dynamics. After separating from Photon, CAPSTONE deployed its solar arrays as planned and began preparing its on-orbit propulsion system for the first of its engine burns. It successfully contacted NASA's Deep Space Network during one pass, but achieved only a partial contact on the second, and then nothing. The spacecraft mysteriously went silent, and mission managers still don't know why. But the spacecraft has now re-established communications and is looking happy and healthy. The loss of contact meant scrubbing the spacecraft's first trajectory correction engine burn, but that's an issue which can be rectified on the next burn. We'll keep you informed. This is SpaceTime. Still to come, NASA's second launch from the Arnhem Space Centre blasts off to study Alpha Centauri. And later in the Science Report, doctors warn of a new, more infectious third COVID Omicron wave on its way. All that and more still to come on SpaceTime. NASA has conducted a second launch from the Arnhem Space Centre, sending a suborbital sounding rocket on a mission to study our nearest neighbouring star system, Alpha Centauri. The flight, named SISTINE, was designed to study ultraviolet light from the two primary stars in the triple-star Alpha Centauri system. The launch had been delayed several days by high winds. SISTINE stands for Suborbital Imaging Spectrograph for Transition region Irradiance from Nearby Exoplanet host stars. It was developed by the University of Colorado, Boulder. It's designed to study how the ultraviolet light from the two stars, Alpha Centauri A, which is a spectral type G yellow dwarf star similar to but a little bit bigger than our Sun, and Alpha Centauri B, a less massive spectral type K orange dwarf star a little smaller than our Sun, is likely to impact the atmosphere of any planet orbiting around the pair.
Speaker C: All the things we sort of take for granted that happen here on Earth with our Sun, we just don't know.
Speaker A: Those things about planets around other stars, and so that's what we're here to measure. Alpha Centauri A and B orbit each other in a binary system and are in turn orbited by the third star of the system, a spectral type M red dwarf star known as Proxima Centauri, which, at 4.25 light years distant, is the nearest star to the Earth other than the Sun. The mission, aboard a 13-metre-long Black Brant IX sounding rocket, follows on from last week's successful launch, which studied Alpha Centauri using an X-ray quantum calorimeter developed by the University of Wisconsin-Madison. That flight used unique X-ray detectors, cooled down to just one twentieth of a degree above absolute zero, to measure X-rays in the interstellar medium, that's the space between the stars, with unprecedented precision. The third mission, which is slated to launch this week, will carry DEUCE, the Dual-channel Extreme Ultraviolet Continuum Experiment, also developed by the University of Colorado, Boulder. DEUCE will measure a so-far unstudied part of the ultraviolet spectrum from stars less massive than the Sun and check out its effect on the atmospheres of any orbiting planets. The Arnhem Space Centre was selected as the preferred launch site for these missions because of its location, allowing astrophysics studies that can only be done from the Southern Hemisphere. Located just twelve degrees south of the equator on the Gulf of Carpentaria, the Arnhem Space Centre near Nhulunbuy, 640 km east of Darwin in the Northern Territory outback, is ideally suited for equatorial missions. The complex is developing three launch pads, allowing it to undertake up to 50 flights a year. Meanwhile, work continues on three other commercial launch facilities across Australia. Southern Launch is developing a polar launch facility at a place called Whalers Way. That's on South Australia's Eyre Peninsula near Port Lincoln.
It would undertake flights over the Great Southern Ocean on polar orbits, often used by scientific and intelligence-gathering spacecraft. Southern Launch also operates a rocket test range at Koonibba, 40 km from Ceduna on the South Australian west coast. It tracks west over the Nullarbor Plain, allowing the testing and retrieval of rockets and their payloads. Meanwhile, construction is also proceeding in North Queensland on the Bowen spaceport at Abbot Point, which will be used for equatorial launches by Gilmour Space Technologies' new Eris rocket, which will fly east over the South Pacific Ocean. It's expected to launch its first mission later this year. This is SpaceTime, and time now to take a brief look at some of the other stories making news in science this week with the Science Report. The Australian government has decided to proceed with a fourth COVID-19 vaccination dose for people aged 30 and older. The fourth shot is already available for senior citizens aged over 65. The move comes in the wake of the spread of the more infectious BA.4 and BA.5 subvariants of Omicron, which first appeared in South Africa in January and February this year. These new mutations have learned how to evade immune responses, and they even ignore vaccine-induced antibodies. This allows the new variants not just to infect more people, but also to reinfect people who were previously infected and have recovered. The original Wuhan strain of COVID-19 had an R0 rating of 3.3, meaning every infected person was likely to infect an average of at least 3.3 more people. The Delta strain, which was the big boogeyman last year, increased that infection rate to 5.1, while the original Omicron BA.1 strain saw that number jump all the way to 9.5. The BA.2 strain, which is currently the dominant variant in Australia, has seen another big jump in infection rates to 13.
Three preprint, non-peer-reviewed studies in South Africa looking at the new BA.4 and BA.5 subvariants suggest an R0 rating of around 18.6, a massive jump, meaning that every infected person is likely to infect at least another 18 to 19 people. Now, to put that in context, that's a similar rate to measles, which until now was the most infectious viral disease in humans. Over 6.4 million people have now been killed by the COVID-19 coronavirus since it first appeared in the area around China's Wuhan Institute of Virology back in September 2019. However, the World Health Organization says the true death toll is likely to be around 15 million, with over 560 million confirmed cases globally. A new study claims people with a diet rich in vitamin K could have a 34% lower risk of atherosclerosis-related cardiovascular disease. The findings, reported in the Journal of the American Heart Association, are based on data from more than 50,000 people taking part in the Danish Diet, Cancer and Health Study over a 23-year period. Scientists at Edith Cowan University investigated whether people who eat more foods containing vitamin K had a lower risk of cardiovascular disease related to atherosclerosis, which is a buildup of plaque in the arteries. There are two types of vitamin K found in the foods we eat. Vitamin K1 comes primarily from green leafy vegetables and vegetable oils, while vitamin K2 is found in meat, eggs and fermented foods such as cheese. The study found that people with the highest intakes of vitamin K1 were 21% less likely to be hospitalised with cardiovascular disease related to atherosclerosis, and for vitamin K2, the risk of being hospitalised was 14% lower. A new study has found that true believers whose identity is strongly fused with their cause are usually the most willing to fight and die for that cause.
The findings, reported in the journal Frontiers in Psychology, looked at two causes, Second Amendment gun rights and abortion rights, and found that if a person considers a cause to be sacred, considers it to be a moral conviction, or if their identity is fused with the cause, they may be more willing to fight and die for that cause. The authors say the findings suggest that people who are strongly fused with their cause, and who experience what they believe to be a threat to it, are likely to become radicalised. They also suggest that shifting radicals from fusion with an extreme cause to a benevolent cause may well transform them from a force of evil to a force of good. A new book looks at the high levels of junk science which often find their way into forensic science. From lie-detecting polygraphs to bite marks, the book carefully and unarguably explains that many things which have in the past been paraded in courts of law as scientific fact are often unreliable and commonly entirely bogus. Tim Mendham from Australian Skeptics says the problem isn't just polygraphs, but arson investigations, hair microscopy, bullet lead analysis, voice spectrometry, handwriting analysis and even bloodstain spatter analysis.
Speaker C: Yeah, this is the thing. And I like the way you say forensic science, because thank heavens you did, because the actual definition of forensic is to do with the law, with courts. You have forensic science, you have forensic medicine, you have forensic accounting, all sorts of things. So forensic just means it's used in courts, et cetera. But forensic science obviously has a little cachet because it's now popular. It's been popular in TV shows and things, where the scientific person goes out to the crime scene, collects information, et cetera, then comes back with their expert opinion. The same with a lot of psychology as well, obviously, we've seen that. The problem is you see things used in forensic science that they think are accurate, objective measures, where they're actually very subjective measures. The classic example is the polygraph, or the lie detector.
Speaker A: Yeah, I thought polygraphs were real. I thought they were something that proved whether you were lying or not, but we now know they're not. And it's the same with body language. I always thought that was real, because that's what they tell me, but we now know there's very little relationship with what a person is really thinking.
Speaker C: Yeah, I mean, this particular story came out of an article by someone who was looking at a book about forensics and saying that most of the forensic science you see used in court, especially obviously in the media, but actually used in court, is dodgy to say the least. Now, the polygraph, the lie detector: if you have been reading the skeptical literature, you would know it's been known for decades to be poor. Very, very poor. A lie detector does not measure lies. It measures stress, which can be a very different thing. And people undergoing a lie detector test tend to be under stress, for a start. And there have been examples where a question is asked of a person who is wired up, along the lines of "this is when you did it", knowing full well they didn't. And even though they might say, no, I didn't do that, the little graph will go shooting up, indicating a lie. Well, it's not. It indicates stress, a totally different thing. In fact, I think these days polygraph evidence is not allowed in courts, because they know it just doesn't work. Close to that are a lot of other areas. This particular book says bite marks, blood spatters, voice spectrometry, handwriting analysis, hair microscopy, all sorts of things that are used, some of them routinely, in forensic science, have very little justification. Forget the media, forget the CSI programs, let's look at the actual use of it. You do get people who get carried away by the fame of being a forensic science expert, whose expert opinion is being used in courts, et cetera.
Speaker A: Some of them make a career out of it, don't they?
Speaker C: That's right, and they get a lot of credibility out of it, and it might be good for their profession, et cetera, all sorts of things like that. They become a noted expert on a particular topic.
Speaker A: I keep thinking of the famous Lindy Chamberlain case with Azaria, and the blood spatter in the Holden Torana, which turned out not to be blood spatter at all, but insulation glue.
Speaker C: Yeah. This is the problem, that especially in high profile cases, people get carried away with the "evidence", in quotes, that's being put forward from a forensic science point of view. Now, there is real forensic science, and it's really used in cases. The question is how reliable it is and how much credibility you should give to it. And this particular book says about the only thing that is credible is DNA analysis, which is pretty objective. You do analyse the DNA, you see the DNA thread, you compare it with other DNA, and if it matches, there's a pretty good chance that is an indication. Contamination, of course, is the other area of concern. I mean, that's the case with all these things. But with some of these, when you're testing teeth marks or bite marks, it's very subjective as to what matches or not. DNA comes closer to being truly scientific and testable and repeatable, while a lot of these other things are not. And so it's a simple matter of realising that despite what you see in the movies and TV, people running around doing lie detector tests, practitioners wearing lab coats, I don't know why they wear lab coats, it makes them look more scientific, you should take everything like that with skepticism and a grain of salt. That's the thing to watch out for, and that's what this book talks about. Obviously, some people might say there's a bit of vested interest, because the guy who wrote the book does DNA analysis, so that might be an issue there as well. But it's very interesting. Skeptics have been talking about this for yonks; other people are only just discovering it, mainly because it's done with such conviction in TV shows and that sort of thing.
Speaker A: That's Tim Mendham from Australian Skeptics, and that's the show for now. SpaceTime is available every Monday, Wednesday and Friday through Apple Podcasts, iTunes, Stitcher, Google Podcasts, Pocket Casts, Spotify, Acast, Amazon Music, Bitesz.com, SoundCloud, YouTube, your favourite podcast download provider and from spacetimewithstuartgary.com. SpaceTime is also broadcast through the National Science Foundation on Science Zone Radio, and on both iHeartRadio and TuneIn Radio. And you can help to support our show by visiting the SpaceTime Store for a range of promotional merchandising goodies, or by becoming a SpaceTime patron, which gives you access to triple-episode, commercial-free versions of the show, as well as lots of bonus audio content which doesn't go to air, access to our exclusive Facebook group and other rewards. Just go to spacetimewithstuartgary.com for full details. And if you want more SpaceTime, please check out our blog, where you'll find all the stuff we couldn't fit in the show, as well as heaps of images, news stories, loads of videos and things on the web I find interesting or amusing. Just go to spacetimewithstuartgary.tumblr.com. That's all one word, and that's Tumblr without the e. You can also follow us through @stuartgary on Twitter, @spacetimewithstuartgary on Instagram, through our SpaceTime YouTube channel and on Facebook. Just go to facebook.com/spacetimewithstuartgary. SpaceTime is brought to you in collaboration with Australian Sky and Telescope magazine, Your Window on the Universe.
Speaker A: You've been listening to SpaceTime with Stuart Gary. This has been another quality podcast production from Bitesz.com.