Scientists have set a new world record for plasma pressure - the 'key
ingredient' for producing energy from nuclear fusion - which means this
clean and sustainable energy source is closer to our grasp than ever
before. The new record stands at 2.05 atmospheres - a 15 percent jump over
the previous record of 1.77 atmospheres. Both this record and the last
were set at the custom-built Alcator C-Mod reactor at MIT.
While a viable nuclear fusion reactor ready to power our homes is
still a long way off, these increased pressures equate to increased
reaction rates, and are more evidence that we're getting closer to a
reactor that's technologically and economically viable. It also gives scientists more clues about how best to move forward.
"This is a remarkable achievement that highlights the highly successful Alcator C-Mod program at MIT," said physicist Dale Meade of Princeton Plasma Physics Laboratory, who wasn't involved in the experiments.
"The record plasma pressure validates the high-magnetic-field approach as an attractive path to practical fusion energy."
To reach the 2.05-atmosphere record, MIT researchers turned the
reactor up to 35 million degrees Celsius (63 million degrees Fahrenheit)
- over twice as hot as the Sun's core - holding plasma producing 300 trillion fusion reactions per second for 2 seconds.
These three variables - temperature, pressure, and time sustained - act as trade-offs, as previous records from teams
from around the world have demonstrated. For example, while the Alcator
C-Mod reactor has the top spot in terms of pressure, other reactions
have been hotter or lasted longer.
However, plasma pressure is crucial to the overall energy produced, which is why the MIT team is so excited. It says pressure levels are "two-thirds of the challenge" of producing nuclear fusion reactions.
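To see how pressure, temperature, and time sustained combine, here is a rough back-of-the-envelope sketch in Python. It is not the MIT team's own analysis: it treats the plasma as an ideal gas, uses a commonly quoted Lawson-style ballpark for deuterium-tritium fuel, and the 2-second shot length is not the same thing as the energy confinement time, so the product is only an illustrative upper bound.

```python
# Rough, illustrative figure-of-merit calculation for the reported C-Mod shot.
# Assumptions (not from the article): ideal-gas plasma pressure p = n*k_B*T,
# and a ballpark Lawson-style D-T target of roughly 8-10 atm*s for p * tau_E.

K_B = 1.380649e-23      # Boltzmann constant, J/K
ATM = 101_325.0         # 1 atmosphere in pascals

def plasma_pressure_atm(density_m3: float, temperature_k: float) -> float:
    """Ideal-gas estimate of plasma pressure in atmospheres."""
    return density_m3 * K_B * temperature_k / ATM

# Reported values: 2.05 atm at 35 million deg C (~35e6 K at these temperatures).
pressure_atm = 2.05
temperature_k = 35e6
implied_density = pressure_atm * ATM / (K_B * temperature_k)   # particles per m^3

shot_length_s = 2.0                              # shot duration, NOT the confinement time
figure_of_merit = pressure_atm * shot_length_s   # atm*s, illustrative upper bound only

print(f"implied particle density ~ {implied_density:.2e} m^-3")
print(f"p * t ~ {figure_of_merit:.1f} atm*s vs a ~8-10 atm*s D-T ballpark")
```

The point is simply that pressure and confinement time multiply in any such figure of merit, which is why a higher-pressure, shorter shot can matter as much as a cooler, longer one.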
Scientists think nuclear fusion
could give us the clean, safe, and virtually unlimited energy source
we've been looking for - it essentially replicates what's happening on
the Sun here on Earth, by heating tiny elements of matter to over
several million degrees Celsius, and forming the superheated gas called
plasma.
Isolate plasma from ordinary matter using a super-strong magnetic
field, and there's your energy source - one that could replace all
nuclear and fossil fuel power plants at a stroke. And unlike the nuclear fission
reactions that power today's nuclear power plants (where atoms are
split), nuclear fusion (where atoms are fused together) creates no
radioactive waste, and there's no chance of a meltdown either.
The Universe is expanding. In the standard model of cosmology the rate of that expansion is given by the Hubble parameter, which depends in part on the dark energy that drives cosmic expansion. New
observations of distant galaxies yield a higher than expected Hubble
value. That may mean the Universe is expanding faster than we thought,
but there’s no need to start rewriting textbooks just yet.
Since the Hubble parameter measures the rate of cosmic expansion, one way to determine it is to compare the redshift
of light from distant galaxies with their distance. The cosmological
redshift of a galaxy is easy to measure, and is due to the fact that
cosmic expansion stretches the wavelength
of light as it travels across millions or billions of light years,
making it appear more red. By comparing the redshifts for galaxies of
different distances we can determine just how fast the Universe is
expanding.
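At low redshift this comparison reduces to Hubble's law: recession velocity is roughly the speed of light times the redshift, and also roughly the Hubble parameter times the distance. A minimal sketch, using made-up galaxies rather than real survey data:

```python
# Minimal sketch of estimating the Hubble parameter from redshift-distance pairs.
# The galaxies and values below are illustrative placeholders, not real survey data.

C_KM_S = 299_792.458  # speed of light, km/s

# (distance in megaparsecs, measured redshift z) -- hypothetical low-z galaxies
galaxies = [(50.0, 0.0118), (120.0, 0.0290), (300.0, 0.0725), (450.0, 0.1090)]

# At low redshift, recession velocity v ~ c*z and Hubble's law says v ~ H0 * d,
# so each galaxy gives an estimate H0 ~ c*z / d in km/s per Mpc.
estimates = [C_KM_S * z / d for d, z in galaxies]
h0 = sum(estimates) / len(estimates)

print(f"per-galaxy estimates: {[round(e, 1) for e in estimates]}")
print(f"mean H0 ~ {h0:.1f} km/s per Mpc")
```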
Unfortunately distance is difficult to determine. It relies upon a
range of methods that vary depending on distance, known as the cosmic distance ladder. For close stars we can use parallax,
which is an apparent shift of stars relative to more distant objects
due to the Earth’s motion around the Sun. The greater a star’s distance
the smaller its parallax, so the method is only good to about 1,600
light years. For larger distances we can look at variable stars such as Cepheid variables. We know the distance to some Cepheid variables from their parallax, so we can determine their actual brightness (absolute magnitude). From this we’ve found that the rate at which a Cepheid variable changes in brightness correlates with its overall brightness.
This relation means we can determine the absolute brightness of Cepheid
variables greater than 1,600 light years away.
If we compare that to
their apparent brightness we can calculate their distance. By observing
Cepheids in various galaxies we can determine galactic distances. We can
observe Cepheids out to about 50 million light years, at which point
they’re simply too faint to currently observe.
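Both rungs described above come down to simple formulas: a parallax angle of p arcseconds puts a star at 1/p parsecs, and a Cepheid whose absolute magnitude M is known from the period-luminosity relation sits at the distance given by the distance modulus m - M = 5 log10(d / 10 pc). A small sketch with placeholder numbers, not actual catalogue values:

```python
# Two rungs of the distance ladder in formula form.
# Numbers are placeholders for illustration, not actual catalogue values.

def parallax_distance_pc(parallax_arcsec: float) -> float:
    """Distance in parsecs from an annual parallax angle in arcseconds."""
    return 1.0 / parallax_arcsec

def distance_modulus_pc(apparent_mag: float, absolute_mag: float) -> float:
    """Distance in parsecs from the distance modulus m - M = 5*log10(d/10pc)."""
    return 10.0 ** ((apparent_mag - absolute_mag + 5.0) / 5.0)

# Rung 1: a nearby star with a 10 milliarcsecond parallax sits ~100 pc away.
print(f"parallax rung: {parallax_distance_pc(0.010):.0f} pc")

# Rung 2: a Cepheid whose period-luminosity relation implies M = -4.0
# and which appears at m = 26.0 would sit at roughly 10 Mpc.
d_pc = distance_modulus_pc(26.0, -4.0)
print(f"Cepheid rung: {d_pc:.3e} pc (~{d_pc / 1e6:.0f} Mpc)")
```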
The Achilles heel of the cosmic distance ladder is that it relies upon a chain of data. The distance to a supernova depends upon the calculated distances of Cepheid variables, which in turn depend upon
parallax distance measurements. With ever increasing distance comes
greater uncertainty in the results.
So you want your uncertainties at
each step to be as small as possible, which is where this new work comes
in. Using data from the Hubble Space Telescope’s Wide Field Camera 3, a
team
measured about 2,400 Cepheid variables in 11 galaxies where a Type Ia
supernova had also occurred. This allowed them to reduce the uncertainty
of supernova distance measurements. They then compared the distances
and redshifts for 300 supernovae to get a measure of the Hubble
parameter accurate to within 2.4%.
That by itself is good work, but the result was surprising.
The value
for the Hubble parameter they got was about 73 km/s per megaparsec,
which is higher than the “accepted” value of 69.3. The difference is
large enough that it falls outside the uncertainty range of the accepted
value. If the result is right, then it means the Universe is expanding
at a faster rate than we thought. It could also point to an additional
dark energy component in the early Universe, meaning that dark energy is
very different than we’ve supposed.
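One way to see why that gap matters is to express it in units of the combined uncertainty. The sketch below uses the quoted 2.4% for the supernova measurement but assumes a purely illustrative error bar on the "accepted" value, since the article doesn't give one:

```python
# How far apart are the two Hubble values, in units of their combined uncertainty?
# The uncertainty on the "accepted" value below is a hypothetical placeholder;
# the 2.4% figure is the one quoted for the supernova measurement.

h0_local, h0_local_err = 73.0, 0.024 * 73.0   # km/s/Mpc, ~2.4% uncertainty
h0_accepted, h0_accepted_err = 69.3, 0.8      # assumed error bar, for illustration only

combined_err = (h0_local_err**2 + h0_accepted_err**2) ** 0.5
tension_sigma = abs(h0_local - h0_accepted) / combined_err

print(f"difference: {h0_local - h0_accepted:.1f} km/s/Mpc")
print(f"tension: ~{tension_sigma:.1f} sigma under these assumed error bars")
```

How many sigma the tension works out to depends entirely on those error bars, which is exactly what the argument over the "accepted" value is about.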
But we shouldn’t consider this result definitive just yet.
The use of
supernovae to measure the Hubble parameter isn’t the only method we
have. We can also look at the way galaxies cluster on large scales, and fluctuations in the cosmic microwave background.
Each of these gives a slightly different value for the Hubble
parameter, and the “accepted” value is a kind of weighted average of all
measurements.
The variation of values from different methods is known as tension
in the cosmological model, and any new claim about dark energy and
cosmic expansion will need to address this tension. If the supernova
method is right and the Universe really is expanding faster than we
thought, why do other methods yield a value significantly smaller than
the true value?
It could be that there is some bias in one or both of
the methods that we haven’t accounted for. The Planck satellite, which maps the cosmic microwave background, for example, has to
account for gas and dust between us and the cosmic background, and that may be skewing the results. It could be that the supernovae we use as standard candles to measure galactic distance aren’t as standard as we think. It could also be that our cosmological model isn’t quite right.
The current model presumes that the universe is flat, and that cosmic expansion is driven by a cosmological constant.
We have measurements to support those assumptions, but if they are
slightly wrong that could account for the differences as well.
This new result does raise interesting questions, and it confirms
that the discrepancy between different methods is very real. Whether
that leads to a new understanding of cosmic expansion and dark energy is
yet to be seen.
Paper: Adam G. Riess, et al. A 2.4% Determination of the Local Value of the Hubble Constant. arXiv:1604.01424 [astro-ph.CO] (2016)
By Brian Koberlein who is an astrophysicist, professor and author. You can find more of his writing at One Universe at a Time.
The last
piece of Albert Einstein’s general theory of relativity may be about to fall
into place 100 years after he first revealed it to the world.
Scientists
searching for minute traces of gravitational waves, infinitesimally subtle distortions
through space-time that Einstein predicted would ripple off giant black holes
and dying stars millions of light years away, may be about to announce one of
the biggest breakthroughs in modern physics.
Whispers
have been circulating for months that a hypersensitive detector spanning the
breadth of the US has finally caught the elusive phenomenon. The team is
expected to make a definitive announcement tomorrow (Thursday).
If they
have found the trail left by gravitational waves, it will be more than just a
vindication of Einstein’s mathematical masterpiece. The discovery would allow
stargazers to map out hidden galaxies on the other side of the universe by
looking out for almost imperceptible disturbances in our own.
In November
1915 Einstein stunned the Prussian Academy of Science with his formulas showing
how gravity might be caused by massive objects curving the fabric of space and
time. He later used the theory to predict that vast bodies circling
each other would spread waves of gravity at the speed of light, very slightly
expanding and contracting the distances between atoms in distant galaxies.
While
astronomers have found ample evidence backing up the central planks of general
relativity, gravitational waves are so delicate that they have proven much
harder to pin down.
The leading
candidate for the job is the Laser Interferometer Gravitational-Wave
Observatory, which consists of a detector deep in the wilds of Washington state
on the west coast of the US and another 3,000km (1,865 miles) away in rural
Louisiana.
Each
facility is made up of two 4km-long vacuum tubes containing ultra-sensitive
lasers that can detect the slightest disturbance from gravitational waves. If
the lasers are knocked out of place physicists will be able to work out roughly
which part of the sky the waves came from.
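The localisation works by timing. A gravitational wave travels at the speed of light, so it can arrive at the two sites at most about 10 milliseconds apart, and the measured delay confines the source to a ring on the sky. A rough sketch of that geometry, using the 3,000 km baseline quoted above and a made-up 7 millisecond delay:

```python
# Rough two-detector sky localisation from arrival-time difference.
# Uses the ~3,000 km baseline quoted in the article; the 7 ms delay is a made-up example.
import math

C = 299_792_458.0          # speed of light, m/s
BASELINE_M = 3.0e6         # separation between the two detectors, ~3,000 km

max_delay_s = BASELINE_M / C
print(f"maximum possible arrival-time difference: {max_delay_s * 1e3:.1f} ms")

# For a measured delay dt, the source lies on a cone (a ring on the sky) at angle
# theta from the baseline, where cos(theta) = c * dt / baseline.
measured_delay_s = 7.0e-3   # hypothetical 7 ms delay
cos_theta = C * measured_delay_s / BASELINE_M
theta_deg = math.degrees(math.acos(cos_theta))
print(f"source ring at ~{theta_deg:.0f} degrees from the detector baseline")
```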
Lawrence
Krauss, a well-known theoretical physicist at Arizona State University, is the
most influential researcher to publicly endorse the rumours on Twitter.
UPDATE: Einstein’s Gravitational Waves Detected In Major Breakthrough
In an announcement that
electrified the world of astronomy, scientists said they have finally detected
gravitational waves, the ripples in the fabric of space-time that Einstein
predicted a century ago.
Some scientists likened the
breakthrough to the moment Galileo took up a telescope to look at the
planets.
The discovery of these waves,
created by violent collisions in the universe, excites astronomers because it
opens the door to a new way of observing the cosmos. For them, it’s like turning
a silent movie into a talkie because these waves are the soundtrack of the
cosmos.
“Until this moment we had our
eyes on the sky and we couldn’t hear the music,” said Columbia University
astrophysicist Szabolcs Marka, a member of the discovery team. “The skies will
never be the same.”
An all-star international team of
astrophysicists used a newly upgraded and excruciatingly sensitive $1.1 billion
instrument known as the Laser Interferometer Gravitational-Wave Observatory, or
LIGO, to detect a gravitational wave from the distant crash of two black holes,
one of the ways these ripples are created.
To make sense of the raw data,
the scientists translated the wave into sound. At a news conference, they played
what they called a “chirp” — the signal they heard on September 14. It was
barely perceptible even when enhanced.
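"Translating the wave into sound" simply means playing the strain signal as audio; an inspiralling pair of black holes sweeps upward in frequency and loudness, which is what makes it a "chirp". The toy sketch below writes such a sweep to a WAV file; it is a cartoon of the idea, not the real LIGO data or analysis pipeline:

```python
# Toy "chirp": a sine wave sweeping up in frequency and amplitude, written as audio.
# This is a cartoon of what a compact-binary inspiral sounds like, not real LIGO data.
import math, struct, wave

SAMPLE_RATE = 44_100
DURATION_S = 1.0
F_START, F_END = 35.0, 350.0        # sweep roughly across the band LIGO hears

frames = []
phase = 0.0
n_samples = int(SAMPLE_RATE * DURATION_S)
for i in range(n_samples):
    t = i / SAMPLE_RATE
    frac = t / DURATION_S
    freq = F_START + (F_END - F_START) * frac**3     # frequency rises ever faster
    phase += 2.0 * math.pi * freq / SAMPLE_RATE
    amplitude = 0.1 + 0.9 * frac                     # louder toward "merger"
    frames.append(int(32767 * amplitude * math.sin(phase)))

with wave.open("toy_chirp.wav", "wb") as wav:
    wav.setnchannels(1)
    wav.setsampwidth(2)              # 16-bit samples
    wav.setframerate(SAMPLE_RATE)
    wav.writeframes(struct.pack(f"<{len(frames)}h", *frames))
```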
Some physicists said the finding
is as big a deal as the 2012 discovery of the subatomic Higgs boson, sometimes
called the “God particle.” Some said this is bigger.
“It’s really comparable only to
Galileo taking up the telescope and looking at the planets,” said Penn State
physics theorist Abhay Ashtekar, who wasn’t part of the discovery
team. “Our understanding of the heavens
changed dramatically.”
Gravitational waves, first
theorised by Albert Einstein in 1916 as part of his theory of general
relativity, are extraordinarily faint ripples in space-time, the hard-to-fathom four-dimensional fabric that combines time with the familiar up, down, left and right.
When massive but compact objects like black holes or neutron stars collide, they
send gravity ripples across the universe.
Scientists found indirect proof
of the existence of gravitational waves in the 1970s — computations that showed
they ever so slightly changed the orbits of a pair of stars circling each other — and the work
was honoured as part of the 1993 Nobel prize in physics. But Thursday’s
announcement was a direct detection of a gravitational wave.
And that’s considered a big
difference.
“It’s one thing to know
soundwaves exist, but it’s another to actually hear Beethoven’s Fifth Symphony,”
said Marc Kamionkowski, a physicist at Johns Hopkins University who wasn’t part
of the discovery team. “In this case we’re actually getting to hear black holes
merging.” Gravitational waves are the “soundtrack of the universe,” said team
member Chad Hanna of Pennsylvania State University.
Detecting gravitational waves is
so difficult that when Einstein first theorised about them, he figured
scientists would never be able to hear them. Einstein later doubted himself and
even questioned in the 1930s whether they really do exist, but by the 1960s
scientists had concluded they probably do, Ashtekar said. In 1979, the National
Science Foundation decided to give money to the California Institute of
Technology and the Massachusetts Institute of Technology to come up with a way
to detect the waves. Twenty years later, they started building two LIGO
detectors in Hanford, Washington, and Livingston, Louisiana, and they were
turned on in 2001. But after years with no luck, scientists realised they had to
build a more advanced detection system, which was turned on last
September.
“This is truly a scientific
moonshot and we did it. We landed on the moon,” said David Reitze, LIGO’s
executive director.
The new LIGO in some frequencies
is three times more sensitive than the old one and is able to detect ripples at
lower frequencies that the old one couldn’t. And more upgrades are
planned.
With thanks to The Australian.
IN 1966 Time magazine ran a cover story asking: Is God Dead? Many
have accepted the cultural narrative that he’s obsolete — that as science
progresses, there is less need for a “God” to explain the universe. Yet it turns
out that the rumours of God’s death were premature. More amazing is that the
relatively recent case for his existence comes from a surprising place — science
itself.
Here’s the story: The same year Time featured the now-famous
headline, the astronomer Carl Sagan announced that there were two important
criteria for a planet to support life: The right kind of star, and a planet the
right distance from that star. Given the roughly septillion — 1 followed by 24 zeros — planets in the universe, there should have been about a sextillion — 1 followed by 21 zeros — planets capable of supporting life.
With
such spectacular odds, the Search for Extraterrestrial Intelligence, a large,
expensive collection of private and publicly funded projects launched in the
1960s, was sure to turn up something soon. Scientists listened with a vast radio
telescopic network for signals that resembled coded intelligence and were not
merely random. But as years passed, the silence from the rest of the universe
was deafening. Congress defunded SETI in 1993, but the search continues with
private funds. As of 2014, researchers have discovered precisely bupkis — 0
followed by nothing.
What
happened? As our knowledge of the universe increased, it became clear that there
were far more factors necessary for life than Sagan supposed. His two parameters
grew to 10 and then 20 and then 50, and so the number of potentially
life-supporting planets decreased accordingly. The number dropped to a few
thousand planets and kept on plummeting.
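The arithmetic behind that shrinking estimate is simply repeated multiplication: each extra requirement multiplies the expected number of suitable planets by a fraction less than one. A toy sketch with invented probabilities, only to show how fast the product collapses:

```python
# Toy illustration of how multiplying many independent requirements shrinks an estimate.
# The starting count and per-factor probabilities are invented for illustration only.
total_planets = 1e24            # a "septillion planets" style starting point

# Hypothetical fraction of planets passing each additional requirement.
factor_probabilities = [0.1] * 20 + [0.01] * 10    # 20 mild cuts, 10 harsh ones

expected = total_planets
for p in factor_probabilities:
    expected *= p

print(f"expected suitable planets after {len(factor_probabilities)} cuts: {expected:.2e}")
# 1e24 * 0.1**20 * 0.01**10 = 1e-16, i.e. effectively none.
```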
Even
SETI proponents acknowledged the problem. Peter Schenkel wrote in a 2006 piece
for Skeptical Inquirer magazine: “In light of new findings and insights, it
seems appropriate to put excessive euphoria to rest ... We should quietly admit
that the early estimates ... may no longer be tenable.”
As
factors continued to be discovered, the number of possible planets hit zero, and
kept going. In other words, the odds turned against any planet in the universe
supporting life, including this one. Probability said that even we shouldn’t be
here.
Today there are more than 200 known parameters necessary for a
planet to support life — every single one of which must be perfectly met, or the
whole thing falls apart. Without a massive planet like Jupiter nearby, whose
gravity will draw away asteroids, a thousand times as many would hit Earth’s
surface. The odds against life in the universe are simply
astonishing.
Yet
here we are, not only existing, but talking about existing. What can account for
it? Can every one of those many parameters have been perfect by accident? At
what point is it fair to admit that science suggests that we cannot be the
result of random forces? Doesn’t assuming that an intelligence created these
perfect conditions require far less faith than believing that a life-sustaining
Earth just happened to beat the inconceivable odds to come into
being?
There’s more. The fine-tuning necessary for life to exist on a planet is nothing compared with the fine-tuning required for the universe to
exist at all. For example, astrophysicists now know that the values of the four
fundamental forces — gravity, the electromagnetic force, and the “strong” and
“weak” nuclear forces — were determined less than one millionth of a second
after the big bang. Alter any one value and the universe could not exist.
For
instance, if the ratio between the nuclear strong force and the electromagnetic
force had been off by the tiniest fraction of the tiniest fraction — by even one
part in 100,000,000,000,000,000 — then no stars could have ever formed at all.
Feel free to gulp.
Multiply that single parameter by all the other necessary
conditions, and the odds against the universe existing are so heart-stoppingly
astronomical that the notion that it all “just happened” defies common sense. It
would be like tossing a coin and having it come up heads 10 quintillion times in
a row. Really?
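For what it's worth, the coin-toss arithmetic is easy to write down in logarithms, since the chance of k heads in a row is one in 2 to the power k:

```python
# The coin-toss analogy in log form: the chance of k heads in a row is (1/2)**k,
# so log10 of that probability is -k * log10(2).
import math

k = 10 * 10**18                                # "10 quintillion" consecutive heads
log10_p = -k * math.log10(2)
print(f"log10(probability) ~ {log10_p:.2e}")   # about -3.0e18, i.e. 1 in 10**(3.0e18)
```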
Fred Hoyle, the astronomer who coined the term “big bang,” said that his atheism was
“greatly shaken” at these developments. He later wrote that “a commonsense
interpretation of the facts suggests that a super-intellect has monkeyed with
the physics, as well as with chemistry and biology ... The numbers one
calculates from the facts seem to me so overwhelming as to put this conclusion
almost beyond question.”
Theoretical physicist Paul Davies has said that “the appearance of
design is overwhelming” and Oxford professor Dr. John Lennox has said “the more
we get to know about our universe, the more the hypothesis that there is a
Creator ... gains in credibility as the best explanation of why we are
here.”
The
greatest miracle of all time, without any close seconds, is the universe. It is
the miracle of all miracles, one that ineluctably points with the combined
brightness of every star to something — or Someone — beyond
itself.