Saturday, October 29, 2005

FQX: Foundational Questions in Physics & Cosmology

Interesting (relatively) new Web site:

FQX: Foundational Questions in Physics & Cosmology
Mission
Our mission is to catalyze and support research on foundational questions in physics and cosmology, particularly cutting-edge areas unlikely to be supported by conventional funding sources. We do so through grants, contests, conferences, and other programs, as described below.
Not much commentary needed about this. There isn't much content on the site yet, except for the discussion forums.

Unfortunately, even the forums have little content, except for abstracts of some presentations that were given at a conference called "Amazing Light: Visions for Discovery", which was held October 6-8. There is some cause for concern in that regard, since one of the main sponsors of the event was the Templeton Foundation, which has an avowed theistic agenda. But the people running the FQX Web site -- including Max Tegmark and Anthony Aguirre -- have excellent scientific credentials. Their advisory board includes eminent scientists like Martin Rees, Lee Smolin, and Frank Wilczek, so one has high expectations of good things from this site.

But why mention the site at all if there's not much on it yet? Well, here are the questions that FQX promises to deal with:

  • What, if anything, happened before the Big Bang? What determined the characteristics of the universe? Is our observed universe all that exists, or is it just one "universe" among many, a mere part of a much bigger picture, in which we misinterpret local conditions as fundamental laws? What will happen in the distant future? Will dark energy collapse, or rip apart, our universe? Will all particles and black holes ultimately decay away?
  • What do the fantastically effective but bafflingly counterintuitive laws of quantum mechanics tell us about reality? How do quantum measurements occur: Are there really "many worlds," and if not, how do quantum possibilities collapse into a single observed reality? Can we find a self-consistent theory of nature that unifies gravity and quantum mechanics?
  • What distinguishes the future from the past, if the universe is governed by physical laws that make no such distinction? How does duration, which we experience, relate to the time described by physics and mathematics?
  • What is the relationship between physics, mathematics, information? How 'real' is the world of mathematics - and how 'real' is the world of matter?
  • Why does the universe seem so complex, given its simple initial conditions, and the elegant mathematics that describes it? Is life ubiquitous in the universe (or beyond)? How does matter give rise to consciousness - or does it?


Those are some of the most important open questions in all of science, right up there with How did life on Earth begin? and How does the brain work? So you might want at least to bookmark the FQX site and check back with it from time to time.


Tuesday, October 25, 2005

The universe, dark matter, and everything

It all fits together.

Over the years, there have been many skeptics of what is currently the consensus theory -- the big bang -- of how the universe evolved from a very hot, very dense state about 13.7 billion years ago to its current stage. In the 1950s and early 1960s, an alternative -- the "steady state theory" -- briefly flourished. But, in the view of most cosmologists, this alternative took a fatal hit in 1964 with the discovery of the cosmic microwave background (CMB).

Nevertheless, a handful of big bang skeptics have persisted until the present day. Some argue, for instance, that the spectral red shifts that are generally presumed to be due to a relatively simple relationship between the distance of an object and its velocity relative to the Earth could in fact be explained by some other means.

Unfortunately for the skeptics, there is more than one line of evidence for the big bang theory. For example, the measured relative abundances of a few light isotopes of hydrogen, helium, and lithium agree very well with the predictions of big bang nucleosynthesis. The steady state model has no obvious explanation for this agreement, because the predictions rest on the universe having been in a very hot, dense state in which those isotopes were created within the first few minutes after the big bang.
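To give one concrete example of such a prediction (this is the standard textbook estimate, not a figure taken from the articles linked here): by the time nucleosynthesis begins, the neutron-to-proton ratio has frozen out and decayed to roughly n/p ≈ 1/7. If essentially all the surviving neutrons end up bound in helium-4, the predicted primordial helium mass fraction is Y ≈ 2(n/p)/(1 + n/p) ≈ (2/7)/(8/7) = 2/8 = 0.25, i.e. about 25% helium by mass -- very close to what is measured in the oldest, least chemically processed gas.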

An even deeper problem for the skeptics is that just about everything we have observed about the distant universe -- things like the expansion rate at different times, the apparent existence of "exotic" dark matter, and various detailed properties of the CMB -- all fit very nicely and neatly in the big bang theory, like the pieces of a mechanical puzzle. There is no alternative theory that explains even one of these features well, let alone all of them.

Suppose you took all the observational evidence we have regarding dark matter, the CMB, and so forth, and combined it with the theoretical assumptions behind the big bang model in order to make a detailed computer simulation of the evolution of the universe. This would be much like the way the behavior of hurricanes, say, can be simulated from what we know about the circulation of the atmosphere, ocean currents, and basic gas dynamics. Would such a simulation of the universe predict additional features of the universe we can actually observe -- most importantly, things like the way galaxies and clusters of galaxies are distributed in space, and the kinds of objects we can observe at very great distances, such as quasars and very young galaxies?

A very detailed simulation of this kind has recently been constructed and run. The results of this Millennium Simulation were announced in June of this year. (News stories here, here, here.)


An image from the 3-dimensional visualization of the Millennium Simulation


Then in August a nice article by Ron Cowen appeared in Science News -- Cosmic Computing: Simulating the universe

Basically, what any simulation allows you to do is to derive consequences and predictions that follow from given theoretical assumptions and observational data. The computer makes it possible to obtain numerical predictions, and (sometimes) to present them in visual form, even when exact solutions of the underlying equations aren't known. And when the observational data are only approximate, it is possible to work out the consequences as the data are varied, in order to determine what produces the best fit with other observations.
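To make that a bit more concrete, here is a toy sketch of the kind of calculation at the heart of such simulations -- a handful of "dark matter" particles evolved under their mutual gravity. To be clear, this is not the Millennium Simulation's actual code (the real run used a tree-based N-body method with roughly ten billion particles); the particle count, units, softening length, and time step below are arbitrary choices made purely for illustration.

# A toy version of the gravitational N-body calculation at the heart of
# cosmological simulations.  NOT the Millennium code; all numbers here are
# arbitrary illustrative choices.
import numpy as np

G = 1.0        # gravitational constant in code units (arbitrary)
N = 200        # number of particles (real runs use billions)
SOFT = 0.05    # softening length, to avoid numerical blow-ups at close encounters
DT = 0.01      # time step
STEPS = 500

rng = np.random.default_rng(42)
pos = rng.uniform(-1.0, 1.0, size=(N, 3))   # random initial positions
vel = np.zeros((N, 3))                      # start from rest
mass = np.full(N, 1.0 / N)                  # equal-mass particles

def accelerations(pos):
    # Direct-summation gravity with Plummer softening: O(N^2), fine for a toy.
    dx = pos[None, :, :] - pos[:, None, :]        # separation vectors
    r2 = (dx ** 2).sum(axis=-1) + SOFT ** 2
    inv_r3 = r2 ** -1.5
    np.fill_diagonal(inv_r3, 0.0)                 # no self-force
    return G * (dx * (mass[None, :, None] * inv_r3[:, :, None])).sum(axis=1)

acc = accelerations(pos)
for _ in range(STEPS):                            # leapfrog (kick-drift-kick)
    vel += 0.5 * DT * acc
    pos += DT * vel
    acc = accelerations(pos)
    vel += 0.5 * DT * acc

print("positions of the first three particles after", STEPS, "steps:")
print(pos[:3])

A real cosmological run also has to include the expansion of space, periodic boundary conditions, gas physics, and initial conditions drawn from the CMB fluctuation spectrum, but the core operation -- computing gravitational accelerations and stepping the particles forward in time -- is the same.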

Here are some of the most noteworthy findings:

  • Temperature and density fluctuations in the CMB are markers for the distribution of dark matter, and as this distribution participates in the expansion of the universe, it leads to a distribution of galaxies and clusters of galaxies which is very consistent with what has been determined by large-scale galaxy surveys, especially the Sloan Digital Sky Survey and the Two Micron All Sky Survey.
  • The model confirms that the expansion of the universe is accelerating, implying the existence of some sort of dark energy.
  • Supermassive black holes and quasars could have evolved in the universe very early. Some extremely bright quasars that existed about 870 million years after the big bang must have been powered by black holes with masses a billion times that of our sun.
  • The simulation shows that such supermassive black holes could have formed that early, because they would have grown more quickly in unusually dense regions of the early universe. They would also have become the cores of supermassive galaxies which can now be observed at the centers of the largest galaxy clusters.
  • Galaxies form in a bottom-up rather than top-down manner -- they start out small and grow as they capture more matter, instead of by the splitting of larger aggregations of matter. Eventually larger elements of a structure hierarchy develop -- galaxy clusters and then clusters of clusters.
  • Clusters of galaxies form within dense regions of dark matter known as halos. Among halos having the same mass, the ones that formed earliest have the densest clusters of galaxies. Hence old galaxies cluster more strongly than newer ones. That is, clusters of galaxies are denser if they formed earlier, as well as if the halo in which they formed is itself denser. (See The age dependence of halo clustering.)
  • About 85% of gravitating matter in the universe must be in the form of non-baryonic "exotic" dark matter in order for the simulation to be consistent with observations. (A rough cross-check of this figure follows the list.)

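For a rough cross-check of that last figure (using WMAP-era density parameters, which are my assumption here rather than numbers quoted in the simulation papers): with Ω_m ≈ 0.27 for all gravitating matter and Ω_b ≈ 0.044 for ordinary baryonic matter, the non-baryonic fraction comes out to (0.27 - 0.044)/0.27 ≈ 0.84, i.e. roughly 85%.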

------------------------

References:

The Virgo Consortium - research group responsible for the Millennium Simulation

The Big Bang

------------------------


Saturday, October 22, 2005

Our cousins, Caenorhabditis elegans

Humans descended from worms? Sounds about right...

Evolutionary Conservation Of A Mechanism Of Longevity From Worms To Mammals

Though the study of aging in the nematode model organism C. elegans has provided much insight into this complex process, it is not yet clear whether genes involved in aging in the worm have a similar role in mammals. In a recent study, Dr. Hekimi and colleagues of McGill University (Canada) report that inactivation of the gene mclk1, the murine ortholog of the C. elegans gene clk-1, results in increased cellular fitness and prolonged lifespan in mice.

The gene clk-1 in the worm, as well as mclk1 in mice, encodes an enzyme necessary for the biosynthesis of ubiquinone, an essential cofactor in numerous redox reactions such as mitochondrial respiration. Though lack of the mclk1 gene results in embryonic lethality, the authors were able to study mclk1-/- embryonic stem (ES) cells and show that they are resistant to oxidative stress and exhibit reduced DNA damage when compared to ES cells in which this gene is active. ...

Though the aging process of different organisms will most likely differ due to different physiologies and environments, Dr. Hekimi summarizes the relevance of their findings by concluding that "... the longevity-promoting effect of reducing clk-1/mclk1 activity that was initially observed in C. elegans is conserved in mice, supporting the idea that some molecular mechanisms of aging are shared throughout the animal kingdom."

A number of other genes that affect aging in a wide range of species have also been found in research using the nematode worm C. elegans -- for example:



Human papilloma virus vaccine

First large test shows vaccine prevents cervical cancer
THE ASSOCIATED PRESS

October 6, 2005, 11:26 AM EDT

TRENTON, N.J. -- The first large study of an experimental cervical cancer vaccine found it was 100 percent effective, in the short term, at blocking the most common cause of the disease, the vaccine's maker said Thursday.

Merck's genetically engineered vaccine prevents cervical cancer by blocking infection from the human papilloma virus strains that cause 70 percent of cervical cancers.

Other types of HPV, which is sexually transmitted, also can cause cervical cancer and painful genital warts. About 20 million Americans have some form of HPV.

The final-stage study of the vaccine included 10,559 sexually active women ages 16 to 26 in the United States and 12 other countries who were not infected with the HPV strains 16 or 18. Half got three vaccine doses over six months; half got dummy shots.

Among those still virus-free after the six months, none who received the vaccine developed either cervical cancer or precancerous lesions likely to turn cancerous over an average two years of follow-up, compared with 21 who got dummy shots.

"To have 100 percent efficacy is something that you have very rarely," Dr. Eliav Barr, Merck's head of clinical development for the vaccine called Gardasil, told The Associated Press. "We're breaking out the champagne."

Other articles about this: here, here , here, here.

Sounds like good news, right? Well, consider this earlier story from April:


Will cancer vaccine get to all women?

The trouble is that the human papilloma virus (HPV) is sexually transmitted. So to prevent infection, girls will have to be vaccinated before they become sexually active, which could be a problem in many countries.

In the US, for instance, religious groups are gearing up to oppose vaccination, despite a survey showing 80 per cent of parents favour vaccinating their daughters. "Abstinence is the best way to prevent HPV," says Bridget Maher of the Family Research Council, a leading Christian lobby group that has made much of the fact that, because it can spread by skin contact, condoms are not as effective against HPV as they are against other viruses such as HIV.

And attitudes may be even worse in other cultures:
"We found that some Asian women in Britain are afraid even to get tested for HPV infection, because they say if it is positive they will be killed, never mind that their husbands probably gave it to them," says Szarewski. She feels that such attitudes may mean that HPV vaccination may be a non-starter in such communities.

What a shame that religious and cultural prejudices contribute to millions of avoidable deaths. This is consistent with a "culture of life"?


How do stars form?

Stars Form By Gravitational Collapse, Not Competitive Accretion
There are now two dominant models of how stars form: gravitational collapse theory holds that star-forming molecular clumps, typically hundreds to thousands of solar masses in mass, fragment into gaseous cores that subsequently collapse to make individual stars or small multiple systems. In contrast, competitive accretion theory suggests that at birth all stars are much smaller than the typical stellar mass (~0.5 solar masses), and that final stellar masses are determined by the subsequent accretion of unbound gas from the clump. Competitive accretion models explain brown dwarfs and free-floating planets as protostars ejected from star-forming clumps before accreting much mass, predicting that they should lack disks, have high velocity dispersions, and form more frequently in denser clumps. They also predict that mean stellar mass should vary within the Galaxy. Here we derive a simple estimate for the rate of competitive accretion as a function of the star-forming environment, based partly on simulations, and determine in what types of environments competitive accretion can occur. We show that no observed star-forming region produces significant competitive accretion, and that simulations that show competitive accretion do so because their properties differ from those determined by observation. Our result shows that stars form by gravitational collapse, and explains why observations have failed to confirm predictions of the competitive accretion scenario
This sounds significant... Out of two plausible alternative models of star formation, only one is determined to be consistent with observation.

It's a top-down vs. bottom-up question. Does a large clump of gas first fragment into dense cores that then collapse, essentially fixing each star's final mass, or do numerous small protostellar seeds form and then grow by competitively accreting gas from the surrounding clump? The latter possibility, called "competitive accretion", makes predictions that don't agree with observations, so the other possibility, known as "gravitational collapse", is more likely.
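For readers who want a feel for the sort of "simple estimate" involved, here is a back-of-envelope sketch of the Bondi-Hoyle accretion rate, the textbook formula that competitive-accretion estimates generally build on. This is my own illustration, not the calculation from the paper, and the clump density, sound speed, and velocity dispersion are made-up but roughly molecular-cloud-like values.

# Back-of-envelope Bondi-Hoyle accretion rate -- an illustration of the
# general idea, not the derivation in the paper.  All input values are
# made-up but roughly molecular-cloud-like.
import math

G     = 6.674e-8       # cm^3 g^-1 s^-2
M_SUN = 1.989e33       # g
YEAR  = 3.156e7        # s

def bondi_hoyle_rate(m_star, rho, c_s, v_rel):
    # dM/dt = 4 pi G^2 M^2 rho / (v_rel^2 + c_s^2)^(3/2), in g/s
    return 4 * math.pi * G**2 * m_star**2 * rho / (v_rel**2 + c_s**2) ** 1.5

m_seed = 0.1 * M_SUN   # a small protostellar seed
rho    = 1e-19         # g/cm^3, a dense clump
c_s    = 2e4           # cm/s (~0.2 km/s sound speed in cold gas)
v_rel  = 2e5           # cm/s (~2 km/s velocity dispersion)

rate = bondi_hoyle_rate(m_seed, rho, c_s, v_rel)
print("accretion rate ~ %.1e solar masses per year" % (rate / M_SUN * YEAR))

With numbers like these the rate comes out to only a few times 10^-10 solar masses per year -- far too slow to change a protostar's mass appreciably on star-formation timescales -- which is at least in the spirit of the paper's conclusion that observed star-forming regions don't allow significant competitive accretion.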

More: How do stars form?

Press release: Astrophysicists put kibosh on alternative theory of star formation


Wednesday, October 19, 2005

Revisiting the evidence for dark matter

Recently, the CERN Courier, a usually reliable source, came out with this article: General relativity versus exotic dark matter.
Determinations of the rotation speed of stars in galaxies (galactic rotation curves) based on the assumption that Newtonian gravity is a good approximation have led to the inference that a large amount of dark matter must be present - more than can be accounted for by non-luminous baryonic matter. While there are plenty of attractive theoretical candidates for the additional dark matter, such as a lightest supersymmetric particle (LSP), it is also interesting to look into the details of the calculations that suggest the need for such exotica. Now F I Cooperstock and S Tieu of the University of Victoria have reworked the problem using general relativity in place of Newtonian gravity, and they find no need to assume the existence of a halo of exotic dark matter to fit the observed rotation curves.
This is based on a July arXiv preprint: General Relativity Resolves Galactic Rotation Without Exotic Dark Matter.

Then only a week ago Space.com jumped on the story: Dark Matter: Invisible, Mysterious and Perhaps Nonexistent. However, it does caution, "The new analysis has been submitted to the Astrophysical Journal but has yet to be reviewed by other scientists."

It seems that more than a few people are skeptical about dark matter, and eager to tout anything that seems to explain it away. Unfortunately for them, the Cooperstock-Tieu paper was reviewed by other scientists, who'd already thrown cold water on it in another arXiv preprint from August, Singular disk of matter in the Cooperstock and Tieu galaxy model.

Cosmologist Sean Carroll at Cosmic Variance explains in some detail where Cooperstock and Tieu seem to have gone wrong: Escape from the clutches of the dark sector.
To be honest, there are a bunch of problems with this paper. For example, equations (1) and (2) seem mutually inconsistent — they have chosen one coordinate system in which to express the spacetime metric, and another in which to express the spacetime velocity of the particles in the galaxy. Ordinarilly, you have to pick one coordinate system and stick to it. More importantly, Korzynski has analyzed their solution carefully and noticed that they have secretly included not only the mass of the stars, but a completely imaginary thin sheet of infinite density in the galactic plane. So the fact that the rotation curves don’t decay as they should is really no surprise.


Sean also writes some interesting stuff about problems using perturbation theory to "solve" the Einstein equations, and this may be relevant to work of Kolb and others (see this) that attempts to explain accelerating cosmic expansion without dark energy or quintessence. (See here for my overview of those topics.) But we're getting too far off course, so put this on the shelf for now.

Anyhow, problems with the choice of coordinate systems are one of the most common sources of error in general relativity. Even Einstein himself managed to screw up on this account at times. In fact, such a lapse led him to cease submitting papers to a leading physics journal, the Physical Review, as explained in this article: Einstein Versus the Physical Review. It seems that in 1936 Einstein had written a paper with his assistant Nathan Rosen claiming to disprove the possibility of gravitational waves. He was miffed that the journal had put the paper out for peer review, and a referee discovered that a problem with the choice of coordinate systems invalidated the result. Although Einstein soon recognized his mistake and published a revised paper a few months later in another journal, he never again submitted work to the Physical Review.

So both the CERN Courier and Space.com missed the doubts of other cosmologists about the Cooperstock-Tieu paper. But there's something much more important that they missed: There is a huge amount of other evidence for "exotic" dark matter which is independent of galactic rotation curves. (This is sometimes known as the "galaxy rotation problem".) Recall that there are two types of dark matter: baryonic dark matter (made mostly of protons and neutrons) and exotic dark matter (all other gravitating matter that's not visible). Here's some of the evidence:

  1. The average velocities of galaxies in large galaxy clusters allow one to calculate the mass of the cluster -- and it's much too high to be accounted for by the visible (baryonic) matter. (This comes from the "virial theorem"; a rough version of the estimate is sketched after this list.)
  2. X-ray observations of galaxy clusters show the presence of a lot of hot gas that could not persist in the cluster without much more mass due to exotic dark matter. See: Scientists Find Missing Matter.
  3. Gravitational lensing caused by large galaxy clusters affecting objects behind the cluster (on the same line of sight) also implies much more mass in the cluster.
  4. Clusters of clusters ("superclusters") could not have formed to the extent that they have without exotic dark matter. A couple of recent surveys of many thousands of galaxies support this. This is all part of the issue of cosmological structure formation, which is easiest to explain if there is a large amount of exotic dark matter.
  5. The amplitudes of temperature fluctuations in the cosmic microwave background require much more mass than available as baryonic matter.
  6. Various anomalous objects have been detected that appear to have the mass of a galaxy but little or no visible matter -- "low surface brightness galaxies" and "dark galaxies". See: Astronomers claim first 'dark galaxy' find, and Have we seen the first "dark galaxy"?.
  7. New calculations of star velocities in elliptical galaxies are consistent with large amounts of exotic dark matter. See: here, here, here, here.

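As a rough illustration of item 1, here is the back-of-envelope virial estimate, with typical rich-cluster numbers I've picked for illustration rather than values from any particular survey.

# Rough virial-theorem mass estimate for a rich galaxy cluster.  For a
# self-gravitating system in equilibrium, M ~ sigma^2 * R / G, where sigma
# is the galaxies' velocity dispersion and R the cluster radius.
G     = 6.674e-11      # m^3 kg^-1 s^-2
M_SUN = 1.989e30       # kg
MPC   = 3.086e22       # m

sigma = 1.0e6          # 1000 km/s velocity dispersion, in m/s (illustrative)
R     = 1.5 * MPC      # cluster radius (illustrative)

M_virial = sigma**2 * R / G
print("virial mass ~ %.1e solar masses" % (M_virial / M_SUN))
# ~3e14 solar masses -- far more than the mass one would infer from the
# cluster's starlight, which is essentially Zwicky's original argument.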

The bottom line is: if exotic dark matter didn't exist, cosmologists would need to come up with new explanations for a whole lot of other fairly well-established phenomena.

----------------------------------------------------

Other references:

Experimental Searches for Dark Matter - survey/review article from 2002

Dark matter - survey and many additional references on dark matter

----------------------------------------------------


Saturday, October 15, 2005

Be Afraid, Be Very Afraid

In a post ominously entitled Be Afraid, Be Very Afraid Clifford Johnson at Cosmic Variance talks about a campus colloquium he organized, featuring Professor Nathan Lewis.

The topic was Scientific Challenges in Sustainable Energy Technology. The reason for the bleak, sepulchral title of the post may be inferred from this paraphrase of a key point of Lewis' talk:
no matter how conservative you are about the effects this [carbon dioxide dumping] will have, and no matter how optimistic you are about the difference we will make by trying to clean up our act using emissions reductions, we are extremely late in getting around to considering greenhouse-gas-emission-free primary sources of energy. How late? Well, using generous estimates of how the trends will continue if we use the policy of “business as usual” currently advocated by our policy makers, by about 2050, we will begin to pass the point where it will take of the order of 1000 years to restore the levels of greenhouse gas to anything like we were used to. [emphasis in original]

The talk as a whole was generally about sustainable energy technology, and the bottom line is that no matter how much effort we put into developing that technology, it can't come soon enough to reduce greenhouse gas emissions sufficiently to avoid significant global warming.

In other words, under any foreseeable scenario, we're probably going to have to cope with the effects of global warming within a few decades. Although I'm not even close to being an expert on this, I've already come to that conclusion.

Anyhow, this is a very worthwhile post to ponder, and you can learn more from other material associated with the presentation.

At the end of the post there's also a plug for another favorite theme: The quest for better science education


Friday, October 14, 2005

Top Advisory Panel Warns of Erosion of U.S. Science

We've written about aspects of this problem before:

Now a Congressionally-chartered panel of the U. S. National Academies is saying very similar things. There's a press release here, and a summary from the New York Times here.

Example problems cited are:

  • For the cost of one chemist or one engineer in the United States, a company can hire about five chemists in China or 11 engineers in India.
  • Last year chemical companies shuttered 70 facilities in the United States and have tagged 40 more for closure. Of 120 chemical plants being built around the world with price tags of $1 billion or more, one is in the United States and 50 are in China.
  • U.S. 12th-graders recently performed below the international average for 21 countries on a test of general knowledge in mathematics and science. In addition, an advanced mathematics assessment was administered to students in 15 other countries who were taking or had taken advanced math courses, and to U.S. students who were taking or had taken pre-calculus, calculus, or Advanced Placement calculus. Eleven countries outperformed the United States, and four scored similarly. None scored significantly below the United States.
  • In 1999 only 41 percent of U.S. eighth-graders had a math teacher who had majored in mathematics at the undergraduate or graduate level or studied the subject for teacher certification -- a figure that was considerably lower than the international average of 71 percent.
  • Last year more than 600,000 engineers graduated from institutions of higher education in China. In India, the figure was 350,000. In America, it was about 70,000.
  • In 2001 U.S. industry spent more on tort litigation than on research and development.


There are also recommended corrective actions. But one must say in advance, these recommendations seem too predictably of the "throw money at the problem" sort. Yes, the money should be spent. But something much more is needed: a change of basic cultural attitudes. Developing countries like China and India know that their people must value education and commitment to acquiring scientific and technical skills in order to achieve their goals of becoming competitive, productive members of the world economy. They know this can't happen without effort and hard work.

At one time, when the U. S. was still an "underdeveloped country", we had the same attitudes -- as far back as Benjamin Franklin and Thomas Jefferson, and as recently as the "space race" with the Soviet Union in the 1960s. How is it that in the 1960s only eight years elapsed, practically from scratch, between the decision to put men on the Moon and the accomplishment of that goal in 1969, yet now we can't accomplish the same thing in less than perhaps thirteen years, by 2018, using basically the same technology we've had for 40 years? Isn't that rather incredible?

Well, the goal of putting men (and women) back on the Moon probably isn't the right goal now. Other problems, like finding clean but economically viable sources of energy and figuring out how to cope with global warming, are much more pressing. But whatever goals we choose, it doesn't look like we'll get there without some fundamental attitude changes. Most notably, we'll need to value more highly -- and invest much more of our wealth in -- human capital goods such as basic science education and the acquisition of skills, instead of unproductive consumer goods and services like SUVs and $250-a-pair blue jeans and wedding extravaganzas that cost more than a full year at Harvard. (To say nothing of useless wars in far-off countries that cost upwards of $1 billion a week.) We need better priorities. We can't afford it all.

OK, that rant aside, here are the four concrete proposals of the NAS report:
  • Attract 10,000 top students every year to science and math K-12 teaching careers with 4-year college scholarships.
  • Increase by 10% a year the funding of basic research in physical sciences, engineering, mathematics, and information sciences. Support additional research in innovative energy sources. Start a new program of research grants for the most outstanding early-career researchers.
  • Make the U. S. the most attractive setting in the world to study and conduct research, with 25,000 undergraduate scholarships and 5000 graduate fellowships for students enrolled in physical science, life science, engineering, and mathematics programs at U.S. colleges and universities. Change visa requirements to allow foreign recipients of scientific/technical PhD's to remain in the country for a year to find employment.
  • Fix various policies in order to encourage innovation, such as by modernizing the U.S. patent system, realigning tax policies, and ensuring affordable broadband Internet access.


Yeah, that sounds like a good start. There are all kinds of other things which could be done too, with relatively little funding. Some such things are already being done, such as encouraging -- and providing appropriate training for -- scientifically knowledgeable creative people to write books and TV/movie screenplays that feature scientific researchers and scientific themes. (In much the same vein as this idea.) Hell, we could support public broadcasting (and cable companies) to produce high-quality educational programming -- instead of trying our best to kill such things.

Or, since this is (supposedly) the 21st century, we could put teams of programmers to work developing educational software that gives young people hands-on experience with scientific computing and computer visualization, and which is as interesting and fun to use as worthless shoot-em-up computer games.

Maybe even bring back the Congressional Office of Technology Assessment, so that our legislators stand a chance of getting a clue about all this, instead of their current ignorance. Now there's a truly radical proposal.

Just a few ideas, most of which will probably never happen.

New evidence for evolution

New Analyses Bolster Central Tenets of Evolution Theory

When scientists announced last month they had determined the exact order of all 3 billion bits of genetic code that go into making a chimpanzee, it was no surprise that the sequence was more than 96 percent identical to the human genome. Charles Darwin had deduced more than a century ago that chimps were among humans' closest cousins.

But decoding chimpanzees' DNA allowed scientists to do more than just refine their estimates of how similar humans and chimps are. It let them put the very theory of evolution to some tough new tests.

If Darwin was right, for example, then scientists should be able to perform a neat trick. Using a mathematical formula that emerges from evolutionary theory, they should be able to predict the number of harmful mutations in chimpanzee DNA by knowing the number of mutations in a different species' DNA and the two animals' population sizes.

"That's a very specific prediction," said Eric Lander, a geneticist at the Broad Institute of MIT and Harvard in Cambridge, Mass., and a leader in the chimp project.

Sure enough, when Lander and his colleagues tallied the harmful mutations in the chimp genome, the number fit perfectly into the range that evolutionary theory had predicted.
Of course, there's already an abundance of evidence for evolution, but it's always nice to have more. Gathering evidence that supports a theory by verifying nontrivial predictions of the theory is a concept that aficionados of "intelligent design" haven't begun to understand.
"What makes evolution a scientific explanation is that it makes testable predictions," Lander said. "You only believe theories when they make non-obvious predictions that are confirmed by scientific evidence."

Lander's experiment tested a quirky prediction of evolutionary theory: that a harmful mutation is unlikely to persist if it is serious enough to reduce an individual's odds of leaving descendants by an amount that is greater than the number one divided by the population of that species.

The rule proved true not only for mice and chimps, Lander said. A new and still unpublished analysis of the canine genome has found that dogs, whose numbers have historically been greater than those of apes but smaller than for mice, have an intermediate number of harmful mutations -- again, just as evolution predicts.
It's not clear why this prediction is called "quirky", except that it's simply mathematical, and perhaps therefore more obscure when written in words. It can be restated as follows. Suppose there are N living members of the species -- its "population". Then if a given mutation is harmful enough that it reduces an individual's chance of leaving descendants by more than 1 part in N, the mutation is unlikely to persist.

Stated yet more simply, if less precisely, any mutation which is sufficiently harmful probably won't persist.

A corollary is that species with large populations tend to have fewer harmful mutations. This is because when N is large, the threshold above which a harmful mutation is unlikely to persist is lower, and so a higher percentage are eliminated over time. Consequently, mice would have fewer harmful mutations than dogs, which would in turn have fewer than apes. And humans would have fewer than chimpanzees.
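For anyone who wants to see the 1/N threshold in action, here is a toy simulation (my own illustration, not the analysis described in the article). It follows a mildly deleterious mutation in a simple Wright-Fisher population model: when the fitness cost s is much smaller than 1/N the mutation drifts along almost as if it were neutral, while a cost much larger than 1/N gets it purged quickly.

# Toy Wright-Fisher simulation of the "1/N rule".  Population size, fitness
# costs, and run length are arbitrary illustrative choices, not values from
# the article.
import numpy as np

rng = np.random.default_rng(0)

def survival_fraction(N, s, generations=200, replicates=2000, n0=5):
    # Fraction of replicates in which the mutant allele is still present
    # after the given number of generations.
    surviving = 0
    for _ in range(replicates):
        count = n0                                   # initial mutant copies
        for _ in range(generations):
            if count == 0:
                break
            p = count / N
            # selection: the mutant's relative fitness is (1 - s)
            p_sel = p * (1 - s) / (p * (1 - s) + (1 - p))
            count = rng.binomial(N, p_sel)           # genetic drift
        if count > 0:
            surviving += 1
    return surviving / replicates

N = 1000                                             # haploid population size
for s in (0.0001, 0.01):                             # s << 1/N  vs.  s >> 1/N
    print("s = %-6g (1/N = %g): mutant survives in %.1f%% of runs"
          % (s, 1 / N, 100 * survival_fraction(N, s)))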
Asked to provide examples of non-obvious, testable predictions made by the theory of Intelligent Design, John West, an associate director of the Discovery Institute, a Seattle-based ID think tank, offered one: In 1998, he said, an ID theorist, reckoning that an intelligent designer would not fill animals' genomes with DNA that had no use, predicted that much of the "junk" DNA in animals' genomes -- long seen as the detritus of evolutionary processes -- will someday be found to have a function.

(In fact, some "junk" DNA has indeed been found to be functional in recent years, though more than 90 percent of human DNA still appears to be the flotsam of biological history.) In any case, West said, it is up to Darwinists to prove ID wrong.
Actually, there is "junk" DNA that has been found to be functional. There are sequences of DNA that don't code for proteins (a more precise way of characterizing "junk") yet do have functions. For instance, some of these sequences act as "promoters", which facilitate the expression of actual genes.

But the interesting thing is, when a DNA sequence really does have a function like this, it strongly tends to persist in related species. But many other "junk" DNA sequences disappear rapidly in related species, which strongly suggests they are random and have no function. This contradicts the ID prediction that all DNA in a genome has some function.

As for the contention that "it is up to Darwinists to prove ID wrong," that's obvious hogwash. It is primarily up to the supporters of a theory to provide evidence for it, and evolutionists have provided lots of proof for evolution. Opponents of a theory may provide contrary evidence to specific predictions of the theory. But if the theory in question doesn't make nontrivial but testable predictions -- which is the case with ID -- there simply isn't any way to prove it wrong. How convenient!

The bulk of this article provides a reasonably clear outline of how evolution actually works, and additional evidence in favor of the theory. Reading the details is quite worthwhile. For example:
It is now clear from fossil and molecular evidence that certain patterns of growth in multicellular organisms appeared about 600 million years ago. Those patterns proved so useful that versions of the genes governing them are carried by nearly every species that has arisen since.

These several hundred "tool kit genes," in the words of University of Wisconsin biologist Sean B. Carroll, are molecular evidence of natural selection's ability to hold on to very useful functions that arise.

Research on how and when tool kit genes are turned on and off also has helped explain how evolutionary changes in DNA gave rise to Earth's vast diversity of species. Studies indicate that the determination of an organism's form during embryonic development is largely the result of a small number of genes that are turned on in varying combinations and order. Gene regulation is where the action is.
In particular, a standard criticism of evolution made by ID proponents is that evolution has no way to make sudden large changes in a species. But that's wrong:
... mutations in regulatory portions of a DNA strand can have effects just as dramatic as those prompted by mutations in genes themselves. They can, for example, cancel the development of an appendage -- or add an appendage where one never existed. This discovery refuted assertions by Intelligent Design advocates that gene mutation and natural selection can, at most, explain the fine-tuning of species.
All in all, this article from the Washington Post is a very good piece of science journalism.

Here's another commentary on the same article: The Post Shows the Way. It's highly critical of an article in the New York Times which also deals with the Kitzmiller et al. v. Dover Area School District trial.

It also links to a very good, thorough article: Plagiarized Errors and Molecular Genetics, which explains a lot of molecular biology and its bearing on evolution in great technical detail. One should be prepared to spend at least an hour or so on this article, but it's worth it. ID supporters who claim that there is no evidence for evolution are simply blowing smoke when they ignore evidence like this. Although reading such an article is hard work, anyone tempted to believe the ID position really needs to do that work before making up their mind.


Wednesday, October 12, 2005

The Republican War on Science

Chances are, if you follow science policy debates at all, you are aware of Chris Mooney's recently published book, The Republican War on Science.

Not surprisingly, it has generated a lot of controversy. You can find a good sample of the arguments in some recent discussions involving Mooney and two critics, Lawrence Krauss and Roger Pielke, at TPM Cafe. The comments from others there are very interesting as well.

Although Pielke has good credentials, his style of argumentation seems sophistical and even, perhaps, deliberately deceptive. There is, for instance, a real howler in this article.
A central part of Mooney's thesis is that "bad scientific information leads, inexorably, to bad policy" (p. 4). But scholars who study the relationship of knowledge and action paint a far more complicated picture of the relation of knowledge and decision making than is implied by this overly-simplistic, linear formula.
OK so far. But then he uses, as his first example, studies of needle exchange programs. Such programs were rejected by both the Clinton and Bush administrations. But while the former didn't dispute the scientific studies (which supported exchange programs), the latter claimed, falsely, that the evidence for the efficacy of exchange programs is shaky.

Pielke notes, appropriately, that "Science appears to have been mostly irrelevant in either case." But that does not in the least contradict Mooney's assertion, because what we have here is an instance where good science did not lead to good policy -- the science was either ignored or disputed. That does not address at all the very plausible claim that bad (i.e. inadequate, inept, disproven, distorted, or dishonest) science may very well lead to bad policy -- especially if it is relied upon instead of simply ignored. So Pielke's whole argument begins with poor logic.

On a related note, there's an especially harsh review of the book from Keay Davidson, originally published in the Washington Post, and also posted at Amazon. Davidson is a science journalist who sometimes takes a critical view of science.

Davidson asserts, reasonably enough, that "Historically, debates over U.S. science policy have at least two broad features. First, there are the scientific/technical details of the debates," and "Then there are the broader, quasi-philosophical questions that loom beyond the technical details." All well and good, but Mooney is taken to task because his book doesn't involve much of either.

Well, duh, just looking at the book's title shows neither of these was the purpose, because the book -- whether it's mainly right or wrong -- is obviously about politics. It's a work of political journalism. Nothing wrong with that. Mooney is much more concerned with how science is misused and/or abused by politicians and government officials rather than with how it can be used legitimately and effectively.

Mooney's a fine writer, and on his own blog he points to other problems with Davidson's review. In particular, he disputes Davidson's allegation that the book fails to address the difficult problem of discriminating between "good" and "bad" science. One must admit that though the book mentions the problem in passing, it doesn't deal head-on with the issue. But again, the book isn't intended as either philosophy or sociology of science. It's about the politics of science. And questions about what makes science "bad" (i. e. inadequate, inept, disproven, distorted, or dishonest) deserve (and have many) book-length treatments.

Further response from Mooney is here.

Interestingly enough, Davidson works as a science writer for the Chronicle. But the review of Mooney's book that the Chronicle actually published (Bush and company blinded by pseudoscience), by David Appell, is a lot more favorable.

Update, October 15: Pielke and Mooney go another round here and here. Pielke's contribution is mostly a complaint about the "war" metaphor, but he continues to avoid specifics. His position is that science shouldn't be "politicized", and that at worst the various sides in a given issue mostly just cherry-pick the science that supports their case. Mooney continues to respond (with good basis) that the Republican actions are worse than that, when they ignore the scientific consensus altogether (global warming), pack advisory committees with people favorable to their side, and even apply political loyalty tests to as many professional civil service positions as possible (when the purpose of the civil service in the first place was to avoid that).

Pielke's quibble with the word "war" is this: "When you declare "war on" something this means that you are trying to get rid of it." That's one possibility, but not entirely correct. The U. S. went to war against Iraq (most recently) not to get rid of it but merely to change its government to one that is more favorable to the interests of the regime in the U. S. That seems like an apt description of the Republican war on science -- not eliminate science, just make it favorable to the party's goals.

Another review of this book: 'Swift Boating' Science

Tuesday, October 11, 2005

Embryonic stem cells kill cancer

Researchers Use Human Embryonic Stem Cells To Kill Cancer Cells
For the first time, stem cell researchers at the University of Minnesota have coaxed human embryonic stem cells to create cancer-killing cells in the laboratory, paving the way for future treatments for various types of cancers (or tumors). The research will be published in the Oct. 15 issue of the Journal of Immunology.

Undifferentiated human embryonic stem cells can be used to make blood cells like natural killer cells or red blood cells

Researchers generated "natural killer" cells from the human embryonic stem cells. As part of the immune system, natural killer cells normally are present in the blood stream and play a role in defending the body against infection and against some cancers.

"This is the first published research to show the ability to make cells from human embryonic stem cells that are able to treat and fight cancer, especially leukemias and lymphomas," said Dan Kaufman, M.D., Ph.D., assistant professor of medicine in the Stem Cell Institute and Department of Medicine at the University of Minnesota and lead author of the study.
Note especially this comment:
This research was done on two of the federally approved embryonic stem cell lines. Kaufman said, however, that if the research would lead to a treatment for people, new lines would have to be developed.
Too bad we have a Federal government that refuses to support this.

Via Science Blog


Sunday, October 09, 2005

The "intelligent design" circus, #1

The Kitzmiller et al. v. Dover Area School District trial is still playing in the center ring (also known as Dover, PA) -- to the uproarious amusement of the entire civilized world. So enthralled is the crowd of laughing, incredulous spectators that the clowns desperately vying for attention in the outside rings (currently located in such places as Kansas and Rio Rancho, New Mexico) have been all but forgotten.

And meanwhile, outside the main tent, there are almost endless sideshows going on, featuring an astounding cast of creationist freaks, grotesqueries, and lesser oddities.

So step right up. It's more than the mind can boggle!

In order to help you make sense of this incredible extravaganza, we've invited a team of very experienced docents to give a blow-by-blow explanation of the nonstop action as it unfolds. At irregular, uncertain, and unpredictable intervals we'll issue reports like this with summaries of the best analysis and insight available.



First up, Pat Shipman, from American Scientist (and introduced by Evolving Thoughts), explains in Being Stalked by Intelligent Design how ID is nothing but "religious prejudice disguised as intellectual freedom" and needs to be taken very seriously, just in case it escapes from the Discovery Institute Zoo in Seattle, where it normally resides.

Next, New Scientist reports on testimony in the Dover trial: Book thrown at proponents of Intelligent Design. The testimony by Barbara Forrest demonstrates how the authors of the ID propaganda tract, Of Pandas and People, stealthily replaced the term "creationism" with "intelligent design" in hopes of making the book acceptable for use as a science text. (Without changing the claims made in the book, of course.)

Dr. Forrest is a philosopher and an expert on ID propaganda tactics. She understands perfectly well that ID is not in the least a scientific theory, and only partly a religious viewpoint. What it is mostly is a political scheme to advance the U. S. "culture wars" on behalf of conservative activists, as explained in her book Creationism's Trojan Horse: The Wedge of Intelligent Design.

And then we have the Campaign to Defend the Constitution stepping up with their map of current Islands of Ignorance: Top 10 Places Where Science Education is Under Threat in the U. S. There are links to information on exactly what is currently going on in these creationist hot spots.

OK, that's it for now. Go buy yourself a hot dog and some cotton candy, then come back and enjoy the greatest clown show on Earth.

Scientists and writers: a great idea

SciTalk is a website that promotes contact between scientists and writers (of novels, short stories, poetry, plays, screen plays, etc.) in order to facilitate the realistic portrayal of scientists in art.
Scientists need to show writers — poets, playwrights, novelists – the wealth of possibilities that are opened up to fiction by using science and scientists in their work. Just as a novel with an accountant as a main character need not be about accountancy, a novel with a scientist need not be about science. Scientists need writers to show that they are 'normal people' from all backgrounds, with normal concerns.

SciTalk offers a way for scientists to communicate their expertise and their enthusiasm to writers, and a way for writers to find out about science and how scientists ‘work’ — through personal contact and meeting face-to-face, not just by email or phone.
Scientists contribute personal information and contact details to the site in order for writers to arrange meetings. There is a very detailed directory of scientific specialties that can be browsed for working professional scientists.


Avian flu

Flu Wiki

I don't have anything to add to this topic at the moment -- there's an awful lot of information out there already, and the Flu Wiki is a good place to start. Just feeling sort of guilty about not mentioning it. It's a very important topic.

As my absolutely minimal contribution, here's a link to a good summary of the topic, with many other links, that covers both policy and science aspects: Bird Flu - What's A Reasoned Approach? Part II.


Friday, October 07, 2005

Surprisingly massive early galaxy

Spitzer and Hubble Team Up to Find "Big Baby" Galaxies in the Newborn Universe - September 27, 2005

Two of NASA's Great Observatories, the Spitzer and Hubble Space Telescopes, have teamed up to "weigh" the stars in several very distant galaxies. One of these galaxies, among the most distant ever seen, appears to be unusually massive and mature for its place in the young universe. This comes as a surprise to astronomers because the earliest galaxies in the universe are commonly thought to have been much smaller agglomerations of stars that gradually merged together to build large majestic galaxies like our Milky Way.




The galaxy is named HUDF-JD2. (HUDF refers to the Hubble Ultra Deep Field survey, in which the galaxy was discovered.) The galaxy is the red smudge in the top center of the image above. The image was made by Hubble's near-infrared camera. HUDF-JD2 is not visible at all in optical wavelengths, due to its extreme redshift.

The distance to the galaxy, almost 13 billion light years, can be estimated from the redshift of spectral lines in its light. The redshift also indicates that we are seeing the galaxy as it was only 800 million years after the big bang.
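As a quick consistency check on those two numbers, here is a sketch using the astropy library. The redshift of roughly z ≈ 6.5 used below is an assumed value chosen to match the quoted distance and age (not necessarily the team's measured number), and the WMAP9 parameters merely stand in for whatever cosmology the discovery team actually used.

# Quick consistency check of "almost 13 billion light years" and
# "800 million years after the big bang", using assumed values.
from astropy.cosmology import WMAP9

z = 6.5
print("age of the universe at that redshift:", WMAP9.age(z))             # ~0.8 Gyr
print("light-travel (lookback) time:        ", WMAP9.lookback_time(z))   # ~12.9 Gyr
print("age of the universe today:           ", WMAP9.age(0))             # ~13.8 Gyr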

As faint as HUDF-JD2 appears to us, its actual brightness must be about 8 times that of the Milky Way today, so the galaxy must have about 8 times as many stars. It would therefore rank as one of the largest of galaxies even at the present age of the universe (about 13.7 billion years).

It has generally been assumed that galaxies of any size did not start to form until several hundred million years after the big bang, so HUDF-JD2 could be only perhaps 400 or 500 million years old itself. That's a very short time in which to build such a massive galaxy. A further standard assumption is that large galaxies form by the merging of smaller ones, a process that takes time -- which makes such a massive galaxy at such an early epoch even more surprising.

More detailed analysis suggests that HUDF-JD2 contains few luminous young blue stars, which is another surprise, since young stars should continue to form in galaxies for several billion years. (Many luminous new stars are still forming in the Milky Way, even after 13 billion years.) The implication is that HUDF-JD2 ran out of star-forming interstellar matter in a very short time. The lack of bright young stars means that the average star in HUDF-JD2 is less luminous than in galaxies which are still forming new stars. Consequently, there must be more stars than expected in order to account for the galaxy's observed brightness.

Clearly, some "standard" assumptions can't be entirely correct. Galaxies may have started to form a lot earlier than assumed, or large galaxies could have formed directly instead of through mergers. Or perhaps both.

Other references: here, here, here, here.


Thursday, October 06, 2005

What are gamma-ray bursts?

There have been a lot of interesting results in the past few months which seem to be providing a firmer understanding of the mechanisms that underlie gamma ray bursts. The reason is simple: NASA's Swift satellite observatory, launched in November 2004 and designed especially to detect gamma-ray bursts and report them quickly to ground-based astronomers, has been a great success.

According to the Wikipedia article, "Gamma-ray bursts (GRBs) are the most luminous physical phenomena in the universe known to the field of astronomy. They consist of flashes of gamma rays that last from seconds to hours, the longer ones being followed by several days of X-ray afterglow." (This article itself is not especially up-to-date.)

GRBs are scattered uniformly around the sky, and some are known to be extremely distant, implying a very high output of energy, typically around 10^51 ergs.

The big question is: what causes them? There have been two main ideas for an answer: very powerful supernovae, or collisions between neutron stars or between a neutron star and a black hole. It's possible, of course, that both processes occur, in different events.

Let's look at some of the recent research reports.

Evidence for neutron star collisions

Creation of Black Hole Detected - May 9, 2005
Scientists Watch Black Hole Born in Split-Second Light Flash - May 11, 2005
Short gamma-ray burst hints at neutron star merger - May 11, 2005
Signs Point to Neutron-Star Crash - May 13, 2005 (subscription required)

On May 9, NASA's Swift satellite detected (as it was designed to do) a sharp gamma-ray flare. The energy spike lasted just 0.05 seconds. This is far too brief to be caused by a supernova event. So it is called a "short" burst, which must be a different sort of event from a "long" burst, which lasts from a few seconds to a few minutes and is generally assumed to be caused by a supernova.
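One standard back-of-envelope argument (not spelled out in the articles above) for why such a short spike points to compact objects: a source can't vary coherently on timescales much shorter than the time light takes to cross it, so the emitting region can be no larger than roughly the burst duration times the speed of light -- about (3 × 10^5 km/s) × 0.05 s ≈ 15,000 km. That is far smaller than an ordinary star (the Sun's radius is about 700,000 km), but entirely compatible with neutron stars or stellar-mass black holes.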

As intended, the Swift satellite notified ground-based astronomers of the event's location. Within a few hours, ground telescopes detected a faint patch of light near an old galaxy 2.7 billion light years away. The galaxy has not been forming new stars for several billion years, and the location, away from the galaxy, of the visual flare was consistent with what would be expected for a pair of neutron stars that could have been ejected in a much earlier supernova event.

Because short bursts are so brief, it had not been possible before to locate any optical afterglow, and therefore not possible to pick up any clues about their nature. The burst has been named GRB 050509b.

Link between supernovae and gamma-ray bursts

Naked carbon/oxygen stars linked to gamma-ray bursts - May 26, 2005
The Link Between Supernovae and Gamma Ray Bursts - May 27, 2005 (subscription required)
An Asymmetric Energetic Type Ic Supernova Viewed Off-Axis, and a Link to Gamma Ray Bursts - May 27, 2005 (subscription required)
Double Whammy: Cosmic Fireworks Explained - May 30, 2005
Naked Star's Outburst
The latest in core collapse astronomy - June 16, 2005

Supernovae have been linked to GRBs by observations at visible wavelengths that show evidence of a supernova event at the same location as the GRB. But the precise type of supernova and the mechanism involved have been unclear. The research reported here describes a supernova named SN 2003jd, first observed in 2003 at a distance of 260 million light years, that seems to have exploded asymmetrically. "Long" GRBs have been hypothesized to result from supernovae which project high-energy jets of matter in the direction of Earth, in what is known as the "collapsar" model. So this supernova could be such an event whose jets were not directed towards us.

The reasoning that originally linked GRBs to supernovae came from observations of an earlier supernova, SN 1998bw, which was found at the site of a GRB. This was not the first such discovery of a supernova in the location of a GRB. (That happened a year earlier.) But 1998bw was only 80 million light years away, so it was possible to learn much more about the nature of the event. In particular, 1998bw was found to be a "Type Ic" supernova, meaning its spectrum indicated a lack of hydrogen and helium, which would have been lost before the explosion. If the progenitor star had been spinning very rapidly, perhaps due to collision with another star, then two large opposing jets of matter would be ejected at very high speeds along the axis of rotation, and this would account for a copious production of gamma rays.

Such events, known as "hypernovae", are believed to result from the explosive collapse of a star that weighs between 10 and 100 solar masses and has completely lost its outer layers of hydrogen and helium. If 1998bw were a hypernova and one of its jets pointed in our direction, we would see the event as a GRB. (Indeed, the expulsion of matter in a jetlike manner is necessary, as well as sufficient, to explain a GRB as some sort of supernova event. That's because if the explosion were spherically symmetric, the same amount of energy would be expelled in all directions, so the total energy required would be far too great to be produced in a supernova.)
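To put a rough number on that last point (my own back-of-envelope estimate, with typical illustrative values rather than figures from these articles): a distant GRB's energy, if assumed to be radiated equally in all directions, can come out around 10^53 ergs, roughly a hundred times the ~10^51 ergs of a typical supernova explosion. But if the emission is confined to two opposing jets with half-angles of about 5 degrees, the true energy is smaller by the beaming factor (1 - cos 5°) ≈ 0.004, bringing it down to a few times 10^50 ergs -- back in the range that a (hyper)nova can plausibly supply.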

However, there was one problem. Since 1998bw was only 80 million light years away, while most GRBs had been found at distances of billions of light years, its burst must have been intrinsically much less powerful than a typical GRB. Therefore, it seemed hard to accept that much more distant and powerful GRBs might have resulted from the same sort of event. And yet, in 2003 a GRB was observed that was much farther away than 1998bw, hence about 1000 times as powerful. But since this object was still much closer than most known GRBs, it was possible to determine that it had the same general structure as 1998bw, namely a hypernova. Hence GRBs could indeed be explained as hypernovae.

Not all Type Ic supernovae need be hypernovae, i.e. they don't need to have jets. But if GRBs are explained as hypernovae, then we would expect there to be many Type Ic hypernovae which aren't GRBs, simply because of the unlikelihood of a jet pointing in our direction. That's where SN 2003jd comes in. It is a Type Ic supernova. In order for it to be a hypernova it would need energetic jets. But how would we know, if they weren't directed towards us? The answer is that its spectrum would give evidence of asymmetries. And that is exactly what was found with 2003jd. So we have evidence that hypernovae aren't just theoretical possibilities and do actually occur.

More evidence for neutron star collisions in short bursts

Astronomers unravel cosmic explosion mystery - August 10, 2005

As noted above, "short" and "long" gamma-ray bursts have been suspected to result from different processes. While the evidence is good that long bursts (which produce gamma-rays for a couple of seconds or more) are caused by a type of supernova in a massive star that lacks hydrogen and helium, short bursts have been more mysterious, partly because the gamma-rays fade so quickly that locating the event at optical wavelengths has been difficult.

But in July, two more short burst events were detected. One on July 9 lasted a tenth of a second, and another on July 24 lasted a quarter of a second. The first was found to be about 10,000 light years from the center of a star-forming galaxy that is 1.8 billion light years away. The second was inside a galaxy consisting of old stars that is 2.8 billion light years away.

A short burst on May 9 (described above) was found to be associated with a galaxy 2.8 billion light years distant. However, in this case, the burst lasted only one twentieth of a second, so it released less total energy. It is significant that the bursts in July were more energetic, because that apparently rules out an alternate model for the source of short bursts. This model involves a flare driven by a very strong magnetic field of a neutron star (an object called a magnetar). But there is an upper limit to the amount of energy such an event can release, and the July burst events would exceed that limit.
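To make the energy-budget argument concrete, here is a minimal sketch. Only the 1.8-billion-light-year distance comes from the text; the gamma-ray fluence and the magnetar-flare ceiling below are order-of-magnitude assumptions, not figures from the observations.

```python
import math

# Illustrative numbers only: the fluence and the magnetar-flare energy ceiling
# are assumptions for the sake of the sketch, not values from the post.
distance_ly = 1.8e9                   # distance of the July 9 host galaxy (from the post)
distance_cm = distance_ly * 9.46e17   # 1 light year is about 9.46e17 cm
fluence = 1e-8                        # assumed gamma-ray fluence, erg per cm^2

# Isotropic-equivalent energy: the received fluence spread back over a sphere
# of that radius.
E_iso = 4 * math.pi * distance_cm**2 * fluence

E_magnetar_max = 1e47                 # assumed ceiling for a giant magnetar flare, erg

print(f"Isotropic-equivalent energy: {E_iso:.1e} erg")
print("Exceeds the assumed magnetar ceiling" if E_iso > E_magnetar_max
      else "Within the assumed magnetar ceiling")
```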

Ruling out the magnetar model adds to the evidence that these events were caused by a collision between a neutron star and either another neutron star or a black hole.

Long bursts have a more complex structure than previously supposed

Giant Space Blasts a Two-step Process - August 17, 2005
NASA's Swift Satellite Finds Newborn Black Holes - August 18, 2005
Scientists Watch Baby Black Hole Get to Work Fast - August 18, 2005
Bright X-ray Flares in Gamma-Ray Burst Afterglows - August 18, 2005 (subscription required)
Black holes born through a brutal labour - August 18, 2005
The baby black holes with giant hiccups - August 19, 2005
Black Hole Surprise: Multiple Eruptions Seconds After Birth - August 23, 2005

Although the collapsar model of long gamma-ray bursts now seems fairly well substantiated, the actual sequence of events following the beginning of the burst apparently can be more complicated than previously supposed. This conclusion follows from observations of the burst known as GRB 050502B, which occurred on May 2, 2005, as well as an x-ray flare event known as XRF 050406, which occurred on April 6, 2005. X-ray flares may occur either by themselves or as part of a GRB. (This suggests that an x-ray flare by itself might be associated with a GRB that doesn't have a jet directed towards Earth and is therefore not observable.)

There were two bright x-ray flares that appeared in the GRB 050502B afterglow and peaked minutes after the main burst. One of these flares carried as much energy as the GRB itself. Evidently, there is a lot going on after the end of the gamma-ray emission.

The standard collapsar model is based on a supernova event in a star which is sufficiently heavy that when the internal thermonuclear fusion process in the core ceases, the star collapses directly to a black hole rather than a neutron star. (Exactly how heavy the star must be isn't clear, but may be around 25 solar masses.) The initial burst of gamma-ray energy is produced by shock waves within the rapidly collapsing stellar matter. This can last from a few seconds to several minutes.

If the star is spinning rapidly, the collapse is not spherically symmetric. Instead, to conserve angular momentum, matter tends to collapse roughly parallel to the spin axis. The result is that at first the star's matter forms a lens-shaped object, in the center of which a black hole develops. This is followed by the contraction of the lens itself under the enormous gravity of the black hole. But not all the matter in the outer parts of the lens falls directly into the black hole. Instead, the matter is forced out at nearly relativistic speeds into opposing jets along the rotation axis, like water squeezed out of a sponge. Shock waves in the collapsing matter then heat the jets to temperatures high enough to generate gamma-rays.

If the progenitor star still has hydrogen and helium in its outer layers, then the jets can be absorbed by this remaining gas and their gamma-ray emission will be extinguished. On the other hand, if the hydrogen and helium layers have been stripped away previously, by a companion star or black hole for example, then the jets can emerge, and the result is a gamma-ray burst.

That's the simple picture. However, in the case of GRB 050502B, the process must have been rather more complex. The gamma-ray emission lasted about 17 seconds. But about 8 minutes later, there was an enormously powerful burst of x-ray photons, comparable in total energy to the earlier gamma-ray emission. Much smaller x-ray flares have been observed in connection with other GRBs, but this one was 100 times as energetic as any seen before. In other GRB events, as many as four separate x-ray flares have been observed.

Evidently the whole process can be much more complex than the simple version sketched above. Exactly what goes on isn't clear. The matter swirling around the black hole may be in such chaotic motion that repeated smaller explosions occur as clumps of matter fall into the black hole. Or perhaps some of the material in the jets falls back upon itself under the huge gravitational pull of the black hole. X-ray flares can be detected in about a third of GRB events. It may be that the speed of rotation of the progenitor star influences how "smoothly" the process occurs.

More precise analysis of some short gamma-ray bursts

Swift Spacecraft Solves Mystery of Short Gamma-Ray Bursts - October 5, 2005
Colliding Stars Behind 35-year-old Mystery - October 5, 2005
HETE-2 satellite solves mystery of cosmic explosions - October 5, 2005
Split-second explosions, so-called short gamma-ray bursts, solidly linked to stellar collisions - October 5, 2005
HETE Satellite Solves Mystery of Short Gamma Ray Bursts - October 5, 2005 (PDF file)
35-year-old mystery solved in a flash of light - October 5, 2005
Ancient Interstellar Collision Helps Explain Source of Radiation - October 5, 2005
Discovery of the short γ-ray burst GRB 050709 - October 5, 2005 (Nature paper, PDF file)

The observations of short GRBs on May 9, July 9, and July 24 of this year, mentioned above, have been more carefully analyzed, and the results indicate that these events almost certainly involved collisions between an orbiting pair of neutron stars, or between a neutron star and a black hole.

The May 9 event was a GRB that lasted 0.07 seconds, in the vicinity of an elliptical galaxy about 2.9 billion light years away. Since the stars in elliptical galaxies are old, such galaxies do not contain the massive young stars that could produce a supernova.

The July 9 event (GRB 050709) was detected by a satellite named HETE-2, and its x-ray emissions were measured by NASA's Chandra X-ray Observatory. This made it possible to detect a visible-light afterglow, which placed the event inside a small blue galaxy 1 billion light years away. Given the known distance, it could be calculated that the event was only between a hundredth and a thousandth as intense as a typical long GRB. That amount of energy is consistent with a collision between neutron stars. Also, the event occurred in the outer area of the galaxy, which is where a pair of neutron stars is most likely to be, and is not compatible with a supernova, which is the explosion of a massive young star. Clinching the case, no supernova was detected at visible wavelengths. However, like long GRBs, the typical short GRB also seems to radiate gamma-rays in jet-like beams that occupy only about one-thirtieth of a full sphere.
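For a sense of what that beaming fraction implies, here is a minimal sketch, assuming the emission comes from two identical opposing cones (a common simplification, not spelled out above).

```python
import math

# The post says the emission occupies about one-thirtieth of a full sphere.
beaming_fraction = 1 / 30   # fraction of the 4*pi steradian sky covered by the jets

# Assuming two identical opposing cones of half-opening angle theta:
#   2 * 2*pi*(1 - cos(theta)) = beaming_fraction * 4*pi  =>  cos(theta) = 1 - beaming_fraction
theta = math.acos(1 - beaming_fraction)
print(f"Jet half-opening angle: {math.degrees(theta):.1f} degrees")   # about 15 degrees

# The true energy release is the isotropic-equivalent estimate times the
# beaming fraction, i.e. about 30 times smaller than a naive all-sky figure.
print(f"True energy ~ isotropic estimate / {1 / beaming_fraction:.0f}")
```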

It's not certain that these events involved a pair of neutron stars rather than a neutron star and a black hole. There are hints that the latter may have been the case, especially in the July 24 event, which was four times as powerful as the one on July 9, as would be expected since a black hole is more massive than a neutron star. The July 24 event also contained several bursts of x-rays, which suggests it involved a black hole that first consumed most of the neutron star and then consumed smaller fragments separately.

Tags:

Labels:

Wednesday, October 05, 2005

Why do people doubt evolution?

Mostly just some bookmarks at this point.

From the Tail: Betting on Uncertainty asks the question and suggests the answer is that humans in general have an aversion to uncertainty and accident. There is too little tolerance for ambiguity. And so the explicit randomness in evolution is too unsettling.

I would add: This aversion to randomness is one of the main reasons that religion still hangs around. People think -- not necessarily with real justification -- that life needs to have "meaning" and "purpose". And so a theory is unwelcome if it seems to imply that life in general -- and humans in particular -- might be just an accident. Instead, people crave a belief system that posits there is no uncertainty, no randomness, no lack of intentionality behind nature.

Even Einstein, famously, hated quantum theory because of its essential randomness. His (failed) scientific quest was to find a few equations that required the universe to be as it is and not otherwise, so that ultimately even a deity couldn't have made things differently. Perhaps the main reason that religious fundamentalists haven't attacked quantum theory yet is that they know so little about it.

Of course, the aversion to the theory of evolution is especially intense in the U. S., and not all forms of religious belief are hostile to evolutionary theory, so further explanation is needed. Presumably the thrall to fundamentalist religion in which so many people in the U. S. are trapped must play a big part. But how to explain that? One needs to look at history and the tradition of anti-intellectualism in American life.

Other thoughts on these questions are here, here, and here.

Labels:

Tuesday, October 04, 2005

Biomedical research funding

Funding for biomedical research doubles in last decade
From 1994 to 2003, total funding for biomedical research in the U.S. doubled to $94.3 billion, with industry providing 57 percent of the funding and the National Institutes of Health providing 28 percent, according to a study in the September 21 issue of JAMA, a theme issue on medical research.

Lead author Hamilton Moses III, M.D., of the Alerion Institute, North Garden, Va., presented the findings of the study today at a JAMA media briefing on medical research.

Few comprehensive analyses of the sources of financial support of biomedical research and uses of these funds have been available, according to background information in the article. This results in inadequate information on which to base investment decisions and can create a barrier to judging the value of research to society. Previous articles have examined specific sectors, but few have done so comprehensively.

Dr. Moses and colleagues conducted a study to determine the level and trend from 1994 to 2004 of basic, translational (the application of knowledge of basic science research to clinical care), and clinical U.S. biomedical research support from the major sponsors of this research: (1) federal government, (2) state and local governments, (3) private not-for-profit entities including foundations, and (4) industry.

Actual spending on biomedical R&D increased from $37.1 billion to $94.3 billion. Allowing for inflation, that's roughly a doubling in real terms.
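Here's the arithmetic behind that, as a minimal sketch; the roughly 25% cumulative inflation over the period is an assumed round figure for illustration, not a number from the article.

```python
# A minimal sketch of the inflation adjustment. The ~25% cumulative inflation
# over 1994-2003 is an assumed round figure, not taken from the article.
spending_1994 = 37.1         # billions of dollars, nominal
spending_2003 = 94.3         # billions of dollars, nominal
cumulative_inflation = 0.25  # assumed overall price-level increase over the period

nominal_growth = spending_2003 / spending_1994
real_growth = nominal_growth / (1 + cumulative_inflation)
print(f"Nominal growth: x{nominal_growth:.2f}, real growth: x{real_growth:.2f}")  # ~x2.5, ~x2.0
```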

What's the point of mentioning this? Well, $94 billion is hardly an insignificant amount of money. For comparison, the entire NASA budget in 2005 was around $16 billion. So biomedical research receives almost 6 times as much funding as space exploration. That has obvious career implications for people now in college or looking for a career switch.

It also suggests that in the coming decades we should expect some significant payoffs from this research. Here's just one datapoint. I watch press releases coming out in most scientific areas. Right now I am seeing 3 to 4 times as many press releases announcing new developments in cancer research as I did just one year ago. There are similar increases in announcements related to research in Alzheimer's disease, diabetes, and infectious diseases from last year at this time. It's possible that this just means researchers and their supporters are more hungry for publicity now.

But I think it's more likely that there are real increases in the number of significant discoveries, too. In cancer, for instance, it appears we are finding out a lot more about the genetic changes involved in different specific types of cancer. We are "reverse engineering" the typical ways that each kind of cancer operates. Such knowledge can hardly fail to be useful in actually treating the different types of cancer, even if it takes the usual 8 to 10 years to bring new drugs to market.

Saturday, October 01, 2005

What's special about embryonic stem cells?

Researchers discover key to embryonic stem-cell potential
CAMBRIDGE, Mass. (September 8, 2005) - What exactly makes a stem cell a stem cell? The question may seem simplistic, but while we know a great deal of what stem cells can do, we don't yet understand the molecular processes that afford them such unique attributes.

Now, researchers at Whitehead Institute for Biomedical Research working with human embryonic stem cells have uncovered the process responsible for the single-most tantalizing characteristic of these cells: their ability to become just about any type of cell in the body, a trait known as pluripotency.

When a sperm cell fuses with an egg, the cell that results is called a zygote. The zygote proceeds to divide into additional cells through the normal process of cell division. After a few days, when a mammalian embryo has about 40 to 150 cells, a central fluid-filled cavity develops and the embryo is referred to as a blastocyst.

At first, all of the cells in the developing embryo are (embryonic) stem cells of a type referred to as totipotent, because they can develop (directly or indirectly) into any other type of cell, including other totipotent stem cells. But by the blastocyst stage, the stem cells have lost the ability to remain totipotent when they divide. Yet they can still become any of the more than 200 types of mammalian cells. Stem cells at this stage are referred to as pluripotent.

Human pluripotent stem cells are assumed to have the greatest therapeutic potential, because they can develop into any type of body tissue, and because they can also be cultured indefinitely as independent cells.

By the time the embryo has developed into an adult, there remain many stem cells -- adult stem cells -- but all have lost their pluripotency, as far as we can tell. Such stem cells can develop into certain limited types of cells -- different kinds of blood cells, for example -- but that's all.

So one important question is: What causes a pluripotent cell in an embryo to lose that property and to head down a path toward a cell that makes up a specific type of tissue? And the other side of the question is just as important: What allows a cell cultured outside an embryo to remain pluripotent?

The answer depends entirely on which of the 25,000 or so genes (in the case of the human genome) are "expressed", that is, actively directing the production of one or more proteins. Some genes are always expressed, because the proteins they are responsible for are needed for the proper function of any cell. But other genes are not expressed unless their proteins are needed at a given time. Some of the genes of this latter kind determine (among other things) the type of cell. A liver cell, for instance, is a liver cell because certain specific genes are expressed.

So it is to be expected that there must be genes which, when expressed, maintain the initial pluripotency of an embryonic stem cell. Prior to the latest research findings, it had been determined that in humans there are (at least) three such genes, which are named (along with their associated proteins) Oct4, Sox2, and Nanog. It was known that all of these proteins are necessary for pluripotency, because if any one of the genes is not expressed, pluripotency is lost. It was also known that the three proteins do not play a direct role in cell function, such as forming part of the internal machinery of the cell. Instead, they were known to be transcription factors, which affect the expression of other genes.

What remained unclear was how these proteins did their job. A transcription factor can either facilitate or inhibit the expression of a gene, and several transcription factors usually work together to do this, by binding to specific locations in the cell's DNA. The finding of the present research is that when all three genes are expressed and all three proteins are produced, the proteins can bind together at specific places and inhibit the expression of 353 other genes. And so, when any of these three genes is silenced, those 353 genes can be expressed. All of these genes are responsible for other transcription factors, which in turn affect many more genes in a large cascade effect. Eventually new proteins are produced which lead to the cell's loss of pluripotency and its differentiation into some more specific type of cell.

The interaction of genes in a living cell is a lot like a computer program. Starting from some particular initial state, a program, whether it's in a computer or a cell, takes "input" from outside. Then the program state, influenced by the relevant external inputs, transitions to another state in the next "clock cycle". Ultimately, what is to be determined is the overall "state diagram" of the genome. That is, what combination of expressed genes and external influences leads to a new set of expressed genes, new transcription factors, and new cell behavior? What has been figured out so far is just the very earliest stage of the puzzle.
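To make the state-machine analogy concrete, here is a toy sketch. The network below is a drastic caricature, not the actual regulatory circuit: a single "differentiation program" node stands in for the 353 target genes, and the external differentiation signal is hypothetical.

```python
# A toy "state machine" caricature of the pluripotency circuit described above.
# The single differentiation_program node stands in for the 353 target genes,
# and differentiation_signal is a hypothetical external input; this illustrates
# the state-diagram idea, not the real regulatory network.

def next_state(state, differentiation_signal):
    """One 'clock cycle' of the toy gene network."""
    oct4, sox2, nanog = state["Oct4"], state["Sox2"], state["Nanog"]

    # The three factors are modeled as sustaining one another's expression,
    # unless an external differentiation signal shuts them off.
    core_on = oct4 and sox2 and nanog and not differentiation_signal

    # The downstream program is repressed only while all three are present.
    differentiation_program = not (oct4 and sox2 and nanog)

    return {
        "Oct4": core_on,
        "Sox2": core_on,
        "Nanog": core_on,
        "differentiation_program": differentiation_program,
    }

state = {"Oct4": True, "Sox2": True, "Nanog": True, "differentiation_program": False}
for step, signal in enumerate([False, False, True, False, False]):
    state = next_state(state, signal)
    print(step, state)
# Once the signal silences the core factors, the differentiation program turns
# on, and in this toy model the cell never returns to the pluripotent state.
```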

This is still just the beginning of learning what's going on. Yet to be determined is what first inhibits one or more of Oct4, Sox2, and Nanog from being expressed. Presumably it's something in the cell's environment. And beyond that, what determines the path that the cell will eventually take, towards becoming a blood cell rather than a neuron, for example?

The new results inevitably raise the question: Is there some way to make a cell dedifferentiate and allow all three genes to become expressed, so that the cell becomes pluripotent again? It can't be easy, because a differentiated cell expresses many genes that are silent in a pluripotent cell, so its internal environment is very different. Will the genie go back in the bottle? On the other hand, cloning via "somatic cell nuclear transfer" has precisely this effect of resetting the initial conditions, simply by removing the nucleus of a fully differentiated cell (of skin, say) and inserting it into an egg cell whose own nucleus has been removed. Well, it's not quite that easy, but of course such cloning has been done.




Labels: ,