Early Universe Boiled Like Water

When I read today’s article, I instantly thought of the movie Men in Black. My brain shot to the scene in the morgue where a tiny alien living in the head of what appears to be a human gives Will Smith the cryptic message “The galaxy is on Orion’s belt.” Later, of course, we learn that there was an actual galaxy in an orb attached to the collar of a cat named Orion. The final scene of the movie also came to mind today. The camera zooms out from Earth, through the solar system and our galaxy, to reveal that our galaxy is also enclosed in an orb. This time, however, aliens are using many orbs containing galaxies in a game of marbles!

Today’s article isn’t saying that the universe is exactly like what we saw in Men in Black; however, the title of the article, “Is the Universe a Bubble?”, was what sparked my memory. The author compares the early universe around the time of the big bang to a pot of water sitting on the stove. As the water boils, and as the universe radiates with extreme energy, bubbles begin to form. Inside each “universe bubble,” a vacuum was produced, causing the bubble to expand. Many bumped into each other and merged into larger bubbles; some remained far apart. Either way, the researchers propose that each bubble contained a separate universe. Now you can see why I recalled those scenes from Men in Black!


Have you never heard of this theory? Have you only ever heard of the big bang? Don’t worry, you’re not alone! This theory of cosmic inflation is not widely accepted, mainly because instead of saying that the universe began as a tiny, condensed piece of matter, scientists are saying that cosmic inflation began with a vacuum. This gives some people pause.


Large particle colliders are where scientists usually test their big bang theories, but scientists at the Perimeter Institute are testing their cosmic inflation hypothesis differently. Using computer simulations, they are currently modeling the scenario of two bubble universes merging. Then, based on what the simulations show, they can search the skies for evidence of their theory. For example, if two bubble universes collided, the simulations predict there would be a blemish in the cosmic microwave background to detect; unfortunately, no such radiation has been discovered yet. Scientists will continue their work, however, because they believe the simulations show that bubble universes could actually be possible. They just need to find the correct marker in the background radiation of the universe to prove their point!

Article Source: Perimeter Institute. “Is the universe a bubble? Let’s check: Making the multiverse hypothesis testable.” ScienceDaily. ScienceDaily, 17 July 2014.

Image Sources:

Scott Akerman. “Boiling Water” 6 September 2008 via Flickr. Creative Commons Attribution.

Terry Hancock. “M33 Triangulum Galaxy” 7 October 2011 via Flickr. Creative Commons Attribution.

Early Detection Is Often The Key

In general, the earlier your doctor diagnoses you with a disease, the higher your chance of survival or improvement. We see this every day in cases of cancer and heart disease. When a disease is caught at a late stage, the prognosis takes a steep decline. The same paradigm holds true for mental illnesses: the earlier treatment can be started, the better the outcome.

Alzheimer’s disease is a debilitating ailment, marked by a progressive decline in mental acuity and worsening dementia, that normally occurs in the aging population. Like many diseases, Alzheimer’s worsens over time. Early diagnosis is not always possible because the first signs and symptoms are often brushed off as “old age.” The disease progression differs from person to person, but it ultimately leads to a loss of both short- and long-term memory, followed by a loss of bodily functions and death. A final and definitive diagnosis of Alzheimer’s disease is made by examining brain tissue for the accumulation of amyloid plaques; unfortunately, this is very invasive and can only safely be done after death.

Obviously, the search for biomarkers is currently an area of intense research, not only in Alzheimer’s disease but across many disciplines. The thought is that if there were a certain marker of disease that was elevated in an easily accessible bodily compartment (for example, the blood), then routine tests would be able to predict who would and would not get a disease. For Alzheimer’s, the stand-out marker would be the amyloid plaques themselves. For example, a molecule that binds to amyloid plaques could be injected into the patient, and a scan of the brain would show plaque accumulation. Another form of test being researched involves compiling characteristics of Alzheimer’s patients, including the sizes of multiple parts of the brain. If a pattern is found, it could prove useful for early diagnosis.

All of these ideas are well and good, but what about the costs involved? Are insurance companies going to pay for all of these expensive tests when there is not a very convincing lead? What if I told you that scientists are currently working on a very simple and cost-effective means of identifying who would need more expensive testing? A group from Columbia University conducted a very large study of patients’ ability to detect smells over a period of time. They found that those who began to lose their sense of smell had a significantly increased probability of later being diagnosed with Alzheimer’s disease and dementia. A separate study from Australia, still in its early stages, is trying to develop a method of early detection via eye examinations. In this study, a patient would take a compound that binds beta-amyloid and then fluoresces. The doctor would then scan the eye to detect the fluorescence.

Both of these studies are very interesting, but I like the one associated with smell the most. What an easy test! I would like to see it expanded in order to support a definitive claim, but it looks to be on the right track!

Article Source: Alzheimer’s Association. “Smell and eye tests show potential to detect Alzheimer’s early.” ScienceDaily. ScienceDaily, 13 July 2014.

Image Source: Dennis Wong. “Smell” 22 February 2009 via Flickr. Creative Commons Attribution.


HIV Functional Cure: An Update

The science news story that I’m highlighting today is one that I’ve been following since its beginnings. Last year, as I was working in my first HIV lab, an interesting patient study was presented at the Conference on Retroviruses and Opportunistic Infections (CROI), an annual meeting dedicated to the study of HIV/AIDS and its accompanying infections.


As many of you may remember, this study detailed a child dubbed the “Mississippi Baby,” whose mother tested positive for HIV during labor and delivery. The mother had received no suppressive therapy during her pregnancy. The baby was tested for HIV several times in the days after birth, and the results came back positive. The baby was quickly started on an antiretroviral cocktail normally used for more advanced cases of infection. This brought the child’s viral load, the amount of HIV found in the blood, down to undetectable levels. After 18 months, the mother and child were lost from doctors’ care, and the therapy was ended.

The most interesting part of this story comes after the mother and child returned to doctors’ care. At nearly 2 years of age, the child, who had been off antiretroviral therapy for almost 6 months, still showed undetectable levels of HIV, even under stricter, more intensive detection methods. However, the researchers did detect cells that were probably latently infected, meaning that upon reactivation, they could produce a resurgence of HIV.

This study was very exciting to HIV researchers. Was this a way to achieve a functional cure for infants born to mothers with HIV? Most people, when taken off antiretroviral therapy, see a rebound of HIV infection within days to weeks. This child had gone for months without treatment and still had undetectable HIV. Scientists now had to wait and monitor the child to see what would happen next.

This week, scientists were disappointed to report that the child now has detectable HIV levels, after nearly 2 years without therapy. The child was also presenting with other signs of HIV resurgence, such as low CD4+ T cell levels. Now the child must return to antiretroviral therapy in order to manage the disease.

Even though the child wasn’t cured of HIV as previously hoped, the case study still raises many more questions for researchers. Why did it take so long for the virus to rebound? Did the early therapy limit the number of cells that were infected in the first place, and, if so, could there be a way to kill off that small number of cells to create a true functional cure? Once again, scientists have much to study, but I will be interested to continue following the developments!

Article Source: NIH/National Institute of Allergy and Infectious Diseases. “‘Mississippi Baby’ now has detectable HIV, researchers find.” ScienceDaily. ScienceDaily, 10 July 2014.


Image Source: NIAID. “HIV-infected H9 T Cell” 12 April 2011 via Flickr. Creative Commons Attribution.

Hominid Evolution — Diverging Traits

The theory known as “Out of Africa” explains the evolution of humans and their expansion from Africa throughout the world. Only a few years ago, a single theory existed; however, scientists now agree that there were two expansions out of Africa, with the second wave of hominids leading to modern humans. Now, scientists are asking questions such as “Did those that left Africa have all of the traits of modern humans?” and “Did the traits come progressively, one by one?” These traits include “a large brain, long legs, the ability to craft tools, and prolonged maturation periods.”


Growing research supports the latter idea: hominid characteristics evolved one by one, some very early and others much later. Interestingly, climate change is cited as the major driver of hominid evolution. The climate at the time, from 2.5 to 1.5 million years ago, was highly volatile, swinging between dry and wet seasons. Multiple hominid species lived during this period, and it was the ones that could best adapt to their environment that ultimately prevailed. Scientists studying the fossil record have found some skeletons with large skulls but no other defining traits, and other skeletons with hominid teeth and hands but no other hominid traits. Scientists suggest that during this unstable period, different groups of hominids were acquiring distinctive traits that would help them survive.

The more I read about evolution, the more interested I get. We normally assume that animals evolved in a straight line, acquiring one trait, then another, until we arrive at the animal we see today. However, it is more likely that animals diverged, acquired different traits, and those with the most beneficial ones survived. Then that new animal acquired new traits, and the process repeated over and over again. As I look around at the world today, of course I cannot see evolution happening in day-to-day life. I wonder what scientists will study about our lives, millions of years in the future!

Article Source: Smithsonian. “Timeline of human origins revised: New synthesis of research links changing environment with Homo’s evolutionary adaptability.” ScienceDaily. ScienceDaily, 3 July 2014.

Image Source: NCSSM. “Hominid_Skull-Homo_Erectus_Dmanisi_033.jpg” 16 May 2013 via Flickr. Creative Commons Attribution.

Eat Insects For Bigger Brains?

Do you ever wonder how monkeys and humans achieved higher learning skills? Was it just the luck of our evolution, or was there one important point in our history that contributed to our brainpower? According to a new study, it comes down to our ancestors’ diet. I suppose the ads are right when they say that some food is “brain food!”

Our ancestors were not strict herbivores, relying only on plants and flowers to survive, nor were they strict carnivores, hunting large prey across the land. Rather, they preferred to eat foliage but also enjoyed tasty insects when other food was scarce. What makes this diet different from the others? Plants are quite easy to obtain, while large prey are easily spotted and then chased to the kill. Insects, however, hide in the soil and among dead logs. Our ancestors had to use their cunning and dexterity to get the tiny insects out of their hiding places.


One current study shows that this use of problem solving and fine motor skills helped improve the cognition of our ancestors. As you might expect, the species that developed tools to assist their insect quest had higher learning skills. Scientists studied two species of capuchin monkeys to reach their conclusions. One species lived in tropical rainforests, while the other lived in a more temperate climate with seasonal variation. The researchers found that the monkeys living in the seasonal climate taught themselves how to use tools, while the monkeys in the rainforest used cruder methods, such as smashing items to get at food. This difference in cognition based on climate brings us back to the hypothesis that our ancestors achieved higher learning skills because they foraged for insects when other food was scarce, as might happen during a seasonal change.

I find this study very interesting because it really does make sense. If a monkey lives in a climate that rarely changes, then the plants and insects available to eat stay the same. It would learn how to obtain and eat only what was easily available, because it wouldn’t need anything else. However, monkeys living in a changing environment would find the plants and insects shifting with the seasons. They would have to develop crafty skills, and possibly tools, in order to get food when rain is less plentiful, or when the weather turns colder, plants aren’t growing, and insects are hiding.

Article Source: Washington University in St. Louis. “Insect diet helped early humans build bigger brains: Quest for elusive bugs spurred primate tool use, problem-solving skills.” ScienceDaily. ScienceDaily, 1 July 2014.

Image Source: Hamish Irvine. “Two Capuchins” 11 October 2011 via Flickr. Creative Commons Attribution.

How Stress Affects A Child’s Brain

How many memories do you have from your early years? I don’t think that I can remember anything earlier than about the age of five. Even if we don’t remember specific events, is it possible that our brain is still influenced by them in our later years? New studies by a group at the University of Wisconsin-Madison say that the things that happened to us as children do indeed shape our brains.

Scientists have known for some time that stressors like abuse or neglect in childhood can manifest as illnesses ranging from depression to cancer or heart disease in adults. The question that remains is which parts of the brain are affected. Once this is answered, doctors will have a better idea of how to treat children before such illnesses take hold.


In this particular study, researchers gathered children around the age of 12 who had had an excessively stressful early life. After extensive interviews with both the children and their current caregivers to get a sense of what each child was exposed to and how their behavior had been impacted, doctors imaged the children’s brains. Results showed that, compared to similarly aged children who had a normal upbringing, these stressed children had smaller amygdalae and hippocampi, parts of the brain involved in emotion and in short- and long-term memory, respectively.

Scientists still need to determine precisely why the amygdala and hippocampus fail to grow to normal size in these children. However, this study gives an area of focus for future research. I, for one, am not surprised by this study. I have always believed that everything that happens to us shapes us in some way, and this study shows an exact measurement in the brain of that phenomenon.

Article Source: University of Wisconsin-Madison. “Early life stress can leave lasting impacts on the brain.” ScienceDaily. ScienceDaily, 27 June 2014.

Image Source: A Health Blog. “Exercise Plays Vital Role Maintaining Brain Health” 24 April 2012 via Flickr. Creative Commons Attribution.

One Father, One Mother, and Another Mother?

For many couples who cannot have children naturally, in vitro fertilization, or IVF, has given them the chance to have a family. The process begins with stimulating the mother’s body to produce and release eggs, which are then retrieved by a physician. The eggs are cultured in a laboratory with the father’s sperm to allow fertilization and subsequent monitoring before transfer into the mother’s uterus for implantation.


For most normal cases in which there are no genetic risk factors from the mother or father, this sequence of events for IVF will be able to produce a pregnancy. However, if the mother has a mitochondrial disease, the odds are against her.

During fertilization, genomic DNA from the father’s sperm joins with genomic DNA from the mother’s egg to make a complete set of chromosomes. However, there is another form of DNA in your cells. Mitochondria, the energy-producers of the cell, have their own circular DNA that is passed on during each cellular division. This DNA is not passed on like the chromosomes from the nucleus. Rather, mitochondria grow and divide on their own schedule. When the cell enters mitosis and divides, its cytoplasm is split between the daughter cells, causing roughly half of the mitochondria to enter each daughter.


When it comes to determining which parent passes on mitochondria, the answer is very simple. Sperm are essentially just a package for delivering the father’s DNA to the egg, providing little else. The egg, on the other hand, provides the cytoplasmic materials needed for the beginning of life, including the mitochondria.

In the late 1990s, the first IVF procedure to bypass a mitochondrial disease took place. Scientists followed the same procedure as in other IVF cases but added one more step: a small amount of cytoplasm from a second mother was injected into the egg, making it a three-parent IVF. The pregnancy was carried to term, and the child did not have any mitochondrial disease.

We are now more than 10 years out from the first use of this procedure, and scientists are asking how those children are doing. Unfortunately, not all of the offspring from three-parent IVF have been tracked, so it is difficult to know whether the procedure has caused any lasting effects. Now, researchers are ready to conduct a clinical trial to determine once and for all whether the procedure is safe and effective. One difference between the first procedure and the upcoming clinical trial is that instead of injecting only a small amount of another egg’s cytoplasm, scientists now want to replace all of the mitochondria in the egg. This could be done by uniting the father’s DNA and the mother’s DNA and injecting them into a donor egg from a third party, with the donor’s own nucleus removed.

The obvious concern here is whether this constitutes genetic engineering, a step towards a genetically perfect offspring. I do not think this is the case. Rather, I think scientists are researching ways to stop certain diseases, and if using someone else’s mitochondria helps my child lead a normal life without disease, then I am all for it!

Article Source: Kim Tingley. “The Brave New World of Three-Parent IVF.” The New York Times. 27 June 2014.

Image Sources:

Wellcome Images. “B0007320 Egg and Sperm” 9 June 2011 via Flickr. Creative Commons Attribution.

Erin Hass. “Mitochondria” 7 February 2011 via Flickr. Creative Commons Attribution.