April 5, 2012
By J.D. Heyes
“Let your kids play in the dirt. It’s good for them. In fact, you should join them. It’s fun getting dirty.” –KTRN
Here’s a concept that most parents will find a little hard to believe: new research shows that it’s possible kids can be too clean.
You read that right.
That’s because all of the soap dispensers, hand sanitizers and alcohol-tinged wipes could be robbing our kids of exposure to the germs that help strengthen their immune systems.
According to new research done on mice, increasing exposure to germs helps develop the immune system, thereby preventing allergies and other immune-related diseases like colitis and asthma later in life.
Researchers at the Brigham and Women’s Hospital (BWH) in Boston led the study and published their results in the March issue of the journal Science. Working from the “hygiene hypothesis,” the team says a lack of exposure to microbes in early childhood increases susceptibility to some diseases because the missing exposure hampers the development of the body’s immune system. The study does more than just support the notion; it may also explain the whys and hows of the process.
Researchers warned, however, that their research was conducted on mice, not humans. Still, the results seemed to indicate that you have to trigger the immune system with the introduction of germs in order for it to develop fully.
How it works
The research team, led by co-authors Dr. Richard Blumberg, chief for the BWH Division of Gastroenterology, Hepatology and Endoscopy, and Dr. Dennis Kasper, director of BWH Channing Laboratory, studied “germ-free” mice bred in a sterile environment without exposure to microbes, as well as specific-pathogen-free mice that were raised in a normal laboratory environment.
The mice were bred to develop forms of asthma and inflammatory bowel disease, and their immune systems were then compared.
The team found that the germ-free mice had more invariant natural killer T cells in their lungs and bowel, and developed more severe disease symptoms.
“[... W]e show that, in germ-free (GF) mice, invariant natural killer T (iNKT) cells accumulate in the colonic lamina propria and lung, resulting in increased morbidity in models of IBD and allergic asthma compared to specific pathogen-free (SPF) mice,” Blumberg and Kasper wrote.
The researchers also found that when germ-free mice were exposed to microbes in their first few weeks of life, they did not develop high levels of invariant natural killer T cells, nor the more severe symptoms seen in mice kept germ-free. They also discovered that germ-free mice with early-life exposure to microbes developed long-term disease protection.
“These studies show the critical importance of proper immune conditioning by microbes during the earliest periods of life,” Blumberg told reporters. “Also now knowing a potential mechanism will allow scientists to potentially identify the microbial factors important in determining protection from allergic and autoimmune diseases later in life.”
September 23, 2011
The privacy curtains that separate care spaces in hospitals and clinics are frequently contaminated with potentially dangerous bacteria, researchers said in Chicago this week.
To avoid spreading those bugs, health care providers should make sure to wash their hands after routine contact with the curtains and before interacting with patients, Dr. Michael Ohl, from the University of Iowa, Iowa City, said at the 51st Interscience Conference on Antimicrobial Agents and Chemotherapy.
“There is growing recognition that the hospital environment plays an important role in the transmission of infections in the health care setting and it’s clear that these (privacy curtains) are potentially important sites of contamination because they are frequently touched by patients and providers,” Dr. Ohl told Reuters Health.
Health care providers often touch these curtains after they have washed their hands and then proceed to touch the patient. Further, these curtains often hang for a long time and are difficult to disinfect.
In their study, Dr. Ohl and his team took 180 swab cultures from 43 privacy curtains twice a week for three weeks. The curtains were located in the medical and surgical intensive care units and on a medical ward of the University of Iowa Hospitals.
The researchers also marked the curtains to keep track of when they were changed.
Tests detected Staphylococcus aureus bacteria, including the especially dangerous methicillin-resistant S. aureus (MRSA), as well as various species of Enterococci — gut bacteria — some resistant to the newer antibiotic vancomycin.
The researchers used additional tests to identify specific vancomycin and methicillin-resistant strains to see whether the same strains were circulating and contaminating the curtains over and over.
The study found significant contamination that occurred very rapidly after new curtains were placed. Of the 13 privacy curtains placed during the study, 12 showed contamination within a week.
Virtually all privacy curtains tested (41 of 43) were contaminated on at least one occasion.
MRSA was isolated from one in five curtains, and vancomycin-resistant Enterococci (VRE) from four in 10. Eight curtains were contaminated with VRE more than once. Three of these were of a single type, but the other five showed contamination with different VRE strains, which suggested recontamination was happening with bacteria from new sources.
Overall, two-thirds of the swab cultures were positive for S. aureus (26 percent), Enterococcus species (44 percent) or various bacterial species from a group known as gram-negative rods (22 percent).
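The headline proportions can be reproduced from the raw counts quoted in the study. The short sketch below (purely illustrative; the counts come from this article, and the script is not part of the researchers’ analysis) simply turns those counts into percentages:

```python
# Curtain-contamination figures from the University of Iowa study,
# recomputed from the raw counts quoted in the article above.
# This script is illustrative only, not part of the study itself.

total_curtains = 43
contaminated_at_least_once = 41   # "41 of 43" curtains

new_curtains = 13
contaminated_within_week = 12     # "12 showed contamination within a week"

pct_any = contaminated_at_least_once / total_curtains * 100
pct_new = contaminated_within_week / new_curtains * 100

print(f"Contaminated at least once: {pct_any:.1f}%")        # ~95.3%
print(f"New curtains fouled within a week: {pct_new:.1f}%")  # ~92.3%
```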
“The vast majority of curtains showed contamination with potentially significant bacteria within a week of first being hung, and many were hanging for longer than three or four weeks,” Dr. Ohl noted.
“We need to think about strategies to reduce the potential transfer of bacteria from curtains to patients,” he added. “The most intuitive, common sense strategy is (for health care workers) to wash hands after pulling the curtain and before seeing the patient. There are other strategies, such as more frequent disinfecting, but this would involve more use of disinfectant chemicals, and then there is the possibility of using microbial resistant fabrics. But handwashing is by far the most practical, and the cheapest intervention.”
September 19th, 2011
By: Lenette Nakauchi
Since the beginning of medicine, people have looked for a way to ease the pain and discomfort of a variety of ailments. Over time, some herbs have been found to be highly effective as natural remedies, and lemon grass is one that is known to alleviate the pain felt as a result of muscle cramps. Effective in a variety of forms and for a diverse list of medical issues, lemon grass is a very widely used herb that has been incorporated into numerous industries.
Known botanically as Cymbopogon citratus (long classified as Andropogon citratus or Andropogon flexuosus), lemon grass has also been assigned the monikers lemongrass, scurvy grass, citronella grass, and fever grass. It is a perennial tropical grass that grows primarily in warm climates and is often found in Asian countries, as well as in Africa and South America. With roughly 55 species in the genus known to provide medicinal treatment, lemon grass has also been used as thatching for huts and cottages in traditional cultures.
The benefits of lemon grass are evident before the herb is even prepared, as the leaves have an aromatic smell that can be used as a fragrance in potpourri and sachets. The leaves themselves, which are a staple in Ayurvedic medicine, vary in color from yellow to reddish brown and can be used in fresh, dried, powdered, and oil form. Oil is one of the most popular forms; essential oils are distilled from the lemon grass leaves and are very thin in texture, similar to water.
Lemon grass is known as a great natural treatment for muscle cramps, as it alleviates the stress in the tissues and helps the muscles relax. But this herb has many other applications, ranging from a food additive to a fragrance used in beauty products. Some of the various uses of lemon grass include: a flavor supplement in food, especially wine and sauce; a fragrance for soaps, creams, detergents, perfumes, lotions, and hair products; a pesticide and rodent repellant; a degreaser; a treatment for depression; and a natural, safe way to fight off fatigue and to invigorate the senses.
Additionally, Chinese herbalists have long used lemon grass to treat a variety of ailments, including colds, fungal infections, stomach aches, digestion issues, spasms, toothaches, the buildup of mucus, and rheumatic pain. Lemon grass also kills germs, stops flatulence, helps blood to clot, acts as a diuretic, supports kidney health, serves as a sedative, treats ringworm, and is an effective tonic.
Though the benefits of lemon grass seem to go on and on, there are two issues that people using this herb should keep in mind. First, when used topically, lemon grass can cause irritation to sensitive skin. To prevent discomfort, it is recommended that anyone using lemon grass for the first time apply it to a small patch of skin to ensure that s/he does not have an adverse reaction. Additionally, it is highly recommended that pregnant women refrain from using lemon grass.
Clearly, the health benefits of lemon grass include the easing of muscle cramps and much more. A staple of both ancient and modern natural medicine, lemon grass has proven itself a versatile and effective treatment for many ailments. Its ability to alleviate muscle cramps can be heightened by using it in a smoothie or elixir after a workout.
Here is a quick and simple recipe you can try: blend 1 inch of the lemon grass root with 2 cups water, add a dash of stevia and cinnamon and/or vanilla, strain and enjoy!
June 22nd, 2011
Pillows at home and in hospitals have been overlooked as breeding grounds for infectious germs—including superbugs—according to a study cited by The London Times on Wednesday.
The study revealed that after two years of use, more than one third of a pillow’s weight is made up of living and dead dust mites, dust mite feces, dead skin and bacteria.
The findings, from UK public healthcare provider Barts and the London NHS Trust, emerged after a probe into standard-issue hospital pillows found that they were potential vehicles for infections such as methicillin-resistant staphylococcus aureus (MRSA) and Clostridium difficile (C. diff).
“People put a clean pillow case on and it looks and smells nice and fresh, but you are wrapping up something really nasty underneath,” said Dr. Art Tucker, lead researcher and principal clinical scientist at St. Barts Hospital.
The study, presented at the Healthcare Associated Infections 2011 conference in London on Tuesday, compared the state of standard hospital pillows with a medical pillow developed by the company Gabriel Scientific during several months on different wards at the hospital.
The high-tech pillows, sold under the brand name SleepAngel, are made from a membrane that is normally used as a filter in heart stents to keep out bacteria, and are sealed by melting the edges together rather than by sewing.
After two months the medical pillows tested negative for all bacteria under investigation, while the standard pillows tested positive for a range of micro-organisms.
The study stopped short of demonstrating that there was an increased risk of actual transmission of infections between hospital patients. Other scientists suggested that pillows were so widely used that they could not constitute a major health risk.
January 14th, 2011
By: Melanie Grimes
Honey provides a natural sweetener that can be used instead of sugar. Honey is a whole food that comes from plant nectar and does not raise blood sugar as other simple sugars do. Honey also contains a variety of minerals and vitamins and has a long history as a healing food. Honey has been used as an antiseptic, antimicrobial, and antibiotic.
Antibiotic Honey Kills Germs
Honey has long been used for its antibiotic properties, and research has now demonstrated the mechanism. In an article published in The FASEB Journal, the journal of the Federation of American Societies for Experimental Biology, scientists explain that defensin-1, a protein made by bees, is the active germ-killing ingredient in honey. The researchers postulate that honey may even be able to treat diseases and infections that are antibiotic-resistant, such as MRSA (methicillin-resistant Staphylococcus aureus).
Vitamins and Minerals in Honey
Honey contains vitamin A, vitamin B2 (riboflavin), vitamin B3 (nicotinic acid), vitamin B5 (pantothenic acid), vitamin C, biotin (also known as vitamin H), and rutin. Honey also contains many minerals, including calcium, magnesium, potassium, iron, copper, iodine, and zinc. The bee pollen in honey contains a great deal of protein as well.
Antioxidants in Honey
Honey contains antioxidants. A study at the University of California demonstrated that consuming honey can raise antioxidant levels in the blood. The darker the honey, the more antioxidants it contains; dark-colored honey from Illinois buckwheat has been shown to have 20 times the antioxidant value of sage honey from California.
Skin Healing Properties of Honey
Because honey has antimicrobial and antiseptic properties, it can be used to heal skin conditions. Hundreds of cases have been published in medical journals demonstrating honey’s ability to cure wounds and burns. Honey kills bacteria in the skin and speeds the healing of burns. It can be used to treat sunburns as well.
Honey in History
Honey has been used as far back as ancient Egypt. Pictured in ancient hieroglyphs and stored in tombs, honey was mentioned on the Rosetta Stone. King Menes, an Egyptian king from 4000 B.C., was known as the Beekeeper. The use of honey combined with wine and milk for healing was recorded in Egyptian medical texts written on papyrus.
January 7th, 2011
By: Kim Evans
Most people know that conventionally raised animals are regularly given antibiotics. In fact, according to the FDA, 29 million pounds of antibiotics are fed to livestock each year and for drug companies, this adds billions to the bottom line. Most people assume that antibiotics are given to livestock to kill off the germs and pathogens that come from living in the crowded and unsanitary quarters common on factory farms. What they don’t know: farmers really use antibiotics to make the animals fat.
According to the Des Moines Register, antibiotics are routinely added to animal feed to fatten the animals and save on feed costs, which of course boosts farmers’ profits. Antibiotics are well known to kill off healthy gut bacteria, and it is this loss of gut flora that does the fattening, by disrupting how foods, particularly fats, are metabolized. Farmers, it turns out, know that killing off the gut flora leads to fat animals. That raises the question: what effects are antibiotics having on the humans who continually consume them in animal flesh and also take them as drugs? In a nation that struggles with obesity, it’s a pretty serious question.
Not only do antibiotics make the animals fat, they also bring disease. Like humans who kill off their healthy bacteria, the animals are having health problems too. In a University of Iowa study, 70 percent of pigs and 64 percent of workers on several Iowa and western Illinois farms carried a new strain of MRSA. These farms used antibiotics routinely; on antibiotic-free farms, no MRSA was found. This undercuts the argument that antibiotics promote health, because it is becoming more and more obvious that they actually lead to disease if the healthy bacteria aren’t replenished afterward. We’ve also yet to see a study on what exactly eating MRSA-ridden pigs and other animals can do to a person. If nothing else, it’s pretty gross.
In humans, a strain of healthy gut bacteria works with a hormone that regulates fat development and hunger. Antibiotics are wiping out this bacterial strain in humans, and scientists have theorized that the loss of this bacteria might “be contributing to the current epidemics of early-life obesity, type 2 diabetes and related metabolic syndromes.” Antibiotics may be touted as miracle drugs, but when they cause more harm than good in the long run, they’re not much of a miracle. Real miracle antibiotics are more like coconut oil and raw organic garlic, which are both well known to kill the bad bacteria while leaving our healthy bacteria intact. Antibiotics have been added to animal feed since 1946.
August 13th, 2010
By: Maggie Fox
Germs living in the gut may cause higher rates of allergies, chronic stomach upsets and even obesity among children living in rich industrialized countries, researchers reported on Monday.
They compared intestinal bacteria between European Union children and young villagers in remote Burkina Faso, and found enough differences to help explain disparities in chronic disease and obesity.
The findings, published in the Proceedings of the National Academy of Sciences, may support the development of probiotic products to help restore the ancient balance and keep humans leaner and healthier, the researchers said.
“Our results suggest that diet has a dominant role over other possible variables such as ethnicity, sanitation, hygiene, geography, and climate, in shaping the gut microbiota,” Paolo Lionetti of the University of Florence in Italy and colleagues wrote.
“We can hypothesize that the reduction in richness we observe in EU compared with Burkina Faso children, could indicate how the consumption of sugar, animal fat, and calorie-dense foods in industrialized countries is rapidly limiting the adaptive potential of the microbiota.”
The study builds on a body of evidence that human health relies heavily on the trillions of microorganisms living in and on our bodies. Only a fraction cause disease directly — many more help digest food, affect other bacteria and may influence hundreds of biological functions.
Several recent studies have found that certain bacteria cause inflammation that can affect appetite as well as inflammatory bowel conditions like Crohn’s disease and colitis, including a study published in Science in March.
TRADING ONE DISEASE FOR ANOTHER
“Western developed countries successfully controlled infectious diseases during the second half of the last century, by improving sanitation and using antibiotics and vaccines,” the researchers wrote.
“At the same time, a rise in new diseases such as allergic, autoimmune disorders, and inflammatory bowel disease both in adults and in children has been observed,” they added.
Lionetti’s team studied the DNA of the gut bacteria of children in Burkina Faso, who are breast-fed up to age two and eat a diet likely similar to that of stone-age humans, rich in whole grains such as millet, legumes such as black-eyed peas, and vegetables. They eat very little meat.
The Western diet, in contrast, is heavy in meat, processed grains, sugar and fat.
The Italian team found the African children had many bacteria that help break down fiber, but the European children were lacking these microbes. The ratios were similar to studies comparing the gut bacteria of lean people to obese people.
This bacterial balance could even be causing obesity, the researchers said. It may also be useful to test children for these bacteria to see if they are at high risk of becoming obese, they said.
“Reduction in microbial richness is possibly one of the undesirable effects of globalization and of eating generic, nutrient-rich, uncontaminated foods,” Lionetti’s team wrote in the study.
January 06, 2010
By S. L. Baker
The opportunistic bacterium Pseudomonas aeruginosa is increasingly recognized as a cause of severe nosocomial infections — those are infections people contract as a result of treatment in a hospital or other medical center. In fact, a Pseudomonas aeruginosa infection can be life-threatening, especially if someone is immunocompromised.
The germ also causes chronic infections in cystic fibrosis patients. So it’s no surprise that disinfectants are widely sprayed, sloshed and wiped over surfaces in medical settings to supposedly protect patients. But now comes evidence the very act of relying on disinfectants to prevent Pseudomonas aeruginosa infections could be turning the already dangerous germ into a superbug that’s resistant to antibiotics as well as the disinfectant itself.
Germs adapt to survive
For a study just published in the January issue of the journal Microbiology, researchers from the National University of Ireland in Galway took laboratory cultures of Pseudomonas aeruginosa and added increasing amounts of disinfectant to the bacteria. They found this caused the germs to adapt over time so they could survive the disinfectant.
But something else also happened when the bacteria were exposed to the disinfectant. Remarkably, the germs became resistant to ciprofloxacin, a strong antibiotic widely-prescribed to fight Pseudomonas aeruginosa. And the germs became resistant to the drug even though they weren’t exposed to it.
How could this be possible? The scientists discovered that when exposed to the disinfectant, the bacteria adapted to more efficiently pump out antimicrobial agents (both the disinfectant and antibiotics) from the germ’s cells. The researchers also found the bacteria’s adaptation resulted in a DNA mutation that allowed the Pseudomonas aeruginosa microbes to specifically become immune to ciprofloxacin-type antibiotics.
Dr. Gerard Fleming, who headed the research team, warned in a media statement that the study results could mean “… residue from incorrectly diluted disinfectants left on hospital surfaces could promote the growth of antibiotic-resistant bacteria. What is more worrying is that bacteria seem to be able to adapt to resist antibiotics without even being exposed to them.”
Obviously, if disinfectants used to kill bacteria on surfaces to prevent their spread are actually making the germs stronger so they survive and go on to infect patients — and if antibiotics used to treat these infections are no longer effective — the results could be a serious threat to hospitalized patients. Dr. Fleming added that it is important for scientists to zero in on environmental factors that might promote antibiotic resistance, thereby creating superbugs.
“We need to investigate the effects of using more than one type of disinfectant on promoting antibiotic-resistant strains. This will increase the effectiveness of both our first and second lines of defense against hospital-acquired infections,” he stated.
December 22, 2009
By S. L. Baker
Gone are the days when play time for kids often meant getting dirty making mud “pies”, splashing in mud puddles and creeks, and climbing trees — and when children washed their hands, mostly just before a meal, it was with plain soap and water. Modern day parents often take pride in keeping their little ones squeaky clean and as germ-free as possible, dousing them with antibacterial soaps and hand sanitizers. But new Northwestern University research suggests that normal exposure to everyday germs is a natural way to prevent diseases in adulthood.
The study, published in the December 9th edition of the journal Proceedings of the Royal Society B: Biological Sciences, is the first to investigate whether microbial exposures early in life affect inflammatory processes related to diseases in adulthood. Remarkably, the Northwestern study suggests exposure to infectious microbes in childhood may actually protect youngsters from developing serious illnesses, including cardiovascular diseases, when they grow into adults.
“Contrary to assumptions related to earlier studies, our research suggests that ultra-clean, ultra-hygienic environments early in life may contribute to higher levels of inflammation as an adult, which in turn increases risks for a wide range of diseases,” Thomas McDade, lead author of the study, said in a statement to the media. McDade is associate professor of anthropology in Northwestern’s Weinberg College of Arts and Sciences and a faculty fellow at the Institute for Policy Research.
He added that humans have only recently lived in super clean environments and it could well be time to put down the antibacterial soap. That’s because the new research suggests that inflammatory systems need a reasonably high level of exposure to common everyday germs and other microbes to develop and work properly in the body.
“In other words, inflammatory networks may need the same type of microbial exposures early in life that have been part of the human environment for all of our evolutionary history to function optimally in adulthood,” stated McDade.
The Northwestern University researchers specifically studied how environments early in life might affect production of C-reactive protein (CRP), a protein that rises in the blood due to inflammation, in adulthood. Research concerning CRP, which is an important part of the immune system’s fight against infection, has primarily focused on the protein as a possible predictor of heart disease. Scientists previously have mostly conducted CRP research in affluent settings, including the U.S., where there are relatively low levels of infectious diseases.
McDade and colleagues were interested in what CRP production looks like in the Philippines, where residents have a high level of infectious diseases in early childhood compared to Western countries. Yet the people of the Philippines have relatively low rates of obesity (which is associated with CRP) and cardiovascular diseases.
How the research was conducted
The research team worked with data from a longitudinal study of Filipinos which began in the 1980s with 3,327 Filipino mothers in their third trimester of pregnancy. The mothers were interviewed about breast feeding and care giving and their households were assessed for socioeconomic levels, hygiene (including whether homes included domestic animals) and how many people lived in the home.
Researchers also visited with the mothers after their babies were born and then every two months for the first two years of the children’s lives. From that point on, the researchers followed up with the children every four or five years until the research subjects were approximately 22 years of age. During this entire period, records were kept on the children documenting their height and weight and any infectious diseases they contracted.
Blood tests revealed that Filipino participants in their early 20s had CRP concentrations averaging 0.2 milligrams per liter, about five to seven times lower than the average CRP levels for Americans of the same age.
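That “five to seven times lower” multiplier implies an average American CRP for the same age group of roughly 1.0 to 1.4 mg/L. A quick arithmetic check (the 0.2 mg/L figure and the multiplier come from the study as reported above; the American range is derived here rather than separately reported):

```python
# Back-of-the-envelope check of the CRP comparison.
# The Filipino average (0.2 mg/L) and the "five to seven times lower"
# multiplier come from the study; the implied American range below
# is derived from them, not independently measured.

filipino_avg_crp = 0.2  # mg/L, participants in their early 20s

implied_us_low = filipino_avg_crp * 5   # 1.0 mg/L
implied_us_high = filipino_avg_crp * 7  # 1.4 mg/L

print(f"Implied average American CRP: "
      f"{implied_us_low:.1f} to {implied_us_high:.1f} mg/L")
```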
“In the U.S., we have this idea that we need to protect infants and children from microbes and pathogens at all possible costs,” McDade concluded. “But we may be depriving developing immune networks of important environmental input needed to guide their function throughout childhood and into adulthood. Without this input, our research suggests, inflammation may be more likely to be poorly regulated and result in inflammatory responses that are overblown or more difficult to turn off once things get started.”
October 28, 2009
By Cain Burdeau
Federal officials plan to ban sales of raw oysters harvested from the Gulf of Mexico unless the shellfish are treated to destroy potentially deadly bacteria — a requirement that opponents say could deprive diners of a delicacy cherished for generations.
The plan has also raised concern among oystermen that they could be pushed out of business.
The Gulf region supplies about two-thirds of U.S. oysters, and some people in the $500 million industry argue that the anti-bacterial procedures are too costly. They insist adequate measures are already being taken to battle germs, including increased refrigeration on oyster boats and warnings posted in restaurants.
About 15 people die each year in the United States from raw oysters infected with Vibrio vulnificus, which typically is found in warm coastal waters between April and October. Most of the deaths occur among people with weak immune systems caused by health problems like liver or kidney disease, cancer, diabetes, or AIDS.
“Seldom is the evidence on a food-safety problem and solution so unambiguous,” Michael Taylor, a senior adviser at the Food and Drug Administration, told a shellfish conference in Manchester, N.H., earlier this month in announcing the policy change.
Some oyster sellers say the FDA rule smacks of government meddling. The sales ban would take effect in 2011 for oysters harvested in the Gulf during warm months.
“We have one man who’s 97 years old, and he comes in here every week and gets his oyster fix, no matter what month it is,” said Mark DeFelice, head chef at Pascal’s Manale Restaurant in New Orleans. “There comes a time when we need to be responsible. Government doesn’t need to be involved in this.”
The anti-bacterial process treats oysters with methods similar to pasteurization: mild heat, freezing temperatures, high pressure or low-dose gamma radiation.
But doing so “kills the taste, the texture,” DeFelice said. “For our local connoisseurs, people who’ve grown up eating oysters all their lives, there’s no comparison” between salty raw oysters and the treated kind.
A Gulf Coast oyster — or better still, a plate of a dozen oysters on the half-shell — is a delicacy savored for its salty, refreshing, slightly slimy taste. Some people add a drop of horseradish, lemon or hot sauce on top for extra zest.