Higher triglyceride level increases stroke risk


A study by scientists in Denmark revealed that increasing levels of non-fasting triglycerides are linked to an increased risk of ischemic stroke in men and women. Higher cholesterol levels were linked to greater stroke risk in men only. Details of this novel, 33-year study are now available online in Annals of Neurology, a journal published by Wiley-Blackwell on behalf of the American Neurological Association.


According to the World Health Organization (WHO), cardiovascular diseases are the number one cause of death globally, responsible for an estimated 17.1 million deaths worldwide in 2004, with 5.7 million due to stroke. The American Stroke Association states that stroke is the third leading cause of death in the U.S. and that 87% of all cases are ischemic strokes, which occur when the supply of blood to the brain is obstructed. The obstruction or blockage is typically caused by the build-up of fatty deposits inside blood vessels (atherosclerosis).

Medical evidence suggests that elevated non-fasting triglycerides are markers of elevated levels of lipoprotein remnants, particles similar to low density lipoprotein (LDL), or bad cholesterol, both of which are thought to contribute to plaque build-up. "Interestingly, current guidelines on stroke prevention have recommendations on desirable cholesterol levels, but not on non-fasting triglycerides," said lead study author, Dr. Marianne Benn from Copenhagen University Hospital. "Our study was the first to examine how the risk of stroke for very high levels of non-fasting triglycerides compared with very high cholesterol levels in the general population."

The Danish team followed 7,579 women and 6,372 men who were enrolled in the Copenhagen City Heart Study, all of whom were white and of Danish descent. Participants had non-fasting triglyceride and cholesterol measurements taken at baseline (1976-1978) and were followed for up to 33 years. A diagnosis of ischemic stroke was made when focal neurological symptoms lasted more than 24 hours. During the follow-up period, completed by 100% of participants, 837 women and 837 men developed ischemic stroke.

The results confirmed that, in both women and men, stepwise increases in non-fasting triglyceride levels were linked to increasing risk of ischemic stroke. In women, triglyceride levels of 1-2 mmol/L (89-177 mg/dL) carried a relative risk of 1.2, and levels of 5 mmol/L (443 mg/dL) or greater were linked to a 3.9-fold greater risk, compared with women whose triglyceride levels were less than 1 mmol/L (89 mg/dL). At similar triglyceride levels, men had a relative risk ranging from 1.2 to 2.3. Increasing cholesterol levels were not associated with greater risk of ischemic stroke, except in men whose cholesterol levels were 9 mmol/L (348 mg/dL) or more (relative risk of 4.4).
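For readers unfamiliar with the measure, a relative risk is simply the stroke incidence in an exposure group divided by the incidence in the reference group (here, women with triglycerides below 1 mmol/L). A minimal sketch of that arithmetic, using hypothetical counts rather than the study's actual data:

```python
def relative_risk(cases_exposed, n_exposed, cases_ref, n_ref):
    """Relative risk = incidence in the exposed group divided by
    incidence in the reference group."""
    return (cases_exposed / n_exposed) / (cases_ref / n_ref)

# Hypothetical counts for illustration only -- not the study's data.
# Reference group: triglycerides < 1 mmol/L; exposed: >= 5 mmol/L.
rr = relative_risk(cases_exposed=39, n_exposed=100,
                   cases_ref=10, n_ref=100)
print(round(rr, 1))  # 3.9, matching the 3.9-fold risk quoted above
```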

"Our findings suggest that levels of non-fasting triglycerides should be included in stroke prevention guidelines which currently focus on total cholesterol and LDL cholesterol levels," concluded Dr. Benn.


Posted by: Daniel    Source



Take care of your brain

As the average life span becomes longer, dementia becomes more common. Swedish scientist Laura Fratiglioni has shown that everyone can minimize his or her risk of being affected. Factors from blood pressure and weight to the degree of physical and mental activity can influence cognitive functioning as one gets older.

The lengthening of the average life span in the population has caused an increase in the prevalence of aging-related disorders, among them cognitive impairment and dementia. An expert panel estimates that worldwide more than 24 million people are affected by dementia, most suffering from Alzheimer's disease. In the more developed countries, 70 percent of persons with dementia are 75 years or older.


Age is the greatest risk factor for developing dementia. But there is growing evidence that the strong association with increasing age can be, at least partially, explained by a life course cumulative exposure to different risk factors.

Laura Fratiglioni's research group at Karolinska Institutet is a leader in identifying the risk factors that lie behind developing dementia and in using this knowledge to develop possible preventative strategies. The group's research has shown that the risk is partly determined by an individual's genetic susceptibility, and that active involvement in mental, physical and social activities can delay the onset of dementia by preserving cognitive functions. Education early in life has a protective effect, and the group's research has shown that it is never too late to get started.

"The brain, just as other parts of the body, requires stimulation and exercise in order to continue to function. Elderly people with an active life, mentally, physically and socially, run a lower risk of developing dementia, and it doesn't matter what the particular activities are", says Professor Laura Fratiglioni.

Laura Fratiglioni's research has shown that physical factors are also significant. Not only high and low blood pressure, but also diabetes and obesity in middle age, increase the risk of developing dementia after the age of 70. "What is good for the heart is good for the brain", she says.

Knowledge about risk factors and how to protect the brain from dementia is based on observational findings, in which researchers have discovered statistical correlations in the population. In other current studies carried out in Europe, researchers are investigating what happens when a large number of study participants are given special help to better control vascular risk factors and to stimulate social, physical and mental activities, which should, at the least, delay the onset of dementia.

"You could say that we are progressing from observation to experiment. This means that in a few years we will know more about which strategies are most effective in preventing neurodegenerative disorders", says Laura Fratiglioni.


Posted by: Daniel    Source



New method set to revolutionize blood pressure measurement

In a major scientific breakthrough, a new blood pressure measurement device is set to revolutionise the way patients' blood pressure is measured.

The new approach, invented by researchers at the University of Leicester and in Singapore, has the potential to enable doctors to treat their patients more effectively because it gives a more accurate reading than the current method. It does this by measuring the pressure close to the heart: the central aortic systolic pressure, or CASP.

Blood pressure is currently measured in the arm because it is convenient; however, this may not always accurately reflect the pressure in the larger arteries close to the heart.


CASPro blood pressure measurement device.

Credit: University of Leicester


The new technology uses a sensor on the wrist to record the pulse wave and then, using computerised mathematical modelling of the pulse wave, researchers are able to accurately read the pressure close to the heart. Patients who have tested the new device found it easier and more comfortable, as it can be worn like a watch.
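The actual CASP algorithm is proprietary, but the general idea of deriving a central estimate from a peripheral waveform by mathematical modelling can be caricatured with a simple low-pass filter, since the systolic peak is amplified in the arm relative to the aorta. Everything below (the function, the window size, the sample values) is an assumption for demonstration only, not HealthSTATS' method:

```python
def estimate_central_systolic(radial_wave, window=5):
    """Crude stand-in for a pressure transfer model: low-pass filter the
    peripheral (wrist) waveform with a moving average, then take its peak
    as the central estimate. Illustrates only that central pressure is
    derived mathematically from the peripheral pulse, not measured directly."""
    smoothed = []
    for i in range(len(radial_wave)):
        segment = radial_wave[max(0, i - window + 1):i + 1]
        smoothed.append(sum(segment) / len(segment))
    return max(smoothed)

# Synthetic radial pressure samples (mmHg) with an amplified systolic peak.
radial = [80, 85, 95, 120, 135, 128, 110, 95, 88, 82]
central = estimate_central_systolic(radial)
print(central < max(radial))  # True: central estimate sits below the arm peak
```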

Being able to measure blood pressure in the aorta, which is closer to the heart and brain, is important because this is where hypertension can cause damage. In addition, the pressure in the aorta can be quite different from that traditionally measured in the arm. The new technology will hopefully lead to better identification of those who will most likely benefit from therapy, by identifying those who have a high central aortic systolic pressure value. This will be particularly important for younger people, in whom the pressure measured in the arm can sometimes be quite exaggerated in comparison to the pressure in the aorta.

A key question is whether measurement of central aortic pressure will become routine in clinical practice. Professor Williams said: "It is not going to replace what we do overnight, but it is a big advance. Further work will define whether such measurements are preferred for everybody or whether there is a more defined role in selected cases, to better decide who needs therapy and who doesn't and whether the therapy is working optimally".

The University's close collaboration with the Singapore-based medical device company HealthSTATS International ("HealthSTATS") has led to the development of this world-first technique for more accurate blood pressure measurement.

The research work carried out by the University of Leicester was funded by the Department of Health's National Institute for Health Research (NIHR). The NIHR has invested £3.4 million, with a further £2.2 million in capital funding from the Department of Health, to establish a Biomedical Research Unit at Glenfield Hospital, Leicester, dedicated to translational cardiovascular research. The work, led by Professor Bryan Williams, Professor of Medicine at the University of Leicester and consultant physician at University Hospitals of Leicester NHS Trust, promises to change the way we measure blood pressure.

Professor Williams, who is based in the University of Leicester's Department of Cardiovascular Sciences at Glenfield Hospital, said: "I am under no illusion about the magnitude of the change this technique will bring about. It has been a fabulous scientific adventure to get to this point and it will change the way blood pressure has been monitored for more than a century. The beauty of all of this, is that it is difficult to argue against the proposition that the pressure near to your heart and brain is likely to be more relevant to your risk of stroke and heart disease than the pressure in your arm.

"Leicester is one of the UK's leading centres for cardiovascular research and is founded on the close working relationship between the University and the Hospitals which allows us to translate scientific research into patient care more efficiently. Key to our contribution to this work has been the support from the NIHR without which we would not have been able to contribute to this tremendous advance. The support of the NIHR has been invaluable in backing us to take this project from an idea to the bedside. Critical to the success of this project has been the synergies of combining clinical academic work here with HealthSTATS and their outstanding medical technology platform in Singapore. This has been the game-changer and I really do think this is going to change clinical practice".




IMAGE: The CASPal blood pressure measurement device.


Dr. Choon Meng Ting, Chairman and CEO of HealthSTATS, said: "This study has resulted in a very significant translational impact worldwide, as it will empower doctors and their patients to monitor their central aortic systolic pressure easily, even in their homes, and modify the course of therapy for BP-related ailments. Pharmaceutical companies can also use CASP devices for clinical trials and drug treatment. All these will ultimately bring about more cost savings for patients, reduce the incidence of stroke and heart attacks, and save more lives".

Health Secretary Andrew Lansley said:

"I saw this new technique in action in Leicester when I visited a few months ago. This is a great example of how research breakthroughs and innovation can make a real difference to patients' lives. We want the NHS to become one of the leading healthcare systems in the world and our financial commitment to the National Institute for Health Research reflects this.

"I believe patients deserve the best therapies available, and scientific research like this helps us move closer to making that happen".

Professor Dame Sally Davies, Director General of Research and Development and Interim Chief Medical Officer at the Department of Health, said:

"This is fantastic work by Professor Williams and his team and I am delighted to welcome these findings. I am especially pleased that the clinical research took place at the NIHR Biomedical Research Unit in Leicester. NIHR funding for Biomedical Research Centres and Units across England supports precisely this type of translational research, aimed at pulling through exciting scientific discoveries into benefits for patients and the NHS by contributing to improved diagnostics and therapies".


Posted by: Scott    Source





Relatives of melanoma patients

It is well known that sunbathing increases the risk of skin cancer and that this risk is increased in people with a family history of melanoma. New research published in BioMed Central's open access journal BMC Public Health shows that young people in this 'at risk' group are still ignoring sun safety advice.

Professor Sharon Manne of the Cancer Prevention and Control Program, New Jersey, asked over 500 people with a family history of melanoma, the most dangerous form of skin cancer, whether they regularly sunbathed and whether they used sunscreen. Even though most of these people were aware that sunscreen would protect them against cancer and premature aging, many of them still did not feel it necessary to use any form of sun protection.


Disturbingly, she observed that, despite their increased risk of melanoma, the younger women in this survey still viewed a tan as healthy and were the least likely to use sunscreen. Professor Manne said, "To reduce the occurrence rate of melanoma we need to reduce the perceived benefits of sunbathing and to increase the use of sun protection".


Posted by: George    Source



Vaccine made with synthetic gene

Scientists at Albert Einstein College of Medicine of Yeshiva University have developed an experimental vaccine that appears to protect against an increasingly common and especially deadly form of pneumococcal pneumonia. Details of the new vaccine, which was tested in an animal model, are reported in a paper published recently in the Journal of Infectious Diseases.

Pneumococcal pneumonia can occur when the lungs are infected with the bacterial species Streptococcus pneumoniae (also known as pneumococcus). "Like a number of microbes that cause pneumonia, pneumococcus is spread from person to person through coughing or sneezing," said principal investigator Liise-anne Pirofski, M.D., professor of medicine and of microbiology & immunology and the Selma and Dr. Jacques Mitrani Chair in Biomedical Research. Symptoms include cough, fever, shortness of breath, and chest pain.

The National Foundation for Infectious Diseases estimates that 175,000 people are hospitalized with pneumococcal pneumonia in the United States each year. In addition to pneumonia, pneumococcus causes 34,500 bloodstream infections and 2,200 cases of meningitis annually. It is responsible for more deaths in the United States (4,800 a year) than any other vaccine-preventable disease. It poses a particular problem in the developing world, where it is estimated to cause more than one million deaths in children each year, according to the World Health Organization.

A pediatric vaccine has dramatically reduced the occurrence rate of pneumococcal disease in children and adults, both by protecting vaccinated children and by reducing person-to-person transmission of the bacterium to others, a phenomenon known as herd immunity.

"The pediatric vaccine is a great victory for modern medicine, but it doesn't cover all strains of disease-causing pneumococcus, some of which have recently emerged and are very virulent," said Dr. Pirofski. "This problem, coupled with the fact that herd immunity doesn't protect immunocompromised patients as effectively as people with normal immunity, led us to look for a better vaccine".

The scientists focused on developing a vaccine against serotype 3, a pneumococcal strain that was not included in the pediatric vaccine used for the past decade and that has emerged as a cause of serious pneumonia in adults and children. Serotype 3 can trigger inflammation so overwhelming that it can result in very severe disease or even death.

The goal of this study was to produce a vaccine consisting of a live, attenuated (weakened) version of serotype 3 S. pneumoniae. To create their vaccine, the scientists focused on the serotype 3 gene that codes for pneumolysin, a toxin produced by all pneumococcal strains. The scientists replaced this gene with a synthetic version that they hoped would reduce the amount of toxin produced.

"Our idea was to design a live vaccine that would stimulate the immune system sufficiently to ward off disease but wouldn't lead to the severely damaging inflammatory response that this strain can cause," said lead author J. Robert Coleman, Ph.D., a postdoctoral fellow in microbiology & immunology at Einstein, who helped develop the gene-modification technique, known as synthetic gene customization, while a graduate student at Stony Brook University.

"The novelty of this approach lies in the fact that the gene's expression would be reduced, but not eliminated," Dr. Coleman added. "Prior approaches to genetic regulation of virulence relied on knocking out genes, which eliminates their expression completely."

Altering the pneumolysin gene in the serotype 3 bacteria resulted in less pneumolysin toxin being produced in vitro. When mice were injected with either attenuated or unattenuated serotype 3 bacteria, mice receiving the attenuated strain developed a much weaker inflammatory response than was observed in mice receiving the unattenuated serotype 3 strain. Most important, of the five mice injected with the attenuated strain, four survived a subsequent challenge with the highly virulent unattenuated serotype 3 strain, which was lethal in five of five unvaccinated, control mice.

This method of reducing gene expression had been used for viral pathogens, but this is the first time that gene customization has successfully controlled virulence in bacteria. The study's findings could potentially lead to pneumococcal vaccines based on weakened strains, and the Einstein scientists are now investigating whether they can reduce the expression of other genes linked to pneumococcal virulence.


Posted by: Mark    Source



How many mammograms must radiologists read?

Radiologists who interpret more mammograms and spend some time reading diagnostic mammograms do better at determining which suspicious breast lesions are cancer, according to a new report published online on February 22 and in print in the recent issue of Radiology.

In direct response to a report from the Institute of Medicine that called for more research on the relationship between interpretive volume and performance in screening mammography, a multi-site team undertook the largest and most comprehensive study of U.S. radiologists to date. The Institute of Medicine is the health arm of the National Academies, advisors to the nation on science, engineering, and medicine.


Funded largely through a unique collaboration between the American Cancer Society and the National Cancer Institute, the study examined information from 120 radiologists who interpreted 783,965 screening mammograms at six mammography registries in the Breast Cancer Surveillance Consortium (BCSC) over five years. The scientists looked at how screening outcomes correlated with four different measures of each radiologist's annual volume: the number of screening and diagnostic mammograms (separately and in combination) and the percentage of total mammograms that were for screening rather than diagnosis.

"We observed that radiologists who interpreted more mammograms a year had clinically and statistically significantly fewer false-positive findings, without missing more cancers," said study leader Diana S.M. Buist, PhD, MPH, a senior investigator at Group Health Research Institute. "That means radiologists with higher 'interpretive volumes' could identify the same number of cancers, while making fewer women come in for extra tests that showed they did not have cancer." On average, for every cancer detected, 22.3 women were called back for more testing.

False-positive findings, in which a mammogram suggests a breast cancer is present but it turns out not to be, cause women anxiety and spur extra testing, which amounts to at least $1.6 billion in health care costs each year. Often there is a tradeoff between minimizing false positives and maximizing sensitivity, which is the ability to identify cancer when present. But in this study, despite their lower false-positive rates, the high-volume radiologists had sensitivities and cancer-detection rates that resembled those of their lower-volume colleagues.
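The 22.3 recalls-per-cancer figure quoted above is simply the ratio of callbacks to confirmed cancers. A quick sketch with hypothetical counts chosen to reproduce it (not the consortium's actual numbers):

```python
def recalls_per_cancer(recalls, cancers_detected):
    """Women called back for extra testing per cancer actually found."""
    return recalls / cancers_detected

# Hypothetical volumes for illustration only: 2,230 recalls that
# yield 100 confirmed cancers give the 22.3 ratio reported above.
print(recalls_per_cancer(2230, 100))  # 22.3
```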

"We also observed that radiologists were more accurate at interpreting mammograms if they also interpreted some diagnostic mammograms," Dr. Buist said. Diagnostic mammograms evaluate breast symptoms or abnormalities seen on a previous screening mammogram. The cancer-detection rate was highest when at least one in five of the mammograms a radiologist read was a diagnostic, rather than a screening, mammogram, instead of the radiologist focusing almost exclusively on screening mammograms.

This report's findings have policy implications. The U.S. Food and Drug Administration (FDA) requires radiologists who interpret mammograms to read only 960 mammograms in two years, with no requirement about the type of mammograms they read (screening or diagnostic). In Europe and Canada, where volume requirements are 5 times higher, screening mammography programs have lower false-positive rates than in the United States, but similar cancer-detection rates.

"In the United States, the goal of screening is to achieve high sensitivity while keeping the rates of false positives low," Dr. Buist said. "No single measure can be calculated to make policy decisions, because any policy needs to weigh the tradeoff between missed cancers and false positives: Both have important impacts on women and society."

Dr. Buist added: "Based on these data, it would be beneficial if U.S. volume requirements could be increased to 1,000 or 1,500 screening mammograms per year, while adding a minimal requirement for diagnostic interpretation, which would optimize sensitivity and false-positive rates." According to her team's simulations, raising annual requirements for screening volume could lower the number of American women with false-positive workups (by more than 71,000 a year for annual minimums of 1,000, or by more than 117,000 a year for annual minimums of 1,500) without hindering the detection of breast cancer.

Conversely, raising the volume requirements could cause low-volume radiologists to stop reading mammograms. Concerns have been raised that the cadre of U.S. radiologists who read mammograms is aging and retiring. In this study, for instance, radiologists' median age was 54, and 38 percent of them interpreted fewer than 1,500 mammograms a year.

"Without more radiologists interpreting more mammograms, women may have less access to the only screening test that trials have shown can reduce deaths from breast cancer," Dr. Buist said. "Unlike the mammography debate about whether women in their 40s should be screened, which hinges on weighing the harms of false positives, the tradeoff around volume policy will concern workforce issues and reporting requirements that would necessitate changes to how the FDA collects information on how many mammograms radiologists interpret." Her team has also been testing strategies for improving how well radiologists interpret mammograms.

In a unique partnership and combination of funding, the American Cancer Society (through the Longaberger Company's Horizon of Hope Campaign), the National Cancer Institute (through the Breast Cancer Stamp Fund), and the Agency for Healthcare Research and Quality supported this study using data from the Breast Cancer Surveillance Consortium. The Longaberger Company, which sells baskets and other products through home shows, has raised more than $14 million through its Horizon of Hope campaign. From the sale of every Horizon of Hope basket, $2 goes to the American Cancer Society to support breast cancer research and other initiatives.


Posted by: Janet    Source



Callous-unemotional traits

Research presented this week at the annual meeting of the American Association for the Advancement of Science highlights the importance of callous-unemotional traits (CU) in identifying children at risk of antisocial behavior and other adjustment problems.

The research, presented by Indiana University Bloomington faculty member Nathalie M.G. Fontaine, finds that the emergence of CU traits in childhood is in most cases influenced by genetic factors, particularly in boys. However, environmental factors appear to be more significant for the small number of girls who exhibit high levels of CU traits.


Nathalie Fontaine is a researcher at Indiana University.

Credit: Indiana University


In this first longitudinal study employing a group-based analysis to examine the correlation between childhood trajectories of CU traits and conduct problems, scientists observed that high levels of both CU traits and conduct problems were linked to negative child and family factors at age 4 and with behavioral problems at age 12.

CU traits, such as a lack of emotion and a lack of empathy or guilt, are exhibited by a small number of children and are linked to persistent conduct problems, which are experienced by 5 percent to 10 percent of children.

"The children with high levels of both CU traits and conduct problems between ages 7 to 12 were likely to present negative predictors and outcomes, including hyperactivity problems and living in a chaotic home environment," said Fontaine, assistant professor of criminal justice in the College of Arts and Sciences at Indiana University Bloomington. "If we could identify those children early enough, we could help them as well as their families."

The AAAS presentation combines findings from two articles, one published in July 2010 in the Journal of the American Academy of Child & Adolescent Psychiatry and the other to be published online this week by the Journal of Abnormal Psychology. Co-authors include Frühling Rijsdijk of King's College London; Eamon McCrory of University College London; Michel Boivin of Laval University; Terrie Moffitt of Duke University and King's College London; and Essi Viding of University College London and King's College London.

The scientists examined data for more than 9,000 twins from the Twins Early Development Study, a data set of twins born in England and Wales between 1994 and 1996. Evaluations of CU traits and conduct problems were based on teacher questionnaires when the children were 7, 9 and 12. Family-level predictors at age 4 were based on information from parents, and behavioral outcomes at age 12 were based on information from teachers.

Participants were grouped in four trajectories for CU traits: stable low, stable high, increasing and decreasing. While most exhibited stable and low levels of CU traits, about one-fourth had stable high, increasing or decreasing CU traits. Participants were grouped in two trajectories for conduct problems, high and low.

Because the data set included both identical and non-identical twins, the scientists were able to examine the extent to which each trajectory of CU traits was related to genetic and environmental factors. They observed that, for boys in all four trajectories, genetic factors had the strongest influence. But for girls with stable high or increasing levels of CU traits, a shared environment had the strongest influence.

The research found an asymmetrical relationship between CU traits and persistent conduct problems. Children with high levels of CU traits were likely to also display high levels of conduct problems. But children with high levels of conduct problems did not necessarily exhibit high levels of CU traits.

Children with a high trajectory of CU traits and conduct problems were more likely than others to have experienced negative predictors at age 4, such as hyperactivity, negative parental discipline and chaos in the home. They also were more likely to experience negative outcomes at age 12, including problems with peers, emotional problems and negative parental feelings.

Fontaine emphasized that the findings do not mean that some children are, or necessarily will become, delinquents or psychopathic individuals -- or that heritability of CU traits equals destiny. Rather, the research suggests that CU traits could be used to identify children who are at risk for persistent and severe antisocial behavior, and to implement appropriate interventions to support and help these children and their families.

The research also could inform decisions about whether to include CU traits as a sub-typing index within the category of conduct disorder for the next edition of the Diagnostic and Statistical Manual of Mental Disorders (DSM-V).


Posted by: JoAnn    Source


Did you know?
Research presented this week at the annual meeting of the American Association for the Advancement of Science highlights the importance of callous-unemotional traits (CU) in identifying children at risk of antisocial behavior and other adjustment problems. The research, presented by Indiana University Bloomington faculty member Nathalie M.G. Fontaine, finds that the emergence of CU traits in childhood is in most cases influenced by genetic factors, particularly in boys. However, environmental factors appear to be more significant for the small number of girls who exhibit high levels of CU traits.

Using EEGs to diagnose autism

A computational physicist and a cognitive neuroscientist at Children's Hospital Boston have come up with the beginnings of a noninvasive test to evaluate an infant's autism risk. It combines the standard electroencephalogram (EEG), which records electrical activity in the brain, with machine-learning algorithms. In a pilot study, their system was 80 percent accurate in distinguishing 9-month-old infants known to be at high risk for autism from same-age controls.


Using EEGs to diagnose autism
Even though this work, published February 22 in the online open-access journal BMC Medicine, requires validation and refinement, it suggests a safe, practical way of identifying infants at high risk for developing autism by capturing very early differences in brain organization and function. This would allow parents to begin behavioral interventions one to two years before autism can be diagnosed through traditional behavioral testing.

"Electrical activity produced by the brain has a lot more information than we realized," says William Bosl, PhD, a neuroinformatics researcher in the Children's Hospital Informatics Program. "Computer algorithms can pick out patterns in those squiggly lines that the eye can't see".

Bosl; Charles A. Nelson, PhD, Research Director of the Developmental Medicine Center at Children's; and their colleagues recorded resting EEG signals from 79 babies, 6 to 24 months of age, who were participating in a larger study aimed at finding very early risk markers of autism. Forty-six infants had an older sibling with a confirmed diagnosis of an autism spectrum disorder (ASD); the other 33 had no family history of ASDs.

As the babies watched a research assistant blowing bubbles, recordings were made via a hairnet-like cap on their scalps, studded with 64 electrodes. When possible, tests were repeated at 6, 9, 12, 18 and 24 months of age.

Bosl then took the EEG brain-wave readings for each electrode and computed their modified multiscale entropy (mMSE) -- a measure borrowed from chaos theory that quantifies the degree of randomness in a signal, from which characteristics of whatever is producing the signal can be inferred. In this case, patterns in the brain's electrical activity give indirect information about how the brain is wired: the density of neurons in each part of the brain, how connections between them are organized, and the balance of short- and long-distance connections.

The researchers looked at the entropy of each EEG channel, which is believed to contain information about the density of neural connections in the brain region near that electrode.
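As a rough sketch of how per-channel entropy values might then be fed to a machine-learning classifier: the toy example below uses fabricated 64-channel "entropy profiles" and a simple nearest-centroid rule. This is purely illustrative and is not the authors' algorithm or data.

```python
import math
import random

def fit_centroids(features, labels):
    """Average the feature vectors of each class into one centroid."""
    centroids = {}
    for lbl in set(labels):
        rows = [f for f, l in zip(features, labels) if l == lbl]
        centroids[lbl] = [sum(col) / len(rows) for col in zip(*rows)]
    return centroids

def predict(centroids, feature):
    """Assign the label of the nearest centroid (Euclidean distance)."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(centroids, key=lambda lbl: dist(centroids[lbl], feature))

# Fabricated data: 64 per-channel entropy values per infant, with the
# simulated high-risk group given slightly lower entropy on average.
random.seed(42)
controls = [[random.gauss(1.0, 0.1) for _ in range(64)] for _ in range(20)]
high_risk = [[random.gauss(0.8, 0.1) for _ in range(64)] for _ in range(20)]
features = controls + high_risk
labels = ["control"] * 20 + ["high_risk"] * 20

centroids = fit_centroids(features, labels)
accuracy = sum(predict(centroids, f) == l
               for f, l in zip(features, labels)) / len(labels)
```

In practice a study like this would use a stronger classifier and held-out validation, but the principle is the same: each infant becomes a feature vector of channel entropies, and the classifier looks for group-level differences across those vectors.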

"A number of neuroresearchers think that autism reflects a 'disconnection syndrome,' by which distributed populations of neurons fail to communicate efficiently with one another," explains Nelson. "The current paper supports this hypothesis by suggesting that the brains of infants at high risk for developing autism exhibit different patterns of neural connectivity, though the relationship between entropy and the density of neural arbors remains to be explored." (Neural arbors are projections of neurons that form synapses, or connections, with other neurons.)

On average, the greatest difference was seen at 9 months of age. The scientists note that at 9 months, babies undergo important changes in their brain function that are critical for the emergence of higher-level social and communication skills -- skills often impaired in ASDs.

For reasons that still need to be explored, there was a gender difference: classification accuracy was greatest for girls at 6 months and remained high for boys at 12 and 18 months.

Overall, however, the distinction between the high-risk group and controls was smaller when infants were tested at 12 to 24 months. The authors speculate that the high-risk group may have a genetic vulnerability to autism that can be influenced and sometimes mitigated by environmental factors.

Bosl hopes to follow the high-risk group over time and compare EEG patterns in those who receive an actual ASD diagnosis with patterns in those who appear to be developing normally, and then compare both groups to the controls.

"With enough data, I'd like to follow each child's whole trajectory from 6 to 24 months," Bosl adds. "The trend over time appears to be more important than a value at any particular age".

Even though EEG testing for autism risk may seem impractical to implement on a wide scale, it is inexpensive and safe, does not require sedation (unlike MRI), takes only minutes to perform and can be done in a doctor's office. There are already data showing differences in EEG patterns for schizophrenia, major depression and PTSD, Bosl says.

Bosl also has started to collect data from older children, ages 6 to 17, and eventually hopes to have enough subjects to be able to compare EEG patterns for different types of ASDs.


Posted by: JoAnn    Source



Careful cleaning of children's skin wounds key to healing

When it comes to curing skin infected with the antibiotic-resistant bacterium MRSA (methicillin-resistant Staphylococcus aureus), timely and proper wound cleaning and draining appears to be more important than the choice of antibiotic, according to a new Johns Hopkins Children's Center study. The work is reported in a recent issue of Pediatrics.

Careful cleaning of children's skin wounds key to healing
Scientists originally set out to compare the efficacy of two antibiotics commonly used to treat staph skin infections, randomly giving 191 children either cephalexin, a classic anti-staph antibiotic known to work against the most common strains of the bacterium but not MRSA, or clindamycin, known to work better against the resistant strains. Much to the researchers' surprise, drug choice didn't matter: 95 percent of the children in the study recovered completely within a week, regardless of which antibiotic they got.

The finding led the research team to conclude that proper wound care, not antibiotics, may have been the key to healing.

"The good news is that no matter which antibiotic we gave, nearly all skin infections cleared up fully within a week," says study lead investigator Aaron Chen, M.D., an emergency doctor at Hopkins Children's. "The better news might be that good low-tech wound care, cleaning, draining and keeping the infected area clean, is what truly makes the difference between rapid healing and persistent infection".

Chen says proper wound care has always been the cornerstone of skin infection treatment, but in recent years more physicians have begun prescribing antibiotics preemptively.

Even though the Johns Hopkins researchers stop short of advocating against prescribing antibiotics for uncomplicated MRSA skin infections, they call for studies that directly measure the benefit, if any, of drug treatment versus proper wound care. The best study, they say, would compare patients receiving placebo with those on antibiotics, with both groups receiving proper wound cleaning, draining and dressing.

Antibiotics can have serious side effects, fuel drug resistance and raise the cost of care significantly, the scientists say.

"Many physicians understandably assume that antibiotics are always necessary for bacterial infections, but there is evidence to suggest this may not be the case," says senior investigator George Siberry, M.D., M.P.H., a Hopkins Children's pediatrician and medical officer at the Eunice Kennedy Shriver Institute of Child Health & Human Development. "We need studies that precisely measure the benefit of antibiotics to help us determine which cases warrant them and which ones would fare well without them".

The 191 children in the study, ages 6 months to 18 years, were treated for skin infections at Hopkins Children's from 2006 to 2009. Of these, 133 were infected with community-acquired MRSA (CA-MRSA), and the remainder had simple staph infections with non-resistant strains of the bacterium. CA-MRSA is a virulent subset of the bacterium that is not susceptible to most commonly used antibiotics. Most CA-MRSA causes skin and soft-tissue infections, but in those who are sick or have weakened immune systems, it can lead to invasive, sometimes fatal, infections.

At 48- to 72-hour follow-ups, children treated with the two antibiotics showed similar rates of improvement: 94 percent in the cephalexin group and 97 percent in the clindamycin group. By one week, the infections were gone in 97 percent of patients receiving cephalexin and in 94 percent of those on clindamycin. Those younger than 1 year of age and those whose infections were accompanied by fever were more prone to complications and more likely to be hospitalized.


Posted by: George    Source



Tau-induced memory loss in Alzheimer's mice

Amyloid-beta and tau protein deposits in the brain are characteristic features of Alzheimer disease. The effect on the hippocampus, the area of the brain that plays a central role in learning and memory, is especially severe. However, it appears that the toxic effect of tau protein is largely eliminated when the corresponding tau gene is switched off. Scientists from the Max Planck Research Unit for Structural Molecular Biology at DESY in Hamburg have succeeded in demonstrating that once the gene is deactivated, mice with a human tau gene, which previously presented symptoms of dementia, regain their ability to learn and remember, and that the synapses of the mice also reappear in part. The researchers are now testing active substances to prevent the formation of tau deposits in mice. This may help to reverse memory loss in the early stages of Alzheimer disease - in part, at least.


Tau-induced memory loss in Alzheimer's mice
To test their capacity to learn, the mice are trained to find an underwater platform which is not visible to them from the edge of a water basin. The swimming path is marked in red. Normal mice learn to find the path after just a few training sessions; they remember it and swim straight to the platform (left) when tested. A mouse with too much aggregated tau protein in its neurons finds it difficult to learn and swims aimlessly around the basin (centre) for extended periods. If the gene for the toxic tau protein in this mouse is switched off for a few weeks using a genetic trick, the mouse is able to learn normally again and quickly finds its way to the platform (right). © Max-Planck-ASMB/Mandelkow
Whereas aggregated amyloid-beta protein forms insoluble clumps between the neurons, the tau protein accumulates inside them. Tau protein stabilises the tube-shaped fibers of the cytoskeleton, known as microtubules, which provide the "rails" for cellular transport. In Alzheimer disease, excess phosphate groups cause the tau protein to malfunction and form clumps (the 'neurofibrillary tangles'). As a result, nutrient transport breaks down and the neurons and their synapses die off. This process is accompanied by the initial stage of memory loss.

Together with colleagues from Leuven, Hamburg and Erlangen, Eva and Eckhard Mandelkow's team from the Max Planck Research Unit for Structural Molecular Biology generated regulatable transgenic mice with two different human tau gene variants that can be switched on and off again: one group was given a form of the protein that cannot become entangled (anti-aggregant), and a second was provided with the code for the strongly aggregating protein variant (pro-aggregant). The mice with the first form developed no Alzheimer symptoms; the rodents that were given the pro-aggregant tau developed the disease.

The researchers measured the mice's memory loss with the help of a swimming test: the healthy mice quickly learn how to find a life-saving platform located under the surface of the water in a basin. In contrast, the transgenic animals, which carry the additional pro-aggregant tau gene, paddle aimlessly around the basin until they accidentally stumble on the platform, taking more than four times as long as their healthy counterparts. However, if the mutated toxic tau gene is switched off again, the mice learn to reach "dry land" with ease just a few weeks later. As a control, mice with the anti-aggregant form of tau showed no learning defects, just like normal non-transgenic mice.

Surprising tissue results
Tissue tests showed that, as expected, no tau clumps had formed in the brains of the first group of mice expressing anti-aggregant tau. In the second group, the mice suffering from Alzheimer's, co-aggregates of human tau and "mouse tau" had formed, against expectations, because tau protein from mice does not normally aggregate. "Even more astonishingly, weeks after the additional gene had been switched off, the aggregated human tau had dissolved again. However, the 'mouse tau' remained clumped. Despite this, the mice were able to learn and remember again," says Eckhard Mandelkow. More precise tests revealed that new synapses had actually formed in their brains.

The researchers concluded from this that mutated or pathological tau can alter healthy tau. It appears that pro-aggregant tau can act like a crystallization nucleus: once it has started to clump, it drags neighboring "healthy" tau into the clumps as well. This is what makes the process so toxic to the neurons. "The really important discovery here, however, is that the progression of Alzheimer's disease can be reversed in principle, at least at an early stage of the illness, before too many neurons have been destroyed," explains Eva Mandelkow, who, together with her husband, will be awarded the 2011 Potamkin Prize for Alzheimer's disease research, which is sponsored by the American Academy of Neurology.

The aggregation of tau proteins, however, cannot simply be switched off in humans the way it can in the transgenic mice. Nevertheless, special substances exist that could dissolve the tau aggregates. By screening 200,000 substances, the Hamburg scientists have already identified several classes of active substances that could re-convert the tau aggregates into soluble tau. These are now being tested on animals.


Posted by: Daniel    Source



Increasing brain enzyme may slow Alzheimer's disease

Increasing puromycin-sensitive aminopeptidase, the most abundant brain peptidase in mammals, slowed the damaging accumulation of tau proteins that are toxic to nerve cells and eventually lead to neurofibrillary tangles, a major pathological hallmark of Alzheimer's disease and other forms of dementia, according to a research study published online in the journal Human Molecular Genetics.

Stanislav Karsten, an LA BioMed principal researcher, is the lead author of a new study on Alzheimer's disease.

Credit: LA BioMed


Scientists found they could safely increase puromycin-sensitive aminopeptidase (PSA/NPEPPS) to two to three times the usual amount in animal models, and that doing so removed the tau proteins in the neurons. Removing the tau proteins restored neuronal density and slowed disease progression. Scientists detected no abnormalities caused by the increase in PSA/NPEPPS, suggesting that elevating PSA/NPEPPS activity may be a viable approach to treating Alzheimer's disease and other forms of dementia, known as tauopathies.

"Our research demonstrated that increasing the brain enzyme known as PSA/NPEPPS can effectively block the accumulation of tau protein that is toxic to nerve cells and slow down the progression of neural degeneration without unwanted side effects," said Stanislav L. Karsten, PhD, the corresponding author for the study and a principal investigator at Los Angeles Biomedical Research Institute at Harbor-UCLA Medical Center (LA BioMed). "These findings suggest that increasing this naturally occurring brain peptidase, PSA/NPEPPS, appears to be a feasible therapeutic approach to eliminate the accumulation of unwanted toxic proteins, such as tau, that cause the neural degeneration linked to the devastating effects of Alzheimer's disease and other forms of dementia".

Alzheimer's disease affects 2 million to 4 million Americans, and their ranks are expected to grow to as many as 14 million by the middle of the 21st century as the population ages.

The potential for PSA/NPEPPS to protect neurons from degeneration was first reported in a 2006 issue of the journal Neuron. At that time, scientists hypothesized that PSA/NPEPPS may be a natural mechanism for protecting neurons. Dr. Karsten, who was the main author of the 2006 study, said the newly released study is the first to provide data confirming the neuroprotective role of PSA/NPEPPS in mammals.


Posted by: Daniel    Source



Regrowing hair

It has been long known that stress plays a part not just in the graying of hair but in hair loss as well. Over the years, numerous hair-restoration remedies have emerged, ranging from hucksters' "miracle solvents" to legitimate medications such as minoxidil. But even the best of these have shown limited effectiveness.

Now, a team led by scientists from UCLA and the Veterans Administration that was investigating how stress affects gastrointestinal function may have found a chemical compound that induces hair growth by blocking a stress-related hormone linked to hair loss, entirely by accident.

The serendipitous discovery is described in an article in the online journal PLoS One.

Regrowing hair
The CRF1/CRF2 receptor antagonist astressin-B, injected intraperitoneally (ip) in CRF-OE mice with fully developed alopecia, induces hair growth and pigmentation. Photographs: Row A: male CRF-OE mice (4 months old) injected ip once daily for 5 consecutive days with saline, at 3 days after the last injection; Row B: astressin-B (5 mg/mouse), at 3 days after the last ip injection; Row C: the same mice as in the middle panel (Row B), at 4 weeks after the last ip injection.

Credit: UCLA/VA


"Our findings show that a short-duration treatment with this compound causes astounding long-term hair regrowth in chronically stressed mutant mice," said Million Mulugeta, an adjunct professor of medicine in the division of digestive diseases at the David Geffen School of Medicine at UCLA and a corresponding author of the research. "This could open new avenues to treat hair loss in humans through the modulation of the stress hormone receptors, especially hair loss related to chronic stress and aging."

The research team, which was originally studying brain-gut interactions, included Mulugeta, Lixin Wang, Noah Craft and Yvette Taché from UCLA; Jean Rivier and Catherine Rivier from the Salk Institute for Biological Studies in La Jolla, Calif.; and Mary Stenzel-Poore from Oregon Health & Science University.

For their experiments, the scientists had been using mice that were genetically altered to overproduce a stress hormone called corticotrophin-releasing factor, or CRF. As these mice age, they lose hair and eventually become bald on their backs, making them visually distinct from their unaltered counterparts.

The Salk Institute scientists had developed the chemical compound, a peptide called astressin-B, and described its ability to block the action of CRF. Stenzel-Poore had created an animal model of chronic stress by altering the mice to overproduce CRF.

UCLA and VA scientists injected the astressin-B into the bald mice to observe how its CRF-blocking ability affected gastrointestinal tract function. The initial single injection had no effect, so the researchers continued the injections over five days to give the peptide a better chance of blocking the CRF receptors. They measured the inhibitory effects of this regimen on the stress-induced response in the colons of the mice and placed the animals back in their cages with their hairy counterparts.

About three months later, the researchers returned to these mice to conduct further gastrointestinal studies and found they couldn't distinguish them from their unaltered brethren. They had regrown hair on their previously bald backs.

"When we analyzed the identification numbers of the mice that had grown hair, we observed that, indeed, the astressin-B peptide was responsible for the remarkable hair growth in the bald mice," Mulugeta said. "Subsequent studies confirmed this unequivocally."

Of particular interest was the short duration of the treatment: just one shot per day for five consecutive days maintained the effects for up to four months.

"This is a comparatively long time, considering that a mouse's life span is less than two years," Mulugeta said.

So far, this effect has been seen only in mice; whether it also occurs in humans remains to be seen, the researchers said. They also treated the bald mice with minoxidil alone, which resulted in mild hair growth, as it does in humans, suggesting that the effects of astressin-B could likewise translate to human hair growth. In fact, the stress hormone CRF, its receptors and other peptides that modulate these receptors are known to be present in human skin.


Posted by: George    Source



Late nights can lead to higher risk of strokes

New research from Warwick Medical School published recently in the European Heart Journal shows that prolonged sleep deprivation and disrupted sleep patterns can have long-term, serious health implications. Leading academics from the University have linked lack of sleep to strokes, heart attacks and cardiovascular disorders which often result in early death.

Professor Francesco Cappuccio from the University of Warwick Medical School, explained: "If you sleep less than six hours per night and have disturbed sleep you stand a 48 per cent greater chance of developing or dying from heart disease and a 15 per cent greater chance of developing or dying of a stroke.


Late nights can lead to higher risk of strokes
"The trend for late nights and early mornings is actually a ticking time bomb for our health so you need to act now to reduce your risk of developing these life-threatening conditions".

Professor Cappuccio and co-author Dr Michelle Miller, from the University of Warwick, conducted the research programme, which followed participants for seven to 25 years and drew on data from more than 470,000 people in eight countries, including Japan, the USA, Sweden and the UK.

Professor Cappuccio explained: "There is an expectation in today's society to fit more into our lives. The whole work/life balance struggle is causing too many of us to trade in precious sleeping time to ensure we complete all the jobs we believe are expected of us".

He added: "But in doing so, we are significantly increasing the risk of suffering a stroke or developing cardiovascular disease resulting in, for example, heart attacks".

Dr Miller explained further: "Chronic short sleep produces hormones and chemicals in the body which increase the risk of developing heart disease and strokes, and other conditions like hypertension, high cholesterol, diabetes and obesity".

But Professor Cappuccio did warn of the implications of going too far the other way: sleeping overly long, more than nine hours at a stretch, appears to be an indicator of illness, including cardiovascular disease.

"By ensuring you have about seven hours sleep a night, you are protecting your future health, and reducing the risk of developing chronic illnesses. The link is clear from our research: get the sleep you need to stay healthy and live longer".


Posted by: Daniel    Source



Blood test to detect Alzheimer's disease

UT Southwestern Medical Center researchers have helped develop a novel technology to diagnose Alzheimer's disease from blood samples long before symptoms appear.

This preliminary technology, which uses synthetic molecules to seek out and identify disease-specific antibodies, also could be used eventually in the development of specific biomarkers for a range of other hard-to-diagnose diseases and conditions, including Parkinson's disease and immune system-related diseases like multiple sclerosis and lupus, the scientists predict.


Blood test to detect Alzheimer's disease
"One of the great challenges in treating patients with Alzheimer's disease is that once symptoms appear, it's too late. You can't un-ring the bell," said Dr. Dwight German, professor of psychiatry and an author of the paper reported in the Jan. 7 edition of Cell. "If we can find a way to detect the disease in its earliest stages, before cognitive impairment begins, we might be able to stop it in its tracks by developing new therapeutic strategies".

Because patients with Alzheimer's disease (AD) exhibit immune system activation and neurodegeneration in several brain regions, scientists in the study hypothesized that there may be numerous antibodies in the serum of affected patients that are specific to the disease and can serve as a biomarker.

Antigens, substances such as proteins from a virus or bacterium that trigger an immune response, traditionally have been necessary for the discovery of antibody biomarkers. It has previously been impossible to identify an antibody (a type of targeted immune molecule) without first knowing the antigen that triggers its production.

The newly released study, however, challenges conventional wisdom and uses synthetic molecules (peptoids) rather than antigens to successfully detect signs of disease in patients' blood samples. These peptoids have several advantages: they can be modified easily and can be produced quickly in relatively large amounts at lower cost.

The adaptive immune system is believed to be a rich source of protein biomarkers, but diagnostically useful antibodies remain undiscovered for a large number of diseases, Dr. German said. This is, in part, because the antigens that trigger an immune response in many diseases are unknown. The technology behind this discovery is essentially an immune-system reader, designed to pick out antibodies without knowing in advance which ones to look for.

The scientists used a combinatorial library of several thousand peptoids to screen serum samples from mice with multiple sclerosis-like symptoms as well as from healthy control mice. The particular peptoids that retained more antibodies from the blood samples of the diseased animals were identified as potential agents for capturing diagnostically useful molecules.

The researchers then examined serum samples from six AD patients, six healthy patients and six patients with Parkinson's. Three peptoids were identified that captured six times the IgG antibody levels in all of the Alzheimer's patients compared with the control group or the Parkinson's patients. Two of the peptoids were found to bind the same IgG antibody, while the third was shown to bind different antibodies, meaning there are at least two candidate biomarkers for AD. Using an additional set of 16 normal control subjects and 10 subjects at a very early stage of AD, the three candidate biomarkers identified AD with 90 percent accuracy.

"The results of this study, though preliminary, show great potential for becoming a landmark," said Dr. German.


Posted by: Daniel    Source



Key culprit in breast cancer metastasis

When doctors discover high concentrations of regulatory T cells in the tumors of patients with breast cancer, the prognosis is often grim, though why exactly has long been unclear.

Now new research at the University of California, San Diego School of Medicine suggests these regulatory T cells, whose job is to help mediate the body's immune response, produce a protein that appears to hasten and intensify the spread of breast cancer to distant organs and, in doing so, dramatically increase the risk of death.

The findings are published in the Feb. 16 advance online edition of the journal Nature.

The scientists observed that mice with breast cancer were more likely to develop lung metastases due to elevated levels of RANKL, an inflammatory protein normally involved in bone remodeling. Regulatory T cells were found to be the primary source of RANKL in these tumors, and the same increase in metastasis was seen when synthetic RANKL was injected directly into tumors, suggesting that RANKL is key to the ability of regulatory T cells to promote the spread of breast cancer. The researchers also determined that interfering with the ability of RANKL to interact with cancer cells seemed to block tumor progression, and may represent a potential target for drug treatment.

"What is exciting about this study is that now that we understand an increase in RANKL translates to an increase in metastasis, we can get to work on figuring out ways to stop or slow the production of RANKL in patients with breast cancer," said Michael Karin, PhD, Distinguished Professor of Pharmacology and Pathology at UCSD's Laboratory of Gene Regulation and Signal Transduction and Moores Cancer Center.

RANKL is a well-known factor in a variety of degenerative bone diseases, including rheumatoid arthritis and bone metastasis. In June 2010, the Food and Drug Administration approved the first RANKL-inhibiting drug for use in postmenopausal women at risk for osteoporosis.

"When we were able to control the RANKL production in the mice, we were able to slow or stop the spread of the cancer," Karin said. "The next logical step is to turn to drugs that block RANKL production to see how they might affect the spread of breast cancer".

Other breast cancer studies have linked RANKL to the early stages in the development of synthetic progestin-driven breast tumors. According to the Women's Health Initiative and the Million Women Study, hormone replacement therapy and contraceptives containing progestin significantly increase the risk of developing breast cancer. Together with the new UCSD research, these findings suggest that drugs blocking RANKL may be effective in preventing both the early stages of breast cancer and the advanced progression of the disease.


Posted by: Janet    Source


Did you know?
When doctors discover high concentrations of regulatory T cells in the tumors of patients with breast cancer, the prognosis is often grim, though why exactly has long been unclear. Now new research at the University of California, San Diego School of Medicine suggests these regulatory T cells, whose job is to help mediate the body's immune response, produce a protein that appears to hasten and intensify the spread of breast cancer to distant organs and, in doing so, dramatically increase the risk of death.
Read more >>
Bookmark and Share

Healthy lifestyle, positive attitude

Joint replacement patients who improve their lifestyle and maintain a positive mindset prior to surgery are more likely to have better functional outcomes than those who do not, according to research presented today at the 2011 Annual Meeting of the American Academy of Orthopaedic Surgeons (AAOS). Multiple studies observed that patients who smoke, misuse alcohol, fail to control blood sugar levels or simply have a poor attitude before undergoing total hip or knee replacement (THR/TKR) surgery can, in some cases, double their odds of post-operative complications.

Data were presented in three separate studies and one instructional course by scientists from Stanford University, the University of Alabama, the Orthopedic Institute in Miami and the University of Massachusetts.


"Some known risk factors for complications like advanced age and pre-existing heart or lung conditions are difficult or impossible to modify previous to surgery," said Jasvinder Singh, MD, associate professor of medicine at the University of Alabama in Birmingham. "In contrast, smoking, alcohol abuse, blood sugar levels and mental attitude are completely manageable by the patients themselves, which makes them an excellent target for prevention and intervention programs that are likely to improve outcomes".

Smoking (now or ever) raises patient risks (Embargo: February 17)
Dr. Singh, who also is a staff doctor at the Birmingham VA (Veterans' Affairs) Medical Center, led a team of scientists who examined whether current or previous tobacco use had an effect on post-operative recovery in veterans undergoing elective THR or TKR.

The study observed that current smokers had 41 percent higher odds of surgical site infection (SSI) than those who had never smoked. Current smokers also had significantly higher odds of pneumonia (53 percent), stroke (161 percent) and one-year mortality (63 percent) in comparison with never smokers.

Previous smokers had higher odds of stroke (114 percent), pneumonia (34 percent), urinary tract infection (26 percent) and pulmonary complications (30 percent), compared with never smokers.
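As a rough aid to interpreting these figures (an illustrative sketch, not part of the study): "X percent higher odds" corresponds to an odds ratio of 1 + X/100, so the smoking results above can be restated as odds ratios. The conversion helper below is purely illustrative; the percentages are those reported in the summary.

```python
# Convert "percent higher odds" figures into odds ratios: OR = 1 + pct/100.
def percent_increase_to_odds_ratio(percent: float) -> float:
    """E.g. "41 percent higher odds" -> odds ratio of 1.41."""
    return 1.0 + percent / 100.0

# Percentages reported for current smokers vs. never smokers:
current_smokers = {"SSI": 41, "pneumonia": 53, "stroke": 161, "1-year mortality": 63}

for outcome, pct in current_smokers.items():
    odds_ratio = percent_increase_to_odds_ratio(pct)
    print(f"current smokers, {outcome}: OR = {odds_ratio:.2f}")
# e.g. the 161 percent figure for stroke is an odds ratio of 2.61,
# i.e. more than two and a half times the odds of never smokers.
```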

They analyzed data from 33,336 patients from the VA Surgical Quality Improvement Program (VASQIP) who underwent elective primary joint replacement procedures between October 2001 and September 2008. Specifically, they measured the association of smoking status at the time of surgery with 30-day, 90-day and one-year post-operative complication rates, including surgical site and other infections (such as pneumonia), stroke, heart attack, and mortality.

Patients were on average 64 years old, mostly male (95 percent) and Caucasian (66 percent). Fifty-seven percent never smoked, 19 percent were previous smokers (who had stopped smoking at least a year before surgery) and 24 percent were current smokers.

"Since the risk of complications in joint replacement patients who smoke is quite significant and since it is possible that even short-term cessation may provide significant protection from such complications, it would be reasonable to approach surgical candidates for a pre-operative smoking-cessation program," said Dr. Singh. "If smokers are looking for a reason to quit, the waiting period before total joint replacement provides a golden opportunity".

Alcohol misuse a factor in likely complications (Embargo: February 15)
In the first study, scientists from Stanford University reviewed post-surgical complication rates among 185 veterans who underwent total joint replacement surgery and who had admitted consuming alcohol in the past year based on their responses to the Alcohol Use Disorders Identification Test (AUDIT-C), a standardized annual evaluation conducted at VA facilities.

They observed that patients who reported the highest amount of alcohol consumption (at the level considered "alcohol misuse"*) were most likely to experience complications. In fact, each additional point on the 12-point scale corresponded with a 29-percent increase in the expected number of complications.

"Complications following total joint replacement and alcohol misuse are exponentially related," said main author Nicholas J. Giori, MD, orthopaedic surgeon at Palo Alto Veterans Affairs Medical Center and associate professor of orthopaedic surgery at Stanford University Hospital. "These results, though from a small selection of patients, indicate the need for preoperative screenings and possibly interventions for alcohol misuse among joint replacement candidates".

*The U.S. Department of Health and Human Services offers a guideline for moderate alcohol use as one or fewer drinks per day for women and two or fewer drinks per day for men. This study defined "alcohol misuse" using the AUDIT-C, a Veterans' Affairs screening tool, under which patients at risk can include those who drink more than 4 times a week, have more than 9 standard drinks in a typical day, or routinely have more than 6 drinks a day.

Patients' stable blood sugar can help healing (Embargo: February 15)
In another study, scientists at the Orthopedic Institute in Miami reported that type 2 diabetic patients who had preoperative hypoglycemia (low blood sugar) or hyperglycemia (high blood sugar) fared worse after total joint replacement surgery than those who were able to keep their blood sugar (measured as HbA1c) at normal levels.

Surgeons conducted 121 consecutive primary total joint replacements on type 2 diabetic patients and reviewed them based on preoperative HbA1c levels.* They divided the group into three segments (25 percent of patients were hypoglycemic, 50 percent were within normal ranges and 25 percent were hyperglycemic) and compared each segment's patient-oriented outcomes, complications, length of stay and hospital costs.

Scientists found a significant trend toward worse scores in all categories among patients in the lowest and highest ranges.

"When set in a graph, the results looked like an inverted bell, with complications spiking on both ends of the spectrum and dipping in the middle," said Carlos J. Lavernia, MD, Chief of Orthopaedics at Mercy Hospital in Miami and Chief of the Orthopedic Institute. "Even after controlling for all external factors that could have affected the outcomes, the inverted-bell shape remained intact, indicating that diabetic patients who control their blood sugar previous to surgery will inevitably have better outcomes".

*A number of individual factors (including the timing of the last meal) are examined to determine a "normal" blood glucose level, but 70 mg/dL to 120 mg/dL is generally considered ideal. Patients with diabetes or hypoglycemia are urged to narrow that range even further and should have their own "norms" identified by a physician.

Strong mental and emotional health can set the stage for success (Embargo: February 18)
Finally, during a symposium moderated by David C. Ayers, MD, the Arthur Pappas Professor and Chair of Orthopedics at the University of Massachusetts Medical School, participants learned that patients' mental approach before, during and after surgery can help determine how well they tolerate the recovery process and the degree of functional improvement they gain.

David C. Ring, MD, Associate Professor of Orthopaedic Surgery at Harvard Medical School and one of the symposium presenters, said, "Individuals who recognize within themselves the ability to ensure that things will be okay consistently report less pain and disability for a given disease or impairment."

Through a grant from the National Institutes of Health (NIH), Dr. Ayers is currently leading a team of scientists who are studying the emotional aspects of musculoskeletal health in patients undergoing total knee arthroplasty.

"There is a range of functional improvement patients experience after TKR. We have shown that patients with poor emotional health pre-operatively, that have poor coping skills, little social support, and are anxious, are at risk for less functional improvement after total knee replacement. We are studying the effect of placing these high-risk patients in a post-op pathway that directly addresses the factors in order to improve their functional improvement after TKR ," said Dr. Ayers. "In addition to offering top-notch surgical and medical care, all medical professionals should encourage patients to engage in positive changes in lifestyle before and after surgery. The results will speak for themselves".


Posted by: Janet    Source




Did you know?
Joint replacement patients who improve their lifestyle and maintain a positive mindset previous to surgery are more likely to have better functional outcomes than those who do not, as per research presented today at the 2011 Annual Meeting of the American Academy of Orthopaedic Surgeons (AAOS). Multiple studies observed that patients who smoke, misuse alcohol, fail to control blood sugar levels or simply have a poor attitude previous to undergoing total hip or knee replacement (THR/TKR) surgery can, in some cases, double their odds of post-operative complications.
Read more >>
Bookmark and Share

Flavonoids may lower risk of Parkinson's

New research shows men and women who regularly eat berries may have a lower risk of developing Parkinson's disease, while men may also further lower their risk by regularly eating apples, oranges and other sources rich in dietary components called flavonoids. The study was released recently and will be presented at the American Academy of Neurology's 63rd Annual Meeting in Honolulu April 9 to April 16, 2011.

Flavonoids are plant-derived compounds, also known collectively as vitamin P and citrin. Rich dietary sources include berries, chocolate, and citrus fruits such as grapefruit.



The study involved 49,281 men and 80,336 women. Scientists gave participants questionnaires and used a database to calculate intake amount of flavonoids. They then analyzed the association between flavonoid intakes and risk of developing Parkinson's disease. They also analyzed consumption of five major sources of foods rich in flavonoids: tea, berries, apples, red wine and oranges or orange juice. The participants were followed for 20 to 22 years.

During that time, 805 people developed Parkinson's disease. Among men, the top 20 percent who consumed the most flavonoids were about 40 percent less likely to develop Parkinson's disease than the bottom 20 percent who consumed the least. In women, there was no relationship between overall flavonoid consumption and developing Parkinson's disease. However, when sub-classes of flavonoids were examined, regular consumption of anthocyanins, which are mainly obtained from berries, was found to be linked to a lower risk of Parkinson's disease in both men and women.
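For readers translating the quintile comparison into numbers (an illustrative sketch, not the study's analysis): "about 40 percent less likely" corresponds to a relative risk of roughly 0.6 for the top versus bottom quintile of intake. The absolute risk used below is hypothetical, included only to show what such a relative risk means in absolute terms.

```python
# A "percent reduction in risk" maps to a relative risk of 1 - pct/100.
def relative_risk(percent_reduction: float) -> float:
    return 1.0 - percent_reduction / 100.0

rr = relative_risk(40)  # about 40 percent less likely -> RR of roughly 0.6

# Hypothetical absolute risk in the bottom quintile over follow-up,
# purely for illustration (not a figure from the study):
bottom_quintile_risk = 0.010
top_quintile_risk = bottom_quintile_risk * rr
print(f"RR = {rr:.1f}; top-quintile risk = {top_quintile_risk:.3%}")
```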

"This is the first study in humans to examine the association between flavonoids and risk of developing Parkinson's disease," said study author Xiang Gao, MD, PhD, with the Harvard School of Public Health in Boston. "Our findings suggest that flavonoids, specifically a group called anthocyanins, may have neuroprotective effects. If confirmed, flavonoids appears to be a natural and healthy way to reduce your risk of developing Parkinson's disease".


Posted by: Daniel    Source


Did you know?
New research shows men and women who regularly eat berries may have a lower risk of developing Parkinson's disease, while men may also further lower their risk by regularly eating apples, oranges and other sources rich in dietary components called flavonoids. The study was released recently and will be presented at the American Academy of Neurology's 63rd Annual Meeting in Honolulu April 9 to April 16, 2011.
Read more >>
Bookmark and Share

Circulating Tumor Cell Detection

Tiny gold particles can help doctors detect tumor cells circulating in the blood of patients with head and neck cancer, scientists at Emory and Georgia Tech have found.

The detection of circulating tumor cells (CTCs) is an emerging technique that can allow oncologists to monitor patients with cancer for metastasis or to evaluate the progress of their therapy. The gold particles, which are embedded with dyes allowing their detection by laser spectroscopy, could enhance this technique's specificity by reducing the number of false positives.

The results are published online in the journal Cancer Research.


Gold-based nanoparticles can detect circulating tumor cells.
One challenge with detecting CTCs is separating out signals from white blood cells, which are similar in size to tumor cells and can stick to the same antibodies normally used to identify tumor cells. Commercially available devices trap CTCs using antibody-coated magnetic beads, and technicians must stain the trapped cells with several antibodies to avoid falsely identifying white blood cells as tumor cells.

Emory and Georgia Tech scientists show that polymer-coated and dye-studded gold particles, directly associated with a growth factor peptide rather than an antibody, can detect circulating tumor cells in the blood of patients with head and neck cancer.

"The key technological advance here is our finding that polymer-coated gold nanoparticles that are conjugated with low molecular weight peptides such as EGF are much less sticky than particles conjugated to whole antibodies," says Shuming Nie, PhD, a professor in the Wallace H. Coulter Department of Biomedical Engineering at Georgia Tech and Emory University. "This effect has led to a major improvement in discriminating tumor cells from non-tumor cells in the blood".

The particles are associated with EGF (epidermal growth factor), whose receptor EGFR (epidermal growth factor receptor) is over-produced on the surfaces of several types of tumor cells.

Upon laser illumination, the particles display a sharp fingerprint-like pattern that is specific to the dye, because the gold enhances the signal coming from the dyes. This suggests that several types of nanoparticles could be combined to gain more information about the growth characteristics of the tumor cells. In addition, measuring CTC levels appears to be sensitive enough to distinguish patients with localized disease from those with metastatic disease.

"Nanoparticles could be instrumental in modifying the process so that circulating tumor cells can be detected without separating the tumor cells from normal blood cells," Nie says. "We've demonstrated that one tumor cell out of approximately one to ten million normal cells can be detected this way".

In collaboration with oncologists at Winship Cancer Institute, scientists used nanoparticles to test for CTCs in blood samples from 19 patients with head and neck cancer. Of these patients, 17 had positive signals for CTCs in their blood. The two with low signals were verified to have no circulating cells by a different technique.

"Eventhough the results have not been compared or validated with current CTC detection methods, our 'one-tube' SERS technology could be faster and lower in costs than other detection methods," says Dong Moon Shin, MD, professor of hematology and oncology and otolaryngology, associate director of academic development for Winship Cancer Institute and director of the Winship Cancer Institute Chemoprevention Program. "We need to validate this pilot study by continuing with larger groups of patients and comparing with other tests".


Posted by: Janet    Source


Did you know?
Tiny gold particles can help doctors detect tumor cells circulating in the blood of patients with head and neck cancer, scientists at Emory and Georgia Tech have found. The detection of circulating tumor cells (CTCs) is an emerging technique that can allow oncologists to monitor patients with cancer for metastasis or to evaluate the progress of their therapy. The gold particles, which are embedded with dyes allowing their detection by laser spectroscopy, could enhance this technique's specificity by reducing the number of false positives.
Read more >>
Bookmark and Share

Reduced levels of neurotransmitter in MS

Scientists at the University of Illinois at Chicago have demonstrated for the first time that damage to a particular area of the brain and a consequent reduction in noradrenaline are linked to multiple sclerosis.

The study is available online in the journal Brain.

The pathological processes in MS are not well understood, but an important contributor to its progression is the infiltration of white blood cells involved in immune defense through the blood-brain barrier.

Douglas Feinstein, research professor in anesthesiology at the UIC College of Medicine, and colleagues previously showed that the neurotransmitter noradrenaline plays an important role as an immunosuppressant in the brain, preventing inflammation and stress to neurons. Noradrenaline is also known to help to preserve the integrity of the blood-brain barrier.


Because the major source of noradrenaline is neurons in an area of the brain called the locus coeruleus (LC), the UIC scientists hypothesized that damage to the LC was responsible for lowered levels of noradrenaline in the brains of MS patients.

"There's a lot of evidence of damage to the LC in Alzheimer's and Parkinson's disease, but this is the first time that it has been demonstrated that there is stress involved to the neurons in the LC of MS patients, and that there is a reduction in brain noradrenaline levels," said Paul Polak, research specialist in the health sciences in anesthesiology and first author on the paper.

For the last 15 years, Feinstein and colleagues have been studying the importance of noradrenaline to inflammatory processes in the brain.

"We have all the models for studying this problem, so in some ways it was a small step to look at this question in MS," said Polak.

The scientists observed that LC damage and reduced levels of noradrenaline occur in a mouse model of MS and that similar changes could be found in the brains of MS patients.

The findings suggest that LC damage, accompanied by reduction in noradrenaline levels in the brain, appears to be a common feature of neurologic diseases, Polak said.

"There are many FDA-approved drugs that have been shown to raise levels of noradrenaline in the brain, and we think that this type of therapeutic intervention could benefit patients with MS and other neurodegenerative diseases, and should be investigated," he said.


Posted by: Daniel    Source


Did you know?
Scientists at the University of Illinois at Chicago have demonstrated for the first time that damage to a particular area of the brain and a consequent reduction in noradrenaline are linked to multiple sclerosis. The study is available online in the journal Brain The pathological processes in MS are not well understood, but an important contributor to its progression is the infiltration of white blood cells involved in immune defense through the blood-brain barrier.
Read more >>
Bookmark and Share

Compound blocks brain cell destruction in Parkinson's disease

Researchers from the Florida campus of The Scripps Research Institute have produced the first known compound to show significant effectiveness in protecting brain cells directly affected by Parkinson's disease, a progressive and fatal neurodegenerative disorder.

Even though the findings were in animal models of the disease, the effectiveness of the compound, combined with its potential to be taken orally, offers the tantalizing possibility of a useful future treatment for Parkinson's disease patients.

The results were published in two separate studies in the journal ACS Chemical Neuroscience.

"These studies present compelling data on the first oral, brain-penetrating inhibitor to show significant efficacy in preventing neurodegeneration in both mouse and rat models of Parkinson's disease," said team leader Philip LoGrasso, a professor in the Department of Molecular Therapeutics and senior director for drug discovery at Scripps Florida. "The compound offers one of the best opportunities we have for the development of an effective neuroprotective therapy".

The new small molecule, labeled SR-3306, is aimed at inhibiting a class of enzymes called c-Jun N-terminal kinases (JNK). Pronounced "junk," these enzymes have been shown to play an important role in neuron (nerve cell) survival. As such, they have become a highly viable target for drugs to treat neurodegenerative disorders such as Parkinson's disease.

"A drug like SR-3306 that prevents neurodegeneration would be a quantum leap in the clinical therapy of Parkinson's because all current therapies treat only the symptoms of the disease, not the underlying pathologies," LoGrasso said.

Patients with Parkinson's disease suffer from the loss of a group of neurons in the substantia nigra pars compacta (SNpc), part of the midbrain involved in motor control. These cells produce dopamine, a neurotransmitter that plays a key role in motor reflexes and cognition. The disease also affects projecting nerve fibers in the striatum, a part of the forebrain filled with cells that interact with dopamine.

Stopping the Progression of Neuron Destruction in Animal Models

The SR-3306 compound, which has been in development at Scripps Florida for several years, performed well in both cell culture and animal models. In cell culture, the compound showed greater than 90 percent protection against induced cell death of primary dopaminergic neurons, while in mouse models of induced neuron death, the compound showed protective levels of approximately 72 percent.

The researchers went one step further, testing the new compound in a rat model, which duplicates the physical symptoms often seen with the human disease: a pronounced and progressive loss of motor skills. The results showed SR-3306 provided a protection level of approximately 30 percent in the brain, a level that reduced the dysfunctional motor responses by nearly 90 percent.

"It was a surprise that level of neuroprotection reduced the behavioral impact so strongly," LoGrasso said, "but it's indicative of how it might perform in human patients. While SR-3306 doesn't represent a cure, it does appear to have the potential of stopping the progression of the disease".

The new studies are part of a $7.6 million multiyear grant awarded to LoGrasso in 2008 by the National Institute of Neurological Disorders and Stroke (NINDS). The grant will enable Scripps Research and potential partners to file an application for an investigational new drug (IND), the first step in the lengthy clinical trials process required by the U.S. Food and Drug Administration before a new drug can be brought to market.


Posted by: Daniel    Source


Did you know?
Researchers from the Florida campus of The Scripps Research Institute have produced the first known compound to show significant effectiveness in protecting brain cells directly affected by Parkinson's disease, a progressive and fatal neurodegenerative disorder. Eventhough the findings were in animal models of the disease, the effectiveness of the compound, combined with its potential to be taken orally, offers the tantalizing possibility of a potentially useful future treatment for Parkinson's disease patients.
Read more >>
Bookmark and Share

Gonorrhea acquires a piece of human DNA

If a human cell and a bacterial cell met at a speed-dating event, they would never be expected to exchange phone numbers, much less genetic material. In more scientific terms, a direct transfer of DNA has never been recorded from humans to bacteria.

Until now. Northwestern Medicine scientists have discovered the first evidence of a human DNA fragment in a bacterial genome, in this case Neisseria gonorrhoeae, the bacterium that causes gonorrhea. Further research showed the gene transfer may be a recent evolutionary event.

The discovery offers insight into evolution as well as gonorrhea's nimble ability to continually adapt and survive in its human hosts. Gonorrhea, which is transmitted through sexual contact, is one of the oldest recorded diseases and one of a few exclusive to humans.


"This has evolutionary significance because it shows you can take broad evolutionary steps when you're able to acquire these pieces of DNA," said study senior author Hank Seifert, professor of microbiology and immunology at Northwestern University Feinberg School of Medicine. "The bacterium is getting a genetic sequence from the very host it's infecting. That could have far reaching implications as far as how the bacteria can adapt to the host".

It's known that gene transfer occurs between different bacteria and even between bacteria and yeast cells. "But human DNA to a bacterium is a very large jump," said lead author Mark Anderson, a postdoctoral fellow in microbiology. "This bacterium had to overcome several obstacles in order to acquire this DNA sequence."

The paper will be published Feb. 14 in the online journal mBio.

The finding suggests gonorrhea's ability to acquire DNA from its human host may enable it to develop new and different strains of itself. "But whether this particular event has provided an advantage for the gonorrhea bacterium, we don't know yet," Seifert said.

Every year an estimated 700,000 people in the United States and 50 million worldwide acquire gonorrhea. While the disease is curable with antibiotics, only one drug is now recommended for therapy because the disease developed resistance to previously used antibiotic options over the past four decades.

Gonorrhea is an especially serious disease for women. If left untreated, it can lead to pelvic inflammatory disease, a painful condition that can cause sterility and ectopic pregnancy. In rare cases, men and women can develop a form of the disease that leaves the genital tract and enters the bloodstream, causing arthritis and endocarditis, an infection of the inner lining of the heart.

An ancient disease that sounds like gonorrhea is described in the Bible, noted Seifert, who has studied the disease for 28 years. Most of his research focuses on how the bacterium evades the human immune system by altering its appearance and modulating the action of white blood cells.

The gene transfer was discovered when the genomic sequences of several gonorrhea clinical isolates were determined at the Broad Institute in Cambridge, Mass. Three of the 14 isolates had a piece of DNA in which the sequence of DNA bases (A's, T's, C's and G's) was identical to an L1 DNA element found in humans.

In Seifert's Feinberg lab, Anderson sequenced the fragment to reconfirm it was indeed identical to the human one. He also showed that this human sequence is present in about 11 percent of the screened gonorrhea isolates.

Anderson also screened Neisseria meningitidis, the bacterium that causes meningitis, which is very closely related to the gonorrhea bacterium at the genetic level. There was no sign of the human fragment, suggesting the gene transfer is a recent evolutionary event.

"The next step is to figure out what this piece of DNA is doing," Seifert said.


Posted by: Mark    Source


Did you know?
If a human cell and a bacterial cell met at a speed-dating event, they would never be expected to exchange phone numbers, much less genetic material. In more scientific terms, a direct transfer of DNA has never been recorded from humans to bacteria. Until now. Northwestern Medicine scientists have discovered the first evidence of a human DNA fragment in a bacterial genome ? in this case, Neisseria gonorrhoeae, the bacterium that causes gonorrhea. Further research showed the gene transfer may be a recent evolutionary event.
Read more >>
Bookmark and Share