Conference papers vs journal publications: Which is the better publication route?

In the course of their research, academics often need to interact and exchange views with their colleagues to provide firmer ground for their inferences. Such meetings help them debate their research topic with like-minded participants and assimilate the information presented through audio-visual media, producing more conclusive findings. Seminars and colloquia are therefore an essential part of the growth of any research. Often, the proceedings of such meetings are recorded as a collection of the papers presented during the event.

A journal, on the other hand, publishes research work, either on the web or in print, after a rigorous review process and a long approval cycle. However, once your paper is published in a reputed journal, it reaches an audience you would otherwise never have had access to.

Why opt for conferences?

Conference proceedings have several advantages for a researcher. This is because conferences:

– Give a platform for interaction among research scholars who share a common interest.

– Have a faster review process and generate faster feedback.

– Are often characterized by short presentations, which convey the aim of the research clearly without consuming too much time.

– Include discussion sessions, which encourage the exchange of views and ideas on the presentations.

– Allow interaction of scholars from all over the world who are engaged in the same or allied research fields.

– Have a predictable, time-bound review schedule.

– Ensure that presentations are properly archived for reference at similar events held elsewhere on related research topics.

– Involve sponsors, who attract researchers with publishing credits and personal and professional benefits for attending the conference.

– Have high visibility and often leave a greater impact on the academic fraternity.

– Mainly focus on recent research and up-to-date academic endeavors, unlike a journal, which often takes a long time to finally publish a piece of research.

Demerits of a conference publication

On the flip side, conferences have the following drawbacks:

– The review process is often superficial or cursory; for instance, there is usually no second round of review.

– They have a low acceptance rate.

– The feedback from the research fraternity may be lukewarm compared to a publication in a journal.

– Economies of scale work against good quality publications because the publication is one of many expense heads for the organizers. Therefore, the production quality often leaves much to be desired.

Why opt for a journal publication?

A publication in a reputed journal presents the following advantages for the researcher:

– Research papers published in journals are thoroughly peer reviewed, often through multiple review phases.

– The quality of research published in a journal is of a high standard.

– Journal publications carry a deeper analysis of the research work.

– Useful feedback is received from the reviewers, which helps bring about substantive changes that improve the research analysis.

– Word and page limits are more generous in journals. This gives the researcher more scope to express his or her thoughts and interpretations.

– A journal gives authors the chance to revise their work based on the feedback and re-submit it for further review and publication.

– Conference papers are rarely considered the final word in publishing research. Often, conference papers can be expanded into journal papers and published in reputed journals with a high impact factor.

Demerits of journal publications

There are also a few demerits of journal publications. These include:

– The publication process is time-consuming.

– Owing to such delays, the research topic may become outdated.

– Selecting a journal is a difficult task. Sometimes, good research ends up published in a sub-standard journal.

Both these routes to publication have their pros and cons. It should also be noted that conference proceedings and journal publications are not mutually exclusive: one form of a research work might be published in conference proceedings while another, perhaps more developed, form is published in a journal. For a more diverse and in-depth research output, therefore, both conference proceedings and journal publications have a significant part to play.

Transgenic Approach for Value Addition

Transgene: A transgene is a gene or genetic material of interest that is transferred from one organism to another, either naturally or by artificial transformation methods.

Methods involved in transgenics:

Transformation techniques: Transformation methods are of two broad types: direct and indirect.

Some of the widely used direct transformation methods are microinjection, the gene gun (particle bombardment), and electroporation (physical methods); and the calcium phosphate method and liposome-mediated transfer (chemical methods).

Indirect transformation methods are also known as vector-mediated gene transfer methods. The major biological vectors involved are Agrobacterium tumefaciens and viruses (e.g., the cauliflower mosaic virus, CaMV).

Genome editing: As the name suggests, part of a genome is edited: a sequence is inserted, deleted, or replaced by another desired sequence with the help of restriction enzymes (also termed molecular scissors).

Value addition properties of the transgenic process:

  • It has ensured enhanced yields for the farming community, in quantitative terms, by introducing superior crop varieties that carry high-performance traits such as insect and pest resistance, herbicide resistance, resistance to other biotic stresses, and resistance to abiotic stresses such as drought, salinity, flooding, and temperature extremes.
  • This resistance has also led to an improvement in crop quality.
  • Transgenics have been a boon not only in agriculture but also in other fields like aquaculture, sericulture, and horticulture, through an increase in the productivity and quality of the resulting products.
  • Transgenics have opened up the world market for various commercial products obtained through genetic manipulation. For example, the Flavr Savr tomato, the first genetically engineered crop product to be marketed, was commercialized by the California-based company Calgene, which succeeded in introducing an antisense gene that interfered with production of the fruit-ripening enzyme polygalacturonase.
  • One of the landmark achievements of transgenic techniques was the introduction of gene therapy, which aims to treat patients who produce too little of a metabolically important protein or enzyme. The introduced gene alters the function of the patient’s genetic material in a productive way, specifically with respect to the disease.
  • Transgenic methodology has brought about revolutionary change in medical science through the introduction of various therapeutic methods: modern vaccination techniques, new and more efficient vaccines for hitherto untreated diseases, production of antibodies in various biological vectors (microorganisms as well as plants), production of monoclonal antibodies by hybridoma technology, and production of other important drugs used in treating genetic diseases.
  • The field of transgenics is considered important by researchers and scientists, giving them ample opportunity to create substantive and commercially important products to meet the demands of the ever-growing world population.

Despite all these advantages, the mounting cost of such products makes them less appealing to both the researchers and the customers who would buy them. Certain ethical issues related to genetically engineered food products are also limiting their market exposure and availability.

Importance of Statistical Review of manuscripts

Statistics: Statistics is a branch of mathematics that deals with the collection, analysis, interpretation, presentation, and organization of data. In simple terms, it deals with the philosophy, logic, and expression of data.

Who does the statistical review?

Statistical review is generally done by expert statisticians, or by authors and journal editors with statistical knowledge. It comprises statistical and methodological questions, put forward by the reviewer, that the author (or the journal editor) must answer.

Role of the statistical reviewers:

  • Statistical reviewers identify possible sources of statistical error in the manuscript, thereby increasing the statistical accuracy of the paper and ensuring quicker publication.
  • They perform all forms of statistical data checking: looking for missing data; verifying that correct statistical methods were chosen and applied appropriately; checking for errors such as a wrong level of significance in the data analysis; confirming that the statistical package and the version used are named; confirming that units of measurement are properly stated; ensuring that the tables and figures in the manuscript carry proper self-explanatory footnotes; and so on.
  • They ensure proper statistical presentation of data throughout the manuscript, and proper use of statistical language in the data presentation section.
  • The reviewer also checks whether the conclusions of the manuscript are justified by the presented data.
  • They also cross-check whether the discussion section is plausible in light of the results.

Significance of statistical review:

  • Major statistical errors in the data presentation section may lead to rejection of the research paper, so reviewing the statistical data and its presentation is of utmost importance for the author. The most frequent statistical problems in manuscripts occur in data interpretation and presentation, data analysis, and study design.
  • Sound statistics is the foundation of high-quality research that interprets quantitative studies.

Is self-plagiarism ethical?

Research papers and journals are the medium for spreading knowledge and newly evolved ideas. Innovative, original work is certainly more educative and admirable. Nevertheless, authors are often found reusing their old work, or extracts from their previously published papers, when writing a new research paper.

When questions are raised about this content reuse, authors claim that the material is their own work, that they can therefore reuse it as they wish, and that it cannot be termed plagiarism since they have not stolen ideas from any other author or source.

Because the conventional ethics of plagiarism do not apply to such reuse, it has largely been overlooked to date. While the debate over whether this reuse is ethical continues, publishers and journals have set certain guidelines for such work, terming it self-plagiarism.

What is self-plagiarism?

Self-plagiarism is a form of plagiarism in which a writer reuses his or her own previously published work, in portions or entirely, while creating a new paper. It can breach the publisher’s copyright on the published work when that work is reused in new papers without appropriate citation. Let us look more closely at the ethical aspects of self-plagiarism.

Self-plagiarism can be detected when:

a)  A published paper is republished elsewhere without the consent of its co-authors and publisher.

b)  A large study is published in small sections with the intention of increasing the number of publications.

c)  A previously written work, whether published or not, is reused in portions in new papers.

Although rules against self-plagiarism are rarely enforced as law, the practice reflects poorly on the author’s honesty. Moreover, journals and publishers are rejecting such copy-paste work, as they seek writing based on original research findings with proper citation of all references.

Nowadays, journals are also raising questions about the reuse of one’s own work. To avoid self-plagiarism, one should keep one’s work original; if it is necessary to include a portion of a previous work, it should be properly cited, with full references. I hope this article helps you detect prospective self-plagiarism before submitting your paper to a publication or journal.

BioConference Live 2014

The BioConference Live virtual neuroscience conference, conducted on March 19-20, 2014, was an online event set to unite the neuroscience community via live video webcasts and real-time networking. Manuscriptedit participated in this high-profile conference, which also saw the participation of President Barack Obama.

Researchers, postdocs, lab directors, and other medical professionals learnt about recent investments and the scientific foci of the BRAIN Initiative through a panel discussion with key leaders from diverse scientific and funding regulatory agencies. The BRAIN Initiative was part of a new Presidential focus aimed at revolutionizing our understanding of the human brain.

The neuroscience conference included topics such as Behavioral and Cognitive Neuroscience, Epigenetic Regulation, Genetics of Neurologic Diseases, Molecular Mechanisms, Neurologic Dysfunction in Human Diseases, and Nervous System Development. It also covered neurological diseases from lab to clinic, including Alzheimer’s disease, ALS, epilepsy, Huntington’s disease, multiple sclerosis, Parkinson’s disease, traumatic brain and spinal cord injury, and neuropsychiatric disorders.

In addition to topics on diseases, the conference covered emerging therapies, such as combinatorial therapies, immunomodulation, myelin repair, non-coding RNA, neurorobotics, neuroengineering, stem cells, and imaging technologies (in vitro and in vivo).

The intense two-day conference covered original research data, teaching presentations, broad overviews of new frontiers given by thought leaders in the field, and discussion forums. Attendees learnt new concepts, tools, and techniques that they could apply to research and diagnosis.

Analytical Study Design in Medical Research: Measures of risk and disease association

While designing any analytical study in medical research, a researcher should be aware of a few basic epidemiological terms required to measure disease risk and association. This article focuses on defining those terms. As the title suggests, there are two types of measurement: measures of risk and measures of association.

Measures of Risk

Risk is defined as the probability of an individual developing a condition or disease over a period of time.

Risk = chances of something happening / chances of all things happening

Odds = chances of something happening / chances of it not happening

Therefore, “Risk” is a proportion, while “Odds” is a ratio.
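
For example, if 20 of 100 people develop a condition, the risk is 20/100 = 0.20, while the odds are 20/(100 − 20) = 20/80 = 0.25: the same data yield a proportion in the first case and a ratio in the second.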

Incidence: Incidence is a measure of risk that describes the number of new cases of a condition arising over a specified period of time. Another term worth mentioning in this context is “incidence proportion”, defined as the number of new cases divided by the total population at risk (those who develop the condition plus those who do not) over a specified period of time.

For example, among 100 non-diseased persons initially at risk, 20 develop a disease/condition over a period of five years.

Incidence = 20 cases

Incidence proportion = 20 cases per 100 persons, i.e., 20%

Incidence rate = 20 cases developed in 100 persons over 5 years, i.e., 20/(100 × 5) = 4 per 100 person-years

Prevalence: Prevalence is the proportion of people having a condition at a specific point in time out of the total population studied. This is specifically called point prevalence. For example, on a certain date, five persons are found to have a condition among 100 people studied, giving a point prevalence of 5%. Two more terms need to be defined in this regard: period prevalence and lifetime prevalence (LTP). The former is the proportion of people having the disease during a certain period of time, say a month or a year, out of the total population studied over that period. LTP, on the other hand, is the proportion of people who have the disease at some point in their life out of the total population studied.

There is a very subtle difference between incidence and prevalence. Incidence is the frequency of a new event, while prevalence is the frequency of an existing event.

Cumulative Risk: Cumulative risk is the probability of developing a condition over a specified period of time.
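
Since these measures are simple ratios, they are easy to compute directly. Below is a minimal Python sketch, assuming the simplified setting of the worked example above (everyone is followed for the full period); the function names are illustrative, not from any statistics package.

```python
# Minimal sketch of the risk measures defined above. Function names are
# illustrative; inputs mirror the worked example in the text.

def incidence_proportion(new_cases: int, population_at_risk: int) -> float:
    """New cases divided by the population initially at risk."""
    return new_cases / population_at_risk

def incidence_rate(new_cases: int, population_at_risk: int, years: float) -> float:
    """New cases per person-year, assuming everyone is followed for the full period."""
    return new_cases / (population_at_risk * years)

def point_prevalence(existing_cases: int, population: int) -> float:
    """Existing cases at a point in time divided by the total population studied."""
    return existing_cases / population

# Worked example from the text: 20 of 100 persons develop the condition in 5 years.
print(incidence_proportion(20, 100))   # 0.2, i.e., 20%
print(incidence_rate(20, 100, 5))      # 0.04, i.e., 4 per 100 person-years
print(point_prevalence(5, 100))        # 0.05, i.e., 5% on the survey date
```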

Measures of Association

Association is defined as a statistical relationship between two or more variables.

For measuring the strength of a disease association, for etiological study and hypothesis testing, the following measures are important. The terms defined below are used to measure the association between exposure and disease.

Relative risk (RR): The relative risk is measured as a ratio of two risks.

For example, consider a population of 100 people comprising 50 men and 50 women, in which 20 of the men are infected with tuberculosis and 10 of the women develop the condition.

Risk in men: 20/50

Risk in women: 10/50

Therefore, the relative risk (RR) of developing tuberculosis in men compared with women is

RR = (20/50) : (10/50) = 2.0

i.e., men are at double the risk of developing tuberculosis compared with women.

Odds ratio (OR): The odds ratio is measured as the ratio of two odds (odds are defined above).

Continuing the previous example of tuberculosis in men and women in a total population of 100:

Odds in men: 20/30

Odds in women: 10/40

Odds ratio (OR) = (20/30) : (10/40) = 2.67

Therefore, the odds of men getting infected with tuberculosis are 2.67 times as high as the odds of women developing tuberculosis.
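
To make the RR/OR arithmetic concrete, here is a minimal Python sketch that computes both measures from group counts; it follows the tuberculosis example above, treating sex as the "exposure", and the function names are illustrative rather than taken from any statistics package.

```python
# Minimal sketch of relative risk and odds ratio from group counts.
# Function names are illustrative, not from any statistics package.

def relative_risk(cases_a: int, total_a: int, cases_b: int, total_b: int) -> float:
    """Ratio of the risk (cases/total) in group A to the risk in group B."""
    return (cases_a / total_a) / (cases_b / total_b)

def odds_ratio(cases_a: int, total_a: int, cases_b: int, total_b: int) -> float:
    """Ratio of the odds (cases/non-cases) in group A to the odds in group B."""
    odds_a = cases_a / (total_a - cases_a)
    odds_b = cases_b / (total_b - cases_b)
    return odds_a / odds_b

# Tuberculosis example from the text: 20 of 50 men and 10 of 50 women infected.
print(relative_risk(20, 50, 10, 50))         # 2.0
print(round(odds_ratio(20, 50, 10, 50), 2))  # 2.67
```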

To measure the impact of a disease association on public health, the following measurements are important. All of them assume that the association between exposure and disease is causal.

Attributable risk (AR): The amount of disease attributable to the exposure, i.e., the difference between the incidence of disease in the exposed group (Ie) and the incidence of disease in the unexposed group (Iue).

AR = Ie – Iue

Attributable (risk) fraction (ARF): ARF is the proportion of disease in the exposed population that can be attributed to the exposure.

ARF = (Ie – Iue) / Ie

Population attributable risk (PAR): The incidence of disease in the total population (Ip) that can be attributed to the exposure.

PAR = Ip – Iue

Population attributable (risk) fraction (PARF): PARF is the proportion of disease in the total population that can be attributed to the exposure.

PARF = (Ip – Iue) / Ip
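
As a quick sanity check on these formulas, here is a minimal Python sketch computing all four impact measures. It reuses the tuberculosis example, treating being male as the "exposure", so Ie = 20/50 = 0.4, Iue = 10/50 = 0.2, and Ip = 30/100 = 0.3; the function names are ours, not from any standard package.

```python
# Minimal sketch of the four impact measures defined above.
# Incidences are proportions (e.g., 0.4 = 40 cases per 100 people).

def attributable_risk(i_exposed: float, i_unexposed: float) -> float:
    """AR = Ie - Iue: excess incidence in the exposed group due to the exposure."""
    return i_exposed - i_unexposed

def attributable_risk_fraction(i_exposed: float, i_unexposed: float) -> float:
    """ARF = (Ie - Iue) / Ie: share of disease among the exposed due to the exposure."""
    return (i_exposed - i_unexposed) / i_exposed

def population_attributable_risk(i_population: float, i_unexposed: float) -> float:
    """PAR = Ip - Iue: excess incidence in the total population due to the exposure."""
    return i_population - i_unexposed

def population_attributable_fraction(i_population: float, i_unexposed: float) -> float:
    """PARF = (Ip - Iue) / Ip: share of all disease due to the exposure."""
    return (i_population - i_unexposed) / i_population

# Tuberculosis example: Ie = 0.4 (men), Iue = 0.2 (women), Ip = 0.3 (overall).
print(round(attributable_risk(0.4, 0.2), 2))                # 0.2
print(round(attributable_risk_fraction(0.4, 0.2), 2))       # 0.5
print(round(population_attributable_risk(0.3, 0.2), 2))     # 0.1
print(round(population_attributable_fraction(0.3, 0.2), 2)) # 0.33
```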

 

Bias and Confounding Factors

In an epidemiological study, when an association is found between exposure and disease, it is very important first to check whether the association is real. One needs to be cautious: the association may have occurred by chance owing to an inadequate sample size, or because of some kind of bias in the design or measurement.

Bias is a systematic error in design, conduct, or analysis that results in a spurious association of exposure with disease. Three types of bias are possible: (i) selection bias, (ii) information bias, and (iii) confounding.

Selection bias occurs when the participants selected for one group differ systematically, in ways that affect the outcome, from those selected for the other groups. Information bias occurs when information is collected differently from the two groups.

Confounding occurs when the observed relationship between exposure and disease differs from the truth because of the influence of a third variable that has not been considered in the analysis. For example, a person suffers from headaches when under stress; however, the person also eats a lot of junk food, especially when under stress. It is therefore hard to say what actually causes the headache: lack of sleep, anxiety, or indigestion from the junk food. All such variables should be adjusted for before associating mental stress with headache.

 


Antibiotic Resistance: Cause and Mechanism

Scope of the antibiotic resistance problem:

Antibacterial-resistant strains and species, occasionally referred to as “superbugs”, now contribute to the emergence of diseases that were well controlled a few decades ago. In a recent report, “Antibiotic Resistance Threats in the United States, 2013,” the CDC calls this a critical health threat for the country. According to the report, more than 2 million people in the United States acquire antibiotic-resistant infections each year, and at least 23,000 of them die annually. This is the situation in a country where drug regulations are tough and stringent and physicians are relatively careful in prescribing medications. Imagine the situation in developing countries like India, where antibiotics are available over the counter without a medical prescription and 80-90% of the population use antibiotics without consulting a physician; many are not even aware of how to take a proper course of antibiotics. This is a huge health challenge that will pose an even more serious threat in the coming years in treating antibiotic-resistant infections. Recently, at a clinic in Mumbai, some 160 of the 566 patients who tested positive for TB between March and September carried strains resistant to the most powerful TB medicines. In fact, more than one-quarter of the people diagnosed with tuberculosis there have a strain that does not respond to the main treatment for the disease. According to the WHO and data from the Indian government, India has about 100,000 of the world’s 650,000 people with multi-drug-resistant TB.

Factors contributing to antibiotic resistance:

Inappropriate treatment and misuse of antibiotics have contributed the most to the emergence of antibacterial-resistant bacteria. Many antibiotics are frequently prescribed for diseases that do not respond to antibacterial therapy or that are likely to resolve without any treatment. Often, incorrect or suboptimal doses of antibiotics are prescribed for bacterial infections. Self-prescription of antibiotics is another example of misuse. The most common forms of antibiotic misuse, however, include excessive use of prophylactic antibiotics by travelers and the failure of medical professionals to prescribe the correct dosage based on the patient’s weight and history of prior use. Other forms of misuse comprise failure to complete the prescribed course, incorrect dosage, or failure to rest sufficiently for recovery. Other major contributors to antibiotic resistance are the excessive use of antibiotics in animal husbandry and the food industry, and frequent hospitalization for minor medical issues, where the most resistant strains get a chance to circulate in the community.

To conclude, humans contribute the most to the development and spread of drug resistance by: 1) not using the right drug for a particular infection; 2) not completing the prescribed course of antibiotics; or 3) using antibiotics when they are not needed.

In addition to the growing threat of antibiotic-resistant bugs, there may be another valid reason doctors should desist from freely prescribing antibiotics. According to a recent paper published online in Science Translational Medicine, certain antibiotics cause mammalian mitochondria to fail, which in turn leads to tissue damage.

Mechanism of antibiotic resistance:

Antibiotic resistance is a condition where bacteria develop insensitivity to the drugs (antibiotics) that generally cause growth inhibition or cell death at a given concentration.

Resistance can be categorized as:

a) Intrinsic or natural resistance: Naturally occurring antibiotic resistance is very common; a bacterium may simply be inherently resistant to an antibiotic. For example, Streptomyces species possess genes conferring resistance to their own antibiotics, and some bacteria naturally lack the target sites for certain drugs, have low permeability to them, or lack the transport systems required for antibiotic uptake. The genes that confer this resistance are known as the environmental resistome, and they can be transferred from non-disease-causing to disease-causing bacteria, leading to clinically significant antibiotic resistance.

b) Acquired resistance: Here, a naturally susceptible microorganism acquires ways of escaping the drug’s effects. Bacteria can develop resistance to antibiotics through mutations in chromosomal genes or through mobile genetic elements (e.g., plasmids and transposons) carrying antibiotic resistance genes.

The two major mechanisms of how antibiotic resistance is acquired are:

Genetic resistance: It occurs via chromosomal mutations or acquisition of antibiotic resistance genes on plasmids or transposons.

Phenotypic resistance: Phenotypic resistance can be acquired without any genetic alteration. It is mostly achieved through changes in the bacterial physiological state: bacteria can become non-susceptible to antibiotics when not growing, such as in the stationary phase, in biofilms, as persister cells, or in a dormant state. Examples: salicylate-induced resistance in E. coli, staphylococci, and M. tuberculosis.

Within the genetic resistance category, the following are the five major mechanisms of antibiotic resistance arising from chromosomal mutations:

1. Reduced permeability or uptake (e.g. outer membrane porin mutation in Neisseria gonorrhoeae)

2. Enhanced efflux (membrane-bound proteins extrude antibiotics from the bacterial cell, e.g., drug efflux in Streptococcus pyogenes and Streptococcus pneumoniae)

3. Enzymatic inactivation (beta-lactamases cleave beta-lactam antibiotics and cause resistance)

4. Alteration or overexpression of the drug target (e.g., resistance to rifampin and vancomycin)

5. Loss of enzymes involved in drug activation (as in isoniazid resistance via KatG and pyrazinamide resistance via PncA)

Examples of resistance genes transferred on plasmids are the sulfa drug resistance genes and the streptomycin resistance genes strA and strB, while transfer of resistance genes through transposons occurs via conjugative transposons in Salmonella and Vibrio cholerae.

In the next post, I will discuss a few important examples of antibiotic resistance in clinically relevant microbes.

Antibiotics: Wonder drugs or a threat to public health?

What are antibiotics?

Antibiotics, also known as antibacterials, are a category of medications that kill bacteria or slow their growth. Penicillin, the first antibiotic, was discovered by Sir Alexander Fleming in 1928, but it was not until the early 1940s that its true potential was recognized and it came into widespread use. The term antibiotic was first used by Selman Waksman in 1942. In the early days, antibiotics were often referred to as “wonder drugs” because they cured several bacterial diseases that had once been fatal. With antibiotic use, the number of deaths caused by bacterial infections like meningitis, pneumonia, tuberculosis, and scarlet fever fell drastically. The discovery of antibiotics has advanced human development in a highly significant way: other than vaccines, few medical discoveries have had such a huge impact on healthcare delivery. Major complicated surgeries, transplants, advances in neonatal medicine, and advances in chemotherapy for cancer patients would not be possible without antibiotics.

Antibiotics classification:

Antibiotics are broadly classified based on their mechanism of action, structure, source or origin, or biological activity. With recent advances in medicinal chemistry, most antibiotics available nowadays are semisynthetic derivatives of natural compounds (penicillins, cephalosporins, and ampicillin). A few antibiotics, such as the aminoglycosides (streptomycin, gentamicin, and neomycin), are isolated from living organisms, while many others, such as the sulfonamides and quinolones (e.g., moxifloxacin and norfloxacin), are chemically synthesized. Based on their biological activity, antibiotics are classified as bactericidal agents (which kill bacteria) and bacteriostatic agents (which slow down or impede bacterial growth). Microorganisms tend to develop resistance faster to natural antimicrobials, since they have been pre-exposed to these compounds in nature; semisynthetic drugs were therefore developed for increased efficacy and lower toxicity. Synthetic drugs have the added advantage that bacteria are not exposed to them until they are released systemically, and they are designed for even greater effectiveness with decreased toxicity.

Antibiotics are also classified by their range of effectiveness. Broad-spectrum drugs are effective against many types of microbes (gram-positive and gram-negative) and tend to have higher toxicity to the host. Narrow-spectrum drugs are effective against a limited group of microbes (either gram-positive or gram-negative) and exhibit lower toxicity to the host. Based on chemical structure, antibiotics fall into categories such as β-lactams and aminoglycosides. All the above-mentioned classes are further divided according to their targets or mode of action in the bacterium. The following are the five important antibiotic targets in bacteria:

1. Inhibitors of cell wall synthesis (-cillins)

2. Inhibitors of protein synthesis (-mycins)

3. Inhibitors of membrane function (Polymyxin)

4. Anti-metabolites (Sulfa drugs)

5. Inhibitors of nucleic acid synthesis (Nalidixic acid, Rifampicin)

The deluge of antibiotic-resistant bacteria:

“The first rule of antibiotics is try not to use them, and the second rule is try not to use too many of them,” wrote Paul L. Marino. After an era of plentiful antibiotics, the present situation is alarming owing to the ever-increasing number of antibiotic-resistant strains. In the early years, new antibiotics were developed faster than bacteria developed resistance to them, but the bugs have now caught up. In the 1950s and 60s, many new classes of antibiotics were discovered; in the 1980s and 1990s, however, scientists only managed to make improvements within existing classes.

The emerging resistance of bacteria to antibacterial drugs has become a continuous threat to human health. Bacterial resistance to penicillin was observed within two years of its introduction in the mid-1940s. Rapidly emerging resistance to ciprofloxacin and various anti-tuberculosis drugs indicates that it is a microbe’s world and that microbes are ready to adapt. Since microbes congregate in large numbers to establish infection, reproduce rapidly, and mutate efficiently, developing resistance is not a matter of “if” but of “when”. To overcome any assault, bacteria possess efficient defense systems within their DNA, on chromosomes or on extrachromosomal elements called plasmids. Bacteria have the advantage that these plasmids, carrying resistance genes, can easily shuttle between bacterial cells.

No longer limited to hospitals, antibiotic resistance in Neisseria gonorrhoeae and Streptococcus pneumoniae is becoming a household and community phenomenon. The use of surface antibacterials in common households, self-medication, and unregulated sales of antibiotics in many countries further aggravate the problem. According to a CDC report, by the end of the 20th century approximately 30% of S. pneumoniae isolates (the causative agent of meningitis, otitis media, and pneumonia) were no longer sensitive to penicillin. Similarly, treatment failures were observed in patients because of resistant strains of Shigella, Salmonella typhi, Staphylococcus, Mycobacterium tuberculosis, Klebsiella pneumoniae, Clostridium difficile, and S. pneumoniae. Drug-resistant bacteria can be acquired in community settings like daycares, schools, and other crowded places. Other risk factors are antibiotic use and consumption of food products treated with antibiotics; increased use of quinolones in poultry and farm animals has been associated with an increased prevalence of human infection with quinolone-resistant Salmonella and Campylobacter. Besides the established pathogens, the relatively recent appearance of opportunistic organisms that are intrinsically resistant to many drugs is making matters worse. With a larger number of immunocompromised patients, these organisms have become “specialized” pathogens, typically attacking only the most vulnerable patients; examples are the enterococci, the coagulase-negative staphylococci, Pseudomonas aeruginosa, and Acinetobacter baumannii. It is therefore high time for medical professionals to think and act to reverse this trend of antibiotic resistance by creating awareness in communities about the proper use of antibiotics and by discouraging self-medication. In the next series, I will discuss the factors responsible for antibiotic resistance and its detailed mechanism.

 

Pharmacogenomics: A study of personalized drug therapy

With increasing technological advancement and research progress, modern medicine has found cures for several diseases that were considered incurable a few decades ago, e.g., cardiovascular diseases, various cancers, tuberculosis, malaria, and other infectious diseases. However, to date no single drug has been shown to be 100% efficacious in treating a given condition without exhibiting adverse effects. It is now well recognized that patients respond differently to the same drug treatment for a similar disease. With a particular drug, desirable therapeutic effects may be obtained in some patients, whereas others show a modest response or none at all. In addition, many patients experience adverse effects ranging from mild to severe and life-threatening. Studies have shown that, at a similar dose, the plasma concentration of a drug can vary up to 600-fold between two individuals of the same weight. Such inter-individual variation in drug response is a consequence of complex interactions between genetic and environmental factors. Genetic factors are known to account for approximately 15-30% of inter-individual variability in drug disposition and response, but for certain drugs they can account for up to 95% of the variation. For the majority of drugs, these differences are largely ascribed to polymorphic genes encoding drug-metabolizing enzymes, receptors, or transporters. These polymorphic genes mainly influence key pharmacokinetic characteristics of drug metabolism: absorption, distribution, metabolism, and elimination.

Origin of pharmacogenomics:

The first reported inherited difference in response to a foreign chemical (xenobiotic) was the inability to taste phenylthiocarbamide. Another example showing that drug response is determined by genetic factors that can alter the pharmacokinetics and pharmacodynamics of medications emerged in the late 1950s, when an inherited deficiency of glucose-6-phosphate dehydrogenase was shown to cause severe hemolysis in some patients exposed to the antimalarial drug primaquine. This discovery explained why hemolysis was reported mainly in African-Americans, among whom this deficiency is common, and rarely in Caucasians. Other established instances of inter-individual variation, in response to suxamethonium (succinylcholine), isoniazid, and debrisoquine, were also linked to genetics. The discoveries that prolonged paralysis following administration of succinylcholine results from a variant of the butyrylcholinesterase enzyme, and that the peripheral neuropathy occurring in many patients treated with the antituberculosis drug isoniazid is an outcome of genetic diversity in the enzyme N-acetyltransferase 2 (NAT2), are excellent examples of “classical” pharmacogenetic traits arising from altered amino acid sequences.

These observations of highly variable drug response, which began in the early 1950s, led to the beginning of a new scientific discipline known as pharmacogenetics. Vogel was the first to use the term pharmacogenetics, in 1959, but it was not until 1962, in a book by Kalow, that pharmacogenetics was defined as the study of heredity and the response to drugs.

Pharmacogenomics in new era:

The term pharmacogenomics was later introduced to reflect the transition from genetics to genomics and the use of genome-wide approaches to identify the genes that contribute to a specific disease or drug response. The terms pharmacogenomics and pharmacogenetics are often used interchangeably. Pharmacogenomics is an emerging discipline that aims to relate genetic differences in drug disposition or drug targets to drug response. With the availability of more sophisticated molecular tools for detecting genetic polymorphisms, and with advances in bioinformatics and functional genomics, pharmacogenomic studies are generating data used to identify the genes responsible for specific diseases and drug responses. Emerging data on drug-metabolizing genes from various human genome projects are rapidly being elucidated and translated into more rational drug therapy, moving toward a personalized-medicine approach. Many physicians are now reconsidering whether a “one drug for all” approach is ideal when prescribing medicines for the same condition in different individuals. Various genotype-phenotype association studies have now been reported for diseases in which the relevant drug-metabolizing genes and receptors are highly polymorphic. In the last decade, the FDA has increasingly acknowledged the importance of biomarkers and has formulated new recommendations on pharmacogenomic diagnostic tests and data submission.

Applications and challenges of Pharmacogenomics:

Personalized medicine is at times deemed a thing of the future; however, it is already making a marked difference in patient treatment, especially in various cancers. Molecular or genetic testing is now available for patients with colon cancer, multiple myeloma, leukemia, prostate cancer, breast cancer, hepatitis C, and cardiovascular diseases; their genetic profile can be identified and used to predict whether they are likely to benefit from new drug treatments while minimizing adverse drug reactions. Recently, the MD Anderson Cancer Center created an “Institute for Personalized Therapy” specifically to implement personalized cancer therapy, improving patient outcomes and reducing treatment costs.

Personalized medicine promises many medical innovations, but its implementation is associated with several challenges regarding public policy and social and ethical issues. Individuals may decline to participate in genetic research because they feel it might breach their right to privacy and confidentiality. To tackle such concerns, the Genetic Information Nondiscrimination Act of 2008 was designed to shield individuals from genetic discrimination. Other outstanding concerns include ownership of genetic material, medical record privacy, clinical trial ethics, and patients’ knowledge of the consequences of storing genetic material and phenotypic data. These concerns must be addressed to the satisfaction of all stakeholders, especially patients, by reaching a consensus on how to bring pharmacogenomics applications into clinical practice.

Interdisciplinary research – Nobel Prize in Chemistry awarded to two biologists

Modern scientific research does not confine itself to any restricted boundary; nowadays, it is all about interdisciplinary research. In 2012, the Nobel Prize in Chemistry (http://www.nobelprize.org/nobel_prizes/chemistry/) was awarded to two eminent biologists, Prof. Robert J. Lefkowitz and Prof. Brian Kobilka, for their crucial contribution to unveiling the signalling mechanism of G protein-coupled receptors (GPCRs). It represents the lifetime work of both scientists. Dr. Lefkowitz, an investigator at the Howard Hughes Medical Institute (HHMI) at Duke University, is also James B. Duke Professor of Medicine and of Biochemistry at Duke University Medical Center, Durham, NC, USA. Dr. Kobilka, earlier a postdoctoral fellow in Dr. Lefkowitz’s laboratory, is currently Professor of Molecular and Cellular Physiology at Stanford University School of Medicine, Stanford, CA, USA.

Transmembrane signalling of one GPCR “caught in action” by X-ray crystallography

GTP (guanosine triphosphate)-binding proteins (G-proteins) act as molecular switches, transmitting signals from stimuli outside the cell to the inside. To do this, a G-protein needs to be activated, and that is where GPCRs play the most important role. They sit in cell membranes throughout the body. GPCRs, also known as seven-transmembrane domain proteins (they pass through the cell membrane seven times), detect external signals like odor, light, and flavor, as well as signals within the body such as hormones and neurotransmitters.1 Once a GPCR detects a signal, the signal is transduced along a pathway that finally activates the G-protein; in response, the activated G-protein triggers various cellular processes. Binding of a signalling molecule, or ligand, to the GPCR causes conformational changes in the GPCR structure. Through 20 long years of extensive research, Dr. Lefkowitz and Dr. Kobilka not only identified some 800 members of the GPCR family in humans but also caught in action, with the help of high-resolution X-ray crystallography, how these receptor proteins actually carry out signal transduction. The crystal structure of the ß2-adrenergic receptor (ß2AR), a member of the human GPCR family, was reported by Dr. Kobilka and his colleagues in 2007.2 The hormones adrenaline and noradrenaline are known to activate ß2AR, and activated ß2AR triggers biochemical processes that speed up the heart and open the airways as part of the body’s fight-or-flight response; ß2AR is also a key target of anti-asthma drugs. One of the major breakthroughs came in 2011, when Dr. Kobilka and his co-workers unveiled for the first time the exact moment of transmembrane signalling by a GPCR. They reported the crystal structure of “the active state ternary complex composed of agonist-occupied monomeric ß2AR and nucleotide-free Gs heterotrimer”,3 revealing a major conformational change in ß2AR during signal transduction.

Now, what is so special about GPCRs? These proteins belong to one of the largest families of human proteins. GPCRs are involved in most physiological activities and hence are the targets of a large number of drugs. Determining the molecular structures of this class of receptors not only helps researchers understand the actual mechanisms of different cellular processes but also helps them design life-saving and more effective drugs. In a nutshell, this scientific breakthrough was possible because of the involvement of experts from different areas of science: chemistry, biochemistry, molecular and cellular biology, structural biology, cardiology, and crystallography.

 

References

 

  1. Lefkowitz, R. J. Seven transmembrane receptors: something old, something new. Acta Physiol. (Oxf.) 190, 9–19 (2007).
  2. Rasmussen, S. G. et al. Crystal structure of the human β2-adrenergic G-protein-coupled receptor. Nature 450, 383–387 (2007).
  3. Rasmussen, S. G. et al. Crystal structure of the β2-adrenergic receptor–Gs protein complex. Nature 477, 549–557 (2011).