Scientific Journals: The Knowledge Storehouse

Scientific journals date back to 1665, when the first periodicals devoted to publishing research results appeared. A scientific journal periodically publishes data and findings on recent breakthroughs in science.

Who benefits from scientific journals?

At present, scientific journals and the articles published in them enjoy widespread acceptance, which magnifies the importance of the research brought to light in such publications. Scientific journals are of great importance for academicians, researchers, and students of science and allied fields, and they have a profound impact on the overall educational system.

Advantages derived from scientific journals

  • Scientific journals promote and develop active learning skills among students and researchers. In fact, current research shows that reading journal articles provides an impetus for deeper thinking.
  • As you browse research articles in journals, you will notice that the findings are well organized and the overall conclusions are backed by evidence. Scientific articles carry the research-oriented analyses and findings of researchers as well as students.
  • The research papers tend to keep pace with recent developments in the relevant field.
  • Researchers or students can derive valuable information for their own research area because they come across timely updates through these publications.
  • Scientific journals widen the scope for exploring one’s own research subject.
  • They help readers gain in-depth knowledge, especially through citation of case studies that can act as a research base. This encourages a thorough analysis and often leads to the formulation of novel hypotheses.
  • Even if you are engaged in research toward the submission of your doctoral thesis, you can benefit from valuable feedback if you publish papers in relevant scientific journals.
  • It is possible to search for and access the latest research topics easily from scientific publications.
  • Academic credentials of researchers receive a major boost from published papers in scientific journals, which stands them in good stead for their career objectives.
  • Scientific journals provide a platform for research scholars to express and pen down their research ideas at length.
  • Scientific journals represent a varied spectrum because each journal represents a specific stream of research. Such publications bridge the gap between articles and books by publishing the research of different authors, thus creating a single interactive platform.

BioArt from a Bio-artist’s Perception

BioArt is broadly defined as a live art form that involves micro-organisms; living, semi-living, or assisted-living tissues; tissue-cultured cells or tissues; transgenic tissues; biological life processes; dead plants, animals, or even insects; body fluids or serum; and other living organisms.

Bio-artists combine the above-mentioned elements with the tools and techniques of science. BioArt often raises ethical questions related to the subject of the art.

History of BioArt

The term “BioArt” was coined by Eduardo Kac in 1997 in the context of his famous artwork Time Capsule.

Alexander Fleming was one of the early bio-artists. In fact, Fleming is arguably regarded as the father of BioArt because he was among the first scientists to work with micro-organisms (especially bacteria) that could be differentiated by color when grown on a petri plate.

Scientific Domains of BioArt

The domains of BioArt and image making draw on descriptions garnered through different scientific methods such as genomic mapping, electrocardiography (ECG), electroencephalography (EEG), magnetic resonance imaging (MRI), electrophoretic patterns, polymerase chain reaction (PCR) techniques, protein synthesis and visualization, and phenotypic or genotypic variations.

Advantages of BioArt

  • Artists have succeeded in making BioArt a platform for sharing the beauty of the research field and related innovative scientific ideas.
  • This type of biological art captures the essence of nature.
  • Although BioArt is a recent art form, it has encouraged a healthy debate among scholars, media, and the laity.
  • BioArt can also be used to represent complex scientific research in simple forms such as the visual or performing arts.
  • BioArt is a fusion of art and science because it is concerned with the presentation of complex life processes as well as vital interactions experienced in the environment.
  • Besides the scientific arena, BioArt has carved a niche in literature, psychology, and allied disciplines.
  • BioArt has led to the introduction of a sub-field called Biocouture, which is applied in both the fashion and textile industries. Biocouture has also reinvented the concepts of bio-design and the presentation of art forms.

Dark Side of BioArt

  • A major concern regarding BioArt is the disposal of living tissues that are included in the art process.
  • Without laboratory equipment and basic scientific expertise, creating BioArt is a distant dream.
  • Many bio-artists transform their own body into a work of art. One striking example is the work of Stelarc: in his “Ear on Arm,” he had a live-tissue construct in the shape of an ear surgically implanted into his left arm.
  • In some instances, the genetic constitution of an organism has been manipulated merely to produce an effective piece of BioArt, purely for entertainment and pleasure.
  • Many research scholars consider BioArt to be an unethical practice.

How does the publication cycle work?

What is the publication cycle?

The publication cycle is an inseparable and critical aspect that every researcher or writer needs to understand. This is because the publication cycle gives a tangible form to a theoretical concept, an idea, or an expression of writing talent. To use a commercial term, it is much like an assembly line where an idea passes through various inter-related processes and iterations before it develops into its final published form.

Content and medium: Two determinants of the publication cycle

The publication cycle differs based on two factors: the nature of the content and the medium of publication. A piece of writing can belong to any of a myriad of topics and publication media. Some writers present their ideas in the form of research articles in, inter alia, journals, dissertations, conference papers, and scholarly books. Conversely, the output of other writers might take the form of informal pieces that appear in magazines catering to the general reader.

Apart from the nature of content, publications also differ in the medium selected for publication. Unlike most of the 20th century, publications are no longer limited to the print medium. In fact, the digital revolution and advent of the Internet have given an entirely new dimension to publishing with the popularity of articles and even books published on the web and in the electronic medium. Therefore, one needs to understand that the publication cycle, or the intervening processes for an idea to reach the reader, is determined by several factors. These factors determine the processes and the time cycle for a writing to be published in its final form.

Electronic publications include two categories. The first is the category of online publications on the World Wide Web. These include personal web pages of the author, individual blogs, online videos or presentations, as well as online research journals or periodicals. The second category includes electronic books, often referred to as e-books, which are bought and sold in the market, but the reader can access them only by using software on a digital device or a personal computer.

Conversely, the more conventional publications in print include research papers or reports published in journals, magazines, and books.

What does the publication cycle involve?

The publication cycle starts with the generation of an idea by the author or writer. This first step toward publication is basically an individualistic approach in the sense that it is a creative process and not a time-bound phenomenon.

After an idea takes root in the writer’s mind, the next step is to undertake the research that will help develop that idea into a well-constructed piece of writing. In this step, the authors search for existing literature on the relevant subject and identify the lacunae in such writing. This helps them make a useful contribution to their area of research.

The research work is followed by an informal communication among the authors (in case of multiple authors) that includes regular conversations on the phone and meetings for discussion of their research output. This stage provides a common platform for different authors to share ideas and views on a particular topic or idea of research.

The next major step involves reporting the research. This can be an informal approach adopted by the author or authors to share their research on their individual blogs or web pages, or a formal approach that includes white papers, reports such as lab or research reports, and presentations at conferences and colloquia.

The next step is to report the findings as a publication in journals and/or magazines. Such publications provide a platform for popularization of the authors’ work, or to bring the research to the notice of a wider readership.

The culminating point of the publication cycle of a research idea is publication in a book or encyclopedia. This is the most formalized medium for publishing research work and is recognized as the ultimate achievement for a researcher.

Each stage of the publication cycle is relevant because it constitutes a step in the ladder toward the final form of a research idea. Considered holistically, an understanding of the publication cycle facilitates the development of an efficient strategy for publication of a research work in an organized manner.

Conference papers vs journal publications: Which is the better publication route?

In the course of their research, academicians often need to interact and exchange views with their colleagues to provide firmer ground for their inferences. Such meetings help them debate their research topic with other like-minded participants and then assimilate the information presented through audio-visual media to produce a more conclusive finding. Seminars and colloquia are therefore an essential part of the growth of any research. Often the proceedings of such meetings are recorded as a collection of the papers presented during the event.

On the other hand, a journal publishes research work, either on the web or as printed copies, after a rigorous process of review and a long approval cycle. However, once published in a reputed journal, your paper has an audience that you would otherwise have never had access to.

Why opt for conferences?

Conference proceedings have several advantages for a researcher. This is because conferences:

– Give a platform for interaction among research scholars who share a common interest.

– Have a faster review process and generate a faster feedback.

– Are often characterized by short presentations, so they manage to present the aim of the research clearly without consuming too much time.

– Include discussion sessions, which encourage the exchange of views and ideas on the presentations.

– Allow interaction of scholars from all over the world who are engaged in the same or allied research fields.

– Have a predictable and time-bound review time.

– Ensure that presentations are properly archived for reference at similar events held elsewhere on related research topics.

– Involve sponsors, who attract researchers with publishing credits and personal and professional benefits for attending the conference.

– Have high visibility and often leave a greater impact on the academic fraternity.

– Mainly focus on recent research and up-to-date academic endeavors, unlike a journal, which often takes a long time to finally publish a piece of research.

Demerits of a conference publication

On the flip side, conferences have the following drawbacks:

– The review process is often superficial or cursory; for instance, there is typically no second round of review.

– They have a low acceptance rate.

– The feedback from the research fraternity may be lukewarm compared to a publication in a journal.

– Economies of scale work against good quality publications because the publication is one of many expense heads for the organizers. Therefore, the production quality often leaves much to be desired.

Why opt for a journal publication?

A publication in a reputed journal presents the following advantages for the researcher:

– Research papers that are published in journals are thoroughly peer reviewed, including multiple review phases.

– The quality of research published in a journal is of a high standard.

– Journal publications carry deep analysis of a research work.

– Useful feedback is received from the reviewers, which helps bring about substantive changes in the paper and improve the research analysis.

– Word and page limits are more generous in journals, giving the researcher more scope to express his or her thoughts and interpretations.

– A journal gives a chance to authors to revise their work based on the feedback and then re-submit it for further review and publication.

– Conference papers are never considered the ultimate publication of a research work. Often, conference papers can be expanded into journal papers and published in reputed journals with a high impact factor.

Demerits of journal publications

There are also a few demerits of journal publications. These include:

– The publication process is time-consuming.

– Due to such delays, the research topic might get outdated.

– Selection of journals is a difficult task. Sometimes, good research ends up published in a sub-standard journal.

Both these routes to publication have their pros and cons. It must also be noted that conference proceedings and journal publications are not mutually exclusive; a situation may arise where one form of a research work might be published in the conference proceedings and another, perhaps more developed, form might be published in a journal. Therefore, for a more diverse and in-depth research output, both conference proceedings and journal publications need to play a significant part.

Transgenic Approach for Value Addition

Transgene: A transgene is a gene or genetic material of interest that is transferred from one organism to another, either naturally or by artificial transformation methods.

Methods involved in transgenics:

Transformation techniques: Transformation methods can be of two types, i.e., direct or indirect.

Some widely used direct transformation methods are microinjection, the gene gun (particle bombardment), and electroporation (physical methods), and the calcium phosphate method and liposome-mediated transfer (chemical methods).

Indirect transformation methods are also known as vector-mediated gene transfer methods. The major biological vectors involved are Agrobacterium tumefaciens and viruses (e.g., the cauliflower mosaic virus, CaMV).

Genome editing: As the name suggests, part of a genome is edited, i.e., a sequence is inserted, deleted, or replaced with another desired sequence, with the help of restriction enzymes (also termed molecular scissors).
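
As a minimal illustration of the cut-and-insert idea described above, the Python sketch below locates a restriction enzyme recognition site (EcoRI, GAATTC) in a hypothetical vector sequence and places a hypothetical transgene at the cut position. The sequences and names are invented for illustration; real cloning also involves sticky ends, ligation, and verification.

```python
# Simplified sketch of inserting a sequence at a restriction site.
# The plasmid and transgene sequences below are hypothetical examples.

ECORI_SITE = "GAATTC"   # EcoRI recognition sequence; the enzyme cuts between G and AATTC
ECORI_CUT_OFFSET = 1    # cut position within the recognition site

def insert_at_restriction_site(vector: str, insert: str) -> str:
    """Return the vector sequence with `insert` placed at the first EcoRI cut site."""
    site = vector.find(ECORI_SITE)
    if site == -1:
        raise ValueError("No EcoRI site found in the given sequence")
    cut = site + ECORI_CUT_OFFSET
    return vector[:cut] + insert + vector[cut:]

plasmid = "ATGCCGAATTCGGTTAA"   # hypothetical vector fragment
transgene = "AAATTTCCC"         # hypothetical gene of interest
print(insert_at_restriction_site(plasmid, transgene))
```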

Value addition properties of the transgenic process:

  • It has ensured enhanced yields for the farming community by introducing superior crop varieties that carry high-performance traits such as insect or pest resistance, herbicide resistance, resistance to other biotic stresses, and resistance to abiotic stresses like drought, salinity, flood, and temperature imbalance.
  • The above resistance in the crops has also led to an improvement in the crop quality.
  • Transgenics have been a boon not only in agriculture but also in other fields like aquaculture, sericulture, and horticulture, through an increase in the productivity and quality of the resulting products.
  • Transgenics have opened up the world market for various commercial products obtained through genetic manipulation. For example, the Flavr Savr tomato was the first genetically engineered crop product to be marketed and commercialized, by the California-based company Calgene, which succeeded in introducing an antisense gene that interfered with production of the fruit-ripening enzyme polygalacturonase.
  • One of the landmark achievements of transgenic techniques was the introduction of gene therapy. The technique aims to treat patients who produce too little of a metabolically important protein or enzyme. The introduced gene alters the function of the patient's genetic material in a productive way, specifically in relation to the disease.
  • Transgenic methodology has brought about a revolutionary change in medical science through the introduction of various therapeutic methods, including modern vaccination techniques, new and more efficient vaccines for various hitherto untreated diseases, production of antibodies from biological vectors (microorganisms as well as plants), production of monoclonal antibodies by hybridoma technology, and production of certain other important drugs used in treating genetic diseases.
  • The field of transgenics is important for researchers and scientists because it gives them ample opportunity to create substantive and commercially important products that can meet the demands of a growing world population.

Despite all these advantages, the mounting costs of transgenic products make them less appealing to both researchers and the customers who buy them. Certain ethical issues related to genetically engineered food products also limit their market exposure and availability.

Importance of Statistical Review of manuscripts

Statistics: Statistics is a branch of mathematics that deals with the collection, analysis, interpretation, presentation, and organization of data. In simple terms, it deals with the philosophy, logic, and expression of data.

Who does the statistical review?

Statistical review is usually done by expert statisticians, or by authors and journal editors with statistical knowledge. It comprises the statistical and methodological questions put forward by the reviewer, which the author, or sometimes the journal editors, must answer.

Role of the statistical reviewers:

  • Statistical reviewers identify possible sources of statistical error in the manuscript, thereby increasing the statistical accuracy of the paper and helping ensure quicker publication.
  • Statistical reviewers perform all forms of data checking: checking for missing data; checking whether the correct statistical methods were chosen and applied appropriately; checking for errors such as an incorrect level of significance in the analysis; checking whether the statistical package and version used are named; checking whether units of measurement are stated; checking whether the tables and figures carry self-explanatory footnotes; and so on (a simple example of such a check is sketched after this list).
  • They ensure proper statistical presentation of data throughout the manuscript, including proper use of statistical language in the data presentation section.
  • The reviewer also checks whether the conclusions are justified by the presented data.
  • They also cross-check whether the discussion is feasible in light of the results.
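
As a simple illustration of the kind of check referred to above, the sketch below uses Python with scipy and invented measurement values to compare two groups with an independent t-test and to report the group summaries, exact p-value, and pre-specified significance level in the form a statistical reviewer would expect to see.

```python
# Illustrative check of a two-group comparison; the data are invented.
import numpy as np
from scipy import stats

control = np.array([5.1, 4.8, 5.6, 5.0, 4.9, 5.3])
treated = np.array([6.2, 5.9, 6.5, 6.1, 5.8, 6.4])

alpha = 0.05                              # pre-specified significance level
t_stat, p_value = stats.ttest_ind(control, treated)

# Report means with SD and the exact p-value, as a reviewer would require.
print(f"Control: {control.mean():.2f} ± {control.std(ddof=1):.2f}")
print(f"Treated: {treated.mean():.2f} ± {treated.std(ddof=1):.2f}")
print(f"t = {t_stat:.2f}, p = {p_value:.4f}, "
      f"{'significant' if p_value < alpha else 'not significant'} at alpha = {alpha}")
```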

Significance of statistical review:

  • Major statistical errors in the data presentation section may lead to rejection of the research paper, so review of the statistical data and its proper presentation is of utmost importance for the author. The most frequent statistical problems in manuscripts occur in data interpretation and presentation, data analysis, and study design.
  • Sound statistics is the foundation of high-quality research work interpreting quantitative studies.

Is self-plagiarism ethical?

Research papers and journals are the medium for spreading knowledge and newly evolved ideas. Innovative and original work is certainly more educative and admirable. Nevertheless, authors are often found reusing their old work, or extracts from their previously published papers, while writing a new research paper.

When questions are raised against this content reuse, authors claim that the material is their own work and that they can therefore reuse it as they wish; it cannot be termed plagiarism, they argue, since they have not stolen ideas from any other author or source.

Because the conventional ethics of plagiarism do not obviously apply to such reuse, it has largely been overlooked to date. While the debate over whether this reuse is ethical continues, publishers and journals have set guidelines for such works, terming the practice self-plagiarism.

What is self-plagiarism?

Self-plagiarism is a form of plagiarism in which a writer reuses his or her own previously published work, in part or in full, while creating a new paper. It can breach the publisher's copyright on the published work when that work is reused in new papers without appropriate citation. Let us now look at the ethical aspects of self-plagiarism.

Self-plagiarism can be detected when:

a) A published paper is republished elsewhere without the consent of the co-authors and the publisher of the original work.

b) A large study is published in small sections with the intention of increasing the number of publications.

c) A previously written work, whether published or not, is reused in portions in new papers.

Although rules against self-plagiarism are rarely enforced, the practice reflects poorly on the author's honesty. Moreover, journals and publishers increasingly reject such copy-paste works because they are seeking writing based on original research findings, with proper citation of all references.

Nowadays, journals are also raising questions about the reuse of one's own work. To avoid self-plagiarism, keep your work original, and if it is necessary to include any portion of your previous works, cite it properly with full references. I hope this article helps you detect prospective self-plagiarism before submitting your paper to a publication or journal.
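
For authors who want a rough screen of their own drafts before submission, the Python sketch below uses only the standard library to estimate textual overlap between a new manuscript and a previously published paper. The file names and the 30% threshold are arbitrary placeholders; this is a crude indicator, not a substitute for a dedicated plagiarism checker or for proper citation.

```python
# Rough self-overlap screen between a new draft and a previous paper.
# File names are placeholders; any plain-text versions of the documents will do.
from difflib import SequenceMatcher
from pathlib import Path

def overlap_ratio(new_path: str, old_path: str) -> float:
    """Return a 0-1 similarity ratio between two plain-text files."""
    new_text = Path(new_path).read_text(encoding="utf-8").lower()
    old_text = Path(old_path).read_text(encoding="utf-8").lower()
    return SequenceMatcher(None, new_text, old_text).ratio()

if __name__ == "__main__":
    ratio = overlap_ratio("new_manuscript.txt", "published_paper.txt")
    print(f"Estimated textual overlap: {ratio:.0%}")
    if ratio > 0.30:   # arbitrary threshold for flagging reuse
        print("Substantial reuse detected: cite the earlier work or rewrite the overlapping text.")
```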

BioConference Live 2014

The BioConference Live virtual neuroscience conference, conducted on March 19-20, 2014, was an online event set to unite the neuroscience community via live video webcasts and real-time networking. Manuscriptedit participated in this high-profile conference, which also saw the participation of President Barack Obama.

Researchers, postdocs, lab directors, and other medical professionals learnt about recent investments and the scientific foci of the BRAIN Initiative through a panel discussion with key leaders from diverse scientific and funding regulatory agencies. The BRAIN Initiative was part of a new Presidential focus aimed at transforming our understanding of the human brain.

The neuroscience conference included topics such as Behavioral and Cognitive Neuroscience, Epigenetic Regulation, Genetics of Neurologic Diseases, Molecular Mechanisms, Neurologic Dysfunction in Human Diseases, and Nervous System Development. It also covered neurological diseases from lab to clinic, including Alzheimer’s, ALS, epilepsy, Huntington’s disease, multiple sclerosis, Parkinson’s, traumatic brain and spinal cord injury, and neuropsychiatric disorders.

In addition to topics on diseases, the conference also covered emerging therapies, like combinatorial therapies, immunomodulation, myelin repair, non-coding RNA, neurorobotics, neuroengineering, stem cells, and imaging technologies – in vitro and in vivo.

The intense two-day conference covered original research data, teaching presentations, broad overviews of new frontiers given by thought leaders in the field, and discussion forums. Attendees learnt new concepts, tools, and techniques that they can apply to research and diagnosis.

Antibiotic Resistance: Cause and Mechanism

Scope of antibiotic resistance problem:

Antibacterial-resistant strains and species, occasionally referred to as “superbugs”, now contribute to the emergence of diseases that were well controlled a few decades ago. In its recent report “Antibiotic Resistance Threats in the United States, 2013,” the CDC calls this a critical health threat for the country. According to the report, more than 2 million people in the United States get antibiotic-resistant infections each year and at least 23,000 of them die annually. This is the situation in a country where drug regulations are tough and stringent and physicians are relatively careful in prescribing medications. Imagine the situation in developing countries like India, where antibiotics are available over the counter without a prescription and 80-90% of the population uses antibiotics without consulting a physician; many are not even aware of how to complete an antibiotic course properly. This is a huge health challenge that will pose an even more serious threat in the coming years in treating antibiotic-resistant infections. Recently, at a clinic in Mumbai, some 160 of the 566 patients who tested positive for TB between March and September carried strains resistant to the most powerful TB medicines. In fact, more than one-quarter of people diagnosed with tuberculosis have a strain that does not respond to the main treatment against the disease. According to the WHO and data from the Indian government, India has about 100,000 of the 650,000 people in the world with multi-drug-resistant TB.
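
The proportions behind the figures quoted above can be checked with simple arithmetic; the snippet below uses only the numbers given in the paragraph.

```python
# Quick arithmetic on the figures quoted in the paragraph above.
resistant_tb_mumbai = 160 / 566          # Mumbai clinic, March-September
print(f"Resistant TB in Mumbai sample: {resistant_tb_mumbai:.0%}")      # ~28%, i.e. more than one-quarter

india_mdr_share = 100_000 / 650_000      # India's share of the world's multi-drug-resistant cases
print(f"India's share of MDR cases: {india_mdr_share:.0%}")             # ~15%

us_death_rate = 23_000 / 2_000_000       # CDC: at least 23,000 deaths among 2 million resistant infections
print(f"Deaths per resistant infection in the US: {us_death_rate:.1%}") # ~1.2%, a lower bound
```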

 Factors contributing to antibiotic resistance:

Inappropriate treatment and misuse of antibiotics have contributed the most to the emergence of antibacterial-resistant bacteria. Many antibiotics are frequently prescribed for diseases that do not respond to antibacterial therapy or that are likely to resolve without any treatment, and incorrect or suboptimal doses are often prescribed for bacterial infections. Self-prescription of antibiotics is another example of misuse. The most common forms of antibiotic misuse, however, include excessive use of prophylactic antibiotics by travelers and the failure of medical professionals to prescribe the correct dosage based on the patient's weight and history of prior use. Other forms of misuse include failure to complete the entire prescribed course of antibiotics, incorrect dosage, and failure to rest for sufficient recovery. Other major contributors to antibiotic resistance are the excessive use of antibiotics in animal husbandry and the food industry, and frequent hospitalization for minor medical issues, where resistant strains get a chance to circulate in the community.

To conclude, humans contribute the most to the development and spread of drug resistance by (1) not using the right drug for a particular infection; (2) not completing the antibiotic course; or (3) using antibiotics when they are not needed.

In addition to the growing threat of antibiotic-resistant bugs, there may be another valid reason doctors should desist from freely prescribing antibiotics. According to a recent paper published online in Science Translational Medicine, certain antibiotics cause mammalian mitochondria to fail, which in turn leads to tissue damage.

 Mechanism of antibiotic resistance:

Antibiotic resistance is a condition where bacteria develop insensitivity to the drugs (antibiotics) that generally cause growth inhibition or cell death at a given concentration.

Resistance can be categorized as:

a) Intrinsic or natural resistance: Naturally occurring antibiotic resistance is very common; a bacterium may simply be inherently resistant to an antibiotic. For example, Streptomyces species possess genes conferring resistance to their own antibiotics, and other bacteria naturally lack the target sites for certain drugs, have low permeability, or lack the efflux pumps or transport systems for antibiotics. The genes that confer this resistance are known as the environmental resistome, and they can be transferred from non-disease-causing bacteria to disease-causing bacteria, leading to clinically significant antibiotic resistance.

b) Acquired resistance: Here, a naturally susceptible microorganism acquires ways to avoid being affected by the drug. Bacteria can develop resistance to antibiotics through mutations in chromosomal genes or through mobile genetic elements (e.g., plasmids and transposons) carrying antibiotic resistance genes.

The two major mechanisms of how antibiotic resistance is acquired are:

Genetic resistance: It occurs via chromosomal mutations or acquisition of antibiotic resistance genes on plasmids or transposons.

Phenotypic resistance: Phenotypic resistance can be acquired without any genetic alteration; it is mostly achieved through changes in the bacterial physiological state. Bacteria can become non-susceptible to antibiotics when they are not growing, such as in the stationary phase, in biofilms, as persisters, or in the dormant state. Examples include salicylate-induced resistance in E. coli, staphylococci, and M. tuberculosis.

Within the genetic resistance category, the following are the five major mechanisms of antibiotic drug resistance arising from chromosomal mutations:

1. Reduced permeability or uptake (e.g. outer membrane porin mutation in Neisseria gonorrhoeae)

2. Enhanced efflux (membrane-bound proteins extrude antibiotics out of the bacterial cell, e.g., drug efflux in Streptococcus pyogenes and Streptococcus pneumoniae)

3. Enzymatic inactivation (beta-lactamases cleave beta-lactam antibiotics and cause resistance)

4. Alteration or overexpression of the drug target (resistance to rifampin and vancomycin)

5. Loss of enzymes involved in drug activation (as in isoniazid resistance via KatG and pyrazinamide resistance via PncA)

Examples of plasmid-borne transfer of resistance genes include sulfa drug resistance and the streptomycin resistance genes strA and strB, while transposon-mediated transfer occurs via conjugative transposons in Salmonella and Vibrio cholerae.

In the next post, I will discuss a few important examples of antibiotic resistance in clinically relevant microbes.

Antibiotics: Wonder drugs or a threat to public health?

What are antibiotics?

Antibiotics, also known as antibacterials, are a category of medications that kill bacteria or slow their growth. Penicillin was the first antibiotic, discovered by Sir Alexander Fleming in 1928, but it was not until the early 1940s that its true potential was recognized and it came into widespread use. In 1942, the term antibiotic was first used by Selman Waksman. In earlier days, antibiotics were often referred to as “wonder drugs” because they cured several bacterial diseases that were once fatal. With antibiotic use, the number of deaths caused by bacterial infections like meningitis, pneumonia, tuberculosis, and scarlet fever fell drastically. The discovery of antibiotics has influenced human development in a highly significant way. Other than vaccines, few medical discoveries have had such a huge impact on healthcare delivery. Major complicated surgeries, transplants, advances in neonatal medicine, and advances in chemotherapy for cancer patients would not be possible without antibiotics.

Antibiotics classification:

Antibiotics are broadly classified based on their mechanism of action, structure, source or origin of the antibacterial agent, or their biological activity. With recent advances in medicinal chemistry, most antibiotics available nowadays are semisynthetic derivatives of natural compounds (penicillins, cephalosporins, and ampicillin). Very few antibiotics, such as the aminoglycosides (streptomycin, gentamicin, and neomycin), are isolated from living organisms, while many others, such as the sulfonamides and quinolones (moxifloxacin and norfloxacin), are chemically synthesized. Based on biological activity, antibiotics are classified as bactericidal agents (which kill bacteria) and bacteriostatic agents (which slow down or impede bacterial growth). Microorganisms tend to develop resistance faster to natural antimicrobials because they have been pre-exposed to these compounds in nature; semisynthetic drugs were therefore developed for increased efficacy and lower toxicity. Synthetic drugs have the added advantage that bacteria are not exposed to these compounds until they are released systemically, and they are designed for improved effectiveness with decreased toxicity.

Antibiotics are also classified based on their range of effectiveness. Broad-spectrum drugs are effective against many types of microbes (gram-positive and gram-negative) and tend to have higher toxicity to the host. Narrow-spectrum drugs are effective against a limited group of microbes (either gram-positive or gram-negative) and exhibit lower toxicity to the host. Based on chemical structure, antibiotics fall into categories such as β-lactams and aminoglycosides. All the above-mentioned classes of antibiotics are further divided according to their targets or mode of action in the bacteria. Following are the five important antibiotic targets in bacteria (summarized in a short sketch after the list):

1. Inhibitors of cell wall synthesis (-cillins)

2. Inhibitors of protein synthesis (-mycins)

3. Inhibitors of membrane function (Polymyxin)

4. Anti-metabolites (Sulfa drugs)

5. Inhibitors of nucleic acid synthesis (Nalidixic acid, Rifampicin)
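
To keep the target classes and examples above easy to reference, here is a small lookup table. This is only an illustrative sketch: the drug examples are limited to those named in the list and do not amount to an exhaustive classification.

```python
# The five antibiotic target classes listed above, with the example agents named in the text.
ANTIBIOTIC_TARGETS = {
    "cell wall synthesis": ["penicillins (-cillins)"],
    "protein synthesis": ["-mycins"],
    "membrane function": ["polymyxin"],
    "anti-metabolites": ["sulfa drugs"],
    "nucleic acid synthesis": ["nalidixic acid", "rifampicin"],
}

def examples_for(target: str) -> list:
    """Return the example agents recorded for a given target class."""
    return ANTIBIOTIC_TARGETS.get(target.lower(), [])

print(examples_for("membrane function"))   # ['polymyxin']
```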

The deluge of antibiotic-resistant bacteria:

“The first rule of antibiotics is try not to use them, and the second rule is try not to use too many of them,” a quote attributed to Paul L. Marino. After an era of plentiful antibiotics, the situation today is alarming owing to the ever-increasing number of antibiotic-resistant strains. In the early years, new antibiotics were developed faster than bacteria developed resistance to them, but the bugs have now caught up. In the 1950s and 1960s, many new classes of antibiotics were discovered; in the 1980s and 1990s, however, scientists only managed to make improvements within existing classes.

The emerging resistance of bacteria to antibacterial drugs is a continuing threat to human health. Bacterial resistance to penicillin was observed within two years of its introduction in the mid-1940s. Rapidly emerging resistance to ciprofloxacin and various anti-tuberculosis drugs indicates that it is a microbe’s world and they are ready to adapt. Since microbes congregate in large numbers to induce infection, reproduce rapidly, and mutate efficiently, developing resistance is not a matter of “if” but of “when”. To overcome any assault, bacteria possess efficient defense systems within their chromosomal DNA or in extrachromosomal elements called plasmids. Bacteria have the advantage that plasmids carrying resistance genes can shuttle easily between bacterial cells, including those colonizing humans.

No longer limited to hospitals, antibiotic resistance in Neisseria gonorrhoeae and Streptococcus pneumoniae is becoming a household and community phenomenon. The use of surface antibacterials in households, self-medication, and unregulated sales of antibiotics in many countries further aggravate the problem. According to a CDC report, by the end of the 20th century approximately 30% of S. pneumoniae isolates (the causative agent of meningitis, otitis media, and pneumonia) were no longer sensitive to penicillin. Similarly, treatment failures have been observed in patients because of resistant strains of Shigella, Salmonella typhi, Staphylococcus, Mycobacterium tuberculosis, Klebsiella pneumoniae, Clostridium difficile, and S. pneumoniae. Drug-resistant bacteria can be acquired in community settings such as daycares, schools, and other crowded places. Other risk factors are antibiotic use and the consumption of food products treated with antibiotics. Increased use of quinolones in poultry and farm animals has been associated with an increased prevalence of human infection with quinolone-resistant Salmonella and Campylobacter. Besides the established pathogens, the relatively recent appearance of opportunistic organisms that are intrinsically resistant to many drugs is making matters worse. With a larger number of immunocompromised patients, these organisms have become ‘specialized’ pathogens, typically attacking only the most vulnerable patients. Examples of such opportunistic pathogens are the enterococci, the coagulase-negative staphylococci, Pseudomonas aeruginosa, and Acinetobacter baumannii. Therefore, it is high time for medical professionals to think and act to reverse this trend of antibiotic resistance by creating awareness in communities about the proper use of antibiotics and by discouraging self-medication. In the next series, I will discuss the factors responsible for antibiotic resistance and its detailed mechanism.