Ultrasound Patch: An innovative technology for rapid treatment and management of ulcers

Venous skin ulcers, also known as stasis ulcers or varicose ulcers, are chronic wounds caused by poor blood circulation through the veins or venous valves, usually occurring in the lower leg between the ankle and the calf. This condition, known as venous insufficiency, accounts for roughly 70% to 90% of leg ulcer cases. These ulcers are often recurring and extremely painful, and can take months or even years to heal. The condition affects approximately 500,000 Americans annually, and that number is expected to increase as the rate of obesity climbs. It is estimated that treating venous skin ulcers costs US healthcare systems over one billion dollars per year, with monthly treatment costs running as high as $2,400 per patient. Current treatments for venous skin ulcers are either conservative management, such as compression therapy, or invasive and expensive surgical procedures, such as skin grafts. Other available options include mechanical treatment and medications. The most standard treatment, however, involves infection control, wound dressings and compression therapy, in which patients wear elastic stockings to help improve leg circulation. Nevertheless, none of these approaches succeeds in every case, and the wounds often take months or sometimes years to heal.

Designing the ultrasound patch:

Recently, a team of researchers led by Dr. Peter A. Lewin at Drexel University in Philadelphia designed a novel non-invasive technique, the “ultrasound patch”, for treating chronic ulcers and wounds. The technique uses a novel ultrasound applicator that can be worn as effortlessly as a band-aid. In this alternative therapy, a battery-powered patch sends low-frequency, low-intensity ultrasound waves directly to the wound site. The therapeutic benefits of ultrasound for wound healing were established in previous studies, but most of those studies used much higher frequencies, around 1-3 megahertz (MHz). Dr. Lewin believed that decreasing the frequency to 20–100 kilohertz (kHz) might work better with reduced exposure. According to him, one of the biggest challenges in designing this technology was building a battery-powered patch, since most ultrasound transducers require bulky apparatus that must be plugged into a wall outlet. Dr. Lewin and colleagues wanted something portable that could be worn easily, which meant the device essentially had to be battery operated. To accomplish this, they designed a transducer that could produce medically relevant energy levels using minimal voltage. In its present form, the ultrasound patch weighs approximately 100 grams and requires two rechargeable AA batteries. It is designed to be worn over the ulcer or wound, allowing the patient to deliver controlled pulses of ultrasound directly to the wound while at home. Funding for the study came from the National Institute of Biomedical Imaging and Bioengineering (NIBIB), part of the National Institutes of Health.

Clinical studies testing the ultrasound patch

To determine the optimal frequency and treatment duration for the ultrasound patch, an initial trial was carried out in a total of 20 patients, divided into four groups. Each group received either 20 kHz for 15 minutes, 20 kHz for 45 minutes, 100 kHz for 15 minutes, or 15 minutes of a placebo (control) treatment with no ultrasound exposure. According to the researchers, the first group fared best: all five participants had healed completely by the time they reached their fourth session. In contrast, the ulcers of the patients in the placebo group worsened over the same period. The results suggested that patients who received this low-frequency, low-intensity ultrasound therapy during their weekly follow-ups (in addition to standard compression therapy) showed a net reduction in wound size after just four weeks of therapy, whereas patients who did not receive the ultrasound treatment showed an average increase in wound size. The team’s clinical findings were further supported by their in vitro studies: 24 hours after receiving 20 kHz ultrasound for 15 minutes, mouse fibroblast cells, which play an active role in wound healing, showed a 32% increase in cell metabolism and a 40% increase in cell proliferation compared with control cells. These findings are yet to be published in the Journal of the Acoustical Society of America.
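The four-arm design described above can be summarized in a short sketch. The frequencies, durations and group size come from the text; the outcome notes for groups 2 and 3 are not reported there, and the tabulation itself is purely illustrative, not the study's own analysis code:

```python
# Illustrative tabulation of the four trial arms described above.
# Outcomes for groups 2 and 3 are not detailed in the text.
trial_arms = [
    {"group": 1, "frequency_khz": 20,   "minutes": 15, "outcome": "all participants healed by session four"},
    {"group": 2, "frequency_khz": 20,   "minutes": 45, "outcome": "not detailed in the text"},
    {"group": 3, "frequency_khz": 100,  "minutes": 15, "outcome": "not detailed in the text"},
    {"group": 4, "frequency_khz": None, "minutes": 15, "outcome": "placebo; ulcers worsened"},
]

# 20 patients split evenly across the four groups gives five per arm,
# consistent with "all five participants" in the best-performing group.
patients_per_group = 20 // len(trial_arms)
print(patients_per_group)  # 5
```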

Advantages and applications of the ultrasound patch

The researchers believe that using the ultrasound patch for chronic ulcers will reduce treatment costs and patient discomfort. It speeds wound recovery compared with conventional approaches and could eventually be used to manage wounds associated with diabetic and pressure ulcers. The patch is lightweight and can be worn as easily as a band-aid. Another characteristic feature is an attached monitoring component that uses near-infrared spectroscopy (NIRS) to assess the progress of wound healing. NIRS can non-invasively detect changes in the wound bed and indicate whether the treatment is working in its early stages, when healing is difficult to spot with the naked eye. Using the patch will also reduce frequent visits to the doctor’s clinic or hospital, which can be very difficult for patients with chronic wounds. Before its widespread application, however, its overall safety and efficacy must be established; studies with larger numbers of patients are currently underway to confirm these before the patch makes its way into the clinic.

SEO Content Writing

Search engine optimisation (SEO) content writing is not just for websites; it can serve several other purposes as well. In a broad sense, SEO tools can be used in many different ways to promote websites, businesses, and products and services. In fact, you will be astonished at all the ways in which these services can be useful.

This article on SEO content writing outlines the fundamental ways to use SEO content writing and SEO tools, and explains why these services are worth using.

Ways to use SEO Content Writing

Firstly, SEO content writing can certainly be used for websites. Organic search engine optimisation (organic SEO) is important for ranking highly in search engine results and drawing visitors to a website. At the same time, the content needs to be appealing, engaging and informative for readers, so that visitors stay on your website long enough to buy services and/or products, or at least ask for more information. SEO content writing can therefore be used to improve the content of a website.

Secondly, SEO content writing can also be used for blogs, for much the same reasons. Blogs are a great way to promote businesses, build brand recognition and image, and drive visitors to the main website. For this purpose, apply the same SEO procedures to your blogs that you would use for the main website. Ensure that the blogs are mostly informative in nature, and that they link back to your main website as a source of extra information, useful products or services.

Thirdly, SEO content writing can be employed for internet marketing. Ideally, anything you post on the internet should use organic SEO so that you build as many positive links to your website as possible and, as a result, receive plenty of hits from search engines across different websites and advertising channels. The most common way these services are used for internet marketing is article marketing: informative articles are written about products, services, a company and/or an industry, then posted to article directories, where they are indexed by search engines. The articles link back to the main website to increase visitors. Meanwhile, readers come across the content and learn about the company or brand in a positive light, which increases business.

On the whole, there are several other ways in which SEO content writing can support your business; the only limitation is your own imagination. Gradually, you will find that the more SEO content you put on your website, the more successful your company will become. So use these services as much as possible, and be surprised at the outcomes you achieve.

Antibiotic Resistance: Cause and Mechanism

Scope of the antibiotic resistance problem:

Antibacterial-resistant strains and species, occasionally referred to as “superbugs”, now contribute to the emergence of diseases that were well controlled a few decades ago. In a recent report, “Antibiotic Resistance Threats in the United States, 2013,” the CDC calls this a critical health threat for the country. According to the report, more than 2 million people in the United States get antibiotic-resistant infections each year, and at least 23,000 of them die annually. This is the situation in a country where drug regulations are tough and stringent and physicians are relatively careful in prescribing medications. Imagine the situation in developing countries like India, where antibiotics are available over the counter without a medical prescription and an estimated 80-90% of the population uses antibiotics without consulting a physician. Many people are not even aware of how to take a proper course of antibiotics. This is a huge health challenge that will pose an even more serious threat in the coming years for treating antibiotic-resistant infections. Recently, at a clinic in Mumbai, some 160 of the 566 patients who tested positive for TB between March and September carried strains resistant to the most powerful TB medicines. In fact, more than one-quarter of people diagnosed with tuberculosis there have a strain that does not respond to the main treatment against the disease. According to the WHO and data from the Indian government, India has about 100,000 of the world’s 650,000 people with multidrug-resistant TB.

Factors contributing to antibiotic resistance:

Inappropriate treatment and misuse of antibiotics have contributed the most to the emergence of antibacterial-resistant bacteria. Many antibiotics are frequently prescribed for conditions that do not respond to antibacterial therapy or are likely to resolve without any treatment. Often, incorrect or suboptimal doses of antibiotics are prescribed for bacterial infections. Self-prescription of antibiotics is another example of misuse. The most common forms of antibiotic misuse, however, include excessive use of prophylactic antibiotics by travelers and the failure of medical professionals to prescribe the correct dosage based on the patient’s weight and history of prior use. Other forms of misuse include failure to complete the entire prescribed course of antibiotics, incorrect dosing, or failure to rest sufficiently for recovery. Other major contributors to antibiotic resistance are the excessive use of antibiotics in animal husbandry and the food industry, and frequent hospitalization for minor medical issues, where resistant strains get a chance to circulate in the community.

To conclude, humans contribute the most to the development and spread of drug resistance by: 1) not using the right drug for a particular infection; 2) not completing the full course of antibiotics; or 3) using antibiotics when they are not needed.

In addition to the growing threat of antibiotic-resistant bugs, there may be another valid reason doctors should desist from freely prescribing antibiotics. According to a recent paper published online in Science Translational Medicine, certain antibiotics cause mammalian mitochondria to fail, which in turn leads to tissue damage.

Mechanism of antibiotic resistance:

Antibiotic resistance is a condition where bacteria develop insensitivity to the drugs (antibiotics) that generally cause growth inhibition or cell death at a given concentration.

Resistance can be categorized as:

a) Intrinsic or natural resistance: Naturally occurring antibiotic resistance is very common; a bacterium may simply be inherently resistant to an antibiotic. For example, Streptomyces possesses genes conferring resistance to its own antibiotic, and some bacteria naturally lack the target sites for certain drugs, have low permeability to them, or lack the uptake or transport systems for antibiotics. The genes conferring this resistance are known as the environmental resistome, and they can be transferred from non-disease-causing bacteria to disease-causing bacteria, leading to clinically significant antibiotic resistance.

b) Acquired resistance: Here, a naturally susceptible microorganism acquires ways to avoid being affected by the drug. Bacteria can develop resistance to antibiotics through mutations in chromosomal genes or through mobile genetic elements (e.g., plasmids and transposons) carrying antibiotic resistance genes.

The two major routes by which antibiotic resistance is acquired are:

Genetic resistance: This occurs via chromosomal mutations or the acquisition of antibiotic resistance genes on plasmids or transposons.

Phenotypic resistance: Phenotypic resistance can be acquired without any genetic alteration; mostly it results from changes in the bacterial physiological state. Bacteria can become non-susceptible to antibiotics when not growing, such as in the stationary phase, in biofilms, as persisters, or in a dormant state. Examples include salicylate-induced resistance in E. coli, staphylococci and M. tuberculosis.

Within the genetic resistance category, the following are the five major mechanisms of antibiotic resistance arising from chromosomal mutations:

1. Reduced permeability or uptake (e.g. outer membrane porin mutation in Neisseria gonorrhoeae)

2. Enhanced efflux (membrane-bound proteins extrude antibiotics out of the bacterial cell; e.g., drug efflux in Streptococcus pyogenes and Streptococcus pneumoniae)

3. Enzymatic inactivation (beta-lactamases cleave beta-lactam antibiotics and cause resistance)

4. Alteration or overexpression of the drug target (as in resistance to rifampin and vancomycin)

5. Loss of enzymes involved in drug activation (as in isoniazid resistance via KatG and pyrazinamide resistance via PncA)

Examples of resistance genes transferred on plasmids include the sulfa drug resistance genes and the streptomycin resistance genes strA and strB, while transfer of resistance genes on transposons occurs via conjugative transposons in Salmonella and Vibrio cholerae.
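As a compact recap, the five chromosomal mechanisms and the examples listed above can be collected into a small lookup structure. This is a study aid drawn directly from the text, not a clinical reference:

```python
# The five major chromosomal mechanisms of genetic antibiotic resistance,
# each paired with the example given in the text above.
resistance_mechanisms = {
    "reduced permeability or uptake": "outer membrane porin mutation in Neisseria gonorrhoeae",
    "enhanced efflux": "drug efflux in Streptococcus pyogenes and Streptococcus pneumoniae",
    "enzymatic inactivation": "beta-lactamases cleaving beta-lactam antibiotics",
    "drug target alteration or overexpression": "resistance to rifampin and vancomycin",
    "loss of drug-activating enzymes": "isoniazid (KatG) and pyrazinamide (PncA) resistance",
}

for mechanism, example in resistance_mechanisms.items():
    print(f"{mechanism}: {example}")
```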

In the next post, I will discuss a few important examples of antibiotic resistance in clinically relevant microbes.

Pharmacogenomics: A study of personalized drug therapy

With increasing technological advancement and research progress, modern medicine has found cures for several diseases that were considered incurable a few decades ago, e.g., cardiovascular diseases, various cancers, tuberculosis, malaria and infectious diseases. However, to date no single drug has been shown to be 100% efficacious in treating a given condition without exhibiting adverse effects. It is now well recognized that each patient responds differently to a given drug treatment for the same disease. With a particular drug, the desired therapeutic effect may be obtained in some patients, whereas others may show a modest response or none at all. In addition, many patients may experience adverse effects ranging from mild to severe and life-threatening. Studies have shown that, at the same dose, the plasma concentration of a drug can vary by up to 600-fold between two individuals of the same weight. Such inter-individual variation in drug response is a consequence of complex interactions between genetic and environmental factors. Genetic factors are estimated to account for approximately 15-30% of inter-individual variability in drug disposition and response, though for certain drugs they may account for up to 95% of the variation. For the majority of drugs, these differences are largely ascribed to polymorphic genes encoding drug-metabolizing enzymes, receptors or transporters. These polymorphic genes mainly influence key pharmacokinetic characteristics of a drug: absorption, distribution, metabolism and elimination.

Origin of pharmacogenomics:

The first reported inherited difference in response to a foreign chemical (xenobiotic) was the inability to taste phenylthiocarbamide. Another example showing that drug response is determined by genetic factors, which can alter the pharmacokinetics and pharmacodynamics of medications, emerged in the late 1950s, when an inherited deficiency of glucose-6-phosphate dehydrogenase was shown to cause severe hemolysis in some patients exposed to the antimalarial drug primaquine. This discovery explained why hemolysis was reported mainly in African-Americans, in whom this deficiency is common, and rarely in Caucasians. Other established instances of inter-individual variation, observed in response to suxamethonium (succinylcholine), isoniazid, and debrisoquine, were also linked to genetics. The discoveries that prolonged paralysis following administration of succinylcholine results from a variant of the enzyme butyrylcholinesterase, and that the peripheral neuropathy seen in many patients treated with the antituberculosis drug isoniazid is an outcome of genetic diversity in the enzyme N-acetyltransferase 2 (NAT2), are excellent examples of “classical” pharmacogenetic traits caused by altered amino acid sequences.

These observations of highly variable drug response, which began in the early 1950s, led to the beginning of a new scientific discipline known as pharmacogenetics. Vogel, in 1959, was the first to use the term pharmacogenetics, but it was not until 1962, in a book by Kalow, that pharmacogenetics was defined as the study of heredity and the response to drugs.

Pharmacogenomics in the new era:

The term pharmacogenomics was later introduced to reflect the transition from genetics to genomics and the use of genome-wide approaches to identify the genes that contribute to a specific disease or drug response. The terms pharmacogenomics and pharmacogenetics are often used interchangeably. Pharmacogenomics is an emerging discipline aimed at relating genetic differences in drug disposition and drug targets to drug response. With the availability of more sophisticated molecular tools for detecting genetic polymorphisms, together with advances in bioinformatics and functional genomics, pharmacogenomic studies are generating data that help identify the genes responsible for specific diseases or drug responses. Emerging data from human genome projects on drug-metabolizing genes are being rapidly elucidated and translated into more rational drug therapy, moving toward a personalized medicine approach. Many physicians are now reconsidering whether a “one drug for all” approach is ideal when prescribing medicines for the same condition in different individuals. Various studies have now reported genotype-phenotype associations for many diseases in which the relevant drug-metabolizing genes and receptors are highly polymorphic. In the last decade, the FDA has increasingly acknowledged the importance of biomarkers and has formulated new recommendations on pharmacogenomic diagnostic tests and data submission.

Applications and challenges of Pharmacogenomics:

Personalized medicine is at times deemed a future phenomenon; however, it is already making a marked difference in patient treatment, especially in various cancers. Molecular or genetic testing is now available for patients with colon cancer, multiple myeloma, leukemia, prostate and breast cancers, hepatitis C, and cardiovascular disease, where a genetic profile can be used to predict whether a patient is likely to benefit from a new drug treatment while minimizing adverse drug reactions. Recently, the MD Anderson Cancer Center created an “Institute for Personalized Therapy” specifically to implement personalized cancer therapy, with the aim of improving patient outcomes and reducing treatment costs.

Personalized medicine promises many medical innovations, but its implementation faces several challenges involving public policy and social and ethical issues. Individuals may decline to participate in genetic research because they feel it might breach their right to privacy and confidentiality. To address such concerns, the Genetic Information Nondiscrimination Act of 2008 was designed to shield individuals from genetic discrimination. Other existing concerns include ownership of genetic materials, medical record privacy, clinical trial ethics, and patients’ understanding of the consequences of storing genetic material and phenotypic data. These concerns must be addressed to the satisfaction of all stakeholders, especially patients, so that a common consensus can be reached on how to bring pharmacogenomics applications into clinical practice.

Interdisciplinary research – Direct Imaging of Single Molecule

Interdisciplinary research has immense potential. I talked about one of the major discoveries of modern science based on interdisciplinary research in my previous blog, posted on 29th July 2013 (http://blog.manuscriptedit.com/2013/07/interdisciplinary-research-nobel-prize-chemistry-won-biologists/). Today, let us take another example, where a chemist and a physicist came together and presented us with a direct image of the internal covalent bond structure of a single molecule, using one of the most advanced imaging tools, the non-contact atomic force microscope (nc-AFM). Dr. Felix R. Fischer (http://www.cchem.berkeley.edu/frfgrp/), a young Assistant Professor of Chemistry at the University of California (UC), Berkeley, along with his collaborator Dr. Michael Crommie (http://www.physics.berkeley.edu/research/crommie/home), also a UC Berkeley Professor of Physics, captured images of the internal bond structure of an oligo(phenylene-1,2-ethynylene) [Reactant 1] as it underwent cyclization to give different cyclic compounds (one of which is shown in the inset picture: http://newscenter.berkeley.edu/2013/05/30/scientists-capture-first-images-of-molecules-before-and-after-reaction/). Chemists generally determine the structure of molecules indirectly, using different spectroscopic techniques (NMR, IR, UV-vis, etc.). The molecular structures we see in textbooks result from this indirect way of structure determination, either theoretical or experimental or both. It is more like putting together various parts to solve a puzzle. But now, with this groundbreaking work by the two UC Berkeley scientists, one can directly see, for the very first time in the history of science, how a single molecule undergoes transformation in a chemical reaction and how the atoms reorganize themselves under certain conditions to produce another molecule. No more puzzle-solving for the next generation of chemists to determine molecular structure.

How interdisciplinary research made it possible:

Well, it was not an easy task for the scientists to come up with these spectacular molecular images. Imaging techniques such as scanning tunneling microscopy (STM) and transmission electron microscopy (TEM) have their limitations and are often destructive to organic molecular structures. An advanced technique like nc-AFM, where a single carbon monoxide molecule sits on the tip (probe), helps enhance the spatial resolution of the microscope, and the method is also non-destructive. The thermal cyclization of Reactant 1 was probed at the single-molecule level by STM and nc-AFM on an atomically clean silver surface, Ag(001), under ultra-high vacuum. Before probing, the reaction surface and the molecules were chilled to liquid helium temperature, about 4 K (-269 °C). The researchers first located the surface molecules by STM and then performed further fine-tuning with nc-AFM, and the result is what we see in the inset picture. For cyclization, Reactant 1 was heated to about 90 °C, and the products were then chilled and probed. Chilling after heating did not alter the structure of the products. The mechanism of thermal cyclization was also clearly understood, and the mechanistic pathway was in agreement with theoretical calculations. From the blurred images of STM, Dr. Fischer and Dr. Crommie, along with their coworkers, presented us with crystal-clear molecular images with visible internal bond structure. This groundbreaking work shows the potential of nc-AFM and unveils the secrets of surface-bound chemical reactions, which will have a huge impact on the oil and chemical industries, where heterogeneous catalysis is widely used. The technique will also help in creating customized nanostructures for use in electronic devices.

Again, this path-breaking work was possible due to collaborative research between chemists and physicists. Hence, interdisciplinary research has endless potential.

References

1.    de Oteyza DG, Gorman P, Chen Y-C, Wickenburg S, Riss A, Mowbray DJ, Etkin G, Pedramrazi Z, Tsai H-Z, Rubio A, Crommie MF, Fischer FR. Direct imaging of covalent bond structure in single-molecule chemical reactions. Science (2013); 340: 1434–1437.

 

Interdisciplinary research – Nobel Prize for Chemistry was awarded to two Biologists

Modern scientific research does not confine itself to any restricted boundary. Nowadays, it is all about interdisciplinary research. In 2012, the Nobel Prize for Chemistry (http://www.nobelprize.org/nobel_prizes/chemistry/) was awarded to two eminent biologists, Prof. Robert J. Lefkowitz and Prof. Brian Kobilka, for their crucial contribution in unveiling the signalling mechanism of G protein-coupled receptors (GPCRs). The prize recognizes the lifetime work of both scientists. Dr. Lefkowitz, an investigator at the Howard Hughes Medical Institute (HHMI) at Duke University, is also James B. Duke Professor of Medicine and of Biochemistry at Duke University Medical Center, Durham, NC, USA. Dr. Kobilka, earlier a postdoctoral fellow in Dr. Lefkowitz’s laboratory, is currently Professor of Molecular and Cellular Physiology at Stanford University School of Medicine, Stanford, CA, USA.

Transmembrane signalling of one GPCR “caught in action” by X-ray crystallography

GTP (guanosine triphosphate)-binding proteins (G-proteins) act as molecular switches, transmitting signals from stimuli outside the cell to the inside of the cell. However, to do this a G-protein needs to be activated, and that is where GPCRs play the most important role. They sit in cell membranes throughout the body. GPCRs, also known as seven-transmembrane domain proteins (they pass through the cell membrane seven times), detect external signals such as odor, light and flavor, as well as signals within the body such as hormones and neurotransmitters.1 Once a GPCR detects a signal, the signal is transduced through a specific pathway and finally activates the G-protein. In response, the activated G-protein triggers different cellular processes. Binding of a signalling molecule or ligand to the GPCR causes conformational changes in the GPCR structure. As a result of 20 long years of extensive research, Dr. Lefkowitz and Dr. Kobilka not only identified some 800 members of the GPCR family in humans but also caught in action, with the help of high-resolution X-ray crystallography, how these receptor proteins actually carry out signal transduction. The crystal structure of the ß2-adrenergic receptor (ß2AR), a member of the human GPCR family, was reported by Dr. Kobilka and his colleagues in 2007.2 The hormones adrenaline and noradrenaline are known to activate ß2AR, and the activated ß2AR triggers biochemical processes that help speed up the heart and open the airways as part of the body’s fight response. The ß2AR is a key target of anti-asthma drugs. One of the major breakthroughs came in 2011, when Dr. Kobilka and his co-workers unveiled for the first time the exact moment of transmembrane signalling by a GPCR. They reported the crystal structure of “the active state ternary complex composed of agonist-occupied monomeric ß2AR and nucleotide-free Gs heterotrimer”.3 A major conformational change in ß2AR during signal transduction was discovered.

Now, what is so special about GPCRs? These proteins belong to one of the largest families of all human proteins. GPCRs are involved in most physiological activities and hence are the targets of a large number of drugs. Determining the molecular structures of this class of receptors not only helps researchers understand the actual mechanisms of different cellular processes but also helps them design life-saving, more effective drugs. So, in a nutshell, this scientific breakthrough was possible due to the involvement of experts from different areas of science, such as chemistry, biochemistry, molecular and cellular biology, structural biology, cardiology and crystallography.

 

References

 

  1. Lefkowitz, R. J. Seven transmembrane receptors: something old, something new. Acta Physiol. (Oxf.) 190, 9–19 (2007).
  2. Rasmussen, S. G. et al. Crystal structure of the human β2 adrenergic G-protein-coupled receptor. Nature 450, 383–387 (2007).
  3. Rasmussen, S. G. et al. Crystal structure of the β2 adrenergic receptor–Gs protein complex. Nature 477, 549–557 (2011).

 

Peer Review Mechanism

Many of us have come across the terms “peer review”, “peer-reviewed journal” or “peer-reviewed paper” at some point or another. But how many of us know what exactly the term “peer review” refers to, or what the “peer review process” is all about? Let us discuss this key aspect of the research process.

According to the International Committee of Medical Journal Editors (ICMJE), peer review is the critical assessment of manuscripts submitted to journals by experts who are not part of the editorial staff. Peer review, which is also known as refereeing, has become an inevitable part of the quality control process, which determines whether a paper is worth publishing/funding or not.

The origin of peer review is often traced back to 18th-century Britain. However, it became a key part of the research process only in the second half of the 20th century, triggered by the growth of scholarly research. As the reviewers are specialists in the same field as the author, they are considered the author’s peers; hence the name “peer review”.

Peer Review Process

The author submits the paper to the journal of choice. The journal editor forwards the paper to experts (reviewers) in the relevant field. These reviewers thoroughly analyze the quality of the paper, the validity of the data and methods used, and the accuracy of the results. They then give their judgment on the paper: there is scope for improvement, it is acceptable as it is, or it is not worth publishing. If changes are needed, the reviewers list in their comments the particular areas that have scope for improvement. The paper is then returned to the journal editor, who sends it to the author with the appropriate decision: accepted as it is; accepted with revisions; or rejected. Accordingly, the author makes the changes and resubmits it to the same journal, or submits it to another journal.
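The editorial decision flow just described can be sketched as a tiny function. The decision labels come from the text; the aggregation logic is a simplification of my own (real editors weigh reviews, request revision rounds, and allow appeals):

```python
def editorial_decision(reviewer_verdicts):
    """Map reviewer verdicts to an editor's decision, following the
    simplified flow described above. Each verdict is one of:
    'accept', 'revise', or 'reject'."""
    if all(v == "accept" for v in reviewer_verdicts):
        return "accepted as it is"
    if any(v == "reject" for v in reviewer_verdicts):
        return "rejected"
    # Mixed accept/revise verdicts: changes are requested.
    return "accepted with revisions"

print(editorial_decision(["accept", "accept"]))  # accepted as it is
print(editorial_decision(["accept", "revise"]))  # accepted with revisions
print(editorial_decision(["reject", "revise"]))  # rejected
```

After a "rejected" or "accepted with revisions" outcome, the author revises and resubmits to the same journal or submits elsewhere, restarting the loop.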

Types of Peer Review

Peer review can be classified into three types based on the levels of transparency:

Single-blind review: In this case, the author’s identity is known to the reviewers, but not vice versa.

Double-blind review: In this case, the identities of the author and reviewers are hidden from each other.

Open peer review: Here, the author’s and reviewers’ identities are known to each other.

At present, the peer review process is implemented by the majority of scientific journals. It helps prevent falsified work from being published. Its importance has grown to the point that most research is not considered credible unless it has been validated by peer review, and a peer-reviewed paper that is accepted for publication is regarded as a work of quality. But the process has its disadvantages. It is extremely time-consuming, and the long wait can be frustrating for researchers and can even jeopardize their academic progress. Moreover, bias sometimes creeps into the peer review process: the reviewers’ judgment may be influenced by their own perception of things, the identity of the author, and at times even the author’s country of origin.

Factors of Audience Analysis

Psychographics

Psychographics refers to the lifestyle, values, leisure activities and social self-image that readers are likely to have. Marketing research shows that people react favorably to products and services that they see as representative of themselves. Similarly, your readers will respond differently to your message according to their values. What are their interests, opinions and hobbies? In a rapidly changing and diversifying contemporary world, interests and values are less and less tied to demographic categories. For example, when computer games first started to develop, they were associated with a target market of young males in the 15-25 age group. As this form of entertainment evolved, the target market changed, and there are now computer games that attract females, older males, and other demographic groups. An analysis of the computer game market, therefore, is more likely to benefit from a psychographic examination that treats computer gamers as a special interest group, rather than a demographic one.

Demographic and psychographic analyses are especially relevant in journalistic and public relations writing, where you address a wider public.

Audience Analysis

Every act of writing takes place in a new context, with a unique time, place or reader to take into account. Audience adaptation refers to the skill of arranging words, organizing your thoughts, and formatting your document to best achieve your desired effect on your target audience. Audience dynamics refers to the relationship that writers form with their readers through their style, and through the amount and structure of information they provide. The audience dynamics are effective when the readers get a sense of satisfaction that the questions raised in the text were relevant to their interests and the answers or solutions provided were convincing. In contrast, audience dynamics are ineffective when the readers feel frustrated or offended because the writer’s tone is condescending, the answers or solutions provided are simplistic in relation to the complexity of the questions, or the argument is emotive and based on generalization. To maximise your ability to achieve effective audience dynamics, assess the readers’ needs, knowledge and interest by conducting an audience analysis before writing. Audience analysis is an integral part of your research.