Is self-plagiarism ethical?

Research papers and journals are the media through which new knowledge and ideas spread. An innovative, original piece of work is certainly more educative and admirable. Nevertheless, authors are often found reusing their old work, or extracts from their previously published papers, while writing a new research paper.

When questions are raised about this content reuse, authors claim that the material is their own work and that they may therefore reuse it as they wish; since no ideas were stolen from another author or source, they argue, it cannot be termed plagiarism.

Because the conventional ethics of plagiarism do not apply to such reuse, the practice has largely been overlooked to date. While the debate over whether this reuse is ethical continues, publishers and journals have set guidelines for such works, terming the practice self-plagiarism.

What is self-plagiarism?

Self-plagiarism is a form of plagiarism in which a writer reuses his or her own previously published work, in part or in full, while creating a new paper. It can breach the publisher’s copyright on the published work when that work is reused in a new paper without appropriate citation. Let us now look at the ethical aspects of self-plagiarism.

Self-plagiarism can be detected when:

a)  A published paper is republished elsewhere without the consent of the co-authors and the publisher of the original work.

b)  A large study is published in smaller sections with the intention of increasing the number of publications.

c)  Portions of a previously written work, whether published or not, are reused in a new paper.

Although rules against self-plagiarism are rarely enforced, the practice reflects a degree of dishonesty on the author’s part. Moreover, journals and publishers increasingly reject such copy-and-paste work, as they seek writing based on original research findings with proper citation of all references.

Nowadays, journals are also raising questions about the reuse of one’s own work. To avoid self-plagiarism, keep your work original; if it is necessary to include any portion of your previous work, cite it properly with full references. I hope this article helps you detect potential self-plagiarism before submitting your paper to a publication or journal.

Size does matter: Nano vs. Macroscopic world

We live in an era of nanomaterials, nanotechnology, and nanoscience. What is so special about this nano world? How different is it from the macroscopic world of conventional bulk materials? How does size produce such different properties in these two distinct worlds when the basic material is the same? For example, the properties of gold nanoparticles are distinctly different from those of bulk gold. One simple answer: a nanoparticle consists of a few to a few thousand atoms, while a bulk material is generally composed of billions of atoms (Fig. 1: gold at the macro vs. nano scale). At the nanoscale, gold does not even look yellow!

We all know that gold (in bulk) is an inert metal. However, the same metal at a nanosize of about 5 nm works as a catalyst in oxidizing carbon monoxide (CO). Therefore, size does influence properties. But how? What happens when a material is broken down to the nanoscale? Part of the answer lies in the number of surface atoms. Let us elaborate. In the bulk state, gold forms a face-centered cubic (fcc) lattice in which each gold atom is surrounded by 12 other gold atoms; even an atom at the surface is surrounded by six adjacent atoms in its own plane. In a gold nanoparticle, a much higher fraction of the atoms sit at the surface, and surface atoms are always more reactive. This large number of exposed atoms, compared with the bulk material, is what enables gold nanoparticles to function as catalysts.
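To get a feel for the numbers, here is a minimal back-of-the-envelope sketch. It treats the particle as a solid sphere whose outermost one-atom-thick shell counts as "surface"; the gold atomic diameter of roughly 0.29 nm and the volume-ratio counting rule are simplifying assumptions for illustration, not figures from this article:

```python
# Rough estimate of the fraction of atoms sitting at the surface of a
# spherical nanoparticle: atoms in the outermost one-atom-thick shell,
# approximated by the ratio of the shell volume to the total volume.
D_ATOM = 0.29  # assumed atomic diameter of gold, in nm (approximate)

def surface_fraction(d_nm: float) -> float:
    """Fraction of atoms lying in the outermost atomic shell of a sphere."""
    r = d_nm / 2.0
    core = max(r - D_ATOM, 0.0)       # radius of the interior (non-surface) part
    return 1.0 - (core / r) ** 3      # shell volume / total volume

for d in (2, 5, 20, 100, 1_000_000):  # 1_000_000 nm = 1 mm, i.e. effectively bulk
    print(f"d = {d:>9} nm -> ~{surface_fraction(d):.2%} of atoms at the surface")
```

Under these crude assumptions, a 5 nm particle has roughly a third of its atoms at the surface, while a millimetre-sized grain has essentially none; that contrast is the whole point.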

Now, what happens to the color? At the nanoscale, gold loses its vibrant yellow color. While light is simply reflected from the surface of bulk gold, at the nanoscale the electron cloud of the particle resonates with certain wavelengths of light (a surface plasmon resonance). Depending on its size, a nanoparticle absorbs light of certain wavelengths and appears in the complementary color. For example, nanoparticles of about 90 nm absorb red to yellow light and appear blue-green, whereas particles around 30 nm absorb blue and green light and appear red.

Physical properties such as melting point, boiling point, and conductivity also change at the nanoscale. For example, when gold melts in its bulk state, whether a small ring or a big gold bar, it all melts at the same temperature. This is not true for nanoparticles: as the size decreases, the melting point drops, varying by hundreds of degrees (Fig. 2a). This is because when matter reaches the nano regime, it no longer follows Newtonian, or classical, physics; rather, it obeys the rules of quantum mechanics. The nano effects relevant for nanomaterials are as follows:

(i) Because of the very small mass of nanoparticles, gravitational force no longer controls their behavior; instead, electromagnetic forces determine the behavior of the atoms and molecules.

(ii) Wave-particle duality applies to such small masses, and the wave nature shows a pronounced effect.

(iii) As a result of wave-particle duality, a particle (such as an electron) can penetrate an energy barrier that is classically forbidden; this is known as quantum tunneling (Fig. 2: tunneling). In classical physics, a particle can cross a barrier only when its energy exceeds the barrier height, so the probability of finding the particle on the other side of the barrier is nil if the particle possesses less energy than the barrier. In quantum physics, by contrast, that probability is finite even when the particle's energy is below the barrier. However, for tunneling to occur, the thickness of the barrier must be comparable to the wavelength of the particle, which is possible only at the nanoscale (a rough formula is sketched after this list). The scanning tunneling microscope (STM), used to characterize nanosurfaces, is built on quantum tunneling.

(iv) Quantum confinement: electrons that move freely in a bulk material become confined in space at the nanoscale. The size-tunable electronic properties of nanoparticles arise from this quantum confinement.

(v) Energy quantization: energy is quantized, and an electron can exist only at discrete energy levels (see the sketch after this list). Quantum dots, a special class of nanoparticles 1-30 nm in size, show the effect of energy quantization.

(vi) Random molecular motion: molecules are always moving owing to their kinetic energy, even near absolute zero. This motion is negligible compared with the size of a macroscale object; at the nanoscale, however, it becomes comparable to the size of the particle and hence influences the particle's behavior.

(vii) Increased surface-to-volume ratio: the changes in bulk properties (melting point, boiling point, hardness, etc.) can be attributed to the enhanced surface-to-volume ratio of nanoparticles.
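Two of these effects can be put on a quantitative footing with standard textbook formulas; these are illustrative additions rather than part of the original post. The first is the usual transmission estimate for a rectangular barrier of height V0 and width L (item iii); the second is the particle-in-a-box model for an electron confined to a region of width L (items iv and v):

```latex
% (iii) Tunneling through a rectangular barrier of height V_0 > E:
T \approx e^{-2\kappa L}, \qquad \kappa = \frac{\sqrt{2m\,(V_0 - E)}}{\hbar}

% (iv)-(v) Discrete levels of an electron confined in a 1-D box of width L:
E_n = \frac{n^2 h^2}{8 m L^2}, \qquad n = 1, 2, 3, \ldots
```

For an electron facing a barrier of about 1 eV, kappa is roughly 5 per nanometre, so the transmission T is appreciable only when L is of the order of a nanometre. And since E_n scales as 1/L^2, shrinking the confining box spreads the energy levels apart, which is exactly why the electronic and optical properties of quantum dots are size-tunable.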

Therefore, in a nutshell, because of the above-mentioned effects, the properties of a material in the nano regime differ from those at the macroscale.

Pharmacogenomics: A study of personalized drug therapy

With the increasing advancement of technology and research, modern medicine has found cures for several diseases that were considered incurable a few decades ago, e.g., cardiovascular diseases, various cancers, tuberculosis, malaria, and other infectious diseases. However, to date no single drug has been shown to be 100% efficacious in treating a given condition without exhibiting adverse effects. It is now well recognized that patients respond differently to the same drug treatment for the same disease. With a particular drug, desirable therapeutic effects are obtained in some patients, whereas others may have a modest or no therapeutic response. In addition, many patients experience adverse effects that vary from mild to severe and life-threatening. Studies have shown that, at the same dose, the plasma concentration of a given drug can differ by up to 600-fold between two individuals of the same weight. Such inter-individual variation in drug response may be a consequence of complex interactions between various genetic and environmental factors. Genetic factors are known to account for approximately 15-30% of the inter-individual variability in drug disposition and response, but for certain drugs they can account for up to 95% of the variation. For the majority of drugs, these differences are largely ascribed to polymorphic genes encoding drug-metabolizing enzymes, receptors, or transporters. These polymorphic genes mainly influence the pharmacokinetic behavior of a drug, i.e., its absorption, distribution, metabolism, and elimination.

Origin of pharmacogenomics:

The first report of an inherited difference in response to a foreign chemical, or xenobiotic, was the inability to taste phenylthiocarbamide. Another example showing that drug response is determined by genetic factors, which can alter the pharmacokinetics and pharmacodynamics of medications, emerged in the late 1950s, when an inherited deficiency of glucose-6-phosphate dehydrogenase was shown to cause severe hemolysis in some patients exposed to the antimalarial drug primaquine. This discovery explained why hemolysis was reported mainly in African-Americans, among whom the deficiency is common, and rarely in Caucasians. Other established evidence of inter-individual variation in response to suxamethonium (succinylcholine), isoniazid, and debrisoquine was likewise linked to genetics. The discoveries that prolonged paralysis following administration of succinylcholine results from a variant of the enzyme butyrylcholinesterase, and that the peripheral neuropathy occurring in many patients treated with the antituberculosis drug isoniazid is an outcome of genetic diversity in the enzyme N-acetyltransferase 2 (NAT2), are excellent examples of “classical” pharmacogenetic traits arising from variants that alter the amino acid sequence.

These observations of highly variable drug response, which began in the early 1950s, led to the emergence of a new scientific discipline known as pharmacogenetics. Vogel, in 1959, was the first to use the term pharmacogenetics, but it was not until 1962, in a book by Kalow, that pharmacogenetics was defined as the study of heredity and the response to drugs.

Pharmacogenomics in new era:

The term pharmacogenomics was introduced later to reflect the transition from genetics to genomics and the use of genome-wide approaches to identify the genes that contribute to a specific disease or drug response. The terms pharmacogenomics and pharmacogenetics are often used interchangeably. Pharmacogenomics is an emerging discipline aimed at relating genetic differences in drug disposition and drug targets to drug response. With the availability of more sophisticated molecular tools for detecting genetic polymorphisms, together with advances in bioinformatics and functional genomics, pharmacogenomic studies are generating data used to identify the genes responsible for a specific disease or drug response. Emerging data from various human genome projects on drug-metabolizing genes are being rapidly elucidated and translated into more rational drug therapy, moving toward a personalized-medicine approach. Many physicians are now reconsidering whether a “one drug for all” approach is ideal when prescribing medicines for a given condition in different individuals. Various studies have reported genotype-phenotype associations for many diseases in which the relevant drug-metabolizing genes and receptors are highly polymorphic. In the last decade, the FDA has increasingly acknowledged the importance of biomarkers and has formulated new recommendations on pharmacogenomic diagnostic tests and data submission.

Applications and challenges of Pharmacogenomics:

Personalized medicine is at times deemed a thing of the future; however, it is already making a marked difference in patient treatment, especially in various cancers. Molecular or genetic testing is now available for patients with colon cancer, multiple myeloma, leukemia, prostate and breast cancers, hepatitis C, and cardiovascular diseases; from a patient’s genetic profile it can be predicted whether he or she is likely to benefit from a new drug treatment, while adverse drug reactions are simultaneously minimized. Recently, the MD Anderson Cancer Center created its “Institute for Personalized Cancer Therapy” specifically to implement personalized cancer therapy, improving patient outcomes and reducing treatment costs.

Personalized medicine may promise many medical innovations, but its implementation faces several challenges involving public policy and social and ethical issues. Individuals may decline to participate in genetic research because they feel it might breach their right to privacy and confidentiality. To tackle such challenges, the Genetic Information Nondiscrimination Act of 2008 was designed to shield individuals from genetic discrimination. Other outstanding concerns include ownership of genetic materials, medical-record privacy, clinical-trial ethics, and patients’ understanding of the consequences of storing genetic material and phenotypic data. These concerns must be addressed to the satisfaction of all stakeholders, especially patients, to reach a consensus on how to bring pharmacogenomics applications into clinical practice.

Interdisciplinary research – Direct Imaging of Single Molecule

Interdisciplinary research has immense potential. I talked about one of the major discoveries of modern science based on interdisciplinary research in my previous blog, posted on 29th July 2013 (http://blog.manuscriptedit.com/2013/07/interdisciplinary-research-nobel-prize-chemistry-won-biologists/). Today, let us take another example, in which a chemist and a physicist came together and presented us with direct images of the internal covalent-bond structure of a single molecule, using one of the most advanced imaging tools, the non-contact atomic force microscope (nc-AFM). Dr. Felix R. Fischer (http://www.cchem.berkeley.edu/frfgrp/), a young Assistant Professor of Chemistry at the University of California (UC), Berkeley, along with his collaborator Dr. Michael Crommie (http://www.physics.berkeley.edu/research/crommie/home), also a UC Berkeley Professor of Physics, captured images of the internal bond structure of an oligo(phenylene-1,2-ethynylene) [Reactant 1] as it underwent cyclization to give different cyclic compounds (one of which is shown in the inset picture: http://newscenter.berkeley.edu/2013/05/30/scientists-capture-first-images-of-molecules-before-and-after-reaction/). Chemists generally determine the structure of molecules indirectly, using various spectroscopic techniques (NMR, IR, UV-vis, etc.). The molecular structures we see in textbooks result from this indirect structure determination, whether theoretical, experimental, or both. It is rather like assembling the pieces of a puzzle. But now, with this groundbreaking work by the two UC Berkeley scientists, one can directly see, for the first time in the history of science, how a single molecule is transformed in a chemical reaction and how the atoms reorganize under given conditions to produce another molecule. No more puzzle-solving for the next generation of chemists determining molecular structures.

How interdisciplinary research made it possible:

Well, it was no easy task for the scientists to come up with these spectacular molecular images. Imaging techniques such as scanning tunneling microscopy (STM) and transmission electron microscopy (TEM) have their limitations and are often destructive to organic molecular structures. An advanced technique such as nc-AFM, in which a single carbon monoxide molecule sits on the tip (probe), enhances the spatial resolution of the microscope, and the method is non-destructive. The thermal cyclization of Reactant 1 was probed at the single-molecule level by STM and nc-AFM on an atomically clean silver surface, Ag(001), under ultra-high vacuum. Before probing, the reaction surface and the molecules were chilled to liquid-helium temperature, about 4 K (approximately -269 °C). The researchers first located the surface molecules by STM and then performed further fine-tuning with nc-AFM; the result is what we see in the inset picture. For cyclization, Reactant 1 was heated to about 90 °C, and the products were then chilled and probed. Chilling after heating did not alter the structure of the products. The mechanism of the thermal cyclization was also clearly established, and the mechanistic pathway was in agreement with theoretical calculations. From the blurred images of STM, Dr. Fischer and Dr. Crommie, along with their coworkers, presented us with crystal-clear molecular images showing the internal bond structure. This groundbreaking work demonstrates the potential of nc-AFM and unveils the secrets of surface-bound chemical reactions, which will certainly have a huge impact on the oil and chemical industries, where heterogeneous catalysis is widely used. The technique will also help in creating customized nanostructures for use in electronic devices.

Again, this pathbreaking work was made possible by collaborative research between chemists and physicists. Hence, interdisciplinary research has endless potential.

References

1.    de Oteyza DG, Gorman P, Chen Y-C, Wickenburg S, Riss A, Mowbray DJ, Etkin G, Pedramrazi Z, Tsai H-Z, Rubio A, Crommie MF, Fischer FR. Direct imaging of covalent bond structure in single-molecule chemical reactions. Science (2013); 340: 1434-1437.