## Hypothesis Testing

Hypothesis testing is the process of assessing the validity of a hypothesis, or supposition, about a statistical parameter. Analysts use hypothesis testing to determine whether or not a hypothesis is reasonable. For example, it could be used to find out whether a certain drug is effective in treating headaches. It uses data from a sample to draw conclusions about a statistical parameter. Hypothesis testing is an important step because it validates the statistical parameter before it is used to draw conclusions or inferences about a population or a large sample of data.

Types of Hypotheses

In data sampling, different types of hypotheses are used to examine whether a sample supports the test hypothesis or not.

1. Alternative Hypothesis (H1) – This hypothesis states that there is a relationship between two variables (where one variable affects the value of the other). The relationship between the variables is not due to chance or coincidence.
2. Null Hypothesis (H0) – This hypothesis states that there is no relationship between two variables. It states that the effect of one variable on another is entirely due to chance, with no empirical explanation.
3. Non-Directional Hypothesis – It states that there is a relationship between two variables, but that the direction of influence is unknown.
4. Directional Hypothesis – It states the direction of effect of the relationship between two variables.

The alternative hypothesis and the null hypothesis are used to study data samples for possible patterns, forming a statistical hypothesis that can be validated through hypothesis testing. The alternative hypothesis and the null hypothesis are mutually exclusive, so they cannot both be true at the same time. Similarly, the non-directional and directional hypotheses are mutually exclusive and cannot both be true at the same time.

Methods of Hypothesis Testing

1. Frequentist Hypothesis Testing – This is the traditional approach to hypothesis testing. It involves making assumptions about the current data and evaluating how likely the observed data would be if the null hypothesis were true, without using prior knowledge, to form a conclusion about the hypothesis. One of the subtypes of this approach is Null Hypothesis Significance Testing (NHST).
2. Bayesian Hypothesis Testing- It is one of the modern methods of hypothesis testing. In this method prior probability of hypothesis from past data and current data is used to find posterior probability of the hypothesis.

The Bayes factor, which is a key component of this approach, represents the likelihood ratio between the null and alternative hypotheses. This factor indicates the plausibility of either of the two hypotheses formed for hypothesis testing.
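As a minimal sketch of the Bayes factor idea (the coin-flip data and the two point hypotheses below are assumptions chosen purely for illustration, not taken from this article), the factor is simply the ratio of the likelihoods of the observed data under each hypothesis:

```python
from math import comb

# Hypothetical data: 60 heads in 100 flips.
# H0: the coin is fair (p = 0.5); H1: the coin is biased (p = 0.6).
heads, flips = 60, 100

# Binomial likelihood of the observed data under each hypothesis.
likelihood_h0 = comb(flips, heads) * 0.5**heads * 0.5**(flips - heads)
likelihood_h1 = comb(flips, heads) * 0.6**heads * 0.4**(flips - heads)

# Bayes factor: evidence for H1 relative to H0.
bayes_factor = likelihood_h1 / likelihood_h0
print(round(bayes_factor, 2))
```

A Bayes factor above 1 favors H1; by common rules of thumb, values between roughly 3 and 10 are read as moderate evidence for H1 over H0.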

Techniques of Hypothesis Testing

Four tests are commonly used: the Z-test, the t-test, the chi-squared test, and the F-test.

1. Z-Test – A z-test is performed on a population with independent data points that follows a normal distribution and has a sample size of 30 or more. When the population variance is known, it is used to determine whether the means of two populations are equal. The z test statistic is compared to the critical value, and the null hypothesis is rejected if the z test statistic is statistically significant.

Z = (X̄ − µ) / (s / √n)

Where,

Z = z-test statistic

X̄ = sample mean

µ = population mean

s = standard deviation

n = sample size
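A minimal sketch of how the z statistic, (X̄ − µ)/(s/√n), might be computed in Python; the sample size, means, and standard deviation below are invented for illustration:

```python
from math import sqrt
from statistics import NormalDist

# Hypothetical example: 50 measurements with sample mean 52.1,
# testing H0: µ = 50 with a known population standard deviation of 8.
n, sample_mean, mu, sigma = 50, 52.1, 50.0, 8.0

z = (sample_mean - mu) / (sigma / sqrt(n))      # z statistic
p_value = 2 * (1 - NormalDist().cdf(abs(z)))    # two-tailed p-value

print(round(z, 3), round(p_value, 3))
```

Here the z statistic (about 1.86) falls short of the 5% two-tailed critical value of 1.96, so the null hypothesis would not be rejected at that level.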

2. T-Test – A t-test is an inferential statistic used to determine whether there is a significant difference between the means of two groups that are related in some way. This test is also called Student's t-test. It is used when the variables are continuous, the sample size is less than 30, and the population standard deviation is not known. The t statistic is used to decide whether or not to reject the null hypothesis.

t = (m − µ) / (s / √n)

Where,

t = Student's t-test statistic

m = sample mean

µ = assumed (population) mean

s = standard deviation of the sample

n = number of observations
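A minimal one-sample t-test sketch, t = (m − µ)/(s/√n); the ten readings and the assumed mean of 120 are invented for illustration:

```python
from math import sqrt
from statistics import mean, stdev

# Hypothetical data: readings from 10 patients, testing H0: true mean = 120.
readings = [118, 122, 121, 119, 125, 124, 117, 123, 126, 120]
mu = 120.0

n = len(readings)
m = mean(readings)            # sample mean
s = stdev(readings)           # sample standard deviation (n - 1 denominator)
t = (m - mu) / (s / sqrt(n))  # t statistic with n - 1 = 9 degrees of freedom

print(round(t, 3))
```

The resulting t statistic (about 1.57) would be compared against the critical t value for 9 degrees of freedom (about 2.26 at the 5% level, two-tailed), so the null hypothesis would not be rejected here.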

3. Chi-squared Test – A chi-squared statistic evaluates how well a model matches actual data. To use the chi-squared test, the data must be random, mutually exclusive, and drawn from independent variables in a large sample.

χ²c = Σ (O − E)² / E

where:

c = degrees of freedom

O = observed value(s)

E = expected value(s)

There are two types of χ² test: the test of independence and the goodness-of-fit test. A χ² test of independence can show how likely it is that random chance explains any observed difference between the actual frequencies in the data and the theoretical expectations.
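A minimal goodness-of-fit sketch of the χ² statistic, Σ(O − E)²/E; the die-roll counts below are invented for illustration:

```python
# Hypothetical data: 60 rolls of a die, testing H0: the die is fair,
# so each face is expected to appear 10 times.
observed = [8, 12, 9, 11, 6, 14]
expected = [10] * 6

chi2 = sum((o - e) ** 2 / e for o, e in zip(observed, expected))
df = len(observed) - 1  # degrees of freedom: categories minus one

print(round(chi2, 2), df)
```

The statistic (4.2) is compared with the critical χ² value for 5 degrees of freedom (about 11.07 at the 5% level); since 4.2 is well below it, the fair-die hypothesis would not be rejected.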

4. F-Test – Any statistical test whose test statistic follows an F-distribution under the null hypothesis is known as an F-test. It is generally used to compare statistical models that have been fitted to a data set, to find which model best fits the population from which the data were sampled. To perform an F-test, the populations must be normally distributed and the samples must be random. If the F-test findings are statistically significant, the null hypothesis is rejected; otherwise, it is not. The F statistic for large samples:

F = σ1² / σ2²

Where,

σ1² = variance of the first population

σ2² = variance of the second population
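A minimal sketch of the F statistic as a ratio of two sample variances; both samples below are invented for illustration:

```python
from statistics import variance

# Hypothetical samples; the F statistic is the ratio of the two sample
# variances, conventionally with the larger variance in the numerator.
sample1 = [14, 18, 21, 25, 29, 33]
sample2 = [15, 17, 18, 20, 21, 22]

var1 = variance(sample1)  # sample variance (n - 1 denominator)
var2 = variance(sample2)

f = max(var1, var2) / min(var1, var2)
df1 = df2 = len(sample1) - 1  # degrees of freedom for each sample

print(round(f, 2))
```

The resulting ratio (about 7.16) would be compared against the critical F value for (5, 5) degrees of freedom; a significant result indicates that the two population variances differ.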

## 5 Steps for Publishing in a High-Quality Medical Journal

Scientific writing and publishing is a vital component of medical advancement. Publications are used to transmit new developments in human knowledge to the rest of the world. This knowledge must be accurate, valid, reproducible, and clinically valuable. Many ambitious physicians and scientists aspire to publish their work in high-impact publications.

What are the Effective Steps for Publishing in a High-Quality Medical Journal?

Choose a Journal and Read the Journal’s Instructions.

It is critical to decide on authorship and the order of authors, including the corresponding author, ahead of time. All authors listed on the final manuscript must have contributed substantially to the work and must accept public accountability for the publication's content. When preparing your paper, carefully observe the author guidelines, including the word limit and the number of tables and figures allowed. Many journals permit the submission of supplemental material as part of your publication, subject to the word count and figures/tables constraints.

Prepare the Manuscript

• Organize the Manuscript – Begin by outlining the manuscript with this basic structure in mind. The first rough draft would be a list of crucial points to describe beneath each section and subsection.
• Prepare the Manuscript – Pay attention when drafting the manuscript to avoid plagiarism (including self-plagiarism), fraud, and fabrication.
• Get feedback from colleagues and revise the manuscript – After completing the draft, share it with co-authors and one or two non-author colleagues for criticism and feedback. After revising, polish the English to verify that your ideas have been appropriately and fully communicated.

Submit the Manuscript

The majority of scientific journals now require submissions to be made through their websites. Most journals allow authors to propose reviewers who should and should not evaluate the article during the manuscript submission process. Such suggestions are beneficial to the journal's editorial staff. Choose your title and keywords carefully so that readers can discover your paper. A brief cover letter that includes a two- to three-sentence overview of the manuscript's relevance can provide essential context for the editor.

Receive the Editor’s Communication and Revise the Manuscript

Acceptance of a paper as submitted is extremely rare. Do not be offended if your manuscript is rejected. In reality, only about one in every four articles is approved by a top journal. If adjustments are requested, it is critical to provide a thoughtful and respectful response to maximize the likelihood of later acceptance.

Resubmit the Manuscript

When submitting a revised manuscript, include a cover letter to the editor with point-by-point responses to the reviewers' remarks and suggestions, and address all criticisms as thoroughly as possible. It also helps to highlight the changes in the updated text so that they can be evaluated easily.

Conclusion

To summarise, the process of publishing a manuscript in a high-impact journal begins with selecting an important question, designing a sound study with statistical power, carrying out the work with impeccable integrity and attention to detail, writing an excellent manuscript, submitting it to the appropriate journal, fully responding to reviewer comments, and completing the standard post-acceptance checks. Nothing beats the satisfaction of seeing your paper published and visible to the rest of the world.

## Medical Grant Proposal Writing

What is Grant Proposal Writing?

A grant proposal is a document or collection of documents submitted to an agency for the express purpose of obtaining funding for a research project. Successful grant applications are the result of a lengthy process that starts with an idea. Grant proposal writing is a circular process, although many people think of it as a linear one (from concept to proposal to award).

What are the Steps in Grant Proposal Writing?

Formulate a Research Question

Many people begin by formulating a research question. This is simpler if you know what you want to accomplish before you start writing.

• As a direct result of your project, what expertise or information would you gain?
• What is the significance of your study in a wider sense?
• You must make this aim clear to the committee that will be evaluating your application.

Define a Goal

• You must first determine what type of research you will conduct and why before you begin writing your proposal.
• What exactly are you up to, and why are you up to it? Give a reason for your decision.
• Feel free to show some initiative and tackle a dilemma, just make sure you can justify why and then persuade us that you have a good chance of succeeding.
• Demonstrate the approach’s uniqueness by presenting the information void that needs to be filled.

Find Funding Agencies

• Whether or not your plan is funded is largely determined by how well your intent and objectives align with the priorities of awarding agencies.
• Locating potential grantors is a time-consuming process, but it will pay off in the long run.
• Even if you have the most compelling research idea in the world, if you don’t submit it to the right institutions, you’re unlikely to be considered for funding.
• There are a plethora of resources available to learn more about granting agencies and grant programs.
• Most colleges, as well as several schools within universities, have research offices whose primary function is to assist faculty and students with grant applications.
• To assist people with finding potential grants, these offices typically have libraries or resource centers.

Do Internal Review

• Seek the advice of a mentor or a senior colleague for a second opinion. Remember to check the following items:
• Title
• Introduction about the Medical Research
• Problems in the Medical Research
• Objectives
• Preliminary Literature Review
• Research Methodologies
• Research Plan
• Make a Budget Plan
• Reference

• Your proposal must convince the funding body that it is a worthwhile investment.
• There must be a strategy for every aspect of the mission.
• Analysts will go over it carefully to ensure that the study’s components are affordable.
• Your application can be rejected due to over-costing.
• Consider whether the advancement you can make in the field justifies the expense.

Conclusion

Before writing a Research Protocol, identification of Sponsors and Understanding Application Guidelines is vital. Many companies are providing professional grant writing services.

## Importance of Meta-analysis in Medical Research

Writing articles related to medical research today has to follow certain well-accepted forms of research in order to be accepted. Of the various types of medical research articles, studies based on meta-analysis are one of the most well-accepted forms.

The concept of meta-analysis stems from the field of statistics. Meta-analysis is the process of combining the results of multiple scientific studies in the same field. Statistically, if there are multiple experiments along the same lines, the results of each are prone to certain degrees of error (the most common are Type 1 errors, or false positives, and Type 2 errors, or false negatives). However, if all those studies are pooled together, the net derived result is less likely to contain such errors and will yield a more definitive conclusion. The key benefit of aggregating data is higher statistical power, and thereby more robust point estimates than any individual study can provide.

Meta-analysis based medical research started in the 70s and has gained immense popularity ever since. In fact, statistics suggest that meta-analysis based medical articles are the most cited articles in the field.

While meta-analysis generally refers to quantitative studies, there exists another form, called meta-synthesis, which integrates the results of multiple related qualitative studies. The approach to meta-synthesis is interpretive rather than aggregative. Before embarking on your work, you first need to determine whether to conduct a meta-analysis or a meta-synthesis, based on the type of studies in your field.

How to Approach Writing a meta-analysis medical research paper?

Meta-analysis is conducted to assess the strength of evidence, usually on the efficacy of a particular treatment for a specific disease. Meta-analysis seeks to collate multiple pieces of evidence to arrive at a conclusion: whether any direct or indirect effect exists, and/or whether such an effect is positive or negative (particularly pertaining to a specific type of treatment). Examining heterogeneity across studies is also important, as it is vital for the development of any new hypothesis.

Key to such a study is developing the pooled data set, based on a thorough filing and coding system and proper categorization of data, and thereby identifying the key analytical data-crunching exercise that is expected to yield the best results.

The key to a good meta-analysis exercise is proper identification of the different methods adopted in each exercise and thereby identifying how they may have affected the findings of those exercises. Identifying and mapping methods is also critical given not all methods are comparable and therefore all data may not be compatible.

Tools for meta-analysis

Needless to say, the main tools for such analysis are dedicated statistical software packages, such as SAS, Stata, and R. The data is usually pooled from recognized medical research sources like PubMed, Embase, or CENTRAL.
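The basic aggregation step behind a meta-analysis can be sketched as fixed-effect inverse-variance pooling; the per-study effect sizes and standard errors below are invented for illustration, not taken from any real studies:

```python
# Hypothetical per-study effect estimates and their standard errors.
effects = [0.30, 0.45, 0.25, 0.50]
std_errors = [0.10, 0.15, 0.12, 0.20]

# Inverse-variance weights: more precise studies count for more.
weights = [1 / se**2 for se in std_errors]

pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
pooled_se = (1 / sum(weights)) ** 0.5  # standard error of the pooled estimate

print(round(pooled, 3), round(pooled_se, 3))
```

Note how the pooled standard error (about 0.065) is smaller than any single study's, which is exactly the gain in statistical power that pooling provides; real analyses in R or Stata would also test for heterogeneity before choosing between fixed- and random-effects models.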

Reporting of results usually follows the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines.

## All About Medical Case Report

Medical practitioners come across a large number of patients with variations in disease expression and subsequent prognosis. Sometimes they encounter a unique, unreported disease phenotype, or they may notice a significant outcome from an innovative combination treatment. These details are crucial and need to be recorded to provide direction to both doctors and researchers. However, in the hustle of a medical practitioner’s profession, it can seem an overwhelming task to report such new and unique observations in the form of a full-length paper. Does this mean publishing is all about lengthy research articles or review writing? The answer is no; the alternative is a MEDICAL CASE REPORT.

What do we mean by Medical Case reports (MCR)?

It is a written communication of a previously unknown or rare disease presentation. An MCR can also report a de novo treatment strategy of utmost medical significance. Since it is a frontline form of evidence, it can pave the way for the development of medical sciences, bringing better diagnostic and prognostic strategies.

Components of MCR and their relevance

Similar to any scientific text, an MCR has a specific format, to provide clarity and keep the readers engaged until the end.

• Title
• Abstract: Short and crisp conveying the gist of the article in a structured manner.
• Background: Origin of the study, with highlights of what is known to date.
• Case Presentation: Physical examination and pertinent findings for the subject, in chronological order.
• Discussion: Illustration of the uniqueness of the case and its future significance.
• Conclusion: Take home message of MCR.
• References
• List of Abbreviation
• Consent
• Author information
• Acknowledgment
• Cover Letter

Conclusion: An MCR can be the first step toward revolutionizing the field of medical sciences with striking findings of real-life significance.

## Modulation of life expectancy by weight statistics in young adulthood

An ideal weight not only impacts a person’s overall personality but is also a hallmark of good health. Weight management is often googled in view of its various implications, mostly in the form of grave lifestyle-associated disorders.

Obesity implication in early lives:

The obesity statistics in India are alarming and are expected to triple by 2040, as suggested by the latest research. Narrowing the focus to the early adolescent population worldwide, the figures are scarier still.

Contrary to the age-old belief, childhood is not a “time of liberty” in terms of the food choices we make, and caution about the ingredients on the serving plate does not apply only to old age. Junk food, high-sugar diets loaded with trans fats, lack of exercise, and irregular, unhealthy eating habits are predisposing the adolescent population to obesity. The prevalence of primary or secondary obesity at a younger age has emerged as a deadly threat, narrowing the survival bracket to just middle age. Research has shown that around 12.4% of premature deaths can be linked to obesity in early adulthood.

The timeline of weight management:

A retrospective survey using BMI (body mass index) values corresponding to the participants’ early years has shown that the negative implication of obesity for life expectancy remains unaltered. The tough part is that the damage caused by being obese at a young age cannot be countered by weight reduction in later years. Moreover, the probability of premature death remains unchanged in both “presently overweight but previously obese” and “currently overweight” participants. This signifies the irreversible damage to cellular dynamics caused by early-onset obesity, which subsequently paves the way for other comorbidities, such as cardiovascular disease and diabetes, in later life.

Conclusion:

Weight management awareness at the right age is warranted to build a healthy population with survival fitness. Inculcating proper lifestyle habits and realizing the gross detrimental effects of obesity at the beginning of early adulthood shall help build future generations with a superior quality of life.

## The double catastrophe of cardiovascular-pulmonary disorders

Darkness descended on the lives of a family with the sudden demise of their only son, aged 38 years, who was undergoing treatment for an interstitial lung disorder. The death report stated cardiac arrest, not pulmonary failure, as the cause of the untimely demise. This incident, like many others, hints at the fine-tuning along the cellular-functional axis between the most vital organs, the lungs and the heart, that supports life.

Pulmonary cardiovascular disease

Cardiovascular and pulmonary diseases are both leading causes of death worldwide. Cardiovascular disease that develops in respiratory patients has a high mortality rate, apart from affecting day-to-day life. For instance, patients with lung fibrosis or COPD are more likely to die of heart failure than those with a pulmonary issue but no cardiac involvement. The irony is that patients with lung disease are less likely to receive coronary revascularization or a coronary artery bypass graft. This is due to the similar symptoms of the two conditions and the complexity of management in the context of existing lung complications.

The dynamics of Heart Lung Reciprocity

The lungs and heart not only share the thoracic cavity but are also functionally interdependent. The load of transporting oxygen-laden blood is a composite effort of both organs. This is evident in heart diseases, which have breathlessness as a hallmark.

Pulmonary conditions such as ILD can exert back pressure on the heart, known as pulmonary arterial hypertension, causing right-sided heart failure. Conversely, conditions like left-sided heart failure, mitral stenosis, and myocardial infarction can cause pulmonary edema, or waterlogging of the lungs, due to venous hypertension. Fluid accumulation in the air sacs, or obstruction of blood flow due to fibrosis, leads to a build-up of arterial pressure, resulting in morbidity due to heart failure.

Conclusion

Advancements in medical and surgical intervention have undoubtedly increased life expectancy and strengthened emergency care. However, the major lacuna to date is the mystery of prognosis in unraveling complex lung disorders. The situation is highly alarming when the pumping organ becomes involved, eventually leading to the untimely catastrophe of death.

## Diabetic Mice Improve With Retrievable Millimetre-thick Cell-laden Hydrogel Fibers

There has been a recent advancement in the treatment of Type 1 diabetes: researchers at the Institute of Industrial Science, The University of Tokyo, found that the diameter of cell-carrying hydrogels can determine their longevity after transplantation, making cell therapy for Type 1 Diabetes Mellitus more efficient. In short, researchers have come up with a fiber-shaped hydrogel transplant that could be successful at treating T1DM.

Type 1 diabetes, also known as insulin-dependent diabetes, is becoming an increasingly common disease among the young and adults alike. A 2017 report revealed that about 425 million adults in the 20-79 age group were living with diabetes around the world. The cause of Type 1 Diabetes Mellitus is autoimmune destruction of β-cells, which are responsible for producing insulin, an important hormone that facilitates the flow of sugar into cells to produce energy.

At the moment, the treatment for T1DM involves timed exogenous insulin administration and continuous blood glucose measurements. This creates an unnecessary burden not only on the patient but also on the health system. The aim of the new cell therapy is to eliminate the need for insulin replacement, as it focuses on substituting the lost pancreatic β-cells. Although cell replacement therapy appears to be an interesting option, its clinical success has been quite limited. It is often compared to an organ transplant, whose outcome depends highly on transplant acceptance. Foreign body reactions are a common factor behind transplant rejection. The idea behind this cell therapy is to use hydrogels to provide long-term protection for the transplanted cells.

This is not the only research that focuses on the replacement of damaged cells to cure this autoimmune disorder; whole-pancreas transplants have already been successful at many clinics. It was noted that even though blood sugar levels were immediately restored following the transplant, the transplanted pancreas survived for as little as six months. This improved over the years with the advancement of technology, but even at present, the transplant does not survive for more than three years.

Interestingly, the study conducted by the researchers at the University of Tokyo revealed that the diameter of these hydrogel fibers can be a determining factor in foreign body reactions. The tests were conducted on diabetic mice. Barium alginate (Ba-Alg) hydrogels with different fiber diameters were first implanted into normal mice, demonstrating that immune reactions were quite low at a 1.0 mm diameter. To further support this claim, the researchers compared these findings with the foreign body reactions at 0.35 mm. The results revealed that enclosing islets in 1.0 mm-thick hydrogel fibers provided long-term immune protection for the islets of Langerhans and also helped maintain glycemic control in diabetic mice. These fibers also allowed small molecules such as glucose, insulin, and oxygen to pass through the membrane, which is crucial for the proper functioning of the cells.

The findings definitely give hope, and more clinical trials could help close the remaining gaps.

## Raw milk may do more harm than good

There is a popular belief that consuming raw milk or raw milk products is a healthier option than consuming pasteurized milk or milk products. Many misconceptions exist about pasteurization, with some suggesting it leads to a loss of essential nutrients and others accusing pasteurization of being an artificial or ‘unnatural’ processing of milk that risks spoiling the product.

However, scientific studies have found all such allegations to be unfounded. What is more disconcerting is that scientific evidence suggests consuming raw milk or raw milk products poses definite health threats, and that pasteurization of milk is critical before it is consumed.

Critical scientific evidence from across the world suggests the following:

It is true that the heating process during pasteurization affects some nutrients in raw milk, namely thiamine, vitamin B6, and folic acid within the B-complex, and vitamin C. However, our present diet ensures these nutrients are received from other sources, so missing them in milk products does not affect us much.

In contrast, raw milk and raw milk products are more likely to contain harmful germs like Brucella, Campylobacter, Cryptosporidium, E. coli, Listeria, and Salmonella. Between 1993 and 2012, the Centers for Disease Control and Prevention (CDC) in the USA reported 127 outbreaks of disease due to raw milk, which included 1,909 illnesses and 144 hospitalizations.

Scientists at UC Davis have discovered that leaving milk unrefrigerated, often done intentionally to allow it to ferment into clabber, actually leads these bacteria to develop antimicrobial-resistance genes, which then make them immune to antibiotic medicines.

The human gastrointestinal tract in modern times is often unable to digest certain components that we once could, or to resist certain types of bacterial infection. A major factor in this development has been the advances made in medicine and the way the human digestive system has evolved alongside them. Thus, the gastrointestinal tracts of infants and young children, older adults, pregnant women, and people with weakened immune systems (such as people with cancer or an organ transplant) are unable to face the challenge of the different types of bacteria present in raw milk. While most healthy people can recover from an illness caused by harmful bacteria in raw milk products, there is always the risk that some may develop symptoms that are chronic, severe, or even life-threatening.

A big source of risk stems from raw milk products like cheese, given the new-found customer preference for hand-made, or ‘artisan’, cheese. Often these are part of ‘back to nature’ products, where the milk and cheese are produced on ‘organic farms’ without pasteurization. While handmade cheese from pasteurized milk is not a concern, such products made from unpasteurized milk risk contamination (animal feces, dirt), cow diseases (mastitis, bovine tuberculosis), cross-contamination from dairy workers, and other hazards that raise the risk of harmful elements in the raw milk.

It is therefore always advisable to consume only pasteurized milk and milk products.

## AI making its way to improve results and efficiency in Cardiovascular Imaging

Artificial intelligence is showing reassuring results in cardiology, especially in the area of cardiovascular imaging. Ever wondered how it works?

Machine learning algorithms (a subdivision of AI) are making it possible for cardiologists to uncover new opportunities and delve into discoveries that are difficult to notice using conventional techniques. This opens new gateways that aid medical decision-making.

Key features of AI in Cardiovascular Imaging:

• AI can help improve performance at low cost thereby facilitating decision making, interpretation, and precise image acquisition of anatomical structures as well as diagnosis.
• The big data obtained using imaging will be helpful in personalizing medical treatments and keeping an electronic record of patient data, health records, and outcome data.
• It is believed that this will help physicians work more efficiently on core issues while computers will handle the technical part.

Different kinds of imaging possibilities using AI

Echocardiography – This is the most commonly used imaging technique in cardiology, but it is highly user-dependent. It is also important to undergo serious training in order to interpret the results accurately. AI can be a low-cost alternative to provide a standardized analysis of echocardiographic images. It has already shown great success in this area.

Computed Tomography – AI has been greatly appreciated in the field of cardiac CT, as it has helped with noise reduction and image optimization, thereby helping avoid invasive coronary angiography (ICA) in the identification of severe stenosis.

Magnetic Resonance Imaging – This includes anatomical images of various aspects of the heart including flow imaging, contractile function, perfusion imaging, and myocardial characterization. As is the case with Echocardiography, MRI is also highly user-dependent. Reports have shown that implementing computer-aided detection in the clinical setup can increase accuracy and simplify the analysis.

Nuclear Imaging of the heart – This is performed to assess any perfusion defects within the myocardium. AI-based models can improve the clinical value of the results obtained. They have been highly successful at detecting abnormal myocardium in coronary artery disease (CAD), and their efficiency is on par with manual analysis of the images received.

Conclusion

Technology is not new to humans. We’re getting more and more comfortable with the idea of relying on machines for the safer and more accurate conduct of our day-to-day lives. In the case of cardiovascular imaging, AI has proved promising in various ways. Here are the reasons why the medical industry is ready to adopt computer-aided detection and diagnosis in cardiology:

• Detection and diagnosis of disease
• Interpretation of data
• Collection and comparison of data for future studies
• Clinical decision making
• Accurate image acquisition
• Reducing health care expenditure
• Reducing the workload for physicians

AI seems to have a lot of pros for the medical industry, but it needs to be made perfect with more testing and re-testing. It will definitely be a great tool for cardiologists as it is capable of recognizing patterns that are otherwise difficult to assess for the human brain.