Monday 31 October 2016

Risk Assessment and Management for Healthcare Manufacturing

Tim Sandle has published a new book titled “Risk Assessment and Management for Healthcare Manufacturing: Practical Tips and Case Studies.”

Avoidance of hazards and assessment of risk have long been part of the manufacture of pharmaceuticals and healthcare products. A high quality drug product must be free from contamination and must reliably deliver the intended therapeutic dose stated on the label; to achieve this, manufacturers must always be mindful of risk.

Tim Sandle's newest book incorporates regulatory perspectives, scientific methods and practical examples to describe approaches to problem solving when assessing, managing and reviewing risk. The book is divided into four sections that present a formal approach to risk. The first section provides a look at risk assessments and hazards, exploring the origins, looking at key concepts and philosophies and assessing the regulatory perspective. An overview of available tools for risk assessment and problem solving leads into specific 'soft skills' that can help to run an effective meeting, oversee a project and report root cause analysis and risk outcomes. The book concludes with an extensive set of case studies to show real-world applications of the tools and techniques presented. The wide range of topics presented throughout the four sections includes risk considerations for aging pharmaceutical facilities, application of quality risk management to cleanroom design and process incident investigation.

Further details about the book and ordering details can be found via the PDA Bookstore:

The contents of the book are:

Part A: Risk Assessment and Hazards
  • Risk Assessment and Risk Management
  • Regulatory Perspectives on Risk
  • Pharmaceutical Processing Hazards
  • Root Cause Analysis

Part B: Risk Assessment Tools and Problem Solving Approaches
  • Question Based Approaches: The "Five Whys" and "What if" Methods
  • Is/Is not Approach
  • Simple Risk Assessment Tools
  • Fishbone (Ishikawa) Diagram
  • Contradiction Matrix and Knot Charts
  • Pareto Charts and Control Charts
  • Hazard Analysis and Critical Control Points
  • Failure Modes and Effects Analysis
  • Monte Carlo Method
  • Fault Tree Analysis
  • Hazard and Operability Study
  • Six Sigma and Associated Quality Tools

Part C: Practical Tips
  • Effective Meetings and the Process of Brainstorming
  • Project Management and Research
  • Reporting Risk Outcomes

Part D: Case Studies
  • Application of Quality Risk Management to Cleanroom Design
  • Case Study: HEPA Filter Failure
  • Aseptic Transfer Risk Assessment: A Case Study
  • Importance of Risk Assessment for Aseptic Transfer in Pharmaceutical Compounding
  • Risk Assessment for Intervention Scoring in Relation to Aseptic Processing
  • General Considerations for the Risk Assessment of Isolators Used for Aseptic Processes
  • Risk Considerations for the Use of Unidirectional Airflow Devices
  • Risk Consideration for Aging Pharmaceutical Facilities
  • Process Incident Investigation
  • Addressing Manufacturing Constraints by Increasing Production Throughput
  • Risk Assessment of Production Formulation Stages
  • Risk Based Approach to Internal Quality Auditing
  • Risk Assessment for Data Integrity
  • Assessment of Raw Material Handling and Expiration
  • Risk Management and the Supply Chain
  • Risks Associated with Clinical Trials
  • Application of Risk Assessment to Develop an Environmental Monitoring Program
  • Detection and Risk: Environmental Monitoring Data Deviations
  • Application of Risk Assessment for Personnel Safety
  • Risk Considerations for the Installation of a New Pharmaceutical Facility Autoclave
  • Safety Risk Assessment for the Ozonation of a Purified Water System
  • Error Risk Reduction: Concept and Case Study

The reference is:

Sandle, T. (2016) Risk Assessment and Management for Healthcare Manufacturing: Practical Tips and Case Studies, PDA / DHI, Bethesda, MD, USA.

Posted by Dr. Tim Sandle

Sunday 30 October 2016

Shingles And Asthma Linked Together

A new study has connected shingles to increased rates of acute cardiovascular events such as ischemic stroke. The same research, from the Mayo Clinic, has drawn a connection between developing shingles and asthma.

Shingles (herpes zoster), caused by reactivation of the varicella zoster virus, is a disease characterized by a painful skin rash with blisters. Symptoms include a burning rash, together with headache, fever, and malaise.

Moreover, the research indicates that childhood asthma is linked to a higher risk of developing shingles as an adult. This finding was derived from a review of medical records of adults aged 50 and over from Olmsted County, Minnesota. These data were cross-referenced with the frequency of asthma in people diagnosed with shingles.

It was found that the mean age of the patients with shingles was 67 years. Using logistic regression, the researchers calculated that adults who had asthma had a 70% higher risk of developing shingles compared with those without asthma.
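In a case-control study of this kind, a "70% higher risk" from logistic regression corresponds to an odds ratio of about 1.7. As a minimal sketch of how such a figure arises (the counts below are hypothetical illustrations, not the Mayo Clinic study's actual data):

```python
# Sketch: odds ratio from a 2x2 case-control table.
# 'Exposed' here means a history of asthma; all counts are made up.

def odds_ratio(cases_exposed, cases_unexposed, controls_exposed, controls_unexposed):
    """Odds ratio for a 2x2 case-control table."""
    odds_in_cases = cases_exposed / cases_unexposed
    odds_in_controls = controls_exposed / controls_unexposed
    return odds_in_cases / odds_in_controls

# Hypothetical counts: 34 of 134 shingles cases had asthma;
# 20 of 120 matched controls had asthma.
or_value = odds_ratio(34, 100, 20, 100)
print(round(or_value, 2))  # → 1.7
```

An odds ratio of 1.7 is what is loosely reported as "70% higher risk" in a case-control design of this type.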

The inference from this study is that older adults with asthma should consider being immunized against shingles by vaccination.

Speaking with Medline, Dr. Young Juhn, a general academic pediatrician and asthma epidemiologist at the Mayo Clinic Children's Research Center, noted: “As asthma is an unrecognized risk factor for zoster [shingles] in adults, consideration should be given to immunizing adults aged 50 years and older with asthma or atopic dermatitis as a target group for zoster [shingles] vaccination.”

Research detailing the connection has been published in the Journal of Allergy and Clinical Immunology. The research is titled “Asthma as a risk factor for zoster in adults: A population-based case-control study.”

Posted by Dr. Tim Sandle

Wednesday 26 October 2016

EMA paper on production of water for injections (WFI) by non-distillation methods

The European Medicines Agency has published a question and answer paper on the production of water for injections (WFI) by non-distillation methods.

Water for injections in bulk is obtained from water that complies with the regulations on water intended for human consumption laid down by the competent authority or from purified water. It is produced either:

  • By distillation in an apparatus of which the parts in contact with the water are of neutral glass, quartz or a suitable metal and which is fitted with an effective device to prevent the entrainment of droplets; or
  • By a purification process that is equivalent to distillation. Reverse osmosis, which may be single-pass or double-pass, coupled with other appropriate techniques such as electro-deionisation, ultrafiltration or nanofiltration, is suitable. Notice is given to the supervisory authority of the manufacturer before implementation.

EMA/INS/GMP/489331/2016: Questions and answers on production of water for injections by non-distillation methods – reverse osmosis and biofilms and control strategies

Posted by Dr. Tim Sandle

Sunday 23 October 2016

Will Big Data Influence Pharmaceuticals for the Better?

Today, it feels like everything is connected. You can access your files from a computer, a smart phone, a tablet or any other interconnected device — and that only accounts for your own personal files and devices.

Guest post by Megan Ray Nichols

Cloud storage and other big data creations have changed the way we look at and store information. What impact will these changes have on the pharmaceutical industry? Will these changes be able to alter the industry for the better, or could they possibly present new problems?

Waiting for Answers

One of the biggest problems researchers face in the pharmaceutical industry is the fact that information, in general, is treated as a proprietary and closely guarded secret. Individual researchers and companies spend a lot of time keeping their data protected from outside influence. They spend so much time protecting their information that when it comes time to share with the public, the investors, or other pharmaceutical companies, it becomes difficult or nearly impossible to disseminate the information.

Many companies have started to share their raw clinical trial information with the industry, but it’s a slow process. In the meantime, the data that might lead to the next wonder drug or medical breakthrough is sitting in limbo, gathering virtual dust because it can be so difficult to access.

Genomics and Data

One of the biggest uses for big data in the pharmaceutical industry is in the field of genomics. You need a lot of space and quite a bit of computing power to sequence a human genome — when you’re dealing with 25,000 genes and three billion base pairs of DNA, you’re looking at about 1.5 gigabytes of storage per genome sequenced. To put it in perspective, that’s about the size of a 1080p movie file.
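The storage figure quoted above can be sanity-checked with a back-of-envelope sketch. Note the assumptions here are ours, not the article's: the four DNA bases pack into 2 bits each, and real-world sequence formats roughly double the raw size with metadata and indexes.

```python
# Back-of-envelope check of the ~1.5 GB-per-genome figure.
BASES = 3_000_000_000      # base pairs in a human genome
BITS_PER_BASE = 2          # A, C, G, T -> 2 bits at maximum packing

raw_bytes = BASES * BITS_PER_BASE / 8
raw_gb = raw_bytes / 1e9           # ~0.75 GB fully packed
with_overhead_gb = raw_gb * 2      # ~1.5 GB assuming format overhead doubles it

print(raw_gb, with_overhead_gb)    # → 0.75 1.5
```

A few hundred genomes at this size quickly reach hundreds of gigabytes, which is where the cloud storage discussed below comes in.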

If you’re sequencing the genomes of a couple hundred test subjects, you may find yourself carting around dozens of heavy hard drives in order to carry all of that information. Alternatively, though, you could look into cloud data storage.

The ability to share genomic data via cloud technology has two main benefits. First, it cuts down on the amount of physical storage space you need. All that’s required is a computer with Internet access to connect to all of your data.

Second, it makes it easier and faster to share raw research data with the rest of the pharmaceutical community. If a researcher in Tokyo makes a discovery that could shake the entire industry to its core, they don’t have to sit through the peer review process, waiting weeks or months to publish a paper that could change the world. All they have to do is upload their research to the cloud. It’s as simple as that.

Now, most researchers aren’t uploading their publishable discoveries, preferring instead to share their raw research data, but the platform is still there to provide a stepping stone for genomic data discoveries.

Quantifying Data

While cloud storage is a great platform for sharing research data, that’s not the only thing it’s good for. It can also be used to help researchers quantify the raw data that has been placed in the cloud.

Having sequenced genome information stored for a variety of different subjects is great, but it can be a bit daunting to sift through if you’re specifically looking for subjects of a specific race, gender or age.

Cloud storage, when paired with a little bit of simple software, allows researchers to search through the stored data to find specific traits without having to pick through each individual genome to find what they're looking for. Cambridge Semantics' semantic web software is just one of the tools researchers can use to sift through the raw data to find the traits they're looking for.

Crowdsourcing Our Genetics
Crowdsourcing has become a great tool for people who need to raise money, gather information, or in many cases, even make dramatic scientific discoveries. Stanford’s Folding@Home Project, for example, has been using personal computers around the globe for 16 years to find the answers to puzzles that have otherwise eluded scientists.

FoldIt, on the other hand, is a more interactive game that allows users to actively work toward the solution rather than watching the proteins passively fold. In 2011, users found the answer to a problem that had eluded scientists for 15 years. The amazing part is that collectively, FoldIt players were able to find the answer in three weeks.

If big data can do that for the pharmaceutical industry using random Internet visitors, imagine what it could do with a crowd of industry professionals around the globe.

Risks vs. Rewards
Bringing big data into the pharmaceutical industry has the potential for great rewards. Unfortunately, whenever you bring the Internet into the data equation, there’s always some risk as well.

The first risk you take is that the data you’re going to find is just random junk with no real use or application. Even if you severely limit the number of people who can access your data cloud, there’s still a chance that someone will upload some useless data that could potentially skew any and all results.

The second, and arguably the most dangerous risk, is data privacy. All it takes is one person with nefarious intentions gaining access to your data cloud, and all of your patients' information could be at risk. Hackers are targeting medical information now more than ever because it’s more valuable than credit card information and not checked nearly as often.

It is possible to reduce this risk by removing personal information from the data, beyond the bits of information needed to classify the raw information. Removing names, insurance information and other personal identifiers can help to protect your subjects while still allowing you to take advantage of the raw study data.

What It Means for Big Data and Pharmaceuticals

Overall, the use of Big Data in the pharmaceutical industry is going to be a force for great good and the launching point for many ingenious advances in the industry. There’s no telling what amazing things will fall from that cloud next!

Saturday 22 October 2016

Moist-heat sterilization of blood bags

Vittorio Mascherpa has written an interesting article on the sterilization of blood collection bags. The abstract reads:

“This article provides basic information on the sterilization of blood bags systems by moist-heat. Problems of pressure compensation and steam penetration into the system parts without water inside, and the process choice between single and double autoclaving are discussed.”

The article can be accessed here.

Posted by Dr. Tim Sandle

Thursday 20 October 2016

WHO guidance on variations to multisource pharmaceutical products

The World Health Organization has updated Annex 10 of its GMPs “WHO general guidance on variations to multisource pharmaceutical products.”

According to the document:

“A marketing authorization (MA) holder or applicant is responsible for the quality, safety and efficacy (QSE) of a finished pharmaceutical product (FPP) that is placed on the market, throughout its life cycle. After the FPP has been authorized for marketing, the manufacturer will often wish to make changes (variations) for a number of reasons, for example, to respond to technical and scientific progress, to improve the quality of the FPP, to apply updates to the retest period for the active pharmaceutical ingredient (API) or shelf life of the FPP, to meet market requirements such as for scale-up or additional manufacturing sites, or to update product information (e.g. the information on adverse reactions). Such changes, regardless of their nature, are referred to as variations and may require the approval of the national medicines regulatory authority (NMRA) prior to implementation.”

The document can be accessed here.

Posted by Dr. Tim Sandle

Wednesday 19 October 2016

History of Pharmig

The history of the Pharmaceutical Microbiology Interest Group (Pharmig)

Posted by Dr. Tim Sandle

Tuesday 18 October 2016

More flexibility for authorisation of biocidal products

News on the regulation of biocides in Europe:


The regulation on the authorisation of same biocidal products has been updated to add new possibilities requested by industry. Companies are now able to get a national authorisation when a Union authorisation application exists for an identical product. The regulation will enter into force on 1 November.

Helsinki, 12 October 2016 – The Same Biocidal Products Regulation lays down a procedure for companies to get a secondary authorisation for a product based on either an existing authorisation or an on-going authorisation application for an identical product.
Following the update of the regulation, companies can now also use this authorisation procedure for:
  • a product when it is part of a product family for which an authorisation (or authorisation application) exists;
  • a product family when it is part of a larger product family for which an authorisation (or authorisation application) exists;
  • a national authorisation when a corresponding Union authorisation (or authorisation application) exists.
The regulation was updated by the European Commission after a request from industry stating that the changes would help reduce the costs and administrative burden for companies.

The regulation was published in the Official Journal of the European Union on 12 October and will enter into force on 1 November 2016.

To allow for the new possibilities, ECHA will launch updated versions of the biocides IT tools – R4BP 3 and the SPC Editor – when the regulation enters into force.

Support will be available: a new web page, a practical guide and updated submission manuals will be published and a webinar on this topic will take place on 9 November 2016.

Source: ECHA

Posted by Dr. Tim Sandle

Monday 17 October 2016

Control of WFI and Clean Steam Systems for Bacterial Endotoxins

Bacterial endotoxin presents a risk to several classes of pharmaceutical product, with parenteral products at the greatest risk. Bacterial endotoxin is the lipopolysaccharide (LPS) component of the cell wall of Gram-negative bacteria. It is pyrogenic and it is a risk to patients who are administered intravenous and intramuscular preparations. The pathological effects of endotoxin, when injected, are a rapid increase in core body temperature followed by extremely rapid and severe shock. In some cases, death can occur.

This is the topic of a new paper by Tim Sandle.

The reference is:

Sandle, T. (2016) Control of WFI and Clean Steam Systems for Bacterial Endotoxins, Journal of GxP Compliance,  20 (4): 1-15

For further details, contact Tim Sandle

Posted by Dr. Tim Sandle

Sunday 16 October 2016

Introducing Cleanrooms

This book provides an introduction to cleanrooms and clean air devices in GMP environments. The book explains what cleanrooms are, the contamination risks, key design features, and the requirements for classifying and operating them. The book includes detail on the 2015 update to the international cleanroom standard ISO 14644 (Parts 1 and 2).

Posted by Dr. Tim Sandle

Friday 14 October 2016

ATCC® Bacterial Culture Guide

ATCC offers a Bacterial Culture Guide, which contains general technical information regarding bacterial growth, propagation, preservation, and application.

To access the guide, go to ATCC.

Posted by Dr. Tim Sandle

Thursday 13 October 2016

Vote for PDA distinguished author or editor

The voting has opened for the PDA Distinguished Editor or Author award. The Editor/Author with the most votes by Dec. 31 will be awarded the 2016 PDA/DHI Technical Books Distinguished Editor/Author Award at PDA's 2017 Annual Meeting.

Five books have been selected, including one by Tim Sandle on risk assessment ("Risk Assessment and Management for Healthcare Manufacturing: Practical Tips and Case Studies").

To vote, please go to: PDA vote

Wednesday 12 October 2016

Detergent selection

Ensuring that medical equipment is free from soil requires effective cleaning and the selection of an appropriate detergent. Understanding the chemical characteristics of detergents is therefore necessary in order to make an appropriate choice and to implement a successful decontamination strategy.

Sandle, T. (2016) The importance of detergent selection, The Clinical Services Journal, 15 (8): 72-74

For further details, contact Tim Sandle

Posted by Dr. Tim Sandle

Tuesday 11 October 2016

Pharmacopeial Forum 42 (5)

A new edition of the Pharmacopeial Forum Vol. 42, No.5, has been issued. Items of interest include:

Chapter 197 Spectrophotometric Identification Tests

Proposed revisions:

1. A title change is proposed.
2. A new section, Introduction and Scope, is proposed.
3. A new section, Identification Methodology, is proposed.
4. A section heading change from Infrared Absorption to Infrared Spectroscopy is proposed.
5. Technical and editorial revisions for Infrared Spectroscopy are proposed. The main technical revision for this section allows the comparison of the spectrum of the test specimen to the spectrum of the USP Reference Standard previously recorded.
6. A new section, Near-Infrared and Raman Spectroscopy, is proposed.
7. A section heading change from Ultraviolet Absorption to Ultraviolet-Visible Spectroscopy is proposed.
8. A new section, X-ray Powder Diffraction Spectroscopy, is proposed.
9. A new section, Equivalent/Alternative Tests, is proposed.

Chapter 1210 Statistical Tools for Procedure Validation [NEW]

Proposed revision details:

This chapter is proposed as a companion to Validation of Compendial Procedures 1225 with the purpose of providing statistical methods that can be used in the validation of analytical procedures. Specifically, this revision clarifies the accuracy and precision calculations while removing specific linearity requirements. Linearity may be inferred from accuracy or other statistical methods as deemed appropriate. This chapter discusses all of the following analytical performance characteristics from a statistical perspective: accuracy, precision, range, detection limit, quantitation limit, and linearity. Additional related topics that are discussed in this proposed chapter include statistical power, the two one-sided tests of statistical equivalence, tolerance intervals, and prediction intervals.
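As an illustration of one of the tools named above, the two one-sided tests (TOST) equivalence approach can be sketched as follows. This is a minimal sketch only: it uses a normal approximation (to stay within the Python standard library) and hypothetical percent-recovery data, not the chapter's own worked examples; the chapter's t-based methods and acceptance criteria should be consulted for actual validation work.

```python
# Sketch of a two one-sided tests (TOST) equivalence check for accuracy
# (percent recovery) data, using a normal approximation. Data and the
# +/- 2% equivalence bounds are hypothetical illustrations.
from statistics import NormalDist, mean, stdev
from math import sqrt

def tost_pvalues(data, target, lower, upper):
    """One-sided p-values against the lower and upper equivalence bounds."""
    n = len(data)
    m = mean(data)
    se = stdev(data) / sqrt(n)
    z_lower = (m - (target + lower)) / se    # distance above the lower bound
    z_upper = (m - (target + upper)) / se    # distance below the upper bound
    p_lower = 1 - NormalDist().cdf(z_lower)  # H0: true mean <= lower bound
    p_upper = NormalDist().cdf(z_upper)      # H0: true mean >= upper bound
    return p_lower, p_upper

# Hypothetical percent-recovery results from an accuracy study
recoveries = [99.2, 100.4, 99.8, 100.1, 99.5, 100.3]
p_lo, p_hi = tost_pvalues(recoveries, target=100.0, lower=-2.0, upper=2.0)
equivalent = max(p_lo, p_hi) < 0.05  # both one-sided tests must reject
print(equivalent)
```

Equivalence is concluded only when both one-sided tests reject, which is what distinguishes TOST from a conventional significance test of "no difference".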

Chapter 1211 Sterilization and Sterility Assurance of Compendial Articles

There were two main objectives in updating this chapter. One was to make sure current information was accurate. The other objective was to develop a new format for the information, with improved organization of specific topics. It was decided to accomplish the update in two phases. The two-stage revision started in the last USP cycle (2010–2015). The initial focus of the revision was on the sterilization aspects of the chapter, to be followed by revision of content relevant to sterility assurance. In considering how to update 1211, which has its origins in the late 1980s, it was decided to split the chapter, separating out the sterilization content by moving it to a new chapter family, Sterilization of Compendial Articles 1229. The content on sterility assurance is remaining in 1211, and the chapter is being renamed Sterility Assurance 1211. With the development of the 1229 family of chapters, focused on the various sterilization processes, now nearly complete, the focus of the future 1211 will be limited to sterility assurance. This draft proposal, which is a major revision to the existing 1211, is a short-term fix while a complete rewrite is under development.

Additionally, minor editorial changes have been made to update this chapter to current USP style.

The following Briefing list includes monographs and/or chapters that both reference the General Chapter under revision and require revision to keep references to the General Chapter accurate. Other monographs and/or chapters may also be listed, even where the reference to the General Chapter remains unchanged, as additional notice to stakeholders where there is believed to be potential for the change in the general chapter itself to affect pass-fail determinations for particular monograph articles.

Chapter 1229.14 Sterilization Cycle Development [NEW]

This new chapter is an addition to the Sterilization of Compendial Articles 1229 family of chapters. Sterilization is not only a means to eliminate microorganisms, but is also a process that may alter materials and thereby impact usefulness, safety, or both. The correct development and implementation of a sterilization process involves a number of important steps. This chapter provides an overview of those steps and the general principles involved in the development of a sterilization cycle.

Chapter 1229.15 Sterilizing Filtration of Gases [NEW]

This new chapter is an addition to the Sterilization of Compendial Articles 1229 family of chapters. Sterilization processes are broadly divided into two categories: destruction of microorganisms and physical removal of those microorganisms from the material being sterilized. Autoclaving is an example of the former, and sterilizing filtration is an example of the latter. The sterilization of gases that contact sterile components, container and closures, and product contact surfaces of processing equipment is typically accomplished by passing the gas through one or more sterilizing-grade membrane filters. This proposed chapter provides an overview of the general principles involved in the sterilizing filtration of gases, including the types of filters used, the retention mechanisms, and the validation of the filtration process.

Posted by Dr. Tim Sandle

Monday 10 October 2016

Making science better: Swap ‘bad science’ with ‘good science’

In a previous article, we looked at ‘bad science’, including deliberately misleading research. Although not common, bad science still occurs. How can scientific findings be improved? We assess some best practices.

In this article: what makes for good science, and what steps are being taken to improve the quality of scientific research?
To begin with, what is good science? Good science is based on the examination of empirical or measurable evidence, with the findings subject to specific principles of reasoning (the ‘scientific method’). This is where disciplines masquerading as science fall foul: belief systems like homeopathy and chiropractic cannot be said to be "sciences"; they are pseudoscientific in that they are presented as scientific but do not adhere to evidence-based studies. These should be given a wide berth.
Inside a Chinese homeopathy shop
Good science draws a distinction between facts, theories, truths and opinions. A fact is something generally accepted as "reality" (although still open to investigation); it contrasts with "the absolute truth," which is not science as it cannot be challenged. A theory is based on an objective consideration of evidence, and is different to an opinion. An opinion is subjective. For example (using a fictitious person called Fred):
• To say Fred is an affable person is an opinion;
• If Fred says that if he drops his glass on the floor it will break, that is a theory; one which can be tested by dropping the glass multiple times onto different surfaces to see how often it breaks, and whether breakage is more likely on concrete or carpet;
• If Fred jumps from a diving board, he predicts he will fall downwards into the water due to gravity. This statement is both a fact and a theory: something with a high probability of being correct, based on scores of research studies dating back to Isaac Newton.
To produce good science a "good scientist" is required. A good scientist is not simply someone who has studied a scientific subject and who is keen to learn. A good scientist will always have about them a degree of uncertainty and will be prepared to question, and acknowledge there is always something to learn. A good scientist will never claim to know all there is to know about his or her subject.
A microbiologist undertakes molecular testing into an unknown bacterium. Photograph taken in Tim Sandle's laboratory.
The foundation of good science is independent research, and there are signs that this is becoming more commonplace. As to why it is necessary: the British Medical Journal found that 97 percent of studies sponsored by a company delivered results in favor of the sponsor. Declaring interests up front gives a study a degree of assurance.
Where the results of a scientific study can be closely scrutinized, researchers are more likely to be surer of their findings and less willing to present weak research. A recent trend in the scientific world has been towards open access. This is where science papers are made available for any person to read for free and not hidden behind a pay wall. Accessing a journal that requires an upfront payment isn’t cheap and if you want to look at several papers, this starts to go beyond the reach of most people. Not all journals have adopted the open access model, although more are doing so. This can only be encouraged.
Linked to open access, though still not as common as it should be, is open access to the experimental data. Many medical doctors, for example, work with evidence-based medicine. Not being able to see the full set of results from a clinical trial hampers their assessment of a drug’s efficacy. The biggest culprits remain pharmaceutical companies, although government-run health and science agencies have become much better in recent years. To help push through access to clinical trial data, a campaign website has been set up. Called “All Trials”, the site has succeeded in convincing some journals to only accept papers where there is full disclosure of data. To encourage more openness, the site also carries a petition.
Another area to improve findings is with ensuring that research papers, which undertake "systematic reviews", are sufficiently wide ranging in their scope. Some literature reviews cherry-pick certain facts and figures to suit an argument. The better review articles consider and use a range of databases and assess all of the literature on a given subject. For example, a review of Czech studies on mouse models of cancer would not be as comprehensive as a review of cancer cases in animals and humans in all of Europe.
Science journalists can also do better. The perennial problem for the science journalist is "how can I sum up this topic in just one or two sentences that will make audiences want to read more?" Sometimes the scientist behind the research will like this and enjoy seeing their ideas reaching a large audience. At other times, the scientists might be thinking: "how dare you try and reduce a body of research into a shallow soundbite?" Science journalists must try to present the facts; avoid exaggeration; but also engage their audience.
To close out, just as the previous article concluded with a checklist for spotting bad science, this article ends with the foundations of ‘good science’:
1. Good science starts with a question worthy of answering.
2. This question should lead into some research, to look at what has been studied before. Research should always build on what has gone before. Scientifically recognized mechanisms should be employed.
3. Research should use natural mechanisms; there is no recorded case of a non-natural explanation proving scientifically useful. Research should also reflect realistic conditions: for example, a study concluding that drinking 30 cans of Coke in 10 minutes made people feel ill would have strayed well away from what might really happen in society.
4. From the question, a hypothesis should be generated upfront. For example, “smoking tobacco does not cause cancer”. Having set this, scientists would then go off to test whether this hypothesis is true or false.
The best hypotheses lead to predictions that can be tested in various ways. Based on the recommendations of the philosopher Karl Popper and his theory of "falsifiability", these predictions should be tested experimentally, and the experiment should attempt to disprove the assumption rather than prove it. So, with the cancer example, scientists would investigate the hypothesis “smoking tobacco does not cause cancer” by trying to prove that it does (and if the hypothesis had been framed the other way around, the scientists would attempt to prove that smoking does not cause cancer).
5. Similarly, the research should be free of dubious assumptions.
6. Having gone through these thought processes, the experiment should then be designed, asking questions like: how much data will be needed? How long will the experiment run?
7. Where test subjects are used, the subjects — especially people — should be representative of the larger population that the results are intended to apply to. Note should be made of gender and demographic differences.
8. When an experiment is run, there should be controls. In a chemical study this might be a substance known not to react; in clinical trials, it will be the inclusion of a placebo. For human research studies, the gold standard is the randomized controlled trial (a type of study where the scientist randomly assigns participants either to receive the treatment/exposure or not, and is ideally unaware of which participants are receiving the treatment).
9. Good experiments are repeated many times, and the results subjected to a robust statistical analysis. If an experiment cannot be repeated to produce the same results, this implies that the original results might have been in error.
10. The collected evidence should be peer reviewed by the scientific community. These external experts should give their views anonymously.
11. Findings should be cautious and include study limitations. A theory may or may not be formed.
12. One study inevitably leads onto another. The evidence will rarely be the end of the inquiry; instead it should serve as the basis for further studies.
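Points 8 and 9 above (controls via random assignment, and repetition with statistical analysis) can be made concrete with a small simulation. The sketch below is purely illustrative and not based on any real trial: it randomly allocates simulated subjects to a treatment or control group, then repeats the "experiment" many times to show how repetition pins down the true effect.

```python
import random
import statistics

def run_trial(effect=1.0, n=100, seed=0):
    """Simulate one randomized controlled trial.

    Subjects are randomly assigned to treatment or control; the
    treatment shifts each treated subject's outcome by `effect`.
    Returns the observed difference in group means.
    """
    rng = random.Random(seed)
    outcomes = [rng.gauss(0, 1) for _ in range(n)]          # baseline variation
    assignment = [rng.random() < 0.5 for _ in range(n)]     # random allocation
    treated = [x + effect for x, t in zip(outcomes, assignment) if t]
    control = [x for x, t in zip(outcomes, assignment) if not t]
    return statistics.mean(treated) - statistics.mean(control)

# Point 9: repeat the experiment many times and summarize statistically.
diffs = [run_trial(effect=1.0, n=100, seed=s) for s in range(200)]
print(statistics.mean(diffs))   # averages out close to the true effect of 1.0
```

A single trial can give a misleading difference purely by chance, which is why the checklist stresses both randomization and repetition: across 200 simulated trials, the average observed difference settles near the true effect.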
Keeping these criteria in mind, scientists can produce better research, and the general community will have greater confidence in the information presented. Readers wanting to assess whether the findings of a science paper are worthwhile can use the key points above to make an assessment.

Posted by Dr. Tim Sandle

Sunday 9 October 2016

Recoding E. coli to become resistant to all viruses

Scientists are close to finishing the recoding of the Escherichia coli bacterium to work with a different genetic code, one that differs from every other genetic code on Earth. The recoded organism will have some interesting properties.

Microbiologists at the Harvard Medical School in Boston are close to completing a remarkable feat: the first full genetic recoding of a living organism. The feat begins with Escherichia coli, a simple, single-celled prokaryote. Even here the experiments are complex, requiring 62,000 changes to the bacterial genome.
The purpose of this study goes beyond the academic. If things work out, the recoded E. coli bacterium could serve a range of industrial applications. The organism will be able to produce all types of proteins, and these have the potential to be more complex than any other known types, since artificial amino acids can be slotted in. In addition, the organism will be hardy and resistant to all known viruses.
To create the super-organism the research team designed the genome on a computer and then synthesized the DNA in short pieces around 2000 DNA letters long. These pieces were then fitted together through gene editing.
To address concerns about such an organism spreading to the environment, the scientists are programming it so it will not function without being "fed" a special type of amino acid, of a type that does not occur naturally. This means that should containment protocols fail and the bacterium enters the environment outside of the laboratory it would be unable to survive or reproduce.
Once the proof-of-concept study has been completed, the application of gene modification could be applied further. Interviewed by New Scientist magazine, lead researcher Professor George Church said he would like to create human stem cells that are resistant to all viruses.
The process to create the organism is discussed in the journal Science. The research paper is: "Design, synthesis, and testing toward a 57-codon genome."

Posted by Dr. Tim Sandle

Friday 7 October 2016

Meet the new virus named influenza D

A relatively newly detected virus that affects cattle has been given a new name: influenza D. The new name was chosen because the virus is distinct from the other influenza types: A, B and C.
The naming of the virus was decided at a meeting of the International Committee on Taxonomy of Viruses. The committee declared that the virus forms a new genus within the family Orthomyxoviridae. Currently a single species sits within the genus: the Influenza D virus.
Influenza ("the flu") is an infectious disease caused by an influenza virus. The symptoms vary from mild to severe, with common symptoms including a high fever, runny nose, sore throat, muscle pains, headache and coughing. Influenza affects many types of mammals.
Three types of influenza viruses affect people, called Type A, Type B and Type C. The new fourth type has, so far, only been shown to affect cattle.
The new virus was isolated in 2011. Although the virus was isolated from a pig, the reservoir for infection was traced back to a cow. This made influenza Type D the first known influenza virus to affect cattle. The discovery was made by Professor Feng Li and one of his doctoral students, Ben Hause. Noting that the virus was atypical, the U.S. National Institutes of Health provided a grant to allow for further examination.
The examination showed that the Type D virus was sufficiently different from the other influenza viruses, with the closest match being the Type C virus; even here, the two viruses differed by around 50 percent.
The virus is spread through direct contact, normally from one cow to another (in laboratory studies, guinea pigs were used to study viral transmission). Beyond cattle and pigs, the virus has since been detected in sheep and goats. Studies with poultry have shown the virus cannot be transmitted to chickens.
The virus is not pathogenic to humans and does not pose any immediate risk. However, a risk of mutation exists, and further studies are underway into the likelihood of the virus forming a new genetic strain that could pose a risk to people.
The original research paper was published in mBio and it is titled “Characterization of a Novel Influenza Virus in Cattle and Swine: Proposal for a New Genus in the Orthomyxoviridae Family.”

Posted by Dr. Tim Sandle

Thursday 6 October 2016

Rare trees discovered in the Queen’s garden

With huge areas of land and many residences, Queen Elizabeth II might be forgiven for not knowing the full diversity of flora and fauna on her estates. This week botanists reported the discovery of rare elms in the royal garden.
Elm trees thought to be extinct in Britain have been discovered at the Queen's official residence in Scotland, the Palace of Holyroodhouse in Edinburgh. These are two 100-foot-tall Wentworth elms (Ulmus Wentworthii Pendula). Holyrood Palace has served as the principal royal residence in Scotland since the 16th century, and is a setting for state occasions and official entertaining.
Strangely, the trees are conspicuous and well photographed, appearing as a backdrop to many a picture of dignitaries meeting the Queen or other members of Britain’s royal family.
The rarity of the tree has only just been discovered, following a survey of the plant life in the garden.
Botanists are hoping to propagate the rare trees and to grow more of the elms in other selected parts of Britain. Discussing this, and the discovery further, Dr. Max Coleman, who works at the Royal Botanic Garden Edinburgh, explained to the BBC: "That's the most striking thing about this story. It seems very odd on the face of it that these massive trees, that are probably the most photographed trees in the grounds of the palace, have gone unrecognized until now.”
Since the 1970s, millions of elm trees have disappeared from the U.K., falling victim to a fungal infection called Dutch elm disease. The disease is caused by a member of the sac fungi (Ascomycota) and is spread by a beetle. According to Science Alert, the disease killed between 25 and 75 million elms across the U.K. over a ten-year period.
One area of the U.K. where action was taken to preserve elm trees was Scotland, and the Edinburgh area has the highest concentration of surviving elms. Despite this, the discovery of the rare Wentworth elms was something of a surprise: the last known specimen was thought to have died in 1996.
The trees have a "weeping" appearance and have large glossy leaves. The tree was originally named for Wentworth Woodhouse, the largest Classical house in Britain in the early twentieth century. The two trees are located on the main lawn to the east of the palace.

Posted by Dr. Tim Sandle
