Thursday, 23 November 2017

PHENOGENETIC MAP CREATED FOR STEM CELL MODELS


In an effort to better understand neurological diseases like Alzheimer’s, Parkinson’s and ALS – and develop new ways to treat them – researchers at The Ohio State University Wexner Medical Center have performed the first meta-analysis of all induced pluripotent stem cell models for neurological and neurodegenerative diseases, and created an atlas of how cell characteristics are linked to their genotype.

Findings of the study were published online today in the journal EMBO Molecular Medicine.
“Synthesizing this information to understand the phenotypic role of disease-promoting genes and identifying the limitations of our current practices will be crucial steps toward achieving the great translational potential of induced pluripotent stem cell models of neurological diseases,” said Dr. Jaime Imitola, director of the Progressive Multiple Sclerosis Multidisciplinary Clinic and Translational Research Program at Ohio State’s Wexner Medical Center.

Imitola led the study with colleagues in Ohio State’s Neuroscience Research Institute and collaborators at the Nationwide Children’s Hospital, University of South Carolina and University of California Santa Barbara.

Induced pluripotent stem cells (iPSCs) are derived from skin or blood cells that have been reprogrammed into an embryonic-like pluripotent state, from which they can be prodded into becoming other cell types, such as neurons, to treat neurological disorders. The genotype is the set of genes in our DNA that is responsible for a particular trait, while the phenotype is the physical expression, or characteristics, of that trait.

Human disease modeling with iPSCs has enabled researchers to study the disease phenotypes of patient-derived cells directly in the lab. Now, a decade after the discovery of iPSCs, hundreds of patient cell lines and neurological disease phenotypes have been generated. Yet this abundance of phenotypic information has become difficult to follow and interpret, and research practices for iPSC neurological disease modeling vary among different laboratories, Imitola said.

For this study, researchers included 93 out of more than 110 studies initially screened, from which they collected data on phenotypes and genotypes, encompassing 31 neurological diseases that span the pediatric to adult population with a total of 71 gene mutations. As they analyzed the correlation of 663 neuronal phenotypes with genotypic data from 243 patients and 214 controls – and examined research practices and reporting bias in neurological disease models – they found that there is no established standard for the reporting of methods, nor a defined minimum number of cell lines.
From the retrospective analysis of the published literature, researchers developed a taxonomy of central nervous system cellular phenotypes in vitro, and found that previously unrelated genes show similar disease phenotypes. This work also showed that alterations in patient-derived cells at the level of gene expression correlate with the reported cellular phenotypes, and that these dysregulated genes are highly expressed in specific regions of disease in the human brain.

As a valuable resource for the research community, the researchers developed iPhemap – an online database of phenotypic information from iPSC models of neurological diseases that can be referenced, updated and continually refined by researchers worldwide – to share knowledge and help develop new, more effective therapies.

“Our phenogenetic map can be used to build new hypotheses in the field of neurological disease modeling, and to identify potential new opportunities to design novel drug strategies,” said first author Ethan W. Hollingsworth, a neural stem cell research assistant at Ohio State’s Wexner Medical Center and a hematology/oncology clinical research intern at Nationwide Children’s Hospital.

“The ultimate goal of this research is to be able to determine the phenotypes and genotypes relationship in neurological diseases where there is no mutation, or there are small genetic changes in neurons and oligodendrocytes from patients with progressive multiple sclerosis or sporadic Alzheimer’s disease to find new medications to stop neurodegeneration,” Imitola said.

Wednesday, 22 November 2017

Justification of starting materials for the manufacture of chemical active substances


Reflection paper on the requirements for selection and justification of starting materials for the manufacture of chemical active substances

The EMA has revised and re-issued its reflection paper on API starting materials in order to further clarify what information firms should supply. The paper centers on the following problem statement:

Disagreements between applicants and quality assessors on the suitability of proposed starting materials have become more frequent in recent times. This suggests that the current guidelines, intentionally high level to allow application to the wide range of chemical syntheses submitted to regulatory authorities, are open to interpretation. Furthermore, it is increasingly common for applicants to propose very short synthetic routes with complex custom-synthesized starting materials. Another trend is for some, or all, of the active substance manufacture to be outsourced to third parties. The use of external sources for any steps in a manufacturing process may lead to a higher degree of risk to quality of the active substance than would be expected were the full manufacturing process to be carried out by the applicant or a single active substance manufacturer alone. This document strives to expand on some of the points in ICH Q11 in order to harmonise opinions between assessors and clarify the requirements for applicants.

Additionally, the information submitted by applicants or Active Substance Master File (ASMF) holders to justify the selection of starting materials and their proposed specifications is often insufficient to allow adequate assessment of suitability. A detailed description of the manufacturing process of the active substance is required, along with a flow chart of the transformations employed to synthesize the starting materials, including all solvents, reagents, catalysts and processing aids used, in order to facilitate a proper assessment. Since steps deemed critical should be carried out under Good Manufacturing Practice (GMP), an appraisal of the criticality of all transformations in the full synthetic route, with respect to the quality of the active substance, is needed. The description of the manufacturing process should be sufficiently detailed to demonstrate that the process and its associated control strategy will consistently provide active substance of satisfactory quality. Starting materials can only be justified once the criticality of all steps has been discussed. Often, starting materials are selected and then only the subsequent steps are discussed; this is not sufficient. A scheme of the synthetic steps carried out to synthesize the proposed non-commodity starting materials should be provided as part of the justification of starting material selection.

For details see: EMA

Posted by Dr. Tim Sandle

Tuesday, 21 November 2017

Q&A production of water for injections by non-distillation methods



A Q&A on the production of water for injections by non-distillation methods – reverse osmosis and biofilms and control strategies – is a new document of interest from the European Medicines Agency.

The document states: “Production of water for injection by other methods than distillation has been allowed in the EU since April 1st 2017. A draft additional guidance was published in 2016 and a Final Version of this document provides a set of questions and answers (Q&A) which are intended to provide preliminary guidance until such time the on-going revision of Annex I of the GMP guide is complete.”

For further details see: EMA


Posted by Dr. Tim Sandle

Monday, 20 November 2017

Cleanroom Management in Pharmaceuticals and Healthcare - special offer


Dear Reader,

Euromed Communications have recently brought out a new 2017 edition of Cleanroom Management in Pharmaceuticals and Healthcare. Since the first edition of this book in 2013 there have been many changes to the approach and methods for cleaning and certifying cleanrooms, most notably the revisions to Parts 1 and 2 of the ISO 14644 series of global cleanroom standards. In addition to setting out the principal changes in these revised standards, many of the other chapters in the book have been updated to reflect their requirements, bringing current practices and Good Manufacturing Practice regulations up to date. The book is edited by Tim Sandle and Madhu Raju Saghee.

This book was reviewed in the May issue of Pharmig News and full details of the book can be found on the Euromed Communications website.

The publishers are offering a special discount to readers of this site and of the Pharmig, Sterility Assurance & Pharmaceutical Microbiology LinkedIn Groups at 20% off the cover price.

Thus the special offer costs for the new manual are as follows:

Hard back: £196 (~$260)
Paperback: £152 (~$200)

If you wish to take up this offer simply send an email to jill.monk@euromedcommunications.com mentioning the code ‘Pharmig20’ and you or your company will be invoiced accordingly.

Please note this offer ends on 10 December 2017.

Tim Sandle, on behalf of Euromed Communications

Microbial Control of Pharmaceutical Water Systems


Tim Sandle has written an article for Pharmaceutical Engineering:

Water needs to be microbiologically controlled. Microorganisms are ubiquitous and varied in their ability to survive and grow under different conditions. An out-of-control water system can cause harm to the patient or adulterate pharmaceutical products. This article assesses some of the requirements for good design together with the control measures necessary to help to maintain effective microbiological control in pharmaceutical facility water systems.

The reference is:

Sandle, T. (2017) Design and Control of Pharmaceutical Water Systems to Minimize Microbiological Contamination, Pharmaceutical Engineering, 37 (4): 44-48

For further details, please contact Tim Sandle

Posted by Dr. Tim Sandle

Sunday, 19 November 2017

Biocidal cleaners may spread multidrug resistance in MRSA


Multidrug resistance in MRSA and reinfection with MRSA were the most important findings of this study, said corresponding author Jonathan Shahbazian, MPH. The study also showed that, whether used in humans or companion animals, the antibiotic clindamycin was not associated with the risk of multidrug-resistant bacteria in the home.

Treatment with mupirocin, an antibiotic used to treat skin infections and to eradicate MRSA from the nasal passages in order to prevent its spread from sneezes, is weakly associated with mupirocin resistance in the household environment.

In the study, the investigators collected samples from the home environments and companion animals of households enrolled in a large randomized controlled trial, which took place over a 14 month period. They tested whether household-wide efforts to eradicate MRSA -- which included daily use of nasal mupirocin ointment and chlorhexidine body wash -- were successful in reducing recurrence of MRSA among adults and children who had previously been diagnosed with a MRSA skin or soft tissue infection. They repeated sampling in 65 homes three months after the residents had been treated for MRSA, or, as a control, after they had been educated about MRSA.

The investigators concluded that a better understanding of what causes home environmental MRSA to become multidrug resistant, and thus harder to treat, could help in identifying which households are more likely to harbor multidrug resistant MRSA, so that these could be targeted for eradication of the pathogen.

See:

J. H. Shahbazian, P. D. Hahn, S. Ludwig, J. Ferguson, P. Baron, A. Christ, K. Spicer, P. Tolomeo, A. M. Torrie, W. B. Bilker, V. C. Cluzet, B. Hu, K. Julian, I. Nachamkin, S. C. Rankin, D. O. Morris, E. Lautenbach, M. F. Davis. Multidrug and mupirocin resistance in environmental methicillin-resistant Staphylococcus aureus (MRSA) collected from the homes of people diagnosed with a community-onset (CO-) MRSA infection. Applied and Environmental Microbiology, 2017; AEM.01369-17 DOI: 10.1128/AEM.01369-17



Posted by Dr. Tim Sandle

Saturday, 18 November 2017

Examining the lifestyles of microbes


Scientists are identifying and characterizing more microbes each year using DNA sequencing technologies. As each new species is sequenced, scientists add it to the microbial "tree of life," creating a virtual census of what's there.

Turns out it's not an easy job. To put things in perspective, scientists aren't sure how many microbes even exist. Estimates vary widely from millions to trillions.

University of Delaware professor Jennifer Biddle and Rosa Leon-Zayas, who completed post-doctoral work at UD earlier this year, recently described new details about microbes known as Parcubacteria in a paper published in Environmental Microbiology.

The Parcubacteria were found in sediment samples collected by James Cameron within the Challenger Deep region of the Mariana Trench during the Deepsea Challenge Expedition. Leon-Zayas' doctoral advisor, Doug Bartlett at Scripps Institution of Oceanography, was a chief scientist on the expedition.

"From a scientific perspective, Challenger Deep was an invaluable opportunity to collect samples from the deepest part of the ocean," said Leon-Zayas, the paper's lead author, now an assistant professor at Willamette University.

Scientists traditionally have learned how microbes work by growing and studying them in petri dishes and beakers. It wasn't until DNA sequencing advanced to include the ability to separate and test microbes present in environmental samples (such as soils or sediments) that scientists realized they had missed a huge portion of bacteria now called the Candidate Phyla Radiation (CPR).

One group of CPR microbes called the Parcubacteria had been seen in the groundwater and shallow sediments of a few places on land, but it had only been intensively studied in sediment samples from an aquifer near Rifle, Colorado.

When Cameron collected sediment samples at the bottom of the trench, the scientists discovered that many different species of Parcubacteria live there, too.

"We were interested in seeing if the microbes living at the bottom of the ocean had the same lifestyle as the microbes living in soils in Rifle, Colorado," said Biddle, a marine microbiologist and associate professor in the College of Earth, Ocean, and Environment's School of Marine Science and Policy.

Leon-Zayas used a sorting technique to separate the microbial cells from the sediment particles so that scientists could amplify and sequence the microbial DNA. The researchers then characterized the individual microbial genomes. Based on the genes that are present in a genome -- sections of DNA that define what metabolisms a cell is capable of -- scientists can infer what the bacteria are doing.

This genomic sequencing revealed that Parcubacteria from the deep sea have a fairly simple metabolism, but their genomes were larger than those of their terrestrial cousins and even had a few extra features. In particular, these features indicated the bacteria may be able to perform anaerobic respiration, using things like nitrate to breathe instead of oxygen.

Parcubacteria also seemed to have more proteins and enzymes associated with cold environments, not surprising since the bottom of the Mariana Trench is cold and dark.
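As a toy illustration of how this kind of marker-gene reasoning works in practice (an editorial sketch, not the authors' actual pipeline), a genome's predicted gene list can be screened against a small table of marker genes; the gene names and the example annotation below are hypothetical placeholders rather than results from the study.

# Toy sketch: infer putative traits of a microbe from genes annotated in its genome.
# The marker table and the example annotation are hypothetical; a real analysis
# would rely on curated databases and the study's own bioinformatics pipeline.

MARKER_GENES = {
    "anaerobic respiration (nitrate)": {"narG", "napA", "nirK"},
    "cold adaptation": {"cspA", "desA"},
}

def infer_traits(annotated_genes, markers=MARKER_GENES, min_hits=1):
    """Return traits supported by at least `min_hits` marker genes in the genome."""
    genes = set(annotated_genes)
    inferred = {}
    for trait, marker_set in markers.items():
        hits = genes & marker_set
        if len(hits) >= min_hits:
            inferred[trait] = sorted(hits)
    return inferred

if __name__ == "__main__":
    genome_annotation = ["napA", "cspA", "rpoB", "gyrA"]  # hypothetical single-cell genome
    for trait, evidence in infer_traits(genome_annotation).items():
        print(f"{trait}: supported by {', '.join(evidence)}")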

"It makes sense that organisms at the bottom of the ocean might have to be more self-sufficient. The environment is extreme and there isn't as much food," Biddle said.

See:

Rosa León-Zayas, Logan Peoples, Jennifer F. Biddle, Sheila Podell, Mark Novotny, James Cameron, Roger S. Lasken, Douglas H. Bartlett. The metabolic potential of the single cell genomes obtained from the Challenger Deep, Mariana Trench within the candidate superphylum Parcubacteria (OD1). Environmental Microbiology, 2017; 19 (7): 2769 DOI: 10.1111/1462-2920.13789



Posted by Dr. Tim Sandle

Friday, 17 November 2017

Bacterial signalling in sharper resolution


The complex signalling networks bacteria use to adapt to their environments have become clearer following new research.

John Innes Centre researchers used a study of the plant growth-promoting bacterium Pseudomonas fluorescens to develop an advanced analysis method which, they hope, will increase our capacity to understand plant and human diseases.

Until recently, investigations into bacterial signalling have tended to look at different aspects of gene regulation in isolation. Building on these individual approaches, the John Innes team used a range of lab, computational and mathematical techniques to integrate data obtained from multiple different microbiological experiments.

This approach has enabled them to build a comprehensive 'signalling map' for the key bacterial protein Hfq, which controls virulence and stress responses in many clinically and agriculturally important species.

Dr Jacob Malone, the project leader associated with the work, explained: "Our technique allows us to follow every gene and protein in a bacterial cell, and say how it changes and at what level that change occurs, in response to a given signal input.

"We are using the same data sets as previous studies but we have developed a way of integrating the data using mathematics and programming. If you consider the individual elements of a movie: the photography, the soundtrack and the script; by combining them you get a whole movie -- something greater than the sum of the parts. This is the same principle, only with genetics."

See:

Lucia Grenga, Govind Chandra, Gerhard Saalbach, Carla V. Galmozzi, Günter Kramer, Jacob G. Malone. Analyzing the Complex Regulatory Landscape of Hfq – an Integrative, Multi-Omics Approach. Frontiers in Microbiology, 2017; 8 DOI: 10.3389/fmicb.2017.01784



Posted by Dr. Tim Sandle

Thursday, 16 November 2017

ISO/IEC 17025 moves to final stage of revision


Calibration as well as testing and analysing a sample is the daily practice of more than 60 000 laboratories worldwide, but how can they reassure customers about the reliability of their results?

Over the years, ISO/IEC 17025, General requirements for the competence of testing and calibration laboratories, has become the international reference for testing and calibration laboratories wanting to demonstrate their capacity to deliver trusted results. The International Standard, published jointly by ISO and IEC (International Electrotechnical Commission), contains a set of requirements enabling laboratories to improve their ability to produce consistently valid results.


However, the laboratory environment has changed dramatically since the standard was last published, leading to the decision to revise the standard and integrate significant changes. Steve Sidney, one of the Convenors of the working group revising the standard, explains: “The last version of ISO/IEC 17025 was published in 2005. Since then, market conditions have changed and we felt we could bring some improvements to the standard.”
Heribert Schorn, working group Convenor who also participates in IECEE (System of Conformity Assessment Schemes for Electrotechnical Equipment and Components), adds: “The revision was needed to cover all the technical changes, technical developments and developments in IT techniques that the industry has seen since the last version. Additionally, the standard takes into consideration the new version of ISO 9001.”

This standard is of high significance for the IEC Conformity Assessment Community as it outlines the basic requirements for testing within all Conformity Assessment Schemes and Programmes operating within the IECEE, IECEx, IECQ and IECRE Conformity Assessment Systems.

The review was started in February 2015 as a result of a joint proposal by the International Laboratory Accreditation Cooperation (ILAC) and the South African Bureau of Standards (SABS), which is a member of ISO and hosts the IEC National Committee. The standard’s revision process has now reached the Final Draft International Standard (FDIS) stage, the last leg of development before publication.

The main changes:

The revision of ISO/IEC 17025 takes into account the activities and new ways of working of laboratories today. The main changes are as follows:

The process approach now matches that of newer standards such as ISO 9001 (quality management), ISO 15189 (quality of medical laboratories) and ISO/IEC 17021-1 (requirements for audit and certification bodies). The revised standard puts the emphasis on the results of a process instead of the detailed description of its tasks and steps.

With a stronger focus on information technologies, the standard now recognizes and incorporates the use of computer systems, electronic records and the production of electronic results and reports. Modern-day laboratories work increasingly with information and communication technologies and the working group felt it was necessary to develop a chapter on this topic.

The new version of the standard includes a chapter on risk-based thinking and describes the commonalities with the new version of ISO 9001:2015, Quality management systems – Requirements.

The terminology has been updated to be more in step with today’s world and the fact that hard-copy manuals, records and reports are slowly being phased out in favour of electronic versions. Examples include changes to the International Vocabulary of Metrology (VIM) and alignment with ISO/IEC terminology, which has a set of common terms and definitions for all standards dedicated to conformity assessment.


A new structure has been adopted to align the standard with the other existing ISO/IEC conformity assessment standards such as the ISO/IEC 17000 series on conformity assessment.

The scope has been revised to cover all laboratory activities including testing, calibration and the sampling associated with subsequent calibration and testing.


Using ISO/IEC 17025 facilitates cooperation between laboratories and other bodies. It assists in the exchange of information and experience and helps harmonize standards and procedures, as Warren Merkel, another Convenor of the working group, explains. “ISO/IEC 17025 impacts the results delivered by laboratories in a number of ways. The standard requires them to meet criteria for competence of their personnel, the calibration and maintenance of their equipment and the overall processes they use to generate the data. This requires laboratories to think and operate in a way that ensures their processes are under control and their data are reliable.” Results also gain wider acceptance between countries when laboratories conform to the standard.

Developed jointly by ISO and IEC in the Committee on conformity assessment (CASCO), the new version of ISO/IEC 17025 will replace the 2005 version and is scheduled for publication at the end of this year.


Posted by Dr. Tim Sandle

Wednesday, 15 November 2017

Past, present, and future perspectives on the LAL reagent


Q&A covering past, present, and future perspectives on the LAL reagent


Since the introduction of the LAL reagent roughly 50 years ago, three methods of testing have been primarily used. How has your company used these methods?

JD: Actually, we use all four methods: gel-clot, kinetic turbidimetric, kinetic chromogenic – and the PTS™. These methods are used to test and release incoming raw materials and to evaluate intermediate formulation buffers that are subsequently used to produce licensed LAL products. Our accessory products also benefit from the use of these methods prior to release. Finally, we leverage all of these techniques in developing and validating LAL test methods for customer-specific products.

TS: We began with the gel-clot test, which worked fine for water samples although it proved to be somewhat laborious for testing final products. Bio Products Laboratory was one of the first laboratories in the UK to adopt turbidimetric testing. This was in 1990, and technologies have improved greatly since then. We use turbidimetric testing today to test water samples (WFI and purified), intermediate products, finished product, and for other applications, such as testing container-closures for levels of endotoxin. We have also used LAL to screen for glucans.
How did the introduction of the LAL test change our industry?

TS: In terms of moving away from rabbit pyrogen testing this was very significant. It not only resulted in less animal testing being conducted, it also sped up the time-to-result and placed control of the test back into the hands of individual companies. Moreover, LAL results – where endotoxin is the pyrogen of concern – are more accurate and can be trended.

JD: The LAL test has dramatically improved the quality of drug products and medical devices since its introduction. It has allowed the production process to be monitored in a way not possible before, resulting in better and safer products entering the healthcare system.

Furthermore, the relatively low cost of the test allowed more samples to be tested. This expanded the scope of in-process control testing.
What are the key trends in endotoxin and LAL testing? What advances have been made?

JD: A major trend in LAL testing today is the continual movement away from the traditional gel-clot reagents and into more rapid quantitative methods. The data from these systems are also becoming more and more directly interfaced with a company’s LIMS for more sophisticated data management and electronic record keeping. Obviously, it’s difficult to top the invention of LAL and replacement of the rabbit pyrogen test; however, there have been two significant LAL developments that really stand out. One is the development of quantitative LAL-based methods, which allow customers to quantify levels of endotoxin in their samples below their limits and to look for trends that could proactively prevent failures. Following that, the FDA approval of the Endosafe®-PTS™, the first point-of-use endotoxin detection system, allows users to rapidly test for the presence of endotoxin in the manufacturing environment for real-time results. Celebrating its 10-year anniversary, the Endosafe®-PTS™ has revolutionized the healthcare industry, enabling QC laboratories to improve sample management, decrease testing time and accelerate product production while still being compliant with global pharmacopoeial methods.
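To make the idea of quantifying endotoxin against a limit concrete, here is a minimal, vendor-neutral sketch of the standard pharmacopoeial-style endotoxin limit arithmetic used alongside quantitative LAL methods (endotoxin limit EL = K/M and maximum valid dilution MVD = EL x concentration / lambda); the dose, concentration and assay sensitivity values below are hypothetical examples, not recommendations for any product or method.

# Minimal sketch of endotoxin limit arithmetic used with quantitative assays.
# All numeric values are hypothetical illustrations.

def endotoxin_limit(k_eu_per_kg_per_hr, max_dose_per_kg_per_hr):
    """EL = K / M, expressed per dose unit (e.g. EU/mg)."""
    return k_eu_per_kg_per_hr / max_dose_per_kg_per_hr

def maximum_valid_dilution(el, product_concentration, lambda_sensitivity):
    """MVD = (EL x product concentration) / lambda (assay sensitivity)."""
    return (el * product_concentration) / lambda_sensitivity

if __name__ == "__main__":
    K = 5.0       # EU/kg/hr threshold pyrogenic dose for non-intrathecal parenterals
    M = 10.0      # hypothetical maximum dose: 10 mg/kg/hr
    conc = 50.0   # hypothetical product concentration: 50 mg/mL
    lam = 0.005   # hypothetical kinetic assay sensitivity: 0.005 EU/mL

    el = endotoxin_limit(K, M)                      # 0.5 EU/mg
    mvd = maximum_valid_dilution(el, conc, lam)     # 5000-fold
    print(f"Endotoxin limit: {el} EU/mg")
    print(f"Maximum valid dilution: 1 in {mvd:.0f}")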

TS: In terms of trends, the scope of testing seems to be ever-increasing, with more tests being conducted.

Regarding advances, in recent years these have been towards rapid methods and test systems with a larger capacity, to ease sample throughput. Recombinant lysates, which seem to be as effective as lysate derived from the Limulus crab, have also appeared on the market.

LW: More robust and more sensitive assays are the top two trends in the endotoxin detection industry, while portable systems and comprehensive analytical tools, like WinKQCL™ Endotoxin Detection and Analysis Software, are also becoming more essential to end-users.

In addition, there is a growing awareness of the need to conserve diminishing animal sources that are used for pharmaceutical and medical device testing. Bacterial endotoxin testing is rapidly evolving from an industry that relied heavily upon animal resources, i.e. rabbits and horseshoe crabs, to the development of biochemical assays as an endotoxin detector. This is evident in the increasing number of end-users who are now evaluating and validating alternative methods, like recombinant Factor C and the Monocyte Activation Test (MAT), instead of LAL-based tests for endotoxin detection. A little over ten years ago, Lonza recognized this progression of the industry and was the first company to develop a recombinant method for endotoxin detection that did not rely on the blood from the horseshoe crab: the PyroGene™ Recombinant Factor C Assay. Factor C, the first component of the horseshoe crab clotting cascade activated by endotoxin, is the critical component that allows for detection of endotoxin without isolation of LAL from the horseshoe crab. Endotoxin detection is more specific with the recombinant Factor C assay because there is less interference from other substances, like beta (β)-glucans. In addition, recombinant Factor C assays are less variable, as compared to the LAL test, which may perform differently based on pooling of various lysates, while still providing the sensitivity end-users require.

What new challenges will be faced in the years to come?

JD: The development of new pharmaceutical drug products is becoming more and more sophisticated. These newer products, including biologics providing highly targeted therapies, involve spectacularly expensive production processes. Our challenge is to provide customers with all of the tools necessary to detect and, thereby, mitigate gram-negative bacterial contamination. These efforts require rapid methods, robust LAL testing methodologies, and the means to track and trend in-process and final-product endotoxin testing. Regulatory bodies currently emphasize process understanding, capability and control. These crucial production characteristics will become even more critical in the years to come.

TS: I think that recombinant lysates will become more commonplace, given some concerns about horseshoe crab ecology. I also think that innovations will be made with rapid methods and with applications such as on-line testing of water systems with fixed endotoxin reading devices. It is possible that robotic LAL testing – a fully automated method – will emerge (this was discussed a few years ago but it didn’t really develop). It is possible that the Monocyte Activation Test will challenge LAL as a key test for pyrogens, although adoption of this method has been slow and there are some concerns about its variability. Whatever happens in the future, endotoxin testing will remain a key release test for many pharmaceutical products.

LW: New pharmaceutical products and medical devices are continually being developed and released into commerce after regulatory approval. While some are approved, many more go through the same development process and clinical trials and are not approved. However, all of these products require bacterial endotoxin testing (BET) if they are parenterally administered or are implantable medical devices. Given the recent and expected future increase in demand for LAL, the need to develop a sustainable alternative to the LAL method is a growing concern for end-users and regulators.

The distribution of horseshoe crabs is not worldwide. Since the population is limited both geographically and in size, any environmental threat poses a risk to the supply of available horseshoe crabs. Given the demands being placed on this limited resource, as well as the risks to it, other options must be explored to meet the needs of the pharmaceutical and medical device industries, and ultimately of the patients who rely on a safe supply of those products. Continued growth in the number of tests performed annually and increasing restrictions on fishing quotas for horseshoe crabs are mutually incompatible. The Asian species of horseshoe crabs, which are used to produce TAL, are now reported to be depleted, while the East Coast US population of the American horseshoe crab, the source of LAL, also saw a significant decline in the 1980s and 1990s.

Posted by Dr. Tim Sandle

Tuesday, 14 November 2017

FDA approves first ever ‘digital pill’



An innovative medical device, in the form of a pill containing a built-in sensor, has been developed and approved, based on safety and efficacy data, by U.S. authorities.

by Tim Sandle

What is being heralded as the world’s first ‘digital’ pill has gained approval from the U.S. Food and Drug Administration (FDA). The pill is called Abilify MyCite. It contains a medication called aripiprazole, which treats conditions like schizophrenia, bipolar disorder and depression. Also combined with the pill is an ingestible sensor. The pill comes after several years of research and is a venture between the Japanese pharmaceutical company Otsuka and digital medicine service Proteus Digital Health.

It is the sensor that creates the ‘digital’ or ‘smart’ pill. The purpose of the sensor, according to The Verge, is to record when the pill has been taken. This happens via a signal sent to a wearable patch (fixed to the left rib cage), and then from the patch to a mobile device, such as a smartphone, via Bluetooth. The patch has additional functionality: it records activity levels, sleeping patterns, steps taken, and heart rate. The patch needs to be replaced every seven days.

The purpose of this is to determine when a pill has been taken. For most patients this acts as a reminder that they have (or have not) taken their required dose of medication; for more serious cases, where a patient has been sectioned as the result of a mental disorder, it enables medics to record that a medication has been taken.
Failure to take medications has societal and economic consequences, such as putting a strain on the hospital system. According to Dr. William Shrank, chief medical officer of the health plan division at the University of Pittsburgh Medical Center, who spoke with the New York Times on this subject: “When patients don’t adhere to lifestyle or medications that are prescribed for them, there are really substantive consequences that are bad for the patient and very costly.”


The sensor is only the size of a grain of sand. It is manufactured from silicon, copper, and magnesium. In terms of how the sensor works, an electrical signal is activated when the sensor comes into contact with stomach acid.

It is a common problem with medications, either through forgetfulness or as a conscious act, that patients do not take the medicines prescribed to them. The digital pill aims to redress this.

Commenting on the go-ahead for the digital pill to be marketed, Mitchell Mathis, who is the director of the Division of Psychiatry Products in the FDA, told PharmaPhorum: “Being able to track ingestion of medications prescribed for mental illness may be useful for some patients.”

The regulator added: “The FDA supports the development and use of new technology in prescription drugs and is committed to working with companies to understand how technology might benefit patients and prescribers.”


The approval relates to both parts of the smart medication – Abilify and Proteus Digital Health’s sensor and patch – for the U.S. market. A label warning will accompany the product. This will state that the combined smart system has not been shown to improve patient compliance and that there are concerns about the effectiveness of the tracker in real time, since detection may be delayed. Nevertheless, the digital pill is likely to become popular with the medical establishment. The price of the pill has yet to be announced.


The Wall Street Journal opines that there could now be a raft of approval requests for other digital pills. The paper also notes that the FDA is preparing to hire more staff with an understanding of software development in relation to medical devices.

How Staph Cells Dodge the Body’s Immune System


For years, medical investigators have tried and failed to develop vaccines for a type of staph bacteria  associated with the deadly superbug MRSA. But a new study by Cedars-Sinai investigators shows how staph cells evade the body’s immune system, offering a clearer picture of how a successful vaccine would work.

Staph frequently causes skin infections but occasionally can lead to deadly conditions such as sepsis, pneumonia and bloodstream infections, particularly in hospitalized patients whose immune systems could be weakened by illness.

One strain of the bacterium, the superbug methicillin-resistant Staphylococcus aureus (MRSA), is considered one of the top drug-resistant threats in the U.S., causing more than 11,000 deaths per year, according to the Centers for Disease Control and Prevention. In fact, the superbug kills more Americans than HIV.

“Widespread MRSA infections have prompted routine use of once last-line antibiotics, and this is making the antibiotic resistance problem worse,” said George Liu, MD, PhD, co-lead author of the study and a pediatric infectious diseases physician at Cedars-Sinai’s Maxine Dunitz Children’s Health Center and the F. Widjaja Foundation Inflammatory Bowel and Immunobiology Research Institute. “Our study focuses on why MRSA is so common and why we never develop immunity to these bacteria.”

The study, published in the peer-reviewed journal Cell Host & Microbe, also sheds light on how investigators could develop an effective vaccine against staph.

When exposed to a pathogen like a staph bacterium, the body usually fights it and then forms a memory of how its immune system responded. The next time the body encounters the same pathogen, it can use that memory to fight off the microbe much more easily.

But the body can suffer from repeated staph infections throughout life without developing a robust protective memory immune response. The study shows that staph bacteria are able to dodge this immune response.  

When the staph cell wall is largely kept intact after infecting a host, bacterial molecules don’t escape the staph cell and the body isn’t prompted to produce robust protective immune memory.
“Essentially, staph tricks the body’s T cells, which are white blood cells that fight infection, and prevents them from mounting an effective defense,” said co-lead author Gislaine Martins, PhD, an assistant professor at the F. Widjaja Foundation Inflammatory Bowel and Immunobiology Research Institute and departments of Biomedical Science and Medicine.

As a result, the body does not develop long-term immunity and remains vulnerable to that particular staph infection throughout life. While certain staph bacteria cause mild skin infections, other strains of staph bacteria can wreak havoc in the bloodstream and bones, sometimes leading to amputations. 
“The study explains why our immune system is fooled by staph,” Martins said. “Staph evolved to have this enzyme that makes this modification in its cell wall. This modification protects the wall from degradation and therefore from being properly detected by the immune system, which won’t remember the bacteria the next time the body is infected.”

When study authors removed the cell wall modification, the staph cells spilled their molecules more easily. The modified bacteria sparked a robust memory immune response that  protected against reinfection. 

The study provides clues about what type of element could be added to staph vaccines to make them more effective. Whereas most staph vaccines have tried to stimulate antibodies — specialized molecules that recognize foreign bodies and help to mobilize the immune system — this study suggests that a successful vaccine should harness the body’s T cells.

DOI: 10.1016/j.chom.2017.08.008

Posted by Dr. Tim Sandle

Monday, 13 November 2017

Antimicrobial copper surfaces in hospitals



Infection control in hospitals is of paramount importance in order to reduce the potential for healthcare associated infections (an infection whose development is favored by a healthcare environment). Infection control is concerned with eliminating as many pathogenic microorganisms as possible and limiting their transfer. This covers a range of measures, from handwashing, disinfection and the selection of antimicrobial drugs to the treatment of surfaces, and so on. With surfaces, many types of microorganisms can persist for extended periods of time (some organisms can survive for longer than thirty days on standard surfaces); consequently, touch surfaces represent risk spots for pathogen transmission. In the hospital setting, some types of key equipment can be manufactured with antimicrobial touch components with the aim of making the surfaces self-disinfecting. For this reason, a recent trend in the hospital setting has been to revisit the inherent antimicrobial properties of certain metals. A prominent example is the use, or incorporation, of copper.

Tim Sandle has written a review of the application of copper in hospitals. The reference is:

Sandle, T. (2017) Antimicrobial copper surfaces in hospitals, The Clinical Services Journal, 16 (6): 47-51

For a copy, please contact Tim Sandle.

Posted by Dr. Tim Sandle