Categories: Drug Delivery, Neurosciences, Sciences, Uncategorized

[Sciences/BBB/Drug Delivery] Red blood cell-hitchhiking boosts delivery of nanocarriers to chosen organs by orders of magnitude (Brenner et al., Nat Comm 2018)

I know, I know, I have been fairly quiet. Between attending a scientific meeting, teaching a summer course, preparing two grant proposals and handling three manuscripts, I have had little time to write. However, I also like to share papers published in the field that are interesting or that bring novel ideas or concepts. This one was suggested by one of the authors, and this is his quick summary:
We bind nanoparticles to the surface of red blood cells. These nanoparticles are released from the red cells once they pass the first capillary bed, which lets us concentrate the nanoparticles in the target tissue. In addition, these nanoparticles can potentially carry one or several selected drugs. So if we inject intravenously, the particles are released in the lungs, but if we inject through the brain arteries, the nanoparticles are released into the brain vasculature.
It is a very nice piece of work published in Nature Communications (which I consider one of the top OA journals to be published in) and you can download the full text here.
You can also appreciate that this paper likely went through at least one round of review, with about 7 months between submission and acceptance for publication.

In this paper, the authors have been working on developing an alternative drug delivery carrier, in this case using red blood cells as "piggyback" cells to enhance drug delivery. They tried different formulations on isolated RBCs and identified some suitable for carrying antibodies or proteins. They call this piggybacking approach RBC hitchhiking (RH) (I wonder if it is an Easter egg nodding to "The Hitchhiker's Guide to the Galaxy").

Upon identifying the right nanoparticle materials, the authors investigated the distribution and delivery of the conjugates in different organs, such as the liver and lungs. What is interesting is that the fraction of the injected dose recovered is much higher with RH than with free-circulating nanocarriers, in particular in the lungs, whether the particles are bare or coated with a protein. These nanocarriers can be delivered to endothelial cells.

But the interesting sound bite from this paper is the intra-arterial injection into the carotid artery, which produced a significant increase in nanocarrier delivery to the brain. Nanocarriers alone show a %ID of less than 1% (about what you expect when delivering antibodies to the brain via the IV route), whereas RH brought it to over 10%. Delivery was also largely confined to the hemisphere ipsilateral to the injection site, which is useful considering you are likely to treat one brain hemisphere.

Now, time for me to be picky: I wish we had information on the PK profile, especially whether this approach increases the stability of the nanocarriers. It would also be interesting to see how this technique fares in an experimental disease model (for instance, a xenograft brain tumor, to see whether you can deliver targeted chemotherapy in mice).

Nevertheless, it is a good paper that takes us away from classical nanoparticle formulations and tries an innovative approach to drug delivery.

 

Categories: Neurosciences, Sciences, Uncategorized

[Science/Neurosciences] Mole rats running on…..fructose!

You may have heard about this paper from Park TJ and colleagues (Park TJ et al., Science 2017) on how naked mole rats show extreme tolerance to anoxia (0% oxygen). It made the news these last couple of weeks and I finally got my hands on it. You can access it here (you need a Science subscription, though); it is really interesting for many reasons, especially because I keep wondering how we could translate it into therapeutic strategies for hypoxemic preterm babies or even a stroke-fighting drug.

First, mole rats. Oh, mole rats! Not the prettiest mammals out there: they are naked, they have long teeth and they look all wrinkled. But, like moles, they are underground-dwelling animals. Underground, oxygen is scarce, and these animals have developed formidable adaptations to hypoxia. We humans can barely survive 8% oxygen (roughly the equivalent of the summit of Mount Everest). At 6% oxygen (about what would happen if an aircraft cabin underwent depressurization), you die within minutes.
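A quick back-of-the-envelope check on that comparison (my own numbers, not from the paper): the summit of Everest sits at roughly one third of sea-level barometric pressure, so the oxygen partial pressure up there corresponds to breathing a 7-8% oxygen mixture at sea level:

\[ P_{O_2}^{\text{Everest}} \approx 0.21 \times 253\ \text{mmHg} \approx 53\ \text{mmHg}, \qquad \frac{53\ \text{mmHg}}{760\ \text{mmHg}} \approx 7\% \]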

In this experiment, they went fairly extreme: they put the animals into anoxia (0% O2) and measured how long they would survive, using a common mouse strain as a control. Mice died at a 100% rate at 5% O2 and died twice as fast (based on the number of breaths) at 0% O2. In contrast, mole rats lasted far longer than mice and were still doing fine (0% deaths). Where mice died within 60 seconds, mole rats withstood over 1000 seconds of anoxia. One possible reason is the ability of their heart to keep beating much longer than that of mice.

Now, what is interesting is how the authors came to fructose. Mammalian cells run on glucose through the glycolytic pathway (see below):

[Figure: the ten steps of glycolysis]

I will spare you the Krebs cycle, but this is what every single healthcare and life science student has to learn. Glucose is broken down into many intermediates and ends up as pyruvate. From pyruvate, you can enter the Krebs cycle and produce a significant amount of ATP (the energy currency of every living organism) needed to power any biological process. Aerobic metabolism is very good at this, with a yield of roughly 36 ATP per glucose consumed. However, the Krebs cycle stalls under hypoxia and forces the cell to adapt. In particular, the cell needs to regenerate NAD+ (from NADH) in order to keep the system flowing and producing energy. One way mammalian cells solve this is by converting pyruvate into lactate. That allows cells to produce some energy (2 ATP per glucose) and regenerate their NAD+. However, lactate tends to accumulate and cause adverse effects (often blamed for the muscle cramps any runner has experienced).
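To put rough numbers on the two routes (a simplified textbook accounting; the exact aerobic yield varies between about 30 and 36 ATP depending on the shuttle systems used):

\[ \text{Aerobic: } C_6H_{12}O_6 + 6\,O_2 \rightarrow 6\,CO_2 + 6\,H_2O \qquad (\approx 30\text{--}36\ \text{ATP per glucose}) \]
\[ \text{Anaerobic: } C_6H_{12}O_6 \rightarrow 2\ \text{lactate} + 2\,H^+ \qquad (2\ \text{ATP per glucose, NAD}^+ \text{ regenerated}) \]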

Fructose is not much different from glucose: it has the same chemical formula but a slightly different molecular structure. We get fructose from a daily diet of fruits and vegetables, but also from refined sugar (sucrose or HFCS, same deal).

[Figure: chemical structures of glucose and fructose]

Now, fructose can bypass the upper steps of glycolysis and feed into the pathway downstream:
[Figure: fructose entry points into glycolysis]

Fructose can be converted into glyceraldehyde-3-P (GA3P) and dihydroxyacetone-P (DHAP) and enter the rest of glycolysis. Like glucose, fructose needs a transporter to get inside cells. Glucose has a myriad of transporters (GLUTs and SGLTs) that can carry it into cells, but these transporters have very poor affinity for fructose. Instead, fructose relies on a dedicated transporter, GLUT5, which prefers fructose over glucose.
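For reference, the textbook fructolysis route (standard biochemistry, not data from the mole rat paper) runs through three dedicated enzymes before rejoining glycolysis, which is presumably where the fructose-1-P mentioned below comes from:

\[ \text{Fructose} + \text{ATP} \xrightarrow{\text{fructokinase (KHK)}} \text{Fructose-1-P} \]
\[ \text{Fructose-1-P} \xrightarrow{\text{aldolase B}} \text{DHAP} + \text{glyceraldehyde} \xrightarrow{\text{triose kinase, ATP}} \text{GA3P} \]

The appeal of this route is that it bypasses phosphofructokinase, one of the most heavily regulated steps of glycolysis.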
Now this is where it becomes interesting: during anoxia, mole rats show much higher levels of fructose than mice in many organs and in blood. Even more interesting is its high prevalence in the brain as fructose-1-P, and only in this form. How does it get in? I don't know, but mole rat brains have higher GLUT5 expression than mouse brains. Where is this transporter expressed? I don't know either, but it would be interesting to look at this transporter at the BBB.

Also interesting is how mole rat brain and heart differ from those of mice in terms of fructose metabolism. When given fructose instead of glucose, mole rat organs know how to switch between the two sugars to get their energy. Mouse organs, on the other hand, fail to switch, and their activity decreases.

Now the question I have (since I am working on glucose transport across the BBB and its impact on kids suffering from GLUT1 deficiency) is: do humans express GLUT5? If so, which brain cells express it, and can these cells adopt fructose as a source of energy?

Categories: Blood-Brain Barrier, Sciences, Uncategorized

[Sciences/BBB] Histamine-induced blood-brain barrier disruption in teething children: a “post hoc ergo” on glucocorticoids.

Recently, I became aware of some parents' concerns about the impact of teething on blood-brain barrier integrity. Such claims were wrapped in one of the most bizarre "post hoc ergo propter hoc" fallacies, following this sequence:
1. Teething induces histamine release
2. Histamine is a vascular hyperpermeability factor
3. The blood-brain barrier in babies is leaky
4. Therefore, teething induces blood-brain barrier breakdown in children.

You have to agree that this is one of the most bizarre fallacious associations, but it has been repeated and spread enough to leave parents worried about the impact of teething on the blood-brain barrier. To dispel that myth and beat that dead horse once and for all, I think it is important to demonstrate why this claim is fallacious.

 

  1. Teething and histamine release: understanding the mechanism of inflammation.

First, in order to understand the physiological response to teething, you have to understand the mechanism of inflammation. Everyone can recite the clinical presentation of inflammation: it is red, it is swollen and it is hot.
Inflammation is triggered by a lesion or wound due to internal or external stimuli, in our case teething. Teething involves mechanical stress (due to tooth growth) and eventually cell and tissue laceration. Such laceration releases intracellular contents into the extracellular space, which turns on resident immune cells. These cells in turn release what we refer to as "pro-inflammatory factors", a cocktail of different chemicals that sets off the inflammatory cascade:

We have amino-acid derivatives and peptides (bradykinin, histamine, inflammatory chemokines and cytokines) and arachidonic acid (AA)-derived molecules. The production of AA-derived molecules (also known as prostanoids) is driven by cyclo-oxygenases (COXs). There are two types of COXs: COX1, which is constitutively produced at low levels, and COX2, which is induced during inflammation. COX1 produces the "good prostanoids" and COX2 the "bad prostanoids", the latter being the driving force of inflammation. COXs are the classical targets of the NSAIDs found in OTC products such as ibuprofen (Advil) and naproxen (Aleve); acetaminophen (Tylenol), although not strictly an NSAID, also dampens prostanoid production. All these small molecules target COXs and reduce prostanoid production. In the case of teething, inflammation is mostly driven by the release of prostanoids (such as the prostaglandin PGE2) and interleukins (IL-1beta) (Blakey, White et al. 1996). Aside from two obscure studies published 40 years ago in obscure medical journals (Cotias, de Medeiros et al. 1968, Soliman, Abdel Wahed et al. 1977), there is no evidence of histamine release following teething. This therefore nullifies claim 1.

  2. Histamine and the blood-brain barrier

Histamine is not the major mediator of inflammation, but it is the major mediator of allergic reactions. During an allergic reaction, the immune system responds to the allergen by producing a certain type of antibody called IgE.

IgE is produced by B-cells that responded to the allergen, recognizing it as a foreign body. IgE binds to a certain type of immune cell called mast cells, which make up less than 1% of the total immune cell population. These cells are super-loaded with histamine, ready to release it upon signal. Once IgE binds to its receptor, mast cells degranulate and release vast amounts of histamine. Histamine in turn triggers the allergic response: asthma-like symptoms, runny nose and, in the worst case, anaphylactic shock. Mild allergic reactions are usually managed with antihistaminic drugs such as Claritin D or Benadryl.
Because any histamine released during teething is negligible compared with the amount of prostanoids produced, using an antihistaminic is pointless: you would only address a minor component of the inflammation and fail to block the major one. So the histamine-teething relationship is already shaky at this point. But can histamine cause BBB disruption and leakiness? Yes, but only at high doses and only in very specific settings. The studies that investigated the biological effects of histamine at the BBB are quite old (20+ years) and used high concentrations (10-100 micromol/L) (Gross, Teasdale et al. 1981, Domer, Boertje et al. 1983, Watanabe and Rosenblum 1987, Butt and Jones 1992, Mayhan 1996). Even if we assume that histamine is produced during teething, we can conservatively assume that its level would not exceed the plasma levels found during a severe allergic reaction such as anaphylactic shock. Reported values for anaphylactic shock are about 6.35 nmol/L (Laroche, Gomis et al. 2014). Even at that level, we are roughly 1,600X to 16,000X below the concentrations reported to have an effect on the BBB. So claim 2 is also refuted.
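For transparency, here is the back-of-the-envelope arithmetic behind that range, using the concentrations quoted above (1 micromol/L = 1,000 nmol/L):

\[ \frac{10\ \mu\text{mol/L}}{6.35\ \text{nmol/L}} \approx 1{,}575 \qquad \frac{100\ \mu\text{mol/L}}{6.35\ \text{nmol/L}} \approx 15{,}750 \]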

  3. Is the BBB leaky in newborns and babies?

TL;DR: the short answer is NO. If you want to understand the science behind this statement, please check my previous post on the topic: https://scientistabe.wordpress.com/2016/05/21/neurosciencesbbb-thiomersal-and-the-blood-brain-barrier-where-does-the-science-stand/

 

  4. Conclusions

By now, we should agree that delaying vaccines in children because of a supposed histamine-induced barrier disruption does not stand up to science. There is no scientific rationale supporting a massive release of histamine during teething, and any such release would be well below the reported values needed to cause BBB disruption and leakage. If your baby is teething right when he or she is due for immunization, consult with your AAP-accredited pediatrician about what is best for the baby.

    5. References

Blakey, G. H., R. P. White, Jr., S. Offenbacher, C. Phillips, E. O. Delano and G. Maynor (1996). “Clinical/biological outcomes of treatment for pericoronitis.” J Oral Maxillofac Surg 54(10): 1150-1160.

Butt, A. M. and H. C. Jones (1992). “Effect of histamine and antagonists on electrical resistance across the blood-brain barrier in rat brain-surface microvessels.” Brain Res 569(1): 100-105.

Cotias, C. T., E. C. de Medeiros, U. V. Lima and C. F. de Santana (1968). “[Determination of histamine release in the blood serum of children during deciduous tooth eruption].” Rev Fac Odontol Pernambuco 1(2): 95-100.

Domer, F. R., S. B. Boertje and S. A. Sweeney (1983). “Blockade of the acetylcholine-and histamine-induced changes in the permeability of the blood-brain barrier of normotensive and spontaneously hypertensive rats by atropine and pyrilamine.” Res Commun Chem Pathol Pharmacol 42(1): 157-160.

Gross, P. M., G. M. Teasdale, W. J. Angerson and A. M. Harper (1981). “H2-Receptors mediate increases in permeability of the blood-brain barrier during arterial histamine infusion.” Brain Res 210(1-2): 396-400.

Laroche, D., P. Gomis, E. Gallimidi, J. M. Malinovsky and P. M. Mertes (2014). “Diagnostic value of histamine and tryptase concentrations in severe anaphylaxis with shock or cardiac arrest during anesthesia.” Anesthesiology 121(2): 272-279.

Mayhan, W. G. (1996). “Role of nitric oxide in histamine-induced increases in permeability of the blood-brain barrier.” Brain Res 743(1-2): 70-76.

Soliman, N. A., S. Abdel Wahed, A. M. Abul Hassan, G. el-Asheiry and A. K. Abdallah (1977). “Systemic disturbances accompanying primary teething: a clinical and pharmacological study.” Egypt Dent J 23(1): 1-8.

Watanabe, M. and W. I. Rosenblum (1987). “In vivo studies of pial vascular permeability to sodium fluorescein: absence of alterations by bradykinin, histamine, serotonin, or arachidonic acid.” Stroke 18(6): 1157-1159.

Categories: Blood-Brain Barrier, Neurosciences, Sciences, Uncategorized

[BBB/Neurosciences] Why is chemo less effective on brain cancers than on other types of cancer?

This post stems from a message left by a follower on my Facebook page asking: why does chemo not penetrate the blood-brain barrier?
This is indeed a very good question that many non-scientists surely ask themselves without getting much of an answer.
To understand why chemo does not penetrate the BBB, you have to understand the obstacles a chemotherapeutic agent has to overcome to get inside the brain.
For any drug to get into the brain, it has to cross the BBB. If you are a drug, you fall into one of two classes: hydrophilic or lipophilic. Hydrophilic drugs are "water-soluble" and dissolve easily in biological fluids (blood, gastric juice…) on their own. The problem, however, is that such drugs cannot enter cells unless a protein carrier, also called a "transporter", brings them inside.
The second class is the lipophilic drugs. These dissolve poorly in water and other biological fluids but dissolve very well in fats and oils. Because cell membranes are made of phospholipids and cholesterol (yep, that's why we need cholesterol: we also use it to make bile salts to dissolve fats from our food and as a prime building block for the production of steroid hormones), lipophilic drugs can passively diffuse through cell membranes and reach their target.
That is both a boon and a bane when you are on the drug discovery side. A boon because it means your drug will have no trouble reaching its target and can be given orally; a bane because it also means these drugs will be substrates of drug efflux pumps.
[Figure: schematic of the human blood-brain barrier]

Drug efflux pumps belong to a superfamily of molecules called "ABC transporters". ABC stands for ATP-binding cassette, as these pumps use ATP (adenosine triphosphate, the major energy currency of cells) to function. These transporters work much like the exit gates of a subway station: you can get out of the system through them, but you cannot get in.
Such efflux pumps were first described in cancer cells that had developed resistance to chemotherapy, and were later found to be expressed by various tissues including the intestine, the liver and the BBB.
At the BBB, we have an array of different ABC transporters: ABCA (cholesterol efflux), ABCB1 (P-glycoprotein or P-gp), ABCCs (multidrug resistance-associated proteins or MRPs) and ABCG2 (also called breast cancer resistance protein or BCRP). They have one job and one job only: to pump things out. Some have a very narrow spectrum (ABCA transporters mostly efflux cholesterol); others have such a broad spectrum (ABCB1 and ABCG2 share a common pool of substrates) that it is impossible to predict whether your drug candidate will be a substrate of these pumps or not. These pumps are so efficient at keeping out xenobiotics (any molecule not produced by your body) that an estimated 95%+ of known chemicals are incapable of crossing the BBB. That is a boon, because many of these compounds can have severe neurotoxic effects; it is also a bane, because it means that delivering drugs to the brain will be challenged by the BBB.
Unfortunately, a lot of chemotherapeutic agents have been shown to be substrates of one of these pumps. This was recently addressed in a white paper from the first CNS Anticancer Drug Discovery and Development Conference (source: http://neuro-oncology.oxfordjournals.org/content/17/suppl_6/vi1.long), co-authored by Dr. Quentin Smith (Texas Tech University Health Sciences Center, Amarillo, TX) and William "Bill" Elmquist (University of Minnesota, St Paul-Minneapolis, MN), two eminent experts in the field of CNS tumors and the BBB.
In 35 years of research, we have extended the life expectancy of patients with glioblastoma multiforme (GBM), one of the most aggressive types of cancer, from 7 months to 14 months. This is a fairly grim result compared with other types of cancer, for which we are no longer talking about extending life expectancy but about how many patients stay cancer-free and overcome their disease.
We have certainly failed to bring miracles to patients with brain cancer, but we are learning from our mistakes, and by correcting them we are getting closer to finding better treatments.
One mistake we made was assuming that brain tumors were like any other tumor, only good at making leaky, botched blood vessels. It turns out brain tumors do make some botched blood vessels, but they also find a way to keep chemotherapeutics away through a blood-tumor barrier (BTB).
[Figure from the white paper: the blood-tumor barrier]

The second mistake was to assume that efflux pumps were all equal in distribution and function; in reality they differ considerably, each with distinct preferences for certain molecules.

[Figure from the white paper: efflux transporters at the BBB and their overlapping substrates]

As you can see, we have different pumps with an overlapping catalog of substrates but heterogeneous activity, with P-gp and BCRP considered the main players. There is speculation that the rodent BBB and the human BBB are not similar in terms of P-gp and BCRP activity (some studies suggest that the BBB in mice and rats relies heavily on P-gp, whereas in humans BCRP lifts most of the weight). It also turns out that trying to block these transporters may bring little or no benefit (http://onlinelibrary.wiley.com/doi/10.1038/clpt.2013.34/abstract;jsessionid=03AF5C34128C830A05FEDDF895E4A5B2.f02t04).
Finally, we are also learning that there is no "one size fits all" in brain tumors. Brain tumors are very heterogeneous across types, and even between tumor sites within the same patient, as depicted in one figure presented in the white paper cited above.
[Figure from the white paper: uptake of 17 compounds in healthy brain versus tumor tissue]
You can see that the uptake of 17 different compounds correlated poorly between healthy brain regions and tumor tissue.
Therefore, a lot of scientists are working to find ways to circumvent this issue. One approach is to improve our odds of removing the tumor during surgery by using a glowing tag to label brain tumors (http://www.ncbi.nlm.nih.gov/pmc/articles/PMC4343207/), as depicted below:
[Figure: fluorescent labeling of a brain tumor during surgery]

Some groups are developing Trojan horses to bypass the BBB and deliver the chemo directly inside the brain, and some are transiently opening the BBB using microbubbles combined with focused ultrasound (FUS), which induces local oscillation of the bubbles and a small, temporary opening of the BBB.
Science is a long learning process made of high expectations and a high number of failures, often sending scientists back to square one in the design of new treatments. Brain tumors are certainly among the worst, and as a BBB scientist who has collaborated on some aspects of this work and attended a couple of international conferences on brain tumors, it is heartbreaking to have so few options to offer patients.
However, as we learn from our mistakes, we are coming to understand that success will not come from a magic bullet but from small incremental steps bringing us closer to a treatment for these particular types of cancer.

 

 

Categories: Biology, Neurosciences, Sciences, Uncategorized

[Neurosciences] The glymphatic system: the brain drainage system?

We are now 3 years past the description of the glymphatic system by Iliff and colleagues (http://www.ncbi.nlm.nih.gov/pmc/articles/PMC3582150/), and the literature on it, including its implications in neurological diseases, has been growing. A recent message on my Facebook page (https://www.facebook.com/bbbscientist/) asked for my thoughts on the topic, and I thought it would be worthwhile to discuss the glymphatic system, since what was considered disputable a couple of years ago (some in the field suspected an artifact, not least because it seemed implausible that everyone had missed such a system until now) seems to be gaining momentum and acceptance.
So I thought it would be worth writing a concise review of the glymphatic system and its implication in some neurological diseases, based on the current literature.

1. What is the glymphatic system?

[Figure: overview of the lymphatic system]

In mammals, plasma (devoid of any blood cells) can passively diffuse from blood vessels into the connective tissue, becoming what we refer to as the interstitial fluid. This interstitial fluid perfuses the tissue between cells and is recovered by a drainage network called the lymphatic system. The lymphatic system is a circulatory system sharing the same origin as arteries and veins, as it is also lined by an endothelium, although this endothelium has a cellular phenotype differing from that lining arteries and veins.
The lymphatic system ultimately connects to the thoracic duct, which drains into the superior vena cava and reinjects the lymph into the blood circulation. The lymphatic system has two functions: it contributes to convective flow and the maintenance of hydrostatic pressure within our internal organs and tissues, and it provides a robust surveillance system through lymph nodes highly enriched in immune cells. By constantly checking for antigens from bacteria, viruses and other pathogens, the immune system provides constant protection and an alarm system.
Until recently, the brain was considered devoid of such a lymphatic system. In its place, the brain has the ventricular system.
[Figure: the brain ventricular system]

The ventricular system contains the cerebrospinal fluid (CSF), produced by the choroid plexus found in the ventricles, which circulates through the ventricular system and around the brain. The CSF was long considered to work within a closed system, being reabsorbed through the ependymal layer lining the ventricles and, via the arachnoid granulations, into the venous sinuses and the superficial venous system at the brain surface.
This was the "textbook" model accepted by everyone, yet it was riddled with biophysical caveats. A study by Rennels and colleagues in 1985 had already described a perivascular system capable of distributing tracers throughout the whole brain after injection into the subarachnoid space. The subarachnoid space is a virtual space sandwiched between the pia mater (the innermost meningeal layer, covering the brain surface) and the arachnoid mater, which lies beneath the dura mater (the meningeal layer lining the skull).
This study remained fairly obscure, and the concept only gained traction with the study by Maiken Nedergaard and colleagues (University of Rochester, NY, USA) published in JCI in 2013 (http://www.ncbi.nlm.nih.gov/pmc/articles/PMC3582150/pdf/JCI67677.pdf), followed by the publication from Jonathan Kipnis and colleagues (http://www.nature.com/nature/journal/v523/n7560/full/nature14432.html).
The glymphatic system is what we refer to as a "paravascular" (distinct from blood vessels) system that provides a conduit for CSF outside the ventricular system, as illustrated by the following chart from a recent review by Tarasoff-Conway and colleagues (http://www.ncbi.nlm.nih.gov/pmc/articles/PMC4694579/pdf/nihms744165.pdf):
[Figure: the glymphatic system, from Tarasoff-Conway et al.]
This provides a convection system, with CSF flowing into the CNS and clearing out. Note that the glymphatic system starts where the pial arteries (red) enter the brain, creating a particular space called the "Virchow-Robin space", and runs along medium-sized vessels (as these vessels have a smooth muscle layer). This flow occurs in medium- to large-caliber arteries, but we do not know whether such a system also exists at brain capillaries, the site of the blood-brain barrier (BBB). Finally, the CSF is drained along veins, allowing the removal of metabolic waste and cellular debris (this point is important for Alzheimer's).
If you are more of a YouTube enthusiast, there is a video about the glymphatic system:

2. What is the function of the glymphatic system? 

We still do not have a complete understanding of the glymphatic system in the CNS; however, we can infer its function by looking at neurological diseases.
First, the glymphatic system seems to play a role in Alzheimer's disease (AD). AD is characterized by the presence of amyloid-beta aggregates inside the brain. Such aggregates (also called oligomers) are capable of inducing neuronal cell death. The glymphatic system appears to contribute to the "drainage" of the brain parenchyma, and a disturbance of this system may indirectly lead to the accumulation of amyloid-beta peptide, as suggested in a recent study using a mouse model of AD (http://www.ncbi.nlm.nih.gov/pubmed/27234656). This is also supported by another recent study showing an impaired glymphatic system in type 2 diabetes mellitus (T2DM, the obesity-driven type of diabetes), T2DM being an important risk factor for AD and other vascular cognitive impairments (also referred to as vascular dementia) (http://www.ncbi.nlm.nih.gov/pubmed/27306755).
The importance of the glymphatic system in other neurological diseases, in particular multiple sclerosis (MS), has yet to be demonstrated, although some websites associate this system with MS without citing any literature to support such claims.

 

 

Categories: Academia, Junk Sciences, Pharmacology, Sciences, Uncategorized

[Junk Sciences] About that scientific paper retracted from Scientific Reports yesterday and the limits of peer-review

A tenet of becoming a scientist and earning a doctoral degree (Ph.D.) in the hard sciences is developing critical thinking and skepticism toward scientific findings. We learn not to accept scientific claims as facts "just because someone said so" and to fact-check such claims by analyzing the data to see whether they are robust enough to support the conclusions, or whether they are inaccurate, inconclusive or, worse, simply fabricated.
Data fabrication, adulteration, plagiarism and manipulation are unfortunately present in science. This is why peer review plays an important role in separating robust studies from murky or questionable ones. The latter are what I refer to as "junk science": studies that do not stand up to scientific rigor and should never have reached publication. The peer-review process is not perfect. If you want an analogy, consider peer review as the wooden fence lining your backyard: it will not stop a burglar from climbing over it, but it will keep trespassers and marauders from coming too close to your home.
Yesterday, I woke up straight into the middle of a Twitter firestorm about the retraction of a paper. Seeing papers retracted is not uncommon; there is even a website called "Retraction Watch" that tracks studies retracted by scientific journals. But yesterday's paper was so bad that Dr. Derek Lowe, who holds a PhD in chemistry from Duke University, wrote a fiery blog post about it (access was denied soon after I read it, but it seems to be back online this morning) titled "Crap, courtesy of a major scientific publisher".
The problem was not facing yet another junk scientific paper; there are plenty around nowadays, since open-access journals upended the world of scientific publishing and predatory publishers moved in (more on that later). The problem was the journal that published it: Scientific Reports (SciRep, from Nature Publishing Group) (disclosure: I have co-authored a paper published in Scientific Reports). Scientific Reports is NPG's response to open-access (OA) journals such as the Public Library of Science (PLoS). Because it comes from NPG, everyone expects a certain rigor in peer review (Nature is one of the hardest journals in which to get a study published). I always joke that it is so demanding that we end up with "iceberg" papers: studies with five main figures and 50 supplemental figures that are only accessible online.
I thought this debacle would serve well as a poster child to expose some forms of scientific fraud and provide some tips for distinguishing good papers from bad papers.

1. Scientific Publishing 101: Peer-review, open-access, predatory journals and publication fees.

Publication in peer-reviewed journals is the bread and butter of academic researchers. It is as vital for a researcher as a credit report is for anyone living in the US. Two criteria matter in big-time decisions such as finding a job or earning tenure at a university: how many papers carry your name, and in which journals. These metrics are very important, the latter being driven by the impact factor (IF). The IF is the equivalent of a credit rating: the higher, the better. Two giants dominate the field: Nature (from NPG, IF ~42) and Science (from the American Association for the Advancement of Science, or AAAS, IF ~32).
It is so important that the number of papers in these two journals conditions the odds of a researcher getting a job at prestigious institutions such as Harvard, MIT, Stanford or UC Berkeley.
Papers are part of a particular cycle that I am not sure whether to call vicious or virtuous:
1. To publish papers, you need data.
2. To obtain data, you need research funds.
3. To obtain funds, you need to write winning grants.
4. For a grant to have a chance of being funded, you need papers.
5. Repeat step 1.
All peer-reviewed journals follow the same procedure: I submit a draft manuscript that I consider solid enough for peer review to a journal. The editor-in-chief (usually a well-seasoned scientist) decides what to do with it, using both an objective and a subjective point of view. The objective part is whether the paper fits the editorial policy (publishing my BBB work in a plant biology journal, for instance, would make no sense); the subjective part is whether the paper is "attractive" enough for the editor-in-chief. If not, it gets tossed quickly. If it is, the editor proceeds and picks two reviewers with more or less the adequate expertise. Such reviewers are kept anonymous by most journals, with very few exceptions. Reviewers have a moral obligation to keep their review objective and fair. Sometimes they do, sometimes they don't. You can easily imagine that if reviewer X works on the same topic as me, that reviewer senses the risk of being scooped and may work hard to find flaws to get my paper rejected, and then work hard to scoop me.
In the end, two or three reviewers provide their comments and feedback, on which the editor-in-chief bases the decision to accept or reject your paper. Once rejected, you have no choice but to move on to another journal and restart the same game.
The competition is fierce: less than 1% of papers submitted to the top two journals end up being published. This has created an arms race to publish only groundbreaking science that can shake up an entire field, and a fierce competition to get published. This is what I call the "wow factor". But that is only a small part of what gave rise to OA journals, and sometimes it backfires through scientific misconduct (examples: two stem cell papers retracted because of data fabrication, the Hwang paper on the cloning of hESCs from human oocytes published in Science in 2005, and the STAP "pickled stem cells" paper published in Nature in 2014).
The main problem is that, once accepted, a study suffers a double jeopardy in terms of fees: the authors have to pay publication fees to get the accepted paper published (usually ~$1000-3000 per study), and once published, you cede the copyright to the publisher, which then charges anyone wanting to read the paper for access (~$50 per study). This second fee limits how many scientists can read your study, restricts access for scientists from developing countries, and also limits the number of studies that will cite yours. Certain public agencies like the National Institutes of Health (NIH) responded to this issue by requiring any study funded by NIH grants to be made freely available 12 months after publication through their PubMed Central portal.

OA journals were born from these concerns. OA publication follows the same protocol as traditional journals, with two exceptions. First, papers are accepted based on the robustness of the data rather than their novelty or "wow factor": if your paper is not as exciting and groundbreaking as what the top journals demand but is solid and provides the field with a small but reliable piece of information, it will get accepted.
Second, once published, such studies are made open access: anyone can read them freely. This is because, once a paper is accepted, the journal recovers its costs by charging higher publication fees (~$2500-$3500) to the authors of the study.
This is an interesting alternative publication model; however, it also opened a new Wild West in academic publishing. Like any good Western movie, you have wandering snake-oil sellers, and in academic publishing these snake-oil sellers are the predatory journals and publishers. These publishers found easy prey to feed on: academic scientists with studies so poorly designed, or simply fraudulent, that they could not pass the peer-review filters. As long as you give them money in the form of publishing fees, they will publish your paper after an expedited "review". This has led in recent years to the appearance of "junk papers" with little or no scientific merit that nonetheless earn the right to be cited. It also led to a hall of shame, Beall's list of predatory publishers, a database of journals and publishers with suspicious or demonstrated predatory practices. One publisher was even found to have a mailing address pointing… to a suburban house. How serious can this get? This is what feeds most of the pseudoscience out there: anti-GMO activists, anti-vaccine groups, chiropractors, naturopaths and homeopaths all rely heavily on such "junk science" to provide a scientific veneer for their claims.

2. What was it about this paper that triggered such a firestorm and the retraction by Scientific Reports?
The paper in question is titled "Novel piperazine core compound induces death in human liver cancer cells: possible pharmacological properties", by Samie and colleagues from the University of Malaya, Kuala Lumpur, Malaysia, and was published in SciRep last April. The paper was not available through SciRep yesterday but seems to be back online today; I guess the academic firestorm put the server under severe stress.
I will go step by step and explain in comments what is wrong with this paper (see figures below).

[Slides 1-5: annotated figures from the paper, with my comments on the problematic data]

3. Conclusions

After reviewing the paper, you can see how many flaws and how much blatant data manipulation were buried inside it. Peer review cannot be a foolproof system, as some elaborate data fabrication may go unnoticed even by the most seasoned reviewer. Nor would I be surprised to see such a junk study make it through publication if it came from a predatory journal. But seeing such a paper come out of Scientific Reports unnoticed, despite a fairly reasonable turnaround (received October 1, 2015, accepted March 23, 2016, suggesting at least one round of review and the submission of a revised version), is disturbing. The Scientific Reports editorial board has to consider what went wrong and investigate the review history of this paper, including whether the reviewers assigned to it had the necessary expertise and objectivity.
Right now, I would not want to be reviewer 1 or 2 (or even 3) on this junk paper. Garbage in, garbage out.

Categories: Sciences, Uncategorized

[Sciences] Chernobyl Catastrophe – Anatomy of a 30-year-old nuclear disaster

Today marks the 30th anniversary of the Chernobyl catastrophe. I was seven when that catastrophic event happened. A lot of speculation, discussion, fear and some completely botched public relations surrounded the catastrophe. But some of the most unexpected things happened as well.
As I am now 37 years old, I thought it would be a good time to look back: to discuss what exactly happened that night, the days that followed with the huge rush to contain the damage, the dreaded radioactive cloud that swept across Europe and the disastrous PR of the French government, the closure of Pripyat and the unexpected rebound of wildlife, but also the kind of impact the word "radiation" had on my inner psyche, which maybe sparked my interest in the "post-nuke" genre of science fiction.
First, I would say the best way to read this post is with the track "Radioactivity" by Kraftwerk playing, the German pioneers of electronic music who managed to turn raw electronic sounds into some fine proto-electronic music.

1. The Chernobyl power plant:
The Chernobyl nuclear site is located near Pripyat, a small town primarily built for the engineers and staff working at the power plant. Pripyat lies close to the Belarusian border, about 100 kilometers north of Kiev (Ukraine).

[Figure: schematic of a nuclear power plant]
A nuclear power plant generates electricity from the heat released by the fission of uranium-235 nuclei: when a neutron strikes a uranium-235 nucleus, the nucleus splits into lighter fragments, releasing more neutrons and a large amount of thermal energy. This thermal energy converts liquid water into steam, which drives turbines that transform the heat into electrical energy. The chain reaction is sustained by the neutrons released at each fission until the fuel rods are spent; some of these neutrons are also captured by uranium-238, breeding it into plutonium-239, which is itself fissile. The reaction is kept in check by inserting control rods that absorb neutrons (boron carbide rods in the RBMK design, which also relies on graphite as the moderator that slows neutrons down).
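For the physics-minded, a representative fission event and the plutonium-breeding reaction mentioned above look like this (standard textbook values, not specific to the RBMK):

\[ ^{235}\text{U} + n \rightarrow\ ^{141}\text{Ba} +\ ^{92}\text{Kr} + 3n + \sim 200\ \text{MeV} \]
\[ ^{238}\text{U} + n \rightarrow\ ^{239}\text{U} \xrightarrow{\beta^-}\ ^{239}\text{Np} \xrightarrow{\beta^-}\ ^{239}\text{Pu} \]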
At Chernobyl, the reactor in use was of the RBMK type (more can be found on the respective Wikipedia page: https://en.wikipedia.org/wiki/RBMK). RBMK reactors were cheap and efficient but suffered from several design flaws, in particular a high propensity to become unstable when operated outside their nominal range.

2. The events that led to the Chernobyl catastrophe:
A series of events, each of little consequence on its own, led to the most catastrophic accident in the history of civilian nuclear power.
The Chernobyl plant had design features that proved fatal that day: a cooling system based on light (regular) water, and a lag of 60-120 seconds between an emergency shutdown and the activation of the diesel generators needed to keep coolant flowing through the reactor core. A test of this gap was planned for the night shift, when average consumption would be low enough not to cause an electricity shortage. However, an unexpected failure at a regional power plant forced Chernobyl to produce more energy than planned, and the day team postponed the test to the evening team.
As the evening team arrived around midnight, they started the procedure by reducing the reactor power to 700 MW (the minimum considered safe to avoid a shutdown). However, the reactor began to build up xenon-135, a byproduct normally burned off during regular operation. Xenon-135 is an excellent neutron absorber and further decreases the power output, a phenomenon called "reactor poisoning". Unaware that the power had already fallen below the safe level, the operators inserted the control rods too deep, bringing output below 5% of the safe level. The reactor was now running in unstable conditions; safety alarms went off between 00:30 and 00:45, but the operators decided to continue the procedure and delayed their maneuver to restore the power to a higher level. By 01:05, they were at around 30% of the safe level but had already doomed the reactor, as the coolant flow was not sufficient to contain the neutron production and the subsequent overheating. At 01:19, the operators bypassed safety protocols and withdrew fail-safe rods beyond the minimum number required.
At 01:23, the emergency shutdown procedure was initiated, and a series of poor design choices and decisions pushed the reactor into an unstoppable chain reaction. The procedure drove all the control rods into the core, which should have quenched the reaction. However, because of the flawed design of the rods (tipped with graphite displacers) and the loss of the coolant's neutron absorption once the overheated water flashed to steam, a positive feedback loop pushed the reactor further into overheating (power surging from about 200 MW to an estimated 30,000 MW within seconds), vaporizing the coolant and blowing the roof off the reactor under massive steam pressure. A second explosion, estimated at the equivalent of ten tons of TNT, followed and ejected highly radioactive material from the reactor core onto the surroundings, particularly onto reactor 3, as bitumen (a flammable material) had been used in the roofing of the different reactors.

3. The Chernobyl catastrophe and the rapid response:
The radioactivity on site was disastrous, with reports of 30,000 roentgens (R)/hour in the vicinity of the reactor core and 1,000 R/hour near the unit. For comparison, a cumulative dose of about 500 R (for example, 100 R/hour over 5 hours) is considered lethal to human beings. The two main operators died of radiation sickness on May 10 and 14. Despite the reports of the accident, staff in reactor 3 were told to continue their work after taking potassium iodide and donning gas masks. At 5:00, Yuri Bagdasarov (chief of the night-shift team) decided to go against orders and shut down reactor 3.
Radiation levels were so high that the available dosimeters, with a maximum reading of 3.6 R/h, saturated within seconds of exposure, leaving workers with no realistic estimate of their exposure, compounded by a blatant lack of personal protective equipment. The first emergency crews tried to tame the fire with no PPE under exposures far beyond any safety level, condemning these first responders to death by radiation poisoning.
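To get a feel for what those dose rates mean, here is a rough calculation of my own, taking ~500 R as the commonly quoted lethal cumulative dose:

\[ \frac{500\ \text{R}}{30{,}000\ \text{R/h}} \approx 1\ \text{minute} \qquad \frac{500\ \text{R}}{1{,}000\ \text{R/h}} = 30\ \text{minutes} \]

In other words, standing next to the core meant receiving a lethal dose in about a minute, and even "near the unit" it took only half an hour.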
From the morning of the 26th and over the following days, the emergency plan had one goal: to tame the fire inside the reactor and bring the reaction under control by dropping boron, lead and sand over the crater; about 5,000 tons would be dropped. The town of Pripyat and its surroundings were evacuated only by the beginning of the following afternoon. Declassified KGB documents later indicated that the Soviet Politburo in Moscow kept the Ukrainian leadership in the dark during the first day of the accident, downplaying the gravity of the situation by claiming that a fire had occurred in Chernobyl reactor 4 but was under control and extinguished. Pripyat residents started to develop signs of radiation sickness, describing a metallic taste in their mouth, headaches and nausea. One witness of the event, Yuri Andreyev, recalled his experience of that day in a recent BBC report:

Hundreds of reservists came to the site to help control the fire with bare-bones equipment against lethal radiation. Some nicknamed them "liquidators", some "bio-robots". An estimated 4,000 of the roughly 200,000 workers involved are expected to die from radiation-related illnesses.
Yet another danger was lurking, one that, according to some dramatic accounts, could have devastated a much larger area. The RBMK design has two pools of coolant water underneath the reactor. Had the molten core melted through the concrete basement and reached that water, the result would have been a massive steam explosion, ejecting far more radioactive material and escalating the accident into an even larger catastrophe. The only way to avoid this was to manually open the purge valves and empty the coolant reservoirs, which meant wading through the flooded compartments under intense radiation. Alexei Ananenko, Valeri Bezpalov and Boris Baranov carried out the mission and were acclaimed as heroes; contrary to a long-repeated account, all three survived the dive.
The combination of still-burning fuel blended with molten graphite and concrete created a peculiar state of matter called corium, which behaves much like lava. As this corium started to pour toward the pools, an emergency task force was assembled to dig a tunnel under the reactor and pour an additional concrete slab, with miners digging 24/7 against the clock. In December 1986, a scientific team, with the help of robots, entered the reactor area and discovered that some of this corium had solidified into a formation now referred to as the "elephant foot". The mass was so radioactive that it fogged the photographic film, inadvertently creating one of the eeriest "selfies" ever taken.

[Image: the "elephant foot" corium formation beneath reactor 4]

In the days after the explosion, radioactive particles carried by the plume moved west toward Europe. Once elevated atmospheric radiation levels were detected in Finland, Sweden and Austria, the USSR could no longer cover up the disaster and, following inquiries relayed by the IAEA, had to publicly acknowledge the catastrophe.
Another disaster was just starting to fester, this time a PR disaster. Germany quickly advised citizens in the south of the country (including Baden-Wurttemberg and Bavaria) not to eat fresh fruits and vegetables from their gardens or game from hunting.
In France, on April 30th, 1986, the French government tried to downplay exposure to the radioactive cloud by using a reassuring weather forecast claiming that an anticyclone coming from the Azores would deflect the cloud away from the eastern side of France, as depicted in these archive footages:
http://tempsreel.nouvelobs.com/monde/20160426.OBS9235/video-tchernobyl-revoyez-ce-bulletin-meteo-ou-le-nuage-radioactif-contourne-la-france.html/
http://www.ina.fr/video/CAB03006932

However, in the days following the passage of the radioactive cloud, journalists measured increased radiation in fresh fruits and vegetables in Alsace (my birth region), marking it as one of the regions hit hardest by the fallout.

4. Aftermath of Chernobyl on the local fauna and flora:
The Chernobyl accident had direct consequences for the Ukrainian and Belarusian populations, with increased cases of thyroid cancer and increased congenital malformations during the first five years following the accident. In the vicinity of the plant, a pine forest received a dose of radiation high enough to turn the trees from emerald green to red, a sign of major stress and die-off; it is now classically referred to as the "Red Forest".
The area is now a ghost town surrounded by an exclusion zone of around 20 miles. Surprisingly, reactors 1, 2 and 3 were kept operational for up to 15 years after the accident and were only shut down for good in 2000. Wildlife has bounced back and has been surprisingly capable of adapting to the irradiated environment. There are no blatant signs of mutations, but in an uncanny outcome, the Chernobyl site has become an interesting natural experiment for studying the effects of chronic radiation, and possibly accelerated mutation rates, within an ecosystem.
Mutations are a key ingredient of evolution. Under normal conditions, mutations occur at a very low frequency. However, ionizing radiation (alpha and beta particles, gamma rays) can damage the DNA of living organisms throughout the ecosystem. Under normal circumstances, our DNA repair toolkit fixes DNA damage with fairly high fidelity, but extensive damage can overwhelm proper repair and increase mutation rates.
Some surprising reports describe how the ecosystem has repopulated the whole area within 30 years of the absence of human activity. Another recent study highlighted the persistence of fallen leaves that barely decompose, suggesting that radiation at the soil surface has suppressed the microbes and invertebrates that normally break down litter. There are even reports of a particular fungus growing in Unit 4 of the plant that may have evolved traits allowing it to withstand radiation damage thanks to melanin pigments. The area, though, remains highly radioactive.
Currently, there is an active project to build a new confinement structure over the existing but decaying sarcophagus. A continuing concern is the constant monitoring of the "elephant foot", which is still highly radioactive and will take centuries, if not millennia, to decay.
A BBC crew filmed inside reactor 4 a few years ago:

For more information:
The BBC has done remarkable work reporting on the Chernobyl disaster and even produced a docudrama retracing the catastrophe, which can be watched below.

Among the academic institutions with a dedicated page on the effects of Chernobyl on wildlife, the Department of Biology at Texas Tech University has an interesting website that is worth visiting:
http://www.nsrl.ttu.edu/chornobyl/
http://chernobyl.ttu.edu

 

 


 

 

Categories: Neurosciences, Sciences, Stroke, Uncategorized

[Sciences] International Stroke Conference 2016 – Day 1 summary


I am in Los Angeles this week to attend the International Stroke Conference 2016, back in town after 5 years. It is a two-and-a-half-day conference gathering basic scientists, nurses, physicians and other healthcare professionals (an estimated 5,000 attendees) to discuss the latest news in stroke research, from the bench to the latest innovations in the clinic.

Today was a very informative day, with lots of good sessions going on. The day started early with a 7:00 AM session on the "blood-brain barrier during stroke injury" chaired by Professor Gregory Bix (University of Kentucky), which had a lineup of very good talks. Of particular interest was his research on the extracellular matrix as an active player in stroke recovery. A lot of scientists see the ECM as little more than a scaffold maintaining blood vessel structure. It turns out it is more than a protein scaffold and plays an important role in the cellular response to injury, especially during stroke. In particular, he showed that integrins (a class of proteins involved in cell-matrix interactions) matter for stroke outcome, as suppressing integrin alpha5beta1 had a dramatic effect on infarct size and BBB integrity. Another talk, by Dr. Dritan Agalliu (Columbia University), shed new light on the disruption of tight junction complexes during stroke, with novel mechanisms and concepts that had not been investigated much until now. Another presentation, by Dr. Martha O'Donnell (University of California, Davis), highlighted the importance of ion channels and transporters in the brain endothelium during stroke injury, especially as mediators of brain swelling. We neuroscientists tend to associate ion channels exclusively with neurons (due to their excitable nature), eclipsing their importance for the other cells of the neurovascular unit. Her presentation was a great reminder of how important such channels and transporters are.
Finally, the first session was concluded by Dr. Patrick Ronaldson (University of Arizona), who demonstrated that another class of transporters, classically referred to as drug transporters, is also sensitive to stroke injury, and that by better understanding how hypoxia/ischemia alters their expression, we may use them as a novel route for delivering therapeutics across the BBB.
The second morning session I attended was a discussion panel on using stem cells as therapeutics for stroke injury. It was a vivid and engaging session with several big names in the field, including Prof. Steve Cramer (University of California, Irvine), who charismatically called for more stem cell research in stroke, and Prof. Tom Carmichael (University of California, Los Angeles), who described over 15 years of stem cell therapy in stroke. It was amazing to see how far we have come, but also how quickly scientists and the general public became over-enthusiastic about it (as we did with gene therapy). We have learned a lot in these last 15 years and seen huge progress in stem cell research, but we have also seen huge failures and disappointments, some due to our limited knowledge of stem cell biology at the time, others due to poorly designed clinical studies that produced disappointing results. This led the field to a move similar to what was done for experimental stroke models through the STAIR committee (Stroke Treatment Academic Industry Roundtable), which set guidelines and recommendations for better stroke models, with the creation of STEPS to apply the same kind of guidelines and recommendations to stem cell-based therapies. One presenter used the analogy of the year of the monkey, which comes back every 12 years: during the last one, stem cell therapy looked very promising but suffered several setbacks; now that the year of the monkey is back, we have learned a lot about these cells, and maybe by the next one we will have some successful clinical trials.
One analogy used by Dr. Carmichael is the idea of versions. We had “stem cells v0.25” 15 years ago, we are now at “stem cells v1.0”, and we are moving toward “version 2.0”. I personally see the field as ripe to use iPSCs to better model the neurovascular unit during stroke injury and to take a bi-directional approach, rather than the old linear “bench to bedside” one, if we want a chance of finding a treatment for stroke injury.

The afternoon session was just as good, with short communications on basic mechanisms of stroke injury as well as a poster tour. The nice thing about posters is that they give you time to meet and greet, ask questions and also find some old friends around. What I can say is that by the end of the day I was washed out, both metaphorically and literally (rain was pouring down on LA as I walked back to my hotel, and I ended up thoroughly soaked).

Categories
Neurosciences Pharmacology Sciences Uncategorized

[Sciences/Pharmacology] Bial BIA-10-2474 – Lessons from a tragic Phase I clinical trial ending

 


[UPDATE]: The French drug safety agency (ANSM) has released the protocol used by Bial and approved by the agency (PDF in French), as reported in an article published in Nature (http://www.nature.com/news/researchers-question-design-of-fatal-french-clinical-trial-1.19221?WT.mc_id=FBK_NatureNews): press release and PDF file in French.

If you have been following the news, you may have heard about the tragic ending of a Phase I clinical trial carried out in Rennes (France) for a new painkiller, initially described as a cannabis-like compound before that description was refuted.
The drug was set for a Phase I trial, with an initial 90 volunteers receiving repeated administrations of the same dose in a placebo-controlled experimental setup. Out of the 90 volunteers, four are in serious clinical condition, one was declared brain dead within 24 hours and, as I write this post, his death has just been announced in a French newspaper (http://www.dna.fr/actualite/2016/01/17/essai-therapeutique-le-patient-en-etat-de-mort-cerebrale-decede).
After waiting a few days for the fog of breaking news and conflicting reports to dissipate, I thought it would be helpful to write a post summarizing the information about the drug candidate, the operating procedures behind Phase I trials and the current rules on human experimentation. This is not an exhaustive review, and comments are welcome to help improve this article.

What was the drug candidate tested?

According to the article posted in the French daily newspaper “Dernieres Nouvelles d’Alsace” dated January 17th, 2016 (see link above), the drug candidate used in the clinical trial is “BIA-10-2474”. For the chemistry enthusiasts around, here is the chemical structure of BIA, according to Pierre van de Weghe (Universite Rennes 1, Rennes, France).

 

BIA-10-2474 (which I will abbreviate as BIA in the rest of the text) is an experimental inhibitor of fatty acid amide hydrolase (FAAH), an enzyme involved in the degradation of anandamide, an endogenous ligand of the endocannabinoid system (as depicted in the following diagram).

[Figure: diagram of the endocannabinoid system]

The endocannabinoid system is the same system targeted by Δ9-THC, one of the active compounds found in cannabis. Aside from its psychoactive effects, it also has some analgesic properties. Therefore, a drug acting on the endocannabinoid system may represent an alternative to opiates.
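As a purely illustrative aside (not data from Bial), here is a back-of-the-envelope Python sketch of how target engagement for an enzyme like FAAH scales with inhibitor concentration under a simple reversible, one-site occupancy model. The IC50 is a made-up number, and the actual potency and mechanism of BIA-10-2474 are not described in this post.

```python
# Purely illustrative sketch (not Bial data): fraction of an enzyme such as FAAH inhibited
# as a function of inhibitor concentration, under a simple reversible one-site model.
# The IC50 below is a made-up number.

def fractional_inhibition(inhibitor_nM, ic50_nM):
    """Fraction of enzyme occupied/inhibited: [I] / ([I] + IC50)."""
    return inhibitor_nM / (inhibitor_nM + ic50_nM)

hypothetical_ic50 = 50.0  # nM, arbitrary illustrative value
for conc in (5, 50, 500, 5000):
    print(f"[I] = {conc:>4} nM -> {fractional_inhibition(conc, hypothetical_ic50):.0%} of target inhibited")
```

The take-home point of this toy calculation is that target engagement saturates: beyond a certain concentration, raising the dose barely increases inhibition of the intended target but keeps increasing overall drug exposure, which is one generic reason why high repeated doses deserve particular caution.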

Bial, a Portuguese pharmaceutical company, developed BIA as one of the products in its pipeline, as depicted on the company website:

[Figure: Bial development pipeline, from the company website]

BIA was not the only drug candidate in the pipeline: various other compounds, especially ones targeting neurological disorders such as epilepsy and Parkinson’s disease, went through Phase III clinical trials and are a few steps away from approval by the EU drug approval authorities.

BIA was a completely internal product, with no peer-reviewed publications listed in PubMed (this is fairly common, given the substantial risk of being scooped by other pharmaceutical companies capable of synthesizing such compounds through reverse engineering).

What is a Phase I clinical trial and what happened?

Drug discovery in pharmaceutical research generally follows the same pattern. Basic science identifies some novel mechanism of a disease, for instance a novel signaling pathway driven by a protein, a nucleic acid, a sugar or a fatty acid derivative, which becomes a potential drug target.

Pharmaceutical companies have huge chemical libraries, either synthesized by medicinal chemists or extracted from natural compounds, that can contain up to millions of different compounds. Once a novel target has been identified and a biological assay developed, these companies will screen their whole libraries in a procedure called “high-throughput screening” (HTS), run by automated robots capable of testing thousands of compounds per day.
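To make the idea concrete, here is a minimal, hypothetical sketch of the hit-calling step: normalize each well against plate controls and flag compounds above an arbitrary activity threshold. Real HTS pipelines add plate quality metrics, replicates and counter-screens, so treat this purely as an illustration; every name and number below is made up.

```python
# Minimal, illustrative sketch of HTS hit calling (not any company's actual pipeline).
# Assumes each well reports a raw assay signal; positive/negative controls define the scale.

def percent_inhibition(signal, neg_ctrl_mean, pos_ctrl_mean):
    """Normalize a raw well signal to 0-100% inhibition using plate controls."""
    return 100.0 * (neg_ctrl_mean - signal) / (neg_ctrl_mean - pos_ctrl_mean)

def call_hits(wells, neg_ctrl_mean, pos_ctrl_mean, threshold=50.0):
    """Return compound IDs whose normalized inhibition exceeds the (arbitrary) threshold."""
    hits = []
    for compound_id, signal in wells.items():
        if percent_inhibition(signal, neg_ctrl_mean, pos_ctrl_mean) >= threshold:
            hits.append(compound_id)
    return hits

# Toy example: 4 hypothetical compounds, controls at 1000 (no inhibition) and 100 (full inhibition)
wells = {"CMPD-001": 950, "CMPD-002": 400, "CMPD-003": 120, "CMPD-004": 800}
print(call_hits(wells, neg_ctrl_mean=1000, pos_ctrl_mean=100))  # -> ['CMPD-002', 'CMPD-003']
```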

Once they find a positive result, a “hit”, they identify the compound and, through fine medicinal chemistry, work on refining its biological activity, synthesizing a lead compound alongside hundreds of analogs differing by subtle changes in structure.

Once they have found the best-matching compound, it undergoes screening using cell-based assays (in vitro) and animal-based assays (in vivo) to determine whether it has any potential. If it shows therapeutic potential, it goes into pharmacokinetics and drug metabolism testing to determine how the compound is handled by the whole organism, as well as toxicology testing (to detect any toxic effect, in particular neurotoxicity, cardiotoxicity, hepatotoxicity and renal toxicity). This step is important as it is the prelude to the Phase I clinical trial. Once a compound crosses these different checkpoints, the lucky one gets into a Phase I trial. A Phase I clinical trial is all about assessing whether the compound is safe, identifying any toxicity or side effects, and determining the maximum dose that is deemed safe (what we call the maximum tolerated dose, or MTD). Most of the time, Phase I is run in healthy volunteers, but it may sometimes be performed in patient volunteers affected by a condition deemed incurable at a late stage of the disease.
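As an illustration of the MTD-hunting logic only (this is not the Bial/Biotrial protocol, whose actual single- and multiple-ascending-dose design is described in the ANSM documents linked above), here is a naive sketch of an ascending-dose escalation loop: dose cohorts at increasing levels and stop at the first safety signal. All dose levels and the toxicity cutoff are hypothetical.

```python
# Illustrative sketch only: a naive ascending-dose escalation loop for finding a maximum
# tolerated dose (MTD). This is NOT the Bial/Biotrial protocol.

def escalate(dose_levels, tolerated):
    """
    dose_levels: ordered list of planned doses (e.g. mg).
    tolerated:   callable returning True if a cohort tolerates the dose
                 (in reality this is a clinical safety review, not a function call).
    Returns the highest dose cleared before the first stopping signal, or None.
    """
    mtd = None
    for dose in dose_levels:
        if tolerated(dose):
            mtd = dose   # cohort cleared: this dose is provisionally tolerated
        else:
            break        # stopping rule triggered: do not escalate further
    return mtd

# Toy example with hypothetical doses and a hypothetical toxicity cutoff at 40 mg
planned = [2.5, 5, 10, 20, 40, 80]
print(escalate(planned, tolerated=lambda d: d < 40))  # -> 20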
In the case of BIA, the trial was performed on healthy volunteers by a contract research organization named “Biotrial”, under authorization from the French drug safety agency (ANSM), as described on its website (http://www.ansm.sante.fr/S-informer/Actualite/La-survenue-d-effets-graves-ayant-entraine-l-hospitalisation-de-6-patients-dont-un-en-etat-de-mort-cerebrale-a-conduit-a-l-arret-premature-d-un-essai-clinique-du-laboratoire-BIAL-Point-d-information).
Biomedical experimentation on human subjects is a sensitive issue with a troubled history. Until the Second World War, experimentation on human subjects was fairly unregulated and looked more like a “wild west” in which physicians and scientists had a large degree of latitude, sometimes at the expense of the volunteers’ integrity. The Second World War radically changed how human subjects are involved in experiments: Nazi scientists (including physicians such as “Dr. Mengele”, who became infamous for the most outrageous “medical” experiments) found in the concentration camps a trove of human subjects for experiments that, at best, had little or no scientific merit in their design or rationale and, at worst, represented the cruelest forms of torture and sadism that one human being can inflict on another.
After the end of the Second World War, an international court known as the Nuremberg Trial took place in Nuremberg, Germany, in 1946 to prosecute the SS chain of command responsible for one of the worst genocides documented in human history. A second trial followed, aimed at prosecuting the Nazi physicians. This second trial laid the fundamental basis for the involvement of human subjects in medical experiments, known as the “Nuremberg Code” (https://en.wikipedia.org/wiki/Nuremberg_Code).
A key aspect of the Nuremberg Code is that enrollment must be voluntary; compensation should therefore never pressure human subjects into enrolling in, or remaining in, an experiment.
Financial compensation is not obligatory and is rather meant as compensation for the time volunteers have dedicated to the experiment. In this case, an article from Science (http://www.sciencemag.org/news/2016/01/more-details-emerge-fateful-french-drug-trial) mentioned an honorarium of EUR1900 in addition to travel fare reimbursement. The study was designed over a two-week period, with 10 oral administrations and at least 40 blood samplings (likely for the PK and toxicology studies).
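Since the design involved 10 oral doses over two weeks, a toy pharmacokinetic sketch may help illustrate why repeated dosing matters: with first-order elimination, each new dose stacks on what is left of the previous ones, so trough levels creep up toward a steady state. Every number below is made up; I have no access to the actual PK data of BIA-10-2474.

```python
import math

# Illustrative one-compartment oral PK sketch (first-order absorption and elimination),
# showing why repeated daily dosing leads to drug accumulation toward a steady state.
# All parameters (dose, F, V, ka, ke) are hypothetical, not BIA-10-2474 values.

def concentration(t_h, dose_times_h, dose_mg=50, F=0.8, V_L=40.0, ka=1.0, ke=0.03):
    """Plasma concentration (mg/L) at time t_h, by superposition of single-dose profiles."""
    c = 0.0
    for td in dose_times_h:
        if t_h >= td:
            dt = t_h - td
            # Bateman equation for a single oral dose
            c += (F * dose_mg * ka) / (V_L * (ka - ke)) * (math.exp(-ke * dt) - math.exp(-ka * dt))
    return c

dose_times = [24.0 * i for i in range(10)]   # 10 once-daily oral doses
for n in (1, 4, 9):                          # trough just before doses 2, 5 and 10
    print(f"trough before dose {n + 1}: {concentration(24.0 * n - 0.01, dose_times):.2f} mg/L")
```

With these made-up parameters (a hypothetical half-life of roughly one day), the trough concentration roughly doubles between the first and the last dose, which is exactly the kind of accumulation a multiple-ascending-dose stage is meant to characterize.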
The first round of the trial, in the single-dose setting, was completed successfully, with no adverse effects noted. However, the situation turned during the third stage of the trial, designed to find the highest dose and the effect of repeated doses. This stage enrolled 90 volunteers, males and females aged 18-55. Five of them presented with the same symptoms, including one who showed brain death and whose death was announced as I was writing this post.
According to the same article from Science, quoting Gilles Edan, head of the neurosciences division of Rennes University Hospital: “… neurologist Gilles Edan of the University of Rennes Hospital Center said yesterday. MRI imaging has shown “deep, necrotic and hemorrhagic lesions in the brain” of the patients, Edan said.” Similar information was also reported by Forbes (http://www.forbes.com/sites/judystone/2016/01/16/bials-french-clinicial-trial-ends-in-disaster-what-went-wrong/#2715e4857a0b309e43439b2c), and the Forbes article rightfully raises some interesting questions: “There are so many questions, as little information has been released. We don’t know what doses this group of ill volunteers received, or how that was different from earlier groups. It was likely the first of a higher dosage. We don’t know if or how food affected the drug’s metabolism. Could there have been a contaminant causing disaster in one batch of drug? While much less likely with an oral than IV drug, an error in manufacturing or the concentration of drug could perhaps have affected one batch and not others.”
A lot of information still remains unknown.
One speculation I may raise, based on the reported “necrotic and hemorrhagic lesions”, is a possible disruption of the blood-brain barrier in these patients, leading to massive brain hemorrhage and the subsequent brain damage. This explanation would be consistent with the fact that the physicians extensively used MRI scans to assess the damage.
The Science article on this story mentioned that other companies, like Pfizer, found no undesirable effects with their own FAAH inhibitors but dropped those compounds after failing to show clinical efficacy in Phase II trials. The relationship between FAAH and barrier function remains unclear, but a study by Piomelli and colleagues (Moreno-Sanz G. et al., Pharm Res 2014) on another FAAH inhibitor sharing some structural similarities with BIA suggested interactions with P-gp (ABCB1) and BCRP (ABCG2), two drug efflux pumps at the BBB (http://www.ncbi.nlm.nih.gov/pubmed/24993496). Another study, by O’Sullivan and colleagues (Hind WH et al., Br J Pharmacol 2015), highlighted the protective effect of anandamide and oleoylethanolamide (two endogenous cannabinoid ligands and FAAH substrates) on barrier function in an in vitro BBB model of stroke injury (http://www.ncbi.nlm.nih.gov/pubmed/25651941).
Hopefully, we will learn what went wrong over the next few weeks.

Categories
Sciences

[Sciences] The TB-BCG vaccine, why I love vaccines and how certain vaccines give us immunity for life.

[Photo: tuberculin (TB) skin test]

A couple of days ago, I had to undergo a physical exam for some administrative process. Among the different steps of the physical exam, I had to have a TB test. TB stands for tuberculosis, which is caused by an infectious agent, Mycobacterium tuberculosis. The Mycobacterium genus contains some nasty germs, including the one causing tuberculosis and, foremost, the one causing leprosy. These are germs usually associated with poor hygiene, and they are highly contagious.
M. tuberculosis is a peculiar bacterium that is very challenging to grow in culture, and until modern molecular biology techniques arrived, differential staining using the Ziehl-Neelsen technique (carbol fuchsin, with malachite green as a counterstain) was used to identify these germs.
In the pre-antibiotic era, the only solutions that worked to some extent were what we called in Europe the “sanatorium”, where fresh, high-altitude air and good nutrition would help “natural immunity” fight off the infection, along with strict law enforcement against spitting in public areas. The arrival of streptomycin and the development of an attenuated vaccine called “bacille de Calmette-Guerin” (hence the acronym BCG) resulted in a huge decrease in tuberculosis cases worldwide, leading to the closure of sanatoriums all over the world.
The BCG was a very popular vaccine, and according to my French immunization record, my last shot dates from 1985, with a tuberculin test performed in 1993. This test, as seen in the picture, consists of an intradermal injection of tuberculin, a protein preparation derived from M. tuberculosis (virulent strains and BCG alike). These surface proteins act as a “passport” for our immune system: if the immune system recognizes that your papers are not valid, it will pull the foreign agent aside, as if you had failed to show valid documents at a US Customs and Border Protection booth. Its job is to kill and destroy anything that does not hold the right “passport”. This is where vaccines are important: they act as fake intruders and train your immune system to recognize nasty agents. Thus, if a real threat appears, the immune system is able to immediately mount an appropriate response and neutralize the threat, and also to store the useful information so that newer generations of “border agents” recognize it. These are the memory B-cells and T-cells that persist for a more or less long time, and they are the direct output of the vaccination process.
That brings me back to my injection: my body had not been in contact with this germ or these proteins for a good twenty years, and had never been reminded of them. I got the test injected, and it took just 24 hours for a marked inflammation (a hot, red, swollen and painful spot) to show that my whole immune system had gone on full alert and identified a threat, twenty years after last encountering the agent.

I never had the chance to see a sanatorium in my life, and that tells us something about the impact of vaccines on public health. If such medical institutions, and diseases like measles, mumps, rubella, whooping cough, polio and shingles, appear old or harmless, it is because vaccines have done a tremendous job protecting us against them. Do we still hear about sanatoriums or iron lungs? Not anymore, because vaccines worked and still work to protect us. And when everybody is vaccinated, the germs have no safe house in which to propagate and spread; this is what we call “herd immunity”. By protecting ourselves, we also protect those who cannot be protected, because they are too young to get their first shot or have a compromised immune system for various reasons.
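To put a rough number on “when everybody is vaccinated”, here is a back-of-the-envelope sketch of the classic herd immunity threshold, 1 - 1/R0. The R0 values are approximate textbook figures taken at a single illustrative value each, so treat the percentages as ballpark estimates rather than precise public-health targets.

```python
# Back-of-the-envelope sketch of the classic herd immunity threshold, 1 - 1/R0,
# i.e. the fraction of the population that must be immune for an outbreak to die out.
# The R0 values below are rough, commonly cited textbook figures, not precise measurements.

def herd_immunity_threshold(r0):
    """Fraction of the population that needs to be immune (simple SIR reasoning)."""
    return 1.0 - 1.0 / r0

for disease, r0 in {"measles": 15, "pertussis": 14, "polio": 6, "mumps": 5}.items():
    print(f"{disease}: R0~{r0} -> ~{herd_immunity_threshold(r0):.0%} immunity needed")
```

For measles this lands above 90 percent, which is why even a modest drop in vaccination coverage can reopen the door to outbreaks.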
However, the recent rise of the “anti-vaxxers”, a number of celebrities holding no credentials in biomedical sciences or public health, but known mostly for talking out of their gluteal muscles or for an “ooh-la-la” career, has spearheaded a movement associating vaccines with death, autism and other conditions. This of course resulted in the recent increase in cases of measles and other diseases we considered eradicated (but which were just hiding and kept at bay). It shows that deliberately breaching our protection and our herd immunity on the basis of fallacious and pseudoscientific claims can have rapid and damaging effects on global health, including several fatalities in various parts of the world. Just sit down and think about the fact that some states, like Oregon and Washington, have seen their vaccination rates plummet so much that they fell below those of some developing countries.
Vaccines work. Vaccines keep us safe from diseases that were so devastating 50 years ago that vaccines became victims of their own success and wiped those diseases from our collective memory. I love my vaccines, and I remind everyone to check their immunization records and make sure they are up to date. And foremost, vaccinate your damn kids; they will thank you later in life!
