Pharmacokinetics: Drug development's broken stair (trevorklee.substack.com)
110 points by Ariarule 9 months ago | 50 comments



Yes, pharma is empirical, not predictive. We are very far from understanding biology well enough to be predictive (except on TV).

Look at Lowe’s recent column in Science: target-based drug programs seem like an obviously sensible approach, but in practice they have not been fruitful: https://www.science.org/content/blog-post/target-based-drug-...

Biology is still art and luck.

Note: I’m a former small molecule pharma developer myself.


Yup, I used to do small molecule work as well.

I remember when we were doing some early optimizing work and we had the computational chemists do some calculations on how to optimize drug binding to the receptor. "Oh, just put a methyl group here and you'll improve binding by 10x".

So we go and make the molecule and a few weeks later (it wasn't a trivial change) we go and run the assay and.... the binding is 1000x worse.

No doubt the computational methods have gotten better since this story happened, but our understanding of biology is so limited that even using AI to incorporate all known biology into a prediction isn't going to give you great results.

I do see it helping in the sense of guiding optimization - "try and improve binding on this part of the molecule" - but we're very, very far away from entering some data into a model and having it spit out a result that solves all the issues at hand.


>We are very far from understanding biology well enough to be predictive (except on TV)

I'm very excited for the next season of Small Drug Discovery on NBC. I hear the first 5 episodes this season are just screening candidates.


I actually got excited about there being a TV series about drug discovery when I read your comment, until I realized it was a joke (I'm tired).


> the first 5 episodes this season are just screening candidates

This was a dead giveaway. The first 99 episodes are just screening candidates.

The 100th anniversary episode is about the one candidate that looked promising but failed phase I because a patient thought he was el pollero.


I think the finale will cover a phase 1 trial!


When the drug doesn’t work they just push the “enhance” button and shareholder value goes up.


One of the most wildly productive periods in small molecule drug discovery was back when crazy, mostly German, chemists would taste their products as part of characterizing them.

That's why, among other things, we have such wonderfully high quality data on progressive mercury poisoning.


Saccharin was discovered when Remsen rolled a cigarette and noticed that the paper tasted sweet (there is a non-tobacco version of this story for the faint of heart).

And Alexander Fleming discovered penicillin because some bread from an old sandwich did not become moldy.

Modern lab procedures preclude any such discovery today.


...

Penicillin comes from a mould. Mould had been known for centuries to have special properties that resist infection, but there was no scientific evidence to back it up.

The bacterial culture was accidentally left open during a vacation and some mould started growing on it. When Fleming returned, the mould had killed off the bacteria. He isolated the compound the mould produced (penicillin) and had a provable antibiotic.

Lab errors still happen, though, because they are still run by humans.


You are right, but doing the kind of work proposed in the essay is how it very slowly becomes better understood and more predictive. Obviously there won't ever be one breakthrough that solves everything, and most likely the vast majority of such attempts will fail, but just throwing up our hands and saying "that's just how it works!" is how you guarantee it never changes. And this is coming from someone who works in perhaps an even less predictive and more empirical field.


Are you implying that people haven’t been and aren’t working hard to make it more predictive? This essay gives the impression that the author has just dipped their toe into the field.

Everybody has to start somewhere, but it helps to have some depth of understanding before proclaiming others idiots (a common pastime of startups these days, it seems).

When I got into pharma I could easily see a dozen obvious ways to speed things up and/or add therapeutic value. Fortunately I kept my mouth shut and learned that all but one had been tried. That one that made us unique was more of a social practice.


I am implying nothing. I don't work in the field, and, as you say, have no idea what is or isn't being done. If, as you suggest, the kind of work being suggested by the author is already being pursued, that probably would have been a more useful comment than the one you made, which didn't say anything of the sort and seemed pretty defeatist - just "that's the way it is" - which is what prompted my reply.


‘Predictive’ = ‘empirical’ + non-ad-hoc explanations


Sometimes the best you can get in pharma is post hoc. And sometimes it turns out the apparent mechanism of action is wrong.

Just look at the recent GLP-1 agonists like Ozempic. They are still largely a mystery.


That's why drugs go through clinical trials: because of ceteris paribus - researchers can't figure out the conditions under which cetera are paria. In many human situations, mere instrumentalism (a philosophical stance) would do, without even understanding the underlying reality.


It’s shocking how many fail in phase 3. You don’t go into an eye-wateringly expensive phase 3 unless it’s blindingly obvious from phase 2 (the dose-ranging trial) that you have a winner. And you don’t embark on phase 2 unless you have good safety data from phase 1 (often a combined phase 1/2 if you can swing it, so you have some belief the drug is efficacious in humans).

I once had a board member with 26 approved drugs from his long career at Roche. He told me “you don’t want to know how many failed after hundreds of millions had been spent. You’d be too discouraged.”


I like how the author has discovered that a problem that costs pharma companies billions of dollars actually has a quick fix: simply model how livers react to infinite compounds.

Since all livers behave the same and there is no variation based on gene expression or disease, this is a trivial task! The reason why this hasn’t been Solved is because none have been Disruptive enough to dream of doing research in this area.


Moreover, pharma companies and academics already do model PK... it's a huge field of study, not some ignored "broken stair" - there is definitely "serious money" behind this.


Author here. That's not what I said. Please don't be a jerk on the Internet.


You literally wrote

> It would just require the raw data from a variety of pharmacokinetic trials, some in-depth experiments on human liver and gastric membranes, and some simulation of the physics of how different drugs diffuse into the bloodstream and across membranes. This would be difficult, but not impossible, and would not require any huge scientific advances.


Yes, what do you think the pk trials and liver experiments would involve if not people who have a variety of phenotypes?


Indeed. The use of computational methods has been going on for, I don't know, 40 years at least? Since the advent of computers.

This is not a new approach nor one that hasn't already been exhaustively explored at a massive cost.


In the majority of cases, drugs simply fail due to issues with solubility and absorption, not metabolism.


> We could fix this stair. It would just require the raw data from a variety of pharmacokinetic trials, some in-depth experiments on human liver and gastric membranes, and some simulation of the physics of how different drugs diffuse into the bloodstream and across membranes.

This statement seems overly optimistic. Predictive Pharmacokinetics that obviates most trials would need to model all possible drugs for all people. The computational complexity of that problem seems out of reach. The best way to get to know complex adaptive systems (which can’t be properly simulated most of the time) is to test them empirically.
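For readers outside the field, the scale of even the "simple" version of this problem shows up in the textbook one-compartment oral-absorption model (the Bateman function). A minimal sketch in Python, with made-up parameter values rather than anything from the article:

```python
import math

def concentration(t, dose, F=0.5, V=40.0, ka=1.0, ke=0.1):
    """Plasma concentration (mg/L) at time t (hours) for a one-compartment
    model with first-order absorption and elimination (Bateman function).
    F: oral bioavailability, V: volume of distribution (L),
    ka, ke: absorption and elimination rate constants (1/h)."""
    return (F * dose * ka) / (V * (ka - ke)) * (math.exp(-ke * t) - math.exp(-ka * t))

# Hourly concentrations after a hypothetical 100 mg oral dose.
curve = [concentration(t, dose=100) for t in range(25)]
cmax = max(curve)           # peak concentration
tmax = curve.index(cmax)    # hour at which the peak occurs
```

Even this toy model has four free parameters (F, V, ka, ke) that are jointly drug- and patient-specific; "predictive pharmacokinetics" means estimating all of them, plus everything the one-compartment simplification throws away, for a novel compound in an unseen person.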


This is very interesting, but I think it massively underestimates the variability in how small molecule drugs are affected by the body. Small differences in molecules can have drastic effects on where, how, and how quickly they're absorbed, and what systems they affect.

More than anything, I think this just underestimates the unpredictability of it all. To extend the author's metaphor, if you measure the 100 data points from shooting 10 different projectiles out of 10 different devices, on 10 different celestial bodies, you're still not going to have a lot of predictive power that can be generalized.


This also ignores the effect of the microbiome, which metabolizes many drug compounds, and the way pre-existing conditions affect drug metabolism.


And we haven’t even touched modeling, testing or predicting all of the genetic variants banging around in humans that strongly affect all aspects of pharmacokinetics.

Half of pharmacokinetics is pharmacoGENETICS.


I'm surprised the article didn't mention anything about pharmacogenetics. For example, drug-metabolizing enzyme polymorphisms can often lead to a 10+ fold difference in Cmax (or name your parameter) between a poor and an ultrarapid metabolizer for the relevant enzyme.
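Cmax sensitivity to phenotype is drug-dependent, but the direction is easy to see from first principles: in any linear (first-order) model, total exposure is AUC = F × dose / CL, so exposure scales inversely with clearance. A back-of-the-envelope sketch with hypothetical clearance values (illustrative numbers only, not for any real drug or enzyme):

```python
# Hypothetical clearances (L/h) for a poor vs. an ultrarapid metabolizer.
dose, F = 100.0, 0.5                 # mg, oral bioavailability
cl_poor, cl_ultrarapid = 2.0, 20.0   # assumed 10x clearance difference

# For linear kinetics, total exposure AUC = F * dose / CL (mg*h/L).
auc_poor = F * dose / cl_poor
auc_ultrarapid = F * dose / cl_ultrarapid
fold_difference = auc_poor / auc_ultrarapid  # exposure ratio between phenotypes
```

A 10x clearance difference translates directly into a 10x exposure (AUC) difference; how much of that shows up in Cmax specifically depends on the absorption kinetics.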


Author here. Enzyme polymorphisms are tricky and would require a whole different blog post. Some drugs they matter a lot for, some drugs they do not. I actually have not found any drugs with 10-fold variability that could be ascribed solely (or even mainly) to polymorphisms, although it doesn't seem impossible.


https://www.ncbi.nlm.nih.gov/books/NBK84174/

Here’s an article I found in 30 seconds of searching (granted, I already knew Warfarin was hugely affected by CYP2C9) stating that equivalent maintenance doses of Warfarin can vary by a factor of >10 (from 0.5mg to 7mg) based on the CYP2C9 phenotype of the patient.


Don't know about 10x, but having the MTHFR C677T genotype (677TT) reduces folate activation [1] by 70-80%, which is a problem with drugs that require folate supplementation, like methotrexate.

[1] https://www.snpedia.com/index.php/Rs1801133


What drug has a 10-fold difference in Cmax? As a pharmacist, I can’t think of anything even close to that.


Many tricyclic antidepressants metabolized by CYP2D6 would qualify, if you put the poor metabolizers up against the ultrarapid metabolizers - https://en.wikipedia.org/wiki/Doxepin for example.


What about codeine with ultra-rapid metabolizers (e.g. the significant population in Saudi Arabia)? It might not be a 10x difference, but normal dosages can result in overdose or death in ultra-rapid metabolizers, or even in their babies if they are nursing mothers.


Warfarin is one of the most well known scientifically, but as a practitioner you may not know this because hospitals and insurers are scared to acknowledge the ongoing rate of preventable bad outcomes caused by not accounting for variances in metabolism.

That is to say, if we were to actually start assessing patients’ ability to metabolize warfarin we’d have to confront the fact that a huge number of bad clinical outcomes in the recent past stemmed from not doing this.


> hospitals and insurers are scared to acknowledge the ongoing rate of preventable bad outcomes caused by not accounting for variances in metabolism.

Healthcare systems are so aware of and open about warfarin's risks that they have warfarin clinics set up to repeatedly measure patients' prothrombin times so that they can adjust dosing empirically. Further, they also provide nutritional guidance to patients taking warfarin to help them reduce diet-influenced variance in warfarin metabolism.


Cyclosporine and tacrolimus: if you look at the range in any of their PK trials, you'll easily see 10-fold variability.


You're a bit behind on your CE time. There are lots with that high a variability.


As someone who has worked in pharma, this article is pretty damn ignorant. There’s a ton of funding for pharmacokinetics work. For the liver and the gut, it is widely agreed that 2-D tissue culture is completely non-predictive, so the last couple of decades have been spent building 3-D tissue culture, and nowadays companies commercially purchase organ-on-a-chip systems for both, and even multi-organ systems[0]. Plenty of companies have internal teams that have built similar technology, and they all have teams working hard to build computational models on the generated data. But small changes in molecules have huge ADMET impacts, and these organs are just a subset of the organs that can affect pharmacokinetics. Plus, toxicology can tank a drug through impacts on nearly any organ, so whole-animal models always still win.

The entire article just screams ignorance of what drug companies work on and what they’re investing in.

[0] https://cn-bio.com/models/


Author here. As someone who works in pharma, organ on a chip models are not even close to being relevant to providing accurate PK curves for oral absorption. The state of the art is trying to make it work for cutaneous absorption and that's still not great/commercially useful.

My whole point is that there needs to be way more open data on pharmacokinetics. That's the only way we're going to solve this.


There's a company called VeriSIM life trying to make the physiological models mentioned in the post more accessible [1]. They apparently fit their models across a bunch of publicly available and proprietary data. I found some peer-reviewed publications (e.g. [2]), but I am not sure how widely they are used.

[1] https://www.verisimlife.com/our-platform [2] https://www.tandfonline.com/doi/full/10.2147/DDDT.S253064?ro...


Simulations Plus has been building models on this data since 1996 and is one of the more popular vendors of prebuilt software for this modeling. There are literally dozens of vendors with software, though, because companies have been working on this since the ’80s, despite the article’s claim that this is an ignored problem.

https://en.m.wikipedia.org/wiki/Simulations_Plus


The problem is even more complicated than the author presents. Much of the chemistry that happens within the body happens within cells. Serum levels are a poor proxy for absorption into organ, tissue, and ultimately cell types. In fact, the problem is even more complex: the spatial organization of cells within a tissue is heterogeneous; thus we can’t even reliably predict how well a drug will reach a particular cell type, since that depends on the specific cell location. This is one reason cancer is so difficult to treat: cancer cells often barricade themselves behind other cells and the tissue stroma, making it difficult to get a drug to a high enough concentration to kill the cancer cell.

Biology is fiendishly complex and empirical. One regularly sees Silicon Valley types come along and propose to ‘disrupt’ the field using the latest fad from the field of computer science, only to slink away years later having been humbled by biology. The current fad is AI, which despite having added some extraordinary tools to the biologist’s tool chest, will also not live up to its hype.


I like the author's optimism, and if their overall point is that there should be more open sharing of data in pharmacokinetics research, that is a valid and worthwhile topic of discussion - one worth advocating for, given the results it could yield for future research and the progress of science.

That said, their view that it is simply a matter of collaboration and coordination is entirely wrong. Sharing of data and collaboration would absolutely be worthwhile (though it runs opposite to the incentives of profit-driven drug development), but it's like saying we could start building a Dyson sphere tomorrow and solve the world's energy problems if we just pooled our talent and resources. In contrast to what the author claims, we need HUGE advances in technology and in our understanding of the human body, pharmaceutical sciences, drug development, etc. before this is possible. To use their example of GLP-1 agonists: prior to their development and widespread usage, the psychological effects of these drugs - both positive and negative, clinically - were completely unknown. But what if those effects had been much more dangerous? Many SSRIs have a black box warning, which is mostly applicable to specific age groups; the negative side effects we see in teenage patients are much less common in other age groups. These kinds of effects are why medicine moves very slowly and experimental work is costly: ultimately we are talking about people's lives, not a machine that is easily replaced if we break it during the testing phase.

Millions of animals would be the first to rejoice and praise a model that didn't require in vivo testing, but we may sadly never see that day. I'm skeptical that even the development of an ASI would be enough to get us there.

I did find that the author has a knack for explaining difficult concepts with simple and illustrative metaphors. As a clinician and scientist in the pharma research space, this is one of the few articles I would send to a friend who finds the topic interesting but lacks the background knowledge to understand most of the literature on drug research.


Author here. Thanks! I'm pretty sure most of the negative effects of SSRIs were discovered in phase 2 trials, though. PK trials aren't usually powered to find adverse effects in subpopulations like teenage patients. Same thing with GLP-1 agonists and their effects on addictions (which I think were actually discovered post-approval).

That being said, we'd still have to do dose escalation studies for toxicology even if we had great PBPK models. Not denying that.


The problem is actually much worse for topical drugs, where there is often very little information as to the how, where, when, and how much of drug permeation through the skin. It’s a huge gap in our understanding not just for brand name drugs but for generics as well.


Currently, this problem is already mostly addressed by animal trials. Granted, the pharmacokinetics of a given drug in mice, or even monkeys, won't be identical to that in humans, but it's more predictive than using isolated tissue or cell preparations.

>It would just require the raw data from a variety of pharmacokinetic trials...

By and large, such trials are likely poorly relevant to the pharmacokinetics of your drug candidate, unless it's very similar to an existing drug. And even then, you will still have to test your actual drug in actual people during phase 0 trials.
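Even the animal-to-human step is crude. The standard first-in-human dose conversion is body-surface-area scaling with tabulated Km factors (per the FDA guidance on starting doses); it rescales mg/kg doses and says nothing about the shape of the PK curve. A sketch using the commonly tabulated values (verify against the current guidance before relying on them):

```python
# Commonly tabulated Km factors (body weight / body surface area) by species.
KM = {"mouse": 3, "rat": 6, "monkey": 12, "human": 37}

def human_equivalent_dose(animal_dose_mg_per_kg, species):
    """Convert an animal dose (mg/kg) to a human-equivalent dose (mg/kg)
    via body-surface-area scaling: HED = dose * Km_animal / Km_human."""
    return animal_dose_mg_per_kg * KM[species] / KM["human"]

# e.g. a 50 mg/kg mouse dose scales to roughly 4 mg/kg in humans,
# before any safety factor is applied.
hed = human_equivalent_dose(50, "mouse")
```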


Maybe there could be a publicly funded virtual liver project.


There are some pretty good efforts at human organ/body simulation. I remember reading about a research lab in Italy that is putting a lot of effort into this area.

The problem is, biologists still don't have a complete understanding of how the organs work... how can we hope to simulate them?



