by Patrick Cox
September 9, 2015
This essay isn’t going to be what I had planned it to be. Our household is in upheaval right now because an older relative just slipped hard down the slope of cognitive decline. My wife has experienced a truly unpleasant role reversal, taking the car keys away from her father following a couple of scary events. Routes that he’s driven for decades now baffle him. He’s been lost several times and recently had an accident.
The rapid onset of his dementia has been startling. When I started writing about Alzheimer’s and dementia, my father-in-law was fine. Now his ability to care for himself is slipping away fast.
For several years, I’ve been predicting that US policy makers would be forced to recognize the need for serious reform of the drug and device approval process. The reasons are pretty self-evident, but I’ll lay them out in case you missed them.
Healthcare is the biggest part of government spending (just as it is for many families). This isn’t really surprising, as healthcare is the largest industrial sector in America and most of the world, bigger than energy and military spending combined. In a real sense, the central activity of the human species is fighting illness and death, which is why healthcare remains remarkably countercyclical even during downturns.
Right now, however, things are changing. As I’ve so often said, birthrates are plummeting and life spans are skyrocketing, increasing the average age of populations worldwide.
The undiscussed consequence of this change is that healthcare spending increases exponentially along with the average age of the population. For example, healthcare costs about 10 times as much for the average 80-year-old as for the average 10-year-old.
Rather than facing this fact, the intellectual classes have blamed drug companies, insurance companies, and doctors for the increases in healthcare costs. Sometimes I think that anything the majority of this group believes must be, by definition, hogwash.
The Affordable Care Act was supposed to fix the problem once and for all, but it has done nothing to reduce costs, which continue to climb. No system of transfer payments can balance a shrinking population of healthy payers against a growing population of less healthy recipients.
For this reason, we are approaching a turning point at which it becomes obvious to a critical mass of policy makers that the only solution to our budgetary crisis is better and more cost-effective healthcare. The last year has, in fact, been a watershed as more and more healthcare experts have pointed out the need to shift from a war on disease to a war on the process of accelerated aging, the underlying cause of most diseases.
Now we’re seeing the first real sign that the alarm is being heard in political circles. Newt Gingrich has written recently and positively about a bipartisan attempt to accelerate biotechnological progress called the 21st Century Cures Act. This legislation would increase the budget of the NIH and hopefully reduce the regulatory burden on biotech startups.
Gingrich was one of the few policy makers who understood the importance of the Web before it arrived, so I take his opinion seriously. I think, however, that the 21st Century Cures Act is more important as a harbinger of change than as the change we need.
I don’t have a lot of faith that a bigger NIH budget will be a game changer for biotechnology, though I wouldn’t oppose it. In the pharmaceutical industry, the NIH is simply not seen as a place where the best and brightest are gathered to solve big problems. Rather, it is a place where scientists begin their careers before gaining the experience and knowledge necessary for more important work.
Moreover, it’s pretty clear that the NIH has not yet focused sufficient resources on solving the root cause of Alzheimer’s and the other major diseases: accelerated aging. This is a real shame, because so much progress has already been made in this area.
Truly astonishing discoveries have been made about aging in the last 10 years. However, prior research is often overlooked by agencies whose goal is to find something new. If these discoveries aren’t overlooked, then they are probably in development and being hindered by US regulations that focus on “cures” rather than the amelioration of the aging processes that cause diseases.
So we have an NIH that wants to fund new research and a hegemonic FDA that delays the application of those discoveries. Stuck in the middle are discoveries that could actually solve real medical and fiscal problems, if only someone with more clout than start-up biotechs (the low men on the totem pole) were pushing them forward.
Among the most important of these prior breakthroughs is the discovery that sigma-1 receptor activation triggers autophagy and extends life span. This article from 2008 provides an interesting third-party overview of autophagy for the layman, a process we now know is activated via the sigma-1 receptor and closely tied to aging.
Chronic autoimmune inflammation, driven by the NF-κB transcription factor, is another major cause of accelerated aging and telomere loss. Anatabine citrate is an enormously promising compound that reduces age-related autoimmune inflammation. Its effectiveness, reported in the recently published Alzheimer’s mouse data, is not a surprise. Those results are just one of many indicators, along with massive anecdotal evidence from the hundreds of thousands of people who used the product when it was sold over the counter, that this molecule is a major anti-aging breakthrough. I have no idea why the FDA is not fast-tracking it.
The fact that I’m currently dealing with a family member suffering from serious cognitive decline only increases my frustration with the American regulatory morass. The 21st Century Cures Act should be seen not as the solution to that morass, but as an indicator that our government is finally coming around.
Currently, it takes an incredible 15 years, on average, for a drug to be approved. Halving that time would add no unnecessary risk, but it would yield enormous health benefits and increase the return on investment for those who fund biotechs.
My personal preference for reform is to follow Japan’s lead in eliminating phase 2 and 3 trials for stem-cell therapies. Former FDA Commissioner Andrew von Eschenbach, among others, has endorsed this approach, which would increase the value of biotech investments by doing away with the need for a big-pharma partnership in many cases.
By cutting regulatory delays, this would more than double return on investment (ROI) for biotech, which would go a long way toward guaranteeing that the money needed for crucial research is forthcoming. It would also provoke an avalanche of new private research funding without adding to the taxpayer burden of a larger NIH budget.
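To see why a shorter timeline has such an outsized effect on returns, here is a minimal back-of-the-envelope sketch. The payoff figure and the 20% hurdle rate are purely hypothetical assumptions of mine, not numbers from this essay; the point is only that discounting compounds over the entire waiting period.

```python
# Rough illustration with hypothetical numbers: how shortening the approval
# timeline changes the present value of a drug's payoff at approval.

def present_value(payoff, hurdle_rate, years_to_approval):
    """Discount a single payoff received at approval back to today."""
    return payoff / (1 + hurdle_rate) ** years_to_approval

PAYOFF = 1_000_000_000   # hypothetical payoff at approval ($1 billion)
HURDLE = 0.20            # hypothetical 20% annual hurdle rate for biotech investors

pv_15yr = present_value(PAYOFF, HURDLE, 15)    # roughly $65 million today
pv_7_5yr = present_value(PAYOFF, HURDLE, 7.5)  # roughly $255 million today

print(f"PV with a 15-year approval:  ${pv_15yr:,.0f}")
print(f"PV with a 7.5-year approval: ${pv_7_5yr:,.0f}")
print(f"Multiple from halving the timeline: {pv_7_5yr / pv_15yr:.1f}x")
# At a 20% hurdle rate the multiple is about 3.9x; even at a 10% rate it is
# about 2x, which is why halving the timeline can more than double effective ROI.
```

Under these assumed numbers, halving the wait nearly quadruples the present value of the eventual payoff; even at a modest 10% discount rate it roughly doubles it.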
I was going to spend some time today catching up on oxaloacetate, another powerful anti-aging therapeutic with the potential to delay or reverse Alzheimer’s. Japanese and Canadian regulators, by the way, have already acknowledged the therapeutic potential of this compound.
That write-up will have to wait, though. My next few days are going to be spent helping my wife deal with her father’s disease—a disease that could have been significantly ameliorated if the authorities were actually looking for solutions… instead of lording over the process of drug development.
Fortunately, I have no doubt that the US will adopt the more enlightened regulatory attitudes found elsewhere, and probably sooner than most people expect. This will not only help solve numerous financial and healthcare problems, but it will also increase the value of the right biotech investments.
— This article originally appeared at Transformational Technology Alert.