Continuing our Precision Medicine in Neuroscience discussion, Jason Kralic and Nik Tezapsidis discuss how biomarkers and partnerships will drive precision medicine in the Neuroscience industry.
- Jason Kralic, Operating Executive, Kineticos
- Nik Tezapsidis, President and CEO, Neurotez
Kineticos: The lack of biomarkers and precision medicine tools in the neurosciences space isn’t significantly impacting your portfolio. You are leveraging what you know about the metabolic and genetic characteristics of your patient population to support your drug development. You are forging ahead, not relying upon the beta-amyloid hypothesis per se.
NT: Incidentally, Abeta as a biomarker is something we could utilize if we were to do Amyvid PET brain scans as an adjunct to other diagnostic evaluations to confirm Alzheimer’s disease, based on the observation that amyloid pathology is initiated decades prior to clinical expression. Alternatively, we can use FDG PET scans. Deficiency in glucose utilization is another form of pathology that develops decades prior to clinical dementia. Even though the technique is a couple of decades old, it is one that could still be utilized.
Kineticos: Taking a step back from your direct observations from your programs at Neurotez, talk to me about your thinking on resource allocation towards precision medicine development as an industry.
NT: It is fundamental. Again, if we use Alzheimer’s as an example, in clinical trials you have to treat for a long period of time in order to record an impact on the desired endpoint – cognition. The earlier you start in the continuum of the disease, the longer those trials have to be, because the decline in the early phase of the disease is much slower than in the later phase. It is fundamental to have biomarkers that reflect the impact you are having on the underlying pathology.
Again, given the huge amount of data from multiple studies, we could argue that leptin itself can be a surrogate biomarker. Without one, particularly if the target is going to be early-phase patients, you would need very long clinical trials to demonstrate an impact on cognition.
Perhaps demonstrating that leptin levels can be safely increased from, say, the lowest quartile to the highest quartile would be sufficient to allow surrogate biomarker endpoint-based approval of this treatment, to be followed by phase 4 trials: a relatively short phase 3 with a larger population, showing that you can safely increase leptin levels, followed by conditional approval, followed by phase 4 post-marketing studies. This could be a fantastic way to demonstrate that biomarkers can be surrogate markers that are true reflectors of the disease process, and a means to a faster track towards approval and marketing of drugs.
Kineticos: More resources are always needed to do more, faster. Individual companies, even larger ones, need to focus their resources on supporting the development of their own targets and programs. We will undoubtedly rely on public-private partnerships to support the development of these technologies, and sharing that data is very important, to your point, Nik.
We have the Alzheimer Precision Medicine Initiative, and surely similar initiatives exist for other neurologic and psychiatric diseases. Success will require participation in the discovery and harmonization of precision medicine tools and approaches by private and public institutions, including health care systems and payers. A good example is the controversy over reimbursement of the PET tracers for beta-amyloid and FDG PET, which remains more of a research tool. Don’t developers of precision medicine tools need some degree of confidence in reimbursement if those tools are to be used to support patient care?
NT: One thing is for individual groups or private companies to be encouraged to do biomarker studies; another is to share them. A wonderful example of how this can work well is ADNI, the Alzheimer’s Disease Neuroimaging Initiative, a consortium of pharma and academic groups. It would be unproductive to replicate studies unnecessarily. The more you can consolidate data from a number of trials, the bigger the cloud of data you create, and the more efficiently and meaningfully that data can be analyzed. It is all about collecting the data and sharing it, particularly on the biomarker front. Each individual company can still file patents on proprietary data, but biomarkers are something that can be shared by all developers. That will substantially facilitate the discovery of new medicines and be more efficient, transformative, and curative for the disease.
Kineticos: When we finally see a drug approved for Alzheimer’s disease, do you think it will be accompanied by a companion diagnostic or not? If it is your product, for instance, do you envision a companion diagnostic?
NT: In our case, undeniably yes. The diagnostic will be used to identify the right population, and the protein will be used to monitor treatment, because we do not want to raise levels above the upper quartile, where there may be side effects in the nature of leptin resistance. It could also support a surrogate endpoint approval for the drug. In our case, the treatment will be accompanied by a test for the protein.
Kineticos: Do you see the same holding true for other neurologic and psychiatric diseases?
NT: Systems biology is very important in the identification of specific biomarkers that can assist in identifying the right intervention to bring a patient’s profile back towards wellness. That is the overall thesis: how you maintain wellness and prevent the disease or, in the worst-case scenario, try to reverse the disease state towards wellness. In a perfect world, systems biology would be able to identify those at risk and monitor their phenotypic profile to prevent them from getting the disease, and that is heavily dependent on diagnostics and biomarkers. You have to have that readout. Whole genome sequencing will provide your genetic makeup, along with all the information we have collected, and will be collecting in the future, about genetic predispositions for certain diseases. Together, the genetic and phenotypic readouts will provide the answers. In essence, that is what Neurotez is doing: we are using the genetic readout and the phenotypic readout to select the right population and restore the levels of an agent known to have pro-cognitive properties. In many ways it is conceptually very simple.
Kineticos: It is, and we need to keep chipping away at it, keeping that logic in mind and hopefully choosing the right targets based on an understanding of the disease. Look at what we can apply today to understanding the disease versus what we had a couple of decades ago. There’s a lot of potential. It is an exciting time.