Paul McKenzie: Bringing Order to Complexity at J&J

    Editor in Chief Agnes Shanley recently spoke at length with Paul McKenzie, global head of pharmaceutical development and manufacturing sciences for J&J Pharmaceutical Research and Development. McKenzie shared his goals for J&J’s pharmaceutical R&D, specifically its “Lab to Patient” program. He also discussed how pharmaceutical data visibility and access might be improved throughout the value chain and across functions.

    Much of this discussion is available on PharmaManufacturing.com (“Connecting Data to the Patient”), and will be published in Pharmaceutical Manufacturing’s September issue. (And for some historical perspective, here’s a 2007 interview, when McKenzie was at BMS.) Below is what McKenzie had to say regarding Quality by Design at J&J and related burning issues.

    A.S.: Do you have any specific milestones in mind for the Lab-to-Patient program?

    P.M.: The goals cover large-molecule biologics, small-molecule chemistry and formulation, and devices, and institutionalizing an end-to-end platform from discovery on out to commercial, to ensure that we’re developing and transferring processes in a way that lets our manufacturing colleagues readily adopt them.

    Our near-term milestones are really to institutionalize this concept of platform and to build it into our execution model through the tools that a scientist would use. We’re building out a system-independent recipe model, utilizing the industry standards S-88 and S-95, that can be layered across any type of technology tool you choose to implement: an electronic lab notebook (ELN) or MES, in both the process and analytical arenas.

    Without a system-independent recipe structure, you can end up customizing per product or per technology, which doesn’t give you the flexibility to evolve quickly as technology products evolve. So we’re working very hard to generate this recipe structure and to ensure that we can apply it quickly across a multitude of possible technology offerings in any of those spaces, MES or ELN. This approach will help us change more rapidly and also speak a common vocabulary across development into manufacturing.
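    To make the idea concrete, here is a minimal sketch of what a system-independent, S88-style recipe structure might look like, expressed as plain Python data classes. The class and field names are illustrative assumptions, not J&J’s actual model; the point is that the procedural hierarchy (procedure, unit procedure, operation, phase) carries parameters rather than any vendor-specific execution logic.

        from dataclasses import dataclass, field

        @dataclass
        class Parameter:
            name: str    # e.g. "agitation_rate"
            value: float
            unit: str    # e.g. "rpm"

        @dataclass
        class Phase:
            # Smallest executable step, e.g. "charge_buffer"
            name: str
            parameters: list[Parameter] = field(default_factory=list)

        @dataclass
        class Operation:
            name: str    # e.g. "cell_culture_expansion"
            phases: list[Phase] = field(default_factory=list)

        @dataclass
        class UnitProcedure:
            name: str    # e.g. "seed_bioreactor"
            operations: list[Operation] = field(default_factory=list)

        @dataclass
        class Recipe:
            name: str
            unit_procedures: list[UnitProcedure] = field(default_factory=list)

    The same Recipe object could then be rendered into an ELN template or an MES master recipe, which is what keeps the scientific content independent of whichever tool executes it.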

    The other thing we’re focusing a significant amount of time on is understanding and developing device platforms. Devices have always been thought about at the later stages, more at the lifecycle-management stage, rather than being incorporated early on in your target product profile, and even very proactively in your clinical trials.

    And to do that takes a separate mindset; it takes developing standards that can be utilized in devices across multiple therapeutic areas versus always custom-tailoring a device for a given indication. So we’re working very hard with our commercial colleagues to create a library of potential devices that can be leveraged uniformly across multiple therapeutic areas.

    A.S.: Can you give an example?

    P.M.: One device that we’ve had some recent success with is our auto-injector for Simponi. It was developed internally. That auto-injector, we feel, gives the patient, typically a rheumatoid arthritis patient, more convenience. Now we’re asking whether we can use this platform for other therapeutic areas. We’ve invested in developing it, we’ve invested in making it commercially available, and it’s getting good feedback from the commercial market, so how do we take the auto-injector beyond Simponi?

    A.S.: In addition to the vocabulary of S-88, are there any other standard building blocks that are required for data transparency between R&D, manufacturing and the business side?

    P.M.: One area that we still haven’t fleshed out completely concerns analytical methodologies, and standards for the models you apply to the raw data. For instance, with FTIR and NIR, there is a lot of variation in the way vendors approach these models.

    If you think about Foundation Fieldbus and HART, there isn’t an equivalent to those across all NIR or FTIR instruments. As we work with the analytical vendors, I think they’ll soon see that there is real power in driving that toward a standard.

    I always look at the Foundation Fieldbus work that was done years ago as an example. It was a tremendous achievement for the instrument world to come out with a standard like that. All the companies adopted it at some point, and it has provided real benefit. People have been able to bring instruments on and put them to beneficial use very quickly. The analytical space, in addition to the S-88 expression, needs a continual push toward developing standards in those areas.
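    As a hypothetical sketch of what such a standard could enable, consider a vendor-neutral instrument interface. No such common API exists today for NIR/FTIR instruments, and the names below are purely illustrative; the idea is simply that downstream chemometric models would consume one raw-data schema regardless of instrument brand.

        from abc import ABC, abstractmethod

        class Spectrometer(ABC):
            """Hypothetical vendor-neutral interface for NIR/FTIR instruments."""

            @abstractmethod
            def acquire_spectrum(self) -> dict[float, float]:
                """Return a raw spectrum as {wavenumber (cm^-1): absorbance}."""

            @abstractmethod
            def instrument_metadata(self) -> dict[str, str]:
                """Return calibration and model metadata in a standard schema."""

        class VendorXNIR(Spectrometer):
            """A vendor's implementation; stub data stands in for hardware I/O."""

            def acquire_spectrum(self) -> dict[float, float]:
                return {4000.0: 0.12, 4002.0: 0.13}

            def instrument_metadata(self) -> dict[str, str]:
                return {"vendor": "VendorX", "technique": "NIR"}

    With this kind of contract in place, swapping one vendor’s spectrometer for another’s would not require rewriting the models layered on top of the raw data.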

    A.S.: Do you have to standardize on specific IT platforms or is it possible to use different systems from different vendors?

    P.M.: Within the past year or two, I’ve seen that, if you approach the problem correctly, you can use multiple tools, instead of just picking one. That said, I still think that having strategic partnerships where you develop a solution in collaboration with a partner will save overall lifecycle time, because they can adopt it, they can make it part of their product, and then your lifecycle ownership costs, over time, go down.

    But we’re getting to a point where, once you define what you need, you can use a variety of products, but the vendors involved must be willing to use open architecture in their tool development. It now seems to me that many vendors are narrowing down their own tool selection, and that tool selection is pretty consistent. They add their spin to how they integrate those tools.

    A.S.: What roles are PAT, onboard diagnostics, simulation, modeling and high throughput playing in lab-to-patient and can you give any specific examples of how you’re applying them?

    P.M.: At Centocor, on the biologics side, we’re at the “early adopter” stage for PAT and high throughput. And I think it’s fair to say that the biotech industry, in general, relative to the small molecule industry, has significant opportunity to utilize high throughput and PAT more in process development.

    For instance, today, the stability of a biologic material is assessed in a very static way. We take the biologic, put it in its final package, and wait for the clock.

    So one thing we’re doing is creating a high-throughput stability workflow, using the appropriate analytical methodology to diagnose a large molecule. This requires a significant number of analytical instruments to probe all the regions of the protein, but the goal is to do that proactively, so that we can introduce, in a high-throughput environment, things that may be more variable in a manufacturing environment: for instance, levels of silicone oil, sealants from pump seals and tubing, and any metal or product-contact interactions.

    The question that we want to answer is: How can one, more proactively, in a high throughput automated flow, recreate situations that your protein may be exposed to over the course of the lifecycle of manufacturing process optimization? How can you develop a database of what you’re going to experience, and be more predictive?
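    A minimal sketch of how such a screen might be enumerated, assuming hypothetical stress factors and assay names (the factor levels below are invented for illustration, not J&J’s actual workflow):

        from itertools import product

        # Candidate manufacturing-environment exposures, crossed into a
        # full-factorial condition set for the automated stability screen.
        stress_factors = {
            "silicone_oil_ppm": [0, 10, 50],
            "tubing_leachable": ["none", "low", "high"],
            "metal_contact": ["none", "stainless_316L"],
        }

        # Analytical assays used to probe different regions of the protein.
        assays = ["SEC-HPLC", "DLS", "CD", "intrinsic_fluorescence"]

        conditions = [dict(zip(stress_factors, levels))
                      for levels in product(*stress_factors.values())]

        # Each (condition, assay) pair becomes one automated run, building
        # the predictive database of what the protein may "see".
        worklist = [(cond, assay) for cond in conditions for assay in assays]
        print(f"{len(conditions)} conditions x {len(assays)} assays = {len(worklist)} runs")

    Even this toy matrix yields 72 runs, which is why the workflow has to be automated and high throughput rather than handled as one-off stability studies.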

    For example, consider protein aggregation, a major issue today. How can we do a better job understanding what fundamentally drives protein aggregation, from both a scientific and an empirical viewpoint? What do proteins “see” during processing and during stability storage conditions? This is a good example of how we’re introducing high throughput into the biotech world, where, historically, it has not been used for process development.

    A.S.: Do you develop the software in-house?

    P.M.: For the areas that I described, like protein aggregation, that is something that we need to do in-house; it’s not readily available. For the general bioprocess portion (cell culture dynamics, oxygen uptake), there are some models available. Those models, in general, tend to model bioreactor performance, not necessarily the output from the cell line, from a product viewpoint.

    A.S.: You’ve been doing a lot of work with electronic lab notebooks, and you’ve noted the potential to handle lab IT and automation much as it’s done on the plant floor, with historians and the batch engine. How has this been working out, and how are you making that connection between the notebooks and the MES?

    P.M.: Electronic lab notebooks will be key to our success, because we’re working with a diverse global footprint internally and a diverse global footprint externally. Having a good electronic lab notebook system that connects various parts of the world is important, so that you can really get 24 hours out of your day but, most importantly, pull all your data together seamlessly.

    I may run an experiment today on the East Coast of the United States, and the analytical work for that may be done in my J&J Pharma R&D group in Mumbai. An electronic lab notebook allows scientists to see that connectivity. So it’s very important for us to get that electronic lab notebook rollout completed and available, using the S-88 recipe backbone structure, so that we can get that integration of process and analytical across all of our worlds.

    In addition, we’re trying to do that with select external partners where, if we’re doing a significant amount of effort at those external partners, we would like them to be on that same format and approach so we can have real connectivity of the data.

    With that said, we are working very hard to make sure the ELN framework and the MES framework are very similar, using the system-independent recipe model, so that we can quickly transfer from an ELN to an MES and can decide the best place to put those tools. So, for example, how far back can you push an MES, and how far forward can you push an ELN?

    Today, there is a lot more flexibility to decide that less rigidly than we have before. You may take an ELN further into GMP manufacturing because the group executing the process is most familiar with the ELN, or, in groups where the earlier teams are familiar with MES, you may drive the MES back into development.

    The system-independent recipe structure will allow us to use these tools more flexibly, and that poses a real opportunity. So, for instance, at Centocor, we have a bioreactor lab that is running 3-liter and 30-liter bioreactors. Historically, as a process engineer, I would probably have said, “Oh, it probably makes the most sense to put an MES in there, because those processes are most similar to the 20,000-L bioreactors that will eventually run in the plant.”

    That’s easy for me because I come from a Plant Operations background, but for a biologist who hasn’t spent much time in the plant it’s probably easier to adopt an ELN in that space.

    But if we create the content of the tools to be system independent, we can decide more easily.
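    A minimal, self-contained sketch of that “same recipe, different target” idea, with invented field names and output formats rather than any vendor’s actual schema: one system-independent recipe rendered either as ELN experiment steps or as MES phase records.

        # One system-independent recipe definition (illustrative content).
        recipe = {
            "name": "mab_seed_expansion",
            "phases": [
                {"name": "inoculate", "params": {"target_vcd_e6_per_ml": 0.3}},
                {"name": "culture", "params": {"temp_c": 36.5, "days": 4}},
            ],
        }

        def to_eln_template(recipe: dict) -> str:
            """Render as free-text experiment steps for a lab notebook."""
            lines = [f"Experiment: {recipe['name']}"]
            for i, ph in enumerate(recipe["phases"], 1):
                params = ", ".join(f"{k}={v}" for k, v in ph["params"].items())
                lines.append(f"Step {i}: {ph['name']} ({params})")
            return "\n".join(lines)

        def to_mes_recipe(recipe: dict) -> list[dict]:
            """Render as ordered phase records for a batch-execution engine."""
            return [{"seq": i, "phase": ph["name"], **ph["params"]}
                    for i, ph in enumerate(recipe["phases"], 1)]

        print(to_eln_template(recipe))
        print(to_mes_recipe(recipe))

    Because both renderings derive from one definition, the decision of where to run a process, in an ELN-driven lab or an MES-driven suite, becomes a deployment choice rather than a rewrite.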

    A.S.: What are your thoughts about the pharmaceutical Quality by Design movement? Several people in the industry have described a disconnect between the development side and manufacturing, and a lack of integration of manufacturing knowledge. What do you see as being needed for true QbD to exist?

    P.M.: At J&J’s Pharmaceutical R&D, our manufacturing and development groups agree that we need to make progress on defining platforms and technology to get the same base vocabulary across the two areas. That’s really our short-term goal, for 2010-2011.

    As we do that, we can then start driving, internally, the business case for QbD, to examine how we can take advantage of the design space of data we have collected, which we can now compare from Point A to Point F or G in a way we couldn’t before. Then the question becomes: How do we convince ourselves and the regulatory agencies we work with that we have a firm understanding of our complete design space? That then allows us to consider different approaches to product portability and process and/or raw material changes.

    This foundation of a platform within each technology area is the first step that needs to be taken before we can do that. Once we have that, and we have that continuum of information, we need to marry that space with the clinical space, so we can really dial in, particularly for biologics, on the impact that process changes will have on safety or efficacy.

    If we can build that integrated process, analytical and clinical design space during the development of a product, it will make it much easier for us to adopt and make a strong business case for QbD.

    A.S.: At Centocor, are you using QbD as a framework for all your development projects?

    P.M.: At this point, we’re ensuring we have the pillars to make QbD successful, but we think we need to invest in those pillars first, before we prescribe doing QbD across our whole portfolio.

    So it’s really about building QbD from the ground up.

    A.S.: What do you think is needed from the regulators regarding QbD? We hear reports of people not being on the same wavelength within FDA, in terms of reviewing. Is there any message they should be sending?

    P.M.: FDA has done a very good job of trying to reach out to industry. I’ve been at several forums where barriers are openly discussed and there’s an attitude of “How can we help?” We need to continue to discuss, openly, both successful and challenging approaches to QbD cross-industry.

    Which companies have had success, why have they had success? And which companies have struggled, and why?

    What were the pillars they had pre-invested in to be successful?

    The more we have those conversations and institutionalize them, the more helpful it will be. We all need to understand how people have taken advantage of QbD, and FDA is trying to do a very good job in creating forums for those conversations, so that people can really compare notes.

    A.S.: You’re both a PhD engineer and a senior pharma manager. What we’ve been hearing is that, at many companies, there doesn’t seem to be much top-level support for experimenting with S-88, doing PAT, or even QbD. How should engineers and technical professionals go about getting senior management funding and buy-in for projects, and how should they frame the discussion to justify the costs?

    P.M.: A couple of things have been important to me, in my career. As an engineer, you know, coming out of school or being in the industry, I think it’s incredibly important to immerse yourself in multiple parts of the continuum, from discovery to manufacturing.

    I think you can speak more credibly if you understand your customer in drug discovery, or your customer in commercial, because you’ve spent time there and appreciate their challenges. So I think as you see more engineers working in drug discovery and development, the situation will change. Historically, most engineers have been in manufacturing.

    But as you get that right mix of scientific staff, systems engineering or chemical engineering staff working together across the continuum, they can provide that insight of the continuum to the different scientific disciplines across different areas.

    So, being an engineer in development who has had the good fortune to work in multiple areas (formulations, manufacturing, in both small and large molecules), one can bring those insights.

    When you have those insights, you can get those around you passionate about what you can, collectively, bring to upper management.

    If you go in and try to give a talk about S-88 to upper management, they don’t need to know all the answers, but they do need to understand how you’re going to bring value by integrating different scientific disciplines to make a medicine for a patient.

    Sometimes, as engineers, we tend to run right to the tool and explain how we are doing it. We need to do a better job of explaining why we are doing it, get real strategic buy-in on the “why,” and then give management confidence that we know how to deliver the “how.”

    A.S.: And how do you translate that into the language of the accounting sheet?

    P.M.: It’s the idea of the iceberg: there are very tangible items above the waterline that you can share with them, such as quality improvements, zero-day batch release, and quality and operational metrics. But then there are a lot of intangibles beneath the waterline, say, your ability to bring on other technologies more quickly, or your ability to adopt process changes more quickly.

    You need to do a good job showing the ROI above the waterline, but then also, make them aware of the softer items, and show that by building these in, the organization will be more successful.

    One softer item is the ability to speak a common vocabulary between R&D and manufacturing. It’s hard to quantify results on Day 5, or on the first day you’re selling the product, but over time, when tech transfers get shorter and shorter and you can show, over a several-year window, that on average your tech transfers have shifted from ten months to eight months to six months, that really shows benefit, because you’re getting a product out to patients earlier.

    So some of it is selling the simple but incredibly important ability to deliver in reasonable timeframes at the highest quality levels; the rest are the inherent soft items that, with time, will show real dollar value.

    But you don’t want to oversell. I find, sometimes, people want to oversell. They’ll say, “Well, every day that we’re on the market means we’re one day closer to making x, y, z.” That is hyping things up a little too much. It’s really about showing how you can achieve operational excellence in your areas: if I do this well, then other intangibles will come along with it.

