For a life sciences industry whose Research & Development (R&D) processes are often held over from the 1990s—and whose returns on those investments are falling sharply—new digital technologies might feel as if they can’t arrive fast enough.
Almost everywhere you turn, there seems to be an opportunity, as outlined in Deloitte’s recent report, Digital R&D: Transforming the future of clinical development. Those opportunities include:
- Video-based telemedicine to streamline trial participation
- Paperless tracking of clinical supplies and patient adherence
- Wearables and sensors that can feed real-time patient metrics into powerful new analytical engines
- Machine-learning techniques that can improve how patients are matched to trials
As other industries—from finance to retail—reshape their operations around digitization, life sciences companies may find it tempting to jump in feet-first.
Tempting, but not necessarily the wisest course. In the life sciences space, it may be more effective in the long run to take a more measured, structural approach to the adoption of digital technology. My Deloitte colleague Rajeev Ronanki made a similar point in an article about artificial intelligence that he co-authored for the Harvard Business Review. He and MIT professor Thomas Davenport outlined four steps for bringing new technology on board:
- Understand the technology
- Create a portfolio of projects
- Launch pilots
- Scale up
In our work with life sciences companies, my colleagues and I have found that early adoption of technology in R&D is fragmented. Leaders, asked to fund exciting new investments, don’t always have the technical knowledge to understand how each shiny new toy relates to bottom-line strategy. Meanwhile, experienced clinical leaders, who are comfortable with established methods, might resist the disruption they fear awaits early adopters. This “sure, but not on my trial” attitude has even led to cases of redundant processes, with new digital tools operating in parallel to the “real” trial instead of improving it.
No comprehensive digital strategy is likely to emerge from that approach. What each life sciences company could benefit from is a repeatable process to assess improvements and assemble them into new capabilities, similar to what my colleagues outlined in the HBR article.
Understand the technology
The people on the lab floor may be excited about what new tools can do. But to move forward with purpose, the people who make investment and strategic decisions should be just as excited about why the tools do it and how the mission will benefit as a result. That means any path to adopting a new technology should include education and transparency. No one in the decision path should be asked to endorse a black box.
When everyone understands a new capability from all angles—technical, operational, strategic, financial—it can be easier to move past risk-aversion. It can also be easier to identify which resources, human and otherwise, it will take to support the new initiative. And it can be easier to define a standard of measurement that can provide all parties with a common view of the tool’s effectiveness.
The regulated nature of the life sciences industry and R&D might seem to be an obstacle to this alignment. Privacy, documentation, and reporting requirements are all built around old methods. Yet the financial industry operates under analogous constraints, and it is further ahead in technology adoption. Not every use of technology runs afoul of the rules; some uses even enhance the ability to comply with them. The process of understanding should include a willingness to stop looking for reasons to say “no.”
Create a portfolio of projects
This is where companies can introduce new tools on a modest scale to test their effectiveness. The first decision is which tools to test in which areas. The answer can hinge on how decision-makers frame the question: It is one thing to ask, “Here’s a new digital tool—what can it do?” It is quite another thing to ask, “We have this challenge in R&D—how can digital help solve it?”
The first option represents the “shiny toy” fallacy. Following the second path starts with assessing internal programs as they operate now. Are you putting up with deficiencies a new approach could remedy? Are there opportunities for improvement? Some digital implementations (e.g., case processing) have demonstrated cost reductions of around 90 percent. Instances like that can be attractive targets for pilot projects.
Launch pilots
As with any new internal venture, the creation of digital technology pilots likely needs alignment and support: alignment in the form of defined objectives and metrics, along with inclusion in a system of governance; support in the form of resources and room to maneuver. Once a company has decided to pilot a technology, it should “ring-fence” the funds and other resources the pilot needs to work. People with the authority to kill the initiative should commit to giving it a chance despite a temporary uptick in cost. Some pilots have the potential to return the investment a thousandfold or more.
Lining up support for technology pilots can extend outside the organization as well, specifically to regulators. With new tools come new procedures, reporting formats, or other changes regulators might eventually need to approve. Their approval may be easier to win if they understand from the beginning what you’re trying to achieve.
With the decisions made on where to try out new tools and approaches, the challenge often shifts to governance. (You’ve made a decision; now see it through.) Structure and organization can protect and promote each fledgling initiative while it takes root. Innovation can falter if it isn’t matched by diligent management.
While you’re being supportive of technology pilots, you should also be demanding of them. The pilot is localized, but it’s there to prove a broader proposition about the tool or technique in question. Don’t accept a limited scope of results. Make sure each pilot generates learning that can inform the entire organization and set the stage for future studies. That aspect should be built into the pilot program at the beginning.
Scale up
If the pilot implementations were selected to correspond to the company’s strategic mission, and carried out in a way that yielded useful learnings, the final step can be to apply that knowledge more broadly. It can be easy to invest in a new capability; it is harder, and more valuable, to select and execute pilots in a way that sets up a reliable return on that investment.
Taking a new capability beyond the pilot stage often hinges on a single decision—do we continue? But many other decisions are likely involved. Which parts of the experimental experience are worth carrying forward? Which should be left behind? Is a fresh round of buy-in needed, and from which stakeholders?
Scaling a technology also typically means scaling the operational support that surrounds it. Having set up funding, staffing, and infrastructure for the pilot project, you now must extend these resources to the scale of the enlarged application. A specific set of goals and metrics should likely make the leap in scale as well.
The four steps, from understanding the technology to scaling up, comprise many decisions and operations. For a life sciences organization trying to find a place for digital technology in R&D, a process like this can spell the difference between wandering and winning. Many would agree that digital is the future. But an investment in process, learning, and measurement can help make that future tangible and bring it here sooner.
P.S. If you’re at DIA this week, make sure you stop by our ConvergeHEALTH booth at #2130.