Top 10 mistakes made when implementing EDC

17th September 2008
Topics: EDC

(last update from admin @ eclinicalopinion)

Ok, I am calling for a challenge here.  I am making an attempt at identifying the top 10 mistakes that I believe are made when companies attempt to implement EDC.  No science to this. Just a bit of fun.

I will make edits if anyone posts comments that I believe outdo my own.

 

1. Pick the nastiest, most complex study to implement as a first study

Sponsors may be trying to test the EDC system and vendor to confirm claims of functionality, services and support.  It may also be an internal organizational ‘sell’ when bringing in a new system. In reality, the risk factors are at their highest, and the chances of failure greater, than at any subsequent time in an enterprise EDC rollout.  Instead of learning and improving with optimized processes and a well-designed workflow model, a ‘get it out the door quickly’ approach is forced and pain is suffered by all parties!

2. Expecting the return on EDC to be immediate – admin @ eclinicalopinion

Many clients are very experienced with paper and have wrung the very last drop of efficiency out of their process. They start with EDC believing that they are entering a new golden era only to be disappointed with the gains (or losses!) on their first study.
As with any new process or technology, it takes time to refine. The potential gains are real but it will take a few trials before a company hits its stride with EDC.

3. Overemphasis on faster closeout – admin @ eclinicalopinion

Companies new to EDC get excited about the faster closeout of EDC trials while ignoring the issue of longer start-up times. With paper you could print the CRFs and send them out before you had finalized (or even built) the database that would ultimately store the data.

4. Use all the functionality that was demonstrated

A common problem. When a salesperson demos the product, it looks cool. Almost every feature looks good and could add value…. Well, in reality, not always.  Many EDC systems today offer features as a ‘tick in the box’, but when a feature is used and combined with others, the value sometimes falls short.  For example, most systems offer some form of data flagging: Reviewed, SDV’d, Frozen, Locked and so on. Do not use all flags on all fields; that will be slower than paper.
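As a toy illustration of that last point (all numbers and flag names are invented for this sketch, not drawn from any particular EDC product), the manual workload scales multiplicatively with the number of flags enabled per field:

```python
# Hypothetical illustration: each workflow flag enabled on a field is one
# more manual action someone must perform during the study.
FIELDS_PER_STUDY = 2000  # invented figure for a mid-sized study
ALL_FLAGS = ["reviewed", "sdv", "frozen", "locked"]

def review_actions(n_fields: int, flags_enabled: list) -> int:
    """Total manual actions: one per enabled flag per field."""
    return n_fields * len(flags_enabled)

# Every flag on every field vs. SDV only where it adds value:
print(review_actions(FIELDS_PER_STUDY, ALL_FLAGS))   # → 8000
print(review_actions(FIELDS_PER_STUDY, ["sdv"]))     # → 2000
```

The arithmetic is trivial, but it is exactly why switching everything on "to be safe" can end up slower than paper.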

5. Resource the same way

If you have the same resourcing for Data Management and Monitoring AND you are also resourcing separately for building and testing EDC studies – then you have done something wrong.

With a good EDC product, the rules that would typically be applied manually are applied automatically. The remainder should be picked up by a smaller number of ‘eyes’.  Many CROs have played the ‘better safe than sorry’ card and charged the same for Monitoring and Data Management as on paper, on top of EDC license and deployment costs.  This demonstrates an inexperienced CRO.

6. Model the eCRF according to the paper CRF Layout

Trying to make an electronic CRF identical to the original paper CRF will result in tears.  Users will be frustrated with the workflow, the ‘e’ nature of the medium will not be utilized, and the study will be less effective.

Instead, consider appropriate workflow and dynamic eCRFs.  I will stress ‘appropriate’: overdoing the bells and whistles causes frustration, but with no bells and whistles, many of the advantages of EDC are lost.

7. eCRF Design by committee

The surest way to blown budgets and timelines is to attempt to develop an eCRF by committee.  The sponsor should delegate a chosen few (ideally one) to work with the EDC CRF designer. The study should be largely built first; only then should a wider review be carried out.

8. Wait until the end of the study to look at the Data

It is surprising how often this is still the case. EDC means cleaner data faster, but sponsors and their statistical departments are often geared towards working with final data-sets. Good EDC systems can deliver clean data on a continuous basis.  Whether the data has yet reached a statistically significant sample size is another question, but the information is often available for companies willing to leverage it.

9. Fail to use the built in communication tools provided

Many EDC systems offer means for the different parties involved in study execution to communicate.  These might take the form of Post-it-style notes, query messages or internal email. Often these facilities are either not used, or not used effectively.  The result is that the true status of a study is a combination of information in the EDC tool, tasks in Outlook, actions in emails, scribbled notes on a monitor’s pad, and so on.

10. Do lots of programming for a study

This covers many areas. It could be programming to handle complicated validation rules, or programming to adapt data-sets to meet requirements. If your EDC system requires lots of programming in order to define an EDC study, then I would suspect you have the wrong EDC system. Good EDC systems today are configured from metadata stored in tables. Old systems relied on heavy customization of the core code for each deployment, or on some form of programming to complete the study build. If you write code, you need to test it, and that testing is akin to software validation.  This takes time and money.

Most EDC tools can be extended through ‘programming’. If you need to do this, try to do the work outside of the critical path of a study.  Develop an interface, test and dry run it, and then, utilize it in a live study. In this way, you will have time to do it right with proper documentation, support and processes.
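To make the metadata-versus-code distinction concrete, here is a minimal sketch of what "configured from metadata stored in tables" means. All field names, check definitions and function names are invented for illustration; no real EDC product's schema is being described:

```python
# A study defined as data: the edit checks live in a table, not in
# per-study custom code. Adding a check means adding a row, not
# writing (and then validating) new software.
study_metadata = {
    "fields": {
        "SBP": {"type": "int", "label": "Systolic BP (mmHg)"},
        "DBP": {"type": "int", "label": "Diastolic BP (mmHg)"},
    },
    # Range checks expressed as rows: (field, min, max, query message)
    "range_checks": [
        ("SBP", 60, 250, "Systolic BP out of expected range"),
        ("DBP", 30, 150, "Diastolic BP out of expected range"),
    ],
}

def run_range_checks(record: dict, metadata: dict) -> list:
    """Apply every range check in the metadata table to one record,
    returning (field, message) pairs for any queries raised."""
    queries = []
    for field, lo, hi, message in metadata["range_checks"]:
        value = record.get(field)
        if value is not None and not (lo <= value <= hi):
            queries.append((field, message))
    return queries

print(run_range_checks({"SBP": 300, "DBP": 80}, study_metadata))
# → [('SBP', 'Systolic BP out of expected range')]
```

The generic check-runner is written and validated once; each new study only supplies new rows, which is why this style avoids the per-study testing burden described above.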

and relegated below the top 10…

11. Start by developing Library Standards from Day 1

This may sound like an odd one, but let me explain.  Implementing EDC effectively, even with a highly experienced vendor, takes time.  All parties are learning, and modern EDC systems take a while to adapt. Workflow, system settings and integrations all need to come up to speed and be optimized before standards can really be applied and add value. Start too early, and the library standards are just throwaways once the teams come up to speed. It is best to leverage the knowledge and skills of the CRO or vendor first.

12. Develop your Data Extracts after First Patient In

Sometimes tempting due to tight timelines, but if your EDC tool requires either programming or mapping of data to reach the target format, then the less consideration you give to the outputs when you design the study, the harder it can be to meet these requirements after FPI. This leads to higher costs and the potential for post-deployment changes if you discover something is missing.
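One way to give the outputs that consideration at build time is to write the extract mapping down declaratively alongside the study design. A minimal sketch, assuming invented source field and target column names (not any real standard's layout):

```python
# Hypothetical extract mapping, defined at study build time so the
# target dataset layout is agreed before first patient in.
EXTRACT_MAP = {
    # target column : source eCRF field (all names invented)
    "SUBJID": "subject_id",
    "VSDTC":  "visit_date",
    "SYSBP":  "SBP",
}

def extract_row(ecrf_record: dict) -> dict:
    """Map one eCRF record to the target dataset layout; missing
    source fields surface as None instead of failing silently."""
    return {target: ecrf_record.get(source)
            for target, source in EXTRACT_MAP.items()}

row = extract_row({"subject_id": "001-004",
                   "visit_date": "2008-09-17",
                   "SBP": 120})
print(row)
# → {'SUBJID': '001-004', 'VSDTC': '2008-09-17', 'SYSBP': 120}
```

Because the mapping is just a table, a gap (a target column with no source field) is visible during the review of the study build, not discovered after FPI.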

Conclusion

Many thanks to admin @ eclinicalopinion for the additional 2 mistakes made, now coming in at numbers 2 & 3!  More comments welcome…
