After working on over 550 migration projects with our clients, we’ve noticed one huge unaddressed issue that many customers face: archiving GxP data and content.
Does this sound familiar? During a migration project, we’re frequently asked, “What should I do with the legacy data and content I’m not migrating?” In nearly every situation our answer is the same: archive it, especially since many of our clients are in heavily regulated industries and need to retain data for a set period for record-keeping and compliance purposes.
The inevitable follow-up question is, “Where should I store it?” There really hasn’t been one right answer to that question, but there are a whole lot of possibilities. Where to archive data is a textbook Paradox of Choice situation, where the more selections that are available, the more complexity that comes with the decision – and there are plenty of options.
Having nearly unlimited options to choose from is overwhelming, so companies typically keep their legacy system active after a migration is complete. They may limit user access, but they continue to pay ongoing maintenance fees in case they need that data later. After all, the alternative is evaluating, testing, and selecting one of the numerous archiving platforms available. It’s not a top-of-mind need, and we’ve seen “evaluations” drag on for years with little movement.
Ideally, those who are migrating or have already migrated to Veeva could store all of their archived data and content (not just their submissions) within Veeva Vault. This would provide the same familiar interface, search, and security for easy accessibility. It could also lower the cost of moving to a disruptive technology like Veeva: migrate only what you need to use going forward, and archive the rest into your Vault.
The Valiance team recently developed a configurable, integrated archiving solution that takes legacy content or data from other platforms, archives it in Veeva Vault, and makes it accessible through a dropdown menu from your standard Veeva interface. Imagine searching your current and archived content and data within Vault, all from the same interface. One Vault – all your information!
We’re extremely proud to officially unveil this new solution at the 2019 Veeva R&D Summit and are excited to set up private demos to show you exactly how it works.
If you’re looking for new ways to make a great Veeva solution even better and you’re ready to stop paying maintenance to retain an old system, then you can stop by the Valiance booth at Veeva R&D Summit in Philadelphia this week, or contact us online to learn more about this solution.
Valiance is the leading provider of GxP migration software and services when compliance and quality are on the line. Our TRUmigrate™ and TRUcompare™ software has been successfully integrated with Veeva Vault since 2012, and Valiance has become the preferred pathway for migrating legacy information to Vault. Our unique data migration testing application provides 100% verification of data and content, for 100% peace of mind.
Proven. Tested. Preferred. It’s Vault migrations done right.
My wife always told me to never forget our anniversary because it was important to remember all the good things that have occurred as a result of the initial act.
With that in mind, June 2018 marks the 15-year anniversary of when a few guys got together to write a business plan on a napkin (now framed in our office board room) about an idea to start a company: Valiance Partners. With that napkin plan in place, they had a concept of what they might be able to do for their first customer, and they would see where things took them from there.
Fast forward 15 years: Valiance (our rebranded name) has moved a few times, been recognized by many, carved out a market largely devoid of specialized vendors, garnered over 110 life science customers and partners, and, in just the past 3 years, tripled in size.
We have many to thank including our outstanding employees and our loyal customers who keep coming back to us to do the hard work required to meet their demanding GxP migration needs as they acquire, merge, and divest over and over again.
Thank you one and all as we look forward to continuing to serve your unique needs.
Sample-based testing has been a staple of quality organizations in life sciences for decades. In some cases, it makes perfect sense to use and, absent other options, it can be made to work.
The viability of sample-based testing lies in its assumptions. That is, if you are looking for errors whose probability of occurrence is uniformly distributed across the things you are testing, you’re on the right track. For example, with gradually degrading equipment at a pharmaceutical manufacturing plant, it’s easy to select a sampling of products to test the performance of the production machines. You can see quality degrading in the tablets being produced, isolate the issue, fix the problem, and resume production.
However, data managed in an information system is poorly modelled by this assumption, and anyone who has migrated data or content using sample-based testing intuitively understands the shortcoming. For example, you could rerun a sample-based migration test five times, using different data or documents, and come up with five different results. That alone is proof that the test itself has shortcomings.
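The variability is easy to demonstrate with a toy simulation. In the sketch below, the dataset, error rate, and sample size are all invented for illustration: migration errors are clustered in one legacy document type rather than spread uniformly, and repeated random samples of the same migrated dataset report different error counts.

```python
import random

def run_sample_test(records, sample_size, seed):
    """Draw one random sample and count how many records failed migration."""
    rng = random.Random(seed)
    sample = rng.sample(records, sample_size)
    return sum(1 for r in sample if not r["migrated_ok"])

# Hypothetical dataset: 10,000 records, with all 500 errors clustered
# in a single document type (not uniformly distributed).
records = [{"doc_type": "SOP", "migrated_ok": True} for _ in range(9500)]
records += [{"doc_type": "batch_record", "migrated_ok": False} for _ in range(500)]

# Five sample-based test runs of 100 records each: the reported error
# count depends entirely on which records happened to be drawn.
results = [run_sample_test(records, 100, seed) for seed in range(5)]
print(results)
```

Each run inspects only 1% of the population, so none of them can say anything definitive about the other 99%, which is exactly the problem the articles below describe.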
Because sample-based testing of data migrations is unpredictable, it can be unsuitable for GxP system migrations and can easily delay the timeline and increase the cost of a migration project. When a company tests by sampling, it takes a small representation of its data/document sets across different document types and manually tests them for gaps or anomalies. And, as stated above, depending on which data or documents you pick, you could (and probably will) get different results each time, making predictability almost impossible.
If and when differences are discovered, IT will need to work with the business to discuss next steps to remedy the gaps or anomalies. Then, after those issues have been resolved and a solution found, the team will sample again to ensure things are now error-free. But it’s extremely common for even more issues to come to light during this second sampling, and they’re most likely different issues. Now the entire process of working with the business and crafting a solution must be repeated for as long as it takes, until some comfort level is reached or time for testing simply runs out.
How many times will IT have to repeat this process? The answer is uncertain and depends on the results of each prior sample-based test, as well as the quantity and overall quality of the data. Most people plan on sampling twice, but our experience shows it typically takes five to nine rounds. Because the number of rounds varies, the “unknown” factor raises concerns among all parties and throws project timelines and budgets out the window. There is also an increased possibility of errors during Validation, and it’s very likely that the Production run will include unknown errors, something that’s embarrassing at the least and could raise compliance or quality concerns as well.
Looking for a successful migration testing use case? Look no further than this project at a top-tier pharmaceutical corporation.
The Alternative: Automated Testing For 100% Migration Verification
It’s not enough to plan a project timeline based on the quantity of documents. IT must also consider the legacy and new systems, as well as the quality of the data. Once the onion’s layers are peeled back, the project can seem far more challenging.
Consider the last time a migration was performed – say, ten years ago. That data may itself have been migrated from systems that are even older. When you consider decades-old data from decades-old systems, sampling doesn’t seem capable of completing successful testing in the allotted period of time.
Automated testing, alternatively, delivers predictability by testing ALL data and finding issues earlier in the process. This allows solutions to be implemented quicker and more predictably, providing the peace of mind to move on and complete the migration project on time and within budget.
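At its core, 100% verification is a record-by-record, field-by-field comparison of source and target. The sketch below is a minimal illustration of that idea only – the field names and report format are invented, and this is not Valiance’s actual TRUcompare implementation:

```python
def verify_migration(source, target, key="doc_id"):
    """Compare every source record against its migrated counterpart.

    Returns a list of discrepancies: records missing from the target and
    field-level mismatches. An empty list means every record verified cleanly.
    """
    target_by_key = {r[key]: r for r in target}
    issues = []
    for src in source:
        tgt = target_by_key.get(src[key])
        if tgt is None:
            issues.append((src[key], "missing in target"))
            continue
        for field, value in src.items():
            if tgt.get(field) != value:
                issues.append((src[key], f"mismatch in '{field}'"))
    return issues

# Hypothetical example: one record dropped, one field changed in transit.
source = [
    {"doc_id": 1, "title": "SOP-001", "status": "effective"},
    {"doc_id": 2, "title": "SOP-002", "status": "effective"},
    {"doc_id": 3, "title": "SOP-003", "status": "retired"},
]
target = [
    {"doc_id": 1, "title": "SOP-001", "status": "effective"},
    {"doc_id": 2, "title": "SOP-002", "status": "draft"},
]
print(verify_migration(source, target))
# → [(2, "mismatch in 'status'"), (3, 'missing in target')]
```

Because every record is checked, the result is the same on every run – the predictability that sampling cannot offer.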
Why Is 100% Migration Testing a Better Choice?
If you want predictable results, automated data migration testing has proven to be the better way. In our experience, if you are migrating a small amount of data – fewer than 10,000 to 15,000 documents – sampling may take approximately the same time to complete as automated testing. At that volume or greater, sampling costs as much as automated testing without delivering its benefit. Very often, however, migrations include hundreds of thousands (or even millions) of documents. And our study assumes high-quality data, which is normally not the case.
If you could test 100% of your data to ensure it would all migrate successfully, would you? Of course! If you’re doing manual sample-based testing but paying the price of automated testing anyway, why not do 100% automated testing in the first place? The answer might be that you never expect to need the five to nine sampling iterations that are typical of GxP data.
Finding and overcoming problems manually can slow down the project as more and more roadblocks are uncovered. Automating this process protects the project deadline and helps avoid having to give management bad news about delays or cost overruns.
Once again, this method is much more predictable and “makes sense” more often than one might think. The fact is that automated migration and testing processes, supported by the right methodology, can save time and improve efficiency and quality while reducing risk across a broad set of migration challenges.
Access the free whitepaper to read more about automated data migration and 100% migration testing.
After a merger, acquisition, or divestiture, it’s a major challenge for the affected companies to create a workable, sustainable environment with all relevant information across their combined systems. Failing to do so hurts business productivity, hinders easy access to data, and can create compliance issues.
The ultimate solution could be to consolidate or migrate from one system to another in order to maintain compliance with regulatory requirements while decreasing operating costs. The trouble is that to the business community – especially senior management – migrating data seems like it should be an easy task. After all, it’s just moving data from one system to the other, right? Maybe, but probably not. Systems within the combined company will most likely be incompatible. Applications may use the same technology platforms but be on different versions. One system could be out-of-the-box while the other is highly customized. Data could be stored in multiple geographic locations. The quality of the data from one (or both) companies could be very poor. And on and on it goes.
After the news of a merger, acquisition, or divestiture, IT must figure out how to handle data from all affected companies – but there are plenty of roadblocks to overcome first.
Here are some tips on how to avoid common data migration issues that happen with an acquisition or divestiture:
Put an M&A IT Team in Place Ahead of Time
If your company has a propensity to acquire, then this should be a standing team. Even if the company doesn’t plan on many acquisitions, there is plenty going on in the industry in terms of selling and buying products, divisions, plant sites, and so on. Each of these requires some level of system integration and/or system retirement, which makes having this team in place a good idea.
Know When to Ask for Help
It may seem like handling things internally is the best and cheapest route, but things can quickly get overwhelming. Sometimes IT will attempt to do the work themselves using tools or software left over from a migration years earlier. Or the company might have a talented team of people who unfortunately don’t have much experience with large system migrations, or with the same systems.
This is a project that requires predictability with an on-time finish (TSAs are often involved) – if your team can’t guarantee that, look for outside help.
Do an Initial Assessment
After figuring out that you need help, get an initial assessment done as quickly as possible. It doesn’t have to be precise, but the clock is ticking, and you need to get a handle on answering the question, “How big is this?”
Although you might not be able to share much information yet between the acquired/divesting ventures, see if you can get anything on basic technology stacks and volumes of data/content. Anything at all is helpful. You’ll need an initial assessment to take upstairs to get all of this funded (see: “Educate Upper Management”).
Educate Upper Management About What is Involved
As stated before, it can be an uphill battle to convince senior management to invest money in a data migration project. However, this is a “pay me now, or pay me later” proposition. It’s important to explain, in simple terms and with examples, why migrating and consolidating will save downstream costs (license renewals, maintenance) and why it is more than moving information from System A to System B. Merging data, technologies, and processes will help the organization consolidate programs and minimize the risk of non-compliance.
Leverage Automated Data Testing
If you have tens or hundreds of thousands of documents to migrate (or more), then sample-based testing simply won’t cut it. Testing only a small percentage of documents and data means you will continually find issues, requiring an unknown number of additional samples that quickly rack up man-hours and ultimately delay the project. Improve the predictability of your project by testing all of your data at once with the right software and services.
If you do need assistance migrating data during a merger, acquisition or divestiture, Valiance Partners has the experience, the software and the methodologies to streamline the project and finish on time with 100% verified data. Contact us today for a consultation.
Oftentimes when companies are in a heavily-regulated industry, the threat of not being in compliance can result in overprotection of information. Rather than risk not having access to it, many organizations hoard information, often forever, in their production system.
Storing information in a primary production system can be very expensive – why use nearly all of your available storage to house information that may need to be accessed only in the event of an audit or infrequent report?
Alternatively, archiving information uses a pre-determined set of business requirements to move certain data and/or documents from old or legacy systems to a much cheaper storage option that is still secure and allows for easy access in the event information is needed.
When Do I Need to Archive Data?
There are a number of scenarios where a company may seek out data archiving software:
Relieving your current or new systems of excessive information can improve the performance of your production system, sometimes drastically. Without the strain of extra terabytes of legacy and/or excess information, systems run quicker, crash less, perform updates faster, and have more available storage.
Archiving should be performed with every migration project and for any company looking for easier cloud enablement. Why clutter your new system with unneeded legacy data? Why pay premium prices to store information you may never need to access?
Migrate vs. Delete vs. Archive Data
Again, this is where compliance and regulation causes us all to hold tight to our information and keep it at-the-ready just in case it’s needed for an audit. Audits are a real concern – if, for example, the FDA needs a certain document for a service you haven’t offered in years, you need to be able to produce that document quickly.
Because of this, many don’t want to risk moving information off their primary system and they certainly don’t want to risk deleting it. Unfortunately, keeping records for longer than their standard retention period can be a liability for the organization.
When you archive data using data archiving software, the information is copied to a lower-cost archive location and then deleted from your system. You can still easily and readily access your information without the clutter of storing it on your production system and without the liability of keeping information past its retention period. Archiving provides a win-win scenario.
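The copy-then-delete sequence can be sketched as a simple workflow. This is an illustrative outline only, not any vendor’s actual product: here verification is a checksum comparison, whereas real archiving software would also apply retention rules, access controls, and audit trails.

```python
import hashlib
import shutil
from pathlib import Path

def archive_file(src: Path, archive_dir: Path) -> Path:
    """Copy a file to archive storage, verify the copy via checksum,
    and only then delete the original from the source system."""
    archive_dir.mkdir(parents=True, exist_ok=True)
    dest = archive_dir / src.name
    shutil.copy2(src, dest)  # copy the file along with its metadata

    def sha256(path: Path) -> str:
        return hashlib.sha256(path.read_bytes()).hexdigest()

    # Never remove the original until the archived copy is proven identical.
    if sha256(src) != sha256(dest):
        dest.unlink()
        raise IOError(f"Archive copy of {src} failed verification")

    src.unlink()  # original removed only after successful verification
    return dest
```

The key design point mirrors the text above: deletion is the last step, gated on a successful verification, so there is never a moment when the information exists nowhere.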
Want to learn more about managing and archiving information in TrackWise using easy-to-use software?
The right data archiving software can help determine which information in your system is inactive and, using established requirements, will copy and move that information to an alternative storage location and then delete it from your system. Given the intricacies of software like this, when you look for data archiving software it’s best to also look for a company with years of archival and migration experience in your industry.
To read more about archiving information in TrackWise, download this free PDF and see how data archiving software can help.
From May 30th to June 1st, the Valiance team was in Copenhagen for the 3rd annual Veeva Systems European R&D Forum. For those who weren’t able to make it, here’s what you missed, along with some of my tips for getting the most out of future trade show events.
What is the Veeva Systems European R&D Forum?
This annual event is the place to be for manufacturing, regulatory and IT employees in the pharma and biotech industries considering a move away from their current technology to Veeva. Event attendance this year nearly doubled from last, with everyone looking for the latest from Veeva and its partners. As a Veeva partner, Valiance not only showcased and presented at the event, but we were also a proud sponsor.
Over the course of three days, attendees had plenty of opportunities to attend sessions scheduled to correspond with job functions and responsibilities, such as clinical, quality, and regulatory. My colleague David Katzoff, managing director of Valiance Partners, co-hosted a session with Henri Valentin, Veeva’s Director of Services Engagement. They spoke to a captivated audience about how to plan a successful migration, sharing recent experiences and lessons learned.
Don’t Miss Out on Veeva Systems Insights… or the Giveaways
In our sponsor booth, Valiance hosted a giveaway promotion for the newest Amazon Echo. Everyone who visited us at the booth and/or attended our presentation was entered into the drawing. While this contest was only available for event attendees, everyone can still access the free whitepaper below with Veeva product insight – download our whitepaper and take the migraine out of migrations.
Read about how to plan and execute a successful migration to Veeva Vault that avoids common pitfalls and keeps your project on time and on budget.
Tom’s Top Three Tradeshow Tips
In my years of experience, I’ve attended my fair share of conferences, forums and events. On the heels of the Veeva Systems European R&D Forum and as we prepare for the Veeva Systems Global R&D Summit this October in Philadelphia, I would like to share my top three tips on how to optimize your time at a tradeshow:
If you have the time, spend a few minutes at the booth of a familiar company – they will have new information to share about clients doing exactly what you might be doing. While we do plenty of educating on our company and offerings in our booth, we also love hearing from existing customers and sharing special offers, promotions and relevant updates.
Perusing booths and attending educational sessions or workshops are important, but my priority when attending an event like Veeva Systems European R&D Forum is to network. It’s so important for me to talk to as many current customers, new prospects and partners as possible. Not only does this help keep me in the loop of new industry trends and topics, but also builds relationships.
In today’s digital world you can expect any event to publish the list of sponsors and speakers on the event’s and/or company’s website. If you see a company you would like to meet at the event – whether to learn more about the company, explore a specific topic, network with a particular thought leader, or schedule one-on-one time for a software demo or consultation – engage with them on their company website or LinkedIn page. Speaking from my own experience, we are more than willing to meet and connect with those who raise their hand.
When a company takes the time and effort to host an event, it surprises me that some hosts don’t take the time to visit with event sponsors. This is a missed opportunity! In the space of a year, much has often changed with a sponsor, and having hosts mingle with sponsors can be very enlightening. I can’t tell you how many times I’ll be talking to a manager or rep and they’ll say, “I didn’t realize that.” You’ll never regret learning about a product update or a new approach to solving a tricky problem.
For more information about future Veeva events, please click here: Veeva Systems Events
For our upcoming event schedule, including where we’ll be next, please click here: Valiance News and Events
To download our free whitepaper featuring Veeva Vault migrations, please click here: Taking the Migraine out of Migrations
Migrations are a complex undertaking and traditional testing approaches often fall short of today’s business and compliance requirements. The ensuing risk can easily result in costly errors. The question is: How sound are your current testing and sampling approaches?