An Approach to Seamless Data Migration during Retirement Plan Recordkeeping Platform Modernization

By: admin - Jun 03, 2014
Changes in the customer environment in the US retirement market, coupled with increased competition and the need to grow the plan participant base, have caused US retirement plan providers to reassess their platforms, technologies, and processes. It is the right time for plan provider organizations to establish technology parity with their peers, stay relevant to customers, and gain a competitive edge over other players. There is also a burning need to significantly reduce operating costs. A recent study of cost/income ratios at financial organizations found that most institutions (banks, insurers, and other financial firms) that run their business on legacy platforms have cost/income ratios above 60 percent, whereas those that run their core platforms and systems with real-time processing have ratios as low as 30 percent. The irony is that CTOs and CIOs remain comfortable with legacy platforms and their high operating costs, and most take the position of "why change something that is working?" In spite of knowing the cost of not changing, they prefer the status quo because of the complexity and risk of transitioning from a long-standing "stable" legacy platform to something new.
When probed, most of the reasons CXOs gave for maintaining the status quo fell into three categories:
i. Data Migration Issues
ii. Consolidating / replacing various peripheral / line of business systems
iii. Integration challenges
Data Migration Issues
The term data migration does not mean simply copying data from system A to system B. When asked about data migration challenges, most organizations point to data quality issues (corrupt data, missing data, etc.), data format and compatibility issues, application performance problems, and extended or unexpected application downtime. Because data has to be pulled out of various peripheral systems, each with its own format, transforming data between source and target systems that use different formats is a challenging task in itself. And this is just the tip of the iceberg. Data migration challenges severely impact project budgets. It is critical to understand all of the applications involved, the design requirements, time schedules, stakeholders, operating systems, hardware configurations, number of servers and server clusters, amount of storage, database types, network speeds, and so on. Any small mistake can have a huge impact on the project schedule.
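The transformation step described above can be sketched in a few lines of Python. This is a minimal, hypothetical example: the field names (`plan_id`, `ssn`, `hire_dt`), the legacy date format, and the quality flags are illustrative assumptions, not taken from any actual recordkeeping system. The point is that a transformation routine should flag quality issues rather than silently dropping bad records.

```python
from datetime import datetime

# Assumptions (hypothetical): the legacy system stores dates as MMDDYYYY
# and zero-pads plan IDs; the target schema uses ISO 8601 dates.
LEGACY_DATE_FMT = "%m%d%Y"
TARGET_DATE_FMT = "%Y-%m-%d"

def transform_participant(record: dict) -> dict:
    """Map one legacy participant record to the target schema,
    collecting data quality issues instead of silently discarding them."""
    issues = []

    # Normalize and validate the SSN.
    ssn = (record.get("ssn") or "").replace("-", "")
    if len(ssn) != 9 or not ssn.isdigit():
        issues.append("invalid_ssn")

    # Convert the legacy date format to the target format.
    try:
        hire_date = datetime.strptime(
            record.get("hire_dt", ""), LEGACY_DATE_FMT
        ).strftime(TARGET_DATE_FMT)
    except ValueError:
        hire_date = None
        issues.append("bad_hire_date")

    return {
        "plan_id": record.get("plan_id", "").lstrip("0"),
        "ssn": ssn,
        "hire_date": hire_date,
        "quality_issues": issues,
    }
```

Records whose `quality_issues` list is non-empty can then be routed to a remediation queue instead of being loaded into the target system.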
Because a single business process runs across a plethora of applications, and these applications have different levels of business criticality and therefore varying degrees of acceptable downtime, achieving minimal downtime during the data migration process is a daunting task. Organizations are looking at technologies that allow non-disruptive data migration with "roll-back" options: essentially, if something goes wrong, they can abort or restart the migration process.
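The roll-back idea can be illustrated with a small Python sketch. This is an assumption-laden toy, not a real migration tool: `write_batch` and `delete_batch` stand in for whatever the target system's load and delete operations actually are. It shows the core pattern, which is to record each committed unit of work so a failure can be undone rather than leaving the target half-populated.

```python
def migrate_in_batches(source_rows, write_batch, delete_batch, batch_size=1000):
    """Copy rows to the target in batches, recording each committed batch
    so a failure can be rolled back by deleting everything written so far.

    write_batch(batch_id, rows) and delete_batch(batch_id) are hypothetical
    callables wrapping the target system's load and delete operations."""
    committed = []            # IDs of batches already written, for rollback
    batch, batch_id = [], 0
    try:
        for row in source_rows:
            batch.append(row)
            if len(batch) >= batch_size:
                write_batch(batch_id, batch)
                committed.append(batch_id)
                batch, batch_id = [], batch_id + 1
        if batch:                         # flush the final partial batch
            write_batch(batch_id, batch)
            committed.append(batch_id)
        return committed
    except Exception:
        # Something went wrong: undo every batch written so far, newest first,
        # leaving the target exactly as it was before the migration started.
        for bid in reversed(committed):
            delete_batch(bid)
        raise
```

In a real migration the rollback log would live in durable storage so the undo can survive a process crash, but the shape of the logic is the same.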
If discovering, analyzing, and profiling source data is one challenge, verifying that the data has been migrated correctly is another. When dealing with several peripheral systems, a few systems may demand similar configurations for source and target storage hardware. Because these peripheral systems run different firmware versions and come from entirely different generations, they hold data of varying data models, time periods, relevance, and quality, demanding a great deal of time for data mapping and transformation tasks. Together, these issues make data migration a complex and demanding project, one that most CXOs want to avoid by maintaining the status quo.
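One common way to verify a migration is reconciliation: compare source and target record by record, keyed on a stable identifier, and report what is missing, extra, or changed. The sketch below is a minimal, hypothetical illustration of that idea; the key name `participant_id` and the record shape are assumptions for the example.

```python
import hashlib

def record_digest(record: dict) -> str:
    """Order-independent digest of a record, so field ordering differences
    between source and target extracts do not produce false mismatches."""
    canonical = "|".join(f"{k}={record[k]}" for k in sorted(record))
    return hashlib.sha256(canonical.encode()).hexdigest()

def reconcile(source_records, target_records, key="participant_id"):
    """Compare source and target by key, reporting missing, extra,
    and mismatched records rather than a bare pass/fail."""
    src = {r[key]: record_digest(r) for r in source_records}
    tgt = {r[key]: record_digest(r) for r in target_records}
    return {
        "missing_in_target": sorted(set(src) - set(tgt)),
        "extra_in_target": sorted(set(tgt) - set(src)),
        "mismatched": sorted(k for k in src.keys() & tgt.keys()
                             if src[k] != tgt[k]),
    }
```

A report of specific keys that failed, rather than a single pass/fail flag, is what makes post-migration remediation tractable when millions of participant records are involved.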