Another Look at Reconciling Data

By Mike Arnoldy

November 7th, 2017


Most organizations severely underestimate what it takes to reconcile data. They often don’t consider how much historical data is needed, how the data will be validated, or who will do the work. The best way to ensure data reconciliation is smooth and mostly uneventful is to take a planned, considered approach.

Getting Started

The first step is to determine how much historical data to load. To do this right, review the data with a critical eye and operate with the assumption that data validation WILL take longer than anticipated. The minimum historical data required is the current year plus the prior year, which supports reporting with prior-year comparisons. While there are typically some reports that use data older than the previous year, the time and effort to load this additional data is generally not worth it. That type of reporting is better handled in Excel while history builds up naturally in the application.

The bottom line for this step: if you think you need additional data, go back and think again, or be prepared to lengthen the project.

Data Validation

The next step is determining how you will validate the data. The first thought is often to run reports from the existing system and from Financial Management, then tick and tie the numbers. This is a very manual, slow process. Depending on what the current system is, it is usually quicker to perform the validation in Excel. If the current consolidation is done in Excel or any Hyperion product, doing the validation in spreadsheets is definitely the way to go. The spreadsheets are typically set up in a financial statement format: a column with values from the old consolidation system, a column populated using Smart View to pull the comparable values from Financial Management, and a difference column. There should also be a “Point of View” section referenced by Smart View so that the entity or period can easily be changed and the data refreshed.
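The tick-and-tie logic the spreadsheet performs can also be sketched in a short script. This is a minimal illustration only, not a substitute for the Smart View workbook described above; the file layout and column names ("Account", "Amount") are assumptions for the example.

```python
# Illustrative sketch of the tick-and-tie comparison: old-system values vs.
# new-system values, with a difference column and a rounding tolerance.
# File and column names here are assumed for the example, not from any
# real extract format.
import csv


def load_values(path):
    """Read a CSV extract into a dict keyed by account."""
    with open(path, newline="") as f:
        return {row["Account"]: float(row["Amount"])
                for row in csv.DictReader(f)}


def tie_out(old, new, tolerance=0.01):
    """Return accounts whose difference exceeds the rounding tolerance.

    old and new map account -> amount; accounts missing from one side
    are treated as zero, so new or dropped accounts surface as exceptions.
    """
    exceptions = {}
    for account in sorted(set(old) | set(new)):
        diff = new.get(account, 0.0) - old.get(account, 0.0)
        if abs(diff) > tolerance:
            exceptions[account] = diff
    return exceptions
```

The tolerance mirrors what the difference column in the workbook usually allows for: immaterial rounding noise is ignored so the team can focus on real breaks.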

Another challenge can be determining tie-out points. If the new and old applications differ in their dimensionality, or there is greater detail in the new application, it may be hard to find common data points between the applications for data validation. For example, if the new application has a dimension that is not in the original application, what do you tie it to? For the purpose of data validation, you may have to tie out only the top of that dimension against the original application, and then look to another source, such as the ERP system, to tie out the details.
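Tying out at the top of the extra dimension amounts to collapsing it before comparing. A hypothetical sketch, where the shared dimensions ("Entity", "Account") and the extra dimension name ("Product") are assumptions for the example:

```python
# Collapse a dimension that exists only in the new application so its totals
# become comparable with the old application. Dimension names are assumed
# for illustration.
from collections import defaultdict


def aggregate_to_top(records):
    """Sum amounts over the extra dimension, keying only on the dimensions
    both applications share (here: Entity and Account)."""
    totals = defaultdict(float)
    for rec in records:
        key = (rec["Entity"], rec["Account"])  # drop the extra dimension
        totals[key] += rec["Amount"]
    return dict(totals)
```

The collapsed totals can then be tied to the original application, while the dropped detail is validated separately against the ERP system.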

Another increasingly important consideration is the requirements for documenting the data validation. Discuss these upfront with the various stakeholders, including both internal and external auditors. That way, the required documentation can be produced during the data validation process rather than created at the end.

Now for the Process

Now that the mechanics of the process are defined, what does the process look like? I have heard the suggestion more than once, “Let’s just check the top and see how it looks!” The folks suggesting this have obviously never validated data.

The validation process needs to be an orderly one that builds from the bottom up. If possible, start with a base entity and base accounts for one month, usually January of last year. At this level, you are really checking that the data load process is loading the data correctly. Then you move up to the parent entities and parent accounts. This step is to confirm that your hierarchies are built correctly, and that translation and eliminations are being performed at the appropriate place.
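The parent-level step above can be pictured as checking that each parent equals the sum of its children once the base data ties out. The sketch below is a simplification under stated assumptions: it checks pure aggregation only and ignores currency translation and eliminations, which a real consolidation also applies at parent levels.

```python
# Simplified bottom-up roll-up check: does each parent's stored value equal
# the sum of its children? Real consolidations also apply translation and
# eliminations at parents; this sketch deliberately omits them.


def check_rollup(hierarchy, values, tolerance=0.01):
    """hierarchy maps parent -> list of children; values maps member -> amount.

    Returns {parent: (stored, expected)} for parents that disagree with the
    sum of their children beyond the rounding tolerance.
    """
    breaks = {}
    for parent, children in hierarchy.items():
        expected = sum(values.get(child, 0.0) for child in children)
        stored = values.get(parent, 0.0)
        if abs(stored - expected) > tolerance:
            breaks[parent] = (stored, expected)
    return breaks
```

A break flagged here usually means a child is hanging off the wrong parent, or a consolidation rule is firing at the wrong level, which is exactly what this stage of the validation is meant to catch.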

Once you have reached the top entity, you should have the majority of the issues corrected. Each month’s validation will be smoother and quicker, with a few bumps here and there. At this point, it can be appropriate to take more of an audit approach to the validation where you can even “check the top and see how it looks!”

The validation process requires resources. I had the director of IT at one client volunteer to have IT do the data validation. I had to chuckle, as I do not know many accountants who are going to sign off on data validated by IT. The resources used should ideally be familiar with the consolidation process. For the validation of the first month, you may want to keep the validation team small. There will be a lot of issues to resolve, and it will be easier to work through these with a smaller team. After you get the first few months of validation done and the process is running smoothly, you may want to expand the team by adding additional users; for example, have the users in Europe tie out their own data. This accomplishes several things. It adds resources to the process so that it can be completed sooner, and the extended members of the validation team get to learn and use the application and gain confidence that it is working correctly. This early exposure to the application can help ensure smooth parallels and go-live.

One final note: this process is easily carried out in Oracle EPM and OneStream CPM applications.

 



About Mike Arnoldy

Mike Arnoldy has over twenty years of experience in all phases of design, development, and implementation of financial applications. He has exceptional problem-solving and architecting skills with a strong technical background. He specializes in financial consolidation applications that include complex calculations, foreign currency translation, inter-company/equity eliminations and varied reporting requirements to satisfy both external and management reporting needs. Mike is also a Certified Public Accountant.
