Dodd-Frank Forces the Data-Quality Issue

Dallas, September 29, 2010: By Bill Long

How sustainable is a data management outlook that does not, at some point, say: "Inaccurate data is incompatible with executional excellence. If we don't believe the data foundation of our strategy, why would anyone commit to the strategy?"

We've heard a few bankers saying it a different way: "What we don't know can hurt us — is hurting us. We just don't know where and how."

Where and how is about to get a lot clearer. Under Dodd-Frank, the integrity, accuracy, accessibility, usability and security of banks' data are in for intense scrutiny.

Once again, compliance may be driving what could just as well be driven by profits. Poor data management is expensive. Its cost has always been obscured by the belief that the underlying issues couldn't be fixed fast enough for the immediate need. So get what you need, make your decisions and move on.

But what that attitude — an attitude steadily congealed over the years into standard practice — conceals is the enormous cost of never fixing the underlying issues. Are these examples familiar?

• A top executive needs a certain set of data every week and has to reassemble it from different sources every time. Each time it takes an extra hour to gut-check the source data and adjust it for anomalies he has found in the past.
• Ten different groups in the bank (separate business lines and support groups) all tap operations every month for the same general information for their separate purposes, and operations has to create it fresh for each.
• A data center is closed without anyone realizing that the applications it houses lack redundancy.
• A marketing campaign goes nowhere because it was built on faulty data from the legacy systems of acquired institutions.
• Whole departments virtually shut down to assemble compliance reports, and IT can barely take on mission-critical projects because regulators need data that is hard to find.
• The same data is warehoused in multiple places but updated or fixed in only some of them; mismatches like that are mechanically detectable, as the sketch after this list shows.
• A competitor moves nimbly into a timely opportunity because it has the data that your bank can only guess at.
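
The multiple-warehouse problem, at least, is easy to detect by machine. As a minimal sketch (every name here, from customer_master to the field list, is hypothetical), a nightly reconciliation pass over two copies of the same records could flag whatever has drifted out of sync:

```python
# A minimal sketch of a cross-copy reconciliation check. All names and
# records here are hypothetical; the point is only that flagging
# out-of-sync copies is a small, mechanical job.

# Two copies of "the same" data, keyed by customer ID.
customer_master = {
    "C001": {"name": "A. Smith", "address": "12 Elm St", "status": "active"},
    "C002": {"name": "B. Jones", "address": "9 Oak Ave", "status": "closed"},
}
warehouse_copy = {
    "C001": {"name": "A. Smith", "address": "12 Elm Street", "status": "active"},
    "C002": {"name": "B. Jones", "address": "9 Oak Ave", "status": "active"},
}

def reconcile(master, copy):
    """Yield (key, field, master_value, copy_value) for every mismatch."""
    for key, record in master.items():
        other = copy.get(key)
        if other is None:
            # The record exists in the master but not in the copy at all.
            yield key, "<missing>", record, None
            continue
        for field, value in record.items():
            if other.get(field) != value:
                yield key, field, value, other.get(field)

for key, field, expected, found in reconcile(customer_master, warehouse_copy):
    print(f"{key}: {field!r} differs (master={expected!r}, copy={found!r})")
```

Anything this prints is a record someone updated in one place and not the other.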

Multiply those inefficiencies every day for every employee who deals in data, and the costs are staggering — but never calculated.
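
To show how quickly those costs would add up if anyone did calculate them, here is a deliberately rough back-of-envelope sketch; every figure in it is an illustrative assumption, not a benchmark:

```python
# Back-of-envelope cost of redundant data assembly. Every figure below is
# an illustrative assumption; substitute your bank's own numbers.

hours_lost_per_person_per_week = 1.0   # e.g., the executive's weekly gut-check
people_touching_bad_data = 200         # staff who reassemble or re-verify data
loaded_hourly_cost = 75.0              # fully loaded cost per hour, in dollars
weeks_per_year = 50

annual_cost = (hours_lost_per_person_per_week
               * people_touching_bad_data
               * loaded_hourly_cost
               * weeks_per_year)

print(f"Annual cost of rework alone: ${annual_cost:,.0f}")
# -> Annual cost of rework alone: $750,000
```

Even with these modest assumptions, rework alone runs to three-quarters of a million dollars a year, and that counts none of the missed opportunities or compliance fire drills in the list above.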

What would be the net gain to the bank if you simply eliminated the most egregious data management problems: the flat-out errors, redundancies and incompatibilities?

Remember, the goal doesn't have to be data management utopia. The gurus in check processing used to devise slick solutions to speed up the checks; the grizzled veterans would brush them off, saying: "Forget all that — forget what already works. Eighty percent of my costs are exceptions. Fix them, and you've cured my problems."

If you wanted to take that 80% approach, how to start?

Name the problem. It could have a significant cultural impact for executive management to say: "We have a data management problem. Let's stop saying directionally right is good enough. It is not; the data may not even be directionally right. We all need rapid access to consistent and reliable data."

Take the enterprise view. The siloed approach has already been tried. In every part of the bank, over the years, somebody has tried to fix the data problem for their reports or their needs. Even if they fixed it, the siloed solution is unsustainable. If the data matters, it matters to the enterprise, and it needs to be fixed at that level.

Take the users' view. It is tempting to start with the systems of record, perfect them, and then advance out to the users. But many data management problems arise when users, unable to get what they need, build their own solutions, creating redundancies. Build your data management around what your users need and how they use it.

Start selectively. Not all data is equal. What are the key pieces of data that drive the most important or frequent decisions around the enterprise? Start there, and build credibility for data management by making that information accurate and getting it into users' hands in a timely manner.

Dodd-Frank may have awakened a sleeping monster in data management, but it was due for a wake-up call. In the coming months and years, it will become increasingly apparent which banks can make good decisions based on access to reliable data.

Bill Long is a managing partner in the North American financial services practice at ABeam Consulting. He can be reached at blong@hazelwoodpartners.com.
