Data is moving up the agenda within financial institutions as firms try to squeeze out more value, greater accuracy, improved timeliness and greater efficiency, all while meeting the compliance burden of quality control and risk management.
The claim that “Accurate data will prevent the next big financial blow-up” isn’t lightly made, but it is one of the conclusions of recent benchmarking research for FIMA, the financial data management conference taking place in London 12-13 November.
What is an acceptable level of data quality?
Fully 48% of the respondents to the FIMA survey have a chief data officer and a further 12% intend to create a CDO role in future.
Almost half are planning to adopt a new data governance platform over the next 12-18 months, whether to replace an existing platform or as a new implementation. The main reasons cited for doing so were to adapt to changing business needs, to keep up with changes in the market, or to gain enhanced functionality.
Big data not such a big issue
While just over half handle data governance through a permanent centralised team, almost a quarter distribute responsibility to the relevant teams within the business via third-party installed software; only 4% outsource entirely.
The top three drivers for data management strategy were improving data quality (79%), regulatory/compliance requirements (63%) and better operational efficiency (63%).
Only a quarter of respondents said that ‘Big Data’ analytics were of ‘vital’ or ‘high’ importance.
The report, Reference Data Management Industry Benchmarking Survey 2014, is available to download.
FIMA Europe takes place in London 12-13 November 2014. The Forum for Regulatory Change is proud to be a media partner for the event.