The Regulation Forum - News
Thu, 21 Jan 2016 11:37:31 +0000

New SRC head Tucker warns against regulatory complacency
Fri, 11 Dec 2015 16:15:58 GMT
http://www.regulatorychange.com/news/new-src-head-tucker-warns-against-regulatory-complacency

Paul Tucker, the newly appointed chair of the Systemic Risk Council, the U.S.-based lobbying group that campaigns on financial services regulation, has warned regulators against complacency now that the global financial crisis has passed its peak.

In an interview with the Financial Times, the former Bank of England deputy governor said he would use his new role to ensure policymakers’ determination to reform the financial system does not fade.

Tucker's experience in financial services regulation covers the peak of the financial crisis: He was Deputy Governor at the Bank of England from 2009 through 2013. He joined the Bank in 1980 and played a major role in many of the most significant developments in the international financial system. He also served as a member of the G20 Financial Stability Board Steering Committee and chaired the FSB’s group on resolving large and complex banks. 

ESMA consults on revisions to trade reporting requirements under EMIR
Tue, 02 Dec 2014 16:41:16 GMT
http://www.regulatorychange.com/news/esma-consults-on-revisions-to-trade-reporting-requirements-under-emir
By Nathaniel Lalone, Katten Muchin Rosenman UK LLP

The European Securities and Markets Authority (ESMA) announced a new consultation to consider revisions to the regulatory technical standards (RTS) and implementing technical standards (ITS) for trade reporting under the European Market Infrastructure Regulation (EMIR). The RTS and ITS were finalized in 2012 and 2013 at a time when, according to ESMA, “there was only limited practical experience with the reporting of derivatives.” Since then, reporting counterparties have identified a series of issues requiring further clarification.

This has led ESMA to publish a series of questions and answers relating to trade reporting requirements (Q&As) in an effort to address these concerns. ESMA now proposes to transpose certain of the key elements of the Q&As into technical standards. 

ESMA has classified its proposed amendments as clarifications, adaptations and introductions. 

'Clarifications' refer to instances where the meaning of a reporting field is open to interpretation or where completing the field is approached inconsistently by reporting counterparties. For example, mark-to-market value reporting will be adjusted in light of market practice in valuing different types of derivatives: futures and options will be marked-to-market based on the size of the contract and the current market price, whereas other types of derivatives, such as swaps and forwards, will be valued based on their replacement cost, taking into account delivery of the underlying asset. Cleared trades will be valued at the settlement price of the central counterparty (CCP).
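The valuation split described above can be sketched in code. This is a minimal illustrative sketch only: the `ReportedTrade` type, its field names and the product-type strings are assumptions made for the example, not fields from ESMA's actual reporting schema.

```python
from dataclasses import dataclass
from typing import Optional

# Illustrative sketch of the proposed valuation clarification.
# ReportedTrade and its fields are invented for this example.

@dataclass
class ReportedTrade:
    product_type: str                # e.g. "future", "option", "swap", "forward"
    contract_size: float             # number of units covered by the contract
    market_price: float              # current market price
    replacement_cost: float          # cost of replacing the contract today
    cleared: bool = False
    ccp_settlement_price: Optional[float] = None

def mark_to_market(trade: ReportedTrade) -> float:
    """Pick the valuation basis per the proposed clarification."""
    if trade.cleared and trade.ccp_settlement_price is not None:
        # Cleared trades are valued at the CCP's settlement price.
        return trade.contract_size * trade.ccp_settlement_price
    if trade.product_type in ("future", "option"):
        # Futures and options: contract size times current market price.
        return trade.contract_size * trade.market_price
    # Swaps, forwards and other derivatives: replacement cost.
    return trade.replacement_cost
```

The point of the clarification is exactly this branch structure: which valuation basis applies is determined by product type and clearing status, rather than being left to each counterparty's interpretation.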

By contrast, an 'adaptation' is an update to an existing reporting field to reflect clarifications made in the Q&As. This would include expanding the range of possible values beyond “financial counterparty” and “non-financial counterparty” to include CCPs and certain public or international entities. 

Finally, an 'introduction' is a new field or value to reflect market practice or other regulatory requirements. In this regard, ESMA has proposed new data fields to distinguish between trade-level and position-level reporting as well as to provide greater clarity on the identity of the underlying reference asset for a derivative contract. 

The consultation paper contains a series of questions for which ESMA requests public comment. The consultation period will close on February 13, 2015. The European Commission will have three months from the publication of ESMA’s final report to endorse the proposed amendments to the RTS and ITS. The consultation paper is available on ESMA’s website.

Nathaniel Lalone is a partner at Katten Muchin Rosenman UK LLP (www.kattenlaw.com).

Getting the data story across to your board
Sat, 29 Nov 2014 10:53:21 GMT
http://www.regulatorychange.com/news/getting-the-data-story-across-to-your-board
"We didn't talk to Stephen Hester about data until the last two minutes," says RBS chief data officer 

Data and regulation may be important issues for chief data officers to discuss with their boards, but a discussion panel at the recent FIMA Europe conference agreed that ‘data’ and ‘regulation’ should take a back seat in that conversation. Instead, CDOs need to communicate about ‘business outcomes’.

“The topic of data management is relatively complex to get across in a meaningful way to people who are not involved in the topic,” said Graham Smith, chief data officer at RBS.

Smith said that RBS has gone through six iterations of trying to develop “a narrative that is the story about data. When I have a conversation now with the board we talk about ‘correct’, ‘connect’ and ‘create’,” he said.
  • Correct means the hygiene work, quantification of data quality and remediation of data
  • Connect means bringing that data together so it’s available for people to use 
  • Create – “the exciting bit” – means generating better customer insight and so on

“Those three views of the world are quite easy and intuitive to understand. And there’s value in each of them,” Smith said.

In pitching to then-CEO Stephen Hester on why data matters, Smith said, “We had a conversation and didn’t mention data until the last two minutes. We talked about business outcomes, told them about some problems and the value of fixing those problems – and then said, ‘By the way, all of those are related to making your data better.’ So ‘business outcomes’ are the most important way of having the data conversation.”

Peter Serenita, group chief data officer at HSBC, agreed, and explained why data professionals struggle to understand why other people don’t “get it”: “Part of the problem is we feel it’s so obvious that we don’t have to communicate it. If there is a takeaway it is simply this: simplify the message. Forget all the things you actually know – in fact, it would be better if you just woke up with amnesia one morning and said, ‘I’m going to communicate outcomes.’”

He added: “It’s a never-ending thing: don’t think it’s ‘once and done’. It’s continual reinforcement of the message.” 

Telling stories

Smith said that “One of the most powerful ways to get across the data story is to tell individual stories that are a minute subset of what we’re trying to achieve. The more of those stories you can collect and bring out at an appropriate point, the easier your communication gets. People latch onto very real things.” For example, RBS managed to pre-call many people during the recent floods and say that if they had an insurance claim to call a particular special line that had been set up overnight. “We knew who to call because all the postcode data was good. Did it make us any more money? No, but it made the customer experience better.”

Smith addressed the particular problem of communicating about regulatory issues: “To make it real to people, have a story that says, ‘This is not about that thing that we’ve all heard about – BCBS 239 – it’s about good data management.’ However, there are 16 or 20 bits of regulation that all demand the same things, really. If you’ve ever taken the time to go through them and decompose them you’ll find they’re all asking for a fairly common set of outcomes. Our policy reflects that but that’s not the way to communicate the demands of regulation. It’s not about regulations, it’s about doing it for positive reasons.”

FCA admits it has "a way to go" in being able to handle OTC trade data
Wed, 26 Nov 2014 12:19:35 GMT
http://www.regulatorychange.com/news/fca-admits-it-has-a-way-to-go-in-being-able-to-handle-otc-trade-data
Regulator's objectives frustrated by volume and quality of data but says recent ESMA consultation is "a good step"

The Financial Conduct Authority (FCA) says that it still has “a way to go” in developing the capabilities to use the huge amount of data generated by the EMIR trade reporting rules.

Speaking on a panel at the recent FIMA Europe conference, Tom Springbett, head of OTC derivatives and post-trade policy at the FCA, set out the regulator’s objectives with regard to OTC derivatives data but admitted, “We are still a fair way from achieving that. The quality of the data isn’t yet good enough.”

He told the panel: “We as regulators need to get better at using the data. We’ve asked for and received an awful lot more data than we’ve previously had, but the fact that the data is also new to us means that we still have a way to go in developing the capabilities to use it.”

Springbett said the regulator’s aim in respect of OTC derivatives was twofold:
  • The next time there is a big financial crisis (“presumably there will be at some point”) the regulator will be able to look at a troubled entity and see which other entities it is exposed to in OTC derivatives. “We’ll be able to look at the whole web of the market and see what the impact of failure or multiple failures would be, with some confidence,” he said.
  • The other objective is to be able to see trouble coming, in advance. “We should be able to keep an eye on the market and to identify things that might be a cause for concern,” Springbett said.

He said that there had been “pretty good compliance and a lot of well-directed effort trying to get consistent reporting” but that there was “still a long way to go”. He added: “We still manage to find lots of problems with the data so there is more for the industry to do and there is also more for the regulators to do.”

He said that regulators needed to clarify what they expect better than they have done so far, adding that the recently-announced ESMA consultation was “a good step in that direction”.

FCA says data issues were “pushed out” during EMIR drafting
Tue, 25 Nov 2014 23:19:09 GMT
http://www.regulatorychange.com/news/fca-admits-data-issues-were-pushed-out-during-emir-drafting
Regulators now fixing EMIR rules because “data stuff” didn’t get prominence it deserved - Data experts need to be "closely engaged"

The Financial Conduct Authority (FCA) has admitted at a conference for data managers that data issues did not get the consideration they deserved when trade reporting regulations were being drafted. Moreover, this failure is reflected in the recent decision by the European Securities and Markets Authority (ESMA) to consult on changes to the trade reporting rules under EMIR, the European Market Infrastructure Regulation.

ESMA said in its consultation: “The practical implementation of EMIR reporting showed some shortcomings and highlighted particular instances where improvements could usefully be made so that the EMIR reports better fulfil their objectives.”

Tom Springbett, head of the FCA’s OTC derivatives and post-trade policy, said in a discussion panel at the recent FIMA Europe conference that data professionals need to make sure they are “closely engaged” with regulators.

He said that there were two lessons to be drawn. “The first was that there was an awful lot going on at that stage [when EMIR rules were being drafted] so we were drafting rules around how much capital central counterparties had to hold and around how we would define what an intragroup transaction was and a whole list of 20 or 30 technical standards,” he said.

“And I think the data stuff maybe didn’t get the prominence it deserves and to some extent we’re fixing the problems that arose because of that now. That’s a lesson for us as regulators.”

There was also a lesson for data professionals: “The engagement that we got at that stage wasn’t as intense as it could have been,” Springbett said. “If we had a big bank coming in to talk to us about EMIR technical standards back in early 2012 then we’d have a slot built in at the end to talk [about data], but it would usually get pushed out because we’d overrun on other things.”

He suggested that the next time there is “a big regulatory initiative around data” that the people working on data within firms should “make sure that their messages are getting adequate prominence.” 

Chris Johnson, head of product management, market data services, at HSBC Securities Services, said that while banks have to, by definition, give the regulators the same answers for each of the trade reporting data fields, “we don’t have a great track record over the last 30 years of banks collaborating over things like [data issues]”. 

He added that if regulators “pick off each firm individually” they will get different answers. What was needed, he said, was “collaboration to get harmonised answers, which is what we all need because there is no commercial benefit in us doing things differently.”

Johnson said that, across multiple regulations such as EMIR, MiFID (Markets in Financial Instruments Directive), AIFMD (Alternative Investment Fund Managers Directive) and Solvency II, there were different ‘asset type’ data fields. “We are grappling with these different needs,” he said. “Why can’t we get a consistent single version for all the regulations that means we can do things once?”

Springbett said that that was “a desirable objective” and that “ESMA is definitely pushing in that direction”. 

“I do certainly sense a greater determination to fix these inconsistencies and to have single standards,” he said, but “it will certainly take a while”. Springbett added that different regulators had different purposes: “If I’m a derivatives markets supervisor looking at MiFID or EMIR trade transaction reports I probably want something rather more granular than if I’m an asset manager supervisor who wants to know what an asset manager’s basic business is. But certainly the regulator community is very much bought into the [single standards] objective and heading in the right direction at least.”

Case study: Shine a light on the facts - data governance lessons from ‘code 15’
Sun, 23 Nov 2014 18:51:48 GMT
http://www.regulatorychange.com/news/shine-a-light-on-the-facts-data-governance-lessons-from-code-15
A case study presented at FIMA Europe showed how a rogue data code was a regulatory and a business issue

Determining the facts about an organisation’s data is essential to bring about a step change in data governance and to ensure compliance with the Basel Committee on Banking Supervision’s BCBS 239, Principles for Effective Risk Data Aggregation and Risk Reporting (PERDARR).

In a short anonymised case study about a bank credit risk modelling group, Jon Asprey, vice president strategic consulting at Harte-Hanks Trillium Software, told the recent FIMA Europe conference in London: “The regulator is guiding, cajoling, and moving you into the right place and really telling you the foundational capabilities, skills and processes they would expect you to have in place to manage your risk data effectively.”

Full documentation, for instance. “How often do we turn up at a client on Day One and say ‘Where’s the documentation?’ – and it’s in people’s heads, it went out the door, it hasn’t been updated for six years,” Asprey said. “But [the regulator] expects things to be fully documented: they expect taxonomies, lineage, naming conventions. They expect you to have established ownership, quality and accountability, and there’s also a big focus on accuracy, completeness and the elements of governance.”

Facts about the data

“To be compliant and to demonstrate the capabilities required by BCBS, all the data management and data quality components that you need to put in place root themselves in the analysis of facts about your data.”

Asprey added: “When you’re trying to establish the culture of governance I often say it’s about trying to get people to be data-aware – not data gurus but conscious of the part that data plays in the business.”

These are some of the high-level steps that need to be taken:
  • Create the knowledge-base: what do we currently understand about our data? That means collating that, documenting it, making it accessible and visible and searchable to the people who need to know.
  • Encourage and drive collaboration. Organisations are siloed, but data-driven processes often have a number of hand-offs: they may start in operations or relationship management, move into IT and warehousing, and then into risk, finance and perhaps customer support – so collaboration across those hand-offs has to be driven.
  • Measure the business impact and value.
  • Determine the process for taking action: who’s responsible? What happens?

There are three tactics for success:
  • Connect the stakeholders.
  • Link the data to the business process so as to be able to assess the impact of the problems and the priorities. “You’ll find 101 problems – the challenge is, which are the important problems and which are the ones you try to fix first?” Asprey said.
  • Connect governance and data quality management to a measurable improvement in the business. Money is spent on tools and resources are devoted to this – but how do we record the payback?

Case study: The mystery of code 15

In the case study example given by Asprey, his firm was working for the credit risk modelling group of a retail/commercial bank. The engagement objective was to identify and interview the stakeholders and build a set of data definitions (what does the data mean, what are the expected values, what’s the English description, what are good and bad values?).

Each of the bank’s customers was assigned a ‘risk category’.
  • The source system owner said that codes 1 to 14 were valid and that there shouldn’t be any blanks.
  • The transformation layer (ie, the data warehousing team) said they ran a script that replaced any blanks with code 1.
  • The model developers said they use all codes that are in the system, with no filters or checks applied. 

It transpired that 125,081 customer files had been assigned a code 15 – a code that supposedly didn’t exist, wasn’t valid and shouldn’t be used.
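The profiling that surfaces a rogue code like this can be sketched very simply: count the observed values of a field and compare them against the documented valid set. The sample records below are invented for illustration; only the valid range (codes 1 to 14, no blanks) comes from the case study.

```python
from collections import Counter

# Valid codes per the source system owner: 1 to 14, no blanks.
VALID_CODES = set(range(1, 15))

def profile_risk_codes(records):
    """Count each observed risk-category code and flag any that fall
    outside the documented valid set."""
    counts = Counter(r["risk_category"] for r in records)
    unexpected = {code: n for code, n in counts.items()
                  if code not in VALID_CODES}
    return counts, unexpected

# Invented sample data; in the case study, 125,081 records carried code 15.
records = [
    {"customer": "A", "risk_category": 3},
    {"customer": "B", "risk_category": 15},
    {"customer": "C", "risk_category": 15},
]
counts, unexpected = profile_risk_codes(records)
print(unexpected)  # {15: 2} - the rogue code and its frequency
```

A factual output like this, rather than anyone's recollection of what the codes should be, is what got the stakeholders around the table in the case study.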

“We used this fact as a way to get those people around the table and say, ‘Well, you’ve all told us a part of the story but when we analysed the data this is what we see – so let’s try to understand actually what’s going on,’” Asprey said.
  • Then the system owner said code 15 was a redundant code that should not be used for any new data; a change request to remove it from the system had been raised but never implemented.
  • The data warehousing people said code 15 was a known issue and a manual recoding process was in place to update the code 15s to whatever code it should be.
  • And the model developers said they deploy a “simplifying assumption”: if the data doesn’t look right they put something else in its place, such as an average value or a default value. 

Simplifying assumptions

“The interesting thing was the head of portfolio analytics who owned the model developers was on a mission to remove simplifying assumptions,” Asprey said. “The regulator was saying you shouldn’t be doing them.”

As one of the senior stakeholders said during the discussions, “What this is doing is turning a data story about code 15 into a business story of people doing extra work and having things that are inaccurate and having invalid data.”

“When we started the process, the model-development team were very uncooperative,” Asprey said. “They didn’t want everybody to know that they were using data that wasn’t very good, they didn’t want people to know that simplifying assumptions were in place – it was just not a healthy environment.”

Now, the stakeholders at the bank are fully on board, they understand that their models are better, they can give better mitigation and explanation to the regulators, and they are in a much more transparent environment. 

“But the interesting thing is, by unearthing the facts we were able to get the dialogue. In terms of business performance the accuracy of the data was improved, the productivity of the agents was improved, the number of adjustments to the data and manual work was reduced, and ultimately for the model development guys, they had better model performance, fewer simplifying assumptions, and a better story for the regulator,” Asprey said.

“That factual analysis was the key driver in getting all these guys around the table, understanding the impact, and agreeing on what the problem was and what plan of action we needed to put in place.”