Data utility — a buzzword that has cropped up sporadically in the data management arena and is now making a comeback, albeit in diverse flavors.
The latest wave of incarnations: a reference data utility launched by IBM and GoldenSource, and a know-your-customer (KYC) data utility disclosed by global messaging network SWIFT. Their unveilings earlier this year followed yet another announcement in late 2013 from US market infrastructure Depository Trust & Clearing Corp. (DTCC) of the potential creation of a data utility specializing in counterparty data. SmartStream Technologies, for its part, went live with a reference data utility for equities, fixed-income securities, and exchange-traded derivatives in 2009, and added new-issue Eurobonds in affiliation with Euroclear in 2013.
Their arrival is opportune, say data management experts, as the need to reduce costs and meet regulatory pressures is prompting a greater focus on the merits of outsourcing the data management process. Why do all the work in-house to collect data from an array of sources, ensure its accuracy and distribute a clean copy downstream if someone else can do it?
Data can be viewed as a proprietary — and competitive — asset depending on the type of information involved and a firm’s philosophy. However, financial firms understand that doing all of the work on so-called commoditized data to ensure accurate information flows through a multitude of internal applications for trading, risk management, valuation, and even clearance and settlement is a thankless, time-consuming and often cost-prohibitive task.
The reason: multi-tasking middle- and back-office operations experts, who might not even understand all of the data nuances and usage requirements for each asset class or application, are performing the exact same cumbersome chores multiple times for different business lines, with no guarantee they will come up with the same results. Data rules, data models, and even data sources could differ, leading to discrepancies and potential financial risk in the form of incorrect portfolio valuations; market, counterparty and credit risk; erroneous regulatory and investor reports; and even missed clearing and settlement timetables.
The question then arises: is a data utility the answer, and how exactly does that business model differ from a managed service?
Unclear Definitions
“The terminology is confusing to many, because both data utilities and managed services say they offer a one-to-one approach to clients,” explains Brian Sentance, chief executive of Xenomorph, a New York-based software firm specializing in cleansing risk management data. “All financial firms ideally want a client-customized solution that also delivers the cost benefits of outsourcing. The question is, are those two goals achievable at the same time?”
Historically, managed services have worked on a one-to-one basis with each client, whereas utilities have taken more of a one-to-many approach, say several data management specialists at banks and fund management shops who spoke with FinOps Report. Ownership and access could also differ. Regardless, service providers will now need to strike a balance between common and bespoke services.
“A data utility isn’t one-size-fits-all,” insists Christopher Riggs, a partner in the financial services practice of IBM in New York. Case in point: its new service with GoldenSource comes in three variations, with IBM responsible for hosting a global SmartCloud for exception management and data integration and GoldenSource providing an EDM application platform with market data connections and cleansing capabilities. Users can rely on receiving a feed of cleansed reference data processed by IBM and sourced from third-party vendors, while retaining their in-house enterprise data management platforms and data integration technologies; alternatively, IBM can lift out their existing EDM platforms and handle data cleansing and integration with its own staff. In yet a third, hybrid model, a client could keep a local in-house copy of the data with extra attributes — a scenario that could appeal to hedge funds that don’t want to reveal their specific securities of interest and hence their possible trading strategies.
GoldenSource and IBM’s partnership isn’t the first time the concept of a commercially run data utility has appeared. Similar initiatives were undertaken by GoldenSource rival Asset Control in combination with global consultancy Accenture in the reference data space, and by IBM itself, which purchased a reference data management business from Dresdner Bank about a decade ago. Neither mustered sufficient client volume, say several data management experts. Yet another venture, between GoldenSource and Broadridge Financial in 2007, did nab a few clients, but when Broadridge purchased buy-side technology provider Paladyne in 2011, it opted to leverage that firm’s data management capabilities instead.
So just what’s new now? GoldenSource and IBM are betting on two factors: a more flexible business model and, even more so, C-level interest in outsourcing data management to ensure cost reduction. Timing is everything, they believe.
“The issue of data management has now become elevated to the C-level with new sponsors,” says Mike Meriton, vice chairman of GoldenSource. “No longer do such programs need to be sponsored by the operations and technology folks who run them.”
While neither GoldenSource nor IBM would provide even an inkling of cost savings, three data management experts at large global banks who spoke with FinOps say that, as a rule of thumb, outsourcing data-cleansing procedures could easily generate savings of between 10 percent and 20 percent in operating costs, depending on the number of data vendors, applications and types of data involved. Additional savings may come from renegotiated data vendor contracts. However, large-scale installations could also introduce data lags that slow time to market and pose integration challenges, so the benefit is not guaranteed.
Competition Abounds
GoldenSource and IBM aren’t the only ones counting on a utility model for reference data management to be a cash cow. So is incumbent SmartStream, which touts the merits of its offering as the first “real data management utility.” It doesn’t operate under the guise of a managed service, claim company officials. At the core of SmartStream’s service is the concept that, while clients’ consumption requirements may vary and change over time, there are common elements in the data points needed to identify financial instruments and the processes to cleanse and normalize them.
While not disclosing either the number of users or any names, SmartStream officials claim their data utility goes a step further than other commercial rivals. “To maximize economies of scale, a data utility must promote standardization through mutualized services,” says Adam Cottingham, SmartStream’s vice president of data management services in London. “By offering access to primary sources and aggregated vendor sources along with common rules libraries, SmartStream allows financial firms to take a new approach to solving data quality and timeliness issues.”
In the case of new-issue Eurobonds, the data utility receives information on the debt instruments from syndicate desks through Euroclear, well known for its settlement of Eurobond transactions, as soon as an issuance is made public. SmartStream, says Cottingham, will turn the data around within 15 minutes, enriching it against users’ own requirements and forwarding it to either Euroclear’s members or SmartStream’s direct clients in the format they want. The result: Eurobond traders can trade a new issue immediately in the secondary market, because they don’t have to wait until the end of the day for the necessary information.
Just as important is what happens to data after it is delivered to customers. “Simply dumping a datafeed alongside another source and calling it a golden copy will not help business lines solve the issues related to their data management and reduce cost,” insists Cottingham.
In SmartStream’s version of a data utility, the firm is also responsible for tracking the entire lifecycle of a financial instrument, including how it changes against customer-defined sources. SmartStream will cross-reference the information with market-driven events — such as corporate actions — and changes to pricing models. Such a scenario ensures that data is enriched in accordance with how it will be consumed in the trade lifecycle. What’s more, SmartStream will also provide an audit trail of any changes made by operations executives at the client firm.
“The cost benefits to a customer are far greater using our data management process versus a managed service which claims to be a data utility,” claims Cottingham. “Mutualized processing approaches that run in conjunction with customer specifications drive down costs. Economies of scale can be achieved in producing measurable data improvements rather than being limited to a headcount-reduction model aligned with application hosting and proprietary processes.”
Though it will not name numbers or users, SmartStream insists it is offering a true multi-tenant platform. So do GoldenSource and IBM: clients have some say over data-cleansing rules while gaining the scale of a utility service, they counter. Rhetoric aside, it is difficult to distinguish their respective merits with any certainty, acknowledge data management specialists consulted by FinOps Report.
In the realm of customer or counterparty information, the divergence among offerings seems a little clearer, although still conceptual. SmartStream’s data utility includes legal entity information, but its approach is different from that adopted by DTCC, says Cottingham. Instead of viewing legal entity data on an internal KYC basis, SmartStream is adopting a market-event-driven approach to associate legal entity codes and their reference data hierarchies.
DTCC has said that its counterparty data utility can help financial firms comply with KYC, the US Foreign Account Tax Compliance Act and related customer onboarding rules. The counterparty data will also be enriched with the Omgeo Alert database of standing settlement instructions.
At the time of its announcement last October, DTCC cited some large banks — including Barclays, Credit Suisse, Goldman Sachs and JPMorgan Chase — as helping with the initiative. DTCC’s subsidiary Avox will provide data scrubbing and validation while the legal entity identification codes for players in the swaps market will be generated through the CFTC Interim Compliance Identifier (CICI) utility that DTCC runs in partnership with SWIFT.
Ownership differences aside, DTCC’s data utility is likely to be complementary to SmartStream’s. However, at this point it is difficult, if not impossible, to analyze how DTCC’s plans compare to SWIFT’s KYC service. Questions remain about DTCC’s fee structure and who will be allowed to access information in the data utility.
SWIFT insists that the two utilities differ in large part because SWIFT’s will, for starters, benefit correspondent banks — and is not securities-related. In a statement on its KYC Registry, SWIFT cited Bank of America Merrill Lynch, Citi, Commerzbank, JP Morgan, Societe Generale and Standard Chartered as initial supporters, but expects more to join the bandwagon. “Users will have a standardized access point to obtain details on their counterparties, while retaining ownership of their own information and control over which institutions can view it,” said the statement.
Yet the impetus for both seems to be regulatory. Securities watchdogs won’t hesitate to fine large banks for doing business with the wrong people and countries. The current rules aimed at preventing tax evasion and money laundering require financial firms to obtain enough information about new and current customers and counterparties. Ensuring that information remains accurate is challenging, because such data can become stale quickly.
One counterparty data specialist at a large US broker-dealer predicts there will be substantial overlap in the type of information DTCC and SWIFT offer and in where it is sourced from — the users of the services themselves. “It appears to be a case of leveraging some of the same data, rather than outright competition,” says one New York broker-dealer operations executive.
For now, large global banks and broker-dealers appear to be the first customers of any type of outsourced data management service, and for good reason. They are likely the largest consumers of data and spend the most on managing it, given the vast number of applications supporting their diverse business lines. But where the sell-side goes, the buy-side often follows, particularly as trading strategies become more sophisticated in the pursuit of alpha.
What’s gained by a financial firm using a data utility, and whether the utilities themselves — especially the commercial ones — can survive, are still matters of debate. Well-funded initiatives by brand-name providers have failed in the past, and it remains to be seen whether the timing has now changed their prospects for success.
Yet the underlying concerns remain the same. “There are a lot of words being thrown around, but that stance misses the critical points: who will set the rules, who will have access, and whether there will be sufficient cost savings,” says Alan Paris, a partner with outsourcing specialist firm eClerx in New York. “That’s what ultimately matters.”