How could BNY Pershing have unintentionally stored and distributed the wrong interest rates on domestic and international variable rate securities to clients for years?
Easily, if the firm lacked the right checks and balances to ensure data quality, judging by the Financial Industry Regulatory Authority’s account of why it fined the giant clearing firm US$1.4 million in late July. The broker-dealer watchdog did not name any individual culprits but suggested that Pershing’s policies and procedures were flawed. The blunders caused Pershing to violate the regulatory agency’s rules mandating accurate recordkeeping, trade confirmations, and system oversight. FINRA requires member firms to report data in a way that ensures its accuracy, timeliness, and completeness.
The fine amounts to a slap on the wrist because it reflects FINRA’s weighing of mitigating circumstances and remediation, say legal experts. The errors occurred during periods of economic turmoil; there was no intentional wrongdoing; and clients were paid the correct interest rates. Pershing also eventually hired a consultant to improve its data management policies and procedures.
However, regardless of the size of Pershing’s penalty, FINRA has sent clearing agencies and broker-dealers a clear message on their responsibilities to ensure accurate data. “FINRA may impose a higher fine on others for similar failings in the future on the grounds they should have learned from Pershing,” cautions David Adams, an attorney with Mintz Levin in Washington, D.C., specializing in broker-dealer regulation.
Pershing’s snafu seems to have jolted rivals into action. Operations managers at two other clearing agencies and some broker-dealers privately tell FinOps Report that they have been asked to review their data coding procedures. They would not comment on who made the requests but acknowledge that internal data quality audits are in the works. No one wants to incur regulatory fines and reputational risk for needless mistakes.
In Pershing’s case, the wrong interest rates appeared in over one million account statements and at least 200,000 trade confirmations between January 2010 and December 2022. The incorrect information also found its way onto online access portals used by Pershing’s customers and registered representatives at the introducing brokerage firms which use Pershing for clearing services. The inaccurate interest rate information affected current interest rates, accrued interest, estimated annual income and estimated annual yield.
A subsidiary of Bank of New York Mellon, Pershing clears trades for more than 450 introducing firms which interact directly with investors. FINRA’s fine is the second recent whammy BNY has faced involving bad data management. The other is a lawsuit filed by SS&C Technologies in Canada against BNY for violating the terms of its market data contract by distributing the data to CIBC Mellon and dozens of other units without receiving SS&C’s consent and without paying higher fees. As reported by FinOps Report, earlier this year SS&C appealed an Ontario superior court’s ruling for BNY to pay the technology giant US$10 million in damages. SS&C told an Ontario court of appeals it is entitled to US$890 million instead. (FinOps Report, March 6, 2024 article SS&C: $890M At Stake in Legal Tussle With BNY Mellon)
In its documentation of Pershing’s fine, FINRA cites two causes of the data quality mistakes. The first lapse occurred between January 2016 and September 2022, when a third-party vendor used by Pershing did not provide updated interest rates for at least 13,000 foreign variable rate securities. As a result, stale rates were used.
The second lapse involved a coding error in Pershing’s security master system that prevented it from recording a zero percent interest rate on 2,900 US variable rate bonds from 2010 to 2022. Whenever another third-party vendor delivered a zero rate, the security master file instead retained the most recent non-zero figure. The security master file is a firm’s “golden copy,” or Bible, of all the details of any assets traded and cleared. Those details include the category of asset, interest rates, payment dates, issuer and identification codes.
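FINRA does not disclose Pershing’s actual code, but the failure it describes is a classic pattern: update logic that treats a legitimate zero value as “no update.” A hypothetical Python sketch (function and field names are illustrative, not Pershing’s) shows how the bug and the fix differ:

```python
# Hypothetical sketch of the error pattern FINRA describes: a zero rate
# is mistaken for a missing value, so the prior non-zero rate survives.

def update_rate_buggy(master, security_id, new_rate):
    # BUG: "if new_rate:" is False for 0.0, so a zero percent rate
    # never overwrites the most recent non-zero figure.
    if new_rate:
        master[security_id] = new_rate

def update_rate_fixed(master, security_id, new_rate):
    # Skip the update only when the vendor truly sent no value;
    # 0.0 is a valid interest rate and must be recorded.
    if new_rate is not None:
        master[security_id] = new_rate

master = {"BOND1": 4.25}
update_rate_buggy(master, "BOND1", 0.0)
assert master["BOND1"] == 4.25   # stale non-zero rate persists
update_rate_fixed(master, "BOND1", 0.0)
assert master["BOND1"] == 0.0    # zero rate recorded correctly
```

A single targeted unit test distinguishing “zero” from “missing” would have caught this class of error.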
At the core of Pershing’s errors was a fundamentally flawed data governance structure. In an organization as large as the clearing agency, the left hand might not have known what the right was doing. Rules promulgated by FINRA, and the Securities and Exchange Commission require firms to have a system of supervisory controls that identify the responsibilities of each party to ensure compliance with their regulations.
The chain of command for data quality, recommend data management experts, should include representatives from business processes through to management, chief data officers, and any data governance boards, with internal audit as the last line of defense. “Compliance departments, trading desks, vendors and internal audit departments need to work together on market data quality,” Lisa Balter Saacks, president of Trillium Surveyor, a New York-based trade surveillance firm, tells FinOps Report.
As best practice, financial firms also regularly test their applications to ensure that any coding, or changes to code, are accurate. “Data quality testing should also be completed frequently to ensure accuracy,” says Saacks. FINRA’s account notes that Pershing did do some testing, but it was not sufficient to catch all errors.
Large broker-dealers and clearing firms have historically relied on data stewards who work in partnership with business lines to set up a series of rules on how data is populated into a security master database. Those rules spell out how the data is to be cleansed or cross-checked before insertion.
Best practice is to always cross-reference data from two or more vendors or other sources. “Financial firms typically rely on multiple data sources for the same information,” says Jonathan Bloch, chief executive officer of London-headquartered data vendor Exchange Data International. “Taking such an approach allows firms to reconcile any discrepancy to create a golden copy of the data for the security master file to distribute to downstream applications.” Those applications include trade confirmation and customer reporting.
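The cross-referencing Bloch describes can be sketched as a simple reconciliation step: where two vendor feeds agree, the value flows to the golden copy; where they disagree or one is missing, the security is held back as an exception for a data steward. This is a minimal illustration, not any firm’s actual process:

```python
# Illustrative two-vendor reconciliation ahead of the security master.
# Matching values are published; breaks are queued for human review.

def reconcile(vendor_a, vendor_b, tolerance=1e-9):
    golden, exceptions = {}, []
    for sec_id in sorted(vendor_a.keys() | vendor_b.keys()):
        a, b = vendor_a.get(sec_id), vendor_b.get(sec_id)
        if a is not None and b is not None and abs(a - b) <= tolerance:
            golden[sec_id] = a          # vendors agree: safe to publish
        else:
            exceptions.append(sec_id)   # disagreement or gap: hold for review
    return golden, exceptions

golden, breaks = reconcile({"X1": 3.5, "X2": 0.0}, {"X1": 3.5, "X2": 2.1})
# "X1" flows to the golden copy; "X2" is flagged as an exception.
```

A check like this would also surface the zero-versus-stale-rate discrepancy at the point of ingestion rather than on a client statement.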
The data steward position has evolved into the data governance officer job with the expanded role intended to establish a data governance bureaucracy. That bureaucracy has a series of checks and balances to ensure the integrity of the firm’s data and its reputation. The internal audit department is the last line of defense.
“Pershing may have established a data governance structure but may never have revisited the original set of rules or instructions that predated the data governance officer,” speculates Peter Esler, founder of PVE Consulting, a market data advisory and compliance services firm. “Maybe, there was a naive assumption that best practice rules were followed.”
The right quality testing could have caught the two errors Pershing made. “Clearly, it does not appear that an internal audit was conducted to evaluate the original rules,” explains Esler. “The previous rules were also never scrutinized to deal with a zero-rate value or a multi-vendor cross check.”
When dozens of clients notified Pershing of its mistakes, the firm did nothing to fix the underlying causes. It took a Band-Aid approach and corrected data only for the disgruntled clients, without investigating whether there were more widespread problems. “It would have been better if Pershing had discovered the errors itself,” explains Bloch. “The firm should also have immediately investigated the cause of the glitches and not waited until FINRA started its investigation.” It could not be determined who decided to delay taking action. Was it the chief data officer or the data governance committee?
Pershing should also have made a preemptive strike with its regulator. “Firms that detect an anomaly that has broad potential customer impact will typically be proactive and communicate with FINRA,” says Esler. Data operations managers at rival firms tell FinOps Report that Pershing’s chief compliance officer might have first found out about the data glitches from FINRA. However, that stance could not be verified. Pershing is keeping tight-lipped about its mishaps.
Industry speculation abounds that Pershing’s coding managers, operations managers, chief data officer, and chief compliance officer were called on the carpet by senior management to explain just what went wrong. That’s when finger-pointing might have started. It would be easy for a compliance department to plead ignorance if it were not informed of glitches by the chief data officer.
Chances are that if Pershing received calls from disgruntled customers only sporadically over an extended period of time, the data governance committee might not have been sufficiently concerned. FINRA says the clearing firm made the right interest payments, but it cannot be determined whether the payments were corrected only after client complaints or were right all along. It could also not be verified whether Pershing compensated customers if the data errors affected the prices at which customers bought or sold securities. Customers could have bought securities at too high a price and sold at too low a price.
Sooner or later, someone will likely fall on their sword for Pershing’s mistakes. The coding experts are the easiest candidates. However, the onus of preventing data mistakes or fixing them ultimately rests with the data governance committee and its C-level executives, who have a fiduciary obligation to clients. In Pershing’s case one can only wonder whether its chief executive and chief information officer found out about the problems from the chief compliance officer only after FINRA’s investigation began. “We take our regulatory and compliance responsibilities very seriously,” says a statement issued by BNY, adding the bank is pleased to have resolved the matter. It should have done so a long time ago on its own.