Rates & Repo Preview: Repo Economics and Settlement Venues

Globally, interest in repo CCPs is growing, to some extent because of regulatory actions. But how does it all fit together when considering both traditional and newly emerging business models? Ahead of our upcoming Rates and Repo conference, we speak with our expert panelists about exciting new prospects as repo modernizes, as well as warnings to manage expectations between buy- and sell-sides.

In the repo world, one major theme is the continued progression of trading models and a “mutually beneficial evolution” of values between the sell- and buy-sides. That is moving the market beyond traditional uncleared repo against a dynamic economic backdrop, said Jeff Sowell, repo trading and product development for Financing and Collateral Solutions at State Street.

Sowell points to the success of FICC’s sponsored repo in the US as an example. State Street was among the first sponsors to join and remains one of the largest participants. With banks increasingly providing access, Sowell noted that the team differentiates itself by including as many jurisdictions and entity types as possible under appropriate governance; by providing access to late-day liquidity, even after the Fed’s reverse repo facility closes, particularly for cash investors; and by offering ancillary services beyond repo.

Given the increased workload on the front office, there’s a lot of value that banks can provide clients, Sowell said, listing services such as standardized legal agreements, access to multiple counterparties in one set of documents, reduced counterparty risk, electronic trading, straight-through processing on the back end, collateral management and regulatory reporting.

State Street has also been at the forefront of developing peer-to-peer repo, which he noted needs to be more than a trading-only solution if it is going to grow into a scalable, self-sustaining, liquid marketplace: “In order for repo models to continue to progress, it requires a partnership between the buy-side and the sell-side to mutually drive these models forward. The sell side can’t create a solution in a vacuum, it has to work for the buy side, and the buy-side cannot just conceive demands and expect that the sell-side can accommodate it.”

Triparty’s unfinished business

The traditional business model of triparty is one area getting a significant shake-up from digitalization trends. Triparty has been a huge driver of efficiency in the financial system because it provided a collateralized construct without a lot of operational overhead, but the next level of evolution comes when clients can optimize collateral across multiple triparty agents and other venues, like CCPs, said Bimal Kadikar, CEO of Transcend.

Kadikar also noted that interoperability across triparty agents has been a huge client demand for a long time but has not happened yet due to a variety of factors. It’s not clear that these issues can be solved directly by triparty agents, and consequently clients still need to address them. That means newer entrants such as tech platforms or DLT-based players will be expected to address that gap, which could potentially impact business models.

Transcend is helping clients optimize collateral across triparty agents, using innovative optimization algorithms and directing the movement of assets across agents through clients’ operational infrastructure to achieve enterprise-level efficiencies.

Among triparty agents, there’s headway on how digitalization and related technologies will help. But the bottom line, Kadikar said, is that the required level of data and technology integration, a “cohesive ecosystem”, does not exist today, even as the need grows more urgent for sophisticated cheapest-to-deliver execution that incorporates more advanced factors, such as customer versus firm collateral and the liquidity coverage and net stable funding ratios (LCR/NSFR).

“(The) industry has been asking triparty agents to be interoperable for years. A lot of current discussions indicate that Blockchain and DLT will make it easier, but I am not convinced,” said Kadikar, adding that the interoperability problem is solved by clients today but somewhat inefficiently. Moreover, ESG integration can be expected to gradually become an important demand, and while there is an early response from triparty agents, it remains simplistically focused on exclusion lists.

At the same time, there’s yet unfinished business with more vanilla technology, such as APIs: “APIs continue to be a prominently important and critical topic, it just needs the right level of investment from each of the triparty agents for it to work to the right level. And it will be important whether you (have) traditional assets or tokenized assets.”

Integrated verticals and buy-side repo clearing

While triparty has seen a surge of interest from Uncleared Margin Rules, another major development gaining steam this year is non-bank participants (pension funds, insurance companies, asset managers, hedge funds) trying to access centrally cleared repo markets in Europe, a trend that might be as much as a decade behind the US, said Frank Odendall, head of Securities Financing Product and Business Development at Eurex.

“All these different buy-side entities suddenly have shown strong interest to connect,” he said, identifying factors behind this such as interest rate volatility, economic shocks associated with energy markets and the Russia-Ukraine conflict, and inflationary pressures.

Eurex runs a repo trading venue, Eurex Repo, and a central clearing facility, Eurex Clearing. In addition, Clearstream, the ICSD and settlement venue for Deutsche Börse, facilitates single-ISIN repo and triparty repo as well as collateral management for repo trading and margining. This integrated model has a lot of efficiency benefits, he said.

In July, Eurex extended the repo clearing service to leveraged funds, and now has 11 buy-side firms representing about $1 trillion in assets on its cleared repo model, called ISA Direct. By comparison, FICC’s sponsored model, live since 2005, has some 1,900 buy-side legal entities.

Some firms stay away from cleared repo because of haircuts and margin requirements. But that simplistic calculation does not take into account extra services like access to intraday liquidity management, a topic that is going to become more prominent as the US moves to T+1 settlement and firms become even more challenged to raise cash on a compressed settlement cycle, Odendall noted.

“We provide, through the integrated system of trading and clearing and interlinked with the settlement locations, a mechanism to address one of those key challenges,” Odendall said. “That feature [same day settlement] is used every day significantly to raise money or place money…we settle billions every day in 30 minutes.”

Guaranteeing repo across the pond

Earlier this year, Bloomberg, Euroclear and Sunthay announced an initiative to launch guaranteed repo in the US, which combines two ideas: bank balance sheet relief and an alternative to how US repo settlement works today. It can be compared to indemnification in a peer-to-peer model.

Shiv Rao, chair of Sunthay, said that the model was some six years in the making, with early versions arising during his time at Barclays and Wells Fargo. This latest guaranteed repo venture aims to “industrialize” the early structures he had previously developed, explained Rao. He further noted that Bloomberg’s and Euroclear’s involvement is central to creating scalability for the model.

He is keen to note that the Securities and Exchange Commission’s proposed rule for mandated clearing does not extend to guaranteed repo. Rao believes that the model reduces systemwide leverage and addresses contagion and concentration risks through a market-developed global solution, and that public policy goals to enhance resilience are furthered by excluding guaranteed repo transactions from clearing mandates.

“Euroclear is offering a service that brings much of the functionality of triparty to the bilateral DVP settlement market. Guaranteed repo reduces systemwide intraday liquidity demands and cash and collateral movements, which reduces costs for everyone, but also offers an alternative to the concentration that happens in Bank of New York,” said Rao. “Additional solutions are good for the market.”

Shiv, Frank and Jeff will be joining colleagues from DTCC and BNY Mellon on the panel “Repo Economics and Settlement Venues” at Rates and Repo, where they will discuss these and other major market trends. Transcend’s Bimal will be joined by experts from BNY Mellon, J.P. Morgan and Pirum for a panel discussion on “Developing the Triparty Business Model”. Rates & Repo is a conference for cash investors, dealers, market intermediaries, technology firms and other service providers.

Original publication: Finadium Rates and Repo 2022

Collateral Management Technology Vendor Survey 2021 From Finadium Features Transcend

Finadium featured Transcend in a new survey on Collateral Management Technology Vendors in 2021. The survey presents an inside look at the technology vendors who are leading the future of collateral technology – and the incredible feats clients can accomplish with them.

Finadium profiles Transcend as a solution to manage collateral, funding, and liquidity within distinct business lines and across the enterprise. By connecting data and processes across disparate systems, Transcend’s holistic solutions help clients run sophisticated analytics, optimization and automation.

“Transcend was purpose-built to provide the most advanced post-trade collateral optimization capabilities in the industry.”

– 2021 Finadium Collateral Management Technology Vendor Survey

Finadium subscribers can download the survey to learn more about Transcend’s role in driving more effective collateral management and collateral optimization, as well as some new functionality recently added to the Transcend platform.

Learn More About Transcend

Transcend empowers financial institutions to maximize enterprise-wide financial performance and operational efficiency. Through real-time global inventory and collateral management and optimization solutions, Transcend helps clients manage intraday liquidity, funding and regulatory requirements. With seamless workflows that connect front office decision-making with back office operations, Transcend’s innovative technology promotes smarter investment decisions and improved financial performance.

Contact the Transcend team for more information on our fully integrated suite of solutions.

Connected Data: The Opportunity for Collateral and Liquidity Optimization

The function and definition of collateral and liquidity optimization have continued to expand from their roots in the early 2000s. Practitioners must now consider the application of connected data on security holdings to operationalize the next level of efficiency in balance sheet management. A guest post from Transcend.

The concept of connected data, or metadata, in financial markets can sound like a new age philosophy, but really refers to the description of security holdings and agreements that together deliver an understanding of what collateral must be received and delivered, where it originated and how it must be considered on the balance sheet. This information is not available from simply observing the quantity and price of the security in a portfolio. Rather, connected data is an important wrapper for information that is too complex to show in a simple spreadsheet.
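
As a minimal sketch, and purely as an assumption about what such a wrapper might contain, the connected-data idea can be pictured as a position record that carries its own context. All field names here are illustrative, not a real schema:

```python
from dataclasses import dataclass

# A hypothetical "connected" position: the bare (quantity, price) view plus
# the metadata that determines how it can be used and how it hits the books.
@dataclass
class ConnectedPosition:
    isin: str
    quantity: int
    price: float
    ownership: str            # "firm" or "client": drives balance sheet treatment
    source_system: str        # where the record originated (repo, prime brokerage, ...)
    agreement_id: str         # governing CSA / GMRA / MSLA reference
    encumbered: bool = False  # already pledged elsewhere?
    expected_tenor_days: int = 0  # influences LCR/NSFR treatment

    @property
    def market_value(self) -> float:
        return self.quantity * self.price

pos = ConnectedPosition("US91282CJK11", 1_000_000, 0.985,
                        ownership="client", source_system="prime_brokerage",
                        agreement_id="CSA-4711", expected_tenor_days=30)
print(pos.market_value, pos.ownership)  # 985000.0 client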

Earlier days of optimization meant ordering best-to-deliver collateral in a list, or creating algorithms based on Credit Support Annexes and collateral schedules. These were effective tools in their day and were appropriate for the level of balance sheet expertise and technology at hand; some were in fact quite advanced. These techniques enabled banks and buy-side firms to take advantage of best pricing in the marketplace for collateral assets that could be lent to internal or external counterparties. Many of these techniques are still in use today. While they deliver on what they were designed for, they are fast becoming outmoded. Consequently, firms relying on these methodologies struggle to drive further increases in balance sheet efficiency, and may need to charge higher prices to maintain financial performance targets. This is not a sustainable strategy.
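
A toy version of that first-generation approach, assuming a hand-maintained preference order and a simple eligibility set; assets, values and scores are invented for illustration:

```python
inventory = [
    # (asset, available market value, delivery-preference score: lower = deliver first)
    ("CORP-BBB", 40.0, 1),   # lower-quality bonds go out first, HQLA stays in-house
    ("AGENCY",   60.0, 2),
    ("UST-10Y",  80.0, 3),
]
eligible = {"CORP-BBB", "AGENCY", "UST-10Y"}   # from the CSA collateral schedule

def allocate(requirement: float):
    """Greedy fill: walk the ordered list and take eligible assets until covered."""
    allocation, remaining = [], requirement
    for asset, value, _ in sorted(inventory, key=lambda r: r[2]):
        if asset not in eligible or remaining <= 0:
            continue
        take = min(value, remaining)
        allocation.append((asset, take))
        remaining -= take
    if remaining > 0:
        raise ValueError("insufficient eligible collateral")
    return allocation

print(allocate(75.0))   # [('CORP-BBB', 40.0), ('AGENCY', 35.0)]
```

The whole decision lives in one sort key, which is exactly why this style of optimization tops out: it cannot see what the asset is doing for the rest of the balance sheet.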

The next level of collateral optimization considers connected data in collateral calculations. Interest is being driven by better technology that can more precisely track financial performance in real time. A finely tuned understanding of the nature of individual positions and how they impact the firm can in turn mandate a new kind of collateral optimization methodology, one that structures cheapest-to-deliver selection on a combination of performance-impacting factors and market pricing. This gives a new meaning to “best collateral” for any given margin requirement. This only becomes possible when connected data is integrated into the collateral optimization platform.
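
A sketch of what a multi-factor cheapest-to-deliver score could look like; the factors, weights and numbers are assumptions for illustration, since in practice they come from the firm's own treasury and balance sheet models:

```python
# Market price is no longer the only input: add the liquidity-ratio impact of
# losing the asset, penalize using client assets, and credit internalization.
def delivery_cost(market_funding_bps: float,
                  lcr_impact_bps: float,
                  ownership: str,
                  internalization_benefit_bps: float = 0.0) -> float:
    client_penalty = 10.0 if ownership == "client" else 0.0  # hypothetical weight
    return (market_funding_bps + lcr_impact_bps + client_penalty
            - internalization_benefit_bps)

# Same market price, different connected data -> different "best collateral":
print(delivery_cost(5.0, 15.0, "firm"))                                   # 20.0
print(delivery_cost(5.0, 0.0, "firm", internalization_benefit_bps=8.0))   # -3.0
```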

As an example of applying connected data, not all equities are the same on a balance sheet. A client position that must be funded has one implication while a firm position has another; both bring a funding and liquidity cost. A firm long delivered against a customer short is internalization, which has a specific balance sheet impact. Depending on balance sheet liquidity, this impact may need additional capital to maintain. Likewise, the expected tenor of a position will impact liquidity treatments. A decision to retain or hold these different assets as collateral can in turn feed back to the Liquidity Coverage Ratio, Leverage Ratio and other metrics for internal and external consumption.

If these impacts can be observed in real-time, the firm may find that internalizing the position reduces balance sheet but is sub-optimal compared to borrowing the collateral externally. This of course carries its own funding and capital charges, along with counterparty credit limits and risk weightings in the bilateral market. These could in turn be balanced by repo-ing out the firm position, and by tenor matching collateral liabilities in line with the Liquidity Coverage Ratio and future Net Stable Funding Ratio requirements. Anyone familiar with balance sheet calculations will see that these overlapping and potentially conflicting objectives may result in decisions that increase or decrease costs depending on the outcome. By understanding the connected data of each position, including available assets and what needs to be funded, firms can make the best possible decision in collateral utilization. Importantly, the end result is to reduce slippage, increase efficiency, and ultimately deliver greater revenues and better client pricing based on smarter balance sheet management.

Another way to look at the new view of collateral optimization is as the second derivative. The first derivative was the ordering of lists or observation of collateral schedules. The next generation incorporates connected data across collateral holdings and requirements for a more granular understanding of what collateral needs to be delivered and where, and how this will impact the balance sheet and funding costs. It has taken some time to build the technology and an internal perspective, but firms are now ready to engage in this next level of collateral sophistication.

Implementing technology for connected data in collateral and liquidity

A connected data framework starts with assessing what data is available and what needs to be tagged to inform the next level of information about collateral holdings. This process is achievable only with a scalable technology solution: it is not possible to manage this level of information manually, let alone in real time for decision-making. Building out a technology platform requires careful consideration of the end-to-end use case. If firms get this part right, they can succeed in building out a connected data ecosystem.
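
As a sketch of that first assessment step, under the assumption of a small set of required tags (the tag names here are invented), a coverage check can show exactly which records are not yet usable for optimization:

```python
REQUIRED_TAGS = {"ownership", "agreement_id", "settlement_location", "tenor"}

def assess_coverage(records: list) -> dict:
    """Return, per record id, the connected-data tags still missing."""
    gaps = {}
    for rec in records:
        present = {k for k, v in rec.items() if v is not None}
        missing = REQUIRED_TAGS - present
        if missing:
            gaps[rec["id"]] = sorted(missing)
    return gaps

raw = [
    {"id": "T1", "ownership": "firm", "agreement_id": "GMRA-9",
     "settlement_location": "DTC", "tenor": 7},
    {"id": "T2", "ownership": None, "agreement_id": "CSA-4711",
     "settlement_location": None, "tenor": 30},
]
print(assess_coverage(raw))   # {'T2': ['ownership', 'settlement_location']}
```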

The connected data project also requires access to a wide range of data sources. Advances in technology have allowed data to be captured and presented to traders, regulators, and credit and operations teams. But right now, most data are fragmented, looking more like spaghetti than a coherent picture of activity across the organization. To be effective, data needs to flow from the original sources and be readable by each system in a fully automated way.
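
Hypothetically, and with all field names invented, the "readable by each system" requirement is usually met with a thin adapter layer: every upstream source keeps its own format, and one translation function per source produces a common schema downstream:

```python
def from_repo_system(rec: dict) -> dict:
    return {"isin": rec["security"], "qty": rec["nominal"],
            "source": "repo", "entity": rec["book"]}

def from_derivatives_system(rec: dict) -> dict:
    return {"isin": rec["underlying_isin"], "qty": rec["notional_units"],
            "source": "otc", "entity": rec["legal_entity"]}

ADAPTERS = {"repo": from_repo_system, "otc": from_derivatives_system}

def normalize(source: str, rec: dict) -> dict:
    """One common schema, whatever system the record came from."""
    return ADAPTERS[source](rec)

print(normalize("repo", {"security": "DE0001102580", "nominal": 5_000_000,
                         "book": "FRA-RATES"}))
```

The untangling happens once, at the edge, rather than in every consuming system, which is what turns the spaghetti into a coherent picture.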

Once a usable, tagged data set has been established, it can be applied to collateral optimization and actionable results. This can include what-if scenarios, algorithmic trading and workflow management, extending further to areas like transfer pricing analytics. Assessing and organizing the data, then tagging it appropriately, can yield broad-ranging results.

Building out the collateral mindset

An evolution in the practice of collateral optimization requires a more holistic view of what collateral is supposed to do for the firm and how to get there. This is a complex cultural challenge and is part of an ongoing evolution in capital markets about the role of the firm, digitization and how services are delivered. While difficult to track, market participants can qualitatively point to differences in how they and their peers think about collateral today versus five years ago. The further back one looks, the greater the change, which naturally suggests challenges when looking at a possible future state.

An important element to developing scalable collateral thinking is the application of technology; our observation is that technology and thinking about how the technology can be applied go hand-in-hand. As each new wave of technology is introduced, new possibilities emerge to think about balance sheet efficiency and how services are delivered, both internally and to clients. In solving these challenges for our clients with our technology, we have seen that a new vision is required before a technology roadmap can be designed or implemented.

The application of connected data for the collateral market is one such point of evolution. Before connected data were available on a digitized basis, collateral desks relied either on ordered lists or individual, manual understandings of which positions were available for which purposes. There was no conversation about the balance sheet except in general terms. Now, however, standardized connected data means that every trading desk, operations team and balance sheet decision maker can refine options for what collateral to deliver based on the best balance sheet outcome in near real-time. New scenarios can be run that were never possible before, and pricing for clients can be obtained in time spans that used to take hours if not a day or more.

Now that collateral optimization based on connected data is available, firms must think about which services they can deliver to clients on an automated basis, and which should be bundled versus kept disaggregated. As new competitors loom in both the retail and institutional space, these sorts of conversations, driven by technology and collateral, become critical to the future of the business. Connected data is leading the way.

This article was originally published on Securities Finance Monitor.


Finadium report on ISDA’s Common Domain Model and the Digitization of Collateral

Finadium recently spoke to Bimal Kadikar, CEO of Transcend, regarding the adoption of ISDA’s Common Domain Model (CDM) by market participants. Finadium’s new report, published by Josh Galper, Managing Principal, evaluates the role of CDM in solving business problems for collateralized trading markets and its potential to standardize data elements across the derivatives lifecycle. Bimal commented on the pace of industry adoption:

“Firms can migrate to CDM on their own schedules. It’s not like blockchain where the entire industry may need to switch over at the same time. Firms can also pick parts of CDM when they are ready for digitization at different points. This will help firms take advantage.” 

Access Finadium’s full report: The Common Domain Model: A Kickoff for Digitization in Collateral at Last?

In five years, 90% of funding will be done by machines

You may disagree with the number of years or the percent, but everyone understands that automation in the funding and collateral space is occurring at a fast pace. The question is how to prepare for this inevitable future. Our view is that connecting data from disparate sources is the key to the next evolution in the funding markets. A guest post from Transcend.

Who in the capital markets industry isn’t seeking greater profitability or returns? From balance sheet pressures and competitive dynamics to the resources demanded by regulatory compliance, focusing on transformative change to advance the firm has been a huge challenge. At the same time, technology is evolving at a rapid pace and the availability of structured and unstructured data is presenting a whole new level of opportunities. For firms to realize this opportunity, connecting disparate data and adopting smart algorithms across the institution are a critical part of any strategy.

Advances in technology have allowed data to be captured and presented to traders, credit, regulators, and operations. But right now, most data are fragmented, looking more like spaghetti than a coherent picture of activity across the organization. Individual extracts exist that sometimes cross silos, but more often cannot be reconciled across sources or users. To be effective, data needs to flow from the original sources and be readable by each system in a fully automated way. It does not matter if individual systems are old or new, in the cloud or behind firewalls, from vendor packages or in-house technology: they all have to work together. We call this connected data.

Businesses have understood for some time that this will require growth of automation, which will be a critical driver of success. Banks and asset managers know that they have to do something: doing nothing is no option at all. Machine learning and artificial intelligence are part of the solution, and firms have embarked on projects large and small to enable automation under watchful human eyes. The new element to consider in the pace of change is the ability of machines to connect, process and analyze data within technology platforms for exposure management, regulatory reporting and pricing. The more data that feeds into technology on the funding desk, the more that automated decision-making can occur.

While individual systems and silos can succeed on their own, a robust and integrated data management process brings the pieces together and enables the kinds of decision-making that today can only be performed by senior finance and risk managers. Connected data is therefore possibly the most important link between automation and profitability. It is a daunting task to consider major changes to all systems that are in play, but most firms are adopting a strategy to build a centralized platform that brings data from multiple businesses and sources. A key benefit of this strategy is that advances in technology and algorithms can be applied to this platform, enabling multiple businesses or potentially the whole enterprise to benefit from this investment.

The risk of inaction

Connected data can stake its claim as the next major competitive advantage in the markets. Like algorithmic trading and straight-through processing, which were once novelties and are now taken for granted, the build-out of a connected data architecture combined with the tools to analyze data will initially provide some firms with an important strategic advantage in cost and profitability management.

With all the talk about data, there is an important human element to what inaction means. In a data-driven, technology-led world, having more or all the right people will not stop a firm from being left behind, and in fact may become a strategic disadvantage. The value of automation is to identify a trade opportunity based on its characteristics, the firm’s capital and the current balance sheet profile. Humans cannot see this flow with the same speed as a computer, and cannot make as fast a decision on whether the trade is profitable from a funding and liquidity perspective. While the classic picture of a trader shouting across a room to check whether a trade is profitable makes for a good movie scene, it is unwieldy in the current environment. A competitor with connected data in place can make that decision in a fraction of the time and execute the trade before the slower firm has brought the trade to enough decision-makers to move forward.

The competitive race towards connected data means that firms with more headcount will see higher costs and less productivity. As firms with efficient and automated funding decision tools employ new processes for decision-making, they will gain a competitive advantage due to cost management, and could even drive spread compression in the funding space. This will put additional pressure on firms that have stood still, and is the true danger of inaction at this time.

Action items for connected data

Data is only as good as the reason for using it. Firms must embark on connecting their data with an understanding of what the data are for, also called foundational functionality. This is the initial building block for what can later become a well-developed real-time data infrastructure.

Each transaction has three elements: a depository ladder for tracking movements by settlement location; a legal entity or trading desk ladder; and a cash ladder. Each of these contains critical information for connecting data across the organization. If your firm has a cross-business view of fixed income, equities and derivatives on a global basis, then you are due a vacation. We have not yet seen this work completed by any firm, however, and expect that this will be a major focus for banks through 2019 and 2020.
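
A minimal sketch of those three ladders, with one booking updating all of them; the structure is the point, and every field name and figure is illustrative:

```python
from collections import defaultdict

depository_ladder = defaultdict(float)   # (settlement location, isin) -> quantity
entity_ladder     = defaultdict(float)   # (legal entity/desk, isin)   -> quantity
cash_ladder       = defaultdict(float)   # (entity, currency, date)    -> amount

def book_repo(isin, qty, cash, ccy, entity, depot, value_date):
    """Deliver securities out of a depot, receive cash: one trade, three ladders."""
    depository_ladder[(depot, isin)] -= qty
    entity_ladder[(entity, isin)]    -= qty
    cash_ladder[(entity, ccy, value_date)] += cash

book_repo("US91282CJK11", 1_000_000, 985_000.0, "USD",
          entity="US-BD", depot="FED", value_date="2019-06-03")
print(dict(depository_ladder), dict(cash_ladder))
```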

Ultimately, an advanced data infrastructure must provide and connect many types of data in real-time, such as referential data, market data, transactions and positions. “Unstructured” data, such as agreements and terms, capital and liquidity constraints, and risk limits, must also be available more broadly for better decision-making, despite their tendency to be created in some specific silo. But an important early step is ensuring visibility into global, real-time inventory across desks, businesses, settlement systems and transaction types; this is critical to optimize collateral management. Access to accurate data can increase internalization and reduce fails, cutting costs and operational RWA. This is especially important for businesses that have decoupled their inventory management functionality over time, for example, OTC derivatives, prime brokerage and securities financing. Likewise, the ability to access remote pools of high-quality assets, whether for balance sheet or lending purposes, can have direct P&L impacts.

Step two is the development of rules-based models to establish the information flows that are critical to connecting data across a firm while simultaneously optimizing businesses at the book, business entity and firm levels. The system must understand a firm’s flows and which variables need to be monitored and controlled within a business line and across the firm. Data will push in both directions, for example to and from regulatory compliance databases or between settlement systems and a trader’s position monitors. Rules-based systems simplify and focus what is otherwise a very complex set of inter-related and overlapping priorities (see Exhibit 1).
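
As a sketch of the rules-based idea, a rule can be as small as a predicate plus a destination, evaluated over normalized records; the rules and system names below are invented for illustration:

```python
RULES = [
    # (name, predicate, destination system)
    ("report_sft", lambda r: r["product"] in {"repo", "seclending"}, "sftr_reporting"),
    ("fund_desk",  lambda r: r["source"] == "otc" and r["qty"] > 0,  "funding_monitor"),
    ("default",    lambda r: True,                                   "archive"),
]

def route(record: dict) -> str:
    """Send a record to the first system whose rule matches."""
    for name, predicate, destination in RULES:
        if predicate(record):
            return destination
    raise RuntimeError("unreachable: the default rule always matches")

print(route({"product": "repo", "qty": 1, "source": "repo"}))  # sftr_reporting
```

Real flows push in both directions rather than one-way as here, but the principle is the same: the complexity lives in a reviewable rule set, not scattered across systems.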

Connected data can enable significant improvements such as:

  • Regulatory models can be fed real-time pre-trade “what-if” scenarios so businesses can know how much a particular trade absorbs in terms of capital, liquidity or balance sheet for the given return, or whether a trade is balance sheet-, capital- or margin-reducing (see the sketch after this list).
  • Data can feed analytics that tell a trader, salesperson, manager or any stakeholder what kind of trades they should focus on in order to keep within their risk limits, with information on a granular client level.
  • XVA desks, the groups often charged with balancing out a firm’s risk and capital, can not only be looped in but push information back to a trader in real-time so they can know the impact of a trade.
  • Systems that track master agreements can be linked and analytics can point toward the most efficient agreement to use for a given trade.
  • Trading and settlement systems can interface with market utilities, both backward and forward.
  • Transfer pricing tools can be built into the system core and be transparent to all stakeholders with near instantaneous speed, at scale.
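
A sketch of the pre-trade “what-if” from the first bullet above: apply a candidate trade's incremental impact to current usage and check each metric against its limit before execution. Metrics, limits and numbers are all hypothetical:

```python
LIMITS = {"balance_sheet": 1_000.0, "lcr_outflow": 200.0, "capital": 80.0}
usage  = {"balance_sheet":   930.0, "lcr_outflow": 185.0, "capital": 71.0}

def what_if(trade_impact: dict) -> dict:
    """Projected usage per metric, plus a pass/fail against each limit."""
    result = {}
    for metric, delta in trade_impact.items():
        projected = usage[metric] + delta
        result[metric] = {"projected": projected,
                          "ok": projected <= LIMITS[metric]}
    return result

# A balance-sheet-consuming but capital-reducing trade:
print(what_if({"balance_sheet": 50.0, "lcr_outflow": 10.0, "capital": -3.0}))
```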

Transcend’s recent experience with some of the top global banks shows the value of consolidating data into one infrastructure. We are connecting front- and back-office to market infrastructure and providing information in a dashboard, in real-time. As trades book on the depository ladder, key stakeholders can see the change in their dashboard application and can make decisions on funding manually or feed back new parameters to pricing models across the enterprise. The same transactions and positions affect the real-time inventory view from legal entity or customer perspectives, as well as driving cash and liquidity management decisions. Over time, as banks get more comfortable with their data management tools, parts of decision-making that follow specific rules can be automated. This will be an excellent deployment of the new data framework.

Betting on the time or the percent

As machine learning and AI advance, and connected data becomes more of a reality, technology platforms will learn how to efficiently mine and analyze data to understand if a trade satisfies institutional regulatory, credit, balance sheet, liquidity, and profitability hurdles. This will lead to an environment where a trade inquiry comes in electronically, is accepted or rejected, and processed automatically through the institution’s systems. The steps in this process are methodical, and there is nothing outside of what financial institutions do today that would prevent execution. A reduction in manual intervention can allow traders to focus on what is important: working on the most complex transactions to turn data into information and action.

That more automation is coming to funding markets is certain. The question at this time is how long it will take to automate most of the business. This is a bet on the timeline, or on the percentage of funding decisions that can be automated, but not on the direction of the trend line. Could it be as much as 90% in five years? Answers will vary by firm, and some of the major players are already developing strategies to progress in this direction. Typically, people overestimate the impact of a new technology in the short term, but underestimate the impact in the long term. Banks have already invested in machine learning and AI tools to make automated funding a reality. But it will depend on the next and more complex step: ensuring that connected data can reach these tools, allowing for a robust view of positions, regulatory metrics and profitability requirements across the firm.

This article was originally published on Securities Finance Monitor.


Building a Holistic Collateral Infrastructure

Following the financial crisis, regulations and their associated reporting have created an opportunity for banks and investment firms to create a single, unified collateral infrastructure across all product siloes. This does not have to be a radical architecture rebuild, but rather can be achieved incrementally.

There are legitimate historical reasons why collateral infrastructure has grown up as a patchwork of systems and processes. For products such as stock lending, repo, futures or contracts for difference (CFDs), the collateral/margining process was generally integral to the products and processing systems. It would not have made sense to break out collateral management into a separate group and hence operating teams and systems were structured around the core product unit. Generally, only OTC derivatives had a relatively clear decoupling between collateral management and other operational processes. Even as business units merged at the top level, this product separation at the collateral management level often continued.

While this situation could stand during non-stress periods, the financial crisis demonstrated the fallacy that siloed, uncoordinated collateral management systems, data and processes could weather any storm. This disjointed view caused a number of specific problems, including: an inability to see the full exposures to counterparties; a lack of organization in cash and non-cash holdings; and substantial inflexibility in mobilizing the overall collateral pool. Even before the crisis, inconsistent or “zero cost allocation” for collateral usage meant that collateral was not always being directed to the parts of the business that needed it most. After the crisis, with collateral and High Quality Liquid Assets at a premium, this became unacceptable.

Today, few banks and investment firms have completed the work of integrating their collateral management functions across products (see Exhibit 1). Some of the largest banks are focused on building capabilities to achieve enterprise-wide collateral optimization, while others are just starting on this effort, at least on a silo basis. Some have bought or built large systems with cross-product support, although this has proven costly. Others are evaluating organizational consolidation. Whatever their current state, a new round of regulatory reporting requirements in the US and Europe means that letting collateral infrastructure sit to one side is no longer viable to meet business or compliance objectives without adding substantial staff. One way or another, long-term solutions must be achieved.

Exhibit 1: Moving past the siloed approach

Source: Transcend Street Solutions

The next round of regulatory impact

While nearly all large firms have digested the current waves of regulatory reporting and collateral management requirements, the next round will soon be arriving. Among these are the Federal Reserve’s regulation SR14-1, MiFID II (Revision of the Markets in Financial Instruments Directive), and the Securities Finance Transactions Regulation (SFTR). It is worth looking at some of these requirements in detail to understand what else is being demanded of collateral management infrastructure and departments.

The Federal Reserve’s regulation SR14-1 is aimed at improving the resolution process for US bank holding companies. It includes a high-level requirement that banks should have effective processes for managing, identifying and valuing the collateral they receive from and post to external parties and affiliates.[1] At the close of any business day, banks should be able to identify exactly where any counterparty collateral is held, document all netting and rehypothecation arrangements, and track inter-entity collateral related to inter-entity credit risk. On a quarterly basis they need to review CSAs, differences in collateral requirements and processes between jurisdictions, and forecast changes in collateral requirements. Also on the theme of improved resolution rules are the record keeping requirements related to “Qualified Financial Contracts” (effectively most non-cleared OTC transactions).[2] These require banks to identify the details and conditions of the master agreements and CSAs applying to the relevant trades.
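
As a toy illustration of the end-of-day view this implies, and with records, locations and field names entirely invented, the report boils down to grouping collateral by counterparty with its custody location and reuse terms attached:

```python
from collections import defaultdict

collateral_records = [
    {"cpty": "FUND-A",   "isin": "US91282CJK11", "mv": 50.0,
     "location": "BNY triparty", "rehypothecable": False},
    {"cpty": "FUND-A",   "isin": "DE0001102580", "mv": 20.0,
     "location": "Euroclear",    "rehypothecable": True},
    {"cpty": "DEALER-B", "isin": "US91282CJK11", "mv": 35.0,
     "location": "FED",          "rehypothecable": True},
]

def eod_location_report(records):
    """Per counterparty: where collateral sits and whether reuse is permitted."""
    report = defaultdict(list)
    for r in records:
        report[r["cpty"]].append(
            (r["location"], r["isin"], r["mv"],
             "reuse OK" if r["rehypothecable"] else "segregated"))
    return dict(report)

for cpty, holdings in eod_location_report(collateral_records).items():
    print(cpty, holdings)
```

The grouping itself is trivial; the hard part, as the text notes, is having one trustworthy set of records to run it over.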

While the regulatory intent is understandable, these requirements are exceptionally difficult to meet without a unified collateral infrastructure. There is in fact no way to respond without a single, holistic view of collateral and exposure across the enterprise. While SR14-1 impacts only the largest banks, it still means these banks have a mandate to complete the work they have begun in organizing their vast collection of collateral information. This will lead to greater collateral opportunities for the big banks, and may in turn encourage smaller competitors to complete the same work in order to exploit similar new efficiencies.

Article 15 of Europe’s SFTR places restrictions on the reuse of collateral (rehypothecation). The provider of collateral has to be informed in writing of the risk and consequences of their collateral being reused. They also have to provide prior, express consent to the reuse of their collateral. Even with the appropriate documentation and reporting in place, a collateral management department has to carefully ensure that the written agreement on reuse is strictly complied with. While nothing is written in the US yet, market participants believe that the US Office of Financial Research will soon require mandatory reporting that may entail overlapping requirements.
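
A minimal sketch of an Article 15-style gate, assuming invented consent flags; a real check would hang off the firm's agreement data rather than a hard-coded table:

```python
# (provider, agreement) -> documentation status; entries are illustrative.
consents = {
    ("FUND-A", "CSA-4711"): {"risk_disclosure_sent": True, "express_consent": True},
    ("FUND-C", "GMRA-88"):  {"risk_disclosure_sent": True, "express_consent": False},
}

def may_reuse(provider: str, agreement: str) -> bool:
    """Block reuse unless written disclosure and express consent are both on file."""
    record = consents.get((provider, agreement))
    return bool(record
                and record["risk_disclosure_sent"]
                and record["express_consent"])

assert may_reuse("FUND-A", "CSA-4711")
assert not may_reuse("FUND-C", "GMRA-88")   # consent never given: reuse blocked
```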

Similarly, MiFID II introduces strict restrictions on the use of customer assets for collateral purposes and potentially has a major impact on collateralized trading products. A complicated analysis must be conducted on best execution, but in OTC and securities financing markets, best execution may be a function of term, price, counterparty risk and/or collateral acceptance. Further, any variation from a standard best price policy needs to be documented to show how the investment firm or intermediary sought to safeguard the interest of the client.

SFTR and MiFID II require that banks rethink their entire reporting methodologies, and in some cases must rethink parts of their business model. A wide range of new information must be captured, analyzed, consolidated, and reported outwards and internally. This will likely generate new ideas and business opportunities around collateral usage and pricing for those firms that can digest the large quantities of new information that will be produced.

A holistic foundation for trading, control, MIS and regulatory reporting

The struggle at many firms to comply with regulations while maximizing profitability has led to two parallel sets of infrastructures: one for the business and another for compliance. This creates two levels of cost that duplicate substantial effort inside the firm. Along the way, business lines get charged twice for this work as costs are allocated back to the business. This is an immediate negative impact on profitability; even firms that have completed collateral optimization immediately lose a piece of that financial benefit.

The cumulative impact of regulation means that banks and investment firms generally cannot afford to wait for consolidation projects to deliver a single integrated platform. The fragmentation of teams, data and processes is a hurdle for any institution to overcome, but so is the old mindset that treats collateral management as an isolated operational process.

We identify five critical areas for firms to address in order to create a foundation for their holistic collateral infrastructure:

  • Map the full impacts of regulatory and profitability requirements on businesses, processes, and systems.
  • Recognize that collateral management is an integral part of many key activities at the firm including trading and liquidity management.
  • Understand the core decision making processes at the heart of effective collateral management.
  • Organize and manage the data that is required to drive those processes.
  • Build a functional operating model for collateral management.

The fifth recommendation, building a functional operating model for collateral, means being able to connect disparate business lines to provide an enterprise view of collateral. It includes mining collateral agreements to make optimal decisions or decisions mandated by regulation. It requires the ability to analyze collateral to balance economic and regulatory drivers, and it requires controls and transparency of client collateral across all margin centers.
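
As a rough sketch of the "enterprise view" point, under the assumption that each silo can expose its positions through one small interface (the system names and records here are invented), the consolidated view is simply the union of the feeds:

```python
class SiloFeed:
    """One business line's positions behind a common fetch() interface."""
    def __init__(self, name, positions):
        self.name, self.positions = name, positions
    def fetch(self):
        # In practice: an API call, file drop or message stream per silo.
        return [dict(p, business=self.name) for p in self.positions]

silos = [
    SiloFeed("repo",        [{"isin": "UST-10Y", "qty": 100}]),
    SiloFeed("prime",       [{"isin": "UST-10Y", "qty": -40}]),
    SiloFeed("derivatives", [{"isin": "BUND",    "qty":  60}]),
]

def enterprise_view():
    """One consolidated inventory across every business line and margin center."""
    totals = {}
    for silo in silos:
        for p in silo.fetch():
            totals[p["isin"]] = totals.get(p["isin"], 0) + p["qty"]
    return totals

print(enterprise_view())   # {'UST-10Y': 60, 'BUND': 60}
```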

At Transcend Street Solutions, we are actively working with our clients to help them develop a strategic roadmap of business and technology deliverables to achieve a holistic collateral infrastructure. While there are always organizational as well as infrastructural nuances in every business, we have seen the framework proposed above yield a positive return for our clients. Our technology platform, CoSMOS, is nimble, modular and customizable to accelerate collateral infrastructure evolution without necessarily having to retire existing systems or undergo a big infrastructural lift.

Getting this right is important for more than just regulatory compliance. It means the collateral function and trading desks can perform the forward processes required to support both profitable trading and firm-wide decision making. Pre-trade analytics are needed to ensure that collateral is allocated optimally across portfolios and collateral agreements. Optimization is also needed at the trade level to ensure the most suitable collateral is applied to each trade or structure. Finally, analysis needs to be carried out across the whole inventory of securities and cash positions to ensure collateral is used by the right businesses. After all, correct pricing of collateral across business lines is essential not only for firm-level profitability but also for incentivizing desirable behavior throughout the organization.

We strongly believe that firms that are successful in achieving a holistic collateral architecture will have a significant competitive advantage in the industry. They will be able to achieve optimization of collateral and liquidity across business silos while meeting most global regulatory requirements, and all that with a much more efficient IT spend.

This article was originally published on Securities Finance Monitor.