Connected Data: The Opportunity for Collateral and Liquidity Optimization

The function and definition of collateral and liquidity optimization have continued to expand from their roots in the early 2000s. Practitioners must now consider how connected data on security holdings can operationalize the next level of efficiency in balance sheet management. A guest post from Transcend.

The concept of connected data, or metadata, in financial markets can sound like new-age philosophy, but it really refers to the description of security holdings and agreements that together deliver an understanding of what collateral must be received and delivered, where it originated and how it must be treated on the balance sheet. This information is not available from simply observing the quantity and price of a security in a portfolio. Rather, connected data is an important wrapper for information that is too complex to show in a simple spreadsheet.
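
As a minimal sketch of the idea, assuming illustrative field names rather than any particular firm's data model, the snippet below wraps a bare position in the kind of metadata described here: where it originated, which agreement governs it, its expected tenor and whether it is encumbered or can be re-used.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Position:
    """The bare holding a spreadsheet would show: instrument, quantity, price."""
    isin: str
    quantity: float
    price: float

@dataclass
class ConnectedPosition:
    """The same holding wrapped in the metadata that drives its balance sheet treatment."""
    position: Position
    origin: str                          # e.g. "client_margin" or "firm_inventory" (illustrative tags)
    agreement_id: Optional[str] = None   # governing CSA or collateral schedule reference
    expected_tenor_days: Optional[int] = None
    encumbered: bool = False
    rehypothecable: bool = True
```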

Earlier days of optimization meant ordering best-to-deliver collateral in a list, or creating algorithms based on Credit Support Annexes and collateral schedules. These were effective tools in their day and were appropriate for the level of balance sheet expertise and technology at hand; some were in fact quite advanced. They enabled banks and buy-side firms to take advantage of the best pricing in the marketplace for collateral assets that could be lent to internal or external counterparties. Many of these techniques are still in use today. While they deliver on what they were designed for, they are fast becoming outmoded. Consequently, firms relying on these methodologies struggle to drive further increases in balance sheet efficiency, and may need to charge higher prices in order to maintain financial performance targets. This is not a sustainable strategy.

The next level of collateral optimization considers connected data in collateral calculations. Interest is being driven by better technology that can more precisely track financial performance in real time. A finely tuned understanding of the nature of individual positions and how they impact the firm can in turn drive a new kind of collateral optimization methodology, one that determines cheapest-to-deliver based on a combination of performance-impacting factors and market pricing. This gives a new meaning to “best collateral” for any given margin requirement, and it only becomes possible when connected data is integrated into the collateral optimization platform.
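
One way to picture such a methodology, purely as an illustration, is a ranking that blends market funding cost with balance-sheet-driven charges. The field names, the charges and the simple additive cost model below are assumptions for the sketch, not a prescribed approach.

```python
from typing import Iterable, List

def all_in_cost(candidate: dict, requirement_value: float) -> float:
    """Illustrative all-in cost of delivering a candidate asset against a margin
    requirement: haircut-adjusted value times funding, liquidity and capital charges (bps)."""
    delivered_value = requirement_value / (1.0 - candidate["haircut"])
    bps = (candidate["funding_spread_bps"]
           + candidate["liquidity_charge_bps"]
           + candidate["capital_charge_bps"])
    return delivered_value * bps / 10_000

def cheapest_to_deliver(candidates: Iterable[dict], requirement_value: float) -> List[dict]:
    """Rank eligible assets by all-in cost rather than market price alone."""
    eligible = [c for c in candidates if c["eligible"]]
    return sorted(eligible, key=lambda c: all_in_cost(c, requirement_value))
```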

As an example of applying connected data, not all equities are the same on a balance sheet. A client position that must be funded has one implication, while a firm position has another; both carry a funding and liquidity cost. A firm long delivered against a customer short is internalization, which has a specific balance sheet impact, and depending on balance sheet liquidity, that impact may require additional capital to maintain. Likewise, the expected tenor of a position will affect its liquidity treatment. A decision to retain these assets or post them as collateral can in turn feed back into the Liquidity Coverage Ratio, Leverage Ratio and other metrics for internal and external consumption.

If these impacts can be observed in real time, the firm may find that internalizing the position reduces the balance sheet but is sub-optimal compared to borrowing the collateral externally. External borrowing of course carries its own funding and capital charges, along with counterparty credit limits and risk weightings in the bilateral market. These could in turn be balanced by repo-ing out the firm position, and by tenor-matching collateral liabilities in line with the Liquidity Coverage Ratio and future Net Stable Funding Ratio requirements. Anyone familiar with balance sheet calculations will see that these overlapping and potentially conflicting objectives can push costs up or down depending on the path chosen. By understanding the connected data of each position, including available assets and what needs to be funded, firms can make the best possible decision in collateral utilization. Importantly, the end result is to reduce slippage, increase efficiency, and ultimately deliver greater revenues and better client pricing based on smarter balance sheet management.
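
A deliberately simplified version of that trade-off might look like the comparison below; the charge names, the example numbers and the idea of netting balance sheet relief against forgone repo revenue are assumptions for illustration only. In practice, all of these inputs would come from connected data in real time.

```python
def internalization_cost(notional: float, balance_sheet_relief_bps: float,
                         foregone_repo_revenue_bps: float) -> float:
    """Internalizing frees balance sheet (a benefit) but forgoes repo-ing out the firm long."""
    return notional * (foregone_repo_revenue_bps - balance_sheet_relief_bps) / 10_000

def external_borrow_cost(notional: float, borrow_fee_bps: float,
                         capital_charge_bps: float) -> float:
    """Borrowing externally incurs a fee plus capital and credit-line charges."""
    return notional * (borrow_fee_bps + capital_charge_bps) / 10_000

def cheaper_route(notional: float) -> str:
    # Illustrative numbers only.
    internal = internalization_cost(notional, balance_sheet_relief_bps=8,
                                    foregone_repo_revenue_bps=5)
    external = external_borrow_cost(notional, borrow_fee_bps=12, capital_charge_bps=4)
    return "internalize" if internal <= external else "borrow externally"
```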

Another way to look at this new view of collateral optimization is as the second derivative. The first derivative was the ordering of lists or the observation of collateral schedules. The next generation incorporates connected data across collateral holdings and requirements for a more granular understanding of what collateral needs to be delivered and where, and how this will impact the balance sheet and funding costs. It has taken some time to build the technology and the internal perspective, but firms are now ready to engage in this next level of collateral sophistication.

Implementing technology for connected data in collateral and liquidity

A connected data framework starts with assessing what data is available and what needs to be tagged to inform the next level of information about collateral holdings. This process is achievable only with a scalable technology solution: it is not possible to manage this level of information manually, let alone in real time for decision making. Building out a technology platform requires careful consideration of the end-to-end use case. If firms get this part right, they can succeed in building out a connected data ecosystem.

The connected data project also requires access to a wide range of data sources. Advances in technology have allowed data to be captured and presented to traders, regulators, and credit and operations teams. But right now, most data are fragmented, looking more like spaghetti than a coherent picture of activity across the organization. To be effective, data needs to flow from the original sources and be readable by each system in a fully automated way.

Once a usable, tagged data set has been established, it can be applied to collateral optimization and to actionable results. This can include what-if scenarios, algorithmic trading and workflow management, and can extend further into areas like transfer pricing analytics. Assessing and organizing the data, then tagging it appropriately, can yield broad-ranging results.

Building out the collateral mindset

An evolution in the practice of collateral optimization requires a more holistic view of what collateral is supposed to do for the firm and how to get there. This is a complex cultural challenge and is part of an ongoing evolution in capital markets around the role of the firm, digitization and how services are delivered. While difficult to track precisely, market participants can qualitatively point to differences in how they and their peers think about collateral today versus five years ago. The further back one looks, the greater the change, which naturally suggests challenges when looking ahead to a possible future state.

An important element in developing scalable collateral thinking is the application of technology; our observation is that technology and thinking about how the technology can be applied go hand-in-hand. As each new wave of technology is introduced, new possibilities emerge for thinking about balance sheet efficiency and about how services are delivered, both internally and to clients. In solving these challenges for our clients with our technology, we have found that a new vision is required before a technology roadmap can be designed or implemented.

The application of connected data to the collateral market is one such point of evolution. Before connected data were available on a digitized basis, collateral desks relied either on ordered lists or on individual, manual understandings of which positions were available for which purposes. There was no conversation about the balance sheet except in general terms. Now, however, standardized connected data means that every trading desk, operations team and balance sheet decision maker can refine options for what collateral to deliver based on the best balance sheet outcome in near real time. New scenarios can be run that were never possible before, and client pricing that used to take hours, if not a day or more, can now be obtained in a fraction of that time.

Now that collateral optimization based on connected data is available, firms must think about what services they can deliver to clients on an automated basis, and which should be bundled and which kept disaggregated. As new competitors loom in both the retail and institutional space, these conversations, driven by technology and collateral, become critical to the future of the business. Connected data is leading the way.

This article was originally published on Securities Finance Monitor.


In five years, 90% of funding will be done by machines

You may disagree with the number of years or the percent, but everyone understands that automation in the funding and collateral space is occurring at a fast pace. The question is how to prepare for this inevitable future. Our view is that connecting data from disparate sources is the key to the next evolution in the funding markets. A guest post from Transcend.

Who in the capital markets industry isn’t seeking greater profitability or returns? Between balance sheet pressures, competitive dynamics and the resources required to comply with regulation, focusing on transformative change to advance the firm has been a huge challenge. At the same time, technology is evolving at a rapid pace and the availability of structured and unstructured data is presenting a whole new level of opportunity. For firms to realize this opportunity, connecting disparate data and adopting smart algorithms across the institution must be a critical part of any strategy.

Advances in technology have allowed data to be captured and presented to traders, credit, regulators, and operations. But right now, most data are fragmented, looking more like spaghetti than a coherent picture of activity across the organization. Individual extracts exist that sometimes cross silos, but more often cannot be reconciled across sources or users. To be effective, data needs to flow from the original sources and be readable by each system in a fully automated way. It does not matter if individual systems are old or new, in the cloud or behind firewalls, from vendor packages or in-house technology: they all have to work together. We call this connected data.

Businesses have understood for some time that this will require greater automation, which will be a critical driver of success. Banks and asset managers know that they have to do something: doing nothing is not an option. Machine learning and artificial intelligence are part of the solution, and firms have embarked on projects large and small to enable automation under watchful human eyes. The new element to consider in the pace of change is the ability of machines to connect, process and analyze data within technology platforms for exposure management, regulatory reporting and pricing. The more data that feeds into technology on the funding desk, the more automated decision-making can occur.

While individual systems and silos can succeed on their own, a robust and integrated data management process brings the pieces together and enables the kinds of decision-making that today can only be performed by senior finance and risk managers. Connected data is therefore possibly the most important link between automation and profitability. It is a daunting task to consider major changes to all of the systems in play, but most firms are adopting a strategy of building a centralized platform that brings together data from multiple businesses and sources. A key benefit of this strategy is that advances in technology and algorithms can be applied to the platform itself, enabling multiple businesses, and potentially the whole enterprise, to benefit from the investment.

The risk of inaction

Connected data can stake its claim as the next great competitive advantage in the markets. Like algorithmic trading and straight-through processing, which were once novelties and are now taken for granted, the build-out of a connected data architecture, combined with the tools to analyze the data, will initially give some firms an important strategic advantage in cost and profitability management.

With all the talk about data, there is an important human element to what inaction means. In a data-driven, technology-led world, having more people, or even all the right people, will not stop a firm from being left behind, and may in fact become a strategic disadvantage. The value of automation is to identify a trade opportunity based on its characteristics, the firm’s capital and the current balance sheet profile. Humans cannot see this flow with the same speed as a computer, and cannot decide as quickly whether the trade is profitable from a funding and liquidity perspective. While the classic picture of a trader shouting across a room to check whether a trade is profitable makes for a good movie scene, it is unwieldy in the current environment. A competitor with connected data in place can make that decision in a fraction of the time and execute the trade before the slower firm has brought it to enough decision-makers to move forward.

The competitive race towards connected data means that firms relying on headcount alone will see higher costs and lower productivity. As firms with efficient and automated funding decision tools employ new processes for decision-making, they will gain a competitive advantage through cost management, and could even drive spread compression in the funding space. This will put additional pressure on firms that have stood still, and is the true danger of inaction at this time.

Action items for connected data

Data is only as good as the reason for using it. Firms must embark on connecting their data with an understanding of what the data are for, also called foundational functionality. This is the initial building block for what can later become a well-developed real-time data infrastructure.

Each transaction has three elements: a depository ladder for tracking movements by settlement location; a legal entity or trading desk ladder; and a cash ladder. Each of these contains critical information for connecting data across the organization. If your firm has a cross-business view of fixed income, equities and derivatives on a global basis, then you are due a vacation. We have not yet seen this work completed by any firm, however, and expect that it will be a major focus for banks through 2019 and 2020.
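
As one way to picture the three ladders, and assuming class and field names invented for the sketch, a transaction's movements could be recorded from the depository, legal entity and cash perspectives like this:

```python
from dataclasses import dataclass
from datetime import date
from typing import List

@dataclass
class LadderEntry:
    value_date: date
    amount: float      # securities quantity or cash amount
    location: str      # settlement location, legal entity/desk, or currency account

@dataclass
class TransactionLadders:
    trade_id: str
    depository: List[LadderEntry]   # movements by settlement location
    entity: List[LadderEntry]       # movements by legal entity or trading desk
    cash: List[LadderEntry]         # cash flows by currency and account
```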

Ultimately, an advanced data infrastructure must provide and connect many types of data in real time, such as referential data, market data, transactions and positions. “Unstructured” data, such as agreements and terms, capital and liquidity constraints, and risk limits, must also be made available more broadly for better decision-making, despite their tendency to be created in specific silos. But an important early step is ensuring visibility into global, real-time inventory across desks, businesses, settlement systems and transaction types; this is critical for optimizing collateral management. Access to accurate data can increase internalization and reduce fails, cutting costs and operational RWA. This is especially important for businesses that have decoupled their inventory management functionality over time, for example OTC derivatives, prime brokerage and securities financing. Likewise, the ability to access remote pools of high-quality assets, whether for balance sheet or lending purposes, can have direct P&L impacts.
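
A small sketch of that inventory step, with feed names and record fields assumed for illustration, might simply fold per-silo feeds into one global view:

```python
from collections import defaultdict
from typing import Dict, List

def global_inventory(feeds: Dict[str, List[dict]]) -> Dict[str, float]:
    """Fold per-silo position feeds (e.g. prime brokerage, OTC derivatives,
    securities financing) into a single real-time view keyed by ISIN."""
    totals: Dict[str, float] = defaultdict(float)
    for source, records in feeds.items():
        for record in records:
            totals[record["isin"]] += record["quantity"]
    return dict(totals)
```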

Step two is the development of rules-based models to establish the information flows that are critical to connecting data across a firm while simultaneously optimizing businesses at the book, legal entity and firm levels. The system must understand a firm’s flows and which variables need to be monitored and controlled within a business line and across the firm. Data will push in both directions, for example to and from regulatory compliance databases or between settlement systems and a trader’s position monitors. Rules-based systems simplify and focus what is otherwise a very complex set of inter-related and overlapping priorities (see Exhibit 1).
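
A minimal sketch of a rules-based model, assuming invented field names and not describing any product API, treats each rule as a predicate plus an action applied over connected data:

```python
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Rule:
    name: str
    applies: Callable[[dict], bool]   # predicate over a trade or position record
    action: Callable[[dict], dict]    # e.g. route, flag or re-price the record

def run_rules(record: dict, rules: List[Rule]) -> dict:
    """Apply rules in priority order to a trade or position record."""
    for rule in rules:
        if rule.applies(record):
            record = rule.action(record)
    return record

# Example: flag trades that would breach a desk-level balance sheet limit.
limit_rule = Rule(
    name="desk_balance_sheet_limit",
    applies=lambda r: r["desk_usage"] + r["notional"] > r["desk_limit"],
    action=lambda r: {**r, "status": "requires_approval"},
)
```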

Connected data can enable significant improvements such as:

  • Regulatory models can be fed a real-time, pre-trade “what-if” scenario so businesses can know how much a particular trade absorbs in terms of capital, liquidity or balance sheet for a given return, or whether a trade is balance sheet-, capital- or margin-reducing (a minimal sketch follows this list).
  • Data can feed analytics that tell a trader, salesperson, manager or any other stakeholder which kinds of trades to focus on in order to stay within risk limits, with information at a granular client level.
  • XVA desks, the groups often charged with balancing out a firm’s risk and capital, can not only be looped in but can also push information back to a trader in real time, so the trader knows the impact of a trade.
  • Systems that track master agreements can be linked and analytics can point toward the most efficient agreement to use for a given trade.
  • Trading and settlement systems can interface with market utilities, both backward and forward.
  • Transfer pricing tools can be built into the system core and be transparent to all stakeholders with near instantaneous speed, at scale.
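
As referenced in the first bullet above, a pre-trade what-if check could be sketched as follows; the metric names, trade fields and additive impact model are assumptions for illustration.

```python
def pre_trade_what_if(trade: dict, current_metrics: dict) -> dict:
    """Estimate how a proposed trade would move key metrics before execution."""
    impact = {
        "balance_sheet": trade["notional"] if trade["on_balance_sheet"] else 0.0,
        "rwa": trade["notional"] * trade["risk_weight"],
        "hqla": -trade["hqla_outflow"],
    }
    projected = {k: current_metrics.get(k, 0.0) + v for k, v in impact.items()}
    projected["return_on_rwa"] = (
        trade["expected_revenue"] / projected["rwa"] if projected["rwa"] else float("inf")
    )
    return projected
```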

Transcend’s recent experience with some of the top global banks shows the value of consolidating data into one infrastructure. We are connecting front- and back-office systems to market infrastructure and providing the information in a real-time dashboard. As trades book on the depository ladder, key stakeholders can see the change in their dashboard application and can make funding decisions manually or feed new parameters back to pricing models across the enterprise. The same transactions and positions affect the real-time inventory view from legal entity or customer perspectives, as well as driving cash and liquidity management decisions. Over time, as banks get more comfortable with their data management tools, the parts of decision-making that follow specific rules can be automated. This will be an excellent deployment of the new data framework.

Betting on the time or the percent

As machine learning and AI advance, and connected data becomes more of a reality, technology platforms will learn how to efficiently mine and analyze data to understand whether a trade satisfies an institution’s regulatory, credit, balance sheet, liquidity and profitability hurdles. This will lead to an environment where a trade inquiry comes in electronically, is accepted or rejected, and is processed automatically through the institution’s systems. The steps in this process are methodical, and nothing outside of what financial institutions do today would prevent execution. A reduction in manual intervention can allow traders to focus on what is important: working on the most complex transactions to turn data into information and action.
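
A bare-bones sketch of that accept-or-reject step, with hurdle names and thresholds assumed for illustration, might look like this:

```python
def process_inquiry(inquiry: dict, hurdles: dict) -> str:
    """Accept or reject an electronic funding inquiry against the firm's hurdles."""
    checks = [
        inquiry["counterparty_exposure"] <= hurdles["credit_limit"],
        inquiry["balance_sheet_usage"] <= hurdles["balance_sheet_headroom"],
        inquiry["lcr_impact"] <= hurdles["lcr_budget"],
        inquiry["expected_spread_bps"] >= hurdles["min_spread_bps"],
    ]
    return "accepted" if all(checks) else "rejected"
```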

That more automation is coming to funding markets is certain. The question at this time is how long it will take to automate most of the business. This is a bet on the timeline, or on what percentage of funding decisions can be automated, not on the direction of the trend line. Could it be as much as 90% in five years? Answers will vary by firm, and some of the major players are already developing strategies to move in this direction. Typically, people overestimate the impact of a new technology in the short term but underestimate it in the long term. Banks have already invested in machine learning and AI tools to make automated funding a reality. But success will depend on the next and more complex step: ensuring that connected data can reach these tools, allowing for a robust view of positions, regulatory metrics and profitability requirements across the firm.

This article was originally published on Securities Finance Monitor.
