Top five trends in collateral management for 2018

Collateral management has broadened far beyond simple margin processing; collateral now touches a majority of financial market activity, from determining critical capital calculations to shaping customer experience to driving strategic investment decisions. In this article, we identify the top five trends in collateral management for 2018 and highlight important areas to watch going forward.

The overarching theme driving collateral management forward is its central role in financial markets. Collateral has grown so broad as to make even its name confusing: where collateral can refer to a specific asset, the implications of collateral today reach through reporting, risk, liquidity, pricing, infrastructure and relationship management. The opportunities for collateral professionals have likewise expanded, and non-collateral roles must now have an understanding of collateral to deliver their core obligations to internal and external clients.

We see a common theme running through five areas to watch in collateral management in the coming year: the application of smarter data and intelligence to drive core business objectives. Many firms have digested the basics of collateral optimization and are now ready to incorporate a broader set of parameters and even a new definition of what optimization means. Likewise, technology investments in collateral are starting to tie into broader innovation projects at larger firms; this will unlock new value-added opportunities for both internal- and external-facing technology applications.

Here are our top five trends for collateral management in 2018:

#5 Technology Investments

The investment cycle in collateral-related technology applications continues to grow at a rapid pace. Collateral management budget discussions are moving from the back office to the top of the house. And partly as a result, the definition of the category is also changing. Collateral management should no longer be seen as strictly the actions of moving margin for specified products, but rather as part of a complex ecosystem of collateral, liquidity, balance sheet management and analytics. The usual, first-order investment targets of these budgets are internally focused, including better reporting, inventory management and data aggregation. The second-order benefit of a more robust data infrastructure focuses on externally facing trading applications, including tools for traders and client intelligence utilities that provide real-time information and pricing for the benefit of all parties. This new category does not yet have a simple name; one could think of it as a “recommendation system.” Whatever the name, it has become a major driver of forward-looking bank technology efforts and efficiency drives.

As large financial services firms capture the benefits of their current round of investments, they will increasingly turn towards integrating core innovations in artificial intelligence, Robotic Process Automation and other existing technologies into their collateral-related investments. This will unlock a large new wave of opportunity for how business is conducted and what information can be captured, analyzed and then automated for a range of client-facing, business-line, internal management and reporting applications.

#4 Regulatory reporting

Although it has been 10 years since the depths of the Great Recession, regulatory reporting requirements for banks and asset managers continue to evolve. Largely irrespective of jurisdiction, the core problem facing these firms is aggregating and linking data for reporting automation. Due to strict timeframes and complex requirements, firms have historically relied on a pre-existing mosaic of technology and human resources to satisfy regulatory reporting needs. However, these tactical solutions made scale, efficiency and responsiveness to new rules difficult. The challenge of regulatory reporting is a puzzle that, once solved, appears obvious. But the process of solving the puzzle can create substantial challenges.

Looking at one regulation alone misses the transformative opportunity of strategic data management across the organization. Whether it is SFTR, MiFID II, the Recovery & Resolution Planning requirements of SR-14/17 or Qualified Financial Contracts (QFCs), the latest initiative du jour should be the kick-off for a broader rethink about data utilization. Wherever a firm starts, the end result must be a robust data infrastructure that can aggregate and link information at the most granular level. At a high level, firms will need to develop the capability to link all positions and trading data with the agreements that govern these positions, the collateral posted against those agreements, any guarantees that may apply and any other constraints that need to be considered. The infrastructure must also be able to format and produce the needed information on demand. Achieving this goal will take meaningful work, but it will make organizations not only more efficient but also more future-proof.
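The granular linkage described above can be sketched as a set of joins across position, agreement and collateral records. Below is a minimal illustration in Python; every record, field and identifier is hypothetical rather than drawn from any real reporting schema:

```python
# Illustrative sketch: linking positions to their governing agreements and
# posted collateral at the most granular level. All records are hypothetical.
positions = [
    {"trade_id": "T1", "agreement_id": "ISDA-01", "notional": 10_000_000},
    {"trade_id": "T2", "agreement_id": "ISDA-01", "notional": 5_000_000},
    {"trade_id": "T3", "agreement_id": "GMRA-02", "notional": 8_000_000},
]
agreements = {
    "ISDA-01": {"counterparty": "Bank A", "threshold": 1_000_000},
    "GMRA-02": {"counterparty": "Fund B", "threshold": 0},
}
collateral = {
    "ISDA-01": [{"asset": "UST 10Y", "market_value": 2_500_000}],
    "GMRA-02": [{"asset": "Bund 5Y", "market_value": 1_200_000}],
}

def report_by_agreement():
    """Aggregate positions and posted collateral per governing agreement."""
    report = {}
    for pos in positions:
        agr_id = pos["agreement_id"]
        entry = report.setdefault(agr_id, {
            "counterparty": agreements[agr_id]["counterparty"],
            "total_notional": 0,
            "collateral_posted": sum(
                c["market_value"] for c in collateral.get(agr_id, [])),
        })
        entry["total_notional"] += pos["notional"]
    return report
```

Once positions, agreements and collateral share common keys like this, producing a new regulatory report becomes a formatting exercise on top of the same linked data rather than a fresh aggregation project.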

#3 Transfer pricing

As firms try to optimize collateral across the enterprise, it is critical that they develop reasonably sophisticated transfer pricing mechanisms to ensure appropriate cost allocations as well as sufficient transparency to promote the right incentives in the organization. Many sell-side firms have highly granular models with visibility into secured and unsecured funding, XVA, balance sheet and capital costs. And in varying fashion these firms allocate some or all of these costs internally. But many challenges remain, including how these costs should be charged directly to the trader or desk doing the trade, and what the right balance is between allocating actual costs and incentivizing business behavior that maximally benefits the client franchise overall. As we know, client business profiles change through time, as do funding and capital constraints. There may be a conscious decision to do some business that may not make money in support of other areas that are highly profitable. Transfer pricing is evolving from a bespoke, business-aligned process into a dynamic, enterprise-wide tool, and firms continue to refine and expand their transfer pricing models.

Firms that embrace the next iteration of transfer pricing will achieve a more scalable, efficient and responsive balance sheet. This will include capturing both secured and unsecured funding costs, plus firm-wide and business specific liquidity and capital costs. Accurately identifying the range of costs can properly incentivize business behaviors beyond simply the cost of an asset in the collateral market. Ultimately, transfer pricing can be a tool to drive strategic balance sheet management objectives across the firm.

Functionally, implementing transfer pricing requires access to substantial data on existing balance sheet costs, inventory management and liquidity costs that firms must consider. Much like collateral optimization, the building block of a robust transfer pricing methodology is data. Accurate information on transfer pricing can then flow back into trading and business decisions to be truly effective.
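As a toy illustration of the kind of all-in charge such a methodology might produce, the sketch below blends secured and unsecured funding costs with balance sheet, liquidity and capital charges. The parameter names, the simple additive structure and all figures are illustrative assumptions, not a model from the article:

```python
# Hypothetical all-in transfer price: a blend of secured/unsecured funding
# plus firm-wide charges, expressed in basis points per annum.
def transfer_price_bps(secured_funding_bps, unsecured_funding_share,
                       unsecured_funding_bps, balance_sheet_bps,
                       liquidity_bps, capital_bps):
    """Blend funding costs by share, then add balance sheet,
    liquidity and capital charges."""
    funding = ((1 - unsecured_funding_share) * secured_funding_bps
               + unsecured_funding_share * unsecured_funding_bps)
    return funding + balance_sheet_bps + liquidity_bps + capital_bps

# Example: 20bp secured funding, 25% unsecured at 80bp, plus 5/3/7bp charges.
all_in = transfer_price_bps(20, 0.25, 80, 5, 3, 7)  # 50.0 bps
```

Even this crude version makes the section's point concrete: the all-in charge a desk sees is dominated by components other than the raw cost of the asset, which is why accurate cost data changes trading behavior.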

#2 Collateral control and optimization

Optimization is evolving well beyond an operations-driven process of finding opportunity within a business to an enterprise-wide approach at the pre-trade, trade and post-trade levels. Pre-trade “what-if” analyses that inform a trader whether a proposed transaction is cost-accretive or cost-reducing to the franchise are important, but they require an analytics tool that can comprehend the impact on the firm’s economic ecosystem. At the point of trade, identifying demands and sources of collateral across the entire enterprise extends to knowing where inventories are across business lines, margin centers, legal entities and regions. It also means understanding the operational nuances and legal constraints governing those demands across global tri-parties, CCPs, derivative margin centers and securities finance requirements.

In a simple example, collateral posted on one day may not be the best to post a week later; if posted collateral becomes scarce in the securities financing market and can be profitably lent out, it may be unwise to provide it as margin. A holistic post-trade analysis, complete with updated repo or securities lending spreads, can tell a trader about missed opportunities, leading to a new form of Transaction Cost Analytics for collateralized trading markets.
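The substitution decision in this example reduces to a simple opportunity-cost comparison. A hedged sketch follows, where the spread inputs, the fixed switching cost and the decision rule are all illustrative assumptions rather than any firm's actual methodology:

```python
# Hypothetical post-trade check: is it better to leave an asset posted as
# margin, or recall it, lend it at its securities-lending fee, and post a
# cheaper substitute instead? All spreads are in basis points per annum.
def opportunity_cost_bps(lending_fee_bps, substitute_cost_bps):
    """Revenue forgone by leaving the asset posted, net of the cost
    of funding a substitute piece of collateral."""
    return lending_fee_bps - substitute_cost_bps

def should_substitute(lending_fee_bps, substitute_cost_bps,
                      switch_cost_bps=1.0):
    """Recall and re-post only if the net pickup beats an assumed
    fixed operational cost of switching."""
    pickup = opportunity_cost_bps(lending_fee_bps, substitute_cost_bps)
    return pickup > switch_cost_bps
```

Run daily with updated repo and securities lending spreads, a check like this is the seed of the Transaction Cost Analytics idea: each missed positive-pickup day is a quantifiable cost of static collateral allocation.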

#1 Integration of derivatives & securities finance (fixed income and equities)

The need for taking a holistic approach to collateral management has led the industry toward significant business model changes. Collateral is common currency across an enterprise and must be properly allocated to wherever it can be used most efficiently. This means that traditional silos – repo, securities lending, OTC derivatives, exchange traded derivatives, treasury and other areas – need to be integrated. Operations groups that have been doing fundamentally the same thing can no longer be isolated from one another; the cost savings that come from process automation and avoiding operational duplication are too compelling.

On the front-office side, the changes needed to trading behavior, culture and reporting, to name a few, are often very difficult to implement over a short period of time. Despite similar flows and economic guidelines, different markets and operation centers, even under the same roof, traditionally suffer from asymmetric information. To address this challenge, a handful of large sell-side players have combined some aspects of these businesses under the “collateral” banner, sometimes along with custody or other related processing businesses. Others have developed an enterprise solution to inventory and collateral management. We expect that management increasingly sees the common threads and shared risks involved. The merger of business and operations teams translates into a need for technology that can be leveraged across silos.

The business of collateral management is reshaping every process and silo it touches. While the trends we have identified are not brand new, they all stand out for how far and fast they are advancing in 2018 and beyond. Financial services firms that take advantage of these trends concurrently and plan for a future where collateral is integrated across all areas of the business will improve their competitive positioning going forward. To add a sixth trend: firms that ignore broader thinking about collateral management technology do so at their own peril.

This article was originally published on Securities Finance Monitor.

Collateral and Liquidity Data Management: the next big challenge for financial institutions

The problem is well known: financial institutions have data all over the place. Small institutions tend to face straightforward challenges, while large ones must identify not only where data are hidden but also how they can be aggregated without disrupting other processes. Thankfully, new advances in collateral and liquidity technology are ready to make solutions cost-effective and relatively painless to implement.

Imagine these scenarios that require data:

  • Regulators are mandating reporting that looks at all assets of a corporation, both on and off balance sheet, across every subsidiary and geography. How does a central reporting group collect the information?
  • Sales traders and their clients are cautious about balance sheet charges. How can a sales trader tell a client about the netting opportunities in a trade compared to existing holdings?
  • Large institutions have recently created central collateral funding desks. How can a trading division know what collateral is available internally to commit to a counterparty and how much it will cost?

These are all situations where data aggregation and management can play a pivotal role, saving substantial time and effort and opening doors to enhanced revenue opportunities.

The Four Vs of Collateral Management Data

The obstacles to effective collateral data management today begin with the sheer volume and dispersal of data around the world. This is in some ways a ‘Big Data’ problem, albeit with industry-specific twists.
We see four Vs at work in collateral and liquidity data management:

  • Volume – the volume of data that must be managed reaches the gigabytes and terabytes for any financial institution of at least moderate size. The bigger the institution, the greater still the volume of information that must be captured and analyzed.
  • Variety – collateral and liquidity data do not come standardized in a pre-packaged format. Instead, users must contend with multiple forms of data that ultimately get combined to provide the right report or picture for taking action. This can happen with both internal and external data sources.
  • Velocity – data move fast, and every new trade in financial markets means that something has changed in an institution’s holdings, whether the value of stocks owned, the need for a collateral call or the credit limit of a counterparty.
  • Veracity – it is great to have all data in one place, but how can users be sure that the data are accurate? Users need a way to verify the integrity of data across the enterprise.
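The Veracity point lends itself to a concrete illustration: a basic reconciliation that compares position quantities reported by two internal sources and flags breaks. The source structures and tolerance below are assumptions made for the sketch:

```python
# Simple veracity check: reconcile {security: quantity} views from two
# internal sources and flag any breaks. Source formats are illustrative.
def reconcile(source_a, source_b, tolerance=0):
    """Return {security: (qty_a, qty_b)} for every quantity break
    larger than the tolerance; missing entries count as zero."""
    breaks = {}
    for sec in set(source_a) | set(source_b):
        qa = source_a.get(sec, 0)
        qb = source_b.get(sec, 0)
        if abs(qa - qb) > tolerance:
            breaks[sec] = (qa, qb)
    return breaks
```

A real verification layer would of course also check valuations, timestamps and reference data, but even this skeleton shows why veracity is a data aggregation problem first: the two sources must be mapped to common identifiers before they can be compared at all.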



Existing Solutions

While institutions have largely solved these problems for a single business or legal entity in one jurisdiction, the problem is far from solved once the boundaries extend beyond this limited scope. For example, getting US OTC derivatives to communicate with UK secured funding across different IT systems and countries can be difficult even within silos, let alone ensuring that technology solutions work together.

The financial markets industry has recognized the difficulty of collateral management and is supporting initiatives and utilities meant to solve the problem. DTCC-Euroclear GlobalCollateral Ltd is launching the Margin Transit Utility (MTU), which aims to aggregate a firm’s holdings across all custodians and Central Securities Depositories. This is a great start, but even if all market participants and depositories agree to connect to the MTU, firms will need to integrate this information internally and feedback information externally. That will require a significant work effort across the board and even in the best-case scenario will take time.

Most technology providers also have excellent solutions for calculating data and managing positions, but they rely on the client to deliver data internally. This is the same data problem once again: even the best collateral management system is made less effective by incomplete, unreliable data inputs. Technology solutions therefore need to evolve to connect and harmonize data across multiple silos more easily, without requiring major multi-year re-engineering efforts.

Case Study: Recovery and Resolution Reporting

While the problems inherent in daily trading operations are readily understood, the importance of collateral and liquidity data management grows even larger when considering regulatory reporting requirements. One example, the Federal Reserve’s SR14-1 recovery and resolution plan reporting process for banks, highlights the critical need for robust data management. According to a January 24, 2014 supervisory letter, the eight largest US banks should have:

  • Effective processes for managing, identifying, and valuing collateral it receives from and posts to external parties and affiliates;
  • A comprehensive understanding of obligations and exposures associated with payment, clearing, and settlement activities;
  • The ability to analyze funding sources, uses, and risks of each material entity and critical operation, including how these entities and operations may be affected under stress;
  • Demonstrated management information systems capabilities for producing certain key data on a legal entity basis that is readily retrievable, with controls in place to ensure data integrity and reliability; and
  • Robust arrangements in place for the continued provision of shared or outsourced services needed to maintain critical operations that are documented and supported by legal and operational frameworks.

Four of these five bullet points speak directly to data management. There can really be no question: it is not only good business practice for banks to have active collateral and liquidity data management programs, it is also a legal requirement under SR14-1.

Case Study: Central Collateral Trading Desks

As collateral visibility, management and optimization have grown in importance due to regulatory and economic pressures, many large financial institutions are setting up central collateral trading desks or functions. Trading collateral has always been a fundamental part of dealer business, but it is usually done in silos such as repo, securities lending, OTC derivatives and prime brokerage. The challenge of this new direction is that profitability has not grown at the same pace, which means that these desks may not receive sufficient investment to build the requisite analytics and technology. In addition, the centralization of bank services across operations and technology means that the needs of specific collateral types may get ignored in the event of a major technology renovation project.

A simple yet innovative solution to this problem is technology that serves as connectivity across all collateralized trading desks whether merged or in silos. Connectivity to repo, securities lending, OTC margining, futures, prime brokerage and other collateral-related business lines is critical to understanding both the big picture and the contributions of each business unit. By establishing this connectivity, firms can avoid major technology rebuilds or installs that may affect every trading desk in favor of middleware that provides data management as well as decision support across the organization.

By connecting all trading desks while leaving their product-specific technologies alone, firms can create a mechanism where data and analytics flow up to trading desks while decisions and actions flow down into the firm’s aggregate data pool. This creates a sizeable advantage for firms wanting to optimize their collateral trading activities while avoiding the cost and headache of a major technology project to harmonize platforms for data management.
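The middleware pattern described above can be illustrated in miniature: each desk keeps its own system and simply publishes inventory into a shared pool that firm-wide analytics can read. The desk names and record formats below are hypothetical:

```python
# Sketch of the middleware idea: desks keep their product-specific systems
# and publish {asset: quantity} feeds; the pool is the firm-wide view.
from collections import defaultdict

def aggregate_inventory(desk_feeds):
    """Combine per-desk {asset: quantity} feeds into one aggregate pool."""
    pool = defaultdict(int)
    for desk, holdings in desk_feeds.items():
        for asset, qty in holdings.items():
            pool[asset] += qty
    return dict(pool)

# Usage: two silos publish independently; analytics see one view.
firm_view = aggregate_inventory({
    "repo":        {"UST": 100},
    "sec_lending": {"UST": 50, "Bund": 30},
})
```

The design choice is the point: because the pool only reads published feeds, no desk has to replace its trading platform, which is exactly how middleware avoids the harmonization megaproject.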

The Transcend Street Solution

We at Transcend Street Solutions have considered the data problem across multiple large financial institutions in a new way. Many technology vendors seek to be the golden source of all data. We do not. Instead, we want to connect to every golden source of data where it stands now. This approach asks a financial institution only to provide access to its data, not to replace existing warehouses or infrastructure. Our first solution, CoSMOS, collates, harmonizes, mines and analyzes all valuable information across enterprise-wide systems in real time. We then feed those data into platforms for business user decision making, including regulatory reporting, internal applications and third-party collateral management systems. By acting as an overlay, our goal is to quickly get the data out of storage and into a useful, actionable format.

Once the process of collateral and liquidity data aggregation is complete throughout the global organization and across business units, there are a wide variety of applications that can be brought to bear. We see regulatory reporting, insight on collateral agreements, funding and position management, margin dashboard management and liquidity analytics as starting places. We expect that the collateral and liquidity space will evolve to require additional services.

Processing data for collateral and liquidity management is not an insurmountable task but it does take work. Many firms have only loose ideas about where every source of information is located internally across business units and geographies. But focusing on internal data aggregation enables a large number of other processes, reporting and technologies to function with maximum efficiency. The data problem is well-known: now solutions are appearing that confront the challenge in new ways.

This article was originally published on Securities Finance Monitor.