Collateral Management Technology Vendor Survey 2021 From Finadium Features Transcend

Finadium featured Transcend in a new survey on Collateral Management Technology Vendors in 2021. The survey presents an inside look at the technology vendors who are leading the future of collateral technology – and the incredible feats clients can accomplish with them.

Finadium profiles Transcend as a solution to manage collateral, funding, and liquidity within distinct business lines and across the enterprise. By connecting data and processes across disparate systems, Transcend’s holistic solutions help clients run sophisticated analytics, optimization and automation.

“Transcend was purpose-built to provide the most advanced post-trade collateral optimization capabilities in the industry.”

– 2021 Finadium Collateral Management Technology Vendor Survey

Finadium subscribers can download the survey to learn more about Transcend’s role in driving more effective collateral management and collateral optimization, as well as some new functionality recently added to the Transcend platform.

Learn More About Transcend

Transcend empowers financial institutions to maximize enterprise-wide financial performance and
operational efficiency. Through real-time global inventory and collateral management and optimization
solutions, Transcend helps clients manage intraday liquidity, funding and regulatory requirements.
With seamless workflows that connect front office decision-making with back office operations,
Transcend’s innovative technology promotes smarter investment decisions and improved financial
performance.

Contact the Transcend team for more information on our fully integrated suite of solutions.

Collateral Benchmarking Checklist: How Does Your Firm Compare?

When it comes to collateral and inventory optimization, how do you know how your firm stacks up to industry best practices? How can you benchmark your progress, and importantly, pinpoint opportunities to solve inefficiencies? 

Download Transcend’s Collateral Benchmarking Checklist and get a quick one-page snapshot to compare your firm’s funding, liquidity, optimization and risk capabilities to industry leaders. 

The Transcend team would be happy to walk you through your assessment and discuss how to prioritize your optimization strategy to drive better results for your business – and in the shortest possible timeframe.

2020 Outlook: Bimal Kadikar, Transcend Street Solutions

What were the key themes for your business in 2019?

At Transcend, we have seen a growing shift in the industry towards firm-wide optimization of collateral, liquidity and funding. Our clients’ goals are to manage their capital more effectively and drive efficiencies across the enterprise, and that requires a coordinated, integrated and automated approach across siloed business lines, systems and processes. It is no small task to connect and harmonize vast sets of data related to collateral – such as agreements, positions and trades – and various workflows, but the returns are quickly realized. The good news is that firms can pursue their optimization strategy widely, or they can choose to focus on a priority area of their business and scale from there.

What are your expectations for 2020?

In 2020, we expect a continued increase in complexity and bottom-line pressures. Firms need to provide differentiated, competitive services to drive profitability, despite potentially operating with legacy technology and processes. Plus, they face growing reporting requirements and regulatory pressures (such as QFC Recordkeeping and SFTR). This is leading more firms to recognize the need for – and the benefits of – a centralized optimization strategy that can address multiple challenges through a single solution.

What trends are getting underway that people may not know about but will be important?

Everyone understands that automation in the funding and collateral space is occurring at a fast pace. At Transcend, we believe that in five years, as much as 90% of funding will be done by machines. But what is not fully in focus is that connecting data from disparate sources is the key to this next evolution in the funding markets. Today, most data is fragmented across a firm. To be effective, data needs to flow from the original sources and be readable by each system in a fully automated way. Thus, harmonizing and connecting data needs to be every firm’s priority in order to achieve automation and optimization.

This article was originally published on Markets Media.

Connected Data: The Opportunity for Collateral and Liquidity Optimization

The function and definition of collateral and liquidity optimization has continued to expand from its roots in the early 2000s. Practitioners must now consider the application of connected data on security holdings to operationalize the next level of efficiency in balance sheet management. A guest post from Transcend.

The concept of connected data, or metadata, in financial markets can sound like a new age philosophy, but really refers to the description of security holdings and agreements that together deliver an understanding of what collateral must be received and delivered, where it originated and how it must be considered on the balance sheet. This information is not available from simply observing the quantity and price of the security in a portfolio. Rather, connected data is an important wrapper for information that is too complex to show in a simple spreadsheet.

Earlier days of optimization meant ordering best-to-deliver collateral in a list, or creating algorithms based on Credit Support Annexes and collateral schedules. These were effective tools in their day and were appropriate for the level of balance sheet expertise and technology at hand; some were in fact quite advanced. These techniques enabled banks and buy-side firms to take advantage of best pricing in the marketplace for collateral assets that could be lent to internal or external counterparties. Many of these techniques are still in use today. While they deliver on what they were designed for, they are fast becoming outmoded. Consequently, firms relying on these methodologies struggle to drive further increases in balance sheet efficiency, and may need to charge higher prices in order to maintain financial performance targets. This is not a sustainable strategy.

The next level of collateral optimization considers connected data in collateral calculations. Interest is being driven by better technology that can more precisely track financial performance in real time. A finely tuned understanding of the nature of the individual positions and how they impact the firm can in turn mandate a new kind of collateral optimization methodology that structures cheapest-to-deliver decisions based on a combination of performance-impacting factors and market pricing. This gives a new meaning to “best collateral” for any given margin requirement. This only becomes possible when connected data is integrated into the collateral optimization platform.
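
To make the idea concrete, here is a minimal sketch in Python of how such a ranking might look, assuming hypothetical field names, weights and balance-sheet adjustments; a production model would draw these from the firm's own connected data and calibrations.

```python
from dataclasses import dataclass

@dataclass
class EligibleAsset:
    """Hypothetical connected-data view of one eligible collateral position."""
    isin: str
    market_funding_bps: float    # market cost of sourcing/funding the asset
    lcr_haircut_pct: float       # liquidity value lost if the asset leaves the HQLA pool
    is_client_long: bool         # client position that must be funded vs. a firm position
    expected_tenor_days: int     # how long the position is expected to stay on the book

def delivery_cost_score(asset: EligibleAsset) -> float:
    """Blend market pricing with balance-sheet impact into one 'cost to deliver' number.
    The weights are illustrative placeholders, not a calibrated model."""
    score = asset.market_funding_bps
    score += asset.lcr_haircut_pct * 0.5           # penalize draining high-quality liquid assets
    score += 5.0 if asset.is_client_long else 0.0  # client longs carry extra funding implications
    score -= min(asset.expected_tenor_days, 90) * 0.05  # stable, long-tenor inventory is cheaper to commit
    return score

def rank_cheapest_to_deliver(assets: list[EligibleAsset]) -> list[EligibleAsset]:
    """Order eligible collateral from cheapest to most expensive to deliver."""
    return sorted(assets, key=delivery_cost_score)

inventory = [
    EligibleAsset("XS0000000001", market_funding_bps=12.0, lcr_haircut_pct=15.0,
                  is_client_long=False, expected_tenor_days=120),
    EligibleAsset("XS0000000002", market_funding_bps=8.0, lcr_haircut_pct=0.0,
                  is_client_long=True, expected_tenor_days=10),
]
for asset in rank_cheapest_to_deliver(inventory):
    print(asset.isin, round(delivery_cost_score(asset), 2))
```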

As an example of applying connected data, not all equities are the same on a balance sheet. A client position that must be funded has one implication while a firm position has another. Both bring a funding and liquidity cost. A firm long delivered against a customer short is internalization, which has a specific balance sheet impact. Depending on balance sheet liquidity, this impact may need additional capital to maintain. Likewise, the expected tenor of a position will impact liquidity treatments. A decision to retain or post these different assets as collateral can in turn feed back into the Liquidity Coverage Ratio, Leverage Ratio and other metrics for internal and external consumption.

If these impacts can be observed in real-time, the firm may find that internalizing the position reduces balance sheet but is sub-optimal compared to borrowing the collateral externally. This of course carries its own funding and capital charges, along with counterparty credit limits and risk weightings in the bilateral market. These could in turn be balanced by repo-ing out the firm position, and by tenor matching collateral liabilities in line with the Liquidity Coverage Ratio and future Net Stable Funding Ratio requirements. Anyone familiar with balance sheet calculations will see that these overlapping and potentially conflicting objectives may result in decisions that increase or decrease costs depending on the outcome. By understanding the connected data of each position, including available assets and what needs to be funded, firms can make the best possible decision in collateral utilization. Importantly, the end result is to reduce slippage, increase efficiency, and ultimately deliver greater revenues and better client pricing based on smarter balance sheet management.
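
As a purely illustrative back-of-the-envelope comparison of the internalize-versus-borrow decision described above, with made-up funding, capital and balance sheet charges standing in for the firm's real metrics:

```python
# Hypothetical all-in daily costs, in basis points of notional (illustrative numbers only).
def internalization_cost(balance_sheet_bps: float, extra_capital_bps: float) -> float:
    # Internalizing a firm long against a customer short shrinks the balance sheet,
    # but may require additional capital to maintain depending on liquidity treatment.
    return balance_sheet_bps + extra_capital_bps

def external_borrow_cost(borrow_fee_bps: float, rwa_charge_bps: float,
                         repo_out_income_bps: float) -> float:
    # Borrowing the collateral externally carries a fee and counterparty RWA,
    # partly offset by repo-ing out the firm position.
    return borrow_fee_bps + rwa_charge_bps - repo_out_income_bps

internal = internalization_cost(balance_sheet_bps=-3.0, extra_capital_bps=6.0)
external = external_borrow_cost(borrow_fee_bps=4.0, rwa_charge_bps=2.0, repo_out_income_bps=5.0)

print(f"internalize: {internal:.1f} bps, borrow externally: {external:.1f} bps")
print("decision:", "internalize" if internal <= external else "borrow externally")
```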

Another way to look at the new view of collateral optimization is as the second derivative. The first derivative was the ordering of lists or observation of collateral schedules. The next generation incorporates connected data across collateral holdings and requirements for a more granular understanding of what collateral needs to be delivered and where, and how this will impact the balance sheet and funding costs. It has taken some time to build the technology and an internal perspective, but firms are now ready to engage in this next level of collateral sophistication.

Implementing technology for connected data in collateral and liquidity

A connected data framework starts with assessing what data is available and what needs to be tagged for informing the next level of information about collateral holdings. This process is achievable only with a scalable technology solution: it is not possible to manage this level of information manually, let alone for real-time decision-making. Building out a technology platform requires careful consideration of the end-to-end use case. If firms get this part right, they can succeed in building out a connected data ecosystem.

The connected data project also requires access to a wide range of data sources. Advances in technology have allowed data to be captured and presented to traders, regulators, and credit and operations teams. But right now, most data are fragmented, looking more like spaghetti than a coherent picture of activity across the organization. To be effective, data needs to flow from the original sources and be readable by each system in a fully automated way.

Once a usable, tagged data set has been established, it can then be applied to collateral optimization and actionable results. This can include what-if scenarios, algorithmic trading, workflow management, and further to areas like transfer pricing analytics. Assessing and organizing the data, then tagging it appropriately, can yield broad-ranging results.
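
A minimal sketch of what a tagged position record might look like once connected data is attached is shown below; the field names and tags are assumptions for illustration, not a prescribed schema.

```python
from dataclasses import dataclass, field

@dataclass
class TaggedPosition:
    """Illustrative connected-data wrapper around a raw position (field names are assumptions)."""
    isin: str
    quantity: int
    source_system: str          # where the position originated (repo, sec lending, derivatives...)
    legal_entity: str           # booking entity, needed for balance-sheet and regulatory views
    encumbrance: str            # e.g. "free", "pledged_triparty", "pledged_ccp"
    governing_agreement: str    # CSA / GMRA / GMSLA reference that constrains its use
    expected_tenor_days: int    # informs liquidity treatment
    tags: dict = field(default_factory=dict)  # any further metadata needed by analytics

positions = [
    TaggedPosition("US912828ZT04", 1_000_000, "repo", "BankCo LLC",
                   "free", "GMRA-2011-001", 30, {"hqla_level": "1"}),
    TaggedPosition("US0378331005", 250_000, "prime_brokerage", "BankCo LLC",
                   "pledged_ccp", "CSA-2019-042", 5, {"hqla_level": "NA"}),
]

# A downstream optimizer or what-if tool can now filter on the tags rather than raw quantities.
free_hqla = [p for p in positions
             if p.encumbrance == "free" and p.tags.get("hqla_level") == "1"]
print([p.isin for p in free_hqla])
```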

Building out the collateral mindset

An evolution in the practice of collateral optimization requires a more holistic view of what collateral is supposed to do for the firm and how to get there. This is a complex cultural challenge and is part of an ongoing evolution in capital markets about the role of the firm, digitization and how services are delivered. While difficult to track, market participants can qualitatively point to differences in how they and their peers think about collateral today versus five years ago. The further back the comparison goes, the greater the change, which naturally suggests comparable challenges when looking ahead to a possible future state.

An important element to developing scalable collateral thinking is the application of technology; our observation is that technology and thinking about how the technology can be applied go hand-in-hand. As each new wave of technology is introduced, new possibilities emerge to think about balance sheet efficiency and also how services are delivered both internally and to clients. In solving these challenges for our clients using our technology, it is evident that a new vision is required before a technology roadmap can be designed or implemented.

The application of connected data for the collateral market is one such point of evolution. Before connected data were available on a digitized basis, collateral desks relied either on ordered lists or individual/manual understandings of which positions were available for which purposes. There was no conversation about the balance sheet except in general terms. Now however, standardized connected data means that every trading desk, operations team and balance sheet decision maker can refine options for what collateral to deliver based on the best balance sheet outcome in near real-time. New scenarios can be run that were never possible, and pricing for clients can be obtained in time spans that used to take hours if not a day or more.

Now that collateral optimization based on connected data is available, this requires firms to think about what services they can deliver to clients on an automated basis, and what should be bundled and what should be kept disaggregated. As new competitors loom in both the retail and institutional space, these sorts of conversations driven by technology and collateral become critical to the future of the business. Connected data is leading the way.

This article was originally published on Securities Finance Monitor.

Centralized collateral management becoming a reality

Collateral management has transitioned from an ancillary service to a core competency, largely as a result of the sheer breadth of activity from front to back office and horizontally across silos and asset classes. This has spurred a marked shift towards centralization of collateral management, providing organizations with a centralized view of inventory as well as funding and collateral optimization decisions.

But the move to a more efficient and centralized model is not without challenges. Inefficiencies and the cost of errors are magnified by the multiplicity of internal and external relationships that need to be managed and the requirement to control positions more frequently, even in real-time.

This requires a fundamental shift from managing assets only for margin purposes to managing assets for value, cost and balance sheet purposes.

Moving to a centralized collateral organization is a difficult step for many reasons and as a result, some firms are decoupling their business organization from their technology capabilities.  They are instead focusing on building a centralized, horizontal technology strategy for inventory and collateral management.

In either case, the end goal may be the same – a holistic infrastructure that can yield the benefits of centralized collateral and inventory management coupled with sophisticated analytics and firm-wide optimization capabilities. Fortunately, today’s technology enables this ultimate goal as well as the smaller moves in this direction.

Steps to collateral optimization

Regardless of the approach taken, there are a number of best practices for firms looking to increase the efficiency of their collateral and liquidity management:

  1. Achieve visibility into inventory across multiple business lines and regions. This centralized view is extremely important.
  2. Ensure all collateral schedules and legal agreements are easily accessible as these will impose constraints on decision-making.
  3. Take a centralized view of different types of obligations and requirements to enable good decision-making.
  4. Establish targeted analytics and Key Performance Indicators (KPIs) to measure and monitor progress of these initiatives.

These are vital foundational steps towards achieving an optimized collateral management environment.
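
As a simple illustration of steps 1 and 4 above, the sketch below aggregates a handful of made-up positions from different business lines and regions into one centralized inventory view and derives a basic KPI from it; the data and field names are hypothetical.

```python
from collections import defaultdict

# Illustrative position feed from different business lines and regions (made-up data).
positions = [
    {"business": "repo",        "region": "EMEA", "isin": "DE0001102580", "qty": 5_000_000, "encumbered": False},
    {"business": "sec_lending", "region": "AMER", "isin": "US912828ZT04", "qty": 2_000_000, "encumbered": True},
    {"business": "derivatives", "region": "APAC", "isin": "DE0001102580", "qty": 1_000_000, "encumbered": False},
]

# Step 1: one centralized view of inventory, keyed by security across silos.
inventory = defaultdict(lambda: {"total": 0, "unencumbered": 0})
for p in positions:
    inventory[p["isin"]]["total"] += p["qty"]
    if not p["encumbered"]:
        inventory[p["isin"]]["unencumbered"] += p["qty"]

# Step 4: a basic KPI, e.g. the share of inventory that is unencumbered and mobilizable.
total = sum(v["total"] for v in inventory.values())
free = sum(v["unencumbered"] for v in inventory.values())
print(f"unencumbered share: {free / total:.0%}")
for isin, view in inventory.items():
    print(isin, view)
```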

Connected data: The key to better decision-making

Of course, bringing the data together is just one part of the process – the next step is to connect the data so that algorithms and analytics can be applied to it. Firms understand that the information is there for them to make better decisions, but they face a challenge in getting usable information and putting it to work.

The main obstacle, in most cases, is that they have built their operational structures and technology around specific areas of the business. To achieve a view across the whole enterprise, these businesses require coordination and connectivity across a large number of different internal and external systems – not easy to accomplish.

The solution lies in implementing a system that is easy to integrate and is targeted at connecting and harmonizing this data.

Avoiding costly re-engineering

There are sometimes negative connotations around the phrase ‘legacy technology’ but this is not always accurate. A firm’s existing securities lending or repo or margin systems may be good, but they will more often than not have been built as separate systems. Rather than re-engineering all these systems, what the firm needs is a layer that pulls these disparate systems together to ensure they are seeing a holistic and harmonized view of inventory, positions and obligations.

Most firms have taken some steps to improve their inventory management, but there is a wide difference across the industry in terms of the strategies adopted to achieve this objective. Some organizations are trying to address the issue in a tactical way, fixing one system at a time to see whether this gives them greater visibility, but this approach does not have much longevity from a strategic perspective.

The larger organizations have usually taken a more strategic approach. Some see it as primarily an internal engineering effort, while others are talking to firms such as Transcend as they seek to harness real-time data, collateral and liquidity.

Regardless of the approach taken, being able to optimize collateral and liquidity decisions at an enterprise level has huge benefits. The sheer number of firms and analysts that have explored the scale of these benefits underlines the significance of the opportunity, and we find that most firms are actively taking steps towards achieving these capabilities.

Optimization models can be implemented with a rules-based approach or with more sophisticated algorithms (e.g., linear and non-linear programming models). These all have a vital role to play in monetizing the connected data across the firm.
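
For example, a small collateral allocation problem can be posed as a linear program. The sketch below uses scipy.optimize.linprog with toy assets, requirements and costs; the figures and the simplified constraints (full coverage, inventory limits, no eligibility schedule) are assumptions for illustration only.

```python
import numpy as np
from scipy.optimize import linprog

# Toy allocation: two assets (A, B) cover two margin requirements (R1, R2).
# Decision variables x = [A->R1, A->R2, B->R1, B->R2], in market-value units.
# Costs per unit reflect the funding/opportunity cost of pledging each asset (illustrative).
cost = np.array([1.0, 1.0, 3.0, 3.0])      # asset B is more expensive to deliver

# Each requirement must be fully covered (equality constraints on the amount delivered to it).
A_eq = np.array([
    [1, 0, 1, 0],   # R1 coverage: A->R1 + B->R1
    [0, 1, 0, 1],   # R2 coverage: A->R2 + B->R2
])
b_eq = np.array([80.0, 50.0])              # required collateral per requirement

# Cannot allocate more of an asset than is held (inequality constraints).
A_ub = np.array([
    [1, 1, 0, 0],   # total use of asset A
    [0, 0, 1, 1],   # total use of asset B
])
b_ub = np.array([100.0, 100.0])            # available inventory per asset

res = linprog(cost, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
              bounds=[(0, None)] * 4, method="highs")
print("optimal allocation:", np.round(res.x, 1), "total cost:", round(res.fun, 1))
```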

Scaling the benefits

Being able to optimize collateral across business lines is an obvious benefit, but there are also advantages to be gained from reducing internal errors and fail rates. In addition, funding costs will fall because firms will be managing their funding operations more efficiently: improving secured funding leads to a reduction in more expensive, unsecured funding.

Whether or not firms embrace centralization across all aspects of their business, it is clear that rationalizing complex systems and harnessing fragmented data sets provides for informed, confident and compliant decision-making. And once centralized funding and collateral management are fully achieved, the benefits of efficiency, cost-savings and liquidity attain even greater scale for the firm.

This article was originally published on Global Investor Group.

In five years, 90% of funding will be done by machines

You may disagree with the number of years or the percent, but everyone understands that automation in the funding and collateral space is occurring at a fast pace. The question is how to prepare for this inevitable future. Our view is that connecting data from disparate sources is the key to the next evolution in the funding markets. A guest post from Transcend.

Who in the capital markets industry isn’t seeking greater profitability or returns? From balance sheet pressures and competitive dynamics to the resources required to comply with regulation, focusing on transformative change to advance the firm has been a huge challenge. At the same time, technology is evolving at a rapid pace and the availability of structured and unstructured data is presenting a whole new level of opportunities. For firms to realize this opportunity, connecting disparate data and adopting smart algorithms across the institution are a critical part of any strategy.

Advances in technology have allowed data to be captured and presented to traders, credit, regulators, and operations. But right now, most data are fragmented, looking more like spaghetti than a coherent picture of activity across the organization. Individual extracts exist that sometimes cross silos, but more often cannot be reconciled across sources or users. To be effective, data needs to flow from the original sources and be readable by each system in a fully automated way. It does not matter if individual systems are old or new, in the cloud or behind firewalls, from vendor packages or in-house technology: they all have to work together. We call this connected data.

Businesses have understood for some time that this will require greater automation, which will be a critical driver of success. Banks and asset managers know that they have to do something: doing nothing is not an option. Machine learning and artificial intelligence are part of the solution, and firms have embarked on projects large and small to enable automation under watchful human eyes. The new element to consider in the pace of change is the ability of machines to connect, process and analyze data within technology platforms for exposure management, regulatory reporting and pricing. The more data that feeds into technology on the funding desk, the more that automated decision-making can occur.

While individual systems and silos can succeed on their own, a robust and integrated data management process brings the pieces together and enables the kinds of decision-making that today can only be performed by senior finance and risk managers. Connected data is therefore possibly the most important link between automation and profitability. It is a daunting task to consider major changes to all systems that are in play, but most firms are adopting a strategy to build a centralized platform that brings data from multiple businesses and sources. A key benefit of this strategy is that advances in technology and algorithms can be applied to this platform, enabling multiple businesses or potentially the whole enterprise to benefit from this investment.

The risk of inaction

Connected data can stake its claim as the next major competitive advantage in the markets. Like algorithmic trading and straight-through processing, which were once novelties and are now taken for granted, the build-out of a connected data architecture combined with the tools to analyze data will initially provide some firms with an important strategic advantage in cost and profitability management.

With all the talk about data, there is an important human element to what inaction means. In a data-driven, technology-led world, having more people, or even all the right people, will not stop a firm from being left behind, and in fact may become a strategic disadvantage. The value of automation is to identify a trade opportunity based on its characteristics, the firm’s capital and the current balance sheet profile. Humans cannot see this flow with the same speed as a computer, and cannot make as fast a decision on whether the trade is profitable from a funding and liquidity perspective. While the classic picture of a trader shouting across a room to check whether a trade is profitable makes for a good movie scene, it is unwieldy in the current environment. A competitor with connected data in place can make that decision in a fraction of the time and execute the trade before the slower firm has brought the trade to enough decision-makers to move forward.

The competitive race towards connected data means that firms with more headcount will see higher costs and less productivity. As firms with efficient and automated funding decision tools employ new processes for decision-making, they will gain a competitive advantage due to cost management, and could even drive spread compression in the funding space. This will put additional pressure on firms that have stood still, and is the true danger of inaction at this time.

Action items for connected data

Data is only as good as the reason for using it. Firms must embark on connecting their data with an understanding of what the data are for, also called foundational functionality. This is the initial building block for what can later become a well-developed real-time data infrastructure.

Each transaction has three elements: a depository ladder for tracking movements by settlement locations; a legal entity or trading desk ladder; and a cash ladder. Each of these contains critical information for connecting data across the organization. If your firm has a cross-business view of fixed income, equities and derivatives on a global basis, then you are due a vacation. We have not yet seen this work completed by any firm, however, and expect that this will be a major focus for banks through 2019 and 2020.
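
A minimal sketch of how one settlement movement might project onto all three ladders is shown below; the field names and sample movements are hypothetical.

```python
from dataclasses import dataclass
from collections import defaultdict

@dataclass
class Settlement:
    """One expected movement; field names are illustrative."""
    isin: str
    qty: int                 # signed: positive = receive, negative = deliver
    cash: float              # signed cash leg
    value_date: str
    depository: str          # e.g. DTC, Euroclear, Clearstream
    legal_entity: str
    desk: str

def build_ladders(movements: list[Settlement]):
    """Project the same movements onto the three ladders described above."""
    depository_ladder = defaultdict(int)   # (depository, isin, date) -> net quantity
    entity_ladder = defaultdict(int)       # (legal_entity, desk, isin, date) -> net quantity
    cash_ladder = defaultdict(float)       # (legal_entity, date) -> net cash
    for m in movements:
        depository_ladder[(m.depository, m.isin, m.value_date)] += m.qty
        entity_ladder[(m.legal_entity, m.desk, m.isin, m.value_date)] += m.qty
        cash_ladder[(m.legal_entity, m.value_date)] += m.cash
    return depository_ladder, entity_ladder, cash_ladder

moves = [
    Settlement("US912828ZT04", +1_000_000, -995_000.0, "2019-06-03", "DTC", "BankCo LLC", "repo"),
    Settlement("US912828ZT04", -1_000_000, +996_000.0, "2019-06-04", "DTC", "BankCo LLC", "sec_lending"),
]
dep, ent, cash = build_ladders(moves)
print(dict(cash))
```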

Ultimately, an advanced data infrastructure must provide and connect many types of data in real-time, such as referential data, market data, transactions and positions. “Unstructured” data, such as agreements and terms, capital and liquidity constraints, and risk limits, must also be available more broadly for better decision-making, despite their tendency to be created in some specific silo. But an important early step is ensuring visibility into global, real-time inventory across desks, businesses, settlement systems and transaction types; this is critical to optimize collateral management. Access to accurate data can increase internalization and reduce fails, cutting costs and operational RWA. This is especially important for businesses that have decoupled their inventory management functionality over time, for example, OTC derivatives, prime brokerage and securities financing. Likewise, the ability to access remote pools of high-quality assets, whether for balance sheet or lending purposes, can have direct P&L impacts.

Step two is the development of rules-based models to establish the information flows that are critical to connecting data across a firm and simultaneously optimizing businesses at the book, business entity and firm levels. The system must understand a firm’s flows and what variables need to be monitored and controlled within a business line and across the firm. Data will push in both directions, for example to and from regulatory compliance databases or between settlement systems and a trader’s position monitors. Rules-based systems simplify and focus on what is otherwise a very complex set of inter-related and overlapping priorities (see Exhibit 1).
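
One simplified way such a rules-based layer could be expressed is sketched below; the rules, thresholds and trade fields are invented for illustration and would in practice be firm-specific.

```python
# A minimal rules engine: each rule is a predicate plus an action describing where the
# data should flow or which control should fire. Rules and thresholds are illustrative.
rules = [
    (lambda t: t["notional"] > 50_000_000,
     "route to regulatory what-if model before execution"),
    (lambda t: t["asset_class"] == "repo" and t["tenor_days"] > 365,
     "flag for NSFR review"),
    (lambda t: t["counterparty_rating"] < 3,
     "push to credit limit monitor"),
]

def evaluate(trade: dict) -> list[str]:
    """Return every action triggered by the trade; rules run in order and all matches apply."""
    return [action for predicate, action in rules if predicate(trade)]

trade = {"notional": 75_000_000, "asset_class": "repo",
         "tenor_days": 400, "counterparty_rating": 2}
for action in evaluate(trade):
    print(action)
```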

Connected data can enable significant improvements such as:

  • Regulatory models can be fed a real-time, pre-trade “what-if” scenario so businesses can know how much a particular trade absorbs in terms of capital, liquidity or balance sheet for a given return, or whether a trade is balance sheet-, capital- or margin-reducing.
  • Data can feed analytics that tell a trader, salesperson, manager or any stakeholder what kind of trades they should focus on in order to keep within their risk limits, with information on a granular client level.
  • XVA desks, the groups often charged with balancing out a firm’s risk and capital, can not only be looped in but push information back to a trader in real-time so they can know the impact of a trade.
  • Systems that track master agreements can be linked and analytics can point toward the most efficient agreement to use for a given trade.
  • Trading and settlement systems can interface with market utilities, both backward and forward.
  • Transfer pricing tools can be built into the system core and be transparent to all stakeholders with near instantaneous speed, at scale.

Transcend’s recent experience with some of the top global banks shows the value of consolidating data into one infrastructure. We are connecting front- and back-office to market infrastructure and providing information in a dashboard, in real-time. As trades book on the depository ladder, key stakeholders can see the change in their dashboard application and can make decisions on funding manually or feed back new parameters to pricing models across the enterprise. The same transactions and positions affect the real-time inventory view from legal entity or customer perspectives, as well as driving cash and liquidity management decisions. Over time, as banks get more comfortable with their data management tools, parts of decision-making that follow specific rules can be automated. This will be an excellent deployment of the new data framework.

Betting on the time or the percent

As machine learning and AI advance, and connected data becomes more of a reality, technology platforms will learn how to efficiently mine and analyze data to understand if a trade satisfies institutional regulatory, credit, balance sheet, liquidity, and profitability hurdles. This will lead to an environment where a trade inquiry comes in electronically, is accepted or rejected, and processed automatically through the institution’s systems. The steps in this process are methodical, and there is nothing outside of what financial institutions do today that would prevent execution. A reduction in manual intervention can allow traders to focus on what is important: working on the most complex transactions to turn data into information and action.

The fact that more automation is occurring in funding markets is certain. The question at this time is how long it will take to automate most of the business. This is a bet on the timeline, or on the percent to which funding decisions can be automated, but not on the direction of the trend line. Could it be as much as 90% in five years? Answers will vary by firm, and some of the major players are already developing strategies to progress in this direction. Typically, people overestimate the impact of a new technology in the short term, but underestimate the impact in the long term. Banks have already invested in machine learning and AI tools to make automated funding a reality. But it will depend on the next and more complex step: to ensure that connected data can reach these tools, allowing for a robust view of positions, regulatory metrics and profitability requirements across the firm.

This article was originally published on Securities Finance Monitor.

Collaboration, Communication (and a Margarita?): The Catalysts for IT Innovation

Leadership, especially in critical but technologically challenged functions like collateral management, is the key to seizing a competitive advantage.

IT innovation doesn’t just happen, even in the capital markets where opportunities for substantial improvements in areas like collateral and liquidity management can lead to greater, measurable and sustainable returns. All IT innovation needs commitment, investment and a strategy to make a difference. But most importantly, it needs unwavering leadership if it is to deliver the competitive success it promises.

And here lies the conundrum.

Bank executives already allocate hundreds of millions of dollars (even billions) annually towards technology budgets, yet they are still being bombarded by the claims of a myriad of new developments and solutions that promise an elusive holy grail.

Strengthen Decision-Making

How should the business digitize, become platform-based and leverage open architectures to drive data management strategies that deliver intelligent information?  Finding the key to this will strengthen decision-making across all front-to-back office functions.

But it’s not surprising that there is resistance to change, with perennial questions to be answered such as: Why can’t we get more out of our existing IT estate? Will that spend even deliver half of what it promises? What disruption will there be to existing systems while this takes place and how long will it take?

These are understandable executive concerns, given the time consumed by regulatory compliance, the dynamics of a rapidly changing market, and constant pressures to reduce costs and improve margins. Also, not unnaturally, executives lean heavily on historically well-resourced internal IT teams to guide future decision-making, and hence investment.

But it still came as a shock to many when a 2015 Accenture Report estimated that 96% of bank board members had no professional technology experience, while only 3% of bank CEOs had any formal IT knowledge. At the same time, another study said that the top 10 banks have more IT personnel than the top 10 financial software vendors.

Some say that “ignorance is bliss”, but others counter, “If that’s the case why aren’t there more happy people about?” And this reveals the dilemma.

Define the Divide

A lack of IT and business alignment in banking has been a thorny subject for years, constantly framing the two sides as adversaries, rather than partners. These differences often create a chasm of understanding of the priorities, objectives and vision of “success” for each side, effectively stagnating progress toward the necessary transformation.

But there is a way forward.

Remove Gridlocks

Take, for example, collateral management. We know processes are often gridlocked, liquidity constrained, technology inflexible and access to pertinent data denied by historic silos and working practices. Every week we see how this results in lower capital returns and impaired profitability, at a time of increased competition and shrinking margins.

What used to be a straightforward back-office task to ensure sufficient and appropriate collateral has become mission-critical in pre-trade decision-making as constraints on capital, regulatory pressures and efficiency mandates demand optimized collateral deployment firm-wide.

But recognizing the problem is only the first challenge. Attempting to fix system pitfalls with a few bandages on already stretched legacy systems tends to compound the problem over time.

Trust External Expertise and Innovation

Experience shows that wider collaboration is feasible – and is working. Banks are now better able to lean on the expertise of outside IT vendors, whose claims are not only battle-proven but complement rather than threaten internal teams. Developing collaborative partnerships with the business, internal IT and select external vendors who bring new ideas, innovation and experience to the table can significantly advance the firm’s technology objectives. Furthermore, there is a greater willingness to consider cloud-based solutions, as cost benefits and improved resilience start to outweigh historic operational risk concerns.

Align Talent with Objectives

This collaborative approach also benefits internal departments by enabling them to deploy talent where it can be most effective. It encourages the injection of fresh ideas into internal debates, complementing existing capabilities with a step-by-step series of tactical enhancements that eventually deliver a strategic objective – without undermining business opportunities or day-to-day operations.

If this leads to more effective data aggregation and analysis, there will be better-informed decisions that deliver tangible improvements to business profitability, while also reducing risk and bolstering regulatory compliance.

A fresh look at enterprise-wide technologies also lays the foundations for ongoing automation of critical business processes. By starting in a segment like collateral management that impacts all asset classes, business functions and jurisdictions, firms can enable each stakeholder across trading workflows to evolve and provide greater value to the broader enterprise.

This should not only produce a more effective and profitable business but a better informed and more confident executive team that is further empowered to deploy technology more widely to the best benefit of the business.

Once there, they can probably also have a laugh and raise a margarita to Jimmy Buffett, who one of my island-loving peers quotes: “Is it ignorance or apathy? Hey, I don’t know and I don’t care” – because by then everyone will know and they will care.

Top five trends in collateral management for 2018

Collateral management has broadened far past simple margin processing; collateral now impacts a majority of financial market activity, from determining critical capital calculations to shaping customer experience to driving strategic investment decisions. In this article, we identify the top five trends in collateral management for 2018 and highlight important areas to watch going forward.

The holistic theme driving forward collateral management is its central role in financial markets. Collateral has grown so broad as to make even its name confusing: where collateral can refer to a specific asset, the implications of collateral today can reach through reporting, risk, liquidity, pricing, infrastructure and relationship management. The opportunities for collateral professionals have likewise expanded, and non-collateral roles must now have an understanding of collateral to deliver their core obligations to internal and external clients.

We see a common theme running through five areas to watch in collateral management in the coming year: the application of smarter data and intelligence to drive core business objectives. Many firms have digested the basics of collateral optimization and are now ready to incorporate a broader set of parameters and even a new definition of what optimization means. Likewise, technology investments in collateral are starting to tie into broader innovation projects at larger firms; this will unlock new value-added opportunities for both internal and external facing technology applications.

Here are our top five trends for collateral management in 2018:

#5 Technology Investments

The investment cycle in collateral-related technology applications continues to grow at a rapid pace. Collateral management budget discussions are moving from the back office to the top of the house. And partly as a result, the definition of the category is also changing. Collateral management should no longer be seen as strictly the actions of moving margin for specified products, but rather is part of a complex ecosystem of collateral, liquidity, balance sheet management and analytics. The usual first-order investment targets of these budgets are internally focused, including better reporting, inventory management and data aggregation. The second derivative benefit of a more robust data infrastructure focuses on externally facing trading applications, including tools for traders and client intelligence utilities that provide real-time information and pricing for the benefit of all parties. This new category does not yet have a simple name; one could think of it as a “recommendation system,” but regardless of the name, it has become a major driver of forward-looking bank technology efforts and efficiency drives.

As large financial services firms capture the benefits of their current round of investments, they will increasingly turn towards integrating core innovations in artificial intelligence, Robotic Process Automation and other existing technologies into their collateral-related investments. This will unlock a large new wave of opportunity for how business is conducted and what information can be captured, analyzed and then automated for a range of client-facing, business line, internal management and reporting applications.

#4 Regulatory reporting

Ten years on from the depths of the Great Recession, regulatory reporting requirements for banks and asset managers continue to evolve. Largely irrespective of jurisdiction, the core problem facing these firms is aggregating and linking data together for reporting automation. Due to strict timeframes and complex requirements, firms historically relied on a pre-existing mosaic of technology and human resources to satisfy regulatory reporting needs. However, these tactical solutions made scale, efficiency and responsiveness to new rules difficult. The challenge of regulatory reporting is a puzzle that, once solved, appears obvious. But the process of solving the puzzle can create substantial challenges.

Looking at one regulation alone misses the transformative opportunity of strategic data management across the organization. Whether it is SFTR, MiFID II, Recovery & Resolution Planning requirements of SR-14/17 or Qualified Financial Contracts (QFCs), the latest initiative du jour should be a kick off for a broader rethink about data utilization. Wherever a firm starts, the end result must be a robust data infrastructure that can aggregate and link information at the most granular level. At a high level, firms will need to develop the capability to link all positions and trading data with agreements that govern these positions, collateral that is posted on the agreements, any guarantees that may be applied and any other constraints that need to be considered. Additionally, it has to be able to format and produce the needed information on demand. Achieving this goal will take meaningful work but will make organizations not only more efficient but also more future proof.
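
As a rough illustration of that linkage, the sketch below models positions, agreements, collateral postings and guarantees as linked records and assembles an on-demand report row by following the keys; entity names and fields are assumptions, and real reporting schemas are far richer.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Agreement:
    agreement_id: str
    agreement_type: str          # e.g. "GMRA", "ISDA/CSA", "GMSLA"
    counterparty: str

@dataclass
class CollateralPosting:
    posting_id: str
    agreement_id: str            # link back to the governing agreement
    isin: str
    market_value: float

@dataclass
class Position:
    trade_id: str
    agreement_id: str            # link to the governing agreement
    guarantee_id: Optional[str]  # link to a guarantee, if any applies
    notional: float

def report_row(pos: Position, agreements: dict, postings: list[CollateralPosting]) -> dict:
    """Assemble one on-demand report row by following the links between records."""
    agreement = agreements[pos.agreement_id]
    collateral = [c for c in postings if c.agreement_id == pos.agreement_id]
    return {
        "trade_id": pos.trade_id,
        "counterparty": agreement.counterparty,
        "agreement_type": agreement.agreement_type,
        "collateral_mv": sum(c.market_value for c in collateral),
        "guarantee_id": pos.guarantee_id,
    }

agreements = {"GMRA-1": Agreement("GMRA-1", "GMRA", "Dealer A")}
postings = [CollateralPosting("P-1", "GMRA-1", "DE0001102580", 10_200_000.0)]
print(report_row(Position("T-1", "GMRA-1", None, 10_000_000.0), agreements, postings))
```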

#3 Transfer pricing

As firms try to optimize collateral across the enterprise, it is critical that they develop reasonably sophisticated transfer pricing mechanisms to ensure appropriate cost allocations as well as sufficient transparency to promote best incentives in the organization. Many sell-side firms have highly granular models with visibility into secured and unsecured funding, XVA, balance sheet and capital costs. And in varying fashion these firms allocate some or all of these costs internally. But many challenges remain, including how these costs should be charged directly to the trader or desk doing the trade, and what the right balance is between allocating actual costs and incentivizing business behavior that maximally benefits the client franchise overall. As we know, client business profiles change through time, as do funding and capital constraints. There may be a conscious decision to do some business that may not make money in support of other areas that are highly profitable. Transfer pricing is evolving from a bespoke, business-aligned process to a dynamic, enterprise tool. The effort to enhance transfer pricing business models continues to be refined and expanded.

Firms that embrace the next iteration of transfer pricing will achieve a more scalable, efficient and responsive balance sheet. This will include capturing both secured and unsecured funding costs, plus firm-wide and business specific liquidity and capital costs. Accurately identifying the range of costs can properly incentivize business behaviors beyond simply the cost of an asset in the collateral market. Ultimately, transfer pricing can be a tool to drive strategic balance sheet management objectives across the firm.

Functionally, implementing transfer pricing requires access to substantial data on existing balance sheet costs, inventory management and liquidity costs that firms must consider. Much like collateral optimization, the building block of a robust transfer pricing methodology is data. Accurate information on transfer pricing can then flow back into trading and business decisions to be truly effective.
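
A deliberately simplified sketch of an all-in transfer price calculation follows; the cost components and figures are invented, and the behavioral adjustment illustrates a policy choice rather than a formula.

```python
def transfer_price_bps(secured_spread_bps: float,
                       unsecured_share: float,
                       unsecured_spread_bps: float,
                       liquidity_cost_bps: float,
                       capital_cost_bps: float,
                       incentive_adjustment_bps: float = 0.0) -> float:
    """Illustrative all-in internal charge for funding a position, in basis points.
    The blend of actual cost and behavioral incentive is a policy choice, not a fixed rule."""
    funding = (1 - unsecured_share) * secured_spread_bps + unsecured_share * unsecured_spread_bps
    return funding + liquidity_cost_bps + capital_cost_bps + incentive_adjustment_bps

# Made-up example: a desk funds a position 80% secured / 20% unsecured.
charge = transfer_price_bps(secured_spread_bps=15.0,
                            unsecured_share=0.2,
                            unsecured_spread_bps=90.0,
                            liquidity_cost_bps=8.0,
                            capital_cost_bps=12.0,
                            incentive_adjustment_bps=-5.0)  # deliberate subsidy for franchise business
print(f"all-in transfer price: {charge:.1f} bps")
```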

#2 Collateral control and optimization

Optimization is evolving well beyond an operations-driven process of finding opportunity within a business to an enterprise-wide approach at the pre-trade, trade and post-trade levels. Pre-trade “what-if” analyses that inform a trader whether a proposed transaction is cost-accretive or cost-reducing to the franchise are important, but they require an analytics tool that can comprehend the impact on the firm’s economic ecosystem. At the point of trade, identifying demands and sources of collateral across the entire enterprise extends to knowing where inventories are across business lines, margin centers, legal entities and regions. It also means understanding the operational nuances and legal constraints governing those demands across global tri-parties, CCPs, derivative margin centers and securities finance requirements.

In a simple example, collateral posted on one day may not be the best to post a week later; if posted collateral becomes scarce in the securities financing market and can be profitably lent out, it may be unwise to provide it as margin. A holistic post-trade analysis, complete with updated repo or securities lending spreads, can tell a trader about missed opportunities, leading to a new form of Transaction Cost Analytics for collateralized trading markets.
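
A toy version of such a post-trade check might look like the following; the instruments, fees and substitution logic are illustrative assumptions rather than an actual TCA methodology.

```python
# Illustrative post-trade check: compare what the posted collateral now earns (nothing,
# while pledged as margin) against what it could earn in the securities lending market,
# net of the cost of substituting cheaper eligible collateral. Numbers are made up.

def missed_opportunity_bps(current_lending_fee_bps: float,
                           substitute_funding_cost_bps: float) -> float:
    """A positive result suggests recalling the posted asset and substituting is worthwhile."""
    return current_lending_fee_bps - substitute_funding_cost_bps

posted = {"isin": "US0378331005", "lending_fee_bps": 45.0}   # became special after posting
substitute = {"isin": "US912828ZT04", "funding_cost_bps": 10.0}

gap = missed_opportunity_bps(posted["lending_fee_bps"], substitute["funding_cost_bps"])
if gap > 0:
    print(f"substitute {substitute['isin']} for {posted['isin']}: ~{gap:.0f} bps pickup")
else:
    print("posted collateral remains the right choice")
```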

#1 Integration of derivatives & securities finance (fixed income and equities)

The need for taking a holistic approach to collateral management has led the industry toward significant business model changes. Collateral is common currency across an enterprise and must be properly allocated to wherever it can be used most efficiently. This means that traditional silos – repo, securities lending, OTC derivatives, exchange traded derivatives, treasury and other areas – need to be integrated. Operations groups that have been doing fundamentally the same thing can no longer be isolated from one another; the cost savings that come from process automation and avoiding operational duplication are too compelling.

On the front-office side, changes to trading behavior, culture and reporting, to name a few, are often very difficult to implement over a short period of time. Despite similar flows and economic guidelines, different markets and operations centers, even though they sit under the same roof, traditionally suffer from asymmetric information. To address this challenge, a handful of large sell-side players have combined some aspects of these businesses under the “collateral” banner, sometimes along with custody or other related processing businesses. Others have developed an enterprise solution to inventory and collateral management. We expect that, more and more, management is seeing the common threads and shared risks involved. The merger of business and operations teams translates into a need for technology that can be leveraged across silos.

The business of collateral management is reshaping every process and silo it touches. While the trends we have identified are not brand new, they all stand out for how far and fast they are advancing in 2018 and beyond. Financial services firms that take advantage of these trends concurrently and plan for a future where collateral is integrated across all areas of the business will improve their competitive positioning going forward. To add a sixth trend: firms that ignore broader thinking about collateral management technology do so at their own peril.

This article was originally published on Securities Finance Monitor.

Revisiting the Importance of Inventory Management in Collateral and Liquidity

In this article in Securities Finance Monitor, Transcend’s CEO Bimal Kadikar discusses the opportunities for more effective liquidity and collateral management – and the potential benefits to the bottom line. A solid starting point is inventory management whereby firms can match collateral to needs, improve front-to-back office communications and increase operational efficiency and compliance.

Access the full report on Securities Finance Monitor.