After over 120 entries have been reviewed, the FOW International Awards shortlist has been released. The winners will be unveiled at a Gala Dinner in London on 7 December.
For the original publication, click here: FOW Awards 2022
When it comes to collateral and inventory optimization, how do you know how your firm stacks up to industry best practices? How can you benchmark your progress, and importantly, pinpoint opportunities to solve inefficiencies?
Download Transcend’s Collateral Benchmarking Checklist and get a quick one-page snapshot to compare your firm’s funding, liquidity, optimization and risk capabilities to industry leaders.
The Transcend team would be happy to walk you through your assessment and discuss how to prioritize your optimization strategy to drive better results for your business – and in the shortest possible timeframe.
Last month, Transcend announced the launch of CCP Central, the first enterprise-wide CCP optimization and connectivity solution. We interviewed Transcend’s founder and CEO, Bimal Kadikar, to gain his insights on CCP Central.
For those who may be unfamiliar with Transcend CCP Central, can you please explain what the solution does?
CCP Central is all about connecting an ecosystem of disparate CCPs. Every CCP operates in its own way, with unique rules, requirements, and nuances. This lack of standardization across CCPs has made it difficult for firms to efficiently manage and scale their CCP-facing collateralization processes. As the market leader in collateral optimization, it was important for Transcend to incorporate CCPs in our technology framework in order to offer a truly holistic solution.
CCP Central provides our clients with real-time visibility across CCP relationships and thoughtfully operationalizes margin processes.
Can you walk through a use case of how a firm could benefit from CCP Central?
A great example is OCC. OCC is a challenge because, unlike other CCPs, which publish haircuts by security that apply to all members, it accepts securities under its “Collateral as Margin” procedure. This means that OCC charges each member a bespoke, variable haircut per security based on that security’s interplay with the member’s trading positions: the “Portfolio Specific Haircut,” or PSH, of each security.
Let’s say that, within a firm’s trading portfolio, adding GE shares as collateral reduces overall exposure while adding IBM shares would increase it. OCC would then give the firm more collateral value for GE shares than it would if the firm posted IBM shares. At another member, the situation could be reversed, and IBM shares would be more optimal than GE shares to pledge as collateral on the same day.
It becomes critical for firms to figure out which security to post based on the dynamic PSH method. Now, think about this complexity multiplied across all CCPs; it becomes a daunting challenge for most firms without an automated and smart solution.
Because Transcend connects CCP datasets and analyzes all reference data and constraints, our technology can seamlessly identify the best security to post in order to get the best value across all obligations.
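As a rough illustration of the dynamic described above, the selection logic can be sketched in a few lines of Python. The tickers and PSH figures below are hypothetical, not actual OCC haircuts, and the functions are illustrative assumptions rather than Transcend's implementation:

```python
# Illustrative only: pick the security that yields the most collateral value
# per unit of market value posted, given hypothetical portfolio-specific
# haircuts (PSH). These are not actual OCC haircut figures.

def collateral_value(market_value: float, psh: float) -> float:
    """Value credited by the CCP after applying the portfolio-specific haircut."""
    return market_value * (1.0 - psh)

def best_security_to_post(inventory, psh_by_security):
    """Return the ticker with the highest credited value per dollar posted."""
    return max(
        inventory,
        key=lambda t: collateral_value(inventory[t], psh_by_security[t]) / inventory[t],
    )

# Hypothetical: GE reduces this member's exposure (low PSH), IBM increases it
# (high PSH), so GE is the better asset to pledge at this member today.
inventory = {"GE": 1_000_000.0, "IBM": 1_000_000.0}
psh = {"GE": 0.05, "IBM": 0.25}
print(best_security_to_post(inventory, psh))  # -> GE
```

At another member the PSH map could be reversed, flipping the answer — which is exactly why the decision has to be recomputed per member, per day.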
What trends were you seeing in the industry that led to the development of CCP Central?
Traditionally, CCPs have been one of the most critical players in the derivatives markets, and they continue to rapidly grow in importance. Nevertheless, our clients continue to struggle with how to conduct holistic CCP optimization.
Typically, because most firms manage complex CCP requirements manually, they have to keep to simple funding routines focused on meeting critical requirements in a timely fashion. However, in doing so, they incur an opportunity cost by missing out on smarter combinations of assets that are achievable with better tools.
At Transcend, we are flipping that concept around. Our technology first identifies the most economically efficient collateral allocations and then carries out the operational processes required to instruct the movements.
How does CCP fit into Transcend’s solution suite of collateral, funding and liquidity products?
The Transcend solution suite is a cohesive, fully integrated yet modular platform. We have built each of our products thoughtfully and organically, integrating each module into a systematic architecture that can either solve very specific challenges or work together as an end-to-end enterprise solution.
CCP Central is an extension of our existing offering and allows clients to choose to either streamline CCP funding optimization on its own, or to fold cleared derivatives CCP funding into a broader firmwide collateral strategy.
What makes CCP Central different from other solutions, whether built internally or developed by third parties?
As far as I know, no clearing member or software provider has been able to build the holistic solution that Transcend offers, especially within the context of enterprise-level optimization. While many have spent years tactically developing parts of a digitized data framework, they have not been able to connect everything end-to-end.
The truth is that it is difficult and costly to build and maintain such wide connectivity. Transcend allows firms to minimize development, hosting and maintenance costs while reaping the benefits of a thoughtfully built solution that has evolved over the last seven years.
We’ve already overcome the challenges of comprehensive optimization by developing and implementing Transcend at top-tier banks, broker-dealers and custodians. Why reinvent the wheel or only solve part of the puzzle when Transcend can solve your firm’s greatest technological challenges in a very short timeframe?
Learn more about Transcend’s CCP Central solution, or request a demo.
The last decade has seen a dramatic increase in the breadth and depth of collateralized businesses. Firstly, regulatory requirements now necessitate that large banks hold more High-Quality Liquid Assets (HQLA) in order to manage broader liquidity needs and mitigate counterparty credit risk. Additionally, previously uncollateralized products and activities, such as OTC clearing, certain uncleared derivatives and TBA mortgage activities, now require collateral. Lastly, trading businesses have turned to financing liquid collateral for steadier, annuity-like revenue. This heightened focus on secured lending exists across all key financial services segments including retail broker-dealers, institutional broker-dealers and large banks. As a result of these changes, there is a greater need for companies to effectively manage scarce collateral resources.
While posting and receiving collateral has helped firms mitigate credit risks and create new revenue streams, it also adds further complexities for operations and risk management. Firms must now manage collateral received and posted across multiple business units, with varying margin regimes and customer types, under disparate legal agreements and collateral requirements. As a result, risk managers must now manage these various activities across the capital markets businesses and the enterprise.
We have seen risk managers particularly challenged by how to track margin calls generated from these disparate business activities to ensure counterparties remain effectively collateralized. Recent liquidations of large positions in Prime Brokerage businesses have proven that even in a lower volatility market, large margin calls necessitate active and intraday risk management of these exposures. Accordingly, firms must effectively manage their risk positions not only during periods of extreme market volatility, but also in times of market stability.
Now more than ever, it is important for firms to implement technology that aggregates and harmonizes collateral pools across the enterprise, including the “collateral profile” of these positions, in addition to intraday margin calls. These solutions can provide risk managers the ability to analyze transactions that produce or require collateral and margin calls that expose the firm to counterparty credit risk. The management of margin call activities is where counterparty credit and operational risks converge; firms need the appropriate technology infrastructure and operational processes in place to fully understand the terms of their contracts, status of margin calls, and current collateralization, thus enabling business and risk managers to mitigate potential credit risks.
Firms that have the operational capabilities to manage margin and collateral risks in real-time are establishing a competitive advantage in the market.
Transcend’s Intraday Margin Optimization Solutions are helping firms unlock real-time exposure analysis and superior collateral allocations to deliver firm-wide optimization with full STP connectivity. Learn more.
The business of collateral optimization has changed radically. In 2020, banks can no longer accept linear priority lists for collateral delivery because, when viewed globally across balance sheets and product lines, such lists no longer make sense. What was a cutting-edge solution even five years ago is now leaving money on the table. Automation is a central part of this change.
Automation of collateral optimization has shifted how solutions get implemented. While both vendors and institutions would always prefer a single solution that provides turnkey results, delivering the outsized value seen in the past now requires far more than simply ordering collateral lists.
Optimization is an ongoing process that requires both sophisticated software and engaged stakeholders. In this article we discuss recent client lessons, including five key observations from clients on how institutions need to consider collateral optimization, and how our clients are approaching the next complex layer of global inventory optimization.
Collateral optimization requires a complex mix of people and technology. While automation is usually a desired end-state, there are an extensive number of processes, regulatory and client constraints that need to be incorporated first. In the past few months, Transcend has learned some important lessons working with clients:
These examples represent the next iteration of collateral optimization, which recognizes the importance of automation as well as the realistic limitations of a human-centered process.
Aggregating inventory globally to a central data hub ensures that collateral optimization considers all available assets at any given time. It sounds easy, but in practice it involves substantial complexity, in particular the requirements that systems communicate with each other and that descriptive information about each asset is collected and accurate.
To date, collateral optimization has been a tactical and localized process. Individual business units have successfully delivered optimization for their region or silo but that has left the firm as a whole in the dark about where enterprise scalable opportunities may lie. Few firms have a holistic optimization strategy in place and fewer still have implemented one globally, but most recognize that tactical solutions have reached their limits. The next evolution of collateral optimization needs to occur to deliver on its promise of reduced costs and greater operational efficiency.
In an earlier article, Connected Data: The Opportunity for Collateral and Liquidity Optimization, I discussed the importance of connected data, or metadata, to global inventory management. This information covers: the tenor of a position; who the owner is; whether the position is owned by the firm or a client; rehypothecation status; and where it can be pledged at the lowest haircut. This enrichment process is still not conducted by most firms, resulting in real opportunity costs as assets aren’t fully optimized against the firm’s liabilities.
A global inventory optimization effort looks to solve for this problem by developing and assigning connected data to each asset. The process can be complex, but the end results deliver a level of collateral optimization that is robust and scalable. This is a cornerstone of broadening out the impact that optimization can have for financial services firms, starting from data and delivering through to actionable results.
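As a minimal sketch, the kind of connected data described above might be modeled as a tagged position record. The attribute names and the pledgeability rule below are simplified assumptions for illustration, not Transcend's actual data model:

```python
# A sketch of "connected data" tagged onto a position. The attributes and
# the pledgeability rule are simplified assumptions for illustration.
from dataclasses import dataclass

@dataclass
class EnrichedPosition:
    cusip: str
    quantity: int
    owner: str               # desk or client that holds the position
    is_client_asset: bool    # firm-owned vs. client-owned
    rehypothecatable: bool   # may the asset be re-pledged?
    tenor_days: int          # expected holding period of the position
    min_haircut_venue: str   # venue where it can be pledged at the lowest haircut

def pledgeable(p: EnrichedPosition) -> bool:
    """Firm assets are always mobilizable; client assets only if rehypothecatable."""
    return (not p.is_client_asset) or p.rehypothecatable

pos = EnrichedPosition("037833100", 10_000, "client_A", True, True, 30, "OCC")
print(pledgeable(pos))  # -> True
```

Once every asset carries this wrapper of attributes, an optimizer can filter the global pool to what is actually mobilizable before ranking it against the firm's liabilities.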
Effective global inventory optimization is an input to solving an array of other problems, including:
Solving the problems of global inventory management and process automation while building tools for human/technology/process engagement is Transcend’s core business. The client examples discussed here show that collateral optimization works best with tools that are well thought-out in advance. We continue to work with our clients to explore where the boundaries lie in optimizing not just collateral but also the process.
Automation of collateral optimization can clearly be a competitive advantage. With hundreds of millions in revenue on the line, advanced firms are now looking to integrate pre- and post-trade across silos. Deciding whether to use collateral for a repo vs. deliver for an OTC derivatives transaction has been discussed since optimization has been around, and firms are now in a position to actualize this intelligence. Collateral optimization is not easy, but the promise of delivering meaningful results to the front office could unlock a new generation of technology development in the collateral space.
This article was originally published on Securities Finance Monitor.
At Transcend, we have seen a growing shift in the industry towards firm-wide optimization of collateral, liquidity and funding. Our clients’ goals are to manage their capital more effectively and drive efficiencies across the enterprise, and that requires a coordinated, integrated and automated approach across siloed business lines, systems and processes. It is no small task to connect and harmonize vast sets of data related to collateral – such as agreements, positions and trades – and various workflows, but the returns are quickly realized. The good news is that firms can pursue their optimization strategy widely, or they can choose to focus on a priority area of their business and scale from there.
In 2020, we expect a continued increase in complexity and bottom-line pressures. Firms need to provide differentiated, competitive services to drive profitability, despite potentially operating with legacy technology and processes. Plus, they face growing reporting requirements and regulatory pressures (such as QFC Recordkeeping and SFTR). This is leading more firms to recognize the need for, and benefits of, a centralized optimization strategy that can help overcome multiple challenges through a singular solution.
Everyone understands that automation in the funding and collateral space is occurring at a fast pace. At Transcend, we believe that in five years, as much as 90% of funding will be done by machines. But what is not fully in focus is that connecting data from disparate sources is the key to this next evolution in the funding markets. Today, most data is fragmented across a firm. To be effective, data needs to flow from the original sources and be readable by each system in a fully automated way. Thus, harmonizing and connecting data needs to be every firm’s priority in order to achieve automation and optimization.
This article was originally published on Markets Media.
In this Global Investor Group Special Report, Collateral in 2020, Bimal Kadikar outlines the steps firms can take to optimise collateral at an enterprise-wide level and explains how a connected collateral ecosystem can be utilised to inform decision-making.
“Forward-looking firms have recognised that optimising collateral and liquidity across an enterprise, as well as within business areas, can drive efficiencies and deliver wider strategic benefits.”
Access Global Investor Group’s full report: Collateral in 2020 – Driving optimisation in an evolving ecosystem.
The function and definition of collateral and liquidity optimization has continued to expand from its roots in the early 2000s. Practitioners must now consider the application of connected data on security holders to operationalize the next level of efficiency in balance sheet management. A guest post from Transcend.
The concept of connected data, or metadata, in financial markets can sound like a new age philosophy, but really refers to the description of security holdings and agreements that together deliver an understanding of what collateral must be received and delivered, where it originated and how it must be considered on the balance sheet. This information is not available from simply observing the quantity and price of the security in a portfolio. Rather, connected data is an important wrapper for information that is too complex to show in a simple spreadsheet.
Earlier days of optimization meant ordering best-to-deliver collateral in a list, or creating algorithms based on Credit Support Annexes and collateral schedules. These were effective tools in their day and were appropriate for the level of balance sheet expertise and technology at hand; some were in fact quite advanced. These techniques enabled banks and buy-side firms to take advantage of best pricing in the marketplace for collateral assets that could be lent to internal or external counterparties. Many of these techniques are still in use today. While they deliver on what they were designed for, they are fast becoming outmoded. Consequently, firms relying on these methodologies struggle to drive further increases in balance sheet efficiency, and in order to maintain financial performance targets may need to charge higher prices. This is not a sustainable strategy.
The next level of collateral optimization considers connected data in collateral calculations. Interest is being driven by better technology that can more precisely track financial performance in real time. A finely tuned understanding of the nature of the individual positions and how they impact the firm can in turn mandate a new kind of collateral optimization methodology that structures cheapest to deliver based on a combination of performance impacting factors and market pricing. This gives a new meaning to “best collateral” for any given margin requirement. This only becomes possible when connected data is integrated into the collateral optimization platform.
As an example of applying connected data, not all equities are the same on a balance sheet. A client position that must be funded has one implication while a firm position has another. Both bring a funding and liquidity cost. A firm long delivered against a customer short is internalization, which has a specific balance sheet impact. Depending on balance sheet liquidity, this impact may need additional capital to maintain. Likewise, the expected tenor of a position will impact liquidity treatments. A decision to retain or post these different assets as collateral can in turn feed back into the Liquidity Coverage Ratio, Leverage Ratio and other metrics for internal and external consumption.
If these impacts can be observed in real-time, the firm may find that internalizing the position reduces balance sheet but is sub-optimal compared to borrowing the collateral externally. This of course carries its own funding and capital charges, along with counterparty credit limits and risk weightings in the bilateral market. These could in turn be balanced by repo-ing out the firm position, and by tenor matching collateral liabilities in line with the Liquidity Coverage Ratio and future Net Stable Funding Ratio requirements. Anyone familiar with balance sheet calculations will see that these overlapping and potentially conflicting objectives may result in decisions that increase or decrease costs depending on the outcome. By understanding the connected data of each position, including available assets and what needs to be funded, firms can make the best possible decision in collateral utilization. Importantly, the end result is to reduce slippage, increase efficiency, and ultimately deliver greater revenues and better client pricing based on smarter balance sheet management.
Another way to look at the new view of collateral optimization is as the second derivative. The first derivative was the ordering of lists or observation of collateral schedules. The next generation incorporates connected data across collateral holdings and requirements for a more granular understanding of what collateral needs to be delivered and where, and how this will impact the balance sheet and funding costs. It has taken some time to build the technology and an internal perspective, but firms are now ready to engage in this next level of collateral sophistication.
A connected data framework starts with assessing what data is available and what needs to be tagged to inform the next level of information about collateral holdings. This process is achievable only with a scalable technology solution: it is not possible to manage this level of information manually, let alone for real-time decision making. Building out a technology platform requires careful consideration of the end-to-end use case. If firms get this part right, they can succeed in building out a connected data ecosystem.
The connected data project also requires access to a wide range of data sources. Advances in technology have allowed data to be captured and presented to traders, regulators, and credit and operations teams. But right now, most data are fragmented, looking more like spaghetti than a coherent picture of activity across the organization. To be effective, data needs to flow from the original sources and be readable by each system in a fully automated way.
Once a usable, tagged data set has been established, it can then be applied to collateral optimization and actionable results. This can include what-if scenarios, algorithmic trading, workflow management, and further to areas like transfer pricing analytics. Assessing and organizing the data, then tagging it appropriately, can yield broad-ranging results.
An evolution in the practice of collateral optimization requires a more holistic view of what collateral is supposed to do for the firm and how to get there. This is a complex cultural challenge and is part of an ongoing evolution in capital markets about the role of the firm, digitization and how services are delivered. While difficult to track, market participants can qualitatively point to differences in how they and their peers think about collateral today versus five years ago. The further back one looks, the greater the change, which naturally suggests challenges when projecting a possible future state.
An important element to developing scalable collateral thinking is the application of technology; our observation is that technology and thinking about how the technology can be applied go hand-in-hand. As each new wave of technology is introduced, new possibilities emerge to think about balance sheet efficiency and also how services are delivered both internally and to clients. In solving these challenges for our clients using our technology, it is evident that a new vision is required before a technology roadmap can be designed or implemented.
The application of connected data for the collateral market is one such point of evolution. Before connected data were available on a digitized basis, collateral desks relied either on ordered lists or individual/manual understandings of which positions were available for which purposes. There was no conversation about the balance sheet except in general terms. Now, however, standardized connected data means that every trading desk, operations team and balance sheet decision maker can refine options for what collateral to deliver based on the best balance sheet outcome in near real-time. New scenarios can be run that were never possible, and pricing for clients can be obtained in time spans that used to take hours if not a day or more.
Now that collateral optimization based on connected data is available, this requires firms to think about what services they can deliver to clients on an automated basis, and what should be bundled and what should be kept disaggregated. As new competitors loom in both the retail and institutional space, these sorts of conversations driven by technology and collateral become critical to the future of the business. Connected data is leading the way.
This article was originally published on Securities Finance Monitor.
Collateral management has transitioned from an ancillary service to a core competency, largely as a result of the sheer breadth of activity from front to back office and horizontally across silos and asset classes. This has spurred a marked shift towards centralization of collateral management, providing organizations with a centralized view of inventory as well as funding and collateral optimization decisions.
But the move to a more efficient and centralized model is not without challenges. Inefficiencies and the cost of errors are magnified by the multiplicity of internal and external relationships that need to be managed and the requirement to control positions more frequently, even in real-time.
This requires a fundamental shift from managing assets only for margin purposes to managing assets for value, cost and balance sheet purposes.
Moving to a centralized collateral organization is a difficult step for many reasons and as a result, some firms are decoupling their business organization from their technology capabilities. They are instead focusing on building a centralized, horizontal technology strategy for inventory and collateral management.
In either case, the end goal may be the same – a holistic infrastructure that can yield the benefits of centralized collateral and inventory management coupled with sophisticated analytics and firm-wide optimization capabilities. Fortunately, today’s technology enables this ultimate goal as well as the smaller moves in this direction.
Regardless of the approach taken, there are a number of best practices for firms looking to increase the efficiency of their collateral and liquidity management:
These are vital foundational steps towards achieving an optimized collateral management environment.
Of course, bringing the data together is just one part of the process – the next step is to connect the data so that algorithms and analytics can be applied to it. Firms understand that the information is there for them to make better decisions, but they face a challenge in getting useable information and putting it to work.
The main obstacle, in most cases, is that they have built their operational structures and technology around specific areas of the business. To achieve a view across the whole enterprise, these businesses require coordination and connectivity across a large number of different internal and external systems – not easy to accomplish.
The solution lies in implementing a system that is easy to integrate and is targeted at connecting and harmonizing this data.
There are sometimes negative connotations around the phrase ‘legacy technology’ but this is not always accurate. A firm’s existing securities lending, repo or margin systems may be good, but they will more often than not have been built as separate systems. Rather than re-engineering all these systems, what the firm needs is a layer that pulls these disparate systems together to ensure they are seeing a holistic and harmonized view of inventory, positions and obligations.
Most firms have taken some steps to improve their inventory management, but there is a wide difference across the industry in terms of the strategies adopted to achieve this objective. Some organizations are trying to address the issue in a tactical way, fixing one system at a time to see whether this gives them greater visibility, but this approach does not have much longevity from a strategic perspective.
The larger organizations have usually taken a more strategic approach. Some see it as primarily an internal engineering effort, while others are talking to firms such as Transcend as they seek to harness real-time data, collateral and liquidity.
Regardless of the approach taken, being able to optimize collateral and liquidity decisions at an enterprise level has huge benefits. The sheer number of firms and analysts that have explored the scale of these benefits underlines the significance of the opportunity, and we find that most firms are actively taking steps towards achieving these capabilities.
Optimization models can be implemented with a rules-based approach or even using more sophisticated algorithms (i.e. linear and non-linear programming models). These all have a vital role to play in monetizing the connected data across the firm.
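As a toy illustration of the rules-based approach mentioned above, the sketch below fills each obligation from a cheapest-to-fund priority list after applying that obligation's haircut. All names, costs and haircuts are hypothetical:

```python
# Toy rules-based collateral allocator (illustrative only).
# assets: (name, market_value, funding_cost); cheapest-to-fund is delivered first.
# obligations: (obligation_id, required_post_haircut_value, haircut).

def allocate(assets, obligations):
    """Return {obligation_id: [(asset_name, market_value_used), ...]}."""
    pool = sorted([list(a) for a in assets], key=lambda a: a[2])  # by funding cost
    plan = {}
    for ob_id, required, haircut in obligations:
        remaining = required
        plan[ob_id] = []
        for asset in pool:
            if remaining <= 1e-9:  # tolerance for float rounding
                break
            name, available, _cost = asset
            if available <= 0:
                continue
            usable = available * (1 - haircut)   # post-haircut value on offer
            take = min(usable, remaining)        # post-haircut value consumed
            mv_used = take / (1 - haircut)       # market value consumed
            asset[1] -= mv_used
            plan[ob_id].append((name, round(mv_used, 2)))
            remaining -= take
    return plan

assets = [("UST", 100.0, 0.001), ("CorpBond", 100.0, 0.004)]
obligations = [("CCP_A", 95.0, 0.05)]
print(allocate(assets, obligations))
```

A production optimizer would replace this greedy loop with a linear (or non-linear) program over eligibility, concentration and other constraints, but the shape of the inputs and outputs is similar.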
Being able to optimize collateral across business lines is an obvious benefit, but there are also advantages to be gained from reducing internal errors and fail rates. In addition, funding costs will fall because firms will be managing their funding operations more efficiently: improving secured funding leads to a reduction in more expensive, unsecured funding.
Whether or not firms embrace centralization across all aspects of their business, it is clear that rationalizing complex systems and harnessing fragmented data sets provides for informed, confident and compliant decision-making. And once centralized funding and collateral management are fully achieved, the benefits of efficiency, cost-savings and liquidity attain even greater scale for the firm.
This article was originally published on Global Investor Group.
As collateral rules have grown in complexity, so has the need for greater optimization – but as Tim Steele [of Funds Europe] discovers, achieving that can be painful.
Collateral has long been used as a tool for mitigating counterparty risk and obtaining credit, but now more than ever, it is the key determinant of an institution’s ability to engage in financial transactions in the cash or derivatives markets….
“If you optimize every pool or silo individually, as a firm you will by design not be optimized,” says Bimal Kadikar.
Following the financial crisis, regulations and their associated reporting have created an opportunity for banks and investment firms to create a single, unified collateral infrastructure across all product siloes. This does not have to be a radical architecture rebuild, but rather can be achieved incrementally.
There are legitimate historical reasons why collateral infrastructure has grown up as a patchwork of systems and processes. For products such as stock lending, repo, futures or contracts for difference (CFDs), the collateral/margining process was generally integral to the products and processing systems. It would not have made sense to break out collateral management into a separate group and hence operating teams and systems were structured around the core product unit. Generally, only OTC derivatives had a relatively clear decoupling between collateral management and other operational processes. Even as business units merged at the top level, this product separation at the collateral management level often continued.
While this situation could stand during non-stress periods, the financial crisis demonstrated the fallacy that siloed, uncoordinated collateral management systems, data and processes could weather any storm. This disjointed view caused a number of specific problems, including: an inability to see the full exposures to counterparties; a lack of organization in cash and non-cash holdings; and substantial inflexibility in mobilizing the overall collateral pool. Even before the crisis, inconsistent or “zero cost allocation” for collateral usage meant that collateral was not always being directed to the parts of the business that needed it most. After the crisis, with collateral and High Quality Liquid Assets at a premium, this became unacceptable.
Today, few banks and investment firms have completed the work of integrating their collateral management functions across products (see Exhibit 1). Some of the largest banks are focused on building capabilities to achieve enterprise-wide collateral optimization, while others are just starting on this effort, at least on a silo basis. Some have bought or built large systems with cross-product support, although this has proven costly. Others are evaluating organizational consolidation. Whatever their current state, a new round of regulatory reporting requirements in the US and Europe means that firms can no longer set collateral infrastructure to one side and still meet business or compliance objectives without adding substantial staff. One way or another, long-term solutions must be achieved.
Exhibit 1. Source: Transcend Street Solutions
While nearly all large firms have digested the current waves of regulatory reporting and collateral management requirements, the next round will soon be arriving. Among these are the Federal Reserve’s supervisory guidance SR 14-1, MiFID II (the revised Markets in Financial Instruments Directive), and the Securities Financing Transactions Regulation (SFTR). It is worth looking at some of these requirements in detail to understand what else is being demanded of collateral management infrastructure and departments.
The Federal Reserve’s SR 14-1 guidance is aimed at improving the resolution process for US bank holding companies. It includes a high-level requirement that banks should have effective processes for managing, identifying, and valuing the collateral they receive from and post to external parties and affiliates. At the close of any business day, banks should be able to identify exactly where any counterparty collateral is held, document all netting and rehypothecation arrangements, and track inter-entity collateral related to inter-entity credit risk. On a quarterly basis, they need to review CSAs, differences in collateral requirements and processes between jurisdictions, and forecast changes in collateral requirements. Also on the theme of improved resolution rules are the record-keeping requirements related to “Qualified Financial Contracts” (effectively most non-cleared OTC transactions). These require banks to identify the details and conditions of the master agreements and CSAs applying to the relevant trades.
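To make the end-of-day requirement concrete, the sketch below models a minimal, hypothetical schema for the view SR 14-1 asks for: for any counterparty, which legal entity and custodian holds its collateral, and whether the governing agreement permits reuse. Field names and sample values are invented for illustration.

```python
# Minimal sketch (hypothetical schema) of an end-of-day collateral
# location report: where does a given counterparty's collateral sit,
# and does the governing CSA permit its reuse?
from dataclasses import dataclass
from collections import defaultdict

@dataclass
class CollateralPosition:
    counterparty: str      # who provided the collateral
    entity: str            # our legal entity holding it
    custodian: str         # account where the assets actually sit
    asset: str
    value: float           # current valuation
    reuse_permitted: bool  # from the governing CSA / master agreement

def eod_location_report(positions, counterparty):
    """Roll up one counterparty's collateral by entity, custodian and
    reuse permission, summing current valuations."""
    report = defaultdict(float)
    for p in positions:
        if p.counterparty == counterparty:
            report[(p.entity, p.custodian, p.reuse_permitted)] += p.value
    return dict(report)

positions = [
    CollateralPosition("FundA", "BankCo NY",  "CustodianX", "UST",  50.0, True),
    CollateralPosition("FundA", "BankCo LDN", "CustodianY", "Gilt", 30.0, False),
    CollateralPosition("FundB", "BankCo NY",  "CustodianX", "UST",  20.0, True),
]

print(eod_location_report(positions, "FundA"))
# {('BankCo NY', 'CustodianX', True): 50.0,
#  ('BankCo LDN', 'CustodianY', False): 30.0}
```

In a real implementation these records would be sourced from settlement and custody systems across every product silo, which is precisely why a unified infrastructure is a precondition for this report.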
While the regulatory intent is understandable, these requirements are exceptionally difficult to meet without a unified collateral infrastructure. There is in fact no way to respond without a single, holistic view of collateral and exposure across the enterprise. While SR14-1 impacts only the largest banks, it still means these banks have a mandate to complete the work they have begun in organizing their vast collection of collateral information. This will lead to greater collateral opportunities for the big banks, and may in turn encourage smaller competitors to complete the same work in order to exploit similar new efficiencies.
Article 15 of Europe’s SFTR places restrictions on the reuse of collateral (rehypothecation). The provider of collateral has to be informed in writing of the risks and consequences of their collateral being reused. They also have to provide prior, express consent to the reuse of their collateral. Even with the appropriate documentation and reporting in place, a collateral management department has to carefully ensure that the written agreement on reuse is strictly complied with. While no equivalent rule has yet been written in the US, market participants believe that the US Office of Financial Research will soon require mandatory reporting that may entail overlapping requirements.
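Operationally, these two conditions lend themselves to a systematic pre-reuse control. The sketch below is a minimal gate in the spirit of Article 15; the field names are hypothetical, standing in for whatever agreement data a firm actually captures.

```python
# A minimal pre-reuse control in the spirit of SFTR Article 15 (field
# names are hypothetical): collateral may be reused only if the provider
# received a written risk disclosure AND gave prior express consent.
def may_reuse(agreement: dict) -> bool:
    return bool(agreement.get("risk_disclosure_delivered")) \
       and bool(agreement.get("express_reuse_consent"))

csa = {"risk_disclosure_delivered": True, "express_reuse_consent": True}
print(may_reuse(csa))                                   # True
print(may_reuse({"risk_disclosure_delivered": True}))   # False: no consent
```

Embedding a check like this at the point of allocation, rather than relying on after-the-fact review, is what "strictly complied with" implies in practice.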
Similarly, MiFID II introduces strict restrictions on the use of customer assets for collateral purposes and potentially has a major impact on collateralized trading products. Best execution demands a complicated analysis: in OTC and securities financing markets, best execution may be a function of term, price, counterparty risk and/or collateral acceptance. Further, any variation from a standard best-price policy needs to be documented to show how the investment firm or intermediary sought to safeguard the interest of the client.
SFTR and MiFID II require that banks rethink their entire reporting methodologies, and in some cases must rethink parts of their business model. A wide range of new information must be captured, analyzed, consolidated, and reported outwards and internally. This will likely generate new ideas and business opportunities around collateral usage and pricing for those firms that can digest the large quantities of new information that will be produced.
The struggle at many firms to comply with regulations while maximizing profitability has led to two parallel sets of infrastructures: one for the business and another for compliance. This creates two levels of cost that duplicate substantial effort inside the firm. Along the way, business lines get charged twice for this work as costs are allocated back to the business. This has an immediate negative impact on profitability; even firms that have completed collateral optimization immediately lose a piece of that financial benefit.
The cumulative impact of regulation means that banks and investment firms generally cannot afford to wait for consolidation projects to deliver a single integrated platform. The fragmentation of teams, data and processes is a hurdle for any institution to overcome, but so is the old mindset that treats collateral management as an isolated operational process.
We identify five critical areas for firms to address in order to create a foundation for their holistic collateral infrastructure:
The fifth recommendation, building a functional operational model for collateral, means being able to connect together disparate business lines to provide an enterprise view of collateral. It includes mining collateral agreements to make optimal decisions or decisions mandated by regulation. It requires the ability to perform analysis of collateral to balance economic and regulatory drivers, and it requires controls and transparency of client collateral across all margin centers.
At Transcend Street Solutions, we are actively working with our clients to help them develop a strategic roadmap of business and technology deliverables to achieve a holistic collateral infrastructure. While there are always organizational as well as infrastructural nuances in every business, we have seen the framework proposed above yield a positive return for our clients. Our technology platform, CoSMOS, is nimble, modular and customizable to accelerate collateral infrastructure evolution without necessarily having to retire existing systems or undergo a big infrastructural lift.
Getting this right is important for more than just regulatory compliance. It means the collateral function and trading desks can perform the forward-looking processes required to support both profitable trading and firm-wide decision making. Pre-trade analytics is needed to ensure that collateral is allocated optimally across portfolios and collateral agreements. Optimization is also needed at the trade level to ensure the most suitable collateral is applied to each trade or structure. Finally, analysis needs to be carried out across the whole inventory of securities and cash positions to ensure collateral is used by the right businesses. After all, correct pricing of collateral across business lines is essential not only for firm-level profitability but also for incentivizing desirable behavior throughout the organization.
We strongly believe that firms that succeed in achieving a holistic collateral architecture will have a significant competitive advantage in the industry. They will be able to optimize collateral and liquidity across business silos while meeting most global regulatory requirements, all with a much more efficient IT spend.
This article was originally published on Securities Finance Monitor.