A foundation for value in capital markets

In a capital-markets environment where digital capabilities play a large and growing role, a bank’s success depends in large part on its data and application architecture—the rules, policies, standards, and topology underlying its technology stack. Architecture renewal is a foundational requirement for competing in the capital-markets industry, and banks must get it right to remain relevant. Technology investments based on a shaky architecture will result in weak performance, poor decisions, dissatisfied customers, and financial losses.

The benefits of a superior architecture

The role of the data and application architecture in supporting the primary and secondary markets means that it is often regarded as a cost center. In a digitally enabled environment, however, that perspective is outmoded. In fact, the return on investment for banks with superior architectures is likely to be significant. Revenue growth is increasingly correlated with technological intensity, and equities, flow rates, and foreign exchange in particular rely increasingly on strong designs that can support investments in execution and connectivity solutions (Exhibit 1).

Exhibit 1: Revenue growth is increasingly correlated with technological intensity.

A superior architecture brings further benefits to the capital-markets value chain by supporting the automation of tasks and processes, reducing IT expenditures, and accelerating time to market. The savings include lower licensing costs, reduced HR budgets, and fewer reconciliations. A well-designed architecture also enables more incisive analytics and, in some cases, can mitigate capital requirements. Often it allows front-office employees to be redeployed to higher-value activities. McKinsey estimates that the combined value impact of these benefits for a large bank with a technology budget of about $3 billion can be as high as $1.1 billion annually (Exhibit 2).

Exhibit 2: A better data and application architecture boosts financial performance.

Higher revenue

McKinsey estimates that for the top 12 banks, the capital-markets revenue benefit of a better architecture could amount to as much as $200 million a year, thanks to improved cross-selling, faster and smarter trading, a more efficient allocation of credit, and an enhanced ability to launch new products.

The benefits are both direct and indirect. One bank found that the automation of client onboarding not only cut costs but also boosted sales and customer satisfaction. A strong architecture ensures that analytics are based on definitive information that can be accessed speedily and processed in a consistent format. Stronger analytics, in turn, can lead to improved pricing, better lead generation, more precise credit terms, fewer credit losses, and stronger sales operations (Exhibit 3). All these have a significant revenue upside.

Exhibit 3: Advanced analytics add value across the full capital-markets value chain.

Balance-sheet efficiency and capital savings

A strong data and application architecture and the resulting improvement in data quality create capital savings for market, credit, and operational risks. Banks can cut risk-weighted assets (RWA) because better data will allow them to avoid overly conservative assumptions and mismatches in quarterly true-ups. McKinsey estimates that the top 12 banks can save as much as $300 million a year in capital exposures. A superior data architecture can also help banks to avoid operational shortfalls—for example, those arising out of late or inaccurate trade booking—and that may further reduce long-term RWA.

Business-cost reductions

A better data architecture can cut costs in the front office, operations, and finance, as well. For large banks, the additional value can be as high as $200 million a year—an estimate based on McKinsey’s calculations for the top 12.

When a capital-markets bank implemented what it called “straight-through processing on steroids,” which was based on the renewal of its architecture, it reduced its cash-equity trading headcount from 500 to 12 in a little over a decade. The bank also cut its derivatives-confirmation cycle from 28 to 7 days and its costs by 20 percent.

Other benefits resulting from an improved architecture include avoiding missed trades, as well as reductions in exception-handling work, data-processing breaks, and manual interventions. The finance function can benefit significantly from lower rework rates, improved data cleansing, and a reduced need for reconciliation.

The automation of post-trade life-cycle processing also brings significant gains. One high-volume firm calculated its costs at $3 per trade for fixed-income straight-through processing, compared with $240 per trade for exceptions. Exception rates vary significantly—the best-performing banks have rates that are 45 to 75 percent below the average.
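
To see why exception rates matter so much, the arithmetic below blends the two per-trade costs cited above. The exception rates used are illustrative assumptions, not benchmarks.

```python
# Blended cost per trade for fixed-income post-trade processing, using the
# per-trade costs cited above ($3 straight-through, $240 per exception).
# The exception rates below are illustrative assumptions, not benchmarks.

STP_COST = 3.0          # USD per straight-through-processed trade
EXCEPTION_COST = 240.0  # USD per trade requiring manual exception handling

def blended_cost_per_trade(exception_rate: float) -> float:
    """Average cost per trade given the share of trades that break."""
    return (1 - exception_rate) * STP_COST + exception_rate * EXCEPTION_COST

# An average performer at an assumed 10 percent exception rate versus a
# best-in-class bank 60 percent below that average (that is, 4 percent).
for rate in (0.10, 0.04):
    print(f"exception rate {rate:.0%}: ${blended_cost_per_trade(rate):.2f} per trade")
```

Under these assumed rates, the blended cost falls from $26.70 to $12.48 per trade, which is why the best performers’ far lower exception rates translate directly into a cost advantage.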

Of the 65 data elements required for MiFID II reporting, the FIX and FpML standards identify 56 as common across asset classes. With common processes and data, banks can realize significant benefits through the automation enabled by a superior data and application architecture.
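
As a sketch of how common data elements enable cross-asset automation, the code below normalizes trades from two asset classes into one reporting record. The field names and mappings are hypothetical illustrations, not the actual MiFID II element list or FIX/FpML tag definitions.

```python
# A minimal sketch of cross-asset normalization for regulatory reporting.
# The record fields and source formats are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class ReportableTrade:
    instrument_id: str    # e.g., an ISIN
    quantity: float
    price: float
    trade_timestamp: str
    counterparty_id: str  # e.g., an LEI

def normalize_equity(raw: dict) -> ReportableTrade:
    return ReportableTrade(raw["isin"], raw["shares"], raw["px"],
                           raw["exec_time"], raw["cpty_lei"])

def normalize_fx(raw: dict) -> ReportableTrade:
    return ReportableTrade(raw["pair_isin"], raw["notional"], raw["rate"],
                           raw["exec_time"], raw["cpty_lei"])

# One downstream reporting pipeline can then serve every asset class.
NORMALIZERS = {"equity": normalize_equity, "fx": normalize_fx}

def to_report(asset_class: str, raw: dict) -> ReportableTrade:
    return NORMALIZERS[asset_class](raw)

print(to_report("equity", {"isin": "US0000000000", "shares": 100, "px": 99.5,
                           "exec_time": "2017-06-01T14:30:00Z",
                           "cpty_lei": "LEI-EXAMPLE"}))
```

Once every asset class maps to the same record, a single reporting pipeline can replace one pipeline per asset class.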

Lower technology costs

McKinsey estimates that the top 12 capital-markets players can save up to $400 million annually on technology costs through an enhanced data and application architecture. The economies come from eliminating duplication, reducing programming time, and downsizing the infrastructure for applications and data centers. For example, one large bank with redundant applications and databases averages 6 percent server utilization, while a rival runs server and storage arrays at 65 percent, saving a significant sum annually. Another with a long-running architecture program and strong governance spends around half as much on regulatory technology as peers with more fragmented architectures.

Large banks typically spend more than $1 billion a year on the IT infrastructure, but for many banks at least 30 percent of the applications and data are duplicative. Eliminating them is an easy win.

The characteristics of a superior architecture

Having described the advantages of a superior data and application architecture, we should indicate what one looks like. There is no one-size-fits-all architecture and no standard approach to design; most banks will combine bespoke internal implementations with a mix of vendor solutions. However, outstanding architectures share three characteristics: flexibility, singularity, and availability.

Flexibility

A flexible architecture can handle data and services in ways not originally anticipated and thus supports innovation. It has the following key features:

  • Modular “plug and play” capabilities. These enable interoperability, facilitate build-and-buy strategies, and allow components to be reworked to reflect emerging capabilities. Modularity is especially critical to ensure the effective implementation of programs, because legacy systems can’t be replaced in one go. The modularity should be designed to ensure relatively simple integration with fintech and regtech platforms and distributed ledgers.
  • Rules for extracting business logic from applications. Business logic should be housed in configurable rules engines, which allow it to change without code changes in applications (such as a new parameter for a regulatory requirement); a minimal sketch of this pattern follows the list.
  • Granular data storage—for instance, order, trade, and execution. In many cases, databases store only aggregate positions, so banks struggle to drill down to the original transaction. Granular storage provides for research, aggregation, and analysis.
  • The ability to associate different forms of data. These data include numbers and prices, text, algorithms, models, documents, photos, and voice recordings, so banks can generate a 360-degree view of a client or transaction.
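
To make the rules-engine idea concrete, here is a minimal sketch in which business logic lives in configuration rather than in application code; the rule schema and field names are invented for illustration.

```python
# A minimal configurable rules engine: business logic lives in data, so a
# new regulatory parameter is a configuration change, not a code change.
# The rule schema and field names here are illustrative assumptions.

RULES = [
    # (rule name, predicate over a trade record, action label)
    ("large_trade_review", lambda t: t["notional"] > 10_000_000, "FLAG_FOR_REVIEW"),
    ("restricted_cpty",    lambda t: t["cpty_id"] in {"CPTY-9"}, "BLOCK"),
]

def evaluate(trade: dict) -> list[str]:
    """Return the actions triggered by the configured rules."""
    return [action for _, predicate, action in RULES if predicate(trade)]

# Adding a new regulatory threshold means appending a rule entry, not
# modifying and redeploying the applications that call evaluate().
print(evaluate({"notional": 25_000_000, "cpty_id": "CPTY-1"}))
# -> ['FLAG_FOR_REVIEW']
```

In production the rules would typically live in an external configuration store rather than in source code, but the principle is the same: a new requirement becomes a configuration change, not a redeployment.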

Singularity

A singular architecture has one application, data store, process, and code set—unless there is a compelling business or technical reason to have another. Defining characteristics should include these:

  • One authoritative source for data elements. Data must be as accurate and complete as possible as soon as possible after they are created. In trading, this means that the trade figuration used by sales and trading also works for settlement. The “get it right up front” principle also applies to the augmentation and completion of data.
  • Cross-asset-class systems. For complex logic across asset classes, there should be one set of procedures per calculation engine and function (excluding business-case exceptions, for example, in relation to high-frequency trading). Engines typically require a burst capacity at a certain time of the day or the month. Cross-asset functionality makes it possible to scale the underlying infrastructure appropriately. One leading investment bank is storing code associated with risk calculations for its fixed-income-trading platforms in reusable-code libraries ready for use in equity trading.
  • Shared code and data structures. Reusable-code libraries prevent the same functions from being developed multiple times. Modules providing functions or processes should be cataloged. One bank has applied an open-source management methodology to its proprietary code modules for over 18 years, suggesting that the common belief in the approach’s unsuitability for capital markets is mistaken. The extensive use of open-source technology should be considered before coding. An added advantage of open source is that technologists can use their programming skills to add value and make a mark (most open-source code includes the name of its author). Still, open source should not be the first choice for proprietary calculations or other areas where processes provide a competitive advantage.
  • Applications as services, cutting the need for unique data stores. Too often, applications that use the same or similar data require a copy in a format unique to each application, which creates duplication and the need for reconciliation. Applications should be designed to share data stores or data messages, so that instead of sending data to applications, the architecture brings applications to the data (see the sketch after this list). One European CIO estimates that moving to shared-data applications would save his bank hundreds of millions of euros annually.
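
As a rough sketch of bringing applications to the data, the service below reads from one authoritative trade store instead of keeping its own copy; the class and field names are illustrative assumptions.

```python
# A minimal sketch of "applications as services" over one authoritative
# data store: consumers query a shared service instead of holding private
# copies that later need reconciliation. Names are illustrative.

class TradeStore:
    """Single authoritative store for trade records."""
    def __init__(self):
        self._trades: dict[str, dict] = {}

    def put(self, trade_id: str, trade: dict) -> None:
        self._trades[trade_id] = trade

    def get(self, trade_id: str) -> dict:
        return self._trades[trade_id]

class RiskService:
    """Reads from the shared store; holds no private copy of the data."""
    def __init__(self, store: TradeStore):
        self.store = store

    def exposure(self, trade_id: str) -> float:
        t = self.store.get(trade_id)
        return t["quantity"] * t["price"]

store = TradeStore()
store.put("T1", {"quantity": 100, "price": 99.5})
print(RiskService(store).exposure("T1"))  # 9950.0
```

Because no consumer holds a private copy of the data, there is nothing to reconcile.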

Availability

An available architecture design ensures that data, applications, and services are at hand. These are the imperatives:

  • Process data in real or near-real time. Batch processing should be reserved for “snapshot” functions (such as end-of-business-day value calculations by region). While not all data will be used intraday, processing data as business transactions occur minimizes exceptions and ensures that data are available quickly; a minimal sketch of this event-driven pattern follows the list.
  • Provide multichannel support. This is particularly critical for clients and salespeople, who shouldn’t need to switch from mobile platforms to PCs to access their data and applications. Voice and mobile overlays on CRM systems make the salesforce more efficient.
  • Use in-memory databases and other low-latency techniques. Once reserved for high-frequency trading, these techniques are increasingly necessary for analytics and functions such as managing balance-sheet liquidity in real time. The technology supporting low-latency messaging is developing fast. In-memory databases are relatively mature, and nonvolatile random access memory (NRAM)—which retains all data in case of power outages—has improved, with major releases expected by early 2018.
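
A minimal sketch of the real-time principle in the first bullet above, with queue and field names invented for illustration: each trade updates state and surfaces exceptions the moment it arrives, and the end-of-day snapshot is simply a read of current state.

```python
# Processing trades as they occur keeps data current and surfaces
# exceptions immediately; batch is reserved for snapshot functions such
# as an end-of-day valuation. Names here are illustrative assumptions.
import queue
import threading

events: queue.Queue = queue.Queue()
positions: dict[str, float] = {}

def on_trade(trade: dict) -> None:
    """Update positions the moment a trade arrives; flag breaks at once."""
    if trade["quantity"] <= 0:
        print(f"exception: bad quantity in {trade}")  # handled now, not at EOD
        return
    positions[trade["symbol"]] = positions.get(trade["symbol"], 0) + trade["quantity"]

def consumer() -> None:
    while True:
        trade = events.get()
        if trade is None:  # sentinel to stop the consumer
            break
        on_trade(trade)

t = threading.Thread(target=consumer)
t.start()
events.put({"symbol": "XYZ", "quantity": 100})
events.put(None)
t.join()
print(positions)  # the "snapshot" is just a read of current state
```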

Data and application architectures designed to encompass the key elements of flexibility, singularity, and availability can evolve with the organization (see sidebar, “Data and application technologies for capital markets”). They should be expected to support the delivery of products and services for 20 years or more.

Making it happen: Six key points

Capital-markets participants with superior data and application architectures are more likely to succeed in rolling out technological capabilities. For that reason, many have prioritized these kinds of investments. However, instead of the anticipated benefits, some have experienced delays, technical challenges, and changes of strategic direction. Digital-change leaders and executives should consider the following six key points for moving projects forward.

Analyze the value at stake

Banks should quantify the value at stake from renewing their architectures—which can amount to more than $1 billion annually. Smaller banks will realize value in proportion to their IT budgets. Project leaders need a detailed understanding of how the company’s current data and application architecture is creating or destroying value. To that end they must do the following:

  • Review a broad range of metrics, including cost-per-trade benchmarking, regulatory spending, the client experience, the percentage of duplicative databases or applications, and the percentage of development projects completed on time and on budget (including ROE, as well as maintenance and enhancement costs). The results should be benchmarked against those of industry peers.
  • Assess and quantify analytics capabilities throughout the value chain, either as added value or possible missed opportunities.
  • Evaluate the potential for innovation within the current data and application architecture in view of time to market, client onboarding, and risk-management capabilities.

Following the assessment, it should be possible to estimate the new architecture’s approximate annual value gain arising from automation, business improvements, capital savings, IT-budget efficiencies, and lower infrastructure costs. This information should be used to target the highest pockets of value.

Rally the organization around the vision

Capital-markets participants often launch data programs in response to a regulatory review or treat them as centrally driven initiatives. From a business-line perspective, the programs are often perceived primarily as interference. To counter that perception, change leaders must position the program clearly as essential to the strategic aims of the business and an exciting opportunity to create a competitive advantage.

The business case for architecture renewal—including an annual revenue target and business goals, such as time to market, client profitability, customer satisfaction, and employee satisfaction—should be explained and disseminated. Program governance and oversight must be put in place. “Storytelling” (credible examples of what can be achieved in specified time frames) should be a key element. Dialogue is critical; open forums, surveys, and road shows are ways to ensure that all voices are heard.

Pilot and deliver

Piloting is a key step in testing the potential impact of a new data and application architecture. It can be approached from the point of view of a business line, a critical capability, or a business opportunity (for example, a robotics-enabled automated workflow or analytics capabilities to optimize credit decisions). Pilots can also involve partnering with fintechs. Examples of successful pilots include a front-to-back data architecture for wholesale loans to address regulatory-reporting needs, a client-profitability system, and an improved prime-brokerage offering integrated with franchise services.

Pilots should meaningfully test the principles of the architecture design and focus on an area with a significant potential ROI and a proven business or regulatory need. Two to four pilots can be conducted in parallel. Agile techniques should be a key part of the delivery methodology, and the project should be deliverable in a few months. The aim should be to deliver early and often.

Boost skills internally and among vendors

Although today’s bank-technology resources will be insufficient for the demands of 2020, the answer is not a hiring spree. Firms should combine redeploying and training current technologists, hiring new talent, and forming targeted partnerships. Given the importance of technology to the capital markets—at some banks today, more than half of all employees are engineers—leaders should take ownership of the process. The executive and front offices should cooperate in championing technology, and firms should appoint credible and visionary technology leaders.

To ensure continuous improvement, firms must have a system for identifying, training, and empowering top performers. The front office should be closely involved in spotting talent and invest in career development and mobility. Teams should be agile and include both capital-markets employees and technologists. The requirements will include new roles, such as data scientists, data engineers, and visualization engineers. To attract them at a time of intense competition for their services, banks must tell a compelling story about how their technology groups operate.

Banks can also use partnerships to develop skills. Many fintechs look for these opportunities, and firms should engage with them to import and strengthen skills. Similarly, universities are ideal partners for breakthrough initiatives—as some banks have already found. Academic institutions not only offer new perspectives and knowledge but can also be valuable resources for recruitment.

Embrace the capital-markets network

In three to five years, large portions of the capital-markets business, as well as the data and apps that support it, will be outside the walls of banks. These institutions therefore need to become purposeful about where they can be distinctive and about how they can collaborate with vendors and design architecture capabilities that reflect the evolving capital-markets network.

In-house technology resources should be deployed on the banks’ most distinctive and highest-value-added services. Open-source and vendor systems may be the first choice for commoditized functions. The ability to scale the data center’s infrastructure will be increasingly critical. Some banks are using co-location facilities that are less expensive than proprietary data centers. Others are experimenting with public-cloud offers for less critical functions (and most are shifting to hybrid cloud), reflecting the trend to move infrastructure outside the bank. A number of banks have launched open APIs in recent months to respond to public-policy initiatives and to encourage partnerships and innovation. Modular data and application architectures, as well as clear interface standards, will be important to operate successfully in a mix-and-match environment.

Run the program as a business

The goals of architecture design are business goals: increased revenue, additional capabilities, better customer service, capital savings, and improved regulatory compliance. Success therefore requires a real partnership among the front office, operations, finance, and technology. At some banks, this will mean that the front office participates regularly; others will rethink business processes with less direct front-office engagement.

A data-and-application-architecture program should be run and evaluated like any other business plan, with regular reviews and contingencies. The business teams and the implementation teams must be mutually accountable for meeting financial and business goals.

Initial steps toward a new architecture

A great data architecture begins with design. Once the design is in place, banks need a practical approach to implementation that can handle the inevitable execution challenges, including complex and fragmented existing platforms, a lack of investment, a shortage of talent, and difficulties with delivery. Further, many banks lack the time and patience to accept long-term payoffs and an uncertain impact on the bottom line.

The most comprehensive approach to transformation would be an end-to-end simplification program, but if a bank lacks the interest or budget for root-and-branch reform, it can take a step-by-step approach, either starting with one high-potential area or focusing on integration layers.

  1. Launch end-to-end simplification programs. Some banks (especially larger institutions) have launched broad programs to simplify the data and application architecture. Those banks will realize the greatest economic benefits if they remain committed to programs that run five years or longer.
  2. Start with one high-potential area. Identify either a high-value, front-to-back process (such as onboarding) or a product area that has significant profit potential, where market structures and digital frameworks allow for simpler platforms (foreign exchange, equities, listed derivatives). Rebuild the data and application architecture end to end, leveraging third-party platforms and solution providers when that makes sense. If existing systems and processes are too constraining, the bank can build the new infrastructure from the ground up and decommission existing systems later. We have seen such efforts pay off in two to three years and create significant momentum for change.
  3. Focus on integration layers. Banks should extract data from disparate sources into a mezzanine (or integration) layer that cleans and transforms the data and lets applications and users call them up (a minimal sketch follows this list). For applications, manual handoffs should be addressed through robotic process automation or similar tools. At the cost of some additional complexity, these approaches allow banks to capture many of the benefits of a cleaner data and application architecture without the full cost of simplification. By generating faster economic benefits, they can buy time to address longer-term issues.
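
To illustrate the mezzanine layer in the third option, here is a minimal sketch, with source formats and field names invented for illustration: records from disparate sources converge in one layer that cleans and standardizes them for downstream consumers.

```python
# A minimal integration (mezzanine) layer: extract records from disparate
# sources, clean and standardize them, and expose one consistent view.
# Source formats and field names are illustrative assumptions.

def from_legacy_csv(row: str) -> dict:
    trade_id, qty, px = row.split(",")
    return {"trade_id": trade_id, "quantity": float(qty), "price": float(px)}

def from_vendor_feed(msg: dict) -> dict:
    return {"trade_id": msg["id"], "quantity": msg["size"], "price": msg["last"]}

def clean(record: dict) -> dict:
    """Apply common transformations once, instead of in every application."""
    record["trade_id"] = record["trade_id"].strip().upper()
    return record

def integrate(csv_rows: list[str], feed_msgs: list[dict]) -> list[dict]:
    """The mezzanine layer: one place where all sources converge."""
    records = [from_legacy_csv(r) for r in csv_rows]
    records += [from_vendor_feed(m) for m in feed_msgs]
    return [clean(r) for r in records]

view = integrate(["t-1,100,99.5"], [{"id": "t-2", "size": 50, "last": 101.25}])
print(view)  # downstream applications query this layer, not the sources
```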

The data and application architecture is critical to enabling the capital-markets business capabilities of the present and near future. In some cases, the technology enabled by this architecture is set to become the business. As a baseline, banks that transform existing systems to create a flexible, singular, and available architecture are likely to derive significant savings. In some cases, this may be the difference between ceding market share to more innovative and nimble rivals and continuing to play at the top table.
