In Pursuit of Smarter Profits: Capital Market Efficiency through Automation and Distributed Computing

David Snitkof, Head of Data Integration and Insights, Kabbage

Those of us who have thought about the future of capital markets have envisioned a world where investor preferences and asset characteristics can all be expressed in code, and a smart matching engine can execute trades that maximize the efficiency of the market without unnecessary human involvement. More than a decade after the inception of online lending, the industry that inspired applying this vision to loan financing, the vision is not yet fully realized. Still, we can explore how the concept translates to today's capital markets environment, how forward-thinking lending companies such as Kabbage use technology to optimize their treasury processes, and how automation and human intelligence can combine to bring that vision to reality.

The Rise of Intelligent Capital Markets

In the golden era of peer-to-peer (P2P) lending, a form of online lending in which funds are crowdsourced from multiple investors, P2P loan originators would publish detailed, loan-level information over an API accessible to investors. Investors would then evaluate the risk and expected return of each loan programmatically and place automated bids on the loans in which they wanted to invest. This model of loan-level, automated funding scaled from the smaller retail investors who formed P2P's initial funding base to larger hedge funds and family offices. The early success of programmatic, loan-level institutional investment (e.g. at my previous startup, Orchard, our systems helped place billions of dollars' worth of programmatic loan investments for institutional clients) led many to believe that a fully automated market for loan funding was just around the corner. However, as the capital needs of online lenders grew sharply, nearly all of them turned to more well-worn paths for financing, such as credit facilities, securitization, or passive forward-flow contracts with institutional buyers; many adopted a hybrid model across multiple funding channels. It is now clear that the path forward in data-driven capital markets is longer, more incremental, and more nuanced than many had previously thought. With this background, we can discuss the state of the art in today's market.


The 101 of Capital Markets in an Automated Age

Any lender with multiple sources of capital has to decide which source should fund each dollar of lending. As each capital source has its own set of attributes (e.g. cost, size, collateral eligibility, concentration limits, covenants), the allocation decision becomes nontrivial once a business reaches sufficient size and the diversity of capital sources increases.
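As a concrete (and entirely hypothetical) sketch of this allocation problem, the snippet below models capital sources with a few of the attributes mentioned above: cost of funds, remaining capacity, a collateral-eligibility rule on loan term, and a per-loan concentration limit. It then funds each loan from the cheapest eligible source. The facility names, rates, and limits are invented for illustration and do not describe any real funding arrangement.

```python
from dataclasses import dataclass

@dataclass
class Facility:
    name: str
    cost_of_funds: float    # annual rate, e.g. 0.045 = 4.5%
    capacity: float         # remaining dollars the facility can fund
    max_term_months: int    # collateral eligibility: maximum loan term
    max_single_loan: float  # concentration limit per loan
    funded: float = 0.0

    def eligible(self, loan_amount: float, term_months: int) -> bool:
        return (term_months <= self.max_term_months
                and loan_amount <= self.max_single_loan
                and self.funded + loan_amount <= self.capacity)

def allocate(loan_amount: float, term_months: int, facilities):
    """Fund the loan from the cheapest facility whose rules it satisfies."""
    candidates = [f for f in facilities if f.eligible(loan_amount, term_months)]
    if not candidates:
        return None  # no capital source can take this loan
    best = min(candidates, key=lambda f: f.cost_of_funds)
    best.funded += loan_amount
    return best.name

facilities = [
    Facility("warehouse_a", 0.045, 5_000_000, 12, 250_000),
    Facility("warehouse_b", 0.060, 2_000_000, 24, 500_000),
]
print(allocate(100_000, 6, facilities))   # warehouse_a: cheapest eligible
print(allocate(300_000, 18, facilities))  # warehouse_b: too large and too long for A
```

Even this toy version shows why the decision becomes nontrivial at scale: a greedy cheapest-first rule can exhaust a facility's capacity on loans that another facility could have taken, which is exactly where optimization earns its keep.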

At most lenders, this capital allocation exercise is performed more or less manually, with a simple set of rules, some spreadsheet-based testing, and a team of analysts to generate reporting for banks. This process works, but it is not optimal for three key reasons:

1. Efficiency: Running a manual capital-allocation and reporting process requires significant work and a team that expands linearly with the number of capital sources. Organizations adopting a manual approach inevitably spend more time on report generation and fire drills than on strategic analysis and capital planning.

2. Operational Risk: Human-centric processes are prone to error, and the likelihood of error increases with the inherent complexity of a growing business. In treasury and capital markets, the impact of an operational error can be high. Mistakes such as double-pledging, mis-reporting, and incorrect collateral eligibility assessment can force a lender into technical default and potentially lead to insolvency.

3. Yield Optimization: When an organization spends so much of its time and attention producing reports, allocating loans, and avoiding human error, it has far too little left for financial optimization in matching capital to loan originations. Depending on the lender, this can be a massive unseen opportunity cost.

How to Build a More Efficient System

At Kabbage, it is only natural to apply our core competencies in technology and analytics to capital markets. Kabbage uses software developed at Orchard (now owned by Kabbage) to programmatically evaluate thousands of capital-to-loan allocation simulations each day and select the allocation that maximizes our ability to extend credit to borrowers in the most capital efficient way.
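The Orchard software itself is proprietary, but the general simulate-and-select idea can be sketched as follows: generate many candidate allocations, score each by its blended funding cost, and keep the cheapest. Everything here, the loan pool, facility costs, and capacities, is invented for illustration, and a production system would use a proper optimizer rather than seeded random search.

```python
import random

LOANS = [120_000, 80_000, 200_000, 50_000, 150_000]
FACILITIES = {"warehouse": (0.045, 400_000),        # (cost of funds, capacity)
              "securitization": (0.035, 300_000),
              "equity": (0.150, 1_000_000)}         # most expensive, always available

def simulate(seed: int) -> float:
    """One candidate allocation: assign each loan a random feasible facility
    and return the resulting annual funding cost."""
    rng = random.Random(seed)  # seeded, so each candidate is reproducible
    remaining = {name: cap for name, (_, cap) in FACILITIES.items()}
    cost = 0.0
    for amount in LOANS:
        feasible = [n for n, left in remaining.items() if left >= amount]
        choice = rng.choice(feasible)
        remaining[choice] -= amount
        cost += amount * FACILITIES[choice][0]
    return cost

# Evaluate thousands of candidate allocations and keep the cheapest.
best_seed = min(range(5000), key=simulate)
print(f"lowest annual funding cost found: ${simulate(best_seed):,.0f}")
```

The point of the sketch is the selection loop, not the search strategy: once allocations can be generated and scored in code, choosing among thousands of them per day is cheap.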

Following in this line of thinking, capital markets processes can become more reliable by incorporating principles from software engineering. Practices that can minimize the risk from human error include:

• Unit testing: writing code to test discrete components of a program with certain inputs and compare the results to an expectation. For instance, if the eligible receivables for a warehouse facility multiplied by the advance rate should always equal the collateral base, then a unit test would compare the actual values in a borrowing base report to the expected calculation. The test can then generate alerts when something does not match, avoiding a situation where a lender sends an incorrect report to a bank.
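A minimal version of that borrowing-base unit test might look like the following, where the field names and the 85% advance rate are assumptions for the example, not terms of any actual facility:

```python
ADVANCE_RATE = 0.85  # assumed advance rate for illustration

def collateral_base(eligible_receivables: float) -> float:
    """Expected collateral base per the facility's formula."""
    return eligible_receivables * ADVANCE_RATE

def test_borrowing_base_report():
    # Values as they would appear in a report headed to the bank.
    report = {"eligible_receivables": 10_000_000, "collateral_base": 8_500_000}
    expected = collateral_base(report["eligible_receivables"])
    # A failing assertion here raises an alert before an incorrect
    # report is ever sent to the lender's bank.
    assert abs(report["collateral_base"] - expected) < 0.01, (
        f"collateral base {report['collateral_base']:,} != expected {expected:,}")

test_borrowing_base_report()
print("borrowing base report passes")
```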

• Version control: the use of software to track changes in code and understand how various versions of a technical system have evolved, particularly if multiple people are working on a project. For instance, if rules and logic are stored in spreadsheets, it is hard to track who made a particular change at a particular time. By contrast, if rules and logic are stored in code under a version control system, it is possible to see a history of all changes ever made to a piece of code. This facilitates collaboration across multiple developers—from proposing to reviewing and accepting changes to code.

• Data Integrity Automation: the development of systems to constantly monitor data for completeness, structure, and accuracy. With all the data that exists within a modern lending business, it is impossible to check for every potential inconsistency without automation. If there are low-probability but high-impact errors that could happen within data, it makes sense to develop code to check for them constantly and generate alerts if and when they occur.
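One such check can be sketched in a few lines. The example below scans a made-up pledge register for a single low-probability, high-impact error, the same loan pledged as collateral to two facilities, and raises an alert when it finds one:

```python
from collections import Counter

# Illustrative pledge register; loan IDs and facility names are invented.
pledges = [
    {"loan_id": "L-1001", "facility": "warehouse_a"},
    {"loan_id": "L-1002", "facility": "warehouse_a"},
    {"loan_id": "L-1001", "facility": "warehouse_b"},  # double-pledged!
]

def find_double_pledged(pledges):
    """Return loan IDs that appear in more than one pledge record."""
    counts = Counter(p["loan_id"] for p in pledges)
    return sorted(loan_id for loan_id, n in counts.items() if n > 1)

alerts = find_double_pledged(pledges)
if alerts:
    print(f"ALERT: double-pledged loans: {alerts}")
```

In practice a monitoring system would run checks like this on a schedule against the full dataset and route any alert to the treasury team rather than printing it.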

The past several years have demonstrated that the capital markets for lending will evolve through incremental change rather than overnight revolution. While assets like equities, currencies, and commodities, as well as non-financial markets such as online advertising, can often trade in fully automated, data-driven markets, the path is more difficult for loans: they are less fungible assets with a long history of manual analysis and complex structuring. Furthermore, the most common vehicles for large-scale financing of lending activities require not only the ability to analyze data and predict risk, but also a sophisticated knowledge of the capital markets, deep relationships with providers of capital, and the ability to establish trust. Fortunately, the technology now exists to automate and optimize the mathematical and procedural aspects of loan funding while freeing capital markets and treasury teams to focus on the work that matters most, such as ensuring that credit can be extended to borrowers in the most efficient way.
