The Spider and the Fly
Michael Lewis has written a riveting report on the trial, incarceration, release, and re-arrest of Sergey Aleynikov, once a star programmer at Goldman Sachs. It's a tale of a corporation coming down with all its might on a former employee who, when all is said and done, damaged the company only by deciding to take his prodigious talents elsewhere.
As is always the case with Lewis, the narrative is brightly lit while the economic insights lie half-concealed in the penumbra of his prose. In this case he manages to shed light on the enormous divergence between the private and social costs of high frequency trading, as well as the madness of an intellectual property regime in which open-source code routinely finds its way into products that are then walled off from the public domain, violating the spirit if not the letter of the original open licenses.
Aleynikov was hired by Goldman to help improve its relatively weak position in what is rather euphemistically called the market-making business. In principle, this is the business of offering quotes on both sides of an asset market in order that investors wishing to buy or sell will find willing counterparties. It was once a protected oligopoly in which specialists and dealers made money on substantial spreads between bid and ask prices, in return for which they provided some measure of price continuity.
But these spreads have vanished over the past decade or so as the original market makers have been displaced by firms using algorithms to implement trading strategies that rely on rapid responses to incoming market data. The strategies are characterized by extremely short holding periods, limited intraday directional exposure, and very high volume. A key point in the transition was Regulation NMS (National Market System), adopted in 2005 and implemented in 2007, which required that orders be routed to the exchange offering the best available price. This led to a proliferation of trading venues, since order flow could be attracted by price alone. Lewis describes the transition thus:
For reasons not entirely obvious... the new rule stimulated a huge amount of stock-market trading. Much of the new volume was generated not by old-fashioned investors but by extremely fast computers controlled by high-frequency-trading firms... Essentially, the more places there were to trade stocks, the greater the opportunity there was for high-frequency traders to interpose themselves between buyers on one exchange and sellers on another. This was perverse. The initial promise of computer technology was to remove the intermediary from the financial market, or at least reduce the amount he could scalp from that market. The reality has turned out to be a boom in financial intermediation and an estimated take for Wall Street of somewhere between $10 and $20 billion a year, depending on whose estimates you wish to believe. As high-frequency-trading firms aren’t required to disclose their profits... no one really knows just how much money is being made. But when a single high-frequency trader is paid $75 million in cash for a single year of trading (as was Misha Malyshev in 2008, when he worked at Citadel) and then quits because he is “dissatisfied,” a new beast is afoot.
The combination of new market rules and new technology was turning the stock market into, in effect, a war of robots. The robots were absurdly fast: they could execute tens of thousands of stock-market transactions in the time it took a human trader to blink his eye. The games they played were often complicated, but one aspect of them was simple and clear: the faster the robot, the more likely it was to make money at the expense of the relative sloth of others in the market.
This last point is not quite right: speed alone can't get you very far unless you have an effective trading strategy. Knight Capital managed to lose almost a half billion dollars in less than an hour not because their algorithms were slow but because they did not faithfully execute the intended strategy. But what makes a strategy effective? The key, as Andrei Kirilenko and his co-authors discovered in their study of transaction-level data from the S&P E-mini futures market, is predictive power:
High Frequency Traders effectively predict and react to price changes... [they] are consistently profitable although they never accumulate a large net position... HFTs appear to trade in the same direction as the contemporaneous price and prices of the past five seconds. In other words, they buy... if the immediate prices are rising. However, after about ten seconds, they appear to reverse the direction of their trading... possibly due to their speed advantage or superior ability to predict price changes, HFTs are able to buy right as the prices are about to increase... They do not hold positions over long periods of time and revert to their target inventory level quickly... HFTs very quickly reduce their inventories by submitting marketable orders. They also aggressively trade when prices are about to change.
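The flavor of the pattern Kirilenko and his co-authors describe can be conveyed with a stylized sketch: trade in the direction of very recent price changes, but keep inventory pinned near a target by unwinding positions quickly. The class, parameters, and thresholds below are invented purely for illustration; this is not drawn from the study or from any actual trading system.

```python
# A stylized toy, not any firm's actual algorithm: trade in the direction
# of very recent price changes, but quickly work inventory back toward a
# target. All names and parameters are invented for illustration.

from collections import deque


class ToyMomentumMarketMaker:
    def __init__(self, lookback_ticks=5, max_inventory=10):
        self.recent_prices = deque(maxlen=lookback_ticks)
        self.max_inventory = max_inventory
        self.inventory = 0          # target inventory is zero
        self.cash = 0.0

    def on_tick(self, price):
        """React to each incoming price update."""
        self.recent_prices.append(price)
        if len(self.recent_prices) < self.recent_prices.maxlen:
            return

        # Directional signal: the price change over the last few ticks.
        drift = self.recent_prices[-1] - self.recent_prices[0]

        if drift > 0 and self.inventory < self.max_inventory:
            self._trade(+1, price)   # buy as prices are rising
        elif drift < 0 and self.inventory > -self.max_inventory:
            self._trade(-1, price)   # sell as prices are falling
        else:
            # No strong signal: revert inventory toward the target.
            if self.inventory > 0:
                self._trade(-1, price)
            elif self.inventory < 0:
                self._trade(+1, price)

    def _trade(self, side, price):
        self.inventory += side
        self.cash -= side * price


if __name__ == "__main__":
    mm = ToyMomentumMarketMaker()
    for p in [100.00, 100.01, 100.02, 100.02, 100.03, 100.02, 100.01]:
        mm.on_tick(p)
    print(mm.inventory, round(mm.cash, 2))
```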
Aleynikov was hired to speed up Goldman's systems, but he was largely unaware of (and seemed genuinely uninterested in) the details of their trading strategies. Here's Lewis again:
Oddly, he found his job more interesting than the stock-market trading he was enabling. “I think the engineering problems are much more interesting than the business problems,” he says... He understood that Goldman’s quants were forever dreaming up new trading strategies, in the form of algorithms, for the robots to execute, and that these traders were meant to be extremely shrewd. He grasped further that “all their algorithms are premised on some sort of prediction—predicting something one second into the future.”
Effective prediction of price movements, even over such very short horizons, is not an easy task. It is essentially a problem of information extraction, based on rapid processing of incoming market data. The important point is that this information would have found its way into prices sooner or later in any case. By anticipating this process by a fraction of a second, the new market makers are able to capture a great deal of private value. But they are not responsible for the informational content of prices, so their profits, as well as the substantial costs of their operations, must ultimately come at the expense of those investors who are actually trading on fundamental information.
It is commonly argued that high frequency trading benefits institutional and retail investors because it has resulted in a sharp decline in bid-ask spreads. But this spread is a highly imperfect measure of the value to investors of the change in regime. What matters, especially for institutional investors placing large orders based on fundamental research, is not the marginal price at which the first few shares trade but the average price over the entire transaction. And if their private information is effectively extracted early in this process, the price impact of their activity will be greater, and price volatility will be higher in general.
After all, it was a large order from an institutional investor in the S&P futures market that triggered the flash crash, briefly sending indexes plummeting and leaving individual securities trading at absurd prices: Accenture changed hands for a penny on the way down, and Sotheby's for a hundred thousand dollars a share on the bounce back.
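To see why the quoted spread is a poor yardstick for a large order, consider a back-of-the-envelope sketch. The numbers below are invented for illustration only; the point is simply that when information about an order leaks as it is worked, the average fill price drifts well away from the apparently tiny spread paid on the first few shares.

```python
# A back-of-the-envelope illustration (with invented numbers) of why the
# quoted bid-ask spread understates execution costs for a large order.
# If information about the order leaks as it is worked, each successive
# slice trades at a slightly worse price, and the relevant cost is the
# average price over the whole transaction, not the marginal price of
# the first few shares.

mid_price = 50.00          # prevailing midpoint
half_spread = 0.005        # a one-cent quoted spread: looks nearly free
impact_per_slice = 0.002   # hypothetical per-slice drift against the buyer

slices = 100               # a large buy order worked in 100 equal slices
shares_per_slice = 1_000

total_cost = 0.0
for i in range(slices):
    # Each slice pays the half-spread plus the cumulative impact so far.
    fill_price = mid_price + half_spread + i * impact_per_slice
    total_cost += fill_price * shares_per_slice

avg_price = total_cost / (slices * shares_per_slice)
cost_vs_mid = avg_price - mid_price

print(f"average fill price: {avg_price:.4f}")
print(f"cost vs. midpoint:  {cost_vs_mid * 100:.2f} cents per share")
# The half-spread alone suggests about half a cent per share; with the
# assumed impact, the average cost comes to roughly 10.4 cents per share.
```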
In evaluating the impact on investors of the change in market microstructure, it is worth keeping in mind Bogle's Law:
It is the iron law of the markets, the undefiable rules of arithmetic: Gross return in the market, less the costs of financial intermediation, equals the net return actually delivered to market participants.
This is just basic accounting, but often overlooked. If one wants to argue that the new organization of markets has been beneficial to investors, one needs to make the case that the costs of financial intermediation in the aggregate have gone down. Smaller bid-ask spreads have to be balanced against the massive increase in volume, the profits of the new market makers, and most importantly, the costs of high-frequency trading. These include nontrivial payments to highly skilled programmers and quants, as well as the costs of infrastructure, equipment, and energy. Lewis notes that the "top high-frequency-trading firms chuck out their old gear and buy new stuff every few months," but these costs probably pale in comparison with those of cables facilitating rapid transmission across large distances and the more mundane costs of cooling systems. All told, it is far from clear that the costs of financial intermediation have fallen in the aggregate.
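A toy calculation makes the accounting explicit. The figures below are invented, not estimates of actual costs: they show only that falling per-share costs and rising aggregate costs are perfectly compatible once volume and the frequency of intermediation grow enough.

```python
# Bogle's Law in miniature: net return to investors equals gross return
# minus aggregate intermediation costs. The figures below are invented
# purely to illustrate the accounting, not estimates of actual costs.

def aggregate_intermediation_cost(cost_per_share, shares_intermediated):
    return cost_per_share * shares_intermediated

# Old regime: wide spreads, modest intermediated volume.
old = aggregate_intermediation_cost(cost_per_share=0.03,
                                    shares_intermediated=2e9)

# New regime: much tighter spreads, but far more shares intermediated
# by high-frequency market makers.
new = aggregate_intermediation_cost(cost_per_share=0.01,
                                    shares_intermediated=10e9)

print(f"old aggregate cost: ${old / 1e6:,.0f} million")
print(f"new aggregate cost: ${new / 1e6:,.0f} million")
# Per-share costs fell by two-thirds, yet aggregate costs rose, because
# intermediated volume rose by more. Whether this is what actually
# happened is an empirical question; the sketch only shows why narrower
# spreads alone cannot settle it.
```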
This post is already too long, but I'd like to briefly mention a quite different point that emerges from the Lewis article, since it relates to a theme previously explored on this blog. Aleynikov relied routinely on open-source code, which he modified and improved to meet the needs of the company. It is customary, and under some licenses required, for such improvements to be released back to the community for use by others. But his attempts to do so were blocked:
Serge quickly discovered, to his surprise, that Goldman had a one-way relationship with open source. They took huge amounts of free software off the Web, but they did not return it after he had modified it, even when his modifications were very slight and of general rather than financial use. “Once I took some open-source components, repackaged them to come up with a component that was not even used at Goldman Sachs,” he says. “It was basically a way to make two computers look like one, so if one went down the other could jump in and perform the task.” He described the pleasure of his innovation this way: “It created something out of chaos. When you create something out of chaos, essentially, you reduce the entropy in the world.” He went to his boss, a fellow named Adam Schlesinger, and asked if he could release it back into open source, as was his inclination. “He said it was now Goldman’s property,” recalls Serge. “He was quite tense. When I mentioned it, it was very close to bonus time. And he didn’t want any disturbances.”
Open source was an idea that depended on collaboration and sharing, and Serge had a long history of contributing to it. He didn’t fully understand how Goldman could think it was O.K. to benefit so greatly from the work of others and then behave so selfishly toward them... But from then on, on instructions from Schlesinger, he treated everything on Goldman Sachs’s servers, even if it had just been transferred there from open source, as Goldman Sachs’s property. (At Serge’s trial Kevin Marino, his lawyer, flashed two pages of computer code: the original, with its open-source license on top, and a replica, with the open-source license stripped off and replaced by the Goldman Sachs license.)
This unwillingness to refresh the reservoir of ideas from which one drinks may be good for the firm but is clearly bad for the economy. As Michele Boldrin and David Levine have strenuously argued, the rate of innovation in the software industry was dramatic prior to 1981 (before which software could not be patented):
What about the graphical user interfaces, the widgets such as buttons and icons, the compilers, assemblers, linked lists, object oriented programs, databases, search algorithms, font displays, word processing, computer languages – all the vast array of algorithms and methods that go into even the simplest modern program? ... Each and every one of these key innovations occurred prior to 1981 and so occurred without the benefit of patent protection. Not only that, had all these bits and pieces of computer programs been patented, as they certainly would have in the current regime, far from being enhanced, progress in the software industry would never have taken place. According to Bill Gates – hardly your radical communist or utopist – “If people had understood how patents would be granted when most of today's ideas were invented, and had taken out patents, the industry would be at a complete standstill today.”
Vigorous innovation in open source development continues under the current system, but relies on a willingness to give back on the part of those who benefit from it, even if they are not legally mandated to do so. Aleynikov's natural instincts to reciprocate were blocked by his employer for reasons that are easy to understand but very difficult to sympathize with.
Lewis concludes his piece by reflecting on Goldman's motives:
The real mystery, to the insiders, wasn’t why Serge had done what he had done. It was why Goldman Sachs had done what it had done. Why on earth call the F.B.I.? Why coach your employees to say what they need to say on a witness stand to maximize the possibility of sending him to prison? Why exploit the ignorance of both the general public and the legal system about complex financial matters to punish this one little guy? Why must the spider always eat the fly?
The answer to this, I think, is contained in the company's response to Lewis, which is now appended to the article. The statement is impersonal, stern, vague and legalistic. It quotes an appeals court that overturned the verdict in a manner that suggests support for Goldman's position. Like the actions of the proverbial spider, it's a reflex, unconstrained by reflection or self-examination. Even if the management's primary fiduciary duty is to protect the interests of shareholders, this really does seem like a very shortsighted way to proceed.
---
Update (August 6). RT Leuchtkafer, whose writing has been featured in several earlier posts, sends in the following by email (posted with permission):
I'd add the task for HFT shops is more than information extraction in short timeframes - they've expanded that task to be one of coaxing information leakage from the exchanges, for which they pay the exchanges handsomely.
On intermediation and its costs, intermediary participation in the equities markets has easily tripled since the HFT innovation (and the deregulation of intermediation), and so on net I've argued that aggregate position intermediation costs have gone up even as per share costs have gone down. Intermediaries make much less on a share than they used to but thanks to deregulation they interpose themselves between natural buyers and sellers much more often than they did, with the result that even though portfolio implementation costs have gone down the portion of those costs captured by intermediaries has greatly increased.
In addition, Steve Waldman has pointed out that defensive expenditures made by investors to counter HFT strategies are also subject to Bogle's Law and need to be accounted for. For some vivid examples see this post by Jason Voss (via Themis Trading).
Responses to this post on Economist's View and Naked Capitalism are also worth a look; I especially recommend the discussion of open source following this comment by Brooklin Bridge.