Sunday, October 28, 2012

Spotlight: 2010 Flash Crash



On May 6, 2010, the stock market experienced something that had never happened before: a plunge of almost 1,000 points in a roughly 15-minute span. It was called the "flash crash," and it scared investors and market executives alike, even after the market rebounded for the rest of the day to finish down only 3%. What really put fear in the hearts of many was knowing that something like this could happen again. So what actually caused the flash crash? Let's take an in-depth look, with the help of a report on the day's events released by the U.S. Commodity Futures Trading Commission (CFTC) and the U.S. Securities & Exchange Commission (SEC).

According to the report, May 6 started as an unusually turbulent day for the markets, due in large part to news coming out of Europe concerning the European debt crisis. But nobody could have predicted what would happen that afternoon. The report breaks up the trading day into five phases, which I will summarize:


  • First phase (Market Open - 2:32 PM): Stock market indices suffered losses of about 3%, likely driven by the news from Europe.
  • Second phase (2:32 - 2:41 PM): Markets lost another 1-2%.
  • Third phase (2:41 - 2:45:28 PM): Volume spiked as stock prices plummeted another 5-6%, putting the indices 9-10% below their opening levels.
  • Fourth phase (2:45:28 - 3:00 PM): Stock indices began to recover, but some individual securities experienced extreme price fluctuations. In some cases, trades were executed at prices ranging from less than a penny to over $100,000!
  • Fifth phase (3:00 PM - Market Close): Trading returned to normal levels and most stocks recovered most of what they had lost.

So what happened? Well, because of the news from Europe, market volatility was unusually high and liquidity unusually low just before the nosedive. Volatility is the variation of a price over time; volatile stocks are normally considered risky investments. Liquidity is an asset's ability to be bought or sold without affecting its price; low liquidity is dangerous because a single trade can have a big impact on a stock's trading price.
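To make "volatility" concrete: it is commonly measured as the standard deviation of a stock's returns over some window. Here's a minimal sketch of that calculation; the prices are made-up illustrative numbers, not real May 2010 data.

```python
# Volatility as the standard deviation of daily returns.
# Prices below are hypothetical, purely for illustration.
prices = [100.0, 101.5, 99.0, 102.0, 98.5, 100.5]

# Daily returns: percentage change from one close to the next.
returns = [(b - a) / a for a, b in zip(prices, prices[1:])]

mean = sum(returns) / len(returns)
variance = sum((r - mean) ** 2 for r in returns) / len(returns)
volatility = variance ** 0.5  # bigger number = choppier, riskier stock

print(f"daily volatility: {volatility:.4f}")
```

A stock that drifts up a steady 0.1% a day has near-zero volatility; one that swings between +3% and -3%, like the stocks above, scores much higher even if it ends up in the same place.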

The CBOE Volatility Index (VIX), which is derived from S&P 500 options, was up 22.5% by 2:30 PM amid the selling pressure. Additionally, liquidity in the E-Mini S&P 500 futures contract and the S&P 500 SPDR exchange-traded fund, the two most actively traded instruments in the futures and equity markets, had declined 55% and 20%, respectively.

Then, against this backdrop of high volatility and low liquidity, a mutual fund group initiated a major trade to sell a total of 75,000 E-Mini contracts, valued at about $4.1 billion. Although not named in the CFTC/SEC report, the firm was quickly identified as Waddell & Reed Financial Inc. Typically, such a large sell order, when executed algorithmically (as this one was), would take hours. However, their algorithm took neither price nor time into account; it keyed only on trading volume (which was already elevated because of the day's volatility). As a result, this enormous sell order was completed in just 20 minutes instead of several hours. This is what wreaked havoc on the market: High Frequency Traders (HFTs) from around the market quickly jumped on the E-Mini contracts, buying and selling them, in milliseconds, from and to each other, creating a "hot potato" effect. Then Waddell & Reed's sell algorithm, seeing the increased volume generated by the HFTs, pumped even more of its orders into the market, creating a feedback loop.
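The feedback loop above can be sketched with a toy simulation. The report describes a target participation rate of about 9% of recent trading volume; every other number here (baseline volume, how many times HFTs re-trade each contract) is a made-up assumption, purely to show the mechanism: when your own selling inflates the volume you measure, you sell faster and faster.

```python
# Toy model of a volume-participation sell algorithm.
# Only the 9% participation rate comes from the CFTC/SEC report;
# all other figures are hypothetical.

TARGET = 75_000          # total contracts to sell
PARTICIPATION = 0.09     # sell 9% of the previous interval's volume
HFT_MULTIPLIER = 5       # assume each contract sold gets re-traded ~5x ("hot potato")
BASELINE_VOLUME = 20_000 # assumed normal market volume per interval

remaining = TARGET
market_volume = BASELINE_VOLUME
minute = 0

while remaining > 0:
    minute += 1
    to_sell = min(remaining, int(market_volume * PARTICIPATION))
    remaining -= to_sell
    # HFT churn counts toward the next interval's measured volume,
    # so the algorithm's own selling accelerates its own selling.
    market_volume = BASELINE_VOLUME + to_sell * HFT_MULTIPLIER

print(f"order completed in {minute} simulated intervals")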

The market plunged. The Dow Jones Industrial Average, the most widely known stock index, lost 998 points, or 9.2%, in just minutes. To put that in perspective: a 1% change in the Dow over a whole day of trading is considered large. A liquidity crisis ensued, and many stocks lost their value quickly. Some HFT algorithms stopped trading entirely because they had not been designed to handle such a situation; prices had moved beyond their trading thresholds.

Some investors got really unlucky - a man named Mike McCarthy lost $17,000 on a single trade because his order to sell shares of Procter & Gamble was executed at 2:46 PM, when the price was at rock bottom. Some hedge funds lost millions because they placed bets at the wrong time.

The blame game is still going on. An opinion article published by Bloomberg posits that it was the HFTs, not Waddell & Reed, who sucked liquidity from the market and caused the crash. Others argue that it wasn't the HFTs' algorithms that failed, but the mutual fund's.

Regardless of who caused the flash crash, we know the facts. It happened. Another flash crash is not impossible. What can be done to prevent another flash crash from happening? How is confidence in the market affected by events like this?

Sources:

[1] Buchanan, Mark, "Flash-Crash Story Looks More Like a Fairy Tale", Bloomberg View, May 7, 2012, http://www.bloomberg.com/news/2012-05-07/flash-crash-story-looks-more-like-a-fairy-tale.html

[2] Kirilenko, Andrei A., Kyle, Albert S., Samadi, Mehrdad and Tuzun, Tugkan, "The Flash Crash: The Impact of High Frequency Trading on an Electronic Market", May 26, 2011, http://ssrn.com/abstract=1686004

[3] U.S. Commodity Futures Trading Commission and U.S. Securities & Exchange Commission, "Findings Regarding the Market Events of May 6, 2010", September 30, 2010, http://www.sec.gov/news/studies/2010/marketevents-report.pdf

[4] "What Caused the Flash Crash? One Big, Bad Trade", The Economist, October 1, 2010, http://www.economist.com/blogs/newsbook/2010/10/what_caused_flash_crash


Sunday, October 21, 2012

High Frequency Trading - Good or Bad?

The stock market has changed drastically over the past few decades with the introduction of computerized systems, and even more over the past few years with the evolution of high frequency trading - essentially, systems that can buy or sell stocks within milliseconds. Designed to profit from minuscule changes in price, these algorithms now compete with each other in a market where everyone wants an edge. Estimates suggest that high-frequency trading (HFT) can account for up to 73% of all trading volume today. Everyone uses it - from small hedge funds to big investment banks like Goldman Sachs. In the currency markets alone, around four trillion U.S. dollars can change hands in a day. While market bosses are understandably in favor of these computer algorithms picking their stocks and rolling in the dough, there are real downsides too. Let's do an analysis of HFT. What are some benefits and drawbacks of such an intriguing new technology?

Benefits
  • It's fast. HFT eliminates human error, time delays, and anything else standing between it and the one task it's assigned: trading stocks.
  • It increases liquidity. With so many trades being made per second and algorithms constantly trying to figure out future prices, it becomes easier to find a buyer or seller, and every trade nudges each stock toward a more accurate value.
  • It cuts costs. HFT helps investors by reducing the costs necessary to make a transaction.
Drawbacks
  • Computers can crash. The complex computer algorithms that make up most of HFT are susceptible to computer errors or sudden changes in the market; numerous examples have occurred in the past few years and I will talk about them in the coming weeks.
  • It can have far-reaching effects. One author calls it the "butterfly effect": a butterfly flapping its wings in Asia can cause a huge storm in the U.S. HFT makes all markets more connected and therefore more responsive to events happening around the world.
  • There's no safety net. In a number of recent reports on HFT algorithms, industry pros revealed that profits were valued more highly than safety; oftentimes firms would fix bad algorithms simply by "tweaking old code" and then pushing the algorithm back into the system.


Of course, there are many different kinds of HFT and of the algorithms that make it up. This list is in no way a complete, closed book on high frequency trading. With that said, what do you think? Is the evolution of HFT a good thing for Wall Street and the financial world, or not? Is it profitable only for investment bankers, and not for us everyday consumers or occasional individual investors?

I think it's safe to say that for now, at least, we'll have to live with HFT. Unless another market crash or significant government regulation is near, these algorithms will always be a part of Wall Street. Pretty soon, we could even see algorithms that can process trades in microseconds. It's getting faster than ever before. Does the machine ever stop?


Sources:

[1] "Are Computers Bringing Down The Stock Market?", Forbes Magazine, August 15, 2011, http://www.forbes.com/sites/investopedia/2011/08/15/are-computers-bringing-down-the-stock-market/2/

[2] Jeff Cox, "High-Frequency Trading: It's Worse Than You Thought", CNBC, September 20, 2012, http://www.cnbc.com/id/49102808/High_Frequency_Trading_It_s_Worse_Than_You_Thought

[3] Fabrizio Goria, "Easy Money", The European Magazine, October 15, 2012, http://www.theeuropean-magazine.com/861-goria-fabrizio/862-high-frequency-trading

[4] Bruno J. Navarro, "Vanguard CIO: High-Frequency Trading Cuts Costs", CNBC, October 18, 2012, http://www.cnbc.com/id/49434073/Vanguard_CIO_High_Frequency_Trading_Cuts_Costs

[5] "The Fast and the Furious", The Economist, February 25, 2012, http://www.economist.com/node/21547988

Sunday, October 14, 2012

Technology and the Stock Market

On October 27, 1986, the London Stock Exchange introduced the Stock Exchange Automated Quotation System to replace the trading floor. The day became known as the "Big Bang": after the introduction of automated trading, the number of trades made in a month was roughly equal to the number made in an entire year before 1986. But the stock market had already been changing for decades. The point is, introducing computers into the equation wasn't an unprecedented shock, because the market had adjusted to technological innovation before. Take, for example, the effect of the telegraph, which arrived on Wall Street in the latter half of the 19th century:


Wall Street, 1850s



Wall Street with Telegraph Wires

I got these pictures from another blog about the stock market and its history, nerdsonwallstreet.com.

Although the advent of computerized trading came across as a "Big Bang", it really wasn't. Researchers, academics, and those involved in the financial markets had seen it coming, because they knew it was only a matter of time before computers became a fixture in our lives. Automated trading may not have been introduced until the 1980s, but even in the 1960s people like Gunter Rischer were making predictions about computers and their future role in stock trading.

In a 1961 issue of the Financial Analysts Journal, Rischer wrote a short essay on computers and the stock market. In it, he explained that it was all but inevitable that computers would eventually be employed by stock traders to perform calculations on masses of economic data. In the 1960s, the most widely used stock statistics were the price-earnings ratio (P/E ratio) and the dividend yield, because they could be calculated the quickest. With computers, calculating anything and everything about a stock becomes easy, giving everyone more information with which to judge a stock. But who would get this information? Rischer considered two cases: (1) the information obtained from computers is accessible only to a restricted group, or (2) it is available to the public. In case (1), Rischer explained, he who has the computer has a "truly differential advantage over everybody else" without a computer. The more likely scenario, however, is case (2). In this case, every investor would profit from the advantage of more information, but nobody would have an edge over anybody else; this in effect just raises the bar.
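It's worth seeing why those two statistics dominated the pre-computer era: each is a single division. A quick sketch, with made-up figures for a hypothetical stock:

```python
# The two workhorse statistics of the 1960s: both are one-step
# divisions, which is why they could be computed quickly by hand.
# All figures below are hypothetical, for illustration only.

price = 50.00              # current share price, in dollars
earnings_per_share = 2.50  # trailing annual earnings per share
annual_dividend = 1.00     # dividends paid per share per year

pe_ratio = price / earnings_per_share     # how many years of earnings the price buys
dividend_yield = annual_dividend / price  # income return as a fraction of the price

print(f"P/E ratio: {pe_ratio:.1f}")             # → 20.0
print(f"dividend yield: {dividend_yield:.1%}")  # → 2.0%
```

Rischer's point was that once computers handle the arithmetic, there is no reason to stop at ratios this simple; any statistic over masses of data becomes just as cheap to compute.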

What I found most interesting about Rischer's essay is his conclusion and his predictions about the future implications of automated trading. He wrote that, with time, financial analysts and investors would not need to worry about calculations of a stock's current position, because computers would handle them. Thus, they could focus their attention on the future outlook of companies and stocks. At the end of his essay, he points out that with computers, the competition will lie in the appraisal of the future rather than the appraisal of the past and present. "This does not seem to favor stability of the stock market", he said. "If anything it would make for greater instability."

This foresight from the 1960s amazed me, because Rischer was exactly right. Today, futures markets are a major force on Wall Street. Additionally, speculation about the future health of a company can make or break its stock price. Last year, the European debt crisis moved Wall Street largely on pure speculation. Nowadays, we are always talking about what will happen next, and that makes the stock market unstable.

Ultimately, the introduction of computers to the stock market was no big surprise. Just like the telegraph, it "raised the bar", so to speak, giving investors and analysts more information on which to base decisions. Computers processed information about stocks efficiently and with ease, giving everyone more time to think ahead instead of looking back at old numbers. But does this create instability in the market?

Sources:

[1] Karl Flinders, The Evolution of Stock Market Technology, Computer Weekly, Nov 2, 2007, http://www.computerweekly.com/news/2240083742/The-evolution-of-stock-market-technology

[2] David Leinweber, An Illustrated History of Wired Capital Markets, Nerds on Wall Street (blog), http://nerdsonwallstreet.com/illustrated-history-of-wired-capital-markets-132/

[3] Gunter Rischer, Computers and the Stock Market (A Comment), Financial Analysts Journal, Vol. 17, No. 4 (Jul-Aug, 1961), pp. 91-93, http://www.jstor.org/stable/4469230





Sunday, October 7, 2012

First Post

Hello there!

My name is Alec Powell and I'm a freshman at Stanford University. I'm taking an IntroSeminar called Computers and the Open Society, and this is my blog for the class.

In today's market, nearly every trade is made by a computer. You can put in an order on an online broker's website; from there, the trade passes through more computers, and, as if by magic, it is processed. What is that process, exactly? How are trades actually executed if they happen every millisecond? What happens if a computer's trading algorithm glitches or crashes? How is the market affected by all these computers running the show? Are there government regulations covering any of this?

Throughout the quarter, I'll be researching and blogging about computers and the stock market to try and answer some of these questions. I'm interested in studying both Computer Science and Economics, so this topic is perfect for me because it is a smooth blend of both fields.

Just a few decades ago, when there were no computers, the stock market trading floor was cluttered with traders holding ticker tape. These days, it's cluttered with computers making digital transactions. I'm interested in how this process came to be, and what implications this has on the stock market of today and tomorrow.

I'd also love to hear what you think about this topic. If you have any suggestions, I'd love a comment from you!