Friday, December 23, 2011
What do you do with all those books you bought and now consider useless? Following up on my 2009 post on unusual Christmas trees (but having grown increasingly lazy), this year I simply direct you to a site that pictures twelve Christmas trees made out of books.
Wednesday, December 21, 2011
Cortés, Against the Herd
Those of you who watch Fast Money on CNBC (I don’t) are undoubtedly familiar with Steve Cortés since he’s one of the regulars. He is also the founder of Veracruz, a market research firm. In Against the Herd: 6 Contrarian Investment Strategies You Should Follow (Wiley, 2012), itself a fast-paced book, he shares some of the fruits of his and his firm’s research.
The themes are China, Japan, gold, the housing market, market volatility, and the U.S. (With one exception, I won’t play spoiler and reveal which way he positions himself on each.)
Here I’ll share Cortés’s take on Japan. In a chapter entitled “Dolls Are Meant for Children” he predicts a “severe demographic and fiscal implosion” for Japan, arguing that the country’s problems “are utterly terminal [and] there is literally no escape from the death spiral.” (p. 35) The chapter’s title, by the way, refers to the doll Yumel, which serves “as a fake grandchild for the massively growing legions of lonely, geriatric, grandchildless Japanese.” (p. 34)
In the 1980s Japan seemed economically unstoppable. The Nikkei reached a closing high of 38,916 in 1989, and eight of the ten largest companies in the world by market cap were Japanese. Super-low rates and a strong yen created “the tinder for a classic bonfire of reckless investment.” (p. 41) As we know, the bubble popped: the Nikkei declined over 80% to a 2009 low of 7,055, commercial real estate values fell 87% in the early 1990s, and deflation set in.
Japan is now in an “inescapable” bond trap, with a debt-to-GDP ratio over 200% and a debt-to-private GDP ratio at 240%. For the past 20 years Japan has been able to sell its bonds to domestic insurance companies, individual Japanese savers, and public pension plans. “But the famously thrifty Japanese are fast drawing down savings and the trajectory is certain and points toward an aging nation of net spenders, not savers. Japan began the two lost decades with a savings rate at 16 percent. It has slowly dipped to 2 percent and will soon likely head to negative territory.” (p. 46)
If Japan has to tap the international fixed income market, rates will have to rise. “According to hedge fund titan Kyle Bass, every 1 percent increase in the Japanese government’s cost of capital will consume an astounding 25 percent of total government revenue. He states, ‘For context, if Japan had to borrow at France’s rate, the interest burden alone would bankrupt the government.’” (p. 48) Japan would have to roll out its printing presses and devalue its currency.
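Bass’s arithmetic is easy to check. The debt and revenue figures below are my own rough assumptions (circa 2011), not numbers from the book, but they show how a one-point rise in rates can eat a quarter of government revenue:

```python
# Rough illustration of the Kyle Bass arithmetic quoted above.
# The debt and revenue figures are ballpark assumptions (circa 2011),
# not numbers taken from the book.
debt = 1000e12          # outstanding JGBs, ~1,000 trillion yen (assumed)
revenue = 40e12         # annual government tax revenue, ~40 trillion yen (assumed)

rate_increase = 0.01    # a 1-percentage-point rise in the cost of capital
added_interest = debt * rate_increase

share_of_revenue = added_interest / revenue
print(f"Added interest: {added_interest / 1e12:.0f} trillion yen")
print(f"Share of revenue consumed: {share_of_revenue:.0%}")
```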
Cortés offers some ideas on how to capitalize on Japan’s impending doom, the simplest being shorting the yen and buying the U.S. dollar.
Cortés’s writing is fine in small doses but formulaic at book length. He tries to ease the reader into investing concepts by invoking pop culture images. For instance, in the chapter on gold, he starts with MC Hammer, moves on to John Travolta, then the Bible (well, I guess that wouldn’t qualify as pop culture), Richard Simmons, John L. Sullivan, Three’s Company, and finally A Man for All Seasons. This goes on chapter after chapter after chapter after chapter…. It starts to wear thin pretty quickly.
But for those who like to think in macro terms Cortés’s book offers six contrarian (or semi-contrarian) theses supported by well-reasoned arguments and sufficient though not overwhelming data.
Sunday, December 18, 2011
My picks of the year
Last year I wrote a post in which I highlighted some books that I personally found valuable. One of my readers requested a 2011 update. So here it is—my brief, admittedly very idiosyncratic list, presented in alphabetical order. Clicking on a book title should, with any luck, take you to my review.
Aaron Brown, Red-Blooded Risk: The Secret History of Wall Street
William Byers, The Blind Spot
Emanuel Derman, Models.Behaving.Badly
Scott E. Page, Diversity and Complexity
Among the books that deal more directly with investing I almost always enjoy titles in the “little book” series. Here are two, both particularly useful for value investors: Aswath Damodaran, The Little Book of Valuation and Vitaliy N. Katsenelson, The Little Book of Sideways Markets.
Lots of runners-up this year, but I’ll stop for now. I may need to pull them out of the hat for next year’s picks.
Thursday, December 15, 2011
Brooks, Trading Price Action Trends
In 2009 Al Brooks wrote Reading Price Charts Bar by Bar, a book I struggled with, as I explained in my review. It seems I was not alone. Brooks therefore re-engineered his project instead of simply writing a second edition. The result is a three-book series, of which Trading Price Action Trends: Technical Analysis of Price Charts Bar by Bar for the Serious Trader (Wiley, 2012) is the first volume. The other two, forthcoming in January, will deal with trading ranges and reversals.
Trading Price Action Trends is still no spine-tingling thriller, but it’s a tremendous improvement over Brooks’s first effort. For starters, the prose is cleaner and the charts are larger. And instead of merely describing bars, individually and as parts of patterns, he explains what they may reveal about the intentions and expectations of traders, both bulls and bears.
Brooks himself trades primarily off of 5-minute e-mini S&P 500 candlestick charts using only price action—no indicators (with the exception of a 20-bar EMA and hand-drawn trend lines), news, or multiple time frames. He sees everything “in shades of gray” and thinks “in terms of probabilities.” (p. 12) He recognizes that “everything can change to the exact opposite in an instant, even without any movement in price.” (p. 37) When he looks at a chart, he is “constantly thinking about the bullish case and the bearish case with every tick, every bar, and every swing.” (p. 39) He dissects bars but also realizes that ultimately they have meaning only contextually. If he were dealing with animals instead of charts he would be both an anatomist and an ecologist.
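For readers curious how little computation Brooks’s approach actually requires, here is a minimal sketch of his one indicator, a 20-bar exponential moving average; the closing prices are invented for illustration:

```python
# Minimal sketch of the one indicator Brooks allows himself: a 20-bar
# exponential moving average. The prices below are made up for illustration.
def ema(prices, n=20):
    k = 2 / (n + 1)            # standard EMA smoothing factor
    value = prices[0]          # seed with the first price
    out = [value]
    for p in prices[1:]:
        value = k * p + (1 - k) * value
        out.append(value)
    return out

closes = [1280.0 + 0.25 * i for i in range(40)]   # dummy 5-minute closes
print(round(ema(closes)[-1], 2))
```

In a steady uptrend like the dummy series above, the EMA sits below the last close, which is exactly the lag that makes it useful as a trend reference.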
About half of this volume is devoted to the fundamentals of price action, the other half to trends. Of course, there is no clear demarcation line between the two. It’s impossible to write about the fundamentals of price action without discussing trends. At the level of individual bars, for instance, Brooks differentiates between trend bars and dojis (where the bulls and bears are in balance).
Who should read this book? Novices who think that trading is easy; this book should definitely dissuade them and perhaps prevent yet another account from being blown out. Serious traders, as Brooks himself suggests—and I would add serious traders with a penchant for detailed analysis who are willing to log thousands of hours of screen time and after-hours study in order to stand a chance of becoming a successful discretionary trader.
Trading Price Action Trends is a tough book to absorb. One pass is certainly not enough. But even on the first pass I found some extremely useful pointers. So it goes on to the shelf awaiting a second read.
Wednesday, December 14, 2011
Smith and Shawky, Institutional Money Management
Institutional Money Management: An Inside Look at Strategies, Players, and Practices, edited by David M. Smith and Hany A. Shawky (Wiley, 2012) is the most recent addition to the Kolb Series in Finance. As is the custom with books in this series, it includes contributions by both academics and practitioners and is designed for professionals in the field as well as those aspiring to enter the field. It is a well-edited volume that anyone with a modicum of market experience should have no difficulty reading.
The book explores four main themes in 22 chapters: market regulation, performance evaluation, and reporting; individuals key to the investment process; major investment approaches; and types of institutional investors.
In this post, rather than attempting an overview, I’m going to zero in on a single point that I think is potentially important for the individual investor.
The editors, in their chapter “Investment Buy and Sell Decision Making,” analyze data from Informa’s plan sponsor network (PSN) database, which is updated quarterly through surveys of money managers. They examine 5,410 equity portfolios and 1,494 fixed income portfolios from 1979 to the November 2009 release.
What criteria, they ask, do portfolio managers use when buying equities? About 60% reported using a bottom-up method. The next most popular criteria were quantitative/research (14%), fundamental analysis (11%), computer screening/models (4%), and top-down/economic analysis (4%). Only 20 portfolios of the 5,410 relied on technical analysis although, as the authors note, “the criteria most closely related to technical analysis—quantitative analysis, computer screening, and momentum—also enjoy widespread usage by portfolio managers.” (p. 125)
As we know, the more important question is usually when to sell. The PSN database recognizes six sell-discipline criteria: down from cost, up from cost, target price, valuation level, fundamental deterioration overview, and opportunity cost. The most popular among managers was fundamental, followed by valuation level; target price came in third.
The authors analyze returns by equity class (and the overall average) for each of these sell-discipline criteria. The best-performing criterion was down from cost, followed by target price. The worst performance, by a large measure, was logged by those who used no sell-discipline criterion. Here are the overall average numbers for the arithmetic average benchmark-adjusted returns (percent annualized): fundamental 2.17%, valuation level 2.00%, target price 2.47%, opportunity cost 1.76%, down from cost 2.59%, and none 1.22%.
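Restating the chapter’s overall averages in one place makes the ranking obvious; a few lines of Python sort them:

```python
# Benchmark-adjusted annualized returns (percent) by sell discipline,
# overall averages as reported in the chapter.
returns = {
    "fundamental": 2.17,
    "valuation level": 2.00,
    "target price": 2.47,
    "opportunity cost": 1.76,
    "down from cost": 2.59,
    "none": 1.22,
}
for criterion, r in sorted(returns.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{criterion:16s} {r:.2f}%")
```

Down from cost, essentially a stop-loss discipline, tops the list; having no sell discipline at all comes in dead last.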
There’s a lesson here.
Tuesday, December 13, 2011
McDowell, Survival Guide for Traders
So you want to become an independent trader, either to supplement your income or eventually to have trading be your sole source of income? Bennett A. McDowell has written Survival Guide for Traders: How to Set Up and Organize Your Trading Business (Wiley, 2012) for the novice who wants to get started but doesn’t quite know how to go about it. If the wannabe trader doesn’t feel quite ready to plunge into the markets after reading this book, McDowell is more than ready to sell him a range of pricey products on his website TradersCoach.com.
Like all start-up businesses, trading is difficult and prone to failure. What will give the novice a shot at being successful? McDowell offers six pointers: (1) understand it will be a lot of work, (2) get adequate financing, (3) plan, plan, and plan some more, (4) start your business for the right reasons, (5) be resilient and persevere, and (6) create a model that can be profitable. And, he adds, reduce your expenses because with taxes a penny saved is closer to a penny and a half earned.
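That last line is just tax arithmetic. Assuming a combined marginal tax rate of roughly 33 percent (my figure, not McDowell’s):

```python
# The arithmetic behind "a penny saved is closer to a penny and a half
# earned": to keep one after-tax dollar you must earn 1 / (1 - rate)
# pre-tax dollars. The 33% combined marginal rate is an assumption,
# not a figure from the book.
marginal_rate = 0.33
pretax_needed = 1 / (1 - marginal_rate)
print(round(pretax_needed, 2))   # ~1.49 pre-tax dollars per dollar saved
```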
McDowell covers a lot of territory in this book, from how to choose the best data feed, broker, and front-end platform for your needs to money management and financial psychology. He offers a detailed business plan template. He lists some technical analysis signals and tools (those included in his own software are at the top of the list) and five popular scanning tools (again, his scanner heads the list).
What is the holy grail of trading? McDowell suggests that it is perseverance: “those who survive and prosper for the long term are the traders who can persevere through thick and thin and just keep going with new solutions and strategies.” (p. 132)
Survival Guide for Traders is a good primer with an abundance, sometimes an overabundance, of information, but of course the reader won’t go from “See Spot run” to “Sleep that knits up the ravelled sleave of care” in one fell swoop. There’s a lot of work and practice in between.
Monday, December 12, 2011
Stoken, Survival of the Fittest for Investors
Over the past few years, as academics have come to view markets as complex adaptive systems, Darwin’s ideas have worked their way into the mainstream financial literature. In Survival of the Fittest for Investors: Using Darwin’s Laws of Evolution to Build a Winning Portfolio (McGraw-Hill, 2012) Dick Stoken explores both the theory and practice of Darwinism as it applies to the individual investor.
Free markets, Stoken explains, use the same algorithm as evolution—search through potential designs, select the few that are good enough, then replicate or amplify them. Free markets are “littered with errors,” but “each error, along the way, provides feedback so as to formulate a new trial until a solution is found.” (pp. 29-30) Moreover, similar to prey/predator models, “free markets fluctuate…. During the exuberant phase of the cycle, errors become embedded into the system and, at some point, interfere with its ability to self-regulate. Reversals are necessary to flush enough of the errors out so that the system can regain its former vitality.” (p. 31)
Stoken analyzes bubbles at some length and argues that the errors in the most recent financial crisis were not man-made. “If only X had done Y” is a misplaced criticism. The errors “were of the kind that living systems, operating via a trial-and-error process, typically make. Therefore, they could not, in an ordinary sense, be man-fixed.” The real error, he contends, was “in not allowing for error. … Our wealth-generating machine lowered its margin of safety drastically, leaving little room for anything to go wrong.” Stoken continues: “The more complex a system becomes, the greater the number of errors. Bubble time is also peak complexity time. The particular error doesn’t matter so much, as all roads lead to system failure. Much like Hercules in his battle with the nine-headed serpent Hydra, when Hercules cut off one head, two more would sprout; if we fixed one error, the problem shifted and another and more potent accident soon popped up.” (pp. 106-107)
What is an investor who accepts this view of the market to do? Of course, he has to adapt. Stoken offers several concrete options, from a passive diversified portfolio of alternative investments to a levered actively managed “combined assets” portfolio, from annual rebalancing to using a breakout system for buy and sell signals. He includes basic backtesting results for each strategy.
Survival of the Fittest for Investors is one of the “fittest,” best-written investment books I’ve read in some time. The investor who is searching for a way to boost returns would do well to add it to his must-read list.
Saturday, December 10, 2011
A daily chart advent calendar
As usual, I'm late to the game. On November 30 The Economist published a daily chart advent calendar, a collection of the 24 most popular maps, charts, data visualisations, and interactive features on their site this year, plus a new chart for Christmas Day. It's not just the routine stuff: the first graph charts the street price of cocaine in rich nations alongside adult usage in those countries. We're now on day ten; you're not allowed to peek ahead.
Thursday, December 8, 2011
Schmitt, 401(k) Day Trading
Over the past couple of weeks I have reviewed several books that meticulously explore the viability of various investing and trading strategies. Richard Schmitt’s 401(k) Day Trading: The Art of Cashing in on a Shaky Market in Minutes a Day (Wiley, 2011) does not fall into this category.
Schmitt argues that the investor with unconsolidated 401(k) accounts can outperform buy-and-hold returns by trading part of his retirement portfolio once every day. How, you may ask, can he do this given the restrictions on frequent trading imposed by most mutual funds? It’s simple: the investor holds two commission-free stock fund accounts, one for buying stock and the other for selling stock, plus a cash fund. And here’s the strategy. “You use one account for transfers just before the market close from a stock fund to a cash fund whenever the stock market is advancing for the day. In the other account, you transfer from a cash fund to a stock fund when the market is declining for the day.” (p. 220)
The investor doesn’t swap out his total buying or selling account each day. Instead, he determines how much to exchange based on the S&P 500’s daily change (not the daily range) multiplied by a constant calibration factor. “Under normal conditions, you could use a calibration factor of one thousandth of the initial amount of your retirement savings portfolio you decide to subject to day trading.” (p. 194) So, if the investor had a $100,000 portfolio, the S&P moved 10 points, and his calibration factor was $100, he would exchange $1,000.
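The rule is simple enough to state in a few lines. This sketch follows the book’s worked example (the function name and sign convention are mine):

```python
# Sketch of Schmitt's daily exchange rule: amount = S&P 500 daily change
# (in points) times a fixed calibration factor. The function name is
# mine; the $100-per-point factor follows the book's example for a
# $100,000 portfolio (one thousandth of the starting balance).
def exchange_amount(sp500_change_points, calibration_factor=100.0):
    # Positive result -> move stock to cash; negative -> move cash to stock.
    return sp500_change_points * calibration_factor

print(exchange_amount(10))    # S&P up 10 points -> exchange $1,000 into cash
print(exchange_amount(-10))   # S&P down 10 points -> exchange $1,000 into stock
```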
Schmitt provides very little statistical information about his backtesting of this method. He compares only the differences in returns between the $100,000 day trading portfolio and the $100,000 S&P 500 portfolio. To “accommodate the market volatility experienced during the decade ended December 31, 2010,” the backtested day trading portfolio used a calibration factor of $75 for each daily one-point change in the S&P 500 index. (I assume this figure was optimized in hindsight, warning flag #1.) The day trading portfolio outperformed the index portfolio over the ten-year period (120 vs. 95.3) and a five-year timeframe (115.1 vs. 100.7). In 2010 itself, however, the index portfolio outperformed 112.8 vs. 107.3. We have no data on portfolio drawdowns.
The author acknowledges that “a comparison of alternative asset management strategies relies heavily on the measurement period selected for use in computing the strategies’ investment returns” (warning flag #2). “A 20-year comparison of alternative strategies becomes a bit more interesting.” (p. 207) We don’t know how badly Schmitt’s strategy would have performed over that time frame. Suffice it to say that the author acknowledges that “day trading is a strategy of the times, but not for all times. It may perform well in a volatile stock market with little or no net change over time but does not fare as well by comparison in a rapidly increasing or decreasing stock market.” (p. 208) So, warning flag #3 is that the investor has to be able to read the tea leaves of the market environment, present and near future.
Schmitt’s book provides a lot more information about 401(k) plans than I have shared in this review, and to that extent I have been unfair. But before investors with 401(k) plans get all excited about a way to earn excess returns with very little work they should step back a bit and think it through.
Schmitt argues that the investor with unconsolidated 401(k) accounts can outperform buy-and-hold returns by trading part of his retirement portfolio once every day. How, you may ask, can he do this given the restrictions on frequent trading imposed by most mutual funds? It’s simple. He has two commission-free stock fund accounts, one for buying stock and the other for selling stock, and a cash fund. And here’s the strategy. “You use one account for transfers just before the market close from a stock fund to a cash fund whenever the stock market is advancing for the day. In the other account, you transfer from a cash fund to a stock fund when the market is declining for the day.” (p. 220)
The investor doesn’t swap out his total buying or selling account each day. Instead, he determines how much to exchange based on the S&P 500’s daily change (not the daily range) multiplied by a constant calibration factor. “Under normal conditions, you could use a calibration factor of one thousandth of the initial amount of your retirement savings portfolio you decide to subject to day trading.” (p. 194) So, if the investor had a $100,000 portfolio, the S&P moved 10 points, and his calibration factor was $100, he would exchange $1,000.
Schmitt provides very little statistical information about his backtesting of this method. He compares only the differences in returns between the $100,000 day trading portfolio and the $100,000 S&P 500 portfolio. To “accommodate the market volatility experienced during the decade ended December 31, 2010,” the backtested day trading portfolio used a calibration factor of $75 for each daily one-point change in the S&P 500 index. (I assume this figure was optimized in hindsight, warning flag #1.) The day trading portfolio outperformed the index portfolio over the ten-year period (120 vs. 95.3) and a five-year timeframe (115.1 vs. 100.7). In 2010 itself, however, the index portfolio outperformed 112.8 vs. 107.3. We have no data on portfolio drawdowns.
The author acknowledges that “a comparison of alternative asset management strategies relies heavily on the measurement period selected for use in computing the strategies’ investment returns” (warning flag #2). “A 20-year comparison of alternative strategies becomes a bit more interesting.” (p. 207) We don’t know how badly Schmitt’s strategy would have performed over that time frame. He further concedes that “day trading is a strategy of the times, but not for all times. It may perform well in a volatile stock market with little or no net change over time but does not fare as well by comparison in a rapidly increasing or decreasing stock market.” (p. 208) So warning flag #3 is that the investor has to be able to read the tea leaves of the market environment, present and near future.
Schmitt’s book provides a lot more information about 401(k) plans than I have shared in this review, and to that extent I have been unfair. But before investors with 401(k) plans get all excited about a way to earn excess returns with very little work they should step back a bit and think it through.
Wednesday, December 7, 2011
Hassett, The Risk Premium Factor
In The Risk Premium Factor: A New Model for Understanding the Volatile Forces That Drive Stock Prices (Wiley, 2011) Stephen D. Hassett sets out to provide a model for estimating the equity risk premium (and hence the cost of capital) and for solving the equity premium puzzle.
Hassett begins with the CAPM equation for the cost of equity: risk-free rate + beta x equity risk premium (ERP). For the market as a whole, the cost of equity is the risk-free rate + ERP. The problem is how to calculate ERP. Enter Hassett’s risk premium factor (RPF) model, which proposes that “the equity risk premium (ERP) is a simple function of the risk-free rate.”
“Conventional theory would hold that if the equity risk premium (ERP) were 6.0 percent and 10-year Treasury yield were 4.0 percent, then investors would expect equities to yield 10 percent, but if the 10-year Treasury were 10 percent, then investors would require a 16 percent return—a proportionately smaller premium.” By contrast, Hassett argues that ERP is not fixed but “varies directly with the level of the risk-free rate in accordance with a risk premium factor (RPF).” For example, “with an RPF of 1.48, equities are expected to yield 9.9 percent when Treasury yields are 4.0 percent and 24.8 percent (10 + 1.48 x 10 = 24.8) when they are at 10 percent to provide investors with the same proportional compensation for risk.” (p. 19)
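In code the model reduces to a one-liner. This sketch, my own illustration using the book's 1.48 factor, reproduces the figures quoted above:

```python
def expected_equity_return(risk_free, rpf=1.48):
    """Hassett's RPF model: ERP = RPF * risk-free rate, so the expected
    equity return is risk_free * (1 + RPF) rather than the conventional
    risk_free + fixed ERP."""
    return risk_free * (1 + rpf)

expected_equity_return(0.04)   # ~0.0992, i.e. the book's 9.9 percent
expected_equity_return(0.10)   # ~0.248, i.e. the book's 24.8 percent
```

The contrast with the conventional view is that the premium scales proportionally with the risk-free rate instead of sitting on top of it as a constant.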
To calculate the RPF Hassett ran regressions on annual data between 1960 and 2008 and quarterly data from Q4 1986 to Q4 2008. He found two shifts in the RPF—in 1981 and September 2002. (The causes of these shifts, the author admits, are still not fully explained.) The RPF values for the annual data sets were 1.24 between 1960 and 1980, 0.90 from 1981 through 2001, and 1.51 for 2002 through 2008.
Hassett acknowledges that the RPF can be discerned only in hindsight and cannot be projected, but he still considers his method superior to other methods for determining risk premiums. For instance, “if the RPF changed just two times over 50 years, one might argue that in any year there is a 96 percent chance … that the RPF will remain constant over the next year.” (p. 28)
The RPF model is also brought to bear on the equity premium puzzle, the inability to reconcile the observed ERP with financial models. The authors of a 1985 paper (Mehra and Prescott) found that on average short-term Treasuries produced a real return of about 1% over the long term and equities yielded 7%. This, the authors maintained, would require a puzzling coefficient of risk aversion on the order of 40 or 50 to justify the 7% ERP. Hassett invokes his model in conjunction with the notion of loss aversion to tackle the puzzle.
Hassett uses his model to explain major market gyrations. He also explains how it can be used to value an acquisition or project.
For most investors the model is most applicable in valuing the overall stock market. Hassett argues that when trying to decide whether the market is over- or undervalued the analyst should focus on the two drivers of valuation—earnings and interest rates (interpreted through the lens of the RPF model).
I am not equipped to pass judgment on Hassett’s model. It’s certainly a simple model, not one of those complex quant models that have come under attack of late. It also seems plausible. Is it useful? Perhaps.
Tuesday, December 6, 2011
Triana, The Number That Killed Us
Pablo Triana, a professor at ESADE Business School (Spain), is a man on a mission: to rid modern finance of complex mathematical models. In The Number That Killed Us: A Story of Modern Banking, Flawed Mathematics, and a Big Financial Crisis (Wiley, 2012) his target is one of the most widely used models, VaR (Value at Risk).
Triana’s thesis is fairly straightforward, although he spends over 200 pages fleshing it out. VaR is fundamentally flawed because it relies on historical data, viewing the past as prologue; it assumes a normal probability distribution; and it doesn’t differentiate among kinds of assets. Moreover, since VaR is the key metric invoked to determine leverage, traders can use it to game the system. They can put together portfolios with ostensibly low risk profiles and hence be eligible to use more leverage, an exercise that sometimes masks reckless behavior.
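For context, here is a bare-bones sketch of the parametric (variance-covariance) VaR calculation of the sort Triana is attacking; it is my own simplified illustration, not a bank's actual model. Both inputs he distrusts are visible in it: historical data as the sole evidence, and a normal distribution implied by the z-score.

```python
import statistics

def parametric_var(daily_returns, portfolio_value, z=1.645):
    """One-day 95% value-at-risk, textbook variance-covariance style:
    z * standard deviation of past daily returns * portfolio value.
    The z of 1.645 assumes returns are normally distributed, and the
    sigma assumes the past is prologue, Triana's two main complaints."""
    sigma = statistics.stdev(daily_returns)
    return z * sigma * portfolio_value

# A quiet stretch of history yields a small sigma and hence a small,
# deceptively precise-looking dollar figure.
returns = [0.012, -0.008, 0.004, -0.015, 0.006, -0.002, 0.009, -0.011]
parametric_var(returns, 1_000_000)
```

A low VaR figure from a calm sample is precisely what lets a trader justify more leverage, which is the gaming problem Triana describes.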
VaR, Triana argues, was the “top miscreant” in the financial crisis. “Without those unrealistically insignificant risk estimates, the securities that sank the banks and unleashed the crisis would most likely not have been accumulated in such a vicious fashion, as the gambles would not have been internally authorized and, most critically, would have been impossibly expensive capital-wise.” (p. 3)
Triana wants to get rid of VaR. What should replace it? “Going forward, let’s do less mathematical financial risk analysis, please. Softer sapience based on traders’ war scars, experience-honed intuition, historical lessons, and networking with other players will not only typically beat quant sapience when it comes to understanding and deciphering exposures (we humans can’t be that bad!), but most crucially should be far more effective in preventing obviously lethal, chaos-igniting practices.” (pp. 44-45)
Financial risk, he contends, “is a simple discipline. Or rather, a discipline that ought to be based on fairly simple tenets: Financial risk is not measurable or forecastable, the past is not prologue, battle-scarred experience-honed intuitive wisdom should be accorded utmost notoriety, certain assets are intrinsically riskier than others, too much leverage should be avoided, and too much toxic leverage should be banned.” (p. 213)
The book closes with a brief Q&A with Nassim Taleb and an essay by Aaron Brown that presents a more balanced view of VaR.
I’m certainly no expert on VaR or on the risk management practices of investment banks, but from the little I know Triana does justice to neither. Considering that he doesn’t want to reform quantitative risk management but either to abandon it or to keep a simplified version of it on a tight leash, I suppose fine points are irrelevant. Triana paints with broad, impassioned brushstrokes.
In my opinion supplementing VaR in particular and quantitative models in general with a large dose of human wisdom is a laudable goal. Maintaining a healthy margin of safety is important even if it means diminished profits during good times. Keeping models as simple as possible is undoubtedly good practice. On the other hand, chucking math and replacing it with so-called “battle-scarred experience-honed intuitive wisdom” is not. For one thing, behavioral finance has taught us that, despite our best intentions, we can be terrible bunglers. We humans really are that bad! For another, what masquerades as wisdom often turns out to be an oversized ego. Just think of …. Well, I’m sure you can easily fill in the blanks.
Monday, December 5, 2011
Twomey, Inside the Currency Market
For those who are intellectually curious and who want to know more about the currency market than they’ll ever need to become a halfway decent trader Brian Twomey’s Inside the Currency Market: Mechanics, Valuation, and Strategies (Bloomberg/Wiley, 2012) is a fascinating if sometimes overwhelming book.
Twomey begins not with the definition of a pip but with an analysis of Bank for International Settlements (BIS) reports (which, by the way, are available online). The second chapter, entitled “Currency Trading Beyond the Basics,” lives up to its billing. It deals with such topics as margin in various countries, rollover rates and LIBOR, swap points, and purchasing power parity.
Twomey supplies formulas where necessary and charts where helpful as he takes the reader on a journey through trade-weighted indices, short-term interest rates and money market instruments, LIBOR, government bonds and yield curves, swaps and forwards, stock and bond markets, currency cycles and volatility. The journey is also geographical, encompassing markets in all the major crosses. And it includes a series of recommended trade strategies.
The final chapter is on technical analysis but, once again, not just the run-of-the-mill fare. He describes volume and open interest, COT reports, Bollinger bands, simple moving averages, Ichimoku, the Baltic dry index, the IMF and special drawing rights, pivot points, and currency correlations and trend lines.
This post is beginning to sound like a laundry list, but it’s the only way I can convey the breadth and the sometimes unexpected content of Twomey’s book. The book is definitely not for those who want a “dummies” introduction or who are looking for instructions on how to get rich quick in the forex market. It’s also not for those in search of some light reading for a rainy afternoon. Inside the Currency Market is for the serious student who wants to go beyond simple buy and sell signals to understand the range of market factors that influence currency prices.
Thursday, December 1, 2011
Derman, Models.Behaving.Badly
Models.Behaving.Badly: Why Confusing Illusion with Reality Can Lead to Disaster, on Wall Street and in Life (Free Press, 2011) is a wise book by a man who has thought deeply about his life, and not just as a quant on Wall Street. Emanuel Derman reminisces about his youth as a member of Habonim (a coeducational Zionist movement) in apartheid South Africa, his ordeal with myopic ophthalmological specialists, and his immersion in theoretical physics. He ventures into an unlikely corner of philosophy—not epistemology but Spinoza’s theory of emotions. He describes “the crooked paths that culminate in theories,” in particular classical and quantum electromagnetic theory.
Although these discussions have a life of their own, they serve to illustrate three fundamental ways of understanding the world: models, theories, and intuition. Intuition is “a merging of the understander with the understood”; “it emerges only from intimate knowledge acquired after careful observation and painstaking effort.” (pp. 96-97) Theory bears a close relationship to intuition. “[W]hen it is successful … it describes the object of its focus so accurately that the theory becomes virtually indistinguishable from the object itself. Maxwell’s equations are electricity and magnetism; the Dirac equation is the electron….” (p. 61)
Of the three, models are the most common and potentially the most troubling ways of understanding. “A model is a metaphor of limited applicability, not the thing itself.” (p. 54) It is an analogy which, although not unfounded, is partial and flawed. “Models project multidimensional reality onto smaller, more manageable spaces where regularities appear and then, in that smaller space, allow us to extrapolate and interpolate from the observed to the unknown.” (p. 58)
Since the financial markets do not obey the laws of science but are subject to the vagaries of human behavior, there can be no financial theories, only financial models. And, as Derman writes graphically, “When we make a model involving human beings, we are trying to force the ugly stepsister’s foot into Cinderella’s pretty glass slipper. It doesn’t fit without cutting off some of the essential parts.”
Models are not useless; indeed, they can be “immensely helpful in calculating initial estimates of value.” (p. 194) The best model in all of economics, Derman argues, is Black-Scholes. Although it is imperfect, it stands head and shoulders above CAPM: “an unkind way to look at CAPM is to say that it’s not very good.” (p. 182)
Reading Derman’s book is both an intellectually rewarding and a thoroughly pleasurable way to spend an afternoon. I recommend it unequivocally.
Wednesday, November 30, 2011
Fisher, Using Median Lines as a Trading Tool
Median lines can sometimes be very useful in mapping out market structure and in identifying potential profit targets. One argument against them, however, is that they are subjective and that as a result no trading strategy that incorporates them can be backtested.
Greg Fisher of www.Median-Line-Study.com addressed this problem in a paper he wrote for an MBA independent study course and subsequently published in 2009 as Using Median Lines as a Trading Tool: An Empirical Study--Grain Markets 1990-2005. He also wrote a companion piece, Finding High Probability Lines. Both are brief works (70-some pages) and in part cover material available elsewhere. But his study of the grain markets is a serious attempt at backtesting median line principles.
The thorniest problem is the choice of pivot points. They are obvious in hindsight and somewhere between tough and impossible to identify in real time. Fisher uses Andrews’ definition of a trendline: “For an uptrend within the period of consideration, draw a line from the lowest low, up and to the highest minor low point preceding the highest high. The line must not pass through prices in between the two low points. Extend the line.” A pivot is “a reverse in price direction that reverses the previous trend by violating the previous trendline.” (pp. 13-14) Once a pivot forms, a median line and its parallels are drawn.
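For the geometrically inclined, the construction itself is easy to sketch in Python; this is my own illustration (function names and pivot coordinates invented), and it says nothing about the hard part, which is picking the pivots in real time:

```python
def median_line(pivot_a, pivot_b, pivot_c):
    """Andrews median line: starts at pivot A (bar, price) and passes
    through the midpoint of the B-to-C swing. Returns anchor and slope."""
    ax, ay = pivot_a
    mid_x = (pivot_b[0] + pivot_c[0]) / 2
    mid_y = (pivot_b[1] + pivot_c[1]) / 2
    slope = (mid_y - ay) / (mid_x - ax)
    return (ax, ay), slope

def price_on_line(anchor, slope, bar):
    """Project the extended line out to a given bar index."""
    ax, ay = anchor
    return ay + slope * (bar - ax)

# Illustrative pivots: low at bar 0, high at bar 10, higher low at bar 20.
anchor, slope = median_line((0, 100.0), (10, 120.0), (20, 105.0))
price_on_line(anchor, slope, 30)   # where the median line sits 10 bars on
```

The upper and lower parallels are simply the same slope anchored at pivots B and C, which is why the whole structure stands or falls with the pivot choice.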
Fisher offers a flowchart of possible price action, starting with the most basic either/or: does price reach the median line or not?
And, among other things, he records the percentage of time price reached the median line as well as price action at the median line.
Fisher’s work is thorough and probably about as good as backtesting median lines can be. Yet, as he himself admits, it benefited from hindsight since he was working with historical data. “The study assumes all pivots were chosen correctly as it is based on known price data.” (p. 38) In real time trendlines are drawn and re-drawn. In fact, even in the figure Fisher provides to illustrate Andrews’ notion of trendlines there are obvious places to draw trendlines that misidentify the important pivot points.
This is no reason to reject median lines out of hand. Andrews’ method can be used in multiple ways, for multiple purposes. For instance, one doesn’t always have to rely on the most recent, still tentative pivot to draw median lines. Often the best sets are those that the market has already respected.
For readers who are unfamiliar with median lines Fisher’s books provide a good introduction. For those who want to subject the median line method to some Monday-morning backtesting Fisher offers a carefully thought out model.
Tuesday, November 29, 2011
Frush, All About Exchange-Traded Funds
ETFs have been a major force in the markets for some years. It is perhaps a tribute to their success that a subcommittee of the Senate Banking Committee held a special hearing last month examining such topics as whether ETFs are contributing to volatility and posing risks to the financial system.
Scott Paul Frush writes for those who want, as the subtitle of every “All About” book reads, “the easy way to get started” in ETFs. All About Exchange-Traded Funds (McGraw-Hill, 2012) is a comprehensive introduction. Among other things it includes a brief history of ETFs, it identifies the players, it explains how ETFs come to market, it describes their internal workings, and in a hundred-page section it outlines the main types of ETFs (broad-based, sector and industry, fixed-income, global, real asset, and specialty). It even includes a list of ETF closures by year, through 2010.
In an early chapter Frush discusses the advantages and drawbacks of ETFs relative to mutual funds and individual stocks. Let me mention just three here. First, a plus for ETFs, they are not subject to style drift, “the tendency of a portfolio manager to deviate from his or her fund’s specific strategy or objective.” (p. 44)
Second, a plus for mutual funds, some ETFs suffer from dividend drag, “the implicit cost some ETFs incur as a result of the … SEC rules stipulating that certain ETFs cannot immediately reinvest back into the portfolio the dividends paid by companies held in the fund. Instead, some ETFs must accumulate the dividends in a cash reserve account and pay them to shareholders at periodic intervals—typically quarterly. This requirement differs for mutual funds, as they can reinvest dividends immediately.” (p. 45)
Third, ETFs are fully invested by nature—“and that’s a very good thing for investors.” Mutual funds, by contrast, must allocate part of their holdings to cash to satisfy shareholder liquidations. “If you were to build a portfolio of 60 percent equities and 40 percent fixed income and later discovered that many of your equity mutual funds hold 10 percent cash, you might not be too happy since that means that your actual portfolio allocation is closer to 54/46 percent equities to fixed income, respectively.” (pp. 46-47) Speaking of asset allocation, Frush devotes a chapter to the topic in the final part of the book on how to use ETFs.
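The arithmetic behind Frush's 54/46 example is simple enough to sketch; the function and its name are mine:

```python
def effective_allocation(target_equity_weight, fund_cash_fraction):
    """Frush's point: if the equity funds themselves hold cash, the
    portfolio's true equity weight is the target weight times the
    invested (non-cash) fraction of those funds."""
    equity = target_equity_weight * (1 - fund_cash_fraction)
    return equity, 1 - equity

effective_allocation(0.60, 0.10)   # ~(0.54, 0.46), the book's 54/46 split
```

A fully invested ETF keeps the target and effective weights aligned, which is the point Frush is making.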
All About Exchange-Traded Funds is a well-written, balanced introduction. Both individual investors and financial advisers could profit from reading it.
Monday, November 28, 2011
Zacks, The Handbook of Equity Market Anomalies
Leonard Zacks, the founder of Zacks Investment Research, has edited a book that should appeal to serious investors who are trying to find an edge. The contributors to The Handbook of Equity Market Anomalies: Translating Market Inefficiencies into Effective Investment Strategies (Wiley, 2011) are mostly academics. They review literature from the past twenty years on market anomalies and draw practical implications for the individual investor.
After an introductory chapter on the conceptual foundations of capital market anomalies, the book highlights nine anomalies: the accrual anomaly, the analyst recommendation and earnings forecast anomaly, post-earnings announcement drift and related anomalies, fundamental data anomalies, net stock anomalies, the insider trading anomaly, momentum (the technical analysis anomaly), seasonal anomalies, and size and value anomalies. The editor then looks at anomaly-based processes for the individual investor, and an appendix covers the use of anomaly research by professional investors. As you might expect, each chapter includes an extensive bibliography. (The website that accompanies the book has abstracts of the referenced works and links to the original articles—some available at no charge.)
The authors explore strategies that worked, that still work, and—the toughest hurdle of all—that continue to produce excess risk-adjusted returns after all those pesky transaction costs are deducted.
Many of the anomalies can be attributed to psychological factors. To take perhaps the simplest case, sunshine has been linked to tipping and the lack of sunshine to depression and suicide. When the sun shines people feel good, are more optimistic, and “may be more inclined to buy stocks thus leading to higher stock prices.” (p. 255) Indeed, studies show, sunshine is strongly positively correlated with daily stock returns, especially the farther one is from the equator. Other weather conditions such as rain and snow are unrelated to returns.
A question that I found intriguing is whether aggregate insider trading can be used to predict market returns. (Studies suggest that insider trading as an individual stock picking strategy delivers superior returns over long investment horizons.) Insiders, it turns out, are contrarian traders; they sell after increases in the market and buy after poor performance. So far studies indicate that “aggregate insider trading is more successful in predicting forthcoming poor stock market performance, but it can also be used with moderate success to predict large stock market increases. Aggregate insider transactions have a promising potential to be used as a market-timing tool.” (p. 166)
Zacks suggests that research on these anomalies be used to create quant multifactor equity models. The investor can embrace that suggestion if she has some quant skills, can subscribe to a scanning service (Zacks Investment Research, of course, provides such a service), or can use select anomalies to supplement her current investing strategy. No matter what the choice, this book is a cornucopia of carefully researched investing ideas.
Thursday, November 24, 2011
Wednesday, November 23, 2011
Martin, Benjamin Graham and the Power of Growth Stocks
Most people asked to describe the difference between value and growth investing would stumble because these strategies are neither mutually exclusive nor collectively exhaustive. In fact, Warren Buffett argues that “growth is always a component in the calculation of value” and “the very term ‘value investing’ is redundant. What is ‘investing’ if not the act of seeking value at least sufficient to justify the amount paid?” Frederick K. Martin gives full form to this “joined at the hip” reality in Benjamin Graham and the Power of Growth Stocks: Lost Growth Strategies from the Father of Value Investing (McGraw-Hill, 2012).
Graham may be best known for his rigorous methodology for evaluating an investment, but “as his career progressed, he developed an appreciation for the long-term power of growth.” Later in his career he purchased a major stake in GEICO for $27 a share “and watched it rise over the ensuing years to the equivalent of $54,000 per share. … That single transaction, which accounted for about a quarter of his assets at the time, ultimately yielded more profit than all his other investments combined.” (p. 7)
Graham defined a growth stock as “one which has done better than average over a number of years and is expected to do so in the future.” In the 1962 edition of Security Analysis he devoted an entire chapter to analyzing and valuing growth stocks. “Brilliant in its common sense and simplicity, it was a landmark chapter in Graham’s illustrious career before it inexplicably disappeared from future editions of the book.” (p. 65) Martin reprints the “lost” chapter in this book—an important service in and of itself.
Martin expands on Graham’s growth ideas in seven chapters. Among the key principles is Graham’s valuation equation: value = current “normal” earnings x (8.5 plus 2G), where G is the average annual growth rate expected for the next 7 to 10 years. Martin also explains how to build a margin of safety for growth stocks and describes the characteristics of a great growth company. He hammers home the power of compounding and invokes Graham’s notion of “Mr. Market” to explain inefficiency in the market. And he gets down in the trenches, showing how to put Graham’s principles into action.
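Graham's equation is easy to put to work. A minimal sketch in Python (the earnings figure and growth rate below are hypothetical, chosen only to show the mechanics):

```python
def graham_value(normal_eps: float, growth_rate_pct: float) -> float:
    """Graham's growth-stock formula: value = "normal" EPS x (8.5 + 2G),
    where G is the expected average annual growth rate (as a percent)
    for the next 7 to 10 years."""
    return normal_eps * (8.5 + 2 * growth_rate_pct)

# A company earning $3.00 per share with 10% expected annual growth:
print(graham_value(3.00, 10))  # 3.00 * (8.5 + 20) = 85.5
```

Note that 8.5 is the P/E multiple Graham assigned to a no-growth company, so the formula reduces to 8.5 times earnings when G is zero.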
Benjamin Graham and the Power of Growth Stocks provides effective guidance for the long-term investor who wants to achieve above-average returns. Yes, the strategy requires work, much of it wasted effort, and this in itself “could be a deal killer for anyone who is not willing to be diligent in her investing practices. If she doesn’t enjoy the painstaking effort of poring through 10-K reports, analyzing balance sheets and cash flow charts, and making careful projections, she may be unable to execute Graham’s strategy effectively.” (p. 246) At least she doesn’t need to be a sophisticated quant. As the author argues, “higher-level math implies a level of precision that does not exist in the real world.” (p. 259)
Tuesday, November 22, 2011
Zacks, The Little Book of Stock Market Profits
Mitch Zacks, senior portfolio manager at Zacks Investment Management and “the firm’s primary expert on quantitative investing,” is the latest contributor to Wiley’s “little book big profits” series. Not surprisingly The Little Book of Stock Market Profits: The Best Strategies of All Time Made Even Better (2012) describes some of the strategies that are inextricably linked with the Zacks name. The author also draws on academic findings summarized in Leonard Zacks’s Handbook of Equity Market Anomalies (to be reviewed on this blog in the very near future).
In eleven chapters the author discusses such topics as changes in sell-side analyst recommendations, smaller cap stocks, earnings estimate revisions, price momentum, piggybacking on the trades of insiders, net stock issuance (IPOs, stock buy-backs), quality of earnings, valuation metrics, post-earnings announcement drift, seasonal timing, and multi-factor models.
Despite the brevity of the book Zacks is careful to explain the conditions under which the strategies work, what can derail them, and whether they are effective as stand-alones. Let’s take a single example: price momentum.
“The data show two patterns that seem relatively stable, or about as stable as financial data ever become. These two patterns are short-term momentum and long-term reversals. In the short-to-medium term, roughly about a calendar quarter or two, stocks that have gone up in price substantially tend to continue to trend upward. However, over the long term, stocks that performed extraordinarily well over the last few years tend to become losers over the next three to five years.” (pp. 64-65) Momentum-based trading was profitable until 2000, but the normal relationship between short-term momentum and long-term reversals then broke down and was not re-established until 2005.
Zacks explores specific momentum strategies such as a 12/3 split. “What this means is sorting the universe of stocks into deciles based upon how they performed over the past year, and then holding the portfolio for the next three months. It also looks like performance can slightly be increased if you lag the construction period by one week.” (p. 67) Other momentum strategies that can deliver excess returns are “buying stocks that trade near their 52-week highs and using traditional moving averages.” (p. 68)
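The 12/3 sort Zacks describes can be sketched in a few lines: rank the universe by trailing 12-month return, cut it into deciles, and hold the top decile for the next quarter. The tickers and returns below are invented for illustration.

```python
# Toy 12/3 momentum sort: decile by past-year return, hold for 3 months.
# A ten-stock "universe" keeps the deciles trivially small.
past_12m_returns = {
    "AAA": 0.42, "BBB": 0.31, "CCC": 0.18, "DDD": 0.12, "EEE": 0.07,
    "FFF": 0.02, "GGG": -0.03, "HHH": -0.09, "III": -0.15, "JJJ": -0.28,
}

ranked = sorted(past_12m_returns, key=past_12m_returns.get, reverse=True)
n_deciles = 10
decile_size = max(1, len(ranked) // n_deciles)

top_decile = ranked[:decile_size]      # long these for the next quarter
bottom_decile = ranked[-decile_size:]  # a long/short variant would short these

print("long:", top_decile, "| short:", bottom_decile)
```

A real implementation would rebalance every three months and, per the study Zacks cites, might lag the ranking period by a week before forming the portfolio.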
Somewhat curiously, at least according to one study, “momentum seems to work better for mid-cap stocks than for either large- or small-cap stocks.” Other studies, however, indicate that “price-momentum-based studies are statistically stronger for stocks with lower analyst coverage.” (pp. 69-70)
Since individual investors, unlike institutional investors, tend to hold losing stocks too long and sell winning stocks too soon, and since these behavioral biases contribute to momentum returns, “we would expect to see stocks with low volume—which would be more likely to be held by individuals—to exhibit stronger momentum returns.” This is indeed the case: “a stock’s past trading volume is a good predictor of how strong the momentum effect is and the extent to which the momentum effect persists.” (p. 74)
Momentum-based returns also seem to be linked to broad-based economic strength. “Over the past seventy years, it looks like momentum based strategies only generate excess returns in periods of economic expansion. Even more interesting, momentum returns turn negative during a recession.” (p. 75)
The Little Book of Stock Market Profits can be used as something of a crib or pony by those who don’t want to spend countless hours reading the academic literature. It’s a fast read and offers useful advice for the active investor.
Monday, November 21, 2011
Ciana, New Frontiers in Technical Analysis
For those of us without a Bloomberg terminal, New Frontiers in Technical Analysis: Effective Tools and Strategies for Trading and Investing by Paul Ciana (Bloomberg/Wiley, 2011) is an idea book, not a plug-and-play manual. Even though some of the software tools described in Ciana’s book are not available on run-of-the-mill trading platforms (and where they are, they are available by subscription only), clever programmers may get inspired. Moreover, even without access to proprietary software the imaginative reader can add some new arrows to his quiver.
The six chapters in this book are written by six different authors: “Evidence of the Most Popular Technical Indicators” (Paul Ciana), “Everything Is Relative Strength Is Everything” (Julius de Kempenaer), “Applying Seasonality and Erlanger Studies” (Philip B. Erlanger), “Kase StatWare and Studies” (Cynthia A. Kase), “Rules-Based Trading and Market Analysis Using Simplified Market Profile” (Andrew Kezeli), and “Advanced Trading Methods” (Rick Knox).
Ciana provides some fascinating data about the preferences of those who use the Bloomberg Professional Service. For instance, Europe opts for log charts 47% of the time and Asia only 9% of the time. Asia prefers candlestick charts, the Americas bar charts. Worldwide the most popular technical indicators (excluding moving averages) are RSI, MACD, Bollinger bands (BOLL), stochastics (STO), directional movement index (DMI), Ichimoku (GOC), and volume at time (VAT). RSI is the clear winner, with a 44.4% worldwide preference; MACD comes in second at 22%. Some indicators have geographical ties. GOC has a 10.8% popularity rating in Asia as opposed to 2.5% in the Americas and 2.8% in Europe. VAT has a 5.3% rating in the Americas and only 1.8% in Europe and 1.6% in Asia.
VAT, for those who are unfamiliar with it, is something of a seasonal indicator. For instance, “from a historical perspective, VAT considers the volume that has occurred on that day over the past X years to create the average for that day. … From an intraday perspective, VAT creates an average of volume from the actual volume that occurred during that time-slice for the past X days. In both applications VAT can be projected into the future to get an idea of expected volume.” (p. 37)
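The intraday flavor of VAT amounts to averaging volume slice by slice across the past X sessions. A rough sketch, with invented 30-minute volume data for three days:

```python
# Rough sketch of intraday volume-at-time (VAT): for each time slice,
# average the volume traded in that slice over the past X days.
from statistics import mean

# Volume per 30-minute slice for the past 3 sessions (one row per day).
daily_slices = [
    [1200, 800, 650, 700, 900, 1500],
    [1100, 750, 600, 720, 880, 1600],
    [1300, 820, 700, 680, 950, 1400],
]

# Average each column (time slice) across days: the expected volume
# profile that can be projected onto the next session.
vat = [mean(day[i] for day in daily_slices) for i in range(len(daily_slices[0]))]
print(vat)
```

The characteristic U-shape (heavy volume at the open and close, lighter midday) is what the projection preserves.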
The authors describe the proprietary technical indicators they have developed and give ample illustrations of them. Some of the indicators have been around for quite a while. For instance, there are Kase’s DevStops, which I always mean to study more carefully and somehow never do. And, of course, Market Profile is a well-known if not so well understood trading framework (hence the simplification proposed in this book).
For those traders always in search of the next best thing (and all traders should be open-minded enough to recognize that regimes change and strategies must be adaptable), Ciana’s New Frontiers in Technical Analysis offers a lot of eye candy. Possibly addictive, perhaps not truly nutritious, but definitely fun to devour.
Thursday, November 17, 2011
Fabozzi and Markowitz, eds., Equity Valuation and Portfolio Management
Someday, I trust, markets will get a break from the relentless macroeconomic and political vicissitudes and investors will be able to focus once again on company and portfolio basics. In anticipation of that day Frank J. Fabozzi and Harry M. Markowitz have edited a very useful volume dealing with quantitative approaches to Equity Valuation and Portfolio Management (Wiley, 2011).
The book is broad in scope, as witnessed by the table of contents. Its 21 chapters include “An Introduction to Quantitative Equity Investing,” “Equity Analysis Using Traditional and Value-Based Metrics,” “A Franchise Factor Approach to Modeling P/E Orbits,” “Relative Valuation Methods for Equity Analysis,” “Valuation over the Cycle and the Distribution of Returns,” “An Architecture for Equity Portfolio Management,” “Equity Analysis in a Complex Market,” “Survey Studies of the Use of Quantitative Equity Management,” “Implementable Quantitative Equity Research,” “Tracking Error and Common Stock Portfolio Management,” “Factor-Based Equity Portfolio Construction and Analysis,” “Cross-Sectional Factor-Based Models and Trading Strategies,” “Multifactor Equity Risk Models and Their Applications,” “Dynamic Factor Approaches to Equity Portfolio Management,” “A Factor Competition Approach to Stock Selection,” “Avoiding Unintended Country Bets in Global Equity Portfolios,” “Modeling Market Impact Costs,” “Equity Portfolio Selection in Practice,” “Portfolio Construction and Extreme Risk,” “Working with High-Frequency Data,” and “Statistical Arbitrage.”
I can’t possibly do justice to the book in this brief post. Let me simply share a single idea that I found promising (and that can be sketched out in a couple of paragraphs).
It’s an approach for dealing with strategy failure. “Every investment strategy has three core properties: (1) the return that the strategy generates, (2) the volatility of that return, and (3) the correlation of that return with alternate strategies.” (p. 397) As we know, returns are fragile: “strategies fail despite the support of logic or history.”
To address this problem the authors, both from Nomura, suggest an alpha repair process. Essentially, they take a set of 45 factors (such things as 1-year price momentum, market cap, and ROE) and view each as an asset. “The objective is to own the portfolio of factors weighted to produce the highest Sharpe ratio.” They don’t want to include all 45 factors on the portfolio “team” because “while more factors could improve the Sharpe ratio by reducing risk, a diversified portfolio of many factors would likely produce lower returns than a concentrated portfolio of good factors.” (p. 403) Instead, they run their screens each month against a large pool of contending strategies and select three factors using optimization for next month’s “team.” The screens incorporate “the history of factor return, volatility, and correlation with other factor returns.” (p. 412)
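The selection step, simplified drastically, can be sketched as a brute-force search for the three-factor "team" with the highest Sharpe ratio. The factor names echo those in the book, but the return histories are invented, and equal weighting stands in for the authors' actual optimization:

```python
# Toy version of the alpha repair selection: treat each factor's monthly
# return history as an asset and pick the 3-factor combination (equal-
# weighted here) with the highest Sharpe ratio. Data is made up.
from itertools import combinations
from statistics import mean, stdev

factor_returns = {
    "momentum_12m": [0.8, 1.1, -0.4, 0.9, 0.7, 1.2],
    "market_cap":   [0.2, -0.1, 0.3, 0.1, 0.0, 0.2],
    "roe":          [0.5, 0.6, 0.4, 0.7, 0.5, 0.6],
    "value_pb":     [0.3, -0.8, 1.0, -0.2, 0.6, 0.1],
    "accruals":     [0.4, 0.5, 0.3, 0.4, 0.6, 0.4],
}

def sharpe(series):
    # Return per unit of volatility (risk-free rate ignored for brevity).
    return mean(series) / stdev(series)

def team_returns(names):
    # Equal-weighted monthly returns of a factor "team."
    return [mean(month) for month in zip(*(factor_returns[n] for n in names))]

best = max(combinations(factor_returns, 3), key=lambda c: sharpe(team_returns(c)))
print("next month's team:", best)
```

The real process also screens on the correlation among factor returns, so that the chosen three diversify one another rather than merely each scoring well alone.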
The alpha repair strategy provides a framework for discarding a factor that seems to have lost efficacy and embracing one that seems to have become important. The procedure (which is of course a lot more complicated in its execution than in its bare outlines) has a targeted 100% annual turnover. It has outperformed the benchmark Russell 1000 in a consistently linear fashion—by more than 4% in the period January-August 2010, annually for the past five years, and annually for the past ten years.
Wednesday, November 16, 2011
O’Shaughnessy, What Works on Wall Street
I can’t quite fathom how I missed the first three editions of this book, but James P. O’Shaughnessy’s best-selling What Works on Wall Street: The Classic Guide to the Best-Performing Investment Strategies of All Time (McGraw-Hill, 2012) is now in its fourth edition. It has been updated with new data covering the recent market turmoil, innovative strategies for investing success, and new material on sector analysis. One of the advantages of buying a book that has a large readership is that it is inexpensive—under $25 on Amazon for almost 700 oversized pages.
Backtesting investment strategies is an enterprise fraught with danger, and critics are only too ready to line up to shoot holes in the tester’s methodology, sample size or timeframe, or to decry the value of statistics in general. I, by contrast, am grateful that anyone undertakes a task for which I am ill equipped. For those who believe that history provides a guide to the future O’Shaughnessy’s massive work is an invaluable reference.
Confronted with such a large, data-heavy book, readers will be tempted to skip to the penultimate chapter: “Ranking the Strategies.” But this lazy approach will probably yield little because it is important to understand why certain strategies work better than others and why multifactor models, especially those that “marry both value and growth characteristics, … will go on to offer investors the best absolute and risk-adjusted returns.” (p. 470)
O’Shaughnessy analyzes all the popular metrics—among them, market cap, P/E ratios, EBITDA to enterprise value, price-to-cash flow ratios, price-to-sales ratios, price-to-book value ratios, dividend yields, buyback yield, shareholder yield, and accounting ratios. He looks at one-year earnings per share percentage changes, profit margins, return on equity, and relative price strength. He dissects the market leaders universe and the small stocks universe and describes the ratios that add the most value in each case. And he looks at how the factors he has examined perform on the sector level.
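O’Shaughnessy’s best-performing composites rank the whole universe on several factors at once rather than leaning on any single metric. Purely as a hypothetical sketch of the mechanics (the tickers, numbers, and equal weighting below are mine, not the book’s), a multifactor value ranking can be as simple as averaging per-factor ranks:

```python
# Hypothetical sketch of a multifactor value ranking in the spirit of
# O'Shaughnessy's composites: rank stocks on each metric, sum the ranks,
# and sort by the combined score. All data below is made up for illustration.

def composite_rank(stocks, factors):
    """stocks: {name: {factor: value}}; lower factor value = cheaper = better rank."""
    scores = {name: 0 for name in stocks}
    for factor in factors:
        ordered = sorted(stocks, key=lambda s: stocks[s][factor])
        for rank, name in enumerate(ordered, start=1):
            scores[name] += rank
    # Lowest combined score = best composite value ranking.
    return sorted(scores, key=scores.get)

universe = {
    "AAA": {"p_e": 8.0,  "p_s": 0.6, "p_b": 1.1},
    "BBB": {"p_e": 15.0, "p_s": 2.0, "p_b": 3.5},
    "CCC": {"p_e": 11.0, "p_s": 1.1, "p_b": 0.9},
}
print(composite_rank(universe, ["p_e", "p_s", "p_b"]))  # ['AAA', 'CCC', 'BBB']
```

In the book the composites are built from decile assignments over thousands of stocks and decades of data; the sketch shows only the basic idea of combining ranks across metrics.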
What Works on Wall Street is not a book you read straight through unless you are a possessed numbers wonk. Rather, it’s one you pick at—and the pickings are by no means slim—and put in a prominent place on your reference shelf.
Tuesday, November 15, 2011
Malz, Financial Risk Management
Allan M. Malz, currently a senior analytical advisor in the Markets Group at the Federal Reserve Bank of New York and a faculty member at Columbia University, and earlier a trader and risk manager, has written a thorough account of the principles of risk management and the challenges facing risk managers today.
Financial Risk Management: Models, History, and Institutions (Wiley, 2011) is a big book, over 700 pages. And for those who are not steeped in the basic math of risk management it is a difficult book. The fault, I am the first to admit, lies with the reviewer, not the author. I have a knack for getting in over my head when it comes to quant books.
Despite my obvious handicap, I learned a lot from this wide-ranging book. Start with macro/central bank issues such as the structure of the banking industry, stress tests, and financial regulation—especially in the wake of the most recent financial crisis. And since, lest we myopically forget, the world has experienced many financial crises, Malz devotes a chapter to the topic, analyzing such issues as panics, runs, and crashes; the causes of financial crises; and the behavior of asset prices during crises.
For those who are charged with managing portfolio risk, Malz covers the ins and outs of VaR. Especially intriguing to me, albeit mentally taxing, was the chapter on nonlinear risks and the treatment of bonds and options. And, although I would like to believe it’s Monday-morning quarterbacking, I know it’s not: we can also read about credit and counterparty risk and structured credit risk.
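Malz develops VaR with full mathematical rigor. Purely as an illustration of the underlying idea (this is the standard textbook calculation, not code from the book), historical-simulation VaR just reads a quantile off the sorted loss history:

```python
# Toy illustration of historical-simulation Value-at-Risk (VaR):
# sort observed losses from worst to best and read off the tail quantile.
# Not from Malz's book; the returns below are invented for the example.

def historical_var(returns, confidence=0.95):
    """Loss level exceeded on roughly (1 - confidence) of the observed days."""
    losses = sorted((-r for r in returns), reverse=True)  # largest loss first
    tail = max(1, int((1 - confidence) * len(losses)))    # tail observations beyond VaR
    return losses[tail - 1]

# Ten days of returns: at 99% confidence the VaR is the single worst daily loss.
rets = [0.010, -0.020, 0.005, -0.015, 0.020, -0.050, 0.012, -0.010, 0.000, 0.008]
print(historical_var(rets, confidence=0.99))  # 0.05
```

Real-world VaR, as Malz stresses, involves far harder questions—which risk factors to include, how to weight history, and how to handle the nonlinear instruments the sketch ignores.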
Malz’s Financial Risk Management will never be a best seller. But those who need to know how to manage risk—and I would hope that their numbers are not limited to risk managers themselves—would do well to add this book to their must-read list.
Monday, November 14, 2011
Weissman, Trade Like a Casino
Richard L. Weissman, the author of Mechanical Trading Systems (2004), has a new book out: Trade Like a Casino: Find Your Edge, Manage Risk, and Win Like the House (Wiley, 2011). It’s a well-balanced book that incorporates trading ideas, risk management, and trader psychology and that is appropriate for both beginning and intermediate traders.
Weissman contends that “the single most important tool in developing positive expectancy trading models is the cyclical nature of volatility.” (p. 73) Trendless, low-volatility environments resolve themselves into trending markets; trending, high-volatility markets cycle to low volatility. He discusses the leading volatility indicators, with Bollinger bands being his favorite, average true range following, and ADX rounding out the list. He offers coding for long and short entries and exits using these indicators.
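Weissman’s entry and exit code is written for trading platforms and isn’t reproduced here. As a rough sketch of the idea in Python (the parameter choices and exit rule are mine, not his), a Bollinger-band breakout long entry with a middle-band exit might look like:

```python
# Sketch of a Bollinger-band breakout long entry (an illustration of the idea,
# not Weissman's published code). Enter long when the close breaks above the
# upper band; exit when it falls back through the middle band (the moving average).

from statistics import mean, pstdev

def bollinger(closes, period=20, width=2.0):
    """Return (middle, upper, lower) bands over the most recent `period` closes."""
    window = closes[-period:]
    mid = mean(window)
    dev = width * pstdev(window)
    return mid, mid + dev, mid - dev

def breakout_signal(closes, in_position, period=20, width=2.0):
    """'enter' on an upper-band breakout, 'exit' on a close below the middle band."""
    mid, upper, _ = bollinger(closes, period, width)
    last = closes[-1]
    if not in_position and last > upper:
        return "enter"
    if in_position and last < mid:
        return "exit"
    return "hold"

# A flat market followed by an upside spike triggers an entry.
closes = [100.0] * 19 + [105.0]
print(breakout_signal(closes, in_position=False))  # enter
```

The sketch captures the cyclical-volatility logic Weissman describes: a breakout through the bands marks the transition from a low-volatility, trendless regime to a trending one.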
Another tool he finds valuable is timeframe divergence, where the shortest timeframe diverges from longer ones. He discusses this tool at some length and provides ample chart illustrations.
The book proceeds at an easy gait despite the fact that Weissman recognizes that there are no simple, one-dimensional paths to successful trading. “All market behavior is multifaceted, uncertain, and ever changing.” (p. 191) The master trader, he writes, must be disciplined yet flexible, must reprogram himself to avoid the irrational cycles of euphoria and fear (taking partial profits helps in this endeavor) and instead be even-minded, and must learn to participate rather than anticipate. And, of course, he must always practice good risk management.
Well, that doesn’t sound too hard, does it? It shouldn’t take all that much work to follow Weissman’s directives. Then comes the chapter on analyzing performance—in particular, the due diligence questionnaire that he advises each reader to complete. The questionnaire with explanations is a staggering 19 pages long and forces the trader to assess virtually every aspect of his business. Anyone who takes this questionnaire seriously should begin to understand how much work really is involved in setting up and running a successful trading operation. Weissman also offers tips on monitoring performance.
Trade Like a Casino won’t guarantee the reader a profitable trading career, but it provides a pretty complete outline of what it takes to win like the house.
Thursday, November 10, 2011
Finding your own style
Before I return Mehrling’s biography of Fischer Black to the public library I thought I’d touch on another theme. Herewith some lengthy quotations that may inspire you to find your own style.
Fischer spent the early 1970s at the University of Chicago’s Graduate School of Business “safe inside his office on the third floor of Rosenwald, with Myron Scholes on one side and Eugene Fama on the other.” There he “could be a maverick without bearing the costs that everywhere else are imposed on those who refuse to conform to societal norms. He loved it.
“The only downside was the obligation to teach. No one was ever better than Fischer at working one-on-one with a student, but the subject had to be one that he was interested in learning about himself. And the student had better be interested in solving the problem … and prepared to defend his proposed solution against alternatives, even alternatives that no one except Fischer was prepared to consider. For him, the whole point of academic life was to think new thoughts, and he was interested only in interacting with people who shared that commitment.”
“Standard classroom lecturing about matters that Fischer thought had already been resolved was simply a bore, and Fischer was lousy at it. He recalled: ‘I was terrible, judging by the ratings my students gave me. I thought lectures were a waste of time for me and for my students (especially when the material is in a book). I looked for every possible excuse to cancel classes. One thing I did was to fill some classes with reviews of past exams. That worked well, so gradually I started doing more of it. Eventually, I worked out the following system.
‘I handed out lists of questions and packets of readings at the start of each semester. In each class we would discuss three or four of the questions. They would give their views and I would give my views. The students liked it, and I liked it. My ratings went from the bottom to the top, and I’m sure they learned more than when I was lecturing.’ Students would recall of his ‘Fifty Questions’ course that every year the questions stayed the same, but the answers always changed. In effect, Fischer handled teaching, which he hated, by turning it into research and intellectual dialogue, which he loved.” (pp. 186-87)
* * *
“The way that Fischer worked was to use ‘recalcitrant experience’—the expression is Quine’s—to stimulate further refinement of his theory of how the world works. Indeed, he actively sought examples of recalcitrant experience for this purpose.”
“Inconclusive was fine with him. Better to know the extent of your ignorance than to waste time building theories on insubstantial empirical foundations. More generally, he favored simply exploring the data quite directly, without very much in the way of elaborate statistical apparatus. He was not testing theories or estimating coefficients, but rather searching for new knowledge.”
“Twenty-five years later, at Goldman Sachs, he ruminated about how best to manage traders: ‘Observing a puzzle is not enough. A crooked yield curve or an unexplained stock price move is suggestive, but I generally want to know why these patterns exist before I trade. Stories can be wrong, but I’m uncomfortable trading without one. I want to know what kind of supply and demand imbalance is creating the trading opportunity.’ Such is the long shadow cast by Fischer’s early encounter with Quine.” (pp. 117-19)
Wednesday, November 9, 2011
In praise of dilettantism, with a dash of madness
I recently finished reading the Steve Jobs biography as well as Fischer Black and the Revolutionary Idea of Finance (2005) by Perry Mehrling. Both men had cancer and died young—Jobs at 56, Black at 57. In their short lifetimes both transformed their fields.
Those who don’t know the basic outlines of Steve Jobs’s life and his embrace of aesthetic and technological principles in the creation of Apple products have been living under a rock. So I’m not going there. Rather, I want to use Fischer Black’s years at Harvard to illustrate some of the powers of intellectual cross-fertilization.
As an undergraduate Fischer was initially infatuated with studies in social relations (“a conglomeration of sociology, anthropology, psychology, psychiatry, etc.”). On the side he took courses in mathematics and physics. “He seems to have had the idea that he could always return to physics for graduate school if nothing else worked out. In May 1957 he wrote to his parents that physics was not interesting to him but would lead to the kind of job he wanted, namely in research. ‘In social relations the subject matter would be more interesting and everything would be great if I could get the right kind of job, but I doubt if such jobs even exist. I’m now considering other fields, even economics.’ … Fischer spent the fall of 1957 trying out biology and chemistry as possible alternatives to physics [and, importantly, also took Van Quine’s course on deductive logic for his own interest] before he accepted the inevitable. … In May 1958 he switched his major to physics.” (p. 34)
He stayed on at Harvard for graduate school—the sole university to which he applied—but, once there, he took only one physics course and barely passed it. Instead, he became interested in computers, and eventually artificial intelligence, and petitioned “to switch officially from theoretical physics to applied mathematics.” He took a course on B.F. Skinner’s psychology of learning, but failed it. He was more comfortable with the cognitive approach to learning and took two graduate courses in this area.
Departmental authorities were concerned about Fischer—first because he failed the course in the psychology of learning and second because he was unwilling “to be tied down to a specific program of work. … [I]n his formal application for admission to the thesis-writing stage of the PhD, Fischer listed his proposed subject as ‘artificial intelligence or foundations of mathematics.’ Well, which was it to be? In February 1961, his adviser, Anthony Oettinger, wrote a note to the Committee on Higher Degrees: ‘I have reason to be concerned about his intellectual discipline so that, while recognizing his ability and his desire for independence, I am concerned lest he lapse into dilettantism.’” (p. 37)
“By spring 1962, his lack of progress toward a viable thesis was evident to everyone, and Oettinger graded his work unsatisfactory. In June he was officially informed that he would not be allowed to register in the fall.” (p. 39)
Undeterred, Fischer stayed on in Cambridge and described his life a month later: “I am studying modern art in summer school, taking guitar lessons, taking a speed reading course, working on my thesis, working at Bolt Beranek and Newman [a consulting firm], participating in psychological experiments on hypnotism.” (p. 40) In the spring of 1963, in an effort to learn “how natural language works in the hope that it might help him with the tricky problem of programming a computer to understand questions posed in natural language,” he sat in on courses at MIT in the grammar of English and in semantics. (p. 43)
Fischer’s work at BBN opened the possibility for a return to graduate school. “Over the next year, Fischer wrote a dissertation that was accepted for the PhD in applied mathematics in June 1964. The title of the dissertation was ‘A Deductive Question Answering System.’” Yet, “even as he was writing the thesis, he wrote to his parents: ‘I’m trying to decide what I want to do after I get my degree. The field is wide open. I don’t really like any of these labels: scientist, engineer, researcher. I’m not at all sure I want to stay in the computer field.’” (p. 44)
Mehrling concludes that Fischer Black’s work on artificial intelligence (the man-computer symbiosis) led to “an integration of the two sides of his character, the wildly creative and the ruthlessly logical. Thirty years later, a colleague would remember Fischer: ‘No one’s mind is, or will ever be, as fertile as Fischer’s was. No one is even close. He was crazy and logical at the same time. The force of his logic would push you into corners you didn’t like, or it could open vistas you had not imagined. The crazy streak freed him from conventional wisdom. He was intellectually fearless.’” (pp. 45-46)
Tuesday, November 8, 2011
Hirsch and Person, Commodity Trader’s Almanac 2012
Now in its sixth edition, the 2012 Commodity Trader’s Almanac (Wiley) once again provides a treasure trove of price data and seasonal trading ideas. It belongs on the desk of everyone who trades commodity futures and U.S. dollar crosses as well as stocks, ETFs, and ETNs that are commodity based.
I bought the inaugural 2007 edition of this almanac and for purposes of this review pulled it off my shelf to compare it to its 2012 counterpart. My question was: Is it worth buying the almanac every year? Are the changes really that important to a trader’s bottom line? The brief answer is “yes.”
First of all, if there’s any place seasonal strategies work it’s in commodities. And the 2012 strategies are no rehash of those of 2007. The almanac switches around the commodities that are analyzed. For instance, the commodities that received a one-page analysis in June 2007 were coffee and crude oil whereas in the June section of the 2012 almanac we read about soybeans, sugar, beef, and corn.
The almanac follows the familiar Hirsch format. It’s spiral bound, which means it lies flat when open. About 60% of the volume has calendar entries with appropriate quotations on recto pages, analysis on verso pages. Then come contract specifications, a list of select ETFs, a series of potentially profitable trading patterns, and an extensive long-term data bank (annual highs, lows, and closes as well as near-term contract monthly percent changes) for the S&P 500, 30-year Treasury, crude oil, natural gas, copper, gold, silver, corn, soybeans, wheat, cocoa, coffee, sugar, live cattle, lean hogs, British pound, Euro, Swiss franc, and Japanese yen.
Every year the editors tweak the almanac to respond to trading trends. For instance, this time around they have added a brief section on options and spread trading. But the bottom line is that the Commodity Trader’s Almanac, like the Stock Trader’s Almanac, is a handsome, useful addition to a trader’s desk—especially those traders who believe, like Jesse Livermore, that “It isn’t as important to buy as cheap as possible as it is to buy at the right time.”
Monday, November 7, 2011
Knight, High-Probability Trade Setups
Tim Knight, founder of Prophet Financial Systems (now part of the TD Ameritrade stable) and a well-known blogger (Slope of Hope), takes yet another look at pattern trading in High-Probability Trade Setups: A Chartist’s Guide to Real-Time Trading (Wiley, 2011).
The bulk of the book is devoted to nineteen patterns: ascending triangles, ascending wedges, channels, cup with handle, descending triangles, descending wedges, diamonds, Fibonacci fans, Fibonacci retracements, flags, gaps, head and shoulders, inverted head and shoulders, multiple bottoms, multiple tops, pennants, rounded bottoms, rounded tops, and support failure. In each case he defines the pattern, explains the psychology behind it, and provides examples.
The two shorter parts of the book provide an overview (a primer on chart setups and the author’s personal trading journey) and tips on setting stops and being a bear as well as a guide to real-life trading.
For those who are familiar with the literature on chart patterns there’s not much new in this book. The illustrative charts are, however, such a decided improvement over those in the classic Edwards and Magee volume, Technical Analysis of Stock Trends, which has seen multiple editions over the years, that it is worth adding to a reference library. Moreover, the examples are drawn primarily from recent price action. Given market volatility over the past few years it’s easy to find charts that graphically illustrate most patterns. For instance, all of the rounded top examples were drawn from the bear market from late 2007 through early 2009. “The reason is that the market conditions leading up to 2007 were so strong (although wavering in the summer of 2007) and the conditions following the peak gradually deteriorated into an unmitigated collapse. Thus the formation of the domes—the rounded tops—was just about perfect. … Under normal circumstances, formations this clean are infrequent.” (p. 278)
For those who are looking for a solid introduction to pattern trading Knight’s book is first-rate—clear descriptions and lots of ProphetCharts. For those who are old hands at chart patterns the book bears testament to the fact that sometimes profitable patterns really do stare you in the face.
Thursday, October 27, 2011
Tuckett, Minding the Markets
In 2007 David Tuckett, a British psychoanalyst, interviewed 52 asset managers in the U.S., U.K., and Singapore in an effort to understand the context of their decision-making and, subsequently, to make sense of the financial crisis and to offer ways to make markets safer. Minding the Markets: An Emotional Finance View of Financial Instability (Palgrave Macmillan, 2011) explores such core concepts as phantastic objects, divided states of mind, and groupfeel.
In this post let’s look at a single dichotomy: the integrated state of mind vs. the divided state of mind. The former is “marked by a sense of coherence, which influences our perception of reality, so that we are more or less aware of our opposed ambivalent and uncertain thought and felt relations to objects.” By contrast, the latter is “an alternating incoherent state of mind marked by the possession of incompatible but strongly held beliefs and ideas; this inevitably influences our perception of reality so that at any one time a significant part of our relation to an object is not properly known (felt) by us. The aspects which are known and unknown can reverse but the momentarily unknown aspect is actively avoided and systematically ignored by our consciousness.” (pp. xi-xii)
The divided state of mind may be advantageous in certain circumstances—“the single-minded pursuit of a goal in battle with no thought for the consequences, or creative endeavour with little thought for consensus thinking.” Generally, however, “the pursuit of reward is tempered by the fear of loss, producing anxiety, which is a signal of danger.” (p. 63)
Tuckett argues that the financial marketplace accentuates rather than mitigates the human potential for developing divided states of mind. (p. 71) For one thing, there is undue emphasis on short-term returns. Even if a fund’s investment horizon is three to five years, there is pressure to perform short-term. Short-term results are also key in handing out bonuses.
Moreover, firms “may disproportionately select excessive risk-takers and predispose markets to gambling, based on –K [anxiety] thinking, rather than balancing risk and reward, based on K [real enquiry].” Tuckett reasons as follows: “[M]y respondents had to buy and hold stocks in a situation of quality uncertainty and information asymmetry, made particularly powerful by the fact that they were often making claims to have seen things that others had not. It is not surprising that this created a conflicting emotional experience. While it could be faced within an integrated state of mind, it is easy to see how it might be quickly and ‘dirtily’ dealt with in the short run by denial and a divided state of mind—in which case we would say that the dependent ambivalent relationship necessarily formed with assets is governed by –K rather than K. In –K decision-making anxiety about uncertainty is set aside. There is no longer an emotional incentive to desist from risky decisions. It creates the possibility that the attitudes and behaviour of many asset managers (particularly in a rising market and when there is pressure on them to perform exceptionally) will be excessively risky and that those who are successful will be rewarded so that it is people in a divided state who dominate the market. Those who make decisions in a divided state, if their gamble comes off, will perform better both than those who gamble and lose and those who are cautious.” (pp. 165-66)
Tuckett also found that “managers did not seem to approach failure in an integrated state of mind using their capacity for enquiry (K) to work through and learn from their mistakes. They did not mourn failure, so to speak, rather they sought to move on and fortify themselves for the next battle. This is what divided states enable human beings to do…. By and large the explanations my respondents gave for failures were not ones it would be easy to translate into successful decisions next time.” (p. 167)
Tuckett’s thesis requires some fine tuning, but I’m sure we can all recall those occasions on which we threw caution to the wind. Our wins, of course, resulted from our own careful enquiry (and genius); our losses from the vagaries of outside forces over which we had no control. And what did we learn that could make us better traders/investors? Nada.
Tuesday, October 25, 2011
Ranadivé and Maney, The Two-Second Advantage
The Two-Second Advantage: How We Succeed by Anticipating the Future—Just Enough (Crown Business, 2011) by Vivek Ranadivé and Kevin Maney explores how our brains predict future patterns and what this might mean for predictive software technology. Here I’ll limit myself to a couple of insights about our predictive abilities. Yes, you may have read most of this before, but it bears repeating.
The hero of the book is Wayne Gretzky, the 170-pound “weakling” who developed “an exquisite hockey brain.” I assume you know the famous line: “he doesn’t skate to where the puck is—he skates to where it’s going to be. Commentators would often say that Gretzky seemed to be two seconds ahead of everyone else.” (p. 5)
In fact, the authors claim, “most successful people are really good at making very accurate predictions—usually about some particular activity—just a little faster and better than everyone else.” (p. 8) How do people develop predictive skills? Some are born with them; others acquire them through extensive deliberate practice.
Talented people have “an unusual ability to focus the brain’s resources on one task. … They seem to start up their mental models, quiet everything else, and open channels between regions of the brain. They run their minds so efficiently that time seems to slow down, possibly because they’re actually perceiving the world faster than the rest of us—which helps them get their predictions out ahead of the rest of us.” (p. 70)
Talented people also have a heightened sensitivity to weak signals, even to missing signals. They “make predictions based on a lack of events. This means they catch the notes that didn’t get played because someone in the orchestra missed them or recognize the deal that didn’t happen or the move an opponent didn’t make. It’s much more subtle than processing the things that do happen and takes a greater level of knowledge and a higher level of thinking.” (p. 46)
Let me conclude with a takeaway that applies to business management. Ben Horowitz, whose credentials go all the way back to being an “unheralded” product strategist at Netscape and who with his old Netscape boss launched the venture capital firm Andreessen Horowitz in 2009, believes that “there are two types of people in the top ranks of companies”: ones and twos. “Ones are predictive. Twos have to rely on mountains of data to figure out what they think. Ones should be CEOs, and twos should not.” (p. 24) Steve Jobs, of course, was the quintessential one; Steve Ballmer is a two.
Monday, October 24, 2011
Stock Trader’s Almanac 2012
The leaves are falling, Halloweeners are readying their costumes, and a new Stock Trader’s Almanac (Wiley) has arrived. The 2012 edition marks the 45th year of this annual publishing tradition. Jeffrey A. Hirsch, Yale Hirsch, and their team at the Hirsch Organization have once again produced an attractive spiral-bound desk calendar chock full of updated statistical data.
This volume follows the standard format. The calendar section takes up about 60% of the book; the rest is devoted to tables, statistical analyses, and personal record keeping.
In the calendar section recto pages are the actual calendar entries, complete with coding for each day. A witch icon appears on options expiration days. A bull icon “signifies favorable trading days based on the S&P 500 rising 60% or more of the time … during the 21-year period January 1990 to December 2010.” A bear icon uses the same parameters to identify unfavorable trading days. To provide even more granularity, beside each date are numbers indicating the probability of the Dow, S&P 500, and Nasdaq rising using the same lookback period. At the bottom of each entry is a quotation. There’s about a five-square-inch space in which to write.
Verso pages provide seasonal data, beginning with vital statistics for each month. Among the wealth of other data analyzed are the January barometer, market performance during presidential election years, market behavior three days before and three days after holidays, the best six months switching strategy, and Wall Street’s only free lunch.
Each almanac highlights the best investment books of the year. This year the top award goes to Ed Carlson’s George Lindsay and the Art of Technical Analysis, which I reviewed in September. The editors are major fans of Lindsay, whom they describe as “a brilliant market prognosticator who made numerous bold and uncannily accurate predictions. He was an intense student of history, market cycles, and repetitive price patterns. From memory, George could reproduce a chart of stock market prices for every one of the previous 160 years prior to his death in 1987.” (p. 114)
Every investor or trader who believes that history provides a guide to the future will do well to have the Stock Trader’s Almanac 2012 on his desk.
Thursday, October 20, 2011
Brown, Red-Blooded Risk
If you read only one finance book this year, Aaron Brown’s Red-Blooded Risk: The Secret History of Wall Street (Wiley, 2012) would be a good bet. Despite its title and subtitle, it is not a tell-all book and it would make a lousy movie. Instead, it challenges the reader to rethink the way she looks at commonly accepted financial principles. The experience can sometimes be intense, so to give the reader a break Brown intersperses chapters on quant history—and even here he rewrites the history we thought we knew.
This is a no-math-required book. But it is decidedly not a no-brains-required book. It reads easily, indeed enjoyably. At almost every turn, however, it offers an uncommon perspective. Take a concept as basic as risk. First, Brown differentiates risk from both danger and opportunity. Among other differences, risks are two-sided and measurable. As something of a corollary, he contends that risk is good. This does not mean that “risk must be accepted in order to improve expected outcomes. That makes risk a cost, something bad that you accept in order to get something good.” (p. 21) Those who treat risks as costs confuse risk with danger.
Brown, currently risk manager at AQR Capital Management, describes the principles and practice of risk management. The key principles are risk duality (the normal and the abnormal), valuable boundary (think VaR), risk ignition (Kelly’s optimal amount of risk that leads to exponential growth, though only within the VaR limit), money, evolution, superposition, and game theory. The practice, and Brown describes it from its infancy to its current state, depends on where you’re sitting—front-office, middle-office, or back-office. All, of course, involve intense quantitative analysis; the requisite people skills vary.
One of the jobs of the front-office risk manager is to analyze trading performance. Since independent traders are their own front-office risk managers, here’s an insight from someone who has monitored innumerable (well, probably not) trading results. First, two definitions to set the stage. “The accuracy ratio is the fraction of trades that make money. The performance ratio is the average gain on winning trades divided by the average loss on losing trades.” And second, the critical paragraph. “In principle, there is a trade-off between these two. If you cut losers faster and let profits run longer, you’ll have a lower accuracy ratio but a higher performance ratio. In practice, it very often seems to be true that the two are not closely related. The trader can pick a performance ratio, the market gives the accuracy ratio. Attempting to increase the accuracy ratio by sacrificing performance ratio seldom works. Therefore, the usual advice is to target a specific performance ratio, adjusting your trading if necessary to get to that target, but only to monitor accuracy ratio. When accuracy ratio is high, bet bigger, when it’s low, bet smaller or even stop trading until the market improves for your strategy.” (p. 221)
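Brown’s two ratios are easy to compute from a record of closed-trade profits and losses. Here is a minimal sketch; the function names and the sample trade list are mine, not the book’s:

```python
def accuracy_ratio(pnls):
    """Fraction of trades that make money."""
    wins = [p for p in pnls if p > 0]
    return len(wins) / len(pnls)

def performance_ratio(pnls):
    """Average gain on winning trades divided by average loss on losing trades."""
    wins = [p for p in pnls if p > 0]
    losses = [-p for p in pnls if p < 0]
    return (sum(wins) / len(wins)) / (sum(losses) / len(losses))

# Hypothetical closed-trade P&Ls: four winners, four losers.
trades = [300, -100, 250, -150, 400, -100, -50, 200]
print(accuracy_ratio(trades))     # 0.5  (4 of 8 trades won)
print(performance_ratio(trades))  # 2.875 (avg win 287.5 / avg loss 100)
```

On Brown’s advice, a trader would hold the performance ratio near a chosen target and size bets up or down as the accuracy ratio rises or falls.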
Who should read this book? Those who are interested in the battle between statistical frequentists and Bayesians and possible paths toward reconciliation. Those who aspire to be risk managers or who just want to know what they do and what their role was in the financial crisis. Those who are convinced that they know exactly what happened in tulipmania (read the fascinating chapter “Exponentials, Vampires, Zombies, and Tulips”). Those who have a passion for money—that is, its past and future (derivatives?). Those who want to pick the brain of a top poker player during the 1970s and 1980s, not for poker but for risk tips. Those who are tired of dull ideas expounded by drab people.
Here are a few disconnected closing thoughts about Brown’s book. Red-Blooded Risk is a heavy book, and I mean that in the old-fashioned sense of the word. It weighs in at two pounds for 415 pages. It is a book that makes you think. I highly doubt that you’ll agree with all of Brown’s ideas, but they are worthy of serious debate. It has comic strips provided by manga artist Eric Kim. It has a wonderful bibliography. And it extols those red-blooded risk takers “who are excited by challenges, but not to the point of being blinded to dangers and opportunities.” (p. 4) I have to admit, however, that I felt a sense of camaraderie with Paul Wilmott who wrote on the dust jacket: “His blood is considerably redder than mine, which is looking rather pink after reading this book. I’m not sure I’ve got the nerve to follow all of his advice, but then again I like quiche.”
Tuesday, October 18, 2011
Pestrichelli and Ferbert, Buy and Hedge
Long-term investors are faced with a host of difficulties. As a strategy, “buy and hold” has been a dud over the last decade. Diversification sometimes helps, but when it’s most needed it usually doesn’t. Jay Pestrichelli and Wayne Ferbert offer an add-on in Buy and Hedge: The 5 Iron Rules for Investing Over the Long Term (FT Press, 2012). As the title indicates, they suggest that investors use options to hedge their stock positions.
The basic premise is both simple and sound. Risk is the input to your portfolio, return is the output. This means that you can control risk, not return. Put another way, risk is what you buy, return is what you hope for.
The authors hammer this point home over and over, and they’re wise to do so since investors invariably focus on the potential reward of their position, not on the actual risk they are incurring. Yet they might just as easily be pinning their hopes on Enron as on Apple. Stock picking is tough.
And so, the authors argue, the solution is to define risk. Ideally, every investment should be hedged. Alternatively, the portfolio as a whole can be hedged. The authors outline a range of strategies, from married puts, collars, and ITM options to vertical and diagonal spreads, to accomplish this goal.
Since for the most part the authors view options as hedging vehicles, not speculative instruments, the focus is on risk management. They boil risk management down to four metrics, all of which should be used in analyzing a portfolio: capital at risk, volatility, implied leverage, and correlation.
Implied leverage may be an unfamiliar concept, so let me describe it briefly. First, what it is not: it is not using margin to buy stock. Rather, it focuses on the power of options to create leverage for a portfolio. That is, if an investor uses options to create exposure to equities and ETFs, he likely uses less capital than if he had bought or shorted the equity or ETF directly.
The authors offer the following example. “Suppose an S&P 500 ETF is trading at exactly $100 per share. You open an account with $100,000 in cash. Then you purchase ten Options contracts that are calls with a $90 strike price that expire six months from today…. The price of these Options is $11 per share…. Since a contract has 100 shares, the total price is $11,000. And with these ten contracts, you control 1,000 shares….” (pp. 81-82) The $100,000 portfolio consists of $11,000 in the SPY call options and $89,000 in cash. The implied leverage in this case is 1.0, calculated by using the formula (total market value of nonderivative securities + implied equity value for each derivatives position) / (total portfolio value – borrowed money).
If you add 2,000 shares of MSFT trading at $25 a share, your $100,000 portfolio now has an implied leverage of 1.5 because it controls the $100,000 of implied equity value plus $50,000 worth of MSFT. The numerator is now $150,000, and the denominator doesn’t change. The portfolio with the higher implied leverage gains more in an up market and loses more in a down market. If, for instance, SPY and MSFT both increase by 10%, the first portfolio will be worth about $109,750 and the second portfolio $114,750. If they both decrease by 10%, the first portfolio will be worth $91,000 and the second $86,000.
The authors conclude that “too much leverage increases your portfolio’s volatility by increasing the portfolio’s rate of change. The recommendation from Buy and Hedge is to avoid all excess leverage. Your implied leverage should always be 1.0 or lower. Your traditional leverage should always be 1.0 also.” (p. 85) Advanced investors who are looking for specific risk trades will end up with implied leverage well past 1.0, but the authors are trying to steer investors away from taking on speculative options risk. They are first and foremost hedgers.
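The formula is straightforward to put into code. A minimal sketch of the two portfolios above (the function name is mine; note that for the book’s numbers to work out, cash is excluded from the nonderivative-securities term):

```python
def implied_leverage(nonderivative_value, implied_equity_values,
                     portfolio_value, borrowed=0.0):
    """(market value of nonderivative securities + implied equity value of
    each derivatives position) / (total portfolio value - borrowed money)."""
    return ((nonderivative_value + sum(implied_equity_values))
            / (portfolio_value - borrowed))

# Ten SPY $90 calls controlling 1,000 shares at $100: implied equity $100,000.
print(implied_leverage(0, [100_000], 100_000))       # 1.0
# Add 2,000 MSFT shares at $25 ($50,000 of stock bought out of the cash).
print(implied_leverage(50_000, [100_000], 100_000))  # 1.5
```

The second portfolio’s larger swings under a ±10% move follow directly from the 1.5 reading: it controls $150,000 of equity exposure on $100,000 of capital.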
Throughout the book the authors offer advice for long-term investors. For example, in the chapter “Harvest Your Gains and Losses” they identify when to reset your hedge in an investment that has an unrealized gain. Their suggestion is that once the unrealized gain of an investment is at least 50% of the total capital at risk for that investment, you should consider resetting the hedge to lock in gains.
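Their harvesting trigger reduces to a one-line check. A sketch, with hypothetical names and numbers of my own:

```python
def should_reset_hedge(unrealized_gain, capital_at_risk):
    """Reset the hedge once the unrealized gain is at least 50%
    of the total capital at risk for that investment."""
    return unrealized_gain >= 0.5 * capital_at_risk

print(should_reset_hedge(600, 1_000))  # True: gain is 60% of capital at risk
print(should_reset_hedge(300, 1_000))  # False: gain is only 30%
```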
Buy and Hedge is an eminently practical book for the long-term investor—and there are mighty few such books around these days.
The basic premise is both simple and sound. Risk is the input to your portfolio, return is the output. This means that you can control risk, not return. Put another way, risk is what you buy, return is what you hope for.
The authors hammer this point home over and over, and they’re wise to do so since investors invariably focus on the potential reward of their position, not on the actual risk they are incurring. Yet they might just as easily be pinning their hopes on Enron as on Apple. Stock picking is tough.
And so, the authors argue, the solution is to define risk. Ideally, every investment should be hedged. Alternatively, the portfolio as a whole can be hedged. The authors outline a range of strategies, from married puts, collars, and ITM options to vertical and diagonal spreads, to accomplish this goal.
Since for the most part the authors view options as hedging vehicles, not speculative instruments, the focus is on risk management. They boil risk management down to four metrics, all of which should be used in analyzing a portfolio: capital at risk, volatility, implied leverage, and correlation.
Implied leverage may be an unfamiliar concept, so let me describe it briefly. First, what it is not: it is not using margin to buy stock. Rather, it focuses on the power of options to create leverage for a portfolio. That is, an investor who uses options to create exposure to equities and ETFs likely uses less capital than if he had bought or shorted the equity or ETF directly.
The authors offer the following example. “Suppose an S&P 500 ETF is trading at exactly $100 per share. You open an account with $100,000 in cash. Then you purchase ten Options contracts that are calls with a $90 strike price that expire six months from today…. The price of these Options is $11 per share…. Since a contract has 100 shares, the total price is $11,000. And with these ten contracts, you control 1,000 shares….” (pp. 81-82) The $100,000 portfolio consists of $11,000 in the SPY call options and $89,000 in cash. The implied leverage in this case is 1.0, calculated by using the formula (total market value of nonderivative securities + implied equity value for each derivatives position) / (total portfolio value – borrowed money).
If you add 2,000 shares of MSFT trading at $25 a share, your $100,000 portfolio now has an implied leverage of 1.5 because it controls the $100,000 of implied equity value plus $50,000 worth of MSFT. The numerator is now $150,000, and the denominator doesn’t change. The portfolio with the higher implied leverage gains more in an up market and loses more in a down market. If, for instance, SPY and MSFT both increase by 10%, the first portfolio will be worth about $109,750 and the second portfolio $114,750. If they both decrease by 10%, the first portfolio will be worth $91,000 and the second $86,000.
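The arithmetic of the two portfolios can be sketched in a few lines. Note one assumption the book's numbers imply but don't state outright: cash is not counted among "nonderivative securities" in the numerator (otherwise the first portfolio's leverage would exceed 1.0), and each option position's implied equity value is the notional value of the shares it controls.

```python
def implied_leverage(security_value, implied_equity, portfolio_value, borrowed=0.0):
    """Implied leverage = (market value of nonderivative securities
    + implied equity value of derivative positions)
    / (total portfolio value - borrowed money)."""
    return (security_value + implied_equity) / (portfolio_value - borrowed)

# First portfolio: ten SPY $90 calls control 1,000 shares at $100 each,
# so implied equity is $100,000; the $89,000 in cash is not a security.
print(implied_leverage(0, 1_000 * 100, 100_000))           # 1.0

# Second portfolio: add 2,000 shares of MSFT at $25 ($50,000 of stock).
print(implied_leverage(2_000 * 25, 1_000 * 100, 100_000))  # 1.5
```

The denominator stays at $100,000 in both cases because no money is borrowed; only the numerator grows as exposure is layered on.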
The authors conclude that “too much leverage increases your portfolio’s volatility by increasing the portfolio’s rate of change. The recommendation from Buy and Hedge is to avoid all excess leverage. Your implied leverage should always be 1.0 or lower. Your traditional leverage should always be 1.0 also.” (p. 85) Advanced investors who are looking for specific risk trades will end up with implied leverage well past 1.0, but the authors are trying to steer investors away from taking on speculative options risk. They are first and foremost hedgers.
Throughout the book the authors offer advice for long-term investors. For example, in the chapter “Harvest Your Gains and Losses” they identify when to reset your hedge in an investment that has an unrealized gain. Their suggestion is that once the unrealized gain of an investment is at least 50% of the total capital at risk for that investment, you should consider resetting the hedge to lock in gains.
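The harvesting guideline reduces to a single comparison. Here is a minimal sketch; the function name and interface are mine, not the authors':

```python
def should_reset_hedge(unrealized_gain, capital_at_risk):
    """Buy and Hedge's harvesting guideline: consider resetting the hedge
    once the unrealized gain reaches at least 50% of the capital at risk
    for that investment."""
    return unrealized_gain >= 0.5 * capital_at_risk

# A position risking $10,000 that shows a $6,000 paper gain has crossed
# the 50% threshold; one with a $4,000 gain has not.
print(should_reset_hedge(6_000, 10_000))  # True
print(should_reset_hedge(4_000, 10_000))  # False
```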
Buy and Hedge is an eminently practical book for the long-term investor—and there are mighty few such books around these days.
Thursday, October 13, 2011
Trester, Understanding ETF Options
Remember the student (of course it wasn’t you) who, faced with an assignment for a ten-page paper and with nothing much to say, resorted to wide margins, triple spacing, and—if he had a computer—a large font size? Kenneth R. Trester goes even further in Understanding ETF Options: Profitable Strategies for Diversified, Low-Risk Investing (McGraw-Hill, 2012). He pads his 233-page book with lots of readily available lists. For instance, he spends about 50 pages listing ETFs, MLPs, and REITs. Another 40 pages is devoted to normal (fair) value listed call and put tables. That leaves 143 pages for large-print text.
Trester describes option basics and recommends computer simulation to “tell you what your true odds of profiting are.” (p. 106) Not surprisingly, Trester touts his own simulation programs and, later, his newsletter.
What secrets does Trester impart? If you’re buying options (either calls or puts) you “should avoid buying options on ETFs, as most ETFs neutralize volatility.” Instead, “bet on explosive, unstable, small stocks” and “go for the home run.” (pp. 110, 113)
But the “hidden path to profits” is option writing. If you want to buy stocks and ETFs at lower prices and sell them at higher prices, “write naked puts and covered calls. This strategy forces you to buy stocks and ETFs at lower prices and sell stocks and ETFs at higher prices. This is the investor’s ideal.” (p. 134)
“In order to buy stocks and ETFs at low prices, you should write put options that will expire 80 percent of the time worthless without needing to buy the stock. Then you will get the stock at low enough prices and, of course, earn a lot of income as you wait to try to buy the stock. … Of course, the question is how do you know if your written puts will expire as worthless 80 percent of the time? The answer is computer simulation.” (p. 138)
To sum it up, and ratchet up the numbers, “The financial regulators do not believe you can win 90 percent of the time, but you can when you write puts (using a simulator to make sure you have a 90 percent chance of winning). You can win 90 percent of the time, and the remaining 10 percent will get you the underlying stock or ETF at an attractive price. This is the Holy Grail of investing. This is the secret weapon. Add diversification, and you have the perfect game plan.” (p. 177)
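Trester doesn't disclose how his simulator works, but the underlying question, how often a written put finishes out of the money, can be sketched with a toy Monte Carlo under a lognormal price model. All parameters below are illustrative assumptions of mine, not his:

```python
import math
import random

def prob_put_expires_worthless(spot, strike, vol, rate, t_years,
                               n=100_000, seed=42):
    """Estimate the probability that a short put expires worthless
    (terminal price at or above the strike), assuming the underlying
    follows geometric Brownian motion."""
    rng = random.Random(seed)
    drift = (rate - 0.5 * vol ** 2) * t_years
    sigma_t = vol * math.sqrt(t_years)
    worthless = sum(
        1 for _ in range(n)
        if spot * math.exp(drift + sigma_t * rng.gauss(0, 1)) >= strike
    )
    return worthless / n

# A put struck 15% below spot with six months to expiry comes out at
# roughly an 80% chance of expiring worthless under these parameters.
print(prob_put_expires_worthless(100, 85, 0.25, 0.02, 0.5))
```

Of course, a simulation is only as good as its model: a lognormal world has no crashes, and it is precisely in the other 10-20 percent of outcomes that put writers get hurt.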
Would that life were so simple.
Monday, October 10, 2011
Half-price book sale
Once again, my bookshelves are spilling over, so it’s time for a major fall housecleaning.
Here’s the deal. I will sell the books listed below for half the current official Amazon U.S. price plus the cost of domestic media mail—figure $3 for a single title, less per book for multiple titles. (I’m willing to ship outside the U.S., but shipping charges can be prohibitive.) They are officially used because, yes, I read them. But I have one of the tiniest “book footprints” on the planet; my used books look better than most new books at the local bookstore. No dog ears, no coffee—or, in my case, tea—spills, no visible fingerprints.
In deference to the publishers who so kindly supply me with review copies, I am not offering anything I have reviewed in the last three months.
If you would like to buy any of these books, please email me at readingthemarkets@gmail.com. My preferred method of payment is PayPal. I’ll fill “orders” on a first come, first served basis.
I'll update this list as I receive payment for individual titles.
Anson et al., The Handbook of Traditional and Alternative Investment Vehicles
Bhuyan, Reverse Mortgages and Linked Securities
Biggs, A Hedge Fund Tale of Reach and Grasp
Caliskan, Market Threads (stamped “review copy not for resale” on bottom edge)
Caplan, Profiting with Futures Options (paper)
Fischer, Trading with Charts for Absolute Returns
Fullman, Increasing Alpha with Options
Isbitts, The Flexible Investing Playbook
Kaufman, Alpha Trading
Koesterich, The Ten Trillion Dollar Gamble
Kolb, Financial Contagion
Koppel, Investing and the Irrational Mind
Kroll, The Professional Commodity Trader
Kroszner & Shiller, Reforming U.S. Financial Markets
Kurzban, Why everyone (else) is a hypocrite (stamped “review copy not for resale” on bottom edge)
Labuszewski et al., The CME Group Risk Management Handbook
Leibovit, The Trader’s Book of Volume
Light, Taming the Beast
Marston, Portfolio Design
Martin, A Decade of Delusions
Meyers, The Technical Analysis Course
Phillipson, Adam Smith
Shover, Trading Options in Turbulent Markets
Sklarew, Techniques of a Professional Commodity Chart Analyst
Sorkin, Too Big to Fail
Thursday, October 6, 2011
Columbus Day sale
Mark your calendars! Monday is the opening day of my second half-price book sale. The list is long, and there are lots of goodies. First come, first served.
I’ll provide more how-to details in Monday’s post.
International readers should probably window shop only since shipping charges to destinations outside the U.S. tend to be steep. Books, after all, are heavy, and there’s no international media mail rate.
Wednesday, October 5, 2011
Corbitt, All About Candlestick Charting
If you haven’t read the last dozen or so books on candlesticks, Wayne A. Corbitt’s All About Candlestick Charting (2012), the most recent volume in McGraw-Hill’s “All About” series, is a good place to start. The author is writing for the neophyte who can benefit from a primer on charts (candlestick and its cousins) as well as technical analysis.
In the first third of the book Corbitt describes the major candlestick reversal and continuation patterns. He then explains how to complement these patterns by using western techniques, both chart reading (trends, support and resistance) and technical momentum indicators. He then adds volume to the mix, including in his analysis the “convenient” yet, as he is the first to admit, problematic candlevolume charts. (I personally can’t understand why anyone would use candlevolume charts instead of, for instance, more granular and informative tick charts.) The final section of the book describes three-line break, renko, and kagi charts.
The book is clearly written and has plenty of illustrative figures and charts. For the most part it is a derivative work, which is fine for a primer. One original contribution is the author’s smoothed volume percentage indicator (VPI), which is akin to the on balance volume indicator. It is used to analyze the cumulative volume of the top stock holdings of ETFs to determine whether an ETF trend is likely to continue or reverse.
All About Candlestick Charting may not belong in the library of a seasoned trader, but it’s a worthy addition to McGraw-Hill’s “easy way to get started” series.