Is equity factor investing an improvement compared to traditional quantitative strategies?


Factors have been around for quite some time. Most were discovered decades ago and were applied in investment strategies both by active managers and by those adopting traditional quantitative strategies. Some factors have proven to be more useful than others, at least to long-term investors. So is equity factor investing just an exercise in rebranding old strategies or is it a real innovation?

Let me reassure you: yes, there is genuine innovation in equity factor investing, and it lies in portfolio construction. Risk is now of paramount importance: while traditional quantitative equity strategies focused blindly on returns, factor investing strives to ensure investors understand what risks are being taken, expressed through factors such as low risk, value, momentum and quality.

Estimating expected equity returns using factors

It was the global financial crisis of 2008 that started it all. At BNP Paribas Investment Partners, we had been running multi-factor equity strategies based on optimisation of stock returns since 1994 thanks to our mentor Bob Haugen, the pioneer of multi-factor and low-volatility investing. In their seminal paper “Commonality in the determinants of expected stock returns”, Haugen and Nardin Baker introduced the framework that became the foundation of traditional quantitative equity investing and worked for more than a decade.

The approach provides the means to estimate expected equity returns from the factors to which stocks are exposed. Despite its initial success, the approach could not cope with the rapidly changing focus of markets in the aftermath of the financial crisis.

To see why, we have to look a little bit deeper into how these models worked. In such models, stock returns are driven by factors. Hence the idea of commonality: stocks exposed to the same factors should deliver comparable returns. The first step in such models is to determine which factors are driving the stock price. The exposure of a stock to a factor is essentially a measure of how cheap a stock is (value), how volatile it is (risk), how fast it moves in one direction or another (momentum) and how profitable it is (quality).

For instance, the cheapest stocks can be expected to have comparable returns to each other. And the same applies to the returns of the low-risk, the fast-trending and the profitable stocks.
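To make this concrete, here is a minimal sketch, on made-up numbers, of how such exposures are often computed in practice: each characteristic is standardised across the cross-section of stocks into a z-score, with the sign chosen so that a higher score means a stronger exposure to the factor. The characteristics and figures below are hypothetical proxies, not the definitions used in any particular model.

```python
# Minimal sketch: turning stock characteristics into factor exposures via
# cross-sectional z-scores. The characteristics used here (book-to-price,
# trailing volatility, 12-month return, return on equity) are common
# illustrative proxies, not the exact definitions of any manager's model.
import pandas as pd

def zscore(s: pd.Series) -> pd.Series:
    """Standardise one characteristic across all stocks at a point in time."""
    return (s - s.mean()) / s.std(ddof=0)

# Hypothetical cross-section of five stocks
chars = pd.DataFrame({
    "book_to_price": [0.90, 0.40, 1.20, 0.60, 0.30],    # value: cheaper scores higher
    "volatility":    [0.18, 0.35, 0.22, 0.40, 0.25],    # risk
    "ret_12m":       [0.15, -0.05, 0.30, 0.10, 0.02],   # momentum
    "roe":           [0.12, 0.05, 0.20, 0.08, 0.15],    # quality (profitability)
}, index=["A", "B", "C", "D", "E"])

exposures = pd.DataFrame({
    "value":    zscore(chars["book_to_price"]),
    "low_risk": -zscore(chars["volatility"]),   # sign flipped: low volatility scores high
    "momentum": zscore(chars["ret_12m"]),
    "quality":  zscore(chars["roe"]),
})
print(exposures.round(2))
```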

The second step is the estimation of the factor returns. Here things become more complex. Factor returns are derived from what is known as a cross-sectional regression: the past returns of all stocks are regressed on their factor exposures. The idea is that if stocks with similar factor exposures are supposed to behave similarly, then from the stocks’ past returns we can estimate the returns attributable to each factor, and those factor returns in turn help explain the stock returns. The regression solves that estimation problem mathematically and produces a time series of factor returns.
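As an illustration of this second step, the sketch below runs an ordinary least squares cross-sectional regression on simulated data: in each period, stock returns are regressed on the matrix of factor exposures, and the fitted coefficients are taken as that period’s factor returns. The simple OLS set-up and all inputs are assumptions made for clarity, not a description of any production model.

```python
# Minimal sketch of a cross-sectional regression: each period, stock returns
# are regressed on the stocks' factor exposures, and the fitted coefficients
# are that period's factor returns. Repeating this period by period produces
# the time series of factor returns. All data below are simulated.
import numpy as np

rng = np.random.default_rng(0)
n_stocks, n_factors, n_periods = 500, 4, 24

factor_returns_true = rng.normal(0.0, 0.02, size=(n_periods, n_factors))
exposures = rng.normal(0.0, 1.0, size=(n_stocks, n_factors))  # held fixed for simplicity

estimated = np.zeros((n_periods, n_factors))
for t in range(n_periods):
    noise = rng.normal(0.0, 0.05, size=n_stocks)               # stock-specific returns
    stock_returns = exposures @ factor_returns_true[t] + noise
    # Ordinary least squares: solve exposures @ f ~= stock_returns for f
    estimated[t], *_ = np.linalg.lstsq(exposures, stock_returns, rcond=None)

# The estimated factor-return series tracks the true one closely in this clean setting
print(np.corrcoef(estimated[:, 0], factor_returns_true[:, 0])[0, 1])
```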

Two flaws have come to light

It is then assumed that the factor returns obtained from this regression are also a measure of the future expected factor returns. And that is what went wrong in 2008: factor returns in the immediate future deviated significantly from their recent past, which generated bad stock return forecasts and underperformance. The attempt to time factor performance by assuming that future performance will be determined by recent past performance failed.
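The sketch below illustrates, again on simulated data, why this kind of extrapolation is fragile: forecasting a factor’s next return with its trailing average works while the factor keeps trending, but lags badly once the regime flips, as it did in 2008.

```python
# Minimal sketch of the extrapolation criticised above: forecast next period's
# factor return with the trailing average of its recent returns, then compare
# the sign hit rate before and after an abrupt regime change. Data are simulated.
import numpy as np

rng = np.random.default_rng(1)
# A factor earning about +1% a month for 36 months, then about -1% a month for 12
factor_ret = np.concatenate([rng.normal(0.01, 0.02, 36),
                             rng.normal(-0.01, 0.02, 12)])

window = 12
forecasts = np.array([factor_ret[t - window:t].mean()
                      for t in range(window, len(factor_ret))])
realised = factor_ret[window:]

hit_all = np.mean(np.sign(forecasts) == np.sign(realised))
hit_after_flip = np.mean(np.sign(forecasts[24:]) == np.sign(realised[24:]))
print(f"sign hit rate overall: {hit_all:.0%}, after the regime change: {hit_after_flip:.0%}")
```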

A second problem with the approach is how it handles common exposures to different factors, e.g. the fact that a stock can be simultaneously cheap and profitable. The regression tends to reach its limits of applicability when many stocks are simultaneously strongly exposed to different factors. Prohibitive levels of turnover and exotic portfolio allocations with extreme weights in some stocks then result.
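A toy example on simulated data shows the mechanics: when two exposure columns are nearly collinear, for instance because most cheap stocks are also profitable, the estimated factor returns become very unstable from one noisy sample to the next, which is what ultimately feeds through into extreme weights and turnover.

```python
# Toy illustration (simulated data) of the collinearity problem: when two
# exposure columns are nearly identical -- e.g. many cheap stocks are also
# profitable -- the cross-sectional regression coefficients (the estimated
# factor returns) swing wildly from one noisy sample to the next.
import numpy as np

rng = np.random.default_rng(2)
n_stocks = 300
value = rng.normal(0.0, 1.0, n_stocks)
quality = 0.95 * value + 0.05 * rng.normal(0.0, 1.0, n_stocks)  # almost collinear
X = np.column_stack([value, quality])

estimates = []
for _ in range(200):
    # True factor returns are 2% each; stock-specific noise dominates
    returns = 0.02 * value + 0.02 * quality + rng.normal(0.0, 0.05, n_stocks)
    beta, *_ = np.linalg.lstsq(X, returns, rcond=None)
    estimates.append(beta)
estimates = np.array(estimates)

print("std of estimated factor returns:", estimates.std(axis=0).round(3))
print("condition number of exposure matrix:", round(float(np.linalg.cond(X)), 1))
```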

Typically, quantitative managers brush the problem under the carpet by not putting all this information about common stock exposures to different factors into the regression analysis. But ignoring the problem is not the right solution, since it can generate over-exposure to correlated factors. The portfolio ends up being under-diversified in terms of factor exposures.

The birth of equity factor investing

The first generation of smart-beta strategies (minimum volatility, maximum diversification, risk parity and fundamental indexing) was a desperate response to the failure of traditional quantitative approaches and the poor equity market performance in 2008. But these strategies were based on an illusion: that stock diversification alone is enough to generate returns or, in other cases, that weighting stocks by a company’s fundamental data is what matters in portfolio construction. It is now well known that neither is true.

It is, in fact, the exposures to factors such as low volatility or value that fully explain the risk and returns of these strategies. Investors are now increasingly aware that what really matters are the factor exposures that you choose. And if you select the right factor exposures, you can earn good returns on average. Factor investing was born.

The philosophy of factor investing is very different from past approaches, whether traditional quantitative or smart-beta investing. This is because factor investing is not about the stocks but is instead entirely focused on the exposure of a portfolio to the factors. A recent paper by our researchers, R. Perchet, R. Leote de Carvalho and P. Moulin, “Inter-temporal risk parity: a constant volatility framework for factor investing”, Journal of Investment Strategies, Vol. 4, No. 1 (2014), pp. 1-23, discusses the importance of factor allocation and factor risk budgeting in portfolio construction. It shows that keeping portfolios exposed to factors such as value or momentum typically generates positive excess returns.

Timing factors should be avoided. Merely adjusting the portfolio allocation to factors so that the risk exposure to each factor remains constant over time is sufficient and leads to significantly higher information ratios.
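As a rough illustration of this constant-risk idea, and a simplification rather than the full methodology of the cited paper, the allocation to each factor can be scaled by the inverse of its recently realised volatility so that the risk budgeted to each factor stays approximately constant through time:

```python
# Minimal sketch of the constant-risk idea behind inter-temporal risk parity:
# scale the allocation to each factor by the inverse of its recent realised
# volatility so that the risk budgeted to each factor stays roughly constant.
# This is a simplified illustration, not the cited paper's full methodology,
# and all data are simulated.
import numpy as np

rng = np.random.default_rng(3)
n_periods, n_factors = 120, 4
target_vol = 0.02          # per-factor monthly risk budget (illustrative)
window = 12

# Simulated monthly factor returns with time-varying volatility
vols = 0.01 + 0.03 * rng.random((n_periods, n_factors))
factor_ret = rng.normal(0.0, 1.0, (n_periods, n_factors)) * vols

scaled_ret = []
for t in range(window, n_periods):
    realised_vol = factor_ret[t - window:t].std(axis=0, ddof=1)
    scale = target_vol / realised_vol            # de-lever a factor when its risk rises
    scaled_ret.append((scale * factor_ret[t]).mean())  # equal-weight the scaled factors

print("volatility of constant-risk factor portfolio:", round(float(np.std(scaled_ret)), 4))
```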

En route to the optimal portfolio

The second drawback of traditional quantitative strategies is how they employed portfolio optimisers. Typically, traditional quantitative managers injected the expected stock returns derived from the regression analysis into a mean-variance optimiser and asked for the optimal portfolio at a given level of risk. But this approach is known to suffer from a number of shortcomings, and the resulting portfolios are usually more a product of the portfolio constraints imposed than of the expected stock returns used as inputs.
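A toy example of this fragility: in an unconstrained mean-variance optimisation, a modest change in the expected returns fed into the optimiser can produce a very different set of “optimal” weights, which is why in practice the constraints end up doing most of the work. The inputs below are invented purely for illustration.

```python
# Toy illustration of the fragility described above: in an unconstrained
# mean-variance optimisation, a modest change in the return forecasts can
# shift the "optimal" weights substantially. All inputs are invented.
import numpy as np

def mean_variance_weights(mu, cov, risk_aversion=5.0):
    """Unconstrained mean-variance solution, rescaled to sum to one."""
    w = np.linalg.solve(risk_aversion * cov, mu)
    return w / w.sum()

cov = np.array([[0.0400, 0.0180, 0.0160],
                [0.0180, 0.0900, 0.0200],
                [0.0160, 0.0200, 0.0625]])

mu = np.array([0.050, 0.060, 0.055])              # baseline expected returns
mu_bumped = mu + np.array([0.010, -0.010, 0.0])   # small change to two forecasts

print("baseline weights:", np.round(mean_variance_weights(mu, cov), 2))
print("bumped weights:  ", np.round(mean_variance_weights(mu_bumped, cov), 2))
```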

In response, factor investing employs robust optimisation techniques that avoid these well-known problems of traditional portfolio optimisation. One such robust portfolio construction technique was developed by our researchers. It aims at building portfolios that efficiently mix exposures to factors such as value, low volatility, momentum and quality. It is designed to be transparent and to generate portfolios in which the impact of guideline portfolio constraints is minimised. Details of the approach can be found in our paper: R. Leote de Carvalho, X. Lu and P. Moulin, “An integrated risk-budgeting approach for multi-strategy equity portfolios”, Journal of Asset Management, Vol. 15, No. 1 (February 2014), pp. 24-47.
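To give a flavour of risk budgeting across factor sleeves, here is a generic equal-risk-contribution allocation, a standard technique rather than the specific method of the paper: weights are chosen so that each factor sleeve contributes the same share of total portfolio risk. The covariance matrix is illustrative.

```python
# Minimal sketch of risk budgeting across factor sleeves (a generic
# equal-risk-contribution allocation, not the cited paper's exact method):
# choose weights so every factor sleeve contributes the same share of
# portfolio risk. The covariance matrix below is purely illustrative.
import numpy as np
from scipy.optimize import minimize

cov = np.array([[0.0225,  0.0045, -0.0060, 0.0030],   # value
                [0.0045,  0.0100, -0.0040, 0.0020],   # low volatility
                [-0.0060, -0.0040, 0.0400, 0.0010],   # momentum
                [0.0030,  0.0020,  0.0010, 0.0144]])  # quality
n = cov.shape[0]

def risk_contributions(w):
    port_var = w @ cov @ w
    return w * (cov @ w) / port_var            # each sleeve's share of total variance

def objective(w):
    return np.sum((risk_contributions(w) - 1.0 / n) ** 2)

res = minimize(objective, x0=np.ones(n) / n,
               bounds=[(0.0, 1.0)] * n,
               constraints=[{"type": "eq", "fun": lambda w: w.sum() - 1.0}])

print("weights:           ", np.round(res.x, 3))
print("risk contributions:", np.round(risk_contributions(res.x), 3))
```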

We have come a long way from the black boxes of traditional quantitative managers and from the naïve smart-beta indices. Efficiently constructed portfolios with multiple managed exposures to factors known to generate premiums are the future of quantitative equity investing.

Etienne Vincent

Head of Global Quantitative Management, THEAM
