How accurate are bloggers II?
It's time for round two of the analysis of the Ticker Sense Blogger Sentiment Poll. My original analysis covered 20 polls from the start of the year and is given here. This is another 20-poll sample, covering the period from April 20th to August 31st. What is interesting about this period was the sharp correction in the S&P in July/August - but how did this impact bloggers?
There have been some changes; most notable was the end of contributions from Ant and Sons and Bill aka NoDooDahs in July, both of whom had been regular weekly data providers up to that point. The other blogger returns were a mixed bag: Controlled Greed, Knight Trader, Naked Shorts, Traders Talk, Wishing Wealth, Trader Tim, Jack Stevison, and Fly on the Wall each made 10 or fewer returns out of the 19 possible polls (there were data issues for one week of the 20), so their data will be skewed by the low sample number.
Based on bullish (>1%), bearish (<-1%), or neutral (within +/- 1%) classifications of the 30-day returns, the S&P had 10 bullish, 5 bearish, and 5 neutral periods. This gave a net reading of 25% bullish for the S&P ((10 - 5) / 20), down from 40% in the original analysis. The table below gives the net standing of the bloggers from maximum bullish (all received returns were bullish) to maximum bearish (all received returns were bearish). Again, a low sample count is more likely to give a blogger an extreme reading; e.g. Jack Stevison made only 4 returns, all of which were bearish, giving a maximum bearish reading of -100%.
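For readers who want to reproduce the arithmetic, here is a minimal sketch of the classification and net-percentage calculation described above. The function names and thresholds are my own rendering of the rules stated in the text, not Ticker Sense's actual code:

```python
def classify(ret_pct):
    """Classify a 30-day % return: bullish > 1%, bearish < -1%, else neutral."""
    if ret_pct > 1.0:
        return "bullish"
    if ret_pct < -1.0:
        return "bearish"
    return "neutral"

def net_bullishness(returns):
    """Net standing = (bullish count - bearish count) / total, as a percentage."""
    calls = [classify(r) for r in returns]
    net = calls.count("bullish") - calls.count("bearish")
    return 100.0 * net / len(calls)

# Hypothetical returns matching the figures above:
# 10 bullish, 5 bearish, and 5 neutral periods
sample = [2.0] * 10 + [-2.0] * 5 + [0.5] * 5
print(net_bullishness(sample))  # → 25.0
```

Note how an all-bearish blogger with only 4 returns scores -100% under the same formula, which is why low-sample bloggers sit at the extremes of the table.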
The most accurate blogger was Daily Dose of Optimism at 59% accuracy, with Carl Futia, Jack Stevison, and me next at over 50% accuracy. The most bullish blogger was Carl Futia at 100% bullish, with Shark Report and Bill aka NoDooDahs (up to July) reporting over 90% bullishness. Extreme bearishness covered Jack Stevison, Naked Shorts, Trader Tim, and Random Roger, all at -100% - but with the exception of Random Roger, those three bloggers made few poll returns, skewing their data. Nobody else reached -90% or more bearishness. Twelve bloggers were net bullish (>10%), ten were net bearish (<-10%), and four were net neutral (+10% to -10%). So bloggers tended to be more bullish than bearish over the last 20 periods.
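The accuracy figures above can be thought of as the fraction of a blogger's submitted calls that matched the S&P's realized direction. The exact scoring isn't spelled out in the poll, so this is a sketch under that assumption, with a hypothetical blogger who missed some weeks:

```python
def accuracy(calls, outcomes):
    """Percent of a blogger's calls matching the realized S&P direction.
    Missed weeks (None) are skipped, so a low-sample blogger is scored
    on only a few points - the skew problem noted in the text."""
    pairs = [(c, o) for c, o in zip(calls, outcomes) if c is not None]
    if not pairs:
        return 0.0
    hits = sum(1 for c, o in pairs if c == o)
    return 100.0 * hits / len(pairs)

# Hypothetical 6-week record: two missed weeks, 3 of 4 calls correct
calls    = ["bullish", None, "bearish", "bullish", None, "neutral"]
outcomes = ["bullish", "bearish", "bearish", "neutral", "bullish", "neutral"]
print(accuracy(calls, outcomes))  # → 75.0
```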
The data below shows the initial return (left data series), the most recent return (right data series), and whether bloggers turned bullish, turned bearish, or were unchanged (within +/- 5%) between the two data sets. It was interesting to see the continued strong performance of Carl Futia and Daily Dose of Optimism from the original poll period to the most recent, with each holding above 50% accuracy. But how did the bloggers fare with respect to the change in the S&P between the two sample periods?
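The "turned bullish / turned bearish / unchanged" labels follow directly from the +/- 5% band. A one-function sketch of that rule (the function name is mine):

```python
def shift(first_net, recent_net, band=5.0):
    """Label the change in net bullishness between two sample periods.
    A move within +/- band percentage points counts as unchanged."""
    delta = recent_net - first_net
    if delta > band:
        return "more bullish"
    if delta < -band:
        return "more bearish"
    return "unchanged"

# The S&P itself went from 40% to 25% net bullish between the two periods:
print(shift(40.0, 25.0))  # → more bearish
```

Applied to each blogger's pair of readings, this gives the turned-bullish/bearish/unchanged counts discussed in the next paragraph.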
Based on the two data sets, the S&P turned more bearish, down from 40% bullishness to 25%. Of the 10 bullish bloggers from the first period, five turned more bullish, four turned bearish, and one was unchanged (Carl Futia couldn't get more bullish than 100%!). Of the 8 bearish bloggers from the first period, five turned more bullish, two were unchanged, and only one turned more bearish. So there was a sharp reversal amongst prior bears toward a more bullish stance (although Ant and Sons had stopped giving calls as the worst of the correction was beginning). Given the S&P shifted bearish, this may represent a contrarian signal amongst bloggers.
But how can this data be used? With only two data sets to compare, it is a little hard to draw firm conclusions. One could follow the opinions of the most accurate bloggers, in this case Daily Dose of Optimism and Carl Futia. Or go contrary to the least accurate bloggers, Millionaire Now and Random Roger. Or do a combination of the two. These four bloggers had relatively tight spreads in their accuracy across the two sample periods - but anyone can draw a line between two points - so it is going to take a number of sample periods to understand which bloggers run with, or inverse to, S&P direction. Some bloggers had a relatively wide spread in their accuracy (myself included) - so the old "past performance does not guarantee future performance" caveat holds true here too.
Unfortunately, there are complications to this conclusion. The above bloggers were relatively steadfast in their opinions: Carl Futia was 100% bullish; Daily Dose of Optimism had the occasional bearish call (July 13th and May 18th) but was predominantly bullish; Random Roger was 100% bearish. Millionaire Now shifted from an all-bearish opinion up to June 1st to a neutral opinion up to August 31st. Millionaire Now perhaps had the closest contrarian signal, but it was hardly an ideal one.
The closest blogger to predicting the July/August correction was 24/7 Wall Street, going from a neutral stance in May-June to bearish on June 15th, and staying bearish until August 24th. Others who were on the right side of opinion for the July/August drop (myself included) had been long-standing bears. As for the post-July/August rally, I turned bullish on July 27th after long-standing bearishness, and Learning Curve turned bullish on August 17th after long-standing bearishness. 24/7 Wall Street switched back to neutral on August 31st.
Individual bloggers do not appear to be the best indicator for major turns in the market. There may be more substance in looking at the behavior of traditionally "bullish" and "bearish" bloggers as a group and seeing how group opinion changes. But even then, the structure of my data set, which considers averages rather than the timing of signals, makes using this as a trading device difficult; it may only work at a macro level (e.g. per 5-, 10-, or 20-period data set). But that is another story.
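As a final sketch, the group-level idea above could be as simple as averaging the net bullishness of a hand-picked "bull camp" and "bear camp" and watching how those two averages move. The grouping and numbers here are entirely hypothetical:

```python
def group_sentiment(blogger_nets, group):
    """Average net bullishness (%) for a named group of bloggers.
    Bloggers missing from the data (e.g. dropped contributors) are skipped."""
    vals = [blogger_nets[name] for name in group if name in blogger_nets]
    return sum(vals) / len(vals) if vals else 0.0

# Hypothetical per-blogger net readings for one sample period
nets = {"BloggerA": 100.0, "BloggerB": 90.0, "BloggerC": -100.0, "BloggerD": -40.0}

bull_camp = ["BloggerA", "BloggerB"]
bear_camp = ["BloggerC", "BloggerD", "BloggerE"]  # BloggerE stopped contributing

print(group_sentiment(nets, bull_camp))  # → 95.0
print(group_sentiment(nets, bear_camp))  # → -70.0
```

A divergence between the two camp averages across sample periods is the kind of macro-level signal the paragraph above has in mind, though as noted, averaging discards the timing of individual calls.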