House effects - UPDATED
Last summer I did a post looking at the "house effects" of each polling company, that is - the partisan effects of each company's methodology. We know, for example, that ICM reallocate their don't knows, which helps the Lib Dems, that Survation prompt for UKIP, that online companies tend to have slightly different results from telephone ones, and that companies have different approaches to turnout and weighting and squeeze questions that all have impacts on their results. Some of these, like the reallocation of don't knows, are extremely easy to quantify by looking at the tables published by the pollsters. Others are impossible to measure except by looking at the long-term differences between different companies' results. This is an attempt to roll it all up and give a brief picture of which companies tend to give results that are better for Labour, which tend to give higher results for UKIP, and so on.
This is important for understanding trends. One of the most common errors I see on Twitter is people forgetting the house effects of different pollsters, and therefore thinking the Labour lead has suddenly fallen when an ICM poll appears (when actually ICM always tend to show lower Labour leads) or grown when a TNS BMRB poll is published (when actually TNS BMRB always tend to show a bigger Labour lead).
House effects are not set in stone, they are a result of how the political situation interacts with each pollster's methodology, so they change with the political weather. For example, ICM's reallocation of don't knows tends to help whichever party has lost support since the last election - back before 1997 it helped the Tories, from around 2002 it started helping Labour, these days it is most helpful to the Liberal Democrats. Filtering by likelihood to vote tends to help the Tories, whose supporters come from socio-economic groups that are more likely to vote... but it can help any party if their supporters are particularly enthusiastic.
Anyway, here is an updated chart based on polls over the last six months*. Companies to the left tend to show bigger Labour leads, companies to the right tend to show smaller Labour leads. Rather than looking at differences in Liberal Democrat support as I did last year, I've looked at differences in UKIP support, which tend to be much starker between pollsters:
A few things worth noting:
It's noticeable how the "new online companies" are clustered in the top left of the graph - there are differences between them, but they all tend to show the highest UKIP scores and the highest Labour leads. Compare them with the three traditional telephone polls from ICM, ComRes and Ipsos MORI in the bottom right, showing the lowest UKIP scores and the lowest Labour leads.
Secondly, look at the range: once you put six months' data in, there is not a vast difference in the size of the Labour leads different companies show. If we leave aside Angus Reid, who only conducted a couple of polls at the start of the year, and look at just the regular companies, there is only a 3 point gap between Opinium and TNS BMRB, who show the biggest Labour leads, and ICM and the ComRes online polls, which show the smallest. In terms of the Labour lead, the polls are not really "all over the place", as people are wont to say.
Contrast that with UKIP, where there really is a gulf between what different pollsters show - the average difference between ICM (who tend to show the lowest levels of UKIP support) and Survation (who tend to show the highest) is about 7 points.
(* As with last time, the method I use is to take a rolling average of the YouGov daily poll as the comparison point, and then see what the average difference is between each company's polls and the YouGov average at the same time. The reason for this rather complicated approach is that it irons out any differences that stem from the timing of polls: if you just took a crude average of each company's polls it would be skewed, for example, for a company that did several polls in the last month when UKIP was on a high but few polls early in the year when they weren't. I then relate this to the average across all the companies, so that YouGov don't automatically end up in the middle of the graph. It's important to note that all the differences are relative to each other... so being in the middle of the chart doesn't necessarily mean you are right.)
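For anyone who wants to see the arithmetic behind that footnote spelled out, here is a minimal sketch of the two steps - compare each poll to the concurrent YouGov rolling average, then re-centre on the all-company mean. The pollster names ("PollsterA", "PollsterB"), the leads, and the 5-day window are all invented for illustration; they are not the real figures or the exact window used in the post.

```python
from datetime import date, timedelta

# Invented mini-dataset: fieldwork date -> Labour lead, for a daily
# "YouGov" series and two hypothetical other pollsters.
yougov = {date(2013, 1, 1) + timedelta(days=i): lead
          for i, lead in enumerate([10, 11, 9, 10, 12, 11, 10, 9, 10, 11])}

other_polls = {
    "PollsterA": [(date(2013, 1, 3), 13), (date(2013, 1, 8), 12)],
    "PollsterB": [(date(2013, 1, 2), 8), (date(2013, 1, 7), 9)],
}

def rolling_yougov(d, window=5):
    """Average of the daily YouGov leads in the `window` days up to date d."""
    vals = [lead for day, lead in yougov.items() if 0 <= (d - day).days < window]
    return sum(vals) / len(vals)

# Step 1: for each company, average the gap between its polls and the
# YouGov rolling average at the same time - this controls for *when*
# each poll happened, so a pollster isn't penalised for only polling
# during a good or bad patch for a party.
raw_effect = {}
for company, polls in other_polls.items():
    gaps = [lead - rolling_yougov(d) for d, lead in polls]
    raw_effect[company] = sum(gaps) / len(gaps)
raw_effect["YouGov"] = 0.0  # YouGov's gap to itself is zero by construction

# Step 2: subtract the all-company mean, so YouGov doesn't automatically
# sit in the middle of the chart.
mean = sum(raw_effect.values()) / len(raw_effect)
house_effect = {c: e - mean for c, e in raw_effect.items()}
```

The re-centred numbers sum to zero across companies, which is exactly why, as the footnote says, being in the middle only means you are average, not that you are right.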