MIT Sloan Sports Analytics Conference 2013: Soccer Analytics Panel


Panelists : Chris Anderson, Albert Larcada, Blake Wooster, Jeff Agoos

The makeup of this year’s panel was very different from that of the Soccer Panel at SSAC 2012. Dominated by “club insiders” last year, this year’s panel had a mix of “outsiders”: Chris Anderson, Blake Wooster from Prozone representing the data companies, Albert Larcada from ESPN representing the media, and Jeff Agoos, a former player and Technical Director of MLS.

The ballroom was almost full, and it was a bigger ballroom than last year’s. (My totally unscientific method of measuring crowd sizes puts “almost full” > “75% full of last year”.)

Marc Stein moderated the panel as was the case last year.


This year’s panel took a completely different track from last year’s. Last year the discussion centered on how analytics is used, where it is useful, and the importance of context and trust; the biggest challenge cited in 2012 was the availability of good data to work with. This year it centered more on how much analytics is used by managers, on metrics and visualizations, on the use of analytics in the media, and on trust (a repeat theme from last year).

The panel started off with a historical perspective on soccer analytics from Chris. I thought it was a very good beginning that gave the audience an idea of how far we have come. Albert’s examples of using heatmaps of Messi and Ronaldo in the build-up to the “Clásico” on SportsCenter showed how data is being used to tell a story. The challenge in a scenario like SportsCenter is that the announcers need to be able to explain the graphic and tell the story in less than 10 seconds, so choosing the right type of visualization is key. I also liked Albert’s “visualization is analytics” remark. All visualizations are an approximation of the raw data; done right, they can tell a story not just in the media but also inside a soccer club.

The big questions

  1. How much analytics is being used by the club managers?
  2. What are the challenges in making the coaches use analytics?
  3. Metrics – what do we have today?

How much analytics is being used by the club managers?

Chris tried to answer this based on his experience working with clubs as a consultant, but the viewpoint of an “insider” (like an analyst at a club) citing examples where analytics has been used successfully by a manager would have been a good counter-balance. It is always a tough balancing act to have a frank discussion without revealing confidential details, but I felt that the club angle was addressed better in last year’s panel. West Ham United’s manager Sam Allardyce talking about how he uses analytics is a good example.

What are the challenges in making the coaches use analytics?

There was some great discussion on this one. Coaches have a lot at stake, and they may not be willing to use something new unless they know (with a high degree of confidence) or trust that it will help them; but they won’t know for sure if it is useful unless they use it. The classic chicken and egg. A great example was the importance of survival in a promotion-relegation environment and how that pushes managers and front offices towards short-term thinking and immediate results.

Daryl Morey, GM of the Houston Rockets and a co-chair of the SSAC, brought up a great point in the “Revenge of the Nerds” panel about how the front office needs to support and persist with analytics. Managers may not trust it right away, but if they find it useful over a period of time they will eventually come around.

Blake had a point when he stressed that the onus is on the analyst to communicate the value of analytics to the coach. If the coach doesn’t see value in it, it is probably because the analyst didn’t do a good enough job of conveying the message.

In “Why We Don’t Understand Luck” by Michael Mauboussin, one of my favorite talks of the conference, Michael stressed the importance of evaluating things based on the process (which is very hard to do) rather than the outcomes (which is easy and what we normally do).

I believe that analytics is being used differently at different clubs (hardly surprising). It has a complementary role as “another tool” in the overall toolkit for success. The two key themes that resonated across the 2-day conference, “communicating the message” and “winning the trust of coaches/decision-makers”, are very important.

Metrics – what do we have today?

This is probably one of the most debated aspects of soccer analytics today. What metrics do we have? What is the equivalent of baseball’s WAR in soccer? There is a lot to gain from knowing how analytics and metrics work in other sports, but every sport is unique and presents its own challenges. It is a fact that we haven’t been able to model soccer matches as well as we would love to. Albert brought up a great point about how the paucity of scoring in soccer makes it much harder to model than almost every other team sport. We haven’t yet come up with a formula that tells us how to win a game or score a goal, because there are many ways to win a soccer game. Different coaches employ different systems, and a metric/KPI valued highly in one system might not matter at all in another. For example, speed on the ball and accurate long balls might be very important in a counter-attacking system but not in a short-passing system.

 

Charles Reep, probably the first soccer analyst, found that most goals were scored from moves of fewer than three passes; therefore, he concluded, it was important to get the ball forward as quickly as possible.

While his statement might still hold true, the “How?” is the key question. The answer is not as straightforward. The quickest way to move the ball is for the goalkeeper to hoof balls upfield and we know that it doesn’t work most of the time.

I believe the availability of spatial XY-data + data from camera tracking systems (installed in most of the stadiums around the world today) in conjunction with video has helped in answering the “How” better. But we still have a long way to go.

As Chris pointed out at the very beginning of the panel – This is not a revolution but an evolution. We are constantly evolving.

A few more of my thoughts are in a podcast with Richard Whittall, editor of The Score Media’s Counter Attack blog.

Suggestions for next year’s panel

  1. I felt that the compositions of the 2012 and 2013 panels were at the two extremes in terms of “club insiders/outsiders”. A panel with a mix of the two groups would probably be more useful.
  2. Panel formats don’t lend themselves very well to 1-on-1 dialogue and interaction. An online Q & A chat session with the panelists during the conference is a good thing to try. I got similar feedback from a few others who attended the conference.
  3. Have Big Sam on the panel! Seriously, having a manager who has used analytics and is willing to talk about it would elevate this panel to a whole other level. I am aware that the EPL will be in season, but technology gives the option of having someone participate remotely.

I have another post summarizing my views on the other panels I attended coming up in a few days.

Links

Official website Sloan Sports Analytics Conference

Richard Whittall’s thoughts on the Soccer panel

Zach Slaton’s Summary of the SSAC 13

Mitch Lasky’s impressions of the conference


Sports Hack Day Project


I have been busy ever since I started working for the Seattle Sounders about a month ago. It has been great so far. We are less than a month away from the season kick-off. I am very excited, to say the least.

Coming up in a few weeks is the Sloan Sports Analytics Conference in Boston. Last year’s conference had a profound impact on me. More about that in another post.

This weekend I participated in the 1st ever Sports Hackday in Seattle. The idea of learning something new, meeting like-minded people, and having an excuse to avoid the endless Super Bowl pregame show was enough motivation to sign up. The Hackday was very well organized; kudos to the organizers and sponsors. We started off Friday night with introductions and forming teams. Our team, “Submarino”, consisted of Sarah, Adam, Matt, and me. We had a few ideas going into the Hackday, and after a brief brainstorm we decided on looking at the impact of injuries on soccer clubs.

Sunday morning during the integration phase, a few hours before the demo

One of the coolest things about Sports Hackday was that data providers like Sports Data LLC and platform companies like Google, Cloudant, Twilio, etc., provided tools that ensured we spent most of our time implementing our idea rather than worrying about basic infrastructure and plumbing.

We used the Sports Data LLC API to extract the injury information for the English Premier League and broke it down by team, type of injury, and # of games missed due to these injuries. We built a fully working model of our idea using real data. It helped that we had an awesome team and that we did a very good job of decoupling the frontend UI pieces from the backend database work, which enabled us to work almost in parallel. We had our hairy moments during the integration phase with the clock winding down to noon Sunday (the deadline for code-complete), but we were able to get done most of what we wanted to do.

We built this cool interactive visualization illustrating the breakdown of a team’s injuries by category and by player. The thickness of each arc depicts the # of games missed due to a particular injury.

We had 3 minutes to demo, and it went well, although all of us were a bit nervous and very tired. We won two prizes: “Best data visualization” and “Best overall data hack of the Hackday”.

Here is a piece on the Sports Hackday on Geekwire.
Local TV King 5‘s coverage of the event

Frankly, I did not expect to win the overall prize. We ended the evening very happy and very very tired.

Visuals

Manchester United had the highest # of player-games missed due to injuries so far this season. The 2nd visual highlights that muscle injuries are a team-wide issue, not limited to Nani, who missed the most time due to them.

This poses a new question : Is there something in the training regimen of Manchester United that is causing this? 

Manu Injury Breakdown


PS: I couldn’t get the interactive part working on the blog due to JavaScript issues. If I ever figure it out, I will update this post.

MCFC Analytics blogposts – Summary #8


Here is the list of interesting posts I found in the past week

  1. An interesting post on home advantage and how it manifests itself into football stats by @FbPerspectives. The post also has a link to a detailed paper from 2009 on home advantage.
  2. Guardian Data blog has an interactive visualization of the Bolton – City game by @jburnmurdoch. The viz has a pitch map + a radial diagram that captures the pass direction and length.
  3. The man in the yellow shirt – an analysis of the refs by @PedroAfonso85
  4. An interactive visualization of the direction of a player’s passes by @alekseynp . Some of the outliers are very interesting.
  5. Momentum in Bolton – City game. by @SoccerStatistic . This is a different approach from the previous attempts on visualizing momentum using this data set.

I did not publish anything last week, although I did start writing. Hopefully I will publish something later this week.

Previous Summaries

Summary #7

Summary #6

Summary #5

Summary #4

Summary #3

Summary #2

Summary #1

If I missed any, please post them in the comments section or tweet them to me!

MCFC Analytics-Summary of blogposts #6


This week I saw a few more new bloggers getting into the act with the data.

First up, there was this article by @RWhittall of TheScore.com, where Richard talked about “soccer data abuse by some bloggers using the MCFC data”. The gist of the article is that some bloggers are extrapolating too much from one year’s worth of data from one league. The other point made in the article is that the output of the majority of the work in soccer analytics isn’t groundbreaking and just adds a data context to what we already knew.

While I see where Richard is coming from, I don’t quite agree either with his assessment of the state of soccer analytics or the “data abuse” bit.

Unquestionably, we haven’t even scratched the surface of what we can do with data in soccer. The majority of research work in soccer analytics is carried out in the private domain, because soccer data is not a commodity like it is in other sports such as baseball. The MCFC & Opta project could be a significant step towards making soccer data more accessible to a wider audience, if it can get enough passionate people interested. However, as in any type of writing in the public domain, there is the good and the not so good. One of the things we discussed with Gavin Fleig, Head of Performance Analysis at Manchester City, Simon Farrant, Marketing Coordinator at Opta, and others was building a community that fosters communication, collaboration, and open feedback among the members and the readers. This should help everyone get better over time.

Without further ado, here are links to some interesting work I found in this past week.

@MarkTaylor0 has a comprehensive piece on the state of soccer analytics and where it stands vis-à-vis other sports like NFL and Baseball. – The case for data analysis in football. This is a must read.

Analytics posts

  1. @PedroAfonso85 has a couple posts using the advanced data set
  2. @ChrisJLilley continues with his positional analysis series with Strikers and Central attacking midfielders
  3. @FootballFactman ‘s piece talks about what to look for in goalkeepers of the premier league
  4. @shots_on_target talks about the correlation between points in fantasy football and attacking stats
  5. In my weekly opposition analysis series I analyzed Sunderland using last season’s data.

Visualization posts

  1. Earlier today I saw Voetstat, a neat blog by @Voetstat_craig which has some visualizations of pass completion + heatmaps. There are multiple posts. I haven’t had a chance to read all of them yet.
  2. @TomBerthon has this visualization of how goals were scored in the Bolton – City game from last season

If I missed any links, post them in the comments section and tweet them with the hashtag #MCFCAnalytics. I will retweet them.

Previous Summaries

Summary #5

Summary #4

Summary #3

Summary #2

Summary #1

MCFC Analytics – Summary of blog posts # 3


Thanks for the amazing response to Summary of blog posts #1 & Summary of blog posts #2

I also want to thank people who have reached out to me via twitter with links to their blogs & posts.

Goalscorer ‘footedness’ by @DavidAHopkins measures the footedness or the foot favoured by Premier League goalscorers.

How do the more successful clubs keep the ball in EPL by @JDewitt talks about how the top teams in EPL keep possession. Also by John is Successful Passing and Winning

A sneak peek at a very interesting carto by @Kennethfield: Charlie Adam’s “passing wheel”

Football Philosophy – Long passes by @Poolq1984 explores the importance of long ball in football.

@We_R_PL has a nice post on how to use the MCFC dataset more efficiently. He also has a spreadsheet with own goals calculated per team.

@footballfactman has a post on Darron Gibson using a mix of data from MCFC dataset, whoscored and statszone

The always excellent @MarkTaylor0 has a detailed post analysing the quality of shots in the Bolton – Manchester City game using the advanced dataset.

@ChrisJLilley has 3 posts on his blog using MCFC data

GK positional analysis

Premier league game changers Part I & Part II

@DanJHarrington has cranked up a lot of things using the advanced dataset

1. An interactive Tableau viz showing each player’s touches on the pitch in Bolton – City.

2. Passing visualization using D3.js

3. Dan also has some interesting visualization work in progress. There is a cool video in the link showing ball movement.

Network passing diagrams by @DevinPleuler

Bolton – http://t.co/mcRQ0oHU

Man City – http://t.co/6mtGgJQS

Extracting data from XML

There have been some questions regarding this and some folks have come up with solutions

1. If you have MS Excel 2007 or a later version, you can open the XML file directly. The only issue is that the XML is nested and Excel converts it into a very flat format, so you will see multiple rows for the same event. For example, a successful pass has multiple rows indicating the direction and the x,y coordinates of where it was passed to. Read the data spec thoroughly to understand how the data is formatted in the XML; it will help you understand the data much better.

2. Code for R users to extract the F-24 XML by @MarchiMax

3. Code snippets from @JBrisson to extract events from the F-24 XML

4. If you are into programming, most languages have XML parsers. A simple search will get you code snippets to start with.
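For those going the programming route, the standard library parser is usually enough. Here is a minimal sketch in Python; note that the element and attribute names (`Event`, `type_id`, `x`, `y`, `outcome`) are my assumptions about the F-24 layout, so check them against the data spec:

```python
import xml.etree.ElementTree as ET

# A tiny inline sample standing in for an F-24-style file. The element
# and attribute names here are assumptions -- verify against the spec.
SAMPLE = """
<Games>
  <Game id="1">
    <Event id="10" type_id="1" x="45.2" y="60.1" outcome="1"/>
    <Event id="11" type_id="1" x="70.0" y="30.5" outcome="0"/>
  </Game>
</Games>
"""

def extract_events(xml_text):
    """Return each <Event> as a flat dict of its attributes."""
    root = ET.fromstring(xml_text)
    return [dict(ev.attrib) for ev in root.iter("Event")]

events = extract_events(SAMPLE)
print(events[0]["type_id"])  # "1"
```

For a real file you would use `ET.parse(path).getroot()` instead of `fromstring`, and the flat dicts can then be written out to CSV or loaded into a dataframe.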

If I missed any links, please let me know via Twitter or comment on the blog post. Always use the #MCFCAnalytics tag on Twitter so I can pick them up easily!

Follow-up analysis: Final third passing and Goals scored per game


This is a follow-up to my post regarding the strong correlation between passes completed in the final third and goals scored.

Question

Is there a correlation between the final third completions & goals scored at the game level?

Analysis

I investigated whether this correlation exists at the game level using the #MCFCAnalytics data set. I plotted completions in the final third vs. goals scored for Manchester City in all 38 of their English Premier League games.
Blue = Away; Orange = Home

Manchester City Goals vs. Pass completions in the final 3rd on a per game basis

Findings:

  • Linear regression had an R2 of 0.04, implying that there is no correlation between passes completed in the final third and goals scored at the game level.
    I did the plot for a few other teams and got similar results.
     
  • Arsenal – Away and Liverpool – Home. In both cases, Manchester City had very little success completing passes in the final 3rd. However, they lost 1-0 at the Emirates and won 3-0 at home vs. Liverpool.
    Against Liverpool, City had 6 shots on target and 2 off target.
    Against Arsenal, City had 0 shots on target and 3 off target.
  • QPR – Home and QPR – Away. City scored 3 goals each against QPR home and away. However, they had a season high 326 completed passes in the final 3rd at home vs. just 74 in the away fixture.
    Shots vs. QPR Away – 5 on target & 10 off target.
    Shots vs. QPR Home – 15 on target and 10 off target.
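To make the game-level check concrete, here is a minimal sketch of how such an R2 is computed. The per-game numbers are illustrative (only the two QPR fixtures’ pass counts come from this post), not the full 38-game data set:

```python
import numpy as np

# Illustrative per-game numbers: completed passes in the final third
# and goals scored. Only the two QPR fixtures (326 passes / 3 goals
# and 74 passes / 3 goals) come from the post; the rest are made up.
passes = np.array([120, 95, 210, 74, 150, 326, 88, 140], dtype=float)
goals = np.array([1, 2, 1, 3, 0, 3, 2, 1], dtype=float)

# Fit goals = a * passes + b by least squares, then compute R^2.
a, b = np.polyfit(passes, goals, 1)
pred = a * passes + b
ss_res = np.sum((goals - pred) ** 2)
ss_tot = np.sum((goals - goals.mean()) ** 2)
r2 = 1 - ss_res / ss_tot
print(round(r2, 2))
```

With noisy game-level data like this, R2 stays close to 0, which is exactly the pattern described above.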

The City – QPR fixture was that crazy season finale. City fell behind and threw everyone forward to go for the win and the Premier League title. QPR were a man down from the 55th minute and defended at the edge of their 18-yard box for most of the 2nd half. This explains the unusually high number of completed passes in the final third.

The above examples underline the rarity of the “goal” event. In any given game, factors like bad shooting, luck, or the opponent’s goalkeeper having a great game could influence the # of goals scored. Over a season, however, those things seem to even out.

In the next step of analysis I will add a 2nd variable to the model and analyze.

Passing in the final third and goals – EPL 2011-12 #MCFCAnalytics


Question:

Is there a correlation between passing in the final third and the goals scored?

I used the #MCFCAnalytics data set to find the answer.

Analysis

Plot of the total # of completed passes in the final third vs. goals scored for all 20 teams in the 2011-12 season of the Barclays Premier League

 Findings:

  • Linear regression had an R2 of 0.671, indicating a strong correlation between passes completed in the final third and goals scored.
    Excluding the outlier of Liverpool from the dataset, the R2 jumped to 0.827.
  • Liverpool are ranked 3rd in the # of passes completed in the final third. However, they are only ranked 15th in goals scored.
  • 75.73 – Liverpool’s expected goals scored based on the above regression. However, they managed to score only 42 goals.
    • What is the reason for the huge negative difference?
  • Swansea’s case is interesting. You may remember that “Swansealona” was a favorite term with EPL analysts and reporters last season due to their reputation for passing style and high amounts of possession. However, they are below the league average in passes completed in the final third.
  • Newcastle are ranked 18th in passes completed in the final third. However, Newcastle are ranked 7th in goals scored. Expected goals scored for Newcastle is 29.6; they managed to score 51!
  • Blackburn are ranked last in passes completed in the final third. However, Blackburn scored a lot more goals (44) than their expected goals scored (24.2).
  • Stoke are at the bottom – lowest # of goals scored and 2nd lowest # of passes completed in the final third. Not surprising given their style of play.
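The “expected goals scored” figures above come straight from the fitted regression line. Here is a sketch of that calculation using made-up team totals (arranged so that a Liverpool-like team badly under-performs the line); the real numbers come from the full 20-team data set:

```python
import numpy as np

# Illustrative team-season totals: (completed final-third passes, goals).
# All values are made up for the sketch.
data = {
    "Team A": (5200, 74), "Team B": (4800, 66), "Team C": (3900, 55),
    "Team D": (3100, 45), "Team E": (2600, 36), "Liverpool": (4700, 42),
}
x = np.array([v[0] for v in data.values()], dtype=float)
y = np.array([v[1] for v in data.values()], dtype=float)

# Fitted line: expected goals = a * passes + b.
a, b = np.polyfit(x, y, 1)
expected = {t: a * p + b for t, (p, _) in data.items()}
residual = {t: g - expected[t] for t, (_, g) in data.items()}

# A large negative residual (actual far below expected) flags an
# under-performing outlier, as Liverpool were in 2011-12.
print(min(residual, key=residual.get))
```

Dropping the outlier and refitting is how the R2 jump from 0.671 to 0.827 is obtained.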

Liverpool

I hypothesized that

  1. Liverpool might be crossing a lot, and
  2. most crosses occur in the final third (I would love to look at (X,Y) data to establish this), and
  3. poor shot quality (which might or might not be related to their propensity to cross).

Findings:

  • 1103 – Liverpool attempted the highest # of crosses + corners of all teams in 2011-12
  • 840 – Liverpool attempted the highest # of open play crosses in 2011-12
  • 19th in overall crossing efficiency (# of successful crosses + corners / (# of successful + # of unsuccessful crosses + corners))
  • 14th in open play crossing efficiency (# of successful open play crosses / (# of successful + # of unsuccessful open play crosses))
  • 18th in overall shooting efficiency (shots on target / (shots on target + shots off target + blocked shots))
  • 15th in shooting efficiency not including blocked shots (shots on target / (shots on target + shots off target))
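The efficiency ratios defined above can be written out directly. The sample counts below are illustrative, not Liverpool’s actual totals:

```python
# The efficiency ratios defined above, written as a small helper.
# Sample counts are illustrative, not a real team's totals.

def efficiency(successful, unsuccessful):
    """successful / (successful + unsuccessful), guarding against 0 attempts."""
    total = successful + unsuccessful
    return successful / total if total else 0.0

# Open play crossing efficiency: successful / all open play crosses.
open_play_eff = efficiency(160, 680)  # 840 attempts in total

# Shooting efficiency, with and without blocked shots in the denominator.
on_target, off_target, blocked = 180, 150, 90
overall_shooting = on_target / (on_target + off_target + blocked)
excl_blocked = on_target / (on_target + off_target)
print(round(open_play_eff, 3), round(overall_shooting, 3))
```

Including blocked shots always lowers the ratio, which is why a team’s rank can shift between the two shooting-efficiency definitions.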

    A glance at the top 10 open play crossers of Liverpool in 2011-12.

Player             Attempts   Efficiency
Downing            148        0.209
José Enrique       138        0.210
Henderson          72         0.125
Adam               70         0.157
Gerrard            69         0.203
Bellamy            67         0.194
Johnson            65         0.185
Kuyt               57         0.246
Suárez             47         0.149
Kelly              38         0.105
Liverpool Average             0.192
League Average                0.202

  • 2 – According to this article on EPLIndex, Liverpool scored just 2 goals from 840 open play crosses all season. That is 1 goal every 420 open play crosses.
  • 79 – The league average # of open play crosses per goal scored in the 2011-12 season. Liverpool were almost 10 times worse than Man United (44.5) and Norwich (45.1) in the open play crosses per goal category. If there ever was a stat that would (or should) regress to the mean, this is it.

Liverpool had a very talented team in 2011-12. This manifested itself in their high # of completions in the final third, where defensive pressure is highest. Once in possession in the final third, they seem to have relied heavily on crossing the ball to enable their center-forward Andy Carroll to take a shot (or header) or knock it down for the attacking midfielders and wide forwards to shoot. One big problem was that crossing is not a very efficient way of passing the ball. Another was that they did not seem to have a plan B, and it is quite possible that opponents figured out Liverpool’s crossing strategy and the lack of a plan B. The combination of these factors contributed significantly to Liverpool’s poor offensive display last season.

Newcastle United

  • 4th – Newcastle are the 4th best in shooting efficiency (goals scored / (shots on target + shots off target)). They stayed 4th even when I included blocked shots in the denominator.
    • This could be the reason why they are an outlier in the final-third completions vs. goals scored plot.
    • Manchester City, Arsenal, and Manchester United are the top 3 in shooting efficiency.

Newcastle had two great strikers in Demba Ba and Papiss Cissé, who accounted for 29 goals between them. These two were the focus of Newcastle’s attack and were very efficient with their shots. They did not need a high # of completed passes in the final third to score, as they were able to convert a higher % of their shots into goals.

Blackburn Rovers

  • 7th – Blackburn are the 7th best in shooting efficiency inside the box (goals scored from inside the box / (shots on target inside the box + shots off target inside the box)).
  • Yakubu scored 17 goals for Blackburn and has the 2nd best goals-to-shots ratio among all forwards who scored more than 10 goals.
    • This could be one of the reasons for their big positive differential between actual goals scored (44) and expected goals scored (24.2).

Summary

The # of successful passes in the final third has a strong correlation with goals scored.

The final third is a “high-value” area for scoring goals. More completions in the final third mean a team is spending more time in that high-value area. This translates into more opportunities to take a shot, or to draw errors from defenders and win set pieces from close range, which further increases scoring opportunities.

A high number of completions in the final third alone might not guarantee goals; Liverpool and Newcastle, two examples from the two extremes of the outlier spectrum, are cases in point. However, it is one of the key contributing factors to scoring goals. The fact that R2 jumped from 0.671 to 0.827 when Liverpool’s data was excluded strengthens the case.

All future posts on Onfooty.com


Some of you might know this already. All my future posts will be published on On Football.

The objectives remain unchanged. A visual and a data-driven view of all things football.

Here are my first two posts on Onfooty.com

Agents in Football – Focus on EPL

Putting Manchester City’s spending into perspective

 

Follow us on Twitter at @AnalyseFooty and Sarah on @Onfooty

 

“Manchester United do ‘it’ to teams every year” Really?


It was a few weeks ago. The Sunday EPL games had just ended, and Manchester United had opened up a 5-point lead over local rivals Manchester City.

Fans, journalists, some TV announcers and even some stats geeks on my Twitter timeline seemed to be saying the same thing.

“Manchester United do it every year to their rivals around this time of the year.”

“It” means a surge to win the title coming from behind. I got curious. Even assuming that “doing it every year” is probably an exaggeration of “the majority of the time”, I couldn’t quite believe it. So I looked at some data.

Hypothesis:

We tend to remember events better than numbers, and some events are more memorable than others.
I hypothesized that this might be a case of selective memory due to the dramatic nature of a few events, like Man United’s comeback against Bayern München in 1999.

I analyzed the Premier League tables from the inaugural season in 92-93 through 2011-12.

As I had expected, the data painted a different picture.

Methodology:

1. Look at the top 4 of the standings for every season at the end of January, February, March, April, and May (end of the season).
2. Plot the points differential between the leader and the rest.
3. Look at the # of times the lead changed hands in the seasons that Manchester United won the title (from the end of January to May)
4. Look at seasons where Manchester United led early on but did not go on to win the title.

Assumptions:

1. Ignore teams below 4th place, to reduce noise. In 2003-04 there were 4 different teams in 4th place at the end of each month; I ignored the 4th place for that season as well.
2. Plot only point totals at the end of the last 5 months, to reduce noise. Deeper analysis (on a week-to-week basis) of seasons with close title run-ins will be done as a follow-up.
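Step 3 (counting lead changes) can be sketched as follows; the checkpoint leaders here are illustrative, not taken from an actual season table:

```python
# A sketch of step 3: count how often the league lead changed hands,
# given the leader at each month-end checkpoint. Leaders are illustrative.

def lead_changes(leaders):
    """Number of times the team at the top changes between checkpoints."""
    return sum(1 for prev, cur in zip(leaders, leaders[1:]) if prev != cur)

# e.g. a 98-99-style run-in: United give up the lead once, then retake it.
checkpoints = ["Man Utd", "Man Utd", "Arsenal", "Man Utd", "Man Utd"]
print(lead_changes(checkpoints))  # 2
```

A count of 0 over the January-to-May checkpoints corresponds to the “led end-to-end” seasons in the tables below.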

Observations:

1. Manchester United are a great champion, winning 12 of 19 titles, but in quite a few cases they had comfortable leads from the end of January through May (see images below)

Season    Lead changes after January
1993-94   0 – Led from week #4
1996-97   0 – Led from week #23
1999-00   0 – Led from week #21
2000-01   0 – Led from week #10
2006-07   0 – Led from week #7
2008-09   0 – Led from week #20
2010-11   0 – Led from week #15

2. There were 5 seasons where they made a title push, coming from behind to win the title.

  • 1992-93: In the inaugural Premier League season they took the lead in mid-March and led the rest of the way to the title.
  • 95-96: Newcastle led the table into mid-March, but United overtook them and went on to win the title.

  • 98-99: The closest run-in of all. They led from January through mid-March, gave up the lead briefly to Arsenal, but pipped them at the end by a point. If you want to talk about late comebacks, this has got to be the poster child, although they did wobble a bit towards the end.
  • 02-03: Came back from 3rd at the end of January to overtake Arsenal in mid-March.
  • 07-08: An interesting chart. They were level on points with Arsenal at the end of January and took over the lead in February. Chelsea chased them down in April, but Man United prevailed by 2 points in the end. Not a major comeback in my book, just playing it cool with the lead.
  • 11-12: The jury is still out on the current season.

3. They lost 4 titles after leading the table post-January.

  • 97-98: United led from January through mid-April, when they lost the lead to Arsenal. Arsenal’s rise is slightly exaggerated in the chart, as they had 3 games in hand at the end of February.
  • 01-02: The bottom literally fell out for United in late March.
  • 03-04: After leading at the end of January, United were never in it. The “Invincibles” season for Arsenal. I don’t show a 4th place team here because there were 4 different teams in 4th and the graph was getting busy.

  • 09-10: A great title race with Chelsea, but Man United came up short by a point.

Other seasons

  • 04-05: They were never in contention

  • 05-06: They were never in contention
  • 94-95: Blackburn prevailed despite United running them close

Conclusion:

It is clear that they don’t always come from behind; not even close.

  • There were 5 instances where they came from behind between the end of January & May to go on to win the title (not counting the current season).
  • There were 4 instances where they lost a lead between the end of January & May and went on to lose the title.
  • There were 7 instances where they led end-to-end from the end of January through May.

In all, there were 12 seasons in which they fell behind at some point between the end of January and May.

Titles won after trailing: 5/12 = 41.67%

Titles lost after leading: 4/12 = 33.33%

Never led: 3/12 = 25%

All data from http://www.premierleague.com
You may follow me on Twitter at @AnalyseFooty & @aupasubmarino

Charts for seasons where they led from end of January through May

  • 93-94

  • 96-97

  • 99-00

  • 00-01

  • 06-07

  • 08-09

  • 10-11