We are past the halfway point in Major League Soccer this year and, if you recall from this previous article, I promised I would revisit my Expected Wins analysis at about this stage.
To continue charting the progress of PWP, and to include the data points behind the calculations, I am offering up some diagrams on what the data looks like after:
- The 92 game mark of the MLS Regular Season (184 events).
- The 183 game mark of the MLS Regular Season (366 events).
- The same data points for World Cup 2014 (128 events).
For background details on Possession with Purpose click here.
A reminder of how things looked after 184 Events (92 Games)…
Trends indicated that winning teams passed the ball more, completed more passes, penetrated the final third slightly less but completed more of their pass attempts in the final third.
As for shooting: winning teams took slightly fewer shots by volume but were far more successful in putting those shots on goal and scoring goals.
For details you can enlarge the diagram and look for your specific area of interest.
As for how the trends show after 366 Events (183 Games)…
Winning teams now average fewer pass attempts and complete slightly fewer passes.
There is a marked decrease in pass attempts into the opposing final third and slightly fewer passes completed within the final third.
In other words – teams are counter-attacking more, playing a ‘mistake-driven’ style, as opposed to positive attacking leading into the opponent’s final third.
As for shooting: winning teams are now taking more shots, with more of those shots on goal and more of them resulting in goals scored.
In my opinion, what is happening is that teams are taking advantage of poor passing accuracy to create turnovers.
In turn, those turnovers are generating cleaner, clearer shots given opponents’ poor positional play in transition.
My expectation is that more teams will now begin to focus on bringing in newer players that have better recovery skills and can defend better.
In contrast, here’s how these same data points look after completion of the World Cup of 2014… there is a difference…
Winning teams average more passes attempted and far more completions than losing teams.
In addition winning teams penetrated far more frequently than losing teams, and that increase in penetration also translated to an increase in passes completed within the final third.
With respect to shooting; winning teams shot more, put more shots on goal, and scored far more goals.
Clearly what we see here is that quality in player skill levels also translated to an increase in quantity.
That should become even more apparent in looking at the PWP outputs for MLS and World Cup Teams…
Here they are for MLS at the 184 Events point this year:
A quick review of the data outputs shows winning teams averaged 51% possession and were two percentage points better in overall passing accuracy.
That passing accuracy advantage also carried into the final third, but when taking shots, losing teams averaged more shots per penetration than winning teams.
Bottom line here: winning teams’ fewer shots generated more shots on goal and more goals scored than losing teams’.
After the 366 Event point this is how those same outputs look…
Like the indicators in the PWP data points, the percentages here are beginning to reflect the counter-attacking style of football taking over as the norm.
Winning teams now, on average, possess the ball less than their opponents… wow… mistake driven football is taking hold across the MLS.
As for Passing accuracy within and outside the final third…
Winning teams continue to be better in passing – and that level of accuracy is driving a large increase in shots taken, per penetration, by winning teams compared to losing teams (almost 2% different).
That is a marked difference (4% swing), from earlier, where losing teams shot more frequently, per penetration, than winning teams.
In addition that increase in shots taken, per penetration, also results in more shots on goal, per shot taken, and more goals scored, per shot on goal.
The margin between winning and losing teams for goals scored versus shots on goal, at both the 184 and 366 Event points, still remains > 29%.
So how about teams in the World Cup???
Like earlier, winning teams not only passed the ball more frequently, they possessed the ball more – by nearly 5% (52.56% to 47.89%).
So, contrary to what others might think, tiki-taka is not dead; it’s just been transformed a wee bit…
With respect to passing accuracy…
I’m not sure it can be any clearer than this – winning teams averaged 82.40% and losing teams averaged 80.46%.
What makes these outputs different from MLS is that the level of execution is far higher in passing accuracy; by as much as 6%.
To put that in perspective: if a team attempts 500 passes, the MLS average yields 380 passes completed – compared to 412 passes completed by World Cup teams; clearly the level of execution is much higher.
That difference of 32 passes completed can have a huge impact when penetrating and creating opportunities within the final third.
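That pass-completion arithmetic can be sketched quickly – a minimal illustration where the ~76% MLS rate is implied by the ~6% gap cited above, and both rates are rounded:

```python
# Completed passes from 500 attempts at each competition's average accuracy.
# Rates per the text: ~82.4% for World Cup winners, ~76% implied for MLS.
attempts = 500
mls_rate, wc_rate = 0.76, 0.824

mls_completed = round(attempts * mls_rate)  # 380
wc_completed = round(attempts * wc_rate)    # 412

print(mls_completed, wc_completed, wc_completed - mls_completed)  # 380 412 32
```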
What makes it even tougher is that the quality of defenders is significantly higher at the World Cup level as well.
With respect to penetration and creation within the final third…
World Cup winning teams averaged 2% greater penetration per possession than winning teams in the MLS.
By contrast World Cup winning teams generated fewer shots taken per penetration than those in the MLS.
Does this speak to better defending? I think so…
What I think is happening is that quality gets the team into the box, but then the quality of the defenders and goal keepers, in that confined space, is taking over.
This should be evident, even more so, when seeing that winning teams in the World Cup also put fewer shots on goal per shot taken than winning teams in MLS.
And that also translated to goals scored: winning teams in the World Cup scored fewer goals per shot on goal as well…
All told, winning teams in the World Cup displayed slightly different average percentages than winning teams in MLS, with one exception – passing accuracy.
And given the importance of the tournament it’s no wonder…
Without having the data yet, I’d expect that for the better teams in the EPL, Bundesliga, and other top European leagues, that difference in passing accuracy would remain.
As for the difference in possession (winning teams clearly possessing the ball more than losing teams) I’m not sure – mistake-driven football, if memory serves, is an approach Chelsea have used in the past…
I’d imagine it’s a pendulum type effect – as more teams work towards mistake driven football more teams will strengthen their ability to recover and open the game up a bit with direct attack to force the opponent from pressing so high.
I’ll be looking for additional trends as the year progresses to see if direct play increases – perhaps a good indicator of that might be even fewer penetrations and more crossing?
With respect to statistical relevance of the data and the outputs generated…
In every case the relationships created, be they Exponential or 4th Order Polynomial, had correlations that exceeded .95.
In other words the variations are minimal and should really reinforce just how tight the difference is between winning and losing in a game of soccer…
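As a sketch of what that check involves, here is how an R² could be computed for an exponential trend in pure Python – the data points below are hypothetical, not the article’s actual averages:

```python
import math

def exp_fit_r2(xs, ys):
    """Least-squares fit of y = a*exp(b*x) via log-linear regression,
    returning the R^2 of the fitted curve against the observations."""
    n = len(xs)
    logs = [math.log(y) for y in ys]
    mx = sum(xs) / n
    ml = sum(logs) / n
    b = sum((x - mx) * (l - ml) for x, l in zip(xs, logs)) / \
        sum((x - mx) ** 2 for x in xs)
    a = math.exp(ml - b * mx)
    fitted = [a * math.exp(b * x) for x in xs]
    my = sum(ys) / n
    ss_res = sum((y - f) ** 2 for y, f in zip(ys, fitted))
    ss_tot = sum((y - my) ** 2 for y in ys)
    return 1 - ss_res / ss_tot

# Hypothetical per-category averages (not the article's real data):
r2 = exp_fit_r2([1, 2, 3, 4, 5], [2.1, 4.2, 8.0, 16.5, 31.9])
print(round(r2, 3))
```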
COPYRIGHT, All Rights Reserved – PWP, Trademark
Like the La Liga article this week I’ll be taking a look at some mid-table maidens and muppets.
And yes, I’m breaking down this week and offering up some analysis on Manchester United.
I’ve delayed long enough, I guess, so I’ll take a peek at them along with Tottenham, Crystal Palace, West Brom, Stoke City, and Leicester City; all of them on eight points each, running from 7th to 12th in the league table.
As usual – to start things my Possession with Purpose Composite PWP Strategic Index through Week 6:
In picking out those six teams, Man United lead the group in CPWP (5th best overall); followed by Spurs (9th), Stoke (10th), Leicester City (13th), West Brom (14th), and Crystal Palace (16th); not bunched up like in the league table.
Perhaps there might be some telling team performance indicators in APWP or DPWP that really separate these teams?
The best way to start is to peel back all these teams in APWP:
The obvious – Man United sit 5th best, Leicester City are perhaps a surprise at 9th best (lest we forget that smashing pumpkin they delivered at Man United’s door two weeks ago), Spurs 11th best (or 10th worst), Crystal Palace 13th best, Stoke 15th best and West Brom 4th worst.
Here’s the six teams in focus plus two balancing agents – Chelsea and Burnley – the top and bottom of the EPL heap…
I could spend the better part of 800 words going over what’s offered here – I’d prefer not to and just point out a few bits and pieces before another diagram on Attacking.
- Leicester City (blue bars) have one hell of a great parabolic relationship (follows the white dashed parabola of Chelsea going on) – and Man United do as well. Not quite as pronounced as Chelsea but the pattern of attack is similar in team outputs.
- The difference there with Leicester City is obviously quality – less means less for the most part in the EPL – but all things considered not a bad form for Leicester.
- In considering Man United – plenty of patience (like Chelsea) but the finishing is getting in the way – perhaps Wayne Rooney is not the striker this team needs?
- Even more worrisome for Man United should be that they’ve played no-one of great concern in the EPL yet – they’ve got Everton next weekend then a potential break with West Brom (but maybe not?) then they have Chelsea and Man City back to back…
- When looking at the pear-shaped teams it’s West Brom, Spurs, and Stoke City who best follow the pattern (black dashes) set by Burnley.
- Crystal Palace look to follow the Chelsea parabola but appear to lack goals scored relative to the percentage of shots on goal – perhaps attributed to missing the near or far post? Still not bad form inside the 18 yard box.
- Those who chart Expected Goals will know that better than I.
In moving on to my Expected Wins Diagram; here’s the same teams viewing how those percentages of success translate to overall volume:
I’ve taken the liberty of highlighting Chelsea in light yellow while highlighting Burnley in light orange.
A few items of note here without 800 odd words of observations:
- Recall I mentioned that Crystal Palace was a bit lacking in goal scoring percentage compared to shots on goal – well in looking at all these teams, Crystal Palace average the lowest volume of activity in all these categories until – until – you get to Shots on Goal and Goals Scored… pretty remarkable and perhaps a great example of how an effective attacking performance plays out, statistically, for a team that plays more towards a counter-attacking style than a possession based style.
- I don’t offer Crystal Palace as being more direct given their lower volume of passes attempted in the Final Third – if their numbers were near Stoke City then I might.
- Note that Man United exceed all the others in this scrum by a good margin with one exception – Stoke City, who have considerably less volume in passing but end up with a higher volume of shots taken.
- In considering Stoke – note the drop-off in shots on goal and goals scored… even though they have the largest volume of shots taken for these teams.
- Perhaps this is another great example of a team that looks to play slightly more direct, has less patience on the ball, and as a result, their overall productivity takes a nose-dive when it comes to scoring goals?
- Oh – had to change the color for Chelsea to light blue given the white background…
I had a request earlier this week to offer up my Expected Wins diagram using a Logarithmic scale – as such I’ve included one below:
The highlighted areas remain the same – but with this approach you can clearly see the negative outcomes for Stoke City and Burnley – while also seeing that the overall data collection points do have a relationship.
The healthy one is clearly the light blue bar for Chelsea – and as noted in Expected Wins 3 – this league works off of volume with the exception of Final Third Passes Attempted… losing teams (now) attempt more passes into the Final Third – pretty much reinforcing that Direct Play just isn’t good enough to cut it in the EPL.
Moving on to Defending PWP:
Man United, Stoke, West Brom and Spurs all fall above mid-table while Leicester City and Crystal Palace are near the bottom; again they don’t really bunch up in defending team performance like they do in results.
In looking at the diagram below it’s a wonder Stoke City are as high up as they are – I’ll offer up where Stoke gets hit worst a bit later – for now notice that I’ve replaced Chelsea and Burnley with Southampton and QPR:
Measuring defending statistics is always hard to do because I have to intuit what doesn’t happen on the pitch; given the lack of clarity in separating passes and shots between those that are hindered and those that are open… more here on that if interested.
For now the juice in 800 words or less:
- A bad sign for me in how effective a team is, in defending their 18 yard box, is when the opponent percentage of goals scored, per shots on goal, exceeds the percentage of shots on goal, per shots taken.
- The team who best represents a lower percentage of goals scored per shots on goal than shots on goal per shots taken is Southampton – currently in second place; the White dotted line.
- At this stage their differential is 19.15% – second best is West Ham at 11.71% and third best is Swansea City at 9.22%.
- Of all the teams in this focus Man United has the best differential (+2.23%).
- The worst of the lot is Stoke City; a differential of -21.87%; the largest margin by far… either they need a new Goal Keeper or they need better fullbacks and center-backs…
- What keeps them on the higher end of the DPWP is lower percentages for their opponent in possession and shots on goal per shots taken – so they do a great job of preventing shots taken from reaching goal – but when shots do reach goal they are high quality shots… I’d attribute this to poor positional play in the 18 yard box and perhaps goals conceded on the counter-attack.
- Either that or their Goal Keeper simply isn’t that good?
- As far as penetration goes, we already see Crystal Palace yield possession and space in the midfield – as do West Brom and, for the most part, Leicester City.
- With higher opponent percentages in possession – coupled with a strong passing league, it’s no wonder when the defense breaks down in the 18 yard box those teams are going to be slightly less effective than someone like Southampton.
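The differential used in those bullets is a simple two-rate subtraction; here is a sketch with hypothetical opponent totals (not any team’s actual data):

```python
def defending_differential(shots_taken, shots_on_goal, goals):
    """Opponent shots-on-goal per shot taken minus opponent goals per
    shot on goal, both in percent. Positive means the defense inside
    the 18-yard box holds up relative to overall shot prevention."""
    sog_per_shot = shots_on_goal / shots_taken * 100
    goals_per_sog = goals / shots_on_goal * 100
    return sog_per_shot - goals_per_sog

# Hypothetical opponent totals over a stretch of games:
print(round(defending_differential(100, 50, 10), 2))  # 30.0
```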
It should be noted that only Crystal Palace and Leicester City are on the lower end of DPWP – so these teams can score and at this stage it’s their attack that is pushing them to mid-table – can that hold?
Hard to say – one thing is, neither of those teams is as pear-shaped as Newcastle…
Still early days yet but teams are showing tactical trends, seen before in PWP analysis, that separate the possession based teams with those who like to play counterattack or more direct.
Survival of the fittest couldn’t be more clear in this superb league… speaking of Newcastle; how on earth are they so low in the Table?
More to follow on that question in a couple of weeks.
Follow on twitter @chrisgluckpwp
Being mid-table – a glass half-full – or a glass half-empty?
Still just six weeks in, but there are trends that can be offered with six games, so for this week’s focus I’ll look in on Rayo Vallecano, Almeria (who I looked at in Week 3 also), and Granada.
Respectively those teams are 9th, 10th, and 11th in the League Table; all with eight points.
To get started here’s my traditional Possession with Purpose CPWP Strategic Index after Week 6:
First off – for those keeping track, the correlation (R²) of La Liga CPWP to average points in the league table, after Week 6, is .79.
The three focus teams (Rayo, Almeria, and Granada) are not bunched up at 9th, 10th and 11th here; they are spread out – Rayo is 4th best in CPWP, Almeria is 10th best, and Granada is 18th best (3rd worst) – quite a distinct difference in team performance though the points remain the same.
In peeling those three teams back I’ll begin with APWP:
For the leading side of APWP we have Rayo in 5th, Almeria in 13th, and Granada in 16th…
At a surface glance – without reviewing the data, and using just goals for and goals against – my first thought is that Rayo are doing a good job of penetrating, creating, and scoring goals in comparison to the other two.
While at the same time they are also giving up goals as fast as they get them… Rayo (10 for – 10 against) – Almeria (5 for – 5 against) – Granada (4 for – 9 against)* (* more later on the asterisk).
So what do the internal team performance statistics offer for these three teams?
- In taking a look at some standard statistics Rayo lead those three with an average passing accuracy of 77.12%; while Almeria is 74.47% and Granada is 73.88%.
- With respect to penetration – Almeria lead those three in penetrating the opponent’s final third (~27% of the time they control the ball they penetrate) – while both Rayo and Granada hover around 17.5%.
- Given that Almeria’s average possession percentage is ~47%; compared to 58% for Rayo and 41.5% for Granada I’d offer the more successful team in playing counter-attacking soccer is Almeria – while the more patient team in penetrating is Rayo and the least effective attacking team is Granada.
- A difference maker, after considering the tactical and penetration characteristics, is how successful each team is in generating shots from penetration, as well as how effective they are in putting the ball into the back of the net.
- Rayo leads the three teams by a slim margin in shots taken per penetration (19%) – with the other two hovering at ~18%.
Not much difference in terms of overall success but in looking at the volume of shots both Rayo and Almeria average 12 per game while Granada average just 7 per game.
- Meaning 19% and 18% equals 12 shots taken per game for Rayo and Almeria while 18% yields just seven shots per game for Granada; not ideal – especially when we know “more is better” in La Liga…
- If you have read this article (Expected Wins 3) you’ll know this to be true for La Liga, while it is not true for other European Leagues I evaluate, at this time.
- So how do the shots taken translate to shots on goal? Almeria average the most shots on goal (4.17) versus Rayo at (3.5) and Granada (2.0).
- As with many successful counter-attacking teams – sometimes fewer shots taken generate more shots on goal given the poor position some possession-based teams find themselves in when turning the ball over in the wrong place.
In wrapping up – greater possession percentage and higher passing accuracy don’t drive overall success for Rayo in comparison to Almeria – who possess the ball less and have a lower passing accuracy.
- I wonder what the Midfielder Player Radars that StatsBomb develop look like for Rayo compared to Almeria?
- The November 29th match-up between these two teams should provide a great contrast in attacking style – and perhaps one worth watching for teams scouting the success or failure of counter-attacking teams versus possession-based teams that aren’t as dominant in $$ and skills as a team like Barcelona.
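The volume arithmetic in those bullets works out as a chain of rates; here is a sketch where possessions per game is a hypothetical input chosen so the rates from the text reproduce the per-game shot figures (~12 for Rayo, ~7 for Granada):

```python
def shots_per_game(possessions, penetration_rate, shots_per_penetration):
    """Expected shots per game from possession volume and the two PWP rates."""
    return possessions * penetration_rate * shots_per_penetration

# Hypothetical possession volumes; the penetration and shot rates are
# the percentages quoted in the text (~17.5% penetration, 18-19% shots):
rayo = shots_per_game(365, 0.175, 0.19)      # ~12.1
granada = shots_per_game(222, 0.175, 0.18)   # ~7.0
print(round(rayo, 1), round(granada, 1))
```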
That’s only one-half of a game though – and for those who think defense first – attacking team performance is the less influential half. So how do these three teams compare in DPWP?
First off – I’ve altered the “y” axis scale to reinforce how much of a difference Barcelona has with the rest of La Liga when it comes to possession-based tactics.
Clearly Barcelona not only possess with the intent to score they also possess with the intent to defend… for me this is a great example where – if the opponent doesn’t have the ball they can’t score…
Now for Rayo, Almeria, and Granada; Rayo is 6th best in DPWP, while Almeria is 8th best and Granada is 18th best (3rd worst).
- * The “more later” on Granada: at first glance I’d offer Granada have been far luckier in garnering their eight points than Almeria or Rayo – but – Granada just got beaten by Barcelona six – nil.
- Remove that result and Goals Against is three instead of nine – for a +1 Goal Differential.
- So where would Granada be in DPWP without playing Barcelona?
- Granada would be 9th in overall DPWP if they hadn’t already played Barcelona!
- Further up the DPWP than Almeria and only one place behind Rayo… a GREAT example of how playing just one team – like Barcelona – can impact this Index so early in the season!
- It is what it is… and while it may be fair to eliminate the Granada game against Barcelona (to compare apples with apples) I won’t… everyone has to play Barcelona twice.
- If the positive play of Granada continues, exclusive of Barcelona, then that will show up later on this year.
- If it doesn’t, then perhaps this is an early signal that Granada are on a down slide?
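The Barcelona adjustment above is simple subtraction over the goal totals in the text (DPWP itself involves more inputs; this only reproduces the goals-against arithmetic):

```python
# Granada's goals for/against over six games, per the text.
goals_for, goals_against, games = 4, 9, 6

# Remove the 6-0 Barcelona loss from the ledger:
adj_against = goals_against - 6      # 3
adj_diff = goals_for - adj_against   # +1 goal differential
print(adj_against, adj_diff)
```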
However viewed; here’s some takeaways for these three teams, in defending team performance after six weeks:
- Opponent possession will be just the opposite of attacking possession – in other words, Granada’s opponents will possess the ball more than either Rayo’s or Almeria’s.
- And even when removing the Barcelona game against Granada, their opponents’ average possession is ~56% per game – still higher than Rayo’s opponents (42%) and Almeria’s (52%).
- With respect to penetration, Granada’s opponents penetrate at ~28% while Almeria’s and Rayo’s opponents gain entry ~24% – the takeaway here indicates that Granada play slightly deeper than both Almeria and Rayo.
- The difference isn’t that simple though – Almeria are a counter-attacking team, given other indications, so it’s likely the opponents’ 24% is more associated with the tactic of allowing penetration – whereas with Rayo, a possession-based team, it’s likely the opponent is gaining penetration through mistakes in defending (not getting behind the ball), and those initial mistakes lead to more goals scored.
To test that, let’s take a look at Shots Taken, Shots on Goal, and Goals Scored for the opponents of Rayo and Almeria.
- Indeed – Rayo’s opponents generate more shots taken per penetration (21.64%) than Almeria’s (20.44%), yet that greater percentage sees Rayo actually facing fewer shots taken (10.83 to 13.67) and fewer shots on goal (4.00 to 4.17), yet more goals scored against per game (1.67 versus Almeria’s .83).
- Those Radar Charts might support this, but might not – the funny thing about defensive statistics is that the sum of individual defensive statistics never quite matches up, one-for-one, with the volume of unsuccessful passes by an opponent – see here…
- To quantify a bit differently – Almeria’s opponents average 72 successful passes, per game, in the Almeria Defending Final Third – whereas Rayo’s opponents average 55 successful passes, per game, in the Rayo Defending Final Third.
- Lower volume, fewer shots faced, more goals scored against – a pattern I’ve seen in the MLS this year with teams like Portland and New York – teams that (when watching them play) exhibit the habits of teams who make defensive mistakes based upon poor positional play.
- With respect to Granada – they not only face a much higher volume of opponent passes in their own Defending Final Third (115 per game) than Rayo, they also yield just 1.5 goals against per game…
- So again, another team with greater activity in their own Defending Final Third does a much better job of not ceding goals against.
If I had to offer an opinion here I’d suggest that in order for Rayo to continue to have a successful year they need to 1) get behind the ball a bit quicker, and perhaps 2) bring in better defensive-minded midfielders to work with some (upgraded?) defenders in the back four.
With respect to Almeria and Granada – finding the right balance between attacking and defending is always hard – it looks to me as if both teams have a pretty good balance but could (perhaps?) add a highly skilled midfielder, with superb vision, to try and eke out that odd goal without generating undue risk on the defending side of the pitch…
Notable for all three is that only Granada have played Barcelona – Rayo face Barcelona next week while Almeria don’t play Barcelona until November 8th.
If you’ve read these two previous articles, Expected Wins and Expected Wins 2, you know I look at how teams perform, on average, (win, lose, or draw) with respect to my primary data collection points for Possession with Purpose.
What will be added, in Version 3 (V3), will be a compare and contrast between all the leagues I evaluate in my Family of Indices.
Results of looking at the diagrams and reading through my observations should help clarify that pass-sequence analyses like (ABAC, ABCB) don’t really have relevance to teams that win, lose or draw – at least not this year. (Note – two links – two different sites published roughly the same analysis)…
Don’t get me wrong – I’m not taking a personal dig at the grueling work associated with the analyses.
It has great value, but more from a tactical viewpoint in how passing is executed – not from a (bell curve) view of passing volume and success rate, relative to possession and penetration into the Final Third, that helps a team create and generate shots taken leading to goals scored; or… when flipped, leading to goals not scored.
And as pointed out by (shomas) on the article that surfaced on MIT – if anything, it adds predictability to what a team will do – and the more predictable a team, the more easily the opponent can defend against them…
For me – I would have thought the GREATER the variation in that cycle (ABAC, etc…) the better… others may view that differently?
In addition, I think there could be more value, to the information, if it was segregated by league – more later on that…
To begin – here’s a reminder of what Expected Wins looked like in Major League Soccer after 92 games (184 events):
The term ‘event’ is used, as opposed to game, to clarify that each team’s attacking data is included in this analysis – and that the greater the volume of data points the stronger the overall statistical analysis; i.e. sampling 15 data-stream points is not the same as sampling 1000 data-stream points.
Biggest takeaway here is the strength of correlation these seven data points have to each other (i.e. their representation – in my opinion – of the primary bell curve of activities that occur in a game of soccer)…
In every case, in every diagram that follows, all the Exponential trends exceed .947; and in every case the relationship for the winning teams is higher than the relationship for losing teams… speaking to consistency of purpose and lower variation in my view.
In general terms, this is my statistical way of showing that a goal scored is tantamount to a 5th or 6th standard deviation to the right of the normal bell curve of activities that occur in a game of soccer.
Said another way – I don’t evaluate the tail – when measuring the dog’s motion – I evaluate the dog; recognizing that the tail will follow, to some degree, what the motion of the dog will be… and… that even if the motion of the dog is somewhat different, the tail will normally behave in the same way.
Therefore, it’s not the tail that should be analyzed – it’s the dog… others may view that differently.
Here’s the same diagram for the MLS after 366 events:
Oh… the green shaded areas are meant to show those data points that are higher for those particular categories; in other words the Volume of Shots Taken for winning teams (after 366 events) was higher than that of losing teams – but the volume of passes completed in the Final Third was higher for losing teams than winning teams… more on that later.
Here’s the diagram after 544 events in MLS:
Note the shift – only the volume of Final Third Passes Attempted is now higher for losing teams – all other data categories see the winning teams with greater volume.
For me, what this reinforces is the issue of time and space as well as patience – three statistics never measured in soccer (publicly at least)… again, reinforcing, for me, that shot location only has value relative to the time, space, and patience of the team in creating that time and space for that shot.
Statistically speaking, what that means to me is that Expected Goals – a very popular (and worthy) statistical calculation – needs to be refined if it’s to have greater value as a predictive tool/model… I’d be interested to hear/read the views of those who work on Expected Goals efforts…
Now here’s the European Leagues I’ve added to my PWP Family of Indices analyses; first up the English Premier League:
Note that the pattern, here, after 100 events, resembles the same pattern for MLS after 544 events… worthy.
Moving on to the Bundesliga:
A pattern similar to MLS after 366 events; will this pattern morph into something different as the league continues? Possibly – the MLS pattern has changed so perhaps this one will too?
Now for La Liga:
A completely new pattern has taken shape – here “volume” speaks volumes!
Is this unique? Nope… It also happens to be the same pattern as the World Cup 2014 pattern – below:
Will that pattern show itself in the UEFA Champions League? I don’t know but we’ll find out…
So what’s it all mean? The “so-what”?
Before attempting to answer that, here’s two different diagrams plotting these data points for winners and losers (in reverse order) for the leagues I evaluate:
Now the grist:
The red shaded areas are where the losing teams’ average exceeds the winning teams’ average in the volume of those activities – the green shaded areas are highlighted for effect. Green shaded areas for the volume of Shots on Goal and Goals Scored indicate that those numbers are virtually the same, for winning teams, in all the activities measured…
Now, back to the so-what and what it all means…
For me this reinforces that the “pattern” of passing (ABAC, ABCB, etc…) that gets you into the Final Third has no relevance to the volume of Goals Scored.
And, it also reinforces that different motions of the ‘dog’ will generate the same tail-wagging outputs – therefore it’s the analysis of the dog’s activities that drives greater opportunities for improvement.
The averages for winners in the activities measured all behave somewhat differently – granted some patterns might be the same but the volumes are different.
And when volumes change, the game changes; and when the game changes, the strategic or tactical steps taken will change – but the overall target should still remain the same (on average): put at least 5-6 shots on goal and you ‘should’ score at least two goals… getting to that point remains the hard part!
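That rule of thumb is just a conversion-rate multiplication; the ~35% goals-per-shot-on-goal rate below is an assumed figure, in line with the winning-team conversion percentages discussed earlier:

```python
# Expected goals from a given number of shots on goal, at an assumed rate.
conversion = 0.35  # assumed goals per shot on goal (illustrative)
for sog in (5, 6):
    print(sog, "shots on goal ->", round(sog * conversion, 2), "expected goals")
```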
Bottom line here:
These leagues are different leagues – and the performances, of the teams, in those leagues are different when it comes to winning.
Therefore, I’d offer that a striker’s ability to score in one league is a completely different thing from the expectation an organization might have of how that striker will score in another league.
Said another way – a striker who scores 20 goals in the Bundesliga, a league that shows winning teams play to a more counter-attacking style, might not perform as well in a league like the EPL; which looks to offer that winning teams play a more possession-based style.
Perhaps??? Another good example… a striker playing for a team that counter-attacks is more likely to have greater time and space to score a goal than one playing for a possession-based team, where time and space come at a premium because the opponents play far tighter within their own 18 yard box.
But, as mentioned before – since no-one statistically measures (publicly) the amount of time and space associated with passing, and shot taking, we can’t peel that onion back further. I have suggested two new statistics that may help ‘intuit’ time and space – that article is “New Statistics? Open Shots and Open Passes”: here.
For the future… I’m interested in seeing how these analyses play out when separating out teams who show patterns of counter-attacking, and perhaps direct play, over teams that show patterns of possession-based football.
In addition, I’m also keen to see how these take shape when reversing the filter and organizing this data based upon whether or not a team is defending deeper, or more shallow.
The filter there will come from looking at the opponent averages for passing inside and outside the Final Third…
It seems reasonable to me (others may view this differently) that if a team lacks goal scoring they need to find the right midfielders and fullbacks – players good enough to create the additional time and space the strikers need in order to score more goals.
And that doesn’t even begin to address the issues in defending – which statistics continue to prove year in and year out as being more critical to winning than attacking.
Given all this information, I may have missed something – I’m always looking for questions/clarifications so please poke and prod the diagrams and analyses and comment as time permits.
COPYRIGHT, All Rights Reserved. PWP – Trademark
You can follow me on twitter @chrisgluckpwp
It’s time to offer up another revised version of my Possession with Purpose Analysis.
My intent here is to:
- Provide an update that may help simplify this effort, and
- Update new links to articles most have found to be of great interest in the last year.
To begin… Possession with Purpose (PWP):
The End State, as always this is good to know up front:
Create an objective Strategic Family of Indices, with publicly made available data, that has relevance and helps identify (explain) the strengths and weaknesses of team performance ‘outside’ the realm of Points in the League Table.
Of note; this analysis has been presented, and received with great interest, at the World Conference on Science and Soccer of 2014. So it’s not a fly-by-night attempt to offer up analysis that can’t translate back to the soccer and science industry or help inform the general, or well educated, soccer community (both here and across the pond) about Footy…
Create a Family of Indices that measure the ‘bell curve’ of strategic activities that occur in a game of football (soccer); recognizing that in order to score goals the following activities usually need to occur:
- Gain possession of the ball
- Move the ball
- Penetrate the opponent’s defending final third
- Generate a shot taken
- That ends up on target and,
- Gets past the keeper
From a statistical (measurement) standpoint those activities are organized into these six categories:
- Possession percentage
- Passing Accuracy across the Entire Pitch
- Passing Percentage within and into the Opponents Final Third compared to overall possession (i.e. = Penetration)
- Shots Taken per Percentage of Penetration
- Shots on Goal per Shots Taken
- Goals Scored per Shots on Goal
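Those six categories can be sketched as plain ratios. A hypothetical Python sketch – the field names and sample numbers are mine, and this is not the author’s copyrighted index weighting, just the raw ratios as described:

```python
# Hypothetical per-game totals for one team; all names and numbers are illustrative.
game = {
    "passes_att": 450, "passes_cmp": 350,
    "opp_passes_att": 400,
    "f3_passes_cmp": 70,          # completed within/into the final third
    "shots": 13, "shots_on_goal": 5, "goals": 2,
}

def pwp_categories(g):
    """Compute the six PWP-style categories as plain fractions."""
    possession = g["passes_att"] / (g["passes_att"] + g["opp_passes_att"])
    pass_acc = g["passes_cmp"] / g["passes_att"]
    penetration = g["f3_passes_cmp"] / g["passes_cmp"]
    shots_per_pen = g["shots"] / g["f3_passes_cmp"]
    sog_per_shot = g["shots_on_goal"] / g["shots"]
    goals_per_sog = g["goals"] / g["shots_on_goal"]
    return (possession, pass_acc, penetration,
            shots_per_pen, sog_per_shot, goals_per_sog)

print([round(x, 3) for x in pwp_categories(game)])
```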
It’s not a secret formula but I do retain Copyright.
The Family of Strategic Indices – there are three of them:
- Attacking Possession with Purpose (APWP)
- Defending Possession with Purpose (DPWP)
- Composite Possession with Purpose (CPWP)
APWP Index: How effective a team is in performing those six process steps throughout the course of a game. Example:
DPWP Index: How effective the opponent is in performing those six process steps, throughout the course of a game, against you. Example:
CPWP Index: The mathematical difference between the APWP Index and DPWP Index. Example:
Simply stated, the analysis stemming from this effort is a comparison and contrast between how a team performs (in the bell curve of these activities) relative to other teams in their league “without” including points in the league table.
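Stated as arithmetic, the composite is simply the difference of the two indices; the index values here are invented for illustration:

```python
def cpwp(apwp: float, dpwp: float) -> float:
    """Composite PWP index: attacking index minus defending index."""
    return apwp - dpwp

# Illustrative only: a strong attacking index paired with a weak (negative)
# defending index yields a strongly positive composite.
print(cpwp(0.31, -0.12))
```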
Last year the CPWP Strategic Index Correlation (relationship) to Points in the Table, for Major League Soccer, was .77; this year, at the end of Week 26, the R is .85.
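The R quoted here is a standard Pearson correlation between a team’s index value and its points. A self-contained sketch with made-up index/points pairs (not real MLS data):

```python
import math

def pearson_r(xs, ys):
    """Plain Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical CPWP index values paired with points in the table.
index_vals = [0.42, 0.31, 0.10, -0.05, -0.20]
points = [55, 40, 48, 33, 28]
print(round(pearson_r(index_vals, points), 2))
```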
In returning back to the End State:
“Create an objective Strategic Family of Indices, with publicly made available data, that has relevance and helps identify the strengths and weaknesses of team performance ‘outside’ the realm of Points in the League Table.”
Given the very high level of Correlation these Indices have, I’d say this Family of Indices has considerable statistical relevance; and I should point out that although the PWP approach is an Explanatory Model it can also be leveraged as a Predictability Model.
After speaking with a number of folks at the World Conference on Science and Soccer (2014) it was agreed that the most effective way to turn this into a Predictability Model is to remove Goals Scored (in both Indices) and ‘see’ how the Composite Index takes shape after that.
Here’s an example of what I mean:
A word or two of caution…
From a purely statistical viewpoint I do not see this as a Predictability Model that has direct relevance yet… why?
For the simple reason that there have not been 15 games played for all teams both Home and Away – teams show a tendency, for the most part, to behave slightly differently at home versus on the road…
Why the number 15? I suppose it comes down to Confidence Level in the number of samples that are needed in order to forecast the future based upon the past… with 34 games played in Major League Soccer you really need 15 games to reach that 95% Confidence Level limit in samples…
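One way to back into a number like 15 is the standard finite-population sample-size formula; the 20% margin of error below is my assumption to make the arithmetic land near 15 games of a 34-game season – the article doesn’t state which margin was used:

```python
import math

def sample_size(population: int, z: float = 1.96, p: float = 0.5,
                margin: float = 0.20) -> int:
    """Finite-population sample size at confidence z and a given margin of error."""
    n0 = (z ** 2) * p * (1 - p) / margin ** 2      # infinite-population estimate
    n = n0 / (1 + (n0 - 1) / population)           # finite-population correction
    return math.ceil(n)

# A 34-game season at 95% confidence (z = 1.96) with a 20% margin of error
# works out to roughly 15 sampled games.
print(sample_size(34))
```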
All that said, it is extremely enticing to see that even when Goals Scored (both for and against) are removed the CPWP Predictability Index still has a correlation (R) of .84…
Links to articles that have had extensive views over the last year and a way to get a taste of how PWP analyses might be able to help you, as a writer (through collaboration with me), better inform your audience about the nuance of soccer:
- Chicago Fire
- Portland Timbers
- Consistency of Purpose – Attacking Standard Deviations
- La Liga – Semana 2 (I can offer translation of my articles from English to Spanish on special request)
- Bundesliga – Bayer Leverkusen (I can offer that translation request to German as well, on special request)
- English Premier League – Chelsea
- Colorado Rapids
- LA Galaxy
- Sometimes what doesn’t happen on the pitch has more value than what does happen
- New Statistics – Open Shots – Open Passes
- FIFA World Rankings – Time for a change?
- Expected Wins
- Passing – an oddity in how it’s measured (Part I)
- Passing – an oddity in how it’s measured (Part II)
- Expected Wins 3 – My deepest dive yet into the average performance of what winning teams do in Major League Soccer, the English Premier League, Bundesliga, LaLiga, and World Cup 2014.
- My original Introduction and Explanations (detailed) to Possession with Purpose Family of Indices
- 2014 End of Season Analysis – Houston Dynamo – Dynamic Dynamo Demagnetized as Dominic Departs
- West Ham and Aston Villa – EPL– Going in two different directions
- 2014 End of Season Analysis – Chicago Fire – Candle Burned at Both Ends
- Getting Better as a Youth Soccer Coach
- The Comforts of Home in Major League Soccer
- Seattle Sounders – Road Warriors in 2014 MLS Regular Season
- Portland Timbers End of Season 2014 – Defense Wins Games & Better Defending Leads to Better Attacking
- Valencia – Formula Won – La Liga
- Getting More From Less – Peeling back the statistical differences between teams that Direct Attack versus playing a Counter-Attacking Tactic as part of a Possession-based System.
- Expected Wins 4 – Is European Football Really Higher Quality than Major League Soccer?
- Seeing Red!!! Toronto FC
- World Cup – Two Best Teams? You Bet!
- UEFA Champions League – Some Great Games Coming
- Busting the Myth of Moneyball in Soccer Statistics…
- Scintillating Saints of Southampton Stay Strong
- Hurried Passes
- Catching up with Europe (CPWP and initial discussions on TSR)
- Redefining and Modernizing TSR
- Expected Wins Five (Europe – Pucker Time is here)
- Passing – More from Less – Barcley’s Premier League after Week 30
- MLS 2015 – Control or Lack Thereof
Others in mainstream media sometimes offer up subjective opinions that may not be substantiated with objective data; I won’t do that.
Every shred of analysis offered here will include some sort of objective data to support an opinion or conclusion.
Like any other mainstream business; statistical analysis provides objective data as a tool to leverage when looking to make business decisions. It is not a substitute for the seasoned leadership needed to make final decisions.
I don’t advocate that this analysis is the ‘answer’ or the only tool that substantiates one view – in a soccer match, with 40,000 supporters in attendance, I’ve learned that those 40,000 supporters have 40,000 sets of eyes that see things differently.
On this site, this information and analyses presented, is merely my view, from my eyes, in how I see the game – hopefully, in order to make my future articles of better value, others will add their comments, thoughts, and questions.
Finally, I’m not sure how this will develop but I’ve been approached to provide a manuscript for this analytical effort – for publication in a Sports Science Journal. More to follow on how that goes.
NOTE: All data used to generate this analysis stems from OPTA through a number of open/public websites across Europe and America.
My thanks to OPTA and all those open websites for helping to facilitate my own analysis and potential improvements that may arise from this effort.
Roughly 10 days before the Kickoff to the first ever World Conference on Science and Soccer held in the United States I got a phone call from the Conference President and Coordinator Terry Favero asking me if I was interested in making a presentation on Possession with Purpose.
To say the least I was pleasantly surprised, elated and nervous all at the same time; me – just a wee blogger locked up here in the great northwest, a hotbed for soccer – being asked to offer my work on Possession with Purpose.
In short; after some discussion and clarification I said yes, and four days later had submitted this presentation for discussion.
Before showing the diagrams, my first order of business is to thank Terry Favero, and then also add my thanks to some great folks at Prozone Sports, New England Revolution, Portland Timbers and Arsenal FC for making the presentation and discussion truly superb! Wow – what a great experience.
Without further ado; let the diagrams begin…
Wrapping up the hour long presentation/discussion with a few takeaways that come to mind…
Most agreed that the critical penetration numbers to focus on were passes “within” the Final Third and not just passes that “penetrate” the Final Third.
Most agreed that crosses ‘were’ passes – though there also remains value in considering crosses separately – but they should be included in the overall analysis of passing accuracy within the Final Third.
New Soccer Statistics?
- Additional discussion centered on the potential need for a couple of new statistics – “failed assists” was one… there is value in knowing which players offer up what volume of potential goal scoring opportunities even when they fail – especially those that fail as a result of a defensive clearance.
- In that example the defender gets credit for stopping a cross but the player who offers the cross that is good enough to require a clearance gets no statistical credit for it… that may change in the near future.
- Another additional new statistic considered was the ‘penetrating pass’ statistic – where individual players who generate penetrating passes into the Final Third are recognized… it’s hard to measure vision but many agreed this may be a statistic that could help measure ‘vision’…
The slide highlighting the changes in MLS Coaches from 2013 to 2014 also piqued some interest – indeed, like last year, the cycle has begun again this year with the sacking of John Hackworth.
I’ve done two separate articles on that and won’t go into any more detail other than to say – my team performance indicators lean towards that move being one of senior leadership panic (with the lack of wins) more than anything else.
Granted wins matter – but the last slide really drives home how an organization, loyal to their Head Coach, can turn things around with minimal changes in personnel and faith in the system being used.
If you’ve been following my Possession with Purpose and Expected Wins articles/streaming research you should know by now that the data had a pretty strong correlation to the MLS League Tables last year.
So how do things look for PWP at this stage in the MLS this year?
A few thoughts….
- It’s early days but the two teams lowest in the League Table (Western and Eastern Conference) also happen to be the lowest teams in the PWP Composite Index this year.
- Caveat – the amount of data for this Index is not ideal; ideal would be how the Index begins to take shape from Week 17 onwards. I am, however, providing you this information so we can all watch how this Index takes shape for the entire year.
- As noted, last year the final Index was compelling in its relationship to the League Table; I have no idea if that will be the case this year.
- However viewed I don’t advocate that this Index represents a substitute for the League Table but those teams performing well in scoring points also seem to be those teams performing well in Possession with Purpose; or is it vice versa???
- I’ll dig into an update on my PWP approach in my next article, for now I readily acknowledge that this Index is influenced by passing accuracy – but it’s also influenced by shooting accuracy too.
- In looking at the Eastern Conference; the exception last year was Houston and it remains so again this year. This time Houston, with 14 games played, sit fourth in the League Table but 17th overall in PWP; if that 4th place is to continue I’d offer that their PWP Indices will need to improve compared to other teams.
- As for the rest: Montreal, Chicago, Toronto (with 3-5 games in hand), Philadelphia and New York are in the bottom half, while Columbus, Sporting, New England and DC United are in the top half of both.
- Bottom line – with a few exceptions the Index looks reasonable – can it be a predictor at or near the 17 game point for all teams? I’m not sure but watching this Index change from week to week is intriguing.
Given that interesting output, I decided to take a look at how teams sit in the Index relative to games played at home versus on the road.
The team that appears to be performing the best on the road, relative to their own Index ratings, is Chivas – their differential is -.39. In looking at total goals scored, they have six at home and seven on the road. Chivas have taken six points away from home and four points at home.
Chicago Fire also appear to do better on the road than at home – they have 11 goals on the road and eight goals at home. Indeed they also have taken six points away from home and six points at home.
In looking at the upper end of the Index differences, New England leads PWP in team performance at home versus on the road. Their own Index difference is .68, with 13 points at home and ten points on the road, to go with 11 home goals and 10 road goals.
Next up is San Jose at .49 – they have scored 11 goals at home and just two on the road. Taking 12 points at home and just one point on the road.
In looking at the six steps of PWP for New England (home and away) they have about the same possession (~47% each) and overall passing accuracy (72.7% each). The biggest difference comes in penetration completion; at home the Revolution complete ~33% of all their overall passes within the Final Third, while on the road that figure is ~26% – a full seven percentage points’ difference. So it appears they possess with the “intent to possess” more on the road.
In addition, the number of shots taken versus passes completed in the Final Third is ~17% at home while ~15% on the road. Again, more patience in attack on the road…
Finally, while their Shots taken versus shots on goal are slightly higher on the road (42% to 40%) their ability to score goals versus shots on goal is 33% at home versus 23% on the road. In other words they are more accurate in their goals scored at home.
As for San Jose the wide difference in goals scored at home versus on the road should be pretty obvious but in case you were wondering – in the four games San Jose have played on the road their overall penetration into the final third is 3% less than at home.
Their shots taken versus completed passes in the Final Third is 9% less, Shots on Goal versus Shots Taken is 14% less and their Goals Scored versus Shots on Goal is 19% less. In the case of San Jose it’s “less means less” in almost every category…
In considering Chivas…
To date they have played 5 games at home. At home their possession is 4% higher, passing accuracy is 6% higher, penetration is 1% higher, their shots per penetration is higher by 2% but their shots on goal per shot taken is 7% lower and their goals scored versus shots on goal is 17% lower.
In other words, at home, they appear to have more quantity in their overall passing to penetrate but they have less quality when it comes to scoring goals.
I’m not sure how this will play out for the year but at this stage the data is interesting. Is it compelling one way or the other? Hard to tell, but we don’t know what we don’t know unless we at least throw it out there to take a look…
For now I think it is compelling enough to re-look later this year on how team performance in PWP takes shape at home and away…
Hopefully most of you read Part I of my series on Expected Wins in Major League Soccer.
As a quick reminder the Expected Wins analysis is my internal data quality review on the seven data points I use to support my quantitative Possession with Purpose analysis; the stronger the correlation these data points have the more confidence I have in the overall Indices that are created to assess team performance.
For your benefit, in case you forgot, here are the seven data points I continue to analyze as we reach the 92 game point in MLS; which equals 184 events:
- Passes Attempted Entire Pitch
- Passes Completed Entire Pitch
- Passes Attempted Final Third
- Passes Completed Final Third
- Shots Taken
- Shots on Goal
- Goals Scored
All data points, at this time, have equal weight.
What is interesting is that over the week to week course of the season 40% (20/50) of the weekly top five teams, in Attacking PWP, have averaged less than 50% possession in their matches.
For me that’s pretty cool as it indicates this analysis is not really biased towards teams that use a shorter-passing scheme in attack. In Week 5, three of the five teams were under 50% possession and the other two were both under 51%.
Some of those teams are possession based teams like DC United, Portland and Seattle but in that week the margin of possession did not have as much effect as the ability of those teams to finish quality chances – the top three teams that week all scored goals equal to their shots on goal.
The five teams that week who exceeded 80% in Passing Accuracy – usually a good indicator of ground-based attacking – all finished outside the top 5.
Moving on after that tidbit, here are the averages for overall (blue bar), teams that win (green bar), teams that draw (orange bar) and teams that lose (red bar).
Facts as they exist today after 184 Events in 2014:
- The overall tenor of the data points and their relationship really hasn’t changed that much since XpW 1.
- Teams that win average 51.11% Possession; losing teams average 48.89% Possession, (lower)
- Teams that win average 76.39% in Passing Accuracy; losing teams average 74.10% (lower)
- Teams that win average 20.48% Penetration in the Final Third based upon Total Passes completed; teams that lose average 20.32% (lower)
- Teams that win average 18.64% Shots Taken per pass completed in the Final Third, losing teams average 19.22% (higher)
- Teams that win average 42.67% Shots on Goal per Shot Taken; teams that lose 32.13% (lower) (by over 10%!)
- Teams that win average 46.18% Goals Scored per Shot on Goal; losing teams 17.03% (lower) (by over 29%!)
Like after XpW 1 (102 Events – 51 games) losing teams shoot the ball more often, on average, but are less accurate when it comes to putting those shots on target and into the net. Patience in creating quality continues to outweigh quantity…
Overall, the average Shots on Goal for winning teams has increased from XpW 1 (4.90) to XpW 2 (5.36); basically the better teams have gotten better and the losing teams have gotten worse (3.84 now versus 4.10 in XpW 1).
I wonder how that trend will continue through the rest of this year?
The 2% gap in Passing Accuracy between winning teams and losing teams has held from XpW 1 to XpW 2.
The gap in Shots on Goal has increased in losing teams to 10% as opposed to 9% (XpW 1).
The gap in Goals scored has remained near steady at roughly ~30%; though slightly smaller in XpW 2.
Losing teams still continue to take more Shots than winning teams; 12.74 (winning teams) to 12.80 (losing teams) but… that gap has dropped since XpW 1 – perhaps losing teams are looking to be more patient in their shot selection?
So how does the overall data relate in an Exponential Relationship?
The light shaded lines are the lines of data as in XpW 1 – and the trend-line colors remain the same.
This time the R2 has dropped just a tad, from .98 to .95 – all things considered most would consider that correlation rock solid… I do – and these data points, viewed as a whole, have a higher correlation to Points in the League Table than Goal Differential does (R2 = .88).
Goal differential is usually a great indicator but it also remains a qualitative statistical indicator not a quantitative indicator.
Like last time there remains a difference in the R2 between winning teams, teams that draw, and losing teams; with draws now having greater correlation than wins. Why? I’m not sure – but as noted by the closeness of all the data points there still remains a fine line between winning, losing and drawing.
Last time I felt that helped explain the difference between mistakes or unlucky breaks – I continue to sense that is the main difference. So might this be an indicator of luck – I don’t know – what do you think?
I have seen discussions of late, on Telly, and in some articles written elsewhere, that focus more on ‘space available’ as opposed to just Shots Taken… hopefully that trend continues!
I also remain hopeful that OPTA and other statistical web sites will offer up more critical events taking place in the Final Third… One other article written since XpW 1 is my analysis (as promised in Xpw 1) on defensive indicators; here’s a link to Hurried Passes and those details.
I still don’t have enough data, in my opinion, to offer additional thoughts on individual team performance relative to home and away games; that probably won’t have statistical reliability until the midpoint of the season (game 323 – events # 646).
There are trends but I’ll save that for another article, enough for now.
If you’ve been following me the last 2-3 years, through Columbian Newspaper, out of southern Washington, you’ll know that I’ve been researching data with the intent of creating some Indices to analyze “team performance” in Major League Soccer.
The initial version of my Possession with Purpose approach was published on 10 February, 2013 on the Columbian Newspaper Portland Timbers Blog Site (here).
This revised version was published on the 15th of January, 2014. I retain COPYRIGHT on all materials published in association with Possession with Purpose (PWP) TM
My intent has been to develop a simplified (Strategic) set of team performance indicators that may help others better understand soccer and how the outcome of a game may be better understood based on the primary inputs to the game.
Data for this presentation originally comes from documenting and analyzing all 646 MLS Regular Season games in 2013; my research toward developing Possession with Purpose, as it is known today, began mid-season 2012; the pre-cursor articles on that effort can be found on my Columbian Newspaper blog site.
All future research will be published here… As things have progressed my research and efforts in Possession with Purpose led to an invitation to present my findings at the World Conference on Science and Soccer 2014; that presentation can be found through this link.
The source data originates with OPTA and is displayed on the MLS Chalkboard and the MLS Statistics Sheet found through www.mlssoccer.com.
With that here’s my introduction on Possession with Purpose…
To first understand the context, I offer that this is one of the End States of my effort:
Create a simplified approach and documented method for measuring team performance where the output is an Index that (while excluding points) comes close to matching results in the MLS League Table.
Beginning with that End State in mind here is the End State product:
Observations from the Diagram…
Note that 9 of the top 10 teams in this Index made the MLS Playoffs last year with the Houston Dynamo finishing 12th in the Index.
For comparison, in benchmarking whoscored.com, their Index had only 8 of their top 10 teams make the Playoffs, while squawka.com matched my 90% success rating – but the team they missed in the top 10 (New England) finished 16th in their Index.
From a strategic standpoint, the End State objective has been met; create a simplified approach and documented method for measuring team performance where the output is an Index that (while excluding points) comes close to matching results in the MLS League Table.
Defining the PWP Attacking and Defending Processes…
Here are the six steps in the PWP Strategic Attacking Process:
- Gain possession of the ball,
- Retain possession and move the ball,
- Penetrate & create goal scoring opportunities,
- Take shots when provided goal scoring opportunities,
- Put those shots taken on goal,
- Score the goal.
Here are the six steps in the PWP Strategic Defending Process:
- Minimize opponent gaining possession of the ball,
- Minimize opponent retaining possession and moving the ball,
- Minimize opponent penetrating and creating goal scoring opportunities,
- Minimize opponent taking shots when provided goal scoring opportunities,
- Minimize opponent putting those shots on goal,
- Minimize opponent scoring the goal.
Every step in this process has an average success rate (percentage) based upon data gathered from all 646 MLS Regular Season games.
Understanding the context of these steps versus other conditions and activities that influence the outcome of a game…
In case you missed it I call these Processes and the Indices “Strategic” to separate their value/meaning relative to other things that can influence the outcome of a game.
For me I have two other ways to classify information that can influence the outcomes in those steps. I have Operational conditions and Tactical metrics; provided below are some examples of each:
- Operational conditions: Scheme of maneuver a team uses in setting up their system, such as flat-back four, flat-back three, double-pivot midfield, single-pivot midfield, bunkering with counterattacking, pressing high, direct attacking, possession-oriented attacking, weather conditions, location of the game (home/away), conference foe, non-conference foe, etc…
- Tactical metrics: Locations of shots taken, shots on goal, and goals scored; penalty kicks, free kicks, crosses, headers won/lost, tackles won/lost, interceptions, clearances, blocked crosses, blocked shots, etc.
The diagram below shows the PWP Strategic Attacking Process with the average percentage of success rate in MLS for 2013. A more detailed explanation of each step is provided below the diagram.
Step 1: Gain possession of the ball: The intent behind this basic step should be clear; you can’t win the game if you don’t possess the ball to some extent. A second consideration about this step is that the more you possess the ball the less your opponent possesses the ball.
- From a defensive standpoint there are any number of ways a team can work to gain possession of the ball; they include, but are not limited to, tackling, intercepting, clearing the ball, winning fifty-fifty duels on the ground or in the air, or simply gathering a loose ball based upon a deflection or bad pass.
- For this Process the measurement of success is the percentage of possession a team has in a given game. Note that in soccer the primary method for measuring possession is pass-based: add up the total number of passes made in a game and divide that into the number of passes one team makes (creating a ratio percentage of possession); the opposing team has the difference between 100% and the other team’s percentage.
- It’s not perfect but it provides a simplified ratio to compare one team versus another…
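The pass-share method described above, as a short sketch (the pass totals are invented):

```python
def possession_share(team_passes: int, opponent_passes: int) -> float:
    """Possession proxy: one team's share of all pass attempts in the match."""
    return team_passes / (team_passes + opponent_passes)

# Hypothetical pass-attempt totals for the two sides of one match.
home, away = 512, 388
home_pct = possession_share(home, away)
print(round(home_pct * 100, 1), round((1 - home_pct) * 100, 1))
```

By construction the two teams’ shares always sum to 100%, which is exactly the simplification the text describes.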
Step 2: Retain possession and move the ball: It shouldn’t be a secret to many that in most cases the team possessing the ball will need to move the ball in order to penetrate the opponent’s Defending Third and score a goal.
- This is not to say a team has a minimum number of passes they need to complete to score a goal; for teams winning possession deep in the opponent’s Defending Third there may be times where the only thing needed is a quick shot on goal.
- By and large, however, most teams – when they gain possession of the ball – do so in their own Defending Third and then move the ball (eventually forward) into a position where a teammate can create a goal scoring opportunity for another team member to take a shot.
- For this process, the measurement of success is the team’s passing accuracy percentage across the entire pitch; passes completed divided into passes attempted.
- It’s not perfect, but it provides a simplified ratio to compare one team versus another; statistically speaking there are weaknesses in how this percentage is measured by the big data folks.
- Throw-ins, for example, move the ball across the pitch from one player to another yet they are not officially counted as passes.
- Successful crosses are also not counted as a successful pass even though the ball moves successfully from one player to another.
- Oddly enough, when evaluating the data provided on the MLS chalkboard, an Unsuccessful cross is included as a Pass attempted (?!)
- For the purposes of this analysis I had to count all successful crosses as successful passes; therefore my final pass completions totals will be slightly higher than what Opta provides. It is what it is…
- I should also point out here that there are occasions when a team wins possession of the ball and takes a shot where no pass was completed. Like I said, this measurement method is not perfect but it is ‘equal’ in ignoring that exception for all teams.
- Therefore the measurement itself has value in tracking the majority (bell curve) of activities that normally occur in a game of soccer. And as a reminder, these are Strategic steps in PWP; by definition a Strategic step will not measure to a level of granularity; that is where Tactical metrics come into play based upon an Operational condition where the team is applying pressure higher up the pitch.
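The adjustment described above – folding successful crosses into the completed-pass total – looks something like this; all counts are invented for illustration, and this mirrors my reading of the author’s workaround rather than Opta’s official definition:

```python
def passing_accuracy(passes_cmp: int, passes_att: int, crosses_cmp: int = 0) -> float:
    """Whole-pitch passing accuracy, counting successful crosses as completed
    passes (unsuccessful crosses already sit in the attempt totals, per the
    MLS Chalkboard quirk noted above)."""
    return (passes_cmp + crosses_cmp) / passes_att

# Hypothetical game: 360 of 470 passes completed, plus 6 successful crosses.
print(round(passing_accuracy(360, 470, crosses_cmp=6), 3))
```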
Step 3: Penetrate and create goal scoring opportunities: Most know that a pitch is divided into three parts; the Defending Third, Middle Third, and Attacking Third. For the purposes of this effort, Penetration is associated with entering the opponent’s Defending Third with the intent to score.
- For this Process, penetration is measured by dividing the volume of passes a team completes within the opponent’s Defending Third by the volume of passes a team completes across the entire pitch.
- It’s not perfect but it creates a ratio that treats all teams fairly, and given the overall accuracy of the End State Index (90%), it’s a reasonable way to measure this step.
- In order to measure this step I first had to manually filter, for all 646 games, every pass attempted and completed using the MLS Chalkboard; my thanks to MLS and OPTA for providing us ‘stats’ guys the opportunity to do that. With Golazo stats now available, that task will be easier next year; as a stats guy, it would have been inappropriate to switch measurement methods three-quarters of the way through the year.
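The penetration ratio defined above is likewise a single division. A minimal sketch, with hypothetical pass counts:

```python
def penetration_ratio(final_third_completions, total_completions):
    """Step 3: share of a team's completed passes that occur inside the
    opponent's Defending Third (final third)."""
    return final_third_completions / total_completions

# Hypothetical: 95 of 426 completed passes came inside the final third.
print(f"{penetration_ratio(95, 426):.1%}")  # 22.3%
```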
Step 4: Take shots when provided goal scoring opportunities: This is, by far, the hardest indicator to measure, given how current data sites really lack granularity in how they identify/define ‘created goal scoring opportunities’.
- I define a ‘created goal scoring opportunity’ as any pass, successful or not, that may have ended with another teammate taking a shot. That’s hard to quantify, but an example, if you will:
- A fullback overlapping down the right side puts in a wicked cross that gets cleared at the last minute by a center-back, with his head. With OPTA and other data companies that wicked cross, though unsuccessful, is not quantified as a goal scoring opportunity created; it’s merely tracked as a clearance and an unsuccessful pass.
- I disagree; the fullback did their job in putting in that wicked cross – what really happened is the defender also did their job in clearing it – therefore a “potential” for the attacking team to complete a created goal scoring opportunity and take a shot was denied.
- Both the attacking team and defending team should be statistically credited for doing what they are expected to do. Others may disagree…
- But as a Head Coach, I would put to memory that the fullback did what was supposed to happen; create the chance – therefore in my books that player created a goal scoring opportunity.
- For this Process, the step is measured by counting the number of Shots Taken compared to the number of completed passes in the opponent’s Defending Third.
- It’s not perfect, but it’s measured in an unbiased manner for every team, though there will be instances where a shot is taken without a completed pass or originates from a defensive error.
- In going back to the example, as a Head Coach I would call that effort a “failed assist.” I think there is value in knowing the number of “failed assists” as much as there is in knowing “assists.”
- Tracking “failed assists” provides a pure, statistical way to track individual player performance (a tactical metric) that can influence team performance.
- Bottom line on this one, as contentious as it may be for some, recall the End State of this Final Index… create a simplified approach and documented method for measuring team performance where the output is an Index that (while excluding points) comes close to matching results in the MLS League Table.
- Given the accuracy rating of 90% in matching the top 10 Playoff teams this year, I believe the approach to measuring this indicator works.
- If OPTA, or another data compilation agency starts to track “failed assists”, could an Index like this reach 100% accuracy?
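Step 4's measurement, as defined above, compares Shots Taken to completed passes in the opponent's Defending Third. A minimal sketch with hypothetical figures:

```python
def shots_per_penetration(shots_taken, final_third_completions):
    """Step 4: Shots Taken relative to completed passes in the
    opponent's Defending Third."""
    return shots_taken / final_third_completions

# Hypothetical: 12 shots generated from 95 completed final-third passes.
print(f"{shots_per_penetration(12, 95):.1%}")  # 12.6%
```

A higher ratio suggests a team converts its penetration into shooting chances more efficiently, regardless of raw possession volume.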
Step 5: Put those Shots Taken on Goal: For the most part this is an individual statistic that is added up to create a team performance indicator.
- For this process, the step is measured by dividing the number of Shots on Goal by the number of Shots Taken.
- It’s one of the easier indicators to measure, and if you watch any level of soccer, it’s pretty self-explanatory – if the Shot arrives anywhere within the dimensions of the Goal, it is considered a Shot on Goal. One of two things happens: it goes in or it doesn’t.
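Step 5 reduces to the same pattern as the earlier steps, one division. A minimal sketch with hypothetical figures:

```python
def shots_on_goal_ratio(shots_on_goal, shots_taken):
    """Step 5: Shots on Goal divided by Shots Taken."""
    return shots_on_goal / shots_taken

# Hypothetical: 5 of 12 shots were on target.
print(f"{shots_on_goal_ratio(5, 12):.1%}")  # 41.7%
```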
Step 6: Score the Goal: One critical objective of the game.
- I say ‘one’ because the indications I see lead me to offer that this game is not all about scoring goals.
- In my research it appears to me that teams who defend better seem to take more points in games than teams that don’t defend very well.
- A recent example in my End of Season analysis of Vancouver: in Western Conference competition, they scored 35 goals and gave up 35 goals; all told they took just 26 of 72 possible points – clearly, in this example, scoring goals did not result in wins…
- Prozone, a noted professional sporting analysis company, offers the following in the article: “Using data from the last ten seasons of the Premier League, Anderson and Sally compared the value of a goal scored and the value of a goal conceded. They found that scoring a goal, on average, is worth slightly more than one point, whereas not conceding produces, on average, 2.5 points per match. Goals that don’t happen are more valuable than goals that do happen.”
- It’s not perfect, but it provides reasonable information in a reasonable format that has reasonable value when comparing the End State output to how the MLS League Table finished.
- For those interested the PWP Strategic Attacking Index and Defending Index are provided below:
- In looking at these two Indices, note the number on the left; the difference between the Index number in the Attacking Index and the Defending Index is the number that appears to the left in the Final Strategic Index at the beginning of this article.
- That may help explain why some teams finished above zero, as opposed to below zero in the Final Index.
- Teams finishing above zero had team attacking percentages that exceeded their team defending percentages; in other words they were better in their attack against the opponents than the opponents were in attacking them.
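The relationship between the two Indices described above can be sketched in one line: the Final Strategic Index is simply the Attacking Index minus the Defending Index. The index values in the usage example are hypothetical, not taken from the actual tables.

```python
def final_strategic_index(attacking_index, defending_index):
    """Final PWP Strategic Index: Attacking Index minus Defending Index.

    A value above zero means the team's attacking percentages exceeded
    its defending percentages; below zero means the reverse.
    """
    return attacking_index - defending_index

# Hypothetical team: Attacking Index 0.62, Defending Index 0.55.
print(round(final_strategic_index(0.62, 0.55), 3))  # 0.07
```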
- Team success rates in these six steps will be used next year to begin to analyze how well the team is performing as the new season starts compared to performance the previous year.