A superb run with five wins and a draw in six games; by most standards that is a compelling argument for consistency. I agree, and their overall Composite Possession with Purpose (PWP) Index rating continues to climb.
New England have climbed from 17th in PWP (week 5) to 7th after week 11; a superb shift of ten full places in six weeks.
So in considering this giant push forward I’d like to take a different approach in how the data points from PWP can be viewed.
This is new so please bear with me for a minute or two as I set the context.
Below are a number of diagrams referencing my PWP indicators for a few teams; the diagram being used this time is the ‘doughnut’ diagram from Microsoft PowerPoint.
The interesting thing about this diagram is that it allows me to offer up a view on my PWP data points that isn’t relative to the exponential relationship (a line). Instead, it allows me to picture the overall tenor of PWP data points in relationship to themselves as being a part of a ‘whole’; with the ‘whole’ being PWP.
I feel confident I can take this approach since my Expected Wins 2 correlation for my data points is ~.97 (R2) — as near to rock solid as you can get.
Other context points include:
- The teams used in this analysis are Seattle, New England, Montreal, Portland and last year’s Supporters’ Shield winner (New York) plus last year’s bottom dweller (DC United)
- Reminder in case my explanation was a bit wordy above – the percentages indicated in the doughnut are not the percentages of those activities relative to the game; they are the percentage of those activities relative to each other with 100% being all those activities added together.
- Source – as usual the MLS Chalkboard and the MLS Statistics Sheets
- Gold Stars on the diagrams are intended to show you where differences occur.
- The team name on the outside of the doughnut is the outer ring of data and the team name on the inside of the doughnut is the inner ring of data.
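In case the normalization is clearer in code, here is a minimal sketch of how those doughnut percentages work; the step names and sample values below are invented for illustration, not taken from the actual diagrams:

```python
# Sketch of the doughnut normalization described above: each PWP step
# value becomes a share of the sum of all six, so the slices total 100%.
# Step names and sample values are illustrative only.
steps = {
    "Possession": 51.0,
    "Passing accuracy (entire pitch)": 77.0,
    "Passing accuracy (final third)": 65.0,
    "Penetration (final third share)": 20.0,
    "Shots taken per penetration": 18.0,
    "Shots on goal per shot taken": 40.0,
}
total = sum(steps.values())
doughnut = {name: round(100.0 * value / total, 1) for name, value in steps.items()}
for name, share in doughnut.items():
    print(f"{name}: {share}%")
```

The point is simply that each slice is a share of the whole, not a share of the game.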
The volume of Final Third passes successfully completed by New England (29%) is three percentage points higher than Montreal’s (26%). Note also that Montreal has a greater percentage of PWP outside the Final Third (30%) than New England (28%). Both of these indicate to me that New England is more focused on penetrating and creating than Montreal.
For the future I will check into these three areas when looking to see if a ‘direct attacking approach’ can be better differentiated from a ‘ground-based’ (short passing scheme) approach.
The actual volume of penetration is higher for New England as well (11%) versus (7%). And like my regular PWP analysis the data here also supports the fact that teams who are more patient in creating shots taken (6% for NER versus 11% for MIFC) end up with more goals scored.
I did ask Matthias Kullowatz about the specific shot data for New England and Montreal; ~60% of Montreal’s shots on target have come outside the prime scoring zones 1 & 2 while ~68% of the Revolution shots on target have also come outside of zones 1 & 2. So what’s different?
I think it’s down to time and space again; though it could be the Revolution have better strikers – but when you see the DC United doughnut diagram a bit later I think it’s back to time and space; and with fewer shots taken and more patience in the final third that seems reasonable to me.
Now for a contrast that might be better at explaining individual mistakes and bad fortune more than a bad ‘style/system’…
Notice no ‘gold stars’; why? Because there really isn’t that much difference between how these two teams execute the six steps of PWP.
What separates these two teams in the league table are individual mental mistakes in defense – Portland sit on ten points while Seattle have 25. Through the course of this year the Timbers have dropped 7 points due to red cards and penalties – they did both against Columbus Saturday night!
In considering the ‘sameness’ of the data I expect, as time passes, an output similar to this could highlight ‘individual mistakes’ and perhaps ‘good/bad luck’ when it comes to rebounds and deflections – again recall Saturday night when Futty Danso deflected a shot and notched an ‘own-goal’.
All told things went pretty well for Columbus: a red card for their opponent, a foul in the penalty box by their opponent for a PK, and a deflected own-goal by their opponent. If I were a Columbus fan I’d be pretty pissed they didn’t win – bad luck for the Crew!
However viewed I’ll revisit this diagram later when the Cascadia Cup battle heats up.
So here’s the doughnut view of New York compared to DC United last year and then a bit further down how they look compared to each other this year.
First off – let’s not forget Ben Olsen was not fired and perhaps this doughnut diagram can also help explain why given the overall poor performance in results last year for DC United.
Notice that the team does exceedingly well in comparison to New York with respect to Passing, penetration and creation; they actually exceed New York in the first two categories and only fall off when it comes to goals scored (7% for DC United versus 15% for New York).
So I’d offer that the system Ben Olsen ran last year worked – what he lacked was a pair of good strikers. And if you recall the Montreal doughnut earlier the outputs from DC United do not mirror those of the Impact!
They added Espindola and Johnson and shored up their defense a bit; that also included adding Amos Magee to the staff. Remember him as the Defensive Coordinator for Portland last year (I think – others can confirm or deny that, I’m sure).
Bottom line here – the system didn’t change and the Head Coach didn’t change and I’d offer that was appropriate… now for the same diagram this year:
Note the increase for DC United in the final category – goals scored versus shots on goal – pretty compelling information to reinforce that the system used last year is the same system used this year and the difference – major difference – is the addition of two quality strikers.
I’m just in the learning stages on how this new doughnut diagram will take shape – I’m pretty sure it will have at least one hole in it – I’m hopeful there aren’t a lot more.
Some changes afoot with OPTA and MLS – I see OPTA incorporated the Final Third Passing Accuracy suggestion; I just need to find out whether crosses are included in that metric…
As for the new MLS Chalkboard – I’m not sure how that will work if the ‘numbers’ of activities are not available to count when it comes to defensive activities and ‘touches’ for players…
And yes, the old Chalkboard still appears to exist given a separate link within previous articles but it’s unclear if this change will be a permanent change for next year – or even the World Cup for that matter…
As for This Week in PWP; if you saw my tweets yesterday you know the top Attacking and Defending PWP teams of the week; New England in attack and Toronto in Defense with the Reds taking the Composite PWP Index top spot for Week 11.
Sporting KC, along with LA Galaxy remain atop the Composite PWP through Week 11 while the Revolution moved to 7th and Columbus dropped to 4th as Real Salt Lake are now in a comfortable position of 3rd best overall.
Finally, this view also gives you an idea of what percentage each team gleans from each of the PWP Six Steps data points in the calculation for the overall Index number.
If you’ve read these two previous articles, Expected Wins and Expected Wins 2, you know I look at how teams perform, on average, (win, lose, or draw) with respect to my primary data collection points for Possession with Purpose.
What will be added, in Version 3 (V3), will be a compare and contrast between all the leagues I evaluate in my Family of Indices.
Results of looking at the diagrams and reading through my observations should help clarify that passing-pattern analyses like (ABAC, ABCB) don’t really have relevance to teams that win, lose or draw – at least not this year. (Note – two links – two different sites published roughly the same analysis)…
Don’t get me wrong – I’m not taking a personal dig at the grueling work associated with the analyses.
It has great value, but more from a tactical viewpoint in how passing is executed, not from a (bell curve) – volume/success of passing rate – relative to possession and penetration into the Final Third, that helps a team create and generate shots taken leading to goals scored; or… when flipped, leading to goals not scored.
And as pointed out by a commenter (shomas) on the article that surfaced on MIT, if anything it adds predictability to what a team will do – and the more predictable a team, the more likely the opponent can defend against them better…
For me – I would have thought the GREATER the variation in that cycle (ABAC, etc…) the better… others may view that differently?
In addition, I think there could be more value to the information if it were segregated by league – more on that later…
To begin – here’s a reminder of what Expected Wins looked like in Major League Soccer after 92 games (184 events):
The term ‘event’ is used, as opposed to game, to clarify that each team’s attacking data is included in this analysis – and that the greater the volume of data points the stronger the overall statistical analysis is; i.e. sampling 15 data-stream points is not the same as sampling 1,000 data-stream points.
Biggest takeaway here is the strength of correlation these seven data points have to each other (i.e. their representation – in my opinion – of the primary bell curve of activities that occur in a game of soccer)…
In every case, in every diagram that follows, all the Exponential trends exceed .947; and in every case the relationship for the winning teams is higher than the relationship for losing teams… speaking to consistency of purpose and lower variation in my view.
In general terms, this is my statistical way of showing that a goal scored is tantamount to a 5th or 6th standard deviation to the right of the normal bell curve of activities that occur in a game of soccer.
Said another way – I don’t evaluate the tail – when measuring the dog’s motion – I evaluate the dog; recognizing that the tail will follow, to some degree, what the motion of the dog will be… and… that even if the motion of the dog is somewhat different, the tail will normally behave in the same way.
Therefore, it’s not the tail that should be analyzed – it’s the dog… others may view that differently.
Here’s the same diagram for the MLS after 366 events:
Oh… the green shaded areas are meant to show those data points that are higher for those particular categories; in other words the Volume of Shots Taken for winning teams (after 366 events) was higher than that of losing teams – but the volume of passes completed in the Final Third was higher for losing teams than winning teams… more on that later.
Here’s the diagram after 544 events in MLS:
Note the shift – only the volume of Final Third Passes Attempted is now higher for losing teams – all other data categories see the winning teams with greater volume.
For me, what this reinforces is the issue of time and space as well as patience – three statistics never measured in soccer (publicly at least)… again, reinforcing, for me, that shot location only has value relative to the time, space, and patience of the team in creating that time and space for that shot.
Statistically speaking, what that means, to me, is that Expected Goals; a very popular (and worthy) statistical calculation, needs to be refined if it’s to have greater value as a predictive tool/model… I’d be interested to hear / read the views of those who work Expected Goals efforts…
Now here’s the European Leagues I’ve added to my PWP Family of Indices analyses; first up the English Premier League:
Note that the pattern, here, after 100 events, resembles the same pattern for MLS after 544 events… worthy.
Moving on to the Bundesliga:
A pattern similar to MLS after 366 events; will this pattern morph into something different as the league continues? Possibly – the MLS pattern has changed so perhaps this one will too?
Now for La Liga:
A completely new pattern has taken shape – here “volume” speaks volumes!
Is this unique? Nope… It also happens to be the same pattern as the World Cup 2014 pattern – below:
Will that pattern show itself in the UEFA Champions League? I don’t know but we’ll find out…
So what’s it all mean? The “so-what”?
Before attempting to answer that, here are two different diagrams plotting these data points for winners and losers (in reverse order) for the leagues I evaluate:
Now the grist:
The red shaded areas are where the losing teams’ average exceeds the winning teams’ average in the volume of those activities – the green shaded areas are highlighted for effect. Green shaded areas for the volume of Shots on Goal and Goals Scored indicate that those numbers are virtually the same, for winning teams, in all the activities measured…
Now, back to the so-what and what it all means:
For me this reinforces that the “pattern” of passing (ABAC, ABCB, etc…) that gets you into the Final Third has no relevance to the volume of Goals Scored.
And it also reinforces that different motions of the ‘dog’ will generate the same tail-wagging outputs – therefore it’s the analysis of the dog’s activities that drives greater opportunities for improvement.
The averages for winners in the activities measured all behave somewhat differently – granted some patterns might be the same but the volumes are different.
And when volumes change, the game changes, and when the game changes, the strategic or tactical steps taken will change – but… the overall target should still remain the same (on average) – put at least 5-6 shots on goal and you ‘should’ score at least two goals… getting to that point remains the hard part!
Bottom line here:
These leagues are different leagues – and the performances, of the teams, in those leagues are different when it comes to winning.
Therefore, I’d offer that comparing a striker’s ability to score in one league is completely different than an expectation an organization might have in how that striker may score in another league.
Said another way – a striker who scores 20 goals in the Bundesliga, a league that shows winning teams play to a more counter-attacking style, might not perform as well in a league like the EPL; which looks to offer that winning teams play a more possession-based style.
Perhaps? Another good example… a striker playing for a team that counter-attacks is more likely to have greater time and space to score a goal than one playing in a possession-based team, where time and space become a premium because the opponents play far tighter within their own 18-yard box.
But, as mentioned before – since no-one statistically measures (publicly) the amount of time and space associated with passing, and shot taking, we can’t peel that onion back further. I have suggested two new statistics that may help ‘intuit’ time and space – that article is “New Statistics? Open Shots and Open Passes”: here.
For the future… I’m interested in seeing how these analyses play out when separating out teams who show patterns of counter-attacking, and perhaps direct play, over teams that show patterns of possession-based football.
In addition, I’m also keen to see how these take shape when reversing the filter and organizing this data based upon whether or not a team is defending deeper, or more shallow.
The filter there will come from looking at the opponent averages for passing inside and outside the Final Third…
It seems reasonable to me (others may view this differently?) that if a team lacks goal scoring they need to find the right midfielders and fullbacks who are good enough to create the additional time and space the strikers need in order to score more goals.
And that doesn’t even begin to address the issues in defending – which statistics continue to prove year in and year out as being more critical to winning than attacking.
Given all this information, I may have missed something – I’m always looking for questions/clarifications so please poke and prod the diagrams and analyses and comment as time permits.
COPYRIGHT, All Rights Reserved. PWP – Trademark
You can follow me on Twitter @chrisgluckpwp
Some stunners and bummers this week for plenty of soccer supporters across North America; who’da thought Montreal would get a clean sheet against New England and Real Salt Lake would get completely schooled by Seattle…
Others like Philadelphia reinforced they do not want to be a bottom dweller, as some suspect this year, by beating up on Chivas, and DC United took advantage of a depleted Sporting KC to take three in DC.
For sure this week, like a few others this season, reinforced why games need to be played.
So who was tops this week in Attacking PWP (APWP) – it may surprise you – (Columbus Crew) it did me for a start, but in review, the overall data supports the basic intent of PWP –
- A documented method for measuring team performance from my six step process.
- An index that ranks teams for their performance based on this method.
- The index, while excluding points, comes close to matching results in the MLS league table.
So here’s a look at the top five teams in APWP this past week and some comments to follow for consideration:
A couple of things…
Note the Completed Passes in the Final Third vs Completed Passes across the Entire Pitch (4th column from the left). Three of the teams in the top 5 APWP this week faced opponents who attempted to bunker in; how can we tell?
By the lower percentage of penetration versus completed passes for Columbus (13.80%); LA Galaxy (14.48%) and Philadelphia Union (16.30%). And when viewing other teams who have played against these teams the results are similar…
When Seattle played Chivas earlier this year they had just 13.94% of their total passes completed in the final third; against Toronto they did slightly better at 18%.
Columbus versus Chicago was 16.59%, LA versus Chivas was 15%, FC Dallas versus Chivas was 15%, Portland versus Chivas was 14% – so there is clearly a pattern.
It’s probably not as obvious with Toronto as Chicago or Chivas but a realistic assumption can be made that some outputs in PWP will help indicate what pattern of defense a team might encounter.
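That penetration percentage is just completed final-third passes as a share of all completed passes; a minimal sketch, with the pass volumes invented for illustration:

```python
def final_third_share(completed_final_third: int, completed_total: int) -> float:
    """Completed final-third passes as a percentage of all completed passes."""
    return 100.0 * completed_final_third / completed_total

# Invented volumes; a share well under ~15% hints the opponent bunkered in.
share = final_third_share(69, 500)
print(round(share, 2))  # 13.8
```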
So how about the attacking portion that really matters – shots on goal and goals scored?
In the case of Columbus and LA both hit the magical 100% and that is what put them in the top five of APWP.
That’s not a bad thing; on the contrary it actually reinforces in my mind how fragile the game of soccer can be when it comes to mistakes and their impacts on the game.
Consider the overwhelming domination that Seattle had this past weekend; their inability to be ‘top of the heap’ in APWP is not a negative on the team.
Where the complete domination shows up is when you add in the Defending PWP…
It’s pretty clear here that three teams stood out from the rest; Colorado (3-nil clean sheet), Philadelphia (3-nil clean sheet) and Seattle (4-nil clean sheet). And that defensive dominance will carry over to the Composite PWP Index shown a bit later.
For now though take a look at the #4 team in DPWP – Montreal Impact – many might have considered that 2-nil win against New England a surprise…
But here’s an interesting tidbit of information about New England in Composite PWP this year.
At home New England perform better than their opponents in APWP, 2.42 to 1.91, while on the road their APWP is 2.24 versus their opponents’ 2.44; in other words New England are far less productive on the road than at home.
Given that, and Montreal showing tendencies in performing better at home, perhaps it isn’t such a big surprise after all?
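The home-versus-away comparison above boils down to a venue-grouped average of per-game APWP scores; a small sketch, with the game records invented for illustration:

```python
from statistics import mean

# Invented per-game APWP scores for one team, split by venue.
games = [
    {"venue": "home", "apwp": 2.50},
    {"venue": "home", "apwp": 2.34},
    {"venue": "away", "apwp": 2.10},
    {"venue": "away", "apwp": 2.38},
]

def venue_average(rows, venue):
    """Average APWP across the games played at the given venue."""
    return mean(r["apwp"] for r in rows if r["venue"] == venue)

home_away_gap = venue_average(games, "home") - venue_average(games, "away")
print(round(home_away_gap, 2))  # positive = stronger at home
```

A positive gap means the team performs better at home; a negative gap flags a ‘road warrior’.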
Here’s the differences between home and away for all teams in MLS at this time:
Bottom line here is that Chivas USA are clearly (far right amber bar) much much better in overall APWP on the road than at home; is it any wonder given their average audience is about 5 people… just kidding…
On the other side we already know about New England – but other teams not liking the road, so much in team performance, are Houston, San Jose, Colorado, Real Salt Lake and Toronto.
Road warriors, though not dominant / winning road warriors, also include Chicago Fire (don’t forget that 5-4 win in Red Bull Arena), Philadelphia, Columbus and Portland.
The other takeaway here is how strong and equally consistent Vancouver and LA Galaxy are; there’s almost no difference in their PWP on the road versus at home.
One could argue the same for Portland but with them giving away so many PK’s this year, plus Red Cards (to begin with), there really isn’t value in offering up consistency with the Timbers until after they start playing mistake free football.
In closing, here’s the top to bottom in Week 13 Composite PWP…
A few final thoughts and an update of sorts in general…
Portland did quite well in scoring goals in the run of play this week, and they really proved how effective they can play in direct attacking – Adi has added value; when – not if – but when they get mistake-free in the back four they should push their way up the table…
That might be looking at the Timbers through rose colored glasses, so be it… it is what it is.
With respect to my weekly Attacking and Defending PWP Players of the Week: sadly I can no longer offer up these awards. There is simply too much time needed to dig through the new MLS Chalkboard to come up with relevant individual player statistics to support one player over another.
On the one hand some of the new format works well; on the other hand it has completely hampered additional, detailed, defensive analysis… notice that ‘blocked crosses’ is no longer a statistic that is made publicly available.
Finally, and I’m a bit jazzed about this; I got a phone call late last week from the folks organizing the World Conference on Science and Soccer, asking me to present my Major League Soccer Possession with Purpose Index analysis. The better part of last week and early this week I’ve been putting the finishing touches to that presentation.
When I get it done and the Conference is completed I will post it here on my blog site. Really looking forward to listening in to all the presentations.
Pedigree and consistency of purpose are two words/phrases that come to mind when I consider these teams. Both are currently doing very well and in my Composite PWP Index, after 12 weeks, they sit in positions four and five.
In considering this early-to-mid-season marquee match-up I’ve put together a few diagrams that might help paint a picture on how effective these two teams are.
My approach will consider how well Real Salt Lake has performed on the road this year versus how well Seattle have performed at home this year; I hope you enjoy it.
But before starting, the Capt. Obvious point – both teams have some players missing. With this being Week 13, RSL have used 15 different field players this year while Seattle have used 18.
So although key players are missing I don’t really think it matters that much – what matters for me is what I opened with: pedigree and consistency of purpose through the course of this season so far.
And given RSL are unbeaten while Seattle have 26 points with 13 games played, the ‘key-player-missing’ theme just doesn’t work for me.
Given that, here are my latest Doughnut Diagrams for Real Salt Lake versus Seattle; the first one shows the weighted averages on how each team has attacked their opponents this year (RSL in away games and SSFC in home games).
Not much separates the two teams when looking at what percentage each of the activities in PWP amounts to in relationship to each other – the only one showing any real difference is the amount of Shots on Goal versus Shots Taken for Real Salt Lake.
Given the same rough volume of Shots Taken per penetration (7%) for both teams, RSL are more effective in converting those Shots Taken to Shots on Goal.
In viewing the next percentage – converting those Shots on Goal to Goals Scored there is a slight edge to Seattle.
In total though, both teams are +5 in their Goal Differential (RSL on the road) and (SSFC at home).
Early indications are this should be a very tight game.
The next diagram offers up how each team performs in defense against their opponents attack:
While some may disagree with this view I would submit this diagram helps speak to how these two teams defend differently yet they end up with the same result.
Note that RSL have yielded less volume in their opponents’ passing accuracy within and outside the final third, but greater volume in penetration and shot creation.
For me that indicates RSL have a tendency to apply pressure higher up the pitch.
On the other hand the Seattle opponent percentages seem to indicate to me that their defense tucks in a bit more in the final third with the intent of giving their opponent a wee bit more possession outside the final third.
However viewed both teams appear matched evenly when it comes to preventing Shots on Goal and Goals Scored.
An interesting thing to watch for in this game might be how high up the pitch Alonso ventures versus Grossman (the likely replacement for Beckerman).
I would offer the more Alonso commits himself outside the final third the more likely RSL are to score.
On to the standard team performance percentages from the six steps in Possession with Purpose:
Below is the diagram showing the percentages of RSL and how they defend on the road, versus SSFC and how they attack at home.
I’ve highlighted two areas; the Shots on Goal versus Shots Taken and the Goals Scored versus Shots on Goal; note that when Seattle attacks 40% of their Shots Taken end up on Goal with roughly 28% of those hitting the back of the net.
Conversely, when RSL defends on the road they are pretty stingy when it comes to yielding Shots on Goal; ~30%, but when the opponent does put that Shot on Goal about ~37% of those shots hit the back of the net.
All told, the other indicators seem to support a high level of passing accuracy and possession; if the opponent’s share (dark blue bar) for SSFC is 46%, then SSFC averages 54% possession at home in attack.
Next up the view on how RSL attacks on the road versus how SSFC defends at home.
Again the highlighted area for RSL is Shots on Goal versus Shots Taken – clearly (given the lower amount of Shots Taken per penetration) (light blue bar – ~18%) RSL takes its time in setting up shots that are more likely to be on target – and scoring is not a problem given their +5 goal differential on the road.
As for Seattle, they yield, on average, about the same amount of Shots Taken per penetration, but the result indicates they are more successful in preventing a Shot Taken from becoming a Shot on Goal (< ~30%).
Another indicator reinforcing that they appear to work towards closing down their opponents more tightly within their defending third.
I’m not sure we see a tight game here – it’s mid-season and both teams might want to test each others’ weaknesses at full speed.
If I had to take a choice on which defense is stronger I would go with Real – on the other side if I had to choose if momentum were going to influence this game I reckon the strong supporter base of Seattle will pull them through.
If individual players are going to impact this game for Real Salt Lake I’d like to think it would be Ned Grabavoy or Joao Plata.
On the other hand if individual players are going to provide a positive impact to Seattle I can see Cooper or Martins taking that leadership; both can be dangerous goal scorers in different ways.
If I were in Seattle I would go to this game… just to watch two strong teams go head-to-head!
Hopefully most of you read Part I of my series on Expected Wins in Major League Soccer.
As a quick reminder the Expected Wins analysis is my internal data quality review on the seven data points I use to support my quantitative Possession with Purpose analysis; the stronger the correlation these data points have the more confidence I have in the overall Indices that are created to assess team performance.
For your benefit, in case you forgot, here are the seven data points I continue to analyze as we reach the 92-game point in MLS, which equals 184 events:
- Passes Attempted Entire Pitch
- Passes Completed Entire Pitch
- Passes Attempted Final Third
- Passes Completed Final Third
- Shots Taken
- Shots on Goal
- Goals Scored
All data points, at this time, have equal weight.
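The exact index formula isn’t spelled out here, so the following is only a hedged sketch of equal weighting: the seven raw data points (volumes invented) are turned into the six step ratios used throughout these articles and then averaged with equal weight:

```python
# Hedged sketch only: the real PWP index calculation is not published in
# this article, so this simply illustrates equal weighting of the step
# ratios derived from the seven data points. Raw volumes are invented.
data_points = {
    "passes_attempted_total": 480,
    "passes_completed_total": 370,
    "passes_attempted_final_third": 120,
    "passes_completed_final_third": 80,
    "shots_taken": 13,
    "shots_on_goal": 5,
    "goals_scored": 2,
}
ratios = [
    data_points["passes_completed_total"] / data_points["passes_attempted_total"],        # passing accuracy, entire pitch
    data_points["passes_completed_final_third"] / data_points["passes_attempted_final_third"],  # passing accuracy, final third
    data_points["passes_completed_final_third"] / data_points["passes_completed_total"],  # penetration share
    data_points["shots_taken"] / data_points["passes_completed_final_third"],             # shots per final-third pass
    data_points["shots_on_goal"] / data_points["shots_taken"],                            # shots on goal per shot
    data_points["goals_scored"] / data_points["shots_on_goal"],                           # goals per shot on goal
]
index = sum(ratios) / len(ratios)  # equal weight for each step
print(round(index, 3))
```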
What is interesting is that over the week to week course of the season 40% (20/50) of the weekly top five teams, in Attacking PWP, have averaged less than 50% possession in their matches.
For me that’s pretty cool as it indicates this analysis is not really biased towards teams that use a shorter-passing scheme in attack. In Week 5, 3 of the 5 teams were under 50% possession and the other two were both under 51%.
Some of those teams are possession based teams like DC United, Portland and Seattle but in that week the margin of possession did not have as much effect as the ability of those teams to finish quality chances – the top three teams that week all scored goals equal to their shots on goal.
The five teams that week who exceeded 80% in Passing Accuracy (usually a good indicator of ground-based attacking) all finished outside the top 5.
Moving on after that tidbit, here’s the averages for overall (blue bar), teams that win (green bar), teams that draw (orange bar) and teams that lose (red bar).
Facts as they exist today after 184 Events in 2014:
- The overall tenor of the data points and their relationship really hasn’t changed that much since XpW 1.
- Teams that win average 51.11% Possession; losing teams average 48.89% Possession, (lower)
- Teams that win average 76.39% in Passing Accuracy; losing teams average 74.10% (lower)
- Teams that win average 20.48% Penetration in the Final Third based upon Total Passes completed; teams that lose average 20.32% (lower)
- Teams that win average 18.64% Shots Taken per pass completed in the Final Third, losing teams average 19.22% (higher)
- Teams that win average 42.67% Shots on Goal per Shot Taken; teams that lose 32.13% (lower) (by over 10%!)
- Teams that win average 46.18% Goals Scored per Shot on Goal; losing teams 17.03% (lower) (by over 29%!)
Like after XpW 1 (102 Events – 51 games) losing teams shoot the ball more often, on average, but are less accurate when it comes to putting those shots on target and into the net. Patience in creating quality continues to outweigh quantity…
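The grouped averages above come from splitting events by result and averaging each metric within the group; a minimal sketch, with the event rows invented for illustration:

```python
from statistics import mean

# Invented event rows; real analysis would have one row per team per game.
events = [
    {"result": "win",  "possession": 52.0, "passing_acc": 77.5},
    {"result": "win",  "possession": 50.2, "passing_acc": 75.3},
    {"result": "loss", "possession": 47.8, "passing_acc": 74.9},
    {"result": "loss", "possession": 50.0, "passing_acc": 73.3},
]

def group_avg(rows, result, metric):
    """Average a metric across all events with the given result."""
    return mean(r[metric] for r in rows if r["result"] == result)

print(group_avg(events, "win", "possession"))   # winning-team average
print(group_avg(events, "loss", "possession"))  # losing-team average
```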
Overall, the averages for Shots on Goal for winning teams has increased from XpW 1 (4.90) to XpW 2 (5.36); basically the better teams have gotten better and the losing teams have gotten worse (3.84 now) versus (4.10 in XpW 1).
I wonder how that trend will continue through the rest of this year?
The 2% gap in Passing Accuracy between winning teams and losing teams has held from XpW 1 to XpW 2.
The gap in Shots on Goal has increased in losing teams to 10% as opposed to 9% (XpW 1).
The gap in Goals scored has remained near steady at roughly ~30%; though slightly smaller in XpW 2.
Losing teams still continue to take more Shots than winning teams; 12.74 (winning teams) to 12.80 (losing teams) but… that gap has dropped since XpW 1 – perhaps losing teams are looking to be more patient in their shot selection?
So how does the overall data relate in an Exponential Relationship?
The light shaded lines are the lines of data as in XpW 1 – and the trend-line colors remain the same.
This time the R2 has dropped just a tad, from .98 to .95 – all things considered most would consider that correlation rock solid… I do – and these data points, viewed as a whole, have a higher correlation together than Goal Differential (R2 = .88) does to Points in the League Table.
Goal differential is usually a great indicator but it also remains a qualitative statistical indicator not a quantitative indicator.
Like last time there remains a difference in the R2 between winning teams, teams that draw, and losing teams; with draws now having greater correlation than wins. Why? I’m not sure – but as noted by the closeness of all the data points there still remains a fine line between winning, losing and drawing.
Last time I felt that fine line helped explain the difference mistakes or unlucky breaks can make – I continue to sense that is the main difference. So might this be an indicator of luck? I don’t know – what do you think?
I have seen discussions of late, on Telly, and in some articles written elsewhere, that focus more on ‘space available’ as opposed to just Shots Taken… hopefully that trend continues!
I also remain hopeful that OPTA and other statistical web sites will offer up more of the critical events taking place in the Final Third… One other article written since XpW 1 is my analysis (as promised in XpW 1) of defensive indicators; here’s a link to Hurried Passes and those details.
I still don’t have enough data, in my opinion, to offer additional thoughts on individual team performance relative to home and away games; that probably won’t have statistical reliability until the midpoint of the season (game 323 – events # 646).
There are trends but I’ll save that for another article, enough for now.
If you’ve been following me the last 2-3 years, through the Columbian Newspaper out of southern Washington, you’ll know that I’ve been researching data with the intent of creating some Indices to analyze “team performance” in Major League Soccer.
The initial version of my Possession with Purpose approach was published on 10 February, 2013 on the Columbian Newspaper Portland Timbers Blog Site (here).
This revised version was published on the 15th of January, 2014. I retain COPYRIGHT on all materials published in association with Possession with Purpose (PWP) TM
My intent has been to develop a simplified (Strategic) set of team performance indicators that may help others better understand soccer and how the outcome of a game may be better understood based on the primary inputs to the game.
Data for presentation originally comes from documenting and analyzing all 646 MLS Regular Season games in 2013; my research in beginning to develop Possession with Purpose, as it is known today, began mid-season 2012; the pre-cursor articles on that effort can be found on my Columbian Newspaper blog site.
All future research will be published here… As things have progressed my research and efforts in Possession with Purpose led to an invitation to present my findings at the World Conference on Science and Soccer 2014; that presentation can be found through this link.
The source data originates with OPTA and is displayed on the MLS Chalkboard and the MLS Statistics Sheet found through www.mlssoccer.com.
With that here’s my introduction on Possession with Purpose…
To first understand the context, I offer that this is one of the End States of my effort:
Create a simplified approach and documented method for measuring team performance where the output is an Index that (while excluding points) comes close to matching results in the MLS League Table.
Beginning with that End State in mind here is the End State product:
Observations from the Diagram…
Note that 9 of the top 10 teams in this Index made the MLS Playoffs last year with the Houston Dynamo finishing 12th in the Index.
For comparison, in benchmarking against whoscored.com, their Index had only 8 of its top 10 teams make the Playoffs, while www.squawka.com matched my 90% success rating – but the team they missed in the top 10 (New England) finished 16th in the Index.
From a strategic standpoint, the End State objective has been met; create a simplified approach and documented method for measuring team performance where the output is an Index that (while excluding points) comes close to matching results in the MLS League Table.
Defining the PWP Attacking and Defending Processes…
Here are the six steps in the PWP Strategic Attacking Process:
- Gain possession of the ball,
- Retain possession and move the ball,
- Penetrate & create goal scoring opportunities,
- Take shots when provided goal scoring opportunities,
- Put those shots taken on goal,
- Score the goal.
Here are the six steps in the PWP Strategic Defending Process:
- Minimize opponent gaining possession of the ball,
- Minimize opponent retaining possession and moving the ball,
- Minimize opponent penetrating and creating goal scoring opportunities,
- Minimize opponent taking shots when provided goal scoring opportunities,
- Minimize opponent putting those shots on goal,
- Minimize opponent scoring the goal.
Every step in this process has an average success rate (percentage) based upon data gathered from all 646 MLS Regular Season games.
Understanding the context of these steps versus other conditions and activities that influence the outcome of a game…
In case you missed it I call these Processes and the Indices “Strategic” to separate their value/meaning relative to other things that can influence the outcome of a game.
I have two other ways to classify information that can influence the outcomes of those steps: Operational conditions and Tactical metrics; some examples of each are provided below.
- Operational conditions: Scheme of maneuver a team uses in setting up their system, such as flat-back four, flat-back three, double-pivot midfield, single-pivot midfield, bunkering with counterattacking, pressing high, direct attacking, possession-oriented attacking, weather conditions, location of the game (home/away), conference foe, non-conference foe, etc…
- Tactical metrics: Locations of shots taken, shots on goal, and goals scored; penalty kicks, free kicks, crosses, headers won/lost, tackles won/lost, interceptions, clearances, blocked crosses, blocked shots, etc.
The diagram below shows the PWP Strategic Attacking Process with the average percentage of success rate in MLS for 2013. A more detailed explanation of each step is provided below the diagram.
Step 1: Gain possession of the ball: The intent behind this basic step should be clear; you can’t win the game if you don’t possess the ball to some extent. A second consideration about this step is that the more you possess the ball the less your opponent possesses the ball.
- From a defensive standpoint there are any number of ways a team can work to gain possession of the ball; they include, but are not limited to, tackling, intercepting, clearing the ball, winning fifty-fifty duels on the ground or in the air, or simply gathering a loose ball based upon a deflection or bad pass.
- For this Process the measurement of success is the percentage of possession a team has in a given game; note that in soccer the primary method for measuring possession is to add up the total number of passes made in a game and divide one team’s passes by that total (creating a ratio percentage of possession); the opposing team has the difference between that percentage and 100%.
- It’s not perfect but it provides a simplified ratio to compare one team versus another…
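The pass-based possession ratio described above can be sketched in a few lines; the pass totals below are hypothetical, not from any real game:

```python
def possession_pct(team_passes: int, opponent_passes: int) -> float:
    """Possession as one team's share of all passes attempted in the game."""
    return 100.0 * team_passes / (team_passes + opponent_passes)

# Hypothetical game: Team A attempts 450 passes, Team B attempts 350.
team_a = possession_pct(450, 350)  # 56.25
team_b = possession_pct(350, 450)  # 43.75 – the two shares always sum to 100%.
```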
Step 2: Retain possession and move the ball: It shouldn’t be a secret that in most cases the team possessing the ball will need to move it in order to penetrate the opponent’s Defending Third and score a goal.
- This is not to say a team needs a minimum number of completed passes to score a goal; for teams winning possession deep in the opponent’s Defending Third, there may be times when the only thing needed is a quick shot on goal.
- By and large, however, most teams – when they gain possession of the ball – do so in their own Defending Third and then move the ball (eventually forward) into a position where a teammate can create a goal scoring opportunity for another team member to take a shot.
- For this process, the measurement of success is the team’s passing accuracy percentage across the entire pitch; passes completed divided into passes attempted.
- It’s not perfect, but it provides a simplified ratio to compare one team versus another; statistically speaking there are weaknesses in how this percentage is measured by the big data folks.
- Throw-ins, for example, move the ball across the pitch from one player to another yet they are not officially counted as passes.
- Successful crosses are also not counted as a successful pass even though the ball moves successfully from one player to another.
- Oddly enough, when evaluating the data provided on the MLS chalkboard, an Unsuccessful cross is included as a Pass attempted (?!)
- For the purposes of this analysis I had to count all successful crosses as successful passes; therefore my final pass completion totals will be slightly higher than what OPTA provides. It is what it is…
- I should also point out here that there are occasions when a team wins possession of the ball and takes a shot where no pass was completed. Like I said, this measurement method is not perfect but it is ‘equal’ in ignoring that exception for all teams.
- Therefore the measurement itself has value in tracking the majority (bell curve) of activities that normally occur in a game of soccer. And as a reminder, these are Strategic steps in PWP; by definition a Strategic step will not measure to that level of granularity – that is where Tactical metrics come into play, for example under an Operational condition where the team is applying pressure higher up the pitch.
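One plausible reading of the cross adjustment described above, sketched with hypothetical numbers (adding successful crosses to the attempt total as well is my assumption; the article only states they were counted as completed passes):

```python
def passing_accuracy(passes_completed: int, passes_attempted: int,
                     crosses_completed: int) -> float:
    """Whole-pitch passing accuracy with the cross adjustment: each
    successful cross is counted as a completed pass (and, by assumption
    here, also added to the attempt total)."""
    completed = passes_completed + crosses_completed
    attempted = passes_attempted + crosses_completed
    return 100.0 * completed / attempted

# Hypothetical game: 400 of 480 passes completed, plus 6 successful crosses.
acc = passing_accuracy(400, 480, 6)  # slightly above the raw 400/480 ratio
```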
Step 3: Penetrate and create goal scoring opportunities: Most know that a pitch is divided into three parts; the Defending Third, Middle Third, and Attacking Third. For the purposes of this effort, Penetration is associated with entering the opponent’s Defending Third with the intent to score.
- For this Process, penetration is measured by dividing the volume of passes a team completes within the opponent’s Defending Third by the volume of passes it completes across the entire pitch.
- It’s not perfect but it creates a ratio that treats all teams fairly, and given the overall accuracy of the End State Index (90%), it’s a reasonable way to measure this step.
- In order to measure this step I first had to manually filter, for all 646 games, every pass attempted and completed using the MLS Chalkboard; my thanks to MLS and OPTA for providing us ‘stats’ guys the opportunity to do that. With Golazo stats now available, that task will be easier next year; as a stats guy, it would have been inappropriate to switch measurement methods three-quarters of the way through the year.
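The penetration ratio for Step 3 is then a single division; a minimal sketch with hypothetical pass counts:

```python
def penetration_pct(final_third_completions: int, total_completions: int) -> float:
    """Step 3: share of a team's completed passes that occur inside the
    opponent's Defending Third."""
    return 100.0 * final_third_completions / total_completions

# Hypothetical game: 90 of 400 completed passes came in the Final Third.
pen = penetration_pct(90, 400)  # 22.5
```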
Step 4: Take shots when provided goal scoring opportunities: This is, by far, the hardest indicator to measure, given how current data sites lack granularity in how they identify/define ‘created goal scoring opportunities’.
- I define a ‘created goal scoring opportunity’ as any pass, successful or not, that may have ended with another teammate taking a shot. That’s hard to quantify, but an example, if you will:
- A fullback overlapping down the right side puts in a wicked cross that gets cleared at the last minute by a center-back, with his head. With OPTA and other data companies that wicked cross, though unsuccessful, is not quantified as a goal scoring opportunity created; it’s merely tracked as a clearance and an unsuccessful pass.
- I disagree; the fullback did their job in putting in that wicked cross – what really happened is the defender also did their job in clearing it – therefore a “potential” for the attacking team to complete a created goal scoring opportunity and take a shot was denied.
- Both the attacking team and defending team should be statistically credited for doing what they are expected to do. Others may disagree…
- But as a Head Coach, I would put to memory that the fullback did what was supposed to happen – create the chance – therefore in my books that player created a goal scoring opportunity.
- For this Process, the step is measured by counting the number of Shots Taken compared to the number of completed passes in the opponent’s Defending Third.
- It’s not perfect, but it’s measured in an unbiased manner for every team, and there will be instances where a shot can be taken without a completed pass or originate from a defensive error.
- Going back to the example: as a Head Coach I would call that effort a “failed assist.” I think there is as much value in knowing the number of “failed assists” as there is in knowing “assists.”
- Tracking “failed assists” provides a pure, statistical way to track individual player performance (a Tactical metric) that can influence team performance.
- Bottom line on this one, as contentious as it may be for some, recall the End State of this Final Index… create a simplified approach and documented method for measuring team performance where the output is an Index that (while excluding points) comes close to matching results in the MLS League Table.
- Given the accuracy rating of 90% in matching the top 10 Playoff teams this year, I think the approach to measuring this indicator works.
- If OPTA, or another data compilation agency starts to track “failed assists”, could an Index like this reach 100% accuracy?
Step 5: Put those Shots Taken on Goal: For the most part this is an individual statistic that is added up to create a team performance indicator.
- For this process, the step is measured by dividing the number of Shots on Goal by the number of Shots Taken.
- It’s one of the easier indicators to measure, and if you watch any level of soccer, it’s pretty self-explanatory – if the Shot comes anywhere within the dimensions of the Goal, it is considered a Shot on Goal. One of two things happens; it goes in or it doesn’t.
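The measurements for Steps 4 and 5 can be sketched the same way (hypothetical numbers again):

```python
def shots_per_ft_completion(shots_taken: int, final_third_completions: int) -> float:
    """Step 4: Shots Taken relative to completed passes in the
    opponent's Defending Third."""
    return 100.0 * shots_taken / final_third_completions

def sog_per_shot(shots_on_goal: int, shots_taken: int) -> float:
    """Step 5: share of Shots Taken that end up on target."""
    return 100.0 * shots_on_goal / shots_taken

# Hypothetical game: 90 Final Third completions, 14 shots, 5 on goal.
step4 = shots_per_ft_completion(14, 90)  # roughly 15.6
step5 = sog_per_shot(5, 14)              # roughly 35.7
```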
Step 6: Score the Goal: One critical objective of the game.
- I say ‘one’ because the indications I see lead me to offer that this game is not all about scoring goals.
- In my research it appears to me that teams who defend better seem to take more points in games than teams that don’t defend very well.
- A recent example in my End of Season analysis of Vancouver: in Western Conference competition, they scored 35 goals and gave up 35 goals; all told they took just 26 of 72 possible points – clearly, in this example, scoring goals did not result in wins…
- Prozone, a noted professional sporting analysis company, offers the following in the article: “Using data from the last ten seasons of the Premier League, Anderson and Sally compared the value of a goal scored and the value of a goal conceded. They found that scoring a goal, on average, is worth slightly more than one point, whereas not conceding produces, on average, 2.5 points per match. Goals that don’t happen are more valuable than goals that do happen.”
- It’s not perfect, but it provides reasonable information in a reasonable format that has reasonable value when comparing the End State output to how the MLS League Table finished.
- For those interested the PWP Strategic Attacking Index and Defending Index are provided below:
- In looking at these two Indices, note the number on the left; the difference between a team’s number in the Attacking Index and its number in the Defending Index is the number that appears in the Final Strategic Index at the beginning of this article.
- That may help explain why some teams finished above zero, as opposed to below zero in the Final Index.
- Teams finishing above zero had team attacking percentages that exceeded their team defending percentages; in other words, they were better in attacking their opponents than their opponents were in attacking them.
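The relationship in the bullets above can be sketched as follows; the step percentages are invented, and the equal-weight averaging of the six steps is my assumption for illustration (the article does not spell out the exact weighting):

```python
def composite_index(step_pcts):
    """Average a team's six PWP step success percentages into one number
    (equal weighting assumed here for illustration)."""
    return sum(step_pcts) / len(step_pcts)

# Hypothetical step percentages for one team:
attacking = [55.0, 80.0, 25.0, 15.0, 45.0, 40.0]   # its own six attacking steps
defending = [45.0, 78.0, 22.0, 14.0, 42.0, 35.0]   # opponents' steps against it
final_index = composite_index(attacking) - composite_index(defending)
# Positive: the team attacks its opponents better than they attack it.
```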
- Team success rates in these six steps will be used next year to begin to analyze how well the team is performing as the new season starts compared to performance the previous year.