Category: Team Analysis

Redefining and Modernizing Total Shots Ratio

For many years Total Shots Ratio (TSR) has plodded along as a good indicator of team shooting performance – not overall team performance, but shooting performance specifically.

It’s a good enough indicator that it’s found its way into generic match reports for professional soccer teams and has good visibility on Opta – a well-recognized soccer statistics company now owned by Perform Group.

But all that publicity and ‘usability’ doesn’t make it ‘right’!

Why do I say that?

Within a game of football there are always two teams playing against each other – so team performance statistics should take into account not only what the attacking team is doing but also what the opponent is doing to the attacking team.

So what do I mean by modernizing TSR?  Most define TSR as simply the volume of shots one team takes versus the volume of shots the other team takes.  That’s okay, but the end state – the result, a goal scored – is excluded.

So my new vision of TSR centers on the end state as well as the volume – in other words the equation for Attacking TSR (ATSR) becomes Goals Scored/Shots Taken, and Defending TSR (DTSR) becomes your opponent’s Goals Scored/Shots Taken.

Finally, in looking at how well Composite Possession with Purpose correlates to Points Earned in the League Table I would create Composite TSR (CTSR).
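
To pin down those definitions before going further, here is a minimal sketch in Python.  The function names and sample totals are placeholders of my own, and I am assuming (my assumption – the combination isn’t spelled out above) that the composite is simply the attacking ratio minus the defending ratio.

```python
# A minimal sketch of the TSR family described above (placeholder numbers, not real data).
# Assumption (mine): CTSR combines the two ratios as ATSR minus DTSR.

def atsr(goals_for, shots_for):
    """Attacking TSR: your goals scored per shot taken."""
    return goals_for / shots_for if shots_for else 0.0

def dtsr(goals_against, shots_against):
    """Defending TSR: your opponent's goals scored per shot they take against you."""
    return goals_against / shots_against if shots_against else 0.0

def ctsr(goals_for, shots_for, goals_against, shots_against):
    """Composite TSR: attacking ratio minus defending ratio (my assumed combination)."""
    return atsr(goals_for, shots_for) - dtsr(goals_against, shots_against)

# Made-up season totals for one team:
print(round(atsr(55, 430), 3))           # 0.128
print(round(dtsr(40, 380), 3))           # 0.105
print(round(ctsr(55, 430, 40, 380), 3))  # 0.023
```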

Before getting to the numbers – some history first:

I built Possession with Purpose using this philosophy and if you’ve been following my efforts for the last two years you know that my correlations to points earned in the league table are extremely high…  To date:

  • MLS 2014 = .86
  • Bundesliga = .92
  • English Premier League = .92
  • La Liga = .91
  • UEFA Champions League =.87

So let’s peel back the regular way TSR correlates to Points earned in last year’s MLS – when viewing the old way (Total Shots only as a percentage for both teams) the Correlation Coefficient “r” for the entire league was .32.

My new way of calculating CTSR, with the end state of goals scored included, has a correlation coefficient “r” of .75.

Far higher…  now for some data.

Here’s the correlation my new TSR Family of Indices shows with respect to Points Earned in the League Table – the same analysis used with respect to CPWP above:

  • MLS 2014 ATSR (.74) DTSR (-.54) CTSR (.75)
  • Bundesliga ATSR (.53) DTSR (-.41) CTSR (.68)
  • EPL ATSR (.86) DTSR (-.35) CTSR (.76)
  • La Liga ATSR (.88) DTSR (-.77) CTSR (.92)
  • UEFA ATSR (.64) DTSR (-.40) CTSR (.65)

Like CPWP the correlations vary – in four of five competitions the CTSR has a better correlation to points earned in the league table – while in one case (the EPL) ATSR has the best correlation.

So how do the numbers stack up for some individual teams when evaluating ATSR, DTSR, CTSR, and CPWP compared to those teams points earned throughout the season?

In other words what do the correlations look like (game to game) through the course of a season for sample teams within each of those Leagues?

(Table: sample-team correlations – ATSR, DTSR, CTSR, and CPWP – versus points earned)

In almost every sample TSR (now ATSR) has a lower overall correlation to a team’s points earned in the League Table than CTSR (Borussia Dortmund and Barcelona being the exceptions) – this follows the same pattern seen with CPWP almost always having a higher correlation than APWP, and Goal Differential almost always having a higher correlation than Goals Scored.

I’ve also taken the liberty of highlighting which Composite Index has the best correlation to points earned among all four categories – in every instance either CTSR or CPWP is higher than TSR.  But, as can be seen, sometimes CTSR is higher than CPWP…

What this shows is that there simply isn’t one Index that is far better or far worse than the others – different teams play different styles that yield better relationships to points earned in different ways – meaning there is not only room for improvement in current TSR statistics but also room for the inclusion of PWP principles within the industry standard.

I would offer – however – that even when you create CTSR the backbone of that data can’t offer up supporting analyses on how a team attacks or defends.  It’s still only relevant to the volume of shots taken and goals scored.

And while the volume of shots on goal and goals scored appears to be fairly constant across most competitive leagues (averages greater than 5 and 2, respectively, for teams winning on a regular basis), the average number of shots taken by winning teams is not as constant (see Expected Wins 4) – which is why I favor PWP over TSR.  Nothing personal – just my view…

In Closing:

I’m not sure I did a good job of comparing what I view as the old way to calculate TSR (the way that ignores the End State of Scoring a goal) and how an update to it can help tell a better story that actually correlates better to the complexities of soccer.

Best, Chris

COPYRIGHT, All Rights Reserved.  PWP – Trademark

Improving Possession with Purpose

Throughout this three-year effort I have always wanted to make time to review the process and look for ways to improve the output while retaining the integrity of the End State (creating an Index that matches, as closely as possible, the League Table without using points earned).

A critical part of this has always been to ensure that the data points used within the Index had relevance (made sense) to how the game is played.

For three years my data points within Possession with Purpose have been:

  1. Passes Attempted across the Entire Pitch
  2. Passes Completed across the Entire Pitch
  3. Passes Attempted within and into the Final Third
  4. Passes Completed within and into the Final Third
  5. Shots Taken
  6. Shots on Goal, and
  7. Goals Scored

My new and improved PWP Family of Indices will continue to leverage these relevant data points, but I am making a modification with respect to how quality is measured from those data points.  With the modifications, the overall measurement of PWP becomes:

  1. Possession Percentage
  2. Passing Accuracy across the Entire Pitch
  3. Passing Accuracy within and into the Final Third
  4. Percentage of Passes Completed across the Entire Pitch versus Passes Completed within and into the Final Third
  5. Shots Taken per Passes Completed within and into the Final Third
  6. Shots on Goal per Shots Taken, and
  7. Goals Scored per Shots on Goal (times 2)

The two modified measures – item 4 (passes completed within and into the Final Third relative to all passes completed) and item 7 (goals scored per shot on goal, doubled) – make up the new additions to the Index composition; they were originally highlighted in boldface…

Why?

Well, for me – in how PWP has developed – I don’t think I quite captured the more significant intent of a team to penetrate (given any style of attack – direct, counter, or short-pass engagement, given conditions on the pitch), nor do I think I really captured the considerable value of a goal scored – in any fashion (be it in run-of-play or via set-piece).

I don’t think this violates the integrity of the general tendency of teams and their behavior – I think it actually better represents the importance (weight) of a goal scored, as well as the considerable advantage some teams show in being more accurate (in passing) as space on the pitch diminishes.

Finally, in making this adjustment I don’t violate the integrity of the original data points collected – I am just finding a better way to translate that quantity of information into a different output relative to quality.
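
Before looking at the before-and-after diagrams, here is a hedged sketch of the revised arithmetic.  The list above names the seven measures but doesn’t spell out exactly how they are combined, so this sketch assumes (my assumption) that Attacking PWP is the simple sum of the seven ratios, with the goals-per-shot-on-goal term doubled per item 7, and that Composite PWP is the attacking value minus what the opponent produced in the same match.

```python
# A sketch of the revised PWP measures. Assumptions (mine, not stated above):
# APWP = simple sum of the seven ratios, with the item-7 term doubled;
# CPWP = own APWP minus the opponent's APWP for the same match.
from dataclasses import dataclass

@dataclass
class TeamMatch:
    possession_pct: float   # share of possession, 0-1
    passes_att: int         # passes attempted, whole pitch
    passes_cmp: int         # passes completed, whole pitch
    f3_passes_att: int      # passes attempted within/into the final third
    f3_passes_cmp: int      # passes completed within/into the final third
    shots_taken: int
    shots_on_goal: int
    goals: int

def ratio(a, b):
    return a / b if b else 0.0

def apwp(t: TeamMatch) -> float:
    return (
        t.possession_pct                              # 1. possession percentage
        + ratio(t.passes_cmp, t.passes_att)           # 2. passing accuracy, whole pitch
        + ratio(t.f3_passes_cmp, t.f3_passes_att)     # 3. passing accuracy, final third
        + ratio(t.f3_passes_cmp, t.passes_cmp)        # 4. final-third completions vs all completions
        + ratio(t.shots_taken, t.f3_passes_cmp)       # 5. shots taken per final-third completion
        + ratio(t.shots_on_goal, t.shots_taken)       # 6. shots on goal per shot taken
        + 2 * ratio(t.goals, t.shots_on_goal)         # 7. goals per shot on goal, doubled
    )

def cpwp(us: TeamMatch, them: TeamMatch) -> float:
    return apwp(us) - apwp(them)

# Made-up single-match example:
home = TeamMatch(0.54, 480, 400, 130, 95, 14, 6, 2)
away = TeamMatch(0.46, 410, 315, 90, 58, 10, 3, 0)
print(round(cpwp(home, away), 3))
```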

So how do these changes manifest themselves in the data outputs?  I’ll let the diagrams and Correlation Coefficient (R) speak for themselves.

Major League Soccer 2014:  (Before and After)

English Premier League: (Before and After)

Bundesliga: (Before and After)

La Liga: (Before and After)

Major League Soccer 2015: (Before and After)

Gluck: Fourth Year Anniversary Edition

My thanks to everyone who has supported my web site the last four years!

It’s been a learning experience for me and, I hope, for you too.

As the new year starts I’ve got at least five new articles planned; here’s a quick synopsis on what to expect:

  • Following up on Coaching Youth Soccer Part I and Coaching Youth Soccer Part II, I’ll be offering Coaching Youth Soccer Part III – digging into which team statistics to use, why, when, and how to use them.  For those who don’t know me, these three articles distill my coaching philosophy into one three-word catchphrase: “muscle memory mentality”.
  • Two new individual soccer statistics:  This may be controversial – my intent is to submit two new professional-level individual soccer statistics that could transform the player market value system.

Said differently: are private statistics companies, like Prozone Sports, OPTA, and InStat (along with player agents), manipulating the player market value system by ignoring what might be the most logical, intuitive, individual soccer statistics ever?

  • Expected Points – An updated version of my previously created Expected Wins series of articles.  A follow on to what was offered at the World Conference on Science & Soccer 2017, Rennes, France.
  • Expected Goals – A new way to calculate this over-hyped soccer statistic that brings it a bit closer to reality.
  • World Cup 2018 Total Soccer Index; to include predicting the winners after round one is complete.

For now, in case you missed one or two, here’s my rundown on the top five articles in each of the last four years.

In Closing:

  • I called for Jurgen Klinsmann to be sacked after WC 2014 because his tactics and in-game adjustments weren’t up to snuff.  Three years later the rest of the American mainstream soccer media world agreed and Klinsmann was sacked.
  • I called for Sunil Gulati to be ‘ousted’ after WC 2014 because his leadership in helping youth development and head coach selection wasn’t up to snuff.  Three years later the rest of the American mainstream soccer media world agreed and Gulati is out.
  • In hindsight – I wonder where we’d be in youth soccer development if we’d made those decisions three years ago?
  • No, I do not favor Caleb Porter as the next US Men’s National Team head coach.  I like Caleb; he’s a stand-up guy and always took time to share and listen.  That said, in my opinion, he’s not (consistently) good enough at reading in-game situations and making tactical adjustments that lead to better performances; the exact same issue I had with Jurgen Klinsmann.
  • I’m hopeful either Eric Wynalda or Steve Gans is elected as the next United States Soccer Federation President; electing Kathy Carter is a NO-GO in my view as there’s perceived ‘collusion’ between MLS and SUM.  As a retired Air Force veteran, I hold that perception is reality until proven otherwise – some may disagree?

I wish you all the best for the new year.

Best,

CoachChrisGluck

Gluck – Clearances – Clearing the Air or Clear as Mud?

My after season team performance analysis continues.

This week I’m taking a look at Clearances – a much-used individual statistic that many rely on to rate the value of central defenders.  But does it add real value?  Are defenders with more clearances better than those with fewer clearances?

My research looks at clearances from two different perspectives:

  1. What is the relationship of clearances by the opponent relative to your team earning points, and
  2. What is the relationship of defensive clearances by your own team relative to your team earning points?

For Major League Soccer 2016, the number of clearances made by each team, and by their opponents, was counted per game.

I then did a simple correlation on the number of clearances (per game) relative to the points earned (per game).  Pictured below is a summary of the two perspectives relative to their correlation to points earned.

clearances

Diagram 1

Initial observations:

  • Diagram 1:  The average number of opponent clearances (per game – right part of the diagram) has a (-.30) correlation to points earned.
  • Diagram 1:  The average number of defensive clearances (per game – left side of the diagram) has a (+.30) correlation to points earned.
  • Diagram 1:  It’s pretty clear that the correlations vary (considerably) from team to team; a minimal sketch of how these correlations are computed follows below.
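
Here is that sketch – a minimal Python example of the correlation step, using made-up rows purely to show the mechanics; the team codes, column names, and values are placeholders, not the actual 2016 data.

```python
# A minimal sketch of the correlation step described above: per-game clearances
# (own and opponent) versus per-game points earned, league-wide and per team.
# All rows below are made up purely to show the mechanics.
import pandas as pd

games = pd.DataFrame({
    "team":           ["SKC", "SKC", "SKC", "DCU", "DCU", "DCU", "SJ", "SJ", "SJ"],
    "own_clearances": [18, 25, 20, 22, 30, 26, 27, 19, 24],
    "opp_clearances": [21, 15, 17, 24, 26, 22, 20, 23, 18],
    "points":         [3, 1, 3, 0, 1, 0, 3, 0, 1],
})

# League-wide correlations (Diagram 1 reports roughly +.30 and -.30):
print(games["own_clearances"].corr(games["points"]))
print(games["opp_clearances"].corr(games["points"]))

# Per-team correlations (the team-to-team variation noted above):
per_team = games.groupby("team")[["opp_clearances", "points"]].apply(
    lambda g: g["opp_clearances"].corr(g["points"])
)
print(per_team)
```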

Average of opponent’s clearances per game versus points earned:

  • Sporting KC gained the most benefit from a lack of opponent clearances throughout the season; their correlation was (-.48).  In other words Sporting KC were more inclined to earn points when the opponent had fewer clearances.  This seems reasonable, especially since Sporting KC offered up the second-most crosses (19 per game) this year.  The fewer of those crosses the opponent cleared, the more likely Sporting KC were to convert them into goals scored.
  • DC United got the least benefit from a lack of opponent clearances; their correlation was (-.08).  In other words the number of clearances by their opponents, throughout the season, had little to no overall impact on DC United earning points.  This also seems reasonable since DC United offered up the third-fewest crosses (14 per game) this year.  With so few crosses offered, it seems reasonable that this mode of creating scoring chances was less likely to occur.

What’s that mean?

  • For me, I would offer it means the number of defensive clearances an opponent has, per game, isn’t really a strong team indicator.

Average of defensive clearances per game versus points earned:

  • San Jose gained the most benefit from defensive clearances (.57); meaning San Jose were more inclined to earn points when having more defensive clearances per game.  This seems reasonable as San Jose faced an average of 20.5 crosses and over six corners per game; tied for 8th most in each category across the league.  A higher volume faced should result in a higher volume of clearances.
  • New York Red Bulls gained the least benefit from defensive clearances (-.01); meaning the Red Bulls were just slightly more inclined to earn points when they had fewer defensive clearances (per game).  What is unusual with New York is they averaged a greater number of defensive clearances (21 per game) but faced fewer crosses and corners than San Jose.

What’s that mean?

  • For me, I would offer it means (again) the number of defensive clearances a team has, per game, doesn’t greatly determine the outcome of a game.

In conclusion:

  • If neither opponent clearances per game, nor your own team’s defensive clearances per game, have a strong correlation to points earned, then the individual player statistics that make up those clearance totals won’t have much value either.
  • If anything – given the wide variation in the value of clearances relative to points earned – a player’s individual clearances (per game) should be weighted relative to that game, and that game only; recognizing that the ‘weight’ of those clearances is subject to change every single game.
  • Perhaps what’s really missing here is the volume of “clearances not made” instead of “clearances”?
  • Finally, as a ‘giggle check’ if-you-will, I did take a look to see if the correlation of clearances was over .50 relative to the number of opponent crosses and corners offered – it was.  The average correlation across the league was .71 – quite strong…  see Diagram 2 below.

    clearances-vs-opponent-crosses-and-corners

    Diagram 2

  • So our own common sense is supported by data analysis.
  • Said differently; “common data sense” shows the volume of clearances are related to the volume of crosses and corners.
  • Therefore… (in my view)
    • If “the common data sense” (shown in Diagram 1) does not show the volume of clearances having a strong relationship to earning points then our own common sense should follow that view.
  • Again reinforcing that individual defensive clearances, as an individual statistic, do not add real value at all.

Best, Chris

COPYRIGHT – All Rights Reserved.  Trademark: PWP

English Premier League – Possession with Purpose – Week 2

Two weeks in and Manchester City pretty much throws the gauntlet down against Liverpool and walks away with a dominating win.

Three other teams have also begun the season with six points (Spurs, Swansea, and Chelsea) but do those four teams show the most consistency with purpose in possession, penetration and creation of shots taken that result in goals scored?

And, do those same four teams show the most consistency in preventing their opponents from doing the same thing to them?

What about the early season dogs (QPR, Burnley, Crystal Palace, and Newcastle) – where do they fit?

I’ll try to answer those questions without too much detail given the season is just two weeks old.

So to begin; here’s the Composite PWP (CPWP) Strategic Index after Week 2:

CPWP EPL AFTER WEEK 2

Observations:

  1. A quick look at the table sees the top four in the Index as being the top four in the Table – not specifically in order but there it is.
  2. In looking at the bottom end of the Table the bottom four teams in the Index match exactly the bottom four in the Table.
  3. I doubt very much the level of accuracy will match the League Table that well throughout the year.
  4. Of note is that Arsenal, Hull and Aston Villa are next up in the Table but Villa seems to drift down a bit in the CPWP; perhaps the APWP or DPWP might explain that drift compared to Arsenal or Hull City?
  5. As a reminder – the End State of the Index is to provide an objective view of team performance indicators that don’t include Points in the League Table.  In other words it’s a collection of data points that, when combined, can show which team activities are directly supporting results on the pitch – and sometimes results on the pitch don’t match points earned…
  6. In leveraging this Index last year in the MLS it was very accurate in reflecting why certain Head Coaches may have been sacked – in a League like the EPL (where everything is expensive) perhaps this Index might have even more value to ownership?
  7. Movement in the Index – in the MLS, this last year, I have seen teams move up as many as 12 places and down as many as 11 places – after the 4th week – so the Index is not likely to stay constant – there will be changes.

I do not quantify Index outputs specific to individual player acquisition or performance – there is no intent to do this.  It’s my belief, good or bad, that even with individual star performances a team is a team is a team – you win as a team and you lose as a team… but this Index isn’t intended to stop others from doing that.

I leave that individual analyses for others who are far better at digging into the weeds than I – for the EPL I’d imagine many folks gravitate to @statsbomb or other @SBNation sites – I respect their individual analyses as I hope they respect my team analyses.

Whether the consistency of value shows itself in assessing team performance in the EPL like it has in Major League Soccer I have no idea – we will follow that journey, in public, together…

Now for Attacking PWP (APWP):

APWP EPL AFTER WEEK 2

Observations:

  1. In recalling Villa’s drift (it is still early) perhaps it’s an early indication that Villa are playing slightly more direct (given past indications analyzing Major League Soccer) – or with a greater lean towards counter-attacking and quick transition?
  2. In taking a quick look at their average volume of passes per game (305) compared to the rest of the EPL (456) it would seem to indicate Villa are playing more direct football.
  3. The team with the highest APWP while falling below the average number of passes attempted, per game, is Leicester City; they average 308 passes per game compared to the 456 average of EPL.  For me that’s an early indicator that they are making the best use of a direct attacking scheme – others may have a different view?
  4. The team with the lowest APWP while showing a higher-than-average number of passes attempted (~500 per game) is Stoke City – that might indicate the Potters are looking to possess the ball for possession’s sake as opposed to penetrating with it.  Folks who follow Stoke a bit closer might be able to add to that as I’ve yet to see them play this year.
  5. In terms of early form, relative to the six team performance indicators, Chelsea are tops with Everton, Arsenal, and Man City close behind.
  6. With respect to bottom feeders QPR are bottom in CPWP and bottom in APWP as well; most figured they’d be early favorites for relegation – the PWP Indices seem to lean that way already as well…
  7. Perhaps the early surprise in APWP is Newcastle?  Not sure about that one – last time I lived in England Alan Shearer was their striker and probably the best one in the country at that time…  others will know better about what Alan Pardew is up to…

Next up Defending PWP (DPWP):

DPWP EPL AFTER WEEK 2

Observations:

  1. Leaders here include Spurs, Man City, Swansea and Newcastle – is this an early indicator that Newcastle has experienced bad luck already?  Not sure but three of the bottom dwellers here are three of the four bottom dwellers in CPWP.
  2. Although not really clear here, it might be easy to forget that Arsenal had a blindingly great first game and then eked out a draw against Everton in the last ten minutes; worth considering that this data still represents just two games…
  3. Recall Stoke City – and the potential view that they might be possessing the ball with an intent to possess more-so than penetrate – even with just 1 point in the League Table their DPWP exceeds West Ham, Liverpool, and others who are further up the table.
  4. Man City showed great nous last year in winning the League and it reaffirmed for many of us the importance of defending – Liverpool were close last year given an awesome attack – players have changed but it’s likely the system/approach has not varied that much.  And after two games Liverpool are embedded firmly in the middle of the DPWP pack.
  5. Can they push higher up the DPWP? And if so, will that climb in the DPWP Index match a climb in the League Table; or vice versa?

In Closing:

Far too early to look for trends but these first few weeks will provide a baseline for future trends.

As noted in my most recent articles on Possession, the more accurate soundbite on whether or not a team is likely to win relates to Passing Accuracy (>77% in MLS usually means a team is more likely to win), not Possession.

The margin between winning and losing in MLS is far too muddied when looking at Possession alone – so as the EPL season continues I will also make it a point to study which ‘soundbite’ has more relevance: Passing Accuracy or Possession.
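
A hedged sketch of how I plan to run that comparison: flag each game by whether the team cleared a passing-accuracy threshold (the 77% figure quoted above) or out-possessed its opponent, then compare win rates under each flag.  Every row in the data frame below is illustrative only; nothing in it is real EPL or MLS data.

```python
# A sketch comparing the two 'soundbites': passing accuracy above a threshold
# versus simply out-possessing the opponent. All rows are made up.
import pandas as pd

games = pd.DataFrame({
    "pass_accuracy": [0.81, 0.74, 0.79, 0.70, 0.83, 0.76],
    "possession":    [0.55, 0.48, 0.44, 0.60, 0.52, 0.41],
    "won":           [1, 0, 1, 0, 1, 0],
})

accurate   = games["pass_accuracy"] > 0.77   # the MLS threshold mentioned above
possessing = games["possession"] > 0.50

print("Win rate, accuracy > 77%: ", games.loc[accurate, "won"].mean())
print("Win rate, accuracy <= 77%:", games.loc[~accurate, "won"].mean())
print("Win rate, more possession:", games.loc[possessing, "won"].mean())
print("Win rate, less possession:", games.loc[~possessing, "won"].mean())
```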

Other links that may be of interest to you include:

Possession with Purpose

My presentation at the World Conference on Science and Soccer

New Statistics (Open Shots and Open Passes)

Thanks in advance for your patience.

Best, Chris

COPYRIGHT, All Rights Reserved.  PWP – Trademark

Separating winners from losers in Major League Soccer…

We are past the halfway point in Major League Soccer this year and if you recall from this previous article I promised I would revisit my Expected Wins analysis again at about this stage.

To continue to chart the progress of PWP, to include the data points behind the calculations, I am offering up some diagrams on what the data looks like after:

  1. The 92 game mark of the MLS Regular Season (184 events).
  2. The 183 game mark of the MLS Regular Season (366 events).
  3. The same data points for World Cup 2014  (128 events).

For background details on Possession with Purpose click here.

To begin…

A reminder of how things looked after 184 Events (92 Games)…

MLS EXPECTED WINS AFTER 184 EVENTS

Trends indicated that winning teams passed the ball more, completed more passes, penetrated the final third slightly less but completed more of their pass attempts in the final third.

For shooting; winning teams shot slightly less by volume but were far more successful in putting those shots on goal and scoring goals.

For details you can enlarge the diagram and look for your specific area of interest.

As for how the trends show after 366 Events (183 Games)…

MLS EXPECTED WINS AFTER 366 EVENTS

Winning teams now average fewer pass attempts and complete slightly fewer passes.

There is a marked decrease in pass attempts into the opposing final third and slightly fewer passes completed within the final third.

In other words – teams are counter-attacking more and playing a style more related to ‘mistake driven’, counter-attacking, as opposed to positive attacking leading into the opponents final third.

As for shooting; winning teams are now taking more shots, with more of those shots being on goal and more of those resulting in a goal scored.

In my opinion what is happening is that teams are taking advantage of poor passing accuracy to generate turnovers.

In turn those turnovers are generating cleaner and clearer shots given opponent poor positional play on the transition.

My expectation is that more teams will now begin to focus on bringing in newer players that have better recovery skills and can defend better.

In contrast, here’s how these same data points look after completion of the World Cup of 2014… there is a difference…

WORLD CUP EXPECTED WINS AFTER 128 EVENTS

Winning teams average more passes attempted and far more completions than losing teams.

In addition winning teams penetrated far more frequently than losing teams, and that increase in penetration also translated to an increase in passes completed within the final third.

With respect to shooting; winning teams shot more, put more shots on goal, and scored far more goals.

Clearly what we see here is that quality in player skill levels also translated to an increase in quantity.

That should become even more apparent in looking at the PWP outputs for MLS and World Cup Teams…

Here they are for MLS at the 184 Events point this year:

MLS SIX STEPS OF PWP AFTER 184 EVENTS

A quick review of the data outputs shows winning teams averaged 51% possession and were 2 percentage points better in overall passing accuracy.

That passing accuracy advantage also carried into the final third but when taking shots losing teams averaged more shots taken, per penetration, than winning teams.

Bottom line here: winning teams turned those fewer shots taken into more shots on goal and more goals scored than losing teams.

After the 366 Event point this is how those same outputs look…

MLS SIX STEPS OF PWP AFTER 366 EVENTS

Like the indicators in the PWP data points, the percentages here are beginning to reflect the counter-attacking style of football taking over as the norm.

Winning teams now, on average, possess the ball less than their opponents… wow… mistake driven football is taking hold across the MLS.

As for Passing accuracy within and outside the final third…

Winning teams continue to be better in passing – and that level of accuracy is driving a large increase in shots taken, per penetration, by winning teams compared to losing teams (almost 2% different).

That is a marked difference (4% swing), from earlier, where losing teams shot more frequently, per penetration, than winning teams.

In addition that increase in shots taken, per penetration, also results in more shots on goal, per shot taken, and more goals scored, per shot on goal.

The margin between winning teams, and losing teams, for goals scored versus shots on goal, at the 184 Event point versus 366 Event point, still remains > 29%.

So how about teams in the World Cup?

WORLD CUP SIX STEPS OF PWP AFTER 128 EVENTS

Like earlier, winning teams not only passed the ball more frequently, they also possessed the ball more – by nearly 5 percentage points (52.56% to 47.89%).

So contrary to what others might think – tiki-taka is not dead, it’s just been transformed a wee bit…

With respect to passing accuracy…

I’m not sure it can be any more clear than this – winning teams averaged 82.40% and losing teams averaged 80.46%.

What makes these outputs different from MLS is that the level of execution is far higher in passing accuracy; by as much as 6%.

To put that in perspective: if a team attempts 500 passes in MLS, that equals roughly 380 passes completed – compared to 412 passes completed by World Cup teams; clearly the level of execution is much higher.

That difference of 32 passes completed can have a huge impact when penetrating and creating opportunities within the final third.

What makes it even tougher is that the quality of defenders is significantly higher at the World Cup level as well.

With respect to penetration and creation within the final third…

World Cup winning teams averaged 2% greater penetration per possession than winning teams in the MLS.

By contrast World Cup winning teams generated fewer shots taken per penetration than those in the MLS.

Does this speak to better defending?  I think so…

What I think is happening is that quality gets the team into the box, but then the quality of the defenders and goal keepers, in that confined space, is taking over.

This should be evident, even more so, when seeing that winning teams in the World Cup also put fewer shots on goal per shot taken than winning teams in MLS.

And that also translated to goals scored: winning teams in the World Cup scored fewer goals per shot on goal than winning teams in MLS…

In closing…

All told, winning teams in the World Cup displayed slightly different average percentages than winning teams in MLS, with one exception – passing accuracy.

And given the importance of the tournament it’s no wonder…

Without having the data yet, I’d expect that for the better teams in the EPL, Bundesliga, and other top European leagues, that difference in passing accuracy would remain.

As for the difference in possession (winning teams clearly possessing the ball more than losing teams) I’m not sure – mistake-driven football, if memory serves, is an approach Chelsea have used in the past…

I’d imagine it’s a pendulum-type effect – as more teams work towards mistake-driven football, more teams will strengthen their ability to recover and open the game up a bit with direct attack to force the opponent out of pressing so high.

I’ll be looking for additional trends as the year progresses to see if direct play increases – perhaps a good indicator of that might be even fewer penetrations and more crossing?

With respect to statistical relevance of the data and the outputs generated…

In every case the relationships created, be they exponential or 4th-order polynomial, had correlations that exceeded .95.

In other words the variations are minimal and should really reinforce just how tight the difference is between winning and losing in a game of soccer…
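
For anyone who wants to reproduce that kind of goodness-of-fit check, here is a minimal sketch: fit a 4th-order polynomial to an indicator versus the outcome it should track and report R-squared.  The x and y values below are placeholders of my own, not the actual PWP data.

```python
# A sketch of the goodness-of-fit check described above: fit a 4th-order
# polynomial and report R-squared. The x/y values are placeholders.
import numpy as np

x = np.array([0.42, 0.47, 0.51, 0.55, 0.58, 0.63])  # e.g. an indicator, binned
y = np.array([0.8, 1.0, 1.3, 1.6, 1.9, 2.4])        # e.g. average points per game

coeffs = np.polyfit(x, y, deg=4)     # 4th-order polynomial fit
fitted = np.polyval(coeffs, x)

ss_res = float(np.sum((y - fitted) ** 2))
ss_tot = float(np.sum((y - y.mean()) ** 2))
print(round(1 - ss_res / ss_tot, 3))  # the article reports values above .95
```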

Best, Chris

Re-tweets appreciated…

COPYRIGHT, All Rights Reserved – PWP, Trademark

You say you want a Revolution – A different angle on PWP and Team Performance

New England have put together a superb run of five wins and a draw in six games; by most standards that is a compelling argument for consistency.  I agree, and their overall Composite Possession with Purpose Index rating continues to climb.

They’ve climbed from 17th in PWP (week 5) to 7th after week 11; a superb shift of 10 full places in 6 weeks.

So in considering this giant push forward I’d like to take a different approach in how the data points from PWP can be viewed.  

This is new so please bear with me for a minute or two as I set the context.

Below are a number of diagrams referencing my PWP indicators for a few teams; the diagram being used this time is the ‘doughnut’ diagram from Microsoft Powerpoint.

The interesting thing about this diagram is that it allows me to offer up a view on my PWP data points that isn’t relative to the exponential relationship (a line). Instead, it allows me to picture the overall tenor of PWP data points in relationship to themselves as being a part of a ‘whole’; with the ‘whole’ being PWP.

I feel confident I can take this approach since my Expected Wins 2 correlation for my data points is ~.97 (R2) — as near to rock solid as you can get.

Other context points include:  

  • The teams used in this analysis are Seattle, New England, Montreal, Portland and last year’s Supporters’ Shield winner (New York), plus last year’s bottom dweller (DC United)
  • Reminder, in case my explanation was a bit wordy above – the percentages indicated in the doughnut are not the percentages of those activities relative to the game; they are the percentages of those activities relative to each other, with 100% being all those activities added together (a quick sketch of that normalization follows this list).
  • Source – as usual the MLS Chalkboard and the MLS Statistics Sheets
  • Gold Stars on the diagrams are intended to show you where differences occur.
  • The team name on the outside of the doughnut is the outer ring of data and the team name on the inside of the doughnut is the inner ring of data.
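
Here is that sketch – a hedged example of how the doughnut could be reproduced with matplotlib: each team’s PWP data points are normalized so they sum to 100% and then drawn as two concentric rings.  The category labels and numbers are illustrative placeholders, not the actual Week 11 values.

```python
# A sketch of the 'doughnut' view described above: each team's PWP data points
# are normalized to sum to 100%, then drawn as two concentric rings
# (outer ring = first team, inner ring = second). Values are made up.
import matplotlib.pyplot as plt
import numpy as np

labels = ["Possession", "Passing", "Final-third passing",
          "Penetration", "Shots taken", "Goals scored"]
ner  = np.array([52.0, 78.0, 70.0, 29.0, 6.0, 15.0])   # placeholder New England values
mifc = np.array([48.0, 74.0, 66.0, 26.0, 11.0, 7.0])   # placeholder Montreal values

fig, ax = plt.subplots()
ax.pie(ner / ner.sum(), radius=1.0, labels=labels, autopct="%1.0f%%",
       pctdistance=0.85, wedgeprops=dict(width=0.3, edgecolor="white"))
ax.pie(mifc / mifc.sum(), radius=0.7, autopct="%1.0f%%",
       pctdistance=0.75, wedgeprops=dict(width=0.3, edgecolor="white"))
ax.set(aspect="equal", title="PWP shares: outer ring NER, inner ring MIFC")
plt.show()
```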

To begin…

PWP Doughnut Diagram Week 11 NER v MIFC

The volume of Final Third passes successfully completed by New England (29%) is 3 percentage points higher than Montreal’s (26%).  Note also that Montreal has a greater percentage of PWP outside the Final Third (30%) than New England (28%).  Both of these indicate to me that New England is more focused on penetrating and creating than Montreal.

For the future I will check into these three areas when looking to see if a ‘direct attacking approach’ can be better differentiated from a ‘ground-based’ (short passing scheme) approach.

The actual volume of penetration is higher for New England as well (11%) versus (7%). And like my regular PWP analysis the data here also supports the fact that teams who are more patient in creating shots taken (6% for NER versus 11% for MIFC) end up with more goals scored.

I did ask Matthias Kullowatz about the specific shot data for New England and Montreal; ~60% of Montreal’s shots on target have come outside the prime scoring zones 1 & 2 while ~68% of the Revolution shots on target have also come outside of zones 1 & 2.  So what’s different?

I think it’s down to time and space again; though it could be the Revolution have better strikers – but when you see the DC United doughnut diagram a bit later I think it’s back to time and space; and with fewer shots taken and more patience in the final third that seems reasonable to me.

Now for a contrast that might be better at explaining individual mistakes and bad fortune more than a bad ‘style/system’…

PWP Doughnut Diagram Week 11 SSFC v PTFC

Notice no ‘gold stars’; why? Because there really isn’t that much difference between how these two teams execute the six steps of PWP.

What separates these two teams in the league table are individual mental mistakes in defense – Portland sit on ten points while Seattle have 25. Through the course of this year the Timbers have dropped 7 points due to red cards and penalties – they did both against Columbus Saturday night!

In considering the ‘sameness’ of the data, I expect that as time passes an output similar to this could highlight ‘individual mistakes’ and perhaps ‘good/bad luck’ when it comes to rebounds and deflections – again, recall Saturday night when Futty Danso deflected a shot and notched an ‘own-goal’.

All told things went pretty well for Columbus, a red card by their opponent, a foul in the penalty box by their opponent for a PK and a deflected own-goal by their opponent. If I were a Columbus fan I’d be pretty pissed they didn’t win – bad luck for the Crew!

However viewed I’ll revisit this diagram later when the Cascadia Cup battle heats up.

So here’s the doughnut view of New York compared to DC United last year and then a bit further down how they look compared to each other this year.

PWP Doughnut Diagram NYRB v DCU 2013

First off – let’s not forget Ben Olsen was not fired, and perhaps this doughnut diagram can also help explain why, given DC United’s overall poor results last year.

Notice that the team does exceedingly well in comparison to New York with respect to Passing, penetration and creation; they actually exceed New York in the first two categories and only fall off when it comes to goals scored (7% for DC United versus 15% for New York).

So I’d offer that the system Ben Olsen ran last year worked – what he lacked was a pair of good strikers.  And if you recall the Montreal doughnut earlier the outputs from DC United do not mirror those of the Impact!

They added Espindola and Johnson and shored up their defense a bit; that also included adding Amos Magee to the staff.  Remember him as the Defensive Coordinator for Portland last year (I think – others can confirm or deny that, I’m sure).

Bottom line here – the system didn’t change and the Head Coach didn’t change and I’d offer that was appropriate…  now for the same diagram this year:

PWP Doughnut Diagram Week 11 NYRB v DCU 2014

In closing:

Note the increase for DC United in the final category – goals scored versus shots on goal – pretty compelling information to reinforce that the system used last year is the same system used this year and the difference – major difference – is the addition of two quality strikers.

I’m just in the learning stages on how this new doughnut diagram will take shape – I’m pretty sure it will have at least one hole in it – I’m hopeful there aren’t a lot more.

Some changes are afoot with OPTA and MLS – I see OPTA incorporated the Final Third Passing Accuracy suggestion; I just need to find out if crosses are included in that metric.

As for the new MLS Chalkboard – I’m not sure how that will work if the ‘numbers’ of activities are not available to count when it comes to defensive activities and ‘touches’ for players…

And yes, the old Chalkboard still appears to exist given a separate link within previous articles but it’s unclear if this change will be a permanent change for next year – or even the World Cup for that matter…

As for This Week in PWP: if you saw my tweets yesterday you know the top Attacking and Defending PWP teams of the week – New England in attack and Toronto in defense, with the Reds taking the Composite PWP Index top spot for Week 11.

Sporting KC, along with LA Galaxy, remain atop the Composite PWP through Week 11, while the Revolution moved to 7th and Columbus dropped to 4th as Real Salt Lake are now in a comfortable position of 3rd best overall.

Finally, this view also gives you an idea of what percentage each team gleans from each of the PWP Six Steps data points in the calculation of the overall Index number.

Best, Chris

MLS Coaches – Leveraging Possession with Purpose to Analyze Coaching Performance

I promised this year, at various times, to offer some thoughts about how Possession with Purpose can be used to support analysis on how well Head Coaches might be performing compared to others.

As a reminder from last year: five of the bottom six teams in my PWP Composite Index had coaching changes – Columbus, Chicago, San Jose, Toronto, and Chivas USA – and then, after an early exit from the Playoffs, Montreal.  Other teams making changes included Vancouver, Colorado and FC Dallas, plus the departure of Kreis for NYCFC.  All told, a total of 10 teams made changes in Head Coaches for one reason or another.

Will this year have similar results, and if so, who?  I don’t claim to prognosticate coaching changes and the firing of Head Coaches, but changes happen, and last year’s information, relative to the bottom six teams in my Composite PWP Index, is pretty compelling at first glance.

So after reading an article offered up by Jason Davis at ESPNFC “Three MLS coaches on the hot seat,” plus releasing my article earlier this week on Crosses offered in MLS, I figured the timing was pretty good for my first installment.

Here’s some of my initial information for consideration on “system of attack”:

  1. For home games Frank Klopas, Mark Watson and Frank Yallop-led teams are the top three in MLS that offer up more crosses per pass attempted in the final third.
  2. For away games Klopas, Watson, Yallop and Wilmer Cabrera-led teams are the top four teams in MLS that offer up more crosses per pass attempted in the final third.
  3. The relationship of taking points, at home, in the MLS is (-.70) for teams that cross the ball more frequently than others. In other words, the teams who cross the ball the most are more likely to lose points (at home) than teams that don’t.
  4. The same relationship of taking points in away games holds as well (weaker, at -.37), but still the same logic – the more crosses a team offers in away games the more likely they are to drop points.
  5. Bottom line is these four teams are less likely to win at home or on the road given their current system of attack in the Final Third.

In other words, these teams led by these head coaches use a system of attack that simply doesn’t get positive results on a regular basis in MLS; or… these teams, led by these head coaches and general managers don’t have the right players to execute that system of attack in MLS.

So how does Sporting KC do it? They are a team that offers up the 7th-most crosses at home and the 5th most crosses on the road, yet they are winning using that system of attack.

Why? I think it’s because their GM and head coach, collectively, are getting the right players to play to that system of attack.

So how about overall Team Attacking and Defending performance  (Team Positions in my Composite PWP Index) after nine weeks in: (1) Possession, (2) Passing Accuracy, (3) Penetration into the Final Third, (4) Creating and Taking Shots, (5) Putting Shots on Goal, and (6) Scoring Goals?

After Week nine, four of the five worst performing teams in MLS, in these categories are:

  • Chivas USA (19th out of 19),
  • Montreal Impact (18th out of 19),
  • Chicago Fire (17th out of 19), and
  • San Jose Earthquakes (15th out of 19).

In case you missed it in an earlier article on Expected Wins 2 – the correlation of those data points as a whole is .99 (R-squared); the closer to “1” the better and stronger the relationship.

In other words that means that the relationship of those data points is pretty much rock solid, and that it’s a worthy indicator (outside of points in the league table) for objectively evaluating team (attacking and defending) performance.

So while Jason Davis indicates John Hackworth and Caleb Porter as being potential candidates for hot seat discussions, actual evidence available indicates those names don’t belong there. Indeed, there are other teams performing, as a whole, much worse than Philadelphia or Portland.

Three teams performing worse at this time include Chivas USA, Houston and Toronto, while Vancouver is behind the power curve compared to Philadelphia and slightly ahead of Portland.  By the way, this is not to say John Hackworth might not belong in a list a bit later this year – but for now I think it is highly speculative to even put in print that he’s a potential hot seat candidate.

And with respect to Caleb Porter – it does seem, at times, that writers outside of the Portland area speculate and use the Timbers large supporter base to artificially increase readership in some of their articles… just saying. As a writer covering the Timbers here in Portland, reading the idea that Caleb Porter is on some sort of hot seat is (softly voiced) bollocks. But that’s just me…

In closing:

Given the evidence offered, does it seem reasonable that those four Head Coaches and their associated GM’s are worthy of a “Hot Seat” distinction? I think so…

Winning styles come in all shapes and sizes – the critical piece is having the right players to support that effort, and the time to install the system. Klopas, Cabrera, Yallop and Watson all know more about football than I do.

And it’s not my place, nor is it the place of any soccer writer (in my opinion) to pass judgment on whether or not someone should get fired or hired.

But… objective evidence indicates that those four teams, compared to others, lack an effective attacking system of play, lack strong overall team performance in attacking and defending while also lacking the most important measuring stick – points in the league table.

I’m sure this is not new, nor rocket science, to those head coaches, general managers, or owners… but… (perhaps?) it is helpful to others.

Best, Chris

You can find Chris on twitter @ChrisGluckPWP

@MLS Total Soccer Index Power Rankings and More

The All-Star Game happens this week, so what better time to review how teams compare in the Total Soccer Index?

After every team had played 17 games (halfway through the season).

The top team with 17 games in, points-wise, was not Atlanta – it was FC Dallas; the New York Red Bulls showed the most consistency in quality across the pitch.

  • With Jesse Marsch departing it will be interesting to see if Red Bulls can continue that run of quality – if they do they could be a great bet to win it all if the pundits stick with Atlanta given their edge in points earned.
  • I also feel vindicated, as a pundit, as I recognized Jesse Marsch as the best ‘domestic’ MLS Head Coach option for the US Men’s National Team last year; you can read it here.
  • Of additional interest; my second choice to coach the US Men’s National Team would be Oscar Pareja… not Gregg Berhalter.

Oscar Pareja, like Jesse Marsch, shows greater consistency of purpose in earning points when his team plays with OR without the ball.

Worst were Orlando, Colorado, and Montreal…  more to follow on just how bad Orlando and Colorado are; even compared to Chivas USA!

Here’s how the teams stack up (in one big conference) for total games played.   The Eastern Conference rules the roost.

Best and worst playing at home; the Eastern Conference rules the roost:

Best and worst playing away from home; only LA Galaxy, of the Western Conference, squeeze into the top 3:

The elite teams at home are pretty much the elite teams on the road; some of the worst teams on the road are also in the bottom third at home.

When you’re good, you’re good, when you’re bad you’re bad.

In closing:

I don’t live in Orlando and I don’t follow their team, but it seems to me there’s some significant issues with that club at ALL levels.

  • In any other league in any other country they’d be relegated.
  • Instead they’ll get extra money to make them ‘appear’ to be more competitive.
  • When you’ve got a bad front office and ownership group you’ll consistently have a bad team.

To give you an idea what I mean; here’s the Total Soccer Index for every team that’s played in MLS since 2014 (all games included in this analysis). 

Only one team (not Chivas USA) has shown worse overall (combined) team performance (2014, 2015, 2016, 2017, and now 2018) than Orlando…. San Jose!

As much as folks might blast Orlando City ownership there should be equal, if not more, attention sent the way of Colorado and San Jose, as well as Chicago (maybe?).

  • Relegation would have seen these teams probably sink to division 3 if US Soccer ran a proper organization.
  • Ironic that Chivas USA has actually shown better in team performance than Montreal, Chicago, Colorado, Orlando, and San Jose…
  • When is the last time you heard soccer pundits calling out those organizations like they did Chivas USA?
  • If ever an organization needed to hire a forward-thinking, team-performance statistical analyst it’s Orlando, San Jose, Colorado, or Chicago.  Give me a call. 🙂

While Atlanta have shown well in their first two seasons, the two most consistent teams (across 5 years’ measurement) are the Red Bulls and FC Dallas.

  • Just another piece of team performance data for US Soccer to analyze that confirms both Jesse Marsch and Oscar Pareja are far more consistent in having their teams ‘earn points’, while also showing consistent quality across the entire pitch, than Gregg Berhalter.
  • Oooh, that’s likely to be a snarky comment for some and a front loaded criticism of US Soccer if they decide to hire Gregg.  Note this isn’t me saying he’s not a good coach – he is – but is he a better selection than Pareja or some other guy who coaches domestic soccer in Europe?

Best, Chris

Gluck: Heaps of News in Major League Soccer This Week

Lots going on to share with you as Major League Soccer gets set for this weekend.  In no particular order:

  • Major League Soccer Total Soccer Index (TSI)
    • Eastern Conference
    • Western Conference
  • Jay Heaps gets the heave-ho from New England; why?
  • Quality in MLS – has it got better since 2014?
    • If so, where?
  • Predictions for this weekend.
  • Closing thoughts on Expected Goals

As a reminder – I called out Expected Goals and Expected Passes this week.  Positive response from my European readers has been tremendous; so far my readers in the United States have remained quiet.

In case you missed it, the explanation of what the Total Soccer Index is can be found here.

Major League Soccer TSI:

This is how the league looks in a single table format; of course it’s pear-shaped from the start because the league doesn’t play a balanced schedule for everyone.

  • The hammer identifies teams who have sacked their Head Coaches this year; are Jim Curtin, Ben Olsen, and/or Jason Kreis on the block too?
  • The correlation of TSI to points earned is .82 this year; that’s an increase from the last two years.
  • Suggesting, in my view, that parity is decreasing.
  • More to follow when we look at quality across MLS a bit later…

Eastern Conference:

The Eastern Conference has been the more predictable conference all year even though MLS has an unbalanced schedule.

  • Teams that usually possess the ball more, penetrate more, while showing greater patience in shot creation, end up with more goals scored.
  • This pattern, across all the categories in Possession with Purpose, more closely matches European League performances measured in the past.
  • Is this an indicator parts of Major League Soccer are growing closer to European Soccer in terms of tactics and how those general tactics drive similar results?
  • Maybe…
  • More to follow when looking at quality across the entire league.

Who finishes as Eastern Conference Champion?   Toronto.

Western Conference:

I’m not sure anything is settled in the wild-wild west.

  • We’ve seen musical chairs in almost every position of their conference table.
  • About the only thing remaining constant is the poor play of Colorado Rapids, Minnesota United, and LA Galaxy.
  • The greatest surprise may be the demise of FC Dallas; we’ve seen them swoon in late season before – does it happen again this year?
  • If any one team has been consistent this year it’s Sporting KC – but that’s the case every year.
  • With US Men’s National Team looking for a new Head Coach, after WC 2018, has Peter Vermes put himself in pole position over someone like Oscar Pareja?

Who finishes as Western Conference Champion?  Sporting KC

Who wins the MLS League Championship?  I have no idea.

Jay Heaps gets the heave-ho by New England, why?

Observations:

  • Their Attack:
    • 2nd worst percentage in overall possession across MLS
    • Mid-table in passing accuracy percentage
    • 3rd highest percentage in overall penetration of final third
    • 7th lowest percentage in shot creation
    • 5th highest percentage in shot precision
    • 8th lowest percentage in shot finishing
  • Opponents Have:
    • 2nd highest percentage of possession vs NER
    • Highest percentage of Passing Accuracy in MLS vs NER
    • Mid-table percentage in penetration vs NER
    • Lowest percentage of shot creation in MLS vs NER
    • Eighth highest percentage in shot precision vs NER
    • Fourth highest percentage in shot finishing vs NER

Conclusions:

  • Their Attack:
    • The team does not lack in attack.
    • Shot creation is at a lower level relative to a higher level of penetration; usually a positive sign of patience in attack.
    • That, coupled with being eighth highest in shot precision, means when they create space they are putting shots on goal.
    • What (may?) be lacking in attack is finishing… but when you look at the stable of players and see Kamara on 11 goals, Nguyen on nine, and Agudelo on eight, they are pretty good/versatile in attack.
  • Their Defense:
    • Lacks by a considerable margin compared to their opponents.
    • Opponents are averaging over 80% passing accuracy; partly due to Revolution tactics of ceding space outside the final third in order to facilitate a better counter-attack.
    • What is striking is their opponents are also eighth best in putting shots on goal and fourth best in finishing.
    • That indicates Revolution opponents are gaining solid possession time BOTH INSIDE and OUTSIDE their defending final third.
  • Is it the wrong players on the pitch?
  • Is it the wrong defensive tactics on the pitch?

I’d say it’s both.

Quality in Major League Soccer:

It appears that quality has been roughly the same, year in and year out since 2014.

  • But that’s deceptive.  From 2014 through to 2017:
    • The difference between average passing accuracy for the best and worst has increased from 7.17% to 9.50%
    • The difference between average penetration percentages for the best and worst has increased from 12.67% to 16.05%
    • The difference between average creation percentages for the best and worst has increased from 6.67% to 11.48%
    • The difference between average precision percentages for the best and worst has increased from 9.31% to 12.87%
    • The difference between average finishing percentages for the best and worst has increased from 21.73% to 23.42%
  • The gap between better teams and worse teams has widened.
  • Another indicator parity has decreased, not increased.

Given the trends offered through PWP analysis it appears parity is on the decline in MLS.

When the season ends poor management will be rewarded with more money instead of being relegated; entitlement is alive and strong in Major League Soccer.

Predictions for this weekend:

As with most weeks, the home team is more likely to earn points.

  • So far this year the home teams have earned 589 points versus 284 for away teams.
  • That’s a pretty solid 2 to 1 margin in favor of the home team.
  • Last year home teams earned 612 points compared to 300 points for away teams.
  • In 2015 it was 624 points for home teams and 324 points for away teams.
  • In 2014 it was 557 points for home teams and 323 points for away teams.
  • Conclusion – even without using Expected Goals it’s pretty clear a novice in soccer can guess who will earn points in MLS games; a quick check of that arithmetic follows below.
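
Here is that check – a few lines of Python that turn the season totals quoted in the bullets above into the home share of all points earned.  I’ve kept the season labels generic (“this year”, “last year”, 2015, 2014) because that’s how the article refers to them.

```python
# Home share of all points earned, per season, using the totals quoted above.
seasons = {
    "this year (to date)": (589, 284),
    "last year":           (612, 300),
    "2015":                (624, 324),
    "2014":                (557, 323),
}
for label, (home_pts, away_pts) in seasons.items():
    share = home_pts / (home_pts + away_pts)
    print(f"{label}: home share of points = {share:.1%}")
# Output runs from roughly 63% to 68% - the 'about 2 to 1' margin noted above.
```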

By the way, if using the TSI to predict who would have won the U.S. Open Cup the numbers show Sporting KC with an average TSI of .41 (at home) while the New York Red Bulls were .00 (away from home).

The final result was 2-1 Sporting KC.  In hindsight the TSI predictor was accurate in predicting the U.S. Open Cup winner.

In closing:

  • Do you really need to know what Expected Goals are to predict which teams in Major League Soccer will earn points week to week?  No….
  • If you bet the home team you’ll be right roughly 66% of the time.
    • Just another reason to debunk the value of expected goals.
    • Oh… I’m hearing expected goals statistics are being used to predict results for the next year, using previous years’ data.
    • And that those correlations are pretty solid from year to year.
    • Well, they will be.
    • You’re only using one event-based statistic to predict results in the next year, and that number is notoriously low for every team; so room for error is minimal.
    • I’m willing to bet a teams’ Expected Goals from two, three, or even four years ago will also have a pretty high correlation to the current year too…
    • Why?
    • Because only one variable is being measured and the variation in that variable is low – very low.
    • What makes that approach worse is it violates common sense.
    • Teams change players and Head Coaches from year to year and while they may score the same amount of goals, year in and year out, their overall results may be different because they got better defenders or improved their defensive tactics.
  • Parity in Major League Soccer has waned this year and it’s likely to get worse next year as LA adds another team.

Best, Chris

@CoachChrisGluck