College Prospect Ratings (CPR) 2016

By Steve Shea, Ph.D. (@SteveShea33)
April 6, 2016

College Prospect Ratings (CPR) is a formula that uses NCAA players’ on-the-court performance to quantify their NBA potential. Below, I will present the CPR ratings for 105 of the top prospects in the 2016 draft. Before I get to this year’s ratings, I go through some of the aspects of the model and present some of the model’s successes and failures among recent draft classes.

A Performance-based Model

CPR uses each player’s performance on the court (as measured by box-score stats) to approximate his pro potential. There can be a number of reasons a prospect does not perform well on the floor. Many of these are an indication that the prospect will not be a great pro. However, there are other reasons for a lack of performance that may not suggest lesser potential. The most extreme example is an injury that takes the player off the court entirely. This happened for high-profile prospects Kyrie Irving and Nerlens Noel in recent seasons. When an injury takes a player off the court, it’s going to hurt his CPR. In these cases, it’s important to understand that a lower CPR does not reflect a lower talent level for these prospects.

As a performance-based model, CPR will not like players like Skal Labissiere and Cheick Diallo. These players did not perform well this season. Any team that drafts them will be doing so based on indicators besides their on-the-court performance this past season.

Quality of Opposition

CPR does not adjust for quality of opposition. It’s true that certain players face different contexts, but I have not yet seen an appropriate way to measure this context.

Often, quality of opposition is factored into a model by measuring the quality of the teams the individual faced. For example, Andrew Harrison’s Kentucky team in 2014-15 faced tough competition. That Kentucky team had a strength of schedule score (according to Sports-Reference.com) of 8.67. In contrast, Steph Curry’s 2008-09 Davidson team did not face very difficult competition. Davidson had a strength of schedule score of -3.33.

In 2008-09 Curry shot 38.7% on 3s. In 2014-15, Andrew Harrison shot 38.3%. Since Harrison’s team faced a tougher schedule, should the model be more impressed by Harrison’s 3P%? I’d argue the exact opposite. Harrison played on a loaded Kentucky team. How often was the defense focused on stopping Harrison? How often was Harrison double-teamed? Almost never.

Curry was the offense at Davidson. Opponents schemed specifically for Curry. He may have been playing mid-major competition, but defenders were draped all over him, and he still managed to shoot an amazing percentage. If we were going to adjust for quality of opposition, I’d argue that Curry’s numbers should be inflated as opposed to Harrison’s.

In my experience, using measures such as strength of schedule in a draft model grossly underappreciates the context players like Steph Curry, Damian Lillard and C.J. McCollum played in, and thus, grossly underrates these players.

No Physical Measurements

CPR does not include height, weight, wingspan, or any other physical measures of the prospect. These measurements are important information, but mashing physical characteristics with on-the-court performance into one metric can be difficult to interpret.

Speaking about Providence’s Ben Bentil, a scout said, “He’s not going to be a power forward in the pros. He’s not 6-9. I’m hoping he’s 6-8 with a pair of sneakers on, so that means he’s going to have to be some form of a small forward.”

It sounds like what scouts said about Draymond Green in 2012. “The consensus is that Green won’t be able to guard either forward position because true small forwards will be quicker and true power forwards taller and able to post him and shoot over him.”

First, guarding in the post has far more to do with a player’s footwork, anticipation, awareness, athleticism, grit, length, etc. than it does an inch of height. Second, the traditional notion of five distinct positions is antiquated. In an NBA where the ability to switch screens is incredibly important, players like Green shouldn’t be labeled “tweeners.” They are versatile.

The scouts can determine whether an inch or two in height is important. The teams can decide to pass on Karl-Anthony Towns because he doesn’t have a big enough ass. CPR will focus on basketball performance.

CPR’s Successes

CPR has correctly identified numerous 2nd round picks that eventually went on to have pro careers that far exceeded the expected value of a 2nd-rounder. For example, in 2012, Draymond Green (CPR=5.0), Jae Crowder (CPR=4.7) and Will Barton (CPR=6.2) all went in the 2nd round, but CPR rated all 3 in the top 10 for the class. In retrospect, all three would have been great 1st round selections. CPR had Kyle Korver (CPR=5.2) as a first round talent in 2003, and thought Hassan Whiteside (CPR = 14.6) was a ridiculous steal when he went 33rd overall in 2010.

CPR has made the right choice when many teams have missed. Here are just a few examples. In 2009, Minnesota selected Jonny Flynn (CPR=4.3) ahead of Steph Curry (CPR=10.6). In 2010, Golden State took Ekpe Udoh (CPR=4.8) when they could have had Paul George (CPR=8.9). In 2011, Phoenix took Markieff Morris (CPR=2.1), and Houston drafted Marcus Morris (CPR=2.3) right before Indiana drafted Kawhi Leonard (CPR=5.7). In 2012, Cleveland drafted Dion Waiters (CPR=1.8) 4th overall when Damian Lillard (CPR=4.8) went two spots later.

CPR correctly identifies superstar talent. Kevin Durant (CPR=38.6), Anthony Davis (CPR=24.1) and Carmelo Anthony (CPR=14.9) are the top 3 overall scores (among an incomplete run of recent draft classes). The “above 10” class also includes Blake Griffin (CPR=10.1), Tim Duncan (CPR=12.7), DeMarcus Cousins (CPR=10.9), and Kevin Love (CPR=14.5) among others.

CPR’s Failures

CPR doesn’t always find the late-round steals. CPR thought Chandler Parsons (CPR=1.7) was a 2nd round pick in 2011. That’s where he went, but his performance in the NBA has made that pick look great in retrospect.

CPR has missed at the top. CPR had Greg Oden (CPR=10.2) as the 2nd best prospect behind Kevin Durant in 2007. Oden went 1st overall. Unfortunately, Oden’s career was derailed by injuries.

In a poorly rated 2013 class, CPR thought Anthony Bennett was a top 3 pick (CPR=7.6). Bennett went 1st overall, but has been a complete bust thus far in his brief career.

2016 CPR

CPR offers a perspective that differs from traditional scouting. When CPR agrees with scouts, it provides added assurance on the prospect. When CPR disagrees with scouts, it should prompt teams to ask why and to take a second look at the player. With that in mind, here are the 2016 scores.

Player | School | CPR
Brandon Ingram | Duke | 9.0
Ben Simmons | LSU | 8.8
Henry Ellenson | Marquette | 8.4
Jamal Murray | Kentucky | 8.2
Jakob Poeltl | Utah | 7.5
Kay Felder | Oakland | 7.5
Patrick McCaw | UNLV | 7.3
Benjamin Bentil | Providence | 6.7
Dejounte Murray | Washington | 6.4
Buddy Hield | Oklahoma | 6.3
Pascal Siakam | New Mexico St. | 5.6
Denzel Valentine | Michigan State | 5.6
Grayson Allen | Duke | 5.6
Isaiah Whitehead | Seton Hall | 5.3
Dillon Brooks | Oregon | 5.3
Daniel Hamilton | UConn | 5.1
Marquese Chriss | Washington | 4.7
Tyler Ulis | Kentucky | 4.5
Diamond Stone | Maryland | 4.4
Malik Beasley | Florida State | 4.2
Kris Dunn | Providence | 4.2
Shawn Long | Louisiana | 3.9
Bryant Crawford | Wake Forest | 3.9
Melo Trimble | Maryland | 3.7
Jarrod Uthoff | Iowa | 3.7
Malachi Richardson | Syracuse | 3.7
David Walker | Northeastern | 3.6
Domantas Sabonis | Gonzaga | 3.6
Taurean Prince | Baylor | 3.3
Bennie Boatwright | USC | 3.3
Gary Payton II | Oregon St. | 3.2
Georges Niang | Iowa St. | 3.2
Dwayne Bacon | Florida State | 3.2
Kyle Wiltjer | Gonzaga | 3.2
Joel Bolomboy | Weber St. | 3.2
Jaylen Brown | California | 3.0
Jameel Warney | Stony Brook | 3.0
Dorian Finney-Smith | Florida | 3.0
Stephen Zimmerman | UNLV | 2.9
Chinanu Onuaku | Louisville | 2.8
Michael Gbinije | Syracuse | 2.8
Brice Johnson | UNC | 2.8
A.J. Hammons | Purdue | 2.7
Aaron Holiday | UCLA | 2.7
Wade Baldwin | Vanderbilt | 2.7
Edmond Sumner | Xavier | 2.6
Antonio Blakeney | LSU | 2.6
DeAndre Bembry | St. Joseph's | 2.5
Michael Carrera | South Carolina | 2.5
Yogi Ferrell | Indiana | 2.4
Ivan Rabb | California | 2.4
Anthony Barber | N.C. State | 2.4
Demetrius Jackson | Notre Dame | 2.4
Chris Boucher | Oregon | 2.4
Shake Milton | SMU | 2.3
Allonzo Trier | Arizona | 2.3
Josh Hart | Villanova | 2.3
James Webb III | Boise St. | 2.2
Jaron Blossomgame | Clemson | 2.2
Nigel Hayes | Wisconsin | 2.1
Isaac Copeland | Georgetown | 2.1
Malcolm Brogdon | Virginia | 2.0
Monte Morris | Iowa St. | 1.9
Ron Baker | Wichita St. | 1.9
Daniel Ochefu | Villanova | 1.9
Justin Jackson | UNC | 1.8
Jake Layman | Maryland | 1.7
Alex Caruso | Texas A&M | 1.7
Malik Newman | Mississippi St. | 1.7
Fred VanVleet | Wichita St. | 1.7
Perry Ellis | Kansas | 1.6
Thomas Bryant | Indiana | 1.6
Devin Robinson | Florida | 1.6
Damion Lee | Louisville | 1.5
Robert Carter | Maryland | 1.5
Danuel House | Texas A&M | 1.5
Matthew Fisher-Davis | Vanderbilt | 1.5
Troy Williams | Indiana | 1.4
Tim Quarterman | LSU | 1.4
Marcus Paige | UNC | 1.4
Luke Kornet | Vanderbilt | 1.3
John Egbunu | Florida | 1.3
Moses Kingsley | Arkansas | 1.3
Sheldon McClellan | Miami | 1.3
Isaiah Briscoe | Kentucky | 1.3
Isaiah Taylor | Texas | 1.3
Tyler Harris | Auburn | 1.2
Caris LeVert | Michigan | 1.2
Deyonta Davis | Michigan State | 1.1
Damian Jones | Vanderbilt | 1.1
Zach Auguste | Notre Dame | 1.1
Tyrone Wallace | California | 1.0
Devin Thomas | Wake Forest | 1.0
Wayne Selden | Kansas | 1.0
Shevon Thompson | George Mason | 1.0
Skal Labissiere | Kentucky | 0.9
Kaleb Tarczewski | Arizona | 0.8
Alex Poythress | Kentucky | 0.7
Tonye Jekiri | Miami | 0.7
Sviatoslav Mykhailiuk | Kansas | 0.7
Amida Brimah | UConn | 0.6
Carlton Bragg | Kansas | 0.5
Prince Ibeh | Texas | 0.4
Marcus Lee | Kentucky | 0.3
Cheick Diallo | Kansas | 0.3

Updated 2016 CPR

By Stephen Shea, Ph.D. (@SteveShea33)

College Prospect Ratings (CPR) are an objective measure of an NCAA prospect’s NBA potential. They are generated from a player’s projected position, his years of experience in college and the box-score production captured in his game logs.

Below, I will present the updated ratings (as of March 29th), which for many players (including Ben Simmons) are their final CPR ratings.

Flashback to 2009

As Steph Curry is destroying the NBA, teams that passed on him in 2009 have to be wondering if they missed something pre-draft that would have provided some insight that Steph would develop into such an exceptional player. In particular, the Wolves, who drafted two point guards 5th and 6th overall, right before Curry went to the Warriors at 7th, have to wonder if their draft strategy was flawed.

No one saw Steph Curry becoming the all-time elite player that he is today, but there were reasons to suspect that he would be great. A draft model like CPR would have been one of them.

Here are the top 7 picks from the 2009 draft with their CPR scores (excluding Rubio).

Pick | Player | CPR
1 | Blake Griffin | 10.1
2 | Hasheem Thabeet | 6.4
3 | James Harden | 5.5
4 | Tyreke Evans | 7.2
5 | Ricky Rubio | -
6 | Jonny Flynn | 4.3
7 | Steph Curry | 10.6

On average, about 1 player per draft will rate above 10 in CPR. Without a doubt, a rating above 10 suggests a top 3 pick.   In 2009, both Blake Griffin and Curry rated above 10 with Curry slightly edging out Griffin for the high score. The general rule of thumb is that integer differences matter in CPR, while decimal differences aren’t that significant. Griffin and Curry were rated close enough that the model wouldn’t object to the selection of Griffin over Curry.

In contrast, CPR strongly favors Curry over the other NCAA players drafted above him (especially Jonny Flynn), and in retrospect, the model was right.

It’s also important to note that players can score well in CPR and not develop into solid NBA players, and players can score low and surprise. Hasheem Thabeet’s rating of 6.4 suggests that he should have been a mid-to-late lottery selection and that he could develop into a functional center, if not a star. To date, he hasn’t done it. In contrast, Harden’s rating of 5.5 suggests a late-lottery selection, but he’s developed into the type of player that justifies his top 3 selection.

2016 Draft

Below are the updated CPR ratings for 35 of the top NCAA prospects.

Player | CPR
Brandon Ingram | 9.0
Ben Simmons | 8.8
Henry Ellenson | 8.4
Jamal Murray | 8.2
Jakob Poeltl | 7.5
Dejounte Murray | 6.4
Buddy Hield | 6.1
Denzel Valentine | 5.6
Grayson Allen | 5.6
Marquese Chriss | 4.7
Tyler Ulis | 4.5
Diamond Stone | 4.4
Malik Beasley | 4.2
Kris Dunn | 4.2
Melo Trimble | 3.7
Domantas Sabonis | 3.6
Taurean Prince | 3.3
Jaylen Brown | 3.0
Stephen Zimmerman | 2.9
Brice Johnson | 2.8
A.J. Hammons | 2.7
Wade Baldwin | 2.7
Thomas Bryant | 2.5
DeAndre Bembry | 2.5
Ivan Rabb | 2.4
Demetrius Jackson | 2.4
Nigel Hayes | 2.1
Malcolm Brogdon | 2.0
Malik Newman | 1.7
Caris LeVert | 1.2
Deyonta Davis | 1.1
Damian Jones | 1.1
Wayne Selden | 1.0
Skal Labissiere | 0.9
Cheick Diallo | 0.3

Ben Simmons dropped significantly from his midseason CPR of about 15.6. There are reasons for the drop that aren’t necessarily due to poor performance on his part.

First, LSU’s season ended with its last conference tournament game on March 12. This meant that Simmons had less opportunity to demonstrate his pro potential and to improve his CPR score. Had LSU made the NCAA tournament, it’s likely Simmons’ CPR score would be higher.

The second reason for the drop in CPR score is technical and suggests that he was overrated in the midseason report. CPR uses a player’s 3P% in its formula. At season’s end, there is a minimum number of 3-point attempts needed for this 3P% to factor positively into the formula. (We don’t want a player that went 1 for 2 on 3s to profile as an excellent 3-point threat.) Early in the season, I usually don’t require any minimum number of 3-point attempts since CPR is dealing with small sample sizes everywhere. Later in the season, I usually require some prorated minimum, but I neglected to do this with Simmons in the above-referenced midseason ratings.
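To make the mechanics concrete, here is a minimal sketch of a prorated attempt minimum in the spirit of the rule described above. The specific threshold, season length and function name are illustrative assumptions, not the actual CPR parameters.

```python
# Hypothetical sketch of a prorated 3-point-attempt minimum (not the real CPR values).
SEASON_MIN_3PA = 30        # assumed minimum attempts over a full season
FULL_SEASON_GAMES = 35     # assumed typical NCAA season length

def three_point_pct_for_model(makes, attempts, games_played):
    """Return a 3P% for the model only when the attempt sample is large enough."""
    prorated_min = SEASON_MIN_3PA * games_played / FULL_SEASON_GAMES
    if attempts < prorated_min:
        return None  # too few attempts: treat the player as an unproven shooter
    return makes / attempts

# Simmons' 1-for-3 season (game count approximate) falls well short of any reasonable minimum.
print(three_point_pct_for_model(1, 3, 33))   # None
```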

On the season, Simmons was 1 for 3 (33.3%) on 3-pointers. Obviously, that’s not a large enough sample, and so CPR considers Simmons to not have demonstrated college 3-point shooting ability. This is a significant blow to Simmons’ rating.

CPR looks for excellence in statistical production, whether it’s steals, blocks, rebounds, 3P% or elsewhere. The final output grows exponentially with the accumulation of “excellence.” So, a player that has not profiled as excellent in anything would only get a small bump in CPR if he were a good 3-point shooter. In contrast, a player that has demonstrated excellence in 6 stats already would get a huge boost for adding 3-point shooting. Ben Simmons has demonstrated excellence in a number of categories. That’s why he has a CPR of 8.8, which is usually good enough to be in the top 3 in the draft class. If he had also demonstrated solid (but not exceptional) 3-point shooting, he would be about a 12 in CPR. In other words, CPR suggests that a Ben Simmons that could shoot 3s would be a better prospect than Blake Griffin was.

Ingram is now the top-rated 2016 prospect.  Jamal Murray, Hield and Poeltl saw significant improvements in their CPR scores since midseason. CPR likes Dejounte Murray as a late first round sleeper. There is nothing in the box score production to suggest Skal Labissiere or Cheick Diallo is going to be a great pro. Finally, CPR suggests Jaylen Brown is overrated by scouts that have him in the top 7.

How an NBA team can use an expected points model

By Chris Baker (@ChrisBakerAM) and Steve Shea (@SteveShea33)

March 16, 2015

 

“We played well tonight, but the shots didn’t fall.” – Every coach ever

We hate the use of “random” or “luck” when describing the outcomes of athletic events. When Curry makes a three, it’s not the same as walking up to a slot machine, pulling the lever and winning the jackpot. One event requires skill, and the other does not.

Of course, Curry has spent nearly his entire life preparing to knock down a jumper when given the opportunity, but the nonrandom contributions extend further. Any specific 3-point attempt can be the result of the efforts of several skilled Golden State players. Perhaps Klay Thompson picked off a pass and started the transition that led to an open shot for Curry on the wing. Maybe Andrew Bogut set a perfect screen to spring Curry in the corner. Maybe the defense simply made a mistake. Curry doesn’t get open “by chance.”

Basketball is not random. However, the difference between making and missing can be so small that we can’t necessarily fault a player for missing a shot. Even Curry will miss the occasional open catch-and-shoot corner 3. (I think…maybe…Is he human?)

One of the major themes in sports analytics is to evaluate the process and not be fooled by the results. Sometimes good process can produce poor results, and vice versa. For example, a team can prepare exceptionally well for the draft and still miss on the prospect they select. Another team can throw darts blindfolded to pick their prospect and hit. It doesn’t mean that throwing darts is the better draft preparation.

Players miss good shots and make bad shots. Yet, we assume that the player that went 8 for 10 from the field played well, and the player that went 3 for 10 did not. The first player may have made 5 contested mid-range jumpers where he usually shoots 28%. The second player may have worked hard to get open corner 3s where he usually shoots 46%, but tonight he went 0 for 4. Perhaps we should be praising the second player and not the first.

An analogous situation occurs from the defensive perspective. We shouldn’t necessarily judge a defender based on how many shots his opponent makes. Instead, we should focus on the shots he forced his opponent to take. If those were bad shots, even if his opponent made an unusually high percentage of them, it was a solid defensive performance.

To shift the attention from the results (make or miss) to the process (the quality of the shot), we need an expected points model.

Expected points model

NBA.com used to provide shot logs for all players. These shot logs detailed the shot’s distance from the hoop, the distance to the closest defender, and whether or not the player dribbled into the shot. We used this information to calculate the average shooting percentage in all situations for all players for each of the previous two seasons.

With this information, we can calculate how many points a player is expected to score given his shot opportunities in a given game. We can then aggregate the expected points for the players to arrive at an expected total for the team. (We can calculate totals as well as rates, such as expected points per shot.)
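Here is a rough sketch of how that baseline and the per-game expectation could be computed. The column names and situation buckets are assumptions for illustration, not the actual NBA.com shot-log fields, and free throws would still need to be folded in separately (at 0.44 FTA per “shot”) to match the rates used below.

```python
import pandas as pd

SITUATION_KEYS = ["player", "shot_dist_bucket", "def_dist_bucket", "off_dribble"]

def build_baseline(shot_logs: pd.DataFrame) -> pd.Series:
    """Average points per attempt for each player in each (assumed) shot situation."""
    return shot_logs.groupby(SITUATION_KEYS)["points"].mean()

def expected_points(game_shots: pd.DataFrame, baseline: pd.Series) -> float:
    """Sum the baseline value of every attempt a player or team took in one game."""
    idx = pd.MultiIndex.from_frame(game_shots[SITUATION_KEYS])
    # Fall back to the overall average for situations a player has not faced before.
    return float(baseline.reindex(idx).fillna(baseline.mean()).sum())

# A team's expected points per shot for a game is then
# expected_points(team_shots, baseline) / len(team_shots).
```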

Team analysis

No player is working in isolation. When one player gets open, he can often credit his teammates for creating the space. Similarly, defense is a team activity with rotations, switches and help. Thus, the team level may be the most appropriate domain to apply an expected points model. At least, it’s a good place to start.

The following plots the Spurs’ expected points per shot versus their actual points per shot by game in the 2014-15 season. (Note that the team total is the aggregation of the individual totals. In other words, the formula for expected points per shot pays attention to who’s taking the shot.)

[Chart: Spurs expected vs. actual points per shot by game, 2014-15]

As expected, actual points per shot varies more than expected points per shot. There are games when the team’s actual points per shot far exceeds their expected points per shot and vice versa. For example, on April 15, 2015 against New Orleans, the Spurs had an expected points per shot of 1.04. That’s a poor number by their standards. However, that night the shots fell. They scored 1.22 points per shot. That was quite different than what took place March 17, 2015 against the New York Knicks. That night, the Spurs’ expected points per shot was 1.13. That shows that San Antonio was able to get (and chose to take) good shots. However, the shots didn’t fall against New York. The team scored 0.94 points per shot.

[Chart: Spurs offensive points per shot, expected vs. actual]

The question for San Antonio is how do they want to evaluate their offense? Do they want to praise the performance against New Orleans (April 15) because the shots fell, or are they going to be more pleased with the offense against New York (March 17) where they were able to create better looks?

The situation is similar on the defensive side. On December 10, 2014, the Spurs held the Knicks to an expected points per shot of 0.93. However, that night the shots fell for New York. They scored 1.14 points per shot. On March 27, 2015, the Spurs allowed 1.08 expected points per shot from Dallas. However, the Mavs only scored 0.84 points per shot. Based on the actual points per shot, it appears as though the Spurs played much better defense against the Mavs. The expected numbers tell a different story.

[Chart: Spurs opponents’ points per shot, expected vs. actual]

It’s important to note that the opponents’ expected points per shot are based on the opposing players’ average numbers on the season. Thus, they are not restricted to performances against a particular team. For example, New York’s expected points per shot of 0.93 against the Spurs on December 10th was based on the shots each player got against San Antonio and what those players typically shoot in those situations (against the Spurs or not).

Not all contested shots are equal. When Serge Ibaka contests a shot at the rim, it looks very different than when Isaiah Thomas contests a shot at the rim. Thus, opponents’ expected points per shot and actual points per shot may differ.

Last season, Houston, Golden State, Oklahoma City, Chicago, and Milwaukee saw the biggest average per-game difference between opponents’ expected PPS and opponents’ actual PPS. For Houston, the difference was about 3 points per 100 shots (where 0.44 FTA counts as a “shot”). In other words, Houston’s opponents scored 3 fewer points per 100 shots than their expected points per shot suggested. Golden State, Oklahoma City, and Chicago were all around 2 points per 100 shots. Milwaukee was close to 1.5.
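As a quick arithmetic sketch, this is all the per-100-shots figure is: the “shot” convention above plus a scaling of the gap between expected and actual PPS. The Houston inputs come from the 2014-15 row of the table below.

```python
def shots(fga, fta):
    # The convention used here: a "shot" is a field-goal attempt plus 0.44 free-throw attempts.
    return fga + 0.44 * fta

def points_per_shot(points, fga, fta):
    return points / shots(fga, fta)

# Houston's 2014-15 opponents, from the table below.
opp_expected_pps = 1.085
opp_actual_pps = 1.054
print(round((opp_expected_pps - opp_actual_pps) * 100, 1))  # 3.1 fewer points per 100 shots
```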

All five of the teams at the top of this metric were known for having great “length” and “positional versatility” on defense. Some may wonder why the Houston Rockets, which put a great emphasis on perimeter shooting in their offense, would go for players like Josh Smith, Corey Brewer and K.J. McDaniels. We’re seeing part of the answer in these numbers. Great length and quickness can certainly influence expected points. It can mean running more players off the 3-point line, or coaxing players to pull up in mid-range (as opposed to challenging the length at the hoop). However, what we’re capturing in this metric is that these types of defenders are doing even more.

At the other end, Minnesota gave up a terrible 6 more points than the expected model predicted for every 100 shots. New York and Orlando were the next closest at around 3 points per 100 shots.

San Antonio didn’t see a significant difference between their opponents’ expected points per shot and actual points per shot on average.

The table below displays the averages for all teams in each of the last two seasons.

Opponents' Expected vs. Actual Points Per Shot (PPS)

Season | Team | Opp. Exp. PPS | Opp. Act. PPS | Difference
2015 | MIN | 1.085 | 1.142 | -0.057
2015 | NYK | 1.081 | 1.114 | -0.033
2015 | ORL | 1.075 | 1.108 | -0.032
2015 | BRK | 1.068 | 1.092 | -0.024
2015 | LAL | 1.097 | 1.119 | -0.022
2015 | DEN | 1.078 | 1.099 | -0.021
2015 | DET | 1.067 | 1.087 | -0.020
2015 | CLE | 1.058 | 1.073 | -0.015
2015 | TOR | 1.075 | 1.090 | -0.015
2015 | CHO | 1.041 | 1.053 | -0.012
2015 | MIA | 1.080 | 1.092 | -0.012
2015 | BOS | 1.057 | 1.068 | -0.011
2015 | SAC | 1.081 | 1.091 | -0.010
2015 | PHO | 1.076 | 1.084 | -0.009
2015 | ATL | 1.054 | 1.060 | -0.006
2015 | MEM | 1.054 | 1.059 | -0.005
2015 | DAL | 1.081 | 1.085 | -0.004
2015 | SAS | 1.042 | 1.045 | -0.003
2015 | UTA | 1.060 | 1.061 | -0.001
2015 | LAC | 1.076 | 1.075 | 0.001
2015 | NOP | 1.074 | 1.070 | 0.003
2015 | WAS | 1.047 | 1.043 | 0.004
2015 | PHI | 1.096 | 1.088 | 0.008
2015 | IND | 1.056 | 1.046 | 0.010
2015 | POR | 1.040 | 1.029 | 0.011
2015 | MIL | 1.080 | 1.065 | 0.016
2015 | CHI | 1.041 | 1.020 | 0.020
2015 | OKC | 1.083 | 1.061 | 0.022
2015 | GSW | 1.058 | 1.036 | 0.022
2015 | HOU | 1.085 | 1.054 | 0.031
2014 | ORL | 1.059 | 1.091 | -0.032
2014 | MIL | 1.090 | 1.121 | -0.031
2014 | PHI | 1.105 | 1.136 | -0.031
2014 | UTA | 1.093 | 1.122 | -0.029
2014 | DET | 1.094 | 1.120 | -0.026
2014 | ATL | 1.066 | 1.092 | -0.026
2014 | MIN | 1.077 | 1.101 | -0.024
2014 | BRK | 1.084 | 1.104 | -0.020
2014 | NYK | 1.100 | 1.120 | -0.020
2014 | CLE | 1.076 | 1.093 | -0.017
2014 | WAS | 1.077 | 1.092 | -0.015
2014 | BOS | 1.082 | 1.097 | -0.015
2014 | MIA | 1.087 | 1.100 | -0.014
2014 | DAL | 1.102 | 1.113 | -0.011
2014 | SAC | 1.098 | 1.108 | -0.010
2014 | SAS | 1.032 | 1.040 | -0.008
2014 | MEM | 1.068 | 1.074 | -0.006
2014 | CHA | 1.050 | 1.056 | -0.005
2014 | NOP | 1.117 | 1.120 | -0.003
2014 | LAL | 1.092 | 1.094 | -0.003
2014 | PHO | 1.087 | 1.086 | 0.001
2014 | TOR | 1.082 | 1.078 | 0.004
2014 | POR | 1.059 | 1.055 | 0.004
2014 | DEN | 1.094 | 1.087 | 0.006
2014 | CHI | 1.043 | 1.022 | 0.021
2014 | GSW | 1.066 | 1.044 | 0.022
2014 | HOU | 1.082 | 1.058 | 0.024
2014 | LAC | 1.087 | 1.057 | 0.030
2014 | OKC | 1.090 | 1.059 | 0.032
2014 | IND | 1.050 | 1.008 | 0.042

While actual and expected averages for opponents do not always align on a season, the expected model can still be quite useful for evaluating team defensive performance. For one, it reflects the extent to which a team forced difficult shots (e.g. contested low percentage opportunities).

Also, if not faced with significant injuries or trades, teams tend to keep player minutes and usage consistent. Thus, when comparing performances for a particular team, any added benefit from defenders that contest “better” remains close to constant.

Individual analysis

The chart below plots James Harden’s actual versus expected points per shot by game for the 2014-15 season. Similar to the team level, the individual’s actual points per shot will vary much more than their expected points per shot.

[Chart: James Harden expected vs. actual points per shot by game, 2014-15]

On December 31, 2014, Charlotte held Harden to 1.11 expected points per shot. Harden’s average game that season was 1.21 points per shot. (Again, “shot” includes 0.44 free-throw attempts.) Part of Charlotte’s success was that they held Harden to just 4 free-throw attempts.

Unfortunately, Harden scored 1.73 points per shot. (It helped that he went 8 for 11 on threes). Charlotte should certainly review the game film to see what they could have done better. However, the expected numbers imply that Charlotte did a better job defending Harden than the actual numbers suggest.

On March 12, 2015, Utah held Harden to 0.80 points per shot. However, the expected model reveals that it may have been more a result of Harden having an off night than anything exceptional from Utah’s defense. Utah allowed Harden to get 1.31 expected points per shot that night.

In spite of the poor performance from Harden against Utah, the Jazz might want to go back and revise how they defended him. If they continue to allow 1.31 expected points per shot from Harden, he’s going to score more than 0.80 points per shot.

Final thoughts

Good process can occasionally yield poor results. Poor process can occasionally yield good results. When teams focus too much on the results, they can be misled. For example, if the shots aren’t falling for three straight nights, a coach might think he needs to mix things up. These changes may not be necessary, and the expected points model would go a long way in determining if the lack of offensive efficiency is due to something systematic, or just a run of “bad luck.”

Additional notes

-We’re not sure what an “expected turnover” looks like, but actual turnovers could be added to both the expected and actual shot production to get an expected and actual offensive rating for the team or player.

-Here, we used an entire season as the baseline to judge games in that season. Teams would likely want to see expected production following each game as the season progresses. To do this, teams could use a rolling 60 to 82 prior games (dating back to the previous season) as the baseline. Rookies would likely need an artificial prior until a decent sample could be gathered from their NBA minutes. (A rough sketch of this rolling baseline follows these notes.)

-More detailed information about player locations could help this model. For example, a defender contesting a shot at the rim from behind the offensive player is much different than a defender contesting from in front of the shooter. The shot logs did not contain this level of detail.

-There were some glitches in the data, but we did not find anything that would dramatically influence the numbers presented in this article.
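Here is a minimal sketch of that rolling-baseline idea, reusing the assumed shot-log columns from the earlier expected-points sketch; the window size and helper name are illustrative assumptions, not a recommendation.

```python
import pandas as pd

N_PRIOR_GAMES = 70  # anywhere in the suggested 60-82 game range

def rolling_baseline(shot_logs: pd.DataFrame, as_of_date) -> pd.Series:
    """Situational averages built only from each player's most recent prior games."""
    prior = shot_logs[shot_logs["game_date"] < as_of_date]
    recent_parts = []
    for _, player_shots in prior.groupby("player"):
        recent_dates = sorted(player_shots["game_date"].unique())[-N_PRIOR_GAMES:]
        recent_parts.append(player_shots[player_shots["game_date"].isin(recent_dates)])
    if not recent_parts:
        return pd.Series(dtype=float)  # no prior data yet (e.g., a rookie)
    recent = pd.concat(recent_parts)
    keys = ["player", "shot_dist_bucket", "def_dist_bucket", "off_dribble"]
    return recent.groupby(keys)["points"].mean()
```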

Hello Bucks, my name is Corner 3

By Stephen Shea

March 1, 2016

This season, the Milwaukee Bucks have given up twice as many corner 3s as they have made. It’s a feat that only two teams have accomplished in the last three years. (The 2015 Timberwolves were the other team.) It’s a mark that only 30 teams have hit in the last 20 seasons, and 27 (90%) of those had a losing record.

The corner 3 is an efficient shot, and an effective use of the corner 3 provides the type of spacing that typically leads to an efficient offense. A lack of use of the corner 3 on offense and an inability to prevent the corner 3 on defense suggest a lack of awareness among the organization and coaching staff of how the NBA game has evolved.

Every team is unique, and the ideal offensive and defensive systems for any organization need to be tailored to the strengths of the personnel. No roster construction would excuse the corner 3 ratio demonstrated by the Bucks, but we might relax our criticism if the team were not built for corner 3 usage and prevention. THEY ARE!

The Bucks have emphasized a roster construction with length, quickness and positional versatility, which should be excellent at preventing corner 3s.

Offensively, the team has opted to plant a post-up big, Greg Monroe, in the paint, which limits opportunities for the perimeter players to drive. In addition, the emphasis on length and defensive flexibility has left the roster light on perimeter shooting. In an era where floor spacing is of the utmost importance, the Bucks NEED the corner 3 to keep opponents from helping on Monroe and to maintain what little space is left for cutters and drivers.

Why do we care so much about the corner 3?

Corner 3s have accounted for just 6.6% of the total points scored in the NBA this season, but the importance of the corner is far greater than what that percentage suggests.

The most obvious impact of the corner 3 is the value obtained from the shot itself. The NBA is shooting 37.5% on corner 3s this season. That equates to 1.125 points per attempt. According to NBA.com, that’s almost exactly the points per possession for the Golden State Warriors this season. The other 29 teams are scoring at a lower rate. The corner 3 is more efficient than what teams typically get on offense (including transition).

The corner 3’s value extends beyond what it provides when shot. Effective use of the corner 3 contributes immensely to the spacing in the halfcourt offense. See this previous post where we show that use of the corner 3 correlates with overall offensive efficiency.

See this post for a discussion of how driving and corner 3s are complementary activities, and their combined usage is remarkably predictive of offensive efficiency.

In another article, we use spatial tracking data to show that simply positioning capable shooters in the corners correlates with a more efficient offense.

Finally, see this article, which uses spatial tracking data to show that with each additional 3-point threat on the court, there are less help defenders around the paint, and the offensive efficiency improves. When teams are spacing the floor with 3 to 4 perimeter shooters, they are making use of the corner 3.

Kings of the corner 3

Why have the Spurs been so successful over the last 20 years? Talent has played a large part, but teams do not have the level and duration of success that the Spurs have enjoyed on talent alone. They also have great offensive and defensive systems, and the corner 3 is a big part of both.

To assess a team’s usage and prevention of the corner 3, we’ll use the ratio of the made corner 3s against them to the corner 3s they make. In each season from 2001 to 2014, the Spurs were in the top 3 in this ratio (where top implies lowest ratio.) In 13 of those 14 seasons, they were in the top 2. In 8 of those seasons, they led the league. The following chart presents the top 3 teams for each season since 2001.

[Chart: the top 3 teams in the corner-3 ratio for each season since 2001]

The following chart shows that teams that excel in this ratio also win.

[Chart: corner-3 ratio versus team winning]

The Spurs were ahead of their time in recognizing the importance of the corner 3. For example, in 2005, the Spurs only gave up 88 corner 3s while making 237. That’s a ratio of 0.37. The next best ratio that season was 0.51.
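For clarity, the ratio is nothing more than opponent corner-3 makes divided by the team’s own corner-3 makes; the 2005 Spurs figures above illustrate it.

```python
def corner3_ratio(opp_corner3_made, own_corner3_made):
    # Lower is better: fewer corner 3s allowed per corner 3 made.
    return opp_corner3_made / own_corner3_made

# 2004-05 Spurs: 88 corner 3s allowed, 237 made.
print(round(corner3_ratio(88, 237), 2))  # 0.37
```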

In recent years, the Spurs haven’t been so dominant in this category. In 2015, the Spurs were 4th in the ratio. This season, they are 6th.

What happened the last two seasons? Simply put, other teams caught on. Everyone is familiar with how Daryl Morey has constructed the Rockets in an analytically savvy fashion, we recently wrote about the brilliance of Portland, and the Warriors…Well, they do just about everything right.

Detroit coach Stan Van Gundy brings the corner 3 wherever he goes. Detroit is 4th in the ratio this season, and check out his influence on Orlando.

[Chart: the corner-3 ratio for Stan Van Gundy’s Orlando and Detroit teams]

For more than a decade, the Spurs were the undisputed kings of the corner 3. Now, they are just one of many teams that realize its value. Given the growth of basketball analytics, it’s not surprising that many teams have caught on to this component of the Spurs’ success. What’s surprising is that there are teams that still haven’t figured it out.

Bucking the trend

In 2014, the Bucks allowed 1.96 corner 3s for every corner 3 they made. That was the worst mark of any team that season. That abomination coincided with winning just 15 games. It was a low point for a franchise that’s too familiar with failure.

2015 brought new hope. Giannis Antetokounmpo was improving, Khris Middleton was emerging, and although he got hurt, the recent addition of Jabari Parker suggested the Bucks had a young core to build around. The team surprised by winning 41 games and making the playoffs. As part of their improvement, the team improved their corner 3 ratio, giving up 1.22 corner 3s for every corner 3 they made.

2015 should have been the stepping-stone for better things this season. Milwaukee brought back largely the same core (although the transition from Brandon Knight to Michael Carter-Williams that began at the 2015 trade deadline hasn’t worked out well.) In addition, Milwaukee spent big on Greg Monroe. (We questioned the fit of Monroe and MCW around Milwaukee’s core in a recent lineup construction article. So, we won’t focus on that here.)

Instead of moving up, Milwaukee has regressed. The team is 25-35 and in 12th place in the Eastern Conference. Their ratio of opponent corner 3s to own corner 3s has ballooned as well. They are giving up two corner 3s for every one that they make. It’s the worst mark this season.

Among the bottom 10 teams in this ratio, all 10 have a negative net rating. In other words, they are all giving up more points than they score. Among the top 10 teams in the ratio, eight have a positive net rating. Only Houston (which ranks 8th) and Philadelphia (which ranks 10th) have a negative net rating. Philadelphia employs an analytically savvy strategy on the court but has nowhere near the talent yet to compete. Houston employs a similarly savvy system. They have more talent, but it’s been a tumultuous season that has seen them switch coaches and shop one of their stars, Dwight Howard, at the trade deadline.

Milwaukee’s performance in corner 3s is especially deplorable given their current roster construction. The team has emphasized long and athletic perimeter players that can switch screens and challenge perimeter shots. Players like Giannis, Middleton and MCW should have no problem rotating and taking away opponents’ corner 3s. Instead, Milwaukee has given up the most corner 3s this season.

On offense, many teams “hide” an average or below average offensive player in the corner. Teams need to balance the demands on offense and defense in their lineups. This often means that teams play players that are great defensively, but limited offensively. Players like Al-Farouq Aminu and Corey Brewer find themselves in lineups more for their defense than their offense. Offensively, Brewer and Aminu often find the corners where they can be efficient enough in a catch-and-shoot opportunity to provide value and spacing for their team.

Milwaukee has decided to invest in an offensive center that likes to post up. Greg Monroe provides little offensive value away from the hoop. This often pushes Milwaukee’s forwards to the perimeter. The Bucks do not have great 3-point-shooting forwards in Giannis and Parker. Neither should be launching from above the break. However, there is good reason to believe both could be efficient at corner 3s.

Jabari Parker was 38-106 (36%) from 3 in his one season at Duke. His FT% (75% at Duke and 78% this season) suggests he’s got the proper shooting mechanics and touch. Parker should have little problem knocking down corner 3s. Instead, he’s only taken 5 this season.

Milwaukee does place capable shooters in the corners on offense. Often, Middleton, O.J. Mayo and Jerryd Bayless will end up there. Having capable shooters in the corners is a good thing. However, when your best shooters are in the corners, it can leave little room for others to provide space. Consider the following still taken from a recent game against Portland.

[Screenshot: Middleton with the ball at the top of the key; Bayless and Vaughn in the corners, Parker and Monroe on the blocks]

Middleton just received a pass at the top of the key from Bayless. Middleton’s defender was caught out of position opening up the possibility for a Middleton drive. In this image, the Bucks have Bayless and Rashad Vaughn in the corners, but their men won’t be forced to help on the Middleton drive. Parker and Monroe (with their defenders) are occupying the two blocks. It’s a spacing catastrophe. Middleton has nowhere to go. He pulls up for a mid-range jumper.

What would we like to see instead? In the next image, Miles Plumlee has come up to set a screen on Middleton’s man. Giannis is at the top of the picture and is racing into the corner.  Out of the picture, Bayless is in the near corner.

[Screenshot: Miles Plumlee screens for Middleton while Giannis races to the corner]

Giannis gets to the corner. He’s dangerous enough there that his man can’t help on Middleton. Middleton draws Portland’s Mason Plumlee. In the next image, we see Middleton in the air. He would toss the alley-oop to Miles Plumlee, who finishes the play with an easy dunk.

[Screenshot: Middleton in the air, throwing the alley-oop to Plumlee]

See the video here (the second video in the sequence):

http://on.nba.com/1Qpw73X

Giannis is not a great shooter, but he’s been respectable from the corner 3. This season, he’s 10 for 28 (36%) from the corners.

The tragedy is that he’s only taken 28 corner 3s.

Giannis and Parker don’t always need to be in the corners.  They bring more to the table offensively than that.  However, only 33 attempted corner 3s between the two of them this season is not acceptable.

Final thoughts

Milwaukee has the type of roster that could prevent corner 3s on defense and could greatly benefit from the use of corner 3s on offense. The incredibly poor prevention and usage of 3s this season suggests a systematic failure.

As the Bucks finish this season and prepare for the next, the organization should put a priority on playing lineups and implementing systems that are more likely to succeed in the modern NBA.