Lessons from benchmarking

A round-up of the insights and data gathered from PAS benchmarking of planning authorities, covering the costs and subsidy of planning, fees, productivity, customer surveys and the Planning Quality Framework: how councils spend the money, how far the service is subsidised, what an hour of planning work costs, and how caseloads compare.




Presentation Transcript


  1. Lessons from benchmarking: What happens in planning authorities?
     Toby Hamilton, Martin Hutchings
     Positive Planning Day, March 2015
     www.pas.gov.uk

  2. Benchmark roundup: why bother?
     • Benchmarking since 2009
     • 276 councils participated, many more than once
     • Confidential, but valuable dataset
     • Publish aggregate as a "state of the nation"
     • Before we forget, for future benefit

  3. What we'll cover
     • Costs and subsidy of planning
     • Fees
     • Productivity
     • Customer survey
     • Planning Quality Framework

  4. What do councils spend the money on?

  5. Percentage of LPA cost not covered by fees and income
     • Each vertical line represents a different LPA
     • Average subsidy = almost 70% (at the time)
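The subsidy figure here is simple arithmetic. As a minimal sketch, assuming it is the share of gross planning cost left uncovered once fee and other income are netted off (the function name and the figures are illustrative, not drawn from the benchmark dataset):

```python
# Minimal sketch of the subsidy calculation, assuming it is the share of
# gross planning cost not covered by fee and other income.
# All figures are illustrative, not drawn from the benchmark dataset.

def subsidy_percentage(gross_cost: float, fee_income: float, other_income: float) -> float:
    """Percentage of LPA planning cost not covered by fees and other income."""
    uncovered = gross_cost - fee_income - other_income
    return 100.0 * uncovered / gross_cost

# An LPA spending £1.5m a year with £400k fee income and £50k other income
# is roughly 70% subsidised, close to the average at the time.
print(round(subsidy_percentage(1_500_000, 400_000, 50_000)))  # 70
```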

  6. Cost per hour
     Average cost per person per productive hour (£):

        Work type                              2011   2012/13   Combined
        Planning applications (direct)           48        48         48
        Planning applications (other)            40        40         40
        Compliance work - enforcement etc.       41        41         41
        Strategic Planning                       51        55         52
        All planning activities                  46        46         46

     • Productive hourly rate = £46
     • Compare this with pre-app charges (!)
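A similarly minimal sketch of the hourly rate, assuming it is total planning cost divided by productive hours (headcount times productive hours per person); the inputs are illustrative and chosen only to land near the £46 figure:

```python
# Sketch of the productive hourly rate, assuming it is total planning cost
# divided by productive hours worked (headcount x productive hours each).
# Inputs are illustrative.

def productive_hourly_rate(total_cost: float, headcount: float,
                           productive_hours_per_person: float) -> float:
    return total_cost / (headcount * productive_hours_per_person)

# e.g. a service costing £1.5m with 25 staff doing roughly 1,300 productive
# hours a year each comes out at about £46 per productive hour.
print(round(productive_hourly_rate(1_500_000, 25, 1_300)))  # 46
```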

  7. Majors = profit. Avoid conditions!
     [Table: cost of processing per app, fee per app at time of benchmark, and application count,
     broken down by application type: major non-residential, all dwellings, minor non-residential,
     householders, heritage, all waste, all minerals, all others, conditions, all app types]

  8. Productivity
     • We are not updating the 150 cases per officer thing
     • In the end, we have caved in

  9. Caseload = 144 / case officer

  10. Productivity revisited
      • In 2002, it was professional case officer + admin types. Now less differentiation.
      • Not cases per DC officer, but cases per person
      • Derives total head count = less wiggle room
      • In the ODPM study, this was "less than 100"
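The move from cases per DC officer to cases per person is just a change of denominator: divide by total planning headcount rather than case officers alone, which leaves less wiggle room. A small sketch under that assumption, with illustrative inputs chosen only to land near the 144 and 88 figures on the surrounding slides:

```python
# Sketch contrasting the two productivity ratios: cases per DC case officer
# versus cases per person across the whole planning service.
# Inputs are illustrative, chosen only to land near the deck's figures.

def cases_per_head(total_cases: int, headcount: float) -> float:
    return total_cases / headcount

total_cases = 3_600
case_officers = 25     # DC case officers only
total_headcount = 41   # everyone: case officers, admin, enforcement, managers

print(round(cases_per_head(total_cases, case_officers)))   # 144
print(round(cases_per_head(total_cases, total_headcount))) # 88
```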

  11. All-in figure is 88 cases per person

  12. All-in figure is 88 cases per person

  13. Why is there such a difference?
      Drivers of productivity:
      • Work mix: high numbers of simple applications. Fast track. Often urban.
      • Large authorities = often higher productivity
      • Plus local factors (e.g. contamination)

  14. Supergroups = ONS classification

  15. Customers
      • In aggregate there were clear messages
      • Talk to us, generally. It's just manners.
      • Talk to us *especially* when there are issues
      • We (generally) fail on customer care
      • We fail because we don't acknowledge Work In Progress and follow a target culture

  16. Reflections on the old benchmark
      • Massive shift in understanding
      • Financial literacy
      • Looking beyond NI157
      • National indicators hide almost everything about performance
      • Subsidy represents a risk to development
      • Communication is often weak

  17. Benchmarking is dead. Long live PQF.
      • Basic building blocks adapted and recycled
      • More focused on customers
      • Internal management tool / external "declaration"
      • Not an annual snapshot, but a continuous process
      • We want it to become a "badge". Over time.

  18. Planning Quality Framework
      Contains information and data on 3 things:
      1. Performance
      2. Customer experience
      3. Good planning delivery

  19. fee income

  20. Approved?

  21. Valid?

  22. No fee? (exc. heritage & trees)

  23. Customer or Target-driven?

  24. More to come
      • Resources
      • Investment [need more testing]
      • Pre-app
      • PPAs [not invented yet]

  25. Is it getting busier? [yes]

  26. Development value in our place = £60m/yr

  27. Customer survey results
      Application Ref: HA/FUL/4456/14

  28. PAS Planning Quality Framework = consistent, relevant information to benchmark performance (p12)
