10th Report, 2012 (Session 4): Report on Public Services Reform and Local Government: Strand 2 – Benchmarking and Performance Measurement

SP Paper 222

LGR/S4/12/R10

CONTENTS

Remit and membership

Report
Introduction
Background and context
What is benchmarking and what is its purpose?
Definitions
Purpose of benchmarking
The Indicators
The Role of the Regulators
Challenges to the success of benchmarking – managerial, cultural, political, and media
Community Planning Partnerships
Future scrutiny and utilisation of benchmarking by the Committee

Annexe A: Summary of Conclusions and Recommendations

Annexe B: Details of Benchmarking Indicators

Annexe C: Data Examples Relating to Educational Costs and Attainment

Annexe D: Extracts from the Minutes

Annexe E: Oral Evidence and Associated Written Evidence

Annexe F: Other Written Evidence

Annexe G: Committee Benchmarking Seminar

Annexe H: Summary of Written Evidence

Remit and membership

Remit:

To consider and report on (a) the financing and delivery of local government and local services, (b) planning, and (c) matters relating to regeneration falling within the responsibility of the Cabinet Secretary for Infrastructure and Capital Investment.

Membership:

Stuart McMillan (Member since 18 September 2012)
Anne McTaggart
Margaret Mitchell
John Pentland
Stewart Stevenson (Member since 18 September 2012)
Kevin Stewart (Convener)
John Wilson (Deputy Convener) (Member since 18 September 2012)
James Dornan (Member from 7 March 2012 to 18 September 2012)
Joe FitzPatrick (Convener from 1 June 2011 to 6 September 2012)
David Torrance (Member from 1 June 2011 to 18 September 2012)

Committee Clerking Team:

Clerk to the Committee
David Cullum

Senior Assistant Clerk
Fiona Mullen

Assistant Clerk
Seán Wixted

Committee Assistant
Fiona Sinclair

Report on Public Services Reform and Local Government: Strand 2 – Benchmarking and Performance Measurement

The Committee reports to the Parliament as follows—

INTRODUCTION

Background and context

1. At its meeting on 26 October 2011, the Local Government and Regeneration Committee agreed to prioritise the broad areas of Single Outcome Agreements, benchmarking and Community Planning Partnerships for its inquiry work beginning in January 2012. At its meeting on 7 December 2011, the Committee agreed that this would take the form of a three-strand inquiry into public sector reform. The strands agreed were as follows:

Strand 1: Partnerships and outcomes;
Strand 2: Benchmarking and performance measurement;
Strand 3: Developing new ways of delivering services.

2. The Committee concluded strand 1 of its inquiry and reported to the Parliament on 22 June 2012. Informed by this work, the Committee commenced strand 2, looking at benchmarking and, in particular, examining the work being undertaken by the Society of Local Authority Chief Executives (SOLACE) in partnership with the Improvement Service (IS). The Committee wished to be informed of the progress of that work and to ascertain how it was proposed that it be utilised within local authorities.

3. The purpose of strand 2 was to examine the work that has taken place over the last two years in relation to the development of benchmarking, comparative performance data and cost measurement, and to assess how it can contribute to the performance of local authorities in Scotland and, in turn, the services they deliver. The Committee was also keen to use this strand to probe whether local authorities, along with the various inspectorates, were “counting the right things”,1 a question prompted by a comment made at a round table event the Committee held as part of its initial information gathering work on 7 September 2011.

4. As part of its initial call for evidence at the outset of the inquiry, the Committee included six questions relating to benchmarking. The responses to those questions were summarised by SPICe and used to inform this strand of the inquiry. A copy of that summary can be found at Annexe H.

5. To commence strand 2 of the inquiry, the Committee organised a seminar in conjunction with the IS; this took place within the Parliament on 10 September 2012. The seminar was an opportunity for attendees to be informed about and discuss the performance data being produced by SOLACE and the IS, and to consider the utility and effectiveness of benchmarking in Scotland more widely. Invitations were issued to all local authorities and others with an interest in or contact with authorities across Scotland.

6. At the seminar participants heard from academics, the IS, the Local Government Data Unit Wales and Audit Scotland, as well as having an opportunity to ask questions and take part in a series of breakout groups. The event was webcast live and, as a first for the Parliament at a Committee event, the external community was invited to submit questions during the meeting via Twitter. A number of questions were received, asked and answered.

7. The Committee thereafter took formal evidence from Scottish Water, given its relatively recent experience of using a benchmarking approach, before hearing from COSLA, a cross-party panel of council leaders and the IS for an update on progress. The Committee wishes to thank all who gave evidence, written or oral, and all who contributed to and participated in the seminar. The evidence and information received has greatly contributed to the Committee’s understanding of benchmarking.

WHAT IS BENCHMARKING AND WHAT IS ITS PURPOSE?

8. Before exploring the challenges and opportunities afforded by benchmarking, it was necessary for the Committee to develop an understanding of what benchmarking is and its purpose.

Definitions

9. There are many different definitions of benchmarking. The Chartered Institute of Public Finance and Accountancy (CIPFA) defines benchmarking as follows—

“Benchmarking is the process of searching for, and achieving, excellent levels of performance. This is achieved through a systematic comparison of performance and processes in different organisations, or between different parts of a single organisation, to learn how to do things better. Its purpose is continuous improvement in levels of performance, by identifying where changes can be made in what is done, or the way in which things are done.”2

10. Audit Scotland states that—

“There are probably as many definitions of benchmarking as there are organisations engaged in it. Benchmarking is best thought of as a structured and focused approach to comparing with others how you provide services and the performance levels you have achieved. The purpose of such comparison is to enable you to identify where and how you can do better. Benchmarking is concerned with finding and implementing better practice and performance wherever it is found.”3

11. In a presentation to the seminar held on 10 September, Dr Grace from Cardiff University Business School suggested that benchmarking should not be described in tight definitional terms, but simply as the “comparison of services against an external standard”.4

12. Benchmarking is undertaken across a range of public sector bodies in the UK. The NHS Benchmarking Network was established in 1996 to enable NHS organisations to share best practice and learn from one another. The Higher Education Funding Council for England funded a benchmarking project in 2000; a number of institutions developed an online benchmarking database and continue to exchange data annually. Current funding aims to increase participation and activity across the HE sector, including encouraging more universities to benchmark their activities.

13. Quality Scotland now has over 200 members5 connected by a shared belief in the value of Business Excellence. Quality Scotland explains its approach to benchmarking in the private sector—

“Benchmarking is the process of measuring and comparing the performance of your organisation against another organisation that is recognised as a leader in that field. Benchmarking should fundamentally be treated as an on-going structured approach to improving your organisation’s performance and not a one-off measure. The aim is to create a learning experience, where best practice can be incorporated within your organisation.

To implement a benchmarking programme, you must first identify your organisation’s strengths and areas for improvement; this will help you determine which areas you would like to benchmark in. Once you have identified an area that you wish to focus on, the next step is to gather information on how other best-in-class organisations tackle this issue.” 6

14. The Global Benchmarking Network study in 2008 heard from 452 participants in 44 different countries and found that 88% used a form of benchmarking, while 20% reported an average financial return of over $250,000 per project. The survey also noted that—

“private organisations are more sceptical to share their corporate information and have more difficulties in finding benchmarking partners than public ones.”7

Purpose of benchmarking

15. Dr Grace at the seminar on 10 September articulated why he believes that benchmarking matters for local government. Firstly, he suggested that it is important because of the size and cost of local government. Secondly, its importance can be attributed to the fact that local government spend is primarily drawn from central funds. And thirdly—

“notwithstanding the emphasis that people place on local matters, there is public aversion to the postcode lottery. People want to make comparisons and to know how their area is doing compared with others and over time. They are not content with simply holding their own local public services to account. In effect, what we have in the UK is an inversion of that old thing about no taxation without representation. In a curious way, in UK local government, there is local representation and service delivery without a great deal of local taxation.”8

16. Dr Grace outlined the purposes of benchmarking—

“Benchmarking can be done for the purposes of economy, efficiency, effectiveness or excellence. However, in any system such as this, there can be both potential and some drivers for gaming. We might find that benchmarking is in part—or for some people—about evasion rather than those good purposes. That is perfectly possible. There is also, especially in England, a strong focus on anything that relates to austerity and will help to deliver the austerity agenda. It is critical to the effectiveness of any system to think through the purposes for which one is following it.”9

Dr Grace added—

“Benchmarking is an arrow in the quiver of public services improvement; it is not the answer. It is always best applied from what I call the improvement end of the telescope. You should ask, What do I want to improve? How do I think that will happen? How do I assess where public services are and what will move them on? Do I want to measure only the outcome: the eventual improvement? Alternatively, do I want to measure some of the intermediate processes?”10

17. Dr Grace also stressed how important it is that benchmarking is seen in the wider context of public services reform as part of a drive to improve services and welcomed the Committee’s consideration of benchmarking in this context. The Commission on the Future Delivery of Public Services in Scotland (“the Christie Commission”) stressed the need for improved transparency and accountability in the way public service organisations operate.

18. The Christie Commission reported “a patchwork”, with little evidence that benchmarking is taking place across the public sector; rather, it is taking place mainly internally within some local authorities. Its report recommended that public service providers “undertake regular benchmarking against comparable services and report publicly and annually on outcomes achieved and financial performance.”11

SOLACE and Improvement Service project

19. Mark McAteer of the IS explained the purpose of the work undertaken by it and SOLACE—

“Roughly two years ago, when SOLACE approached the Improvement Service to undertake the work on benchmarking, we agreed a clear statement of purpose for the exercise and what it was designed to achieve. From the outset, of critical importance to SOLACE was that the exercise should drive improvement in council service delivery.”12

20. The Committee heard that local government had from the outset taken ownership of the project, with SOLACE the driving force behind it, supported by the IS. COSLA agreed, indicating that “this benchmarking approach is a creature of local government and a tool that has been developed by local government officers. That is one of its strengths.”13 It added that the project “allows us to take ownership of those indicators that we think add the greatest value to our understanding of management information locally.”14

21. In delivering this project and the eventual improvement in council service delivery, Mark McAteer stressed the importance that had been given to reaching a consensus amongst all councils on the approach.

22. Councillor David O’Neill, the President of COSLA, explained the purpose—

“Almost everything that we do is based around getting better outcomes. We are aware that to achieve that aim we need to understand how we are performing.”15

23. Mark McAteer advised the Committee that work on the project had to date focussed on the collection of data to inform the benchmarking. He stressed, however, that the collection of data should not be seen as an end in itself; the data should be used to drive service improvement.

24. Mark McAteer echoed Dr Grace’s view that benchmarking is not a magic bullet in itself, but rather a tool for local authorities to deliver service improvement—

“We were also clear from the outset that the indicators that we ended up with were to be high level. In a sense, they would be can openers—they would not explain everything about councils and their performance, but would enable chief executives to open up the can of their services and see how their delivery of a service compares to that of other councils, and then drill down into that to explain any variation in the level of delivery.”16

25. In the course of the inquiry the Committee has developed an appreciation and understanding of the purpose and value of benchmarking as a concept, and has recognised its potential for identifying and driving improvement in the delivery of services by local government.

26. The Committee heard compelling evidence from Scottish Water on how benchmarking had contributed post-merger towards annual cost efficiency savings of £100 million in the period between 2002 and 2006.17 The Committee were advised that in addition to cost savings there had been huge improvements for customers. Belinda Oldfield from Scottish Water indicated that “there is no down side” and that the benchmarking process “was directionally correct”.18

27. COSLA in evidence to the Committee saw benchmarking as part of a process of continuous improvement. Councillor Michael Cook, the Vice President of COSLA, agreed that it was desirable to reach a situation—

“in which it must be possible to collect any one of the indicators on a comparative basis across all 32 councils. That comparative analysis seems to be the key building block of the approach for the future.”19

28. Council leaders who gave evidence on 31 October each supported the approach, indicating a number of benefits, mainly in relation to evidencing and confirming how their respective councils are operating.

29. The Committee, however, retains reservations about the extent to which the political leaders in local authorities fully understand the purpose for which benchmarking can be utilised and the potential value of the data it can produce. Concerns arise from comments made in evidence, for example by Councillor Cook of COSLA—

“the people who have the strongest interest in how councils perform are elected members, senior managers and those who are employed in the councils.”20

30. Council leaders gave the strong impression that no external help was required for staff. Leaders indicated they would be disappointed if the figures “do anything other than confirm what we already know”, and perhaps most worrying was a comment by Councillor Bill McIntosh (South Ayrshire Council) that could be interpreted as suggesting a possible complacency about the benefits of the information—

“We might say when we get the benchmarking figures for certain areas, ‘Yes—that is about where we thought we should be.’ However, we must then ask whether we want to improve that position, or whether we should, for whatever reasons based on local priorities, leave it at where we are and concentrate on other areas.”21

31. The evidence suggested that benchmarking would be used by council leaders reactively rather than proactively. They would turn to benchmarking information only as problems arose and, as Councillor McIntosh indicated, would use it as a means to confirm their impressions of their council’s existing position—

“that my council is mid-range rather than up at the top or down at the bottom.”22

32. Similar concerns arise from the comments of Councillor Ken Guild (Dundee City Council), who indicated—

“Benchmarking enables people to work out whether councils are producing the services that they are required to produce at an acceptable level; it is not about how councils go about producing services. That is probably the correct approach.”23

33. Councillor Guild indicated that the main point was to improve the level of service and encourage best practice. The Committee notes that the potential for savings to emerge is of at least similar value.

34. The Committee urges local authorities to get fully behind this exercise and ensure that its full potential can be realised.

35. The Committee endorses and welcomes the introduction of benchmarking. The Committee applauds local authorities, along with SOLACE and the IS, for recognising the need for councils to take forward this initiative and for developing an approach which the Committee considers has the potential to bring about a huge step forward in improving the quality of services and to deliver cost savings in coming years.

36. The Committee urges COSLA and the Scottish Government to ensure that all council leaders fully understand the potential benefits that can and should arise from appropriate use of the benchmarking data.

37. Witnesses were questioned as to the length of time this work had taken and when the information would become publicly available. All defended the two-year period as being reasonable and proportionate, given the work involved, while stressing the need to get the information correct and the time and effort involved in aligning information, particularly regarding costs, across all 32 councils. Mark McAteer advised the Committee that publication would now happen early in the New Year, once final data sets became available from central government. The Committee looks forward to the release of this data in January 2013.

38. The remainder of this report looks at the approach adopted, including the indicators selected; looks forward to how they can be utilised, both within and outwith local authorities; considers areas for potential development; and sets out how the Local Government and Regeneration Committee will monitor benchmarking and the information it provides in future years.

The Indicators

39. At the seminar Mark McAteer explained that the aim was to provide a “genuine comparative framework across all 32 councils”24 and that from the outset the process had to be consensual, with all councils remaining involved. The indicators selected were to be high level and were not intended to explain everything about councils and their performance but—

“would enable chief executives to open up the can of their services and see how their delivery of a service compares to that of other councils, and then drill down into that to explain any variation in the level of delivery.”25

40. Initial areas identified were “big-ticket” service areas, such as children’s services, corporate services and social work. Ultimately 47 indicators were identified, with the aim of getting “data to the point at which it can help councils drive improvement in their performance”. The Committee were told that overall the suite of indicators was—

“getting close to a balanced scorecard for cost, output and customer satisfaction on all those major service areas in Scotland.” 26

41. COSLA in their written evidence supported the suite of indicators developed, and council leaders agreed, suggesting that it was necessary to “suck it and see” and that the suite seemed to target the right areas.

42. Details of the 47 benchmarking indicators provided by COSLA are listed at Annexe B.

43. The benchmarking data has a baseline year of 2010-11 and, when first published, will also cover 2011-12 and the three years preceding 2010-11. This will provide initial comparative data over a five-year period with the ability to show any trends arising, although the three years prior to 2010-11 have not been subjected to the same rigorous quality control or allocation of support costs as the subsequent data.

44. Much of this core data derives from the Local Financial Returns required of local authorities by central government, and also draws on data from statutory performance indicators, as well as customer satisfaction data drawn from the Scottish Household Survey. The Scottish Government advised the Committee of ongoing work in relation to these areas and also on Scottish Neighbourhood Statistics, which would assist with accuracy and utility.27

45. The Committee noted some potential issues with the indicators selected, in particular the absence of procurement, planning and economic development information. In each case work to address the gaps was earmarked for the next stage of development. For those authorities that utilise Scotland Excel, procurement information was less important given that the service is centrally driven.

46. There is also a relative weakness in outcome data particularly in relation to the difference that services are making to people’s lives. Mark McAteer indicated—

“We are strong on input and output data across local government and, indeed, across most of the public sector, but we need to improve the outcome data on the differences that services ultimately make to people’s lives. The trouble is that that extends us into the partnership arena beyond local government. If we want to truly understand outcomes and our impacts on them, we need to go beyond local government as part of the exercise. That in itself will be complex.”28

47. The issue of outcomes and services data is explored further later in the report in relation to the development of benchmarking to include CPPs.

48. Further development of the indicators requires to be an ongoing process in the coming years, particularly in assisting with public sector reform. The Committee agrees with the IS that further disaggregation down to the lowest level is required, with the aim of councils and communities being able to understand what is happening in their areas.

49. The Committee will be looking at outcomes in much greater depth as part of strand three of its inquiry into Public Sector Reform and in particular looking at possible implications for Community Planning Partnerships.

50. Development to include CPPs is looked at in a little more depth later in this report at paragraph 98 onwards.

Using the indicators

51. The Committee noted that the indicators selected were in the main designed to measure inputs and outputs, and were interested to discover the ways in which it was anticipated they would be used. An initial concern that the indicators could not be compared directly was addressed, and the Committee were assured that significant work had been undertaken to ensure a degree of consistency. A process of normalisation had been undertaken to allow for direct comparison across all 32 local authorities, where relevant.

52. COSLA made the point that ultimately what had been achieved was a method to compare “uniform delivery of services” as opposed to “uniform benchmarking”.29 The information collected would allow comparative analysis although it was stressed by COSLA that care needs to be taken to consider a variety of contextual factors that could lie behind the figures. An example showing how the data could potentially be used in practice relating to educational costs and attainment can be found at Annexe C.

53. An additional and significant factor that requires to be considered when using the data to make comparisons relates to the political choices of local authorities. Not all services are, or indeed should be, available uniformly across the country; local factors must continue to play an important part in the choices authorities make. Local democracy was explained as a factor by Councillor Cook of COSLA—

“We need to recognise that when it comes to a whole range of factors, there is legitimate variation based on local democracy. It is up to councillors, who are the local democratic agents within councils, to make a policy judgment about some of those things…It may also partly be a consequence of policy determinations that those elected members have made and that is absolutely right. Sometimes we hear complaints about the postcode lottery. Sometimes the postcode lottery is local democracy in action.”30

54. Council leaders in evidence agreed with this aspect, referring to the political choices and decisions that authorities make and the funding decisions that can flow directly from them.31

Benchmarking families

55. SOLACE and IS are proposing an approach to benchmarking where local authorities will be grouped together in “families” of authorities. Mark McAteer of IS explained this approach to the seminar—

“We have agreed with SOLACE the development of family groups among the 32 councils, by which I mean that we will group councils on a like-for-like basis to allow them to get into much more detailed, drill-down activity to explain, for example, variations and what is going behind the scenes and behind the numbers.”32

56. Mark McAteer told the seminar that local authorities had been grouped together on the basis of socio-economic characteristics. He was keen to stress, however, that local authorities will be able to work with local authorities outwith their family group.

57. Some concerns were expressed to the Committee about the family groups approach. For example, Argyll and Bute Council questioned whether families of authorities could be identified, given that few share the same or similar geographical make-up.

58. The family grouping approach was one of the issues discussed by the breakout groups during the benchmarking seminar. Three out of four groups had concerns about the family grouping approach.

59. A number of different factors prompted participants’ concern. Firstly, concern was expressed that this approach would preclude authorities from benchmarking against authorities outwith their family group. Secondly, some participants suggested that it would be more appropriate if local authorities benchmarked against different local authorities depending upon the particular service, rather than against the same group for every service. And finally, participants questioned whether family groups would be able to maintain momentum over time without widening their membership.

60. In response, Mark McAteer stressed that local authorities would not be limited in who they benchmark against—

“A tension that perhaps crept into the discussion about family groups related to whether we would limit councils to working only with defined family groups. SOLACE’s answer to that is no—we are saying that a council would be part of one family group as a minimum, but would be at liberty to benchmark against others outwith that family group.”33

61. Mark McAteer continued—

“Family groups are for practicality as much as anything else—that was strongly discussed in my group. Across all services, can we get all 32 councils together in a single room to discuss things? We probably cannot, so family groups provide a practical basis for organising the drill-down discussion that needs to take place beyond data to get to issues of improvement and so forth.”34

62. COSLA, in their written submission, made the useful point that—

“The merit of a more organised approach is that helpful comparators that otherwise may not be obvious can be forged. Secondly, councils achieving better performance will not be overburdened by requests from others to understand how they have achieved their level of performance.”35

63. Council leaders who gave evidence noted existing and ongoing liaison with other councils to compare and learn from one another. For education there are existing groupings produced by Education Scotland, and Councillor Jim Fletcher gave an example of how that has been utilised in East Renfrewshire—

“As a council we are grouped by Education Scotland in a banding that includes similar councils in Scotland, such as Midlothian, East Lothian, Stirling, Aberdeenshire and East Dunbartonshire. As the benchmarks showed East Dunbartonshire Council doing particularly well in mathematics, we asked the obvious question: why does East Dunbartonshire outperform East Renfrewshire? Quietly, officers and principal teachers of mathematics went out there to have a frank discussion with people in mathematics departments in East Dunbartonshire in order to learn what people in that authority were doing that was different from and better than what was being done in our authority. I honestly think that the benchmarking is used as a way to drive performance.”36

64. Notwithstanding reservations about the areas not currently included, the Committee is satisfied that 47 returns should be ample to enable local government to monitor and demonstrate performance on its activities, and recommends that as additional indicators are identified others are removed.

65. The Committee in its ongoing work in future years will keep a close watch on this aspect. The Committee recognises the pragmatism behind the approach to using families and, while generally content, is concerned to ensure that the approach does not inhibit authorities learning from the best whether within or outwith Scotland.

The Role of the Regulators

66. The Christie Commission recognised that—

“well designed external challenge can be a catalyst for improvement where it influences behaviour and culture of providers, leading to improvements in the way that services are delivered”37

67. The Committee agrees and considers that Audit Scotland and other regulators have a significant part to play in securing improvement of public services.

68. The Committee noted that the IS and SOLACE had been having ongoing discussions with Audit Scotland on the development of the indicators, with further areas for inclusion identified. It is important that this dialogue continues and includes all regulators requiring information from local authorities. The Accounts Commission for Scotland have indicated that, if there is a suite of indicators that can provide better performance and comparative information and help with public accountability, they would utilise it in place of the existing Statutory Performance Indicators.38

69. COSLA supported a move towards rationalisation, making the cogent point that local authorities as part of this exercise—

“want to look further than where we currently are to the wider range of indicators and performance information that is being collected, and take an honest look at whether all that adds value and whether it can be improved in a similar way.”39

70. Councillor Michael Cook, Vice President of COSLA, noted—

“We already have a series of statutory performance indicators, some of which, frankly, are pretty redundant.”40

71. The Committee is concerned to see that the collection of data for benchmarking is used as extensively as possible, including facilitating the de-cluttering of the returns landscape. The Committee urges all regulators requiring returns in the coming years to consider utilising data collected for or in support of benchmarking. The Committee looks to the Accounts Commission for Scotland to make that judgment in early course.

72. The Committee considers that the number and content of returns, including those to the Scottish Government, must reduce and that only in exceptional circumstances (such as discrete areas not covered by benchmarking) should returns additional to benchmarking data be required.

Challenges to the success of benchmarking – managerial, cultural, political, and media

73. Although the data has now been collected, this is only the start of the work as far as benchmarking is concerned. The Committee, while looking forward to publication of the figures, recognises that there are a number of challenges and obstacles to making effective use of the data. In this section of the report, the Committee explores these challenges and how they might be overcome.

Interpretive skills, management capability and organisational culture

74. Andrew Stephens from the Local Government Data Unit Wales referred to the need for a culture change. Ownership of performance and the reporting of it requires to lie in service areas and not be the domain of the corporate centre. He indicated that—

“to seriously change performance, it needed to be part of the day-to-day job of the service managers and the service leaders in the service areas across local authorities.”

75. Andrew Stephens went on to state that—

“It is not just the service areas that need to take ownership of performance, it is also necessary for the performance leads and the corporate centres in authorities to push such ownership and to say, ‘Can we help you use this data to improve?’ instead of asking service areas to give them data once or twice a year on how the service is improving.”41

76. The Committee heard from Scottish Water how it initially lacked the necessary skill set amongst employees to properly take forward and realise the benefits that ultimately accrued, and continue to accrue, from benchmarking. Belinda Oldfield referred to a skill shortage in relation to economists and statisticians.42 She also referred to a need for senior management to buy into the concept and a need to get the whole organisation to understand and have confidence in it.43 She later indicated—

“One of the initial challenges in bringing benchmarking into any service delivery environment lies in giving people who have to be benchmarked a compelling argument about what the benefits will be, trying to dispel the natural fears that come with it, and transforming the approach into big benefits.”44

77. While COSLA agreed they were happy to learn from others, neither they nor the council leaders saw this as a significant problem, as local authorities had been gearing up for the advent of benchmarking for at least two years. Barbara Lindsay from COSLA indicated that “we have to make this part of everyone’s job.”45 In effect, all staff had for a number of years been required to consider performance management issues pertaining to their roles.

78. The Committee observes that the position of local authorities can be distinguished from that of Scottish Water which previously had a single purpose.

79. The Committee encourages local authorities to learn from Scottish Water’s experience and ensure that they access the appropriately skilled staff to maximise value and realise the full benefits from the benchmarking data.

80. Both COSLA and council leaders addressed the cultural issue in evidence on 31 October. COSLA agreed that a culture needed to be developed in which people can say what they are good at and what they are poor at, recognition of poor performance being key to addressing and improving it. Performance needs to be embedded into the culture of each local authority. Councillor David O’Neill, President of COSLA, saw this as only—

“the start of a process and I suppose that in a few years some indicators will be replaced and some will be altered. We should not see the indicators as being set in tablets of stone. As we improve the tool, we ought to be able to improve the outcomes for our communities.”46

81. The council leaders who gave evidence were clear that responsibility for utilising benchmarking should lie with elected members and officials at all levels and be embedded into working processes. Each wished for a culture in which the whole workforce understood that responsibility for good working and performance belonged to everybody. They generally thought that such a situation already pertained.

82. Given the evidence, the Committee expects to see strenuous efforts made to embed benchmarking in the culture of all councils, with all staff recognising and taking individual responsibility. Strong leadership from officials and politicians will be required to achieve this. The Committee considers that there is a central role for the Scottish Government and the IS in encouraging this to happen.

83. The Committee will look closely at future regulatory reports for evidence that the confidence shown by council leaders in this regard has been justified, and that benchmarking has been accepted by all staff and has been embedded into processes across each authority.

Political and media challenges

84. Dr Grace (Cardiff University Business School) acknowledged the challenges generally associated with adopting benchmarking and the political realities that arise. It is recognised that it is difficult for politicians to have a time horizon which allows long-term planning. Dr Grace noted that politicians are accountable for public services—

“However, there are huge problems related, in particular, to the time horizons of politicians, because it is very difficult for a politician to have a time horizon around matters related to public services, which give rise to so much feeling. It is very difficult to say, ‘Don’t worry—our benchmarking system shows where the issues are, we have a plan and in two or three years’ time you will see change that is beneficial.’ It is much more likely that there will be a different kind of response. You are the experts on that, so I will not describe it.”47

85. This report earlier notes the legitimacy of political choices being made by local authorities to reflect local factors. See in particular paragraphs 53 and 54.

86. Support for taking a longer term view can be found in the Scottish Government’s response to the Christie Commission which noted that—

“Over the course of this Parliament Scotland’s public services will make a decisive shift towards prevention and take a holistic approach to addressing inequalities. This focus is essential to address the current squeeze on the Scottish budget, tackle persistent inequalities and ensure the sustainability of our public services in the longer term.”48... “There is a growing body of evidence which demonstrates that spending on prevention can deliver better solutions and outcomes for individuals and avert future costs to the public sector.”49

87. A feature of all the evidence received was concern about media reaction, or overreaction, when the figures are published. It was considered inevitable that the media would turn the data into league tables and focus on the weaker performing councils under each indicator.

88. This was a major issue raised in breakout groups during the event on 10 September, including a concern that the legitimacy of local choice and control over prioritisation of services would be lost or diluted. Dr Grace suggested that enhanced scrutiny could encourage local politicians to “articulate the policy choices that they make”, and that there were legitimate choices to be made around economy and excellence of services—

“A local politician can decide that they want cheap services. That is a decision that they are entitled to make, and some authorities make it, for example, in the social care field…They can also decide to go for excellence, which is fine, too.”50

89. In summarising the group discussions Mark McAteer indicated that—

“The tenor was that the media will do what they do and always have done, so we just have to roll with the punches and not be obsessed by that. The media will have a short-term story, then we will all have to go back and get on with the real job of driving improvement across services.”51

90. Scottish Water commented on their experiences of adverse publicity following publication of benchmarking information and noted that it created “added stimulus for more focus on improving services. It drives behaviour in the business for ensuring that we create the service change that we need” and “provides the impetus for not wanting to be in that position year after year.”52

91. In recognising the likely media response, COSLA made a plea for politicians to support them and avoid a tendency to criticise. Councillor Michael Cook, the Vice President, stated—

“One way in which we can do that involves politicians acting in a more mature manner. For example, if one local authority ends up at the wrong end of a benchmarking report, it would be wrong for the opposition on the council to use that as a club to batter the council administration about the head. It would also be wrong for MSPs to use the reports as a club to batter local authorities about the head. We can hardly criticise journalists for doing that if we as politicians do it, so we need a degree of maturity from the politicians. Among the Scandinavian countries, Sweden has been doing that for many years. It has that degree of maturity, in that benchmarking is already viewed as an improvement tool and is not used as a club to batter folk.”53

92. Work is ongoing to ensure that when the information is published it will be accompanied by clear contextual explanations to assist the public to understand what the information tells them about their authority. As Councillor Jim Fletcher (East Renfrewshire Council) stated—

“The important thing is to get out to the public what is happening in our council and what, if anything, we are doing about it. Benchmarking is not a one-off event but part of a process.”54

93. The Committee were pleased to learn that preparatory work was being undertaken to accompany the release of information and that COSLA were taking the lead to support councils by looking at media management and the core messages.

94. The Committee encourages the development of media messages well in advance to focus on the positives, and crucially to emphasise the opportunities the data presents to learn from others, to drive down costs and drive up service improvements.

95. In this context the Committee urges all politicians to support this work, and to work and comment constructively to ensure that the benefits that can ensue from using benchmarking can be realised.

96. The Committee considers that publication of the indicators presents an opportunity to address an existing democratic deficit and provide some clarity to the public.

97. The Committee echoes the comments of the Christie Commission and also believes that—

“the drive for improvement and better accountability can be enhanced through greater openness and transparency surrounding budget decisions, analysing the costs of services delivery and the degree to which services achieve their stated objectives.”55

Community Planning Partnerships

98. During the evidence session on 31 October both COSLA and council leaders were asked what the challenges were in applying benchmarking to community planning partnerships. Mark McAteer observed that—

“Benchmarking is ultimately about services and how they perform but, at present, community planning partnerships do not deliver services. They are co-ordination bodies that allow the key public partners to agree the key outcomes that they then try to reflect in their delivery of services. Therefore, benchmarking applied in that context would be slightly different.”

He added the more complex part—

“if we are serious about benchmarking at the community planning level, will relate to what the contributions of the service bodies add to the outcomes in their areas. That will be an extensive and complex piece of work. We have had to deal with technicalities such as the accountancy system in local government, but those issues will have to be resolved for each of the major public bodies. Therefore, that will be a complex piece of work, but I think that collectively we should commit to it and start to undertake elements of it. We could bring the learning and experience of the work with local authorities to support that and to advise other partners, although we should not underestimate the challenge of doing that.”56

99. Councillor Michael Cook, Vice President of COSLA, indicated—

“a desire to build on the process in which we are now engaged and to move forward to carry out benchmarking in relation to community planning partnerships and not merely local authorities.”57

He added later in the session, accurately discerning the Committee’s view—

“that community planning is a very good answer to the aspiration that we have across the country to improve outcomes. That is what it is all about. We need to build benchmarking mechanisms that allow us to drive performance on a cross-sectoral basis—for example, when work cuts across a local authority, a health board and the third sector. If we can identify outcomes and indicators and use those as weapons to drive performance, we will get to the right place”58

100. The Committee, while accepting that it will take time, agrees with Councillor Bill McIntosh (South Ayrshire Council) that it would be prudent to—

“see how the project works in the first year for councils, and then consider bringing it in for community planning.”59

101. To that end the Committee will be looking at outcomes in much greater depth as part of strand three of its inquiry into Public Sector Reform, and in particular looking at possible implications for CPPs.

102. While desirous to see this take place, in health boards and the police and fire services in particular, the Committee takes a similar view in relation to rolling out benchmarking to other parts of the public sector.

Future scrutiny and utilisation of benchmarking by the Committee

103. Throughout this report the Committee have emphasised the value and utility they consider will accrue to local authorities and their residents from benchmarking. The information available will also assist the regulators, including Audit Scotland, in their work with authorities, and the Committee will look carefully at future reports to ascertain how benchmarking is progressing.

104. During strand three of the Committee’s inquiry into public sector reform the Committee will expect to be advised how benchmarking data is being utilised as part of work on developing new ways of delivering services.

105. The Committee will expect regular updates from SOLACE and COSLA on progress with the next phase of the project, and expects as part of all updates to be provided with information on ways in which the original data is being both developed and utilised. The Committee intends, at least once per annum, to question selected local authorities directly on their progress in relation to each of the areas covered by this report.

ANNEXE A: SUMMARY OF CONCLUSIONS AND RECOMMENDATIONS

SOLACE and Improvement Service project

(Paragraph 34) The Committee urges local authorities to get fully behind [the Benchmarking] exercise and ensure that its full potential can be realised.

(Paragraph 35) The Committee endorses and welcomes the introduction of benchmarking. The Committee applauds local authorities, along with SOLACE and the IS, for recognising the need for councils to take forward this initiative and for developing an approach which the Committee considers has the potential to bring about a huge step forward in improving the quality of services and to deliver cost savings in coming years.

(Paragraph 36) The Committee urges COSLA and the Scottish Government to ensure that all council leaders fully understand the potential benefits that can and should arise from appropriate use of the benchmarking data.

(Paragraph 37) The Committee looks forward to the release of [Benchmarking] data in January 2013.

The Indicators

(Paragraph 49) The Committee will be looking at outcomes in much greater depth as part of strand three of its inquiry into Public Sector Reform and in particular looking at possible implications for Community Planning Partnerships.

(Paragraph 64) Notwithstanding reservations about the areas not currently included, the Committee is satisfied that 47 returns should be ample to enable local government to monitor and demonstrate performance on its activities, and recommends that as additional indicators are identified others are removed.

(Paragraph 65) The Committee in its ongoing work in future years will keep a close watch on [Benchmarking families]. The Committee recognises the pragmatism behind the approach to using families and, while generally content, is concerned to ensure that the approach does not inhibit authorities learning from the best whether within or outwith Scotland.

The Role of the Regulators

(Paragraph 71) The Committee is concerned to see that the collection of data for benchmarking is used as extensively as possible, including facilitating the de-cluttering of the returns landscape. The Committee urges all regulators requiring returns in the coming years to consider utilising data collected for or in support of benchmarking. The Committee looks to the Accounts Commission for Scotland to make that judgment in early course.

(Paragraph 72) The Committee considers that the number and content of returns, including those to the Scottish Government, must reduce and that only in exceptional circumstances (such as discrete areas not covered by benchmarking) should returns additional to benchmarking data be required.

Challenges to the success of benchmarking – managerial, cultural, political, and media

(Paragraph 79) The Committee encourages local authorities to learn from Scottish Water’s experience and ensure that they access the appropriately skilled staff to maximise value and realise the full benefits from the benchmarking data.

(Paragraph 83) The Committee will look closely at future regulatory reports for evidence that the confidence [of cultural change] shown by council leaders in this regard has been justified, and that benchmarking has been accepted by all staff and has been embedded into processes across each authority.

Political and media challenges

(Paragraph 94) The Committee encourages the development of media messages well in advance to focus on the positives, and crucially to emphasise the opportunities the data presents to learn from others, to drive down costs and drive up service improvements.

(Paragraph 95) …the Committee urges all politicians to support [Benchmarking], and to work and comment constructively to ensure that the benefits that can ensue from using benchmarking can be realised.

(Paragraph 96) The Committee considers that publication of the indicators presents an opportunity to address an existing democratic deficit and provide some clarity to the public.

(Paragraph 97) The Committee echoes the comments of the Christie Commission and also believes that—

“the drive for improvement and better accountability can be enhanced through greater openness and transparency surrounding budget decisions, analysing the costs of services delivery and the degree to which services achieve their stated objectives”.

Community Planning Partnerships

(Paragraph 101) To that end the Committee will be looking at outcomes in much greater depth as part of strand three of its inquiry into Public Sector Reform, and in particular looking at possible implications for Community Planning Partnerships.

Future scrutiny and utilisation of benchmarking by the Committee

(Paragraph 105) The Committee will expect regular updates from SOLACE and COSLA on progress with the next phase of the [Benchmarking] project, and expects as part of all updates to be provided with information on ways in which the original data is being both developed and utilised. The Committee intends, at least once per annum, to question selected local authorities directly on their progress in relation to each of the areas covered by this report.

ANNEXE B: DETAILS OF BENCHMARKING INDICATORS

ANNEXE C: DATA EXAMPLES RELATING TO EDUCATIONAL COSTS AND ATTAINMENT

If benchmarking is to truly add value to the improvement processes of a service, practitioners need to get beyond initial data about a service to understand why it performs as it does. An example exploring this in relation to education services follows.

Different councils structure their services in ways that meet their local needs. This means that in examining cost structures there is no easy line of causality between costs and performance – it depends on what a service is specifically seeking to achieve, where the service is on its own improvement journey (different councils will be at different stages), and where the service is on its own investment cycle. In exploring this in relation to education at secondary school level, the following types of issue need to be considered:

In examining the range of education spend across 32 councils, all that can be deduced is that some councils spend more per pupil than others; spend data alone does not explain why the variation occurs. To understand this it is necessary to drill further into the data in order to make sense of what drives the expenditure at each council level.

In exploring this variation differences in the staff structure of different councils can be expected to emerge – some will have more staff who are more experienced (older) and therefore more likely to be placed at higher salary points than in other councils, so increasing costs within the service. Equally, different councils are unlikely to have a similar balance of staffing levels between teaching, classroom support and administrative staff: these impact on the cost structure and can explain variation between councils in what they spend.

Other issues need considering, such as the impact of investment in school estates that some councils have undertaken through Public Private Partnerships (PPP/PFI) – many did so on the basis that that was the only funding mechanism available to them at the point in time when they needed to renew and invest in their school estate. The costs associated with that investment impact on the revenue spend of those councils; councils which did not undertake the initial PPP investment do not carry these costs.

Also of significance will be the size and location of the school estate. These factors will affect costs such as security, heating, maintenance and cleaning. Other factors such as pupil transport costs can also help explain the differences between council spend.

Lastly in this example there is a need to better understand the composition of the children attending local schools. In some areas of Scotland there are significant numbers of children who, when first attending school, may have English as a second language, and those councils therefore have to support those children in improving their English language, thus increasing teaching costs. Equally, research has shown that many children coming from lower socio-economic backgrounds also require additional support on first entering school, and again councils with higher numbers of such children attending their schools will see this translate through into the cost base of the service.

Therefore, in order to make sense of why councils’ per pupil spend varies, the impacts of such ‘factors of production’ need to be understood. This is an important step beyond simply recording the fact that different councils spend different amounts on children in the education system.

It is also possible to look beyond the cost factors within a service such as education at how well services perform. There is not a strong relationship between educational attainment and what a council spends per child within the education system. High spend in itself does not equate to high performance. It is again necessary to go beyond the raw data; typically it will be found that the social background of children is a critical factor in understanding how well they are likely to perform in education over their school life.

Typically, children from more affluent socio-economic backgrounds perform better in educational exams than children from lower socio-economic backgrounds. In looking beyond this general maxim, however, it is necessary to understand how the performance of children from similar backgrounds compares across schools and across local authority areas. By comparing ‘like for like’, schools can be identified in which the social backgrounds of children are broadly similar; it can then be seen which of those schools perform best and, critically, why they do so. This may be down to the policy approach of the council on how it offers additional support to the school; it may be down to how a head teacher leads the staff of the school; or it may be down to how teachers within different schools engage with and teach children. Therefore, as with cost, there is a need to get beyond the high-level data, which can help identify where differences in performance occur across educational services, in order to understand why those differences occur and, critically, how others can learn from the best practices of the best performers.
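
A minimal sketch of such a ‘like for like’ grouping follows, again with entirely invented schools and figures; it makes no claim to reflect how the benchmarking families are actually constructed. It groups schools into bands by the share of pupils from lower socio-economic backgrounds and then compares attainment only within each band.

    from statistics import mean

    # (school, % of pupils from lower socio-economic backgrounds, attainment score)
    schools = [
        ("School 1", 12, 74), ("School 2", 15, 68), ("School 3", 44, 61),
        ("School 4", 48, 52), ("School 5", 71, 49), ("School 6", 75, 58),
    ]

    def band(intake_pct):
        """Assign a school to a comparator family by its intake profile."""
        if intake_pct < 25:
            return "low"
        if intake_pct < 60:
            return "medium"
        return "high"

    families = {}
    for name, intake, score in schools:
        families.setdefault(band(intake), []).append((name, score))

    for family, members in families.items():
        average = mean(score for _, score in members)
        best = max(members, key=lambda m: m[1])
        # The interesting question is then why the best performer in each band
        # does better than its peers: council policy, leadership, or teaching.
        print(f"{family} band: average {average:.1f}; best {best[0]} ({best[1]})")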

In summary, when looking at a service, all relevant data needs to be read ‘in the round’ before questions such as whether the cheapest, or indeed the most expensive, service is performing ‘best’ can be answered. In making judgements, cost factors need to be set against other service dimensions so that a rounded understanding of performance is arrived at.

The benchmarking data allows local authorities to undertake these comparisons. While the published data is at a high level, showing cost per pupil and exam attainment figures, the information behind the cost figure will have been calculated using all of the staffing and building costs. Detail of the composition of the pupil population is collected by authorities, which also have access to socio-economic data. By drilling down into the component parts of the cost and attainment figures, a detailed picture starts to emerge which allows clear comparisons to be made across authorities, taking into account all aspects of spend and attainment.

The whole purpose is to get behind the data to drive learning and so improve services. By making comparisons on a like-for-like basis it becomes possible to isolate individual factors and costs, allowing judgements to be made about the effectiveness of those inputs, such as staffing levels, that can be varied.
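
One simple way to isolate the socio-economic factor before judging spend, sketched below with invented figures, is to estimate the attainment a school would be expected to achieve given its intake and then compare the residual (actual minus expected) with what is spent. This is only an illustrative device, not the method used in the benchmarking project.

    # Invented data: % lower socio-economic intake, attainment score, £ per pupil.
    intake = [12, 15, 44, 48, 71, 75]
    attainment = [74, 68, 61, 52, 49, 58]
    spend = [5400, 5100, 5600, 4900, 6100, 5800]

    # Ordinary least-squares fit of attainment on intake, computed by hand.
    n = len(intake)
    mean_x = sum(intake) / n
    mean_y = sum(attainment) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(intake, attainment))
             / sum((x - mean_x) ** 2 for x in intake))
    intercept = mean_y - slope * mean_x

    for x, y, cost in zip(intake, attainment, spend):
        expected = intercept + slope * x
        residual = y - expected
        # A positive residual means the school outperforms others with a similar
        # intake; setting residuals against spend is more informative than
        # setting raw attainment against spend.
        print(f"intake {x:>2}%  spend £{cost}  attainment {y}  "
              f"expected {expected:5.1f}  residual {residual:+5.1f}")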

ANNEXE D: EXTRACTS FROM THE MINUTES OF THE LOCAL GOVERNMENT AND REGENERATION COMMITTEE

10th Meeting, 2012 (Session 4) - Wednesday 25 April 2012

Decision on taking business in private: The Committee agreed to take items 4 and 5 in private.

Public services reform and local government: strand 2 – benchmarking and performance measurement (in private): The Committee considered and agreed its approach to its inquiry.

13th Meeting, 2012 (Session 4) - Wednesday 23 May 2012

Decision on taking business in private: The Committee agreed to take items 4, 5 and 6 in private.

Public services reform and local government: strand 2 – Benchmarking and Performance Measurement (in private): The Committee considered the issue of the appointment of an adviser in connection with its forthcoming inquiry on public services reform and local government: strand 2 – benchmarking and performance measurement. The Committee agreed to seek to extend the contract of its adviser from its strand 1 inquiry on public services reform, for the duration of its strand 2 inquiry.

19th Meeting, 2012 (Session 4) - Wednesday 12 September 2012

Public services reform and local government: strand 2 – benchmarking and performance measurement: The Committee took evidence from Belinda Oldfield, Regulation General Manager, Scottish Water.

23rd Meeting, 2012 (Session 4) - Wednesday 31 October 2012

Decision on taking business in private: The Committee agreed to take items 3 and 4 in private.

Public services reform and local government: Strand 2 – benchmarking and performance measurement: The Committee took evidence from—

Councillor David O'Neill, President, Councillor Michael Cook, Vice-President, Barbara Lindsay, Depute Chief Executive, and Adam Stewart, Policy Manager, COSLA; Councillor Jim Fletcher, Council Leader, East Renfrewshire Council; Councillor Ken Guild, Council Leader, Dundee City Council; Councillor Bill McIntosh, Council Leader, South Ayrshire Council; Mark McAteer, Director of Governance and Performance Management, The Improvement Service.

Public services reform and local government: Strand 2 – benchmarking and performance measurement (in private): The Committee considered the evidence received.

26th Meeting, 2012 (Session 4) - Wednesday 21 November 2012

Public services reform and local government: strand 2 – benchmarking and performance measurement (in private): The Committee considered and agreed a draft report.


ANNEXE E: ORAL EVIDENCE AND ASSOCIATED WRITTEN EVIDENCE

19th Meeting 2012 (Session 4), 12 September 2012

Written Evidence

Scottish Water

Supplementary Written Evidence

Scottish Water

Oral Evidence

Scottish Water

23rd Meeting 2012 (Session 4), 31 October 2012

Written Evidence

COSLA
Scottish Government

Oral Evidence

COSLA
Dundee City Council
East Renfrewshire Council
Improvement Service
South Ayrshire Council

ANNEXE F: OTHER WRITTEN EVIDENCE

Aberdeen City Council
Accounts Commission for Scotland and the Auditor General for Scotland
Association of Directors of Social Work – ADSW
Angus Council
Argyll and Bute Council
Association for Public Service Excellence
CEMVO – Council for Ethnic Minority Voluntary Organisations
Centre for Scottish Public Policy
Children in Scotland
Coalition of Care and Support Providers in Scotland – CCPS
Comhairle nan Eilean Siar (Western Isles Council)
Dundee City Council
East Ayrshire Community Planning Partnership
East Lothian Council
Edinburgh Council
Falkirk Council
Fire Brigades Union
Forum of Private Business
Glasgow City Council
Grampian Police
Highland Council
Long Term Conditions Alliance Scotland (LTCAS)
Lothian and Borders Fire and Rescue Service
MacKinnon, Niall
NHS Ayrshire & Arran
NHS Dumfries and Galloway
NHS Health Scotland
NHS Lothian
North Ayrshire Council
North Lanarkshire Council
Northern Constabulary
Outer Hebrides Community Planning Partnership
Royal Town Planning Institute Scotland
Scottish Association of Social Work (SASW)
Scottish Borders
Scottish Natural Heritage
SCVO – Scottish Council for Voluntary Organisations
Social Enterprise Scotland
SOLACE Scotland
South Lanarkshire Council
UNISON
Vanguard Consulting
Volunteer Development Scotland
West Dunbartonshire Council
West Lothian Council
Public Audit Committee of the Scottish Parliament
NHS Borders
COSLA
Federation of Small Businesses (FSB) Scotland

ANNEXE G: COMMITTEE BENCHMARKING SEMINAR HELD ON 10 SEPTEMBER 2012 – SCOTTISH PARLIAMENT

The Committee held a one-day seminar on benchmarking in Scottish local government as part of Strand 2 of the inquiry. This seminar took place at the Scottish Parliament on Monday 10 September 2012.

Transcript of the Seminar (404KB pdf)

Presentations made by speakers at the seminar:

Seminar Programme

  • 9.30am - Welcome & Purpose of the Workshop
  • 9.35am - What is Benchmarking?: Dr Clive Grace – Honorary Research Fellow, Cardiff University Business School
  • 9.50am - Improving Local Government Benchmarking In Scotland – SOLACE and Improvement Service
  • 10.10am - The Local Government Experience of Benchmarking in Wales – Andrew Stephens, Executive Director, Local Government Data Unit Wales
  • 10.30am - Question and Answer session with the morning’s speakers
  • 11.15am - Breakout Session: Issues and Challenges in Benchmarking and Performance Measurement and how they can be overcome
  • 12.30pm - Lunch
  • 1.45pm - Workshop feedback from facilitators on the morning’s breakout session
  • 2.15pm - Taking Benchmarking and Performance Measurement Forward: Martin Walker, Assistant Director, Best Value and Scrutiny Improvement Group, Audit Scotland
  • 2.35pm - Taking Benchmarking and Performance Measurement Forward: Issues and Next Steps: Discussion session
  • 3.40pm - Next Steps
  • 4pm - Close

Speaker Biographies

Dr Clive Grace – Honorary Research Fellow, Cardiff University Business School. Clive is a former Chief Executive of Torfaen County Borough Council in Wales, and former Director-General of the Audit Commission in Wales. His portfolio now spans the academic, commercial, public, and professional sectors. He is Chair of the Research Councils' Shared Services Centre Ltd, which runs all the Research Councils' back-office and grants operations and is taking on the back office for the Department for Business, Innovation and Skills and its partner organisations. He is also Chair of the BT Wales Board, and a Non-Executive Director of Nominet, the steward of the 9m .uk domain names.

He is currently advising the Tunisian Government on the design of a public services benchmarking system to support the move to federal governance and the challenge of regional economic development and poverty reduction, and the Nepal Government on civil service reform. He is an Honorary Research Fellow at Cardiff Business School, and an Honorary Life Member of both the Chartered Institute of Public Finance and Accountancy and also of SOLACE.

He is a qualified lawyer, and has a Doctorate from the University of Oxford, a Master's degree from the University of California, a Bachelor's degree from the University of Birmingham, and a management qualification from the Open University.

Andrew Stephens – Executive Director, Local Government Data Unit Wales

Andrew is the Executive Director of the Local Government Data Unit ~ Wales. The Data Unit is part of the local government family in Wales. It provides a range of support to the Welsh Local Government Association, Welsh local authorities and their partners. Areas where the Unit provides support and input include: survey design and analysis; data collection, management and dissemination; performance measurement and management; benchmarking; and IT system development and support. The Unit either leads or plays an active role in a number of networks within Wales. In addition to having overall responsibility for the day to day management of the Data Unit, Andrew represents Welsh local government on a number of strategic and working groups. Prior to moving to the Data Unit, Andrew held a variety of posts in the Office for National Statistics, including managing large data collections, statistical methodology and quality, and developing national statistics policies.

Martin Walker, Assistant Director, Best Value and Scrutiny Improvement Group, Audit Scotland

Martin is an assistant director in Audit Scotland’s best value and scrutiny improvement group. The best value and scrutiny improvement group delivers best value audit reports on councils, fire and rescue services and, in conjunction with HMICS, police boards and forces. The group also produces overview reports on local government, police and fire, statutory reports and thematic reports in the ‘how councils work’ series as well as co-ordinating the shared risk assessment process and statutory performance indicators. Martin joined Audit Scotland in 2004 having previously worked in councils for fourteen years in various roles. Born and brought up in Oldham, Martin studied economics and industrial relations at Leeds University before moving to Scotland in 1990.

ANNEXE H: SUMMARY OF WRITTEN EVIDENCE RECEIVED

Introduction

This document was compiled by SPICe and contains a summary of the key themes to emerge from the written evidence received on Strand 2 of the Committee’s Public Services Reform inquiry on benchmarking. The summary is structured through the questions asked in the Committee’s call for written evidence.

Q1. What are the main challenges (cultural, technical, geographical or other) in developing performance measurement and benchmarking systems for local authorities across Scotland?

This question received the most detailed responses, with local authorities making a range of points. Argyll and Bute Council identified four main challenges—

  • differing approaches to service delivery among local authorities can limit the number of opportunities for direct benchmarking of performance;
  • few local authorities share the same or similar geographical makeup;
  • there can be a defensive attitude to benchmarking and publication of performance information. This is particularly the case if performance is perceived to be poor; and
  • for many council services it would be beneficial to benchmark with the private sector; however, private sector organisations can be reluctant to supply information due to concerns regarding the release of commercially sensitive information.

Many submissions focussed on data issues. According to East Ayrshire CPP, the main challenges include a need—

  • “for better quality assurance around the accuracy and consistency of data collection;
  • for greater uniformity of data provision to allow fair comparison of council performance throughout Scotland using quantitative methods;
  • even if data were readily available, to recognise the impact of factors such as sparsity (some authorities are geographically remote, which raises the cost of service provision); demographic differences; and quality of services provided on the cost of service provision;
  • for greater recognition of the wide variance in the performance of local authorities in relation to service provision in terms of spend to provide a service; the level of service provided; and effectiveness in meeting local needs and national targets;
  • to focus on the process behind the numbers, rather than only the numerical data;
  • to ensure consistency in relation to indicators and performance measures over time to allow trend analysis; and
  • for further resources and better timeframes.”

Aberdeen City Council raised the issue of the focus on outcomes, which—

“… presents a challenge of measurement since movement in social outcomes (which is usually what is meant when government refers to ‘outcomes’) is long term; subject to complex influences; and is, generally, not well supported by data collection and reporting. Current data collection arrangements are often infrequent and reporting is historical (e.g. Scottish Household Survey, Census, etc). The use of this data to drive public sector investment is unconvincing. A national investment to increase frequency, timeliness and sample sizes would be welcomed.”

SOLACE made a number of detailed points, including that—

“there are still too many bodies auditing and inspecting public services… [which]…leads to a partial and fragmented form of scrutiny that does not recognise, and indeed inhibits, the holistic approach that is essential to effective service delivery. The reform of external audit and inspection and the development of a single external scrutiny body for all public services would be a significant step that could be taken to drive forward improvements in performance management and benchmarking across public services. Such a body would be well placed to advise the Scottish Parliament on the resource implications of proposed new scrutiny burdens.”

Non-local authority submissions also addressed this point. Community Care Providers Scotland (CCPS) stated that—

“Difficult resource allocation decisions are frequently being taken, but without the key element of performance information. Financial pressure is being applied to successful third sector organisations to the point at which their effectiveness, and even their viability, may be compromised. Services awarded high grades by the Care Inspectorate have been transferred to providers with much poorer track records, as a result of cost-driven tendering exercises. And we believe that in some areas, direct or in-house provision is being protected at the expense of the third sector in the absence of any comparative review of the track record of each in delivering quality, outcomes and Best Value.”

Aberlour agreed with local authorities on the variation in local authorities’ size and structure. It also raised specific examples of programmes which have produced “good results … but few of these have succeeded in becoming part of the system”, such as the Youth Crime Intervention Fund and Sure Start Partnerships.

HM Chief Inspector of Constabulary (HMCIC) raised a number of challenges in terms of the use of data, including an “assumption of technical competence”, for example—

“i) interpreting data – senior managers and even performance analysts do not necessarily understand statistics, and this can lead to incorrect interpretations (often exaggerated) of what performance data can and actually does tell us, for example;

ii) contextual information – while there is growing recognition of the relevance of contextual factors to explain data – and with this a growth in the use of, e.g. rates per population – many agencies still feel uneasy about not providing raw figures, even though these can be misleading or uninformative;

iii) making fair comparisons – the socio-demographic profile of local authority areas will be different and will differentially affect performance. Any system of benchmarking/comparison must be able to understand and take account of this.”
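
HMCIC’s second point, on rates per population, amounts to a simple normalisation; the sketch below, with invented figures, shows how raw counts and population-adjusted rates can point in opposite directions.

    # Invented figures: raw incident counts versus rates per 10,000 population.
    areas = {"Area X": (1200, 95_000), "Area Y": (1500, 210_000)}

    for area, (incidents, population) in areas.items():
        rate = incidents / population * 10_000
        # Raw counts make Area Y look worse (1,500 v 1,200 incidents); the
        # population-adjusted rates show the opposite.
        print(f"{area}: {incidents} incidents, {rate:.1f} per 10,000 residents")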

Q2. To what extent has the work undertaken over the last two years by the Improvement Service, SOLACE and others contributed to developing a common approach to benchmarking across Scotland’s local authorities?

Most responses to the second question came from local authorities. Although many submissions were positive about the work undertaken by SOLACE and its partners, some highlighted potential issues.

Angus Council recognised that the work “has the potential to make a positive contribution to developing a common approach to benchmarking across councils”, but added that “it is important that the work is completed as soon as possible to enable the benefits to be realised,” and that the process should be viewed as “evolutionary.”

East Ayrshire CPP raised two specific concerns about the consistency and reliability of data—

“Local Finance Returns (LFRs) – Significant variations in the calculation of LFRs across local authorities, including in relation to allocation of support costs, management costs and depreciation, highlight issues in respect of the reliability of this information for the purpose of benchmarking service costs. Although it is our understanding that the CIPFA Directors of Finance have agreed to establish a Working Group to work towards providing the level of consistency which is required for comparison of the LFRs, it is anticipated that this will be realised over the longer term.

Scottish Household Survey (SHS) – Results from the SHS are only available every two years for smaller local authorities (including East Ayrshire) and use small sample sizes of around 500 households.”

East Lothian Council questioned whether using “unit costs” was appropriate—

“There are few outcomes against which to measure the effectiveness of this unit cost expenditure. Using basic unit costs as comparators without also comparing productivity or outcomes might only serve to push the driving down of costs without considering outcomes and/or increasing efficiency and effectiveness.

A fundamental issue in using cost figures for benchmarking is that in some cases high levels of expenditure are what councils and Scottish Government are striving for (even in a recession). For example, a Council with a high gross cost for ‘Looked after Children’ might be doing better than a council with a very low cost as long as it is achieving good outcomes for these children. So comparing the Gross cost per child per week is meaningless unless we also consider the outcomes for these children. Similar statements hold true across a wide range of services (e.g. community care and education).”

West Lothian Council was also critical of the project—

“The SOLACE benchmarking exercise offered a small number of indicators that could be used to benchmark activities that were considered to be cross cutting across all authorities. The exercise was useful in its stated intent to act as a “can-opener” and encourage the practice of sharing comparable performance data. However, the suite of indicators identified in the project disproportionately viewed the performance of complex, needs-based services through the prism of efficiency. Taken in context, efficiency indicators can be useful management information, but if used as the sole indicator of performance, it is extremely limiting.

The performance of a service or success of an activity should be evaluated by measuring their impact in terms of customer satisfaction and effectiveness as well as efficiency. Additionally, there was concern regarding some of the data sources used by SOLACE/Improvement Service for a number of the indicators in the benchmarking exercise. For example, the use of the Scottish Household Survey undermined confidence in the satisfaction indicators as it was reliant on outdated information, where robust locally gathered consultation data provided a better, more representative sample and in some areas highlighted conflicting performance results.”

HMCIC suggested that the work could be further developed by using benchmarking / comparators with the private sector, especially for corporate services such as HR and finance.

Q3. What technical or other resources are needed to continue and complete the development of recent work on benchmarking?

Again, responses to this question focussed on the data sources needed and how that data should be used. The Association for Public Service Excellence (APSE) encouraged—

“participating Councils to share process information to determine where savings can be achieved and provide comparable data in a meaningful way between family groups. In Scotland, a significant number of process benchmarking studies have been completed. These studies examine and explain cost/quality variations between Council services. The profiling of Councils who submit data to APSE Performance Networks allows similar types of Councils to share meaningful information, rather than simply a process of “near neighbour” information that can often distort or even undermine the comprehensive nature of performance information.”

The Accounts Commission’s submission highlighted that “organisations lack basic data on the cost, activity and quality of the services they deliver”, and that this can make it difficult for public bodies to—

  • “Demonstrate that their priorities clearly reflect local need;
  • Have confidence that they are targeting their resources and activities towards actions that will make a real difference for the area;
  • Understand how to make the most cost-effective use of the resources available to them;
  • Satisfy themselves that they are meeting their responsibilities for equalities; and
  • Demonstrate that they are continually improving the services they deliver.”

Black and Ethnic Minority Infrastructure in Scotland (BEMIS) recommended the introduction of a national “impact assessment tool kit”, “similar in purpose to the EQIA tool(s) as a universally useable ‘benchmarking assessment’. The model used in developing the EQIAs, involving relevant partner organisations would perhaps be the best way forward in relation to benchmarking.”

In terms of local authorities, Dundee City Council recommended that the Scottish Household Survey “be expanded to provide a good national reporting model of public and customer feedback on the quality of a priority range of public services and outcomes that can be broken down by local government area. This should be across all public services and consideration should be given to including other sectors of consumer interest as well such as financial services, transport and retail.”

Highland Council suggested that local authorities “need to review their corporate performance frameworks to integrate the SOLACE benchmarking indicators. In Highland we will also amend our customer surveys to make sure we can supplement or improve on the data provided for particular SOLACE indicators which rely on national surveys (our sample size is better).”

East Ayrshire CPP highlighted the work of the “Cross Council Budget and Performance Working Group”—

“Officers participating in the CCBP Working Group have already undertaken significant work to address variations in LFR data and ensure that reliable comparisons can be made across service areas. To date, a range of reviews has been completed, which include initial analysis of budgets and performance to identify a comparator Council (that is the Council with the best performance/lowest cost). To eliminate differences arising from accounting treatment, detailed analysis includes in depth review of LFRs to ensure consistency and comparability of information, or identify reasons for variance in cost and examine differences in how services are structured and delivered, and identify best practice.”
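
Identifying a comparator council in the sense described above is mechanical once costs and performance are on a consistent footing. The sketch below, with invented figures, picks the council with the best performance, breaking ties by lowest cost; the East Ayrshire submission does not specify its exact rule, so this is an assumption for illustration.

    # Invented figures: (net cost per unit of service £, performance score 0-100).
    councils = {
        "Council A": (410, 72),
        "Council B": (365, 75),
        "Council C": (390, 64),
        "Council D": (440, 81),
    }

    # Best performance first; among equal performers, lowest cost wins.
    name, (cost, score) = min(councils.items(),
                              key=lambda item: (-item[1][1], item[1][0]))
    print(f"Comparator council: {name} (score {score}, £{cost} per unit)")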

The Royal Town Planning Institute noted the work of the Planning Performance Assessment Framework as a good example of an approach “which balances the statistical/quantitative elements of performance with the softer/qualitative elements”—

“National Headline Indicators will focus on decision making timescales, delivery of outputs, age of development plans and success of project planning, whilst the Framework will look at ways of measuring a high quality service through assessing how the authority is open for business; how it delivers high quality development on the ground; how it provides certainty; how it engages with its customers; how it is efficient and effective in making decisions; how effective its management structures are; its approach to financial management; and how it embeds a culture of continuous improvement. The framework will also identify future improvements to be made, and share progress on improvements that had been previously identified.”

Q4. To what extent can the developing work on benchmarking be extended across community planning partnerships? How can data derived from benchmarking influence the future direction of community planning and the contents of future SOAs?

The majority of submissions providing substantive comment on this question were from local authorities. Glasgow City Council stated that benchmarking for CPPs would “largely depend on the identification of a range of outcome measures common to all partnerships, or agreed partnership ‘families’ which might be created,” but that “Further work is required to determine whether a significant enough range of outcome measures were in common usage and could form the basis of a suite of comparison indicators.”

Highland Council made the point that—

“the SOLACE benchmarking indicators focus on local authority unit costs and customer satisfaction and partner bodies may already have something similar, for example comparing unit costs and satisfaction across different Police services and Health Boards. Comparing costs for back office services across partnerships may help develop a business case for more shared services and to measure any savings made from shared services put in place and similarly for any planning around integration of front-line services.”

North Lanarkshire Council stated that—

“There would be a benefit from focussing on themes such as health and transport in the first instance. Impacts can be measured through robust performance management in CPPs. Robust benchmarking would allow integration of unit cost of delivery, not just within an organisation but across a partnership, looking at duplication of effort and where joint resources could be better utilised.”

In terms of CPPs themselves, East Ayrshire CPP commented—

“SOAs were never intended to compare performance across different CPPs and the variability of local outcomes selected means that CPPs’ performance cannot be aggregated to assess their overall contribution to achieving national outcomes. In this regard, it may be useful to consider developing a robust set of core performance indicators that all Community Planning Partnerships require to report on linked to the National Performance Framework.”

From a Health Service perspective, NHS Lothian stated that—

“The main comment offered is the need to take a more joined up approach to benchmarking and performance management across all community planning partners. A key learning from NHS Lothian through our involvement in the Integrated Resource Framework is the inter-dependencies between health and social care. Any decision to realign a service or shift resources needs to be done within a wider context otherwise there is a risk of cost-shunting between organisations. This can be mitigated through the development of shared performance measurement processes. An example of this beyond the IRF has been the development of community planning strategic assessments in Midlothian, which has allowed partners to better understand priorities based on robust data and evidence. This approach is now being rolled out across the other areas in Lothian and this is welcomed.”

Q5. How can the development of benchmarking help improve the performance of local authorities in Scotland?

This question received the lowest number of substantive responses. East Lothian Council set out the potential benefits—

“Increased use of benchmarking is likely to help reduce the cost of public services. Local authorities are likely to understand their costs and the factors that drive those costs. Benchmarking could also help to improve transactional services that are more process driven. However, improving performance via benchmarking will be more difficult to achieve for non-transactional services. For example, benchmarking would add little to understanding educational attainment.”

Aberdeen City Council stated that—

“We believe benchmarking is a very useful method of understanding our business and exploring approaches for improvement. Our experience is that this has most impact when the benchmarking is initiated and designed by the professionals to fit their purpose rather than being imposed at a national level. We are strong advocates of broadening the source of benchmarking to professional groupings and the private sector.”

Finally, East Ayrshire Council listed a number of potential benefits—

  • “providing greater accountability and transparency of process;
  • demonstrating impact, benefit and value for money;
  • improving service quality, and efficient and effective allocation of public finances;
  • identifying best practice and using this learning for improvement purposes;
  • assisting to determine priorities for performance improvement and providing re-assurance in respect of what is working well;
  • improving the organisation’s credibility and stakeholder satisfaction; and
  • identifying potential partners for collaborative working.”

Q6. Should the Scottish Government have a role in providing national impetus to the development of benchmarking and performance measurement?

Most of those answering this question saw a role for the Government in guiding the development of benchmarking. APSE stated that it—

“believes that any broad direction from central government should relate to—

  • What types of performance information might be made available
  • Performance indicators and the supporting information increasingly needs to be related to outcomes via the SOA process.
  • Performance information needs to be robust with an element of independent or peer assessment and
  • The presentation and accessibility of the data to the public and locally elected members, needs to support evidence based judgements on service delivery.”

Aberdeen City Council agreed, stating: “we believe the role of national government should be one of facilitation by improving national data sets and supporting channels of comparative data (e.g. a portal).”

Angus Council, though, suggested the Government could “play a major role in the development and promotion of benchmarking and performance management by—

  • ensuring co-ordination between, and management of direction from, government departments.
  • promoting the use of data which measures outcomes rather than inputs.
  • ensuring national data is reported on a timely basis.
  • ensuring national data can be disaggregated to local level and that in doing so it remains representative.
  • ensuring more qualitative national data.
  • promoting the creation of a common comprehensive set of output indicators.”

Allan Campbell
SPICe Research
October 2012

Note: Committee briefing papers are provided by SPICe for the use of Scottish Parliament committees and clerking staff. They provide focused information or respond to specific questions or areas of interest to committees and are not intended to offer comprehensive coverage of a subject area.

    Footnotes:

    1 Scottish Parliament Local Government and Regeneration Committee, Official Report, 7 September 2012, Col 56.

    2 CIPFA (1996), Benchmarking to Improve Performance.

    3 Audit Scotland (1999), Measuring up to the best – a manager’s guide to benchmarking. Available at: http://www.audit-scotland.gov.uk/docs/local/pre1999/nr_9901_managers_guide_benchmarking.pdf. [Accessed 16 November 2012].

    4 Local Government and Regeneration Committee, Benchmarking and Performance Measurement Seminar 10 September 2012. Transcript, Col 1. Available at: http://www.scottish.parliament.uk/S4_LocalGovernmentandRegenerationCommittee/Inquiries/LGRC_Benchmarking_Seminar_10_September_2012_-_Transcript.pdf. [Accessed 16 November 2012].

    5 Members include Standard Life and the Royal Bank of Scotland, SMEs and a range of public and voluntary organisations.

    6 Quality Scotland, benchmarking. Available at: http://www.qualityscotland.co.uk/benchmarking.asp. [Accessed 16 November 2012].

    7http://www.globalbenchmarking.org/images/stories/PDF/2010_gbn_survey_business_improvement_and_benchmarking_web.pdf

    8 Local Government and Regeneration Committee, Benchmarking and Performance Measurement Seminar 10 September 2012. Transcript, Col 4. Available at: http://www.scottish.parliament.uk/S4_LocalGovernmentandRegenerationCommittee/Inquiries/LGRC_Benchmarking_Seminar_10_September_2012_-_Transcript.pdf. [Accessed 16 November 2012].

    9 Local Government and Regeneration Committee, Benchmarking and Performance Measurement Seminar 10 September 2012. Transcript, Col 5. Available at: http://www.scottish.parliament.uk/S4_LocalGovernmentandRegenerationCommittee/Inquiries/LGRC_Benchmarking_Seminar_10_September_2012_-_Transcript.pdf.

    10 Local Government and Regeneration Committee, Benchmarking and Performance Measurement Seminar 10 September 2012. Transcript, Col 8. Available at: http://www.scottish.parliament.uk/S4_LocalGovernmentandRegenerationCommittee/Inquiries/LGRC_Benchmarking_Seminar_10_September_2012_-_Transcript.pdf.

    11 Report of the Commission on the Future Delivery of Public Services (2011), page 64, paragraph 7.15. Available at: http://www.scotland.gov.uk/Resource/Doc/352649/0118638.pdf. [Accessed 16 November 2012].

    12 Local Government and Regeneration Committee, Benchmarking and Performance Measurement Seminar 10 September 2012. Transcript, Col 6. Available at: http://www.scottish.parliament.uk/S4_LocalGovernmentandRegenerationCommittee/Inquiries/LGRC_Benchmarking_Seminar_10_September_2012_-_Transcript.pdf. [Accessed 16 November 2012].

    13 Scottish Parliament Local Government and Regeneration Committee, Official Report, 31 October 2012, Col 1338.

    14 Scottish Parliament Local Government and Regeneration Committee, Official Report, 31 October 2012, Col 1339.

    15 Scottish Parliament Local Government and Regeneration Committee, Official Report, 31 October 2012, Col 1336.

    16 Local Government and Regeneration Committee, Benchmarking and Performance Measurement Seminar 10 September 2012. Transcript, Col 10. Available at: http://www.scottish.parliament.uk/S4_LocalGovernmentandRegenerationCommittee/Inquiries/LGRC_Benchmarking_Seminar_10_September_2012_-_Transcript.pdf. [Accessed 16 November 2012].

    17 Scottish Parliament Local Government and Regeneration Committee, Official Report, 12 September 2012, Col 1169.

    18 Scottish Parliament Local Government and Regeneration Committee, Official Report, 12 September 2012, Col 1162.

    19 Scottish Parliament Local Government and Regeneration Committee, Official Report, 31 October 2012, Col 1337.

    20 Scottish Parliament Local Government and Regeneration Committee, Official Report, 31 October 2012, Col 1339.

    21 Scottish Parliament Local Government and Regeneration Committee, Official Report, 31 October 2012, Col 1359.

    22 Scottish Parliament Local Government and Regeneration Committee, Official Report, 31 October 2012, Col 1377.

    23 Scottish Parliament Local Government and Regeneration Committee, Official Report, 31 October 2012, Col 1365.

    24 Local Government and Regeneration Committee, Benchmarking and Performance Measurement Seminar 10 September 2012. Transcript, Col 10. Available at: http://www.scottish.parliament.uk/S4_LocalGovernmentandRegenerationCommittee/Inquiries/LGRC_Benchmarking_Seminar_10_September_2012_-_Transcript.pdf.

    25 Local Government and Regeneration Committee, Benchmarking and Performance Measurement Seminar 10 September 2012. Transcript, Col 10. Available at: http://www.scottish.parliament.uk/S4_LocalGovernmentandRegenerationCommittee/Inquiries/LGRC_Benchmarking_Seminar_10_September_2012_-_Transcript.pdf.

    26 Local Government and Regeneration Committee, Benchmarking and Performance Measurement Seminar 10 September 2012. Transcript, Col 11. Available at: http://www.scottish.parliament.uk/S4_LocalGovernmentandRegenerationCommittee/Inquiries/LGRC_Benchmarking_Seminar_10_September_2012_-_Transcript.pdf.

    27 Scottish Government written submission

    28 Local Government and Regeneration Committee, Benchmarking and Performance Measurement Seminar 10 September 2012. Transcript, Col 17. Available at: http://www.scottish.parliament.uk/S4_LocalGovernmentandRegenerationCommittee/Inquiries/LGRC_Benchmarking_Seminar_10_September_2012_-_Transcript.pdf.

    29 Scottish Parliament Local Government and Regeneration Committee, Official Report, 31 October 2012, Col 1344.

    30 Scottish Parliament Local Government and Regeneration Committee, Official Report, 31 October 2012, Col 1343.

    31 Scottish Parliament Local Government and Regeneration Committee, Official Report, 31 October 2012, Col 1369.

    32 Local Government and Regeneration Committee, Benchmarking and Performance Measurement Seminar 10 September 2012. Transcript, Col 12. Available at: http://www.scottish.parliament.uk/S4_LocalGovernmentandRegenerationCommittee/Inquiries/LGRC_Benchmarking_Seminar_10_September_2012_-_Transcript.pdf.

    33 Local Government and Regeneration Committee, Benchmarking and Performance Measurement Seminar 10 September 2012. Transcript, Col 38. Available at: http://www.scottish.parliament.uk/S4_LocalGovernmentandRegenerationCommittee/Inquiries/LGRC_Benchmarking_Seminar_10_September_2012_-_Transcript.pdf.

    34 Local Government and Regeneration Committee, Benchmarking and Performance Measurement Seminar 10 September 2012. Transcript, Col 38. Available at: http://www.scottish.parliament.uk/S4_LocalGovernmentandRegenerationCommittee/Inquiries/LGRC_Benchmarking_Seminar_10_September_2012_-_Transcript.pdf.

    35 COSLA, written submission, paragraph 14

    36 Scottish Parliament Local Government and Regeneration Committee, Official Report, 31 October 2012, Col 1368.

    37 Report of the Commission on the Future Delivery of Public Services (2011), page 64, paragraph 7.16. Available at: http://www.scotland.gov.uk/Resource/Doc/352649/0118638.pdf. [Accessed 16 November 2012]. (Quoting the September 2007 Report of the Independent Review of Regulation, Audit, Inspection and Complaints Handling of Public Services in Scotland [the Crerar Review]).

    38 Local Government and Regeneration Committee, Benchmarking and Performance Measurement Seminar 10 September 2012. Transcript, Col 55. Available at: http://www.scottish.parliament.uk/S4_LocalGovernmentandRegenerationCommittee/Inquiries/LGRC_Benchmarking_Seminar_10_September_2012_-_Transcript.pdf.

    39 Scottish Parliament Local Government and Regeneration Committee, Official Report, 31 October 2012, Col 1339.

    40 Scottish Parliament Local Government and Regeneration Committee, Official Report, 31 October 2012, Col 1341.

    41 Local Government and Regeneration Committee, Benchmarking and Performance Measurement Seminar 10 September 2012. Transcript, Cols 20 - 21. Available at: http://www.scottish.parliament.uk/S4_LocalGovernmentandRegenerationCommittee/Inquiries/LGRC_Benchmarking_Seminar_10_September_2012_-_Transcript.pdf.

    42 Scottish Parliament Local Government and Regeneration Committee, Official Report, 12 September 2012, Col 1159.

    43 Scottish Parliament Local Government and Regeneration Committee, Official Report, 12 September 2012, Col 1161.

    44 Scottish Parliament Local Government and Regeneration Committee, Official Report, 12 September 2012, Col 1165.

    45 Scottish Parliament Local Government and Regeneration Committee, Official Report, 31 October 2012, Col 1341.

    46 Scottish Parliament Local Government and Regeneration Committee, Official Report, 31 October 2012, Col 1345.

    47 Local Government and Regeneration Committee, Benchmarking and Performance Measurement Seminar 10 September 2012. Transcript, Col 8. Available at: http://www.scottish.parliament.uk/S4_LocalGovernmentandRegenerationCommittee/Inquiries/LGRC_Benchmarking_Seminar_10_September_2012_-_Transcript.pdf.

    48 Renewing Scotland’s Public Services: Priorities for reform in response to the Christie Commission, page 6.

    49 Renewing Scotland’s Public Services: Priorities for reform in response to the Christie Commission, page 7.

    50 Local Government and Regeneration Committee, Benchmarking and Performance Measurement Seminar 10 September 2012. Transcript, Col 33. Available at: http://www.scottish.parliament.uk/S4_LocalGovernmentandRegenerationCommittee/Inquiries/LGRC_Benchmarking_Seminar_10_September_2012_-_Transcript.pdf.

    51 Local Government and Regeneration Committee, Benchmarking and Performance Measurement Seminar 10 September 2012. Transcript, Col 36. Available at: http://www.scottish.parliament.uk/S4_LocalGovernmentandRegenerationCommittee/Inquiries/LGRC_Benchmarking_Seminar_10_September_2012_-_Transcript.pdf.

    52 Scottish Parliament Local Government and Regeneration Committee, Official Report, 12 September 2012, Col 1167.

    53 Scottish Parliament Local Government and Regeneration Committee, Official Report, 31 October 2012, Col 1350.

    54 Scottish Parliament Local Government and Regeneration Committee, Official Report, 31 October 2012, Col 1380.

    55 Report of the Commission on the Future Delivery of Public Services (2011), page 63, paragraph 7.10. Available at: http://www.scotland.gov.uk/Resource/Doc/352649/0118638.pdf. [Accessed 16 November 2012].

    56 Scottish Parliament Local Government and Regeneration Committee, Official Report, 31 October 2012, Col 1391-1392.

    57 Scottish Parliament Local Government and Regeneration Committee, Official Report, 31 October 2012, Col 1337-1338.

    58 Scottish Parliament Local Government and Regeneration Committee, Official Report, 31 October 2012, Col 1348.

    59 Scottish Parliament Local Government and Regeneration Committee, Official Report, 31 October 2012, Col 1371.
