Report

The design and conduct of the third and fourth funding rounds of the Regional Development Australia Fund

27 Nov 2014
Description

The objective of the audit was to assess the effectiveness of the design and conduct of the third and fourth funding rounds of the Regional Development Australia Fund.

The scope of the audit included the processes by which proposals were sought and assessed and successful projects were approved for funding. The audit criteria reflected the financial and grants administration frameworks then in effect, including the Commonwealth Grant Guidelines (CGGs), as well as ANAO’s grants administration Better Practice Guide.

Overall conclusion

The Regional Development Australia Fund (RDAF) was introduced following the 2010 election as part of a $1.4 billion commitment to support the infrastructure needs and economic growth of regional Australia. The third and fourth RDAF funding rounds were conducted between October 2012 and June 2013. There was significant interest in the opportunity to compete for Australian Government funding, with more than 900 expressions of interest received, seeking over $2.5 billion in funding compared with the $225 million that was announced as being available. As it eventuated, more than $226 million in grant funding was awarded across the two rounds to support 121 capital infrastructure projects.

The award of funding was undertaken through a two-stage application process that initially involved the 55 Regional Development Australia (RDA) committees shortlisting expressions of interest and assigning a priority to each project in their region, prior to full applications being submitted to the Department of Regional Australia, Local Government, Arts and Sport (DRALGAS, or ‘the department’). The department then assessed those applications, including assigning each eligible application a rating against each selection criterion. Improvements in the quality of the department’s assessment work, and of its application lodgement processes, were evident. An advisory panel, whose five members were selected for their experience, knowledge and expertise on regional Australia, was responsible for assessing the eligible applications and providing funding recommendations to the then Minister for Regional Services, Local Communities and Territories. The Minister made her funding decisions in May 2013 for round three and over May and June 2013 for round four.

The assessment and selection process, as described in the program guidelines, reflected a sound approach. However, as implemented, the stages were not well integrated: each step informed the next in only a limited way. As a result, there was not a clear trail through the assessment stages to demonstrate that the projects awarded funding were those with the greatest merit in terms of the published program guidelines. In particular:

  • the order of regional priority allocated to projects by the RDA committees was not used by the department or the panel at any point in the assessment of applications, and was not provided to the Minister to inform her decision-making;
  • the panel categorised applications as ‘recommended’, ‘suitable’ and ‘not recommended’, but its categorisation was not supported by a documented assessment, by the panel, of the merits of each eligible application in terms of the published selection criteria. Rather, the panel advised ANAO that it considered and applied the selection criteria ‘in their entirety’;
  • the only recorded assessment of each eligible application against each of the published selection criteria was that undertaken by the department; however, a third of the applications awarded the highest possible rating against each selection criterion by the department were assigned to the lowest merit category by the panel;
  • 27 per cent of the applications approved by the Minister, representing 48 per cent of total funding awarded, had not been included by the panel in the ‘Recommended for Funding’ category (as the panel did not consider them to be of sufficient quality). These applications represented:

- 15 per cent of approved round three applications (and 16 per cent of approved round three funding) categorised by the panel as other than ‘Recommended for Funding’, three quarters of which had been categorised as ‘Not Recommended for Funding’ with the remaining quarter classified as ‘Suitable for Funding’; and

- 50 per cent of approved round four applications (involving 53 per cent of approved round four funding) categorised by the panel as other than ‘Recommended for Funding’. Two-thirds of these applications had been categorised as ‘Not Recommended for Funding’ with the other third categorised as ‘Suitable for Funding’; and

  • 56 per cent of those applications awarded funding had been assessed by the department to not satisfactorily meet one or more of the selection criteria.

The absence of alignment or a clear trail between the assessed merit of applications against the published selection criteria and the rounds three and four funding decisions mirrored the situation observed in ANAO’s audit of the first RDAF funding round. This shows that the recommendations made in the first audit, although agreed by the department, had not been implemented, and that inadequate attention was given to relevant aspects of the grants administration framework. Effectively implementing agreed recommendations (which often reflect ANAO’s experience of practices other departments have found to be beneficial) and closer adherence to identified principles of better practice grants administration are matters that warrant greater attention by the department. In light of the findings of this audit, ANAO has made a further three recommendations to DIRD directed at:

  • improving the efficiency of two-stage grant application processes;
  • a more rigorous approach to assessing whether candidates for grant funding will provide value with public money; and
  • improving the quality and clarity of advice provided to Ministers to inform their decisions about the relative merits of proposals competing for grant funding.

A further similarity between the third and fourth RDAF rounds and the first round was that a relatively high proportion of approved projects had not been recommended for approval by the panel. In each of the four rounds, the panel recommended that funding be approved only for those applications it had included in the ‘Recommended for Funding’ category. This reflected the design of the program, where the three categories to be used by the panel (see paragraph 4) were intended to distinguish between the assessed relative merit of groups of applications. In this respect, the published operating procedures for the panel required that applications categorised as:

  • ‘Recommended for Funding’ have been assessed as meritorious, meeting the selection criteria to a high degree and having a strong positive impact on the region;
  • ‘Suitable for Funding’ have been assessed to meet the selection criteria and have a positive impact on the region but are considered to be not as strong as those categorised as ‘Recommended for Funding’; and
  • ‘Not Recommended for Funding’ have been assessed as not strong and to have no identifiable positive impact on the broader community.

However, the Minister has informed the ANAO that: she had been advised by the department, and was always of the understanding, that projects in both the ‘Recommended for Funding’ and ‘Suitable for Funding’ categories were available for selection; in choosing projects from both categories she was complying with the program guidelines; and she would have reported to the Finance Minister her decisions to award funding to applications included in the ‘Suitable for Funding’ category had she believed that the panel had not recommended them for funding. In this context, focusing solely on those applications approved for funding from the ‘Not Recommended for Funding’ category, the proportion of applications approved for funding against panel advice falls (from 27 per cent) but nevertheless remains significant (at 19 per cent of all applications approved, comprising 11 per cent of approved round three applications and 33 per cent of round four applications). In terms of the proportion of funding approved, 40 per cent ($90.6 million) was awarded to applications categorised as ‘Not Recommended for Funding’ (comprising 11 per cent of the round three funding and 45 per cent of the round four funding awarded).

ANAO sought advice from the department on whether officers responsible for briefing the Minister on the outcome of the funding rounds had provided such advice to the Minister. In response, the department outlined to ANAO that it had: briefed the Minister that rounds three and four involved discretionary grant funding; identified the applications the panel had recommended be awarded funding (being those in the ‘Recommended for Funding’ category); and advised her that she should review the list of projects recommended and satisfy herself as to the benefits of each project and that, should she disagree with the recommendations and choose other projects, then the reasons for these decisions should be recorded.

Setting aside the different perspectives of the Minister and the department, the then Government’s guidelines for this program provided for the advisory panel to make the recommendations to the Minister as to those applications that should be awarded funding. Further, the grants administration framework has been designed to accommodate situations where decision‑makers do not accept the advice they receive. Amongst other things, it requires that the basis for funding decisions be recorded. However, the records of the reasons for funding decisions taken contrary to panel advice generally provided little insight as to their basis and made no reference to the published selection criteria. This situation was particularly significant given that such decisions were largely at the expense of projects located in electorates held by the Coalition. Specifically:

  • 80 per cent of Ministerial decisions to not award funding to applications recommended by the advisory panel related to projects located in Coalition-held electorates. This was most notably the case in round three, where 93 per cent of those recommended applications that were rejected were located in Coalition-held electorates. For round four, 54 per cent of recommended applications that were rejected were located in Coalition‑held electorates; and
  • 64 per cent of Ministerial decisions to fund applications that had been categorised by the panel as other than ‘Recommended for Funding’ related to ALP-held electorates compared with the 18 per cent relating to Coalition-held electorates. Having regard to the Minister’s advice to ANAO (see paragraph 13) that she viewed only those applications categorised as ‘Not Recommended for Funding’ as involving the approval of a not recommended application, 57 per cent of approved applications from this category related to ALP-held electorates compared with 17 per cent relating to Coalition-held electorates.

Performance audits have been undertaken of each of the major regional grant funding programs introduced by successive governments over the last eleven years. Over this period, improvements have been observed in some important aspects of the design and implementation of regional grant programs. Nevertheless, in respect to each successive program there have been shortcomings in the design and administration of the assessment and decision‑making processes, and indicators of bias in the awarding of funding to government-held electorates.

Such situations undermine the measures implemented to date to improve the grants administration framework, noting that it was experience with one of the earlier regional grant funding programs that was a catalyst for the introduction of the Commonwealth Grant Guidelines. More importantly, they detract from the ability of grant funding programs to deliver on their policy objectives to the extent practicable, and are detrimental to those communities that would have benefited had funding been awarded to the projects assessed, in a structured way, to be the most meritorious in terms of the published program guidelines.

Against this background, a key message from ANAO audits of grant programs over the years, and highlighted in ANAO’s grants administration Better Practice Guides, is that selecting the best grant applications that demonstrably satisfy well-constructed selection criteria promotes optimal outcomes for the least administrative effort and cost. Another recurring theme in the ANAO’s audits of grants administration has been the importance of grant programs being implemented in a manner that accords with the published program guidelines so that applicants are treated equitably. Similarly, the grants administration framework was developed based, in part, on recognition that potential applicants and other stakeholders have a right to expect that program funding decisions will be made in a manner, and on a basis, consistent with the published program guidelines. There is also an important message here for agencies to underline this expectation in their advice to Ministers, so as to avoid any misunderstandings and promote informed decision-making.

In this context, the most important message from this audit is that considerable work remains to be done to design and conduct regional grant programs so that funding is awarded, and can be seen to have been awarded, to those applications that demonstrate the greatest merit in terms of the published program guidelines. Ministers can show the way by emphasising the importance of adhering to the published program guidelines, and by discharging their responsibilities in accordance with wide considerations of public interest and without regard to considerations of a party political nature. History shows that this is particularly important in the lead-up to a Federal election.

Publication Details
Published year: 2014