
Program and project assurance framework

Document type: Framework
Version: v3.0.0
Status: Final
Owner: QGCIO
Effective: November 2018–current
Security classification: OFFICIAL-Public


Introduction

The establishment of consistent assurance processes provides confidence that programs and projects are committed to the successful delivery of initiatives and services across the Queensland Government.

The Queensland Government program and project assurance framework provides a mechanism for independently reviewing and advising on ICT and ICT-enabled initiatives to ensure they represent value for money, are viable and are appropriately governed.

Please note: throughout this document, the word initiative is intended to mean either a program or a project.

Purpose

This framework provides information on the process of assurance profiling, planning, reviews, reporting and responding. Every program and project is different, and assurance should be applied sensibly and appropriately. Following the recommended actions for approaching assurance should result in timely initiative reviews that provide tailored advice and help to identify opportunities for successful delivery.

This document is broken down into the following sections that align with the recommended actions for approaching assurance:

  • assurance profiling
  • assurance planning
  • gated assurance reviews
  • gated assurance reporting
  • responding (action planning).

Profile

Purpose

Assurance profiling is the first step in determining the appropriate assurance level and the degree of independence and scrutiny required to adequately address the complexity and impact that the program or project represents to service delivery.

Minor initiatives will attract an assurance level of 1, while critical initiatives will attract level 4. As the assurance level increases, so too does the requirement for independent assurance analysis. This levelling ensures appropriate assurance is applied to the initiative, avoiding over- or under-assuring.

Process

Assurance profiling assessment criteria

The assurance profiling process analyses nine criteria to calculate an initiative's assurance level. Robust discussion with the Accountable Officer around these characteristics will assist in understanding the benefits of assuring the initiative and increase focus on areas of concern. An illustrative scoring sketch follows the criteria list below.

  • Finance: The initiative is a significant financial investment. It involves significant time constrained funding.
  • Political: The initiative contributes to a major public service or government policy outcome.
  • Service delivery: The initiative is likely to directly impact front line or community government services and attract external (including media) interest.
  • Organisational change: The initiative involves substantial organisational change management considerations. It involves stakeholders outside of agency direct control whose buy-in and/or support may be required.
  • Duration: The initiative will be undertaken over an extended period or there are potential delivery challenges regarding duration. Example: the solution is not well-defined or has immovable dates.
  • Complexity: The initiative is innovative and not typical of an initiative undertaken by the agency. Example: it requires complex technology support and skills not available within the agency.
  • Security: The initiative involves sensitive information or operations requiring higher than normal security and business continuity considerations.
  • Delivery challenges: Delivery is regarded as challenging. Cross agency support and/or specialist expertise is required for delivery of the initiative and realisation of benefits. It involves legislative or policy changes.
  • Governance: The complexity of the initiative is likely to require an increased governance, scrutiny and specialist management capability.
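
The actual assurance level is determined by the profiling tool described under Practice below (27 questions, available on the QGEA portal). The sketch that follows is illustrative only: it assumes hypothetical per-criterion ratings and thresholds to show how ratings against the nine criteria could roll up to a level of 1 to 4; it is not the QGCIO scoring method.

```python
# Illustrative only: the real profiling tool (27 questions on the QGEA portal)
# defines its own questions, weightings and thresholds. This sketch simply
# shows how ratings against the nine criteria could roll up to an assurance
# level of 1 to 4. All ratings and thresholds here are hypothetical.

CRITERIA = [
    "finance", "political", "service_delivery", "organisational_change",
    "duration", "complexity", "security", "delivery_challenges", "governance",
]

def assurance_level(ratings: dict) -> int:
    """Map per-criterion ratings (0 = negligible .. 3 = critical) to a level."""
    missing = set(CRITERIA) - set(ratings)
    if missing:
        raise ValueError(f"unrated criteria: {sorted(missing)}")
    score = sum(ratings[c] for c in CRITERIA)   # 0..27 with these ratings
    if score <= 6:
        return 1    # minor initiative: internal assurance
    if score <= 13:
        return 2    # external to the initiative
    if score <= 20:
        return 3    # external to the department
    return 4        # critical initiative: external to government

# Example: a moderately complex initiative with elevated finance and security impact
example = {c: 1 for c in CRITERIA}
example.update(finance=3, security=2, complexity=2)
print(assurance_level(example))   # -> 2 under these placeholder thresholds
```

In practice, it is the output of the completed profiling tool, not a calculation like this, that the Accountable Officer signs off.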

Assurance levels

Four assurance levels are defined. Each progressive assurance level supports an increasing level of assurance activity, scrutiny, and independence. The table below provides a summary of how assurance can be applied for each level and more detail is provided in the summaries following.

Assurance level | Review may be conducted by
Level 1 (internal) | Within initiative
Level 2 (external to initiative) | Within department
Level 3 (external to department) | External to department; Supplier for major initiatives; Qld Treasury Gateway
Level 4 (external to government) | Supplier for critical initiatives

Level 1 assurance - Internal

Level 1 represents the standard agency initiative level of assurance, primarily involving the Board and internal business area(s) staff. This assurance level requires minimal assurance; however, reviews will still be scheduled and assurance planning is still required.

The reviews can be completed by staff working closely with the initiative or staff in the agency's portfolio/program/project office (PMO), etc.

Level 2 assurance - External to initiative

Reviews can still be conducted from within the agency but must be external to the initiative to ensure an appropriate level of independence and objectivity is maintained. For example, reviews may be performed by a PMO or suitable governing body. Involvement from senior management, independent from the business area, may also be required.

Level 3 assurance - External to department

External assurance is required at this level. You may consider:

  • assurance reviewers external to the department
  • a supplier of assurance services for major initiatives
  • a Queensland Treasury Gateway review.

Level 4 assurance - External to government

This is the highest assurance level and requires external to government, independent providers of assurance services for critical initiatives.

The ICTSS.13.03A ICT Services Panel - Program and Project Gated Assurance (Queensland Government employees only) contains a list of suppliers under Program and Project Gated Assurance (critical initiatives).

Practice

Assurance profiling takes place when a new initiative is identified or when significant change occurs, such as changes to scope, complexity or impact. There are 27 questions in the assurance profiling tool (available on the QGEA portal, for registered portal users).

The Accountable Officer will be required to sign off and formalise the resulting assurance level. You will need to keep a record of the results, including your reasoning and any additional evidence, and provide an approved copy to the Office of Assurance and Investment (OAI@chde.qld.gov.au).

Plan

Purpose

Assurance planning will define the assurance reviews and related activities to be applied to assure the initiative. The assurance plan is to encompass the whole initiative and will also include the scheduling of health checks, etc.

A key advantage of forward planning is that review participants will be identified and assurance review activities scheduled to allow the assurance reviews to be implemented within a suitable time frame.

Process

Developing the plan involves the identification of assurance activities for the allocated assurance level. Some elements to consider as part of planning are:

  • selection of appropriate assurance reviewers, which may include external suppliers of assurance services
  • identification of any costs associated with assurance reviews
  • on-going updating of schedules
  • management of planning, execution, review and actioning tasks, and linking to dependencies and Board meetings
  • identification and management of resource needs for reviews, including information required to complete reviews
  • logistics such as file storage, accommodation, computer access etc.
  • establishing and updating an assurance register (this information may be included in the quality register)
  • scheduling and re-prioritising reviews as required
  • defining the processes, roles and responsibilities to initiate and manage the reviews
  • identifying stakeholders and planning engagement and communications
  • identifying information requirements and incorporating them into quality systems and planning.
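
Several of the elements above (the assurance register, schedules, costs and dependencies) lend themselves to a simple structured record. The sketch below is a minimal, illustrative example of what an assurance register entry might capture; the field names are assumptions for this sketch, not a prescribed QGCIO format, and the same information may equally be held in the quality register.

```python
# A minimal, illustrative assurance register entry. Field names are assumptions
# for this sketch; agencies may instead record this in their quality register
# or PMO tooling.
from dataclasses import dataclass, field
from datetime import date

@dataclass
class AssuranceReviewEntry:
    review: str                                       # e.g. "Gate 1" or "Health check"
    scheduled: date                                   # planned start of the review window
    reviewers: list = field(default_factory=list)     # selected reviewers, may include external suppliers
    estimated_cost: float = 0.0                       # identified costs associated with the review
    dependencies: list = field(default_factory=list)  # linked decisions, Board meetings, other reviews
    status: str = "planned"                           # planned / scheduled / complete

# Example: scheduling a Gate 1 review conducted by the agency PMO
register = [
    AssuranceReviewEntry("Gate 1", date(2019, 2, 11), reviewers=["PMO reviewer"]),
]
```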

Reviews within the assurance plan

While an assurance plan will encompass the lifecycle of the initiative, each individual review may have its own plan developed and scheduled as part of the broader assurance plan.

The information in an individual review plan could include:

  • details on procedures and requirements for the review team regarding accommodation and computer access, including information access (i.e. security requirements)
  • processes, roles and responsibilities to initiate and manage the review
  • expected participants from the agency and other organisations in the review and processes to engage and provide communication
  • information requirements for the review and responsibility and timing for gathering information.

Practice

Assurance plan templates are available to assist with assurance planning and should be tailored to meet the individual review requirements of the initiative.

Low-risk, short-term initiatives may decide to include assurance planning activities in their Project Plan (or equivalent documentation).

Initiatives with an assurance level of 2, 3 or 4 will need to submit their assurance plan to the Office of Assurance and Investment (OAI@chde.qld.gov.au) as part of the ICT concept/investment review process.

Review

Purpose

The assurance review examines programs and projects at key decision points and aims to provide timely advice to the Accountable Officer. An assurance review will provide a strategic view of the initiative and aims to identify risks and issues that may hinder the successful delivery of the initiative, as well as acknowledging existing good governance and practice.

It is important to note that an assurance review is not an audit or a substitute for a governance framework; the aim of an assurance review is to help initiatives succeed.

Ultimately the assurance review aims to increase confidence that investment is well spent, aligns with strategic objectives and that benefits are realised.

Process

Gated assurance reviews take advantage of experience and expertise independent of the initiative. They provide a valuable, alternative 'fresh set of eyes' perspective on the risks and issues confronting the initiative team, while challenging the robustness of existing plans and processes.

The intent of the assurance gates is to review progress of the initiative against the evidence criteria relevant to each gate. Reviews assist in building confidence for Accountable Officers that decisions are calculated, well informed and carefully considered.

Gate 0 is relevant to a program and is repeated at intervals in the program's lifecycle. Gates 1-5 are relevant to projects. Gates 1-4 are conducted during the life of the project; Gate 5 is repeated after project completion during benefits realisation.

During a review an initiative will require evidence of good governance, planning, risk and issue management. Reviewers will need to have confidential conversations with initiative stakeholders and will require access to information.

(Note, Level 1 assurance can be undertaken from within the initiative).

Practice

Programs

For programs, the Office of Assurance and Investment endorses the use of the Queensland Treasury Gateway review methodology, which is based on the Office of Government Commerce Gateway™ Process, for assurance reviews.

Gate 0 Strategic assessment

The Gate 0 review is first conducted in the early stages of a program's lifecycle to confirm the outcomes and objectives contribute to the overall strategy of the organisation and effectively interface with the broader high-level policy and initiatives of government.

Gate 0 is repeated throughout the life cycle of a program, with typical reviews occurring after program identification and at the end of each tranche. However, it is also common for Gate 0 reviews to occur multiple times: during a tranche, before a tranche, or throughout delivery. Gate 0 is an outcome-focused review that investigates the direction and planned outcomes of the program, together with the progress of its projects.

Projects

For projects, the Office of Assurance and Investment endorses the use of the Queensland Treasury Gateway review methodology, which is based on the Office of Government Commerce Gateway™ Process, for assurance reviews.

Gate 1 (Preliminary evaluation) focuses on the preliminary business case, including details on the strategic importance of the project and its links to government and organisational policy and programs. This review explores whether the preliminary business case clearly identifies the business objectives and how they will be achieved, whether the project scope, scale and requirements are realistic and deliver value for money, and whether a sufficient number of options has been selected for further investigation.

Gate 2 (Business case) investigates the final business case and delivery approach for the project. It considers whether the project plan, including key milestones and the proposed project budget, is sufficiently detailed and realistic, whether there is access to the right resources, skills and capabilities to ensure success, whether there is adequate funding, and whether all relevant options for delivery have been explored. This review also considers the procurement strategy before any formal approaches are made to the market.

Gate 3 (Contract award) focuses on the updated final business case and confirms the project is still required, affordable and achievable. Gate 3 confirms the final business case and benefits realisation plan following any updates from prospective suppliers. It also considers whether the delivery approach is likely to deliver what is needed on time and within budget, whether it will provide value for money, and whether the agreed procurement strategy has been followed. The risk management strategy and progress toward draft contracts and service level agreements are also considered.

Gate 4 (Readiness for service) focuses on the readiness to transition from project delivery to the live environment. Gate 4 explores whether the current business case is still valid and whether the benefits realisation plan is likely to be achieved. A key focus area is whether the business is ready to make the change and the adequacy of the plans that are in place to manage the transition to operations. The review considers the full system and user testing that has been done to inform the go-live decision.

Gate 5 (Benefits realisation) confirms the desired benefits of the project are being achieved and that the business changes are operating smoothly. It considers whether improvements to value for money and performance are being sought, whether a post-project implementation review has been undertaken, and how the contract is being managed.

Agile projects

For Agile-delivered projects, the Office of Assurance and Investment endorses the use of the United Kingdom Government Infrastructure and Projects Authority's Guide on assurance for agile delivery of digital services, Annex A.

If the agile project does not fit the alpha-beta-live delivery path, the traditional Queensland Treasury Gateway review process should be followed.

Gated assurance overview

The diagram below illustrates where gated assurance reviews occur within the lifecycle of programs and projects.

Gated assurance overview diagram

Review time frame

Gated assurance review interviews and report submission should occur within a one-week timeframe. Of course, additional time before and after the review is required for planning, documentation review and closure activities. Reviews examine an initiative at a point in time, therefore it is important that interviews occur over a short period of time so the review and resultant report can provide a current, focused assessment.

Review team numbers

The number of review team members can vary, influenced by factors such as the context of the initiative, the gate, specialist skills required, etc. The table below provides a guide to the number of expected reviewers.

Assurance level | Review team numbers
Level 1 | One or two
Level 2 | Usually two
Level 3 | Usually two to four
Level 4 | Usually four

You can request a gated assurance review by the Cross Agency Assurance Working Group (CAAWG) for a level 2 project (government login required).

Report

Purpose

An assurance report will provide a concise, evidence-based snapshot of an initiative at the time of the review. The intent of a report is to help initiatives. The report will identify any significant emerging issues that may impact the initiative's success and provide action-orientated recommendations that are helpful and practical.

It is important to note an assurance report does not replace the need to conduct risk identification, analysis, review or audit activities. A report will not ensure initiative success nor should it stop the initiative. The report and associated recommendations are prepared for the Accountable Officer who will then decide how to move forward.

Process

At the completion of each gated assurance review, the reviewer (either within the agency, external to the agency, or a panel-approved supplier) will draft and present a report that provides an overall assessment of the initiative status. The report will also provide a summary of the review findings and associated corrective action recommendations, which are prioritised by their urgency using the red-amber-green (RAG) categorisation.

It is important to note the RAG definitions used for an assurance report are different to the RAG definitions used by a program or project. Project RAG reflects current state in relation to the project tolerances.

An assurance RAG status is used to identify the criticality of timing for recommendations to be implemented. Therefore, it is important that stakeholders are informed of the RAG definitions, to dispel the notion that red is bad. In the context of an assurance report, red indicates urgent action and/or consideration is required, and once again, it is the Accountable Officer who will then decide how to move forward.

Practice

Assurance report templates have been created to assist with assurance reporting and should be tailored to meet the individual review requirements of the initiative.

OAI reporting requirement

The finalised gated assurance report, as well as any resultant action plans which address the recommendations, will need to be shared with the OAI. The OAI will coordinate reporting for the collation of generic lessons learned. These lessons learned will provide the foundation for the on-going development of best practice guidance materials, and in the provision of advice to departments and agencies to help ensure that good practices are identified and mistakes of the past are not repeated in similar initiatives.

Annual, de-identified, gated assurance summary reports can be found on the QGEA internal to government portal.

Overall report delivery confidence assessment

Each gated review report will include a delivery confidence assessment (DCA) rating for the initiative. A delivery confidence assessment summarises the level of confidence that the review team holds as to whether the initiative is likely to deliver its planned benefits and achieve its objectives. It is important that the DCA definitions are understood; a RED rating does not mean bad, rather it indicates urgent action is required in order for the initiative to move forward.

A five-point DCA rating scale is used:

Rating | Definition
Green | Successful delivery to time, cost and quality appears highly likely and there are no major outstanding issues that at this stage appear to threaten delivery significantly.
Green/Amber | Successful delivery appears probable; however, constant attention will be needed to ensure risks do not materialise into major issues threatening delivery.
Amber | Successful delivery appears feasible, but significant issues already exist requiring management attention. These appear resolvable at this stage and, if addressed promptly, should not impact delivery or benefits realisation.
Amber/Red | Successful delivery is in doubt, with major risks or issues apparent in a number of key areas. Prompt action is required to address these and establish whether resolution is feasible.
Red | Successful delivery appears to be unachievable. There are major issues which, at this stage, do not appear to be manageable or resolvable. The project may need re-baselining and/or its overall viability re-assessed.

Recommendation priority ratings

Each recommendation should be rated as critical, essential or recommended to indicate its priority and provide guidance on the areas requiring the most urgent work.

Recommendation priority rating | Description | Target resolution timeframe
Critical (Do now) | To achieve success, the recommendation should be actioned immediately. | Within 1 to 4 weeks
Essential (Do by) | The recommendation is important but not urgent. Take action before further key decisions are taken. | Within 4 to 8 weeks
Recommended (Good practice) | The initiative would benefit from the uptake of this recommendation. | Within 8 to 12 weeks
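
As a simple illustration of the target resolution timeframes above, the sketch below converts a recommendation's priority rating into a latest target date counted from the report date. The date arithmetic is an assumption about how an initiative might track due dates, not a mandated calculation.

```python
# Illustrative mapping of recommendation priority ratings to the target
# resolution timeframes in the table above. The date arithmetic is an
# assumption about how an initiative might track due dates.
from datetime import date, timedelta

TARGET_WEEKS = {
    "critical": (1, 4),       # Do now
    "essential": (4, 8),      # Do by
    "recommended": (8, 12),   # Good practice
}

def latest_target_date(priority: str, report_date: date) -> date:
    """Return the latest resolution date implied by the priority rating."""
    _, max_weeks = TARGET_WEEKS[priority.lower()]
    return report_date + timedelta(weeks=max_weeks)

print(latest_target_date("Critical", date(2018, 11, 5)))   # 2018-12-03
```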

Respond

Purpose

Gated assurance review recommendations are targeted to the Accountable Officer. The report aims to assist the Accountable Officer to better understand any potential risks and issues that may hinder the initiative's success. The recommendations will offer a call to action that aims to ensure the initiative is on track for success. After receiving a gated assurance review report and recommendations, a response to each recommendation is required.

Process

The Accountable Officer is responsible for making sure the initiative is a success and therefore is also accountable for the response to each recommendation.

An action plan should be developed to make sure the recommendations are being effectively addressed. Actions should be assigned to a responsible officer, be specific, and have a time frame.

Development of the response and actions can be undertaken by a wider group of stakeholders, such as board members, initiative staff, etc. However, the Accountable Officer is expected to provide the final endorsement of a plan, and approval of completed actions.

Note: The gated assurance review team members are not involved in developing responses or actions to recommendations. The OAI understands that some external suppliers offer to review action plans after a review; this is acceptable only in the context of confirming that recommendations were well understood, and not misinterpreted.

Practice

At minimum, a response to gated assurance report recommendations should include:

  • each recommendation and its RAG status
  • Accountable Officer response to each recommendation
  • evidence the Accountable Officer has approved each response to each recommendation.

Gated assurance report recommendations are intended to be forward looking, actionable and practically helpful. Each initiative will determine how best to capture, monitor and report on actions resulting from recommendations. At minimum, an action plan should include the information from the Accountable Officer response and:

  • agreed action
  • target date
  • assigned officer
  • evidence the Accountable Officer has approved response actions
  • action update
  • action completion date
  • evidence the Accountable Officer has approved completed actions.
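
To illustrate, the fields above could be captured in a record like the sketch below. The structure and names are illustrative assumptions only; each initiative will determine its own action plan format.

```python
# A minimal, illustrative action plan record covering the fields listed above.
# Names and types are assumptions for this sketch, not a prescribed schema.
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class ActionPlanItem:
    recommendation: str                 # the recommendation being addressed
    rag_status: str                     # assurance RAG status of the recommendation
    response: str                       # Accountable Officer response
    agreed_action: str
    assigned_officer: str
    target_date: date
    response_approved: bool = False     # evidence the AO approved the response action
    action_update: str = ""
    completion_date: Optional[date] = None
    completion_approved: bool = False   # evidence the AO approved the completed action
```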

OAI reporting requirement

ICT Investment Review will require both a gated assurance report and the associated action plan. A Concept review seeks either a Gate 0 or Gate 1 report and action plan; the Investment Decision review seeks a Gate 3 report and action plan. On occasion, ICT Investment Review may request other gated assurance reports and action plans.

Learn more about ICT Investment Review (available on the QGEA portal, for registered portal users).

Definitions

Due to recent changes in some methodology terminology, Accountable Officer has been used throughout this document. The Accountable Officer is personally answerable for the initiative (this accountability cannot be delegated). An Accountable Officer is also known as:

  • Senior Responsible Owner (SRO)
  • Project Executive

Please note, the QGEA glossary provides a full listing and definition of terms used in QGEA publications.

References