Performance-based hospital funding: a reform tool or an incentive for fraud?

Antony Nocera
Med J Aust 2010; 192 (4): 222-224. doi: 10.5694/j.1326-5377.2010.tb03483.x
Published online: 15 February 2010


Hospital incentive funding has been promoted as a tool for health care reform. However, overseas experience has shown that funding changes can influence hospital casemix coding practices.1 In the Australian states of Victoria and New South Wales, hospital incentive funding has resulted in fraudulent reporting of hospital performance data (Box 1 and Box 2). Data fraud raises important questions about the governance of hospitals and monitoring by state and territory governments of public hospital systems.

In addition, Queensland’s commission of inquiry into Bundaberg Base Hospital found that performance-based funding contributed to Dr Jayant Patel’s “sustained path of injury and death” in that too much emphasis was placed on attaining target numbers, and too little on patient care.12 In the context of Bundaberg Base Hospital allowing its budgetary concerns to dictate elective surgery throughput, the inquiry found that “Dr Patel made himself so valuable in that respect that the administrators were plainly reluctant to offend him, let alone investigate him”.12

Implications for proposed health care reform

The combination of performance data manipulation, data fraud and variable interpretation of reporting requirements has meant that comparisons of hospital performance among states are meaningless. In the absence of adequate funding to maintain basic services, performance-based funding has prompted hospital data fraud in Victoria and NSW. To date, the possible relationship between key performance indicator (KPI) fraud and the practice of paying KPI-based salary bonuses to hospital administrators has not been investigated.

The Commonwealth Corporations Act 2001 established national standards for the honest reporting of financial data and information that is financially sensitive, and for the open disclosure of potential conflicts of interest by directors of publicly listed companies. Breaches of these standards of behaviour attract criminal penalties for individual directors of publicly listed companies under Section 1308 of the Corporations Act.

Any attempts to promote health care reform in Australia through the use of incentive funding schemes must be backed up with legislation making clear that data corruption is not tolerated in the public sector, just as it is not tolerated in the private sector. Such legislation must be nationally uniform so that it regulates and defines the terms used in hospital data reporting processes to allow meaningful comparisons to be made among public hospital systems, and to allow appropriate evaluations of the impact of government health policy changes.

Even if hospitals were adequately resourced, poor management practices could lead to inefficient use of resources and decreased hospital performance. Comparative performance indicators that relate various hospital expenditures to their impact on patient care and measurable patient outcomes, such as the incidence of pressure ulcers after hospital admission, need to be developed. Focusing on patient outcomes would remove some of the current anomalies of casemix funding, such as the false economy of prematurely discharging patients from hospital. Readmission of these patients is administratively treated as a "new" hospital admission that generates additional funding. The true cost of what is in fact a failed discharge is a financial impost on the community, not to mention a source of distress to patients, their families and carers.

Systems for reporting the performance of Australian public hospitals are inadequate. Systems have to be developed to provide nationally consistent reporting, with protection against potential hospital administration conflicts of interest, fraudulent manipulation of data and variable interpretation of reporting rules. Consideration should be given to making the Australian Bureau of Statistics responsible for collecting and protecting hospital data through regular independent audits. If meaningful health care reform is to be achieved in Australia, we need to change hospital management culture by making hospital data corruption and deliberate misinformation of the community a criminal offence.

Box 1: Victorian hospital data fraud

Casemix funding arrangements, introduced into Victorian public hospitals on 1 July 1993, departed significantly from previous approaches to public hospital funding in Australia. Under these new arrangements, hospitals received a combination of fixed and variable payments that were linked to the number and case complexity of patients being treated, as well as emergency department (ED) performance.

Allegations of hospital data fraud were first reported in Victoria in 1996.2 Hospital data manipulation included the use of “ghost wards” and “phantom admissions” (see below), and the reclassification of patients on elective surgery waiting lists. Hospitals then fraudulently claimed that they met key performance indicator (KPI) targets to be eligible for performance-funding bonus payments, or to avoid funding penalties.

Ghost wards are “virtual wards” created on the hospital computer system — ED patients requiring admission to wards with no available beds are “admitted” to the ghost ward. These patients are administratively treated as having left the ED even though they have not. Phantom admissions involve the administrative discharge and readmission to hospital of inpatients who have not left the hospital. In this way, the hospital is funded twice for one patient admission.

In 2004, the Victorian Auditor-General audited the management of demand on EDs3 and raised concerns about the accuracy of hospital data. The audit found a "large number" of patient records in which patients were admitted to hospital as inpatients but, paradoxically, no inpatient beds had been requested.3 It also found that hospital reporting systems could be manipulated to meet hospital KPI measures, and that "There are currently no controls to detect or prevent this activity".3

In October 2007, the Victorian Faculty of the Australasian College for Emergency Medicine sent a survey to 21 Victorian ED directors asking how they interpreted data and managed ED waiting times and various KPIs. Nineteen responses were received, a response rate of 90% (Table).4

This ED survey sparked a media outcry and claims of manipulation of both ED performance data and elective surgery waiting list data. Eventually, the Victorian Department of Health ordered a review of elective surgery list data at the Royal Women's Hospital, Melbourne, which concluded there had been systematic manipulation of data to meet elective surgery waiting list targets dating back to at least 2000.5

In 2008, the Victorian Auditor-General conducted another audit and concluded that “Hospitals inconsistently interpreted reporting rules, data capture methods were susceptible to error, and the accuracy of some data was impossible to check, meaning incorrect data may not be detected. In one hospital, data manipulation had occurred”.6 In response to this, the Victorian Health Minister created a position called “Director of Hospital Data Integrity” within the Victorian Department of Health. In addition, the Victorian Parliament’s Standing Committee on Finance and Public Administration launched an inquiry, on 13 November 2008, into, among other things, “the accuracy and completeness of performance data for Victorian public hospitals”.7 The final report is not yet available.

Table: Strategies used in 19 Victorian emergency departments (EDs) to meet key performance indicator (KPI) targets*

KPI target | "Virtual wards" used (%) | Data changed (%)
Admission to a hospital ward by 8 hours |  |
Admission to a hospital ward by 24 hours |  |
Discharge from ED by 4 hours |  |

ED waiting time: 64% of hospitals misrepresented waiting times as time from patient's arrival either to triage or to placement in a cubicle with no clinical contact.

* Results of a survey undertaken by the Victorian Faculty of the Australasian College for Emergency Medicine, October 2007.4

Box 2: New South Wales hospital data fraud

In 2003, the NSW Independent Commission Against Corruption (ICAC) conducted an investigation at the request of the NSW Department of Health (DOH) into alleged active misreporting of elective surgery waiting list data at five metropolitan public hospitals.8 The ICAC found that the allegations were attributable to differing interpretations of the waiting list guidelines by the DOH reviewers and the staff at the five hospitals and two area health services. These guidelines were described by the ICAC as being “so loose and ambiguous that they created extensive opportunities for data to be artificially manipulated for personal or political purposes”.8

In 2006, Shellharbour Hospital designated the right wall of the hospital's emergency department (ED) a "Clinical Decision Unit" with four virtual beds.9 Patients who were "admitted" to these virtual beds were administratively treated as though they had left the ED. In 2008, there were media reports of NSW hospital ED data fraud at Ryde Hospital and Gosford Hospital dating back to 2005; the fraudulent data gave the impression that the performance of the EDs was meeting key performance indicator targets.10 The NSW DOH then conducted a review of ED triage benchmark performance and found that triage reports were of limited value as an accurate record of ED activity because data were captured and recorded in inconsistent ways.11 Further, it found that such reports focused on the efficiency of processing patients, with no attention to other relevant areas such as the quality of care or patient outcomes.11

Antony Nocera
Dubbo Base Hospital, Dubbo, NSW.

Competing interests: None identified.
