In 1852 Florence Nightingale wrote, “It is valuable to have one place in the hospital where postoperative and other patients needing close attention can be watched”. Almost 100 years later, an American “anesthesia study commission” concluded that one-third of postoperative deaths in the first 24 hours could have been prevented by better nursing care. Yet it is only in the last 25 years that designated intensive care units have become a feature of acute hospitals in developed countries. In the United States, intensive care beds now comprise 15% of acute beds and cost at least two to three times as much as ordinary ward beds. Over half the difference in “activity and treatment” between hospitals in the United States and the United Kingdom has been attributed to the much lower provision of intensive care in Britain (1), where such units comprise only 1% of acute hospital beds. Selection for intensive care is therefore more stringent than in the United States, as it is for other high-technology procedures (e.g., coronary artery surgery and renal dialysis). However, even in the United States there is now reluctant recognition not only that health care as a whole has to be rationed, but also that unlimited access to high-technology medicine is not always in the best interests of patients and their families. In particular, the cost-benefit ratio of intensive care for certain types of patients has recently come under scrutiny at an NIH Consensus Development Conference (2) and at the Massachusetts General Hospital (3).