MEDICAL RESPONSE TO EFFECTS OF IONISING RADIATION
Proceedings of a conference on Medical Response to Effects of Ionising Radiation held at Queen Elizabeth II Conference Centre, London, 28–30 June 1989.
MEDICAL RESPONSE TO EFFECTS OF IONISING RADIATION
Edited by
W.A.CROSBIE Authority Chief Medical Officer, UKAEA
and J.H.GITTUS Director, Communication & Information, UKAEA
ELSEVIER APPLIED SCIENCE LONDON AND NEW YORK
ELSEVIER SCIENCE PUBLISHERS LTD
Crown House, Linton Road, Barking, Essex IG11 8JU, England

This edition published in the Taylor & Francis e-Library, 2003.

Sole Distributor in the USA and Canada:
ELSEVIER SCIENCE PUBLISHING CO., INC.
655 Avenue of the Americas, New York, NY 10010, USA

WITH 41 TABLES AND 39 ILLUSTRATIONS

© 1989 ELSEVIER SCIENCE PUBLISHERS LTD
© 1989 UNITED KINGDOM ATOMIC ENERGY AUTHORITY—pp. 1–36
© 1989 NATIONAL RADIOLOGICAL PROTECTION BOARD—pp. 83–131
© 1989 CROWN COPYRIGHT—pp. 151–223

British Library Cataloguing in Publication Data
Medical response to effects of ionising radiation.
1. Man. Effects of ionising radiation
I. Crosbie, W.A. II. Gittus, John
612′.014486

ISBN 0-203-21587-7 (Master e-book ISBN)
ISBN 0-203-27217-X (Adobe eReader Format)
ISBN 1-85166-385-1 (Print Edition)

Library of Congress CIP data applied for
No responsibility is assumed by the Publisher for any injury and/or damage to persons or property as a matter of products liability, negligence or otherwise, or from any use or operation of any methods, products, instructions or ideas contained in the material herein. Special regulations for readers in the USA This publication has been registered with the Copyright Clearance Center Inc. (CCC), Salem, Massachusetts. Information can be obtained from the CCC about conditions under which photocopies of parts of this publication may be made in the USA. All other copyright questions, including photocopying outside the USA, should be referred to the publisher.
All rights reserved. No part of this publication may be reproduced, stored in a retrieval system, or transmitted in any form or by any means, electronic, mechanical, photocopying, recording, or otherwise, without the prior written permission of the publisher.
Preface
In 1987 an incident occurred in Goiânia (Brazil) which spotlighted the role which the medical community needs to play in coping with the effects of ionising radiation. A medical radioactive source was accidentally exposed and many people received varying doses, some fatal, of radiation. This accident, like the Chernobyl reactor accident, placed unprecedented demands on the local medical and emergency services, and it is with the response to such demands that this conference is concerned. The conference sets out various scenarios and then considers laboratory and clinical aspects of medical effects upon the individual. Then the responsibilities, plans and resources for coping with an event are covered. Finally the long-term effects of radiation, including epidemiological studies, are presented. Some prominent authors signalled their intention to use the conference as a platform for presenting new findings, and the speakers and chairmen were chosen for the authority which they bring to bear on these important topics. The conference is aimed at a general audience, and the papers are presented in a readily understood manner. It will be of interest to general medical practitioners, accident and emergency personnel, environmental health officers, environmental planning officers, community physicians, safety engineers, environmental scientists, epidemiologists and officials from both government and industry. It will also be of particular value to news and media personnel. W.A.CROSBIE J.H.GITTUS
Contents

Preface v
List of Contributors ix

The medical implications of nuclear power plant accidents
J.G.Tyror 1

Setting the scenario—potential hazards of the nuclear fuel cycle
R.J.Berry and N.McPhail 37

The medical management of radiation casualties
A.W.Lawson 51

Medical management of the patient immunosuppressed by ionising radiation
J.C.Cawley 75

The Goiânia accident
J.R.Croft 83

Current radiation risk estimates and implications for the health consequences of Windscale, TMI and Chernobyl accidents
R.H.Clarke 102

The role of biological dosimetry in a radiological accident in the UK
D.Lloyd 119

Some priorities in experimental radiobiology
G.E.Adams 132

Arrangements for dealing with emergencies at civil nuclear installations
M.J.Turner and I.F.Robinson 151

The National Response Plan and Radioactive Incident Monitoring Network (RIMNET)
M.W.Jones 181

The role of MAFF following a nuclear accident
M.G.Segal 195

Medical response to effects of ionising radiation: resources for coping with an event, the role of the Community Physician
J.D.Terrell 224

Local emergency arrangements for radiation accidents
A.Jones 230

Monitoring and assessment of radiation exposure from routine radioactive discharges, and its relevance to the question of disease clusters
S.R.Jones 241

Studies of leukaemia incidence in Scotland (Abstract)
J.Urquhart 271

The relevance of population mixing to the aetiology of childhood leukaemia
L.J.Kinlen 272

The role of ionising radiation in the aetiology of the leukaemias
R.A.Cartwright 279

A method of detecting spatial clustering of disease
S.Openshaw, D.Wilkie, K.Binks, R.Wakeford, M.H.Gerrard and M.R.Croasdale 295

Prediction of the effect of small doses: inconsistencies in the epidemiological evidence (Abstract)
R.Doll 309
List of Contributors
G.E.Adams, Medical Research Council Radiobiology Unit, Chilton, Didcot, Oxon OX11 0RD.
R.J.Berry, Health and Safety Directorate, British Nuclear Fuels plc, Risley, Warrington, Cheshire WA3 6AS.
K.Binks, British Nuclear Fuels plc, Risley, Warrington, Cheshire WA3 6AS.
R.A.Cartwright, Leukaemia Research Fund Centre for Clinical Epidemiology, University of Leeds, Department of Pathology, 17 Springfield Mount, Leeds LS2 9NG.
J.C.Cawley, University Department of Haematology, Royal Liverpool Hospital, Prescot Street, PO Box 147, Liverpool L69 3BX.
R.H.Clarke, National Radiological Protection Board, Chilton, Didcot, Oxon OX11 0RQ.
M.R.Croasdale, Central Electricity Generating Board, Sudbury House, 15 Newgate Street, London EC1A 7AU.
J.R.Croft, National Radiological Protection Board, Northern Centre, Hospital Lane, Cookridge, Leeds LS16 6RW.
R.Doll, Cancer Epidemiology and Clinical Trials Unit, University of Oxford, Gibson Building, The Radcliffe Infirmary, Oxford OX2 6HE.
M.H.Gerrard, Tessella Support Services, 104 Oak Street, Abingdon, Oxon OX14 5DH.
A.Jones, County Emergency Planning Officer, Somerset County Council, County Hall, Taunton, Somerset TA1 4DY.
M.W.Jones, Radioactive Substances Division, HM Inspectorate of Pollution, Department of the Environment, 43 Marsham Street, London SW1P 3PY.
S.R.Jones, Environmental Protection Group, British Nuclear Fuels plc, Sellafield, Seascale, Cumbria CA20 1PG.
L.J.Kinlen, CRC Cancer Epidemiology Unit, University of Edinburgh, 15 George Square, Edinburgh EH8 9JZ.
A.W.Lawson, Company Chief Medical Officer, British Nuclear Fuels plc, Sellafield, Seascale, Cumbria CA20 1PG.
D.Lloyd, National Radiological Protection Board, Chilton, Didcot, Oxon OX11 0RQ.
N.McPhail, Health and Safety Directorate, British Nuclear Fuels plc, Risley, Warrington, Cheshire WA3 6AS.
S.Openshaw, Centre for Regional Development Studies, University of Newcastle upon Tyne, Claremont Building, Newcastle upon Tyne NE1 7RU.
I.F.Robinson, HM Nuclear Installations Inspectorate, Health and Safety Executive, St Peters House, Bootle, Merseyside L20 3LZ.
M.G.Segal, Food Science Division, Ministry of Agriculture, Fisheries and Food, Ergon House, c/o Nobel House, 17 Smith Square, London SW1P 3HX.
J.D.Terrell, District Medical Officer, West Cumbria Health Authority, West Cumberland Hospital, Hensingham, Whitehaven, Cumbria CA28 8JG.
M.J.Turner, HM Nuclear Installations Inspectorate, Health and Safety Executive, St Peters House, Bootle, Merseyside L20 3LZ.
J.G.Tyror, Director, Safety and Reliability Directorate, United Kingdom Atomic Energy Authority, Wigshaw Lane, Culcheth, Warrington, Cheshire WA3 4NE.
J.Urquhart, Information and Statistics Division, Scottish Health Service, Common Services Agency, Trinity Park House, Edinburgh EH5 3SQ.
R.Wakeford, British Nuclear Fuels plc, Risley, Warrington, Cheshire WA3 6AS.
D.Wilkie, Windscale Laboratory, UKAEA, Sellafield, Seascale, Cumbria CA20 1PF.
THE MEDICAL IMPLICATIONS OF NUCLEAR POWER PLANT ACCIDENTS
J G TYROR Director, Safety and Reliability Directorate United Kingdom Atomic Energy Authority
ABSTRACT

This paper examines the UK position regarding the potential for an accident at a nuclear power plant, the safeguards in place to prevent such an accident occurring and the emergency procedures designed to cope with the consequences should one occur. It focuses on the role of the medical services and examines previous accidents to suggest the nature and likely scale of response that may need to be provided. It is apparent that designs of UK nuclear power stations are robust and that the likelihood of a significant accident occurring is extremely remote. Emergency arrangements are, however, in place to deal with the eventuality should it arise and these incorporate sufficient flexibility to accommodate a wide range of accidents. Analysis of previous nuclear accidents at Windscale, Three Mile Island and Chernobyl provides a limited but valuable insight into the diversity and potential scale of response that may be required. It is concluded that, above all, the response must be flexible to enable medical services to deal with the wide range of effects that may arise.
INTRODUCTION

Recent accidents, including those at Goiânia and Chernobyl, have highlighted the importance of an effective medical response in dealing with accidents involving radioactive materials. Although the chance of such accidents is extremely small, they have the potential to give rise to a number of unique features including radiation exposures, contamination and psychological stress. These features justify detailed consideration and the preparation of procedures to mitigate their consequences. The first section of the paper examines the philosophy and procedures behind the design and operation of nuclear power plant for ensuring that the likelihood of an accident is kept as low as reasonably practicable. This theme is developed in the second section, which reviews the emergency procedures which are nevertheless available in case a
severe accident actually develops. Section 3 examines experience gained from major accidents at Windscale, Three Mile Island and Chernobyl and considers the medical effects and the load placed on medical services. The final section builds on this experience and speculates on the likelihood of such a demand being made in the UK, the magnitude of the consequences and the scale of response that might be required. Although the paper is primarily concerned with the position in the UK, and in particular with that concerning nuclear power reactors, it is necessary to examine the international arena for experience of nuclear incidents and their medical response.
NUCLEAR POWER PLANT SAFETY

Commercial nuclear power was born out of the wartime development of the atomic bomb and from the outset the potential hazards involved in working with significant quantities of radioactive material were well recognised. Safety, and the need to prevent accidental release of radioactive material, has therefore been of prime concern at all stages of the design and operation of nuclear plant. The responsibility for this lies clearly with the operator, who works within a strict regulatory framework. It is this combination of operator responsibility and regulatory control that provides the basis for nuclear safety within the UK.
Legislative Framework

The primary legislation governing health and safety standards at civil nuclear installations in the UK is the Health and Safety at Work etc. Act 1974, together with the associated provisions made under the Nuclear Installations Acts of 1965 and 1969. National legislation is, however, subject to guidance from the international arena. The International Commission on Radiological Protection (ICRP) made a number of recommendations in its Publication 26 in 1977 and some of these formed the basis for a Euratom directive. As a signatory of the Euratom Treaty, the UK is bound to introduce legislation to at least the standard of the directive, which it did by the introduction of the Ionising Radiations Regulations in 1985. Commercial nuclear facilities, including power reactors, must not be constructed or operated without a nuclear site licence granted by the Health and Safety Executive (HSE). The HSE delegates its licensing and regulatory functions to the Nuclear Installations Inspectorate (NII), who in turn ensure that all necessary arrangements for monitoring safety are made by the licensees. In addition to the HSE/NII, the nuclear industry is also subject to regulation by the Department of the Environment (DoE) and the Ministry of Agriculture, Fisheries and Food (MAFF). The DoE is responsible for granting authorisations for the storage and disposal of solid radioactive waste and also for the discharge of materials to atmosphere. Whilst MAFF are also involved in these activities, their main interests are in the authorisation of liquid discharges to rivers or sea and the impact of all discharges on the environment.
The Nuclear Site Licence

The issue of a nuclear site licence is dependent upon the satisfactory outcome of an NII review of proposals made by a prospective licensee. These proposals set out the safety principles on which the design of the nuclear plant is based and demonstrate how they can be met by the reference design. The NII must be satisfied from its examination of these proposals that the facility can be built and operated to the required standard of safety before recommending that a nuclear site licence be granted by the HSE (Ref 1). A nuclear site licence is generally accompanied by a series of licence conditions attached by the HSE as considered necessary in the interests of safety. The conditions are far-reaching and influence many areas including design, construction, operation, modification and maintenance of the facility, in addition to the radiological protection of personnel both on and off site. They may be added to, amended or revoked at any time during the period when a licence is in force and this provides a very flexible regime of safety control.
The NII Safety Assessment Principles

The fundamental principle applied in the UK to the regulation of industrial risks is the so-called As Low As Reasonably Practicable (ALARP) principle (Ref 2). This requires that operators take all reasonably practicable steps to reduce risks, bearing in mind the cost of further reductions. Detailed guidance on how this principle is to be implemented for nuclear facilities is provided by both regulators and designers. In order to guide its assessors, the NII have developed a set of Safety Assessment Principles (Ref 3) to ensure consistency in the assessment of nuclear power plants of different designs. They include both limits and assessment levels which provide guidance as to whether all reasonably practicable steps have been taken to prevent accidents and, should they occur, to minimise their radiological consequences. The principles can be broadly divided into three categories: the first provides a set of fundamental principles for radiological protection; the second lays down basic principles for the limitation of the radiological consequences of operation for both normal and accident conditions; and the third is mainly concerned with engineering features of the plant. The semi-quantitative guidance provided by the NII, detailing the relationship between radiation dose to the public and accident frequency, is illustrated in Figure 1.
Safety by Design

The primary objective of nuclear plant designers is to establish a good, safe design which fulfils the general plant performance specification. This specification includes details of the duty of the plant and, importantly, the requirement to meet the safety objectives and safety principles discussed previously. To this end, reactors tend to be conservatively designed with wide margins of safety and based on proven technology backed up by extensive testing and experience.
Figure 1. NII safety assessment principles

Defence in Depth—Multi-Barrier Principles

One of the most important techniques employed by designers to ensure a satisfactory standard of safety is that of providing defence in depth. This provides the basic framework for most nuclear power plant safety and has been refined and strengthened through many years of application. The defence in depth concept compensates for both human and mechanical vulnerability and is centred on several levels of protection preventing the release of radioactive material to the environment. This multilayer principle is based primarily on a series of barriers which would need to be breached in turn before harm could occur to people or the environment. These are physical barriers providing containment of radioactive material at successive levels. They may serve both operational and safety purposes, or safety purposes only. The reliability of physical barriers is enhanced by applying the defence in depth methodology to each of them in turn and by protecting each of them by a series of measures. Each physical barrier is designed conservatively, its quality is checked to ensure that the margins against failure are acceptable and all plant processes capable of affecting it are controlled and monitored in operation.
¹ ERL: An Emergency Reference Level (ERL) is the radiation dose below which countermeasures to protect the public are unlikely to be justified. The National Radiological Protection Board (NRPB) is the UK body with responsibility for advising on this level, which is presently set at, for example, 100 mSv whole body dose equivalent for evacuation. See Section 2.
A number of human aspects of defence in depth are also used to protect the integrity of the barriers. These include quality assurance, control procedures, safety reviews and other administrative areas within the general safety culture.

Engineered Safety Features

Wherever practicable, passive safety features are incorporated into the design of nuclear power plant. In addition, engineered systems are provided to shut down the reactor, maintain cooling and limit any release of fission products that may occur should there be a fuel failure. Both the initiation and operation of these engineered safety features are highly reliable. This is achieved by the appropriate use of fail-safe design and by independence between safety systems and plant process systems. Systems are designed to ensure that failure of a single component does not cause the entire safety feature to fail (the single failure criterion). To guard against this, redundancy is built into systems to ensure that sufficient back-up is available in the event of a component failure. In addition, diversity is built into the design to ensure that safety mechanisms can be operated by alternative means should the primary means fail. Diversity is also important in ensuring that safety systems are not disabled by common mode failure conditions.

Design Guidelines

To assist in the practical application of these principles, the Central Electricity Generating Board (CEGB) publish design safety criteria (Ref 4) which apply to the design of all their nuclear power reactors. These are accompanied by a set of Design Safety Guidelines (Ref 5) which expand and interpret these criteria for the new generation of Pressurised Water Reactors (PWRs) to be introduced in the 1990s. They are similar to the NII Safety Assessment Principles but in some cases they are more stringent. Thus the CEGB state that accidents giving rise to doses of 1/10 ERL should not exceed 10⁻⁴ per reactor year, compared with the NII figure of 1 in 3000 reactor years. The CEGB's design guidance is illustrated in Figure 2. Accidents which would give rise to high off-site doses are covered by a target total frequency of 10⁻⁶ per year for all accidents giving a 'large uncontrolled release', with a maximum contribution of 10⁻⁷ from any single accident sequence. Some latitude is allowed for 'uncontrolled releases' at levels between 1 ERL and 10 ERL to have probabilities somewhat higher than 10⁻⁶.
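As an illustrative aside (not part of the original analysis), the short sketch below simply puts the two frequency targets quoted above for accidents giving rise to doses of 1/10 ERL into the same units, which makes the relative stringency of the CEGB and NII figures easy to see.

```python
# Back-of-envelope comparison of the frequency targets quoted in the text for
# accidents giving rise to doses of 1/10 ERL. Purely illustrative.

cegb_target = 1e-4          # CEGB design target, per reactor year
nii_benchmark = 1 / 3000    # NII figure of "1 in 3000 reactor years"

print(f"CEGB target   : {cegb_target:.1e} per reactor year")
print(f"NII benchmark : {nii_benchmark:.1e} per reactor year")
print(f"The CEGB target is roughly {nii_benchmark / cegb_target:.1f} times more stringent")
```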
Safety in Operation

The way in which a plant is operated is dictated by the conditions attached to the nuclear site licence. These conditions ensure that all steps are taken by the licensees to protect both workers and members of the public from risks associated with the operation of nuclear reactors. These plants are regularly inspected by the NII to ensure that the conditions of the site licence are being complied with.
Nuclear plant operating staff are of a high professional standing and are well qualified by both experience and training. Automatic systems ensure that plants operate within well defined safe limits and are designed such that any breach of these limits results in the plant shutting down. Reactor shutdown and other immediate emergency actions are fully automatic on modern reactors and no input is required from the operator for about half an hour after they have shut down. This avoids the need for rushed decisions and enables the operator to take advantage of a pre-arranged formal system of technical advice. The training and re-training of operators on simulators to deal with such situations is required by the NII and is a standard part of operational procedure.
Figure 2. CEGB design safety criteria
Operational safety is also improved by incorporating lessons learned from incidents and experiences at other plants. Guidance on these matters will come from within the industry, from the HSE/NII, who must be informed of all incidents or potential incidents at nuclear installations in the UK, or from one of the internationally established agencies. These include:

The International Atomic Energy Agency (IAEA) who investigate all serious radiological accidents and produce a detailed account from which all States may learn and hence avoid similar consequences (Ref 6).

The International Nuclear Safety Advisory Group (INSAG) who advise on the safety of nuclear power plants. This body (established by the IAEA) serves as an international forum for the exchange of information on nuclear safety issues, and assists in the formulation of common safety concepts where appropriate (Ref 7).
The Organisation for Economic Co-operation and Development, Nuclear Energy Agency (OECD/NEA). This organisation maintains an Incident Recording System (IRS) which can be used to provide information on particular operational experiences and incident events.

The World Association of Nuclear Operators (WANO) which is also being established to disseminate information and to enable experience to be made more widely available to nuclear operators throughout the world.
EMERGENCY PROCEDURES

Despite all reasonably practicable steps taken to design and operate nuclear plant to the highest levels of safety, there can be no absolute guarantee that accidents will never happen. It is therefore necessary to have emergency arrangements to deal with any accident that might occur. We are all too aware of the demands made on emergency services in recent years in responding to major non-nuclear accidents at Bradford City Football Club, Manchester Airport, King's Cross Station, Clapham Junction, Lockerbie and Kegworth. The responses to these accidents have demonstrated that civil Emergency Plans exist and that the emergency and medical response capabilities are in place to deal with large scale accidents. The response to such accidents is planned at Local Authority level and is based on arrangements involving the Police and other emergency services. These plans need to be sufficiently flexible to cope with a wide variety of potential incidents. Despite the fact that nuclear accidents could give rise to a number of unique characteristics, the response required to protect the public is not vastly dissimilar to that required for other civil emergencies. Consequently, nuclear plant emergency arrangements are integrated with existing County Emergency Plans to ensure that, where possible, there is commonality in the response required by organisations which have a role to play.
Nuclear Emergency Planning

It is a requirement of a nuclear site licence that the operator must have an emergency plan approved by the NII. Such a plan will provide a general framework with detailed arrangements focused on a reference accident. These plans are published and are available for public scrutiny. They are regularly exercised in the presence of the NII, to ensure their continued effectiveness in providing the necessary action, both on and off site, to protect members of the public. There is no single emergency plan which provides a universal optimum solution; each site will develop its own, which will be subject to scrutiny by the NII. If the plant is new, emergency plans will have to be in place before the plant is commissioned. They are generally based on a tiered structure of alert:

Building Emergency: where the effects of an incident are confined within a building.

Site Emergency: where there are no off-site effects.
District Emergency: where the incident gives rise to effects off site.

The plans must be able to cope with a wide variety of accidents, ranging from those with potential for a release of little more than the routine radioactive discharge to those with potentially far reaching off-site consequences, possibly involving fire and injury to operating staff. The operator is at all times responsible for the on-site control of the incident irrespective of whether it has off-site consequences or not. The emergency plan for the affected site should contain a wide range of response capabilities which could be drawn upon to assist in the control of an incident. Off-site action involves the local emergency services and other authorities which may be called upon to implement measures to protect the public. As with any other type of civil emergency, the main responsibility for interaction with the public lies with the Police. The importance of the Police role and their experience in dealing with a variety of emergency situations cannot be too strongly emphasised.
Accident Consequences

The consequences of an accident for operating staff and other workers at the plant could be severe, dependent upon its detailed nature and speed of development. If a serious accident did occur, it is likely to result in the release of volatile radioactive species including caesium, radioactive noble gases, and the radio-iodines in both gaseous and particulate forms. This radioactive material would be transported by the wind from the affected plant and behave similarly to a plume of smoke, dispersing into the atmosphere and depositing some of its contents on the ground. The activity contained within the plume could give rise to radiation exposures to the public in a number of ways. The first consideration is the external dose from the airborne plume itself, material deposited on the ground and possibly on the skin and clothing of people in its path. The second, and possibly most significant, route of exposure is via the inhalation of material suspended in the plume. Finally, and on a longer timescale, there are exposures arising from the consumption of contaminated food and water. The release of radioactivity resulting from a serious reactor accident should not cause any immediate health effects to the public, but there are a number of countermeasures which could appropriately be taken to minimise longer term effects. These include:

(a) Sheltering—The normal constructional materials used in houses and other buildings provide some protection from the effects of radioactive materials released to the atmosphere, up to a factor of ten dose reduction being obtained in favourable circumstances.

(b) Potassium Iodate Tablets—If taken early enough, the thyroid can be blocked with stable iodine and this limits the absorption of radioactive iodine which may be present in the radioactive plume.
(c) Evacuation—If it is practicable to evacuate personnel from the affected areas, doses arising from exposure to the plume itself and from activity deposited on the ground could be reduced.

Furthermore, following the incident, checks would be made on foodstuffs and MAFF may consider it necessary to introduce restrictions on agricultural produce and dairy products. These countermeasures should reduce exposures and help ensure that the public are not exposed to significant risks to health. The Police, the Local Authority, MAFF and the Local Health Authority would all be involved in the emergency response. It is the responsibility of the Local Health Authority to ensure that the health of the public is considered at all times during the control of the incident, particularly with regard to countermeasures. Upon being informed of the incident, representatives from all organisations with a role to play would proceed to the local Operational Support Centre (OSC). Once established, the OSC becomes the focal point for liaison activities and the co-ordination of advice to all outside organisations. Upon declaration of an emergency a Government representative would be dispatched to the OSC to act as Government Technical Adviser (GTA). His principal task is to preside over all off-site developments and to ensure that consistent advice is given to all interested and involved parties including Government departments and the media.
Medical Aspects

There are many important aspects which must be considered for an emergency plan to work effectively. The arrangements must include procedures for clarifying responsibilities, ensuring that effective teams can be readily mobilised, providing robust communications facilities, manning of control, support and media briefing centres, and so on. We are primarily concerned, however, with the medical implications of accidents and therefore attention is focused on these.
The Medical Response for On-Site Personnel

Nuclear establishments are generally well drilled in their response to accidents and emergencies. Each establishment will have its own emergency instructions, exercised on a scheduled basis, and personnel working in affected areas should follow these instructions. Dependent on the type of emergency these may require staff to stay indoors (shelter) or possibly to evacuate to a local assembly point. Any monitoring and decontamination of personnel that may be required can be arranged on site since facilities are available for showering, change of clothes and health physics monitoring. In the event of a major accident it is possible that workers or personnel involved with bringing the incident under control may require medical treatment. Most nuclear establishments have on-site medical services which are geared to deal with emergencies including those requiring decontamination procedures. Medical staff are on call at every site and, if conventional injuries allow, the medical centres would be used to receive, decontaminate and sort casualties prior to transfer to hospital.
The fact that a nuclear accident could result in demands being made on NHS hospitals and on the Health Service generally has long been recognised. Operators are therefore responsible for ensuring, in consultation with Local Health Authorities (in Scotland the Health Boards), that arrangements exist with particular hospitals for emergency treatment of personnel injured on a nuclear site (Ref 8). It is unlikely that patients would be able to be segregated at the scene of the incident and it is upon arrival at the main co-ordinating hospital that initial sorting of patients would be performed. Personnel who have been neither irradiated nor contaminated could, if not in danger from conventional injury, be transferred to outlying hospitals without facilities for handling contaminated patients. Subject to any immediate life-saving and rescue measures, special care should be given to the handling of persons who have been contaminated by radioactive material or who may have been exposed to very high levels of radiation. Whilst there are a considerable number of hospitals able to deal with contaminated casualties, there are very few with the facilities required for the care of patients who have received high external doses.

Contaminated Casualties

It is possible that personnel directly involved in the accident or involved in actions to mitigate the consequences may have become contaminated with radioactive material. Whilst there may be more immediate life-threatening symptoms associated with their injuries, care must be taken to reduce the spread of contamination. This would limit internal exposure of both the patient and the people dealing with his injuries, ie ambulance men, medical staff, nurses, etc. It would also reduce the risk of contaminating ambulances, hospital wards, treatment rooms and corridors. This is an area of continuing consideration. Thus, for example, workers at the UKAEA's Dounreay establishment are developing a plastic envelope for preventing the spread of contamination from accident victims. This consists of a PVC base with heavy-duty "cling-film" sides. The envelope is opened out and placed on the stretcher and the casualty is laid on top. The sides are then wrapped round the person as required. These can be partially rolled back to allow access to local wounds or even cut and then re-sealed. As far as possible contaminated patients would be treated at hospitals having suitable facilities, ie a medical health physics team. Dependent upon the scale of the incident it may be necessary to establish temporary contaminated patient facilities within the smaller hospitals, but arrangements for this would have to be formulated at the County emergency planning level and outline arrangements included in the emergency plan.

Irradiated Casualties

Irradiated casualties present a different problem. As with contaminated patients the first priority is the treatment of any life-threatening conventional injuries. This should be followed by decontamination processes and collection of blood samples which, together with analysis of personal dosemeters, would allow an assessment to be made of the extent of the radiation injury. Only those who have received doses of between 1 and 10–12 Gy (Gray)² will benefit from supportive care (Ref 9). Very few hospitals have the
sophisticated protective isolation facilities required for such care and it is likely that they would already be fully occupied. Arrangements are in place for the Royal Marsden Hospital to receive any irradiated casualties from CEGB sites.
The Medical Response for the Public Off Site

It is important to stress that there would be few, if any, clinical effects manifested amongst the general public. Consequently it is only on-site personnel who are likely to require early hospital treatment. It is possible that in a major accident some members of the public close to the plant would become contaminated to an extent requiring decontamination and receive radiation doses considerably greater than those arising from natural background radiation. Whilst this should not result in any early health effects, medical counselling may be necessary for reassurance purposes.

Guidance for the Imposition of Countermeasures—ERLs

The Local Health Authority would be involved in any decisions involving the issue of potassium iodate tablets to the public and they would also be consulted regarding other countermeasures under consideration. Guidance on the circumstances under which countermeasures might be taken is provided in the UK by Emergency Reference Levels (ERLs). These were initially recommended by the Medical Research Council in 1975 and subsequently revised by the National Radiological Protection Board (NRPB) in 1981 (Ref 10). The ERLs provide an aid to decision-making in the early stages following an accidental release of radioactive material into the environment, or in anticipation of such an event. Following this initial period it is expected that a team able to advise on continued protection measures would be assembled and the ERLs may not then be appropriate. The NRPB recognise the different risks and social costs associated with the various countermeasures and have recommended a lower and upper level of estimated dose for each. If the lower level is exceeded the countermeasure should be actively considered and taken, unless circumstances dictate otherwise. If predicted doses are higher than the upper level, the Board considers the countermeasure to be essential. Figure 3 presents a summary of the Board's recommendations. ERLs can be regarded as "primary limits" but in order to provide guidance in the aftermath of an incident it is necessary to compare measured data with Derived Emergency Reference Levels (DERLs). These are practical quantities which can be directly compared with environmental measurements to assess the seriousness of the release and help form judgements on the need for protective countermeasures. These intervention levels are determined in advance and agreed with the
² For an explanation of health physics units and terms please refer to Appendix 1.
appropriate regulatory bodies. It is important, however, that they should always be seen as providing guidance rather than specifying a level at which action must be taken.

Figure 3. An illustrative summary of the emergency reference levels recommended by the NRPB
Source: ERL2 (NRPB) 1977
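As a purely illustrative sketch of the two-tier ERL scheme described above (and in no way a substitute for the NRPB recommendations summarised in Figure 3), the snippet below encodes only the decision logic: below the lower level the countermeasure is unlikely to be justified, between the levels it should be actively considered, and above the upper level it is regarded as essential. The 100 mSv lower level for evacuation is the example figure quoted in the footnote to this paper; the 500 mSv upper level shown is a placeholder for illustration, not an NRPB value.

```python
def countermeasure_advice(predicted_dose_mSv, lower_erl_mSv, upper_erl_mSv):
    """Two-tier ERL logic as described in the text (illustrative only)."""
    if predicted_dose_mSv >= upper_erl_mSv:
        return "countermeasure regarded as essential"
    if predicted_dose_mSv >= lower_erl_mSv:
        return "actively consider countermeasure (take unless circumstances dictate otherwise)"
    return "countermeasure unlikely to be justified"

# Example: evacuation, using the 100 mSv whole-body lower level quoted earlier;
# the 500 mSv upper level is a placeholder, not a recommended figure.
for dose in (20, 150, 600):
    print(dose, "mSv ->", countermeasure_advice(dose, lower_erl_mSv=100, upper_erl_mSv=500))
```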
Distribution of Stable Iodate Tablets

Administration of stable iodate can reduce the uptake of radio-iodine which would otherwise concentrate in the thyroid gland. It has been shown experimentally that a dose of 100 mg of potassium iodide is virtually 100% effective if taken immediately or at the time of exposure to radio-iodine, 75% effective if taken 1.5 hours afterwards, and 50% effective if taken 5.5 hours later (Ref 11). For most populations evidence shows that the risks to individuals of side-effects caused by taking stable iodine are generally very low, but there is little guidance on the quantities which should be administered. Populations living close to UK nuclear facilities know of the existence of stable iodine tablets and are aware of predetermined plans to distribute them.
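A rough feel for how quickly the benefit of stable iodine falls off with delay can be had by interpolating between the three experimental figures quoted above. The sketch below does only that, by straight-line interpolation between the quoted points; it is illustrative arithmetic, not guidance on dosing or timing.

```python
# Linear interpolation between the blocking-effectiveness figures quoted in the
# text (Ref 11): 100% at 0 h, 75% at 1.5 h, 50% at 5.5 h after exposure.
# Purely illustrative; not guidance on administration.

POINTS = [(0.0, 100.0), (1.5, 75.0), (5.5, 50.0)]  # (delay in hours, % effective)

def blocking_effectiveness(delay_hours):
    if delay_hours <= POINTS[0][0]:
        return POINTS[0][1]
    for (t0, e0), (t1, e1) in zip(POINTS, POINTS[1:]):
        if delay_hours <= t1:
            return e0 + (e1 - e0) * (delay_hours - t0) / (t1 - t0)
    return None  # beyond the last quoted data point

for delay in (0, 1, 3, 5):
    print(f"{delay} h delay -> roughly {blocking_effectiveness(delay):.0f}% effective")
```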
Brief Comparison of the Emergency Planning Practices of Major Nuclear Countries

The nuclear emergency planning practices for countries with a significant nuclear power programme were reviewed following Chernobyl. There are differences related to the tiered organisation of Government and to the type of power reactor or nuclear facility being operated. Most countries, including the UK, place the responsibility for off-site emergency response with Local Government and rely on Central Government Departments to provide operational support. In the UK these would include the Department of Energy, DoE, MAFF, etc. In general, the emergency plans vary in their sophistication between countries but they are broadly similar because the radiological aspects are based on the recommendations of the ICRP. The differences lie mainly in the allocation of responsibilities and there are no areas where the UK plans appear to be significantly less stringent.
MAJOR ACCIDENT EXPERIENCE

As the previous sections have shown, UK nuclear facilities are designed to be robust and they are operated within a strict regulatory framework. The limited number of nuclear accidents both in the UK and abroad is testimony to the high level of safety adopted in the operation of nuclear facilities world-wide. It is important, however, to analyse accidents which have occurred to see if lessons can be learned and to examine the implications for medical services.

CRITICALITY INCIDENTS WITHIN THE NUCLEAR INDUSTRY

An example of where previous experience and analysis of accidents has heavily influenced the evolution of present-day policy is in the field of nuclear criticality safety. These involve accidents in plants intended for operation with sub-critical quantities of fissile material or in experimental facilities where criticality was achieved unintentionally. There have been 37 reported criticality accidents since the mid-1940s but they now occur less frequently (3 within the last 23 years). Analysis of these accidents has provided valuable guidance regarding principles of criticality control and an insight into the biological effects of high doses of gamma and neutron
radiations. Of these 37 accidents, seven gave rise to a total of 9 deaths. The types of incident are summarised in Table 1 below:
TABLE 1. Summary of criticality incident types (Ref 12)
Amongst these accidents there have been 8 well documented criticality excursions in chemical processing plant, all occurring with aqueous solutions of highly enriched uranium or plutonium. These 8 accidents resulted in 2 deaths and 19 workers being significantly over-exposed to radiation, but the general public was not endangered by any of these excursions. The accidents were not caused by misleading criticality information or errors in its interpretation, but by difficulties with equipment, procedural inadequacies and violations, or a combination thereof. Full details of the 8 chemical processing plant criticality accidents can be found in Reference 13, but the general features are summarised in Table 2 below. In a number of instances the radiological consequences of incidents in unshielded facilities have been limited by evacuation of personnel alerted by alarms. Especially for prolonged and multiple excursions, alarms and subsequent evacuation must be credited with saving lives. It is important therefore that high integrity alarm systems are fitted in areas where there is potential for an accidental criticality excursion. The 2 fatalities amongst this group of accidents were suffered by personnel within 1 m of an excursion, and significant exposures were received by personnel standing up to a distance of 15 m away. By normalising data from a number of criticality incidents it is possible to make broad deductions as to the fall-off of dose with distance, and it is concluded that within approximately 3 m of an excursion lethal exposures can be expected. At a distance of 20 m the exposure is approximately 0.25 Gy, a level at which effects are generally not medically detectable. These distances are quite comparable to those considered dangerous for plant subject to moderate chemical explosions.
TABLE 2. Criticality accidents in processing plants (from Ref 14)
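The dose and distance figures quoted above can be roughly checked with a simple point-source, inverse-square scaling from the approximately 0.25 Gy figure at 20 m. The sketch below does only that; it ignores air scattering, attenuation and the details of the normalised incident data, so it is a sanity check rather than a reproduction of the analysis referred to in the text.

```python
# Back-of-envelope check of the dose/distance figures quoted above, assuming a
# point source and pure inverse-square fall-off (scattering and attenuation in
# air are ignored; illustrative only).

REF_DOSE_GY = 0.25   # quoted dose at the reference distance
REF_DIST_M = 20.0    # reference distance in metres

def dose_at(distance_m):
    return REF_DOSE_GY * (REF_DIST_M / distance_m) ** 2

for d in (3, 10, 20):
    print(f"{d:>2} m : roughly {dose_at(d):.2f} Gy")
# Around 11 Gy at 3 m, comfortably above the LD50/30 of 4-5 Gy quoted below,
# which is consistent with lethal exposures being expected within about 3 m.
```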
The concentration of criticality accidents in chemical plants, 5 in the period 1958 to 1962, can be partly attributed to increased production of highly enriched uranium and plutonium without corresponding growth and sophistication of the facilities. As a result of this cluster of accidents, however, techniques for criticality control were refined and a number of facilities were modernised, resulting in an improvement of the accident record. In terms of a medical response, acute effects arising from criticality accidents generally involve a limited number of on-site personnel. For accidents which occur outside shielding, external exposures to both gamma and neutron radiations are the principal concern and these can vary from a fraction of a Gy up to tens of Gy. Consequently there is a corresponding range of health effects, ranging from small exposures with no medically detectable effects through to fatalities resulting from high doses. The LD50/30, ie the dose expected to be lethal to 50% of those people exposed within 30 days of exposure, is in the region of 4–5 Gy. Symptoms differ, however, dependent on dose, but for lethal doses the time before death decreases with increasing dose. For example, an operator exposed in an Argentinian accident in 1983 received approximately 20.6 Gy of gamma and 17 Gy of neutron radiation. He was coherent following the incident but 20 minutes later he began vomiting. He remained lucid until 4–5 hours before his death, which occurred 2 days later.
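To pull together the broad whole-body dose figures quoted at various points in this paper (roughly 0.25 Gy as the level below which effects are generally not medically detectable, 1 to 10–12 Gy as the range in which supportive care is said to be of benefit, and an LD50/30 of about 4–5 Gy), the sketch below simply collates them into bands. It is an illustrative summary of numbers already given in the text and is in no sense clinical or triage guidance.

```python
# Illustrative collation of the broad dose figures quoted in this paper.
# Not clinical guidance of any kind.

def rough_dose_band(dose_gy):
    if dose_gy < 0.25:
        return "effects generally not medically detectable"
    if dose_gy < 1.0:
        return "below the 1 Gy threshold quoted for benefit from supportive care"
    if dose_gy <= 12.0:
        return "within the 1 to 10-12 Gy range in which supportive care is stated to help (Ref 9)"
    return "above the range in which supportive care is stated to help"

for dose in (0.1, 0.5, 4.5, 20.0):
    print(f"{dose:>5} Gy : {rough_dose_band(dose)}")
```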
WINDSCALE FIRE: 1957

Brief Description of Incident

The Windscale Piles were, in effect, a very early, simple type of nuclear reactor whose main use was the production of material for the weapons programme. In essence they were large blocks of graphite, honeycombed with horizontal channels in which natural uranium fuel, canned in aluminium, was made critical. Cooling was achieved by passing air over the fuel on a once-through basis. This was then discharged to atmosphere through the distinctively shaped chimneys which housed filters to remove particulate material. It must be stressed that these reactors were rather crude by modern standards and bear little resemblance to the sophisticated commercial reactors used for the generation of electrical power today. The accident occurred during the routine, controlled release of stored Wigner energy from the graphite of one of the piles. This operation started on 7 October 1957 and initially followed a normal course. On 9 October, however, it was noted that some of the recorded temperatures in the Pile seemed to be abnormal, and at noon on 10 October an air sample taken in the open showed radioactivity levels to be significantly higher than normal. Operating staff initially thought that a fuel cartridge had burst, but at 1630 a visual inspection revealed glowing fuel cartridges in over 100 channels. Shortly after midnight the Police were warned of the possibility of an emergency, and during 11 and 12 October 1957 water was used to cool the Pile and extinguish the fire (Ref 15).
Measures Taken to Protect the Workers

Once it was established that air contamination levels in open areas on site were higher than normal, contractors working on the construction of the Calder Hall reactors were sent home and workers elsewhere on the Windscale site were given instructions to stay indoors and to wear respiratory protection. The aluminium cladding in many of the fuel cartridges failed and fission products were released into cooling channels. Much of the radioactivity was retained on the filters but there was clearly some release from the chimney stack. The most radiologically significant elements released in the fire were Iodine-131, which following either ingestion or inhalation concentrates in the thyroid, and Caesium-137, a longer lived nuclide which irradiates the whole body both when ingested or inhaled, and via external radiation. Radiation doses received by workers at the plant as a result of the accident were not excessively high. During the 13 week period including the incident, 14 workers exceeded the maximum permissible quarterly dose level of 3 rem (30 mSv). The highest figure recorded for the same period was 4.66 rem (47 mSv). It is important to note that a number of these doses were received knowingly by personnel directly involved with controlling the incident. There was also some hair and hand contamination but this was successfully removed by routine procedures. The principal health hazard both to workers and members of the public arose from radioactive iodine and, in order to assess the effects, thyroid iodine surveys were performed. The highest activity was found in the thyroid of an AEA employee. The value was 0.5 µCi (microcuries) and this compares with the ICRP level for safe continuous activity of 0.1 µCi. Surveys were also carried out amongst workers for Strontium activity (Sr-89 and Sr-90), but levels found were at most one-tenth of the maximum permissible body burden. Biological samples for radioactive Caesium were also found to be satisfactory (Ref 16). The staff most exposed during the incident received the regular medical examinations given to UKAEA workers throughout their subsequent employment. They have also been included in epidemiological studies of UKAEA and BNF plc workers performed by independent experts (Refs 17, 18 and 19). These studies showed mortality rates below those of the general population and consistent with those expected in a normal healthy workforce.
Measures Taken to Protect the Public

Following the accident the hazard to the public arose from inhalation, ingestion and external radiation arising from fission products released from the uranium fuel. It was possible at an early stage to reject the need for emergency measures based on inhalation or external radiation, but measurements of the Iodine content of milk produced locally made it necessary to restrict the collection and distribution of milk from areas around the site. Milk forms an important part of the diet of young children and there was particular concern that radioactive Iodine in the milk would be concentrated in the infant's
thyroid. Calculations indicated that it would be unwise to permit consumption of milk with activity levels greater than 0.1 µCi l⁻¹. Initially, collection of milk from farms within 2 miles of the site was restricted and ultimately this was extended to 200 square miles, including the coastal strip south of Windscale to a distance of 30 miles. Iodine-131 has a physical half-life of approximately 8 days and based on this fact alone, disregarding the cleansing action of rain, etc, it was anticipated that levels would soon fall below 0.1 µCi l⁻¹. This proved to be the case, but the final relaxation of the milk restrictions was delayed to make sure that Strontium levels were also within safe limits. Other foods including eggs, vegetables, etc, were investigated but all proved to be within safe limits. It was reported at the time that the Cumberland population took the accident remarkably calmly. The AEA stated that it would pay compensation for milk that was disposed of and, curiously, during the period of the ban there was a noticeable rise in the milk yield from local herds (Ref 20).
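Because both the restriction threshold (0.1 µCi l⁻¹) and the physical half-life (about 8 days) are given, the time for a contaminated sample to decay below the threshold by physical decay alone can be estimated directly. The sketch below does this for an assumed starting concentration of 1.4 µCi l⁻¹, the peak value quoted later in this chapter, and deliberately ignores rain washout, dilution and any further deposition.

```python
import math

HALF_LIFE_DAYS = 8.0   # physical half-life of I-131 quoted in the text
THRESHOLD = 0.1        # restriction level, microcuries per litre

def days_to_decay(initial_uci_per_l, threshold=THRESHOLD):
    """Days for I-131 activity to fall below the threshold by physical decay
    alone (no rain washout, no dilution, no further deposition)."""
    return HALF_LIFE_DAYS * math.log2(initial_uci_per_l / threshold)

# Using the peak concentration of 1.4 uCi/l measured near Windscale (quoted below):
print(f"roughly {days_to_decay(1.4):.0f} days")   # about 30 days
```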
Lessons Learned from the Windscale Fire

It is important to note that there were no rapid catastrophic failures associated with this incident since it developed over a period of days (7–12 October 1957). It did not give rise to any seriously high doses or non-stochastic effects and consequently no immediate medical response was required. The recommendations and conclusions arising from the final report of the Committee appointed by the Prime Minister to make a technical evaluation of the fire in the Windscale Piles centre on the methods used for the release of stored Wigner energy (Ref 21). In addition, there was a certain mistrust of instrumentation during the course of the incident and the Committee recommended a substantial increase in the number of instruments required, particularly for temperature measurement and fuel element failure. The committee of inquiry was satisfied that it was in the "highest degree unlikely" that any harm was done to the health of people in the area, workers at the Windscale plant, or members of the general public. A number of weaknesses were identified, however, notably the delay between recognition of the existence of an accident with potential for the emission of radioactive substances and the institution of an extensive and rapid milk sampling programme throughout the area at risk. Once countermeasures had been agreed and brought into play, however, the committee considered them to be adequate to prevent ill effects. At the time of the accident there was practically no information published regarding safe levels of Iodine-131 in milk. It was considered important to err widely on the safe side and to ban the consumption of milk from farms in the area. The first area restricted was a coastal strip approximately 2 miles wide but this was later extended to an area of approximately 200 sq miles. Following the incident there was some criticism for not making the area pessimistically large at the beginning and then shrinking it, since this was considered to be psychologically more acceptable.
Application of the data in the Medical Research Council (1975) report (Ref 22) to the circumstances of the Windscale fire confirmed Iodine-131 to be the most important radio-nuclide concerned. The recommended ERL of dose to the thyroid from I-131 was 30 rem (300 mSv). This corresponds to a derived ERL of peak activity in milk for adults in the MRC (1975) report of 3.8 µCi l⁻¹ (based on an assumed milk consumption of 0.5 litres per day). Since the highest measured concentration of I-131 in milk produced close to Windscale was 1.4 µCi l⁻¹, the dose to the thyroid, had this been consumed as normal, would have been in the order of 11 rem (110 mSv) for an adult and possibly up to twice that for a child (Ref 23). The controls instituted to prevent milk being consumed with I-131 levels greater than 0.1 µCi l⁻¹ ensured, however, that actual doses to the thyroid were far less than this figure (a maximum of 9 mSv to a child). The collective effective dose equivalent commitment to the population of the UK and Northern Europe is estimated as 2×10³ person-sieverts (Ref 24). If the Windscale fire had happened today the response would be unlikely to differ significantly from the actions taken in 1957. Stable iodate would probably be issued once the incident had been recognised and a programme of monitoring instituted at an early stage to check for radio-iodine build-up in milk. As in 1957, evacuation would not be considered appropriate for an incident of this magnitude.
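The adult thyroid-dose estimate above follows from simple proportionality between the derived ERL concentration and the measured peak. The sketch below reproduces that arithmetic using only the figures quoted in the text (300 mSv thyroid ERL, 3.8 µCi l⁻¹ derived peak concentration for adults at 0.5 litres per day, 1.4 µCi l⁻¹ highest measured concentration); it is a check of the quoted numbers, not an independent dose assessment.

```python
# Reproducing the adult thyroid-dose estimate quoted above by simple scaling:
# if a peak of 3.8 uCi/l corresponds to the 300 mSv thyroid ERL, the highest
# measured concentration of 1.4 uCi/l corresponds to about 300 * 1.4/3.8 mSv.

ERL_THYROID_MSV = 300.0      # MRC (1975) ERL of dose to the thyroid from I-131
DERIVED_PEAK_UCI_L = 3.8     # derived peak milk activity for adults (0.5 l/day)
MEASURED_PEAK_UCI_L = 1.4    # highest concentration measured near Windscale

adult_dose = ERL_THYROID_MSV * MEASURED_PEAK_UCI_L / DERIVED_PEAK_UCI_L
print(f"Estimated adult thyroid dose: roughly {adult_dose:.0f} mSv")     # about 110 mSv
print(f"Child (quoted as up to twice the adult figure): roughly {2 * adult_dose:.0f} mSv")
```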
THREE MILE ISLAND: 1979

A Brief Description of the Incident

Two Pressurised Water Reactors (PWRs), each of approximately 900 MW(e) and designed by the US company Babcock and Wilcox, are situated on the Three Mile Island site in the Susquehanna River, Pennsylvania. The nearest sizeable town is Harrisburg, which has a population of approximately 68,000 people (1979 figure). Unit No 1 came into commercial operation in 1974 and was shut down for refuelling at the time of the accident. Unit No 2 entered commercial service on 30 December 1978 and suffered a serious accident just 3 months later, on 28 March 1979. The accident at Three Mile Island developed in a manner determined by equipment malfunction, some design weakness and operator error. The loss of feedwater to the steam generators should have been dealt with by the installed safety systems, but escalated into a more serious event because vital equipment had been disconnected during maintenance. Decisions made by the operators during the subsequent course of the event have also been shown to have exacerbated the situation. The initial accident sequence at Three Mile Island (TMI) occurred over a period of minutes. A feedwater pump trip led to an absence of an effective heat sink and the primary system pressure began to rise. After approximately 15 minutes a pressure relief valve opened (correctly) but then failed to re-shut properly when the coolant pressure dropped. The operators did not realise this failure for about 2 hours and, as a result, large amounts of active water were discharged into the containment sump. Since the sump pumps were running at this time, some of this water was transferred to an auxiliary building outside the containment building. Faulty decisions made during the water loss resulted in about half the fuel lacking coolant. This gave rise to substantial fuel damage and a release of fission products into the containment building. Despite a considerable release of radioactivity from the damaged fuel, only nominal exposures were suffered by the general public since the containment proved to be effective. For a period of about 7 days following the accident there was, however, considerable anxiety regarding the safety of the plant and whether the containment would continue to hold.
Actual Hazard (Radiological)

During the early stages of the accident some coolant was released into auxiliary buildings. Since this area was at a slightly lower pressure than the containment building and because the water was cooling down, dissolved fission product gases came out of solution and escaped from the building. This was the source of a plume of radioactive gas which naturally gave rise to public concern. The plume contained mainly radioactive noble gases and some traces of radio-iodine. The radioactive water was subsequently pumped back into the containment and the release terminated.
Health Effects
Although there was a fairly swift and catastrophic failure of the core, this accident did not give rise to any casualties at the site. Radiation doses received by workers did not give rise to any non-stochastic effects and consequently there was no requirement for an immediate medical response. During the period from 28 March to 30 June 1979, 3 workers at Three Mile Island received doses of between 30 and 40 mSv (3–4 rem), which exceeded the Nuclear Regulatory Commission (NRC) maximum permissible quarterly dose of 30 mSv (3 rem). The President's Commission estimated that between 28 March and 15 April 1979 the collective dose to the population living within a 50 mile radius of the plant resulting from activity released was approximately 20 person-sieverts (2000 person-rem). This figure can be compared with the estimated annual collective dose from background radiation for the same group of 2400 person-sieverts (2.4×10⁵ person-rem). The incremental increase due to the accident at Three Mile Island to persons living within a 50 mile radius was therefore approximately 1% of the annual background level. The same figure for people living within a 5 mile radius was calculated to be approximately 10%. The maximum estimated dose received by a member of the public during the accident was 0.7 mSv (70 millirem). The major health effect of the accident appears to have been the psychological stress imposed on people living in the region of Three Mile Island. There was immediate distress produced by the accident among many groups of the general population living within 30 miles of the plant. The highest levels of distress were found amongst people living within 5 miles of the reactor, those with pre-school children, and amongst workers from the plant itself. One of the major factors contributing to this heightened level of stress was the uncertainty caused by the lack of reliable information and guidance provided through the media. The actual radiation levels outside the plant were low but for several days there was uncertainty about the possibility of serious releases from the containment building. Federal and state officials disagreed regarding the information on which to base
decisions regarding counter-measures. Some officials based their decisions on actual radiation levels outside the plant whilst others based their decisions on the potential for a major release from the containment building (Ref 25).
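The collective dose comparison quoted above reduces to a single ratio. A minimal calculation sketch is given below (in Python, using only the figures quoted in the text; the variable names are illustrative).

    # Collective dose to the population within 50 miles of TMI, as quoted above.
    accident_collective_dose_person_sv = 20.0      # 28 March - 15 April 1979
    annual_background_person_sv = 2400.0           # same population, natural background, per year

    fraction = accident_collective_dose_person_sv / annual_background_person_sv
    print(f"Accident collective dose = {fraction:.1%} of annual background")  # about 0.8%, i.e. roughly 1%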
Countermeasures
On 30 March the Governor advised pregnant women and those with pre-school children to leave the area within a 5 mile radius of the plant until further notice, but this was in direct contrast to advice from the Pennsylvania Bureau which stated that no protective action of any kind was required. As a precautionary measure most pregnant women and young children were evacuated from the area surrounding the plant and a more general voluntary exodus also occurred (the advice to pregnant women and pre-school children was formally lifted on 9 April 1979). The President's Commission concluded (Ref 25) that whilst the extent of the media coverage was justified, a combination of confusion and weakness in the source of information and lack of understanding on the part of the media resulted in the public being poorly served. The effectiveness of Potassium-Iodide for thyroid gland protection in the event of a release of radio-iodine was well recognised and the Food and Drug Administration had authorised use of Potassium-Iodide as a thyroid blocking agent for the general public in December 1978. At the time of the Three Mile Island accident, however, Potassium-Iodide for this use was not commercially available in the United States in sufficient quantities. No supplies of Potassium-Iodide were held on site but major efforts by the Federal Government resulted in delivery of substantial supplies to Pennsylvania within 2 days. There was conjecture as to whether or not the drug should be issued. The Department of Health strongly opposed distributing the drug to the public, stating that radio-iodine levels were far below those laid down for protective action and that by then the likelihood of a high level release was diminishing. It was also considered that distributing the drug would increase public anxiety and that the possibility of adverse side-effects presented a potential public health problem in itself. Consequently, the Potassium-Iodide remained in a warehouse under armed guard throughout the emergency.
Positive Aspects of the Three Mile Island Accident
The accident at Three Mile Island is as yet the only case of a severe core damage accident in a commercial PWR and there has been extensive examination of the equipment and procedures that went wrong. It is important to note, however, that in many respects the plant and its operators performed well. The containment also survived a hydrogen burn (explosion) although it was not specifically designed for such an event. There is no doubt that if the containment had not held, the consequences of this incident would have been on a far larger scale. It can be concluded that although the Three Mile Island accident was the most serious prior to Chernobyl, the health consequences which arose were in the event relatively trivial.
Lessons Learned from the Three Mile Island Accident
In addition to the President's Commission, the US Nuclear Regulatory Commission constituted a Special Review Group (SRG) to review the lessons learned from the Three Mile Island accident (Ref 26). This study was highly critical of the overall safety culture at the plant. It highlighted the requirement for increased technical competence, for improved training, for a more effective inspection and enforcement programme and for marked improvements to emergency plans. As a direct result of the accident at Three Mile Island all nuclear operators instituted a major review of their emergency schemes. A number of changes were introduced in the UK covering:

The establishment of an Operational Support Centre (OSC) during an emergency to act as the focal point for liaison between the operator, emergency services, Government departments and other agencies such as DoE and MAFF.
The establishment of media briefing centres.
The installation of fixed gamma monitors around the perimeter of each power station to provide a time profile of any release of airborne activity (Ref 27).

The accident did not give rise to any immediate casualties at the plant and the effectiveness of the containment building ensured that the public were not significantly affected. Consequently, very little was required in the way of a medical response. The lack of Potassium-Iodide was soon remedied, with approximately a quarter of a million one-ounce bottles of the drug being provided within 2 days of the request. Although the requirement for a medical response is questionable, this accident highlights the unseen health effects arising from a nuclear emergency. The 22 years which had elapsed since the Windscale fire had seen the development of a more widespread apprehension about the safety of nuclear power. In the event this was exacerbated by poor communications, conflicting guidance and media coverage with an insufficient technical basis, all of which served to heighten public anxiety and led to high levels of psychological stress amongst many groups. In responding to a nuclear emergency we should be prepared for the widespread anxiety that would undoubtedly arise.

CHERNOBYL: 1986

General Description of the Incident
In the early morning of 26 April 1986, what is probably the worst accident in the history of commercial nuclear power generation occurred at the Chernobyl Nuclear Power Station, approximately 60 miles north of Kiev in the Ukraine. The plant involved was a 1000 MW(e) reactor of the RBMK type, which is peculiar to the Soviet Union. In 1986 there were 14 RBMK reactors in service and a further 8 under construction. The accident resulted in the destruction of the Unit 4 reactor core and much of the building which
housed it. Large amounts of radioactive fission products were released into the atmosphere, contaminating land around the station and requiring the evacuation of approximately 135,000 people. The accident occurred during a test being carried out on a turbo generator at the time of a normal scheduled shutdown of the reactor. It was intended to test the ability of the turbo generator to supply electrical energy, following loss of external electricity connections, for the short period of time until standby diesel generators could supply emergency power. Improper test procedures and serious violations of basic operating rules placed the reactor at low power in cooling conditions which could not be stabilised by manual control. Subsequent events led to the generation of steam voids which introduced positive reactivity and resulted in an increasingly rapid rise of power. Attempts were made to stop the chain reaction but a rapid shutdown was not possible since most of the control rods had been completely withdrawn from the core. The rapid energy release ruptured the fuel causing an explosion of sufficient energy to disrupt the 1000 tonne reactor cover plate. This was followed by a second explosion after 2–3 seconds which resulted in hot pieces of the reactor core and fuel being ejected from the building causing fires in surrounding areas and an ingress of air to feed the burning graphite core (Ref 28). Radioactive releases from the plant continued for several days and were not stopped until 10 May 1986.
The Soviet Emergency Response
The emergency response to the accident can be conveniently split into a number of phases:
1st Phase: The initial catastrophic failure of the core (seconds).
2nd Phase: The immediate response required to bring the accident under control (hours).
3rd Phase: The secondary response involving evacuation and measures taken to limit further release of fission products from the burning core (days).
4th Phase: The long-term measures required to make the site safe; clean-up and entombment of the damaged reactor (months).
Fire
Immediately following the accident the first priority was fighting a number of fires which had broken out in some 30 places, including the roof of the reactor building (height 71 m). Within 7 minutes firemen from the nearby towns of Chernobyl and Pripyat set out for the plant and within 1 hour the worst of the fires were under control. After 3½ hours the fires were extinguished, leaving only the graphite core burning within the reactor. The firemen who initially dealt with the fires were exposed to high doses of radiation arising from fuel and core materials ejected from the reactor by the explosion. Six of these men fell ill very quickly and subsequently died within days. As a result of the continuing graphite fire and on-going significant release of fission products, the decision was taken to cover the exposed core with boron
compounds, dolomite, sand, clay and lead. The boron was to stop any re-criticality; the dolomite gave off CO2 as it heated up, which starved the fire of oxygen; the lead absorbed heat and melted into the gaps to act as shielding; whilst the sand acted as a filter against the release of radioactive particles. Over the period 27 April to 10 May approximately 5,000 tonnes of material was dropped by military helicopters, covering the core and effectively filtering out the fine aerosol fission products (Ref 29).

Evacuation
The largest concentration of population near to the plant was approximately 10 km away at the town of Pripyat (45,000 people). During the accident itself these people were effectively sheltering since the majority of them were inside asleep. In addition, the initial plume missed the town, but on the following day (27 April) the wind direction changed and the plume moved towards Pripyat. The decision to evacuate Pripyat was made at 2.00 pm on 27 April and within 2½ hours the entire population had been removed. Later, when the control zone of 30 km was established, the remainder of the 135,000 people in that zone were evacuated within days. Some tens of thousands of cattle were also evacuated. The Russians adopt a system of emergency reference levels similar to those used in the UK: for a predicted whole body dose of 250 mSv (25 rem) evacuation would be considered, but if predicted doses approached 750 mSv (75 rem) the decision to evacuate would be certain.

Organisation
Moscow was informed of the accident within 5 hours and, according to information given in Vienna, a single Emergency Control Centre was established in the town of Chernobyl, 16 km from the site. The Emergency Controller was a senior official of the State Committee for Atomic Energy and he headed a support team of approximately 1,000 technicians. Thousands of troops were also deployed to the area to carry out a number of support activities. The central emergency organisation directed the remainder of the emergency response, including decisions regarding evacuation, food and water bans and the mitigating actions taking place at the plant.

Medical Response
The medical response got underway swiftly, with 2 teams of medical staff leaving the neighbouring town of Pripyat for the site within 20 minutes of the accident. Iodate tablets were issued on site within 1½ hours and house-to-house in Pripyat within 19 hours. The initial medical response was provided by the Chernobyl regional hospitals and institutes which served the plant. Very early on, broad clinical criteria were established to determine those who were very sick, requiring immediate hospitalisation (Group 1), those who were not very sick (Group 2) and those with no symptoms of radiation syndrome (Group 3). One hundred and thirty persons were assigned to Group 1 and these were all hospitalised locally by the end of the first day, with 129 later being sent to specialist hospitals in Moscow and Kiev. Eventually 203 people were found to have acute radiation syndrome of either second, third or fourth (the most severe) degree (see Table 3), but these cases were confined to firemen and plant workers and there were none amongst the general public.
TABLE 3 Degree of radiation syndrome
In addition to a number of the firemen, some of the plant emergency personnel received high doses (greater than 1 Gy). Five persons received significant thermal burns, but there were many more cases of beta radiation burns³. By 0600 hours on 26 April (5 hours after the initial explosion) one plant worker had died from severe thermal burns, one was never found and 108 people had been hospitalised. A further 24 were admitted later (Ref 30). Once the patients suffering from acute radiation syndrome had been hospitalised, additional investigations of blood and bone marrow were performed and radiation dose assessments made using chromosome aberration analysis techniques. Bone marrow transplants were undertaken for the worst affected victims but, in addition to the 2 deaths reported to have occurred immediately following the accident, a further 29 fatalities were subsequently reported amongst those hospitalised and diagnosed as suffering from acute radiation syndrome. The medical care which was made available on a short timescale, including blood transfusion, chemotherapy, antibiotics and techniques to prevent infection, did, however, appear to have been effective in limiting the number of fatalities. Within a few days over 400 medical teams of doctors, nurses, assistants and medical students (5,000 in total) had been assembled to assist in hospitals and to carry out checks amongst evacuees. By 10 May several hundred thousand people had been medically examined.
³ Fifty-six people received burns to more than 1% of their body surface, and the majority of these were beta radiation burns. They were mainly due to highly beta-active contaminants being entrained on wet clothing and hence in close contact with the skin. This gave rise to high localised doses of beta radiation and non-stochastic effects similar to thermal skin burns. The time at which the symptoms manifest themselves is not well determined, however, and can range from a few hours to a number of weeks.
There is no doubt that a detailed analysis of medical effects and treatments arising from exposures received during the Chernobyl accident would greatly increase the knowledge of radiation-related injuries. In addition, long-term follow-up will be required not only of patients suffering acute effects but also of the 135,000 people evacuated from the 30 km radius exclusion zone. Although such checks can be considered a continuing burden or legacy resulting from the Chernobyl incident, the study may help provide much needed information relating to the biological effects of low doses of ionising radiation. Initial assessments suggest, however, that over the next 70 years the spontaneous incidence of all cancers amongst the 135,000 evacuees is not likely to increase by more than approximately 0.6% (Ref 28). On a wider scale, approximately 70 MCi (megacuries) of activity (excluding noble gases) were released during the course of the accident, some of which was spread over much of western Europe. In Britain restrictions were imposed on the slaughter of lambs grazing on caesium-contaminated grass in parts of Wales, Cumbria and Scotland. The additional average dose in Britain during the year following the accident is estimated to have been of the order of 3.5% of the annual dose due to natural background radiation (70 µSv compared to approximately 2000 µSv). This is similar in magnitude to the increase caused by atomic weapons testing in the early 1960s.
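The UK dose comparison above is again a simple ratio; the short sketch below (Python, using only the figures quoted in the text) restates it.

    # Additional average dose in Britain in the year after Chernobyl, versus natural background.
    additional_dose_usv = 70.0       # microsieverts, as quoted above
    background_dose_usv = 2000.0     # approximate annual natural background, microsieverts

    print(f"Additional dose = {additional_dose_usv / background_dose_usv:.1%} of annual background")  # 3.5%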
Lessons Learned from the Accident at Chernobyl
The Chernobyl accident was so specific to the Soviet RBMK reactor design that there are very few lessons for the United Kingdom to learn from it in terms of plant design and safety features. The main effect has been to reinforce and reiterate the importance and validity of UK safety standards and procedures. It is considered that a reactor accident of the type that occurred at Chernobyl could not happen in the United Kingdom (Ref 31). There is no doubt that the accident resulted in a tragic loss of life and, whilst there may not be a lot to be learned in the fields of reactor design and operation, the accident should provide valuable information for medical and other emergency services. Experience in the treatment of acute radiation syndrome and of beta radiation skin burns has been greatly increased and should become widely available once the final analysis has been completed. The results will help optimise therapeutic schemes and the experience gained will be valuable for the successful handling of any similar major emergency, should this ever be required.

A SUMMARY OF EXPERIENCE GAINED FROM THE MAJOR NUCLEAR ACCIDENTS
Table 4 summarises the main characteristics of each of the 3 major accidents referred to above. They differ from each other in many respects and provide no definitive guidance relating to any future emergencies. Perhaps the main feature to note is that in all cases the emergency lasted for several days. In other respects the experience gained can be summarised under three major headings:

(1) Treatment of Radiation Injury/Acute Radiation Syndrome
There is no well-defined, systematic course which all of these nuclear accidents followed and consequently no consistent medical response. They varied widely in their severity, in the number of people affected and in the type of injuries received.
TABLE 4 A summary of the main characteristics of the three reactor accidents
The Chernobyl accident, however, clearly demonstrated the potential for a wide range of effects amongst plant workers and emergency personnel, including:

Conventional Injuries
Thermal Burns
Beta Radiation Burns
Contaminated Casualties
Casualties Exposed to High Doses of Radiation

The incident at Chernobyl resulted in personnel receiving radiation exposures sufficiently high for them to experience acute radiation syndrome. Whilst the accident resulted in a tragic loss of life, the experience gained in treating these patients and the techniques used to mitigate the consequences of their injuries will be of medical value. Similarly, knowledge has been improved in the area of radiation-induced skin burns. The Chernobyl accident in particular highlighted the limitations of the protective clothing which was available for emergency use.

(2) Development of Techniques to Control the Emergency
Each of the nuclear accidents considered has contributed to the development of control techniques for major nuclear emergencies. The 1957 Windscale fire highlighted the importance of the radio-iodines and in the course of time this led to the development of Emergency Reference Levels. In 1979 the Three Mile Island accident highlighted the inadequacies of local emergency plans but did, however, show the importance of good robust engineering design, which enabled the reactor containment vessel to withstand the accident. In 1986 the Chernobyl incident provided insight into the potential scale of consequences that can arise from a major nuclear accident. The scale and range of the emergency response required at Chernobyl prompted a review of the UK emergency arrangements. The Prime Minister in December 1988 in a written Parliamentary answer "confirmed the availability of contingency plans which would permit an effective response to be made to any nuclear accident, including those with more widespread effects than the specific site and off-site plans are designed to cater for" (ie the 'reference' accident). On a wider scale Chernobyl also introduced the world to the concept of nuclear transfrontier pollution.

(3) Effects On, and Response of, the Public
At the time of the Windscale accident in 1957 reports suggest that the public's reaction in the neighbourhood of the plant was fairly low key. This was in marked contrast to the accident at Three Mile Island 22 years later which, whilst it did not give rise to significant releases of activity to the environment, did nevertheless produce a significant problem of psychological stress and anxiety amongst people living nearby. That accident also highlighted the inadequacies of controllers and of the media in providing the public with clear and consistent advice regarding their best interests.
The accident at Chernobyl served to underline these inadequacies in communications with the public. The number of phone calls received by the National Radiological Protection Board and other similar agencies provided an indication of the perceived risk to which people in the UK considered themselves to be exposed. Quite often fears were disproportionate to the actual technical or statistical likelihood of harm. This phenomenon was perhaps best demonstrated by the Goiânia incident (discussed later in these proceedings by Mr J R Croft of the NRPB) which, apart from the direct effects of the accident, gave rise to far-reaching socio-economic consequences. Goiânia provided a unique opportunity to examine the effects of perceived risk and stigma under actual conditions.
FUTURE IMPLICATIONS
The locations of the commercial nuclear power reactors in the UK, and of others operated by the UKAEA and BNFL plc, are shown in Figure 4 and their characteristics are summarised in Table 5. The general principles which determine the safety of nuclear plant in the UK were discussed in Section 1 and these principles are applied to the plant listed in the Table. The inquiry into the CEGB's application to build the Sizewell 'B' plant provides a recent detailed illustration of how these principles are applied in practice. The Public Inquiry opened on 11 January 1983 and ran for 340 days. The Inspector produced a report to the Secretary of State which contained 109 chapters, of which 44 dealt with various safety matters (Ref 32). The main emphasis in the Sizewell 'B' safety case was on those incidents arising from relatively frequent faults which the plant is designed to deal with following the principles outlined in Section 1. It is shown that the severity and frequency of these satisfy the criteria illustrated in Figures 1 and 2 and are such as to be unlikely to require any public countermeasures. A preliminary analysis of the likelihood and consequences of more severe accidents was, however, also presented to the Inquiry (Ref 32). This showed that the estimated frequency of an accident arising from internal faults and involving major fuel damage was about 10⁻⁶ per year. Containment failure and a subsequent large release of radioactivity into the environment was predicted for a small proportion of these cases. Consideration of external events (such as earthquakes and aeroplane crashes) and human error would increase these values somewhat but, in general, it may be claimed that the risk of an accident leading to an uncontrolled release is in the region of 10⁻⁵ to 10⁻⁶ per annum. The HSE has recently examined the tolerability of risk (Ref 33). It suggests that individual risks of death of 10⁻⁶ per year due to stochastic doses received following a nuclear accident might be broadly tolerable and that plant designed to the Principles and Criteria described above comfortably satisfies such a criterion. In societal terms the HSE suggests that a major accident (involving doses of 100 mSv out to 3 km) anywhere in the UK might be accepted as just tolerable at a frequency of 10⁻⁴ per year.
Figure 4. Britain’s Nuclear Power Stations In very broad terms we may therefore conclude that the chance of an accident occurring at a UK nuclear power plant is distributed as follows Class I
Serious accident but with on-site implications only
:
Around 1 in a few thousand per year
Class II
Accident with off-site consequences requiring public counter-measures
:
Around 1 in a few tens of thousands per year
Class III
Major accident with severe off-site consequences
:
Around 1 in a million per year
TABLE 5 Britain's nuclear power stations—1989
NOTES: *Design Output

Number of People Involved
The number of plant personnel likely to be affected by an accident depends not only on the class of accident but also on the number of people present on site. The number of people working on a reactor site depends somewhat upon the reactor type and its operational state and, more importantly, upon the time of day and week. During normal day shift there may be
up to a few hundred people present on a modern reactor site. This number falls during other hours and at the weekend to a level which may be less than 100, depending upon shift maintenance staffing levels, etc. The number of people involved in any on-site emergency response depends upon the scale of the incident. Twenty-eight people fought the fire at the Chernobyl reactor (Ref 30), but it is likely that the response in the UK would be greater. They would also be accompanied by a similar number of medical/ambulance staff. It is important to note that emergency response crews would be monitored and relief crews organised so as to ensure that doses received by individuals do not exceed non-stochastic limits. This subject is currently being addressed and it has been suggested (Ref 34) that substantial efforts should be made to keep doses to workers and emergency services persons on site to levels below those at which non-stochastic effects may occur, ie 0.5 Gy to the whole body, or 5 Gy to any organ or tissue which may be preferentially exposed. The number of people living in the vicinity of a nuclear power plant varies in detail from site to site. It is somewhat dependent upon the age of the station and the siting criteria used at the time that the station was built. All Magnox stations were built on relatively remote sites, but some of the later AGR stations, which were considered safer due to increased pressure vessel integrity, were built nearer to urban areas than had previously been permitted. Assessments of the radiological consequences of a release of radioactivity from a given site are undertaken using complex computer models. These models represent the atmospheric dispersion and deposition of radioactivity for any weather sequence and evaluate the implications for the particular local population distribution. Clearly the magnitude of any effects is dependent on the size of the release. In general we estimate that up to a few thousand people might be affected by Class II accidents, rising to a few tens of thousands for Class III. It is important to note, however, that no early health effects occurred amongst members of the public at Chernobyl and the chance of any occurring in the UK is remote.
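The dose-control principle for emergency crews described above (relief crews rotated so that no individual approaches the non-stochastic thresholds of 0.5 Gy to the whole body or 5 Gy to any preferentially exposed organ) can be expressed as a simple check. The Python sketch below is illustrative only: the thresholds are those suggested in Ref 34, but the function and record layout are hypothetical and form no part of any actual emergency scheme.

    # Illustrative dose-control check for emergency crew rotation (hypothetical helper).
    WHOLE_BODY_LIMIT_GY = 0.5    # suggested ceiling to avoid non-stochastic effects (Ref 34)
    ORGAN_LIMIT_GY = 5.0         # ceiling for any preferentially exposed organ or tissue

    def needs_relief(whole_body_dose_gy, max_organ_dose_gy, planned_increment_gy):
        """Return True if the next task would take a crew member towards either threshold,
        in which case a relief crew member should take over."""
        return (whole_body_dose_gy + planned_increment_gy >= WHOLE_BODY_LIMIT_GY
                or max_organ_dose_gy >= ORGAN_LIMIT_GY)

    # Example: a worker with 0.35 Gy whole-body dose facing a task estimated at a further
    # 0.2 Gy should be relieved before the task begins.
    print(needs_relief(0.35, 0.1, 0.2))   # True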
Requirements for an Effective Medical Response
As recent non-nuclear incidents have shown, there is no shortage of excellent people able to respond to major accidents at short notice and under adverse conditions. Studies of the nuclear accidents to date, however, demonstrate the wide diversity and scale that these accidents can take. It is essential, therefore, that any response be flexible so as to accommodate the wide range of potential accidents and the diversity of patients' injuries that could arise. The response may conceivably require the treatment of contaminated casualties, of those who have received high external doses of radiation and of those with conventional injuries. In addition, there would be many in the area who, although not acutely affected by the incident, may suffer stress-related effects for some time after the incident.
CONCLUSIONS

1. As discussed in Section 1, nuclear facilities are designed, built and operated to extremely high levels of safety. This, together with the strict application of
regulatory controls, ensures that major accidents in nuclear plants can occur only with a very low probability. The UK has enjoyed hundreds of reactor years of commercial nuclear power generation without serious accident.

2. As the response to recent non-nuclear incidents has demonstrated, effective emergency plans are in place to deal with large-scale emergencies. Emergency plans in place at nuclear establishments can be seen as an extension of those already in existence at Local Authority level for civil emergencies. Arrangements must, however, be made for dealing with injuries peculiar to accidents involving radioactivity, ie for contaminated patients and those exposed to high doses of radiation.

3. Examination of previous nuclear incidents shows no pattern. There is a marked difference in scale and number of persons affected and a great diversity in the range of effects. Similarly, the timescale over which the accidents develop can vary. There may be a catastrophic initial event, as occurred at Chernobyl, or the accident may develop over a period of days, as occurred at Windscale and Three Mile Island. Protracted accidents such as these invariably lead to a build-up of tension amongst people in the vicinity.

4. The medical emergency response to nuclear accidents in the UK needs to be based on flexibility. There is potential for a wide range of effects amongst plant workers or Emergency Response Teams, including:

Conventional Injuries
Thermal Burns
Beta Radiation Burns
Contaminated Casualties
Casualties Exposed to High Doses of Radiation

The number of personnel exposed in this group will vary but it may be as high as about a hundred.

5. As with the Chernobyl incident, it is important that facilities are available for the treatment of patients who have received high external doses of radiation. This was one of the main lessons learned from Chernobyl and it is important that these facilities are available in the UK.

6. Rapid evaluation of the high doses received by operators and emergency personnel is required to assist in rapid decisions regarding treatment procedures.

7. It is unlikely that any immediate radiological health effects will occur amongst members of the public. Countermeasures for public protection may be required, however, and medical staff would be expected to participate in such considerations. In particular, the decision to issue potassium iodate tablets to the public would need to be made in conjunction with the Local Health Authority.
8. In general the number of people directly affected by countermeasures could range from a few hundred to several thousand. Public concern about radioactivity and radiation can give rise to psychological problems, and medical counselling of this group, as well as of many others not directly involved, may be required.

9. It is essential that, in the event of a nuclear accident, arrangements are in place for both the control organisation and the media to provide the public with reliable information concerning their best interests. Medical counsellors must also be provided with appropriate details of the accident and its consequences.

10. The programme of public education must be continued to enable the public to appreciate the risks associated with nuclear plant accidents. This should go some way to avoiding unnecessary alarm and anxiety. As the incidents at Three Mile Island, Chernobyl and Goiânia have shown, the public see risks from nuclear accidents as being far-reaching and quite often their reaction to the perceived risk is greater than the actual risk would warrant.
REFERENCES

1. Gronow, W.S., HM Nuclear Installations Inspectorate. Safety and Siting of Nuclear Power Plants in the United Kingdom. National Radiological Protection Board—Advanced Radiological Protection Course. NRPB–LN141, 1987.
2. Tyror, J.G., Garnsey, R., Hicks, D. Severe Accident Research in the United Kingdom. Paper presented at the 16th Water Reactor Safety Meeting, Gaithersburg, October 1988.
3. Nuclear Installations Inspectorate. Safety Assessment Principles for Nuclear Power Reactors. HMSO, London (1979).
4. Central Electricity Generating Board. Design Safety Criteria for CEGB Nuclear Power Stations. Rep HS/R 167/81, Revised, CEGB, London (1982).
5. Central Electricity Generating Board. PWR Design Safety Guidelines. Rep DSG2, Issue A, CEGB, London (1982).
6. International Atomic Energy Agency. The Radiological Accident in Goiânia, September 1988. Vienna, 1988.
7. International Atomic Energy Agency. Safety Series No 75—INSAG 3: A Report by the International Nuclear Safety Advisory Group—Basic Principles for Nuclear Power Plants. Vienna, 1988.
8. Health and Safety Executive. Emergency Plans for Civil Nuclear Installations. HMSO, London, 1982.
9. Barrett, A., Glasgow Institute of Radiotherapeutics and Oncology. Treatment of Severe Radiation Injury (September 1984). National Radiological Protection Board—Advanced Radiological Protection Course. NRPB–LN123, 1987.
10. National Radiological Protection Board. Emergency Reference Levels: Criteria for Limiting Dose to the Public in the Event of Accident Exposure to Radiation. NRPB–ERL2, 1981.
11. Ramsden, D., Passant, F.H., Peabody, C.P., and Speight, R.G. Radioiodine Uptakes in the Thyroid: Studies of the Blocking and Subsequent Recovery of the Gland following Administration of Stable Iodine. Health Physics 13, 633–46, 1967.
12. Stratton, W.R., Los Alamos Scientific Laboratory of the University of California. A Review of Criticality Accidents. LA 3611, 1967.
13. Knief, R.A. Nuclear Criticality Safety, Theory and Practice, 1985.
14. Paxton, H.C. Historical Perspective of Nuclear Criticality Safety in the United States. Proc. ANS Topl. Mtg. Nuclear Criticality Safety, El Paso, Texas, 8–10 April 1980. SAND80–1675, p. 21, Sandia National Laboratories.
15. Quinten, A., Dunster, H.J., UKAEA, Risley. Report on the Health and Safety of Employees of the UKAEA, 1957. IGS–R/R–4.
16. HMSO, London. Accident at Windscale No 1 Pile on 10 October 1957. November 1957.
17. Fraser, P., Booth, M., Beral, V., Inskip, H., Firstat, S., and Speak, S. Collection and Validation of Data in the United Kingdom Atomic Energy Authority Mortality Study. British Medical Journal, Vol 291, pp 435–439, 1985.
18. Beral, V., Inskip, H., Fraser, P., Booth, M., Coleman, D., Rose, G. Mortality of Employees of the United Kingdom Atomic Energy Authority, 1946–1979. British Medical Journal, Vol 291, pp 440–447, 1985.
19. Smith, P.G., and Douglas, A.J. Mortality of Workers at the Sellafield Plant of British Nuclear Fuels. British Medical Journal, Vol 293, pp 845–854, 1986.
20. Herbert, R. The Day the Reactor Caught Fire. New Scientist, pp 84–87, 14 October 1982.
21. HMSO, London. Final Report of the Committee Appointed by the Prime Minister to Make a Technical Evaluation of Information Relating to the Design and Operation of the Windscale Piles and to Review the Factors Involved in the Controlled Release of Wigner Energy, July 1958.
22. Medical Research Council (MRC), 1975. Criteria for Controlling Radiation Doses to the Public After Accidental Escape of Radioactive Material. HMSO, London.
23. Baverstock, K.F. and Wenart, J. Medical Research Council (MRC). Emergency Reference Levels for Reactor Accidents: A Re-Examination of the Windscale Reactor Accident. Health Physics, Pergamon Press, 1976, Vol 30 (April), pp 339–344.
24. Crick, M.J., Linsley, G.S., National Radiological Protection Board. An Assessment of the Radiological Impact of the Windscale Reactor Fire, October 1957. NRPB–R135, Chilton, Oxon, 1983.
25. Kemeny, J.G. (Chairman). The President's Commission on the Accident at Three Mile Island, the Need for Change: The Legacy of Three Mile Island. Washington DC, October 1979.
26. US Nuclear Regulatory Commission, Office of Inspection and Enforcement. Report of Special Review Group, Office of Inspection and Enforcement, on Lessons Learned from Three Mile Island. NUREG–0616.
27. International Atomic Energy Agency. Emergency Planning and Preparedness for Nuclear Facilities. Proceedings of a Symposium, Rome, 4–8 November 1985. Vienna, 1986.
28. International Nuclear Safety Advisory Group. Summary Report on the Post-Accident Review Meeting on the Chernobyl Incident. International Atomic Energy Agency, Vienna, 1986.
29. Collier, J.G., Myrddin-Davies, L. Chernobyl. Central Electricity Generating Board, October 1986.
30. Mould, R.F. Chernobyl—The Real Story. Pergamon Press, 1988.
31. United Kingdom Atomic Energy Authority. The Chernobyl Accident and its Consequences. NOR4200, Second Edition, London, April 1988.
32. Sizewell 'B' Public Inquiry Report: Report by Sir Frank Layfield. HMSO, ISBN 0 11 411576 1, 1987.
33. Health and Safety Executive. The Tolerability of Risk from Nuclear Power Stations. HMSO, London, 1988.
34. Hill, M.D., Wrixon, A.D., Webb, G.A.M. Protection of the Public and Workers in the Event of Accidental Releases of Radioactive Materials into the Environment. Journal of Radiological Protection, Volume 8, Number 4, December 1988.
APPENDIX 1

Gray (Gy) - Unit of absorbed dose of radiation = 1 J/kg = 100 rads.
Sievert (Sv) - Unit of radiation dose equivalent = 100 rems.
Stochastic - Describes effects for which the probability of occurrence in an exposed population (rather than severity in an affected individual) is a direct function of dose; these effects are commonly regarded as having no threshold. Hereditary effects are regarded as being stochastic; some somatic effects, especially carcinogenesis, are regarded as being stochastic.
Non-stochastic - Describes effects whose severity is a function of dose; for these, a threshold may occur. Non-stochastic somatic effects include cataract induction, non-malignant damage to skin, haematological deficiencies and impairment of fertility.
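Since the paper moves freely between the older units and the SI units defined above, a pair of conversion helpers may be useful; the Python sketch below is based solely on the definitions in this appendix.

    # Conversions implied by the definitions above: 1 Gy = 100 rad, 1 Sv = 100 rem.
    def rem_to_msv(rem):
        return rem * 10.0       # 1 rem = 0.01 Sv = 10 mSv

    def rad_to_gy(rad):
        return rad / 100.0      # 100 rad = 1 Gy

    print(rem_to_msv(30))   # 300.0 mSv, the thyroid ERL quoted for the Windscale fire
    print(rad_to_gy(50))    # 0.5 Gy, the suggested whole-body ceiling for emergency workers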
SETTING THE SCENARIO—POTENTIAL HAZARDS OF THE NUCLEAR FUEL CYCLE

by R J BERRY and N McPHAIL
Health and Safety Directorate, British Nuclear Fuels plc, Risley, Warrington, Cheshire WA3 6AS
The previous paper has outlined the hazards of the operation of nuclear reactors as sources of electrical power; in the UK the majority of these are operated by the Central Electricity Generating Board and the South of Scotland Electricity Board. British Nuclear Fuels plc, which was created in 1971 from the former production division of the United Kingdom Atomic Energy Authority, is, with its French opposite number COGEMA, one of the Western world's only two suppliers of the full range of nuclear fuel cycle services. In addition, BNFL owns and operates two nuclear power stations, Calder Hall on the Sellafield site and Chapelcross in southern Scotland. The geographical locations of BNFL's plants in North West England and southern Scotland are shown in Figure 1 and the elements of the nuclear fuel cycle are shown in Figure 2. The "front end" activities of the fuel cycle are carried out at BNFL's plants at Springfields and Capenhurst. Uranium arrives at Springfields as ore which has been mined overseas. It is purified and undergoes various chemical and mechanical processes to make the fuel which is used in nuclear reactors.
Figure 1
Figure 2
Magnox reactors, which have now been operating successfully for more than 30 years in the UK and overseas, use natural uranium which has been cast and machined into metal fuel rods encased in magnesium-aluminium alloy (magnox) cans. The more modern and more efficient Advanced Gas Cooled Reactors and Pressurised Water Reactors require uranium fuel which has been "enriched" by increasing the proportion of the fissile isotope Uranium–235. Uranium enrichment is carried out at BNFL's Capenhurst site using a cascade of gas centrifuges which separate the lighter Uranium–235 (less than 1% of natural uranium) from the heavier and more prevalent Uranium–238. The enriched uranium is returned to Springfields, where it is manufactured into ceramic uranium oxide pellets which are then encased in stainless steel cans; these individual fuel "pins" are then assembled into complete fuel assemblies for use in nuclear power stations throughout the UK.
The potential hazards associated with operations at the Capenhurst and Springfields plants are mainly of a chemical nature, due to toxic chemicals handled in various parts of the process, including hydrogen fluoride and uranium hexafluoride, which decomposes in the atmosphere yielding hydrogen fluoride and uranyl fluoride. Whilst there is an associated radiological risk, it is of a very much lower magnitude and comes from the low levels of radiation naturally emitted in the spontaneous radioactive decay of the uranium in uranyl fluoride. The only significant potential radiological hazard at these sites which could lead to major doses to the workforce is from the inadvertent bringing together of a critical mass of uranium such that fission can take place (a criticality incident/accident). As discussed later in the paper, the plants are designed to minimise this possibility and operating rules further reduce the chance of such untoward occurrences. A criticality accident would be extremely unlikely to have any off-site consequences for the general public, but is the major radiological hazard to the workforce.
After several years in a nuclear reactor, the nuclear fuel becomes depleted in the fissile isotope Uranium–235 and needs to be replaced. When removed from the reactor, the fuel is both heat generating and intensely radioactive and is normally stored underwater. An initial cooling period allows some of the short-lived radioactivity to decay and the fuel is then transported to BNFL’s reprocessing complex at Sellafield in extremely robust transport containers known as transport flasks. On arrival, the spent fuel is stored for a further period underwater prior to reprocessing, to allow further decay of some of the shorter lived radioactive products of nuclear fission.
Reprocessing consists of a series of chemical separation processes in which the uranium fuel rods are stripped of their encasing cans and treated to separate residual uranium (at least 96% by weight of the total) from the by-product plutonium (about 1% of the total mass) and highly radioactive but non-useful waste (less than 3% of the total). The highly active waste is retained in safe storage while the reclaimed uranium is recycled for use again as fuel. Of the existing fuel in Britain's AGR power stations, some two thirds has come from material recycled from the earlier magnox reactors. The current reprocessing plant started operating in 1964 and is designed to reprocess fuel from magnox reactors. The Thermal Oxide Reprocessing Plant (THORP), a massive new complex to reprocess fuel from AGRs and PWRs both in the UK and overseas, is now being constructed at Sellafield and is expected to be operating in the early 1990s. High-level (heat-generating) and intermediate-level radioactive wastes separated during reprocessing are at present stored on the Sellafield site. Plants currently under construction will convert liquid high-level and intermediate-level wastes into glass blocks and cement monoliths respectively. Low-level radioactive waste is disposed of safely in near-surface trenches at the nearby Drigg site, and liquid waste via the sea pipeline. Both disposal routes are strictly controlled by a system of authorisations from Government Departments.
In contrast to the nature of the hazards at the “front-end” of the nuclear fuel cycle, those associated with reprocessing are mainly of a radiological nature and arise from the possibility of an aerial release of radioactivity into the atmosphere, which is then carried off-site. An accidental release of radioactive material in liquid form is unlikely to be a major short term health risk to either the workforce or the general public. As discussed for the “front-end” plants, however, the storage of nuclear materials is always associated with a risk of criticality accident which could be of consequence to workers, although unlikely to have any consequences off-site to the general public.
Restricting the discussion for the balance of this paper to Sellafield, as the site which has the greatest potential for radiological hazard, it is important to understand the system of safeguards designed to prevent radiological accidents. BNFL as operators would wish, and the general public have every right to demand, that the possibility of an accident leading to the release of sufficient radioactive material to cause interference with the normal activities of the public, or to require special arrangements to be made to protect the workforce, should be extremely remote. The "safety cycle" designed to ensure this high level of safety is shown in Figure 3. During the design of plant, safety aspects are treated as being of great importance and all credible combinations of foreseeable events which could lead to an accident are taken into account. Appraisals of safety are carried out at all stages from the initial design concept through to plant start-up and are continued during operation of all plant. All such safety appraisals and safety procedures are kept under review by a nuclear safety committee which includes independent members, and the whole safety system is subject to corporate audit. In addition, the design, construction and operation of all such plant are assessed independently by the Nuclear Installations Inspectors of the Health and Safety Executive. Their permission is required to commence construction of, and subsequently to operate, any plant handling nuclear material.
Figure 3
As with any industrial process, there will always remain a risk, no matter how small, of releases of radioactive materials from the Sellafield plants; these include the fuel cooling ponds, the highly active waste storage tanks and the plutonium stores. Each of these has extensive and duplicated safety features to prevent radiological accidents. For example, the highly active waste storage tanks contain concentrated, highly radioactive, liquid waste from reprocessing. Because of the intense radioactivity, the liquid is self-heating and therefore has to be kept cool. If all cooling were lost and the design of the tank allowed boiling of the liquid to occur, amounts up to hundreds of TBq of radioactivity could be released into the atmosphere in a single event. The highly active waste storage tanks are designed so that the probability of such a release is exceedingly small. Each tank has ample spare cooling capacity to cope with the heat generated by the radioactive waste; there are four separate sources of cooling water available and three separate sources of electricity to operate the cooling water pumps. In addition, spare tanks are always kept empty on site into which liquid could be transferred if significant failures occurred in the cooling coils of a particular tank. Even in the event of all the cooling mechanisms failing, it would take several hours for the contents to reach boiling point, and days before boil-away, and hence release of the radionuclides to the atmosphere, was complete. This would allow time for measures to be taken to prevent any release of these materials to the environment.
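The "several hours to boiling" argument above rests on the large thermal capacity of a tank relative to its decay heat. The Python sketch below illustrates only the form of that estimate; none of the figures are plant data, they are round numbers assumed purely for illustration.

    # Purely illustrative heat-up estimate; all figures are assumptions, not plant data.
    mass_kg = 100_000.0        # assumed mass of liquor in a tank
    heat_capacity = 4000.0     # J/(kg*K), assumed, roughly that of an aqueous liquor
    decay_heat_w = 1.0e6       # assumed self-heating power after loss of all cooling
    temp_rise_k = 30.0         # assumed margin between operating temperature and boiling

    seconds_to_boil = mass_kg * heat_capacity * temp_rise_k / decay_heat_w
    print(f"About {seconds_to_boil / 3600:.1f} hours to approach boiling")  # of the order of hours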
Plutonium which has been separated from the spent fuel during reprocessing is also a potentially valuable fissile material. It can be used, mixed with uranium, as fuel in modern thermal nuclear reactors, but it would be far more efficiently used as a fuel for future generations of fast reactors, to ensure the long term availability of economic supplies of electricity after easily obtainable new sources of uranium ore have been worked out. The plutonium, in the form of
plutonium oxide powder packaged in stainless steel cans, is stored on the Sellafield site in an impressively sturdy monolithic building. The potential hazards from the stored plutonium are the release of this material to the atmosphere via a failure of one or more of the stainless steel cans, or a criticality event resulting from the accidental bringing together of a mass of plutonium sufficient to sustain the nuclear chain reaction. The store and the packaging are both designed so that this kind of criticality event cannot occur, because of the geometry of the packages and the way in which they are positioned. In the very unlikely event of a criticality, the potential hazard would be direct radiation exposure to the workers in the store, leading, in the almost unimaginable worst case, to doses large enough to cause the acute radiation syndrome in the workers so exposed. A potential hazard to the general public exists only if plutonium dust (or, in the event of a criticality, fission products) escapes to the atmosphere following an accident in which the stainless steel cans packaging the plutonium oxide powder were breached. The possible effects on both groups are also reduced by countermeasures, including automatic criticality and airborne radioactivity detection devices within the store and a filtered ventilation system to prevent egress of radioactive material to the environment. There is redundancy and diversity in the engineering design provisions and in the air filtration systems, and administrative arrangements are also designed to minimise the chance of accident. In the unlikely event of a major accident, the impact on the workforce and general public would be ameliorated by invoking emergency arrangements involving the local and national authorities. These arrangements include consideration of evacuation, sheltering, food bans, etc.
EMERGENCY PLANNING
Notwithstanding the complex and effective safety systems in place to prevent accidents, an emergency scheme has been drawn up to ensure that the necessary organisation is available should an accident lead to the escape of sufficient radioactive material, or to a sufficiently high radiation dose rate on site, to cause interference with the normal activities of the site or the general public. Safety studies have been carried out for those plants at Sellafield with the greatest potential for creating an off-site hazard in order to determine the size of incident for which detailed emergency planning is required. Emergency arrangements based upon these assessments are drawn up with the relevant local organisations such as the Police, taking account of such factors as local geography and demography. The overall off-site emergency scheme is then designed with sufficient flexibility to be capable of extension. Planning for immediate actions such as sheltering or evacuation of the general public (and the issue of stable potassium iodate tablets should a reactor accident occur and radioactive iodine be a major potential source of public exposure) is based on the Emergency Reference Levels (ERLs) promulgated by the National Radiological Protection Board (NRPB).
For foreseeable accidents on site, the likely off-site consequences at Sellafield are restricted to evacuation of persons from the site and out to two kilometres, involving only some tens of persons other than the site workforce. Evacuation would be considered if the whole body dose saving to the general public was in excess of 100 mSv and would definitely be implemented if the dose saving exceeded 500 mSv, these being the lower and upper ERLs currently specified by NRPB. The actual dose saving at which action would be implemented would depend on the circumstances on the day. Even in the worst case involving release of radioactive material, the release would be expected to take place over a significant time period (hours), further reducing the total radiation dose and its effect once countermeasures are taken.
Beyond two kilometres, sheltering indoors would be recommended for members of the general public up to a few kilometres from the site; this countermeasure would be considered if the predicted dose saving was likely to exceed 5 mSv to the whole body, and would definitely be implemented if the dose saving exceeded 25 mSv. These levels of radiation are incapable of producing clinically detectable radiation injury.
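The two paragraphs above describe the same ERL logic applied with different thresholds for evacuation and for sheltering. A compact Python sketch of that decision rule is given below; the thresholds are those quoted in the text, but the function itself is illustrative and is not part of the Sellafield emergency scheme.

    # Illustrative ERL-based countermeasure logic; thresholds (mSv of averted whole-body dose)
    # are the lower/upper ERLs quoted above.
    ERLS_MSV = {
        "evacuation": (100, 500),
        "sheltering": (5, 25),
    }

    def countermeasure_advice(measure, predicted_dose_saving_msv):
        lower, upper = ERLS_MSV[measure]
        if predicted_dose_saving_msv >= upper:
            return "implement"
        if predicted_dose_saving_msv >= lower:
            return "consider (depends on circumstances on the day)"
        return "not indicated on dose grounds"

    print(countermeasure_advice("sheltering", 30))    # implement
    print(countermeasure_advice("evacuation", 150))   # consider (depends on circumstances on the day)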
A recent discussion paper circulated by NRPB staff reviews radiological protection in relation to accidents in the light of experience gained following the Chernobyl accident and of possible revisions of risk factors. These ideas, if subsequently adopted, would lead to a reduction in the doses at which countermeasures would be implemented.
CONSEQUENCES OF A MAJOR ACCIDENT
As we have seen, the mechanisms by which the activities of the workforce or the general public would be affected following a major accident at the Sellafield complex would be the release of radioactive material to the atmosphere or a criticality accident. For an aerial release, the potential deleterious effect would be mainly from the inhalation of the emitted radioactive gases. Effects of direct radiation and of material ingested with food would be much smaller. The risks to the workforce from an aerial release are, first, that the initial incident occurs before protective countermeasures (such as evacuation, sheltering or donning protective equipment) can take place, and, second, that additional exposure will be received by recovery teams of workers who are attempting to bring the incident under control. For the former, risks are limited by the design of the plant and the ready availability of countermeasures, and for the latter by careful monitoring, use of appropriate protective equipment, restriction of exposure time, etc. The risk to the general public following a release to the atmosphere of
radioactive materials is limited by the imposition of countermeasures such as sheltering and evacuation in the short term, and by control of the availability of contaminated foodstuffs, where this occurs, in the longer term. Even for the most serious accidents, the maximum dose which any individual on or off site is likely to receive as a result of an aerial release of radioactive material is of the order of a few hundred mSv, allowing for the effective implementation of the emergency scheme. At this dose level, no member of the general public would be expected to show any evidence of the acute radiation syndrome; no one should require hospitalisation or even medical treatment as a result of radiation injury from such a release. The possible increased radiation exposure to recovery teams on site should be limited by control procedures, and thus acute radiation injuries should not form part of the initial clinical problem. As in any industrial accident, mechanical injury, trauma of various sorts, thermal burns, etc, may well form part of the casualty pattern following a major accident. Members of the recovery team may, in the worst case, have received localised skin doses which might result in minor radiation burns, but these will not be apparent in the short term.
The possibility that casualties will be contaminated with radioactive material does have to be considered, however, and hospital services which receive such casualties must have facilities and training for their reception and handling. For a criticality accident, the hazard is from intense local neutron and gamma radiation in the area concerned, which poses an acute radiation syndrome risk to the workforce in the immediate vicinity. Because dose rates from direct radiation fall rapidly with increasing distance from the source, and because they are reduced by shielding with concrete and other materials, this hazard really is applicable only to personnel in the immediate vicinity. A severe unshielded criticality incident, in which a small number
of workers could receive radiation doses sufficient to cause death from the acute radiation syndrome or very severe injury, is prevented by operational control and by the design of the plant. The effects of a criticality event on the workforce are mitigated by the shielding built into the plant, the continuous operation of criticality detection alarms and training in appropriate countermeasures such as rapid building evacuation.
MEDICAL ARRANGEMENTS
This paper has detailed the attention which is given to safety on the Sellafield site, so that the probability of a major accident leading to a release of radioactive material sufficient to interfere with the normal activities of the plant or of the general public, or to require special arrangements to protect the workforce, is very low. Even in the worst case, it is not anticipated that persons off site will require medical treatment due to the radiation effects of the most severe foreseeable accident. However, persons within the immediate vicinity of the Sellafield works affected by any countermeasures such as evacuation or sheltering will be monitored as a reassurance measure. On site, trained medical and nursing staff are available to deal immediately with injuries to the workforce. Extensive medical facilities range from local surgeries within the plant staffed by a nurse, to the main medical building, which could be described as a mini-hospital. The on-site staff consist of 1 Senior Medical Officer, 3 Medical Officers, 18 Nurses, 6 Surgery Assistants, 2 Radiographers and 3 Medical Laboratory Technicians. All of BNFL's Medical Officers have been trained in aspects of radiation medicine at Oak Ridge Associated Universities in the USA.
For accidents involving the workforce, or others within the perimeter of the works, where there is no radiation hazard, the on-site services will deal with those casualties for which they have full capability. However, as with any industrial accident, the assistance of NHS specialist services will be available should injuries be sufficiently serious (e.g. trapped casualties or persons too severely injured to be moved any distance). In such cases, a request would be made to the Duty Consultant Surgeon at the West Cumberland Hospital to consider despatch of a mobile team. In the event of large numbers of casualties, the medical staff at Sellafield will decide whether it is necessary to seek activation of the Major Accident Procedure of the West Cumbria Health Authority. West Cumberland Hospital staff have been trained by the BNFL medical staff in the handling of contaminated casualties. Where members of the workforce or others within the perimeter of the works have been exposed to significant radiation doses, the Senior Medical Officer at Sellafield will decide whether the casualties so affected can be treated at the West Cumberland Hospital, or whether to transfer them to the Royal Victoria Infirmary, Newcastle, where the full panoply of support services is available in a centre with major radiotherapy and haematology/oncology departments.
In summary, for foreseeable accidents the likelihood of genuine radiation casualties at Sellafield is very small but, as at any industrial site, there could be a requirement to deal with a wide range of injuries following a major accident. The only special problem likely to arise is the need to deal with casualties who have been contaminated with radioactive material; the receiving hospital staff must therefore understand the problems this involves and how to avoid any hazard to themselves.
THE MEDICAL MANAGEMENT OF RADIATION CASUALTIES

DR. A.W. LAWSON
Company Chief Medical Officer, British Nuclear Fuels plc, Sellafield, Cumbria

ABSTRACT
This paper reviews the key aspects of the medical management of radiological over-exposures and of contamination incidents involving radioactive materials. The presentation is based on a simple but practical system of medical classification of radiation accidents. Emphasis is placed on the importance of the team approach, in particular the role played by the medical physicist in ensuring the safe handling of these casualties.

INTRODUCTION
A radiological accident may be defined as "an unforeseen occurrence, either actual or suspected, involving an exposure of, or contamination on or within, human beings and the environment by ionising radiations". For many years the Radiation Emergency Assistance Centre/Training Site (REAC/TS) at Oak Ridge, Tennessee has maintained a retrospective and ongoing register of significant radiological accidents based on criteria laid down by the U.S. Department of Energy. The Registry was primarily established to ascertain all the relevant facts about radiological accidents with a view to (i) teaching and communicating the lessons to be learned and (ii) following up survivors in order to determine and document the incidence of late effects. In a recent review of the Registry database, Lushbaugh [1] set out the number of individuals involved and the number of fatalities reported as a result of major radiological accidents between 1944 and March 1988 (Table 1). He also showed that radiological accidents could be conveniently classified under three main headings, viz. Criticality Excursions, Radiation Devices and those due to Radioisotopes (Table 2). One significant observation from this project is well illustrated in Figure 1, where the frequency of reported accidents due to the mishandling of radiation devices can be seen to have risen considerably in comparison to
the incidence of accidents arising from accidental criticality excursions or from contamination by radioisotopes.

Table 1 — Major Radiation Accidents Worldwide: Types of Injuries, 1944–March 1988.

Table 2 — The specific device or the radioisotope involved in the REAC/TS Register of Radiological Accidents.

Figure 1 — Frequency Distribution of Major Radiation Accidents Worldwide, 1940–March 1988.
THE MEDICAL CLASSIFICATION OF RADIOLOGICAL ACCIDENTS
The nature and severity of any illness which may follow exposure to ionising radiations is dependent on: (i) the dose received; (ii) the dose rate; (iii) the part of the body and the volume of tissue irradiated; and (iv) the quality and type of radiations. For example, a single dose of 4 Gray to the whole body, delivered as an instantaneous exposure, would result in a serious illness, with the probability of 50% of those exposed dying within 30 days. However, if the same dose of 4 Gray were delivered in four separate exposures of 1 Gray, with an interval of a few days between them, it is likely that no deaths would occur and the subsequent illness would be much less severe. If the dose of 4 Gray were received over a working lifetime, no acute illness would develop, although populations exposed to this level of radiation would be expected to show an increased incidence of late effects, such as leukaemia and other malignancies. The clinical picture is entirely different again if the dose of 4 Gray is delivered to a limb in an instantaneous exposure. There would be no immediate symptoms or signs, but after a latent period of three weeks or more erythema and epilation may occur, depending on the area affected (the thin skin of the flexor surfaces being more radio-sensitive) and on the type and energy of the radiations involved.
For descriptive and treatment purposes it is convenient to designate radiation accidents as those due to:
(i) EXTERNAL RADIATION – either whole body or partial body (local) exposure.
(ii) CONTAMINATION INCIDENTS – either external or internal, but both may co-exist.
(iii) COMBINED RADIATION INJURY – when the whole body exposure is associated with conventional injuries, e.g. wounds or thermal burns.
Although this system of accident classification tends to be an oversimplification, it has nevertheless proved of value in practice. In respect of combined radiation injuries, Hirsch [2] in a recent presentation made reference to the very narrow 'time envelope' in which surgical procedures could be carried out. He considered that surgery was precluded between the 2nd and 12th days following whole body radiation exposure and that urgent surgical procedures had of necessity to be carried out early, as mortality increased the longer the delay.

MEDICAL MANAGEMENT
In order to respond effectively in the event of a radiological incident the physician in charge must have a basic understanding of the type of injury which may be expected to follow certain levels of exposure. The doctor must also have the specialist assistance of a medical physicist (Radiological Protection Adviser) to conduct the essential contamination and dosimetry assessments. In direct contrast to the conventional accident or illness, immediate danger to life is not usually the presenting feature in accidents involving over-exposure to ionising radiations unless there are associated physical injuries. It will be readily appreciated, therefore, that the physician's initial responsibility in the accident situation must be to ensure that the airway is maintained and haemorrhage controlled before dealing with the radiological aspects of the incident. In the initial management of the heavily contaminated and incapacitated casualty, and depending on the local circumstances, spread of contamination can be limited by cutting off the clothing and covering the casualty with a blanket before transfer to the medical facility.
Tables 3 and 4 set out the types of accident, the possible locations and the ionising radiations which may be involved in external radiation and contamination incidents [3].
In general the initial management will be largely dictated by the circumstances in which the accident occurred, but the following guidelines, summarised in Figure 2, set out the basic functions of the medical response in the emergency situation.
1. To render immediate first aid to any persons who may be injured, irradiated and/or contaminated during the incident. N.B. It is imperative that the medical team are adequately briefed and wear protective equipment before entering contaminated areas.
2. To liaise with the Radiological Protection Adviser and obtain the best possible estimate of doses received by individual personnel.
3. To carry out a detailed clinical assessment, including biological investigations where indicated, on the personnel involved.
4. To maintain detailed clinical records, particularly with regard to the time of onset and the nature of any symptoms and signs as they develop.
5. To determine the subsequent management of radiation exposed personnel based on: (a) the history of the accident; (b) the clinical condition of the patients; (c) initial dosimetry assessments; and (d) the number of casualties.
If there are several casualties it may be necessary to establish a system of priorities. A simple form of triage advocated by the French [4] is to place over-exposed personnel into one of three categories; a schematic coding of these is sketched after the category definitions below.
Category 1: Casualties who have received over-exposures to radiation but who also have combined injuries such as wounds, burns and/or contamination.
Category 2: Casualties who are judged to have received whole body exposures at such levels that they will require treatment in a specialised haematological unit.
Category 3: Casualties who are asymptomatic, are estimated to have received relatively low levels of whole or partial body exposure, and who are free from any other form of injury.
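As a reading aid only, the three French triage categories can be expressed as a simple decision rule. The following is a minimal sketch: the function and flag names are invented for illustration, the precedence of the checks is an assumption, and real triage rests on clinical judgement rather than boolean flags.

```python
# Hypothetical encoding of the three-category triage described above [4].
# Flag names and the order of the checks are illustrative assumptions.

def triage_category(over_exposed: bool,
                    combined_injuries: bool,
                    needs_haematology_unit: bool,
                    symptomatic: bool) -> int:
    """Return 1, 2 or 3, following the category definitions in the text."""
    if over_exposed and combined_injuries:
        return 1   # over-exposure plus wounds, burns and/or contamination
    if needs_haematology_unit:
        return 2   # whole body exposure needing a specialised haematological unit
    if not symptomatic and not combined_injuries:
        return 3   # asymptomatic, relatively low exposure, no other injury
    return 2       # assumption: doubtful cases escalate to specialist care

print(triage_category(True, True, False, True))    # -> 1
print(triage_category(True, False, True, True))    # -> 2
print(triage_category(True, False, False, False))  # -> 3
```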
Figure 2 — The initial emergency response; the evaluation of casualties and the procedures followed after a radiological accident.
The detailed management of the potential biological effects of the ionising radiations is now considered in more detail.

LOCAL EXTERNAL RADIATION (RADIATION BURNS)
The skin is very vulnerable to external radiation exposure, and damage of varying degree is relatively common following radiotherapy and minor accidents involving X- and gamma-ray sources. The skin reaction is related to the absorbed radiation dose, which is in turn dependent on the energy of the radiation and whether this is electromagnetic or particulate. In the case of electromagnetic radiation the effect on the skin is inversely proportional to the energy of the radiation, and thus for a given external dose gamma-radiation is less likely to cause damage than X-rays. Beta-particles, on the other hand, give up their energy rapidly in the skin and subcutaneous tissues, and emissions of this nature constitute a particular skin hazard. With so many variables it may therefore prove difficult to relate the observed skin damage to a specific radiation dose unless the type of radiation and its energy are known. As with all types of exposure the dose rate is important, and skin effects are likely to be reduced if a given exposure is spread over a longer period of time. A transient erythema may appear within 2–3 hours after a moderate exposure at a high dose rate, to be followed after a variable latent period by fixed erythema which may progress to vesiculation. The development of the burn, however, is much slower than that produced by a thermal injury. Another important point of distinction between thermal and radiation burns is that the radiation injury often proves to be more severe than first thought. This is because the energy deposition associated with electromagnetic radiation does not fall off as rapidly as that following thermal injuries, and as a consequence tissue damage of varying degree may occur at depth. The most important injury is to the endothelium of the blood vessels, which may result in an obliterating endarteritis causing ischaemia and necrosis of overlying or peripheral tissues. Other structures which may be affected include the hair follicles and sweat glands. Loss of body hair usually follows exposures in excess of 3–4 Gray and, irrespective of the size of the dose, usually takes place about 2–3 weeks after the exposure. With doses in excess of 7 Gray the hair follicles are destroyed and the epilation is permanent. Following exposures in excess of 20–30 Gray the end result is a lesion which is slow to heal and which may subsequently become the seat of malignant change.
Management of Radiation Burns
The occupational group now most at risk of experiencing radiation burns is that of the industrial radiographer. In the majority of cases the patient only attends for consultation after the radiation burn has become manifest, although in a few cases they may have been examined earlier because of a reported over-exposure on their film badge. In these cases the diagnosis is fairly straightforward. On the other hand it is infinitely more difficult if no history of exposure to ionising radiations can be elicited. In the initial assessment of a patient suspected of having sustained a radiation burn the following factors should be taken into account:
1. The time of exposure, if known, and the nature of the incident, in particular the type of radiations involved.
2. The possibility of co-existing whole body exposure to penetrating radiation.
3. The possibility of contamination by radioactive dusts or liquids.
4. Whether there was any evidence of transient erythema, and the time of appearance of fixed erythema or other evidence of radiation burns.
In all cases a full physical examination should be conducted and a baseline blood screen, including chromosome aberrations, undertaken. The treatment of radiation burns is primarily directed to the prevention of infection and the control of pain. In my limited experience of burns due to external radiation, simple dry dressings, applications of merthiolate and judicious use of antibiotics proved more than adequate in the control of infection. However, in the recent Goiânia accident, in which 19 patients sustained radiation-induced skin lesions due to radioactive contamination, the physicians in charge used a wide range of treatments including antiseptic and analgesic solutions, antibiotics and anti-inflammatory agents. In addition, one group was given a series of injections intended to dilate the capillaries and reduce capillary injury, but without any obvious clinical benefit [5]. Pain may be a late feature of radiation burns involving the extremities, when it is usually due to ischaemia resulting from endarteritis. In those cases where amputation is contemplated, various techniques have been developed, such as thermographic imaging, to determine the demarcation between damaged and normal capillaries. The dilemma faced by the surgeon is deciding the optimum time to operate;
if too early, further necrosis may occur; if delayed unduly, the patient's suffering is prolonged.
WHOLE BODY IRRADIATION—THE ACUTE RADIATION SYNDROME
The salient clinical and laboratory features of this symptom complex were first described in proper sequence by Keller in 1946 [6]. It is now recognised that the acute radiation syndrome represents the response of the radiosensitive haematopoietic tissues, the lining of the small intestine and the central nervous system to a large acute dose of whole body penetrating radiation. The clinical effects following such exposures are again dependent on the dose rate, the dose absorbed and the nature of the ionising radiations. The acute radiation syndrome involves a series of clinical events that vary in timing and duration, depending on the degree of exposure and the extent of tissue injury. The greater the area of the trunk involved, the more severe will be the illness, because both the small intestine and a large portion of the haematopoietic tissues are located there. In its classical form three clinical stages are recognised. A Prodromal Phase, characterised by the onset of nausea and vomiting associated with marked fatigue, lasts from a few hours to one or two days. This is followed by a Latent Stage of days to a few weeks' duration, when the patient is relatively symptom free but significant changes will nevertheless be seen in the peripheral blood count. The third stage, or Manifest Illness, usually begins abruptly and is associated with diarrhoea, vomiting, severe fluid and electrolyte loss, intestinal ulceration and haemorrhage. Depending on the absorbed dose, three specific syndromes can be identified.

(i) The Haemopoietic Syndrome
The haematopoietic tissues are among the most radiosensitive in the body, with the main impact being on the stem cells in the bone marrow. As a consequence there is inhibition of mitosis of the precursors of the red cells, the white cells and the platelets, and changes in the peripheral blood count may occur very rapidly. The earliest change detected is a fall in the absolute lymphocyte count, which commences in the first few hours and continues for several days to levels commensurate with the amount of radiation absorbed. There is often a concomitant increase in the leucocyte count in the first few days, following which the granulocyte count begins to fall, with the maximum leucopenia developing in 2 to 5 weeks. The fall in the platelet count parallels that of the granulocytes but begins a few days later (Figure 3).
Figure 3 — Typical haematological response following a radiation dose of 4.5 Gray. Lymphocyte, neutrophil and platelet values should be multiplied by 1000; haemoglobin values are in grams per 100 ml.

As the changes in the blood picture become more marked, clinical symptoms ensue, usually commencing with malaise, headache, fatigue, chills or fever. At this stage the patient is prone to infections, particularly in the mouth and throat, and fungal overgrowth in the gastro-intestinal tract may subsequently prove troublesome.
(ii) The Gastro-intestinal Syndrome
This syndrome is primarily due to the loss of gastro-intestinal epithelium in association with the agranulocytosis. The symptoms of anorexia, nausea, vomiting, diarrhoea and fever may begin a few days or a few weeks after the prodromal phase, depending on the degree of radiation exposure. The diarrhoea may persist and is then frequently associated with blood loss from the gut. Ultimately the patient's condition may deteriorate further, with abdominal distension, loss of
peristalsis, dehydration, circulatory collapse and death. The major associated clinical problems are related to systemic infection with enteric organisms, electrolyte disturbances and hypovolaemic shock.
(iii) The Central Nervous Syndrome
Overwhelming doses of radiation, of the order of tens of Grays, can cause direct damage to the brain. Although there is little evidence of this syndrome following human exposure, experimental evidence would appear to suggest that the central nervous system complications following such a level of absorbed dose are secondary to vascular lesions, and that the syndrome is in fact neurovascular in origin. The clinical features are characterised by the rapid onset of apathy, lethargy and prostration, frequently followed by seizures, grand mal convulsions and death.

Management of the Acute Radiation Syndrome
The general principles for handling emergency and accidental over-exposures to ionising radiations are well set out in Appendix 3 of ICRP 28 [7] and form the basis of many emergency medical services' procedures. However, it will be readily appreciated from the description of the clinical syndromes which may be experienced by heavily irradiated casualties that the subsequent management of these patients must lie within the specialist field of haematology. This aspect of the management will therefore be considered in more detail by Professor Cawley in his presentation on 'The Management of the Immunosuppressed Patient'. At the scene of the accident the initial priorities are the treatment of life-threatening injuries and monitoring for serious external contamination. Following transfer to the medical facility a detailed clinical assessment should be conducted while awaiting the preliminary dosimetry information. Baseline biological investigations should also be initiated at this stage, with sufficient blood being taken for a full blood count, blood group, HLA typing, chromosome aberrations and general biochemistry. All urine samples should be examined and retained. It is imperative that detailed clinical records are maintained, particularly with regard to the exact time of onset of any symptoms and signs, as they may have early prognostic significance. The earliest symptoms are likely to be nausea and vomiting, which may be alleviated by Hydrocortisone Hemisuccinate 100 mg IM, 8-hourly, or by the use of Chlorpromazine 25 mg IM or Metoclopramide 10 mg IM. If vomiting persists a saline drip should be set up, followed after 3 hours by 5% Dextrose (or 4% Dextrose + 1/5 Saline). In a large nuclear establishment where extensive medical and dosimetric services are immediately
available, preliminary dosimetric data will be available within 1–2 hours after the exposure. This information, coupled with the initial clinical assessment, will enable the attendant physician to establish priorities. Patients confirmed or estimated as having experienced an over-exposure in excess of 1 Gray should be admitted to hospital. Personnel known to have received less than 1 Gray should be kept in the medical facility until preliminary investigations are complete. Those estimated to have received between 0.25 and 1 Gray may be allowed home on completion of the examination and assessment, provided they are symptom free; they should be prescribed mouth washes and reviewed the next day. In respect of the hospitalised casualties it is the moderate dose group of 2–6 Gray who are most likely to need, and to respond to, treatment. The clinical problems likely to be experienced are those of fluid balance, haemopoietic dysfunction and infection. The following summary gives some indication of the procedures which may be adopted [8]; the numerical trigger levels in items 4–6 below are also gathered in the short sketch that follows the list.
1. The patient's environment is of paramount importance and barrier nursing procedures must be instituted.
2. Diets should not contain uncooked food, especially raw fruit or vegetables.
3. The patient's personal hygiene should be supervised: nails should be trimmed and scrubbed, and local neomycin (Naseptin) applied to the anterior nares.
4. Systemic antibiotics should be administered if fever persists above 38°C for more than 2 hours or there are other signs of infection in an agranulopenic patient.
5. The gut should be sterilised if granulocytes fall to less than 1.5×10⁹ L⁻¹.
6. Irradiated packed red cells and platelet infusions should be administered to maintain haemoglobin and platelet levels, or whenever bleeding occurs in a patient with a platelet count below 60×10⁹ L⁻¹.
7. Intravenous acyclovir (5 mg/kg) three times a day should be commenced about three weeks after the radiation exposure to prevent activation of the herpes simplex virus.
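As a convenience only, the quantitative triggers quoted above can be collected in one place. The following is a minimal sketch, not a clinical tool: the names, the data structure and the decision to express the rules as simple comparisons are assumptions made for illustration.

```python
# Hypothetical gathering of the numerical triggers quoted in the text.
# Names and structure are illustrative; clinical decisions rest on the full
# assessment described above, not on these comparisons alone.

ADMIT_TO_HOSPITAL_GY = 1.0         # estimated over-exposure above which to admit
ALLOW_HOME_RANGE_GY = (0.25, 1.0)  # symptom-free patients in this band may go home

def supportive_care_triggers(fever_c: float, fever_hours: float,
                             granulocytes_e9_per_l: float,
                             platelets_e9_per_l: float,
                             bleeding: bool) -> list:
    """Return which of the listed interventions (items 4-6) are indicated."""
    actions = []
    if fever_c > 38.0 and fever_hours > 2.0:
        actions.append("systemic antibiotics")   # item 4
    if granulocytes_e9_per_l < 1.5:
        actions.append("gut sterilisation")      # item 5
    if bleeding and platelets_e9_per_l < 60:
        actions.append("platelet infusion")      # item 6 (bleeding-related trigger only)
    return actions

print(supportive_care_triggers(38.6, 3, 0.8, 45, True))
# -> ['systemic antibiotics', 'gut sterilisation', 'platelet infusion']
```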
It is of interest to note that simple haematological evaluation by the peripheral blood lymphocyte count appeared to be the best single laboratory tool for triage, in deciding where medical resources needed to be allocated, in the recent Chernobyl accident. Further, the use of allogeneic bone marrow transplants proved particularly disappointing in the management of these patients. This was primarily due to the development of 'Graft versus Host Disease' in patients whose transplantation immunity had not been fully suppressed by the radiation over-exposure.

Table 5 — The sequence of events, the haematological changes, and the clinical outcome of four groups of irradiated personnel who exhibited the Acute Radiation Syndrome following the Chernobyl accident.

CONTAMINATION BY RADIOACTIVE MATERIALS
When radioactive material in the form of dusts, liquids or gases is accidentally released into the environment, contamination may occur externally on the skin or internally by inhalation, ingestion or absorption through the intact or abraded skin. In the nuclear energy industry external contamination does not usually constitute a serious medical problem, as static air samplers continually monitor for atmospheric contaminants and personal monitoring at the end of each shift ensures that radioactive contamination is identified at an early stage. The outcome may, however, be entirely different if contamination occurs and is allowed to persist because the person involved is unaware of the fact. The series of events at Goiânia in September/October 1987 bears tragic witness to this.

EXTERNAL CONTAMINATION
The effect on the skin of radioactive contaminants depends essentially on the type and energy of the emissions from the radioisotope involved. Because of their relatively short range, alpha-particles are unable to penetrate to the basal cell layer of healthy skin; as a consequence the major concern associated with such contamination relates to its possible transfer to internal organs. On the other hand, beta-particles penetrate to the deeper layers of the skin and, depending on their energy, to the subcutaneous tissues, thereby constituting a potential skin hazard. The effects of X- and gamma-rays are also energy dependent: the lower the energy of the incident radiation, the more radiation will be absorbed and the greater the likelihood of skin damage ensuing. Many radioisotopes emit more than one type of radiation, but it will be obvious from the foregoing that those which emit a preponderance of beta-particles are the most hazardous to the skin and subcutaneous tissues. Radioactive substances usually rest on the thin film of oil which covers the outer layer of the skin and the openings of the glands and hair follicles. As a consequence the decontamination techniques usually employed are based on the removal of this oily film by means of soap and detergents. With ingrained contamination, particularly on the hands, it may be necessary to employ stronger agents to remove the outer horny layer of the skin.
Decontamination Procedures
It should be appreciated that the intact skin is an excellent barrier to the absorption of radioactive materials and that harsh cleaning methods should not be employed in attempts to decontaminate the skin. Experience
has shown that it is important to be gentle when conducting decontamination procedures. It is a mistake to scrub too hard, as the contamination may simply be rubbed deeper into the skin, or alternatively it may render the skin too tender to complete the decontamination [9]. This particularly applies to the softer skin of the face. The basic agents and materials required to effect decontamination of the skin are:
1. Soap and water.
2. 1% and 4% Cetrimide.
3. Saturated solution of potassium permanganate.
4. 10% solution of sodium metabisulphite.
5. 4% Xylocaine solution (or equivalent local anaesthetic).
6. Surgical gloves, rubber boots, protective aprons, large polythene bags for contaminated clothing and dressings, cotton wool balls, cotton wool applicators, nail brushes, adhesive masking tape and paper towels.
Figure 4 — Decontamination Body Chart.
It is essential to conduct a total body survey before commencing decontamination. The monitoring results can then be transposed on to a body chart (Figure 4) such as that used at Sellafield, which clearly demarcates the extent and degree of the contamination.
The following priorities and procedures should now be adopted:
1. Breaks in the skin should be identified and covered.
2. Radioactive contaminants around body orifices, particularly the nose, should be removed first.
3. Decontamination should be carried out by starting at the periphery of the contaminated area and then working gently towards the centre.
4. Soap and water should be used initially and, if this fails to remove all the contaminant, 1% Cetrimide may then be tried.
5. If contamination persists, the next step is to use a saturated solution of potassium permanganate, which removes some of the horny layer of the skin; care must be taken that any undissolved crystals do not come into contact with the skin. This substance must not be used near the eyes or on the hair, and should only be used with great care on the softer areas of skin. The potassium permanganate solution should be left on for a few minutes only, until the skin is deeply discoloured. It is then washed and allowed to dry, the resulting pigmented area then being treated with the 10% solution of sodium metabisulphite to remove the coloration. If the contamination still persists these procedures may be repeated, with the proviso that considerable care must, at all times, be given to the state of the skin.
If redness or tenderness develops, decontamination procedures must be stopped; in such circumstances it is good practice to cover the area with a lanolin-containing cream followed by an impervious dressing. The patient can then be examined again the following day, when the condition of the skin will usually allow further attempts at decontamination to be carried out. Decontamination procedures may have to be applied to other areas, e.g. hair, teeth and mouth, eyes, nostrils, ears, etc., and some ingenuity may be required to accomplish clearance of the contaminant.

INTERNAL CONTAMINATION
The major hazard associated with the accidental intake of radionuclides is undoubtedly the risk of late stochastic effects and decorporation is primarily undertaken with this possibility in mind. It is nevertheless recognised that a very large intake could follow the maladministration of a therapeutic agent or by accidental inhalation, ingestion or from heavily
contaminated multiple wounds and in such an event these patients would need to be immediately admitted to a specialised facility and treated as if they were suffering from the Acute Radiation Syndrome. Following a radiological incident internal contaminants tend to be selectively deposited in specific body organs and tissues depending on the nuclide involved. Radioisotopes of stable elements normally found in the body follow the stable isotope (e.g. radioiodine is taken up by the thyroid) while other radioisotopes adopt the metabolic pattern of stable elements to which they are chemically related (e.g. strontium follows the same pattern as calcium). On the other hand plutonium for no specific reason is preferentially deposited in bone and liver. It follows therefore that the organ retaining the highest concentration of the radionuclide is the one most likely to sustain immediate radiation effects and/or subsequent malignant change and is usually regarded as the target organ for that particular isotope. One of the major difficulties experienced in the early management of persons internally contaminated with radionuclides is that the extent and magnitude of the contamination is seldom immediately available to the attendant physician. It is recognised that the hazards posed by an accidental intake relate to the quantity, site of deposition and metabolism of the radionuclide in addition to its physical half-life and the radiosensitivity of the target organs. Nevertheless the decision whether or not to implement treatment must on occasion be based on only limited information. Fortunately, an accidental intake is unlikely to pose an immediate threat to a patient’s life and decorporation is therefore primarily undertaken to reduce the risk of late effects, such as cancer. This poses another problem in that the need for initiating treatment in a particular case has to be based on an assessment of the risk of late effects in the individual concerned. As Voelz [10] pointed out—‘If discomfort, side effects or risk accompany the therapy, it is especially important to understand the need and basis for treatment’. The general principles adopted in the management of persons accidentally contaminated with radionuclides are twofold [11], (i)
To reduce absorption and internal deposition, and
(ii) To enhance elimination or excretion of the absorbed nuclides.
It must, however, be recognised that while there are a large number of radioisotopes which could theoretically prove hazardous, most are seldom encountered or have short physical half-lives and therefore do not represent a serious problem in medical management. I have therefore limited the contents of this section to a review of the therapeutic procedures utilised for some of the more common radionuclides.
Iodine 131
This radionuclide may be released into the immediate environment following a reactor accident or any incident involving the release of fresh fission products, e.g. following a nuclear weapons test. Most of the iodine released will be readily absorbed by inhalation, ingestion or through the skin. In the event of an incident, medical management is directed to the issue of stable iodine as a blocking agent to prevent uptake of the radioisotope by the thyroid gland. In a typical individual approximately 25% of a single intake of radioiodine will be retained in the thyroid after 12–24 hours. To be most effective, therefore, stable iodine should be administered as soon as possible after the incident. A recent report of experimental work in China [12] showed that stable iodine was 96.6% effective at the time of exposure but only 42.5% effective after four hours had elapsed (a rough worked example of these figures is given at the end of this subsection). A convenient preparation for use in such emergencies is Potassium Iodate in tablet form, and a consensus of opinion now suggests that one 170 mg tablet daily (equivalent to 100 mg iodide) provides adequate suppression of uptake of radioiodine in the adult [13]. In this country it is the practice to recommend half the adult dose for children under 12 years of age and a quarter of the adult dose for very young children. The frequency of reactions to iodide would appear to be very low: based on American experience, complications of iodide therapy in a presumably ill population represent a low order of risk, of between 1×10⁻⁷ and 10×10⁻⁷, at a daily therapeutic level of administration [14].

Strontium and Radium
Both these elements are absorbed from the intestine in competition with calcium. A number of treatment regimes have been advocated to reduce gastro-intestinal absorption, but in some cases the reports on the effectiveness of treatment vary quite markedly. In the case of strontium, aluminium-containing antacids are recommended as being effective in reducing the intestinal uptake of radiostrontium by between 50 and 85%, with aluminium phosphate appearing to be the most effective preparation. Alginates, which are jelly-like substances obtained from brown seaweed, will inhibit absorption of strontium if administered shortly after ingestion; however, their extreme viscosity makes them difficult to administer. Following ingestion, about 30% of radium is absorbed, and subsequently most of this is excreted within a few days of the incident. The radium remaining is almost entirely deposited in the skeleton. Immediate stomach
lavage has been advocated for patients who have just ingested radium but little is known about the removal of radium once it is incorporated in the skeletal bones.
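Returning to the thyroid-blocking figures quoted above, the following small calculation shows one way of reading them. It assumes, as the text does not state explicitly, that 'effectiveness' means the fraction of thyroid uptake averted, applied to a baseline retention of roughly 25% of the intake; it is illustrative arithmetic only.

```python
# Illustrative reading of the stable-iodine figures quoted in the text.
# Assumption: "effectiveness" = fraction of thyroid uptake averted, applied
# to a baseline retention of roughly 25% of the radioiodine intake.

BASELINE_THYROID_RETENTION = 0.25   # ~25% of intake retained without blocking

def retained_fraction(blocking_effectiveness: float) -> float:
    """Fraction of the total radioiodine intake ending up in the thyroid."""
    return BASELINE_THYROID_RETENTION * (1.0 - blocking_effectiveness)

print(f"No stable iodine:    {retained_fraction(0.0):.1%} of intake")    # 25.0%
print(f"Given at exposure:   {retained_fraction(0.966):.1%} of intake")  # ~0.9%
print(f"Given 4 hours later: {retained_fraction(0.425):.1%} of intake")  # ~14.4%
```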
Caesium
Caesium is rapidly and almost completely absorbed from the gastro-intestinal and respiratory tracts. It is soluble in body fluids and follows the same metabolic pathways as potassium, tending to concentrate in soft tissues, particularly muscle. In the management of a contamination incident the administration of the ion-exchange agent ferric ferrocyanide (Prussian Blue) is the treatment of choice. Prussian Blue is of low toxicity as it is not absorbed from the intestine. It binds the caesium ions that are enterically cycled into the gastro-intestinal tract so that the caesium is not reabsorbed. The biological half-life during treatment is reduced to about one-third of its usual value and the systemic uptake is likewise reduced. It proved to be of real value, when combined with an exercise and sauna regime, in the treatment of members of the public contaminated with caesium chloride in the incident at Goiânia, Brazil, in 1987. One of the attendant physicians confirmed, in a personal communication, that Prussian Blue was the most effective treatment used to remove the caesium. He indicated that 40 patients were treated with this compound in doses ranging from 1.5 to 10.0 grams per day, but that daily doses above 10 grams were not well tolerated. Nevertheless, one patient with extremely high internal contamination received a total of approximately 1000 grams of Prussian Blue in just over a four month period with no harmful effects [15]. Diuretics were also introduced into the treatment regime but did not make any significant contribution to the elimination of the caesium. On the other hand, forcing the intake of fluids to at least 3 litres per day proved to be useful and practical. The fact that the contaminant was caesium chloride, a highly soluble compound, undoubtedly contributed to the success of the exercise and sauna regime in accelerating the removal of the caesium by 'sweating it out'.
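To see what "reduced to about one-third" means in practice, the sketch below compares retained fractions with and without treatment. The untreated biological half-life used here (about 100 days for caesium in an adult) is an assumed round figure, not taken from the text, and the arithmetic ignores radioactive decay and any continuing intake.

```python
# Rough illustration of the effect of shortening the biological half-life of
# caesium to about one-third of its untreated value during Prussian Blue therapy.
# The 100-day untreated half-life is an assumed round figure, not from the text.
import math

def fraction_remaining(days: float, half_life_days: float) -> float:
    """Simple single-compartment exponential retention."""
    return math.exp(-math.log(2.0) * days / half_life_days)

UNTREATED_T_HALF = 100.0                 # assumed, days
TREATED_T_HALF = UNTREATED_T_HALF / 3.0  # "about one-third" during treatment

for day in (30, 60, 90):
    print(f"day {day:3d}: untreated {fraction_remaining(day, UNTREATED_T_HALF):.2f}, "
          f"with Prussian Blue {fraction_remaining(day, TREATED_T_HALF):.2f}")

# Because the committed dose scales roughly with the effective half-life, a
# one-third half-life implies roughly one-third of the dose that would
# otherwise have been received from the retained caesium.
```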
Tritium
Tritium is the only radioactive isotope of hydrogen, and problems of contamination only arise when elemental tritium is oxidised to tritiated water. In this form it is rapidly and completely absorbed following inhalation or ingestion, and it can also be absorbed through intact skin. It is distributed evenly throughout the body and effectively produces whole body irradiation. The medical management is usually comparatively straightforward, consisting of forcing 3 to 4 litres of fluid by mouth per day coupled with
the administration of a diuretic. However, in a recent incident involving two laboratory staff in Switzerland, Lloyd et al [16] reported that the physician in charge treated the most severely exposed individual with intravenous infusions of 7 litres per day. This regime is not without risk and can only be employed in young, fit persons.
The clearance half time of tritiated water is generally considered to be about 10 days but Lloyd also showed that even on an unsupervised forced fluid regime this was reduced to less than 6 days, thereby reducing the other subject’s predicted dose uptake to approximately 65% of the initial assessment.
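The dose arithmetic behind that last figure can be made explicit. For tritiated water distributed uniformly in body water, the committed dose is roughly proportional to the clearance half-time, so shortening the half-time from about 10 days to about 6 days should cut the dose to roughly 60% of the original estimate, of the same order as the "approximately 65%" quoted. The sketch below is only that rough proportionality, not a dosimetric model.

```python
# Rough check of the tritium figures quoted above: the committed dose from
# tritiated water scales approximately with the clearance half-time.

NORMAL_HALF_TIME_DAYS = 10.0   # typical clearance half-time quoted in the text
FORCED_FLUID_HALF_TIME = 6.0   # "less than 6 days" on a forced fluid regime

dose_ratio = FORCED_FLUID_HALF_TIME / NORMAL_HALF_TIME_DAYS
print(f"Predicted dose relative to initial assessment: about {dose_ratio:.0%}")
# -> about 60%, broadly consistent with the ~65% reported by Lloyd et al [16]
```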
Plutonium and other transuranic elements
Inhalation incidents are the commonest form of occupational exposure, and the initial medical concern centres on whether or not to initiate treatment with a chelating agent. A number of chemical compounds enhance the elimination of metals from the body by a process whereby organic compounds exchange less firmly bonded ions for other inorganic ions to form a relatively stable non-ionised ring complex which can be readily excreted by the kidney. Diethylenetriaminepentaacetic acid (DTPA) is a powerful chelating agent which is effective in the removal of transuranic metals, the rare earths and some transition metals from the body. It also binds trace metals such as zinc and manganese, and this has to be taken into account if long-term administration is deemed necessary. In inhalation incidents the decision whether to employ a chelating agent such as DTPA is largely dependent on the chemical properties of the compound involved, and this is where a knowledge of the plant and processes is invaluable. Soluble compounds such as plutonium nitrate have a relatively rapid uptake from the lungs, from where they are readily translocated via the blood stream to the skeleton and liver. It is during this period of translocation that chelation by the intravenous administration of DTPA has proved most effective, and studies suggest that about 60–70% of the soluble plutonium uptake can be removed if the DTPA is administered on the first day after exposure. However, intravenous DTPA has proved ineffective in the initial treatment of inhalation incidents involving insoluble compounds such as plutonium oxide, because of the relatively small amount of plutonium which translocates via the blood or intracellular fluids following this form of intake. Nevertheless, in the event of a massive intake it would be prudent to use DTPA aerosols after collecting baseline urine samples. DTPA is available either as the calcium or the zinc salt. Animal experiments suggest that Ca-DTPA is more effective than Zn-DTPA when given promptly after exposure to plutonium and americium. However, no comparable
studies are available in humans. Clinical experience has shown that both salts of DTPA may be safely given by intravenous injection, but only the calcium salt should be used if the treatment is by aerosol, as symptoms similar to metal fume fever have occurred following the aerosol administration of Zn-DTPA. In the management of plutonium-contaminated wounds it is standard practice to give intravenous Ca-DTPA before commencing elective surgical procedures such as wound excision. This ensures that any plutonium which may be released into the circulation can be readily chelated.
The Hanford americium accident of August 1976, in which a 64 year old chemical operator was injured and heavily contaminated following an explosion in an ion-exchange column, fully demonstrated the effectiveness, and the lack of toxicity after prolonged administration, of DTPA. In the subsequent management of this patient a total of 584 grams was administered in just over 4 years [17]. In this accident an estimated 37–185 GBq (1–5 Ci) of americium was initially deposited on the operator and his clothing; this was reduced to 222 MBq (6 mCi) by on-site decontamination and to 37 MBq (1 mCi) after the first day post-exposure.
Figure 5 — The effectiveness of DTPA in reducing liver and bone burdens.

In a report on the accident prepared in 1979, Heid et al [18] reviewed the effectiveness of DTPA in reducing the liver and bone burdens
of americium. They calculated the quantity available for deposition, based on the amount excreted in the urine and faeces and the quantity estimated to remain in the bone, liver and facial skin, to be about 40.7 MBq (1100 µCi). Using models derived from animal experimentation, they estimated that a total of 28.12 MBq (760 µCi) would have been deposited in bone and liver if DTPA therapy had not been available. In effect they found that only about 9.25 MBq (250 µCi) was retained, thereby confirming the effectiveness of DTPA in chelating and reducing the anticipated uptake of americium as a result of this accident (Figure 5).
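The percentage reduction implied by those figures is worth making explicit; the short sketch below simply redoes the arithmetic of the quoted activities (it adds nothing beyond the numbers in the text, and the percentage label is mine).

```python
# Re-deriving the implied effectiveness of DTPA from the activities quoted
# for the 1976 Hanford americium accident [18]. Units: MBq (microcuries shown too).

MBQ_PER_UCI = 0.037              # 1 uCi = 37 kBq = 0.037 MBq

predicted_without_dtpa = 28.12   # MBq (760 uCi) predicted bone + liver deposition
actually_retained = 9.25         # MBq (250 uCi) actually retained

averted = predicted_without_dtpa - actually_retained
print(f"Activity averted: {averted:.2f} MBq ({averted / MBQ_PER_UCI:.0f} uCi), "
      f"about {averted / predicted_without_dtpa:.0%} of the predicted burden")
# -> roughly two-thirds of the predicted bone and liver burden was averted
```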
REFERENCES
1. Lushbaugh, C.C., Ricks, R.C. and Fry, S.A. A historical review of sealed sources accidents. Radiation Protection in Nuclear Energy, Proceedings of a Conference in Sydney, April 18–22, 1988, pp. 401–408.
2. Hirsch, E.F. The Role of Surgery in the Management of Acute Local Radiation Injuries. The Medical Basis for Radiation Accident Preparedness, October 20–22, 1988, Oak Ridge, Tennessee (to be published).
3. Schofield, G.B. Procedures following Major Radioactive Contamination. Proceedings of the meeting at the University of Bristol, April 8th 1981, p. 11.
4. Nenot, J.C. Clinical Aspects of Accidents Resulting in Acute Total Body Irradiation. Oak Ridge Conference (to be published).
5. International Atomic Energy Agency, Vienna, 1988. The Radiological Accident in Goiânia, p. 46.
6. Keller, P.D. A Clinical Syndrome following Exposure to Atomic Bomb Explosions. J.A.M.A., Vol. 131, No. 6, pp. 504–506 (1946).
7. ICRP 28. The Principles and General Procedures for Handling Emergency and Accidental Exposures of Workers. Vol. 1, 1978, pp. 17–21.
8. International Atomic Energy Agency, Vienna, 1988. The Radiological Accident in Goiânia, p. 45.
9. Lawson, A.W. Decontamination of the Skin. Proceedings of Symposium at Atomic Energy Establishment, Winfrith, November 14th 1963, pp. 69–76.
10. Voelz, G.L. Current Approaches to the Management of Internally Contaminated Persons. The Medical Basis for Radiation Accident Preparedness, 18–20 October 1979, Oak Ridge, Tennessee, p. 311.
11. N.C.R.P. Report No. 65. Management of Persons Accidentally Contaminated with Radionuclides. April 15, 1980.
12. Traitement des Contaminations Internes, Expérience chinoise [Treatment of Internal Contamination: the Chinese experience]. Copy received January 1988.
13. N.C.R.P. Report No. 55. Protection of the Thyroid Gland in the Event of Releases of Radioiodine. August 1977, p. 23.
14. N.C.R.P. Report No. 65. Management of Persons Accidentally Contaminated with Radionuclides. April 15, 1980.
15. Oliveira, A.R. (1988) Personal communication.
16. Lloyd, D.C., Auf der Maur, A. and Gossi, U. Accidental Intake of Tritiated Water: A Report of Two Cases. Radiation Protection Dosimetry, Vol. 15, No. 3, pp. 191–196 (1986).
17. Breitenstein, B.D. The 1976 Hanford Americium Exposure Incident: Medical Management and Chelation Therapy. Health Physics, Vol. 45, No. 4, pp. 855–866 (1983).
18. Heid, K.R. The 1976 Hanford Americium Accident. Report prepared for the U.S. Department of Energy, January 1979, p. 11.
MEDICAL MANAGEMENT OF THE PATIENT IMMUNOSUPPRESSED BY IONISING RADIATION
JOHN C.CAWLEY UNIVERSITY DEPARTMENT OF HAEMATOLOGY, ROYAL LIVERPOOL HOSPITAL, PRESCOT STREET, P.O. BOX 147, LIVERPOOL L69 3BX, UK
INTRODUCTION
Since the immune system is distributed throughout the whole body, and since its cellular and humoral constituents circulate, only irradiation of a large area will result in profound immunosuppression. All immunocompetent cells are ultimately derived from the bone marrow, which is itself widely distributed throughout the skeleton. Doses in excess of 5 Gy cause severe bone marrow suppression; doses greater than 15 Gy will result in fatal damage to other organs such as the lung and gut. Experience of the consequences of lethal or near-lethal whole-body irradiation is largely confined to two situations:
1) Total body irradiation (TBI) for bone marrow transplantation
2) Nuclear accidents, e.g. Chernobyl and Brazil.
Most experience relates to the therapeutic use of TBI and I shall largely confine myself to this area, although I will later consider briefly the conclusions to be drawn from Chernobyl.

1) Therapeutic TBI for bone marrow transplantation
The general aim of therapeutic irradiation is, of course, the ablation of tumour with minimisation of damage to other tissues, including the bone marrow. The only exception to this general rule is the therapeutic supralethal external total body irradiation (TBI) used in the treatment
of widely disseminated radio-sensitive tumours such as acute leukaemia or lymphoma. At the doses employed (10–15 Gy), the limiting toxicity is bone marrow failure (and consequent immunosuppression, etc.), and some form of marrow rescue is necessary. This may be done with the marrow of a donor (allogeneic [allo-] BMT) or with the patient's own marrow (autologous [A] BMT) obtained before TBI (Nathan, 1983; Goldstone, 1986). Clinical BMT therefore provides a model for the medical management of the patient immunosuppressed by ionising irradiation. It must, of course, be immediately acknowledged that BMT patients constitute very imperfect models for subjects exposed to large doses of radiation during a nuclear accident. The accident patients will probably have received the irradiation over a much shorter period, will often have extensive burns, and may well have ingested or inhaled radioactive substances.

Bone marrow transplantation
Both allogeneic and autologous BMT involve the basic sequence set out below:
The essential difference between allogeneic and autologous transplantation is, of course, that the former involves the re-infusion of foreign marrow. This foreign donor marrow contains immunocompetent cells which potentially recognise the host as foreign and cause graft-versus-host disease (GVHD). Both GVHD and the manoeuvres designed to prevent it (e.g. cyclosporin and T-cell depletion) cause profound immunosuppression. This type of immunosuppression is not a consequence of ionising irradiation per se, and therefore allo-BMT seems a less appropriate model than auto-BMT for today's meeting. The specific immunodeficiency of allo-BMT results in a number of late complications
following initial marrow recovery (e.g. viral pneumonitis); these are not relevant to the present talk.

Autologous BMT
ABMT involves the following sequence of events:
This period of immunosuppression following ABMT seems the best model we have for considering the management of the patient immunocompromised as a result of radiation following a nuclear incident.

Immunosuppression following ABMT
Infection after ABMT occurs in the first 3–4 weeks and is primarily due to profound neutropenia in association with breakdown of mucosal barriers (Prentice, 1984). After initial marrow recovery, late infections are uncommon, although zoster and occasional pneumococcal septicaemias do occur. Irradiation per se does impair lymphocyte and NK function, but this seems to cause few, if any, clinical problems.

Infections during the neutropenic period
Early infections are due to two main groups of organisms: skin organisms associated with in-dwelling catheters (70%) and gut-associated bacteria (30%); both cause septicaemia. Later, after prolonged neutropenia, fungal infection becomes significant, Candida and Aspergillus sp. being particularly important. Oropharyngeal herpetic infection is common throughout the neutropenic period.
Infection control strategies
Table 1 sets out potential strategies for minimising infection in the neutropenic period. I shall consider each in turn in the context of ABMT/nuclear accidents.

Table 1 — Possible infection-control strategies in patients severely immunocompromised as a result of ionising radiation (modified from Hann and Prentice, 1984).

Laminar air flow rooms. In combination with other measures such as gastrointestinal decontamination, provision of pathogen-free air probably is beneficial and is probably, therefore, worthwhile for the patient exposed to large doses of radiation.

Other protective measures. Simple attention to hygiene by staff is clearly important. Most units now recommend hand washing and treatment with antiseptic and the wearing of a clean gown/apron; elaborate theatre-style clothing has now been largely abandoned. Skin and hair bacterial decontamination, with particular attention to the axillae, perineum and orifices, is recommended. Long-term indwelling catheters (with a protective skin tunnel) have
reduced cannula-related local sepsis and made life easier for the patient. Their use has, however, greatly increased the incidence of gram-positive bacteraemias, which are usually not life-threatening. Clean food is probably worthwhile.

Gastrointestinal decontamination. Most studies suggest that some form of gastrointestinal decontamination reduces infections in severely neutropenic patients. Such decontamination may aim either for total gastrointestinal decontamination (using a combination of non-absorbable antibiotics) or for selective suppression of enterobacteriaceae. It is not clear which approach is preferable, but most units use some form of gut prophylaxis.

Systemic prophylaxis. Most centres do not use systemic antibiotics for prophylaxis, preferring to reserve them for the treatment of actual or suspected infection. Imidazole derivatives and oral acyclovir probably protect against Candida and herpes simplex infections respectively, and are given as routine prophylaxis in most units.

Management of pyrexial episodes
Broad-spectrum antibiotic combinations are given early and often without microbiological proof of infection. The initial choice of antibiotics will be influenced by the clinical circumstances; for example, gut symptoms will prompt the early introduction of metronidazole. Subsequent microbiological evidence will determine later treatment. Most studies of therapeutic granulocyte transfusions have demonstrated a significant improvement in survival. However, the benefit is small, and the continuous improvement of antibiotic regimes has meant that granulocyte transfusions are now rarely employed. Their use should be reserved for ill patients with severe neutropenia and for definite bacterial infection persisting in the face of maximum appropriate antibiotic therapy.

Marrow Recovery
Both GM- and G-CSF shorten the neutropenia following immunoradiotherapy (e.g. Brandt et al, 1988), and very recently this approach has actually been pursued following accidental irradiation (Butturini et al, 1988). Which cytokine is better, and whether the action of either might be potentiated by other cytokines such as
IL-3, have not yet been determined. However, it now seems clear that cytokine treatment in some form or another is likely to be used in radiation victims in an attempt to minimise neutropenia and its attendant complications.

Bone marrow transplantation
If irradiation has caused irreversible damage to the most primitive haematopoietic progenitors, only bone marrow transplantation will aid bone marrow recovery. Autologous BMT would clearly be ideal in these circumstances, but this treatment would require the prophylactic harvesting of bone marrow from all radiation workers, hardly a reasonable option for the foreseeable future. Allogeneic BMT also offers the prospect of rescuing the patient from irreversible progenitor damage (Gale & Reisner, 1988). However, this approach is fraught with difficulties. HLA-identical marrow from either a sibling or a donor panel is required, and this takes time and needs host leucocytes in substantial numbers. Also, the graft may cause serious GVHD in a patient who might have recovered spontaneously. Allo-BMT is therefore not an option unless the patient's radiation exposure is known with some accuracy to be one that is almost certain to cause irreversible marrow damage without being inevitably fatal to other tissues. Even then, logistic difficulties may be insurmountable, and Chernobyl (see later) has shown that allo-BMT is unlikely to be an important treatment option for radiation accidents in the immediate future. Allo-BMT from matched unrelated or from unmatched donors is currently a high-risk procedure. Future immunological advances may, however, overcome these difficulties, and these forms of allo-BMT may become possible treatment options at some future date.

The lessons of Chernobyl
Of the patients who received more than 6 Gy, 21 out of 22 died within 28 days, mostly as a result of radiation burns rather than of immunosuppression. Seven of the 23 patients who received 4–6 Gy died, and in all cases radiation burns were thought to be the cause of death. Only one death occurred among the more than 60 patients receiving doses below 4 Gy (Lancet Editorial, 1986).
Supportive measures of the sort used in clinical BMT patients seem to have been highly effective in preventing infections. Thirteen transplants were performed; seven of these patients died early from other injuries. The other six showed temporary engraftment only, and GVHD may have contributed significantly to death in two patients. The Russians concluded that BMT had made no significant contribution to patient survival.
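The Chernobyl outcome figures above can be re-expressed as approximate case-fatality rates by dose band. The sketch below does only that arithmetic; treating the lowest band as exactly 60 patients is an assumption, since the text says "more than 60", so that rate is an upper bound.

```python
# Re-expressing the Chernobyl outcome figures quoted above as approximate
# case-fatality rates by dose band (numbers as given in the text).

bands = {
    "> 6 Gy": (21, 22),
    "4-6 Gy": (7, 23),
    "< 4 Gy": (1, 60),   # "more than 60 patients", so this rate is an upper bound
}

for band, (deaths, patients) in bands.items():
    print(f"{band}: {deaths}/{patients} died, about {deaths / patients:.0%}")
# -> roughly 95%, 30% and about 2% (an upper bound for the last band)
```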
CONCLUSIONS
ABMT probably provides a reasonable model of the immunosuppression produced by large doses of ionising irradiation. The general and specific measures used to support patients during the 3–5 weeks of immunosuppression post-ABMT are appropriate to the management of accident patients. However, the clinical picture in accident patients is greatly modified by external and internal radiation burns, and it is this, rather than immunosuppression per se, that is the major cause of death. Allo-BMT does not have a major role and should probably only be considered in patients confidently thought to have received more than about 8 Gy. Recombinant growth factor(s) probably shorten the period of neutropenia and will be given early in any future accidents.
REFERENCES

Brandt, S.J. et al. Effect of recombinant human granulocyte-macrophage colony-stimulating factor on haemopoietic reconstitution after high-dose chemotherapy and autologous bone marrow transplantation. New Engl. J. Med., 1988, 318, 870.

Butturini, A. et al. Use of recombinant granulocyte-macrophage colony-stimulating factor in the Brazil radiation accident. Lancet, 1988, ii, 471.

Nathan, D.G. Bone marrow transplantation. Clinics in Haematology, 1983, 12, 3.

Goldstone, A.H. Autologous bone marrow transplantation. Clinics in Haematology, 1986, 15, 1.
Prentice, H.G. Infections in Haematology. Clinics in Haematology, 1984, 13, 3.

Gale, R.P. & Reisner, Y. The role of bone marrow transplants after nuclear accidents. Lancet, 1988, i, 923.

Editorial. Living with radiation after Chernobyl. Lancet, 1986, ii, 609.
THE GOIÂNIA ACCIDENT
J.R. CROFT
National Radiological Protection Board, Northern Centre, Hospital Lane, Cookridge, Leeds LS16 6RW
ABSTRACT The Goiânia accident was one of the most serious radiation accidents that has occurred. This paper describes the background to the accident, the accident itself, the initial and the subsequent responses. Although the emphasis is on the medical effects and how they were dealt with, the major elements of the decontamination operation are addressed. Finally the main lessons from the accident are summarised.
INTRODUCTION

On 13th September 1987, a 50.9 TBq caesium–137 source was removed from its protective housing in a teletherapy machine in an abandoned clinic in Goiânia, Brazil. The source was subsequently ruptured and the activity widely dispersed in the city. Many people incurred large doses due to both external and internal exposure. Four of these died and 28 suffered radiation burns. The extent and degree of contamination were such that seven residences and various associated buildings had to be demolished and topsoil had to be removed from a significant area. The decontamination of the environment took about 6 months to complete and generated some 3500 m³ of radioactive waste. This was one of the most serious radiological accidents that has occurred. Since 1945 there have been 59 reported deaths due to serious radiation exposure in 17 accidents, with 10 of these accidents being in the non-nuclear sector [1]. The extent to which these have been reported in readily accessible literature has varied considerably and consequently we have not been able to benefit as much as we should from
learning the lessons from these accidents. With this in mind and with a perception that greater attention needs to be paid to radiological protection outside the nuclear sector, the IAEA collaborated with the Brazilian Authorities in carrying out a post accident review. The author was fortunate enough to act as a consultant to IAEA in this review and in the drafting of the report of the accident, which has now been published [2].
BACKGROUND INFORMATION
Goiânia is a large city with a population of about one million and is the capital of Goias State, which is on the central Brazilian plateau. It is of the order of 1000 km from the cities of Rio de Janeiro and Sao Paulo, where the major radiological protection resources are situated. The accident occurred in one of the poorer areas of the city, where the literacy of the population is limited.

Radiation Protection Infrastructure

The competent national authority for nuclear energy is the National Commission for Nuclear Energy (CNEN). This has three research institutes which provide the main sources of radiological expertise in the country.
(a) The Institute for Nuclear Energy Research (IPEN), Sao Paulo, has a research reactor which is used to produce radioisotopes for medical and industrial uses. It also processes uranium and thorium.
(b) The Institute for Nuclear Engineering (IEN), Rio de Janeiro, has a research reactor.
(c) The Institute for Radiation Protection and Dosimetry (IRD), Rio de Janeiro, is the nearest equivalent in Brazil to the NRPB and provided the major element of the professional staff and technical facilities in the recovery operation after the accident.
Some radiation protection expertise was also available in two Federal Government Agencies: NUCLEBRAS, which is responsible for uranium prospecting, extraction and handling, and FURNAS, which is responsible for nuclear power generation, having one operating power station at Angra, south of Rio de Janeiro.

Regulatory Framework

CNEN licenses the purchase and holding of radioactive material. The licensing procedure takes into account the facilities, key personnel and radiation safety documentation, and includes a commissioning
inspection. One of the conditions of the licence is that CNEN be informed of any material changes, e.g. disposal or change of location of a source. CNEN is also the body responsible for making regulations governing the use of ionising radiations, and in some sectors of use it also enforces the regulations. However, in the medical sector this is the responsibility of the Federal Ministry of Health, which has devolved this responsibility to the State Ministries of Health. The extent to which this responsibility was discharged varied considerably between the states.

Emergency Arrangements
The existing emergency arrangements were designed to deal with two main categories of accident. Firstly, there was the site-specific plan for the Angra nuclear power plant. Though not designed for accidents in the non-nuclear power sector, the underlying preparation and planning for emergency actions undoubtedly helped in the Goiânia accident. Of particular relevance were the Logistical Support element and the arrangements to deal with radiation casualties at the Naval Hospital in Rio de Janeiro. There were also arrangements for dealing with radiological emergencies in the non-nuclear power sector; however, these were focussed on relatively small-scale accidents such as those arising from transport and industrial radiography. These arrangements centred on one emergency contact in CNEN headquarters, who would then activate various appropriate nominated people, mostly from within CNEN’s Institutes. There was no equivalent of the UK’s NAIR scheme, which identifies local sources of radiation protection expertise and medical facilities capable of dealing with contaminated patients.
THE ACCIDENT
This section briefly tells how the accident came about, developed and was discovered. Individuals are referred to by the system of initials used in the IAEA report and the doses are those that were initially estimated from chromosome aberration analysis.
Lead-up to the Accident
The Institute Goiano de Radioterapia (IGR) was a private radiotherapy clinic in Goiânia which started operation in June 1971 and was owned by a medical partnership. The clinic’s facilities included treatment rooms with caesium–137 and cobalt–60 teletherapy units. At about the end of 1985 the IGR ceased operating from these premises and a new medical partnership took over the name and new premises. The cobalt–60 unit was transferred to the new premises, but ownership of the contents of the old clinic was a matter of dispute in the liquidation of the old partnership, and the caesium unit was left in the old clinic. CNEN was not informed, and over the following months most of the clinic and surrounding properties were demolished as a prelude to proposed redevelopment of the site. The treatment rooms were not demolished but were left in a derelict state and were apparently used by vagrants. Some two years passed with the teletherapy unit left in this completely insecure situation.

Removal and Rupture of the Source
In the teletherapy unit the source was situated in a rotating assembly inside a shielded housing such that the source could be rotated to the ‘expose’ or ‘safe’ positions. The radioactive material was in the form of highly soluble caesium chloride, which was compacted to form a coherent mass, doubly encapsulated in stainless steel and placed in an ‘international capsule’ of standard dimensions. Although these days most caesium sources are in a vitrified form, the less desirable caesium chloride has to be used for teletherapy sources as the necessary specific activity cannot be attained with the vitrified form. Around the middle of September 1987 two local people, R.A. and W.P., had heard rumours that valuable equipment had been left in the IGR. They went to the derelict site and worked with simple tools attempting to dismantle the teletherapy unit for scrap value. On 13 September they eventually managed to remove the rotating assembly and took it in a wheelbarrow to R.A.’s house. This must have given them access to the unshielded source, which would have given a dose rate of 4.6 Gy h⁻¹ at 1 m (a rough check of this figure is sketched at the end of this section). The next day they were both vomiting, and W.P., who had a swollen hand, sought medical assistance. The symptoms were diagnosed as being due to some kind of allergic reaction caused by eating bad food. On 18 September they returned to working on the rotating assembly and in the garden of R.A. they managed to hammer a screwdriver through the 1 mm steel ‘window’ of the source capsule and thus rupture it. Subsequently a dose rate of 1.1 Gy h⁻¹ was measured close to the contaminated ground at this spot. As they did not seem
to be making progress they sold the rotating assembly to ‘Junkyard I’, managed by D.F. That night he went into the garage where the assembly was stored and noticed a blue glow emanating from it. He was fascinated by this, thinking it might be valuable or even supernatural, and took it into the house to show his wife M.F.I. During the next few days many friends and relatives came to see this phenomenon and several were allowed to dig out rice-sized ‘grains’ of the source, which could be easily crumbled into powder. Perhaps the most tragic example of this distribution was that of the fragments of the source taken by D.F.’s brother I.F. These were taken home and placed on the table during a meal. His 6-year-old daughter L.F.2 handled them whilst eating with her hands; she subsequently died, having had an estimated intake of 1.0 GBq and received an estimated dose of 6.0 Gy. After the distribution of the source fragments the assembly was cut up by two of D.F.’s employees, I.S. and A.S., who subsequently died having received estimated doses of 4.5 and 5.3 Gy respectively. The various pieces were distributed to other junkyards. During the period 13–28 September the activity was widely dispersed by a variety of means. Initially the activity was transported in discrete amounts by contaminated persons and items. However, its highly soluble form and the climatic conditions combined to make the contamination highly mobile. There were a number of reported instances of the powder being daubed on bodies because it looked like the glitter used in Mardi Gras celebrations. Undoubtedly the chemical form of the source and the blue glow phenomenon had a profound effect on the development of the accident, which might otherwise have evolved like the Ciudad Juarez accident in 1983, which also involved a teletherapy unit taken for scrap. This phenomenon had not previously been observed by source manufacturers, but the blue glow has subsequently been observed at Oak Ridge National Laboratory (USA) during the disencapsulation of a similar source and is thought to be associated with fluorescence or Cerenkov radiation due to absorption of moisture. Further study is in progress at Oak Ridge.
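As a rough check on the 4.6 Gy h⁻¹ figure quoted above, the source can be treated as an unshielded point source. The gamma dose-rate constant used in the sketch below is a typical published value for caesium–137, not a number taken from the IAEA report, so the result is indicative only.

```python
# Rough point-source estimate of the dose rate near the unshielded 50.9 TBq
# caesium-137 source.  The gamma dose-rate constant (~92.7 microSv/h per GBq
# at 1 m) is a typical literature value, not a figure from the IAEA report.

GAMMA_CONSTANT = 92.7e-6      # Sv per hour per GBq at 1 m (assumed)

def dose_rate_Sv_per_h(activity_GBq, distance_m=1.0):
    """Unshielded point-source estimate using the inverse-square law."""
    return GAMMA_CONSTANT * activity_GBq / distance_m ** 2

activity_GBq = 50.9e3         # 50.9 TBq expressed in GBq
print(f"~{dose_rate_Sv_per_h(activity_GBq):.1f} Sv/h at 1 m")        # ~4.7 Sv/h
print(f"~{dose_rate_Sv_per_h(activity_GBq, 5.0) * 1000:.0f} mSv/h at 5 m")
```

The result, about 4.7 Sv h⁻¹ at 1 m, is consistent with the quoted dose rate to within the uncertainty of the constant used.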
Discovery

By 28 September a significant number of people were ill, including 10 in the Tropical Diseases Hospital, and the Junkyard manager’s wife was intuitively convinced that the glowing powder was the cause. She and one of her husband’s employees took the remnants of the source, in a bag carried over his shoulder, to the
Vigilancia Sanitaria (a cross between a General Practice and the Public Health Department) and told them that it was “killing her family”. She was ill and was sent to hospital, and subsequently died having received an estimated dose of 4·3 Gy. The doctor did not know what the source remnant was and placed it in a courtyard while he made some enquiries. Meanwhile the patients in the Tropical Diseases Hospital were causing increasing concern to the doctors, who could not identify the cause of the illness. Food poisoning, contact dermatitis and pemphigus were thought to be possible causes. However, one doctor started to suspect that the skin lesions might be due to radiation, and his enquiries of colleagues eventually came together with those from the Vigilancia Sanitaria. As a result, the next day, 29 September, a medical physicist, who happened to be visiting the city and was known to one of those involved, was asked to investigate the package with a borrowed monitor. What he found astounded him. The package still contained about 4.5 TBq, and a visit to ‘Junkyard I’ indicated areas of very high dose rate and removable contamination that could be easily detected with a dose rate meter. On his own initiative he evacuated the Vigilancia Sanitaria and ‘Junkyard I’ with its surroundings.
INITIAL RESPONSE
The situation was reported to the State Authorities in Goiânia. As might be appreciated, the State officials were incredulous about the potential scale of the incident, and it took the medical physicist some perseverance and several hours to get to see the State Secretary of Health and inform him of it. CNEN’s emergency co-ordinator (NEC) in Rio de Janeiro was informed. He assigned tasks for the mobilisation of key personnel and then left for Goiânia to assess the situation. At this stage, whilst recognised as serious, the full extent and origins of the incident were not known.

Local response

The authorities in Goiânia mobilised the police, fire and civil defence forces and by 20:00 hrs had designated the nearby ‘Olympic’ stadium as a collection point and monitoring station. Meanwhile the physicist, together with help from the new IGR clinic, continued monitoring both around the known areas of contamination and searching for new areas as the story of what had happened unfolded from the local residents. The initial segregation of people sent to the Olympic stadium was mainly based on those likely to have had contact
with parts of the source, i.e. the relatives and neighbours of the principal characters. Although no local plans existed for responding to such an emergency, the authorities’ improvised strategy worked effectively in bringing the situation under some control and preventing further serious exposure. Once personnel from CNEN started to arrive, the local authorities began to relinquish responsibility for control to them but continued to provide support.

CNEN response

With the large distances involved it was after midnight before the NEC and two technical support staff arrived. It was quickly evident to the NEC that the scale of the accident was extensive and beyond what had been initially thought. He therefore requested further assistance. By the following day, 1st October, there were some twenty experts available and four principal contamination sites (R.A.’s house and three junkyards) had been isolated. Over the next couple of days three more foci and some minor areas of contamination were identified and isolated. The monitoring team at the Olympic stadium had identified 249 people with detectable contamination, of whom 121 were internally contaminated. By 3rd October the initial phase of taking control had been completed.

Evacuation

Some 200 individuals were evacuated from 41 houses. Some of these had been evacuated by the lone medical physicist, who had used an action level of 2·5 µSv h⁻¹. This was simply the derived level for exposure of the public around occupational uses of radiation and was based on a limit of 5 mSv in a year and normal, not emergency, situations. Subsequently, although the CNEN team used a different exposure model (continuous exposure over a 3 month clean-up period), they also used the 2·5 µSv h⁻¹ action level, still based on 5 mSv in a year. The IAEA report recognised the circumstances and the strong political pressures under which these decisions were taken but drew attention to the fact that different considerations apply in emergencies than in normal operations. In particular, attention was drawn to two international reports [3, 4], indicating that in an emergency evacuation need not normally be contemplated below a dose saving of 50 mSv and that the use of a more restrictive criterion might carry with it economic and social burdens (see also Summary).
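For illustration, the sketch below shows two ways in which an action level of about 2·5 µSv h⁻¹ can be tied back to a criterion of 5 mSv in a year. The occupancy assumptions (roughly 2000 hours of occupancy in normal situations, or continuous exposure over a three-month clean-up) are assumptions made here for illustration, not figures stated in the IAEA report.

```python
# Two illustrative ways an action level of about 2.5 microSv/h can be tied to
# a criterion of 5 mSv in a year.  The occupancy assumptions are illustrative
# and are not stated in the IAEA report.

annual_criterion_uSv = 5_000.0

# (a) "normal situation" derived level: 5 mSv spread over ~2000 h of occupancy
level_normal = annual_criterion_uSv / 2000                  # 2.5 uSv/h

# (b) continuous exposure over a ~3-month clean-up period
level_cleanup = annual_criterion_uSv / (24 * 91)            # ~2.3 uSv/h

print(f"(a) {level_normal:.1f} uSv/h   (b) {level_cleanup:.1f} uSv/h")
```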
MEDICAL RESPONSE

There are some parallels to previous accidents where radioactive sources ended up in the public domain and caused fatalities, viz. Mexico City (1962), Algeria (1978) and Morocco (1983). There is also the striking parallel of the Ciudad Juarez accident in Mexico in 1983, where there was significant whole body irradiation and the ensuing acute radiation syndrome, together with severe local radiation burns. This accident also resulted in external contamination of several individuals with cobalt–60; however, the levels were low and no significant internal contamination occurred. The Goiânia accident is unique in that the casualties incurred initial acute whole body external exposures followed by chronic whole body exposure at relatively low dose rates from internally deposited caesium–137. These exposures varied depending on the amount of time spent near the source and the amount of caesium–137 deposited internally. The situation was complicated by incomplete exposure histories and lack of information on exactly when the respective exposures began. Some external exposures were undoubtedly fractionated as a result of working and personal habits. In addition, the more seriously exposed persons suffered acute local injury to the skin from beta irradiation and to deeper-lying tissue from penetrating radiation.

Initial response
On 30th September, three medical radiation specialists arrived in Goiânia and were faced with the 11 patients in the Tropical Diseases Hospital and 22 persons assembled in tents at the Olympic Stadium. They first went to deal with the latter group, who had been identified on the basis of their relationship or proximity to the principal characters involved. They had all been significantly exposed and were contaminated both externally and internally. Contaminated clothing was removed and all were decontaminated by taking several baths with soap and water. Although subsequently several would have to be hospitalised, the initial triage indicated that the most pressing problem was not this group but those in the Tropical Diseases Hospital. This latter group had all experienced nausea, vomiting, diarrhoea, dizziness and fatigue, and all but one had some degree of radiation-induced skin injury. These patients were transferred to an evacuated ward of the Goiânia General Hospital and contamination control procedures introduced, together with a programme of blood, urine and faecal sampling. Internal contamination was confirmed by crude gross counting of the urine and faecal samples. Radiological surveys over skin lesions gave dose rates as high as 15 mSv h⁻¹ close to the skin, whilst 6-year-old L.F.2 showed an average dose
rate of 3 mSv h⁻¹ close to the skin. Decontamination was performed on all patients using mild soap and water, acetic acid and titanium oxide. This was only partially successful, and contamination control was a major problem throughout, as the caesium–137 was continually being excreted in the sweat, making the patients mobile, continually regenerating sources of contamination. Over the next few weeks several more patients were admitted and arrangements were made to deal with some internally contaminated people as outpatients. Between the 1st and 3rd October, 10 patients who had been identified as having the most serious problems were transferred by air to the Naval Hospital in Rio de Janeiro, which was the designated hospital to receive radiation casualties under the Emergency Plan for the Angra nuclear power plant. Among the general medical community in Goiânia and the clinical laboratory staff both there and in the Naval Hospital, there was a reluctance to help, arising from fear and concern for their personal health. This had to be overcome by personal reassurances and informal instruction from the medical radiation specialists.

Treatment of the acute radiation syndrome
Treatment of the most highly irradiated patients was directed at the assessment and management of the haematological crisis associated with the acute radiation syndrome. Here, apart from the clinical symptoms and the haematological data, the doctors found useful inputs from: (a) patient interviews to help reconstruct the pattern of exposure, (b) dose estimates from cytogenetic techniques, and (c) assessments of internal contamination and associated doses. The latter two are dealt with more fully in later sections and, despite complications in the interpretation of these dose estimates, they were useful in predicting the degree of haematological depression and the consequent degree of susceptibility to infection. On the basis of the results of cultures from blood, skin, wounds and body orifices, and from their clinical courses, patients were treated with systemic or topical antibacterial, antifungal or antiviral agents. Other aspects of the treatment included reverse isolation, attention to diet and the administration of irradiated packed red cells and platelet infusions. The clinical course and laboratory findings indicated that bone marrow transplantation was not required by any patient.
A departure from clinical practice in previous radiation incidents was the use of granulocyte-macrophage colony-stimulating factor (GM-CSF), which was administered to 8 patients. The IAEA report concluded that further experimental and clinical experience is required to clarify the questionable results of its use following the accident in Goiânia before it is employed again in actual radiological accidents. Between 23rd and 28th October, some four weeks after admission to hospital, 4 of the casualties died (see Table 1). The post mortem examinations showed haemorrhagic and septic complications associated with the acute radiation syndrome.

Treatment of local injuries

Radiation-induced skin injury was observed in 19 of the 20 patients in the two hospitals. Patients exhibited swelling, erythema, bronzing, dry desquamation and blistering. In the first week of October the majority of the skin lesions ruptured and secreted fluid; by 12th October they exhibited drying, sloughing and necrotic skin and re-epithelization, confirming the occurrence of superficial injury by beta irradiation. About three weeks later, this was followed by deep lesions in 10 of the 20 patients, indicative of gamma insult to deeper-lying tissues. In one case amputation of the lower arm was necessary. Later a further five patients required surgical intervention, namely four debridements and one skin graft.

Acceleration of Decorporation

The Goiânia accident resulted in the highest levels of internal contamination with caesium–137 ever recorded; 1 GBq in the case of the 6-year-old girl who died. Some 121 persons were found to have internal contamination and a major problem was to accelerate its decorporation. Of these, 62 were considered to have sufficiently high burdens to warrant the use of Prussian Blue, which had been recommended for such situations. Caesium nuclides would normally be excreted mainly in the urine (80%) as opposed to faeces (20%). The effect of Prussian Blue is to dramatically increase the rate of excretion in the faeces without significantly changing that in urine. This was found to be effective, as can be seen from Figure 1, but the dosages required were found to be higher than previously recommended (a simple illustration of the effect on the effective half-life is sketched after Figure 1). Other techniques such as diuretics, water overloading and enhanced sweating were also tried, but without much success.
Figure 1. The impact on the effective half-life (T½) of caesium–137 for an adult man who voluntarily stopped taking Prussian Blue.
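The benefit of accelerating excretion can be illustrated with the standard relation between the physical, biological and effective half-lives, 1/T_eff = 1/T_phys + 1/T_biol. The half-life values in the sketch below are typical adult figures from the general literature, not measurements on the Goiânia patients.

```python
# Illustrative effect of faster excretion on the effective half-life of
# caesium-137.  Half-life values are typical adult figures from the general
# literature, not data from the Goiania patients.

def effective_half_life(t_phys_days, t_biol_days):
    """1/T_eff = 1/T_phys + 1/T_biol (first-order removal processes in parallel)."""
    return 1.0 / (1.0 / t_phys_days + 1.0 / t_biol_days)

T_PHYS = 30.1 * 365.25      # physical half-life of Cs-137 (~30 years) in days

for label, t_biol in [("without Prussian Blue (assumed ~110 d)", 110.0),
                      ("with Prussian Blue (assumed ~40 d)", 40.0)]:
    print(f"{label}: T_eff ~ {effective_half_life(T_PHYS, t_biol):.0f} d")
```

Because the physical half-life of caesium–137 is so much longer than the biological half-life, the effective half-life is essentially the biological one, so shortening the retention time with Prussian Blue reduces the remaining committed dose roughly in proportion.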
Biological Dosimetry

In total, blood samples from some 110 people thought to have received doses in excess of 0·1 Gy were sent to the cytogenetic facility at IRD (Rio de Janeiro) for dose estimates. Of these, the dose estimates for 21 people exceeded 1·0 Gy, with 8 above 4·0 Gy (see Table 1). These estimates were based on dose-response curves derived at a high dose rate (0·12 Gy min⁻¹); an illustrative calculation of this kind is sketched after Table 1. As indicated earlier, there is some doubt over the exposure pattern, and in some instances the use of a lower dose rate curve might have been more appropriate. ‘Over-dispersion’ in the distribution of chromosome aberrations, an indicator of non-uniform (partial-body) irradiation, was found in a number of instances.
TABLE 1. Initial cytogenetic dose estimates based on an acute exposure (0·12 Gy min⁻¹)
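A dose estimate of this kind is typically obtained by scoring dicentric chromosome aberrations and inverting a linear-quadratic calibration curve, Y = C + αD + βD². The coefficients in the sketch below are merely typical of published acute gamma-ray curves; they are not the IRD calibration actually used.

```python
import math

# Illustrative cytogenetic dose estimate from a dicentric yield.  The
# linear-quadratic coefficients below are typical of acute gamma-ray
# calibration curves in the literature; they are NOT the IRD curve used
# after the accident.

C     = 0.001   # background dicentrics per cell (assumed)
ALPHA = 0.03    # dicentrics per cell per Gy (assumed)
BETA  = 0.06    # dicentrics per cell per Gy^2 (assumed)

def dose_from_yield(y):
    """Invert Y = C + ALPHA*D + BETA*D**2 for the dose D (Gy)."""
    disc = ALPHA ** 2 + 4.0 * BETA * (y - C)
    return (-ALPHA + math.sqrt(disc)) / (2.0 * BETA)

# e.g. 120 dicentrics scored in 200 cells -> yield of 0.6 per cell
print(f"Estimated acute dose: {dose_from_yield(120 / 200):.1f} Gy")
```

Over-dispersion of the dicentric counts between cells, noted above, is the signature of partial-body exposure and complicates a simple inversion of this kind.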
Body burdens were initially estimated from urine and faecal samples using bioassay techniques and age-specific metabolic models. For analysis these samples had to be transported to IRD in Rio de Janeiro, which, for the patients in Goiânia, introduced some delay before results were available. Also, the samples were highly active, which introduced radiation protection problems and difficulties for counting facilities designed for low-level analysis. It was clear that whole body counting capabilities would have been desirable from an early stage. These existed at IRD, but it was obviously not practicable to transport all the patients there. Temporary whole body counting facilities therefore had to be arranged at both the Naval Hospital and the Goiânia General Hospital. These were in place by early November. They had to be capable of dealing with a very wide range of body burdens, as can be seen from the distribution in Figure 2. The whole body monitoring results confirmed the validity of the age-specific models and bioassay analyses previously used (a minimal back-extrapolation of a measured burden to the time of intake is sketched after Figure 2).
Figure 2. Distribution of the initial body burdens of caesium–137.
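Figure 2 refers burdens back to the time of intake. A minimal sketch of such a back-extrapolation, assuming single-exponential retention and a typical adult effective half-life of about 110 days (an assumption, not a Goiânia measurement), is given below.

```python
import math

# Illustrative back-extrapolation of a whole body counting result to the
# initial body burden, assuming single-exponential retention.  The effective
# half-life (~110 d for an adult without Prussian Blue) is a typical
# literature value, not a measured Goiania figure.

def initial_burden_Bq(measured_Bq, days_since_intake, t_eff_days=110.0):
    """Invert A(t) = A0 * exp(-ln2 * t / T_eff) for the initial burden A0."""
    return measured_Bq * math.exp(math.log(2.0) * days_since_intake / t_eff_days)

# e.g. 20 MBq measured 35 days after the estimated date of intake
print(f"~{initial_burden_Bq(20e6, 35) / 1e6:.0f} MBq initially")
```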
PHYSICAL CONSEQUENCES

By the 3rd October identification and control of the 7 main foci of contamination had been achieved. Eighty-five houses had been found to have a degree of contamination. To ensure that no major areas of contamination had been missed, an aerial survey by a suitably equipped helicopter was carried out shortly afterwards. This identified a further area giving rise to a dose rate of 21 mSv h⁻¹ at 1 m above ground. The limitations of this technique, especially close to the main foci, were recognised and a system of monitoring using equipment mounted in a car was undertaken. To complement these techniques, the inhabitants of the contaminated premises were asked about visitors and their own movements during the relevant period. This indicated the potential transport routes for contamination and indeed some 42 other less contaminated sites were found. Early on, a strategic decision had been taken that no major decontamination would be undertaken until comprehensive surveys had been carried out and a coherent plan for the work had been drawn up. These surveys were carried out on foot using hand-held monitors. With the heavy rains experienced during the accident it had been
initially expected that the activity would have been washed into the soil and retained. The survey indicated that this was not the case. It seems that the high temperatures (up to 40°C) had quickly dried out the ground and high winds had caused resuspension. Indeed the scale of the effect came as a surprise, with, for example, contamination deposited on the roofs of the single-storey buildings being the main contributor to the dose rates indoors. Complementing this survey, an extensive programme of environmental sampling of soil, vegetation, water and air was undertaken. Countermeasures were only necessary for soil and fruit and were limited to within a 50 m radius of the main foci. Assessments were made of the resources and logistics in respect of personnel, hardware, disposable items, backup staff, etc. necessary for the decontamination. The difficulties in assembling all the necessary items over 1,000 km away from the major centres of radiological expertise were considerable. It was initially estimated that a decontamination programme would result in 4,000 to 5,000 m³ of active waste; 3500 m³ was eventually produced. This was a major problem area, and the decisions on the type of containment, the location of the waste storage site, its planning and construction took more time than expected. Eventually a political decision was taken to choose a temporary repository site where the waste would be stored for up to 2 years. Only by mid-November did it become possible to start major decontamination work. This involved inter alia the demolition of 7 houses, the removal of areas of topsoil and the concreting over of some of these areas, and the decontamination of cars, houses and contents. This phase was successfully completed by 21st December, allowing residents to return home for Christmas. From the New Year to the end of March 1988 further work was necessary to remove residual minor contamination that had previously been masked by the presence of the high-level contamination. At its peak the operation required some 250 professional and technical staff and 300 staff in support, transport, demolition, etc., in Goiânia, together with various laboratory-based services in Rio de Janeiro. CNEN and its various Institutes, particularly IRD, bore the major brunt of the work. It had required the setting up in Goiânia of an instrument repair and calibration facility to service the wide variety of monitoring instruments garnered from many establishments; provision of training in practical radiological protection, particularly monitoring techniques; a dedicated laundry to deal with contaminated clothing; facilities to manufacture the waste disposal containers; and a multitude of other specialised
requirements. Health physics control of the operation was obviously important and monitoring results showed that the maximum dose was 16 mSv with two-thirds of the people receiving less than 1 mSv.
PUBLIC COMMUNICATION AND SOCIAL PERCEPTIONS

On the night of the discovery of the accident, rumours spread about what had happened and were exacerbated the following morning when people woke up to find areas cordoned off with no coherent public explanation. Many people tried to go to the Olympic stadium for reassurance, straining the limited resources then available. The public and media interest multiplied manyfold and became a serious drain on the technical resources. In retrospect it is clear that from the beginning there was a need for a press officer, with appropriate support, to provide information to the public. Public over-reaction to radiation accidents has been noted before, particularly after Three Mile Island and Chernobyl, and has been referred to as ‘radiophobia’. This was also much in evidence following the Goiânia accident. For example, people who had been contaminated or in any way connected with the accident were treated as ‘lepers’, even within families. Indeed the ‘clear of contamination’ certificate from the monitoring station at the Olympic stadium became almost a prerequisite for acceptance in the community. In total some 112,000 people were monitored. This ‘radiophobia’ was not restricted to the general public but encompassed some of the local agencies and the medical profession. Perhaps the most bizarre example was the stoning of the coffins during the burial of the four casualties. The ‘radiophobia’ was not confined to Goiânia. Sales of the principal products of the state, cattle, cereals and other agricultural produce, and cloth and cotton products, fell by a quarter in the period after the accident. In order to allay these fears the recovery teams were encouraged to explain to people what they were doing and why, and, for example, to accept offers of drinking water and food from people’s houses. They thus gained people’s confidence and raised the credibility of official statements. Team workers made frequent appearances on television and talks on radiation protection were given to journalists to help them understand the situation better. A pamphlet was produced, “What you should know about radioactivity and radiation”, and 250,000 copies were distributed. A telephone service operating 24 hours a day was set up to answer enquiries.
SUMMARY

Post-accident reviews are mechanisms for feeding back experience into the systems of control and accident preparedness arrangements. Such reviews identify lessons to be learned and also often serve to illustrate and emphasize principles that are well known. The preceding sections have touched upon some of the relevant points arising from the Goiânia accident. These and other more general points are summarised below, and it is hoped that they will help provide a focus for the rest of the meeting to look at the implications within a UK context.

Prevention and Mitigation of Consequences

(1) The accident stemmed from an abdication of responsibilities over source security, as indeed have many other accidents. The wide dissemination of the story of Goiânia can do much to promote a responsible management attitude to radiation sources.

(2) It is clear that the readily dispersible form of the radioactive material exacerbated the accident. This reinforces the point that, whenever possible, physical and chemical forms of radioactive material that are not readily dispersible should be used in sealed sources. Local contingency plans also need to take the form of the radioactive material into account.

(3) The two people who initially dismantled the teletherapy unit displayed symptoms of the acute radiation syndrome at an early stage of the accident. If they, or the subsequent people seeking medical aid, had been correctly diagnosed, then the extent of the accident might not have been so great. This difficulty in diagnosing the rarely seen effects of exposure to radiation is typical of such accidents. In the past, medical diagnoses have included insect, spider and snake bites, viral infection and exposure to toxic chemicals. The IAEA has previously recognised this problem and produced a publication [5] on the topic. Dissemination in the medical sector of this document and the Goiânia case history would seem to be of value.

Initial Response

(4) To cover radiological accidents affecting the public there should be a well known system for summoning assistance (preferably locally) and for notifying relevant authorities. In the UK the NAIR scheme meets these requirements.

(5) As with any major emergency, e.g. flood, explosion, etc., local authority plans for dealing with emergencies will provide invaluable support to those specialists dealing with the emergency.

(6) In such accidents there is usually a tendency, prompted by political and social considerations, to impose extremely restrictive criteria for the implementation of countermeasures. These actions carry with them a substantial economic burden and possibly introduce conventional hazards. This emphasises the need for establishing beforehand clear standards for use in the event of such accidents, based on a thorough study of all the issues involved. The updating of emergency reference levels is currently being addressed by the NRPB [6].

(7) Clearly, to avoid placing a drain on the resources of the people trying to deal with the consequences of an accident and to limit the extent of public concern, emergency preparedness arrangements must include adequate provision for meeting the information needs of the public and the media.

Medical

(8) The accident demonstrated the value of (i) having physicians who had received specific training in dealing with radiation casualties, (ii) having international agreements through which specialist help can be provided, and (iii) inputs from cytogenetic dose estimates, whole body monitoring and bioassay.

(9) This was the first time that really extensive use has been made of Prussian Blue to accelerate the decorporation of caesium from the body. This produced very good metabolic data which showed the efficiency of the treatment, albeit at higher dosages than previously recommended.

(10) In managing a radiological accident, readily transportable equipment for whole body monitoring and bioassay may be needed in addition to permanent dedicated facilities.

(11) A single centralised nominated hospital to deal with severe radiation casualties may not be sufficient to cope with the numbers affected in an accident on the scale of Goiânia. It may be necessary to have arrangements that would allow all but the most seriously affected patients to be treated more locally.

(12) The IAEA report included the following statement on the type of hospital that should be designated to deal with radiation casualties: “The therapy of casualties of radiological accidents in modern times is varied and complicated. Such patients must be cared for by hospital staff who are engaged on a daily basis in the haematological, chemotherapeutic, radiotherapeutic and surgical treatment of patients at risk from cancer, immunosuppression and blood dyscrasias.”

Recovery Operations

(13) In the response to an accident of this scale there will be a need to engage professional expertise from a wide range of organisations, to define clearly the chain of command and to establish the logistical backup to assemble the necessary resources.

(14) The accidents in both Goiânia and Ciudad Juarez produced large volumes of active waste. In such cases the management of the waste will be central to the recovery programme and will significantly affect its timescale. Appropriate contingency plans for dealing with such volumes of waste are necessary.

Considerable effort has been devoted to the development of plans to deal with nuclear accidents, and these efforts have intensified at both national and international level following the Chernobyl accident. Other types of radiological emergency have received much less attention, but the Goiânia accident has shown the potential scale of the consequences of accidents in this area. The IAEA report therefore emphasises that preparedness to deal with emergencies should be extended to cover the entire range of possible radiological accidents.
CONCLUSION

The Goiânia accident fundamentally stems from an abdication of responsibilities over source security, and the story graphically shows the consequences of laxity in this important area. There are many other lessons to be learnt from, or reinforced by, the accident. Many of these could not be included in this article, but they are covered in the IAEA report [2]. We must learn from our mistakes, and the story of the Goiânia accident has much to commend it as an element in training programmes at all levels and across a wide range of disciplines.
The Brazilian authorities responded very well in dealing with the accident but, as they themselves identify in the IAEA report, there were facets that could have been handled better. Their openness and willingness to co-operate with the IAEA in a post-accident review, as was also the case with the USSR after Chernobyl, must be strongly commended, and it is hoped that in future other national authorities will emulate this.
REFERENCES

1. I.A.E.A., Nuclear Safety Review for 1987, IAEA, Vienna, 1988.
2. I.A.E.A., The Radiological Accident in Goiânia, STI/PUB/815, IAEA, Vienna, 1988.
3. I.C.R.P., Protection of the Public in the Event of Major Radiation Accidents: Principles for Planning, Publication 40, Pergamon Press, Oxford, 1984.
4. I.A.E.A., Principles for Establishing Intervention Levels for the Protection of the Public in the Event of a Nuclear Accident or Radiological Emergency, Safety Series No. 72, IAEA, Vienna, 1985.
5. I.A.E.A., What the General Practitioner (MD) Should Know about Medical Handling of Overexposed Individuals, IAEA-TECDOC-366, IAEA, Vienna, 1986.
6. Hill, M.D., Wrixon, A.D. and Webb, G.A.M., Protection of the public and workers in the event of accidental releases of radioactive materials into the environment, J. Radiol. Prot., 1988, Vol. 8, No. 4, 197–207.
CURRENT RADIATION RISK ESTIMATES AND IMPLICATIONS FOR THE HEALTH CONSEQUENCES OF THE WINDSCALE, TMI AND CHERNOBYL ACCIDENTS
Roger H Clarke
National Radiological Protection Board, Chilton, Didcot, Oxon OX11 0RQ
Abstract

The publication of the 1988 report by the United Nations Scientific Committee on the Effects of Atomic Radiation represents a major step forward in the estimation of the levels of risk associated with exposure to ionising radiation. Data are now available which quantify the effects of high doses received at high dose-rates in a number of organs and tissues as well as the whole body. The estimates of risk given by UNSCEAR for fatal cancers are up to four times higher than when the Committee last reported in 1977. This paper outlines reasons for the changes and summarises the risk factors NRPB has now adopted for protection purposes. These factors are used to re-examine the consequences of the three major reactor accidents that have occurred, in terms of risks to the most exposed individuals and the population as a whole.
INTRODUCTION During the last 18 months it can hardly have escaped the notice of any person in the nuclear business— medicine or power—that our estimates of the risks from exposure to ionising radiation have increased. The main reason is the new information from the survivors of the atomic bombs at Hiroshima and Nagasaki. The International Commission on Radiological Protection (ICRP) felt it necessary to make a statement following its meeting in Como in September 1987(1), and my own organisation—NRPB—gave advice to our Government departments and agencies with regulatory responsibilities to consider the implications of the change in risk estimates for dose limits(2). The major addition to our information since that time has been the report of the United Nations Scientific Committee on the Effects of Atomic Radiation (UNSCEAR) on human radiation carcinogenesis(3). This Committee has reviewed all the human epidemiological data to assess the risks associated with exposure to ionising radiation and the full report with technical annexes was published at the end of 1988. The publication of this report signals an
international consensus on the effects of human radiation exposure, albeit at high doses and high dose-rates. The problem that remains is to interpret what this means at the low doses and low dose-rates to which workers and members of the public are exposed. The Board has drawn on the UNSCEAR work to develop a comprehensive set of organ and tissue risk factors applicable to exposure of the UK population. These data have been used in the preparation of evidence we gave to the public inquiry into the proposal to build a Pressurised Water Reactor at Hinkley Point in Somerset(4,5). In this paper I review the radiation risk data and then apply them to evaluate the health consequences of the three major reactor accidents that are known to have occurred. Our radiation risk estimates are broadly applicable to populations of Western Europe and North America, so that both individual and population risks resulting from those accidents can be derived.

Estimates of the risk of radiation-induced cancer in human populations

The estimates of risk that NRPB has made are based on the 1988 UNSCEAR review of the experience obtained from human population groups exposed to high doses of radiation(3). Although there are several such groups, including patients treated for ankylosing spondylitis and workers exposed in the radium luminising industry, the single most important source of information is from the survivors of the Hiroshima and Nagasaki atomic bombs. This population of more than 90,000 people represents the largest group exposed to significant whole body irradiation. The new risk estimates are largely based on the latest data on the Japanese and take account of three main changes: dosimetry; epidemiology; and projection into the future. The first is a revision of the dosimetry, known as DS86, to allow, amongst other factors, for the high humidity in the air over the cities, which substantially reduced the neutron dose. The earlier 1965 (T65DR) estimates were based on measurements in the dry atmosphere of the Nevada Desert. Improved estimates have also been made of weapon yield, tissue and organ doses, and of the shielding provided by buildings. The second change is that the number of excess cancers in the population has increased since UNSCEAR last reported in 1977(6), owing to the longer time for epidemiological follow-up, to 1985. The third change arises because lifetime cancer experience is not yet available for any of the large epidemiological studies. Therefore, to project the overall cancer risk for an exposed population, it is necessary to use mathematical models that extrapolate, over time, data based only on a limited period of the lives of the individuals. The two projection models that have been used in the past are (a toy numerical comparison of the two is sketched below):

(a) the additive model, which postulates that the annual risk of cancer arises after a period of latency and then remains constant over time; and

(b) the multiplicative model, in which the time distribution of the excess risk follows the same pattern as the time distribution of natural cancers, i.e. the excess (after latency) is given by a constant multiplying factor applied to the age-dependent incidence of natural cancers in the population.
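The difference between the two models can be made concrete with a toy calculation. The baseline rates, latency and excess parameters in the sketch below are invented purely for illustration; this is not the UNSCEAR or NRPB calculation and the printed numbers have no radiological meaning.

```python
# Toy comparison of the additive and multiplicative projection models for the
# lifetime excess risk after an exposure at age 30.  Baseline rates, latency
# and excess parameters are invented for illustration only; this is not the
# UNSCEAR or NRPB calculation.

AGE_AT_EXPOSURE = 30
LATENCY_YEARS = 10
MAX_AGE = 85

def baseline_rate(age):
    """Crude stand-in for an age-dependent natural cancer mortality rate."""
    return 2e-5 * (age / 10.0) ** 3       # per year

def additive_excess(extra_rate=1e-4):
    """Constant annual excess risk once the latent period has elapsed."""
    return sum(extra_rate for _ in range(AGE_AT_EXPOSURE + LATENCY_YEARS, MAX_AGE))

def multiplicative_excess(rel_factor=0.1):
    """Excess proportional to the rising baseline once latency has elapsed."""
    return sum(rel_factor * baseline_rate(age)
               for age in range(AGE_AT_EXPOSURE + LATENCY_YEARS, MAX_AGE))

print(f"additive lifetime excess       ~ {additive_excess():.3f}")
print(f"multiplicative lifetime excess ~ {multiplicative_excess():.3f}")
```

Because the baseline climbs steeply with age, the multiplicative model keeps adding excess risk late in life, which is why it yields the higher lifetime estimates discussed below.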
The data now available provide a deeper insight into the applicability of the two models, and UNSCEAR states in its 1988 report(3) that recent findings in Japan suggest the multiplicative risk projection model is the more likely, at least for some of the most common cancer types. NRPB has adopted the multiplicative model for estimating lifetime risks of most solid cancers. An implication of the use of the multiplicative risk model is that for the majority of solid cancers it results in an increasing risk with time after exposure, following the increase in natural incidence with age. There are indications, at least in some groups exposed to radiation, that the excess risk of cancer starts to decline many years after exposure. This has been well documented for leukaemia, but has also been observed at long times after exposure in the case of solid cancers for the spondylitics and possibly for some other irradiated groups. These results suggest that in the Japanese survivors the excess risk may ultimately decrease with time, and thus multiplicative projection models applied over the lifetime could result in an overestimate of the cancer risks.
For a population of all ages, the 1988 UNSCEAR report(3) derives, using the multiplicative model, a fatal cancer risk of 7 to 11×10⁻² Gy⁻¹ following whole body γ-exposure at high dose and high dose-rate. The lower figure arises from the use of an age-averaged relative risk and the higher figure from the use of age-specific relative risks. For a working population aged 25–64 years the multiplicative model gives a risk of 7–8×10⁻² Gy⁻¹. This compares with the Committee’s 1977 assessment(4) of 2.5×10⁻² Gy⁻¹ using the additive model. The Committee in its 1988 report gives an indication of the effect of different risk models at high doses by applying the additive model, which gives a fatal cancer rate of 4–6×10⁻² Gy⁻¹ for a population of working age (25–64 y) compared with 4–5×10⁻² Gy⁻¹ for a population of all ages. The data are summarised in Table 1 for the two models and both populations. The differences between the two populations are less for the additive model because the estimates are not based on the natural cancer incidence, which increases with age. In 1977, the Committee pointed to uncertainties in two directions: the value derived from high doses was an under-estimate because no projection had been made into the future, but it was also an over-estimate in the sense that the risk per unit dose at low doses and low dose-rates was believed to be lower than the estimates at high doses and high dose-rates. Extrapolation into the future is still uncertain because about two-thirds of the Japanese survivors are still alive and the additional cancer risk has still to be expressed. The exception is the risk of leukaemia, which appears more certain as nearly all the excess now seems to have occurred. The problem of low doses and low dose-rates remains.

Cancer risk estimates for protection purposes

The assumption normally made for radiation protection purposes is that the risk of radiation-induced cancer is proportional to the dose, without threshold. For some human cancers, the dose-response data do suggest that the incidence of cancer is a linear function of the dose, at least for the dose range over which information is available.
This applies to thyroid cancer, breast cancer and possibly to leukaemia following exposure to x- or γ-radiation at high dose-rates(3,7).
There is, however, information from animal studies which indicates that the induction of radiogenic cancer by x- or γ-radiation depends on the dose-rate, and a dose-rate effectiveness factor (DREF) has been used by both UNSCEAR and ICRP to estimate the risk at low doses and dose-rates. At low dose-rates the numbers of cancers induced are lower, by a factor of between two and ten, than they are at high dose-rates(3,8). Dose-response data for cancer induction have most recently been reviewed in detail by UNSCEAR in 1986(6). It was concluded that for many cancers the assumption of a linear response when extrapolating from information at high doses and dose-rates could over-estimate risks at low doses and low dose-rates by a factor of up to five. In the 1988 UNSCEAR report a DREF of between two and ten is quoted. Table 2 summarises the results of reviews of dose-rate effectiveness factors. The majority of the available animal data indicate a DREF of between two and four for the induction of cancer at low dose-rates compared with that calculated at high dose-rates(4). The figure that has been adopted by NRPB is three for all cancers except breast cancer, for which a DREF of two is judged to be more appropriate. Recently Liniecki has updated the animal data on DREFs and concludes that a factor of 3 is an average value(9). We have also reviewed a number of human data sets from which DREFs can be obtained (how such a factor enters the risk calculation is sketched after this list). These include:

(a) thyroid irradiation, where protraction of the dose leads to lower risks by a factor of 3 or even 4;

(b) breast irradiation, where some fluoroscopy data give a dose-rate reduction factor of 3;

(c) lung cancer, where data from chronic exposures of uranium miners support a DREF of up to 5;

(d) bone marrow, where US radiologists have an incidence of leukaemia 3 times less than the Japanese; and

(e) solid tumours, where studies of occupationally exposed workers give a risk to date consistent with a reduction of 3 from the Japanese.
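How such a factor enters the calculation is simple: the coefficient observed at high dose and high dose-rate is divided by the DREF to give a figure for low doses and low dose-rates. A minimal sketch, taking the high-dose-rate coefficient as the order of the UNSCEAR multiplicative range quoted above, is:

```python
# Minimal illustration of how a dose-rate effectiveness factor (DREF) is
# applied: the risk coefficient observed at high dose and high dose-rate is
# divided by the DREF to give a figure for low doses and low dose-rates.

high_dose_rate_risk = 10e-2    # fatal cancers per Gy, order of the UNSCEAR
                               # multiplicative-model range (7-11 x 10^-2)
DREF = 3                       # value adopted by NRPB (2 for breast cancer)

low_dose_rate_risk = high_dose_rate_risk / DREF
print(f"~{low_dose_rate_risk:.3f} per Gy")     # ~3.3 x 10^-2 Gy^-1
```

The NRPB coefficients quoted in the next section also fold in UK demography and cancer rates, so they are not expected to equal this simple quotient exactly.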
As a result of this review, the Board believes that the DREF value of 3 is substantiated by the available human data.

Risk coefficients for radiation-induced fatal cancer in the UK population

The risks of radiation-induced cancer have been estimated by NRPB for a UK population of all ages and both sexes. The full description of the derivation of the risks is given by Stather et al(4). The total lifetime fatal cancer risk in the UK population following whole body irradiation at low dose-rates is 4.5×10⁻² Gy⁻¹, over three times the ICRP figure from 1977(10). If the calculation is repeated using the cancer deaths that have actually occurred in the Japanese population, a whole body risk of 1.4×10⁻² Gy⁻¹ is obtained. This figure can be taken as the minimum value of risk, so that the actual risk is from at least 1.4 up to 4.5×10⁻² Gy⁻¹. The fatal cancer risk estimated for a UK adult population (ages 20–64 years, both sexes) exposed to whole body radiation at low dose-rates is from at least 1.8 to 3.4×10⁻² Gy⁻¹, up to nearly three times the 1977
ICRP figure. The lower figure, 1.8% Sv⁻¹, represents the risk to date, with allowance for a DREF of 3; studies on AEA and BNFL workers show a risk to date of up to 2% Sv⁻¹, with significant error bars. The fatal cancer risks for protection purposes derived by NRPB are summarised in Table 3. For an overall assessment of the risk of cancer in a population, information is required on incidence as well as fatality. Table 4 shows percentages of fatalities among incident cancer cases associated with various types of cancers. These values are based on fatality rates for radiation-induced cancers given in ICRP Publication 45(11) and on figures published by the Office of Population Censuses and Surveys(12) on cancer fatality in the general population of England and Wales. Recent improvements in the treatment of childhood leukaemia mean that about half of all cases are now classed as curable; the prognosis for adult leukaemia is not as good. The survival rate averaged over all ages is decreased relative to the value given by OPCS if, as in Table 4, attention is restricted to radiation-induced leukaemias. This is because chronic lymphatic leukaemia, which does not appear to be radiation-inducible and is hence excluded from the leukaemia grouping, has better survival rates than some other types of leukaemia. The fatality rate of 10% for radiation-induced thyroid cancer given in Table 4 is that quoted by NCRP(8). This value is less than that for thyroid cancer in the UK generally(12), since the two types of thyroid cancer that appear to be radiation-inducible have lower fatality rates than do other types. Although there is some variation in the fatality rates with age, the age-averaged value is given in the Table. Based on the values in Table 4 and the excess numbers of cancer deaths in Table 3, the excess numbers of non-fatal cancers arising from radiation exposure can be calculated to be about 50% of the risk of fatal cancers. Let me now turn to the reassessment of reactor accidents, in the light of the new risk data.

The Windscale Accident 1957
On 10 and 11 October 1957, a fire in the No. 1 Pile at the Windscale establishment in Cumbria led to an uncontrolled release of activity to atmosphere. The resultant plume of activity travelled over much of England and parts of northern Europe. The pile was air-cooled on a once-through basis so that the fission products released from the oxidising fuel were carried through the core with the coolant and were discharged from a 120 m chimney. In addition to the fission products, other nuclides were released, the most notable being polonium–210 which was being produced by neutron irradiation of bismuth ingots in the core. The quantities of radionuclides now thought to have been released have been determined by Crick and Linsley(13,14) and are shown in Table 5. The major releases of activity occurred around 2400 hours on 10 October when CO2 injection was used and between 0900 and 1100 hours on 11 October when water was initially introduced.
At the time of the accident, the radionuclide identified as being of principal concern was iodine–131 and its route to the population was through consumption of cows’ milk. The prompt imposition of a ban on milk supplies had the effect of reducing intakes of radioiodine via the pasture-cow-milk pathway. The average radiation doses to the thyroids of the local population were typically 5 to 20 mSv for adults and 10 to 60 mSv for children. Because the release was from a tall stack, the peak doses were received some 3 km downwind of Windscale. The maximum measured activity in a child’s thyroid was reported to correspond to a dose equivalent of 160 mSv(1) which, when allowance is made for other nuclides and pathways, gives a maximum individual effective dose equivalent of about 9 mSv. The contribution to individual effective doses from polonium–210 is approximately 2 mSv, which is equivalent to about one year of natural background radiation. About 90% of the polonium dose is due to inhalation from the plume and 10% from ingestion of contaminated foodstuffs. The highest organ irradiation was to the thyroid and the expected health effect would be an increase in the risk of thyroid cancer—the majority being non-fatal although requiring treatment. The appearance of any cancers would be over a few decades after irradiation. For a standardised population, the time-integrated risk of thyroid cancer is taken to be about 2.5×10⁻³ Sv⁻¹ (Table 6) for internal irradiation by iodine–131, of which some 10% is thought to be fatal(4). The lifetime risk of radiation-induced thyroid cancer in the most exposed individuals may be estimated at about 4×10⁻⁴. These risks are for the few infants measured who were estimated to receive thyroid doses of up to 160 mSv; for adults with the highest doses, the risks are about a factor of five less. The highest annual risk thus corresponds to about 8×10⁻⁶ per year. The average natural risk of thyroid cancer in the UK is about 9×10⁻⁶ per year for males and 22×10⁻⁶ per year for females—lower at younger ages and increasing throughout life. The risks to those who were infants in 1957 are less than twice the natural incidence, and so few children were involved at these doses that it is unlikely that any excess cases will be observed. Fatality rates from radiation-induced thyroid cancer are less than 10⁻⁶ per year, again much less than the UK natural fatality rate. Assuming polonium to be moderately soluble in the lung, the risk of lung cancer from polonium–210 is estimated at about 2×10⁻⁴ for the most exposed individuals. This gives an annual risk of 5×10⁻⁶, compared with a natural risk of about 1068×10⁻⁶ for men and 383×10⁻⁶ for women in 1986 in the UK(15). Again, the additional risks from the Windscale accident are small in comparison with natural cancer rates. Taking into account all nuclides and pathways, the total fatal risk to the most exposed individual is about 3×10⁻⁴, or 6×10⁻⁶ per year. Table 7 shows the collective effective dose equivalent commitments from the accident, received in the local area, the whole of the UK and Europe (including Scandinavia). The inhalation route is the most important, contributing 50% of the effective dose commitment; the milk ingestion pathway accounts for 30%; and the remaining 20% is shared between the other ingestion pathways and the external irradiation.
The most important radionuclides contributing to the collective effective dose equivalent commitment are iodine–131 (37%) and polonium–210 (37%), followed by caesium–137 (15%). The collective dose equivalent commitment for the accident in the UK can be used to predict a number of cancer deaths using the risk factors in Table 6. However, at the very low levels of individual dose involved, it is questionable whether the risks are real; the predictions can only be regarded as an upper estimate, and the most likely number of health effects will be lower and may be zero. On this basis there would be something of the order of 100 fatal cancers, as shown in Table 8, arising in the UK population over a period of 40–50 years. There would be about an equal number of non-fatal effects, of which some 90% would be cancers and some 10% hereditary defects. Thyroid cancers would make up about 60% of the non-fatal cancers. An incidence of between 1 and 2 thyroid cancers a year over 40 years in the whole of the UK, if real, may be compared with a natural incidence in England and Wales of over 700 per year, of which 30% are in males and 70% in females. The largest contribution to the fatal cancers is from polonium–210, but even this gives a rate of less than 1 per year in the UK population, in comparison with the natural fatal cancer rate of about 140,000 per year, of which 41,000 are of the lung. Since members of the UK population are also subject to background radiation of about 2.5 mSv each year, the effects of the Windscale accident, which gave only a few days' equivalent of background radiation over most of England and Wales, are unlikely to be seen. Even the release of polonium–210 was only equivalent to about 10% of the annual exposure of the population to naturally occurring polonium–210. It therefore seems unlikely that any effects will be seen which can be attributed to the Windscale accident.

The Three Mile Island Accident 1979
The basis of this PWR accident, the failure to close off a pressure relief valve, was straightforward, but the details are extremely complicated and led to severe damage of uncooled fuel. Recent comprehensive estimates of the releases of radionuclides from Three Mile Island have been given by Behling and Hildebrand(16) and the more important figures are given in Table 9. The quantity of iodine–131 released was nearly 1000 times less than in the Windscale accident, although that of xenon–133 was 25 times greater. In comparison with the noble gases, far less caesium, strontium and iodine were released from Three Mile Island than from Windscale because of the tortuous pathway from the core to atmosphere and because, the containment system being intact, the air inside the buildings was released through the normal filtration system. The principal pathway of exposure of the public was external radiation from the radioactive decay of the noble gases in the plume as it dispersed. Off-site monitoring has confirmed this result and the maximum individual off-site dose from external γ rays has been estimated at 0.83 mSv on the basis of TLD detectors at 0.5 miles from the site. No member of the public could have been closer to the site(16). The average dose to those individuals within 10 miles of the site has been estimated at 0.08 mSv effective dose equivalent. The highest individual thyroid dose is less than 0.2 mSv, the collective thyroid dose equivalent
commitment is between 14 and 28 man-Sv out to 50 miles and the collective effective dose equivalent commitment between 16 and 53 man-Sv, with a most probable value of 33 man-Sv within 50 miles. The total collective effective dose equivalent commitment is estimated by UNSCEAR(17) to be about twice the 50-mile values. Although the total quantity of activity released from Three Mile Island was much greater than from Windscale, the dosimetric consequences are far less (by a factor of 10) because of the different radionuclides, and therefore pathways of exposure, involved. Based on the highest individual effective dose of 0.8 mSv, the risk to such an individual is 3.6×10-5 over the next few decades, or perhaps 10-6 per year. Estimates of the number of health effects in the 2,000,000 people who live within 50 miles can be derived from the risks in Table 3: one fatal cancer among the 325,000 fatal cancers that will occur in that population, less than one non-fatal cancer among the 216,000 that will occur in that population, and less than one genetic defect among the 78,000 expected from the offspring of that population(16). If the whole population of the USA is considered, the number of theoretical cancers from Three Mile Island could double, but the expected naturally occurring numbers increase by a factor of about 100.

The Chernobyl Accident 1986

The RBMK reactor at Chernobyl unit 4 was one of 15 brought into operation in the USSR and is a graphite-moderated reactor cooled by water in pressure tubes surrounding each fuel element. The water boils in the upper part of the vertical tubes to produce steam which is fed directly to turbines to generate electricity. In this accident, the operators reduced the flow of water through the core, leading to steam voids which, in this design, made the reactor produce more power, for which there was insufficient cooling. The automatic safety shutdown systems had been made inoperative, and the continuing input of power meant that the reactor eventually went super prompt critical, achieving 100 times normal full power within 4 s. The energy released by the power excursion ruptured fuel and caused a steam explosion which shifted the 1000-tonne reactor cover plate and resulted in all cooling channels being cut off. After 2 or 3 s, a second explosion (possibly hydrogen) occurred and hot pieces of fuel were ejected directly from the destroyed reactor building. The graphite, now open to the air, caught fire and the release of radionuclides continued over 10 days. The nuclides now thought to have been released are shown in Table 10. There was a total release of the noble gases; the release of iodine was about 20% of that available in the core, that of caesium about 10%, while about 3% of the rare earths and actinides were released. Some 3–4% of the core is believed to have been ejected to the atmosphere, of which some 0.3–0.5% was deposited on the site and some 1.5–2% within 20 km, leaving about 1–1.5% of the core dispersed beyond 20 km(18). The Soviet authorities have reported that acute radiation syndrome of varying clinical severity was diagnosed in 203 subjects, all of whom were workers involved on the site. The doses were principally due to external irradiation by β and γ rays and were in the range 2–16 Gy. A complication in the treatment of these patients was the development in 48 individuals of severe skin burns covering up to 90% of the body
from aerosols deposited on the surface of the skin and clothes. None of the 135,000 members of the public evacuated from the 30 km zone around Chernobyl was diagnosed as exhibiting acute radiation syndrome; all were promptly medically examined. The highest contributions to dose came from iodine–131 in milk and from caesium–137, ingested in foodstuffs and externally from its deposition on the ground. The external radiation doses to most of those evacuated were less than 250 mSv, although a few in the most contaminated areas might have received doses up to 300–400 mSv. The collective dose from external radiation to the 135,000 is estimated at 16,000 man-Sv. Individual children's thyroid doses may have been as high as 2.5 Sv, although the average thyroid dose is 300 mSv, giving a collective thyroid dose of about 40,000 man-Sv and adding 1,200 man-Sv to the collective effective dose equivalent. The collective doses in various regions of Europe and the Soviet Union are shown in Table 11. The Soviet authorities have estimated 2,000,000 man-Sv in the European Soviet Union, and the nuclide which has contributed most of the dose is caesium–137. It may be that the final figure is 200,000 man-Sv because of overestimation by the Soviet authorities(18). The collective effective dose over all time to the members of the European Community has been estimated(19) at 80,000 man-Sv and the collective dose to the whole of Europe outside the USSR is likely to be about twice this value. The collective thyroid dose within the European Community is about 180,000 man-Sv, making a contribution of 5,400 man-Sv to the collective effective dose. The dose to the UK was 3,000 man-Sv, with 330 man-Sv being the contribution to the collective effective dose equivalent from thyroid irradiation. The main pathways contributing to the collective dose in Western Europe are the consumption of contaminated foodstuffs and external radiation from deposited material. In the first year food consumption is the dominant pathway, but in later years external radiation from deposited material increases in importance. For both pathways, caesium–137 is the most important nuclide.

For the 135,000 evacuees who received high doses, the collective whole-body dose is estimated at 1.6×104 man-Sv and should lead, on the basis of present risk estimates, to 700 late fatal cancers. The contribution from doses to the thyroid from iodine–131 would increase this estimate. If the average dose to the thyroids of those evacuated was 300 mSv(18), then the collective thyroid dose would be about 40,000 man-Sv, leading to perhaps 10 fatal thyroid cancers and a substantially higher number of non-fatal cases. The cancers would appear over the next 40 years or so, and about 20% of those evacuated would be expected to die of cancer anyway, based on Western cancer statistics, i.e. some 27,000. The estimated extra 700 fatalities would constitute about 2% of the spontaneous cases and could perhaps be identified in an epidemiological study. Within Europe about 1,000 theoretical non-fatal cancers would be predicted over the next few decades, if the risks are real at the levels of dose involved. The number of theoretical fatalities from all cancers due to Chernobyl in Europe is predicted to be in the region of 3,000, spread out over a few decades following the
accident. Over the same period about 20 million or so people are expected to die in European Community countries from cancer of one type or another. The effects of Chernobyl seem unlikely to be seen in any epidemiological study undertaken in Europe.

CONCLUSIONS

In this paper I have shown that the likely estimates of risk from exposure to ionising radiation have increased from the 1977 ICRP values by about a factor of 3. I have utilised these new risk estimates to review the individual and population risks from the three reactor accidents known to have released radioactive materials to the atmosphere. In none of the three accidents has there been any off-site fatality in the short term, and long-term detriment to health is unlikely to be seen against the natural cancer incidence rate, except possibly amongst the evacuees at Chernobyl. However, despite the severe nature of the accidents, their impact on the public was largely one of perceived and not real risk. For example, over most of England the dose incurred from Chernobyl was about equivalent to that of flying to Spain and back in a jet aircraft. The major task for the future is better communication with the public. There is a need to explain why people are expected to continue living normally in any enhanced artificial radioactive environment; although the dose may be very small, activity is measurable and causes concern. The fact that in radiological protection we assume a non-threshold response means that there is no level of zero risk, and the problem to be faced is to argue for the acceptability of the presumed risk. A risk of 10-6 may be trivial to the individual, but if 50 million people are involved, 50 theoretical deaths result: a major disaster comparable with an air crash. It proves very difficult to explain to the public, or at any rate the media, the differences between these situations. Could not more use be made of competent professionals (environmental health officers, community physicians, hospital physicists, general practitioners, for example) in our response to the nuclear accident I hope we never have?

REFERENCES
(1) ICRP. Statement from the 1987 Como meeting of the ICRP. Annals of the ICRP, 17, 4, 1987.
(2) NRPB. Interim guidance on the implications of recent revisions of risk estimates and the ICRP 1987 Como Statement. Chilton, NRPB–GS9. London, HMSO, 1987.
(3) UNSCEAR. Sources, effects and risks of ionising radiation. 1988 report of the United Nations Scientific Committee on the Effects of Atomic Radiation to the General Assembly, with annexes. New York, United Nations, 1988.
(4) Stather, J.W., Muirhead, C.R., Edwards, A.A., Harrison, J.D., Lloyd, D.C. and Wood, N. Health effects models developed from the 1988 UNSCEAR report. Chilton, NRPB–R226, 1988.
(5) Clarke, R.H. Statement of Evidence to the Hinkley Point C Inquiry. NRPB–M160, 1988.
(6) UNSCEAR. Sources and effects of ionising radiation. 1977 report of the United Nations Scientific Committee on the Effects of Atomic Radiation to the General Assembly, with annexes. New York, United Nations, 1977.
(7) UNSCEAR. Genetic and somatic effects of ionising radiation. 1986 report of the United Nations Scientific Committee on the Effects of Atomic Radiation to the General Assembly, with annexes. New York, United Nations, 1986.
(8) NCRP. Influence of dose and its distribution in time on dose-response relationships for low-LET radiation. Bethesda, MD, National Council on Radiation Protection and Measurements, Report No. 64, 1980.
(9) Liniecki, J. Cancer risk estimates for high doses and dose rates and extrapolation to the low dose domain. IBC Conference on Low Dose Effects, London, 1989.
(10) ICRP. Recommendations of the International Commission on Radiological Protection. ICRP Publication 26. Annals of the ICRP, 1, 3, 1977.
(11) ICRP. Quantitative bases of developing a unified index of harm. ICRP Publication 45. Annals of the ICRP, 15, 3, 1985.
(12) OPCS. Cancer Statistics: incidence, survival and mortality in England and Wales. Studies on Medical and Population Subjects No. 43. London, HMSO, 1981.
(13) Crick, M.J. and Linsley, G.S. An Assessment of the Radiological Impact of the Windscale Reactor Fire, October 1957. National Radiological Protection Board Report R135. London, HMSO, 1982.
(14) Crick, M.J. and Linsley, G.S. An Assessment of the Radiological Impact of the Windscale Reactor Fire, October 1957. Int. J. Radiat. Biol., 46, 5, 479–506, 1984.
(15) Office of Population Censuses and Surveys. Mortality Statistics by Cause. London, HMSO, 1986.
(16) Behling, U.H. and Hildebrand, J.E. Radiation and Health Effects: a Report on the TMI–2 Accident and Related Health Studies. GPU Nuclear Corporation, Middletown, PA 17057, USA, 1986.
(17) UNSCEAR. Ionising Radiation: Sources and Biological Effects. 1982 report of the United Nations Scientific Committee on the Effects of Atomic Radiation to the General Assembly. New York, United Nations, 1982.
(18) International Atomic Energy Agency. Summary Report on the Post-accident Review Meeting on the Chernobyl Accident. Safety Series 75–INSAG–1. IAEA, Vienna, 1986.
(19) Morrey, M., Brown, J., Williams, J.A., Crick, M.J. and Simmonds, J.R. A Preliminary Assessment of the Radiological Impact of the Chernobyl Reactor Accident on the Population of the European Community. Health and Safety Directorate, Luxembourg, 1987.
TABLE 1 Projected lifetime risks of fatal cancer following whole body γ-exposure to high doses at high dose rate (10-2 Gy-1)
TABLE 2 Summary of dose rate effectiveness factors
TABLE 3 Fatal cancer risk factors for radiation protection purposes in a UK population (10-2 Sv-1)
TABLE 4 Fatality rates for radiation-induced cancers
TABLE 5 Estimates of radionuclides released from the 1957 Windscale accident(13, 14)
TABLE 6 Estimated 1988 lifetime fatal cancer risk in a UK population (all ages, both sexes) associated with low level exposure to ionising radiation(8) compared with ICRP(9)
TABLE 7 Collective Effective Dose Equivalent Commitment (man-Sv) due to the Windscale 1957 Fire(14)
TABLE 8 Maximum numbers of health effects predicted from the Windscale accident
TABLE 9 Major radionuclide releases estimated for the 1979 Three Mile Island Accident
TABLE 10 Estimates of radionuclides released from the 1986 Chernobyl accident
TABLE 11 Collective Doses from the Chernobyl Reactor Accident
TABLE 12 Estimates of the theoretical numbers of health effects from the Chernobyl accident risk model
THE ROLE OF BIOLOGICAL DOSIMETRY IN A RADIOLOGICAL ACCIDENT IN THE UK
DAVID LLOYD National Radiological Protection Board, Chilton, Didcot, Oxon, OX11 0RQ, UK
ABSTRACT Reasonably accurate estimates of radiation dose can be made from the study of chromosomal aberrations in peripheral blood lymphocytes. Experience in using this technique in the UK over 20 years is reviewed. The estimation of dose in accidents that may involve non-acute or inhomogeneous exposure from external sources or incorporation of radionuclides is discussed. Consideration is given to the resources that could be made available after a serious UK accident involving many subjects. Early biological estimates of dose need only be approximate as physicians initially would use the information for confirming their correct sorting of casualties from clinical symptoms. Later, more accurate estimates for highly but inhomogeneously irradiated subjects could help in planning treatment for severe bone marrow damage. Biological dosimetry also has important roles in reassuring unexposed persons and in counselling those irradiated regarding their concerns for the risk of induced late stochastic effects.
INTRODUCTION

A number of biochemical or cytological assays have been proposed as biological indicators of exposure [1]. In general they suffer from a variety of drawbacks concerning factors such as poor sensitivity, particularly to doses below about 1 Gy, and fluctuating inter- or intra-donor variability in response. In some instances the response to irradiation is too transient for practical application. Cytogenetic dosimetry, however, stands out as being the most sensitive and widely used of the possible biological dosemeters, and discussion in this paper will be restricted mainly to this technique. Biological dosimetry using chromosomal damage in peripheral blood lymphocytes after an accidental overexposure to radiation was first performed on victims of the Recuplex criticality accident at Hanford in 1962 [2]. This demonstration, particularly as it involved exposure to neutrons, for which physical methods of dosimetry are difficult, excited considerable interest. In the UK a decision was taken to set up a biological dosimetry laboratory. It began operating in 1968, initially as part of the UKAEA, and in 1971 it transferred to the NRPB.
Figure 1. A metaphase spread from a human lymphocyte showing many chromosomal aberrations (not all labelled). c—centric ring; d—dicentric; t—tricentric; q— quadricentric; f—fragment.
Figure 2. In vitro dose response curves.
Fortunately it has not been necessary to undertake a dose estimation following a criticality accident in this country, but to date over 700 persons have been examined for exposure mostly to low LET radiation. The technique has come to occupy a valuable niche in radiological protection. It supplements physical methods of dosimetry and indeed where these are unavailable the biological method may be the only means of quantifying the dose. Because of its cost the technique is only employed when overexposure is suspected or known to have occurred; the method does not replace physical means of routine personal monitoring.
THE CYTOGENETIC METHOD

In essence the procedures are quite similar to those employed in the early 1960s. Lymphocytes from a peripheral blood sample are stimulated with the drug phytohaemagglutinin (PHA) to enter their division cycle. They are maintained in culture for 2 days until some have reached metaphase, when they are harvested onto microscope slides, stained and scored for the presence of radiation-induced aberrations. An equivalent whole-body dose is estimated by comparing the aberration yield with a dose response calibration curve produced by in vitro irradiation of blood. The incidence of dicentric aberrations is taken as the main indicator of dose. This is because the dicentric, being predominantly caused by radiation, has a low control frequency in subjects exposed only to background radiation. The in vitro dose response relationships for dicentrics are reproducible and relatable to in vivo exposure. These aberrations are also the most readily identified under the microscope. Since the 1960s extensive research and development has refined the technique so that much of the uncertainty associated with factors like variability of cell cycling speeds has now been removed. The in vitro dose response is also well understood and good calibration data are available for a large range of radiation qualities, including all those likely to be encountered in accidents. For a detailed account of the radiation-induced aberrations, laboratory procedures and the way that doses are calculated from the aberration yields, the reader is referred to a recent manual published by the International Atomic Energy Agency [3]. Figure 1 is a metaphase from a highly irradiated lymphocyte and illustrates most of the aberrant forms of chromosomes induced by radiation. Figure 2 shows a range of dose response curves illustrating how the relative biological effectiveness varies with radiation quality. It also shows how the shapes of the curves range from linear for high LET radiations, such as fission neutrons, to curvilinear for X and γ rays. Moreover, for the low LET radiations there is a pronounced dose rate effect that is especially evident at the higher doses. Dose response relationships, and how these depend on factors such as dose rate and radiation quality, have been extensively reviewed by Lloyd and Edwards [4].
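As an illustration of the final step, converting a measured dicentric yield into an equivalent whole-body dose amounts to inverting a calibration curve of the form Y = C + αD + βD². A minimal sketch is given below; the coefficient values are purely illustrative and are not taken from any particular laboratory's calibration.

```python
import math

# Illustrative calibration coefficients for an acute low LET exposure (not real laboratory values):
# dicentric yield per cell Y = C + alpha*D + beta*D**2, with D in Gy.
C = 0.001      # background dicentric frequency per cell
alpha = 0.03   # dicentrics per cell per Gy (linear term)
beta = 0.06    # dicentrics per cell per Gy^2 (quadratic term)

def dose_from_yield(dicentrics, cells_scored):
    """Invert the quadratic calibration curve to give an equivalent whole-body dose in Gy."""
    y = dicentrics / cells_scored - C
    if y <= 0:
        return 0.0
    # positive root of beta*D**2 + alpha*D - y = 0
    return (-alpha + math.sqrt(alpha**2 + 4 * beta * y)) / (2 * beta)

# Example: 26 dicentrics found in 500 metaphases.
print(f"estimated equivalent whole-body dose ~ {dose_from_yield(26, 500):.2f} Gy")
```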
EXPERIENCE TO DATE IN THE UK

Tables 1 and 2 summarise the number of cases examined by the NRPB laboratory up to December 1988. The majority have arisen from industrial uses of radiation, particularly non-destructive testing, and have mostly involved just one or two persons. The majority of subjects (66%) have been shown by chromosomal analysis to have sustained “zero dose”; strictly speaking, exposure below the limit of
TABLE 1 The origins of the cases referred to NRPB for chromosomal analysis 1968–1988
TABLE 2 The numbers of cases and reasons for them being referred to the NRPB for chromosomal analysis 1968–1988
detection. 68% of subjects have been studied to determine whether values recorded by personal dose meters such as film badges or thermoluminescent devices accurately reflect exposure, if any, to the wearer. In the remaining cases dosimetric information from physical methods was not available.
SENSITIVITY

The lower limit of detection of the cytogenetic method is an equivalent whole-body dose of around 100 mGy of gamma rays such as those from cobalt–60. The technique is possibly a little more sensitive to 250 kVp X-rays, and certainly doses as low as 10–20 mGy of fission spectrum neutrons can be detected. Because of sampling statistics, dose estimates, particularly at low levels, carry considerable uncertainty, and these are usually expressed as 95% confidence limits. Table 3 illustrates a range of gamma dose estimates and their limits, showing how the latter depend on the numbers of metaphases scored for dicentrics.
TABLE 3 The 95% confidence limits on dose estimates from 0.1–3.0 Gy of gamma rays showing how the limits are influenced by the numbers of metaphases examined
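The width of the confidence limits in Table 3 is governed largely by Poisson counting statistics on the number of dicentrics observed. The sketch below illustrates the counting-statistics part of the problem alone, using a normal approximation to the Poisson interval and an invented yield typical of the region around 1 Gy; in practice exact Poisson limits and the calibration-curve uncertainty would also be taken into account.

```python
import math

# How counting statistics alone tighten a dicentric yield estimate as more metaphases are scored.
true_yield = 0.05   # assumed dicentrics per metaphase, invented for illustration
for cells in (25, 100, 500, 2000):
    expected_count = true_yield * cells
    half_width = 1.96 * math.sqrt(expected_count)   # approximate 95% interval on the count
    print(f"{cells:5d} cells scored: yield {true_yield:.3f} "
          f"+/- {half_width / cells:.3f} per cell (95%, normal approximation)")
```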
The upper limit of dose estimation is well into the lethal range of exposure, although above about 8 Gy of gamma rays the measured response tends to flatten off owing to a saturation phenomenon. Therefore, for exceptionally high exposures dose estimation is uncertain. This was amply illustrated by the 1982 Norwegian fatality [5], where 5 dicentrics per cell were scored and this was interpreted as a dose well in excess of 10 Gy. It was finally concluded, from methods such as electron spin resonance on nitro-glycerine tablets in the patient's pocket, that the most likely averaged whole-body dose was approximately 22 Gy.
NON-UNIFORM EXPOSURE

In practice accidental overexposures almost invariably involve partial or inhomogeneous exposure. Therefore it may be necessary to attempt some refinement of the equivalent whole-body dose estimate to indicate the extent of the inhomogeneity. There are of course statistical limitations to this approach. Thus it may not be possible to detect an elevated aberration yield for exposures confined to small volumes of the body, eg. a hand burn, even if the local dose is so high as to require amputation. However, if the exposure involves more than just an extremity then sufficient numbers of lymphocytes may be exposed for damage to be detected in the sample of cells scored. For a uniform irradiation the distribution of aberrations among the cells is Poisson. For non-uniform irradiation the aberrations
are distributed among the cells with a variance greater than that predicted by the Poisson distribution and it is sometimes possible to demonstrate this. By examining the magnitude of this overdispersion it is possible to estimate approximately the proportion of the body exposed and its average dose. The procedure requires a correction factor to take account of in vivo and in vitro selection due to cell killing or mitotic delay that reduces the likelihood of the irradiated fraction of cells coming to metaphase by 48 hours in culture. Also some simplifying assumptions are required regarding the distribution of PHA responsive lymphocytes within the body. The recent IAEA manual [3] gives a more detailed description of the computational steps and some worked examples of this procedure. The limits of resolution for this approach are currently being investigated. Some preliminary unpublished results indicate that it may be possible to identify persons who have received a high life-threatening exposure to much of the body but with perhaps 5–10% of their tissue being spared or only sub-lethally irradiated. The objective of this work is to determine whether such information could be linked to other haematological data collected during the first few days after an accident [6]. These may indicate the likelihood that some areas of active bone marrow may have been spared and, given time, could proceed to natural repopulation. Such information would be of value to physicians in planning the management of marrow crises. It should be emphasised that the distribution analysis of cytogenetic damage observed in lymphocytes does not help to delineate the actual exposed areas of the body. These may be obvious if doses exceed an erythema threshold. Examination of chromosome aberrations in certain cells ensheathing the bases of hairs has been suggested as a local cytogenetic dosemeter [7]. At present, however, technical details regarding the yield and quality of metaphase spreads need to be overcome before this is a practical dosemeter.
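The partial-body analysis described above rests on testing whether the dicentrics are overdispersed relative to the Poisson distribution, conventionally by means of the variance-to-mean ratio and the associated u statistic. A hedged sketch of that test follows; the cell counts are invented for illustration, and a value of u greater than about 1.96 is the usual indication of significant overdispersion.

```python
import math

def dispersion_test(cells_with_k_dicentrics):
    """Variance-to-mean ratio and u statistic for an observed dicentric distribution.

    cells_with_k_dicentrics: element k is the number of cells containing exactly k dicentrics.
    """
    n = sum(cells_with_k_dicentrics)                                   # total cells scored
    x = sum(k * c for k, c in enumerate(cells_with_k_dicentrics))      # total dicentrics
    mean = x / n
    var = sum(c * (k - mean) ** 2 for k, c in enumerate(cells_with_k_dicentrics)) / (n - 1)
    d = var / mean                                                     # dispersion index, 1 for Poisson
    u = (d - 1) * math.sqrt((n - 1) / (2 * (1 - 1 / x)))               # approximately N(0,1) if Poisson
    return d, u

# Invented example: 500 cells, most undamaged but a minority carrying several dicentrics,
# as might follow a partial-body exposure.
distribution = [452, 28, 12, 6, 2]        # numbers of cells with 0, 1, 2, 3, 4 dicentrics
d, u = dispersion_test(distribution)
print(f"dispersion index = {d:.2f}, u = {u:.1f} (u > 1.96 suggests non-uniform exposure)")
```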
INTERNALLY INCORPORATED RADIONUCLIDES

Cytogenetic analysis is mainly suited to estimating doses after an exposure to penetrating radiation from an external source. Experience has shown that this is what most often occurs in radiation accidents. Dose estimations when one or more radionuclides have been incorporated into the body are less certain [8]. Problems arise mainly from the localisation of radionuclides in particular organs or tissues. This represents another kind of partial-body irradiation, with the added complications of protracted exposure at variable dose rates as the material is lost from tissues by physical decay, biological turnover or excretion. Certainly overdispersed chromosomal aberrations in excess of background may be observed in lymphocytes from persons who have incorporated nuclides, eg. radioiodine [9] or an actinide [10]. An assessment of the dose to the pool of circulating lymphocytes can be made from the chromosomal aberration yield. However, this would tend to give a misleading estimate of the whole-body dose and generally it does not assist in determining dose to the principal tissues of interest, eg. the thyroid or the liver and bone surfaces. Exceptions to this limitation are those chemical forms of nuclides that tend to disperse fairly evenly throughout the body. Tritium, absorbed as tritiated water, is a recognised hazard in some parts of the nuclear industry and, because it equilibrates quite rapidly with body water, leads to a fairly homogeneous irradiation. A recent
industrial accident with tritium [11], which fortunately led only to modest personal doses, illustrated that cytogenetic analysis is feasible and produces estimates of dose in good agreement with those calculated from measurements of tritium excreted in urine. Another important exception is the incorporation of isotopes of caesium, as these also tend to give a fairly uniform distribution of dose. The Goiânia incident [12] can be cited as an exceptionally vivid illustration, where members of the general public were exposed externally and/or internally to caesium–137 chloride. Cytogenetic analyses, together with gamma ray body monitoring, were the main techniques employed for the assessment of quite large numbers of potentially exposed subjects.
THE INFLUENCE OF DOSE RATE

The acute dose response fits well to the linear quadratic model Y = αD + βD², where Y is the yield of aberrations, D is the dose and α and β are fitted coefficients. Protraction or fractionation of exposure to low LET radiation produces a lower aberration yield (Figure 2), although for high LET radiation, where the dose response is linear, Y = αD, no such effect is observed. The lowered response for protracted X and gamma rays arises from repair processes occurring during the few hours after exposure which affect the yield of those lesions of two-track origin. These are represented by the βD² term in the yield equation. Lea and Catcheside [13] proposed a time-dependent “G function” that may be applied to this term. Use of this function, combined with information on the likely time course of irradiation, can enable more realistic estimates to be made for non-acute or repeated exposures. In practice, where a continuous high dose is delivered over a long period, say about a day or two, the G function reduces the β term to virtually zero so that the response is in effect linear. For brief intermittent exposures where the interfraction interval exceeds about six hours, the irradiations can be regarded as a number of isolated acute events for which the induced aberration yields are simply additive [3]. For shorter interfraction times an interaction term based on the G function can be used. An extreme illustration of such a case was the Ciudad Juarez accident, where some subjects were intermittently exposed to high doses of cobalt–60 gamma radiation over several weeks [14]. For the most heavily irradiated person the dicentric aberration level, compared with the investigating laboratory's conventional single acute dose response curve, corresponded to 5.5 Gy, and from the linear term only to 15.3 Gy. The most likely dose was somewhere in between, but as the patient was unable to give a reasonable timetable of his probable exposure regime, the calculation was not taken further.
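For readers unfamiliar with the G function, the form usually quoted for a continuous exposure of duration t at constant dose rate is G = 2[x − 1 + exp(−x)]/x², where x = t/t0 and t0 is the mean lifetime of the repairable lesions, often taken as roughly two hours. The sketch below shows how the quadratic term is damped as the exposure is protracted; the calibration coefficients and repair time are illustrative assumptions only.

```python
import math

def g_factor(exposure_hours, repair_time_hours=2.0):
    """Dose-protraction (G) factor for a continuous exposure at constant dose rate."""
    x = exposure_hours / repair_time_hours
    if x == 0:
        return 1.0                      # acute exposure: quadratic term fully expressed
    return 2.0 * (x - 1.0 + math.exp(-x)) / x**2

def dicentric_yield(dose_gy, exposure_hours, alpha=0.03, beta=0.06):
    """Yield per cell, Y = alpha*D + G*beta*D**2, with illustrative coefficients."""
    return alpha * dose_gy + g_factor(exposure_hours) * beta * dose_gy**2

for hours in (0, 2, 8, 24, 48):
    print(f"4 Gy delivered over {hours:2d} h -> expected yield ~ {dicentric_yield(4.0, hours):.2f} per cell")
```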
SPEED OF OBTAINING A DOSE ESTIMATE

Obtaining a dose estimate from chromosomal analysis is time consuming. Upon receipt of a blood specimen in the laboratory the normal procedure is to culture the lymphocytes for 48 hours and then fix and stain them. Microscope analysis can commence at about 50 hours. To score 500 metaphases requires approximately two man-days at a conventional light microscope. However, given first priority and with several persons collaborating by scoring replicate slides, a result can be available by about 55 hours. In practice, with the service that is routinely
carried out by the NRPB, we expect to provide the physician with a dose estimate within three working days of receipt of a blood specimen. During this period the patient would normally be under continuing medical surveillance and, if a severe dose is suspected from observations of acute prodromal symptoms, then regular measurements, such as of haematological changes, would be under way. The need to examine large numbers of cells (Table 3) for estimating doses below 1 Gy requires considerable time and is tedious work. Recent developments [15] in computer-driven microscopes now enable automatic and rapid location of metaphases. Soon, computer assistance will be available for the scoring of aberrations [16]. The main delay in obtaining the biological estimate of dose arises, however, from the need to incubate the cells to allow them to reach metaphase. The technique of premature chromosomal condensation may considerably reduce this delay, so that a dose estimate could be made within two hours of receipt of a blood specimen [17]. Polyethylene glycol is used to fuse the human lymphocytes with mitotic Chinese hamster ovary cells. This causes the single-stranded G0 human chromosomes to condense so that breaks in them can be discerned. At present the technique is not used routinely; it has not been extensively calibrated and in several laboratories reproducibility of the procedure is not reliable. If it were felt that obtaining a biological estimate of dose so much sooner were essential for management of the patient, then more research effort should be directed into this technique. Another recent development in biological dosimetry is to examine lymphocytes for micronuclei induced by radiation. Micronuclei comprise a mixture of chromosome and/or chromatid fragments and also complete chromosomes that, for a variety of reasons, fail to separate correctly at mitosis and move into the daughter nuclei. They
Figure 3. A cytokinesis blocked human lymphocyte showing two main daughter nuclei and four micronuclei.
remain isolated and become encapsulated within a separate piece of nuclear membrane. They may be seen as inclusions within the cytoplasm of the cell, but exhibiting a nucleus-like staining reaction (Figure 3). Quantification of the dose response for micronuclei recently became much more reliable with the introduction of the cytochalasin B blocking method [18]. This permits the micronuclei to be scored in cells that are guaranteed to have completed only one in vitro cell cycle. The images of micronuclei are far simpler than chromosomal aberrations and thus scoring for them is faster. It has been proposed that this assay would be especially useful if large numbers of casualties are involved [19]. A disadvantage, however, is that with current techniques the cells need 72 hours' incubation before micronucleus preparations can be made. Prosser et al [20] have considered the logistical implications of using chromosomal or micronucleus methods for dosimetry in an accident where many subjects may have been irradiated. They concluded that during the 24 hours when chromosomal preparations are available, but before the micronucleus slides are ready, chromosome analysis on a sample of up to 25 metaphases per patient would provide initial dose estimates. Although provisional, these would be accurate enough to confirm the triage group into which each patient had been placed on the basis of acute symptoms, and to identify anomalies such as vomiting due to hysteria. Reliable identification would also be made of those persons who had received doses likely to require treatment to counteract severe stem cell depletion. This group would then be given priority for improving the estimate by increasing the number of cells analysed. Whether this is done by scoring micronuclei or by continuing with the chromosomal aberrations depends on the number of samples awaiting analysis and how many skilled microscopists are available.
RESOURCES FOR CYTOGENETIC DOSIMETRY IN THE UK

In the UK we are fortunate in that there is a substantial pool of researchers experienced in studying radiation-induced chromosomal damage in human cells, including lymphocytes. The principal laboratories are at Chilton (MRC and NRPB), Manchester (Paterson Institute), Sellafield (BNF plc) and Edinburgh (MRC). Whilst most of these groups do not normally perform biological dosimetry, they would, in a national emergency, be able to respond and cooperate very rapidly. Together they could mobilise approximately 25 trained scientists and technicians. Within the UK there is also a considerable number of university and hospital laboratories performing research and services in clinical cytogenetics. Their staff, particularly in the health authority region in which a large accident occurred, would very likely be drawn into the local medical response. Processing blood samples for routine karyotyping differs in some technical respects from the methods required for biological dosimetry. However, with some consultation with one of the establishments referred to above, their standard procedures could easily be adapted for dosimetry. A large accident, say of Chernobyl proportions, would attract international attention, and there are a number of laboratories performing biological dosimetry elsewhere in Europe and beyond. Many would proffer help or respond to a request for assistance. A recent collaborative quality-control experiment, sponsored by the IAEA, demonstrated that it is quite feasible to despatch blood and obtain dose estimates over great distances [21]. Thus, given the good communications that exist
between the community of cytogeneticists within the UK and with the added possibility of drawing on assistance from overseas colleagues, a rapid and logistically well coordinated response could be mounted.
WHAT IS THE USE OF A BIOLOGICAL ESTIMATE OF DOSE?

After any serious incident involving radiation there is a requirement for radiological protection professionals to investigate the circumstances. For incidents that might have involved exposure of people, this would include attempts to assess doses. Biological dosimetry can make a useful contribution here by providing data independent of the various physical monitoring measurements, calculations and/or reconstructions employed by health physicists. Reports of such investigations are required in the UK by the Health and Safety Executive. The results of enquiries, and particularly the lessons learned, are disseminated within the scientific community and to the public. Such investigations would also be required for any medico-legal proceedings that may arise from an accident. However, in the present paper, discussion is mainly confined to the usefulness of biological dose estimates to the physician who is concerned with the management of irradiated people. In the event of an accident that is quickly identified, such as that at Chernobyl, patients will be observed over the first few days for prodromal reactions and erythema. The absence or severity of responses will permit an early triage of casualties and indicate whether further developments, possibly life-threatening, during the ensuing weeks or months, might be expected. Reports from Chernobyl indicate that this preliminary sorting of casualties, which in a sense could be considered as employing a crude biological dosemeter, was made before cytogenetic estimates became available. If suitable personal monitors are worn (which incidentally was not the case at Chernobyl, Goiânia or Ciudad Juarez), information from these should ideally become known within the first 24 hours. Cytogenetic dose evaluations, available from about the third day, will serve as a means of confirming the initial assessments of the patients. As discussed above, if there are a large number of patients, possibly stretching the resources of the cytogenetic laboratories, then sufficiently accurate dose estimates could be made from a preliminary examination of up to 25 metaphases per subject, with the option of firming up the data later. At this point the cytogenetics should be able to identify anomalous situations of patients exhibiting prodromal-like symptoms that may be due to non-radiological causes. Generally it is not envisaged that the initial biological dose estimates will influence the early treatment of the patient, so at this stage the data do not need to be very precise. Highly irradiated persons will already have been identified and various haematological, microbial and perhaps specific organ function monitoring should already be in progress. For those subjects needing management for severely depleted bone marrow, some assessment of the likelihood of regeneration of the patient's own marrow, perhaps assisted by growth factors, will be necessary. This will require information on the heterogeneity of the exposure, which can be supplemented by the cytogenetic data using the distribution analysis described earlier. Priority would therefore be given to improving the chromosomal scoring for these patients, to provide firmer data for the statistical analysis. For patients irradiated to lesser doses, below the threshold for producing the prodromal response,
and confirmed by cytogenetics, physicians would tend simply to observe and respond to any symptoms that may appear. An important function of the physician is the counselling of persons who have been involved in a radiation accident. Indeed, for the majority of subjects, reassurance and psychological support would be the treatment of choice (eg. in Goiânia). This is particularly so for those who remain free from symptoms of overexposure during the first week after an accident and are thus unlikely to be at risk of early nonstochastic effects. These people will seek information on the likely longer term consequences of their exposure. Clearly the demonstration of no, or very few, chromosomal aberrations in lymphocytes, implying that the dose was at most trivial, is valuable. In addition, for those persons where the possibility of induced malignancy is thought not to be trivial, many would prefer a quantitative appraisal of their dose. Over a number of years after a large radiation incident, it is likely that epidemiological studies of long-term effects will be carried out. The studies of Japanese atomic bomb survivors have shown the importance of reliable estimates of dose for comparison with the incidence of excess fatal and non-fatal stochastic effects [22]. Data from biological dosimetry will make a contribution here. These should form part of a properly coordinated wider approach involving a number of institutions. The programme would include studies on stable chromosomal aberrations, as well as several other cytological endpoints, such as oncogenic transformation, point mutations, oncogene assays, in situ hybridisation and DNA breakage, which bear relationships with neoplasia.
CONCLUSIONS

After a radiation accident there is a need for biological dosimetry to supplement physical methods. Currently the best available method is that of cytogenetics, and there is a well developed pool of experience with this assay in the UK. The technique provides an estimate of the equivalent acute whole-body dose, but this may be modified, depending on the circumstances of the accident, to take account of protracted or inhomogeneous exposure. For treatment of overexposed subjects a precise biological dose estimate is not necessary; the physician rather responds to the clinical situation. The cytogenetic technique, using lymphocytes, provides a biological assessment of dose. This can be used indirectly to make an assessment of the health consequences of the exposure, as dose is relatable to risk. For low doses these are the risks of induced stochastic effects, whilst at high doses nonstochastic effects, especially on important stem cell populations such as in the marrow, are of more immediate concern.
REFERENCES

1. Kaul, A., Dehos, A., Bögl, W., Hinz, G., Kossel, F., Schwarz, E-R., Stamm, A. and Stephan, G., Biological Indicators for Radiation Dose Assessment. HMV Medizin Verlag, Munich, 1986.
2. Bender, M.A. and Gooch, P.C., Somatic chromosome aberrations induced by human whole-body irradiation: the “Recuplex” criticality accident. Radiat. Res., 1966, 29, 568–582.
3. IAEA, Biological Dosimetry: Chromosomal Aberration Analysis for Dose Assessment. Tech. Rep. 260, International Atomic Energy Agency, Vienna, 1986.
4. Lloyd, D.C. and Edwards, A.A., Chromosome aberrations in human lymphocytes: effect of radiation quality, dose and dose rate. In Radiation-Induced Chromosome Damage in Man, eds. T.Ishihara and M.S.Sasaki, A.R.Liss Inc., New York, 1983, pp. 23–49.
5. Stavem, P., Brøgger, A., Devik, F., Flatby, J., van der Hagen, C.B., Henriksen, T., Hoel, P.S., Høst, H., Kett, K. and Petersen, B., Lethal acute gamma radiation accident at Kjeller, Norway. Acta Radiol. Oncol., 1985, 61–63.
6. Fliedner, T.M., Strategien zur strahlenschutzmedizinischen ambulanten Versorgung von Betroffenen bei kerntechnischen Unfällen. Strahlenschutz Forsch. Prax., 1981, 22, 71–82.
7. Wells, J., Charles, N.W. and Warner, T.T., Chromosomes from the epithelium of plucked human telogen hairs. Hum. Genet., 1983, 63, 315–316.
8. Lloyd, D.C., The problems of interpreting aberration yields induced by in vivo irradiation of lymphocytes. In Mutation-Induced Chromosome Damage in Man, eds. H.J.Evans and D.C.Lloyd, Edinburgh University Press, Edinburgh, 1978, pp. 77–88.
9. Lloyd, D.C., Purrott, R.J., Dolphin, G.W., Horton, P.W., Halman, K.E., Scott, J.S. and Mair, G., A comparison of physical and cytogenetic estimates of radiation dose in patients treated with iodine–131 for thyroid carcinoma. Int. J. Radiat. Biol., 1976, 30, 473–485.
10. Littlefield, L.G., Joiner, E.E., DuFrain, R.J., Hübner, K.F. and Beck, W.L., Cytogenetic dose estimates from in vivo samples from persons involved in real or suspected radiation exposures. In The Medical Basis for Radiation Accident Preparedness, eds. K.F.Hübner and S.A.Fry, Elsevier, Amsterdam, 1980, pp. 375–390.
11. Lloyd, D.C., Edwards, A.A., Prosser, J.S., Auf der Maur, A., Elzweiler, A., Weickhardt, U., Gössi, U., Geiger, L., Noelpp, U. and Rösler, H., Accidental intake of tritiated water: a report of two cases. Radiat. Prot. Dosim., 1986, 15, 191–196.
12. IAEA, The Radiological Accident in Goiânia. International Atomic Energy Agency, Vienna, 1988.
13. Lea, D.E. and Catcheside, D.G., The mechanism of the induction by radiation of chromosome aberrations in Tradescantia. J. Genet., 1942, 44, 216–245.
14. CNSNS-IT-001, Accidente por contaminacion con cobalto–60, Mexico 1984. Comision Nacional de Seguridad Nuclear y Salvaguardias, Mexico City, 1985.
15. Finnon, P., Lloyd, D.C. and Edwards, A.A., An assessment of the metaphase finding capability of the Cytoscan 110. Mutation Res., 1986, 164, 101–108.
16. Piper, J., Towers, S., Gordon, J., Ireland, J. and McDougall, D., Hypothesis combination and context sensitive classification for chromosome aberration scoring. In Pattern Recognition and Artificial Intelligence, eds. E.S.Gelsema and L.N.Kanal, Elsevier, Amsterdam, 1988, pp. 449–460.
17. Pantelias, G.E. and Maillie, H.D., The use of peripheral blood mononuclear cell prematurely condensed chromosomes for biological dosimetry. Radiat. Res., 1984, 99, 140–150.
18. Fenech, M. and Morley, A.A., Measurement of micronuclei in lymphocytes. Mutation Res., 1985, 147, 29–36.
19. Almassy, Z., Krepinsky, A.B., Bianco, A. and Köteles, G.J., The present state and perspectives of micronucleus assay in radiation protection. A review. Appl. Radiat. Isot., 1987, 38, 241–249.
20. Prosser, J.S., Lloyd, D.C. and Edwards, A.A., A comparison of chromosomal and micronuclear methods for radiation accident dosimetry. In Proc. 4th Int. Symp. Radiat. Prot., Malvern, 1989, in press.
21. Lloyd, D.C., Edwards, A.A., Prosser, J.S., Bajaktarovic, N., Brown, J.K., Horvat, D., Ismail, S.R., Köteles, G.J., Almassy, Z., Krepinsky, A., Kucerova, M., Littlefield, L.G., Mukherjee, U., Natarajan, A.T. and Sasaki, M.S., A collaborative exercise on cytogenetic dosimetry for simulated whole and partial body accidental irradiation. Mutation Res., 1987, 179, 197–208.
22. Preston, D.L. and Pierce, D.A., The effects of changes in dosimetry on cancer mortality risk estimates in the atomic bomb survivors. RERF TR9–87, Radiation Effects Research Foundation, Hiroshima, 1987.
SOME PRIORITIES IN EXPERIMENTAL RADIOBIOLOGY
G.E. ADAMS, D.Sc., Medical Research Council Radiobiology Unit, Chilton, Didcot, Oxfordshire, OX11 0RD, U.K.
ABSTRACT
Some priorities in experimental radiobiology relevant to the risks of low doses of radiation are discussed. Concepts and experimental data appropriate to dose-response relationships for cell inactivation, chromosomal changes, gene mutation and oncogenic transformation, both in vitro and in vivo, are considered. A few examples are given of the application of molecular techniques involving recombinant technologies. These methods are finding increasing application in experimental radiobiology, particularly in regard to the analysis of chromosomal rearrangements of possible relevance to cancer induction by radiation. The reverse dose-rate effect, if shown to be applicable to cancer induction, could have important consequences for risk estimates. Some of the evidence for and against is discussed.

INTRODUCTION
Radiation is a two-edged sword. It is a real hazard that must be continually guarded against in the context of public health yet, correctly used, radiation is of inestimable value as a major modality in the treatment of a wide variety of human cancers. Currently, experimental radiobiology has priorities and objectives in both fields covering a wide range of interests. These include fundamental studies on the mechanisms of cell-kill, mutagenesis, chromosomal changes and malignant transformation; determination of the inter-relationships between dose-response functions for the various lethal and sub-lethal
effects of radiation; the modifying effects of dose rate and fractionation in both the low and high dose ranges; the protection of normal tissues against the harmful effects both of low doses to populations and of the high doses used in clinical radiotherapy; and even the development of new drugs and delivery techniques for use in conjunction with radiation for the treatment of cancer. However, current developments in these various areas are far too numerous to review here. The discussion is therefore limited to some selected topics that are particularly relevant to the area of low dose radiological protection.

There is now no question that damage to the genomic material of the cell is the primary cause of most radiation-induced changes. The proliferative mammalian cell is, however, surprisingly resistant to such damage. An absorbed radiation dose of 1 gray causes of the order of 2000 ionizations within the genomic material of the cell. A major consequence of this is strand breakage although, of the thousand or so breaks that are formed, the large majority are repaired within a few hours. The remainder, present mostly as 'aligned' or double-strand breaks, are believed to be the major cause of cell death. Nevertheless, even at this relatively high dose of radiation, some 40% of the cells recover and retain the capacity for proliferation. Much of the chemical damage, therefore, must be irrelevant to the fate of the cell, certainly as far as reproductive death is concerned. Current thinking is that only a small part of the damage, arising from a relatively rare event in which a large amount of energy is deposited near critically sensitive sites, is of any consequence.

An important, perhaps the most important, aspect of environmental radiological protection however concerns the stochastic effects of low doses of radiation, in which cells sustain damage that causes irreversible but non-lethal changes in the cell's behaviour. Malignant transformation is, unquestionably, the most important of these changes. Genetic changes of various kinds can be induced by radiation, although there is no evidence that they are important at low doses. With regard to hereditary transmission, some animal experiments suggest that pre-conception parental exposure can lead to malignancy in offspring, but there is no evidence of this phenomenon in human populations on the basis of the limited information available.

There is much information on the cancer-inducing effects of radiation. Unique among single carcinogenic agents, radiation can induce most types of malignancies, including various types of carcinoma and sarcoma, neurological tumours and most types of tumours of the
haemopoietic and lympho-reticular systems. External promotion by other agents is not necessary, although it is clear in experimental systems that chemical promoters can increase the efficiency of transformation. It is this very ubiquitousness in radiation carcinogenesis that gives clues to mechanisms in terms of the types of DNA injury that are responsible (see later). There is now much evidence that radiation carcinogenesis involves free radical processes, and considerable information exists [1] on the types and classification of DNA free radicals, their mode of formation, their reactivities, the role of chemical repair and the processes by which these chemical effects may cause the mutagenic and other changes that are relevant to cancer induction. The most important problem remaining in this area of research is to distinguish those chemical processes that are important from the many other events that are of no consequence to either the mutagenic or lethal effects of ionising radiation. A valid starting point is the understanding of dose-response relationships.
DOSE RESPONSE RELATIONSHIPS
It is worth emphasising that, despite a huge volume of radiobiological literature on the carcinogenic effects of radiation in experimental animal systems, the assessment of risk and the implementation of radiological protection standards still rely mainly on the results of epidemiological analysis of irradiated human populations. It is not difficult to understand why. Large experimental animal studies are very expensive, time consuming and subject to uncertainties of extrapolation because of both inter- and intra-species variation, but above all they usually suffer from lack of sensitivity because of the complicating effects of spontaneous tumour incidence. Most rodent systems used to investigate radiation carcinogenesis in vivo display a small but significant spontaneous incidence of the tumour whose induction by radiation is the object of the study. This usually imposes a limit on the investigation of risk at low doses since, clearly, the identification of a small yield of radiation-induced tumours against a relatively high yield of spontaneous tumours of the same type is not possible. There are exceptions; the CBA/H mouse model used to investigate radiation-induced myeloid leukaemias is one example [2] and can be used to great effect to investigate
dose-response relationships [2, 3]. However, in general, spontaneous tumour incidence is one of the major problems in the use of animal models for investigating radiation risk at low doses. Priorities in radiation research directed towards the assessment of cancer risk at low doses now rely more and more on basic laboratory studies using the new technologies of cellular and molecular biology. It is in this area of basic research that the best opportunities lie for the high-dose to low-dose extrapolations that are so necessary for refinement of risk estimates. Figure 1 reproduces the types of theoretical dose-response relationship considered worthy of consideration in the analysis of cancer induction. Various examples of linear, curvilinear (both convex and concave) and indeed threshold relationships exist in the literature. We still do not know, however, how any of these apply to radiation-induced cancer, nor how they are influenced by factors such as radiation quality, dose rate and cell type. At the cellular and sub-cellular level, information is more precise.
Figure 1. Some hypothetical dose-response relationships possibly relevant to radiation effects.
Survival curves for irradiated cell populations have been used for many years to examine the validity or otherwise of biophysical models for cell inactivation. High LET radiations are invariably more effective and this is reasonably interpreted to be due to a greater likelihood of the induction of the more damaging and less reparable type of lesion, the 'double-strand break' (Figure 2). An α-particle, for example, traversing a segment of the DNA helix will deposit its energy at a much greater rate than is the case for an X- or γ-ray track. There will therefore be a greater probability of formation of a double-strand break than there is from either a single X- or γ-ray track or from the chance alignment of two single-strand breaks arising from two separate tracks. The curvature often observed in cellular survival curves for X- or γ-irradiated cells, but seldom if ever observed in high LET radiation survival curves, is explicable in terms of this physical representation.
Figure 2. Schematic illustration of the different patterns of energy deposition from low and high LET radiation tracks traversing a section of the DNA helix.
THE α-β HYPOTHESIS
Curvilinear cell survival curves of the type shown in Figure 3a can be fitted empirically to relationships of the type:

N = N₀ exp[-(αD + βD²)]   or   ln(N₀/N) = αD + βD²

where N₀ is a population of clonogenic cells, N is the number of those cells that survive a given radiation dose D, and α and β are coefficients. According to a hypothesis first introduced by Chadwick and Leenhouts [4], the linear or ‘alpha’ term represents that component of cell killing attributable to double-strand breaks induced by a single-track traversal of the DNA helix. The quadratic or ‘beta’ component of cell killing is assigned to double-strand breaks arising from the chance alignment of single-strand breaks caused by separate radiation tracks. Its relative importance should increase with increasing dose, and the yield of this damage should presumably bear a quadratic relationship to absorbed radiation dose. There are of course other interpretations of the curvilinear shapes of radiation survival curves, but the strand-break hypothesis is useful for examining various concepts relevant not only to cell inactivation but also to cell mutation and malignant transformation.
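As a purely numerical sketch of the relationship quoted above, the fragment below evaluates the surviving fraction for a few doses using assumed values of α and β; the coefficients are arbitrary and are chosen only to show how the quadratic term comes to dominate as the dose increases.

    import math

    def surviving_fraction(dose_gy, alpha=0.2, beta=0.05):
        """Linear-quadratic surviving fraction, N/N0 = exp(-(alpha*D + beta*D**2)).
        The coefficients (per Gy and per Gy**2) are illustrative, not measured."""
        return math.exp(-(alpha * dose_gy + beta * dose_gy ** 2))

    for dose in (1.0, 2.0, 5.0, 10.0):
        ratio = (0.05 * dose ** 2) / (0.2 * dose)   # quadratic term relative to linear term
        print(dose, round(surviving_fraction(dose), 4), round(ratio, 2))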
CELL SURVIVAL AND CELL MUTATION

On the basis that dose-response curves for cellular mutation may be relevant to those for the induction of cancer by radiation, it is pertinent to consider the relative efficiencies of cell inactivation and cell mutation by radiation. Figure 3a is a fairly typical example of cellular inactivation curves [5]. The curvature, or shoulder, on the survival curve for the low LET radiation is, in keeping with the Chadwick and Leenhouts hypothesis, consistent with the existence of enzyme-mediated ‘repair’ or restoration processes of strand breakage
Figure 3. a) Inactivation of Chinese hamster V79 cells by low LET X-rays (open circles) and high LET α-particles (solid circles). Data from ref 5. b) The different efficiencies of high LET fission neutrons and low LET γ-rays in causing dicentric aberrations in human lymphocytes. Data from refs 7, 8.
in the DNA. On the basis that such repair processes are eventually inactivated or saturated at higher doses, such survival plots should become linear. This is often observed with various types of mammalian cells both in vitro and in vivo where appropriate clonogenic assays are available. Survival plots for high LET radiation are usually linear through the origin, indicating that repair processes are either absent or much less efficient for this type of radiation. The analysis of survival curve shapes in terms of α and β components fits easily into concepts of repair-proficient and repair-deficient types of cellular radiation damage. It also provides a ready explanation of the influence of dose-rate. It is generally found, both in vitro and in vivo, that irradiation over a protracted period or with multiple small fractions is less efficient than a single acute dose of radiation of the same magnitude.
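The dose-rate and fractionation sparing just described follows directly from the linear-quadratic picture: if the same total dose is split into well-separated fractions, the quadratic term acts on each small fraction rather than on the full dose. The sketch below compares acute and fractionated survival for the same assumed α and β values used above; the numbers are illustrative only.

    import math

    ALPHA, BETA = 0.2, 0.05   # illustrative coefficients, per Gy and per Gy**2

    def acute_survival(total_dose_gy):
        return math.exp(-(ALPHA * total_dose_gy + BETA * total_dose_gy ** 2))

    def fractionated_survival(total_dose_gy, n_fractions):
        """Assumes complete repair of sub-lethal damage between well-separated
        fractions, so each fraction contributes alpha*d + beta*d**2 independently."""
        d = total_dose_gy / n_fractions
        return math.exp(-n_fractions * (ALPHA * d + BETA * d ** 2))

    print(round(acute_survival(8.0), 4))              # single acute 8 Gy exposure
    print(round(fractionated_survival(8.0, 8), 4))    # 8 x 1 Gy fractions: higher survival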
Chromosomal changes

Radiation is known to cause a variety of changes in the structure of the cellular chromosome [6]. These include interchanges, intra-arm and inter-arm intra-changes, as well as breaks or discontinuities. Not all such types of chromosomal damage are lethal to the cell, nor do those that are transmissible to succeeding cellular generations necessarily have observable genetic consequences. The position of the cell in its mitotic cycle at the time of irradiation can influence the type of chromosomal aberration observed at metaphase. These are the chromosome type, where both sister chromatids are involved at a given locus, and the chromatid type, where only one is affected. In contrast to most chromosome-damaging chemical agents, which generally cause only the latter type of change, radiation can cause both chromosome-type and chromatid-type changes. Figure 3b reproduces data of Lloyd and co-workers comparing the different efficiencies of high LET fission neutrons and low LET γ-rays in causing dicentric aberrations in human lymphocytes [7, 8]. Here too, the curvature in the low LET response curve is readily explicable in terms of a greater reparability of the lesion produced by the radiation. An important consequence of the differences in the shapes of the dose-response curves is that the Relative Biological Effectiveness (RBE) of the two types of radiation is not independent of dose. In this, and many other examples, RBE increases with decreasing dose, and this is a major cause of uncertainty in extrapolation from effects obtained at
relatively high doses. The influence of dose, dose-rate and radiation quality remains the major problem in determining the risks of cancer induction at low radiation doses. In the search for an understanding of the mechanisms of radiation carcinogenesis, studies of gene mutation can provide useful data. Problems exist, however, of which lack of sensitivity is a major one. Mutation in one of a pair of chromosomes (autosomal mutation) is often difficult to detect in cellular behaviour because the function is unchanged in the undamaged chromosome. Greater sensitivity can be achieved with X-linked mutation, where damage to the single X-chromosome can be expressed more readily. An example is radiation-induced resistance to thioguanine. The resistance is due to an induced reduction in activity of the enzyme hypoxanthine guanine phosphoribosyl transferase (HGPRT), whose normal function is to enable cells to incorporate purines from the growth medium. Damage to the gene causes cells to become resistant to thioguanine, a cellular poison, and this resistance is used as a screen for the mutation. Since the gene is X-linked, mutation can be observed with fairly low radiation doses. Evidence from various cytogenetic and biochemical studies [9] shows that mutation at the HGPRT locus usually involves gross deletions and/or re-arrangements of the DNA sequences in the gene. This indicates that point mutation at the locus is either infrequent or masked by the large re-arrangements or deletions in the DNA. This contrasts with mutation at this locus induced either by ultraviolet light or by chemicals. It is premature, however, to consider the gross molecular changes induced by radiation at the HGPRT locus to be at all representative of mutational changes produced elsewhere in the genomic material. This underlines the caution necessary in extrapolating from mutagenesis in relatively simple cellular models to mechanisms of carcinogenesis in vivo. It is interesting, however, that for this particular mutation, and therefore possibly for other mutations also, there is a quantitative relationship between the efficiency of radiation in causing cellular mutation and inactivation. Thacker [10] has found that for three cell lines of human, hamster and mouse origin, the mutation frequency per viable surviving cell shows a linear-log relationship with cellular surviving fraction. It is suggested [10] that the repair process employed by the cell to deal with the initial damage may not be error-free. If the probability that the repair systems change the genetic material, relative to that for removal of the
lesion in order to give non-mutated surviving cells, is fixed, this could explain the quantitative relationship found for mutation at the HGPRT locus. Evidence for mis-repair has come from detailed studies on the autosomal recessive genetic disorder ataxia telangiectasia (A-T) [11, 12 and included refs]. Sufferers from this disease exhibit neuromotor dysfunction, immunodeficiency, proneness to T-cell neoplasia and, in particular, abnormal sensitivity to ionising radiation. This sensitivity and apparent repair deficiency have been demonstrated directly by recombinant DNA techniques. Cox and co-workers have shown, by transfer of plasmid-encoded genes containing inactivating DNA strand scissions into both normal human and A-T cell lines, that A-T cells exhibit greatly elevated mis-repair of these strand scissions [13]. It is suggested that such mis-repair might influence the sequence-specific DNA recombination processes important in the development of the B-cell immunoglobulin and T-cell receptor systems. This might be implicated in the immunodeficiency and chromosomal rearrangements that are evident in the A-T syndrome. There is direct evidence in support, in that a 7;14 chromosomal translocation observed in an A-T patient with a T-cell leukaemia involved aberrant recombination of the T-cell receptor β-gene [14].
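Returning to Thacker's observation above, the linear-log relationship between induced mutation frequency per surviving cell and surviving fraction can be sketched numerically as follows. The proportionality constant (the fraction of handled lesions that are fixed as mutations rather than removed cleanly) is entirely hypothetical and is used only to show the form of the relationship.

    import math

    ALPHA, BETA = 0.2, 0.05        # illustrative inactivation coefficients
    MISREPAIR_FRACTION = 1.0e-4    # hypothetical constant probability of mutagenic repair

    def surviving_fraction(dose_gy):
        return math.exp(-(ALPHA * dose_gy + BETA * dose_gy ** 2))

    def mutants_per_survivor(dose_gy):
        """If a fixed fraction of the lesions handled by the cell is mis-repaired,
        mutation frequency per survivor scales with -ln(S), i.e. linearly with the
        logarithm of the surviving fraction."""
        return MISREPAIR_FRACTION * (-math.log(surviving_fraction(dose_gy)))

    for dose in (1.0, 2.0, 4.0, 8.0):
        print(dose, round(surviving_fraction(dose), 4), mutants_per_survivor(dose))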
RADIATION TRANSFORMATION AND CANCER INDUCTION

There is an urgent need for the development of simple in vitro models of cellular transformation that can be used to study the early stages of radiation carcinogenesis. There are many unsolved problems and barriers to the development of systems that are truly relevant to human oncogenesis, which is mainly an abnormality of epithelial tissues. While much progress has been made in understanding the early processes involved in cellular immortalisation and transformation, particularly using molecular techniques, a suitable quantitative model system is not yet available for radiation studies. There are available, however, some highly useful cell lines that can be transformed in vitro by radiation. Such transformed cells can induce tumours when re-implanted into the animal from which they were originally derived, or when transplanted into other hosts that are immunologically compatible.
Radiation transformation was first observed by Borek and Sachs [15] using short-term cultures of cells derived from Syrian hamster embryos. The C3H 10T1/2 cell line [16] is also frequently used for transformation studies in vitro. These fibroblast-like cells grow in monolayer and show the usual property of contact inhibition. Treatment with radiation causes morphological changes: some cells lose this contact inhibition and give rise to dense colonies that overgrow the confluent layer. Cells from these colonies induce tumours after inoculation into appropriate hosts. Radiation studies with this and other transforming systems show that the efficiency of transformation increases with increasing LET of the radiation, in line with trends observed for other radiation-induced changes including cell inactivation, chromosomal re-arrangements, mutation and frank induction of cancer in vivo. While transforming cell lines of this type are undoubtedly valuable, there is an urgent need to develop new model systems that are sufficiently quantitative for studying the efficiency of radiation transformation in appropriate human cells, particularly those derived from haemopoietic and epithelial tissues. This is one of the highest priorities in radiobiological research.
RADIATION-INDUCED LEUKAEMIA

Epidemiological studies, particularly the Life-Span Study of survivors of the atomic bombs dropped over Japan in 1945 [17, 18] and the investigation of cancer incidence in patients with ankylosing spondylitis who were treated with radiation [19], have shown that the induction of leukaemia is a particularly strong hazard in human populations given fairly high doses of radiation. The possible risks from low levels of radiation exposure, such as those associated with environmental or occupational exposure, remain largely unknown and are a subject of much current public interest.
Aspects of the epidemiology of human leukaemia, and the evidence of ‘clustering’ of the disease both near to and remote from nuclear installations, are discussed elsewhere in these Proceedings. From the experimental standpoint, the study of mechanisms of leukaemia induction by radiation remains one of the highest priorities, particularly those concerning dose-response relationships and the significance or otherwise of radiation dose-rate. At present, there is no experimental
evidence that exposure to radiation at dose levels and dose-rates comparable to those applicable to environmental radiation can give rise to leukaemia. The problem, however, is that experimental laboratory systems sufficiently sensitive, quantitative and relevant to the human situation, while under development, have not yet reached the level of sophistication required to help answer the question “Can human leukaemia arise from environmental exposure to radiation at very low doses?”. The limitations of animal models have been mentioned earlier, particularly those arising from the complications of spontaneous incidence which prevent the study of low dose effects. The CBA/H mouse AML system is one exception and in principle could be used to study the existence, or otherwise, of risk from very low doses of radiation in model studies. Studies currently in progress are investigating the incidence of acute myeloid leukaemia following internal exposure to high LET radiation from the radionuclides radium-224 and plutonium-239 [3, 20 and Humphreys, private communication], administered as either a single acute dose or as multiple small doses over a protracted period. Because of the extremely low incidence of spontaneous AML in this mouse line, any induction of the disease by low doses of radiation should be highly significant.
MOLECULAR STUDIES IN RADIATION ONCOGENESIS

Studies on the mechanisms of DNA repair, mutation and cell transformation should continue to provide valuable information. Another approach to the problems of carcinogenesis lies in the direct molecular analysis of somatic cells that carry abnormalities that may be associated with the malignant phenotype. It is interesting that the use of DNA-mediated gene transfer techniques has shown that the malignant phenotype can be transferred from one cell type to another; for example, from the 10T1/2 cell line to the murine NIH3T3 cell line [21].
It has been known for many years that radiation can activate ‘cancer-inducing’ viruses. The radiation leukaemia virus (Rad. L.V.) is a class of retroviruses originally identified in cell-free extracts from thymic lymphomas induced in mice by radiation. Tumours of this type express the virus both in the primary tumour and in transplants
into syngeneic mice. Injection of cell-free extracts intrathymically induces identical tumours in syngeneic mice. Much attention has been directed towards the possibility of oncogene activation by irradiation. Activation of the Ras oncogene is evident in some thymomas induced in mice by radiation, with some degree of specificity indicative of fairly precise DNA changes [22]. While it is not possible to relate these directly to cancer induction, they may be associated with some late event, or events, in neoplastic development. A problem in interpreting specific molecular changes induced by radiation in terms of cancer induction is that radiation is so non-specific in its capacity for damaging DNA. Much of the damage sustained is of no consequence to the subsequent fate of the cell. An alternative approach, and one that is bringing some success, lies in the search for specific chromosomal markers that may be indicative of an increased likelihood of neoplastic change in the organ in which those cells are normally present. While a wide range of chromosomal changes can be induced in irradiated cells, the regions of the chromosome where breaks may occur are not necessarily random. The reasons for this are not clear, but site fragility may be an important cause. Various abnormalities occur in murine thymic lymphomas, including trisomy of chromosome 15 (ch15) and a translocation marker between chromosomes 1 and 5. The latter change is sometimes observed quite early after irradiation [23]. The CBA mouse AML model [2] referred to earlier is in most respects a more realistic model of radiation carcinogenesis in humans. Various chromosomal rearrangements, particularly involving chromosome 2, occur in cells from irradiated CBA mice [24]. These may be possible early markers for AML produced in this strain of mouse by both high and low LET radiation. Figure 4a reproduces karyotype analyses from two cases of radiation-induced AML, and from the marrow of mice transplanted from irradiated donor mice (Figure 4b) [24]. It was reported that the most consistent chromosomal change was deletion and/or translocation involving chromosome 2 [15, 16]. Rearrangements of this type are also common in the growing clones in transplanted marrow from irradiated mice and can be observed as early as five days after transplantation. It has been shown recently (Silver, Cox, private communication) that the haemopoietic
Figure 4. a) Representative karyotypes of two radiation-induced CBA AMLs. b) Representative karyotypes of two radiation-induced early transplantation clones. Data from ref 24.
growth factor gene, Interleukin 1, is located close to the breakpoint on chromosome 2 and is over-expressed in cases of AML in which this characteristic rearrangement is evident. It is known that a high proportion of cases of human AML also over-express this gene [25], suggesting that changes in this gene may be an important early marker in the development of AML. While it is true that radiation causes a wide variety of chromosomal changes in irradiated human cells, the results from the CBA model offer encouragement in the search for early molecular markers that may prove to be of value in the assessment of human risk from low doses of radiation.
THE REVERSE DOSE-RATE EFFECT

A topic of considerable current interest concerns the influence of low dose-rate on the risks of low-dose radiation, particularly at high LET. Normally, as discussed earlier, fractionation of the radiation dose, or delivery of a single dose over a protracted period, reduces the efficiency of radiation damage compared with that produced by the same dose given acutely. This is generally true for cell killing, chromosomal damage and cell mutation, and is often assumed to be true for oncogenic transformation. There are, however, some reports suggesting that the reverse may be true for some types of high LET radiation, both in vitro and in vivo. Elkind and co-workers observed a substantial increase in the effectiveness of fission neutrons in transforming C3H 10T1/2 cells when the dose-rate was reduced from 10.3 to below 0.45 Gy/min [26], or when a small dose was administered as a series of daily fractions [27]. It is interesting that these studies showed no influence of neutron dose-rate on cell-killing efficiency. A similar reverse dose-rate effect has been observed with fission neutrons in vivo with regard to tumour induction [28, 29].
A recent example of an apparent reverse dose-rate effect for α-particle irradiation concerns lymphoma induction in young female NHg:NMRI mice treated with injections of radium-224, administered either as a single dose of 18.5 kBq kg⁻¹ or as 72 separate fractions of 257 Bq kg⁻¹ twice weekly for 36 weeks. Figure 5 reproduces the data as published in April 1988 [30]. Curve C shows the spontaneous lymphoma incidence in untreated animals and curve A the incidence in mice given the fractionated treatment. Significantly, no case was observed in the group receiving a single dose up to 360 days after treatment. The group contained 295 mice, of which 287 were still alive 360 days later. This appears to be a remarkable result, although there are grounds for caution in its interpretation, particularly in regard to the complicating effect of cell kill. Stem cells in the single-dose group receive a substantial and fairly acute dose of α-radiation that may sterilise a high proportion of these cells. The total risk of lymphoma induction may be reduced even though the risk per surviving cell could remain high.
Figure 5. Cumulative incidence of malignant lymphoma in female NMRI mice after injection of 224Ra.
1) Line A: group given twice-weekly injections for 36 weeks (total 18.5 kBq kg⁻¹). 2) Line C: untreated control group. 3) No lymphoma cases up to day 360 in the group given a single injection of 18.5 kBq kg⁻¹. Data from ref 30.
The phenomenon of reduced risk with increasing dose has been observed in various mouse tumour models, including the CBA AML system [2], and is generally attributed to the confounding effects of cell kill. However, the reverse dose-rate effect observed in the in vitro experiments with 10T1/2 cells cannot be explained on this basis. In contrast, there are other reports in the literature indicating that transformation studies on the 10T1/2 line using α-particle radiation show no evidence of a reverse dose-rate effect [31]. The significance or otherwise of the reverse dose-rate effect is one of the most topical problems in experimental radiobiology in view of its importance for radiological protection. Its resolution is awaited with much interest.
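The cell-kill argument referred to above can be made concrete with a toy calculation: the expected yield of induced tumours scales roughly with the number of target stem cells that survive the exposure multiplied by the transformation probability per surviving cell. All parameter values below are hypothetical and serve only to show how heavy cell killing in an acute single-dose group could depress the total yield even if the per-cell risk stays high.

    import math

    N_TARGET_CELLS = 1.0e7               # hypothetical stem-cell pool per animal
    TRANSFORM_PER_CELL_PER_GY = 1.0e-9   # hypothetical induction probability per Gy

    def expected_transformed_acute(dose_gy, d0_gy=0.5):
        """Surviving target cells (simple exponential kill with mean lethal dose d0)
        multiplied by an assumed linear transformation probability per survivor."""
        surviving = N_TARGET_CELLS * math.exp(-dose_gy / d0_gy)
        return surviving * TRANSFORM_PER_CELL_PER_GY * dose_gy

    # Same total dose given acutely (heavy cell kill) or in many small fractions
    # (cell kill treated as negligible overall for this illustration).
    acute = expected_transformed_acute(4.0)
    fractionated = N_TARGET_CELLS * TRANSFORM_PER_CELL_PER_GY * 4.0
    print(acute, fractionated)   # the acute yield is strongly suppressed by cell kill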
REFERENCES

1. Von Sonntag, C. The Chemical Basis of Radiation Biology. Taylor and Francis, London-New York-Philadelphia, 1987.
2. Mole, R.H., Papworth, D.G. and Corp, M.J. The dose response for X-ray induction of myeloid leukaemia in male CBA/H mice. British J. Cancer 1983, 47, 285–291.
3. Humphreys, E.R., Loutit, J.F., Major, I.R. and Stones, V.A. The induction by 224Ra of myeloid leukaemia and osteosarcoma in male CBA mice. Int. J. Radiat. Biol. 1985, 47, 239–247.
4. Chadwick, K.H. and Leenhouts, H.P. A molecular theory of cell survival. Phys. Med. Biol. 1973, 18, 78–87.
5. Thacker, J., Stretch, A. and Goodhead, D.T. The mutagenicity of plutonium-238. Radiation Res. 1982, 92, 343–352.
6. Savage, J.R.K. Induction and consequences of structural chromosome aberrations. In The Biological Basis of Radiotherapy, eds. G.G.Steel, G.E.Adams, M.J.Peckham. Elsevier Science Publishers B.V. 1983, pp. 93–103.
7. Lloyd, D.C., Purrott, R.J., Dolphin, G.W., Bolton, D., Edwards, A.A. and Corp, M.J. The relationship between chromosome aberrations and low LET radiation dose to human lymphocytes. Int. J. Radiat. Biol. 1975, 28, 75–90.
8. Lloyd, D.C., Purrott, R.J., Dolphin, G.W. and Edwards, A.A. Chromosome aberrations induced in human lymphocytes by neutron irradiation. Int. J. Radiat. Biol. 1976, 29, 169–182.
9. Thacker, J. The use of recombinant techniques to study radiation-induced damage, repair and genetic change in mammalian cells. Int. J. Radiat. Biol. 1986, 50, 1–30.
10. Thacker, J. The involvement of repair processes in radiation-induced mutation of cultured mammalian cells. In Proc. 6th Int. Cong. Radiation Res., Tokyo 1979. Japanese Association for Radiation Research, pp. 612–620.
11. Cox, R.R. In vivo and in vitro radiosensitivity in ataxia telangiectasia. In The Biological Basis of Radiotherapy, eds. G.G.Steel, G.E.Adams, M.J.Peckham. Elsevier Science Publishers B.V. 1983, pp. 105–112.
12. Bridges, B.A. and Harnden, D.G. Ataxia-telangiectasia. Wiley, New York, 1982.
13. Cox, R.R., Debenham, P.G., Masson, W.K. and Webb, M.B.T. Ataxia telangiectasia: a human mutation giving high frequency misrepair of DNA double strand scissions. Mol. Biol. Med. 1986, 3, 229–244.
14. Russo, G., Isobe, M., Pegoraro, L., Finan, J., Nowell, P.C. and Croce, C.M. Molecular analysis of a t(7;14)(q35;q32) chromosome translocation in a T-cell leukaemia of a patient with ataxia telangiectasia. Cell 1988, 53, 137–144.
15. Borek, C. and Sachs, L. Cell susceptibility to transformation by X-irradiation and fixation of the transformed state. Proc. Natl. Acad. Sci. U.S.A. 1967, 57, 1522–1527.
16. Reznikoff, C.A., Brankow, D.W. and Heidelberger, C. Establishment and characterisation of a cloned line of C3H mouse embryo cells sensitive to postconfluence inhibition of division. Cancer Res. 1973, 33, 3231–3238.
17. Kato, H. and Schull, W.J. Studies of the mortality of A-bomb survivors. 7. Mortality, 1950–78: Part 1, Cancer mortality. Radiation Res. 1982, 90, 395–432.
18. Report of the United Nations Scientific Committee on the Effects of Atomic Radiation (UNSCEAR). United Nations, New York, 1988.
19. Darby, S.C., Doll, R., Gill, S.K. and Smith, P.G. Long term mortality after a single treatment course with X-rays in patients treated for ankylosing spondylitis. Br. J. Cancer 1987, 55, 179–190.
20. Humphreys, E.R., Loutit, J.F. and Stones, V.A. The induction by 239Pu of myeloid leukaemia and osteosarcoma in female CBA mice. Int. J. Radiat. Biol. 1987, 51, 331–339.
21. Borek, C., Ong, A. and Mason, H. Distinctive transforming genes in X-ray transformed mammalian cells. Proc. Natl. Acad. Sci. U.S.A. 1987, 84, 794–798.
22. Diamond, L.E., Newcomb, E.W., McMorrow, L.E., Leon, J., Sloan, S., Guerrero, I. and Pellicer, A. In vivo activation of mouse oncogenes by radiation and chemicals. In Radiation Research Vol 2, eds.
E.M.Fielden, J.F.Fowler, J.H.Hendry and D.Scott. Taylor and Francis, London, 1987, pp. 532–537.
23. McMorrow, L.E., Newcomb, E.W. and Pellicer, A. Identification of a specific marker chromosome early in tumour development in γ-irradiated C57BL/6J mice. Leukaemia 1988, 2, 115–119.
24. Silver, A., Breckon, G., Masson, W.K., Malowany, D. and Cox, R.R. Studies on radiation myeloid leukaemogenesis in the mouse. In Radiation Research Vol 2, eds. E.M.Fielden, J.F.Fowler, J.H.Hendry and D.Scott. Taylor and Francis, London, 1987, pp. 494–500.
25. Griffin, J.D., Rambaldi, A., Vellenga, E., Young, D.C., Ostapovicz, D. and Cannistra, S.A. Secretion of interleukin-1 by acute myeloblastic leukaemia cells in vitro induces endothelial cells to secrete colony-stimulating factors. Blood 1987, 70, 1218–1221.
26. Hill, C.K., Han, A. and Elkind, M.M. Fission spectrum neutrons at a low dose rate enhance neoplastic transformation in the linear, low dose region (0–10 Gy). Int. J. Radiat. Biol. 1984, 46, 11–15.
27. Hill, C.K., Carnes, B.A., Han, A. and Elkind, M.M. Neoplastic transformation is enhanced by multiple low doses of fission spectrum neutrons. Radiation Res. 1985, 102, 404–410.
28. Upton, A.C., Randolph, M.L. and Conklin, J.W. Late effects of fast neutrons and gamma rays in mice as influenced by the dose-rate of irradiation: induction of neoplasia. Radiation Res. 1970, 41, 467–491.
29. Ullrich, R.L. Tumour induction in BALB/c mice after fractionated or protracted exposure to fission spectrum neutrons. Radiation Res. 1984, 86, 587–597.
30. Müller, W.A., Linzner, V. and Luz, A. Early induction of leukaemia (malignant lymphoma) in mice by protracted low α-doses. Health Physics 1988, 54, 461–463.
31. Collected papers in Proceedings of the 14th L.H. Gray Conference: Low Dose Radiation, Biological Bases of Risk Assessment, Oxford, September 1988, eds. K.F.Baverstock and J.Stather. Taylor and Francis, London (in press).
ARRANGEMENTS FOR DEALING WITH EMERGENCIES AT CIVIL NUCLEAR INSTALLATIONS
M J TURNER & I F ROBINSON HM Nuclear Installations Inspectorate Health and Safety Executive St Peters House Bootle Merseyside L20 3LZ
ABSTRACT

This paper covers arrangements for dealing with nuclear emergencies at sites licensed by the Health and Safety Executive/Nuclear Installations Inspectorate. Such arrangements are over and above the contingency plans for radiation incidents required by the Ionising Radiations Regulations. The statutory position of the NII is described and, although the NII is limited to regulating the activities of the operator, the functions of the other organisations that could be involved in dealing with an emergency are briefly covered in order to give as complete a picture as possible. The basis for emergency planning is given, together with the consequences of a nuclear emergency and the countermeasures for its mitigation, including the use of Emergency Reference Levels (ERLs). The requirements for emergency exercises are explained.
1. STATUTORY POSITION AND ROLE OF THE NII
1.1 The main legislation governing the safety of civil nuclear installations is the Health and Safety at Work etc Act 1974 and the associated relevant statutory provisions of the Nuclear Installations (NI) Act 1965. Under these Acts the operators are clearly responsible for the safety of the plants, but the Nuclear Installations Inspectorate (NII), which is a part of the Health and Safety Executive (HSE), is responsible for administering and enforcing the requirements at civil nuclear installations. In particular, the HSE has the power to grant, refuse, vary or revoke a nuclear site licence, which is required before anyone (other than the UKAEA or a Government Department) may build or operate a nuclear reactor or any other nuclear installation which is prescribed under the NI Act. The HSE is also empowered to attach conditions to a nuclear site licence at any time in the interests of safety or radioactive waste management. One of the conditions that has always been attached to Nuclear Site Licences concerns emergency arrangements. This includes the requirement that an operator has arrangements for dealing with any accident or emergency on the site. These arrangements are defined in the operator's Emergency Plan, which is required to be approved by the NII on behalf of the HSE. 1.2 A nuclear site licence would not be granted for a new installation unless the NII were satisfied about the safety of the proposed plant and its location. The Inspectorate has its own safety assessment principles [1] [2] which are used as a guide when judging the adequacy of the safety case for the proposed plant put forward by the operator. Details of the role of the NII with respect to siting aspects, licensing, and regulatory control of construction, commissioning and operation are given in reference 3. 1.3 Regulatory control is exercised by the NII through inspections for compliance with, and enforcement of, these nuclear site licence conditions. At present, the NII organisation and practice provide for the appointment of a designated Site Inspector for each nuclear
site. The Site Inspector maintains surveillance of the activities on the site for conformity with the requirements of the Ionising Radiations Regulations 1985 (IRR 85), carries out inspections for compliance with the conditions attached to the site licence, and is concerned with verifying that the operator and his contractors are meeting their obligations to build and operate the nuclear plant in accordance with the safety case. 1.4 Under the IRR 85, employers who carry out work with ionising radiations are required to assess the hazards involved. Where this assessment shows that, as a result of any reasonably foreseeable accident, anybody is likely to receive a dose exceeding the dose limit, the employer must prepare a suitable contingency plan. The requirements to have arrangements for dealing with emergencies at sites licensed by HSE/NII are over and above those of the IRR 85. This paper covers the responsibilities and arrangements for dealing with emergencies at civil nuclear installations.
2. BASIS FOR EMERGENCY PLANNING

2.1 It is a principal objective of the NII to ensure that civil nuclear installations are properly designed, constructed and operated and that the operator takes all reasonably practicable measures to ensure that accidents do not occur. Over and above this, however, civil nuclear installations are required to be sited appropriately and to have adequate emergency arrangements, in order to minimise the consequences should an accident occur. 2.2 The purpose of the emergency arrangements is to enable an effective response to be mounted to any incident, varying from a minor on-site event to a significant off-site release of radioactivity. The hazard could lead either to direct external irradiation of persons or to internal radiation dose brought about by the inhalation or ingestion of the radioactive material. The countermeasures outlined in Section 5 are aimed at avoiding or mitigating these effects. 2.3 Current emergency planning takes into account the lessons learnt from nuclear accidents world wide where off-site
measures had to be considered; of particular interest were Windscale, Three Mile Island (TMI) and Chernobyl. A great deal was learned from the 1957 Windscale fire about the sort of organisation, equipment and procedures necessary to deal with a major accident, and in particular the need to establish pre-determined criteria upon which to base the introduction of measures to protect the public was highlighted. Out of this arose the concept of Emergency Reference Levels (ERLs) (see Section 6). The 1979 TMI accident showed that better emergency plans were required, that a release may not be dominated by radioiodine, and that there were benefits if the ERL concept was made more flexible by adopting a dose range for action rather than a single fixed dose. Perhaps the most important lesson was the need to insulate the site from the barrage of requests for information and assessment from the media and public, and to provide an independent source of authoritative advice. Out of this came the concepts of the Off Site Centre and the Government Technical Adviser. The 1986 Chernobyl accident underlined the need to consider the response not just to accidents within the UK but also to those overseas, the need to rationalise the wide range of different national responses to a major nuclear accident, and a number of more technical matters such as the unexpectedly long retention of radiocaesium in the root mat on acidic soils. 2.4 The emergency arrangements that have been developed in various countries to deal with a nuclear emergency reflect not only the nuclear hazard but also the social, economic and political arrangements of the country concerned. The United Nations International Atomic Energy Agency (IAEA) has developed a set of codes and guides which give the distilled experience of Member States on a wide range of topics relating to nuclear safety, including emergency arrangements [4]. The UK is, and has been, an active participant in the development of these codes and guides and recognises that they are an important element in providing and establishing good practice. The IAEA guidance is taken into account in formulating nuclear emergency arrangements in the UK, having due regard for the regulatory framework and socio-political structure within which the various agencies function.
2.5 In the UK the safety analysis of the plant identifies potential accidents or accident sequences which might lead to a release of radioactivity. The amount of activity calculated to be released is then used to determine the consequences of these events for the general public. The consequences are expressed in terms of radiation dose. These doses are then compared with the ERLs given in the formal advice [5] of the National Radiological Protection Board (NRPB); these are dealt with in more detail in Section 6. Safety and protection systems are provided in the design of modern reactors with the intention that the lower bound of the ERL for evacuation should not be exceeded at the site boundary, and that the frequency with which this limit is approached should be less than once in 10,000 years. The safety analysis identifies those accident sequences, referred to as design basis accidents (dbas), for which this requirement is met. The probability of a more severe accident (beyond the design basis) is even more remote. 2.6 The earlier nuclear plants, such as Magnox reactors with steel pressure vessels, were assessed using a mainly deterministic approach because the present probabilistic methods of analysis had not been developed. This led to these reactors being designed to cater for the failure of one duct of the reactor cooling circuits. If this occurred, and it was assumed in addition that there was the unlikely event of gross failure of fuel cladding in a fuel channel followed by oxidation of the fuel, then the lower bound of the ERL for evacuation would not be exceeded, under adverse weather conditions, beyond 1 mile (1.6 km) from the reactor. This led to the requirement that there should be a detailed emergency planning zone for evacuation of radius 1.5 miles (2.4 km), within which evacuation is planned to be carried out in a few hours. This requirement applies to all sites with Magnox reactors in steel pressure vessels, but has been extended at some sites to take account of local circumstances and to ensure that all of a particular community would be included in the evacuation plans. 2.7 With later reactor designs, tighter safety standards have the intention that for dbas the lower level of the ERL for evacuation
is not exceeded beyond the site boundary. Nevertheless, the minimum detailed evacuation planning zone radius for these later reactors has been set at 1 km. 2.8 Details of the households in this zone are recorded and updated at regular intervals. On the declaration of an off-site emergency the operator would deploy radiological monitoring teams in the downwind direction. At pre-determined action levels, countermeasures, in particular evacuation and the issue of stable iodine, would be considered and, if necessary, initiated in conjunction with the Police. 2.9 Approved Emergency Plans are directed towards providing an effective response to a range of accidents. The accidents envisaged are those within the design basis, but the structure of the plan would allow for an increase in response for more severe accidents. The arrangements for responding to a worse accident, however, become less detailed as the probability of the incident decreases. This is because a balance must be struck between ensuring that detailed plans, sufficiently extensive to cope with a serious accident, are in place and avoiding the unjustified use of resources involved in planning in detail for very improbable events. This philosophy was endorsed in the Layfield report on the Sizewell Inquiry [6].
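As a highly simplified sketch of how projected doses might be compared with Emergency Reference Levels, the fragment below applies the 'dose range for action' idea mentioned in Section 2: below a lower level a countermeasure is not indicated, between the lower and upper levels it should be considered, and above the upper level it should almost certainly be introduced. The numerical levels and the treatment of every band as a whole-body projected dose are placeholders for illustration only; they are not the actual NRPB ERL values referred to in the text.

    # Hypothetical ERL bands expressed as projected dose in millisieverts; the
    # actual values are those set out in NRPB advice and are not reproduced here.
    ERL_BANDS = {
        "sheltering":    (5.0, 25.0),
        "stable iodine": (50.0, 250.0),
        "evacuation":    (100.0, 500.0),
    }

    def advise(projected_dose_msv):
        """Return a recommendation for each countermeasure: below the lower level
        no action, between the levels the measure should be considered, above the
        upper level it should almost certainly be introduced."""
        advice = {}
        for measure, (lower, upper) in ERL_BANDS.items():
            if projected_dose_msv < lower:
                advice[measure] = "not indicated"
            elif projected_dose_msv <= upper:
                advice[measure] = "consider"
            else:
                advice[measure] = "implement"
        return advice

    print(advise(80.0))   # sheltering: implement, stable iodine: consider, evacuation: not indicated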
3. ROLES OF RELEVANT ORGANISATIONS

The Operator
3.1 In a nuclear emergency the operator of the nuclear installation remains fully responsible for the nuclear site. The operator may receive advice from other agencies, depending on the type of emergency, for example from the Nuclear Installations Inspectorate or other nuclear establishments, but the responsibility for bringing the incident under control and ensuring the restoration of safe conditions at the site remains with him. 3.2 The Emergency Arrangements would be invoked by a senior manager
of the plant present on site. The nature and prognosis of the event, together with the possible consequences for the safety of people on and off the site, determine the state of emergency which is declared. The operator's Emergency Plan specifies the state of emergency to be declared for a range of conditions. The emergency states generally cover the following situations:
a) an on-site incident with no possible off-site consequence (Site Incident Standby or Alert);
b) an on-site incident which has the potential to develop such that significant off-site consequences may occur (Emergency Standby); and
c) an on-site incident with significant off-site consequences (Emergency Alert).
The senior manager invoking the emergency plan is nominated as Site Emergency Controller (SEC). Only designated staff identified in the Emergency Handbook can act as SEC and declare, upgrade, downgrade or cancel any of the emergency states. The responsibility for ensuring the restoration of safe conditions and minimising off-site releases remains at all times with the operator. 3.3 In the event that a Site Incident or Emergency condition were declared, all staff would muster at pre-arranged points and roll-calls would be taken. Certain members of staff would assemble to form a predetermined emergency response organisation. 3.4 This organisation would comprise a number of teams, each of which has received instruction and training in the tasks it would be required to perform. For example, there would be a team to assess the extent of any damage to the plant, a team to rescue any casualties, a team to repair damaged plant, and teams to monitor levels of radiation both on the site and in its vicinity. In addition to these site teams, arrangements are made with other nuclear sites to provide assistance in terms of personnel and equipment during an emergency.
3.5 The SEC would be responsible for controlling all aspects of the emergency organisation in the early stages including, inter alia, notifying relevant off-site organisations, recommending actions for the protection of site personnel and members of the public, and restoring the plant to a safe and stable condition. 3.6 Within a few hours of an emergency being declared at a major nuclear site an Off Site Centre (OSC) would be set up. These OSCs are generally located within 10–30 kilometres of the nuclear site and are equipped with communications and facilities designed for use in an emergency. Details are given in each site emergency plan. The main purpose of the OSC is to relieve the Site Emergency Controller at the nuclear installation of the responsibility for advising on emergency action outside the site and to provide any necessary support to the site. 3.7 Represented at the OSC would be the operator, the NII, the NRPB, the Department of Energy (DEn), the Ministry of Agriculture, Fisheries & Food (MAFF), the Department of the Environment (DoE), the Department of Health (DoH), the Police, the Ambulance Service, and the local District and County Authorities. These representatives provide links with their parent organisations, ensuring that adequate information and advice is available both at the OSC and at the various organisations' own emergency control centres, so that the most appropriate actions can be taken to protect anyone who might be affected by the incident. A conference facility is provided at which the representatives can discuss their intentions with other agencies to ensure that action taken is consistent and coordinated. 3.8 The Site Emergency Controller would retain responsibility for control of on-site activities and for restoring the plant to a safe condition. Liaison with other organisations and the off-site duties of the operator would pass to the OSC Controller as soon as the OSC becomes operational. The OSC, and not the nuclear site, would become the focal point for all liaison activities and the co-ordination of advice to all outside organisations. The OSC would be managed and staffed by the operator. The
OSC Controller would then be responsible for the co-ordination of assessments of the monitoring of any radioactive material released from the site. He would also be responsible for assessing the course of the accident, based on information from the Site Emergency Controller and his headquarters. These assessments would continue throughout the emergency, and enable authoritative information to be available on the state of the plant and on the measures needed to protect the public. 3.9 It is essential that it should be clearly understood who is responsible for giving advice on the protection of the public during the course of an emergency. In the first few hours of the emergency, the Site Emergency Controller would provide it. Once the OSC had been activated, the OSC Controller would assume this responsibility and would inform the police and all the other authorities that he had taken over. If, however, the emergency was considered to have off-site consequences, it is expected that a Government Technical Adviser (GTA) would be appointed to take over the specific responsibility of giving this advice (see 3.16 below). However, the Government Technical Adviser would not assume any executive authority.
Local Authorities
3.10 All county authorities in England and Wales (and their counterparts in Scotland and Northern Ireland) have plans for dealing with emergencies. These are designed to cope with a wide range of emergencies, such as floods, major fires and transport accidents. Most nuclear installations are so situated as to be likely to require only one county to be involved, and these counties have in their plans a section on emergencies at nuclear sites located within their boundaries. In a nuclear emergency, the plan would be activated (as in other emergencies) by the police notifying designated representatives of the county and district authorities. These authorities would have a major role to play in the implementation of any countermeasures. In the event of evacuation these would include responsibilities for such matters as emergency housing, feeding and transport and the provision of social services. The responsibilities of each county
department are laid down in the county emergency plans. For example, the County Engineer or Surveyor would normally be responsible for emergency transport; the Chief Education Officer for emergency feeding; and the Housing Authority for emergency accommodation. County Emergency Planning Officers are, in most counties, responsible to their Chief Executive for contingency planning and for the co-ordination of their resources in an actual emergency. Information would be provided by the Off-Site Centre to the County Emergency Planning Officer or his deputy in order that his authority may be kept fully informed on the course and consequences of the emergency. In Scotland, the responsibility for preparing emergency plans for civil contingencies rests with the Regional and Islands Area Councils.
Water Authorities
3.11 The water authorities would be informed promptly by the operator of any nuclear emergency and would be responsible for deciding what action, if any, is necessary to restrict water supplies as a result of a nuclear incident. The main responsibility for action in responding to the possible off-site effects of an emergency would rest at county level. The possibility of substituting alternative supplies would be examined, even at very low levels of contamination of a particular source. Water authorities would be advised by the radiochemical inspectors of HM Inspectorate of Pollution (HMIP). In Scotland the responsibility for deciding on any restrictions of water supplies lies with the appropriate Regional or Islands Area Council, advised by HM Industrial Pollution Inspectorate (HMIPI).
Government Departments (GDs) and Agencies
3.12 In England and Wales the Ministry of Agriculture, Fisheries and Food (MAFF) would be involved in any nuclear emergency which had effects off-site. A release of radioactive material following an accident could contaminate grass, crops and produce and could appear in the milk of grazing animals. There is a MAFF Memorandum of Procedures setting out the Department's policy and contingency planning for all types of nuclear
emergency in England and Wales. Each MAFF region also has a nuclear contingency plan which provides for farmers or producers whose land is affected to be notified quickly by the police or MAFF officials. The MAFF Chief Regional Officer or his nominee is responsible for deciding what countermeasures are required, such as restricting movements of produce and milk. A monitoring distance for milk control is identified in the site emergency plan, which can be extended if necessary. In deciding what range of restrictions to impose, Chief Regional Officers would be advised by the Atomic Energy Unit of MAFF's Food and Science Division. This advice would be based upon the results of monitoring coordinated at the EOC or OSC and on consultation with the Controller or Government Technical Adviser. MAFF would set up its own control centre for the implementation of any necessary measures. The Department's Memorandum advises Chief Regional Officers to be cautious by initially setting wide boundaries to the milk restriction area. 3.13 HMIP is responsible for giving advice to water authorities on the risks of contamination of water supplies from any radioactive source in England and Wales. HMIP and the water authorities would be informed promptly by the operator of any nuclear emergency. Water supplies would be monitored by HMIP, and samples assessed at the operator's laboratories and by the Laboratory of the Government Chemist. This monitoring would be important, whether or not significant contamination was suspected, in order to reassure the public about the safety of the water supply. 3.14 The National Radiological Protection Board (NRPB) is responsible for advising Government Departments and other bodies on radiological protection matters. It would, with Government Departments, undertake the co-ordination of monitoring for radioactive material outside the area covered by the site emergency plan; this would include the information available from RIMNET [7]. RIMNET is the nationwide network of monitoring stations being developed by the Department of the Environment as one of the responses to the Chernobyl accident. The NRPB would liaise with the NII, Government Departments and operators, and provide advice, as necessary, on the radiological hazards.
3.15 The Department of Energy would be responsible in England and Wales for the Nuclear Emergency Briefing Room (NEBR) in London. The Briefing Room would be opened soon after the emergency had been declared by the operator. It is not an operations room or control room, but would be the focal point in Whitehall for information and the briefing of Government Ministers and Departments in an emergency. The Department would be in close touch with the operator, the NII and other Government Departments and agencies. The Department would also provide a central press briefing centre in its offices at Thames House South, Millbank, where Government statements on the emergency would be issued. In Scotland the arrangements are broadly similar, centred on the Scottish Office Emergency Room (SOER) in Edinburgh. Represented at the NEBR would be the relevant central Government Departments, for example DoE, MAFF, DoH and MOD. Representatives of the NII and NRPB would also be present, including the Chief Inspector of the NII or his Deputy. The representatives would be briefed by their HQ organisations, with DEn taking the lead and receiving advice from the GTA team at the OSC. Central Government Departments would issue press statements relating to matters which are their statutory or policy responsibilities. 3.16 DEn is responsible for the formal appointment of the GTA for emergencies in England and Wales, the Scottish Office being responsible for emergencies in Scotland. The principal function of the GTA is to provide independent expert technical advice to the police and other organisations with executive responsibilities for off-site action to protect the public. This advice will include assessments of the progress of the accident, radiological conditions and effects off-site, and recommendations on the countermeasures needed to protect the public. In addition he would brief visiting Ministers on the cause and progress of the accident, and make himself available for regular press briefings. He also has the associated aim of facilitating consistency between the actions and public statements of all the organisations with off-site responsibilities (see Figure 1). Although the GTA would not be the sole channel of information to the media, in the interests of ensuring that the public received consistent and coherent information he would need to be informed,
Figure 1 Civil Nuclear Emergencies GTA Communication Arrangements
if at all possible before their issue, of the timing and content of any public statements made by the organisations involved in dealing with the emergency. 3.17 When a GTA is appointed, a Senior Government Liaison Representative (SGLR) would immediately be assigned to the OSC. He would be a senior civil servant from DEn or the Scottish Office as appropriate, and would have a small team to assist him. The SGLR would support the GTA by ensuring that the GTA remains aware of central government actions and that the NEBR knows what is happening at the OSC, so that consistent and unambiguous advice can be given to the general public. 3.18 The NII for its part would mobilise its own emergency response teams. These would comprise the headquarters assessment team at the Bootle Emergency Room under the control of the Emergency Director; the Superintending Inspector, the Site Inspector and supporting staff at the OSC/site; and the Chief Inspector and supporting staff at the NEBR or SOER as appropriate (see Figure 2). Important information about the emergency would be forwarded to the Bootle Emergency Room by the NII staff at the OSC/site. Using this information the Bootle Emergency Room would perform an independent assessment of the event and of the adequacy of the actions taken by the operator. These assessments would form the basis of any advice given to the operators or the GTA. The Chief Inspector, at the NEBR/SOER, would be kept informed on the course of events by the Bootle Emergency Director and would be responsible for briefing and advising the Secretary of State for Energy or for Scotland.
The Emergency Services
3.19 The Emergency Services are the Police, Fire and Ambulance Services and, for coastal sites, HM Coastguard. Each of these is recognised as having an important function in dealing with a nuclear emergency which is consistent with its duties and responsibilities for dealing with any other emergency. Nuclear emergencies have unique characteristics that must be taken into account at the planning stage. Proper advice needs to be available
Figure 2 Civil Nuclear Emergencies NII Communication Arrangements
to the emergency services, both before the event in order to draw up adequate plans and during an incident to ensure that undue risks are not taken. This means, for example, that the Emergency Services have to be aware of the hazards and characteristics of ionising radiation. They need to know what precautions require to be taken and what controls are necessary. This is particularly relevant for the Fire Service, which could be involved in the on-site actions to bring any fire under control and would be likely to assist in rescue operations if these were required. 3.20 In the event of major emergencies the prime objective of the Police is the safety of the public, and their main responsibility is to coordinate the actions of all emergency and other essential services. To this end all Forces throughout the country prepare contingency plans to reflect local circumstances and needs. Dealing with an off-site emergency at a civil nuclear site is one aspect of emergency planning for those Forces containing such establishments. 3.21 The contingency plans to deal with an off-site emergency at a civil nuclear site outline Police responsibilities. They include the establishment of the Police Coordination Centre to coordinate the efforts of all services, the notification of and liaison with other relevant agencies and authorities, and the implementation of appropriate countermeasures based on advice from the Site Emergency Controller or the OSC. 3.22 Particular emphasis is placed on the need for working in close cooperation with the Site Emergency Controller and all the involved agencies, notably the Government Technical Adviser, once the OSC has been set up. 3.23 The initial scientific advice is given to the Police by the SEC. The setting up of the OSC will provide, through the GTA, independent expert technical advice to the Police to assist them in carrying out their executive responsibilities for off-site action to protect the public. A Senior Police Officer together with support staff is accommodated at the OSC.
3.24 The contingency plans include: the identification of rendezvous points for the emergency services; predetermination of assembly areas, check points, evacuation routes and reception centres; the setting up of road blocks; the identification of evacuation zones and of vulnerable groups and individuals within the zones, and methods of notifying the community therein; documentation of evacuees; casualty clearance systems to deal with enquiries and the operation of mortuary facilities if required; and perimeter control of evacuation zones. 3.25 The implementation of countermeasures to protect the public would be orchestrated by the Police and could include warnings and advice on sheltering, issuing or assisting in the issue of stable iodine tablets, the evacuation of particular areas, preventing other people from entering areas of risk, the decision to allow return into the area once it was safe to do so, and notification to the public of the “all clear”.
Health Authorities
3.26 A nuclear accident could result in demands being made on National Health Service (NHS) hospitals and the Health Service both local to the accident and more generally. Health Authorities (including the Ambulance Services) local to the site would be notified of a nuclear emergency by the operator or the Police as part of the standard procedure. There could be an immediate need for the movement and treatment of casualties; the Ambulance Service might be called upon to assist in the evacuation of persons with poor mobility; monitoring and reassurance would be required for persons who had been evacuated; and monitoring and reassurance, when appropriate, would be provided for others who had been, or believed they had been, exposed to radioactivity. Reassurance in this context means that people will need to be provided with accurate information on the accident, how it is likely to affect them and, as appropriate, suitable monitoring facilities. In most instances this will be to say that they are safe and need take no special precautions. Casualties resulting directly from the accident would almost certainly be limited to personnel on the site. Operators are responsible for ensuring, in consultation with District Health
Authorities, that arrangements exist with particular hospitals for the emergency treatment of personnel injured on a nuclear site, including those who might be contaminated with radioactivity.

3.27 It should be noted here that for all accidents except those involving very unlikely and extreme situations, the dose levels envisaged off-site would not lead to acute or immediate effects such as vomiting, erythema (reddening of the skin) or pneumonitis. The risks are those of stochastic effects, ie long-term fatal or non-fatal cancers and possible genetic effects. Following an accident which is within the design basis (as described in Section 2) and the implementation of any necessary countermeasures (as described in Section 5), the levels of radiation dose to, or contamination of, people outside the site would be low. Acute effects are unlikely and the individual risk of any long-term effect on health is likely to be small. Thus the main action required of the Health Service would be reassurance as described above.

3.28 If the scale of the release did exceed that of the reference accident there might be a need for more extensive countermeasures to protect the public. In all cases the general public will need reassurance, and monitoring will be required for people who may have been contaminated (or fear they have been contaminated) during the release. Monitoring and reassurance of those in the vicinity of the site, or who have been evacuated, would be carried out at local centres under NHS supervision, but with the assistance of the nuclear plant operators and NRPB. Additional reassurance or monitoring might be required further from the site. This would be provided by Health Authorities, who would also give advice.

3.29 On a national basis the Health Service could expect to receive many enquiries concerning the health implications of an accident, and perhaps demands for monitoring. All Health Authorities are being asked to plan to meet such an eventuality.

4. CONSEQUENCES OF A NUCLEAR EMERGENCY

4.1 A nuclear reactor, or an installation processing spent nuclear fuel or nuclear waste, cannot cause a nuclear explosion like an
atomic bomb. The main potential hazard lies in the radioactive fission products which accumulate in the nuclear fuel while the reactor is operating. If an accident at a nuclear installation were to occur, the most likely outcome is that no-one would be hurt at all, because at least one of the various safety features would prevent the accident from developing to the stage where a significant release of radioactivity from the site could take place. In the event that an accident at a nuclear plant led to a release of radioactivity to the atmosphere, it would be in the form of gases or small particles. The release could last a relatively short time or could be spread over a few hours or even longer, depending on the type of accident, the plant involved and the mechanism of the release.

4.2 Radioactivity released to the atmosphere will be carried downwind and dispersed in a similar manner to the plume of smoke from a chimney stack. Particulate radioactivity would be deposited on land or sea, but mainly locally on fields and pastures surrounding the installation. The magnitude of the deposition, of course, depends on the nature of the radioactive material and the prevailing meteorological conditions; it will decrease with distance from the point of release. People may receive radiation doses through a variety of pathways as a consequence of a nuclear accident:

a) external gamma radiation from the facility;
b) external beta and gamma irradiation from activity in the plume;
c) inhalation of activity in the plume;
d) contamination of skin and clothes;
e) inhalation of activity resuspended following deposition;
f) ingestion of contaminated food and water;
g) external gamma irradiation from deposited activity.

Decisions concerning countermeasures against exposure routes (a)-(d) need to be given priority. This is because dose uptakes by routes (e)-(g) would be received, if not prevented by countermeasures, over a much longer timescale than routes (a)-(d).
4.3 The radiological effects of a major release of radioactive material can be divided into health impacts and economic impacts. The consequences of a release of the magnitude which occurred at Chernobyl are shown in Table 1. It is of interest that no member of the public had to be hospitalised as a result of radiation injuries arising from the Chernobyl accident [8]. Acute radiation sickness was diagnosed in 237 people, all of whom were either at the plant at the time of the explosion or arrived there shortly afterwards to fight the fire or control the plant [8]. Long-term programmes for the medical and biological monitoring of the population are being established as part of post-Chernobyl epidemiological studies.

TABLE 1: CONSEQUENCES OF AN EMERGENCY AT A NUCLEAR SITE WHERE THERE IS A MAJOR RELEASE OF RADIOACTIVE MATERIAL
4.4 Three time phases are generally accepted as being common to all nuclear accident sequences; within each of them, different considerations apply to decision making for off-site action. These phases are termed early, intermediate and recovery. They
will almost certainly overlap, but they nevertheless provide a useful framework for emergency response planning.

4.5 The early phase can be subdivided into the period before a release begins (but when the potential for off-site exposure has been recognised) and the period when the majority of the release occurs. For some types of accident (eg those caused by external events) there may in fact be no pre-release ‘warning’ period during which precautionary countermeasures can be taken. The common feature associated with the threat of release, and the first few hours of release, is that emergency response decisions are more likely to be based primarily on the analysis of data and predictions derived from the plant instrumentation, or on the knowledge or anticipation of a release occurring, rather than on off-site measurements, which inevitably take time to obtain and interpret. Thus, decisions to implement those countermeasures which are appropriate and reasonably practicable during this period will be based largely on plant status and meteorological information; the associated potential doses to individuals in the population would be assessed on the basis of prior analysis of plant fault sequences. As measurements of off-site exposure rates and concentrations of radionuclides in the environment become available during the early phase, they may be used to confirm the continued need for any countermeasures that might already have been taken and as an input into decisions on the need for further measures.

4.6 The intermediate phase covers the period which, typically, starts from the first few hours after the commencement of the release and could extend for several days or more. In this phase, environmental measurements of external radiation from deposited materials and levels of contamination in food, water and air will become the main source of data for dose assessments. The radiological characteristics of any deposited material would also be determined. These data can be used to estimate doses for the principal exposure pathways, which can then be compared with intervention levels as a basis for taking decisions on the further implementation or continuation of countermeasures.

4.7 The recovery phase is concerned with the return to normal
conditions and may extend from some weeks to years depending upon the magnitude and radionuclide composition of the deposition and the nature and extent of the area over which countermeasures remain in force. During this phase data obtained from environmental monitoring can be used to assist in making decisions on a return to normal conditions, by the relaxing of the various countermeasures imposed.
5. COUNTERMEASURES TO MITIGATE THE OFF-SITE EFFECTS OF A NUCLEAR ACCIDENT

5.1 The effective release is the result of the interaction of a number of complex factors and will vary depending upon the accident. However, in emergency planning terms the problem can be simplified because the hazards can be categorised and a few particular isotopes dominate the potential threat.

5.2 To mitigate these hazards various countermeasures can be adopted. Countermeasures can be very effective in reducing the doses received, these reductions being dependent upon how the countermeasures are managed. The main countermeasures are:

a) sheltering within buildings;
b) evacuation;
c) the taking of stable iodine tablets;
d) decontamination of the skin and clothing; and
e) the restriction on consumption of contaminated foodstuffs and water supplies.
Sheltering within buildings

5.3 The protection afforded by sheltering depends principally on the ventilation rate of the building. The optimum arrangements are for a low ventilation rate during the passage of the plume followed by a high ventilation rate after the plume has passed. This would normally be achieved by closing and then opening all doors and windows at the appropriate time. In addition to being a countermeasure in its own right, sheltering may be used as a precursor to evacuation.
5.4 The protection would be reduced if sheltering is not implemented before the plume arrives or if the ventilation is not increased soon after the plume has passed. Experimental data for typical UK buildings suggested that dose reduction factors could be as good as 0.1 but are more likely to be in the range 0.3 to 0.5 [9].

Evacuation

5.5 The dose reduction which can be achieved depends upon the time at which the evacuation is implemented. Clearly, if evacuation to a location well away from the plume direction is completed before the plume arrives, then no dose will be received. On the other hand, even after the cloud has passed, dose reduction will be achieved by evacuating people from any area of deposited activity.

Iodine Tablets

5.6 An effective countermeasure against the consequences of inhaling radioactive iodine is to take stable iodine, which will saturate the thyroid and prevent further uptake. The time of administration is important, but delayed administration can still result in a useful reduction of dose. Typical figures provided by NRPB [10] are shown in the following table:

TABLE 2: EFFECTIVENESS OF TAKING STABLE IODINE
Restriction of food and water

5.7 This countermeasure is important to prevent the ingestion of radioactive material. However, while restrictions on food or
water may be imposed at the time of the incident, the countermeasure is essentially a longer term requirement. It is therefore not an immediate dose reduction measure in the way that sheltering, evacuation or the taking of stable iodine tablets are. The levels at which food or water is restricted are the responsibility of MAFF or the Welsh Office and of HMIP respectively, taking due account of NRPB advice.
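As a simple numerical illustration of how the dose reduction factors quoted in paragraph 5.4 translate into the dose actually received, the following sketch (in Python, with a purely hypothetical projected dose) multiplies a projected cloud-phase dose by a dose reduction factor; it is illustrative only and not part of any formal assessment method described in this paper.

    # Illustrative sketch only. The DRF range 0.1 to 0.5 for sheltering in
    # typical UK buildings is quoted in paragraph 5.4 [9]; the projected
    # outdoor dose of 10 mSv is hypothetical.

    def sheltered_dose(projected_dose_msv, dose_reduction_factor):
        """Dose received indoors if sheltering covers the whole plume passage
        (DRF = ratio of indoor dose to the dose received outdoors)."""
        return projected_dose_msv * dose_reduction_factor

    projected = 10.0  # mSv, hypothetical projected outdoor cloud-phase dose
    for drf in (0.1, 0.3, 0.5):
        print(f"DRF {drf}: dose received about {sheltered_dose(projected, drf):.1f} mSv")

For a projected 10 mSv, sheltering would on this basis reduce the dose received to between about 1 and 5 mSv, which is why timely implementation, before the plume arrives, matters.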
6. EMERGENCY REFERENCE LEVELS

6.1 The implementation of the countermeasures described in the previous section needs to be related to the magnitude of the predicted dose. Advice on this is provided by the NRPB. The current formal advice on the exposure of the general public is given in their publication ERL 2 [5]. However, an insight into the probable revision of NRPB’s advice can be gleaned from a recent staff paper [11].

6.2 ERL 2 is based primarily on the ICRP recommendations, which state that “The decision to initiate remedial action will have to take account of the particular circumstances that prevail. In general it will be appropriate to institute countermeasures only when their social cost and risk will be less than those resulting from further exposure”. This means, for example, that the risk (in its widest sense) of evacuation must be balanced against the risk from any additional dose received by not evacuating. Accordingly the NRPB have developed a set of ERLs for each countermeasure, each ERL having an upper and a lower bound of dose.

6.3 ERL 2 states that the lower ERL is the dose equivalent at which consideration of the introduction of the countermeasure should begin. The risks associated with radiation doses less than this lower bound are considered by the Board to be less than those associated with the countermeasure. If the predicted doses exceed the lower ERL, implementation of the relevant countermeasure should be considered; implementation becomes increasingly desirable as the predicted dose rises until, at the upper ERL, it becomes essential and, in the view of NRPB, should be introduced whatever the circumstances.
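The two-tier logic of paragraph 6.3 can be summarised in a short decision sketch (Python, illustrative only; the numerical ERL values themselves are those published in ERL 2 and are not reproduced here).

    def countermeasure_advice(predicted_dose, lower_erl, upper_erl):
        """Two-tier ERL logic as described in paragraph 6.3: below the lower
        ERL the countermeasure is unlikely to be justified; between the bounds
        it should be considered in the light of the circumstances; at or above
        the upper ERL implementation is regarded as essential."""
        if predicted_dose < lower_erl:
            return "countermeasure not normally justified"
        if predicted_dose < upper_erl:
            return "consider countermeasure, balancing its risk against the dose averted"
        return "implement countermeasure"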
6.4 The NRPB expects its advice to be used with some degree of flexibility and to be incorporated into emergency plans by the operators in consultation with all who may be involved. The NII expects to see the advice reflected in operators’ Emergency Arrangements.

6.5 In setting the upper and lower ERLs, and in determining for a specific incident whether a particular countermeasure should be implemented, it is the balance of overall risk between the dose and the countermeasure that is the most important factor.

6.6 The NRPB advice in ERL 2 provides a vital ingredient of NII and GTA advice in the event of an accident. In order that the advice may be interpreted validly, quickly and effectively on the day, NII and NRPB have paid special attention to emergency planning within the close and special relationship which exists between the two bodies. In particular, regular meetings take place on emergency arrangements. Through these meetings emergency reference levels and their implications are discussed to ensure that the best advice is readily available and understood. In addition, joint meetings are held between NII, NRPB and nuclear operators, again to ensure that a common understanding is reached between the various organisations.
7. EMERGENCY EXERCISE PROGRAMME AND REQUIREMENT

7.1 The NII requires all operators’ employees who could be involved in an emergency to be individually trained for their tasks, and requires regular exercises to ensure appropriate team performance. In addition to these training exercises, the NII requires a demonstration exercise to be held each year for each site. This demonstration exercise, known as the Level 1 exercise, is witnessed by the NII and is one of the means by which the NII assesses the appropriateness of the arrangements, training and resources of the operator for dealing with emergencies.

7.2 Level 1 exercises mainly concern the operator’s actions on and off site but may involve the Emergency Services and other external organisations if so arranged by the operator. The extent to
which the OSC is activated varies according to the needs of the operator or as required by the NII. The timing and scenario of the exercise have to be agreed by the NII. The exercise is designed to test the operator’s on-site response and the on-site and off-site monitoring and assessment capability.

7.3 An NII team observes the actions and responses of the participants in dealing with the postulated on-site events and off-site releases. At the conclusion of the exercise the main findings of the participants and observers are discussed at a debriefing meeting. Comments and observations of the NII team on any shortcomings evident in the exercise, or on areas where improvements are desirable or required by the NII, are discussed. Any unsatisfactory aspects would need to be corrected and, if necessary, a repeat of any part or all of the exercise might be required.

7.4 In addition to the Level 1 exercise, which is mainly aimed at the operator’s arrangements, there are programmes of demonstration exercises to rehearse the OSC function and the larger national involvement. These are known as Level 2 and Level 3 exercises respectively.

7.5 The aim of the programme of Level 2 exercises is to demonstrate the function of each OSC once every 2 to 3 years. The exercise requires the operator to establish the OSC and provides for any agencies with responsibilities or duties at the OSC to take part and exercise their function as appropriate. This includes the GTA. The NII, in addition to exercising its emergency duties, has to be satisfied with the accident scenario and provides a full GTA team. The NII again provides a team of observers and chairs the subsequent debriefing meeting at which the main findings of the participants and observers are discussed.

7.6 The Level 3 exercise is designed to test the setting up and operation of the OSC and also to test the wider national arrangements. This includes the participation of the various Government Departments at their HQ centres and at the NEBR. The
exercise could last more than one day and would be chosen from the programme of Level 2 exercises with the aim of holding one such exercise each year. 7.7 The results of the emergency exercises are reported to Local Community Liaison Committees. NII inspectors attend the local liaison meetings as observers and are prepared to comment on the results of the emergency exercises.
8. INFORMING THE PUBLIC

8.1 People will always be afraid of things that might cause them harm and which they do not understand. This fear will be exaggerated in a period of real or supposed emergency when they suspect that the situation may be out of control. It serves the officials responsible for emergency management, and makes their job easier, if the public understands the response that is being made to an accident and accepts the intentions of those controlling the response. Thus information management is important, and adequate information is required both before and during an emergency. These two issues are dealt with below.

Information before an Emergency

8.2 Copies of the operator’s arrangements are readily available to the public and are placed in the public libraries near to all licensed sites. Their availability is publicised in the local press. In addition, operators have cooperated with local authorities to produce pamphlets for distribution to the public in the vicinity of nuclear sites. These pamphlets summarise the emergency arrangements, give brief details of actions which the general public may be asked to take, and give guidance as to where more information may be obtained.

8.3 This approach to individuals is supplemented by formal groups such as the Local Liaison Committees (LLC). The LLC is usually chaired by the nuclear site manager and is composed of elected representatives and officials of local government and other representatives of local organisations such as the National
Farmers’ Union. The LLC is a means of consulting, as well as informing, the local community on many topics including emergency arrangements.

8.4 The HSE produced a booklet on “Emergency Plans for Civil Nuclear Installations” [12] in 1982. This set out the general arrangements for dealing with nuclear emergencies. It has been a well received publication and HSE is preparing to reissue an updated version. NRPB have produced a booklet entitled “Living with Radiation” [13] which explains radiation hazards.

8.5 Information and education should be directed not only at the general public but also at those who can help explain the situation in the event of an emergency (eg medical personnel and journalists). Of particular importance is the education of doctors and other health care workers, because this group plays a vital role as the primary source of advice on health matters for the majority of the population.

Information during and after an Emergency

8.6 One of the first actions taken by the operator in the event of a nuclear emergency with possible off-site consequences is to alert and inform the Police and, through them, the public, the other Emergency Services and the local authorities at both County and District level. The actions taken by these bodies are explained in Section 3. People evacuated to reception centres would be kept informed by representatives of the operator and the local authority at these centres. Other people would be kept informed by special announcements on local radio and television.

8.7 Information on foodstuffs, livestock and water supplies would be promulgated by bulletins issued by MAFF, HMIP, the Water Authorities, the Scottish Office and the Welsh Office as appropriate. These would also be the subject of special announcements on local radio and television.

8.8 In the more general sense, information would be made available
through announcements and statements by the GTA, central government and other relevant agencies through the media. This more general information would be promulgated via the media briefing centres associated with the NEBR in London and the OSC local to the incident.
9. CONCLUSIONS

9.1 The NII has always considered emergency arrangements to be an important aspect of the Licence requirements. Considerable time and effort has been devoted to the development of the arrangements. This includes the NII inspecting and reviewing the Licensee’s arrangements and witnessing and taking part in emergency exercises. Based on this experience the NII considers that the system that has been developed is a sound one which would be able to respond effectively in an emergency.

9.2 The ability to transfer the co-ordination of the actions to deal with off-site effects to an OSC is right in principle and would work in practice. This has been demonstrated by exercises. The OSC would provide a focus of information and advice for the organisations which would be involved in an emergency response. The OSC would provide the centre for a consistent view during the course of an accident and on the necessary countermeasures. The concept of a GTA provides the necessary independent and authoritative advice to ensure that the most appropriate actions are taken. In the event of an emergency which is beyond the design basis, the OSC would provide the information and advice to extend the response. This could include, if necessary, calling on additional resources through the NEBR.

9.3 For a nuclear emergency at a civil nuclear installation the NII has two broad responsibilities. These are:

i) to ensure that the operator makes and maintains satisfactory arrangements;

ii) to be prepared to respond in an emergency in accordance with the emergency arrangements.
9.4 The approved emergency plans for the various civil nuclear installations contain arrangements to minimise the consequences of an accident, to detect the off-site effects and to implement the countermeasures necessary to protect the public. Arrangements are available to ensure that organisations involved in dealing with an emergency are properly consulted.
REFERENCES

1. Safety Assessment Principles for Nuclear Reactors, HSE, 1979.
2. Safety Assessment Principles for Nuclear Chemical Plant, HSE, 1983.
3. The Work of HM Nuclear Inspectorate, HSE, 1989.
4. For example see: Safety Series No 50-C-G (Rev 1), Code on the Safety of Nuclear Power Plants: Governmental Organisation, IAEA, Vienna, 1988.
5. Emergency Reference Levels: Criteria for Limiting Doses to the Public in the Event of Accidental Exposure to Radiation, ERL 2, NRPB, July 1981.
6. Sizewell B Public Inquiry, Report by Sir Frank Layfield, para 42.60 et seq, DEn, 1987.
7. Nuclear Accidents Overseas: The National Response Plan and Radioactive Incident Monitoring Network (RIMNET): A Statement of Proposals, DoE, 1988.
8. World Health Organisation Environmental Health Series No 25, Nuclear Accidents and Epidemiology, WHO, Copenhagen, 1987.
9. Brown J, Radiation Protection Bulletin, Nov 1988, 97, pp 9–15.
10. NRPB–R182, NRPB Emergency Data Handbook, para 6.2, NRPB, March 1986.
11. Hill M D, Wrixon A D and Webb G A M, J. Radiol. Prot., 1988, 8, 197–207.
12. Emergency Plans for Civil Nuclear Installations, HSE Booklet, 1982.
13. Living with Radiation, 3rd Edition, NRPB, 1986.
THE NATIONAL RESPONSE PLAN AND RADIOACTIVE INCIDENT MONITORING NETWORK (RIMNET)

M W Jones
Radioactive Substances Division
Her Majesty’s Inspectorate of Pollution
Department of the Environment
London
ABSTRACT

The Department of the Environment is responsible through Her Majesty’s Inspectorate of Pollution for co-ordination of the Government’s response to overseas nuclear incidents. This paper describes the contingency arrangements that have been set up for this purpose.
INTRODUCTION
Following the accident at Chernobyl, the Government thoroughly reviewed its contingency plans for dealing with incidents leading to a release of radioactivity to the environment. This indicated that while existing procedures provided an adequate basis for response to a nuclear accident in the UK, there was a need for further planning to deal with the national response to an accident overseas.
The Department of the Environment is responsible for co-ordinating Government action in the event of any nuclear accident overseas and for preparing a National Response Plan. An important part of this Plan is the establishment of a network of continuously operating radiation monitors capable of detecting independently any radioactivity arriving over the UK. This network is the Radioactive Incident Monitoring Network or RIMNET.
A statement of the Government’s proposals for implementing RIMNET, published in January 1988 [1], outlined a two-phase approach. The Phase 1 RIMNET system has now been set in
place. This paper gives details of this Phase 1 system, and describes the way in which it will be used, together with a description of proposals for Phase 2 development.
The Phase 1 programme provides the means of detecting radiation arriving over the UK and handling the immediate response to it. The plans owe much to the help and advice provided by Government bodies, local authorities, the nuclear industry and others as a result of their participation in a national co-ordinating committee, the Radioactive Incident Monitoring Co-ordinating Committee (RIMCC).
The Committee will continue to operate to guide and to oversee improvement of the Phase 1 system and preparations for Phase 2. It will also review the exercises designed to test the Phase 1 system performance. The Phase 1 RIMNET system is expected to operate for a period of 2–3 years while the more extensive and fully automated Phase 2 system is being planned, installed and commissioned. The proposals for the Phase 2 system are set out later in this paper.
International agreements negotiated since Chernobyl should ensure that the United Kingdom (UK) Government is notified, through HMIP, of any nuclear accident overseas resulting in a significant release of radioactivity. Operation of the RIMNET system will mean that, even if these notification arrangements fail, any unexpected increases in radiation levels over the UK, of the kind that could result from an overseas accident, can be detected immediately.
THE PHASE 1 RIMNET SYSTEM
The Radioactive Incident Monitoring Network (RIMNET) system has been designed to provide a national system for detecting and measuring radiation levels in the air over the UK. It also provides facilities for the national dissemination of information and advice concerning the incident using modern information technology. The Phase 1 RIMNET system is based on gamma-ray dose rate monitoring equipment installed at 46 field stations throughout the United Kingdom. The locations, which are all meteorological observatories, are shown in Figure A1.1 of Annex 1. A picture of the RIMNET gamma-ray dose rate monitoring equipment is shown in Figure A1.2.
Gamma-ray dose rate readings are taken by meteorological observers, usually every hour. The readings are transmitted to the Bracknell headquarters of the Meteorological Office along with other meteorological data and then transmitted to the Central Database Facility (CDF), which is located at the Department of the Environment (DOE) in London SW1, and to a back-up computer installed at a DOE office in Lancaster.
The CDF collects and analyses data from all RIMNET sites and will automatically trigger an alert in the event of any unexpected rise in gamma-ray dose rate readings satisfying one of three algorithms covering different eventualities. The CDF computer and its associated communications equipment were handed over to DOE by the installation contractor at the end of 1988 and commissioned early in 1989.
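The three alert algorithms themselves are not described in this paper; purely as an illustration of the general form such a check might take, the sketch below (Python, with a hypothetical fixed margin) flags a reading that exceeds a site's recent average by more than a chosen amount.

    # Hypothetical illustration only: the actual RIMNET alert algorithms are
    # not specified here. This sketch flags a gamma dose-rate reading that
    # exceeds the site's recent average by more than a fixed margin.

    from statistics import mean

    def unexpected_rise(recent_readings, new_reading, margin=0.1):
        """Return True if the new reading exceeds the average of the recent
        readings by more than the margin (all values in the same dose-rate units)."""
        return new_reading - mean(recent_readings) > margin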
In the event of an overseas nuclear accident being notified or detected, the provisions of the National Response Plan will be implemented to assess its effect on the UK. The Secretary of State for the Environment will be the lead Government minister. The Director of Her Majesty’s Inspectorate of Pollution (HMIP), will be responsible for advising Ministers about any subsequent actions.
If it is clear that there will be an effect on the UK, Parliament, relevant official bodies and the public will be informed, and appropriate alert messages will be issued. National Response Plan arrangements described in the following sections of this paper will be initiated. The form of the procedures is shown schematically in Figure A2.1 of Annex 2. A Technical Co-ordination Centre (TCC) staffed by officials from Government departments, the National Radiological Protection Board (NRPB) and the Meteorological Office will be established to co-ordinate the Government’s response. Staff in this Centre will have access to data stored on the CDF and will be responsible for liaising with the departments
and agencies which they represent to obtain radiological assessments. The TCC will provide a fully co-ordinated Government response. In particular it will ensure that all information and advice bulletins issued by Government Departments and other statutory bodies are consistent.
Specific Government monitoring programmes will commence. These will cover food, livestock, crops, water supplies and people and goods coming into the country from affected areas overseas.
Other monitoring data supplied by accredited organisations will also be held on the CDF to assist the TCC with its work. Data links with other Government departments to enable the necessary information to be entered into the CDF are being planned.
On the basis of the information provided by the TCC, staff in the DOE Information Centre (IC) will prepare regular press releases. Regional information will also be broadcast on viewdata and teletext systems (CEEFAX and PRESTEL) to ensure that the public are kept up to date.
Data held on the CDF and additional information and advice will be supplied daily to bodies such as local and water authorities offering to help keep the public informed. Electronic mail will be used for this purpose. This will help these organisations answer enquiries from the public on local radiation matters. Local authorities may wish to act as the main focal point for local public enquiries.
The Phase 1 TCC and IC facilities are based in the DOE emergency operations room in London SW1. Modern information technology facilities (teletext, viewdata, electronic mail etc) and CDF access terminals have been provided. Further details of the National Response Plan System are given in Annex 2.
ROLES OF ORGANISATIONS OUTSIDE THE DOE UNDER PHASE 1
Role of Other Government Departments

Government departments with relevant responsibilities will be represented in the TCC. These will include the Department of the Environment (DOE), the Ministry of Agriculture, Fisheries and Food (MAFF), the Department of Health (DH), the Ministry of Defence (MOD) and the three territorial departments: the Scottish Office, Welsh Office and Northern Ireland Office. The National Radiological Protection Board and the Meteorological Office will also be represented.
Following an incident these Government departments will continue to discharge their normal statutory responsibilities. However, the departmental representatives at the TCC will ensure that decisions and actions are fully co-ordinated. These representatives will also provide the details and assessments necessary for the TCC and IC to prepare information and advice bulletins.
To discharge their wide range of responsibilities in Scotland, Wales and Northern Ireland, and to provide local information and advice, the three territorial departments will set up their own incident control centres in Edinburgh, Cardiff and Belfast. The work of these centres will be co-ordinated with the national response through their TCC representatives.
If departments commission supplementary monitoring programmes for their own purposes, the results, and assessments of them, will be available to the TCC.
Role of Local, Health and Water Authorities

In England, Scotland and Wales the local authorities may wish to act as the main focal point for local public enquiries in the event of any future overseas nuclear incident. Such local authorities are being identified with the help of the local authority associations, and will be amongst the DOE contact points for the initial incident alert message. They may disseminate the initial alert information to other authorities. Similar arrangements are being planned with the water authorities. Health authorities will be informed by the appropriate Government departments.
In Northern Ireland, a Government department group, established to advise on peacetime radioactive emergencies, will act as the main focal point for public enquiries. In addition local authorities will be kept informed and will assist by dealing with enquiries at more localised levels.
Information officers in local, water and health authorities will have access to the regular information and advice bulletins issued via viewdata and teletext systems ie, CEEFAX and PRESTEL. A number with appropriate radiological expertise will also have access by electronic mail to more detailed technical information prepared by the TCC. This will assist them in dealing with public enquiries at the local level.
Local, water and health authorities may collect radiological monitoring data of their own during an overseas incident. Where such data is available it will be used to obtain more detailed analysis of the distribution of radioactivity across the UK.
Role of the National Radiological Protection Board
In the event of an overseas nuclear incident the National Radiological Protection Board (NRPB) will advise Government departments on the interpretation of radiological monitoring data. The NRPB will be represented at the TCC, and have direct access to the CDF from their headquarters at Chilton, Oxfordshire. They will also supply monitoring data which they collect to the CDF.
Role of the Meteorological Office

The Meteorological Office will use a model to estimate and forecast radionuclide concentrations in air and surface deposition activity. The model will draw on meteorological
and radiological data to predict the movement of the radioactive material released from the accident over Europe and the sea around the UK. The results will be sent to the Meteorological Office representative in the TCC to enable assessments to be made both on the likely movement of any plume of radioactive material across the UK, and on the pattern of deposition of radioactive material on the ground.
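The Meteorological Office model itself is not described in this paper. Purely as an illustration of the kind of calculation involved, the sketch below gives a textbook Gaussian plume estimate of ground-level air concentration downwind of an elevated release; all parameter names and values are assumptions for the example, and the formula is not the operational model.

    import math

    def ground_level_concentration(Q, u, sigma_y, sigma_z, y, H):
        """Textbook Gaussian plume estimate (with ground reflection) of the
        ground-level air concentration, in Bq/m3, at crosswind distance y (m)
        from the plume axis, for a continuous release rate Q (Bq/s), wind
        speed u (m/s), dispersion parameters sigma_y and sigma_z (m) and
        effective release height H (m). Not the Meteorological Office model."""
        return (Q / (math.pi * u * sigma_y * sigma_z)
                * math.exp(-y**2 / (2 * sigma_y**2))
                * math.exp(-H**2 / (2 * sigma_z**2)))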
Role of the Nuclear Industries
In the event of any future overseas nuclear incident, the Central Electricity Generating Board (CEGB), British Nuclear Fuels plc (BNF), the United Kingdom Atomic Energy Authority (UKAEA) and the South of Scotland Electricity Board (SSEB) have expressed their willingness to collect supplementary monitoring data at their various sites to assist with the response to an incident. They would also assist with the analysis of samples from elsewhere.
ACCREDITATION OF SUPPLEMENTARY DATA SUPPLIERS
An accreditation system will establish a register of supplementary data suppliers able to provide specific monitoring results to the CDF. This will enable additional information on radionuclide concentrations in the environment to be obtained.
INTEGRATION WITH OTHER CONTINGENCY PLANS
The National Response Plan will provide a response to a nuclear accident abroad. Plans exist for dealing with nuclear accidents within the UK. These are the responsibility of the Department of Energy (DEn), the Industry Department for Scotland (IDS), the Ministry of Defence (MOD) or the Department of Transport as appropriate. Arrangements for accidents in the UK are described in the booklet ‘Emergency Plans for Civil Nuclear Installations’ published by HMSO.* Data from the Phase 1 RIMNET system will provide back-up information to the relevant Government department in the event of a UK nuclear accident.

* An updated version of this document is to be published during 1989.
PROPOSALS FOR PHASE 2
Phase 2 of the RIMNET programme will increase the number of monitoring stations to about 85. Additional monitoring facilities will be provided at some sites. These will include automatic air sampling and deposition measurement. All monitoring stations will operate automatically and supply readings to the CDF.
Supplementary data suppliers will be able to enter measurements directly into the CDF. The CDF will have improved analytical and display facilities to provide electronic assessment and the automatic preparation of draft viewdata pages and press releases.
SUMMARY
Within the National Response Plan, the Phase 1 RIMNET system provides a national means for monitoring the radiological effects of any nuclear incident overseas and for informing many organisations and the public of its implications. The system will detect any overseas nuclear incident affecting the UK, even if the incident is not formally reported to the United Kingdom or there are delays in notification. Data from the RIMNET system will be available to the appropriate lead department in the event of a domestic nuclear incident.
REFERENCES

1. Department of the Environment, The National Response Plan and Radioactive Incident Monitoring Network (RIMNET). A Statement of Proposals, Her Majesty’s Stationery Office, London, 1988.
ANNEX 1: PHASE 1 RIMNET MONITORING SITES AND EQUIPMENT
The 46 RIMNET monitoring sites are shown in Figure A1.1. The sites are all meteorological observatories.
The RIMNET monitoring equipment is shown in Figure A1.2. The radiation detector, monitor and line driver are housed in a weatherproof cubicle located within the field station instrument enclosure. The detector and monitor assembly is connected by cable to the observer’s office and operates a hard copy printer.
FIGURE A1.1. PHASE 1 RIMNET MONITORING SITES AND BULLETIN REGIONS
FIGURE A1.2. RIMNET monitoring equipment
ANNEX 2: NATIONAL RESPONSE PLAN SYSTEMS
The form of National Response Plan systems is shown schematically in Figure A2.1. The component functions in the event of an overseas nuclear incident are as follows.

Central Database Facility (CDF)
The CDF is located at the DOE office at 2 Marsham Street, London. The CDF back-up computer is located at another DOE office in Lancaster.
Data obtained from the RIMNET field installations and supplementary data obtained from accredited suppliers will be collated by the CDF.
The compiled data in the CDF will be available to staff in the Technical Co-ordination Centre (TCC) and others. The data will be used as a basis for information, for assessment and for advice.
Direct downloading of data from the CDF to Telecom Gold mailboxes will take place at the end of each day to authorised contact points in specified organisations to assist them in answering local public enquiries.

Technical Co-ordination Centre (TCC)
The TCC will be located in the DOE incident control room at 2 Marsham Street, London SW1.
The TCC will be managed by HMIP and staffed by representatives of government departments and other organisations including the NRPB and the Meteorological Office. It will be set up in the event of any overseas nuclear incident that could have a radiological impact on the UK. It will co-ordinate the national response and act as the clearing house for all information and advice concerning the incident.
FIGURE A2.1 FORM OF NATIONAL RESPONSE PLAN SYSTEMS
The TCC will have access to radiation monitoring information held on the CDF. TCC members will obtain assessments on relevant radiological matters from the Government departments and agencies that they represent. These will provide the basis for the information and advice bulletins to be issued by the IC.
The Information Centre (IC)
The IC at 2 Marsham Street will be operated by the DOE Press Office. It will ensure that briefing material produced by the TCC is in a form suitable for public dissemination and is appropriately distributed. The means of information dissemination will be press release, teletext (CEEFAX), viewdata (PRESTEL), and Telecom Gold electronic mail.
Where a single government department has direct statutory responsibility for a specific matter, that department will provide its own media and public briefing keeping the TCC informed.
THE ROLE OF MAFF FOLLOWING A NUCLEAR ACCIDENT
MICHAEL G SEGAL
Food Science Division
Ministry of Agriculture, Fisheries and Food
Ergon House, c/o Nobel House, 17 Smith Square, London SW1P 3HX
ABSTRACT
MAFF has an important role in any incident involving release of radioactivity into the environment. The Ministry monitors both food and pasture for grazing animals, and can act to protect the public from exposure to unacceptable levels of contamination in food. Detailed emergency plans exist for both UK and overseas accidents. Following the Chernobyl accident in 1986, restrictions were enforced on the movement of sheep in parts of Cumbria, Wales and Scotland, and some are still in force. This work is backed up by an extensive research programme into the behaviour of radionuclides in the environment and the food-chain, and detailed analyses of actual diets.
INTRODUCTION: MAFF’S RESPONSIBILITIES

In the event of a major nuclear accident, involving the release of significant quantities of radioactive isotopes into the environment, members of the public may receive radiation doses via a number of different routes. These can be broadly divided into three categories: direct irradiation, inhalation and ingestion.
Responsibility for protecting the population from the effects of radiation in these circumstances is divided between Government Departments. In keeping with its role in consumer protection with regard to food, the Ministry of Agriculture, Fisheries and Food takes responsibility for the third of these routes. As one of the Authorising Departments for the disposal of radioactive wastes under the Radioactive Substances Act 1960, MAFF authorises and monitors the effects of releases of radioactive effluents from power stations and other nuclear sites in routine operation. In the event of an accident at a UK nuclear site, MAFF has detailed emergency arrangements for dealing with any effects on food, and the Ministry also has a fundamental role to play in the event of any overseas accident which might result in contamination reaching the UK. In either case, there are three major objectives:
1. Protection of the public from unacceptable exposure to contaminated foodstuffs
2. Ensuring alternative food supplies are available where necessary
3. Minimising the effects on the agriculture, fisheries and food industries.
In such an event, MAFF would monitor as necessary environmental samples, foodstuffs or drinks in England, and carry out detailed assessments of the potential effects of radioactive contamination on consumers. Within a short period of time, we can reassure the public as to the safety of their food, if appropriate, or take what action is appropriate to limit the radiation dose to the public from ingestion. This is done by measures such as the banning of the sale of milk, or controlling the movement of livestock, in specified areas. Agriculture Ministers have powers under the Food and Environment Protection Act 1985 to make emergency orders to prohibit the movement or supply of produce
which has been, or may be, rendered unsuitable for human consumption. Since the Chernobyl accident, the European Commission has also enacted regulations to control trade in food contaminated with radioactivity as the result of an accident, and MAFF also has responsibility for implementing these. (Local Authority Environmental Health Officers have responsibilities and powers regarding food once it has entered the retail trade: the role of MAFF is to prevent food contaminated to an unacceptable level from reaching that stage).
Exceptions

There are certain exceptions to the blanket role of MAFF described above, both geographical and technical. Geographically, the position in the rest of the UK is somewhat different: the Northern Ireland and Scottish Offices take full responsibility for these matters in their regions, whereas the Welsh Office takes administrative responsibility in Wales, advised by MAFF technical staff. The technical exceptions are (a) tap-water, which is one of the general responsibilities of the Department of the Environment (this does not include, for example, bottled mineral water), and (b) imported foods; the Department of Health deals with the control of these, because of the responsibilities of the Port Health Authorities.
Relevant MAFF Structure

The Ministry’s work in relation to radioactivity in food is broadly divided between food and fisheries. Technical expertise in these areas is centred in the Food Safety (Radiation) Unit, part of the Food Science Division, and the Directorate of Fisheries Research. The latter have an important role in matters such as regulation of routine discharges from nuclear power stations, but a nuclear accident is much more likely to have a significant impact on the public via terrestrial pathways than aquatic ones. The enormous dilution that occurs in the
sea greatly reduces the effect of any contamination there, compared with its effect on land.
As well as these central scientific groups, and their administrative counterparts, a vital role in any emergency will be played by the MAFF Regions. There are 5 Regional Offices in England, and these provide the necessary staff to carry out sampling and monitoring of food and agricultural produce, and for administration and enforcement of any controls necessary. Analysis of these samples is generally carried out at the Ministry’s Central Veterinary Laboratories (CVL) in Weybridge.
RADIOACTIVITY IN THE FOOD-CHAIN

The Route to Man

Except in the most severe accidents, and close to the event, the effects of direct radiation on the population will be very small. More significant health effects may arise from contamination by radionuclides released into the atmosphere and transported by normal atmospheric processes, possibly over vast distances (as occurred after the Chernobyl accident). Potential hazards due to deposition of these nuclides on the skin, or inhalation, must be considered, and preventive measures taken if necessary. However, the effects of contamination in the food-chain, resulting in potential risks to the consumer, are likely to be the most widespread and persistent, and affect the largest number of people.
The routes by which such radionuclides can be incorporated into the food-chain can be divided into two broad categories, direct and indirect. These are illustrated in Figure 1.
Direct routes: An example of a direct route is when a cloud containing radioactive gas or minute particles of radioactive material deposits contamination on an edible crop, in particular, large-leafed vegetables such as cabbage, spinach or lettuce. Consumption of the crop then provides a direct route for radioactivity into man. This is
an immediate effect, and so the monitoring of items like mature vegetables, ready for harvesting, is a high priority.
Indirect routes: An indirect route, involving the same crop, would be the result of deposition of contamination on the ground; radionuclides can then migrate through the soil and may be incorporated into the crop by gradual uptake through the roots, in the same way as normal nutrients. The appearance of radioactivity in the crop is then a much slower process, which may occur over one or more growing seasons, and can affect crops for years after the accident. Consumption of contaminated fodder can result in activity in meat as well: contamination of sheepmeat was significant in some areas of the UK after the Chernobyl accident, and this is discussed in detail below. These indirect routes are thus of less immediate concern, in general, but require consideration over much longer time-scales.
Figure 1. Pathways for airborne contamination to man via a broadleafed vegetable.
The pasture-cow-milk pathway: Where the plants concerned are grass or other pasture for livestock, either of these pathways can lead to the contamination of the animals, resulting in exposure to man via meat or milk. In general, the transfer of activity into milk is one of the routes of greatest concern, for a number of reasons. (i) Many radionuclides are rapidly and efficiently transferred into the milk, and cattle graze over wide areas, and so high concentrations on grass can result in high levels of contamination in milk very quickly.
(ii) Iodine–131, a volatile radionuclide which is one of the major components of released activity in most accident scenarios, may also be taken up by cows by direct inhalation from the plume.
(iii) The frequent milking of cows, and the rapid distribution systems available for milk, provide an exceptionally fast route to the consumer.
(iv) Milk is consumed in large quantities by most sectors of the population, in particular by young children.
(v) The latter are particularly at risk if exposed to I–131, because they are generally more vulnerable, and also because iodine is concentrated in the thyroid, which is proportionally larger in children.
Figure 2 shows how the concentrations of I–131 and Cs–137, another important nuclide in nuclear accidents, appear very rapidly in milk following a single release into the environment. The iodine concentration then falls away relatively quickly, because of its short physical half life (8 days), whereas the concentration of the longer-lived caesium
isotope is governed by its behaviour in the environment; the curve shown here is theoretical, and the shape depends on the nature of the soil, etc (see below).
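As a simple worked illustration of the physical-decay component of the iodine curve in Figure 2 (the environmental and biological processes that also shape the curves are ignored here), the 8-day half-life means that the I–131 activity in a given sample falls by a factor of two every 8 days:

    import math

    def decayed_activity(initial_bq_per_litre, days, half_life_days=8.0):
        """Activity remaining after physical decay only; the environmental and
        biological processes that also shape Figure 2 are not modelled."""
        return initial_bq_per_litre * math.exp(-math.log(2) * days / half_life_days)

    # A hypothetical 1000 Bq/l of I-131 in milk falls to about 250 Bq/l after
    # 16 days (two half-lives) from physical decay alone.
    print(round(decayed_activity(1000.0, 16)))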
As a result of these considerations, monitoring and protection of the grass-cow-milk pathway is one of the most immediate and urgent requirements following any release of radioactivity into the environment. Such action was taken after the Windscale fire in 1957, when 3,000,000 litres of locally produced milk were bought in and disposed of, resulting in a great reduction in the collective dose to the public [1].
Figure 2. The concentrations of Cs–137 and I–131 in milk following a spike deposition to ground (theoretical)
The Effect of Contaminated Food on the Consumer

In order to define what is an unacceptable level of contamination in a foodstuff, it is necessary to understand the behaviour of the relevant radionuclides in the human body. This has been the subject of extremely detailed study across the world. As a result, models have been established which are widely used by the international radiation protection community; these models are drawn up by
committees of the International Commission on Radiological Protection, and are published in the Annals of the ICRP. Models are kept under constant review and amended as appropriate as new scientific information becomes available. ICRP Publication 30 [2] and its many supplements describe for each nuclide how it is distributed through the body after ingestion or inhalation, and how the resulting radiation dose is calculated.
The National Radiological Protection Board (NRPB), in the UK, has calculated the consequences per unit intake of each of some 400 radionuclides, in terms of the dose to each of ten selected organs and the “committed effective dose equivalent” (CEDE) to the whole body. This is done on the basis of an extremely detailed, time-dependent model, and calculates the dose that can arise over a period of fifty years following intake; it may depend on the physical or chemical form of the nuclide. For example, Cs–137 has a CEDE of 7.7×10⁻⁹ Sv/Bq for an adult, with the dose to each organ being similar. Pu–239, on the other hand, is an α-emitter with a much longer half-life and is greatly concentrated on bone surfaces: for some forms of this isotope, the CEDE (whole body) is 1.1×10⁻⁴ Sv/Bq, with a dose to the bone surface of 2.1×10⁻³ Sv/Bq [3].
If we know the amount and nature of contamination on any particular foodstuff, we can then combine the Ministry’s information on the dietary habits of the population with the recommendations of the NRPB to calculate the likely radiation dose to members of the public. Both average and extreme cases can easily be derived.
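A minimal worked sketch of that calculation, using the adult Cs–137 dose coefficient quoted above and purely hypothetical contamination and consumption figures, is as follows.

    # Illustrative only: the contamination level and annual consumption are
    # hypothetical; the dose per unit intake for Cs-137 (7.7e-9 Sv/Bq, adult)
    # is the figure quoted in the text.

    def committed_dose_sv(conc_bq_per_kg, annual_consumption_kg, dose_coeff_sv_per_bq):
        """Committed effective dose equivalent from a year's consumption of a
        single foodstuff: intake (Bq) multiplied by the dose per unit intake."""
        intake_bq = conc_bq_per_kg * annual_consumption_kg
        return intake_bq * dose_coeff_sv_per_bq

    # e.g. 1000 Bq/kg of Cs-137 in a food eaten at 50 kg per year:
    dose = committed_dose_sv(1000.0, 50.0, 7.7e-9)
    print(f"about {dose * 1000:.2f} mSv committed")   # about 0.39 mSv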
Computer Models of the Food-chain

In order to determine what foodstuffs may be at risk in any given circumstances, and to take effective action in monitoring them, MAFF makes extensive use of computer models for the movement of radionuclides through the natural environment and the food-chain.
Such models are used routinely in the determination of authorised limits for discharges from nuclear sites, but are also of great importance in the event of an accidental release. The fundamental role of these models is to use whatever measurements or other data are available at a particular time to calculate the resulting levels of contamination in food, and hence the potential dose following ingestion.
Practical application of these models in the event of a nuclear accident would proceed roughly as follows. From measurements of the radionuclide concentrations in a plume and knowledge of the weather, predictions would be made of the geographical distribution and magnitude of contamination on vegetables or pasture beneath it. If these predictions showed any risk of significant contamination, we should instigate an urgent sampling and monitoring programme; if there was any indication that the levels in food might exceed the permitted limits, then we should issue a precautionary ban on sale or movement, in the affected area. In the case of mature crops and milk, it is essential to act as speedily as possible, for the reasons outlined above, and measurements of airborne concentrations would be used to decide whether a ban on distribution was necessary, and over what area. Preliminary calculations would be refined by the input of further information, such as actual, measured, contamination levels on pasture. Thus the use of these models enables MAFF to ensure that the appropriate action is taken well in advance of analysis of the contaminated milk itself.
The SPADE Model. MAFF uses a suite of computer codes called SPADE (Soil-Plant-Animal-Dynamic-Evaluation), created for the Ministry by Associated Nuclear Services Ltd. These codes have been described in detail elsewhere [4, 5], and only the general principles will be given here. There are three separate models, for soil, plant and animal respectively, and each model consists of a number of compartments, linked together by a set of time-dependent transfer coefficients. The models are linked by routes such as root uptake from soil solution and ingestion of plants by animals. The compartment structure is illustrated for the plant model in Figure 3.

Figure 3. The SPADE model compartment structure: the soil-plant model.
There is also an atmospheric dispersion model, which can be used to calculate the inputs for the soil and plant models (by deposition) and the animal model (by inhalation). In the absence of any other information, then, initial judgements can be made on the basis of no more information than the estimated magnitude of any release and the meteorological conditions. Data for one or more radionuclides can be input in this form, or in many other ways as more data become available, for example as measured deposition levels on soil or grass. The code can calculate the distribution of each nuclide throughout every compartment as a function of time.
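The general form of such a compartment model (not the SPADE code itself, whose compartments and coefficients are documented in [4, 5]) can be illustrated with a minimal two-compartment, first-order transfer sketch; the rate constants and the initial deposit below are hypothetical.

    # Illustrative two-compartment sketch only; SPADE uses many linked
    # compartments with time-dependent transfer coefficients [4, 5].

    def step(soil_bq, plant_bq, k_soil_to_plant, k_plant_loss, dt_days):
        """Advance a first-order soil-to-plant transfer model by one time step:
        activity moves from soil to plant at rate k_soil_to_plant (per day) and
        is lost from the plant (decay, weathering, harvest) at k_plant_loss."""
        transfer = k_soil_to_plant * soil_bq * dt_days
        loss = k_plant_loss * plant_bq * dt_days
        return soil_bq - transfer, plant_bq + transfer - loss

    soil, plant = 1.0e6, 0.0            # hypothetical 1 MBq deposited to soil
    for _ in range(30):                 # thirty daily steps
        soil, plant = step(soil, plant, 1e-3, 5e-2, 1.0)
    print(round(soil), round(plant))    # activity in each compartment after 30 days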
Included in the suite of codes is detailed information on agricultural practices, consumption rates for different foods, and the dose consequences per unit intake, as described above. The codes can therefore be used either to calculate the levels of contamination at any time or the radiation doses that could arise in the absence of any restrictions.
Source Data: These computer codes are dependent on an extensive library of scientific data on the behaviour of the relevant nuclides. For each nuclide, transfer rates between the compartments are required, as well as those from soil to every significant crop or pasture plant. These parameters need to be correct for a variety of different conditions, such as the season of the year or the soil type in the area of interest. For animals, the rates of consumption of these plants are required, and the transfer coefficient between plant and animal. The models used by MAFF are under a constant programme of revision and updating, to ensure that the most accurate and up-to-date information is included. This is backed up by an extensive research programme to determine parameters where the data are non-existent or unsatisfactory. Examples from a wide variety of projects include many
studies on the behaviour of key nuclides in soils and plants (see, e.g., [6]), and several studies of environmental radioactivity as part of the UK Atomic Energy Authority's Radiological Protection Research Programme [7]. The Ministry's post-Chernobyl research programme has been published in some detail [8].
Safety Margins
It is important to note that all the models used for these calculations are extremely conservative, to ensure a wide margin of safety. Immediate action taken in the interest of public protection will always err on the side of caution. Detailed measurements of the true levels of contamination in the food under consideration will be used, when they become available, as the basis for longer-term decisions.
One way to ensure that the models used to calculate the radionuclide intake of the population are conservative is to measure what people actually eat. MAFF carries out duplicate diet studies, in which volunteers are asked to set aside a complete and accurate duplicate of everything they eat over a period of a week, and these duplicates are then analysed for radioactive species. This has been carried out with selected populations in the vicinity of two nuclear sites. As expected, these studies showed that the actual consumption of radionuclides by the population was very much lower than calculated by the models. Similar studies are planned for other sites. Duplicate diet studies were also carried out in various parts of the country after the Chernobyl accident (see below) [9, 10].
INTERVENTION LEVELS AND CONCENTRATION LIMITS
The above section describes how scientists at MAFF can calculate the potential intake of nuclides, and assess the radiation dose that could follow, as a result of eating food contaminated by a nuclear accident. One of the most difficult decisions, and one which
is the subject of fierce debate both within the UK and internationally, is the question of what level of contamination is tolerable before the foodstuff is restricted. There are at least two, mutually inconsistent approaches to the problem from a technical point of view. Since Chernobyl, this has also become an extremely political issue, on which governments have to make decisions based on criteria other than technical. As a result, there is, as yet, no international consensus, and various international bodies are still in the process of formulating their policies.
Emergency Reference Levels and Derived Levels
Prior to the Chernobyl accident, there was general agreement on the basic principles underlying any form of intervention to protect the public following a nuclear accident. The basic principles were set out in two ICRP Publications, Numbers 26 and 40 [11, 12], and application of these principles has been outlined in an IAEA Safety Series booklet [13]. For each form of intervention, such as evacuation, administration of stable iodine, or food restrictions, an Emergency Reference Level (ERL) was defined in terms of the radiation dose to any individual member of the population affected. This would have the effect of a dose limit, in that action would be taken to ensure that no member of the population received more than the set dose. It was different for different forms of intervention, because of the differences in financial and social cost. Recognising that there would have to be flexibility to meet varying circumstances, a two-tier approach was used: a lower level, below which countermeasures are unlikely to be justified, and an upper level, above which action would almost certainly be necessary. If the calculated dose were to fall in the intermediate range, then the decision would be taken in the light of the prevailing circumstances.
For the control of foodstuffs and water, the NRPB have issued advice [14], and the IAEA recommended that the lower and upper levels should be 5 and 50 mSv respectively, based on the dose
resulting from one year’s consumption [13]. Using the methods outlined above, it is possible to calculate the levels of contamination in any given foodstuff that would give rise to doses equivalent to these ERLs. The results of these calculations are the Derived Emergency Reference Levels (DERLs), or Derived Intervention Levels (DILs), which depend on only two factors: the Committed Effective Dose Equivalent per unit intake of the nuclide(s) concerned, and the annual consumption of the foodstuff. In 1986 the IAEA published some recommended values for DILs for various categories of food [15], and the NRPB also produced advice in the UK context [16].
It is important to note that these levels are calculated from the radiation dose that would be received from eating a whole year's intake of the foodstuff in question, at the level of contamination specified. This adds a very considerable further margin of safety to those discussed above, since the limits are applied in practice so as to ensure that no items of food exceed them, whereas the scientific basis of the assessments implies that it would in principle be acceptable to consume a significant fraction of one's diet for a year at these levels.
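The arithmetic behind a derived level can be sketched directly from the two factors just mentioned; the dose coefficient and annual consumption used below are illustrative round numbers, not the values adopted in [15] or [16].

```python
# Derived level = ERL / (committed dose per unit intake x annual consumption).
# The dose coefficient and consumption figure below are illustrative assumptions.

def derived_level_bq_per_kg(erl_sv: float,
                            dose_per_bq_sv: float,
                            annual_consumption_kg: float) -> float:
    return erl_sv / (dose_per_bq_sv * annual_consumption_kg)

erl_lower_sv = 5e-3         # lower ERL of 5 mSv from one year's consumption
cs137_dose_per_bq = 1.3e-8  # Sv per Bq ingested (order-of-magnitude adult value for Cs-137)
milk_per_year_kg = 200.0    # assumed annual milk consumption

derl = derived_level_bq_per_kg(erl_lower_sv, cs137_dose_per_bq, milk_per_year_kg)
print(f"Illustrative DERL for Cs-137 in milk: {derl:.0f} Bq/kg")
# About 1900 Bq/kg with these inputs; published DERLs differ because they use
# age-specific dose coefficients and measured consumption data.
```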
Justification and Optimisation
In recent years, the underlying principle on which these ERLs are based has been called into question (see, for example, [17]). When considering any form of intervention, it is argued, there are two important factors to take into account: the total cost of the measure (which can be financial, social, or even a health detriment in the case of severe measures such as evacuation), and the amount of dose that will be saved by applying the measure. In many cases, of course, this is identical to the dose that would be received in the absence of intervention, but this is not always true. While every effort should be made to ensure that the total dose to any individual does not exceed acceptable levels, the decision on each form of intervention should be independent, based on the dose savings it
will achieve. We should consider these factors before applying any food restrictions, to ensure that they are both justified and optimised.
Justification. A definition of justification that is widely quoted comes from ICRP Publication 40 [12]: “The risk…should be limited by introducing countermeasures which achieve a positive net benefit to the individuals involved.” In simple terms, this says no more than that the measure should do more good than harm, which appears obvious. However, there are two questions which this statement leaves unanswered: how are we to calculate the “net benefit”, and who are the “individuals involved”?
Logically, the only workable definition of the people involved must include the entire population of the affected country. This is not clear from the words used, but everyone is affected, not only the potential consumer of the foodstuff, but also the farmer whose produce may be restricted and the tax-payer paying for compensation, disposal and replacement supplies. There may even be international effects, if restriction of food imports or exports is under consideration. Calculation of the net benefit is more difficult, since it requires the comparison of health risks, on the one hand, with social disruption and financial costs on the other. Attempts can be made to do this quantitatively, using cost-benefit analyses, which are carried out by converting every detriment and benefit in the equation to a money equivalent. So they always depend on basic value judgements, such as the money equivalent of a man-Sievert, the cost of social disruption, etc. It is almost impossible to get general agreement on issues such as these.
Optimisation. Clearly, any restriction imposed should be applied so as to achieve the maximum “net benefit”, and so this also depends on the sort of calculation described above. Attempts have been made to carry out this sort of exercise in varying amounts of detail. Gjørup [17] has carried out simple calculations, apparently based only on the cost of the foodstuff under consideration, the dose per unit intake of the radionuclide present, and the cost attributed per unit dose. At US prices, these calculations yield optimum intervention levels which range from 4,000 Bq/kg for milk to 80,000 Bq/kg for beef, if the contamination is Cs–137 [17].
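The structure of such a calculation can be sketched as follows; the food prices, the dose coefficient and the monetary value assigned per unit dose are illustrative assumptions rather than Gjørup's actual inputs, although with these particular numbers the results happen to fall close to the values quoted above.

```python
# Optimum level at which the value of the food discarded per kilogram equals the
# monetary value of the dose that eating that kilogram would deliver.
# All numerical inputs below are assumptions for illustration.

def optimum_level_bq_per_kg(price_per_kg_usd: float,
                            dose_per_bq_sv: float,
                            usd_per_sievert: float) -> float:
    return price_per_kg_usd / (dose_per_bq_sv * usd_per_sievert)

cs137_dose_per_bq = 1.3e-8  # Sv per Bq ingested (illustrative adult value)
usd_per_sievert = 1.0e4     # assumed monetary value attributed to unit dose

for food, price_usd_per_kg in [("milk", 0.5), ("beef", 10.0)]:  # assumed US prices
    level = optimum_level_bq_per_kg(price_usd_per_kg, cs137_dose_per_bq, usd_per_sievert)
    print(f"{food}: optimum intervention level about {level:,.0f} Bq/kg")
```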
Much more detailed calculations along similar lines were carried out by Dionian and Simmonds, using hypothetical accident scenarios and assessing the dose consequences and the costs of reducing them by food restrictions [18]. Rather than derive the optimum values for the level of contamination in the foodstuffs, these authors used this analysis to calculate the dose equivalents, and hence the effective ERLs. They found that the optimum ERL for food restrictions could lie anywhere in the range 0.1 to 50 mSv, based on the effective dose equivalent received by an individual from a year’s intake of food. The factors considered were the nature of the produce restricted, the site, size and direction of the release, the weather conditions and the cost assigned to unit collective dose. However, they concluded that “5 mSv would represent the optimum dose criterion in a substantial number of cases” [18].
More recently, other NRPB authors have reconsidered this study in the light of experience of the Chernobyl accident and the increased public awareness of the problems since then, and also the current legal position with regard to the EC Regulations. They suggest that, if a single ERL for food restrictions is still to apply, then a more appropriate value might be 1 mSv [19].
Limits Set by EC Regulation
In practical terms, however, the question of ERLs for foodstuff restrictions is now rather an academic one. The only quantity that can be measured is the level of contamination in the foodstuff, not an assessed dose. The question is how to get general agreement on what the acceptable levels of contamination are going to be.
At the time of Chernobyl, the limits set in the UK were based on the ERL concept, and so were related directly to limiting the radiation dose to the consumer to previously accepted low levels. Other countries across the world used other limits: some were also calculated on a realistic scientific basis, and there were good reasons for some spread of results as the underlying assumptions varied slightly from country to country. Some other countries based their limits for restricting food imports on a perception of the risk that might be regarded as unreasonable and exaggerated.
It is almost universally agreed that the wide variety of limits not only caused considerable difficulty in international trade, but was also responsible for confusion and uncertainty in the minds of the public. Several international organisations have therefore undertaken to establish limits that can be agreed across the world, or at least make recommendations to that effect. The Commission of the European Communities moved relatively quickly to establish a Regulation covering the EC [20], and the Codex Alimentarius Commission, which is a joint group of the UN Food and Agriculture Organisation and the World Health Organisation, is currently working on a set of recommendations for food moving in international trade [21].
Council Regulation (Euratom) No. 3954/87. On 22 December 1987, the Community adopted this regulation to provide a regulatory framework for the management of food protection following any future accident [20]. This lays down maximum permitted levels for all radionuclides of
importance, in various separate categories of foodstuffs: babyfoods, dairy produce, other major foodstuffs, animal feedingstuffs and minor foodstuffs. At the time of writing, values have been established for only two of these categories, “dairy produce” and “other foodstuffs except minor foodstuffs”. These are clearly the most important categories in this context, since they cover milk and major foods; the values are shown in Table 1. These limits are not currently in force—they will only be triggered in the event of another accident.
TABLE 1 Maximum Permitted Levels of Radioactive Contamination Following a Future Nuclear Emergency, Under EC Regulation 3954/87 (Bq/l or Bq/kg)
These values were based on advice from the “Article 31 Group” of experts, set up under the Euratom Treaty, but are in some cases more stringent (by a factor of 4): the Commission have explained that this was not for reasons of public health and safety, but in the interests
of public reassurance. The Regulation provides for the established levels to be kept under regular review in the light of future scientific developments, and also to be rapidly reviewed in the light of the particular circumstances of any future incident.
The table in the Regulation shows blank spaces under the other categories, and at present the legal position is that no limits have been set for these. However, the Commission have established a series of working groups to recommend levels, on which MAFF have been represented alongside representatives of the other Member States. Values are under discussion for all the rest of the table, including a definition of “minor foodstuffs”, and it is expected that these will be formally incorporated into the Regulation soon.
In the event of a future accident, the maximum permitted radioactivity levels laid down under Regulation 3954/87 will apply to all foodstuffs and feedingstuffs placed on the UK market, regardless of origin. The Imported Food Regulation 1984 will be used to enforce these levels in respect of imports. Controls on UK produce will be operated by means of Part I of the Food and Environment Protection Act, as explained above.
EMERGENCY PLANS AND ACTIVITIES FOLLOWING AN ACCIDENT
Accident at a UK Site
Planning. As an Authorising Department for Licensed Nuclear Sites in England, MAFF is involved in regulating and monitoring all of their activities which have a potential impact on the food-chain. In the context of emergency planning, the MAFF Regional Offices are represented on the Emergency Planning Consultative Committees of nuclear power station sites, and the MAFF Inspectors are consulted
by the operators in drawing up their emergency plans. Each MAFF Region also has its own emergency plans, which include radiological emergencies; these are drawn up following guidance laid down by Headquarters divisions.
The Government has designated the Department of Energy as the lead department in the event of an accident at a civil site, and there is close liaison between the relevant Ministries, the Nuclear Installations Inspectorate and the operators, to ensure full cooperation in the drawing up and implementation of emergency plans. MAFF Regional and Headquarters staff regularly take part in the emergency exercises that operators are required to carry out to ensure the adequacy of their plans and procedures.
Emergency Response. In the event of an accident at a UK site, MAFF would be heavily involved both locally and nationally. Regional staff and scientists from the Food Safety (Radiation) Unit would attend the Off-Site Centre, where all local action in response to the emergency is coordinated. The role of the former is to ensure the implementation of any precautionary measures that may be required, from environmental sampling to food restrictions, and to work closely with the Government Technical Advisor in achieving consistency and agreement between all relevant agencies. The role of the scientific specialists is to advise Regional staff, to liaise with the operators’ health physics staff on questions relating to environmental monitoring, and to ensure that all the necessary information is transmitted to and from London. MAFF would also be represented at the Nuclear Emergencies Briefing Room, in Whitehall, where the response of the various Ministries involved is coordinated at a national level.
Environmental monitoring. An essential part of the Ministry’s plans is the ability to react rapidly and flexibly to any accident involving a release of radioactivity into the environment. Computer
models are constantly available, so that assessment of the impact on the food-chain can be made from any available data, as described above. An integral part of our plans is the establishment of mechanisms whereby good quality, relevant input data can be obtained as quickly as possible. In the first few hours after an alert, this may consist of the operators’ estimates of the magnitude and nature of the release, and possibly quick measurements of simple instrument readings such as dose rates. Other essential information such as wind direction and weather conditions should also be available. This must be followed up as soon as possible by environmental monitoring, including the taking of samples of grass and vegetation for analysis. Responsibility for taking and analysing samples is shared between the operators and the Ministry, depending on the distance from the site. The emergency plans of each include arrangements with nearby laboratories for carrying out the necessary analyses, and plans of appropriate means of transport. Some samples would also be sent to the Ministry’s Central Veterinary Laboratories for analysis, using air transport if necessary. The results of these analyses should be available within a few hours, so that staff in London can carry out the modelling necessary to determine what countermeasures are required, if any. A database of all analytical results is maintained, so that trends can be identified over both short and long term, and to enable further assessment to be carried out subsequently.
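A minimal sketch of the kind of results store and trend check described here is given below; the record fields, the window length and the flagging rule are assumptions made for illustration only.

```python
from dataclasses import dataclass
from datetime import date
from statistics import mean
from typing import List

# Minimal sketch of a monitoring-results store with a short-term trend check.
# The record fields, window length and flagging rule are assumptions.

@dataclass
class Sample:
    taken: date
    location: str
    foodstuff: str
    nuclide: str
    bq_per_kg: float

def rising_trend(samples: List[Sample], window: int = 5) -> bool:
    """Flag if the mean of the latest `window` results exceeds that of the previous window."""
    values = [s.bq_per_kg for s in sorted(samples, key=lambda s: s.taken)]
    if len(values) < 2 * window:
        return False
    return mean(values[-window:]) > mean(values[-2 * window:-window])

samples = [Sample(date(1989, 6, d), "Cumbria", "milk", "Cs-137", 10.0 + d) for d in range(1, 11)]
print(rising_trend(samples))  # True: the later readings are higher than the earlier ones
```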
Implementation of food restrictions. In the event that food restrictions are deemed necessary, an order under the relevant Act can be drawn up and signed very quickly by a Minister. Officials would act in the light of any available information, and it is policy to err on the side of safety in the event of any uncertainty. Thus any indication of a potential breach of the limits would be sufficient to trigger a precautionary ban, and the geographical extent would also include considerable safety margins. Results of environmental monitoring, and subsequently analyses of the actual
milk, vegetables, etc, would be used to refine the initial calculations. The nature and extent of any restriction orders would then be modified as appropriate.
Accident at an Overseas Site
Most of the above section applies equally well to the situation following an accident overseas. MAFF's responsibilities remain the same in terms of monitoring the food-chain to protect the consumer, and the mechanisms for sampling, analysis and assessment would not differ. From the point of view of the Ministry's scientific work, the biggest difference is that we might get several days' notice before any contamination reached the British Isles, rather than having to act immediately. Organisationally, the major difference is that the lead department in this case would be the Department of the Environment (DOE), and MAFF has responsibilities under the National Response Plan (NRP) [22].
MAFF and the National Response Plan. A statement of proposals, outlining the NRP and the RIMNET system, was published by the DOE in 1988 [22], and a further explanatory booklet is currently in preparation to describe the current position [23]. Under this plan, each Government department will continue to discharge its statutory duties in the normal way, but all scientific information relevant to the incident will be relayed to the Technical Coordination Centre, whose role is to provide a fully coordinated Government response, and to ensure that all advice and information provided to the public are consistent. One of MAFF’s responsibilities will be to provide a representative at the TCC, and to ensure that all relevant information relating to foodstuffs is available there.
The RIMNET system itself is an array of gamma-ray dose-rate monitors throughout the United Kingdom, which will detect any rise in radiation levels caused by the passage of a cloud of contamination. The readings from these detectors are stored on the Central Database Facility (CDF) computer at the DOE. If an alert is triggered, then monitoring programmes
will be required, including those concerned with the food-chain. An important part of the NRP is to ensure that monitoring carried out by all agencies is collated and included in the assessments, and then made available to interested parties, such as Local Authorities. This is done by entering it onto the CDF.
Organisations wishing to provide such data will have been accredited in advance, to ensure confidence in the results provided. As well as carrying out our own monitoring and analysis programme, MAFF will receive relevant environmental and foodstuff monitoring results from a variety of other sources, which will provide further data on which to base the necessary assessments. All results of foodstuff monitoring will be sent first to MAFF, where they will be checked before being entered into both our own database and the CDF.
In practice, it is recognised that no amount of planning can foresee the details of any future accident. All national and departmental plans are based on the need for flexibility, to ensure the most appropriate response in any circumstances. An example of how MAFF can carry out its responsibilities in a real radiological emergency is given by reference to the Ministry's response following the Chernobyl accident.
THE RESPONSE OF MAFF TO THE CHERNOBYL ACCIDENT
Brief Summary of Events
The events following the Chernobyl reactor accident in 1986 have been very well documented, and the response of MAFF has been described in considerable detail elsewhere, including a chronology of all the Ministry's actions [8]. A brief summary of the important steps will be given here, followed by a discussion of recent findings and the current situation.
Immediate response. Monitoring of milk for radioactive contamination was initiated as soon as news of the accident was
confirmed, and well before the plume actually reached Britain. As soon as it did, monitoring of vegetables was also undertaken. Over a period of a few days it became clear that no restrictions were necessary on either: I–131 levels in the milk peaked at less than 20% of the DERL, and the Cs peak was also a small fraction of the DERL, while the contamination of leafy vegetables remained below 1% of the DERL. Nevertheless, routine monitoring of milk, grass and other animal forage, and a variety of foodstuffs has continued across the country ever since the accident, with particular emphasis on those areas where the levels of contamination were highest. Results of this programme, covering thousands of samples from England and Wales, have been published [24,25].
Duplicate diet study. In the early summer of 1986, duplicate diet studies were carried out on adults and children in three separate areas of the country, Cumbria, the South-west of England and London. These represented rural areas of high and low deposition and an urban area where the food supply was likely to be derived from a more diverse range of sources. Studies were based on one week’s consumption, and calculations made on the very pessimistic assumption that levels would remain equally high for a whole year. The overall average committed effective dose equivalent due to isotopes of caesium in the diet, for a year, was less than 18 µSv. The highest values were found in Cumbria, but even here the highest value found was below 100 µSv, a factor of fifty less than the ERL of 5 mSv [9, 10].
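The scaling used in these assessments can be illustrated in a few lines; the weekly intake figure below is invented, and the dose coefficient is an illustrative adult value for radiocaesium.

```python
# Committed effective dose from a year's diet at the measured weekly intake,
# pessimistically assuming the sampled week is typical of the whole year.
# The weekly radiocaesium intake below is an invented illustrative figure.

cs_dose_per_bq = 1.3e-8   # Sv per Bq ingested (illustrative adult value for Cs-137)
weekly_intake_bq = 25.0   # Bq of radiocaesium in one week's duplicate diet (assumed)

annual_dose_sv = weekly_intake_bq * 52 * cs_dose_per_bq
print(f"Committed effective dose: {annual_dose_sv * 1e6:.0f} microsievert per year")
# About 17 uSv with these inputs, i.e. the same order as the average reported in
# the text and a small fraction of the 5 mSv ERL.
```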
Restriction of sheepmeat. However, a longer-term programme of testing the meat of grazing animals was necessary, for the reasons outlined above. This showed contrasting effects in different parts of the country. In lowland areas, the levels of caesium in lambs peaked early and dropped quite rapidly, but the levels in lambs grazing on peat-based soils in upland areas remained high. These levels were assessed in the light of international discussions which were under way at that time: the European Commission called together the “Article 31 Group” to consider what levels of contamination
should be considered acceptable, and that group recommended an interim figure of 1000 Bq/kg. No limit was set by EC regulation for foodstuffs produced within the Community, but it was decided in the UK to use this figure as the limit of acceptability. Movement and slaughter of sheep within certain well-defined areas of Wales, Cumbria and Scotland were prohibited. These controls were imposed before those lambs that had fattened on contaminated pasture were sent to market. Animals that had put on most of their body weight before the accident did go to slaughter in the intervening period, but these were not significantly contaminated. (It should be noted that the “Article 31 Group” subsequently revised their recommendation to 5000 Bq/kg; this figure was reduced to 1,250 Bq/kg in EC Regulation 3954/87 (Table 1), for reasons of public reassurance, as explained above).
This was followed by an extensive programme of analysis of meat samples from several thousand sheep taken from the affected areas, which resulted in a contraction of the boundaries of the restricted areas, allowing most holdings to be released from restriction after about three months.
Live Monitoring and “Mark and Release”
In the period immediately following the Chernobyl accident, the only accepted way to determine the level of contamination in sheepmeat was to slaughter the animals, and submit samples of the meat to the laboratory for analysis. Within about three months, the Ministry had developed a system of live-monitoring the animals, using portable gamma-ray detection equipment [26, 27]. This has two major advantages: it greatly increases the number of measurements that can be made, and it allows the movement and subsequent re-monitoring of animals that exceed the limit.
Thus sheep within the restricted area can be monitored before they are allowed to move out to other pastures, or for slaughter. Animals that pass the test are ear tagged, and can be slaughtered; those that fail can also be moved, either for breeding purposes or to fatten on less contaminated pasture. Flocks of sheep are colour marked with indelible paint to indicate their status. The level of caesium in a lamb feeding on clean pasture drops quite rapidly: this element is mobile within the body, and so the activity is gradually excreted. Remonitoring of animals a few weeks after movement out of the restricted areas consistently shows that the levels of contamination are well below the limit, and they can then be released from slaughter controls.
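The fall-off on clean pasture is roughly exponential, and a simple sketch along the following lines shows why re-monitoring after a few weeks succeeds; the initial level and the effective half-life are assumed round numbers, not measured values from the monitoring programme.

```python
import math

# Rough exponential model of radiocaesium loss from a lamb moved to clean pasture.
# The initial level and the effective half-life are assumed round numbers.

initial_bq_per_kg = 1500.0  # above the 1000 Bq/kg working limit (assumed)
half_life_days = 21.0       # effective half-life of a few weeks (assumed)

def level_after(days: float) -> float:
    return initial_bq_per_kg * math.exp(-math.log(2) * days / half_life_days)

for day in (0, 14, 28, 42):
    print(f"day {day:2d}: {level_after(day):6.0f} Bq/kg")
# With these assumptions the level falls well below 1000 Bq/kg within a few
# weeks, consistent with the re-monitoring experience described in the text.
```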
TABLE 2 Summary of Live-monitoring Results for Sheep Moving out of the Cumbria Restricted Area under the “Mark and Release” Scheme.
Table 2 shows a summary of the results, up to December 1988, of live-monitoring of sheep before movement out of the Cumbria restricted area. Much more detailed analysis of these results has been published elsewhere [25]. This Table gives an idea of the scale of the operation, involving the monitoring of some 200,000 individual animals in this area alone; the figure for Wales is
somewhat higher, and these do not include the check monitoring carried out on flocks of sheep in the areas surrounding the restricted areas. The figures for failures are those for the initial monitoring of the animals moving out of the area, and do not take into account the passes on subsequent re-monitoring.
The Current Situation
Restrictions are still in force in certain areas of Cumbria, Wales, Scotland and Northern Ireland. Both live-monitoring and analysis of meat samples continue in all affected areas, and MAFF has an extensive research programme to study the behaviour of radiocaesium in the environment, in particular in the soil and plant types found in the problem areas. The nature of the problem can be stated simply: in these upland areas, where the soil has a low mineral content, the caesium is fixed in the soil sufficiently well that it remains close to the surface, but not so well as to prevent its uptake by grass and other plants. Analysis of both plants and livestock has shown that there is considerable seasonal variation in the levels of contamination, but that the levels are only falling slowly year by year. Research is continuing into ways of preventing the uptake of the activity into the animals, or of speeding up its excretion, by feeding them with appropriate chemicals. Meanwhile, MAFF's programme of live-monitoring ensures that contaminated meat is prevented from entering the market, and the mark and release scheme allows animals to go to slaughter when the level of radioactivity in their flesh has fallen to safe levels.
ACKNOWLEDGEMENTS This paper is published by permission of the Ministry of Agriculture, Fisheries and Food. The author is grateful for the assistance of members of the Food Safety (Radiation) Unit.
REFERENCES
1. Crick, M.J. and Linsley, G.S., An Assessment of the Radiological Impact of the Windscale Reactor Fire, October 1957. NRPB–R135 (1982).
2. Limits for Intakes of Radionuclides by Workers. ICRP Publication 30 (Pergamon Press, Oxford, 1979).
3. Kendall, G.M., Kennedy, B.W., Greenhalgh, J.R., Adams, N. and Fell, T.P., Committed Dose Equivalent to Selected Organs and Committed Effective Dose Equivalent from Intakes of Radionuclides. NRPB–GS7 (1987).
4. Jackson, D., Coughtrey, P.J. and Crabtree, D.F., Nucl. Europe, 5(4), 29 (1985).
5. Jackson, D., Coughtrey, P.J. and Crabtree, D.F., J. Environ. Radioactivity, 5, 143 (1987).
6. Coughtrey, P.J., Mitchell, N.G. and Kirton, J.A., Associated Nuclear Services Report R776–1 (1987).
7. Radiation Protection Research, 1988 Annual Report. UKAEA Report AERE R13204.
8. Chernobyl: The Government's Reaction. Second Report of the House of Commons Agriculture Committee, Vol. II (HMSO, 1988).
9. Walters, C.B. and Mondon, K.J., in Health Effects of Low Dose Ionising Radiation (BNES, London, 1988), p. 201.
10. Walters, C.B. and Mondon, K.J., Unpublished work.
11. Recommendations of the International Commission on Radiological Protection. ICRP Publication 26 (Pergamon Press, Oxford, 1977).
12. Protection of the Public in the Event of Major Nuclear Accidents: Principles for Planning. ICRP Publication 40 (Pergamon Press, Oxford, 1984).
13. Principles for Establishing Intervention Levels for the Protection of the Public in the Event of a Nuclear Accident or Radiological Emergency. Safety Series No. 72 (IAEA, Vienna, 1985).
14. Emergency Reference Levels: Criteria for Limiting Doses to the Public in the Event of Accidental Exposure to Radiation. NRPB–ERL2 (1981).
15. Derived Intervention Levels for Application in Controlling Radiation Doses to the Public in the Event of a Nuclear Accident or Radiological Emergency. Safety Series No. 81 (IAEA, Vienna, 1986).
16. Linsley, G.S., Crick, M.J., Simmonds, J.R. and Haywood, S.M., Derived Emergency Reference Levels for the Introduction of Countermeasures in the Early to Intermediate Phases of Emergencies Involving the Release of Radioactive Materials to Atmosphere. NRPB–DL10 (1986).
17. Gjørup, H.L., Radiation Protection Dosimetry, 21, 145 (1987).
18. Dionian, J. and Simmonds, J.R., The Choice of Individual Dose Criterion at which to Restrict Agricultural Produce Following an Unplanned Release of Radioactive Materials to Atmosphere. NRPB–R183 (1986).
19. Hill, M.D., Wrixon, A.D. and Webb, G.A.M., J. Radiological Protection, 8, 197 (1988).
20. Official Journal of the European Communities No. L371 of 30 December 1987.
21. Codex Alimentarius Commission Proposal CX/FAC 89/17.
22. The National Response Plan and Radioactive Incident Monitoring Network (RIMNET). A Statement of Proposals. (HMSO, 1988).
23. The National Response Plan and Radioactive Incident Monitoring Network (RIMNET). Phase 1. (HMSO, in preparation).
24. Radionuclide Levels in Food, Animals and Agricultural Products (Post Chernobyl Monitoring in England and Wales). (MAFF/Welsh Office, HMSO, 1987).
25. Radionuclide Levels in Food, Animals and Agricultural Products 1987 (Post Chernobyl Monitoring in England and Wales). (MAFF/Welsh Office, HMSO, 1988).
26. Meredith, R.C.K., Mondon, K.J. and Sherlock, J.C., J. Environ. Radioactivity, 7, 209 (1988).
27. Sherlock, J., Andrews, D., Dunderdale, J., Lally, A. and Shaw, P., J. Environ. Radioactivity, 7, 215 (1988).
RESOURCES FOR COPING WITH AN EVENT
THE ROLE OF THE COMMUNITY PHYSICIAN
DR J D TERRELL District Medical Officer West Cumbria Health Authority, West Cumberland Hospital, Hensingham, Whitehaven, Cumbria, CA28 8JG, UK
The Community Physician has three main functions to perform in any incident which exposes, or is in danger of exposing, the public to an environmental radiation hazard, and he fulfils his function as the principal medical adviser to the District Health Authority and as Director of Public Health for his Authority. It is assumed in this paper that the Community Physician's role will be filled by the Director of Public Health of the District Health Authority or, in his absence, by another Community Physician of the Authority of consultant standing who is acting for the Director of Public Health, a post hitherto most commonly designated District Medical Officer. The three main functions of the DPH are:
1. to advise, at the District Control Centre/Off Site Centre, on all aspects of the incident which affect the health of the community;
2. to co-ordinate and arrange the dissemination of health information to health professionals and other relevant people in the community;
3. to effect initial co-ordination of the provision of health services for the victims of the incident.
This paper will first of all deal with a scenario in which a radiation hazard arises within the DPH’s Health District and, building on this, will go on to consider a modified role for the Community Physician where a hazard arises outside his District but poses some threat to it. The submissions made on the role of the Community Physician as Director of Public Health of a District Health Authority are based on experience of exercises conducted over recent years in relation to a possible incident at Sellafield in West Cumbria.
When it has become apparent to the site operators that a real or potential threat exists of environmental radiation hazard beyond the site perimeter, moves to set up an off-site emergency control centre proceed. At the Off Site Centre (OSC), the health desk should be occupied by the Director of Public Health (District Medical Officer) of the relevant District Health Authority. He should be accompanied there by at least one personal administrative assistant who is thoroughly familiar with the geography and communications of the district, and who is furnished comprehensively with names, addresses and telephone numbers of all of the agencies or people with whom the DPH might need to be in touch. In exercise circumstances, one such person has proved adequate, but it would be well to provide a back-up person for this vital support role which enables the DPH to give most of his time to studying health physics and other data passing through the OSC, and discussing this with colleagues in other disciplines. The DPH should also be accompanied by a medical physicist on the staff of the Health Authority, whose main responsibility is to keep constantly abreast of the health physics data entering the OSC; to consult, if necessary, on this with other health service physicists; and advise the DPH on the environmental health implications of the information. In this way, the DPH has a reasonable chance of making an important health input into the vital decisions on such matters as evacuation and issuing, where appropriate, of iodate tablets. Assuming that the DPH at the OSC is supported by two administrative colleagues and one medical physicist, these three individuals relate respectively to the three main functions indicated for the DPH at the OSC. The DPH himself, in addition to directing affairs at the health desk, will from that vantage point be in touch with the whole range of other service and agency personnel at the centre. He will participate in the “top table” periodic discussions on the progress of the incident and contribute to determining the principal decisions of the Controller at OSC whether initially an executive of the agency or company giving rise to the incident (in this case British Nuclear Fuels plc) or, later, the Government Technical Adviser or Senior Police Officer, according to local arrangements. The DPH will be in active consultation with the civil emergency services with regard to health problems arising in the community, ranging from areas affecting large numbers of people to advice on dealing with an individual acute illness or pregnancy requiring hospitalisation. Exercise experience has indicated a wide range of such contacts at OSC, including education officials with regard to school children’s health, and officers of the Ministry of Agriculture, Fisheries and Food in relation to decisions affecting food stuffs. The DPH will have to advise the Controller at OSC on the terms of press releases as they affect health hazards and health advice to the community. Through his medical physics colleague at OSC, the DPH should oversee the making of a truly independent assessment of the health issues in the environmental radiation hazard, working closely with the health physicists at the Centre, who relate to the nuclear installation concerned, and reaching agreement with them on the nature and
implications of the environmental hazard. The medical physicist of the Health Authority would need to communicate direct from OSC with other medical physics colleagues, notably the Regional Radiation Protection Adviser. Such an input to the decision making process not only allows the health services, including the Environmental Health Authority, to fulfil its statutory functions towards the health of the community, but also broadens the basis of professional consultation and advice on the nature and extent of the hazard. It is important that this input should be at the earliest possible moment, since some of the most important decisions, notably concerning evacuation and/or the issuing of potassium iodate tablets, may have to be taken at a very early stage of the emergency. Indeed, such decisions may have to be taken before the OSC gets established, and this would obviously be dependent on the best advice available even at the site control. Nevertheless, at every stage thereafter, the medical physics and health adviser input from the health desk should be secured. Turning to the second function of the DPH at the OSC, namely co-ordination and dissemination of health information, this is facilitated by the establishment of an information room in the most convenient building of the health service locally—in the case of West Cumbria at the West Cumberland Hospital. A dedicated telephone line is provided from the OSC to the hospital information room, and this becomes the life-line between the OSC and the health services generally. The DPH’s administrative support colleague has special responsibility for passing all necessary information from the OSC to the hospital information room. The latter is manned by three staff, the telephone link to the OSC being permanently attended. Two other telephones are also provided in the information room, for immediate onward transmission of urgent information to agencies requiring this, e.g. accident and emergency or other department of the hospital or to the Department of Health or Regional Health Authority. At the same time, other telephones within the hospital are made available in whatever numbers are required for the onward transmission of information to multiple centres, such as general practitioners’ surgeries and health centres, and these telephones, which are away from the hospital information room, are served by one or more “runners” from the information room. Thus it is possible to ‘cascade’ information which should be in the hands of doctors and nurses in the community so that they can speak with some authority when approached, as they certainly would be, for advice by the public and by local media agencies. Rather more information on the underlying reasons for public health decisions would be conveyed to health professionals in the community, than would be possible in official press releases and to the media from the OSC. The function of the DPH and the hospital information room was exercised during a level II exercise in West Cumbria in May 1988, and was commented on favourably by the exercise directing staff, who tested the communications function by passing questions and problems on health matters and assessing the nature and timing of the response from the hospital information room.
The staff who manned the information room throughout the hours of the exercise were drawn from a variety of disciplines, including administration, planning, community medicine, health education, within the health authority. Subsequent to the exercise, a concise manual of guidance was produced to remain permanently by the dedicated telephone point in the hospital information room, so that staff with no previous exercise experience could pick up the responsibility virtually at a moment’s notice. During the 1988 exercise, the communications with primary health care centres in the community was only tested in a limited way using selected practices and health centres planned in advance. It was the realisation that more staff on telephones would be needed to pass out this information more widely without undue delay that prompted the extension of the information room as described above to the bringing into play of more telephones in another part of the hospital. This, coupled with British Telecom’s experience in recent civil disasters, has prompted a check with British Telecom as to the appropriate listing of doctors’ surgeries and health centres in the “telephone preference scheme” for emergency situations. One matter of great importance, with regard to the establishment of sound and clear health advice, is the close co-operation of the Community Physician (DPH) with his environmental health colleagues at the Off Site Control. The Local Authority District Council is represented there, and normally one of its representatives is the Chief Environmental Health Officer. It should be remembered that the role of the DPH in major aspects of an environmental health hazard is as Medical Officer for Environmental Health (MOEH). He needs therefore to work closely with the Chief Environmental Health Officer and seek at all points to coalesce their views and the emerging advice. It is appropriate that the MOEH should show leadership in this, while remembering that there are many aspects of environmental health issues in which the EHO is more expert than is the doctor. At all costs conflicting advice from these two sources must be avoided, as indeed it must be at the OSC in response to advice on food stuffs emanating from the Ministry of Agriculture, Fisheries and Food. One of the vital functions of the OSC is to have the expert staff of all these agencies together so that consensus can be reached on all important advice for the public. The third function of the DPH is the co-ordination of the provision of health services which may be required by the victims of the accident. This may involve trauma to a probably small number of individuals, though they may also have suffered radiation contamination. Hospitals adjacent to nuclear installations have facilities which they exercise regularly, as we do in West Cumbria, for receiving injured contaminated casualties. The same channel of communication from the OSC to the hospital information room would be used for this, transmitted onwards to our casualty command post in the hospital which would be set up during the emergency. It would function essentially according to the standing orders for any major civil accident, including the standby availability of a surgical team to go to the site. In the event of an incident at a nuclear installation, however, every effort
would be made to bring severely injured people to the hospital. One of the internal health agencies with which the hospital information room would be in regular contact, is the ambulance service. The medical management of the victims of radiation exposure will doubtless be touched on by several contributors to this Conference. The NAIR scheme (National Arrangements for Incidents Involving Radioactivity) is described in the NRPB Handbook as a ‘long-stop’ to other emergency plans and not normally intended to operate in circumstances where detailed pre-planning for emergencies exists. The Handbook goes on, however, to state that the police may involve NAIR in any circumstances where they feel they have a genuine need for radiological assistance and where other plans have failed to operate properly, or where unreasonable delays are being experienced in effecting them. It seems to me, therefore, that there is much to be said for a direct link between NAIR and local Emergency Schemes based on establishments housing nuclear reactors and/or specialized plants such as reprocessing of reactor fuels. Turning to the role of the Community Physician in an incident arising outside his own district, we look to the experience of this country in relation to the Chernobyl accident for guidance on this, and I am indebted to my colleagues, the Regional Medical Officer and the Regional Medical Physicist and Radiation Protection Adviser for much of the context I am suggesting for the role of the Community Physician in such circumstances. It is interesting to reflect on the fact that the 1957 Windscale fire was “indistrict” as far as Cumbria was concerned (and would be particularly so today in relation to West Cumbria Health District), but was an “external” incident as far as parts of the United Kingdom down-wind were concerned in that incident, i.e. south-east of Cumbria. Before Chernobyl, Professor Keith Boddy, Regional Medical Physicist and Radiation Protection Adviser to the Regional Health Authority and its constituent District Health Authorities, had discussed with the Regional Medical Officer, Dr Liam Donaldson, how the former’s department of medical physics could best assist in such an external incident affecting the Northern Region. Chernobyl certainly provided just such a situation, both in the immediate days after the accident and, in the longer term, affecting vegetation and grazing stock. A pattern of health service arrangements was drawn up for dealing with the significance of a nuclear accident outside the boundaries of the District Health Authority. This involved close coordination of action and advice by the staff of the Regional Health Authority as guided by the Regional Radiation Protection Adviser and systematically passing information and advice to District Radiation Protection Advisers and Directors of Public Health in Health Districts. It is envisaged that each District Health Authority would nominate an officer, normally the DPH, to undertake major responsibilities, including an advance advisory letter to hospital doctors, general practitioners, environmental health officers, and county emergency planning officers, about the arrangements to apply during an incident. In the event of an emergency he
would co-ordinate locally a response to the incident after reference to Regional colleagues. He would also log all requests for assistance or advice and be a focal point for health advice to the community, providing, in consultation with the District General Manager, a service to meet media enquiries and prepare health advice for the general public. It seems essential that something of this kind should be done in order to establish a focal authoritative point for health advice in post-Chernobyl type situations. It seems appropriate that the Director of Public Health of the District Health Authority should undertake this function, and depending on the nature and the extent of the external incident, he may well decide to establish such a parallel information centre in the hospital as described above for “in-district” incidents. The effectiveness of the function of the DPH in such circumstances will depend very much on the media recognising this arrangement as the main source of authoritative information and advice. In order to achieve this, the DPH must have at all times a positive relationship with the news media serving his district. They will only turn to him and his organisation in an emergency if they have a confident day to day working relationship in non-emergency situations. It will not be adequate to expect to have them pointed in the direction of the Health Authority and the Director of Public Health if they don’t know him, and only from time to time communicate with him on health matters. It is appreciated, of course, that in a similar way all the major national agencies involved in a Chernobyl type incident have been reviewing and revising their functions, and, doubtless, more comprehensively co-ordinated responses could be expected in any future incident. Nevertheless, within Health Regions and Districts the health service will be looked to, and rightly so, as the primary source of health hazard evaluation and advice, and the Community Physician should be prepared to step into the major role.
LOCAL EMERGENCY ARRANGEMENTS FOR RADIATION ACCIDENTS
DR ALAN JONES, M.R.S.C., C. Chem County Emergency Planning Officer Somerset County Council County Hall, Taunton, Somerset. TA1 4DY
ABSTRACT This paper describes the local and national framework for public protection during peacetime emergencies with particular reference to major accidents or events with radiological consequences. The basis for the development of emergency plans will be described together with the inter-relationship between the responsibilities of individual organisations.
INTRODUCTION
There are a number of scenarios for emergencies involving radiation. Nuclear war is a major one which remains a remote but real possibility, and considerable emergency planning effort is involved in this area although it is beyond the remit of this paper. The related problem of nuclear terrorism could occur in peacetime. Other scenarios include civil and defence nuclear establishments, the transport of radioactive material, irradiated material from nuclear-powered satellites and one-off incidents involving radioactive material, e.g. Brazil 1987. The Home Office [1] and the Department of Health and Social Security [2] both issued general guidelines to Local Authorities and Health Authorities in 1985. Since then, peacetime emergency planning in general and nuclear emergency planning in particular have received a number of jolts, and the whole field of study is subject to review and development. Separate guidance is also available from the Health and Safety Executive [3].
LOCAL ORGANISATIONS
Emergency Services
Think of the number ‘999’ and the emergency services of POLICE, FIRE and AMBULANCE come to mind, but the 999 call is one of the few things which these three uniformed services have in common.
The Chief Constable reports to a Police Committee which consists of County Councillors, Magistrates and others. His ‘operational’ autonomy is extensive and the Police-Home Office link is very strong indeed. Police responsibilities include incident control, traffic control, issuing warnings and instigating evacuation in addition to maintenance of law and order. Police Force areas coincide well with Local Authority boundaries although some police forces cover more than one County. Each force has a well-established ability to reinforce its own ‘thin blue line’ by mutual aid with other forces and there exists an extensive communications network linking them on a national basis. Except in incidents involving fire, all organisations recognise the police supremacy in coordinating immediate local public protection measures.
The Fire Service is an agent of Local Government. The Chief Fire Officer reports to a fire authority which either equates to a County Council or, in the urban conurbations, to a Fire and Civil Defence Authority also made up of Local Authority elected members. As with the Police service, the Chief Fire Officer has considerable operational autonomy. The fire service are responsible for fire-fighting and light rescue and are trained and equipped to perform this role in chemical and radioactive environments. They also have procedures for decontamination. At any accident involving fire, the Senior Fire Officer controls the fireground.
The Ambulance Service is part of the Health Authority and responsible for immediate first aid, movement of casualties and movement of the sick during evacuation.
Local Authorities
Apart from the wartime scenario, parishes and communities in rural areas are unlikely to be isolated in the context of nuclear emergencies. Nevertheless, it is worth noting that in many areas, education about emergency
arrangements is carried out at community level and links have been established between Parishes, District and County Councils.
District/Borough Councils in their normal business have responsibilities for housing, environmental health, development control, minor highways and sewerage, and refuse collection. In metropolitan areas and Scotland, Districts have wider responsibilities but I will restrict my comments to a typical non-metropolitan area. District Council responsibilities in an emergency derive from their normal business and particularly concern persons made homeless by evacuation. Over recent years, the level of resources in terms of housing stock, direct labour and plant and equipment available to Districts has decreased and has resulted in a greater reliance on the private sector and other providers.
County Councils are not superior to District Councils in any formal controlling way. The two levels are clearly related. The County covers geographically a number of Districts and has responsibility for such functions as education, social services, major highways, consumer protection, libraries and waste disposal in addition to the fire service as mentioned above. Some of these Departments have specific responsibilities for certain types of emergencies. In addition, the County would, in the first instance, support the Police and the District. The resources are not insignificant but are also becoming increasingly privatised. The County would co-ordinate Local Authority action once the Police considered the immediate life-saving phase was over.
Health Authorities
Health Authorities are responsible for providing primary health care to anyone in need; for medical advice and monitoring to persons affected by any emergency evacuation; and for medical advice to the general public. District Health Authorities can call upon general practitioners and community nurses, the hospitals and clinics and the headquarters community physician teams. District Health Authorities are grouped under Regional Health Authorities. Resources should not be underestimated but the task is formidable. Since Chernobyl, the Department of Health appears to be undertaking responsibilities for which guidance and resources have not yet percolated down to District Health Authority level. Nevertheless, the main hospitals have Major Disaster Plans and the concept of mobile disaster teams is developing. Table 1 illustrates the range of responsibilities.
TABLE 1 Health Authority Responsibilities
Other Local Organisations
Agriculture, Fisheries and Food Departments of Central Government have extensive responsibilities and powers at all points in the food chain, and Water Authorities have to ensure a continued supply of wholesome water. The Women's Royal Voluntary Service, British Red Cross Society and St John Ambulance provide voluntary support to Local and Health Authorities, as do a number of other voluntary groups. Military support can be called into the locality.
CO-ORDINATION OF PLANNING
“No single organisation is responsible for dealing with major accidents or natural disasters in the United Kingdom. Successive Governments have decided that a national disaster force would not improve existing arrangements which rely on immediate assistance being given by the emergency services (Police, Fire and Ambulance), supplemented where necessary by local, health and other public authorities. Many other organisations also stand ready to deal with accidents in various industries and activities on land and sea.”
That extract from Home Office Guidance [1], together with the autonomy of each of the principal organisations mentioned above, is itself a recipe for disaster unless the various contributions are co-ordinated. To some extent, this co-ordinating role exists in the form of small emergency planning teams at County Council or equivalent level. However, between 1974 and 1984, the priority task was the co-ordination of civil defence plans.
Only in recent years has their expertise been seriously applied to peacetime emergencies and whereas civil defence planning is a statutory duty funded by Central Government, peacetime emergency planning is rate-borne and (apart from specific major hazards premises) derives from a discretionary power rather than a statutory duty.
Gradually co-ordinated composite emergency plans are being developed although individual organisations remain responsible for their own operational plans. Composite plans can only be prepared by liaison and emergency planning teams can act as catalysts in this process. Table 2 illustrates the organisations involved in the preparation of one site-specific emergency plan. TABLE 2 Organisations involved in Plan Preparation Hinkley Point Off-Site Emergency Plan
Table 3 illustrates the other organisations whose responsibilities are also reflected in the plan.
TABLE 3 Other organisations covered by Hinkley Point Off-Site Emergency Plan
So the task of co-ordination is a formidable one for this one problem alone. In Somerset, a whole family of plans exists, each dealing with a particular threat.
COUNTERMEASURES AND NEEDS Returning to the theme of radiological emergencies, I will now focus on particular countermeasures and needs in that context dealing initially with LOCAL/SHORT TERM requirements and going on to deal with NATIONAL/LONG TERM requirements. TABLE 4 Local/Short-Term Requirements
Advance Information The lack of public awareness of nuclear and radiation issues is a major problem which has been made worse by the secrecy associated with many of the uses of radioactive material. The public have a right to know what radiation risks exist in their environment and also to know what safety procedures exist to minimise the risk and what emergency arrangements exist should the
safety procedures fail. Following Seveso (1976) European legislation demanded that the public around major chemical sites should be better informed. As a result, advisory booklets have been issued which in most cases include simple action cards.
This process has been or is being applied to most civil nuclear installations. The staff who we would expect to respond to such emergencies also have a right to know the risks involved and the control measures which apply for their protection. As employers, we all have responsibilities under the Health and Safety at Work etc. Act.
Notification The Police, Fire and Ambulance Services have an established system for notifying each other. In addition, the Police take responsibility for notifying other agencies as required by the circumstances and such procedures are laid down in emergency plans. Not all organisations guarantee round-the-clock availability, however, and this is a continuing concern to the Police.
Rescue/Fire-Fighting/Casualty Handling I have already mentioned how the Police, Fire, Ambulance and Hospital Casualty units work together. Such co-operative responses are tested in real terms all too often by one of the biggest hazards of all—Road Traffic Accidents.
Traffic Control The Police have extensive powers to control traffic and by this means to control access to and egress from any area which is affected or threatened by contamination. The purpose is threefold: to ensure that essential and emergency services have easy access to the incident; to ensure that non-essential persons and the public can be speedily removed to safety; and to ensure that, as far as possible, unauthorised persons are kept away from the incident. This zone could range upwards from a few hundred metres for a road accident. For the Hinkley Point site, we plan on an evacuation zone of 3.5km radius which, on the landward side, covers some 60km² (25 square miles) with access and egress possible via a network of minor roads. Police procedures include road blocks, diversions, rendezvous points and separate access/egress routes. Traffic control can also assist in documentation.
Public Warning In the local, short term context, the Police are responsible for warning the public. They use loud-hailers, mobile public address systems, door to door warnings and the media. Around certain major hazards premises, sirens have been installed to advise the Public to go indoors, close doors and windows, tune into local radio and await further advice. There is a case for more widespread use of such sirens but not all organisations are convinced at present. A review of the Civil Defence warning system is in progress which includes the consideration of extending the system for this purpose but the costs are likely to be prohibitive.
Sheltering Sheltering is a useful public protection measure in certain circumstances in its own right or as a precursor to evacuation. Its value in peacetime radiation emergencies is currently receiving much attention. Its effectiveness depends upon early warning and is related in part to the use of sirens. In the past, there may have been over-emphasis on evacuation which is much more disruptive.
Stable Iodine Prophylaxis Not all radiation accidents will involve radioactive iodine, but nuclear reactor accidents almost certainly would. Stable iodine administered early as a blocking agent will protect the thyroid gland and prevent it concentrating radioactive iodine. Around Hinkley Point, the Police are responsible for issuing stable iodine to the public and this is generally the case at civil nuclear sites. At some other sites, the Health Authority perform this role. Handing out tablets to the public will take time and will result in questions about after-effects, interference with other medications, and dosage. Stocks tend to be concentrated around nuclear sites and facilities and wider stock holding at Health Authority premises has been recommended [4]. Advance distribution around nuclear sites is another possibility. A pharmaceutical product licence is now available for potassium iodate tablets.
Evacuation The principles of evacuation are well established. The Police move people out of an affected area and Local Authorities arrange reception and rest centres with support from others. There is the need for transport, documentation, feeding, welfare, comforting and general reassurance until evacuees can be returned home or found alternative accommodation. Children, the elderly, personal belongings and pets all bring their own particular problems. Local Authorities have Housing and Social Services Departments and buildings such as schools, and these are co-ordinated with the Police and Health Authority and voluntary bodies.
In the context of radiation accidents, there are additional requirements for personal monitoring and for decontamination; although local resources could meet this need in the local area, there is a far bigger requirement to meet the demand for reassurance monitoring beyond the bounds of the evacuation area.
Casualty Bureaux and Public Information In any major emergency, there is a requirement to collate information on casualties and evacuees. Such information is essential to the effective management of an emergency and also for answering enquiries from concerned relatives and friends. The Police have well established procedures for such matters although recent major disasters have illustrated that it is almost impossible to match the resources available to the volume of calls. Local Authorities, Health Authorities and Government Departments recognise the need for each of them to manage public information points in addition to emergency management centres. Plans also exist for extensive involvement of the media.
Co-ordination There is no national disaster organisation in the United Kingdom. Emergency Planning and Emergency Management rest with a large number of individual separate organisations. In the local short term context, the Police co-ordinate the emergency management of public protection measures in the immediate life-saving phase with co-ordination of local action passing to Local Authorities in a later administrative phase. Both the Police and Local Authorities can call on military and voluntary assistance.
Technical Support and Assessment Radiological emergencies require specialist advice and where specific site operators or shipment consigners (eg, BNF, CEGB, SSEB, MoD) are involved, arrangements for provision of special advice are made in advance. In other cases, the Police can activate the National Arrangements for Incidents involving Radioactivity (NAIR) scheme. Health Physics teams within Health Authorities are one source of assistance. The Nuclear Industry and MoD are the main source of monitoring and assessment resources supported by the developing RIMNET system of the Department of the Environment and Local Authority radiation monitoring networks.
Food and Water Central Government agriculture departments and the Water Authorities are responsible for countermeasures to protect the public from contaminated food and water. TABLE 5 National/Long Term Requirements
All these wider issues have an impact on the local scene and need to be recognised in local and national emergency plans.
SUMMARY Local emergency arrangements for radiation accidents depend for their successful operation on the co-ordination of the efforts of a multitude of separate local, regional and national organisations. Considerable preplanning has already been done but there is a continuing need to develop planning, training and exercising and there is a fundamental need for each and every organisation to commit resources for this purpose.
REFERENCES
1. Home Office. Emergency Planning Guidance to Local Authorities. 1985, ISBN 0-86252-196-3.
2. Department of Health and Social Security. Health Service Arrangements for Dealing with Major Accidents. 1985. HC (85) 24. (Supplement to HC (77) 1).
3. Health and Safety Executive. Emergency Plans for Civil Nuclear Installations. 1982. ISBN 0-11-883669-2.
4. Sibson, M., Jones, A., Richards, A.H., Cameron, I.D., Taylor, P., Mossman, G.K. Nuclear Site Emergency Planning. Second Report. January 1988. Society of County Emergency Planning Officers.
MONITORING AND ASSESSMENT OF RADIATION EXPOSURE FROM ROUTINE RADIOACTIVE DISCHARGES, AND ITS RELEVANCE TO THE QUESTION OF DISEASE CLUSTERS
S.R. JONES Environmental Protection Group, British Nuclear Fuels plc, Sellafield, Seascale, Cumbria CA20 1PG.
ABSTRACT Over the period of more than 30 years in which nuclear facilities have been operating, monitoring of the environment around such facilities has produced a wealth of information about the levels of radioactivity in the environment and about how radioactivity, introduced into the environment from effluent discharges, behaves in natural systems. Techniques for estimating radiation exposure to people from environmental measurements, or from theoretical models relating discharges to radiation exposure, have also been developed and validated where possible. The existence of localised excesses of leukaemia near the reprocessing plants at Sellafield and Dounreay, which cannot be explained on the basis of radiation exposure using these established methodologies, could be taken to imply that the methodologies are defective. However the radiation risk estimates have been shown to be robust to many of the suggested deficiencies and there are other, more qualitative, indications that discharges may not be a causative factor. In these circumstances it is argued that a broader search for possible causative factors may be more productive in resolving the conundrum.
INTRODUCTION All nuclear installations in the UK are required to monitor and limit the quantities of radioactivity released to the environment during routine operation, and also to monitor the environment to detect the effects of the releases. The main objective for this monitoring is to enable estimates to be made of the radiation exposures of the public resulting from the releases, independently of the release measurements themselves.
In the early 1980s questions were raised about the incidence of various diseases around nuclear installations, notably childhood leukaemia. Since the release control and environmental monitoring arrangements were established with a view to regulating public radiation exposure to levels at which such effects could not occur, the apparent existence of such clusters and the suggested connection with discharges from the nuclear installations constituted a major challenge to the accepted methodology for environmental dose assessment and health risk assessment. That challenge is not yet fully resolved, although many studies have been carried out. Questions are raised in many diverse areas, such as the statistics and epidemiology of the occurrence of rare diseases; the aetiology of the diseases themselves; patterns of occurrence of disease on the national scale; and fundamental radiobiology. This paper concentrates on the area of environmental assessment and the estimation of public radiation exposure, briefly reviewing the large amount of work which has already been done and noting some areas strictly outside the confines of dose assessment where environmental studies can cast some light on the remaining controversy.
BEHAVIOUR OF RADIOACTIVITY IN THE ENVIRONMENT Radioactivity released into the environment behaves similarly to any other pollutant. That is, the material is diluted and dispersed by winds, tides and currents; it is also subject to more complex processes such as sedimentation, atmospheric deposition, and transfer through food chains. Some of these processes lead to a degree of reconcentration or accumulation of material. Figure 1 indicates some of the processes which are important. The detailed behaviour of released radioactivity therefore depends on many factors—the mode of its release (discharge to atmosphere, discharge to marine environment, discharge to freshwater environment); the form of the material released (noble gas, chemically reactive gas or vapour, dust; aqueous solution, suspended solids); and the chemical behaviour of the radioactive elements involved. Thus the environmental behaviour of releases from a single nuclear installation may seem too complex to be readily understood, and in the sense of a complete mechanistic understanding of each and every process involved, that is correct. Nonetheless from a knowledge of the basic environmental processes involved and by taking an empirical approach to environmental monitoring, it is possible to identify and monitor the most important environmental effects. In this sense radioactivity has historically had one great advantage over other pollutants in that radionuclides can be detected in very small quantities in environmental materials, so that monitoring and field research studies are entirely feasible. This advantage has not always applied to conventional pollutants, and although modern analytical techniques are to some extent redressing the balance it would be true to say that there is a more detailed knowledge of the quantities and behaviour of pollutant radionuclides in the environment than there is for many conventional pollutants resulting from industrial or agricultural processes.
Figure 1—Principal exposure pathways to man from radioactivity in the environment
ENVIRONMENTAL MONITORING PROGRAMMES As already indicated, one of the prime objectives of environmental monitoring is to enable estimates to be made of radiation doses to members of the public, although there are other subsidiary objectives (1). Design of the monitoring programme requires, firstly, consideration of the pathways by which radiation exposure may occur and the radionuclides which will be of particular significance, depending on the quantities discharged. As an illustrative practical example, the following paragraphs summarise the main exposure pathways, the monitoring programme, and methods of dose assessment for the Sellafield site.
Marine Pathways Effluent released from the pipeline into the Irish Sea is dispersed by tidal action and currents. Some isotopes remain mostly mobile within the seawater (eg caesium–137), whereas others are more strongly adsorbed by silt particles (eg plutonium and ruthenium–106). Certain isotopes are ingested or adsorbed by marine flora and fauna, depending on their chemical nature, and may enter the food chain. The quantity of radioactive material taken up by any particular species will depend, among other things, on its habitat and feeding habits. Eventually, radionuclides will pass along food
chains and be found in marine species eaten by man. Of particular interest in the waters and shores around Sellafield are fish such as plaice, cod and whiting; crustaceans such as crabs, lobsters and scampi; and molluscs such as mussels and winkles. Radionuclides will become concentrated in the edible parts of these species, and will thus become a source of radiation exposure to the consumer. Radionuclides readily adsorbed by silt may then be deposited in the inter-tidal regions of local beaches, harbours and estuaries, exposing people who frequent these areas to gamma radiation from nuclides such as caesium–137 and ruthenium–106. Mechanisms exist whereby fine sediment particles, mostly those suspended in seawater in the ‘surf zone’ where waves break, can become resuspended and therefore give rise to exposure by inhalation or the transfer of material into terrestrial food chains. The most important radionuclides here are isotopes of plutonium and americium; the mechanism has been the subject of much research (5, 7, 9, 12). Many marine materials that concentrate radionuclides are not consumed by man, but may be useful indicators of, for example, changes in dispersion patterns. This role is particularly true of certain seaweeds. One such seaweed, Porphyra umbilicalis, was collected from the Cumbrian coast some years ago and sent to south Wales where it was used to make a local delicacy called laver-bread. Although Porphyra is no longer collected locally in significant quantities for this purpose, it remains useful as an indicator material and its collection remains a statutory requirement. A summary of the main marine pathways of interest is given in Figure 1.
Land-Based Pathways Gaseous effluent released to the atmosphere from Sellafield is dispersed and the airborne material provides a source of low-level exposure either directly from the cloud or from inhalation of the material. Ultimately, the airborne material is deposited on exposed surfaces such as soil and vegetation and provides a pathway to man either directly by consumption of vegetables and fruit or, indirectly, for example, via milk from livestock which has grazed on the vegetation. A further possible source of exposure is from surface water run-off, which then forms part of a
drinking water supply. A general summary of land-based pathways is given in Figure 1.
Important Pathways And ‘Critical Groups’ The number of possible pathways by which man may be exposed as a result of discharges to the environment from Sellafield is large, although in most cases the exposure due to a particular pathway or group of pathways is insignificant. It is possible to identify those pathways which give rise to the most significant exposure and, for each of these, the group of individuals most highly exposed. Such a group is known as a critical group. The critical groups for marine pathways are defined by the Fisheries Research Directorate of the Ministry of Agriculture, Fisheries and Food (MAFF). Its staff carry out periodic surveys into the seafood consumption and occupancy habits of the local communities to identify those individuals most highly exposed, and to estimate consumption rates or occupancy times representative of that group of individuals. The regular sampling of animal and plant species representative of those eaten by the group and the radiation monitoring of areas of silt on which they spend time then form the basis of an environmental monitoring programme. The combination of radionuclide concentrations and radiation dose rates with representative critical group habits then allows the estimation of radiation doses to members of critical groups. The identification of critical groups for land-based pathways is not usually based on detailed habits surveys but on certain maximising assumptions. It is assumed that the most highly exposed person would be an individual living on a farm close to the site and consuming milk and other foods produced entirely at that farm, directly inhaling airborne material, and exposed to cloud dosage from airborne material. Unless other information is available, consumption rates listed by NRPB in their DL series of reports are used. A survey of local consumption habits has been commissioned and is now underway.
Marine Monitoring The important marine pathways have already been identified as external irradiation from occupancy of exposed areas of silt and consumption of local fish, crustaceans and molluscs. The monitoring programme is
largely based on a statutory programme, last reviewed and modified with effect from 1 July 1988, the details of which have been published and the results of which are reported annually by BNFL (2). BNFL also carries out some monitoring beyond the requirements of the statutory programme, and the Ministry of Agriculture, Fisheries and Food carries out a parallel monitoring programme, the results of which are also published annually (3). A major part of the programme consists of a series of measurements and samples taken from the beaches and inter-tidal areas of the Cumbrian coast. Gamma radiation surveys are carried out over certain areas of exposed silt, on a monthly or quarterly basis along the coastline from Maryport to Walney, the main areas of interest being in harbours and estuaries. Radiation measurements are made using a hand-held radiation monitor one metre above the silt. On extensive areas of silt, readings are taken and recorded at regular intervals along, say, a river bank, whereas on beaches where only small sporadic areas of silt may be present a reading is taken over each area. Samples of silt are also taken for more detailed analysis, as are samples of shore seawater. MAFF continue to keep occupancy of these areas under review. At the present time, the critical group has been identified by MAFF as a small number of people who live on house-boats moored in muddy creeks off the Ribble Estuary. This area is monitored by BNFL’s site at Springfields. In West Cumbria it is also people occupying boats who are representative of those who receive the highest external exposures. Mollusc samples (mainly mussels and winkles which are consumed by the critical group) are collected monthly from several coastal sites. The specimens are ‘picked’ at low tide and sufficient are obtained to ensure a sample of a few hundred grams of flesh. Sand and other debris are rinsed off the winkle and mussel shells with seawater during collection and, at Sellafield, the molluscs are quickly prepared, as if for consumption, in a manner recommended by MAFF. The edible flesh can then be removed from the shells prior to radionuclide analysis.
The reasons for sampling the seaweed Porphyra umbilicalis have been described previously. Samples for analysis are obtained from 4 coastal locations between Braystones and Walney, where it grows in patches on rocks. It is usually blackish brown or reddish in colour, producing flat, lobed growths. The monitoring locations for inter-tidal areas are shown in Figure 2. The remaining major component of the programme is the sampling of other marine foods from offshore. BNFL owns a trawler, the RV ‘Seascan’ based at Whitehaven harbour, which fishes for the main species consumed by the critical group, namely cod and plaice. From time to time, samples of other species such as whiting, sole and herring are obtained, and cod and plaice landed commercially at Ravenglass and Whitehaven are also purchased regularly. A typical sample consists of a few kilograms of edible fish flesh, which are analysed at Sellafield for the appropriate radionuclides. The sampling concentrates on areas close to the shore between St Bees Head and Selker, as shown in Figure 2. Crustacean samples (mainly crabs, lobsters and scampi) are also obtained, mainly from local commercial suppliers, although some are caught by the ‘Seascan’. These samples are again brought to Sellafield for radionuclide analysis of the edible parts.
Figure 2 Marine environmental monitoring locations
Land Monitoring The important land pathways have already been identified. Much of the effort and expense in the land programme goes on milk sampling. Milk is collected from a number of farms every 2 weeks, a sample having been placed in a bottle by the farmer at each milking over the 2–week period. The farms sampled include 7 within 2 miles of Sellafield and another 4 between about 2 and 4 miles from the site, chosen to provide a reasonably uniform directional coverage. Analysis for important radionuclides such as caesium–137 and strontium–90 is carried out on each sample, and, for less important nuclides, on 6–week bulked samples. A sample from the milking on the day of collection is also taken from some of the farms and analysed for radioiodine. Samples of milk are taken from 2 farms about 25km from the site to establish ‘background’ levels of activity, and also from bulk milk supplies at a local Milk Marketing Board creamery where cheese is manufactured.
In addition, milk is sampled from a farm on the Ravenglass estuary to monitor the effect of cattle grazing on pastures washed by the tide and on which may have been deposited low levels of radioactivity from the effluent discharged to the Irish Sea. Milk samples are also taken from a farm adjacent to the Drigg disposal site. Milk sampling locations are shown in Figure 3. A small number of samples of meat, potatoes and local vegetables are also collected each year. MAFF also carry out a parallel monitoring programme of milk and foodstuffs, and their results are published annually (4). In order to monitor the inhalation pathway, a number of air sampling devices provide samples from points on the site perimeter and in the surrounding area up to about 10km from Sellafield. Locations sampled further from the site include local inland population centres (Beckermet, Calderbridge, Holmrook), together with coastal locations (Ravenglass, Seascale, Braystones, St Bees and Whitehaven) which will reflect the effects of any resuspension of marine material, as previously described. Pumps capable of pulling up to 100 m³ of air per hour through a large filter paper are used to monitor the very low levels of airborne radioactivity. The sample papers are bulked periodically for radionuclide analysis. Air sample locations are shown in Figure 3. Two rivers, the Calder and the Ehen, flow into the sea at Sellafield and samples are regularly taken at several locations either once or twice a month. The sample consists of a spot sample taken in a plastic bottle to confirm that the very low levels of radioactivity in the rivers have remained unchanged. The lower reaches of these rivers are not a normal source of drinking water. The works’ sewer, which discharges to the sea, is also sampled on a continuous basis. A stream which carries drainage from the Drigg site is sampled automatically every hour, and the samples bulked weekly for analysis. The River Irt into which the stream discharges is also sampled regularly, as are local water supplies, both lakes and reservoirs and from the tap. In all, such a programme will generate a lot of data (typically 20,000 analytical results per year for the Sellafield programme). Full summaries of the results of the Sellafield programme are published annually (2), and parallel programmes are also carried out by the Ministry of Agriculture, Fisheries and Food (3), (4). Other studies more related to environmental research are also carried out particularly around Sellafield, and form another substantial body of information on levels of radioactivity in the environment (eg 5–20). These studies are also central to understanding the processes involved and to providing a scientific basis for predictive dose assessments.
Figure 3 Terrestrial monitoring locations
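The air sampling figures quoted above translate into airborne concentrations by a simple division of the activity found on a filter by the volume of air drawn through it. The sketch below shows that arithmetic; the pump flow rate is the stated maximum of 100 m³ per hour, while the filter activity and the sampling period are hypothetical values chosen only for illustration.

```python
# Illustrative sketch only: converting an air-filter measurement into an
# airborne concentration, assuming a pump running at the quoted maximum of
# 100 m^3 of air per hour.  The activity value and sampling period below are
# hypothetical, not measured Sellafield data.

PUMP_FLOW_M3_PER_H = 100.0          # stated maximum pump throughput
SAMPLING_PERIOD_H = 14 * 24.0       # assumed two-week filter change interval
FILTER_ACTIVITY_BQ = 0.5            # hypothetical activity measured on the filter (Bq)

volume_sampled_m3 = PUMP_FLOW_M3_PER_H * SAMPLING_PERIOD_H
concentration_bq_per_m3 = FILTER_ACTIVITY_BQ / volume_sampled_m3

print(f"Air sampled: {volume_sampled_m3:.0f} m^3")
print(f"Mean airborne concentration: {concentration_bq_per_m3:.2e} Bq/m^3")
```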
ASSESSMENT OF RADIATION DOSES TO THE CRITICAL GROUP FROM ENVIRONMENTAL MEASUREMENTS As a general principle, environmental monitoring programmes have been established mainly with a view to estimating radiation doses to ‘critical groups’—that is, small sub-groups which characterise the highest levels of radiation exposure. As a rule, monitoring is concentrated on the final step in the exposure pathway which leads to exposure—for example, concentration of radionuclides in food. Dose estimates then rely on calculating the quantity of activity taken into the body by combining radionuclide measurements with consumption rates for particular foods. Intakes of material by inhalation are similarly calculated from measured concentrations in air and breathing rates. In addition to the above, occupancy times in particular areas will be relevant to the dose arising from external gamma irradiation (for example, from sediments in mud banks) and, in some circumstances, inhalation. From the estimates of intake so calculated, radiation doses are evaluated which depend on models for the uptake, translocation, and retention of various radionuclides in body organs (21, 22). The value usually calculated is the “committed effective dose equivalent” from a single year's intake, that is a sum of doses to individual organs, weighted according to sensitivity to cancer induction and hereditary disease, and integrated for a period (conventionally 50 years) following the intake. Calculation of this quantity is simplified by the publication of values for the “dose per unit intake” for inhalation and ingestion and for various age groups (23). Doses to critical groups in the vicinity of Sellafield in 1987 are shown in Table 1.
TABLE 1 Typical critical group exposures near Sellafield, 1987
TABLE 2 Comparison of dose equivalents (µSv) to red bone marrow for one-year-old children living near nuclear installations, from various sources
Notes: 1 Values typical of Oxfordshire and Berkshire. Fallout doses in Thurso and Sellafield are up to a factor of two or so higher. 2 NC=not calculated. (Taken from Reference 42)
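In outline, a critical-group dose estimate of the kind summarised in Table 1 is a sum of products of concentrations, intake rates and dose coefficients, plus an external term from occupancy. The sketch below illustrates that structure; every numerical value in it (concentrations, consumption rates, dose-per-unit-intake coefficients, dose rate and occupancy) is a hypothetical placeholder rather than a figure from the Sellafield assessment.

```python
# Minimal sketch of a critical-group dose estimate, combining measured
# concentrations in foods with consumption rates and dose-per-unit-intake
# coefficients, plus an external gamma term from occupancy of contaminated
# silt.  All numerical values are hypothetical.

# (food, nuclide): concentration in Bq per kg (hypothetical)
concentrations = {
    ("fish", "Cs-137"): 20.0,
    ("molluscs", "Pu-239"): 0.5,
}

# annual consumption in kg per year for the critical group (hypothetical)
consumption_kg_per_y = {"fish": 36.5, "molluscs": 7.0}

# committed effective dose equivalent per unit intake, Sv per Bq ingested
# (hypothetical placeholders standing in for the published coefficients)
dose_per_bq_ingested = {"Cs-137": 1.3e-8, "Pu-239": 2.5e-7}

# external gamma pathway: dose rate over silt (hypothetical) times occupancy
external_dose_rate_sv_per_h = 2.0e-7
occupancy_h_per_y = 300.0

ingestion_dose_sv = sum(
    conc * consumption_kg_per_y[food] * dose_per_bq_ingested[nuclide]
    for (food, nuclide), conc in concentrations.items()
)
external_dose_sv = external_dose_rate_sv_per_h * occupancy_h_per_y

total_sv = ingestion_dose_sv + external_dose_sv
print(f"Ingestion: {ingestion_dose_sv*1e6:.1f} microsievert per year")
print(f"External : {external_dose_sv*1e6:.1f} microsievert per year")
print(f"Total    : {total_sv*1e6:.1f} microsievert per year")
```

The published dose-per-unit-intake values referred to in the text play the role of the dose_per_bq_ingested table in this sketch.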
DOSE ESTIMATES FOR COMPARISON WITH EPIDEMIOLOGICAL STUDIES The procedures outlined above produce dose estimates which are adequate for establishing that maximum radiation exposures are within recommended limits (24–27). However they are not directly relevant to assessing the possible relationship between radiation exposure and disease incidence in a population, because in this case the average exposure, or rather the exposure of more typical members of the population, is likely to be more relevant. This depends on the assumption that there is no small sub-group of the population with radiation doses so high that induction of a cancer is highly probable; the critical group dose estimates demonstrate that this is not possible. Thus it is necessary to make use of the data on radionuclide concentrations in the environment together with assumptions about intake which represent the average, rather than extreme, consumer. For foodstuffs this is relatively straightforward; however pathways which are not important as contributors to the critical group doses may make a significant contribution to the smaller average dose, and need to be considered. A good example is the inadvertent ingestion of house dust, soil or beach sediments. A further complication arises because radiation induced cancers arise as a result of irradiation of a particular organ or tissue, and by definition must be caused by radiation dose delivered to that organ in the period prior to the diagnosis of the condition. The committed effective dose equivalent is therefore not a suitable indicator of the likelihood of an actual cancer case having been caused by radiation, and must be replaced by estimates of doses to specific target organs for a particular cancer, including the integration of dose over a period relevant to the age of the subject. Such calculations are possible using the models already referred to (21, 22) but need to be carried out specifically for each case.
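Schematically, the quantity needed for such comparisons is the dose actually delivered to the target organ between birth and diagnosis. The sketch below accumulates annual red bone marrow doses up to a diagnosis year; the annual doses, birth year and diagnosis year are invented for illustration.

```python
# Sketch of integrating organ dose (here, red bone marrow) from birth up to
# the year of diagnosis, as would be needed when relating doses to the
# occurrence of a particular cancer.  The annual doses are hypothetical.

annual_rbm_dose_usv = {        # microsievert delivered to red bone marrow each year
    1970: 120, 1971: 110, 1972: 95, 1973: 80,
    1974: 70, 1975: 60, 1976: 55, 1977: 50,
}

birth_year = 1970
diagnosis_year = 1976

dose_to_diagnosis = sum(
    dose for year, dose in annual_rbm_dose_usv.items()
    if birth_year <= year < diagnosis_year
)
print(f"Red bone marrow dose accumulated before diagnosis: {dose_to_diagnosis} microsievert")
```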
ENVIRONMENTAL MODELLING Using the procedures indicated above, it is possible to use environmental sampling results to make estimates of the radiation dose delivered to the population in a particular area for use in cause/effect studies in epidemiology. However that pre-supposes that (a) There are adequately comprehensive environmental radioactivity data for the area in question. (b) That data covers the entire period for which mortality/morbidity statistics are to be examined. For areas around nuclear installations (a) is true for the last ten years or so; however environmental monitoring programmes have generally been improved as analytical techniques have advanced, for both alpha and gamma emitting radionuclides, and were generally not so extensive and complete in the 1950s and 1960s. In order to extend the analysis back to this period it is necessary to use predictive models which relate environmental radionuclide concentrations to discharge rates. For some nuclear sites, levels of environmental radioactivity from discharges are so low that they cannot be measured or distinguished from other sources, such as nuclear weapons fall-out. Again, in these cases environmental models can be used to infer the levels of environmental radioactivity. For discharges to atmosphere, there are established models for atmospheric dispersion (28) and transfer through food chains (29) which are of general applicability. For discharges to sea, dispersion and dilution are highly site specific and therefore specific models for each situation are required (11). Since modelling sea water/sediment/biota systems is generally complicated, recourse may also be made to more empirical methods in which discharge rates, perhaps modified by exponential ‘time of availability’ functions, are correlated with measured environmental concentrations (30). Such models are naturally subject to uncertainty but can be validated against such environmental data as are available in order to improve confidence.
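The empirical approach mentioned above can be reduced to a very simple form: model the measured concentration as a single constant times the sum of past discharges, each weighted by an exponential 'time of availability' function, and fit that constant to whatever monitoring data exist. The sketch below shows that shape of calculation with invented discharge and concentration figures and an arbitrarily assumed availability half-time.

```python
import math

# Hypothetical annual discharges to sea (TBq) and a few measured environmental
# concentrations (Bq/kg) in some indicator material, by year.  Purely invented.
discharge_tbq = {1970: 50, 1971: 60, 1972: 80, 1973: 40, 1974: 30, 1975: 20}
measured_bq_per_kg = {1973: 95.0, 1974: 70.0, 1975: 50.0}

HALF_TIME_Y = 2.0                      # assumed 'time of availability' half-time
lam = math.log(2) / HALF_TIME_Y


def predicted(year, k):
    """Concentration predicted as k times the sum of past discharges, each
    weighted by an exponential availability function of its age."""
    return k * sum(
        d * math.exp(-lam * (year - y))
        for y, d in discharge_tbq.items() if y <= year
    )


# Least-squares fit of the single scaling constant k against the measurements.
num = sum(measured_bq_per_kg[y] * predicted(y, 1.0) for y in measured_bq_per_kg)
den = sum(predicted(y, 1.0) ** 2 for y in measured_bq_per_kg)
k_fit = num / den

for y, obs in measured_bq_per_kg.items():
    print(f"{y}: observed {obs:6.1f}  modelled {predicted(y, k_fit):6.1f} Bq/kg")
```

In practice such a fit would be carried out separately for each nuclide and each indicator material, and validated against the measured series as described in the text.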
RADIOLOGICAL ASSESSMENTS RELEVANT TO LEUKAEMIA CLUSTERS NEAR NUCLEAR INSTALLATIONS Questions about the incidence of childhood leukaemia near nuclear installations were first raised by a Yorkshire TV programme about Sellafield in 1983, specifically in connection with the incidence of childhood leukaemia in Seascale. A committee of inquiry chaired by Sir Douglas Black considered this question in detail and reported in 1984 (31). Subsequently a standing committee, the Committee on Medical Aspects of Radiation in the Environment (COMARE) was set up and has issued two reports (32, 33). Separate papers in the scientific and medical journals have referred to excesses of childhood leukaemia in the vicinity of the UKAEA Dounreay site near Thurso (34) and in the vicinity of Ministry of Defence sites in Berkshire (35). COMARE have concluded that the excesses at Seascale and Thurso are real and statistically significant, and will report shortly on the Berkshire excess. This paper does not seek to address the statistical and epidemiological debates about the significance of these observations, but to concentrate on what environmental measurements and radiological assessment can contribute to the interpretation. The National Radiological Protection Board have carried out extensive assessments of the radiation doses resulting from radioactive discharges to the populations of Seascale and Thurso and for the population around MoD sites in Berkshire, using the approach indicated above (36–41). For Seascale and Thurso, the epidemiological findings of interest have centred on childhood leukaemia; NRPB have assumed that the relevant ‘target organ’ is red bone marrow and have evaluated the doses to that organ in typical children living in the area. They have considered doses to different theoretical ‘cohorts’ of children born and resident in the area at different times during the period of interest. After detailed consideration of discharge data, environmental monitoring data, and all the relevant environmental modelling, metabolic modelling, and dosimetric information, their main conclusions in terms of maximum doses to red bone marrow for children resident in the three areas are shown in Table 2 (42). The results may be considered in two ways—firstly by application of a risk factor for leukaemia induction to the calculated doses, and secondly in
a more empirical manner by comparison of the calculated doses to red bone marrow from discharges with those which arise from natural radioactivity and nuclear weapons fall-out. In the first case, current risk factors and estimated doses indicate that the effect of discharges on local leukaemia incidence should be quite negligible. The comparison with natural radiation doses and other sources such as fall-out, which are fairly uniform on a national scale, provides some assurance that the risk factors cannot be underestimated to an extent sufficient to provide an explanation for the excesses, since the doses from discharges do not appear to significantly increase the total dose to red bone marrow above the levels experienced elsewhere in the UK.
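The first of these two checks amounts to multiplying an estimated dose by a leukaemia risk coefficient and a cohort size; the second is a straight comparison of discharge-related and background-scale doses. The sketch below shows both steps; the cohort size, doses and risk coefficient are hypothetical placeholders, not the NRPB values.

```python
# Rough sketch of applying a leukaemia risk factor to an estimated dose, and
# of comparing discharge-related doses with background-scale doses.
# All inputs are hypothetical illustrations.

children_in_cohort = 1500                 # hypothetical number of children considered
mean_rbm_dose_sv = 300e-6                 # hypothetical mean red bone marrow dose from discharges
risk_per_sv = 5e-3                        # hypothetical leukaemia risk coefficient per sievert

expected_excess_cases = children_in_cohort * mean_rbm_dose_sv * risk_per_sv
print(f"Expected excess leukaemia cases from discharges: {expected_excess_cases:.4f}")

# Second, more empirical check: compare the discharge-related dose with a
# hypothetical dose of the order of natural background plus weapons fall-out.
background_rbm_dose_sv = 5e-3
print(f"Discharge dose as a fraction of background-scale dose: "
      f"{mean_rbm_dose_sv / background_rbm_dose_sv:.2%}")
```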
UNCERTAINTIES IN THE RADIOLOGICAL ASSESSMENTS The studies briefly reviewed above would seem to exclude radioactive discharges as a significant causative effect for childhood leukaemia in Seascale and Thurso. However as with all such assessments the basic science, assumptions and models used are subject to uncertainties and therefore open to question. As a consequence it is virtually impossible, in the scientific sense, to prove absolutely that radioactive discharges are not the cause. Moreover, COMARE have reached the judgement that the excesses of leukaemia at Seascale and Thurso are statistically significant, and that the presence of such excesses near two similar nuclear installations (both involved in nuclear fuel reprocessing) is suggestive of a link which requires further examination (33). It is therefore clearly important to consider sources of error and uncertainty in the radiological assessment and to review the evidence available to reduce the uncertainties. NRPB carried out such a review recently (42).
Unreliability Of Discharge Information Particularly for the early years, dose assessments rely to some extent on the recorded discharges from the nuclear facilities, and these may be in error. Indeed the addendum to the NRPB report on the risk of leukaemia in Seascale (38) was produced largely to take account of new information on discharges from Sellafield during the early years of operation. In this case (38, 32) the new information made very little difference to the
calculated doses or risks, although it raised the question of whether larger undiscovered errors may still exist. Two factors militate against this uncertainty as a possible cause of error. Firstly, the long lived radionuclides (plutonium, americium, caesium, strontium) which also tend to be more significant in the radiological assessments, are retained for a long time in some environmental media and therefore retrospective inventory studies can provide some confirmation of the cumulative discharge. Thus soil core and lake sediment measurements have produced information which can broadly confirm the total discharges of long lived radionuclides to atmosphere (10); inventory studies of plutonium and americium in Irish Sea sediments and complementary modelling studies can provide some confirmation of the magnitude of discharges to sea (43, 44). Secondly, leukaemia cases in the two ‘clusters’ are not confined to the early years of operation where uncertainty is greatest; indeed at Thurso cases are concentrated in the later period 1979–1984 (33). Thus cases are occurring in periods where there is good discharge and environmental monitoring data to support the assessment.
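Such an inventory check is, at its simplest, a comparison of a measured present-day environmental inventory with the sum of recorded annual discharges corrected for radioactive decay. A schematic version is sketched below; the discharge record, the measured inventory and the choice of a 30-year half-life (of the order of that of caesium 137) are illustrative assumptions.

```python
import math

# Hypothetical recorded annual discharges (TBq) of a long lived nuclide,
# and a hypothetical present-day inventory estimated from soil cores or
# sediment surveys.  Values are illustrative only.
recorded_discharges_tbq = {1955: 5, 1960: 20, 1965: 40, 1970: 60, 1975: 30}
measured_inventory_tbq = 120.0

HALF_LIFE_Y = 30.0            # of the order of the caesium-137 half-life
reference_year = 1988
lam = math.log(2) / HALF_LIFE_Y

# Decay-correct each year's discharge to the reference year and sum.
expected_inventory = sum(
    q * math.exp(-lam * (reference_year - year))
    for year, q in recorded_discharges_tbq.items()
)

print(f"Inventory expected from discharge records: {expected_inventory:.0f} TBq")
print(f"Inventory measured in the environment    : {measured_inventory_tbq:.0f} TBq")
print(f"Ratio measured/expected                  : "
      f"{measured_inventory_tbq / expected_inventory:.2f}")
```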
Existence Of Significant, But Unidentified Exposure Pathways The radiological assessments make use of inhalation rates, food consumption habits, occupancy times and other information to relate environmental radionuclide concentrations to doses. These assessments assume that all the significant exposure pathways have been identified. The methodology of environmental monitoring and dose assessment places most reliance on monitoring suitable environmental indicators at the end of various exposure pathways, rather than monitoring individuals themselves. This is largely a matter of convenience since measurements on individuals are difficult to organise and intrusive; however measurements on individuals are very valuable in confirmation of environmental assessments. Radionuclides such as caesium 137, which emit gamma radiation, can be detected readily in individuals by use of sensitive radiation detection equipment. In 1984, and to provide Sir Douglas Black’s inquiry
with additional information, NRPB undertook a large exercise of this type in Seascale which involved the measurement of 290 people for whole body radioactivity (45). In about 7% of the population, caesium 137 could be measured but the levels found were rather lower than those expected on the basis of the previous radiological assessment. Since caesium is cleared from the body with a half-time of about 110 days in adults or 20–50 days in children, this could only confirm that caesium uptake in the period of months to a year preceding the measurement was in agreement with expectation; however NRPB also refer to earlier, more limited, work in the 1960s which reached a similar conclusion (46–47). Studies of caesium 137 in whole body following the deposition of Chernobyl activity in the UK in 1986 also indicate that assessments based on environmental data tend to be pessimistic (48). For radionuclides which emit only alpha radiation, such as plutonium and americium, direct measurement of body content is not feasible. Confirmation of environmental assessments depends on the radiochemical analysis of samples, either autopsy samples or excreta. In the latter case modelling is still necessary to relate sample results to body content (49, 50). Samples of autopsy tissues therefore provide the most unambiguous confirmation and a limited amount of data on Seascale residents has been obtained (51). These also indicate plutonium uptakes lower than predicted by the environmental assessment (42), although the number of cases is small and more information is highly desirable. In addition to the above specific studies of the Seascale population, MAFF have carried out some studies of specific exposure pathways for Sellafield discharges. Work on the uptake of plutonium by shellfish consumers (52) has confirmed the important value for the fraction of plutonium and americium crossing the gut wall, and given some data on absolute levels of uptake by shellfish consumers. A study of radionuclides in “whole diet” of local residents provides a check on the total intake of radionuclides in food (53). The relevance of exposure in the ‘built’ environment has been examined in studies of radioactivity in house dust carried out by NRPB and a team at Imperial College (54, 55).
These studies all serve to offer some confirmation that significant exposure pathways have not been omitted from the environmental dose assessments.
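The limited look-back of a whole body measurement follows directly from first-order clearance: with a biological half-time of about 110 days, the body content under a steady intake approaches equilibrium within a few half-times, so the measurement mainly reflects intake over the preceding months. The sketch below illustrates this; the constant daily intake rate is hypothetical.

```python
import math

# First-order clearance of caesium-137 from the body under a constant daily
# intake.  The biological half-time (about 110 days in adults) is taken from
# the text; the intake rate is a hypothetical illustration.

HALF_TIME_D = 110.0
lam = math.log(2) / HALF_TIME_D       # clearance rate per day
intake_bq_per_day = 2.0               # hypothetical constant dietary intake


def body_burden(days):
    """Body content (Bq) after a given number of days of constant intake,
    starting from zero, with first-order loss."""
    return (intake_bq_per_day / lam) * (1.0 - math.exp(-lam * days))


for t in (30, 110, 365, 1000):
    print(f"day {t:4d}: body burden {body_burden(t):6.1f} Bq")

# Equilibrium burden = intake / lambda, so the measured content is dominated
# by intake over roughly the last few half-times (months rather than years).
print(f"equilibrium burden: {intake_bq_per_day / lam:.1f} Bq")
```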
Uncertainties In Radiobiology The comparison between doses to red bone marrow from natural background radiation and radioactive discharges helps to an extent to make the assessment robust against basic uncertainties in the mechanism of leukaemia induction by radiation and uncertainty in the associated risk factors. However there are a number of uncertainties still remaining, such as: (a) Alpha (or “high LET”) radiation is known to be especially biologically damaging, but the extent may have been underestimated. (b) The radionuclides responsible for delivering the dose from discharges may behave differently in the body than predicted, and differently from ‘natural’ radionuclides especially in children or pregnant mothers, thus leading to higher doses than predicted from discharged radionuclides. (c) The ‘target organ’ for leukaemia induction may not be red bone marrow, and the discharged radionuclides may deliver a much higher dose to that organ. The alternative most frequently proposed is the tracheobronchial lymph nodes. In this paper it would not be possible to do justice to the scientific argument on this topic. In their review (42) NRPB have commented on each possibility, noting that (a) Separately comparing doses from high LET and low LET radiation does not significantly alter the conclusions. (b) The existing metabolic models are fairly robust, and tend to be cautious where there is uncertainty; available human and animal data does not indicate that transfer across the placenta is of great significance. (c) Consideration of doses to tracheobronchial lymph nodes or other target organs, such as fetal liver, does not lead to significantly different conclusions to those reached from consideration of red bone marrow. Resolution of these uncertainties and a full understanding of their relevance to leukaemia induction, to a degree which would approach ‘scientific proof’, would require substantial research and development of scientific understanding both in fundamental radiobiology and in the aetiology of leukaemias and related conditions.
TREATING ENVIRONMENTAL AND EPIDEMIOLOGICAL DATA AS ‘SYMPTOMS’ At this point, therefore, it is valuable to step back from the detailed science and to look at the relationship between available epidemiological and environmental data at a broader level, to see what information can be gained. This may be compared to diagnosis of a disease by observation and correlation of the symptoms, rather than by recourse to detailed pathological or biochemical tests. As already noted, COMARE (33) has concluded that there are significant excesses of childhood leukaemia at both Seascale and Thurso, and that their existence close to the nuclear sites of Sellafield and Dounreay merits further investigation. The question therefore naturally arises as to whether discharges from the sites are a common causative factor. In addition to considering the very detailed radiation dose assessments which have been done, useful information is gained by directly comparing the environmental measurements at the two sites.
Comparison Of Environmental Conditions At Seascale And Thurso The environmental sampling data for the two areas (2, 3, 4, 56) indicate that concentrations of radionuclides in the area around Thurso are much lower than those around Seascale. Consideration of discharge history for both sites (38, 39) confirms that this is so throughout the history of operation. This difference results in substantial differences in calculated doses to people in the area, as exemplified in Table 2. Since it would be reasonable to assume a stimulus of similar magnitude in each area would be required to constitute a common causative factor, this would suggest that discharges are unlikely to be that common factor.
Furthermore, since the spectrum of radionuclides released by each site is broadly similar, this conclusion would hold even if the radiobiological effects of some specific nuclide, say plutonium, were radically different from those assumed in the assessment. The levels of environmental contamination and associated doses in Berkshire are much lower still, and the argument would be further strengthened if it were concluded that there was an excess of leukaemia related to nuclear installations in that area.
Time Distribution Of Cases Relative To Time Variation Of Discharges And Environmental Radioactivity If discharges are a causative factor for leukaemias in Thurso and Seascale, one might expect some correlation between significant changes in discharge rates and occurrence of cases. Figures 4 and 5 indicate some correlations between cases in Seascale and Thurso and discharges to sea of alpha emitting radionuclides (ie plutonium and americium). This correlation is selected because these discharges are one of the main contributors to radiation dose, especially for high LET dose to red bone marrow, and the debate on radiobiology has concentrated on these nuclides. In terms of registration date, the cases in Thurso cluster very strongly in the period 1980–1984, but because the dates of birth vary between 1961 and 1978 the time available for induction of cases spreads over an extended period when discharges were relatively constant. At Seascale, on the other hand, cases appear to have arisen at a constant rate despite the large variation in discharge rate, and there is no evidence that the higher discharges in the late 1960s and early 1970s have influenced the occurrence rate of leukaemia, either at that time or subsequently. On this basis it is reasonable to argue that the discharges appear to be unconnected with the leukaemia excess.
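One way of making such comparisons systematic is to correlate an annual discharge series, lagged by an assumed induction period, with annual case registrations. The sketch below indicates the form such a test might take; the discharge series, the case series and the five-year lag are all invented for illustration and have no connection to the real Sellafield, Dounreay or registry data.

```python
# Sketch of a lagged comparison between annual alpha discharges to sea and
# annual leukaemia registrations.  Both series and the assumed induction lag
# are invented for illustration only.

discharges = {y: d for y, d in zip(range(1960, 1985),
              [10, 12, 20, 35, 50, 60, 80, 90, 85, 70,
               60, 55, 40, 30, 25, 20, 15, 12, 10, 8,
               6, 5, 5, 4, 4])}
cases = {y: c for y, c in zip(range(1960, 1985),
         [0, 0, 1, 0, 0, 1, 0, 0, 0, 1,
          0, 0, 1, 0, 0, 0, 1, 0, 0, 1,
          1, 0, 1, 1, 0])}

LAG_YEARS = 5   # assumed induction period (arbitrary choice for the sketch)

pairs = [(discharges[y - LAG_YEARS], cases[y])
         for y in cases if (y - LAG_YEARS) in discharges]

# Pearson correlation coefficient between lagged discharges and case counts.
n = len(pairs)
mx = sum(x for x, _ in pairs) / n
my = sum(y for _, y in pairs) / n
cov = sum((x - mx) * (y - my) for x, y in pairs)
sx = sum((x - mx) ** 2 for x, _ in pairs) ** 0.5
sy = sum((y - my) ** 2 for _, y in pairs) ** 0.5
print(f"lagged correlation (n={n}): {cov / (sx * sy):.2f}")
```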
Figure 4. Time distribution of Thurso leukaemia cases in relation to total alpha discharges to sea from Dounreay
Figure 5. Time distribution of Seascale leukaemia cases in relation to total alpha discharges to sea from Sellafield
Consideration Of Variation In Other Sources Of Radiation Exposure One of the major sources of radiation exposure to the worldwide population in the 1960s was the fall-out from testing of nuclear weapons in the atmosphere. This resulted in doses to red bone marrow substantially greater than those in Thurso from Dounreay discharges in all years, and comparable to or greater than those in Seascale from Sellafield discharges during the 1960s (Table 2, references 36–42). Moreover the doses varied sharply with a peak in the mid 1960s, and were delivered by the same principal nuclides as those released from Sellafield and Dounreay, namely isotopes of strontium, plutonium, americium and caesium. Darby and Doll (57) have considered this in detail and conclude that, since there is no evidence for a national rise in leukaemia incidence which correlates with the variation in dose from fall-out, the doses from Dounreay discharges in particular are unlikely to be responsible for any leukaemia excesses. These three examples illustrate the way in which environmental information used in a diagnostic manner can provide information complementary to that obtained by detailed examination of the radiobiological and medical science involved. Overall, the examples provide evidence against the hypothesis that radioactive discharges are a causative factor in the leukaemia excesses but of course they cannot prove the issue either way. Their real value is that they act as a severe but informative constraint on hypotheses which seek to establish a mechanism by which radioactive discharges can cause the leukaemia excesses at Seascale and Thurso, and as such would be of value in determining research priorities.
DISCUSSION AND CONCLUSIONS It is in the nature of scientific proof that it is virtually impossible to prove a negative—that is that a particular effect or event will not occur under any circumstances, or that a particular factor is not the cause of an observed effect. It would be much easier (although still very difficult) in this case to prove a positive—that is, to identify a factor which is causing excess leukaemias in the areas under investigation. The summary review in this paper indicates that causation by radiation, and in particular radioactive discharges, has been examined very intensively with no evidence being found that it is a causative factor. Indeed such evidence as has been gained indicates the contrary.
Uncertainties however still remain, and further research will continue for some time. In carrying out such further research, it is suggested that systematic correlation of the available environmental, discharge and epidemiological data would be valuable as a way of testing hypotheses concerning causation and in establishing research priorities. It may be argued that the ‘radioactivity causation hypothesis’ has already been tested quite rigorously and extensively and found to be weak. It therefore becomes appropriate to consider other possible hypotheses, and subject them to testing of similar rigour. Further impetus is given to this idea by the studies (58) which have demonstrated that highly significant excesses occur at locations remote from nuclear sites, and that environmental factors (in the widest sense) may be involved. Plausible causative factors have already been proposed (59). In the same way that environmental data for radionuclides can be of value in hypothesis testing, it is likely that environmental information on other plausible causative factors would be of great value. Whilst there is an abundance of data on radionuclide concentrations in the environment around nuclear facilities, together with methodologies for assessing human exposure and its consequences, there is a general dearth of information on other agents. There is also difficulty in obtaining such information, because of the multiplicity of possible agents and the need for specific and very sensitive measurement techniques for each. Nonetheless it is suggested that efforts applied in this area, in the context of epidemiological data on a regional or national scale, would make a valuable contribution to research in this area.
REFERENCES
1. Principles of monitoring for the radiation protection of the population (ICRP 43). Annals of the ICRP 15 1 (1985).
2. Annual Reports on Radioactive Discharges and Monitoring of the Environment. Health and Safety Directorate, British Nuclear Fuels plc, Risley (1971–1987).
3. Radioactivity in the Surface and Coastal Waters of the British Isles. Ministry of Agriculture, Fisheries and Food, Directorate of Fisheries Research, Lowestoft (1967–1987).
4. Terrestrial Radioactivity Monitoring Programme (TRAMP): Radioactivity in food and agricultural products in England and Wales, 1986. Ministry of Agriculture, Fisheries and Food, Food Science Division. MAFF publications, Alnwick (1988).
5. Cambray and Eakins: Concentrations of plutonium and caesium 137 in environmental samples from West Cumbria, and a possible maritime effect. AERE R9807, HMSO (1980).
6. Eakins, Pattenden et al: Radionuclide deposits in soil in the coastal region of Cumbria. AERE R9873, HMSO (1981).
7. Pattenden, Cambray et al: Studies of Environmental Radioactivity in Cumbria: Measurements of radionuclides in airborne and deposited material. AERE R9857, HMSO (1980).
8. Cawse: Caesium 137 and Plutonium in Soils in Cumbria and the Isle of Man. AERE R9851, HMSO (1980).
9. Eakins, Lally et al: The magnitude and mechanism of enrichment of seaspray with actinides in West Cumbria. AERE R10127, HMSO (1982).
10. Eakins and Cambray: The chronology of discharges of caesium 137, plutonium and americium 241 from BNFL Sellafield, as recorded in lake sediments. AERE R11182, HMSO (1985).
11. Howarth and Kirby: Studies of Environmental Radioactivity in Cumbria: Modelling the dispersion of radionuclides in the Irish Sea. AERE R11734, HMSO (1988).
12. Howarth and Eggleton: Studies of Environmental Radioactivity in Cumbria: Modelling of the sea-to-land transfer of radionuclides and an assessment of the radiological consequences. AERE R11733, HMSO (1988).
13. Eakins, Morgan et al: Plutonium and americium in the intertidal sands of NW England. AERE R12061, HMSO (1988).
14. Cawse: Studies of Environmental Radioactivity in Caithness and Sutherland—A Survey of radioactive nuclides in soils, peat and crops in 1979. AERE R9997, HMSO (1988).
15. Pattenden, Cambray and Playford: Environmental Radioactivity in Caithness and Sutherland—Radionuclide deposits in coastal soils 1978–1983. AERE R12792, HMSO (1988).
16. Booker: Caesium 137 in soil in the Windscale area. AERE R40240, HMSO (1962).
17. Otlet, Walker and Longley: The use of ¹⁴C in natural materials to establish the average gaseous dispersion patterns of releases from nuclear installations. Proc Radiocarbon 25, 593–602 (1982).
18. Otlet, Walker and Fulker: The transfer of ¹⁴C to foodstuffs from experiments on a controlled plot in Cumbria. Proc. 4th SRP International Symposium, Malvern, in press.
19. Bradford, Curtis and Popplewell: Radioactivity in Environmental Samples taken in the Sellafield and Ravenglass areas of West Cumbria, 1977–1982. Science of the Total Environment, 35 267–283 (1984).
20. Howard and Lindley: Aspects of the uptake of radionuclides by sheep grazing on an estuarine saltmarsh, 2, Radionuclides in sheep tissues. J. Environ. Radioactivity 2 199–213 (1985).
21. Limits for intakes of radionuclides by workers (ICRP 30). Annals of the ICRP 2 3/4 (1979).
22. Metabolic and dosimetric models for application to members of the public. NRPB GS3, HMSO (1984).
23. Kendall, Kennedy, Greenhalgh, Adams and Fell: Committed dose equivalent to selected organs and committed effective dose equivalents from intakes of radionuclides. NRPB GS7, HMSO (1987).
24. Recommendations of the International Commission on Radiological Protection (ICRP 26). Annals of the ICRP, 1 3 (1977).
25. Statement from the Paris meeting of the ICRP. Journal of the Society of Radiological Protection 5 2, 87–88 (1985).
26. Radioactive Waste Management Advisory Committee: Fifth Annual Report. HMSO (1984).
27. Interim guidance on the implications of recent revisions of risk estimates and the ICRP 1987 Como Statement. NRPB GS9, HMSO (1987).
28. A model for short and medium range dispersion of radionuclides released to atmosphere. NRPB–R91, HMSO (1979).
29. Simmonds, Linsley and Jones: A general model for the transfer of radioactive materials in terrestrial food chains. NRPB–R89, HMSO (1979).
30. Hunt: Timescales for dilution and dispersion of transuranics in the Irish Sea near Sellafield. Science of the Total Environment, 46 261–278 (1985).
31 Investigation of the possible increased incidence of cancer in West Cumbria: Report of the independent advisory group chaired by Sir Douglas Black. HMSO (1984). 32 The implications of the new data on releases from Sellafield for the conclusions of the Report on the Investigation of the Possible Increased Incidence of Cancer in West Cumbria: Committee on Medical Aspects of Radiation in the Environment (COMARE), first report. HMSO (1986). 33 Investigation of the possible increased incidence of leukaemia in young people near the Dounreay Nuclear Establishment, Caithness, Scotland: Committee on Medical Aspects of Radiation in the Environment (COMARE), second report. HMSO (1988). 34 Heasman, Kemp, Urquhart and Black: Childhood Leukaemia in Northern Scotland. The Lancet, 1, 266 (1986). 35 Barton, Roman, Ryder and Watson: Childhood Leukaemia in West Berkshire. Lancet, 8466, 1248 (1985). 36 Linsley, Dionian, Simmonds and Burgess: An assessment of the radiation exposure of members of the public in West Cumbria as a result of the discharges from BNFL Sellafield. NRPB R170, HMSO (1984). 37 The risks of leukaemia and other cancers in Seascale from radiation exposure. NRPB R171, HMSO (1984). 38 Stather, Dionian et al: The risks of leukaemia and other cancers in Seascale from radiation exposure—Addendum to Report R171. NRPB R171 Addendum, HMSO (1986). 39 Hill and Cooper (Project Managers): Radiation doses to members of the public in Thurso. NRPB R195, HMSO (1986). 40 Dionian, Muirhead, Wan and Wrixon: The risks of leukaemia and other cancers in Thurso from radiation exposure. NRPB R196, HMSO (1986). 41 Dionian, Wan and Wrixon: Radiation doses to members of the public around AWRE Aldermaston, ROF Burghfield and AERE Harwell. NRPB R202, HMSO (1987). 42 Stather, Clarke and Duncan: The risk of childhood leukaemia near nuclear establishments. NRPB R215, HMSO (1988). 43 Pentreath et al: The impact on radiation exposure of transuranium elements discharged in liquid wastes from fuel element reprocessing at Sellafield. In: Radioactive Waste Disposal, Vol. 5, Proc. IAEA Conference, Seattle, May 1983. 44 Pentreath et al: The behaviour of plutonium and americium in the Irish Sea. ICES Symposium on contaminant fluxes through the coastal zone (1986).
270
45 Fry and Sumerling: Measurements of caesium 137 in the residents of Seascale and its environs. NRPB R172, HMSO (1984). 46 Rundo: Radiocaesium in human beings. Nature 188 702 (1960). 47 Hesp: Uptake of caesium 137 due to nuclear weapons fall-out in subjects from West Cumberland. Nature 206 1213 (1965). 48 Fry and Britcher: Doses from Chernobyl Radiocaesium. The Lancet, 160– 161, July 18, 1987. 49 Jones: Derivation and validation of a urinary excretion function for plutonium applicable over tens of years post-uptake. Radiation Protection Dosimetry 11, 1, 19–27 (1985). 50 Legget and Eckerman: A method of estimating the systemic burden of Pu from urinalyses. Health Physics 52, 3, 337–346 (1987). 51 Popplewell, Ham, McCarthy and Morgan: Isotopic composition of plutonium in human tissue samples determined by mass spectrometry. Radiation Protection Dosimetry, in Press. 52 Hunt, Leonard and Lovett: Transfer of environmental plutonium and americium across the human gut. Science of the Total Environment, 53, 89 (1986). 53 Walters and Mondon: Dietary studies—Assessment of radionuclide intake by population groups in the UK. Proceedings of BNES Conference (May 1987)—health effects of low dose ionising radiation, BNES (1988). 54 Fry, Green, Dodd and Hammond: Radionuclides in house dust. NRPB R181, HMSO (1985). 55 Goddard, Minski, Thornton and Culbard; Household particulate survey (radioactivity in house dust in West Cumbria and its significance). DOE/RW/8703, Department of the Environment, London (1986). 56 Radioactive Waste Disposal by UKAEA establishments and associated environmental monitoring results (series). Safety and Reliability Directorate, UKAEA, Culceth. 57 Darby and Doll: Fall-out, radiation doses near Dounreay and childhood leukaemia. British Medical Journal 294, 603. 58 Openshaw, Craft, Charlton and Birch: Investigation of leukaemia clusters by use of a geographical analysis machine. The Lancet, 272–273, February 6, 1988. 59 Kinlen: Evidence for an infective cause of childhood leukaemia-Comparison of a Scottish new town with nuclear reprocessing sites in Britain. The Lancet, 1323–1327, December 10, 1988. 60 Gardner, Hall, Dowries and Terrell: Follow-up study of children born to mothers resident in Seascale, West Cumbria (birth cohort). BMJ 295 3 822–827 (1987).
STUDIES OF LEUKAEMIA INCIDENCE IN SCOTLAND
JAMES URQUHART
Information and Statistics Division, Scottish Health Service Common Services Agency, Trinity Park House, Edinburgh EH5 3SQ.
Abstract
Scottish Cancer Registration Data and the Small Area Population Data provided by the General Register Office for Scotland offer unique opportunities for carrying out analyses of the incidence of cancer in precisely defined areas. The paper will explore some of the problems of drawing conclusions about the distribution of a rare disease such as childhood cancer and of interpreting results relating to specific sites. Scottish Cancer Registration Data will be used to provide specific examples to illuminate these problems.
THE RELEVANCE OF POPULATION MIXING TO THE AETIOLOGY OF CHILDHOOD LEUKAEMIA
L.J.KINLEN
CRC Cancer Epidemiology Unit, University of Edinburgh, Edinburgh EH8 9JZ.
ABSTRACT
It is postulated that childhood leukaemia represents a rare response to some unidentified common infection(s), epidemics of which would be encouraged by mixing of populations with plausibly different previous experiences of infective agents. This has been tested in two studies: the first in the only rural local authority district of Scotland that received a large influx of people in the 1950s while moderately separated from a conurbation; the second concerned those new towns in Britain designated by 1950 situated away from the major conurbations of London and Glasgow. In both, highly significant excesses of leukaemia at ages 0–4 were observed, suggesting that childhood leukaemia originates in some form of infective process, and in one that is favoured by certain types of population mixing. Strong reasons would be required for supposing that this effect did not operate near nuclear reprocessing sites, so unusual is the population mixing associated with them.
INTRODUCTION
Reports of clusters of childhood leukaemia have a long history. Indeed, it is now over 50 years since Kellett suggested that an infective origin might underlie a small group of cases in Ashington in Northumberland [1]. Similarities with an outbreak of an infection were also noted in a striking concentration of 8 cases in a parish in Niles, Illinois, studied by Heath and his colleagues in the U.S. Communicable Disease Centre [2]. Formal studies to determine if childhood leukaemia clusters in time and space more often than would be expected by chance owe much to Knox, whose simple test [3] encouraged much work on the subject. The results of this work were on the whole disappointing, as was the detailed investigation of particular clusters by the US Public Health Service and others. All this does not, of course, exclude an infective origin: thus several infective illnesses do not cluster appreciably [4, 5]—consistent
with the fact that those with the illness represent a minority of those with the infection. In other words, such infections are not spread to any important extent by those with the illness. The possibility exists therefore that childhood leukaemia is such an illness, a rare consequence of a common but unidentified childhood infection (or infections). It is in general difficult to avoid widespread and common infections, though certain parents may enable their children to do so by an upbringing with particular emphasis on hygiene and mixing only with selected children. Similarly, whole communities may be relatively protected if they are sufficiently isolated from population centres with their associated pools of infection. The well-known severity of epidemics of infections in island and other isolated communities is due to the greater than average number of susceptibles available to contract the infection when it is introduced by outsiders. This continues to the present day in such places as the Highlands and Islands of Scotland, where children may even escape an almost ubiquitous infection like measles. Over the past 40 years mortality from measles among adults from that region has been almost 20 times greater than in the rest of Scotland [6]. By analogy, it was argued that the isolation and associated population influxes (both extreme) of Britain's two nuclear reprocessing sites at Dounreay and Sellafield might be relevant to the excesses of childhood leukaemia recorded in their vicinity [6].

A First Test of the Hypothesis
In theory this could be tested by investigating leukaemia incidence in areas similar in isolation and population increase but without a source of radiation exposure of the population. In a survey of all 397 Scottish local authority districts, however, no such area could be found. Instead the comparison area had to be the only rural district of Scotland that received a large influx of people in the 1950s at a time when it was even moderately separated from a conurbation [6]. This was Kirkcaldy DC in Fife in the period before the opening of the Forth Road Bridge in 1964 made access to the area much easier. The reason for the population growth in this area was the building of the new town of Glenrothes. A significant excess of leukaemia below age 25 was found there (O/E ratio 2.78, based on 10 deaths), mainly due to a greater excess below age 5 (O/E 4.7, based on 7 deaths).

New Towns
In the study mentioned above, the area of Scotland that showed a significant excess of childhood leukaemia had grown because of a new town within its boundaries. This leads logically to a similar investigation of childhood leukaemia in other new towns. Glenrothes is one of 14 new towns in Britain designated by the government by 1950, listed in Table 1. Of these, 8, forming a loose ring round London, were intended primarily to accommodate people and employment dispersed from the city, while another (East Kilbride) was to serve the same function for the highly congested city of Glasgow [7]. These 9 conurbation-linked towns have sufficient in common to be considered together (group A) [7]. The others served other purposes and were built in largely rural areas further removed from conurbations, and form another group (B). Three (Aycliffe, Peterlee and Glenrothes) were entirely new towns built in open
TABLE I NEW TOWNS IN BRITAIN DESIGNATED BY 1950
* This is the year of designation except for Corby which alone among the above towns started to increase a few years before it was designated in 1950 [7].
country, one (Cwmbran) was in Monmouthshire and the other, Corby in Northamptonshire, was notable in receiving large numbers of people from Scotland. Compared to the overspill (A) towns close to London and Glasgow, group B towns would involve more mixing of hitherto separated groups of people with probable differences in their experience of infective agents. It is relevant that such separation may be a feature not only of the incomers in relation to the previous residents, but also of one group
of incomers compared to another. On the present hypothesis, group B is more likely to show an increase in childhood leukaemia. The excess found in the previous study in Kirkcaldy District of Fife (which includes Glenrothes) had ended by 1961, suggesting that by this year the epidemic of the infection (of which leukaemia is a rare consequence) was over, the number of susceptibles having declined below some critical level. Mortality from childhood leukaemia has therefore been examined in these two groups of new towns in two periods (i) from the start of their population increases to 1960, and (ii) from 1961 to 1980. None of these towns is as isolated as Sellafield and Dounreay have continued to be, so any increase of childhood leukaemia might be expected to occur, as in Glenrothes, only in an early period. Since the decision to regard 1961 as ending the early period came from inspection of the Glenrothes data, the findings for this town have been shown separately from the others. Table II shows preliminary findings from a current study [8], namely the observed numbers and observed to expected ratios of deaths from leukaemia at ages 0–4 in the two time periods in the new towns grouped as described above. A significant increase is found in group B (away from London and Glasgow) in the early period, but in neither period in Group A, the conurbation-linked towns. The absence of an excess in the later period is consistent with an epidemic of an infective disorder, the number of susceptibles having by then declined below some critical level. There were no significant excesses at ages 5–9, 10–14 or 15–24 in either period in groups A or B.
TABLE II Leukaemia mortality at ages 0–4 years in new towns established by 1950: observed numbers and observed to expected (O/E) ratios by calendar period
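The observed to expected (O/E) ratios presented here can be illustrated with a short calculation. The sketch below is a minimal illustration, not part of the original analysis; it assumes Python is available and uses the figures quoted above for Glenrothes, where 10 deaths below age 25 and an O/E of 2.78 imply roughly 3.6 expected deaths. It computes the O/E ratio and the one-tailed Poisson probability of recording at least the observed number of deaths when the expected number is known.

```python
import math

def poisson_tail(observed: int, expected: float) -> float:
    """One-tailed probability of at least `observed` events when the
    expected number is `expected`: P(X >= observed), X ~ Poisson(expected)."""
    below = sum(math.exp(-expected) * expected**i / math.factorial(i)
                for i in range(observed))
    return 1.0 - below

# Glenrothes figures quoted in the text: 10 leukaemia deaths below age 25
# with O/E = 2.78, implying roughly 10 / 2.78 = 3.6 expected deaths.
observed, expected = 10, 10 / 2.78
print(f"O/E = {observed / expected:.2f}")
print(f"P(at least {observed} deaths | {expected:.1f} expected) = "
      f"{poisson_tail(observed, expected):.4f}")
```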
The finding of a significant excess of leukaemia at ages 0–4 in all the group B towns in the early period, and not only in Glenrothes, represents additional evidence for the relevance of population mixing to the aetiology of this disorder.

Specific Factors Associated with Population Influxes
It should not be supposed that the mixing implied by the marked growth of a town influences herd immunity in any simple way. Several factors may be involved.
(i) Increases in the numbers of susceptibles exposed. This is the most obvious factor, and in the case of an influx into a remote area the prevalence of susceptibles among the local population would be expected to be higher than the average. But incomers may also be unusual in this respect if they come from similar rural areas or are from high social classes, as in Seascale near Sellafield where only higher grade employees at the nuclear site were allocated houses.
(ii) Changes in herd structure. Major changes in the incidence of infectious diseases can result however from altering the structure of the community without changing the total number of susceptibles [9]. In a new community or when large numbers of people move into an existing town, changes in the pattern of contacts between infections and susceptibles are inevitable.
(iii) Large doses of infective agents. The dose of an infective agent is important in determining the form of the infection, including the incidence of complications. In epidemics many individuals by definition are infected at the same time, so that large doses of the relevant agent are more frequently received (from repeated exposures) than in sporadic infections.
(iv) Crowding. A special factor may be crowding, if only in school classrooms, which is at least made more likely when populations increase markedly. In this respect, feline leukaemia may offer a parallel. Without laboratory tests, infectivity is not evident among free-roaming urban cats, but only among cats living in large numbers in the same house; here, the high incidence after introduction of the virus is attributed partly to the large doses consequent on repeated exposures received by animals living in close proximity.
(v) Calendar period. Improvements in communication and the increase in car ownership since the war must have greatly reduced the differences in herd immunity from one area of Britain to another. It may be relevant therefore that the increase in childhood leukaemia in certain new towns occurred in the 1950s and late 1940s.
The Relevance of the New Town Data for Nuclear Sites
These findings in new towns seem to point strongly to an origin of childhood leukaemia in some sort of infective process, and specifically in one that is favoured
by population mixing. This has obvious relevance to the excesses of childhood leukaemia recorded near Britain's nuclear reprocessing sites at Sellafield and Dounreay. Construction at these sites represented civil engineering operations on a massive scale. This and the subsequent operation of these plants required the influx of large workforces into these isolated areas, inevitably with the mixing of different groups of people with plausibly different experiences of infective agents. Differences exist between the new towns and the two nuclear sites, in the size and timing of the excesses, and in the age groups affected. It is possible that these partly represent differences in the pattern of social interactions of people, which are well known to be complex and difficult to study. Indeed, these factors must differ between a new town set up in open country and an existing town into which people move.

Niles Revisited
The most arresting and intensively studied cluster of childhood leukaemia was that in Niles [2]. No fewer than 8 cases were concentrated among the pupils of a single parish school and their siblings in the years 1957–60. Unlike the excesses in the new towns [6,8], the Niles cluster was brought to the attention of the investigators by local people, and not because of any previous hypothesis. Nevertheless it is striking that Niles has certain features in common with the British new towns studied here. For Niles grew from a population of 3,587 in 1950 to 20,393 in 1960, most of the increase occurring in 1955–60 in the parish mentioned above, resulting in class sizes of about 50 in the parochial school, about double those in the nearby public schools. Despite being fairly close to Chicago, it is not impossible that the unaccustomed population mixing and classroom overcrowding in Niles encouraged an epidemic of an infection similar to that in the British new towns. It is of interest that class sizes of near 50 were recorded in Glenrothes at the time of the excess of leukaemia [10].
CONCLUSION
Evidence is therefore mounting that childhood leukaemia originates in some type of infective process and that certain types of population mixing are conducive to its occurrence. Strong reasons would be required for supposing that this effect did not operate near nuclear reprocessing sites, so unusual is their demographic pattern.
REFERENCES
1. Kellett, C.E. Acute leukaemia in one of identical twins. Arch. Dis. Child. 1937, 12, 239–252.
2. Heath, C.W. and Hasterlik, R.J. Leukaemia among children in a suburban community. Amer. J. Med. 1963, 34, 796–812.
3. Knox, E.G. The detection of space-time interactions. Applied Statistics, 1964, 13, 25–29.
4. Nye, F.J. and Spicer, C.C. Space-time clustering in infectious mononucleosis. Brit. J. Prev. Soc. Med. 1972, 26, 257–258.
5. Goldacre, M.J. Space-time and family characteristics of meningococcal disease and Haemophilus meningitis. International Journal of Epidemiology, 1977, 6, 101–105.
6. Kinlen, L. Evidence for an infective cause of childhood leukaemia: comparison of a Scottish New Town with nuclear reprocessing sites in Britain. Lancet, 1988, 2, 1323–1327.
7. Osborn, F.J. and Whittick, A. New Towns, Their Origins, Achievements and Progress. Leonard Hill, London, 1977.
8. Kinlen, L.J. and Chalmers, K. Childhood leukaemia in the new towns of Britain designated by 1950. In preparation, 1989.
9. Miles, A. Herd infection and herd immunity. In: Topley & Wilson's Principles of Bacteriology, Virology and Immunity, ed. G.S. Wilson, A.A. Miles and M.T. Parker. Edward Arnold, London, 1983, Vol. 1, 413–428.
10. Kinlen, L.J. A case-control study of childhood leukaemia in the Glenrothes area of Fife 1954–60. In preparation, 1989.
The role of ionising radiation in the aetiology of the leukaemias
R.A.CARTWRIGHT
Director, Leukaemia Research Fund Centre for Clinical Epidemiology, University of Leeds, Department of Pathology, 17 Springfield Mount, Leeds LS2 9NG, United Kingdom
Introduction
There is often considerable confusion in the mind of the public as to what the role of ionising radiation might be in relation to the possible aetiology of the leukaemias. This uncertainty is partly due to the apparently conflicting messages coming from scientists as reported in the media, and partly to the apparent lack of consensus amongst research workers.
At the present time the risk factors for childhood ALL (as one example of the various subtypes of leukaemia) encompass neonatal X-ray exposure, lack of vaccination and possible unusual patterns of infection (McKinney et al 1987, Van Steensel-Moll 1985, Kneale et al 1986, Knox et al 1983), chloramphenicol use (Shu et al 1987) and very little else. There is no good evidence that gamma or neutron exposures are linked with childhood or adult ALL, and there is no universally accepted risk factor for adult ALL, although one study suggests that pesticide exposures are important (McKinney et al 1989).
It is the purpose of this presentation to try and put these apparently unconnected observations into perspective and explain the background to a proposed new research effort on certain types of leukaemias.
Problems in Leukaemia Epidemiology
There are certain problems associated specifically with leukaemia epidemiology which make this disease group particularly difficult to study. These include:
1. The fact that the leukaemias are a heterogeneous group of conditions for which evidence exists to suggest they have a wide variety of different (but possibly overlapping) causal agents. Table 1 gives some details of the different types and their abbreviations. Figure 1 shows the age specific incidence rates for these types except CLL and for several very rare types pooled as 'others'. CLL is unusual in its biology and is generally regarded as a lymphoma-like condition. No studies have been able to link CLL occurrence with excessive exposure to ionising radiation. It is very likely that each leukaemia has a different biological pathogenic pathway. It is also now emerging that conditions once thought to be homogeneous, such as ALL, have many biological subtypes; for example, a large number of cases of the rare group of ALL diagnosed in the first year of life have a specific 4;11 chromosomal translocation not found in other age groups, except in the rare cases of ALL occurring in those aged over 70.

Table 1 Types of leukaemias occurring in the UK in recent years

Figure 1
2. Certain types of leukaemias have experienced major changes in survival over the past two decades. In particular, since the mid-1970s ALL in younger people has achieved a survival of nearly 60% at 5 years after diagnosis. Thus studies which use mortality as an end point are very difficult to interpret if they span the 1970s.
3. There have been major changes in diagnostic practices for these conditions, which has meant that the 9th edition of the International Classification of Diseases coding system (ICD 9) has not been able to keep up with the new concepts and categories of these conditions. This is particularly the case with modern definitions of AML (Bennett et al 1985a and 1985b) but also applies to some other subgroups.
4. More worrying for epidemiologists in the UK are the deficiencies in the registration of new cases of these conditions by the cancer registry scheme. This national scheme was set up in 1957 in England and Wales and should record all malignancies as new case registrations. A comparison made from the LRF Centre in Leeds for 3 such regional registries (Alexander et al 1989) showed that, of the leukaemias (ICD 9 codes 204 to 208), 39% of cases were registered both by a novel system of data recording set up by the LRF Centre in Leeds and by the three cancer registries after a year's lapse in case registrations. The majority (55%) had not been registered by the national scheme by that time, whilst 6% had been missed by the LRF system. It was earlier observations of this type which led to the setting up of a unique method of recording leukaemia information from certain parts of the UK, known as the LRF Data Collection Survey. Furthermore, the national scheme as a whole is very slow at producing statistics; currently 1985 is the last complete year of published data for England and Wales.
5. The descriptive observations on leukaemia 'clusters' have led to much effort being dissipated on post hoc studies attempting to detect causality. This is a vain effort; at best, cluster analysis can create hypotheses for testing elsewhere.
The Leukaemia Research Fund Data Collection Survey
This survey provides more accurate and up-to-date statistics on the distribution of the subtypes of leukaemia in parts of the UK. The purpose of the survey is to generate descriptive data to assist in the formation of new hypotheses regarding causes of leukaemia, as well as to provide the means whereby the leukaemias can be quickly and accurately found for further studies.
It should be emphasised that cases in the DCS all have positive haematological criteria for diagnosis; for example, cases based on death certificate data only are not used (unlike in the national scheme). The case's residence at diagnosis is used as the criterion for allocation to a specific area and all age groups are registered, although analyses are confined to those aged under 85 at diagnosis.
The survey started in 1984 in the areas shown in figure 2 and is ongoing. So far over 25,000 cases of leukaemia and lymphoma have been registered. In parallel with this descriptive approach, various relatively new biological observations have enabled two hypotheses to be created, specifically designed to be tested, concerning possible aetiological factors in ALL. The descriptive epidemiology of ALL for the DCS is useful in illustrating how these interactive hypotheses have been formulated.
Figure 2
Figure 3
Figure 3 shows the crude standardised registration rates by county for the years 1984–1986. These 3 years were used because their data have been thoroughly crosschecked from many sources. The map shows that some areas, such as Lancashire and South Wales, have low rates and others, in Cumbria and the South West peninsula, are higher. These are rates for cases aged 0–84, of which roughly half are under 15 years at diagnosis. Cumbria, Cornwall, Devon and Somerset also have the highest rates for children with ALL in the same time period. Results for 1987 and 1988 have not been fully crosschecked and probably represent an underestimation of cases. However, when informally analysed for all age groups, they also show that Cumbria and the South West of England have higher rates, in the same pattern as the earlier years.
What may be emerging from these observations is that a geographical variation occurs in the distribution of ALL cases in different parts of the UK. When smaller divisions of land (county districts and electoral wards) are examined, a similar pattern of high and low rate areas emerges, broadly based on the county distribution. At the district level the variation is statistically significant for ALL for all age groups combined, but not for children alone, although the variation amongst younger patients is in the same direction as in all ages combined.
If this is a true observation, it becomes necessary to postulate a geographically based environmental factor in the pathogenesis of the condition, particularly for the wider age group.
Such a hypothesis may be supported from other sources, for example the reverse-dose effect studies on animal models that have been reported by several laboratories (Humphreys et al 1985). These studies suggest that differential and fractionated contact with alpha-emitters might provide a trigger for leukaemogenesis at doses low enough to be found in the common environment. If so, areas with more alpha-emitters in the natural background may well have more leukaemia cases. This might be true of naturally occurring radionuclides in certain parts of Britain and in specific areas where alpha-emitters are concentrated, such as river estuaries (MacKenzie 1987). A study of the distribution of leukaemia among residents of electoral wards abutting such estuaries has shown an overall statistically significant excess for cases of AML, CML and ALL combined, and a non-significant excess for ALL alone.
Further evidence, suggesting that Po210 is uniformly distributed across the marrow space from the bone-volume-seeking Pb210, and that methods are available for detecting differences in blood/marrow levels of Po210 (Henshaw et al 1984), provides a possible testable mechanism based on the proximity of alpha decay tracks to marrow stem cells.
Greaves Hypotheses
If low fractionated doses of alpha-emitters are linked to leukaemogenesis in ALL, it is likely to be related to the older children and adults who have had enough time to acquire a sufficient burden of alpha-emitting radionuclides.
The childhood peak of ALL, in UK cases roughly aged 3–6, is not likely to be explained simply by this mechanism. More acceptable here might be the hypothesis developed by Greaves (Greaves 1988). He suggests that the cell of origin for the common-ALL subtype (which forms the peak in children) is a pre-B cell which naturally and rapidly expands in postnatal life as a result of specific environmental stimuli, particularly through early and non-specific infections. The hypothesis states that this rapid expansion of cells will lead to miscoding divisions and leukaemogenesis, particularly if the stimuli leading to this developmental phenomenon are unusual, e.g. delayed infections which generally do not follow the pattern of environmental exposure within which the human population has evolved over the last few hundred thousand years.
One might expect, therefore, that more primitive societies would have fewer cases of childhood ALL and no childhood peak, and that 'Western' societies would have a peak and more childhood ALL. This is in fact the case (Parkin et al 1988). The stimuli to the
neonatal immune system (such as patterns of infection, patterns of vaccination, length of breast feeding and total calorie intake) will also vary within such societies as they do in our own. Already there is some evidence that the patterns of infection in early life are different in case children than in controls (McKinney et al 1987), as is the pattern of vaccination. The observations regarding excess chloramphenicol use might also fit in here (Shu et al 1987), and possibly the observation we have made about case cluster distribution in ALL in childhood (Alexander et al 1989). The LRF Centre in Leeds has also observed that more ALL cases occur in electoral wards whose structure in the 1981 census was at the upper socioeconomic extremes, suggesting a further possible explanation for the observed variation in ALL in the DCS areas. It is unlikely that a specific virus or other infectious agent is causally involved in ALL in the childhood peak if this hypothesis is correct. Major aspects of this hypothesis are testable by case-control methodology linked to a multidisciplinary scientific approach.
Other hypotheses exist concerning risk factors for ALL, but the testing of these two complementary ones will represent a significant step forward in our understanding of this disease.
In conclusion, two hypotheses have emerged for ALL as a result of a combined biological approach coupled with epidemiological
analyses of modern, high quality data sets. The latter were created because of the difficulties in using pre-existing data sets. It is hoped that this approach will lead to progress in understanding other types of leukaemia in the near future.
Acknowledgements
The work for Leeds described in this paper was supported by the Leukaemia Research Fund. My colleagues in Leeds, Dr P A McKinney and Dr F E Alexander, have assisted with data analyses and collection, whilst Miss Lorraine Harvey typed the text.
References
Alexander FE, Cartwright RA, McKinney PA and Ricketts TJ. The small area distribution of leukaemias and lymphomas in England and Wales: some results from the Leukaemia Research Fund Data Collection Study. Submitted, J Epidemiol and Comm Med, 1988.
Alexander FE, Ricketts TJ, McKinney PA and Cartwright RA. Cancer registrations of leukaemias and lymphomas: results of a comparison with a specialist registry. Comm Med 1989 (accepted).
Bennett JM, Catovsky D, Daniel MT, Flandrin G, Galton DAG, Gralnick HR, Sultan C. Criteria for the diagnosis of acute leukaemia of megakaryocytic lineage (M7): a report of the French-American-British co-operative group. Ann Intern Med 1985; 103:460–462.
Bennett JM, Catovsky D, Daniel MT, Flandrin G, Galton DAG, Gralnick HR, Sultan C. Proposed revised criteria for the classification of acute myeloid leukaemia: a report of the French-American-British co-operative group. Ann Intern Med 1985; 103:620–629.
Greaves MF. Speculations on the cause of childhood acute lymphoblastic leukaemia. Leukemia 1988; 2:120–125.
Henshaw DL, Heyward KJ, Thomas JP, Fews AP. Comparison of the alpha-activity in the blood of smokers and non-smokers. Nuclear Tracks and Radiation Measurements 1984; 8:453–456.
Humphreys ER, Loutit JF, Major IR and Stones VA. The induction by 224Ra of myeloid leukaemia and osteosarcoma in male CBA mice. Int J Radiat Biol 1985; 47(3):239–247.
ICD-9 International Classification of Diseases: Manual of the International Statistical Classification of Diseases, Injuries and Causes of Death. Based on the recommendation of the 9th Revision Conference (1975), Vol 1, WHO, Geneva, 1977.
Kneale GW, Stewart AM, Kinnear Wilson AM. Immunization against infectious diseases in childhood cancer. Cancer Immunol Immunotherapy 1986; 21:129–132.
Knox EG, Stewart AM, Kneale GW. Foetal infection, childhood leukaemia and cancer. Br J Cancer 1983; 48:849–852.
Mackenzie AB, Scott RDS, Williams TM. Mechanisms for northwards dispersal of Sellafield waste. Nature 1987; 329:42–45.
McKinney PA, Alexander FE, Roberts BE et al. Yorkshire case-control study of leukaemia and lymphoma: parallel multivariate analysis of seven disease categories. In preparation.
McKinney PA, Cartwright RA, Saiu JMT et al. A case-control study of aetiological factors in childhood leukaemia and lymphoma. Arch Dis Child 1987; 62:279–287.
Parkin DM, Stiller CA, Draper GJ and Bieber CA. The international incidence of childhood cancer. Int J Cancer 1988; 42:511–520.
Shu X-O, Gao Y-T, Linet MS et al. Chloramphenicol use and childhood leukaemia in Shanghai. Lancet 1987; 2:934–937.
Van Steensel-Moll HA, Valkenburg HA, Van Zanen GE. Childhood leukemia and parental occupation: a register-based case-control study. Am J Epidemiol 1985; 121:216–224.
A METHOD OF DETECTING SPATIAL CLUSTERING OF DISEASE
S OPENSHAW, University of Newcastle
D WILKIE, UKAEA, Windscale
K BINKS and R WAKEFORD, BNFL, Risley
M H GERRARD, Tessella, Abingdon
M R CROASDALE, CEGB, London
ABSTRACT
A statistical technique has been developed to identify extreme groupings of a disease and is being applied to childhood cancers, initially to acute lymphoblastic leukaemia incidence in the Northern and North-Western Regions of England. The method covers the area with a square grid, the size of which is varied over a wide range and whose origin is moved in small increments in two directions. The population at risk within any square is estimated using the 1971 and 1981 censuses. The significance of an excess of disease is determined by random simulation. In addition, tests to detect a general departure from a background Poisson process are carried out. Available results will be presented at the conference.
INTRODUCTION
The vexed question as to whether childhood leukaemia cases "cluster" in space and time more than would be expected by chance alone has frequently been addressed in the medical literature, but with limited success. If such excess clustering does occur, it could provide clues concerning the aetiology of childhood leukaemia, for example whether environmental or
genetic in nature. However, investigations have been hampered by the low registration rate of childhood leukaemia (~ 35 cases per million children per year), by doubts about the quality of the available data(1) and by the tendency of some researchers to define the criteria for assessing the significance of clusters after they have been observed(2).
Interest has been revived in the leukaemia clustering issue by the observation of apparent excesses of childhood leukaemia cases near some nuclear establishments in Britain. Interpretation of results is complicated by the post hoc nature of some of the methods employed(3); but there is evidence of an excess childhood leukaemia incidence around certain installations(4). However, radiological assessments continue to imply that exposure to ionising radiation is most unlikely to be the cause of these excesses(5) and other causes, such as an infective aetiology, have been proposed as an alternative explanation for these observations(6). It may be that there are several factors, some or all of which may interact.

This lack of prior knowledge about possible causation and thus of testable hypotheses encourages the adoption of an exploratory style of epidemiological analysis. By examining the data for evidence of nonspecific departures from a Poisson process, it may be possible to generate hypotheses that can be tested using data from other regions or by a case-control study within the original area of interest. Because so little is known about the nature of leukaemogenesis, the resulting exploratory analysis has to adopt a highly non-parametric approach of potentially low power and sensitivity: no assumptions can be safely made about the nature of any possible deviations from a Poisson distribution and this forces the adoption of a testing procedure which is inherently conservative. Thus some clusters may be missed because of lack of sensitivity. Nevertheless, it is hoped that the analysis may provide some clues which could be profitably pursued in other studies. In this paper we will discuss the various problems encountered in the investigation.
The analysis has been confined initially to childhood (conventionally defined as the 0–14 year age range) acute lymphoblastic leukaemia (ALL) incidence. ALL is the commonest leukaemia type occurring in childhood (~ 80% of cases) and is also the commonest childhood malignancy (~ 25% of cases). In addition, particular attention has been paid to the biology of
ALL and the implications for the distribution of cases. Other cancer types, or indeed other diseases, could also be investigated using the technique.
The purpose of this paper is to outline the data and methodology that are being used to locate clusters in space for a chosen time period and to determine the probability of the cluster occurring by chance. The effect of varying the time period will be considered at a later stage.
ANALYSIS DATA
Source Data
The study area is that covered by the Northern and the North-Western Health Regions of England. These Regions correspond respectively to the post–1974 counties of Cleveland, Durham, Tyne and Wear, Northumberland and Cumbria, and to the post–1974 counties of Lancashire and Greater Manchester (Figure 1).
Childhood (0–14 years of age) cancer registration details for this area were obtained from the Northern Region Children’s Malignant Disease Registry and the Manchester Children’s Tumour Registry. Cases studied are those diagnosed between 1968 and 1987, this being the time period for which full and validated case details are available for the whole study area. High quality childhood cancer registration data are available for this area: it is believed that at least 98% ascertainment is achieved, and the diagnoses have been reviewed centrally(7)(8). Registration details supplied by the cancer registries are: diagnosis, year of diagnosis, age at diagnosis, sex, and grid reference of place of residence at diagnosis. No other case details have been supplied for reasons of medical confidentiality.
Childhood population estimates have been obtained from the 1971 and 1981 censuses. Population resolution is at the census enumeration district (ed) level, which is the highest areal resolution generally available for population data. The study area possesses about 16,000 eds, each ed
Figure 1. Map of the study area
containing, on average, about 100 children; but the childhood population of eds ranges from 1 to around 500. The geographical size of eds varies considerably between urban and rural areas. Childhood population details are available for the two sexes in three 5 year age bands for each ed in each census.
Population Estimates
For each ed, population data for males and females in the age groups 0–4, 5–9 and 10–14 years are used, generating six age-sex groups. The ed structure of the study area changed between the two censuses, but each ed in each census possesses a centroid with a 100m grid reference to locate the ed population. Consequently, for each 100m grid reference location, a weighted mean population for each age-sex group during the 1968–87 period may be calculated. 1968–87 population estimates for the six age-sex groups are built up from these 100m grid reference populations.
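The weighting of the two census counts is not specified here; the sketch below is therefore only a minimal illustration (equal weights for the 1971 and 1981 counts are assumed, and the grid references and counts are hypothetical) of how weighted mean populations keyed by 100m grid reference and age-sex group might be assembled.

```python
# Hypothetical sketch of the population estimation step described above.
# The census weights are not given explicitly; equal weights are assumed
# here purely for illustration.
from collections import defaultdict

def weighted_mean_populations(census_1971, census_1981, w1971=0.5, w1981=0.5):
    """census_* : dict mapping (grid_ref_100m, age_sex_group) -> child count.
    Returns a dict holding the weighted mean population for each key."""
    estimate = defaultdict(float)
    for key, count in census_1971.items():
        estimate[key] += w1971 * count
    for key, count in census_1981.items():
        estimate[key] += w1981 * count
    return dict(estimate)

# Example: one ed centroid present in both censuses, one only in 1981
# (ed boundaries changed between censuses, but each centroid carries its count).
pop71 = {((345200, 512700), ("M", "0-4")): 60}
pop81 = {((345200, 512700), ("M", "0-4")): 48,
         ((346900, 513100), ("F", "5-9")): 35}
print(weighted_mean_populations(pop71, pop81))
```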
Registration Rates
The numbers of cases registered as resident within either the Northern or the North-Western Region at diagnosis are available for each year within the 1968–87 period. These registrations are broken down into the chosen six age-sex groups. The cancer registries have re-checked case details as part of a quality assurance exercise. Regional population estimates for the census years 1971 and 1981 have been obtained by summing ed populations, and Regional population estimates for other years within the study period have been obtained from linear interpolation and extrapolation on these two values. (This linear approximation has been checked as being reasonable by comparison with N and NW Regional Health Authority mid-year population estimates for each year in the 1968–87 period). Regional registration rates for each year have been obtained by dividing annual registration numbers by these population estimates. Tests for trends with time for these year-by-year age-sex specific registration rates for each Region have been carried out using the SAS package. For
ALL, no statistically significant (two-tailed test) trends were detected at the 0.05 level. In addition, the N and NW Region age-sex specific registration rates for the full 20 year period have been compared. For ALL, no statistically significant Regional differences were detected at the 0.05 level.
Therefore, ALL age-sex specific registration rates for use in the analysis have been derived by taking the total number of ALL registrations in each age-sex group throughout the whole study area during 1968–87, and dividing by the weighted mean of the 1971 and 1981 populations for the study area.
Expected Numbers
Expected numbers of case registrations for a given area are obtained by applying these age-sex specific registration rates to age-sex group population estimates for the area. These population estimates are the sum of the 100m grid reference weighted mean 1971 and 1981 populations contained within the area. Cook-Mozaffari et al(9, p 35) have shown this method for population estimation to be a good approximation for pre–1974 local authority areas. Overall expected numbers (males and females, 0–14 years of age) are obtained by summing these six age-sex group expected numbers.
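A minimal sketch of this step is given below. It is illustrative only: the group populations are hypothetical, and the rates are expressed as registrations per child of weighted mean population over the whole 1968–87 period, which is consistent with the roughly 35 cases per million children per year quoted earlier.

```python
# Illustrative sketch (not the authors' code): expected ALL registrations
# for an area, given study-wide age-sex specific rates expressed as cases
# per person of weighted mean population over the whole 1968-87 period.
def expected_cases(area_populations, rates):
    """area_populations: dict age_sex_group -> weighted mean population summed
                         over the 100m grid references lying within the area.
    rates:              dict age_sex_group -> registrations per person (1968-87).
    Returns the overall expected number of registrations for the area."""
    return sum(rates[group] * pop for group, pop in area_populations.items())

# Hypothetical numbers for a single square (six age-sex groups).
area_pop = {("M", "0-4"): 410.0, ("M", "5-9"): 395.0, ("M", "10-14"): 380.0,
            ("F", "0-4"): 390.0, ("F", "5-9"): 385.0, ("F", "10-14"): 370.0}
rates = {g: 7e-4 for g in area_pop}   # ~35 cases per million children per year x 20 years
print(f"expected cases in square: {expected_cases(area_pop, rates):.2f}")
```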
Observed Cases
A 100m grid reference for the place of residence at diagnosis has been determined by the registries from the address post-code for each registered case. A 100m grid reference for a case will not necessarily coincide with a 100m grid reference for a population estimate. For comparison purposes, a second dataset of case locations has been generated by assigning a case to the grid reference of the nearest population estimate for the relevant age-sex group, or, if this is not a unique location, randomly to one of the nearest points according to age-sex group population size. Observed numbers of case registrations for a given area are obtained by summing the number of cases with grid references lying within the area.
STATISTICAL ANALYSIS
Outline And Limitations
The statistical analysis is based on these expected and observed numbers of childhood ALL cases, and is intended to answer two questions.
1. Where, and what, are the extreme groupings of cases that might constitute clusters in space, and what statistical significance may be attached to these groupings, assuming that the underlying risk of ALL is dependent on age and sex only?
2. Can a general deviation from an underlying age-sex adjusted Poisson (random) selection process be detected?
Clearly, these questions are not independent, since if an underlying non-Poisson distribution of cases exists, then one should expect cases to "cluster" together (or spread apart) in space to an extent which is greater than that due to chance alone. On the other hand, a background Poisson process does not preclude the existence of a small number of spatially-limited "clusters" of cases due to a specific but unidentified localised aetiological factor. The answers to these two questions must be considered together, and should be regarded as providing foundations upon which other studies may be constructed. As such, the present analysis can usefully be viewed as hypothesis-generating as well as hypothesis-testing.
The adopted statistical analysis technique is a development of the descriptive methodology devised by Openshaw et al(10). As a "cluster finding" tool (to provide an answer to Question 1) the method may be regarded as a systematic search for extreme groupings of cases: the study area is exhaustively covered, in a regular and repeatable fashion, by a grid of squares of varying size and origin. In this way, the analysis "homes in" on case groupings which are extreme as measured by the one-tailed Poisson probability of obtaining at least the observed number of cases contained in a square, given the expected number of cases in the square.
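The quantity on which the search "homes in" can be made concrete with a short sketch (not the authors' implementation; scipy is assumed to be available, and the square identifiers and counts are hypothetical): for each square the one-tailed cumulative Poisson probability of obtaining at least the observed number of cases, given the expected number, is evaluated, and squares are ranked by that probability.

```python
# Schematic of the square-by-square test described above.
from scipy.stats import poisson

def square_poisson_probability(observed: int, expected: float) -> float:
    """One-tailed P(X >= observed) for X ~ Poisson(expected)."""
    return poisson.sf(observed - 1, expected)

def scan_squares(squares):
    """squares: iterable of (square_id, observed, expected) triples.
    Returns the squares ranked by how extreme their case grouping is."""
    scored = [(square_poisson_probability(obs, exp), sq, obs, exp)
              for sq, obs, exp in squares]
    return sorted(scored)          # smallest probability first

# Hypothetical squares from one placement of the grid.
example = [("A", 1, 1.2), ("B", 6, 1.5), ("C", 0, 0.8)]
for p, sq, obs, exp in scan_squares(example):
    print(f"square {sq}: O={obs}, E={exp}, P(X>=O)={p:.2e}")
```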
The problem with interpreting the cumulative Poisson probability so obtained is that allowance must be made for the total number of tests of significance carried out, and for the degree of overlap of the analysis squares. The lattice size and origin undergo a systematic pattern of shifts in an attempt to overcome the arbitrariness of the grid. However, the scale of non-independence contained within the method renders an analytical solution, if not impossible, at least so complex as to be computationally unfeasible for any realistically sized dataset. As a consequence, the significance of a given case grouping for any cell is to be assessed through the use of random simulations. The search technique is being applied to 10⁴ random simulations of the ALL case distribution, with cases assigned to ed centroids according to expected numbers.
Random Simulations
For each of the 10⁴ simulated datasets, the search technique is exactly the same as that executed on the observed data. The significance of a given case grouping ("cluster") in the real data is then the frequency with which simulated datasets produce at least one "cluster" with a cumulative Poisson probability at least as small as that associated with the real case grouping. The 10⁴ simulations enable a significance level of 0.01 to be determined with reasonable precision (ie, a reasonably narrow confidence interval).
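A schematic of this simulation-based assessment is sketched below. It is a simplification of what is described above, not the published code: the grid geometry is reduced to a fixed collection of squares, each represented by the ed centroids it contains, and numpy and scipy are assumed to be available.

```python
# Schematic of the simulation-based significance assessment described above.
import numpy as np
from scipy.stats import poisson

rng = np.random.default_rng(0)

def min_tail_probability(cases_by_centroid, squares, expected_by_centroid):
    """Smallest one-tailed Poisson probability over all squares, where each
    square is a list of centroid identifiers."""
    best = 1.0
    for members in squares:
        obs = sum(cases_by_centroid[c] for c in members)
        exp = sum(expected_by_centroid[c] for c in members)
        best = min(best, poisson.sf(obs - 1, exp))
    return best

def significance(real_min_p, expected_by_centroid, squares, n_sims=10_000):
    """Proportion of simulated case distributions (cases drawn at ed centroids
    as Poisson counts with the expected means) whose most extreme square is at
    least as extreme as the most extreme square in the real data."""
    hits = 0
    for _ in range(n_sims):
        sim = {c: rng.poisson(mu) for c, mu in expected_by_centroid.items()}
        if min_tail_probability(sim, squares, expected_by_centroid) <= real_min_p:
            hits += 1
    return hits / n_sims
```

In use, the smallest tail probability found anywhere in the real data would be passed as real_min_p, and the returned proportion is the Monte Carlo significance of the most extreme real grouping.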
To test for a general departure from a Poisson distribution of cases (ie to provide an answer to Question 2) the above method has been modified so that squares do not overlap, on the grounds that any deviation from a Poisson distribution should show up in a single grid. The study area is therefore covered by a single exhaustive regular grid of squares with an arbitrary origin. Statistical tests are applied to this grid which measure the deviation of the observed distribution from that expected for a Poisson process. The sensitivity of the results to square size is also investigated.
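The particular tests are not specified at this point; one simple statistic that could serve, assumed here purely for illustration, is the Poisson dispersion statistic, the sum of (O - E)^2/E over the non-overlapping squares, with its null distribution obtained by simulation because the expected counts per square are small.

```python
# Illustrative (assumed) goodness-of-fit check for departure from a Poisson
# distribution of counts over a single non-overlapping grid of squares.
import numpy as np

rng = np.random.default_rng(1)

def dispersion_statistic(observed, expected):
    observed, expected = np.asarray(observed, float), np.asarray(expected, float)
    return np.sum((observed - expected) ** 2 / expected)

def dispersion_p_value(observed, expected, n_sims=10_000):
    """One-sided (over-dispersion, i.e. clustering) check: proportion of
    simulated grids whose statistic is at least as large as the observed one."""
    expected = np.asarray(expected, float)
    real = dispersion_statistic(observed, expected)
    sims = [dispersion_statistic(rng.poisson(expected), expected) for _ in range(n_sims)]
    return sum(s >= real for s in sims) / n_sims

# Hypothetical non-overlapping grid of 6 squares.
obs = [0, 3, 1, 0, 2, 7]
exp = [0.9, 1.1, 1.0, 0.8, 1.2, 1.0]
print(f"dispersion p-value: {dispersion_p_value(obs, exp):.3f}")
```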
The methods have been developed on the CRAY2 supercomputer at the Harwell Laboratory, with results postprocessed on a Silicon Graphics Iris 4D20G Workstation. It is hoped to transfer the technique to a more accessible facility at some stage of the development, but the 10⁴ simulations will be carried out on the CRAY2. Full details of the statistical analysis and the results for the ALL cases will be published at a later date.
DISCUSSION
Data Inaccuracies
In the previous section, the childhood cancer registration data and the population data required to generate age-sex adjusted expected numbers (and observed numbers) of registrations for a given area have been described. The statistical analysis techniques applied to these data are also outlined. Unfortunately, it is inevitable that, despite the care taken in collecting and collating these data, some errors will remain. Moreover, the nature of the data dictates the use of certain approximations within the analysis. For example, although the high quality of the childhood cancer incidence data held at the Newcastle and Manchester registries is widely recognised, a number of address post-coding errors probably exist, and 100m grid references obtained from post-codes will be imprecise to some degree. In addition, ALL may be disaggregated into a number of subtypes which may have different aetiologies; but subtyping is not available for all cases diagnosed in the earlier part of the time period.
Population data at the ed level may only be located through the use of the grid reference of the ed centroid. The locational precision of population estimates will probably be lower for larger eds, ie for rural areas. Also, small area population estimates are only available for the 1971 and 1981 census years, necessitating an approximation for other years covered by the study. This approximation will be of variable accuracy.
Sensitivity Studies
The problems posed by these data inaccuracies and approximations may be addressed to some extent by sensitivity analysis. However, there may be other causes of extra-Poisson variation and it will be useful to investigate their impact.
Population migration presents difficulties to population-based studies of cancer incidence. Apart from the effect on population estimates, migration during the latency period of a cancer means that place of residence at diagnosis may not be the place of residence at which exposure to some environmental carcinogen occurred. Therefore, migration can confound any causal relationship, and without properly constituted population at risk data on an annual basis there is no way of avoiding these problems. For a number of childhood cancers, the place of residence of the mother during pregnancy may be the more appropriate piece of geographical information. However, owing to the high mobility of young families, national studies may be required to assess accurately the importance of the distribution of childhood cancers by place of birth, rather than limited regional studies.
However, even this is not likely to be completely adequate, since knowledge of intermediate locations and durations would also be required. The effects of these various sources of error and uncertainty will be to make it even more difficult to detect clusters (they may even create false clusters, although this is not likely). It also means that improving geographical resolution without also improving knowledge of space-time histories may not be worthwhile. The only immediate attack on these problems is to incorporate basic data uncertainty simulations and representations in the analytical procedure itself.
Statistical Power
In order that the results of a statistical study may be properly interpreted, some investigation of statistical power (the probability of detecting a genuine excess risk) must be made. We intend to carry out such an analysis. However, a rough indication can be given. If the total number of
squares tested is n, then the null expected number of squares with a Poisson probability of 1/n or less will be about 1. Since n is of the order of 1 million for the area being studied, the criterion for significance is a stringent one.
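A rough illustration of this stringency, treating the overlapping squares as if they were independent tests (which they are not; the random simulations deal with that dependence), is to ask how many cases a single square must contain before its one-tailed Poisson probability falls below 1/n with n of the order of a million. The sketch below assumes scipy is available and uses hypothetical expected counts; the point is simply that the observed count must be many times the expected count before a square can stand out.

```python
# Treating the ~10^6 overlapping squares as if they were independent tests,
# a square only stands out if its one-tailed Poisson probability is below
# about 1/n = 1e-6.
from scipy.stats import poisson

def cases_needed(expected: float, threshold: float = 1e-6) -> int:
    """Smallest observed count whose tail probability P(X >= k) drops below
    the threshold, for X ~ Poisson(expected)."""
    k = 0
    while poisson.sf(k - 1, expected) >= threshold:
        k += 1
    return k

for expected in (0.5, 1.0, 2.0, 5.0):
    print(f"E = {expected}: at least {cases_needed(expected)} cases needed")
```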
It has to be recognised that the “cluster finding” technique described in the previous section can only be expected to identify the most extreme case groupings in the study area. This is the consequence of the lack of a specific a priori hypothesis which would allow a particular area, time period or age group to be examined for evidence of an elevated risk. The general nature of the method implies low power against any mechanism which does not produce “clusters” with very low associated Poisson probabilities. Thus, the failure of the technique to identify an excess of cases in a particular area does not necessarily imply that an elevated risk does not exist in that area. The ability of the method to identify a significant excess (the power of the method) will depend on the nature of the causal factor which is acting to raise the risk, ie on how the elevated risk is manifested as observed cases. Nevertheless, those groupings which are identified as significant by the technique should be worthy of further study.
The analysis of the broad spatial distribution of cases does not suffer from this particular problem, partly because it is the whole distribution, rather than the extreme upper tail of the distribution, which is being examined. The sensitivity of the method depends on the power of the analysis against a given type of deviation from a Poisson process. It is possible that a tendency for cases to “cluster” to an extent greater than that expected by chance alone can be detected, but that individual case “clusters” are not identified because the method does not possess the power to find those non-Poisson “clusters” which do exist. This must be borne in mind when interpreting results.
Hypothesis Testing
Any individual clusters found significant may generate hypotheses for further investigation, but it would be unwise to exhaust the available
data for hypothesis generation and leave no independent data for testing hypotheses. It is a matter of judgement as to when a proper balance between using sufficient data to produce relevant hypotheses, and leaving enough data for testing, has been struck. The subsequent test of hypothesis should employ preselected analysis boundaries, dictated by the nature of the proposed causal mechanism, and will inevitably be more powerful than the broad “hunting” methodology of hypothesis-generation that has been used. Thus, Kinlen(6) was able to generate a hypothesis concerning the aetiology of childhood leukaemia which was related to isolation and population influx. This hypothesis was generated by the initial observations of case excesses at Seascale and Thurso. Kinlen tested this hypothesis by collecting childhood leukaemia mortality data for a specific preselected area (Glenrothes) and time period (1951–67). This test must be considerably more powerful than a search technique because the number of tests carried out (ie, the opportunity made available to produce a low probability by chance) is severely limited by the prior hypothesis. It is quite likely that a search method would not have identified the Glenrothes excess because of its limited power, even though Kinlen’s work indicates that a genuine causal mechanism may be detected. Nevertheless, Kinlen’s hypothesis was constructed upon initial reports of case excesses, and it is these that the above search technique will attempt to identify.
Generalisation Of Method
This description of analysis methods has concentrated on acute lymphoblastic leukaemia incidence in the 0–14 year old age group over the full 20 year time period. It may be that non-Poisson structure is only apparent in narrower age groups and shorter time intervals. The search technique may be applied to age and time as well as to space; but because this will necessitate a larger number of tests to be carried out, deviations from a Poisson process will have to be even more extreme to be identified in this way. Again, an improved understanding of childhood leukaemogenesis would allow greater power to detect case groupings of interest by limiting the search technique to relevant time periods and age groups.
Finally, there is no reason why the analysis should be confined to acute lymphoblastic leukaemia incidence. We shall be applying the methodology to other childhood cancer types, including acute myeloid leukaemia.
CONCLUDING REMARKS
In this paper we have outlined a statistical technique to identify extreme groupings of cancer cases and to determine whether the underlying distribution of cancer cases is Poisson. The methodology has been finalised and results will be available soon.
The analysis technique is being applied to childhood cancer cases from the Northern and North-Western Regions of England, for which high quality registration data are available from the Newcastle and Manchester Registries. Expected numbers of cases are based on 1971 and 1981 census data, and on age-sex specific registration rates for the study area.
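The calculation of expected numbers implied here is an indirect standardisation: the age- and sex-specific registration rates for the study area are applied to the census populations of the district in question. The sketch below uses invented strata, populations and rates purely to show the arithmetic.

# Sketch of indirect standardisation (invented strata, populations and rates):
# expected cases = sum over age-sex strata of person-years at risk x rate.

rates_per_100k = {                       # registration rates per 100,000 person-years
    ("0-4",  "M"): 4.5, ("0-4",  "F"): 3.8,
    ("5-9",  "M"): 2.9, ("5-9",  "F"): 2.4,
    ("10-14", "M"): 2.1, ("10-14", "F"): 1.8,
}
population = {                           # census population of one district by stratum
    ("0-4",  "M"): 1800, ("0-4",  "F"): 1700,
    ("5-9",  "M"): 1900, ("5-9",  "F"): 1850,
    ("10-14", "M"): 2000, ("10-14", "F"): 1950,
}
study_years = 20                         # length of the study period

expected = sum(population[s] * study_years * rates_per_100k[s] / 100_000
               for s in rates_per_100k)
print(f"expected cases in the district over {study_years} years = {expected:.2f}")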
The detection or otherwise of any extreme groupings will have to be interpreted in the light of the findings of the tests of the underlying distribution of cases, and of the results of power calculations. The stringency of the “cluster finding” technique implies that only the most extreme case groupings will stand out against the background of chance fluctuations. This low power is the price that is paid for the generality of the technique, but we can be reasonably confident that any cluster detected will be worthy of further investigation.
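One way of organising the power calculations referred to above is by simulation: generate case counts under a Poisson model with a localised excess imposed, apply the detection rule, and record how often the excess is flagged. The rule and parameters in the sketch below are deliberate simplifications and are not those of the actual analysis.

# Simulation-based power calculation (simplified, hypothetical parameters;
# not the detection rule used in the actual analysis).
import numpy as np
from scipy.stats import poisson

rng = np.random.default_rng(1)

n_areas = 400              # small areas in the region
baseline = 0.5             # expected cases per area over the study period
relative_risk = 5.0        # excess imposed on a single area
alpha = 0.05 / n_areas     # Bonferroni-adjusted per-area threshold
n_sims = 2000

detected = 0
for _ in range(n_sims):
    means = np.full(n_areas, baseline)
    means[0] *= relative_risk                    # one genuine cluster
    counts = rng.poisson(means)
    p_values = poisson.sf(counts - 1, baseline)  # P(X >= observed) in each area
    if (p_values < alpha).any():
        detected += 1

print(f"estimated power to detect the imposed excess = {detected / n_sims:.2f}")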
ACKNOWLEDGEMENT
We are grateful to Drs Alan Craft and Jill Birch and their colleagues for providing the childhood cancer registration details required for the analysis. Martin Croasdale has participated in this research and its publication with the permission of the CEGB.
REFERENCES
1. Swerdlow, A.J., Cancer registration in England and Wales: some aspects relevant to interpretation of the data. J. R. Statist. Soc. A, 149, 146–160, 1986.
2. Glass, A.G., Hill, J.A. and Miller, R.W., Significance of leukemia clusters. J. Pediat., 73, 101–107, 1968.
3. Wakeford, R., Binks, K. and Wilkie, D., Childhood leukaemia and nuclear installations. J. R. Statist. Soc. A, 152, 61–86, 1989.
4. Committee on Medical Aspects of Radiation in the Environment (COMARE), Second Report. London: HMSO, 1988.
5. Stather, J.W., Clarke, R.H. and Duncan, K.P., The risk of childhood leukaemia near nuclear establishments. Report NRPB-R215. London: HMSO, 1988.
6. Kinlen, L., Evidence for an infective cause of childhood leukaemia: comparison of a Scottish new town with nuclear reprocessing sites in Britain. Lancet, ii, 1323–1327, 1988.
7. Craft, A.W. and Amineddine, H.A., The Northern Region Children's Malignant Disease Registry 1968–82, incidence and survival. Br. J. Cancer, 56, 853–858, 1987.
8. Birch, J.M., Marsden, M.B. and Swindell, R., Incidence of malignant disease in childhood: a 24-year review of the Manchester Children's Tumour Registry data. Br. J. Cancer, 42, 215–223, 1980.
9. Cook-Mozaffari, P.J., Ashwood, F.L., Vincent, T., Forman, D. and Alderson, M., Cancer incidence and mortality in the vicinity of nuclear installations, England and Wales, 1959–80. Studies on Medical and Population Subjects No. 51. London: HMSO, 1987.
10. Openshaw, S., Charlton, M., Craft, A.W. and Birch, J.M., Investigation of leukaemia clusters by use of a geographical analysis machine. Lancet, i, 272–273, 1988.
PREDICTION OF THE EFFECT OF SMALL DOSES: INCONSISTENCIES IN THE EPIDEMIOLOGICAL EVIDENCE

PROF. R. DOLL
Cancer Epidemiology and Clinical Trials Unit, University of Oxford,
Gibson Building, The Radcliffe Infirmary, Oxford OX2 6HE
Abstract
Estimates of the incidence of cancer following exposure to doses of ionizing radiation less than 0.1 Gy are based primarily on extrapolation from observations of the mortality of residents of Hiroshima and Nagasaki who survived higher doses from the atomic bomb explosions in 1945, supplemented by observations on groups of patients exposed in the course of therapeutic or diagnostic medical procedures, on employees exposed in the course of their work, and on children exposed in utero to diagnostic radiography, and guided by the results of experiments on animals. The total risk that is predicted to follow small doses depends crucially on assumptions about (a) the temporal distribution of induced cases, (b) the relationship that the induced risk bears to the risk from other causes, and (c) in the case of alpha radiation, the distribution of absorbed radionuclides throughout the body. Not all the observations are concordant and some of the assumptions on which the predictions are based are open to question.
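Assumption (b) amounts to the choice between an additive (absolute-risk) and a multiplicative (relative-risk) projection model. The sketch below uses invented baseline rates and excess-risk coefficients, not published estimates, simply to show how differently the two models carry the same observed excess forward to older ages.

# Illustration only (invented rates and excess coefficients, not published
# risk estimates): additive versus multiplicative projection of an excess
# observed during limited follow-up of an exposed group.

baseline_rate = {50: 0.002, 60: 0.006, 70: 0.015}   # background rate per person-year by age

additive_excess = 0.001    # extra cases per person-year, assumed constant with age
relative_excess = 0.5      # proportional increase in the baseline, assumed constant with age

for age, rate in baseline_rate.items():
    print(f"age {age}: additive model {rate + additive_excess:.4f} per year, "
          f"multiplicative model {rate * (1 + relative_excess):.4f} per year")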