Following a recent paper in the Proceedings of the OXMI 2020 Conference, published in Radiation Protection Dosimetry in 2021, Mike has now published peer-reviewed papers in seven decades, from the 1960s to the 2020s.
Most of the papers fall within the field of the diagnostic radiological sciences, including radiation protection, and draw upon experimental, theoretical, review-and-analysis, professional and philosophical work. The topics covered reflect the wide-ranging and fundamental technological developments that have taken place from the 1960s to the 2020s, as well as the impact these have had on both the clinical and the scientific/technological aspects of diagnostic radiology. This has included every aspect of the X-ray imaging process, from the design of X-ray tubes to the physical performance of film-screen combinations and digital imaging systems (including patient dose assessments) in radiography, mammography, CT, fluoroscopy and conventional tomography, as well as the underpinning imaging science.
Quality assurance and quality control formed an important part of the scientific outputs in the 1970s and 1980s, and these led to the standardisation of test protocols for all imaging modalities. Prior to this, a wide variety of test methods were employed in scientific publications to try to quantify system performance. Attempting to understand the relationship between resolution, contrast detection and noise analysis was an important consideration, particularly in CT, which had broken the mould in that it sacrificed spatial resolution for contrast detection. Much of this work was undertaken when test equipment was rudimentary. For example, the kV of an X-ray unit could only be assessed in the field by means of an Ardran-Crooks penetrameter, which at the time was a significant development in QA practice. Before that, the tube kV was measured by inserting a potentiometer across the high-voltage terminals of the generator. Equally, X-ray outputs were measured with ionisation chamber dosemeters, and the waveform was examined using an oscilloscope to display the signal generated from a large biscuit-tin ionisation chamber. The development of QA test equipment for medical imaging is now a major commercial/industrial activity. However, the performance of modern test equipment is still largely based upon the basic principles of these earlier methods.
Because test equipment was often rudimentary, test methods often required a degree of improvisation. This was demonstrated whilst testing one of the first four clinical prototype EMI 1010 CT scanners in the mid-1970s, which had been purchased by Manchester University. To assess the kV, the X-ray beam collimation was removed in order to image an Ardran-Crooks penetrameter whilst the X-ray tube was held stationary during the exposure!
[Figure: IPEM Topical Report: An evidence and risk assessment based analysis of the efficacy of quality assurance tests on fluoroscopy units, part II: image quality — Dan Shaw, Mark Worrall, Chris Baker, Paul Charnock, Jason Fazakerley, Ian Honey, Gareth Iball, Manthos Koutalonis, Mandy Price, Caroline Renaud, Amy Rose, Tim Wood]
The work undertaken during this period led to some of the first dedicated quality-control instrumentation for diagnostic radiology. The Sensi-densitometer, developed to monitor the performance of film processors, and the Dose-box, developed to monitor the consistency of X-ray outputs, were two instruments designed in Liverpool that were sold commercially. The practical application of the work presented in publications was always a primary consideration, as is to be expected in a field of applied science.
The interplay between noise and resolution was an important consideration during the transition from analogue to digital radiographic imaging systems, which was gaining momentum in the late 1980s and 1990s. There was a realisation that the resolution of a digital system was unlikely to match that achievable with film-screen combinations, which could reach resolutions of up to 10 line pairs/mm. However, the performance of a digital system needed only to match the visual performance of the eye, whose optimum resolution is roughly 2 line pairs/mm, so that 4 line pairs/mm (a pixel size of 0.125 mm) would suffice for most applications. The development of suitable image display systems was nevertheless an important requirement: screen-film systems provided both the detection and display processes, so that film gamma was important for contrast detection, whereas in digital systems these two stages were now separated. For a 30 × 40 cm chest image providing 4 lp/mm resolution, a display capable of 2400 × 3200 pixels was required. The Japanese, amongst others, were aggressively pursuing the development of flat panel image displays with suitable resolution and contrast capabilities. Of course, new test methods were required for these display systems.
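The arithmetic behind these figures is simple enough to set out explicitly. The short Python sketch below (illustrative only; the function and its layout are not drawn from the original work) reproduces the numbers quoted above.

```python
# Resolving f line pairs/mm requires two pixels per line pair (the Nyquist
# criterion), i.e. a pixel size of 1/(2*f) mm; the display matrix then
# follows from the field size.

def required_matrix(field_mm, lp_per_mm):
    """Return the display matrix needed for a field of the given size (mm)
    at the target resolution (line pairs/mm)."""
    pixel_mm = 1.0 / (2.0 * lp_per_mm)          # 4 lp/mm -> 0.125 mm pixels
    return tuple(round(side / pixel_mm) for side in field_mm)

# A 30 x 40 cm chest image at 4 lp/mm:
print(required_matrix((300.0, 400.0), 4.0))     # -> (2400, 3200)
```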
In the 1980s, as part of the exploration of the potential of digital radiographic imaging, work was undertaken on a prototype high-resolution digital radiographic system based upon ionography. This utilised a high-Z gas detector operating at atmospheric pressure and involved the direct electronic readout of an electrostatic image produced on aluminium-backed plastic foil, providing resolutions of up to 4 line pairs/mm. The digital images produced enabled the application and study of image processing techniques (smoothing and edge enhancement) on radiographic images acquired at dose levels commensurate with those employed with existing film-screen combinations. Previously, such studies had required the digitisation of film images. A high-resolution VDU was required in these studies. Fortunately, a small company in Sandbach, Cheshire, was able to provide a 1600 × 1200 pixel display based upon conventional flying-spot technology employing broad-band signal amplifiers, a capability significantly higher than the 500 × 500 resolution available on standard TV systems.
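For illustration, the two processing operations mentioned above can be sketched in a few lines of modern Python. This is a generic reconstruction, assuming the image is a 2-D numpy array of pixel values; it is not the software used in the original ionography studies.

```python
import numpy as np

def smooth(img):
    """3x3 mean filter: average each pixel with its eight neighbours,
    suppressing noise at the cost of spatial resolution."""
    img = np.asarray(img, dtype=float)
    padded = np.pad(img, 1, mode="edge")
    out = np.zeros_like(img)
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            out += padded[1 + dy : 1 + dy + img.shape[0],
                          1 + dx : 1 + dx + img.shape[1]]
    return out / 9.0

def edge_enhance(img, weight=1.0):
    """Unsharp masking: add back the difference between the image and its
    smoothed version, sharpening edges while also amplifying noise."""
    return img + weight * (img - smooth(img))
```

The pairing makes the noise-resolution trade-off discussed earlier explicit: smoothing trades resolution for noise suppression, while edge enhancement does the reverse.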
It is worth noting that during the 1970s there was a great deal of interest in the potential of electrostatic techniques for X-ray imaging, particularly the Xerox process. At this time the Hunt brothers, Nelson Bunker and William Herbert Hunt, had nearly cornered the global market for silver: the price rose from $11 an ounce in September 1979 to $50 an ounce in January 1980. Since medical X-ray film was a major consumer of silver, the pursuit of non-silver imaging techniques for medical applications received much attention.
Based upon the experience gained in this programme of work, the development and application of test methods, and the physical evaluation of commercial digital radiographic imaging systems employing computed and direct digital imaging methods, were pursued in the early 2000s as such systems became available to local departments of radiology. These studies helped to form the basis of standard test protocols for modern digital systems, as well as helping to demonstrate scientifically their suitability for clinical applications.
Throughout all the work undertaken since the 1970s, the development and assessment of test phantoms for the quantification of image quality was a fundamental pursuit across all imaging modalities. Indeed, it was the work of the Hospital Physicists' Association Diagnostic Radiology Topic Group in the 1970s that led to the Leeds Test Phantoms, developed by George Hay at Leeds General Infirmary to test the fluoroscopic systems in his hospital, becoming a gold-standard method for the assessment of the imaging performance of fluoroscopic systems throughout the UK. The NHS procurement department was even prepared to accept the performance measured with the Leeds test objects as the basis for acceptance of an installation, and hence for payment to the manufacturer of the outstanding instalment, usually 10%. Building on these initiatives, phantoms have since been produced by the Leeds team for mammography and digital subtraction imaging systems as well as CT. In fact, Leeds Test Objects Ltd, which was eventually spun out of the Leeds Medical Physics NHS department, is still a leader in this field.
During the late 1980s, the Commission of the European Communities (CEC) Radiation Protection Research Programme was established, with one important strand based on the development and application of Image Quality Criteria for radiographic and mammographic as well as CT images. These Criteria utilised the presentation of normal anatomy as a measure of image quality, by defining the desirable appearance of relevant key structures as determined by radiologists. In this way, every patient could be considered a test phantom. The aim of this work was to try to establish quantitative links between patient dose and image quality. The work was pursued by a panel of expert radiologists as part of a Europe-wide scientific research programme aimed at underpinning the development and application of the EC Directive concerned with radiation protection of the patient in diagnostic radiology. The intention was to provide a scientifically based yardstick for the optimisation of image quality for each individual X-ray examination when an appropriate dose was employed.
[Figure: Fifth Framework CEC Radiation Protection Research Group]
In the underlying studies, the assessment and grading of image quality was undertaken by groups of expert human observers. It is interesting to note that such an approach may be relevant to the quality control of automated clinical image assessments employing Artificial Intelligence (AI) and Machine Learning (ML) systems: the utilisation of sets of graded training images could, perhaps, help to develop their image reading capabilities.
In the 1990s and 2000s, the work underpinning publications centred on the application of IT in quality control and radiation protection, including the development and utilisation of software tools for the audit and analysis of both QA and patient dose data. This included the use of the electronic patient examination records then available from Radiology Information Systems (RIS) as a basis for the automation of patient dose audits. Thus, large-scale data sets involving hundreds or thousands of records could be employed for the establishment of local and regional Diagnostic Reference Levels (DRLs); previously, national guidance had proposed collecting up to 10 dose values for a particular examination in an X-ray room. The RIS-based approach also included the development and use of web-based centralised dose-data management and analysis systems. Such was the pace of IT development during this period that techniques constantly had to be modified to accommodate and exploit rapidly evolving capabilities, including web-based applications for data collection, analysis and presentation of results.
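As a simple illustration of the kind of analysis involved, the sketch below derives a reference level from a set of RIS dose records. The record layout is assumed for illustration, and the percentile convention shown (the third quartile of the per-room median doses) is one common approach; this is not the actual software developed at the time.

```python
import statistics
from collections import defaultdict

def reference_level(records, examination):
    """Group RIS dose records by X-ray room, take the median dose for the
    given examination in each room, and return the third quartile of those
    medians (requires records from at least two rooms)."""
    by_room = defaultdict(list)
    for room, exam, dose in records:
        if exam == examination:
            by_room[room].append(dose)
    room_medians = [statistics.median(doses) for doses in by_room.values()]
    return statistics.quantiles(room_medians, n=4)[2]   # Q3 = 75th percentile

# e.g. records = [("Room 1", "Chest PA", 0.15), ("Room 2", "Chest PA", 0.22), ...]
```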
More recent publications, in the 2010s, have been concerned with the review and analysis of the application of the fundamental principles of radiation protection to the field of medical imaging. This has included the scientific limitations that result from a focus on patient dose as the basic tool for optimisation. The primary objective of any medical imaging procedure is to produce the best possible image(s) from an examination, so as to assure an optimum clinical diagnosis, whilst being assured that an acceptable patient dose is employed. Equally, only examinations that can knowingly provide clinical benefit should be performed. Hence knowledge of image quality and the corresponding clinical efficacy are primary considerations in meaningful optimisation strategies. Thus, any decision to undertake an X-ray examination should rest on the patient meeting specific clinical criteria, and cost-risk-benefit considerations should underpin the decision-making process. Both are areas where science can have a significant impact, operating as it does at the interface between image production and diagnostic outcome.
ICRP is presently undertaking a major review of the framework and principles of radiation protection to ensure it is fit for purpose in the future. This undertaking, which may take up to a decade to complete, appears somewhat analogous to the ecumenical councils of the early Christian church, which were convened periodically to regulate matters of discipline and doctrine; it will be interesting to see the outcomes.
The work underlying the publications was to a large extent undertaken in collaboration with young scientists as part of their training programmes, but it also involved collaborations over many years with senior scientists from within the UK and throughout Europe. Hopefully, the publications help to highlight the fact that diagnostic radiology is an extremely fruitful and worthwhile area for the application of science, as well as demonstrating its importance to a field that has operated at the forefront of technological development for more than 50 years. The exciting possibilities on the horizon, represented by the application of AI and ML, as well as data-analytic techniques, to one of the most widely employed and valuable diagnostic methods, indicate that diagnostic radiology will continue to be a worthwhile and fruitful area for the application of scientific knowledge and understanding.
Dr Mike Moores, Director
March 2022