
Introduction

This article describes the different sets of indicators that have been developed to assess the performance of OSH infrastructures, policies and legislation. The use of OSH performance indicators is widespread in business; however, sets of indicators developed for specific areas will not be treated here (see [1] for details). First, the article examines why indicators are used, describing the types of indicators and the difficulties in developing them. Second, it gives an account of the different sets of indicators. Finally, it considers their benefits and limitations.

The need for indicators

In policy making, indicators are important for monitoring the status quo and for measuring progress against policy objectives (see: Impact assessment). Hence, they serve as a basis for policy changes (see: The role of legislation in OSH management). This applies universally to policies and strategies at all levels (company, regional, national and global), and to both the public and private sectors.

Policy making involves looking at a range of options. A crucial purpose of the process is directing resources to prioritised fields of activity. Setting priorities, however, requires political legitimation, especially when the status quo is difficult to describe adequately; priorities may then be based on incomplete information. Indicators are used in most policy areas, such as the economy and the environment, but also in areas where quantitative evaluation is more difficult, e.g. education.

Detailed OSH data are not sufficiently available. Assessing the general OSH situation is extremely challenging due to the size of the field: there are millions of different workplaces and companies, operating under different national legislative frameworks, though these may be harmonised in political unions such as the EU. For a reasoned decision, policy makers and stakeholders need basic data, such as:

  • quantity and quality of occupational risks
  • workload and exposure
  • organisational and technical measures for prevention (and their costs)
  • outcomes, i.e. well-being and health or adverse effects, such as accidents and diseases, and their costs for society.

OSH performance indicators are used to measure and assess the status quo and the development in the main areas of OSH:

  • occupational hazards and work-related ill-health
  • prevention in companies, e.g. safety of technologies, internal OSH organisation including indicators for participation and cost-benefit
  • the performance of the public OSH infrastructure, e.g. collaboration between institutions, quality of enforcement, cost-benefit
  • the performance of the private OSH infrastructure, e.g. OSH associations, prevention services, etc.

This article aims to provide a comprehensive list of core indicators, the related methods of data collection, and consideration of the application, relevance and limitations of indicators.

Concepts and definitions

Indicandum and indicators

Policy objectives are formulated in terms of conceptual domains or abstract categories, referred to as the ‘indicandum’ (the subject to be indicated) [2]. An indicandum is hardly measurable or assessable in itself. Typical OSH examples are ‘prevention culture’, ‘prevention level’, ‘compliance’, ‘effectiveness of the public system’, ‘smart regulation’, ‘well-being of workers’, ‘cooperation and collaboration’, ‘safety level’ (of the technology and organisation), and ‘competence’ of OSH professionals.

To assess the indicandum, less abstract, measurable categories are needed. These are called indicators; an indicator captures a subset or element of the broader indicandum in a measurable and assessable form. The process of formulating scales or questions to assess a certain domain is also called operationalisation. An indicandum is usually measured by more than one indicator.

An illustrative example is the national evaluation of measures to reduce musculo-skeletal diseases. It is possible to create two meaningful indicanda, e.g. ‘company compliance with existing OSH legislation on protection’, and ‘health promotion - rate of companies who actively promote measures to combat musculo-skeletal diseases’.

For the first indicandum, the indicator would be the rate of companies that comply with regulations to prevent excessive physical workload. The field of legislation is too broad to measure with just one indicator; the indicators relate to regulations such as those that prevent heavy lifting and excessive vibration, or that oblige companies to use ergonomic equipment. These compliance rates can be assessed. The second indicandum, ‘active promotion’, is split into various measures, e.g. sports facilities and changes to working practices; it covers voluntary activities. Other indicanda might include ‘issuing of protective standards by standardisation organisations’, the ‘rate of suppliers selling ergonomic equipment in different sectors’, etc.

There are cases where one indicator is sufficient: where the indicandum and the indicator are identical. An OSH example is when the indicandum is specific, e.g. the rate of accidents per 1,000,000 workers. However, there still might be concerns about the reliability of this indicator, which means that other criteria (indicators) need to be set in order to assess the rate of underreporting. National or international OSH strategies typically include objectives (indicanda) of this kind. This is the simplest case as far as measurability is concerned, provided that measurable and assessable data is available – in this case on work accidents.

There are other cases where the indicators are so complex that sub-indicators are required; these are generally called parameters. If the indicandum is ‘prevention level’, and one of the indicators is ‘quality of risk assessment’, then the different aspects of risk assessment form the parameters or sub-indicators, e.g. organisation of OSH, workplace conditions, safety of the equipment used, and work organisation.

Sometimes it is necessary to split up indicators horizontally, i.e. the same indicator is used to study the situation at different workplaces. If the indicandum is ‘prevention level of manual heavy lifting processes’, the indicator should be specified and the data checked for several workplaces and sectors. In such cases, large amounts of data are required from different sectors and workplaces to perform an overall assessment of the prevention level.
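To make these relationships concrete, the following minimal Python sketch scores one indicator (‘quality of risk assessment’) through its parameters, splits it horizontally across two sectors, and averages the results into a rough estimate for the indicandum (‘prevention level’). All names, scores and the simple averaging are illustrative assumptions, not an established OSH method.

```python
# Illustrative sketch only: hypothetical parameter scores (0-100) for the
# indicator 'quality of risk assessment', split horizontally across sectors.
from statistics import mean

risk_assessment_parameters = {
    "construction": {"osh_organisation": 70, "workplace_conditions": 55,
                     "equipment_safety": 80, "work_organisation": 60},
    "retail":       {"osh_organisation": 85, "workplace_conditions": 75,
                     "equipment_safety": 90, "work_organisation": 70},
}

def indicator_score(parameters):
    """Aggregate the parameters (sub-indicators) into one indicator value."""
    return mean(parameters.values())

# The same indicator evaluated horizontally, sector by sector.
per_sector = {sector: indicator_score(p)
              for sector, p in risk_assessment_parameters.items()}

# A real assessment of the indicandum 'prevention level' would combine
# several indicators; here it is reduced to this single one for brevity.
prevention_level_estimate = mean(per_sector.values())

print(per_sector)                 # {'construction': 66.25, 'retail': 80}
print(prevention_level_estimate)  # 73.125
```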

Often there are no appropriate and direct indicators available, and so-called ‘proxies’ have to be used. For instance, a study is to be carried out into how many workers fully comprehend OSH. It is assumed that there are language difficulties for non-native speakers. Short comprehension tests would be ideal but probably not feasible in most cases. Possible proxies are nationality or birthplace of the workers.

Weighting indicators

The calculation method used to derive an overall result from different indicators has to be made transparent. For example, if three of twelve sectors perform very well, two perform well, four are average and one falls below average, a calculation is needed to arrive at an overall assessment.

In other cases, some indicators might be weighted more heavily. For example, high-level safety technology may be regarded as more important than a documented risk assessment; in a final assessment of the prevention level, safety technology then counts for more than a good risk assessment. How to weight the indicators is a decision for committees, evaluators, or researchers.

The following example illustrates this: SMEs in the construction industry often do not prepare a written risk assessment. Whilst this scores negatively in an OSH evaluation, the actual working situation is not necessarily unsafe, as observations have shown. According to the reports of labour inspectors, a large proportion of the companies in this industry do work safely: equipment is new and well maintained, and staff work safely and are well informed about safety and health. Again, how these results are weighted needs consideration.
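The Python snippet below is a purely illustrative sketch of such choices: it assigns assumed numeric values to the verbal ratings of the twelve-sector example above (the two sectors without a rating are simply left out) and then combines two indicators with unequal, assumed weights.

```python
# Hedged sketch of a weighted overall assessment. The rating scale, the
# numeric values assigned to each rating, and the weights are illustrative
# assumptions; in practice they would be set by committees, evaluators or
# researchers, as noted above.

rating_values = {"very good": 4, "good": 3, "average": 2, "below average": 1}

# The twelve-sector example from the text: 3 very good, 2 good, 4 average,
# 1 below average (the remaining 2 sectors are assumed unrated here).
sector_ratings = (["very good"] * 3 + ["good"] * 2 +
                  ["average"] * 4 + ["below average"] * 1)

overall_sector_score = (sum(rating_values[r] for r in sector_ratings)
                        / len(sector_ratings))
print(round(overall_sector_score, 2))  # 2.7 on the 1-4 scale

# Weighting indicators differently, e.g. safety technology counting more
# than a documented risk assessment (weights are assumed):
indicator_scores = {"safety_technology": 3.5, "documented_risk_assessment": 2.0}
weights = {"safety_technology": 0.7, "documented_risk_assessment": 0.3}

weighted_score = sum(indicator_scores[k] * weights[k] for k in indicator_scores)
print(round(weighted_score, 2))  # 3.05
```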

Quality of indicators

The scientific challenge is to define indicators and parameters to best represent all aspects of the indicandum. However, this is often a balancing act between the best possible indicators and the data available. A number of associations, scientific research groups, and government institutions have developed criteria for the appropriateness of performance indicators (for references see chapter 5). Major criteria are [3]:

  • Clear relation between indicators and indicandum (directness)
  • Quantitative indicators (where possible)
  • Disaggregated indicators covering e.g. the performance in different sectors, regions etc.
  • Good access to underlying data
  • High reliability of the data

The interpretation of the data by the stakeholders can often be controversial, as precise and reliable data are missing. If the knowledge basis is inadequate, it is difficult to make dependable statements about changes, and to judge the effectiveness of policies.

Data requirements for indicators

As stated above, indicators must be measurable and assessable. In general, this requires data provided by research studies or public statistics, registers or surveys.

To assess the OSH-performance of a national system, policy or law, the typical steps include:

  • Setting the prioritised subjects of interest (the indicanda) related to the activity / target
  • Defining the indicators for each indicandum
  • Defining parameters if one or more indicators are still too complex
  • Identifying and analysing the available aggregated statistics
  • Using raw data if possible

More examples are given in Table 1 below.

Table 1: Indicators for measurement and assessment

Each row shows an indicator level – the key function between ‘Goals’ and ‘Raw Data’ – together with an OSH example.

Political goal: Raise the prevention level in companies in one EU Member State (or in one sector), e.g. from 50% to 75%

Indicandum: Prevention level in companies in one EU Member State (or in one sector)

Indicators:
  1. Rate of companies complying fully or partly with all OSH regulations
  2. Use of OSH management systems in companies
  3. Use of advanced safety technology
  4. Comparatively low rate of occupational accidents (country or sector)
  5. High number of internal safety professionals or extensive use of external prevention services
  6. High-quality risk assessment and effective control of risk reduction measures
  7. High awareness of OSH (in surveys of employers and workers)

Parameters (indicators at a detailed level):
  1. One parameter for each major area of compliance, e.g. technical, organisational, protection of vulnerable groups
  2. Rate of companies with such OSH management systems
  3. 10 examples from different sectors
  4. Official statistics of accidents at work
  5. Number of OSH professionals, time taken per worker, degree of qualification
  6. Good performance in all areas of risk assessment
  7. OSH is addressed in board meetings and trade union meetings; OSH measures are implemented without excessive internal barriers; no risk culture

Available statistics and data:
  1. National compliance reports
  2. Statistics from certification bodies
  3. Reports from trade associations, company visits, interviews with experts
  4. National and international occupational accident statistics

Raw data (for all aspects): company visits, interviews with experts, small-scale surveys, meetings of stakeholders (employers, workers etc.), company reports

Source: Overview by the authors

This table shows typical indicators. There are currently no compulsory standards for indicators and performance measuring methods in OSH. Every evaluation uses different indicators and methodologies, depending on the focus of the study. However, there are a series of indicator sets that are applied in large-scale international studies.
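As a hypothetical illustration of how the hierarchy in Table 1 and the steps listed above could be recorded during an evaluation, the following sketch models an indicandum together with its indicators, parameters and data sources. The class and field names are invented for this example; as noted, there is no compulsory standard for such a structure.

```python
# Minimal, hypothetical data model mirroring Table 1. Entries are
# illustrative only and do not represent a mandated OSH standard.
from dataclasses import dataclass, field

@dataclass
class Indicator:
    name: str
    parameters: list[str] = field(default_factory=list)   # detailed level
    data_sources: list[str] = field(default_factory=list)

@dataclass
class Indicandum:
    political_goal: str
    description: str
    indicators: list[Indicator] = field(default_factory=list)

prevention_level = Indicandum(
    political_goal="Raise the prevention level in companies from 50% to 75%",
    description="Prevention level in companies in one EU Member State",
    indicators=[
        Indicator(
            name="Rate of companies complying fully or partly with OSH regulations",
            parameters=["technical compliance", "organisational compliance",
                        "protection of vulnerable groups"],
            data_sources=["national compliance reports"],
        ),
        Indicator(
            name="Comparatively low rate of occupational accidents",
            parameters=["official statistics of accidents at work"],
            data_sources=["national and international accident statistics"],
        ),
    ],
)

print(len(prevention_level.indicators))  # 2
```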

The classification of indicators

Types of indicators: measuring inputs, processes, outputs, and outcomes

Indicators can be status-oriented or progress-oriented. The first type is used to measure the status of a ‘system’ - the OSH-infrastructure - compared to the desired status. If progress needs to be measured - e.g. of a strategy, programme, policy, campaign or legislation – then the same indicators should ideally be measured at the beginning, and then again after a certain period.

‘Input indicators’ are one of the most commonly used types. They describe the resources, finances and competences available in the system. These indicators determine the status quo of the infrastructure, and help identify the resources on which policy initiatives can be based.

Typical OSH input indicators include:

  • Legislation status
  • Financial and human resources for supervision and enforcement
  • Resources for prevention and guidelines in the public and private sector
  • Expertise and training of OSH professionals
  • Number of OSH professionals
  • Safety features of the main technologies in different sectors
  • Research institutions

These indicators can also refer to softer characteristics, such as OSH awareness, etc. At the start of an evaluation, input indicators can be applied to define the level of resources needed to successfully carry out a certain policy or activity. The German National OSH Strategy, for example, dedicated 10% of all resources to strategy activities.

When indicators are used to assess progress - i.e. the success of a policy or legislation - three other types of indicators are generally applied – process, output and outcome.

‘Process indicators’ measure and assess process-related activities, and are in general short-term indicators. They include the number of activities, collaborative actions, campaigns and communication activities. Outcomes such as ‘fewer accidents or diseases’ cannot yet be measured at this stage due to the limited time available.

Australia [4], for instance, has reported on goal achievement by means of the following process-related indicators:

  • Legislation is reviewed / adapted / implemented / in force
  • Reports / strategies / programmes / action plans are developed / reviewed / updated
  • Webtool / information / guidance / training material is developed / reviewed / updated / available
  • Commission / working groups are established, regular meetings are held
  • Milestones of specific projects are met
  • Number of (inspection) visits / investigations / events / training courses / audits; number of resolved compliance issues
  • Industry confirms commitment
  • Targets / principles are incorporated by target group
  • Performance indicators are reported by federal states

Output indicators are closely connected to the results of process-related activities. They can be defined as short-term, mostly quantitatively operationalised results of processes, e.g. the number of flyers distributed to the target population, the number of joint projects, participants at health promotion meetings, the number of trained workers, etc.

The final result is measured or assessed by outcome indicators, which refer to the broader results of activities and policies. In OSH, these are mainly health and hazard related goals, e.g. the effects of measures on accident rates after a certain period, and on long-term health effects.
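As a simple, hedged illustration of such an outcome indicator, the sketch below computes the accident rate per 1,000,000 workers at a baseline and a follow-up point, and the relative change between them; all counts are invented for illustration.

```python
# Hedged sketch of an outcome indicator: the accident rate per 1,000,000
# workers at baseline and follow-up, and the relative change between them.
# The accident and worker counts below are invented for illustration.

def accident_rate_per_million(accidents, workers):
    """Incidence rate per 1,000,000 workers."""
    return accidents / workers * 1_000_000

baseline = accident_rate_per_million(accidents=4_200, workers=1_500_000)
follow_up = accident_rate_per_million(accidents=3_600, workers=1_550_000)

relative_change = (follow_up - baseline) / baseline
print(round(baseline), round(follow_up), f"{relative_change:.1%}")
# 2800 2323 -17.1%
```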

Specific OSH indicators

There are different areas of interest for OSH performance measurements:

A) The hazard / health performance and its related indicators [5][6][7][8] and
B) The performance of the system [1][9][10].

Type A indicators are mostly output- or outcome-oriented, and can be verified by exposure data, accident reports or medical diagnoses of the working population.

A: Indicators on occupational risk factors and staff health

A.1. Exposures
  • Physical and chemical
  • Ergonomic
  • Psychosocial
  • Working time
  • Other

A.2. Health and well-being
  • Work-related illnesses of all kinds (ICD codes), based on accident statistics, health / pension / accident insurance data, rates of disease and early pension rates in certain professions
  • Work-related absence from the workplace
  • Work ability indices
  • Number and type of rehabilitation measures

A.3. Accidents
  • Accident rates, sector accident rates, commuting accidents, severity of accidents

B: System (structure) and procedure indicators

[‘system’ is often used synonymously with ‘structure’, and vice versa]

These indicators are designed to describe the system and procedural aspects of the OSH infrastructure, i.e. the legal background (basic legislation), the administrative procedures, the culture of conflict and collaboration between groups in society (social partnership), and the economic level (resources and finances). These indicators mainly describe the potential to act and the ways of acting.

B.1 Examples of system indicators:

  • Human and financial resources for OSH
  • Professional expertise and education of OSH professionals
  • (Advanced) training
  • Background legislation e.g. legislation on liability or worker participation
  • OSH legislation
  • Regulation and administration of prevention and compensation
  • Availability of information and guidance
  • Research capacities

Procedure indicators mainly describe the quantity and quality of activities.

B.2 Procedure indicators

These include:

  • Legislation (and the practical implementation thereof), including the improvement of complex legal regulations and the simplification of legal requirements for SMEs
  • Development and introduction of standards
  • Use of research results
  • Sector-specific regulations, particularly in focus industries (e.g. construction)
  • Financial incentives
  • Instructions and publications
  • Risk assessment guidance
  • Cooperation between OSH actors
  • Use of OSH management systems in companies
  • Worker involvement
  • Communication of OSH concerns to the public.

Demarcation between these two indicator types is not particularly sharp, as successful procedures often become basic structural elements after a certain period of time.

Cost-benefit analyses of OSH measures can determine the economic impact for business and society. Different methods and indicators are used, but the general calculation is between the investment (e.g. equipment, organisation, personnel) and the savings (e.g. lower rates of absence, fewer accidents and diseases, higher productivity due to increased production time).
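A minimal sketch of that comparison is shown below, with invented annual figures; real cost-benefit analyses use far more detailed cost categories and time horizons.

```python
# Minimal sketch of the cost-benefit comparison described above.
# All monetary figures are invented for illustration.

investment = {           # annual costs of the OSH measure, in EUR
    "equipment": 40_000,
    "organisation": 15_000,
    "personnel": 25_000,
}
savings = {              # annual savings attributed to the measure, in EUR
    "lower_absence": 35_000,
    "fewer_accidents_and_diseases": 30_000,
    "higher_productivity": 45_000,
}

total_cost = sum(investment.values())     # 80,000
total_benefit = sum(savings.values())     # 110,000

net_benefit = total_benefit - total_cost  # 30,000
benefit_cost_ratio = total_benefit / total_cost
print(net_benefit, round(benefit_cost_ratio, 2))  # 30000 1.38
```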

There are other indicator taxonomies that partly correspond to the one presented here. Two dimensions of safety indicators are distinguished: personal safety indicators (measuring aspects of attitudes and behaviour) vs. process safety indicators (measuring technical integrity). Another distinction is made between leading and lagging indicators: injury and fatality rates are referred to as lagging indicators (roughly corresponding to type A), whereas indicators measuring aspects of the safety management system are referred to as leading indicators in the area of personal safety. Although these indicator types are frequently referred to, the distinction is problematic (for more details cf. [11]).

Statistical base and sets of indicators

Some information may be obtained from the registers and data sets of OSH institutions or government statistical offices. However, conventional databases are not thorough or precise enough to evaluate developments and connections in areas such as MSDs and psychological illnesses. In order to determine the prevalence and distribution of exposure, large-scale surveys are usually carried out or evaluated by means of secondary analysis.

Work-related indicator sets are used in aggregated studies and statistics (national, European or international) (see also: Burden of occupational diseases and Reporting and monitoring occupational accidents and diseases in Europe). Some large data sources are exclusively OSH-related. However, information on OSH performance can also be found in data sources on related fields, such as surveys on working conditions, labour markets and companies, and data from the public health sector.

The most prominent data sources are:

1) Statistics - OSH related

  • European statistics on Accidents at Work (ESAW) [12]
  • European statistics on Occupational Diseases [13]
  • National statistics.

2) Surveys on working conditions

  • Eurofound: European Working Conditions Survey (EWCS) [14]
  • EU-OSHA: ESENER
  • Regular national surveys on working conditions, e.g. for Denmark [15], Finland [16], Germany [17], the Netherlands [18], Spain [19] etc.

3) Statistics - labour and company related

  • ILO and national statistical agencies: Labour Force Survey [20]
  • ILO: Decent Work [21]
  • ISSA: Social security reports [22]
  • Eurofound: Company survey [23]
  • National statistics and surveys.

4) Statistics - health related

  • OECD: Health Data [24]
  • European Community Health indicators (ECHI) [25]
  • European Network for Workplace Health Promotion: Quality Criteria for Workplace Health Promotion. [26]

In a number of cases, the specific need for information cannot be covered by publicly available statistics. This means that surveys must often be conducted, in order to design specific OSH performance measurements.

The European Agency for Safety and Health at Work established an OSH monitoring system. It uses an extensive set of national and international data sources to provide a comprehensive, evidence-based picture of OSH, identifying trends and making recommendations on research, policy and practice [27].

Indicators – benefits and limitations

Indicators provide evidence of whether a certain condition prevails or certain results have been achieved. They enable decision-makers to assess the progress of targets and objectives towards their intended outcomes. As such, they are an integral part of a results-based accountability system.

OSH performance measurements require indicators. They are a source of essential valid and reliable factual knowledge when assessing the status and performance of an OSH system. Without them, the discussion lacks evidence, and is merely based on personal experience.

One should, however, be aware of their limitations:

  • Indicators rarely cover all aspects of a subject. Opinions on the choice of indicators vary significantly between stakeholders and OSH specialists.
  • Indicators are often so complex that they require more detailed parameters (sub-indicators).
  • Data quality is often insufficient, with gaps for certain sectors or areas. Proxies can be used in these cases if the connection between the proxy and the indicator is sufficiently close.
  • The use of indicators in international comparisons is limited, since the indicators are based on different national systems of data collection and aggregation.

References

[1] Köper, B., Möller, K., Zwetsloot, G., ‘The Occupational Safety & Health Scorecard’, ''Scandinavian Journal of Work, Environment and Health'', Vol. 35, No. 6, 2009, pp. 413-420. Available at: http://www.ncbi.nlm.nih.gov/pubmed/19806280

[2] Bastianonia, S., Niccolucci, V., Pulselli, R.M., Marchettini, N., ‘Indicator and indicandum: “Sustainable way" vs “prevailing conditions" in the Ecological Footprint’, ''Ecological Indicator'', Vol. 16, 2012, pp. 47-50. Available at: http://www.sciencedirect.com/science/article/pii/S1470160X11003396

[3] United States Agency for International Development (USAID), Center for Development Information and Evaluation, ‘Performance Monitoring and Evaluation – Selecting performance indicators’, ''Tips No. 6'', 2nd Edition, 2010, pp. 1-12. Available at: http://transition.usaid.gov/policy/evalweb/documents/TIPS-SelectingPerformanceIndicators.pdf

[4] Australian Safety and Compensation Council, ''Report on indicators for occupational disease,'' Australian Government, 2006, pp. 1-45. Available at: http://www.safeworkaustralia.gov.au/sites/SWA/AboutSafeWorkAustralia/WhatWeDo/Publications/Pages/SR200604nIndicatorsForDisease.aspx

[5] Council of State and Territorial Epidemiologists (CSTE) (2005). Putting Data to Work: Occupational Health Indicators from Thirteen Pilot States for 2000. Retrieved 19 December 2011, from: https://www.cdc.gov/niosh/docs/2005-154/pdfs/2005-154.pdf?id=10.26616/NIOSHPUB2005154

[6] Kreis, J; Bödeker, W, ''Indicators for work-related health monitoring in Europe.'' Betriebliches Gesundheitsmanagement und Prävention arbeitsbedingter Gesundheitsgefahren, Band 33, Bremerhafen, 2004. Available at: http://www.certh.gr/dat/9DE33F7D/file.pdf

[7] International Labour Organisation ILO (no publishing date available). Decent work: Measuring decent work. Retrieved 19 December 2011, from: http://www.ilo.org/integration/themes/mdw/lang--en/index.htm

[8] World Health Organisation (WHO) (2011). Global Health Observatory: WHO indicator registry. Retrieved 19 December 2011, from: http://www.who.int/gho/indicator_registry/en/index.html

[9] European Commission DG Employment, ''Socio-economic costs of accidents at work and work-related ill health – Key messages and case studies.'' Luxemburg, 2011, pp. 1-55. Available at: http://ec.europa.eu/social/main.jsp?catId=716&langId=en&intPageId=1716

[10] Kohstall, T., ''Calculating the International Return on Prevention for Companies - Costs and Benefits of Investments in Occupational Safety and Health,'' Project of the International Social Security Association (ISSA), German Social Accident Insurance (DGUV), Berufsgenossenschaft Energie Textil Elektro Medienerzeugnisse (BG ETEM), 2012, pp. 1-45. Available at: http://www.dguv.de/inhalt/praevention/praev_lohnt_sich/wirtschaft/documents/kurzbericht_praev_engl.pdf

[11] Safety Science 47 (2009) – Special issue of the journal dedicated to the topic of personal vs. process safety and leading vs. lagging indicators. Retrieved on 4th Sept 2012 from: http://www.sciencedirect.com/science/journal/09257535/47/4

[12] Eurostat, ''European Statistics on Accidents at Work (ESAW) – Methodology''. European Commission, 2011. Available at: http://ec.europa.eu/eurostat/ramon/statmanuals/files/ESAW_2001_EN.pdf

[13] Eurostat, ''European Occupational Diseases Statistics (EODS) – Phase 1 Methodology,'' Eurostat working paper, 2000.

[14] Eurofound (2012), European Working Conditions Survey (EWCS). Retrieved 20 August 2012, from http://www.eurofound.europa.eu/surveys/ewcs/index.htm

[15] Jakob Bue Bjørner, Hermann Burr, Helene Feveile, Katja Løngaard, Jan Pejtersen, Christian Roepstorff, Hans H.K. Sønderstrup-Andersen and Sannie Vester Thorsen, ed. by Det Nationale Forskningscenter for Arbejdsmiljø (NFA) (2010): ''Ændringer i det danske arbejdsmiljø fra 2005 til 2008'' (Changes in the Danish Work Environment between 2005 and 2008). Available at: http://www.arbejdsmiljoforskning.dk/da/nyheder/arkiv/2010/~/media/Boeger-og-rapporter/aendringer-i-det-danske-arbejdsmiljo.pdf

[16] Finnish Institute of Occupational Health - FIOH (2009): ''Työ ja terveys Suomessa 2009.'' (Work and health in Finland 2009). Available at http://www.ttl.fi/fi/verkkokirjat/tyo_ja_terveys_suomessa/Documents/Tyo_ja_terveys_2009.pdf

[17] German Federal Institute for Occupational Safety and Health BAuA (2010): ''Sicherheit und Gesundheit bei der Arbeit 2010 - Unfallverhütungsbericht Arbeit ''(Safety and Health at Work 2010 – Accident prevention report). Available at http://www.baua.de/de/Publikationen/Fachbeitraege/Suga-2010.pdf?__blob=publicationFile&v=3

[18] Labour Inspection of the Netherlands ed. (2011): ''Arbo in bedrijf 2010. Een onderzoek naar de naleving van arbo-verplichtingen, blootstelling aan arbeidsrisico’s en genomen maatregelen in 2010 ''(Health and Safety at Work 2010. An investigation into compliance with health and safety requirements, exposure to occupational risks and measures taken in 2010) Retrieved 3 September 2012 from: http://www.rijksoverheid.nl/documenten-en-publicaties/rapporten/2012/02/27/arbo-in-bedrijf-2010.html

[19] Spanish National OSH Institute INSHT (2008): ''VII encuesta nacional de condiciones de trabajo,'' (Seventh national survey on working conditions), Available at: http://www.insht.es/InshtWeb/Contenidos/Documentacion/FICHAS%20DE%20PUBLICACIONES/EN%20CATALOGO/OBSERVATORIO/Informe%20%28VII%20ENCT%29.pdf

[20] International Labour Organisation ILO (2012). Labour force survey LFS. Retrieved 20 August 2012, from: http://www.ilo.org/dyn/lfsurvey/lfsurvey.home

[21] International Labour Organisation ILO (2008). Decent work: Revised Office proposal for the measurement of decent work. Retrieved 20 August 2012, from: http://www.ilo.org/integration/resources/mtgdocs/WCMS_100995/lang--en/index.htm

[22] International Social Security Association ISSA (no publishing data available). Social Security Observatory. Retrieved 20 August 2012, from: http://www.issa.int/Observatory/Social-Security-Observatory

[23] Eurofound (2011). European Company Survey (ECS) 2009. Retrieved 20 August 2012, from: http://www.eurofound.europa.eu/surveys/ecs/2009/index.htm

[24] Organisation for Economic Co-operation and Development OECD (2011). Health Data, OECD Health Data 2011. Retrieved 19 December 2011, from: http://www.oecd.org/topic/0,3699,en_2649_33929_1_1_1_1_37407,00.html

[25] European Commission, DG Health and Consumers (2012). European Community health indicators (ECHI). Retrieved 20 August 2012, from: http://ec.europa.eu/health/indicators/echi/index_en.htm

[26] European Network for Workplace Health Promotion, ''Healthy Employees in Healthy Organisations – Good Practice in WHP in Europe. Quality Criteria for Workplace Health Promotion'', 1999. Available at: http://www.enwhp.org/fileadmin/downloads/quality_criteria_01.pdf

[27] European Agency for Safety and Health at Work, Risk Observatory (2012): ‘''OSH in figures''’: identification and analysis of trends, Retrieved 3 September 2012, from: http://osha.europa.eu/en/riskobservatory/monitoring-osh.pdf



Contributor
Ellen Schmitz-Felten