Journal of Science and Medicine in Sport Vol. 21/11 (2018) 1113–1178
VOLUME 21 • ISSUE 11 • NOVEMBER 2018
SPORTS INJURY | PHYSICAL ACTIVITY | SPORTS & EXERCISE SCIENCE | SPORTS & EXERCISE MEDICINE

Special issue: 4th International Congress on Soldiers’ Physical Performance
Guest Editors: Dan C. Billing, Jace R. Drain, Jeremy Witchalls

ELSEVIER

Improve your ability to establish, execute and evaluate institutional research strategy

Elsevier’s Research Intelligence solutions provide answers to the most pressing challenges that research administrators face. Our suite of innovative software solutions improves your ability to establish, execute and evaluate research strategy and performance.

Organize your research, collaborate and connect with others online, and discover the latest research with our free reference manager and academic social network. Mendeley Institutional Edition includes premium user features for researchers and librarians.

Track, analyze and visualize global research with our abstract and citation database of peer-reviewed literature, including scientific journals, books and conference proceedings covering the fields of science, technology, medicine, social sciences, and arts and humanities.

Visualize your institution’s research performance, benchmark relative to peers, develop collaborative partnerships and explore research trends.

Develop reports on research output, carry out performance assessments, and showcase your researchers’ expertise, all while reducing administrative burden for researchers, faculty and staff.

For a FREE custom report on your institution’s research strengths, visit: elsevier.com/research-intelligence/ace

Elsevier’s Science & Technology content is available online via ScienceDirect—providing easy, instant, DRM-free access to searchable books, journals and citation tools all in one place. Learn more at http://info.ScienceDirect.com/books

Books Content on ScienceDirect
Offering comprehensive coverage of the full range of scientific disciplines, including those published under the renowned Woodhead Publishing and Gulf Professional Publishing imprints, over 17,000 eBooks, 59,000 Major Reference Works chapters, and 200,500 Book & Handbook Series chapters are now fully integrated with journals on ScienceDirect. We’ll provide the resources to help you maximize awareness and usage of the titles in your collection:
• tutorials and guides
• local “lunch and learn” training sessions
• customized marketing toolkits
• webinars
• free MARC records

NEW TITLES for 2014!

Contact us to find out how we can help you make the most out of your ScienceDirect Books investment.

CHOICE Outstanding Academic Title 2013
PROSE R.R. Hawkins Award

Purchasing Models
There are many flexible purchase options available for the ScienceDirect books collection, which includes eBooks, Book Series, Handbooks, Major Reference Works, and Reference Modules.

                     Pick and Choose   Collections   Subscription   One time (perpetual)
eBooks                      ✓               ✓              ✓                ✓
Book Series                 ✓               ✓              ✓                ✓
Handbooks                   ✓               ✓                               ✓
Reference Works             ✓               ✓                               ✓
Reference Modules                           ✓              ✓

For more information and to see a complete list of titles by subject collection, talk to your Elsevier Sales Representative today, or visit: info.sciencedirect.com/books

Journal of Science and Medicine in Sport Volume 21 (11) (2018)
Official Journal of Sports Medicine Australia

Editor-in-Chief
Gordon Waddington, PhD, University of Canberra

Consulting Editor
Gregory Kolt, PhD, Western Sydney University

Associate Editors
Teatske Altenburg, PhD, Vrije Universiteit Medisch Centrum
Lisa Barnett, PhD, Deakin University
David Bentley, PhD, Flinders University
Julien V. Brugniaux, PhD, Université Grenoble Alpes
Michael Callaghan, PhD, University of Manchester
Cristina Caperchione, PhD, University of British Columbia
Douglas Casa, PhD, University of Connecticut
Steve Cobley, University of Sydney
James Dimmock, PhD, University of Western Australia
Mitch Duncan, PhD, University of Newcastle
Christina Ekegren, PhD, Monash University
Lauren Fortington, PhD, Federation University Australia
Andrew Kilding, PhD, Auckland University of Technology
Ollie Jay, PhD, University of Sydney
Anthony Leicht, PhD, James Cook University
Andrew McKune, PhD, University of Canberra
Lars McNaughton, PhD, Edge Hill University
Tim Meyer, MD, PhD, Saarland University
Stephan Praet, MD, PhD, Australian Sports Commission
Kate Pumpa, PhD, University of Canberra
Amanda Rebar, PhD, CQUniversity
Jason Siegler, PhD, Western Sydney University
Gisela Sole, PhD, University of Otago
Steve Stannard, Massey University, NZ
Evert Verhagen, PhD, Vrije Universiteit
Jeremy Witchalls, PhD, University of Canberra

Editorial Board
Gary Brickley, PhD, University of Brighton
Wendy Brown, PhD, University of Queensland
Ray Browning, PhD, University of Colorado Health Sciences Centre
Timothy Cable, PhD, Liverpool John Moores University
Jill Cook, PhD, La Trobe University
Aaron Coutts, PhD, University of Technology Sydney
Andrew Cresswell, PhD, University of Queensland
Annet J. Dallmeijer, PhD, Vrije Universiteit Medisch Centrum
Patrik Danielson, PhD, Umeå University
Eamonn Delahunt, PhD, University College Dublin
Peter Eastwood, PhD, University of Western Australia
Carolyn Emery, PhD, University of Calgary
Kieran Fallon, PhD, Australian Institute of Sport
Damian Farrow, PhD, Victoria University
Paul Fournier, PhD, University of Western Australia
Andrew Garnham, MBBS, Deakin University
Emma George, PhD, Western Sydney University
Conor Gissane, PhD, St Marys University College
Marc Hamilton, PhD, University of Missouri-Columbia
Spencer Hayes, PhD, Liverpool John Moores University
Erik Hohmann, PhD, Valiant/Houston Methodist Group, UAE
Franco M. Impellizzeri, PhD, Neuromuscular Research Laboratory
Costas Karageorghis, PhD, Brunel University
Michael Kennedy, PhD, University of Alberta
Michael Kinchington, PhD, Centre for Podiatric Medicine
Katriina Kukkonen-Harjula, PhD, University of Eastern Finland
Paul Laursen, PhD, New Zealand Academy of Sport and AUT University
David Lloyd, PhD, Griffith University
Mary Magarey, PhD, University of South Australia
Michael Makdissi, PhD, University of Melbourne
Peter Malliaras, PhD, Monash University
Frank Marino, PhD, Charles Sturt University
Alberto Méndez-Villanueva, PhD, ASPIRE Academy for Sports Excellence
Geraldine Naughton, PhD, Australian Catholic University
Leslie Nicholson, PhD, University of Sydney
Ken Nosaka, PhD, Edith Cowan University
Les Podlog, PhD, University of Utah
Stephan Riek, PhD, University of Queensland
Jo Salmon, PhD, Deakin University
Grant Schofield, PhD, Auckland University of Technology
Joanna Scurr, PhD, University of Portsmouth
Ian Shrier, PhD, McGill University
Dennis Taaffe, PhD, Edith Cowan University
Toomas Timpka, PhD, Linköping University
Stuart Warden, PhD, Indiana University
Anita Wluka, PhD, Monash University
Ella Yeung, PhD, Hong Kong Polytechnic University

The Editorial Office E-mail: [email protected]

AMSTERDAM—BOSTON—LONDON—NEW YORK—OXFORD—PARIS—PHILADELPHIA—SAN DIEGO—ST. LOUIS

Aims and Scope
The Journal of Science and Medicine in Sport is an international refereed research publication covering all aspects of sport science and medicine. The Journal considers for publication Original research and Review papers in the sub-disciplines relating generally to the broad sports medicine and sports science fields: sports medicine, sports injury (including injury epidemiology and injury prevention), physiotherapy, podiatry, physical activity and health, sports science, biomechanics, exercise physiology, motor control and learning, sport and exercise psychology, sports nutrition, public health (as relevant to sport and exercise), and rehabilitation and injury management. Manuscripts with an interdisciplinary perspective with specific applications to sport and exercise and its interaction with health will also be considered. Only studies involving human subjects will be considered. Contributors are invited to submit their manuscripts in English to the Editor for critical review. All manuscripts must be submitted electronically at http://www.ees.elsevier.com/jsams

Copyright
Papers accepted for publication become the copyright of Sports Medicine Australia, and authors will be asked to sign a transfer of copyright form, on receipt of the accepted manuscript by Elsevier. This enables the Publisher to administer copyright on behalf of the Authors, whilst allowing the continued use of the material by the Author for scholarly communication. © 2018 Sports Medicine Australia. Published by Elsevier Ltd. All rights reserved. This journal and the individual contributions contained in it are protected under copyright by Elsevier Ltd., and the following terms and conditions apply to their use:

Photocopying
Single photocopies of single articles may be made for personal use as allowed by national copyright laws. Permission of the Publisher and payment of a fee is required for all other photocopying, including multiple or systematic copying, copying for advertising or promotional purposes, resale, and all forms of document delivery. Special rates are available for educational institutions that wish to make photocopies for non-profit educational classroom use. For information on how to seek permission visit www.elsevier.com/permissions or call: (+44) 1865 843830 (UK) / (+1) 215 239 3804 (USA).

Derivative Works
Subscribers may reproduce tables of contents or prepare lists of articles including abstracts for internal circulation within their institutions. Permission of the Publisher is required for resale or distribution outside the institution. Permission of the Publisher is required for all other derivative works, including compilations and translations (please consult www.elsevier.com/permissions).

Electronic Storage or Usage
Permission of the Publisher is required to store or use electronically any material contained in this journal, including any article or part of an article (please consult www.elsevier.com/permissions).
Except as outlined above, no part of this publication may be reproduced, stored in a retrieval system or transmitted in any form or by any means, electronic, mechanical, photocopying, recording or otherwise, without prior written permission of the Publisher.

Notice
No responsibility is assumed by the Publisher for any injury and/or damage to persons or property as a matter of products liability, negligence or otherwise, or from any use or operation of any methods, products, instructions or ideas contained in the material herein. Because of rapid advances in the medical sciences, in particular, independent verification of diagnoses and drug dosages should be made. Although all advertising material is expected to conform to ethical (medical) standards, inclusion in this publication does not constitute a guarantee or endorsement of the quality or value of such product or of the claims made of it by its manufacturer.

Orders, claims, and journal inquiries
Please visit our Support Hub page https://service.elsevier.com for assistance.

Publication information
Journal of Science and Medicine in Sport (ISSN 1440-2440). For 2018, volume 21 (12 issues) is scheduled for publication. Subscription prices are available upon request from the Publisher or from the Regional Sales Office nearest you or from this journal’s website http://ees.elsevier.com/jsams/default.asp. Further information is available on this journal and other Elsevier products through Elsevier’s website (http://www.elsevier.com). Subscriptions are accepted on a prepaid basis only and are entered on a calendar year basis. Issues are sent by Priority Air. Rates are available upon request. Claims for missing issues should be made within six months of the date of dispatch.

Author inquiries
You can track your submitted article at http://www.elsevier.com/track-submission. You can track your accepted article at http://www.elsevier.com/trackarticle.
You are also welcome to contact Customer Support via http://support.elsevier.com

Journal of Science and Medicine in Sport (2018) 21, iii

Contents

Available online at www.sciencedirect.com

ScienceDirect

Special issue: 4th International Congress on Soldiers’ Physical Performance
Guest Editors: Dan C. Billing, Jace R. Drain and Jeremy Witchalls

Cited in: AMED, CISTI, CSA Physical Education Index, EMBASE, EMCARE, PEDro, SCI Expanded, SIRC, SPORT Database/Discus Scisearch, Thomson Scientific Focus on Sport Science & Medicine, ISI, Medline, PsycINFO. Also covered in the abstract and citation database SCOPUS®. Full text available on ScienceDirect®

International Congress on Soldiers’ Physical Performance 2017 (ICSPP2017) Special Issue
D.C. Billing, J.R. Drain, J. Witchalls (Australia) ..... 1113

Perspectives on resilience for military readiness and preparedness: Report of an international military physiology roundtable
B.C. Nindl, D.C. Billing, J.R. Drain, M.E. Beckner, J. Greeves, H. Groeller, H.K. Teien, S. Marcora, A. Moffitt, T. Reilly, N.A.S. Taylor, A.J. Young, K.E. Friedl (USA, Australia, UK, Norway, Canada) ..... 1116

International consensus on military research priorities and gaps — Survey results from the 4th International Congress on Soldiers’ Physical Performance
M. Lovalekar, M.A. Sharp, D.C. Billing, J.R. Drain, B.C. Nindl, E.J. Zambraski (USA, Australia) ..... 1125

Optimising training adaptations and performance in military environment
H. Kyröläinen, K. Pihlainen, J.P. Vaara, T. Ojanen, M. Santtila (Finland) ..... 1131

Musculoskeletal training injury prevention in the U.S. Army: Evolution of the science and the public health approach
B.H. Jones, V.D. Hauschild, M. Canham-Chervak (USA) ..... 1139

Military applications of soldier physiological monitoring
K.E. Friedl (USA) ..... 1147

Consensus paper on testing and evaluation of military exoskeletons for the dismounted combatant
K.L. Mudie, A.C. Boynton, T. Karakolis, M.P. O’Donovan, G.B. Kanagaki, H.P. Crowell, R.K. Begg, M.E. LaFiandra, D.C. Billing (Australia, USA, Canada) ..... 1154

A method for developing organisation-wide manual handling based physical employment standards in a military context
G.L. Carstairs, D.J. Ham, R.J. Savage, S.A. Best, B. Beck, D.C. Billing (Australia) ..... 1162

Positive, limited and negative responders: The variability in physical fitness adaptation to basic military training
S.D. Burley, J.R. Drain, J.A. Sampson, H. Groeller (Australia) ..... 1168

The association between obesity related health risk and fitness test results in the British Army personnel
P.W. Sanderson, S.A. Clemes, K.E. Friedl, S.J.H. Biddle (UK, USA, Australia) ..... 1173

The Journal of Science and Medicine in Sport is supported by the following Societies

Sports Medicine Australia (SMA): Established in 1963, Sports Medicine Australia has a membership of health and science professionals and related organisations involved or interested in sport and physical activity. SMA members work in all levels of sport, from the grass roots through to elite levels. As well as the promotion of healthy sport and physical activity, SMA members focus on the prevention, treatment and management of sports injuries. SMA strives to create a safer physical and social environment for all participants in sport and physical activity. SMA’s greatest achievement is drawing together the expertise of members from a diverse range of professions. This multidisciplinary approach to sports medicine and science has played an important part in the success of Australia’s athletes and sporting teams, and SMA members’ shared knowledge and skills play an important part in keeping all Australians active and healthy. SMA members include dietitians, doctors, exercise scientists, orthopaedic surgeons, physicians, physiotherapists, podiatrists, psychologists, public health professionals, soft tissue therapists and sports trainers, as well as nurses, teachers and parents. Many of these members are also members of the professional discipline-based associations listed below. http://www.sma.org.au

Australasian Academy of Podiatric Sports Medicine (AAPSM) encourages research in the area of musculoskeletal pathomechanics, biomechanical analysis and orthotic control as applied to the athletic community. The AAPSM also encourages accumulation of statistics in the area of sports medicine in order to develop sound methods of prevention and treatment of sports injuries. The AAPSM participates in the annual Conference of Science and Medicine in Sport to disseminate current knowledge to the profession, allied professions, and those engaged in the treatment of amateur, professional, or individual athletics. http://www.aapsm.org.au

College of Sport and Exercise Psychologists (CoSEP): Since 1984, Australia has been leading the world in the use of sport and exercise psychology. The APS College of Sport and Exercise Psychologists develops and safeguards the standards of practice and supervised experience. It sets the quality of service in sport and exercise psychology, and advises and makes recommendations regarding the education and training of sport and exercise psychologists. The college also acts as a focal point for consumer and general inquiries relating to sport and exercise. http://www.groups.psychology.org.au/csep/

Sports Dietitians Australia (SDA) is a national organisation of qualified sports dietitians. SDA is leading the promotion of healthy eating to enhance the performance of all Australians whatever their level of physical activity. Sports dietitians assist Australians to make healthier food choices by providing accurate nutrition information based on scientific principles. Sports dietitians work with elite and recreational athletes, sporting clubs, active children and anyone whose nutrition needs play an important part in achieving their health and activity goals. http://www.sportsdietitians.com.au

Exercise and Sports Science Australia (ESSA) is a professional organisation which is committed to establishing, promoting and defending the career paths of tertiary trained exercise and sports science practitioners, who are in turn committed to best practice and client wellbeing. http://www.essa.com.au

Australian Physiotherapy Association – Sport Physiotherapy Group (SPA): The Australian Physiotherapy Association (APA) is the peak body representing the interests of Australian physiotherapists and their patients. The APA is a national organisation with non-autonomous state and territory branches and specialty subgroups. http://www.physiotherapy.asn.au

Sports Doctors Australia (SDrA) is a professional society for medical practitioners to promote and communicate the highest levels of knowledge, research and education in sports medicine. It consists of doctors from a wide area of specialties including general practice, emergency medicine, rehabilitation medicine and orthopaedics. Many have postgraduate degrees in sports medicine and have vast experience as team doctors. http://www.sportsdoctors.com.au/

The Australasian College of Sports Physicians (ACSP) is an organisation of physicians committed to excellence in the practice of medicine as it applies to all aspects of physical activity. Founded in 1985, it is the professional body representing sports physicians in both Australia and New Zealand. It conducts an annual scientific conference and is involved in both training and assessment of sports physicians. http://www.acsp.org.au

Journal of Science and Medicine in Sport 21 (2018) 1113–1115

Contents lists available at ScienceDirect

Journal of Science and Medicine in Sport journal homepage: www.elsevier.com/locate/jsams

Editorial

International Congress on Soldiers’ Physical Performance 2017 (ICSPP2017) Special Issue

This special issue of the Journal of Science and Medicine in Sport originates from the 4th International Congress on Soldiers’ Physical Performance (ICSPP2017), which was hosted by the Defence Science and Technology Group in Melbourne, Australia from 28 November to 1 December 2017. The ICSPP is the most important international conference in applied military human performance research and therefore attracts experts from all over the world. ICSPP2017 had a record attendance of 502 delegates from 32 countries (Fig. 1). ICSPP2017 included 8 invited keynote lectures (Table 1), 23 featured science sessions, 17 oral free communication sessions, 9 thematic oral poster sessions, and 3 poster free communication sessions. In addition, an interactive case study on dismounted soldier performance assessment and a roundtable on resilience for military readiness and preparedness were conducted. Overall, the topics covered were comprehensive and included physical training programs and adaptations, occupational and physical performance testing and assessment, injury prevention, public health practices, nutritional considerations, human factors, ergonomics, equipment design, biomechanics, load carriage, gender integration issues, environmental issues, health promotion and wellness, deployment concerns, pedagogy, psychological and cognitive factors, leadership, and social factors. Accepted congress abstracts were made available through an online Journal of Science and Medicine in Sport supplement.1

This special issue contains 9 manuscripts focused on aspects of military physical performance. Three manuscripts are invited reviews from ICSPP2017 keynote speakers and cover important topic areas which have been developed and refined through many years of primary research in the military context, including optimising training adaptations and performance (Kyröläinen et al., Finland), a public health approach to preventing musculoskeletal injuries (Jones et al., United States of America) and physiological monitoring (Friedl, United States of America). The special issue also contains 2 invited manuscripts which provide important methodological guidance in the military context for the testing and evaluation of exoskeletons (Mudie et al., Australia) and for the development of organisation-wide physical employment standards (Carstairs et al., Australia). Two further papers are dedicated to important areas of primary research, including the variability in physical fitness adaptations during military training (Burley et al., Australia) and the association between obesity related health risk and fitness test results (Sanderson et al., United Kingdom). The final 2 manuscripts present important international perspectives on resilience for military readiness and preparedness (Nindl et al., United States of America) and military research priorities and gaps (Lovalekar et al., United States of America). The latter manuscript provides important guidance to researchers and practitioners about the current and future research priorities across the service members’ operational lifecycle.

The articles in this issue identify some of the latest findings, methodologies and international perspectives pertinent to military physical performance. The manuscripts add to the knowledge base and provide guidance on how to leverage and translate human performance research well beyond the military, with applications in many other physically demanding occupations and in sport. As such, the Journal of Science and Medicine in Sport is pleased to be able to publish high quality research in the field of military human performance. Synergies of research methodology and the focus on human performance outcomes have resulted in increasing dialogue between military and sport researchers. The dual foci of the journal, namely sport science and sports medicine, are highly relevant to the military environment, where enhanced human performance and reduced injury risk are as critical to mission success as they are to sports performance.

Table 1
Invited keynote speakers for the 4th International Congress on Soldiers’ Physical Performance.

Professor Romain Meeusen (BEL): The underlying mechanisms of overtraining and management strategies in the military environment
Dr Bruce Jones (US): Managing musculoskeletal injuries in the military environment
Professor Heikki Kyröläinen (FIN): Optimising physical training adaptations and military performance
Professor Sam Marcora (UK): Limits to exercise tolerance in humans – mind over muscle?
Professor Maria Fiatarone-Singh (AUS): Resistance training, it’s a no brainer…
Professor Louise Burke (AUS): Supporting training and performance in the military environment through nutrition and supplementation
Dr Karl Friedl (US): Role and benefits of wearable physiological sensors in the military
Associate Professor Thor Besier (NZ): The application of wearable technologies in the military context

https://doi.org/10.1016/j.jsams.2018.09.003 1440-2440/© 2018 Sports Medicine Australia. Published by Elsevier Ltd. This is an open access article under the CC BY-NC-ND license (http://creativecommons.org/licenses/ by-nc-nd/4.0/).


Fig. 1. Countries represented at the 4th International Congress on Soldiers’ Physical Performance.

Table 2
Recipients of the inaugural International Congress on Soldiers’ Physical Performance (ICSPP) – International Military Sports Council (CISM) Awards.

ICSPP–CISM Lifetime Achievement Award
• Karl Friedl (United States of America) – Recognising leadership and distinguished service across a lifetime in military physical and physiological performance research.

ICSPP–CISM Best Oral Presentation Award
• Thor Anders Kilen (Denmark) – The effect of training frequency and initial training status on adaptation amongst basic army conscripts.

ICSPP–CISM Best Thematic Oral Poster Presentation Award
• Heather Bowes (Australia) – A contribution to understanding the impact of variations in body mass on fractioning the metabolic burden of military load carriage.

ICSPP–CISM Best Poster Presentation Award
• Christopher Vine (United Kingdom) – A job task analysis to quantify the physical demands of load carriage duties conducted by ground close combat roles in the UK Armed Forces.

ICSPP–CISM Student Travel Grant Awards
• Shawn Eagle (United States of America) – Asymmetrical landing patterns combined with heavier body mass increases lower extremity injury risk in Special Operations forces.
• Anne Beethe (United States of America) – Comparing lower extremity strength with aerobic and anaerobic capacity to predict novice combat swimmer 500 m time trial performance.
• Lopes Thiago (Brazil) – Phase one of a musculoskeletal injury prediction model validation: a prospective study in Navy cadets.
• Jeremy McAdam (United States of America) – Impact of whey protein supplementation on fitness performance, body composition and injury rates in Army initial entry soldiers.
• Nilton Gomes Rolim Filho (Brazil) – Serum creatine kinase and immune system relationship in different Brazilian biomes during the 2012 Commandos Special Operations Course.

Through an Agreement of Cooperation between the Defence Science and Technology (DST) Group and the International Military Sports Council (CISM), a number of congress awards and travel grants were established for ICSPP2017. The congress awards acknowledge the outstanding contributions of individuals and organisations and recognise excellence in military physical and physiological performance research. Recipients of each award are provided in Table 2. The inaugural ICSPP–CISM Lifetime Achievement Award was awarded to Dr Karl Friedl (United States of America). Over several decades, Karl has inspired, provided strong leadership to, and positively influenced researchers from around the world, and in doing so has served to advance soldiers’ physical performance on an international scale.

As guest editors, we are confident that you will find the papers in this supplement of scientific substance and of military relevance. Finally, we encourage your attendance and participation at


ICSPP2020 from 11 to 14 February 2020 in Quebec City, Canada (http://www.icspp2020.ca/).

Reference
1. ICSPP2017 Abstracts. J Sci Med Sport 2017; 20(Suppl. 2):S1–S178.

Dan C. Billing ∗ Jace R. Drain Land Division, Defence Science and Technology Group, Australia


Jeremy Witchalls UC Research Institute for Sport and Exercise, University of Canberra, Australia ∗ Corresponding author. E-mail address: [email protected] (D.C. Billing)

Available online 6 September 2018

Journal of Science and Medicine in Sport 21 (2018) 1116–1124

Contents lists available at ScienceDirect

Journal of Science and Medicine in Sport journal homepage: www.elsevier.com/locate/jsams

Perspectives on resilience for military readiness and preparedness: Report of an international military physiology roundtable

Bradley C. Nindl a,∗, Daniel C. Billing b, Jace R. Drain b, Meaghan E. Beckner a, Julie Greeves c, Herbert Groeller d, Hilde K. Teien e, Samuele Marcora f, Anthony Moffitt g, Tara Reilly h, Nigel A.S. Taylor d, Andrew J. Young i, Karl E. Friedl i

a Neuromuscular Research Laboratory and Warrior Human Performance Research Center, University of Pittsburgh, United States
b Defence Science and Technology Group, Australia
c Army Personnel Research Capability, HQ Army, UK
d Centre for Human Applied Physiology, School of Medicine, University of Wollongong, Australia
e Norwegian Defence Research Establishment, FFI, Norway
f School of Sport and Exercise Science, University of Kent, UK
g Department of Defence, Australia
h CFMWS Human Performance and Development, Canadian Armed Forces, Canada
i U.S. Army Research Institute of Environmental Medicine, United States

Article info

Article history: Received 16 March 2018; Received in revised form 23 April 2018; Accepted 8 May 2018; Available online 19 May 2018

Keywords: Adaptation; Psychological; Biological; Stress; Extreme environment; Task performance

Abstract

Modern warfare operations often occur in volatile, uncertain, complex, and ambiguous (VUCA) environments accompanied by physical exertion, cognitive overload, sleep restriction and caloric deprivation. The increasingly fast-paced nature of these operations requires military personnel to demonstrate readiness and resiliency in the face of stressful environments to maintain optimal cognitive and physical performance necessary for success. Resiliency, the capacity to overcome the negative effects of setbacks and associated stress on performance, is a complex process involving not only an individual’s physiology and psychology, but the influence of factors such as sex, environment, and training. The purpose of this moderated roundtable was to address five key domains of resiliency in a point/counterpoint format: physiological versus psychological resiliency, sex differences, contributions of aerobic and strength training, thermal tolerance, and the role of nature versus nurture. Each speaker was given three minutes to present and the moderator facilitated questions and discussion following the panel’s presentation. The interconnectedness of the five domains highlights the need for an interdisciplinary approach to understand and build resilience to enhance military performance. © 2018 Sports Medicine Australia. Published by Elsevier Ltd. This is an open access article under the CC BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/).

1. Introduction

As Charles Darwin observed (in Leon Megginson’s oft-quoted paraphrase), the species that prevails is not the strongest nor the most intellectual, but the one best able to adapt and adjust to a changing environment.1 The same principle applies to Warfighters: possessing a high level of physical fitness and cognitive ability is simply not enough to succeed and to maintain overmatch superiority against adversaries. Military operations expose servicemen and women to a variety of stressors including demanding workloads, harsh and dangerous environments, and ambiguity that degrade performance.2,3 The Armies that prevail are the ones composed

∗ Corresponding author. E-mail address: [email protected] (B.C. Nindl).

of resilient individuals who can overcome these challenges and perform with greater agility, tenacity, survivability, and lethality. Military resilience can be defined as the capacity to overcome the negative effects of setbacks and associated stress on military performance and combat effectiveness. Military operational stress can come in many forms via the singular or combined effects of physical exertion, cognitive overload, sleep restriction, energy insufficiency, variations in the operational environments, and emotional and psychological stress. In the volatile, uncertain, complex, and ambiguous (VUCA) contemporary operating environment, both current and future operations demand and place a higher priority on enhancing and sustaining the readiness and resiliency of military service members in order to decisively win in multi-domain battle. According to Ruiz-Casares et al., resilience is a dynamic process involving the interaction between risk and compensatory factors

https://doi.org/10.1016/j.jsams.2018.05.005 1440-2440/© 2018 Sports Medicine Australia. Published by Elsevier Ltd. This is an open access article under the CC BY-NC-ND license (http://creativecommons.org/licenses/ by-nc-nd/4.0/).

B.C. Nindl et al. / Journal of Science and Medicine in Sport 21 (2018) 1116–1124

over the lifespan.4 Despite everyday stressors of poverty, violence, and political instability, a study by Eggerman and Panter-Brick reported that Afghan students and caregivers possess resilience through the belief that adversity can be overcome by adherence to cultural values, life goals, and daily perseverance.5 Such values are also necessary in the military as Warfighters must maintain daily perseverance throughout intense military training and must share common goals to protect and serve at all costs. However, resilience is more than a mindset. Studies have reported that autonomic regulation, measured by heart rate variability, may be an indicator of resiliency and ability to adapt to changing environments.6,7 Furthermore, there is growing evidence that genetic, epigenetic, and neurochemical factors also play a key role in the development of resilience through biological responses to stress.8 Beyond the inter- and intra-personal interactions between the body and the mind, there are many other factors that contribute to resiliency including sex, environment, and physical training. It is well known that a variety of anatomical, physiological and functional differences exist between men and women, including body composition, cardiovascular and musculoskeletal systems, as well as hormonal secretion, that can influence initial functioning as well as subsequent resilience in stressful environments. The environment alone can also have an adverse effect on stress tolerance, regardless of sex. Cold stress limits fine motor dexterity and touch sensitivity9 and has been shown to decrease vigilance and mood and to increase tension.10 Conversely, extreme heat, combined with physical exercise and increased core temperature, can have detrimental effects on cardiovascular and endocrine function that result in decreased performance.11 In addition to sex and environment, physical training has a direct impact on the body’s ability to withstand physical and cognitive stressors.
Lower aerobic power has been associated with an increase in musculoskeletal injuries during basic combat training,12 whereas load carriage and lifting are among the most frequent activities in which musculoskeletal injuries occur during deployment.13 In both instances, the result is reduced capability. Without adequate rest, physical training can diminish cognitive performance. A study comparing overtrained and control athletes demonstrated that overtrained athletes made more mistakes when completing the Stroop Color Word Test.14 Similarly, sleep restriction has been shown to negatively impact soldiers’ reaction times to shoot foe targets during marksmanship tasks.15 Therefore, the selection and training of service members must be used to identify those who can maintain normal physiological and psychological functioning under stress. Such factors demonstrate the complexity of resilience and the need to identify the best means to promote resiliency among military service members. The need is so high that implementing human performance optimisation strategies aimed at enhancing military readiness and lethality has been identified as a top priority for the modernization of future military operations.3,16,17 However, the most effective strategies to enhance resiliency remain unclear. This paper summarizes a roundtable discussion, held at the 4th International Congress on Soldiers’ Physical Performance, assembled to address five key domains of resiliency relative to arduous military roles in a point/counterpoint format: 1) physiological versus psychological resiliency, 2) sex differences, 3) contributions of aerobic and strength training, 4) thermal tolerance, and 5) the role of nature versus nurture.
A panel of ten internationally recognised scientists and practitioners was selected to represent each perspective as follows: Hilde Teien, physiological resilience; Sam Marcora, psychological resilience; Dan Billing, male resilience; Tara Reilly, female resilience; Jace Drain, aerobic training; Herbert Groeller, strength training; Andrew Young, cold tolerance; Nigel Taylor, heat tolerance; Anthony Moffitt, nurturing resilience; Karl


Friedl, the nature of resilience. Each presenter was allotted three minutes to defend their perspective. Bradley Nindl served as the moderator, facilitating questions and discussion following the panel’s presentation.

1.1. Physiological or psychological resilience is most critical for military readiness

1.1.1. Physiological resilience (Hilde K. Teien, Norway)

Employment standards for soldiers primarily evaluate physiological resilience. Consider a sniper’s success in hitting a target: the ability to handle stress has a major impact on performance. However, this factor, which might be trained, is always a secondary consideration after the sniper’s physical performance, which is crucial for success in military operations. In other words, psychological resilience flows from the more fundamental physiological resilience. Soldier physiology underpins all soldier performance. Even psychological performance is determined by physiological mechanisms and neurochemistry. Soldier resilience is shaped by personal habits such as daily physical exercise. Physical exercise improves musculoskeletal and cardiovascular fitness. It also stimulates trophic factors such as insulin-like growth factor 1 (IGF-I) and brain-derived neurotrophic factor (BDNF) with benefits to sustaining muscle and bone health.18 The same factors promote brain neurogenesis and synaptogenesis, and these effects improve psychological resilience, including mood, cognition, and pain thresholds.18 Hence, physiological resilience is the basis for psychological resilience rather than the opposite. In this respect an important research gap is to understand the differential effects of exercise mode and intensity on neurobiology, with resilience outcomes ranging from motivation and cognition to immune function and disease resistance.19,20 Extreme soldier performance, where resilience really counts, generally involves physical and metabolic endurance.
In the Norwegian Ranger School, male and female cadets must perform virtually nonstop for one week with no organized sleep, and limited or no food.21 Metabolic resilience is the determinant of success, where plummeting blood glucose levels for less resilient soldiers can result in physical collapse, and where men may be less resilient than women because of their biology.21 When the soldiers need to perform extreme physical activity in combination with deprivation of sleep and energy intake, basic components of survival, physiological resilience will be the predominant factor for success.22,23 A preponderance of data demonstrates that the ability to adjust to and overcome the effects of military operational stressors such as thermal extremes, high workload, and inadequate rest is influenced by physiological fitness.24,25 These combined stressors can affect a wide range of outcomes related to the ability to perform the military mission. Susceptibility to disease is one outcome that has been well investigated in Norwegian soldiers, where physiological resilience factors such as the ability to mobilize body energy stores moderates immune function.21,26 This has also been demonstrated in the U.S. Army Ranger course.18,27,28 Thermal tolerance in hot environments is significantly influenced by fitness.24 Musculoskeletal injury is also significantly predicted by physical fitness.25 Since musculoskeletal injuries are the leading cause of injury and lost duty time in soldiers, this makes physiological resilience the most important factor in overall soldier readiness. The single major contributor to loss of soldiers from the military is associated with poor physical fitness, including overweight, and psychological resilience is only a subset of this group because of the fundamental importance of a fit body to cognitive readiness. In conclusion, a physiologically resilient soldier will also be happy, motivated, and capable of good decision making under



stress because these are all metabolic functions that depend on physiological resilience.

1.1.2. Psychological resilience (Samuele Marcora, United Kingdom)

Psychological resilience refers to the role of mental processes and behavior in protecting an individual from the potential negative effects of stressors.29 It is widely accepted that psychological resilience is critical for coping with the cognitive, emotional and social stressors associated with war exposure. Psychological resilience is most critical for military readiness because it also plays an important role in coping with physiological stressors, and because a psychologically stressed soldier (i.e. a soldier who cannot cope with psychological stressors) will not perform well during military operations no matter how physiologically capable he/she is. With regard to coping with physiological stressors, scientists have focused on the autonomic, endocrine and immune responses, and autoregulation. However, mental processes and behaviour are also critical to maintain bodily homeostasis when exposed to physiological stressors. For example, coping with physical activity in the heat is not just about sweating and the heat flow from the core to the skin via the blood.30 Education and self-monitoring, as well as pacing and appropriate drinking (behavioural thermoregulation), are also extremely important to optimise performance and prevent exertional heat stroke and hyponatremia.31 Furthermore, physiological stressors have psychological manifestations (e.g. subjective fatigue and thermal discomfort) that add to the psychological burden the soldier has to cope with.
With regard to the effects of psychological stressors on physical performance, a good example is provided by our work on mental fatigue.32 This experimental work has demonstrated that prolonged and demanding cognitive activity reduces performance in subsequent aerobic exercise despite no significant alterations in the physiological factors thought to determine endurance performance, e.g. cardiac output and muscle fatigue. In other words, mental fatigue (via an increase in perceived exertion) reduces endurance performance despite no reduction in the physiological capacity to perform prolonged aerobic exercise. Importantly, we have also produced some evidence that elite endurance athletes are more resilient than amateurs to the negative effects of prolonged and demanding cognitive activity.33 These findings suggest that being psychologically resilient may help soldiers perform better physically as well as cognitively during stressful military operations. In summary, there is evidence suggesting that a psychologically resilient soldier would cope better not only with the psychological stressors associated with war exposure, but also with the physiological stressors associated with military operations. Therefore, psychological resilience has implications not only for the mental health, but also for the physical health, of a soldier. Furthermore, there is now considerable experimental evidence reporting that psychological stressors like mental fatigue can have a negative impact on physical performance and not just on cognitive performance. Therefore, selecting and developing psychologically resilient soldiers would ensure that they can perform optimally during military operations that require both physical and cognitive tasks. For all these reasons, psychological resilience is most critical for military readiness.

1.2. Men or women are more physiologically/psychologically resilient

1.2.1. Men are more resilient (Daniel Billing, Australia)

Women display superior performance in many roles and will continue to be a vital element of an armed force. However, there are

certain roles or assignments where the proportion of men likely to have the requisite physiological resilience to safely and efficiently execute the required duties will be higher than that of women. This position can be explained by discussing the pathway from sex differences to mission accomplishment. Firstly, physiological sex differences in dimensions such as stature, body mass, bone structure and geometry, cardiac output, oxygen extraction, cardiopulmonary endurance, muscle strength and anaerobic capacity, and muscle endurance have been well documented.34–36 Secondly, as a result of these physiological differences, the execution of a single physically demanding task such as load carriage requires less capable personnel to work at a higher percentage of their maximal capacity.34 Thirdly, when physically demanding tasks are performed in series, which is reflective of contemporary operations, cumulative fatigue ensues, resulting in a higher propensity for musculoskeletal injury and/or reduced reserve to respond to emergencies.35 Fourthly, inadequate reserves to respond to emergencies and/or the incapacitation of individual team members due to injury have important implications for small team performance and cohesion. Ultimately, a reduction in the capability and/or capacity of the team may compromise mission accomplishment.37 As my co-authors note, the two dimensions of resilience (physiological and psychological) are intrinsically linked; to help support this thesis, two specific case examples are discussed. The assignments and tasks performed by some Special Forces personnel demand extremely high physiological and psychological resilience. Less physiologically capable personnel will be unsuitable for these roles as they have a reduced reserve above normal work conditions to respond to emergencies and are less resistant to fatigue and injury. Another case is extreme manual handling assignments which demand high physiological resilience.
In many instances, the demands of these tasks are beyond the capacity of many soldiers. Although operational roles or assignments will continue to change with the ever-evolving battlefield, at this point in time the high demands of the more extreme case examples cannot be made easier. As a result, there will remain a lower percentage of women than men who are capable of serving in these occupations. However, through research and implementation of female-specific best-practice training to enhance modifiable characteristics, such as muscular strength, and the introduction of new performance augmentation technologies, such as exoskeletons, the magnitude of the observed sex differences will continue to diminish or potentially become irrelevant and thereby enable more females (and males) to participate fully in these roles.38 Further, a better understanding of the psychological profile of women who are successful in physically strenuous occupations will also assist in providing targeted support. In conclusion, when it comes to some of the most arduous roles in the military, at this present point in time men have a higher physiological capacity and are more resistant to fatigue and injury, and are thereby more resilient than women.

1.2.2. Women are more resilient (Tara Reilly, Canada)

History demonstrates that in times of famine and extreme environmental conditions, women are more likely to survive than men. Assuming resilience equates to survival, women demonstrate lower mortality rates than men at all ages, resulting in women outliving men typically by a 10-year margin. Between 15 and 24 years of age, men are three times more likely to die than women, and most of these male fatalities are self-inflicted, caused by reckless behavior or violence, a finding that is reflected in other male primates as well.39 As men age, their choices continue to propel them towards higher risk of death.
Illnesses related to smoking and alcohol consumption kill more men than women, and in their 40s cardiovascular disease and cancer kill far more males than females.39

B.C. Nindl et al. / Journal of Science and Medicine in Sport 21 (2018) 1116–1124

Specific to the demands of combat, women are better at making logical decisions under stressful conditions, without the negative influence of testosterone on brain areas associated with impulse control and distractibility. Research demonstrates that women in combat roles would result in far fewer accidents, assaults, and cases of fratricide.40 Biomechanically, women have a lower center of gravity, which inherently gives them better balance, useful for hand-to-hand combat, climbing and traversing difficult terrain.41 In terms of mental resilience, after controlling for reports of prior life stressors and sexual harassment during deployment, Vogt et al. reported no gender differences in the association between several types of deployment stressors, including combat exposure, and PTSD.42 In fact, men are five times more likely to use alcohol as a coping mechanism, and to become alcohol dependent or be diagnosed with antisocial personality disorder. In summary, with lower levels of oxygen free radicals, higher body fat, lower need for caloric intake, and better lipid utilization for energy metabolism while sparing muscle protein and glycogen,43 women have a higher average survival rate than men in times of great metabolic stress, like severe famine. Additionally, the greater the severity of the stress, the greater the difference in survival numbers between men and women.44 Women are designed to have children, and evolutionary adaptations to bear children have enabled women to deal better with deprivation.44 These physiological advantages that women have over men for survival in adverse environments remain, and this advantage is further supported by the rapid reduction in the male-to-female gap in athletic performance, a result of the scale-up of athletic programs targeting girls and women.45

1.3. Aerobic or strength training best builds military physiological resilience

1.3.1. Aerobic training best builds military physiological resilience (Jace Drain, Australia)

It is well understood that many military occupational tasks involve prolonged and/or repeated performance, e.g. pack marches, digging, sand-bagging, fire and movement, manual material handling. Typically, cardiovascular endurance underpins the performance of these tasks. An individual with a higher aerobic capacity (VO2max) will therefore be working at a lower relative intensity (%VO2max) when compared to less aerobically fit individuals. A reduced relative task intensity will in turn allow for longer task performance and/or a greater capacity for repeated efforts.46 Beyond occupational performance, aerobic fitness is also strongly correlated with injury rates and attrition during military training.47 In fact, low aerobic fitness is one of the most common risk factors for musculoskeletal injury during military training.12 Aerobic training can also help military personnel buffer the allostatic stress associated with military training. Specifically, aerobic fitness has been associated with attenuated hormonal and subjective stress reactivity in response to military training.48,49 Importantly in a military context, aerobic training helps to moderate reactivity to psychological stressors, even in the absence of physical stress. Furthermore, evidence indicates that aerobic training can help to attenuate age-related increases in hypothalamic-pituitary-adrenal axis reactivity to psychological stress.50 Improved aerobic fitness is also associated with reduced cardiometabolic risk factors and, importantly, can help to attenuate stress-related increases in cardiovascular risk factors.51 In summary, aerobic training can confer an array of benefits to military personnel including increased physical and physiological ability to tolerate occupational task demands, decreased injury risk, improved overall health (including psychological), and enhanced ability to buffer stress.
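The relative-intensity argument above is simple arithmetic, and can be illustrated with a short sketch. The task demand and VO2max values below are invented for illustration and are not drawn from the cited studies:

```python
# Relative task intensity: the same absolute task demand represents a
# smaller fraction of a fitter individual's aerobic capacity.
# All numeric values below are hypothetical.

def relative_intensity(task_vo2: float, vo2max: float) -> float:
    """Return task oxygen demand as a percentage of maximal aerobic capacity."""
    return 100.0 * task_vo2 / vo2max

TASK_VO2 = 25.0  # ml/kg/min, e.g. a loaded pack march (illustrative value)

for label, vo2max in [("less fit soldier", 40.0), ("fitter soldier", 55.0)]:
    pct = relative_intensity(TASK_VO2, vo2max)
    print(f"{label}: working at {pct:.1f}% of VO2max")
```

For these invented values, the same 25 ml/kg/min task costs the fitter soldier about 45% of capacity versus about 63% for the less fit soldier, which is precisely the reserve argument made in the text: the fitter individual can sustain the task longer and retains more headroom for repeated efforts.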
These benefits are realized in both the


short-term (e.g. improved ability to execute a task/mission) and the longer-term (e.g. improved injury resistance and stress buffering during sustained training/deployment, and decreased disease risk). Whilst the requirement for physical conditioning is overt for military personnel in physically demanding roles/occupations (e.g. infantry, artillery), physical fitness should also be considered a tool to manage capability (and resilience) across an ageing and diverse workforce. There can be little doubt that aerobic training is essential to building military physiological resilience. On this basis, the establishment and maintenance of aerobic fitness should be an imperative for military organizations.

1.3.2. Strength training best builds military physiological resilience (Herbert Groeller, Australia)

Physical fitness clearly influences the ability of individuals to manage and adapt well to stressors.52 Higher levels of physical fitness (cardiorespiratory fitness and local muscle endurance) prior to entry into basic combat and severe military training have been associated with a lowered stress response and improved psychological outcomes in soldiers.53,54 However, the optimal type and amount of exercise to facilitate the protective benefits of physical fitness has not yet been established.54 Furthermore, of the range of physical regimens investigated, more is known of the effects of aerobic exercise training on responsiveness to physical and psychological stress. What role, then, might strength training have upon the physiological resilience of soldiers? The characteristics of the stressor are an important consideration, as intermittent exposure to stress with sufficient recovery is known to facilitate toughness and mastery that can provide a protective function for the soldier.
Given that the carriage and lifting of external loads are associated with the highest incidence of injury during deployment,13 intermittent and functional exposure to physical stressors to improve performance in this area would appear to have the greatest utility with respect to physiological resilience. Indeed, the modern-day battlefield requires high-intensity and explosive movement, often with soldiers burdened by the carriage of an external mass; these are physical performance characteristics that benefit from increased muscular strength and power.55,56 Yet there remains a significant bias towards cardiorespiratory endurance training within modern military training regimens.55 However, the incorporation of resistance training to improve physiological resilience in soldiers should be carefully considered. A focus upon physical gains to increase force production capacity or skeletal muscle mass that has poor utility for the essential physical demands of deployment and combat may serve to decrease the physiological resilience of the soldier. Thus, a critical evaluation of the end-state requirements of the soldier should be used to inform the application of resistance training regimens, improving not only muscular strength and power for functionally relevant tasks, but also endurance performance and movement competency and quality. Nonetheless, this strategy in isolation is likely to have limited efficacy with respect to the development of physiological resilience. The totality of the physical stress should be considered: paradoxically, increased absolute physical training loads, when progressively applied, can increase resilience to musculoskeletal injury.57

1.4. Thermal resilience is essential for military preparedness

1.4.1. Cold environmental resilience is most essential for military preparedness (Andrew Young, United States)

Preparations to improve Warfighter tolerance/resilience to cold exposure during military operations are probably more important to undertake than preparations to enhance tolerance of heat stress. For one thing, the likelihood that military forces will be deployed in the cold, northern latitudes for peacekeeping and national security operations is increasing as global warming causes sea lanes in the Arctic Ocean to open, and nations compete for the natural resources in that region.58 Also, the incidence rate of cold injuries59 is much higher than the incidence of heat injuries.60 Most Warfighters and their leaders have prior experience coping with heat-stress conditions during military missions, whereas far fewer have experience with cold-weather operations. Further, it is widely appreciated by military leaders that physiological mechanisms underlying human heat tolerance can be optimized in their Warfighters relatively easily, simply by having them perform increasingly strenuous bouts of physical work in hot-weather conditions for progressively longer periods of time over five to ten consecutive days (i.e., induction of heat acclimatization), and ensuring that they consume adequate amounts of water to maintain homeostasis. In contrast, the primary human physiological responses to cold exposure, shivering and peripheral vasoconstriction, provide little meaningful protection even after induction of cold acclimatization, which is slower to develop and less effective for improving thermal tolerance than heat acclimatization.61 Optimizing behavioral responses to cold is more effective for enhancing cold tolerance/resilience than optimizing physiological responses. Developing optimal behavioral responses to operate effectively in cold conditions without suffering cold injuries will entail learning and practice by the individual Warfighter, as well as specialized clothing and equipment, and will therefore require more time and resources than needed to optimize heat tolerance. Key behavioral responses that must be learned and practiced during cold-weather operations to improve Warfighter tolerance/resilience include understanding how to wear, use and maintain cold-weather protective clothing, shelters, tools, and mobility equipment.
Proper wear of cold-weather protective clothing will be highly variable between and within individuals, depending on weather conditions, physical activity levels and individual anthropometric characteristics.62 Individual Warfighters should be allowed to choose their own clothing combinations to achieve optimal environmental protection. This skill cannot be mastered in a classroom, and requires training in different cold-weather conditions at different activity levels so Warfighters learn to appreciate their own individual requirements.63 Similarly, Warfighters must train to perform their duties wearing their cold-weather clothing and using their weapons and equipment during different cold-weather conditions, so they can appreciate how that clothing and the cold-weather conditions affect their dexterity and ability to function in the environment. Compared to optimizing heat tolerance/resilience, it is essential that Warfighters complete much more extensive experiential learning to develop behavioral responses to cold exposure that optimize environmental tolerance/resilience.

1.4.2. Heat tolerance is an essential part of military preparation (Nigel Taylor, Australia)

Homoeothermic species are vulnerable to climatic extremes that challenge temperature regulation and elicit significant changes in tissue temperatures. Humans are no exception, with those in military and emergency-service occupations facing regular thermal challenges. From a military perspective, operations in both hot and cold extremes are likely, with the probability dictated by national priorities and international obligations. For instance, Asia–Pacific countries routinely prepare for deployment into tropical and equatorial regions. Since humans evolved in hot-dry climates, it may be argued that we are more prone to cold-related injuries, and the evidence supports that proposition.64–66 Cold per se does not exist; it is merely a subjective description assigned to states of lower thermal energy (heat).
Since energy constantly moves from higher to lower energetic states, some solutions to these thermal challenges come in the form of protective

barriers. Designers, manufacturers and procurers of personal protective clothing and equipment for the military and first responders face significant challenges. In the cold, thermal protective clothing must resist heat loss, whilst the influx of thermal energy must be minimised when external temperatures exceed body temperature. Furthermore, protective clothing should enhance heat dissipation during states of high metabolic heat production, regardless of environmental temperature. Thermal problems also challenge physiologists seeking to identify strategies to enhance the tolerance and resilience of warfighters and emergency personnel. With the exception of extreme radiant-heat exposures, it is the deep tissues that are most susceptible during heat stress, giving rise to illnesses ranging from cramps to heat stroke. For the military and first responders, those disorders are associated as much, if not more so, with physical exertion and metabolically derived heat, largely due to occupational requirements mandating the wearing of protective clothing, equipment and body armour. Such ensembles can encapsulate the wearer, particularly during chemical and biological threats, isolating that person from the ambient medium. In that state, limited exchange occurs between the body and the external environment.67 Thus, heat produced within, and fluid lost into, that closed system remains within the protective ensemble, and the microclimate approximates body temperature and rapidly becomes saturated with water vapor. That state, when combined with elevated heat production, is not conducive to prolonged survival, regardless of prior physical and thermal conditioning. Three approaches have been used to minimise the risk of exertional heat illness: heat adapting personnel,68 developing fabrics that facilitate heat and moisture removal67 and supplemental cooling. The first two are beneficial to minimally clothed athletes. 
However, heat adaptation elevates sweat secretion, at least in the short term, most of which remains unevaporated and provides negligible cooling for those wearing protective clothing. Such sweat losses accelerate dehydration and compromise the thermal insulation of the protective clothing. Smart fabrics, if worn beneath protective clothing and equipment, offer no respite.67 The less exotic solution must, just as in the cold, centre on sound educational and managerial practices, in combination with ample experiential opportunities.

1.5. The military can (nurture) or cannot (nature) build and instill physiological/psychological resilience

1.5.1. The military can build and instill physiological and psychological resilience (Anthony Moffitt, Australia)

"Man can act (nurture) only on external and visible characters: nature cares nothing for appearances, except in so far as they may be useful." If Darwin's allusion to the futility of influencing eons of random variations and infinitesimal 'nature' adaptations is correct, "which as far as our ignorance permits" it is, should we consume ourselves with the 'nurture' of man at all? How many citizens would need to be trained to counter a potentially catastrophic threat to Australia – hundreds of thousands? If so, our military will essentially 'get what we get'. We can no longer take for granted the 'hardiness' of past generations given the profound biopsychosocial developmental challenges that the emerging digital-native generation is experiencing.69 Certainly, the brutality of combat is profoundly divorced from a contemporary young westerner's reality. So, how prepared is the current fighting-age generation?70 Building and instilling resilience (nurture) in 'what we get' (nature) is not so much a question as it is a critical vulnerability. A military's first object is to defend its people and territories. We have successfully made soldiers of our citizens throughout history.
B.C. Nindl et al. / Journal of Science and Medicine in Sport 21 (2018) 1116–1124

During WW1 and WW2, pressure for boots on the ground ultimately meant that genetics mattered little. Since then, Australia's commitment to warfare has been modest and safe by comparison, and we have been able to build an enviable military capability. In the face of a potentially catastrophic threat, 'nature' will again be largely irrelevant. In the context of a softening modern Australian society, building resilience in a sustainable lethal capability will be more important than ever. However, the challenge of building resilient, combat-effective soldiers to operate in VUCA (volatile, uncertain, complex and ambiguous) battlefield environments appears to be unprecedented. For example, more sedentary lifestyles and the increase in 'knowledge work' may be conspiring against us from a worst-case national defense perspective. Further, there appears to be an overemphasis on what we put on our soldiers rather than what we put in them.71 The nature/nurture debate has outlasted its usefulness. Technology and developments in pedagogy, psychology and physiology have revolutionized how we 'build' humans: developments we may well leverage deliberately to influence epigenetic factors72 and the biopsychosocial plasticity of our soldiers to assist in the build. Many things we considered fixed in humans are not: for example, our understanding of how social stimuli are translated into physical characteristics in the brain,73 or the significant psychological (cognitive) benefits of physiological training.74–76 Interestingly, both raise questions around the maintenance of resilience. This can be achieved by immersing scientists with our soldiers in what must be reality-based training environments. We must resist the risk aversion of the bureaucratic policy makers who predominate in modern training serials. Indeed, many senior soldiers would agree that this risk aversion is a threat to our soldiers' resilience. It is time to return to the ancients' approach77 of locally coordinated, multidisciplinary, multifaceted 'Human Performance' programs that are well resourced and founded in practitioner-academia alliances.
We have successfully built resilience in our military throughout history, with few exceptions. Through training, and by organizing our citizens into groupings and indoctrinations, we have also built social and national resilience. However, in less certain times we must modernize deliberately and rapidly. Rather than policy, our soldiers need adaptive human performance programs supported by local academic alliances. Our soldiers', and indeed our nation's, resilience profoundly defines us all, and therefore we 'must' build on what we get.

1.5.2. The military cannot build and instill physiological and psychological resilience (Karl Friedl, United States)

Everyone can do their part for national defense, but not everyone is born to be a soldier. Soldier resilience is determined by genetics and early childhood experiences; building resilience, rather than selecting individuals who already possess it, is generally not feasible. By the time 17- or 18-year-old recruits report for duty, the die has been cast and there are practical limits to how much biology can be modified to best meet soldier performance needs. Early influences have been well entrained by the time young men and women report to military training, and the resulting resilience attributes are not easily modifiable. In 1946, the U.S. Congress enacted the school lunch program because of national security concerns: too many chronically malnourished conscripts had been unsuitable for military service in World War II, and the Army could not build or instill resilience in these individuals after the fact. Today, reversing the first two decades of nutrition and exercise habits has also been unsuccessful for obese young men and women; obese recruits who successfully lose weight during basic training are still likely to be eliminated as fitness failures before the end of their first enlistment.78 During basic training, additional selection occurs because trainability genes determine who can achieve minimum physical training standards and continue as a soldier.79


Genetic and epigenetic influences determine resilience factors such as hardiness and metabolic flexibility. In a study of the U.S. Army Ranger course, the two leanest individuals out of 50 young soldiers who completed the full eight weeks of high workload and hypocaloric intake illustrated opposite extremes of metabolic response. One of these soldiers lost the least weight (only 9% of body weight) and relatively little lean mass, while the other lost the most weight (23%) and consumed an estimated 40% of his muscle mass (and was not awarded the Ranger tab, based on patrol leadership performance).80 U.S. Army Rangers are selected on the basis of demonstrated resilience. Epigenetics can determine psychological resilience. Trauma-induced stress responsivity can be passed to offspring, putting these individuals at increased risk for PTSD and other maladaptive responses to future traumatic exposures.81 Other recent findings show that mindful control of anxiety is moderated by the strength of the connection between the prefrontal cortex and the amygdala: trait anxiety is lower in individuals with a thicker fiber tract connecting to the frontal cortex, the center of psychological resilience.82 Gut microbiota also play an important role in stress and anxiety.83 Until there is a program to reverse epigenetic effects or successfully reconfigure the gut microbiome of recruits, these factors affect key resilience traits that should be part of soldier selection. Artificial attempts to enhance soldier performance may actually reduce resilience. For example, pharmaceutical enhancement of alertness removes the flexibility for restorative sleep opportunities, and drug manipulation of myostatin action to create massively muscled hulks reduces the opportunity to run fast and tolerate hot environments.
Armies should select individuals with high-resilience genes from the most promising pools of recruits; warrior cultures such as Sikhs, Gurkhas, New Zealand Māori, and Highland Scots are examples of groups purposely overrepresented in military service. Selection is preferable to extraordinary training, drug, and genetic enhancement of average individuals.

2. Discussion

Resilience is the ability to maintain normal psychological and physiological functioning in the presence of high stress and trauma.8,84 As demonstrated in this roundtable, there are many co-dependent layers to resilience that build upon one another to ultimately enhance military readiness and preparedness (Fig. 1). Resilience is initially instilled within soldiers through training and preparation aimed at enhancing physiological tolerance to stress. Aerobic training has long been the cornerstone of military training due to physiological adaptations, including increased cardiac output, decreased peripheral vascular resistance, and an increased number of mitochondria in muscle cells, that are vital to optimal performance of many military tasks.85 Certainly these adaptations are advantageous in the presence of high physiological stress. However, the modern battlefield requires higher levels of anaerobic fitness, involving high-force and quick explosive movements, and failure to prepare for such demands can lead to increased injury or death.85 As more combat roles become open to women, anaerobic and strength training become increasingly essential for women to develop the adaptations necessary to meet the demands of the battlefield. The discernible physiological differences between men and women can promote and hinder resilience for either sex. Though men have several physiological advantages over women, including higher average cardiac output and muscle strength, testosterone can negatively affect impulse control and decision making in combat.40 In contrast, despite the physiological shortcomings requiring women to perform at a higher percentage of their maximum capacity during some military-specific tasks,


Fig. 1. Five key domains of resilience. Resilience can be promoted through a variety of domains to enhance the readiness, lethality, and modernization of armed forces. While performance is ultimately grounded in cellular biology and physiology, it is enhanced through developing psychological coping mechanisms that nurture soldiers to tolerate discomfort and stress. For years aerobic training has been the cornerstone of military training due to its advantageous physiological adaptations; however, the modern battlefield requires higher levels of strength and anaerobic fitness. Though men have several physiological advantages over women, better lipid utilization provides women a resilience advantage over men in adverse environments. Regardless of sex, soldiers can prepare for extreme heat and cold by training in protective clothing while exposed to the elements.

their decreased caloric requirements and better lipid utilization43 provide a resilience advantage over men in adverse environments. While a soldier's performance capability is ultimately grounded at the cellular level, performance will be suboptimal if the soldier is unable to develop coping mechanisms to handle a changing operational environment. Therefore, building adaptive resilience in soldiers is the next layer necessary to promote military readiness. Considering that the response to the same psychological stressor can vary immensely from person to person, resilience is considered an individual trait.86 However, humans have proven to be a highly adaptable species. Through behavioral adaptations and reality-based training environments, resilience has the potential to be instilled in soldiers, just as it has been historically during wartime with minimal selection criteria for soldiers, i.e. conscription. The Comprehensive Soldier Fitness (CSF) program is one example of how the US Army is taking a proactive approach to building resilience in soldiers.87 Based on positive psychology, the CSF program takes a similar approach to the Army's physical fitness training. Adaptive resilience is not solely based in psychology, as a soldier must also be physically prepared to adapt to extreme climates. A combination of physiological training using specialized equipment for extreme environments, performing tasks while wearing appropriate protective clothing, and exposure to the elements, in conjunction with psychological training such as pacing, self-monitoring, and managing discomfort, is necessary to build resilience in the presence of extreme heat or cold.

Nature versus nurture tradeoffs are completely dependent on the needs of the military. Though some aspects surrounding resilience are solely grounded in nature, such as biological sex, genetic predisposition, and environmental conditions, resilience has the potential to be nurtured through physical and psychological training combined with the use of specialized equipment for extreme conditions. In times of national emergency, every able-bodied individual may be called to serve in defense of their country, and selection standards are eased or eliminated. In conscript armies around the world, individuals are prepared in basic training to do their part for national defense. Professional armies and specialized elite performers are more likely to be selected for their performance, including demonstrated resilience traits. Once the layers of foundational and adaptive resilience have been established, the final layer should aim to reduce the demands for resilience in the modern battlefield to enhance readiness and preparedness. For instance, the application of new technologies, such as the use of exoskeletons for carrying heavy combat loads,88 or innovative approaches to determine readiness, such as biomarker analysis,16 can further enhance the level of preparedness. Furthermore, forecasting the operational environment and building appropriate techniques and countermeasures will optimize the readiness of our soldiers. Ultimately, the interrelations of the layers of resilience indicate there is no singular or even binary approach that is most advantageous for building resilience to enhance military preparedness. Rather, a hybrid approach may be superior. Combining strategies may promote optimal readiness in the face of unanticipated, adverse stressors, allowing service members to be equipped for a variety of scenarios. In doing so, the military can optimize performance in soldiers who are both mentally and physically resilient and equipped with behavioral adaptations to overcome the forces of nature, including physiological predispositions and extreme environmental conditions. While this roundtable focused on the individual soldier, future attention should also be given to team resilience, as military operations are generally executed in small and large units.

Acknowledgements

This research did not receive any specific grant from funding agencies in the public, commercial, or not-for-profit sectors.

References

[1]. Megginson LC. Lessons from Europe for American business. Southwest Soc Sci Q 1963; 44:3–13.
[2]. Nindl BC, Castellani JW, Warr BJ et al. Physiological employment standards III: physiological challenges and consequences encountered during international military deployments. Eur J Appl Physiol 2013; 113(11):2655–2672.
[3]. Nindl BC, Williams TJ, Deuster PA et al. Strategies for optimizing military physical readiness and preventing musculoskeletal injuries in the 21st century. US Army Med Dep J 2013:5–23.
[4]. Ruiz-Casares M, Guzder J, Rousseau C et al. Cultural roots of well-being and resilience in child mental health, In: Handbook of Child Well-Being. Springer, 2014. p. 2379–2407.
[5]. Eggerman M, Panter-Brick C. Suffering, hope, and entrapment: resilience and cultural values in Afghanistan. Soc Sci Med 2010; 71(1):71–83.
[6]. Shaffer F, McCraty R, Zerr CL. A healthy heart is not a metronome: an integrative review of the heart's anatomy and heart rate variability. Front Psychol 2014; 5.
[7]. Thayer JF, Hansen AL, Saus-Rose E et al. Heart rate variability, prefrontal neural function, and cognitive performance: the neurovisceral integration perspective on self-regulation, adaptation, and health. Ann Behav Med 2009; 37(2):141–153.
[8]. Wu G, Feder A, Cohen H et al. Understanding resilience. Front Behav Neurosci 2013; 7:10.
[9]. Muza S, Roussel M. Fit, Nourished and Resilient, Army AL&T, 2018, 151–155.
[10]. Lieberman HR, Castellani JW, Young AJ. Cognitive function and mood during acute cold stress after extended military training and recovery. Aviat Space Environ Med 2009; 80(7):629–636.
[11]. Sawka MN, Leon LR, Montain SJ et al. Integrated physiological mechanisms of exercise performance, adaptation, and maladaptation to heat stress. Compr Physiol 2011; 1(4):1883–1928.
[12]. Knapik JJ, Sharp MA, Canham-Chervak M et al. Risk factors for training-related injuries among men and women in basic combat training. Med Sci Sports Exerc 2001; 33(6):946–954.
[13]. Roy TC, Knapik JJ, Ritland BM et al. Risk factors for musculoskeletal injuries for soldiers deployed to Afghanistan. Aviat Space Environ Med 2012; 83(11):1060–1066.
[14]. Hynynen E, Uusitalo A, Konttinen N et al. Cardiac autonomic responses to standing up and cognitive task in overtrained athletes. Int J Sports Med 2008; 29(7):552–558.
[15]. Smith CD, Cooper AD, Merullo DJ et al. Sleep restriction and cognitive load affect performance on a simulated marksmanship task. J Sleep Res 2017. http://dx.doi.org/10.1111/jsr.12637. Epub ahead of print.
[16]. Nindl BC, Jaffin DP, Dretsch MN et al. Human performance optimization metrics: consensus findings, gaps, and recommendations for future research. J Strength Cond Res 2015; 29(Suppl. 11):S221–S245.
[17]. Maze R, Cavallaro G. Battling bureaucracy: the way forward requires modernizing the modernization process. Army Mag 2016; 682018:36–38.
[18]. Friedl KE, Breivik TJ, Carter 3rd R et al. Soldier health habits and the metabolically optimized brain. Mil Med 2016; 181(11):e1499–e1507.
[19]. Nindl BC, Alemany JA, Tuckow AP et al. Effects of exercise mode and duration on 24-h IGF-I system recovery responses. Med Sci Sports Exerc 2009; 41(6):1261–1270.
[20]. Nindl BC, Pierce JR, Rarick KR et al. Twenty-hour growth hormone secretory profiles after aerobic and resistance exercise. Med Sci Sports Exerc 2014; 46(10):1917–1927.
[21]. Hoyt RW, Opstad PK, Haugen AH et al. Negative energy balance in male and female rangers: effects of 7 d of sustained exercise and food deprivation. Am J Clin Nutr 2006; 83(5):1068–1075.
[22]. Millet GY, Tomazin K, Verges S et al. Neuromuscular consequences of an extreme mountain ultra-marathon. PLoS One 2011; 6(2):e17059.
[23]. Temesi J, Arnal PJ, Rupp T et al. Are females more resistant to extreme neuromuscular fatigue? Med Sci Sports Exerc 2015; 47(7):1372–1382.
[24]. Cheung SS, McLellan TM. Heat acclimation, aerobic fitness, and hydration effects on tolerance during uncompensable heat stress. J Appl Physiol (1985) 1998; 84(5):1731–1739.


[25]. Jones BH, Bovee MW, Harris 3rd JM et al. Intrinsic risk factors for exercise-related injuries among male and female army trainees. Am J Sports Med 1993; 21(5):705–710.
[26]. Boyum A, Wiik P, Gustavsson E et al. The effect of strenuous exercise, calorie deficiency and sleep deprivation on white blood cells, plasma immunoglobulins and cytokines. Scand J Immunol 1996; 43(2):228–235.
[27]. Martinez-Lopez LE, Friedl KE, Moore RJ, Kramer TR. A longitudinal study of infections and injuries of ranger students. Mil Med 1993; 158(7):433–437.
[28]. Friedl K. Military studies and nutritional immunology: undernutrition and susceptibility to illness, In: Diet and Human Immune Function. New York, Humana Press, 2004. p. 381–396.
[29]. Fletcher D, Sarkar M. A grounded theory of psychological resilience in Olympic champions. Psychol Sport Exerc 2012; 13(5):669–678.
[30]. Sawka MN, Wenger CB, Young AJ et al. Physiological responses to exercise in the heat, In: Nutritional Needs in Hot Environments: Applications for Military Personnel in Field Operations. Washington (DC), National Academies Press (US), 1993. p. 55.
[31]. Epstein Y, Druyan A, Heled Y. Heat injury prevention—a military perspective. J Strength Cond Res 2012; 26(Suppl. 2):S82–S86.
[32]. Marcora SM, Staiano W, Manning V. Mental fatigue impairs physical performance in humans. J Appl Physiol (1985) 2009; 106(3):857–864.
[33]. Martin K, Staiano W, Menaspa P et al. Superior inhibitory control and resistance to mental fatigue in professional road cyclists. PLoS One 2016; 11(7):e0159907.
[34]. Epstein Y, Yanovich R, Moran DS et al. Physiological employment standards IV: integration of women in combat units physiological and medical considerations. Eur J Appl Physiol 2013; 113(11):2673–2690.
[35]. Roberts D, Gebhardt DL, Gaskill SE et al. Current considerations related to physiological differences between the sexes and physical employment standards. Appl Physiol Nutr Metab 2016; 41(6 Suppl. 2):S108–S120.
[36]. Nindl BC, Jones BH, Van Arsdale SJ et al. Operational physical performance and fitness in military women: physiological, musculoskeletal injury, and optimized physical training considerations for successfully integrating women into combat-centric military occupations. Mil Med 2016; 181(1 Suppl):50–62.
[37]. Moore KM. Ground Combat Element Integrated Task Force Experimental Assessment Report, 2015.
[38]. Gabbay FH, Ursano RJ, Norwood A et al. Sex Differences, Stress, and Military Readiness, Uniformed Services Univ of the Health Sciences, Bethesda MD, Dept of Psychiatry, 1996.
[39]. Perls TT, Fretts RC. Why women live longer than men—what gives women the extra years? Sci Am 1998;(2):100–103.
[40]. Johnson RF, Merullo DJ. Friend-foe discrimination, caffeine, and sentry duty. Paper presented at: Proceedings of the Human Factors and Ergonomics Society Annual Meeting 1999.
[41]. Shephard RJ. Exercise and training in women, part I: influence of gender on exercise and training responses. Can J Appl Physiol 2000; 25(1):19–34.
[42]. Vogt D, Vaughn R, Glickman ME et al. Gender differences in combat-related stressors and their association with postdeployment mental health in a nationally representative sample of U.S. OEF/OIF veterans. J Abnorm Psychol 2011; 120(4):797–806.
[43]. Friedl KE. Biases of the incumbents—what if we were integrating men into an all women's army? Mil Rev 2016; 96(2):69–75.
[44]. Brody J. Sex and the survival of the fittest: calamities are a disaster for men. The New York Times. April 24, 1996.
[45]. Kenney WL, Wilmore JH, Costill DL. Physiology of Sport and Exercise, 5th edition, Champaign, IL, Human Kinetics, 2012.
[46]. Astrand PO, Rodahl K. Textbook of Work Physiology: Physiological Bases of Exercise, 3rd edition, New York, McGraw-Hill, 1986.
[47]. Pope RP, Herbert R, Kirwan JD et al. Predicting attrition in basic military training. Mil Med 1999; 164(10):710–714.
[48]. Taylor MK, Markham AE, Reis JP et al. Physical fitness influences stress reactions to extreme military training. Mil Med 2008; 173(8):738–742.
[49]. Tyyska J, Kokko J, Salonen M et al. Association with physical fitness, serum hormones and sleep during a 15-day military field training. J Sci Med Sport 2010; 13(3):356–359.
[50]. Traustadottir T, Bosch PR, Matt KS. The HPA axis response to stress in women: effects of aging and fitness. Psychoneuroendocrinology 2005; 30(4):392–402.
[51]. Gerber M, Borjesson M, Ljung T et al. Fitness moderates the relationship between stress and cardiovascular risk factors. Med Sci Sports Exerc 2016; 48(11):2075–2081.
[52]. Southwick SM, Charney DS. The science of resilience: implications for the prevention and treatment of depression. Science 2012; 338(6103):79–82.
[53]. Crowley SK, Wilkinson LL, Wigfall LT et al. Physical fitness and depressive symptoms during army basic combat training. Med Sci Sports Exerc 2015; 47(1):151–158.
[54]. Silverman MN, Deuster PA. Biological mechanisms underlying the role of physical fitness in health and resilience. Interface Focus 2014; 4(5):20140040.
[55]. Kraemer WJ, Szivak TK. Strength training for the warfighter. J Strength Cond Res 2012; 26(Suppl. 2):S107–S118.
[56]. Nindl BC, Alvar BA, Dudley JR et al. Executive summary from the National Strength and Conditioning Association's second blue ribbon panel on military physical readiness: military physical performance testing. J Strength Cond Res 2015; 29(Suppl. 11):S216–S220.
[57]. Gabbett TJ. The training-injury prevention paradox: should athletes be training smarter and harder? Br J Sports Med 2016; 50(5):273–280.
[58]. Goldenberg S. Pentagon: global warming will change how US military trains and goes to war. The Guardian, 2014.


[59]. Harirchi I, Arvin A, Vash JH et al. Frostbite: incidence and predisposing factors in mountaineers. Br J Sports Med 2005; 39(12):898–901, discussion 901.
[60]. Armed Forces Health Surveillance Branch. Update: heat injuries, active component, U.S. Army, Navy, Air Force, and Marine Corps, 2015. MSMR 2016; 23(3):16–19.
[61]. Castellani JW, Young AJ. Human physiological responses to cold exposure: acute responses and acclimatization to prolonged exposure. Auton Neurosci 2016; 196:63–74.
[62]. Sawka MN, Castellani JW, Cheuvront SN et al. Physiologic systems and their responses to conditions of heat and cold, in ACSM's Advanced Exercise Physiology, Farrell PA, Joyner MJ, Caiozzo VJ, editors, Baltimore, Wolters Kluwer/Lippincott Williams & Wilkins, 2012, p. 567–602.
[63]. Headquarters, Department of the Army. TB MED 508, Prevention and Management of Cold-Weather Injuries. Washington, DC, 2005.
[64]. Golden FS, Francis TJ, Gallimore D et al. Lessons from history: morbidity of cold injury in the Royal Marines during the Falklands Conflict of 1982. Extrem Physiol Med 2013; 2(1):23.
[65]. Kazman JB, Purvis DL, Heled Y et al. Women and exertional heat illness: identification of gender-specific risk factors. US Army Med Dep J 2015:58–66.
[66]. Armed Forces Health Surveillance Branch. Update: heat illness, active component, U.S. Armed Forces, 2016. MSMR 2017; 24(3):9–13.
[67]. Taylor NA. Overwhelming physiological regulation through personal protection. J Strength Cond Res 2015; 29(Suppl. 11):S111–S118.
[68]. Taylor NA. Human heat adaptation. Compr Physiol 2014; 4(1):325–365.
[69]. Twenge JM, Park H. The decline in adult activities among U.S. adolescents, 1976–2016. Child Dev 2017. http://dx.doi.org/10.1111/cdev.12930. Epub ahead of print.
[70]. Episode 16 – Achieving Tactical Overmatch with MG (R) Robert Scales [Internet]; November 18, 2016. Podcast: 00:37:14. Available from: http://modernwarinstitute.libsyn.com/podcast.
[71]. Episode 24 – Physical Fitness and National Security with Lt. Gen. (Ret) Mark Hertling [Internet]; April 26, 2017. Podcast: 00:32:29. Available from: http://modernwarinstitute.libsyn.com/podcast.
[72]. Waterland RA, Jirtle RL. Transposable elements: targets for early nutritional effects on epigenetic gene regulation. Mol Cell Biol 2003; 23(15):5293–5300.

[73]. Doidge N. The Brain That Changes Itself: Stories of Personal Triumph from the Frontiers of Brain Science, Carlton North, Victoria, Penguin, 2007.
[74]. Schuch FB, Vancampfort D, Richards J et al. Exercise as a treatment for depression: a meta-analysis adjusting for publication bias. J Psychiatr Res 2016; 77:42–51.
[75]. Schuch FB, Vancampfort D, Rosenbaum S et al. Exercise improves physical and psychological quality of life in people with depression: a meta-analysis including the evaluation of control group response. Psychiatry Res 2016; 241:47–54.
[76]. Chapman SB, Aslan S, Spence JS et al. Distinct brain and behavioral benefits from cognitive vs. physical training: a randomized trial in aging adults. Front Hum Neurosci 2016; 10:338.
[77]. Hodkinson S. Agoge, in Oxford Classical Dictionary, Hornblower S, editor, Oxford, Oxford University Press, 1996.
[78]. Friedl KE, Vogel JA, Bove MW, Jones BH. Assessment of body weight standards in male and female Army recruits. Natick, MA, US Army Research Institute of Environmental Medicine, 1989:97.
[79]. Bouchard C. Genomic predictors of trainability. Exp Physiol 2012; 97(3):347–352.
[80]. Friedl KE, Moore RJ, Martinez-Lopez LE et al. Lower limit of body fat in healthy active men. J Appl Physiol (1985) 1994; 77(2):933–940.
[81]. Yehuda R, Daskalakis NP, Bierer LM et al. Holocaust exposure induced intergenerational effects on FKBP5 methylation. Biol Psychiatry 2016; 80(5):372–380.
[82]. Kim MJ, Whalen PJ. The structural integrity of an amygdala-prefrontal pathway predicts trait anxiety. J Neurosci 2009; 29(37):11614–11618.
[83]. Cryan JF, Dinan TG. Mind-altering microorganisms: the impact of the gut microbiota on brain and behaviour. Nat Rev Neurosci 2012; 13(10):701–712.
[84]. Russo SJ, Murrough JW, Han MH et al. Neurobiology of resilience. Nat Neurosci 2012; 15(11):1475–1484.
[85]. Friedl KE, Knapik JJ, Hakkinen K et al. Perspectives on aerobic and strength influences on military physical readiness: report of an international military physiology roundtable. J Strength Cond Res 2015; 29(Suppl. 11):S10–S23.
[86]. Reichmann F, Holzer P. Neuropeptide Y: a stressful review. Neuropeptides 2016; 55:99–109.
[87]. Cornum R, Matthews MD, Seligman ME. Comprehensive soldier fitness: building resilience in a challenging institutional context. Am Psychol 2011; 66(1):4–9.
[88]. Letendre LA. Women warriors: why the robotics revolution changes the combat equation. Prism: J Center Complex Oper 2016; 6(1):90.

Journal of Science and Medicine in Sport 21 (2018) 1125–1130


International consensus on military research priorities and gaps — Survey results from the 4th International Congress on Soldiers' Physical Performance

Mita Lovalekar a,∗, Marilyn A. Sharp b, Daniel C. Billing c, Jace R. Drain c, Bradley C. Nindl a, Edward J. Zambraski b

a Department of Sports Medicine and Nutrition, School of Health and Rehabilitation Sciences, University of Pittsburgh, USA
b US Army Research Institute of Environmental Medicine, USA
c Land Division, Defense Science & Technology Group, Australia

Article info

Article history:
Received 8 March 2018
Received in revised form 25 May 2018
Accepted 29 May 2018
Available online 6 June 2018

Keywords: Military personnel; Occupational health; Physical fitness; Injuries; Surveys and questionnaires

Abstract

Objectives: The objectives of this study were to identify perceived priorities related to military personnel's health and physical performance among attendees at the 4th International Congress on Soldiers' Physical Performance (ICSPP), and to determine if perceived priorities had changed between the 3rd ICSPP survey held in 2014 and the 4th ICSPP survey held in 2017.

Design: Electronic survey.

Methods: Respondents were asked to grade priority areas on a Likert scale, and average ratings were used to rank priority areas. Responses to free-text questions were analyzed qualitatively. Responses to the 4th ICSPP survey were described and compared to responses to the 3rd ICSPP survey.

Results: The 4th ICSPP survey respondents were a diverse group (40.6% military, 58.9% civilian). The two most important priority areas identified were physical demands in operational environments (mean score = 4.41/5) and measuring physical performance/fitness (4.38/5), which were also the top two areas in the 3rd ICSPP survey. There was remarkable overlap in the rankings of priority areas between the two surveys. Sleep and nutrition were emerging priority areas and were perceived as relatively more important in the 4th ICSPP survey compared to the 3rd ICSPP survey. The greatest perceived emerging threat was resilience/psychological fitness of recruits (4.16/5). Physiological status monitoring (2.79/4) was identified as the most important technology.

Conclusions: Despite the diverse backgrounds of the respondents, there was a clear continuing consensus about perceived important priority areas influencing military personnel's health and physical performance. Soldier resiliency and assessment of physiological status were research topics identified as top priorities.

© 2018 Sports Medicine Australia. Published by Elsevier Ltd. This is an open access article under the CC BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/).
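The ranking procedure described in Methods (Likert ratings averaged, then used to order priority areas) can be sketched in a few lines. The area names and individual ratings below are invented for illustration; they are not the survey's actual data.

```python
from statistics import mean

# Hypothetical Likert ratings (1-5) for three priority areas.
ratings = {
    "Physical demands in operational environments": [5, 4, 5, 4, 4],
    "Measuring physical performance/fitness": [4, 5, 4, 4, 4],
    "Sleep": [3, 4, 4, 3, 4],
}

# Rank the areas by mean rating, highest first (the ordering the survey reports).
ranked = sorted(ratings, key=lambda area: mean(ratings[area]), reverse=True)

for position, area in enumerate(ranked, start=1):
    print(f"{position}. {area}: {mean(ratings[area]):.2f}/5")
```

The same mean-then-sort logic applies regardless of how many respondents or priority areas there are; ties would need an explicit tie-breaking rule, which the paper does not describe.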

1. Introduction

The physical demands1–3 and conditions under which military personnel perform are unique. Most countries and militaries have scientists, clinicians and human performance personnel researching factors that affect military personnel’s performance, readiness, resiliency and health. This applied research needs to address a wide array of factors, in both the training and operational environments. There is a need for innovative strategies to improve physical fitness, physical training, and resiliency, and to prevent injuries.4

∗ Corresponding author. E-mail address: [email protected] (M. Lovalekar).

The 3rd International Congress on Soldiers’ Physical Performance (ICSPP) was held in Boston, USA, in 2014.5 The ICSPP focused on applied human performance research and was attended by experts in military performance, clinicians and military personnel from all over the world. The ICSPP presents a unique opportunity to assess priority research topics for the military and other physically demanding occupations. A survey was administered to registrants to assess their opinions about perceived gaps and priority topics related to military health and physical performance. The two most important priority topics identified were physical demands in operational environments, and measuring physical performance/fitness.6 Previous studies have also emphasized these two priority areas.7,8 An analysis of data from the Total Army Injury and Health Outcomes Database (TAIHOD) showed that soldiers

https://doi.org/10.1016/j.jsams.2018.05.028 1440-2440/© 2018 Sports Medicine Australia. Published by Elsevier Ltd. This is an open access article under the CC BY-NC-ND license (http://creativecommons.org/licenses/ by-nc-nd/4.0/).


M. Lovalekar et al. / Journal of Science and Medicine in Sport 21 (2018) 1125–1130

with heavier demands were at increased risk for musculoskeletal and on-duty injury hospitalizations, and disability, compared to those with light demands.7 Also, poor performance on health- and skill-related fitness assessments has been shown to be associated with higher injury risk,9 but skill-related fitness components are not routinely assessed in military populations.10 The 4th ICSPP was held in Melbourne, Australia, in 2017. Similar to the 3rd ICSPP, a survey was completed by registrants of the 4th ICSPP. The primary purpose of this study was to describe gaps, needs and priorities related to military personnel’s health and physical performance, as identified in the survey by experts attending the 4th ICSPP. The secondary purposes were to determine whether perceived research priorities had changed between the 3rd and 4th ICSPP surveys, whether the demographics of the respondents influenced their value judgments, and to obtain new information on emerging threats and technologies relevant to military human performance research.

2. Methods

Detailed methodology and results for the 3rd ICSPP survey have been described previously.6 A link to an electronic survey was sent to registrants of the 4th ICSPP before the conference. The survey was estimated to take about 30 min to complete. All but four respondents completed the survey before the conference began; the remaining four completed it while the conference was ongoing. The project was reviewed by the institutional review board of the University of Pittsburgh and designated as exempt research. The 4th ICSPP survey had five sections: (1) respondent information; (2) general questions: research; (3) emerging problems, challenges and threats; (4) technologies influencing soldiers’ physical performance; and (5) suggested research topics. Respondent information included country, military or civilian status, military branch (if relevant), highest degree obtained, years of post-education experience, and position/expertise. The second section included questions about 43 general areas of research important to military personnel’s physical performance. The third section included questions about the potential emerging or future importance of nine research areas. The fourth section included eight newly developed technologies. The last section included open-ended questions about suggested research topics. The questions in the 4th ICSPP survey were similar to those in the 3rd ICSPP survey,6 though some additional questions were included. The sections on emerging problems, challenges, and threats, and on technologies influencing soldiers’ physical performance, were not included in the 3rd ICSPP survey and were added to the 4th ICSPP survey. The 3rd ICSPP survey contained 37 research priority areas, while the 4th ICSPP survey contained 43 topics. New priority areas added to the 4th ICSPP survey included suicides, opioid use/substance abuse, optimism vs helplessness, hostility, and coping skills.
Two questions were modified: (1) “cognitive function: resilience” was changed to “psychosocial subjective measures: resilience/grit/hardiness”; and (2) “cognitive function: motivation” was changed to “psychosocial subjective measures: motivation”. Also, the topic “effects of deployment” was split into two topics: effects of deployment: emotional, and effects of deployment: physical. For sections 2 and 3, respondents were asked to grade each area on a Likert scale from 1 to 5 (“1” = unnecessary, do not consider; “5” = highest possible importance/priority). Respondents were asked to rate section 4 questions on a Likert scale from 1 to 4 (“1” = not being utilized, low potential or interest for future use; “4” = fully developed and being utilized widely). To encourage unbiased responses, respondents were told to provide their own individual

responses, and that no information obtained would be interpreted as an official position of a military or country. Statistical analysis was conducted using IBM SPSS Statistics 24 (IBM Inc., Armonk, NY). Average ratings (mean scores) were used to rank research areas, separately for each section. The numbers of highest priority responses (5’s for sections 2 and 3, and 4’s for section 4) were counted separately for each section and reported as absolute frequencies (counts). The counts of highest priority responses are reported after the mean score for each question within the respective sections. For the top 10 priority areas in the 4th ICSPP survey, chi-square tests were used to assess whether the proportion of respondents who gave the highest priority score differed between the top two priority areas and the remaining eight priority areas. Responses to free text questions were analyzed qualitatively. Responses were read multiple times to develop an understanding of the underlying themes. An a priori theoretical framework was not used; instead, a process of inductive reasoning was used to identify themes and patterns in the responses.11 The top 10 priority areas based on mean scores and relative rankings were compared between the 3rd and 4th ICSPP surveys. To better understand the areas of research focus of conference presenters, abstracts presented at the 3rd and 4th ICSPP were categorized into 10 broad topics: fitness, testing and performance; musculoskeletal injuries; psychological and cognitive factors; nutrition; environmental factors; deployment considerations; sleep; load carriage; equipment and human factors; and gender integration and leadership.
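The summary statistics described above (mean score, count of 5’s, ranking, and a 2x2 chi-square on the proportion of highest-priority ratings) can be sketched in a few lines. This is a minimal illustration with made-up Likert data, not the survey’s actual responses, and it uses a hand-rolled chi-square statistic with Yates continuity correction in place of SPSS:

```python
from statistics import mean

# Hypothetical Likert ratings (1-5) for three survey topics (n = 10 each).
# Illustrative numbers only; these are NOT the published survey data.
ratings = {
    "Physical demands in operational environments": [5, 5, 4, 5, 4, 5, 3, 5, 4, 5],
    "Measuring physical performance/fitness": [5, 4, 5, 5, 4, 4, 5, 5, 3, 5],
    "Sleep": [4, 3, 5, 4, 4, 5, 3, 4, 4, 3],
}

# Rank topics by mean score (descending) and count "5" responses per topic,
# mirroring the paper's summary statistics (mean score, number of 5's).
summary = sorted(
    ((topic, round(mean(r), 2), r.count(5)) for topic, r in ratings.items()),
    key=lambda row: row[1],
    reverse=True,
)
for topic, mean_score, n_fives in summary:
    print(f"{topic}: mean {mean_score}, {n_fives} fives")

def chi2_yates(a5, a_not5, b5, b_not5):
    """2x2 chi-square statistic (Yates continuity correction) comparing the
    proportion of highest-priority (5) ratings between two topics."""
    table = [[a5, a_not5], [b5, b_not5]]
    n = a5 + a_not5 + b5 + b_not5
    row = [a5 + a_not5, b5 + b_not5]
    col = [a5 + b5, a_not5 + b_not5]
    stat = 0.0
    for i in range(2):
        for j in range(2):
            expected = row[i] * col[j] / n
            stat += (abs(table[i][j] - expected) - 0.5) ** 2 / expected
    return stat

a = ratings["Physical demands in operational environments"]
b = ratings["Sleep"]
chi2 = chi2_yates(a.count(5), len(a) - a.count(5), b.count(5), len(b) - b.count(5))
# Compare against the chi-square critical value (3.84 at alpha = 0.05, df = 1).
print(f"chi2 = {chi2:.3f}")
```

In practice any statistics package (SPSS, as used here, or scipy.stats.chi2_contingency) computes the same statistic along with its p-value.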

3. Results

The greatest number of attendees at the 3rd ICSPP were from the United States (53.7%). Of the 313 3rd ICSPP attendees who received the survey, 140 responded (response rate: 44.7%).6 Detailed results from the 3rd ICSPP survey have been published elsewhere.6 The 4th ICSPP had 502 attendees from 32 countries. The online survey was completed by 224 respondents from 28 countries (44.6% response rate). Most respondents (36.6%) were from Australia, followed by the United States (18.8%), Brazil (4.5%) and Sweden (4.5%). Of the 224 respondents, 40.6% were military members and 58.9% were civilians. A large proportion (73.2%) were directly employed or contracted by a military organization, while 26.8% were not. Of the respondents employed or contracted by a military organization, 57.9% were affiliated with the Army, 2.4% with the Navy, 9.1% with the Air Force, and 18.3% with Tri-service. Of the 60 respondents who were not directly employed or contracted by a military organization, 75.0% indicated that they were associated with an organization/profession where “physical performance”, or the assessment of it, was important. The largest proportion of respondents to the 4th ICSPP survey (38.8%) had earned a doctoral degree, 32.6% had a graduate degree, 16.5% had a college degree, and 3.6% were physicians. Respondents had an average of 14.5 years (standard deviation: 9.4 years) of post-education professional experience. Respondents were asked about their position/expertise and could choose more than one option. Most (61.2%) identified their position as a scientist/researcher. Others were service members (soldier/airman/sailor, 26.8%), administrators (12.1%), clinicians (10.3%), students (8.5%), or had “other” positions (16.1%). Average scores for the 43 topics under general questions: research are presented in Table 1, in descending order of mean score.
Physical demands in operational environments had the highest mean score (4.41), and 122 respondents had identified this topic as being of the highest possible importance and assigned it a score


Table 1
Respondents’ priority scores for 43 general questions about research (224 respondents), 4th International Congress on Soldiers’ Physical Performance, Melbourne, Australia, 2017.

Topic area under general questions: research | Mean score(a) | Number of 5’s(a) (highest possible importance/priority)
Physical demands in operational environments | 4.41 | 122
Measuring physical performance/fitness | 4.38 | 124
Sleep | 4.28 | 99
Physical training programs: strength | 4.27 | 94
Physical training programs: endurance | 4.23 | 97
Soldiers’ load impairing performance | 4.21 | 86
Nutrition | 4.19 | 91
Physical demands in training environments | 4.18 | 89
Musculoskeletal injuries: injury mitigation programs | 4.17 | 99
Musculoskeletal injuries: overuse injuries | 4.15 | 91
Soldiers’ equipment impairing performance | 4.13 | 79
Physical employment standards | 4.12 | 98
Physical training programs: high intensity programs | 4.09 | 77
Fitness of recruits | 4.08 | 87
Psychosocial subjective measures: resilience/grit/hardiness | 3.95 | 68
Clinical research: treatment of injuries that impair performance | 3.92 | 66
Effects of deployment(s): emotional (e.g. PTSD) | 3.91 | 69
Effects of deployment(s): physical (disability/pain) | 3.89 | 59
Environmental factors: heat | 3.89 | 55
Psychosocial subjective measures: coping skills | 3.83 | 60
Musculoskeletal injuries: stress fractures | 3.82 | 58
Epidemiology (characterizing individuals, tracking pertinent medical information, injuries) | 3.81 | 52
Clinical research: diagnosing injuries that impair performance | 3.78 | 54
Clinical research: basis for return to duty decisions | 3.76 | 54
Psychosocial subjective measures: motivation | 3.72 | 47
Body composition of recruits | 3.72 | 52
Environmental factors: cold | 3.68 | 45
Musculoskeletal injuries: acute injuries | 3.66 | 36
Male vs female performance | 3.63 | 57
Computational/predictive modeling of performance | 3.58 | 38
Gender integration within the military | 3.50 | 58
Environmental factors: altitude | 3.41 | 33
Endocrinological factors and performance | 3.34 | 27
Clinical research: suicides | 3.34 | 51
Supplements: legal | 3.31 | 34
Psychosocial subjective measures: optimism vs helplessness | 3.31 | 24
Clinical research: concussions | 3.26 | 32
Performance of national guard/reserve troops (leave blank if no national guard/reserve troops) | 3.20 | 24
Genetics (selection for performance, injury potential) | 3.05 | 19
Use of NSAIDs (non-steroidal anti-inflammatory drugs) | 2.97 | 12
Clinical research: opioid use/substance abuse | 2.95 | 24
Supplements: illegal | 2.92 | 25
Psychosocial subjective measures: hostility | 2.85 | 9

(a) Priority score options: 1 = unnecessary, do not consider; 2 = low priority; 3 = moderate priority; 4 = high priority; 5 = highest possible importance/priority.

of 5 out of 5. Measuring physical performance/fitness was ranked second, with a mean score of 4.38 and 124 respondents rating it as 5 out of 5. An additional 12 topics received mean scores greater than 4. Among the top 10 topics, the results of chi-square tests showed that the proportion of respondents who rated each of the top two topics as most important (rating of 5) was significantly higher than the proportion who rated each of the other eight priority areas as most important (p-values: < 0.001 to 0.030). There was no difference in the proportion of respondents who rated the top two topics as most important (p = 0.849). Respondents’ ratings were analyzed separately for the two most common positions/expertise: scientist/researcher and soldier/airman/sailor. Measuring physical performance/fitness had the highest mean score (4.43) among scientists/researchers, while sleep was rated highest (4.50) among soldiers/airmen/sailors. Among the two countries with the highest numbers of respondents, sleep was rated highest by respondents from Australia (4.27), while physical demands in operational environments was rated highest by respondents from the USA (4.59). The top 10 priority areas based on mean scores were compared between the respondents at the 3rd ICSPP and the 4th ICSPP. The

two top ranked priority areas were the same between the two surveys — physical demands in operational environments and measuring physical performance/fitness. There was remarkable overlap of priority areas and eight topics were included among the top 10 topics in both surveys (Table 2). Interestingly, sleep and nutrition were two new priority areas in the top 10 in the 4th ICSPP survey. In contrast, high-intensity programs and physical employment standards were in the top 10 in the 3rd ICSPP survey, but not in the 4th ICSPP survey. Respondents to the 4th ICSPP survey were asked to rate nine emerging problems, challenges, and threats on a scale from 1 to 5. Resilience or psychological fitness of recruits was identified as the most important (mean score = 4.16, number of 5’s = 80). The other eight topics listed in descending order of mean score were: physical fitness of recruits (mean score = 4.11, number of 5’s = 85), declining appreciation of the value of soldiers’ physical performance (3.72, 51), post-traumatic stress disorder (3.66, 55), declining recruit population (3.44, 30), depression (3.39, 26), attrition/separation (3.05, 12), opioid use and substance abuse (2.80, 12), and incorporation of transgender soldiers (2.44, 10). On further analysis of responses by positions/expertise and country, resilience or psychological fitness of recruits was rated highest among scientist/researcher (mean


Table 2
Comparing top 10 ranks based on mean scores between the 3rd and 4th International Congress on Soldiers’ Physical Performance, held in 2014 (140 respondents) and 2017 (224 respondents), respectively.

Rank | 2014: Research topic (37 topics)(a) | Mean score(b) | 2017: General questions: research (43 topics) | Mean score(b)
1 | Physical demands in operational environments | 4.34 | Physical demands in operational environments | 4.41
2 | Measuring physical performance/fitness | 4.16 | Measuring physical performance/fitness | 4.38
3 | Musculoskeletal injuries: injury mitigation programs | 4.16 | Sleep | 4.28
4 | Physical training programs: strength | 4.12 | Physical training programs: strength | 4.27
5 | Musculoskeletal injuries: overuse injuries | 4.10 | Physical training programs: endurance | 4.23
6 | Physical training programs: high-intensity programs | 4.07 | Soldiers’ load impairing performance | 4.21
7 | Physical demands in training environments | 4.06 | Nutrition | 4.19
8 | Physical employment standards | 4.00 | Physical demands in training environments | 4.18
9 | Physical training programs: endurance | 4.00 | Musculoskeletal injuries: injury mitigation programs | 4.17
10 | Soldier load impairing performance | 4.00 | Musculoskeletal injuries: overuse injuries | 4.15

(a) Hydren JR, Zambraski EJ. International research consensus: identifying military research priorities and gaps. J Strength Cond Res 2015; 29(Suppl. 11):S24–S27.
(b) Priority score options: 1 = unnecessary, do not consider; 2 = low priority; 3 = moderate priority; 4 = high priority; 5 = highest possible importance/priority.
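The eight-of-ten overlap between the two surveys can be checked mechanically from the two top 10 lists in Table 2. A small sketch (topic names transcribed from Table 2, with light normalization to absorb minor wording differences between the two survey instruments):

```python
import re

# Top 10 priority areas from Table 2 (3rd ICSPP, 2014 vs 4th ICSPP, 2017).
top10_2014 = [
    "Physical demands in operational environments",
    "Measuring physical performance/fitness",
    "Musculoskeletal injuries: injury mitigation programs",
    "Physical training programs: strength",
    "Musculoskeletal injuries: overuse injuries",
    "Physical training programs: high-intensity programs",
    "Physical demands in training environments",
    "Physical employment standards",
    "Physical training programs: endurance",
    "Soldier load impairing performance",
]
top10_2017 = [
    "Physical demands in operational environments",
    "Measuring physical performance/fitness",
    "Sleep",
    "Physical training programs: strength",
    "Physical training programs: endurance",
    "Soldiers' load impairing performance",
    "Nutrition",
    "Physical demands in training environments",
    "Musculoskeletal injuries: injury mitigation programs",
    "Musculoskeletal injuries: overuse injuries",
]

def norm(topic: str) -> str:
    """Normalize minor wording differences between the two surveys
    (possessives, hyphens, plural endings, punctuation, case)."""
    t = re.sub(r"[^a-z ]", " ", topic.lower())
    return " ".join(w.rstrip("s") for w in t.split())

common = {norm(t) for t in top10_2014} & {norm(t) for t in top10_2017}
print(len(common))  # topics appearing in both surveys' top 10
```

Running this reproduces the count reported in the text: eight topics are common to both top 10 lists.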

score = 4.18), soldier/airman/sailor (4.31), and among respondents from Australia (4.28). Post-traumatic stress disorder was rated highest by respondents from the USA (3.98). Respondents were asked to rate eight technologies influencing soldiers’ physical performance on a scale from 1 to 4. Physiological status monitoring received the highest mean score (2.79, number of 4’s = 58). The other topics listed in descending order of mean score were: improved body armor (mean score = 2.77, number of 4’s = 53), wearable load or movement sensors (2.45, 31), body hydration monitors (2.16, 20), wearable braces to support musculoskeletal system (2.13, 18), robotics (1.83, 6), non-powered exoskeletons (1.75, 5), and powered exoskeletons (1.68, 4). On analysis by positions/expertise, physiological status monitoring was rated highest among scientist/researcher (mean score = 2.96), and improved body armor was rated highest (3.07) among soldier/airman/sailor. Analysis by country showed that improved body armor was rated highest among respondents from Australia (2.84), while physiological status monitoring was rated highest (2.95) by respondents from the USA. In the final section of the questionnaire, respondents provided a free text response to the question — “What do you feel is the most important factor pertaining to soldiers’ physical performance that needs to be addressed scientifically?” The recurring underlying theme identified was physical or mental health and performance including considerations such as resilience, motivation, cognition, and fatigue (number of respondents = 61; example of text: “Cognitive function”, “Lack of fitness throughout career in military”). 
Training (30; examples: “How to train soldiers physically”, “Correct training doses”) and identifying better correlates of injury, performance, physical fitness, and stress (30; examples: “Biomarkers indicative of stress in different settings”, “Genetics and operational performance”) were also identified as being important. Other considerations included: (1) sociocultural environmental factors, including the occupational training environment and better communication (20); (2) physical environmental factors, including armor/load and temperature (17); (3) lifestyle factors such as sleep, food, and substance use/abuse (14); (4) operational/physical demands (11); (5) performance assessment (6); and (6) gender integration (5). Not all respondents answered this free text question, and some identified more than one problem area that needs scientific attention. Another free text question asked of the respondents was: “If your military/country is not addressing it (the most important factor pertaining to soldiers’ physical performance), what is the main reason?” The responses to this question could be clearly grouped into three categories: lack of resources, including funds and expertise (number of respondents = 32), lack of understanding/commitment (26), and efforts in early stages (15). Not all survey respondents provided an answer to this question, and some identified more than one reason for the lack of response. The respondents were asked if they were collaborating, or interested in collaborating, with other militaries/countries to address this major concern. Most respondents (76.8%) responded affirmatively. The top two categories of abstract topics presented at both the 3rd and 4th ICSPP were fitness, performance and testing, and musculoskeletal injuries, accounting for 43.7% and 19.2% of abstracts in 2014 (3rd ICSPP) and 38.0% and 15.7% in 2017 (4th ICSPP), respectively. At the 4th ICSPP, no topic outside of the top two exceeded ∼9%; however, there was more consistent representation across the remaining eight topics compared to 2014. In 2017 there was, however, an approximately seven-fold increase in the number of abstracts related to equipment and human factors (3 vs. 28). The primary foci across these two categories were human-system interactions during load carriage, and physiological status and sleep monitoring.

4. Discussion

The purpose of this study was to describe perceived priorities related to military personnel’s health and physical performance, as identified by attendees at the 4th ICSPP, and to determine if perceived research priorities had changed between the administration of the 3rd and 4th ICSPP surveys. Among the priority areas, respondents rated physical demands in operational environments and measuring physical performance/fitness as the two topic areas with the highest possible importance. The strong agreement on the most important priority areas suggests that more attention needs to be given to these critical areas. Physical demands in operational environments2,12,13 and measuring physical performance/fitness8–10,14 have been identified as important priority areas in previous publications as well.
As the 4th ICSPP was conducted in Australia, local attendees had the largest representation, which translated into the largest share of respondents (36.6%). Similarly, most respondents to the 3rd ICSPP survey were from the host nation, the United States (41.4%).6 Despite this difference in the relative frequency of country of origin, there was remarkable overlap in the rankings of the topics identified as being most important, with eight topics in the top 10 being common between the two surveys. When rankings of research priority areas were compared between the 3rd and 4th ICSPP within the countries with the highest numbers of respondents (Australia and the USA), there was again considerable overlap. Respondents from each country consistently identified similar priority areas as being most important between the 3rd and 4th ICSPP. For respondents from Australia


and the USA analyzed separately, seven of the top 10 priority areas were similar between the 3rd and 4th ICSPP. To the best of our knowledge, no other similar survey has been conducted among a diverse, international group of military researchers, though research priorities have been surveyed among law enforcement leaders,15 and needs assessment surveys have been conducted among firefighters.16,17 A study conducted at the U.S. Army Public Health Center among Army personnel and civilians reported that the activities respondents were most interested in receiving information about were running, weight training and extreme conditioning.18 A Blue Ribbon Panel on military physical performance testing, sponsored by the National Strength and Conditioning Association, rated muscular strength, power, and endurance as important fitness components for accomplishing selected military tasks.10 These findings are in agreement with the current study, where physical training programs for strength and endurance were ranked among the top 10 priority areas. Two emerging research priority areas identified in the 4th ICSPP survey were sleep and nutrition. Sleep19,20 and nutrition21,22 have also been identified as priority areas in other publications, reflecting an increased focus on lifestyle factors as a means of optimizing health and performance in military populations. Retired U.S. Army Surgeon General LTG Patricia Horoho initiated a major public health campaign in 2012, named the Performance Triad, which championed the importance of activity, nutrition, and sleep as major contributors to Soldiers’ overall health, performance, and resiliency.4 Interestingly, high-intensity physical training programs and physical employment standards were in the top 10 in the 3rd ICSPP survey but not the 4th ICSPP survey. Most respondents to the 4th ICSPP survey were from Australia and the United States.
Understanding physical employment standards was a critically important area in 2014, at the time of the 3rd ICSPP survey, when discussions about the integration of women into military roles were prevalent. With the removal of gender restrictions on combat roles, the relative importance of this topic may have decreased in both these countries. In addition, significant research has been conducted in this area by the militaries of Australia,23,24 Canada,25 and the United States.26 The most important emerging threat identified was resilience or psychological fitness of recruits. Mental health and resilience have been identified as important factors for success in military environments in previous studies.27–29 In the 4th ICSPP survey, the findings from the questions about the most important technology and the most important perceived research priority area agreed with each other. The most important technology influencing soldiers’ physical performance was identified as physiological status monitoring, and the most important perceived priority research area was physical demands in operational environments; physiological status monitoring is a clear means to this end. A large proportion of abstracts at the 3rd ICSPP (43.7%) and 4th ICSPP (38.0%) were related to fitness, performance and testing. Interestingly, respondents identified physical or mental health and performance as the most important factor pertaining to soldiers’ physical performance that needs to be addressed scientifically. A greater number of abstracts on ‘equipment and human factors’ were presented at the 4th ICSPP (28) than at the 3rd ICSPP (3). This may be attributable to differences in the geographic distribution of attendees and in government imperatives. This result may also reflect increased awareness of the importance of human-system integration, particularly given the increasing workforce diversity across many nations.
A major concern cited by 4th ICSPP survey respondents was a lack of resources limiting the ability to scientifically address important factors pertaining to soldiers’ physical performance. Examples of responses about lack of resources were “Financial


problems” and “Lack of researchers, funds and knowledge”. This suggests that better international coordination of research endeavors could lead to more progress despite constrained resources. There are certain limitations to interpreting the results of this survey. It is possible that the research areas of focus, as well as the geographic locations of the 3rd and 4th ICSPP conferences, may have influenced the range of attendees who registered for each conference, and ultimately the respondents’ perceptions. Also, the questions in the survey were designed to identify perceived priority research needs, but the survey could not quantify the amount or utility of research being done in a specific area. A low priority score could mean that a topic was not perceived as being important, but it could also mean that the topic has already been adequately addressed, or that there were fewer respondents with interest in that topic area. Moving forward, it would also be of interest to conduct a similar survey with operational (i.e. non-scientist) military stakeholders to compare their perceived research priorities and gaps with those of the scientific community represented in the current survey.

5. Conclusion

Despite the diverse backgrounds of the respondents, there was a clear continuing consensus about perceived important priority areas influencing the health and physical performance of military personnel. Physical demands in operational environments and measuring physical performance/fitness were the top two priority areas, and eight of the top 10 highest ranked topics were identical between the two surveys. The importance of physical or mental health and performance was the broad recurring underlying theme, and respondents reported lack of resources as a major impediment. Increased international research collaboration is recommended to facilitate targeted application of constrained research resources and to address identified research gaps and priorities.

Practical implications

• There was a clear continuing consensus about perceived important priority areas influencing the health and physical performance of military personnel among respondents to the 3rd and 4th International Congress on Soldiers’ Physical Performance surveys.
• Physical demands in operational environments and measuring physical performance/fitness were perceived as the two most important priority areas.
• The most important emerging threat was identified as resilience or psychological fitness of recruits, while physiological status monitoring was identified as the most important technology influencing soldiers’ physical performance.
• Respondents identified lack of resources, including funding and expertise, as a major impediment, and were interested in collaborating with other militaries/countries to address identified priority areas.

Acknowledgements

No external funding was provided for this project. We would like to thank the respondents to the 3rd and the 4th International Congress on Soldiers’ Physical Performance surveys.

Disclaimer: The opinions or assertions contained herein are the private views of the author(s) and are not to be construed as official or as reflecting the views of the Army or the Department of Defense.


References

1. Nindl BC, Castellani JW, Warr BJ et al. Physiological employment standards III: physiological challenges and consequences encountered during international military deployments. Eur J Appl Physiol 2013; 113(11):2655–2672.
2. Sharp MA, Cohen BS, Boye MW et al. U.S. Army physical demands study: identification and validation of the physically demanding tasks of combat arms occupations. J Sci Med Sport 2017; 20(Suppl. 4):S62–S67.
3. Robson S, Lytell MC, Sims CS et al. Fit for duty?: evaluating the physical fitness requirements of battlefield airmen. Rand Health Q 2018; 7(2):8.
4. Nindl BC, Williams TJ, Deuster PA et al. Strategies for optimizing military physical readiness and preventing musculoskeletal injuries in the 21st century. US Army Med Dept J 2013:5–23.
5. Nindl BC, Sharp MA. Third International Congress on Soldiers’ Physical Performance: translating state-of-the-science soldier research for operational utility. J Strength Cond Res 2015; 29(Suppl. 11):S1–S3.
6. Hydren JR, Zambraski EJ. International research consensus: identifying military research priorities and gaps. J Strength Cond Res 2015; 29(Suppl. 11):S24–S27.
7. Hollander IE, Bell NS. Physically demanding jobs and occupational injury and disability in the U.S. Army. Mil Med 2010; 175(10):705–712.
8. Nindl BC, Jaffin DP, Dretsch MN et al. Human performance optimization metrics: consensus findings, gaps, and recommendations for future research. J Strength Cond Res 2015; 29(Suppl. 11):S221–S245.
9. Grier TL, Canham-Chervak M, Bushman TT et al. Evaluating injury risk and gender performance on health- and skill-related fitness assessments. J Strength Cond Res 2017; 31(4):971–980.
10. Nindl BC, Alvar BA, Dudley JR et al. Executive summary from the National Strength and Conditioning Association’s second blue ribbon panel on military physical readiness: military physical performance testing. J Strength Cond Res 2015; 29(Suppl. 11):S216–S220.
11. Harding T, Whitehead D. Analysing data in qualitative research, in Schneider Z, editor, Nursing and Midwifery Research: Methods and Appraisal for Evidence-Based Practice, Elsevier Australia, 2016, p. 127–142.
12. Kyrolainen H, Nindl BC. The physical demands placed on modern soldiers continue to be substantial. Preface. J Strength Cond Res 2012; 26(Suppl. 2):S1.
13. Dybel GJ, Seymour CJ. Identifying the physical demands of Army Reserve personnel during deployable medical systems training. Mil Med 1997; 162(8):537–542.
14. NATO Research and Technology Organisation. Optimizing operational physical fitness. Final report of Task Group 019. RTO technical report TR-HFM-080, Neuilly-sur-Seine Cedex, France, 2009.
15. The International Association of Chiefs of Police. Law enforcement research priorities for 2011 and beyond. Results of the IACP membership survey and focus group 2009–2010. 2011. Available at http://www.theiacp.org/portals/0/pdfs/LEResearchPriorities2011.pdf. Accessed 21 May 2018.
16. Lee JY, Park J, Park H et al. What do firefighters desire from the next generation of personal protective equipment? Outcomes from an international survey. Ind Health 2015; 53(5):434–444.
17. National Fire Protection Association. United States fire service needs assessment. 2016. Available at https://www.nfpa.org/-/media/Files/Code-or-topic-fact-sheets/NeedsAssessmentFactSheet.pdf. Accessed 21 May 2018.
18. Hauschild VD, Schuh A, Jones BH. What soldiers know and want to know about preventing injuries: a needs survey regarding a key threat to readiness. US Army Med Dept J 2016:10–19.
19. Eliasson AH, Lettieri C, Netzer N. Sleep medicine is coming of age in military medicine: report from the Military Health System Research Symposium (2017) in Kissimmee, Florida. Sleep Breath 2017; 22:481–483.
20. Novak A, Hornyak B, Razso Z et al. Predicting how health behaviours contribute to the development of diseases within a military population in the Hungarian Defence Forces. J R Army Med Corps 2017; 164:107–111.
21. Nindl BC, Beals K, Witchalls J et al. Military human performance optimization and injury prevention: strategies for the 21st century warfighter. J Sci Med Sport 2017; 20(Suppl. 4):S1–S2.
22. Rahmani J, Milajerdi A, Dorosty-Motlagh A. Association of the Alternative Healthy Eating Index (AHEI-2010) with depression, stress and anxiety among Iranian military personnel. J R Army Med Corps 2017; 164:87–91.
23. Carstairs GL, Ham DJ, Savage RJ et al. A box lift and place assessment is related to performance of several military manual handling tasks. Mil Med 2016; 181(3):258–264.
24. Doyle T, Billing D, Drain J et al. Physical employment standards for Australian Defence Force employment categories currently restricted to women. Part A: physically demanding trade tasks. Report Number DSTO-CR-2011-0377. Victoria, Australia, Defence Science and Technology Group, 2011.
25. Reilly TJ, Gebhardt DL, Billing DC et al. Development and implementation of evidence-based physical employment standards: key challenges in the military context. J Strength Cond Res 2015; 29(Suppl. 11):S28–S33.
26. Foulis SA, Sharp MA, Redmond JE et al. U.S. Army physical demands study: development of the Occupational Physical Assessment Test for combat arms soldiers. J Sci Med Sport 2017; 20(Suppl. 4):S74–S78.
27. Nakkas C, Annen H, Brand S. Psychological distress and coping in military cadre candidates. Neuropsychiatr Dis Treat 2016; 12:2237–2243.
28. Williams A, Hagerty BM, Andrei AC et al. STARS: strategies to assist Navy recruits’ success. Mil Med 2007; 172(9):942–949.
29. Cigrang JA, Carbone EG, Todd S et al. Mental health attrition from Air Force basic military training. Mil Med 1998; 163(12):834–838.

Journal of Science and Medicine in Sport 21 (2018) 1131–1138


Review

Optimising training adaptations and performance in military environment

Heikki Kyröläinen a,b,∗, Kai Pihlainen c, Jani P. Vaara b, Tommi Ojanen d, Matti Santtila b

a Faculty of Sport and Health Sciences, Biology of Physical Activity, University of Jyväskylä, Finland
b Department of Leadership and Military Pedagogy, National Defence University, Finland
c Training Division, Defence Command, Finland
d Human Performance Division, Finnish Defence Research Agency, Finland

Article info

Article history: Received 29 August 2017; Received in revised form 15 November 2017; Accepted 28 November 2017; Available online 20 December 2017

Keywords: Soldier; Strength; Endurance; Body composition

Abstract

Objectives: Worldwide decreases in physical fitness and increases in body fat among youth have set challenges for armed forces to recruit physically capable soldiers. Therefore, knowledge of optimising physical adaptation and performance through physical training is vital. In addition, maintaining or improving physical performance among professional soldiers in various military environments is crucial for overall military readiness. The present review focuses on the effects of military training on physical performance and searches for optimal training methods.
Design and methods: Review article based on selected literature searches using the main keywords 'physical performance', 'training' and 'military' in the MEDLINE® and SportDiscus® search engines. Additional selected references that encompassed the same keywords but were not returned by the search were also included.
Results: Military training mainly consists of prolonged physical activity performed at low intensities, which may interfere with the optimal development of maximal strength, power, and aerobic capacity. Combined endurance and strength training appears to be a superior training method for improving the overall physical performance of soldiers.
Conclusions: The present review demonstrates that military training needs greater variation in training stimulus to induce more effective training adaptations, especially for the development of maximal or explosive strength and maximal aerobic capacity. Training programs should be well periodised so that total training load increases progressively but also includes sufficient recovery periods. In addition, some individualised programming is required to avoid unnecessary injuries and overloading, because differences in the initial physical fitness of soldiers can be very large.
© 2017 Sports Medicine Australia. Published by Elsevier Ltd. This is an open access article under the CC BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/).

1. Introduction

The average aerobic fitness of young men entering military service has declined, accompanied by simultaneous increases in body mass, especially in western countries.1,2 At the same time, however, it is well recognised that successful performance of military duties requires a high level of physical fitness, especially aerobic fitness and muscular strength.3,4 Physical training, the specific aims of which vary with the phase of a soldier's career, is the most effective method to improve or maintain physical performance. The goal of military basic training (BT), for example, is to reach an employment standard level or a level of

∗ Corresponding author. E-mail address: heikki.kyrolainen@jyu.fi (H. Kyröläinen).

physical performance needed during the following training phases. For professional soldiers, the focus is to reach or maintain the physical performance level required for deployment and occupation. The outcome of physical training programs depends on training volume (duration, distance or repetitions), intensity (load, velocity or power) and frequency, which are key factors of training. In sports training, total training load, nutrition and recovery are typically planned in an individual way in order to optimise training adaptations and minimise training-related injuries and overtraining. Similarly, when designing a training plan in a military environment, the coach, tactical strength and conditioning facilitator, or military instructor must first decide which factors to emphasise in order to meet the performance goals or task requirements. At the same time, it is important that these emphasised training factors are also well in proportion with the trainee’s individual needs and initial fitness level, and that the training plan is well periodised.
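As an illustration of how the volume, intensity and frequency factors described above can be combined into a single planning quantity, the sketch below uses the session-RPE approach (session duration multiplied by a 0–10 rating of perceived exertion), one common way of quantifying training load. This is an illustrative example, not a method used in the review; all session values are hypothetical.

```python
# Illustrative sketch (assumption, not from the review): weekly training load
# via the session-RPE method, load (AU) = duration (min) x session RPE (0-10).

def session_load(duration_min: float, rpe: float) -> float:
    """Training load in arbitrary units for one session."""
    if not 0 <= rpe <= 10:
        raise ValueError("session RPE must be on the 0-10 scale")
    return duration_min * rpe

def weekly_load(sessions: list[tuple[float, float]]) -> float:
    """Sum of session loads (duration, RPE) over one training week."""
    return sum(session_load(d, r) for d, r in sessions)

# Hypothetical week: three endurance sessions and two strength sessions.
week = [(60, 4), (45, 7), (90, 3), (50, 6), (30, 8)]
print(weekly_load(week))  # 1365
```

Monitoring such a weekly total makes it easy to check that load rises progressively across training blocks while recovery weeks are kept genuinely lighter.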

https://doi.org/10.1016/j.jsams.2017.11.019


Fig. 1. Several factors affect total training load, training adaptation and performance of an individual soldier in military environments.

In general, several factors such as age, sex, training history, recovery, sleep and nutrition, as well as environmental, psychological, and social factors can significantly affect training adaptations. In addition, optimising performance in military environments is often challenged by external stress factors such as prolonged physical activity while carrying loads, negative energy and fluid balance, sustained readiness, and sleep deprivation.5–7 Therefore, training load combined with these external stress factors can lead to compromised training adaptations and/or overreaching and overtraining, in addition to increased musculoskeletal injury rates.8–10 These factors should be taken into account when planning and implementing optimal training programs for soldiers. Fig. 1 summarises the factors affecting optimal development of soldiers' physical performance and, therefore, operational readiness. As mentioned earlier, a variety of individually differing factors influence training adaptations in military environments. In recruits, military training consists of a high amount of low-intensity physical activity, which can be a challenge for optimising improvements in strength performance.11 For professional soldiers, on the other hand, the challenges arise more from reaching and maintaining the performance level set by more demanding occupational requirements or deployment standards. Furthermore, due to physiological sex differences, female soldiers are often required to increase their physical fitness to a greater extent than male soldiers.12,13 In particular, tasks involving load carriage or lifting heavy materials seem to be more challenging for female soldiers due to their smaller body size and lower muscular fitness levels.
Therefore, physical fitness demands are relatively higher for female soldiers compared to those of male soldiers.14 Training principles and periodisation in endurance and strength training are essentially the same for both sexes.13 The present review aimed to explore studies that have attempted to improve physical performance by optimising physical training adaptations in military environments in both recruits and professional soldiers. In addition, possible mechanisms for suboptimal adaptations and optimisation of training strategies are discussed.

2. Design and methods

The present review focuses on the effects of military training on physical performance. The articles were selected from literature searches using the keywords 'physical performance', 'training' and 'military' in the MEDLINE® and SportDiscus® search engines. Additional selected references that encompassed the same keywords but were not returned by the search were also included. Altogether, we found almost 200 published articles, of which 60 were selected for this review according to the inclusion criteria of the main keywords. The published articles mainly focused on male army soldiers, who can be further divided into two main categories: recruits or conscripts with no prior experience performing military duties, and professional soldiers.

3. Endurance training

Traditionally, endurance training in the military has consisted of moderate-intensity running, walking or marches with or without load carriage at a constant speed.11 Santtila et al.11 found that additional endurance training during military BT did not produce additional gains in aerobic fitness. Despite this finding, moderate-intensity aerobic training, such as marching with extra loads, is still a widely used training method in the military. Recent studies have suggested that high-intensity interval training (HIIT) induces similar or superior training responses compared to moderate-intensity endurance training with less time commitment.15 HIIT refers to a training mode that involves repeated, relatively brief bouts at high intensity interspersed with lower-intensity periods of recovery.15,16 HIIT has been shown to induce greater neuromuscular adaptations than traditional endurance training.17 Thus, a low volume of HIIT may elicit a higher neuromuscular training effect and thereby better induce development of strength performance as part of combined training. Nevertheless, to date only a few HIIT studies have been implemented in military environments.15 Knuttgen et al.18 compared adaptations to HIIT in three training groups of conscripts. Each group performed HIIT by running 15 min per session, with one group performing 5 training sessions per week for one month and the other two groups 3 sessions per week for two months. In total, all groups had 19 training sessions during the study period. The improvement in maximal aerobic capacity was around 20% in all groups. Kilen et al.19 studied training adaptations of professional soldiers in two training groups: one performing nine 15-min training sessions per week ("microtraining") and the other performing three 45-min sessions ("traditional training") over 8 weeks, with sessions of strength training, high-intensity endurance training and muscle endurance. Both groups improved shuttle run performance, but only the "microtraining" group improved peak oxygen uptake, grip strength, and loaded lunge performance. The authors concluded that shorter and more frequent training sessions can induce at least similar training adaptations compared to longer and less frequent sessions.


Gist et al.20 studied the effects of HIIT performed as calisthenics, otherwise known as high-intensity functional training (HIFT), in twenty cadets. The participants performed a 4-week training period with 3 exercises per week either in a group with typical physical training (combined endurance and muscular endurance exercises) or HIFT. The typical training group performed 60-min workouts consisting of one moderate-intensity running exercise, one exercise of self-paced load carriage, and one exercise of moderate-intensity running followed by calisthenics. The HIFT group performed 4–7 sets of 30 s "all-out" burpees with 4 min of active recovery. Aerobic and anaerobic capacity and muscular endurance were unchanged, with no differences between the groups. Thus, the authors concluded that HIFT might be a suitable training method to maintain fitness. In conclusion, HIIT and HIFT can be recommended for soldiers mainly because they produce similar or superior training responses compared to moderate-intensity endurance training while requiring less training time. In addition, for recruits and conscripts, HIIT and HIFT may provide an essential training stimulus that differs from their service-related high volume of low-intensity endurance-type activity. Furthermore, from a practical point of view, HIIT/HIFT may be considered accessible and easily individualised for soldiers, especially under operational and field conditions. However, when HIIT/HIFT is applied in a physical training program, special attention should be paid to overall physical loading to avoid overtraining.
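The interval structure that defines HIIT above (brief high-intensity bouts interspersed with lower-intensity recovery) can be sketched programmatically. This is a minimal illustration, not a protocol from the review; the example parameters mirror the HIFT session described above (30 s "all-out" bouts with 4 min of active recovery), and the function name is hypothetical.

```python
# Minimal sketch (assumption, not from the review): lay out one HIIT session
# as alternating work and recovery phases and report its total duration.

def hiit_session(n_bouts: int, work_s: int, recovery_s: int):
    """Return (phases, total_minutes), where phases is a list of
    ("work" | "recovery", seconds) tuples for one session."""
    phases = []
    for i in range(n_bouts):
        phases.append(("work", work_s))
        if i < n_bouts - 1:  # no trailing recovery after the last bout
            phases.append(("recovery", recovery_s))
    total_min = sum(seconds for _, seconds in phases) / 60
    return phases, total_min

# e.g. 4 x 30-s bouts with 4-min active recovery between bouts
phases, total = hiit_session(4, 30, 240)
print(total)  # 14.0
```

Even a short session like this concentrates only 2 min of actual high-intensity work inside 14 min, which is what gives HIIT its favourable time economy relative to continuous moderate-intensity training.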

4. Strength training

During increasingly physically-demanding military operations, maximal strength and power are vital parts of modern physical training and the operational readiness of soldiers.21 For optimal performance of military tasks (e.g. lifting or carrying heavy loads, casualty drags, sprinting or climbing obstacles, patrolling in variable terrain), the development of strength and power should be an essential part of soldiers' regular training.22 Maximal muscle strength can be improved by increasing muscle size through hypertrophic training or by increasing the role of neural factors through power training.23 Prolonged military field training and operations have been shown to lead to a decrease in muscular strength and power.24,25 Vantarakis et al.26 studied specific conditioning of muscle endurance and strength among Naval Academy cadets over an 8-week study period in which the experimental group participated in a linear periodised strength training program in addition to their daily training. The exclusive training of the experimental group included uni- and multilateral resistance training exercises such as squats, deadlifts, lunges, bench presses and arm curls. Unlike the control group, the experimental group showed improvements in upper- and lower-body maximal strength, power, and time to complete an occupational obstacle course. Thus, additional strength training seems to improve both the physical and occupational performance of navy cadets. In a study by Lester et al.,27 a novel 7-week physical training program was compared with traditional Army physical fitness training. The experimental training included core stability, flexibility, resistance training, agility, speed, and power exercises, whereas the traditional army physical training group completed their normal fitness training including calisthenics and aerobic exercises. In these previously trained soldiers, greater improvements were observed in the experimental group for maximal strength, power and one occupational test, namely casualty recovery time. However, similar improvements were observed for both groups in the agility drill, vertical jump height, and pull-up performance. It must be noted that strength training interventions in military environments should partly be regarded as combined training


due to the aerobic nature of military training itself, especially during military BT in recruits or conscripts. Moreover, for professional soldiers, endurance training is always integrated into their total training program; thus, limited research is available investigating strength training adaptations alone. In conclusion, non-optimal adaptation to added strength training during initial military training may be a result of the interference effect. Thus, periodisation of strength and endurance training may improve adaptations to strength training. The programming of strength training should be planned carefully, taking into account the training load from endurance-type military activities such as marching and field exercises.

5. Combined strength and endurance training

Military training and operations consist of tasks whose demands can be met through combined strength and endurance training.28 Therefore, it can be concluded that combined strength and endurance training is the foundation of soldiers' physical performance.29 Combined training may well induce positive training adaptations in both aerobic fitness and muscle strength in poorly-conditioned, overweight and inactive individuals.28 However, training adaptations may be compromised in fitter and more active individuals. This phenomenon is called the interference effect, which was first established by Hickson.30 The interference with optimal strength training adaptations may be caused by high volumes of endurance-type activity, which is a typical characteristic of military training. This combination has been shown to inhibit signalling mechanisms of protein synthesis, and thus responses can be observed at both the molecular level31,32 and the systemic level.33,34 Therefore, combined strength and endurance training may hamper training responses, especially for strength development, when compared to training either exercise mode alone. The interference of combined training may be avoided or, at least, reduced by using optimal training programming and periodisation. However, in recruits or poorly-conditioned soldiers, all types of training most likely improve physical performance without a risk of interference.
Many studies have shown that the initial, typically 7- to 10-week standardised military BT period has positive effects on the physical performance and body composition of recruits,35–37 as well as on performance of military occupational specialties.38 Improvements in aerobic fitness and muscular strength of up to 10–15% in eight weeks have been observed, especially in recruits or conscripts with lower levels of initial fitness.9,11,37 During the following military training phases, nevertheless, adaptations may not have been optimal in relation to total training volume, and some of the performance gains may have been compromised.35,39 Therefore, physical training interventions before and during the initial military training periods have been conducted with the aim of optimising training adaptations. Some studies have suggested that a preparatory physical training intervention (performed 4–8 weeks before military service) may improve recruits' training adaptations during military BT while reducing the risk of musculoskeletal injuries.40–42 These findings are important when keeping in mind that the average aerobic fitness level of young men entering military service has declined along with simultaneous increases in body mass.1,2 Thus, it is more challenging to meet the goals of military training during the initial training phases of the service. Chai et al.42 found a preparatory 6-week physical training program beneficial in terms of aerobic capacity for those with a lower initial fitness level. The progressive preparatory training consisted of 197 h of strength and endurance training, flexibility and motor skill training, and theoretical education. The unfit recruits reached the average aerobic fitness level of


the study population by the end of the BT period.42 This is an important finding, since it is easier to execute standardised BT when there is less variation in the physical fitness of soldiers. A number of training intervention studies have been conducted during the BT period. For example, Santtila et al.11,33 observed improvements in maximal aerobic capacity, load carriage performance, and maximal strength of both the upper and lower extremities of conscripts during the BT period. In addition, they found that all beneficial changes in physical performance and body composition were particularly prominent among previously inactive young men. However, the strength training group did not improve strength or muscle hypertrophy to a greater extent than the aerobic intervention group or a group undergoing normal military BT. Moreover, Hofstetter et al.36 reported that additional outdoor circuit training induced greater increases in trunk strength and aerobic capacity, whereas no difference was observed in upper- and lower-body power. Furthermore, Sporis et al.43 studied the effects of two different 5-week training programs on the physical fitness of military recruits. A total of 124 recruits were divided into continuous endurance and relative strength training (CERS) and basic military physical readiness training (BMPR) groups. Both groups trained three times per week for 1.5 to 2 h per day. Both groups improved their physical readiness, but BMPR established greater advances in some motor abilities while CERS achieved greater improvements in endurance tests. One example of training optimisation is U.S. Army Physical Readiness Training (PRT). This training program was based on a thorough task analysis of the soldier and also aimed for a simultaneous reduction in injuries.
PRT included reduced running mileage, more gradually progressive periodisation and greater exercise variety, and has been shown to lead to similar or improved training adaptations and lower injury rates compared to traditional Army physical training.44 However, in terms of physiological adaptations, conflicting findings have been observed across studies. Additional gains in the measured variables were observed in some studies,43 while no changes or decreases in physical performance were found in others.11,33,45 For example, Vaara et al.46 compared the effects of a block-periodised resistance training protocol performed twice a week alongside military training over an 8-week special military training period. The intervention did not lead to improved maximal strength compared to the control group; in fact, maximal strength of the lower extremities was reduced in both groups. It can be speculated that a training frequency of twice a week was not a sufficient training stimulus for optimal muscle strength adaptations, or that the adaptations were interfered with by the strenuous military training, which was primarily endurance-type training. Nevertheless, both groups improved their load carriage performance, measured by a 3.2 km run with combat gear.
Possible explanations for the suboptimal adaptations to military training include high overall training volume and unilateral, prolonged low-intensity endurance activity with inadequate recovery, which may lead to overtraining.47,48 The same attributes typically increase the risk of musculoskeletal injuries, especially in low-fit service members9,49 who are cigarette smokers.50 In addition to low fitness level and smoking, female sex, high running mileage (or mileage on foot), high body mass index, and prior injury history have been recognised as major risk factors for injuries in many military studies.2,51 On the other hand, the same absolute training volume may be optimal for a low-fit recruit but too low for a high-fit recruit.33 The main findings of selected training adaptation studies in military environments are presented in Table 1. Despite the interference effect in high-fit individuals such as special operators, well-designed periodisation of the training program is required for improving physical performance. Thus,

some studies have concentrated on physical training interventions in professional soldiers, mainly special forces operators,25,52,53 and on interventions targeting military occupational specialties, such as load-carriage performance.54 Abt et al.53 studied the effects of block-periodised and non-linear periodised training in 85 Naval Special Warfare operators over 12 weeks. The experimental block training group trained in three 4-week blocks, starting with aerobic endurance, muscular strength and coordination training, followed by power, strength endurance and mixed endurance training. The third block aimed to improve power, strength, and high-intensity tactical drills. The control group trained with a non-linear periodisation program that included training increments every two weeks. For the first block, over the course of a week the program consisted of whole-body resistance training for one day, Olympic lifts and strength exercises followed by short high-intensity intervals for two days, high-intensity interval strength training for one day, and a slow endurance session on each of two days. The second block focused on tactical-specific conditioning for two days, high-intensity interval cross training for two days, and a slow endurance session for one day. Both groups improved maximal aerobic capacity, standing long jump, medicine ball throw, and pull-ups. In addition, the experimental group improved agility runs and deadlift. In contrast, neither group improved isokinetic maximal strength. In fact, decreases were observed in upper- and lower-body strength in the experimental group, whereas the control group decreased in trunk flexion. Solberg et al.52 studied the effects of block training consisting of a 6-month linear periodisation period followed by a 6-month non-linear periodisation period in 22 operators serving in Navy Special Operations Command.
The training programs emphasised either strength or endurance, with block periodisation including 5–6 sessions per week. Linear periodisation included hypertrophic strength training, mixed endurance training, and typical strength training (4 × 5RM) followed by maximum strength training. In the non-linear periodisation, the training varied between blocks of endurance and strength training. The initial linear periodisation resulted in small to moderate training adaptations (ranging from −1 to +20%), followed by smaller adaptations (ranging from −10 to +15%) in the non-linear periodisation. However, as the authors reported, even small improvements in physical fitness in soldiers with a high baseline level may be considered important. Military occupational specialties vary largely depending on military branch. To date, some studies are available that specifically target these special demands. One of the most studied military tasks is load carriage, which is a highly relevant and required task in most branches of the military, especially in combat units. A meta-analysis54 combined results from ten original physical training intervention studies that aimed to improve load-carriage performance. The authors concluded that strength and endurance training alone had smaller effects, with substantial variation, compared to combined strength and endurance training, progressive load-carriage training, and field-based training including load-carriage exercises. The most effective training mode for improving load-carriage performance was load-carriage exercise progressively integrated as part of the training program. In addition, significant training adaptations were found for combined strength and endurance training, as well as for field-based exercises such as plyometrics, agility training, sandbag lifts, and load carriage.
Furthermore, Williams et al.55 raised concerns about the careful design of such training modalities, especially for female and low-fit soldiers. It has also been shown that sex differences exist in most physical and military occupational performance variables,56,57 which mainly remain unaltered after military BT. To date, there exist only a few training intervention studies exclusively


Table 1. Findings of previous training studies in military environments.

Study | Country | N | Sex (personnel group)a | Training programb | Results after training
Vantarakis et al.26 | Greece | 31 | 1 (2) | Effect of 8-week linear periodised training program on Navy Cadets | Bench press 1RM ↑, squat 1RM ↑; muscle endurance ↑ (push-ups, sit-ups); 30 m sprint time ↓; Navy obstacle course time ↓
Grant et al.60 | South Africa | 154 | 1 + 2 (1) | Effect of 12 and 20 weeks of medium-to-high intensity military training | VO2max ↑ during 1st 12 weeks, ↔ after 20 weeks; 2.4 km run time ↓; BMI ↔
Abt et al.53 | USA | 46 | 1 (2) | Effect of 12-week block-periodised training program in Naval Special Warfare operators | Body fat ↓, fat mass ↓, body mass ↓; aerobic capacity ↑; upper-body muscular endurance ↑; upper- and lower-body power ↑; total-body muscular strength ↑
Solberg et al.52 | Norway | 22 | ? (2) | Assessment of novel 6-month linear (LP) vs. non-linear (NLP) training program in Navy Special Operations Forces | Both programs: abdominal strength ↑, standing long jump ↑; LP: mobility, agility, upper-body power, pull-ups, VO2max, muscle mass ↑, fat percent ↓; NLP: anaerobic capacity ↑, VO2max and upper-body power ↓
Vaara et al.46 | Finland | 25 | 1 (1) | Effect of added ST during 8 weeks of special military training | 3.2 km load carriage (27 kg) time ↓; isometric bench press ↔; isometric leg press ↓; abdominal strength ↑; back extension ↓
Lester et al.27 | USA | 133 | 1 (2) | Effect of 7-week novel physical training (NT) program compared to traditional army physical fitness training (TT) | NT improved bench press, medicine ball put, 30 m rush time and casualty recovery time more than TT
Sporis et al.43 | Croatia | 124 | 1 + 2 (1) | Effects of two different 5-week training programs | 3.2 km run time ↓; muscle endurance (sit-ups, push-ups, squats) ↑; no statistically significant differences between programs
Sporis et al.25 | Croatia | 25 | ? (2) | Effects of a training program on special operations battalion (SOB) soldiers' fitness parameters | Body mass, fat mass ↓; muscle endurance (sit-ups, push-ups) ↔; maximal leg extension, bench press ↓; aerobic and anaerobic performance ↓
Santtila et al.35 | Finland | 57 | 1 (1) | Effects of 8 weeks of basic training followed by 8 weeks of specialised military training | VO2max and maximal arm and leg extension ↑ during 1st 8 weeks, ↔ after 2nd 8 weeks
Hendrickson et al.59 | USA | 56 | 2 (3) | Effect of combined ST and ET on tactical occupational tasks | ST group ≥ squat ↑, bench press ↑, bench press throw ↑, repeatable lift and carry ↑, 3.2 km load carriage time ↓, 3.2 km run ↔
Santtila et al.11 | Finland | 72 | 1 (1) | Effects of 8-week basic training (BT), ET and ST on functional parameters | VO2max ↑ (ST 12%, ET 9%, BT 13%); body fat ↓ in all groups; leg strength ↑ (ST 9.1%, ET 12.9%)
Harman et al.45 | USA | 32 | 1 (3) | Effects of 8-week standardised army physical vs. weight-based training | Both training programs had the same effect: 3.2 km and 400 m load carriage time ↓; obstacle course, sprint and casualty rescue time ↓
Kraemer et al.29 | USA | 35 | 1 (2) | Effects of concurrent 12-week training, 4 days per week: ET, ST, upper-body strength training + endurance training (UB + ET) or combined (ET + ST) group | All groups: push-ups ↑ (18–43%); ST + ET: sit-ups ↔; groups including ET: 2-mile unloaded run time ↓; only ET + ST and UB + ET: loaded 2-mile run time ↓; groups including RT exercises: leg power ↑
Knapik et al. (2009)44 | USA | 2580 | 1 + 2 (1) | Comparison of standardised physical training and physical readiness training (PRT) | PRT resulted in higher fitness test pass rates and lower injury rates compared to traditional physical training programs
Williams et al.55 | UK | 52 | 1 + 2 (1) | Normal British Army 11-week basic training with modified physical training (added ST and a higher proportion of ET and material handling training) compared to normal British Army basic training | Greater ↑ in modified vs. normal training: maximal box lift (12 vs. 2%), 3.2 km loaded march performance (9 vs. 4%), VO2max (9 vs. 4%), dynamic lift (16 vs. 0%), estimated fat-free mass (4 vs. 2%); both sexes reported separately

Kraemer et al.58

USA

93

2 (3)

Effects of 6-month ST programmes on physical and military occupational task performances

Improvements in physical performance in relation to specificity of training ST ↑occupational performance When compared to a male control group, gender differences ↓ after ST, especially for occupational tasks

Sex: 1, male; 2, female,?, not reported; Personnel group: (1), conscripts/recruits; (2), cadets/professional soldiers; (3), civilians. Abbreviations: 1RM, one repetition maximum; VO2 max, maximal oxygen uptake; LP, linear periodization; NLP, non-linear periodization; NT, novel training; TT, traditional training; ST, strength training; ET, endurance training; BT, basic training; UB, upper body; RT, resistance training, PRT, physical readiness training.

focused on females in military environments.58,59 Williams et al.55 showed that added resistance training during 11-week military BT improved, in relative terms, occupational performance equally in both sexes. In the same study, the female recruits improved their maximal aerobic capacity relative to body mass by 18%, while the respective change in males was only 8%.

In conclusion, compared with either strength or endurance training alone, combined training induces superior adaptations in soldiers' physical performance. Given the physical demands of the profession and the need for continued military readiness, training only strength or only endurance is not a viable option in the military environment. Several factors should nevertheless be taken into account individually, such as initial fitness level, training history, task requirements, and the periodisation of training.

6. Conclusions and practical applications

High volumes of low-intensity endurance training during initial military training phases lead to compromised training adaptations and, in the worst cases, musculoskeletal injuries. Low-fit, inactive and overweight recruits, as well as female recruits, form a particular risk group in this regard. For high-fit recruits, injury risk is lower, but high volume and monotony in the training stimulus similarly lead to stalling of performance, especially strength development, in the military environment. Several modifications exist to increase variation in the training stimulus, such as a progressively increased training load, individualisation, or more variability in training modes. Physical task requirements should be the basis for goal setting in military training. A progressive increase in training load should be carefully planned throughout the initial training period. Progression can only be achieved through some level of individualisation of training.
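One way to make that individualisation concrete is to group recruits by an initial fitness measure and scale endurance volume per group while applying the same relative progression to everyone. The sketch below is a minimal illustration, not a validated prescription: the run times, tertile cut points, starting volumes and 10% weekly ramp are all hypothetical values chosen for the example.

```python
from statistics import quantiles

# Hypothetical initial 3.2 km run times (minutes) for a recruit platoon.
run_times = [13.1, 14.8, 16.2, 17.5, 12.9, 15.4, 18.0, 14.1, 16.9, 13.7]

# Tertile cut points computed from the sample itself.
low_cut, high_cut = quantiles(run_times, n=3)

def fitness_group(time_min: float) -> str:
    """Faster runners (lower times) form the higher-fit group."""
    if time_min <= low_cut:
        return "high-fit"
    if time_min <= high_cut:
        return "medium-fit"
    return "low-fit"

# Illustrative weekly endurance volumes (km): lower starting volume for
# low-fit recruits, with the same relative weekly progression for all.
BASE_VOLUME_KM = {"high-fit": 25.0, "medium-fit": 20.0, "low-fit": 15.0}
WEEKLY_RAMP = 1.10  # +10% per week; a deliberately conservative example

def weekly_volume_km(group: str, week: int) -> float:
    """Planned endurance volume for a given group and training week."""
    return round(BASE_VOLUME_KM[group] * WEEKLY_RAMP ** (week - 1), 1)
```

With these illustrative numbers, a low-fit recruit would start at 15 km per week and reach 16.5 km in week 2, while a high-fit recruit starts at 25 km; the point is that the starting volume, not the progression principle, differs between groups.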
One option for adapting low-fit recruits to the physical stress of military training is a preparatory training intervention before the actual military service. Some individualisation may also be achieved by dividing recruits into groups according to their initial fitness level at the beginning of the BT period. Thereafter, the total training load can be adjusted by varying the volume and intensity of the exercises between the groups. This method might also improve the fitness of high-fit recruits, whose maximal aerobic capacity has been shown to even decline during the latter part of their military service.35 In this regard, it must be kept in mind that military training per se consists mainly of high-volume, low-intensity endurance training. Thus, there may not be a need to implement additional physical training consisting of low-intensity endurance activities as a part of military training. A gradual increase in endurance exercise intensity adds variation to military training and may induce greater improvements in training adaptations. HIIT/HIFT may effectively improve physical fitness, both aerobic capacity and neuromuscular performance, with less time devoted to training compared with low- or moderate-intensity training.15,20 Moreover, HIIT/HIFT can be considered a practical training method for soldiers whenever the time allocated to training and access to fitness facilities are limited, as in field conditions. HIIT/HIFT may even be performed in operational environments or during operations, where decrements in aerobic performance have been observed. Nevertheless, its application in the military environment should always be evaluated in relation to the composition of the rest of the physical training and other possible external stress factors. In addition, long-term HIIT/HIFT studies (>8 weeks) concentrating on physical performance, body composition and injury incidence in military settings are warranted.

As mentioned earlier, non-optimal adaptations to added strength training during initial military training may be a result of the interference effect. Block periodisation of strength and endurance training may improve adaptations to both training modalities, especially strength, in military environments. Strength training blocks should be planned carefully, taking into account the training load from endurance-type military training such as marching and field exercises. In addition, proper nutrition and time for recovery should be planned to optimise the effects of the strength training stimulus.60 It might be possible to implement proper strength training sessions during theoretical education, basic shooting skills and material-handling training phases, when the endurance training volume is low.

Future studies are needed to elucidate whether positive strength training adaptations can be achieved during field exercises if adequate nutrition and recovery are simultaneously provided. As the number of female soldiers is increasing, more attention should be paid to optimising their physical performance in relation to their task requirements in military environments. Surprisingly, only a few training studies on female soldiers have been published. Furthermore, in military training studies that include both sexes, results should be reported by sex; this would give more information on possible differences in adaptation between male and female soldiers.

In conclusion, practical recommendations should be based on the traditional nature of military training, which consists of high-volume, low-intensity endurance training with an extra load of 25–65 kg. Therefore, progressively increasing combined endurance


and strength and power training, possibly including to some extent high-intensity interval training or microtraining, can be expected to induce superior adaptations in soldiers' physical performance. To achieve optimal physiological adaptations and, therefore, more effective development of physical performance, increasing attention should be paid to progressive and individualised training programs and their division into phases that sequentially develop performance. Thus, a personalised approach to performance optimisation should be emphasised when improving the physical fitness and operational readiness of soldiers.

Acknowledgement

The authors thank Dr. Ritva Taipale for editing English.

References

1. Santtila M, Kyröläinen H, Vasankari T et al. Physical fitness profiles in young Finnish men during the years 1975–2004. Med Sci Sports Exerc 2006; 38:1990–1994.
2. Knapik J, Sharp M, Steelman RA. Secular trends in the physical fitness of United States Army recruits on entry to service, 1975–2013. J Strength Cond Res 2017; 31:2030–2052.
3. Sharp MA, Patton JF, Vogel JA. A database of physically demanding tasks performed by US Army soldiers. Technical Report T98-12, 1–42. Natick, MA: US Army Research Institute of Environmental Medicine, 1998.
4. Hauschild VD, DeGroot DW, Hall SM et al. Fitness tests and occupational tasks of military interest: a systematic review of correlations. Occup Environ Med 2017; 74:144–153.
5. Tharion W, Lieberman H, Montain S et al. Energy requirements of military personnel. Appetite 2005; 44:47–65.
6. Booth CK, Probert B, Forbes-Ewan C et al. Australian army recruits in training display symptoms of overtraining. Mil Med 2006; 171:1059–1064.
7. Henning P, Park B-S, Kim J-S. Physiological decrements during sustained military operational stress. Mil Med 2011; 176:991–997.
8. Knapik JJ, Sharp MA, Canham-Chervak M et al. Risk factors for training-related injuries among men and women in basic combat training. Med Sci Sports Exerc 2001; 33:946–954.
9. Rosendal L, Langberg H, Skov-Jensen A et al. Incidence of injury and physical performance adaptations during military training. Clin J Sport Med 2003; 13:157–163.
10. Tanskanen MM, Uusitalo AL, Kinnunen H et al. Association of military training with oxidative stress and overreaching. Med Sci Sports Exerc 2011; 43:1552–1560.
11. Santtila M, Kyröläinen H, Häkkinen K. Changes in maximal and explosive strength, electromyography, and muscle thickness of lower and upper extremities induced by combined strength and endurance training in soldiers. J Strength Cond Res 2009; 23:1300–1308.
12. Courtright SH, McCormick BW, Postlethwaite BE et al. A meta-analysis of sex differences in physical ability: revised estimates and strategies for reducing differences in selection contexts. J Appl Psychol 2013; 98:623–641.
13. Nindl BC, Jones BH, Van Arsdale SJ et al. Operational physical performance and fitness in military women: physiological, musculoskeletal injury, and optimized physical training considerations for successfully integrating women into combat-centric military occupations. Mil Med 2016; 181:50–62.
14. Epstein Y, Yanovich R, Moran DS et al. Physiological employment standards IV: integration of women in combat units: physiological and medical considerations. Eur J Appl Physiol 2013; 113:2673–2690.
15. Gibala MJ, Gagnon PJ, Nindl BC. Military applicability of interval training for health and performance. J Strength Cond Res 2015; 29(Suppl. 11):S40–S45.
16. Buchheit M, Laursen PB. High-intensity interval training, solutions to the programming puzzle: part I: cardiopulmonary emphasis. Sports Med 2013; 43:313–338.
17. Martinez-Valdes E, Falla D, Negro F et al. Differential motor unit changes after endurance or high-intensity interval training. Med Sci Sports Exerc 2017; 49:1126–1136.
18. Knuttgen HG, Nordesjö LO, Ollander B et al. Physical conditioning through interval training with young male adults. Med Sci Sports 1973; 5:220–226.
19. Kilen A, Hjelvang LB, Dall N et al. Adaptations to short, frequent sessions of endurance and strength training are similar to longer, less frequent exercise sessions when the total volume is the same. J Strength Cond Res 2015; 29(Suppl. 11):S46–S51.
20. Gist NH, Freese EC, Ryan TE et al. Effects of low-volume, high-intensity whole-body calisthenics on army ROTC cadets. Mil Med 2015; 180:492–498.
21. Kraemer WJ, Szivak TK. Strength training for the warfighter. J Strength Cond Res 2012; 26(Suppl. 2):S107–S118.
22. Friedl KE, Knapik JJ, Häkkinen K et al. Perspectives on aerobic and strength influences on military physical readiness: report of an international military physiology roundtable. J Strength Cond Res 2015; 29(Suppl. 11):S10–S23.
23. Moritani T, deVries H. Neural factors versus hypertrophy in the time course of muscle strength gain. Am J Phys Med 1979; 58:115–130.


24. Nindl BC, Barnes BR, Alemany JA et al. Physiological consequences of U.S. Army Ranger training. Med Sci Sports Exerc 2007; 39:1380–1387.
25. Sporis G, Harasin D, Bok D et al. Effects of a training program for special operations battalion on soldiers' fitness characteristics. J Strength Cond Res 2012; 26:2872–2882.
26. Vantarakis A, Chatzinikolaou A, Avloniti A et al. A 2-month linear periodized resistance exercise training improved musculoskeletal fitness and specific conditioning of navy cadets. J Strength Cond Res 2017; 31:1362–1370.
27. Lester ME, Sharp MA, Werling WC et al. Effect of specific short-term physical training on fitness measures in conditioned men. J Strength Cond Res 2014; 28:679–688.
28. Santtila M, Pihlainen K, Viskari J et al. Optimal physical training during military basic training period. J Strength Cond Res 2015; 29(Suppl. 11):S154–S157.
29. Kraemer WJ, Vescovi JD, Volek JS et al. Effects of concurrent resistance and aerobic training on load-bearing performance and the Army physical fitness test. Mil Med 2004; 169:994–999.
30. Hickson RC. Interference of strength development by simultaneously training for strength and endurance. Eur J Appl Physiol 1980; 45:255–263.
31. Hawley JA. Molecular responses to strength and endurance training: are they incompatible? Appl Physiol Nutr Metab 2009; 34:355–361. Review.
32. Fyfe JJ, Bishop DJ, Stepto NK. Interference between concurrent resistance and endurance exercise: molecular bases and the role of individual training variables. Sports Med 2014; 44:743–762.
33. Santtila M, Häkkinen K, Karavirta L et al. Changes in cardiovascular performance during an 8-week military basic training period combined with added endurance or strength training. Mil Med 2008; 173:1173–1179.
34. Ihalainen JK, Schumann M, Eklund D et al. Combined aerobic and resistance training decreases inflammation markers in healthy men. Scand J Med Sci Sports 2017. http://dx.doi.org/10.1111/sms.12906. Epub ahead of print.
35. Santtila M, Häkkinen K, Nindl BC et al. Cardiovascular and neuromuscular performance responses induced by 8 weeks of basic training followed by 8 weeks of specialized military training. J Strength Cond Res 2012; 26:745–751.
36. Hofstetter MC, Mäder U, Wyss T. Effects of a 7-week outdoor circuit training program on Swiss Army recruits. J Strength Cond Res 2012; 26:3418–3425.
37. Mikkola I, Keinänen-Kiukaanniemi S, Jokelainen J et al. Aerobic performance and body composition changes during military service. Scand J Prim Health Care 2012; 30:95–100.
38. Drain J, Billing D, Neesham-Smith D et al. Predicting physiological capacity of human load carriage: a review. Appl Ergon 2016; 52:85–94.
39. Groeller H, Burley S, Orchard P et al. How effective is initial military-specific training in the development of physical performance of soldiers? J Strength Cond Res 2015; 29(Suppl. 11):S158–S162.
40. Lee L, Kumar S, Kok WL et al. Effects of a pre-training conditioning programme on basic military training attrition rates. Ann Acad Med 1997; 26:3–7.
41. Knapik JJ, Darakjy S, Hauret KG et al. Increasing the physical fitness of low-fit recruits before basic combat training: an evaluation of fitness, injuries, and training outcomes. Mil Med 2006; 171:45–54.
42. Chai LY, Ong KC, Kee A et al. A prospective cohort study on the impact of a modified Basic Military Training (mBMT) programme based on pre-enlistment fitness stratification amongst Asian military enlistees. Ann Acad Med 2009; 38:862–868.
43. Sporis G, Harasin D, Baić M et al. Effects of two different 5 weeks training programs on the physical fitness of military recruits. Coll Antropol 2014; 38(Suppl. 2):157–164.
44. Knapik JJ, Rieger W, Palkoska F et al. United States Army physical readiness training: rationale and evaluation of the physical training doctrine. J Strength Cond Res 2009; 23:1353–1362.
45. Harman EA, Gutekunst DJ, Frykman PN et al. Effects of two different eight-week training programs on military physical performance. J Strength Cond Res 2008; 22:524–534.
46. Vaara JP, Kokko J, Isoranta M et al. Effects of added resistance training on physical fitness, body composition, and serum hormone concentrations during eight weeks of special military training period. J Strength Cond Res 2015; 29(Suppl. 11):S168–S172.
47. Booth CK, Probert B, Forbes-Ewan C et al. Australian army recruits in training display symptoms of overtraining. Mil Med 2006; 171:1059–1064.
48. Tanskanen M, Uusitalo AL, Häkkinen K et al. Aerobic fitness, energy balance, and body mass index are associated with training load assessed by activity energy expenditure. Scand J Med Sci Sports 2009; 19:871–878.
49. Taanila H, Suni JH, Kannus P et al. Risk factors of acute and overuse musculoskeletal injuries among young conscripts: a population-based cohort study. BMC Musculoskelet Disord 2015; 16:104. http://dx.doi.org/10.1186/s12891-015-0557-7.
50. Knapik J, Reynolds K, Harman E. Soldier load carriage: historical, physiological, biomedical, and medical aspects. Mil Med 2004; 169:45–56.
51. Jones BH, Knapik JJ. Physical training and exercise-related injuries: surveillance, research and injury prevention in military populations. Sports Med 1999; 27:111–125.
52. Solberg PA, Paulsen G, Slaathaug OG et al. Development and implementation of a new physical training concept in the Norwegian Navy Special Operations Command. J Strength Cond Res 2015; 29(Suppl. 11):S204–S210.
53. Abt JP, Oliver JM, Nagai T et al. Block-periodized training improves physiological and tactically relevant performance in Naval Special Warfare Operators. J Strength Cond Res 2016; 30:39–52.
54. Knapik JJ, Harman EA, Steelman RA et al. A systematic review of the effects of physical training on load carriage performance. J Strength Cond Res 2012; 26:585–597.


55. Williams A, Rayson M, Jones D. Resistance training and the enhancement of the gains in material-handling ability and physical fitness of British Army recruits during basic training. Ergonomics 2002; 45:267–279.
57. Wood P, Grant C, Toit P et al. Effect of mixed military training on the physical fitness of male and female soldiers. Mil Med 2017; 182:e1771–e1779.
58. Kraemer W, Mazzetti SA, Nindl BC et al. Effect of resistance training on women's strength/power and occupational performances. Med Sci Sports Exerc 2001; 33:1011–1025.

59. Hendrickson NR, Sharp MA, Alemany JA et al. Combined resistance and endurance training improves physical capacity and performance on tactical occupational tasks. Eur J Appl Physiol 2010; 109:1197–1208.
60. Bartlett CG, Stankorb S. Physical performance and attrition among U.S. Air Force trainees participating in the basic military training fueling initiative. Mil Med 2017; 182:e1603–e1609.

Journal of Science and Medicine in Sport 21 (2018) 1139–1146


Original research

Musculoskeletal training injury prevention in the U.S. Army: Evolution of the science and the public health approach

Bruce H. Jones∗, Veronique D. Hauschild, Michelle Canham-Chervak

Injury Prevention Division, Army Public Health Center, United States

Article info

Article history: Received 25 September 2017; received in revised form 12 January 2018; accepted 22 February 2018; available online 3 March 2018.

Keywords: Military; Injuries; Occupational; Public health; Historical article

Abstract

Injuries cause more morbidity among soldiers in the U.S. Army than any other health condition. Over two-thirds of U.S. soldiers' injuries occur gradually from cumulative micro-traumatic damage to the musculoskeletal system as a result of physical training activities. Paradoxically, the very physical training activities required to improve soldier performance also result in injury. Determining the amounts and types of physical training that maximize performance while minimizing injuries requires scientific evidence. This evidence must be incorporated into a framework that ensures scientific gaps are addressed and prevention efforts are evaluated. The five-step public health approach has proven to be an effective construct for Army public health to organize and build an injury prevention program. Steps include: 1) surveillance to define the magnitude of the problem, 2) research and field investigations to identify causes and risk factors, 3) intervention trials and systematic reviews to determine what works to address leading risk factors, 4) program and policy implementation to execute prevention, and 5) program evaluation to assess effectiveness. Dissemination is also needed to ensure availability of scientific lessons learned. Although the steps may not be conducted in order, the capability to perform each step is necessary to sustain a successful program and make progress toward injury control and prevention. As with many U.S. public health successes (e.g., seatbelts, smoking cessation), the full process can take decades. As described in this paper, the U.S. Army uses the public health approach to assure that, as the science evolves, it is translated into effective prevention. Published by Elsevier Ltd on behalf of Sports Medicine Australia. This is an open access article under the CC BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/).

1. Introduction

Injuries cause more morbidity among soldiers in the U.S. Army than any other health condition and result in over a million medical encounters annually.1–3 Though acute traumatic injuries and accidents often receive the greatest attention, the vast majority (75%) of U.S. soldiers' injuries occur gradually from cumulative micro-traumatic damage caused by repetitive physical forces. Approximately two-thirds are musculoskeletal (MSK) injuries due to cumulative microtrauma.3 These MSK injuries, which are predominantly to the lower extremities and lower back, have been referred to as Army training-related overuse injuries.2–4 Yet the need to maintain a vigorous physical training regimen will always be paramount to the Army. It is a paradox that the very physical training activities required or encouraged by the U.S. Army to improve soldiers' physical performance are also what can most detract from their physical readiness by causing injuries.

∗ Corresponding author. E-mail address: [email protected] (B.H. Jones).

Determining the amounts and types of military physical training that will maximize performance while minimizing injuries requires substantiating scientific evidence. The evidence needs to be incorporated into a systematic approach that answers the following five key questions and accomplishes the associated key functions.5,6

1) Is there a problem and how big is it? (i.e., define the magnitude of the problem and the populations affected).
2) What causes the problem? (i.e., identify risk factors, mechanisms, and activities associated with injuries).
3) What works to prevent the problem? (i.e., determine what works to address leading injury risk factors, mechanisms, and activities).
4) Who needs to know and do what? (i.e., coordinate the necessary injury prevention partners to implement prevention strategies).
5) How well did what was done work? (i.e., determine the effectiveness of the programs and policies put in place).

Research alone cannot answer these questions; different types of public health activities are required. These activities represent the steps of the public health approach (PHA). These steps include: (1)

https://doi.org/10.1016/j.jsams.2018.02.011
1440-2440/Published by Elsevier Ltd on behalf of Sports Medicine Australia. This is an open access article under the CC BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/).


B.H. Jones et al. / Journal of Science and Medicine in Sport 21 (2018) 1139–1146

surveillance, (2) research and field investigations, (3) intervention trials and systematic reviews, (4) program and policy implementation, and (5) program evaluation and monitoring.1,5 Dissemination is also needed to ensure availability of scientific lessons learned.6

This paper provides a historical account of how the steps of the PHA have provided the framework for the Army's injury prevention program. Specifically, it focuses on the evolution of the U.S. Army's injury science pertaining to soldiers' physical training-related MSK injuries. The paper describes key efforts and publications of the U.S. Army Research Institute of Environmental Medicine (USARIEM) and the U.S. Army Public Health Center (APHC), including its predecessors (see Acknowledgements), and demonstrates how these efforts have aligned with the PHA to support an evidence-based prevention program. While much has been learned from the experience of other armies, this paper focuses specifically on lessons learned from U.S. Army injury prevention activities.

2. The steps of the Public Health Approach are interrelated, but not always sequential

The steps of the PHA theoretically build on each other and suggest a consecutive order of scientific study. In practice, however, public health studies are often not completed in such a sequence. Admittedly, adequately addressing the first question ("Is there a problem, and how big is it?") is an important and formative step. Answering this first question, with adequate supporting epidemiological surveillance data, can justify resources and focus future research, implementation, and evaluation. But what happens when this first question cannot be answered by existing evidence or public health surveillance programs?
For example, though public health surveillance has evolved since the early 20th century, its focus is still often on causes of death and acute traumatic injuries.3 As a result, less catastrophic and/or non-traumatic health outcomes are less likely to be recognized as a health "problem." If the lack of data is allowed to serve as justification that there is no problem, the problem's existence and scope are difficult to address. In such cases, prevention priorities are also more likely to be misdirected. However, a research investigation or policy question may highlight the problem, and thus initiate interest in or support for pursuing the full PHA. This is the situation that gave rise to the U.S. Army's efforts to prevent training-related injuries.

3. Basic research initiates the Army's injury prevention program

The U.S. Army's interest in injuries started in 1982 with a question from the Commanding General of the U.S. Army's Forces Command: "Would wearing running shoes for physical training instead of combat boots prevent injuries?" Military medical surveillance systems did not exist at that time, so this training-related injury question could not be easily answered. Instead, the USARIEM performed a scientific literature review, which revealed no evidence that running shoes or any other type of footwear would prevent injuries. Next, an intervention trial to determine whether running shoes worn during physical training would prevent injuries was planned; however, it was never conducted because, despite the lack of evidence supporting their effectiveness, the Army modified its policy and began requiring the use of running shoes during physical training.7 The question, and the lack of scientific evidence, nevertheless piqued the interest of USARIEM scientists. A further review of the literature in 1983 found no other proven ways to prevent physical training or exercise-related injuries. The lack of evidence was itself noteworthy, so a paper was written to summarize the scant findings in the

literature, consisting mostly of anecdotal clinical accounts and case series.8 The paper outlined the following potential risk factors for training injuries deserving further examination: prior physical condition, physical anomalies, body weight, previous injury, gender, training surfaces, footwear, and training techniques. The USARIEM then initiated several research projects to investigate the causes of and risk factors associated with training injuries among soldiers. The first field investigation was conducted at Ft. Jackson, South Carolina, in 1984.9 In that study, Jones et al. showed that female trainees experienced almost twice as many injuries as male trainees, and that the men and women who ran slowest on the initial physical training test experienced more training injuries (especially stress fractures) than their male and female peers who ran the fastest. Higher injury risks were also associated with low levels of muscle endurance (measured by numbers of push-ups and sit-ups), high and low body mass index (BMI), and low levels of prior physical activity among male trainees.9 The next USARIEM study, conducted in 1985 at the Parris Island Marine Recruit Depot, South Carolina, was an intervention trial to determine whether shock-absorbent boot insoles would prevent stress fractures among Marine recruits. The trial found that the selected shock-absorbent insole did not prevent stress fractures.10 The study did, however, support other studies by showing that recruits who had been the least physically active before entering the Marine Corps were at substantially higher risk of injury than the most active. This early work identified key potential risk factors for military training injuries and established the foundation for future investigations.5 Further research was conducted by the USARIEM from 1987 to 1989 at Forts Benning, Jackson, and Bliss. Key findings of these studies include:

• Ft. Benning, Georgia, 1987.
This investigation evaluated two infantry units, one that performed high running mileage and another that ran much less. The study found that the high-mileage unit ran twice as many miles, experienced higher injury risks, and actually ran slower on the final Army Physical Fitness Test.11 In addition to confirming higher injury risks with low amounts of running and exercise prior to entering the Army and low levels of physical fitness, a number of risk factors not previously established were identified, including older age, high arches, knock knees, cigarette smoking, and high and low flexibility.11–15
• Ft. Jackson, South Carolina, 1988. This investigation of Army recruits focused on body composition as a risk factor for injuries. The study provided further evidence suggesting that high and low percentages of body fat predisposed recruits to higher risks of injury.16 It also confirmed previously identified risk factors such as cigarette smoking17 and lower levels of aerobic fitness, and showed that women of the same fitness levels as men experience similar risks of injury.18
• Ft. Bliss, Texas, 1989. By 1989, running had increasingly been shown in the civilian literature to be a high-risk activity for MSK training injury. Data from the U.S. Army also suggested a greater risk of injury with more miles run.11,19 The study of Army basic combat trainees at Ft. Bliss was intended to determine whether injury risks could be reduced by providing recovery from weight-bearing training, primarily running, during the peak weeks of injury risk during Army basic combat training (BCT). However, the study showed no effect of additional recovery time on risks of stress fractures or other injuries.20

As a result of these studies, the USARIEM created a new research division, the Occupational Medicine Division, to focus research specifically on activities and potential risk factors associated with U.S. Army training-related injuries. The initial USARIEM research had focused on Army basic trainees, but it was not clear whether a similar problem existed in active operational units. To address this question, the
The initial USARIEM research had focused on Army basic trainees, but it was not clear if a similar problem existed in active operational units. To address this question, the

B.H. Jones et al. / Journal of Science and Medicine in Sport 21 (2018) 1139–1146

USARIEM employed a research protocol similar to that previously used to study trainees to investigate an infantry unit at Ft. Wainwright, AK. That study showed that, as with Army trainees, slow run times and low numbers of sit-ups were risk factors for injury among infantry soldiers.21 Another USARIEM study showed that high percentages of body fat, high and low BMI, low aerobic endurance (slow two-mile run times), low muscle endurance (fewer sit-ups), and cigarette smoking were associated with significantly higher risks of injury among infantry soldiers.22

The research methods developed by the USARIEM for studying military physical training-related injuries were increasingly reflected in other national and international military studies.23 A focus of many studies was the evaluation of risk factors associated with these injuries, as had been identified in the earliest USARIEM work.8 As a result of the increasing research, an expanded and refined list of intrinsic and extrinsic risk factors, which included health behaviors and types of training, evolved into that shown in Table 1.

4. Using the Public Health Approach to address data gaps

By the 1990s, it was evident that individual research studies alone would not be adequate to demonstrate the magnitude of the U.S. Army training-related injury problem, identify additional risk factors, determine what works to prevent injuries, and prioritize prevention goals. A comprehensive, systematic approach was needed. Strategic-level changes began when the U.S. Department of Defense (DOD) was included on the Centers for Disease Control (CDC) Advisory Committee for Injury Prevention and Control. During the Committee's deliberations, the CDC introduced the Army to its PHA, later outlined by Jones et al.5,24 The CDC's PHA provided a framework that the Army and DOD could use to organize their activities and ultimately reduce injuries. The process starts with the questions: Is there a problem, and if so, how big is it?
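The magnitude question is typically answered with surveillance-based incidence rates. A small illustration in Python, using made-up counts rather than actual Army surveillance figures:

```python
def incidence_rate(cases, person_years, per=1000):
    """Injuries per `per` person-years of surveillance follow-up."""
    return cases * per / person_years

# Hypothetical surveillance year: 54,000 injury medical encounters
# accrued over 45,000 person-years of follow-up
rate = incidence_rate(54_000, 45_000)
print(f"{rate:.0f} injuries per 1000 person-years")  # 1200 injuries per 1000 person-years
```

Tracking such rates year over year, and comparing them against other health conditions, is what allowed the work groups described below to demonstrate that injuries were the leading military health problem.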
Through the efforts of two work groups in the 1990s, the value of surveillance data was recognized, and it became increasingly clear that injuries were the biggest health problem not only of the U.S. Army but of the other U.S. military services as well. These two work groups helped to address the initial question of the PHA, as follows:

• DOD Injury Surveillance and Prevention Work Group (1992–1998). With this group, the Army first applied the PHA construct. This work group of U.S. military medical and safety personnel contributed to a joint effort to demonstrate the full impact of injuries. The primary result of the group's efforts was the pivotal publication, the "Atlas of Injuries in U.S. Armed Forces", which inventoried existing U.S. military databases containing injury data.1 Development of a comprehensive, integrated medical surveillance system and implementation of the steps of the PHA were cornerstones of the group's recommendations. In addition, at the request of the Secretary of Defense, the Atlas was used in 2000 by the National Safety Council as the basis for estimating the impact and costs of injuries for the U.S. military. The National Safety Council estimated the annual costs of injury to the DOD to be between $12 billion and $20 billion per year.25
• Armed Forces Epidemiologic Board (AFEB) Injury Prevention and Control Work Group (1994–1997). The DOD Injury Surveillance and Prevention Work Group presented its data to the AFEB to highlight the need for a routine process to prevent injury, including surveillance (Table 2). The AFEB concurred and concluded that a key to injury prevention and future military public health goals would be the development of a comprehensive medical surveillance system. The AFEB recommendations led the Assistant Secretary of Defense for Health Affairs to recognize the
magnitude of the problem of injuries and resulted in the designation of the Army and the APHC as the executive agent for the Defense Medical Surveillance System in 1997.

Data from across the spectrum of health, from deaths and disabilities to hospitalizations and outpatient visits, showed incontrovertibly that injuries were in fact the biggest health problem of the U.S. military. It also became clear that injury research received one of the smallest budgets. Neither budgets nor recognition of injuries as a problem was going to change without a systematic approach and surveillance data to demonstrate the full magnitude of the problem. While the USARIEM continued to pursue necessary injury research, the U.S. Army recognized the need to establish an organization to routinely monitor Army medical, and more specifically injury, data. In 1994, the predecessor of the APHC Injury Prevention Division was stood up. The design of the legacy Injury Prevention Division provided the Army with the capability to influence Army injury prevention efforts more strategically through routine surveillance and greater responsiveness to immediate problems. Of initial importance to the Division were activities that would address the first public health question: is there a problem, and what is its magnitude? As shown in Table 2, by 2000 the evidence more clearly demonstrated the magnitude of the military training-related injury problem. The USARIEM's earlier research had begun to address the second step (investigations of causes and risk factors).

Much impetus for injury prevention in the early 2000s came from the following committees and task forces:

• Military Training Task Force of the Defense Safety Oversight Council (2004–2012). After the National Safety Council reported its estimated DOD annual costs of injury, the U.S. Secretary of Defense formed the Defense Safety Oversight Council (DSOC).
The DSOC gave continued prominence to the problem of injuries across the DOD and to the value of medical data for assessing causes, risks, and interventions. The Task Force concluded that USACHPPM (predecessor of the APHC) was the only organization in the DOD with the investigative and evaluation capabilities to respond in a timely fashion to emerging injury questions. DSOC support enabled the investigation of training-related injury questions such as causes of injuries during deployment, the new T-11 parachute's jump-related injury risks, and assignment of running shoes based on foot shape.26 Many of the papers summarizing Task Force work, including the methods and process developed to identify DOD injury priorities, are contained in a 2010 supplement to the American Journal of Preventive Medicine titled "A Public Health Approach to Injury Prevention: the U.S. Military Experience."26
• Joint Services Training Injury Prevention Work Group (2004–2010). Created under the Military Training Task Force, this group consisted of 29 military and civilian researchers, public health practitioners, clinicians, and epidemiologists, including scientists from four U.S. military services (Army, Navy, Marine Corps, Air Force), the CDC, and academic institutions. The group conducted an expedited systematic review of the current literature and provided evidence-based recommendations for physical training-related injury interventions,27 discussed in further detail below.
• U.S. Department of Health and Human Services' Planning Committee for the Adequacy of Evidence for Physical Activity Guidelines (2007–2008). The APHC's expertise was recognized nationally through its contribution to this Committee. Because of its efforts to study military training injuries, the Army was able to outline the key risk factors for physical activity-related musculoskeletal injuries and provide evidence that the amount and type of physical activity have strong associations with injury risks.28
Table 1
Intrinsic and Extrinsic Risk Factors for U.S. Army Training-Related Injuries Identified in Key USARIEM and USAPHC Publications, 1982–2017.

Intrinsic risk factors

Demographic factors that increase risk:
• Female sex18,29,47,48
• Older age15,23,34,49

Anatomic factors that increase risk:
• High foot arches13,40
• Knock knees14
• High Q angles14

Physical fitness characteristics and effects on injury:
• Low aerobic fitness – increases risk9,18,21,34,35,47,49
• Low muscle endurance – increases risk9,18,21,34,35,47,49
• Muscle strength – no association with injury risk12,15,29
• Low and high flexibility – increase injury risk5,12,15,23,29,37
• Body composition – low and high BMI increase injury risk9,16,34,35,47

Health behaviors that increase risk:
• Sedentary lifestyle/inactivity9,15,16,29,47
• Tobacco use12,15,17,22,35
• Prior injury15,35

Extrinsic risk factors

Activities that increase risk:
• Running4,11,30,31,36,38
• Physical training (not running)4,23,31,35
• Obstacle courses (less exposure but high injury rates)23,31
• Foot marching/load carriage31,32
• Job duties/Army MOS (e.g., Infantry)34,50

Interventions: equipment, procedure/policy
Effective in reducing injury risk:
• Standardized physical training, reduced running mileage1,5,27,42
• Ability groups for running27,49
Not effective in reducing injury risk, or findings inconsistent:
• Back brace to prevent job or sport injury (no effect)27
• Running shoe types (no effect)33,39
• Boots versus running shoes (no effect)7
• Insoles, orthotics (inconsistent)10

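Risk comparisons across the activities listed in Table 1 are usually made per unit of exposure time rather than per soldier. A sketch in Python; the counts below are invented, chosen only to mirror the order of magnitude of the road-marching and obstacle-course rate ratios reported in this paper:

```python
def rate_per_1000_hours(injuries, exposure_hours):
    """Injury rate per 1000 hours of activity exposure."""
    return injuries * 1000 / exposure_hours

# Hypothetical unit training log: (injuries, total exposure hours)
activities = {
    "physical training": (40, 20_000),
    "road marching": (30, 3_000),
    "obstacle course": (7, 500),
}

rates = {a: rate_per_1000_hours(i, h) for a, (i, h) in activities.items()}
baseline = rates["physical training"]
for activity, rate in rates.items():
    # Rate ratio relative to general physical training
    print(f"{activity}: {rate:.1f}/1000 h (rate ratio {rate / baseline:.1f})")
```

Exposure-adjusted rates are what reveal activities such as obstacle courses as high risk despite their comparatively small share of total injuries.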
The second step of the process answers the question: What causes the problem? Because physical training injuries were a priority, accounting for over 50% of Army injuries,2 investigations to further address the causes and risk factors were warranted. Significantly, Knapik et al. showed that higher VO2 max, a more precise measure of aerobic fitness than run time, was associated with lower risks of injury, confirming that higher levels of aerobic fitness are protective.29 As shown in Table 1, other BCT studies in the late 1990s and early 2000s12,29 confirmed previous findings that:

• women experienced almost twice as many injuries as men,
• lower levels of physical fitness (measured by two-mile run times and push-ups on entry to training) were associated with higher injury risk,
• cigarette smoking was associated with higher injury risk in both men and women,
• high and low flexibility were associated with higher injury risk in men,
• low levels of physical activity in men prior to BCT were associated with higher injury risk, and
• muscle strength and power were not associated with risks of injury for men or women in basic training, whereas upper body muscle endurance (i.e., push-ups) was confirmed as a risk factor.

Recognizing the key role of the amounts and types of activity in the causation of physical training-related injuries, the APHC staff began focusing on those factors. In 2007, the APHC found that greater amounts of weight-bearing activity such as running and marching resulted in greater injury risks for male and female U.S. Army trainees.30 In 2011, it was shown for the first time in BCT that, compared to physical training such as calisthenics and running, the risk of sustaining an injury associated with road marching was almost five times greater per unit of time, and the risk of injury on the obstacle course was over seven times greater per unit of time spent on the activity.23,31 More recently, the APHC has shown that greater amounts of weight carried and greater distances of
miles marched were associated with greater risks of injury among male soldiers.32 These findings represent a dose-response effect: the more physical stress to which soldiers are exposed, the more likely it is that they will be injured.

As shown in Table 1, the study of injury causes and risk factors continues. The APHC continues to monitor injury surveillance data and to investigate questions about risk associated with specific concerns, such as use of minimalist shoes,33 specific military occupations,34 and physical training activities (e.g., extreme conditioning programs,35 running,36 and functional movement screening37). Running has continued to be a leading training-related cause of injury. For context, Hauret et al. showed in 2015 that over 55% of soldiers experience one or more injuries a year and that soldiers attribute over half of those injuries to exercise or sports, of which running accounted for 43%.4 This was supported by results from a 2014 injury prevention needs assessment survey, in which over half of the respondents reported an injury in the past year and over a third of those injuries were attributed to running.38

The third step of the process asks: What works to prevent the problem? As the evidence demonstrating the magnitude of the problem has grown, interest in finding effective prevention strategies has increased. For example, in 2006, when the DSOC considered recommending that the military assess the foot types of incoming recruits in order to prescribe running shoes to minimize injury risk, the APHC recommended that an intervention trial be conducted to determine the effectiveness of such an intervention. The resulting intervention trials of U.S. Army, Air Force, and Marine recruits found that prescribing shoes on the basis of foot type did not reduce injuries.39 This was an important lesson learned, and it saved the military from unnecessary expenditures related to implementation of an ineffective injury prevention measure (i.e., measuring static foot type to prescribe shoe types does not work).

Given the time and expense of intervention studies, the systematic review process (i.e., the systematic process of compiling data from existing publications to establish "current evidence") is another methodology increasingly used to summarize intervention

Table 2
The Public Health Process: Examples of U.S. Army Training-Related Injury Studies, USARIEM and USAPHC, 1982–2017.

1. Is there a problem? How big is it?
Step and example publications: Surveillance, Epidemiological Studies; (1999) Atlas of Injuries (Ch 1–5, 8, 9)1; (1999) Training and exercise-related injuries: surveillance, research, and injury prevention in military populations5; (2000) Jones BH. An Armed Forces Epidemiological Board Evaluation51; (2010) American Journal of Preventive Medicine supplement26
Specific examples:
• Studies demonstrate magnitude of injury medical encounters in the Army/U.S. military: (1999) Atlas of Injuries in the military1; (2000) Jones BH, Hansen BC, AJPM52
• Surveillance shows Army cumulative stress (overuse) injuries are unrecognized: (2000) Methodology and results for describing the Army cumulative MSK injury problem55
• Installation-specific reports describe extent and type of local injury problem: (2002) First Installation Injury Reports (AMSA)53; (2004) Assessment of Ft. Jackson physical training program54; (2017) Installation-specific annual report by APHC45
• (2015) Annual summary of Army medical conditions (Health of the Force)43
• (2009) Example Army annual injury report for 2008 data48

2. What causes the problem?
Step and example publications: Research, Field Investigations; (1999) Atlas of Injuries (Ch 6, 9)1; (1999) Training and exercise-related injuries: surveillance, research, and injury prevention in military populations5; (2010) American Journal of Preventive Medicine supplement26
Specific examples:
• USARIEM initiates research program to address training injuries: (1983) Overuse injuries of lower extremities associated with marching, jogging, running8; (1993) Intrinsic risk factors for military exercise-related injuries47; (1994) Cigarette smoking, physical fitness, and injuries in infantry soldiers22
• APHC* investigations to identify risk factors: (2001) Risk factors for training-related injuries in basic training29; (2015) Occupation and other risk factors for injury among enlisted U.S. soldiers34; (2017) Effects of physical training and fitness on running injuries36; (2017) Risk factors for injury associated with low, moderate, and high mileage marching32

3. What works to prevent the problem?
Step and example publications: Intervention Trials, Systematic Reviews; (2010) American Journal of Preventive Medicine supplement26
Specific examples:
• Intervention trials: (1985) Evaluation of shock absorbent boot insoles10; (2007) Foot type assessment and footwear prescription39
• Systematic reviews: (2002) Prevention of lower extremity stress fractures40; (2010) Prevention of physical training related injuries27

4. Who needs to know and do what?
Step and example publications: Program and Policy Implementation; (1999) Atlas of Injuries (Ch 6, 9)1; (2000) An Armed Forces Epidemiological Board Evaluation52; (2010) American Journal of Preventive Medicine supplement26
Specific examples:
• Implementation of a new Army PT program with reduced running: (2005) Evaluation of a standardized physical training program for basic combat training42
• Assessment of soldier knowledge/perceptions: (2015) Needs assessment of what soldiers know regarding injury, risks, interventions, and leadership38

5. How effective is what was done?
Step and example publications: Program Evaluation and Monitoring; (2010) American Journal of Preventive Medicine supplement26
Specific examples:
• First program evaluation: (2005) Evaluation of a standardized physical training program for basic combat training42

* APHC = the organization from 1994 (then the U.S. Army Center for Health Promotion and Preventive Medicine, USACHPPM) to the present.

evidence. The 2002 systematic review of interventions to prevent stress fractures by Jones et al. represents the first systematic review directly pertinent to a U.S. Army training injury issue.40 Other systematic reviews that have contributed to the efforts of the Army's injury prevention program include reviews of training-related injury prevention strategies, mouth guard effectiveness, injury risks from wearing combat boots versus running shoes, and the correlation of physical fitness with military task performance.7,27,39,41 Of particular note, the review and recommendations of Bullock et al. identified several interventions for reducing military training-related injuries considered to be effective, not effective, or supported by inadequate evidence. Effective interventions included reduced running mileage, leadership oversight of training, mouth guards in high-risk activities (e.g., combat training, sports like football), use of rigid ankle braces during high-risk activities (e.g., parachuting, basketball), and incorporation of balance, agility, and proprioceptive warm-ups.27 Only six of the 31 interventions thought to have potential to prevent training- or sports-related injuries had enough evidence to recommend them; 23 interventions had insufficient evidence, and two strategies (back braces, pre-exercise ingestion of nonsteroidal anti-inflammatory drugs) were not recommended.

The fourth step considers: Who needs to know and do what? Since prior research and literature demonstrated that running was a high-risk activity for injury, in 2002 the APHC provided the Army Training and Doctrine Command leadership a training injury prevention strategy that focused on reductions in physical training running mileage, exercise progression, precision of movement, and a greater variety of exercises such as multidirectional grass drills.42 The APHC conducted an evaluation of the new "standardized physical training program" in 2003, at the start of its implementation.
The evaluation showed a 40% reduction in injury risk with the new program.42 Since implementation of the standardized program in 2003, injury surveillance data have shown a steady decline in injury rates during basic combat training.43

A 2014 injury prevention needs assessment survey was another effort to address the fourth step of the PHA.38 It was recognized that evidence produced through research and documented only in the scientific literature would not reach, or change the behaviors of, the general Army population. The primary survey goal was to identify injury types, activities, risk factors, and prevention tactics to be addressed in future educational products for soldiers and leaders. The results identified interests and perceptions, as well as gaps in knowledge, that have helped prioritize the APHC's development of injury prevention factsheets and social media articles.

Additional efforts led by the APHC in support of the fourth step of the PHA have begun to focus on supporting U.S. Army base or installation needs, as injury prevention is recognized to be most successful when applied and enforced at the installation and unit level.
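Program evaluations of this kind compare cumulative injury incidence between cohorts trained with and without the program. A minimal sketch with hypothetical cohort counts (not the evaluation's actual data), chosen to reproduce a 40% reduction:

```python
def risk_reduction(injured_new, n_new, injured_old, n_old):
    """Relative risk under the new program and the implied percent
    reduction in injury risk versus the comparison program."""
    rr = (injured_new / n_new) / (injured_old / n_old)
    return rr, (1 - rr) * 100

# Hypothetical cohorts: 150 of 1000 injured under the standardized
# program vs. 250 of 1000 under traditional training
rr, pct = risk_reduction(150, 1000, 250, 1000)
print(f"RR = {rr:.2f}; risk reduction = {pct:.0f}%")  # RR = 0.60; risk reduction = 40%
```

Building such a comparison into the rollout of a program, rather than after the fact, is what makes the fifth step of the PHA answerable.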
For example, the APHC is now formalizing a means to connect unit commanders with potential installation injury prevention partners across numerous disciplines (e.g., preventive medicine, physical therapy, occupational health, health promotion, safety) through the existing Community Health Promotion Council infrastructure.44 Current efforts coordinate installation medical assets, standardize installation injury monitoring, and provide data to inform prevention decision-making and the prioritization of local injury prevention goals.45 Part of this effort is the adoption of systems engineering methodology to enable long-term monitoring of installation injury trends and identification of statistically significant shifts indicating when injury rates are unacceptably high or when significant reductions have been achieved.46

The final step addresses the question: How effective is what was done? The APHC has been requested to evaluate the benefits of unit-specific training programs. To date, the most important intervention evaluation has been that of the "standardized physical training program" in Army BCT.42 If enforced, continued evidence

Table 3
Example methodological issues and solutions in U.S. Army injury prevention program publications.

• Question: What is meant by "injuries" when conducting injury epidemiological surveillance?
Solution: Create a scientific, citable reference providing conceptual and operational definitions of injuries for consistent epidemiological use3,55
• Question: Can soldiers' self-reported APFT scores be used as a measure of fitness?
Solution: Provide evidence to validate the use of soldier-recalled APFT and body mass scores as measures of fitness56,57
• Question: Is BMI an appropriate measure for Army soldiers' body composition?
Solution: Provide evidence to justify the use of BMI as a measure of body composition in epidemiological injury studies58
• Question: How can the Army monitor injury rates to determine if there are problems or improvements over the years?
Solution: Demonstrate the application of statistical control charts to monitoring trends in injury rates over time46
• Question: How can the military evaluate scientific evidence to determine which training-related injury prevention strategies and equipment are effective?
Solution: Demonstrate the state of the science for effective injury interventions using a systematic process27
• Question: How can scientific evidence be used to influence injury prevention policy?
Solution: Establish an evidence-based public health approach to injury prevention26
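The statistical control charts referenced in Table 3 can be sketched as a u-chart, which flags periods whose injury rate falls outside limits derived from pooled experience. A hedged Python illustration with invented monthly data:

```python
import math

def u_chart_limits(counts, exposures):
    """Center line and per-period 3-sigma limits for a u-chart
    (injuries per unit of exposure, e.g., per person-month)."""
    u_bar = sum(counts) / sum(exposures)
    limits = []
    for n in exposures:
        sigma = math.sqrt(u_bar / n)
        limits.append((max(0.0, u_bar - 3 * sigma), u_bar + 3 * sigma))
    return u_bar, limits

# Hypothetical monthly injury counts and person-months of exposure
counts = [52, 48, 61, 45, 90]
exposures = [1000, 980, 1010, 995, 1005]

u_bar, limits = u_chart_limits(counts, exposures)
for c, n, (lo, hi) in zip(counts, exposures, limits):
    rate = c / n
    flag = "out of control" if not (lo <= rate <= hi) else "in control"
    print(f"rate {rate:.3f} vs ({lo:.3f}, {hi:.3f}): {flag}")
```

With these made-up numbers, only the final month exceeds its upper limit, the kind of statistically significant shift the monitoring effort is designed to surface.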

of injury reductions is anticipated. For injury prevention progress to continue, it is essential to include evaluation plans when implementing future injury prevention programs and policies.

In addition, it has been necessary to describe and validate various methodological issues in the scientific literature. Table 3 summarizes examples of the Army injury prevention method-related issues that have been addressed to ensure the credibility of the scientific underpinnings of U.S. Army injury prevention initiatives.

To reiterate, the focus of this paper is intentionally on U.S. Army injury prevention research and other activities. However, the Australian, British, and other armies continue to advance military injury prevention knowledge through their work to identify military injury risk factors and evaluate prevention strategies. Future partnerships will continue to inform U.S. military injury prevention. In addition, the PHA admittedly requires a breadth and depth of resources to execute. Field investigation and survey capabilities that enable defining "what causes the problem?" are likely best maintained centrally, while program development that determines "what prevents the problem?" may best be designed at the unit level, where specific mission requirements and personnel characteristics can be taken into account. Other steps, such as defining "is there a problem, and how big is it?" and "have programs and policies reduced the problem?", require partnerships using centralized and unit resources. In large organizations like the U.S. Army, with continuous and long-term leadership support, it is challenging yet possible to develop and sustain the capability to execute all aspects of the PHA. Effective communication across the organization is essential to maintain an understanding of knowledge gaps and disseminate scientific lessons learned.

5. Conclusions

Just as with public health problems such as smoking cessation and seat belt use, it has taken time for the science of military training-related injury prevention to be recognized and take effect. The U.S. Army's injury prevention efforts can be traced back to a 1982 research question of whether running shoes would result in fewer injuries than running in boots. The initial USARIEM work formed the basis for other questions, which were later systematically addressed using the PHA. Systematically building the
capability to perform all steps of the PHA has taken a long-term focus, formation of partnerships, commitment to the steps of the process, and the dedicated efforts of numerous military and civilian subject matter experts. This commitment has led to significant progress in addressing a large and complex health problem, Army training-related injuries. The U.S. Army now has the evidence to describe the magnitude of the problem and identify key risk factors. Importantly, a reduction of training-related injuries was also demonstrated with a standardized physical readiness training program that put less emphasis on running. Other discoveries, such as the ineffectiveness of running shoe prescription, have provided the Army with the information necessary to avoid potentially costly or resource-intensive changes to equipment or procedures. Additional work to identify and assess interventions is needed.

The U.S. Army's injury prevention program has matured tremendously over three decades. As a result, risk factors are much more clearly understood, and future efforts can begin to focus on targeting populations at highest risk, identifying leading mechanisms, evaluating promising interventions, and disseminating proven prevention strategies and the data necessary for monitoring a leading Army health issue.

Acknowledgments

The APHC predecessors include the following organizations: U.S. Army Center for Health Promotion and Preventive Medicine, 1995–2010; U.S. Army Public Health Command, 2010–2015; Army Public Health Center, 2016–present.

References

1. Jones BH, Amoroso PJ, Canham ML et al. Atlas of injuries in the United States Armed Forces. Mil Med 1999; 164(8 Suppl):1-1 to 9-25.
2. Jones BH, Canham-Chervak M, Canada S et al. Medical surveillance of injuries in the U.S. military: descriptive epidemiology and recommendations for improvement. Am J Prev Med 2010; 38(1 Suppl):S42–S60.
3. Hauschild VD, Hauret KG, Richardson M et al. A Taxonomy of Injuries for Public Health Monitoring and Reporting (Public Health Information Paper No. 12-010717), Aberdeen Proving Ground, MD, Army Public Health Center, 2017.
4. Hauret KG, Bedno S, Loringer K et al. Epidemiology of exercise- and sports-related injuries in a population of young, physically active adults: a survey of military servicemembers. Am J Sports Med 2015; 43(11):2645–2653.
5. Jones BH, Knapik JJ. Physical training and exercise-related injuries. Surveillance, research and injury prevention in military populations. Sports Med 1999; 27(2):111–125.
6. Sleet DA, Baldwin G. It wouldn't hurt to create a safer military. Am J Prev Med 2010; 38(1 Suppl):S218–S221.
7. Knapik JJ, Jones BH, Steelman RA. Physical training in boots and running shoes: a historical comparison of injury incidence in basic combat training. Mil Med 2015; 180(3):321–328.
8. Jones BH. Overuse injuries of the lower extremities associated with marching, jogging, and running: a review. Mil Med 1983; 148(10):783–787.
9. Jones B, Manikowski R, Harris J et al. Incidence of and Risk Factors for Injury and Illness Among Male and Female Army Basic Trainees, Natick, MA, U.S. Army Research Institute of Environmental Medicine, 1988.
10. Gardner Jr LI, Dziados JE, Jones BH et al. Prevention of lower extremity stress fractures: a controlled trial of a shock absorbent insole. Am J Public Health 1988; 78(12):1563–1567.
11. Jones BH, Cowan DN, Knapik JJ. Exercise, training, and injuries. Sports Med 1994; 18(3):202–214.
12. Cowan D, Jones B, Tomlinson P et al. The Epidemiology of Physical Training Injuries in US Army Infantry Trainees: Methodology, Population, and Risk Factors, Natick, MA, U.S. Army Research Institute of Environmental Medicine, 1988.
13. Cowan DN, Jones BH, Robinson JR. Foot morphologic characteristics and risk of exercise-related injury. Arch Fam Med 1993; 2(7):773–777.
14. Cowan DN, Jones BH, Frykman PN et al. Lower limb morphology and risk of overuse injury among male infantry trainees. Med Sci Sports Exerc 1996; 28(8):945–952.
15. Jones BH, Cowan DN, Tomlinson JP et al. Epidemiology of injuries associated with physical training among young men in the army. Med Sci Sports Exerc 1993; 25(2):197–203.
16. Jones BH, Bovee MW, Knapik JJ. Associations among body composition, physical fitness, and injury in men and women Army trainees, in Body Composition and Physical Performance, Institute of Medicine, editor, Washington, DC, National Academy Press, 1992, p. 141–173.
17. Altarac M, Gardner JW, Popovich RM et al. Cigarette smoking and exercise-related injuries among young men and women. Am J Prev Med 2000; 18(3 Suppl):96–102.
18. Bell NS, Mangione TW, Hemenway D et al. High injury rates among female army trainees: a function of gender? Am J Prev Med 2000; 18(3):141–146.
19. Jones BH, Rock PB, Moore MP. Musculoskeletal Injury: Risks, Prevention, and First Aid, Natick, MA, U.S. Army Research Institute of Environmental Medicine, 1986.
20. Popovich RM, Gardner JW, Potter R et al. Effect of rest from running on overuse injuries in army basic training. Am J Prev Med 2000; 18(3 Suppl):147–155.
21. Knapik J, Ang P, Reynolds K, Jones B. Physical fitness, age, and injury incidence in infantry soldiers. J Occup Med 1993; 35(6):598–603.
22. Reynolds KL, Heckel HA, Witt CE et al. Cigarette smoking, physical fitness, and injuries in infantry soldiers. Am J Prev Med 1994; 10(3):145–150.
23. Jones BH, Hauschild VD. Physical training, fitness, and injuries: lessons learned from military studies. J Strength Cond Res 2015; 29(Suppl 11):S57–S64.
24. Jones BH, Canham-Chervak M, Sleet DA. An evidence-based public health approach to injury priorities and prevention recommendations for the U.S. military. Am J Prev Med 2010; 38(1 Suppl):S1–S10.
25. Department of Defense Inspector General. Evaluation of DoD Accident Reporting (Report No. SPO-2010-007), Arlington, VA, Department of Defense, September 2010.
26. Canham-Chervak M, Jones BH, Sleet DA, eds. A public health approach to injury prevention: the U.S. military experience. Am J Prev Med 2010; 38(1 Suppl 1). Available at: https://phc.amedd.army.mil/topics/healthsurv/ip/Pages/PublicationsandReports.aspx. Accessed September 2017.
27. Bullock SH, Jones BH, Gilchrist J et al. Prevention of physical training-related injuries: recommendations for the military and other active populations based on expedited systematic reviews. Am J Prev Med 2010; 38(1 Suppl):S156–S181.
28. Institute of Medicine (IOM). Adequacy of Evidence for Physical Activity Guidelines Development: Workshop Summary, National Academy Press, 2007.
29. Knapik JJ, Sharp MA, Canham-Chervak M et al. Risk factors for training-related injuries among men and women in basic combat training. Med Sci Sports Exerc 2001; 33(6):946–954.
30. Knapik JJ, Hauret KG, Canada S et al. Association between ambulatory physical activity and injuries during United States Army Basic Combat Training. J Phys Act Health 2011; 8(4):496–502.
31. Knapik JJ, Graham BS, Rieger J et al. Activities associated with injuries in initial entry training. Mil Med 2013; 178(5):500–506.
32. Schuh-Renner A, Grier TL, Canham-Chervak M et al. Risk factors for injury associated with low, moderate, and high mileage road marching in a U.S. Army infantry brigade. J Sci Med Sport 2017. Available at: https://doi.org/10.1016/j.jsams.2017.07.027. Accessed September 2017.
33. Grier T, Canham-Chervak M, Bushman T et al. Minimalist running shoes and injury risk among United States Army soldiers. Am J Sports Med 2016; 44(6):1439–1446.
34. Anderson M, Grier T, Canham-Chervak M et al. Occupation and other risk factors for injury among enlisted U.S. Army soldiers. Public Health 2015; 129(5):531–538.
35. Grier T, Canham-Chervak M, McNulty V et al. Extreme conditioning programs and injury risk in a US Army Brigade Combat Team. US Army Med Dep J 2013:36–47.
36. Grier TL, Canham-Chervak M, Anderson MK et al. Effects of physical training and fitness on running injuries in physically active young men. J Strength Cond Res 2017; 31(1):207–216.
37. Bushman TT, Grier TL, Canham-Chervak M et al. The Functional Movement Screen and injury risk: association and predictive value in active men. Am J Sports Med 2016; 44(2):297–304.
38. Hauschild VD, Schuh A, Jones BH. What soldiers know and want to know about preventing injuries: a needs survey regarding a key threat to readiness. US Army Med Dep J 2016; 2016:10–19.
39. Knapik JJ, Trone DW, Tchandja J et al. Injury-reduction effectiveness of prescribing running shoes on the basis of foot arch height: summary of military investigations. J Orthop Sports Phys Ther 2014; 44(10):805–812.
40. Jones BH, Thacker SB, Gilchrist J et al. Prevention of lower extremity stress fractures in athletes and soldiers: a systematic review. Epidemiol Rev 2002; 24(2):228–247.
41. Hauschild VD, DeGroot DW, Hall SM et al. Fitness tests and occupational tasks of military interest: a systematic review of correlations. Occup Environ Med 2017; 74(2):144–153.
42. Knapik J, Darakjy S, Scott SJ et al. Evaluation of a standardized physical training program for basic combat training. J Strength Cond Res 2005; 19(2):246–253.
43. Health of the Force (Technical Report No. TA-267-1215), Aberdeen Proving Ground, MD, Army Public Health Center (Provisional), 2015.
44. Courie AF, Rivera MS, Pompey A. Managing public health in the Army through a standard community health promotion council model. US Army Med Dep J 2014:82–90.
45. Canham-Chervak M, Pecko J, Schuh A et al. Summary of activities supporting the Army Medicine 2020 (AM2020) Campaign's Injury and Violence Free Living Program, March 2013–July 2016 (Public Health Report No. S.0023112), Aberdeen Proving Ground, MD, Army Public Health Center, 2017.
46. Schuh A, Canham-Chervak M, Jones BH. Statistical process control charts for monitoring military injuries. Inj Prev 2017; 23:416–422.
47. Jones BH, Bovee MW, Harris 3rd JM et al. Intrinsic risk factors for exercise-related injuries among male and female army trainees. Am J Sports Med 1993; 21(5):705–710.

1146

B.H. Jones et al. / Journal of Science and Medicine in Sport 21 (2018) 1139–1146

48. Canham ML, Knapik JJ, Smutok MA et al. Training, physical fitness, and injuries among men and women preparing for occupations in the Army, In: Advances in Occupational Ergonomics and Safety: Proceedings of the XIIIth Annual International Occupational Ergonomics and Safety Conference. Washington, DC, IOS Press, 1998. p. 711–714. 49. Knapik JJ, Darakjy S, Hauret KG et al. Increasing the physical fitness of lowfit recruits before basic combat training: an evaluation of fitness, injuries, and training outcomes. Mil Med 2006; 171(1):45–54. 50. Knapik JJ, Jones SB, Darakjy S et al. Injury rates and injury risk factors among U.S. Army wheel vehicle mechanics. Mil Med 2007; 172(9):988–996. 51. DoD Injury Surveillance and Prevention Work Group.Injuries in the military: a hidden epidemic, In: Jones BH, Hansen BC, editors, Falls Church, VA, Headquarters, Department of the Army, 1996. 52. Jones BH, Hansen BC. An Armed Forces Epidemiological Board evaluation of injuries in the military. Am J Prev Med 2000; 18(3 Suppl):14–25. 53. Knapik JJ, Hauret K, Jones BH. Primary prevention of injuries in initial entry training, in Recruit Medicine, Borden Institute, editor, Washington, DC, Department of Defense, Office of the Surgeon General, U.S. Army, 2006.

54. Hauret KG, Knapik JJ, Lange JL et al. Outcomes of Fort Jackson’s Physical Training and Rehabilitation Program in Army basic combat training: return to training, graduation, and 2-year retention. Mil Med 2004; 169(7):562–567. 55. Hauret KG, Jones BH, Bullock SH et al. Musculoskeletal injuries description of an under-recognized injury problem among military personnel. Am J Prev Med 2010; 38(1 Suppl):S61–S70. 56. Jones SB, Knapik JJ, Sharp MA et al. The validity of self-reported physical fitness test scores. Mil Med 2007; 172(2):115–120. 57. Martin RC, Grier T, Canham-Chervak M et al. Validity of self-reported physical fitness and body mass index in a military population. J Strength Cond Res 2016; 30(1):26–32. 58. Grier T, Canham-Chervak M, Sharp M et al. Does body mass index misclassify physically active young men. Prev Med Rep 2015; 2:483–487.

Journal of Science and Medicine in Sport 21 (2018) 1147–1153


Military applications of soldier physiological monitoring☆

Karl E. Friedl

U.S. Army Research Institute of Environmental Medicine, USA

Article info

Article history:
Received 12 September 2017
Received in revised form 10 March 2018
Accepted 11 June 2018
Available online 20 June 2018

Keywords:
Physiological models
Wearable sensors
Military performance
Thermal biology
Metabolic monitoring
Neurophysiology

Abstract

Wearable physiological status monitoring is part of modern precision medicine that permits predictions about an individual's health and performance from their real-time physiological status (RT-PSM), instead of relying on population-based predictions informed by estimated human, mission, and environmental/ambient conditions. RT-PSM systems have useful military applications if they are soldier-acceptable and provide important actionable information. Most commercially available systems do not address relevant military needs, typically lack the validated algorithms that make real time computed information useful, and are not open-architected for integration with the soldier's technological ecosystem. Military RT-PSM development requires committed investments in iterative efforts involving physiologists, biomedical engineers, and the soldier users. Military operational applications include: (1) technological enhancement of performance by providing individual status information to optimize self-regulation, workload distribution, and enhanced team sensing/situational awareness; (2) detection of impending soldier failure from stress load (physical, psychological, and environmental); (3) earliest possible detection of threat agent exposure that includes the “human sensor”; (4) casualty detection, triage, and early clinical management; (5) optimization of individual health and fitness readiness habits; and (6) long term health risk-associated exposure monitoring and dosimetry. This paper is focused on the performance-related applications and considers near term predictions such as thermal-work limits, alertness and fitness for duty status, musculoskeletal fatigue limits, neuropsychological status, and mission-specific physiological status. Each new measurement capability has provided insights into soldier physiology and advances the cycle of invention, lab and field testing, new discovery and redesign.
Published by Elsevier Ltd on behalf of Sports Medicine Australia. This is an open access article under the CC BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/).

“Unless the void that exists between the scientist or engineer and the war fighter is recognized, a hiatus will exist between the inventor who knows what they could invent, if they only knew what was wanted, and the soldiers who know, or ought to know, what they want and would ask for it if they only knew how much science could do for them. You have never really bridged that gap yet.” Winston Churchill, World Crisis, v. 2, 1927.1

1. Introduction

Wearable physiological monitoring can provide predictions about an individual's health and performance from their real-time physiological state. This precision medicine approach offers major improvements over population-based predictions derived

☆ Mandatory Disclaimer: The opinions and assertions in this paper are those of the author and do not necessarily represent the official views or policies of the U.S. Department of the Army.
E-mail address: [email protected]

from ambient conditions and the general context of a mission. Advances in computing power and microelectronics make this improvement in human performance assessment possible, with real time physiological measurement capabilities and data processing that can provide actionable and important information about the individual. This review summarizes current progress in the development of these systems for military applications.

Previously, predicting soldier work-rest cycles and training limits could only be addressed using generalized models based on estimated inputs about individuals and ambient conditions.2 In this example of thermal-workload limits, real time physiological status monitoring (RT-PSM) now provides a new military capability: individual assessment of soldier performance limits.3,4 The technological advance has come about by turning the focus inward, to consider the actual state of an individual, instead of extrapolating from external conditions and assuming typical responses. Thermal-work strain monitoring is one of the first military PSM applications to be used outside of the research community but provides only one example of near term uses (Table 1).

Currently available commercial systems generally do not satisfy the requirements for military use. Even when the systems offer

https://doi.org/10.1016/j.jsams.2018.06.004 1440-2440/Published by Elsevier Ltd on behalf of Sports Medicine Australia. This is an open access article under the CC BY-NC-ND license (http://creativecommons.org/ licenses/by-nc-nd/4.0/).


K.E. Friedl / Journal of Science and Medicine in Sport 21 (2018) 1147–1153

Table 1
Some military applications of wearable physiological monitoring technologies.

Soldier performance and readiness applications (to be used by soldiers and leaders)

Specific outcome assessments in near term system development
• Thermal-work strain limits
• Alertness and fitness for duty
• Impending musculoskeletal injury & physical fatigue limits
• Neuropsychological status (mood and cognitive status)
• Pulmonary exposures limiting performance
• Specialized environmental exposures (e.g., hypoxia, peripheral cold monitoring)

Some likely use cases
• Training to personalized safe limits of performance
• Mission decision support tool (e.g., optimized route, pacing, soldier status)
• Physiological controller for performance augmentation systems (e.g., agile microclimate cooling, exoskeleton activation/proprioception)
• Man-machine interface to distribute workload (i.e., stress management)
• Biofeedback to train self-regulated performance
• Situational awareness in networked squad (i.e., shared sensing)
• Materiel testing and acquisition/product selection decision tools

Health and medical management applications (to be used by medical providers)
• Casualty detection, remote triage, and medical management (e.g., hemorrhage)
• Chemical/biological threat agent exposure – early detection and management
• Environmental/military occupational exposure dosimetry (e.g., blast exposure)
• Health readiness behavioral management tool (e.g., Army Triad initiative)

something more than raw physiological data, the computed information, such as recent sleep history or caloric expenditure, is usually based on proprietary algorithms that cannot be properly reviewed and validated, making the output unusable. Insecure and power-demanding Bluetooth connections and proprietary architectures cannot be easily integrated into tactically secure systems and military communications networks. The systems should not add significant weight to the soldier, and they cannot require daily recharge or battery replacement. Thus, reduced size, weight and power (SWaP) is critical to soldier acceptability and tactical usability, and relevant applications are best developed iteratively with the intended users. These are some of the considerations for making an RT-PSM system fit into the technological ecosystem of the soldier (Fig. 1).

The critical component of an RT-PSM system is the algorithm that turns data into useful and actionable knowledge for a soldier or a small unit leader. Useful information from an RT-PSM system is defined as vitally important alerts that can be acted on to affect the outcome of a mission or, in a training environment, to improve the safety and effectiveness of training. Physiological telemetry has been an important research tool for investigating the full range of normal human responses outside of the laboratory; however, reams of raw physiological data are not particularly useful in military applications. Even to a trained medic, a parameter such as elevated heart rate could variously mean that an individual is properly activated for peak performance, hemorrhaging and in need of urgent care, or experiencing psychological distress. The validated algorithms that turn data into useful information are fundamental to RT-PSM, and these algorithms should then guide the minimal sensor set that is required.
A technology “push” from new and interesting sensors must be at least matched by an informed “pull” strategy driven by Army problems and leader information needs. This represents the coordination gap that Churchill identified: “what the inventor could invent” and “what the soldier wants”.1

2. Field measurement of energy metabolism and workload demands Energy expenditure is a metabolic parameter that is fundamental to many models and applications involving thermal, workload, and injury risk predictions. For example, cold injury risks can be significantly reduced during high energy expenditure with metabolic heat production, while high physical activity levels are increasingly limited in hot environments. Activity energy expenditure (AEE) has been used to assess workload; fatigue limits for commander foot march planning have been expressed in terms of energy expenditure rates (although endurance time for operational missions and overuse injury risk in training would be more practical translations of the workload information). Total daily energy expenditure (TDEE) estimates are also used for fitness and weight management programs. Generalized predictions of AEE and TDEE are not good enough. For example, population-based predictions of energy expenditure such as the original Goldman–Givoni model and subsequent modifications provide general planning estimates for walking speed and load carriage but are inaccurate, with underprediction of metabolic costs by as much as one third.5 The recently reported Ludlow–Weyand model simplifies assumptions to walking speed, grade, and total mass and may prove to be more generalizable, but this needs to be fully evaluated.6 Wearable systems have typically estimated individual TDEE from heart rate or accelerometry.7,8 Triaxial accelerometry on the trunk improves TDEE estimates over that predicted simply from age, body mass, and height, with estimates within 1 MJ/d.9 The location of the sensors on the body is an important factor in capturing the desired information. 
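To make this model family concrete, here is a minimal sketch (an illustration, not code from this paper) of one widely cited descendant of the Goldman–Givoni approach, the Pandolf load carriage equation; the roughly one-third underprediction of metabolic cost noted above applies to generalized planning models of this kind:

```python
def pandolf_metabolic_rate(body_mass, load, speed, grade, eta=1.0):
    """Predicted walking metabolic rate in watts (Pandolf et al. variant
    of the Givoni-Goldman model family).

    body_mass  soldier body mass (kg)
    load       external load carried (kg)
    speed      walking speed (m/s)
    grade      slope (%)
    eta        terrain factor (1.0 = treadmill or paved road)
    """
    # Standing component, penalized for load relative to body mass
    standing = 1.5 * body_mass + 2.0 * (body_mass + load) * (load / body_mass) ** 2
    # Walking component, dependent on speed and grade
    walking = eta * (body_mass + load) * (1.5 * speed ** 2 + 0.35 * speed * grade)
    return standing + walking

# A 70 kg soldier carrying 30 kg at 1.34 m/s (about 3 mph)
level = pandolf_metabolic_rate(70.0, 30.0, 1.34, 0.0)   # level paved road
uphill = pandolf_metabolic_rate(70.0, 30.0, 1.34, 5.0)  # 5% grade
```

Such models take only planning inputs (mass, load, speed, grade, terrain); RT-PSM replaces these generalized estimates with measured individual responses.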
Wrist-worn heart rate measurements are less reliable than chest or trunk measurements, even after correction for motion artifacts during intense exercise.10 If only one sensored site is available, the trunk is also more reliable than the wrist for accelerometry-based TDEE.11 This is unfortunate, as wear compliance is generally better with a wristwatch-like system than with other typical body sites. Boot-worn systems can accurately estimate activity-based energy expenditure (AEE) and classify types of activity.12 Simple foot contact time from a boot-worn device provides AEE.13 Measures of foot contact time and heart rate over a period of time also track changes in an individual's aerobic fitness.14 Measuring distance traveled, using inertial navigation and global positioning systems combined with barometric sensing of vertical movement, further increases opportunities to improve AEE estimates.15

Monitoring the workload associated with physical training injury risk is an important target for wearable technologies, as musculoskeletal injuries continue to be a major modifiable component affecting military readiness.16 Energy expenditure is only one component of these predictions. Patterns of movement and ground reaction forces may predict impending injury, permitting preventive “prehabilitation” such as gait retraining or other interventions. Sophisticated but intrusive wearable systems have been used experimentally for field biomechanical assessments. Simpler “smart shoe” devices with sensors on each shoe have provided useful performance data based on pattern analyses.17 It has been rather more difficult to develop suitable footwear inserts that measure ground reaction forces without causing foot irritation and without rapidly breaking down from use; a notable line of inquiry led by Reed Hoyt has explored a succession of strategies ranging from sensors in form-fitted insoles to acoustic ground sensors to piezoelectric foam technologies.
Development efforts in the Netherlands and the U.S. have produced prototypes for field studies on training workload and load carriage.18,19 Parallel efforts have demonstrated the feasibility of estimating ground reaction forces using only inertial sensors worn around each ankle, suggesting other options outside the boot.20


Fig. 1. The range of technical considerations for a real time physiological status monitoring system includes much more than the algorithms and physiological measurements discussed in this paper. Close partnerships with the developer and user communities are necessary to the actual implementation of a soldier useable system. Source: Reed Hoyt (USARIEM) and Jeffrey Palmer (MIT Lincoln Labs), unpublished.

More direct TDEE measurements may be obtainable with new portable metabolic monitors that measure expired O2 and CO2, representing technology advances that make century-old concepts feasible.21,22 The advantage of more portable metabolic monitoring systems is access to more detailed aspects of energy metabolism, such as macronutrient nutrition. Knowledge about shifts between lipid and carbohydrate metabolism provides insights into physiological posture in extreme conditions, as well as serving more routine uses such as tracking progress in weight management.23 New technologies to measure water turnover/flux will provide future sensing of water balance related to performance for more precise and optimal hydration.24,25 Advances in metabolic monitoring are opening the door to assessment of key circulating components of energy metabolism, such as glucose and lactate, associated with the fatigue limits affecting physical, cognitive, and behavioral performance. Minimally invasive continuous glucose monitoring systems have greatly improved individual management for diabetics; however, these systems still require some form of analyte sampling through the skin.26 Improved transcutaneous spectroscopic methods, sweat sampling systems, and even breath analyses currently under investigation may soon reduce metabolic assessments to noninvasive wear-and-forget systems.
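The century-old concept referred to here is indirect calorimetry. As a minimal sketch (using the standard abbreviated Weir equation, a common formula rather than one specified by this paper), expired-gas measurements translate into energy expenditure and substrate mix as follows:

```python
def weir_energy_expenditure(vo2, vco2):
    """Energy expenditure (kcal/min) from the abbreviated Weir equation,
    given oxygen uptake and carbon dioxide production in L/min."""
    return 3.941 * vo2 + 1.106 * vco2

def respiratory_quotient(vo2, vco2):
    """VCO2/VO2: ~0.7 indicates mostly fat oxidation, ~1.0 mostly carbohydrate."""
    return vco2 / vo2

# Example resting values: VO2 = 0.30 L/min, VCO2 = 0.24 L/min
ee = weir_energy_expenditure(0.30, 0.24)   # about 1.45 kcal/min
rq = respiratory_quotient(0.30, 0.24)      # 0.8, a mixed-fuel state
```

The respiratory quotient is what reveals the lipid-versus-carbohydrate shifts described above, which is why expired CO2 measurement adds value beyond O2 alone.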

3. A wearable monitoring application success: thermal-work strain

In clinical medicine, hyperthermia is confirmed by temperature measurement in the context of an individual exhibiting neurological symptoms. But simply measuring core temperature to detect hyperthermia without this context belies the physiological adaptation to persistence hunting. Humans are uniquely suited for endurance running at high temperature, and core temperature sustained at over 40 °C for several hours in marathon runners is

associated with peak performance rather than indicating a medical emergency (of course, soldiers have additional thermal and workload burdens, including personal protective equipment).27 Thus, core temperature monitoring with the ingestible temperature pill has provided a useful field research tool, but simple core temperature monitoring or prediction does not solve the problem of detecting individuals approaching their limits of work in the heat. Safe work limits have been addressed with thermal-work strain (TWS) indices, using inputs such as core temperature combined with heart rate,28 and several other developments have made this a practical solution. One was the development of a core temperature predictive algorithm based on analysis of time-series heart rates, replacing the need to swallow temperature pills.29 Another was the development of a sufficiently accurate and reliable physiological measurement system. The U.S. Army invested heavily in the testing, integration, and validation of monitoring technologies, which resulted in a commercially available chest-worn system.4 Recent efforts have further reduced RT-PSM size, weight and power requirements, improved comfort, and provided tactically acceptable communications. A third aspect of improving practical but reliable individual estimates of TWS limits has been the development of an adaptive TWS index.3 The Army National Guard Bureau is an early adopter of this enhanced capability, using RT-PSM technology to closely monitor the limits of individuals performing critical tasks while encapsulated in protective suits in specialized operations by their Weapons of Mass Destruction Civil Support Teams (WMD-CST).4 Personalized monitoring in hot training environments is also being investigated, where it could improve training effectiveness by permitting higher training workloads than might be predicted by group-based predictive heat strain models.
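As a concrete sketch of a TWS index of this kind, the formulation below is Moran's Physiological Strain Index, which combines core temperature and heart rate on a 0–10 scale; the adaptive index developed for military use differs in detail, so this is an illustration rather than the paper's algorithm:

```python
def physiological_strain_index(core_temp, heart_rate, tc0=36.8, hr0=70.0):
    """Moran's Physiological Strain Index: 0 (no strain) to 10 (very high),
    from core temperature (deg C) and heart rate (bpm) relative to resting
    baselines tc0 and hr0."""
    thermal = 5.0 * (core_temp - tc0) / (39.5 - tc0)
    cardiovascular = 5.0 * (heart_rate - hr0) / (180.0 - hr0)
    # Clamp to the defined 0-10 range
    return max(0.0, min(10.0, thermal + cardiovascular))

# A working soldier at 38.5 deg C core temperature and 150 bpm
psi = physiological_strain_index(38.5, 150.0)   # moderate-to-high strain
```

Because the index weights heart rate alongside core temperature, it distinguishes a hot but well-tolerating runner from an individual whose cardiovascular strain signals approaching work limits.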
Simple core temperature also does not provide actionable information regarding hypothermia. Large drops in core temperatures have been observed without medical consequences in


Ranger students and Marine Infantry school officers during high risk field training, while hypothermia deaths have previously occurred unexpectedly in moderate temperature conditions. Productive monitoring approaches will likely center on indicators of thermoregulatory failure, such as cessation of intense shivering thermogenesis.30 This area of research is still immature, although the effects of cold on hands and feet are being intensively investigated with skin, boot, and glove temperature sensing to prevent peripheral freezing cold injury and to sustain critical performance capabilities, especially involving the hands.31

4. Early models of alertness and fitness for duty assessment

Situational awareness is a critical component of soldier readiness, and this is an important target for alertness monitoring. Sustained alertness applies to the point man on patrol, sentries, or even military technicians monitoring computer displays. Even well rested individuals concentrating on threat detection and friend–foe identification begin to increase errors after two hours of sustained effort. Vehicle drivers face alertness challenges in tactical settings, especially driving in low light conditions at night. Soldier effectiveness is reduced during night operations, when circadian performance rhythms are lowest and sleep drive is highest, and attentional lapses and microsleeps increase markedly with inadequate sleep. Early concepts for fatigue and acute alertness monitoring were as simple as a mercury switch on the back of the helmet to detect head bobbing. A more sensitive measure of attentional lapse uses infrared reflectance to detect slow eyelid closure (PERCLOS), tracking retinas from a dashboard-mounted system. This is more reliable if built into a helmet or glasses that move with the head to maintain alignment with the eyes. Eye movements (“oculometrics”) such as eye blinks, eye saccades, and pupillometry have long attracted fatigue researchers as performance assessment predictors but are still only promising possibilities.32,33 Despite a large Army investment in oculometrics evaluation systems intended for assessment of fitness for duty, the technology has not matured. More complex EEG monitoring of alertness has been demonstrated by computational neurophysiologists34 but has always been hampered by limitations on computing power and speed as well as the intrusiveness of the EEG scalp electrodes. When power is not a limiter, such as in a vehicle or aircraft, EEG monitoring of soldier fatigue is feasible and has been used in performance research, but a full array of scalp electrodes is overly intrusive for routine use.35 Simpler single-channel EEG systems have been developed that might eventually be positioned in a cap or in a helmet headband.36 The French military is now using a system developed by military researchers to optimize rest and flight schedules and to modify tactics, techniques and procedures for more effective performance.37 This represents an important near term military PSM application. Near infrared spectroscopy forehead measurements, perhaps in conjunction with single-channel EEG, may add to alertness predictions.38

Measurement of sleep history has been used to predict alertness status, especially when combined with the variation in alertness expected through the circadian cycle in a two-process sleep and performance model.39 General conclusions can be drawn about decreased performance and attention at various levels of sleep restriction or hours of sleep deprivation, but many moderating factors, from the genetics of sleep resilience to the effects of naps and caffeine, confound reliable performance predictions.40 Sleep history is obtained from total sleep time estimated by triaxial accelerometry. The commonly used algorithms predict sleep duration with >90% reliability but are poor at correctly classifying the wake state (∼60%).41 Nevertheless, sleep monitoring appears to help improve sleep behaviors in soldiers, providing benefit to health readiness initiatives.42 Reliable measurement of sleep quality (e.g., sleep stages) outside of a laboratory will be important to the development and evaluation of attempts to compress restorative sleep in the field for soldiers; the feasibility of correctly measuring deep sleep and REM sleep stages with a wrist-worn system has been demonstrated.43 More studies are also needed on the translation between the psychomotor vigilance task, the performance outcome measure often used in sleep laboratories, and militarily-relevant performance outcomes.44
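To make the two-process idea concrete, here is a minimal numerical sketch of how homeostatic sleep pressure and circadian phase combine into an alertness estimate; the functional forms follow the classic two-process framework, but the parameter values are illustrative assumptions, not values from this paper:

```python
import math

def homeostatic_pressure(hours_awake, s0=0.1, tau=18.2):
    """Process S: sleep pressure builds exponentially toward 1.0 while awake.
    s0 is the pressure on waking; tau is the build-up time constant (hours)."""
    return 1.0 - (1.0 - s0) * math.exp(-hours_awake / tau)

def circadian_drive(clock_hour, peak_hour=18.0, amplitude=0.12):
    """Process C: sinusoidal alerting signal peaking in the early evening."""
    return amplitude * math.cos(2.0 * math.pi * (clock_hour - peak_hour) / 24.0)

def predicted_alertness(hours_awake, clock_hour):
    """Higher is more alert. Night operations combine high sleep pressure
    with the circadian trough, yielding the lowest scores."""
    return circadian_drive(clock_hour) - homeostatic_pressure(hours_awake)

day = predicted_alertness(10.0, 17.0)    # awake 10 h, late afternoon
night = predicted_alertness(21.0, 4.0)   # awake 21 h, pre-dawn trough
```

The moderating factors noted above (sleep-resilience genetics, naps, caffeine) are exactly what such a simple model omits, which is why individual RT-PSM inputs are needed for reliable predictions.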

5. Neurophysiological assessments of performance readiness

Military leaders would like to have some assurance that individuals are competent to make critical decisions and to know if someone is about to fail due to overwhelming psychological stress. Early research efforts during the Korean War investigated predictive stress markers, and neuroendocrinologists further clarified stress mechanisms in combat studies during the Vietnam War.45 The markers that are useful in measuring acute stress, including appropriate stress responses to a novel threat, do not appear to be useful predictors of imminent failure. Measurable physiological responses such as changes in skin conductance, heart rate and heart rate variability, and components of voice are consistent markers of acutely stressful events such as parachute training and confidence courses.45 Voice stress analysis detects an emotionally stressful event, and this response diminishes with increasing confidence in subsequent trials, while elevated heart rate is a persistent feature of appropriate stress activation in preparation for a dangerous task.45 Similarly, landing on a pitching aircraft carrier deck in a storm at night provokes appropriate and measurable physiological responses that are interesting in characterizing an emotionally significant event and stress activation but do not provide actionable information.45 New approaches from the growing field of affective computing, based on other aspects of voice and behavior, show predictive value for important outcomes such as depression and cognitive impairment. A DARPA initiative, Detection and Computational Analysis of Psychological Signals (DCAPS), was organized to target “honest signals” in human behavior that signal psychological status.
This led to advances in voice stress analysis, assessments of social interactions through modern media, and the development of physiologically aware virtual agents (PAVA) such as the Institute for Creative Technologies' “SimSensei,” incorporating mental status monitoring technologies such as eye gaze, body posture, voice analysis, and even speech content analysis.46,47 Components of speech have been further dissected to provide a differential assessment of the brain domains involved.48 Combined with facial action unit activation, voice analysis can successfully distinguish depressed patients from nondepressed ones.49 Patterns of movement have been used to detect changes in cognitive status. Meandering patterns of movement, reflected in a high fractal “D” score, identify cognitive impairment and separate demented from normal veterans.50 This may also identify individuals with persistent symptoms following traumatic brain injury.51 Combined with information from embedded neuropsychological testing, these technologies will provide additional insights into neurocognitive status.52 In combination with the thermal-work strain index, this neurocognitive assessment based on changes in movement patterns might provide reliable prediction of both impending heat stroke and hypothermia. These neurophysiological efforts significantly leverage current research on wearable systems to improve the quality of life for patients with chronic diseases such as Parkinson's disease.53


Future monitoring will include other modes of sensing such as volatile odor production. We know that humans produce unique stress signals detectable by dogs, based on the emerging empirical value of diabetic alert dogs to Type 1 diabetics and psychiatric dogs to veterans.54,55 At some point those distinguishing patterns of movement, physiological response, or secretion of volatile organic compounds may be useful in machine detection of what dogs already can detect. Human odors are a potentially rich source of information with specific volatile organic compounds produced during infection and possibly following head injury, and new olfactory receptor-lined nanotube sensing technologies make detection feasible.56,57 Moderating soldier “stress load” will be an important application of a neurophysiological monitoring capability. This application is an essential component of future man-machine interfacing. This was recognized in another DARPA research initiative on “Augmented Cognition,” where redistribution of mental workload within teams, the amount and form of information displays, and various types of performance augmentation were all based on an assumption of a real time neurocognitive status monitoring capability.58 Another application is to detect and help reduce maladaptive psychological stress responses in soldiers. Continued physiological activation in the hours following an intense psychologically traumatic exposure has been postulated to contribute to later trauma disorders, and psychological first aid involving pharmacological interventions has been proposed. An effective self-management calming alternative has been demonstrated using biofeedback from cardiovascular measures combined with gaming technology on a smart phone.59


6. Monitoring to overcome mission-specific physiological limitations

Soldier performance may be enhanced by real time behavioral guidance based on RT-PSM (“technological doping”). One of the thematic sessions at the 4th International Congress on Soldiers’ Physical Performance (ICSPP) provided examples of workload pacing and accelerated acclimation that might be accomplished using PSM tools.60 Many more such applications can be envisioned. For example, voluntary control of metabolic rate and heat production in certain extreme conditions could provide soldiers with a performance and survival advantage. PSM-informed biofeedback might provide soldiers with a capability that currently takes years of disciplined training: Buddhist monks can increase and decrease basal metabolism by 60% in either direction and increase finger and toe temperatures.61,62 An increase or decrease of core body temperature by only 1 °C is associated with a ∼15% change in metabolic rate. RT-PSM opens the door to many other soldier performance applications, including needs for specialized missions and environments. This calls for a plug-and-play design of the soldier system so that it can be configured for specific mission needs, ranging from respiratory monitoring in subterranean and dense urban environments to thermal threats from directed energy systems. Prediction of hypoxic impairment of performance during ascent to altitude is also tactically relevant, as many national borders and potential sites of military conflict lie in high terrestrial environments. Performance predictions based on RT-PSM monitoring of cardiovascular parameters such as blood oxygen saturation can guide staging to altitude and identify individual performance risk. Military pilots are also at potential risk for “physiological events,” unexplained episodes of hypoxia in the cockpit, that may compromise health and critical performance. Many closed environments, from submarines and diving systems to vehicles sealed to protect against environmental threats to expeditionary suits for astronauts, would benefit from RT-PSM metabolic sensing.

7. Practical considerations

Military leaders have well-founded concerns about adding more information to an already overcrowded attentional space. Earlier concepts of a “soldier dashboard” were derived from clinical practice, where a patient’s vital signs are displayed with an impressive range of data. The information needs of an intensive care unit specialist caring for a patient are not the same as the information needs of a small unit leader orchestrating the performance of a team of soldiers. Equating soldiers to vehicles is even less appropriate. Humans are not cars, and a “dashboard” displaying human analogs of temperature, fuel, and maintenance requirements is a simplistic view of useful monitoring for humans, who can sense and communicate about their own systems. Leaders may actually want higher level computed information: gaps in security due to real time attentional lapses of sentries; which soldier is about to fail under cognitive or physical burdens that could be redistributed; or which soldier should not be making critical decisions because their head is not in the game (e.g., due to stress, depression, or recent head impact). RT-PSM offers leaders new alerts about soldier readiness status that they may not otherwise readily detect, but these alerts must roll up into simple stoplight displays (i.e., red/yellow/green) that can be further queried only if more information is required. Many applications of RT-PSM may be reserved for training environments to better prepare for the operational environment, providing a technology assist to teach individual and team performance limits and ensure safer training.

8. Medical applications

There are logical extensions of the current RT-PSM efforts that serve the needs of medical specialists. This information is derived from many of the same sensors already used for performance predictions, and the same platform and data management architectures would support these additional capabilities (Table 1). However, these medical capabilities generally must follow behind the performance capabilities. The adoption of wearable systems only to support remote medical triage, especially for live-dead detection, has not garnered widespread interest from soldiers, even though these were some of the first capabilities developed by direction of the Army medical department.63 Soldiers might be more agreeable to wearing a performance-based RT-PSM that could also detect a casualty event, support a “911” alert, and support medical management (Fig. 2). Casualty detection, triage, and early clinical management would likely involve many of the same physiological measures and primarily require a set of “casualty algorithms” that detect and track hemorrhage and other critical problems. Hemorrhage is the major preventable cause of death on the battlefield, and systems to detect hemorrhage and predict hemorrhage severity would be of immediate importance in conserving the fighting strength.65 This might be possible with algorithms that are currently in development, such as the Compensatory Reserve Index (CRI), which quantifies a failing hemodynamic response due to intravascular volume depletion based on photoplethysmography (PPG) measurements.66 At present, the medic brings a sophisticated suite of medical monitoring tools to the casualty but lacks the initial wounding detection capability.67 Optimization of individual health and fitness readiness habits is already supported with commercially available fitness systems that serve as behavioral prompts as much as providing meaningful



Fig. 2. Concepts for real time physiological status monitoring (RT-PSM) include a common sensor and communications architecture for a system that supports soldier readiness status and performance, and will also be able to support medical needs. Source: Friedl.64

data about total sleep time or daily activity in the U.S. Army Triad Initiative.42 RT-PSM, combined with outward looking environmental sensors, may also become important for individual exposure monitoring and dosimetry, tying deployment and occupational exposures to longer term health risks (this may not require real time information). At present, this involves extraordinarily complex issues of what to measure in the environment and in the individual (e.g., acute responses, exposomic markers, etc.) and how to relate these measurements to actual health outcomes. Recent efforts with wearable blast sensors and mild traumatic brain injury illustrate the practical challenges of linking measurable exposures to health risks. Current work by programs exploring critical air quality triggers and related respiratory distress signals in asthmatics may provide a new model approach to environmental exposure and RT-PSM.68 Other RT-PSM efforts will also provide rapid warning of immediate risks from chemical and biological threat agents, especially from inhalation threats. These other applications are beyond the scope of this review but would likely build on the common PSM strategies and architecture.

9. Conclusions

Physiological sensors will have useful military applications if they are soldier-acceptable and provide important actionable information. This paper has focused on operational medicine priorities, considering the components of a “soldier readiness score” comprising thermal-work limits, alertness and fitness for duty status, musculoskeletal fatigue limits, neuropsychological status, and mission-specific physiological status (e.g., hypoxia, pulmonary threats, freezing cold). One of the most promising and still least developed parts of performance monitoring is the use of measurements

of human physiological and behavioral signals to detect neurophysiological status, particularly to predict individuals reaching stress limits and facing impending performance degradation.

Acknowledgements

This paper is dedicated to Frederick W. Hegge, PhD (1935–2000), an Army neuropsychologist who led the Office of Military Performance Assessment Technologies (OMPAT) in the 1980s and who enabled and confidently predicted the technological developments for soldier physiological monitoring that exist today. Most of the work presented in this paper is based on more than two decades of USARIEM research led by Reed W. Hoyt. Concepts presented in this paper have also been informed by discussions in meetings of the NATO Human Factors in Medicine (HFM) Research Technology Group (RTG) 260, “Enhancing Warfighter Effectiveness with Wearable Bio-Sensors and Physiological Models.” The many contributions to PSM development by Jeffrey Palmer and his bioengineering research team at MIT Lincoln Laboratory are also gratefully acknowledged.

References

[1]. Churchill WS. The World Crisis: 1916–1918, 2. New York, Charles Scribner’s Sons, 1927. p. 564.
[2]. Potter AW, Blanchard LA, Friedl KE et al. Mathematical prediction of core body temperature from environment, activity, and clothing: the heat strain decision aid (HSDA). J Therm Biol 2017; 64:78–85.
[3]. Buller MJ, Welles AP, Friedl KE. Wearable physiological monitoring for human thermal-work strain optimization. J Appl Physiol 2018; 124(2):432–441.
[4]. Tharion WJ, Friedl KE, Buller MJ et al. Evolution of physiological status monitoring for ambulatory military applications, in The Cognitive and Behavioral Neuroscience of Human Performance in Extreme Settings, Matthews Michael D, Schnyer David, editors, New York, NY, Oxford University Press, 2019, (in press).

[5]. Drain JR, Aisbett B, Lewis M, Billing DC. The Pandolf equation under-predicts the metabolic rate of contemporary military load carriage. J Sci Med Sport 2017; 20:S104–S108.
[6]. Ludlow LW, Weyand PG. Walking economy is predictably determined by speed, grade and gravitational load. J Appl Physiol 2017; 123(5):1288–1302.
[7]. Freedson PS, Miller K. Objective monitoring of physical activity using motion sensors and heart rate. Res Quart Exerc Sport 2000; 71(Suppl. 2):21–29.
[8]. Spurr GB, Prentice AM, Murgatroyd PR et al. Energy expenditure from minute-by-minute heart-rate recording: comparison with indirect calorimetry. Amer J Clin Nutr 1988; 48(3):552–559.
[9]. Plasqui G, Joosen AM, Kester AD et al. Measuring free-living energy expenditure and physical activity with triaxial accelerometry. Obesity 2005; 13(8):1363–1369.
[10]. Zong C, Jafari R. Robust heart rate estimation using wrist-based PPG signals in the presence of intense physical activities. In: Engineering in Medicine and Biology Society (EMBC), 2015 37th Annual International Conference of the IEEE, 2015, p. 8078–8082.
[11]. Chen KY, Acra SA, Majchrzak K et al. Predicting energy expenditure of physical activity using hip- and wrist-worn accelerometers. Diab Technol Therapeutics 2003; 5(6):1023–1033.
[12]. Hoyt RW, Buller MJ, Santee WR et al. Total energy expenditure estimated using foot–ground contact pedometry. Diab Technol Therapeutics 2004; 6(1):71–81.
[13]. Hoyt RW, Knapik JJ, Lanza JF et al. Ambulatory foot contact monitor to estimate metabolic cost of human locomotion. J Appl Physiol 1994; 76(4):1818–1822.
[14]. Weyand PG, Kelly M, Blackadar T et al. Ambulatory estimates of maximal aerobic power from foot-ground contact times and heart rates in running humans. J Appl Physiol 2001; 91(1):451–458.
[15]. Strozzi N, Parisi F, Ferrari G. A multifloor hybrid inertial/barometric navigation system, 2016 International Conference on Indoor Positioning and Indoor Navigation (IPIN), 2016, p. 1–5.
[16]. Hauret KG, Jones BH, Bullock SH et al. Musculoskeletal injuries: description of an under-recognized injury problem among military personnel. Amer J Prev Med 2010; 38(1):S61–S70.
[17]. Eskofier B, Hoenig F, Kuehner P. Classification of perceived running fatigue in digital sports, 2008 International Conference on Pattern Recognition, 2008, p. 1–4.
[18]. Lacirignola J, Weston C, Byrd K et al. Instrumented footwear inserts: a new tool for measuring forces and biomechanical state changes during dynamic movements, 2017 IEEE 14th International Conference on Wearable and Implantable Body Sensor Networks (BSN), 2017, p. 119–124.
[19]. Valk PJ, Veenstra BJ. Military Performance and Health Monitoring in Extreme Environments. Technical Report, TNO Defence Security and Safety, Soesterberg, Netherlands, 2009.
[20]. Clark KP, Ryan LJ, Weyand PG. A general relationship links gait mechanics and running ground reaction forces. J Exper Biol 2017; 220(2):247–258.
[21]. Candell LM, Ferraiolo C, Shaw GA, et al., inventors. Systems, apparatus, and methods related to modeling, monitoring, and/or managing metabolism. United States patent application US 15/221,313. 2016 Jul 27.
[22]. Committee on Metabolic Monitoring for Military Field Applications, Institute of Medicine. Monitoring Metabolic Status: Predicting Decrements in Physiological and Cognitive Performance, Washington, D.C., National Academies Press, 2004.
[23]. Gribok A, Leger JL, Stevens M et al. Measuring the short-term substrate utilization response to high-carbohydrate and high-fat meals in the whole-body indirect calorimeter. Physiol Rep 2016; 4(12):e12835.
[24]. Nolte H, Noakes TD, Van Vuuren B. Ad libitum fluid replacement in military personnel during a 4-h route march. Med Sci Sports Exerc 2010; 42(9):1675–1680.
[25]. First International Workshop on Hydration Monitoring Technologies. 2017 IEEE 14th International Conference on Wearable and Implantable Body Sensor Networks (BSN), Eindhoven, Netherlands, May 12, 2017.
[26]. Klonoff DC. Continuous glucose monitoring. Diab Care 2005; 28(5):1231–1239.
[27]. Maron MB, Wagner JA, Horvath SM. Thermoregulatory responses during competitive marathon running. J Appl Physiol 1977; 42(6):909–914.
[28]. Moran DS, Shitzer A, Pandolf KB. A physiological strain index to evaluate heat stress. Amer J Physiol 1998; 275(1):R129–R134.
[29]. Buller MJ, Tharion WJ, Cheuvront SN et al. Estimation of human core temperature from sequential heart rate observations. Physiol Meas 2013; 34(7):781.
[30]. Young AJ, Castellani JW, O’Brien C et al. Exertional fatigue, sleep loss, and negative energy balance increase susceptibility to hypothermia. J Appl Physiol 1998; 85(4):1210–1217.
[31]. Xu X, Santee WR, Gonzalez RR, et al. Prediction of hand manual performance during cold exposure. SAE Technical Paper 2004-01-2348, 2004 (available at: https://doi.org/10.4271/2004-01-2348, last accessed 4 March 2018).
[32]. Brozek J. Quantitative criteria of oculomotor performance and fatigue. J Appl Physiol 1949; 2(5):247–260.
[33]. Van Orden KF, Jung TP, Makeig S. Combined eye activity measures accurately estimate changes in sustained visual task performance. Biol Psychol 2000; 52(3):221–240.
[34]. Jung TP, Makeig S, Stensmo M et al. Estimating alertness from the EEG power spectrum. IEEE Trans Biomed Engineering 1997; 44(1):60–69.
[35]. Caldwell JA, Hall KK, Erickson BS. EEG data collected from helicopter pilots in flight are sufficiently sensitive to detect increased fatigue from sleep deprivation. Int J Aviat Psychol 2002; 12(1):19–32.
[36]. Sauvet F, Bougard C, Coroenne M et al. In-flight automatic detection of vigilance states using a single EEG channel. IEEE Trans Biomed Engineering 2014; 61(12):2840–2847.
[37]. Chennaoui M, Van Beers P, Caid S et al. Microsleep and alertness monitoring in French Air Force long haul pilots. J Sci Med Sport 2017; 20:S134.


[38]. Strangman GE, Ivkovic V, Zhang Q. Wearable brain imaging with multi-modal physiological recording. J Appl Physiol 2018; 124(3):564–572.
[39]. Redmond DP, Hegge FW. Observations on the design and specification of a wrist-worn human activity monitoring system. Behav Res Methods 1985; 17(6):659–669.
[40]. Van Dongen H. Comparison of mathematical model predictions to experimental data of fatigue and performance. Aviat Space Environ Med 2004; 75(3):A15–A36.
[41]. Rupp TL, Balkin TJ. Comparison of Motionlogger Watch and Actiwatch actigraphs to polysomnography for sleep/wake estimation in healthy young adults. Behav Res Methods 2011; 43(4):1152–1160.
[42]. Adler AB, Gunia BC, Bliese PD et al. Using actigraphy feedback to improve sleep in soldiers: an exploratory trial. Sleep Health: J Nat Sleep Found 2017; 3(2):126–131.
[43]. Mantua J, Gravel N, Spencer R. Reliability of sleep measures from four personal health monitoring devices compared to research-based actigraphy and polysomnography. Sensors 2016; 16(5):646.
[44]. Lieberman HR, Falco CM, Slade SS. Carbohydrate administration during a day of sustained aerobic activity improves vigilance, as assessed by a novel ambulatory monitoring device, and mood. Amer J Clin Nutr 2002; 76(1):120–127.
[45]. Friedl KE, Grate SJ. Metabolic enhancement of the soldier brain, in The Cognitive and Behavioral Neuroscience of Human Performance in Extreme Settings, Matthews Michael D, Schnyer David, editors, New York, NY, Oxford University Press, 2019, (in press).
[46]. DeVault D, Artstein R, Benn G et al. SimSensei Kiosk: a virtual human interviewer for healthcare decision support, Proceedings of the 2014 International Conference on Autonomous Agents and Multi-Agent Systems, 2014, p. 1061–1068.
[47]. Scherer S, Stratou G, Gratch J, Morency LP. Investigating voice quality as a speaker-independent indicator of depression and PTSD. In: Interspeech 2013, p. 847–851.
[48]. Cummins N, Scherer S, Krajewski J et al. A review of depression and suicide risk assessment using speech analysis. Speech Comm 2015; 71:10–49.
[49]. Williamson JR, Quatieri TF, Helfer BS et al. Vocal biomarkers of depression based on motor incoordination, Proc 3rd ACM Int Workshop on Audio/Visual Emotion Challenge, 2013, p. 41–48.
[50]. Kearns WD, Fozard JL, Nams VO. Movement path tortuosity in free ambulation: relationships to age and brain disease. IEEE J Biomed Health Inform 2017; 21(2):539–548.
[51]. Kearns WD, Scott S, Fozard JL et al. Decreased movement path tortuosity is associated with improved functional status in patients with traumatic brain injury. J Head Trauma Rehab 2016; 31(1):E13–E19.
[52]. Friedl KE, Grate SJ, Proctor SP et al. Army research needs for automated neuropsychological tests: monitoring soldier health and performance status. Arch Clin Neuropsychol 2007; 22:7–14.
[53]. Stamford JA, Schmidt PN, Friedl KE. What engineering technology could do for quality of life in Parkinson’s disease: a review of current needs and opportunities. IEEE J Biomed Health Inform 2015; 19(6):1862–1872.
[54]. Rooney NJ, Morant S, Guest C. Investigation into the value of trained glycaemia alert dogs to clients with type I diabetes. PLoS One 2013; 8(8):e69921.
[55]. Pleil J, Giese R. Integrating exhaled breath diagnostics by disease-sniffing dogs with instrumental laboratory analysis. J Breath Res 2017; 11(3):032001.
[56]. Goldsmith BR, Mitala Jr JJ, Josue J et al. Biomimetic chemical sensors using nanoelectronic readout of olfactory receptor proteins. ACS Nano 2011; 5(7):5408–5416.
[57]. Sethi S, Nanda R, Chakraborty T. Clinical application of volatile organic compound analysis for detecting infectious diseases. Clin Microbiol Rev 2013; 26(3):462–475.
[58]. Stanney KM, Schmorrow DD, Johnston M et al. Augmented cognition: an overview. Rev Hum Factors Ergonomics 2009; 5(1):195–224.
[59]. Russoniello CV, O’Brien K, Parks JM. The effectiveness of casual video games in improving mood and decreasing stress. J Cyber Ther Rehab 2009; 2(1):53–66.
[60]. Friedl K, Buller M. Session 14: non-pharmacological military performance enhancement technologies. J Sci Med Sport 2017; 20:S93–S95.
[61]. Benson H, Malhotra MS, Goldman RF et al. Three case reports of the metabolic and electroencephalographic changes during advanced Buddhist meditation techniques. Behav Med 1990; 16(2):90–95.
[62]. Benson H, Lehmann JW, Malhotra MS et al. Body temperature changes during the practice of g Tum-mo yoga. Nature 1982; 295:21–22.
[63]. Savell CT, Borsotto M, Reifman J, Hoyt RW. Life sign decision support algorithms. In: Medinfo 2004; 145:1453–1457.
[64]. Friedl KE. Is it possible to monitor the warfighter for prediction of performance deterioration? pp. 7.1–7.10, In: Workshop on Operational Fatigue, Technical Report RTO-HFM/WS-151, Research and Technology Organisation, North Atlantic Treaty Organization, Neuilly-sur-Seine Cedex, France, 2008.
[65]. Eastridge BJ, Mabry RL, Seguin P et al. Death on the battlefield (2001–2011): implications for the future of combat casualty care. J Trauma Acute Care Surg 2012; 73(6):S431–S437.
[66]. Johnson M, Alarhayem A, Convertino V et al. Compensatory Reserve Index: performance of a novel monitoring technology to identify the bleeding trauma patient. Shock 2018; 49(3):295–300.
[67]. Moulton SL, Mulligan J, Santoro MA et al. Validation of a noninvasive monitor to continuously trend individual responses to hypovolemia. J Trauma Acute Care Surg 2017; 83(1):S104–S111.
[68]. Misra V, Lee B, Manickam P et al. Ultra-low power sensing platform for personal health and personal environmental monitoring, 2015 IEEE International Electron Devices Meeting (IEDM), 2015, p. 13-1.

Journal of Science and Medicine in Sport 21 (2018) 1154–1161



Original research

Consensus paper on testing and evaluation of military exoskeletons for the dismounted combatant

Kurt L. Mudie a,∗, Angela C. Boynton b, Thomas Karakolis c, Meghan P. O’Donovan d, Gregory B. Kanagaki d, Harrison P. Crowell b, Rezaul K. Begg a, Michael E. LaFiandra b, Daniel C. Billing e

a Institute for Health and Sport, Victoria University, Australia
b US Army Research Laboratory, Aberdeen Proving Ground, USA
c Human Systems Integration Section, Defence Research and Development Canada, Canada
d US Army Natick Soldier Research, Development and Engineering Center, USA
e Land Division, Defence Science and Technology Group, Australia

Article info

Article history: Received 3 September 2017; Received in revised form 4 May 2018; Accepted 13 May 2018; Available online 18 May 2018.

Keywords: Exoskeleton; Military; Load carriage; Methodology

Abstract

Enhancing the capabilities of the dismounted combatant has been an enduring goal of international military research communities. Emerging developments in exoskeleton technology offer the potential to augment the dismounted combatant’s capabilities. However, the ability to determine the value proposition of an exoskeleton in a military context is difficult due to the variety of methods and metrics used to evaluate previous devices. The aim of this paper was to present a standard framework for the evaluation and assessment of exoskeletons for use in the military. A structured and systematic methodology was developed from the end-user perspective; it progresses from controlled laboratory conditions (Stage A), to simulated movements specific to the dismounted combatant (Stage B), and real-world military specific tasks (Stage C). A standard set of objective and subjective metrics is described to ensure a holistic assessment of the human response to wearing the exoskeleton and of the device’s mechanical performance during each stage. A standardised methodology will ensure further advancement of exoskeleton technology and support improved international collaboration across research and industry groups. In doing so, this better enables international military groups to evaluate a system’s potential, with the hope of accelerating the maturity and ultimately the fielding of devices to augment the dismounted close combatant and small team capability. Crown Copyright © 2018 Published by Elsevier Ltd on behalf of Sports Medicine Australia. This is an open access article under the CC BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/).

1. Introduction The dismounted combatant is required to carry loads in excess of 80–100% of their body mass over a variety of difficult and complex terrains.1–3 In response to the added mass there is a decreased time to fatigue4 and increased prevalence of overuse or chronic musculoskeletal injuries.2 The dismounted combatant takes a direct part in the hostilities of an armed conflict, thus reducing the negative impacts of load carriage and enhancing their capability during warfare has been an enduring goal. The North Atlantic Treaty Organization (NATO) has identified and defined five key capability areas for the dismounted close combatant: mobility, lethality, survivability, sustainability and C4I (Command, Control, Communications, Computers and Intelligence). Exoskeletons designed to augment

∗ Corresponding author. E-mail address: [email protected] (K.L. Mudie).

the mobility capabilities during load carriage would allow the dismounted combatant to more easily traverse through any kind of complex terrain and thereby extend their geographic sphere of influence. An exoskeleton, defined here as a body-worn mechanical device that works in parallel with the user,5 has the potential to enhance mobility. Current exoskeletons can be classified according to their various power states (active or passive) and structures (hard or soft). Active exoskeletons6,7 describe devices that require a power source, whereas passive devices8,9 have no external power and typically utilise springs or serve as an external support. Further, the structure of the device may include hard rigid components such as aluminium or carbon fibre,10–12 soft textile or cable components,6,13 or a combination of both. Recent developmental efforts have yielded devices that reduced metabolic cost by up to 15%6–8 or potentially minimised injury risk by offsetting the external load outside the musculoskeletal system10,14,15 during common military tasks such as walking and load carriage. While

https://doi.org/10.1016/j.jsams.2018.05.016
1440-2440/Crown Copyright © 2018 Published by Elsevier Ltd on behalf of Sports Medicine Australia. This is an open access article under the CC BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/).


reducing the metabolic cost of walking and the potential for injury are highly important, equally important may be the ability of exoskeletons to improve both the physical and cognitive capacity of the individual dismounted combatant, as well as the effectiveness of a combat unit as a whole.

Over the past decade, there has been an exponential increase in the number of exoskeletons available on the market.5,16,17 There has also been a concurrent increase in the number of scientific evaluations involving human volunteers performed internationally by academic and government laboratories. For example, evaluations have recently been conducted for the OX,18 Exo Hiker,10,11 Warrior Web,6 B-Temia19 and MIT ankle exoskeleton.7,12 Many publications pertaining to current military-specific exoskeletons have focused on a single activity or task such as steady state walking or running. Further, these assessments were generally completed on an instrumented treadmill6,8,10 or overground,15,19,20 with few devices being tested in multiple evaluations or during military specific exercises. Though common assessment tasks have been employed, a wide range of different metrics and methods has been used to evaluate devices. The importance of developing standard assessment and evaluation metrics for specific exoskeleton applications has recently been highlighted.21 In line with this suggestion, an initial set of basic and applied military specific tasks for assessing lower body exoskeletons has been developed by Carlson et al.22 The authors22 provided a readily accessible methodology that was low cost and intended as a high-level assessment to inform initial designs and concepts.
As such, the methods described did not include details of task parameters or of objective and subjective metrics, as has been recommended by Torricelli et al.21 The absence of traditional laboratory measurements therefore reduces the scientific rigour and reliability of the potential findings from the tasks described by Carlson et al.22 Transparency in the literature regarding design details and the effect of the device on human performance will expedite the rate at which the technology matures, the likelihood of such products being fielded23 and the military impact of this new technology. For this reason, in addition to in-house testing, independent and impartial evaluations offer an unbiased insight into a device’s performance, shortcomings and potential use cases/applications. The development of, and consensus on, a progressive, staged methodology will ensure suitable methods, protocols and metrics are employed as a baseline standard for the assessment of military exoskeletons during in-house testing and independent evaluations. This would make comparisons possible between different devices, control systems, materials, and refined versions of a system. Further, the value proposition of a device can then be adequately determined across various military relevant situations and requirements to help determine a potential use case for the device (Fig. 1). The aim of this paper was to present a standard framework for the evaluation and assessment of exoskeletons for use in the military. A structured and systematic methodology that enables a range of exoskeleton systems to be evaluated through a progressive set of activities that relate to the dismounted close combatant is presented (Fig. 2).
We share this standardised framework in the hope that doing so will help the exoskeleton community objectively evaluate individual systems while producing meaningful results that may be comparable, generalizable, and applicable to the development of future exoskeleton systems.
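Comparability of this kind usually rests on expressing a device metric relative to a no-device control measurement. A minimal sketch, using hypothetical metabolic-cost numbers (the function name and values are illustrative, not drawn from the paper):

```python
def percent_change(device_value, control_value):
    """Percent change of a metric relative to a no-device control.

    Negative values indicate a reduction (e.g., lower metabolic cost
    while wearing an exoskeleton); positive values indicate a penalty.
    """
    if control_value == 0:
        raise ValueError("control value must be non-zero")
    return 100.0 * (device_value - control_value) / control_value

# Hypothetical metabolic cost (W) for one participant and task.
control = 400.0
device_active = 340.0    # device worn and powered
device_inactive = 428.0  # device worn but unpowered (carriage penalty)
print(percent_change(device_active, control))    # -15.0
print(percent_change(device_inactive, control))  # 7.0
```

Reporting both numbers separates the benefit of the device's assistance from the burden of simply wearing it, which is the kind of distinction a standard framework makes comparable across systems.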

2. Assessment and evaluation methodologies Human-in-the-loop evaluations must entail a holistic approach, with standardised assessments ranging from controlled laboratory tests to the real-world military context, using both objective


and subjective measurement tools. Further, testing methodologies should be developed from the end-user perspective and remain agnostic to a particular device. This approach will permit any device to utilise the staged assessment methodology, regardless of the design and functional characteristics. The value proposition of a device can then be adequately determined across various military relevant situations and tasks to help determine its optimal usecase(s). Additionally, utilisation of a standard protocol in assessing these technologies can enable “fair” comparisons between various exoskeleton systems. The presented testing methodology is to be completed as an iterative process (Stages A–C; Fig. 2), structured to progress from simple, controlled laboratory measurements to more complex, real-world military specific tasks and duties.21 The progression of activities is such that negative outcomes found at a particular stage can be provided to the system designers so that the exoskeleton design can be changed before the next iteration of the study. By including a variety of tasks in the assessment, those activities that the device may augment and those for which it imposes a cost can be highlighted and identified. This approach minimises participant risk and provides an opportunity to complete an integrated evaluation process where the device can be further refined by the developers dependent on early findings. The development of military specific testing recommendations should consider key mobility categories required of the dismounted combatant. Specific tasks, duration, and level of physical effort may be highly varied across these categories. For instance, in the case of the dismounted close combatant these activities included, (i) tactical or approach marches (prolonged moderate intensity, i.e. march); (ii) moving tactically but not in an engagement (prolonged moderate intensity, i.e. 
advance, patrol, urban clearance, obstacle negotiation); (iii) moving tactically while engaged (intermittent high intensity, i.e. fire and movement, obstacle negotiation); (iv) manual material handling (prolonged moderate intensity, i.e. movement of material within a base, stretcher carry); and (v) ingress/egress (low intensity, i.e. fixed/rotary wing, ground transport, maritime). It is critical that a variety of tasks be included in the assessment protocol to encompass the full range of movement patterns, muscle groups and energy systems associated with dismounted operations.

For a specific task, the completed conditions should include a control (no device), exoskeleton ON (device worn and active) and exoskeleton OFF (device worn and inactive). The exoskeleton ON condition characterises the performance of the device (relative to the control), and the exoskeleton OFF condition sets the baseline for the burden or penalty of wearing the device (relative to the control). Test conditions should be randomised or counterbalanced across participants to minimise any order effect.

Participants should have sufficient military experience relevant to the assessable outcomes, have no current or recent musculoskeletal injuries and be cleared for full duties. Combat-relevant clothing (i.e. helmet, boots, armour and weapon) should be worn, and the type and mass of the backpack should be consistent throughout all conditions and equivalent to relevant combat loads (i.e. standard patrol order (∼20 kg) or marching order (∼35 kg)). It should be noted, however, that as technology progresses some exoskeletons may change the dismounted combatant's standard combat ensemble; such configurations should be considered when making comparisons. Further, it is imperative that the device is correctly fitted, comfortable for the user, and that software algorithms are adjusted to suit the task and the user's characteristics (e.g., anthropometry, gait pattern, etc.).
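The counterbalancing of test conditions described above can be scheduled programmatically. The sketch below is an illustrative Python helper, not part of the published protocol (the function name and condition labels are hypothetical); it cycles through every ordering of the conditions so that each ordering is used equally often when participant numbers are a multiple of six:

```python
from itertools import permutations

def counterbalanced_orders(conditions, n_participants):
    """Assign each participant one ordering of the test conditions,
    cycling through every permutation so that, when n_participants is
    a multiple of the number of orderings, each ordering is used equally
    often and each condition appears equally often in each position."""
    orders = list(permutations(conditions))
    return [orders[i % len(orders)] for i in range(n_participants)]

# Example: control, exoskeleton ON and exoskeleton OFF for 12 participants.
schedule = counterbalanced_orders(["control", "exo_on", "exo_off"], 12)
```

With three conditions there are six orderings; over 12 participants each condition leads the session four times, which removes any systematic order effect at the group level.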
Familiarisation sessions must be completed prior to testing to ensure the user has sufficiently adapted to wearing the device. Previous research has demonstrated that 1–2 familiarisation sessions of approximately 15–24 min each are sufficient to adapt to a powered or passive


K.L. Mudie et al. / Journal of Science and Medicine in Sport 21 (2018) 1154–1161

[Figure 1: a matrix relating workload/work-rate (current work load/rate; reduced work load/rate; substantially reduced workload/work-rate) to capability (current capabilities; improved current capabilities; improved current capabilities and additional capabilities provided), with cells labelled Baseline, Core competency, Advanced and Transformational.]

Fig. 1. A presented value proposition framework for military exoskeleton systems for a given task or set of tasks. Baseline is defined as a system that is able to overcome the burden imposed on the user by simply wearing the device and does not impose any new penalties on the user. Core competency defines a system that is optimising existing capabilities. Advanced highlights a system that expands from existing capabilities. Transformational implies a device that allows a substantial improvement in current performance and enables the completion of new capabilities that were not previously possible.

exoskeleton.6,8,24 The total number and duration of familiarisation sessions required may vary with device type (active or passive), design (whole body, upper body, lower limb with multiple joints or a single joint) and control scheme (EMG, kinematic, force-based, or none). As a guide, identifying a plateau or diminishing returns in a key metric of interest (e.g., oxygen consumption or EMG) within and between consecutive sessions serves as an indicator of a participant's familiarisation status.24,25 A sample size of 6–10 participants is common in previous research.25–27

Participant factors relevant to the device to be tested, including demographics, strength and fitness, and unencumbered anthropometry, should be characterised. A minimum set of recommended metrics includes:

• Demographic information — age, sex, handedness (shooting, writing and/or eye dominance), rank, role(s), trade, length of service and/or operational experience.
• Strength and fitness — maximal aerobic capacity via an acceptable test such as a VO2max test, multi-stage fitness test or 2-mile run time,28,29 and leg power via a vertical jump test.
• Unencumbered anthropometry — stature, body mass, segment lengths (e.g. arm, leg or trunk lengths), breadths (e.g. waist, shoulder or acromion breadth), circumferences (e.g. arm, hip, waist and chest) and skinfold thickness (e.g. seven-site formula30: abdominal, triceps, chest, midaxillary, subscapular, suprailiac and thigh).30–32

Stage A is to be completed under laboratory conditions to assess device impacts on the performance of controlled tasks (Table 1). While less mission specific, this stage assesses the device through a range of foundational movements to understand the basic impact a device has on soldier mobility. Treadmill and overground laboratory locomotion trials provide large, reliable sets of data on human and exoskeleton performance.
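The familiarisation criterion noted above, a plateau in a key metric across consecutive sessions, can be operationalised in a few lines. This sketch is illustrative only (the 2% threshold is an assumption, not a value from the text); it flags a participant as familiarised once the change in a session-mean metric such as oxygen consumption falls below a relative threshold:

```python
def familiarised(session_means, rel_threshold=0.02):
    """Return True once the relative change in a key metric (e.g. mean
    VO2 or integrated EMG per session) between consecutive familiarisation
    sessions falls below rel_threshold (assumed 2% here)."""
    for prev, curr in zip(session_means, session_means[1:]):
        if abs(curr - prev) / prev < rel_threshold:
            return True
    return False

# VO2 still falling sharply after the first session, then plateauing.
plateaued = familiarised([14.0, 12.5, 12.4])
```

The same check can be applied within a session by passing the means of consecutive time windows instead of session means.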
To provide a holistic evaluation of the device, a range of objective and subjective metrics can be collected (Table 1). Functional movement tests evaluate basic movement patterns, including range of motion, stepping over obstacles, static balance, movement between firing positions, handling firearms or relevant tools and donning/doffing the exoskeleton. The functional movement tests are to be completed in a controlled laboratory environment, tailored to the specific device under evaluation and in

positions the user would be expected to achieve. It is essential these tests are completed prior to any dynamic tests to ensure the safety of the participant and to identify any limitations or risks with the device. Laboratory locomotion trials can be completed on a treadmill and overground for a minimum duration of 5 min per trial to allow the user to reach steady-state oxygen consumption. Speeds should be representative of key tasks and guided by subject matter experts, such as patrolling, marching, running and/or user self-selected speeds. Treadmill locomotion trials can be performed on level, inclined and declined grades. Overground trials should be performed on a standardised flat surface to ensure a steady pace throughout the walking duration. The use of exoskeletons may offset decreases in cognitive performance associated with load carriage. Cognitive tasks relevant to the missions of the dismounted combatant need to be examined during loaded marching, such as navigation, target identification, communication, marksmanship, and reaction time and accuracy.33,34 These tasks could be completed under both fatigued and non-fatigued conditions.

Stage B is intended to simulate movements that are specific to the dismounted combatant, improving specificity and relevance to the end-user (Table 1). The activities completed in Stage B require multiple movement transitions, assessing the device's ability to adapt to changes in posture and tasks performed. Standardised assessments are recommended, including tests used by an Armed Service for the selection and retention of the dismounted close combatant and standard military obstacle courses such as the load effects assessment program (LEAP).35 Whilst tasks are simulated, participants can be fully instrumented during these assessments to obtain objective measurements in the field on the device's impact on soldier mobility, lethality, sustainability and survivability.
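Steady-state gas exchange from these locomotion trials is typically converted to a net metabolic cost. The sketch below uses the Brockway equation cited in Table 2 for metabolic power; treating quiet standing as the baseline and normalising to body mass are common choices in the exoskeleton literature rather than requirements stated here, and the function names are illustrative:

```python
def brockway_power_w(vo2_ml_s, vco2_ml_s):
    """Metabolic power (W) from the Brockway (1987) equation,
    with gas flows expressed in mL/s."""
    return 16.58 * vo2_ml_s + 4.51 * vco2_ml_s

def net_metabolic_cost_w_kg(vo2, vco2, standing_vo2, standing_vco2, mass_kg):
    """Net metabolic cost (W/kg): walking power minus quiet-standing power,
    normalised to body mass (or total mass including the exoskeleton)."""
    gross = brockway_power_w(vo2, vco2)
    baseline = brockway_power_w(standing_vo2, standing_vco2)
    return (gross - baseline) / mass_kg
```

Normalising to total mass (body plus exoskeleton) instead of body mass changes the interpretation: the former credits the device for carrying its own weight, the latter does not.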
Standardised assessments specific to the dismounted close combatant are to be completed according to defined protocols. For example, the Australian Physical Employment Standards Assessment (PESA) and Canadian FORCE evaluation36 include tasks such as a forced march, tactical movement test, and manual material handling assessments to assess soldier mobility and sustainability. The LEAP is an instrumented obstacle and combat effectiveness course designed to replicate movement patterns regularly performed by Army personnel to assess soldier mobility and survivability.35 The entire LEAP course can be completed or a

Table 1
An example of AUS specific test methodologies, including suggested task descriptions, applications (laboratory (L) and/or field (F)) and key metrics to be assessed for each stage.

Stage A — Functional movement tests
• Task description: joint range of motion; movement between operational tasks (prone/crouched/seated/standing upright); movement between firing positions (prone to taking-a-knee to upright); crawling and stepping over obstacles; static balance.
• Application: L/F. Metrics: EMG; kinetics; kinematics; questionnaire; RPE; time to complete.

Stage A — Walking/running
• Task description: 7–10 min duration; 0.55 m s−1 and/or 1.39 m s−1 and/or 2.08 m s−1; 0%, 10% and/or −10% grade.
• Application: L — treadmill & overground. Metrics: cardiovascular; EMG; spatiotemporal; kinetics (GRF, insole, exo)a; kinematics; questionnaire; RPE; cognitive.

Stage B — Dismounted combatant employment standards
• Forced march test: 15 km forced paced march (5.5 km h−1), completed between 150–165 min, wearing a marching order (40–45 kg). Application: F.
• Tactical movement test: 1 km move (8 min), 16 × 6 m bounds (20 s bounds), 18 m leopard crawl (35 s). Application: F.
• Manual handling: 2 × 22 kg jerry cans/kettle bells for 11 × 25 m legs (5 s rest between legs); lift a 35 kg box from the ground to a 1.5 m platform then lower back to ground. Application: L/F.
• Metrics: cardiovascular; EMG; spatiotemporal; kinetics (insole); kinematics; questionnaire; RPE.

Stage B — AUS-LEAP35,a
• Task description: tunnel and hatch; sprint; stair and ladder; agility run; casualty drag; window clearance; bounding rushes; balance beam; low crawl; courtyard walls; manual handling (horizontal & vertical weight transfer); vertical jump.
• Application: L/F. Metrics: obstacle completion time; cardiovascular; EMG; spatiotemporal; kinetics (FP, insole); kinematics; questionnaire; RPE.

Stage B — Marksmanship assessment
• Task description: static — simulated/range marksmanship assessment; dynamic assessment (nation specific).
• Application: L/F. Metrics: accuracy; time.

Stage C — Military training exercise
• Task description: mission specific military training exercises under a range of operationally relevant environments, tasks and group settings.
• Application: F. Metrics: questionnaire; RPE; mission evaluation by subject matter expert; pre-post objective measurements.

Note: AUS-LEAP = Australia load effects assessment program; GRF = kinetics—ground reaction forces; Insole = kinetics—insole forces; Exo = kinetics—exoskeleton forces.
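Several of the spatiotemporal metrics listed for the locomotion trials can be derived directly from gait-event timestamps. This sketch is an illustrative helper (the heel-strike detection itself is assumed to have been done elsewhere); it computes stride rate and stride length from successive ipsilateral heel strikes at a known treadmill speed:

```python
def stride_rate_and_length(heel_strikes_s, speed_m_s):
    """Stride rate (Hz) and stride length (m) from the times (s) of
    successive heel strikes of the same foot at a constant speed."""
    stride_times = [b - a for a, b in zip(heel_strikes_s, heel_strikes_s[1:])]
    mean_stride_time = sum(stride_times) / len(stride_times)
    # At constant speed, stride length is speed multiplied by stride time.
    return 1.0 / mean_stride_time, speed_m_s * mean_stride_time

# Strides of 1.1 s at the 1.39 m/s treadmill speed from Table 1.
rate_hz, length_m = stride_rate_and_length([0.0, 1.1, 2.2], 1.39)
```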


[Figure 2: flowchart. An exoskeleton first passes a technology certification gate; if successful it enters Stage A (controlled laboratory assessments: functional movements — EMG, kinetics, kinematics, questionnaire, RPE; treadmill/overground walking and cognitive tasks — cardiovascular, EMG, spatiotemporal, kinetics (FP, insole, exo), kinematics, cognitive, musculoskeletal modelling, questionnaire, RPE). A successfully completed assessment gate leads to Stage B (simulated field tasks and duties: physical employment standards and the LEAP obstacle course — completion time, cardiovascular, EMG, spatiotemporal, kinetics (FP, insole), kinematics, questionnaire, RPE), and a further gate to Stage C (military training exercises across environments, tasks and group settings — questionnaire, RPE, mission evaluation, pre-post objective measurements), with a "No" branch at each gate.]

Fig. 2. Flowchart detailing the stage gated process from simple and controlled laboratory evaluations (Stage A) to more complex simulated and real-world military tasks (Stages B–C). Whilst a device is intended to progress from the top down, relevant components of the testing construct can be selected to best suit the intended application and maturity of the technology as detailed by the outer dotted grey line.

selection of relevant obstacles that are specific to the intended exoskeleton can be used (Table 1). Lethality is one of the key NATO capability areas for the dismounted close combatant. While many current exoskeletons are not intended to directly affect marksmanship, it is important that there are no negative impacts on shooting performance in a static or dynamic environment. Marksmanship tests are to be completed in accordance with the defined simulated and live fire weapon proficiency requirements set by the relevant Armed Forces.

Stage C involves the completion of mission specific military training exercises under a range of relevant environments (e.g. temperate, jungle, desert, arctic, urban, mountainous, etc.) (Table 1). At this stage, the exoskeleton under evaluation has likely reached a high level of technical readiness and should be tested under operationally relevant scenarios. Participants will be trained and experienced dismounted combatants, with all tasks completed under the supervision of subject matter experts (i.e. a Senior Non-commissioned Officer (NCO)) in order to ensure that they are

completed in accordance with relevant tactics, techniques and procedures. Ideally, the same Senior NCO would observe the same mission profile completed under all three conditions (exoskeleton ON, exoskeleton OFF, and control). The aim of Stage C is to evaluate the exoskeleton in a team environment and under real world settings to determine its potential utility within a military context and its impact on soldier mobility, sustainability, lethality, survivability and C4I. Multiple dismounted combatants and devices should be used, with the focus on completing set mission profiles. Within this context, dismounted combatants should be afforded the flexibility to employ the exoskeleton in the manner that they believe best serves the mission. Data collected will be mostly subjective (Table 2): questionnaires (including the impact of the exoskeleton on individual users and the entire team) and ratings of perceived exertion (RPE), in addition to mission evaluations (i.e. rating of role and mission performance) from a subject matter expert. If possible, it is highly recommended that unobtrusive, field-deployable assessment suites,


Table 2
A standard set of test metrics, measurement units and specific applications (laboratory (L) and/or field (F)) to collect objective and subjective data relevant to military exoskeletons.

Participant factors
• Demographic information: age and sex; handedness; rank, role, corps/trade, length of service and operational experience.
• Strength and fitness: maximal aerobic capacity; leg power.
• Unencumbered anthropometry: stature and body mass; segment lengths, breadths and circumferences; skinfold thickness.

Physiology
• Cardiovascular function10,38: absolute oxygen consumption relative to BM or total mass (BM + exoskeleton) — ml kg−1 min−1 (L/F); absolute heart rate — bpm (L/F); net metabolic cost — W kg−1 (L/F).
• Surface EMG39: integral muscle activity — % (L/F); muscle onset, offset and duration — ms (L/F).
• Thermal load40,41: surface temperature — °C (L/F); core temperature — °C (L/F).

Biomechanics
• Postural stability11,42: dynamic postural stability index — unitless (L); variability of GRF — CV/SD (L); limits of stability — cm (L); COP length and excursion — cm (L/F).
• Spatiotemporal43: speed — m s−1 (L/F); stride rate — Hz (L/F); stride length — m (L/F); duration of stance, loading, propulsion, single/double support and swing phases — s/% (L/F); step width — m (L/F).
• Kinetics—ground reaction forces43: peak vertical and anterior/posterior GRF during the braking and propulsion phases — N (L); peak medio-lateral GRF during stance phase — N (L); minimum vertical GRF during single-support phase — N (L); loading and propulsion rates — N s−1 (L); joint moments — Nm (L).
• Kinetics—insole forces43: peak normal force during braking and propulsion phases — N (L/F); minimum normal force during single-support phase — N (L/F); peak normal force at rearfoot, midfoot and forefoot during stance phase — N (L/F).
• Kinetics—exoskeleton forces6,8: force/torque applied/measured by the device during gait — N/N m (L).
• Kinematics44,45: peak joint angles (max and min) — ° (L/F); range of motion — ° (L/F).
• Musculoskeletal modelling19,46: internal joint moments — Nm (L); muscle forces — N kg−1 (L).

Cognitive33,34
• Response time — ms (L/F); response accuracy — correct/incorrect (L/F).

Psychophysiology37,47
• RPE — 15 point scale (L/F); VAS — 0–100 VAS scale (L/F); questionnaire (Supplementary files) — 5 point Likert scale (L/F).

Note: CV = coefficient of variation.
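Of the postural stability measures above, the dynamic postural stability index can be computed from a short window of ground reaction force data. The sketch below follows the RMS form reported by Wikstrom et al.42 and, as an assumption of this sketch, normalises by body weight so the index is unitless; the exact normalisation used in a given study should be checked against that source:

```python
import math

def dpsi(fx, fy, fz, body_weight_n):
    """Dynamic postural stability index: RMS deviation of the three GRF
    components (N) from the stable reference (0, 0, body weight),
    normalised here to body weight to give a unitless index."""
    n = len(fz)
    sum_sq = (sum(x ** 2 for x in fx)
              + sum(y ** 2 for y in fy)
              + sum((body_weight_n - z) ** 2 for z in fz))
    return math.sqrt(sum_sq / n) / body_weight_n

# Perfectly stable stance: vertical GRF equals body weight, no shear forces.
stable = dpsi([0.0, 0.0], [0.0, 0.0], [700.0, 700.0], 700.0)
```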

similar to that of Brandon et al.,19 be used to collect physiological and biomechanical data throughout the mission, or before and after its completion. The combination of subjective and objective data, in conjunction with mission evaluations, will provide a comprehensive assessment of the effects of the exoskeleton(s) on mission performance.

When utilised as a staged process, relevant "reviews/gates" are to be conducted prior to progressing to the next stage and to assess the potential value proposition of the device (Fig. 2). Initially, a technology certification (safety assessment) of the device to be tested should be completed. The technology certification should ensure that the safety aspects and hazards associated with use of the exoskeleton, in relation to the activities to be performed, have been considered and mitigated. Further, specific details such as the purpose of the device, specifications, materials, total mass, load ratings, previous test results and potential risks are determined and reviewed. A device should not progress to any human testing without first assessing quality and safety. It should be noted, however,


that due to the new nature of the technology there are no standard quality assessment tools or checklists for new exoskeletons to adhere to; a thorough safety assessment by the chief investigator should therefore be completed on every new device. A number of collaborative efforts around the world are, however, working on developing safety standards for exoskeletons (i.e. ASTM International Committee F48 on Exoskeletons and Exosuits, the National Institute of Standards and Technology (NIST) Exoskeleton Terminology Task Group, the Wearable Robotics Association (WearRA) Standards Committee and International Organization for Standardization (ISO)/Technical Committee (TC) 299 Robotics).16

Following the completion of a testing stage, and prior to progressing to the next stage, it is necessary to complete a holistic assessment of the device's performance and safety (assessment gate). Any observed safety issues, such as control system faults, device misalignments, durability concerns, thermal burden issues, significant restrictions in movement or general system failures, should be addressed and rectified prior to advancing to the next stage. Additionally, user feedback about fit, comfort or system performance should be an important consideration before the system is permitted to advance. Following changes to a device, it should be determined whether they were substantial enough to necessitate repeating previous stages/trials to ensure a valid data set (Fig. 2).

3. Metrics

A standard set of metrics is presented for researchers and developers to select from, dependent upon device specifications and the evaluation stage of interest. Objective measurements assess changes in the physiological, biomechanical and cognitive characteristics of the user when wearing the exoskeleton. Subjective measurements assess the psychophysiological effect of the device with regard to the user's experience of wearing and using it.
Combining objective and subjective measurements throughout all stages of testing ensures a holistic assessment of exoskeleton performance. Objective measurements are separated into three main categories of human performance: physiological (cardiovascular and muscle function), biomechanical (spatiotemporal, kinetics and kinematics) and cognitive (response time and accuracy). A range of key and commonly measured variables is presented, the majority of which can be measured reliably in both the laboratory and the field.

Subjective measurement tools are presented to investigate the user's perception and views of wearing the device, including an example standardised questionnaire (Supplementary files), the Borg 15-point RPE scale and a visual analogue scale (VAS).37 A questionnaire was developed by the authors to assess the user's perspective on the fit and comfort, usability, integration and durability of the exoskeleton. Users should also be given the opportunity to provide general comments on the device and to highlight areas of discomfort or pain using a body-mapping tool.

4. Future directions

The development of a standardised evaluation framework for assessing the impact of an exoskeleton on the mobility of the dismounted close combatant was the primary focus of this paper. Nonetheless, as technology matures, improving exoskeleton design and assessment protocols to target the other four capability areas of the dismounted close combatant (lethality, survivability, sustainability and C4I) should be considered in future work. Further, developing a small number of simple, key metrics that combine objective and subjective measurements, weighted to a specific task and device, will significantly improve the evaluation and translation of findings to exoskeleton developers and military procurement

specialists. Lastly, a critical area for future work will involve the further development of Stage C to improve the validity and reliability of evaluating exoskeletons during operationally relevant scenarios, whether at a team or individual level.

5. Conclusions

A structured and systematic methodology has been outlined with the intent of enabling a more consistent and holistic assessment of exoskeleton performance across a variety of dismounted close combatant tasks, to obtain high quality quantitative and qualitative metrics related to device and personnel performance. The presented testing methodology is to be completed as a staged process, structured to progress from simple, controlled laboratory measurements to more complex field assessments. Representative tests that underlie the principles of each assessment stage are provided. Tests should reflect the tasks and duties expected of the dismounted combatant, in addition to being suitable for each stage of the device's development cycle and maturity level (i.e. simple controlled tests first, followed by progressively more complex field tasks). Lastly, potential metrics and associated units that can be calculated during each test are presented to improve reporting and presentation consistency. These metrics entail a holistic approach, providing data on (1) the human response to wearing the exoskeleton (i.e. objective and subjective), and (2) the device's mechanical performance.

The development of a standardised methodology allows for the classification of exoskeletons in accordance with a common framework, thus facilitating comparisons between different devices and pre-/post-tests of a single device over time to assess design modifications or long-term effects of wear. These benefits will continue to support the advancement of the exoskeleton industry and improved international collaboration.
In doing so, it enables international military groups to better evaluate a system's potential, with the hope of accelerating the maturity and, ultimately, the fielding of devices to augment the dismounted close combatant and small team capability.

In summary, to evaluate an exoskeleton's performance within the military context we present a multi-disciplinary assessment utilising objective and subjective measurement techniques. We developed a three-staged testing procedure that progresses from simple controlled laboratory evaluations to complex in-field military specific exercises and simulated missions. Relevant components of the testing construct can be selected to best suit the intended application and maturity of the exoskeleton technology, thereby ensuring a robust and flexible approach.

Appendix A. Supplementary data

Supplementary data associated with this article can be found, in the online version, at https://doi.org/10.1016/j.jsams.2018.05.016.

References

1. Orr RM. The history of the soldier's load. Aust Army J 2010; 7(2):67.
2. Orr RM, Pope R, Johnston V et al. Soldier occupational load carriage: a narrative review of associated injuries. Int J Inj Contr Saf Promot 2014; 21(4):388–396.
3. Knapik JJ, Reynolds KL, Harman E. Soldier load carriage: historical, physiological, biomechanical, and medical aspects. Mil Med 2004; 169(1):45–56.
4. Jaworski RL, Jensen A, Niederberger B et al. Changes in combat task performance under increasing loads in active duty marines. Mil Med 2015; 180(3):179–186.
5. Herr H. Exoskeletons and orthoses: classification, design challenges and future directions. J Neuroeng Rehabil 2009; 6(1).
6. Panizzolo FA, Galiana I, Asbeck AT et al. A biologically-inspired multi-joint soft exosuit that can reduce the energy cost of loaded walking. J Neuroeng Rehabil 2016; 13(1).
7. Mooney LM, Herr HM. Biomechanical walking mechanisms underlying the metabolic reduction caused by an autonomous exoskeleton. J Neuroeng Rehabil 2016; 13(1).

8. Collins SH, Bruce Wiggin M, Sawicki GS. Reducing the energy cost of human walking using an unpowered exoskeleton. Nature 2015; 522(7555):212–215.
9. Cherry MS, Kota S, Young A et al. Running with an elastic lower limb exoskeleton. J Appl Biomech 2015; 32(3):269–277.
10. Gregorczyk KN, Hasselquist L, Schiffman JM et al. Effects of a lower-body exoskeleton device on metabolic cost and gait biomechanics during load carriage. Ergonomics 2010; 53(10):1263–1275.
11. Schiffman JM, Gregorczyk KN, Bensel CK et al. The effects of a lower body exoskeleton load carriage assistive device on limits of stability and postural sway. Ergonomics 2008; 51(10):1515–1529.
12. Mooney LM, Rouse EJ, Herr HM. Autonomous exoskeleton reduces metabolic cost of human walking during load carriage. J Neuroeng Rehabil 2014; 11(1).
13. Ding Y, Galiana I, Asbeck AT et al. Biomechanical and physiological evaluation of multi-joint assistance with soft exosuits. IEEE Trans Neural Syst Rehabil Eng 2016; 25(2):119–130.
14. Kazerooni H, Steger R. The Berkeley lower extremity exoskeleton. J Dyn Syst Meas Control Trans ASME 2006; 128(1):14–25.
15. Walsh CJ, Endo K, Herr H. A quasi-passive leg exoskeleton for load-carrying augmentation. Int J Humanoid Rob 2007; 4(3):487–506.
16. Exoskeleton report. http://exoskeletonreport.com/.
17. Young AJ, Ferris DP. State-of-the-art and future directions for lower limb robotic exoskeletons. IEEE Trans Neural Syst Rehabil Eng 2016; 25(2):171–182.
18. Mudie KL, Billing DC, Bishop DJ et al. Reducing load carriage during walking using a lower limb passive exoskeleton. Proceedings of the 26th Congress of the International Society of Biomechanics, 2017:51.
19. Brandon S, Brookshaw M, Sexton A et al. Biomechanical evaluation of a dermoskeleton during gait. 19th Biennial Meeting of the Canadian Society for Biomechanics, 2017.
20. Norris JA, Granata KP, Mitros MR et al. Effect of augmented plantarflexion power on preferred walking speed and economy in young and older adults. Gait Posture 2007; 25(4):620–627.
21. Torricelli D, Gonzalez-Vargas J, Veneman JF et al. Benchmarking bipedal locomotion: a unified scheme for humanoids, wearable robots, and humans. IEEE Rob Autom Mag 2015; 22(3):103–115.
22. Carlson B, Norton A, Yanco H. Preliminary development of test methods to evaluate lower body wearable robots for human performance augmentation. Advances in Cooperative Robotics: Proceedings of the 19th International Conference on CLAWAR, World Scientific, 2016:143.
23. Ferris DP, Sawicki GS, Daley MA. A physiologist's perspective on robotic exoskeletons for human locomotion. Int J Humanoid Rob 2007; 4(3):507–528.
24. Galle S, Malcolm P, Derave W et al. Adaptation to walking with an exoskeleton that assists ankle extension. Gait Posture 2013; 38(3):495–499.
25. Sawicki GS, Ferris DP. Mechanics and energetics of level walking with powered ankle exoskeletons. J Exp Biol 2008; 211(9):1402–1413.
26. Rome LC, Flynn L, Yoo TD. Rubber bands reduce the cost of carrying loads. Nature 2006; 444(7122):1023–1024.
27. Ding Y, Panizzolo FA, Siviy C et al. Effect of timing of hip extension assistance during loaded walking with a soft exosuit. J Neuroeng Rehabil 2016; 13(1):1–10.
28. Mello RP, Murphy MM, Vogel JA. Relationship between a two mile run for time and maximal oxygen uptake. J Strength Cond Res 1988; 2(1):9–12.

1161

29. Sporiš G. Validity of 2-miles run test for determination of VO2max among soldiers. J Sport Hum Perform 2013; 1(1):15–22.
30. Blair SN, Gibbons LW. ACSM's Guidelines for Exercise Testing and Prescription, 9th ed. Wolters Kluwer, 1986.
31. Keefe AA, Angel H, Mangan B. 2012 Canadian Forces Anthropometric Survey (CFAS), Canada, Defence Research and Development, 2015.
32. Edwards M, Furnell A, Coleman J et al. A Preliminary Anthropometry Standard for Australian Army Equipment Evaluation, Defence Science and Technology Organisation, 2014.
33. Haas EC, Crowell HP, Kehring KL. The Effect of Physical Load and Environment on Soldier Performance, Army Research Lab Aberdeen Proving Ground MD Human Research and Engineering Directorate, 2014.
34. Kobus DA, Brown CM, Wu L et al. Cognitive Performance and Physiological Changes under Heavy Load Carriage, San Diego, CA, Pacific Science and Engineering Group Inc, 2010.
35. Karakolis T, Sinclair BA, Kelly A et al. Determination of orientation and practice requirements when using an obstacle course for mobility performance assessment. Hum Factors 2017; 59(4):535–545.
36. Reilly TJ. Canada's physical fitness standard for the land force: a global comparison. Can Army 2010; 13:59–69.
37. Grant S, Aitchison T, Henderson E et al. A comparison of the reproducibility and the sensitivity to change of visual analogue scales, Borg scales, and Likert scales in normal subjects during submaximal exercise. Chest 1999; 116(5):1208–1217.
38. Brockway JM. Derivation of formulae used to calculate energy expenditure in man. Hum Nutr Clin Nutr 1987; 41(6):463–471.
39. Hermens HJ, Freriks B, Disselhorst-Klug C et al. Development of recommendations for SEMG sensors and sensor placement procedures. J Electromyogr Kinesiol 2000; 10:361–374.
40. Casa DJ, Becker SM, Ganio MS et al. Validity of devices that assess body temperature during outdoor exercise in the heat. J Athl Train 2007; 42(3):333–342.
41. Ganio MS, Brown CM, Casa DJ et al. Validity and reliability of devices that assess body temperature during indoor exercise in the heat. J Athl Train 2009; 44(2):124–135.
42. Wikstrom EA, Tillman MD, Smith AN et al. A new force-plate technology measure of dynamic postural stability: the dynamic postural stability index. J Athl Train 2005; 40(4):305–309.
43. Hall SJ. Basic Biomechanics, 5th ed. WCB/McGraw-Hill, 2007.
44. Wu G, Siegler S, Allard P et al. ISB recommendation on definitions of joint coordinate system of various joints for the reporting of human joint motion—part I: ankle, hip, and spine. J Biomech 2002; 35(4):543–548.
45. Wu G, van der Helm FCT, Veeger HEJ et al. ISB recommendation on definitions of joint coordinate systems of various joints for the reporting of human joint motion—part II: shoulder, elbow, wrist and hand. J Biomech 2005; 38(5):981–992.
46. Ewing KA, Fernandez JW, Begg RK et al. Prophylactic knee bracing alters lower-limb muscle forces during a double-leg drop landing. J Biomech 2016; 49(14):3347–3354.
47. Scherr J, Wolfarth B, Christle JW et al. Associations between Borg's rating of perceived exertion and physiological measures of exercise intensity. Eur J Appl Physiol 2013; 113(1):147–155.

Journal of Science and Medicine in Sport 21 (2018) 1162–1167


Review

A method for developing organisation-wide manual handling based physical employment standards in a military context

Greg L. Carstairs∗, Daniel J. Ham, Robert J. Savage, Stuart A. Best, Ben Beck, Daniel C. Billing

Land Division, Defence Science and Technology Group, Australia

Article info

Article history: Received 13 September 2017; Received in revised form 7 February 2018; Accepted 17 February 2018; Available online 2 March 2018

Keywords: Trade task analysis; Physical test development; Physical demands; Task performance; Ergonomics; Work

Abstract

The benefits of job-related employment standards in physically demanding occupations are well known. A number of methodological frameworks have been established to guide the development of physical employment standards for single job functions. In the case of an organisation comprised of multiple and diverse employment specialisations, such as the Australian Army, it is impractical to develop unique employment standards for each occupation.

Objectives: To present an approach to organisation-level physical employment standards development that seeks to retain occupationally specific task characteristics by applying a movement cluster approach.

Design: Structured methodological overview.

Methods: An outline of the research process used in performing job task analysis is presented, including the identification, quantification and characterisation, and verification of physically demanding manual handling tasks. The methodology used to filter the task information collected from these job analyses and to group manual handling tasks with similar characteristics (termed clusters) across a range of employment specialisations is given. Finally, we provide examples of test development based on these key manual handling clusters.

Results: Job task analysis was performed on 57 employment specialisations, identifying 458 manual handling tasks that were grouped into 10 movement-based clusters. The rationalisation of criterion tasks through clustering informed the development of a limited suite of tests with high content, criterion and face validity that may be implementable across a large organisation.

Conclusion: This approach could be applied when developing physical employment standards across other multi-occupation organisations.

Crown Copyright © 2018 Published by Elsevier Ltd on behalf of Sports Medicine Australia. This is an open access article under the CC BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/).

1. Introduction

The most demanding occupational duties often involve manual material handling,1,2 and are a significant source of musculoskeletal injuries.3–5 Manual handling is defined as the exertion of force to hold and/or move objects from one location to another, predominantly by hand. Manual handling tasks in the military are diverse,1,2 often not modifiable6,7 and require high levels of muscular strength and/or muscular endurance.2,6 A mismatch between workers' physical capacity and the physical demands of the job may increase the incidence of musculoskeletal injury.8 The implementation of scientifically developed physical employment standards (PES) serves to address this mismatch and can lead to increased capability and operational effectiveness.1,9

A number of frameworks have been developed9–11 for creating PES. A common approach includes the identification of physically demanding tasks through workshop(s) and/or surveys with experiential experts,12 followed by field observations to measure and quantify task demand. This process yields a subset of tasks, known as criterion tasks, which typically represent those that are the most physically demanding and critically important.11,13 Physical tests that reflect the physical demands of one or more criterion tasks are then developed. While this approach is suitable for developing PES for a single employment category (e.g. infantry soldier, which in itself has multiple distinct roles or sub-specialities), this framework becomes cumbersome and impractical at an organisational level, which can incorporate multiple distinct employment categories. For example, the Australian Army contains more than 50 diverse employment specialisations, making it impractical to implement a specific PES assessment test for each criterion task for each employment category. Since physical screening also occurs prior to employment specialisation, it is clearly advantageous to have a limited suite of broadly applicable PES assessments that can predict physical suitability across different employment categories and which may be adjusted by altering key characteristics of the test, such as the mass lifted or carried and/or the distance traversed.

The purpose of this paper is to present a methodological approach to characterising manual handling tasks to support the development of evidence-based PES for an organisation comprised of multiple and diverse employment specialisations, such as the Australian Army. Firstly, we briefly describe the process of performing job task analysis for manual handling tasks. Secondly, we present a method to condense the vast amount of information collected from these job task analysis procedures and group manual handling tasks into a manageable set, termed clusters. Finally, using these clusters, we provide examples of test development. While the greatest applicability of this approach is for a diverse organisation such as the military, these steps could be applied to other physically demanding occupations, such as the emergency services, which can contain many different distinct roles or sub-specialities.

∗ Corresponding author. E-mail address: [email protected] (G.L. Carstairs).

https://doi.org/10.1016/j.jsams.2018.02.008 1440-2440/Crown Copyright © 2018 Published by Elsevier Ltd on behalf of Sports Medicine Australia. This is an open access article under the CC BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/).

2. Task identification

The first stage of the PES development was to develop an initial task list. A list of preliminary physically demanding tasks was identified by systematically reviewing employment profile manuals and doctrine and extracting tasks that were documented as requiring physical attributes for successful task completion. These lists were then used to initiate discussions with experiential experts from each of the employment categories during focus groups called Trade Task Workshops (TTW). Each TTW included experienced personnel covering a range of ranks and roles, from those who directly execute the task, those who supervise personnel completing the task, and those who manage the relevant workforce, with an aim of at least three personnel from each category, as well as physiological and biomechanical researchers experienced in PES development. The purpose of this stage was to: (1) identify physically demanding tasks that personnel within the employment categories are expected to perform, (2) gain an understanding of the physical nature of job functions and (3) create task descriptions and a physically demanding task list for each employment category. For each task identified, the scenario and the context of how, when and where the task could be performed were discussed, with specific details of the task focusing on: the difficulty, duration and frequency of the task; the body-worn equipment, such as protective armour; whether the task was self- or externally-paced; and possible work environments. Further manual handling-specific information gathered included: equipment used; object dimensions and masses (if known); actions used (e.g., lifting and carrying); number of people involved; and distances covered. Generally, the information extracted from the TTW would be incomplete or lack fidelity, which necessitated observing tasks actually being performed.
Comprehensive details on the scenarios and context of tasks were sufficient for the researchers to gain an appreciation of the task and to be able to work with experiential experts to set up accurate task observations. Australian Army personnel perform two key functions: firstly, a generic service requirement involving common soldiering tasks and, secondly, employment requirements directly related to their trade specialisation. This dual-nature role necessitated the formation of two baseline levels for the generic service requirements, the All Corps Soldier and Combat Arms. The All Corps Soldier is the minimum level and is based on the performance of essential military duties, which are typically defensive in nature. The Combat Arms level is based upon the requirement to operate in a high-threat environment and, if necessary, conduct direct tactical action against the enemy. At the conclusion of the TTW, each group of experiential experts confirmed that the formulated list of tasks was inclusive of all employment category-specific tasks with a perceived high physical demand. For the 57 employment categories in the Australian Army that underwent review, 583 tasks were identified, 458 of which were characterised as manual handling tasks. The high proportion of physically demanding tasks classified as manual handling (79%) underscores the importance of these activities to successful job performance.
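To make the workshop outputs concrete, the descriptors above can be captured in a simple task record. The sketch below is illustrative only: the field names are our own shorthand for the descriptors listed in the text, not the study's actual instrument, and the tally simply reproduces the reported figures (458 of 583 tasks).

```python
# Illustrative sketch only: a minimal record for the task descriptors the
# Trade Task Workshops captured (field names are ours, not the study's
# instrument), plus the manual handling tally reported above.
from dataclasses import dataclass

@dataclass
class TaskRecord:
    name: str
    employment_category: str
    is_manual_handling: bool
    self_paced: bool            # self- vs externally-paced
    body_worn_equipment: str    # e.g. "fighting order"
    actions: tuple              # e.g. ("lift", "carry")

def manual_handling_share(tasks):
    """Proportion of identified tasks characterised as manual handling."""
    return sum(t.is_manual_handling for t in tasks) / len(tasks)

# The review identified 583 tasks, 458 of them manual handling:
print(round(458 / 583 * 100))  # → 79 (%)
```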

3. Task quantification and characterisation

The second stage of the PES development was conducting Trade Task Field Observations (TTFOs), with the purpose of objectively quantifying the physical demands of tasks identified in the TTWs. All physically demanding job tasks were observed and measured under 'typical' work conditions in the field and/or barracks, with experiential experts providing guidance on the most appropriate locations and scenarios. For each task, experienced senior soldiers and/or officers verified that the soldiers performing the simulations were competent and that the task scenarios appropriately represented the task to an 'expected standard', i.e. what would be reasonably expected of all soldiers within the employment category, and not the extremes of task performance. The qualified soldiers performing these task simulations were all medically fit for duty and were encouraged to identify any other physically demanding tasks not recognised during the TTW. Manual handling task parameters were quantified according to a list of factors (described by Waters et al.,14 Snook and Ciriello,15 and Gallagher16) known to influence task difficulty. Specifically, the following object, movement and task parameters were measured: object mass and size; hand location and object coupling; vertical and horizontal distances; asymmetry; frequency; repetitions; quantity of items; duration; pacing; pushing and pulling positions and forces; postures (i.e. stand, stoop, kneel, sit, crouch); number of personnel involved; number of hands used; action used (i.e. carry, push, pull, drag, hold); surface type; and physiological cost. All items carried, lifted or dragged were weighed with platform scales or a portable force plate. Distances were measured using global-positioning system (GPS) units, a trundle wheel and/or a tape measure. Lifting heights and object dimensions, including handle location, were measured using a tape measure.
Push and pull forces were measured using an inline force transducer. Time to task completion and work:rest ratios were collected with a stopwatch and/or video camera. The body-worn equipment was noted and weighed. To measure the physiological demands of repetitive manual handling tasks, heart rate and oxygen consumption were collected. Participants were also asked to rate their perceived exertion (RPE) on a 15-point Borg scale (6–20)17 during certain tasks to subjectively compare the physical demands of different tasks within the same employment category. To demonstrate examples of some of the key task characterisation measures, three manual handling tasks (pack lift and place, stretcher carry, and bombing up an M1 Abrams main battle tank) have been selected and presented in Table 1.

Table 1
Example of task characteristics for three tasks during trade task field observations.

|                        | Combat arms pack lift and place    | All corps soldier stretcher carry       | Bombing up an M1 tank                   |
| Object characteristics |                                    |                                         |                                         |
| Mass (kg)              | 22.1 (field pack)                  | 90.4 (casualty and stretcher)           | 23 (ammunition shell)                   |
| Dimensions (m)         | 0.80 × 0.65 × 0.30                 | 2.29 × 0.54                             | Diameter 0.12, length 0.98              |
| Coupling               | Poor                               | Good                                    | Poor                                    |
| Handle height (m)      | None                               | 0.12                                    | None                                    |
| Lift/carry positions   |                                    |                                         |                                         |
| Start                  | Ground – in front of body          | Ground – side of body                   | 0.10, 0.30, 0.50 m – side/front of body |
| Mid                    | Waist/chest – in front of body     | Knuckle height – side of body           | Waist or chest height                   |
| End                    | 1.50 m platform – in front of body | Ground – side of body                   | 1.70 m, can be lifted higher            |
| Task parameters        |                                    |                                         |                                         |
| Personnel              | One                                | Four                                    | One                                     |
| Carry distance         | None/stationary                    | 100 m in 25 m bouts                     | 10 m loaded walk, 10 m unloaded         |
| Duration               | ∼5 s                               | 01:40 min                               | 10–15 min                               |
| Pace                   | NA                                 | 4.5 km h−1                              | 3.6 km h−1                              |
| Work:rest              | NA                                 | (4:1) carry 25 m in 20 s, 5 s rotate    | 1:1                                     |
| Repetitions            | 1                                  | 4 × 25 m                                | 36 shells                               |
| Number of hands        | 2                                  | 1 (swap every 25 m)                     | 2                                       |
| Worn equipment         | Fighting order^a                   | Fighting order^a                        | Disruptive pattern combat uniform       |

^a Fighting order is in addition to the standard combat uniform and boots, and consists of body armour, webbing, helmet and weapon (20–22 kg).

Lifting one's own field pack to the back of a common Australian military vehicle (height 1.50 m) is a discrete muscular strength task required by all Australian Army personnel, where the mass of the pack varies depending on the employment category. For example, the minimum required mass for an All Corps soldier's pack was 20.0 kg, while Combat Arms and Infantry soldiers require a 22.1 kg and 32.0 kg pack, respectively. While in many cases the mass used on operations may exceed this, these masses represent baseline levels required of all soldiers in these employment categories as defined by the load list.

Performing a stretcher carry is an essential muscular endurance task required by all Australian Army personnel, where the expected mass, carry distance and type of stretcher vary across employment categories due to the different scenarios/contexts and threat profiles. The casualty tasks were profiled using GPS to measure distances, speeds and work:rest ratios (as described by Silk and Billing18) on multiple patrols performing mock assaults and casualty evacuations through different terrain over different distances. The All Corps soldiers' requirement was to participate in a 100 m carry of a 90.4 kg stretcher (average Australian soldier mass of 83.3 kg and 7.1 kg stretcher mass), the Combat Arms requirement was a 300 m carry of a 108.8 kg stretcher (additional fighting order mass on the casualty), and the Infantry requirement was a 750 m carry of a 110.6 kg stretcher. The All Corps and Combat Arms stretcher carries were performed with a rigid stretcher in a team of four. The Infantry stretcher carry was performed with a soft stretcher in a team of six, where four soldiers carry the stretcher at a time while the other two provide covering fire. The six soldiers rotate roles to perform at least three 150 m carry efforts.

Bombing up an M1 tank was completed in a team where a soldier would repeatedly pick up an ammunition shell from a tray, carry it 10 m, and then pass it up to another soldier standing on the tank, who would then pass it to another soldier standing in the turret, who would pass it to yet another soldier inside the tank for storage.
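As a worked example of the parameters in Table 1, the stretcher carry pace and work:rest ratio follow directly from the observed bout timings (25 m carried in 20 s, with a 5 s rotation between bouts). A minimal Python sketch of the arithmetic:

```python
# Worked example of the Table 1 stretcher carry parameters: the pace and
# work:rest ratio follow from the observed bout timings (25 m per bout,
# 20 s carrying, 5 s rotating between bouts).
def carry_pace_kmh(distance_m: float, time_s: float) -> float:
    """Average carrying speed in km/h for one bout."""
    return distance_m / time_s * 3.6   # m/s → km/h

def work_rest_ratio(work_s: float, rest_s: float) -> float:
    return work_s / rest_s

print(carry_pace_kmh(25, 20))   # → 4.5 (km/h, matching Table 1)
print(work_rest_ratio(20, 5))   # → 4.0 (the 4:1 work:rest ratio)
```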
The soldier performing the first part of the task was judged to have the most physically challenging role, due to the requirement to perform 23 lift-and-carry repetitions of the ammunition shell (23 kg), with a final lift height of 1.70 m at the end of each repetition, requiring both muscular strength and muscular endurance. The pack lift, stretcher carry and bombing-up tasks each have a lifting load of 22–23 kg, but the physical demands vary according to the different task parameters, highlighting the need to quantify all components of the task rather than only the mass of the lift. As a result of the TTFOs, alterations to task characteristics could be made to the original task lists.

4. Task verification

The third stage of the PES development involved the conduct of follow-up focus groups, via a Trade Task Confirmation Workshop

(TTCW), with a group of experiential experts who had a breadth of understanding of the requirements of their employment category. The composition of these experiential experts mimicked that of the initial TTW as closely as possible. The purpose of this stage was to: (1) receive confirmation and endorsement of the physically demanding task list and descriptions, (2) confirm task parameters and (3) select criterion tasks. Researchers presented updated task descriptions of all relevant task characteristics and parameters based on the outcomes of the observations for all tasks, to confirm the descriptions and that tasks were conducted in the appropriate context. To characterise variation, tasks were observed being performed under different simulated operational scenarios and environmental contexts. These variations were presented to experiential experts to identify the most appropriate scenario(s) for their employment category, allowing parameters to be established. Experiential experts were able to identify differences between work practices observed during TTFOs and expected methods, allowing tasks to be adjusted accordingly and re-quantified where necessary. For example, lifting a 72 kg generator onto a trailer was observed being performed by two soldiers from the Royal Australian Corps of Signals during the TTFO; however, during the TTCW experiential experts indicated that there was no operational requirement for this and the task could be performed by four soldiers.

Tasks were presented and grouped together as muscular strength, muscular endurance, or requiring high levels of both, based on the physical capacity or attribute that was likely to fatigue and/or fail first and limit continued task performance, to focus attention on one grouping of tasks at a time. In this context, muscular strength is the ability of a specific muscle or muscle group(s) to generate sufficient force for successful task completion, while muscular endurance is the execution of repeated isotonic contractions or a sustained isometric muscular contraction. Together, researchers and experiential experts identified task(s) within each physical capacity to be criterion tasks for each employment category. The selection of criterion tasks was ultimately based on two key criteria: (1) the task was physically demanding relative to others classified in the same physical capacity or attribute group (a researcher-led decision based on the objective data collected during TTFOs); and (2) all members within the employment category were expected to be able to safely and successfully execute the task (an experiential expert-led decision). Multiple criterion tasks for each physical capacity were selected when tasks were demanding and vastly different; for example, the Infantry muscular strength criterion tasks chosen were a 32 kg pack lift and individually performing a 109 kg casualty drag over 10 m.
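The two-step selection logic described above can be sketched as a simple filter-and-rank procedure. The Python below is illustrative only: the task names, demand scores and top-n cut-off are hypothetical, not values from the study.

```python
# Illustrative sketch of the two-step criterion-task selection: (1) a
# researcher-led ranking by objective demand within a physical capacity
# grouping, and (2) experiential-expert endorsement that all members of the
# category are expected to perform the task. All data here are hypothetical.
def select_criterion_tasks(tasks, top_n=2):
    """tasks: list of dicts with 'name', 'capacity' ('strength'/'endurance'),
    'demand_score' (objective TTFO measure) and 'expected_of_all' (expert call)."""
    selected = []
    for capacity in ("strength", "endurance"):
        pool = [t for t in tasks
                if t["capacity"] == capacity and t["expected_of_all"]]
        pool.sort(key=lambda t: t["demand_score"], reverse=True)
        selected.extend(pool[:top_n])  # keep the most demanding task(s)
    return selected

tasks = [
    {"name": "pack lift", "capacity": "strength", "demand_score": 8.5, "expected_of_all": True},
    {"name": "casualty drag", "capacity": "strength", "demand_score": 9.0, "expected_of_all": True},
    {"name": "generator lift", "capacity": "strength", "demand_score": 9.5, "expected_of_all": False},
    {"name": "stretcher carry", "capacity": "endurance", "demand_score": 8.0, "expected_of_all": True},
]
print([t["name"] for t in select_criterion_tasks(tasks)])
# → ['casualty drag', 'pack lift', 'stretcher carry']
```

Note how the hypothetical generator lift is excluded despite being the most demanding: without the expert endorsement that all members must perform it, it cannot be a criterion task.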

Table 2
Cluster allocation of all manual handling tasks and the criterion manual handling tasks.

| Cluster                                      | All tasks | Criterion tasks |
| Lift to platform                             | 198       | 49              |
| Lift to an anatomical height (without carry) | 34        | 4               |
| Lift and hold                                | 54        | 2               |
| Lift and twist                               | 10        | 1               |
| Seated, crouching or kneeling lift           | 9         | 0               |
| Lift-carry-lower                             | 100       | 25              |
| Lift-carry-lift                              | 40        | 9               |
| Drag                                         | 13        | 3               |
| Push/pull                                    | 38        | 6               |
| Dig/hammer                                   | 30        | 5               |
| Other                                        | 13        | 2               |
| Total                                        | 458       | 88              |
| 1 cluster                                    | 374       | 69              |
| 2 clusters                                   | 62        | 16              |
| 3 clusters                                   | 6         | 1               |
| 4 clusters                                   | 2         | 0               |

At the conclusion of each TTCW, experiential experts confirmed the formulated task list and selection of criterion tasks. Across the different employment categories investigated, 88 criterion tasks were identified from 458 manual handling tasks. The dominant physical capacity of the criterion tasks was identified as muscular strength for 49 (55.7%) and muscular endurance for 44 (50.0%), while five (5.7%) required both.

5. Clustering tasks across job functions

Given the large number of criterion tasks identified and the need to provide a suite of physical tests that could be implemented at an organisation-wide level, developing specific individual PES assessments for each occupational speciality was not feasible. In order to balance scientific integrity (maintaining the link between the task and the test) with the successful service-wide introduction of PES, a filtration approach was used to cluster tasks with similar characteristics. The research team methodically evaluated each task using a structured approach based on the IDEA protocol,19 which improves the accuracy of expert judgements through four key steps: investigate, discuss, estimate and aggregate. Task characteristics and parameters were discussed and data entry accuracy was checked, with the aid of video recordings, photographs, notes and first-hand observational insights collected during the TTFOs and workshops. All task characteristics and parameters collected during TTFOs for all manual handling tasks from the 57 employment categories investigated were tabulated in Microsoft Excel. Major column headings were based on the list of factors known to influence task difficulty identified earlier in the task quantification and characterisation section. Through this process, a pattern of common movements and actions emerged that could lay the foundation of task simulation test design.
Initially, the team found four clear groupings with the following components: (1) vertical lift; (2) locomotion with load; (3) pushing and/or pulling; and (4) repetitive striking. These initial clusters were then further separated into 10 discrete clusters, with the majority of tasks (82%) placed into one cluster and a small portion falling into more than one cluster (14%) (see Table 2).

Lifting was the most commonly performed manual handling task. The most common lifting cluster, 'lift to a platform', involved a fixed-height external feature (such as a vehicle or work bench) that dictated the height of the lift. Additionally, there were a number of lifts that finished at an anatomically defined position, 'lift to an anatomical height (without carry)', such as full arm extension overhead, hand at shoulder height, or knuckle height (i.e. arms by the side during standing). This cluster focused on the end position of the lift and not the carry component, as most carry positions were governed by the anatomy of the lifter. The next lifting cluster identified was 'lift and hold', where soldiers were often holding and operating tools and machinery for a prolonged period of time. The last two lifting clusters were 'lift and twist' and 'seated, crouching or kneeling lift'. A lift was classified as 'lift and twist' when there was 90° of rotation of the body in moving the item while the lower body was fixed in place, which often occurs when someone is seated. The 'seated, crouching or kneeling lift' was applicable to tasks where a soldier performed the lift in one of these postures, which often occurred in a confined space, such as loading munitions inside a vehicle.

Locomotion with load tasks predominantly involved carrying equipment. Tasks were deemed to be a carry when items were lifted and moved a distance of 10 m or more. While many lifting tasks require movement of a few metres to complete the lift, it was deemed that the carry component was unlikely to limit task performance. Carries were divided into 'lift-carry-lift' and 'lift-carry-lower'. When the item was lifted at the end of the carry above 0.8 m (representing a height requiring a lift above the knuckle height of military personnel20), this was classified as 'lift-carry-lift', such as when loading a vehicle. The vast majority of carries ended with the item back on the ground or below knuckle height, representing 'lift-carry-lower'. The last cluster within locomotion was 'drag', which predominantly related to dragging a casualty but could also include items such as heavy chains. Another cluster established was 'push/pull', as many tasks could be performed by either pushing or pulling the item, or require both actions (e.g., using a hydraulic pump). The last cluster, 'dig/hammer', included tasks that soldiers would perform in setting up a field position, such as hammering in star pickets and digging shell scrapes.
A very small number of tasks (13 in total), such as manoeuvring in an explosive ordnance suit, did not fit within the 10 clusters and were classed as ‘other’. Once all tasks were clustered, the development of PES based on clusters rather than individual criterion tasks could be realised.
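The cluster definitions above amount to a small decision procedure. The following Python sketch paraphrases them as assumed rules: the 10 m carry threshold, 0.8 m end height and 90° rotation come from the text, but the ordering and composition of the checks are our own reading, not a protocol specified by the study.

```python
# A simplified sketch (assumed rules, paraphrasing the cluster definitions
# above) that assigns a manual handling task to one of the movement clusters.
def assign_cluster(task):
    """task: dict with keys such as 'action', 'carry_distance_m',
    'end_height_m', 'rotation_deg', 'posture', 'hold', 'platform_height_m'."""
    if task.get("action") in ("push", "pull"):
        return "push/pull"
    if task.get("action") == "drag":
        return "drag"
    if task.get("action") in ("dig", "hammer"):
        return "dig/hammer"
    if task.get("carry_distance_m", 0) >= 10:            # carries move >= 10 m
        # a final lift above ~0.8 m (knuckle height) ends the carry with a lift
        return ("lift-carry-lift" if task.get("end_height_m", 0) > 0.8
                else "lift-carry-lower")
    if task.get("posture") in ("seated", "crouching", "kneeling"):
        return "seated, crouching or kneeling lift"
    if task.get("rotation_deg", 0) >= 90:                # lower body fixed
        return "lift and twist"
    if task.get("hold"):                                 # prolonged hold/operate
        return "lift and hold"
    if task.get("platform_height_m") is not None:        # fixed external height
        return "lift to platform"
    return "lift to an anatomical height (without carry)"

print(assign_cluster({"action": "lift", "carry_distance_m": 100,
                      "end_height_m": 0.0}))
# → lift-carry-lower (e.g. the stretcher carry, which ends at the ground)
```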

6. Development of test options

Focusing on the cluster characteristics allowed for the targeted development of a limited number of fitness assessments that could predict performance across a range of criterion tasks. To establish relationships between tasks and test options, a number of representative criterion tasks were selected from each cluster. These representative cluster tasks were chosen to provide coverage of the range of tasks within the cluster, exhibiting the important differences in object, movement and task parameters that occur within the cluster. For example, there were multiple criterion tasks of varying masses that required a soldier to individually lift an item, with good coupling, from the ground to a 1.5 m platform, once. These common features allowed a single criterion task to be used to represent a cluster of like tasks. Task simulations were devised for each of the chosen representative cluster tasks that were measurable and scalable (e.g., the mass of a representative cluster task could be readily altered). Participants performed each representative cluster task and test maximally for task-to-test development (Fig. 1). Pearson's product-moment correlations or coefficients of determination were used to investigate the relationships developed. Sensitivity and specificity analyses were used to investigate the ability of a test to correctly classify an employee's performance on a representative cluster task.

Fig. 1. A schematic example of manual handling task-to-test development methodology.

The lift to platform cluster led to the development of a box lift and place test, which assesses muscular strength across multiple representative cluster tasks. The establishment of the box lift and place as a possible task-related predictive test was first identified through investigations that found a strong relationship to lifting capacity at different lift to platform cluster heights.21 This finding allowed for the development of a single box lift and place test to 1.50 m, ensuring content and face validity. Relationships were then developed that could explain the variation in task demands of the criterion tasks using the box lift and place test to 1.50 m as a reference point. Representative cluster tasks were performed maximally by participants to examine the impact of the different lifting characteristics: repetition,22 object type,6,23,24 posture,25 number of lifters24 and repetitive lift-carry-lifts to a platform.6,26 These representative cluster tasks provided coverage of the different lifting characteristics present within the cluster. Task-to-test relationships were then developed to ensure criterion validity, where strong relationships were found between representative cluster tasks and the box lift and place test.6,25,27

Multiple representative cluster tasks within the lift-carry-lower cluster were investigated, with specific focus on the stretcher carry criterion tasks, which ultimately resulted in the development of the jerry can carry test, which assesses muscular endurance. Possible tests were designed to be completed individually, while faithfully representing as many of the stretcher carry parameters as possible (e.g. mass, lift and carry position, and work:rest periods) to maximise content and face validity. Unilateral carries were excluded as potential test options: although they may have strong face validity for stretcher carry performance, unilateral carries have been shown to increase spinal compression and shear forces when compared with bilateral carries,28 and the majority of criterion lift-carry-lower tasks were determined to be bilateral. Using the stretcher carry as the reference point, strong task-to-test relationships were developed for a range of possible bilateral task-related predictive tests.
By assessing the sensitivity and impact of altering key test characteristics such as mass,29 speed30 and item type,30 we were able to ensure criterion validity. Having the stretcher carry as a stable constant allowed for an understanding of the impact of these different carry characteristics, which could allow comparison amongst other criterion tasks within the lift-carry-lower cluster.

While extensive research is required to first understand the separate effects of multiple factors on performance in the tasks and tests, the use of the clustering method quickly highlighted points of parity and points of difference. As a result, concentrated effort could be directed at understanding and evaluating the key variables that differed amongst the criterion tasks and then developing task-to-test relationships with a sample of representative criterion tasks. With task-to-test relationships developed and the impact of different factors within the cluster known, scaling could then be confidently applied to other criterion tasks to set standards without needing to test them directly. This also builds in the capacity to adjust standards should the task requirements change in the future. Another key advantage is that knowledge of the impact of different task characteristics on task and test performance is often transferable to other clusters. This cluster approach provides broad coverage of the criterion tasks, thereby maximising scientific defensibility by ensuring the recommended tests have high content, criterion and face validity. However, this cluster approach may not provide high-resolution sensitivity for all criterion tasks compared to what could be achieved by developing custom tests for each criterion task.
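The validity statistics referred to above (Pearson's product-moment correlation for task-to-test relationships, and sensitivity/specificity for classification at a cut-score) can be sketched as follows. The scores and the 30 kg cut-score are invented for illustration, not study data.

```python
# Sketch of the validity statistics named above; the scores and the 30 kg
# cut-score are invented for illustration, not study data.
from math import sqrt

def pearson_r(xs, ys):
    """Pearson's product-moment correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def sensitivity_specificity(task_pass, test_pass):
    """How well a test cut-score classifies pass/fail on the task."""
    tp = sum(a and b for a, b in zip(task_pass, test_pass))
    fn = sum(a and not b for a, b in zip(task_pass, test_pass))
    tn = sum(not a and not b for a, b in zip(task_pass, test_pass))
    fp = sum(not a and b for a, b in zip(task_pass, test_pass))
    return tp / (tp + fn), tn / (tn + fp)

task_score = [20, 25, 30, 35, 40, 45]   # representative cluster task scores (kg)
test_score = [22, 24, 31, 33, 41, 44]   # candidate predictive test scores (kg)
r = pearson_r(task_score, test_score)

task_pass = [s >= 30 for s in task_score]   # hypothetical task standard
test_pass = [s >= 30 for s in test_score]   # candidate test cut-score
sens, spec = sensitivity_specificity(task_pass, test_pass)
print(round(r, 3), sens, spec)  # a strong r with perfect classification here
```

In practice the cut-score would be tuned so that sensitivity and specificity are acceptable across the cluster's criterion tasks, which is the trade-off the sensitivity/specificity analyses above inform.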

7. Conclusion This paper outlined a framework and the processes in which manual handling tasks were profiled, and appropriate physical testing measures were established, within the Australian Army. Through a rigorous process of subjective and objective research procedures, a task list of 458 manual handling tasks and 88 manual handling criterion tasks was established. The practical challenges associated in the development of PES for a multi-trade organisation; in profiling the vast physical task duties of a service and the development a limited suite of assessments were addressed. By utilising a clustering and filtration approach we provide a method of developing task-to-test validity by narrowing down a large and diverse task list to more manageable clusters of like manual handling tasks. Criterion referenced performance standards can then be developed based on the physical requirements of the array of

G.L. Carstairs et al. / Journal of Science and Medicine in Sport 21 (2018) 1162–1167

soldiering tasks. These processes can be followed when developing PES for manual handling tasks in other occupations.

Acknowledgements

The authors would like to acknowledge and thank the members of the Physical Performance Team at Defence Science and Technology Group who assisted with data collection, whether leading or participating in running workshops or field observations.

Journal of Science and Medicine in Sport 21 (2018) 1168–1172


Positive, limited and negative responders: The variability in physical fitness adaptation to basic military training

Simon D. Burley a, Jace R. Drain b, John A. Sampson a, Herbert Groeller a,∗

a Centre for Human and Applied Physiology, Faculty of Science, Medicine and Health, University of Wollongong, Australia
b Land Division, Defence Science and Technology Group, Australia

ARTICLE INFO

Article history:
Received 15 February 2018
Received in revised form 24 June 2018
Accepted 28 June 2018
Available online 7 July 2018

Keywords: Health; Exercise; Occupational; Muscle; Fitness; Military

ABSTRACT

Objectives: To investigate the heterogeneity of physical adaptation in Australian Army recruits completing a 12-week basic military training regimen.
Design: A prospective research design.
Methods: Volunteer recruits (n = 195) completed 12 weeks of basic military training. Recruit physical fitness was assessed at week 1, weeks 6–8 and week 12. Recruits in the upper (75th) and lower (25th) quartiles for each assessment were then analysed using a repeated-measures two-way ANOVA. The relative magnitude of recruit adaptations was classified as positive response (Rpositive, ≥5%), limited response (Rlimited, >−5% to <5%) or negative response (Rnegative, ≤−5%).

No change (p > 0.05) in body mass, 0.3 kg (CI: −0.5:1.1), was recorded following 12 weeks of basic military training. However, body mass was observed to regress to the mean, with recruits in the lowest quartile gaining (p < 0.05) body mass after weeks 6–8, 2.5 kg (CI: 1.6:3.3), and week 12, 4.0 kg (CI: 3.0:5.1), of basic military training. In contrast, the heaviest 25% of recruits lost (p = 0.01) body mass: −3.6 kg (CI: −2.2:−4.9) and −4.3 kg (CI: −2.6:−6.0) at weeks 6–8 and week 12, respectively. An improvement (p < 0.001) in estimated cardiorespiratory fitness was observed after weeks 6–8, 4.9 mL·kg−1·min−1 (CI: 4.3:5.4), and week 12, 3.7 mL·kg−1·min−1 (CI: 3.0:4.4), of training (Table 1). An interaction (p < 0.001) was observed between the lower and upper quartiles at week 1, −10.9 mL·kg−1·min−1 (CI: −12.5:−9.3), weeks 6–8, −8.8 mL·kg−1·min−1 (CI: −10.4:−7.2), and week 12, −7.4 mL·kg−1·min−1 (CI: −9.1:−5.8), with the greatest improvements occurring in recruits with the lowest cardiorespiratory fitness (Fig. 1A). Upper-body muscular endurance improved (p < 0.001) after weeks 6–8, 6.6 reps (CI: 5.2:8.0), and week 12, 6.6 reps (CI: 5.2:8.0), compared to week 1; however, no change in push-up performance was observed between weeks 6–8 and week 12 (Table 1).
An interaction (p < 0.001) was observed between the lower and upper quartiles at week 1, −30.9 reps (CI: −34.7:−27.0), weeks 6–8, −21.1 reps (CI: −25.0:−17.3), and week 12, −20.1 reps (CI: −23.9:−16.2), of basic training (Fig. 1B). Furthermore, only recruits in the lowest quartile demonstrated improved (p < 0.001) push-up performance at weeks 6–8, 10.3 reps (CI: 7.8:12.8), and week 12, 10.4 reps (CI: 8.3:12.4), compared to week 1. One repetition maximum box lift strength improved by 2.7 kg (CI: 1.7:3.7) and 3.6 kg (CI: 2.5:4.7) at weeks 6–8 and week 12 of training, respectively, compared to week 1 (Table 1). An improvement (p = 0.018) in box lift performance was also observed between weeks 6–8 and week 12, of 0.9 kg (CI: 0.12:1.7). An interaction (p < 0.001) was observed between the lower and upper quartiles at week 1, 28.6 kg (CI: 32.3:24.9), weeks 6–8, 22.0 kg (CI: 25.7:18.3), and week 12, 22.2 kg (CI: 25.9:18.45), of basic training (Fig. 1C). However, compared to week 1, only the lower quartile displayed enhanced (p < 0.001) 1RM box lift strength at weeks 6–8, 4.2 kg (CI: 2.6:5.8), and week 12, 5.6 kg (CI: 2.6:5.8), with a 1.4 kg (CI: 0.3:2.4) improvement (p = 0.011) between weeks 6–8 and week 12 (Fig. 1C). In contrast, box lift performance in the upper quartile declined (p < 0.001) between week 1 and weeks 6–8, 2.3 kg (CI: 0.8:3.9), with no change (p > 0.05) observed in box lift performance after 12 weeks of basic training. Load carriage improved from week 1 by 118.5 s (CI: 100.1:136.8) and 116.6 s (CI: 96.4:136.8) in weeks 6–8 and week 12, respectively


S.D. Burley et al. / Journal of Science and Medicine in Sport 21 (2018) 1168–1172

Table 1
Physical performance changes after 6–8 weeks and 12 weeks of basic military training.

Measure                       Group    n     Week 1          Weeks 6–8         Week 12            Change Wk 1–6/8   Change Wk 1–12
1RM box lift (kg)             All      182   41.2 ± 12.0     43.9 ± 11.2 c     44.8 ± 10.9 c,d    8 ± 15%           11 ± 16%
                              Lower    49    27.3 ± 5.8      31.5 ± 7.9 c      32.8 ± 8.4 c,d     16 ± 16%          21 ± 16%
                              Upper a  49    55.8 ± 6.7 b    53.5 ± 8.9 b,c    55.0 ± 7.9 b       −4 ± 8%           −1 ± 11%
Push-ups (reps)               All      184   38.5 ± 12.7     45.1 ± 11.4 c     45.1 ± 10.8 c      21 ± 18%          22 ± 21%
                              Lower    52    24.2 ± 5.7      34.5 ± 8.4 c      34.5 ± 8.4 c       46 ± 36%          46 ± 30%
                              Upper a  47    55.1 ± 6.7 b    55.6 ± 9.6 b      54.6 ± 7.2 b       2 ± 16%           0 ± 14%
Est V̇O2peak (mL·kg−1·min−1)  All      178   43.7 ± 4.6      48.7 ± 4.6 c      47.5 ± 4.8 c,d     12 ± 7%           9 ± 9%
                              Lower    54    39.1 ± 0.8      44.2 ± 3.1 c      43.9 ± 3.9 c       13 ± 8%           12 ± 9%
                              Upper a  49    50.0 ± 3.2 b    53.0 ± 3.7 b,c    51.4 ± 4.6 b,c,d   6 ± 6%            3 ± 7%
Load carriage (s)             All      168   1267 ± 147.8    1149 ± 115.6 c    1150 ± 125.8 c     8 ± 11%           8 ± 8%
                              Lower    42    1460 ± 72.2     1257 ± 114.8 c    1257 ± 121.5 c     14 ± 8%           13 ± 8%
                              Upper a  42    1084 ± 62.7 b   1040 ± 75.1 b,c   1035 ± 78.5 b,c    2 ± 17%           4 ± 7%

Notes: Est V̇O2peak = peak oxygen consumption estimated using the multi-stage fitness test. All = complete data set; lower = bottom 25th percentile; upper = highest 75th percentile. All values are mean ± standard deviation.
a Denotes a significant interaction.
b Denotes a significant difference between the lower and upper quartiles.
c Denotes significant (p < 0.05) difference to week 1.
d Denotes significant (p < 0.05) difference to weeks 6–8.
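A point worth noting when reading Table 1: the change (%) columns appear to report the mean ± SD of individual recruits' percent changes (our reading, inferred from the ± SD notation), which need not equal the percent change of the group means. A quick check in Python, using the 1RM box lift "All" row, illustrates the distinction.

```python
# Percent change computed from the group means (Table 1, 1RM box lift,
# "All" group), for comparison with the tabulated 11 +/- 16%, which we
# take to be the mean of individual recruits' percent changes.
week1_mean = 41.2   # kg, mean 1RM box lift at week 1
week12_mean = 44.8  # kg, mean 1RM box lift at week 12

group_mean_change = (week12_mean - week1_mean) / week1_mean * 100
print(f"{group_mean_change:.1f}%")  # prints "8.7%"

# The two need not agree: the mean of ratios is not the ratio of means,
# particularly with the wide inter-individual spread (SD ~16%) reported.
```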

Fig. 1. Change in estimated V̇O2peak (A), 2-min push-ups (B), 1RM box lift (C) and 3.2 km load carriage (D) after 12 weeks of basic military training in the lower (25th percentile) and upper (75th percentile) quartiles. * Denotes significantly (p < 0.05) different from the lower quartile. Grey dots represent individual data points, with mean ± standard deviation overlaid. A positive change represents improved performance.

(Table 1). However, between weeks 6–8 and week 12 no change in load carriage performance was observed. An interaction (p < 0.001) in load carriage time was observed between the upper and lower quartiles at week 1, 376.7 s (CI: 329.4:424.1), weeks 6–8, 217.7 s (CI: 170.1:264.8), and week 12, 222.1 s (CI: 174.7:269.4), of basic training (Fig. 1D). Significant improvements in load carriage performance from week 1 were observed at weeks 6–8, 203.2 s (CI: 161.6:244.9) and 44.0 s (CI: 23.5:64.5), and week 12, 203.2 s (CI: 161.7:244.7) and 48.6 s (CI: 21.8:75.3), in the lower and upper quartiles, respectively. The proportional distribution of recruits classified as Rpositive, Rlimited or Rnegative was significantly (p < 0.001) different between the lower and upper quartiles for each of the four physical assessments (Table 2).
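The responder classification used throughout can be expressed as a small function. The sketch below encodes the thresholds as we read them (≥5% positive, ≤−5% negative, limited in between); the function name and structure are our own illustration, not the authors' code.

```python
def classify_response(pct_change: float) -> str:
    """Classify a recruit's relative change in a fitness measure.

    Thresholds follow the paper's scheme: >= 5% is a positive
    response, <= -5% is a negative response, anything between
    is a limited response.
    """
    if pct_change >= 5.0:
        return "positive"
    if pct_change <= -5.0:
        return "negative"
    return "limited"

# Hypothetical percent changes in 1RM box lift for three recruits
changes = [12.0, 1.5, -7.2]
print([classify_response(c) for c in changes])
# prints ['positive', 'limited', 'negative']
```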

S.D. Burley et al. / Journal of Science and Medicine in Sport 21 (2018) 1168–1172 Table 2 Classification of participants in each quartile according to relative responsiveness. Assessment

Responsiveness

Lower % (n)

Upper % (n)

All % (n)

Positive Limited Negative

84 (41) 12 (6)a 4 (2)

24 (12) 41 (20) 35 (17)

61 (112) 24 (43) 15 (27)

Positive Limited Negative

96 (50) 2 (1)a 2 (1)

28 (13) 40 (19) 32 (15)

71 (131) 15 (28) 14 (25)

Positive Limited Negative

81 (44) 17 (9)a 2 (1)

35 (17) 49 (24) 16 (8)

69 (122) 24 (43) 7 (13)

Positive Limited Negative

91 (38) 7 (3)a 2 (1)

50 (21) 43 (18) 7 (3)

71 (119) 24 (42) 4 (7)

1RM box lift

Push-ups

Est V˙ O2peak

Load carriage

Notes: Est V˙ O2peak = peak oxygen consumption estimated using the multi-stage fitness test. Responsiveness was classified as a change in performance; ≥5% Positive, >−5% to
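The reported difference (p < 0.001) in responder distributions between quartiles can be checked with a chi-square test of independence. The sketch below is our own pure-Python illustration using the 1RM box lift counts from Table 2; it computes only the Pearson test statistic (no p-value), which can be compared against the χ² critical value for 2 degrees of freedom (13.82 at p = 0.001).

```python
# Chi-square test of independence on Table 2 responder counts
# (1RM box lift): lower vs upper quartile across three categories.
lower = [41, 6, 2]    # positive, limited, negative (n)
upper = [12, 20, 17]

def chi_square_statistic(row_a, row_b):
    """Pearson chi-square statistic for a 2-row contingency table."""
    col_totals = [a + b for a, b in zip(row_a, row_b)]
    grand = sum(col_totals)
    stat = 0.0
    for row in (row_a, row_b):
        row_total = sum(row)
        for observed, col_total in zip(row, col_totals):
            expected = row_total * col_total / grand
            stat += (observed - expected) ** 2 / expected
    return stat

chi2 = chi_square_statistic(lower, upper)
print(f"chi-square = {chi2:.2f}")  # prints "chi-square = 35.25"
```

A statistic of about 35.2 on 2 degrees of freedom is well beyond the 13.82 critical value, consistent with the p < 0.001 reported for this assessment.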