Neil Gibson: Director of Sport, Performance and Health at Oriam: Scotland’s Sports Performance Centre
John Hill: Head of Sport Science at MK Dons FC
John Fitzpatrick: Sports Scientist at Newcastle United FC
Robert McCunn: Performance Sports Manager at Oriam: Scotland’s Sports Performance Centre
A recent article in Professional Strength & Conditioning entitled ‘The evidence-based straitjacket’ by Prof. Ian Jeffries raised some interesting and thought-provoking topics around the interaction between applied practice and academic research. We felt there was room, in some cases, for an alternative perspective to be presented. What follows is our take on some of the key issues, including experience, innovation, universities and the academic process.
We will start, however, by offering an alternative to the term ‘evidence-based’: if this term is seen as overly restrictive, the practitioner may wish to consider ‘evidence-informed’ as a proxy. This subtle change in terminology perhaps offers a greater degree of freedom in their work philosophy, allowing the extant literature to guide their work without controlling it.
Experience

This is a difficult entity to quantify; we have all heard the oft-stated question: do you have ten years’ experience, or one year’s experience repeated ten times? In contrast, what research allows practitioners to do, in a range of formats, is present their work, after reflection and self-evaluation, to a panel of their peers for critique and review. This need not always be in an academic format; there are a number of blogs, industry quarterlies and national governing body publications that allow us to share our ideas and, hopefully, provide the reader with new ones. If, however, we rely on someone’s experience, be that time in the trade or the number of organisations they have worked with, it is difficult to assess how effective their practices were. In lectures to students one should always be keen to differentiate between opinion (based on anecdote and experience) and what there is objective data to support. Perhaps this would be a useful tack to take, especially for those sharing information on social media.

A recent article on the newly formed ‘Sport Performance & Science Reports’ platform stresses the importance and usefulness of employing magnitude-based inferences when we assess the change someone has made over time. We would suggest that without such approaches, clearly steeped in the realm of research, it is difficult to assess how effective a coach has been, irrespective of their standing or perceived experience. Thanks largely to the work of people like Will Hopkins and Martin Buchheit, resources that help those of us who are not necessarily statistically minded to employ these methods are now publicly available in easy-to-manage formats. Even if we are not pursuing publication, there is no reason why practitioners cannot adopt these research skills when interpreting the data they have collected.
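To make the magnitude-based idea concrete, one common starting point is to compare an observed change against the smallest worthwhile change (SWC), often taken as 0.2 times the between-athlete standard deviation. The sketch below is purely our illustration of that idea, not the method from the cited platform; the function names, the 0.2 factor and the jump-height data are all assumptions for the example.

```python
# Illustrative sketch only: classifying an observed change against the
# smallest worthwhile change (SWC). Names, thresholds and data are
# hypothetical, not taken from the article discussed above.
from statistics import stdev

def smallest_worthwhile_change(baseline_scores, factor=0.2):
    """SWC is often approximated as 0.2 x the between-athlete SD."""
    return factor * stdev(baseline_scores)

def classify_change(observed_change, swc):
    """Label a change relative to the SWC threshold."""
    if observed_change > swc:
        return "worthwhile improvement"
    if observed_change < -swc:
        return "worthwhile decline"
    return "trivial / unclear"

# Example: squad countermovement-jump heights (cm) at baseline
squad = [38.1, 41.3, 36.7, 44.2, 39.8, 42.5]
swc = smallest_worthwhile_change(squad)
print(classify_change(2.1, swc))  # a 2.1 cm gain versus the squad SWC
```

A full magnitude-based inference would also account for measurement error and uncertainty in the change score; this sketch shows only the thresholding step.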
Information gathering need not always be quantitative in nature. We would stress that one cannot ignore the perceptions of athletes and coaches who have experienced the benefit of effective and enjoyable training programmes; look good, feel good, play good is as true today as it was before the proliferation of research around strength and conditioning. As such, qualitative analysis through interviews and focus groups provides an alternative approach to support and validate the efficacy of our work. As 007 and his quartermaster state, ‘youth is no guarantee of innovation and age is no guarantee of efficiency.’
A lack of innovation
Whilst it is difficult to generalise, we would argue that there still exists a high degree of innovation within the field of strength and conditioning. We have rarely, if ever, visited other clubs or institutions and left without learning something new or seeing a different way of implementing training concepts we were already familiar with. We certainly don’t think that research, or group-think, has eroded this at all. What may be stifling innovation and creativity, however, is the structure of the departments that exist within sporting organisations.
Many practitioners working in the industry started their careers at a time when sport science departments were in their infancy and, as such, benefited from regular interactions with coaches, medical staff and other disciplines within sport. This allowed many to make their own decisions, and mistakes, generating new knowledge from the information presented to them from a range of different sources. It is worth pausing here: knowledge is information that is personalised, that we can interpret and link to the environment within which we find ourselves. Often this exists as tacit knowledge that we need, somehow, to present to the outside world by externalising it, in order for others to see our perspective and, where appropriate, learn from it. Prof. Jeffries also expressed this valid sentiment in the aforementioned article.
Unfortunately, as the industry has grown, so too have the departments which house its practitioners; in tandem, the opportunities for young practitioners to interact with other disciplines, athletes and coaches have diminished. Individuals are recruited into roles that are niche and fairly narrow; we work in silos designated by our profession and, as such, are not able to assimilate information from a variety of sources, many of which are important to athletic development. It is this, we suspect, that might be responsible for a perceived loss of innovation within the field, if indeed it exists at all.
Universities

We have heard from a number of sources that universities need to do more to prepare students for high performance sport, and perhaps this is true. There is certainly a disconnect between what is taught and what is discussed within professional teams and national governing bodies regarding the biggest challenges they face. Of course, this is true of many disciplines, especially those where the lecturers are no longer interacting regularly with the field or topic on which they are teaching. In America, some institutions offer industry sabbaticals for academic staff who want to recalibrate their skill set to more closely mirror the requirements industry places on the students they teach. This seems like a good idea and one that sport could adopt; we are sure there would be no shortage of academic staff who would value the opportunity to spend some time engaging with and working alongside practitioners at the sharp end of sport.
Despite this, it is still academia that is proving to be the most fertile breeding ground for new and aspiring coaches, not to mention those further along in their career trajectories. It would be difficult to find anyone in a position of power or influence within the realm of sport science or strength and conditioning who hasn’t, at some point, engaged with academia through the pursuit of a degree. Just as not every athlete who embarks on a career in sport will make it to the top level, not every sport student will end up working in sport. What an academic education does provide, however, is the ability to be critical in one’s thought processes; to instil the ability to decide what might be appropriate for the athletes you are working with and what isn’t, based on the available evidence; to query and probe that which is presented as fact. Far from being barriers to innovation, these are skills that ensure students are not overly influenced by conjecture and opinion but pursue lines of enquiry that provide useful and useable information about the interventions they choose. The important question is never ‘show me what you do’ but ‘show me what you don’t do, and why’; the latter should reveal whether a sound rationale has been developed and why certain ways of working were, either through trial and error or desk-based research, deemed inappropriate.
The academic process
For those of us with a foot in this camp, publications are the currency we deal in and, along with other less formal channels of communication, are an effective way to share information (not knowledge; it doesn’t become that until someone reads it and internalises what they consider to be the key messages). That is not to say that there are no issues. Publication bias exists, so even when you read a review it is likely that the findings will be skewed in favour of positive outcomes (few journals will publish a study that reports no effect, unfortunately). There is also a disconnect between those writing the materials and those reading them; it is still fairly common for national governing bodies, national institutes and professional teams not to provide journal access for their staff, the ones working directly with athletes. This is perhaps why people such as Yann Le Meur have had such success sharing studies via infographics, making them accessible to a larger audience.
Things are changing, however: there are now numerous opportunities for postgraduate students to pursue Masters degrees and PhDs whilst working in, and being funded by, sport, closing the circle between academia and performance. Furthermore, the increase in the number of journals on the topic of strength and conditioning has allowed articles with different approaches to be published and shared with the public. Universities are also becoming better at providing industry placements for their students, so that they can combine their understanding of the research base with practical skills gained on the job. For this to grow, however, sporting organisations need to decide what value they place on the discipline and whether it can be a viable career for those who choose to pursue it. For that to be the case, there needs to be a more formal process of continued professional development that employers can use to determine just how ‘experienced’ potential applicants are. In the absence of such a scheme, publications provide a useful way of evidencing our own willingness to learn, develop new skills and share our information with a wider audience.
Another feature of the academic process is that it gives practitioners the skills and competencies to conduct research and decipher data. Key principles, such as assessing the reliability and validity of the tests practitioners conduct on a daily basis, are often forgotten in the applied world. Furthermore, critically assessing the success of one’s interventions is essential to providing best practice. Without the skills and knowledge to perform such assessments, practitioners are simply shooting in the dark. The academic process gives practitioners the ability to generate their own evidence-informed practice.
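As one concrete example of the reliability checks mentioned above, a practitioner can estimate the typical error of a test from two repeated trials on the same athletes. The sketch below is our own illustration of that idea under the usual definition (SD of the trial-to-trial differences divided by the square root of two); the sprint-time data and names are invented for the example.

```python
# Illustrative sketch of test-retest reliability via typical error.
# Data and names are hypothetical, purely for demonstration.
from math import sqrt
from statistics import stdev

def typical_error(trial1, trial2):
    """Typical error = SD of the trial-to-trial differences / sqrt(2)."""
    diffs = [b - a for a, b in zip(trial1, trial2)]
    return stdev(diffs) / sqrt(2)

# Two repeated 10 m sprint tests (seconds) on the same five athletes
trial1 = [1.78, 1.82, 1.75, 1.90, 1.84]
trial2 = [1.80, 1.79, 1.77, 1.88, 1.86]
te = typical_error(trial1, trial2)
print(f"typical error = {te:.3f} s")
```

Knowing this noise level lets a practitioner judge whether an apparent change in an athlete's score is real or simply measurement error.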
Out of the straitjacket and into the water park
Rather than a straitjacket, we see being research-informed as more like a popular ride at water parks: ‘the rapids’. From a relatively calm and spacious lagoon (where you generate the question) you move into the rapids (the research), where you are tossed around somewhat, often in an uncomfortable manner, before being ejected back into the lagoon knowing a little more (albeit not much) than before you started. This is research: you channel your energy into answering a specific question and, once you have addressed it, you start the process again in a more informed state. Unlike the straitjacket, you are never really free of the process, as there are always questions to be answered. It is the pursuit of these answers that allows us to be innovative and advance our field, creating shared knowledge through the information we generate.