Objective: To describe the injury rates in first team rugby league in terms of those injuries that require missed playing time and those that do not. Design: A pooled data analysis from 2 independent databases. Setting: Rugby league match and training environments over several seasons from 1990 to 2003. Main outcome measures: Injuries were reported as rates per 1000 hours of participation and as percentages, with their associated 95% confidence intervals (CIs). Results: A total of 1707 match injuries were recorded. Of these, 257 required players to miss the subsequent match; the remaining 1450 did not, representing 85% (95% CI, 83-87) of all injuries received and recorded. The ratio of non-time-loss (NTL) to time-loss (TL) injuries was 5.64 (95% CI, 4.96-6.42). There were 450 training injuries, of which 81 were TL injuries and 369 were NTL injuries. The NTL training injury rate was 4.56 (95% CI, 3.58-5.79) times higher than the TL injury rate. Conclusions: Non-time-loss injuries represent the largest proportion of injuries in rugby league. If NTL injuries are not recorded, the workload of practitioners is likely to be severely underestimated.
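The reported ratio and percentage can be reproduced from the raw counts. A minimal sketch, assuming the conventional log-normal approximation for the rate-ratio CI and a Wald interval for the percentage (the abstract does not name the interval methods used; the exposure hours cancel because both injury types share the same denominator):

```python
import math

# Counts from the abstract: 1450 NTL vs 257 TL match injuries over the
# same exposure hours, so the rate ratio reduces to a ratio of counts.
ntl, tl = 1450, 257

# Ratio with a 95% CI via the log-normal approximation (an assumption;
# the paper does not state which interval method it used).
ratio = ntl / tl
se_log = math.sqrt(1 / ntl + 1 / tl)
lower = math.exp(math.log(ratio) - 1.96 * se_log)
upper = math.exp(math.log(ratio) + 1.96 * se_log)
print(f"NTL:TL ratio = {ratio:.2f} (95% CI, {lower:.2f}-{upper:.2f})")

# Percentage of all match injuries that were NTL, with a Wald 95% CI.
n = ntl + tl
p = ntl / n
se_p = math.sqrt(p * (1 - p) / n)
print(f"NTL share = {100 * p:.0f}% "
      f"(95% CI, {100 * (p - 1.96 * se_p):.0f}-{100 * (p + 1.96 * se_p):.0f})")
```

Both outputs land close to the published figures (5.64, 95% CI ≈ 4.9-6.4; 85%, 95% CI 83-87), which suggests approximations of this kind were used.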
Background The incidence of lumbar spine injury in rowing is high. Ergometer rowing and fatigue have been cited as risk factors for this injury. Objective To compare the kinematics of the lumbar spine in rowers on an ergometer and in a sculling boat during a fatiguing protocol. Design Lumbar spine kinematics were assessed in the sagittal plane using a twin-axis electrogoniometer. Setting A laboratory- and field-based study of elite-level rowing. Participants 19 male rowers, mean age 24.2 (3.7) years, mean weight 82.5 (8.4) kg and mean height 1.88 (0.05) m. Assessment of risk factors Maximum range of lumbar flexion was measured before each test. Sagittal plane lumbar spine angular kinematics were measured during an ergometer rowing trial comprising an incremental ‘step test’ on a Concept 2 ergometer. The trial was repeated in a sculling boat on the water. Main outcome measurements Pre-test maximum flexion and maximum lumbar flexion during each test were compared, as was mean sagittal flexion in the lumbar spine during the first and last stages of the step test. The findings of ergometer and boat rowing were compared. Results Maximum sagittal flexion was 113% (15.2) of pre-test range during ergometer rowing and 104.1% (10.2) during boat rowing. Lumbar flexion increased over the course of testing, and the increase was significantly greater on the ergometer (mean change +4.4° (0.85)) than in the boat (mean change +1.3° (1.1)) (mean difference 3.1°, 95% CI 0.3 to 5.9, t=2.28, p=0.035). Conclusion Rowers experience a very high level of lumbar spine flexion, which increases over the course of a rowing trial. Ergometer rowing results in significantly greater increases in sagittal lumbar spine motion than boat rowing.
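The ergometer-versus-boat result reads as a paired comparison of change scores across the 19 rowers. A minimal sketch of that analysis, using made-up per-rower values since the abstract reports only summary statistics:

```python
import numpy as np
from scipy import stats

# Hypothetical change-in-flexion values (degrees); illustration only,
# loosely matching the reported means. Real per-rower data are not given.
rng = np.random.default_rng(0)
ergometer_change = rng.normal(4.4, 2.0, size=19)  # change on the ergometer
boat_change = rng.normal(1.3, 2.0, size=19)       # change in the boat

# Paired t-test, as each rower completed both trials.
t, p = stats.ttest_rel(ergometer_change, boat_change)

# Mean difference with its 95% CI from the t distribution.
diff = ergometer_change - boat_change
ci = stats.t.interval(0.95, df=len(diff) - 1,
                      loc=diff.mean(), scale=stats.sem(diff))
print(f"mean difference = {diff.mean():.1f} deg, "
      f"95% CI {ci[0]:.1f} to {ci[1]:.1f}, t = {t:.2f}, p = {p:.3f}")
```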
Managing injury risk is important for maximising athlete availability and performance. Although athletes are inherently predisposed to musculoskeletal injuries by participating in sports, etiology models have illustrated how susceptibility is influenced by repeated interactions between the athlete (i.e. intrinsic factors) and environmental stimuli (i.e. extrinsic factors). Such models also reveal that the likelihood of an injury emerging across time is related to the interconnectedness of multiple factors, culminating in a pattern of either positive (i.e. increased fitness) or negative (i.e. injury) adaptation. The process of repeatedly exposing athletes to workloads in order to promote positive adaptations whilst minimising injury risk can be difficult to manage. Etiology models have highlighted that preventing injuries in sport, as opposed to reducing injury risk, is likely impossible given our inability to appreciate the interactions of the factors at play. Given these uncertainties, practitioners need to be able to design, deliver, and monitor risk management strategies that ensure low susceptibility to injury is maintained during pursuits to enhance performance. The current article discusses previous etiology and injury prevention models before proposing a new operational framework.
Background: Total arterial occlusive pressure (AOP) is used to prescribe pressures for surgery, blood flow restriction (BFR) and ischemic preconditioning (IPC). AOP is often measured in a supine position; however, the influence of body position on AOP measurement is unknown and may affect the level of occlusion achieved in different positions during BFR and IPC. The aim of this study was therefore to investigate the influence of body position on AOP. Methods: Fifty healthy individuals (age = 29 ± 6 y) underwent AOP measurements on the dominant lower limb in supine, seated and standing positions in a randomised order. AOP was measured automatically using the Delfi Personalised Tourniquet System device, with each measurement separated by 5 min of rest. Results: AOP was significantly lower in the supine position than in the seated position (187.0 ± 32.5 vs. 204.0 ± 28.5 mmHg, p < 0.001) and the standing position (187.0 ± 32.5 vs. 241.5 ± 49.3 mmHg, p < 0.001), and significantly higher in the standing position than in the seated position (241.5 ± 49.3 vs. 204.0 ± 28.5 mmHg, p < 0.001). Discussion: AOP measurement is body-position dependent; thus, for accurate prescription of occlusion pressure during surgery, BFR and IPC, AOP should be measured in the position intended for the subsequent application of occlusion.
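A sketch of the pairwise analysis this implies, with hypothetical per-participant readings because the abstract gives only group summaries; paired t-tests with a Bonferroni correction are an assumed, conventional choice rather than the paper's stated procedure:

```python
from itertools import combinations

import numpy as np
from scipy import stats

# Hypothetical AOP readings (mmHg) for the 50 participants in each
# position, loosely matching the reported means/SDs; illustration only.
rng = np.random.default_rng(1)
aop = {
    "supine":   rng.normal(187.0, 32.5, size=50),
    "seated":   rng.normal(204.0, 28.5, size=50),
    "standing": rng.normal(241.5, 49.3, size=50),
}

# All pairwise paired t-tests, Bonferroni-corrected for 3 comparisons.
pairs = list(combinations(aop, 2))
for a, b in pairs:
    t, p = stats.ttest_rel(aop[a], aop[b])
    print(f"{a} vs {b}: t = {t:.2f}, corrected p = {min(p * len(pairs), 1.0):.4f}")
```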
The normal distribution is the foundation of many statistical analysis techniques. These so-called ‘parametric methods’ use the parameters of the distribution (mean, standard deviation) as part of their calculations. When data are analysed using parametric statistics, certain conditions should be met for those statistics to be applied correctly. One of these conditions is that the data are normally distributed, and some suggest that this should be checked early in any analysis [1, 2]. Others, however, suggest such checks are unnecessary [3], arguing that normality is not an important assumption [4] and that many parametric tests are ‘robust’ and can deal with non-normal data distributions [3]. Yet readers of research papers seek assurance that the data analysis is appropriate [5]. Whatever the outcome of that debate, Thode [6] described approximately 400 methods for testing normality. Many options are therefore available to researchers, ranging from informal plotting through to formal hypothesis tests of the null hypothesis that ‘the variable being examined follows a normal distribution’ [1]; a P value below a given significance level suggests departure from normality. Researchers need to be aware of these techniques so that they can determine their analysis options and make sure the data contain no surprises. The purpose of this paper is to outline some of the techniques that can be used.
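As a minimal sketch of both ends of that range, informal plotting and a formal hypothesis test (Shapiro-Wilk is one common choice among the many methods catalogued, not a prescription from this paper):

```python
import numpy as np
import matplotlib.pyplot as plt
from scipy import stats

# Example variable: a skewed sample that should fail a normality check.
rng = np.random.default_rng(42)
x = rng.exponential(scale=2.0, size=80)

# Informal check: a Q-Q plot of the sample against the normal distribution.
stats.probplot(x, dist="norm", plot=plt)
plt.title("Normal Q-Q plot")
plt.show()

# Formal check: Shapiro-Wilk tests the null hypothesis that the sample
# came from a normal distribution; a small P suggests departure from it.
w, p = stats.shapiro(x)
print(f"Shapiro-Wilk: W = {w:.3f}, P = {p:.4f}")
if p < 0.05:
    print("Departure from normality suggested; consider alternatives.")
```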
Evidence regarding the incidence of injury to forwards and backs in the game of rugby league is extremely limited. A four-year prospective study of all injuries at one professional rugby league club was therefore conducted. All injuries received during match play were recorded, and those for forwards and backs were compared. Forwards had a higher overall rate of injury than backs (139.4 [124.2-154.6] vs. 92.7 [80.9-104.6] per 1000 player hours, P < 0.00006). Forwards had a higher rate of injuries to all body sites with the exception of the ankle and the 'others' category of injury, with significantly higher rates for the arm (11.6 [6.9-16.3] vs. 3.9 [1.4-6.4] per 1000 player hours, P = 0.005) and the head and neck (53.9 [43.9-63.8] vs. 25.0 [18.7-31.4] injuries per 1000 player hours, P < 0.00006). Forwards also had significantly more contusions (17.1 vs. 7.3 per 1000 player hours, z = 2.85, P = 0.0044), lacerations (26.7 vs. 13.8 per 1000 player hours, z = 2.92, P = 0.0035) and haematomas (20.6 vs. 11.6 per 1000 player hours, z = 2.29, P = 0.02) than backs. Forwards were also more likely to be injured when in possession of the ball (70.5 [59.2-81.7] vs. 38.0 [30.2-45.7] per 1000 player hours) and when tackling (33.2 [25.3-41.1] vs. 16.8 [11.6-22.1] per 1000 player hours). The higher rates of injury experienced by forwards most likely result from their greater physical involvement in the game, both in attack and in defence.
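The z values quoted are consistent with a normal-approximation comparison of two Poisson rates. A minimal sketch, with hypothetical counts and exposure hours since the abstract reports rates but not the underlying player-hours:

```python
import math

def rate_z_test(events_a, hours_a, events_b, hours_b):
    """Normal-approximation z-test comparing two incidence rates,
    expressed per 1000 player-hours."""
    ra = events_a / hours_a * 1000
    rb = events_b / hours_b * 1000
    # Poisson variance of each rate is rate / exposure (in 1000-h units).
    se = math.sqrt(ra / (hours_a / 1000) + rb / (hours_b / 1000))
    return ra, rb, (ra - rb) / se

# Hypothetical counts and player-hours chosen so the contusion rates and
# z value land near the reported 17.1 vs 7.3 per 1000 h, z = 2.85.
ra, rb, z = rate_z_test(events_a=35, hours_a=2047, events_b=15, hours_b=2055)
print(f"forwards {ra:.1f} vs backs {rb:.1f} per 1000 h, z = {z:.2f}")
```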
This study examined the cuff-to-limb interface pressure during blood flow restriction (BFR), and the accompanying perceptual and mean arterial pressure responses, in different BFR systems. Eighteen participants attended three experimental sessions in a randomised, crossover, counterbalanced design. Participants underwent inflations at 40% and 80% of limb occlusive pressure (LOP) at rest and completed 4 sets of unilateral leg press exercise at 30% of one repetition maximum with BFR at 80% LOP. A different BFR system was used in each session: an automatic rapid-inflation (RI) system, an automatic personalized tourniquet (PT) system, and a manual handheld pump and sphygmomanometer (HS) system. Interface pressure was measured using a universal interface device with pressure sensors. Perceived exertion and pain were measured after each set; mean arterial pressure (MAP) was measured pre-, 1 minute post- and 5 minutes post-exercise. At rest, interface pressure was lower than the set pressure in all BFR systems (P < .05). During exercise, interface pressure was, on average, 10 ± 8 and 48 ± 36 mm Hg higher than the set pressure in the RI and HS systems, respectively (P < .01), with no difference observed in the PT system (P > .05). Pain and exertion were greater in sets 3 and 4 in the RI and HS systems than in the PT system (P < .05). MAP was higher in the RI and HS systems than in the PT system at 1 minute and 5 minutes post-exercise (P < .05). BFR systems applying higher pressures amplify mean arterial pressure and perceptual responses. Automatic BFR systems appear to regulate pressure effectively within an acceptable range during BFR exercise.