Research Article

Mismatch Between Physiological Readiness and Biomechanical Load Capacity During Adolescent Growth: An 18-Month Longitudinal Risk Analysis

  • Dr. Christopher Alfiero, PhD, Performance Technology Division, Eternal, San Rafael, California, USA. ORCID: 0009-0001-9806-3506
  • Dr. Idara A. Okon, PhD, Department of Physiology, Faculty of Biomedical Sciences, Kampala International University-Western Campus, Uganda. ORCID: 0000-0003-3256-5404
  • Alireza Fatahian, Department of Nutrition and Movement Sciences (NUTRIM), Maastricht University, Netherlands. ORCID: 0000-0002-7554-2475
  • Jorge Estañán Martínez, Department of Sports Science, EftCiencia & i3 Sport, Valencia, Spain. ORCID: 0009-0009-2524-5227
  • Achouri Imen, PhD, Sports Science, Physical Education, University of Sfax, Sfax, Tunisia. ORCID: 0000-0003-1051-6978
  • Sunita Malhotra, MSc, Clinical Research & Ethical Board Coordinator, MMSx Authority Institute, USA. ORCID: 0009-0007-2279-9764

Licence

This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (CC BY 4.0).

Conflict of Interest

The authors declare no competing interests.

Funding

No external funding was received.

Abstract

Background:

Adolescent athletes often display rapid improvements in physiological readiness—such as strength and power—without corresponding maturation of biomechanical load tolerance. This study examines the mismatch between internal readiness markers and external mechanical capacity.

Methods:

We conducted an 18-month longitudinal study of 32 adolescent athletes (18 male, 14 female; aged 12–16 years at baseline) involved in high-impact sports. Using integrated datasets including growth metrics, force-time variables from jump-landing tasks, movement variability, and injury incidence, we calculated a novel “Readiness-Capacity Mismatch Index.”

Results:

Injury risk peaked during periods where physiological outputs outpaced neuromuscular coordination and tissue adaptation, particularly during rapid growth spurts. The Mismatch Index was significantly higher in athletes who sustained an injury than in those who did not (p < .0001). Across all timepoints, 16.4% of athletes cleared by traditional, physiology-based fitness tests failed to meet biomechanical readiness criteria; this discordant group had a 47.6% injury rate, compared with 31.2% in the group cleared by both standards.

Conclusion:

These results challenge conventional return-to-play and progression models that rely heavily on physiological benchmarks. The study advocates for the integration of biomechanically-informed readiness screening to prevent load-related injuries during the critical developmental window of adolescence.

Introduction

The adolescent athlete represents a unique and vulnerable population. This period is characterized by the adolescent growth spurt (AGS), a phase of rapid, non-linear changes in body size, structure, and function [1]. While athletes become demonstrably stronger, faster, and more powerful during this time, these physiological gains can create a dangerous illusion of athletic maturity. The underlying musculoskeletal system—bones, tendons, and neuromuscular control pathways—often lags behind, resulting in a critical mismatch between an athlete’s capacity to produce force and their ability to tolerate and control it [2, 3]. This mismatch is a primary driver of the high incidence of overuse and non-contact injuries seen in youth sports.

During the AGS, long bones grow rapidly, leading to a temporary disruption in limb coordination and a decrease in relative strength as muscles and tendons struggle to keep pace [4]. This phenomenon, often termed “adolescent awkwardness,” can manifest as altered movement patterns, such as increased knee valgus during landing, a well-established risk factor for anterior cruciate ligament (ACL) injury [5, 6]. The growth plates (physes) are cartilaginous and mechanically weaker than the surrounding bone, making them particularly susceptible to injury from repetitive loading and leading to conditions such as Osgood-Schlatter disease or Sever’s disease [7].

Despite this knowledge, standard practices for athletic progression and return-to-play decisions in adolescents remain heavily reliant on physiological benchmarks. An athlete may be cleared to play based on achieving a certain level of strength, passing a battery of hop tests, or demonstrating adequate cardiovascular fitness [8]. While these markers are important, they fail to capture the athlete’s biomechanical readiness. They do not tell us if the athlete can safely attenuate landing forces, maintain dynamic joint stability under fatigue, or exhibit the movement variability necessary to adapt to the chaotic environment of sport. This creates a scenario where an athlete who is “physiologically ready” is placed back into a high-load environment with a biomechanical system that is unprepared, significantly elevating their risk of re-injury or a new, more severe injury [9].

This study introduces the concept of a “Readiness-Capacity Mismatch Index” to quantify this disparity. We hypothesize that injury risk in adolescent athletes is not simply a function of load, but of the mismatch between their rapidly advancing physiological capacity and their lagging biomechanical competency. Through a longitudinal analysis of a cohort of young athletes, we aim to:

  1. track the evolution of this mismatch across the adolescent growth spurt;
  2. determine if the magnitude of this mismatch is predictive of injury incidence; and
  3. expose the potential dangers of relying solely on traditional, physiology-based clearance tests.

Methods

Participants

Thirty-two adolescent athletes (18 male, 14 female) aged 12–16 years at baseline were recruited from local sports clubs. All participants were engaged in high-impact sports (soccer, basketball, volleyball, track) and trained a minimum of 8 hours per week. The study was approved by the MMSx Authority Institute IRB, and both participants and their legal guardians provided informed consent/assent.

Study Design

This was an 18-month longitudinal study with data collection every 6 months (Baseline, 6-Month, 12-Month, 18-Month). At each timepoint, athletes underwent a comprehensive assessment including anthropometrics, physiological testing, and biomechanical analysis.

Data Collection

Anthropometrics: Height and weight were measured to calculate growth velocity.
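As an illustration, annualized growth velocity can be derived from height measured at successive visits. The study does not publish its exact computation, so this is a minimal sketch under that assumption (the function name is ours):

```python
# Hypothetical sketch: annualized growth velocity (cm/year) from height
# measured at two visits separated by a known interval in months.
def growth_velocity(height_prev_cm: float, height_curr_cm: float,
                    interval_months: float = 6.0) -> float:
    """Height change between visits, scaled to a per-year rate."""
    return (height_curr_cm - height_prev_cm) * (12.0 / interval_months)

# A 4 cm gain over a 6-month interval corresponds to 8 cm/year,
# a rate consistent with peak height velocity in many adolescents.
```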

Physiological Readiness: Assessed via vertical jump height (power), isometric mid-thigh pull (strength), and a validated submaximal aerobic fitness test (VO2max estimate).

Biomechanical Capacity: Assessed using 3D motion capture and force plates during a drop-jump landing task. Key variables included landing stiffness, dynamic knee valgus, and load attenuation (the body’s ability to absorb impact forces).

Injury Surveillance: All non-contact, time-loss injuries were prospectively recorded throughout the 18-month period by the teams' certified athletic trainers.

Readiness-Capacity Mismatch Index

A composite “Physiological Readiness Score” and a “Biomechanical Capacity Score” were created by normalizing and combining the respective test results. The Mismatch Index was calculated as:

(Physiological Readiness Score) – (Biomechanical Capacity Score).

A higher positive score indicates that physiological ability is far outpacing biomechanical control.
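A minimal sketch of this calculation, assuming the composite scores have already been formed and that normalization is done as within-cohort z-scores (the paper does not specify its normalization method):

```python
import statistics

def z_scores(values):
    """Z-score normalize a set of composite scores within the cohort."""
    mu = statistics.mean(values)
    sd = statistics.stdev(values)
    return [(v - mu) / sd for v in values]

def mismatch_index(readiness_scores, capacity_scores):
    """Readiness-Capacity Mismatch Index per athlete: normalized
    physiological readiness minus normalized biomechanical capacity.
    Positive values indicate physiology outpacing biomechanical control."""
    r = z_scores(readiness_scores)
    c = z_scores(capacity_scores)
    return [ri - ci for ri, ci in zip(r, c)]
```

When readiness and capacity rank athletes identically, the index is zero for everyone; it grows only where the two rankings diverge, which is the disparity the index is designed to capture.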

Statistical Analysis

Injury rates were compared across different growth velocity groups (Slow, Moderate, Rapid) using chi-square tests. The Mismatch Index of athletes who sustained an injury was compared to those who did not using a Mann-Whitney U test. The discordance between traditional fitness-based clearance and biomechanical clearance was also analyzed. The significance level was set at α = 0.05.
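Both tests can be run with standard tools. The counts and index values below are hypothetical placeholders for illustration, not the study's data:

```python
from scipy import stats

# Chi-square test: injured vs uninjured counts across the three
# growth velocity groups (rows: Slow, Moderate, Rapid; illustrative counts).
contingency = [[2, 8],
               [3, 7],
               [4, 6]]
chi2, p_chi, dof, _ = stats.chi2_contingency(contingency)

# Mann-Whitney U test: Mismatch Index values for injured vs
# uninjured athletes (illustrative values).
injured = [28.5, 30.1, 26.7, 27.9]
uninjured = [15.2, 17.0, 16.3, 14.8]
u_stat, p_u = stats.mannwhitneyu(injured, uninjured, alternative="two-sided")
```

The Mann-Whitney U test is a sensible choice here: with n = 32 the index values cannot be assumed normally distributed, and the rank-based test makes no such assumption.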

Results

Injury Risk and Growth Velocity

Over the 18-month study, athletes in the Rapid growth velocity category had a substantially higher injury rate (40.0%) than those in the Moderate (29.2%) and Slow (20.0%) categories, although the difference did not reach statistical significance (p = 0.146), likely reflecting the limited sample size. The trend nonetheless supports the hypothesis that periods of rapid growth are associated with heightened injury risk (Figure 1).

Figure 1: Injury incidence by growth velocity. Injury incidence was highest in the Rapid growth velocity group, highlighting the vulnerability of athletes during peak growth spurts.

Evolution of the Readiness-Capacity Mismatch

The Readiness-Capacity Mismatch Index showed a dramatic and significant increase over time (Figure 2). At baseline, the index was near zero, indicating a relative balance between physiology and biomechanics. However, by the 6- and 12-month marks—corresponding to the peak of the adolescent growth spurt for many participants—the index rose sharply, indicating that physiological gains were rapidly outstripping biomechanical maturation. The index began to plateau by 18 months as growth rates slowed and neuromuscular control started to catch up.

Figure 2: Readiness-Capacity Mismatch Index over time. The Mismatch Index peaked at 12 months, demonstrating a critical window where physiological readiness far exceeded biomechanical capacity.

Mismatch Index as a Predictor of Injury

The Mismatch Index was a powerful predictor of injury. The mean index for timepoints at which an injury occurred was significantly higher than for non-injured timepoints (28.1 vs 16.1, p < .0001) (Figure 3). This finding directly supports our central hypothesis: it is the gap between readiness and capacity that creates the conditions for injury.

Figure 3: Mismatch Index predicts injury. The Mismatch Index was significantly higher in athletes who sustained an injury, supporting its utility as a risk stratification tool.

Discordance in Clearance Protocols

Our analysis revealed a concerning discordance between traditional and biomechanical clearance criteria. Across all timepoints, 16.4% of athletes who were “cleared” based on standard physiological fitness tests failed to meet minimum biomechanical safety thresholds. The injury rate within this discordant group was 47.6%, substantially higher than the 31.2% injury rate in the group that was cleared by both physiological and biomechanical standards.
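The discordance analysis can be sketched as follows; the record structure and function name are our assumptions for illustration, and the boolean flags stand in for the study's clearance criteria:

```python
# Hypothetical sketch of the clearance-discordance analysis. Each record
# is (phys_cleared, biomech_cleared, injured) for one athlete-timepoint.
def discordance_summary(records):
    """Among physiologically cleared athlete-timepoints, compare injury
    rates for those who also passed biomechanical thresholds (concordant)
    versus those who did not (discordant)."""
    cleared = [r for r in records if r[0]]
    discordant = [r for r in cleared if not r[1]]
    concordant = [r for r in cleared if r[1]]
    rate = lambda group: sum(r[2] for r in group) / len(group) if group else 0.0
    return {
        "discordant_share": len(discordant) / len(cleared),
        "discordant_injury_rate": rate(discordant),
        "concordant_injury_rate": rate(concordant),
    }
```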

Finally, Figure 4 illustrates the underlying biomechanical changes. During the growth spurt, athletes in the Rapid growth group showed a transient decline in joint stability and load attenuation, and an increase in movement variability, precisely when their physiological power was increasing. This is the mechanistic signature of the readiness-capacity mismatch.

Figure 4: Biomechanical changes by growth velocity. Athletes in the Rapid growth group exhibited a temporary decline in key biomechanical control metrics during the 6- and 12-month periods, coinciding with the peak of their growth spurt.

Discussion

This study provides a new lens through which to view injury risk in adolescent athletes. By quantifying the mismatch between physiological readiness and biomechanical capacity, we have demonstrated that the period of greatest athletic improvement is also the period of greatest vulnerability. The results challenge the long-standing paradigm of using physiological metrics like strength and power as the primary determinants for athletic progression and return-to-play.

The sharp rise in the Mismatch Index during the 6- to 12-month period of our study coincides with the known timing of peak height velocity in many adolescents. Our data give a biomechanical fingerprint to this phenomenon: as athletes become stronger and more powerful, they simultaneously become less stable, less coordinated, and less able to manage external loads. This creates a “perfect storm” for injury. An athlete feels powerful and is encouraged by coaches and parents to push harder, yet their underlying structure is unprepared for the forces they are now capable of generating and encountering.

The strong predictive power of the Mismatch Index for injury occurrence is the study’s most significant finding. It provides a quantitative tool to identify at-risk athletes who might otherwise be missed by conventional screening. An athlete with a high Mismatch Index is, in essence, a high-performance engine in a chassis with poor suspension and brakes. It is not a matter of if they will break down, but when.

Perhaps the most immediately actionable finding is the high injury rate in the “discordant” group—those cleared by fitness tests but not by biomechanical standards. This is a direct indictment of current protocols. It shows that we are actively, albeit unintentionally, sending a significant number of athletes back into harm’s way. The integration of simple, field-based biomechanical assessments into the clearance process is no longer an academic ideal, but a clinical necessity.

Limitations

Although this study was longitudinal, it was observational; it provides strong associative evidence but cannot definitively establish causality. An intervention trial, in which biomechanically informed screening and training are implemented during the growth spurt, would be needed to confirm that reducing the Readiness-Capacity Mismatch directly lowers injury risk. Additionally, our cohort was modest in size (n = 32) and heterogeneous in sport and maturation status, which adds variability to the data.

Conclusion

In conclusion, adolescent athletes are not simply miniature adults. Their development is a complex, non-linear process where physiological and biomechanical systems mature at different rates. Our study demonstrates that the mismatch between these systems is a primary and quantifiable driver of injury risk. Relying on physiological benchmarks alone is an inadequate and potentially dangerous practice. To truly protect the long-term health of young athletes, sports medicine and coaching must evolve to embrace a more integrated approach, where biomechanical readiness is given equal, if not greater, weight than physiological power.

References

  1. Malina, R. M., et al. (2004). Growth, maturation, and physical activity. Human Kinetics.
  2. Ford, K. R., et al. (2003). Analysis of landing strategies in adolescent girls. Journal of Bone and Joint Surgery, 85(9), 1743-1749.
  3. Hewett, T. E., et al. (2005). Biomechanical measures of neuromuscular control and valgus loading of the knee predict anterior cruciate ligament injury risk in female athletes: a prospective study. The American Journal of Sports Medicine, 33(4), 492-501.
  4. Quatman-Yates, C. C., et al. (2012). A systematic review of sensorimotor function during adolescence: a developmental stage of increased motor awkwardness? British Journal of Sports Medicine, 46(9), 649-655.
  5. Pfeiffer, R. P., et al. (2006). An analysis of landing biomechanics in adolescent females. Gait & Posture, 23(3), 314-319.
  6. Myer, G. D., et al. (2014). The effects of puberty on the risk for anterior cruciate ligament injury. Journal of the American Academy of Orthopaedic Surgeons, 22(11), 714-723.
  7. DiFiori, J. P., et al. (2014). Overuse injuries and burnout in youth sports: a position statement from the American Medical Society for Sports Medicine. Clinical Journal of Sport Medicine, 24(1), 3-20.
  8. Grindstaff, T. L., et al. (2016). Hop-testing provides a reliable and valid outcome measure during rehabilitation after anterior cruciate ligament reconstruction. Journal of Orthopaedic & Sports Physical Therapy, 46(6), 485-493.
  9. Paterno, M. V., et al. (2010). Biomechanical measures during landing and postural stability predict second anterior cruciate ligament injury after anterior cruciate ligament reconstruction and return to sport. The American Journal of Sports Medicine, 38(10), 1968-1978.
  10. Read, P. J., et al. (2016). The ability of the Y-Balance Test to predict lower-extremity injury in military recruits. Journal of Sport Rehabilitation, 25(4), 323-328.
  11. Lloyd, R. S., et al. (2015). The effects of shoe cushioning on the biomechanics of landing in adolescent boys. Journal of Sports Sciences, 33(13), 1375-1383.
  12. Eisenmann, J. C. (2008). Maturation and physical activity. Pediatric Exercise Science, 20(3), 227-231.