Outdoor adventure learning

Moderate impact, moderate cost, moderate evidence

+4 months

Technical Appendix

Definition

Outdoor adventure learning describes learning experiences with a high level of physical (and often emotional) challenge, undertaken in an outdoor or wilderness setting for educational purposes. It usually involves collaborative learning activities. Practical problem-solving and explicit reflection on, and discussion of, thinking and feelings (see also Metacognition and self-regulation) may also be involved. Adventure learning interventions typically do not include a formal academic component.

Search terms: adventure activities; adventure education; bush experience; bushcraft; outdoor learning; outdoor education; experiential education programs; wilderness experience; wilderness education

Evidence Rating

Five meta-analyses suggest that outdoor adventure learning can consistently provide positive benefits for academic learning; three of these have been published since 2000. Effect sizes for academic outcomes range from 0.17 (controlled study comparisons) to 0.61 (school grades). On average, pupils who participate in outdoor adventure learning appear to make approximately four additional months' progress. Overall, the evidence is rated as moderate.
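
As a rough illustration of how a standardized effect size translates into additional months' progress, the sketch below applies a simple heuristic, assumed here for illustration only and not the conversion scale used to produce the headline figure: if one school year of typical progress is taken to correspond to roughly one standard deviation of attainment, an effect size d maps to about 12 × d months.

    # Illustrative sketch only: convert a standardized effect size to extra
    # months' progress, assuming one year's typical progress is roughly one
    # standard deviation of attainment (a simplifying assumption, not the
    # official conversion used for the headline figure).
    def effect_size_to_months(d: float, months_per_sd: float = 12.0) -> float:
        return d * months_per_sd

    for label, d in [("Controlled study comparisons (Laidlaw, 2000)", 0.17),
                     ("Overall (Hattie et al., 1997)", 0.34),
                     ("School grades (Cason & Gillis, 1994)", 0.61)]:
        print(f"{label}: d = {d:.2f} ~ {effect_size_to_months(d):.0f} additional months")

On this reading, the 0.34 overall estimate corresponds to roughly four additional months, consistent with the headline figure above.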

Cost Information

Costs vary. For example, a six-day adventure sailing experience costs about £600 and a seven-day outdoor adventure course about £550 per pupil; an adventure ropes course costs about £30 for half a day. Overall, costs are estimated at £500 per pupil per year and are therefore rated as moderate.
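
To put the examples above on a comparable footing, the short sketch below works out an approximate per-pupil, per-day cost for each, using only the figures quoted and treating each quoted price as a per-pupil cost (the ropes course is counted as half a day, as stated):

    # Approximate per-pupil, per-day costs for the examples quoted above.
    examples = {
        "Six-day adventure sailing experience": (600, 6),
        "Seven-day outdoor adventure course": (550, 7),
        "Adventure ropes course (half day)": (30, 0.5),
    }

    for name, (cost_gbp, days) in examples.items():
        print(f"{name}: ~£{cost_gbp / days:.0f} per pupil per day")

This works out at roughly £100, £79 and £60 per pupil per day respectively.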

References

  1. Bowen, D. J., & Neill, J. T.

    A meta-analysis of adventure therapy outcomes and moderators

    The Open Psychology Journal, 6(1), 28-53

    (2013)

  2. Cason, D., & Gillis, H. L.

    A meta-analysis of outdoor adventure programming with adolescents

    Journal of Experiential Education, 17(1), 40-47

    (1994)

  3. Gillis, L. H., & Speelman, E.

    Are challenge (ropes) courses an effective tool? A meta-analysis

    Journal of Experiential Education, 31(2), 111-135

    (2008)

  4. Hans, T. A.

    A meta-analysis of the effects of adventure programming on locus of control

    Journal of Contemporary Psychotherapy, 30(1), 33-60

    (2000)

  5. Hattie, J., Marsh, H. W., Neill, J. T., & Richards, G. E.

    Adventure education and Outward Bound: Out-of-class experiences that make a lasting difference

    Review of Educational Research, 67(1), 43-87

    (1997)

  6. Laidlaw, J. S.

    A meta-analysis of outdoor education programs (Order No. 9999509)

    Available from ProQuest Dissertations & Theses Global. (304612041)

    (2000)

  7. McKenzie, M. D.

    How are adventure education program outcomes achieved?: A review of the literature

    Australian Journal of Outdoor Education, 5(1), 19-27

    (2000)

  8. Wilson, S. J., & Lipsey, M. W.

    Wilderness challenge programs for delinquent youth: A meta-analysis of outcome evaluations

    Evaluation and Program Planning, 23(1), 1-12

    (2000)

Summary of effects

Meta-analyses and effect sizes (no separate FSM effect sizes were reported):

Bowen, D. J., & Neill, J. T. (2013): 0.41 - Academic outcomes; 0.47 - All effects
Cason, D., & Gillis, H. L. (1994): 0.31 - All effects; 0.61 - School grades
Gillis, L. H., & Speelman, E. (2008): 0.43 - Overall; 0.26 - Academic outcomes
Hattie, J., Marsh, H. W., Neill, J. T., & Richards, G. E. (1997): 0.34 - Overall; 0.45 - Academic outcomes
Laidlaw, J. S. (2000): 0.17 - Controlled trials

Single studies: effect size (median) 0.31

The label after each effect size gives the specific outcome measure or, if in brackets, details of the intervention or control group.

Meta-analyses abstracts

1

Bowen, D. J., & Neill, J. T. (2013)

This study reports on a meta-analytic review of 197 studies of adventure therapy participant outcomes (2,908 effect sizes, 206 unique samples). The short-term effect size for adventure therapy was moderate (g = .47) and larger than for alternative (.14) and no treatment (.08) comparison groups. There was little change during the lead-up (.09) and follow-up periods (.03) for adventure therapy, indicating long-term maintenance of the short-term gains. The short-term adventure therapy outcomes were significant for seven out of the eight outcome categories, with the strongest effects for clinical and self-concept measures, and the smallest effects for spirituality/morality. The only significant moderator of outcomes was a positive relationship with participant age. There was also evidence that adventure therapy studies have reported larger effects over time since the 1960s. Publication bias analyses indicated that the study may slightly underestimate true effects. Overall, the findings provide the most robust meta-analysis of the effects of adventure therapy to date. Thus, an effect size of approximately .5 is suggested as a benchmark for adventure therapy programs, although this should be adjusted according to the age group.

2

Cason, D., & Gillis, H. L. (1994)

Adventure practitioners asked to justify their work with adolescent populations have no single study to point to that statistically sums up the major findings in the field. Whether the audience is a school board, a treatment facility, or a funding agency, one study is needed that can combine statistics from many studies into a format showing the overall effectiveness of adventure programming. This study used the statistical technique of meta-analysis to demonstrate that adolescents who attend adventure programming are 62% better off than those who do not. While combining various populations and outcomes resulted in an overall effect that could be considered small by some accounts, the study did point to major problems with current research and offers some direction for future researchers to explore.
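
The "62% better off" figure is the familiar percentile reading of a standardized mean difference: assuming approximately normal outcome distributions, the study's overall effect size of about 0.31 places the average participant above roughly 62% of the comparison group (Cohen's U3). A minimal sketch of that conversion:

    from statistics import NormalDist

    # Percentile interpretation (Cohen's U3) of a standardized mean difference:
    # the share of the comparison group scoring below the average participant,
    # assuming roughly normal outcome distributions.
    def percent_better_off(d: float) -> float:
        return NormalDist().cdf(d) * 100

    print(f"d = 0.31 -> average participant scores above ~{percent_better_off(0.31):.0f}% of comparisons")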

3

Gillis, L. H., & Speelman, E. (2008)

This study reports the results of a meta-analysis of 44 studies that examined the impacts of participation in challenge (ropes) course activities. Overall, a medium standardized mean difference effect size was found (d = 0.43). Effect sizes were calculated for various study characteristics, including demographics and outcome. Higher effects were found for adult groups (d = 0.80) and for studies measuring family functioning (d = 0.67). Studies with therapeutic (d = 0.53) or developmental foci (d = 0.47) had higher effect sizes than those with educational foci (d = 0.17). Higher effect sizes for group effectiveness (d = 0.62) affirmed the use of challenge course experiences for team-building purposes. Implications for further research include the importance of recording detailed program design information, selecting appropriate instrumentation, and including follow-up data.

5

Hattie, J., Marsh, H. W., Neill, J. T., & Richards, G. E. (1997)

The purpose of this meta-analysis is to examine the effects of adventure programs on a diverse array of outcomes such as self-concept, locus of control, and leadership. The meta-analysis was based on 1,728 effect sizes drawn from 151 unique samples from 96 studies, and the average effect size at the end of the programs was .34. In a remarkable contrast to most educational research, these short-term or immediate gains were followed by substantial additional gains between the end of the program and follow-up assessments (ES = .17). The effect sizes varied substantially according to the particular program and outcome and improved as the length of the program and the ages of participants increased. Too little is known, however, about why adventure programs work most effectively.

6

Laidlaw, J. S. (2000)

The purpose of this meta-analysis was to examine research in the field of outdoor education to determine if features of studies, outcomes, and programs are significantly related to variation among the estimated effects of outdoor education programs. The primary findings of this dissertation were that study design and the degree to which outcomes were proximal to the intent of the program explained a significant part of the variance in effect estimates. Specifically, studies using poorly controlled designs had the highest mean effect size estimates (effect size = .6), in contrast to those that used controlled, experimental designs (effect size = .17). In this respect, the findings of this study support the results of Cason and Gillis. In addition, the findings of this meta-analysis indicated that studies which evaluated outcomes proximally related to program goals had significantly higher effect sizes (effect size = .77) than studies which evaluated distally related outcomes (effect size = .40). In a notable contrast to both prior meta-analyses in the field, after controlling for the influence of potentially confounding variables, and after controlling for a problematic issue in meta-analysis, that of the independence of effect sizes, no other feature of outcomes or programs was significantly related to effect sizes. The results of this dissertation imply that the relationship between outcomes and program goals is an important consideration, and that the relationship between other substantive features of programs (such as length) and their subsequent outcomes (such as self-concept) cannot be determined from the existing literature given its inherent problems.