14 Principles of Multimedia Learning – eLearningExpert

Recently I came across this work from Koichi Sato at the University of Nebraska–Lincoln (his contact details are at the bottom). I found it so clear and helpful that I asked him if I could work it into a blog post, and I’m happy to say he agreed. I took the liberty of adding a few things, but don’t want to take any of the credit!
Richard Mayer’s Cognitive Theory of Multimedia Learning is based on a number of assumptions, namely that there are two separate channels – auditory and visual – for processing information (Paivio, 1990), that each channel has a limited capacity (Sweller, 1988), and that learning is an active process of filtering, selecting, organizing, and integrating information (Baddeley & Hitch, 1974).
Based upon these three assumptions, 14 principles have been developed governing the good (and poor) use of multimedia. Here’s the first.

Multimedia Principle:
People learn better when words and pictures are presented together than from words alone.

The rest can be downloaded:

Koichi Sato, MSEd & MPH
University of Nebraska–Lincoln
Email: sw-ksato2@unl.edu
Web: go.unl.edu/unl-code

Bringing More of a Classroom Feel to Distance Learning

Apparently wonders never cease. The website Edutopia from George Lucas is well known for far-fetched, absurd, and downright wrong and harmful articles about teaching and education, but today I found something really useful there. Maybe it’s a fluke or maybe it’s the exception that proves the rule, but whatever it is, it’s a nice and easy hack to help teachers teach at a distance.

Through technology, teachers can share the sights and sounds of the classroom to help remote students feel more connected.

Source: Bringing More of a Classroom Feel to Distance Learning

Pupils can learn more effectively through stories than activities

You often hear that active ‘doing’ (pupil-centred) is better than ‘telling’ (teacher-centred), and that this is even more important and more ‘true’ when it comes to younger children. Unfortunately for that claim, this new study at the University of Bath shows that storytelling – the oldest form of teaching – is the most effective way of teaching primary school children about evolution. A randomised controlled trial found that children learn about evolution more effectively when engaged through stories read by the teacher (i.e., instruction) than through doing tasks to demonstrate the same concept.

The scientists investigated several different methods of teaching evolution in primary schools, to test whether a pupil-centred approach (where pupils took part in an activity) or a teacher-centred approach (where pupils were read a story by the teacher), led to a greater improvement in understanding of the topic.

Abstract of the article in npj Science of Learning:

Current educational discourse holds that effective pedagogy requires engagement through active student participation with subject matter relating to them. The lack of testing of lessons in series is recognized as a potential weakness in the evidence base, not least because standard parallel designs cannot capture serial interaction effects (cf. drug interactions). However, logistic issues make large-scale replicated in situ assessment of serial designs challenging. The recent introduction of evolution into the UK primary school curriculum presents a rare opportunity to overcome this. We implemented a randomised control 2 × 2 design with four inexpensive schemes of work, comparable to drug interaction trials. This involved an initial test phase (N = 1152) with replication (N = 1505), delivered by teachers, after training, in their classrooms with quantitative before-after-retention testing. Utilising the “genetics-first” approach, the schemes comprised four lessons taught in the same order. Lessons 1 (variation) and 3 (deep-time) were invariant. Lesson 2 (selection) was either student-centred or teacher-centred, with subject organism constant, while lesson 4 (homology) was either human-centred or not, with learning mode constant. All four schemes were effective in replicate, even for lower ability students. Unexpectedly, the teacher-focused/non-human centred scheme was the most successful in both test and replicate, in part owing to a replicable interaction effect but also because it enabled engagement. These results highlight the importance of testing lessons in sequence and indicate that there are many routes to effective engagement with no “one-size fits all” solution in education.

Source: Pupils can learn more effectively through stories than activities

Organising Knowledge: Ausubel’s Advance organisers

Here’s a really clear and useful blog by David Rodger-Goodwin on David Ausubel’s Advance Organisers.

The Situation – Expert teachers create and maintain, almost automatically, complex networks linking the concepts, ideas, and facts within their domain – networks that their students don’t have.

The Solution – Popularised by the psychologist David Ausubel (1968), advance organisers provide the conceptual framework for the incorporation and retention of new information. They should be presented in advance of a new topic (or sequence of learning) and at a higher level of abstraction than the learning that follows.

Source: Organising Knowledge: Ausubel’s Advance organisers

Are computer-supported literacy interventions effective for young children?

Computer-assisted learning (CAL) is gaining popularity due to its promise of cost-effectiveness, an individualized approach, and enhanced engagement. However, recent research published by Ludo Verhoeven and colleagues in Educational Research Review shows that the effect size of CAL was smaller than those found in previous meta-analyses of teacher-supported early literacy interventions. This lends support to the plausible conclusion that teachers are more effective than computers in enhancing literacy skills in early childhood.

Source: Are computer-supported literacy interventions effective for young children?

Registered Reports in the Journal of Computer Assisted Learning

By:

Em. prof. dr. Paul A. Kirschner, Editor-in-Chief, Open University of the Netherlands / Thomas More University of Applied Sciences, and

Dr. Jeroen Janssen, Associate Editor, Utrecht University, Netherlands

The science of learning and instruction is a rapidly evolving field with a variety of theoretical and methodological approaches. It is also sometimes burdened by two gadflies: the so-called ‘replication crisis’, in which attempts to replicate accepted theories and results fail (for whatever reason), and ‘positive’ publication bias, whereby articles that fail to confirm hypotheses and/or don’t produce statistically significant results have a lesser chance of getting published. In an attempt to combat both of these problems, and to increase trust in research, we at the Journal of Computer Assisted Learning have chosen to explicitly invite authors to submit Registered Reports – a new format of empirical article designed to improve the transparency and reproducibility of hypothesis-driven research.

The call to invite authors to submit their Registered Reports went out in February 2018. Since then, we have received a growing number of Registered Reports submissions, and we are now delighted to publish our first collection of Registered Reports. To us, this proves that the learning and instruction community is responding positively to this new format.

Registered Reports differ from conventional empirical articles in that part of the review process takes place before researchers even collect and analyze data. The Introduction and Method sections (including hypotheses and all relevant materials) are peer-reviewed prior to data collection. High-quality pre-registered protocols that meet strict editorial criteria are then offered in-principle acceptance, which guarantees publication of the results, provided that the authors adhere to their pre-registered protocol.

In a blog about the introduction of Registered Reports in Developmental Science, Daniel Ansari (University of Western Ontario, Canada) and Judit Gervain (Université Paris Descartes, Paris) noted the following:

“Registered Reports (RR) place an emphasis on the adequacy of methods and analysis plan for studies deemed to be informative. They thus benefit both the submitting authors and the discipline. If the rationale and the methods of the planned studies are sound, and accepted during the review process, authors can expect their work to be published, unless they deviate from the accepted methods, even if the results are weak, null, or different than predicted. The discipline gains, because the publication process will be more transparent and is therefore more likely to curtail questionable research practices. This increases the reliability and reproducibility of data. … [Also, in] RR format interpretations are less likely to be modified post hoc to fit the initial predictions.”

Three arguments for Registered Reports

For studies with a clear hypothesis, the Registered Reports format has three key strengths compared with traditional research publishing. First, it prevents publication bias by ensuring that editorial decisions are made on the basis of the theoretical importance and methodological rigor of a study, before research outcomes are known. Second, by requiring authors to pre-register their study methods and analysis plans in advance, it prevents common forms of research bias, including p-hacking (misuse of data analysis to find patterns that appear statistically significant) and HARKing (Hypothesizing After the Results are Known, or hindsight bias), while still welcoming unregistered analyses that are clearly labelled as exploratory. Third, because protocols are accepted before data are collected, the format provides an incentive for researchers to conduct important replication studies and other novel, resource-intensive projects (e.g., involving multi-site consortia) – projects that would otherwise be too risky to undertake when the publishability of the outcome is contingent on the results.
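
To make the p-hacking point concrete, here is a minimal Python simulation sketch (our illustration, not anything prescribed by the journal): two groups are drawn from the same distribution, so every ‘significant’ difference is a false positive. An analyst who runs only the single pre-registered test stays near the nominal 5% error rate; an analyst who also tests a handful of arbitrary subgroups and reports any comparison that crosses the threshold finds ‘effects’ far more often. All numbers and names in the sketch are illustrative assumptions.

    import random
    import statistics

    # Illustrative simulation of p-hacking: both groups come from the SAME
    # distribution, so any "significant" difference is a false positive.
    # Welch's t-statistic is implemented inline so the sketch only needs
    # the Python standard library.

    def t_statistic(a, b):
        """Welch's t-statistic for two independent samples."""
        na, nb = len(a), len(b)
        va, vb = statistics.variance(a), statistics.variance(b)
        return (statistics.mean(a) - statistics.mean(b)) / ((va / na + vb / nb) ** 0.5)

    random.seed(1)
    TRIALS, N, CRIT = 2000, 40, 1.99   # |t| > ~1.99 approximates p < .05 here

    honest_hits, hacked_hits = 0, 0
    for _ in range(TRIALS):
        a = [random.gauss(0, 1) for _ in range(N)]
        b = [random.gauss(0, 1) for _ in range(N)]
        whole_sample_hit = abs(t_statistic(a, b)) > CRIT
        # Honest analysis: one pre-registered test on the whole sample.
        if whole_sample_hit:
            honest_hits += 1
        # P-hacked analysis: also test five arbitrary "subgroups" (random
        # halves) and report a hit if ANY comparison crosses the threshold.
        hit = whole_sample_hit
        for _ in range(5):
            if abs(t_statistic(random.sample(a, N // 2), random.sample(b, N // 2))) > CRIT:
                hit = True
        if hit:
            hacked_hits += 1

    print(f"False-positive rate, pre-registered analysis: {honest_hits / TRIALS:.1%}")
    print(f"False-positive rate, p-hacked analysis:       {hacked_hits / TRIALS:.1%}")

Run as-is, the pre-registered analysis stays close to the nominal 5% false-positive rate while the subgroup-fishing analysis climbs well above it – exactly the bias that reviewing the analysis plan before data collection is meant to prevent.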

It is encouraging to see the positive reaction to Registered Reports from researchers in their various roles as editors, authors, and peer reviewers. For more information and answers to frequently asked questions about Registered Reports, see https://cos.io/rr/

Procedure

  • Authors are invited to submit ‘Stage 1 manuscripts’ which outline the rationale and method of the proposed study or studies (see Guidelines for Authors).
  • All submitted papers will be peer-reviewed for theoretical significance and methodological quality. In-Principle Acceptance (IPA) will be given to high-quality submissions. Data collection may only commence after a Stage 1 manuscript has been accepted.
  • Once the study (or studies) is complete, the ‘Stage 2 manuscript’ will also be peer-reviewed to check that it is consistent with the pre-registered Stage 1 protocol. Editorial decisions will not be based on the perceived importance, novelty, or conclusiveness of the results.
  • The journal operates double-blind peer review as a means of tackling real or perceived bias in the review process, so authors must provide their title page as a separate file from the main document. The title page includes the complete title of the paper, and the affiliation and contact details for the corresponding author (both postal address and email address).

Five tips for preparing a Registered Report for submission

The following tips were offered by Ansari and Gervain. They are also useful for authors considering submitting a Registered Report to the Journal of Computer Assisted Learning.

  1. Carefully read the submission guidelines. They are extremely detailed and informative.
  2. Ensure that you have all the resources necessary to carry out the research you have proposed. You don’t want to receive in-principle acceptance of a Stage 1 Registered Report, only to find out that you do not have the resources to collect the data! Hopefully in the near future, funding agencies will begin to support Stage 1 manuscripts that have received IPA.
  3. Make sure that you set yourself a realistic timeline that takes into account variability in the time it will take to get your manuscript reviewed at both Stage 1 and Stage 2. This is especially important for graduate students and post-doctoral fellows who may have pressing deadlines for finalizing their data collection so that they can defend their thesis, move to a new position, etc.
  4. Be very precise in your hypotheses and analysis plan. Registered Reports are all about spending a lot of time thinking and discussing your research plans before you collect the data. Being as precise as possible and having a very detailed analysis plan can save you a lot of time both during peer review as well as the analysis after you have collected the data and before you prepare your Stage 2 manuscript.
  5. Make sure that your research ethics approval will allow you to share your data on an open data repository, such as the Open Science Framework. Open data are a requirement for the publication of a Registered Report.

Evidence-informed pedagogy

Em. prof. dr. Paul A. Kirschner a,b & Tim Surma, MSc a

a ExpertiseCentrum voor Effectief Leren (ExCEL), Thomas More University of Applied Sciences, Mechelen, Belgium

b kirschnerED, Educational Advice and Training

This is the opening editorial that Tim Surma and I wrote for the Autumn 2020 edition of Impact on evidence-informed pedagogy. The reason we put together a publication on this theme is simple and really straightforward: if we, as educational professionals, choose to inform the choices we make in our practice with the best available evidence, we can make meaningful, striking enhancements to our pedagogical practice, and thus to the efficiency, effectiveness, and success of our teaching and of children’s learning.

What is evidence-informed pedagogy?

Some educational policy makers, politicians, and teachers use the term evidence-based when they speak of instruction and teaching, while others (we, for example) use the term evidence-informed. Is there a difference and, if so, what is it? There is an, albeit sometimes subtle, distinction between evidence-based and evidence-informed in terms of practice in education. Originating in medicine but now used across numerous professions such as economics, technology, and agriculture, evidence-based practice is an approach that focuses practitioner attention on sound empirical evidence in professional decision making and action (Rousseau & Gunia, 2016). In medical research, for instance, research processes are more rigorous, more well-defined, and better controlled than in the educational sciences, which makes outcomes more distinct and reliable. As Neelen and Kirschner (2020, p. 3) state:

Sackett et al (1996) see it as a three legged stool integrating three basic principles: (1) the best available research evidence bearing on whether and why a treatment works, (2) clinical expertise of the health care professional (clinical judgment and experience) to rapidly identify each patient’s unique health state and diagnosis, their individual risks and benefits of potential interventions, and (3) client preferences and values.

Here everything is clean-cut. The target population is clearly defined with respect to age, weight, disease, and so forth. The directions for use are also clear-cut, for example that the medicine should be taken on an empty stomach, one hour before eating.

Evidence-informed practice is still based on empirical evidence, but acknowledges that in real classroom practice it is harder to determine what works for whom under which circumstances. What seems to work in one classroom does not always work in another. Five-year-olds differ from fifteen-year-olds both in their cognitive development and in their knowledge and expertise, a lesson on concepts and definitions differs from a lesson on applications, and, to a lesser extent, a lesson in chemistry differs from a lesson in drawing. Also, what works for one teacher might not work for another because teachers differ qualitatively; subtle and not-so-subtle differences between teachers mean that the way they carry out the same thing differs, both in how it is carried out and in how it is perceived by their students. And what works in a lesson this morning won’t necessarily work in the same lesson this afternoon, tomorrow, or in three months. Just the fact that learners differ with respect to their prior knowledge, beliefs, needs, and/or motivation to participate can change everything. Unfortunately, this entropy (i.e., lack of order or predictability) of the classroom does not allow us to predict with statistical ‘certainty’ which intervention will yield which effect and when. Even in perfect circumstances with the best-prepared lessons, some of our students might still underperform, despite the evidence brought to us by eminent cognitive and educational psychologists. While ‘evidence-based’ provides fairly hard results, ‘evidence-informed’ is less hard, but still very useful, with a higher chance of success if applied thoughtfully. That is why in this issue we advocate a pedagogy informed by evidence, more than a pedagogy based on (or dictated by?) evidence. The challenge of going from the evidence to the design of actual pedagogical practices in the classroom calls for a deep understanding – let’s call it pedagogical knowledge – of what works, why, and when in optimal conditions, in order to have, for example, conversations with your fellow teachers and headmasters about certain pedagogical decisions or actions.

Second, the literature presents a variety of accounts of exactly what pedagogy is. Since pedagogy has both broad and narrow definitions, we have had to make a choice and have chosen to follow Dylan Wiliam (2018) in using the broad definition. He cites Alexander (2008) in stating that pedagogy is “what one needs to know, and the skills one needs to command, in order to make and justify the many different kinds of decision of which teaching is constituted” (p. 47). It can be seen as the act and discourse of teaching (Alexander, 2004). Pedagogy therefore includes instruction but is broader, and also embraces the interplay between the factors that influence teaching and learning. Both evidence-informed practice and pedagogy assume that the educational professional knows what the best options might be for optimal teaching and learning under given circumstances (knowing your repertoire as a teacher).

What do we know about teacher repertoires? We have already learned a lot about classroom practices from the abundance of quality research conducted in laboratories and schools, online and offline, and virtually anywhere and anytime teaching and learning take place. The evidence is out there. Over the past few decades, researchers have designed interventions and devised general techniques – not rarely based on two-way interaction between researchers and teachers – that work or do not work for particular learners of particular ages undertaking particular academic tasks in particular subject areas (see, for example, Roediger & Pyc, 2012). Some fundamental techniques from cognitive and educational research were derived from this substantial base of empirical research, and some of these are gaining attention because they hold the promise of being sufficiently general to be applied across a range of academic subject areas and readily implemented in classrooms for all ages (for an overview of learning strategies see Dunlosky, Rawson, Marsh, Nathan, & Willingham, 2013). Several examples of effective techniques are elaborated in this issue, as these general approaches may need domain-specific adjustments to maximise their promise as learning tools for particular domains – in itself a shining example of evidence-informed practice. Retrieval practice, the act of actively recalling already-learned information (see Roediger & Karpicke, 2006), is adapted to the perspective of CPD (Beauchamp, this issue); Clare Badger translates cognitive load theory (Sweller, 1988) into practical guidelines for chemistry courses; the provision and function of individual feedback (Hattie & Timperley, 2007) is tackled by Caroline Locke; and popular learning myths (Kirschner & Van Merriënboer, 2013) are challenged by both Jonathan Firth and Jennifer Zyke as well as by Lewis Baker.

Some other evidence-informed principles are much less domain-general, which is why they were once called theories of subject matter by Richard Mayer (2004), and some of them are still conceived of as a “unique and monumental contribution of educational psychology to the science of learning” (Mayer, 2018, p. 175). Striking examples are theories of how people learn to read (i.e., explicit phonics instruction), how people learn a second language, and so forth. This issue also pays attention to the subject-specific uniqueness of teaching, with a focus on a selection of subjects that are less often in the spotlight of educational research, such as the arts (by Gatward) and physics (by Astolfi).

Although we may now sound endlessly self-confident about evidence informing education, we must of course temper our enthusiasm to some extent: we obviously do not know the answer to every question, for the simple reason that education does not take place in isolation. The evidence is out there, but it is neither definitive nor complete. As an example, the concept of affect (students’ experience of emotion) is gaining recognition as an essential component of teaching and learning, but still holds many secrets for both researchers and seasoned teachers (Mayer, 2018). Therefore, some consideration was given in this issue to educational outcomes beyond the retention of basic declarative and procedural knowledge. Several articles explore pedagogies such as playful learning in the early years (by Sarah Seleznyov), reading for pleasure (by Alice Reedy), and the crossroads between coaching and direct instruction (by Ed Cope and Chris Cushion). Given the broad range of content areas represented in this issue, our readers should not be surprised that the educational outcomes discussed here differ greatly.

The UK occupies pole position worldwide in educational undertakings that support the implementation of structured, evidence-informed education. We think of influential research centres such as the Education Endowment Foundation, or professional learning communities such as the Chartered College of Teaching and the ever-growing researchED community. A number of articles in this issue zoom in on this fascinating but complex interplay between research and practice. Andrew Davis elaborates on classroom research, Richard Churches and colleagues shine a light on teacher-led randomized controlled trials, and Lorne Stefanini and Jenny Griffiths address some of the challenges in implementing an evidence-informed approach to education.

This issue might be what David Daniel (2012) described as a targeted investment in translational research: with this issue, the CCT supports the development of pedagogical approaches with the goal of understanding how, when, and under what constraints to apply best-evidence strategies in relevant educational contexts. Readers will find a multiplicity of approaches in the current issue, all aimed at revealing how to inform your pedagogy with the best available evidence. We hope that this issue helps you to make more and better evidence-informed decisions.

Enjoy, learn, and use the content to reflect upon and improve both your teaching and your students’ learning!

References

Alexander, R. (2004). Still no pedagogy? Principle, pragmatism and compliance in primary education. Cambridge Journal of Education, 34, 7–33.

Alexander, R. (2008). Essays on pedagogy. London: Routledge.

Black, P., & Wiliam, D. (2018). Classroom assessment and pedagogy. Assessment in Education: Principles, Policy & Practice, 25, 551-575.

Daniel, D. B. (2012). Promising principles: Translating the science of learning to educational practice. Journal of Applied Research in Memory and Cognition, 1, 251–253.

Hattie, J., & Timperley, H. (2007). The power of feedback. Review of Educational Research, 77, 81-112.

Kirschner, P. A., & van Merriënboer, J. J. (2013). Do learners really know best? Urban legends in education. Educational Psychologist, 48, 169–183.

Mayer, R. E. (2004). Teaching of subject matter. In S. T. Fiske (Ed.), Annual Review of Psychology (Vol. 55, pp. 715–744). Palo Alto, CA: Annual Reviews.

Mayer, R. E. (2018). Educational psychology’s past and future contributions to the science of learning, science of instruction, and science of assessment. Journal of Educational Psychology, 110, 174–179.

Neelen, M. & Kirschner, P. A. (2020). Evidence-informed learning design: Creating training to improve performance. London, UK: Kogan Page.

Roediger, H. L., & Karpicke, J. D. (2006). The power of testing memory: Basic research and implications for educational practice. Perspectives on Psychological Science, 1, 181–210.

Roediger, H. L., & Pyc, M. (2012). Inexpensive techniques to improve education: Applying cognitive psychology to enhance educational practice. Journal of Applied Research in Memory and Cognition, 1, 242–248.

Rousseau, D. M., & Gunia, B. C. (2016). Evidence-based practice: The psychology of EBP implementation. Annual Review of Psychology, 67, 667-692.

Sackett, D. L., Rosenberg, W. M., Gray, J. M., Haynes, R. B., & Richardson, W. S. (1996). Evidence based medicine: What it is and what it isn’t. Clinical Orthopaedics and Related Research, 455, 3–5.

Sweller, J. (1988). Cognitive load during problem solving: effects on learning. Cognitive Science, 12, 275–285.

Every cloud has a silver lining: Positive effects of Covid-19?

I read an article today in my local paper Dagblad De Limburger about the SVOPL schools in Parkstad, Limburg: “… in the morning when the cleaning service is present and in the evening when they are back at work, all the windows are wide open. The rooms can also be aired in between … at the request of teachers, CO2 meters were also purchased to measure the amount of carbon dioxide in the classrooms …” This got me thinking, because I remembered a number of articles about CO2 and learning.

Normal outside air contains carbon dioxide (CO2) at a level of around 400 ppm (parts per million), and an accepted standard in a classroom is 1,000 ppm, which is not hard to maintain if the room is empty. But if the classroom is full of children, the level rises dramatically, because we breathe out carbon dioxide with every breath. A classroom study in California (2013) found that CO2 levels can reach 2,200 ppm, more than twice the recommended level and almost three times the level normally found in an office setting. A study in Texas (2002) found CO2 levels over 3,000 ppm in 21% of the classrooms tested; levels not conducive to efficient learning!
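
How quickly a full room climbs toward such levels can be sketched with the standard single-zone mass-balance model. In the Python sketch below, the room size, class size, and per-pupil CO2 generation rate are illustrative assumptions on our part (not figures from the studies cited), but the arithmetic shows why roughly 2 L/s of fresh air per person lands near the ~2,200 ppm measured in California, and why raising ventilation toward 10 L/s per person keeps the room well under 1,000 ppm.

    # Single-zone CO2 mass balance: dC/dt = (Q * C_out + G * 1e6 - Q * C) / V,
    # with concentration C in ppm, airflow Q and CO2 generation G in litres
    # per second, and room volume V in litres. All parameter values below
    # are illustrative assumptions, not data from the cited studies.

    OUTDOOR_PPM = 400.0        # outdoor CO2 (ppm), as in the text
    ROOM_VOLUME = 200_000.0    # ~200 m^3 classroom, in litres (assumed)
    PUPILS = 25                # class size (assumed)
    GEN_PER_PUPIL = 0.004      # exhaled CO2 per pupil, L/s (assumed)
    VENT_PER_PERSON = 2.0      # fresh air, L/s per person (low end of the
                               # 2-10 L/s-person range in Wargocki et al.)

    Q = VENT_PER_PERSON * PUPILS   # total fresh-air flow (L/s)
    G = GEN_PER_PUPIL * PUPILS     # total CO2 generation (L/s)

    c = OUTDOOR_PPM                # room starts at outdoor level after airing out
    dt = 60.0                      # one-minute time steps
    for minute in range(121):
        if minute % 30 == 0:
            print(f"t = {minute:3d} min: {c:6.0f} ppm")
        c += dt * (Q * OUTDOOR_PPM + G * 1e6 - Q * c) / ROOM_VOLUME

    # Steady state is C_out + (G / Q) * 1e6: about 2,400 ppm at 2 L/s-person,
    # but only about 800 ppm at 10 L/s-person.

Under these assumptions, the room passes 1,000 ppm within the first half hour and is above 2,000 ppm after two hours without airing, which is why opening the windows between lessons matters so much.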

If the windows in a classroom are closed all day, or if the room is poorly ventilated, the level of CO2 in the room rises. The research I remembered was about CO2 concentrations and their adverse effect on lessons later in the day. In lessons at the start of the day, when the classroom had been able to air out all night, learning and learning results were better than in lessons later in the day. Those results were attributed to the build-up of CO2 during the day. While searching for that article (which I didn’t find), I came across the following research results from Pawel Wargocki, José Alí Porras-Salazar, Sergio Contreras-Espinoza, and William P. Bahnfleth (2020):

…reducing CO2 concentration from 2,100 ppm to 900 ppm would improve the performance of psychological tests and school tasks by 12% with respect to the speed at which the tasks are performed and by 2% with respect to errors made. For other learning outcomes and short-term sick leave, only the relationships published in the original studies were available. They were therefore used to make predictions. These relationships show that reducing the CO2 concentration from 2,300 ppm to 900 ppm would improve performance on the tests used to assess progress in learning by 5% and that reducing CO2 from 4,100 ppm to 1,000 ppm would increase daily attendance by 2.5%. These results suggest that increasing the ventilation rate in classrooms in the range from 2 L/s-person to 10 L/s-person can bring significant benefits in terms of learning performance and pupil attendance; no data are available for higher rates. The results provide a strong incentive for improving classroom air quality and can be used in cost-benefit analyses.

Petersen, Jensen, Pedersen, and Rasmussen (2016) found that:

Analysis of the total sample suggested the number of correct answers was improved significantly in four of four performance test, addition (6.3%), number comparison (4.8%), grammatical reasoning (3.2%), and reading and comprehension (7.4%), when the outdoor air supply rate was increased from an average of 1.7 (1.4-2.0) to 6.6 l/s per person. The increased outdoor air supply rate did not have any significant effect on the number of errors in any of the performance tests. Results from questionnaires regarding pupil perception of the indoor environment, reported Sick Building Syndrome symptoms, and motivation suggested that the study classroom air was perceived more still and pupil were experiencing less pain in the eyes in the recirculation condition compared to the fresh air condition.

Finally, though I could go on for pages, research at Harvard University by Allen and colleagues (2016) found “statistically significant declines” in cognitive function scores when CO2 concentrations were increased to 950 ppm, which is “common in indoor spaces”. The study found even larger declines when CO2 was raised to 1,400 ppm.

In other words, every cloud – no matter how dark – can also have a silver lining. Hopefully this habit will stick when everything eventually returns to normal!

References

Allen, J. G., MacNaughton, P., Satish, U., Santanam, S., Vallarino, J. & Spengler, J. D. (2016). Associations of cognitive function scores with carbon dioxide, ventilation, and volatile organic compound exposures in office workers: A controlled exposure study of green and conventional office environments. Environmental Health Perspectives 124, 6. https://doi.org/10.1289/ehp.1510037

Corsi, R. L., Torres, V. M., Sanders, M., & Kinney, K. A. (2002). Carbon dioxide levels and dynamics in elementary schools: Results of the TESIAS study. In Indoor Air 2002, the 9th International Conference on Indoor Air Quality and Climate, Monterey, Calif. (pp. 74–79). Espoo, Finland: ISIAQ.

Mendell, M. J., & Heath, G. A. (2005). Do indoor pollutants and thermal conditions in schools influence student performance? A critical review of the literature. Indoor Air, 15, 27–32.

Petersen, S., Jensen, K. L., Pedersen, A. L., & Rasmussen, H. S. (2016). The effect of increased classroom ventilation rate indicated by reduced CO2 concentration on the performance of schoolwork by children. Indoor air, 26, 366–379. https://doi.org/10.1111/ina.12210

Wargocki, P., Porras-Salazar, J. A., Contreras-Espinoza, S., & Bahnfleth, W. (2020). The relationships between classroom air quality and children’s performance in school. Building and Environment, 173, 106749.

Spacing Effect; Intensive ‘Crash’ Courses and 6th Form Independent Study

Found on CogSciSci: a really interesting blog by Rob King (@Ironic_Bonding) on spaced versus massed practice in teaching learners how to drive, along with some worrying ramifications for road safety. He then writes about applying this to the chemistry classroom, and he sets his blog up with a kind of graphic organiser.

Read the blog here: Spacing Effect; Intensive ‘Crash’ Courses and 6th Form Independent Study