One of our tasks this week was to identify an outstanding virtual lab. Did we really do that? What counts as an outstanding virtual lab? Can simulations count as virtual labs? Can games count as virtual labs?
I spent hours looking at supposed virtual labs and came to the conclusion that there are many groups building simulations, games, and virtual labs, but there is very little consistency in the use of the terms.
I looked at many different fields including biology, physics, chemistry, geology, meteorology, Earth science, and from engineering hydrology and materials science, hunting for something that might fit the idea of “virtual” and “outstanding.” What struck me is that the vast majority were not really labs that would be usefully described as inquiry, and that would allow the students to vary parameters and truly discover relationships. Many seem to teach a narrow set of concepts in a relatively didactic way.
I was especially disappointed with most of the labs I found for Earth Science (my field). To give just one example, virtual earthquake at:
taught the bare basics of the seismological use of seismograms, but seemed little more than a standard lab-manual exercise. The student selects a region of the world in which to simulate an earthquake; the program simulates a quake without revealing its location and presents three seismograms. Using travel-time curves for P and S waves, the student converts each S-minus-P arrival-time lag into an epicentral distance. The program then draws three circles centered on the plotted seismograph locations, with radii equal to those distances; where all three intersect is the epicenter. An additional part uses a nomograph to determine Richter magnitude.
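The procedure just described can be sketched in code. This is a minimal, hypothetical illustration, not the Virtual Earthquake program itself: the average wave speeds, the flat local coordinate frame, and the brute-force grid search are all assumptions standing in for the lab's travel-time curves and graphical circle-drawing.

```python
import math

# Assumed average crustal wave speeds in km/s (real values vary with depth and region).
VP, VS = 6.0, 3.5

def epicentral_distance(sp_lag_s):
    """Distance (km) from a station, given the S-minus-P arrival-time lag (s)."""
    # Both waves travel the same path, so d/VS - d/VP = lag  =>  d = lag / (1/VS - 1/VP).
    return sp_lag_s / (1.0 / VS - 1.0 / VP)

def locate_epicenter(stations, half_width_km=300, step_km=2):
    """Brute-force grid search for the point whose station distances best match
    the three epicentral-distance circles.

    stations: list of (x_km, y_km, sp_lag_s) in a local flat coordinate frame.
    """
    circles = [(x, y, epicentral_distance(lag)) for x, y, lag in stations]
    best, best_err = None, float("inf")
    for gx in range(-half_width_km, half_width_km + 1, step_km):
        for gy in range(-half_width_km, half_width_km + 1, step_km):
            err = sum((math.hypot(gx - x, gy - y) - r) ** 2 for x, y, r in circles)
            if err < best_err:
                best, best_err = (gx, gy), err
    return best
```

With three or more stations, minimizing the squared mismatch mimics finding where the hand-drawn circles intersect; a finer grid step (or a proper nonlinear solver) would refine the estimate.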
All of this was done in a dry, matter-of-fact fashion that could not possibly create any real motivation or interest. It lacks what the developers of PhET describe as a potential for “engaged exploration” (p. 28 of our readings; also p. 28 of the original Adams et al., 2008 paper, available on the PhET research website).
Adams et al. (2008a) describe students as being in “engaged exploration” when they pose questions and seek “answers by observing the results of their own interactions with the simulation and making sense of what they see.” Students are described as being “more easily engaged in the exploration of topics that include relatively unfamiliar science.”
What happens when students are not engaged in the simulations? What reasons are there for students to not become engaged? Some reasons related directly to the design and content of the simulations include: insufficient time for interacting with the simulation; inability to figure out how to use the simulation; being overwhelmed by the simulation; not knowing where to start; and familiarity with the concepts, such that students then try to use the simulation as a demonstration tool instead of a learning tool.
Adams et al. (2008a and b; see reference list p. 129 of our reading, and various parts of our readings) used extensive interviews and interactions with students actively using simulations to attempt to determine how to best design simulations to promote engagement and to promote actual learning.
Obviously, the results of these and other researchers’ work on simulations and games are essential to consider when selecting such materials for use in secondary classrooms. We really want to avoid using such putative “virtual labs” as the virtual earthquake lab I referenced above (to their credit, they apparently have a newer version that purports to be designed in a more inquiry-oriented way, but so far I have been unable to get it to work properly).
Besides “engaged exploration” (active exploration of information in the simulation, to make sense of it; and exploration of relatively unfamiliar science content), Adams et al. (2008a) invoke the coherence principle and a consistency principle as concerns for the details of the design process. Clark and Mayer’s (2003) coherence principle can be paraphrased as: Don’t add unnecessary material to simulations, even if interesting, because it can inhibit the learning process. Inhibition can occur through distraction (guiding attention away), disruption (preventing linkage of concepts because irrelevant information intrudes), or seduction (activating pre-existing knowledge that may mislead).
In their second paper, Adams et al. (2008b) investigate the design elements of successful simulations. Some of that concerns relatively technical but straightforward aspects of how learners most easily use computer simulations: clicking, dragging, grabbing objects, sliders, buttons, and checkboxes. Those are all important, because inefficient use of those design elements can greatly affect learners’ ability to use, and remain engaged in, the simulation.
Three different levels of usability are distinguished: (1) Non-intuitive, requiring much instruction to use; (2) Semi-intuitive, usable after instruction and demonstration; and (3) Intuitive, the most desirable, being “easy to use with no instruction.”
The goal is to create fully intuitive simulations.
Given the high degree of diversity in ability, attention, and interest in science among secondary science students, it seems most important to strive to use virtual labs, simulations, and games that are as intuitive as possible. Of course, they also need to be engaging and have readily recognizable science learning content and realizable goals.
There is one more aspect that I want to highlight here, from general learning theory, which is applied in these simulation discussions by Adams et al. (2008b):
Learners who have little or no knowledge of new content tend to be unable to quickly determine which information is important (when multiple pieces of information are provided) or the relationships between those pieces of information.
Applied to representations in simulations, they put this as:
“Visual representations must be created with care because we observe that when students are learning about the phenomena they will apply equal importance to every feature…. Care must be taken not to overwhelm the students with too much new information at once.” (p. 14).
Also:
“Students look at all visual clues equally, if they do not understand a concept. It is important to emphasize items that are pedagogically important and eliminate all potential distractions.” (p. 20).
Very thorough reflection, Karst. I would tend to agree that the selection of "outstanding" virtual labs is limited. I think it may be one of the unfortunate disconnects between scientists, educators, and technology specialists when it comes to building these. The visuals are continuing to get better, so they are nice, shiny activities that wow the average viewer at first glance, but they are lacking the fundamental inquiry component that scaffolds learning while still requiring critical thinking and application. Sounds like a great grant opportunity for someone....
I agree that many of the virtual labs lack an inquiry component. Looking for virtual labs online, I noticed many were simple activities with yes-or-no type supplemental questions. This is where I think the teacher can step in and create their own inquiry component, and just use the lab for its visuals and provided data.
This course has been valuable for alerting me to possibilities of which I had limited awareness. I'm not particularly a games person, but my 13 year old 7th grade son spends inordinate amounts of time playing them online. This assignment has alerted me to a variety of educational games, simulations and labs. I am looking forward to exploring them with him after this semester ends and I have a bit of free time. He will absolutely love exploring these and will be able to give me a better insight into what works.
Last summer I had him use the free Khan Academy videos to start learning algebra. He liked those, but mostly because he loves computers.
http://www.khanacademy.org/
Note: Khan is not a good teacher, but many of his videos are helpful for kids like my son.
I actually did a lab very similar to the one you posted above as part of an online geology course. I admit it was not that engaging, but it did review what was outlined in the book and in lecture.
I agree with Joseph that as teachers we can help create an inquiry component. We can make sure students can explain WHY they got certain results. A lab report raising conceptual questions could help.
I also found that the terms simulation, lab, etc. were used inconsistently. For example, the site I turned in on the wiki page called itself a simulation, even though it allowed students to manipulate variables. I think as long as we know the elements to look for in a virtual lab, it doesn't matter that not all the material out there is named consistently.
As always, what matters is the use we put the instructional activities to, no matter what the materials or activities are: real labs with physical materials, or virtual labs, simulations, or games.
Many of the online items we looked at last week can be embedded usefully as activities within our instruction, if only we introduce them properly and provide the proper scaffolding for the particular students we have.