Open Access Article

Using Mobile Location‐Based Augmented Reality to Support Outdoor Learning in Undergraduate Ecology and Environmental Science Courses

2018; Ecological Society of America; Volume: 99; Issue: 2; Language: English

10.1002/bes2.1396

ISSN

2327-6096

Authors

Amy M. Kamarainen, Joseph Reilly, Shari Metcalf, Tina A. Grotzer, Chris Dede

Topic(s)

Mobile Learning in Education

Abstract

Augmented reality (AR) applications use a technological device to visually display digital information so that it appears to be overlaid on, embedded in, or activated by the physical environment. Augmented reality is an emerging technology that falls on a spectrum between real and virtual (Milgram and Kishino 1994); current descriptors of points along this spectrum include mixed reality, AR, and virtual reality (Klopfer 2008, Liu et al. 2017). While AR is currently most visible in the entertainment and gaming industries (e.g., Pokémon Go), there is growing theoretical and empirical evidence that AR supports learning and engagement (Price and Rogers 2004, Dede 2009, Radu 2014; Reilly and Dede, in press), and it is important that environmental educators working with all ages consider the opportunities and challenges these technologies present (McCauley 2017).

Because AR is an emerging medium, a number of formats fall under the AR "umbrella," and we describe the relevant distinctions below. There are two primary formats for AR—location-based and vision-based AR—and each offers different opportunities to support learning (Dunleavy 2014, Dunleavy and Dede 2014). There are also two primary modes for delivering AR experiences—through mobile devices (like smartphones and tablets) or through head-mounted displays (like the Microsoft HoloLens; Radu 2014).

Vision-based AR allows a designer to link digital information and media with a physical "trigger," which might be an object, image, or Quick Response (QR) code (like the black-and-white square shown in Fig. 1a). The camera on a smartphone, tablet, or head-mounted display (like the Microsoft HoloLens) is used to recognize the pattern of the trigger and activate the associated information and media, which are then displayed to the user. This works much like a barcode scanned at a grocery store to reveal the price of an item.

In contrast, location-based AR involves learners using GPS-enabled smartphones or tablets to activate media at particular locations in an outdoor space (Fig. 1b). A designer uses a map-based online interface to embed digital information and media at locations of interest, and the embedded information or media are activated when the user reaches that location (see the sketch below). After being activated by location or by a vision-based trigger, the AR application superimposes digital media, data, audio, video, art, and/or narratives on the real world, making this information appear to be embedded in or overlaid on the real environment.

Prior work on the use of AR in undergraduate teaching and learning contexts has focused largely on vision-based applications of AR. Studies in this area show that vision-based AR can support student understanding of concepts that require abstraction or interpretation of complex spatial relationships—concepts for which visualization is a useful tool (Radu 2014). For example, work by Lin et al. (2013) demonstrates that undergraduate students learned more about elastic collisions using an AR application for physics than with a 2D simulation, while Shelton and Hedley (2002) report improvements in undergraduate geography students' factual and conceptual understanding of complex spatial concepts associated with Earth–Sun relationships following use of an AR display. Also, when vision-based AR was applied to physics and astronomy laboratories, the AR treatment had positive effects on students' attitudes, skills, and conceptual understanding of specific concepts (Yen et al. 2013, Akçayır et al. 2016).
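To make the location-based trigger mechanism concrete, here is a minimal sketch of geofence-style activation logic. The names (PointOfInterest, haversine_m, triggered) and the activation radius are hypothetical illustrations, not any platform's API; platforms such as ARIS and TaleBlazer handle this internally.

```python
# Minimal sketch of location-based AR triggering (hypothetical names).
import math
from dataclasses import dataclass

@dataclass
class PointOfInterest:
    name: str
    lat: float              # decimal degrees
    lon: float              # decimal degrees
    media_url: str          # media to activate at this location
    radius_m: float = 15.0  # activation radius in meters (assumed value)

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points."""
    r = 6_371_000  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def triggered(user_lat, user_lon, pois):
    """Return the POIs whose media should activate at the user's position."""
    return [p for p in pois
            if haversine_m(user_lat, user_lon, p.lat, p.lon) <= p.radius_m]

pois = [PointOfInterest("Woodpecker cavity", 42.3751, -71.1056, "cavity_video.mp4")]
print([p.name for p in triggered(42.3752, -71.1057, pois)])  # within ~14 m: triggers
```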
While these studies represent a valuable "proof of concept," more work needs to be done to characterize how AR interfaces may support learning in undergraduate classrooms and to identify the limits of AR's utility. One limitation of the prior work is that much of it has been conducted in indoor learning environments with vision-based AR; the potential for location-based AR to support learning among young adults in outdoor contexts has been under-studied. Studies in mixed-reality contexts suggest that overlaying digital information on real physical objects can help support transfer by bridging abstract and concrete forms of understanding (Quarles et al. 2008). Augmented reality offers similar opportunities to bridge the abstract and the concrete (Rogers 2004), and for learners to build deeper connections with the material by learning about otherwise hidden physical, historical, and cultural aspects of the outdoor space (Zimmerman and Land 2014, Kamarainen et al. 2015).

For reasons of practicality and cost, this article focuses on AR experiences that are accessible through mobile devices (like smartphones and tablets), rather than those that require a head-mounted display. We present examples that use a combination of location-based and vision-based triggers. As described below, the affordances of mobile and location-based AR align most closely with learning goals relevant to ecology and environmental science.

Ecologists and ecosystem scientists bring to bear sophisticated conceptual models and background knowledge when they observe or study natural systems (Eberbach and Crowley 2009; Kamarainen and Grotzer, in review). These perspectives provide scientists with a "search image" that can help them identify patterns, notice things that are unusual, pay attention to relevant signals, and connect their observations to prior understanding of natural history. This leads to the question: "How might AR support learners in seeing the world through an ecologist's eyes?" While AR platforms and experiences have only recently reached a level of maturity that makes broad use in learning contexts feasible, research using early versions and prototypes provides compelling arguments that well-designed AR experiences support learning. (A short review of this active area of research is provided below.)

Augmented reality can support ecology learning by revealing hidden or invisible aspects of the system and by linking visualizations of hidden processes with macro-scale or emergent outcomes. Dunleavy (2014) refers to this as using AR to "see the unseen" and highlights it as a general principle for designing impactful AR for learning. Prior research supports the idea that prompting students to notice and reflect upon the processes responsible for patterns and change in natural systems can shift student thinking from static or event-based notions of causality toward process-based explanations (Lindgren and Moshell 2011, Grotzer et al. 2013). There are a number of ways AR can be leveraged to support these outcomes. Augmented reality locations can be positioned at places where one wants students to observe an object, pattern, or phenomenon that they might not otherwise notice—for example, a nesting cavity created by a woodpecker, the layer of silt left behind after a spring flood, or a path cut through brush by a deer.
As described by Eberbach and Crowley (2009), engaging in scientific observation is a more challenging skill than is generally appreciated, and it requires the coordination of disciplinary knowledge, practices of observation, and the application of one's attention. Novices often do not know what to look for, and so may quickly give up or overlook interesting artifacts. Augmented reality can be used to alert students to an opportunity to observe something meaningful that connects with ideas they have been learning, and tips, reminders, and reflective prompts embedded in the experience can encourage students to adopt practices of observation that mirror those used by experts (Klopfer and Squire 2007, Dunleavy and Dede 2014, Grotzer et al. 2015).

Augmented reality can also be used to overlay or link multiple representations of a system or phenomenon (Zimmerman and Land 2014, Kamarainen et al. 2016). Many ecological changes are driven by organisms or processes that are too small to see, while emergent patterns or outcomes may be best visualized from a bird's-eye perspective. When making sense of scientific visualizations and abstractions, students benefit from viewing multiple forms of representation (Ainsworth 1999, Wu and Shah 2004), manipulating and interacting with physical models as well as visual representations, engaging in metacognition and reflection related to the visualizations (Chang et al. 2009, Wu and Shah 2004), and making links among representations (Wu and Shah 2004). Augmented reality can support this by juxtaposing multiple representations, or by overlaying them on a physical pattern or phenomenon, which allows the user to connect and compare the representations without having to hold one in mind while accessing a second (Tang et al. 2003, Pathomaree and Charoenseang 2005, Radu 2014; see the sketch below).

Augmented reality visualizations can also communicate changes over time, by embedding views or narratives that describe the history of a place (Zimmerman and Land 2014, Grotzer 2015) or by pointing to evidence of change over time that may be present within the landscape. Ecosystems can have long "memories"—the abiotic conditions, species composition, and relationships present may depend on what happened at the site last season or long ago. Through repeated visits and careful observation, ecologists often develop deep knowledge of the history of a place, its rhythms, and its phenology. Using AR to provide visitors with time-lapse, before-and-after, or "what if it were winter" views of a space that they may visit only once can provide a powerful shift in perspective.

For example, residents near a community park that was the location of one of our augmented field trips had fortuitously installed a hidden video camera and collected footage of nocturnal and diurnal visitors to the park. We embedded a highlight reel from the camera in the AR experience and activated the video when students arrived at the tree in which the camera had been mounted. Students watched as the nighttime footage showed a fox passing through, a pair of raccoons climbing the tree, and a coyote urinating near the base of the tree to mark its territory. The next clip showed a daytime scene of a neighborhood dog sniffing the location and adding its own scent to the mix. The AR experience prompted students to consider that many organisms frequent the park even though they may not be seen during the daytime field trip. These ways of making the invisible visible help students to see an environment through an ecologist's eyes.
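As one illustration of linking multiple representations to a single trigger, here is a minimal sketch. The names (Representation, TriggerPoint) and media files are hypothetical; real platforms expose similar structures through their online editors rather than code.

```python
# Minimal sketch: bundling multiple representations with one AR trigger, so a
# user can compare views without holding one in mind (hypothetical names).
from dataclasses import dataclass, field

@dataclass
class Representation:
    kind: str    # e.g., "micro-animation", "aerial-photo", "time-lapse"
    asset: str   # file or URL for the media (hypothetical filenames below)

@dataclass
class TriggerPoint:
    label: str
    representations: list = field(default_factory=list)
    reflection_prompt: str = ""

pond_edge = TriggerPoint(
    label="Pond edge",
    representations=[
        Representation("micro-animation", "decomposers.mp4"),  # too small to see
        Representation("aerial-photo", "pond_aerial.jpg"),     # bird's-eye pattern
        Representation("time-lapse", "pond_seasons.mp4"),      # change over time
    ],
    reflection_prompt="How does what you see on-screen connect to what is in front of you?",
)

# When the trigger fires, present the representations alongside the prompt.
for r in pond_edge.representations:
    print(f"show {r.kind}: {r.asset}")
print(pond_edge.reflection_prompt)
```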
How complex is developing these kinds of learning experiences? The basic architecture of location-based or vision-based AR platforms includes two parts: (1) an online editor that the designer uses to upload media, link those media with GPS locations or visual QR codes, and orchestrate the sequence of events the user will engage with during the experience; and (2) a client-side application, downloadable to a mobile device, that allows the user to log in and access the experience the designer has constructed.

There are a number of location-based AR platforms that allow you to design your experience for free, and most also allow you to share the experience with unlimited users for free (e.g., ARIS, TaleBlazer, Aurasma), though some charge a fee to use the experience with more than one user (e.g., FreshAiR) (see a list of experiences and platforms in Dunleavy and Dede 2014).

A number of these platforms have recently been applied to undergraduate learning contexts. Klopfer and Squire (2008) summarize how a predecessor to TaleBlazer was used to create a mobile AR game called Environmental Detectives that supported undergraduates in an environmental science course. More recently, Clements (2017) outlines how the TaleBlazer AR platform was used to design a guided tour of a canyon for an undergraduate physical sciences course, while Holden and Sykes (2011) offer an example of the use of ARIS to support place-based foreign language learning by undergraduate students. This body of work contains valuable guidance for the design of location-based AR activities for undergraduates.

Each location-based AR platform outlined above has pros and cons, but they share similar design features that enable a designer to embed media and link those media with location-based or vision-based triggers. A designer can upload different forms of media (including text, images, audio, and video) into the online editor and then embed these in the experience by linking them with particular locations or "characters" (e.g., a simulated ranger) who are part of the narrative of the experience (Fig. 2). Locks and conditions can be used to turn locations and media on and off so that, for example, the user must stop at the "toolbox" and pick up a measurement device before a sampling location appears on their display (see the sketch below). In its simplest form, the experience may unfold as a series of sequential stops on a virtual tour of the location—akin to a tour guided by a virtual naturalist. But the platforms also allow a designer to thoughtfully use locks, items, and triggers to develop creative narratives, compelling games, and interactive social experiences that engage learners in new and unexpected ways (J. Reilly and C. Dede, forthcoming).
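Here is a minimal sketch of the lock-and-condition idea, using the toolbox example. The class and item names are hypothetical; platforms like ARIS and TaleBlazer express the same logic through "locks" configured in their online editors rather than code.

```python
# Minimal sketch of lock/condition logic in a location-based AR experience:
# a location stays hidden until the user has picked up a required item.

class Experience:
    def __init__(self):
        self.inventory = set()

    def pick_up(self, item):
        self.inventory.add(item)

    def visible_locations(self, locations):
        """A location is shown only if its lock condition is satisfied."""
        return [name for name, required_item in locations
                if required_item is None or required_item in self.inventory]

locations = [
    ("Toolbox", None),                            # always visible
    ("Sampling station", "dissolved O2 probe"),   # locked until probe is held
]

exp = Experience()
print(exp.visible_locations(locations))   # ['Toolbox']
exp.pick_up("dissolved O2 probe")         # user stops at the toolbox
print(exp.visible_locations(locations))   # ['Toolbox', 'Sampling station']
```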
Once the design is complete, the experience must be published; it then becomes accessible to the user through the application running on their mobile device (see Fig. 3 for an example of the user interface).

A number of freely available platforms provide off-the-shelf experiences by allowing a designer to copy or drag-and-drop an experience from one location to another. This is a quick and easy way to get started, but it is always important to consider the relationship between the experience and the place in which it will be enacted. Some AR experiences are designed to be place-agnostic: the virtual aspects of the experience refer to fairly ubiquitous environmental features (like air, water, or trees), so the experience can be transported to a new location without losing the fidelity of the learning experience (a coordinate-shift sketch of such a relocation appears below). In contrast, other experiences are designed to be deeply place-dependent, referring to specific and unique characteristics of the environment that may not be found in other locations (e.g., a snakeskin discovered on the ground; Kamarainen et al. 2015). When using off-the-shelf experiences, it is critical to consider whether and how the original design aligns with the features of your location.

Modifying an existing experience is another way to get started. Modifying a functioning AR experience allows you to observe how the experience appears in both the design and user interfaces and to understand the relationships among its components. Once you have unpacked the existing design, you can begin to bring your own ideas into the mix by uploading and replacing media, changing the text, adding new locations, or changing the order and sequence of user interactions. Each of the AR platforms mentioned above (ARIS, TaleBlazer, FreshAiR) has online resources, manuals, and tutorials available to support new designers. These resources make it fairly straightforward to modify a template or to design and build your own AR experience from scratch.

In addition, an exciting way to use AR in your undergraduate classroom would be to engage your students in the design of their own AR experiences. Scholars suggest that engaging students in design is a powerful way to support learning: it involves students in "learning by doing" and prompts them to incorporate multiple tools and resources; the design process can guide learners into a way of thinking; and frustrations that arise during design become rich opportunities for problem-solving (Squire and Jan 2007, Petrich et al. 2013). Prior work outlines benefits of engaging students, as young as middle school, in AR design activities, including critical analysis, deeper thinking, and increased understanding of issues in their community (Klopfer and Sheldon 2010, Coulter et al. 2012, Bower et al. 2014, Martin et al. 2014). Given the diversity of freely available AR platforms and their ease of use, the design of AR experiences may be an ideal way to engage undergraduate learners in deeper understanding of the environments around them. You might imagine a course project in which students are required to collect images, video, or audio that capture and represent how different ecosystems on campus change over time (perhaps over the course of a day, a month, or a season). They might embed the media they have collected in an AR experience that can be shared with the rest of the class or even the larger campus community. As students use and critique the AR experiences created by their peers, they might gain understanding of diurnal patterns, phenology, or seasonal cycles, and thus more deeply appreciate how ecological processes and dynamics play out in the ecosystems that surround them.
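Returning to the earlier point about transporting a place-agnostic experience: here is a minimal sketch of relocating every point of interest by the offset between an old and a new anchor point. All names and coordinates are hypothetical, and the plain lat/lon shift is only a small-area approximation (over campus-scale distances); for large changes in latitude, longitude offsets should be rescaled by cos(latitude).

```python
# Minimal sketch: relocating a place-agnostic AR experience by shifting every
# point of interest (POI) by the offset between old and new anchor locations.
# Small-area approximation; hypothetical names and coordinates.

def relocate(pois, old_anchor, new_anchor):
    """Shift each (name, lat, lon) POI so the layout is preserved at the new site."""
    dlat = new_anchor[0] - old_anchor[0]
    dlon = new_anchor[1] - old_anchor[1]
    return [(name, lat + dlat, lon + dlon) for name, lat, lon in pois]

original = [
    ("Toolbox",     42.37510, -71.10560),
    ("Pond edge",   42.37532, -71.10601),
    ("Storm drain", 42.37498, -71.10577),
]

# Anchor the experience on a different campus pond instead of the original site.
moved = relocate(original,
                 old_anchor=(42.37510, -71.10560),
                 new_anchor=(40.80695, -73.96243))
for name, lat, lon in moved:
    print(f"{name}: {lat:.5f}, {lon:.5f}")
```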
EcoMOBILE was a research project funded by the National Science Foundation (DRL-1118530) that explored the utility of blending AR experiences with multi-user virtual environments for middle school ecosystem science instruction. Through a process of design-based research, we designed and tested a number of EcoMOBILE activities that focused on different physical spaces, content areas, and technological modalities (Kamarainen et al. 2013, Grotzer et al. 2015, Kamarainen et al. 2015, 2016). Following this process of iterative refinement, we used the ARIS platform (Gagnon 2010, Holden et al. 2014) to produce a number of location-based AR experiences, with associated curricula, that range from place-dependent to place-agnostic. We have made these publicly available (http://ecolearn.gse.harvard.edu/ecoMOBILE/overview.php) and hope that designers will download and modify them to best fit the unique features of their own locations. To demonstrate the design features discussed above, we describe a few of the EcoMOBILE experiences that are available.

Atom Tracker is designed to help students better understand the cycling of matter in ecosystems, with a focus on the concept of conservation of matter and the processes of photosynthesis and respiration. The literature suggests that middle school students struggle to comprehend the particulate nature of matter and to make sense of processes that are not visible in everyday life, challenges that contribute to difficulty in reasoning about photosynthesis and respiration (Lee et al. 1993, Cho and Anderson 2006). Difficulties in understanding these molecular processes persist into undergraduate classrooms, confounded by misunderstanding of the relationship between matter and energy, and these misconceptions can be relatively resistant to instruction (Anderson et al. 1990, Hartley et al. 2011).

To address such challenges, the EcoMOBILE Atom Tracker Module invites students to follow a carbon or oxygen atom through the environment. The oxygen atom begins as part of a water molecule, and students trace the movement of water either through infiltration followed by transpiration or via runoff from pavement; eventually the water molecule takes part in photosynthesis, and an oxygen molecule is released from a tree. The carbon atom begins as part of a starch molecule within a duckweed plant; after the duckweed is eaten by a virtual duck, the digested material settles to the bottom of the pond and is used by bacterial decomposers. Carbon dioxide is released and makes its way into the atmosphere, where the CO2 molecule is taken up by a plant and takes part in photosynthesis, the carbon atom thus returning to being part of a starch molecule in a different plant (a sketch of one such pathway as a simple data structure appears below).

The AR provides ways to engage students in active and experiential learning activities—pedagogical strategies that have been shown to be effective in supporting student learning (Price and Rogers 2004). Students physically move around the environment tracing the pathways of water, oxygen, and carbon. Physical embodiment of the movement of molecules likely gives students a more dynamic perception of material cycling (Price and Rogers 2004, Lindgren and Moshell 2011). Through this activity and movement, the design aims to give a tacit sense that atoms and molecules, though not directly visible, are all around us, as well as a physical sense of the pathways for movement of matter in the environment. Students are prompted to look for and notice features of the environment (e.g., look for duckweed; notice and document the presence of pavement and storm water drains) that connect directly with the movement of their virtual atom (Fig. 3).
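As a sketch of how such a pathway might be represented, here is the carbon atom's journey as an ordered list of stops. The variable names are hypothetical; in the actual ARIS experiences, each stop is realized as a location or QR trigger rather than code.

```python
# Minimal sketch of the Atom Tracker carbon pathway as an ordered list of stops
# (hypothetical names; stops would be AR locations or QR triggers in practice).
carbon_pathway = [
    # (stop in the real environment, molecule holding the atom, process)
    ("Duckweed at pond edge", "starch",         "photosynthesis built this molecule"),
    ("Virtual duck",          "starch",         "ingestion"),
    ("Pond bottom",           "organic matter", "digestion and settling"),
    ("Bacterial decomposers", "CO2",            "decomposition (respiration)"),
    ("Atmosphere",            "CO2",            "diffusion out of the water"),
    ("A different plant",     "starch",         "photosynthesis (carbon fixation)"),
]

# Walk the pathway as a student would, stop by stop.
for stop, molecule, process in carbon_pathway:
    print(f"At {stop}: the carbon atom is part of {molecule} ({process})")
```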
A QR code attached to the trunk of a tree activates an animation of photosynthesis, represented by molecules of carbon dioxide and water rearranging to form glucose and oxygen within a chloroplast (Fig. 4). By layering multiple representations—images and animations with real objects (water, trees, duckweed)—students can situate and visualize otherwise hidden processes, including transpiration, photosynthesis, and respiration. By tracing the same atom through multiple chemical reactions that situate the atom in different molecules, students gain insight into conservation of matter: the atom may move, change form (from liquid to gas), and become part of different physical objects, but it is never destroyed (Fig. 5).

The EcoMOBILE Atom Tracker Module includes supporting classroom activities that reinforce the ideas tacitly communicated through the augmented field trip experience. In class, students might build their own version of the carbon or oxygen cycle based on their experience. We used the move–stick–change framework outlined by Weathers et al. (2012) to help students think about the movement and fate of matter in the context of stock and flow models. The framework provides a simple way of characterizing the fate of elements in an ecosystem: when thinking about a molecule and where it is found over time, it might "move" from one place to another; it might "stick," staying in its original location and form; or it might "change," being transformed from one molecular form to another through a physical, chemical, or biological process (see the sketch below).

The design of EcoMOBILE Atom Tracker is place-agnostic, meaning that the experience can be used in any place that has water, plants, and some pavement. We chose colorful, cartoon-like ball-and-stick representations of atoms and molecules; these are simplifications compared to current understanding of atomic structure, but they are easy for students to parse and trace throughout the experience. The experience guides students through a few canonical pathways for oxygen and carbon, yet the number and diversity of pathways shown is relatively limited, so the experience does not fully represent the random and probabilistic patterns of movement and interaction that characterize atoms and molecules. It is intended to reveal hidden processes involved in the movement and transformation of materials in ecosystems; in focusing on these learning outcomes, it foregrounds certain aspects of the nature of matter while backgrounding others. One potential modification would be to engage students in designing additional pathways that could be incorporated into the Atom Tracker experience, including pathways that unfold over different time scales or pathways of different elements (e.g., nitrogen or phosphorus). Engaging students in revising the experience could prompt them to consider the ways in which the original experience faithfully represents the movement and fate of materials, and the ways in which it could be improved to better represent what is currently known about material cycling in ecosystems.
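Here is a minimal sketch of the move–stick–change classification. The function name and tuple format are hypothetical; Weathers et al. (2012) present this as a conceptual framework, not code.

```python
# Minimal sketch of the move-stick-change framework: given where a tracked atom
# was and where it is now (and in what molecular form), classify its fate.

def classify_fate(before, after):
    """before/after are (location, molecular_form) tuples; returns a set of fates."""
    fates = set()
    if after[0] != before[0]:
        fates.add("move")    # relocated to a new place
    if after[1] != before[1]:
        fates.add("change")  # transformed into a different molecular form
    if not fates:
        fates.add("stick")   # same place, same form
    return fates

print(classify_fate(("pond", "CO2"), ("atmosphere", "CO2")))          # {'move'}
print(classify_fate(("duckweed", "starch"), ("duckweed", "starch")))  # {'stick'}
print(classify_fate(("pond bottom", "organic matter"),
                    ("atmosphere", "CO2")))                           # move + change
```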
Another powerful modification of this activity would be to trace both matter and energy through the processes depicted in the current version of Atom Tracker. This would help address core misconceptions documented among undergraduate learners about how energy is used and transformed during basic ecological processes like photosynthesis and respiration (Anderson et al. 1990).

The Water Quality Measurement Module is framed as a continuation of a classroom-based investigation that students conducted in a virtual world called EcoMUVE (Kamarainen et al. 2013). In the EcoMUVE curriculum, students explore a virtual pond environment, discover a fish kill, and collect information and evidence to build an argument about what happened to the fish (http://ecolearn.gse.harvard.edu/ecoMUVE/overview.php). The subsequent EcoMOBILE Water Quality Measurement experience invites students to explore a real pond or stream to evaluate whether a fish kill is likely to occur in the real system.

The AR app leads students to a water measurement "toolbox," where they pick up environmental probes (which might measure temperature, dissolved oxygen, or turbidity; Fig. 6). With probes in hand, students explore the area and collect data on the water quality of a nearby pond or stream (Fig. 7). Through prompts embedded in the AR experience, they share and compare their measurements with those collected by their peers and gather other evidence about factors that might influence their measurements. The goal of this experience is to support students in collecting their own water quality data and collaboratively constructing a data set that reveals the scope and range of natural variation.

The location-based AR triggers guide students to a location where they can pick up a probe that allows them to capture real-time measurements of their environment. In addition to helping students navigate the space and find resources, the AR experience delivers tips and reminders about how to use the probes. This just-in-time support allows students to explore the environment, move at their own pace through the experience, and collect samples and measurements at any location they choose. We have found that students appreciate an experience designed to offer a balance between freedom and support.

After collecting data from multiple locations around the pond, students are led to a "data gallery," where they write their data values on sticky notes and post them on large foam-backed number lines and maps (Fig. 8). As more students visit the data gallery, patterns emerge: students can see trends in the frequency of data values or notice spatial patterns in the data based on the map representation. Providing multiple representations of the data can help students make comparisons and connect the variation in values with physical features of the environment. In the third part of the activity, we use the affordances of AR to help students engage in deeper interpretation of their data; they have the freedom to visit any of the locations that interest them and to engage with the resources embedded at each.

The water quality measurement experience addresses a number of learning goals that are relevant to undergraduate audiences. Students engage in authentic approaches to scientific practice, as the experience guides them to use environmental probes to collect their own data. These data are then displayed in a visual representation that allows the aggregation and sharing of data from multiple people, times, and locations. In our case, we used physical poster-based displays available in the environment, but you might imagine using a shared Google spreadsheet to aggregate student data and make it available for further analysis (see the sketch below). Such activities can be leveraged to support students in analyzing highly variable data and applying appropriate statistical techniques to make sense of the spatial and temporal patterns in these data.
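As a minimal sketch of that aggregation idea, the snippet below pools student measurements (as they might be exported from a shared spreadsheet) and summarizes variation by site. The data, site names, and column layout are hypothetical; only the Python standard library is used.

```python
# Minimal sketch: aggregating student water-quality measurements and
# summarizing variation by site (hypothetical data and names).
from statistics import mean, stdev
from collections import defaultdict

# (student, site, dissolved oxygen in mg/L)
measurements = [
    ("A", "inlet",  7.9), ("B", "inlet",  8.3), ("C", "inlet",  8.1),
    ("A", "outlet", 5.2), ("B", "outlet", 4.8), ("C", "outlet", 5.5),
]

by_site = defaultdict(list)
for _, site, do_mgL in measurements:
    by_site[site].append(do_mgL)

# Summary statistics reveal both central tendency and natural variation.
for site, values in by_site.items():
    print(f"{site}: n={len(values)}, mean={mean(values):.2f} mg/L, "
          f"sd={stdev(values):.2f}")
```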
Part of what makes AR a powerful instructional tool is that it allows the instructor to support student learning outside the confines of the classroom, and to do so in ways that connect with core concepts or practices emphasized in the course. You can use AR to prompt reflection and metacognition by asking users to consider how what they are observing in the real world connects with what they are learning in class. These pedagogical moves will be most effective if students are also given opportunities to engage in reflection and metacognition in other activities during the course.

Another way these AR activities can be integrated into instruction is to bring artifacts from the field experience back into the classroom for further analysis or discussion. Artifacts may include data, images, or observations. We found that bringing a combination of artifacts back to the classroom helped support rich discussions about the causes of data variability and also supported students in building conceptual models (of material cycles or food webs; Cooke et al. 2016, Kamarainen et al. 2016). Since the activity supports self-paced and self-directed work, each student's experience is different, and this diversity of experiences can be leveraged in the classroom to support learning that is peer-enabled and personalized. Augmented reality will be most powerful when it is thoughtfully integrated into the overall arc of instruction in ways that clarify, emphasize, and connect with knowledge and practices discussed in class.

The EcoMOBILE experiences were developed through an iterative design process that involved testing and revising the experiences based on pilot tests with middle school students and teachers (Kamarainen et al. 2013, 2016). While the final designs that are publicly available are tailored to middle school audiences, the content and language of these experiences could be modified to suit an older audience. Many of the core concepts explored by these activities (e.g., conservation of matter, understanding the carbon cycle, making sense of data variability) are difficult even for undergraduate students and align with core learning goals outlined in Vision and Change in Undergraduate Biology Education (American Association for the Advancement of Science 2013).

Beyond supporting the learning outcomes outlined in the descriptions above, using AR with your undergraduate students could help you connect the concepts you are discussing in class with outdoor environments that students pass through every day. Students often seek evidence that what they are learning connects to their everyday lives, and embedding learning experiences in the places they pass through each day can make that connection clear. If you use AR to give your students "X-ray vision" so they can see transpiration happening in the big oak tree near the entrance to the lecture hall, they may just think of it each time they pass the tree on their way to class.
The idea of including your students in the design process has the potential to support both learning and engagement (Klopfer and Sheldon 2010, Coulter et al. 2012). The AR platforms offer rich possibilities for creative designs, and students are likely to come up with ways to engage and communicate with one another (a form of peer-to-peer learning) that could offer a powerful alternative or reinforcement to the way the same ideas are communicated by the instructor. The platforms also offer ways to "share" experiences that have been built, so an instructor can easily review students' experiences and offer feedback on content and quality.

Augmented reality makes it possible to design multiple field trips with different themes to accompany any course, because an instructor does not need to organize the logistics normally associated with a field trip or take time out of class to accompany students during the experience. The majority of undergraduate students have their own smartphones, so they could potentially complete an AR experience at their own pace and on their own time, even if the location is not on campus.

In summary, ecologists and ecosystem scientists carry a wealth of background knowledge with them when they go out into natural systems (Kamarainen and Grotzer, in review), and students gain access to only a small fraction of this insight through typical classroom instruction. Engaging with these ideas, perspectives, and knowledge through the use of AR has the potential to support deeper understanding of core concepts in the discipline and more meaningful engagement with the environments around us.

EcoMOBILE research was supported by National Science Foundation grant no. 1118530 and by the Qualcomm Wireless Reach Initiative. EcoMUVE was supported by the United States Department of Education Institute of Education Sciences through grant no. IES-R305A080514. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the National Science Foundation, the United States Department of Education, or Qualcomm Technologies, Inc.
