Open access article, peer reviewed

Discipline‐centered post‐secondary science education research: Distinctive targets, challenges and opportunities

2014; Wiley; Volume: 51; Issue: 6; Language: English

DOI

10.1002/tea.21165

ISSN

1098-2736

Authors

Brian P. Coppola, Joseph Krajcik

Topic(s)

Education and Critical Thinking Development

Abstract

This is the second JRST Special Issue on Discipline-Centered Post-Secondary Science Education Research. The response to our focus on the distinctive role of the discipline in shaping science education research at the post-secondary level (Coppola & Krajcik, 2013) has been overwhelmingly positive. In this issue, our selection of papers raises questions about how meaningful learning outcomes at the college and university levels are influenced by the rich background and prior knowledge of post-secondary students, about how gaining understanding of relevant subject matter intersects with the ability to use it productively, and about how to bridge the transition from working in school settings to the real-world needs of the baccalaureate class as they become professionals and practitioners.

Promoting deep versus surface learning (Marton & Saljo, 1976) is an unquestioned axiom of education. Deep learning (synthesizing and integrating new knowledge in the context of prior knowledge, permitting critical problem-solving in new and unfamiliar situations) is deemed to be better than surface learning (memorization of unlinked facts with no integration into existing contexts, privileging recall of information and procedural heuristics over understanding).

In his critique of situated cognition, Bereiter (1997) provides the fictionalized account of two students who do equally well when they take Algebra I, thus having been in the same situations, but who nonetheless diverge when they take Algebra II. Flora, it is supposed, has used meaningful learning strategies (Ausubel, 1963) in Algebra I, while her classmate Dora has used rote and recall strategies. And while they both did well in Algebra I, presumably because the assessments could not sort them out, Dora fails Algebra II because she has not learned in a way that allows her to transfer and apply the facts and heuristics she can only recall. Dora has learned how to recognize and reproduce what is needed to do Algebra I problems, and how to take Algebra I tests, but she has not learned Algebra.

Bereiter's point in no way takes instructors off the hook for making good choices in creating thoughtful learning environments that facilitate meaningful learning. The full intent behind an excellent teacher's learning environment is never self-evident, nor can the design presuppose that its users, the students, will automatically follow the prescription. Without stating the aphorism explicitly, Bereiter rightly reminds instructors and researchers alike: you can lead a horse to water…

The choices made by Flora and Dora are as important as the teacher's. The perceptions and decisions made by the students in a learning environment are as critical to understanding that environment, and its educational success, as are the perceptions and decisions made by instructors. Seminal work by Ramsden and Entwistle (1981) laid the groundwork for understanding the intimate, reciprocal relationship between the design of a university-level learning environment (“teaching”) and the decisions made by university-level learners, as well as its connection to fundamental positive learning characteristics (Ning & Downing, 2012).

In his 1991 Editorial in this Journal, “Mantras, False Dichotomies, and Science Education Research,” Jim Wandersee took exception to the then-common way of expressing Flora's strengths: she was learning process, not content.
Wandersee explicitly and justifiably characterized “content versus process” as a canard, a false dichotomy, and implicitly suggested the usefulness of the relationships shown in Figure 1 (Ege, Coppola, & Lawton, 1997). He identified a third type of student, whom we'll call Cora, with his whimsical label of “intellectual amnesiac”: someone who knows how to think, but who has nothing to think about. More seriously, this category is an excellent reminder that understanding a new topic (content and process) does not take place in isolation, but embedded in the existing sets of facts, relationships, and strategies that already exist in the mind of the learner. Not having all the facts about something does not preclude Cora from creating relationships and drawing conclusions based on what she does know. In fact, some might say Cora is building hypotheses, critical components of which are recognizing that your factual knowledge is limited and that your propositions depend on an array of assumptions.

Wandersee's formulation rightly characterizes Flora, in the “expert learning” quadrant, as a person who can take advantage of a well-structured learning environment. She builds her new knowledge by integrating what she is learning with her prior knowledge, including relevant content, the situational conditions for its use, existing processes for making meaning, and the ability to identify and evaluate useful analogies from outside the immediate domain of interest. In the dichotomous “content versus process” competition, Flora can be seen as sacrificing her learning when she accumulates factual information, which is unfair because having factual information is a critical component of learning. Factual information, however, needs to be accumulated in an environment that promotes connection, coherence, and integration in order to be meaningful. Although Dora has made a poor choice when she only memorizes and recalls facts and heuristic operations, as has a teacher whose learning environment only promotes these goals, she may have a foundation for learning based on relevant (and not random) associations. As an “encyclopedist,” she has made a connection between a topic (the encyclopedia topic) and some relevant information (the encyclopedia entry). As in her Algebra I class, where, presumably, the assessments were built on rote and recall, she can do as well as Flora only if a critical component of the learning environment is flawed. Make no mistake: Dora is in trouble.

Let's be clear. There is no point, at any level of education, where the legitimate need to have factual information can be an excuse for poor instruction; that is, for teaching and testing around a rote, recognition, and recall scheme because one version or another of this argument is used: students cannot possibly understand anything unless they first know this list of facts (Momsen, Long, Wyse, & Ebert-May, 2010). Unless the environment supports the integration of those facts into a larger framework in which students can build upon their prior learning and experiences, that list of facts will not really be known in any meaningful way at all. For all her apparent success in Algebra I, Dora is behind the eight ball when she starts Algebra II, and she has low odds of recovering from what was ultimately a bad experience, or of persisting in an area that might need her to understand Algebra (Seymour & Hewitt, 1997).
The underlying context for Figure 1 is that a learner, and particularly one at the post-secondary level, comes into a new learning situation with years of experience to draw from, representing an idiosyncratic mélange of facts, relationships, and processing skills, and covering a range of accuracy. This context is the one in which the new learning will occur. Acknowledging this view of learning is a point made by recent reports guiding the direction of K-12 science education (National Research Council, 2007, 2012). In addition, the idea that imperfect prior knowledge and experience can potentially derail, or at least have adverse effects upon, new learning is also going to be amplified for post-secondary students. An important change in thinking, from “fixing misconceptions” to “building upon knowledge in pieces,” is truly reflective of a student-centered perspective, acknowledging that the learning cannot be separated from its situation (diSessa, 1993; Özdemir & Clark, 2007).

In Figure 2, we want to return concretely to this point about how important a different balance in the strengths and weaknesses suggested by Figure 1 can be. Imagine that in moving from Bereiter's fictitious Algebra I to Algebra II class (1997), Flora, Dora and, now, Cora need to extend from using simple integers to non-integers and negative numbers. Flora has the relevant recall from Algebra I, and she has integrated her knowledge of manipulating non-integers and proceeds with her learning in Algebra II. Dora's strategy for handling problems with integers, as it turned out, involved her profound ability to recall multiplication tables and their patterns, but this does not help her deal with the non-integers and negative numbers (regardless of how well she understands them as topics). Cora, who did not always do well in upper level math, provides an intriguing set of answers: five out of six of her responses are correct. While it might be tempting to encourage her as a natural Algebra talent, comparable to Flora, who simply got a little sloppy on one of the easy examples, it turns out that Cora has a huge conceptual problem. She is exceptionally poor at multiplication and division and quite good at addition and subtraction. If you have not noticed it by now, five of these examples are from the subset of those where the operations of addition and multiplication produce the same numerical answer (Hoffmann & Coppola, 1996); a short characterization of that coincidence follows below.

Teachers who view Cora as doing quite well, as a pupil who just needs to take a little more care with her work, are not using a truly student-centered approach when analyzing the situation. And researchers who might study these three students cannot use the ability to produce correct answers as a surrogate for conceptual understanding, which is an important and often-neglected lesson. A “meaningful versus rote learning” debate (Mayer, 2002) is no less a false dichotomy than “content versus process.” Cora has taken a commonsense (Friedman, Forbus, & Sherin, 2011) approach, constructing a quite consistent performance based on an analogy (Gentner & Smith, 2013; Taylor, Friedman, Forbus, Goldwater, & Gentner, 2011); that is, she consistently and correctly deploys the operation of addition in place of multiplication. Friedman et al.'s (2011) subjects, like the college graduates interviewed nearly 25 years earlier in “A Private Universe” (Schneps & Sadler, 1988), also use their everyday experiences quite consistently when constructing their explanations about the seasons.
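Since the editorial does not reproduce Figure 2's items, the coincidence Cora exploits is worth making explicit. A minimal worked equation follows; the specific example pairs are our illustration, not the authors':

```latex
\documentclass{article}
\usepackage{amsmath}
\begin{document}
% When do addition and multiplication give the same answer?
\[
a + b = ab
\;\Longleftrightarrow\; (a-1)(b-1) = 1
\;\Longleftrightarrow\; b = \frac{a}{a-1} \quad (a \neq 1).
\]
% Examples: $2+2 = 2 \cdot 2 = 4$ and $0+0 = 0 \cdot 0 = 0$;
% over the positive integers, $a = b = 2$ is the only instance.
% A test built mostly from such items cannot distinguish a
% student who adds from one who multiplies.
\end{document}
```

A student like Cora, substituting addition for multiplication, scores well on any item drawn from this subset while failing on everything outside it.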
What they and Cora lack is the relevant information, and the conditions of its use, not the intrinsic ability to use it. Thus we adapt Wandersee's procedure for resolving false dichotomies into Figure 3, and remind ourselves that integrating relevant factual information with how to use it is critical to Flora's Meaningful Learning (Linn & Eylon, 2006), and that Cora's Analogical Learning quadrant (operating well with incomplete or incorrect information) can be just as problematic as Dora's Rote Learning quadrant (incorrect or incomplete operation on the relevant content). The consequences of these ideas for post-secondary science education research are that more adult learners bring more prior knowledge and learning experience to the table, and they have probably not learned everything they know through the same starting point or by the same pathway. Their answers cannot be categorized simply as right, rote, or wrong. Ausubel's third point is critical, and it is the one on which Bereiter's commentary about situated cognition pivots: the learner's agency to integrate ideas, to make connections to prior knowledge and skills, to learn meaningfully when the opportunity exists.

Pursuing meaningful (deep, not surface) learning extends to researchers, too! Simply evaluating the accuracy of Cora's answers (Figure 2) is not enough to understand what she knows. Cora is not correct, at all, just because she is able to produce mostly correct answers. Cora's coworkers, answering interview questions about the origins of the seasons, are not wholly incorrect, either. Their prior knowledge is not only the factual information they recall, which might be incorrect or incomplete, but also the processes they bring to the information they use. As researchers, we must go far beyond just evaluating Cora's answers in order to understand what she knows, including understanding her prior knowledge, her learned thinking processes, and details about the environments in which she learned them. She might be reflecting what she does or does not understand, but she might also be reflecting exactly what the expectations and practices were in her prior experiences.

Like all learning environments, post-secondary learning environments are complex, involving many critical resources that exist outside of the classroom itself, and which might have been generated by the current students, by past students, and by the institution. There is no common core in higher education, so the educational specifics of a class (goals, methods, implementation, assessment) are usually highly idiosyncratic, often to the individual instructor, and so examining student learning and achievement cannot be dissociated from these, nor easily tagged to only the classroom environment. The detailed interaction between students and the components of their learning environment must be understood: from the standpoint of the intent of the design and its implementation (by the instructor), from how it was received and utilized (by the learner), and from how well aligned these two things are with the assessments that are used.

We received 32 submissions from our call for papers, twice the response from last year's solicitation. The peer-review process resulted in nine manuscripts that were sent out for revisions, and ultimately yielded the five articles comprising this issue. We have two articles in the area of undergraduate chemistry (general and organic), one in graduate chemistry, one in evolutionary biology, and one that is multidisciplinary.
Teaching for students from underrepresented populations is featured in several articles, as is a direct comparison of education in an online versus face-to-face environment. Dr. Vicente Talanquer, Professor of Chemistry and Biochemistry at the University of Arizona, a post-secondary science education researcher, frequent contributor to, and reviewer for, this Journal, and an award-winning classroom educator, has authored the closing essay for the Special Issue. He provides thoughtful and provocative challenges to the discipline-based research community to advance the field.

In the first article, Bhattacharyya and Bodner examine the highly discipline-centered transition of first-year organic chemistry graduate students, through the pedagogical orientation of their highly skill-based course in organic synthesis, by comparing the evolution of their problem-solving abilities in organic synthesis to the strategies used by third-year graduate students. Using in-depth, longitudinal interviews around a performance-based task, the authors explore highly contextualized ideas and artifacts using ethnomethodology, which is based on gathering data through recording the normal daily experiences of its subjects. Similar to Feldon et al.'s (2011) account of the effect of graduate teaching on research, the task used by the third-year students in this study is authentic: developing an independent research proposal. The learners' appreciation that their education has involved authentic, or “real world,” activities is identified as a key feature in the observed growth in their epistemological understanding. These authors also triangulate their interviews with the students with additional data derived from understanding the learning environment in which this development takes place. Implications for improving the transition at the undergraduate level towards a more legitimate participation in the discipline include the explicit connection with the primary literature in the design of undergraduate research and textbooks, which is aligned with earlier work on examinations (Coppola, Ege, & Lawton, 1997).

Varma-Nelson and her coworkers have studied the replication of the face-to-face Peer-Led Team Learning (PLTL) program in a distance, but synchronous, cyber-environment (cyber-PLTL, or cPLTL). Using a strong experimental design, in which a group of the same peer leaders facilitate instruction under both conditions, a collection of different data sources is used. The authors demonstrate that adding detailed discourse analyses of student work from a set of 24 comparable PLTL and cPLTL group sessions (12 each, with three samples from each of four instructors), using features from a deep learning model, reveals interesting differences in what students do in each of these conditions. This study benefited from digging past reporting the simple effect: the less finely grained measures of academic performance, such as aggregated grades, pointed to comparable outcomes under each of the conditions. By exploring the details of what students were actually doing during their sessions, the authors present a compelling hypothesis, based on their evidence, that students in the cyber-groups may demonstrate a higher degree of constructivist orientation in their learning than those in the face-to-face groups.
Students in the cyber-group, perhaps mediated by the more formal infrastructural demands of their setting, were more focused on the problem-solving process of their work compared with students in the more informal, easily digressed face-to-face setting, where mutual agreement about “getting the right answer” tended to shut down any additional conversation.

Understanding the origins of why certain groups remain underrepresented in the STEM fields is an important problem. Lopez and his coworkers have continued to explore the organic chemistry setting, which is a key gateway course. These researchers have used student-generated concept maps as a source of evidence about the knowledge structures constructed during learning, comparing the understanding and achievement of a relatively large group of 90 students from diverse ethnic backgrounds while taking this course. In order to assess the disciplinary validity of the propositions contained in the maps, disciplinary experts were needed. A second, holistic analysis of the maps was also performed to provide complementary data. Although prior academic performance and ethnicity are difficult to disentangle, the researchers have used their multiple methods of analysis to point strongly to the role of prior achievement as the mediator in the differences they observe. As the development of expertise in the discipline follows from conceptual understanding combined with socialization, understanding the things that differentiate students on the first day of class is critical. The issues raised by these researchers are not at all settled, by their own account, but their study directs them, and others, to explore more deeply the question of who is walking into the classroom by what their experiences have been, and not so much by other demographic information. As such, their work points to the importance of learning more about our students than the indices of demographic information.

In their paper, Novick and her coworkers look at a specific and generally troublesome feature of learning evolutionary biology, namely, the relationship between microevolutionary concepts, such as natural selection, and corresponding macroevolutionary ones, such as Tree of Life (ToL) thinking. In a sample of 124 students assessed for their prior knowledge of natural selection, half of the subjects received self-paced but highly directed and explicit instructional materials related to ToL thinking (the representations, their interpretation, and use), while the other half engaged in a comparable level of effort on general science reasoning activities. The findings in this study, derived from testing the integrated understanding of micro- and macroevolution held by these students, counteract a dogmatic belief that understanding natural selection automatically transfers to an understanding of ToL thinking. As in Bhattacharyya and Bodner (Article #1), the success of the instructional environment is attributed, in some part, to its connection with an authentic, or “real work” (Coppola, in press), representation of science.

Energy has emerged as an explicit, crosscutting concept in K-12 science education (Chen, Eisenkraft, Fortus, & Krajcik, 2014; National Research Council, 2012).
In their study, Becker and Cooper have looked at the existing understanding of potential energy, in the context of chemistry, with first- and second-year undergraduate chemistry students, as a way to understand their prior knowledge about this area from their precollege education, and how it may or may not have been integrated into their university education. Using a set of written, open-ended surveys with 333 students from three courses, in addition to semi-structured interviews with 18 students from these classes combined with four other upper-division students, the researchers probed students' understanding of what potential energy means at the atomic–molecular level. These authors found that while students' explanations fell into three more or less useful and comprehensible categories, their understanding of how those categories (capacity for work, stored energy, and stability) related to notions of potential energy was incomplete, incorrect, and/or incoherent. These intuition-based explanations, inevitably derived from the course contexts in which they were learned, functioned operationally in these classes, but nonetheless broke down when examined in detail. The question posed by the K-12 Framework for Science Education (National Research Council, 2012) is whether reduction of the historically diverse ways in which the disciplines treat energy to a common, foundational core can improve both the depth of understanding held by learners within the disciplinary units as well as across the multi-disciplinary spectrum. These findings suggest that this is likely to be a significant challenge.

In his essay, Talanquer lightly characterizes the community of post-secondary science education research as being constrained by its origins as well as the history and traditions of precollege science education research. Understanding how post-secondary disciplinary expertise might affect the design and scope of research was one of the topics we also speculated about in our previous editorial (Coppola & Krajcik, 2013). In describing a set of challenges, Talanquer artfully challenges the community itself. At the same time, he sees some of the underlying strengths of the papers included in this Special Issue in how they begin to model the way to break down some of the walls that he sees surrounding the DBER community. Talanquer's final challenge is for researchers to move their collaborations into areas where the understanding of the subject matter is not the only focus, which is highly represented, but to deepen the meaningfulness of that understanding beyond the surface features of learning, which is far less represented. He also argues for increasing the breadth of study, to how we achieve, and can more effectively achieve, the scientific and intellectual dispositions that ought to emerge in a truly educated person in our postsecondary education system. Do not only report the effect; examine what produces the effect (what, why, how, under what conditions…).

Evaluation versus Research

Based on our collective experience as editors, one of the first filters used by reviewers is the question: is this an evaluation report or is it a research study? This question comes up more frequently in response to studies carried out by the emergent group of investigators from the post-secondary, discipline-based community than it does for others.
One hypothesis for this is that the ubiquitous offices for institutional research and evaluation have dominated the collection of data for academic accountability and regulatory compliance in college and university settings. We cannot be the only ones to encounter an administrator who reminds us “these are just data, and you can do what you want with data; why do the standards of research have a bearing on this?” Generating results by short-circuiting research standards is a cottage industry for so-called Institutional Evaluation. A recent article, for example, was titled “The Counterfactual Self-Estimation of Program Participants: Impact Assessment Without Control Groups or Pretests,” a method in which “program participants are capable of estimating the hypothetical state they would be in had they not participated” (Mueller, Gaus, & Rech, 2014). Administering a single retrospective survey, after a mathematics class, that asks students to self-assess the learning gains they have made in mathematics because of that class is constrained by many theoretical and methodological limitations (Finney, 1981; Poggio, Miller, & Glasnapp, 1987). Reporting this out as evidence for an increase in quantitative reasoning skills, which is not problematic for Institutional Evaluation, obviously does not come near the standards for pushing the field forward with respect to what can account for increased learning, nor do these reports meet the standards for publishable work in JRST (or many other places, we imagine).

Currently, concept inventories are proliferating (Libarkin, 2008). These short, multiple-choice, standardized tests exist at the border between evaluation and research. The earliest and most commonly used inventories, and nearly all those to come later, are built upon the older misconceptions literature which, as indicated above, has yielded to new ideas about meaningful learning. How has their psychometric validity been affected by this? In addition, while these tests are usually focused, they still cover much ground, often with only 1–2 items to reveal student understanding on a given topic. Another lingering concern is the degree to which students, through the content bias that the examinations might have on instructors, are being more narrowly prepared for these specific situations. Smith and Tanner (2010) provide a thoughtful and balanced review of the benefits and limitations of these exams. The connection between selecting a correct answer and the application of conceptual understanding (one of three goals from the transformational teaching model) is not automatic. Regardless of how well selecting the correct answer might be derived from conceptual understanding, correct answers can also derive from other sources that have nothing to do with conceptual understanding at all (remember Cora and Dora?).

If an instructional change works, it is worth investigating why. The two most common sources of evidence are at least suspect in their superficial nature: (a) fewer students fail (the DFW rate: students who earn grades of D or F, and those who have withdrawn from the course), and (b) a short, standardized, multiple-choice exam is used as a pre- and post-test (these are the concept inventories), and a gain score is reported (Smith & Tanner, 2010). The observations of higher grades and improved standardized exam scores are unquestionably true, but without an aligned chain of evidence gathered through a triangulated research design, claims about why the outcomes are different cannot exceed correlation. Cora: again.
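The editorial does not specify which gain score is meant; as a point of reference, the most common convention in the concept-inventory literature is Hake's (1998) normalized gain, sketched below.

```latex
\documentclass{article}
\usepackage{amsmath}
\begin{document}
% Hake's normalized gain for a pre/post concept inventory,
% with scores expressed as percentages:
\[
\langle g \rangle = \frac{\text{post}\% - \text{pre}\%}{100 - \text{pre}\%}
\]
% Example: pre = 40, post = 70 gives
% $\langle g \rangle = (70 - 40)/(100 - 40) = 0.5$.
% Note that the statistic is silent about why scores rose:
% meaningful learning and test-training produce the same number.
\end{document}
```

Whatever the metric, the point of the surrounding paragraphs stands: a gain score, like a DFW rate, records that something changed, not what produced the change.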
The field needs to learn more about why these changes have been successful. Advocates for “scientific (evidence-based) teaching” (Handelsman et al., 2004; Handelsman, Miller, & Pfund, 2007) ought not to compromise on the existing standards of evidence for the complex, social science of education research. Every claim of effective instructional intervention ought to include evidence for and alignment between the intentionality in the pedagogical design, observation and open coding of the implementation, interview and/or observation of student work, analysis of artifacts, independent performance-based assessment, and serious consideration of all alternative hypotheses and whether there is any evidence for falsification. The observed outcomes from the various “active learning” classrooms are real (Freeman et al., 2014), but, scientifically, their enthusiastic advocates need to follow good scientific practices and separate the observation from its attribution. Conceptual understanding is not observed. What is observed is typically two things: fewer students get failing grades, and students show gain scores on short, standardized, multiple-choice exams. Without evidence to the contrary, multiple hypotheses for these outcomes are potentially operating (beyond the accomplishment of meaningful learning). To date, for example, we are unaware of serious research that has started with the hypothesis that students in these settings are being targeted for test-training, resulting in robust heuristics that allow them to recognize, select, or generate correct answers more efficiently. Is this the same as meaningful learning with conceptual understanding? We do not know; it needs to be examined. Without a doubt, the active classroom observations have changed the way a large fraction of instructors think about teaching and assessment, which may be the most positive result. Perhaps simply having fewer students fail is a desired outcome, because it might increase the fraction of students who stay in the science pipeline, and give those who leave a lingering positive impression of science. But are these active classrooms also developing more Floras, and not simply improving the testing skills of Dora and Cora? There are many unanswered questions for which research can provide evidence.

Shift the focus from understanding how well students do on science exams to how well they are learning science

As described previously (Coppola & Krajcik, 2013), discipline-centered, post-secondary science education is more likely to be carried out by practicing scientists who carry deep and complex disciplinary dispositions as an integral part
