Center for Educational Technologies: Virtual Design Center

Design Principles

PRINCIPLE 10:
Designers Should Help Learners Understand and Internalize the Structure of Argumentation

Ingo Kollar, Frank Fischer, & James D. Slotta

The original paper, entitled “Internal and External Collaboration Scripts in Web-based Science Learning at Schools,” was presented at the 2005 Computer-Supported Collaborative Learning Conference in Taipei, Taiwan.[1]

This study by Drs. Kollar, Fischer, and Slotta addresses a key challenge in developing worthwhile science, technology, engineering, and mathematics (STEM) learning environments. Current views of teaching and learning support authentic and inquiry-oriented instructional approaches. These let students explore scientific phenomena and problems in much the same way that STEM professionals do. Particularly with new networked computer technologies, students are increasingly able to collaboratively generate and investigate researchable questions, form hypotheses, conduct experiments, and interpret data (van Joolingen, de Jong, Lazonder, Savelsbergh, & Manlove, 2005). While learning scientists and educational researchers have made great progress in this regard, many students don’t know how to work and learn collaboratively. They need to learn how to do this in the context of authentic scientific inquiry (Quintana et al., 2004).

This study looks at a promising and well-researched web-based inquiry learning environment that supports collaborative inquiry and learning. In particular, the study investigated how collaboration scripts (O’Donnell, 1999) can improve how people argue collaboratively and gain knowledge.

All learners have internal scripts. These refer to the learners’ individual procedural knowledge that guides them in how they argue (Schank & Abelson, 1977), and they play a key role in successful collaborative argumentation.

Collaboration scripts can help improve collaborative argumentation by prescribing certain activities, sequences, and roles (Kollar, Fischer, & Hesse, in press). For example, scripts can specify that one learner should advocate a specific hypothesis while another learner should attack this hypothesis. The idea is that when students engage in these activities, sequences, and roles, they elaborate content more deeply and acquire more individual knowledge than they would without a collaboration script. Of course, learners possess widely varying ideas about collaboration and different capabilities for argumentation. These differences might require different collaboration scripts for learners to achieve the benefits of argumentation.

Why Support Collaborative Argumentation?

STEM instructional designers should consider focusing on collaborative argumentation. That’s because the ability to engage in argumentation is an important lifelong learning skill that schools should teach (Linn, Davis, & Eylon, 2004). In addition, debating with peers about hypotheses, data, or evidence helps learners deeply elaborate science content (Sandoval, 2003). This supports the learning of specific scientific knowledge in robust and useful ways. Traditional approaches that teach content separately from argumentation leave students with a shallow understanding of the concepts and unable to apply meaningful argumentation skills, such as using evidence to support claims.

What Is the Structure of Collaborative Argumentation?

This study builds on prior research that examined the kinds of collaborative argumentation that support learning (Andriessen, Baker, & Suthers, 2003). Arguments have two kinds of structure. One is the structure of single arguments; the other involves argumentation sequences. Single arguments include (1) data that provides evidence on which the argument relies, (2) a claim that states a position, and (3) a reason stating why the data supports the claim (Toulmin, 1958). Argumentation sequences involve (1) arguments, (2) counterarguments, and (3) integrative arguments (Leitao, 2000).
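The paper itself contains no code, but the two structural layers it draws on can be sketched as a small data model. This is an illustrative sketch only; the class and field names are invented here, not taken from the study:

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Argument:
    """A single argument in Toulmin's (1958) sense."""
    data: str    # evidence the argument relies on
    claim: str   # the position being stated
    reason: str  # why the data supports the claim

@dataclass
class ArgumentationSequence:
    """An argumentation sequence in Leitao's (2000) sense."""
    argument: Argument
    counterarguments: List[Argument] = field(default_factory=list)
    integrative_argument: Optional[Argument] = None

    def is_complete(self) -> bool:
        # A sequence counts as complete only when it moves beyond the
        # initial argument to counterarguments and an integration.
        return bool(self.counterarguments) and self.integrative_argument is not None
```

On this view, a lone claim with no data or reason is an incomplete argument, and an argument with no counterargument or integration is an incomplete sequence, which is exactly the gap the study's scaffolds target.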

How Can Collaborative Argumentation Be Supported?

Previous studies by these researchers showed that collaboration scripts in computer-supported learning environments improved collaborative argumentation and enhanced learning (Weinberger, Stegmann, & Fischer, 2005). However, one key question about external scripts that has yet to be answered is how structured they should be. Some researchers have experimented with more unstructured scripts that provide loose constraints for activities, sequences, and roles (e.g., Baker & Lund, 1997). Others have used highly structured scripts that provide very detailed instructions concerning activities, sequences, and roles (e.g., Weinberger et al., 2005).

This study considered the internal scripts that learners already have when they enter an inquiry learning session. That’s important because preexisting internal scripts might conflict with the external script, while other internal scripts might help students master and learn the external scripts. The researchers assumed that internal scripts also vary in terms of structure. Some individuals might know that they should make their reasons explicit in arguments and that good argumentation sequences feature counterarguments and integrative arguments. Others might lack that knowledge and instead develop arguments that rely simply on claims, skipping counterarguments and integrative arguments altogether. The study examined whether learners with differently structured internal scripts benefit from differently structured external scripts and how the interplay between internal and external scripts affects learning collaborative argumentation and more specific scientific knowledge. The study did so using the Web-Based Inquiry Science Environment (WISE), which was developed by researchers at the University of California at Berkeley and has been researched extensively.[2]

How Can Learning Technology Help Learners Understand the Structure of Argumentation?

In order to study differently structured internal and external scripts in WISE, the researchers looked at both process measures and outcome measures. Process measures examined effects of internal and external scripts on the quality of collaborative argumentation. Outcome measures examined individual learning. Measures assessed more general knowledge of argumentation as well as more specific knowledge associated with the scientific domain.

The researchers started by examining the internal scripts of 98 high school students two weeks before the collaborative inquiry session took place. The students took an assessment that had them identify “good” and “bad” examples in samples of scientific argumentation. Students who performed better on this task were judged to have more structured internal scripts. The scores were used to split the students into two groups: the half with higher scores were classified as having high-structure internal scripts; the others were classified as having low-structure internal scripts. Collaborative pairs were then assembled consisting of either two high-structure individuals or two low-structure individuals. Crossing pair type with the two versions of the external script produced a “2 x 2” experimental design with four internal/external script combinations (high/high, high/low, low/high, low/low).
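The grouping procedure amounts to a median split on the pretest scores followed by homogeneous pairing. A minimal sketch, with invented student IDs and scores (the real pretest was an example-classification task, not a single number per student):

```python
# Sketch of the grouping procedure: median split on pretest scores,
# then pairing within each half so both members of a dyad share a
# script type. IDs and scores below are invented for illustration.
def median_split(scores: dict[str, float]) -> tuple[list[str], list[str]]:
    ranked = sorted(scores, key=scores.get, reverse=True)
    half = len(ranked) // 2
    return ranked[:half], ranked[half:]  # (high-structure, low-structure)

def make_pairs(students: list[str]) -> list[tuple[str, str]]:
    # Pair consecutive students within a half.
    return [(students[i], students[i + 1]) for i in range(0, len(students) - 1, 2)]

high, low = median_split({"s1": 9, "s2": 3, "s3": 7, "s4": 5})
# Assigning each pair to the high- or low-structure version of the
# external script then fills the four cells of the 2 x 2 design.
```

A median split is a blunt instrument (it discards variation within each half), which is worth keeping in mind when interpreting the high/low internal-script contrast.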

The pairs of students then completed one of two versions of a WISE investigation, “The Deformed Frogs Mystery” (Linn, Shear, Bell, & Slotta, 2004). The investigation asked learners to contrast two competing hypotheses (parasitic vs. environmental) for widespread physical deformities in frogs. The hypotheses were to be discussed across five content-specific units. Each unit offered students a range of scientific data sources (e.g., photographs, maps, reports), which could be used to explore the two hypotheses. At the end of each unit, learners were asked to discuss their hypotheses in light of the information. Students were then instructed to record their arguments supporting the two hypotheses in a text-entry box.

In the low-structure version of the WISE investigation, students were simply instructed to record their arguments in a plain text box. In the high-structure version (see Figure 1), instructional text and prestructured text boxes prompted learners to construct arguments consisting of data, claim, and reason, and to construct argument sequences consisting of arguments, counterarguments, and integrative arguments. Each box specified which of the two learners was to create an argument component and provided a sentence starter for each component (e.g., “It was found that…” for data). Each student was instructed to advocate for just one of the two hypotheses, with the assignment switching several times during the course of the project. To ensure that students internalized the argument structure, the degree of structure was faded across the units, so that the final unit in the high-structure environment was similar to the final unit in the low-structure environment.
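The fading of the scaffold can be sketched as a simple schedule: full sentence starters early, progressively fewer, none in the final unit. Only the “It was found that…” starter is quoted from the study; the other wordings and the linear fade schedule are invented here for illustration:

```python
# Sketch of a faded scaffold for the three argument components.
# Only the "data" starter is quoted from the study; the "claim" and
# "reason" wordings and the linear schedule are illustrative.
STARTERS = {
    "data": "It was found that...",
    "claim": "Therefore we claim that...",         # illustrative wording
    "reason": "The data supports this because...",  # illustrative wording
}

def starters_for_unit(unit: int, total_units: int = 5) -> dict[str, str]:
    """Return the sentence starters shown in a given 1-indexed unit."""
    components = list(STARTERS)
    # Linear fade: all starters in unit 1, none in the final unit.
    n_kept = round(len(components) * (total_units - unit) / (total_units - 1))
    return {c: STARTERS[c] for c in components[:n_kept]}
```

The design rationale is that support is withdrawn as the external script is internalized, so that by the last unit students in both conditions face the same unscaffolded task.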


Figure 1: Screenshots of the high-structured external collaboration script (left screen: introductory text; right screen: prestructured text boxes to be filled in by the participants).

Each pair of students worked together on the investigation for two hours, relying entirely on the instructions and guidance provided in the WISE investigation. After completing the investigation, each student completed two outcome measures. One assessed students’ knowledge of argumentation. It asked learners to list the components of good arguments and argument sequences and to give examples of complete arguments and argument sequences. The other measure assessed each student’s knowledge of the embedded scientific concepts. Five open-ended questions asked students to describe the mechanisms that might cause the frog deformities and how one could formally test them.

What Happened When This Technology Was Actually Used?

As expected, scores on the test of knowledge about argumentation showed that the high-structure external script resulted in greater learning about argumentation. Specifically, the students who completed the high-structure version of the WISE investigation showed greater knowledge of argumentation than the students who completed the low-structure version and were better able to construct appropriate arguments and argument sequences. The difference was statistically significant, meaning that there was less than a 1 in 20 chance that the difference between the two groups might have occurred by chance. The effect was the same for students with low-structure internal scripts as for students with high-structure internal scripts. Technically, this means that there was no statistically significant “interaction” between internal and external scripts.

Scores on the scientific knowledge test showed that students with high-structure internal scripts learned more of the embedded scientific content than students with low-structure internal scripts. The group with the lowest score on the knowledge test was the group whose internal and external scripts were both low structure. But overall there was no statistically significant evidence that the high-structure external script improved student learning of the embedded scientific content. The researchers also conducted detailed analyses of argumentation by coding transcripts of the recorded discussions of some of the students. Examples and analyses are presented in the original paper but are not included in this summary.

What Should Designers Do to Help Learners in Collaborative Argumentation?

This study showed that within a relatively short intervention (two hours) it is possible for students to learn useful information about the structure of scientific argumentation. This included the appropriate structure of scientific arguments as well as the appropriate structure of argument sequences. This study also showed that students who had relatively more knowledge about argument structure learned more of the domain-specific scientific knowledge when participating in collaborative inquiry-oriented investigations than students with relatively less knowledge about argument structure. While the study did not show that providing students with the high-structure external script helped them learn more specific scientific knowledge, this is not entirely surprising given the length of the intervention and the fact that the students were also learning about the argumentation structure and sequences represented by those scripts. The important point for STEM instructional designers is that argument structure can be directly taught within a content-rich, web-based inquiry environment and that learners who bring such knowledge to inquiry-oriented environments learn more than students who do not.

The study also provided a useful illustration of a general way that STEM instructional designers can support participation in scientific argumentation. In a wide range of environments (both with and without sophisticated technology), instructional designers can help students construct scientific arguments using data, claims, and reasons, and construct sequences consisting of arguments, counterarguments, and integrative arguments. This study also provided useful examples of how designers might measure both learning of scientific argumentation and learning of the domain-specific scientific knowledge gained through participating in that argumentation.


[1] This paper was a winner of the 2005 Virtual Design Center's paper competition, selected through an extensive review process by the Virtual Design Center advisory board. The above summary was prepared by the authors with the assistance of the advisory board chair (Daniel T. Hickey) and the project manager (Beaumie Kim). For more information on how you could use this design principle for your practice, you may contact the Virtual Design Center (vdc@cet.edu) or the board representatives (Daniel T. Hickey, dthickey@indiana.edu; Beaumie Kim, bkim@cet.edu) and the authors (Ingo Kollar, i.kollar@iwm-kmrc.de; Frank Fischer, f.fischer@iwm-kmrc.de; James D. Slotta, slotta@tels.berkeley.edu).

[2] The development project was funded primarily by the National Science Foundation and directed by Marcia Linn and the third author, James D. Slotta. For more information, visit www.wise.berkeley.edu.


References

Andriessen, J., Baker, M., & Suthers, D. (2003). Argumentation, computer support, and the educational context of confronting cognitions. In J. Andriessen, M. Baker, & D. Suthers (Eds.), Arguing to learn: Confronting cognitions in computer-supported collaborative learning environments (pp. 1-25). Dordrecht, The Netherlands: Kluwer.

Baker, M., & Lund, K. (1997). Promoting reflective interactions in a CSCL environment. Journal of Computer-Assisted Learning, 13, 175-193.

Kollar, I., Fischer, F., & Hesse, F.W. (in press). Collaboration scripts—A conceptual analysis. Educational Psychology Review.

Leitao, S. (2000). The potential of argument in knowledge building. Human Development, 43, 332-360.

Linn, M.C., Davis, E.A., & Eylon, B.-S. (2004). The scaffolded knowledge integration framework for instruction. In M.C. Linn, E.A. Davis, & P. Bell (Eds.) Internet environments for science education (pp. 47-72). Mahwah, NJ: Lawrence Erlbaum Associates.

Linn, M.C., Shear, L., Bell, P., & Slotta, J.D. (2004). Organizing principles for science education partnerships: Case studies of students’ learning about “Rats in Space” and “Deformed Frogs.” Educational Technology Research and Development, 47(2), 61-84.

O’Donnell, A.M. (1999). Structuring dyadic interaction through scripted cooperation. In A.M. O’Donnell & A. King (Eds.), Cognitive perspectives on peer learning (pp. 179-196). Mahwah, NJ: Erlbaum.

Quintana, C., Reiser, B.J., Davis, E.A., Krajcik, J., Fretz, E., Duncan, R.G., Kyza, E., Edelson, D., & Soloway, E. (2004). A scaffolding design framework for software to support science inquiry. The Journal of the Learning Sciences, 13(3), 337-386.

Sandoval, W.A. (2003). Conceptual and epistemic aspects of students’ scientific explanations. The Journal of the Learning Sciences, 12(1), 5-51.

Schank, R.C., & Abelson, R.P. (1977). Scripts, plans, goals, and understanding. Hillsdale, NJ: Erlbaum.

Toulmin, S. (1958). The uses of argument. Cambridge, UK: Cambridge University Press.

Van Joolingen, W.R., de Jong, T., Lazonder, A.W., Savelsbergh, E.R., & Manlove, S. (2005). Co-Lab: Research and development of an online learning environment for collaborative scientific discovery learning. Computers in Human Behavior, 21, 671-688.

Weinberger, A., Stegmann, K., & Fischer, F. (2005). Computer-supported collaborative learning in higher education: Scripts for argumentative knowledge construction in distributed groups. In T. Koschmann, D.D. Suthers, & T.-K. Chan (Eds.), Computer supported collaborative learning: The next 10 years! (pp. 717-726). Mahwah, NJ: Erlbaum.

© 1999-2006 by Wheeling Jesuit University/Center for Educational Technologies®. 316 Washington Ave., Wheeling, WV, 26003-6243. All rights reserved. Privacy Policy and Terms of Use.