Design Principles


Principle 11:
Assessment should be designed to examine and foster inquiry, collaboration, and understanding in science learning

Eddy Y.C. Lee, Carol K.K. Chan, & Jan van Aalst

The original paper, entitled “Students Assessing Their Own Knowledge Advances in a Knowledge-building Environment,” was presented at the 2005 Computer-Supported Collaborative Learning Conference in Taipei, Taiwan.[1]

Instructional innovators in science, technology, engineering, and mathematics (STEM) education have long relied on collaborative learning, in which students work together to solve problems and learn. With continual advances in networked computer technology and the increased use of digital course materials, much of this collaboration is now computer supported. Cognitive scientists and educational researchers have made major advances in using “asynchronous networked environments” to help students collaboratively investigate and learn STEM knowledge.

This summary describes one particularly noteworthy study of computer-supported collaborative learning, grounded in the concept of knowledge building. By demonstrating the power of specific knowledge-building principles, the study shows that this concept is more useful for designing effective collaborative learning environments than simpler characterizations of learning. Another valuable aspect of the study is its use of formative assessment, which is itself one of the most powerful tools available to STEM instructional designers. The portfolio assessment methods examined in the study can be used to improve learning in most instructional environments and create an ideal context for experimenting with different knowledge-building principles. In addition to demonstrating the added value of both formative assessment activities and knowledge-building principles, the study illustrates a simple but powerful research design that many STEM instructional innovators should find useful.

What Is a Knowledge-building Environment?

The importance of inquiry and collaboration is widely recognized in STEM instruction. This study examines collaborative inquiry within the framework of knowledge building, a theoretical approach that examines how students make collective knowledge advances as members of a scientific community (Bereiter, 2002; Paavola, Lipponen, & Hakkarainen, 2004; Scardamalia & Bereiter, 2003).

Knowledge-building perspective. The fundamental aspects of knowledge building include improvable ideas and collective cognitive responsibility. Just as in STEM professional communities, ideas are viewed as conceptual artifacts that can be examined and improved by means of public discourse within a knowledge-building community. This notion is useful for designing STEM instruction, because the collective discourse that defines a knowledge community has much in common with the everyday efforts of STEM professionals. By developing ways to support that discourse, STEM instructional designers can create environments that support the development of authentic knowledge and skills of individual learners as well.

Scardamalia and Bereiter (2003) propose that in the knowledge age, knowledge building, with its focus on knowledge creation and innovation, is an important collaborative practice for students to develop. As in research and scientific communities, students engaged in knowledge-building discourse pose cutting-edge questions that help the community advance its collective understanding. Learners take on progressive problem solving, seeking to understand problems at progressively deeper levels. Students make progress not only by improving their personal ideas but also through their contributions to collective knowledge advances.

The Knowledge Forum™ learning environment. To support progressive discourse in the community, Scardamalia, Bereiter, and their research team developed a computer environment now called Knowledge Forum™ (formerly the Computer-Supported Intentional Learning Environment, or CSILE; Scardamalia & Bereiter, 1994; Scardamalia, Bereiter, McLean, Swallow, & Woodruff, 1989). Students create a Knowledge Forum database themselves. Using networked computers, a number of students can simultaneously create notes (text or graphics) to add to the database, search existing notes, comment on other students’ notes, or organize notes into more complex structures. As the database grows, Knowledge Forum provides a progressive trace of how ideas have evolved in the class. The database thus formally shows and documents the community’s knowledge advancement, and its features help students further advance their ideas. For example, scaffolds (or sentence starters) such as “My Theory” are metacognitive prompts intended to promote deeper thinking and effective communication (Figure 1).


Figure 1. Knowledge Forum features (notes, views, scaffolds) that support knowledge building. 

In typical knowledge-building classrooms, the class usually starts with a general exploration of the science topic to be studied. This helps the class articulate questions and ideas they have about the science topic. Students may contribute their ideas to the Knowledge Forum database and/or talk to each other about them. After some teacher scaffolding, students may identify certain learning goals. From this point they work collaboratively and progressively to understand the science problems the class has formulated. Students have a responsibility to make their ideas available to the knowledge-building community and to help each other improve the ideas. Students formulate problems, develop conjectures and hypotheses, examine alternative explanations, revise theories, and examine others’ ideas in improving collective knowledge.

The concept of knowledge building and the Knowledge Forum environment are now being employed in many schools and workplaces in different countries. Although a range of domains has been examined, most studies have been conducted in science. Research evidence has shown that students develop deeper domain understanding in science (Hewitt, 2002; Scardamalia, Bereiter, & Lamon, 1994; van Aalst & Chan, in press); there is also evidence that students can participate in knowledge building from the early grades of elementary school (Hewitt, 2002; Hakkarainen, Lipponen, & Järvelä, 2002; Scardamalia, 2002). The many studies conducted within Knowledge Forum provide inspiring examples of strategies for supporting knowledge construction and strong empirical evidence about the impact of these strategies. The important point for STEM instructional designers is that these strategies will be useful in many other collaborative learning contexts and in most instructional domains.

What Are the Roles of Assessment in Learning and Collaboration?

Much of the prior research on computer-supported collaborative learning has focused on evaluation and assessment of collaborative processes, systems, and designs. While breaking important new ground, these studies have also shown that putting students together does not necessarily mean they will engage in collaborative inquiry and deep discourse. Researchers are increasingly focusing on the assessment of student learning and participation in collaboration in order to scaffold students’ collaborative inquiry and understanding.

Many of the most useful insights for supporting collaborative inquiry come from research on formative assessment (e.g., Barron et al., 1998). Rather than the traditional focus on assessment of learning, formative assessment is carried out for learning. A great deal of progress has been made in formative assessment for individual learning. This study is one of several that explore formative assessment of collective aspects of knowledge advance as well. Formative assessment in computer-supported collaborative learning is intended to give students agency to assess their own and community knowledge advances. For this to happen, assessments need to be designed that both measure and foster deeper inquiry and collaboration. This study addressed three important issues for all STEM instructional designers:

  • Assessment of learning AND assessment for learning. Current views posit that assessment and instruction need to be integrally related (Bransford, Brown, & Cocking, 1999; Shepard, 2000). However, in schools assessment usually takes place at the end of teaching for the purposes of testing what students have learned. Assessment that occurs only after learning is problematic because the opportunity to scaffold learning and provide feedback is lost; assessment designed to provide such feedback during learning is referred to as assessment for learning (Black & Wiliam, 1998). We propose that science assessment needs to be designed in ways not only to measure, but also to foster science learning in technology-based environments.

  • Assessment of individual AND collective learning. There has been much emphasis on collaboration in science learning; however, assessment in schools persists in focusing on individual outcomes and overlooks collective components of knowledge creation (Chan & van Aalst, 2004). Stahl (2004) argued that knowledge growth in a community emerges from the community’s collaboration; it is a collective phenomenon that cannot be reduced to individual involvement. Assessment of scientific inquiry and collaboration needs to capture both individual and collective growth in knowledge.

  • Assessment of content AND process. Contemporary understanding of human cognition says that knowledge is constructed rather than received. If we want to prepare students for future learning—with less dependence on a teacher—we need to teach them to execute, monitor, and regulate the knowledge construction process. This suggests we must value not only what science content is learned, but also how students engage in scientific inquiry. At the same time, there may be a danger in separating process from content. We propose that science assessments need to be designed so they tap both the collaborative process and knowledge products.

How Can Electronic Portfolios Be Used for Formative Assessment?

This study showed how to augment the knowledge-building environment using an innovative formative assessment design that served to both characterize and scaffold knowledge building. It builds on a well-known approach to formative assessment called portfolio assessment. Hence, the formative assessments are called knowledge-building portfolios.

Portfolio task. Portfolios usually consist of a selection of best items (e.g., papers, diaries, drawings) accompanied by a reflection statement explaining why students have selected these items as exemplary work and why they thought progress had been made. We asked students to prepare portfolio notes in Knowledge Forum as formal course assessments. To help them with the selection, we provided them with several knowledge-building principles. They selected exemplary notes in the computer discourse (similar to the selection of best items in portfolios) and wrote a statement (reflection) explaining why they thought these were their best notes as evidence of knowledge building. Specifically, a portfolio note included hyperlinks to other computer notes providing evidence for the principles. As an example, the author of the portfolio note shown in Figure 2 explained that she had found a cluster of notes about the chiral centers of molecules that illustrated the knowledge-building principle of progressive problem solving. She then articulated how these notes developed over time. In doing so, she was reflecting on the progress of ideas in the community.

Image of figure 2 which shows an example of a portfolio note with hyperlinks to reference notes.

Figure 2. An example of a portfolio note with hyperlinks to reference notes.

Knowledge-building principles. To help students with their selections, we provided a set of knowledge-building principles as criteria. The key idea is to use these principles to help students recognize and engage in more knowledge building. A brief description of each principle follows.

  1. Working at the cutting edge. Students are to pose cutting-edge problems. This principle is related to epistemic agency (a sense of control over knowledge), and it is based on the idea that a scientific community works to advance its collective knowledge. In practice, this principle guides students away from unproductive reiteration of knowledge that is already known and toward working at the frontiers of both individual and community knowledge.

  2. Progressive problem solving. The basic idea is that when an expert understands a problem at one level, he or she reinvests learning resources into new learning. In a scientific community one study often raises new questions that are explored in follow-up studies. In practice, this principle focuses problem-solving activity on the “hard problems” that require new knowledge to solve, and on progressively deepening those problems.

  3. Collaborative effort. This principle focuses on the importance of working on shared goals and values in developing community knowledge for advances in science. In practice, this principle reminds participants to build shared understanding and consensus with their collaborators. For example, it reminds students that they need to be prepared to support their conclusions in a way that their collaborators will find convincing, and their goal is to advance communal knowledge.

  4. Monitoring personal knowledge. This principle is based on the idea that metacognitive understanding is needed for knowledge-building work. Specifically, it requires students to have insight into their own learning processes as well as collective growth. In practice, it reminds students to take time to reflect on their own thinking and learning in relation to community advances at regular intervals. 

  5. Constructive uses of authoritative sources. This principle focuses on the importance of keeping in touch with the growing edge of knowledge in the field. Advancing knowledge requires referencing, building on, and critiquing authoritative sources. In practice, this principle reminds students to build on established knowledge in the process of building new knowledge.

This set of principles is adapted from Scardamalia’s (2002) 12 principles of knowledge building. It is important to note that these principles enable students to identify knowledge advances and document the community’s best work and progress in any collaborative context. They are not specific to the Knowledge Forum environment or even to computer-based collaborative environments.

How Was the Knowledge-building Portfolio Examined?

This study is part of our ongoing design research program that examines the theory and design of knowledge-building portfolios in scaffolding collaborative inquiry. In the past few years we have examined the design of knowledge-building portfolios in a graduate course as well as in two grade 12 classes in Earth sciences and biochemistry in Hong Kong (Chan & van Aalst, 2004; Hill, van Aalst, Lee, & Chan, 2003; van Aalst & Chan, in press). This study extends that work by examining the roles of knowledge-building principles with a larger group of younger students.

Participants and context. The participants were 119 students in four grade 9 geography classes at a high school in Hong Kong, all taught by the same teacher. Knowledge Forum was implemented in the physical geography curriculum for several months during the second semester of the year. As in other knowledge-building classrooms, students worked on Knowledge Forum as they generated questions, posed alternative theories and hypotheses, brought in new information, considered other students’ views, and reconstructed their own understanding.

Research design. To examine the roles of the knowledge-building portfolio, we employed a quasi-experimental approach. Three of the classes used Knowledge Forum under different design conditions; the fourth was a comparison class that did not use Knowledge Forum. Specifically, class one was the comparison class; class two worked on Knowledge Forum only; class three worked on Knowledge Forum and was asked to produce a portfolio identifying good notes, without reference to any principles; and class four worked on Knowledge Forum and was asked to produce a portfolio identifying exemplary clusters of notes of the class’s best work based on a set of knowledge-building principles.

Outcome measures. Several outcome measures were used to assess individual and collective knowledge building. First, the student responses and questions in the Knowledge Forum database were assessed for evidence of knowledge-seeking inquiry. Each response was coded on a seven-point scale while each question was coded on a four-point scale, with larger numbers representing deeper inquiry. Second, the knowledge-building portfolios that students prepared were scored for the quality of explanation and evidence of knowledge building on a six-point scale. Third, individual conceptual understanding at the end of the unit was assessed using an essay question that asked students to answer a broad question that was not biased toward the Knowledge Forum curriculum or any of the Knowledge Forum conditions. Student responses were coded using the kind of standard rubric that teachers normally use to score essays. For all three outcome measures a second scorer coded at least 30 percent of the responses. The correlation between the scores from the two raters was more than .80 in each case, showing that the measures were reliable.
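The reliability check described above can be made concrete with a short computation. The sketch below computes a Pearson correlation between two raters’ scores, the statistic the study reports exceeding .80; the scores shown are hypothetical illustrations, not the study’s actual data:

```python
# Minimal sketch of an inter-rater reliability check as a Pearson correlation.
# The rater scores below are hypothetical, not data from the study.

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length score lists."""
    n = len(x)
    mean_x = sum(x) / n
    mean_y = sum(y) / n
    cov = sum((a - mean_x) * (b - mean_y) for a, b in zip(x, y))
    var_x = sum((a - mean_x) ** 2 for a in x)
    var_y = sum((b - mean_y) ** 2 for b in y)
    return cov / (var_x * var_y) ** 0.5

# Hypothetical portfolio scores (six-point scale) from two independent raters
rater1 = [5, 3, 4, 6, 2, 5, 4, 3]
rater2 = [5, 2, 4, 6, 3, 5, 4, 3]

r = pearson_r(rater1, rater2)
print(f"Inter-rater correlation: r = {r:.2f}")  # prints: Inter-rater correlation: r = 0.92
```

A correlation above .80 between independent raters is commonly taken as evidence that a scoring rubric is being applied consistently; in practice the second rater typically scores only a subset (here, at least 30 percent) of the responses.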

Results. The findings showed that knowledge-building portfolios played important roles in scaffolding collaboration and scientific understanding in the following ways: (1) The knowledge-building portfolio classes performed better than the other two classes on database usage; they wrote better questions and explanations in the database, and they performed better on scientific understanding. (2) Collective knowledge-building portfolio scores predicted students’ scientific understanding over and above the effects of achievement, database usage, and individual knowledge inquiry. Students who engaged in more collective work also developed more scientific understanding. (3) Analyses of student discourse showed how knowledge-building portfolios might foster scientific inquiry. Guided by the principles, students were better able to recognize and engage in productive scientific discourse. The portfolio notes of students who used the knowledge-building principles revealed trajectories of collaboration and knowledge growth in the community. In contrast, the portfolio notes of the students in other classes were primarily a selection of good answers. Through analyzing their own best work, students extended both collective and individual understanding; the principles helped students to develop collaboration and domain knowledge. 

Lessons Learned: How Did Knowledge-building Portfolios Support Scientific Inquiry and Key Design Principles?

As indicated in the introduction, this study is particularly useful for STEM instructional designers because the core ideas can apply to most other collaborative learning settings. The following is a summary of the core lessons learned in the study and principles for other instructional designers to consider.

Assess both individual and collective learning. It is now commonplace to encourage collaboration in scientific inquiry. Teachers, however, need a way to assess collaboration in student work. A major theme in computer-supported collaborative learning is examining collaboration and the interactions between individual and collective knowledge advances. We propose that our design of knowledge-building portfolios helps to capture collective knowledge building: A given portfolio note represents more than individual knowledge; it shows collective knowledge advances in the community, with multiple contributions from students. Knowledge does not belong to any single student; it is distributed across the group and community. The portfolio provides a way for teachers to identify what the class has learned, what it does not yet understand, and what progress the community has made at the cutting edge of its collective knowledge.

Use the portfolios and principles as scaffolds for scientific understanding. The knowledge-building portfolios not only help to characterize the collaborative process, they also provide a tool to help scaffold collaborative inquiry and domain understanding. The key idea is that when students are explicitly told the goals of instruction and given principles and criteria, they can work toward engaging in more knowledge building. Students are often asked to engage in collaborative inquiry in science classrooms, but they might not understand what productive inquiry involves. In our design we made the knowledge-building principles explicit to students and asked them to identify examples from their own work. Students were not only examining their own work, they were analyzing the collective work of the community and reflecting on the process. The principles as criteria could help them recognize and engage in more knowledge building. As well, when students analyzed and synthesized ideas, they also deepened their domain understanding.

Design assessments to support learning. The study showed that student assessments need to be formative, process oriented, collaborative, and integrated with instruction. The study yielded several specific design principles that many instructional designers should find useful in developing formative assessments:

  • Focus on both individual and collective growth. Collaborative scientific inquiry should incorporate assessment of both individual and collective aspects of learning. A different classroom culture needs to be developed. Teachers may let students know that demonstrating collaboration and helping others learn are valued just as much as, if not more than, correct answers. The interaction of individual and collective knowledge suggests that as students help others improve ideas, they also improve their own understanding.

  • Assess to support learning and collaborative science inquiry. Assessments need to be formative, embedded, and concurrent so they can serve scaffolding purposes for learning and collaboration. The portfolio serves the dual roles of assessing collaborative inquiry and scaffolding domain understanding.

  • Support student agency. Turn over the responsibility of assessment to students so they can have increased agency as they examine their own and community progress. With the use of technology, students can have the opportunity to examine different models and refine their scientific understanding.

  • Provide explicit criteria. Students also need to be given criteria that convey the goals of science instruction. Criteria may emphasize scientific inquiry or other aspects of science learning. Explicit assessment criteria can help scaffold students’ knowledge advances. Other examples include asking students to make self- and peer assessments of their reflective thinking in scientific inquiry.

  • Assess both processes and products. Both content and process need to be emphasized in science learning. We employ electronic portfolios wherein students identify high points of their learning, assessing both content and process (subject matter, reflection, and collaboration). Different kinds of assessments should be designed to measure and elicit deep understanding and metacognition.


[1] This paper was a winner of the 2005 Virtual Design Center's paper competition, selected through an extensive review process by the Virtual Design Center advisory board. The above summary was prepared by the authors with the assistance of the advisory board chair (Daniel T. Hickey) and the project manager (Beaumie Kim). For more information on how you could use this design principle for your practice, you may contact the Virtual Design Center (vdc@cet.edu) or the board representatives (Daniel T. Hickey, dthickey@indiana.edu; Beaumie Kim, bkim@cet.edu) and the authors (Eddy Y.C. Lee, h9297168@hkusua.hku.hk; Carol K.K. Chan, ckkchan@hkucc.hku.hk; Jan van Aalst, vanaalst@sfu.ca).


References
Barron, B.J.S., Schwartz, D.L., Vye, N.J., Moore, A., Petrosino, A., Zech, L., Bransford, J.D, & the Cognition and Technology Group at Vanderbilt (1998). Doing with understanding: Lessons from research on problem- and project-based learning. Journal of the Learning Sciences, 7, 271-311.

Bereiter, C. (2002). Education and mind in the knowledge age. Mahwah, NJ: Lawrence Erlbaum Associates.

Black, P., & Wiliam, D. (1998). Assessment and classroom learning. Assessment in Education, 5, 1, 7-44.

Bransford, J.D., Brown, A.L., & Cocking, R.R. (Eds.) (1999). How people learn: Brain, mind, experience and school. Committee on Developments in the Science of Learning, Commission on Behavioral and Social Sciences and Education, National Research Council. Washington, DC: National Academy Press.

Chan, C.K.K., & van Aalst, J. (2004). Learning, assessment, and collaboration in computer-supported environments. In J.W. Strijbos, P.A. Kirschner, & R. Martens (eds.), What we know about CSCL and implementing it in higher education (pp. 87-112). Dordrecht, the Netherlands: Kluwer Academic Publishing.

Hakkarainen, K., Lipponen, L., & Järvelä, S. (2002). Epistemology of inquiry and computer-supported collaborative learning. In T. Koschmann, R. Hall, & N. Miyake (Eds.), CSCL 2: Carrying forward the conversation (pp. 129-156). Mahwah, NJ: Lawrence Erlbaum Associates.

Hewitt, J. (2002). From a focus on task to a focus on understanding: The cultural transformation of a Toronto classroom. In T. Koschmann, R. Hall, & N. Miyake (Eds.), CSCL 2: Carrying forward the conversation (pp. 11-41). Mahwah, NJ: Lawrence Erlbaum Associates.

Hill, C.M., van Aalst, J., Lee, E.Y.C., & Chan, C.K.K. (2003). How do we recognize knowledge building? A virtual tour. The Institute for Knowledge Innovation and Technology. Retrieved February 10, 2006, from www.ikit.org/mvt/ca.html.

Lee, E.Y.C., Chan, C.K.K., & van Aalst, J. (in press). Students assessing their own collaborative knowledge building. International Journal of Computer-supported Collaborative Learning.

Paavola, S., Lipponen, L., & Hakkarainen, K. (2004). Models of innovative knowledge communities and three metaphors of learning. Review of Educational Research, 74, 557-576.

Scardamalia, M. (2002). Collective cognitive responsibility for the advancement of knowledge. In B. Smith (Ed.), Liberal education in a knowledge society (pp. 67-98). Chicago: Open Court.

Scardamalia, M., & Bereiter, C. (2003). Knowledge building. In Encyclopedia of Education, Second Edition (pp. 1370-1373). New York: Macmillan Reference, USA.

Scardamalia, M., & Bereiter, C. (1994). Computer support for knowledge-building communities. The Journal of the Learning Sciences, 3, 265-283.

Scardamalia, M., Bereiter, C., & Lamon, M. (1994). The CSILE Project: Trying to bring the classroom into World 3. In K. McGilley (Ed.), Classroom lessons: Integrating cognitive theory and classroom practice (pp. 201-228). Cambridge, MA: MIT Press.

Scardamalia, M., Bereiter, C., McLean, R.S., Swallow, J., & Woodruff, E. (1989). Computer-supported intentional learning environments. Journal of Educational Computing Research, 5, 51-68.

Shepard, L.E. (2000). The role of assessment in a learning culture. Educational Researcher, 29, 7, 1-14.

Stahl, G. (2004). Building collaborative knowing. In J.W. Strijbos, P.A. Kirschner, & R.L. Martens (Eds.), What we know about CSCL and implementing it in higher education (pp. 53-85). Dordrecht, the Netherlands: Kluwer Academic Publishing.

van Aalst, J., & Chan, C.K.K. (in press). Student-directed assessment of knowledge building through electronic portfolio. The Journal of the Learning Sciences.

Wolf, D., Bixby, J., Glenn, J., & Gardner, H. (1991). To use their minds well: Investigating new forms of student assessment. Review of Research in Education, 17, 31-74.

 

© 1999-2006 by Wheeling Jesuit University/Center for Educational Technologies®. 316 Washington Ave., Wheeling, WV, 26003-6243. All rights reserved. Privacy Policy and Terms of Use.