The Personal Learning Planner:

Collaboration through Online Learning and Publication

 

 

Bruce Havelock, RMC Research Corporation, Denver, CO.  havelock@rmcdenver.com

David Gibson, National Institute for Community Innovations, Montpelier, VT.  dgibson@nici-mc2.org

Lorraine Sherry, RMC Research Corporation, Denver, CO.  sherry@rmcdenver.com

 

 

 

Abstract:  This paper discusses the online Personal Learning Planner (PLP) project underway at the National Institute for Community Innovations (NICI), one of the partners in the Teacher Education Network (TEN), a 2000 PT3 Catalyst grantee.  The Web-based PLP provides a standards-linked “portfolio space” for both works in progress and demonstration collections of completed work, combined with structures to support mentorship and advising centered on the improvement of work.  In this paper, we describe the PLP’s history and rationale, its design, and some initial results of its use in pilot programs, and we discuss lessons learned through these pilot experiences that can inform the PLP’s effective use in teacher education.  Early lessons from the field show the cultural, pedagogical, and technological challenges and potentials of basing performance reviews on collaboratively generated, standards-linked, Web-based portfolio processes and products.

 

 

The Personal and Professional Learning Planner and Portfolio (PLP), built by the National Institute for Community Innovations (NICI), has been available for a little over a year and has been piloted in preservice teacher education programs, K-12 schools, regional service centers, and other local educational agencies.  This tool for collaboratively discussing and improving student work, linking that work to goals and standards, and collecting it in a Web-based portfolio format is uniquely suited to some of the current challenges surrounding authentic assessment, digital literacy, and collaborative reflection in teacher education.  In this paper, we discuss the utility of the PLP in teacher education; the PLP’s theoretical framework, design history, and goals; and some early results of its implementation in a variety of institutional contexts.  Lessons from these varied contexts illuminate challenges and future goals that will augment the effectiveness of the PLP in supporting a continuum of teacher learning from preservice coursework through the duration of the teaching career.

Electronic Portfolios to Support Teacher Learning

The need for a Web-based tool focused on the improvement of preservice teacher work has two parts.  First, learners benefit from feedback from a diverse audience, yet preservice and induction programs often have limited resources and structures that result in scant feedback to aspiring teachers.  In these cases, the work of aspiring teachers frequently evolves in relative isolation, potentially solidifying patterns of work that do not include constructive feedback and reinforcing the oft-lamented isolation of teaching that persists in many schools today.  A Web-based professional network can help overcome this isolation and, just as importantly, provide the future teacher with high-quality information that might otherwise be unavailable.  The feedback from multiple perspectives thus enabled can help teachers reflect on multiple dimensions of their work.

 

Second, our goals for preservice teacher education are evolving toward sophisticated understandings of demanding and complex material.  In addition to mastering subject matter, thorough teacher education requires the development of a critical and reflective stance toward the work of teaching and toward one’s own progress in it.  To support this important work, assessment of preservice and ongoing teacher education must evolve in kind.  In small, personalized teacher preparation programs, preservice teachers may benefit from interviews, observation, and feedback sessions related to their work, but in many programs that experience is limited to the last few months of preparation.  Too often, assessment of preservice teacher learning is a one-way interaction that takes place at a single point in time, usually the end of a course in the preservice curriculum.  Ideally, assessment should play a meaningful part in the ongoing learning of the person being assessed, while providing information that helps advisors and mentors support the learner’s education.  Rather than an isolated measurement of skills or knowledge, effective assessment should be dynamic and ongoing, forming an integral part of the learning process (Shepard, 2000).  This kind of effective and authentic educational assessment should record problems encountered, decisions considered and made, and the validation of the work produced, not just the final outcome.

 

The dynamic online collaboration supported by the PLP performs these functions while also putting the learner in a position of control of and responsibility for a dialogue with advisors around his or her own learning.  The PLP aims to create a longitudinal multimedia record of growth and change in an aspiring teacher’s skills and capabilities.  As such, the PLP can potentially document a future teacher’s progress through the learning/adoption trajectory (Sherry, Billig, Tavalin, & Gibson, 2000), from learner to adopter, co-learner, reaffirmer, and leader.

 

Although not widespread, portfolios have been embraced in some corners of preservice teacher education (cf. Andrews, Ducharme, & Cox, 2003).  Among the purposes relevant to teacher education that electronic portfolios can serve, Barrett (1998) lists diagnosing student learning, grading or proficiency testing, supporting promotion or certification, and aiding the job-seeking process.  Yet while the AAHE currently reports 61 institutions of higher education that use electronic portfolios in some form, few if any of them at present integrate all of the functions of meaningful reflection, ongoing dialogue, and a platform for both ongoing and completed products.  The PLP provides a “portfolio space” for works in progress and demonstration collections of work, along with multiple channels for communicating around their creation, revision, and assessment.  By flexibly serving these functions, the PLP is robust enough to meet the demands of ongoing, dynamic, authentic assessment of growth in teacher knowledge and skills.  The online PLP accepts all media formats and supports a multiplicity of linkages among learning goals or standards of performance, projects, and the evidence of attainment of those goals and standards.  Distinct from electronic portfolios that concentrate on the presentation and storage of completed work, the PLP concentrates on the continuous improvement of work and the documentation of that improvement over time.

Project History

The Web-based application at the heart of the PLP has been developed over the last three years with funding from the Preparing Tomorrow’s Teachers to Use Technology (PT3) program, as well as from work funded by the National Science Foundation and the Technology Innovation Challenge Grant program.  The lineage of the PLP comes from two sources.  One source was a bold move by a local secondary school community in Montpelier, Vermont, which in 1993 placed “individualized educational plans for every student” into its long-term strategic plan.  In 1995, this led to the creation and implementation of a school-wide program to place personal learning at the center of a continuous conversation involving all students, their parents or guardians, and caring adults in the school.  The University of Vermont provided support and energy to this school-based development through the writings of researchers and theorists such as Bentley (1999), Moffett (1998), Friedrichs (2000), and Gibson (1999, 2000).

 

In addition, early in its development, the concept of the Montpelier “PLP” was picked up by the Regional Laboratory at Brown University and combined with similar movements and interests in Rhode Island, Maine, Massachusetts, and other New England states.  In Maine, for example, the concept of personal learning took on a primary role in that state’s new proposal for the reform of secondary schools.  In other work of the Lab, the theme of personalization became a crucial feature of the secondary school reform network in the region, and was tied to the principles of Breaking Ranks, the reform monograph of the National Association of Secondary School Principals.  Thus, the concept of personalization of learning as essential to educational reform is well founded in theory as well as in practice.

 

The PLP’s second thread of lineage came from the pioneering work of The WEB Project, which used Web-based tools and networked communities to share and critique original student work online.  The WEB Project provided a rich research base with which to explore online dialogue and design conversations within a virtual community of learners and to define the path by which teachers progress from learners to leaders with technology (Sherry, 2000; Sherry, Tavalin, & Billig, 2000; Sherry, Billig, Tavalin, & Gibson, 2000).  The WEB Project established a system that linked ten participating schools and districts (including Montpelier High School) and multiple cooperating initiatives in online discussions of student work.  Art and music students posted works in progress and received constructive feedback from community practitioners and learners, based on their articulated intentions for their works-in-progress.  Middle school students from three schools across Vermont conducted book discussions, facilitated by staff from the Vermont Center for the Book and their teachers.  Teachers discussed challenges, conducted action research, shared results, and co-developed rubrics to assess instructional processes, progress, and outcomes.  Through these efforts, The WEB Project contributed substantially to knowledge of effective practice for conducting online dialogue and design conversations.

 

Through The WEB Project, teachers developed connections with and drew on the expertise of other practitioners in their discipline from participating initiatives throughout the state.  For example, art teachers and students shared online interactions with traditional artists, graphic artists, and multimedia designers; and music teachers and students carried on conversations with resident musicians, music teachers, and composers.  The mentor-practitioners, in turn, were asked to give students feedback and essentially became co-instructors in the course.  This learning community resembled an apprenticeship model, but it allowed for many mentors and was not constrained by time or place.

 

The secrets of The WEB Project’s success were many, but it is worth highlighting the singular focus on creation of original student work, which ensured that online dialogue remained centered around the learner (Sherry, 2000; Sherry, Tavalin, & Billig, 2000).  In The WEB Project, “student work” included two important genres:

 

·         student-created works of traditional art, digital art, multimedia, and music, supported by student-initiated design conversations with teachers, peers, and experts; and

·         student-moderated dialogue with reflective, threaded discussions about assigned language arts texts that focused on controversial issues faced by middle school students.

 

In the design conversations, the entire sequence of activity began only if and when a student shared a work-in-progress and asked for specific feedback.  If the work was shared too early, the request for feedback and the ensuing online interactions with experts were too general and superficial.  If the request for feedback came too late, when the work was already in its final form, the learner viewed the conversation as unimportant or too critical.  However, if the student requested feedback at an optimum point, when the work was posted on The WEB Project’s Web site in draft form and he or she was able to articulate specific design problems that needed prompt attention, then the community of experts was able to provide a useful range of practical suggestions to be filtered, evaluated, and used for revision and refinement of the work-in-progress.  These qualities of learner-centeredness, creativity, self-initiative, and intellectual focus were carried forward into the Web-based PLP.

The Personal Learning Planner: Design Rationale

The PLP is based on a theory of dialogue recently articulated by Gibson and Friedrichs (Friedrichs, 2000; Friedrichs & Gibson, 2001).  Friedrichs (2000) discusses four distinct dialogue states for which supports were explicitly built into the PLP:

 

1.       Sharing experience — listening to one’s own and others’ inner speech and natural attitude about a skill or concept;

2.       Expressing and examining diverse concepts — recognizing conflicts; analyzing old and new concepts, models, and beliefs; working in one’s zone of proximal development;

3.        Articulating applications and understandings — practicing new skills; combining old and new concepts; using others’ ideas; using scaffolds to renegotiate understandings; and

4.        Communicating new powers and creations — celebrating effects of critical analysis.

 

The premise of collaborative interaction as a basis for learning is consistent with research focused on authenticity, use of technology to create problem-centered learning teams, representation of complex dynamics in educational settings, and e-learning (Carroll, 2000; Gibson, 1999; Gibson & Clarke, 2000; Newmann & Wehlage, 1995; Sherry & Myers, 1998; Stiggins, 1997; Wiggins, 1989; NSDC, 2001).

 

The learner’s productivity and self-efficacy are the ultimate goals of the online PLP.  Work samples are the critical source of evidence of learning, the documentation of progress, and the verification that high standards have been achieved.  By placing student work at the center of the PLP, the learner is pushed to a higher standard of personal accountability for the publicly visible quality of that work.

 

In the PLP, learners pose questions to advisors; they develop, use, and weigh a variety of learning assets (their strengths, interests, aspirations, and community and personal resources); and they retain ultimate control over the progress of their work, the integration of the feedback they receive, and its eventual publication.  This decision-making power enhances learner motivation and develops a sense of ownership of work products, resulting in final products of higher quality.

 

With these concepts of learning in mind, and with funding from the U.S. Department of Education under the PT3 program, NICI developed the first version of the PLP as a “critical friends” online space to help future teachers assemble portfolios of evidence showing that they meet the standards required for a teaching credential or license.  The PLP is designed to assist aspiring teachers through the processes of:

 

·         self-assessing strengths, interests, and aspirations and their relationship to program requirements;

·         planning preservice education learning goals and projects;

·         linking goals and projects to valued outcome standards;

·         creating original multimedia work samples and sharing those with others;

·         receiving high quality feedback from mentors, advisors, or other critical friends for the consideration of their learning goals, improvement of their work, and strengthening of their knowledge and skills;

·         documenting and validating the achievement of learning goals; and

·         selecting and preparing exhibits of learning.

 

The PLP includes tools for online survey building and administration, developing local standards and rubrics, organizing uploaded work in relation to those standards and rubrics, forming learners and advisors into various communities, and creating a completed Web-based portfolio product.  The PLP can be flexibly customized to serve the needs of practically any outcome-oriented collaborative learning group.  While most electronic portfolios demonstrate either formative or summative learning, the PLP showcases both.

 

In practice, learners in the PLP system articulate their goals for learning, reference those goals to standards for work or knowledge introduced into the PLP by program administrators, and upload computer files to the PLP server that exhibit their progress toward meeting these goals and standards.  Through a process of collaborative reflection, assessment, and several iterations of multiple work products, learners develop an electronic portfolio showing their growth and abilities; this portfolio is then available to them as an exhibit of their growth and an aid to their future career progression.
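The many-to-many relationships described here (goals referenced to standards, and work samples linked to goals as evidence) can be sketched in code.  The following is a minimal illustrative model only; the class and field names are hypothetical and are not drawn from the PLP’s actual implementation.

from dataclasses import dataclass, field
from typing import List

# Hypothetical, simplified model of the workflow described above:
# a learner articulates goals, references each goal to one or more
# standards, and uploads work samples as evidence of progress.

@dataclass
class Standard:
    code: str            # program or state standard identifier
    description: str

@dataclass
class Goal:
    title: str
    standards: List[Standard] = field(default_factory=list)

@dataclass
class WorkSample:
    filename: str        # uploaded file exhibited as evidence
    note: str = ""       # learner's reflection or request for feedback
    goals: List[Goal] = field(default_factory=list)

@dataclass
class Learner:
    name: str
    goals: List[Goal] = field(default_factory=list)
    work: List[WorkSample] = field(default_factory=list)

# Example: one goal linked to a standard, with a draft posted as evidence.
standard = Standard("SCI-7", "Designs inquiry-based science lessons")
goal = Goal("Plan an inquiry unit on ecosystems", standards=[standard])
draft = WorkSample("ecosystem_unit_v1.doc",
                   note="Seeking feedback on the assessment plan",
                   goals=[goal])
learner = Learner("A. Candidate", goals=[goal], work=[draft])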

 

“Advisors” and “Learners” have specific technical characteristics in the context of the PLP.  Through the management interface, Advisors are associated with one or more Learners.  When a Learner’s goal or work is shared for critique and feedback, the Advisor can discuss it, offer direct edits, or validate the goal or work as adequate for its purpose.  For example, an Advisor might validate a goal as appropriate for completion of a secondary teaching license in science, and validate a piece of work as evidence of achieving a standard of performance linked to one or more goals.  The validation process can be formalized with rubrics or entered as narrative.  Any rubric can be linked with any piece of work as evidence.  When a group of Advisors scores a piece of work using a common rubric, a summative rubric can be built once their scoring is complete.
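As a rough sketch of that summative step (with hypothetical names and scale, not the PLP’s actual code), the following shows how several Advisors’ scores on a shared rubric might be combined into a summative rubric by averaging each criterion:

from statistics import mean
from typing import Dict, List

def summative_rubric(scores: List[Dict[str, int]]) -> Dict[str, float]:
    """Combine per-Advisor rubric scores (criterion -> level) into one
    summative rubric by averaging each criterion across Advisors."""
    criteria = scores[0].keys()
    return {criterion: round(mean(s[criterion] for s in scores), 2)
            for criterion in criteria}

# Example: three Advisors score the same lesson plan on a 1-4 scale.
advisor_scores = [
    {"Content accuracy": 4, "Alignment to standard": 3, "Reflection": 2},
    {"Content accuracy": 3, "Alignment to standard": 3, "Reflection": 3},
    {"Content accuracy": 4, "Alignment to standard": 4, "Reflection": 3},
]
print(summative_rubric(advisor_scores))
# {'Content accuracy': 3.67, 'Alignment to standard': 3.33, 'Reflection': 2.67}

Averaging is only one possible way to summarize; a program might equally record each Advisor’s rubric separately or report the median or mode for each criterion.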

Early Experiments in PLP Use

As the PLP evolves, so do its potential applications.  While it is difficult to argue that any technology is entirely value-free, the flexibility of the PLP allows for its customization to fit the values of the groups that use it.  In this section, we explore five such applications in very different contexts, aiming to draw from this range of experiences lessons that can inform the PLP’s continuing design and enhancement to support preservice teacher education. 

 

In summer 2002, 31 sites were using the online PLP in support of an extensive range of educational programs and networks.  Five of these sites were selected to explore initial reception of and reactions to the PLP.  To explore the boundaries of the PLP’s flexibility, the sites were selected to illustrate the issues involved with customizing the PLP to meet the distinct needs of five very different learning communities. 

 

Fourteen participating program administrators completed surveys and were interviewed about their experience with the PLP.  From the PLP environment itself, we examined (across six programs) the uploaded program standards, the list of rubrics for evaluating learner work, the surveys that had been created and administered within the PLP sites, and a number of individual learners’ PLP pages that contained the work of individuals and the commentaries of their mentors or advisors.  Initial analysis of these data led to several hypotheses about critical trends in PLP implementation.  The data were then re-examined to extract trends and potential lessons to guide the PLP’s future development and implementation.

 

Initial results affirmed that the PLP can effectively meet the needs of a wide range of users.  All five program administrators felt that the purpose and functions of the PLP were appropriate, useful, effective, powerful, and well suited to their intended audiences.  After getting past the initial hurdles of training and basic system familiarity, users found that the PLP resonated with their ideas about cognitive coaching, peer mentoring, standards-based instruction, and social learning.  Pilot users also found the design and management approach adopted by the PLP design team to be very effective.  All five pilot users described a high level of support, responsiveness, and personal attention from the design team.

 

By the end of 2002, several more institutions had begun to use the PLP.  Two of these new sites generated richer data than were available for two of the initial five sites; the summaries below therefore include two new sites and three that were already using the PLP by summer 2002.  For each group of learners, distinctive features of the pilot experience (as described by program administrators and research staff) are summarized.

 

·         8th grade students in a public middle school.  A primary goal in this program was to “boost student engagement…  By working through the goal setting and reflection process, [students] will be able to articulate what they want to get out of school.”  Supported by a dedicated program leader (who is also the technical administrator), students in this group seemed undaunted by the technical challenges, and posted more work than any other single group in this sample, often proceeding through several drafts.  Comments from teachers, while initially somewhat superficial, began to show more substance as the process of providing online comments on student work became more familiar to them.

·         Attendees at an intensive two-day professional conference session.  The goal in this case was to extend professional learning that had been initiated at a regional conference; however, the PLP was also used here to prepare for a more effective conference-based learning experience.  The session leader was able to review individual learning goals for nearly all of the session participants before the conference took place, and alter her instructional plan for the conference to accommodate those needs.  After the conference, the attendees continued to serve as a community of learners, providing input and advice on each other’s work and extending the learning afforded by the conference.

·         School administrators working with a regional service organization.  Though this group uploaded no work in their early stages of PLP use, they completed a large number of surveys.  The program administrator attributed this to the fact that the surveys were analogous to an exit interview protocol that was part of the existing program.  Similarly, this administrator focused on the “4-step work cycle” of the PLP because it nicely matched the “4-step learning cycle” that was an existing component of her program.  The PLP was set up for teams of administrators to function together as “Learner” units.  As these administrators in general communicated infrequently and were geographically dispersed, the PLP provided for “greater continuity between sessions and a sense of measurable progress.”

·         Ph.D. students in an educational leadership program.  While no state standards existed for doctoral students in this field, students developed their own goals and standards for graduation (4 mandatory areas and 3 chosen by students) with input from their advisory committees, and made early steps toward organizing their works around those goals through the PLP.  Students in this program requested to be able to give each other feedback, so all were given system rights as both Learners and Advisors.  Some students started the process by giving each other “fake” feedback to test the system, but comments tended to develop more substance over time.  Learners also started to personalize their PLP portal pages and requested more flexibility in doing so.  Participants in the program felt that the program as a whole had an attitude that encouraged getting feedback for revision, which increased receptivity to the PLP.

·         Preservice teachers in a preparation program for urban teachers.  In this program, candidates posted their inquiries and their action research projects while building a profile that was provided to hiring school systems as evidence of their competence.  Although prospective teachers in this program were not initially enthusiastic about improving works in progress for portfolio inclusion, the program administrator exerted considerable effort to make it clear that learners’ PLP work could play a crucial role in getting a job, and that school systems were very interested in seeing documentation of learner growth.  It became clear in the job market that participants who did develop portfolios through the PLP had a distinct advantage over those who did not.  The administrator then began lobbying locally to have official program credit awarded for PLP portfolio development, feeling strongly that the PLP enhanced both accountability and leadership among participants.

Early Lessons

While it is still too early to see the full potential of an evolutionary record of learner progress, signs from the pilot sites suggest that the design goals of the PLP are meeting with some success.  Several of the issues that emerged through examination of these particular sites are discussed below.

 

Implementation in a number of varied sites and contexts confirmed that the structures of the PLP are flexible enough to meet the needs of every permutation of learning community yet encountered.  Participants reported high levels of satisfaction with several aspects of the PLP’s flexible design, including the easy customization of roles and groups; the ability to develop different sets of standards, goals, rubrics, and surveys for those groups; and the natural affordance of technology to transcend traditional boundaries of distance or time.

 

The nature of PLP-embedded information about student progress and growth over time gave many learners and advisors a strong sense of the power of diagnostic assessment of their works in progress.  They also began to comprehend and value the linkage of their work to personal goals and institutional standards, and the importance of a record of continuity and growth in their learning experiences.  In many settings, learning was both supported and extended through the ongoing reflection and dialogue enabled by the PLP’s structure.  However, this complex vision also accounted for difficulties: many implementation challenges stemmed from participants’ difficulty in incorporating a view of learning and assessment as a process, not just a product, into their practice.

 

In each of the explored pilot sites, effective program ownership and advocacy to build interest, enthusiasm, and commitment around use of the PLP was perceived as a critical factor supporting implementation.  Some sites experienced difficulty when different personnel held the technical and the conceptual ownership of the PLP; where these functions were consolidated in one person, the program tended to be successful.  In several settings, both learners and advisors were less active in their engagement with the PLP until advocates ensured that it became part of programmatic requirements or explicitly demonstrated its utility in helping meet their larger personal and professional goals.  The presence, commitment, and informed outreach of such advocates are likely to play a key role in the PLP’s future sustainability among existing and new learner communities.

The PLP and Cultural Change

Perhaps the most powerful and challenging process observed at each of the PLP sites was the intersection of existing cultural norms with some of the changes in thinking and practice implied by the approach to learning, assessment, standards, and mentorship embedded in the PLP.  On one level, the importance of considering participants’ prior experience with portfolio systems was very evident among the pilot users (cf. Barrett, 1998).  In a deeper sense, the cultural or institutional practices and prior experiences of different pilot groups strongly influenced their initial engagements with the PLP.  Beyond their experience with paper portfolios, different groups brought varying norms to many aspects of their work with the PLP, including conceptions of mentorship and reflection, the purpose of a portfolio-like collection of work, the relevance of such a portfolio to their jobs and careers, the idea of assessment as a fixed-point evaluation of a finished product, and familiarity with and ideas about content standards.  As users became more familiar with the PLP, some tentative reconsideration of norms of teaching and learning (visible in more reflective comments, more active engagement with PLP work, and program administrators’ descriptions of such changes) was evident as risk-taking and experimentation with the PLP were supported and encouraged.

 

Participants’ most effective initial engagements with the PLP, those that led to increased buy-in and participation, centered on aspects of its design that were analogous to structures and practices with which participants were already familiar.  Expectations and rewards for participation also tended to be closely tied to existing program structures.  All potential users of an innovation need to be persuaded of the viability and relevance of a new way of doing things (cf. Rogers, 1995).  In these cases of PLP implementation, those “selling points” were exploited from various angles by the program advocates mentioned above, who recognized a match between these points of leverage and existing institutional values and conditions.  In some cases, the point of leverage was the 4-step work cycle or the survey component; for a group that had more extensive experience linking work to standards, creating standards-linked individual goals was a logical first step.

 

In each case, the approaches to learning, assessment, and collaboration embodied in the PLP and in each institution intersected to create a unique PLP site reflecting those aspects of the PLP approach that best fit the given context for implementation.  This is likely to be equally true in other programs, with their differences in size, scope, and institutional history.  Understanding the relative importance of these contextual factors in other programs may help potential adopters consider the relevance of the PLP to their settings, while also helping to identify areas where leaders may wish to instigate or catalyze cultural and institutional change.  In all cases, the PLP fostered conditions of increased accountability for both learners and advisors, to each other as well as to the institutional standards that they were charged with meeting.  Through this lens, it will be helpful for new adopters to conceptualize the PLP as a way to reify, enhance, and reflect on institutional norms rather than replace them.

Future Directions

We have described the design rationale and early stages of implementation of the PLP, or Personal Learning Planner, a Web-based application designed to provide a flexible environment for standards-based work and mentorship.  We have also seen that effective use of the PLP will imply cultural changes for many institutions as they rethink the demanding nature of student-mentor relationships, standards-linked performance, and ongoing documentation of professional learning.  As use of the PLP scales to wider audiences, it may be worthwhile to develop tools and protocols to help understand the dimensions that will influence the cultural fit of the PLP with candidate groups of learners.

 

Initial forays into the PLP medium provided participants with opportunities to rehearse new norms and conventions for working, interacting, reflecting, and providing feedback in this environment.  Each of the forays we have seen resulted in a feedback loop of positive reinforcement that added momentum to learner participation.  In a paradox common to many such innovations, achieving the full learning benefits of the PLP requires an up-front period of investment during which those benefits may not be readily apparent.  To date, several sites that have attempted to use the PLP have been able to develop their work for a long enough period and in such a way as to foster a self-sustaining critical mass of interest and participation.  Part of this success at sustaining early engagements is attributable to the high level of personal support provided to pilot sites by the development team.  As use of the PLP expands to a wider audience of learning communities, the team will continue to explore structures and roles that can provide this high level of personalized support and customization on a larger scale.  Addressing these issues, and further understanding other keys to achieving that level of interest and participation, will be a critical part of sustaining the PLP beyond the duration of its initial funding period.

 

While high standards for education are mandated by accountability legislation, few educators are at present familiar enough with the standards relevant to their practice to implement them fully.  One of the outcomes that PLP participants valued was increased familiarity with these standards, on the part of both learners and their administrators.  The linking and goal-setting features of the PLP lend standards an unprecedented dimension of interactivity and force consideration of their relevance to one’s actual work.  As the policy climate around standards grows more demanding, the PLP is likely to find new audiences that will further contribute to its sustainability.  As a component of preservice teacher education, the PLP can help future and existing teachers arrive in the workforce well versed in thinking through the complex interaction of standards with their work, prepared to collaboratively reflect on their importance and attainment, and able to powerfully demonstrate their results.

References

Andrews, S., Ducharme, A., & Cox, C. (2003).  Development and use of electronic portfolios in preservice education.  Paper presented at the annual meeting of the Society for Information Technology in Teacher Education (SITE), March 23-26, Albuquerque, NM.

 

Barrett, H. (1998).  What to consider when planning for electronic portfolios.  Learning & Leading with Technology.

 

Bentley, T. (1999, December 8).  Students empowered in Montpelier [Our Generation page].  Times Argus Newspaper.

 

Carroll, T. (2000).  If we didn’t have the schools we have today, would we create the schools we have today?  Keynote speech at the AACE/SITE conference, San Diego, CA.

 

Friedrichs, A. (2000).  Continuous learning dialogues: An ethnography of personal learning plans’ impact on four River High School learners.  Unpublished doctoral dissertation, University of Vermont.

 

Friedrichs, A., & Gibson, D. (2001).  Personalization and secondary school renewal.  Forthcoming from Brown University Regional Laboratory.

 

Gibson, D. (1999).  Mapping the dynamics of change: A complexity theory analysis of innovation in five Vermont high schools.  Unpublished doctoral dissertation, University of Vermont.

 

Gibson, D. (2000).  Complexity theory as a leadership framework.  Montpelier, Vermont: Vermont Institute for Mathematics, Science, and Technology (VISMT).  Available: http://www.vismt.org/pub/ComplexityandLeadership.pdf

 

Gibson, D., & Clarke, J. (1999).  Growing towards systemic change.  Providence, Rhode Island: Brown University Regional Laboratory.

 

Moffett, J. (1998).  The universal schoolhouse: Spiritual awakening through education.  Portland, Maine: Calendar Island Publishers.

 

Newmann, F.M., & Wehlage, G.G. (1995).  Successful school restructuring: A report to the public and educators.  University of Wisconsin: Center on Organization and Restructuring.

 

National Staff Development Council.  (2001). National standards for online learning.  Forthcoming from National Staff Development Council.

 

Rogers, E. (1995).  Diffusion of innovations.  New York: The Free Press.

 

Shepard, L. (2000).  The role of assessment in a learning culture.  Educational Researcher, 29 (7), 4-14.

 

Sherry, L. (2000).  The nature and purpose of online conversations: A brief synthesis of current research. International Journal of Educational Telecommunications, 6 (1), 19-52.

 

Sherry, L., & Myers, K.M.M. (1998).  The dynamics of collaborative design.  IEEE Transactions on Professional Communication, 41 (2), 123-139.

 

Sherry, L., Tavalin, F., & Billig, S.H. (2000).  Good online conversation: Building on research to inform practice.  Journal of Interactive Learning Research, 11 (1), 85-127.

 

Sherry, L., Billig, S.H., Tavalin, F., & Gibson, D. (2000).  New insights on technology adoption in schools.  T.H.E. Journal, 27 (7), 43-46.

 

Stiggins, R.J. (1997). Student-centered classroom assessment. Upper Saddle River, New Jersey: Merrill, Prentice Hall.

 

Wiggins, G. (1989). Teaching to the (authentic) test. Educational Leadership, 46, 41-46.