# A Dynamic Social Feedback System to Support Learning and Social Interaction in Higher Education

Brian Thoms

Pages: pp. 340-352

Abstract—In this research, we examine the design, construction, and implementation of a dynamic, easy-to-use feedback mechanism for social software. The tool was integrated into an existing university's online learning community (OLC). In line with constructivist learning models and practical information systems (IS) design, the feedback system provides members of the OLC with the capability to rate blog posts and provide instant feedback on their peers' content. The software was implemented at a US university in an introductory course on IS with the goal of fostering higher levels of learning and social interaction. A content analysis showed that higher levels of system usage corresponded with higher course grades. A survey analysis supported these results, showing statistical significance between levels of system use and perceived levels of learning.

Index Terms—Online social networks, online learning community, feedback system, asynchronous learning, design science.

## Introduction

Milgram's [ 1] small-world phenomenon asserts that everyone in the world can be reached through a small chain of social ties. From this concept, the more familiar phrase six degrees of separation is derived. Today, we bear witness to a new wave of social software that attempts to harness the power of social ties to foster greater interaction.

While underlying business models look to the power of advertising money and membership fees as grounds for building sustainable social software, philosophical underpinnings focus on the power of “We” for collaborating on everything from general interests to building solutions to complex problems. Today's online social networking software utilizes the latest Internet technologies to provide members with a complete interactive and multimedia environment. Consequently, many individuals are simply hooked on online social networking. Nielsen Online reports that two-thirds of the world's Internet population visit a social network or blogging site and the sector now accounts for almost 10 percent of all Internet time. Some reports estimate that 37 percent of all online US adults and 70 percent of all US teens engage in some form of online social networking every month [ 2].

While the number of online social network users is growing, so, too, is the number of social networking websites across the Internet. Mashable.com [ 3] identifies over 350 popular social networking websites that maintain active users and are open to new users. While there are more dominant players such as Blogger and Facebook, niche social networks provide context specific environments while utilizing the same underlying technologies as those previously mentioned.

Within higher education, classroom networks are dominated by course management software (CMS). While the technologies that comprise CMS platforms continually change to meet the needs of today's changing student populations, most still fall short of offering instructors and students a comprehensive social learning environment. In this research, we depart from traditional CMS software and introduce social software within a university setting. More specifically, we show how enhanced social software components can foster greater levels of interaction and learning in large classroom environments.

## Background

### 2.1 Online Social Learning Environments

In 2002, Stacey [ 4] found that a higher quality of electronic communication helps to engage students and aids in their learning of the course material. Further studies in online collaboration have shown that virtual communication patterns correspond in some fashion to real-life communication [ 5], [ 6]. Accordingly, online learning environments provide a valid form of learning and offer many different methods of interaction [ 7]. As in face-to-face communication, members of online social learning environments are able to state what they think, comment on what others have said, collaborate on common statements, and share information in many forms.

Web 2.0 technologies, such as blogs and wikis along with peer-to-peer networking discussion boards and file sharing, empower individuals to take ownership of the content they create while also making it easier to pursue social or scholastic ties with their peers. And increasingly, more individuals are gaining access and familiarizing themselves with these technologies, which makes their introduction into the classroom more-or-less seamless.

Research trends support these assumptions. Brescia and Miller [ 8] found benefits to using blogging in the classroom including enhanced student reflection, increased student engagement, portfolio building, and better synthesis across multiple activities. In a similar study within large classrooms, Song and Yuen [ 9] found that group blogging motivated student reflection and interaction. Research on large-scale discussion forums also supports these claims. Cook et al. [ 10] found that online discussions helped students to better comprehend the material and that students were more excited to come to class having prior exposure to the material online.

While our research has shown success in implementing social software within a classroom environment, it should be noted that caveats do exist. Hernández-Ramos [ 11] investigated the use of blogs in a teacher training program and found that teachers were unsure about how to integrate blogging assignments with course learning objectives. Furthermore, Richardson [ 12] identifies factors relating to privacy and copyright that instructors must be aware of when implementing social software within the classroom. Additionally, with any open platform, there is a potential for peer harassment or abuse within the community, which can have devastating effects on learning [ 12], [ 13].

In 2008, Vassileva [14] defined two main challenges to implementing social learning environments: 1) supporting the learner in finding the right information and 2) motivating the learner. Through careful integration with course learning objectives, we suggest social learning environments can accommodate these needs and steer clear of the pitfalls above to provide students with a powerful learning resource. Inspired by exemplars in online community and conversation, including LinkedIn and Facebook, our research mirrors a social networking model and various Web 2.0 technologies to facilitate learning and social interaction inside and outside the classroom.

### 2.2 Course Management Software versus Online Social Networking Software

One specific type of online platform, common across a majority of learning institutions, is course management software (CMS). CMS platforms provide powerful tools for facilitating and managing academic course work. Illustrated in Fig. 1, our university's CMS platform is Blackboard, which provides instructors with a comprehensive system to manage and distribute course-based materials. Blackboard allows instructors to receive student-submitted assignments, enter and review grades, create group discussions, and communicate with students. However, while Blackboard provides a powerful resource for instructors, it falls short of providing an engaging, student-centric environment.

Fig. 1. Blackboard instructor control panel showcasing various tools available only to course administrators.

As a largely top-down system, instructors control the flow and ownership of information within Blackboard. Outside of basic discussion forums, the software provides minimal opportunities for peer-to-peer interaction. Additionally, courses in Blackboard close at the end of a term, leaving no persistent artifacts for students to return to at a later time. Blackboard also limits personal profile development, an element that can be essential to knowledge sharing and community building. As a result, CMS software like Blackboard can be viewed as an institutional resource that deters students from controlling the visibility, organization, and presentation of their online content.

Residing on the opposite end of this “control spectrum” are online learning communities (OLCs). OLCs are a specific type of online social networking software. Where CMS software is top down, OLCs can be viewed as bottom up: individuals own the content they create. Furthermore, it is the individual or community that decides what work becomes visible and to whom. In providing the user with this control, OLCs can be characterized more closely as traditional communities of practice (CoP), where the success of the OLC is directly correlated with the participation and engagement of its users. In an academic setting, although individuals may be graded on their individual contributions, the overall community can also be assessed on the value it provides its members.

Table 1 identifies this bifurcation in more detail.

Table 1. CMS Software versus OLC Software

The transition toward more student-centered learning tools is occurring in new releases of Blackboard, Angel, and Sakai CMS platforms. In later versions of these software packages, instructors have the ability to incorporate blogging and advanced discussion boards and students have the ability to create personalized profiles. Unfortunately, at the time of this study, such features were not incorporated in our university's CMS, even during a recent system upgrade. But rather than await the future adoption of this software, which comes with significant upgrade costs and may still lack key components such as connection-making, collaborative writing, and advanced personal profiles, we are committed to using rigorous research to design, construct, and evaluate existing open source software to overcome these shortfalls.

### 2.3 Online Learning Community Research

In prior research, we examined how OLCs can facilitate knowledge sharing and foster learning in higher education [ 13], [ 15], [ 16]. Through survey analyses, this research showed that 60 percent of respondents agreed that the OLC aligned well with course-related learning goals and 82 percent of respondents agreed that the course community fostered learning. Furthermore, 74 percent of respondents agreed that making their work accessible to peers increased their motivation to perform quality work. Additionally, 54 percent of respondents felt that seeing their peers' work helped with their own work. Lastly, 85 percent of respondents indicated that the software provided an excellent medium for social interaction.

However, one shortcoming we have identified in our software centers on peer-to-peer interaction around online content, or rather the lack thereof. In other words, while students agreed that the software provided an excellent means for social interaction, few students were actually interacting with one another. Through a content analysis, we discovered that, on average, less than one blog comment existed per blog post. To remedy this, this research attempts to spur student-to-student interaction through a dynamic peer feedback system.

## Theory-Guided Research

Regardless of the technological approach an institution adopts, whether CMS software or OLC software, the technology must have concrete pedagogical underpinnings to ensure that learning objectives are supported by the appropriate media [17]. Any innovation that is steeped in good pedagogy will also provide opportunities for active participation, collaboration, and social interaction [18]. To ensure this is the case, our research is grounded in multiple theories of learning and interaction.

During the course of our research, we have developed a theoretical model to guide how system components can facilitate learning and interaction within an academic environment. Illustrated in Fig. 2, our model integrates three fundamental theories of learning and interaction and considers how different social technologies can facilitate learning, interaction, and community building within an academic OLC.

Fig. 2. Theoretical model illustrating how different aspects of Web 2.0 technologies catalyze individual learning and community.

### 3.1 Activity Theory—Technological Catalysts

While we should not consider technology as the driving force in all types of learning, technology plays a critical facilitating role within OLCs. Therefore, when designing our university's OLC, we first consider how individuals and groups interact with these technologies. From its origin, activity theory considers human actions to be directed at objects and mediated by artifacts [ 19]. More simply put, an activity is the way a subject (either an individual or a group) moves toward an end goal with the purpose of attaining certain results or accomplishing certain objectives [ 20].

Activity theory also considers aspects of motivation and engagement. In activity theory, activities are goal directed, where multiple ways exist to achieve those goals, oftentimes through adaptive means [ 21]. In educational environments, when instructors can choose activities from both online and face-to-face mediums, they can also select the activity that provides the best fit for any particular learning objective [ 22], [ 23].

Activity theory provides a useful guideline for evaluating human-computer interaction (HCI) in a field setting, such as an online environment [24]. It can be used as a lens for understanding sociotechnical networks as a function of technology, community, and the interaction between the two. When studying motivations behind blogging, Nardi et al. [25] applied activity theory to understand how blogs were used to communicate specific social purposes to others. In a study on higher education, Issroff and Scanlon [26] found that activity theory dictates that multiple factors exist that can impact the usage of any one specific technology.

In this research, activity theory is used as a guide for how individuals manipulate specific social technologies to accomplish course-based tasks and goals. These activities can include, but are not limited to, online discussion boards, instant messaging, blogging, and collaborative writing, which all can be used to share knowledge and build course community. To limit the scope for this research, we focus our measurements on new system components (i.e., a peer ratings system).

### 3.2 Constructivism—Student-Centric Learning

In an educational setting, activities must account for and accommodate an individual learner's needs. In many cases, a learner will manipulate many different technologies in different ways. Therefore, depending on the situation, a technology must meet the learning needs of individuals and be flexible in adapting to those needs.

Prior research has traced the roots of community to constructivism [ 27], [ 28], [ 29]. Constructivism has largely been attributed to the work of Piaget [ 30], who first theorized that learning can be based on the interaction and experiences of the learner within a specific context. Consequently, individuals develop knowledge and understanding through forming and continually refining concepts. There has been much research extending Piaget's work. Hagstrom and Wertsch [ 31] state that constructivism encourages, utilizes, and rewards the unique and multidimensional characteristics of the individual throughout the learning process. Additionally, Squires [ 32] states that constructivism focuses on learner control, with learners making decisions that match their own cognitive state and their own needs.

While constructivism began as a theory of learning, it has progressively been used as a theory of education, of the origin of ideas, and of both personal knowledge and scientific knowledge [ 33]. Dalsgaard [ 34] argues that social software can be used to support a constructivist approach to online learning. In this respect, social software can refer to any loosely connected application where individuals can communicate with one another, and track discussions across the Internet [ 35]. The development of any online learning tool should consider the learner's point of view across these discussions [ 36], providing them with a certain and needed level of control [ 30].

Once again, social software can support these philosophies, providing users with self-governing and individually motivating activities. As we continually manipulate our OLC, we also reinvestigate how each social technology can support the constructivist learning model.

### 3.3 Social Presence—Social Interaction

A number of theories look at the role people play in an OLC, including connectivism, social constructivism, behaviorism, social learning, situated learning, and social presence. These theories focus on how individuals learn in groups, interact, and collaborate with other members of the environment. Social presence theory asserts that individuals are influenced to a great extent by the surrounding members of a community. Social presence theory also considers the degree to which an individual's perception of an online community, in its entirety, affects his or her participation in that community [ 37]. In other words, social presence refers to a communicator's sense of awareness of the presence of an interaction partner.

Within human-computer interaction, social presence theory considers how “sense of community” is shaped and affected by technological interactions [ 39]. Tu and McIsaac [ 39] redefine social presence theory for computer mediated communication stating that it is the degree of feeling, perception, and reaction to another intellectual entity within a computer mediated environment.

Levels of social presence can be a critical factor that affects the quality of social interaction within a group, and can also influence the dynamic of the group [40]. Existing research indicates that high levels of social presence play a significant role in improving instructional effectiveness and building a sense of online community [41]. Richardson and Swan [42] and Shih and Swan [43] discovered that a student's perception of social presence in online courses was significantly related to overall satisfaction with the course, perceived learning, and instructor satisfaction. When measuring social presence in an online professional development class, Wise et al. [44] concluded that high social presence creates an approachable environment, and hence a more satisfying learning experience and greater learning. Delicious is a social bookmarking web service for storing, sharing, and discovering web bookmarks. In measuring annotations made by users of Delicious, Lee [45] discovered that individuals were more likely to include annotations (or more helpful information) with their bookmarks if they interacted with other individuals more frequently.

When individuals perceive others within an online CoP to be real, they can begin building trust in the community and also start to view the online community as a valid source of knowledge building and/or social interaction. Thus, when including new OLC components, it is critical to consider the composition of the community, including understanding the community itself as a unique entity. Furthermore, an OLC cannot thrive without a palpable sense of social presence. In this research, we look to transfer existing levels of social presence found within classroom and campus environments to the online realm.

## Toward a Dynamic Social Feedback System

Our research to date has showcased the potential that social technologies possess when guided by theory and carefully integrated with course learning objectives. However, while our research has shown that these technologies provide a beneficial method for students to reflect on course material, online interaction among students remained low, with less than one comment existing per blog post. A lack of feedback on blogs is not isolated to our system and is common across the Internet. In a more comprehensive study of 500 randomly selected blog posts from across the Internet, Mishne and Glance [46] discovered that only 15 percent of blog posts contained comments.

While a number of reasons may exist why some blogs receive comments and others do not, one possible reason why individuals choose not to leave feedback is the time required to construct a valuable comment. Sometimes users simply want to let the blogger know that they have read the post and whether or not they thought it was interesting. However, with no “quick” mechanism to do so, many users will opt not to undertake the tedious and time-consuming process of submitting a comment, which oftentimes involves

1. clicking to view the blog entry,
2. typing in the comment in the comment box,
3. clicking the submit button, and
4. awaiting acceptance of the comment by the blog owner.

Therefore, many blog posts, while viewed, remain uncommented, with many blog owners left wondering who, if anyone, has read their post.

We see a ratings system as a simple mechanism to elicit this desired feedback, offering students multiple ways to provide it.

### 4.1 Ratings Systems

As stated, one mechanism to elicit greater blogging interaction, while also helping to foster a community of active participants, is a peer ratings system, which can be integrated seamlessly with any blogging engine. A ratings system is a quick and easy way for users to leave an opinion or evaluation about an object. In ecommerce websites such as eBay, Yahoo! Auction, and Amazon, products are rated by consumers, thus adding to the collective information base for that product [47], [48], [49]. In ratings systems, individuals are often presented with simple 1-through-$x$ star rating schemas, where more stars indicate higher degrees of satisfaction or interest as perceived by a consumer.

Such a system may also provide the necessary input for reputation building, which can play an important role in sustaining a healthy OLC. Donath [50] discovered that the establishment of a reputation and recognition of others plays a vital role in building a user's identity. Consequently, reputations provide other members of an OLC insight into which individuals are providing the most valuable contributions. In 2005, Wasko and Faraj [51] discovered that a significant predictor of individual knowledge contribution centers on the perception that participation enhances one's professional reputation.

### 4.2 Ratings Systems in Education

Within the context of higher education, our proposed system is novel; we have not seen any studies incorporating online peer ratings systems into academic blogging. However, the notion of peer ratings in education is not new at all. In fact, many instructors use peer ratings as a mechanism to receive feedback on collaborative course projects. In education, where assessment has traditionally meant instructors grading student material, collaborative ratings systems may be viewed with suspicion. However, we believe that such a system can provide a new dimension to learning and social interaction, and thus should be incorporated into our OLC.

This belief is supported by educational research in the area of peer assessment. Johnston and Miles [ 52] found that students took peer assessment seriously and Pope [ 53] found that both peer and self-assessment contribute positively to a student's course performance. Johnston and Miles [ 54] further discovered that peer assessment allowed students to learn about their own effectiveness in a group setting and Somervell [ 55] found that peer assessment helped promote independent, reflective, and more critical learners. Research also suggests that peer assessment can help motivate student participation and foster student initiative to learn [ 56].

In this research, we design, construct, and implement an online peer ratings system for blogging, where students can assess the contributions of their peers. We assert that such a system will help motivate and better engage students within large classrooms by helping them focus on those contributions the community deems important. We anticipate that such a system will generate interest and interaction across the OLC and further foster learning.

## Artifact Design

In Design Science Research (DSR), the researcher is concerned with the way things ought to be in order to attain goals, and in order to achieve such goals the researcher devises artifacts [ 57]. In this research, the design and integration of a peer feedback system into the Elgg OLC software will constitute our IT artifact.

### 5.1 Elgg

Our feedback system was designed to integrate with Elgg, an open source social networking platform. Available through SourceForge.com, Elgg comes bundled with capabilities for blogging, file sharing, page creation, profile building, and peer-to-peer networking. More important for course instructors are advanced Elgg features that allow any member to create unlimited subcommunities. This feature provides instructors with the ability to create a new course community for each course they teach. Additionally, Elgg allows all users to restrict the visibility of the data they create across numerous levels, including an individual (or private) level, community level, logged-in-user level, and public level, as well as various custom levels. The software is also designed so that each user has his or her own set of Web 2.0 tools, distinct from those of the community. This feature provides students with their own customizable personal space.

### 5.2 Elgg Social Feedback System

In this research, we expand Elgg's blogging engine to provide members with a robust peer feedback system. Described in detail below, we began with the simple rule of thumb, “provide a simple way to allow individuals to provide feedback on a blog post and to an individual.” Additionally, since we are dealing with short turn-around times based on weekly blogging assignments, the system should integrate and calculate blog statistics instantaneously, in order to provide the community with real-time blog assessments.

For the look and feel of our ratings system interface, we mirror the design at Amazon.com, illustrated in Fig. 3a. Amazon.com provides more than a basic mechanism to rate items; it also provides a breakdown of those ratings.

Fig. 3. (a) The Amazon.com ratings system with ratings breakdown. (b) Proposed OLC ratings system with ratings breakdown.

Illustrated in Fig. 3b, our design closely resembles Amazon.com's product ratings, but is integrated with a blogging component instead. After a user enters his or her respective community blogs, the individual is presented with a chronological list of all blog posts across his or her community. Individuals can further drill down to view a complete breakdown of ratings for each blog post by hovering over the “Avg. Rating” hyperlink. If users own a blog post, they are presented with an additional hyperlink that, when hovered over, showcases the private feedback they have received. Illustrated in Fig. 4, private feedback is shown in similar fashion to the ratings breakdown.

Fig. 4. Pop-up window, which appears after a user rates a blog post. The AJAX pop-up window solicits an individual for additional private feedback for the blog author.

A comprehensive list of design considerations is included as follows:

1. Adapt existing open source software, Unobtrusive AJAX Rating Bars V.1.2.2 [ 58], to Elgg.
2. Prevent any self-rating biases [ 59], [ 60].
3. Utilize a ratings scale of 1 through 5 stars with 1 star being the lowest rating.
4. Provide a breakdown of ratings when hovering over the average rating (see Fig. 4).
5. Ratings are displayed and editable at the individual blog level. Ratings are displayed but not editable from the homepage.
6. Ratings appear directly below the blog post title.
7. If a user has not rated a blog, “Rate this blog” text appears next to the stars.
8. If a user is not logged in, “Log in to rate this blog post” text will appear next to the stars.
9. To avoid ballot stuffing, a user can rate a blog post only once and users who are not logged into the system cannot rate blog posts.
10. If a user has already rated a blog post, that user can change his or her vote by selecting a different rating.
11. A user can delete a blog post, subsequently deleting its ratings from his or her profile.
12. After rating an item, a dialog prompts users to input personalized feedback that only the blog poster can view (Fig. 5).
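The core rating rules in the list above (logged-in users only, no self-rating, a 1-5 star scale, one editable vote per user, and a hover breakdown of the average) can be sketched as follows. This is a minimal illustration in Python rather than Elgg's PHP, and the class and method names are hypothetical, not part of Elgg's actual API:

```python
from collections import Counter

class BlogPostRatings:
    """Hypothetical sketch of the rating rules; not Elgg's implementation."""

    STARS = range(1, 6)  # 1-5 star scale, with 1 star being the lowest

    def __init__(self, author_id):
        self.author_id = author_id
        self.votes = {}  # rater_id -> stars

    def rate(self, rater_id, stars):
        if rater_id is None:
            raise PermissionError("Log in to rate this blog post")
        if rater_id == self.author_id:
            raise PermissionError("Self-rating is not allowed")
        if stars not in self.STARS:
            raise ValueError("Rating must be 1 through 5 stars")
        # Re-rating overwrites the prior vote, so each user counts only
        # once; this is what prevents ballot stuffing.
        self.votes[rater_id] = stars

    def average(self):
        # Average shown next to the blog post title ("Avg. Rating").
        if not self.votes:
            return None
        return round(sum(self.votes.values()) / len(self.votes), 2)

    def breakdown(self):
        # Star-by-star tally revealed when hovering over the average.
        counts = Counter(self.votes.values())
        return {s: counts.get(s, 0) for s in self.STARS}
```

Because a re-rating simply replaces the earlier vote in `votes`, design considerations 9 and 10 (one rating per user, but editable) fall out of a single dictionary write.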

### 5.3 Further System Enhancements

The software was further enhanced to be intuitive for students, with all requirements and student expectations posted on the homepage of the course community.

Waiting for individuals to click on a blog post to view ratings was not enough; we deemed it necessary to create exposure to content immediately after a user logs into the system. Illustrated in Fig. 5, we modified the course homepage to showcase unrated and rated blog content from across the site. The reason was twofold. The primary reason was to make the site more user friendly. By providing users with a listing of site activity on the homepage, we reduced the number of clicks a user must make to review course content. A secondary goal centered on showcasing rated and unrated content from across the site. Most blogs are set to display posts in chronological order. In large classes, where individuals create hundreds of blog posts every week, this would mean that late blog posts are shown first, which would not be fair to students who get their work in early. To mix this up and ensure that no one blog post received preferential treatment, the homepage randomly selected posts.

Fig. 5. Screenshot of the OLC homepage showcasing one unrated blog assignment and a list of six rated blog posts in randomized order.

It should also be noted that users could not add blog feedback directly from the homepage. Rather, they were required to click and view the complete blog post before providing any feedback. Additionally, a user's blog posts were not included in the randomly selected posts on the homepage. Rather, a user could review their posts from the “Your Blog” link in the topmost navigation bar.
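The homepage selection just described (exclude the viewer's own posts, surface unrated content, and randomize rated posts so no post gains preferential chronological placement) can be sketched as follows. The function name and post representation are hypothetical assumptions for illustration; Elgg's actual implementation is in PHP and differs:

```python
import random

def homepage_listing(posts, viewer_id, n_rated=6, rng=random):
    """Hypothetical sketch of the homepage selection logic."""
    # The viewer's own posts are never shown on the homepage; users
    # review their own work through the "Your Blog" link instead.
    others = [p for p in posts if p["author"] != viewer_id]
    unrated = [p for p in others if not p["ratings"]]
    rated = [p for p in others if p["ratings"]]
    # Randomize so late posts are not always shown first.
    rng.shuffle(unrated)
    rng.shuffle(rated)
    # Mirror Fig. 5: one unrated blog assignment plus up to six rated posts.
    return unrated[:1], rated[:n_rated]
```

A real deployment would read posts from the community database and re-randomize on each page load; the sketch only captures the selection and fairness rules.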

## Research Design

Our experiment targeted specific information systems courses and can be categorized as a one-group, posttest-only quasi-experimental design, since no random assignment was performed. Similar to a field experiment, we measure the effects of the feedback system on a specific population within an existing organization. While the organization, a US-based university, is not a “naturally” occurring setting, preexisting baselines exist against which to compare results.

### 6.1 Student Participation

All students were set up with accounts prior to the first day of class. During the course of one semester, students were periodically required to create blog posts. The blog topics ranged from open-ended reflective pieces based on topics discussed the previous week to critical reading and writing pieces centered on topics taken from the course textbook. In each assignment, students were required to refer back to the textbook to support their arguments. As directed on the homepage, students were also required to read and rate three blog posts each week. Ratings were used to determine the credit awarded for each blog post. In total, blog posts accounted for 25 percent of the homework grade, and total feedback was accumulated as participation points, which went toward extra credit on course exams.

While peer ratings were used to determine the homework grade, students could utilize the different feedback mechanisms for asynchronous participation points. The first and easiest way for an individual to participate in the OLC was to rate a blog post using the peer ratings system. After rating a blog post, a user could then choose to leave ratings feedback in a pop-up comment box. This feedback was only made viewable to the blog author. A third method was to leave a public blog comment, which remained viewable to the community.
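The three participation mechanisms above could be tallied as in the sketch below. The paper does not publish a points formula, so the function and its default weights are purely hypothetical assumptions, chosen only to reflect the increasing effort of each mechanism (rating < private ratings feedback < public comment):

```python
def participation_points(ratings_given, private_feedback, public_comments,
                         weights=(1, 2, 3)):
    """Hypothetical tally of asynchronous participation; the weights are
    illustrative assumptions, not the grading scheme used in the course."""
    w_rating, w_private, w_public = weights
    return (w_rating * ratings_given
            + w_private * private_feedback
            + w_public * public_comments)
```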

### 6.2 Research Questions and Hypotheses

The semester-long experiment sought to explore the following research questions:

1. Will blog feedback impact course learning in large classroom environments?
2. Will blog feedback impact social interaction in large classroom environments?
3. Will blog feedback impact course motivation in large classroom environments?

Additionally, the following two hypotheses were generated to measure the impact of the feedback system on learning and social interaction:

Hypothesis 1 (H1).
A social feedback system for blogs will positively impact perceived learning.
Hypothesis 2 (H2).
A social feedback system for blogs will positively impact perceived social interaction.

## Results

A survey analysis and content analysis of OLC use and grading outcomes were used to determine the software's success. Qualitative responses were also collected for further insights into the effectiveness of the software. In total, 192 students completed the end-of-semester survey resulting in a response rate of 86 percent.

### 7.1 Course Statistics

Table 2 provides a detailed breakdown of site and course-related statistics. The study involved a single introductory course on information systems, titled IDS180, with 223 students enrolled. Traffic across the site was strong, and students averaged 100 page views over the 16-week semester (or roughly seven page views each week). On average, students created five blog posts, 20 ratings, 7.5 ratings feedback entries, and 8.5 blog comments. The class grade point average was 2.52 (a B-/C+).

Table 2. OLC Usage Statistics

### 7.2 Content Analysis—Site Activity versus Grades

Figs. 6a and 6b showcase trends in blogging and value-added blogging versus letter grades. Students were required to create a minimum of four blog posts across the semester. On average, students who exceeded this minimal blogging threshold also passed the course. The term value-added blogging was used to identify blogging add-on features, or items that could contribute to the value of a blog post: public blog comments, public blog ratings, and private ratings feedback. Data from value-added blogging yielded stronger trends. Individuals with the greatest levels of feedback performed best in the course, while individuals with the fewest weekly contributions performed the worst.

### 7.3 Survey Analysis—Perceived System Usage

Survey questions asked students how frequently they used the blogging system and feedback mechanisms. As detailed in Fig. 7, the majority of students reported using the system components weekly or biweekly in order to fulfill the course requirements. On average, students blogged weekly or biweekly (44 and 38 percent, respectively). For asynchronous participation credit, students provided a substantial number of comments, ratings, and ratings feedback. Responses indicated that 80 percent of individuals rated posts weekly or biweekly. Students also responded positively to providing textual feedback, including blog comments and ratings feedback. Across these features, 49 percent indicated weekly or biweekly blog commenting, and 48 percent indicated that they provided private feedback after rating blog posts weekly or biweekly.

Fig. 7. Blogging activity versus activity frequency.

### 7.4 Survey Analysis—Perceived Learning

Detailed fully in Table 3, survey questions also asked students to respond to general perceptions on the OLC, the peer feedback system, and their impact on perceived levels of learning and social interaction. Overall, 55 percent of respondents stated that the course community increased learning, with 18 percent disagreeing. Blogging was shown to have the greatest impact on perceived learning, with 67 percent of respondents stating that blogging increased learning for the course while only 14 percent disagreed with this statement.

Table 3. Survey Results—Perceived Learning

With respect to specific subcomponents of the feedback system, 33 percent of respondents indicated that ratings contributed to learning, 25 percent of respondents indicated that ratings feedback contributed to learning, and 35 percent of respondents indicated that comments contributed to learning.

### 7.5 Hypothesis Testing—Perceived Learning

Hypothesis 1 stated that blog feedback would have a positive impact on course learning. Using Pearson's Product Moment Correlation Coefficient (PMCC), we measured the correlation between the frequency with which students used each software component and their perceived levels of learning. The PMCC critical value with 200 degrees of freedom (df) and a significance level of $p < .05$ is 0.138. As detailed in Table 4, correlations for each component show significance at these levels. Consequently, we can reject the null hypothesis that the observed association occurred due to chance and conclude that each component had a positive impact on perceived learning.
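The test above can be sketched in a few lines of Python. This is an illustrative reconstruction with made-up data (the actual survey responses are not reproduced in the paper); the second half verifies the critical value of 0.138 cited above from its relationship to the t distribution.

```python
import math

def pearson_r(xs, ys):
    """Sample Pearson product-moment correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical 5-point Likert responses for a handful of students:
# usage frequency of one component vs. perceived learning.
usage = [5, 4, 4, 3, 2, 5, 3, 1, 4, 2]
learning = [5, 4, 5, 3, 2, 4, 3, 2, 4, 1]
r = pearson_r(usage, learning)

# The critical value relates to the t distribution via
# r_crit = t_crit / sqrt(t_crit**2 + df).  For df = 200 and a
# two-tailed p < .05, t_crit is roughly 1.972, which gives the
# threshold of about 0.138 used in the analysis above.
t_crit = 1.972
r_crit = t_crit / math.sqrt(t_crit ** 2 + 200)
print(f"r = {r:.3f}, critical value = {r_crit:.3f}")
```

Any observed correlation exceeding the critical value leads to rejecting the null hypothesis at that significance level.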

### 7.6 Survey Analysis—Perceived Interaction

A posttest questionnaire was used to acquire student perceptions of the OLC, the feedback system, and its impact on perceived levels of social interaction. As detailed in Table 5, overall responses were mixed. With respect to the OLC overall, 30 percent responded that the community increased interaction with their peers, while 35 percent disagreed with this statement. Perceptions toward course blogging were slightly higher: 39 percent of respondents indicated that blogging increased interaction, with 36 percent disagreeing. It should be noted that a large number of individuals remained neutral across these statements.

Table 4. Hypothesis Testing—Perceived Learning

Table 5. Survey Results—Perceived Social Interaction

With respect to the subcomponents of our feedback system, 39 percent of respondents indicated that ratings increased interaction, with only 29 percent disagreeing. Additionally, 37 percent of respondents indicated that ratings feedback increased interaction with classmates, with only 25 percent disagreeing. Finally, 35 percent of respondents indicated that comments increased interaction, with 27 percent disagreeing.

### 7.7 Hypothesis Testing—Perceived Interaction

Hypothesis 2 stated that a blog feedback system would have a positive impact on course interaction. Using Pearson's PMCC, we analyzed these responses, looking specifically at the frequency with which students used these components and their perceived levels of social interaction. The critical value for Pearson's correlation coefficient with 200 degrees of freedom and a significance level of $p < .05$ is 0.138. As detailed in Table 6, the correlations for each component were shown to be significant at these levels. Consequently, we can reject the null hypotheses that the impact made by these system components occurred due to chance and conclude that each made some positive impact on various aspects of perceived social interaction.

Table 6. Hypothesis Testing—Perceived Interaction

### 7.8 Qualitative Feedback

Open-ended feedback was also collected and allowed students to express their overall perceptions of the software. Over 100 students provided feedback on many aspects of the system from general usability to pedagogical aspects. Overall, responses were positive with many students acknowledging the benefits of course reflection through weekly blogging and feedback. While some individuals felt that the blogging requirement was tedious, other students stated that blogging did help with learning and interaction. One student stated, “It was easy to post blogs and it was interesting seeing other's perspectives.” And another student wrote, “I enjoyed it and it helped me study for tests.”

Some students commented on the lack of diversity in blog posts. In a class of 223 students, blog content tended to look repetitive, since each student was required to respond to a single biweekly blog assignment. Consequently, students expressed interest in having a greater variety of topics to choose from. Another student suggested that the assignments be more open ended.

Additionally, some students were skeptical toward the benefits of ratings. One student stated, “It sometimes felt that people would not fairly grade someone's blog.” Another student stated, “I think it was easy for students to not be very thorough, making the community not that helpful.” One other student stated, “I only rate for points and I don't actually read [blog posts].”

## Discussion

Existing CMS software provides faculty and students with an asynchronous environment for communication and information sharing. However, these systems fall short in providing members with a collaborative learning space that mirrors a traditional classroom setting. Additionally, in large classroom settings, where interaction within the classroom becomes difficult, students should be afforded some opportunity to connect with their peers. Asynchronous online software, composed of various social technologies, can accommodate these needs.

With the exponential growth of online social software in recent years, many instructors have adopted social software and adapted it to learning. One popular social technology with measured success in the classroom is blogging. Blogging allows individuals to reflect on course material and present these reflections in a semipublic environment (i.e., a classroom environment). However, a problem with blogging centers on the lack of comments any one blog post receives. Particularly in academic settings, where students count on feedback and interaction for learning purposes, a lack of feedback can discourage some students. The challenge of ameliorating this problem becomes even more difficult in class sizes exceeding 200 students.

In this research, we incorporate Activity Theory, Social Constructivism, and Social Presence to further guide how social technology can better engage learners within large classrooms. We build on prior OLC research to integrate a peer feedback system into the Elgg OLC blogging engine. Our system was carefully tied to the learning objectives of the course and offered students multiple methods for delivering feedback to their peers. With the greater goal of enhancing learning and social interaction, multiple feedback mechanisms, such as blog commenting, blog rating, and ratings feedback, allowed students to connect in the manner with which they felt most comfortable.

During the course of the semester, students utilized each of the different feedback mechanisms. While some students preferred to leave a general assessment (i.e., a blog rating), some students offered more than a general rating and included private messages to the blog owner as well. Other students chose to leave public feedback that the blog owner and class could view.

To the best of our knowledge, there is no existing research that looks at online social feedback systems as mechanisms to drive learning and interaction within large classroom environments. Our results are largely positive and show that such a system can help foster asynchronous peer-to-peer communication, motivate students to play a more active role within the course community, and increase perceived levels of learning.

In prior research, we found it difficult to measure the impact of our system because adoption was voluntary and not explicitly tied to course learning objectives. In this research, we revisit the theoretical underpinnings of Activity Theory and bring asynchronous course activities to the forefront of learning. For course credit, students were required to read and provide weekly feedback on course blogging assignments. Students responded positively, and adoption rates exceeded 95 percent. Since participation was not optional, it was important that students agreed that the OLC aligned with the objectives of the course; 74 percent of students responded that this was the case.

As an added incentive to publish quality blog posts and comment on the posts of their peers, peer ratings were used to determine the grade for a blog post. Consequently, if five students rated a blog post 4.5, that post was awarded five points (from a possible five points), and if a blog post received a rating of 2, that post was awarded two points (from a possible five points). It should be noted that instructors did examine posts for quality, and if it was determined that a post was unfairly awarded a low score, that grade was amended.
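The grading rule above can be sketched as follows. This is an illustrative reconstruction: the paper does not state how fractional rating averages were rounded, so rounding half up (which matches the example of an average of 4.5 earning the full five points) is an assumption.

```python
import math

def blog_post_points(peer_ratings, max_points=5):
    """Convert peer star ratings (1-5) into points for a blog post.

    Assumes the average rating is rounded half up, consistent with
    the examples above (an average of 4.5 earns the full 5 points).
    """
    if not peer_ratings:
        return 0
    avg = sum(peer_ratings) / len(peer_ratings)
    return min(max_points, math.floor(avg + 0.5))

print(blog_post_points([4.5, 4.5, 4.5, 4.5, 4.5]))  # 5 points
print(blog_post_points([2, 2, 2, 2, 2]))            # 2 points
```

As noted above, instructors could amend a score they judged unfairly low, so a computed score of this kind would serve as a default rather than a final grade.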

Further incentives were added to help stimulate greater activity. Students were awarded bonus points on exams for high-traffic and high-rated blog posts. As one final measure, each week, top blog posts were showcased at the start of each lecture to further stimulate interest in the course and the online community.

### 8.2 Peer Feedback on Course Learning

A primary goal of our research has been to offer students an asynchronous means to learn through reflective blog assignments and social learning.

One of the challenges we have faced throughout our research has centered on measuring our system's impact on student learning. In this study, we use a correlation analysis to show statistical significance between perceived system use and perceived learning. However, perceived learning does not necessarily reflect actual learning. Therefore, to get a more accurate depiction of course learning, we looked at system use and course grades. Through a content analysis of system usage, we discovered that students using the system more frequently, on average, received higher grades across the course.

Yet, survey responses showed little indication that the feedback mechanism aided in learning; on average, 41 percent of responses were neutral. One explanation could be related to system limitations. At the time of our system's release, the OLC was unable to send automated e-mail notifications when feedback was generated. The majority of students logged into the system weekly or biweekly to complete course assignments. If the OLC had been capable of sending notifications when feedback was generated, students might have been persuaded to log in more frequently and add to the community dialog. This system limitation has been corrected in new releases of the software.

During the course of our research, we have been asked why we thought ratings would contribute to learning. Our response has always been that they contribute not directly but indirectly. Ratings provide metadata, or data about blog posts. While a single rating, or group of ratings, does not contribute directly to learning, it does provide the reader with some knowledge before they read a post and can alter an individual's perception of that post. Understanding how the group views content may prompt an individual to reevaluate their initial viewpoint. And from the blog author's perspective, it is important to understand how one's own perspective is valued within the context of the community. Knowing one's peers will review and rate one's work offers considerable motivation to write better quality blog posts.

These concepts were acknowledged in additional survey responses. Students responded that blog feedback did help them think more critically while writing blog posts (51 percent of respondents agreed versus 20 percent who disagreed). Additionally, there was a general level of agreement that ratings helped individuals to think more critically while reading blog posts as well (52 percent of respondents agreed versus 18 percent who disagreed).

Lastly, the ratings system allowed students to discover interesting content faster. The star ratings were designed to stand out, presenting individuals with an eye-catching, high-level assessment of a blog post. The goal was to pique a user's curiosity as to why some blogs were rated highly and others were not.

### 8.3 Social Feedback on Course Interaction

A secondary goal of our research has been to offer students an asynchronous means to interact with their peers.

Universities are inherently social settings. However, when individuals enter courses in excess of 200 students, the possibilities for interaction and building social ties within the classroom become drastically reduced, if not eliminated altogether. While somewhat ironic, it appears that the larger the class size, the more difficult it is to meet and learn from one another within the classroom. The goal of this research was to provide students with an OLC that allows them to interact asynchronously within a virtual environment. Prior research has suggested that such a space does help.

Surprisingly, survey responses showed little indication (37 percent student agreement) that the blog feedback mechanism enhanced social interaction, and, on average, student responses remained neutral (36 percent). These responses are surprising given that over 8,000 feedback entries were created across 1,100 blog posts (or roughly seven per blog post). Additionally, students were required to post avatars and complete basic profile information to ensure that interactions across the site were not faceless. One possible explanation could be related, again, to our system's inability to send e-mail notifications when an interaction occurred. It was therefore up to the student to discover those interactions (i.e., check their blog posts for feedback), rather than be made aware of an interaction through an e-mail or message. Had this capability existed, students may have been more inclined to log in, view comments, and respond in a timely manner, rather than the average weekly or biweekly logon.

Aware of this system limitation, the instructor made additional attempts at enhancing social interaction. Social presence theory considers how individuals perceive one another in an online community of practice. To reemphasize the student element, select blog posts were displayed at the beginning of each lecture along with the author's profile icon and the feedback they received. This helped to revisit learning points from course lectures and to reemphasize that each post and rating was tied to a specific student. It also helped to showcase best practices in blogging, providing in-class acknowledgment to those students. Many students enjoyed the classroom dynamic when a user's profile icon was displayed along with an excerpt of their blog post. The exercise stimulated the lecture and helped generate in-class discussion.

Another possible explanation for why perceived social interaction was low surfaced in the qualitative responses. Some responses indicated a lack of sincerity or quality in the feedback received, which could certainly have diminished perceptions of valuable interaction across the OLC. During the third and fourth blogging assignments, canned responses from the same users, such as “Great Job!” or “Nice Work!,” began to surface.

### 8.4 Introduction of Social Software

Lastly, while the merits of our peer feedback system on learning and social interaction can be debated, the introduction of social software into the classroom continues to provide numerous tangible and intangible benefits not measured completely in this study.

In today's dynamic business world, social software is rampant across most, if not all, industries. Consequently, exposing students in an introductory information systems class to various social or Web 2.0 technologies, including blogging, tagging, collaborative writing, file sharing, and profile creation, may go a long way toward preparing those students for industry.

Additionally, our research reinforces the notion that social software can offer a powerful resource for communicating course objectives. An underlying aim of our research has been to continually push academic institutions and CMS developers to build more dynamic and student-centric institutional software. It is our belief that learners of all types are increasingly exposed to powerful social software in their personal and professional lives and that these competencies should be integrated into our institutions of learning. Consequently, instructors should challenge students across a broader range of new media, providing students with critical new skills in the process.

## Limitations and Next Steps

We understand that limitations in our research exist. An important limitation in this study stems from steps taken to ensure student confidentiality. As required by our university's research review board, survey data received were encoded and made anonymous. Therefore, we were unable to link survey responses to actual system usage and student outcomes. This meant that we were not able to triangulate between perceived learning, system usage, and student grades. Along similar lines, while we link course grades to system use, our research does not measure actual learning. Rather, we focus our results on perceived learning.

A technical limitation involved the software's mail server, which prevented automated e-mails from being generated and sent to students each time feedback was created. Such e-mails could have gone a long way toward engaging students further, prompting them to log in to the system to review site changes and new content.

Finally, the current version of the software was running Elgg Version 0.8 (a beta version of the Elgg software). Launched in 2004, the Elgg 0.8 interface is not as clean and sleek as many of today's social networking sites. That being said, this version of Elgg has shown successful results in prior studies. For upcoming semesters, our research team has decided to upgrade to Elgg Version 1.2 and to move the system to an external host, which also corrects the issue with the mail server.

## Conclusion

In this research, we measure the impact of OLC software within a large classroom environment. The software provides individuals with the ability to build online profiles, post blogs, and read, rate, and comment on the posts of their peers. Through Design Science research, we enhance the Elgg social networking platform to provide students with a peer ratings system, which offers students multiple methods of giving feedback on peer blog posts. Our results showed that higher levels of system use corresponded with higher levels of perceived learning. Results also showed that higher levels of system use corresponded with higher levels of perceived social interaction.

Additionally, we found positive trends between system usage and course grades. Our research reinforces the idea that social software can foster higher levels of course learning through openness and collaboration and can align very well with course learning objectives. Our research is particularly relevant to teaching pedagogy in large classroom environments where in-class, student-to-student interaction can be minimal.

## References

• 1. S. Milgram, “The Small World Problem,” Psychology Today, vol. 1, no. 1, pp. 60-67, May 1967.
• 2. D.A. Williamson, “Social Network Marketing: Ad Spending and Usage,” Social Network Marketing, 2008.
• 3. D. Sharma, “Social Networking God: 350+ Social Networking Sites,” Mashable.com, http://mashable.com/2007/10/23/social-networking-god, 2009.
• 4. E. Stacey, “Social Presence Online: Networking Learners at a Distance,” Education and Information Technologies, vol. 7, no. 4, pp. 287-294, 2002.
• 5. S. Redfern, and N. Naughton, “Collaborative Virtual Environments to Support Communication and Community in Internet-Based Distance Education,” J. Information Technology Education, vol. 1, no. 3, pp. 201-211, 2002.
• 6. M. Rohde, L. Reinecke, B. Pape, and M. Janneck, “Community-Building with Web-Based Systems - Investigating a Hybrid Community of Students,” Computer Supported Cooperative Work, vol. 13, nos. 5/6, pp. 471-499, 2004.
• 7. A. Quan-Haase, “Trends in Online Learning Communities,” SIGGROUP Bull., vol. 25, no. 1, pp. 2-6, 2005.
• 8. W. Brescia, and M. Miller, “What's It Worth? The Perceived Benefits of Instructional Blogging,” Electronic J. for the Integration of Technology in Education, vol. 5, pp. 44-52, 2006.
• 9. H.S.Y. Song, and M.C. Yuen, “Educational Blogging: A Malaysian University Students' Perception and Experience,” Proc. Australasian Soc. for Computers in Learning in Tertiary Education (ASCILITE '08), 2008.
• 10. C. Cook, R. Owston, and D.K. Garrison, Blended Learning Practices at COHERE Univ., York Univ. Inst. for Research on Learning Technologies, 2004.
• 11. P. Hernández-Ramos, “Web Logs and Online Discussions as Tools to Promote Reflective Practice,” The J. Interactive Online Learning, vol. 3, no. 1, pp. 1-16, 2004.
• 12. W. Richardson, “The Educator's Guide to the Read/Write Web,” Educational Leadership, vol. 63, no. 4, pp. 24-27, 2005.
• 13. B. Thoms, N. Garrett, J.C. Herrera, and T. Ryan, “Understanding the Roles of Knowledge Sharing and Trust in Online Learning Communities,” Proc. 41st Ann. Hawaiian Int'l Conf. System Sciences (HICSS '08), Jan. 2008.
• 14. J. Vassileva, “Toward Social Learning Environments,” IEEE Trans. Learning Technologies, vol. 1, no. 4, pp. 199-214, Oct.-Dec. 2008.
• 15. B. Thoms, N. Garrett, and T. Ryan, “Online Learning Communities in the New ‘U’,” Int'l J. Networking and Virtual Organisations, vol. 6, no. 5, pp. 499-517, 2009.
• 16. B. Thoms, N. Garrett, M. Soffer, and T. Ryan, “Resurrecting Graduate Conversation through an Online Learning Community,” Int'l J. Information Comm. and Technology Education, vol. 4, no. 3, pp. 341-350, 2008.
• 17. G. Conole, and M. Oliver, “A Pedagogical Framework for Embedding C&IT into the Curriculum,” Assoc. for Learning Technology J., vol. 6, pp. 4-16, 1998.
• 18. R.E. Ferdig, “Assessing Technologies for Teaching and Learning: Understanding the Importance of Technological Pedagogical Content Knowledge,” British J. Educational Technology, vol. 37, no. 5, pp. 749-760, 2006.
• 19. L.S. Vygotsky, Mind in Society: The Development of Higher Psychological Processes. Harvard Univ., 1987.
• 20. G.C. Neto, A.S. Gomes, J. Castro, and S. Sampaio, “Integrating Activity Theory and Organizational Modeling for Context of Use Analysis,” Proc. Latin Am. Conf. Human-Computer Interaction, Oct. 2005.
• 21. S. Bødker, “A Human Activity Approach to User Interfaces,” Human-Computer Interaction, vol. 4, no. 3, pp. 171-195, 1989.
• 22. Y. Mor, J. Tholander, and J. Holmberg, “Designing for Constructionist Web-Based Knowledge Building,” Proc. Conf. Computer Support for Collaborative Learning: Learning 2005: The Next 10 Years, pp. 450-459, 2005.
• 23. R. Heckman, and H. Annabi, “Cultivating Voluntary Online Learning Communities in Blended Environments,” J. Asynchronous Learning Networks, vol. 10, no. 4, 2006.
• 24. K. Kuutti, “Activity Theory as a Potential Framework for Human Computer Interaction Research,” Context and Consciousness: Activity Theory and Human Computer Interaction, B. Nardi, ed., pp. 17-44, MIT, 1995.
• 25. B.A. Nardi, D.J. Schiano, and M. Gumbrecht, “Blogging as Social Activity, or, Would You Let 900 Million People Read Your Diary?” Proc. ACM Conf. Computer Supported Cooperative Work, pp. 222-231, 2004.
• 26. K. Issroff, and E. Scanlon, “Case Studies Revisited- What Can Activity Theory Offer?” Proc. First Euro-Computer Supported Collaborative Learning Conf., 2001.
• 27. C.M. Johnson, “A Survey of Current Research on Online Communities of Practice,” Internet and Higher Education, vol. 4, pp. 45-60, 2001.
• 28. R. Palloff, and K. Pratt, Building Learning Communities in Cyberspace: Effective Strategies for the Online Classroom. Jossey-Bass, 1999.
• 29. J.R. Savery, and T.M. Duffy, “Problem Based Learning: An Instructional Model and Its Constructivist Framework,” Constructivist Learning Environments: Case Studies in Instructional Design, B. Wilson, ed., Educational Technology, 1996.
• 30. J. Piaget, The Origins of Intelligence in Children. Int'l Univ., 1952.
• 31. F. Hagstrom, and J.V. Wertsch, “Grounding Social Identity for Professional Practice,” Topics in Language Disorders, vol. 3, no. 24, pp. 162-173, 2004.
• 32. D. Squires, “Educational Software and Learning: Subversive Use and Volatile Design,” Proc. 32nd Ann. Hawaii Int'l Conf. System Sciences (HICSS '99), Jan. 1999.
• 33. M.R. Matthews, “Constructivism and Science Education: A Further Appraisal,” J. Science Education and Technology, vol. 11, no. 2, pp. 121-134, 2002.
• 34. C. Dalsgaard, “Social Software: E-Learning beyond Learning Management Systems,” http://www.eurodl.org/materials/ contrib/2006/Christian_Dalsgaard.htm. Oct. 2007.
• 35. M. Teeper, “The Rise of Social Software,” netWorker, vol. 7, pp. 18-23, 2003.
• 36. E. Soloway, S. Jackson, J. Klein, C. Quintana, J. Reed, J. Sptulnik, S.J. Stratford, S. Studer, S. Jul, J. Eng, and N. Scala, “Learning Theory in Practice: Case Studies of Learner-Centered Design,” Proc. Conf. Human Factors in Computing Systems (CHI '96), 1996.
• 37. J.A. Short, E. Williams, and B. Christie, The Social Psychology of Telecommunications. John Wiley & Sons, 1976.
• 38. F. Biocca, C. Harms, and J.K. Burgoon, “Toward a More Robust Theory and Measure of Social Presence: Review and Suggested Criteria,” Presence: Teleoperators and Virtual Environments, vol. 12, pp. 456-480, 2003.
• 39. C.H. Tu, and M. McIsaac, “The Relationship of Social Presence and Interaction in Online Classes,” The Am. J. Distance Education, vol. 16, pp. 131-150, 2002.
• 40. W. Hung, “Building Learning Communities by Enhancing Social Presence: Implementing Blended Instructional Delivery Methods,” SIGGROUP Bull., vol. 24, no. 3, pp. 79-84, 2003.
• 41. C.N. Gunawardena, and F.J. Zittle, “Social Presence as a Predictor of Satisfaction within a Computer-Mediated Conferencing Environment,” The Am. J. Distance Education, vol. 11, pp. 8-26, 1997.
• 42. J.C. Richardson, and K. Swan, “Examining Social Presence in Online Courses in Relation to Student's Perceived Learning and Satisfaction,” J. Asynchronous Learning Networks, vol. 7, no. 1, pp. 68-88, 2003.
• 43. L. Shih, and K. Swan, “Fostering Social Presence in Asynchronous Online Class Discussions,” Proc. Conf. Computer Support for Collaborative Learning, pp. 602-606, 2005.
• 44. A. Wise, J. Chang, T. Duffy, and R. del Valle, “The Effects of Teacher Social Presence on Student Satisfaction, Engagement, and Learning,” Proc. Sixth Int'l Conf. Learning Sciences, pp. 568-575, 2004.
• 45. K.J. Lee, “What Goes around Comes Around: An Analysis of Del.icio.us as Social Space,” Proc. 20th Anniversary Conf. Computer Supported Cooperative Work, pp. 191-194, 2006.
• 47. P. Resnick, R. Zeckhauser, E. Friedman, and K. Kuwabara, “Reputation Systems: Facilitating Trust in Internet Interactions,” Comm. ACM, vol. 43, no. 12, pp. 45-48, 2000.
• 48. J. Sabater, and C. Sierra, “Social ReGreT, A Reputation Model Based on Social Relations,” ACM SIGecom Exchanges, vol. 3, no. 1, pp. 44-56, 2001.
• 49. H. Mengshu, L. Xianliang, Z. Xu, and Z. Chuan, “A Trust Model of P2P System Based on Confirmation Theory,” ACM SIGOPS Operating Systems Rev., vol. 39, no. 1, pp. 56-62, 2005.
• 50. J.S. Donath, “Identity and Deception in the Virtual Community,” Communities in Cyberspace, M.A. Smith and P. Kollock, eds., pp. 29-59, Routledge, 1999.
• 51. M.M. Wasko, and S. Faraj, “Why Should I Share? Examining Social Capital and Knowledge Contribution in Electronic Networks of Practice,” MIS Quarterly, vol. 29, no. 1, pp. 35-57, 2005.
• 52. L. Johnston, and L. Miles, “Assessing Contributions to Group Assignments,” Assessment and Evaluation in Higher Education, vol. 29, no. 6, pp. 751-768, 2004.
• 53. N.K. Pope, “The Impact of Stress in Self- and Peer Assessment,” Assessment and Evaluation in Higher Education, vol. 30, no. 1, pp. 51-63, 2005.
• 54. L. Johnston, and L. Miles, “Assessing Contributions to Group Assignments,” Assessment and Evaluation in Higher Education, vol. 29, no. 6, pp. 751-768, 2004.
• 55. H. Somervell, “Issues in Assessment, Enterprise and Higher Education: The Case for Self-, Peer and Collaborative Assessment,” Assessment and Evaluation in Higher Education, vol. 18, pp. 221-233, 1993.
• 56. Y. Rafiq, and H. Fullerton, “Peer Assessment of Group Projects in Civil Engineering,” Assessment and Evaluation in Higher Education, vol. 21, pp. 69-81, 1996.
• 57. H. Simon, The Sciences of the Artificial, third ed. MIT, 1996.
• 58. Masuga Design, “Unobtrusive Ajax Star Rating,” http://masugadesign.com/the-lab/scripts/unobtrusive-ajax-star-rating-bar, July 2008.
• 59. R.D. Goffin, and D.W. Anderson, “The Self-Rater's Personality and Self-Other Disagreement in Multi-Source Performance Ratings: Is Disagreement Healthy?” J. Managerial Psychology, vol. 22, no. 3, pp. 271-289, 2007.
• 60. D.B. Kaufman, R.M. Felder, and H. Fuller, “Peer Ratings in Cooperative Learning Teams,” Proc. Ann. American Society for Engineering Education Meeting, 1999.