Relationship between identity management and reputation management. We define reputation as a component of an identity, and consequently, we establish the relationship between identity management and reputation management.
Reputation assessment in learning environments. We propose a contextual reputation assessment technique within a learning environment.
Supporting trust while preserving privacy. We address the challenge of supporting trust while preserving privacy by devising a privacy-preserving reputation management solution.
Implementation. As a proof of concept, we implement and evaluate our solution in an online learning environment.
Peer tutoring. Peer tutoring is a widely practiced learning method and the main idea behind forming an online community of practice. A learner needs to trust the competence and benevolence of their peer tutors. In a tutoring activity, a tutee shares her weaknesses with the expectation that her privacy will be preserved. A privacy breach may put the tutee in a disadvantageous or embarrassing situation. Privacy and trust concerns can easily demotivate learners from participating in peer-tutoring activities.
Peer reviewing. Online portfolios are becoming increasingly common for engaging learners in peer reviewing and assessment. These portfolios contain various pieces of sensitive information, such as tests and test scores, projects, and self-reflections. Access to an e-portfolio therefore has privacy implications. Learners need to decide whom they should trust with their e-portfolio items.
Learning object selection. The selection of a suitable learning object requires making a trust decision of a sort. This trust may involve trusting (the reliability of) a learning object, trusting (the competence of) the author of the learning object, or trusting (the competence or authoritativeness of) the recommender of the learning object.
Collaboration. Trust is essential to successful collaboration among learners [4], [5]. Online collaboration can cause stress depending on the level of the collaborators' mutual trust [6]. If trust is not present in a relationship, a large amount of energy is wasted in checking up on the other's commitments and on the quality of their work. In a learning environment, key relationships such as recommender-recommendation seeker, peer-peer, helper-helpee, and mentor-mentee are formed based on mutual trust. Privacy concerns are inherent in a collaborative environment; they originate from individuals' desire to control how they are perceived by others [7].
Group learning. Group learning, in the form of a discussion forum or a reading group, offers a valuable learning experience. A group functions well when its members trust one another and respect each other's privacy. An online learning system should facilitate such a trust- and privacy-preserving learning environment.
Evaluation. Confidentiality is very important in the learner assessment and evaluation process. Learners sometimes experience biases such as gender, ethnicity, or connectedness (being more connected to the evaluator). Biases in learner evaluation can be prevented through privacy-preserving techniques [8]. In a trust relationship, learners' confidence in the fairness of evaluation can grow.
Role playing. Role playing is an effective technique for exploring complex social issues in certain courses (such as Sociology). Safety is an essential condition for authentic role playing. When a learner plays a controversial role, the learner runs the risk of being stigmatized or embarrassed. For example, when speaking in favor of same-sex marriage, a learner may fear being ridiculed. Learners' safety can be assured through a trusting and privacy-preserving learning environment.
Personalization. Personalization of learning objects increases the motivation and interest of learners [9]. As a result, recent years have witnessed an increasing volume of research and development efforts to offer personalized e-learning. Trust has been identified both as a prerequisite [10] and as a consequence of good personalization practice [11]. Anwar et al. define the key characteristics of an e-learning environment that offers personalization together with trust and privacy [1].
Trust about purpose. In e-learning, each context explicitly or implicitly manifests some purpose for its participants. For example, a math discussion forum may have the purpose of offering peer tutoring in math. Within the math forum context, there could be more granular contexts, such as an Algebra thread or a Calculus thread, for peer tutoring on the respective topics. This form of trust is based on the expectation arising from the purpose of a context. For example, Alice may highly trust the Math Forum to find an effective helper in Calculus.
Trust in partner. This form of trust considers the trustworthiness of a partner in a given context. For example, in a Calculus course, Alice may be considered as a trusted peer helper. Trust in partners may need further consideration of the roles of, and relationships with, the transacting partners. Some roles convey more trust than others. For example, an instructor role may convey a higher degree of trust. However, not all instructors are equally trusted by learners. A learner may trust one instructor over another based on their perceived relationship or reputation.
6.2.1 Secure Reputation Transfer Protocol

In the secure reputation transfer protocol, a user registers their pseudonym with a guarantor who vouches for the user and is credible in the community. The guarantor periodically evaluates the reputation of the user based on their own and other community members' observations. After each evaluation, a copy of the reputation is sent to the respective user. The user gets an opportunity to contest any misrepresentation of their reputation to the guarantor. The guarantor investigates the challenge and makes an appropriate adjustment to the reputation. The RT model comprises the following four entities:
Actor. An actor is a user (e.g., student, tutor, instructor in an e-learning environment), who takes part in various activities (e.g., chat, discussion) assuming their various contextual partial identities.
Reputation. Reputation measures the trustworthiness of a user as assessed over their past activities. For example, Alice may have worked on numerous collaborative course projects in the past. Based on her previous record, she could be trusted as a hardworking participant. However, her skill in programming assignments may not be as highly trusted.
Guarantor. A guarantor is a "public" user who is a trusted witness of the past activities of a pseudonymous user. For example, since an instructor observes a student over a period of time, the instructor can serve as a guarantor of a student's reputation in a traditional e-learning context.
Key Generator (KG). A trusted key generator that facilitates a public key infrastructure (PKI). This system component provides public/private key pairs to the users and the guarantor without knowing the purpose or usage of the key pairs. The steps of the reputation transfer model are detailed in the table found in the Appendix, which can be found on the Computer Society Digital Library at http://doi.ieeecomputersociety.org/10.1109/TLT.2011.23.
In summary, in the RT model (see the figure in the Appendix, available in the online supplementary material), a pseudonymous user can update the reputation of one pseudonym by transferring the reputation of another pseudonym to it. A guarantor vouches for a user in two ways: 1) responding to queries about the user, and 2) responding to the user's request to transfer reputation from one pseudonym to another.
Register. A user registers with a guarantor entity of the system. The communication between a user and a guarantor is cryptographically secure. At the time of registration, a user provides their pseudonym (partial identity) and context (the reputation context for which the user wants to be evaluated). Upon registration, the user receives two pieces of information to be kept secret: a unique 128-bit registration number and a digest (MD5 hash) of the reputation. For any change in reputation, the system generates a new digest.
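The registration step above can be sketched as follows. This is a minimal illustration, not the authors' implementation; the function name, the record layout, and the use of Python's `secrets` and `hashlib` modules are assumptions, chosen only to show a 128-bit registration number and an MD5 reputation digest being issued.

```python
import hashlib
import secrets

def register(pseudonym: str, context: str) -> tuple[str, str]:
    """Hypothetical registration step: issue a unique 128-bit
    registration number and an MD5 digest of the (initially empty)
    reputation record for the given pseudonym and context."""
    reg_number = secrets.token_hex(16)  # 16 random bytes = 128 bits
    # Assumed record layout: pseudonym|context|ratings (none yet).
    initial_record = f"{pseudonym}|{context}|"
    digest = hashlib.md5(initial_record.encode()).hexdigest()
    return reg_number, digest

reg_no, digest = register("alice42", "Calculus")
assert len(reg_no) == 32   # 128 bits rendered as 32 hex characters
assert len(digest) == 32   # an MD5 digest is likewise 128 bits
```

A fresh digest would be computed and sent to the user whenever the underlying reputation record changes, as the protocol requires.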
Evaluate. Any user can evaluate other users (i.e., their pseudonyms) against features specific to the role of the user being evaluated, on a scale of 0 to 5. Additionally, an evaluator may write comments in support of their evaluation.
Transfer. Reputation transfer is a two-way process that has to be carried out by both pseudonyms: the transferor and the transferee. First the transferor and then the transferee authenticate themselves by providing their respective contexts, registration numbers, and reputation digests. Reputation from one pseudonym can be transferred to a new pseudonym, or the reputation of one pseudonym can be merged with the reputation of another. A reputation merge takes place incrementally, by combining each rating transaction of one pseudonym, one by one, with the aggregate rating of the other pseudonym, and vice versa. Although the end result of the merge is two pseudonyms with the same reputation, their reputations differ at each time step of the merge. A small time delay is introduced between steps to give the impression that another transaction (evaluation) could have taken place.
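The incremental merge described above can be sketched as follows. This is an illustrative reading of the protocol, not the authors' code; the function name, the representation of a reputation as a list of ratings, and the delay range are all assumptions.

```python
import random
import time

def merge_reputations(ratings_a, ratings_b, delay_range=(0.1, 0.5)):
    """Hypothetical incremental merge: each pseudonym's rating
    transactions are folded one by one into the other's aggregate,
    with a random delay between steps so that each intermediate
    state resembles an ordinary new-rating update."""
    merged_a, merged_b = list(ratings_a), list(ratings_b)
    for r in ratings_b:
        merged_a.append(r)                        # fold one of B's ratings into A
        time.sleep(random.uniform(*delay_range))  # mimic a normal update interval
    for r in ratings_a:
        merged_b.append(r)                        # and vice versa for B
        time.sleep(random.uniform(*delay_range))
    # Both pseudonyms now aggregate the same set of ratings.
    return merged_a, merged_b
```

For example, `merge_reputations([5, 4], [3], delay_range=(0, 0))` yields two lists containing the same ratings {3, 4, 5}, with the intermediate states differing at each step, as the protocol describes.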
Query. A user may query the reputation of another user (i.e., the corresponding pseudonym). A reputation summary, which is an aggregation of the collected ratings against context-relevant features, is displayed in the format: Feature | Score | #Trans (i.e., number of ratings).
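A query result in that format can be sketched as follows. The aggregation rule (a simple mean per feature) and the function name are assumptions for illustration; the paper specifies only the displayed columns.

```python
from collections import defaultdict

def reputation_summary(ratings):
    """Hypothetical query response: aggregate (feature, score) rating
    transactions into 'Feature | Score | #Trans' rows, where Score is
    the mean rating and #Trans the number of ratings per feature."""
    by_feature = defaultdict(list)
    for feature, score in ratings:
        by_feature[feature].append(score)
    return [
        f"{feature} | {sum(scores) / len(scores):.1f} | {len(scores)}"
        for feature, scores in by_feature.items()
    ]

rows = reputation_summary([("helpfulness", 5), ("helpfulness", 4),
                           ("timeliness", 3)])
# e.g., "helpfulness | 4.5 | 2" and "timeliness | 3.0 | 1"
```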
6.4.1 Value of the Reputation Management System

Methodology. The system was used in an experiment to support the online course discussions of 35 students (19 female and 16 male) in an intensive six-week undergraduate course on Introduction to Sociology. The study was done in two phases: 1) in the first phase, the class made 173 postings using the original version of iHelp Discussions (without the reputation management system), and 2) in the second phase, the class made 302 postings using a version of iHelp Discussions with reputation management features.
The system allowed the participants to create multiple role- and relationship-level identities, provided awareness support for contexts and identities, and enabled them to rate others and to query others' as well as their own identity-specific reputation (a screenshot of the reputation window in the iHelp Discussion Forum is shown in Fig. 2). In each phase, the participants (students and the instructor) discussed topics under 11 contexts (chosen by the instructor as per the course objectives), each addressing a different social and behavioral question. The goal of the discussions was to collaboratively find explanations for different social phenomena (e.g., dating an older man, spitting on the ground, eye contact in an elevator). Prior to each phase of the study, users were trained to use the system. At the end of the second phase, 25 of the 35 participants who used the system took a post-use online survey to share their experience and their attitudes toward reputation-based trust.
Results. The usage data reveal that every participant received reputation ratings on their posts and that 43 percent of the participants checked their own or others' reputation. On average, each participant received 12.5 ratings, and 31 percent of the participants consulted their own reputation. We recognize that the need for reputation or trust in this study is not as critical as in a purely online setting, where there is no bodily presence to act as a trust guarantor: since the participants were classmates, they were already involved in trust relationships. Nevertheless, those who cared about trust measures (based on the survey) used the trust and reputation features of the system more extensively. The post-use survey reveals that 28 percent of learners used the system to identify trustworthy peers, 36 percent valued postings based on the posters' reputation, and 40 percent found that the reputation management system helped them identify trustworthy postings (see Table 1 for details).
Discussion. In this study the guarantor role was automated by the system. The system transferred a participant's reputation earned under a group identity (i.e., while a group identity was used to make a posting) to all of their individual partial identities within the same context; 22 percent of postings (66 of 302) were made using group identities. Reputation was also transferred among partial identities within the same context. Even though only 43 percent of users (lower than our expectation) were interested in seeking out reputation information, every user was interested in managing their identities, switching identities across contexts. They engaged in this identity switching because they felt that identity linkability was not going to be a problem; that is, they implicitly trusted the security of the reputation management system. Perhaps those who sought out more reputation information were in fact checking how well the reputation mechanism preserved their privacy. We plan to conduct a future study in an environment where the need for reputation or trust is naturally higher, so that we can fully understand the impact of our system.
6.4.2 Validating the RT Model

Methodology. To validate the RT model, the system was initialized to generate multiple instances of four types of events (reputation evaluation requests, reputation transfer requests, reputation merge requests, and null requests) in a random order for pseudonyms representing actors. At multiple time steps during the simulation, the system (the component representing the guarantor) was queried for the latest reputation of each registered pseudonym, and the query results were logged. A version of this simulation was run, and the reputation update actions were logged accordingly. These logs were then provided to a security attack-defense expert, who attempted to deduce what types of events might have occurred based on an analysis of the reputation score patterns over the various time steps. The expert was also asked whether he could distinguish among, or determine instances of, reputation transfers, reputation merges, and normal updates of reputation ratings.
Results. The simulation performed three transfers and seven merges of reputations across four pseudonyms of two actors. Although the data set was relatively small, the expert could not draw any definitive conclusions that would identify which pseudonyms corresponded to the same actor. The expert suspected that four mergers or transfers of reputation had occurred. The one merger hypothesis in which the expert was most confident was entirely incorrect. Two of the expert's suspected mergers or transfers did correspond to real events (one transfer and one merger of the 10), but the expert entirely missed the other eight merger/transfer events, and even for the two correct hypotheses he could not be conclusively sure.
These correct guesses may well be attributed to chance. With an increase in the number of actors or pseudonyms, it becomes even harder to detect a reputation transfer or merge. We can therefore say that our system supports reputation transfer with privacy preservation.
The integrity of a reputation can be checked using the reputation digest, a 128-bit "fingerprint" of the reputation information generated by computing an MD5 hash.
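Such an integrity check can be sketched as follows. The record layout and function name are assumptions for illustration; only the use of a 128-bit MD5 fingerprint comes from the text.

```python
import hashlib

def verify_reputation(reputation_record: str, stored_digest: str) -> bool:
    """Recompute the MD5 fingerprint of the reputation record and
    compare it with the digest previously issued to the user; a
    mismatch signals that the record has been tampered with."""
    return hashlib.md5(reputation_record.encode()).hexdigest() == stored_digest

# Assumed record layout: pseudonym|context|feature:score:#trans
record = "alice42|Calculus|helpfulness:4.5:2"
digest = hashlib.md5(record.encode()).hexdigest()
assert verify_reputation(record, digest)
assert not verify_reputation(record + "tampered", digest)
```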
Since both the transferring and receiving pseudonyms are registered to the guarantor, any bad acting can be traced and verified by the guarantor.
To prevent undue advantage being taken from the recurring merger of a bad reputation with a good one, a history of already merged ratings is kept and compared before a new merge request is entertained.
The model also supports rollback of reputation to recover from bad acting.
The use of a public key infrastructure ensures a secure reputation transfer channel, so that an observer can neither snoop on a reputation transfer nor identify the two pseudonyms involved in it.
One pseudonym's reputation (i.e., aggregated ratings) is incremented one by one with each rating transaction of the other pseudonym, and vice versa, allowing a gradual increase or decrease in reputation that makes a transfer indistinguishable from a reputation update caused by a new rating.
A random time delay is introduced between each of these increments, again to make a reputation transfer indistinguishable from ordinary reputation updates, which would not normally occur in a short, continuous burst.
A time delay proportional to the amount of activity taking place in the system is introduced between updates of reputation, so that the multiple partial identities of an individual cannot be linked by observing one reputation update triggering changes to the reputation of multiple pseudonyms.
How much privacy is lost by a user when disclosing the given data?
How much does a user benefit from a particular trust gain?
How much privacy should a user be willing to sacrifice for a certain amount of trust gain?
M. Anwar is with the School of Information Sciences, University of Pittsburgh, 135 N. Bellefield Ave., Pittsburgh, PA 15260.
J. Greer is with the Department of Computer Science, University of Saskatchewan, 176 Thorvaldson, 110 Science Place, Saskatoon, SK S7N 5C9, Canada. E-mail: firstname.lastname@example.org.
Manuscript received 16 Mar. 2010; revised 20 Dec. 2010; accepted 18 Apr. 2011; published online 5 May 2011.
For information on obtaining reprints of this article, please send e-mail to: email@example.com, and reference IEEECS Log Number TLT-2010-03-0025.
Digital Object Identifier no. 10.1109/TLT.2011.23.
Mohd Anwar received the PhD degree from the University of Saskatchewan. Currently, he is working as a visiting research assistant professor in the School of Information Sciences of the University of Pittsburgh.
Jim Greer received the PhD degree from the University of Texas at Austin and has been a faculty member at the University of Saskatchewan for more than 20 years. Currently, he is working as a professor of computer science and also serves as a director of the University Learning Center.