Show simple item record

dc.contributor.author  Pelechrinis, Konstantinos
dc.contributor.author  Zadorozhny, Vladimir
dc.contributor.author  Kounev, Velin
dc.contributor.author  Oleshchuk, Vladimir
dc.contributor.author  Anwar, Mohd
dc.contributor.author  Lin, Yiling
dc.date.accessioned  2013-09-20T08:44:10Z
dc.date.available  2013-09-20T08:44:10Z
dc.date.issued  2013
dc.identifier.citation  Pelechrinis, K., Zadorozhny, V., Kounev, V., Oleshchuk, V., Anwar, M., & Lin, Y. (2013). Automatic evaluation of information provider reliability and expertise. World Wide Web, 1-40. doi: 10.1007/s11280-013-0249-x
dc.identifier.issn  1386-145X
dc.identifier.uri  http://hdl.handle.net/11250/138010
dc.description  Published version of an article in the journal: World Wide Web. Also available from the publisher at: http://dx.doi.org/10.1007/s11280-013-0249-x
dc.description.abstract  Q&A social media have gained a lot of attention in recent years. People rely on these sites to obtain information because of the advantages they offer over conventional sources of knowledge (e.g., asynchronous and convenient access). However, for the same question one may find highly contradictory answers, creating ambiguity about the correct information. This can be attributed to the presence of unreliable and/or non-expert users. These two attributes (reliability and expertise) significantly affect the quality of the answer/information provided. We present a novel approach for estimating these user characteristics that relies on human cognitive traits. In brief, we propose that each user monitor the activity of his peers (on the basis of responses to questions he has asked) and observe their compliance with predefined cognitive models. These observations lead to local assessments that can be further fused to obtain a reliability and expertise consensus for every other user in the social network (SN). For the aggregation part we use subjective logic. To the best of our knowledge this is the first study of its kind in the context of Q&A SNs. Our proposed approach is highly distributed; each user can individually estimate the expertise and the reliability of his peers using his direct interactions with them and our framework. The online SN (OSN), which can be considered a distributed database, performs continuous data aggregation of users' expertise and reliability assessments in order to reach a consensus. In our evaluations, we first emulate a Q&A SN to examine various performance aspects of our algorithm (e.g., convergence time, responsiveness, etc.). Our evaluations indicate that it can accurately assess the reliability and the expertise of a user from a small number of samples and can successfully react to changes in the user's behavior, provided that the cognitive traits hold in practice. Furthermore, the use of the consensus operator for the aggregation of multiple opinions on a specific user reduces the uncertainty of the final assessment. However, as real data obtained from Yahoo! Answers indicate, the pairwise interactions between specific users are limited. Hence, we consider the aggregate set of questions as posted by the system itself and assess the expertise and reliability of users based on their response behavior. We observe that users behave differently depending on the level at which we observe them. In particular, while their activity is focused on a few general categories, rendering them reliable, their microscopic (within-general-category) activity is highly scattered.
dc.language.iso  eng
dc.publisher  Springer
dc.subject  Q&A social networks
dc.subject  user reliability
dc.subject  user expertise
dc.subject  subjective logic
dc.title  Automatic evaluation of information provider reliability and expertise
dc.type  Journal article
dc.type  Peer reviewed
dc.subject.nsi  VDP::Mathematics and natural science: 400::Information and communication science: 420
dc.source.pagenumber  1-40
dc.source.journal  World Wide Web
dc.identifier.doi  10.1007/s11280-013-0249-x
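
The abstract above describes fusing per-user reliability and expertise assessments with the consensus operator of subjective logic. As a minimal sketch of how such fusion behaves, assuming Jøsang's binomial opinions (belief, disbelief, and uncertainty summing to one) and using Jøsang's standard consensus (cumulative fusion) formula; the Opinion class, the names alice and bob, and all numeric values below are illustrative assumptions, not taken from the paper:

    from dataclasses import dataclass

    @dataclass
    class Opinion:
        """Binomial subjective-logic opinion: belief + disbelief + uncertainty == 1."""
        belief: float
        disbelief: float
        uncertainty: float

    def consensus(a: Opinion, b: Opinion) -> Opinion:
        """Jøsang's consensus (cumulative fusion) of two independent opinions."""
        k = a.uncertainty + b.uncertainty - a.uncertainty * b.uncertainty
        if k == 0:
            # Both opinions are dogmatic (zero uncertainty); a simple average
            # is one common fallback (an assumption, not the paper's choice).
            return Opinion((a.belief + b.belief) / 2,
                           (a.disbelief + b.disbelief) / 2, 0.0)
        return Opinion(
            belief=(a.belief * b.uncertainty + b.belief * a.uncertainty) / k,
            disbelief=(a.disbelief * b.uncertainty + b.disbelief * a.uncertainty) / k,
            uncertainty=(a.uncertainty * b.uncertainty) / k,
        )

    # Two peers' local assessments of the same answerer's reliability
    # (illustrative values):
    alice = Opinion(belief=0.7, disbelief=0.1, uncertainty=0.2)
    bob = Opinion(belief=0.6, disbelief=0.2, uncertainty=0.2)
    print(consensus(alice, bob))

Fusing these two assessments yields an opinion with uncertainty of about 0.11, below either input's 0.2, which mirrors the abstract's claim that aggregating multiple opinions on a specific user reduces the uncertainty of the final assessment.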

