Corpus Analysis and Annotation for Helpful Sentences in Product Reviews

Hana Almagrabi, Areej Malibari, John McNaught

Abstract


For the last two decades, studies on determining the quality of online product reviews have largely been concerned with classifying complete documents into helpful or unhelpful classes using supervised learning methods. As in any supervised machine-learning task, a manually annotated corpus is required to train a model. Corpora annotated for helpful product reviews are an important resource for understanding what makes online product reviews helpful and how to rank them according to their quality. However, most corpora for helpfulness are annotated at the document level, i.e. the full review; little attention has been paid to a deeper analysis of the helpful comments within reviews. In this article, a new annotation scheme is proposed to identify helpful sentences in each product review in the dataset. The annotation scheme, guidelines and inter-annotator agreement scores are presented and discussed. A high level of inter-annotator agreement is obtained, indicating that the annotated corpus is suitable to support subsequent research.
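As an aside, inter-annotator agreement for a binary labelling task such as this (helpful vs. unhelpful sentences) is commonly reported with a chance-corrected statistic like Cohen's kappa. The abstract does not state which statistic the authors used, so the sketch below is purely illustrative: a minimal Cohen's kappa for two annotators over hypothetical sentence labels.

```python
from collections import Counter


def cohens_kappa(labels_a, labels_b):
    """Cohen's kappa for two annotators over the same items.

    kappa = (p_o - p_e) / (1 - p_e), where p_o is observed agreement
    and p_e is the agreement expected by chance from each annotator's
    label distribution.
    """
    assert len(labels_a) == len(labels_b) and labels_a
    n = len(labels_a)
    # Observed agreement: fraction of items both annotators label the same.
    p_o = sum(a == b for a, b in zip(labels_a, labels_b)) / n
    # Chance agreement: product of each annotator's marginal label rates.
    ca, cb = Counter(labels_a), Counter(labels_b)
    p_e = sum(ca[k] * cb[k] for k in set(ca) | set(cb)) / (n * n)
    return (p_o - p_e) / (1 - p_e)


# Hypothetical sentence-level labels from two annotators
# ("h" = helpful, "u" = unhelpful); 9/10 raw agreement.
ann_1 = ["h", "h", "u", "h", "u", "u", "h", "h", "u", "h"]
ann_2 = ["h", "h", "u", "u", "u", "u", "h", "h", "u", "h"]
print(cohens_kappa(ann_1, ann_2))  # → 0.8
```

Values above roughly 0.8 are conventionally read as strong agreement, which is the kind of result the abstract reports.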




DOI: https://doi.org/10.5539/cis.v11n2p76

Copyright (c) 2018 Areej A. Malibari

License URL: http://creativecommons.org/licenses/by/4.0

Computer and Information Science   ISSN 1913-8989 (Print)   ISSN 1913-8997 (Online)  Email: cis@ccsenet.org


Copyright © Canadian Center of Science and Education
