A professional photographer annotated IQA database

The dataset was collected as part of a study on how to effectively screen expert crowd workers in image quality assessment (IQA). It consists of 200 images randomly chosen from Flickr, 50 pristine images selected so that they show no visible quality degradations, and 50 artificially distorted images derived from the pristine ones.
In our paper we propose a screening approach to effectively find reliable expert crowd workers in image quality assessment (IQA). Our method measures the users' ability to identify image degradations by using test questions, together with several relaxed reliability checks. We conduct multiple experiments, obtaining reproducible results with a high agreement between the expertise-screened crowd and the freelance experts of 0.95 Spearman rank-order correlation (SROCC), with one restriction: images with shallow depth of field. Our contributions include a reliability screening method for uninformative users, a new type of test question that relies on our proposed database of pristine and artificially distorted images, a group agreement extrapolation method, and an analysis of the crowdsourcing experiments. We provide the full experimental results from both the expert and crowdsourcing experiments.
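The agreement figure above is a Spearman rank-order correlation between crowd and expert scores. As a minimal illustration of how such a number is computed, here is a small Python sketch using the classic rank-difference formula; the two score vectors are made up for illustration and are not taken from the database:

```python
# Minimal sketch: Spearman rank-order correlation (SROCC) between two sets
# of image quality scores. Score vectors below are hypothetical examples.

def ranks(values):
    """Return 1-based rank positions; assumes no tied values."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0] * len(values)
    for rank, i in enumerate(order, start=1):
        r[i] = rank
    return r

def srocc(x, y):
    """Spearman's rho via 1 - 6*sum(d^2) / (n*(n^2 - 1)) (no ties)."""
    n = len(x)
    d2 = sum((rx - ry) ** 2 for rx, ry in zip(ranks(x), ranks(y)))
    return 1 - 6 * d2 / (n * (n * n - 1))

crowd_scores  = [3.2, 4.1, 2.5, 4.8, 3.9, 1.7, 2.9, 4.4]  # hypothetical crowd scores
expert_scores = [3.0, 4.3, 2.6, 4.9, 3.7, 1.9, 3.1, 4.2]  # hypothetical expert scores
print(round(srocc(crowd_scores, expert_scores), 3))  # → 0.952
```

With real data, the same computation is available as `scipy.stats.spearmanr`, which also handles tied ranks.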
Cite us

IQA-Experts-300 is freely available to the research community. If you use our database in your research, you can cite it as follows:
@misc{IQAExperts300,
  title  = {IQA-Experts-300: A professional photographer annotated IQA database},
  author = {Hosu, Vlad and Lin, Hanhe and Saupe, Dietmar},
  year   = {2018},
  url    = {http://database.mmsp-kn.de}
}

@inproceedings{Hosu2018-expertise-screening,
  title     = {Expertise screening in crowdsourcing image quality},
  booktitle = {QoMEX 2018: Tenth International Conference on Quality of Multimedia Experience},
  author    = {Hosu, Vlad and Lin, Hanhe and Saupe, Dietmar},
  year      = {2018}
}
Downloads
146 KB download
2.2 MB download
The experiment files

expert_scores_aggregated.csv
experts_crowdflower_full.csv
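Once downloaded, the experiment files are plain CSV and can be read with standard tools. The sketch below parses an inline sample in the shape one might expect from expert_scores_aggregated.csv; note that the column names (`image_name`, `mean_score`, `std_score`) and values here are assumptions for illustration, not the actual schema of the released file:

```python
import csv
import io

# Hypothetical sample standing in for expert_scores_aggregated.csv;
# the real file's columns may differ — check its header row.
sample = io.StringIO(
    "image_name,mean_score,std_score\n"
    "img_001.jpg,3.8,0.6\n"
    "img_002.jpg,2.1,0.9\n"
)

# Map each image to its aggregated (mean) expert score.
rows = list(csv.DictReader(sample))
scores = {row["image_name"]: float(row["mean_score"]) for row in rows}
print(scores["img_001.jpg"])  # → 3.8
```

For the real files, replace the `io.StringIO` sample with `open("expert_scores_aggregated.csv", newline="")`.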