The objective of this shape retrieval contest is to evaluate the performance of 3D shape retrieval approaches on a new generic 3D shape benchmark.
Task description: For each query in a given set, compute similarity scores against the target models and return a ranked list of the target models along with their similarity scores.
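As a concrete illustration, below is a minimal sketch of producing such a ranked list for one query, assuming each model has already been reduced to a fixed-length descriptor vector; the descriptor representation and the Euclidean distance used here are illustrative assumptions, not prescribed by the track.

import numpy as np

def rank_targets(query_vec, target_vecs, target_ids):
    # Euclidean distance between the query descriptor and every target descriptor
    dists = np.linalg.norm(target_vecs - query_vec, axis=1)
    order = np.argsort(dists)  # ascending distance = descending similarity
    # Negated distance serves as the similarity score to be reported
    return [(target_ids[i], -dists[i]) for i in order]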
Data set: This new generic benchmark contains 800 3D models in total. The target database holds 720 complete 3D models categorized into 40 classes of 18 models each; the remaining 80 models serve as the queries. All models are given in the ASCII Object File Format (*.off).
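For reference, a minimal reader for this format is sketched below. It assumes the common ASCII OFF layout (an OFF header, a vertex/face/edge count line, vertex coordinates, then faces given as a vertex count followed by vertex indices) with no comment lines or per-face color fields.

def read_off(path):
    with open(path) as f:
        tokens = f.read().split()  # whitespace-tolerant: header and counts may share a line
    if tokens[0] != "OFF":
        raise ValueError("not an ASCII OFF file")
    nv, nf = int(tokens[1]), int(tokens[2])  # tokens[3] is the edge count, often 0
    pos = 4
    vertices = []
    for _ in range(nv):
        vertices.append(tuple(float(t) for t in tokens[pos:pos + 3]))
        pos += 3
    faces = []
    for _ in range(nf):
        k = int(tokens[pos])  # number of vertices in this face
        faces.append(tuple(int(t) for t in tokens[pos + 1:pos + 1 + k]))
        pos += 1 + k
    return vertices, faces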
Evaluation Methodology: We will employ the following evaluation measures: Precision-Recall curve; Average Precision (AP) and Mean Average Precision (MAP); E-Measure; Discounted Cumulative Gain (DCG); Nearest Neighbor (NN), First Tier (Tier1), and Second Tier (Tier2).
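For clarity, the rank-based scalar measures can be sketched as follows from a ranked list of target ids and the set of relevant targets (the members of the query's class); this follows the textbook definitions and is not the track's official evaluation code.

def average_precision(ranked_ids, relevant):
    # Mean of the precision values observed at the rank of each relevant model
    hits, precisions = 0, []
    for rank, tid in enumerate(ranked_ids, start=1):
        if tid in relevant:
            hits += 1
            precisions.append(hits / rank)
    return sum(precisions) / len(relevant) if relevant else 0.0

def tier_scores(ranked_ids, relevant):
    # Nearest Neighbor, First Tier (Tier1) and Second Tier (Tier2) for one query
    k = len(relevant)  # 18 relevant target models per query class in this benchmark
    nn = 1.0 if ranked_ids[0] in relevant else 0.0
    tier1 = sum(1 for tid in ranked_ids[:k] if tid in relevant) / k
    tier2 = sum(1 for tid in ranked_ids[:2 * k] if tid in relevant) / k
    return nn, tier1, tier2

MAP is then simply the mean of the per-query average precision over all queries.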
Paper: Godil, A., Dutagaci, H., Akgül, C.B., Axenopoulos, A., Bustos, B., Chaouch, M., Daras, P., Furuya, T., Kreft, S., Lian, Z., and Napoleon, T. SHREC'09 Track: Generic shape retrieval. In Proceedings of the Eurographics Workshop on 3D Object Retrieval (3DOR), March 2009, pp. 61-68. http://dx.doi.org/10.2312/3DOR/3DOR09/061-068