SHREC 2010 - Shape Retrieval Contest based on Generic 3D Warehouse
Call For Participation
The objective of this track is to evaluate the performance of 3D shape retrieval approaches on a Generic 3D shape benchmark based on the Google 3D Warehouse.
The number of 3D models created every day and stored in databases is growing rapidly. Effectively searching a 3D repository for shapes similar to a given query model has therefore become an important area of research. Benchmarking allows researchers to evaluate the quality of results of different 3D shape retrieval approaches. Here, we propose a new publicly available 3D shape benchmark based on the Google 3D Warehouse to advance the state of the art in 3D shape retrieval.
Task description
The task is to evaluate the dissimilarity between every pair of objects in the database described below and then output the dissimilarity matrix.
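As a concrete illustration of the task's output, the sketch below builds a pairwise dissimilarity matrix from per-model feature vectors. The Euclidean distance between descriptors is only an assumption; each group is free to choose its own shape descriptors and dissimilarity measure.

```python
import numpy as np

def dissimilarity_matrix(descriptors):
    """Compute the pairwise Euclidean dissimilarity matrix for a set of
    shape descriptors (one feature vector per model). Euclidean distance
    is a placeholder for whatever measure a participant actually uses."""
    X = np.asarray(descriptors, dtype=float)
    # Pairwise squared distances via ||a-b||^2 = ||a||^2 + ||b||^2 - 2 a.b
    sq = np.sum(X * X, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * (X @ X.T)
    np.maximum(d2, 0.0, out=d2)  # guard against tiny negative round-off
    return np.sqrt(d2)
```

The result is a symmetric N x N matrix with a zero diagonal, one row and column per model in the database.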
Data set
All the 3D models in the generic shape benchmark were acquired by a web crawler from the Google 3D Warehouse. To build the ground-truth database, one person classified the objects into categories, guided by the Google tags but based mainly on visual similarity. The benchmark contains over three thousand 3D models. The file format used to represent the 3D models is the ASCII Object File Format (*.off).
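For participants who do not already have an OFF loader, a minimal parser is sketched below. It assumes the common ASCII OFF layout (a header line `OFF`, then vertex/face/edge counts, then vertex and face records) and does not handle comments or color extensions that some OFF variants allow.

```python
def read_off(path):
    """Parse a basic ASCII OFF file into vertex and face lists.
    Expected layout: 'OFF', then 'nv nf ne', then nv vertex lines
    (x y z) and nf face lines (count v0 v1 ...)."""
    with open(path) as f:
        tokens = f.read().split()
    assert tokens[0] == "OFF", "not an OFF file"
    nv, nf = int(tokens[1]), int(tokens[2])  # edge count (tokens[3]) is unused
    pos = 4
    vertices = []
    for _ in range(nv):
        vertices.append(tuple(float(t) for t in tokens[pos:pos + 3]))
        pos += 3
    faces = []
    for _ in range(nf):
        k = int(tokens[pos])  # number of vertices in this face
        faces.append(tuple(int(t) for t in tokens[pos + 1:pos + 1 + k]))
        pos += 1 + k
    return vertices, faces
```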
Evaluation Methodology
We will employ the following evaluation measures: Precision-Recall curve; Average Precision (AP) and Mean Average Precision (MAP); E-Measure; Discounted Cumulative Gain; Nearest Neighbor, First-Tier (Tier1), and Second-Tier (Tier2).
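To make three of these measures concrete, the sketch below computes Nearest-Neighbor, First-Tier, and Second-Tier scores from a dissimilarity matrix and ground-truth class labels. This is a hypothetical helper for self-checking only; the track will use its own official evaluation scripts.

```python
import numpy as np

def nn_and_tiers(D, labels):
    """Average Nearest-Neighbor, First-Tier and Second-Tier scores.
    D is an N x N dissimilarity matrix; labels gives each model's
    ground-truth class. Assumes every class has at least 2 members."""
    D = np.asarray(D, dtype=float)
    labels = np.asarray(labels)
    n = len(labels)
    nn = ft = st = 0.0
    for q in range(n):
        order = np.argsort(D[q])
        order = order[order != q]          # exclude the query itself
        rel = labels[order] == labels[q]   # relevance of ranked results
        c = int(rel.sum())                 # class size minus the query
        nn += rel[0]                       # is the top match relevant?
        ft += rel[:c].sum() / c            # relevant within first tier
        st += rel[:2 * c].sum() / c        # relevant within second tier
    return nn / n, ft / n, st / n
```

With a perfect ranking (all same-class models closer than any other class), all three scores reach 1.0.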
Procedure
The following list is a step-by-step description of the activities:
- The participants must register by sending a message to SHREC@nist.gov. Early registration is encouraged, so that we get an impression of the number of participants at an early stage.
- The database will be made available via this website. The final test dataset is now posted.
- Participants will submit the dissimilarity matrix (also called a distance matrix) for the test database. Up to 5 matrices per group may be submitted, resulting from different runs. Each run may be a different algorithm or a different parameter setting. More information on the dissimilarity matrix file format is available on this website.
- The evaluations will be done automatically.
- The organization will release the evaluation scores of all the runs.
- The participants write a two page description of their method, commenting on the evaluation results, with two figures.
- The track results are combined into a joint paper, published in the proceedings of the Eurographics Workshop on 3D Object Retrieval.
- The description of the tracks and their results are presented at the Eurographics Workshop on 3D Object Retrieval (May 2, 2010).
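The steps above end with submitting a dissimilarity matrix file. The exact submission format is specified on the track website; as a placeholder sketch, the helper below assumes plain text with one whitespace-separated row per line, which is only an assumption.

```python
import numpy as np

def save_matrix(D, path):
    """Write an N x N dissimilarity matrix as plain text, one row per
    line. NOTE: whitespace-separated rows are an assumption here; the
    official file format is described on the track website."""
    np.savetxt(path, np.asarray(D, dtype=float), fmt="%.6f")
```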
Schedule:
January 26  - Call for participation.
January 29  - A few sample models of the test database will be available online.
February 3  - Please register before this date.
February 3  - Distribution of the whole database; participants can start the retrieval.
February 13 - Submission of results (dissimilarity matrix) and a one page description of the method(s).
February 17 - Distribution of relevance judgments and evaluation scores.
February 20 - Submission of final descriptions (two page) for the contest proceedings.
February 24 - Track is finished, and results are ready for inclusion in a track report.
March 7     - Camera ready track papers submitted for printing.
May 2       - EUROGRAPHICS Workshop on 3D Object Retrieval, including SHREC'2010.