Efficient Interactive Search for Geo-tagged Multimedia Data
With advances in mobile computing and multimedia techniques, vast amounts of multimedia data with geographical information are collected in a wide range of applications. In this paper, we propose a novel type of image search, named interactive geo-tagged image search, which aims to find a set of images based on geographical proximity, similarity of visual content, and user preferences. Existing approaches for spatial keyword queries and geo-image queries cannot address this problem effectively because they do not consider these three types of information together. To solve this challenge efficiently, we formally define the interactive top-k geo-tagged image query and present a framework consisting of a candidate search stage, an interaction stage, and a termination stage. To enhance search efficiency over a large-scale database, we propose a candidate search algorithm named GI-SUPER Search, which is based on a new notion called the superior relationship and on the GIR-Tree, a novel index structure. Furthermore, two candidate selection methods are proposed for learning the user's preferences during interaction. Finally, the termination procedure and the estimation procedure are briefly introduced. Experimental evaluation on a real multimedia dataset demonstrates that our solution achieves high performance.
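To make the combined ranking concrete, the minimal sketch below is ours and not the paper's actual formulation: the function names, the feature representation, and the brute-force scan are hypothetical stand-ins for GI-SUPER Search over the GIR-Tree. It scores each candidate image by a weighted sum of spatial proximity and visual similarity, where the weight plays the role of a user preference that the interaction stage could refine.

import math

def spatial_proximity(query_loc, image_loc):
    # Closeness in (0, 1] derived from Euclidean distance (illustrative normalization).
    return 1.0 / (1.0 + math.hypot(query_loc[0] - image_loc[0], query_loc[1] - image_loc[1]))

def visual_similarity(query_feat, image_feat):
    # Cosine similarity between visual feature vectors (assumed image representation).
    dot = sum(a * b for a, b in zip(query_feat, image_feat))
    norms = math.sqrt(sum(a * a for a in query_feat)) * math.sqrt(sum(b * b for b in image_feat))
    return dot / norms if norms else 0.0

def combined_score(query, image, alpha):
    # alpha stands in for a learned user-preference weight between location and content.
    return (alpha * spatial_proximity(query["loc"], image["loc"])
            + (1.0 - alpha) * visual_similarity(query["feat"], image["feat"]))

def top_k(query, images, k, alpha=0.5):
    # Brute-force top-k ranking; an index-based search such as the paper's would prune this exhaustive scan.
    return sorted(images, key=lambda img: combined_score(query, img, alpha), reverse=True)[:k]

In this reading, interactive feedback adjusts alpha (or a richer preference model) between rounds, while the index structure keeps each round's candidate search efficient at scale.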