Researchers at IBM’s lab in Haifa, Israel, initiated and led a European Union project that has produced an analytics engine allowing people to find even untagged video, pictures, and music that match multimedia they submit as a query.

In collaboration with a European Union consortium, the Haifa researchers have engineered a Web technology called SAPIR (Search in Audio-visual content using Peer-to-peer Information Retrieval) that can analyze and index the audio-visual content itself in large-scale collections, rather than relying on manually added tags.

SAPIR can sift through collections of millions of multimedia items by extracting “low-level descriptors” from the photographs or videos. These descriptors capture features such as color, layout, shapes, or sounds. Each item is then automatically indexed and ranked by descriptor similarity for easy retrieval.
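The descriptor-based matching described above can be sketched in miniature. The example below is an illustrative simplification, not SAPIR’s actual pipeline: it uses a plain quantized color histogram as the “low-level descriptor” and Euclidean distance for ranking, whereas a production system would use richer features (e.g. standardized visual descriptors) and a scalable peer-to-peer index. All names here (`color_histogram`, `rank_by_similarity`, the sample image data) are hypothetical.

```python
import math

def color_histogram(pixels, bins=4):
    """Quantize each RGB channel into `bins` buckets and count pixels,
    a toy stand-in for a low-level color descriptor."""
    hist = [0.0] * (bins ** 3)
    step = 256 // bins
    for r, g, b in pixels:
        idx = (r // step) * bins * bins + (g // step) * bins + (b // step)
        hist[idx] += 1
    total = sum(hist) or 1.0
    # Normalize so images of different sizes remain comparable.
    return [h / total for h in hist]

def distance(h1, h2):
    """Euclidean distance between two normalized histograms."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(h1, h2)))

def rank_by_similarity(query_pixels, collection):
    """Rank indexed items by descriptor distance to the query, closest first."""
    q = color_histogram(query_pixels)
    scored = [(name, distance(q, color_histogram(px)))
              for name, px in collection.items()]
    return sorted(scored, key=lambda item: item[1])

# Tiny demo: each "image" is a list of RGB pixel tuples.
red_image = [(250, 10, 10)] * 100
blue_image = [(10, 10, 250)] * 100
collection = {"red_statue.jpg": red_image, "blue_sky.jpg": blue_image}

# A mostly-red query ranks the red image first.
print(rank_by_similarity(red_image, collection)[0][0])  # red_statue.jpg
```

In a real system the histograms would be computed once at indexing time and stored, so that a query only pays the cost of one feature extraction plus a fast nearest-neighbor lookup over the index.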

For example, an image of a statue taken with a mobile phone could be compared against existing photographs to identify the statue. Similarly, in the future, doctors could compare rich-media medical images with historical data from medical repositories to aid diagnosis, or a photograph of a fashionable item seen on the street could be analyzed to find out which stores carry it.

A demo for testing by the general public is now available at