Is a picture worth a thousand search words?

By Catherine

Written by Catherine Bolgar

Selecting the right Internet search words can be frustrating. But thanks to broader bandwidth and better picture-recognition technology, future searches may be image- or video-driven.

“There’s a long history of search engines that have tried to use images,” says Greg Sterling, vice president of strategy for the Local Search Association, an industry association of media companies, agencies and technology providers. “Visual search was seen as more directly delivering information than text. Maybe it was a technology thing or timing thing, but they didn’t quite find the right model.”

As smartphones began reshaping the Internet landscape—some 340 million were shipped in the second quarter of 2015 alone—pre-existing visual search engines such as Grokker, Viewzi and SearchMe foundered. Yet the proliferation of smartphones and tablets may have increased demand for visual search, because their small screens are better suited to pictures than to text.

“Visual is definitely one path forward for search,” Mr. Sterling says. At the moment, when searching for a particular product, “unless you have a specific brand name, it’s hard and frustrating clicking back and forth to different sites.”

An image search “will confirm quickly if it’s what you’re looking for, plus provide customer reviews and other product information,” Mr. Sterling says.

 

However, image search is not so straightforward. You take a photograph and use it to search for related information, but success depends on the angle, light and focus of the photo.

“In the future, maybe it will be the case where you snap a picture of a landmark and get all the information about it,” he says. “What’s open for improvement is using a camera to get information. Inputting a 16-digit credit card number into a small screen on a phone is problematic. You mistype. Today, you can take a picture of the credit card and certain apps will recognize it and process it into the form.”

Images by themselves probably aren’t the future. “Look for a mix of images and structured data, finding what images are, finding other related things and organizing that information with tags and other data,” Mr. Sterling says. “There’s more and more sophistication in how you identify and index, with machine learning and other technology that exists behind the scenes that could apply to a pure text or image model.”

Researchers are working to improve the technological foundations for image searches. A group of universities is developing ImageNet, a database of some 14 million images, each linked to the noun it depicts.

Meanwhile, Lorenzo Torresani, associate professor of computer science at Dartmouth College in New Hampshire, has helped create a machine-learning algorithm that uses images to find documents. Part of the problem he is tackling is that only a few users annotate their uploaded pictures and videos, and not necessarily accurately. “The repository is expanding at an astonishing rate, but we can’t retrieve content efficiently,” Dr. Torresani says.

One approach is software that checks whether the searched-for objects are in a picture and, if so, automatically tags them. “It works, but has limitations,” Dr. Torresani says. “It’s difficult to expose all the content in the picture with predefined classes. And if you use predefined classes, then the search is only accessible through those keywords.”
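
To make the predefined-class idea concrete, here is a minimal sketch in Python. It assumes a pretrained torchvision classifier whose ImageNet labels stand in for the fixed vocabulary; the model choice, file name and confidence threshold are illustrative, not part of any system described in this article.

```python
# Tag a photo against a fixed vocabulary of classes using a pretrained
# ImageNet classifier. Illustrative only; not the system described above.
import torch
from PIL import Image
from torchvision import models

weights = models.ResNet50_Weights.DEFAULT        # pretrained ImageNet weights
model = models.resnet50(weights=weights).eval()
preprocess = weights.transforms()                # matching resize/normalize pipeline
classes = weights.meta["categories"]             # the predefined class vocabulary

def tag_image(path, threshold=0.2):
    """Return the predefined classes the model is reasonably confident about."""
    img = preprocess(Image.open(path).convert("RGB")).unsqueeze(0)
    with torch.no_grad():
        probs = model(img).softmax(dim=1).squeeze(0)
    return [(classes[i], float(p)) for i, p in enumerate(probs) if p > threshold]

print(tag_image("photo.jpg"))  # hypothetical file; e.g. [("espresso", 0.63)]
```

The limitation Dr. Torresani points to is visible right in the code: whatever is not in the predefined `classes` list simply cannot be found.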

Another approach is to extract visual features, a kind of visual signature, that let users search by example. Alternatively, software could translate keywords into the visual signature, because users are accustomed to searching via text. This would work like language translation software, but translating from text to image instead.

“It could be used to find images or videos that are similar in context or appearance, and link them somehow,” Dr. Torresani says. “It could make the repositories browsable.”
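
As a rough sketch of that visual-signature idea, the snippet below embeds images and text queries into a shared vector space and ranks images by similarity, so a repository can be searched by example or by a keyword “translated” into the visual space. A CLIP-style model from the Hugging Face transformers library stands in for the signature extractor; it is not Dr. Torresani’s algorithm, and the file names are hypothetical.

```python
# Search-by-example with "visual signatures": embed images and queries into a
# shared vector space and rank by cosine similarity.
import torch
from PIL import Image
from transformers import CLIPModel, CLIPProcessor

model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32").eval()
processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")

def image_signatures(paths):
    inputs = processor(images=[Image.open(p).convert("RGB") for p in paths],
                       return_tensors="pt")
    with torch.no_grad():
        feats = model.get_image_features(**inputs)
    return feats / feats.norm(dim=-1, keepdim=True)   # unit-length signatures

def text_signature(query):
    inputs = processor(text=[query], return_tensors="pt", padding=True)
    with torch.no_grad():
        feats = model.get_text_features(**inputs)
    return feats / feats.norm(dim=-1, keepdim=True)

library = ["a.jpg", "b.jpg", "c.jpg"]              # hypothetical repository
sigs = image_signatures(library)
scores = (text_signature("cozy Italian restaurant") @ sigs.T).squeeze(0)
print(library[int(scores.argmax())])               # closest match by signature
```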

Video is the bigger challenge. “One second of video has 30 images,” he says. “The amount of data we need to analyze a one-minute video is huge. Storage is a problem. Retrieval is a problem. Processing is a problem.”

Yet “even if the recognition process fails on one or two images, we have so many of them and the view maybe changes and the object that was ambiguous becomes clearer later in the video,” Dr. Torresani says. “From that point of view, video is easier than a still image.”
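
One common way to exploit that redundancy is to sample a frame every second or so, tag each sampled frame, and keep only the tags that win enough votes, so an object missed in one ambiguous frame can still be recovered later in the clip. The sketch below assumes OpenCV for decoding and reuses a per-image tagger such as the tag_image function sketched earlier; the sampling interval and vote threshold are arbitrary choices, not anyone’s published method.

```python
# Sample roughly one frame per second instead of all ~30, tag each sampled
# frame, and keep the tags that appear in enough frames.
from collections import Counter
import cv2  # OpenCV

def tag_video(path, tagger, seconds_between_samples=1.0, min_votes=3):
    cap = cv2.VideoCapture(path)
    fps = cap.get(cv2.CAP_PROP_FPS) or 30.0
    step = max(1, int(fps * seconds_between_samples))
    votes, index = Counter(), 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if index % step == 0:
            cv2.imwrite("/tmp/frame.jpg", frame)   # reuse a path-based image tagger
            votes.update(label for label, _ in tagger("/tmp/frame.jpg"))
        index += 1
    cap.release()
    # A tag missed in one ambiguous frame can still clear the vote threshold later.
    return [label for label, count in votes.items() if count >= min_votes]

# Example: tags = tag_video("clip.mp4", tag_image)   # hypothetical file
```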

 

Catherine Bolgar is a former managing editor of The Wall Street Journal Europe. For more from Catherine Bolgar and contributors from the Economist Intelligence Unit, along with industry experts, join the Future Realities discussion.

Photos courtesy of iStock

Search-Based Applications: What’s in it for e-biz?

By Morgan

For those of you who are unfamiliar with search-based applications (SBAs), I thought it would be helpful to provide a concrete example: in this case, an online SBA and an iPhone application based on that SBA.

While enterprise SBAs are often used to provide database-style information access and reporting – without the usability and performance constraints of direct database access – online and mobile SBAs often ‘mash up’ unstructured and structured content to create a pertinent and engaging experience for consumers.

First, the online SBA. It is a restaurant directory proof-of-concept, called Restminer, that mashes up source data including database content (restaurant listings in the directory database), Web content (photos, details like opening hours, prices, menus, payment options, etc.), and user-generated content (opinions, ratings, reviews, blogs, etc., also culled from the Web), with sentiment analysis applied to the aggregated content. It also incorporates geospatial data for mapping. The result is an ultra-rich directory that synthesizes a massive amount of information into a coherent, at-a-glance consumer dashboard.

And this dashboard evolves in real-time.

[Screenshot: the Restminer dashboard]
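
For readers who like to see the moving parts, here is a toy Python sketch of that kind of mashup: a structured directory listing enriched with scraped Web details, user reviews run through a deliberately naive sentiment scorer, and map coordinates, all rolled up into a single dashboard record. The field names and word lists are invented for illustration; this is not Exalead CloudView.

```python
# Toy mashup in the spirit of Restminer: one directory listing plus Web details,
# user-generated reviews with an aggregate sentiment score, and map coordinates.
from dataclasses import dataclass, field
from statistics import mean

POSITIVE = {"cozy", "delicious", "friendly", "authentic"}   # toy lexicon
NEGATIVE = {"slow", "bland", "noisy", "overpriced"}

def sentiment(text):
    words = text.lower().split()
    hits = [1.0 for w in words if w in POSITIVE] + [-1.0 for w in words if w in NEGATIVE]
    return mean(hits) if hits else 0.0

@dataclass
class DashboardEntry:
    listing: dict                                  # structured directory record
    web_details: dict                              # scraped hours, prices, menu, etc.
    reviews: list = field(default_factory=list)    # user-generated content
    latitude: float = 0.0
    longitude: float = 0.0

    def summary(self):
        return {
            **self.listing,
            **self.web_details,
            "review_count": len(self.reviews),
            "avg_sentiment": round(mean(map(sentiment, self.reviews)), 2) if self.reviews else None,
            "location": (self.latitude, self.longitude),
        }

entry = DashboardEntry(
    listing={"name": "Trattoria Roma", "category": "Italian"},
    web_details={"hours": "12:00-23:00", "price_range": "$$"},
    reviews=["Cozy room, delicious pasta", "Service was a bit slow"],
    latitude=45.50, longitude=-73.57,
)
print(entry.summary())   # the at-a-glance dashboard record
```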

Based on this POC, we developed an iPhone SBA for Yellow Pages Group Canada called Urbanizer. Urbanizer leverages the same sources as Restminer (YPG Database + Web & UGC content + Sentiment Analysis + Mapping), with one significant addition: Social Networking. The result is the industry’s first mood-based local search application.

Urbanizer combines search, sentiment analysis and social networking to help consumers find the perfect local restaurant according to their mood (“Tonight, I’m in the mood for an authentic, cozy Italian restaurant.”).

CloudView semantic processors and sentiment analyzers dynamically map restaurants to the types of meals to which they are well-suited (e.g., “Romantic Dinner,” “Hipster Snack,” or “Business Lunch”) and match each restaurant’s service, cuisine and ambiance to qualitative ratings like “homey,” “refined,” “casual,” “upscale,” etc.
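
As a purely illustrative stand-in for that mapping step, the snippet below assigns meal types and ambiance ratings from keyword cues found in aggregated review text. The vocabularies and rules are invented; CloudView’s actual semantic processors and sentiment analyzers are far more sophisticated than this.

```python
# Toy mapping of aggregated review text onto meal types and ambiance ratings.
import re

MEAL_TYPE_CUES = {
    "Romantic Dinner": {"candlelit", "intimate", "romantic", "wine"},
    "Business Lunch": {"quick", "quiet", "lunch", "wifi"},
    "Hipster Snack": {"vinyl", "latte", "vegan", "indie"},
}
AMBIANCE_CUES = {
    "homey": {"cozy", "family", "comfort"},
    "upscale": {"elegant", "refined", "tasting"},
    "casual": {"relaxed", "counter", "takeout"},
}

def classify(review_text):
    words = set(re.findall(r"[a-z]+", review_text.lower()))
    return {
        "meal_types": [m for m, cues in MEAL_TYPE_CUES.items() if words & cues],
        "ambiance": [a for a, cues in AMBIANCE_CUES.items() if words & cues],
    }

print(classify("Candlelit, intimate room with a refined wine list"))
# {'meal_types': ['Romantic Dinner'], 'ambiance': ['upscale']}
```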

A range of other quantitative and qualitative data is available to help users home in on the perfect dining experience: type of cuisine, proximity, price, ratings, friends’ recommendations, review details and more. With a live connection to Facebook and other social websites, users can also instantly share the information they’ve found with friends, family and colleagues.

In addition to this sharing of information within one’s own social network, Urbanizer simultaneously empowers users to build a knowledge base that benefits the public at large. As each Urbanizer member interacts with the database and their own social network to refine their search and share their experiences, a “mood map” of an entire city is constructed for the benefit of all Urbanizer users.

This type of rich, emotive search grounded in social networking carries great potential for numerous sectors, including hospitality, travel, entertainment, classifieds, and personal and business services.

What do you think about this?

Best,

Morgan

(Morgan Zimmerman is Exalead’s VP of Business Development.)

Exalead Acquisition Perspectives

By Kate

Acquisitions stir up all kinds of questions:

  • What does this mean for me, the customer?
  • What are the people like who work for the new company?
  • What do the employees think about it?

So I whipped out my Flip and went to speak about these sorts of things with folks from Exalead and Dassault Systèmes. Here’s a little video that gives you a flavor not only of their answers, but also a glimpse of the faces and emotion behind the acquisition.

[Embedded video]

To help you get more familiar with Exalead, what they do, and how this acquisition could translate for DS customers, I interviewed their VP of Business Development, Morgan Zimmerman. Enjoy!

Q1:  Exalead does a variety of things for very different user groups. What’s the best metaphor to understand Exalead as a single entity?

MZ:  Exalead represents the application of the best of the consumer Web to complex corporate legacy systems. It brings everything we’ve become accustomed to on the Web – real-time information, single text box access, natural language querying, a conversational mode of interacting with data, sub-second responsiveness, zero-training usage, etc., etc. – right into the enterprise.

 

Q2:  What’s your favorite Exalead product and why?

MZ:  That’s easy – we only have one product, Exalead Product! The same platform powers our public Web search engine, is embedded inside our OEM partners’ products, and supports our clients’ very diverse search-based applications.

However, I have to admit that I’m especially proud of some of our online deployments. The Web is the most demanding environment in terms of performance and innovation.  These include:

  • Exalead.com, which represents a 16-billion-page implementation of our software and features a powerful Web crawler, a panel of industrial-grade semantic modules, and an innovative navigation experience;
  • Voxalead, which performs speech-to-text transcription on daily news videos from hundreds of online sources – it’s a technology the French Presidency recently incorporated into its redesigned website;
  • and the Urbanizer application we developed for Yellow Pages Canada. Urbanizer is the world’s first mood-based search engine for restaurants.  It’s available as an iPhone application.

Q3:  What technical features excite your customers the most?

MZ:  It’s not the features that excite our clients; it’s the bottom line results. They’re blown away when they see performance and usability bottlenecks they’ve been struggling with for years simply dissolve away in a few weeks or a few months – even for the most complex environments.

Q4:  Why does the SBA market have such a bright future?

MZ:  The Internet has forever changed the game. Corporate users’ expectations for enterprise information systems have been permanently altered by their experience with the consumer Web. Only SBAs can meet these expectations and bridge the gap between the online and enterprise worlds. That is why search technologies are pivotal to developing the next generation of consumer-to-business applications.

Q5:  What type of ‘children’ do you wish for the Exalead-DS marriage to engender?

MZ:  Applications. Exalead has an infrastructure level information search and access technology. DS develops business applications. I’m looking forward to seeing a new generation of business applications come to life through the application of Exalead technology. And I’m excited to see what will happen when Exalead’s search technologies are combined with DS’s 3D technologies.


To Morgan, Laura, Christophe and Carole from Exalead, as well as Bruno, Laurent and Xavier from DS—merci beaucoup for having shared your perspectives! 

To all Exalead employees and customers, on behalf of the 3D Perspectives community, welcome to Dassault Systèmes!

And you, dear reader?  What do you think about the Exalead acquisition?

Best,

Kate


