Tuesday 6 December 2011

Siri's Mysterious Brain Fog
The dust-up over Siri's apparent inability to help users find abortion information has led some to wonder why the virtual assistant clams up at the mention of the issue. Regardless of the reason, the situation has given Apple a small taste of what it's like to come up with search results that some people don't agree with.

The performance of Apple's (Nasdaq: AAPL) iPhone voice-activated "personal assistant" when asked for information about abortion has created a tempest of controversy for the Cupertino, Calif., company, though the problem is likely a technological one, rather than part of some hidden political agenda.
However, the gaffe could create potential trust problems for Apple in the future.
The issue came to light when several news outlets began reporting complaints from iPhone 4S users that requests for abortion information were drawing a blank or, in some cases, returning references to pro-life clinics.
Apple did not respond to our request for comment on the Siri problem, but in a widely reported statement on the issue, the company said that while Siri can find a lot, it doesn't always find what users want. The company asserted that these are not intentional omissions, and it reiterated that Siri remains a beta product.
Search engine experts interviewed by MacNewsWorld agree with Apple's assessment of the situation.

Kindness of Others

Like Blanche DuBois, Siri depends on the kindness of others to answer questions asked of it. It doesn't have its own database of information from which it can extract answers, as a Google (Nasdaq: GOOG) or Bing does. It has to query other programs' databases for those answers.
Siri is a search agent, rather than a search engine, explained Jonathan Allen, director of Search Engine Watch.
"Search engines crawl the Web and try to index and rank the Web, while a search agent is programatically connected to other searchable entities," he told MacNewsWorld.
Siri can directly interface with search services like Google, Yelp and Wolfram Alpha and pull information from them for its answers to your questions, he added.
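That architecture is easy to picture in code. Below is a minimal, hypothetical sketch of the search-agent pattern Allen describes; the backend names and behaviors are invented stand-ins for the services Siri actually consults, not Apple's implementation.

    # A minimal sketch of the search-agent pattern: the agent keeps no index
    # of its own and simply forwards each query to registered outside
    # services. Backend names and behaviors are invented for illustration.

    from typing import Callable, Dict, List

    class SearchAgent:
        def __init__(self) -> None:
            self.backends: Dict[str, Callable[[str], List[str]]] = {}

        def register(self, name: str, backend: Callable[[str], List[str]]) -> None:
            self.backends[name] = backend

        def ask(self, query: str) -> List[str]:
            # Pool whatever the outside services return; the agent itself
            # ranks nothing and stores nothing.
            results: List[str] = []
            for name, backend in self.backends.items():
                results.extend(f"[{name}] {hit}" for hit in backend(query))
            return results

    agent = SearchAgent()
    agent.register("web", lambda q: [f"web result for '{q}'"])
    agent.register("local", lambda q: [])  # a local directory can come back empty
    print(agent.ask("where can I get an abortion?"))

The crucial property is that an empty answer from a backend looks to the agent exactly like "no such thing exists," which is the failure mode described next.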
In addition, Siri tries to determine whether a question has a local spin to it. If it does, Siri will try to find a local answer. That can produce some befuddling results.
That could be what happened with the abortion search. Siri may be checking a local database and, in some cases, finding no abortion clinics.
"The database doesn't come back with any answers almost certainly because abortion clinics don't call themselves 'abortion clinics,'" Search Engine Land Editor In Chief Danny Sullivan explained to MacNewsWorld.
The same thing would happen if you searched for a "tool store," he added. "When I do that, I don't get any answers," he elaborated, "and it's not because Siri is trying to prevent me from buying tools. It's because the tool stores aren't calling themselves 'tool stores.' They're calling themselves 'hardware stores.'"
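Sullivan's tool-store analogy is easy to reproduce with a toy directory. In the hypothetical sketch below, queries are matched only against the category each business files itself under, so a perfectly reasonable search term comes back empty; all of the listings are invented for illustration.

    # A toy local directory that matches queries against self-declared
    # business categories only -- no synonym handling. Listings are invented.

    listings = [
        {"name": "Ace Fasteners", "category": "hardware store"},
        {"name": "Bolt & Nut Co.", "category": "hardware store"},
    ]

    def find_local(term: str) -> list:
        return [b["name"] for b in listings if b["category"] == term]

    print(find_local("tool store"))      # [] -- no business calls itself that
    print(find_local("hardware store"))  # ['Ace Fasteners', 'Bolt & Nut Co.']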

Ethical Question

Another possible source of the problem is that Siri, which is still in beta, simply couldn't parse the abortion question correctly, Allen added. Whatever caused the glitch, he continued, "it's not a moral thing."
Ironically, if the abortion query had been shipped off to Google instead of to a local database, the current controversy might never have developed. A search for "Where can I get an abortion?" on that engine produces a directory of abortion clinics high up in its search results.
While Siri may not be a search engine in the mold of Google and Bing, this latest dust-up is giving Apple a taste of what the search engine set has gone through for some time. "These aren't new things," Sullivan observed. "Google has been dealing with these things for years."
"People get upset with Google because you can do a search that tells you how to pirate software, or you do a search that can tell you how to kill people or how to make bombs," he added.
However, not being a pure search engine could open Apple up to a different kind of criticism in the future. That's because there's money to be made from referring Siri users to the databases it's using to answer questions.
"The ethical question is," Sullivan said, "are you pointing to database that are the best for the user or are you pointing to databases because you're making the most money from them?"
