In my previous post I discussed search engines in general and how they build the index table that sits at the core of any search engine. One thing that became very clear from that reading is that the search engines available today are quite limited in functionality. Keyword search does not leave much room for returning relevant results. If we enter Paris Hilton in Google, we also get the Hilton Hotel in Paris returned in the results, and on the first page at that. The search engine cannot tell that we are not looking for a Hilton in Paris but for Paris Hilton the celebrity. Conversely, if we enter Hilton Paris we also get Paris Hilton in our results. Either way, part of what comes back is simply not relevant to what we are looking for.
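To see why plain keyword matching cannot tell the two apart, here is a tiny toy sketch (my own made-up example, certainly not how Google works internally): once a query is reduced to an unordered bag of words, "Paris Hilton" and "Hilton Paris" become the same query and match the same documents.

# Toy illustration: a bag-of-words keyword index cannot distinguish
# "Paris Hilton" from "Hilton Paris" because word order is thrown away.
documents = {
    "doc1": "Paris Hilton attends a celebrity party in Los Angeles",
    "doc2": "Book a room at the Hilton hotel in Paris near the Eiffel Tower",
}

def terms(text):
    # Lowercase and split on whitespace -- the crudest possible tokenizer.
    return set(text.lower().split())

def keyword_search(query):
    query_terms = terms(query)
    # A document matches if it contains every query term, order ignored.
    return [doc_id for doc_id, text in documents.items()
            if query_terms <= terms(text)]

print(keyword_search("Paris Hilton"))   # ['doc1', 'doc2']
print(keyword_search("Hilton Paris"))   # ['doc1', 'doc2'] -- identical result

Both queries return both documents, so the hotel page shows up for the celebrity query and vice versa, which is exactly the behaviour described above.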
Last night I was reading about Latent Semantic Indexing (LSI), and that did give me some hope. I found a page at SEOBook that explains LSI in much simpler terms. There are other references as well, but apart from Wikipedia this is the one page that explains it in layman's terms.
But the million-dollar question we are faced with is whether LSI will take away the pain of wading through irrelevant results when we query a search engine. In my opinion that is still not clear, because the LSI algorithm is still based on the keywords found in the documents, only grouped by how often they co-occur. That alone is not going to take the pain away unless we use semantic search. But then why semantic search?
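For the curious, here is a rough numpy sketch of the idea behind LSI as I understand it from those write-ups: build a term-document matrix, factor it with a truncated SVD, and compare queries to documents in the reduced latent space. The three little documents and the choice of two latent dimensions below are made up purely for illustration.

# A toy LSI run: term-document counts -> truncated SVD -> query folding.
import numpy as np

docs = [
    "paris hilton celebrity party photos",
    "hilton hotel paris booking rooms",
    "celebrity gossip party photos",
]

# Build the vocabulary and the term-document count matrix A (terms x docs).
vocab = sorted({w for d in docs for w in d.split()})
A = np.array([[d.split().count(w) for d in docs] for w in vocab], dtype=float)

# Truncated SVD: keep only k latent dimensions.
U, s, Vt = np.linalg.svd(A, full_matrices=False)
k = 2
Uk, sk, Vtk = U[:, :k], s[:k], Vt[:k, :]

def fold_in(text):
    # Map a query into the latent space: q_hat = diag(1/sigma_k) @ Uk.T @ q
    q = np.array([text.split().count(w) for w in vocab], dtype=float)
    return np.diag(1.0 / sk) @ Uk.T @ q

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

q_hat = fold_in("paris hilton")
for i, d in enumerate(docs):
    print(f"{d!r:45s} similarity = {cosine(q_hat, Vtk[:, i]):.3f}")

Notice that in this toy corpus the hotel document still comes back with a high similarity score alongside the celebrity one, because LSI only looks at which terms co-occur; it has no notion of who or what "Paris" actually refers to. That is precisely the limitation I mean.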
Semantic search, as most of us know, is based on the meanings conveyed by objects, and the term "meaning" has more depth than appears on the surface. Semantic search is not new; it has been around for centuries. In ancient times philosophers gave the world the mantra for how to search by meaning; it is just that only a handful of people (technologists) today take the pains to read through that literature. Where current search engines fail is in restricting the results to what the user actually wants, because all we are allowed to input is a bunch of keywords.
In semantic search the driving factor is context, since different terms (or concepts, as John F. Sowa describes them) carry different meanings or interpretations depending on where they are used. If we build a search engine around these philosophies, then we can definitely achieve semantic search, to a great extent.
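As a toy illustration of the context idea: suppose each indexed term were tagged with the concept it refers to (PERSON, CITY, HOTEL, and so on) instead of being stored as a bare keyword. The concept labels and documents below are entirely made up; a real semantic search engine would need an ontology and a disambiguation step to produce such tags.

# Hypothetical concept-tagged index: documents store (term, concept) pairs.
indexed = {
    "doc1": {("paris hilton", "PERSON"), ("party", "EVENT")},
    "doc2": {("hilton", "HOTEL"), ("paris", "CITY")},
}

def concept_search(query_concepts):
    # A document matches only if it contains the same term-concept pairs,
    # so "Paris Hilton the person" no longer matches "a Hilton in Paris".
    return [doc for doc, concepts in indexed.items()
            if query_concepts <= concepts]

print(concept_search({("paris hilton", "PERSON")}))              # ['doc1'] only
print(concept_search({("hilton", "HOTEL"), ("paris", "CITY")}))  # ['doc2'] only

With context attached, the two readings of the same keywords finally lead to two different results.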
Until Next Time... :)