System and method for text disambiguation and context designation in incremental search

- Veveo, Inc.

Methods and systems for text disambiguation and context designation in incremental search are provided. A method for selecting items in response to ambiguous keystrokes entered by a user and unambiguous metadata associated with a previously selected search result includes receiving ambiguous keystrokes and, based on the ambiguous keystrokes, selecting and presenting a first subset of items together with metadata associated with the presented items. The method also includes receiving a selection of one of the items from the user and, in response to a locking operation received from the user, locking in fixed relation at least one of the ambiguous keystrokes to at least one metadata term associated with the selected item. The method further includes, subsequent to receiving the locking operation, selecting a second subset of items based at least in part on the locked metadata term and presenting the second subset of items.

Description
CROSS-REFERENCE TO RELATED APPLICATIONS

This application claims the benefit under 35 U.S.C. §119(e) of the following application, the contents of which are incorporated by reference herein:

U.S. Provisional Application No. 60/940,244, entitled System and Method for Text Disambiguation and Context Designation in Incremental Search, filed May 25, 2007.

BACKGROUND

1. Field of the Invention

The present invention relates to user interfaces for searching and browsing and, more specifically, to user interfaces that are intended to operate on input-constrained devices and to provide relevant search results with a minimum of user interaction.

2. Discussion of Related Art

Discovery of desired content is not always as simple as searching for a person, place, or object and selecting the desired results. The user's intent may be deeper and broader, and retrieving the desired results might require more than merely providing more search terms. For example, a user may want to first discover a particular person, place, or entity, then browse through results in that context, and finally make further refinements. In this case, the search system would need to be able to infer that the user is searching or browsing in a particular context, and serve content related to that context that satisfies the user's intent. A user progressively adding search terms may be misinterpreted as an attempt to intersect multiple interests rather than as a context-based search. Some search engines have attempted to define specific grammars for users to specify a context search, but these grammars are often complex and idiosyncratic, and thus only experienced and advanced users can use them effectively. The expression and discovery of intent is further complicated by the possibility that a query may be entered using ambiguous keypad input (e.g., typed on a cellular phone using ambiguous keys).

SUMMARY OF THE INVENTION

The invention provides methods of and systems for text disambiguation and context designation in incremental search.

Under one aspect of the invention, a user-interface method for selecting a subset of items from a relatively large set of items in response to search criteria including ambiguous keystrokes entered by a user from a keypad with overloaded keys and including unambiguous metadata associated with a previously selected search result includes receiving ambiguous keystrokes entered by a user from a keypad with overloaded keys. A given key of the keypad is in fixed association with a plurality of alphabetical and numerical symbols the user is using to search for desired items. In response to receiving the ambiguous keystrokes, the method selects and presents, based on the ambiguous keystrokes, a first subset of items and corresponding unambiguous metadata associated with the presented items. The method also includes receiving a selection of one of the items of the first subset of items from the user, and, in response to a locking operation received from the user, locking in fixed relation at least one of the ambiguous keystrokes to at least one unambiguous metadata term associated with the selected item. The method further includes, subsequent to receiving the locking operation, selecting a second subset of items based at least in part on the locked unambiguous metadata term and presenting the second subset of items.

Under another aspect of the invention, the unambiguous metadata term locked in fixed relation to the at least one of the ambiguous keystrokes is one of the presented metadata terms associated with the selected item of the first subset. The locked metadata term lexically disambiguates the items of the first subset from each other for the subsequent selecting and presenting step.

Under a further aspect of the invention, the unambiguous metadata term locked in fixed relation to the at least one of the ambiguous keystrokes is a metadata term describing a concept associated with the selected item of the first subset. Only items associated with the concept of the selected item are selected and presented in the subsequent selecting and presenting step.

Under yet another aspect of the invention, the method also includes receiving a browse action from the user for highlighting one of the presented items of the first subset. In response to the user browse action, the method transforms at least part of the ambiguous keystrokes into at least one unambiguous metadata term associated with the highlighted item. At least some of the characters of the unambiguous metadata term match the alphabetical and numerical symbols in fixed association with the ambiguous keystrokes entered by the user.

Under a still further aspect of the invention, the receiving a selection of one of the items of the first subset of items includes receiving a browse action from the user for highlighting one of the presented items of the first subset. In addition, receiving a locking operation from the user includes receiving at least one additional keystroke entry from the user.

Under another aspect of the invention, the at least one additional keystroke entry from the user is a keystroke for performing an explicit lock operation.

Under yet another aspect of the invention, the at least one additional keystroke entry from the user includes additional ambiguous keystrokes entered by the user for providing additional alphabetical or numerical symbols for searching for desired items.

Under a further aspect of the invention, systems including logic for performing the methods above are provided.

BRIEF DESCRIPTION OF THE DRAWINGS

For a more complete understanding of various embodiments of the present invention, reference is now made to the following descriptions taken in connection with the accompanying drawings in which:

FIG. 1 is a network diagram that illustrates the various components of a search system, according to certain embodiments of the invention.

FIG. 2 is a schematic diagram that depicts the various components of a user device, according to certain embodiments of the invention.

FIG. 3 is a user-interface diagram that depicts the various components of the search interface, according to certain embodiments of the invention.

FIG. 4 is a flowchart that illustrates the operation of a search system, according to certain embodiments of the invention.

FIG. 5 is a user interface diagram that illustrates the operation of the navigation, synchronization, and LEXICAL LOCK features, according to certain embodiments of the invention.

FIG. 6 is a user interface diagram that illustrates the operation of the CONCEPT LOCK feature, according to certain embodiments of the invention.

DETAILED DESCRIPTION

Embodiments of the invention described here enable a user to disambiguate otherwise ambiguous and/or incomplete text query entries based on terms and metacontent associated with search results that are determined by a search engine to be relevant to the user's query input. A user interface incorporating the techniques disclosed herein can use an ambiguous keypad (e.g., a keypad with overloaded keys) or an unambiguous keypad to receive a search query input from a user. The input query symbols may be, for example, single numeric characters (e.g., on an ambiguous keypad) or single text-alphabet characters (e.g., on an unambiguous QWERTY keypad). Embodiments of the invention can also be used with incremental search techniques, in which results are retrieved as each character is typed.

Techniques for selecting a set of results responsive to the user's query include, but are not limited to, those disclosed in U.S. patent application Ser. No. 11/235,928, entitled Method and System For Processing Ambiguous, Multi-Term Search Queries, filed Sep. 27, 2005, U.S. patent application Ser. No. 11/136,261, entitled Method and System For Performing Searches For Television Content Using Reduced Text Input, filed May 24, 2005, and U.S. patent application Ser. No. 11/246,432, entitled Method and System For Incremental Search With Reduced Text Entry Where The Relevance of Results is a Dynamically Computed Function of User Input Search String Character Count, filed Oct. 7, 2005, all of which are herein incorporated by reference. Similarly, lists of relevant results can be displayed using techniques disclosed in U.S. patent application Ser. No. 12/123,940, entitled Method and System for Search with Reduced Physical Interaction Requirements, filed on May 20, 2008, incorporated by reference herein.

While the user is composing a text query on an ambiguous keypad, the user's query, in general, can be said to be ambiguous (in the sense of the symbol being from an overloaded numeric keypad, where each key corresponds to a set containing more than one character, as is standard in cellular telephones) as well as incomplete (in the sense that one or more words in the query could be incomplete). For an illustration of ambiguous and incomplete queries and query-words, consider the following example. Suppose that the user's intended complete and unambiguous query is “engelbert humperdinck biggest hit.” An incomplete but unambiguous version of the same query is “engel hump bigg hit” because the first three query-words in the latter query are prefixes of the respective first three query-words in the complete and unambiguous query. Similarly, an incomplete as well as ambiguous version of the same query would be entered by pressing the keys labeled “36435 4867 2444 448” (assuming this query is entered using a standard numeric keypad of the kind commonly found in telephones and/or television remote controls) because the query-words “36435,” “4867,” and “2444” are prefixes of the numeric versions of the first three query-words in the complete and unambiguous query. The words “humpback” and “humperdinck” both match the incomplete query-word “hump,” because “hump” is a prefix of both the words. The words “humpback” and “humperdinck” both match the ambiguous and incomplete query-word “4867,” because “4867” is an ambiguous prefix of the complete and ambiguous query-words “48672225” and “48673734625” (which match “humpback” and “humperdinck,” respectively).
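
By way of illustration only, the following Python sketch shows how a word reduces to its overloaded-key sequence and how an ambiguous, incomplete query term matches a word when the term is a prefix of that sequence. The key map and function names are assumptions introduced here for illustration; they are not drawn from the specification.

```python
# Illustrative sketch: a standard telephone keypad mapping and a prefix test
# for ambiguous, incomplete query terms.
KEYPAD = {
    "2": "abc", "3": "def", "4": "ghi", "5": "jkl",
    "6": "mno", "7": "pqrs", "8": "tuv", "9": "wxyz",
}
CHAR_TO_KEY = {ch: key for key, chars in KEYPAD.items() for ch in chars}

def word_to_keys(word: str) -> str:
    """Reduce a word to the overloaded-key sequence that would produce it."""
    return "".join(CHAR_TO_KEY[ch] for ch in word.lower() if ch in CHAR_TO_KEY)

def matches_ambiguous_prefix(query_term: str, word: str) -> bool:
    """True if the ambiguous, possibly incomplete term is a prefix of the word's key sequence."""
    return word_to_keys(word).startswith(query_term)

assert word_to_keys("engel") == "36435"
assert matches_ambiguous_prefix("4867", "humpback")      # full sequence is 48672225
assert matches_ambiguous_prefix("4867", "humperdinck")   # full sequence is 48673734625
assert not matches_ambiguous_prefix("4867", "biggest")
```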

Preferred embodiments of the present invention address several usability problems. First, preferred embodiments allow users to press each key only once to enter a specific character, even if the key is associated with multiple characters (as on an overloaded keypad). Second, preferred embodiments permit users to type only a partial prefix of each search term. Finally, preferred embodiments allow for the progressive refinement of search queries in a context-sensitive way.

The techniques described herein provide methods for partially automated completion, disambiguation, and progressive refinement of search queries by using an iterative search-browse-select process. In most cases, this approach reduces the number of steps in reaching the desired result, by eliminating separate disambiguation and context-narrowing steps.

In a preferred embodiment of the present invention, the query system maintains four entities in a computer-readable data structure. The search-query is a data structure that contains the contents of the query input box in the user interface. The clone-query is a secondary query storage structure that allows the restoration of previous queries after the primary search-query has been changed. Unlike the search-query, the contents of the clone-query are not directly displayed to the user; this data structure is maintained by the search system for internal use. The context-list is a data structure that contains information that is used to limit the search space from which the search system will retrieve results. Finally, the result-list is a data structure that holds the results that the system has determined are relevant to the user's query and query context. The manipulation and use of these four structures is described in greater detail below.

The search-query contains a set of query terms, which may be either direct-input query terms or locked query terms. Direct-input query terms are those query terms (that could be incomplete and/or ambiguous) that have been input by the user using the keypad. Locked query-words are query terms that have been placed into the search-query automatically as a result of a “lock” operation. Lock operations are described in greater detail below, but in general, a locked query term is a word that the user interface has put into the search-query in place of a user-entered ambiguous and/or incomplete query term portion. These locked query terms can come from metacontent associated with a particular search result returned by a search engine.
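
By way of illustration only, the four entities and the two kinds of query terms might be modeled as follows. The class and field names are assumptions introduced here; the specification does not prescribe a particular representation.

```python
# Illustrative sketch of the four entities maintained by the query system.
from dataclasses import dataclass, field
from typing import List

@dataclass
class QueryTerm:
    text: str              # e.g., "4867" for a direct-input term, "Engelbert Humperdinck" for a locked term
    locked: bool = False   # True once a lock operation has substituted unambiguous metacontent

@dataclass
class QueryState:
    search_query: List[QueryTerm] = field(default_factory=list)  # contents of the query input box
    clone_query: List[QueryTerm] = field(default_factory=list)   # internal copy used to restore prior queries
    context_list: List[int] = field(default_factory=list)        # identifiers that limit the search space
    result_list: List[str] = field(default_factory=list)         # results relevant to the query and context
```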

FIG. 1 illustrates the various components of a search system, according to certain embodiments of the invention. A server farm [100] serves as a source of search data and relevance updates with a network [105] functioning as the distribution framework. The distribution framework might be a cable television network, a satellite television network, an IP-based network, or any other type of widely-used networking framework. It may be a wired network or a wireless network, or a hybrid network that uses both of these communication technologies. The search devices are preferably hand-held devices with limited display and input capabilities, such as a hand-held PDA [101], a remote control [115b] that interfaces with a television [115a], or any other input- and output-constrained mobile device (e.g., a cellular phone).

FIG. 2 is a diagram that depicts the various components of a user device, according to certain embodiments of the invention. The user device communicates with the user via a display [201] and a keypad [204]. This keypad may be an overloaded keypad that produces ambiguous text input. Computation is performed using a processor [202] that stores temporary information in a volatile memory store and persistent data in a persistent memory store [206]. Either or both of these memory stores may hold the computer instructions for the processor to perform the logic described herein. The device is operable to connect to a remote system using a remote connectivity module [205].

FIG. 3 is a user-interface diagram that depicts the various components of the search interface, according to certain embodiments of the invention. Box [300] represents the screen of the user device. At the top of the screen is the query input box [301]. As described above, the query input box displays the current contents of the search-query data structure. In FIG. 3, the search-query consists of the ambiguous query term corresponding to the keystrokes “36435” on an overloaded telephone-style keypad, where each number key is associated with multiple characters. The portion [302] of the screen below the query input box [301] is used to display the contents of the result-list. If the result-list is empty, no results are displayed. Otherwise, this portion of the screen [302] is subdivided into rows [303], each of which displays information about a particular search result. In addition to the title of the result, this information may include metadata relevant to the result. In fact, the title of the result is itself only one example of metadata relevant to the result. For example, in FIG. 3 the words “(Music/Multimedia)” appear next to “Engel” in order to provide additional context to the user. The user may use a keypad navigation interface to browse through the result-list. When the user navigates to a particular row, that row is highlighted. When none of the rows are selected, the query input box is highlighted (illustrated in FIG. 3 as a shaded background with white text [301]).

As described above, the techniques described herein may be used with devices that have overloaded keypads. In FIG. 3, the ambiguous and incomplete search term “36435” has been entered using an overloaded numeric telephone keypad. Using the techniques described in the patent applications referenced above, the system may automatically generate various completions and disambiguations of the search query. In this example, both “engel” and “fogel” are selected as possible disambiguations of “36435,” and the set of suggested completions includes “Fogelburg,” “Engelbert,” and “Engelke,” among others. The portion of the suggested completion that matches the ambiguous query term is here shown in boldface and underlined.
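
By way of illustration only, the suggestion rows of FIG. 3 could be produced as sketched below, with the matched prefix of each completion split off so that the interface can render it in boldface or underline. The catalog contents and helper names are assumptions introduced here, and the sketch assumes every character of the matched prefix maps to a key.

```python
# Illustrative only: complete an ambiguous term against a small catalog and
# mark the matched prefix for emphasis in the result rows.
KEYS = {"2": "abc", "3": "def", "4": "ghi", "5": "jkl",
        "6": "mno", "7": "pqrs", "8": "tuv", "9": "wxyz"}
TO_KEY = {c: k for k, cs in KEYS.items() for c in cs}

def to_keys(word: str) -> str:
    return "".join(TO_KEY[c] for c in word.lower() if c in TO_KEY)

def suggest(term: str, catalog):
    """Yield (matched_prefix, remainder) pairs for catalog words matching the ambiguous term."""
    for word in catalog:
        if to_keys(word).startswith(term):
            yield word[:len(term)], word[len(term):]

for matched, rest in suggest("36435", ["Engelbert", "Engelke", "Fogelburg", "Dylan"]):
    print(f"[{matched}]{rest}")   # prints [Engel]bert, [Engel]ke, [Fogel]burg, one per line
```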

FIG. 4 is a flowchart that illustrates the operation of a search system, according to certain embodiments of the invention. When the system is initialized, the search-query, the clone-query, the result-list, and the context-list are empty [401]. Although not shown in FIG. 4, the search interface optionally provides a means for the user to return to this initial state at any time during the search process. At this point, the system waits for the user to begin searching by entering [402] a character into the query input box [301]. After the user enters a character, it is added to the search-query. If the rightmost query term in the search-query is a direct-input query term, the character is appended to this query term. The user may begin entering a new direct-input query term by inserting a space (or any other appropriate delimiter). Following the user's character entry, the new search-query is optionally submitted to the search engine [404], without requiring the user to explicitly launch the query. At this point, the contents of the search-query are copied to the clone-query, and the results of the search are displayed [404].
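
By way of illustration only, the character-entry path of FIG. 4 might be sketched as below; for brevity, query terms are modeled here as simple (text, locked) pairs, and the search-engine call is left as a placeholder for the techniques of the applications incorporated by reference. The function names are assumptions introduced here.

```python
# Illustrative sketch of the character-entry step of FIG. 4.
def enter_character(state: dict, ch: str) -> None:
    terms = state["search_query"]                  # list of (text, locked) pairs
    if ch == " ":                                  # a delimiter begins a new direct-input term
        terms.append(("", False))
        return
    if terms and not terms[-1][1]:                 # rightmost term is direct-input: append the character
        text, locked = terms[-1]
        terms[-1] = (text + ch, locked)
    else:                                          # otherwise start a new direct-input term
        terms.append((ch, False))
    results = incremental_search(terms, state["context_list"])   # launched without an explicit "go"
    state["clone_query"] = list(terms)             # snapshot used later by REVERT
    state["result_list"] = results

def incremental_search(terms, context_list):
    """Placeholder for the retrieval techniques of the applications incorporated by reference."""
    return []

state = {"search_query": [], "clone_query": [], "context_list": [], "result_list": []}
for key in "36435":
    enter_character(state, key)
```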

At this point, the user may enter another character to further refine the search query [402] or navigate [405] to one of the displayed results [304]. When the user navigates to a displayed result, the result is highlighted, and the incomplete or ambiguous terms in the search-query are synchronized to the highlighted result. In the context of FIG. 3, the user's navigating to the “Engelbert Humperdinck” result [304] would cause the search query to change from “36435” to “Engelbert Humperdinck.” Only the search-query is synchronized to the highlighted result; the clone-query remains unchanged. If the user navigates to a different result, the search-query is restored from the contents of the clone-query and re-synchronized to the newly highlighted result. The process of synchronization is described in more detail below.

Having navigated [405] to a search result, the user is presented with four options. First, if desired, the user may select the highlighted result [406]. In preferred embodiments, selecting the result might instruct the system to retrieve the associated document and open it using an appropriate application. For example, depending on the type of result selected, the system might retrieve and open it using a web-browser, a video player, a text reader, etc.

Second, the user may trigger a REVERT operation [403]. This option will cause the contents of the clone-query to be copied into the search-query, restoring it to its original state. Also, it will un-highlight the currently highlighted result and move the input focus back to the query input box. Thus, a REVERT operation returns the search interface to the state it was in before the user navigated to a specific search result [405].

Third, the user may trigger a LEXICAL LOCK operation [407]. The user may perform a LEXICAL LOCK in order to accept the synchronized search-query and launch a new query using the disambiguated terms. This option will cause the contents of the search-query to be copied into the clone-query. After this occurs, it is no longer possible to restore the initial, ambiguous search-query using a REVERT operation. As described above, navigating to the “Engelbert Humperdinck” result in FIG. 3 [304] would cause the search-query to synchronize to “Engelbert Humperdinck.” If, at this point, the user were to trigger a LEXICAL LOCK operation, a new search for the unambiguous term “Engelbert Humperdinck” would be performed, eliminating disambiguations like “Engel” and partial matches like “Fogelburg.” The LEXICAL LOCK operation is further described below.
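
By way of illustration only, REVERT and LEXICAL LOCK can be viewed as mirror-image copies between the two query structures, as sketched below using the same (text, locked) pair convention; the names and the example search function are assumptions introduced here.

```python
# Illustrative only: REVERT restores the pre-navigation query, while LEXICAL LOCK
# accepts the synchronized, unambiguous query and makes it the new baseline.
def revert(state: dict) -> None:
    state["search_query"] = list(state["clone_query"])    # discard the synchronized terms
    state["highlighted_result"] = None                    # focus returns to the query input box

def lexical_lock(state: dict, search) -> None:
    state["clone_query"] = list(state["search_query"])    # the ambiguous query can no longer be restored
    state["result_list"] = search(state["search_query"], state["context_list"])

# Example: after "36435" has synchronized to "Engelbert Humperdinck", a LEXICAL LOCK
# relaunches the search with the unambiguous term only.
state = {"search_query": [("Engelbert Humperdinck", True)],
         "clone_query": [("36435", False)],
         "context_list": [], "result_list": [], "highlighted_result": None}
lexical_lock(state, lambda terms, ctx: ["What a Wonderful World", "Hänsel und Gretel"])
```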

Fourth, the user may choose to trigger a CONCEPT LOCK operation [408]. A CONCEPT LOCK is intended to address situations in which metadata fails to sufficiently distinguish results that represent fundamentally different “concepts.” For example, the query terms “Engelbert Humperdinck” might refer to Engelbert Humperdinck the popular singer born in 1936, or they might refer to Engelbert Humperdinck the well-known composer of German Opera who lived in the 19th century. In this case, topical metadata may be unable to distinguish between these two possibilities, since both Engelbert Humperdincks would likely be indexed under terms like “Composer,” “Musician,” “Singing,” etc.

To overcome this problem, search results are manually associated with Global Identifiers (GIDs) that correspond to various “concepts.” These identifiers make it possible to distinguish between two separate concepts that happen to be associated with similar metadata. For example, Engelbert Humperdinck the singer might be associated with GID 500, while results about Engelbert Humperdinck the composer might have GID 510. Navigating to a result with GID 500 and triggering a CONCEPT LOCK will cause the selected GID (i.e. 500) to be stored in the context-list. Preferably, when launching a query, the system will pass the contents of the context-list to the search engine, thereby ensuring that only results related to GID 500 will be returned. Results about Engelbert Humperdinck the composer, though they may be associated with similar metadata, will not be included in the result-list because they are not associated with GID 500. The synchronization process and the LEXICAL LOCK and CONCEPT LOCK operations are further illustrated below.
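
By way of illustration only, the CONCEPT LOCK mechanism might be sketched as follows. The GID values 500 and 510 follow the example above; the data layout and function names are assumptions introduced here.

```python
# Illustrative only: results carry concept GIDs, and a CONCEPT LOCK narrows
# future searches to results sharing the locked GIDs.
RESULTS = [
    {"title": "What a Wonderful World", "gids": {500}},   # Humperdinck the popular singer
    {"title": "Hänsel und Gretel",      "gids": {510}},   # Humperdinck the composer
]

def concept_lock(context_list: set, highlighted: dict) -> None:
    """Add the highlighted result's concept identifiers to the context-list."""
    context_list |= highlighted["gids"]

def filter_by_context(results, context_list: set):
    """Keep only results associated with every locked concept GID."""
    if not context_list:
        return list(results)
    return [r for r in results if context_list <= r["gids"]]

context = set()
concept_lock(context, RESULTS[1])                         # lock onto GID 510 (the composer)
print([r["title"] for r in filter_by_context(RESULTS, context)])
# ['Hänsel und Gretel']
```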

FIG. 5 is a user interface diagram that illustrates the operation of the navigation, synchronization, and LEXICAL LOCK features described above. Screen I [300] shows the search interface as it appeared in FIG. 3. The user may then use the device's keypad to navigate [501] to the “Engelbert Humperdinck” result. As shown in Screen II [510], this result is highlighted [304], and the search-query (displayed in the query input box [301]) is synchronized to “Engelbert Humperdinck.” During the synchronization process, each direct-input query term in the search-query that matches a complete and unambiguous term (or phrase, such as “Engelbert Humperdinck”) in the highlighted result's metacontent is removed and the corresponding complete and unambiguous term (or phrase) is put in place of the corresponding direct-input query term as a locked query term. Thus, because “Engelbert Humperdinck” matches “36435,” the direct-input query term “36435” is replaced by the locked query term “Engelbert Humperdinck” in the search-query.
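
By way of illustration only, the synchronization step might be sketched as below. Matching is simplified to a prefix test on the numeric key sequence of each metacontent term or phrase; the key map and helper names are assumptions introduced here.

```python
# Illustrative sketch: replace each direct-input term with the metacontent term
# or phrase of the highlighted result that it ambiguously matches.
KEYS = {"2": "abc", "3": "def", "4": "ghi", "5": "jkl",
        "6": "mno", "7": "pqrs", "8": "tuv", "9": "wxyz"}
TO_KEY = {c: k for k, cs in KEYS.items() for c in cs}

def to_keys(text: str) -> str:
    return "".join(TO_KEY[c] for c in text.lower() if c in TO_KEY)

def synchronize(search_query, metacontent):
    """search_query: list of (text, locked) pairs; metacontent: unambiguous
    terms/phrases associated with the highlighted result."""
    synced = []
    for text, locked in search_query:
        if locked:
            synced.append((text, True))
            continue
        match = next((m for m in metacontent if to_keys(m).startswith(text)), None)
        synced.append((match, True) if match else (text, False))
    return synced

print(synchronize([("36435", False)], ["Engelbert Humperdinck", "Music/Multimedia"]))
# [('Engelbert Humperdinck', True)]
```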

Having navigated to the highlighted result [304], the user may enter more ambiguous characters [521]. This automatically moves the focus (i.e., the highlighted item) to the query input box, and the ambiguous characters “244” are added to the end of the search-query. This also performs an implicit LEXICAL LOCK on the query term “Engelbert Humperdinck.” The search system automatically launches the new search-query in the search engine and returns results that are related to the locked query term “Engelbert Humperdinck” and the ambiguous query term “244.” The state of the interface after this search is shown in Screen III [520]. At the end of each row, the system may optionally display the metadata matched by the ambiguous query term. For example, after the result “And I Love Him” [522], the word “cigarettes” appears in parentheses, indicating that this result was selected because it is associated with “cigarettes,” which matches the ambiguous query term “244.” Optionally, the portion of the metadata that matches the ambiguous query term may be set off from the rest of the text. In Screen III [520], the matching portion of the metadata is underlined.
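
By way of illustration only, the implicit LEXICAL LOCK that occurs when further characters are typed while a result is highlighted might be sketched as below; the names are assumptions introduced here, and the remaining characters of “244” would extend the new term through the ordinary character-entry path.

```python
# Illustrative only: typing while a result is highlighted implicitly locks the
# synchronized terms, starts a new ambiguous term, and relaunches the search.
def enter_while_highlighted(state: dict, ch: str, search) -> None:
    state["highlighted_result"] = None                    # focus moves back to the query input box
    state["clone_query"] = list(state["search_query"])    # implicit LEXICAL LOCK of the synchronized terms
    state["search_query"].append((ch, False))             # begin a new direct-input term ("2" of "244")
    state["result_list"] = search(state["search_query"], state["context_list"])

state = {"search_query": [("Engelbert Humperdinck", True)], "clone_query": [],
         "context_list": [], "result_list": [],
         "highlighted_result": "Engelbert Humperdinck"}
enter_while_highlighted(state, "2", lambda terms, ctx: ["And I Love Him", "What a Wonderful World"])
```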

To fully disambiguate the search-query, the user may perform a LEXICAL LOCK operation. In the scenario shown in FIG. 5, the user navigates to the “What a Wonderful World” result and triggers a LEXICAL LOCK [521]. As described above, navigating to the “What a Wonderful World” result causes the search-query to synchronize its direct-input query terms to the corresponding locked query terms. In this case, the direct-input term “244” is synchronized to “biggest.” The LEXICAL LOCK operation makes this change permanent, fully disambiguating the search-query, as shown in Screen IV [530].

FIG. 6 is a user interface diagram that illustrates the operation of the CONCEPT LOCK feature. Screen III [520] (identical to Screen III in FIG. 5) lists results that are relevant to the locked query term “Engelbert Humperdinck” and the direct-input term “244.” This listing includes terms that are relevant to Humperdinck the popular singer (e.g. “What a Wonderful World” [523], the title of a song performed by Humperdinck) and also terms that are relevant to Humperdinck the German composer (e.g., “Hänsel und Gretel” [611], the name of his most famous opera). As explained above, it is difficult to distinguish between these two concepts using metadata alone. This type of distinction is facilitated by the CONCEPT LOCK operation.

For example, in FIG. 6, suppose the user is searching for results related to Humperdinck the composer and not Humperdinck the singer. In order to narrow the scope of the search to the composer, the user would first navigate to a search result related to the desired concept [601]. In this case, the user selects the result titled “Hänsel und Gretel” [611]. Screen IVa [610] depicts the state of the interface after the user has navigated to this result.

At this point, the user triggers a CONCEPT LOCK [611], which limits the query to concepts related to the selected result. As explained above, search terms may be associated with an arbitrary number of GIDs that correspond to various concepts. When the system performs a CONCEPT LOCK, the GIDs associated with the current result are added to the context-list. For example, performing a CONCEPT LOCK on “Hänsel und Gretel” might add the GID corresponding to the concept “Humperdinck the German Composer” to the context-list. By limiting future searches to this concept, the system is able to filter out unwanted search results about Engelbert Humperdinck the popular singer. CONCEPT LOCK operations may be performed explicitly (e.g., in response to the user pressing a button) or implicitly by the search system.

The database used to associate concept GIDs with search terms may be stored and maintained by either the search engine or the client device. If maintained by the search engine, the client device would submit the current context-list to the search engine together with the search-query. The search engine would then return only those results that are relevant to the concept GIDs contained in the context-list. Alternatively, the client device may maintain a database of GIDs in which each GID is associated with a set of pre-constructed queries. In this case, the client device will send these pre-constructed queries to the search engine along with the search-query in order to limit the search results.
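
By way of illustration only, the two alternatives might shape the request sent to the search engine as sketched below. The request fields and the pre-constructed query strings are assumptions introduced here for illustration.

```python
# Illustrative only: conveying the locked context to the search engine.
def build_request_server_side(search_query, context_list):
    """The search engine maintains the GID database: send the query terms and the raw GIDs."""
    return {"terms": [text for text, _ in search_query], "context_gids": sorted(context_list)}

def build_request_client_side(search_query, context_list, gid_to_queries):
    """The client maintains the GID database: expand each GID into pre-constructed queries."""
    constraints = [q for gid in sorted(context_list) for q in gid_to_queries.get(gid, [])]
    return {"terms": [text for text, _ in search_query], "constraints": constraints}

# Hypothetical mapping from a concept GID to pre-constructed queries.
gid_to_queries = {510: ['person:"Engelbert Humperdinck" role:composer']}

print(build_request_server_side([("Hänsel", False)], {510}))
print(build_request_client_side([("Hänsel", False)], {510}, gid_to_queries))
```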

It will be appreciated that the scope of the present invention is not limited to the above-described embodiments, but rather is defined by the appended claims; and that these claims will encompass modifications of and improvements to what has been described.

Claims

1. A user-interface method for selecting a subset of items from a collection of items in response to search criteria including keystrokes entered by a user from a keypad with overloaded keys and including unambiguous metadata associated with a previously selected search result, the method comprising:

a. receiving a sequence of keystrokes entered by a user, each keystroke of the sequence associated with an overloaded key of a keypad, in which a given overloaded key is in fixed association with a plurality of alphabetical and numerical symbols, wherein the user is using the symbols to search for desired items and each keystroke of an overloaded key of the sequence represents any of the plurality of alphabetical and numerical symbols in fixed association with that overloaded key;
b. in response to receiving the keystrokes, performing a first search operation on the collection of items based on the sequence of keystrokes to retrieve a first subset of items from the collection of items and corresponding unambiguous metadata associated with the first subset of items, wherein the unambiguous metadata associated with each item of the first subset of items matches a permutation of the pluralities of alphabetical and numerical symbols in fixed association with each overloaded key associated with each keystroke of the sequence of keystrokes;
c. presenting on a display device the first subset of items;
d. receiving a selection of one of the items of the first subset of items from the user;
e. in response to a locking operation received from the user, performing a second search operation on the collection of items based on at least a portion of the unambiguous metadata associated with the selected item to retrieve a second subset of items from the collection of items, wherein at least a portion of the unambiguous metadata associated with each item of the second subset of items matches the portion of the unambiguous metadata associated with the selected item; and
f. presenting on the display device the second subset of items.

2. The method of claim 1, wherein the unambiguous metadata term associated with the selected item lexically disambiguates the items of the first subset from each other for the subsequent searching and presenting steps.

3. The method of claim 1, wherein the unambiguous metadata term associated with the selected item is a metadata term describing a concept associated with the selected item of the first subset so that only items associated with the concept of the selected item are selected and presented in the subsequent searching and presenting steps.

4. The method of claim 1, further comprising: receiving a browse action from the user for highlighting one of the presented items of the first subset; and, in response to the user browse action, transforming at least part of the keystrokes into at least one unambiguous metadata term associated with the highlighted item, wherein at least some of the characters of the unambiguous metadata term match the alphabetical and numerical symbols in fixed association with the keystrokes entered by the user.

5. The method of claim 1, wherein the receiving a selection of one of the items of the first subset of items includes receiving a browse action from the user for highlighting one of the presented items of the first subset and wherein receiving a locking operation from the user includes receiving at least one additional keystroke entry from the user.

6. The method of claim 5, wherein the at least one additional keystroke entry from the user is a keystroke for performing an explicit lock operation.

7. The method of claim 5, wherein the at least one additional keystroke entry from the user includes additional keystrokes entered by the user for providing additional alphabetical or numerical symbols for searching for desired items.

8. A user-interface system for selecting a subset of items from a collection of items in response to search criteria including keystrokes entered by a user from a keypad with overloaded keys and including unambiguous metadata associated with a previously selected search result, the system comprising instructions encoded on one or more non-transitory computer-readable media and when executed operable to:

a. receive a sequence of keystrokes entered by a user, each keystroke of the sequence associated with an overloaded key of a keypad, in which a given overloaded key is in fixed association with a plurality of alphabetical and numerical symbols, wherein the user is using the symbols to search for desired items and each keystroke of an overloaded key of the sequence represents any of the plurality of alphabetical and numerical symbols in fixed association with that overloaded key;
b. in response to receiving the keystrokes, perform a first search operation on the collection of items based on the sequence of keystrokes to retrieve a first subset of items from the collection of items and corresponding unambiguous metadata associated with the first subset of items, wherein the unambiguous metadata associated with each item of the first subset of items matches a permutation of the pluralities of alphabetical and numerical symbols in fixed association with each overloaded key associated with each keystroke of the sequence of keystrokes;
c. present on a display device the first subset of items;
d. receive a selection of one of the items of the first subset of items from the user;
e. in response to a locking operation received from the user, perform a second search operation on the collection of items based on at least a portion of the unambiguous metadata associated with the selected item to retrieve a second subset of items from the collection of items, wherein at least a portion of the unambiguous metadata associated with each item of the second subset of items matches the portion of the unambiguous metadata associated with the selected item; and
f. present on the display device the second subset of items.

9. The system of claim 8, wherein the unambiguous metadata term associated with the selected item lexically disambiguates the items of the first subset from each other for the subsequent searching and presenting steps.

10. The system of claim 8, wherein the unambiguous metadata term associated with the selected item is a metadata term describing a concept associated with the selected item of the first subset so that only items associated with the concept of the selected item are selected and presented in the subsequent searching and presenting steps.

11. The system of claim 8, further comprising instructions operable to receive a browse action from the user for highlighting one of the presented items of the first subset; and instructions operable to, in response to the user browse action, transform at least part of the keystrokes into at least one unambiguous metadata term associated with the highlighted item, wherein at least some of the characters of the unambiguous metadata term match the alphabetical and numerical symbols in fixed association with the keystrokes entered by the user.

12. The system of claim 8, wherein the instructions operable to receive a selection of one of the items of the first subset of items are further operable to receive a browse action from the user for highlighting one of the presented items of the first subset and wherein the instructions operable to receive a locking operation from the user are further operable to receive at least one additional keystroke entry from the user.

13. The system of claim 12, wherein the at least one additional keystroke entry from the user is a keystroke for performing an explicit lock operation.

14. The system of claim 12, wherein the at least one additional keystroke entry from the user includes additional keystrokes entered by the user for providing additional alphabetical or numerical symbols for searching for desired items.

Referenced Cited
U.S. Patent Documents
1261167 April 1918 Russell
4453217 June 5, 1984 Boivie
4760528 July 26, 1988 Levin
4797855 January 10, 1989 Duncan, IV et al.
4893238 January 9, 1990 Venema
5224060 June 29, 1993 Ma et al.
5337347 August 9, 1994 Halstead-Nussloch et al.
5369605 November 29, 1994 Parks
5487616 January 30, 1996 Ichbiah
5532754 July 2, 1996 Young et al.
5623406 April 22, 1997 Ichbiah
5635989 June 3, 1997 Rothmuller
5774588 June 30, 1998 Li
5805155 September 8, 1998 Allibhoy et al.
5818437 October 6, 1998 Grover et al.
5828420 October 27, 1998 Marshall et al.
5828991 October 27, 1998 Skiena et al.
5835087 November 10, 1998 Herz et al.
5859662 January 12, 1999 Cragun et al.
5880768 March 9, 1999 Lemmons et al.
5896444 April 20, 1999 Perlman et al.
5912664 June 15, 1999 Eick et al.
5930788 July 27, 1999 Wical
5937422 August 10, 1999 Nelson et al.
5945928 August 31, 1999 Kushler et al.
5945987 August 31, 1999 Dunn
5953541 September 14, 1999 King et al.
6005565 December 21, 1999 Legall et al.
6005597 December 21, 1999 Barrett et al.
6006225 December 21, 1999 Bowman et al.
6008799 December 28, 1999 Van Kleeck
6009459 December 28, 1999 Belfiore et al.
6011554 January 4, 2000 King et al.
6047300 April 4, 2000 Walfish et al.
6075526 June 13, 2000 Rothmuller
6133909 October 17, 2000 Schein et al.
6184877 February 6, 2001 Dodson et al.
6189002 February 13, 2001 Roitblat
6204848 March 20, 2001 Nowlan et al.
6223059 April 24, 2001 Haestrup et al.
6260050 July 10, 2001 Yost et al.
6266048 July 24, 2001 Carau, Sr.
6266814 July 24, 2001 Lemmons et al.
6269361 July 31, 2001 Davis et al.
6286064 September 4, 2001 King et al.
6307548 October 23, 2001 Flinchem et al.
6307549 October 23, 2001 King et al.
6360215 March 19, 2002 Judd et al.
6377945 April 23, 2002 Risvik et al.
6383080 May 7, 2002 Link et al.
6392640 May 21, 2002 Will
6438751 August 20, 2002 Voyticky et al.
6463586 October 8, 2002 Jerding
6466933 October 15, 2002 Huang et al.
6529903 March 4, 2003 Smith et al.
6543052 April 1, 2003 Ogasawara
6564213 May 13, 2003 Ortega et al.
6564313 May 13, 2003 Kashyap
6594657 July 15, 2003 Livowsky et al.
6600496 July 29, 2003 Wagner et al.
6614422 September 2, 2003 Rafii et al.
6614455 September 2, 2003 Cuijpers et al.
6615248 September 2, 2003 Smith
6622148 September 16, 2003 Noble et al.
6664980 December 16, 2003 Bryan et al.
6708336 March 16, 2004 Bruette
6721954 April 13, 2004 Nickum
6732369 May 4, 2004 Leftwich et al.
6734881 May 11, 2004 Will
6757906 June 29, 2004 Look et al.
6766526 July 20, 2004 Ellis
6772147 August 3, 2004 Wang
6785671 August 31, 2004 Bailey et al.
6801909 October 5, 2004 Delgado et al.
6835602 December 28, 2004 Norskov et al.
6839702 January 4, 2005 Patel et al.
6839705 January 4, 2005 Grooters
6850693 February 1, 2005 Young et al.
6865575 March 8, 2005 Smith et al.
6865746 March 8, 2005 Herrington et al.
6907273 June 14, 2005 Smethers
6965374 November 15, 2005 Villet et al.
6999959 February 14, 2006 Lawrence et al.
7013304 March 14, 2006 Schuetze et al.
7117207 October 3, 2006 Kerschberg et al.
7136854 November 14, 2006 Smith
7149983 December 12, 2006 Robertson et al.
7213256 May 1, 2007 Kikinis
7225180 May 29, 2007 Donaldson et al.
7225184 May 29, 2007 Carrasco et al.
7225455 May 29, 2007 Bennington et al.
7269548 September 11, 2007 Fux et al.
7293231 November 6, 2007 Gunn et al.
7461061 December 2, 2008 Aravamudan et al.
7509313 March 24, 2009 Colledge et al.
7529744 May 5, 2009 Srivastava et al.
7536384 May 19, 2009 Venkataraman et al.
7539676 May 26, 2009 Aravamudan et al.
7548915 June 16, 2009 Ramer et al.
7644054 January 5, 2010 Garg et al.
7657526 February 2, 2010 Aravamudan et al.
7679534 March 16, 2010 Kay et al.
7683886 March 23, 2010 Willey
7712053 May 4, 2010 Bradford et al.
7779011 August 17, 2010 Venkataraman et al.
7788266 August 31, 2010 Venkataraman et al.
20020002550 January 3, 2002 Berman
20020042791 April 11, 2002 Smith et al.
20020049752 April 25, 2002 Bowman et al.
20020052873 May 2, 2002 Delgado et al.
20020059066 May 16, 2002 O'Hagan
20020059621 May 16, 2002 Thomas et al.
20020077143 June 20, 2002 Sharif et al.
20020083448 June 27, 2002 Johnson
20020133481 September 19, 2002 Smith et al.
20020184373 December 5, 2002 Maes
20020188488 December 12, 2002 Hinkle
20020199194 December 26, 2002 Ali
20030005452 January 2, 2003 Rodriguez
20030005462 January 2, 2003 Broadus et al.
20030011573 January 16, 2003 Villet et al.
20030014753 January 16, 2003 Beach et al.
20030023976 January 30, 2003 Kamen et al.
20030033292 February 13, 2003 Meisel et al.
20030037043 February 20, 2003 Chang et al.
20030046698 March 6, 2003 Kamen et al.
20030066079 April 3, 2003 Suga
20030067495 April 10, 2003 Pu et al.
20030084270 May 1, 2003 Coon et al.
20030097661 May 22, 2003 Li et al.
20030226146 December 4, 2003 Thurston et al.
20030237096 December 25, 2003 Barrett et al.
20040021691 February 5, 2004 Dostie et al.
20040046744 March 11, 2004 Rafii et al.
20040049783 March 11, 2004 Lemmons et al.
20040073432 April 15, 2004 Stone
20040073926 April 15, 2004 Nakamura et al.
20040078815 April 22, 2004 Lemmons et al.
20040078816 April 22, 2004 Johnson
20040078820 April 22, 2004 Nickum
20040083198 April 29, 2004 Bradford et al.
20040093616 May 13, 2004 Johnson
20040111745 June 10, 2004 Schein et al.
20040128686 July 1, 2004 Boyer et al.
20040139091 July 15, 2004 Shin
20040143569 July 22, 2004 Gross et al.
20040194141 September 30, 2004 Sanders
20040216160 October 28, 2004 Lemmons et al.
20040220926 November 4, 2004 Lamkin et al.
20040221308 November 4, 2004 Cuttner et al.
20040261021 December 23, 2004 Mittal et al.
20050015366 January 20, 2005 Carrasco et al.
20050038702 February 17, 2005 Merriman et al.
20050071874 March 31, 2005 Elcock et al.
20050079895 April 14, 2005 Kalenius et al.
20050086234 April 21, 2005 Tosey
20050086691 April 21, 2005 Dudkiewicz et al.
20050086692 April 21, 2005 Dudkiewicz et al.
20050129199 June 16, 2005 Abe
20050174333 August 11, 2005 Robinson et al.
20050192944 September 1, 2005 Flinchem
20050210020 September 22, 2005 Gunn et al.
20050210383 September 22, 2005 Cucerzan et al.
20050210402 September 22, 2005 Gunn et al.
20050223308 October 6, 2005 Gunn et al.
20050240580 October 27, 2005 Zamir et al.
20050246311 November 3, 2005 Whelan et al.
20050246324 November 3, 2005 Paalasmaa et al.
20050278175 December 15, 2005 Hyvonen
20050283468 December 22, 2005 Kamvar et al.
20060010477 January 12, 2006 Yu
20060013487 January 19, 2006 Longe et al.
20060015906 January 19, 2006 Boyer et al.
20060036640 February 16, 2006 Tateno et al.
20060044277 March 2, 2006 Fux et al.
20060059044 March 16, 2006 Chan et al.
20060069616 March 30, 2006 Bau
20060075429 April 6, 2006 Istvan et al.
20060090182 April 27, 2006 Horowitz et al.
20060090185 April 27, 2006 Zito et al.
20060098899 May 11, 2006 King et al.
20060101499 May 11, 2006 Aravamudan et al.
20060101503 May 11, 2006 Venkataraman
20060101504 May 11, 2006 Aravamudan et al.
20060112162 May 25, 2006 Marot et al.
20060136379 June 22, 2006 Marino et al.
20060156233 July 13, 2006 Nurmi
20060161520 July 20, 2006 Brewer et al.
20060163337 July 27, 2006 Unruh
20060167676 July 27, 2006 Plumb
20060167859 July 27, 2006 Verbeck Sibley et al.
20060176283 August 10, 2006 Suraqui
20060190308 August 24, 2006 Janssens et al.
20060195435 August 31, 2006 Laird-McConnell et al.
20060206454 September 14, 2006 Forstall et al.
20060206815 September 14, 2006 Pathiyal et al.
20060242607 October 26, 2006 Hudson
20060248078 November 2, 2006 Gross et al.
20060256078 November 16, 2006 Flinchem et al.
20060259479 November 16, 2006 Dai
20060274051 December 7, 2006 Longe et al.
20070005563 January 4, 2007 Aravamudan
20070016476 January 18, 2007 Hoffberg et al.
20070016862 January 18, 2007 Kuzmin
20070027852 February 1, 2007 Howard et al.
20070043750 February 22, 2007 Dingle
20070050337 March 1, 2007 Venkataraman et al.
20070050348 March 1, 2007 Aharoni et al.
20070061244 March 15, 2007 Ramer et al.
20070061317 March 15, 2007 Ramer et al.
20070061321 March 15, 2007 Venkataraman
20070061753 March 15, 2007 Ng et al.
20070061754 March 15, 2007 Ardhanari et al.
20070074131 March 29, 2007 Assadollahi
20070088681 April 19, 2007 Aravamudan et al.
20070094024 April 26, 2007 Kristensson et al.
20070130128 June 7, 2007 Garg et al.
20070136689 June 14, 2007 Richardson-Bunbury et al.
20070143567 June 21, 2007 Gorobets
20070150606 June 28, 2007 Flinchem et al.
20070182595 August 9, 2007 Ghasabian
20070219984 September 20, 2007 Aravamudan et al.
20070219985 September 20, 2007 Aravamudan et al.
20070226649 September 27, 2007 Agmon
20070240044 October 11, 2007 Fux et al.
20070240045 October 11, 2007 Fux et al.
20070255693 November 1, 2007 Ramaswamy et al.
20070260703 November 8, 2007 Ardhanari et al.
20070266021 November 15, 2007 Aravamudan et al.
20070266026 November 15, 2007 Aravamudan et al.
20070266406 November 15, 2007 Aravamudan et al.
20070271205 November 22, 2007 Aravamudan et al.
20070276773 November 29, 2007 Aravamudan et al.
20070276821 November 29, 2007 Aravamudan et al.
20070276859 November 29, 2007 Aravamudan et al.
20070288457 December 13, 2007 Aravamudan et al.
20080010611 January 10, 2008 Fux et al.
20080065617 March 13, 2008 Burke et al.
20080071771 March 20, 2008 Venkataraman et al.
20080077577 March 27, 2008 Byrne et al.
20080086704 April 10, 2008 Aravamudan
20080114743 May 15, 2008 Venkataraman et al.
20080177717 July 24, 2008 Kumar et al.
20080195601 August 14, 2008 Ntoulas et al.
20080209229 August 28, 2008 Ramakrishnan et al.
20080313564 December 18, 2008 Barve et al.
20090077496 March 19, 2009 Aravamudan et al.
20090198688 August 6, 2009 Venkataraman et al.
20100121845 May 13, 2010 Aravamudan et al.
20100153380 June 17, 2010 Garg et al.
Foreign Patent Documents
1050794 November 2000 EA
181058 May 1986 EP
1143691 October 2001 EP
1338967 August 2003 EP
1463307 September 2004 EP
1622054 February 2006 EP
WO-98/56173 December 1998 WO
WO-0070505 November 2000 WO
WO-2004010326 January 2004 WO
WO-2004/031931 April 2004 WO
WO-2005/033967 April 2005 WO
WO-2005/084235 September 2005 WO
WO-2006/052959 May 2006 WO
WO-2006/052966 May 2006 WO
WO-2007/025148 March 2007 WO
WO-2007/025149 March 2007 WO
WO-2007/062035 May 2007 WO
WO-2007/118038 October 2007 WO
WO-2007/124429 November 2007 WO
WO-2007/124436 November 2007 WO
WO-2007/131058 November 2007 WO
WO-2008/034057 March 2008 WO
WO-2008/091941 July 2008 WO
WO-2008/148012 December 2008 WO
Other references
  • European Supplemental Search Report for PCT/US2005040415, dated Aug. 11, 2009, 15 pages.
  • Supplemental European Search Report for PCT/US2005040424, dated Aug. 20, 2009, 13 pages.
  • Nardi, et al., “Integrating Communication and Information Through Contact Map,” Communications of the ACM, vol. 45, No. 4, Apr. 2002, 7 pages, retrieved from URL:http://portal.acm.org/citation.cfm?id+505251.
  • Supplemental European Search Report for 06838179.7 dated Dec. 9, 2009, 7 pages.
  • Supplemental European Search Report for EP07761026.9 dated Jan. 28, 2010, 8 pages.
  • Turski, et al., “Inner Circle—People Centered Email Client,” CHI 2005 Conference on Human Factors in Computing Systems, Apr. 2005, pp. 1845-1848, 4 pages, retrieved from URL:http://portal.acm.org/citation.cfm?id+1056808.1057037.
  • Mackenzie et al., LetterWise: Prefix-based disambiguation for mobile text input, Proceedings of the ACM Symposium on User Interface Software and Technology—UIST2001, pp. 111-120.
  • Matthom, “Text Highlighting in Search Results”, Jul. 22, 2005. Available at www.matthom.com/archive/2005/07/22/text-highlighting-in-search-results; retrieved Jun. 23, 2006. (4 pages).
  • Mokotoff, Soundexing and Genealogy, Available at http://www.avotaynu.com/soundex.html, retrieved Mar. 19, 2008, last updated Sep. 8, 2007 (6 pages).
  • Press Release from Tegic Communications, Tegic Communications is awarded patent for Japanese T9(R) text input software from the Japan Patent Office, Oct. 12, 2004. Retrieved Nov. 18, 2005 from http://www.tegic.com/pressview.html?releasenum=55254242.
  • Review of Personalization Technologies: Collaborative Filtering vs. ChoiceStream's Attributized Bayesian Choice Modeling, Technology Brief, ChoiceStream Technologies, Cambridge, MA.
  • Silfverberg et al., Predicting text entry speed on mobile phones, Proceedings of the ACM Conference on Human Factors in Computing Systems—CHI 2000, pp. 1-16.
  • Talbot, David. “Soul of a New Mobile Machine.” Technology Review: The Design Issue May/Jun. 2007. (pp. 46-53).
  • Wikipedia's entry for Levenshtein distance (n.d.). Retrieved Nov. 15, 2006 from http://en.wikipedia.org/wiki/Levenshteindistance.
  • Written Opinion of the International Searching Authority, International Application No. PCT/US06/25249, mailed Jan. 29, 2008 (4 pages).
  • Written Opinion of the International Searching Authority, International Application No. PCT/US06/33204, mailed Sep. 21, 2007 (3 pages).
  • International Search Report and Written Opinion of the International Searching Authority of the United States Patent and Trademark Office for PCT/US2005/040415, dated Nov. 27, 2006, 4 pages.
  • International Search Report and Written Opinion of the International Searching Authority of the United States Patent and Trademark Office for PCT/US2006/033257, dated Mar. 26, 2008, 5 pages.
  • International Search Report and Written Opinion of the International Searching Authority of the United States Patent and Trademark Office for PCT/US2006/045053, dated Jul. 24, 2008, 8 pages.
  • International Search Report and Written Opinion of the International Searching Authority of the United States Patent and Trademark Office for PCT/US2006/33258, dated Mar. 26, 2008, 5 pages.
  • International Search Report and Written Opinion of the International Searching Authority of the United States Patent and Trademark Office for PCT/US2007/067114, dated Jul. 2, 2008, 4 pages.
  • International Search Report and Written Opinion of the International Searching Authority of the United States Patent and Trademark Office for PCT/US2007/068064, 9 pages.
  • International Search Report and Written Opinion of the International Searching Authority of the United States Patent and Trademark Office for PCT/US2007/078490, 4 pages.
  • International Search Report and Written Opinion of the International Searching Authority of the United States Patent and Trademark Office for PCT/US2008/051789, dated Jul. 14, 2008, 5 pages.
  • International Search Report and Written Opinion of the International Searching Authority of the United States Patent and Trademark Office for PCT/US2008/064730, dated Sep. 8, 2008, 5 pages.
  • International Search Report and Written Opinion of the International Searching Authority, the United States Patent and Trademark Office, for PCT/US2005/40424, mailing date of Nov. 21, 2006, 6 pages.
  • Roe, et al., “Mapping UML Models Incorporating OCL Constraints into Object-Z,” Technical Report, Sep. 2003, Department of Computing, Imperial College London, retrieved on Jul. 12, 2007, retrieved from the internet: <URL: http://www.doc.ic.ac.uk/-ar3/TechnicalReport20039.pdf>, 17 pages.
  • Supplementary European Search Report and Written Opinion for EP07842499, dated Aug. 25, 2010, 6 pages.
  • U.S. Appl. No. 11/939,086, Ramakrishnan et al.
  • Ardissono, L. et al., User Modeling and Recommendation Techniques for Personalized Electronic Program Guides, Personalized Digital Television, Editors: Ardissono, et al., Kluwer Academic Press, 2004.
  • Dalianis, “Improving search engine retrieval using a compound splitter for Swedish,” Abstract of presentation at Nodalida 2005—15th Nordic Conference on Computational Linguistics, Joensuu Finland, May 21-22, 2005. Retrieved Jan. 5, 2006 from http://phon.joensuu.fi/nodalida/abstracts/03.shtml.
  • Digital Video Broadcasting, http://www.dvb.org (Oct. 12, 2007).
  • Gadd, Phonix: The Algorithm, Program, vol. 24(4), Oct. 1990 (pp. 363-369).
  • Good, N. et al., Combining Collaborative Filtering with Personal Agents for Better Recommendations, in Proc. of the 16th National Conference on Artificial Intelligence, pp. 439-446, Orlando, Florida, Jul. 18-22, 1999.
  • International Search Report, International Application No. PCT/US06/25249, mailed Jan. 29, 2008 (2 pages).
  • International Search Report, International Application No. PCT/US06/33204, mailed Sep. 21, 2007 (2 pages).
  • International Search Report, International Application No. PCT/US06/40005, mailed Jul. 3, 2007 (4 Pages).
  • International Search Report, International Application No. PCT/US07/65703, mailed Jan. 25, 2008 (2 pages).
  • International Search Report, International Application No. PCT/US07/67100, mailed Mar. 7, 2008 (2 pages).
  • Written Opinion of the International Searching Authority, International Application No. PCT/US06/40005, mailed Jul. 3, 2007 (4 Pages).
  • Written Opinion of the International Searching Authority, International Application No. PCT/US07/65703, mailed Jan. 25, 2008 (4 pages).
  • Written Opinion of the International Searching Authority, International Application No. PCT/US07/67100, mailed Mar. 7, 2008 (3 pages).
Patent History
Patent number: 8549424
Type: Grant
Filed: May 23, 2008
Date of Patent: Oct 1, 2013
Patent Publication Number: 20080313564
Assignee: Veveo, Inc. (Andover, MA)
Inventors: Rakesh Barve (Bangalore), Sashikumar Venkataraman (Andover, MA), Murali Aravamudan (Windham, NH), Manish M. Sharma (Ghaziabad), Pankaj Garg (Patiala), Sankar Ardhanari (Windham, NH)
Primary Examiner: Ramsey Refai
Assistant Examiner: William Titcomb
Application Number: 12/126,549