Patents by Inventor Ohil K. Manyam
Ohil K. Manyam has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 11568632
Abstract: This disclosure describes a system for automatically identifying an item from among a variation of items of a same type. For example, an image may be processed and resulting item image information compared with stored item image information to determine a type of item represented in the image. If the matching stored item image information is part of a cluster, the item image information may then be compared with distinctive features associated with stored item image information of the cluster to determine the variation of the item represented in the received image.
Type: Grant
Filed: September 4, 2020
Date of Patent: January 31, 2023
Assignee: Amazon Technologies, Inc.
Inventors: Sudarshan Narasimha Raghavan, Xiaofeng Ren, Michel Leonard Goldstein, Ohil K. Manyam
-
Patent number: 10769488
Abstract: This disclosure describes a system for automatically identifying an item from among a variation of items of a same type. For example, an image may be processed and resulting item image information compared with stored item image information to determine a type of item represented in the image. If the matching stored item image information is part of a cluster, the item image information may then be compared with distinctive features associated with stored item image information of the cluster to determine the variation of the item represented in the received image.
Type: Grant
Filed: July 24, 2019
Date of Patent: September 8, 2020
Assignee: Amazon Technologies, Inc.
Inventors: Sudarshan Narasimha Raghavan, Xiaofeng Ren, Michel Leonard Goldstein, Ohil K. Manyam
-
Patent number: 10366306
Abstract: This disclosure describes a system for automatically identifying an item from among a variation of items of a same type. For example, an image may be processed and resulting item image information compared with stored item image information to determine a type of item represented in the image. If the matching stored item image information is part of a cluster, the item image information may then be compared with distinctive features associated with stored item image information of the cluster to determine the variation of the item represented in the received image.
Type: Grant
Filed: September 19, 2013
Date of Patent: July 30, 2019
Assignee: Amazon Technologies, Inc.
Inventors: Sudarshan Narasimha Raghavan, Xiaofeng Ren, Michel Leonard Goldstein, Ohil K. Manyam
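The two-stage matching that the abstracts above describe can be sketched in a few lines. This is a hypothetical illustration, not the patented implementation: the feature vectors, catalog/cluster structures, and cosine-similarity matching are assumptions standing in for whatever item image information the system actually stores.

```python
import math

def cosine_similarity(a, b):
    # Simple similarity between two feature vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def identify_item(image_features, catalog, clusters):
    """catalog: item_id -> stored feature vector.
    clusters: item_id -> {variation_id: distinctive feature vector};
    only items that belong to a cluster of near-identical variations appear here."""
    # Stage 1: find the stored item whose features best match the image.
    best_id = max(catalog, key=lambda i: cosine_similarity(image_features, catalog[i]))
    # Stage 2: if that item sits in a cluster, disambiguate the variation
    # by comparing against each variation's distinctive features.
    if best_id in clusters:
        variations = clusters[best_id]
        return max(variations, key=lambda v: cosine_similarity(image_features, variations[v]))
    return best_id
```

The point of the second stage is that near-identical variations (say, two flavors of the same product) are indistinguishable at the whole-item level, so only their distinctive features are compared once the cluster is known.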
-
Patent number: 9552421
Abstract: Simplified collaborative searching is provided by pattern recognition such as facial recognition, motion recognition, and the like to provide handsfree functionality. Users join a collaborative search by placing themselves within the field of view of a camera communicationally coupled to a computing device that performs facial recognition and identifies the users, thereby adding them to the collaboration. Users also join by performing simple movements with a portable computing device, such as the ubiquitous mobile phone. A collaboration component tracks the users in the collaboration and identifies them to a search engine, thereby enabling the search engine to perform a collaborative search. The collaboration component also disseminates the collaborative recommendations, either automatically or based upon explicit requests triggered by pattern recognition, including motion recognition and touch recognition.
Type: Grant
Filed: March 15, 2013
Date of Patent: January 24, 2017
Assignee: Microsoft Technology Licensing, LLC
Inventors: Aidan C. Crook, Avneesh Sud, Xiaoyuan Cui, Ohil K. Manyam
-
Publication number: 20140280299
Abstract: Simplified collaborative searching is provided by pattern recognition such as facial recognition, motion recognition, and the like to provide handsfree functionality. Users join a collaborative search by placing themselves within the field of view of a camera communicationally coupled to a computing device that performs facial recognition and identifies the users, thereby adding them to the collaboration. Users also join by performing simple movements with a portable computing device, such as the ubiquitous mobile phone. A collaboration component tracks the users in the collaboration and identifies them to a search engine, thereby enabling the search engine to perform a collaborative search. The collaboration component also disseminates the collaborative recommendations, either automatically or based upon explicit requests triggered by pattern recognition, including motion recognition and touch recognition.
Type: Application
Filed: March 15, 2013
Publication date: September 18, 2014
Applicant: Microsoft Corporation
Inventors: Aidan C. Crook, Avneesh Sud, Xiaoyuan Cui, Ohil K. Manyam
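The "collaboration component" role in the two entries above can be sketched at a high level. Everything here is hypothetical scaffolding: the recognition pipeline is stubbed out as incoming events, and the class and method names are illustrative, not from the patent.

```python
class CollaborativeSession:
    """Tracks who has joined a collaborative search and identifies
    the whole group to a search engine."""

    def __init__(self):
        self.members = []  # user ids, in join order

    def on_user_recognized(self, user_id):
        # Called when facial or motion recognition identifies a user;
        # repeated recognitions of the same user are idempotent.
        if user_id not in self.members:
            self.members.append(user_id)

    def search(self, query, engine):
        # Identify all members to the engine so the results can
        # reflect the whole group rather than a single user.
        return engine(query, tuple(self.members))
```

The key design idea in the abstract is that joining is hands-free: membership is driven by recognition events rather than by each user logging in and typing.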
-
Publication number: 20140046922
Abstract: The disclosed architecture enables one or more users to interact with a search engine framework through feedback in the form of gestures and, optionally, voice signals. For example, document relevance, document ranking, and output of the search engine can be modified based on the capture and interpretation of physical gestures of a user. The recognition of a specific gesture is detected based on the physical location and movement of the joints of a user. The architecture captures emotive responses while navigating the voice-driven and gesture-driven interface, and indicates that appropriate feedback has been captured. The feedback can be used, among many other purposes, to alter the search query, personalize the response using feedback collected through the search/browsing session, modify result ranking, navigate the user interface, and modify the entire result page.
Type: Application
Filed: August 8, 2012
Publication date: February 13, 2014
Applicant: Microsoft Corporation
Inventors: Aidan C. Crook, Nikhil Dandekar, Ohil K. Manyam, Gautam Kedia, Sisi Sarkizova, Sara Javanmardi, Daniel Liebling, Ryen William White, Kevyn Collins-Thompson
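One of the uses the abstract lists, modifying result ranking from captured feedback, can be sketched as a simple score adjustment. This is an assumption-laden illustration: gesture and voice recognition are out of scope here, and feedback is assumed to have already been interpreted into per-document scores in [-1, 1].

```python
def rerank(results, feedback, weight=0.5):
    """results: list of (doc_id, relevance) pairs from the engine.
    feedback: doc_id -> interpreted feedback score in [-1, 1].
    Returns doc ids re-ordered with feedback blended into relevance."""
    adjusted = [(doc, rel + weight * feedback.get(doc, 0.0)) for doc, rel in results]
    # Sort by the adjusted score, highest first.
    return [doc for doc, _ in sorted(adjusted, key=lambda p: p[1], reverse=True)]
```

A positive gesture (say, a thumbs-up interpreted as +0.8) can promote a lower-ranked document above the original top result, while negative feedback demotes it.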
-
Publication number: 20120137146
Abstract: A remote power management system and method for awakening a remote computing device over a computer network. A wakeup event is initiated by a client on the network, and an application server publishes the wakeup event. A subscriber, such as the subnet where the remote computing device resides, picks up the wakeup event and gives it to a computing device on the subnet that is awake. The awake computing device constructs a "magic packet" to take advantage of the automatic wakeup feature of the network card on the remote computing device, and broadcasts the "magic packet" throughout the subnet. Stateless handling of wakeup events is used to alleviate any need for a dedicated proxy server on the subnet to send the "magic packets." The computing device on the subnet that constructs and broadcasts the "magic packets" is rotated to provide equitable rest time for each of the computing devices.
Type: Application
Filed: November 29, 2010
Publication date: May 31, 2012
Applicant: Microsoft Corporation
Inventors: Sandeep Karanth, Ohil K. Manyam
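The "magic packet" the abstract relies on is the standard Wake-on-LAN frame: six 0xFF bytes followed by the target's MAC address repeated sixteen times, typically broadcast over UDP. A minimal sketch of constructing and broadcasting one (the publish/subscribe and sender-rotation machinery from the abstract is omitted):

```python
import socket

def build_magic_packet(mac: str) -> bytes:
    """Build a Wake-on-LAN magic packet: 6 x 0xFF, then the MAC x 16."""
    mac_bytes = bytes.fromhex(mac.replace(":", "").replace("-", ""))
    if len(mac_bytes) != 6:
        raise ValueError("MAC address must be 6 bytes")
    return b"\xff" * 6 + mac_bytes * 16

def broadcast_wakeup(mac: str, port: int = 9) -> None:
    # Broadcast the packet throughout the subnet, as the abstract
    # describes; the sleeping NIC matches its own MAC and powers up.
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
        s.sendto(build_magic_packet(mac), ("255.255.255.255", port))
```

The packet is always 102 bytes (6 + 16 * 6), and because the NIC matches it at the link level, no software needs to be running on the sleeping machine.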