Patents by Inventor Malek M. Chalabi
Malek M. Chalabi has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
- Patent number: 9950431
  Abstract: Initial interaction between a mobile robot and at least one user is described herein. The mobile robot captures several images of its surroundings, and identifies existence of a user in at least one of the several images. The robot then orients itself to face the user, and outputs an instruction to the user with regard to the orientation of the user with respect to the mobile robot. The mobile robot captures images of the face of the user responsive to detecting that the user has followed the instruction. Information captured by the robot is uploaded to a cloud-storage system, where information is included in a profile of the user and is shareable with others.
  Type: Grant
  Filed: January 25, 2016
  Date of Patent: April 24, 2018
  Assignee: Microsoft Technology Licensing, LLC
  Inventors: Jean Sebastien Fouillade, Russel Sanchez, Efstathios Papaefstathiou, Malek M. Chalabi
- Publication number: 20160136817
  Abstract: Initial interaction between a mobile robot and at least one user is described herein. The mobile robot captures several images of its surroundings, and identifies existence of a user in at least one of the several images. The robot then orients itself to face the user, and outputs an instruction to the user with regard to the orientation of the user with respect to the mobile robot. The mobile robot captures images of the face of the user responsive to detecting that the user has followed the instruction. Information captured by the robot is uploaded to a cloud-storage system, where information is included in a profile of the user and is shareable with others.
  Type: Application
  Filed: January 25, 2016
  Publication date: May 19, 2016
  Inventors: Jean Sebastien Fouillade, Russel Sanchez, Efstathios Papaefstathiou, Malek M. Chalabi
- Patent number: 9259842
  Abstract: Initial interaction between a mobile robot and at least one user is described herein. The mobile robot captures several images of its surroundings, and identifies existence of a user in at least one of the several images. The robot then orients itself to face the user, and outputs an instruction to the user with regard to the orientation of the user with respect to the mobile robot. The mobile robot captures images of the face of the user responsive to detecting that the user has followed the instruction. Information captured by the robot is uploaded to a cloud-storage system, where information is included in a profile of the user and is shareable with others.
  Type: Grant
  Filed: June 10, 2011
  Date of Patent: February 16, 2016
  Assignee: Microsoft Technology Licensing, LLC
  Inventors: Jean Sebastien Fouillade, Russell Sanchez, Efstathios Papaefstathiou, Malek M. Chalabi
- Patent number: 8818556
  Abstract: The subject disclosure is directed towards a robot device including a model that controls a robot's task-related operations to perform tasks and user-engagement operations to interact with the robot. The model controls the operating states, including transitioning from an autonomous task state to an engaged state based on current context determined from various stimuli, e.g., information received via sensors of the robot and/or learned data. The robot may seek to engage the user, the user may seek to engage the robot, or the user and robot may meet by chance in which either may attempt to initiate the engagement.
  Type: Grant
  Filed: January 13, 2011
  Date of Patent: August 26, 2014
  Assignee: Microsoft Corporation
  Inventors: Russell Irvin Sanchez, Malek M. Chalabi, Kordell B. Krauskopf
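The engagement model described in this abstract can be pictured as a small state machine that switches between an autonomous task state and an engaged state as stimuli arrive. The sketch below is purely illustrative: the class names, the single confidence score, and the threshold value are assumptions for demonstration, not details from the patent.

```python
from enum import Enum, auto

class RobotState(Enum):
    AUTONOMOUS_TASK = auto()
    ENGAGED = auto()

class EngagementModel:
    """Toy model: switch states when sensor stimuli cross a confidence threshold."""

    def __init__(self, engagement_threshold=0.7):
        self.state = RobotState.AUTONOMOUS_TASK
        self.threshold = engagement_threshold  # hypothetical confidence cutoff

    def update(self, stimuli):
        # stimuli: mapping of sensor cues to confidence scores,
        # e.g. {"face_detected": 0.9, "speech_heard": 0.2}
        score = max(stimuli.values(), default=0.0)
        if self.state is RobotState.AUTONOMOUS_TASK and score >= self.threshold:
            self.state = RobotState.ENGAGED          # user or robot initiated engagement
        elif self.state is RobotState.ENGAGED and score < self.threshold:
            self.state = RobotState.AUTONOMOUS_TASK  # resume the interrupted task
        return self.state
```

In a real system the transition would weigh multiple stimuli and learned data rather than a single max score; the point here is only the two-state structure the abstract describes.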
- Publication number: 20120316676
  Abstract: Initial interaction between a mobile robot and at least one user is described herein. The mobile robot captures several images of its surroundings, and identifies existence of a user in at least one of the several images. The robot then orients itself to face the user, and outputs an instruction to the user with regard to the orientation of the user with respect to the mobile robot. The mobile robot captures images of the face of the user responsive to detecting that the user has followed the instruction. Information captured by the robot is uploaded to a cloud-storage system, where information is included in a profile of the user and is shareable with others.
  Type: Application
  Filed: June 10, 2011
  Publication date: December 13, 2012
  Applicant: Microsoft Corporation
  Inventors: Jean Sebastien Fouillade, Russell Sanchez, Efstathios Papaefstathiou, Malek M. Chalabi
- Publication number: 20120316680
  Abstract: A robot tracks objects using sensory data, and follows an object selected by a user. The object can be designated by a user from a set of objects recognized by the robot. The relative positions and orientations of the robot and object are determined. The position and orientation of the robot can be used so as to maintain a desired relationship between the object and the robot. Using the navigation system of the robot, during its movement, obstacles can be avoided. If the robot loses contact with the object being tracked, the robot can continue to navigate and search the environment until the object is reacquired.
  Type: Application
  Filed: June 13, 2011
  Publication date: December 13, 2012
  Applicant: Microsoft Corporation
  Inventors: Charles F. Olivier, III, Jean Sebastien Fouillade, Adrien Felon, Jeffrey Cole, Nathaniel T. Clinton, Russell Sanchez, Francois Burianek, Malek M. Chalabi, Harshavardhana Narayana Kikkeri
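The "maintain a desired relationship" step in this abstract amounts to a repeated control computation: find the bearing and range to the tracked object, then turn and advance just enough to hold a set following distance. The following is a minimal geometric sketch under those assumptions; the function name, tuple-based poses, and the 1.5 m default are all hypothetical.

```python
import math

def follow_step(robot_pos, robot_heading, target_pos, desired_distance=1.5):
    """One control step toward holding a fixed following distance.

    robot_pos, target_pos: (x, y) in meters; robot_heading: radians.
    Returns (turn_angle, forward_distance) for the robot's next motion.
    """
    dx = target_pos[0] - robot_pos[0]
    dy = target_pos[1] - robot_pos[1]
    distance = math.hypot(dx, dy)
    bearing = math.atan2(dy, dx)
    # Wrap the heading error into (-pi, pi] so the robot takes the shortest turn.
    turn = (bearing - robot_heading + math.pi) % (2 * math.pi) - math.pi
    # Advance only by the amount the robot exceeds the desired standoff.
    advance = max(0.0, distance - desired_distance)
    return turn, advance
```

A real implementation would fold in the obstacle avoidance and reacquisition search the abstract mentions; this shows only the core distance-keeping geometry.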
- Publication number: 20120277914
  Abstract: The subject disclosure is directed towards a set of autonomous and semi-autonomous modes for a robot by which the robot captures content (e.g., still images and video) from a location such as a house. The robot may produce a summarized presentation of the content (a "botcast") that is appropriate for a specific scenario, such as an event, according to a specified style. Modes include an event mode where the robot may interact with and stimulate event participants to provide desired content for capture. A patrol mode operates the robot to move among locations (e.g., different rooms) to capture a panorama (e.g., 360 degrees) of images that can be remotely viewed.
  Type: Application
  Filed: April 29, 2011
  Publication date: November 1, 2012
  Applicant: Microsoft Corporation
  Inventors: William M. Crow, Nathaniel T. Clinton, Malek M. Chalabi, Dane T. Storrusten
- Publication number: 20120215380
  Abstract: Described herein are technologies pertaining to robot navigation. The robot includes a video camera that is configured to transmit a live video feed to a remotely located computing device. A user interacts with the live video feed, and the robot navigates in its environment based upon the user interaction. In a first navigation mode, the user selects a location, and the robot autonomously navigates to the selected location. In a second navigation mode, the user causes the point of view of the video camera on the robot to change, and thereafter causes the robot to semi-autonomously drive in a direction corresponding to the new point of view of the video camera. In a third navigation mode, the user causes the robot to navigate to a selected location in the live video feed.
  Type: Application
  Filed: February 23, 2011
  Publication date: August 23, 2012
  Applicant: Microsoft Corporation
  Inventors: Jean Sebastien Fouillade, Charles F. Olivier, III, Malek M. Chalabi, Nathaniel T. Clinton, Russ Sanchez, Chad Aron Voss
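The third navigation mode above, where the user clicks a location in the live video feed, implicitly requires mapping a selected pixel to a drive direction. One common way to do this is a pinhole-camera approximation over the camera's horizontal field of view; the sketch below illustrates that idea only, and the function name and the 60-degree FOV default are assumptions, not details from the publication.

```python
import math

def pixel_to_bearing(px, image_width, horizontal_fov_deg=60.0):
    """Map a clicked x-pixel in the live feed to a relative turn angle (radians).

    Pinhole-camera approximation: a click at the image center means drive
    straight ahead; a click at the edge means turn by half the field of view.
    """
    # Normalized horizontal offset from the image center, in [-0.5, 0.5].
    offset = (px / image_width) - 0.5
    half_fov = math.radians(horizontal_fov_deg) / 2.0
    return math.atan(2.0 * offset * math.tan(half_fov))
```

Depth (how far to drive) would come from a separate estimate, e.g. a range sensor or floor-plane assumption; this converts the click into a heading only.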
- Publication number: 20120185090
  Abstract: The subject disclosure is directed towards a robot device including a model that controls a robot's task-related operations to perform tasks and user-engagement operations to interact with the robot. The model controls the operating states, including transitioning from an autonomous task state to an engaged state based on current context determined from various stimuli, e.g., information received via sensors of the robot and/or learned data. The robot may seek to engage the user, the user may seek to engage the robot, or the user and robot may meet by chance in which either may attempt to initiate the engagement.
  Type: Application
  Filed: January 13, 2011
  Publication date: July 19, 2012
  Applicant: Microsoft Corporation
  Inventors: Russell Irvin Sanchez, Malek M. Chalabi, Kordell B. Krauskopf