Patents by Inventor Malek M. Chalabi

Malek M. Chalabi has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 9950431
    Abstract: Initial interaction between a mobile robot and at least one user is described herein. The mobile robot captures several images of its surroundings and identifies the existence of a user in at least one of those images. The robot then orients itself to face the user and outputs an instruction to the user regarding the user's orientation with respect to the robot. The mobile robot captures images of the user's face responsive to detecting that the user has followed the instruction. Information captured by the robot is uploaded to a cloud-storage system, where the information is included in a profile of the user and is shareable with others.
    Type: Grant
    Filed: January 25, 2016
    Date of Patent: April 24, 2018
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Jean Sebastien Fouillade, Russell Sanchez, Efstathios Papaefstathiou, Malek M. Chalabi
  • Publication number: 20160136817
    Abstract: Initial interaction between a mobile robot and at least one user is described herein. The mobile robot captures several images of its surroundings and identifies the existence of a user in at least one of those images. The robot then orients itself to face the user and outputs an instruction to the user regarding the user's orientation with respect to the robot. The mobile robot captures images of the user's face responsive to detecting that the user has followed the instruction. Information captured by the robot is uploaded to a cloud-storage system, where the information is included in a profile of the user and is shareable with others.
    Type: Application
    Filed: January 25, 2016
    Publication date: May 19, 2016
    Inventors: Jean Sebastien Fouillade, Russell Sanchez, Efstathios Papaefstathiou, Malek M. Chalabi
  • Patent number: 9259842
    Abstract: Initial interaction between a mobile robot and at least one user is described herein. The mobile robot captures several images of its surroundings and identifies the existence of a user in at least one of those images. The robot then orients itself to face the user and outputs an instruction to the user regarding the user's orientation with respect to the robot. The mobile robot captures images of the user's face responsive to detecting that the user has followed the instruction. Information captured by the robot is uploaded to a cloud-storage system, where the information is included in a profile of the user and is shareable with others.
    Type: Grant
    Filed: June 10, 2011
    Date of Patent: February 16, 2016
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Jean Sebastien Fouillade, Russell Sanchez, Efstathios Papaefstathiou, Malek M. Chalabi
  • Patent number: 8818556
    Abstract: The subject disclosure is directed towards a robot device including a model that controls the robot's task-related operations to perform tasks and its user-engagement operations to interact with the user. The model controls the operating states, including transitioning from an autonomous task state to an engaged state based on current context determined from various stimuli, e.g., information received via sensors of the robot and/or learned data. The robot may seek to engage the user, the user may seek to engage the robot, or the user and robot may meet by chance, in which case either may attempt to initiate the engagement.
    Type: Grant
    Filed: January 13, 2011
    Date of Patent: August 26, 2014
    Assignee: Microsoft Corporation
    Inventors: Russell Irvin Sanchez, Malek M. Chalabi, Kordell B. Krauskopf
  • Publication number: 20120316676
    Abstract: Initial interaction between a mobile robot and at least one user is described herein. The mobile robot captures several images of its surroundings and identifies the existence of a user in at least one of those images. The robot then orients itself to face the user and outputs an instruction to the user regarding the user's orientation with respect to the robot. The mobile robot captures images of the user's face responsive to detecting that the user has followed the instruction. Information captured by the robot is uploaded to a cloud-storage system, where the information is included in a profile of the user and is shareable with others.
    Type: Application
    Filed: June 10, 2011
    Publication date: December 13, 2012
    Applicant: Microsoft Corporation
    Inventors: Jean Sebastien Fouillade, Russell Sanchez, Efstathios Papaefstathiou, Malek M. Chalabi
  • Publication number: 20120316680
    Abstract: A robot tracks objects using sensory data and follows an object selected by a user. The object can be designated by the user from a set of objects recognized by the robot. The relative positions and orientations of the robot and the object are determined, and the robot's position and orientation can be adjusted to maintain a desired relationship between the object and the robot. Using the robot's navigation system, obstacles can be avoided during movement. If the robot loses contact with the object being tracked, it can continue to navigate and search the environment until the object is reacquired.
    Type: Application
    Filed: June 13, 2011
    Publication date: December 13, 2012
    Applicant: Microsoft Corporation
    Inventors: Charles F. Olivier, III, Jean Sebastien Fouillade, Adrien Felon, Jeffrey Cole, Nathaniel T. Clinton, Russell Sanchez, Francois Burianek, Malek M. Chalabi, Harshavardhana Narayana Kikkeri
  • Publication number: 20120277914
    Abstract: The subject disclosure is directed towards a set of autonomous and semi-autonomous modes for a robot by which the robot captures content (e.g., still images and video) from a location such as a house. The robot may produce a summarized presentation of the content (a “botcast”) that is appropriate for a specific scenario, such as an event, according to a specified style. The modes include an event mode in which the robot may interact with and stimulate event participants to provide desired content for capture. A patrol mode operates the robot to move among locations (e.g., different rooms) to capture a panorama (e.g., 360 degrees) of images that can be remotely viewed.
    Type: Application
    Filed: April 29, 2011
    Publication date: November 1, 2012
    Applicant: Microsoft Corporation
    Inventors: William M. Crow, Nathaniel T. Clinton, Malek M. Chalabi, Dane T. Storrusten
  • Publication number: 20120215380
    Abstract: Described herein are technologies pertaining to robot navigation. The robot includes a video camera that is configured to transmit a live video feed to a remotely located computing device. A user interacts with the live video feed, and the robot navigates in its environment based upon the user interaction. In a first navigation mode, the user selects a location, and the robot autonomously navigates to the selected location. In a second navigation mode, the user causes the point of view of the video camera on the robot to change, and thereafter causes the robot to semi-autonomously drive in a direction corresponding to the new point of view of the video camera. In a third navigation mode, the user causes the robot to navigate to a selected location in the live video feed.
    Type: Application
    Filed: February 23, 2011
    Publication date: August 23, 2012
    Applicant: Microsoft Corporation
    Inventors: Jean Sebastien Fouillade, Charles F. Olivier, III, Malek M. Chalabi, Nathaniel T. Clinton, Russ Sanchez, Chad Aron Voss
  • Publication number: 20120185090
    Abstract: The subject disclosure is directed towards a robot device including a model that controls the robot's task-related operations to perform tasks and its user-engagement operations to interact with the user. The model controls the operating states, including transitioning from an autonomous task state to an engaged state based on current context determined from various stimuli, e.g., information received via sensors of the robot and/or learned data. The robot may seek to engage the user, the user may seek to engage the robot, or the user and robot may meet by chance, in which case either may attempt to initiate the engagement.
    Type: Application
    Filed: January 13, 2011
    Publication date: July 19, 2012
    Applicant: Microsoft Corporation
    Inventors: Russell Irvin Sanchez, Malek M. Chalabi, Kordell B. Krauskopf
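The initial-interaction flow described in patent 9950431 (detect a user in captured images, orient toward the user, instruct, capture face images, upload to cloud storage) can be sketched as pseudocode. This is a minimal illustration only: every class, function, and field name below is a hypothetical assumption, not an API prescribed by the patent.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class InteractionSession:
    """Tracks information gathered during one robot/user enrollment."""
    profile: dict = field(default_factory=dict)

def detect_user(images: list) -> Optional[int]:
    """Return the index of the first image containing a user, else None.
    Stands in for the robot's person-detection step."""
    for i, img in enumerate(images):
        if img.get("has_user"):
            return i
    return None

def run_initial_interaction(images: list, cloud: dict) -> Optional[dict]:
    """Orient toward a detected user, instruct the user, capture face
    images, and upload the captured information to cloud storage."""
    idx = detect_user(images)
    if idx is None:
        return None  # no user found in any captured image
    session = InteractionSession()
    # 1. Orient the robot toward the user (stub: record the bearing).
    session.profile["heading"] = images[idx]["bearing_deg"]
    # 2. Output an instruction about how the user should face the robot.
    session.profile["instruction"] = "Please face the robot."
    # 3. Capture face images once the user complies (assumed here).
    session.profile["face_images"] = ["face_0.png"]
    # 4. Upload to cloud storage, where the profile is shareable.
    cloud.setdefault("profiles", {})["user_1"] = session.profile
    return session.profile
```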
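The engagement model of patent 8818556 (and its publication 20120185090) describes transitions between an autonomous task state and an engaged state driven by sensed stimuli, with engagement initiated by the robot, by the user, or by chance. A toy state machine under those assumptions might look like the following; the state names and trigger keys are illustrative, not taken from the patent.

```python
AUTONOMOUS, ENGAGED = "autonomous_task", "engaged"

def next_state(state: str, stimuli: dict) -> str:
    """Decide the robot's operating state from the current stimuli.

    Engagement may be initiated by the user (speech or touch directed
    at the robot), by the robot (it is seeking a user), or by a chance
    meeting (a user is detected nearby).
    """
    user_initiated = bool(stimuli.get("speech") or stimuli.get("touch"))
    robot_initiated = stimuli.get("robot_seeking_user", False)
    chance_meeting = stimuli.get("user_nearby", False)
    if state == AUTONOMOUS and (user_initiated or robot_initiated or chance_meeting):
        return ENGAGED
    if state == ENGAGED and stimuli.get("user_departed", False):
        return AUTONOMOUS  # resume the interrupted task
    return state
```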
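The object-following behavior of publication 20120316680 (maintain a desired relationship to a tracked object, fall back to searching when contact is lost) can be sketched as a single control step. The proportional rule and all names are illustrative assumptions, not details from the publication.

```python
from typing import Optional, Tuple

def follow_step(object_range: Optional[float],
                desired_range: float = 1.5,
                gain: float = 0.5) -> Tuple[str, float]:
    """Return (mode, forward_velocity) for one control step.

    object_range is the sensed distance to the tracked object in
    meters, or None when the object has been lost from the sensors.
    """
    if object_range is None:
        # Lost contact: keep navigating and search until reacquired.
        return ("search", 0.0)
    # Proportional control toward the desired standoff distance:
    # drive forward when too far, back up when too close.
    error = object_range - desired_range
    return ("follow", gain * error)
```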
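Publication 20120215380 describes three navigation modes driven by a user's interaction with the robot's live video feed. A hypothetical dispatcher under those assumptions is sketched below; the mode names, request keys, and command structure are all invented for illustration.

```python
def navigate(mode: str, request: dict) -> dict:
    """Map a user interaction with the live video feed to a robot command."""
    if mode == "autonomous":
        # Mode 1: the user selects a location; the robot plans its own path.
        return {"action": "goto", "target": request["location"], "autonomy": "full"}
    if mode == "semi_autonomous":
        # Mode 2: the user re-aims the camera; the robot then drives
        # semi-autonomously along the camera's new heading.
        return {"action": "drive", "heading": request["camera_pan_deg"], "autonomy": "semi"}
    if mode == "video_click":
        # Mode 3: the user selects a location directly in the live video feed.
        return {"action": "goto", "target": request["pixel"], "autonomy": "full"}
    raise ValueError(f"unknown navigation mode: {mode}")
```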