Patents by Inventor Charu Pandhi

Charu Pandhi has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11947437
    Abstract: Provided is a method, computer program product, and system for automatically assigning robotic devices to users based on need using predictive analytics. A processor may monitor activities performed by one or more users. The processor may determine, based on the monitoring, a set of activities that require assistance from a robotic device when being performed by the one or more users. The processor may match the set of activities to a set of capabilities related to a plurality of robotic devices. The processor may identify, based on the matching, a first robotic device that is capable of assisting the one or more users in performing a first activity of the set of activities. The processor may deploy the first robotic device to assist the one or more users in performing the first activity.
    Type: Grant
    Filed: July 29, 2020
    Date of Patent: April 2, 2024
    Assignee: International Business Machines Corporation
    Inventors: Willie L. Scott, II, Charu Pandhi, Seema Nagar, Kuntal Dey
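The matching step in the abstract above lends itself to a short illustration. The sketch below is a minimal, hypothetical Python reading of it: activity needs are compared against each robot's capability set, and a device is only "deployed" when every identified need is covered. The class names, capability labels, and full-coverage rule are illustrative assumptions, not the patented implementation.

```python
# Hypothetical sketch of the capability-matching step described above.
from dataclasses import dataclass, field


@dataclass
class RoboticDevice:
    name: str
    capabilities: set[str] = field(default_factory=set)


def match_robot(activity_needs: set[str], robots: list[RoboticDevice]):
    """Return the robot whose capabilities best cover the activity's needs."""
    best, best_score = None, 0.0
    for robot in robots:
        covered = activity_needs & robot.capabilities
        score = len(covered) / len(activity_needs)  # fraction of needs covered
        if score > best_score:
            best, best_score = robot, score
    # Only "deploy" a robot if it covers every identified need.
    return best if best_score == 1.0 else None


if __name__ == "__main__":
    robots = [
        RoboticDevice("lift-assist-bot", {"lifting", "navigation"}),
        RoboticDevice("fetch-bot", {"grasping", "navigation"}),
    ]
    # Needs would be inferred from monitored activities via predictive analytics.
    needs = {"lifting", "navigation"}
    chosen = match_robot(needs, robots)
    print("Deploy:", chosen.name if chosen else "no suitable robot")
```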
  • Patent number: 11373373
    Abstract: Techniques for translating air writing to augmented reality (AR) devices are described. In one embodiment, a method includes receiving indications of gestures from an originator. The indications identify movement in three dimensions that correspond to an emphasis conferred on one or more words that are air-written by the originator and are configured to be displayed by a plurality of AR devices. The method includes analyzing the identified movement to determine a gesture type associated with the emphasis, where the gesture type includes a first emphasis to be conferred on the air-written words. The method includes providing a display of the air-written words on a first AR device of the plurality of AR devices. The first emphasis is conferred on the words on the display of the first AR device using a first gesture display style based on a profile of a first user utilizing the first AR device.
    Type: Grant
    Filed: October 22, 2019
    Date of Patent: June 28, 2022
    Assignee: International Business Machines Corporation
    Inventors: Willie L. Scott, II, Charu Pandhi, Seema Nagar, Kuntal Dey
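As a rough illustration of the per-viewer rendering idea in the abstract above, the sketch below classifies the depth component of an air-writing gesture into an emphasis level and then picks a display style from each viewing user's profile. The thresholds, style names, and profile format are assumptions made for the example, not the patented method.

```python
# Hypothetical sketch: same air-written emphasis, per-viewer display style.

def classify_emphasis(depth_delta: float) -> str:
    """Classify how strongly the writer pushed into the third dimension."""
    if depth_delta > 0.10:   # metres; arbitrary illustrative threshold
        return "strong"
    if depth_delta > 0.03:
        return "mild"
    return "none"


def render_for_viewer(words: str, emphasis: str, profile: dict) -> str:
    """Pick a gesture display style for this viewer's AR device."""
    # Each profile maps an emphasis level to that user's preferred style.
    style = profile.get(emphasis, "plain")
    if style == "bold":
        return f"**{words}**"
    if style == "highlight":
        return f"[{words}]"
    return words


if __name__ == "__main__":
    profiles = {
        "viewer_a": {"strong": "bold", "mild": "highlight"},
        "viewer_b": {"strong": "highlight"},
    }
    emphasis = classify_emphasis(depth_delta=0.12)
    for viewer, profile in profiles.items():
        print(viewer, "->", render_for_viewer("hello world", emphasis, profile))
```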
  • Publication number: 20220035727
    Abstract: Provided is a method, computer program product, and system for automatically assigning robotic devices to users based on need using predictive analytics. A processor may monitor activities performed by one or more users. The processor may determine, based on the monitoring, a set of activities that require assistance from a robotic device when being performed by the one or more users. The processor may match the set of activities to a set of capabilities related to a plurality of robotic devices. The processor may identify, based on the matching, a first robotic device that is capable of assisting the one or more users in performing a first activity of the set of activities. The processor may deploy the first robotic device to assist the one or more users in performing the first activity.
    Type: Application
    Filed: July 29, 2020
    Publication date: February 3, 2022
    Inventors: Willie L. Scott, II, Charu Pandhi, Seema Nagar, Kuntal Dey
  • Publication number: 20210209365
    Abstract: Embodiments herein provide an augmented reality (AR) system that uses sound localization to identify sounds that may be of interest to a user and generates an audio description of the source of the sound as well as AR content that can be magnified and displayed to the user. In one embodiment, an AR device captures images that have the source of the sound within their field of view. Using machine learning (ML) techniques, the AR device can identify the object creating the sound (i.e., the sound source). A description of the sound source and its actions can be output to the user. In parallel, the AR device can also generate AR content for the sound source. For example, the AR device can magnify the sound source to a size that is viewable to the user and create AR content that is then superimposed onto a display.
    Type: Application
    Filed: January 2, 2020
    Publication date: July 8, 2021
    Inventors: Willie L. Scott, II, Seema Nagar, Charu Pandhi, Kuntal Dey
  • Patent number: 11055533
    Abstract: Embodiments herein provide an augmented reality (AR) system that uses sound localization to identify sounds that may be of interest to a user and generates an audio description of the source of the sound as well as AR content that can be magnified and displayed to the user. In one embodiment, an AR device captures images that have the source of the sound within their field of view. Using machine learning (ML) techniques, the AR device can identify the object creating the sound (i.e., the sound source). A description of the sound source and its actions can be output to the user. In parallel, the AR device can also generate AR content for the sound source. For example, the AR device can magnify the sound source to a size that is viewable to the user and create AR content that is then superimposed onto a display.
    Type: Grant
    Filed: January 2, 2020
    Date of Patent: July 6, 2021
    Assignee: International Business Machines Corporation
    Inventors: Willie L. Scott, II, Seema Nagar, Charu Pandhi, Kuntal Dey
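The abstract above describes a pipeline: localize a sound, identify its source with ML, describe it, and magnify it into AR content. The sketch below is a highly simplified stand-in for that flow; the detector is a stub, and the labels, bounding box, and magnification factor are invented for illustration rather than taken from the patent.

```python
# Hypothetical, simplified pipeline: localize -> identify -> describe -> magnify.
from dataclasses import dataclass


@dataclass
class Detection:
    label: str                        # e.g. "bird"
    action: str                       # e.g. "singing"
    bbox: tuple[int, int, int, int]   # x, y, w, h in the captured image


def identify_sound_source(image, direction_deg: float) -> Detection:
    # Placeholder for an ML detector restricted to the localized direction.
    return Detection("bird", "singing", (410, 120, 40, 30))


def describe(det: Detection) -> str:
    return f"A {det.label} is {det.action} to your upper right."


def magnify(det: Detection, factor: float = 4.0) -> tuple[int, int]:
    # Size of the AR content after magnification so the small source is viewable.
    _, _, w, h = det.bbox
    return int(w * factor), int(h * factor)


if __name__ == "__main__":
    det = identify_sound_source(image=None, direction_deg=35.0)
    print(describe(det))              # audio description (via TTS in practice)
    print("overlay size:", magnify(det))   # superimposed AR content dimensions
```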
  • Patent number: 11042259
    Abstract: An approach is provided in which the approach deconstructs a user interface into user interface elements that are each assigned an importance score. The approach compares the eye gaze pattern of a user viewing the user interface against an expected eye gaze pattern corresponding to the user interface, and determines that the user requires assistance navigating the user interface. The approach selects one of the user interface elements based on its importance score, generates an augmented reality overlay of the selected user interface element, and displays the augmented reality overlay on the user interface using an augmented reality device.
    Type: Grant
    Filed: August 18, 2019
    Date of Patent: June 22, 2021
    Assignee: International Business Machines Corporation
    Inventors: Willie L. Scott, II, Charu Pandhi, Mohit Jain, Kuntal Dey
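A minimal sketch of the gaze-comparison logic described above: if the observed gaze trace diverges too far from the expected trace, the highest-importance UI element is chosen for an AR overlay. The distance metric, threshold, and element scores are assumptions for illustration, not the patented method.

```python
# Hypothetical sketch of the gaze-divergence check and overlay selection.
import math


def gaze_divergence(observed: list[tuple[float, float]],
                    expected: list[tuple[float, float]]) -> float:
    """Mean point-to-point distance between two equal-length gaze traces."""
    return sum(math.dist(o, e) for o, e in zip(observed, expected)) / len(expected)


def element_to_highlight(elements: dict[str, float],
                         observed, expected, threshold: float = 50.0):
    """Return the element to overlay, or None if no assistance is needed."""
    if gaze_divergence(observed, expected) < threshold:
        return None                          # gaze matches expectations
    return max(elements, key=elements.get)   # highest importance score


if __name__ == "__main__":
    elements = {"submit_button": 0.9, "help_link": 0.4, "logo": 0.1}
    expected = [(100, 100), (200, 150), (300, 400)]
    observed = [(110, 105), (500, 500), (600, 120)]  # wandering gaze
    print("overlay:", element_to_highlight(elements, observed, expected))
```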
  • Publication number: 20210118232
    Abstract: Techniques for translating air writing to augmented reality (AR) devices are described. In one embodiment, a method includes receiving indications of gestures from an originator. The indications identify movement in three dimensions that correspond to an emphasis conferred on one or more words that are air-written by the originator and are configured to be displayed by a plurality of AR devices. The method includes analyzing the identified movement to determine a gesture type associated with the emphasis, where the gesture type includes a first emphasis to be conferred on the air-written words. The method includes providing a display of the air-written words on a first AR device of the plurality of AR devices. The first emphasis is conferred on the words on the display of the first AR device using a first gesture display style based on a profile of a first user utilizing the first AR device.
    Type: Application
    Filed: October 22, 2019
    Publication date: April 22, 2021
    Applicant: International Business Machines Corporation
    Inventors: Willie L. Scott, II, Charu Pandhi, Seema Nagar, Kuntal Dey
  • Publication number: 20210048938
    Abstract: An approach is provided in which the approach deconstructs a user interface into user interface elements that are each assigned an importance score. The approach compares the eye gaze pattern of a user viewing the user interface against an expected eye gaze pattern corresponding to the user interface, and determines that the user requires assistance navigating the user interface. The approach selects one of the user interface elements based on its importance score, generates an augmented reality overlay of the selected user interface element, and displays the augmented reality overlay on the user interface using an augmented reality device.
    Type: Application
    Filed: August 18, 2019
    Publication date: February 18, 2021
    Inventors: Willie L. Scott, II, Charu Pandhi, Mohit Jain, Kuntal Dey
  • Patent number: 10782777
    Abstract: Provided are systems, methods, and media for real-time alteration of video. An example method includes presenting a video to a user. The method includes monitoring a gaze point of the user as the user views one or more frames of the video. The method includes, in response to a determination that the monitored gaze point of the user is different from a predetermined target gaze point, changing the orientation of the video to reposition the target gaze point of the video to the monitored gaze point of the user, in which the orientation of the video is changed during the presentation of the video to the user.
    Type: Grant
    Filed: November 29, 2018
    Date of Patent: September 22, 2020
    Assignee: International Business Machines Corporation
    Inventors: Willie L. Scott, II, Kuntal Dey, Mohit Jain, Charu Pandhi
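The reorientation rule in the abstract above can be illustrated with a small sketch: when the viewer's gaze drifts away from the intended target point, the frame is panned so the target lands under the actual gaze. The flat 2-D offset and the tolerance value are simplifying assumptions; a real system might instead re-orient a 360-degree video.

```python
# Hypothetical sketch: pan the frame so the target point meets the viewer's gaze.

def reorient_offset(target: tuple[int, int],
                    gaze: tuple[int, int],
                    tolerance: int = 20) -> tuple[int, int]:
    """Return the (dx, dy) pan to apply to the current frame, or (0, 0)."""
    dx, dy = gaze[0] - target[0], gaze[1] - target[1]
    if abs(dx) <= tolerance and abs(dy) <= tolerance:
        return (0, 0)   # gaze is already on target; leave the frame alone
    return (dx, dy)     # pan so the target moves under the viewer's gaze


if __name__ == "__main__":
    target_point = (960, 540)    # where the action is in the frame
    monitored_gaze = (400, 300)  # where the viewer is actually looking
    print("pan frame by:", reorient_offset(target_point, monitored_gaze))
```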
  • Patent number: 10739864
    Abstract: A gesture to speech conversion device may receive indications of user gestures via at least one sensor, the indications identifying movement in three dimensions. A 2-dimensional (2D) plane on which the beginning and the end of the movement are substantially planar, and a third dimension orthogonal to the 2D plane, may be determined. A change of the movement in the direction of the third dimension, occurring in the course of the movement on the 2D plane, is detected. The change of the movement in the third dimension is mapped to an emphasis in the movement. The movement is transformed into speech, with emphasis on the part of the speech corresponding to the part of the movement having the detected change.
    Type: Grant
    Filed: December 31, 2018
    Date of Patent: August 11, 2020
    Assignee: International Business Machines Corporation
    Inventors: Willie L. Scott, II, Charu Pandhi, Seema Nagar, Kuntal Dey
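As a rough sketch of the depth-to-emphasis mapping described above: gesture samples that leave the fitted 2-D plane beyond a threshold mark the corresponding words for emphasis in the synthesized speech. The one-sample-per-word alignment, the threshold, and the SSML-style markup are assumptions for illustration only.

```python
# Hypothetical sketch: off-plane gesture depth marks words for spoken emphasis.

def emphasized_segments(depth_samples: list[float],
                        words: list[str],
                        threshold: float = 0.05) -> list[str]:
    """Mark words whose corresponding gesture samples leave the 2-D plane."""
    # Assume one depth sample (deviation from the fitted plane) per word.
    out = []
    for word, depth in zip(words, depth_samples):
        if abs(depth) > threshold:
            out.append(f"<emphasis>{word}</emphasis>")  # SSML-like emphasis tag
        else:
            out.append(word)
    return out


if __name__ == "__main__":
    words = ["please", "close", "the", "door"]
    depth = [0.01, 0.12, 0.00, 0.02]   # metres off the gesture plane, per word
    print(" ".join(emphasized_segments(depth, words)))
    # -> please <emphasis>close</emphasis> the door
```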
  • Publication number: 20200209976
    Abstract: A gesture to speech conversion device may receive indications of user gestures via at least one sensor, the indications identifying movement in three dimensions. A 2-dimensional (2D) plane on which the beginning and the end of the movement are substantially planar, and a third dimension orthogonal to the 2D plane, may be determined. A change of the movement in the direction of the third dimension, occurring in the course of the movement on the 2D plane, is detected. The change of the movement in the third dimension is mapped to an emphasis in the movement. The movement is transformed into speech, with emphasis on the part of the speech corresponding to the part of the movement having the detected change.
    Type: Application
    Filed: December 31, 2018
    Publication date: July 2, 2020
    Inventors: Willie L. Scott, II, Charu Pandhi, Seema Nagar, Kuntal Dey
  • Publication number: 20200174559
    Abstract: Provided are systems, methods, and media for real-time alteration of video. An example method includes presenting a video to a user. The method includes monitoring a gaze point of the user as the user views one or more frames of the video. The method includes, in response to a determination that the monitored gaze point of the user is different from a predetermined target gaze point, changing the orientation of the video to reposition the target gaze point of the video to the monitored gaze point of the user, in which the orientation of the video is changed during the presentation of the video to the user.
    Type: Application
    Filed: November 29, 2018
    Publication date: June 4, 2020
    Inventors: Willie L. Scott, II, Kuntal Dey, Mohit Jain, Charu Pandhi
  • Patent number: 10585936
    Abstract: Textual content is analyzed to determine a tone of the content. A first color palette including a first plurality of colors is computed. The first plurality of colors corresponds to the tone. A first color of the first plurality of colors is selected, and a second color palette including a second plurality of colors is computed. The second plurality of colors corresponds to the first plurality of colors and satisfies a predetermined color-related accessibility requirement between the first selected color and each of the second plurality of colors. A second color of the second plurality of colors is selected, and the content is modified using the first selected color and the second selected color.
    Type: Grant
    Filed: June 12, 2017
    Date of Patent: March 10, 2020
    Assignee: International Business Machines Corporation
    Inventors: Maureen Kraft, Fang Lu, Charu Pandhi
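The accessibility requirement in the abstract above can be illustrated with a WCAG-style contrast check: candidate second colors are kept only if their contrast ratio against the first selected color meets a threshold. The tone analysis and the palettes themselves are out of scope here, and the 4.5:1 threshold and sample colors are assumptions rather than the patented requirement.

```python
# Hypothetical sketch: filter a second palette by contrast against the first color.

def relative_luminance(rgb: tuple[int, int, int]) -> float:
    """sRGB relative luminance per the WCAG definition."""
    def channel(c: int) -> float:
        c = c / 255
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (channel(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b


def contrast_ratio(c1, c2) -> float:
    l1, l2 = sorted((relative_luminance(c1), relative_luminance(c2)), reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)


def accessible_second_palette(first_color, candidates, min_ratio: float = 4.5):
    """Keep candidates that satisfy the contrast requirement vs. first_color."""
    return [c for c in candidates if contrast_ratio(first_color, c) >= min_ratio]


if __name__ == "__main__":
    # Suppose tone analysis picked a calm dark blue as the first color.
    first = (20, 40, 90)
    candidates = [(255, 255, 255), (200, 200, 60), (30, 30, 30)]
    print(accessible_second_palette(first, candidates))  # drops low-contrast colors
```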
  • Publication number: 20180357231
    Abstract: Textual content is analyzed to determine a tone of the content. A first color palette including a first plurality of colors is computed. The first plurality of colors corresponds to the tone. A first color of the first plurality of colors is selected, and a second color palette including a second plurality of colors is computed. The second plurality of colors corresponds to the first plurality of colors and satisfies a predetermined color-related accessibility requirement between the first selected color and each of the second plurality of colors. A second color of the second plurality of colors is selected, and the content is modified using the first selected color and the second selected color.
    Type: Application
    Filed: June 12, 2017
    Publication date: December 13, 2018
    Applicant: International Business Machines Corporation
    Inventors: Maureen Kraft, Fang Lu, Charu Pandhi