Patents by Inventor Charu Pandhi
Charu Pandhi has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 11947437
Abstract: Provided is a method, computer program product, and system for automatically assigning robotic devices to users based on need using predictive analytics. A processor may monitor activities performed by one or more users. The processor may determine, based on the monitoring, a set of activities that require assistance from a robotic device when being performed by the one or more users. The processor may match the set of activities to a set of capabilities related to a plurality of robotic devices. The processor may identify, based on the matching, a first robotic device that is capable of assisting the one or more users in performing a first activity of the set of activities. The processor may deploy the first robotic device to assist the one or more users in performing the first activity.
Type: Grant
Filed: July 29, 2020
Date of Patent: April 2, 2024
Assignee: International Business Machines Corporation
Inventors: Willie L. Scott, II, Charu Pandhi, Seema Nagar, Kuntal Dey
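The matching step described in this abstract can be illustrated with a minimal Python sketch. The device names and capability sets below are hypothetical examples, not details from the patent.

```python
# Hypothetical sketch of matching activities that need assistance to robotic
# device capabilities, then picking the first capable device to deploy.

def match_device(activities_needing_help, devices):
    """Return (device_name, activity) for the first device whose
    capabilities cover an activity, or None if no device matches."""
    for activity in activities_needing_help:
        for name, capabilities in devices.items():
            if activity in capabilities:
                return name, activity
    return None

# Illustrative capability sets for two hypothetical robotic devices.
devices = {
    "helper-bot-1": {"lifting", "fetching"},
    "helper-bot-2": {"cleaning", "opening doors"},
}
print(match_device(["opening doors", "lifting"], devices))
```

A production system would of course rank candidates rather than take the first match, but the shape of the activity-to-capability lookup is the same.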
-
Patent number: 11373373
Abstract: Techniques for translating air writing to augmented reality (AR) devices are described. In one embodiment, a method includes receiving indications of gestures from an originator. The indications identify movement in three dimensions that correspond to an emphasis conferred on one or more words that are air-written by the originator and are configured to be displayed by a plurality of AR devices. The method includes analyzing the identified movement to determine a gesture type associated with the emphasis, where the gesture type includes a first emphasis to be conferred on the air-written words. The method includes providing a display of the air-written words on a first AR device of the plurality of AR devices. The first emphasis is conferred on the words on the display of the first AR device using a first gesture display style based on a profile of a first user utilizing the first AR device.
Type: Grant
Filed: October 22, 2019
Date of Patent: June 28, 2022
Assignee: International Business Machines Corporation
Inventors: Willie L. Scott, II, Charu Pandhi, Seema Nagar, Kuntal Dey
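The per-user rendering step — mapping a detected gesture type to a display style chosen by the viewing user's profile — can be sketched as a simple lookup. The gesture types, profile names, and style names here are invented for illustration.

```python
# Illustrative mapping: a detected gesture type is rendered with a display
# style selected per viewer profile, loosely following the abstract.

GESTURE_STYLES = {
    "underline": {"default": "underline", "high_contrast": "bold"},
    "circle": {"default": "italic", "high_contrast": "highlight"},
}

def style_for(gesture_type, user_profile):
    """Resolve the display style for one viewer, falling back to the
    gesture's default style for unknown profiles."""
    styles = GESTURE_STYLES.get(gesture_type, {})
    return styles.get(user_profile, styles.get("default"))

print(style_for("underline", "high_contrast"))  # bold, in this sketch
```

The key idea the abstract describes is that the same emphasis gesture can render differently on each AR device, driven by that device's user profile.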
-
Publication number: 20220035727
Abstract: Provided is a method, computer program product, and system for automatically assigning robotic devices to users based on need using predictive analytics. A processor may monitor activities performed by one or more users. The processor may determine, based on the monitoring, a set of activities that require assistance from a robotic device when being performed by the one or more users. The processor may match the set of activities to a set of capabilities related to a plurality of robotic devices. The processor may identify, based on the matching, a first robotic device that is capable of assisting the one or more users in performing a first activity of the set of activities. The processor may deploy the first robotic device to assist the one or more users in performing the first activity.
Type: Application
Filed: July 29, 2020
Publication date: February 3, 2022
Inventors: Willie L. Scott, II, Charu Pandhi, Seema Nagar, Kuntal Dey
-
Publication number: 20210209365
Abstract: Embodiments herein provide an augmented reality (AR) system that uses sound localization to identify sounds that may be of interest to a user and generates an audio description of the source of the sound as well as AR content that can be magnified and displayed to the user. In one embodiment, an AR device captures images that have the source of the sound within their field of view. Using machine learning (ML) techniques, the AR device can identify the object creating the sound (i.e., the sound source). A description of the sound source and its actions can be output to the user. In parallel, the AR device can also generate AR content for the sound source. For example, the AR device can magnify the sound source to a size that is viewable to the user and create AR content that is then superimposed onto a display.
Type: Application
Filed: January 2, 2020
Publication date: July 8, 2021
Inventors: Willie L. Scott, II, Seema Nagar, Charu Pandhi, Kuntal Dey
-
Patent number: 11055533
Abstract: Embodiments herein provide an augmented reality (AR) system that uses sound localization to identify sounds that may be of interest to a user and generates an audio description of the source of the sound as well as AR content that can be magnified and displayed to the user. In one embodiment, an AR device captures images that have the source of the sound within their field of view. Using machine learning (ML) techniques, the AR device can identify the object creating the sound (i.e., the sound source). A description of the sound source and its actions can be output to the user. In parallel, the AR device can also generate AR content for the sound source. For example, the AR device can magnify the sound source to a size that is viewable to the user and create AR content that is then superimposed onto a display.
Type: Grant
Filed: January 2, 2020
Date of Patent: July 6, 2021
Assignee: International Business Machines Corporation
Inventors: Willie L. Scott, II, Seema Nagar, Charu Pandhi, Kuntal Dey
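The flow in this abstract — localize a sound, classify its source in the captured frame, then emit both a description and magnified AR content — can be outlined as a toy pipeline. The classifier, labels, and magnification factor below are stand-ins, not details from the patent.

```python
# Toy pipeline: given a sound's direction and a classifier for the captured
# frame, produce a spoken description plus a magnified AR overlay.

def process_sound_event(direction, frame, classify, scale=4.0):
    source = classify(frame)                      # e.g. "a bird singing"
    description = f"{source} detected to your {direction}"
    overlay = {"source": source, "scale": scale}  # magnified AR content
    return description, overlay

desc, overlay = process_sound_event("left", "frame-0",
                                    lambda f: "a bird singing")
print(desc)  # a bird singing detected to your left
```

The description and the overlay are produced in parallel in the patented system; this sketch just shows the two outputs being derived from the same classification result.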
-
Patent number: 11042259
Abstract: An approach is provided in which the approach deconstructs a user interface into user interface elements, each of which is assigned an importance score. The approach compares a user eye gaze pattern of a user viewing the user interface against an expected eye gaze pattern corresponding to the user interface, and determines that the user requires assistance navigating the user interface. The approach selects one of the user interface elements based on its importance score, generates an augmented reality overlay of the selected user interface element, and displays the augmented reality overlay on the user interface using an augmented reality device.
Type: Grant
Filed: August 18, 2019
Date of Patent: June 22, 2021
Assignee: International Business Machines Corporation
Inventors: Willie L. Scott, II, Charu Pandhi, Mohit Jain, Kuntal Dey
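The selection logic in this abstract can be sketched simply: if the observed gaze path diverges from the expected one, pick the highest-importance element to overlay. The overlap-based similarity measure, its threshold, and the element scores below are assumptions made for illustration.

```python
# Sketch of the assistance decision: compare observed vs. expected gaze
# paths; on divergence, select the highest-importance UI element.

def element_to_highlight(expected_path, observed_path, elements, threshold=0.5):
    """elements maps UI element name -> importance score.
    Returns the element to overlay, or None if the user is on track."""
    overlap = len(set(expected_path) & set(observed_path)) / max(len(expected_path), 1)
    if overlap >= threshold:
        return None  # user is navigating as expected
    return max(elements, key=elements.get)

elements = {"search box": 0.9, "footer link": 0.2}
print(element_to_highlight(["search box", "results"], ["footer link"], elements))
```

A real implementation would compare gaze fixations in order and over time; the set-overlap here only stands in for that comparison.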
-
Publication number: 20210118232
Abstract: Techniques for translating air writing to augmented reality (AR) devices are described. In one embodiment, a method includes receiving indications of gestures from an originator. The indications identify movement in three dimensions that correspond to an emphasis conferred on one or more words that are air-written by the originator and are configured to be displayed by a plurality of AR devices. The method includes analyzing the identified movement to determine a gesture type associated with the emphasis, where the gesture type includes a first emphasis to be conferred on the air-written words. The method includes providing a display of the air-written words on a first AR device of the plurality of AR devices. The first emphasis is conferred on the words on the display of the first AR device using a first gesture display style based on a profile of a first user utilizing the first AR device.
Type: Application
Filed: October 22, 2019
Publication date: April 22, 2021
Applicant: International Business Machines Corporation
Inventors: Willie L. Scott, II, Charu Pandhi, Seema Nagar, Kuntal Dey
-
Publication number: 20210048938
Abstract: An approach is provided in which the approach deconstructs a user interface into user interface elements, each of which is assigned an importance score. The approach compares a user eye gaze pattern of a user viewing the user interface against an expected eye gaze pattern corresponding to the user interface, and determines that the user requires assistance navigating the user interface. The approach selects one of the user interface elements based on its importance score, generates an augmented reality overlay of the selected user interface element, and displays the augmented reality overlay on the user interface using an augmented reality device.
Type: Application
Filed: August 18, 2019
Publication date: February 18, 2021
Inventors: Willie L. Scott, II, Charu Pandhi, Mohit Jain, Kuntal Dey
-
Patent number: 10782777
Abstract: Provided are systems, methods, and media for real-time alteration of video. An example method includes presenting a video to a user. The method includes monitoring a gaze point of the user as the user views one or more frames of the video. The method includes, in response to a determination that the monitored gaze point of the user is different from a predetermined target gaze point, changing the orientation of the video to reposition the target gaze point of the video to the monitored gaze point of the user, in which the orientation of the video is changed during the presentation of the video to the user.
Type: Grant
Filed: November 29, 2018
Date of Patent: September 22, 2020
Assignee: International Business Machines Corporation
Inventors: Willie L. Scott, II, Kuntal Dey, Mohit Jain, Charu Pandhi
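The repositioning step in this abstract amounts to computing the offset between the viewer's gaze and the target gaze point and translating the video by that offset. A minimal sketch, assuming normalized [0, 1] frame coordinates and an illustrative tolerance:

```python
# Sketch of gaze-driven reorientation: shift the frame so the target gaze
# point lands where the viewer is actually looking.

def reorient(target, gaze, tolerance=0.05):
    """target, gaze: (x, y) in normalized frame coordinates.
    Returns the (dx, dy) translation to apply, or (0, 0) if close enough."""
    dx, dy = gaze[0] - target[0], gaze[1] - target[1]
    if abs(dx) <= tolerance and abs(dy) <= tolerance:
        return (0.0, 0.0)  # gaze already near the target; no change
    return (dx, dy)

print(reorient(target=(0.5, 0.5), gaze=(0.8, 0.4)))
```

Applying this offset each frame while the video plays matches the abstract's requirement that the orientation change happen during presentation.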
-
Patent number: 10739864
Abstract: A gesture to speech conversion device may receive indications of user gestures via at least one sensor, the indications identifying movement in three dimensions. A 2-dimensional (2D) plane on which the beginning and the end of the movement are substantially planar, and a third dimension orthogonal to the 2D plane, may be determined. A change of the movement in a direction of the third dimension in the course of the movement occurring on the 2D plane is detected. The change of the movement in the third dimension is mapped to an emphasis in the movement. The movement is transformed into speech with emphasis on a part of the speech corresponding to the part of the movement having the detected change.
Type: Grant
Filed: December 31, 2018
Date of Patent: August 11, 2020
Assignee: International Business Machines Corporation
Inventors: Willie L. Scott, II, Charu Pandhi, Seema Nagar, Kuntal Dey
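The emphasis detection this abstract describes can be sketched concretely: the gesture mostly lies in a 2D plane (here taken as z ≈ 0), and a spike along the orthogonal third dimension marks the part of the movement to emphasize. The threshold value is an assumption for illustration.

```python
# Sketch of emphasis detection: points of a gesture stroke lie near a 2D
# plane; excursions along the orthogonal z axis mark emphasized parts.

def emphasized_indices(points, z_threshold=0.1):
    """points: list of (x, y, z) samples along the stroke.
    Returns indices where |z| exceeds the threshold, i.e. emphasis."""
    return [i for i, (_, _, z) in enumerate(points) if abs(z) > z_threshold]

stroke = [(0, 0, 0.0), (1, 0, 0.02), (2, 0, 0.3), (3, 0, 0.01)]
print(emphasized_indices(stroke))  # index 2 carries the emphasis
```

In the patented device, the speech segment corresponding to those indices would then be synthesized with added emphasis (e.g. stress or volume).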
-
Publication number: 20200209976
Abstract: A gesture to speech conversion device may receive indications of user gestures via at least one sensor, the indications identifying movement in three dimensions. A 2-dimensional (2D) plane on which the beginning and the end of the movement are substantially planar, and a third dimension orthogonal to the 2D plane, may be determined. A change of the movement in a direction of the third dimension in the course of the movement occurring on the 2D plane is detected. The change of the movement in the third dimension is mapped to an emphasis in the movement. The movement is transformed into speech with emphasis on a part of the speech corresponding to the part of the movement having the detected change.
Type: Application
Filed: December 31, 2018
Publication date: July 2, 2020
Inventors: Willie L. Scott, II, Charu Pandhi, Seema Nagar, Kuntal Dey
-
Publication number: 20200174559
Abstract: Provided are systems, methods, and media for real-time alteration of video. An example method includes presenting a video to a user. The method includes monitoring a gaze point of the user as the user views one or more frames of the video. The method includes, in response to a determination that the monitored gaze point of the user is different from a predetermined target gaze point, changing the orientation of the video to reposition the target gaze point of the video to the monitored gaze point of the user, in which the orientation of the video is changed during the presentation of the video to the user.
Type: Application
Filed: November 29, 2018
Publication date: June 4, 2020
Inventors: Willie L. Scott, II, Kuntal Dey, Mohit Jain, Charu Pandhi
-
Patent number: 10585936
Abstract: Textual content is analyzed to determine a tone of the content. A first color palette including a first plurality of colors is computed. The first plurality of colors corresponds to the tone. A first color of the first plurality of colors is selected, and a second color palette including a second plurality of colors is computed. The second plurality of colors corresponds to the first plurality of colors and satisfies a predetermined color-related accessibility requirement between the first selected color and each of the second plurality of colors. A second color of the second plurality of colors is selected, and the content is modified using the first selected color and the second selected color.
Type: Grant
Filed: June 12, 2017
Date of Patent: March 10, 2020
Assignee: International Business Machines Corporation
Inventors: Maureen Kraft, Fang Lu, Charu Pandhi
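The abstract does not name the "color-related accessibility requirement"; as one plausible illustration, the second palette could be filtered by the WCAG contrast ratio against the first selected color. The palette values below are made up, and the contrast formula follows the standard WCAG relative-luminance definition.

```python
# Illustrative filter: keep only second-palette candidates with sufficient
# WCAG contrast against the first selected color (4.5:1 for normal text).

def luminance(rgb):
    """WCAG relative luminance of an 8-bit-per-channel sRGB color."""
    def chan(c):
        c /= 255.0
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (chan(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast(c1, c2):
    """WCAG contrast ratio between two colors, in [1, 21]."""
    hi, lo = sorted((luminance(c1), luminance(c2)), reverse=True)
    return (hi + 0.05) / (lo + 0.05)

first = (20, 20, 60)  # hypothetical first selected color for a "serious" tone
palette = [(255, 255, 240), (90, 90, 120), (230, 200, 80)]
accessible = [c for c in palette if contrast(first, c) >= 4.5]
print(accessible)  # keeps only the high-contrast candidates
```

Other accessibility requirements (e.g. color-blindness-safe hue separation) could replace the contrast check without changing the overall palette-filtering shape the abstract describes.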
-
Publication number: 20180357231
Abstract: Textual content is analyzed to determine a tone of the content. A first color palette including a first plurality of colors is computed. The first plurality of colors corresponds to the tone. A first color of the first plurality of colors is selected, and a second color palette including a second plurality of colors is computed. The second plurality of colors corresponds to the first plurality of colors and satisfies a predetermined color-related accessibility requirement between the first selected color and each of the second plurality of colors. A second color of the second plurality of colors is selected, and the content is modified using the first selected color and the second selected color.
Type: Application
Filed: June 12, 2017
Publication date: December 13, 2018
Applicant: International Business Machines Corporation
Inventors: Maureen Kraft, Fang Lu, Charu Pandhi