Patents by Inventor Oded Dubovsky

Oded Dubovsky has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11651538
    Abstract: An approach for creating instructional 3D animated videos, without requiring physical access to the object or to the object's CAD models as a prerequisite, is disclosed. The approach allows the user to submit images or a video of the object along with knowledge about the required procedure, including the instructions and text annotations. The approach builds a 3D model based on the submitted images and/or video and then generates the instructional animated video based on the 3D model and the required procedure.
    Type: Grant
    Filed: March 17, 2021
    Date of Patent: May 16, 2023
    Assignee: International Business Machines Corporation
    Inventors: Adi Raz Goldfarb, Tal Drory, Oded Dubovsky
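A minimal sketch of how user-supplied procedure knowledge (instructions and text annotations) might drive the structure of a generated instructional video; the data classes and the build_animation_plan helper are hypothetical illustrations under assumed names, not the patented pipeline, and the 3D reconstruction and rendering steps are omitted.
```python
from dataclasses import dataclass
from typing import List

@dataclass
class ProcedureStep:
    instruction: str          # user-supplied instruction text
    annotation: str           # text annotation to overlay in the video
    target_part: str          # name of the model part the step refers to

@dataclass
class Keyframe:
    part: str                 # part of the 3D model to focus the camera on
    caption: str              # caption rendered for this segment
    duration_s: float         # length of the animated segment in seconds

def build_animation_plan(steps: List[ProcedureStep],
                         seconds_per_step: float = 4.0) -> List[Keyframe]:
    """Turn user-authored procedure steps into an ordered animation plan.

    The 3D reconstruction from images/video and the video rendering are
    outside this sketch; it only shows how procedure knowledge could shape
    the generated video.
    """
    return [
        Keyframe(part=s.target_part,
                 caption=f"{s.instruction}: {s.annotation}",
                 duration_s=seconds_per_step)
        for s in steps
    ]

if __name__ == "__main__":
    plan = build_animation_plan([
        ProcedureStep("Remove the side panel", "Two screws on the left", "side_panel"),
        ProcedureStep("Disconnect the fan cable", "Pull straight up", "fan_connector"),
    ])
    for kf in plan:
        print(kf)
```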
  • Patent number: 11620796
    Abstract: A method, a computer program product, and a computer system for transferring knowledge from an expert to a user using a mixed reality rendering. The method includes determining a user perspective of a user viewing an object on which a procedure is to be performed. The method includes determining an anchoring of the user perspective to an expert perspective, the expert perspective associated with an expert providing a demonstration of the procedure. The method includes generating a virtual rendering of the expert at the user perspective based on the anchoring at a scene viewed by the user, the virtual rendering corresponding to the demonstration of the procedure as performed by the expert. The method includes generating a mixed reality environment in which the virtual rendering of the expert is shown in the scene viewed by the user.
    Type: Grant
    Filed: March 1, 2021
    Date of Patent: April 4, 2023
    Assignee: International Business Machines Corporation
    Inventors: Joseph Shtok, Leonid Karlinsky, Adi Raz Goldfarb, Oded Dubovsky
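One way to picture the anchoring step in the abstract above is as a rigid transform between the expert's and the user's coordinate frames; the sketch below assumes 4x4 camera-pose matrices for a shared anchor observed by both sides and is an illustration, not the patented method.
```python
import numpy as np

def anchoring_transform(expert_anchor_pose: np.ndarray,
                        user_anchor_pose: np.ndarray) -> np.ndarray:
    """Transform mapping expert-frame coordinates into the user's frame.

    Both arguments are 4x4 homogeneous poses of the same physical anchor
    (e.g., a marker on the object), expressed in the expert's and the
    user's world frames respectively.
    """
    return user_anchor_pose @ np.linalg.inv(expert_anchor_pose)

def expert_points_in_user_frame(expert_points: np.ndarray,
                                transform: np.ndarray) -> np.ndarray:
    """Re-express expert demonstration points (N x 3) in the user's frame."""
    homogeneous = np.hstack([expert_points, np.ones((len(expert_points), 1))])
    return (homogeneous @ transform.T)[:, :3]

# Example: identical anchor poses give the identity transform, so the
# demonstration points are unchanged.
T = anchoring_transform(np.eye(4), np.eye(4))
print(expert_points_in_user_frame(np.array([[0.1, 0.2, 0.3]]), T))
```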
  • Patent number: 11501502
    Abstract: A method, computer system, and a computer program product for augmented reality guidance are provided. Device orientation instructions may be displayed as augmented reality on a display screen of a device. The device may include a camera and may be portable. The display screen may show a view of an object. At least one additional instruction may be received that includes at least one word directing user interaction with the object. The at least one additional instruction may be displayed on the display screen of the device. The camera may capture an image of the object regarding the at least one additional instruction. The image may be input to a first machine learning model so that an output of the first machine learning model is generated. The output may be received from the first machine learning model. The output may be displayed on the display screen.
    Type: Grant
    Filed: March 19, 2021
    Date of Patent: November 15, 2022
    Assignee: International Business Machines Corporation
    Inventors: Nancy Anne Greco, Oded Dubovsky, Adi Raz Goldfarb, John L. Ward
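A compact sketch of the capture, classify, and display loop the abstract describes; classify_image stands in for "the first machine learning model" and is a hypothetical callable, not an API from the patent.
```python
from typing import Callable
import numpy as np

def guidance_step(frame: np.ndarray,
                  instruction: str,
                  classify_image: Callable[[np.ndarray], str]) -> str:
    """One iteration of the AR guidance flow.

    The device has already shown orientation instructions and `instruction`
    (text directing user interaction with the object); the camera captures
    `frame` (an H x W x 3 image), the frame is fed to the model, and the
    model's output is returned for display as an overlay.
    """
    label = classify_image(frame)
    return f"Instruction: {instruction} | Model output: {label}"

# Example with a trivial stand-in model that only inspects mean brightness.
dummy_model = lambda img: "panel_open" if img.mean() > 127 else "panel_closed"
frame = np.full((480, 640, 3), 200, dtype=np.uint8)
print(guidance_step(frame, "Open the access panel", dummy_model))
```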
  • Publication number: 20220301266
    Abstract: A method, computer system, and a computer program product for augmented reality guidance are provided. Device orientation instructions may be displayed as augmented reality on a display screen of a device. The device may include a camera and may be portable. The display screen may show a view of an object. At least one additional instruction may be received that includes at least one word directing user interaction with the object. The at least one additional instruction may be displayed on the display screen of the device. The camera may capture an image of the object regarding the at least one additional instruction. The image may be input to a first machine learning model so that an output of the first machine learning model is generated. The output may be received from the first machine learning model. The output may be displayed on the display screen.
    Type: Application
    Filed: March 19, 2021
    Publication date: September 22, 2022
    Inventors: Nancy Anne Greco, Oded Dubovsky, Adi Raz Goldfarb, John L. Ward
  • Publication number: 20220301247
    Abstract: An approach for creating instructional 3D animated videos, without requiring physical access to the object or to the object's CAD models as a prerequisite, is disclosed. The approach allows the user to submit images or a video of the object along with knowledge about the required procedure, including the instructions and text annotations. The approach builds a 3D model based on the submitted images and/or video and then generates the instructional animated video based on the 3D model and the required procedure.
    Type: Application
    Filed: March 17, 2021
    Publication date: September 22, 2022
    Inventors: Adi Raz Goldfarb, Tal Drory, Oded Dubovsky
  • Publication number: 20220291981
    Abstract: In an approach for deducing a root cause analysis model, a processor trains a classifier based on labeled data to identify entities. A processor trains the classifier with a first taxonomy and ontology. A processor uses the classifier to classify each component from one or more augmented reality peer assistance sessions into a class. A processor generates a root cause analysis model based on the identified entities and the classified components.
    Type: Application
    Filed: March 9, 2021
    Publication date: September 15, 2022
    Inventors: Adi Raz Goldfarb, Oded Dubovsky, Erez Lev Meir Bilgory
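A toy sketch of the classify-then-aggregate idea behind the abstract: a text classifier labels components mentioned in peer-assistance session snippets, and the labels are tallied against reported failures to seed a simple root-cause table. The scikit-learn pipeline and the example data are assumptions for illustration, not the patented method.
```python
from collections import Counter
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy labeled data standing in for annotated session transcripts.
snippets = ["the pump is leaking near the seal",
            "compressor makes a grinding noise",
            "replaced the pump gasket",
            "compressor overheats after an hour"]
components = ["pump", "compressor", "pump", "compressor"]

classifier = make_pipeline(TfidfVectorizer(), LogisticRegression())
classifier.fit(snippets, components)

# Classify snippets from a new session and count component/failure pairs.
new_session = [("seal is leaking again", "leak"),
               ("loud grinding from the unit", "noise")]
root_cause_counts = Counter(
    (classifier.predict([text])[0], failure) for text, failure in new_session
)
print(root_cause_counts)   # component/failure co-occurrence counts
```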
  • Publication number: 20220277524
    Abstract: A method, a computer program product, and a computer system for transferring knowledge from an expert to a user using a mixed reality rendering. The method includes determining a user perspective of a user viewing an object on which a procedure is to be performed. The method includes determining an anchoring of the user perspective to an expert perspective, the expert perspective associated with an expert providing a demonstration of the procedure. The method includes generating a virtual rendering of the expert at the user perspective based on the anchoring at a scene viewed by the user, the virtual rendering corresponding to the demonstration of the procedure as performed by the expert. The method includes generating a mixed reality environment in which the virtual rendering of the expert is shown in the scene viewed by the user.
    Type: Application
    Filed: March 1, 2021
    Publication date: September 1, 2022
    Inventors: Joseph Shtok, Leonid Karlinsky, Adi Raz Goldfarb, Oded Dubovsky
  • Patent number: 11145129
    Abstract: Automatically generating augmented reality (AR) content by constructing a three-dimensional (3D) model of an object-including scene using images recorded during a remotely-guided AR session from a camera position defined relative to first 3D axes, the model including camera positions defined relative to second 3D axes, registering the first axes with the second axes by matching a trajectory derived from the image camera positions to a trajectory derived from the model's camera positions for determining a session-to-model transform, translating, using the transform, positions of points of interest (POIs) indicated on the object during the session, to corresponding POI positions on the object within the model, where the session POI positions are defined relative to the first axes and the model POI positions are defined relative to the second axes, and generating a content package including the model, model POI positions, and POI annotations provided during the session.
    Type: Grant
    Filed: November 13, 2019
    Date of Patent: October 12, 2021
    Assignee: International Business Machines Corporation
    Inventors: Oded Dubovsky, Adi Raz Goldfarb, Yochay Tzur
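The session-to-model registration the abstract describes can be illustrated with a standard similarity-transform fit (Umeyama-style) between corresponding camera positions of the two trajectories; this is a generic sketch under that assumption, not the patent's algorithm.
```python
import numpy as np

def fit_similarity_transform(session_pts: np.ndarray, model_pts: np.ndarray):
    """Least-squares fit of scale s, rotation R, translation t so that
    model ~ s * R @ session + t, from corresponding N x 3 camera positions."""
    n = len(session_pts)
    mu_s, mu_m = session_pts.mean(axis=0), model_pts.mean(axis=0)
    xs, xm = session_pts - mu_s, model_pts - mu_m
    cov = xm.T @ xs / n                      # cross-covariance of the trajectories
    U, D, Vt = np.linalg.svd(cov)
    S = np.eye(3)
    if np.linalg.det(U) * np.linalg.det(Vt) < 0:
        S[2, 2] = -1.0                       # guard against a reflection solution
    R = U @ S @ Vt
    scale = np.trace(np.diag(D) @ S) / ((xs ** 2).sum() / n)
    t = mu_m - scale * R @ mu_s
    return scale, R, t

def session_to_model(points: np.ndarray, scale, R, t) -> np.ndarray:
    """Map session-frame points of interest (N x 3) into the model's frame."""
    return scale * points @ R.T + t

# Synthetic check: the fit recovers a known scale, rotation and translation.
rng = np.random.default_rng(0)
src = rng.normal(size=(50, 3))
R_true = np.array([[0.0, -1.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])
dst = 2.0 * src @ R_true.T + np.array([1.0, 2.0, 3.0])
s, R, t = fit_similarity_transform(src, dst)
print(np.allclose(session_to_model(src, s, R, t), dst))
```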
  • Publication number: 20210142570
    Abstract: Automatically generating augmented reality (AR) content by constructing a three-dimensional (3D) model of an object-including scene using images recorded during a remotely-guided AR session from a camera position defined relative to first 3D axes, the model including camera positions defined relative to second 3D axes, registering the first axes with the second axes by matching a trajectory derived from the image camera positions to a trajectory derived from the model's camera positions for determining a session-to-model transform, translating, using the transform, positions of points of interest (POIs) indicated on the object during the session, to corresponding POI positions on the object within the model, where the session POI positions are defined relative to the first axes and the model POI positions are defined relative to the second axes, and generating a content package including the model, model POI positions, and POI annotations provided during the session.
    Type: Application
    Filed: November 13, 2019
    Publication date: May 13, 2021
    Inventors: Oded Dubovsky, Adi Raz Goldfarb, Yochay Tzur
  • Patent number: 10984341
    Abstract: A computer implemented method of detecting complex user activities, comprising using processor(s) in each of a plurality of consecutive time intervals for: obtaining sensory data from wearable inertial sensor(s) worn by a user, computing an action score for continuous physical action(s) performed by the user, the continuous physical action(s) extending over multiple time intervals are indicated by repetitive motion pattern(s) identified by analyzing the sensory data, computing a gesture score for brief gesture(s) performed by the user, the brief gesture(s) bounded in a single basic time interval is identified by analyzing the sensory data, aggregating the action and gesture scores to produce an interval activity score of predefined activity(s) for a current time interval, adding the interval activity score to a cumulative activity score accumulated during a predefined number of preceding time intervals and identifying the predefined activity(s) when the cumulative activity score exceeds a predefined threshold.
    Type: Grant
    Filed: September 27, 2017
    Date of Patent: April 20, 2021
    Assignee: International Business Machines Corporation
    Inventors: Oded Dubovsky, Alexander Zadorojniy, Sergey Zeltyn
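A short sketch of the score-aggregation scheme in the abstract: each interval contributes an action score plus a gesture score, and an activity is flagged when the sum over the preceding intervals exceeds a threshold. The window length and threshold values are illustrative placeholders.
```python
from collections import deque

class ActivityDetector:
    """Rolling aggregation of per-interval action and gesture scores."""

    def __init__(self, window: int = 10, threshold: float = 5.0):
        self.scores = deque(maxlen=window)   # keeps only the preceding intervals
        self.threshold = threshold

    def update(self, action_score: float, gesture_score: float) -> bool:
        self.scores.append(action_score + gesture_score)   # interval activity score
        return sum(self.scores) > self.threshold           # cumulative score check

detector = ActivityDetector(window=5, threshold=3.0)
for interval_scores in [(0.4, 0.1), (0.8, 0.3), (0.9, 0.5), (0.7, 0.6)]:
    print(detector.update(*interval_scores))
```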
  • Patent number: 10878297
    Abstract: Embodiments may provide visual recognition techniques that provide improved recognition accuracy and reduced use of computing resources in cases where only a small set of examples is used to train an unlimited number of recognized categories. For example, in an embodiment, a computer-implemented method of visual recognition may comprise generating a plurality of personal embedding models, each personal embedding model including categories relating to a person, an object, or a subject, wherein at least some of the personal embedding models include at least some different categories, training the plurality of personal embedding models using image training data having a limited number of examples of each category, wherein the examples of each category are used to train more than one category in more than one of the personal embedding models, recognizing images from image data using the plurality of personal embedding models, and outputting information relating to the recognized images.
    Type: Grant
    Filed: August 29, 2018
    Date of Patent: December 29, 2020
    Assignee: International Business Machines Corporation
    Inventors: Oded Dubovsky, Leonid Karlinsky, Joseph Shtok
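One common way to realize per-user category recognition from only a few examples is a nearest-centroid classifier over embeddings; the sketch below is that generic idea, not the patent's specific method, and it assumes a shared embedding network (not shown) has already mapped images to vectors.
```python
import numpy as np

class PersonalEmbeddingModel:
    """Per-user recognition from a handful of example embeddings per category."""

    def __init__(self):
        self.centroids = {}   # category name -> unit-norm centroid vector

    def add_category(self, name: str, example_embeddings: np.ndarray):
        centroid = example_embeddings.mean(axis=0)
        self.centroids[name] = centroid / np.linalg.norm(centroid)

    def recognize(self, embedding: np.ndarray) -> str:
        query = embedding / np.linalg.norm(embedding)
        return max(self.centroids, key=lambda c: float(self.centroids[c] @ query))

# Example: two personal categories, three 4-dimensional embeddings each.
rng = np.random.default_rng(1)
model = PersonalEmbeddingModel()
model.add_category("my_mug", rng.normal(loc=[1, 0, 0, 0], scale=0.1, size=(3, 4)))
model.add_category("my_keys", rng.normal(loc=[0, 1, 0, 0], scale=0.1, size=(3, 4)))
print(model.recognize(np.array([0.9, 0.1, 0.0, 0.0])))
```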
  • Patent number: 10712930
    Abstract: Electronic devices that include a force sensor input and input user interface elements are described. The force sensors may be located to detect force on the display of the electronic device. The force sensor, alone or in combination with one or more other sensors such as capacitive touch sensors, allows for interaction with the user interface input on the device. By using a hold detection logic with a pressure level sensitive sensor, user interface elements can be manipulated or values can be assigned to input elements.
    Type: Grant
    Filed: May 28, 2017
    Date of Patent: July 14, 2020
    Assignee: International Business Machines Corporation
    Inventors: Oded Dubovsky, Yossi Mesika
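A minimal sketch of hold detection with a pressure-level-sensitive sensor: a press that stays above a force threshold for long enough is treated as a hold, and the pressure level becomes the value assigned to a UI element. Thresholds and the returned value mapping are assumptions, not the patented logic.
```python
class HoldDetector:
    """Detect a sustained press on a pressure-level-sensitive display sensor."""

    def __init__(self, force_threshold: float = 0.3, hold_time_s: float = 0.5):
        self.force_threshold = force_threshold
        self.hold_time_s = hold_time_s
        self._press_start = None

    def sample(self, force: float, timestamp_s: float):
        """Feed one (force, time) sample; return a pressure level on a hold."""
        if force < self.force_threshold:
            self._press_start = None           # finger lifted or pressing lightly
            return None
        if self._press_start is None:
            self._press_start = timestamp_s    # start timing the press
        if timestamp_s - self._press_start >= self.hold_time_s:
            return min(1.0, force)             # pressure level driving the UI value
        return None

detector = HoldDetector()
for t, f in [(0.0, 0.5), (0.2, 0.6), (0.4, 0.7), (0.6, 0.8)]:
    print(t, detector.sample(f, t))
```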
  • Publication number: 20200074247
    Abstract: Embodiments may provide visual recognition techniques that provide improved recognition accuracy and reduced use of computing resources in cases where only a small set of examples is used to train an unlimited number of recognized categories. For example, in an embodiment, a computer-implemented method of visual recognition may comprise generating a plurality of personal embedding models, each personal embedding model including categories relating to a person, an object, or a subject, wherein at least some of the personal embedding models include at least some different categories, training the plurality of personal embedding models using image training data having a limited number of examples of each category, wherein the examples of each category are used to train more than one category in more than one of the personal embedding models, recognizing images from image data using the plurality of personal embedding models, and outputting information relating to the recognized images.
    Type: Application
    Filed: August 29, 2018
    Publication date: March 5, 2020
    Inventors: Oded Dubovsky, Leonid Karlinsky, Joseph Shtok
  • Patent number: 10353385
    Abstract: A method enhances an emergency reporting system for controlling equipment. A message receiver receives an electronic message from a person. The electronic message is a report regarding an emergency event related to equipment. One or more processors identify a profile of the person who sent the electronic message, and determine a bias of the person regarding the emergency event based on the person's profile. One or more processors amend, based on the bias of the person, a content of the electronic message to create a modified electronic message regarding the emergency event. The message receiver detects that the modified electronic message came from an unauthorized source. A local controller on the equipment, in response to detecting that the modified electronic message came from the unauthorized source, automatically isolates the equipment from remote control signals for controlling the equipment.
    Type: Grant
    Filed: September 28, 2018
    Date of Patent: July 16, 2019
    Assignee: International Business Machines Corporation
    Inventors: Aaron K. Baughman, Oded Dubovsky, James R. Kozloski, Boaz Mizrachi, Clifford A. Pickover
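A simplified sketch of the two checks in the abstract: amending a reported severity for the sender's profile bias, and flagging equipment isolation when the message comes from an unauthorized source. The profile fields and the bias model are illustrative assumptions, not the patented logic.
```python
from dataclasses import dataclass

@dataclass
class ReporterProfile:
    name: str
    authorized: bool
    severity_bias: float      # >1.0 tends to exaggerate, <1.0 tends to understate

def process_report(profile: ReporterProfile, reported_severity: float):
    """Return (bias_corrected_severity, isolate_equipment) for one message."""
    corrected = reported_severity / profile.severity_bias
    isolate = not profile.authorized          # unauthorized source => isolate
    return corrected, isolate

print(process_report(ReporterProfile("anonymous tipster", False, 1.5), 9.0))
```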
  • Publication number: 20190095814
    Abstract: A computer implemented method of detecting complex user activities, comprising using processor(s) in each of a plurality of consecutive time intervals for: obtaining sensory data from wearable inertial sensor(s) worn by a user, computing an action score for continuous physical action(s) performed by the user, the continuous physical action(s) extending over multiple time intervals are indicated by repetitive motion pattern(s) identified by analyzing the sensory data, computing a gesture score for brief gesture(s) performed by the user, the brief gesture(s) bounded in a single basic time interval is identified by analyzing the sensory data, aggregating the action and gesture scores to produce an interval activity score of predefined activity(s) for a current time interval, adding the interval activity score to a cumulative activity score accumulated during a predefined number of preceding time intervals and identifying the predefined activity(s) when the cumulative activity score exceeds a predefined threshold.
    Type: Application
    Filed: September 27, 2017
    Publication date: March 28, 2019
    Inventors: Oded Dubovsky, Alexander Zadorojniy, Sergey Zeltyn
  • Publication number: 20190033841
    Abstract: A method enhances an emergency reporting system for controlling equipment. A message receiver receives an electronic message from a person. The electronic message is a report regarding an emergency event related to equipment. One or more processors identify a profile of the person who sent the electronic message, and determine a bias of the person regarding the emergency event based on the person's profile. One or more processors amend, based on the bias of the person, a content of the electronic message to create a modified electronic message regarding the emergency event. The message receiver detects that the modified electronic message came from an unauthorized source. A local controller on the equipment, in response to detecting that the modified electronic message came from the unauthorized source, automatically isolates the equipment from remote control signals for controlling the equipment.
    Type: Application
    Filed: September 28, 2018
    Publication date: January 31, 2019
    Inventors: Aaron K. Baughman, Oded Dubovsky, James R. Kozloski, Boaz Mizrachi, Clifford A. Pickover
  • Patent number: 10162345
    Abstract: A method enhances an emergency reporting system for controlling equipment. A message receiver receives an electronic message from a person. The electronic message is a report regarding an emergency event. One or more processors identify a profile of the person who sent the electronic message, and determine a bias of the person regarding the emergency event based on the person's profile. One or more processors amend, based on the bias of the person, a content of the electronic message to create a modified electronic message regarding the emergency event. The modified electronic message is consolidated with other modified electronic messages into a bias-corrected report about the emergency event. One or more processors then automatically adjust equipment based on the bias-corrected report about the emergency event.
    Type: Grant
    Filed: April 21, 2015
    Date of Patent: December 25, 2018
    Assignee: International Business Machines Corporation
    Inventors: Aaron K. Baughman, Oded Dubovsky, James R. Kozloski, Boaz Mizrachi, Clifford A. Pickover
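The consolidation step in this abstract, combining several bias-corrected messages into one report that equipment can be adjusted against, could be as simple as a weighted average; the confidence weighting below is an assumed illustration, not the patented consolidation.
```python
def consolidate(reports):
    """reports: iterable of (bias_corrected_severity, reporter_confidence)."""
    total_weight = sum(conf for _, conf in reports)
    return sum(sev * conf for sev, conf in reports) / total_weight

corrected_reports = [(6.0, 0.9), (4.5, 0.5), (7.0, 0.7)]
print(round(consolidate(corrected_reports), 2))   # consensus severity for the controller
```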
  • Publication number: 20180341384
    Abstract: Electronic devices that include a force sensor input and input user interface elements are described. The force sensors may be located to detect force on the display of the electronic device. The force sensor, alone or in combination with one or more other sensors such as capacitive touch sensors, allows for interaction with the user interface input on the device. By using a hold detection logic with a pressure level sensitive sensor, user interface elements can be manipulated or values can be assigned to input elements.
    Type: Application
    Filed: May 28, 2017
    Publication date: November 29, 2018
    Inventors: Oded Dubovsky, Yossi Mesika
  • Patent number: 10044816
    Abstract: Systems and methods are provided for location-based Domain Name System (DNS) service discovery using a central DNS server in which network resources are aggregated by geographic location (e.g., subnets) and defined using DNS service discovery records that are mapped to corresponding geographic locations.
    Type: Grant
    Filed: February 14, 2017
    Date of Patent: August 7, 2018
    Assignee: International Business Machines Corporation
    Inventors: Yoni Amishav, Eric J. Barkie, Oded Dubovsky, Benjamin L. Fletcher
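A sketch of the central mapping idea in the abstract: service-discovery records grouped by client subnet, so a discovery answer depends on where the query comes from. The record names and subnets are invented examples, not data from the patent, and a real deployment would serve these as DNS-SD records rather than a Python dictionary.
```python
import ipaddress

SERVICES_BY_SUBNET = {
    ipaddress.ip_network("10.1.0.0/24"): ["printer-floor1._ipp._tcp.example.com"],
    ipaddress.ip_network("10.2.0.0/24"): ["printer-floor2._ipp._tcp.example.com"],
}

def discover(client_ip: str, default=()):
    """Return the service-discovery records mapped to the client's subnet."""
    addr = ipaddress.ip_address(client_ip)
    for subnet, records in SERVICES_BY_SUBNET.items():
        if addr in subnet:
            return records
    return list(default)

print(discover("10.2.0.42"))   # records for the floor-2 subnet
```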
  • Patent number: 10044815
    Abstract: Systems and methods are provided for location-based Domain Name System (DNS) service discovery using a central DNS server in which network resources are aggregated by geographic location (e.g., subnets) and defined using DNS service discovery records that are mapped to corresponding geographic locations.
    Type: Grant
    Filed: February 14, 2017
    Date of Patent: August 7, 2018
    Assignee: International Business Machines Corporation
    Inventors: Yoni Amishav, Eric J. Barkie, Oded Dubovsky, Benjamin L. Fletcher