Patents by Inventor Kfir Karmon

Kfir Karmon has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20230154238
    Abstract: Computer-implemented method for detecting a hand gesture of a user, comprising: (a) Receiving sequential logic models, each representing a hand gesture. Each sequential logic model maps pre-defined hand poses and motions, each represented by a hand features record defined by discrete hand values, each indicating a state of a respective hand feature. (b) Receiving a runtime sequence of runtime hand datasets, each defined by discrete hand value scores indicating the current state of hand features of a user's moving hand, which are inferred by analyzing timed images depicting the moving hand. (c) Submitting the runtime hand datasets and the pre-defined hand features records to SSVM functions to generate estimation terms for the runtime hand datasets with respect to the hand features records. (d) Estimating which of the hand gestures best matches the runtime sequence depicted in the timed images by optimizing score functions using the estimation terms for the runtime hand datasets. (A hedged illustrative sketch of this kind of sequence matching appears after this listing.)
    Type: Application
    Filed: January 18, 2023
    Publication date: May 18, 2023
    Inventors: Daniel FREEDMAN, Kfir KARMON, Eyal KRUPKA, Yagil ENGEL, Yevgeny SHAPIRO
  • Publication number: 20220343689
    Abstract: Computer-implemented method for detecting a hand gesture of a user, comprising: (a) Receiving sequential logic models, each representing a hand gesture. Each sequential logic model maps pre-defined hand poses and motions, each represented by a hand features record defined by discrete hand values, each indicating a state of a respective hand feature. (b) Receiving a runtime sequence of runtime hand datasets, each defined by discrete hand value scores indicating the current state of hand features of a user's moving hand, which are inferred by analyzing timed images depicting the moving hand. (c) Submitting the runtime hand datasets and the pre-defined hand features records to SSVM functions to generate estimation terms for the runtime hand datasets with respect to the hand features records. (d) Estimating which of the hand gestures best matches the runtime sequence depicted in the timed images by optimizing score functions using the estimation terms for the runtime hand datasets.
    Type: Application
    Filed: July 5, 2022
    Publication date: October 27, 2022
    Inventors: Daniel FREEDMAN, Kfir KARMON, Eyal KRUPKA, Yagil ENGEL, Yevgeny SHAPIRO
  • Patent number: 11410464
    Abstract: Computer-implemented method for detecting a hand gesture of a user, comprising: (a) Receiving sequential logic models, each representing a hand gesture. Each sequential logic model maps pre-defined hand poses and motions, each represented by a hand features record defined by discrete hand values, each indicating a state of a respective hand feature. (b) Receiving a runtime sequence of runtime hand datasets, each defined by discrete hand value scores indicating the current state of hand features of a user's moving hand, which are inferred by analyzing timed images depicting the moving hand. (c) Submitting the runtime hand datasets and the pre-defined hand features records to SSVM functions to generate estimation terms for the runtime hand datasets with respect to the hand features records. (d) Estimating which of the hand gestures best matches the runtime sequence depicted in the timed images by optimizing score functions using the estimation terms for the runtime hand datasets.
    Type: Grant
    Filed: February 19, 2020
    Date of Patent: August 9, 2022
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Daniel Freedman, Kfir Karmon, Eyal Krupka, Yagil Engel, Yevgeny Shapiro
  • Patent number: 10866882
    Abstract: A debugging tool comprises user input apparatus to receive user input from a debugging user, computer storage configured to hold a piece of code to be debugged, the code embodying a state machine defining a user input action, a display configured to display a timeline, and at least one processor configured to execute an iterative debugging process for visualising behaviour of the code on the timeline. The debugging process is driven by changes in the user input received at the user input apparatus and is performed so as to represent on the timeline a sequence of expected user input states of the state machine as they are actualized by the debugging user according to the permitted transitions. (A hedged sketch of a timeline-driven state machine of this kind appears after this listing.)
    Type: Grant
    Filed: June 29, 2017
    Date of Patent: December 15, 2020
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Kfir Karmon, Yuval Tzairi
  • Publication number: 20200184204
    Abstract: Computer-implemented method for detecting a hand gesture of a user, comprising: (a) Receiving sequential logic models, each representing a hand gesture. Each sequential logic model maps pre-defined hand poses and motions, each represented by a hand features record defined by discrete hand values, each indicating a state of a respective hand feature. (b) Receiving a runtime sequence of runtime hand datasets, each defined by discrete hand value scores indicating the current state of hand features of a user's moving hand, which are inferred by analyzing timed images depicting the moving hand. (c) Submitting the runtime hand datasets and the pre-defined hand features records to SSVM functions to generate estimation terms for the runtime hand datasets with respect to the hand features records. (d) Estimating which of the hand gestures best matches the runtime sequence depicted in the timed images by optimizing score functions using the estimation terms for the runtime hand datasets.
    Type: Application
    Filed: February 19, 2020
    Publication date: June 11, 2020
    Inventors: Daniel FREEDMAN, Kfir KARMON, Eyal KRUPKA, Yagil ENGEL, Yevgeny SHAPIRO
  • Patent number: 10599919
    Abstract: Computer-implemented method for detecting a hand gesture of a user, comprising: (a) Receiving sequential logic models, each representing a hand gesture. Each sequential logic model maps pre-defined hand poses and motions, each represented by a hand features record defined by discrete hand values, each indicating a state of a respective hand feature. (b) Receiving a runtime sequence of runtime hand datasets, each defined by discrete hand value scores indicating the current state of hand features of a user's moving hand, which are inferred by analyzing timed images depicting the moving hand. (c) Submitting the runtime hand datasets and the pre-defined hand features records to SSVM functions to generate estimation terms for the runtime hand datasets with respect to the hand features records. (d) Estimating which of the hand gestures best matches the runtime sequence depicted in the timed images by optimizing score functions using the estimation terms for the runtime hand datasets.
    Type: Grant
    Filed: December 31, 2015
    Date of Patent: March 24, 2020
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Daniel Freedman, Kfir Karmon, Eyal Krupka, Yagil Engel, Yevgeny Shapiro
  • Patent number: 10599324
    Abstract: System for associating a computerized hand gesture model with application functions, comprising: (a) A storage storing a plurality of hand pose features records and hand motion features records. Each of the hand pose features records and hand motion features records is defined by a set of discrete pose values and discrete motion values, respectively. (b) An interface receiving programmer instructions. (c) A memory storing code. (d) One or more processors coupled to the interface, storage and memory for executing the code which comprises: 1) Code instructions to define hand gestures by constructing a unique logical sequence of the hand pose features records and hand motion features records. 2) Code instructions to associate the unique logical sequence with application functions per the instructions for initiating execution of the functions during the application runtime in response to detection of the unique logical sequence from analysis of images depicting movement of the user's hand(s). (A hedged sketch of binding gesture sequences to application functions appears after this listing.)
    Type: Grant
    Filed: December 31, 2015
    Date of Patent: March 24, 2020
    Assignee: Microsoft Technology Licensing, LLC
    Inventor: Kfir Karmon
  • Patent number: 10488939
    Abstract: A gesture recognition method comprises receiving at a processor, from a sensor, a sequence of captured signal frames for extracting hand pose information for a hand, and using at least one trained predictor executed on the processor to extract hand pose information from the received signal frames. For at least one defined gesture, defined as a time sequence comprising hand poses, with each of the hand poses defined as a conjunction or disjunction of qualitative propositions relating to interest points on the hand, truth values are computed for the qualitative propositions using the hand pose information extracted from the received signal frames, and execution of the gesture is tracked by using the truth values to determine which of the hand poses in the time sequence have already been executed and which of the hand poses in the time sequence is expected next. (A hedged sketch of this proposition-based gesture tracking appears after this listing.)
    Type: Grant
    Filed: August 7, 2017
    Date of Patent: November 26, 2019
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Kfir Karmon, Aharon Bar-Hillel, Eyal Krupka, Noam Bloom, Ilya Gurvich, Aviv Hurvitz, Ido Leichter, Yoni Smolin, Yuval Tzairi, Alon Vinnikov
  • Patent number: 10345919
    Abstract: An apparatus including a housing for coupling a keyboard to a tablet device through at least one respective electric interface and circuitry integrated in the housing, and a camera sensor integrated in the housing, the camera sensor positioned such that a central axis of a field of view (FOV) of the camera sensor is substantially parallel to a surface of the keyboard, the camera sensor further positioned to capture a plurality of images and to transmit the plurality of images to the tablet device through the respective electric interface and the electric circuitry. Related apparatus and methods are also described.
    Type: Grant
    Filed: February 2, 2016
    Date of Patent: July 9, 2019
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Kfir Karmon, Adi Diamant
  • Patent number: 10310618
    Abstract: A system for creating hand gesture representations, comprising an interface for interacting with a user, a storage storing a plurality of discrete pose values and discrete motion values, a memory storing a gesture visual builder code, and one or more processors coupled to the interface, storage and memory to execute the gesture visual builder code allowing the user to create hand gestures. The gesture visual builder code comprises code instructions to present the user with a GUI which displays a hierarchical menu-driven interface, code instructions to iteratively receive instructions from the user, using the hierarchical menu-driven interface, for creating a logical sequence of a hand gesture by defining one or more hand pose features records and hand motion features records, and code instructions to generate a code segment defining the one or more hand pose/motion features records through the discrete pose/motion values, respectively. (A hedged sketch of such a builder generating a code segment appears after this listing.)
    Type: Grant
    Filed: December 31, 2015
    Date of Patent: June 4, 2019
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Kfir Karmon, Eyal Krupka, Yuval Tzairi, Uri Levanon, Shelly Horowitz
  • Patent number: 10139921
    Abstract: Hand gesture detection electrical device for detecting hand gestures, comprising an IC electronically integrating: (a) First interface connecting to imaging device(s). (b) Second interface connecting to a controlled unit. (c) Data storage storing sequential logic models representing hand gestures. The sequential logic models map a sequence of pre-defined hand poses and/or motions. (d) Memory storing code. (e) Processor(s) coupled to the first and second interfaces, data storage and memory for executing the code to: (1) Receive timed images depicting a user's moving hand. (2) Generate a runtime sequence mapping runtime hand datasets each defined by discrete hand values indicating the current state of the moving hand. (3) Estimate which hand gesture(s) best match the runtime sequence by optimizing the runtime sequence compared to the sequential logic models using SSVM functions. (4) Initiate action(s) at the controlled unit. The action(s) are associated with the selected hand gesture(s) based on the estimation.
    Type: Grant
    Filed: December 27, 2017
    Date of Patent: November 27, 2018
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Kfir Karmon, Eyal Krupka, Adi Diamant
  • Publication number: 20180307319
    Abstract: A gesture recognition method comprises receiving at a processor, from a sensor, a sequence of captured signal frames for extracting hand pose information for a hand, and using at least one trained predictor executed on the processor to extract hand pose information from the received signal frames. For at least one defined gesture, defined as a time sequence comprising hand poses, with each of the hand poses defined as a conjunction or disjunction of qualitative propositions relating to interest points on the hand, truth values are computed for the qualitative propositions using the hand pose information extracted from the received signal frames, and execution of the gesture is tracked by using the truth values to determine which of the hand poses in the time sequence have already been executed and which of the hand poses in the time sequence is expected next.
    Type: Application
    Filed: August 7, 2017
    Publication date: October 25, 2018
    Inventors: Kfir KARMON, Eyal KRUPKA, Noam BLOOM, Ilya GURVICH, Aviv HURVITZ, Ido LEICHTER, Yoni SMOLIN, Yuval TZAIRI, Alon VINNIKOV, Aharon BAR-HILLEL
  • Publication number: 20180307587
    Abstract: A debugging tool comprises user input apparatus to receive user input from a debugging user, computer storage configured to hold a piece of code to be debugged, the code embodying a state machine defining a user input action, a display configured to display a timeline, and at least one processor configured to execute an iterative debugging process for visualising behaviour of the code on the timeline. The debugging process is driven by changes in the user input received at the user input apparatus and is performed so as to represent on the timeline a sequence of expected user input states of the state machine as they are actualized by the debugging user according to the permitted transitions.
    Type: Application
    Filed: June 29, 2017
    Publication date: October 25, 2018
    Inventors: Kfir KARMON, Yuval TZAIRI
  • Publication number: 20180120950
    Abstract: Hand gesture detection electrical device for detecting hand gestures, comprising an IC electronically integrating: (a) First interface connecting to imaging device(s). (b) Second interface connecting to a controlled unit. (c) Data storage storing sequential logic models representing hand gestures. The sequential logic models map a sequence of pre-defined hand poses and/or motions. (d) Memory storing code. (e) Processor(s) coupled to the first and second interfaces, data storage and memory for executing the code to: (1) Receive timed images depicting a user's moving hand. (2) Generate a runtime sequence mapping runtime hand datasets each defined by discrete hand values indicating the current state of the moving hand. (3) Estimate which hand gesture(s) best match the runtime sequence by optimizing the runtime sequence compared to the sequential logic models using SSVM functions. (4) Initiate action(s) at the controlled unit. The action(s) are associated with the selected hand gesture(s) based on the estimation.
    Type: Application
    Filed: December 27, 2017
    Publication date: May 3, 2018
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Kfir KARMON, Eyal KRUPKA, Adi DIAMANT
  • Patent number: 9898256
    Abstract: A system for injecting a code section into code edited via a graphical user interface (GUI) of an integrated development environment (IDE), comprising: a memory storing a dataset associating each code segment with one hand pose feature or hand motion feature; an imager adapted to capture images of a hand while an IDE is being executed on a client terminal; and a processor for executing code of an application, comprising: code instructions to identify at least one of the features and at least one discrete value of the identified features from an analysis of the images; code instructions to select at least one of the code segments associated with the identified features; and code instructions to automatically add a code section, generated based on the code segments and the discrete value, to code presented by a code editor of the IDE. (A hedged sketch of mapping recognized features to injected code appears after this listing.)
    Type: Grant
    Filed: December 31, 2015
    Date of Patent: February 20, 2018
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Kfir Karmon, Adi Diamant, Eyal Krupka
  • Patent number: 9870063
    Abstract: A system for associating a computerized model of multimodal human interaction with application functions, comprising: (a) An interface for receiving instructions from a programmer defining one or more application functions. (b) A memory storing hand gestures each defined by a dataset of discrete pose values and discrete motion values. (c) A code store storing a code. (d) One or more processors coupled to the interface, the memory and the code store for executing the stored code which comprises: (1) Code instructions to define a logical sequence of user input per instructions of the programmer. The logical sequence combines hand gestures with non-gesture user input. (2) Code instructions to associate the logical sequence with the application function(s) for initiating an execution of the application function(s) during runtime of the application in response to detection of the logical sequence by analyzing captured data depicting a user during runtime.
    Type: Grant
    Filed: December 31, 2015
    Date of Patent: January 16, 2018
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Kfir Karmon, Adi Diamant, Karen Master Ben-Dor, Eyal Krupka
  • Patent number: 9857881
    Abstract: Hand gesture detection electrical device for detecting hand gestures, comprising an IC electronically integrating: (a) First interface connecting to imaging device(s). (b) Second interface connecting to a controlled unit. (c) Data storage storing sequential logic models representing hand gestures. The sequential logic models map a sequence of pre-defined hand poses and/or motions. (d) Memory storing code. (e) Processor(s) coupled to the first and second interfaces, data storage and memory for executing the code to: (1) Receive timed images depicting a user's moving hand. (2) Generate a runtime sequence mapping runtime hand datasets each defined by discrete hand values indicating the current state of the moving hand. (3) Estimate which hand gesture(s) best match the runtime sequence by optimizing the runtime sequence compared to the sequential logic models using SSVM functions. (4) Initiate action(s) at the controlled unit. The action(s) are associated with the selected hand gesture(s) based on the estimation.
    Type: Grant
    Filed: December 31, 2015
    Date of Patent: January 2, 2018
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Kfir Karmon, Eyal Krupka, Adi Diamant
  • Patent number: 9734435
    Abstract: Computer-implemented method for computing a feature dataset classifying a pose of a human hand, comprising: (a) Selecting a global orientation category (GOC) defining a spatial orientation of a human hand in 3D space by applying GOC classifying functions on a received image segment depicting the hand. (b) Identifying in-plane rotation by applying in-plane rotation classifying functions on the image segment, the in-plane rotation classifying functions being selected according to said GOC. (c) Aligning the image segment in a 2D plane according to the in-plane rotation. (d) Applying hand pose features classifying functions on the aligned image segment. Each one of the feature classifying functions outputs a current discrete pose value of an associated hand feature. (e) Outputting a features dataset defining a current discrete pose value for each of the hand pose features for classifying the current hand pose of the hand. (A hedged sketch of this staged classification pipeline appears after this listing.)
    Type: Grant
    Filed: December 31, 2015
    Date of Patent: August 15, 2017
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Eyal Krupka, Alon Vinnikov, Kfir Karmon
  • Publication number: 20170220128
    Abstract: An apparatus including a housing for coupling a keyboard to a tablet device through at least one respective electric interface and circuitry integrated in the housing, and a camera sensor integrated in the housing, the camera sensor positioned such that a central axis of a field of view (FOV) of the camera sensor is substantially parallel to a surface of the keyboard, the camera sensor further positioned to capture a plurality of images and to transmit the plurality of images to the tablet device through the respective electric interface and the electric circuitry. Related apparatus and methods are also described.
    Type: Application
    Filed: February 2, 2016
    Publication date: August 3, 2017
    Inventors: Kfir KARMON, Adi DIAMANT
  • Publication number: 20170193334
    Abstract: Computer-implemented method for computing a feature dataset classifying a pose of a human hand, comprising: (a) Selecting a global orientation category (GOC) defining a spatial orientation of a human hand in 3D space by applying GOC classifying functions on a received image segment depicting the hand. (b) Identifying in-plane rotation by applying in-plane rotation classifying functions on the image segment, the in-plane rotation classifying functions being selected according to said GOC. (c) Aligning the image segment in a 2D plane according to the in-plane rotation. (d) Applying hand pose features classifying functions on the aligned image segment. Each one of the feature classifying functions outputs a current discrete pose value of an associated hand feature. (e) Outputting a features dataset defining a current discrete pose value for each of the hand pose features for classifying the current hand pose of the hand.
    Type: Application
    Filed: December 31, 2015
    Publication date: July 6, 2017
    Inventors: Eyal KRUPKA, Alon VINNIKOV, Kfir KARMON
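
Illustrative sketches

The abstracts above describe a few related families of gesture-recognition and developer-tooling methods. The short Python sketches below are hedged illustrations written for this listing; they are not taken from the patents, and every class, function, and parameter name in them is hypothetical.

The first sketch relates to the sequence-matching family (publications 20230154238, 20220343689 and 20200184204, patents 11410464 and 10599919, and the device-oriented patents 10139921 and 9857881 with publication 20180120950): gestures are modeled as ordered hand feature records, runtime frames carry per-value scores, and the gesture whose steps align best with the frame sequence is selected. The alignment below is a minimal dynamic-programming stand-in; the SSVM machinery that would learn the weights is omitted, so the fixed weights only stand in for learned parameters.

    from typing import Dict, List

    FeatureRecord = Dict[str, str]               # feature -> expected discrete value
    RuntimeFrame = Dict[str, Dict[str, float]]   # feature -> {discrete value -> score}


    def step_score(record: FeatureRecord, frame: RuntimeFrame,
                   weights: Dict[str, float]) -> float:
        """Score one pre-defined hand features record against one runtime frame."""
        return sum(weights.get(feature, 1.0) * frame.get(feature, {}).get(value, 0.0)
                   for feature, value in record.items())


    def gesture_score(steps: List[FeatureRecord], frames: List[RuntimeFrame],
                      weights: Dict[str, float]) -> float:
        """Best score over monotone alignments of the gesture's steps to the frames."""
        if not steps or len(steps) > len(frames):
            return float("-inf")
        prev = [step_score(steps[0], frames[t], weights) for t in range(len(frames))]
        for i in range(1, len(steps)):
            best_prev, cur = float("-inf"), []
            for t in range(len(frames)):
                if t >= 1:
                    best_prev = max(best_prev, prev[t - 1])
                cur.append(step_score(steps[i], frames[t], weights) + best_prev)
            prev = cur
        return max(prev)


    def best_gesture(gestures: Dict[str, List[FeatureRecord]],
                     frames: List[RuntimeFrame],
                     weights: Dict[str, float]) -> str:
        """Name of the gesture whose model best matches the runtime sequence."""
        return max(gestures, key=lambda name: gesture_score(gestures[name], frames, weights))


    if __name__ == "__main__":
        gestures = {
            "pinch": [{"thumb_index": "touching"}, {"thumb_index": "apart"}],
            "fist": [{"fingers": "folded"}],
        }
        frames = [
            {"thumb_index": {"touching": 0.9, "apart": 0.1}, "fingers": {"folded": 0.2}},
            {"thumb_index": {"touching": 0.2, "apart": 0.8}, "fingers": {"folded": 0.3}},
        ]
        print(best_gesture(gestures, frames, weights={}))  # -> pinch

In a real system the per-value scores would come from classifiers applied to the timed images rather than being supplied directly.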
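
The next sketch relates to the debugging tool of patent 10866882 and publication 20180307587: user-input events drive a state machine embodied by the code under debug, and each accepted or rejected transition is appended to a timeline the debugging user can inspect. The state names, events, and text rendering are all hypothetical.

    from dataclasses import dataclass, field
    from typing import Dict, List, Tuple


    @dataclass
    class InputStateMachine:
        start: str
        # permitted transitions: (current state, input event) -> next state
        transitions: Dict[Tuple[str, str], str]
        timeline: List[str] = field(default_factory=list)

        def __post_init__(self) -> None:
            self.state = self.start
            self.timeline.append(f"start -> {self.state}")

        def feed(self, event: str) -> None:
            """Apply one user-input event and record the outcome on the timeline."""
            nxt = self.transitions.get((self.state, event))
            if nxt is None:
                self.timeline.append(f"{self.state} --{event}--> (not permitted)")
            else:
                self.timeline.append(f"{self.state} --{event}--> {nxt}")
                self.state = nxt


    if __name__ == "__main__":
        # A toy "pinch then release" user input action driven by the debugging user.
        sm = InputStateMachine(
            start="idle",
            transitions={("idle", "pinch"): "pinched", ("pinched", "release"): "idle"},
        )
        for event in ["pinch", "wave", "release"]:
            sm.feed(event)
        print("\n".join(sm.timeline))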
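
The following sketch relates to patent 10599324 (and, loosely, patent 9870063): a programmer binds a unique logical sequence of pose/motion records to an application function, and detection of that sequence at runtime invokes the function. The registry API and the string-based sequence encoding are assumptions made for illustration.

    from typing import Callable, Dict, List, Tuple

    GestureSequence = Tuple[str, ...]   # e.g. ("open_palm", "swipe_left")


    class GestureBindings:
        """Associates unique logical gesture sequences with application functions."""

        def __init__(self) -> None:
            self._bindings: Dict[GestureSequence, Callable[[], None]] = {}

        def bind(self, sequence: GestureSequence, handler: Callable[[], None]) -> None:
            self._bindings[sequence] = handler

        def on_detected(self, detected: List[str]) -> None:
            """Invoke every handler whose sequence occurs in the detected stream."""
            stream = tuple(detected)
            for sequence, handler in self._bindings.items():
                n = len(sequence)
                if any(stream[i:i + n] == sequence for i in range(len(stream) - n + 1)):
                    handler()


    if __name__ == "__main__":
        bindings = GestureBindings()
        bindings.bind(("open_palm", "swipe_left"), lambda: print("previous slide"))
        # In practice the detected stream would come from analysis of captured images.
        bindings.on_detected(["fist", "open_palm", "swipe_left"])   # prints "previous slide"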
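
The next sketch relates to patent 10488939 and publication 20180307319: a gesture is a time sequence of poses, each pose a conjunction of qualitative propositions over hand interest points, and truth values computed per frame determine which poses have been executed and which is expected next. The proposition predicates and the hand representation are hypothetical; the trained predictors that would produce the measurements are omitted.

    from typing import Callable, Dict, List

    Hand = Dict[str, float]                # interest-point measurements for one frame
    Proposition = Callable[[Hand], bool]   # a qualitative proposition over the hand


    def pose_holds(propositions: List[Proposition], hand: Hand) -> bool:
        """A pose holds when the conjunction of its propositions is true."""
        return all(p(hand) for p in propositions)


    class GestureTracker:
        def __init__(self, poses: List[List[Proposition]]) -> None:
            self.poses = poses    # the time sequence of poses defining the gesture
            self.executed = 0     # how many poses have already been executed

        def update(self, hand: Hand) -> bool:
            """Advance when the next expected pose holds; True once the gesture completes."""
            if self.executed < len(self.poses) and pose_holds(self.poses[self.executed], hand):
                self.executed += 1
            return self.executed == len(self.poses)


    if __name__ == "__main__":
        pinch_then_release = GestureTracker([
            [lambda h: h["thumb_index_distance"] < 0.02],   # pose 1: fingertips pinched
            [lambda h: h["thumb_index_distance"] > 0.05],   # pose 2: fingertips apart
        ])
        frames = [{"thumb_index_distance": d} for d in (0.08, 0.015, 0.06)]
        print([pinch_then_release.update(f) for f in frames])   # [False, False, True]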
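
The following sketch relates to patent 10310618: a builder accumulates the pose and motion feature records a user composes through a menu-driven interface and emits a generated code segment. The record fields and the emitted snippet format are invented for illustration; the GUI itself is not modeled.

    from typing import Dict, List


    class GestureBuilder:
        """Accumulates pose/motion feature records and emits a generated code segment."""

        def __init__(self, name: str) -> None:
            self.name = name
            self.steps: List[Dict[str, str]] = []

        def add_pose(self, **features: str) -> "GestureBuilder":
            self.steps.append({"kind": "pose", **features})
            return self

        def add_motion(self, **features: str) -> "GestureBuilder":
            self.steps.append({"kind": "motion", **features})
            return self

        def generate_code(self) -> str:
            """Emit a code segment describing the composed logical sequence."""
            lines = [f"{self.name} = Gesture(["]
            for step in self.steps:
                args = ", ".join(f"{k}={v!r}" for k, v in step.items() if k != "kind")
                lines.append(f"    {step['kind'].capitalize()}({args}),")
            lines.append("])")
            return "\n".join(lines)


    if __name__ == "__main__":
        print(GestureBuilder("swipe_left")
              .add_pose(palm="open", direction="forward")
              .add_motion(path="left", speed="fast")
              .generate_code())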
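
The next sketch relates to patent 9898256: a stored dataset associates (feature, discrete value) pairs with code segments, and an identified feature causes the corresponding code section to be added to an editor buffer. The feature names, templates, and editor model are hypothetical.

    from typing import Dict, List, Tuple

    # Dataset associating a (feature, discrete value) pair with a code segment.
    SEGMENTS: Dict[Tuple[str, str], str] = {
        ("pinch", "tight"): "selection.zoom(level='in')",
        ("swipe", "left"): "navigation.go_back()",
    }


    class EditorBuffer:
        """A stand-in for the code presented by the IDE's code editor."""

        def __init__(self) -> None:
            self.lines: List[str] = []

        def insert(self, code_section: str) -> None:
            self.lines.append(code_section)


    def inject(feature: str, value: str, editor: EditorBuffer) -> bool:
        """Add the code section associated with an identified feature, if any."""
        segment = SEGMENTS.get((feature, value))
        if segment is None:
            return False
        editor.insert(segment)
        return True


    if __name__ == "__main__":
        buf = EditorBuffer()
        inject("swipe", "left", buf)     # in practice driven by analysis of captured images
        print("\n".join(buf.lines))      # navigation.go_back()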
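
The last sketch relates to patent 9734435 and publication 20170193334: a staged pipeline first selects a global orientation category, then estimates in-plane rotation with classifiers chosen by that category, aligns the segment, and finally runs per-feature pose classifiers to produce a dataset of discrete pose values. The classifiers here are stand-in callables over a toy segment; a real system would use trained models over image data.

    from typing import Callable, Dict

    Segment = Dict[str, float]   # toy stand-in for an image segment depicting the hand


    def classify_hand_pose(
        segment: Segment,
        goc_classifier: Callable[[Segment], str],
        rotation_classifiers: Dict[str, Callable[[Segment], float]],
        align: Callable[[Segment, float], Segment],
        feature_classifiers: Dict[str, Callable[[Segment], str]],
    ) -> Dict[str, str]:
        """Return a features dataset of discrete pose values, one per hand pose feature."""
        goc = goc_classifier(segment)                  # (a) global orientation category
        angle = rotation_classifiers[goc](segment)     # (b) in-plane rotation, chosen by GOC
        aligned = align(segment, angle)                # (c) align the segment in the 2D plane
        return {name: classify(aligned)                # (d)+(e) per-feature discrete values
                for name, classify in feature_classifiers.items()}


    if __name__ == "__main__":
        result = classify_hand_pose(
            segment={"tilt": 30.0, "spread": 0.7},
            goc_classifier=lambda s: "palm_facing_camera",
            rotation_classifiers={"palm_facing_camera": lambda s: s["tilt"]},
            align=lambda s, angle: {**s, "tilt": s["tilt"] - angle},
            feature_classifiers={"fingers": lambda s: "spread" if s["spread"] > 0.5 else "folded"},
        )
        print(result)   # {'fingers': 'spread'}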