Patents by Inventor Ambrus Csaszar

Ambrus Csaszar has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20230186575
    Abstract: A method including: receiving a first depth image associated with a first frame at a first time of an augmented reality (AR) application, the first depth image representing at least a first portion of a real-world space; storing the first depth image; receiving a second depth image associated with a second frame at a second time, after the first time, of the AR application, the second depth image representing at least a second portion of the real-world space; generating a real-world image by blending, at least, the stored first depth image with the second depth image; receiving a rendered AR object; combining the AR object in the real-world image; and displaying the real-world image combined with the AR object.
    Type: Application
    Filed: May 22, 2020
    Publication date: June 15, 2023
    Inventors: Eric Turner, Keisuke Tateno, Konstantine Nicholas John Tsotsos, Adarsh Prakash Murthy Kowdle, Vaibhav Gupta, Ambrus Csaszar
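    Illustrative sketch (not from the patent): a minimal Python take on the depth-blending step described in the abstract above. It assumes the stored and new depth frames are already aligned to the same viewpoint (the pose handling in the AR application is omitted) and treats zero-valued pixels as missing depth; all names and weights are illustrative.

      import numpy as np

      def blend_depth(stored_depth, new_depth, weight_new=0.6):
          """Blend a stored depth frame with a newer one.

          Simplifying assumption: both frames are pre-aligned to the same
          viewpoint, and pixels equal to 0 mean "no depth available".
          """
          stored_valid = stored_depth > 0
          new_valid = new_depth > 0
          both = stored_valid & new_valid

          blended = np.zeros_like(new_depth, dtype=np.float32)
          # Where both frames have depth, take a weighted average.
          blended[both] = (weight_new * new_depth[both]
                           + (1.0 - weight_new) * stored_depth[both])
          # Where only one frame has depth, keep whichever value exists.
          only_new = new_valid & ~stored_valid
          only_stored = stored_valid & ~new_valid
          blended[only_new] = new_depth[only_new]
          blended[only_stored] = stored_depth[only_stored]
          return blended

      if __name__ == "__main__":
          rng = np.random.default_rng(0)
          stored = rng.uniform(0.5, 4.0, size=(120, 160)).astype(np.float32)
          new = stored + rng.normal(0, 0.02, size=stored.shape).astype(np.float32)
          new[40:60, 50:80] = 0.0  # simulate a hole in the newer frame
          blended = blend_depth(stored, new)
          print("blended depth range:", blended.min(), blended.max())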
  • Patent number: 11429076
    Abstract: A method for automated manufacturing strategy generation can include: identifying features of a desired part from a virtual model; and determining a tactic strategy based on the identified features. The method can additionally include: determining a toolpath primitive for each tactic; combining the toolpath primitives for the tactics to generate a master toolpath; and translating the master toolpath into machine code.
    Type: Grant
    Filed: October 2, 2020
    Date of Patent: August 30, 2022
    Assignee: CADDi Inc.
    Inventors: William Haldean Brown, Ruza Markov, Michael Rivlin, Ambrus Csaszar, Jonathan Stults
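    Illustrative sketch (not from the patent): a short Python outline of the pipeline the abstract describes (features, tactics, toolpath primitives, master toolpath, machine code). The feature recognition and the G-code vocabulary are placeholders, not the patented implementation.

      from dataclasses import dataclass

      @dataclass
      class Feature:            # hypothetical stand-in for a recognized CAD feature
          kind: str             # e.g. "pocket" or "hole"
          depth_mm: float

      def identify_features(virtual_model):
          # Placeholder for feature recognition on the virtual model.
          return [Feature("pocket", 5.0), Feature("hole", 8.0)]

      def choose_tactic(feature):
          # Map each identified feature to a machining tactic.
          return {"pocket": "adaptive_clearing", "hole": "peck_drilling"}[feature.kind]

      def toolpath_primitive(tactic, feature):
          # Each tactic contributes a small list of motion commands.
          return [f"; {tactic} for {feature.kind}",
                  f"G1 Z-{feature.depth_mm:.2f} F300"]

      def generate_machine_code(virtual_model):
          master = []
          for feature in identify_features(virtual_model):
              master.extend(toolpath_primitive(choose_tactic(feature), feature))
          # Translate the combined master toolpath into a G-code-style program.
          return "\n".join(["G90", "G21", *master, "M30"])

      if __name__ == "__main__":
          print(generate_machine_code(virtual_model=None))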
  • Patent number: 11145075
    Abstract: A handheld user device includes a monocular camera to capture a feed of images of a local scene and a processor to select, from the feed, a keyframe and perform, for a first image from the feed, stereo matching using the first image, the keyframe, and a relative pose based on a pose associated with the first image and a pose associated with the keyframe to generate a sparse disparity map representing disparities between the first image and the keyframe. The processor further is to determine a dense depth map from the disparity map using a bilateral solver algorithm, and process a viewfinder image generated from a second image of the feed with occlusion rendering based on the depth map to incorporate one or more virtual objects into the viewfinder image to generate an AR viewfinder image. Further, the processor is to provide the AR viewfinder image for display.
    Type: Grant
    Filed: October 4, 2019
    Date of Patent: October 12, 2021
    Assignee: Google LLC
    Inventors: Julien Valentin, Onur G. Guleryuz, Mira Leung, Maksym Dzitsiuk, Jose Pascoal, Mirko Schmidt, Christoph Rhemann, Neal Wadhwa, Eric Turner, Sameh Khamis, Adarsh Prakash Murthy Kowdle, Ambrus Csaszar, João Manuel Castro Afonso, Jonathan T. Barron, Michael Schoenberg, Ivan Dryanovski, Vivek Verma, Vladimir Tankovich, Shahram Izadi, Sean Ryan Francesco Fanello, Konstantine Nicholas John Tsotsos
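    Illustrative sketch (not from the patent): a compact Python version of the sparse-to-dense depth idea in the abstract. Block matching between an image and a keyframe yields disparities at sampled pixels, and a dense map is filled in afterwards; the fill step is a nearest-sample stand-in for the bilateral solver, and rectified, pre-aligned grayscale frames are assumed.

      import numpy as np

      def sparse_disparity(img, keyframe, step=16, patch=5, max_disp=32):
          """Block matching at a sparse grid of pixels; NaN means 'no sample'."""
          h, w = img.shape
          disp = np.full((h, w), np.nan, dtype=np.float32)
          r = patch // 2
          for y in range(r, h - r, step):
              for x in range(r + max_disp, w - r, step):
                  ref = img[y - r:y + r + 1, x - r:x + r + 1]
                  costs = [np.abs(ref - keyframe[y - r:y + r + 1,
                                                 x - d - r:x - d + r + 1]).sum()
                           for d in range(max_disp)]
                  disp[y, x] = int(np.argmin(costs))
          return disp

      def densify(sparse):
          """Fill every pixel with its nearest sampled disparity (a crude
          stand-in for the edge-aware bilateral solver)."""
          ys, xs = np.nonzero(~np.isnan(sparse))
          gy, gx = np.mgrid[0:sparse.shape[0], 0:sparse.shape[1]]
          nearest = ((gy[..., None] - ys) ** 2 + (gx[..., None] - xs) ** 2).argmin(-1)
          return sparse[ys[nearest], xs[nearest]]

      if __name__ == "__main__":
          rng = np.random.default_rng(0)
          key = rng.uniform(size=(96, 128)).astype(np.float32)
          img = np.roll(key, 4, axis=1)          # simulate a 4-pixel horizontal shift
          dense = densify(sparse_disparity(img, key))
          print("median recovered disparity:", float(np.median(dense)))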
  • Publication number: 20210018887
    Abstract: A method for automated manufacturing strategy generation can include: identifying features of a desired part from a virtual model; and determining a tactic strategy based on the identified features. The method can additionally include: determining a toolpath primitive for each tactic; combining the toolpath primitives for the tactics to generate a master toolpath; and translating the master toolpath into machine code.
    Type: Application
    Filed: October 2, 2020
    Publication date: January 21, 2021
    Inventors: William Haldean Brown, Ruza Markov, Michael Rivlin, Ambrus Csaszar, Jonathan Stults
  • Publication number: 20210004979
    Abstract: A handheld user device includes a monocular camera to capture a feed of images of a local scene and a processor to select, from the feed, a keyframe and perform, for a first image from the feed, stereo matching using the first image, the keyframe, and a relative pose based on a pose associated with the first image and a pose associated with the keyframe to generate a sparse disparity map representing disparities between the first image and the keyframe. The processor further is to determine a dense depth map from the disparity map using a bilateral solver algorithm, and process a viewfinder image generated from a second image of the feed with occlusion rendering based on the depth map to incorporate one or more virtual objects into the viewfinder image to generate an AR viewfinder image. Further, the processor is to provide the AR viewfinder image for display.
    Type: Application
    Filed: October 4, 2019
    Publication date: January 7, 2021
    Inventors: Julien Valentin, Onur G. Guleryuz, Mira Leung, Maksym Dzitsiuk, Jose Pascoal, Mirko Schmidt, Christoph Rhemann, Neal Wadhwa, Eric Turner, Sameh Khamis, Adarsh Prakash Murthy Kowdle, Ambrus Csaszar, João Manuel Castro Afonso, Jonathan T. Barron, Michael Schoenberg, Ivan Dryanovski, Vivek Verma, Vladimir Tankovich, Shahram Izadi, Sean Ryan Francesco Fanello, Konstantine Nicholas John Tsotsos
  • Patent number: 10838387
    Abstract: A method for automated manufacturing strategy generation can include: identifying features of a desired part from a virtual model; and determining a tactic strategy based on the identified features. The method can additionally include: determining a toolpath primitive for each tactic; combining the toolpath primitives for the tactics to generate a master toolpath; and translating the master toolpath into machine code.
    Type: Grant
    Filed: December 19, 2017
    Date of Patent: November 17, 2020
    Assignee: Plethora Corporation
    Inventors: William Haldean Brown, Ruza Markov, Michael Rivlin, Ambrus Csaszar, Jonathan Stults
  • Patent number: 10803663
    Abstract: A method for depth sensor aided estimation of virtual reality environment boundaries includes generating depth data at a depth sensor of an electronic device based on a local environment proximate the electronic device. A set of initial boundary data is estimated based on the depth data, wherein the set of initial boundary data defines an exterior boundary of a virtual bounded floor plan. The virtual bounded floor plan is generated based at least in part on the set of initial boundary data. Additionally, a relative pose of the electronic device within the virtual bounded floor plan is determined and a collision warning is displayed on a display of the electronic device based on the relative pose.
    Type: Grant
    Filed: August 2, 2017
    Date of Patent: October 13, 2020
    Assignee: GOOGLE LLC
    Inventors: Zhaoguang Wang, Mugur Marculescu, Chris McKenzie, Ambrus Csaszar, Ivan Dryanovski
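    Illustrative sketch (not from the patent): a small Python reduction of the boundary-estimation and collision-warning idea above. It fits an axis-aligned rectangle to depth points projected onto the floor plane and compares the device pose against it; the rectangle shape, margin, and threshold are assumptions.

      import numpy as np

      def estimate_boundary(floor_points_xz, margin=0.3):
          """Fit an axis-aligned rectangular boundary (metres) to depth points
          projected onto the floor plane, shrunk by a safety margin."""
          x_min, z_min = floor_points_xz.min(axis=0) + margin
          x_max, z_max = floor_points_xz.max(axis=0) - margin
          return x_min, z_min, x_max, z_max

      def collision_warning(pose_xz, boundary, warn_dist=0.5):
          """True if the tracked device pose is within warn_dist of any edge."""
          x, z = pose_xz
          x_min, z_min, x_max, z_max = boundary
          return min(x - x_min, x_max - x, z - z_min, z_max - z) < warn_dist

      if __name__ == "__main__":
          rng = np.random.default_rng(1)
          points = rng.uniform([-2.0, -3.0], [2.0, 3.0], size=(500, 2))
          bounds = estimate_boundary(points)
          print(collision_warning((0.0, 0.0), bounds),   # comfortably inside
                collision_warning((1.5, 0.0), bounds))   # close to a wall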
  • Patent number: 10484456
    Abstract: A content item on a content management system can be shared using a generated sharing link. The sharing link can be a custom network address, such as a uniform resource locator (URL), which allows the content item to be accessed without authentication. The sharing link and a content path of the content item can be listed in a sharing index used to identify the content item upon a request initiated by the sharing link. The content management system can generate a content link to a content item upon receiving a sharing input from a user indicating the user wants to share a content item. Alternatively, in some embodiments, sharing links can be pre-generated by the content management system and stored on the client device. The sharing link can be stored directly to a data buffer on the client device, such as a clipboard, where it can be accessed by the user.
    Type: Grant
    Filed: March 20, 2017
    Date of Patent: November 19, 2019
    Assignee: DROPBOX, INC.
    Inventors: Dwayne Litzenberger, Matt Holden, David Euresti, Maxime Larabie-Belanger, Ambrus Csaszar, Peter Welinder, Bryan Jadot
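    Illustrative sketch (not from the patent): a brief Python version of the sharing-link flow in the abstract: generate an unguessable token, record it with the content path in a sharing index, and later resolve requests without authentication. The in-memory index, URL shape, and function names are assumptions for illustration.

      import secrets

      SHARING_INDEX = {}   # token -> content path (in-memory stand-in)

      def create_sharing_link(content_path, base_url="https://example.com/s/"):
          """Generate a sharing URL and record it in the sharing index."""
          token = secrets.token_urlsafe(16)
          SHARING_INDEX[token] = content_path
          return base_url + token

      def resolve_sharing_link(url):
          """Return the content path for a sharing URL, or None if unknown."""
          return SHARING_INDEX.get(url.rsplit("/", 1)[-1])

      if __name__ == "__main__":
          link = create_sharing_link("/Photos/2014/beach.jpg")
          print(link, "->", resolve_sharing_link(link))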
  • Patent number: 10255489
    Abstract: An adaptive tracking system for spatial input devices provides real-time tracking of spatial input devices for human-computer interaction in a Spatial Operating Environment (SOE). The components of an SOE include gestural input/output; network-based data representation, transit, and interchange; and spatially conformed display mesh. The SOE comprises a workspace occupied by one or more users, a set of screens which provide the users with visual feedback, and a gestural control system which translates user motions into command inputs. Users perform gestures with body parts and/or physical pointing devices, and the system translates those gestures into actions such as pointing, dragging, selecting, or other direct manipulations. The tracking system provides the requisite data for creating an immersive environment by maintaining a model of the spatial relationships between users, screens, pointing devices, and other physical objects within the workspace.
    Type: Grant
    Filed: March 20, 2018
    Date of Patent: April 9, 2019
    Assignee: Oblong Industries, Inc.
    Inventors: Ambrus Csaszar, Dima Kogan, Paul Yarin
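    Illustrative sketch (not from the patent): one concrete piece of gesture-to-pointer translation in Python, intersecting a tracked pointing ray with a known screen plane to get an on-screen coordinate. This is stand-in geometry for a small part of the spatial model the abstract describes; the screen parametrisation is an assumption.

      import numpy as np

      def point_on_screen(origin, direction, center, normal, x_axis, y_axis,
                          half_w, half_h):
          """Return (u, v) metres from the screen centre where the pointing ray
          hits the screen, or None if it misses."""
          origin, direction = np.asarray(origin, float), np.asarray(direction, float)
          denom = direction @ normal
          if abs(denom) < 1e-9:
              return None                       # ray parallel to the screen
          t = ((center - origin) @ normal) / denom
          if t <= 0:
              return None                       # screen is behind the pointer
          hit = origin + t * direction
          u, v = (hit - center) @ x_axis, (hit - center) @ y_axis
          if abs(u) > half_w or abs(v) > half_h:
              return None                       # ray misses the screen
          return float(u), float(v)

      if __name__ == "__main__":
          # A 2 m x 1.2 m screen at the origin, facing +z, pointer 2 m away.
          print(point_on_screen(origin=[0.0, 1.5, 2.0],
                                direction=[0.1, -0.05, -1.0],
                                center=np.array([0.0, 1.5, 0.0]),
                                normal=np.array([0.0, 0.0, 1.0]),
                                x_axis=np.array([1.0, 0.0, 0.0]),
                                y_axis=np.array([0.0, 1.0, 0.0]),
                                half_w=1.0, half_h=0.6))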
  • Publication number: 20190043259
    Abstract: A method for depth sensor aided estimation of virtual reality environment boundaries includes generating depth data at a depth sensor of an electronic device based on a local environment proximate the electronic device. A set of initial boundary data is estimated based on the depth data, wherein the set of initial boundary data defines an exterior boundary of a virtual bounded floor plan. The virtual bounded floor plan is generated based at least in part on the set of initial boundary data. Additionally, a relative pose of the electronic device within the virtual bounded floor plan is determined and a collision warning is displayed on a display of the electronic device based on the relative pose.
    Type: Application
    Filed: August 2, 2017
    Publication date: February 7, 2019
    Inventors: Zhaoguang Wang, Mugur Marculescu, Chris McKenzie, Ambrus Csaszar, Ivan Dryanovski
  • Publication number: 20190033989
    Abstract: A method for generating virtual reality environment boundaries includes receiving, at a depth sensor of an electronic device, depth data from a local environment proximate the electronic device. Further, a set of outer boundary data is received that defines an exterior boundary of a virtual bounded floor plan. A virtual bounded floor plan is generated based at least in part on the set of outer boundary data and the depth data from the local environment. Further, a relative pose of the electronic device within the virtual bounded floor plan is determined and a collision warning is displayed on a display of the electronic device based on the relative pose.
    Type: Application
    Filed: July 31, 2017
    Publication date: January 31, 2019
    Inventors: Zhaoguang Wang, Mugur Marculescu, Chris McKenzie, Ambrus Csaszar, Ivan Dryanovski
  • Publication number: 20180218205
    Abstract: An adaptive tracking system for spatial input devices provides real-time tracking of spatial input devices for human-computer interaction in a Spatial Operating Environment (SOE). The components of an SOE include gestural input/output; network-based data representation, transit, and interchange; and spatially conformed display mesh. The SOE comprises a workspace occupied by one or more users, a set of screens which provide the users with visual feedback, and a gestural control system which translates user motions into command inputs. Users perform gestures with body parts and/or physical pointing devices, and the system translates those gestures into actions such as pointing, dragging, selecting, or other direct manipulations. The tracking system provides the requisite data for creating an immersive environment by maintaining a model of the spatial relationships between users, screens, pointing devices, and other physical objects within the workspace.
    Type: Application
    Filed: March 20, 2018
    Publication date: August 2, 2018
    Inventors: Ambrus Csaszar, Dima Kogan, Paul Yarin
  • Patent number: 9984285
    Abstract: An adaptive tracking system for spatial input devices provides real-time tracking of spatial input devices for human-computer interaction in a Spatial Operating Environment (SOE). The components of an SOE include gestural input/output; network-based data representation, transit, and interchange; and spatially conformed display mesh. The SOE comprises a workspace occupied by one or more users, a set of screens which provide the users with visual feedback, and a gestural control system which translates user motions into command inputs. Users perform gestures with body parts and/or physical pointing devices, and the system translates those gestures into actions such as pointing, dragging, selecting, or other direct manipulations. The tracking system provides the requisite data for creating an immersive environment by maintaining a model of the spatial relationships between users, screens, pointing devices, and other physical objects within the workspace.
    Type: Grant
    Filed: July 7, 2017
    Date of Patent: May 29, 2018
    Assignee: Oblong Industries, Inc.
    Inventors: Ambrus Csaszar, Dima Kogan, Paul Yarin
  • Publication number: 20180107186
    Abstract: A method for automated manufacturing strategy generation can include: identifying features of a desired part from a virtual model; and determining a tactic strategy based on the identified features. The method can additionally include: determining a toolpath primitive for each tactic; combining the toolpath primitives for the tactics to generate a master toolpath; and translating the master toolpath into machine code.
    Type: Application
    Filed: December 19, 2017
    Publication date: April 19, 2018
    Inventors: William Haldean Brown, Ruza Markov, Michael Rivlin, Ambrus Csaszar, Jonathan Stults
  • Patent number: 9928051
    Abstract: A light installer can be utilized to improve the installation process of a client-side application. A light installer can be an installer containing only the necessary information to initiate the installation process, such as information necessary for prompting the user for required data and authorizations, communicating with the content management system, downloading additional resources, and installing the client-side application. The light installer can minimize user interaction time by obtaining all necessary user authorizations early in the installation process, thereby enabling the light installer to install all components of the client-side application without further authorization from the user. Further, the light installer can be tagged with data identifying a user account associated with the client device that can be used for reporting, pre-populating data during the installation process, customizing the installation process, pre-authorizing the client-side application, etc.
    Type: Grant
    Filed: March 11, 2016
    Date of Patent: March 27, 2018
    Assignee: DROPBOX, INC.
    Inventor: Ambrus Csaszar
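    Illustrative sketch (not from the patent): a tiny Python outline of the light-installer flow in the abstract: gather every required authorization up front, then download and install all components without further prompts. The callbacks simulate the network and install steps; none of this reflects Dropbox's actual installer code.

      def light_install(components, ask_user, download, install):
          """Collect all authorizations first, then install unattended."""
          if not ask_user("Allow installation and future updates?"):
              return False
          for name in components:
              install(name, download(name))      # no further prompts needed
          return True

      if __name__ == "__main__":
          installed = {}
          ok = light_install(
              components=["core", "sync-engine", "tray-icon"],
              ask_user=lambda prompt: True,                         # simulated consent
              download=lambda name: f"payload-for-{name}".encode(),  # simulated fetch
              install=lambda name, data: installed.update({name: len(data)}),
          )
          print(ok, installed)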
  • Patent number: 9880542
    Abstract: A method for automated manufacturing strategy generation can include: identifying features of a desired part from a virtual model; and determining a tactic strategy based on the identified features. The method can additionally include: determining a toolpath primitive for each tactic; combining the toolpath primitives for the tactics to generate a master toolpath; and translating the master toolpath into machine code.
    Type: Grant
    Filed: September 12, 2016
    Date of Patent: January 30, 2018
    Assignee: Plethora Corporation
    Inventors: William Haldean Brown, Ruza Markov, Michael Rivlin, Ambrus Csaszar, Jonathan Stults
  • Publication number: 20170308743
    Abstract: An adaptive tracking system for spatial input devices provides real-time tracking of spatial input devices for human-computer interaction in a Spatial Operating Environment (SOE). The components of an SOE include gestural input/output; network-based data representation, transit, and interchange; and spatially conformed display mesh. The SOE comprises a workspace occupied by one or more users, a set of screens which provide the users with visual feedback, and a gestural control system which translates user motions into command inputs. Users perform gestures with body parts and/or physical pointing devices, and the system translates those gestures into actions such as pointing, dragging, selecting, or other direct manipulations. The tracking system provides the requisite data for creating an immersive environment by maintaining a model of the spatial relationships between users, screens, pointing devices, and other physical objects within the workspace.
    Type: Application
    Filed: July 7, 2017
    Publication date: October 26, 2017
    Inventors: Ambrus Csaszar, Dima Kogan, Paul Yarin
  • Patent number: 9740922
    Abstract: An adaptive tracking system for spatial input devices provides real-time tracking of spatial input devices for human-computer interaction in a Spatial Operating Environment (SOE). The components of an SOE include gestural input/output; network-based data representation, transit, and interchange; and spatially conformed display mesh. The SOE comprises a workspace occupied by one or more users, a set of screens which provide the users with visual feedback, and a gestural control system which translates user motions into command inputs. Users perform gestures with body parts and/or physical pointing devices, and the system translates those gestures into actions such as pointing, dragging, selecting, or other direct manipulations. The tracking system provides the requisite data for creating an immersive environment by maintaining a model of the spatial relationships between users, screens, pointing devices, and other physical objects within the workspace.
    Type: Grant
    Filed: January 27, 2015
    Date of Patent: August 22, 2017
    Assignee: Oblong Industries, Inc.
    Inventors: Ambrus Csaszar, Dima Kogan, Paul Yarin
  • Publication number: 20170195402
    Abstract: A content item on a content management system can be shared using a generated sharing link. The sharing link can be a custom network address, such as a uniform resource locator (URL), which allows the content item to be accessed without authentication. The sharing link and a content path of the content item can be listed in a sharing index used to identify the content item upon a request initiated by the sharing link. The content management system can generate a content link to a content item upon receiving a sharing input from a user indicating the user wants to share a content item. Alternatively, in some embodiments, sharing links can be pre-generated by the content management system and stored on the client device. The sharing link can be stored directly to a data buffer on the client device, such as a clipboard, where it can be accessed by the user.
    Type: Application
    Filed: March 20, 2017
    Publication date: July 6, 2017
    Inventors: Dwayne Litzenberger, Matt Holden, David Euresti, Maxime Larabie-Belanger, Ambrus Csaszar, Peter Welinder, Bryan Jadot
  • Patent number: 9628560
    Abstract: A content item on a content management system can be shared using a generated sharing link. The sharing link can be a custom network address, such as a uniform resource locator (URL), which allows the content item to be accessed without authentication. The sharing link and a content path of the content item can be listed in a sharing index used to identify the content item upon a request initiated by the sharing link. The content management system can generate a content link to a content item upon receiving a sharing input from a user indicating the user wants to share a content item. Alternatively, in some embodiments, sharing links can be pre-generated by the content management system and stored on the client device. The sharing link can be stored directly to a data buffer on the client device, such as a clipboard, where it can be accessed by the user.
    Type: Grant
    Filed: March 10, 2014
    Date of Patent: April 18, 2017
    Assignee: Dropbox, Inc.
    Inventors: Dwayne Litzenberger, Matt Holden, David Euresti, Maxime Larabie-Belanger, Ambrus Csaszar, Peter Welinder, Bryan Jadot