Patents by Inventor Jonathan J. Hosenpud

Jonathan J. Hosenpud has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

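Brief, hedged code sketches follow several of the entries below; each illustrates the technique summarized in the corresponding abstract using invented names and interfaces rather than zSpace source code.
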
  • Patent number: 11843755
    Abstract: Systems and methods for cloud-based rendering of interactive augmented reality (AR) and/or virtual reality (VR) experiences. A client device may initiate execution of a content application on a server and provide information associated with the content application to the server. The client device may initialize, while awaiting a notification from the server, local systems associated with the content application and, upon receipt of the notification, provide, to the server, information associated with the local systems. Further, the client device may receive, from the server, data associated with the content application and render an AR/VR scene based on the received data. The data may be based, at least in part, on the information associated with the local systems. The providing and receiving may be performed periodically, e.g., at a rate that sustains a comfortable viewing experience of the AR/VR scene for a user of the client device.
    Type: Grant
    Filed: June 7, 2021
    Date of Patent: December 12, 2023
    Assignee: ZSPACE, INC.
    Inventors: Clifford S. Champion, Jonathan J. Hosenpud, Baifang Lu, Alex Shorey, Robert D. Kalnins
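
The cloud-rendering entries in this listing share the client/server flow summarized in the abstract above: the client starts the content application on the server, brings up its local systems while waiting for the server's notification, reports their state, and then exchanges data and renders the AR/VR scene periodically. Below is a minimal Python sketch of that loop, assuming hypothetical server and local-system interfaces (start_application, wait_for_ready, send_state, receive_frame, and the rest are illustrative names, not zSpace APIs).

    # Minimal, hypothetical sketch of the client-side flow in patent 11843755.
    import time
    from typing import Any

    def run_client(server: Any, local_systems: Any, app_id: str, rate_hz: float = 60.0) -> None:
        """Drive one cloud-rendered AR/VR session from the client device."""
        # Initiate execution of the content application on the server and pass
        # along the information associated with that application.
        session = server.start_application(app_id)

        # Initialize local systems (head/stylus tracking, stereo display) while
        # waiting for the server's readiness notification, then report their state.
        local_systems.initialize()
        server.wait_for_ready(session)
        server.send_state(session, local_systems.describe())

        # Exchange data periodically, at a rate intended to sustain a comfortable
        # viewing experience, and render the AR/VR scene from the received data.
        period = 1.0 / rate_hz
        while server.session_active(session):
            frame = server.receive_frame(session)             # data from the server
            local_systems.render_scene(frame)                 # local AR/VR rendering
            server.send_state(session, local_systems.poll())  # fresh pose and input data
            time.sleep(period)
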
  • Publication number: 20230336701
    Abstract: Systems and methods for cloud-based rendering of interactive augmented reality (AR) and/or virtual reality (VR) experiences. A client device may initiate execution of a content application on a server and provide information associated with the content application to the server. The client device may initialize, while awaiting a notification from the server, local systems associated with the content application and, upon receipt of the notification, provide, to the server, information associated with the local systems. Further, the client device may receive, from the server, data associated with the content application and render an AR/VR scene based on the received data. The data may be based, at least in part, on the information associated with the local systems. The providing and receiving may be performed periodically, e.g., at a rate that sustains a comfortable viewing experience of the AR/VR scene for a user of the client device.
    Type: Application
    Filed: June 9, 2023
    Publication date: October 19, 2023
    Inventors: Clifford S. Champion, Jonathan J. Hosenpud, Baifang Lu, Alex Shorey, Robert D. Kalnins
  • Publication number: 20230328213
    Abstract: Systems and methods for cloud-based rendering of interactive augmented reality (AR) and/or virtual reality (VR) experiences. A client device may initiate execution of a content application on a server and provide information associated with the content application to the server. The client device may initialize, while awaiting a notification from the server, local systems associated with the content application and, upon receipt of the notification, provide, to the server, information associated with the local systems. Further, the client device may receive, from the server, data associated with the content application and render an AR/VR scene based on the received data. The data may be based, at least in part, on the information associated with the local systems. The providing and receiving may be performed periodically, e.g., at a rate that sustains a comfortable viewing experience of the AR/VR scene for a user of the client device.
    Type: Application
    Filed: June 9, 2023
    Publication date: October 12, 2023
    Inventors: Clifford S. Champion, Jonathan J. Hosenpud, Baifang Lu, Alex Shorey, Robert D. Kalnins
  • Publication number: 20230319247
    Abstract: Systems and methods for cloud-based rendering of interactive augmented reality (AR) and/or virtual reality (VR) experiences. A client device may initiate execution of a content application on a server and provide information associated with the content application to the server. The client device may initialize, while awaiting a notification from the server, local systems associated with the content application and, upon receipt of the notification, provide, to the server, information associated with the local systems. Further, the client device may receive, from the server, data associated with the content application and render an AR/VR scene based on the received data. The data may be based, at least in part, on the information associated with the local systems. The providing and receiving may be performed periodically, e.g., at a rate that sustains a comfortable viewing experience of the AR/VR scene for a user of the client device.
    Type: Application
    Filed: June 9, 2023
    Publication date: October 5, 2023
    Inventors: Clifford S. Champion, Jonathan J. Hosenpud, Baifang Lu, Alex Shorey, Robert D. Kalnins
  • Publication number: 20230308623
    Abstract: Systems and methods for cloud-based rendering of interactive augmented reality (AR) and/or virtual reality (VR) experiences. A client device may initiate execution of a content application on a server and provide information associated with the content application to the server. The client device may initialize, while awaiting a notification from the server, local systems associated with the content application and, upon receipt of the notification, provide, to the server, information associated with the local systems. Further, the client device may receive, from the server, data associated with the content application and render an AR/VR scene based on the received data. The data may be based, at least in part, on the information associated with the local systems. The providing and receiving may be performed periodically, e.g., at a rate that sustains a comfortable viewing experience of the AR/VR scene for a user of the client device.
    Type: Application
    Filed: May 16, 2023
    Publication date: September 28, 2023
    Inventors: Clifford S. Champion, Jonathan J. Hosenpud, Baifang Lu, Alex Shorey, Robert D. Kalnins
  • Publication number: 20230291882
    Abstract: Systems and methods for cloud-based rendering of interactive augmented reality (AR) and/or virtual reality (VR) experiences. A client device may initiate execution of a content application on a server and provide information associated with the content application to the server. The client device may initialize, while awaiting a notification from the server, local systems associated with the content application and, upon receipt of the notification, provide, to the server, information associated with the local systems. Further, the client device may receive, from the server, data associated with the content application and render an AR/VR scene based on the received data. The data may be based, at least in part, on the information associated with the local systems. The providing and receiving may be performed periodically, e.g., at a rate that sustains a comfortable viewing experience of the AR/VR scene for a user of the client device.
    Type: Application
    Filed: May 16, 2023
    Publication date: September 14, 2023
    Inventors: Clifford S. Champion, Jonathan J. Hosenpud, Baifang Lu, Alex Shorey, Robert D. Kalnins
  • Patent number: 11645809
    Abstract: Systems and methods for user selection of a virtual object in a virtual scene. A user input may be received via a user input device. The user input may be an attempt to select a virtual object from a plurality of virtual objects rendered in a virtual scene on a display of a display system. A position and orientation of the user input device may be determined in response to the user input. A probability that the user input selects each virtual object may be calculated via a probability model. Based on the position and orientation of the user input device, a ray-cast procedure and a sphere-cast procedure may be performed to determine the virtual object being selected. The probability of selection may also be considered in determining the virtual object. A virtual beam may be rendered from the user input device to the virtual object.
    Type: Grant
    Filed: March 2, 2021
    Date of Patent: May 9, 2023
    Assignee: zSpace, Inc.
    Inventors: Jonathan J. Hosenpud, Clifford S. Champion, David A. Chavez, Kevin S. Yamada, Alexandre R. Lelievre
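
Patent 11645809 above combines a ray-cast procedure and a sphere-cast procedure from the input device's pose with a per-object probability model to decide which virtual object the user meant to select. The following is a small Python sketch of one way such a combination could be scored; the bounding-sphere object proxies, the weights, and the assumption of a normalized direction vector are all illustrative, not the patented method itself.

    # Hypothetical scoring that folds ray cast, sphere cast, and selection probability together.
    import math
    from dataclasses import dataclass

    @dataclass
    class VirtualObject:
        name: str
        center: tuple          # (x, y, z) in scene coordinates
        radius: float          # bounding-sphere radius
        prior: float           # selection probability from the probability model

    def point_to_ray_distance(point, origin, direction):
        # Distance from a point to the ray defined by the device's position and
        # (normalized) orientation vector.
        po = [p - o for p, o in zip(point, origin)]
        t = max(0.0, sum(a * b for a, b in zip(po, direction)))
        closest = [o + t * d for o, d in zip(origin, direction)]
        return math.dist(point, closest)

    def select_object(objects, origin, direction, sphere_radius=0.02):
        # A direct ray hit scores highest; a "thickened" ray (the sphere cast)
        # tolerates slightly-off aim; the prior breaks ties toward likely targets.
        best, best_score = None, -math.inf
        for obj in objects:
            d = point_to_ray_distance(obj.center, origin, direction)
            ray_hit = d <= obj.radius
            sphere_hit = d <= obj.radius + sphere_radius
            score = (2.0 if ray_hit else 1.0 if sphere_hit else 0.0) + obj.prior
            if score > best_score:
                best, best_score = obj, score
        return best  # the object toward which the virtual beam is rendered

A real system would keep updating the probability model from context, but the point of the sketch is how the three signals can be combined into a single choice.
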
  • Publication number: 20220394225
    Abstract: Systems and methods for cloud-based rendering of interactive augmented reality (AR) and/or virtual reality (VR) experiences. A client device may initiate execution of a content application on a server and provide information associated with the content application to the server. The client device may initialize, while awaiting a notification from the server, local systems associated with the content application and, upon receipt of the notification, provide, to the server, information associated with the local systems. Further, the client device may receive, from the server, data associated with the content application and render an AR/VR scene based on the received data. The data may be based, at least in part, on the information associated with the local systems. The providing and receiving may be performed periodically, e.g., at a rate that sustains a comfortable viewing experience of the AR/VR scene for a user of the client device.
    Type: Application
    Filed: June 7, 2021
    Publication date: December 8, 2022
    Inventors: Clifford S. Champion, Jonathan J. Hosenpud, Baifang Lu, Alex Shorey, Robert D. Kalnins
  • Patent number: 11287905
    Abstract: Systems and methods for enhancing trackability of a passive stylus. A six degree of freedom (6DoF) location and orientation of a passive stylus may be tracked by a tracking system via a retroreflector system disposed on the passive stylus. Additionally, characteristic movements of a user's finger, hand, and/or wrist may be recognized by the tracking system. The passive stylus may be usable to interact with a virtual 3D scene being displayed via a 3D display. A user input via the passive stylus may be determined based on the tracked 6DoF location and orientation of the passive stylus and/or the recognized characteristic movements. The retroreflector system may include multiple patterns of retroreflectors, and one of the patterns may be a spiral pattern of retroreflectors disposed along a longitudinal axis of the passive stylus.
    Type: Grant
    Filed: March 2, 2021
    Date of Patent: March 29, 2022
    Assignee: ZSPACE, INC.
    Inventors: Kevin S. Yamada, Jonathan J. Hosenpud, Christian R. Larsen, David A. Chavez, Arthur L. Berman, Clifford S. Champion
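
One concrete feature in patent 11287905 above is a spiral pattern of retroreflectors disposed along the stylus's longitudinal axis, which presents the tracking cameras with a distinctive dot pattern regardless of how the stylus is rolled. Below is a short Python sketch of how such a helical marker layout might be generated; the marker count, stylus dimensions, and number of turns are purely illustrative.

    # Hypothetical helical layout of retroreflective markers on a stylus barrel.
    import math

    def spiral_marker_positions(num_markers=8, stylus_length=0.15,
                                stylus_radius=0.006, turns=1.5):
        """Return (x, y, z) marker centers; z runs along the longitudinal axis."""
        positions = []
        for i in range(num_markers):
            t = i / (num_markers - 1)                 # fraction of the way along the stylus
            angle = 2.0 * math.pi * turns * t         # wrap around the barrel as we advance
            positions.append((stylus_radius * math.cos(angle),
                              stylus_radius * math.sin(angle),
                              stylus_length * t))
        return positions
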
  • Patent number: 11284061
    Abstract: Systems and methods for capturing a two-dimensional (2D) image of a portion of a three-dimensional (3D) scene may include a computer rendering a 3D scene on a display from a user's point of view (POV). A camera mode may be activated in response to user input and a POV of a camera may be determined. The POV of the camera may be specified by the position and orientation of a user input device coupled to the computer, and may be independent of the user's POV. A 2D frame of the 3D scene based on the POV of the camera may be determined, and the 2D image based on the 2D frame may be captured in response to user input. The 2D image may be stored locally or on a server of a network.
    Type: Grant
    Filed: April 25, 2019
    Date of Patent: March 22, 2022
    Assignee: ZSPACE, INC.
    Inventors: Jonathan J. Hosenpud, Arthur L. Berman, Jerome C. Tu, Kevin D. Morishige, David A. Chavez
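
Patent 11284061 above captures a 2D frame of the 3D scene from the camera's own point of view, which is set by the position and orientation of the user input device and is independent of the user's head-tracked point of view. Below is a minimal Python sketch of the underlying projection using a simple pinhole camera model; it is an illustration of the idea, not zSpace's renderer.

    # Hypothetical projection of a scene point into the capture camera's 2D frame.
    import math

    def project_point(point, cam_pos, cam_forward, cam_up, fov_deg=60.0, aspect=16 / 9):
        def sub(a, b): return [x - y for x, y in zip(a, b)]
        def dot(a, b): return sum(x * y for x, y in zip(a, b))
        def cross(a, b): return [a[1] * b[2] - a[2] * b[1],
                                 a[2] * b[0] - a[0] * b[2],
                                 a[0] * b[1] - a[1] * b[0]]
        def norm(a):
            length = math.sqrt(dot(a, a))
            return [x / length for x in a]

        # Build a camera basis from the input device's position and orientation.
        f = norm(cam_forward)
        r = norm(cross(f, cam_up))
        u = cross(r, f)
        rel = sub(point, cam_pos)
        x_cam, y_cam, z_cam = dot(rel, r), dot(rel, u), dot(rel, f)
        if z_cam <= 0:
            return None                               # point is behind the camera
        scale = 1.0 / math.tan(math.radians(fov_deg) / 2.0)
        return (scale * x_cam / (z_cam * aspect),     # normalized device coordinates
                scale * y_cam / z_cam)
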
  • Publication number: 20210208700
    Abstract: Systems and methods for enhancing trackability of a passive stylus. A six degree of freedom (6DoF) location and orientation of a passive stylus may be tracked by a tracking system via a retroreflector system disposed on the passive stylus. Additionally, characteristic movements of a user's finger, hand, and/or wrist may be recognized by the tracking system. The passive stylus may be usable to interact with a virtual 3D scene being displayed via a 3D display. A user input via the passive stylus may be determined based on the tracked 6DoF location and orientation of the passive stylus and/or the recognized characteristic movements. The retroreflector system may include multiple patterns of retroreflectors, and one of the patterns may be a spiral pattern of retroreflectors disposed along a longitudinal axis of the passive stylus.
    Type: Application
    Filed: March 2, 2021
    Publication date: July 8, 2021
    Inventors: Kevin S. Yamada, Jonathan J. Hosenpud, Christian R. Larsen, David A. Chavez, Arthur L. Berman, Clifford S. Champion
  • Publication number: 20210183132
    Abstract: Systems and methods for user selection of a virtual object in a virtual scene. A user input may be received via a user input device. The user input may be an attempt to select a virtual object from a plurality of virtual objects rendered in a virtual scene on a display of a display system. A position and orientation of the user input device may be determined in response to the user input. A probability that the user input selects each virtual object may be calculated via a probability model. Based on the position and orientation of the user input device, a ray-cast procedure and a sphere-cast procedure may be performed to determine the virtual object being selected. The probability of selection may also be considered in determining the virtual object. A virtual beam may be rendered from the user input device to the virtual object.
    Type: Application
    Filed: March 2, 2021
    Publication date: June 17, 2021
    Inventors: Jonathan J. Hosenpud, Clifford S. Champion, David A. Chavez, Kevin S. Yamada, Alexandre R. Lelievre
  • Patent number: 11003305
    Abstract: Systems and methods for displaying a three-dimensional (3D) workspace, including a 3D internet browser, in addition to a traditional two-dimensional (2D) workspace; for browsing the internet in a 3D/virtual reality workspace; and for transforming and/or upconverting objects and/or visual media from the 2D workspace and/or 2D webpages into the 3D workspace as 3D objects and/or stereoscopic output for display in the 3D workspace.
    Type: Grant
    Filed: November 18, 2016
    Date of Patent: May 11, 2021
    Assignee: ZSPACE, INC.
    Inventors: Clifford S. Champion, Eduardo Baraf, Alexandre R. Lelievre, Jonathan J. Hosenpud
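
Patent 11003305 above (and patent 10863168 later in this listing) describe transforming and upconverting 2D objects and visual media into the 3D workspace. The Python sketch below shows one way such a pass over page content might be organized; the element dictionary shape and the upconversion rules are assumptions made for illustration.

    # Hypothetical upconversion pass from 2D page elements to 3D workspace objects.
    def upconvert_page(elements):
        """elements: list of dicts like {"tag": "img", "src": ..., "depth_map": ...}."""
        scene_objects = []
        for el in elements:
            if el.get("tag") == "model" or el.get("src", "").endswith((".obj", ".gltf")):
                # Native 3D assets go straight into the workspace as meshes.
                scene_objects.append({"kind": "mesh", "source": el["src"]})
            elif el.get("tag") == "img" and el.get("depth_map"):
                # A 2D image with depth information can become a stereoscopic quad.
                scene_objects.append({"kind": "stereo_image", "source": el["src"],
                                      "depth": el["depth_map"]})
            else:
                # Everything else stays on a flat panel placed within the 3D workspace.
                scene_objects.append({"kind": "flat_panel", "source": el})
        return scene_objects
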
  • Publication number: 20210074055
    Abstract: Systems and methods for user selection of a virtual object in a virtual scene. A user input may be received via a user input device. The user input may be an attempt to select a virtual object from a plurality of virtual objects rendered in a virtual scene on a display of a display system. A position and orientation of the user input device may be determined in response to the user input. A probability that the user input selects each virtual object may be calculated via a probability model. Based on the position and orientation of the user input device, a ray-cast procedure and a sphere-cast procedure may be performed to determine the virtual object being selected. The probability of selection may also be considered in determining the virtual object. A virtual beam may be rendered from the user input device to the virtual object.
    Type: Application
    Filed: September 6, 2019
    Publication date: March 11, 2021
    Inventors: Jonathan J. Hosenpud, Clifford S. Champion, David A. Chavez, Kevin S. Yamada, Alexandre R. Lelievre
  • Patent number: 10942585
    Abstract: Systems and methods for enhancing trackability of a passive stylus. A six degree of freedom (6DoF) location and orientation of a passive stylus may be tracked by a tracking system via a retroreflector system disposed on the passive stylus. Additionally, characteristic movements of a user's finger, hand, and/or wrist may be recognized by the tracking system. The passive stylus may be usable to interact with a virtual 3D scene being displayed via a 3D display. A user input via the passive stylus may be determined based on the tracked 6DoF location and orientation of the passive stylus and/or the recognized characteristic movements. The retroreflector system may include multiple patterns of retroreflectors, and one of the patterns may be a spiral pattern of retroreflectors disposed along a longitudinal axis of the passive stylus.
    Type: Grant
    Filed: November 14, 2019
    Date of Patent: March 9, 2021
    Assignee: ZSPACE, INC.
    Inventors: Kevin S. Yamada, Jonathan J. Hosenpud, Christian R. Larsen, David A. Chavez, Arthur L. Berman, Clifford S. Champion
  • Patent number: 10943388
    Abstract: Systems and methods for user selection of a virtual object in a virtual scene. A user input may be received via a user input device. The user input may be an attempt to select a virtual object from a plurality of virtual objects rendered in a virtual scene on a display of a display system. A position and orientation of the user input device may be determined in response to the user input. A probability that the user input selects each virtual object may be calculated via a probability model. Based on the position and orientation of the user input device, a ray-cast procedure and a sphere-cast procedure may be performed to determine the virtual object being selected. The probability of selection may also be considered in determining the virtual object. A virtual beam may be rendered from the user input device to the virtual object.
    Type: Grant
    Filed: September 6, 2019
    Date of Patent: March 9, 2021
    Assignee: ZSPACE, INC.
    Inventors: Jonathan J. Hosenpud, Clifford S. Champion, David A. Chavez, Kevin S. Yamada, Alexandre R. Lelievre
  • Publication number: 20210026464
    Abstract: Systems and methods for enhancing trackability of a passive stylus. A six degree of freedom (6DoF) location and orientation of a passive stylus may be tracked by a tracking system via a retroreflector system disposed on the passive stylus. Additionally, characteristic movements of a user's finger, hand, and/or wrist may be recognized by the tracking system. The passive stylus may be usable to interact with a virtual 3D scene being displayed via a 3D display. A user input via the passive stylus may be determined based on the tracked 6DoF location and orientation of the passive stylus and/or the recognized characteristic movements. The retroreflector system may include multiple patterns of retroreflectors, and one of the patterns may be a spiral pattern of retroreflectors disposed along a longitudinal axis of the passive stylus.
    Type: Application
    Filed: November 14, 2019
    Publication date: January 28, 2021
    Inventors: Kevin S. Yamada, Jonathan J. Hosenpud, Christian R. Larsen, David A. Chavez, Arthur L. Berman, Clifford S. Champion
  • Patent number: 10866820
    Abstract: Systems and methods for displaying a stereoscopic three-dimensional (3D) webpage overlay. User input may be received from a user input device, and in response to determining that the user input device is interacting with the 3D content, at least one of a plurality of render properties associated with the 3D content may be modified. The at least one render property may be incrementally modified over a specified period of time, thereby animating modification of the at least one render property.
    Type: Grant
    Filed: April 24, 2019
    Date of Patent: December 15, 2020
    Assignee: zSpace, Inc.
    Inventors: Jonathan J. Hosenpud, Clifford S. Champion
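
Patent 10866820 above modifies a render property of the 3D content incrementally over a specified period of time, so the change reads as an animation rather than a jump. Below is a minimal Python sketch of that incremental modification, using plain linear interpolation and property names invented for the example.

    # Hypothetical incremental modification of a render property over time.
    import time

    def animate_property(obj, prop, target, duration_s=0.25, steps=30):
        start = obj[prop]
        for i in range(1, steps + 1):
            t = i / steps                               # animation progress, 0..1
            obj[prop] = start + (target - start) * t    # linear interpolation
            time.sleep(duration_s / steps)              # spread the change over the duration

    overlay = {"opacity": 0.4, "scale": 1.0}
    animate_property(overlay, "opacity", 1.0)           # fade the overlay in when interacted with
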
  • Patent number: 10863168
    Abstract: Systems and methods for displaying a three-dimensional (3D) workspace, including a 3D internet browser, in addition to a traditional two-dimensional (2D) workspace; for browsing the internet in a 3D/virtual reality workspace; and for transforming and/or upconverting objects and/or visual media from the 2D workspace and/or 2D webpages into the 3D workspace as 3D objects and/or stereoscopic output for display in the 3D workspace.
    Type: Grant
    Filed: November 27, 2019
    Date of Patent: December 8, 2020
    Assignee: ZSPACE, INC.
    Inventors: Clifford S. Champion, Eduardo Baraf, Alexandre R. Lelievre, Jonathan J. Hosenpud
  • Patent number: 10701347
    Abstract: Systems and methods for replacing a 2D image with an equivalent 3D image within a web page. Content of a 2D image displayed within a web page may be identified, and 3D images may be identified as possible replacements for the 2D image. The 3D images may be ranked based on sets of ranking criteria, and the 3D image with the highest ranking value may be selected. The selected 3D image may be integrated into the web page, thereby replacing the 2D image with the selected 3D image. Further, a user input manipulating the 3D image within the web page may be received. The user input may include movement of a viewpoint of a user relative to a display displaying the web page and/or detection of a beam projected from an end of a user input device intersecting with the 3D image.
    Type: Grant
    Filed: December 20, 2019
    Date of Patent: June 30, 2020
    Assignee: zSpace, Inc.
    Inventors: David A. Chavez, Jonathan J. Hosenpud, Clifford S. Champion, Alexandre R. Lelievre, Arthur L. Berman, Kevin S. Yamada
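
Patent 10701347 above ranks candidate 3D images against sets of ranking criteria and integrates the highest-ranked candidate into the web page in place of the 2D image. The following is a small Python sketch of one possible ranking step; the criteria names, weights, and candidate data are invented for illustration.

    # Hypothetical ranking of candidate 3D replacements for a 2D image.
    def rank_candidates(candidates, criteria_weights):
        """candidates: list of dicts with per-criterion scores in [0, 1]."""
        def score(c):
            return sum(weight * c.get(name, 0.0)
                       for name, weight in criteria_weights.items())
        return max(candidates, key=score)

    weights = {"content_match": 0.5, "resolution": 0.2, "license": 0.2, "load_cost": 0.1}
    candidates = [
        {"id": "model_a", "content_match": 0.9, "resolution": 0.7, "license": 1.0, "load_cost": 0.6},
        {"id": "model_b", "content_match": 0.8, "resolution": 0.9, "license": 0.5, "load_cost": 0.9},
    ]
    best = rank_candidates(candidates, weights)   # `best` is then integrated into the page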