Patents by Inventor Jerome C. Tu

Jerome C. Tu has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20230086527
    Abstract: The subject disclosure provides systems and methods for low-latency point-to-point communication between electronic devices. In one or more implementations, low latency can be achieved by including data in transmission units, in order of priority up to a maximum duration for the transmission unit. In an example in which the communication occurs over a WiFi channel, the transmission unit may be a multi-TID AMPDU, and the priority may be based on a Quality of Service (QoS) category as identified by the traffic identifier (TID). The communication may be exchanged using a periodic access window that can be adaptively offset, to efficiently share bandwidth with other device pairs on the same wireless channel. Base and client devices can be time synchronized, and transmission opportunity (TXOP) bursting can be allowed, in one or more implementations.
    Type: Application
    Filed: September 20, 2021
    Publication date: March 23, 2023
    Inventors: Yoel Boger, Oren Shani, Jerome C. Tu, Sungho Yun
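
The packing scheme this abstract describes, filling a transmission unit with queued data in priority order until a maximum duration is reached, can be sketched in a few lines. The Python below is only an illustration: the Frame and TransmissionUnit structures, the per-frame airtime field, and the TID-to-priority ordering are assumptions made for the example, not details taken from the application.

    from dataclasses import dataclass, field
    from typing import Dict, List

    @dataclass
    class Frame:
        tid: int            # traffic identifier, maps to a QoS category
        duration_us: float  # estimated airtime for this frame
        payload: bytes

    @dataclass
    class TransmissionUnit:
        max_duration_us: float
        frames: List[Frame] = field(default_factory=list)

        @property
        def duration_us(self) -> float:
            return sum(f.duration_us for f in self.frames)

    def build_transmission_unit(queues: Dict[int, List[Frame]],
                                tid_priority: List[int],
                                max_duration_us: float) -> TransmissionUnit:
        """Fill a transmission unit with queued frames in priority order,
        stopping once the maximum duration would be exceeded."""
        unit = TransmissionUnit(max_duration_us)
        for tid in tid_priority:                       # highest-priority TID first
            for frame in list(queues.get(tid, [])):
                if unit.duration_us + frame.duration_us > max_duration_us:
                    return unit                        # the unit is full
                unit.frames.append(frame)
                queues[tid].remove(frame)
        return unit
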
  • Patent number: 11284061
    Abstract: Systems and methods for capturing a two dimensional (2D) image of a portion of a three dimensional (3D) scene may include a computer rendering a 3D scene on a display from a user's point of view (POV). A camera mode may be activated in response to user input and a POV of a camera may be determined. The POV of the camera may be specified by position and orientation of a user input device coupled to the computer, and may be independent of the user's POV. A 2D frame of the 3D scene based on the POV of the camera may be determined and the 2D image based on the 2D frame may be captured in response to user input. The 2D image may be stored locally or on a server of a network.
    Type: Grant
    Filed: April 25, 2019
    Date of Patent: March 22, 2022
    Assignee: zSpace, Inc.
    Inventors: Jonathan J. Hosenpud, Arthur L. Berman, Jerome C. Tu, Kevin D. Morishige, David A. Chavez
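
The grant above shares its abstract with publication 20190253699 and patent 10321126 below; the core idea is that the capture camera's point of view (POV) is specified by a tracked input device and is independent of the user's POV. The sketch below is a hypothetical illustration only: it builds two independent view matrices and projects scene points through the camera's view to form the captured 2D frame. The pinhole projection and all function names are assumptions, not the patented implementation.

    import numpy as np

    def look_at(eye, target, up=(0.0, 1.0, 0.0)):
        """Build a view matrix for a point of view (POV) given by a position and a look target."""
        eye, target, up = (np.asarray(v, dtype=float) for v in (eye, target, up))
        f = target - eye; f /= np.linalg.norm(f)       # forward
        r = np.cross(f, up); r /= np.linalg.norm(r)    # right
        u = np.cross(r, f)                             # corrected up
        view = np.eye(4)
        view[0, :3], view[1, :3], view[2, :3] = r, u, -f
        view[:3, 3] = -view[:3, :3] @ eye
        return view

    def project_point(view, point, focal=1.0):
        """Project a 3D scene point into a 2D frame with a simple pinhole model."""
        p = view @ np.append(np.asarray(point, dtype=float), 1.0)
        return focal * p[0] / -p[2], focal * p[1] / -p[2]

    # The user's POV drives the interactive 3D rendering; the camera's POV,
    # taken from the tracked input device, is independent of it.
    user_view   = look_at(eye=[0, 0, 5], target=[0, 0, 0])
    camera_view = look_at(eye=[3, 2, 4], target=[0, 0, 0])     # stylus-defined camera

    scene_points = [[0.5, 0.5, 0.0], [-0.5, 0.2, 0.3]]
    captured_2d_frame = [project_point(camera_view, p) for p in scene_points]
    print(captured_2d_frame)   # a real system would store this locally or on a server
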
  • Publication number: 20190253699
    Abstract: Systems and methods for capturing a two dimensional (2D) image of a portion of a three dimensional (3D) scene may include a computer rendering a 3D scene on a display from a user's point of view (POV). A camera mode may be activated in response to user input and a POV of a camera may be determined. The POV of the camera may be specified by position and orientation of a user input device coupled to the computer, and may be independent of the user's POV. A 2D frame of the 3D scene based on the POV of the camera may be determined and the 2D image based on the 2D frame may be captured in response to user input. The 2D image may be stored locally or on a server of a network.
    Type: Application
    Filed: April 25, 2019
    Publication date: August 15, 2019
    Inventors: Jonathan J. Hosenpud, Arthur L. Berman, Jerome C. Tu, Kevin D. Morishige, David A. Chavez
  • Patent number: 10321126
    Abstract: Systems and methods for capturing a two dimensional (2D) image of a portion of a three dimensional (3D) scene may include a computer rendering a 3D scene on a display from a user's point of view (POV). A camera mode may be activated in response to user input and a POV of a camera may be determined. The POV of the camera may be specified by position and orientation of a user input device coupled to the computer, and may be independent of the user's POV. A 2D frame of the 3D scene based on the POV of the camera may be determined and the 2D image based on the 2D frame may be captured in response to user input. The 2D image may be stored locally or on a server of a network.
    Type: Grant
    Filed: July 7, 2015
    Date of Patent: June 11, 2019
    Assignee: zSpace, Inc.
    Inventors: Jonathan J. Hosenpud, Arthur L. Berman, Jerome C. Tu, Kevin D. Morishige, David A. Chavez
  • Patent number: 10019831
    Abstract: Systems and methods for incorporating real world conditions into a three-dimensional (3D) graphics object are described herein. In some embodiments, images of a physical location of a user of a three-dimensional (3D) display system may be received from at least one camera and a data imagery map of the physical location may be determined based at least in part on the received images. The data imagery map may capture real world conditions associated with the physical location of the user. Instructions to render a 3D graphics object may be generated and the data imagery map may be incorporated into a virtual 3D scene comprising the 3D graphics object, thereby incorporating the real world conditions into virtual world imagery. In some embodiments, the data imagery may include a light map, a sparse light field, and/or a depth map of the physical location.
    Type: Grant
    Filed: October 20, 2016
    Date of Patent: July 10, 2018
    Assignee: zSpace, Inc.
    Inventors: Clifford S. Champion, Jerome C. Tu
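
The 'data imagery map' in this abstract (also published as 20180114353 below) can be illustrated with a deliberately small sketch: camera images of the physical location are reduced to a coarse light map, which is then attached to the virtual scene so the renderer can modulate the 3D graphics object's shading. The grid resolution, the dictionary-based scene, and every function name are invented for the example; the sparse light field and depth map variants the abstract mentions are not shown.

    import numpy as np

    def build_light_map(camera_images, grid=(4, 8)):
        """Average brightness per angular cell around the viewer, as a stand-in
        for the 'data imagery map' built from images of the physical location."""
        h_cells, w_cells = grid
        light_map = np.zeros(grid)
        for img in camera_images:                      # img: HxWx3 float array in [0, 1]
            brightness = img.mean(axis=2)
            for i, row in enumerate(np.array_split(brightness, h_cells, axis=0)):
                for j, cell in enumerate(np.array_split(row, w_cells, axis=1)):
                    light_map[i, j] += cell.mean()
        return light_map / max(len(camera_images), 1)

    def apply_to_scene(scene, light_map):
        """Fold the real-world lighting estimate into the virtual 3D scene."""
        scene["environment_light"] = light_map
        return scene

    images = [np.random.rand(480, 640, 3)]             # placeholder for real camera frames
    scene = apply_to_scene({"objects": ["3d_graphics_object"]}, build_light_map(images))
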
  • Patent number: 10019849
    Abstract: Systems and methods for interacting with a display system using a personal electronic device (PED). The display system may establish communication with and receive user input from the PED. The display system may use the received user input to generate and/or update content displayed on a display of the display system.
    Type: Grant
    Filed: July 29, 2016
    Date of Patent: July 10, 2018
    Assignee: zSpace, Inc.
    Inventors: Arthur L. Berman, Clifford S. Champion, David A. Chavez, Francisco Lopez-Fresquet, Jonathan J. Hosenpud, Robert D. Kalnins, Alexandre R. Lelievre, Christopher W. Sherman, Jerome C. Tu, Kevin S. Yamada, Chun Wun Yeung
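
A minimal sketch of the interaction loop this abstract describes (also published as 20180033211 below): the display system accepts a connection from the personal electronic device (PED), reads user-input messages, and updates the displayed content. The JSON message format, the port number, and the render call are all assumptions made for illustration.

    import json
    import socket

    def run_display_input_loop(host="0.0.0.0", port=9000):
        """Accept a PED connection and apply its user input to the displayed content."""
        content = {"rotation_deg": 0.0, "zoom": 1.0}
        with socket.create_server((host, port)) as server:
            conn, _ = server.accept()
            with conn, conn.makefile("r") as stream:
                for line in stream:                    # one JSON message per line
                    event = json.loads(line)
                    if event.get("type") == "rotate":
                        content["rotation_deg"] += event.get("delta", 0.0)
                    elif event.get("type") == "zoom":
                        content["zoom"] *= event.get("factor", 1.0)
                    render(content)

    def render(content):
        """Stand-in for regenerating/updating the content shown on the display."""
        print("updated display content:", content)
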
  • Publication number: 20180114353
    Abstract: Systems and methods for incorporating real world conditions into a three-dimensional (3D) graphics object are described herein. In some embodiments, images of a physical location of a user of a three-dimensional (3D) display system may be received from at least one camera and a data imagery map of the physical location may be determined based at least in part on the received images. The data imagery map may capture real world conditions associated with the physical location of the user. Instructions to render a 3D graphics object may be generated and the data imagery map may be incorporated into a virtual 3D scene comprising the 3D graphics object, thereby incorporating the real world conditions into virtual world imagery. In some embodiments, the data imagery may include a light map, a sparse light field, and/or a depth map of the physical location.
    Type: Application
    Filed: October 20, 2016
    Publication date: April 26, 2018
    Inventors: Clifford S. Champion, Jerome C. Tu
  • Publication number: 20180033211
    Abstract: Systems and methods for interacting with a display system using a personal electronic device (PED). The display system may establish communication with and receive user input from the PED. The display system may use the received user input to generate and/or update content displayed on a display of the display system.
    Type: Application
    Filed: July 29, 2016
    Publication date: February 1, 2018
    Inventors: Arthur L. Berman, Clifford S. Champion, David A. Chavez, Francisco Lopez-Fresquet, Jonathan J. Hosenpud, Robert D. Kalnins, Alexandre R. Lelievre, Christopher W. Sherman, Jerome C. Tu, Kevin S. Yamada, Chun Wun Yeung
  • Patent number: 9848184
    Abstract: Systems and methods for a head tracked stereoscopic display system that uses light field type data may include receiving light field type data corresponding to a scene. The stereoscopic display system may track a user's head. Using the received light field type data and the head tracking, the system may generate three dimensional (3D) virtual content that corresponds to a virtual representation of the scene. The stereoscopic display system may then present the 3D virtual content to a user. The stereoscopic display system may present a left eye perspective image and a right eye perspective image of the scene to the user based on the position and orientation of the user's head. The images presented to the user may be updated based on a change in the position or the orientation of the user's head or based on receiving user input.
    Type: Grant
    Filed: January 16, 2017
    Date of Patent: December 19, 2017
    Assignee: zSpace, Inc.
    Inventors: David A. Chavez, Bruce J. Bell, Alexandre R. Lelievre, Jerome C. Tu, Christopher W. Sherman, Robert D. Kalnins, Jonathan J. Hosenpud, Francisco Lopez-Fresquet, Clifford S. Champion, Arthur L. Berman
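
This abstract is shared with publication 20170127051 and patent 9549174 below; it describes generating per-eye images from light field type data based on the tracked head pose. The sketch below is one hypothetical reduction of that idea: derive left and right eye positions from the head pose, then pick the nearest captured light-field view for each eye. The interpupillary-distance constant and the nearest-neighbour selection (rather than interpolation) are simplifications, not the patented method.

    import numpy as np

    EYE_SEPARATION_M = 0.063   # assumed interpupillary distance; not from the patent

    def eye_positions(head_position, head_right_vector):
        """Derive left/right eye positions from the tracked head pose."""
        head = np.asarray(head_position, dtype=float)
        offset = 0.5 * EYE_SEPARATION_M * np.asarray(head_right_vector, dtype=float)
        return head - offset, head + offset

    def nearest_view(light_field_views, eye_position):
        """Pick the captured light-field sample closest to an eye position."""
        positions = np.array([v["position"] for v in light_field_views])
        idx = int(np.argmin(np.linalg.norm(positions - eye_position, axis=1)))
        return light_field_views[idx]["image"]

    def render_stereo(light_field_views, head_position, head_right_vector):
        """Return the (left, right) perspective images for the current head pose.
        light_field_views: list of {"position": [x, y, z], "image": ...} samples."""
        left_eye, right_eye = eye_positions(head_position, head_right_vector)
        return (nearest_view(light_field_views, left_eye),
                nearest_view(light_field_views, right_eye))
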
  • Patent number: 9841821
    Abstract: In some embodiments, a system and/or method may assess handedness of a user of a system in an automated manner. The method may include displaying a 3D image on a display. The 3D image may include at least one object. The method may include tracking a position and an orientation of an input device in open space in relation to the 3D image. The method may include assessing a handedness of a user based on the position and the orientation of the input device with respect to at least one of the objects. In some embodiments, the method may include configuring at least a portion of the 3D image based upon the assessed handedness. The at least a portion of the 3D image may include interactive menus. In some embodiments, the method may include configuring at least a portion of an interactive hardware associated with the system based upon the assessed handedness.
    Type: Grant
    Filed: November 6, 2013
    Date of Patent: December 12, 2017
    Assignee: zSpace, Inc.
    Inventors: Jerome C. Tu, Carola F. Thompson, Mark F. Flynn, Douglas C. Twilleager, David A. Chavez, Kevin D. Morishige, Peter F. Ullmann, Arthur L. Berman
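
The automated handedness assessment in this abstract can be illustrated with a deliberately simple heuristic: observe which side of a displayed object the tracked input device tends to occupy. The thresholds, the sample format, and the menu-placement rule below are invented for the sketch and are not taken from the patent.

    import numpy as np

    def assess_handedness(stylus_samples, object_position):
        """Guess handedness from where the input device sits relative to an
        object in the 3D image: mostly to the object's right suggests a
        right-handed user. Thresholds are arbitrary illustrations."""
        x_offsets = np.asarray(stylus_samples, dtype=float)[:, 0] - object_position[0]
        right_fraction = float(np.mean(x_offsets > 0))
        if right_fraction > 0.6:
            return "right"
        if right_fraction < 0.4:
            return "left"
        return "undetermined"

    def configure_ui(handedness):
        """Place interactive menus on the side opposite the dominant hand so
        the stylus does not occlude them."""
        return {"menu_side": "left" if handedness == "right" else "right"}

    samples = [(0.12, 0.0, 0.3), (0.08, 0.1, 0.2), (0.15, -0.05, 0.25)]
    print(configure_ui(assess_handedness(samples, object_position=(0.0, 0.0, 0.0))))
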
  • Patent number: 9706191
    Abstract: In some embodiments, a system for tracking with reference to a three-dimensional display system may include a display device, an image processor, a surface including at least three emitters, at least two sensors, and a processor. The display device may image, during use, a first stereo three-dimensional image. The surface may be positionable, during use, with reference to the display device. At least two of the sensors may detect, during use, light received from at least three of the emitters as light blobs. The processor may correlate, during use, the assessed referenced position of the detected light blobs such that a first position/orientation of the surface is assessed. The image processor may generate, during use, the first stereo three-dimensional image using the assessed first position/orientation of the surface with reference to the display.
    Type: Grant
    Filed: June 20, 2016
    Date of Patent: July 11, 2017
    Assignee: zSpace, Inc.
    Inventors: Jerome C. Tu, David A. Chavez
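
This abstract is repeated for patent 9473763 and publication 20160295203 below. It describes recovering a surface's position/orientation from the light blobs that two sensors detect from at least three emitters. One conventional way to realize the final step is a rigid-body fit; the sketch below assumes the blobs have already been triangulated into 3D points and applies a standard SVD (Kabsch) fit, which is an illustrative stand-in rather than the patented correlation method.

    import numpy as np

    def rigid_pose_from_blobs(emitter_layout, blob_positions):
        """Fit the rotation and translation that map the known emitter layout
        (in surface coordinates) onto the emitter positions recovered from the
        sensors' light blobs, giving the surface's position/orientation."""
        src = np.asarray(emitter_layout, dtype=float)    # at least three emitters
        dst = np.asarray(blob_positions, dtype=float)
        src_c, dst_c = src - src.mean(axis=0), dst - dst.mean(axis=0)
        u, _, vt = np.linalg.svd(src_c.T @ dst_c)
        rotation = vt.T @ u.T
        if np.linalg.det(rotation) < 0:                  # guard against reflections
            vt[-1] *= -1
            rotation = vt.T @ u.T
        translation = dst.mean(axis=0) - rotation @ src.mean(axis=0)
        return rotation, translation

    layout   = [(0.0, 0.0, 0.0), (0.1, 0.0, 0.0), (0.0, 0.1, 0.0)]   # emitters on the surface
    observed = [(0.5, 0.2, 0.3), (0.6, 0.2, 0.3), (0.5, 0.3, 0.3)]   # triangulated blob positions
    R, t = rigid_pose_from_blobs(layout, observed)       # expect R near identity, t near (0.5, 0.2, 0.3)
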
  • Patent number: 9703400
    Abstract: Virtual plane and use in a stylus based three dimensional (3D) stereoscopic display system. A virtual plane may be displayed in a virtual 3D space on a display of the 3D stereoscopic display system. The virtual plane may extend from a stylus of the 3D stereoscopic display system. Content may be generated in response to a geometric relationship of the virtual plane with at least one virtual object in the virtual 3D space. The generated content may indicate one or more attributes of the at least one virtual object. The content may be presented via the 3D stereoscopic display system.
    Type: Grant
    Filed: October 9, 2015
    Date of Patent: July 11, 2017
    Assignee: zSpace, Inc.
    Inventors: Jonathan J. Hosenpud, Arthur L. Berman, Clifford S. Champion, David A. Chavez, Francisco Lopez-Fresquet, Robert D. Kalnins, Alexandre R. Lelievre, Christopher W. Sherman, Jerome C. Tu, Murugappan R. Venkat
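
The virtual-plane interaction in this abstract (also published as 20170102791 below) lends itself to a short geometric sketch: define a plane extending from the stylus pose, test which virtual objects it passes through, and generate content describing them. The plane construction, the thickness tolerance, and the object records below are assumptions made for the example.

    import numpy as np

    def stylus_plane(tip_position, stylus_forward, stylus_up):
        """Define a virtual plane extending from the stylus: it contains the
        tip and is spanned by the stylus's forward and up directions."""
        normal = np.cross(np.asarray(stylus_forward, dtype=float), np.asarray(stylus_up, dtype=float))
        return np.asarray(tip_position, dtype=float), normal / np.linalg.norm(normal)

    def annotate_intersections(plane_point, plane_normal, objects, thickness=0.02):
        """Generate content for each virtual object the plane cuts through:
        the object's name and its signed offset from the plane."""
        content = []
        for obj in objects:                    # obj: {"name", "center", "radius"}
            offset = float(np.dot(np.asarray(obj["center"]) - plane_point, plane_normal))
            if abs(offset) <= thickness + obj.get("radius", 0.0):
                content.append(f'{obj["name"]}: cut by plane (offset {offset:+.3f} m)')
        return content

    objects = [{"name": "heart_model", "center": [0.0, 0.05, 0.1], "radius": 0.05}]
    point, normal = stylus_plane([0.0, 0.0, 0.0], stylus_forward=[0, 0, 1], stylus_up=[0, 1, 0])
    print(annotate_intersections(point, normal, objects))
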
  • Publication number: 20170127051
    Abstract: Systems and methods for a head tracked stereoscopic display system that uses light field type data may include receiving light field type data corresponding to a scene. The stereoscopic display system may track a user's head. Using the received light field type data and the head tracking, the system may generate three dimensional (3D) virtual content that corresponds to a virtual representation of the scene. The stereoscopic display system may then present the 3D virtual content to a user. The stereoscopic display system may present a left eye perspective image and a right eye perspective image of the scene to the user based on the position and orientation of the user's head. The images presented to the user may be updated based on a change in the position or the orientation of the user's head or based on receiving user input.
    Type: Application
    Filed: January 16, 2017
    Publication date: May 4, 2017
    Inventors: David A. Chavez, Bruce J. Bell, Alexandre R. Lelievre, Jerome C. Tu, Christopher W. Sherman, Robert D. Kalnins, Jonathan J. Hosenpud, Francisco Lopez-Fresquet, Clifford S. Champion, Arthur L. Berman
  • Publication number: 20170102791
    Abstract: Virtual plane and use in a stylus based three dimensional (3D) stereoscopic display system. A virtual plane may be displayed in a virtual 3D space on a display of the 3D stereoscopic display system. The virtual plane may extend from a stylus of the 3D stereoscopic display system. Content may be generated in response to a geometric relationship of the virtual plane with at least one virtual object in the virtual 3D space. The generated content may indicate one or more attributes of the at least one virtual object. The content may be presented via the 3D stereoscopic display system.
    Type: Application
    Filed: October 9, 2015
    Publication date: April 13, 2017
    Inventors: Jonathan J. Hosenpud, Arthur L. Berman, Clifford S. Champion, David A. Chavez, Francisco Lopez-Fresquet, Robert D. Kalnins, Alexandre R. Lelievre, Christopher W. Sherman, Jerome C. Tu, Murugappan R. Venkat
  • Patent number: 9549174
    Abstract: Systems and methods for a head tracked stereoscopic display system that uses light field type data may include receiving light field type data corresponding to a scene. The stereoscopic display system may track a user's head. Using the received light field type data and the head tracking, the system may generate three dimensional (3D) virtual content that corresponds to a virtual representation of the scene. The stereoscopic display system may then present the 3D virtual content to a user. The stereoscopic display system may present a left eye perspective image and a right eye perspective image of the scene to the user based on the position and orientation of the user's head. The images presented to the user may be updated based on a change in the position or the orientation of the user's head or based on receiving user input.
    Type: Grant
    Filed: October 14, 2015
    Date of Patent: January 17, 2017
    Assignee: zSpace, Inc.
    Inventors: David A. Chavez, Bruce J. Bell, Alexandre R. Lelievre, Jerome C. Tu, Christopher W. Sherman, Robert D. Kalnins, Jonathan J. Hosenpud, Francisco Lopez-Fresquet, Clifford S. Champion, Arthur L. Berman
  • Patent number: 9473763
    Abstract: In some embodiments, a system for tracking with reference to a three-dimensional display system may include a display device, an image processor, a surface including at least three emitters, at least two sensors, and a processor. The display device may image, during use, a first stereo three-dimensional image. The surface may be positionable, during use, with reference to the display device. At least two of the sensors may detect, during use, light received from at least three of the emitters as light blobs. The processor may correlate, during use, the assessed referenced position of the detected light blobs such that a first position/orientation of the surface is assessed. The image processor may generate, during use, the first stereo three-dimensional image using the assessed first position/orientation of the surface with reference to the display.
    Type: Grant
    Filed: August 10, 2015
    Date of Patent: October 18, 2016
    Assignee: zSpace, Inc.
    Inventors: Jerome C. Tu, David A. Chavez
  • Patent number: 9467685
    Abstract: Systems and methods for calibrating a three dimensional (3D) stereoscopic display system may include rendering a virtual model on a display of a 3D stereoscopic display system that may include a substantially horizontal display. The virtual model may be geometrically similar to a physical object placed at a location on the display. A vertex of the virtual model may be adjusted in response to user input. The adjustment may be such that the vertex of the virtual model is substantially coincident with a corresponding vertex of the physical object.
    Type: Grant
    Filed: August 27, 2015
    Date of Patent: October 11, 2016
    Assignee: zSpace, Inc.
    Inventors: David A. Chavez, Arthur L. Berman, Jerome C. Tu, Kevin D. Morishige
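
A minimal sketch of the calibration flow this abstract describes: the user nudges a vertex of the rendered virtual model until it coincides with the matching vertex of the physical object placed on the display, and the accumulated corrections are folded into a display-space offset for later renders. Collapsing the corrections into a single averaged offset is a simplification; the data shapes and function names are invented.

    import numpy as np

    def calibrate_vertex(virtual_vertex, nudges):
        """Apply the user's incremental adjustments to one vertex of the
        virtual model until it is reported coincident with the corresponding
        vertex of the physical object."""
        vertex = np.asarray(virtual_vertex, dtype=float)
        for delta in nudges:                   # each nudge is a small (dx, dy, dz)
            vertex = vertex + np.asarray(delta, dtype=float)
        return vertex

    def calibration_offset(original_vertices, calibrated_vertices):
        """Average the per-vertex corrections into a single display-space
        offset to apply to future renders (a very simplified correction model)."""
        diffs = np.asarray(calibrated_vertices) - np.asarray(original_vertices)
        return diffs.mean(axis=0)

    original = [(0.0, 0.0, 0.0), (0.1, 0.0, 0.0)]
    calibrated = [calibrate_vertex(v, nudges=[(0.002, -0.001, 0.0)]) for v in original]
    print(calibration_offset(original, calibrated))
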
  • Publication number: 20160295203
    Abstract: In some embodiments, a system for tracking with reference to a three-dimensional display system may include a display device, an image processor, a surface including at least three emitters, at least two sensors, and a processor. The display device may image, during use, a first stereo three-dimensional image. The surface may be positionable, during use, with reference to the display device. At least two of the sensors may detect, during use, light received from at least three of the emitters as light blobs. The processor may correlate, during use, the assessed referenced position of the detected light blobs such that a first position/orientation of the surface is assessed. The image processor may generate, during use, the first stereo three-dimensional image using the assessed first position/orientation of the surface with reference to the display.
    Type: Application
    Filed: June 20, 2016
    Publication date: October 6, 2016
    Inventors: Jerome C. Tu, David A. Chavez
  • Patent number: 9342917
    Abstract: In some embodiments, a system and/or method may include accessing three-dimensional (3D) imaging software on a remote server. The method may include accessing over a network a 3D imaging software package on a remote server using a first system. The method may include assessing, using the remote server, a capability of the first system to execute the 3D imaging software package. The method may include displaying an output of the 3D imaging software using the first system based upon the assessed capabilities of the first system. In some embodiments, the method may include executing a first portion of the 3D imaging software using the remote server based upon the assessed capabilities of the first system. In some embodiments, the method may include executing a second portion of the 3D imaging software using the first system based upon the assessed capabilities of the first system.
    Type: Grant
    Filed: August 27, 2015
    Date of Patent: May 17, 2016
    Assignee: zSpace, Inc.
    Inventors: David A. Chavez, Jerome C. Tu, Carola F. Thompson, Mark F. Flynn, Douglas C. Twilleager, Kevin D. Morishige, Peter F. Ullmann, Arthur L. Berman
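
The capability assessment and split execution in this abstract (granted again as patent 9286713 below) can be outlined with a small decision function: the server scores whether the first system can run the 3D imaging package locally and plans which portion executes where. The spec fields, the thresholds, and the render/display split below are assumptions for illustration only.

    def assess_capability(client_specs, requirements):
        """Server-side check of whether the first system can execute the
        3D imaging software package locally (fields and thresholds invented)."""
        return (client_specs.get("gpu_memory_gb", 0) >= requirements["gpu_memory_gb"]
                and client_specs.get("supports_stereo", False))

    def plan_execution(client_specs, requirements):
        """Run the heavy portion wherever the capability check allows,
        always keeping display output on the first system."""
        if assess_capability(client_specs, requirements):
            return {"render": "client", "display": "client"}
        return {"render": "server", "display": "client"}   # server executes and streams frames

    print(plan_execution({"gpu_memory_gb": 2, "supports_stereo": False},
                         {"gpu_memory_gb": 4}))
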
  • Patent number: 9286713
    Abstract: In some embodiments, a system and/or method may include accessing three-dimensional (3D) imaging software on a remote server. The method may include accessing over a network a 3D imaging software package on a remote server using a first system. The method may include assessing, using the remote server, a capability of the first system to execute the 3D imaging software package. The method may include displaying an output of the 3D imaging software using the first system based upon the assessed capabilities of the first system. In some embodiments, the method may include executing a first portion of the 3D imaging software using the remote server based upon the assessed capabilities of the first system. In some embodiments, the method may include executing a second portion of the 3D imaging software using the first system based upon the assessed capabilities of the first system.
    Type: Grant
    Filed: August 27, 2015
    Date of Patent: March 15, 2016
    Assignee: zSpace, Inc.
    Inventors: David A. Chavez, Jerome C. Tu, Carola F. Thompson, Mark F. Flynn, Douglas C. Twilleager, Kevin D. Morishige, Peter F. Ullmann, Arthur L. Berman