Patents by Inventor Derek Jung
Derek Jung has filed for patents to protect the following inventions. This listing includes both pending patent applications and patents already granted by the United States Patent and Trademark Office (USPTO).
-
Publication number: 20240132452
Abstract: The present invention relates to diarylhydantoin compounds, including diarylthiohydantoins, and methods for synthesizing them and using them in the treatment of hormone refractory prostate cancer.
Type: Application
Filed: June 13, 2023
Publication date: April 25, 2024
Applicant: The Regents of the University of California
Inventors: Charles L. Sawyers, Michael Jung, Charlie D. Chen, Samedy Ouk, Derek Welsbie, Chris Tran, John Wongvipat, Dongwon Yoo
-
Publication number: 20240033923
Abstract: A collaborative safety awareness system for robotic applications enabled by augmented reality (AR). One or more robots in a work cell are in communication with an application running on an AR device worn or held by an operator in the work cell. The AR device may be a headset apparatus, or a tablet or teach pendant device. A dynamic safety zone is created around the operator and the location of the dynamic safety zone is continuously updated by the robot controller based on the position of the AR device provided by the AR application, where the position of the AR device is determined using inertial sensors and visual odometry. The robot controller prohibits motion of the robot into the dynamic safety zone, and slows or reroutes the robot as needed to prevent an interference condition between the robot and the dynamic safety zone.
Type: Application
Filed: July 14, 2023
Publication date: February 1, 2024
Inventor: Derek Jung
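The dynamic safety zone behavior described in this abstract can be illustrated with a minimal sketch. This is not the patented implementation: the spherical zone shape, the linear slow-down ramp, and all names here are assumptions for illustration only.

```python
import math

def update_safety_zone(ar_device_pos, radius=1.0):
    """Return a spherical dynamic safety zone centered on the AR device.

    ar_device_pos: (x, y, z) of the AR device in the robot base frame,
    as estimated from inertial sensors and visual odometry.
    """
    return {"center": ar_device_pos, "radius": radius}

def robot_speed_scale(zone, tcp_pos, slow_margin=0.5):
    """Speed scale factor for the robot at tool position tcp_pos:
    0.0 inside the zone (motion prohibited), ramping linearly back to
    1.0 beyond the zone radius plus a slow-down margin."""
    dx = tcp_pos[0] - zone["center"][0]
    dy = tcp_pos[1] - zone["center"][1]
    dz = tcp_pos[2] - zone["center"][2]
    dist = math.sqrt(dx * dx + dy * dy + dz * dz)
    if dist <= zone["radius"]:
        return 0.0
    if dist >= zone["radius"] + slow_margin:
        return 1.0
    return (dist - zone["radius"]) / slow_margin
```

As the AR application streams new device positions, the controller would re-center the zone each cycle and apply the resulting scale factor to the robot's motion.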
-
Patent number: 11850755
Abstract: An augmented reality (AR) system for visualizing and modifying robot operational zones. The system includes an AR device such as a headset in communication with a robot controller. The AR device includes software for the AR display and modification of the operational zones. The AR device is registered with the robot coordinate frame via detection of a visual marker. The AR device displays operational zones overlaid on real world images of the robot and existing fixtures, where the display is updated as the user moves around the robot work cell. Control points on the virtual operational zones are displayed and allow the user to reshape the operational zones. The robot can be operated during the AR session, running the robot's programmed motion and evaluating the operational zones. Zone violations are highlighted in the AR display. When zone definition is complete, the finalized operational zones are uploaded to the robot controller.
Type: Grant
Filed: June 26, 2019
Date of Patent: December 26, 2023
Assignee: FANUC AMERICA CORPORATION
Inventors: Derek Jung, Bruce Coldren, Sam Yung-Sen Lee, Leo Keselman, Kenneth W. Krause
-
Patent number: 11752632
Abstract: A system and method for calibrating the position of a machine having a stationary part to a stationary marker. The process first images the machine and the marker, and then identifies a visible part of the machine in the images that has been 3D modeled. The process then calculates a location of the stationary part of the machine using the modeled position of the visible part and the known kinematics and position of the machine. The process then identifies the stationary marker in the images, and establishes a relationship between the stationary marker and the stationary part of the machine, which can be used for calibration purposes. In one embodiment, the machine is a robot and the process is performed by an AR application.
Type: Grant
Filed: April 30, 2020
Date of Patent: September 12, 2023
Assignee: FANUC AMERICA CORPORATION
Inventors: Leo Keselman, Derek Jung, Kenneth W. Krause
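The relationship this abstract describes can be sketched as a chain of rigid transforms: the camera observes both the marker and a modeled visible part, and the machine's kinematics relate that part to the machine base. This is an illustrative sketch only; the function names and the pure-Python 4x4 matrix helpers are assumptions, not the patented method.

```python
def mat_mul(a, b):
    """Multiply two 4x4 homogeneous transform matrices (nested lists)."""
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def invert_rigid(t):
    """Invert a 4x4 rigid transform: transpose the rotation block and
    rotate/negate the translation."""
    r = [[t[j][i] for j in range(3)] for i in range(3)]
    p = [-sum(r[i][j] * t[j][3] for j in range(3)) for i in range(3)]
    return [r[0] + [p[0]], r[1] + [p[1]], r[2] + [p[2]], [0, 0, 0, 1]]

def marker_to_base(cam_T_marker, cam_T_part, base_T_part):
    """Relate the stationary marker to the machine base:

        base_T_marker = base_T_part * inv(cam_T_part) * cam_T_marker

    where cam_T_part comes from fitting the 3D model of the visible part
    in the images, and base_T_part from the machine's known kinematics."""
    part_T_cam = invert_rigid(cam_T_part)
    return mat_mul(base_T_part, mat_mul(part_T_cam, cam_T_marker))
```

Once `base_T_marker` is established, any later observation of the marker alone suffices to localize a device relative to the machine base.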
-
Patent number: 11472035
Abstract: An augmented reality (AR) system for production-tuning of parameters for a visual tracking robotic picking system. The robotic picking system includes one or more robots configured to pick randomly-placed and randomly-oriented parts off a conveyor belt and place the parts in an available position, either on a second moving conveyor belt or on a stationary device such as a pallet. A visual tracking system identifies position and orientation of the parts on the feed conveyor. The AR system allows picking system tuning parameters including upstream, discard and downstream boundary locations to be visualized and controlled, real-time robot pick/place operations to be viewed with virtual boundaries, and system performance parameters such as part throughput rate and part allocation by robot to be viewed. The AR system also allows virtual parts to be used in simulations, either instead of or in addition to real parts.
Type: Grant
Filed: June 26, 2019
Date of Patent: October 18, 2022
Assignee: FANUC AMERICA CORPORATION
Inventors: Ganesh Kalbavi, Derek Jung, Leo Keselman, Min-Ren Jean, Kenneth W. Krause, Jason Tsai
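The role of the upstream, discard, and downstream boundaries mentioned above can be illustrated with a small classifier over part positions. This is a hypothetical sketch: the boundary semantics, conveyor direction, and labels below are assumptions for illustration, not the tuning logic claimed in the patent.

```python
def classify_part(part_x, upstream, discard, downstream):
    """Classify a tracked part by its position along the feed conveyor
    (flowing in +x), relative to three tuning boundaries:

    - before the upstream boundary: not yet allocatable to a robot
    - between upstream and discard: eligible to be picked
    - between discard and downstream: no longer allocated (to be discarded)
    - past the downstream boundary: left the tracked region entirely
    """
    if part_x < upstream:
        return "pending"
    if part_x < discard:
        return "pickable"
    if part_x < downstream:
        return "discard"
    return "missed"
```

Visualizing these boundaries in AR lets an operator drag them and immediately see how the pickable window (and hence throughput and per-robot allocation) changes.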
-
Patent number: 11396100
Abstract: A method and system for calibration of an augmented reality (AR) device's position and orientation based on a robot's positional configuration. A conventional visual calibration target is not required for AR device calibration. Instead, the robot itself, in any pose, is used as a three dimensional (3D) calibration target. The AR system is provided with a CAD model of the entire robot to use as a reference frame, and 3D models of the individual robot arms are combined into a single object model based on joint positions known from the robot controller. The 3D surface model of the entire robot in the current pose is then used for visual calibration of the AR system by analyzing images from the AR device camera in comparison to the surface model of the robot in the current pose. The technique is applicable to initial AR device calibration and to ongoing device tracking.
Type: Grant
Filed: September 10, 2019
Date of Patent: July 26, 2022
Assignee: FANUC AMERICA CORPORATION
Inventors: Kenneth W. Krause, Derek Jung, Leo Keselman
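The key step above, combining per-link models into a single object model from known joint positions, can be sketched as forward kinematics followed by merging transformed link geometry. The sketch below is planar (2D) for brevity and every name in it is illustrative; the actual method uses the full 3D CAD surface model.

```python
import math

def link_transforms(joint_angles, link_lengths):
    """Planar forward kinematics (an illustrative stand-in for full 3D FK):
    returns the base-frame pose (x, y, theta) of each link's origin."""
    x = y = theta = 0.0
    poses = []
    for angle, length in zip(joint_angles, link_lengths):
        theta += angle
        poses.append((x, y, theta))
        x += length * math.cos(theta)
        y += length * math.sin(theta)
    return poses

def assemble_robot_model(link_points, poses):
    """Transform each link's model points into the base frame and merge
    them into one point set: the whole robot in its current pose, usable
    as a single calibration target."""
    merged = []
    for pts, (x, y, theta) in zip(link_points, poses):
        c, s = math.cos(theta), math.sin(theta)
        for px, py in pts:
            merged.append((x + c * px - s * py, y + s * px + c * py))
    return merged
```

The merged model is what the AR system would then register against camera images to recover the device pose, with no dedicated calibration target in the cell.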
-
Publication number: 20200346350
Abstract: A system and method for calibrating the position of a machine having a stationary part to a stationary marker. The process first images the machine and the marker, and then identifies a visible part of the machine in the images that has been 3D modeled. The process then calculates a location of the stationary part of the machine using the modeled position of the visible part and the known kinematics and position of the machine. The process then identifies the stationary marker in the images, and establishes a relationship between the stationary marker and the stationary part of the machine, which can be used for calibration purposes. In one embodiment, the machine is a robot and the process is performed by an AR application.
Type: Application
Filed: April 30, 2020
Publication date: November 5, 2020
Inventors: Leo Keselman, Derek Jung, Kenneth W. Krause
-
Publication number: 20200349737
Abstract: A system and method for setting up an AR application that uses a plurality of markers so that accurate augmentations can be displayed anywhere a marker is visible. The method includes placing a plurality of markers throughout the workspace so that a plurality of pairs of two adjacent markers can be viewed in a field-of-view of an AR device. The method further includes determining a distance relationship between the two markers in all of the pairs of markers, and determining a distance relationship between all non-adjacent markers using the distance relationship between the two markers in all of the pairs of markers. The method also includes identifying a distance relationship between one of the plurality of markers and an augmentation in the workspace, and identifying a distance relationship between the other markers and the augmentation using the distance relationships between the adjacent markers and the non-adjacent markers.
Type: Application
Filed: April 30, 2020
Publication date: November 5, 2020
Inventors: Leo Keselman, Derek Jung, Kenneth W. Krause
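The chaining idea in this abstract, deriving non-adjacent marker relationships from measured adjacent pairs, can be sketched with simple positional offsets. This is an illustrative simplification (offsets along a single chain of markers, translation only, no rotation); the names are assumptions, not the claimed method.

```python
def chain_offsets(adjacent_offsets):
    """Given measured offsets between adjacent marker pairs
    (marker i -> marker i+1), accumulate them into the offset from
    marker 0 to every marker. Any non-adjacent pair then follows as the
    difference of two accumulated offsets."""
    offsets = [(0.0, 0.0, 0.0)]
    for dx, dy, dz in adjacent_offsets:
        x, y, z = offsets[-1]
        offsets.append((x + dx, y + dy, z + dz))
    return offsets

def augmentation_from_marker(offsets, marker_index, aug_from_marker0):
    """Position of an augmentation relative to marker `marker_index`,
    given its known position relative to marker 0."""
    mx, my, mz = offsets[marker_index]
    ax, ay, az = aug_from_marker0
    return (aug_from_marker0[0] - mx, ay - my, az - mz)
```

With these relationships precomputed, whichever marker happens to be in the AR device's field of view can anchor the same augmentation at the same workspace location.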
-
Publication number: 20200078948
Abstract: A method and system for calibration of an augmented reality (AR) device's position and orientation based on a robot's positional configuration. A conventional visual calibration target is not required for AR device calibration. Instead, the robot itself, in any pose, is used as a three dimensional (3D) calibration target. The AR system is provided with a CAD model of the entire robot to use as a reference frame, and 3D models of the individual robot arms are combined into a single object model based on joint positions known from the robot controller. The 3D surface model of the entire robot in the current pose is then used for visual calibration of the AR system by analyzing images from the AR device camera in comparison to the surface model of the robot in the current pose. The technique is applicable to initial AR device calibration and to ongoing device tracking.
Type: Application
Filed: September 10, 2019
Publication date: March 12, 2020
Inventors: Kenneth W. Krause, Derek Jung, Leo Keselman
-
Publication number: 20190389066
Abstract: An augmented reality (AR) system for visualizing and modifying robot operational zones. The system includes an AR device such as a headset in communication with a robot controller. The AR device includes software for the AR display and modification of the operational zones. The AR device is registered with the robot coordinate frame via detection of a visual marker. The AR device displays operational zones overlaid on real world images of the robot and existing fixtures, where the display is updated as the user moves around the robot work cell. Control points on the virtual operational zones are displayed and allow the user to reshape the operational zones. The robot can be operated during the AR session, running the robot's programmed motion and evaluating the operational zones. Zone violations are highlighted in the AR display. When zone definition is complete, the finalized operational zones are uploaded to the robot controller.
Type: Application
Filed: June 26, 2019
Publication date: December 26, 2019
Inventors: Derek Jung, Bruce Coldren, Sam Yung-Sen Lee, Leo Keselman, Kenneth W. Krause
-
Publication number: 20190389069
Abstract: An augmented reality (AR) system for production-tuning of parameters for a visual tracking robotic picking system. The robotic picking system includes one or more robots configured to pick randomly-placed and randomly-oriented parts off a conveyor belt and place the parts in an available position, either on a second moving conveyor belt or on a stationary device such as a pallet. A visual tracking system identifies position and orientation of the parts on the feed conveyor. The AR system allows picking system tuning parameters including upstream, discard and downstream boundary locations to be visualized and controlled, real-time robot pick/place operations to be viewed with virtual boundaries, and system performance parameters such as part throughput rate and part allocation by robot to be viewed. The AR system also allows virtual parts to be used in simulations, either instead of or in addition to real parts.
Type: Application
Filed: June 26, 2019
Publication date: December 26, 2019
Inventors: Ganesh Kalbavi, Derek Jung, Leo Keselman, Min-Ren Jean, Kenneth W. Krause, Jason Tsai