Patents by Inventor Leo Keselman
Leo Keselman has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 11850755
Abstract: An augmented reality (AR) system for visualizing and modifying robot operational zones. The system includes an AR device such as a headset in communication with a robot controller. The AR device includes software for the AR display and modification of the operational zones. The AR device is registered with the robot coordinate frame via detection of a visual marker. The AR device displays operational zones overlaid on real world images of the robot and existing fixtures, where the display is updated as the user moves around the robot work cell. Control points on the virtual operational zones are displayed and allow the user to reshape the operational zones. The robot can be operated during the AR session, running the robot's programmed motion and evaluating the operational zones. Zone violations are highlighted in the AR display. When zone definition is complete, the finalized operational zones are uploaded to the robot controller.
Type: Grant
Filed: June 26, 2019
Date of Patent: December 26, 2023
Assignee: FANUC AMERICA CORPORATION
Inventors: Derek Jung, Bruce Coldren, Sam Yung-Sen Lee, Leo Keselman, Kenneth W. Krause
-
Patent number: 11752632
Abstract: A system and method for calibrating the position of a machine having a stationary part to a stationary marker. The process first images the machine and the marker, and then identifies a visible part of the machine in the images that has been 3D modeled. The process then calculates a location of the stationary part of the machine using the modeled position of the visible part and the known kinematics and position of the machine. The process then identifies the stationary marker in the images, and establishes a relationship between the stationary marker and the stationary part of the machine, which can be used for calibration purposes. In one embodiment, the machine is a robot and the process is performed by an AR application.
Type: Grant
Filed: April 30, 2020
Date of Patent: September 12, 2023
Assignee: FANUC AMERICA CORPORATION
Inventors: Leo Keselman, Derek Jung, Kenneth W. Krause
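The core computation the abstract describes — locating the machine's stationary base from an observed, 3D-modeled part plus known kinematics — reduces to a chain of rigid transforms. A minimal sketch with hypothetical names (the patent does not specify an implementation):

```python
import numpy as np

def pose(R, t):
    """Build a 4x4 homogeneous transform from a rotation matrix and translation."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def base_in_camera(T_cam_part, T_base_part):
    """Locate the stationary base in camera coordinates.

    T_cam_part: observed pose of the visible, modeled part in the camera frame.
    T_base_part: pose of that part relative to the base, from the machine's
                 known kinematics and joint positions.
    """
    return T_cam_part @ np.linalg.inv(T_base_part)

def marker_to_base(T_cam_marker, T_cam_base):
    """Relate the stationary marker to the stationary base for calibration."""
    return np.linalg.inv(T_cam_marker) @ T_cam_base
```

Given a detected marker pose, `marker_to_base` yields the fixed marker-to-machine relationship the abstract says is established for calibration.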
-
Patent number: 11679418
Abstract: A work cell and method for automatically separating objects disposed in 3D clusters includes dispensing the objects onto a horizontal conveying surface to form a 2D array, reforming the 2D array into a 1D stream in which the objects move in single-file in a predefined moving direction, utilizing a vision-based or other stationary sensing system to identify a selected target object in the 1D stream as the target object passes through an image capture (sensing) region, calculating trajectory data defining the target object's time-based position in the 1D stream, and then utilizing the trajectory data to control a robot arm or other object removal mechanism such that only the selected object is forcibly removed (e.g., swiped or picked-up) from the horizontal conveying surface. A continuous-loop-type conveying mechanism includes two parallel conveyor-belt-type conveying structures and associated belt-switching structures. An AI-powered vision system identifies new object shapes during preliminary learning phases.
Type: Grant
Filed: April 16, 2021
Date of Patent: June 20, 2023
Assignee: RIOS Intelligent Machines, Inc.
Inventors: Christopher A. Paulson, Nicholas L. Choi, Christopher Lalau Keraly, Matthew E. Shaffer, Laura Stelzner, Leo Keselman, Anthony Canalungo, Clinton J. Smith
-
Publication number: 20230092690
Abstract: A robotic work cell uses an object separating mechanism to disperse bulk objects into a 2D arrangement on a horizontal surface and uses a vision system to generate pick-up (positional) data and rotational orientation data for each sequentially selected target object of the 2D arrangement. A pick-and-place robot mechanism uses the positional data to pick-up each target object and uses the rotational orientation data to reorientate the target object during transfer to a designated hand-off location. A carousel-type robotic end-tool disposed on a 4-axis object-processing robot mechanism rotates a gripper mechanism around a vertical axis to move the target object from the hand-off location to a designated processing location, where an associated processing device performs a desired process (e.g., label application) on the target object. In one embodiment the gripper mechanism is selectively rotatable around a horizontal axis to facilitate processing on opposing surfaces of the target object.
Type: Application
Filed: September 21, 2021
Publication date: March 23, 2023
Applicant: RIOS Intelligent Machines, Inc.
Inventors: Christopher Lalau Keraly, Christopher A. Paulson, Nicholas L. Choi, Laura L. Sullivan, Leo Keselman, Michael Benedict, Kent A. Evans, Scott M. Dellenbaugh
-
Patent number: 11542103
Abstract: An automated food production work cell includes a robotic system that utilizes a food-grade robotic gripper to transfer individual food items. The robotic gripper is constructed using food-grade materials and includes finger structures that are linearly movably connected by linear bearings to parallel guide rods and are independently driven by a non-contact actuating system to grasp targeted food items disposed on a first work surface, to hold the targeted food items while the robotic system moves the gripper to a second work surface, and to release the targeted food items onto the second work surface. Encoding and external sensing systems facilitate fully automated food transfer processes. Optional sensor arrays are disposed on tip portions of the finger structures to provide feedback data (e.g., grasping force/pressure). Two or more pairs of independently controlled finger structures are provided to facilitate the transfer of multiple food items during each transfer process.
Type: Grant
Filed: July 29, 2021
Date of Patent: January 3, 2023
Assignee: RIOS Intelligent Machines, Inc.
Inventors: Christopher A. Paulson, Nicholas L. Choi, Leo Keselman, Laura L. Sullivan, Kent A. Evans, Laura Stelzner, Clinton J. Smith
-
Publication number: 20220331840
Abstract: A work cell and method for automatically separating objects disposed in 3D clusters includes dispensing the objects onto a horizontal conveying surface to form a 2D array, reforming the 2D array into a 1D stream in which the objects move in single-file in a predefined moving direction, utilizing a vision-based or other stationary sensing system to identify a selected target object in the 1D stream as the target object passes through an image capture (sensing) region, calculating trajectory data defining the target object's time-based position in the 1D stream, and then utilizing the trajectory data to control a robot arm or other object removal mechanism such that only the selected object is forcibly removed (e.g., swiped or picked-up) from the horizontal conveying surface. A continuous-loop-type conveying mechanism includes two parallel conveyor-belt-type conveying structures and associated belt-switching structures. An AI-powered vision system identifies new object shapes during preliminary learning phases.
Type: Application
Filed: April 16, 2021
Publication date: October 20, 2022
Applicant: RIOS Intelligent Machines, Inc.
Inventors: Christopher A. Paulson, Nicholas L. Choi, Christopher Lalau Keraly, Matthew E. Shaffer, Laura Stelzner, Leo Keselman, Anthony Canalungo, Clinton J. Smith
-
Patent number: 11472035
Abstract: An augmented reality (AR) system for production-tuning of parameters for a visual tracking robotic picking system. The robotic picking system includes one or more robots configured to pick randomly-placed and randomly-oriented parts off a conveyor belt and place the parts in an available position, either on a second moving conveyor belt or on a stationary device such as a pallet. A visual tracking system identifies position and orientation of the parts on the feed conveyor. The AR system allows picking system tuning parameters including upstream, discard and downstream boundary locations to be visualized and controlled, real-time robot pick/place operations to be viewed with virtual boundaries, and system performance parameters such as part throughput rate and part allocation by robot to be viewed. The AR system also allows virtual parts to be used in simulations, either instead of or in addition to real parts.
Type: Grant
Filed: June 26, 2019
Date of Patent: October 18, 2022
Assignee: FANUC AMERICA CORPORATION
Inventors: Ganesh Kalbavi, Derek Jung, Leo Keselman, Min-Ren Jean, Kenneth W. Krause, Jason Tsai
-
Patent number: 11396100
Abstract: A method and system for calibration of an augmented reality (AR) device's position and orientation based on a robot's positional configuration. A conventional visual calibration target is not required for AR device calibration. Instead, the robot itself, in any pose, is used as a three dimensional (3D) calibration target. The AR system is provided with a CAD model of the entire robot to use as a reference frame, and 3D models of the individual robot arms are combined into a single object model based on joint positions known from the robot controller. The 3D surface model of the entire robot in the current pose is then used for visual calibration of the AR system by analyzing images from the AR device camera in comparison to the surface model of the robot in the current pose. The technique is applicable to initial AR device calibration and to ongoing device tracking.
Type: Grant
Filed: September 10, 2019
Date of Patent: July 26, 2022
Assignee: FANUC AMERICA CORPORATION
Inventors: Kenneth W. Krause, Derek Jung, Leo Keselman
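The step of combining per-link 3D models into a single whole-robot model from controller joint positions is, at its core, forward-kinematics composition: each link's model is placed by the accumulated transform of the joints before it. A toy planar sketch (revolute joints only; names and geometry are illustrative, not from the patent):

```python
import numpy as np

def link_transform(theta, length):
    """Planar revolute joint: rotate by theta about z, then advance `length`
    along the rotated x axis (the link's long axis)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0, length * c],
                     [s,  c, 0.0, length * s],
                     [0.0, 0.0, 1.0, 0.0],
                     [0.0, 0.0, 0.0, 1.0]])

def compose_pose(joint_angles, link_lengths):
    """Accumulate per-link transforms from the controller's joint positions.
    Each intermediate value of T is the pose at which that link's 3D model
    would be placed when building the single whole-robot model."""
    T = np.eye(4)
    for theta, length in zip(joint_angles, link_lengths):
        T = T @ link_transform(theta, length)
    return T
```

In the patent's setting the posed composite model is then matched against AR camera images; here the sketch only shows how joint positions fix each link's placement.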
-
Patent number: 11373372
Abstract: An augmented reality (AR) system for diagnosis, troubleshooting and repair of industrial robots. The disclosed diagnosis guide system communicates with a controller of an industrial robot and collects data from the robot controller, including a trouble code identifying a problem with the robot. The system then identifies an appropriate diagnosis decision tree based on the collected data, and provides an interactive step-by-step troubleshooting guide to a user on an AR-capable mobile device, including augmented reality for depicting actions to be taken during testing and component replacement. The system includes data collector, tree generator and guide generator modules, and builds the decision tree and the diagnosis guide using a stored library of diagnosis trees, decisions and diagnosis steps, along with the associated AR data.
Type: Grant
Filed: June 26, 2019
Date of Patent: June 28, 2022
Assignee: FANUC AMERICA CORPORATION
Inventors: Leo Keselman, Yi Sun, Sai-Kai Cheng, Jason Tsai
-
Patent number: 11112776
Abstract: Methods and systems include ways to synchronize a press machine and tending robots, including a pick robot and a drop robot, where the press machine includes an operating area for pressing a blank into a part. The pick robot and the part are moved out of the operating area while the drop robot carrying the blank is moved into the operating area. At least a portion of the pick robot and/or the part resides within the operating area at the same time at least a portion of the drop robot and/or the blank resides within the operating area. The pick robot is in communication with the drop robot and the movement of the pick robot is synchronized with the movement of the drop robot to prevent the pick robot or part from colliding with the drop robot or the blank.
Type: Grant
Filed: September 26, 2017
Date of Patent: September 7, 2021
Assignee: FANUC AMERICA CORPORATION
Inventors: Leo Keselman, Matthew DeNio, Eric Lee, Ho Cheung Wong, Peter Swanson, Sai-Kai Cheng
-
Publication number: 20200349737
Abstract: A system and method for setting up an AR application that uses a plurality of markers so that accurate augmentations can be displayed anywhere a marker is visible. The method includes placing a plurality of markers throughout the workspace so that a plurality of pairs of two adjacent markers can be viewed in a field-of-view of an AR device. The method further includes determining a distance relationship between the two markers in all of the pairs of markers, and determining a distance relationship between all non-adjacent markers using the distance relationship between the two markers in all of the pairs of markers. The method also includes identifying a distance relationship between one of the plurality of markers and an augmentation in the workspace, and identifying a distance relationship between the other markers and the augmentation using the distance relationships between the adjacent markers and the non-adjacent markers.
Type: Application
Filed: April 30, 2020
Publication date: November 5, 2020
Inventors: Leo Keselman, Derek Jung, Kenneth W. Krause
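The chaining described in the abstract — deriving non-adjacent marker relationships from the measured adjacent pairs, then re-expressing an augmentation's pose relative to any marker — can be sketched by composing rigid transforms along the chain. All names here are hypothetical:

```python
import numpy as np

def translation(t):
    """4x4 homogeneous transform for a pure translation."""
    T = np.eye(4)
    T[:3, 3] = t
    return T

def chain(adjacent_pairs):
    """Compose measured adjacent-pair transforms (marker i -> marker i+1)
    into the relationship between the first and last marker in the chain."""
    T = np.eye(4)
    for step in adjacent_pairs:
        T = T @ step
    return T

def augmentation_in_marker(T_m0_aug, T_m0_mk):
    """Re-express an augmentation anchored to marker 0 in marker k's frame,
    so it can be displayed whenever marker k alone is visible."""
    return np.linalg.inv(T_m0_mk) @ T_m0_aug
```

This is why only adjacent pairs need to be co-visible during setup: every other relationship follows by composition.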
-
Publication number: 20200346350
Abstract: A system and method for calibrating the position of a machine having a stationary part to a stationary marker. The process first images the machine and the marker, and then identifies a visible part of the machine in the images that has been 3D modeled. The process then calculates a location of the stationary part of the machine using the modeled position of the visible part and the known kinematics and position of the machine. The process then identifies the stationary marker in the images, and establishes a relationship between the stationary marker and the stationary part of the machine, which can be used for calibration purposes. In one embodiment, the machine is a robot and the process is performed by an AR application.
Type: Application
Filed: April 30, 2020
Publication date: November 5, 2020
Inventors: Leo Keselman, Derek Jung, Kenneth W. Krause
-
Publication number: 20200078948
Abstract: A method and system for calibration of an augmented reality (AR) device's position and orientation based on a robot's positional configuration. A conventional visual calibration target is not required for AR device calibration. Instead, the robot itself, in any pose, is used as a three dimensional (3D) calibration target. The AR system is provided with a CAD model of the entire robot to use as a reference frame, and 3D models of the individual robot arms are combined into a single object model based on joint positions known from the robot controller. The 3D surface model of the entire robot in the current pose is then used for visual calibration of the AR system by analyzing images from the AR device camera in comparison to the surface model of the robot in the current pose. The technique is applicable to initial AR device calibration and to ongoing device tracking.
Type: Application
Filed: September 10, 2019
Publication date: March 12, 2020
Inventors: Kenneth W. Krause, Derek Jung, Leo Keselman
-
Publication number: 20190389069
Abstract: An augmented reality (AR) system for production-tuning of parameters for a visual tracking robotic picking system. The robotic picking system includes one or more robots configured to pick randomly-placed and randomly-oriented parts off a conveyor belt and place the parts in an available position, either on a second moving conveyor belt or on a stationary device such as a pallet. A visual tracking system identifies position and orientation of the parts on the feed conveyor. The AR system allows picking system tuning parameters including upstream, discard and downstream boundary locations to be visualized and controlled, real-time robot pick/place operations to be viewed with virtual boundaries, and system performance parameters such as part throughput rate and part allocation by robot to be viewed. The AR system also allows virtual parts to be used in simulations, either instead of or in addition to real parts.
Type: Application
Filed: June 26, 2019
Publication date: December 26, 2019
Inventors: Ganesh Kalbavi, Derek Jung, Leo Keselman, Min-Ren Jean, Kenneth W. Krause, Jason Tsai
-
Publication number: 20190392644
Abstract: An augmented reality (AR) system for diagnosis, troubleshooting and repair of industrial robots. The disclosed diagnosis guide system communicates with a controller of an industrial robot and collects data from the robot controller, including a trouble code identifying a problem with the robot. The system then identifies an appropriate diagnosis decision tree based on the collected data, and provides an interactive step-by-step troubleshooting guide to a user on an AR-capable mobile device, including augmented reality for depicting actions to be taken during testing and component replacement. The system includes data collector, tree generator and guide generator modules, and builds the decision tree and the diagnosis guide using a stored library of diagnosis trees, decisions and diagnosis steps, along with the associated AR data.
Type: Application
Filed: June 26, 2019
Publication date: December 26, 2019
Inventors: Leo Keselman, Yi Sun, Sai-Kai Cheng, Jason Tsai
-
Publication number: 20190389066
Abstract: An augmented reality (AR) system for visualizing and modifying robot operational zones. The system includes an AR device such as a headset in communication with a robot controller. The AR device includes software for the AR display and modification of the operational zones. The AR device is registered with the robot coordinate frame via detection of a visual marker. The AR device displays operational zones overlaid on real world images of the robot and existing fixtures, where the display is updated as the user moves around the robot work cell. Control points on the virtual operational zones are displayed and allow the user to reshape the operational zones. The robot can be operated during the AR session, running the robot's programmed motion and evaluating the operational zones. Zone violations are highlighted in the AR display. When zone definition is complete, the finalized operational zones are uploaded to the robot controller.
Type: Application
Filed: June 26, 2019
Publication date: December 26, 2019
Inventors: Derek Jung, Bruce Coldren, Sam Yung-Sen Lee, Leo Keselman, Kenneth W. Krause
-
Publication number: 20190227532
Abstract: Methods and systems include ways to synchronize a press machine and tending robots, including a pick robot and a drop robot, where the press machine includes an operating area for pressing a blank into a part. The pick robot and the part are moved out of the operating area while the drop robot carrying the blank is moved into the operating area. At least a portion of the pick robot and/or the part resides within the operating area at the same time at least a portion of the drop robot and/or the blank resides within the operating area. The pick robot is in communication with the drop robot and the movement of the pick robot is synchronized with the movement of the drop robot to prevent the pick robot or part from colliding with the drop robot or the blank.
Type: Application
Filed: September 26, 2017
Publication date: July 25, 2019
Inventors: Leo Keselman, Matthew DeNio, Eric Lee, Ho Cheung Wong, Peter Swanson, Sai-Kai Cheng