Patents by Inventor Fredrik Ryden
Fredrik Ryden has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Publication number: 20210383096
Abstract: A system and method are provided for training a machine learning system. In an embodiment, the system generates a three-dimensional model of an environment using a video sequence that includes individual frames taken from a variety of perspectives and environmental conditions. An object in the environment is identified and labeled, in some examples, by an operator, and a three-dimensional model of the object is created. Training data for the machine learning system is created by applying the label to the individual video frames of the video sequence, or by applying a rendering of the three-dimensional model to additional images or video sequences.
Type: Application
Filed: June 8, 2020
Publication date: December 9, 2021
Inventors: Steven James White, Olof Fredrik Ryden, Donald Mark Marsh
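The core idea above, deriving per-frame training labels by projecting a labeled 3D model into each video frame, can be illustrated with a minimal sketch. This is not code from the patent: the pinhole projection, the function names, and the bounding-box labeling scheme are all illustrative assumptions.

```python
import numpy as np

def project_points(K, R, t, pts3d):
    """Project 3D points (N,3) into pixel coordinates with a pinhole camera model.

    K: 3x3 intrinsics; R, t: world-to-camera rotation and translation (assumed known
    per frame, e.g. from the 3D reconstruction of the video sequence)."""
    cam = R @ pts3d.T + t.reshape(3, 1)   # world frame -> camera frame, shape (3, N)
    uvw = K @ cam                         # camera frame -> homogeneous image coords
    return (uvw[:2] / uvw[2]).T           # perspective divide -> (N, 2) pixels

def label_frame(K, R, t, object_corners):
    """Turn the labeled object's 3D corners into a 2D bounding-box label for one frame."""
    px = project_points(K, R, t, object_corners)
    x0, y0 = px.min(axis=0)
    x1, y1 = px.max(axis=0)
    return (x0, y0, x1, y1)               # axis-aligned box in pixel coordinates
```

Run once per frame with that frame's camera pose, the single 3D annotation propagates to every frame of the sequence, which is what makes this cheaper than hand-labeling each image.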
-
Patent number: 11127112
Abstract: Techniques for simplifying a user interface by transforming a model and inversely transforming commands for controlling a remote device based on the transformed model are described herein. A computer system determines a warping transformation, applies the transformation to a model, and provides the transformed model to an operator display. The operator, referencing the transformed model, provides a movement command to the remote device, and the computer system inversely transforms the command to correspond to the space of the remote device.
Type: Grant
Filed: April 8, 2020
Date of Patent: September 21, 2021
Assignee: Bluhaptics, Inc.
Inventor: Olof Fredrik Ryden
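The warp/inverse-warp round trip described above can be sketched in a few lines. This is a hypothetical linear-transform example, not the patented method: the patent's warping transformation need not be linear, and the names here are illustrative.

```python
import numpy as np

def warp(W, p):
    """Map a point from the remote device's space into the warped operator display."""
    return W @ p

def unwarp_command(W, cmd):
    """Inversely transform an operator movement command back into device space."""
    return np.linalg.solve(W, cmd)   # solves W x = cmd, i.e. applies W^-1

# Example warp: exaggerate x/y motion on the display, compress z.
W = np.diag([2.0, 2.0, 0.5])
```

An operator move of one display unit along x then corresponds to half a unit of actual device motion, so the interface can be made coarser or finer than the physical workspace while commands stay consistent.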
-
Publication number: 20200302241
Abstract: A system and method are provided for training a machine learning system. In an embodiment, the system generates a three-dimensional model of an environment using a video sequence that includes individual frames taken from a variety of perspectives and environmental conditions. An object in the environment is identified and labeled, in some examples, by an operator, and a three-dimensional model of the object is created. Training data for the machine learning system is created by applying the label to the individual video frames of the video sequence, or by applying a rendering of the three-dimensional model to additional images or video sequences.
Type: Application
Filed: June 7, 2020
Publication date: September 24, 2020
Inventors: Steven James White, Olof Fredrik Ryden, Donald Mark Marsh
-
Patent number: 10394327
Abstract: Apparatus and methods for generating virtual environment displays based on a group of sensors are provided. A computing device can receive first data about an environment from a first group of one or more sensors. The computing device can model the environment as a virtual environment based on the first data. The computing device can determine whether to obtain additional data to model the environment. After determining to obtain additional data to model the environment, the computing device can: receive second data about the environment, and model the environment as the virtual environment based on at least the second data. The computing device can generate a display of the virtual environment.
Type: Grant
Filed: September 11, 2015
Date of Patent: August 27, 2019
Assignee: University of Washington
Inventors: Howard Jay Chizeck, Kevin Huang, Fredrik Ryden, Andrew Stewart
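The incremental loop in the abstract, model from first data, decide whether more data is needed, fuse second data, can be sketched abstractly. This is a toy illustration under assumed names; the patent does not specify this structure or any particular sufficiency test.

```python
def build_model(sensor_batches, enough):
    """Fuse successive sensor readings until `enough(model)` says the model suffices.

    sensor_batches: iterable of measurement batches (here, hashable samples).
    enough: caller-supplied predicate deciding whether additional data is needed."""
    model = set()
    for batch in sensor_batches:
        model |= set(batch)      # fuse the new measurements into the model
        if enough(model):
            break                # model is sufficient: stop requesting data
    return model
```

The interesting design point is the `enough` predicate: in a real system it might test point-cloud coverage or reconstruction uncertainty rather than a simple count.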
-
Patent number: 10226869
Abstract: Apparatus and methods for defining and utilizing virtual fixtures for haptic navigation within real-world environments, including underwater environments, are provided. A computing device can determine a real-world object within a real-world environment. The computing device can receive an indication of the real-world object. The computing device can determine a virtual fixture that corresponds to the real-world object based on the indication, where aspects of the virtual fixture are configured to align with aspects of the real-world object. The computing device can provide a virtual environment for manipulating the robotic tool to operate on the real-world object utilizing the virtual fixture. The virtual fixture is configured to provide haptic feedback based on a position of a virtual robotic tool in the virtual environment that corresponds to the robotic tool in the real-world environment.
Type: Grant
Filed: March 2, 2015
Date of Patent: March 12, 2019
Assignee: University of Washington
Inventors: Howard Jay Chizeck, Andrew Stewart, Fredrik Ryden
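One common form of virtual fixture is a forbidden-region fixture: a geometric constraint that generates a restoring force when the virtual tool crosses it. The planar-spring sketch below is an illustrative assumption, not the patent's fixture definition, and all names and the stiffness value are hypothetical.

```python
import numpy as np

def fixture_force(tool_pos, plane_point, plane_normal, stiffness=100.0):
    """Forbidden-region virtual fixture modeled as a plane with a spring response.

    plane_normal is assumed to point into the allowed region."""
    n = plane_normal / np.linalg.norm(plane_normal)
    d = np.dot(tool_pos - plane_point, n)   # signed distance from the plane
    if d >= 0:
        return np.zeros(3)                  # tool on the allowed side: no feedback
    return -stiffness * d * n               # penetration: push back along the normal
```

Rendered to a haptic device, the operator feels a "wall" aligned with the real-world object, which is how a fixture guides the robotic tool without hard-blocking it.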
-
Publication number: 20180232052
Abstract: Apparatus and methods for generating virtual environment displays based on a group of sensors are provided. A computing device can receive first data about an environment from a first group of one or more sensors. The computing device can model the environment as a virtual environment based on the first data. The computing device can determine whether to obtain additional data to model the environment. After determining to obtain additional data to model the environment, the computing device can: receive second data about the environment, and model the environment as the virtual environment based on at least the second data. The computing device can generate a display of the virtual environment.
Type: Application
Filed: September 11, 2015
Publication date: August 16, 2018
Inventors: Howard Jay Chizeck, Kevin Huang, Fredrik Ryden, Andrew Stewart
-
Patent number: 9753542
Abstract: Methods, articles of manufacture, and devices related to generating six degree of freedom (DOF) haptic feedback are provided. A computing device can receive first depth data about an environment. The computing device can generate a first plurality of points from the first depth data. The computing device can determine a virtual tool, where the virtual tool is specified in terms of a translation component for the virtual tool and a rotation component for the virtual tool. The computing device can determine a first force vector between the virtual tool and the first plurality of points. The computing device can send a first indication of haptic feedback based on the first force vector.
Type: Grant
Filed: October 11, 2016
Date of Patent: September 5, 2017
Assignee: University of Washington Through Its Center for Commercialization
Inventors: Howard Jay Chizeck, Fredrik Ryden, Andrew Stewart
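What distinguishes the six-DOF case from a point probe is that contacts off the tool's center also produce torque, engaging the rotational degrees of freedom. The sketch below, a rigid tool sampled as surface points with a spring response against nearby depth points, is an illustrative assumption; the patent's actual force computation is not reproduced here and all names and constants are hypothetical.

```python
import numpy as np

def six_dof_feedback(tool_center, tool_pts, env_pts, radius=0.05, k=200.0):
    """Accumulate a net force and torque on a rigid virtual tool from depth points.

    tool_pts: sample points on the tool surface (world frame, after applying the
    tool's translation and rotation); env_pts: points from the depth sensor."""
    force = np.zeros(3)
    torque = np.zeros(3)
    for tp in tool_pts:
        d = env_pts - tp
        dist = np.linalg.norm(d, axis=1)
        for v, r in zip(d, dist):
            if 0 < r < radius:                       # point penetrates the shell
                f = -k * (radius - r) * (v / r)      # spring push away from the point
                force += f
                torque += np.cross(tp - tool_center, f)  # moment about the center
    return force, torque
```

The (force, torque) pair is exactly what a six-DOF haptic device consumes, three translational and three rotational feedback channels.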
-
Publication number: 20170106537
Abstract: Apparatus and methods for defining and utilizing virtual fixtures for haptic navigation within real-world environments, including underwater environments, are provided. A computing device can determine a real-world object within a real-world environment. The computing device can receive an indication of the real-world object. The computing device can determine a virtual fixture that corresponds to the real-world object based on the indication, where aspects of the virtual fixture are configured to align with aspects of the real-world object. The computing device can provide a virtual environment for manipulating the robotic tool to operate on the real-world object utilizing the virtual fixture. The virtual fixture is configured to provide haptic feedback based on a position of a virtual robotic tool in the virtual environment that corresponds to the robotic tool in the real-world environment.
Type: Application
Filed: March 2, 2015
Publication date: April 20, 2017
Inventors: Howard Jay Chizeck, Andrew Stewart, Fredrik Ryden
-
Publication number: 20170024014
Abstract: Methods, articles of manufacture, and devices related to generating six degree of freedom (DOF) haptic feedback are provided. A computing device can receive first depth data about an environment. The computing device can generate a first plurality of points from the first depth data. The computing device can determine a virtual tool, where the virtual tool is specified in terms of a translation component for the virtual tool and a rotation component for the virtual tool. The computing device can determine a first force vector between the virtual tool and the first plurality of points. The computing device can send a first indication of haptic feedback based on the first force vector.
Type: Application
Filed: October 11, 2016
Publication date: January 26, 2017
Inventors: Howard Jay Chizeck, Fredrik Ryden
-
Patent number: 9477307
Abstract: Methods, articles of manufacture, and devices related to generating six degree of freedom (DOF) haptic feedback are provided. A computing device can receive first depth data about an environment. The computing device can generate a first plurality of points from the first depth data. The computing device can determine a virtual tool, where the virtual tool is specified in terms of a translation component for the virtual tool and a rotation component for the virtual tool. The computing device can determine a first force vector between the virtual tool and the first plurality of points. The computing device can send a first indication of haptic feedback based on the first force vector.
Type: Grant
Filed: January 24, 2014
Date of Patent: October 25, 2016
Assignee: The University of Washington
Inventors: Howard Jay Chizeck, Fredrik Ryden
-
Patent number: 9471142
Abstract: Methods, articles of manufacture, and devices related to generating haptic feedback for point clouds are provided. A computing device receives depth data about an environment. The computing device generates a point cloud from the depth data. The computing device determines a haptic interface point (HIP). The computing device determines a force vector between the HIP and point cloud. The computing device sends an indication of haptic feedback based on the force vector.
Type: Grant
Filed: June 15, 2012
Date of Patent: October 18, 2016
Assignee: The University of Washington
Inventors: Howard Jay Chizeck, Fredrik Rydén, Sina Nia Kosari, Blake Hannaford, Nicklas Gustafsson, Hawkeye I. King
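For the three-DOF case above, the HIP is a single point and the feedback reduces to one force vector against the point cloud. The nearest-point spring model below is a minimal illustrative sketch, not the patented algorithm; the function name, radius, and stiffness are assumptions.

```python
import numpy as np

def hip_force(hip, cloud, radius=0.05, k=150.0):
    """Spring-style repulsive force on a haptic interface point near a point cloud.

    hip: (3,) position of the HIP; cloud: (N, 3) points from the depth sensor."""
    d = cloud - hip
    dist = np.linalg.norm(d, axis=1)
    i = np.argmin(dist)                      # nearest cloud point to the HIP
    if dist[i] >= radius:
        return np.zeros(3)                   # no contact: no feedback
    direction = -d[i] / dist[i]              # push the HIP away from the surface
    return k * (radius - dist[i]) * direction
```

Because the force is recomputed against the live point cloud each frame, the rendered surface tracks whatever the depth camera currently sees, with no prior mesh of the environment required.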
-
Publication number: 20140320629
Abstract: Apparatus configured for operation in an underwater environment are provided. A device includes a camera and a transceiver. The camera can capture, within a predetermined interval of time, first light within a first frequency range of light and second light within a second frequency range of light. The camera can generate a first image based on the first light and a second image based on the second light. The first frequency range of light can differ from the second frequency range of light; for example, the first frequency range can include 420-450 nanometers (nm) and the second frequency range can include 830 nm. The transceiver can send the images and receive commands based on haptic feedback generated based on the first image and the second image. For example, an actuator can be configured to be controlled by one or more commands.
Type: Application
Filed: January 24, 2014
Publication date: October 30, 2014
Inventors: Howard Jay Chizeck, Fredrik Ryden, Andrew Stewart, Wei-Chih Wang, Payman Arabshahi
-
Publication number: 20140320489
Abstract: Methods, articles of manufacture, and devices related to generating six degree of freedom (DOF) haptic feedback are provided. A computing device can receive first depth data about an environment. The computing device can generate a first plurality of points from the first depth data. The computing device can determine a virtual tool, where the virtual tool is specified in terms of a translation component for the virtual tool and a rotation component for the virtual tool. The computing device can determine a first force vector between the virtual tool and the first plurality of points. The computing device can send a first indication of haptic feedback based on the first force vector.
Type: Application
Filed: January 24, 2014
Publication date: October 30, 2014
Inventors: Howard Jay Chizeck, Fredrik Ryden
-
Publication number: 20140320392
Abstract: Apparatus and methods for defining and utilizing virtual fixtures in haptic rendering sessions interacting with various environments, including underwater environments, are provided. A computing device can receive first depth data about an environment. The computing device can generate a first plurality of points from the first depth data. The computing device can determine a haptic interface point (HIP) and can define a virtual fixture for the environment. The computing device can determine a first force vector between the HIP and the first plurality of points using the computing device, where the first force vector is based on the virtual fixture. The computing device can send a first indication of haptic feedback based on the first force vector.
Type: Application
Filed: January 24, 2014
Publication date: October 30, 2014
Inventors: Howard Jay Chizeck, Fredrik Ryden, Andrew Stewart
-
Publication number: 20140168073
Abstract: Methods, articles of manufacture, and devices related to generating haptic feedback for point clouds are provided. A computing device receives depth data about an environment. The computing device generates a point cloud from the depth data. The computing device determines a haptic interface point (HIP). The computing device determines a force vector between the HIP and point cloud. The computing device sends an indication of haptic feedback based on the force vector.
Type: Application
Filed: June 15, 2012
Publication date: June 19, 2014
Applicant: University of Washington through its Center for Commercialization
Inventors: Howard Jay Chizeck, Fredrik Rydén, Sina Nia Kosari, Blake Hannaford, Nicklas Gustafsson, Hawkeye I. King