Patents by Inventor Fredrik Ryden

Fredrik Ryden has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20210383096
    Abstract: A system and method are provided for training a machine learning system. In an embodiment, the system generates a three-dimensional model of an environment using a video sequence that includes individual frames taken from a variety of perspectives and environmental conditions. An object in the environment is identified and labeled, in some examples, by an operator, and a three-dimensional model of the object is created. Training data for the machine learning system is created by applying the label to the individual video frames of the video sequence, or by applying a rendering of the three-dimensional model to additional images or video sequences.
    Type: Application
    Filed: June 8, 2020
    Publication date: December 9, 2021
    Inventors: Steven James White, Olof Fredrik Ryden, Donald Mark Marsh
  • Patent number: 11127112
    Abstract: Techniques for simplifying a user interface by transforming a model and inversely transforming commands for controlling a remote device based on the transformed model are described herein. A computer system determines a warping transformation, applies the transformation to a model, and provides the transformed model to an operator display. The operator, referencing the transformed model, provides a movement command to the remote device, and the computer system inversely transforms the command to correspond to the space of the remote device.
    Type: Grant
    Filed: April 8, 2020
    Date of Patent: September 21, 2021
    Assignee: Bluhaptics, Inc.
    Inventor: Olof Fredrik Ryden
  • Publication number: 20200302241
    Abstract: A system and method are provided for training a machine learning system. In an embodiment, the system generates a three-dimensional model of an environment using a video sequence that includes individual frames taken from a variety of perspectives and environmental conditions. An object in the environment is identified and labeled, in some examples, by an operator, and a three-dimensional model of the object is created. Training data for the machine learning system is created by applying the label to the individual video frames of the video sequence, or by applying a rendering of the three-dimensional model to additional images or video sequences.
    Type: Application
    Filed: June 7, 2020
    Publication date: September 24, 2020
    Inventors: Steven James White, Olof Fredrik Ryden, Donald Mark Marsh
  • Patent number: 10394327
    Abstract: Apparatus and methods for generating virtual environment displays based on a group of sensors are provided. A computing device can receive first data about an environment from a first group of one or more sensors. The computing device can model the environment as a virtual environment based on the first data. The computing device can determine whether to obtain additional data to model the environment. After determining to obtain additional data to model the environment, the computing device can: receive second data about the environment, and model the environment as the virtual environment based on at least the second data. The computing device can generate a display of the virtual environment.
    Type: Grant
    Filed: September 11, 2015
    Date of Patent: August 27, 2019
    Assignee: University of Washington
    Inventors: Howard Jay Chizeck, Kevin Huang, Fredrik Ryden, Andrew Stewart
  • Patent number: 10226869
    Abstract: Apparatus and methods for defining and utilizing virtual fixtures for haptic navigation within real-world environments, including underwater environments, are provided. A computing device can determine a real-world object within a real-world environment. The computing device can receive an indication of the real-world object. The computing device can determine a virtual fixture that corresponds to the real-world object based on the indication, where aspects of the virtual fixture are configured to align with aspects of the real-world object. The computing device can provide a virtual environment for manipulating the robotic tool to operate on the real-world object utilizing the virtual fixture. The virtual fixture is configured to provide haptic feedback based on a position of a virtual robotic tool in the virtual environment that corresponds to the robotic tool in the real-world environment.
    Type: Grant
    Filed: March 2, 2015
    Date of Patent: March 12, 2019
    Assignee: University of Washington
    Inventors: Howard Jay Chizeck, Andrew Stewart, Fredrik Ryden
  • Publication number: 20180232052
    Abstract: Apparatus and methods for generating virtual environment displays based on a group of sensors are provided. A computing device can receive first data about an environment from a first group of one or more sensors. The computing device can model the environment as a virtual environment based on the first data. The computing device can determine whether to obtain additional data to model the environment. After determining to obtain additional data to model the environment, the computing device can: receive second data about the environment, and model the environment as the virtual environment based on at least the second data. The computing device can generate a display of the virtual environment.
    Type: Application
    Filed: September 11, 2015
    Publication date: August 16, 2018
    Inventors: Howard Jay CHIZECK, Kevin HUANG, Fredrik RYDEN, Andrew STEWART
  • Patent number: 9753542
    Abstract: Methods, articles of manufacture, and devices related to generating six degree of freedom (DOF) haptic feedback are provided. A computing device can receive first depth data about an environment. The computing device can generate a first plurality of points from the first depth data. The computing device can determine a virtual tool, where the virtual tool is specified in terms of a translation component for the virtual tool and a rotation component for the virtual tool. The computing device can determine a first force vector between the virtual tool and the first plurality of points. The computing device can send a first indication of haptic feedback based on the first force vector.
    Type: Grant
    Filed: October 11, 2016
    Date of Patent: September 5, 2017
    Assignee: University of Washington Through Its Center for Commercialization
    Inventors: Howard Jay Chizeck, Fredrik Ryden, Andrew Stewart
  • Publication number: 20170106537
    Abstract: Apparatus and methods for defining and utilizing virtual fixtures for haptic navigation within real-world environments, including underwater environments, are provided. A computing device can determine a real-world object within a real-world environment. The computing device can receive an indication of the real-world object. The computing device can determine a virtual fixture that corresponds to the real-world object based on the indication, where aspects of the virtual fixture are configured to align with aspects of the real-world object. The computing device can provide a virtual environment for manipulating the robotic tool to operate on the real-world object utilizing the virtual fixture. The virtual fixture is configured to provide haptic feedback based on a position of a virtual robotic tool in the virtual environment that corresponds to the robotic tool in the real-world environment.
    Type: Application
    Filed: March 2, 2015
    Publication date: April 20, 2017
    Inventors: Howard Jay CHIZECK, Andrew STEWART, Fredrik RYDEN
  • Publication number: 20170024014
    Abstract: Methods, articles of manufacture, and devices related to generating six degree of freedom (DOF) haptic feedback are provided. A computing device can receive first depth data about an environment. The computing device can generate a first plurality of points from the first depth data. The computing device can determine a virtual tool, where the virtual tool is specified in terms of a translation component for the virtual tool and a rotation component for the virtual tool. The computing device can determine a first force vector between the virtual tool and the first plurality of points. The computing device can send a first indication of haptic feedback based on the first force vector.
    Type: Application
    Filed: October 11, 2016
    Publication date: January 26, 2017
    Inventors: Howard Jay Chizeck, Fredrik Ryden
  • Patent number: 9477307
    Abstract: Methods, articles of manufacture, and devices related to generating six degree of freedom (DOF) haptic feedback are provided. A computing device can receive first depth data about an environment. The computing device can generate a first plurality of points from the first depth data. The computing device can determine a virtual tool, where the virtual tool is specified in terms of a translation component for the virtual tool and a rotation component for the virtual tool. The computing device can determine a first force vector between the virtual tool and the first plurality of points. The computing device can send a first indication of haptic feedback based on the first force vector.
    Type: Grant
    Filed: January 24, 2014
    Date of Patent: October 25, 2016
    Assignee: The University of Washington
    Inventors: Howard Jay Chizeck, Fredrik Ryden
  • Patent number: 9471142
    Abstract: Methods, articles of manufacture, and devices related to generating haptic feedback for point clouds are provided. A computing device receives depth data about an environment. The computing device generates a point cloud from the depth data. The computing device determines a haptic interface point (HIP). The computing device determines a force vector between the HIP and point cloud. The computing device sends an indication of haptic feedback based on the force vector.
    Type: Grant
    Filed: June 15, 2012
    Date of Patent: October 18, 2016
    Assignee: The University of Washington
    Inventors: Howard Jay Chizeck, Fredrik Rydén, Sina Nia Kosari, Blake Hannaford, Nicklas Gustafsson, Hawkeye I. King
  • Publication number: 20140320629
    Abstract: Apparatus configured for operation in an underwater environment are provided. A device includes a camera and a transceiver. The camera can capture, within a predetermined interval of time, first light within a first frequency range of light and second light within a second frequency range of light. The camera can generate a first image based on the first light and a second image based on the second light. The first frequency range of light can differ from the second frequency range of light; for example, the first frequency range can include 420-450 nanometers (nm) and the second frequency range can include 830 nm. The transceiver can send the images and receive commands based on haptic feedback generated based on the first image and the second image. For example, an actuator can be configured to be controlled by one or more commands.
    Type: Application
    Filed: January 24, 2014
    Publication date: October 30, 2014
    Inventors: Howard Jay Chizeck, Fredrik Ryden, Andrew Stewart, Wei-Chih Wang, Payman Arabshahi
  • Publication number: 20140320489
    Abstract: Methods, articles of manufacture, and devices related to generating six degree of freedom (DOF) haptic feedback are provided. A computing device can receive first depth data about an environment. The computing device can generate a first plurality of points from the first depth data. The computing device can determine a virtual tool, where the virtual tool is specified in terms of a translation component for the virtual tool and a rotation component for the virtual tool. The computing device can determine a first force vector between the virtual tool and the first plurality of points. The computing device can send a first indication of haptic feedback based on the first force vector.
    Type: Application
    Filed: January 24, 2014
    Publication date: October 30, 2014
    Inventors: Howard Jay Chizeck, Fredrik Ryden
  • Publication number: 20140320392
    Abstract: Apparatus and method for defining and utilizing virtual fixtures in haptic rendering sessions interacting with various environments, including underwater environments, are provided. A computing device can receive first depth data about an environment. The computing device can generate a first plurality of points from the first depth data. The computing device can determine a haptic interface point (HIP) and can define a virtual fixture for the environment. The computing device can determine a first force vector between the HIP and the first plurality of points using the computing device, where the first force vector is based on the virtual fixture. The computing device can send a first indication of haptic feedback based on the first force vector.
    Type: Application
    Filed: January 24, 2014
    Publication date: October 30, 2014
    Inventors: Howard Jay Chizeck, Fredrik Ryden, Andrew Stewart
  • Publication number: 20140168073
    Abstract: Methods, articles of manufacture, and devices related to generating haptic feedback for point clouds are provided. A computing device receives depth data about an environment. The computing device generates a point cloud from the depth data. The computing device determines a haptic interface point (HIP). The computing device determines a force vector between the HIP and point cloud. The computing device sends an indication of haptic feedback based on the force vector.
    Type: Application
    Filed: June 15, 2012
    Publication date: June 19, 2014
    Applicant: University of Washington through its Center for Commercialization
    Inventors: Howard Jay Chizeck, Fredrik Rydén, Sina Nia Kosari, Blake Hannaford, Nicklas Gustafsson, Hawkeye I. King
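
The training-data technique in publication 20210383096 (applying a single 3D object label to every frame of a video sequence) can be sketched as a camera reprojection: once the object is labeled in the 3D model, its position is projected into each frame using that frame's pose. This is a minimal sketch under a pinhole camera model; the intrinsics `K` and the per-frame poses below are hypothetical and do not appear in the abstract.

```python
import numpy as np

def project(point_3d, K, R, t):
    """Project a labeled 3D point into one video frame (pinhole model)."""
    p_cam = R @ point_3d + t          # world -> camera coordinates
    p_img = K @ p_cam                 # camera -> homogeneous pixel coordinates
    return p_img[:2] / p_img[2]       # normalize to (u, v)

# Hypothetical intrinsics and a labeled object center in the 3D model.
K = np.array([[500.0,   0.0, 320.0],
              [  0.0, 500.0, 240.0],
              [  0.0,   0.0,   1.0]])
object_center = np.array([0.0, 0.0, 5.0])

# Propagate the one 3D label into each frame via that frame's camera pose.
labels = []
for i in range(3):
    R, t = np.eye(3), np.array([0.1 * i, 0.0, 0.0])
    labels.append(project(object_center, K, R, t))
```

The same per-frame pose information that built the 3D reconstruction is what makes this propagation cheap: one manual label yields a 2D annotation in every frame.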
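
The core loop of patent 11127112 is: show the operator a warped model, then inverse-transform the operator's command before sending it to the remote device. A minimal sketch, using a uniform scaling as the warping transformation; the abstract does not specify the actual transform, so the scale factor here is purely illustrative.

```python
import numpy as np

SCALE = 2.0  # hypothetical warping transformation: uniform scaling

def warp(model_points):
    """Transform model points into the simplified space shown to the operator."""
    return np.asarray(model_points) * SCALE

def inverse_warp(command):
    """Map an operator command back into the remote device's own space."""
    return np.asarray(command) / SCALE

# The operator sees the warped model and issues a move in display space...
displayed = warp([[1.0, 2.0, 3.0]])
move_display = np.array([0.5, 0.0, 1.0])
# ...which is inverse-transformed before being sent to the device.
move_device = inverse_warp(move_display)
```

The key invariant is that `inverse_warp(warp(x))` returns `x`: whatever simplification the display applies, commands land in the device's true coordinate frame.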
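
Patent 10226869 (and publication 20170106537) describe virtual fixtures that produce haptic feedback from the virtual tool's position. One common realization of that idea is a forbidden-region fixture: a penalty spring that pushes the tool back out of a half-space it should not enter. The plane and stiffness below are illustrative assumptions, not taken from the patent.

```python
import numpy as np

def fixture_force(tool_pos, plane_point, plane_normal, stiffness=200.0):
    """Spring force pushing the virtual tool out of a forbidden half-space.

    plane_normal points from the plane toward the allowed region.
    """
    n = plane_normal / np.linalg.norm(plane_normal)
    depth = float(np.dot(plane_point - tool_pos, n))  # > 0 once the tool crosses
    if depth <= 0.0:
        return np.zeros(3)           # tool is in the allowed region: no force
    return stiffness * depth * n     # push back along the plane normal
```

Because aspects of the fixture are aligned with the real-world object, the same penetration-depth computation works whether the plane guards a pipe, a weld seam, or any other surface in the reconstructed scene.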
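
The six-DOF abstracts (patents 9753542 and 9477307) pair a translation component with a rotation component for the virtual tool. A sketch of that idea: represent the tool as rigid sample points, accumulate a penalty force from nearby cloud points, and take cross products about the tool origin for the rotational (torque) part. The contact radius and stiffness are assumed values, and the real method may aggregate contacts quite differently.

```python
import numpy as np

def tool_wrench(tool_points, tool_origin, cloud, radius=0.1, stiffness=100.0):
    """Net force (translation) and torque (rotation) on a rigid virtual tool."""
    force, torque = np.zeros(3), np.zeros(3)
    for tp in tool_points:
        for cp in cloud:
            d = tp - cp
            dist = np.linalg.norm(d)
            if 0.0 < dist < radius:                      # cloud point in contact range
                f = stiffness * (radius - dist) * (d / dist)
                force += f                               # translation component
                torque += np.cross(tp - tool_origin, f)  # rotation component
    return force, torque
```

A contact that is offset from the tool origin yields a nonzero torque, which is what lets the haptic device render twisting resistance as well as pushing resistance.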
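
Patent 9471142 (and publication 20140168073) compute a force vector between a haptic interface point (HIP) and a point cloud. A minimal nearest-point penalty version is sketched below; the contact radius and stiffness are illustrative, and production haptic renderers typically add a proxy (god-object) to prevent the HIP from popping through thin surfaces.

```python
import numpy as np

def hip_force(hip, cloud, contact_radius=0.05, stiffness=400.0):
    """Spring force pushing the HIP away from the nearest point in the cloud."""
    cloud = np.asarray(cloud, dtype=float)
    dists = np.linalg.norm(cloud - hip, axis=1)
    i = int(np.argmin(dists))
    if dists[i] >= contact_radius or dists[i] == 0.0:
        return np.zeros(3)                       # no contact (or degenerate case)
    direction = (hip - cloud[i]) / dists[i]      # unit vector away from the surface
    return stiffness * (contact_radius - dists[i]) * direction
```

Because the cloud comes straight from depth data, this gives haptic feedback against a live scene without ever fitting an explicit surface mesh.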