Patents by Inventor Paul Yarin

Paul Yarin has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO). Brief illustrative code sketches of the technique families that recur in these abstracts (gesture-driven display control, adaptive spatial tracking, package enrollment, and markerless hand tracking) follow the listing.

  • Publication number: 20170308743
    Abstract: An adaptive tracking system for spatial input devices provides real-time tracking of spatial input devices for human-computer interaction in a Spatial Operating Environment (SOE). The components of an SOE include gestural input/output; network-based data representation, transit, and interchange; and spatially conformed display mesh. The SOE comprises a workspace occupied by one or more users, a set of screens which provide the users with visual feedback, and a gestural control system which translates user motions into command inputs. Users perform gestures with body parts and/or physical pointing devices, and the system translates those gestures into actions such as pointing, dragging, selecting, or other direct manipulations. The tracking system provides the requisite data for creating an immersive environment by maintaining a model of the spatial relationships between users, screens, pointing devices, and other physical objects within the workspace.
    Type: Application
    Filed: July 7, 2017
    Publication date: October 26, 2017
    Inventors: Ambrus Csaszar, Dima Kogan, Paul Yarin
  • Publication number: 20170300122
    Abstract: Embodiments described herein include a system comprising a processor coupled to display devices, sensors, remote client devices, and computer applications. The computer applications orchestrate content of the remote client devices simultaneously across the display devices and the remote client devices, and allow simultaneous control of the display devices. The simultaneous control includes automatically detecting a gesture of at least one object from gesture data received via the sensors. The detecting comprises identifying the gesture using only the gesture data. The computer applications translate the gesture to a gesture signal, and control the display devices in response to the gesture signal.
    Type: Application
    Filed: April 28, 2017
    Publication date: October 19, 2017
    Inventors: Kwindla Hultman Kramer, John Underkoffler, Carlton Sparrell, Navjot Singh, Kate Hollenbach, Paul Yarin
  • Patent number: 9740293
    Abstract: Embodiments described herein include a system comprising a processor coupled to display devices, sensors, remote client devices, and computer applications. The computer applications orchestrate content of the remote client devices simultaneously across the display devices and the remote client devices, and allow simultaneous control of the display devices. The simultaneous control includes automatically detecting a gesture of at least one object from gesture data received via the sensors. The detecting comprises identifying the gesture using only the gesture data. The computer applications translate the gesture to a gesture signal, and control the display devices in response to the gesture signal.
    Type: Grant
    Filed: December 31, 2013
    Date of Patent: August 22, 2017
    Assignee: Oblong Industries, Inc.
    Inventors: Kwindla Hultman Kramer, John Underkoffler, Carlton Sparrell, Navjot Singh, Kate Hollenbach, Paul Yarin
  • Patent number: 9740922
    Abstract: An adaptive tracking system for spatial input devices provides real-time tracking of spatial input devices for human-computer interaction in a Spatial Operating Environment (SOE). The components of an SOE include gestural input/output; network-based data representation, transit, and interchange; and spatially conformed display mesh. The SOE comprises a workspace occupied by one or more users, a set of screens which provide the users with visual feedback, and a gestural control system which translates user motions into command inputs. Users perform gestures with body parts and/or physical pointing devices, and the system translates those gestures into actions such as pointing, dragging, selecting, or other direct manipulations. The tracking system provides the requisite data for creating an immersive environment by maintaining a model of the spatial relationships between users, screens, pointing devices, and other physical objects within the workspace.
    Type: Grant
    Filed: January 27, 2015
    Date of Patent: August 22, 2017
    Assignee: Oblong Industries, Inc.
    Inventors: Ambrus Csaszar, Dima Kogan, Paul Yarin
  • Publication number: 20170038846
    Abstract: Embodiments described herein include a system comprising a processor coupled to display devices, sensors, remote client devices, and computer applications. The computer applications orchestrate content of the remote client devices simultaneously across the display devices and the remote client devices, and allow simultaneous control of the display devices. The simultaneous control includes automatically detecting a gesture of at least one object from gesture data received via the sensors. The detecting comprises identifying the gesture using only the gesture data. The computer applications translate the gesture to a gesture signal, and control the display devices in response to the gesture signal.
    Type: Application
    Filed: March 7, 2016
    Publication date: February 9, 2017
    Inventors: David Minnen, Paul Yarin
  • Patent number: 9317128
    Abstract: Embodiments described herein include a system comprising a processor coupled to display devices, sensors, remote client devices, and computer applications. The computer applications orchestrate content of the remote client devices simultaneously across the display devices and the remote client devices, and allow simultaneous control of the display devices. The simultaneous control includes automatically detecting a gesture of at least one object from gesture data received via the sensors. The detecting comprises identifying the gesture using only the gesture data. The computer applications translate the gesture to a gesture signal, and control the display devices in response to the gesture signal.
    Type: Grant
    Filed: March 17, 2014
    Date of Patent: April 19, 2016
    Assignee: Oblong Industries, Inc.
    Inventors: David Minnen, Paul Yarin
  • Publication number: 20150371082
    Abstract: An adaptive tracking system for spatial input devices provides real-time tracking of spatial input devices for human-computer interaction in a Spatial Operating Environment (SOE). The components of an SOE include gestural input/output; network-based data representation, transit, and interchange; and spatially conformed display mesh. The SOE comprises a workspace occupied by one or more users, a set of screens which provide the users with visual feedback, and a gestural control system which translates user motions into command inputs. Users perform gestures with body parts and/or physical pointing devices, and the system translates those gestures into actions such as pointing, dragging, selecting, or other direct manipulations. The tracking system provides the requisite data for creating an immersive environment by maintaining a model of the spatial relationships between users, screens, pointing devices, and other physical objects within the workspace.
    Type: Application
    Filed: January 27, 2015
    Publication date: December 24, 2015
    Inventors: Ambrus Csaszar, Dima Kogan, Paul Yarin
  • Publication number: 20150371083
    Abstract: An adaptive tracking system for spatial input devices provides real-time tracking of spatial input devices for human-computer interaction in a Spatial Operating Environment (SOE). The components of an SOE include gestural input/output; network-based data representation, transit, and interchange; and spatially conformed display mesh. The SOE comprises a workspace occupied by one or more users, a set of screens which provide the users with visual feedback, and a gestural control system which translates user motions into command inputs. Users perform gestures with body parts and/or physical pointing devices, and the system translates those gestures into actions such as pointing, dragging, selecting, or other direct manipulations. The tracking system provides the requisite data for creating an immersive environment by maintaining a model of the spatial relationships between users, screens, pointing devices, and other physical objects within the workspace.
    Type: Application
    Filed: January 27, 2015
    Publication date: December 24, 2015
    Inventors: Ambrus Csaszar, Dima Kogan, Paul Yarin
  • Publication number: 20150077326
    Abstract: Embodiments described herein include a system comprising a processor coupled to display devices, sensors, remote client devices, and computer applications. The computer applications orchestrate content of the remote client devices simultaneously across the display devices and the remote client devices, and allow simultaneous control of the display devices. The simultaneous control includes automatically detecting a gesture of at least one object from gesture data received via the sensors. The detecting comprises identifying the gesture using only the gesture data. The computer applications translate the gesture to a gesture signal, and control the display devices in response to the gesture signal.
    Type: Application
    Filed: December 31, 2013
    Publication date: March 19, 2015
    Inventors: Kwindla Hultman Kramer, John Underkoffler, Carlton Sparrell, Navjot Singh, Kate Hollenbach, Paul Yarin
  • Publication number: 20150054729
    Abstract: Embodiments described herein include a system comprising a processor coupled to display devices, sensors, remote client devices, and computer applications. The computer applications orchestrate content of the remote client devices simultaneously across the display devices and the remote client devices, and allow simultaneous control of the display devices. The simultaneous control includes automatically detecting a gesture of at least one object from gesture data received via the sensors. The detecting comprises identifying the gesture using only the gesture data. The computer applications translate the gesture to a gesture signal, and control the display devices in response to the gesture signal.
    Type: Application
    Filed: March 17, 2014
    Publication date: February 26, 2015
    Inventors: David Minnen, Paul Yarin
  • Patent number: 8941590
    Abstract: An adaptive tracking system for spatial input devices provides real-time tracking of spatial input devices for human-computer interaction in a Spatial Operating Environment (SOE). The components of an SOE include gestural input/output; network-based data representation, transit, and interchange; and spatially conformed display mesh. The SOE comprises a workspace occupied by one or more users, a set of screens which provide the users with visual feedback, and a gestural control system which translates user motions into command inputs. Users perform gestures with body parts and/or physical pointing devices, and the system translates those gestures into actions such as pointing, dragging, selecting, or other direct manipulations. The tracking system provides the requisite data for creating an immersive environment by maintaining a model of the spatial relationships between users, screens, pointing devices, and other physical objects within the workspace.
    Type: Grant
    Filed: June 25, 2012
    Date of Patent: January 27, 2015
    Assignee: Oblong Industries, Inc.
    Inventors: Ambrus Csaszar, Dima Kogan, Paul Yarin
  • Patent number: 8941589
    Abstract: An adaptive tracking system for spatial input devices provides real-time tracking of spatial input devices for human-computer interaction in a Spatial Operating Environment (SOE). The components of an SOE include gestural input/output; network-based data representation, transit, and interchange; and spatially conformed display mesh. The SOE comprises a workspace occupied by one or more users, a set of screens which provide the users with visual feedback, and a gestural control system which translates user motions into command inputs. Users perform gestures with body parts and/or physical pointing devices, and the system translates those gestures into actions such as pointing, dragging, selecting, or other direct manipulations. The tracking system provides the requisite data for creating an immersive environment by maintaining a model of the spatial relationships between users, screens, pointing devices, and other physical objects within the workspace.
    Type: Grant
    Filed: June 25, 2012
    Date of Patent: January 27, 2015
    Assignee: Oblong Industries, Inc.
    Inventors: Ambrus Csaszar, Dima Kogan, Paul Yarin
  • Publication number: 20140351073
    Abstract: An apparatus for enrolling a package is disclosed including: a receiving surface for receiving the package; at least one weight sensor in communication with the receiving surface which generates a weight signal indicative of the weight of the package; at least one RGB video camera which generates a first video signal indicative of an image of the package on the receiving surface; at least one infrared three-dimensional camera which generates a second video signal indicative of the dimensions of the package; and a processor in communication with the at least one weight sensor and the at least one video camera.
    Type: Application
    Filed: May 11, 2012
    Publication date: November 27, 2014
    Inventors: Mike Murphy, Paul Yarin, Eric Metois, Will Crosby, Wilson Pearce, Mark Durbin
  • Publication number: 20140211982
    Abstract: An apparatus for enrolling a package is disclosed including: a receiving surface for receiving the package; at least one weight sensor in communication with the receiving surface which generates a weight signal indicative of the weight of the package; at least one video camera which generates a video signal indicative of an image of the package on the receiving surface; and a processor in communication with the at least one weight sensor and the at least one video camera. The processor includes: a weight module which produces, in response to the weight signal, weight data indicative of the weight of the package; and a dimension capture module which produces, in response to the video signal, dimension data indicative of the size of the package. In some embodiments, the processor further includes a recognition module which produces, in response to the video signal, character data indicative of one or more characters present on the package.
    Type: Application
    Filed: February 4, 2014
    Publication date: July 31, 2014
    Applicant: Proiam, LLC
    Inventors: Mike Murphy, Paul Yarin, Eric Metois, Will Crosby, Wilson Pearce, Mark Durbin
  • Publication number: 20140195988
    Abstract: Embodiments described herein include a system comprising a processor coupled to display devices, sensors, remote client devices, and computer applications. The computer applications orchestrate content of the remote client devices simultaneously across at least one of the display devices and the remote client devices, and allow simultaneous control of the display devices. The simultaneous control includes automatically detecting a gesture of at least one object from gesture data received via the sensors. The gesture data is absolute three-space location data of an instantaneous state of the at least one object at a point in time and space. The detecting comprises aggregating the gesture data, and identifying the gesture using only the gesture data. The computer applications translate the gesture to a gesture signal, and control at least one of the display devices and the remote client devices in response to the gesture signal.
    Type: Application
    Filed: October 8, 2013
    Publication date: July 10, 2014
    Applicant: Oblong Industries, Inc.
    Inventors: Kwindla Hultman Kramer, John Underkoffler, Carlton Sparrell, Navjot Singh, Kate Hollenbach, Paul Yarin
  • Publication number: 20140035805
    Abstract: A Spatial Operating Environment (SOE) with markerless gestural control includes a sensor coupled to a processor that runs numerous applications. A gestural interface application executes on the processor. The gestural interface application receives data from the sensor that corresponds to a hand of a user detected by the sensor, and tracks the hand by generating images from the data and associating blobs in the images with tracks of the hand. The gestural interface application detects a pose of the hand by classifying each blob as corresponding to an object shape. The gestural interface application generates a gesture signal in response to a gesture comprising the pose and the tracks, and controls the applications with the gesture signal.
    Type: Application
    Filed: June 4, 2013
    Publication date: February 6, 2014
    Inventors: David Minnen, Alan Browning, Peter Hawkes, Tobias Rick, Miguel Sanchez Valdes, Alessandro Valli, Dan Chak, Paul Yarin
  • Patent number: 8645216
    Abstract: An apparatus for enrolling a package is disclosed including: a receiving surface for receiving the package; at least one weight sensor in communication with the receiving surface which generates a weight signal indicative of the weight of the package; at least one video camera which generates a video signal indicative of an image of the package on the receiving surface; and a processor in communication with the at least one weight sensor and the at least one video camera. The processor includes: a weight module which produces, in response to the weight signal, weight data indicative of the weight of the package; and a dimension capture module which produces, in response to the video signal, dimension data indicative of the size of the package. In some embodiments, the processor further includes a recognition module which produces, in response to the video signal, character data indicative of one or more characters present on the package.
    Type: Grant
    Filed: March 2, 2012
    Date of Patent: February 4, 2014
    Assignee: Proiam, LLC
    Inventors: Mike Murphy, Paul Yarin, Eric Metois, Will Crosby, Wilson Pearce, Mark Durbin
  • Publication number: 20130076616
    Abstract: An adaptive tracking system for spatial input devices provides real-time tracking of spatial input devices for human-computer interaction in a Spatial Operating Environment (SOE). The components of an SOE include gestural input/output; network-based data representation, transit, and interchange; and spatially conformed display mesh. The SOE comprises a workspace occupied by one or more users, a set of screens which provide the users with visual feedback, and a gestural control system which translates user motions into command inputs. Users perform gestures with body parts and/or physical pointing devices, and the system translates those gestures into actions such as pointing, dragging, selecting, or other direct manipulations. The tracking system provides the requisite data for creating an immersive environment by maintaining a model of the spatial relationships between users, screens, pointing devices, and other physical objects within the workspace.
    Type: Application
    Filed: June 25, 2012
    Publication date: March 28, 2013
    Inventors: Ambrus Csaszar, Dima Kogan, Paul Yarin
  • Publication number: 20130076617
    Abstract: An adaptive tracking system for spatial input devices provides real-time tracking of spatial input devices for human-computer interaction in a Spatial Operating Environment (SOE). The components of an SOE include gestural input/output; network-based data representation, transit, and interchange; and spatially conformed display mesh. The SOE comprises a workspace occupied by one or more users, a set of screens which provide the users with visual feedback, and a gestural control system which translates user motions into command inputs. Users perform gestures with body parts and/or physical pointing devices, and the system translates those gestures into actions such as pointing, dragging, selecting, or other direct manipulations. The tracking system provides the requisite data for creating an immersive environment by maintaining a model of the spatial relationships between users, screens, pointing devices, and other physical objects within the workspace.
    Type: Application
    Filed: June 25, 2012
    Publication date: March 28, 2013
    Inventors: Ambrus Csaszar, Dima Kogan, Paul Yarin
  • Publication number: 20130076522
    Abstract: An adaptive tracking system for spatial input devices provides real-time tracking of spatial input devices for human-computer interaction in a Spatial Operating Environment (SOE). The components of an SOE include gestural input/output; network-based data representation, transit, and interchange; and spatially conformed display mesh. The SOE comprises a workspace occupied by one or more users, a set of screens which provide the users with visual feedback, and a gestural control system which translates user motions into command inputs. Users perform gestures with body parts and/or physical pointing devices, and the system translates those gestures into actions such as pointing, dragging, selecting, or other direct manipulations. The tracking system provides the requisite data for creating an immersive environment by maintaining a model of the spatial relationships between users, screens, pointing devices, and other physical objects within the workspace.
    Type: Application
    Filed: June 25, 2012
    Publication date: March 28, 2013
    Inventors: Ambrus Csaszar, Dima Kogan, Paul Yarin
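
Several entries above (the family that includes patent numbers 9740293 and 9317128 and publication number 20140195988) describe the same control loop: detect a gesture of an object from sensor-supplied gesture data, translate the gesture into a gesture signal, and drive one or more display devices from that signal. The Python sketch below is a hypothetical illustration of that loop under simplifying assumptions; the names GestureSample, GestureDetector, translate, and DisplayController are invented here for illustration and do not come from the patents.

```python
from dataclasses import dataclass
from typing import Iterable, List, Optional


@dataclass
class GestureSample:
    """Absolute three-space location of a tracked object at an instant."""
    x: float
    y: float
    z: float
    timestamp: float


class GestureDetector:
    """Toy detector: classifies a horizontal swipe from aggregated samples."""

    def detect(self, samples: List[GestureSample]) -> Optional[str]:
        if len(samples) < 2:
            return None
        dx = samples[-1].x - samples[0].x
        if abs(dx) > 0.5:  # arbitrary displacement threshold, in metres
            return "swipe-right" if dx > 0 else "swipe-left"
        return None


def translate(gesture: str) -> dict:
    """Translate a detected gesture into a gesture signal (command payload)."""
    return {"command": "scroll", "direction": gesture.split("-")[1]}


class DisplayController:
    """Applies a gesture signal to every registered display device."""

    def __init__(self, display_ids: Iterable[str]) -> None:
        self.display_ids = list(display_ids)

    def apply(self, signal: dict) -> None:
        for display in self.display_ids:
            print(f"{display}: {signal}")  # stand-in for a real display API


if __name__ == "__main__":
    samples = [GestureSample(0.1 * i, 1.0, 2.0, i / 30.0) for i in range(10)]
    gesture = GestureDetector().detect(samples)
    if gesture is not None:
        DisplayController(["wall-left", "wall-right"]).apply(translate(gesture))
```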
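
The adaptive tracking abstracts (patent numbers 9740922, 8941589, and 8941590, plus their related publications) all turn on maintaining a model of the spatial relationships between users, screens, and pointing devices in a workspace. A minimal sketch of such a model, assuming each screen is represented as a plane with an origin and normal and that pointing is resolved by ray-plane intersection, might look like the following; SpatialModel, add_screen, and pointed_screen are hypothetical names, not part of the patents. A full tracker would also estimate each device's pose from sensor data and update the model in real time as people and objects move.

```python
import numpy as np


class SpatialModel:
    """Minimal model of spatial relationships in a shared workspace: screens
    are planes with an origin, unit normal, and extents, and pointing is
    resolved by intersecting a device's aim ray with those planes."""

    def __init__(self):
        self.screens = {}  # name -> (origin, unit normal, width, height)

    def add_screen(self, name, origin, normal, width, height):
        n = np.asarray(normal, dtype=float)
        self.screens[name] = (np.asarray(origin, dtype=float),
                              n / np.linalg.norm(n), width, height)

    def pointed_screen(self, position, direction):
        """Return the name of the nearest screen hit by a pointing ray, if
        any (ray-plane intersection; screen extents ignored for brevity)."""
        p = np.asarray(position, dtype=float)
        d = np.asarray(direction, dtype=float)
        d = d / np.linalg.norm(d)
        best, best_t = None, np.inf
        for name, (origin, normal, _w, _h) in self.screens.items():
            denom = d.dot(normal)
            if abs(denom) < 1e-9:
                continue  # ray is parallel to this screen's plane
            t = (origin - p).dot(normal) / denom
            if 0.0 < t < best_t:
                best, best_t = name, t
        return best


model = SpatialModel()
model.add_screen("front-wall", origin=[0, 0, 3], normal=[0, 0, -1],
                 width=4.0, height=2.5)
print(model.pointed_screen(position=[0, 1.5, 0], direction=[0, 0, 1]))
# -> front-wall
```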
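
The package-enrollment entries (patent number 8645216 and publication numbers 20140211982 and 20140351073) describe a processor with a weight module and a dimension-capture module, optionally joined by a character-recognition module. The sketch below shows one way those modules' outputs could be combined into a single enrollment record; EnrollmentProcessor, FakeScale, and FakeCamera are names made up for this example and the measurement step is a placeholder.

```python
from dataclasses import dataclass


@dataclass
class EnrollmentRecord:
    weight_kg: float
    length_cm: float
    width_cm: float
    height_cm: float
    label_text: str = ""


class EnrollmentProcessor:
    """Combines a weight module, a dimension-capture module, and an optional
    character-recognition module into a single enrollment record."""

    def __init__(self, scale, camera, ocr=None):
        self.scale = scale    # weight module: exposes read_kg()
        self.camera = camera  # video camera: exposes capture() -> frame
        self.ocr = ocr        # optional recognition module: read_text(frame)

    def enroll(self) -> EnrollmentRecord:
        weight = self.scale.read_kg()
        frame = self.camera.capture()
        length, width, height = self._measure(frame)
        text = self.ocr.read_text(frame) if self.ocr else ""
        return EnrollmentRecord(weight, length, width, height, text)

    @staticmethod
    def _measure(frame):
        # Placeholder: a real dimension-capture module would segment the
        # package in the frame (or read a depth image) and convert pixel
        # extents to centimetres using camera calibration.
        return 30.0, 20.0, 15.0


class FakeScale:
    def read_kg(self):
        return 2.4


class FakeCamera:
    def capture(self):
        return None


print(EnrollmentProcessor(FakeScale(), FakeCamera()).enroll())
```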
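
Publication number 20140035805 describes markerless tracking that associates blobs in sensor images with hand tracks and classifies each blob as a pose. The following toy sketch, assuming blobs arrive as centroid/area/perimeter measurements and using nearest-centroid association plus a simple circularity test, illustrates that idea; HandTracker and its methods are hypothetical names, and the classifier is far simpler than anything in the application.

```python
import math


class HandTracker:
    """Toy markerless tracker: associates detected blobs with existing hand
    tracks by nearest centroid, then classifies each blob's shape into a
    coarse pose label."""

    def __init__(self, max_jump=80.0):
        self.tracks = {}          # track_id -> last centroid (x, y)
        self.next_id = 0
        self.max_jump = max_jump  # max per-frame centroid motion, in pixels

    def update(self, blobs):
        """blobs: dicts with 'centroid' (x, y), 'area', and 'perimeter'."""
        results = []
        for blob in blobs:
            track_id = self._associate(blob["centroid"])
            results.append((track_id, self._classify(blob)))
        return results

    def _associate(self, centroid):
        best_id, best_d = None, self.max_jump
        for tid, prev in self.tracks.items():
            d = math.dist(centroid, prev)
            if d < best_d:
                best_id, best_d = tid, d
        if best_id is None:          # no nearby track: start a new one
            best_id = self.next_id
            self.next_id += 1
        self.tracks[best_id] = centroid
        return best_id

    @staticmethod
    def _classify(blob):
        # Circularity = 4*pi*area / perimeter^2 is near 1 for a compact,
        # fist-like blob and lower for an open hand with spread fingers.
        circularity = 4 * math.pi * blob["area"] / blob["perimeter"] ** 2
        return "fist" if circularity > 0.7 else "open-hand"


tracker = HandTracker()
frame_blobs = [{"centroid": (320.0, 240.0), "area": 5200.0, "perimeter": 300.0}]
print(tracker.update(frame_blobs))  # e.g. [(0, 'fist')]
```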