Patents by Inventor Paul Yarin

Paul Yarin has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20190286243
    Abstract: Embodiments described herein include a system comprising a processor coupled to display devices, sensors, remote client devices, and computer applications. The computer applications orchestrate content of the remote client devices simultaneously across the display devices and the remote client devices, and allow simultaneous control of the display devices. The simultaneous control includes automatically detecting a gesture of at least one object from gesture data received via the sensors. The detecting comprises identifying the gesture using only the gesture data. The computer applications translate the gesture to a gesture signal, and control the display devices in response to the gesture signal.
    Type: Application
    Filed: June 4, 2019
    Publication date: September 19, 2019
    Inventors: Kwindla Hultman Kramer, John Underkoffler, Carlton Sparrell, Navjot Singh, Kate Hollenbach, Paul Yarin
  • Publication number: 20190272043
    Abstract: Embodiments described herein include a system comprising a processor coupled to display devices, sensors, remote client devices, and computer applications. The computer applications orchestrate content of the remote client devices simultaneously across the display devices and the remote client devices, and allow simultaneous control of the display devices. The simultaneous control includes automatically detecting a gesture of at least one object from gesture data received via the sensors. The detecting comprises identifying the gesture using only the gesture data. The computer applications translate the gesture to a gesture signal, and control the display devices in response to the gesture signal.
    Type: Application
    Filed: May 21, 2019
    Publication date: September 5, 2019
    Inventors: David Minnen, Paul Yarin
  • Patent number: 10353483
    Abstract: Embodiments described herein include a system comprising a processor coupled to display devices, sensors, remote client devices, and computer applications. The computer applications orchestrate content of the remote client devices simultaneously across the display devices and the remote client devices, and allow simultaneous control of the display devices. The simultaneous control includes automatically detecting a gesture of at least one object from gesture data received via the sensors. The detecting comprises identifying the gesture using only the gesture data. The computer applications translate the gesture to a gesture signal, and control the display devices in response to the gesture signal.
    Type: Grant
    Filed: August 1, 2018
    Date of Patent: July 16, 2019
    Assignee: Oblong Industries, Inc.
    Inventors: Kwindla Hultman Kramer, John Underkoffler, Carlton Sparrell, Navjot Singh, Kate Hollenbach, Paul Yarin
  • Patent number: 10338693
    Abstract: Embodiments described herein include a system comprising a processor coupled to display devices, sensors, remote client devices, and computer applications. The computer applications orchestrate content of the remote client devices simultaneously across the display devices and the remote client devices, and allow simultaneous control of the display devices. The simultaneous control includes automatically detecting a gesture of at least one object from gesture data received via the sensors. The detecting comprises identifying the gesture using only the gesture data. The computer applications translate the gesture to a gesture signal, and control the display devices in response to the gesture signal.
    Type: Grant
    Filed: April 10, 2018
    Date of Patent: July 2, 2019
    Assignee: Oblong Industries, Inc.
    Inventors: David Minnen, Paul Yarin
  • Patent number: 10255489
    Abstract: An adaptive tracking system for spatial input devices provides real-time tracking of spatial input devices for human-computer interaction in a Spatial Operating Environment (SOE). The components of an SOE include gestural input/output; network-based data representation, transit, and interchange; and spatially conformed display mesh. The SOE comprises a workspace occupied by one or more users, a set of screens which provide the users with visual feedback, and a gestural control system which translates user motions into command inputs. Users perform gestures with body parts and/or physical pointing devices, and the system translates those gestures into actions such as pointing, dragging, selecting, or other direct manipulations. The tracking system provides the requisite data for creating an immersive environment by maintaining a model of the spatial relationships between users, screens, pointing devices, and other physical objects within the workspace.
    Type: Grant
    Filed: March 20, 2018
    Date of Patent: April 9, 2019
    Assignee: Oblong Industries, Inc.
    Inventors: Ambrus Csaszar, Dima Kogan, Paul Yarin
  • Publication number: 20180348883
    Abstract: Embodiments described herein include a system comprising a processor coupled to display devices, sensors, remote client devices, and computer applications. The computer applications orchestrate content of the remote client devices simultaneously across the display devices and the remote client devices, and allow simultaneous control of the display devices. The simultaneous control includes automatically detecting a gesture of at least one object from gesture data received via the sensors. The detecting comprises identifying the gesture using only the gesture data. The computer applications translate the gesture to a gesture signal, and control the display devices in response to the gesture signal.
    Type: Application
    Filed: August 1, 2018
    Publication date: December 6, 2018
    Inventors: Kwindla Hultman Kramer, John Underkoffler, Carlton Sparrell, Navjot Singh, Kate Hollenbach, Paul Yarin
  • Publication number: 20180299966
    Abstract: Embodiments described herein include a system comprising a processor coupled to display devices, sensors, remote client devices, and computer applications. The computer applications orchestrate content of the remote client devices simultaneously across the display devices and the remote client devices, and allow simultaneous control of the display devices. The simultaneous control includes automatically detecting a gesture of at least one object from gesture data received via the sensors. The detecting comprises identifying the gesture using only the gesture data. The computer applications translate the gesture to a gesture signal, and control the display devices in response to the gesture signal.
    Type: Application
    Filed: April 10, 2018
    Publication date: October 18, 2018
    Inventors: David Minnen, Paul Yarin
  • Patent number: 10067571
    Abstract: Embodiments described herein include a system comprising a processor coupled to display devices, sensors, remote client devices, and computer applications. The computer applications orchestrate content of the remote client devices simultaneously across the display devices and the remote client devices, and allow simultaneous control of the display devices. The simultaneous control includes automatically detecting a gesture of at least one object from gesture data received via the sensors. The detecting comprises identifying the gesture using only the gesture data. The computer applications translate the gesture to a gesture signal, and control the display devices in response to the gesture signal.
    Type: Grant
    Filed: December 15, 2017
    Date of Patent: September 4, 2018
    Assignee: Oblong Industries, Inc.
    Inventors: Kwindla Hultman Kramer, John Underkoffler, Carlton Sparrell, Navjot Singh, Kate Hollenbach, Paul Yarin
  • Publication number: 20180218205
    Abstract: An adaptive tracking system for spatial input devices provides real-time tracking of spatial input devices for human-computer interaction in a Spatial Operating Environment (SOE). The components of an SOE include gestural input/output; network-based data representation, transit, and interchange; and spatially conformed display mesh. The SOE comprises a workspace occupied by one or more users, a set of screens which provide the users with visual feedback, and a gestural control system which translates user motions into command inputs. Users perform gestures with body parts and/or physical pointing devices, and the system translates those gestures into actions such as pointing, dragging, selecting, or other direct manipulations. The tracking system provides the requisite data for creating an immersive environment by maintaining a model of the spatial relationships between users, screens, pointing devices, and other physical objects within the workspace.
    Type: Application
    Filed: March 20, 2018
    Publication date: August 2, 2018
    Inventors: Ambrus Csaszar, Dima Kogan, Paul Yarin
  • Publication number: 20180203520
    Abstract: Embodiments described herein include a system comprising a processor coupled to display devices, sensors, remote client devices, and computer applications. The computer applications orchestrate content of the remote client devices simultaneously across at least one of the display devices and the remote client devices, and allow simultaneous control of the display devices. The simultaneous control includes automatically detecting a gesture of at least one object from gesture data received via the sensors. The gesture data is absolute three-space location data of an instantaneous state of the at least one object at a point in time and space. The detecting comprises aggregating the gesture data, and identifying the gesture using only the gesture data. The computer applications translate the gesture to a gesture signal, and control at least one of the display devices and the remote client devices in response to the gesture signal.
    Type: Application
    Filed: March 12, 2018
    Publication date: July 19, 2018
    Inventors: Kwindla Hultman Kramer, John Underkoffler, Carlton Sparrell, Navjot Singh, Kate Hollenbach, Paul Yarin
  • Patent number: 9990046
    Abstract: Embodiments described herein include a system comprising a processor coupled to display devices, sensors, remote client devices, and computer applications. The computer applications orchestrate content of the remote client devices simultaneously across the display devices and the remote client devices, and allow simultaneous control of the display devices. The simultaneous control includes automatically detecting a gesture of at least one object from gesture data received via the sensors. The detecting comprises identifying the gesture using only the gesture data. The computer applications translate the gesture to a gesture signal, and control the display devices in response to the gesture signal.
    Type: Grant
    Filed: March 7, 2016
    Date of Patent: June 5, 2018
    Assignee: Oblong Industries, Inc.
    Inventors: David Minnen, Paul Yarin
  • Patent number: 9984285
    Abstract: An adaptive tracking system for spatial input devices provides real-time tracking of spatial input devices for human-computer interaction in a Spatial Operating Environment (SOE). The components of an SOE include gestural input/output; network-based data representation, transit, and interchange; and spatially conformed display mesh. The SOE comprises a workspace occupied by one or more users, a set of screens which provide the users with visual feedback, and a gestural control system which translates user motions into command inputs. Users perform gestures with body parts and/or physical pointing devices, and the system translates those gestures into actions such as pointing, dragging, selecting, or other direct manipulations. The tracking system provides the requisite data for creating an immersive environment by maintaining a model of the spatial relationships between users, screens, pointing devices, and other physical objects within the workspace.
    Type: Grant
    Filed: July 7, 2017
    Date of Patent: May 29, 2018
    Assignee: Oblong Industries, Inc.
    Inventors: Ambrus Csaszar, Dima Kogan, Paul Yarin
  • Patent number: 9952673
    Abstract: Embodiments described herein include a system comprising a processor coupled to display devices, sensors, remote client devices, and computer applications. The computer applications orchestrate content of the remote client devices simultaneously across at least one of the display devices and the remote client devices, and allow simultaneous control of the display devices. The simultaneous control includes automatically detecting a gesture of at least one object from gesture data received via the sensors. The gesture data is absolute three-space location data of an instantaneous state of the at least one object at a point in time and space. The detecting comprises aggregating the gesture data, and identifying the gesture using only the gesture data. The computer applications translate the gesture to a gesture signal, and control at least one of the display devices and the remote client devices in response to the gesture signal.
    Type: Grant
    Filed: October 8, 2013
    Date of Patent: April 24, 2018
    Assignee: Oblong Industries, Inc.
    Inventors: Kwindla Hultman Kramer, John Underkoffler, Carlton Sparrell, Navjot Singh, Kate Hollenbach, Paul Yarin
  • Publication number: 20180107281
    Abstract: Embodiments described herein include a system comprising a processor coupled to display devices, sensors, remote client devices, and computer applications. The computer applications orchestrate content of the remote client devices simultaneously across the display devices and the remote client devices, and allow simultaneous control of the display devices. The simultaneous control includes automatically detecting a gesture of at least one object from gesture data received via the sensors. The detecting comprises identifying the gesture using only the gesture data. The computer applications translate the gesture to a gesture signal, and control the display devices in response to the gesture signal.
    Type: Application
    Filed: December 15, 2017
    Publication date: April 19, 2018
    Inventors: Kwindla Hultman Kramer, John Underkoffler, Carlton Sparrell, Navjot Singh, Kate Hollenbach, Paul Yarin
  • Patent number: 9880635
    Abstract: Embodiments described herein include a system comprising a processor coupled to display devices, sensors, remote client devices, and computer applications. The computer applications orchestrate content of the remote client devices simultaneously across the display devices and the remote client devices, and allow simultaneous control of the display devices. The simultaneous control includes automatically detecting a gesture of at least one object from gesture data received via the sensors. The detecting comprises identifying the gesture using only the gesture data. The computer applications translate the gesture to a gesture signal, and control the display devices in response to the gesture signal.
    Type: Grant
    Filed: April 28, 2017
    Date of Patent: January 30, 2018
    Assignee: Oblong Industries, Inc.
    Inventors: Kwindla Hultman Kramer, John Underkoffler, Carlton Sparrell, Navjot Singh, Kate Hollenbach, Paul Yarin
  • Publication number: 20170308743
    Abstract: An adaptive tracking system for spatial input devices provides real-time tracking of spatial input devices for human-computer interaction in a Spatial Operating Environment (SOE). The components of an SOE include gestural input/output; network-based data representation, transit, and interchange; and spatially conformed display mesh. The SOE comprises a workspace occupied by one or more users, a set of screens which provide the users with visual feedback, and a gestural control system which translates user motions into command inputs. Users perform gestures with body parts and/or physical pointing devices, and the system translates those gestures into actions such as pointing, dragging, selecting, or other direct manipulations. The tracking system provides the requisite data for creating an immersive environment by maintaining a model of the spatial relationships between users, screens, pointing devices, and other physical objects within the workspace.
    Type: Application
    Filed: July 7, 2017
    Publication date: October 26, 2017
    Inventors: Ambrus Csaszar, Dima Kogan, Paul Yarin
  • Publication number: 20170300122
    Abstract: Embodiments described herein include a system comprising a processor coupled to display devices, sensors, remote client devices, and computer applications. The computer applications orchestrate content of the remote client devices simultaneously across the display devices and the remote client devices, and allow simultaneous control of the display devices. The simultaneous control includes automatically detecting a gesture of at least one object from gesture data received via the sensors. The detecting comprises identifying the gesture using only the gesture data. The computer applications translate the gesture to a gesture signal, and control the display devices in response to the gesture signal.
    Type: Application
    Filed: April 28, 2017
    Publication date: October 19, 2017
    Inventors: Kwindla Hultman Kramer, John Underkoffler, Carlton Sparrell, Navjot Singh, Kate Hollenbach, Paul Yarin
  • Patent number: 9740293
    Abstract: Embodiments described herein include a system comprising a processor coupled to display devices, sensors, remote client devices, and computer applications. The computer applications orchestrate content of the remote client devices simultaneously across the display devices and the remote client devices, and allow simultaneous control of the display devices. The simultaneous control includes automatically detecting a gesture of at least one object from gesture data received via the sensors. The detecting comprises identifying the gesture using only the gesture data. The computer applications translate the gesture to a gesture signal, and control the display devices in response to the gesture signal.
    Type: Grant
    Filed: December 31, 2013
    Date of Patent: August 22, 2017
    Assignee: Oblong Industries, Inc.
    Inventors: Kwindla Hultman Kramer, John Underkoffler, Carlton Sparrell, Navjot Singh, Kate Hollenbach, Paul Yarin
  • Patent number: 9740922
    Abstract: An adaptive tracking system for spatial input devices provides real-time tracking of spatial input devices for human-computer interaction in a Spatial Operating Environment (SOE). The components of an SOE include gestural input/output; network-based data representation, transit, and interchange; and spatially conformed display mesh. The SOE comprises a workspace occupied by one or more users, a set of screens which provide the users with visual feedback, and a gestural control system which translates user motions into command inputs. Users perform gestures with body parts and/or physical pointing devices, and the system translates those gestures into actions such as pointing, dragging, selecting, or other direct manipulations. The tracking system provides the requisite data for creating an immersive environment by maintaining a model of the spatial relationships between users, screens, pointing devices, and other physical objects within the workspace.
    Type: Grant
    Filed: January 27, 2015
    Date of Patent: August 22, 2017
    Assignee: Oblong Industries, Inc.
    Inventors: Ambrus Csaszar, Dima Kogan, Paul Yarin
  • Publication number: 20170038846
    Abstract: Embodiments described herein include a system comprising a processor coupled to display devices, sensors, remote client devices, and computer applications. The computer applications orchestrate content of the remote client devices simultaneously across the display devices and the remote client devices, and allow simultaneous control of the display devices. The simultaneous control includes automatically detecting a gesture of at least one object from gesture data received via the sensors. The detecting comprises identifying the gesture using only the gesture data. The computer applications translate the gesture to a gesture signal, and control the display devices in response to the gesture signal.
    Type: Application
    Filed: March 7, 2016
    Publication date: February 9, 2017
    Inventors: David Minnen, Paul Yarin
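The abstracts above repeatedly describe the same pipeline: sensors emit gesture data (absolute three-space positions of a tracked object over time), the system aggregates that data, identifies a gesture from it alone, translates the gesture to a gesture signal, and controls the display devices in response. The sketch below illustrates that flow only; all names (GestureFrame, detect_gesture, translate, the swipe labels, the display identifiers) are hypothetical and do not come from the patents themselves.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class GestureFrame:
    """Absolute three-space location of one tracked object at one instant."""
    t: float  # timestamp in seconds
    x: float
    y: float
    z: float

def detect_gesture(frames: List[GestureFrame]) -> Optional[str]:
    """Identify a gesture using only the aggregated gesture data.

    A toy classifier: compare net displacement along x and y and
    label the motion as one of four swipe directions.
    """
    if len(frames) < 2:
        return None
    dx = frames[-1].x - frames[0].x
    dy = frames[-1].y - frames[0].y
    if abs(dx) >= abs(dy):
        return "swipe-right" if dx > 0 else "swipe-left"
    return "swipe-up" if dy > 0 else "swipe-down"

def translate(gesture: str) -> dict:
    """Translate a detected gesture into a gesture signal addressed
    to the display devices (hypothetical identifiers)."""
    return {"signal": gesture, "targets": ["display-0", "display-1"]}

# A short rightward motion: net dx = 0.4 dominates net dy = 0.1.
frames = [GestureFrame(0.0, 0.0, 0.0, 1.0), GestureFrame(0.5, 0.4, 0.1, 1.0)]
gesture = detect_gesture(frames)
if gesture is not None:
    print(translate(gesture))  # the gesture signal sent to the displays
```

The split between detect_gesture and translate mirrors the claim language, which separates identifying the gesture from the gesture data and translating it into the signal that drives the displays.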