Patents Assigned to Oblong Industries, Inc.
  • Patent number: 9984285
    Abstract: An adaptive tracking system for spatial input devices provides real-time tracking of spatial input devices for human-computer interaction in a Spatial Operating Environment (SOE). The components of an SOE include gestural input/output; network-based data representation, transit, and interchange; and spatially conformed display mesh. The SOE comprises a workspace occupied by one or more users, a set of screens which provide the users with visual feedback, and a gestural control system which translates user motions into command inputs. Users perform gestures with body parts and/or physical pointing devices, and the system translates those gestures into actions such as pointing, dragging, selecting, or other direct manipulations. The tracking system provides the requisite data for creating an immersive environment by maintaining a model of the spatial relationships between users, screens, pointing devices, and other physical objects within the workspace.
    Type: Grant
    Filed: July 7, 2017
    Date of Patent: May 29, 2018
    Assignee: Oblong Industries, Inc.
    Inventors: Ambrus Csaszar, Dima Kogan, Paul Yarin
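    Example (illustrative sketch, not from the patent): the tracking model described in the abstract of 9984285 above can be pictured as a small data structure that holds per-object poses in a shared workspace frame and resolves where a pointing device's ray meets a screen plane. All names here (Workspace, Pose, Screen, intersect) and the ray-plane math are assumptions for illustration, not the patented method.

      from dataclasses import dataclass, field
      from typing import Dict, Optional, Tuple

      Vec3 = Tuple[float, float, float]

      def _dot(a: Vec3, b: Vec3) -> float:
          return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]

      @dataclass
      class Pose:
          position: Vec3    # location in workspace coordinates (meters)
          forward: Vec3     # unit vector the object is aimed along

      @dataclass
      class Screen:
          origin: Vec3      # a point on the screen plane
          normal: Vec3      # unit normal of the screen plane

      @dataclass
      class Workspace:
          objects: Dict[str, Pose] = field(default_factory=dict)    # users, wands, ...
          screens: Dict[str, Screen] = field(default_factory=dict)

          def update(self, name: str, pose: Pose) -> None:
              """The real-time tracker pushes fresh pose estimates here."""
              self.objects[name] = pose

          def intersect(self, device: str, screen: str) -> Optional[Vec3]:
              """Cast the device's pointing ray against a screen plane."""
              pose, scr = self.objects[device], self.screens[screen]
              denom = _dot(pose.forward, scr.normal)
              if abs(denom) < 1e-9:              # ray parallel to the screen
                  return None
              diff = tuple(o - p for o, p in zip(scr.origin, pose.position))
              t = _dot(diff, scr.normal) / denom
              if t < 0:                          # screen is behind the device
                  return None
              return tuple(p + t * f for p, f in zip(pose.position, pose.forward))

      # Usage: one pointing device aimed at a wall display three meters away.
      ws = Workspace(screens={"wall": Screen(origin=(0, 0, 3), normal=(0, 0, -1))})
      ws.update("wand-1", Pose(position=(0, 1.5, 0), forward=(0, 0, 1)))
      print(ws.intersect("wand-1", "wall"))      # -> (0.0, 1.5, 3.0)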
  • Patent number: 9971807
    Abstract: A multi-process interactive system is described. The system includes numerous processes running on a processing device. The processes include separable program execution contexts of application programs, such that each application program comprises at least one process. The system translates events of each process into data capsules. A data capsule includes an application-independent representation of event data of an event and state information of the process originating the content of the data capsule. The system transfers these data capsules, as messages, into pools or repositories. Each process operates as a recognizing process, where the recognizing process recognizes in the pools data capsules comprising content that corresponds to an interactive function of the recognizing process and/or an identification of the recognizing process. The recognizing process retrieves recognized data capsules from the pools and executes processing appropriate to contents of the recognized data capsules.
    Type: Grant
    Filed: June 8, 2015
    Date of Patent: May 15, 2018
    Assignee: Oblong Industries, Inc.
    Inventors: Kwindla Hultman Kramer, John S. Underkoffler
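    Example (illustrative sketch): the capsule-and-pool flow in the abstract of 9971807 — processes translate events into application-independent data capsules, deposit them into pools, and recognizing processes retrieve only the capsules whose content matches their interests — is sketched below. The DataCapsule/Pool names, the dictionary fields, and the matching rule are hypothetical, not the patent's or any Oblong API.

      from dataclasses import dataclass, field
      from typing import Any, Callable, Dict, List

      @dataclass
      class DataCapsule:
          content: Dict[str, Any]        # application-independent event data
          origin_state: Dict[str, Any]   # state of the process that produced it

      @dataclass
      class Pool:
          capsules: List[DataCapsule] = field(default_factory=list)

          def deposit(self, capsule: DataCapsule) -> None:
              self.capsules.append(capsule)

          def recognize(self, match: Callable[[DataCapsule], bool]) -> List[DataCapsule]:
              """Return and remove the capsules this recognizing process matches."""
              hits = [c for c in self.capsules if match(c)]
              self.capsules = [c for c in self.capsules if c not in hits]
              return hits

      # One process deposits a pointer-motion event; another recognizes pointer events.
      pool = Pool()
      pool.deposit(DataCapsule(content={"kind": "pointer-move", "xy": (120, 45)},
                               origin_state={"pid": 4242, "app": "viewer"}))
      pool.deposit(DataCapsule(content={"kind": "key-press", "key": "a"},
                               origin_state={"pid": 5151, "app": "editor"}))

      for cap in pool.recognize(lambda c: c.content.get("kind") == "pointer-move"):
          print("handling", cap.content, "from", cap.origin_state["app"])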
  • Patent number: 9952673
    Abstract: Embodiments described herein include a system comprising a processor coupled to display devices, sensors, remote client devices, and computer applications. The computer applications orchestrate content of the remote client devices simultaneously across at least one of the display devices and the remote client devices, and allow simultaneous control of the display devices. The simultaneous control includes automatically detecting a gesture of at least one object from gesture data received via the sensors. The gesture data is absolute three-space location data of an instantaneous state of the at least one object at a point in time and space. The detecting comprises aggregating the gesture data, and identifying the gesture using only the gesture data. The computer applications translate the gesture to a gesture signal, and control at least one of the display devices and the remote client devices in response to the gesture signal.
    Type: Grant
    Filed: October 8, 2013
    Date of Patent: April 24, 2018
    Assignee: Oblong Industries, Inc.
    Inventors: Kwindla Hultman Kramer, John Underkoffler, Carlton Sparrell, Navjot Singh, Kate Hollenbach, Paul Yarin
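    Example (illustrative sketch): the detect-translate-control chain in the abstract of 9952673 can be illustrated as below: absolute three-space samples are aggregated, a gesture is identified from that data alone, translated into a gesture signal, and dispatched to a display controller. The swipe rule, thresholds, and function names are placeholder assumptions, not the patented detection method.

      from dataclasses import dataclass
      from typing import List, Optional

      @dataclass
      class Sample:
          t: float       # timestamp (seconds)
          xyz: tuple     # absolute three-space location (meters)

      def identify(samples: List[Sample]) -> Optional[str]:
          """Placeholder recognizer: a fast horizontal sweep counts as a 'swipe'."""
          if len(samples) < 2:
              return None
          dx = samples[-1].xyz[0] - samples[0].xyz[0]
          dt = samples[-1].t - samples[0].t
          if dt > 0 and abs(dx / dt) > 0.5:      # faster than 0.5 m/s along x
              return "swipe-right" if dx > 0 else "swipe-left"
          return None

      def translate(gesture: str) -> dict:
          """Turn the identified gesture into an application-neutral gesture signal."""
          return {"signal": gesture, "target": "displays"}

      def control(signal: dict) -> None:
          """Stand-in for the layer that drives display devices / remote clients."""
          print("dispatching", signal)

      samples = [Sample(0.00, (0.10, 1.2, 0.5)),
                 Sample(0.10, (0.25, 1.2, 0.5)),
                 Sample(0.20, (0.42, 1.2, 0.5))]
      gesture = identify(samples)
      if gesture:
          control(translate(gesture))   # -> dispatching {'signal': 'swipe-right', ...}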
  • Patent number: 9933852
    Abstract: A multi-process interactive system is described. The system includes numerous processes running on a processing device. The processes include separable program execution contexts of application programs, such that each application program comprises at least one process. The system translates events of each process into data capsules. A data capsule includes an application-independent representation of event data of an event and state information of the process originating the content of the data capsule. The system transfers these data capsules, as messages, into pools or repositories. Each process operates as a recognizing process, where the recognizing process recognizes in the pools data capsules comprising content that corresponds to an interactive function of the recognizing process and/or an identification of the recognizing process. The recognizing process retrieves recognized data capsules from the pools and executes processing appropriate to contents of the recognized data capsules.
    Type: Grant
    Filed: May 1, 2015
    Date of Patent: April 3, 2018
    Assignee: Oblong Industries, Inc.
    Inventors: Kwindla Hultman Kramer, John S. Underkoffler
  • Patent number: 9910497
    Abstract: Systems and methods are described for controlling a remote system. The controlling involves detecting a gesture of a body from gesture data received via a detector. The gesture comprises an instantaneous state of the body. The gesture data is absolute three-space location data of an instantaneous state of the body at a point in time and physical space. The detecting aggregates only the gesture data of the gesture at an instant in time and identifies the gesture using the gesture data. The controlling then translates the gesture to a gesture signal, and controls a component of the remote system in response to the gesture signal.
    Type: Grant
    Filed: September 10, 2009
    Date of Patent: March 6, 2018
    Assignee: Oblong Industries, Inc.
    Inventors: Kwindla Hultman Kramer, Tom White, Mattie Ruth Kramer
  • Patent number: 9880635
    Abstract: Embodiments described herein include a system comprising a processor coupled to display devices, sensors, remote client devices, and computer applications. The computer applications orchestrate content of the remote client devices simultaneously across the display devices and the remote client devices, and allow simultaneous control of the display devices. The simultaneous control includes automatically detecting a gesture of at least one object from gesture data received via the sensors. The detecting comprises identifying the gesture using only the gesture data. The computer applications translate the gesture to a gesture signal, and control the display devices in response to the gesture signal.
    Type: Grant
    Filed: April 28, 2017
    Date of Patent: January 30, 2018
    Assignee: Oblong Industries, Inc.
    Inventors: Kwindla Hultman Kramer, John Underkoffler, Carlton Sparrell, Navjot Singh, Kate Hollenbach, Paul Yarin
  • Patent number: 9823747
    Abstract: A system comprising an input device includes a detector coupled to a processor. The detector detects an orientation of the input device. The input device has multiple modal orientations corresponding to the orientation. The modal orientations correspond to multiple input modes of a gestural control system. The detector is coupled to the gestural control system and automatically controls selection of an input mode in response to the orientation.
    Type: Grant
    Filed: May 27, 2010
    Date of Patent: November 21, 2017
    Assignee: Oblong Industries, Inc.
    Inventors: John S. Underkoffler, Carlton Sparrell, Harald Belker, Kwindla Hultman Kramer
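    Example (illustrative sketch): the orientation-to-mode mapping in the abstract of 9823747 might look like the sketch below, where a detected device orientation is classified into one of several modal orientations and the gestural control system switches input mode accordingly. The three modes, the gravity-based classifier, and the 0.5 thresholds are hypothetical.

      # Modal orientations: which way the device's long axis points selects the mode.
      MODES = {
          "up":   "pointer",   # held upright: coarse pointing
          "flat": "text",      # held flat: text / touch entry
          "down": "idle",      # pointed at the floor: ignore input
      }

      def classify_orientation(gravity_z: float) -> str:
          """gravity_z: device-frame z component of gravity, in [-1, 1] (assumed sensor)."""
          if gravity_z > 0.5:
              return "up"
          if gravity_z < -0.5:
              return "down"
          return "flat"

      def select_mode(gravity_z: float) -> str:
          """Called on every orientation update; the control system switches mode automatically."""
          return MODES[classify_orientation(gravity_z)]

      for gz in (0.9, 0.1, -0.8):
          print(f"gravity_z={gz:+.1f} -> mode={select_mode(gz)}")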
  • Patent number: 9804902
    Abstract: Embodiments described herein include mechanisms for encapsulating data that needs to be shared between or across processes. These mechanisms include slawx (plural of “slaw”), proteins, and pools. Generally, slawx provide the lowest level of data definition for inter-process exchange, proteins provide mid-level structure and hooks for querying and filtering, and pools provide for high-level organization and access semantics. Slawx include a mechanism for efficient, platform-independent data representation and access. Proteins provide a data encapsulation and transport scheme using slawx as the payload. Pools provide structured and flexible aggregation, ordering, filtering, and distribution of proteins within a process, among local processes, across a network between remote or distributed processes, and via longer-term (e.g., on-disk) storage.
    Type: Grant
    Filed: March 26, 2013
    Date of Patent: October 31, 2017
    Assignee: Oblong Industries, Inc.
    Inventors: Kwindla Hultman Kramer, John S. Underkoffler
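    Example (illustrative sketch): the three layers named in the abstract of 9804902 — slawx for low-level data definition, proteins for mid-level structure with query/filter hooks, pools for high-level aggregation — are sketched below. The field names (tags, payload), the JSON serialization stand-in, and the filtering predicate are assumptions for illustration, not the patented encoding.

      import json
      from dataclasses import dataclass, field
      from typing import Any, Callable, List

      @dataclass
      class Slaw:
          """Lowest level: one self-describing datum, serializable anywhere."""
          value: Any

          def to_bytes(self) -> bytes:
              # JSON stands in for the patent's efficient platform-independent format.
              return json.dumps(self.value).encode("utf-8")

      @dataclass
      class Protein:
          """Mid level: a transportable capsule with query/filter hooks (tags)
          and a payload of slawx."""
          tags: List[str]
          payload: List[Slaw]

      @dataclass
      class Pool:
          """High level: ordered aggregation of proteins with simple filtering."""
          proteins: List[Protein] = field(default_factory=list)

          def deposit(self, p: Protein) -> None:
              self.proteins.append(p)            # deposit order is preserved

          def filter(self, pred: Callable[[Protein], bool]) -> List[Protein]:
              return [p for p in self.proteins if pred(p)]

      # A gesture event travels as a protein whose payload is slawx.
      pool = Pool()
      pool.deposit(Protein(tags=["gesture", "pointing"],
                           payload=[Slaw({"hand": "right", "xyz": [0.2, 1.4, 0.9]})]))
      pool.deposit(Protein(tags=["heartbeat"], payload=[Slaw("ok")]))

      for p in pool.filter(lambda pr: "gesture" in pr.tags):
          print([s.to_bytes() for s in p.payload])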
  • Patent number: 9778751
    Abstract: Systems and methods are described for gesture-based control using three-dimensional information extracted over an extended depth of field. The system comprises a plurality of optical detectors coupled to at least one processor. The optical detectors image a body. At least two optical detectors of the plurality of optical detectors comprise wavefront coding cameras. The processor automatically detects a gesture of the body, wherein the gesture comprises an instantaneous state of the body. The detecting comprises aggregating gesture data of the gesture at an instant in time. The gesture data includes focus-resolved data of the body within a depth of field of the imaging system. The processor translates the gesture to a gesture signal, and uses the gesture signal to control a component coupled to the processor.
    Type: Grant
    Filed: June 8, 2015
    Date of Patent: October 3, 2017
    Assignee: Oblong Industries, Inc.
    Inventors: Pierre St. Hilaire, John S. Underkoffler
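    Example (illustrative sketch): one way to picture the aggregation step in the abstract of 9778751 is a confidence-weighted fusion of per-camera, focus-resolved estimates of a body point at a single instant, as below. The CameraEstimate structure, the weighting scheme, and the numbers are assumptions; the wavefront-coding optics themselves are not modeled.

      from dataclasses import dataclass
      from typing import List, Tuple

      @dataclass
      class CameraEstimate:
          xyz: Tuple[float, float, float]   # estimated body-point location (meters)
          confidence: float                 # higher when the point is well resolved

      def fuse(estimates: List[CameraEstimate]) -> Tuple[float, float, float]:
          """Confidence-weighted average of per-camera estimates at one instant."""
          total = sum(e.confidence for e in estimates)
          return tuple(sum(e.confidence * e.xyz[i] for e in estimates) / total
                       for i in range(3))

      instant = [CameraEstimate((0.20, 1.40, 0.90), confidence=0.8),
                 CameraEstimate((0.22, 1.38, 0.95), confidence=0.4)]
      print(fuse(instant))   # one aggregated sample handed to the gesture detector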
  • Patent number: 9779131
    Abstract: Systems and methods for detecting, representing, and interpreting three-space input are described. Embodiments of the system, in the context of a Spatial Operating Environment (SOE), process low-level data from a plurality of sources of spatial tracking data, analyze these semantically uncorrelated spatiotemporal data, and generate high-level gestural events according to dynamically configurable implicit and explicit gesture descriptions. The events produced are suitable for consumption by interactive systems, and the embodiments provide one or more mechanisms for controlling and effecting event distribution to these consumers. The embodiments further provide to the consumers of their events a facility for transforming gestural events among arbitrary spatial and semantic frames of reference.
    Type: Grant
    Filed: May 13, 2014
    Date of Patent: October 3, 2017
    Assignee: Oblong Industries, Inc.
    Inventors: John S. Underkoffler, Kwindla Hultman Kramer
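    Example (illustrative sketch): the last capability in the abstract of 9779131 — transforming gestural events among spatial frames of reference — can be pictured as a rigid-transform change of coordinates applied to an event's position, as below. The Frame/GestureEvent names and the identity-rotation screen frame are assumptions for illustration.

      from dataclasses import dataclass
      from typing import Tuple

      Vec3 = Tuple[float, float, float]

      @dataclass
      class Frame:
          """A frame of reference: 3x3 rotation (rows) plus an origin offset."""
          rotation: Tuple[Vec3, Vec3, Vec3]
          origin: Vec3

          def from_world(self, p: Vec3) -> Vec3:
              """World coordinates -> this frame's coordinates."""
              d = tuple(pi - oi for pi, oi in zip(p, self.origin))
              return tuple(sum(r[i] * d[i] for i in range(3)) for r in self.rotation)

      @dataclass
      class GestureEvent:
          kind: str        # a high-level event such as "point" or "push"
          position: Vec3   # expressed in the world / room frame

      # A screen-aligned frame one meter in front of the room origin along +z.
      screen_frame = Frame(rotation=((1, 0, 0), (0, 1, 0), (0, 0, 1)),
                           origin=(0.0, 0.0, 1.0))

      ev = GestureEvent(kind="point", position=(0.3, 1.5, 2.0))
      print(ev.kind, "in screen frame:", screen_frame.from_world(ev.position))
      # -> point in screen frame: (0.3, 1.5, 1.0)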
  • Patent number: 9740293
    Abstract: Embodiments described herein include a system comprising a processor coupled to display devices, sensors, remote client devices, and computer applications. The computer applications orchestrate content of the remote client devices simultaneously across the display devices and the remote client devices, and allow simultaneous control of the display devices. The simultaneous control includes automatically detecting a gesture of at least one object from gesture data received via the sensors. The detecting comprises identifying the gesture using only the gesture data. The computer applications translate the gesture to a gesture signal, and control the display devices in response to the gesture signal.
    Type: Grant
    Filed: December 31, 2013
    Date of Patent: August 22, 2017
    Assignee: Oblong Industries, Inc.
    Inventors: Kwindla Hultman Kramer, John Underkoffler, Carlton Sparrell, Navjot Singh, Kate Hollenbach, Paul Yarin
  • Patent number: 9740922
    Abstract: An adaptive tracking system for spatial input devices provides real-time tracking of spatial input devices for human-computer interaction in a Spatial Operating Environment (SOE). The components of an SOE include gestural input/output; network-based data representation, transit, and interchange; and spatially conformed display mesh. The SOE comprises a workspace occupied by one or more users, a set of screens which provide the users with visual feedback, and a gestural control system which translates user motions into command inputs. Users perform gestures with body parts and/or physical pointing devices, and the system translates those gestures into actions such as pointing, dragging, selecting, or other direct manipulations. The tracking system provides the requisite data for creating an immersive environment by maintaining a model of the spatial relationships between users, screens, pointing devices, and other physical objects within the workspace.
    Type: Grant
    Filed: January 27, 2015
    Date of Patent: August 22, 2017
    Assignee: Oblong Industries, Inc.
    Inventors: Ambrus Csaszar, Dima Kogan, Paul Yarin
  • Patent number: 9684380
    Abstract: Embodiments described herein include a system comprising a processor coupled to display devices, sensors, remote client devices, and computer applications. The computer applications orchestrate content of the remote client devices simultaneously across the display devices and the remote client devices, and allow simultaneous control of the display devices. The simultaneous control includes automatically detecting a gesture of at least one object from gesture data received via the sensors. The detecting comprises identifying the gesture using only the gesture data. The computer applications translate the gesture to a gesture signal, and control the display devices in response to the gesture signal.
    Type: Grant
    Filed: November 12, 2013
    Date of Patent: June 20, 2017
    Assignee: Oblong Industries, Inc.
    Inventors: Kwindla Hultman Kramer, John Underkoffler
  • Patent number: 9606630
    Abstract: The system provides a gestural interface to various visually presented elements, presented on a display screen or screens. A gestural vocabulary includes ‘instantaneous’ commands, in which forming one or both hands into the appropriate ‘pose’ results in an immediate, one-time action; and ‘spatial’ commands, in which the operator either refers directly to elements on the screen by way of literal ‘pointing’ gestures or performs navigational maneuvers by way of relative or “offset” gestures. The system contemplates the ability to identify the user's hands in the form of a glove or gloves with certain indicia provided thereon, or any suitable means for providing recognizable indicia on a user's hands or body parts. A system of cameras can detect the position, orientation, and movement of the user's hands and translate that information into executable commands.
    Type: Grant
    Filed: September 9, 2014
    Date of Patent: March 28, 2017
    Assignee: Oblong Industries, Inc.
    Inventors: John S. Underkoffler, Kevin T. Parent
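    Example (illustrative sketch): the two-part vocabulary in the abstract of 9606630 — ‘instantaneous’ pose commands that fire one-time actions versus ‘spatial’ pointing and offset gestures — can be pictured as a small dispatcher, as below. The pose names, command table, and HandState fields are hypothetical, not the patented vocabulary.

      from dataclasses import dataclass
      from typing import Optional, Tuple

      # Instantaneous commands: a recognized hand pose fires a one-time action.
      INSTANTANEOUS = {
          "fist":      "select",
          "open-palm": "reset-view",
      }

      @dataclass
      class HandState:
          pose: str                                    # pose name from the recognizer
          pointing_at: Optional[Tuple[float, float]]   # on-screen target, if pointing
          offset: Tuple[float, float] = (0.0, 0.0)     # relative motion since last frame

      def dispatch(hand: HandState) -> str:
          if hand.pose in INSTANTANEOUS:               # one-time action, fires once
              return f"command:{INSTANTANEOUS[hand.pose]}"
          if hand.pointing_at is not None:             # literal pointing at an element
              return f"point:{hand.pointing_at}"
          return f"pan:{hand.offset}"                  # relative / offset navigation

      print(dispatch(HandState(pose="fist", pointing_at=None)))          # command:select
      print(dispatch(HandState(pose="point", pointing_at=(0.4, 0.6))))   # point:(0.4, 0.6)
      print(dispatch(HandState(pose="flat", pointing_at=None, offset=(0.1, 0.0))))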
  • Patent number: 9495228
    Abstract: A multi-process interactive system is described. The system includes numerous processes running on a processing device. The processes include separable program execution contexts of application programs, such that each application program comprises at least one process. The system translates events of each process into data capsules. A data capsule includes an application-independent representation of event data of an event and state information of the process originating the content of the data capsule. The system transfers these data capsules, as messages, into pools or repositories. Each process operates as a recognizing process, where the recognizing process recognizes in the pools data capsules comprising content that corresponds to an interactive function of the recognizing process and/or an identification of the recognizing process. The recognizing process retrieves recognized data capsules from the pools and executes processing appropriate to contents of the recognized data capsules.
    Type: Grant
    Filed: February 5, 2013
    Date of Patent: November 15, 2016
    Assignee: Oblong Industries, Inc.
    Inventors: Kwindla Hultman Kramer, John S. Underkoffler
  • Patent number: 9495013
    Abstract: Systems and methods for detecting, representing, and interpreting three-space input are described. Embodiments of the system, in the context of a Spatial Operating Environment (SOE), process low-level data from a plurality of sources of spatial tracking data, analyze these semantically uncorrelated spatiotemporal data, and generate high-level gestural events according to dynamically configurable implicit and explicit gesture descriptions. The events produced are suitable for consumption by interactive systems, and the embodiments provide one or more mechanisms for controlling and effecting event distribution to these consumers. The embodiments further provide to the consumers of their events a facility for transforming gestural events among arbitrary spatial and semantic frames of reference.
    Type: Grant
    Filed: March 25, 2014
    Date of Patent: November 15, 2016
    Assignee: Oblong Industries, Inc.
    Inventors: John S. Underkoffler, Kwindla Hultman Kramer
  • Patent number: 9471149
    Abstract: Systems and methods are described for navigating through a data space. The navigating comprises detecting a gesture of a body from gesture data received via a detector. The gesture data is absolute three-space location data of an instantaneous state of the body at a point in time and physical space. The detecting comprises identifying the gesture using the gesture data. The navigating comprises translating the gesture to a gesture signal, and navigating through the data space in response to the gesture signal. The data space is a data-representational space comprising a dataset represented in the physical space.
    Type: Grant
    Filed: September 17, 2013
    Date of Patent: October 18, 2016
    Assignee: Oblong Industries, Inc.
    Inventors: John S. Underkoffler, Kwindla H. Kramer
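    Example (illustrative sketch): the navigation described in the abstract of 9471149 can be pictured as mapping the offset of an absolute hand position from a neutral position onto pan and zoom of a dataset view, as below. The View structure, gains, and axis assignments are assumptions for illustration, not the patented mapping.

      from dataclasses import dataclass
      from typing import Tuple

      @dataclass
      class View:
          center: Tuple[float, float] = (0.0, 0.0)   # where in the dataset we look
          zoom: float = 1.0                          # magnification factor

      def navigate(view: View, hand_xyz: Tuple[float, float, float],
                   neutral_xyz: Tuple[float, float, float]) -> View:
          """Offset from the neutral hand position drives panning; moving the hand
          toward or away from the screen (z) drives zoom."""
          dx = hand_xyz[0] - neutral_xyz[0]
          dy = hand_xyz[1] - neutral_xyz[1]
          dz = hand_xyz[2] - neutral_xyz[2]
          pan_gain, zoom_gain = 2.0, 1.5
          return View(center=(view.center[0] + pan_gain * dx,
                              view.center[1] + pan_gain * dy),
                      zoom=max(0.1, view.zoom * (1.0 + zoom_gain * dz)))

      view = View()
      view = navigate(view, hand_xyz=(0.15, 1.42, 0.55), neutral_xyz=(0.0, 1.4, 0.6))
      print(view)   # panned right and up a little, zoomed out slightly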
  • Patent number: 9471147
    Abstract: Systems and methods are described for navigating through a data space. The navigating comprises detecting a gesture of a body from gesture data received via a detector. The gesture data is absolute three-space location data of an instantaneous state of the body at a point in time and physical space. The detecting comprises identifying the gesture using the gesture data. The navigating comprises translating the gesture to a gesture signal, and navigating through the data space in response to the gesture signal. The data space is a data-representational space comprising a dataset represented in the physical space.
    Type: Grant
    Filed: September 9, 2013
    Date of Patent: October 18, 2016
    Assignee: Oblong Industries, Inc.
    Inventors: John S. Underkoffler, Kwindla H. Kramer
  • Patent number: 9471148
    Abstract: Systems and methods are described for navigating through a data space. The navigating comprises detecting a gesture of a body from gesture data received via a detector. The gesture data is absolute three-space location data of an instantaneous state of the body at a point in time and physical space. The detecting comprises identifying the gesture using the gesture data. The navigating comprises translating the gesture to a gesture signal, and navigating through the data space in response to the gesture signal. The data space is a data-representational space comprising a dataset represented in the physical space.
    Type: Grant
    Filed: September 17, 2013
    Date of Patent: October 18, 2016
    Assignee: Oblong Industries, Inc.
    Inventors: John S. Underkoffler, Kwindla H. Kramer
  • Patent number: 9317128
    Abstract: Embodiments described herein include a system comprising a processor coupled to display devices, sensors, remote client devices, and computer applications. The computer applications orchestrate content of the remote client devices simultaneously across the display devices and the remote client devices, and allow simultaneous control of the display devices. The simultaneous control includes automatically detecting a gesture of at least one object from gesture data received via the sensors. The detecting comprises identifying the gesture using only the gesture data. The computer applications translate the gesture to a gesture signal, and control the display devices in response to the gesture signal.
    Type: Grant
    Filed: March 17, 2014
    Date of Patent: April 19, 2016
    Assignee: Oblong Industries, Inc.
    Inventors: David Minnen, Paul Yarin