Patents Assigned to Oblong Industries, Inc.
  • Patent number: 9075441
    Abstract: Systems and methods are described for gesture-based control using three-dimensional information extracted over an extended depth of field. The system comprises a plurality of optical detectors coupled to at least one processor. The optical detectors image a body. At least two optical detectors of the plurality of optical detectors comprise wavefront coding cameras. The processor automatically detects a gesture of the body, wherein the gesture comprises an instantaneous state of the body. The detecting comprises aggregating gesture data of the gesture at an instant in time. The gesture data includes focus-resolved data of the body within a depth of field of the imaging system. The processor translates the gesture to a gesture signal, and uses the gesture signal to control a component coupled to the processor.
    Type: Grant
    Filed: April 2, 2009
    Date of Patent: July 7, 2015
    Assignee: Oblong Industries, Inc.
    Inventors: Pierre St. Hilaire, John S. Underkoffler
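The abstract describes aggregating focus-resolved gesture data from multiple wavefront-coding cameras at a single instant. As a loose illustration of one such aggregation step, this sketch fuses per-camera depth estimates weighted by confidence; the function name and the weighting scheme are assumptions for illustration, not taken from the patent:

```python
def fuse_depth_estimates(estimates):
    """Combine per-camera (depth, confidence) estimates for one instant.

    Hypothetical sketch: a confidence-weighted average, standing in for the
    patent's aggregation of focus-resolved data across optical detectors.
    """
    total_weight = sum(conf for _, conf in estimates)
    if total_weight == 0:
        return None  # no usable estimate at this instant
    return sum(depth * conf for depth, conf in estimates) / total_weight
```

For example, two equally confident cameras reporting depths of 2.0 and 4.0 units would fuse to 3.0.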
  • Patent number: 9063801
Abstract: A multi-process interactive system is described. The system includes numerous processes running on a processing device. The processes include separable program execution contexts of application programs, such that each application program comprises at least one process. The system translates events of each process into data capsules. A data capsule includes an application-independent representation of event data of an event and state information of the process originating the content of the data capsule. The system transfers the data capsules into pools or repositories. Each process operates as a recognizing process, where the recognizing process recognizes in the pools data capsules comprising content that corresponds to an interactive function of the recognizing process and/or an identification of the recognizing process. The recognizing process retrieves recognized data capsules from the pools and executes processing appropriate to contents of the recognized data capsules.
    Type: Grant
    Filed: October 14, 2009
    Date of Patent: June 23, 2015
    Assignee: Oblong Industries, Inc.
    Inventors: Kwindla H. Kramer, John S. Underkoffler
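The "data capsule and pool" architecture this abstract describes — producers deposit application-independent event capsules into a shared repository, and each recognizing process retrieves only the capsules it understands — resembles a tuple-space or publish/subscribe pattern. A minimal sketch of that pattern follows; all class and field names here are hypothetical, not from the patent:

```python
from dataclasses import dataclass

@dataclass
class DataCapsule:
    event_type: str     # application-independent description of the event
    payload: dict       # event data
    origin_state: dict  # state of the process that originated the content

class Pool:
    """A shared repository that decouples event producers from recognizers."""
    def __init__(self):
        self._capsules = []

    def deposit(self, capsule):
        self._capsules.append(capsule)

    def retrieve(self, recognizer):
        """Remove and return the capsules this recognizer claims to understand."""
        matched = [c for c in self._capsules if recognizer.recognizes(c)]
        self._capsules = [c for c in self._capsules if c not in matched]
        return matched

class RecognizingProcess:
    def __init__(self, handled_types):
        self.handled_types = set(handled_types)

    def recognizes(self, capsule):
        return capsule.event_type in self.handled_types

# Two processes deposit events; a recognizer claims only what it handles.
pool = Pool()
pool.deposit(DataCapsule("pointer-move", {"x": 4, "y": 2}, {"pid": 101}))
pool.deposit(DataCapsule("key-press", {"key": "a"}, {"pid": 102}))

mouse_proc = RecognizingProcess(["pointer-move"])
claimed = pool.retrieve(mouse_proc)  # the key-press capsule stays in the pool
```

The design point is the decoupling: the depositing process never names its consumers, so any number of recognizing processes can be added without changing the producers.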
  • Patent number: 9052970
Abstract: A multi-process interactive system is described. The system includes numerous processes running on a processing device. The processes include separable program execution contexts of application programs, such that each application program comprises at least one process. The system translates events of each process into data capsules. A data capsule includes an application-independent representation of event data of an event and state information of the process originating the content of the data capsule. The system transfers the data capsules into pools or repositories. Each process operates as a recognizing process, where the recognizing process recognizes in the pools data capsules comprising content that corresponds to an interactive function of the recognizing process and/or an identification of the recognizing process. The recognizing process retrieves recognized data capsules from the pools and executes processing appropriate to contents of the recognized data capsules.
    Type: Grant
    Filed: October 14, 2009
    Date of Patent: June 9, 2015
    Assignee: Oblong Industries, Inc.
    Inventors: Kwindla H. Kramer, John S. Underkoffler
  • Patent number: 8941589
    Abstract: An adaptive tracking system for spatial input devices provides real-time tracking of spatial input devices for human-computer interaction in a Spatial Operating Environment (SOE). The components of an SOE include gestural input/output; network-based data representation, transit, and interchange; and spatially conformed display mesh. The SOE comprises a workspace occupied by one or more users, a set of screens which provide the users with visual feedback, and a gestural control system which translates user motions into command inputs. Users perform gestures with body parts and/or physical pointing devices, and the system translates those gestures into actions such as pointing, dragging, selecting, or other direct manipulations. The tracking system provides the requisite data for creating an immersive environment by maintaining a model of the spatial relationships between users, screens, pointing devices, and other physical objects within the workspace.
    Type: Grant
    Filed: June 25, 2012
    Date of Patent: January 27, 2015
    Assignee: Oblong Industries, Inc.
    Inventors: Ambrus Csaszar, Dima Kogan, Paul Yarin
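A core use of the spatial model this abstract describes is relating pointing devices to screens. A minimal geometric sketch of that relationship — intersecting a pointing ray with a screen plane — is shown below; the helper names are hypothetical and a real tracking system maintains far richer state:

```python
def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def sub(a, b):
    return tuple(x - y for x, y in zip(a, b))

def add(a, b):
    return tuple(x + y for x, y in zip(a, b))

def scale(v, t):
    return tuple(x * t for x in v)

def intersect_ray_with_screen(origin, direction, screen_point, screen_normal):
    """Return where a pointing ray hits a screen plane, or None if it misses.

    The plane is given by a point on the screen and its normal; the ray by the
    pointer's tracked position and pointing direction.
    """
    denom = dot(direction, screen_normal)
    if abs(denom) < 1e-9:
        return None  # ray is parallel to the screen
    t = dot(sub(screen_point, origin), screen_normal) / denom
    if t < 0:
        return None  # screen is behind the pointer
    return add(origin, scale(direction, t))
```

For example, a pointer at (0, 0, 2) aimed straight at a screen in the z = 0 plane hits the screen at the origin.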
  • Patent number: 8941590
    Abstract: An adaptive tracking system for spatial input devices provides real-time tracking of spatial input devices for human-computer interaction in a Spatial Operating Environment (SOE). The components of an SOE include gestural input/output; network-based data representation, transit, and interchange; and spatially conformed display mesh. The SOE comprises a workspace occupied by one or more users, a set of screens which provide the users with visual feedback, and a gestural control system which translates user motions into command inputs. Users perform gestures with body parts and/or physical pointing devices, and the system translates those gestures into actions such as pointing, dragging, selecting, or other direct manipulations. The tracking system provides the requisite data for creating an immersive environment by maintaining a model of the spatial relationships between users, screens, pointing devices, and other physical objects within the workspace.
    Type: Grant
    Filed: June 25, 2012
    Date of Patent: January 27, 2015
    Assignee: Oblong Industries, Inc.
    Inventors: Ambrus Csaszar, Dima Kogan, Paul Yarin
  • Patent number: 8941588
    Abstract: Systems and methods for initializing real-time, vision-based hand tracking systems are described. The systems and methods for initializing the vision-based hand tracking systems image a body and receive gesture data that is absolute three-space data of an instantaneous state of the body at a point in time and space, and at least one of determine an orientation of the body using an appendage of the body and track the body using at least one of the orientation and the gesture data.
    Type: Grant
    Filed: March 26, 2012
    Date of Patent: January 27, 2015
    Assignee: Oblong Industries, Inc.
    Inventor: David Minnen
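The abstract's step of determining "an orientation of the body using an appendage of the body" can be loosely illustrated as follows. Treating orientation as the planar angle from a palm center to an extended fingertip is an assumption made for this sketch, not the patent's method:

```python
import math

def hand_orientation(palm_center, fingertip):
    """Estimate hand orientation from an appendage (illustrative sketch).

    Returns the angle, in degrees, of the vector from the palm center to an
    extended fingertip, measured in the image plane.
    """
    dx = fingertip[0] - palm_center[0]
    dy = fingertip[1] - palm_center[1]
    return math.degrees(math.atan2(dy, dx))
```

A fingertip directly above the palm center yields 90 degrees; once initialized, such an orientation estimate could seed the tracker alongside the three-space gesture data.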
  • Patent number: 8896531
    Abstract: Systems and methods for initializing real-time, vision-based hand tracking systems are described. The systems and methods for initializing the vision-based hand tracking systems image a body and receive gesture data that is absolute three-space data of an instantaneous state of the body at a point in time and space, and at least one of determine an orientation of the body using an appendage of the body and track the body using at least one of the orientation and the gesture data.
    Type: Grant
    Filed: March 26, 2012
    Date of Patent: November 25, 2014
    Assignee: Oblong Industries, Inc.
    Inventor: David Minnen
  • Patent number: 8890813
    Abstract: Embodiments include vision-based interfaces performing hand or object tracking and shape recognition. The vision-based interface receives data from a sensor, and the data corresponds to an object detected by the sensor. The interface generates images from each frame of the data, and the images represent numerous resolutions. The interface detects blobs in the images and tracks the object by associating the blobs with tracks of the object. The interface detects a pose of the object by classifying each blob as corresponding to one of a number of object shapes. The interface controls a gestural interface in response to the pose and the tracks.
    Type: Grant
    Filed: May 6, 2013
    Date of Patent: November 18, 2014
    Assignee: Oblong Industries, Inc.
    Inventor: David Minnen
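Two building blocks from this abstract — images "represent[ing] numerous resolutions" and detecting "blobs in the images" — can be sketched with a simple averaging pyramid and connected-component blob detection. This is a generic illustration of those standard vision techniques, not the patent's specific pipeline:

```python
def downsample(image):
    """Halve resolution by 2x2 averaging (one pyramid level down)."""
    h, w = len(image), len(image[0])
    return [[(image[2 * r][2 * c] + image[2 * r][2 * c + 1] +
              image[2 * r + 1][2 * c] + image[2 * r + 1][2 * c + 1]) / 4
             for c in range(w // 2)] for r in range(h // 2)]

def build_pyramid(image, levels):
    """Generate the multi-resolution image stack from one frame."""
    pyramid = [image]
    for _ in range(levels - 1):
        image = downsample(image)
        pyramid.append(image)
    return pyramid

def detect_blobs(image, threshold):
    """Group above-threshold pixels into 4-connected blobs (flood fill)."""
    h, w = len(image), len(image[0])
    seen = [[False] * w for _ in range(h)]
    blobs = []
    for r in range(h):
        for c in range(w):
            if image[r][c] > threshold and not seen[r][c]:
                stack, blob = [(r, c)], []
                seen[r][c] = True
                while stack:
                    y, x = stack.pop()
                    blob.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < h and 0 <= nx < w and not seen[ny][nx]
                                and image[ny][nx] > threshold):
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                blobs.append(blob)
    return blobs

# A 4x4 frame with one bright 2x2 region yields a single 4-pixel blob.
frame = [[0, 0, 0, 0],
         [0, 9, 9, 0],
         [0, 9, 9, 0],
         [0, 0, 0, 0]]
blobs = detect_blobs(frame, 5)
pyramid = build_pyramid(frame, 2)
```

In the patented interface, blobs found this way would then be associated with object tracks and classified into poses; that recognition stage is not sketched here.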
  • Patent number: 8866740
Abstract: The system provides a gestural interface to various visually presented elements, presented on a display screen or screens. A gestural vocabulary includes ‘instantaneous’ commands, in which forming one or both hands into the appropriate ‘pose’ results in an immediate, one-time action; and ‘spatial’ commands, in which the operator either refers directly to elements on the screen by way of literal ‘pointing’ gestures or performs navigational maneuvers by way of relative or “offset” gestures. The system contemplates the ability to identify the user's hands in the form of a glove or gloves with certain indicia provided thereon, or any suitable means for providing recognizable indicia on a user's hands or body parts. A system of cameras can detect the position, orientation, and movement of the user's hands and translate that information into executable commands.
    Type: Grant
    Filed: October 2, 2009
    Date of Patent: October 21, 2014
    Assignee: Oblong Industries, Inc.
    Inventors: John S. Underkoffler, Kevin T. Parent
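The abstract's split between ‘instantaneous’ pose commands and relative ‘offset’ gestures can be sketched as a small dispatch table. The specific poses, command names, and pan gain below are invented for illustration; only the two-way split comes from the abstract:

```python
# Hypothetical pose vocabulary: a pose triggers a one-time command.
INSTANTANEOUS_COMMANDS = {
    "fist": "grab",
    "open-palm": "release",
    "l-shape": "snapshot",
}

def interpret(pose, hand_delta=None):
    """Map a recognized pose to an action.

    A known pose yields an immediate, one-time command; the 'point' pose with
    a hand displacement yields a relative pan, scaled by an assumed gain.
    """
    if pose in INSTANTANEOUS_COMMANDS:
        return ("command", INSTANTANEOUS_COMMANDS[pose])
    if pose == "point" and hand_delta is not None:
        dx, dy = hand_delta
        return ("pan", (dx * 10, dy * 10))  # gain maps hand motion to pixels
    return ("ignore", None)
```

So `interpret("fist")` fires a one-time "grab", while `interpret("point", (1, 2))` produces a pan proportional to the hand's offset.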
  • Patent number: 8830168
Abstract: The system provides a gestural interface to various visually presented elements, presented on a display screen or screens. A gestural vocabulary includes ‘instantaneous’ commands, in which forming one or both hands into the appropriate ‘pose’ results in an immediate, one-time action; and ‘spatial’ commands, in which the operator either refers directly to elements on the screen by way of literal ‘pointing’ gestures or performs navigational maneuvers by way of relative or “offset” gestures. The system contemplates the ability to identify the user's hands in the form of a glove or gloves with certain indicia provided thereon, or any suitable means for providing recognizable indicia on a user's hands or body parts. A system of cameras can detect the position, orientation, and movement of the user's hands and translate that information into executable commands.
    Type: Grant
    Filed: October 2, 2009
    Date of Patent: September 9, 2014
    Assignee: Oblong Industries, Inc.
    Inventors: John S. Underkoffler, Kevin T. Parent
  • Publication number: 20140240231
    Abstract: Systems and methods are described for detecting an event of a source device, and generating at least one data sequence comprising device event data specifying the event and state information of the event. The device event data and state information are type-specific data having a type corresponding to an application of the source device. A data capsule is formed to include the at least one data sequence. The data capsule has a data structure comprising an application-independent representation of the at least one data sequence. The systems and methods detect poses and motion of an object, translate the poses and motion into a control signal using a gesture notation, and control a computer application using the control signal. The systems and methods automatically detect a gesture of a body, translate the gesture to a gesture signal, and control a component coupled to a computer in response to the gesture signal.
    Type: Application
    Filed: October 28, 2013
    Publication date: August 28, 2014
    Applicant: OBLONG INDUSTRIES, INC.
    Inventor: David MINNEN
  • Publication number: 20140195988
    Abstract: Embodiments described herein includes a system comprising a processor coupled to display devices, sensors, remote client devices, and computer applications. The computer applications orchestrate content of the remote client devices simultaneously across at least one of the display devices and the remote client devices, and allow simultaneous control of the display devices. The simultaneous control includes automatically detecting a gesture of at least one object from gesture data received via the sensors. The gesture data is absolute three-space location data of an instantaneous state of the at least one object at a point in time and space. The detecting comprises aggregating the gesture data, and identifying the gesture using only the gesture data. The computer applications translate the gesture to a gesture signal, and control at least one of the display devices and the remote client devices in response to the gesture signal.
    Type: Application
    Filed: October 8, 2013
    Publication date: July 10, 2014
    Applicant: OBLONG INDUSTRIES, INC.
    Inventors: Kwindla Hultman KRAMER, John UNDERKOFFLER, Carlton SPARRELL, Navjot SINGH, Kate HOLLENBACH, Paul YARIN
  • Publication number: 20140145929
    Abstract: Embodiments include vision-based interfaces performing hand or object tracking and shape recognition. The vision-based interface receives data from a sensor, and the data corresponds to an object detected by the sensor. The interface generates images from each frame of the data, and the images represent numerous resolutions. The interface detects blobs in the images and tracks the object by associating the blobs with tracks of the object. The interface detects a pose of the object by classifying each blob as corresponding to one of a number of object shapes. The interface controls a gestural interface in response to the pose and the tracks.
    Type: Application
    Filed: May 6, 2013
    Publication date: May 29, 2014
    Applicant: OBLONG INDUSTRIES, INC.
    Inventor: David MINNEN
  • Patent number: 8723795
    Abstract: Systems and methods for detecting, representing, and interpreting three-space input are described. Embodiments of the system, in the context of an SOE, process low-level data from a plurality of sources of spatial tracking data and analyze these semantically uncorrelated spatiotemporal data and generate high-level gestural events according to dynamically configurable implicit and explicit gesture descriptions. The events produced are suitable for consumption by interactive systems, and the embodiments provide one or more mechanisms for controlling and effecting event distribution to these consumers. The embodiments further provide to the consumers of its events a facility for transforming gestural events among arbitrary spatial and semantic frames of reference.
    Type: Grant
    Filed: May 4, 2010
    Date of Patent: May 13, 2014
    Assignee: Oblong Industries, Inc.
    Inventors: John S. Underkoffler, Kwindla Hultman Kramer
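The "facility for transforming gestural events among arbitrary spatial and semantic frames of reference" might, in its simplest spatial case, look like re-expressing an event's position relative to another frame. The translate-and-scale form below is an assumption made for the sketch; the patented facility is described as far more general:

```python
def transform_event(event, frame_offset, frame_scale):
    """Re-express a gestural event's position in another spatial frame.

    Illustrative sketch: translate by the target frame's offset, then scale.
    All other event fields pass through unchanged.
    """
    x, y, z = event["position"]
    ox, oy, oz = frame_offset
    return {**event, "position": ((x - ox) * frame_scale,
                                  (y - oy) * frame_scale,
                                  (z - oz) * frame_scale)}

# An event produced in room coordinates, re-expressed for a consumer whose
# frame sits at (1, 1, 1) with a 2x scale.
event = {"kind": "point", "position": (2.0, 4.0, 6.0)}
local = transform_event(event, (1.0, 1.0, 1.0), 2.0)
```

The consumer-side transform matters because, per the abstract, the same high-level event may be delivered to many interactive systems, each with its own frame.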
  • Patent number: 8681098
    Abstract: Systems and methods for detecting, representing, and interpreting three-space input are described. Embodiments of the system, in the context of an SOE, process low-level data from a plurality of sources of spatial tracking data and analyze these semantically uncorrelated spatiotemporal data and generate high-level gestural events according to dynamically configurable implicit and explicit gesture descriptions. The events produced are suitable for consumption by interactive systems, and the embodiments provide one or more mechanisms for controlling and effecting event distribution to these consumers. The embodiments further provide to the consumers of its events a facility for transforming gestural events among arbitrary spatial and semantic frames of reference.
    Type: Grant
    Filed: May 4, 2010
    Date of Patent: March 25, 2014
    Assignee: Oblong Industries, Inc.
    Inventors: John S. Underkoffler, Kwindla Hultman Kramer
  • Patent number: 8669939
    Abstract: A system comprising an input device includes a detector coupled to a processor. The detector detects an orientation of the input device. The input device has multiple modal orientations corresponding to the orientation. The modal orientations correspond to multiple input modes of a gestural control system. The detector is coupled to the gestural control system and automatically controls selection of an input mode in response to the orientation.
    Type: Grant
    Filed: May 27, 2010
    Date of Patent: March 11, 2014
    Assignee: Oblong Industries, Inc.
    Inventors: John S. Underkoffler, Carlton Sparrell, Harald Belker, Kwindla Hultman Kramer
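The idea of "multiple modal orientations" selecting among input modes can be sketched as a mapping from a detected device roll angle to a mode. The four modes and the 90-degree sectors below are invented for illustration; the abstract only says that orientation selects the mode automatically:

```python
def select_input_mode(roll_degrees):
    """Map device roll to one of several modal orientations (sketch).

    Hypothetical scheme: the circle is split into four 90-degree sectors,
    each bound to a different input mode of the gestural control system.
    """
    roll = roll_degrees % 360
    if roll < 45 or roll >= 315:
        return "pointer"
    if roll < 135:
        return "scroll"
    if roll < 225:
        return "text"
    return "media"
```

Holding the device level keeps it a pointer; rolling it a quarter turn silently switches it to, say, a scroll controller — no explicit mode switch needed.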
  • Patent number: 8665213
    Abstract: A system comprising an input device includes a detector coupled to a processor. The detector detects an orientation of the input device. The input device has multiple modal orientations corresponding to the orientation. The modal orientations correspond to multiple input modes of a gestural control system. The detector is coupled to the gestural control system and automatically controls selection of an input mode in response to the orientation.
    Type: Grant
    Filed: May 27, 2010
    Date of Patent: March 4, 2014
    Assignee: Oblong Industries, Inc.
    Inventors: John S. Underkoffler, Carlton Sparrell, Harald Belker, Kwindla Hultman Kramer
  • Patent number: 8537111
    Abstract: Systems and methods are described for navigating through a data space. The navigating comprises detecting a gesture of a body from gesture data received via a detector. The gesture data is absolute three-space location data of an instantaneous state of the body at a point in time and physical space. The detecting comprises identifying the gesture using the gesture data. The navigating comprises translating the gesture to a gesture signal, and navigating through the data space in response to the gesture signal. The data space is a data-representational space comprising a dataset represented in the physical space.
    Type: Grant
    Filed: September 3, 2009
    Date of Patent: September 17, 2013
    Assignee: Oblong Industries, Inc.
    Inventors: John S. Underkoffler, Kwindla H. Kramer
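Navigating a data space from absolute three-space gesture data, as this abstract describes, can be sketched by mapping hand displacement from a calibration origin onto pan and zoom of a view. The gains and the pan/zoom axis assignments below are assumptions for the sketch:

```python
def navigate(view, hand_pos, origin):
    """Update a view over a data space from an absolute hand position.

    Illustrative mapping: lateral displacement from a calibration origin pans
    the view (x, y); displacement along z zooms, doubling per unit pushed in.
    """
    dx = hand_pos[0] - origin[0]
    dy = hand_pos[1] - origin[1]
    dz = hand_pos[2] - origin[2]
    return {
        "pan_x": view["pan_x"] + dx * 100,   # assumed gain: units to pixels
        "pan_y": view["pan_y"] + dy * 100,
        "zoom": view["zoom"] * (2 ** dz),
    }

view = navigate({"pan_x": 0.0, "pan_y": 0.0, "zoom": 1.0},
                hand_pos=(0.1, 0.2, 1.0), origin=(0.0, 0.0, 0.0))
```

Because the input is absolute three-space location rather than relative motion, the same hand position always maps to the same view state — the "dataset represented in the physical space" of the abstract.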
  • Patent number: 8537112
    Abstract: Systems and methods are described for navigating through a data space. The navigating comprises detecting a gesture of a body from gesture data received via a detector. The gesture data is absolute three-space location data of an instantaneous state of the body at a point in time and physical space. The detecting comprises identifying the gesture using the gesture data. The navigating comprises translating the gesture to a gesture signal, and navigating through the data space in response to the gesture signal. The data space is a data-representational space comprising a dataset represented in the physical space.
    Type: Grant
    Filed: September 3, 2009
    Date of Patent: September 17, 2013
    Assignee: Oblong Industries, Inc.
    Inventors: John S. Underkoffler, Kwindla H. Kramer
  • Patent number: 8531396
    Abstract: Systems and methods are described for navigating through a data space. The navigating comprises detecting a gesture of a body from gesture data received via a detector. The gesture data is absolute three-space location data of an instantaneous state of the body at a point in time and physical space. The detecting comprises identifying the gesture using the gesture data. The navigating comprises translating the gesture to a gesture signal, and navigating through the data space in response to the gesture signal. The data space is a data-representational space comprising a dataset represented in the physical space.
    Type: Grant
    Filed: September 3, 2009
    Date of Patent: September 10, 2013
    Assignee: Oblong Industries, Inc.
    Inventors: John S. Underkoffler, Kwindla H. Kramer