Patents by Inventor Kwindla Hultman Kramer
Kwindla Hultman Kramer has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 10990454
Abstract: A multi-process interactive system is described. The system includes numerous processes running on a processing device. The processes include separable program execution contexts of application programs, such that each application program comprises at least one process. The system translates events of each process into data capsules. A data capsule includes an application-independent representation of event data of an event and state information of the process originating the content of the data capsule. The system transfers the data messages into pools or repositories. Each process operates as a recognizing process, where the recognizing process recognizes in the pools data capsules comprising content that corresponds to an interactive function of the recognizing process and/or an identification of the recognizing process. The recognizing process retrieves recognized data capsules from the pools and executes processing appropriate to contents of the recognized data capsules.
Type: Grant
Filed: January 22, 2019
Date of Patent: April 27, 2021
Assignee: Oblong Industries, Inc.
Inventors: Kwindla Hultman Kramer, John S. Underkoffler
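The capsule-and-pool flow this abstract describes can be pictured with a short sketch. The Python below is a minimal illustration under assumed names (DataCapsule, Pool, deposit, recognize), not the patented implementation: processes translate events into application-independent capsules, deposit them into a shared pool, and each recognizing process retrieves only the capsules whose content matches its interactive function or its identification.

```python
# Minimal sketch of the data-capsule / pool pattern; all names and structures
# here are illustrative assumptions, not the patented implementation.
from dataclasses import dataclass


@dataclass
class DataCapsule:
    event_data: dict           # application-independent representation of the event
    origin_state: dict         # state information of the originating process
    target_function: str = ""  # interactive function the capsule corresponds to
    target_process: str = ""   # optional identification of a recognizing process


class Pool:
    """Shared repository that processes deposit capsules into and read from."""

    def __init__(self):
        self._capsules: list[DataCapsule] = []

    def deposit(self, capsule: DataCapsule) -> None:
        self._capsules.append(capsule)

    def recognize(self, process_id: str, functions: set[str]) -> list[DataCapsule]:
        # A recognizing process retrieves the capsules addressed to it or to
        # one of the interactive functions it implements.
        matched = [c for c in self._capsules
                   if c.target_process == process_id or c.target_function in functions]
        for capsule in matched:
            self._capsules.remove(capsule)
        return matched


# Usage: a tracker process deposits a capsule; a viewer process recognizes it.
pool = Pool()
pool.deposit(DataCapsule(event_data={"kind": "point", "x": 0.4, "y": 0.7},
                         origin_state={"process": "tracker", "frame": 1182},
                         target_function="cursor-move"))
for capsule in pool.recognize(process_id="viewer", functions={"cursor-move"}):
    print("viewer handles", capsule.event_data)
```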
-
Patent number: 10824238
Abstract: Embodiments described herein include a system comprising a processor coupled to display devices, sensors, remote client devices, and computer applications. The computer applications orchestrate content of the remote client devices simultaneously across the display devices and the remote client devices, and allow simultaneous control of the display devices. The simultaneous control includes automatically detecting a gesture of at least one object from gesture data received via the sensors. The detecting comprises identifying the gesture using only the gesture data. The computer applications translate the gesture to a gesture signal, and control the display devices in response to the gesture signal.
Type: Grant
Filed: March 28, 2019
Date of Patent: November 3, 2020
Assignee: Oblong Industries, Inc.
Inventors: Kwindla Hultman Kramer, John Underkoffler
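The gesture pipeline that this abstract (and the related entries below) outlines can be sketched loosely as follows. The names (detect_gesture, GestureSignal, control_displays) and the toy swipe heuristic are assumptions for illustration, not the claimed method: gesture data from the sensors is used on its own to identify a gesture, the gesture is translated into a gesture signal, and the signal drives the display devices.

```python
# Rough sketch of the sensor-data -> gesture -> gesture-signal -> display flow;
# names and the swipe heuristic are illustrative assumptions only.
from dataclasses import dataclass


@dataclass
class GestureSignal:
    name: str         # e.g. "swipe-left"
    magnitude: float  # normalized strength of the detected gesture


def detect_gesture(samples: list[dict]) -> str | None:
    """Identify a gesture using only the sensor-supplied gesture data."""
    if len(samples) < 2:
        return None
    dx = samples[-1]["x"] - samples[0]["x"]
    return "swipe-left" if dx < -0.3 else "swipe-right" if dx > 0.3 else None


def control_displays(signal: GestureSignal, displays: list[str]) -> None:
    """Control each display device in response to the gesture signal."""
    for display in displays:
        print(f"{display}: applying {signal.name} ({signal.magnitude:.2f})")


samples = [{"x": 0.9, "y": 0.5}, {"x": 0.4, "y": 0.5}]  # hypothetical sensor frames
gesture = detect_gesture(samples)
if gesture:
    signal = GestureSignal(gesture, abs(samples[-1]["x"] - samples[0]["x"]))
    control_displays(signal, displays=["wall-display-1", "wall-display-2"])
```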
-
Patent number: 10739865
Abstract: Embodiments described herein include a system comprising a processor coupled to display devices, sensors, remote client devices, and computer applications. The computer applications orchestrate content of the remote client devices simultaneously across the display devices and the remote client devices, and allow simultaneous control of the display devices. The simultaneous control includes automatically detecting a gesture of at least one object from gesture data received via the sensors. The detecting comprises identifying the gesture using only the gesture data. The computer applications translate the gesture to a gesture signal, and control the display devices in response to the gesture signal.
Type: Grant
Filed: June 4, 2019
Date of Patent: August 11, 2020
Assignee: Oblong Industries, Inc.
Inventors: Kwindla Hultman Kramer, John Underkoffler, Carlton Sparrell, Navjot Singh, Kate Hollenbach, Paul Yarin
-
Publication number: 20200250018
Abstract: Embodiments described herein include mechanisms for encapsulating data that needs to be shared between or across processes. These mechanisms include slawx (plural of “slaw”), proteins, and pools. Generally, slawx provide the lowest-level of data definition for inter-process exchange, proteins provide mid-level structure and hooks for querying and filtering, and pools provide for high-level organization and access semantics. Slawx includes a mechanism for efficient, platform-independent data representation and access. Proteins provide a data encapsulation and transport scheme using slawx as the payload. Pools provide structured and flexible aggregation, ordering, filtering, and distribution of proteins within a process, among local processes, across a network between remote or distributed processes, and via longer term (e.g. on-disk, etc.) storage.
Type: Application
Filed: April 20, 2020
Publication date: August 6, 2020
Inventors: Kwindla Hultman Kramer, John S. Underkoffler
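The three layers this abstract names can be sketched as follows. This is a rough, assumed-name illustration of the layering only, not the patented implementation: slawx are the low-level, platform-independent data units; proteins wrap slawx and carry searchable descrips for querying and filtering; pools aggregate, order, and filter proteins for the processes that share them.

```python
# Illustrative-only sketch of the slawx / proteins / pools layering; the class
# and field names are assumptions, not the patented data formats.
from dataclasses import dataclass
from typing import Any


@dataclass(frozen=True)
class Slaw:
    """Lowest-level unit: a self-describing, platform-independent datum."""
    type_tag: str  # e.g. "string", "float64"
    value: Any


@dataclass
class Protein:
    """Mid-level encapsulation carrying slawx, queryable via its descrips."""
    descrips: list[Slaw]      # searchable description terms
    ingests: dict[str, Slaw]  # named payload data


class Pool:
    """High-level ordered aggregation of proteins shared among processes."""

    def __init__(self):
        self._proteins: list[Protein] = []

    def deposit(self, protein: Protein) -> None:
        self._proteins.append(protein)

    def matching(self, term: str) -> list[Protein]:
        # Filtering hook: return proteins whose descrips include the term.
        return [p for p in self._proteins
                if any(s.value == term for s in p.descrips)]


pool = Pool()
pool.deposit(Protein(descrips=[Slaw("string", "pointing"), Slaw("string", "hand")],
                     ingests={"x": Slaw("float64", 0.42), "y": Slaw("float64", 0.77)}))
for protein in pool.matching("pointing"):
    print({name: slaw.value for name, slaw in protein.ingests.items()})
```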
-
Publication number: 20200241650
Abstract: Embodiments described herein include a system comprising a processor coupled to display devices, sensors, remote client devices, and computer applications. The computer applications orchestrate content of the remote client devices simultaneously across at least one of the display devices and the remote client devices, and allow simultaneous control of the display devices. The simultaneous control includes automatically detecting a gesture of at least one object from gesture data received via the sensors. The gesture data is absolute three-space location data of an instantaneous state of the at least one object at a point in time and space. The detecting comprises aggregating the gesture data, and identifying the gesture using only the gesture data. The computer applications translate the gesture to a gesture signal, and control at least one of the display devices and the remote client devices in response to the gesture signal.
Type: Application
Filed: April 14, 2020
Publication date: July 30, 2020
Inventors: Kwindla Hultman Kramer, John Underkoffler, Carlton Sparrell, Navjot Singh, Kate Hollenbach, Paul Yarin
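This variant adds that each item of gesture data is the absolute three-space location of an object at an instant, and that detection first aggregates those samples before identifying the gesture from the aggregated data alone. A minimal sketch under assumed names (Sample, aggregate, identify), shown only to illustrate that flow:

```python
# Illustrative sketch: absolute three-space samples are aggregated into a
# time-ordered track, and the gesture is identified from that data alone.
from dataclasses import dataclass


@dataclass(frozen=True)
class Sample:
    t: float                         # time of the instantaneous state
    xyz: tuple[float, float, float]  # absolute three-space location


def aggregate(samples: list[Sample]) -> list[Sample]:
    """Aggregate raw samples into a time-ordered track for one object."""
    return sorted(samples, key=lambda s: s.t)


def identify(track: list[Sample]) -> str | None:
    """Identify the gesture using only the aggregated gesture data."""
    if len(track) < 2:
        return None
    dz = track[-1].xyz[2] - track[0].xyz[2]
    return "push" if dz < -0.2 else "pull" if dz > 0.2 else None


track = aggregate([Sample(0.00, (0.1, 0.3, 0.9)),
                   Sample(0.05, (0.1, 0.3, 0.6))])
print(identify(track))  # -> "push"
```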
-
Patent number: 10664327
Abstract: Embodiments described herein include mechanisms for encapsulating data that needs to be shared between or across processes. These mechanisms include slawx (plural of “slaw”), proteins, and pools. Generally, slawx provide the lowest-level of data definition for inter-process exchange, proteins provide mid-level structure and hooks for querying and filtering, and pools provide for high-level organization and access semantics. Slawx includes a mechanism for efficient, platform-independent data representation and access. Proteins provide a data encapsulation and transport scheme using slawx as the payload. Pools provide structured and flexible aggregation, ordering, filtering, and distribution of proteins within a process, among local processes, across a network between remote or distributed processes, and via longer term (e.g. on-disk, etc.) storage.
Type: Grant
Filed: September 14, 2017
Date of Patent: May 26, 2020
Assignee: Oblong Industries, Inc.
Inventors: Kwindla Hultman Kramer, John S. Underkoffler
-
Patent number: 10656724
Abstract: Embodiments described herein include a system comprising a processor coupled to display devices, sensors, remote client devices, and computer applications. The computer applications orchestrate content of the remote client devices simultaneously across at least one of the display devices and the remote client devices, and allow simultaneous control of the display devices. The simultaneous control includes automatically detecting a gesture of at least one object from gesture data received via the sensors. The gesture data is absolute three-space location data of an instantaneous state of the at least one object at a point in time and space. The detecting comprises aggregating the gesture data, and identifying the gesture using only the gesture data. The computer applications translate the gesture to a gesture signal, and control at least one of the display devices and the remote client devices in response to the gesture signal.
Type: Grant
Filed: March 12, 2018
Date of Patent: May 19, 2020
Assignee: Oblong Industries, Inc.
Inventors: Kwindla Hultman Kramer, John Underkoffler, Carlton Sparrell, Navjot Singh, Kate Hollenbach, Paul Yarin
-
Patent number: 10565030
Abstract: A multi-process interactive system is described. The system includes numerous processes running on a processing device. The processes include separable program execution contexts of application programs, such that each application program comprises at least one process. The system translates events of each process into data capsules. A data capsule includes an application-independent representation of event data of an event and state information of the process originating the content of the data capsule. The system transfers the data messages into pools or repositories. Each process operates as a recognizing process, where the recognizing process recognizes in the pools data capsules comprising content that corresponds to an interactive function of the recognizing process and/or an identification of the recognizing process. The recognizing process retrieves recognized data capsules from the pools and executes processing appropriate to contents of the recognized data capsules.
Type: Grant
Filed: September 30, 2016
Date of Patent: February 18, 2020
Assignee: Oblong Industries, Inc.
Inventors: Kwindla Hultman Kramer, John S. Underkoffler
-
Patent number: 10521021
Abstract: Systems and methods for detecting, representing, and interpreting three-space input are described. Embodiments of the system, in the context of an SOE (Spatial Operating Environment), process low-level data from a plurality of sources of spatial tracking data and analyze these semantically uncorrelated spatiotemporal data and generate high-level gestural events according to dynamically configurable implicit and explicit gesture descriptions. The events produced are suitable for consumption by interactive systems, and the embodiments provide one or more mechanisms for controlling and effecting event distribution to these consumers. The embodiments further provide to the consumers of its events a facility for transforming gestural events among arbitrary spatial and semantic frames of reference.
Type: Grant
Filed: February 1, 2019
Date of Patent: December 31, 2019
Assignee: Oblong Industries, Inc.
Inventors: John S. Underkoffler, Kwindla Hultman Kramer
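The final capability this abstract mentions, transforming gestural events among spatial frames of reference, can be illustrated with a small sketch. The Frame and GestureEvent names and the offset-and-scale mapping are assumptions for illustration, not the system's actual transform facility:

```python
# Illustrative sketch of re-expressing a gestural event in a consumer's own
# spatial frame of reference; names and the mapping are assumptions.
from dataclasses import dataclass


@dataclass
class GestureEvent:
    kind: str                             # e.g. "point", produced from tracking data
    position: tuple[float, float, float]  # location in the source frame


@dataclass
class Frame:
    """A spatial frame of reference: an origin plus a uniform scale."""
    origin: tuple[float, float, float]
    scale: float

    def from_world(self, p: tuple[float, float, float]) -> tuple[float, float, float]:
        return tuple((c - o) / self.scale for c, o in zip(p, self.origin))


def transform(event: GestureEvent, target: Frame) -> GestureEvent:
    """Re-express a gestural event in a consumer's frame of reference."""
    return GestureEvent(event.kind, target.from_world(event.position))


display_frame = Frame(origin=(2.0, 1.0, 0.0), scale=0.5)
world_event = GestureEvent("point", (2.5, 1.25, 0.0))
print(transform(world_event, display_frame).position)  # -> (1.0, 0.5, 0.0)
```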
-
Publication number: 20190286243
Abstract: Embodiments described herein include a system comprising a processor coupled to display devices, sensors, remote client devices, and computer applications. The computer applications orchestrate content of the remote client devices simultaneously across the display devices and the remote client devices, and allow simultaneous control of the display devices. The simultaneous control includes automatically detecting a gesture of at least one object from gesture data received via the sensors. The detecting comprises identifying the gesture using only the gesture data. The computer applications translate the gesture to a gesture signal, and control the display devices in response to the gesture signal.
Type: Application
Filed: June 4, 2019
Publication date: September 19, 2019
Inventors: Kwindla Hultman Kramer, John Underkoffler, Carlton Sparrell, Navjot Singh, Kate Hollenbach, Paul Yarin
-
Publication number: 20190220100
Abstract: Embodiments described herein include a system comprising a processor coupled to display devices, sensors, remote client devices, and computer applications. The computer applications orchestrate content of the remote client devices simultaneously across the display devices and the remote client devices, and allow simultaneous control of the display devices. The simultaneous control includes automatically detecting a gesture of at least one object from gesture data received via the sensors. The detecting comprises identifying the gesture using only the gesture data. The computer applications translate the gesture to a gesture signal, and control the display devices in response to the gesture signal.
Type: Application
Filed: March 28, 2019
Publication date: July 18, 2019
Inventors: Kwindla Hultman Kramer, John Underkoffler
-
Patent number: 10353483
Abstract: Embodiments described herein include a system comprising a processor coupled to display devices, sensors, remote client devices, and computer applications. The computer applications orchestrate content of the remote client devices simultaneously across the display devices and the remote client devices, and allow simultaneous control of the display devices. The simultaneous control includes automatically detecting a gesture of at least one object from gesture data received via the sensors. The detecting comprises identifying the gesture using only the gesture data. The computer applications translate the gesture to a gesture signal, and control the display devices in response to the gesture signal.
Type: Grant
Filed: August 1, 2018
Date of Patent: July 16, 2019
Assignee: Oblong Industries, Inc.
Inventors: Kwindla Hultman Kramer, John Underkoffler, Carlton Sparrell, Navjot Singh, Kate Hollenbach, Paul Yarin
-
Publication number: 20190187801
Abstract: Systems and methods for detecting, representing, and interpreting three-space input are described. Embodiments of the system, in the context of an SOE, process low-level data from a plurality of sources of spatial tracking data and analyze these semantically uncorrelated spatiotemporal data and generate high-level gestural events according to dynamically configurable implicit and explicit gesture descriptions. The events produced are suitable for consumption by interactive systems, and the embodiments provide one or more mechanisms for controlling and effecting event distribution to these consumers. The embodiments further provide to the consumers of its events a facility for transforming gestural events among arbitrary spatial and semantic frames of reference.
Type: Application
Filed: February 1, 2019
Publication date: June 20, 2019
Inventors: John S. Underkoffler, Kwindla Hultman Kramer
-
Publication number: 20190171496
Abstract: A multi-process interactive system is described. The system includes numerous processes running on a processing device. The processes include separable program execution contexts of application programs, such that each application program comprises at least one process. The system translates events of each process into data capsules. A data capsule includes an application-independent representation of event data of an event and state information of the process originating the content of the data capsule. The system transfers the data messages into pools or repositories. Each process operates as a recognizing process, where the recognizing process recognizes in the pools data capsules comprising content that corresponds to an interactive function of the recognizing process and/or an identification of the recognizing process. The recognizing process retrieves recognized data capsules from the pools and executes processing appropriate to contents of the recognized data capsules.
Type: Application
Filed: January 22, 2019
Publication date: June 6, 2019
Inventors: Kwindla Hultman Kramer, John S. Underkoffler
-
Patent number: 10296099
Abstract: Embodiments described herein include a system comprising a processor coupled to display devices, sensors, remote client devices, and computer applications. The computer applications orchestrate content of the remote client devices simultaneously across the display devices and the remote client devices, and allow simultaneous control of the display devices. The simultaneous control includes automatically detecting a gesture of at least one object from gesture data received via the sensors. The detecting comprises identifying the gesture using only the gesture data. The computer applications translate the gesture to a gesture signal, and control the display devices in response to the gesture signal.
Type: Grant
Filed: May 17, 2017
Date of Patent: May 21, 2019
Assignee: Oblong Industries, Inc.
Inventors: Kwindla Hultman Kramer, John Underkoffler
-
Patent number: 10235412
Abstract: Systems and methods for detecting, representing, and interpreting three-space input are described. Embodiments of the system, in the context of an SOE, process low-level data from a plurality of sources of spatial tracking data and analyze these semantically uncorrelated spatiotemporal data and generate high-level gestural events according to dynamically configurable implicit and explicit gesture descriptions. The events produced are suitable for consumption by interactive systems, and the embodiments provide one or more mechanisms for controlling and effecting event distribution to these consumers. The embodiments further provide to the consumers of its events a facility for transforming gestural events among arbitrary spatial and semantic frames of reference.
Type: Grant
Filed: August 25, 2017
Date of Patent: March 19, 2019
Assignee: Oblong Industries, Inc.
Inventors: John S. Underkoffler, Kwindla Hultman Kramer
-
Patent number: 10223418
Abstract: A multi-process interactive system is described. The system includes numerous processes running on a processing device. The processes include separable program execution contexts of application programs, such that each application program comprises at least one process. The system translates events of each process into data capsules. A data capsule includes an application-independent representation of event data of an event and state information of the process originating the content of the data capsule. The system transfers the data messages into pools or repositories. Each process operates as a recognizing process, where the recognizing process recognizes in the pools data capsules comprising content that corresponds to an interactive function of the recognizing process and/or an identification of the recognizing process. The recognizing process retrieves recognized data capsules from the pools and executes processing appropriate to contents of the recognized data capsules.
Type: Grant
Filed: January 19, 2018
Date of Patent: March 5, 2019
Assignee: Oblong Industries, Inc.
Inventors: Kwindla Hultman Kramer, John S. Underkoffler
-
Publication number: 20180348883
Abstract: Embodiments described herein include a system comprising a processor coupled to display devices, sensors, remote client devices, and computer applications. The computer applications orchestrate content of the remote client devices simultaneously across the display devices and the remote client devices, and allow simultaneous control of the display devices. The simultaneous control includes automatically detecting a gesture of at least one object from gesture data received via the sensors. The detecting comprises identifying the gesture using only the gesture data. The computer applications translate the gesture to a gesture signal, and control the display devices in response to the gesture signal.
Type: Application
Filed: August 1, 2018
Publication date: December 6, 2018
Inventors: Kwindla Hultman Kramer, John Underkoffler, Carlton Sparrell, Navjot Singh, Kate Hollenbach, Paul Yarin
-
Patent number: 10067571
Abstract: Embodiments described herein include a system comprising a processor coupled to display devices, sensors, remote client devices, and computer applications. The computer applications orchestrate content of the remote client devices simultaneously across the display devices and the remote client devices, and allow simultaneous control of the display devices. The simultaneous control includes automatically detecting a gesture of at least one object from gesture data received via the sensors. The detecting comprises identifying the gesture using only the gesture data. The computer applications translate the gesture to a gesture signal, and control the display devices in response to the gesture signal.
Type: Grant
Filed: December 15, 2017
Date of Patent: September 4, 2018
Assignee: Oblong Industries, Inc.
Inventors: Kwindla Hultman Kramer, John Underkoffler, Carlton Sparrell, Navjot Singh, Kate Hollenbach, Paul Yarin
-
Publication number: 20180203520
Abstract: Embodiments described herein include a system comprising a processor coupled to display devices, sensors, remote client devices, and computer applications. The computer applications orchestrate content of the remote client devices simultaneously across at least one of the display devices and the remote client devices, and allow simultaneous control of the display devices. The simultaneous control includes automatically detecting a gesture of at least one object from gesture data received via the sensors. The gesture data is absolute three-space location data of an instantaneous state of the at least one object at a point in time and space. The detecting comprises aggregating the gesture data, and identifying the gesture using only the gesture data. The computer applications translate the gesture to a gesture signal, and control at least one of the display devices and the remote client devices in response to the gesture signal.
Type: Application
Filed: March 12, 2018
Publication date: July 19, 2018
Inventors: Kwindla Hultman Kramer, John Underkoffler, Carlton Sparrell, Navjot Singh, Kate Hollenbach, Paul Yarin