Patents by Inventor Robert Samuel GORDON
Robert Samuel GORDON has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Publication number: 20230205151
Abstract: The technology disclosed relates to selecting among devices to interact with. It also relates to operating a smart phone with reduced power consumption. It further relates to gesturally interacting with devices that lack gestural responsiveness. The technology disclosed also relates to distinguishing control gestures from proximate non-control gestures in a pervasive three-dimensional (3D) sensory space. The technology disclosed further relates to selecting among virtual interaction modalities to interact with.
Type: Application
Filed: January 4, 2023
Publication date: June 29, 2023
Applicant: Ultrahaptics IP Two Limited
Inventors: Robert Samuel GORDON, Paul Alan DURDIK, Maxwell SILLS
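As a rough illustration (not taken from the patent), distinguishing control gestures from proximate non-control gestures can be sketched as a two-part test: the hand must be inside a defined engagement volume and moving deliberately. The volume bounds and speed threshold below are hypothetical values for the sketch.

```python
from dataclasses import dataclass

@dataclass
class HandSample:
    x: float      # hand position in metres, sensor coordinates
    y: float
    z: float
    speed: float  # magnitude of hand velocity, m/s

# Hypothetical engagement volume in front of the sensor; a real system
# would calibrate these bounds per device.
ENGAGEMENT_MIN = (-0.3, -0.3, 0.1)
ENGAGEMENT_MAX = (0.3, 0.3, 0.6)
MIN_SPEED = 0.05  # below this the hand is treated as idle, not gesturing

def is_control_gesture(sample: HandSample) -> bool:
    """Classify motion as a control gesture only when the hand is inside
    the engagement volume and moving deliberately; motion elsewhere is
    treated as a proximate non-control gesture."""
    inside = all(lo <= v <= hi for v, lo, hi in
                 zip((sample.x, sample.y, sample.z),
                     ENGAGEMENT_MIN, ENGAGEMENT_MAX))
    return inside and sample.speed >= MIN_SPEED
```

A hand moving inside the volume registers as a control gesture; the same motion outside the volume, or a stationary hand inside it, does not.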
-
Publication number: 20230125265
Abstract: The technology disclosed can provide improved safety by detecting potentially unsafe conditions (e.g., collisions, loss of situational awareness, etc.) confronting the user of a wearable (or portable) sensor configured to capture motion and/or determine the path of an object based on imaging, acoustic, or vibrational waves. Implementations can enable improved safety for users of virtual reality for machine control and/or machine communications applications using wearable (or portable) devices, e.g., head-mounted displays (HMDs), wearable goggles, watch computers, smartphones, and so forth, or mobile devices, e.g., autonomous and semi-autonomous robots, factory-floor material handling systems, autonomous mass-transit vehicles, automobiles (human- or machine-driven), and so forth, equipped with suitable sensors and processors employing optical, audio, or vibrational detection.
Type: Application
Filed: December 21, 2022
Publication date: April 27, 2023
Applicant: Ultrahaptics IP Two Limited
Inventors: David S. HOLZ, Robert Samuel GORDON, Matias PEREZ
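As an illustrative sketch (not from the patent), one simple way to flag a potential collision from a tracked object's path is to extrapolate relative motion in a straight line and test whether the closest approach falls within a safety radius inside a short time horizon. The radius and horizon values here are assumptions for the sketch.

```python
def time_to_closest_approach(rel_pos, rel_vel):
    """Time t* minimizing |p + v*t| for straight-line relative motion:
    t* = -(p . v) / (v . v), clamped to the future."""
    vv = sum(v * v for v in rel_vel)
    if vv == 0.0:
        return 0.0  # no relative motion
    t = -sum(p * v for p, v in zip(rel_pos, rel_vel)) / vv
    return max(t, 0.0)

def collision_risk(rel_pos, rel_vel, radius=0.5, horizon=3.0):
    """Flag a potential collision if the extrapolated paths come within
    `radius` metres during the next `horizon` seconds."""
    t = time_to_closest_approach(rel_pos, rel_vel)
    if t > horizon:
        return False
    closest = [p + v * t for p, v in zip(rel_pos, rel_vel)]
    return sum(c * c for c in closest) <= radius * radius
```

An object 5 m away closing at 2 m/s directly toward the sensor is flagged; the same object offset 3 m laterally, or one too far away to arrive within the horizon, is not.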
-
Patent number: 11561519
Abstract: The technology disclosed relates to selecting among devices to interact with. It also relates to operating a smart phone with reduced power consumption. It further relates to gesturally interacting with devices that lack gestural responsiveness. The technology disclosed also relates to distinguishing control gestures from proximate non-control gestures in a pervasive three-dimensional (3D) sensory space. The technology disclosed further relates to selecting among virtual interaction modalities to interact with.
Type: Grant
Filed: September 21, 2020
Date of Patent: January 24, 2023
Assignee: Ultrahaptics IP Two Limited
Inventors: Robert Samuel Gordon, Paul Alan Durdik, Maxwell Sills
-
Patent number: 11538224
Abstract: The technology disclosed can provide improved safety by detecting potentially unsafe conditions (e.g., collisions, loss of situational awareness, etc.) confronting the user of a wearable (or portable) sensor configured to capture motion and/or determine the path of an object based on imaging, acoustic, or vibrational waves. Implementations can enable improved safety for users of virtual reality for machine control and/or machine communications applications using wearable (or portable) devices, e.g., head-mounted displays (HMDs), wearable goggles, watch computers, smartphones, and so forth, or mobile devices, e.g., autonomous and semi-autonomous robots, factory-floor material handling systems, autonomous mass-transit vehicles, automobiles (human- or machine-driven), and so forth, equipped with suitable sensors and processors employing optical, audio, or vibrational detection.
Type: Grant
Filed: November 7, 2019
Date of Patent: December 27, 2022
Assignee: Ultrahaptics IP Two Limited
Inventors: David S. Holz, Robert Samuel Gordon, Matias Perez
-
Patent number: 11307282
Abstract: Technology for determining positional information about an object of interest is provided. In particular, it includes conducting scanning of a field of interest with an emission from a transmission area according to an ordered scan pattern. The emission can be received to form a signal based upon at least one salient property (e.g., intensity, amplitude, frequency, polarization, phase, or other detectable feature) of the emission varying with time at the object of interest. Synchronization information about the ordered scan pattern can be derived from a source (e.g., a second signal broadcast separately, a social media share, and/or combinations thereof). A correspondence between at least one characteristic of the signal and the synchronization information can be established. Positional information can be determined based at least in part upon the correspondence.
Type: Grant
Filed: October 24, 2014
Date of Patent: April 19, 2022
Assignee: Ultrahaptics IP Two Limited
Inventors: David S. Holz, Robert Samuel Gordon, Gabriel A. Hare, Neeloy Roy, Maxwell Sills, Paul Durdik
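As a simplified illustration (not from the patent), suppose the ordered scan pattern is a beam sweeping at a known angular rate, and the synchronization information is the time of a sync pulse. The bearing of the object can then be recovered from when the received intensity peaks relative to that pulse. The sweep rate and sample values below are assumptions for the sketch.

```python
def bearing_from_scan(samples, sync_time, sweep_rate_deg_per_s):
    """Given (time, intensity) samples measured at the object, find the
    time of peak intensity relative to the sync pulse and convert that
    delay into a bearing, assuming a constant-rate angular sweep."""
    peak_time, _ = max(samples, key=lambda s: s[1])
    return (peak_time - sync_time) * sweep_rate_deg_per_s
```

For a beam sweeping at 3600 deg/s, an intensity peak 10 ms after the sync pulse corresponds to a bearing of 36 degrees.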
-
Publication number: 20210003977
Abstract: The technology disclosed relates to selecting among devices to interact with. It also relates to operating a smart phone with reduced power consumption. It further relates to gesturally interacting with devices that lack gestural responsiveness. The technology disclosed also relates to distinguishing control gestures from proximate non-control gestures in a pervasive three-dimensional (3D) sensory space. The technology disclosed further relates to selecting among virtual interaction modalities to interact with.
Type: Application
Filed: September 21, 2020
Publication date: January 7, 2021
Applicant: Ultrahaptics IP Two Limited
Inventors: Robert Samuel Gordon, Paul Alan Durdik, Maxwell Sills
-
Patent number: 10782657
Abstract: The technology disclosed relates to selecting among devices to interact with. It also relates to operating a smart phone with reduced power consumption. It further relates to gesturally interacting with devices that lack gestural responsiveness. The technology disclosed also relates to distinguishing control gestures from proximate non-control gestures in a pervasive three-dimensional (3D) sensory space. The technology disclosed further relates to selecting among virtual interaction modalities to interact with.
Type: Grant
Filed: February 19, 2015
Date of Patent: September 22, 2020
Assignee: Ultrahaptics IP Two Limited
Inventors: Robert Samuel Gordon, Maxwell Sills, Paul Alan Durdik
-
Patent number: 10739823
Abstract: A motion control assembly includes a motion control device electrically connected to a battery pack and to a mobile computing device for at least data transmission between them. The motion control device can generate inputs, such as inputs corresponding to an attribute of a sensed object, for transmission to the mobile computing device. The drain on the battery of a battery-powered mobile computing device can be reduced when it is used with a motion control device as follows. A motion control assembly, comprising a motion control device and a battery pack capable of powering the motion control device as an integral, one-piece unit, is selected. The motion control device is connected to an electrical connector of the battery-powered mobile computing device. The motion control device is supplied with power from the battery pack during use, so the motion control device can be operated using power from the battery pack rather than the mobile device's battery.
Type: Grant
Filed: October 5, 2018
Date of Patent: August 11, 2020
Assignee: Ultrahaptics IP Two Limited
Inventors: Robert Samuel Gordon, Paul Alan Durdik
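The power-sharing idea above can be sketched as a simple source-selection policy (an illustration only, not the patented circuit): prefer the assembly's own battery pack so the host battery is spared, falling back to the host only when the pack is nearly empty. The cutoff value is a hypothetical threshold.

```python
def select_power_source(pack_charge_pct: float,
                        host_charge_pct: float,
                        pack_cutoff: float = 5.0) -> str:
    """Prefer the motion control assembly's battery pack so the host
    device's battery is not drained; fall back to the host battery only
    when the pack is nearly empty."""
    if pack_charge_pct > pack_cutoff:
        return "battery_pack"
    return "host_battery"
```

With a charged pack the motion control device draws no power from the phone; only a depleted pack shifts the load to the host.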
-
Publication number: 20200143599
Abstract: The technology disclosed can provide improved safety by detecting potentially unsafe conditions (e.g., collisions, loss of situational awareness, etc.) confronting the user of a wearable (or portable) sensor configured to capture motion and/or determine the path of an object based on imaging, acoustic, or vibrational waves. Implementations can enable improved safety for users of virtual reality for machine control and/or machine communications applications using wearable (or portable) devices, e.g., head-mounted displays (HMDs), wearable goggles, watch computers, smartphones, and so forth, or mobile devices, e.g., autonomous and semi-autonomous robots, factory-floor material handling systems, autonomous mass-transit vehicles, automobiles (human- or machine-driven), and so forth, equipped with suitable sensors and processors employing optical, audio, or vibrational detection.
Type: Application
Filed: November 7, 2019
Publication date: May 7, 2020
Applicant: Ultrahaptics IP Two Limited
Inventors: David S. HOLZ, Robert Samuel GORDON, Matias PEREZ
-
Patent number: 10475249
Abstract: The technology disclosed can provide improved safety by detecting potentially unsafe conditions (e.g., collisions, loss of situational awareness, etc.) confronting the user of a wearable (or portable) sensor configured to capture motion and/or determine the path of an object based on imaging, acoustic, or vibrational waves. Implementations can enable improved safety for users of virtual reality for machine control and/or machine communications applications using wearable (or portable) devices, e.g., head-mounted displays (HMDs), wearable goggles, watch computers, smartphones, and so forth, or mobile devices, e.g., autonomous and semi-autonomous robots, factory-floor material handling systems, autonomous mass-transit vehicles, automobiles (human- or machine-driven), and so forth, equipped with suitable sensors and processors employing optical, audio, or vibrational detection.
Type: Grant
Filed: August 3, 2018
Date of Patent: November 12, 2019
Assignee: Ultrahaptics IP Two Limited
Inventors: David S. Holz, Robert Samuel Gordon, Matias Perez
-
Patent number: 10351240
Abstract: Methods and systems for autonomous device reconfiguration are described herein. A system may include aerially-mobile devices, each configured to perform a respective end-use function and to carry out a portion of a reconfiguration operation, which involves arranging the aerially-mobile devices according to a device configuration. A given device configuration may specify spatial locations within an environment corresponding to the aerially-mobile devices.
Type: Grant
Filed: December 29, 2016
Date of Patent: July 16, 2019
Assignee: Wing Aviation LLC
Inventors: Maxwell Andrew Sills, Ian Wetherbee, Robert Samuel Gordon
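Arranging devices according to a configuration that specifies spatial locations amounts to an assignment problem. As an illustrative sketch (not the patented method), a greedy heuristic pairs each device with a target slot by repeatedly taking the closest remaining (device, slot) pair; an optimal solution would use the Hungarian algorithm instead.

```python
from math import dist

def assign_slots(device_positions, slot_positions):
    """Greedily pair each aerial device with a target spatial location:
    repeatedly choose the closest remaining (device, slot) pair.
    Returns a mapping of device index -> slot index."""
    free_devices = list(range(len(device_positions)))
    free_slots = list(range(len(slot_positions)))
    assignment = {}
    while free_devices and free_slots:
        d, s = min(((d, s) for d in free_devices for s in free_slots),
                   key=lambda pair: dist(device_positions[pair[0]],
                                         slot_positions[pair[1]]))
        assignment[d] = s
        free_devices.remove(d)
        free_slots.remove(s)
    return assignment
```

With two devices at x = 0 and x = 10 and slots at x = 9 and x = 1, each device is sent to its nearby slot rather than crossing the environment.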
-
Patent number: 10336543
Abstract: Systems and methods are provided for worksite automation. One example method includes receiving a work request indicative of at least one of a first item or one or more work request parameters, where the first item is one of a plurality of items stored in an item-storage environment, and where each item is associated with a co-located identifier device; in response to receipt of the work request: identifying the first item; determining a target location corresponding to the first item; selecting an unmanned aerial vehicle (UAV) from a plurality of encoder UAVs in the item-storage environment, where each encoder UAV includes an encoder device configured to encode data to the identifier devices associated with the plurality of items; and causing the selected UAV to: (a) travel to the target location, and (b) while hovering near the target location, encode particular identification data to the identifier device associated with the first item.
Type: Grant
Filed: December 29, 2016
Date of Patent: July 2, 2019
Assignee: Wing Aviation LLC
Inventors: Maxwell Andrew Sills, Ian Wetherbee, Robert Samuel Gordon
-
Patent number: 10258534
Abstract: Described herein is a control system that facilitates assistance mode(s). In particular, the control system may determine a particular assistance mode associated with an account. This particular assistance mode may specify (i) operations for an aerial vehicle to carry out in order to obtain sensor data providing environment information corresponding to a location associated with the account and (ii) feedback processes to provide feedback, via a feedback system associated with the account, that corresponds to respective environment information. The control system may transmit to the aerial vehicle an indication of the particular operations corresponding to the particular assistance mode and may then receive environment information for the location associated with the account. Based on the received environment information, the control system may apply the specified feedback processes to initiate feedback in accordance with the particular assistance mode via the associated feedback system.
Type: Grant
Filed: February 14, 2017
Date of Patent: April 16, 2019
Assignee: Wing Aviation LLC
Inventors: Maxwell Andrew Sills, Robert Samuel Gordon, Ian Wetherbee
-
Publication number: 20190094909
Abstract: A motion control assembly includes a motion control device electrically connected to a battery pack and to a mobile computing device for at least data transmission between them. The motion control device can generate inputs, such as inputs corresponding to an attribute of a sensed object, for transmission to the mobile computing device. The drain on the battery of a battery-powered mobile computing device can be reduced when it is used with a motion control device as follows. A motion control assembly, comprising a motion control device and a battery pack capable of powering the motion control device as an integral, one-piece unit, is selected. The motion control device is connected to an electrical connector of the battery-powered mobile computing device. The motion control device is supplied with power from the battery pack during use, so the motion control device can be operated using power from the battery pack rather than the mobile device's battery.
Type: Application
Filed: October 5, 2018
Publication date: March 28, 2019
Applicant: Leap Motion, Inc.
Inventors: Robert Samuel GORDON, Paul Alan Durdik
-
Publication number: 20190073829
Abstract: The technology disclosed can provide improved safety by detecting potentially unsafe conditions (e.g., collisions, loss of situational awareness, etc.) confronting the user of a wearable (or portable) sensor configured to capture motion and/or determine the path of an object based on imaging, acoustic, or vibrational waves. Implementations can enable improved safety for users of virtual reality for machine control and/or machine communications applications using wearable (or portable) devices, e.g., head-mounted displays (HMDs), wearable goggles, watch computers, smartphones, and so forth, or mobile devices, e.g., autonomous and semi-autonomous robots, factory-floor material handling systems, autonomous mass-transit vehicles, automobiles (human- or machine-driven), and so forth, equipped with suitable sensors and processors employing optical, audio, or vibrational detection.
Type: Application
Filed: August 3, 2018
Publication date: March 7, 2019
Applicant: Leap Motion, Inc.
Inventors: David S. HOLZ, Robert Samuel GORDON, Matias PEREZ
-
Patent number: 10095269
Abstract: A motion control assembly includes a motion control device electrically connected to a battery pack and to a mobile computing device for at least data transmission between them. The motion control device can generate inputs, such as inputs corresponding to an attribute of a sensed object, for transmission to the mobile computing device. The drain on the battery of a battery-powered mobile computing device can be reduced when it is used with a motion control device as follows. A motion control assembly, comprising a motion control device and a battery pack capable of powering the motion control device as an integral, one-piece unit, is selected. The motion control device is connected to an electrical connector of the battery-powered mobile computing device. The motion control device is supplied with power from the battery pack during use, so the motion control device can be operated using power from the battery pack rather than the mobile device's battery.
Type: Grant
Filed: August 15, 2017
Date of Patent: October 9, 2018
Assignee: Leap Motion, Inc.
Inventors: Robert Samuel Gordon, Paul Alan Durdik
-
Patent number: 10043320
Abstract: The technology disclosed can provide improved safety by detecting potentially unsafe conditions (e.g., collisions, loss of situational awareness, etc.) confronting the user of a wearable (or portable) sensor configured to capture motion and/or determine the path of an object based on imaging, acoustic, or vibrational waves. Implementations can enable improved safety for users of virtual reality for machine control and/or machine communications applications using wearable (or portable) devices, e.g., head-mounted displays (HMDs), wearable goggles, watch computers, smartphones, and so forth, or mobile devices, e.g., autonomous and semi-autonomous robots, factory-floor material handling systems, autonomous mass-transit vehicles, automobiles (human- or machine-driven), and so forth, equipped with suitable sensors and processors employing optical, audio, or vibrational detection.
Type: Grant
Filed: August 31, 2017
Date of Patent: August 7, 2018
Assignee: Leap Motion, Inc.
Inventors: David S. Holz, Robert Samuel Gordon, Matias Perez
-
Patent number: 9868449
Abstract: The technology disclosed relates to an embeddable motion sensory control device that detects gestures in a three-dimensional (3D) sensory space within a vehicle cabin, interprets a detected gesture as a command to a (sub)system of the vehicle under control, and issues the command when appropriate.
Type: Grant
Filed: May 29, 2015
Date of Patent: January 16, 2018
Assignee: Leap Motion, Inc.
Inventors: David S. Holz, Hua Yang, Robert Samuel Gordon, Neeloy Roy, Justin Schunick, Paul A. Durdik
-
Publication number: 20180012074
Abstract: The technology disclosed can provide improved safety by detecting potentially unsafe conditions (e.g., collisions, loss of situational awareness, etc.) confronting the user of a wearable (or portable) sensor configured to capture motion and/or determine the path of an object based on imaging, acoustic, or vibrational waves. Implementations can enable improved safety for users of virtual reality for machine control and/or machine communications applications using wearable (or portable) devices, e.g., head-mounted displays (HMDs), wearable goggles, watch computers, smartphones, and so forth, or mobile devices, e.g., autonomous and semi-autonomous robots, factory-floor material handling systems, autonomous mass-transit vehicles, automobiles (human- or machine-driven), and so forth, equipped with suitable sensors and processors employing optical, audio, or vibrational detection.
Type: Application
Filed: August 31, 2017
Publication date: January 11, 2018
Applicant: Leap Motion, Inc.
Inventors: David S. HOLZ, Robert Samuel GORDON, Matias PEREZ
-
Publication number: 20170344068
Abstract: A motion control assembly includes a motion control device electrically connected to a battery pack and to a mobile computing device for at least data transmission between them. The motion control device can generate inputs, such as inputs corresponding to an attribute of a sensed object, for transmission to the mobile computing device. The drain on the battery of a battery-powered mobile computing device can be reduced when it is used with a motion control device as follows. A motion control assembly, comprising a motion control device and a battery pack capable of powering the motion control device as an integral, one-piece unit, is selected. The motion control device is connected to an electrical connector of the battery-powered mobile computing device. The motion control device is supplied with power from the battery pack during use, so the motion control device can be operated using power from the battery pack rather than the mobile device's battery.
Type: Application
Filed: August 15, 2017
Publication date: November 30, 2017
Applicant: Leap Motion, Inc.
Inventors: Robert Samuel GORDON, Paul Alan DURDIK