Patents by Inventor David Hygh
David Hygh has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Publication number: 20230040969
Abstract: A system and methods for assessing an environment are disclosed. A method includes causing a robot to transmit data to first and second user devices, causing the robot to execute a first action, and, responsive to a second instruction, causing the robot to execute a second action. At least one user device is outside the environment of the robot. At least one action includes recording a video of at least a portion of the environment, displaying the video in real time on both user devices, and storing the video on a cloud-based network. The other action includes determining a first physical location of the robot, determining a desired second physical location of the robot, and propelling the robot from the first location to the second location. Determining the desired second location is responsive to detecting a touch on a touchscreen video feed displaying the video in real time.
Type: Application
Filed: December 31, 2020
Publication date: February 9, 2023
Inventors: Paul BERBERIAN, Damon ARNIOTES, Joshua SAVAGE, Andrew SAVAGE, Ross MACGREGOR, David HYGH, James BOOTH, Jonathan CARROLL
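The touch-to-move behavior above requires mapping a touch on the video feed to a physical floor location. A minimal sketch of one way to do this, assuming a pinhole camera at a known height and downward pitch (the function name, parameters, and geometry here are illustrative assumptions, not the patent's actual method):

```python
import math

def touch_to_ground_point(u, v, img_w, img_h, fov_deg=60.0,
                          cam_height=0.3, cam_pitch_deg=20.0):
    """Project a touch at pixel (u, v) onto the floor plane.

    Assumes a pinhole camera mounted cam_height metres above the floor,
    pitched down by cam_pitch_deg, with a horizontal field of view of
    fov_deg. Returns (forward, left) in metres relative to the robot,
    or None if the touched pixel is at or above the horizon.
    """
    f = (img_w / 2) / math.tan(math.radians(fov_deg) / 2)
    # Ray in camera coordinates: x right, y down, z forward.
    x = (u - img_w / 2) / f
    y = (v - img_h / 2) / f
    z = 1.0
    # Rotate the ray by the pitch angle about the camera's x-axis.
    p = math.radians(cam_pitch_deg)
    down = y * math.cos(p) + z * math.sin(p)
    fwd = -y * math.sin(p) + z * math.cos(p)
    if down <= 0:
        return None  # ray never intersects the floor
    t = cam_height / down  # scale so the ray descends cam_height
    return (t * fwd, -t * x)  # image-right maps to robot-right
```

A robot controller would then feed the returned point to its drive system as the desired second location.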
-
Publication number: 20190171295
Abstract: A modular sensing device can include an inertial measurement unit to generate sensor data corresponding to user gestures performed by a user, a mode selector enabling the user to select a mode of the modular sensing device out of a plurality of modes, and one or more output devices to generate output based on the user gestures and the selected mode. The modular sensing device can further include a controller to implement a plurality of state machines. Each state machine can be associated with a corresponding user gesture by a sensor data signature. The state machine can execute a state transition when the sensor data matches the sensor data signature. The executed state transition can cause the controller to generate a corresponding output via the one or more output devices specific to the selected mode and based on the corresponding user gesture.
Type: Application
Filed: June 18, 2018
Publication date: June 6, 2019
Applicant: SPHERO, INC.
Inventors: David Hygh, Fabrizio Polo
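The controller described above runs several state machines, each tied to a gesture by a sensor-data signature, and emits mode-specific output on a matching transition. A minimal sketch of that structure, assuming a signature is a predicate over a single scalar IMU sample (real signatures would match time-series patterns, and the class and gesture names are hypothetical):

```python
class GestureStateMachine:
    """One state machine keyed to a sensor-data signature."""
    def __init__(self, name, signature):
        self.name = name
        self.signature = signature  # predicate over an IMU sample
        self.state = "idle"

    def step(self, sample, mode):
        # Transition fires once when the signature first matches.
        if self.state == "idle" and self.signature(sample):
            self.state = "triggered"
            return f"{mode}:{self.name}"  # output is mode-specific
        # Re-arm only after the gesture ends.
        if self.state == "triggered" and not self.signature(sample):
            self.state = "idle"
        return None

class ModularSensingDevice:
    """Runs all state machines against each incoming sample."""
    def __init__(self, machines, mode="drive"):
        self.machines = machines
        self.mode = mode

    def select_mode(self, mode):
        self.mode = mode

    def process(self, sample):
        outputs = []
        for m in self.machines:
            out = m.step(sample, self.mode)
            if out:
                outputs.append(out)
        return outputs
```

The same "shake" gesture then produces different output depending on the selected mode, which is the behavior the abstract describes.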
-
Publication number: 20190126158
Abstract: Aspects of the present disclosure relate to track layout identification techniques. As an example, a set of track segments may be arranged in a certain order to form a track. A variety of track segment types may be used to form the track, and different combinations of the track segments may yield a track having varying characteristics. In an example, each track segment may comprise a shift register, which may provide information to a base track segment relating to the order of the track segments and the track segment types that comprise the track. In some examples, the base track segment may relay the information to a computing device. Accordingly, the information may be used to determine a track layout, which may in turn be used, for example, to enable another user to reproduce the track configuration.
Type: Application
Filed: February 9, 2018
Publication date: May 2, 2019
Applicant: SPHERO, INC.
Inventors: David Hygh, Quentin Michelet, Michael Byrd
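The shift-register scheme above works because daisy-chained registers clock bits out toward the base in physical order, so the base recovers each segment's type ID nearest-first. A minimal simulation of that readout, assuming fixed-width type IDs (the ID values and type names are hypothetical):

```python
# Hypothetical mapping of segment-type IDs to track pieces.
SEGMENT_TYPES = {1: "straight", 2: "left curve", 3: "right curve", 4: "jump"}

def read_track_layout(segment_ids, bits_per_id=4):
    """Simulate the base clocking bits out of a shift-register chain.

    segment_ids lists each segment's type ID in physical order, with
    segment_ids[0] adjacent to the base, so its bits arrive first.
    """
    # Serialize: each segment shifts its ID out MSB-first.
    stream = []
    for seg_id in segment_ids:
        stream.extend((seg_id >> i) & 1
                      for i in range(bits_per_id - 1, -1, -1))
    # The base reassembles fixed-width IDs from the serial bit stream.
    decoded = []
    for k in range(0, len(stream), bits_per_id):
        val = 0
        for b in stream[k:k + bits_per_id]:
            val = (val << 1) | b
        decoded.append(val)
    return decoded
```

The decoded ID sequence is exactly the track layout, which a computing device could store or share so another user can rebuild the same track.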
-
Publication number: 20190129442
Abstract: Aspects of the present disclosure relate to magnetic robot calibration. As an example, a robot may engage in a calibration process based at least in part on data samples from a magnetometer. The robot may use the data samples to determine a reference point, with which the robot may process movement instructions accordingly. In some examples, a user device may be used to control the robot, and may comprise a magnetometer for determining a reference point similar to that of the robot. As a result, the user device may communicate with the robot using movement instructions that are based on the reference point determined at the user device, such that the robot may perform the movement instructions using the reference point determined at the robot.
Type: Application
Filed: February 9, 2018
Publication date: May 2, 2019
Applicant: SPHERO, INC.
Inventors: David Hygh, Jonathan Carroll, Patrick Martin, Quentin Michelet
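A common way to derive a shared magnetic reference point from magnetometer samples is to estimate the hard-iron bias while the device rotates, then compute headings relative to magnetic north. A minimal sketch under that assumption (the min/max bias estimate and 2-D model are simplifications, not necessarily the patent's calibration process):

```python
import math

def hard_iron_offset(samples):
    """Estimate hard-iron bias as the midpoint of observed extremes.

    samples is an iterable of (mx, my) magnetometer readings gathered
    while the device spins in place; a full rotation traces a circle
    whose centre is the bias.
    """
    xs = [s[0] for s in samples]
    ys = [s[1] for s in samples]
    return ((max(xs) + min(xs)) / 2, (max(ys) + min(ys)) / 2)

def heading_deg(sample, offset):
    """Heading in degrees from magnetic north, after bias removal."""
    mx, my = sample[0] - offset[0], sample[1] - offset[1]
    return math.degrees(math.atan2(my, mx)) % 360
```

If the robot and the user device each calibrate this way, both report headings against the same magnetic reference, so movement instructions from one are meaningful to the other.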
-
Patent number: 10275036
Abstract: A wearable device can be worn by a user, and can include one or more sensors to detect user gestures performed by the user. The wearable device can further include a wireless communication module to establish a communication link with a self-propelled device, and a controller that can generate control commands based on the user gestures. The control commands may be executable to accelerate and maneuver the self-propelled device. The controller may then transmit the control commands to the self-propelled device over the communication link for execution by the self-propelled device.
Type: Grant
Filed: August 31, 2016
Date of Patent: April 30, 2019
Assignee: Sphero, Inc.
Inventors: David Hygh, Jeffrey Wiencrot, Ian H. Bernstein
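The controller described above is essentially a translation layer from detected gestures to drive commands sent over the wireless link. A minimal sketch of that mapping (the gesture names, command tuples, and magnitudes are illustrative assumptions):

```python
def gestures_to_commands(gestures):
    """Translate detected wrist gestures into drive commands.

    Returns (command, magnitude) tuples that a transport layer would
    encode and transmit to the self-propelled device.
    """
    mapping = {
        "flick_forward": ("accelerate", +20),
        "flick_back":    ("accelerate", -20),
        "tilt_left":     ("steer", -15),
        "tilt_right":    ("steer", +15),
    }
    # Unrecognized gestures are ignored rather than raising an error.
    return [mapping[g] for g in gestures if g in mapping]
```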
-
Patent number: 10001843
Abstract: A modular sensing device can include an inertial measurement unit to generate sensor data corresponding to user gestures performed by a user, a mode selector enabling the user to select a mode of the modular sensing device out of a plurality of modes, and one or more output devices to generate output based on the user gestures and the selected mode. The modular sensing device can further include a controller to implement a plurality of state machines. Each state machine can be associated with a corresponding user gesture by a sensor data signature. The state machine can execute a state transition when the sensor data matches the sensor data signature. The executed state transition can cause the controller to generate a corresponding output via the one or more output devices specific to the selected mode and based on the corresponding user gesture.
Type: Grant
Filed: August 31, 2016
Date of Patent: June 19, 2018
Assignee: Sphero, Inc.
Inventors: David Hygh, Fabrizio Polo
-
Publication number: 20180089699
Abstract: Aspects of the present disclosure relate to systems and methods for providing a managed user experience for consumer products. A base station may detect a tag associated with a consumer product, which may be used to retrieve a managed user experience cached locally, or stored at a remote device associated with an experience provider. The managed user experience may include gameplay, content, and/or instructions to perform specified functionality. In an example, content and/or instructions may be transmitted to the consumer product and/or one or more intermediary devices in order to generate the managed user experience. In some examples, information may be received from the consumer product by way of the tag and used to provide the managed user experience. In other examples, the tag may retrieve and/or provide aspects of a managed user experience from an experience provider without use of the base station or intermediary device.
Type: Application
Filed: September 22, 2017
Publication date: March 29, 2018
Inventors: Damon Arniotes, Ian Bernstein, David Hygh, Jonathan Carroll, Kevin Menzie, Anya Grechka, Isaac Squires
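The cache-then-provider lookup described above is a standard read-through pattern: the base station resolves a tag against its local cache first and falls back to the remote experience provider. A minimal sketch, assuming `provider` is any callable mapping a tag ID to an experience payload (the function and field names are hypothetical):

```python
def fetch_experience(tag_id, cache, provider):
    """Resolve a managed user experience for a detected tag.

    Checks the base station's local cache first; on a miss, fetches
    from the remote experience provider and caches the result so the
    next detection of the same tag is served locally.
    """
    if tag_id in cache:
        return cache[tag_id]
    experience = provider(tag_id)
    cache[tag_id] = experience
    return experience
```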
-
Publication number: 20170192517
Abstract: A wearable device can be worn by a user, and can include one or more sensors to detect user gestures performed by the user. The wearable device can further include a wireless communication module to establish a communication link with a self-propelled device, and a controller that can generate control commands based on the user gestures. The control commands may be executable to accelerate and maneuver the self-propelled device. The controller may then transmit the control commands to the self-propelled device over the communication link for execution by the self-propelled device.
Type: Application
Filed: August 31, 2016
Publication date: July 6, 2017
Inventors: David Hygh, Jeffrey Wiencrot, Ian H. Bernstein
-
Publication number: 20170192518
Abstract: A modular sensing device can include an inertial measurement unit to generate sensor data corresponding to user gestures performed by a user, a mode selector enabling the user to select a mode of the modular sensing device out of a plurality of modes, and one or more output devices to generate output based on the user gestures and the selected mode. The modular sensing device can further include a controller to implement a plurality of state machines. Each state machine can be associated with a corresponding user gesture by a sensor data signature. The state machine can execute a state transition when the sensor data matches the sensor data signature. The executed state transition can cause the controller to generate a corresponding output via the one or more output devices specific to the selected mode and based on the corresponding user gesture.
Type: Application
Filed: August 31, 2016
Publication date: July 6, 2017
Inventors: David Hygh, Fabrizio Polo