Patents by Inventor Nathan D. NOCON

Nathan D. NOCON has filed for patents to protect the following inventions. This listing includes both pending patent applications and patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 10898795
    Abstract: Embodiments relate to gameplay using mobile devices. Embodiments include receiving, by a first device, input from a player initiating a targeted action. Embodiments include determining, by the first device, an orientation of the first device. Embodiments include determining, by the first device, a location of at least a second device based at least on a message received from the second device. Embodiments include identifying, by the first device, that a target of the targeted action is associated with the second device based on the orientation of the first device and the location of the second device. Embodiments include transmitting, by the first device, an indication of the targeted action to the second device.
    Type: Grant
    Filed: February 13, 2019
    Date of Patent: January 26, 2021
    Assignee: Disney Enterprises, Inc.
    Inventors: Nathan D. Nocon, R. Hunter Gough
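    Illustrative sketch: the grant above turns on a geometric check, whether the second device's reported location falls along the first device's measured orientation. The short Python sketch below shows one way that check could look; the 15-degree aiming cone, the coordinate convention, and the function names are assumptions for illustration, not details from the patent.

      import math

      AIM_TOLERANCE_DEG = 15.0  # hypothetical half-angle of the aiming cone

      def bearing_deg(from_xy, to_xy):
          """Compass-style bearing from one 2-D position to another, in degrees."""
          dx, dy = to_xy[0] - from_xy[0], to_xy[1] - from_xy[1]
          return math.degrees(math.atan2(dx, dy)) % 360.0

      def is_target(own_xy, own_heading_deg, peer_xy):
          """True if the peer device lies within this device's aiming cone."""
          diff = (bearing_deg(own_xy, peer_xy) - own_heading_deg + 180.0) % 360.0 - 180.0
          return abs(diff) <= AIM_TOLERANCE_DEG

      # Player at the origin facing due north; the peer reported a spot slightly east of north.
      if is_target((0.0, 0.0), 0.0, (1.0, 10.0)):
          print("send targeted-action indication to the second device")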
  • Publication number: 20210020079
    Abstract: According to one implementation, an image generation system includes a rotor and a motor for spinning the rotor about an axis of rotation, and a display secured to the rotor. The display includes a display surface and a display aperture, a first privacy screen situated at the display surface and having a convex curvature relative to light emitted from the display surface, and a second privacy screen situated between the first privacy screen and the display aperture and having a concave curvature relative to light emitted from the display surface. The first privacy screen and the second privacy screen are configured to substantially prevent rotational blur of an image displayed by the display surface while the display is spun by the motor and the rotor.
    Type: Application
    Filed: July 19, 2019
    Publication date: January 21, 2021
    Inventors: Nathan D. Nocon, Clifford Wong
  • Publication number: 20200341495
    Abstract: A stability controlled system includes a computing platform having a hardware processor, a memory storing a software code, a moveable component, and a tilt sensor. The hardware processor executes the software code to monitor the tilt sensor to determine whether the system is at a tilt with respect to a support surface for the system. When the tilt sensor is sensing the tilt with respect to the support surface: when the moveable component is off, the software code prevents the moveable component from turning on, and when the moveable component is on, the software code performs one of (a) turning off the moveable component, and (b) slowing down a regular rate of motion of the moveable component. When the tilt sensor is not sensing the tilt with respect to the support surface, the software code permits the moveable component to be turned on and have the regular rate of motion.
    Type: Application
    Filed: April 23, 2019
    Publication date: October 29, 2020
    Inventors: Clifford Wong, Nathan D. Nocon
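    Illustrative sketch: the tilt rule in the entry above reduces to a small decision function. The 10-degree threshold, the half-speed slowdown, and the parameter names below are assumptions made for this sketch, not values from the application.

      def allowed_motor_rate(tilt_deg, motor_on, regular_rate,
                             tilt_threshold_deg=10.0, slow_factor=0.5):
          """Rate the moveable component may run at under the tilt rule; 0.0 means off."""
          if tilt_deg > tilt_threshold_deg:      # the tilt sensor reports a tilt
              if not motor_on:
                  return 0.0                     # prevent the component from turning on
              return regular_rate * slow_factor  # or turn off / slow a running component
          return regular_rate                    # level on the support surface: regular rate

      # A running component at a 15-degree tilt is slowed to half its regular rate.
      print(allowed_motor_rate(tilt_deg=15.0, motor_on=True, regular_rate=100.0))  # -> 50.0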
  • Publication number: 20200342683
    Abstract: This disclosure presents systems and methods to synchronize real-world motion of physical objects with presentation of virtual content. Individual physical objects may be detected and/or identified based on image information defining one or more images of a real-world environment. Individual network connections may be established between individual computing platforms and individual physical objects. A network connection may facilitate a synchronization of a presentation of virtual content on a computing platform with motion of one or more physical objects in the real-world environment.
    Type: Application
    Filed: April 24, 2019
    Publication date: October 29, 2020
    Inventors: Timothy M. Panec, Janice Rosenthal, Hunter J. Gibson, Nathan D. Nocon, Stephen A. Thornton
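    Illustrative sketch: once a physical object is identified and a network connection is established, synchronization amounts to applying the object's reported motion to its virtual counterpart. The UDP transport and JSON message layout below are assumptions for illustration only; the publication does not specify a wire format.

      import json
      import socket

      def run_sync(virtual_objects, port=9999):
          """Update virtual objects from motion packets sent by paired physical objects."""
          sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
          sock.bind(("", port))
          while True:
              data, _addr = sock.recvfrom(1024)
              msg = json.loads(data)                 # e.g. {"toy_id": "car-1", "yaw_deg": 30, "speed": 0.4}
              obj = virtual_objects.get(msg["toy_id"])
              if obj is not None:
                  obj["yaw_deg"] = msg["yaw_deg"]    # mirror the physical object's heading
                  obj["speed"] = msg["speed"]        # mirror the physical object's speed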
  • Patent number: 10761180
    Abstract: A system includes a host device having a hardware processor and a host wireless transceiver, and client devices having client wireless transceivers for wireless communications with the host device. The hardware processor receives wireless signals transmitted by the client wireless transceivers using the host wireless transceiver, and determines locations of the client devices relative to the host device based on angles of arrival of the wireless signals. The hardware processor further determines an activation sequence for activating the client devices based on the locations relative to the host device, and transmits control signals using the host wireless transceiver, according to the activation sequence, to activate the client devices.
    Type: Grant
    Filed: September 27, 2019
    Date of Patent: September 1, 2020
    Assignee: Disney Enterprises, Inc.
    Inventor: Nathan D. Nocon
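    Illustrative sketch: with an angle of arrival measured for each client, choosing an activation sequence can be as simple as sweeping through the bearings in order. The clockwise sweep from 0 degrees below is one plausible policy chosen for illustration; the patent does not prescribe it.

      def activation_sequence(client_bearings_deg):
          """Order client devices for activation by the angle of arrival of their signals."""
          return sorted(client_bearings_deg, key=lambda dev: client_bearings_deg[dev] % 360.0)

      # Three clients whose signals arrive at 350, 20 and 95 degrees relative to the host.
      order = activation_sequence({"lamp": 350.0, "speaker": 20.0, "fan": 95.0})
      print(order)  # ['speaker', 'fan', 'lamp'] -- control signals go out in this order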
  • Publication number: 20200254336
    Abstract: Embodiments relate to gameplay using mobile devices. Embodiments include receiving, by a first device, input from a player initiating a targeted action. Embodiments include determining, by the first device, an orientation of the first device. Embodiments include determining, by the first device, a location of at least a second device based at least on a message received from the second device. Embodiments include identifying, by the first device, that a target of the targeted action is associated with the second device based on the orientation of the first device and the location of the second device. Embodiments include transmitting, by the first device, an indication of the targeted action to the second device.
    Type: Application
    Filed: February 13, 2019
    Publication date: August 13, 2020
    Inventors: Nathan D. NOCON, R. Hunter GOUGH
  • Publication number: 20200151956
    Abstract: To capture augmented reality (AR) on a head mounted display, the embodiments described herein include the optical display of virtual components of an AR experience on an AR device. A real-world view within a field of view of the display of the AR device is captured, creating a composite image that combines the virtual components and the real-world physical components within the field of view of the AR device.
    Type: Application
    Filed: November 13, 2018
    Publication date: May 14, 2020
    Inventors: Michael P. GOSLIN, Nathan D. NOCON, Jonathan RD HSU, Eric C. HASELTINE, Elliott H. BAUMBACH
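    Illustrative sketch: combining the rendered virtual components with the captured real-world view is, at its simplest, alpha compositing. The NumPy blend below is offered only as an illustration of producing the composite image; array shapes and names are assumptions of this sketch.

      import numpy as np

      def composite_capture(camera_frame, virtual_layer, virtual_alpha):
          """Blend rendered virtual components over a captured real-world frame.

          camera_frame and virtual_layer are HxWx3 uint8 arrays; virtual_alpha is an HxW
          float array in [0, 1] marking where virtual content was drawn.
          """
          a = virtual_alpha[..., None]
          blended = a * virtual_layer.astype(float) + (1.0 - a) * camera_frame.astype(float)
          return blended.astype(np.uint8)

      cam = np.zeros((2, 2, 3), dtype=np.uint8)            # tiny stand-in for the captured view
      virt = np.full((2, 2, 3), 255, dtype=np.uint8)       # tiny stand-in for the virtual render
      alpha = np.array([[1.0, 0.0], [0.5, 0.0]])           # where virtual content is visible
      print(composite_capture(cam, virt, alpha)[:, :, 0])  # [[255   0] [127   0]]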
  • Publication number: 20200134920
    Abstract: Embodiments provide for tracking location and resolving drift in Augmented Reality (AR) devices. The AR devices include computing devices having screens on a first face and cameras on a second, opposite face to project an image onto optical arrangements for viewing by wearers of the AR devices. The AR devices map locations for real objects in the environment to a virtual environment; anchor virtual objects at anchor locations within the virtual environment; capture station keeping images of the environment from a first Field of View via the camera; determine a second, different Field of View in the environment for the wearer of the AR device based on the relative locations of real objects present in the station keeping images; and output images depicting the virtual objects at positions on the screen to depict the virtual objects in the physical environment at the anchor locations.
    Type: Application
    Filed: October 24, 2018
    Publication date: April 30, 2020
    Inventors: Randall S. DAVIS, Elliott H. BAUMBACH, Nathan D. NOCON, Todd Michael GRAHAM
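    Illustrative sketch: one simple reading of the drift-resolution step is to compare where mapped real objects should appear against where the station keeping images actually place them, and shift the virtual anchors by the difference. A mean translation, as below, is a deliberate simplification chosen for illustration, not the correction described in the filing.

      import numpy as np

      def estimate_drift(mapped_positions, observed_positions):
          """Mean offset between mapped and observed positions of tracked real objects."""
          mapped = np.asarray(mapped_positions, dtype=float)
          observed = np.asarray(observed_positions, dtype=float)
          return (observed - mapped).mean(axis=0)

      def corrected_anchor(anchor_position, drift):
          """Shift a virtual object's anchor so it stays pinned to its physical location."""
          return np.asarray(anchor_position, dtype=float) + drift

      # Two landmarks each observed 0.1 m to the right of where the map expects them.
      drift = estimate_drift([[0, 0], [1, 0]], [[0.1, 0], [1.1, 0]])
      print(corrected_anchor([2.0, 3.0], drift))  # [2.1 3. ]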
  • Publication number: 20200122046
    Abstract: Techniques for randomized device interaction are provided. A first communication pattern is selected, with at least a degree of randomness, from a plurality of communication patterns, where each of the plurality of communication patterns specifies one or more audio profiles. A first audio profile specified in the first communication pattern is identified. A first portion of audio is extracted from a first audio file with at least a degree of randomness, and the first portion of audio is modified based on the first audio profile. Finally, the first modified portion of audio is outputted by a first device.
    Type: Application
    Filed: October 22, 2018
    Publication date: April 23, 2020
    Inventors: Nathan D. NOCON, Stephen A. THORNTON, Timothy M. PANEC
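    Illustrative sketch: the randomized selection described above can be pictured as two random draws, one for the communication pattern and one for the slice of audio. The pattern table, slice lengths, and profile fields below are invented for this sketch; real pitch/gain processing is out of scope.

      import random

      PATTERNS = [
          {"name": "chirpy", "profiles": [{"pitch_shift": +2, "gain_db": 0}]},
          {"name": "gruff",  "profiles": [{"pitch_shift": -3, "gain_db": -2}]},
      ]

      def randomized_utterance(audio_samples, min_len=800, max_len=1600):
          """Pick a pattern and a random slice of audio, returning the slice with its profile."""
          pattern = random.choice(PATTERNS)          # randomness in the pattern choice
          profile = pattern["profiles"][0]           # first audio profile in the pattern
          length = random.randint(min_len, min(max_len, len(audio_samples)))
          start = random.randrange(0, len(audio_samples) - length + 1)
          portion = audio_samples[start:start + length]
          return portion, profile                    # caller modifies the portion, then outputs it

      samples = list(range(48_000))                  # one second of fake 48 kHz samples
      portion, profile = randomized_utterance(samples)
      print(len(portion), profile)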
  • Patent number: 10621789
    Abstract: Embodiments provide for tracking location and resolving drift in Augmented Reality (AR) devices. The AR devices include computing devices having screens on a first face and cameras on a second, opposite face to project an image onto optical arrangements for viewing by wearers of the AR devices. The AR devices map locations for real objects in the environment to a virtual environment; anchor virtual objects at anchor locations within the virtual environment; capture station keeping images of the environment from a first Field of View via the camera; determine a second, different Field of View in the environment for the wearer of the AR device based on the relative locations of real objects present in the station keeping images; and output images depicting the virtual objects at positions on the screen to depict the virtual objects in the physical environment at the anchor locations.
    Type: Grant
    Filed: October 24, 2018
    Date of Patent: April 14, 2020
    Assignee: Disney Enterprises, Inc.
    Inventors: Randall S. Davis, Elliott H. Baumbach, Nathan D. Nocon, Todd Michael Graham
  • Publication number: 20200055186
    Abstract: Systems, methods and articles of manufacture for synchronized robot orientation are described herein. A magnetometer, gyroscope, and accelerometer in a remotely controlled device are used to determine a current orientation of that device, and a command with a specified orientation or location is sent to several such devices. The remotely controlled devices self-align based on the specified orientation/location, and when in position, receive swarm commands to perform actions as a group of devices in coordination with one another.
    Type: Application
    Filed: August 17, 2018
    Publication date: February 20, 2020
    Inventors: Nathan D. NOCON, Michael P. GOSLIN, Clifford WONG, Jonathan RD HSU, Timothy M. PANEC
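    Illustrative sketch: the self-alignment step can be pictured as each device turning toward the commanded heading until it is within tolerance, then reporting that it is in position. The 10-degree-per-tick turn rate and 2-degree tolerance below are assumptions; in a real device the current heading would come from fusing the magnetometer, gyroscope, and accelerometer.

      def heading_error_deg(current_deg, target_deg):
          """Smallest signed rotation (degrees) from the current heading to the target."""
          return (target_deg - current_deg + 180.0) % 360.0 - 180.0

      def align_step(current_deg, target_deg, tolerance_deg=2.0, turn_per_tick_deg=10.0):
          """One alignment tick; returns (new heading, True once in position)."""
          error = heading_error_deg(current_deg, target_deg)
          if abs(error) <= tolerance_deg:
              return current_deg, True       # in position: ready for the swarm command
          step = max(-turn_per_tick_deg, min(turn_per_tick_deg, error))
          return (current_deg + step) % 360.0, False

      heading, ready = 350.0, False
      while not ready:                       # device self-aligns to the commanded 45 degrees
          heading, ready = align_step(heading, 45.0)
      print(heading, ready)                  # 45.0 True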
  • Publication number: 20200058168
    Abstract: Techniques for aligning a virtual object with a physical object in an Augmented Reality (AR) or Virtual Reality (VR) application are described. An electronic peripheral includes a first inertial measurement unit (“IMU”). A head mounted display includes a second IMU. An estimated attitude for the electronic peripheral is generated using data from the first IMU. An estimated attitude for the head mounted display is generated using data from the second IMU. An orientation of a virtual object is determined based on the estimated first and second attitudes, such that the virtual object is aligned with an object in a user's physical environment when the virtual object is displayed to the user. The virtual object is displayed on the head mounted display.
    Type: Application
    Filed: August 17, 2018
    Publication date: February 20, 2020
    Inventors: Nathan D. NOCON, Amy E. NELSON, Michael P. GOSLIN, Seth A. DAVIS, Randall S. DAVIS
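    Illustrative sketch: with world-frame attitude estimates from both IMUs, the alignment reduces to one relative rotation, the peripheral's attitude expressed in the headset's frame. The quaternion convention ([w, x, y, z], Hamilton product) is an assumption of this sketch, not taken from the publication.

      import numpy as np

      def quat_conj(q):
          """Conjugate of a unit quaternion [w, x, y, z]."""
          w, x, y, z = q
          return np.array([w, -x, -y, -z])

      def quat_mul(a, b):
          """Hamilton product of two quaternions [w, x, y, z]."""
          aw, ax, ay, az = a
          bw, bx, by, bz = b
          return np.array([
              aw * bw - ax * bx - ay * by - az * bz,
              aw * bx + ax * bw + ay * bz - az * by,
              aw * by - ax * bz + ay * bw + az * bx,
              aw * bz + ax * by - ay * bx + az * bw,
          ])

      def peripheral_in_headset_frame(q_headset, q_peripheral):
          """Attitude of the peripheral relative to the head mounted display."""
          return quat_mul(quat_conj(q_headset), q_peripheral)

      # Headset level, peripheral yawed 90 degrees: the relative attitude is that yaw.
      identity = np.array([1.0, 0.0, 0.0, 0.0])
      yaw90 = np.array([np.cos(np.pi / 4), 0.0, 0.0, np.sin(np.pi / 4)])
      print(peripheral_in_headset_frame(identity, yaw90))  # ~[0.707 0 0 0.707]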
  • Publication number: 20200036609
    Abstract: Systems, methods and articles of manufacture that handle secondary robot commands in robot swarms may operate by receiving, at a receiving device in a swarm of devices, a packet included in a signal broadcast within an environment from a transmitting device in the swarm of devices; parsing the packet for a command associated with a primary effect and a secondary effect; in response to determining that the receiving device is paired with the transmitting device, implementing, by the receiving device, the primary effect; and in response to determining that the receiving device is not paired with the transmitting device, implementing, by the receiving device, the secondary effect.
    Type: Application
    Filed: July 27, 2018
    Publication date: January 30, 2020
    Inventors: Nathan D. NOCON, Michael P. GOSLIN, Janice K. ROSENTHAL, Corey D. DRAKE
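    Illustrative sketch: the receive-side logic is a single branch on pairing. The JSON packet layout below (sender, primary, secondary fields) is invented for this sketch; the publication only requires that one broadcast command carry both effects.

      import json

      def handle_swarm_packet(paired_with, raw_packet):
          """Return the effect this receiving device should implement for a broadcast command."""
          packet = json.loads(raw_packet)
          if packet["sender"] == paired_with:
              return packet["primary"]     # paired receiver implements the primary effect
          return packet["secondary"]       # any other receiver implements the secondary effect

      packet = json.dumps({
          "sender": "blaster-7",
          "primary": "play_hit_animation",
          "secondary": "flash_ambient_lights",
      })
      print(handle_swarm_packet(paired_with="blaster-7", raw_packet=packet))  # primary effect
      print(handle_swarm_packet(paired_with="blaster-2", raw_packet=packet))  # secondary effect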
  • Publication number: 20200004235
    Abstract: Embodiments provide for autonomous drone play and directional alignment by: in response to receiving a command for a remotely controlled device to perform a behavior, monitoring a first series of actions performed by the remotely controlled device that comprise the behavior; receiving feedback related to how the remotely controlled device performs the behavior, wherein the feedback is received from at least one of a user, a second device, and environmental sensors; updating, according to the feedback, a machine learning model used by the remotely controlled device to produce a second, different series of actions to perform the behavior; and in response to receiving a subsequent command to perform the behavior, instructing the remotely controlled device to perform the second series of actions.
    Type: Application
    Filed: May 8, 2019
    Publication date: January 2, 2020
    Inventors: Nathan D. NOCON, Timothy M. PANEC, Tritia V. MEDRANO, Clifford WONG, Nicholas F. BARONE, Elliott H. BAUMBACH
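    Illustrative sketch: the feedback loop can be pictured with a toy model that replans the behavior after each round of feedback. A single speed scale adjusted by "too slow"/"too fast" feedback is a deliberately crude stand-in for the machine learning model named in the publication.

      class BehaviorModel:
          """Toy model: a speed scale applied to each action's duration."""

          def __init__(self, actions):
              self.actions = list(actions)   # e.g. [("spin", 2.0), ("beep", 0.5)]
              self.speed_scale = 1.0

          def plan(self):
              """Series of actions the device performs for the behavior."""
              return [(name, seconds / self.speed_scale) for name, seconds in self.actions]

          def apply_feedback(self, feedback):
              """Update the model from user, device, or environmental-sensor feedback."""
              if feedback == "too slow":
                  self.speed_scale *= 1.2
              elif feedback == "too fast":
                  self.speed_scale /= 1.2

      model = BehaviorModel([("spin", 2.0), ("beep", 0.5)])
      print(model.plan())                    # first series of actions
      model.apply_feedback("too slow")       # feedback after the first run
      print(model.plan())                    # second, different series for the same command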
  • Publication number: 20190379465
    Abstract: A method and related system and apparatus are described, in which the method comprises detecting a predefined event using an omnidirectional antenna of a first interactive device, and responsive to detecting the predefined event, enabling a directional antenna of the first interactive device. The method further comprises transmitting a first signal using the directional antenna, and responsive to receiving the first signal at a second interactive device, performing an audiovisual effect.
    Type: Application
    Filed: June 8, 2018
    Publication date: December 12, 2019
    Inventor: Nathan D. NOCON
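    Illustrative sketch: the claimed sequence is event detection on an always-on omnidirectional antenna, then a transmission on a directional antenna that a second device answers with an audiovisual effect. The random event stand-in and the direct function wiring below are assumptions made so the sketch runs without radio hardware.

      import random
      import time

      def detect_predefined_event():
          """Stand-in for the omnidirectional antenna noticing the predefined event."""
          return random.random() < 0.3

      def first_device_loop(directional_tx, max_checks=20):
          """Keep the directional antenna idle until the predefined event is detected."""
          for _ in range(max_checks):
              if detect_predefined_event():
                  directional_tx("perform_effect")   # enable directional antenna and transmit
                  return True
              time.sleep(0.05)                       # keep listening omnidirectionally
          return False

      def second_device_on_signal(signal):
          """Second interactive device reacts to the directional signal."""
          if signal == "perform_effect":
              print("playing light-and-sound effect")

      first_device_loop(second_device_on_signal)     # the two devices are wired directly here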
  • Publication number: 20190243594
    Abstract: Techniques for improved interactive devices are provided. Input is received from a user, where the input includes a first request. The input is evaluated using one or more natural language processing techniques to determine a context of the input, and a response to the input is generated based at least in part on the determined context. A first virtual character of a plurality of virtual characters is selected based at least in part on the determined context. The first virtual character is displayed on a rotating display, and the generated response is implemented while the first virtual character is being displayed.
    Type: Application
    Filed: February 5, 2019
    Publication date: August 8, 2019
    Inventors: Michael P. GOSLIN, Nathan D. NOCON, Timothy M. PANEC, Jonathan RD HSU, Janice K. ROSENTHAL
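    Illustrative sketch: a keyword lookup stands in for the natural language processing in the entry above; the characters, phrases, and default context are invented for illustration.

      CHARACTER_BY_CONTEXT = {
          "weather": "cheerful meteorologist",
          "music": "singing robot",
          "bedtime": "sleepy storyteller",
      }

      def determine_context(utterance):
          """Very rough context detection: the first known topic word found in the request."""
          words = utterance.lower().split()
          for topic in CHARACTER_BY_CONTEXT:
              if topic in words:
                  return topic
          return "bedtime"                           # hypothetical default context

      def respond(utterance):
          context = determine_context(utterance)
          character = CHARACTER_BY_CONTEXT[context]  # character shown on the rotating display
          return character, f"[{character}] Here's something about {context}."

      print(respond("tell me about the weather today"))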
  • Patent number: D891429
    Type: Grant
    Filed: May 22, 2018
    Date of Patent: July 28, 2020
    Assignee: Disney Enterprises, Inc.
    Inventors: Jonathan RD Hsu, Nathan D. Nocon