Patents by Inventor James M. Powderly

James M. Powderly has filed for patents to protect the following inventions. This listing includes both pending patent applications and patents already granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20200082637
    Abstract: A wearable system can comprise a display system configured to present virtual content in a three-dimensional space, a user input device configured to receive a user input, and one or more sensors configured to detect a user's pose. The wearable system can support various user interactions with objects in the user's environment based on contextual information. As an example, the wearable system can adjust the size of an aperture of a virtual cone during a cone cast (e.g., with the user's poses) based on the contextual information. As another example, the wearable system can adjust the amount of movement of virtual objects associated with an actuation of the user input device based on the contextual information.
    Type: Application
    Filed: November 13, 2019
    Publication date: March 12, 2020
    Inventors: James M. Powderly, Savannah Niles, Frank Hamilton, Marshal A. Fontaine, Paul Armistead Hoover
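    A minimal Python sketch of the contextual cone-cast adjustment described in this abstract; the object model, the density heuristic, and all thresholds are illustrative assumptions rather than the patented method.

      import math
      from dataclasses import dataclass

      @dataclass
      class VirtualObject:
          name: str
          position: tuple  # (x, y, z) in world coordinates

      def angle_between(direction, target):
          """Angle in radians between the cast direction and the direction to a target."""
          dot = sum(d * t for d, t in zip(direction, target))
          norm = math.sqrt(sum(d * d for d in direction)) * math.sqrt(sum(t * t for t in target))
          return math.acos(max(-1.0, min(1.0, dot / norm)))

      def cone_cast(origin, direction, objects, base_aperture_deg=5.0):
          """Cast a cone from origin along direction, widening the aperture when few
          objects are present and narrowing it when the scene is dense."""
          density = len(objects)  # crude stand-in for "contextual information"
          aperture_deg = base_aperture_deg * (2.0 if density <= 3 else 0.5 if density >= 20 else 1.0)
          half_angle = math.radians(aperture_deg) / 2.0
          hits = []
          for obj in objects:
              to_obj = tuple(p - o for p, o in zip(obj.position, origin))
              if angle_between(direction, to_obj) <= half_angle:
                  hits.append(obj)
          return aperture_deg, hits

      # Example: a sparse scene gets a wider cone, and only the nearby menu is hit.
      scene = [VirtualObject("menu", (0.1, 0.0, 2.0)), VirtualObject("browser", (1.5, 0.2, 3.0))]
      aperture, selected = cone_cast((0, 0, 0), (0, 0, 1), scene)
      print(aperture, [o.name for o in selected])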
  • Patent number: 10540941
    Abstract: Systems and methods for displaying a cursor and a focus indicator associated with real or virtual objects in a virtual, augmented, or mixed reality environment by a wearable display device are disclosed. The system can determine a spatial relationship between a user-movable cursor and a target object within the environment. The system may render a focus indicator (e.g., a halo, shading, or highlighting) around or adjacent objects that are near the cursor. The focus indicator may be emphasized in directions closer to the cursor and deemphasized in directions farther from the cursor. When the cursor overlaps with a target object, the system can render the object in front of the cursor (or not render the cursor at all), so the object is not occluded by the cursor. The cursor and focus indicator can provide the user with positional feedback and help the user navigate among objects in the environment.
    Type: Grant
    Filed: January 30, 2018
    Date of Patent: January 21, 2020
    Assignee: Magic Leap, Inc.
    Inventors: John Austin Day, Lorena Pazmino, James Cameron Petty, Paul Armistead Hoover, Chris Sorrell, James M. Powderly, Savannah Niles
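    A minimal Python sketch of the cursor and focus-indicator behavior from this abstract, assuming 2D screen-space positions; the halo-intensity falloff and the draw-order rule are illustrative simplifications.

      import math

      def halo_intensity(cursor, obj_center, obj_radius, falloff=150.0):
          """Focus-indicator strength in [0, 1]: strong near the cursor, fading with distance."""
          gap = max(0.0, math.dist(cursor, obj_center) - obj_radius)
          return max(0.0, 1.0 - gap / falloff)

      def draw_order(cursor, obj_center, obj_radius):
          """Back-to-front order: when the cursor overlaps the object, draw the object
          in front so it is not occluded by the cursor."""
          overlapping = math.dist(cursor, obj_center) <= obj_radius
          return ["cursor", "object"] if overlapping else ["object", "cursor"]

      cursor, icon_center, icon_radius = (100.0, 100.0), (160.0, 100.0), 30.0
      print(round(halo_intensity(cursor, icon_center, icon_radius), 2))  # 0.8
      print(draw_order(cursor, icon_center, icon_radius))                # ['object', 'cursor']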
  • Patent number: 10521025
    Abstract: Systems and methods for interacting with virtual objects in a three-dimensional space using a wearable system are disclosed. The wearable system can be programmed to allow a user to interact with virtual objects using a user input device and poses. The wearable system can also automatically determine contextual information such as layout of the virtual objects in the user's environment and switch the user input mode based on the contextual information.
    Type: Grant
    Filed: October 18, 2016
    Date of Patent: December 31, 2019
    Assignee: Magic Leap, Inc.
    Inventors: James M. Powderly, Savannah Niles, Frank Hamilton, Marshal A. Fontaine, Rony Abovitz, Alysha Naples
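    A minimal Python sketch of the contextual input-mode switching this abstract describes; the layout-density heuristic, spacing threshold, and mode names are assumptions for illustration only.

      from typing import List, Tuple

      def choose_input_mode(object_positions: List[Tuple[float, float]],
                            min_spacing: float = 0.25) -> str:
          """Use coarse head-pose interaction when objects are far apart and switch to
          the handheld controller when the layout is dense."""
          if len(object_positions) < 2:
              return "head_pose"
          smallest = min(
              ((ax - bx) ** 2 + (ay - by) ** 2) ** 0.5
              for i, (ax, ay) in enumerate(object_positions)
              for (bx, by) in object_positions[i + 1:]
          )
          return "controller" if smallest < min_spacing else "head_pose"

      print(choose_input_mode([(0.0, 0.0), (2.0, 1.0)]))                 # head_pose
      print(choose_input_mode([(0.0, 0.0), (0.1, 0.05), (0.15, 0.1)]))   # controller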
  • Patent number: 10510191
    Abstract: A wearable system can comprise a display system configured to present virtual content in a three-dimensional space, a user input device configured to receive a user input, and one or more sensors configured to detect a user's pose. The wearable system can support various user interactions with objects in the user's environment based on contextual information. As an example, the wearable system can adjust the size of an aperture of a virtual cone during a cone cast (e.g., with the user's poses) based on the contextual information. As another example, the wearable system can adjust the amount of movement of virtual objects associated with an actuation of the user input device based on the contextual information.
    Type: Grant
    Filed: August 2, 2019
    Date of Patent: December 17, 2019
    Assignee: Magic Leap, Inc.
    Inventors: James M. Powderly, Savannah Niles, Frank Hamilton, Marshal A. Fontaine, Paul Armistead Hoover
  • Publication number: 20190377487
    Abstract: Systems and methods for displaying a group of virtual objects and a scrollbar in a virtual, augmented, or mixed reality environment are disclosed. The group of virtual objects can be scrolled, and a virtual control panel can be displayed indicating objects that are upcoming in the scroll. The scrollbar can provide real-time feedback that can give the user an indication of the point from which the scrolling started, the point at which the scrolling currently has reached, an amount of the group of virtual objects that are displayed to the user relative to the total amount of the group of virtual objects, or a relative position of the currently-viewable virtual objects relative to the entire group of virtual objects.
    Type: Application
    Filed: May 29, 2019
    Publication date: December 12, 2019
    Inventors: Richard St. Clair Bailey, Brian Everett Meaney, John Austin Day, Lorena Pazmino, James Cameron Petty, James M. Powderly, Savannah Niles
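    A minimal Python sketch of the scrollbar feedback described in this abstract: the thumb length reflects how much of the group is visible, its offset reflects where the visible window sits, and a marker records where scrolling started. Field names are illustrative.

      from dataclasses import dataclass

      @dataclass
      class ScrollbarState:
          thumb_fraction: float   # visible objects / total objects (thumb length)
          thumb_offset: float     # 0.0 = top of the group, 1.0 = bottom
          start_marker: float     # position where the current scroll gesture began

      def scrollbar_state(total: int, first_visible: int, visible_count: int,
                          scroll_start_index: int) -> ScrollbarState:
          total = max(total, 1)
          max_first = max(total - visible_count, 1)
          return ScrollbarState(
              thumb_fraction=min(1.0, visible_count / total),
              thumb_offset=min(1.0, first_visible / max_first),
              start_marker=min(1.0, scroll_start_index / max_first),
          )

      # 50 objects, 10 visible, scrolled to item 20, gesture began at item 0.
      print(scrollbar_state(50, 20, 10, 0))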
  • Publication number: 20190355180
    Abstract: A wearable system can comprise a display system configured to present virtual content in a three-dimensional space, a user input device configured to receive a user input, and one or more sensors configured to detect a user's pose. The wearable system can support various user interactions with objects in the user's environment based on contextual information. As an example, the wearable system can adjust the size of an aperture of a virtual cone during a cone cast (e.g., with the user's poses) based on the contextual information. As another example, the wearable system can adjust the amount of movement of virtual objects associated with an actuation of the user input device based on the contextual information.
    Type: Application
    Filed: August 2, 2019
    Publication date: November 21, 2019
    Inventors: James M. Powderly, Savannah Niles, Frank Hamilton, Marshal A. Fontaine, Paul Armistead Hoover
  • Patent number: 10417831
    Abstract: A wearable system can comprise a display system configured to present virtual content in a three-dimensional space, a user input device configured to receive a user input, and one or more sensors configured to detect a user's pose. The wearable system can support various user interactions with objects in the user's environment based on contextual information. As an example, the wearable system can adjust the size of an aperture of a virtual cone during a cone cast (e.g., with the user's poses) based on the contextual information. As another example, the wearable system can adjust the amount of movement of virtual objects associated with an actuation of the user input device based on the contextual information.
    Type: Grant
    Filed: August 2, 2018
    Date of Patent: September 17, 2019
    Assignee: Magic Leap, Inc.
    Inventors: James M. Powderly, Savannah Niles, Frank Hamilton, Marshal A. Fontaine, Paul Armistead Hoover
  • Publication number: 20190237044
    Abstract: Systems and methods for displaying a cursor and a focus indicator associated with real or virtual objects in a virtual, augmented, or mixed reality environment by a wearable display device are disclosed. The system can determine a spatial relationship between a user-movable cursor and a target object within the environment. The system may render a focus indicator (e.g., a halo, shading, or highlighting) around or adjacent objects that are near the cursor. The focus indicator may be emphasized in directions closer to the cursor and deemphasized in directions farther from the cursor. When the cursor overlaps with a target object, the system can render the object in front of the cursor (or not render the cursor at all), so the object is not occluded by the cursor. The cursor and focus indicator can provide the user with positional feedback and help the user navigate among objects in the environment.
    Type: Application
    Filed: January 30, 2018
    Publication date: August 1, 2019
    Inventors: John Austin Day, Lorena Pazmino, James Cameron Petty, Paul Armistead Hoover, Chris Sorrell, James M. Powderly, Savannah Niles
  • Publication number: 20190235729
    Abstract: Systems and methods for displaying a cursor and a focus indicator associated with real or virtual objects in a virtual, augmented, or mixed reality environment by a wearable display device are disclosed. The system can determine a spatial relationship between a user-movable cursor and a target object within the environment. The system may render a focus indicator (e.g., a halo, shading, or highlighting) around or adjacent objects that are near the cursor. When the cursor overlaps with a target object, the system can render the object in front of the cursor (or not render the cursor at all), so the object is not occluded by the cursor. The object can be rendered closer to the user than the cursor. A group of virtual objects can be scrolled, and a virtual control panel can be displayed indicating objects that are upcoming in the scroll.
    Type: Application
    Filed: March 14, 2018
    Publication date: August 1, 2019
    Inventors: John Austin Day, Lorena Pazmino, James Cameron Petty, Paul Armistead Hoover, Chris Sorrell, James M. Powderly, Savannah Niles, Richard Bailey
  • Publication number: 20180365901
    Abstract: A wearable system can comprise a display system configured to present virtual content in a three-dimensional space, a user input device configured to receive a user input, and one or more sensors configured to detect a user's pose. The wearable system can support various user interactions with objects in the user's environment based on contextual information. As an example, the wearable system can adjust the size of an aperture of a virtual cone during a cone cast (e.g., with the user's poses) based on the contextual information. As another example, the wearable system can adjust the amount of movement of virtual objects associated with an actuation of the user input device based on the contextual information.
    Type: Application
    Filed: August 2, 2018
    Publication date: December 20, 2018
    Inventors: James M. Powderly, Savannah Niles, Frank Hamilton, Marshal A. Fontaine, Paul Armistead Hoover
  • Publication number: 20180350150
    Abstract: User interfaces for virtual reality, augmented reality, and mixed reality display systems are disclosed. The user interfaces may be virtual or physical keyboards. Techniques are described for displaying, configuring, and/or interacting with the user interfaces.
    Type: Application
    Filed: May 18, 2018
    Publication date: December 6, 2018
    Inventors: James M. Powderly, Savannah Niles, Haney Awad, William Wheeler, Nari Choi, Timothy Michael Stutts, Josh Anon, Jeffrey Scott Sommers
  • Publication number: 20180314416
    Abstract: A light emitting user input device can include a touch sensitive portion configured to accept user input (e.g., from a user's thumb) and a light emitting portion configured to output a light pattern. The light pattern can be used to assist the user in interacting with the user input device. Examples include emulating a multi-degree-of-freedom controller, indicating scrolling or swiping actions, indicating presence of objects nearby the device, indicating receipt of notifications, assisting pairing the user input device with another device, or assisting calibrating the user input device. The light emitting user input device can be used to provide user input to a wearable device, such as, e.g., a head mounted display device.
    Type: Application
    Filed: April 27, 2018
    Publication date: November 1, 2018
    Inventors: James M. Powderly, Savannah Niles, Christopher David Nesladek, Isioma Osagbemwenorue Azu, Marshal Ainsworth Fontaine, Haney Awad, William Wheeler, Brian David Schwab, Brian Bucknor
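    A minimal Python sketch of mapping device events to halo light patterns as described in this abstract; the event names, colors, and animations are illustrative assumptions, not the claimed patterns.

      def light_pattern(event: str) -> dict:
          """Describe the halo animation shown on the device for a given event."""
          patterns = {
              "scroll": {"shape": "arc", "color": "blue", "animation": "sweep"},
              "notification": {"shape": "ring", "color": "purple", "animation": "pulse"},
              "pairing": {"shape": "ring", "color": "white", "animation": "blink"},
              "object_nearby": {"shape": "arc", "color": "green", "animation": "glow"},
          }
          return patterns.get(event, {"shape": "ring", "color": "off", "animation": "none"})

      for event in ("scroll", "pairing", "unknown"):
          print(event, "->", light_pattern(event))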
  • Publication number: 20180314406
    Abstract: A light emitting user input device can include a touch sensitive portion configured to accept user input (e.g., from a user's thumb) and a light emitting portion configured to output a light pattern. The light pattern can be used to assist the user in interacting with the user input device. Examples include emulating a multi-degree-of-freedom controller, indicating scrolling or swiping actions, indicating presence of objects nearby the device, indicating receipt of notifications, assisting pairing the user input device with another device, or assisting calibrating the user input device. The light emitting user input device can be used to provide user input to a wearable device, such as, e.g., a head mounted display device.
    Type: Application
    Filed: April 27, 2018
    Publication date: November 1, 2018
    Inventors: James M. Powderly, Savannah Niles, Christopher David Nesladek, Isioma Osagbemwenorue Azu, Marshal Ainsworth Fontaine, Haney Awad, William Wheeler, Brian David Schwab, Brian Bucknor
  • Publication number: 20180307303
    Abstract: Examples of wearable systems and methods can use multiple inputs (e.g., gesture, head pose, eye gaze, voice, and/or environmental factors (e.g., location)) to determine a command that should be executed and objects in the three-dimensional (3D) environment that should be operated on. The multiple inputs can also be used by the wearable system to permit a user to interact with text, such as, e.g., composing, selecting, or editing text.
    Type: Application
    Filed: April 17, 2018
    Publication date: October 25, 2018
    Inventors: James M. Powderly, Savannah Niles, Jennifer M.R. Devine, Adam C. Carlson, Jeffrey Sommers, Praveen Babu J D
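    A minimal Python sketch of combining eye gaze, head pose, and a voice utterance to pick a command and its target object; the weighting scheme and the Command structure are assumptions made for illustration.

      import math
      from dataclasses import dataclass

      @dataclass
      class Command:
          verb: str      # e.g. "move", "delete"
          target: str    # name of the selected virtual object

      def select_target(objects, gaze_point, head_point, gaze_weight=0.7):
          """Score each object by proximity to the gaze and head-pose points; lower is better."""
          def score(pos):
              return (gaze_weight * math.dist(pos, gaze_point)
                      + (1 - gaze_weight) * math.dist(pos, head_point))
          return min(objects, key=lambda item: score(item[1]))[0]

      def interpret(voice_text, objects, gaze_point, head_point) -> Command:
          verb = voice_text.split()[0].lower() if voice_text else "select"
          return Command(verb=verb, target=select_target(objects, gaze_point, head_point))

      scene = [("browser", (0.0, 0.0)), ("chess_board", (1.0, 1.0))]
      print(interpret("move that over there", scene, gaze_point=(0.9, 0.9), head_point=(0.5, 0.5)))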
  • Patent number: 10078919
    Abstract: A wearable system can comprise a display system configured to present virtual content in a three-dimensional space, a user input device configured to receive a user input, and one or more sensors configured to detect a user's pose. The wearable system can support various user interactions with objects in the user's environment based on contextual information. As an example, the wearable system can adjust the size of an aperture of a virtual cone during a cone cast (e.g., with the user's poses) based on the contextual information. As another example, the wearable system can adjust the amount of movement of virtual objects associated with an actuation of the user input device based on the contextual information.
    Type: Grant
    Filed: March 29, 2017
    Date of Patent: September 18, 2018
    Assignee: Magic Leap, Inc.
    Inventors: James M. Powderly, Savannah Niles, Frank Hamilton, Marshal A. Fontaine, Paul Armistead Hoover
  • Publication number: 20180189568
    Abstract: Embodiments of a wearable device can include a head-mounted display (HMD) which can be configured to display virtual content. While the user is interacting with visual or audible virtual content, the user of the wearable may encounter a triggering event such as, for example, an emergency condition or an unsafe condition, detecting one or more triggering objects in an environment, or determining characteristics of the user's environment (e.g., home or office). Embodiments of the wearable device can automatically detect the triggering event and automatically control the HMD to deemphasize, block, or stop displaying the virtual content. The HMD may include a button that can be actuated by the user to manually deemphasize, block, or stop displaying the virtual content.
    Type: Application
    Filed: November 17, 2017
    Publication date: July 5, 2018
    Inventors: James M. Powderly, Savannah Niles, Nicole Elizabeth Samec, Ali Amirhooshmand, Nastasja U. Robaina, Christopher M. Harrises, Mark Baerenrodt, Carlos A. Rivera Cintron, Brian Keith Smith
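    A minimal Python sketch of reacting to a triggering event by deemphasizing or blocking virtual content; the event categories and response names are illustrative assumptions.

      def respond_to_trigger(event: str, manual_button_pressed: bool = False) -> str:
          """Decide how the HMD should treat virtual content for a detected event."""
          if manual_button_pressed:
              return "stop_rendering"                  # the user explicitly muted the display
          severe = {"emergency", "unsafe_condition"}
          mild = {"triggering_object_detected", "entered_office"}
          if event in severe:
              return "block_virtual_content"
          if event in mild:
              return "deemphasize_virtual_content"     # e.g. dim or shrink the content
          return "keep_displaying"

      for event in ("emergency", "entered_office", "none"):
          print(event, "->", respond_to_trigger(event))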
  • Publication number: 20170337742
    Abstract: Examples of systems and methods for a wearable system to automatically select or filter available user interface interactions or virtual objects are disclosed. The wearable system can select a group of virtual objects for user interaction based on contextual information associated with the user, the user's environment, physical or virtual objects in the user's environment, or the user's physiological or psychological state.
    Type: Application
    Filed: May 18, 2017
    Publication date: November 23, 2017
    Inventors: James M. Powderly, Alysha Naples, Paul Armistead Hoover, Tucker Spofford
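    A minimal Python sketch of contextually filtering which virtual objects are offered for interaction; the Context fields and the filtering rules are illustrative assumptions about what the contextual information could contain.

      from dataclasses import dataclass
      from typing import List

      @dataclass
      class Context:
          location: str               # e.g. "kitchen", "office"
          user_state: str = "relaxed" # e.g. "relaxed", "busy"

      def filter_objects(candidates: List[dict], ctx: Context) -> List[str]:
          """Keep only virtual objects whose tags fit the current context."""
          selected = []
          for obj in candidates:
              if ctx.location in obj.get("locations", []) and (
                      ctx.user_state != "busy" or obj.get("low_distraction", False)):
                  selected.append(obj["name"])
          return selected

      catalog = [
          {"name": "recipe_panel", "locations": ["kitchen"], "low_distraction": True},
          {"name": "movie_screen", "locations": ["living_room"]},
          {"name": "timer", "locations": ["kitchen", "office"], "low_distraction": True},
      ]
      print(filter_objects(catalog, Context(location="kitchen", user_state="busy")))  # ['recipe_panel', 'timer']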
  • Patent number: D868108
    Type: Grant
    Filed: March 16, 2018
    Date of Patent: November 26, 2019
    Assignee: Magic Leap, Inc.
    Inventors: Amy Dedonato, Yan Xu, Lorena Pazmino, Marc Coleman Shelton, James M. Powderly, Dylan Nathan, Li Chin Lin
  • Patent number: D873285
    Type: Grant
    Filed: July 24, 2018
    Date of Patent: January 21, 2020
    Assignee: Magic Leap, Inc.
    Inventors: Lorena Pazmino, Gregory Minh Tran, Karen Stolzenberg, James M. Powderly, Savannah Niles, William Adams, Ian Michener Bradley
  • Patent number: D873852
    Type: Grant
    Filed: July 24, 2018
    Date of Patent: January 28, 2020
    Assignee: Magic Leap, Inc.
    Inventors: Lorena Pazmino, Gregory Minh Tran, Karen Stolzenberg, James M. Powderly, Savannah Niles, William Adams, Ian Michener Bradley