Patents by Inventor Michael Ishigaki

Michael Ishigaki has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20250054227
    Abstract: A method implemented by a computing device includes displaying on a display of the computing device an extended reality (XR) environment, and determining one or more virtual characteristics associated with a first virtual content and a second virtual content viewable within the displayed XR environment, in which the second virtual content is at least partially occluded by the first virtual content. The method further includes generating, based on the one or more virtual characteristics, a plurality of user input interception layers to be associated with the first virtual content and the second virtual content, and in response to determining a user intent to interact with the second virtual content, directing one or more user inputs to the second virtual content based on whether or not the one or more user inputs are intercepted by one or more of the plurality of user input interception layers.
    Type: Application
    Filed: August 9, 2023
    Publication date: February 13, 2025
    Inventors: Michael Ishigaki, Shengzhi Wu
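The interception-layer mechanism described in the abstract can be sketched as follows. This is an illustrative reading, not the patented implementation: layer ordering, the rectangle-based hit test, and the `intended_target` parameter are all assumptions made for the sketch.

```python
from dataclasses import dataclass, field

@dataclass
class InterceptionLayer:
    """An input interception layer tied to one piece of virtual content.

    `intercepts` holds screen-space regions (x, y, w, h) in which this
    layer captures user input rather than letting it pass through.
    """
    content_id: str
    intercepts: list = field(default_factory=list)

    def contains(self, x, y):
        return any(rx <= x < rx + rw and ry <= y < ry + rh
                   for rx, ry, rw, rh in self.intercepts)

def route_input(layers, x, y, intended_target=None):
    """Direct a user input to content, walking layers front to back.

    When the user's intent is determined to be the occluded (rear)
    content, the layers in front stop intercepting and the input
    falls through to the intended target.
    """
    for layer in layers:  # ordered front (occluding) to back (occluded)
        if layer.content_id == intended_target:
            return layer.content_id
        if layer.contains(x, y) and intended_target is None:
            return layer.content_id
    return None
```

With two fully overlapping layers, an input with no determined intent resolves to the front (occluding) content, while a determined intent to reach the rear content bypasses the front layer's interception.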
  • Publication number: 20240264851
    Abstract: The present disclosure provides world-controlled augments and application-controlled augments. World-controlled augments can be controlled directly by a shell in the artificial reality environment. To allow even inexperienced users to develop world-controlled augments, a world-controlled builder system is provided. Application-controlled augments may be resource intensive (e.g., using eye-tracking, social-media tie-ins, etc.), may support complicated interactions among themselves, may require or have extensive use of inputs and permissioned resources, and are controlled by their hosting application. When a running application is halted, the application closes its application-controlled augments but can choose to have the XR system run a “place-holder” world-controlled augment for as long as the application is not running. The place-holder world-controlled augment preserves the appearance of the application in the artificial reality environment but uses few system resources.
    Type: Application
    Filed: April 16, 2024
    Publication date: August 8, 2024
    Inventors: John Jacob Blakeley, Michal Hlavac, Pol Pla I Conesa, Michael Ishigaki, Jonathan Michael Proto, Paul Mealy, Kevin Harper, Jenny Kam, Jossie E. Tirado Arroyo
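The place-holder lifecycle in this abstract (close application-controlled augments on halt, leave a cheap world-controlled stand-in with the shell) can be sketched like this. The class names, the single-appearance snapshot, and the `use_placeholder` flag are invented for illustration; the publication does not specify an API.

```python
class Shell:
    """Minimal stand-in for the XR shell that runs world-controlled augments."""
    def __init__(self):
        self.world_augments = {}

    def run_placeholder(self, app_name, appearance):
        # A place-holder preserves the app's appearance at low resource cost.
        self.world_augments[app_name] = {"appearance": appearance,
                                         "resource_cost": "low"}

    def remove_placeholder(self, app_name):
        self.world_augments.pop(app_name, None)

class Application:
    def __init__(self, name, shell, use_placeholder=True):
        self.name = name
        self.shell = shell
        self.use_placeholder = use_placeholder
        self.app_augments = []   # resource-intensive, app-controlled
        self.running = True

    def halt(self):
        """Close application-controlled augments; optionally ask the shell
        to run a world-controlled place-holder while the app is stopped."""
        snapshot = self.app_augments[0]["appearance"] if self.app_augments else None
        self.app_augments.clear()
        self.running = False
        if self.use_placeholder and snapshot is not None:
            self.shell.run_placeholder(self.name, snapshot)

    def resume(self):
        self.running = True
        self.shell.remove_placeholder(self.name)
```

On halt, the shell keeps a lightweight visual stand-in; on resume, the application reclaims its spot and the place-holder is removed.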
  • Publication number: 20240242591
    Abstract: The present disclosure provides for Contextual Alerting Functions (CAFs). CAFs are an interface paradigm for synchronized device frameworks across multi-device ecosystems and user data models. CAFs replace existing concepts such as apps and/or notifications.
    Type: Application
    Filed: December 15, 2021
    Publication date: July 18, 2024
    Inventors: Shengzhi Wu, Diane C. Wang, Michael Ishigaki, Elena Jessop Nattinger
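The abstract gives no implementation detail, but the idea of a Contextual Alerting Function, a rule that consumes synchronized context from a multi-device ecosystem and decides whether and where to surface information, can be illustrated with a toy example. The trigger condition, device-routing rule, and field names below are entirely invented for illustration.

```python
def contextual_alert(context, devices):
    """Illustrative CAF: given a user context synchronized across a
    device ecosystem, decide whether to surface an alert and on which
    device. Both the trigger (meeting within 10 minutes) and the
    routing rule (prefer the in-use device) are hypothetical.
    """
    if "meeting_in_minutes" not in context:
        return None
    if context["meeting_in_minutes"] > 10:
        return None
    active = [d for d in devices if d.get("in_use")]
    target = active[0] if active else devices[0]
    return {"device": target["name"],
            "message": f"Meeting in {context['meeting_in_minutes']} min"}
```

Unlike an app-bound notification, the function is evaluated against shared context and picks its own surface, which is the sense in which CAFs are framed as replacing apps and notifications.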
  • Patent number: 12026527
    Abstract: The present disclosure provides world-controlled augments and application-controlled augments. World-controlled augments can be controlled directly by a shell in the artificial reality environment. To allow even inexperienced users to develop world-controlled augments, a world-controlled builder system is provided. Application-controlled augments may be resource intensive (e.g., using eye-tracking, social-media tie-ins, etc.), may support complicated interactions among themselves, may require or have extensive use of inputs and permissioned resources, and are controlled by their hosting application. When a running application is halted, the application closes its application-controlled augments but can choose to have the XR system run a “place-holder” world-controlled augment for as long as the application is not running. The place-holder world-controlled augment preserves the appearance of the application in the artificial reality environment but uses few system resources.
    Type: Grant
    Filed: May 10, 2022
    Date of Patent: July 2, 2024
    Assignee: Meta Platforms Technologies, LLC
    Inventors: John Jacob Blakeley, Michal Hlavac, Pol Pla I Conesa, Michael Ishigaki, Jonathan Michael Proto, Paul Mealy, Kevin Harper, Jenny Kam, Jossie E. Tirado Arroyo
  • Publication number: 20230367611
    Abstract: The present disclosure provides world-controlled augments and application-controlled augments. World-controlled augments can be controlled directly by a shell in the artificial reality environment. To allow even inexperienced users to develop world-controlled augments, a world-controlled builder system is provided. Application-controlled augments may be resource intensive (e.g., using eye-tracking, social-media tie-ins, etc.), may support complicated interactions among themselves, may require or have extensive use of inputs and permissioned resources, and are controlled by their hosting application. When a running application is halted, the application closes its application-controlled augments but can choose to have the XR system run a “place-holder” world-controlled augment for as long as the application is not running. The place-holder world-controlled augment preserves the appearance of the application in the artificial reality environment but uses few system resources.
    Type: Application
    Filed: May 10, 2022
    Publication date: November 16, 2023
    Inventors: John Jacob Blakeley, Michal Hlavac, Pol Pla I Conesa, Michael Ishigaki, Jonathan Michael Proto, Paul Mealy, Kevin Harper, Jenny Kam, Jossie E. Tirado Arroyo
  • Publication number: 20230025516
    Abstract: The present disclosure provides a system and method for accurately detecting exercises performed by a user through a combination of signals from a visual input device and from one or more sensors of a wearable device. For each workout type, an algorithm leverages multimodal inputs for automatic workout detection/identification. Using multiple sources of visual and gestural inputs to detect the same workout results in a higher confidence in the detection. Moreover, it allows for continued detection of the workout, including counting repetitions, even when one or more signals becomes unavailable, such as if the user moves out of a field of view of the visual input device.
    Type: Application
    Filed: July 22, 2021
    Publication date: January 26, 2023
    Inventors: Diane C. Wang, Michael Ishigaki, Elena Jessop Nattinger
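The fusion behavior this abstract describes (higher confidence when visual and wearable-sensor signals agree, graceful fallback when one drops out) can be sketched with a simple noisy-OR combination. The fusion rule and the 0.6 threshold are illustrative assumptions, not taken from the publication.

```python
def fuse_detection(visual_conf, sensor_conf):
    """Combine per-modality workout-detection confidences.

    Returns (detected, confidence). When both modalities report, their
    agreement boosts the fused confidence above either alone; when one
    signal is unavailable (None), e.g. the user leaves the camera's
    field of view, detection continues on the remaining signal.
    """
    available = [c for c in (visual_conf, sensor_conf) if c is not None]
    if not available:
        return False, 0.0
    if len(available) == 2:
        # Noisy-OR fusion: two independent 0.5 signals yield 0.75.
        fused = 1.0 - (1.0 - available[0]) * (1.0 - available[1])
    else:
        fused = available[0]
    return fused >= 0.6, fused
```

Two moderately confident modalities together clear the detection threshold that neither would clear alone, while a single strong modality still suffices on its own.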
  • Patent number: 11501505
    Abstract: Systems and methods are described that obtain depth data associated with a scene captured by an electronic device, obtain location data associated with a plurality of physical objects within a predetermined distance of the electronic device, generate a plurality of augmented reality objects configured to be displayed over a portion of the plurality of physical objects, and generate a plurality of proximity layers corresponding to the at least one scene, wherein a respective proximity layer is configured to trigger display of the auxiliary data corresponding to AR objects associated with the respective proximity layer while suppressing other AR objects.
    Type: Grant
    Filed: July 27, 2021
    Date of Patent: November 15, 2022
    Assignee: GOOGLE LLC
    Inventors: Michael Ishigaki, Diane Wang
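The proximity-layer idea in this abstract, bucketing AR objects by distance and showing auxiliary data only for the active layer while suppressing the rest, can be sketched as follows. The distance-bucketing scheme and cut-off values are an illustrative reading of the abstract, not the claimed method.

```python
def build_proximity_layers(objects, bounds):
    """Bucket AR objects into proximity layers by distance.

    `objects` maps object id -> distance from the device (same units as
    `bounds`); `bounds` are ascending layer cut-offs, producing
    len(bounds) + 1 layers from nearest to farthest.
    """
    layers = [[] for _ in range(len(bounds) + 1)]
    for obj_id, dist in objects.items():
        idx = sum(dist >= b for b in bounds)  # count cut-offs passed
        layers[idx].append(obj_id)
    return layers

def visible_auxiliary_data(layers, active_layer):
    """Trigger display of auxiliary data only for AR objects in the
    active proximity layer; objects in all other layers are suppressed."""
    return sorted(layers[active_layer])
```

With cut-offs at 5 and 20 meters, an object 3 m away lands in the nearest layer and its auxiliary data is shown when that layer is active, while objects at 12 m and 40 m stay suppressed.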
  • Publication number: 20210358225
    Abstract: Systems and methods are described that obtain depth data associated with a scene captured by an electronic device, obtain location data associated with a plurality of physical objects within a predetermined distance of the electronic device, generate a plurality of augmented reality objects configured to be displayed over a portion of the plurality of physical objects, and generate a plurality of proximity layers corresponding to the at least one scene, wherein a respective proximity layer is configured to trigger display of the auxiliary data corresponding to AR objects associated with the respective proximity layer while suppressing other AR objects.
    Type: Application
    Filed: July 27, 2021
    Publication date: November 18, 2021
    Inventors: Michael Ishigaki, Diane Wang
  • Patent number: 11107291
    Abstract: Systems and methods are described that obtain depth data associated with a scene captured by an electronic device, obtain location data associated with a plurality of physical objects within a predetermined distance of the electronic device, generate a plurality of augmented reality objects configured to be displayed over a portion of the plurality of physical objects, and generate a plurality of proximity layers corresponding to the at least one scene, wherein a respective proximity layer is configured to trigger display of the auxiliary data corresponding to AR objects associated with the respective proximity layer while suppressing other AR objects.
    Type: Grant
    Filed: July 6, 2020
    Date of Patent: August 31, 2021
    Assignee: GOOGLE LLC
    Inventors: Michael Ishigaki, Diane Wang
  • Publication number: 20210012572
    Abstract: Systems and methods are described that obtain depth data associated with a scene captured by an electronic device, obtain location data associated with a plurality of physical objects within a predetermined distance of the electronic device, generate a plurality of augmented reality objects configured to be displayed over a portion of the plurality of physical objects, and generate a plurality of proximity layers corresponding to the at least one scene, wherein a respective proximity layer is configured to trigger display of the auxiliary data corresponding to AR objects associated with the respective proximity layer while suppressing other AR objects.
    Type: Application
    Filed: July 6, 2020
    Publication date: January 14, 2021
    Inventors: Michael Ishigaki, Diane Wang