Patents by Inventor James Tichenor

James Tichenor is a named inventor on the patent filings below. The listing includes pending patent applications as well as patents already granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11847753
    Abstract: Aspects of the present disclosure are directed to providing an artificial reality environment with augments and surfaces. An “augment” is a virtual container in 3D space that can include presentation data, context, and logic. An artificial reality system can use augments as the fundamental building block for displaying 2D and 3D models in the artificial reality environment. For example, augments can represent people, places, and things in an artificial reality environment and can respond to a context such as a current display mode, time of day, a type of surface the augment is on, a relationship to other augments, etc. Augments can be on a “surface” that has a layout and properties that cause augments on that surface to display in different ways. Augments and other objects (real or virtual) can also interact, where these interactions can be controlled by rules for the objects evaluated based on information from the shell.
    Type: Grant
    Filed: January 9, 2023
    Date of Patent: December 19, 2023
    Assignee: Meta Platforms Technologies, LLC
    Inventors: James Tichenor, Arthur Zwiegincew, Hayden Schoen, Alex Marcolina, Gregory Alt, Todd Harris, Merlyn Deng, Barrett Fox, Michal Hlavac
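The abstract above describes augments as virtual containers holding presentation data, context, and logic, placed on surfaces whose layouts affect how they display. A minimal Python sketch of that idea follows; all class names, rule functions, and context keys here are illustrative assumptions, not drawn from the patent's claims or any actual implementation.

```python
from dataclasses import dataclass, field

@dataclass
class Augment:
    """A virtual container: presentation data, context, and logic (rules)."""
    name: str
    presentation: dict                          # e.g. {"model": "...", "brightness": 1.0}
    context: dict = field(default_factory=dict)
    rules: list = field(default_factory=list)   # callables: (display, ctx) -> display

    def evaluate(self, shell_context: dict) -> dict:
        # Merge shell-provided context with the augment's own context,
        # then let each rule adjust the resolved display properties.
        ctx = {**self.context, **shell_context}
        display = dict(self.presentation)
        for rule in self.rules:
            display = rule(display, ctx)
        return display

@dataclass
class Surface:
    """A surface with a layout; augments on it display per that layout."""
    layout: str                                  # e.g. "grid", "stack"
    augments: list = field(default_factory=list)

    def render(self, shell_context: dict) -> list:
        ctx = {**shell_context, "surface_layout": self.layout}
        return [a.evaluate(ctx) for a in self.augments]

# Example rule: respond to a time-of-day context by dimming at night.
def dim_at_night(display, ctx):
    if ctx.get("time_of_day") == "night":
        display["brightness"] = 0.3
    return display

clock = Augment("clock", {"model": "clock.glb", "brightness": 1.0},
                rules=[dim_at_night])
wall = Surface("grid", [clock])
print(wall.render({"time_of_day": "night"})[0]["brightness"])  # 0.3
```

The key design point the abstract suggests is that display behavior is resolved from rules plus shell-supplied context, rather than hard-coded per object.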
  • Patent number: 11769304
    Abstract: Aspects of the present disclosure are directed to providing an artificial reality environment with augments and surfaces. An “augment” is a virtual container in 3D space that can include presentation data, context, and logic. An artificial reality system can use augments as the fundamental building block for displaying 2D and 3D models in the artificial reality environment. For example, augments can represent people, places, and things in an artificial reality environment and can respond to a context such as a current display mode, time of day, a type of surface the augment is on, a relationship to other augments, etc. Augments can be on a “surface” that has a layout and properties that cause augments on that surface to display in different ways. Augments and other objects (real or virtual) can also interact, where these interactions can be controlled by rules for the objects evaluated based on information from the shell.
    Type: Grant
    Filed: November 9, 2021
    Date of Patent: September 26, 2023
    Assignee: Meta Platforms Technologies, LLC
    Inventors: James Tichenor, Arthur Zwiegincew, Hayden Schoen, Alex Marcolina, Gregory Alt, Todd Harris, Merlyn Deng, Barrett Fox, Michal Hlavac
  • Patent number: 11651573
    Abstract: Aspects of the present disclosure are directed to providing an artificial reality environment with augments and surfaces. An “augment” is a virtual container in 3D space that can include presentation data, context, and logic. An artificial reality system can use augments as the fundamental building block for displaying 2D and 3D models in the artificial reality environment. For example, augments can represent people, places, and things in an artificial reality environment and can respond to a context such as a current display mode, time of day, a type of surface the augment is on, a relationship to other augments, etc. Augments can be on a “surface” that has a layout and properties that cause augments on that surface to display in different ways. Augments and other objects (real or virtual) can also interact, where these interactions can be controlled by rules for the objects evaluated based on information from the shell.
    Type: Grant
    Filed: October 12, 2021
    Date of Patent: May 16, 2023
    Assignee: Meta Platforms Technologies, LLC
    Inventors: James Tichenor, Arthur Zwiegincew, Hayden Schoen, Alex Marcolina, Gregory Alt, Todd Harris, Merlyn Deng, Barrett Fox, Michal Hlavac
  • Patent number: 11637999
    Abstract: Aspects of the present disclosure are directed to setting a display mode for a virtual object based on a display mode timer that is controlled by context factors. An artificial reality system can associate one or more virtual objects with a corresponding display mode timer. Various ranges on the display mode timer can be mapped to different display modes that the virtual object can assume. The display mode timer can be adjusted to add time based on a determination of a user focusing on the virtual object or other context factors. Display mode timers can also have rules for setting other display mode timer properties, such as how quickly the display mode timer runs down, that are evaluated based on context factors.
    Type: Grant
    Filed: October 14, 2021
    Date of Patent: April 25, 2023
    Assignee: Meta Platforms Technologies, LLC
    Inventors: James Tichenor, Hayden Schoen, Yeliz Karadayi
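The abstract above maps ranges of a per-object timer to display modes, adds time when the user focuses on the object, and lets context factors change how fast the timer runs down. A small sketch of that mechanism, with mode names, thresholds, and rates chosen purely for illustration:

```python
class DisplayModeTimer:
    # (mode, minimum seconds remaining for that mode), checked top-down.
    MODE_RANGES = [("full_3d", 30.0), ("preview_2d", 10.0), ("glint", 0.0)]

    def __init__(self, remaining=60.0, decay_rate=1.0):
        self.remaining = remaining
        self.decay_rate = decay_rate  # seconds drained per real second

    def tick(self, dt, user_focused=False, context=None):
        context = context or {}
        # Context rule: the timer runs down faster when off-surface.
        rate = self.decay_rate * (2.0 if context.get("off_surface") else 1.0)
        if user_focused:
            self.remaining += 5.0 * dt  # focusing on the object adds time
        self.remaining = max(0.0, self.remaining - rate * dt)

    @property
    def mode(self):
        for mode, threshold in self.MODE_RANGES:
            if self.remaining >= threshold:
                return mode
        return self.MODE_RANGES[-1][0]

timer = DisplayModeTimer(remaining=12.0)
print(timer.mode)   # "preview_2d": 10 <= remaining < 30
timer.tick(dt=5.0)
print(timer.mode)   # "glint": remaining has dropped to 7
```

Sustained focus keeps an object in a richer display mode, while neglected objects decay toward a minimal representation.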
  • Patent number: 11513605
    Abstract: Examples of mixed reality computing devices that utilize remote sensors and local sensors as input devices are disclosed. In one example, a mixed reality computing device comprises an image sensor, a remote input device, a processor, and storage comprising stored instructions. The stored instructions are executable by the processor to perform object motion tracking and environmental tracking based on output from the image sensor, and in response to detecting that the remote input device is in use, adjust a parameter of the motion tracking while maintaining the environmental tracking.
    Type: Grant
    Filed: October 22, 2020
    Date of Patent: November 29, 2022
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Lori Ann Potts, Lev Cherkashin, David Rohn, Steven James Velat, Andrew C. Goris, Scott Francis Fullam, Travis Scott Legg, Craig Haskins, James Tichenor
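The abstract above describes adjusting a parameter of camera-based motion tracking when a remote input device becomes active, while environmental tracking is maintained. A toy sketch of that control flow (the specific parameter, an update rate, and its values are assumptions for illustration):

```python
class MixedRealityTracker:
    """Toy model: image-sensor motion tracking plus environmental tracking."""

    def __init__(self):
        self.motion_tracking_hz = 60       # object/hand motion tracking rate
        self.environment_tracking_hz = 30  # world/environment tracking rate
        self.remote_device_active = False

    def on_remote_device(self, active: bool):
        self.remote_device_active = active
        # While the remote input device supplies input, de-prioritise
        # motion tracking; environmental tracking continues unchanged.
        self.motion_tracking_hz = 10 if active else 60

tracker = MixedRealityTracker()
tracker.on_remote_device(True)
print(tracker.motion_tracking_hz, tracker.environment_tracking_hz)  # 10 30
```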
  • Publication number: 20220122329
    Abstract: Aspects of the present disclosure are directed to providing an artificial reality environment with augments and surfaces. An “augment” is a virtual container in 3D space that can include presentation data, context, and logic. An artificial reality system can use augments as the fundamental building block for displaying 2D and 3D models in the artificial reality environment. For example, augments can represent people, places, and things in an artificial reality environment and can respond to a context such as a current display mode, time of day, a type of surface the augment is on, a relationship to other augments, etc. Augments can be on a “surface” that has a layout and properties that cause augments on that surface to display in different ways. Augments and other objects (real or virtual) can also interact, where these interactions can be controlled by rules for the objects evaluated based on information from the shell.
    Type: Application
    Filed: October 12, 2021
    Publication date: April 21, 2022
    Inventors: James Tichenor, Arthur Zwiegincew, Hayden Schoen, Alex Marcolina, Gregory Alt, Todd Harris, Merlyn Deng, Barrett Fox, Michal Hlavac
  • Publication number: 20220068035
    Abstract: Aspects of the present disclosure are directed to providing an artificial reality environment with augments and surfaces. An “augment” is a virtual container in 3D space that can include presentation data, context, and logic. An artificial reality system can use augments as the fundamental building block for displaying 2D and 3D models in the artificial reality environment. For example, augments can represent people, places, and things in an artificial reality environment and can respond to a context such as a current display mode, time of day, a type of surface the augment is on, a relationship to other augments, etc. Augments can be on a “surface” that has a layout and properties that cause augments on that surface to display in different ways. Augments and other objects (real or virtual) can also interact, where these interactions can be controlled by rules for the objects evaluated based on information from the shell.
    Type: Application
    Filed: November 9, 2021
    Publication date: March 3, 2022
    Inventors: James TICHENOR, Arthur ZWIEGINCEW, Hayden Schoen, Alex MARCOLINA, Gregory ALT, Todd HARRIS, Merlyn DENG, Barrett FOX, Michal HLAVAC
  • Patent number: 11227445
    Abstract: Aspects of the present disclosure are directed to providing an artificial reality environment with augments and surfaces. An “augment” is a virtual container in 3D space that can include presentation data, context, and logic. An artificial reality system can use augments as the fundamental building block for displaying 2D and 3D models in the artificial reality environment. For example, augments can represent people, places, and things in an artificial reality environment and can respond to a context such as a current display mode, time of day, a type of surface the augment is on, a relationship to other augments, etc. Augments can be on a “surface” that has a layout and properties that cause augments on that surface to display in different ways. Augments and other objects (real or virtual) can also interact, where these interactions can be controlled by rules for the objects evaluated based on information from the shell.
    Type: Grant
    Filed: August 31, 2020
    Date of Patent: January 18, 2022
    Assignee: Facebook Technologies, LLC
    Inventors: James Tichenor, Arthur Zwiegincew, Hayden Schoen, Alex Marcolina, Gregory Alt, Todd Harris, Merlyn Deng, Barrett Fox, Michal Hlavac
  • Patent number: 11176755
    Abstract: Aspects of the present disclosure are directed to providing an artificial reality environment with augments and surfaces. An “augment” is a virtual container in 3D space that can include presentation data, context, and logic. An artificial reality system can use augments as the fundamental building block for displaying 2D and 3D models in the artificial reality environment. For example, augments can represent people, places, and things in an artificial reality environment and can respond to a context such as a current display mode, time of day, a type of surface the augment is on, a relationship to other augments, etc. Augments can be on a “surface” that has a layout and properties that cause augments on that surface to display in different ways. Augments and other objects (real or virtual) can also interact, where these interactions can be controlled by rules for the objects evaluated based on information from the shell.
    Type: Grant
    Filed: August 31, 2020
    Date of Patent: November 16, 2021
    Inventors: James Tichenor, Arthur Zwiegincew, Hayden Schoen, Alex Marcolina, Gregory Alt, Todd Harris, Merlyn Deng, Barrett Fox, Michal Hlavac
  • Patent number: 11178376
    Abstract: Aspects of the present disclosure are directed to setting a display mode for a virtual object based on a display mode timer that is controlled by context factors. An artificial reality system can associate one or more virtual objects with a corresponding display mode timer. Various ranges on the display mode timer can be mapped to different display modes that the virtual object can assume. The display mode timer can be adjusted to add time based on a determination of a user focusing on the virtual object or other context factors. Display mode timers can also have rules for setting other display mode timer properties, such as how quickly the display mode timer runs down, that are evaluated based on context factors.
    Type: Grant
    Filed: September 4, 2020
    Date of Patent: November 16, 2021
    Inventors: James Tichenor, Hayden Schoen, Yeliz Karadayi
  • Publication number: 20210034161
    Abstract: Examples of mixed reality computing devices that utilize remote sensors and local sensors as input devices are disclosed. In one example, a mixed reality computing device comprises an image sensor, a remote input device, a processor, and storage comprising stored instructions. The stored instructions are executable by the processor to perform object motion tracking and environmental tracking based on output from the image sensor, and in response to detecting that the remote input device is in use, adjust a parameter of the motion tracking while maintaining the environmental tracking.
    Type: Application
    Filed: October 22, 2020
    Publication date: February 4, 2021
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Lori Ann Potts, Lev Cherkashin, David Rohn, Steven James Velat, Andrew C. Goris, Scott Francis Fullam, Travis Scott Legg, Craig Haskins, James Tichenor
  • Patent number: 10908694
    Abstract: Examples of mixed reality computing devices that utilize remote sensors and local sensors as input devices are disclosed. In one example, a mixed reality computing device comprises an image sensor, a remote input device, a processor, and storage comprising stored instructions. The stored instructions are executable by the processor to perform object motion tracking and environmental tracking based on output from the image sensor, and in response to detecting that the remote input device is in use, adjust a parameter of the motion tracking while maintaining the environmental tracking.
    Type: Grant
    Filed: February 1, 2016
    Date of Patent: February 2, 2021
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Lori Ann Potts, Lev Cherkashin, David Rohn, Steven James Velat, Andrew C. Goris, Scott Francis Fullam, Travis Scott Legg, Craig Haskins, James Tichenor
  • Publication number: 20190324528
    Abstract: A head mounted display device provides offset adjustments for gaze points provided by an eye tracking component. In a model generation phase, heuristics are used to estimate a gaze point of the user based on the gaze point provided by the eye tracking component and features that are visible in the field of view of the user. The features may include objects, edges, faces, and text. If the estimated gaze point is different than the gaze point that was provided by the eye tracking component, the difference is used to train a model along with a confidence value that reflects the strength of the estimated gaze point. In an adjustment phase, when the user is using an application that relies on the eye tracking component, the generated model is used to determine offsets to adjust the gaze points that are provided by the eye tracking component.
    Type: Application
    Filed: April 20, 2018
    Publication date: October 24, 2019
    Inventors: Shane Frandon Williams, James Tichenor, Sophie Stellmach, Andrew David Wilson
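The abstract above describes two phases: recording, with a confidence weight, the difference between the eye tracker's gaze point and a feature-based estimate; then applying a learned offset during use. A sketch of the simplest such model, a confidence-weighted constant offset, which is my assumption for illustration (the patent's model could be far richer):

```python
class GazeOffsetModel:
    """Confidence-weighted running mean of (estimated - tracked) offsets."""

    def __init__(self):
        self.weighted_dx = 0.0
        self.weighted_dy = 0.0
        self.total_weight = 0.0

    def observe(self, tracked, estimated, confidence):
        # Model-generation phase: accumulate the disagreement between the
        # tracker's gaze point and the heuristic, feature-based estimate.
        self.weighted_dx += confidence * (estimated[0] - tracked[0])
        self.weighted_dy += confidence * (estimated[1] - tracked[1])
        self.total_weight += confidence

    def adjust(self, tracked):
        # Adjustment phase: apply the learned mean offset.
        if self.total_weight == 0:
            return tracked
        return (tracked[0] + self.weighted_dx / self.total_weight,
                tracked[1] + self.weighted_dy / self.total_weight)

model = GazeOffsetModel()
model.observe(tracked=(100, 100), estimated=(104, 98), confidence=0.9)
model.observe(tracked=(200, 150), estimated=(204, 148), confidence=0.6)
print(model.adjust((50, 50)))  # (54.0, 48.0)
```

Weighting by confidence means strong feature matches (e.g. the user clearly reading text) move the model more than weak ones.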
  • Patent number: 10234935
    Abstract: In various embodiments, computerized methods and systems for mediating interaction methodologies with virtual objects rendered in an immersive environment are provided. An intended target is identified from one or more virtual objects rendered in an at least partially-virtual environment. A relative proximity of the intended target to the user, or an extension of the user, is determined. An interaction methodology is selected for interaction with the intended target based on the determined relative proximity to the intended target, among other things. An indication of the selected interaction methodology is then provided to the user.
    Type: Grant
    Filed: August 11, 2016
    Date of Patent: March 19, 2019
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Julia Schwarz, James Tichenor, Yasaman Sheri, David J. Calabrese, Bharat Ahluwalia, Robert Pengelly
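The abstract above selects an interaction methodology for an intended target based on its relative proximity to the user or the user's reach. A minimal sketch of that selection step; the particular methodologies and distance thresholds are hypothetical, not taken from the patent:

```python
def select_interaction(distance_m: float, reach_m: float = 0.7) -> str:
    """Pick an interaction methodology from the target's proximity."""
    if distance_m <= reach_m:
        return "direct_grab"      # within arm's reach: touch/grab directly
    if distance_m <= 3.0:
        return "ray_point"        # mid-range: raycast pointer
    return "gaze_and_commit"      # far away: gaze targeting plus a gesture

# The selected methodology would then be indicated to the user,
# e.g. by changing the cursor or highlighting style.
print(select_interaction(0.4))   # direct_grab
print(select_interaction(8.0))   # gaze_and_commit
```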
  • Patent number: 10140776
    Abstract: Altering properties of rendered objects and/or mixed reality environments utilizing control points associated with the rendered objects and/or mixed reality environments is described. Techniques described can include detecting a gesture performed by or in association with a control object. Based at least in part on detecting the gesture, techniques described can identify a target control point that is associated with a rendered object and/or a mixed reality environment. As the control object moves within the mixed reality environment, the target control point can track the movement of the control object. Based at least in part on the movement of the control object, a property of the rendered object and/or the mixed reality environment can be altered. A rendering of the rendered object and/or the mixed reality environment can be modified to reflect any alterations to the property.
    Type: Grant
    Filed: June 13, 2016
    Date of Patent: November 27, 2018
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Julia Schwarz, Bharat Ahluwalia, David Calabrese, Robert C J Pengelly, Yasaman Sheri, James Tichenor
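The abstract above has a gesture select a target control point, which then tracks the control object's movement and alters a property of the rendered object. A small sketch of that loop, where a corner control point drives an object's scale; the class names and the scale binding are assumptions for illustration:

```python
class ControlPoint:
    """A handle on a rendered object that tracks a moving control object."""

    def __init__(self, position, apply_delta):
        self.position = position        # (x, y, z)
        self.apply_delta = apply_delta  # maps movement delta -> property change

    def track(self, new_position):
        delta = tuple(n - p for n, p in zip(new_position, self.position))
        self.position = new_position
        self.apply_delta(delta)

class RenderedObject:
    def __init__(self):
        self.scale = 1.0
        # Dragging this corner control point along x grows/shrinks the object.
        self.corner = ControlPoint((1.0, 1.0, 0.0),
                                   lambda d: self._scale_by(d[0]))

    def _scale_by(self, dx):
        self.scale = max(0.1, self.scale + dx)  # clamp to a minimum size

obj = RenderedObject()
obj.corner.track((1.5, 1.0, 0.0))  # hand moves 0.5 along x
print(obj.scale)  # 1.5
```

The rendering layer would then be refreshed to reflect the altered property, per the abstract's final step.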
  • Publication number: 20180046245
    Abstract: In various embodiments, computerized methods and systems for mediating interaction methodologies with virtual objects rendered in an immersive environment are provided. An intended target is identified from one or more virtual objects rendered in an at least partially-virtual environment. A relative proximity of the intended target to the user, or an extension of the user, is determined. An interaction methodology is selected for interaction with the intended target based on the determined relative proximity to the intended target, among other things. An indication of the selected interaction methodology is then provided to the user.
    Type: Application
    Filed: August 11, 2016
    Publication date: February 15, 2018
    Inventors: Julia Schwarz, James Tichenor, Yasaman Sheri, David J. Calabrese, Bharat Ahluwalia, Robert Pengelly
  • Publication number: 20170358144
    Abstract: Altering properties of rendered objects and/or mixed reality environments utilizing control points associated with the rendered objects and/or mixed reality environments is described. Techniques described can include detecting a gesture performed by or in association with a control object. Based at least in part on detecting the gesture, techniques described can identify a target control point that is associated with a rendered object and/or a mixed reality environment. As the control object moves within the mixed reality environment, the target control point can track the movement of the control object. Based at least in part on the movement of the control object, a property of the rendered object and/or the mixed reality environment can be altered. A rendering of the rendered object and/or the mixed reality environment can be modified to reflect any alterations to the property.
    Type: Application
    Filed: June 13, 2016
    Publication date: December 14, 2017
    Inventors: Julia Schwarz, Bharat Ahluwalia, David Calabrese, Robert C J Pengelly, Yasaman Sheri, James Tichenor
  • Publication number: 20170220119
    Abstract: Examples of mixed reality computing devices that utilize remote sensors and local sensors as input devices are disclosed. In one example, a mixed reality computing device comprises an image sensor, a remote input device, a processor, and storage comprising stored instructions. The stored instructions are executable by the processor to perform object motion tracking and environmental tracking based on output from the image sensor, and in response to detecting that the remote input device is in use, adjust a parameter of the motion tracking while maintaining the environmental tracking.
    Type: Application
    Filed: February 1, 2016
    Publication date: August 3, 2017
    Inventors: Lori Ann Potts, Lev Cherkashin, David Rohn, Steven James Velat, Andrew C. Goris, Scott Francis Fullam, Travis Scott Legg, Craig Haskins, James Tichenor
  • Patent number: 9563270
    Abstract: A gaze vector of a human subject is translated to a targeting vector that defines focus within a graphical user interface. Sensor data is received from a sensor system indicating pitch angle of a head of the human subject defining the gaze vector. The pitch angle is translated to a scaled pitch angle according to a pitch scaling function that increases amplification of the pitch angle in one or more directions as the pitch angle exceeds a start angle threshold in each of the one or more directions. The scaled pitch angle is output as a component of the targeting vector.
    Type: Grant
    Filed: December 26, 2014
    Date of Patent: February 7, 2017
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: James Tichenor, Aaron Mackay Burns, Jamie Marconi, Bharat Ahluwalia, Bede Jordan
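The abstract above defines a pitch scaling function that passes small head-pitch angles through unchanged and amplifies the excess once the angle crosses a start threshold, in either direction. A direct sketch of such a function; the threshold and gain values are assumed for illustration:

```python
import math

def scale_pitch(pitch_deg: float, start_deg: float = 10.0,
                gain: float = 2.0) -> float:
    """Amplify head pitch beyond a start-angle threshold, symmetrically."""
    magnitude = abs(pitch_deg)
    if magnitude <= start_deg:
        return pitch_deg                 # inside the threshold: unchanged
    excess = magnitude - start_deg
    # Amplify only the excess, preserving the sign (direction) of the pitch.
    return math.copysign(start_deg + gain * excess, pitch_deg)

print(scale_pitch(5.0))    # 5.0  (below threshold)
print(scale_pitch(20.0))   # 30.0 (10 + 2 * 10 of excess)
print(scale_pitch(-20.0))  # -30.0 (symmetric downward)
```

Amplifying only the excess keeps targeting precise near the center of the view while letting modest head movements reach the top and bottom of the interface; the scaled angle would then feed into the targeting vector as the abstract describes.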
  • Publication number: 20160187971
    Abstract: A gaze vector of a human subject is translated to a targeting vector that defines focus within a graphical user interface. Sensor data is received from a sensor system indicating pitch angle of a head of the human subject defining the gaze vector. The pitch angle is translated to a scaled pitch angle according to a pitch scaling function that increases amplification of the pitch angle in one or more directions as the pitch angle exceeds a start angle threshold in each of the one or more directions. The scaled pitch angle is output as a component of the targeting vector.
    Type: Application
    Filed: December 26, 2014
    Publication date: June 30, 2016
    Inventors: James Tichenor, Aaron Mackay Burns, Jamie Marconi, Bharat Ahluwalia, Bede Jordan