Patents by Inventor James Tichenor
James Tichenor has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
- Patent number: 11847753
  Abstract: Aspects of the present disclosure are directed to providing an artificial reality environment with augments and surfaces. An “augment” is a virtual container in 3D space that can include presentation data, context, and logic. An artificial reality system can use augments as the fundamental building block for displaying 2D and 3D models in the artificial reality environment. For example, augments can represent people, places, and things in an artificial reality environment and can respond to a context such as a current display mode, time of day, a type of surface the augment is on, a relationship to other augments, etc. Augments can be on a “surface” that has a layout and properties that cause augments on that surface to display in different ways. Augments and other objects (real or virtual) can also interact, where these interactions can be controlled by rules for the objects evaluated based on information from the shell.
  Type: Grant
  Filed: January 9, 2023
  Date of Patent: December 19, 2023
  Assignee: Meta Platforms Technologies, LLC
  Inventors: James Tichenor, Arthur Zwiegincew, Hayden Schoen, Alex Marcolina, Gregory Alt, Todd Harris, Merlyn Deng, Barrett Fox, Michal Hlavac
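The augment-and-surface model in the abstract above can be sketched in Python. Everything here is illustrative: the class names, the `surface_type` values, and the display-mode rules are assumptions for demonstration, not taken from the patent claims.

```python
from dataclasses import dataclass, field

@dataclass
class Augment:
    """A virtual container in 3D space holding presentation data, context, and logic."""
    name: str
    presentation_data: dict = field(default_factory=dict)
    context: dict = field(default_factory=dict)

    def display_mode(self) -> str:
        # Respond to context, e.g. the type of surface the augment is on
        # (hypothetical rules for illustration only).
        surface_type = self.context.get("surface_type")
        if surface_type == "wall":
            return "flat_2d"
        if surface_type == "floor":
            return "volumetric_3d"
        return "billboard"

@dataclass
class Surface:
    """A surface whose layout and properties shape how its augments display."""
    layout: str
    surface_type: str
    augments: list = field(default_factory=list)

    def place(self, augment: Augment) -> None:
        # Placing an augment on a surface feeds surface context into the augment.
        augment.context["surface_type"] = self.surface_type
        self.augments.append(augment)

wall = Surface(layout="grid", surface_type="wall")
clock = Augment(name="clock")
wall.place(clock)
print(clock.display_mode())  # "flat_2d" under the illustrative rules above
```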
- Patent number: 11769304
  Abstract: Aspects of the present disclosure are directed to providing an artificial reality environment with augments and surfaces. An “augment” is a virtual container in 3D space that can include presentation data, context, and logic. An artificial reality system can use augments as the fundamental building block for displaying 2D and 3D models in the artificial reality environment. For example, augments can represent people, places, and things in an artificial reality environment and can respond to a context such as a current display mode, time of day, a type of surface the augment is on, a relationship to other augments, etc. Augments can be on a “surface” that has a layout and properties that cause augments on that surface to display in different ways. Augments and other objects (real or virtual) can also interact, where these interactions can be controlled by rules for the objects evaluated based on information from the shell.
  Type: Grant
  Filed: November 9, 2021
  Date of Patent: September 26, 2023
  Assignee: Meta Platforms Technologies, LLC
  Inventors: James Tichenor, Arthur Zwiegincew, Hayden Schoen, Alex Marcolina, Gregory Alt, Todd Harris, Merlyn Deng, Barrett Fox, Michal Hlavac
- Patent number: 11651573
  Abstract: Aspects of the present disclosure are directed to providing an artificial reality environment with augments and surfaces. An “augment” is a virtual container in 3D space that can include presentation data, context, and logic. An artificial reality system can use augments as the fundamental building block for displaying 2D and 3D models in the artificial reality environment. For example, augments can represent people, places, and things in an artificial reality environment and can respond to a context such as a current display mode, time of day, a type of surface the augment is on, a relationship to other augments, etc. Augments can be on a “surface” that has a layout and properties that cause augments on that surface to display in different ways. Augments and other objects (real or virtual) can also interact, where these interactions can be controlled by rules for the objects evaluated based on information from the shell.
  Type: Grant
  Filed: October 12, 2021
  Date of Patent: May 16, 2023
  Assignee: Meta Platforms Technologies, LLC
  Inventors: James Tichenor, Arthur Zwiegincew, Hayden Schoen, Alex Marcolina, Gregory Alt, Todd Harris, Merlyn Deng, Barrett Fox, Michal Hlavac
- Patent number: 11637999
  Abstract: Aspects of the present disclosure are directed to setting a display mode for a virtual object based on a display mode timer that is controlled by context factors. An artificial reality system can associate one or more virtual objects with a corresponding display mode timer. Various ranges on the display mode timer can be mapped to different display modes that the virtual object can assume. The display mode timer can be adjusted to add time based on a determination of a user focusing on the virtual object or other context factors. Display mode timers can also have rules for setting other display mode timer properties, such as how quickly the display mode timer runs down, that are evaluated based on context factors.
  Type: Grant
  Filed: October 14, 2021
  Date of Patent: April 25, 2023
  Assignee: Meta Platforms Technologies, LLC
  Inventors: James Tichenor, Hayden Schoen, Yeliz Karadayi
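The display-mode-timer mechanism above maps timer ranges to display modes, adds time on user focus, and exposes a tunable run-down rate. A minimal Python sketch, with thresholds, mode names, and the focus bonus chosen purely for illustration:

```python
class DisplayModeTimer:
    """Maps remaining time to display modes; user focus adds time (illustrative rules)."""

    # (threshold_seconds, mode): the first threshold the remaining time meets wins.
    MODE_RANGES = [(10.0, "full_3d"), (3.0, "glanceable"), (0.0, "minimized")]

    def __init__(self, remaining: float = 12.0, decay_rate: float = 1.0):
        self.remaining = remaining
        self.decay_rate = decay_rate  # context rules could change how fast the timer runs down

    def tick(self, dt: float, user_focused: bool) -> None:
        if user_focused:
            self.remaining += 2.0 * dt  # focusing on the object adds time
        else:
            self.remaining = max(0.0, self.remaining - self.decay_rate * dt)

    def display_mode(self) -> str:
        for threshold, mode in self.MODE_RANGES:
            if self.remaining >= threshold:
                return mode
        return "minimized"
```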
- Patent number: 11513605
  Abstract: Examples of mixed reality computing devices that utilize remote sensors and local sensors as input devices are disclosed. In one example, a mixed reality computing device comprises an image sensor, a remote input device, a processor, and storage comprising stored instructions. The stored instructions are executable by the processor to perform object motion tracking and environmental tracking based on output from the image sensor, and in response to detecting that the remote input device is in use, adjust a parameter of the motion tracking while maintaining the environmental tracking.
  Type: Grant
  Filed: October 22, 2020
  Date of Patent: November 29, 2022
  Assignee: Microsoft Technology Licensing, LLC
  Inventors: Lori Ann Potts, Lev Cherkashin, David Rohn, Steven James Velat, Andrew C. Goris, Scott Francis Fullam, Travis Scott Legg, Craig Haskins, James Tichenor
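The core behavior described above is that detecting a remote input device in use adjusts a motion-tracking parameter while environmental tracking continues unchanged. A sketch of that switch, where the specific parameter (a tracking rate) and its values are assumptions for illustration:

```python
class MixedRealityTracker:
    """Sketch of a device running object motion tracking and environmental tracking
    from the same image sensor. When a remote input device (e.g. a handheld
    controller) is detected in use, a motion-tracking parameter is reduced while
    environmental tracking is maintained. Rates are hypothetical."""

    def __init__(self):
        self.motion_tracking_rate_hz = 60
        self.environmental_tracking_rate_hz = 30

    def on_remote_input_state(self, remote_in_use: bool) -> None:
        # The controller supplies motion input, so hand/object motion tracking can
        # run at a lower rate; world (environmental) tracking is left untouched.
        self.motion_tracking_rate_hz = 15 if remote_in_use else 60
```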
- Publication number: 20220122329
  Abstract: Aspects of the present disclosure are directed to providing an artificial reality environment with augments and surfaces. An “augment” is a virtual container in 3D space that can include presentation data, context, and logic. An artificial reality system can use augments as the fundamental building block for displaying 2D and 3D models in the artificial reality environment. For example, augments can represent people, places, and things in an artificial reality environment and can respond to a context such as a current display mode, time of day, a type of surface the augment is on, a relationship to other augments, etc. Augments can be on a “surface” that has a layout and properties that cause augments on that surface to display in different ways. Augments and other objects (real or virtual) can also interact, where these interactions can be controlled by rules for the objects evaluated based on information from the shell.
  Type: Application
  Filed: October 12, 2021
  Publication date: April 21, 2022
  Inventors: James Tichenor, Arthur Zwiegincew, Hayden Schoen, Alex Marcolina, Gregory Alt, Todd Harris, Merlyn Deng, Barrett Fox, Michal Hlavac
- Publication number: 20220068035
  Abstract: Aspects of the present disclosure are directed to providing an artificial reality environment with augments and surfaces. An “augment” is a virtual container in 3D space that can include presentation data, context, and logic. An artificial reality system can use augments as the fundamental building block for displaying 2D and 3D models in the artificial reality environment. For example, augments can represent people, places, and things in an artificial reality environment and can respond to a context such as a current display mode, time of day, a type of surface the augment is on, a relationship to other augments, etc. Augments can be on a “surface” that has a layout and properties that cause augments on that surface to display in different ways. Augments and other objects (real or virtual) can also interact, where these interactions can be controlled by rules for the objects evaluated based on information from the shell.
  Type: Application
  Filed: November 9, 2021
  Publication date: March 3, 2022
  Inventors: James Tichenor, Arthur Zwiegincew, Hayden Schoen, Alex Marcolina, Gregory Alt, Todd Harris, Merlyn Deng, Barrett Fox, Michal Hlavac
- Patent number: 11227445
  Abstract: Aspects of the present disclosure are directed to providing an artificial reality environment with augments and surfaces. An “augment” is a virtual container in 3D space that can include presentation data, context, and logic. An artificial reality system can use augments as the fundamental building block for displaying 2D and 3D models in the artificial reality environment. For example, augments can represent people, places, and things in an artificial reality environment and can respond to a context such as a current display mode, time of day, a type of surface the augment is on, a relationship to other augments, etc. Augments can be on a “surface” that has a layout and properties that cause augments on that surface to display in different ways. Augments and other objects (real or virtual) can also interact, where these interactions can be controlled by rules for the objects evaluated based on information from the shell.
  Type: Grant
  Filed: August 31, 2020
  Date of Patent: January 18, 2022
  Assignee: Facebook Technologies, LLC
  Inventors: James Tichenor, Arthur Zwiegincew, Hayden Schoen, Alex Marcolina, Gregory Alt, Todd Harris, Merlyn Deng, Barrett Fox, Michal Hlavac
- Patent number: 11176755
  Abstract: Aspects of the present disclosure are directed to providing an artificial reality environment with augments and surfaces. An “augment” is a virtual container in 3D space that can include presentation data, context, and logic. An artificial reality system can use augments as the fundamental building block for displaying 2D and 3D models in the artificial reality environment. For example, augments can represent people, places, and things in an artificial reality environment and can respond to a context such as a current display mode, time of day, a type of surface the augment is on, a relationship to other augments, etc. Augments can be on a “surface” that has a layout and properties that cause augments on that surface to display in different ways. Augments and other objects (real or virtual) can also interact, where these interactions can be controlled by rules for the objects evaluated based on information from the shell.
  Type: Grant
  Filed: August 31, 2020
  Date of Patent: November 16, 2021
  Inventors: James Tichenor, Arthur Zwiegincew, Hayden Schoen, Alex Marcolina, Gregory Alt, Todd Harris, Merlyn Deng, Barrett Fox, Michal Hlavac
- Patent number: 11178376
  Abstract: Aspects of the present disclosure are directed to setting a display mode for a virtual object based on a display mode timer that is controlled by context factors. An artificial reality system can associate one or more virtual objects with a corresponding display mode timer. Various ranges on the display mode timer can be mapped to different display modes that the virtual object can assume. The display mode timer can be adjusted to add time based on a determination of a user focusing on the virtual object or other context factors. Display mode timers can also have rules for setting other display mode timer properties, such as how quickly the display mode timer runs down, that are evaluated based on context factors.
  Type: Grant
  Filed: September 4, 2020
  Date of Patent: November 16, 2021
  Inventors: James Tichenor, Hayden Schoen, Yeliz Karadayi
- Publication number: 20210034161
  Abstract: Examples of mixed reality computing devices that utilize remote sensors and local sensors as input devices are disclosed. In one example, a mixed reality computing device comprises an image sensor, a remote input device, a processor, and storage comprising stored instructions. The stored instructions are executable by the processor to perform object motion tracking and environmental tracking based on output from the image sensor, and in response to detecting that the remote input device is in use, adjust a parameter of the motion tracking while maintaining the environmental tracking.
  Type: Application
  Filed: October 22, 2020
  Publication date: February 4, 2021
  Applicant: Microsoft Technology Licensing, LLC
  Inventors: Lori Ann Potts, Lev Cherkashin, David Rohn, Steven James Velat, Andrew C. Goris, Scott Francis Fullam, Travis Scott Legg, Craig Haskins, James Tichenor
- Patent number: 10908694
  Abstract: Examples of mixed reality computing devices that utilize remote sensors and local sensors as input devices are disclosed. In one example, a mixed reality computing device comprises an image sensor, a remote input device, a processor, and storage comprising stored instructions. The stored instructions are executable by the processor to perform object motion tracking and environmental tracking based on output from the image sensor, and in response to detecting that the remote input device is in use, adjust a parameter of the motion tracking while maintaining the environmental tracking.
  Type: Grant
  Filed: February 1, 2016
  Date of Patent: February 2, 2021
  Assignee: Microsoft Technology Licensing, LLC
  Inventors: Lori Ann Potts, Lev Cherkashin, David Rohn, Steven James Velat, Andrew C. Goris, Scott Francis Fullam, Travis Scott Legg, Craig Haskins, James Tichenor
- Publication number: 20190324528
  Abstract: A head mounted display device provides offset adjustments for gaze points provided by an eye tracking component. In a model generation phase, heuristics are used to estimate a gaze point of the user based on the gaze point provided by the eye tracking component and features that are visible in the field of view of the user. The features may include objects, edges, faces, and text. If the estimated gaze point is different than the gaze point that was provided by the eye tracking component, the difference is used to train a model along with a confidence value that reflects the strength of the estimated gaze point. In an adjustment phase, when the user is using an application that relies on the eye tracking component, the generated model is used to determine offsets to adjust the gaze points that are provided by the eye tracking component.
  Type: Application
  Filed: April 20, 2018
  Publication date: October 24, 2019
  Inventors: Shane Frandon Williams, James Tichenor, Sophie Stellmach, Andrew David Wilson
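The model-generation phase above can be pictured as snapping the reported gaze point to a nearby visible feature and recording the difference plus a confidence value as a training sample. A sketch under assumed heuristics: the snap radius, the nearest-feature rule, and the linear confidence falloff are illustrative choices, not the patent's method.

```python
import math

def estimate_gaze(reported, features, snap_radius=0.05):
    """Heuristic estimate: snap the reported gaze point to the nearest visible
    feature (object, edge, face, text) within snap_radius; confidence falls
    linearly with distance. Coordinates are normalized screen units (assumed)."""
    best, best_d = None, snap_radius
    for f in features:
        d = math.dist(reported, f)
        if d < best_d:
            best, best_d = f, d
    if best is None:
        return reported, 0.0  # no nearby feature: no usable estimate
    confidence = 1.0 - best_d / snap_radius
    return best, confidence

def training_sample(reported, features):
    """Offset (dx, dy) between estimated and reported gaze, with its confidence,
    as one sample for fitting the offset-correction model."""
    estimated, confidence = estimate_gaze(reported, features)
    offset = (estimated[0] - reported[0], estimated[1] - reported[1])
    return offset, confidence
```

In the adjustment phase, the fitted model would apply such offsets to future gaze points from the eye tracker.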
- Patent number: 10234935
  Abstract: In various embodiments, computerized methods and systems for mediating interaction methodologies with virtual objects rendered in an immersive environment are provided. An intended target is identified from one or more virtual objects rendered in an at least partially-virtual environment. A relative proximity of the intended target to the user, or an extension of the user, is determined. An interaction methodology is selected for interaction with the intended target based on the determined relative proximity to the intended target, among other things. An indication of the selected interaction methodology is then provided to the user.
  Type: Grant
  Filed: August 11, 2016
  Date of Patent: March 19, 2019
  Assignee: Microsoft Technology Licensing, LLC
  Inventors: Julia Schwarz, James Tichenor, Yasaman Sheri, David J. Calabrese, Bharat Ahluwalia, Robert Pengelly
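The selection step above, picking an interaction methodology from the target's relative proximity, can be sketched as a simple threshold function. The distance bands and methodology names below are invented for illustration and do not come from the patent:

```python
def select_interaction_methodology(target_distance_m: float,
                                   arm_reach_m: float = 0.7) -> str:
    """Select how the user interacts with an intended target based on its
    relative proximity (illustrative thresholds and names)."""
    if target_distance_m <= arm_reach_m:
        return "direct_touch"    # within reach of the user or an extension of the user
    if target_distance_m <= 3.0:
        return "raycast_point"   # near-field pointing
    return "gaze_and_commit"     # far-field selection
```

An indication of the selected methodology (e.g. a cursor style change) would then be surfaced to the user.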
- Patent number: 10140776
  Abstract: Altering properties of rendered objects and/or mixed reality environments utilizing control points associated with the rendered objects and/or mixed reality environments is described. Techniques described can include detecting a gesture performed by or in association with a control object. Based at least in part on detecting the gesture, techniques described can identify a target control point that is associated with a rendered object and/or a mixed reality environment. As the control object moves within the mixed reality environment, the target control point can track the movement of the control object. Based at least in part on the movement of the control object, a property of the rendered object and/or the mixed reality environment can be altered. A rendering of the rendered object and/or the mixed reality environment can be modified to reflect any alterations to the property.
  Type: Grant
  Filed: June 13, 2016
  Date of Patent: November 27, 2018
  Assignee: Microsoft Technology Licensing, LLC
  Inventors: Julia Schwarz, Bharat Ahluwalia, David Calabrese, Robert C J Pengelly, Yasaman Sheri, James Tichenor
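The control-point flow above (gesture targets a control point, the point then tracks the control object, and its movement alters a bound property) can be sketched as follows. The horizontal-drag-scales-width rule and the class shape are assumptions for demonstration:

```python
class ControlPoint:
    """A control point on a rendered object. After a gesture targets it, the
    point tracks a control object (e.g. a hand) and maps its movement onto a
    bound property of the object (illustrative mapping)."""

    def __init__(self, position, bound_property: str, target: dict):
        self.position = list(position)
        self.bound_property = bound_property
        self.target = target          # the rendered object's property bag
        self.tracking = False

    def on_gesture(self) -> None:
        # Gesture detected: this becomes the target control point and starts tracking.
        self.tracking = True

    def on_control_object_moved(self, new_position) -> None:
        if not self.tracking:
            return
        # Illustrative rule: horizontal drag distance alters the bound property,
        # after which the rendering would be updated to reflect the change.
        dx = new_position[0] - self.position[0]
        self.position = list(new_position)
        self.target[self.bound_property] += dx
```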
- Publication number: 20180046245
  Abstract: In various embodiments, computerized methods and systems for mediating interaction methodologies with virtual objects rendered in an immersive environment are provided. An intended target is identified from one or more virtual objects rendered in an at least partially-virtual environment. A relative proximity of the intended target to the user, or an extension of the user, is determined. An interaction methodology is selected for interaction with the intended target based on the determined relative proximity to the intended target, among other things. An indication of the selected interaction methodology is then provided to the user.
  Type: Application
  Filed: August 11, 2016
  Publication date: February 15, 2018
  Inventors: Julia Schwarz, James Tichenor, Yasaman Sheri, David J. Calabrese, Bharat Ahluwalia, Robert Pengelly
- Publication number: 20170358144
  Abstract: Altering properties of rendered objects and/or mixed reality environments utilizing control points associated with the rendered objects and/or mixed reality environments is described. Techniques described can include detecting a gesture performed by or in association with a control object. Based at least in part on detecting the gesture, techniques described can identify a target control point that is associated with a rendered object and/or a mixed reality environment. As the control object moves within the mixed reality environment, the target control point can track the movement of the control object. Based at least in part on the movement of the control object, a property of the rendered object and/or the mixed reality environment can be altered. A rendering of the rendered object and/or the mixed reality environment can be modified to reflect any alterations to the property.
  Type: Application
  Filed: June 13, 2016
  Publication date: December 14, 2017
  Inventors: Julia Schwarz, Bharat Ahluwalia, David Calabrese, Robert CJ Pengelly, Yasaman Sheri, James Tichenor
- Publication number: 20170220119
  Abstract: Examples of mixed reality computing devices that utilize remote sensors and local sensors as input devices are disclosed. In one example, a mixed reality computing device comprises an image sensor, a remote input device, a processor, and storage comprising stored instructions. The stored instructions are executable by the processor to perform object motion tracking and environmental tracking based on output from the image sensor, and in response to detecting that the remote input device is in use, adjust a parameter of the motion tracking while maintaining the environmental tracking.
  Type: Application
  Filed: February 1, 2016
  Publication date: August 3, 2017
  Inventors: Lori Ann Potts, Lev Cherkashin, David Rohn, Steven James Velat, Andrew C. Goris, Scott Francis Fullam, Travis Scott Legg, Craig Haskins, James Tichenor
- Patent number: 9563270
  Abstract: A gaze vector of a human subject is translated to a targeting vector that defines focus within a graphical user interface. Sensor data is received from a sensor system indicating pitch angle of a head of the human subject defining the gaze vector. The pitch angle is translated to a scaled pitch angle according to a pitch scaling function that increases amplification of the pitch angle in one or more directions as the pitch angle exceeds a start angle threshold in each of the one or more directions. The scaled pitch angle is output as a component of the targeting vector.
  Type: Grant
  Filed: December 26, 2014
  Date of Patent: February 7, 2017
  Assignee: Microsoft Technology Licensing, LLC
  Inventors: James Tichenor, Aaron Mackay Burns, Jamie Marconi, Bharat Ahluwalia, Bede Jordan
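The pitch scaling function above passes small head-pitch angles through unchanged and amplifies the portion beyond a start-angle threshold. A minimal piecewise-linear sketch; the threshold, the gain, and the symmetric treatment of both directions are illustrative assumptions, since the patent allows per-direction thresholds and other amplification curves:

```python
def scale_pitch(pitch_deg: float, start_angle_deg: float = 10.0,
                gain: float = 1.5) -> float:
    """Translate head pitch to a scaled pitch for the targeting vector.
    Within the start-angle threshold the pitch passes through unchanged;
    beyond it, the excess is amplified by `gain` (illustrative values)."""
    magnitude = abs(pitch_deg)
    if magnitude <= start_angle_deg:
        return pitch_deg
    excess = magnitude - start_angle_deg
    scaled = start_angle_deg + gain * excess
    return scaled if pitch_deg > 0 else -scaled
```

The scaled pitch would then be combined with the other head-pose components to form the targeting vector used for UI focus.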
- Publication number: 20160187971
  Abstract: A gaze vector of a human subject is translated to a targeting vector that defines focus within a graphical user interface. Sensor data is received from a sensor system indicating pitch angle of a head of the human subject defining the gaze vector. The pitch angle is translated to a scaled pitch angle according to a pitch scaling function that increases amplification of the pitch angle in one or more directions as the pitch angle exceeds a start angle threshold in each of the one or more directions. The scaled pitch angle is output as a component of the targeting vector.
  Type: Application
  Filed: December 26, 2014
  Publication date: June 30, 2016
  Inventors: James Tichenor, Aaron Mackay Burns, Jamie Marconi, Bharat Ahluwalia, Bede Jordan