Patents by Inventor Sterling Crispin
Sterling Crispin has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Publication number: 20250053226
Abstract: Systems and methods are provided for performing AR button selection operations on an augmented reality (AR) device. The system displays, by an AR device, a plurality of AR objects on a display region that overlaps a first real-world object, each of the plurality of AR objects being associated with an object selection region. The system computes a first spatial relationship factor for a first AR object of the plurality of AR objects based on a position of the first AR object relative to a position of a second real-world object and adjusts the object selection region of the first AR object based on the first spatial relationship factor. The system activates the first AR object in response to determining that the second real-world object overlaps the object selection region of the first AR object.
Type: Application
Filed: October 28, 2024
Publication date: February 13, 2025
Inventor: Sterling Crispin
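As a rough illustration of the mechanism this abstract describes — enlarging a button's selection region as a tracked real-world object (such as a hand) approaches it, then activating on overlap — the following sketch may help. None of the names, weights, or thresholds come from the patent; they are assumptions for illustration only:

```python
from dataclasses import dataclass

@dataclass
class ARObject:
    x: float       # center of the AR object in display coordinates
    y: float
    radius: float  # base radius of its object selection region

def spatial_relationship_factor(obj: ARObject, hand_x: float, hand_y: float,
                                max_dist: float = 300.0) -> float:
    """Factor in [0, 1] that grows as the tracked real-world object
    (e.g. a hand) approaches the AR object."""
    dist = ((obj.x - hand_x) ** 2 + (obj.y - hand_y) ** 2) ** 0.5
    return max(0.0, 1.0 - dist / max_dist)

def adjusted_selection_radius(obj: ARObject, factor: float,
                              max_growth: float = 1.5) -> float:
    """Enlarge the selection region for nearby targets so they are easier to hit."""
    return obj.radius * (1.0 + (max_growth - 1.0) * factor)

def is_activated(obj: ARObject, hand_x: float, hand_y: float) -> bool:
    """Activate when the tracked object overlaps the adjusted selection region."""
    factor = spatial_relationship_factor(obj, hand_x, hand_y)
    dist = ((obj.x - hand_x) ** 2 + (obj.y - hand_y) ** 2) ** 0.5
    return dist <= adjusted_selection_radius(obj, factor)
```

The point of the design is that the hit target expands before contact, so a slightly imprecise hand position still lands inside the region.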
-
Patent number: 12158982
Abstract: Systems and methods are provided for performing AR button selection operations on an augmented reality (AR) device. The system displays, by an AR device, a plurality of AR objects on a display region that overlaps a first real-world object, each of the plurality of AR objects being associated with an object selection region. The system computes a first spatial relationship factor for a first AR object of the plurality of AR objects based on a position of the first AR object relative to a position of a second real-world object and adjusts the object selection region of the first AR object based on the first spatial relationship factor. The system activates the first AR object in response to determining that the second real-world object overlaps the object selection region of the first AR object.
Type: Grant
Filed: September 7, 2022
Date of Patent: December 3, 2024
Assignee: SNAP INC.
Inventor: Sterling Crispin
-
Publication number: 20240077936
Abstract: Systems and methods are provided for performing AR button selection operations on an augmented reality (AR) device. The system displays, by an AR device, a plurality of AR objects on a display region that overlaps a first real-world object, each of the plurality of AR objects being associated with an object selection region. The system computes a first spatial relationship factor for a first AR object of the plurality of AR objects based on a position of the first AR object relative to a position of a second real-world object and adjusts the object selection region of the first AR object based on the first spatial relationship factor. The system activates the first AR object in response to determining that the second real-world object overlaps the object selection region of the first AR object.
Type: Application
Filed: September 7, 2022
Publication date: March 7, 2024
Inventor: Sterling Crispin
-
Publication number: 20220155855
Abstract: Interface-based modeling and design of three dimensional spaces using two dimensional representations are provided herein. An example method includes converting a three dimensional space into a two dimensional space using a map projection schema, where the two dimensional space is bounded by ergonomic limits of a human, and the two dimensional space is provided as an ergonomic user interface, receiving an anchor position within the ergonomic user interface that defines a placement of an asset relative to the three dimensional space when the two dimensional space is re-converted back to a three dimensional space, and re-converting the two dimensional space back into the three dimensional space for display along with the asset, within an optical display system.
Type: Application
Filed: February 7, 2022
Publication date: May 19, 2022
Applicant: RPX CORPORATION
Inventor: Sterling Crispin
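The 2D/3D conversion this abstract describes can be illustrated with a minimal equirectangular-style projection. The ergonomic yaw/pitch limits, function names, and coordinate conventions below are assumptions for illustration, not values taken from the patent:

```python
import math

# Assumed ergonomic limits: comfortable head yaw/pitch, in degrees.
YAW_LIMIT = 60.0
PITCH_LIMIT = 40.0

def to_2d(yaw_deg: float, pitch_deg: float) -> tuple:
    """Project a viewing direction into a unit square bounded by the
    ergonomic limits; this square is the 2D design surface."""
    u = (yaw_deg + YAW_LIMIT) / (2 * YAW_LIMIT)
    v = (pitch_deg + PITCH_LIMIT) / (2 * PITCH_LIMIT)
    return u, v

def to_3d(u: float, v: float, distance: float = 1.0) -> tuple:
    """Re-convert a 2D anchor position back into a 3D point on a sphere
    around the viewer, where a placed asset would be displayed."""
    yaw = math.radians(u * 2 * YAW_LIMIT - YAW_LIMIT)
    pitch = math.radians(v * 2 * PITCH_LIMIT - PITCH_LIMIT)
    x = distance * math.cos(pitch) * math.sin(yaw)
    y = distance * math.sin(pitch)
    z = distance * math.cos(pitch) * math.cos(yaw)
    return x, y, z
```

An asset anchored at the center of the 2D surface, `to_3d(0.5, 0.5)`, re-converts to a point straight ahead of the viewer; anchors near the edges map to the limits of comfortable head movement.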
-
Patent number: 11275436
Abstract: Interface-based modeling and design of three dimensional spaces using two dimensional representations are provided herein. An example method includes converting a three dimensional space into a two dimensional space using a map projection schema, where the two dimensional space is bounded by ergonomic limits of a human, and the two dimensional space is provided as an ergonomic user interface, receiving an anchor position within the ergonomic user interface that defines a placement of an asset relative to the three dimensional space when the two dimensional space is re-converted back to a three dimensional space, and re-converting the two dimensional space back into the three dimensional space for display along with the asset, within an optical display system.
Type: Grant
Filed: September 1, 2020
Date of Patent: March 15, 2022
Assignee: RPX CORPORATION
Inventor: Sterling Crispin
-
Publication number: 20210055788
Abstract: Interface-based modeling and design of three dimensional spaces using two dimensional representations are provided herein. An example method includes converting a three dimensional space into a two dimensional space using a map projection schema, where the two dimensional space is bounded by ergonomic limits of a human, and the two dimensional space is provided as an ergonomic user interface, receiving an anchor position within the ergonomic user interface that defines a placement of an asset relative to the three dimensional space when the two dimensional space is re-converted back to a three dimensional space, and re-converting the two dimensional space back into the three dimensional space for display along with the asset, within an optical display system.
Type: Application
Filed: September 1, 2020
Publication date: February 25, 2021
Applicant: RPX CORPORATION
Inventor: Sterling Crispin
-
Patent number: 10795434
Abstract: Interface-based modeling and design of three dimensional spaces using two dimensional representations are provided herein. An example method includes converting a three dimensional space into a two dimensional space using a map projection schema, where the two dimensional space is bounded by ergonomic limits of a human, and the two dimensional space is provided as an ergonomic user interface, receiving an anchor position within the ergonomic user interface that defines a placement of an asset relative to the three dimensional space when the two dimensional space is re-converted back to a three dimensional space, and re-converting the two dimensional space back into the three dimensional space for display along with the asset, within an optical display system.
Type: Grant
Filed: October 8, 2019
Date of Patent: October 6, 2020
Assignee: RPX CORPORATION
Inventor: Sterling Crispin
-
Patent number: 10592000
Abstract: A computing device provides augmented reality images of an environment in which the computing device is used. The computing device is further configured to display a graphical user interface for interacting with the computing device. The graphical user interface may be displayed according to one or more configured graphical user interface layouts. The computing device further includes an inertial measurement unit, which provides input for interacting with one or more portions of the graphical user interface. As a user of the computing device moves the computing device, corresponding graphical changes are made to the displayed graphical user interface. In this way, by moving the computing device, the user is able to interact with, and provide input to, the computing device.
Type: Grant
Filed: March 22, 2017
Date of Patent: March 17, 2020
Assignee: DAQRI, LLC
Inventors: Jonathan Trevor Freeman, Michael Kozlowski, Sterling Crispin, Ari Green
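The core idea here — using inertial measurement unit (IMU) orientation as pointing input for a GUI — can be sketched as a simple mapping from device yaw/pitch to a cursor position. The angle ranges, screen size, and function name are illustrative assumptions, not details from the patent:

```python
def imu_to_cursor(pitch_deg: float, yaw_deg: float,
                  screen_w: int = 1920, screen_h: int = 1080,
                  yaw_range: float = 30.0, pitch_range: float = 20.0) -> tuple:
    """Map IMU orientation to a cursor position: turning or tilting the
    device moves the cursor across the displayed GUI."""
    # Normalize each angle from [-range, +range] to [0, 1], then scale.
    x = (yaw_deg / yaw_range + 1.0) / 2.0 * screen_w
    y = (pitch_deg / pitch_range + 1.0) / 2.0 * screen_h
    # Clamp so the cursor never leaves the screen.
    x = min(max(x, 0), screen_w)
    y = min(max(y, 0), screen_h)
    return x, y
```

With a mapping like this, holding the device level centers the cursor, and rotating past the configured range pins the cursor to the screen edge. A real implementation would typically also smooth the IMU samples to suppress hand tremor.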
-
Publication number: 20200042082
Abstract: Interface-based modeling and design of three dimensional spaces using two dimensional representations are provided herein. An example method includes converting a three dimensional space into a two dimensional space using a map projection schema, where the two dimensional space is bounded by ergonomic limits of a human, and the two dimensional space is provided as an ergonomic user interface, receiving an anchor position within the ergonomic user interface that defines a placement of an asset relative to the three dimensional space when the two dimensional space is re-converted back to a three dimensional space, and re-converting the two dimensional space back into the three dimensional space for display along with the asset, within an optical display system.
Type: Application
Filed: October 8, 2019
Publication date: February 6, 2020
Applicant: DAQRI LLC
Inventor: Sterling Crispin
-
Patent number: 10481678
Abstract: Interface-based modeling and design of three dimensional spaces using two dimensional representations are provided herein. An example method includes converting a three dimensional space into a two dimensional space using a map projection schema, where the two dimensional space is bounded by ergonomic limits of a human, and the two dimensional space is provided as an ergonomic user interface, receiving an anchor position within the ergonomic user interface that defines a placement of an asset relative to the three dimensional space when the two dimensional space is re-converted back to a three dimensional space, and re-converting the two dimensional space back into the three dimensional space for display along with the asset, within an optical display system.
Type: Grant
Filed: January 11, 2017
Date of Patent: November 19, 2019
Assignee: DAQRI LLC
Inventor: Sterling Crispin
-
Patent number: 10168778
Abstract: A device has a biometric sensor, a display, a light source, and a hardware processor. The biometric sensor measures biometric information of a user of the device. The display displays augmented reality (AR) content. The light source outputs a visual signal. The hardware processor performs operations comprising: identifying a task being performed by the user of the device, determining a disruption level based on a combination of the AR content, the task, and the biometric information, and generating, using the light source, the visual signal corresponding to the disruption level. The visual indicator visually alerts other users of other devices to avoid engaging the user of the device.
Type: Grant
Filed: June 19, 2017
Date of Patent: January 1, 2019
Assignee: DAQRI, LLC
Inventors: Lucas Kazansky, Sterling Crispin, Stefanie A. Hutka
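The disruption-level computation described here combines three inputs into one signal. One plausible (entirely assumed) realization is a weighted sum of normalized scores driving a traffic-light indicator; the weights, thresholds, and names below are not from the patent:

```python
def disruption_level(content_load: float, task_criticality: float,
                     stress: float) -> float:
    """Combine three normalized inputs in [0, 1] (AR content load, task
    criticality, biometric stress) into a single disruption level; higher
    means the user should not be interrupted. Weights are illustrative."""
    level = 0.3 * content_load + 0.4 * task_criticality + 0.3 * stress
    return min(max(level, 0.0), 1.0)

def indicator_color(level: float) -> str:
    """Translate the disruption level into the outward-facing light signal."""
    if level < 0.33:
        return "green"   # fine to engage the user
    if level < 0.66:
        return "yellow"  # engage only if necessary
    return "red"         # do not interrupt
```

The interesting design point is that the signal is for *other* people: the wearer's headset broadcasts, via a light, how costly an interruption would be right now.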
-
Publication number: 20180275762
Abstract: A computing device provides augmented reality images of an environment in which the computing device is used. The computing device is further configured to display a graphical user interface for interacting with the computing device. The graphical user interface may be displayed according to one or more configured graphical user interface layouts. The computing device further includes an inertial measurement unit, which provides input for interacting with one or more portions of the graphical user interface. As a user of the computing device moves the computing device, corresponding graphical changes are made to the displayed graphical user interface. In this way, by moving the computing device, the user is able to interact with, and provide input to, the computing device.
Type: Application
Filed: March 22, 2017
Publication date: September 27, 2018
Inventors: Jonathan Trevor Freeman, Michael Kozlowski, Sterling Crispin
-
Publication number: 20180197338
Abstract: Interface-based modeling and design of three dimensional spaces using two dimensional representations are provided herein. An example method includes converting a three dimensional space into a two dimensional space using a map projection schema, where the two dimensional space is bounded by ergonomic limits of a human, and the two dimensional space is provided as an ergonomic user interface, receiving an anchor position within the ergonomic user interface that defines a placement of an asset relative to the three dimensional space when the two dimensional space is re-converted back to a three dimensional space, and re-converting the two dimensional space back into the three dimensional space for display along with the asset, within an optical display system.
Type: Application
Filed: January 11, 2017
Publication date: July 12, 2018
Applicant: DAQRI LLC
Inventor: Sterling Crispin
-
Publication number: 20170364153
Abstract: A device has a biometric sensor, a display, a light source, and a hardware processor. The biometric sensor measures biometric information of a user of the device. The display displays augmented reality (AR) content. The light source outputs a visual signal. The hardware processor performs operations comprising: identifying a task being performed by the user of the device, determining a disruption level based on a combination of the AR content, the task, and the biometric information, and generating, using the light source, the visual signal corresponding to the disruption level. The visual indicator visually alerts other users of other devices to avoid engaging the user of the device.
Type: Application
Filed: June 19, 2017
Publication date: December 21, 2017
Inventors: Lucas Kazansky, Sterling Crispin, Stefanie A. Hutka
-
Patent number: 9807383
Abstract: During calibration, a wearable video headset displays a pattern on a partially transparent display positioned in a field of view of a user's eye. The user has a hand-held marker that includes a scaled version of the displayed pattern. The user moves the marker toward or away from the user's eye until the pattern on the marker appears to be the same size as the pattern on the display. When the sizes match, the headset measures a distance between a forward-facing camera and the hand-held marker. The headset uses the measured distance, and geometrical relationships, to determine the spacing between the user's eye and the display. Such calibration can ensure that the images displayed to the user mesh realistically with the surroundings, which remain partially visible through the partially transparent display of the video headset.
Type: Grant
Filed: March 30, 2016
Date of Patent: October 31, 2017
Assignee: DAQRI, LLC
Inventors: Sterling Crispin, Mark Anthony Sararu
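The geometry behind this calibration is similar triangles: when the displayed pattern and the hand-held marker's scaled pattern appear the same size, they subtend the same visual angle from the eye, so the ratio of distance to pattern height is equal for both. A minimal sketch of that relationship (argument names and the camera-to-eye offset term are assumptions, not details from the patent):

```python
def eye_to_display_distance(camera_to_marker: float,
                            camera_to_eye_offset: float,
                            display_pattern_height: float,
                            marker_pattern_height: float) -> float:
    """When the two patterns appear the same size, they subtend the same
    visual angle, so by similar triangles:

        d_eye_to_display / h_display == d_eye_to_marker / h_marker

    The headset measures camera-to-marker distance; adding the (assumed
    known) camera-to-eye offset gives the eye-to-marker distance."""
    eye_to_marker = camera_to_marker + camera_to_eye_offset
    return eye_to_marker * (display_pattern_height / marker_pattern_height)
```

For example, if the marker pattern is printed at four times the height of the displayed pattern, the eye-to-display spacing comes out to one quarter of the eye-to-marker distance at the matching point.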
-
Publication number: 20170289535
Abstract: During calibration, a wearable video headset displays a pattern on a partially transparent display positioned in a field of view of a user's eye. The user has a hand-held marker that includes a scaled version of the displayed pattern. The user moves the marker toward or away from the user's eye until the pattern on the marker appears to be the same size as the pattern on the display. When the sizes match, the headset measures a distance between a forward-facing camera and the hand-held marker. The headset uses the measured distance, and geometrical relationships, to determine the spacing between the user's eye and the display. Such calibration can ensure that the images displayed to the user mesh realistically with the surroundings, which remain partially visible through the partially transparent display of the video headset.
Type: Application
Filed: March 30, 2016
Publication date: October 5, 2017
Inventors: Sterling Crispin, Mark Anthony Sararu
-
Patent number: 9652047
Abstract: Visual gestures in a display device allow a user to select and activate features in a display of the display device. A sensor of the display device tracks an eye gaze of a user directed at a display of the display device. A visual gesture module identifies a predefined trigger zone in the display. A virtual object application displays a virtual object in the display based on the eye gaze of the user and the predefined trigger zone.
Type: Grant
Filed: February 24, 2016
Date of Patent: May 16, 2017
Assignee: DAQRI, LLC
Inventors: Brian Mullins, Matthew Kammerait, Christopher Broaddus, Sterling Crispin, Lucas Kazansky
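A common way to realize a gaze-driven trigger zone like the one this abstract describes is dwell detection: the action fires only after the gaze has stayed inside the zone for some number of consecutive samples, which avoids accidental activation from a passing glance. The class name, dwell threshold, and rectangular zone shape below are illustrative assumptions, not details from the patent:

```python
class GazeTriggerZone:
    """A predefined rectangular trigger zone in the display that fires
    once the user's gaze has dwelled inside it long enough."""

    def __init__(self, x: float, y: float, w: float, h: float,
                 dwell_frames: int = 30):
        self.rect = (x, y, w, h)
        self.dwell_frames = dwell_frames  # consecutive in-zone samples required
        self._count = 0

    def update(self, gaze_x: float, gaze_y: float) -> bool:
        """Feed one gaze sample per frame; returns True when the dwell
        threshold is reached (e.g. to display the virtual object)."""
        x, y, w, h = self.rect
        inside = x <= gaze_x <= x + w and y <= gaze_y <= y + h
        self._count = self._count + 1 if inside else 0  # reset on exit
        return self._count >= self.dwell_frames
```

The dwell counter resets whenever the gaze leaves the zone, so only a sustained look triggers the virtual object.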
-
Publication number: 20160246384
Abstract: Visual gestures in a display device allow a user to select and activate features in a display of the display device. A sensor of the display device tracks an eye gaze of a user directed at a display of the display device. A visual gesture module identifies a predefined trigger zone in the display. A virtual object application displays a virtual object in the display based on the eye gaze of the user and the predefined trigger zone.
Type: Application
Filed: February 24, 2016
Publication date: August 25, 2016
Inventors: Brian Mullins, Matthew Kammerait, Christopher Broaddus, Sterling Crispin, Lucas Kazansky