Patents by Inventor Scott Herz
Scott Herz has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 11953339
Abstract: Implementations described and claimed herein provide systems and methods for interaction between a user and a machine. In one implementation, machine status information for the machine is received at a dedicated machine component. The machine status information is published onto a distributed node system network of the machine. The machine status information is ingested at a primary interface controller, and an interactive user interface is generated using the primary interface controller. The interactive user interface is generated based on the machine status information. In some implementations, input is received from the user at the primary interface controller through the interactive user interface, and a corresponding action is delegated to one or more subsystems of the machine using the distributed node system network.
Type: Grant
Filed: February 23, 2021
Date of Patent: April 9, 2024
Inventors: Patrick S. Piemonte, Jason D. Gosnell, Kjell F. Bronder, Daniel De Rocha Rosario, Shaun D. Budhram, Scott Herz
-
Publication number: 20240086061
Abstract: A portable electronic device for instant messaging is disclosed. One aspect of the invention involves a graphical user interface (GUI) on a portable electronic device with a touch screen display. The GUI has a set of messages exchanged between a user of the device and another person. The set of messages are displayed in a chronological order. In response to detecting a scrolling gesture comprising a substantially vertical movement of a user contact with the touch screen display, the display of messages are scrolled in accordance with a direction of the scrolling gesture. The detecting of the scrolling gesture is substantially independent of a horizontal position of the user contact with the touch screen display.
Type: Application
Filed: September 18, 2023
Publication date: March 14, 2024
Inventors: Stephen O. Lemay, Marcel Van Os, Scott Herz, Greg Christie
-
Publication number: 20240068835
Abstract: Implementations described and claimed herein provide systems and methods for interaction between a user and a machine. In one implementation, machine status information for the machine is received at a dedicated machine component. The machine status information is published onto a distributed node system network of the machine. The machine status information is ingested at a primary interface controller, and an interactive user interface is generated using the primary interface controller. The interactive user interface is generated based on the machine status information. In some implementations, input is received from the user at the primary interface controller through the interactive user interface, and a corresponding action is delegated to one or more subsystems of the machine using the distributed node system network.
Type: Application
Filed: November 7, 2023
Publication date: February 29, 2024
Inventors: Patrick S. Piemonte, Jason D. Gosnell, Kjell F. Bronder, Daniel De Rocha Rosario, Shaun D. Budhram, Scott Herz
-
Patent number: 11907417
Abstract: Described herein are eye-controlled user-machine interaction systems and methods that, based on input variables that comprise orientation and motion of an eye-mounted display (EMD), assist the wearer of a contact lens carrying the EMD to control and navigate a virtual scene that may be superimposed onto the real-world environment. Various embodiments of the invention provide for smooth, intuitive, and naturally flowing eye-controlled, interactive operations between the wearer and a virtual environment. In certain embodiments, this is accomplished by revealing layers of virtual objects and content based on eye-tracking and other motion information.
Type: Grant
Filed: July 25, 2019
Date of Patent: February 20, 2024
Assignee: Tectus Corporation
Inventors: Dominic Philip Haine, Scott Herz
-
Patent number: 11861138
Abstract: Methods, systems, and apparatus, including computer program products, for presenting user interface elements. A first page of one or more user interface elements is presented on a touch-sensitive display. Each of the user interface elements corresponds to a respective application. A gesture performed on the touch-sensitive display is detected. In response to the detected gesture, a second page of one or more user interface elements is presented on the touch-sensitive display.
Type: Grant
Filed: September 17, 2021
Date of Patent: January 2, 2024
Assignee: Apple Inc.
Inventors: Imran A. Chaudhri, Scott Herz, Steven P. Jobs, Freddy A. Anzures, Greg Christie
-
Patent number: 11762547
Abstract: A portable electronic device for instant messaging is disclosed. One aspect of the invention involves a graphical user interface (GUI) on a portable electronic device with a touch screen display. The GUI has a set of messages exchanged between a user of the device and another person. The set of messages are displayed in a chronological order. In response to detecting a scrolling gesture comprising a substantially vertical movement of a user contact with the touch screen display, the display of messages are scrolled in accordance with a direction of the scrolling gesture. The detecting of the scrolling gesture is substantially independent of a horizontal position of the user contact with the touch screen display.
Type: Grant
Filed: November 8, 2021
Date of Patent: September 19, 2023
Assignee: Apple Inc.
Inventors: Stephen O. Lemay, Marcel Van Os, Scott Herz, Greg Christie
-
Patent number: 11662807
Abstract: The present disclosure relates generally to eye-tracking systems and methods that provide a user the ability to efficiently activate the system and select and dismiss virtual objects within an augmented reality ("AR") and/or virtual reality ("VR") environment. A user may activate the user interface by glancing beyond an activation threshold positioned close enough to the edge of the field of view to reliably infer an intent to activate the virtual controls. Subsequently, the user interacts with virtual tools, first virtual "peek" windows and secondary virtual windows to obtain content or virtual control across a variety of granular levels. Subsequently, the user may glance away at virtual content or to other predefined areas within their eye's range of motion to dismiss the tool and/or deactivate the system.
Type: Grant
Filed: July 27, 2020
Date of Patent: May 30, 2023
Assignee: Tectus Corporation
Inventors: Dominic Philip Haine, Scott Herz
-
Publication number: 20230081617
Abstract: In some implementations, documents can be presented on a display of a computing device based on a context of the computing device. The context can include a current time and/or current location associated with the computing device. The documents can be presented based on a time and/or location associated with the documents. Documents can be downloaded and shared between devices. Documents can be dynamically updated based on document vendor provided information and/or other network based resources. In some implementations, the various graphical interfaces described herein provide access to a variety of document types in an organized and efficient manner.
Type: Application
Filed: November 22, 2022
Publication date: March 16, 2023
Inventors: Chanaka G. Karunamuni, Marcel Van Os, Scott Herz, Eliza Block, Glen W. Steele, Ken Ferry, Peter Laurens
-
Patent number: 11562325
Abstract: In some implementations, documents can be presented on a display of a computing device based on a context of the computing device. The context can include a current time and/or current location associated with the computing device. The documents can be presented based on a time and/or location associated with the documents. Documents can be downloaded and shared between devices. Documents can be dynamically updated based on document vendor provided information and/or other network based resources. In some implementations, the various graphical interfaces described herein provide access to a variety of document types in an organized and efficient manner.
Type: Grant
Filed: June 6, 2019
Date of Patent: January 24, 2023
Assignee: Apple Inc.
Inventors: Chanaka G. Karunamuni, Marcel Van Os, Scott Herz, Eliza Block, Glen W. Steele, Ken Ferry, Peter Laurens
-
Publication number: 20220301566
Abstract: Among other things, techniques and systems are disclosed for implementing contextual voice commands. On a device, a data item in a first context is displayed. On the device, a physical input selecting the displayed data item in the first context is received. On the device, a voice input that relates the selected data item to an operation in a second context is received. The operation is performed on the selected data item in the second context.
Type: Application
Filed: June 2, 2022
Publication date: September 22, 2022
Inventors: Marcel Van Os, Gregory Novick, Scott Herz
-
Patent number: 11343420
Abstract: Presented in the present disclosure are system and method embodiments that allow a user to wear a contact lens that provides a virtual framework for the user to retrieve information from one or more remote cameras and implement remote camera control via eye movement tracked by one or more motion sensors embedded within the contact lens. The remote camera control may include, but is not limited to, pan, tilt, and zoom control. A user may activate projection of content captured from a remote camera and control the remote camera via an established communication link. The communication link may be a direct link or an indirect link via one or more intermediate devices, e.g., a server and/or an accessory device. This unique way of activating projection and controlling the camera by tracking eye movement provides a convenient and secure means of remote camera control without involving the hands or voice.
Type: Grant
Filed: March 30, 2021
Date of Patent: May 24, 2022
Assignee: Tectus Corporation
Inventors: Scott Herz, Dominic Philip Haine
-
Publication number: 20220147226
Abstract: Methods, systems, and apparatus, including computer program products, for presenting user interface elements. A first page of one or more user interface elements is presented on a touch-sensitive display. Each of the user interface elements corresponds to a respective application. A gesture performed on the touch-sensitive display is detected. In response to the detected gesture, a second page of one or more user interface elements is presented on the touch-sensitive display.
Type: Application
Filed: September 17, 2021
Publication date: May 12, 2022
Inventors: Imran A. Chaudhri, Scott Herz, Steven Jobs, Freddy A. Anzures, Greg Christie
-
Publication number: 20220057907
Abstract: A portable electronic device for instant messaging is disclosed. One aspect of the invention involves a graphical user interface (GUI) on a portable electronic device with a touch screen display. The GUI has a set of messages exchanged between a user of the device and another person. The set of messages are displayed in a chronological order. In response to detecting a scrolling gesture comprising a substantially vertical movement of a user contact with the touch screen display, the display of messages are scrolled in accordance with a direction of the scrolling gesture. The detecting of the scrolling gesture is substantially independent of a horizontal position of the user contact with the touch screen display.
Type: Application
Filed: November 8, 2021
Publication date: February 24, 2022
Inventors: Stephen O. Lemay, Marcel Van Os, Scott Herz, Greg Christie
-
Patent number: 11169690
Abstract: A portable electronic device for instant messaging is disclosed. One aspect of the invention involves a graphical user interface (GUI) on a portable electronic device with a touch screen display. The GUI has a set of messages exchanged between a user of the device and another person. The set of messages are displayed in a chronological order. In response to detecting a scrolling gesture comprising a substantially vertical movement of a user contact with the touch screen display, the display of messages are scrolled in accordance with a direction of the scrolling gesture. The detecting of the scrolling gesture is substantially independent of a horizontal position of the user contact with the touch screen display.
Type: Grant
Filed: February 24, 2020
Date of Patent: November 9, 2021
Assignee: Apple Inc.
Inventors: Stephen O. Lemay, Marcel Van Os, Scott Herz, Greg Christie
-
Publication number: 20210326016
Abstract: At least certain embodiments of the present disclosure include an environment with user interface software interacting with a software application. A method for operating through an application programming interface (API) in this environment includes transferring a set bounce call. The method further includes setting at least one of maximum and minimum bounce values. The set bounce call causes a bounce of a scrolled region in an opposite direction of a scroll based on a region past an edge of the scrolled region being visible in a display region at the end of the scroll.
Type: Application
Filed: October 26, 2020
Publication date: October 21, 2021
Inventors: Andrew Platzer, Scott Herz
-
Patent number: 11126321
Abstract: Methods, systems, and apparatus, including computer program products, for presenting user interface elements. A first page of one or more user interface elements is presented on a touch-sensitive display. Each of the user interface elements corresponds to a respective application. A gesture performed on the touch-sensitive display is detected. In response to the detected gesture, a second page of one or more user interface elements is presented on the touch-sensitive display.
Type: Grant
Filed: September 4, 2007
Date of Patent: September 21, 2021
Assignee: Apple Inc.
Inventors: Imran A. Chaudhri, Scott Herz, Steven Jobs, Freddy A. Anzures, Greg Christie
-
Publication number: 20210247203
Abstract: Implementations described and claimed herein provide systems and methods for interaction between a user and a machine. In one implementation, machine status information for the machine is received at a dedicated machine component. The machine status information is published onto a distributed node system network of the machine. The machine status information is ingested at a primary interface controller, and an interactive user interface is generated using the primary interface controller. The interactive user interface is generated based on the machine status information. In some implementations, input is received from the user at the primary interface controller through the interactive user interface, and a corresponding action is delegated to one or more subsystems of the machine using the distributed node system network.
Type: Application
Filed: February 23, 2021
Publication date: August 12, 2021
Inventors: Patrick S. Piemonte, Jason D. Gosnell, Kjell F. Bronder, Daniel De Rocha Rosario, Shaun D. Budhram, Scott Herz
-
Publication number: 20210208674
Abstract: The present disclosure relates generally to eye-tracking systems and methods that provide a user the ability to efficiently activate the system and select and dismiss virtual objects within an augmented reality ("AR") and/or virtual reality ("VR") environment. A user may activate the user interface by glancing beyond an activation threshold positioned close enough to the edge of the field of view to reliably infer an intent to activate the virtual controls. Subsequently, the user interacts with virtual tools, first virtual "peek" windows and secondary virtual windows to obtain content or virtual control across a variety of granular levels. Subsequently, the user may glance away at virtual content or to other predefined areas within their eye's range of motion to dismiss the tool and/or deactivate the system.
Type: Application
Filed: July 27, 2020
Publication date: July 8, 2021
Applicant: Tectus Corporation
Inventors: Dominic Philip Haine, Scott Herz
-
Publication number: 20210124415
Abstract: Presented are eye-controlled user-machine interaction systems and methods that, based on input variables that comprise orientation and motion of an electronic contact lens, assist the wearer of the contact lens carrying a femtoprojector to control and navigate a virtual scene that may be superimposed onto the real-world environment. Various embodiments provide for smooth, intuitive, and naturally flowing eye-controlled, interactive operations between the wearer and a virtual environment. In certain embodiments, eye motion information is used to wake a smart electronic contact lens, activate tools in a virtual scene, or any combination thereof without the need for blinking, winking, hand gestures, and use of buttons.
Type: Application
Filed: December 10, 2020
Publication date: April 29, 2021
Applicant: Tectus Corporation
Inventors: Dominic Philip Haine, Scott Herz, Renaldi Winoto, Abhishek Bhat, Ramin Mirjalili, Joseph Czompo
-
Patent number: 10976178
Abstract: Implementations described and claimed herein provide systems and methods for interaction between a user and a machine. In one implementation, machine status information for the machine is received at a dedicated machine component. The machine status information is published onto a distributed node system network of the machine. The machine status information is ingested at a primary interface controller, and an interactive user interface is generated using the primary interface controller. The interactive user interface is generated based on the machine status information. In some implementations, input is received from the user at the primary interface controller through the interactive user interface, and a corresponding action is delegated to one or more subsystems of the machine using the distributed node system network.
Type: Grant
Filed: September 21, 2016
Date of Patent: April 13, 2021
Inventors: Patrick S. Piemonte, Jason D. Gosnell, Kjell F. Bronder, Daniel De Rocha Rosario, Shaun D. Budhram, Scott Herz