Patents by Inventor Scott Herz

Scott Herz has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11953339
    Abstract: Implementations described and claimed herein provide systems and methods for interaction between a user and a machine. In one implementation, machine status information for the machine is received at a dedicated machine component. The machine status information is published onto a distributed node system network of the machine. The machine status information is ingested at a primary interface controller, and an interactive user interface is generated using the primary interface controller. The interactive user interface is generated based on the machine status information. In some implementations, input is received from the user at the primary interface controller through the interactive user interface, and a corresponding action is delegated to one or more subsystems of the machine using the distributed node system network.
    Type: Grant
    Filed: February 23, 2021
    Date of Patent: April 9, 2024
    Inventors: Patrick S. Piemonte, Jason D. Gosnell, Kjell F. Bronder, Daniel De Rocha Rosario, Shaun D. Budhram, Scott Herz
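The abstract above describes a publish/ingest/delegate flow: a dedicated machine component publishes status onto a distributed node network, a primary interface controller ingests that status to generate an interactive user interface, and user input is delegated back to subsystems over the same network. The sketch below models that flow with a toy topic bus; all class names, topic strings, and data fields are illustrative assumptions and do not come from the patent itself.

```python
# Minimal sketch of the publish/ingest/delegate flow described in the
# abstract above. Class names, topics, and fields are assumptions.

class NodeBus:
    """Toy stand-in for the distributed node system network: a topic bus."""
    def __init__(self):
        self._subscribers = {}

    def subscribe(self, topic, handler):
        self._subscribers.setdefault(topic, []).append(handler)

    def publish(self, topic, message):
        for handler in self._subscribers.get(topic, []):
            handler(message)


class PrimaryInterfaceController:
    """Ingests machine status and renders a (text) user interface from it."""
    def __init__(self, bus):
        self.bus = bus
        self.latest_status = {}
        bus.subscribe("machine/status", self.ingest)

    def ingest(self, status):
        self.latest_status.update(status)

    def render(self):
        # Generate the "interactive user interface" from the status data.
        return "\n".join(f"{k}: {v}" for k, v in sorted(self.latest_status.items()))

    def handle_user_input(self, command):
        # Delegate the requested action to subsystems over the node network.
        self.bus.publish("subsystem/command", command)


if __name__ == "__main__":
    bus = NodeBus()
    ui = PrimaryInterfaceController(bus)
    bus.subscribe("subsystem/command", lambda cmd: print("subsystem received:", cmd))

    # A dedicated machine component publishes status information.
    bus.publish("machine/status", {"battery": "87%", "door": "closed"})
    print(ui.render())
    ui.handle_user_input({"action": "open_door"})
```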
  • Publication number: 20240086061
    Abstract: A portable electronic device for instant messaging is disclosed. One aspect of the invention involves a graphical user interface (GUI) on a portable electronic device with a touch screen display. The GUI has a set of messages exchanged between a user of the device and another person. The set of messages is displayed in chronological order. In response to detecting a scrolling gesture comprising a substantially vertical movement of a user contact with the touch screen display, the display of messages is scrolled in accordance with the direction of the scrolling gesture. The detecting of the scrolling gesture is substantially independent of the horizontal position of the user contact with the touch screen display.
    Type: Application
    Filed: September 18, 2023
    Publication date: March 14, 2024
    Inventors: Stephen O. Lemay, Marcel Van Os, Scott Herz, Greg Christie
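This publication (and the related grants further down the list) describes a scroll that is triggered by substantially vertical finger movement and that ignores where horizontally the contact lands. The sketch below is one hedged way to model that behavior; the dominance ratio, coordinate convention, and clamping values are assumptions for illustration.

```python
# Illustrative sketch of the scrolling behavior described in the abstract
# above: a gesture counts as a scroll when its movement is substantially
# vertical, and the resulting offset depends only on the vertical component,
# not on the horizontal position of the contact. Thresholds are assumptions.

VERTICAL_RATIO = 2.0  # |dy| must dominate |dx| by this factor (assumed value)

def is_vertical_scroll(start, end):
    dx = end[0] - start[0]
    dy = end[1] - start[1]
    return abs(dy) >= VERTICAL_RATIO * abs(dx)

def scroll_messages(offset, start, end, content_height, viewport_height):
    """Return the new scroll offset after a gesture from `start` to `end`."""
    if not is_vertical_scroll(start, end):
        return offset  # not a scrolling gesture; leave the list where it is
    dy = end[1] - start[1]
    new_offset = offset - dy  # drag upward reveals later messages
    return max(0, min(new_offset, content_height - viewport_height))

if __name__ == "__main__":
    # The same vertical drag scrolls the list whether it starts at x=10 or x=300.
    print(scroll_messages(0, (10, 400), (12, 100), 2000, 480))
    print(scroll_messages(0, (300, 400), (305, 100), 2000, 480))
```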
  • Publication number: 20240068835
    Abstract: Implementations described and claimed herein provide systems and methods for interaction between a user and a machine. In one implementation, machine status information for the machine is received at a dedicated machine component. The machine status information is published onto a distributed node system network of the machine. The machine status information is ingested at a primary interface controller, and an interactive user interface is generated using the primary interface controller. The interactive user interface is generated based on the machine status information. In some implementations, input is received from the user at the primary interface controller through the interactive user interface, and a corresponding action is delegated to one or more subsystems of the machine using the distributed node system network.
    Type: Application
    Filed: November 7, 2023
    Publication date: February 29, 2024
    Inventors: Patrick S. Piemonte, Jason D. Gosnell, Kjell F. Bronder, Daniel De Rocha Rosario, Shaun D. Budhram, Scott Herz
  • Patent number: 11907417
    Abstract: Described herein are eye-controlled user-machine interaction systems and methods that, based on input variables that comprise orientation and motion of an eye-mounted display (EMD), assist the wearer of a contact lens carrying the EMD to control and navigate a virtual scene that may be superimposed onto the real-world environment. Various embodiments of the invention provide for smooth, intuitive, and naturally flowing eye-controlled, interactive operations between the wearer and a virtual environment. In certain embodiments, this is accomplished by revealing layers of virtual objects and content based on eye-tracking and other motion information.
    Type: Grant
    Filed: July 25, 2019
    Date of Patent: February 20, 2024
    Assignee: Tectus Corporation
    Inventors: Dominic Philip Haine, Scott Herz
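One idea in the abstract above is revealing layers of virtual objects and content based on eye-tracking information. The sketch below models that as a gaze-dwell rule: the longer the wearer's gaze rests on an object, the more layers are revealed. The dwell threshold, update rate, and layer model are assumptions, not details from the patent.

```python
# Rough sketch of layer-by-layer reveal driven by gaze dwell, as one reading
# of the abstract above. Threshold and data model are illustrative assumptions.

DWELL_TO_REVEAL = 0.8  # seconds of steady gaze before revealing the next layer

class VirtualObject:
    def __init__(self, name, layers):
        self.name = name
        self.layers = layers          # e.g. ["icon", "summary", "details"]
        self.revealed = 1             # number of layers currently shown
        self.dwell = 0.0

    def update(self, gazed_at, dt):
        """Advance dwell time while gazed at; reveal deeper layers over time."""
        if gazed_at:
            self.dwell += dt
            if self.dwell >= DWELL_TO_REVEAL and self.revealed < len(self.layers):
                self.revealed += 1
                self.dwell = 0.0
        else:
            self.dwell = 0.0
        return self.layers[:self.revealed]

if __name__ == "__main__":
    obj = VirtualObject("weather", ["icon", "temperature", "7-day forecast"])
    for step in range(30):
        shown = obj.update(gazed_at=True, dt=0.1)   # 100 ms eye-tracking updates
    print(shown)  # all three layers revealed after sustained gaze
```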
  • Patent number: 11861138
    Abstract: Methods, systems, and apparatus, including computer program products, for presenting user interface elements. A first page of one or more user interface elements is presented on a touch-sensitive display. Each of the user interface elements corresponds to a respective application. A gesture performed on the touch-sensitive display is detected. In response to the detected gesture, a second page of one or more user interface elements is presented on the touch-sensitive display.
    Type: Grant
    Filed: September 17, 2021
    Date of Patent: January 2, 2024
    Assignee: Apple Inc.
    Inventors: Imran A. Chaudhri, Scott Herz, Steven P. Jobs, Freddy A. Anzures, Greg Christie
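The abstract above describes user interface elements grouped into pages, with a gesture moving the display from one page to the next. The sketch below shows that mechanic in miniature; the page size, swipe threshold, and app names are invented for illustration.

```python
# Minimal sketch of the paged presentation described above: elements are
# grouped into pages, and a horizontal swipe presents the next or previous
# page. Page size, threshold, and names are assumptions.

ICONS_PER_PAGE = 4

def paginate(elements, per_page=ICONS_PER_PAGE):
    return [elements[i:i + per_page] for i in range(0, len(elements), per_page)]

def next_page_index(current, swipe_dx, page_count, threshold=50):
    """Swipe left (negative dx) advances; swipe right goes back."""
    if swipe_dx <= -threshold:
        return min(current + 1, page_count - 1)
    if swipe_dx >= threshold:
        return max(current - 1, 0)
    return current

if __name__ == "__main__":
    apps = ["Phone", "Mail", "Safari", "Music", "Maps", "Photos", "Notes"]
    pages = paginate(apps)
    current = 0
    print(pages[current])                       # first page of elements
    current = next_page_index(current, -120, len(pages))
    print(pages[current])                       # second page after a left swipe
```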
  • Patent number: 11762547
    Abstract: A portable electronic device for instant messaging is disclosed. One aspect of the invention involves a graphical user interface (GUI) on a portable electronic device with a touch screen display. The GUI has a set of messages exchanged between a user of the device and another person. The set of messages are displayed in a chronological order. In response to detecting a scrolling gesture comprising a substantially vertical movement of a user contact with the touch screen display, the display of messages are scrolled in accordance with a direction of the scrolling gesture. The detecting of the scrolling gesture is substantially independent of a horizontal position of the user contact with the touch screen display.
    Type: Grant
    Filed: November 8, 2021
    Date of Patent: September 19, 2023
    Assignee: Apple Inc.
    Inventors: Stephen O. Lemay, Marcel Van Os, Scott Herz, Greg Christie
  • Patent number: 11662807
    Abstract: The present disclosure relates generally to eye-tracking systems and methods that provide a user the ability to efficiently activate the system and select and dismiss virtual objects within an augmented reality ("AR") and/or virtual reality ("VR") environment. A user may activate the user interface by glancing beyond an activation threshold positioned close enough to the edge of the field of view to reliably infer an intent to activate the virtual controls. Subsequently, the user interacts with virtual tools, first virtual "peek" windows, and secondary virtual windows to obtain content or virtual control across a variety of granular levels. Finally, the user may glance away to virtual content or to other predefined areas within their eye's range of motion to dismiss the tool and/or deactivate the system.
    Type: Grant
    Filed: July 27, 2020
    Date of Patent: May 30, 2023
    Assignee: Tectus Corporation
    Inventors: Dominic Philip Haine, Scott Herz
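The central mechanism in the abstract above is an activation threshold placed near the edge of the field of view: glancing past it wakes the interface, and glancing back toward the center (or to a predefined area) dismisses it. The sketch below captures that state machine; the specific angles are assumptions chosen only to make the example run.

```python
# Sketch of edge-of-view activation as described in the abstract above.
# The interface wakes when gaze moves beyond a threshold near the edge of
# the field of view and is dismissed when gaze returns toward the center.
# All angle values below are illustrative assumptions.

ACTIVATION_DEG = 26.0           # threshold close to the edge of view (assumed)
DISMISS_DEG = 10.0              # glancing back toward center dismisses (assumed)

class EyeActivatedUI:
    def __init__(self):
        self.active = False

    def update(self, gaze_angle_deg):
        """gaze_angle_deg: angular distance of the gaze from the center of view."""
        if not self.active and gaze_angle_deg >= ACTIVATION_DEG:
            self.active = True      # intent to activate inferred
        elif self.active and gaze_angle_deg <= DISMISS_DEG:
            self.active = False     # glance back toward center dismisses
        return self.active

if __name__ == "__main__":
    ui = EyeActivatedUI()
    for angle in [5, 15, 27, 20, 8]:
        print(angle, ui.update(angle))   # activates at 27 degrees, dismisses at 8
```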
  • Publication number: 20230081617
    Abstract: In some implementations, documents can be presented on a display of a computing device based on a context of the computing device. The context can include a current time and/or current location associated with the computing device. The documents can be presented based on a time and/or location associated with the documents. Documents can be downloaded and shared between devices. Documents can be dynamically updated based on document vendor provided information and/or other network based resources. In some implementations, the various graphical interfaces described herein provide access to a variety of document types in an organized and efficient manner.
    Type: Application
    Filed: November 22, 2022
    Publication date: March 16, 2023
    Inventors: Chanaka G. Karunamuni, Marcel Van Os, Scott Herz, Eliza Block, Glen W. Steele, Ken Ferry, Peter Laurens
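The abstract above (shared with the granted patent that follows) describes surfacing documents based on the device's current time and location relative to the time and location associated with each document. The sketch below is one hedged interpretation of that filter; the field names, relevance window, radius, and distance approximation are assumptions.

```python
# Illustrative sketch of context-based document presentation as described
# above: documents are surfaced when the device's current time and location
# fall close to the document's associated time and place. Field names and
# thresholds are assumptions.

from dataclasses import dataclass
from datetime import datetime, timedelta
import math

@dataclass
class Document:
    title: str
    relevant_time: datetime
    lat: float
    lon: float

def distance_km(lat1, lon1, lat2, lon2):
    # Equirectangular approximation; adequate for a short-range relevance check.
    x = math.radians(lon2 - lon1) * math.cos(math.radians((lat1 + lat2) / 2))
    y = math.radians(lat2 - lat1)
    return math.hypot(x, y) * 6371.0

def relevant_documents(docs, now, lat, lon, window=timedelta(hours=3), radius_km=2.0):
    return [d for d in docs
            if abs(d.relevant_time - now) <= window
            and distance_km(d.lat, d.lon, lat, lon) <= radius_km]

if __name__ == "__main__":
    docs = [
        Document("Boarding pass SFO", datetime(2024, 5, 1, 9, 30), 37.6213, -122.3790),
        Document("Concert ticket", datetime(2024, 6, 10, 20, 0), 37.7749, -122.4194),
    ]
    now = datetime(2024, 5, 1, 8, 0)
    print([d.title for d in relevant_documents(docs, now, 37.62, -122.38)])
```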
  • Patent number: 11562325
    Abstract: In some implementations, documents can be presented on a display of a computing device based on a context of the computing device. The context can include a current time and/or current location associated with the computing device. The documents can be presented based on a time and/or location associated with the documents. Documents can be downloaded and shared between devices. Documents can be dynamically updated based on document vendor provided information and/or other network based resources. In some implementations, the various graphical interfaces described herein provide access to a variety of document types in an organized and efficient manner.
    Type: Grant
    Filed: June 6, 2019
    Date of Patent: January 24, 2023
    Assignee: Apple Inc.
    Inventors: Chanaka G. Karunamuni, Marcel Van Os, Scott Herz, Eliza Block, Glen W. Steele, Ken Ferry, Peter Laurens
  • Publication number: 20220301566
    Abstract: Among other things, techniques and systems are disclosed for implementing contextual voice commands. On a device, a data item in a first context is displayed. On the device, a physical input selecting the displayed data item in the first context is received. On the device, a voice input that relates the selected data item to an operation in a second context is received. The operation is performed on the selected data item in the second context.
    Type: Application
    Filed: June 2, 2022
    Publication date: September 22, 2022
    Inventors: Marcel Van Os, Gregory Novick, Scott Herz
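The abstract above describes selecting a data item by physical input in one context and then using a voice input to apply an operation to it in a second context. The sketch below models that hand-off with a tiny keyword map; the command grammar, context names, and operations are invented purely for illustration.

```python
# Rough sketch of the contextual voice command flow described above: an item
# selected by physical input in a first context is related, by a voice input,
# to an operation performed in a second context. The keyword map and context
# names are assumptions.

OPERATIONS = {
    # spoken keyword -> (target context, operation)
    "email": ("mail", "compose_with_attachment"),
    "print": ("printing", "print_document"),
}

def handle_voice_command(selected_item, first_context, voice_input):
    for keyword, (target_context, operation) in OPERATIONS.items():
        if keyword in voice_input.lower():
            return {
                "item": selected_item,
                "from_context": first_context,
                "to_context": target_context,
                "operation": operation,
            }
    return None

if __name__ == "__main__":
    # A physical input selected a photo while in the photos context.
    action = handle_voice_command("IMG_0042.jpg", "photos", "Email this to Alice")
    print(action)
```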
  • Patent number: 11343420
    Abstract: Presented in the present disclosure are system and method embodiments that allow a user to wear a contact lens that provides a virtual framework for the user to retrieve information from one or more remote cameras and implement remote camera control via eye movement tracked by one or more motion sensors embedded within the contact lens. The remote camera control may include, but is not limited to, pan, tilt, and zoom control. A user may activate projection of content captured from a remote camera and control the remote camera via an established communication link. The communication link may be a direct link or an indirect link via one or more intermediate devices, e.g., a server and/or an accessory device. This way of activating projection and controlling the camera by tracking eye movement provides a convenient and secure means of remote camera control without the involvement of hands or voice.
    Type: Grant
    Filed: March 30, 2021
    Date of Patent: May 24, 2022
    Assignee: Tectus Corporation
    Inventors: Scott Herz, Dominic Philip Haine
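The abstract above maps tracked eye movement to pan, tilt, and zoom commands sent to a remote camera over a direct or relayed link. The sketch below shows one possible mapping from sensor-reported gaze deltas to camera commands; the gains, command format, and transport callback are assumptions, with the real communication link abstracted away.

```python
# Sketch of the eye-movement-to-camera-control mapping described above:
# orientation changes reported by the lens's motion sensors are translated
# into pan/tilt (and optionally zoom) commands for a remote camera. Gains
# and the command format are assumptions; the link is a plain callback.

PAN_GAIN = 1.0    # degrees of camera pan per degree of eye yaw (assumed)
TILT_GAIN = 1.0   # degrees of camera tilt per degree of eye pitch (assumed)

def eye_motion_to_command(d_yaw_deg, d_pitch_deg, zoom_step=0):
    return {
        "pan": PAN_GAIN * d_yaw_deg,
        "tilt": TILT_GAIN * d_pitch_deg,
        "zoom": zoom_step,
    }

def send_over_link(command, transmit):
    """`transmit` stands in for the direct or server-relayed link to the camera."""
    transmit(command)

if __name__ == "__main__":
    # The eye moved 3 degrees right and 1 degree up since the last sensor sample.
    cmd = eye_motion_to_command(3.0, 1.0)
    send_over_link(cmd, transmit=lambda c: print("to remote camera:", c))
```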
  • Publication number: 20220147226
    Abstract: Methods, systems, and apparatus, including computer program products, for presenting user interface elements. A first page of one or more user interface elements is presented on a touch-sensitive display. Each of the user interface elements corresponds to a respective application. A gesture performed on the touch-sensitive display is detected. In response to the detected gesture, a second page of one or more user interface elements is presented on the touch-sensitive display.
    Type: Application
    Filed: September 17, 2021
    Publication date: May 12, 2022
    Inventors: Imran A. Chaudhri, Scott Herz, Steven Jobs, Freddy A. Anzures, Greg Christie
  • Publication number: 20220057907
    Abstract: A portable electronic device for instant messaging is disclosed. One aspect of the invention involves a graphical user interface (GUI) on a portable electronic device with a touch screen display. The GUI has a set of messages exchanged between a user of the device and another person. The set of messages is displayed in chronological order. In response to detecting a scrolling gesture comprising a substantially vertical movement of a user contact with the touch screen display, the display of messages is scrolled in accordance with the direction of the scrolling gesture. The detecting of the scrolling gesture is substantially independent of the horizontal position of the user contact with the touch screen display.
    Type: Application
    Filed: November 8, 2021
    Publication date: February 24, 2022
    Inventors: Stephen O. Lemay, Marcel Van Os, Scott Herz, Greg Christie
  • Patent number: 11169690
    Abstract: A portable electronic device for instant messaging is disclosed. One aspect of the invention involves a graphical user interface (GUI) on a portable electronic device with a touch screen display. The GUI has a set of messages exchanged between a user of the device and another person. The set of messages is displayed in chronological order. In response to detecting a scrolling gesture comprising a substantially vertical movement of a user contact with the touch screen display, the display of messages is scrolled in accordance with the direction of the scrolling gesture. The detecting of the scrolling gesture is substantially independent of the horizontal position of the user contact with the touch screen display.
    Type: Grant
    Filed: February 24, 2020
    Date of Patent: November 9, 2021
    Assignee: Apple Inc.
    Inventors: Stephen O. Lemay, Marcel Van Os, Scott Herz, Greg Christie
  • Publication number: 20210326016
    Abstract: At least certain embodiments of the present disclosure include an environment with user interface software interacting with a software application. A method for operating through an application programming interface (API) in this environment includes transferring a set bounce call. The method further includes setting at least one of maximum and minimum bounce values. The set bounce call causes a bounce of a scrolled region in an opposite direction of a scroll based on a region past an edge of the scrolled region being visible in a display region at the end of the scroll.
    Type: Application
    Filed: October 26, 2020
    Publication date: October 21, 2021
    Inventors: Andrew Platzer, Scott Herz
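The abstract above describes an API call that configures scroll "bounce": when a scroll carries the content past its edge, the overshoot is limited by set maximum and minimum bounce values and the region springs back in the opposite direction. The sketch below imitates that behavior; the class and method names are assumptions and do not reflect the actual framework API.

```python
# Toy sketch of the scroll bounce described in the abstract above: overshoot
# past an edge is limited by configured bounce values, and the scrolled
# region settles back to the edge afterward. The API shape is an assumption.

class ScrollRegion:
    def __init__(self, content_height, viewport_height):
        self.max_offset = max(0, content_height - viewport_height)
        self.offset = 0.0
        self.max_bounce = 0.0
        self.min_bounce = 0.0

    def set_bounce(self, minimum, maximum):
        """Stand-in for the 'set bounce call': configure the allowed overshoot."""
        self.min_bounce, self.max_bounce = minimum, maximum

    def scroll_by(self, dy):
        # Allow overshoot past the edges, but only up to the bounce limits.
        self.offset = max(-self.min_bounce,
                          min(self.offset + dy, self.max_offset + self.max_bounce))
        return self.offset

    def settle(self):
        # After the gesture ends, bounce back to the nearest valid edge.
        self.offset = max(0.0, min(self.offset, self.max_offset))
        return self.offset

if __name__ == "__main__":
    region = ScrollRegion(content_height=1000, viewport_height=480)
    region.set_bounce(minimum=60, maximum=60)
    print(region.scroll_by(600))   # overshoots the bottom edge by the bounce limit
    print(region.settle())         # springs back to the edge
```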
  • Patent number: 11126321
    Abstract: Methods, systems, and apparatus, including computer program products, for presenting user interface elements. A first page of one or more user interface elements is presented on a touch-sensitive display. Each of the user interface elements corresponds to a respective application. A gesture performed on the touch-sensitive display is detected. In response to the detected gesture, a second page of one or more user interface elements is presented on the touch-sensitive display.
    Type: Grant
    Filed: September 4, 2007
    Date of Patent: September 21, 2021
    Assignee: Apple Inc.
    Inventors: Imran A. Chaudhri, Scott Herz, Steven Jobs, Freddy A. Anzures, Greg Christie
  • Publication number: 20210247203
    Abstract: Implementations described and claimed herein provide systems and methods for interaction between a user and a machine. In one implementation, machine status information for the machine is received at a dedicated machine component. The machine status information is published onto a distributed node system network of the machine. The machine status information is ingested at a primary interface controller, and an interactive user interface is generated using the primary interface controller. The interactive user interface is generated based on the machine status information. In some implementations, input is received from the user at the primary interface controller through the interactive user interface, and a corresponding action is delegated to one or more subsystems of the machine using the distributed node system network.
    Type: Application
    Filed: February 23, 2021
    Publication date: August 12, 2021
    Inventors: Patrick S. Piemonte, Jason D. Gosnell, Kjell F. Bronder, Daniel De Rocha Rosario, Shaun D. Budhram, Scott Herz
  • Publication number: 20210208674
    Abstract: The present disclosure relates generally to eye-tracking systems and methods that provide a user the ability to efficiently activate the system and select and dismiss virtual objects within an augmented reality ("AR") and/or virtual reality ("VR") environment. A user may activate the user interface by glancing beyond an activation threshold positioned close enough to the edge of the field of view to reliably infer an intent to activate the virtual controls. Subsequently, the user interacts with virtual tools, first virtual "peek" windows, and secondary virtual windows to obtain content or virtual control across a variety of granular levels. Finally, the user may glance away to virtual content or to other predefined areas within their eye's range of motion to dismiss the tool and/or deactivate the system.
    Type: Application
    Filed: July 27, 2020
    Publication date: July 8, 2021
    Applicant: Tectus Corporation
    Inventors: Dominic Philip Haine, Scott Herz
  • Publication number: 20210124415
    Abstract: Presented are eye-controlled user-machine interaction systems and methods that, based on input variables that comprise orientation and motion of an electronic contact lens, assist the wearer of the contact lens carrying a femtoprojector to control and navigate a virtual scene that may be superimposed onto the real-world environment. Various embodiments provide for smooth, intuitive, and naturally flowing eye-controlled, interactive operations between the wearer and a virtual environment. In certain embodiments, eye motion information is used to wake a smart electronic contact lens, activate tools in a virtual scene, or any combination thereof, without the need for blinking, winking, hand gestures, or the use of buttons.
    Type: Application
    Filed: December 10, 2020
    Publication date: April 29, 2021
    Applicant: Tectus Corporation
    Inventors: Dominic Philip Haine, Scott Herz, Renaldi Winoto, Abhishek Bhat, Ramin Mirjalili, Joseph Czompo
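The abstract above mentions using eye motion information to wake the smart contact lens without blinking, winking, hand gestures, or buttons. The sketch below models one way such a wake condition could be phrased: detecting a quick, deliberate look far to one side and back to center from the sensors' yaw samples. The gesture pattern and every threshold are assumptions for illustration.

```python
# Sketch of wake-on-eye-motion as one reading of the abstract above: the lens
# stays in a low-power state until its motion sensors report a deliberate
# gaze gesture, modeled here as a fast look-aside-and-back. Thresholds and
# the gesture itself are illustrative assumptions.

WAKE_ANGLE_DEG = 25.0     # gaze excursion that counts as "far to the side"
RETURN_ANGLE_DEG = 5.0    # gaze back near center completes the gesture
MAX_SAMPLES = 15          # the excursion and return must happen quickly

def detect_wake_gesture(yaw_samples):
    """Return True if the samples contain a fast look-aside-and-back gesture."""
    for i, start in enumerate(yaw_samples):
        if abs(start) < RETURN_ANGLE_DEG:
            window = yaw_samples[i:i + MAX_SAMPLES]
            went_out = any(abs(a) >= WAKE_ANGLE_DEG for a in window)
            came_back = went_out and abs(window[-1]) < RETURN_ANGLE_DEG
            if went_out and came_back:
                return True
    return False

if __name__ == "__main__":
    idle = [0, 1, -2, 1, 0, 2, -1, 0, 1, 0, -1, 0, 1, 0, 0]
    wake = [0, 5, 15, 28, 30, 22, 10, 3, 1, 0, 0, 0, 0, 0, 0]
    print(detect_wake_gesture(idle))   # False: no deliberate excursion
    print(detect_wake_gesture(wake))   # True: look aside and return to center
```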
  • Patent number: 10976178
    Abstract: Implementations described and claimed herein provide systems and methods for interaction between a user and a machine. In one implementation, machine status information for the machine is received at a dedicated machine component. The machine status information is published onto a distributed node system network of the machine. The machine status information is ingested at a primary interface controller, and an interactive user interface is generated using the primary interface controller. The interactive user interface is generated based on the machine status information. In some implementations, input is received from the user at the primary interface controller through the interactive user interface, and a corresponding action is delegated to one or more subsystems of the machine using the distributed node system network.
    Type: Grant
    Filed: September 21, 2016
    Date of Patent: April 13, 2021
    Inventors: Patrick S. Piemonte, Jason D. Gosnell, Kjell F. Bronder, Daniel De Rocha Rosario, Shaun D. Budhram, Scott Herz