Patents by Inventor Ann Miller
Ann Miller has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Publication number: 20250109766
Abstract: Fasteners pre-dosed with a curable sealant material encased within an at least partially-cured rupturable protective shell, and methods for delivering a selected dose of curable sealant material to an aircraft assembly location simultaneously with fastener installation, are disclosed.
Type: Application
Filed: October 3, 2023
Publication date: April 3, 2025
Inventors: Benjamin Priest Hargrave, Carissa Ann Pajel, Melinda Dae Miller, David Michael Sims, Jr.
-
Publication number: 20250082173
Abstract: Embodiments described herein provide examples of a machine-learning-based visual-haptic system for constructing visual-haptic models for various interactions between surgical tools and tissues. In one aspect, a process for constructing a visual-haptic model is disclosed. This process can begin by receiving a set of training videos. The process then processes each training video in the set of training videos to extract one or more video segments that depict a target tool-tissue interaction from the training video, wherein the target tool-tissue interaction involves exerting a force by one or more surgical tools on a tissue. Next, for each video segment in the set of video segments, the process annotates each video image in the video segment with a set of force levels predefined for the target tool-tissue interaction. The process subsequently trains a machine-learning model using the annotated video images to obtain a trained machine-learning model for the target tool-tissue interaction.
Type: Application
Filed: September 13, 2024
Publication date: March 13, 2025
Inventors: Jagadish Venkataraman, Denise Ann Miller
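The data-preparation pipeline this abstract describes (extract tool-tissue interaction segments from each video, annotate every frame with a predefined force level, collect training pairs) can be sketched as follows. This is a hypothetical illustration, not the patented implementation; `extract_segments` and `annotate_frame` are assumed caller-supplied callables, and `FORCE_LEVELS` is an assumed label set.

```python
# Assumed set of predefined force levels for a tool-tissue interaction.
FORCE_LEVELS = ("none", "low", "medium", "high")

def build_training_pairs(videos, extract_segments, annotate_frame):
    """Turn raw training videos into (frame, force_level) pairs.

    extract_segments(video) -> list of segments, each a sequence of frames
    depicting one target tool-tissue interaction.
    annotate_frame(frame) -> one of FORCE_LEVELS.
    """
    pairs = []
    for video in videos:
        for segment in extract_segments(video):
            for frame in segment:
                label = annotate_frame(frame)
                assert label in FORCE_LEVELS  # annotation must use the predefined set
                pairs.append((frame, label))
    return pairs
```

The resulting pairs would then feed an ordinary supervised training loop to obtain the trained model the abstract mentions.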
-
Publication number: 20250066823
Abstract: Disclosed herein are methods, compositions and systems useful for genetically engineering subcellular compartments such as OMVs for synthetic biology applications. In an embodiment, genetically engineered bacteria use OMVs to secrete compounds or proteins of interest extracellularly, where they can be isolated from the growth media.
Type: Application
Filed: July 29, 2024
Publication date: February 27, 2025
Inventors: Rebecca Ann Wilkes, Allison Jean Zimont Werner, Tarryn E. Miller, Gregg Tyler Beckham
-
Publication number: 20250045819
Abstract: A method may include accessing a financial account of a user to detect an expenditure. The method may include determining a category of the expenditure. The method may include determining whether the user has a financial goal pertaining to the category. The method may include generating a user interface, the user interface including a non-textual visualization pertaining to the financial goal. The method may include presenting the user interface on a computing device of the user.
Type: Application
Filed: October 28, 2022
Publication date: February 6, 2025
Inventors: Himanshu Baral, Anthony Scott Best, Daniel Jerome Clifford, Cathy Ann Costa, Frank A. DiGangi, Marta Alejandra Gonzalez, Venkatesh Hebbar, Matthew J. Heffron, Brenda S. Makowski, Dale C. Miller, Alyssa R. Santiago, Michael J. Sbandi
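The category-matching step this abstract describes can be sketched as a small helper: given a detected expenditure, look up whether a goal exists for its category and, if so, compute a progress fraction for the non-textual visualization. The dict shapes and the capped-fraction choice are assumptions for illustration only.

```python
def goal_progress(expenditure, goals, spent_so_far):
    """Return the fraction of the category's goal consumed after this
    expenditure, or None when the user has no goal for the category.

    expenditure: {"category": str, "amount": float}  (assumed shape)
    goals: {category: goal_amount}
    spent_so_far: {category: amount_already_spent}
    """
    category = expenditure["category"]
    goal_amount = goals.get(category)
    if goal_amount is None:
        return None  # no financial goal pertains to this category
    spent = spent_so_far.get(category, 0.0) + expenditure["amount"]
    return min(spent / goal_amount, 1.0)  # cap at 100% for the visualization
```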
-
Patent number: 12213756
Abstract: A method for engaging and disengaging a surgical instrument of a surgical robotic system, comprising: receiving a plurality of interlock inputs from one or more interlock detection components of the surgical robotic system; determining, by one or more processors communicatively coupled to the interlock detection components, whether the plurality of interlock inputs indicate that each of the following interlock requirements is satisfied: (1) a user is looking toward a display, (2) at least one user interface device of the surgical robotic system is configured in a usable manner, and (3) a surgical workspace of the surgical robotic system is configured in a usable manner; in response to determining that each of the interlock requirements is satisfied, transitioning the surgical robotic system into a teleoperation mode; and in response to determining that fewer than all of the interlock requirements are satisfied, transitioning the surgical robotic system out of the teleoperation mode.
Type: Grant
Filed: September 19, 2023
Date of Patent: February 4, 2025
Assignee: Verb Surgical Inc.
Inventors: Joan Savall, Denise Ann Miller, Anette Lia Freiin von Kapri, Paolo Invernizzi, John Magnasco
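The interlock logic in this abstract reduces to an all-of-three gate: teleoperation is entered only when every requirement holds, and exited otherwise. A minimal sketch, with the three boolean inputs standing in for the system's interlock detection components:

```python
def teleop_transition(looking_at_display, ui_devices_usable, workspace_usable):
    """Return the target mode for the surgical robotic system.

    Enter teleoperation only when all three interlock requirements are
    satisfied; otherwise transition out of teleoperation. The boolean
    inputs are simplified stand-ins for the interlock detection components.
    """
    all_satisfied = looking_at_display and ui_devices_usable and workspace_usable
    return "teleoperation" if all_satisfied else "non-teleoperation"
```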
-
Publication number: 20250027061
Abstract: The present invention provides engineered RNA polymerase variants, including engineered T7 RNA polymerase variants, and compositions comprising these variants. These variants have been evolved for selective incorporation of the m7G(5′)ppp(5′)m7G cap analog over GTP at the initiation of in vitro transcription. The present invention also provides methods for selective capping of RNA transcripts.
Type: Application
Filed: October 8, 2024
Publication date: January 23, 2025
Inventors: Mathew G. Miller, Chinping Chng, Oscar Alvizo, Melissa Ann Mayo, James Nicholas Riggins, Xiang Yi, Jonathan S. Penfield, Gjalt W. Huisman, Jared Davis, Yasushi Saotome
-
Patent number: 12203493
Abstract: A flow restrictor is provided, comprising a first sheet including a flow passage, and a second sheet stacked on the first sheet. A hole is provided in the center of the second sheet. The flow passage includes a groove cut into a surface of the first sheet that communicates with an expansion zone at a peripheral area of the first sheet. A peripheral edge of the second sheet contacts the first sheet in the expansion zone, between an inner diameter and an outer diameter of the expansion zone.
Type: Grant
Filed: January 10, 2022
Date of Patent: January 21, 2025
Assignee: HORIBA STEC, Co., Ltd.
Inventors: Andrew Wayne Price, Era Benjamin Hartman, Esteban Daniel Gonzalez, William Wylie White, Virginia Ann Miller
-
Patent number: 12206223
Abstract: An electrical distribution enclosure includes an electrically insulated panel defining an access portion of the enclosure and a load portion of the enclosure; at least one electrically insulated compartment panel positioned within the access portion; and two or more compartments within the access portion, each compartment separated from an adjacent compartment by an electrically insulated compartment panel, each of the two or more compartments having a front-accessible neutral connection, at least one front-accessible phase connection, and at least one rear phase connection.
Type: Grant
Filed: June 2, 2022
Date of Patent: January 21, 2025
Assignee: ABB SCHWEIZ AG
Inventors: Rebecca Ann Waddell, Gilbert Taylor Miller, Zachary Wade Smith, Thomas Anthony Kendzia, III, Frank Allen Cowan
-
Patent number: 12175018
Abstract: A method for disengaging a surgical instrument of a surgical robotic system, comprising: receiving a gaze input from an eye tracker; determining, by one or more processors, whether the gaze input indicates the gaze of the user is outside or inside of the display; in response to determining the gaze input indicates the gaze of the user is outside of the display, determining an amount of time the gaze of the user is outside of the display; in response to determining the gaze of the user is outside of the display for less than a maximum amount of time, pausing the surgical robotic system from a teleoperation mode; and in response to determining the gaze of the user is outside of the display for more than the maximum amount of time, disengaging the surgical robotic system from the teleoperation mode.
Type: Grant
Filed: March 8, 2024
Date of Patent: December 24, 2024
Assignee: Verb Surgical Inc.
Inventors: Anette Lia Freiin von Kapri, Denise Ann Miller, Paolo Invernizzi, Joan Savall, John Magnasco
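The gaze-timeout behavior in this abstract is a three-way decision: continue while the gaze is on the display, pause for a brief glance away, and disengage after a prolonged absence. A hypothetical sketch; the threshold value is an assumption, since the abstract does not specify one:

```python
def gaze_action(gaze_outside, seconds_outside, max_seconds=2.0):
    """Decide the teleoperation response to the current gaze state.

    max_seconds is an assumed threshold; the patent does not give a value.
    """
    if not gaze_outside:
        return "continue"    # gaze is on the display: keep teleoperating
    if seconds_outside < max_seconds:
        return "pause"       # brief glance away: pause teleoperation
    return "disengage"       # prolonged absence: disengage entirely
```

Pausing preserves the session for a quick recovery, while disengaging requires the user to re-engage deliberately, which is the safety distinction the abstract draws.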
-
Patent number: 12133704
Abstract: Surgical systems including a user console for controlling a surgical robotic tool are described. A witness sensor and a reference sensor can be mounted on the user console to measure an electromagnetic field distortion near a location, and to measure deformation of the location, respectively. Distortion in the electromagnetic field can be detected based on the measurements from the witness sensor and the reference sensor. An alert can be generated, or teleoperation of the surgical tool can be adjusted or paused, when a user interface device used to control the surgical tool is within a range of the distortion. The distortion can be from a known source, such as from actuation of a haptic motor of the user interface device, and the user console can adjust the actuation to reduce the likelihood that the distortion will disrupt surgical tool control. Other embodiments are described and claimed.
Type: Grant
Filed: April 21, 2023
Date of Patent: November 5, 2024
Assignee: Verb Surgical Inc.
Inventors: Joan Savall, Denise Ann Miller, Hamid Reza Sani
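One plausible reading of the witness/reference pairing in this abstract: the witness sensor flags a field change, the reference sensor confirms the console itself did not deform (so the change is electromagnetic rather than mechanical), and an alert fires only when the user interface device is within range. A hedged sketch; the thresholds and the decision rule are assumptions, not the patented method:

```python
def alert_needed(witness_distortion, reference_deformation, uid_in_range,
                 distortion_threshold=0.5, deformation_threshold=0.1):
    """Flag a distortion alert (illustrative rule, assumed thresholds).

    witness_distortion: magnitude of field change seen by the witness sensor
    reference_deformation: deformation of the mounting location
    uid_in_range: whether the user interface device is within the
                  distortion's range
    """
    distorted = (witness_distortion > distortion_threshold
                 and reference_deformation < deformation_threshold)
    return distorted and uid_in_range
```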
-
Publication number: 20240355455
Abstract: A method performed by a surgical system. The method includes receiving 1) a video stream captured by a camera inside an operating room and 2) an audio stream that includes sounds captured by a microphone inside the operating room. The method detects an audio event within the operating room by performing an acoustic analysis upon the audio stream. The method produces a timestamp based on the detected audio event and based on an internal clock of the electronic device, and tags the timestamp to the video stream and the audio stream. The method stores the tagged video stream and the tagged audio stream in memory.
Type: Application
Filed: April 21, 2023
Publication date: October 24, 2024
Inventors: David Wallace, Daniel Ostler-Mildner, Salvatore Virga, Denise Ann Miller, Felix Bork
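The tagging step this abstract describes (timestamp a detected audio event from the device clock, attach the tag to both streams) can be sketched minimally. The dict-based stream representation is an assumption for illustration; the real system's stream format is not specified:

```python
def tag_event(video_stream, audio_stream, detected_event, clock_seconds):
    """Timestamp a detected audio event and tag it onto both streams.

    Streams are modeled as dicts holding a 'tags' list; clock_seconds is a
    reading from the device's internal clock.
    """
    tag = {"event": detected_event, "timestamp": clock_seconds}
    video_stream.setdefault("tags", []).append(tag)
    audio_stream.setdefault("tags", []).append(tag)
    return tag
```

Sharing the same tag object between streams keeps the two recordings aligned on a common timeline for later review.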
-
Patent number: 12108928
Abstract: Embodiments described herein provide various examples of a machine-learning-based visual-haptic system for constructing visual-haptic models for various interactions between surgical tools and tissues. In one aspect, a process for constructing a visual-haptic model is disclosed. This process can begin by receiving a set of training videos. The process then processes each training video in the set of training videos to extract one or more video segments that depict a target tool-tissue interaction from the training video, wherein the target tool-tissue interaction involves exerting a force by one or more surgical tools on a tissue. Next, for each video segment in the set of video segments, the process annotates each video image in the video segment with a set of force levels predefined for the target tool-tissue interaction. The process subsequently trains a machine-learning model using the annotated video images to obtain a trained machine-learning model for the target tool-tissue interaction.
Type: Grant
Filed: October 10, 2023
Date of Patent: October 8, 2024
Assignee: Verb Surgical Inc.
Inventors: Jagadish Venkataraman, Denise Ann Miller
-
Publication number: 20240288936
Abstract: A method for disengaging a surgical instrument of a surgical robotic system, comprising: receiving a gaze input from an eye tracker; determining, by one or more processors, whether the gaze input indicates the gaze of the user is outside or inside of the display; in response to determining the gaze input indicates the gaze of the user is outside of the display, determining an amount of time the gaze of the user is outside of the display; in response to determining the gaze of the user is outside of the display for less than a maximum amount of time, pausing the surgical robotic system from a teleoperation mode; and in response to determining the gaze of the user is outside of the display for more than the maximum amount of time, disengaging the surgical robotic system from the teleoperation mode.
Type: Application
Filed: March 8, 2024
Publication date: August 29, 2024
Inventors: Anette Lia Freiin von Kapri, Denise Ann Miller, Paolo Invernizzi, Joan Savall, John Magnasco
-
Patent number: 11960645
Abstract: A method for disengaging a surgical instrument of a surgical robotic system, comprising: receiving a gaze input from an eye tracker; determining, by one or more processors, whether the gaze input indicates the gaze of the user is outside or inside of the display; in response to determining the gaze input indicates the gaze of the user is outside of the display, determining an amount of time the gaze of the user is outside of the display; in response to determining the gaze of the user is outside of the display for less than a maximum amount of time, pausing the surgical robotic system from a teleoperation mode; and in response to determining the gaze of the user is outside of the display for more than the maximum amount of time, disengaging the surgical robotic system from the teleoperation mode.
Type: Grant
Filed: November 16, 2021
Date of Patent: April 16, 2024
Assignee: Verb Surgical Inc.
Inventors: Anette Lia Freiin von Kapri, Denise Ann Miller, Paolo Invernizzi, Joan Savall, John Magnasco
-
Patent number: 11944402
Abstract: Surgical robotic systems, and methods of verifying functionality of a user interface device of such systems, are described. During a surgical procedure, the user interface device controls motion of a surgical tool. Proximity sensors of the user interface device generate proximity measures throughout the surgical procedure. The proximity measures are used to detect whether the user interface device is dropped and to responsively halt motion of the surgical tool. To verify an accuracy of the proximity sensors that provide the drop detection, a test is performed when the user interface device is placed in a dock. The test compares the generated proximity measures to expected proximity data. When the proximity measures match the expected proximity data, the system determines that the proximity sensors are functioning accurately and verifies that the user interface device is functioning safely. Other embodiments are described and claimed.
Type: Grant
Filed: September 18, 2020
Date of Patent: April 2, 2024
Assignee: Verb Surgical Inc.
Inventors: Karan Handa, Denise Ann Miller, Joan Savall, Mufaddal Jhaveri
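The dock-time self test this abstract describes is a comparison of measured proximity readings against expected docked values. A minimal sketch, assuming per-channel readings and an illustrative tolerance (the abstract does not state how "match" is defined):

```python
def sensors_verified(measured, expected, tolerance=0.05):
    """Dock-time verification of the drop-detection proximity sensors.

    Every proximity channel must match its expected docked value within
    an assumed tolerance for the device to be declared safe; any mismatch
    fails verification.
    """
    return all(abs(m - e) <= tolerance for m, e in zip(measured, expected))
```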
-
Publication number: 20240099555
Abstract: Embodiments described herein provide various examples of a machine-learning-based visual-haptic system for constructing visual-haptic models for various interactions between surgical tools and tissues. In one aspect, a process for constructing a visual-haptic model is disclosed. This process can begin by receiving a set of training videos. The process then processes each training video in the set of training videos to extract one or more video segments that depict a target tool-tissue interaction from the training video, wherein the target tool-tissue interaction involves exerting a force by one or more surgical tools on a tissue. Next, for each video segment in the set of video segments, the process annotates each video image in the video segment with a set of force levels predefined for the target tool-tissue interaction. The process subsequently trains a machine-learning model using the annotated video images to obtain a trained machine-learning model for the target tool-tissue interaction.
Type: Application
Filed: October 10, 2023
Publication date: March 28, 2024
Inventors: Jagadish Venkataraman, Denise Ann Miller
-
Publication number: 20240081929
Abstract: A method for engaging and disengaging a surgical instrument of a surgical robotic system, comprising: receiving a plurality of interlock inputs from one or more interlock detection components of the surgical robotic system; determining, by one or more processors communicatively coupled to the interlock detection components, whether the plurality of interlock inputs indicate that each of the following interlock requirements is satisfied: (1) a user is looking toward a display, (2) at least one user interface device of the surgical robotic system is configured in a usable manner, and (3) a surgical workspace of the surgical robotic system is configured in a usable manner; in response to determining that each of the interlock requirements is satisfied, transitioning the surgical robotic system into a teleoperation mode; and in response to determining that fewer than all of the interlock requirements are satisfied, transitioning the surgical robotic system out of the teleoperation mode.
Type: Application
Filed: September 19, 2023
Publication date: March 14, 2024
Inventors: Joan Savall, Denise Ann Miller, Anette Lia Freiin von Kapri, Paolo Invernizzi, John Magnasco
-
Publication number: 20240033023
Abstract: A method performed by a surgical robotic system that includes a seat that is arranged for a user to sit and a display column that includes at least one display for displaying a three-dimensional (3D) surgical presentation. The method includes receiving an indication that the user has manually adjusted the seat and in response, determining, while the user is sitting on the seat, a position of the user's eyes, determining a configuration for the display column based on the determined position of the user's eyes, and adjusting the display column by actuating one or more actuators of the display column according to the determined configuration.
Type: Application
Filed: October 16, 2023
Publication date: February 1, 2024
Inventors: Anette Lia Freiin von Kapri, Joan Savall, Denise Ann Miller
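The eye-position-to-configuration mapping this abstract describes could, under simple assumptions, look like the following: center the display at eye height and offset it to an assumed comfortable viewing distance. The geometry, field names, and default distance are all hypothetical illustration, not the patented configuration logic:

```python
def display_column_targets(eye_height_mm, eye_forward_mm,
                           viewing_distance_mm=600):
    """Map a measured eye position to actuator targets for the display column.

    eye_height_mm: measured height of the user's eyes
    eye_forward_mm: measured horizontal distance from the column to the eyes
    viewing_distance_mm: assumed comfortable screen-to-eye distance
    """
    return {
        "column_height_mm": eye_height_mm,
        # Move the column forward so the screen sits at the target distance,
        # but never behind its home position.
        "column_forward_mm": max(0.0, eye_forward_mm - viewing_distance_mm),
    }
```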
-
Patent number: 11826115
Abstract: A method performed by a surgical robotic system that includes a seat that is arranged for a user to sit and a display column that includes at least one display for displaying a three-dimensional (3D) surgical presentation. The method includes receiving an indication that the user has manually adjusted the seat and in response, determining, while the user is sitting on the seat, a position of the user's eyes, determining a configuration for the display column based on the determined position of the user's eyes, and adjusting the display column by actuating one or more actuators of the display column according to the determined configuration.
Type: Grant
Filed: September 14, 2020
Date of Patent: November 28, 2023
Assignee: Verb Surgical Inc.
Inventors: Anette Lia Freiin von Kapri, Joan Savall, Denise Ann Miller
-
Patent number: 11819188
Abstract: Embodiments described herein provide various examples of a machine-learning-based visual-haptic system for constructing visual-haptic models for various interactions between surgical tools and tissues. In one aspect, a process for constructing a visual-haptic model is disclosed. This process can begin by receiving a set of training videos. The process then processes each training video in the set of training videos to extract one or more video segments that depict a target tool-tissue interaction from the training video, wherein the target tool-tissue interaction involves exerting a force by one or more surgical tools on a tissue. Next, for each video segment in the set of video segments, the process annotates each video image in the video segment with a set of force levels predefined for the target tool-tissue interaction. The process subsequently trains a machine-learning model using the annotated video images to obtain a trained machine-learning model for the target tool-tissue interaction.
Type: Grant
Filed: February 8, 2023
Date of Patent: November 21, 2023
Assignee: Verb Surgical Inc.
Inventors: Jagadish Venkataraman, Denise Ann Miller