Patents by Inventor Paul F. Sorenson
Paul F. Sorenson has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Publication number: 20210137451
Abstract: A system is described to facilitate breathing management. The system includes a plurality of wearable sensors, and one or more processors to access a breathing pattern of a wearer of the plurality of wearable sensors based on at least a portion of data from one or more of the plurality of sensors, determine a body movement of the wearer based on at least a portion of the data from one or more of the plurality of sensors, access a heart rate of the wearer based on at least a portion of the data from one or more of the plurality of sensors, and identify a recovery time for the wearer based on the breathing pattern, the body movement, and the heart rate.
Type: Application
Filed: January 20, 2021
Publication date: May 13, 2021
Inventors: Mei Lu, Barrett Kenney, Jason Janavicius, Saurabh Dadu, David P. Kuhns, Paul F. Sorenson
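The abstract above combines a breathing pattern, body movement, and heart rate into a recovery-time estimate. A minimal Python sketch of that idea follows; the data model, resting baselines, and weights are all hypothetical, not taken from the patent:

```python
from dataclasses import dataclass

@dataclass
class SensorSample:
    breaths_per_min: float
    movement_level: float    # 0.0 (still) .. 1.0 (vigorous), hypothetical scale
    heart_rate_bpm: float

def estimate_recovery_seconds(sample: SensorSample,
                              resting_hr: float = 60.0,
                              resting_breaths: float = 14.0) -> float:
    """Combine breathing pattern, body movement, and heart rate into a
    single recovery-time estimate (an illustrative linear model only)."""
    hr_excess = max(0.0, sample.heart_rate_bpm - resting_hr)
    breath_excess = max(0.0, sample.breaths_per_min - resting_breaths)
    # Weights below are placeholders, not values from the patent.
    return 2.0 * hr_excess + 3.0 * breath_excess + 30.0 * sample.movement_level

print(estimate_recovery_seconds(SensorSample(22.0, 0.5, 120.0)))  # 159.0
```

A wearer at resting baselines with no movement yields a recovery time of zero, which is the sanity check for any such model.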
-
Patent number: 10905373
Abstract: A method is described to facilitate breathing management. The method includes receiving sensory data from one or more wearable devices, recording a breathing pattern of a wearer of the one or more wearable devices based on data received from the sensor array, calculating a body movement of the wearer based on the data received from the sensor array, and correlating a wearer breathing pattern with the wearer body movement and heart rate.
Type: Grant
Filed: September 24, 2015
Date of Patent: February 2, 2021
Assignee: Intel Corporation
Inventors: Mei Lu, Barrett Kenney, Jason Janavicius, Saurabh Dadu, David P. Kuhns, Paul F. Sorenson
-
Patent number: 10082940
Abstract: Various systems and methods for implementing text functions in augmented reality are described herein. A system for implementing text functions in augmented reality includes a display to display a field of view to a user of the system; a gesture detection module to detect a selection gesture performed by a user of the system, the selection gesture defining a selection area in the field of view; a camera array to capture an image of the selection area; a text module to perform a text operation on text identified in the image; and a presentation module to present an indication of the text operation to the user.
Type: Grant
Filed: July 3, 2017
Date of Patent: September 25, 2018
Assignee: Intel Corporation
Inventors: Emily N. Ivers, Kahyun Kim, Paul F. Sorenson
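The abstract above describes a pipeline: a gesture defines a selection area, the camera array captures it, a text module operates on the recognized text, and an indication is presented. A minimal sketch, assuming a stubbed recognizer in place of the camera array and OCR (all names and operations are hypothetical):

```python
def run_text_pipeline(frame: dict, selection: tuple, operation: str) -> str:
    """Hypothetical AR text pipeline: the gesture defines a selection area
    (x0, y0, x1, y1); a stubbed recognizer stands in for the camera array
    and OCR; a text operation is applied and its indication returned."""
    recognized = frame.get(selection, "")  # stub: maps a region to its text
    operations = {
        "copy": lambda t: f"Copied: {t}",
        "uppercase": str.upper,
    }
    return operations[operation](recognized)

# A fake captured frame with text "hello world" in one region.
frame = {(0, 0, 100, 40): "hello world"}
print(run_text_pipeline(frame, (0, 0, 100, 40), "uppercase"))  # HELLO WORLD
```

The `operations` table mirrors the claim structure: the text module is a pluggable operation applied after recognition, independent of how the selection was captured.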
-
Publication number: 20180114366
Abstract: Various systems and methods for implementing text functions in augmented reality are described herein. A system for implementing text functions in augmented reality includes a display to display a field of view to a user of the system; a gesture detection module to detect a selection gesture performed by a user of the system, the selection gesture defining a selection area in the field of view; a camera array to capture an image of the selection area; a text module to perform a text operation on text identified in the image; and a presentation module to present an indication of the text operation to the user.
Type: Application
Filed: July 3, 2017
Publication date: April 26, 2018
Inventors: Emily N. Ivers, Kahyun Kim, Paul F. Sorenson
-
Patent number: 9942627
Abstract: In embodiments, apparatuses, methods, and computer-readable media are described that are associated with a context-based information presentation system ("CIP"). The CIP may be configured to modify presentation of information to a user during performance of a physical activity. The CIP may be configured to receive information describing the physical activity and to identify a context for the physical activity based on the received information. The identification of the context may be performed through application of one or more context rules to the information describing the physical activity received by the CIP. The CIP may be configured to modify presentation of information to the user based on the identified context. Other embodiments may be described and claimed.
Type: Grant
Filed: September 12, 2014
Date of Patent: April 10, 2018
Assignee: Intel Corporation
Inventors: Kahyun Kim, Paul F. Sorenson
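The abstract above applies context rules to activity data and then modifies what is presented. A minimal sketch of that rule-driven flow; the rules, contexts, and presentation fields are invented for illustration:

```python
def identify_context(activity: dict, rules: list) -> str:
    """Apply context rules in order; the first rule whose predicate
    matches the activity data determines the context."""
    for predicate, context in rules:
        if predicate(activity):
            return context
    return "default"

# Hypothetical context rules keyed on sensor-derived activity fields.
RULES = [
    (lambda a: a.get("speed_kmh", 0) > 15, "cycling"),
    (lambda a: a.get("cadence_spm", 0) > 140, "running"),
]

def presentation_for(context: str) -> dict:
    """Modify the presented information based on the identified context."""
    return {
        "cycling": {"fields": ["speed", "distance"]},
        "running": {"fields": ["pace", "heart_rate"]},
        "default": {"fields": ["time"]},
    }[context]

print(presentation_for(identify_context({"speed_kmh": 22}, RULES)))
```

First-match rule ordering is one simple way to resolve overlapping contexts; the patent's claims do not specify a resolution strategy, so this is purely an illustrative choice.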
-
Patent number: 9912902
Abstract: Systems and techniques for an automatic event recorder are described herein. An input stream may be written to a memory buffer in an overwrite mode. The input stream may be received from a sensor array of a wearable device. A first portion of the input stream written to the memory buffer may be protected upon obtaining an indication of occurrence of a start event. The memory buffer may be configured to receive a second portion of the input stream in a write mode subsequent to the occurrence of the start event. The memory buffer may be reconfigured to receive a third portion of the input stream in the overwrite mode upon obtaining an indication of the occurrence of a stop event.
Type: Grant
Filed: December 16, 2015
Date of Patent: March 6, 2018
Assignee: Intel Corporation
Inventors: Benjamin Conrad Oien, Paul F. Sorenson, Kahyun Kim, Emily N. Ivers
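The buffer behavior above (overwrite mode before a start event, protected append until a stop event, then overwrite again) is essentially a pre-roll ring buffer. A minimal sketch using a bounded `deque` as the overwrite-mode buffer; class and method names are hypothetical:

```python
from collections import deque

class EventRecorder:
    """Sketch of the described recorder: overwrite mode keeps only the most
    recent samples; a start event protects what is already buffered and
    switches to append (write) mode; a stop event restores overwrite mode."""

    def __init__(self, prebuffer_len: int):
        self.prebuffer = deque(maxlen=prebuffer_len)  # overwrite mode
        self.protected: list = []
        self.recording = False

    def write(self, sample):
        if self.recording:
            self.protected.append(sample)  # write mode: nothing discarded
        else:
            self.prebuffer.append(sample)  # oldest samples overwritten

    def start_event(self):
        # Protect the first portion already in the buffer.
        self.protected = list(self.prebuffer)
        self.recording = True

    def stop_event(self):
        self.recording = False
        self.prebuffer.clear()

rec = EventRecorder(prebuffer_len=3)
for s in range(5):            # samples 0..4; only 2, 3, 4 survive overwrite
    rec.write(s)
rec.start_event()
rec.write(5); rec.write(6)    # appended in write mode
rec.stop_event()
print(rec.protected)          # [2, 3, 4, 5, 6]
```

The payoff of the design is visible in the output: the protected recording includes samples captured *before* the start event, which a recorder that only starts on the event would lose.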
-
Publication number: 20170287217
Abstract: Various systems and methods for providing alerts of preceding traffic are described herein. A system for providing alerts of preceding traffic comprising a processor installed on a trailing vehicle operated by a user, the processor to: receive image data from a camera and identify a preceding vehicle in front of the trailing vehicle; receive data from a distance sensor to detect a change in relative velocity between the preceding vehicle and the trailing vehicle that exceeds a threshold; and cause augmented reality content to be displayed in a head-mounted display worn by the user, the augmented reality content to alert the user of the change in relative velocity.
Type: Application
Filed: March 30, 2016
Publication date: October 5, 2017
Inventors: Kahyun Kim, Paul F. Sorenson, Emily N. Ivers
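The alerting condition above reduces to estimating closing speed from distance-sensor readings and comparing it to a threshold. A minimal sketch; the threshold value and function names are hypothetical:

```python
def relative_velocity(d_prev: float, d_now: float, dt: float) -> float:
    """Closing speed in m/s from two distance-sensor readings taken dt
    seconds apart; positive means the gap to the preceding vehicle
    is shrinking."""
    return (d_prev - d_now) / dt

def should_alert(d_prev: float, d_now: float, dt: float,
                 threshold_mps: float = 5.0) -> bool:
    """Trigger the AR alert when the closing speed between preceding and
    trailing vehicle exceeds a threshold (value here is illustrative)."""
    return relative_velocity(d_prev, d_now, dt) > threshold_mps

print(should_alert(30.0, 24.0, 1.0))  # gap shrank 6 m in 1 s -> True
```

A real system would smooth the sensor readings before differencing; a single noisy pair of distances would otherwise produce spurious alerts.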
-
Publication number: 20170277257
Abstract: Various systems and methods for implementing gaze-based sound selection are described herein. A system for gaze-based sound selection includes a gaze detection circuit to determine a gaze direction of a user, the gaze direction being toward an object; an audio capture mechanism to obtain audio data from the object, the audio capture mechanism selectively configured based on the gaze direction; an audio transformation circuit to transform the audio data to an output data; and a presentation mechanism to present the output data to the user.
Type: Application
Filed: March 23, 2016
Publication date: September 28, 2017
Inventors: Jeffrey Ota, Kahyun Kim, Alexander Essaian, Emily N. Ivers, Jeremy Miossec-Backer, Paul F. Sorenson
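"Selectively configuring" audio capture based on gaze direction can be pictured as weighting microphones by how closely they point toward the gazed-at object. A minimal sketch with an invented triangular falloff; the beam width and all names are hypothetical, not from the patent:

```python
def steer_gain(mic_angle_deg: float, gaze_deg: float,
               width_deg: float = 30.0) -> float:
    """Hypothetical capture weighting: a microphone aimed at the gaze
    direction contributes fully; gain falls off linearly to zero outside
    the beam width. Angles wrap around 360 degrees."""
    diff = abs((mic_angle_deg - gaze_deg + 180.0) % 360.0 - 180.0)
    return max(0.0, 1.0 - diff / width_deg)

def capture(mics: dict, gaze_deg: float) -> float:
    """Gaze-steered mix: sum of per-microphone signal levels, each
    weighted by its steering gain."""
    return sum(level * steer_gain(angle, gaze_deg)
               for angle, level in mics.items())

# Three microphones at 0, 90, and 180 degrees, each with unit level.
mics = {0.0: 1.0, 90.0: 1.0, 180.0: 1.0}
print(capture(mics, 0.0))  # only the mic at 0 degrees contributes: 1.0
```

Real implementations would use delay-and-sum or adaptive beamforming on the raw signals rather than scalar levels; the gain table only illustrates the selection step.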
-
Patent number: 9697648
Abstract: Various systems and methods for implementing text functions in augmented reality are described herein. A system for implementing text functions in augmented reality includes a display to display a field of view to a user of the system; a gesture detection module to detect a selection gesture performed by a user of the system, the selection gesture defining a selection area in the field of view; a camera array to capture an image of the selection area; a text module to perform a text operation on text identified in the image; and a presentation module to present an indication of the text operation to the user.
Type: Grant
Filed: December 23, 2015
Date of Patent: July 4, 2017
Assignee: Intel Corporation
Inventors: Emily N. Ivers, Kahyun Kim, Paul F. Sorenson
-
Publication number: 20170186230
Abstract: Various systems and methods for implementing text functions in augmented reality are described herein. A system for implementing text functions in augmented reality includes a display to display a field of view to a user of the system; a gesture detection module to detect a selection gesture performed by a user of the system, the selection gesture defining a selection area in the field of view; a camera array to capture an image of the selection area; a text module to perform a text operation on text identified in the image; and a presentation module to present an indication of the text operation to the user.
Type: Application
Filed: December 23, 2015
Publication date: June 29, 2017
Inventors: Emily N. Ivers, Kahyun Kim, Paul F. Sorenson
-
Publication number: 20170178380
Abstract: A method is described to facilitate real-time visualization. The method includes receiving sensory data from one or more wearable devices, determining a real-time body position of a user based on the sensory data, generating an image of the user based on the real-time body position, and displaying the image of the user at an optical head-mounted display computing device.
Type: Application
Filed: December 21, 2015
Publication date: June 22, 2017
Applicant: Intel Corporation
Inventors: Emily N. Ivers, Kahyun Kim, Paul F. Sorenson, Brian R. Fairbanks, Ronald T. Azuma, Jeremy Miossec-Backer
-
Publication number: 20170180676
Abstract: Systems and techniques for an automatic event recorder are described herein. An input stream may be written to a memory buffer in an overwrite mode. The input stream may be received from a sensor array of a wearable device. A first portion of the input stream written to the memory buffer may be protected upon obtaining an indication of occurrence of a start event. The memory buffer may be configured to receive a second portion of the input stream in a write mode subsequent to the occurrence of the start event. The memory buffer may be reconfigured to receive a third portion of the input stream in the overwrite mode upon obtaining an indication of the occurrence of a stop event.
Type: Application
Filed: December 16, 2015
Publication date: June 22, 2017
Inventors: Benjamin Conrad Oien, Paul F. Sorenson, Kahyun Sophie Kim, Emily N. Ivers
-
Publication number: 20170086730
Abstract: A method is described to facilitate breathing management.
Type: Application
Filed: September 24, 2015
Publication date: March 30, 2017
Inventors: Mei Lu, Barrett Kenney, Jason Paul Janavicius, Saurabh Dadu, David P. Kuhns, Paul F. Sorenson
-
Publication number: 20160080838
Abstract: In embodiments, apparatuses, methods, and computer-readable media are described that are associated with a context-based information presentation system ("CIP"). The CIP may be configured to modify presentation of information to a user during performance of a physical activity. The CIP may be configured to receive information describing the physical activity and to identify a context for the physical activity based on the received information. The identification of the context may be performed through application of one or more context rules to the information describing the physical activity received by the CIP. The CIP may be configured to modify presentation of information to the user based on the identified context. Other embodiments may be described and claimed.
Type: Application
Filed: September 12, 2014
Publication date: March 17, 2016
Inventors: Kahyun Kim, Paul F. Sorenson
-
Patent number: 8533630
Abstract: A processing system may include an array of input/output (I/O) devices (e.g., an array of microphones, an array of antennas, etc.). A device array control program for the processing system may provide a user interface for controlling the array of I/O devices. The user interface may include an array of objects representing the array of I/O devices, as well as a set of controls near the array of objects. The controls may include a right handle, a left handle, and a gain handle. The left handle and the right handle may be moved by a user to control which I/O devices will be used, and the gain handle may be moved to control gain for at least one of the I/O devices. Other embodiments are described and claimed.
Type: Grant
Filed: September 5, 2006
Date of Patent: September 10, 2013
Assignee: Intel Corporation
Inventor: Paul F. Sorenson
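The handle model above (left and right handles bound the active devices, a gain handle sets their level) maps naturally onto a small state object. A minimal sketch; the class and method names are invented for illustration:

```python
class DeviceArrayControl:
    """Sketch of the described UI model: the left and right handles select
    a contiguous range of active devices in the array; the gain handle
    sets the gain applied to them."""

    def __init__(self, n_devices: int):
        self.n = n_devices
        self.left, self.right = 0, n_devices - 1  # all devices active
        self.gain = 1.0

    def move_left_handle(self, index: int):
        # The left handle cannot cross the right handle or leave the array.
        self.left = max(0, min(index, self.right))

    def move_right_handle(self, index: int):
        self.right = min(self.n - 1, max(index, self.left))

    def move_gain_handle(self, gain: float):
        self.gain = max(0.0, gain)

    def active_devices(self) -> list:
        return list(range(self.left, self.right + 1))

# E.g., keep only microphones 2..5 of an 8-mic array at 75% gain.
mics = DeviceArrayControl(8)
mics.move_left_handle(2)
mics.move_right_handle(5)
mics.move_gain_handle(0.75)
print(mics.active_devices(), mics.gain)  # [2, 3, 4, 5] 0.75
```

Clamping each handle against the other keeps the selection a valid, non-empty range no matter how the user drags, which is the main invariant such a UI needs.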
-
Publication number: 20080055238
Abstract: A processing system may include an array of input/output (I/O) devices (e.g., an array of microphones, an array of antennas, etc.). A device array control program for the processing system may provide a user interface for controlling the array of I/O devices. The user interface may include an array of objects representing the array of I/O devices, as well as a set of controls near the array of objects. The controls may include a right handle, a left handle, and a gain handle. The left handle and the right handle may be moved by a user to control which I/O devices will be used, and the gain handle may be moved to control gain for at least one of the I/O devices. Other embodiments are described and claimed.
Type: Application
Filed: September 5, 2006
Publication date: March 6, 2008
Inventor: Paul F. Sorenson