Patents by Inventor Tynan J. Garrett

Tynan J. Garrett has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20170134374
    Abstract: A method for receiving a virtual machine image comprises configuring a user device to receive a biometric input from a user, send a request for a virtual machine image to a server, the request including data corresponding to the biometric input, and receive the virtual machine image. (A minimal, illustrative code sketch of this request flow follows the listing.)
    Type: Application
    Filed: November 5, 2015
    Publication date: May 11, 2017
    Inventors: Eli M. Dow, Thomas D. Fitzsimmons, Tynan J. Garrett, Emily M. Metruck
  • Patent number: 9619638
    Abstract: Embodiments of the present invention provide a method and system for enacting various driving profiles based on identifying the age and gender of the user. Initially, identifying information is acquired from a user of a vehicle. An identification program determines whether the driver is known by comparing the identifying information to a set of stored identifying information. If the driver of the vehicle is known (the identifying information from the user is similar to stored identifying information), a specific driver profile associated with that driver is activated. If the driver is not known (the identifying information from the user is not similar to the stored identifying information), the age and gender of the user of the vehicle are determined. Based on the determined age and gender of the driver of the vehicle, various vehicle setting ranges are enacted. (A minimal, illustrative code sketch of this selection logic, which also covers publications 20170057436 and 20170057438, follows the listing.)
    Type: Grant
    Filed: August 25, 2015
    Date of Patent: April 11, 2017
    Assignee: International Business Machines Corporation
    Inventors: Eli M. Dow, Thomas D. Fitzsimmons, Tynan J. Garrett, Emily M. Metruck, Charles J. Stocker, IV
  • Publication number: 20170083102
    Abstract: Embodiments of the present invention provide a method and system for sharing content between devices, where one of the devices is a wearable device. The wearable device is configured to detect a second device, detect movement, and send a data file wirelessly to the second device. Initially, a set of movement data and an associated data file are stored in the wearable device. The movement data may be a gesture such as a handshake, a high-five, or a fist bump. Once the wearable device receives at least one movement, it determines whether the movement is similar to the set of stored movements. If the received movement is similar to the stored movements, the wearable device sends the associated data file to the second device. Based on the determined gesture, the wearable device may send different files. (A minimal, illustrative code sketch of this gesture matching, which also covers publication 20170083101, follows the listing.)
    Type: Application
    Filed: November 4, 2015
    Publication date: March 23, 2017
    Inventors: Eli M. Dow, Thomas D. Fitzsimmons, Tynan J. Garrett, Emily M. Metruck
  • Publication number: 20170083101
    Abstract: Embodiments of the present invention provide a method and system for sharing content between devices, where one of the devices is a wearable device. The wearable device is configured to detect a second device, detect movement, and send a data file wirelessly to the second device. Initially, a set of movement data and an associated data file are stored in the wearable device. The movement data may be a gesture such as a handshake, a high-five, or a fist bump. Once the wearable device receives at least one movement, it determines whether the movement is similar to the set of stored movements. If the received movement is similar to the stored movements, the wearable device sends the associated data file to the second device. Based on the determined gesture, the wearable device may send different files.
    Type: Application
    Filed: September 17, 2015
    Publication date: March 23, 2017
    Inventors: Eli M. Dow, Thomas D. Fitzsimmons, Tynan J. Garrett, Emily M. Metruck
  • Patent number: 9600985
    Abstract: Aspects include a system for providing feedback to a user with a wearable device. The system includes a wearable device and a feedback mechanism disposed within the wearable device, the feedback mechanism being configured to vibrate the wearable device. A memory is provided having computer readable instructions, and one or more processors are provided for executing the computer readable instructions, the one or more processors being coupled to communicate with the feedback mechanism. The computer readable instructions include defining at least one prompt point that is associated with a first time parameter, determining when the current time is equal to the first time parameter, and actuating the feedback mechanism in response to the current time being equal to the first time parameter. (A minimal, illustrative code sketch of this prompt-point check follows the listing.)
    Type: Grant
    Filed: November 12, 2015
    Date of Patent: March 21, 2017
    Assignee: International Business Machines Corporation
    Inventors: Eli M. Dow, Thomas D. Fitzsimmons, Tynan J. Garrett, Emily M. Metruck
  • Publication number: 20170057436
    Abstract: Embodiments of the present invention provide a method and system for enacting various driving profiles based on identifying the age and gender of the user. Initially, identifying information is acquired from a user of a vehicle. An identification program determines whether the driver is known by comparing the identifying information to a set of stored identifying information. If the driver of the vehicle is known (the identifying information from the user is similar to stored identifying information), a specific driver profile associated with that driver is activated. If the driver is not known (the identifying information from the user is not similar to the stored identifying information), the age and gender of the user of the vehicle are determined. Based on the determined age and gender of the driver of the vehicle, various vehicle setting ranges are enacted.
    Type: Application
    Filed: August 25, 2015
    Publication date: March 2, 2017
    Inventors: Eli M. Dow, Thomas D. Fitzsimmons, Tynan J. Garrett, Emily M. Metruck, Charles J. Stocker, IV
  • Publication number: 20170061989
    Abstract: A computer-implemented method includes determining that a meeting has been initiated between a first user and a second user, wherein vocal and video recordings are produced for at least the first user. The method receives the vocal and video recordings for the first user. The method analyzes the vocal and video recordings for the first user according to one or more parameters for speech and one or more parameters for gestures. The method determines one or more emotions and a role in the meeting for the first user based at least on the analyzed vocal and video recordings. The method sends an output of the analysis to at least one of the first user and the second user, wherein the output of the analysis includes at least the determined one or more emotions and the role in the meeting for the first user. (A minimal, illustrative code sketch of this analysis flow, which also covers publication 20170061967, follows the listing.)
    Type: Application
    Filed: September 2, 2015
    Publication date: March 2, 2017
    Inventors: Eli M. Dow, Thomas D. Fitzsimmons, Tynan J. Garrett, Emily M. Metruck
  • Publication number: 20170057438
    Abstract: Embodiments of the present invention provide a method and system for enacting various driving profiles based on identifying the age and gender of the user. Initially, identifying information is acquired from a user of a vehicle. An identification program determines whether the driver is known by comparing the identifying information to a set of stored identifying information. If the driver of the vehicle is known (the identifying information from the user is similar to stored identifying information), a specific driver profile associated with that driver is activated. If the driver is not known (the identifying information from the user is not similar to the stored identifying information), the age and gender of the user of the vehicle are determined. Based on the determined age and gender of the driver of the vehicle, various vehicle setting ranges are enacted.
    Type: Application
    Filed: November 3, 2015
    Publication date: March 2, 2017
    Inventors: Eli M. Dow, Thomas D. Fitzsimmons, Tynan J. Garrett, Emily M. Metruck, Charles J. Stocker, IV
  • Publication number: 20170061967
    Abstract: A computer-implemented method includes determining that a meeting has been initiated between a first user and a second user, wherein vocal and video recordings are produced for at least the first user. The method receives the vocal and video recordings for the first user. The method analyzes the vocal and video recordings for the first user according to one or more parameters for speech and one or more parameters for gestures. The method determines one or more emotions and a role in the meeting for the first user based at least on the analyzed vocal and video recordings. The method sends an output of the analysis to at least one of the first user and the second user, wherein the output of the analysis includes at least the determined one or more emotions and the role in the meeting for the first user.
    Type: Application
    Filed: October 21, 2015
    Publication date: March 2, 2017
    Inventors: Eli M. Dow, Thomas D. Fitzsimmons, Tynan J. Garrett, Emily M. Metruck
  • Publication number: 20160370907
    Abstract: Embodiments of the present invention relate to directing the orientation of a screen by determining the orientation of a user's fingerprint. A user's finger position may be determined when the user's finger is a predetermined distance from or in contact with a touch-sensitive device. The user's finger position may be compared with a finger position database. The finger position database may include known finger positions, each associated with a screen orientation. A match analysis may be performed to determine which known finger position most closely resembles the user's finger position. A screen orientation associated with the known finger position most closely matching the user's finger position may be selected. Once the screen orientation is selected, an image may be displayed on the screen corresponding with the selected screen orientation. (A minimal, illustrative code sketch of this match analysis, which also covers publication 20160372088, follows the listing.)
    Type: Application
    Filed: September 1, 2015
    Publication date: December 22, 2016
    Inventors: Eli M. Dow, Thomas D. Fitzsimmons, Tynan J. Garrett, Emily M. Metruck
  • Publication number: 20160372088
    Abstract: Embodiments of the present invention relate to directing the orientation of a screen by determining the orientation of a user's fingerprint. A user's finger position may be determined when the user's finger is a predetermined distance from or in contact with a touch-sensitive device. The user's finger position may be compared with a finger position database. The finger position database may include known finger positions, each associated with a screen orientation. A match analysis may be performed to determine which known finger position most closely resembles the user's finger position. A screen orientation associated with the known finger position most closely matching the user's finger position may be selected. Once the screen orientation is selected, an image may be displayed on the screen corresponding with the selected screen orientation.
    Type: Application
    Filed: June 17, 2015
    Publication date: December 22, 2016
    Inventors: Eli M. Dow, Thomas D. Fitzsimmons, Tynan J. Garrett, Emily M. Metruck
  • Publication number: 20160307001
    Abstract: Embodiments of the present invention provide methods and systems for numeric keypad encryption using an augmented reality device. The method may include establishing a secure connection to an augmented reality device. A random keypad layout is generated and sent to the augmented reality device, which displays the random layout in the augmented reality view over a real-world numeric keypad. (A minimal, illustrative code sketch of this layout randomization follows the listing.)
    Type: Application
    Filed: April 14, 2015
    Publication date: October 20, 2016
    Inventors: Eli M. Dow, Thomas D. Fitzsimmons, Tynan J. Garrett, Joseph D. Harvey, Emily M. Metruck, Douglas E. Rohde, Charles J. Stocker, IV
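
Illustrative code sketches

The short Python sketches below are editorial illustrations of the flows described in the abstracts above. They are not taken from the patent filings; every function name, endpoint, threshold, and data structure is a hypothetical stand-in.

Biometric-gated virtual machine image request (publication 20170134374). A minimal sketch of the request flow, assuming a hypothetical endpoint (VM_IMAGE_SERVER) and a stubbed sensor read (capture_fingerprint): the device captures a biometric input, attaches data derived from it to the request, and receives the image in return.

# Hypothetical names throughout: VM_IMAGE_SERVER, capture_fingerprint, request_vm_image.
import hashlib
import json

VM_IMAGE_SERVER = "https://vm-images.example.com/request"  # hypothetical endpoint


def capture_fingerprint() -> bytes:
    """Stand-in for a biometric sensor read; returns raw sample bytes."""
    return b"example-fingerprint-sample"


def build_request(biometric_sample: bytes, image_name: str) -> dict:
    """Package the request so it carries data corresponding to the biometric input."""
    return {
        "image": image_name,
        # Send a digest of the sample rather than the raw bytes (an illustrative choice).
        "biometric_digest": hashlib.sha256(biometric_sample).hexdigest(),
    }


def request_vm_image(image_name: str) -> bytes:
    """Capture a biometric input, send the request, and receive the VM image."""
    sample = capture_fingerprint()
    request = build_request(sample, image_name)
    # A real device would POST the request to VM_IMAGE_SERVER; here it is only printed.
    print("Would send to", VM_IMAGE_SERVER, ":", json.dumps(request))
    return b"<virtual machine image bytes>"  # placeholder for the server's response


if __name__ == "__main__":
    image = request_vm_image("dev-workstation")
    print("Received", len(image), "bytes")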
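
Driver profile selection (patent 9619638; publications 20170057436 and 20170057438). A minimal sketch of the selection logic, assuming hypothetical stored profiles, a stubbed similarity check, and made-up setting ranges: a recognised driver gets their stored profile, while an unknown driver gets default ranges keyed to the estimated age (gender handling is omitted for brevity).

from dataclasses import dataclass


@dataclass
class DriverProfile:
    name: str
    max_speed_kph: int
    max_audio_volume: int


# Stored identifying information mapped to per-driver profiles (hypothetical values).
KNOWN_DRIVERS = {
    "face-embedding-alice": DriverProfile("alice", max_speed_kph=130, max_audio_volume=10),
    "face-embedding-bob": DriverProfile("bob", max_speed_kph=110, max_audio_volume=7),
}

# Fallback setting ranges keyed by age bracket (hypothetical values).
DEFAULT_RANGES = {
    "teen": DriverProfile("teen-default", max_speed_kph=90, max_audio_volume=6),
    "adult": DriverProfile("adult-default", max_speed_kph=120, max_audio_volume=8),
}


def is_similar(sample: str, stored: str) -> bool:
    """Stand-in for a real similarity check between identifying information."""
    return sample == stored


def select_profile(sample: str, estimated_age: int) -> DriverProfile:
    # 1. Known driver: activate the specific profile associated with that driver.
    for stored, profile in KNOWN_DRIVERS.items():
        if is_similar(sample, stored):
            return profile
    # 2. Unknown driver: enact setting ranges based on the estimated age.
    return DEFAULT_RANGES["teen" if estimated_age < 20 else "adult"]


if __name__ == "__main__":
    print(select_profile("face-embedding-alice", estimated_age=34))
    print(select_profile("face-embedding-unknown", estimated_age=17))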
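
Gesture-based file sharing from a wearable (publications 20170083101 and 20170083102). A minimal sketch of the gesture matching, assuming hypothetical movement signatures and a Euclidean distance threshold: an observed movement is compared against the stored movement data and, if similar enough, the associated file is "sent" to the paired device.

import math

# Stored movement data (hypothetical 3-axis signatures) and the associated data files.
STORED_GESTURES = {
    "handshake": ([0.2, 0.9, 0.3], "business_card.vcf"),
    "high_five": ([0.9, 0.1, 0.8], "party_photo.jpg"),
    "fist_bump": ([0.5, 0.5, 0.5], "playlist.m3u"),
}

SIMILARITY_THRESHOLD = 0.25  # hypothetical tolerance


def distance(a, b):
    """Euclidean distance between two movement signatures."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))


def match_gesture(observed):
    """Return the file associated with the stored movement most similar to `observed`."""
    name = min(STORED_GESTURES, key=lambda g: distance(observed, STORED_GESTURES[g][0]))
    if distance(observed, STORED_GESTURES[name][0]) <= SIMILARITY_THRESHOLD:
        return STORED_GESTURES[name][1]
    return None


def on_movement(observed, second_device):
    file_to_send = match_gesture(observed)
    if file_to_send:
        # A real wearable would transmit the file wirelessly to the detected second device.
        print(f"Sending {file_to_send} to {second_device}")
    else:
        print("Movement not similar to any stored gesture; nothing sent")


if __name__ == "__main__":
    on_movement([0.21, 0.88, 0.31], second_device="phone-1234")  # handshake-like movement
    on_movement([0.0, 0.0, 0.0], second_device="phone-1234")     # no match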
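
Wearable prompt-point feedback (patent 9600985). A minimal sketch of the prompt-point check, assuming a simple polling loop and a hypothetical vibrate() stand-in for the wearable's feedback mechanism: each prompt point carries a time parameter, and the feedback mechanism is actuated when the current time reaches it.

import time
from dataclasses import dataclass


@dataclass
class PromptPoint:
    label: str
    fire_at: float  # first time parameter, as a Unix timestamp


def vibrate(duration_s: float = 0.5) -> None:
    """Stand-in for actuating the wearable's vibration feedback mechanism."""
    print(f"*bzzt* vibrating for {duration_s}s")


def run_prompt_loop(prompt_points, poll_interval_s: float = 0.2) -> None:
    """Poll the clock and actuate feedback when a prompt point's time is reached."""
    pending = sorted(prompt_points, key=lambda p: p.fire_at)
    while pending:
        now = time.time()
        while pending and now >= pending[0].fire_at:
            point = pending.pop(0)
            print(f"Prompt point reached: {point.label}")
            vibrate()
        time.sleep(poll_interval_s)


if __name__ == "__main__":
    run_prompt_loop([PromptPoint("stand up", time.time() + 1),
                     PromptPoint("drink water", time.time() + 2)])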
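
Meeting emotion and role analysis (publications 20170061989 and 20170061967). A minimal sketch of the analysis flow, assuming deliberately fake speech and gesture parameters: recorded features for the first user are mapped to an emotion label and a meeting role, and the result is sent as the output of the analysis. Only the control flow mirrors the abstract; the feature extraction and thresholds are invented.

from dataclasses import dataclass


@dataclass
class Recording:
    speech_rate_wpm: int   # hypothetical speech parameter
    talk_fraction: float   # share of the meeting spent talking
    gesture_energy: float  # hypothetical gesture parameter in [0, 1]


def analyze(recording: Recording) -> dict:
    """Map speech and gesture parameters to an emotion label and a meeting role."""
    emotion = "engaged" if recording.gesture_energy > 0.5 else "reserved"
    if recording.speech_rate_wpm > 170:
        emotion = "agitated"
    role = "leader" if recording.talk_fraction > 0.4 else "participant"
    return {"emotions": [emotion], "role": role}


def send_output(user: str, analysis: dict) -> None:
    # A real system would push this report to the first and/or second user's client.
    print(f"Report for {user}: {analysis}")


if __name__ == "__main__":
    first_user_recording = Recording(speech_rate_wpm=180, talk_fraction=0.55, gesture_energy=0.7)
    send_output("first_user", analyze(first_user_recording))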
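
Fingerprint-orientation-driven screen orientation (publications 20160370907 and 20160372088). A minimal sketch of the match analysis, assuming the finger position is reduced to a single angle: the observed position is compared with a small finger position database and the screen orientation of the closest entry is selected.

# Hypothetical finger position database: angle (degrees) -> associated screen orientation.
FINGER_POSITION_DB = {
    0.0: "portrait",
    90.0: "landscape-left",
    180.0: "portrait-upside-down",
    270.0: "landscape-right",
}


def angular_difference(a: float, b: float) -> float:
    """Smallest difference between two angles, in degrees."""
    d = abs(a - b) % 360.0
    return min(d, 360.0 - d)


def select_orientation(finger_angle: float) -> str:
    """Match analysis: pick the known finger position closest to the observed angle."""
    best = min(FINGER_POSITION_DB, key=lambda known: angular_difference(finger_angle, known))
    return FINGER_POSITION_DB[best]


def display_image(orientation: str) -> None:
    # Stand-in for rotating the displayed image to the selected screen orientation.
    print(f"Displaying image in {orientation} orientation")


if __name__ == "__main__":
    display_image(select_orientation(83.0))   # close to 90 -> landscape-left
    display_image(select_orientation(350.0))  # close to 0  -> portrait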
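
Randomized keypad layout for an augmented reality overlay (publication 20160307001). A minimal sketch of the layout randomization, assuming a stubbed transmission to the AR device: a random layout maps each physical key to the digit shown in the overlay, so presses on the fixed real-world keypad can be decoded only by a party that knows the layout.

import random

PHYSICAL_KEY_ORDER = "1234567890"  # fixed layout of the real-world numeric keypad


def generate_random_layout(rng: random.Random) -> dict:
    """Map each physical key position to the digit shown in the AR overlay."""
    shuffled = list(PHYSICAL_KEY_ORDER)
    rng.shuffle(shuffled)
    return dict(zip(PHYSICAL_KEY_ORDER, shuffled))


def send_layout_to_ar_device(layout: dict) -> None:
    # A real system would transmit this over the secure connection to the AR device,
    # which renders the shuffled digits over the physical keypad.
    print("AR overlay layout:", layout)


def translate_presses(layout: dict, physical_presses: str) -> str:
    """Recover the digits the user saw in the overlay, given the physical keys pressed."""
    return "".join(layout[key] for key in physical_presses)


if __name__ == "__main__":
    rng = random.Random()  # a real deployment would use a cryptographically secure source
    layout = generate_random_layout(rng)
    send_layout_to_ar_device(layout)
    # An observer watching the physical keypad sees the pressed positions, but without
    # the layout cannot learn the digits the user actually entered.
    print("Entered digits:", translate_presses(layout, "1470"))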