Patents by Inventor Yasuhiro Ota

Yasuhiro Ota has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 10065439
    Abstract: A printing apparatus includes a printhead on a movable carriage; an operating unit operating by a driving force of a driving source; a transmitting unit being displaced by the driving force between a position where the transmitting unit transmits the driving force to the operating unit and a non-transmitting position; a movable member arranged on a moving path of the carriage; a restricting member being displaced, in coordination with a displacement of the movable member, between a position where the displacement of the transmitting unit is restricted and a non-restricting position; and a selecting member provided between the movable member and the restricting member. The selecting member is displaced between a position where the movable member and the restricting member are coordinated with each other by the selecting member and a non-coordinating position.
    Type: Grant
    Filed: July 18, 2017
    Date of Patent: September 4, 2018
    Assignee: Canon Kabushiki Kaisha
    Inventors: Ryuji Nogami, Yasuhiro Ota, Kenji Soya
  • Patent number: 10024667
    Abstract: An intelligent earpiece to be worn over an ear of a user is described. The earpiece includes an IMU, a GPS unit, at least one camera, and a processor connected to them. The processor can recognize an object in the surrounding environment by analyzing the image data based on the stored object data and at least one of the inertial measurement data or the location data. The processor can determine a desirable event or action based on the recognized object, previously determined user data, and a current time or day. The processor can determine a destination based on the determined desirable event or action. The processor can determine a navigation path for navigating to the destination based on the determined destination, the image data, the inertial measurement data, or the location data. The processor can determine output data based on the determined navigation path.
    Type: Grant
    Filed: August 1, 2014
    Date of Patent: July 17, 2018
    Assignee: TOYOTA MOTOR ENGINEERING & MANUFACTURING NORTH AMERICA, INC.
    Inventors: Douglas A. Moore, Joseph M. A. Djugash, Yasuhiro Ota
  • Patent number: 10024678
    Abstract: A clip includes an IMU coupled to the clip and adapted to detect inertial measurement data, and a GPS coupled to the clip and adapted to detect location data. The clip further includes a camera adapted to detect image data and a memory adapted to store data. The clip further includes a processor adapted to recognize an object in the surrounding environment by analyzing the data. The processor can determine a desirable action based on the data and a current time or day. The processor can determine a destination based on the determined desirable action. The processor can determine a navigation path based on the determined destination and the data. The processor is further adapted to determine output based on the navigation path. The clip further includes a speaker adapted to provide audio information to the user.
    Type: Grant
    Filed: September 17, 2014
    Date of Patent: July 17, 2018
    Assignee: TOYOTA MOTOR ENGINEERING & MANUFACTURING NORTH AMERICA, INC.
    Inventors: Douglas A. Moore, Joseph M. A. Djugash, Yasuhiro Ota
  • Patent number: 10024679
    Abstract: A wearable neck device includes an IMU coupled to the wearable neck device and adapted to detect inertial measurement data, and a GPS coupled to the device and adapted to detect location data. The wearable neck device further includes a camera adapted to detect image data and a memory adapted to store data. The wearable neck device further includes a processor adapted to recognize an object in the surrounding environment by analyzing the data. The processor can determine a desirable action based on the data and a current time or day. The processor can determine a destination based on the determined desirable action. The processor can determine a navigation path based on the determined destination and the data. The processor is further adapted to determine output based on the navigation path. The wearable neck device further includes a speaker adapted to provide audio information to the user.
    Type: Grant
    Filed: September 8, 2014
    Date of Patent: July 17, 2018
    Assignee: TOYOTA MOTOR ENGINEERING & MANUFACTURING NORTH AMERICA, INC.
    Inventors: Douglas A. Moore, Joseph M. A. Djugash, Yasuhiro Ota
  • Patent number: 9933776
    Abstract: A numerical control device includes an NC program storing unit; a parameter change unit configured to change a parameter based on a parameter change operation by an operator; a parameter change monitoring unit configured to detect a change of the parameter by monitoring the parameter change unit and to output a command for starting the NC program based on the change of the parameter; and an NC program execution unit configured to execute the NC program based on the command from the parameter change monitoring unit.
    Type: Grant
    Filed: November 5, 2015
    Date of Patent: April 3, 2018
    Assignee: FANUC Corporation
    Inventors: Yasuhiro Ota, Hideaki Maeda, Akira Egashira
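The parameter-change-triggered execution described in patent 9933776 can be sketched as a simple observer pattern: a store notifies a monitor when an operator changes a parameter, and the monitor commands program execution. This is a minimal illustrative sketch, not the patented implementation; all names (`ParameterStore`, `NCController`, `feed_rate`) are hypothetical.

```python
class ParameterStore:
    """Holds NC parameters; an operator changes values through set()."""
    def __init__(self):
        self._values = {}
        self._listeners = []

    def on_change(self, callback):
        # Register a change-monitoring callback (the "monitoring unit")
        self._listeners.append(callback)

    def set(self, name, value):
        old = self._values.get(name)
        self._values[name] = value
        if old != value:  # a parameter change was detected
            for cb in self._listeners:
                cb(name, old, value)

class NCController:
    """Executes the NC program when the monitor reports a change."""
    def __init__(self, store):
        self.executed = []
        store.on_change(self._handle_change)

    def _handle_change(self, name, old, new):
        # Command from the monitoring unit -> start the NC program
        self.executed.append(f"RUN program (trigger: {name} {old}->{new})")

store = ParameterStore()
nc = NCController(store)
store.set("feed_rate", 1200)  # operator change -> program starts
store.set("feed_rate", 1200)  # no change -> no new execution
print(len(nc.executed))
```

The point of the design is that execution is driven by the change event itself, so no separate start command from the operator is needed.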
  • Patent number: 9922236
    Abstract: Eyeglasses include a left lens, a right lens, an IMU sensor, and a GPS unit. A camera and a memory are coupled to the eyeglasses. A processor is connected to the IMU, the GPS unit, and the camera, and is adapted to recognize objects by analyzing image data based on the stored object data and inertial measurement data or location data. The processor is also adapted to determine a desirable event based on the object, previously determined user data, and a time. The processor is also adapted to determine a destination based on the determined desirable event and determine a navigation path for navigating the eyeglasses to the destination based on the determined destination, image data, and inertial measurement data or location data. The processor is also adapted to determine output data based on the determined navigation path. A speaker is also provided.
    Type: Grant
    Filed: September 17, 2014
    Date of Patent: March 20, 2018
    Assignee: TOYOTA MOTOR ENGINEERING & MANUFACTURING NORTH AMERICA, INC.
    Inventors: Douglas A. Moore, Joseph M. A. Djugash, Yasuhiro Ota
  • Patent number: 9914218
    Abstract: In one embodiment, a method for responding to a detected event by a robot is provided. The method includes using a sensor to detect an event within an operational space of a robot. The event includes a movement of an object or a person within the operational space. The method also includes using a processor to predict an action to occur within the operational space of the robot based upon the detected event. The method also identifies at least one correlated robot action to be taken in response to the detected event and compares the predicted action to a movement plan of the robot. The method further selects at least one of the correlated robot actions and modifies a movement plan of a robot to include at least one of the identified correlated robot actions in response to the detected event.
    Type: Grant
    Filed: January 30, 2015
    Date of Patent: March 13, 2018
    Assignee: TOYOTA MOTOR ENGINEERING & MANUFACTURING NORTH AMERICA, INC.
    Inventor: Yasuhiro Ota
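The detect-predict-respond flow described in patent 9914218 can be illustrated with a small sketch: predict an action from a detected event, look up correlated robot actions, and splice a selected one into the movement plan. The rules, action names, and lookup table below are invented for illustration and are not from the patent.

```python
# Hypothetical table mapping predicted actions to correlated robot actions
CORRELATED_ACTIONS = {
    "person_reaching": ["pause_arm", "retract_gripper"],
    "object_falling": ["stop_motion"],
}

def predict_action(event):
    """Stand-in for the sensor-based predictor: map a detected movement
    in the operational space to a predicted action."""
    if event["type"] == "person" and event["motion"] == "toward_robot":
        return "person_reaching"
    if event["type"] == "object" and event["motion"] == "downward":
        return "object_falling"
    return None

def modify_plan(plan, event):
    """Return a movement plan extended with a correlated response."""
    predicted = predict_action(event)
    if predicted is None:
        return plan
    responses = CORRELATED_ACTIONS.get(predicted, [])
    # Select one correlated action and place it ahead of the remaining steps
    return responses[:1] + plan

plan = ["move_to_bin", "grasp_part", "place_part"]
event = {"type": "person", "motion": "toward_robot"}
new_plan = modify_plan(plan, event)
print(new_plan)
```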
  • Publication number: 20180029391
    Abstract: A printing apparatus includes a printhead on a movable carriage; an operating unit operating by a driving force of a driving source; a transmitting unit being displaced by the driving force between a position where the transmitting unit transmits the driving force to the operating unit and a non-transmitting position; a movable member arranged on a moving path of the carriage; a restricting member being displaced, in coordination with a displacement of the movable member, between a position where the displacement of the transmitting unit is restricted and a non-restricting position; and a selecting member provided between the movable member and the restricting member. The selecting member is displaced between a position where the movable member and the restricting member are coordinated with each other by the selecting member and a non-coordinating position.
    Type: Application
    Filed: July 18, 2017
    Publication date: February 1, 2018
    Inventors: Ryuji Nogami, Yasuhiro Ota, Kenji Soya
  • Patent number: 9870718
    Abstract: Imaging devices including spacing members and imaging devices including tactile feedback devices are disclosed. An imaging device includes a body portion, a spacing member, and a camera. The body portion extends in a lengthwise direction from a distal end of the body portion to an imaging end of the body portion. The spacing member extends from the imaging end of the body portion in the lengthwise direction. The camera is coupled to the imaging end of the body portion. When the spacing member of the imaging device is positioned in contact with a surface to be imaged by the camera and the imaging device is moved across the surface, the spacing member maintains a fixed distance between the camera and the surface as the imaging device moves across the surface to be imaged. Imaging devices including tactile feedback devices that are activated when text is recognized are also disclosed.
    Type: Grant
    Filed: December 11, 2014
    Date of Patent: January 16, 2018
    Assignee: TOYOTA MOTOR ENGINEERING & MANUFACTURING NORTH AMERICA, INC.
    Inventors: Douglas A. Moore, Joseph M. A. Djugash, Yasuhiro Ota, Shin Sano, Sarah Rosenbach, Sho Hiruta, Maura Hoven
  • Patent number: 9629774
    Abstract: A smart necklace includes a body defining at least one cavity and having a neck portion and first and a second side portions. The necklace includes a pair of stereo cameras that is configured to detect image data including depth information corresponding to a surrounding environment of the smart necklace. The necklace further includes a positioning sensor configured to detect positioning data corresponding to a positioning of the smart necklace. The necklace includes a non-transitory memory positioned in the at least one cavity and configured to store map data and object data. The smart necklace also includes a processor positioned in the at least one cavity, coupled to the pair of stereo cameras, the positioning sensor and the non-transitory memory. The processor is configured to determine output data based on the image data, the positioning data, the map data and the object data.
    Type: Grant
    Filed: December 5, 2014
    Date of Patent: April 25, 2017
    Assignee: Toyota Motor Engineering & Manufacturing North America, Inc.
    Inventors: Rajiv Dayal, Douglas A. Moore, Yasuhiro Ota, Joseph M. A. Djugash, Tiffany L. Chen, Kenichi Yamamoto
  • Patent number: 9578307
    Abstract: A wearable neck device for providing environmental awareness to a user, the wearable neck device includes a flexible tube. A first stereo pair of cameras is encased in a left portion of the flexible tube and a second stereo pair of cameras is encased in a right portion of the flexible tube. A vibration motor within the flexible tube provides haptic and audio feedback to the user. A processor in the flexible tube recognizes objects from the first stereo pair of cameras and the second stereo pair of cameras. The vibration motor provides haptic and audio feedback of the items or points of interest to the user.
    Type: Grant
    Filed: January 14, 2014
    Date of Patent: February 21, 2017
    Assignee: Toyota Motor Engineering & Manufacturing North America, Inc.
    Inventors: Douglas A. Moore, Joseph M. A. Djugash, Yasuhiro Ota
  • Patent number: 9530058
    Abstract: In one embodiment, a visual-assist robot includes a housing defining a base portion, an imaging assembly, a motorized wheel assembly positioned at the lower surface of the base portion, a processor disposed within the housing and communicatively coupled to the imaging assembly and the motorized wheel assembly, and a non-transitory memory device disposed within the housing. The imaging assembly generates image data corresponding to an environment, and at least a portion of the imaging assembly is configured to be disposed above the upper surface of the base portion. The non-transitory memory device stores machine-readable instructions that cause the processor to provide a drive signal to the motorized wheel assembly such that the motorized wheel assembly moves the visual-assist robot to a desired location within the environment, determine objects from the image data received from the imaging assembly, and transmit message data for receipt by a user.
    Type: Grant
    Filed: December 11, 2014
    Date of Patent: December 27, 2016
    Assignee: TOYOTA MOTOR ENGINEERING & MANUFACTURING NORTH AMERICA, INC.
    Inventors: Douglas A. Moore, Joseph M. A. Djugash, Yasuhiro Ota, Shin Sano, Sarah Rosenbach, Sho Hiruta, Maura Hoven
  • Publication number: 20160296404
    Abstract: A physical assistive robotic device may include a frame including an upright support member, a lateral member slidably engaged with the upright support member, a handle slidably engaged with the lateral member, an elevation actuator coupled to the upright support member and the lateral member, and a lateral actuator coupled to the lateral member and the handle. The elevation actuator translates the lateral member and the lateral actuator translates the handle to transition a user between a standing position and a non-standing position.
    Type: Application
    Filed: June 22, 2016
    Publication date: October 13, 2016
    Applicants: Toyota Motor Engineering & Manufacturing North America, Inc., Illinois Institute of Technology
    Inventors: Yasuhiro Ota, Masaru Ryumae, Keiichi Sato, Shin Sano
  • Publication number: 20160221191
    Abstract: In one embodiment, a method for responding to a detected event by a robot is provided. The method includes using a sensor to detect an event within an operational space of a robot. The event includes a movement of an object or a person within the operational space. The method also includes using a processor to predict an action to occur within the operational space of the robot based upon the detected event. The method also identifies at least one correlated robot action to be taken in response to the detected event and compares the predicted action to a movement plan of the robot. The method further selects at least one of the correlated robot actions and modifies a movement plan of a robot to include at least one of the identified correlated robot actions in response to the detected event.
    Type: Application
    Filed: January 30, 2015
    Publication date: August 4, 2016
    Applicant: Toyota Motor Engineering & Manufacturing North America, Inc.
    Inventor: Yasuhiro Ota
  • Patent number: 9381131
    Abstract: A physical assistive robotic device may include a frame including an upright support member, a lateral member slidably engaged with the upright support member, a handle slidably engaged with the lateral member, an elevation actuator coupled to the upright support member and the lateral member, and a lateral actuator coupled to the lateral member and the handle. The elevation actuator translates the lateral member and the lateral actuator translates the handle to transition a user between a standing position and a non-standing position.
    Type: Grant
    Filed: January 10, 2013
    Date of Patent: July 5, 2016
    Assignees: Toyota Motor Engineering & Manufacturing North America, Inc., Illinois Institute of Technology
    Inventors: Yasuhiro Ota, Masaru Ryumae, Keiichi Sato, Shin Sano
  • Patent number: 9375839
    Abstract: Methods and computer-program products for evaluating grasp patterns for use by a robot are disclosed. In one embodiment, a method of evaluating grasp patterns includes selecting an individual grasp pattern from a grasp pattern set, establishing a thumb-up vector, and simulating the motion of the manipulator and the end effector according to the selected individual grasp pattern, wherein each individual grasp pattern of the grasp pattern set corresponds to motion for manipulating a target object. The method further includes evaluating a direction of the thumb-up vector during at least a portion of the simulated motion of the manipulator and the end effector, and excluding the selected individual grasp pattern from use by the robot if the direction of the thumb-up vector during the simulated motion is outside of one or more predetermined thresholds. Robots utilizing the methods and computer-program products for evaluating grasp patterns are also disclosed.
    Type: Grant
    Filed: March 6, 2015
    Date of Patent: June 28, 2016
    Assignees: Carnegie Mellon University, Toyota Jidosha Kabushiki Kaisha
    Inventors: Yasuhiro Ota, Junggon Kim, James J. Kuffner
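The thumb-up-vector test described in patent 9375839 amounts to tracking a direction during a simulated grasp and rejecting the grasp if it leaves a threshold cone. The following is a minimal numeric sketch under invented assumptions (world "up" is +z, a fixed tilt limit in degrees, sampled vectors standing in for the simulated motion); it is not the patented method itself.

```python
import math

def angle_from_up(vec):
    """Angle in degrees between a 3D vector and the world 'up' axis (0, 0, 1)."""
    norm = math.sqrt(sum(c * c for c in vec))
    cos_a = vec[2] / norm
    return math.degrees(math.acos(max(-1.0, min(1.0, cos_a))))

def grasp_is_acceptable(thumb_up_samples, max_tilt_deg=60.0):
    """Keep a grasp pattern only if the thumb-up vector stays within the
    tilt limit at every sampled step of the simulated end-effector motion."""
    return all(angle_from_up(v) <= max_tilt_deg for v in thumb_up_samples)

# Two simulated grasp patterns: one stays near upright, one flips over.
upright = [(0.0, 0.1, 1.0), (0.1, 0.0, 0.9)]
flipped = [(0.0, 0.1, 1.0), (0.0, 0.0, -1.0)]  # thumb ends pointing down

print(grasp_is_acceptable(upright))
print(grasp_is_acceptable(flipped))
```

A grasp failing the check would be excluded from the grasp pattern set before the robot ever attempts it, which is the pruning step the abstract describes.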
  • Publication number: 20160170508
    Abstract: Embodiments of tactile display devices are disclosed. In one embodiment, a tactile display device includes a housing having a first surface, a tactile display located at the first surface, a camera, a processor, and a non-transitory memory device. The tactile display is configured to produce a plurality of raised portions defining a tactile message. The camera generates image data corresponding to an environment. The processor is disposed within the housing and communicatively coupled to the tactile display and the camera. The non-transitory memory device stores machine-readable instructions that, when executed by the processor, cause the processor to, generate a topographical map of objects within the environment from the image data received from the camera, generate tactile display data corresponding to the topographical map, and provide the tactile display data to the tactile display such that the tactile display produces the plurality of raised portions to form the tactile message.
    Type: Application
    Filed: December 11, 2014
    Publication date: June 16, 2016
    Applicant: Toyota Motor Engineering & Manufacturing North America, Inc.
    Inventors: Douglas A. Moore, Joseph M.A. Djugash, Yasuhiro Ota, Shin Sano, Sarah Rosenbach, Sho Hiruta, Maura Hoven
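The tactile display of publication 20160170508 drives a grid of raised portions from a topographical map of the environment. A minimal sketch of one plausible mapping step, quantizing continuous heights into discrete pin positions, is shown below; the grid values, level count, and quantization scheme are all invented for illustration.

```python
def to_pin_heights(topo_map, levels=4, max_height=255.0):
    """Map each cell of a topographical map onto one of `levels`
    discrete raised-pin positions (0 = flat, levels-1 = fully raised)."""
    step = max_height / levels
    pins = []
    for row in topo_map:
        pins.append([min(levels - 1, int(h // step)) for h in row])
    return pins

# A 3x3 topographical map, e.g. obstacle heights derived from camera data
topo = [
    [0.0,  60.0,  250.0],
    [10.0, 130.0, 200.0],
    [0.0,  0.0,   70.0],
]
print(to_pin_heights(topo))
```

The resulting integer grid would then be handed to the tactile display hardware, which raises each pin to the corresponding level to form the tactile message.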
  • Publication number: 20160171908
    Abstract: Imaging devices including spacing members and imaging devices including tactile feedback devices are disclosed. An imaging device includes a body portion, a spacing member, and a camera. The body portion extends in a lengthwise direction from a distal end of the body portion to an imaging end of the body portion. The spacing member extends from the imaging end of the body portion in the lengthwise direction. The camera is coupled to the imaging end of the body portion. When the spacing member of the imaging device is positioned in contact with a surface to be imaged by the camera and the imaging device is moved across the surface, the spacing member maintains a fixed distance between the camera and the surface as the imaging device moves across the surface to be imaged. Imaging devices including tactile feedback devices that are activated when text is recognized are also disclosed.
    Type: Application
    Filed: December 11, 2014
    Publication date: June 16, 2016
    Applicant: Toyota Motor Engineering & Manufacturing North America, Inc.
    Inventors: Douglas A. Moore, Joseph M.A. Djugash, Yasuhiro Ota, Shin Sano, Sarah Rosenbach, Sho Hiruta, Maura Hoven
  • Publication number: 20160171907
    Abstract: Imaging gloves including wrist cameras and finger cameras are disclosed. An imaging glove includes a wrist portion, a finger portion extending from the wrist portion, a wrist camera coupled to the wrist portion, a finger camera coupled to the finger portion, a processor communicatively coupled to the wrist camera and the finger camera, a memory module communicatively coupled to the processor, and machine readable instructions stored in the memory module. When executed by the processor, the machine readable instructions cause the imaging glove to receive image data from the wrist camera or the finger camera, recognize an object in the received image data, and provide output indicative of the recognized object.
    Type: Application
    Filed: December 11, 2014
    Publication date: June 16, 2016
    Applicant: Toyota Motor Engineering & Manufacturing North America, Inc.
    Inventors: Douglas A. Moore, Joseph M.A. Djugash, Yasuhiro Ota, Shin Sano, Sarah Rosenbach, Sho Hiruta, Maura Hoven
  • Patent number: D768024
    Type: Grant
    Filed: September 22, 2014
    Date of Patent: October 4, 2016
    Assignee: Toyota Motor Engineering & Manufacturing North America, Inc.
    Inventors: Rajiv Dayal, Douglas A. Moore, Yasuhiro Ota, Joseph M. A. Djugash, Tiffany L. Chen, Kenichi Yamamoto