Patents by Inventor Akihito Seki
Akihito Seki has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Publication number: 20140285816
Abstract: According to an embodiment, a measuring device includes a projector, an image capturing unit, and a first calculator. The projector projects, onto a target, a first superimposed pattern obtained by superimposing a first pattern having a periodic change and a second pattern configured with a first design for specifying a period of the first pattern. The image capturing unit captures the target, onto which the first superimposed pattern is projected, to obtain an image. The first calculator matches the first design captured by the image capturing unit, which is included in the first superimposed pattern in the image, against the first design projected by the projector, and calculates the correspondence between a second superimposed pattern, namely the first superimposed pattern as captured in the image, and the first superimposed pattern.
Type: Application
Filed: March 7, 2014
Publication date: September 25, 2014
Applicant: KABUSHIKI KAISHA TOSHIBA
Inventors: Masaki Yamazaki, Akihito Seki, Satoshi Ito, Yuta Itoh, Hideaki Uchiyama, Ryuzo Okada
-
Publication number: 20140285794
Abstract: According to an embodiment, a measuring device includes an imaging unit that captures an object from a plurality of positions to obtain a plurality of images; a distance measuring unit that measures the distance to the object from each position to obtain a plurality of pieces of distance information; a position measuring unit that measures each position to obtain a plurality of pieces of position information; a first calculator that calculates three-dimensional data of the object using the images; a second calculator that calculates a degree of reliability for each piece of distance information and each piece of position information; and an estimating unit that uses only those pieces of distance information and position information whose degree of reliability is greater than a predetermined value to estimate the scale of the three-dimensional data.
Type: Application
Filed: March 4, 2014
Publication date: September 25, 2014
Applicant: KABUSHIKI KAISHA TOSHIBA
Inventors: Satoshi Ito, Akihito Seki, Masaki Yamazaki, Yuta Itoh, Hideaki Uchiyama
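The reliability-thresholded scale estimation described in this abstract can be sketched as follows. The specific formulation (averaging the ratios of measured to model distances over reliable samples) and the threshold value are illustrative assumptions, not the patent's actual method.

```python
# Hypothetical sketch: only measurements whose reliability exceeds a
# threshold contribute to the scale factor applied to the up-to-scale
# 3D reconstruction.

def estimate_scale(measured_distances, model_distances, reliabilities, threshold=0.5):
    """Average measured/model distance ratios over reliable samples."""
    ratios = [m / d for m, d, r in zip(measured_distances, model_distances, reliabilities)
              if r > threshold and d > 0]
    if not ratios:
        raise ValueError("no measurement exceeds the reliability threshold")
    return sum(ratios) / len(ratios)
```

Here the third sample, with reliability 0.1, would be discarded, so the estimated scale comes only from the two trustworthy measurements.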
-
Publication number: 20140132600
Abstract: An example three-dimensional model generating device includes an emitting unit that emits laser light and a first deflector that deflects the laser light, whose emission direction rotates in a first rotation range, within a first scan plane. A second deflector deflects the laser light, whose emission direction rotates in a second rotation range, within a second scan plane intersecting the first scan plane. A detector detects reflected light when laser light deflected by the first deflector or the second deflector is reflected from the target object. A measuring unit measures the distance to the target object on the basis of the time elapsed from emission of the laser light to detection of the reflected light. A generating unit generates a three-dimensional model of the target object from the measurement result.
Type: Application
Filed: November 5, 2013
Publication date: May 15, 2014
Applicant: KABUSHIKI KAISHA TOSHIBA
Inventors: Kenichi Shimoyama, Akihito Seki, Satoshi Ito, Masaki Yamazaki, Yuta Itoh
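The distance-from-time measurement this abstract relies on is the standard time-of-flight relation: the range is half the round-trip time multiplied by the speed of light. A minimal sketch:

```python
# Time-of-flight range calculation: the laser pulse travels to the
# target and back, so the one-way distance is half the round trip.

SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def tof_distance(round_trip_seconds):
    """Distance in metres from a measured round-trip time in seconds."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0
```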
-
Publication number: 20140132759
Abstract: According to an embodiment, a measuring device includes a measuring unit, a capturing unit, an estimation unit, a calculator, and a detector. The estimation unit estimates, for each irradiated point, an estimated projection position on the image, using the direction and distance of the irradiated point together with calibration information based on calibration of the measuring unit and the capturing unit. The calculator calculates, for each irradiated point, an amount of change in reflection intensity to obtain a reflection intensity change point, and calculates, for each estimated projection position, an amount of change in brightness to obtain a brightness change point. The detector detects a calibration shift between the measuring unit and the capturing unit by comparing the reflection intensity change point with the brightness change point.
Type: Application
Filed: November 4, 2013
Publication date: May 15, 2014
Applicant: KABUSHIKI KAISHA TOSHIBA
Inventors: Yuta Itoh, Akihito Seki, Kenichi Shimoyama, Satoshi Ito, Masaki Yamazaki
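The shift-detection idea can be sketched as a comparison of the two kinds of change points. Representing change points as 1-D positions and using a fixed pixel tolerance are illustrative assumptions; the patent does not specify this representation.

```python
# Hedged sketch: if the reflection-intensity change points (from the
# range sensor) no longer line up with the brightness change points
# (from the camera), the calibration between the two units has likely
# shifted.

def calibration_shift_detected(intensity_change_points,
                               brightness_change_points,
                               tolerance_px=2.0):
    pairs = zip(intensity_change_points, brightness_change_points)
    return any(abs(i - b) > tolerance_px for i, b in pairs)
```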
-
Publication number: 20140133700
Abstract: According to an embodiment, a detecting device includes a projecting unit, a calculator, and a detector. The projecting unit is configured to obtain a first projection position by projecting the capturing position of a captured image onto a road surface, a second projection position by projecting a spatial position in the captured image onto the road surface, and a third projection position by projecting an error position onto the road surface. The calculator is configured to calculate an existence probability of an object along the straight line passing through the first and second projection positions, such that the existence probability between the second and third projection positions is greater than that between the first and third projection positions. The detector is configured to detect the boundary between the road surface and the object by using the existence probability.
Type: Application
Filed: October 28, 2013
Publication date: May 15, 2014
Applicant: KABUSHIKI KAISHA TOSHIBA
Inventor: Akihito Seki
-
Publication number: 20140111097
Abstract: According to an embodiment, an identification device includes a controller, an acquiring unit, and an identifying unit. The controller individually turns a plurality of light emitting apparatuses on and off via a network, using the identification information of each light emitting apparatus. The acquiring unit acquires images of the light emitting apparatuses in time series. The identifying unit determines the installation position of each light emitting apparatus from the on/off control and the images, and associates each light emitting apparatus determined by its installation position with the corresponding light emitting apparatus identified by its identification information.
Type: Application
Filed: August 29, 2013
Publication date: April 24, 2014
Applicant: KABUSHIKI KAISHA TOSHIBA
Inventors: Satoshi Ito, Kenichi Shimoyama, Akihito Seki, Yuta Itoh, Masaki Yamazaki
-
Publication number: 20140104386
Abstract: According to an embodiment, an observation support device includes an acquiring unit, a determining unit, and a generating unit. The acquiring unit is configured to acquire observation information that identifies an observing direction vector, i.e. the direction in which an observing unit observes an object for which a three-dimensional model is to be generated. The determining unit is configured to determine whether observation of the object in the direction indicated by an observed direction vector, in which at least part of the object can be observed, is completed, based on the degree of coincidence between the observing direction vector and the observed direction vector. The generating unit is configured to generate completion information indicating whether observation of the object in that direction is completed.
Type: Application
Filed: March 14, 2013
Publication date: April 17, 2014
Applicant: KABUSHIKI KAISHA TOSHIBA
Inventors: Kenichi Shimoyama, Akihito Seki, Satoshi Ito, Masaki Yamazaki, Yuta Itoh, Ryuzo Okada
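One natural reading of the "degree of coincidence" between the two direction vectors is the cosine of the angle between them; that choice, and the threshold value, are assumptions for illustration only.

```python
import math

# Illustrative coincidence test between the current observing direction
# and a required observed direction, using the cosine of the angle
# between the two vectors (assumed interpretation).

def observation_complete(observing_dir, observed_dir, threshold=0.95):
    dot = sum(a * b for a, b in zip(observing_dir, observed_dir))
    norm_a = math.sqrt(sum(a * a for a in observing_dir))
    norm_b = math.sqrt(sum(b * b for b in observed_dir))
    return dot / (norm_a * norm_b) >= threshold
```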
-
Publication number: 20140098222
Abstract: According to an embodiment, an area identifying device includes a projecting unit, an image capturing unit, a calculating unit, and an identifying unit. The projecting unit is configured to project a pattern so that the pattern performs a predetermined movement. The image capturing unit is configured to capture, in sequential order, multiple images of the area on which the pattern is projected, each image having a plurality of regions. The calculating unit is configured to calculate an amount of change of the pattern appearance for each region in the multiple images. The identifying unit is configured to identify as a reflective area, among the calculated amounts of change, at least one region whose amount of change of the pattern appearance differs from a reference amount of change based on the predetermined movement.
Type: Application
Filed: March 13, 2013
Publication date: April 10, 2014
Applicant: KABUSHIKI KAISHA TOSHIBA
Inventors: Masaki Yamazaki, Akihito Seki, Kenichi Shimoyama, Satoshi Ito, Yuta Itoh, Ryuzo Okada
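The identification rule above reduces to flagging regions whose observed amount of pattern change deviates from the reference amount. A minimal sketch, where the scalar change values and the tolerance are hypothetical:

```python
# Flag regions whose observed amount of pattern change deviates from
# the reference amount (implied by the known pattern movement) beyond
# a tolerance; those regions are treated as reflective.

def reflective_regions(observed_changes, reference_change, tolerance=0.2):
    return [i for i, c in enumerate(observed_changes)
            if abs(c - reference_change) > tolerance]
```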
-
Publication number: 20140100814
Abstract: According to an embodiment, a measuring device includes a generating unit, an updating unit, and a calculating unit. The generating unit is configured to generate a first position and orientation of an observing unit for observing a space containing a target object, an error model indicating the theoretical precision of observation, and three-dimensional shape data of the target object. The updating unit is configured to update space information that indicates the probability of the target object being present at each coordinate within the space, using the first position and the three-dimensional shape data. The calculating unit is configured to calculate a measurement quality increment for a second position and orientation of a search area within the space using the error model and the space information, and to set, as the new position and orientation, a third position and orientation for which the calculated measurement quality increment satisfies a predetermined condition.
Type: Application
Filed: March 13, 2013
Publication date: April 10, 2014
Applicant: KABUSHIKI KAISHA TOSHIBA
Inventors: Yuta Itoh, Akihito Seki, Kenichi Shimoyama, Satoshi Ito, Masaki Yamazaki
-
Publication number: 20140093129
Abstract: According to one embodiment, an object detection apparatus includes an acquisition unit, a first detector, a determination unit, and a second detector. The acquisition unit acquires frames in a time-series manner. The first detector detects a predetermined object in each of the frames. The determination unit stores the detection results corresponding to the frames, compares a first detection result corresponding to a first frame with a second detection result corresponding to a second frame, and determines whether a false negative (a missed detection) of the predetermined object exists in the second frame. When it is determined that a false negative exists, the second detector, which differs in performance from the first detector, detects the predetermined object in the second frame.
Type: Application
Filed: September 27, 2013
Publication date: April 3, 2014
Applicant: KABUSHIKI KAISHA TOSHIBA
Inventors: Mayu Okumura, Tomoki Watanabe, Akihito Seki
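The two-stage scheme can be sketched as follows: a fast first detector runs on every frame, and when an object found in the previous frame is missing from the current result, a slower second detector re-checks. The detector interfaces and the frame representation here are hypothetical.

```python
# Hedged sketch of two-stage detection with false-negative fallback.
# fast_detector / strong_detector are assumed callables that return
# the object labels found in a frame.

def detect_with_fallback(frames, fast_detector, strong_detector):
    results = []
    prev = set()
    for frame in frames:
        found = set(fast_detector(frame))
        missing = prev - found          # possible false negatives
        if missing:
            # only accept the stronger detector's hits for objects
            # that actually went missing
            found |= set(strong_detector(frame)) & missing
        results.append(found)
        prev = found
    return results
```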
-
Publication number: 20140002448
Abstract: According to an embodiment, a measurement support device includes a first calculator configured to calculate, when three-dimensional shape data representing a measured part of an object are viewed from a plurality of points of view, a plurality of first information quantities representing the visibility of the three-dimensional shape; a second calculator configured to calculate a second information quantity by multiplying the maximum of the first information quantities by a predetermined proportion; a selector configured to select, from the points of view, a point of view having a small difference between its first information quantity and the second information quantity; and a display controller configured to display the three-dimensional shape data as viewed from the selected point of view on a display unit.
Type: Application
Filed: March 13, 2013
Publication date: January 2, 2014
Inventors: Satoshi Ito, Akihito Seki, Yuta Itoh, Masaki Yamazaki, Kenichi Shimoyama
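The selection rule in this abstract can be sketched directly: compute a target quantity as a fixed proportion of the maximum visibility score, then pick the viewpoint whose score is closest to that target. The proportion value is an assumed parameter.

```python
# Sketch of the viewpoint-selection rule: the target is a fixed
# proportion of the best visibility score, and the chosen viewpoint
# is the one whose score is nearest the target.

def select_viewpoint(visibility_scores, proportion=0.5):
    target = max(visibility_scores) * proportion
    diffs = [abs(s - target) for s in visibility_scores]
    return diffs.index(min(diffs))  # index of the selected viewpoint
```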
-
Publication number: 20130188860
Abstract: According to an embodiment, a second calculator calculates a three-dimensional position of a measurement position, and the error in that three-dimensional position, using a first image, the measurement position, a second image, and a correspondence position. A selection unit determines whether, among the image pairs of the plurality of images, there is an image pair for which the error in the three-dimensional position becomes smaller than the error calculated by the second calculator; when such a pair exists it selects the pair, and when none exists it finalizes the three-dimensional position. Each time an image pair is selected, the second calculator calculates a new three-dimensional position of the measurement position and its error using the new first and second images of the pair, and the first and second projection positions where the three-dimensional position is projected onto the new first and second images, respectively.
Type: Application
Filed: December 31, 2012
Publication date: July 25, 2013
Applicant: KABUSHIKI KAISHA TOSHIBA
Inventors: Akihito Seki, Ryuzo Okada
-
Patent number: 8331653
Abstract: An object position area is calculated according to the position in space of a detected truck. The template corresponding to the object position area recorded at the previous time is then called. The template is moved to the position on a reference image Ib where the similarity is highest. The overlap rate of the object position area and the template is calculated, and is used to decide whether the object is identical to one detected in the past.
Type: Grant
Filed: August 11, 2005
Date of Patent: December 11, 2012
Assignees: Tokyo Institute of Technology, Honda Motor Co., Ltd.
Inventors: Akihito Seki, Masatoshi Okutomi
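An overlap-rate identity test of this kind can be sketched with axis-aligned boxes. Using intersection-over-union as the overlap rate, and the 0.5 threshold, are illustrative assumptions; the patent does not fix either choice.

```python
# Illustrative overlap-rate test for deciding whether a detection is
# the same object seen previously; boxes are (x1, y1, x2, y2).

def overlap_rate(box_a, box_b):
    ix1, iy1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    ix2, iy2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    union = ((box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
             + (box_b[2] - box_b[0]) * (box_b[3] - box_b[1]) - inter)
    return inter / union if union else 0.0

def is_same_object(object_area, template_box, threshold=0.5):
    return overlap_rate(object_area, template_box) >= threshold
```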
-
Patent number: 8229249
Abstract: A spatial motion calculation apparatus includes an image position relation calculation unit that calculates first similarities based on the image position relation between inputted groups of feature points 1 and 2, a spatial position relation calculation unit that calculates second similarities based on the spatial position relation between said groups, a feature descriptor relation calculation unit that calculates third similarities based on the feature descriptor relation between said groups, and a spatial motion calculation unit that estimates the spatial motion from the result of integrating the first to third similarities.
Type: Grant
Filed: September 18, 2008
Date of Patent: July 24, 2012
Assignee: Kabushiki Kaisha Toshiba
Inventor: Akihito Seki
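The integration of the three similarity cues can be sketched as a weighted combination per candidate correspondence. The weighted average used here is an assumption; the patent only states that the three similarities are integrated.

```python
# Sketch of combining the three similarity cues (image position,
# spatial position, feature descriptor) into one score per candidate
# correspondence; the weighted average is an assumed integration rule.

def integrated_similarity(s_image, s_spatial, s_descriptor,
                          weights=(1.0, 1.0, 1.0)):
    scores = (s_image, s_spatial, s_descriptor)
    total = sum(w * s for w, s in zip(weights, scores))
    return total / sum(weights)

def best_match(candidates):
    """candidates: list of (id, s_image, s_spatial, s_descriptor)."""
    return max(candidates, key=lambda c: integrated_similarity(*c[1:]))[0]
```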
-
Patent number: 8154594
Abstract: A standard camera picks up a standard image, and a reference camera picks up a reference image. A flat area is extracted from the standard image and the reference image. An edge image is created by extracting the edges and feature points from the standard image, and a corrected edge image is then created by removing the flat area. Object detection processing is carried out on the edges and feature points of the corrected edge image with reference to the reference image.
Type: Grant
Filed: August 11, 2005
Date of Patent: April 10, 2012
Assignees: Tokyo Institute of Technology, Honda Motor Co., Ltd.
Inventors: Akihito Seki, Masatoshi Okutomi
-
Publication number: 20120075428
Abstract: According to one embodiment, an image processing apparatus includes plural imaging units and a calibration unit. The imaging units capture an overlapping region. The calibration unit calibrates the plural captured images and obtains plural calibrated images in which the lens distortion in each captured image is corrected and the corresponding positions of the captured images are aligned with each other horizontally and adjusted to a surface perpendicular to a plane. The plural imaging units are arranged such that the baseline vector connecting their optical centers is perpendicular to the normal vector of the plane.
Type: Application
Filed: June 21, 2011
Publication date: March 29, 2012
Applicant: KABUSHIKI KAISHA TOSHIBA
Inventor: Akihito Seki
-
Publication number: 20100245266
Abstract: A handwriting processing apparatus includes an acquiring unit configured to acquire coordinate information of handwriting input by an input unit, together with attribute information indicating the type of the handwriting input; a determining unit configured to determine the type of the handwriting using the attribute information; a handwriting processing unit configured to perform handwriting processing corresponding to the type of the handwriting using the coordinate information; and a display control unit configured to control a display unit to display the result of the handwriting processing.
Type: Application
Filed: September 15, 2009
Publication date: September 30, 2010
Applicant: KABUSHIKI KAISHA TOSHIBA
Inventors: Yojiro Tonouchi, Ryuzo Okada, Mieko Asano, Hiroshi Hattori, Tsukasa Ike, Akihito Seki, Hidetaka Ohira
-
Publication number: 20090226034
Abstract: A spatial motion calculation apparatus includes an image position relation calculation unit that calculates first similarities based on the image position relation between inputted groups of feature points 1 and 2, a spatial position relation calculation unit that calculates second similarities based on the spatial position relation between said groups, a feature descriptor relation calculation unit that calculates third similarities based on the feature descriptor relation between said groups, and a spatial motion calculation unit that estimates the spatial motion from the result of integrating the first to third similarities.
Type: Application
Filed: September 18, 2008
Publication date: September 10, 2009
Applicant: KABUSHIKI KAISHA TOSHIBA
Inventor: Akihito Seki
-
Publication number: 20090169052
Abstract: An object position area (82a) is calculated according to the position in space of a detected truck (70). The template (80a) corresponding to the object position area (82a) recorded at the previous time is then called. The template (80a) is moved to the position on a reference image Ib where the similarity is highest. The overlap rate of the object position area (82a) and the template (80a), R(t-1) ∩ Rt, is calculated, and is used to decide whether the object is identical to one detected in the past.
Type: Application
Filed: August 11, 2005
Publication date: July 2, 2009
Applicants: Tokyo Institute of Technology, Honda Motor Co., Ltd.
Inventors: Akihito Seki, Masatoshi Okutomi
-
Publication number: 20090167844
Abstract: A standard camera (12) picks up a standard image Ib, and a reference camera (14) picks up a reference image Ir. A flat area IIf is extracted from the standard image Ib and the reference image Ir. An edge image is created by extracting the edges and feature points from the standard image Ib, and a corrected edge image (62) is then created by removing the flat area IIf. Object detection processing is carried out on the edges and feature points of the corrected edge image (62) with reference to the reference image Ir.
Type: Application
Filed: August 11, 2005
Publication date: July 2, 2009
Applicants: Tokyo Institute of Technology, Honda Motor Co., Ltd.
Inventors: Akihito Seki, Masatoshi Okutomi