Patents Examined by Andrew W. Johns
  • Patent number: 10121224
    Abstract: Embodiments of a device and a frequency data extrapolator are generally described herein. The frequency data extrapolator may receive input frequency data mapped to a two-dimensional frequency grid. As an example, the input frequency data may be based on return signals received, at a sensor of the device, in response to pulsed transmissions of the sensor in a physical environment. Regions of the frequency grid may be classified as high fidelity or low fidelity. A group of basis rectangles may be determined within the high fidelity regions. A column-wise extrapolation matrix and a row-wise extrapolation matrix may be determined based on the input frequency data of the basis rectangles. The input frequency data of the high fidelity regions may be extrapolated to replace the input frequency data of the low fidelity regions.
    Type: Grant
    Filed: July 28, 2016
    Date of Patent: November 6, 2018
    Assignee: Raytheon Company
    Inventors: J. Kent Harbaugh, Michael W. Whitt, Theagenis J. Abatzoglou
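    Illustrative sketch (not part of the patent): the column-wise and row-wise extrapolation matrices in the abstract suggest a linear-prediction style of extrapolation from high-fidelity samples into low-fidelity regions. The NumPy sketch below shows only the generic 1-D idea; the function names, the AR model order, and the example signal are assumptions, not the patented method.
      # Generic linear-prediction (autoregressive) extrapolation of one grid row.
      import numpy as np

      def ar_coefficients(x, order):
          # Least-squares fit of x[n] ~ sum_k a[k] * x[n-1-k] on the known samples.
          X = np.array([x[n - order:n][::-1] for n in range(order, len(x))])
          y = x[order:]
          a, *_ = np.linalg.lstsq(X, y, rcond=None)
          return a

      def extrapolate(x, order, n_new):
          # Extend the high-fidelity samples x by n_new predicted samples.
          a = ar_coefficients(np.asarray(x, dtype=float), order)
          out = list(x)
          for _ in range(n_new):
              out.append(float(np.dot(a, out[-1:-order - 1:-1])))
          return np.array(out)

      # Example: extend a sampled sinusoid from the high-fidelity part of a row.
      extended = extrapolate(np.cos(0.3 * np.arange(32)), order=4, n_new=16)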
  • Patent number: 10112585
    Abstract: Example vehicle cleanliness detection systems and methods are described. In one implementation, a method positions a vehicle proximate an image capture station and receives multiple images from the image capture station. The multiple images include different views of the vehicle's exterior surfaces. A vehicle cleanliness detection system analyzes the multiple images to determine a cleanliness of the vehicle's exterior surfaces. Based on analyzing the cleanliness of the vehicle's exterior surfaces, the method determines whether the vehicle exterior needs to be cleaned.
    Type: Grant
    Filed: June 20, 2017
    Date of Patent: October 30, 2018
    Assignee: FORD GLOBAL TECHNOLOGIES, LLC
    Inventor: Shant Tokatyan
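    Illustrative sketch (not part of the patent): one simple way to realize the described analysis is to compare each captured exterior view against a stored reference image of the clean vehicle taken from the same capture-station viewpoint. The function name, the alignment assumption, and the threshold are illustrative.
      import numpy as np

      def needs_cleaning(views, clean_references, threshold=12.0):
          # views / clean_references: aligned grayscale images (2-D arrays) from the
          # same capture-station viewpoints; flag the vehicle if any view deviates
          # from its clean reference by more than `threshold` grey levels on average.
          for view, ref in zip(views, clean_references):
              if np.abs(view.astype(float) - ref.astype(float)).mean() > threshold:
                  return True
          return False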
  • Patent number: 10105187
    Abstract: Techniques facilitating augmented reality-assisted surgery are provided. In one example, a method is provided that includes receiving, by a first device including a processor, image data associated with an external portion of a tool located within a body of a patient, wherein the image data includes first information indicative of a first fiducial marker on the external portion of the tool. The method also includes determining one or more relative positions of an internal portion of the tool within the body relative to one or more anatomical structures of the body based on the image data and a defined configuration of the tool. The method also includes generating one or more representations of the tool within the body relative to the one or more anatomical structures based on the one or more relative positions and the defined configuration of the tool.
    Type: Grant
    Filed: August 15, 2016
    Date of Patent: October 23, 2018
    Assignee: Medtronic, Inc.
    Inventors: Eric D. Corndorf, Andrzej M. Malewicz
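    Illustrative sketch (not part of the patent): determining the internal tool position from an external fiducial can be framed as marker pose estimation followed by a rigid transform through the tool's defined geometry. The sketch below uses OpenCV's solvePnP for the pose step; the function name, marker model, and tip offset are assumptions for illustration.
      import numpy as np
      import cv2

      def tool_tip_in_camera_frame(marker_pts_3d, marker_pts_2d,
                                   camera_matrix, dist_coeffs, tip_offset):
          # marker_pts_3d: Nx3 fiducial corner coordinates in the tool frame.
          # marker_pts_2d: Nx2 detected image coordinates of those corners.
          # tip_offset: vector from the marker origin to the internal tip,
          # known from the defined configuration of the tool.
          ok, rvec, tvec = cv2.solvePnP(marker_pts_3d.astype(np.float32),
                                        marker_pts_2d.astype(np.float32),
                                        camera_matrix, dist_coeffs)
          if not ok:
              return None
          R, _ = cv2.Rodrigues(rvec)             # tool frame -> camera frame rotation
          return R @ np.asarray(tip_offset) + tvec.ravel()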
  • Patent number: 10092216
    Abstract: In an endoscope system, an insertion amount of an insertion unit is detected based on camera images captured by two cameras provided on a mouthpiece. Then, past images of a predetermined range corresponding to the detected insertion amount are acquired from a past image storage unit. A current image is compared with each of the acquired past images to calculate the similarity between the current image and each of the past images. The body part captured in the past image having the highest similarity to the current image is determined to be the body part captured in the current image.
    Type: Grant
    Filed: March 29, 2016
    Date of Patent: October 9, 2018
    Assignee: FUJIFILM Corporation
    Inventor: Nobuyuki Miura
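    Illustrative sketch (not part of the patent): the similarity comparison described in the abstract can be as simple as a normalized cross-correlation between the current image and each past image retrieved for the detected insertion-amount range; the names and the similarity measure are assumptions.
      import numpy as np

      def normalized_correlation(a, b):
          a = a.astype(float).ravel() - a.mean()
          b = b.astype(float).ravel() - b.mean()
          denom = np.linalg.norm(a) * np.linalg.norm(b)
          return float(a @ b / denom) if denom else 0.0

      def best_matching_past_image(current, past_images):
          # past_images: stored images whose recorded insertion amounts fall in the
          # predetermined range around the currently detected insertion amount.
          scores = [normalized_correlation(current, p) for p in past_images]
          return int(np.argmax(scores)), max(scores)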
  • Patent number: 10092279
    Abstract: A system comprises a display and a processing unit functionally associated with the display, wherein the processing unit comprises an image processing module. The system further comprises a camera functionally associated with the processing unit via a communication channel for transferring images from the camera to the processing unit, the camera being configured to obtain images of a biopsy sample obtained from the body of a patient. The processing unit is configured to receive image data from an imaging modality capable of obtaining images of internal body parts of the patient not directly visible from outside the body, and to display to a user, on the display, images related to the image data. The processing unit is further configured to generate, from at least one image of the biopsy sample and using the image processing module, a processed image related to the biopsy sample, and to display the processed image on the display.
    Type: Grant
    Filed: March 13, 2014
    Date of Patent: October 9, 2018
    Assignee: UC-CARE LTD.
    Inventors: Alex Pasternak, Tomer Schatzberger, Shaike Schatzberger, Moshe Ebenstein
  • Patent number: 10095947
    Abstract: Various embodiments disclosed herein are directed to methods of capturing Vehicle Identification Numbers (VIN) from images captured by a mobile device. Capturing VIN data can be useful in several applications, for example, insurance data capture applications. There are at least two types of images supported by this technology: (1) images of documents and (2) images of non-documents.
    Type: Grant
    Filed: September 25, 2017
    Date of Patent: October 9, 2018
    Assignee: MITEK SYSTEMS, INC.
    Inventors: Grigori Nepomniachtchi, Nikolay Kotovich
  • Patent number: 10089549
    Abstract: Described is a system for estimating ego-motion of a moving camera for detection of independent moving objects in a scene. For consecutive frames in a video captured by a moving camera, a first ego-translation estimate is determined between the consecutive frames from a first local minimum. From a second local minimum, a second ego-translation estimate is determined. If the first ego-translation estimate is equivalent to the second ego-translation estimate, the second ego-translation estimate is output as the optimal solution. Otherwise, a cost function is minimized to determine an optimal translation until the first ego-translation estimate is equivalent to the second ego-translation estimate, and an optimal solution is output. Ego-motion of the camera is estimated using the optimal solution, and independent moving objects are detected in the scene.
    Type: Grant
    Filed: May 2, 2017
    Date of Patent: October 2, 2018
    Assignee: HRL Laboratories, LLC
    Inventors: Yongqiang Cao, Narayan Srinivasa
  • Patent number: 10078902
    Abstract: Described is a system for compensating ego-translations in video captured with a moving camera. Translative ego-motion is estimated on a sequence of image frames captured by a moving camera by minimizing a cost function that is based on at least one image frame difference between consecutive image frames. An alternating one-directional search is performed to minimize the cost function to find an optimal translation. The optimal translation is applied to the sequence of image frames, resulting in a sequence of image frames with the ego-translations compensated.
    Type: Grant
    Filed: August 29, 2016
    Date of Patent: September 18, 2018
    Assignee: HRL Laboratories, LLC
    Inventors: Yongqiang Cao, Narayan Srinivasa
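    Illustrative sketch (not part of the patent): a minimal coordinate-descent version of an alternating one-directional search, minimizing a mean absolute frame difference over integer shifts. The search range, the cost, and the use of wrap-around shifts (np.roll) are simplifying assumptions. The same building blocks also illustrate the ego-translation estimation in the related HRL patent above (10089549).
      import numpy as np

      def frame_diff_cost(prev, curr, dx, dy):
          shifted = np.roll(np.roll(curr, dy, axis=0), dx, axis=1)
          return float(np.abs(shifted.astype(float) - prev.astype(float)).mean())

      def estimate_translation(prev, curr, search=8, iters=4):
          dx = dy = 0
          for _ in range(iters):                    # alternate the two 1-D searches
              dx = min(range(-search, search + 1),
                       key=lambda d: frame_diff_cost(prev, curr, d, dy))
              dy = min(range(-search, search + 1),
                       key=lambda d: frame_diff_cost(prev, curr, dx, d))
          return dx, dy

      def compensate(prev, curr):
          # Apply the optimal translation so curr is aligned with prev.
          dx, dy = estimate_translation(prev, curr)
          return np.roll(np.roll(curr, dy, axis=0), dx, axis=1)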
  • Patent number: 10068134
    Abstract: Techniques and systems for identifying objects using gaze tracking techniques are described. A computing system may determine or infer that an individual is requesting to identify an object that is unknown to the individual based at least partly on images of the individual, images of a scene including the object, or both. In some cases, images of the individual may be used to determine a gaze path of the individual and the unknown object may be within the gaze path of the individual. Additionally, a computing system may send a request to identify the object to at least one individual. One or more of the responses received from the at least one individual may be provided in order to identify the object.
    Type: Grant
    Filed: May 3, 2016
    Date of Patent: September 4, 2018
    Assignee: Microsoft Technology Licensing, LLC
    Inventor: John C. Gordon
  • Patent number: 10068335
    Abstract: A moving-object counter apparatus includes a first captured-image acquisition unit, a moving-object moving-path specifying unit, a virtual-line setting unit, and a first moving-object counter unit. The first captured-image acquisition unit acquires multiple images captured at respective times different from each other by a first imaging apparatus capturing images of a predetermined region. The moving-object moving-path specifying unit specifies, on the basis of the acquired images, one or more moving paths along which one or more respective moving objects have moved in the predetermined region. The virtual-line setting unit sets a virtual line on the basis of the specified one or more moving paths. The first moving-object counter unit counts, by counting one or more moving paths that cross the set virtual line among the specified one or more moving paths, one or more moving objects that have passed through the predetermined position.
    Type: Grant
    Filed: August 29, 2016
    Date of Patent: September 4, 2018
    Assignee: FUJI XEROX CO., LTD.
    Inventors: Daisuke Ikeda, Takeshi Onishi, Masatsugu Tonoike, Jun Shingu, Yusuke Uno, Yusuke Yamaura
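    Illustrative sketch (not part of the patent): counting moving objects whose specified moving paths cross a set virtual line reduces to a segment-intersection test between each consecutive pair of path points and the line. The names and the proper-crossing-only rule are illustrative assumptions.
      def _orient(p, q, r):
          return (q[0] - p[0]) * (r[1] - p[1]) - (q[1] - p[1]) * (r[0] - p[0])

      def segments_cross(a1, a2, b1, b2):
          d1, d2 = _orient(b1, b2, a1), _orient(b1, b2, a2)
          d3, d4 = _orient(a1, a2, b1), _orient(a1, a2, b2)
          return d1 * d2 < 0 and d3 * d4 < 0       # count proper crossings only

      def count_crossings(paths, line_start, line_end):
          # paths: one list of (x, y) points per specified moving path.
          return sum(
              any(segments_cross(p, q, line_start, line_end)
                  for p, q in zip(path, path[1:]))
              for path in paths)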
  • Patent number: 10062083
    Abstract: A scalable system provides a means for a brand manager, marketer, consultant, or researcher to identify, monitor, measure, and rank the propagation of a brand's digital imagery across the web, including the social web. The system is configured to implement a novel process in which digital image files obtained from social networks that are perceptually similar (i.e., appear identical to the human visual system), but whose digital representations differ, are identified, and data associated with the image files is clustered into groups, each group representing a common single piece of content that originated from the user. The system enables a user to access and organize the clusters of brand image data to measure and track the engagement of users on the social network with that brand image content, thereby providing measurable statistics for the user.
    Type: Grant
    Filed: March 10, 2014
    Date of Patent: August 28, 2018
    Assignee: Curalate, Inc.
    Inventors: Nicholas Aaron Shiftan, Apu Sorabh Gupta, Brendan William Lowry, Louis Kratz, III
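    Illustrative sketch (not part of the patent): a common way to group files that are perceptually similar but byte-wise different is a perceptual hash, e.g., a difference hash (dHash), followed by clustering on Hamming distance. The hash choice, block-average downsampling, and distance threshold below are assumptions, not the patented process.
      import numpy as np

      def dhash(gray, hash_size=8):
          # gray: 2-D grayscale array. Block-average down to (hash_size, hash_size+1),
          # then encode the sign of each horizontal gradient as one bit.
          h, w = gray.shape
          rows = np.linspace(0, h, hash_size + 1, dtype=int)
          cols = np.linspace(0, w, hash_size + 2, dtype=int)
          small = np.array([[gray[rows[i]:rows[i + 1], cols[j]:cols[j + 1]].mean()
                             for j in range(hash_size + 1)]
                            for i in range(hash_size)])
          bits = (small[:, 1:] > small[:, :-1]).ravel()
          return int("".join("1" if b else "0" for b in bits), 2)

      def cluster_by_hash(hashes, max_hamming=4):
          # Greedily group images whose hashes differ in at most max_hamming bits.
          clusters = []
          for idx, h in enumerate(hashes):
              for c in clusters:
                  if bin(h ^ hashes[c[0]]).count("1") <= max_hamming:
                      c.append(idx)
                      break
              else:
                  clusters.append([idx])
          return clusters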
  • Patent number: 10055859
    Abstract: The invention relates to a CT imaging apparatus and a method for generating sectional images of an object such as a patient on a patient table. According to one embodiment, first projections (P) are generated along a first helical scanning path (Tr1) of a first X-ray source according to a sparse angular sampling scheme. Additional projections (Q1, Q2, R1) may dynamically be introduced along said first helical scanning path (Tr1) and/or along a second helical scanning path (Tr2) of an additional X-ray source based on the evaluation of previous projections (P1).
    Type: Grant
    Filed: June 16, 2015
    Date of Patent: August 21, 2018
    Assignee: KONINKLIJKE PHILIPS N.V.
    Inventors: Roland Proksa, Michael Grass, Thomas Koehler
  • Patent number: 10055852
    Abstract: Various aspects of a system and a method for detection of objects in motion are disclosed herein. In accordance with an embodiment, the system includes an electronic device, which is configured to compute a first sensor offset for a current frame based on a first motion vector and a second motion vector. A validation of the first motion vector is determined based on the second motion vector and one or more criteria. An object in motion is extracted from the current frame based on the first sensor offset of the current frame and the determined validation of the first motion vector.
    Type: Grant
    Filed: August 16, 2016
    Date of Patent: August 21, 2018
    Assignee: SONY CORPORATION
    Inventor: Junji Shimada
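    Illustrative sketch (not part of the patent; the abstract does not spell out the validation criteria): one plausible reading is that an image-derived motion vector is validated against a sensor-derived one, and the resulting offset is used to align consecutive frames so the object in motion stands out in the frame difference. All names, thresholds, and the wrap-around shift are assumptions.
      import numpy as np

      def validate(mv_image, mv_sensor, max_deviation=3.0):
          # Example criterion: the two motion vectors must agree within a few pixels.
          return float(np.linalg.norm(np.subtract(mv_image, mv_sensor))) <= max_deviation

      def extract_moving_object(prev, curr, mv_image, mv_sensor, diff_threshold=25):
          if not validate(mv_image, mv_sensor):
              return None                          # offset deemed unreliable for this frame
          dx, dy = (int(round(v)) for v in mv_image)
          aligned = np.roll(np.roll(curr, -dy, axis=0), -dx, axis=1)
          diff = np.abs(aligned.astype(int) - prev.astype(int))
          return diff > diff_threshold             # boolean mask of the object in motion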
  • Patent number: 10049457
    Abstract: A system and method are described for automating the analysis of cephalometric x-rays. Included in the analysis is a method for automatic anatomical landmark localization based on convolutional neural networks. In an aspect, the system and method employ a deep database of images and/or prior image analysis results so as to improve the outcome from the present automated landmark detection scheme.
    Type: Grant
    Filed: August 29, 2016
    Date of Patent: August 14, 2018
    Assignee: CephX Technologies Ltd.
    Inventors: Zeev Abraham, Daniel Abraham
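    Illustrative sketch (not part of the patent): a minimal PyTorch example of convolutional landmark localization, where the network predicts one heatmap per landmark and each landmark is read off as the heatmap argmax. The architecture, the 19-landmark count (typical of cephalometric analyses), and the input size are assumptions.
      import torch
      import torch.nn as nn

      class LandmarkNet(nn.Module):
          def __init__(self, n_landmarks=19):
              super().__init__()
              self.features = nn.Sequential(
                  nn.Conv2d(1, 32, 3, padding=1), nn.ReLU(),
                  nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(),
                  nn.Conv2d(64, n_landmarks, 1))   # one heatmap per landmark
          def forward(self, x):
              return self.features(x)

      def heatmaps_to_points(heatmaps):
          # (batch, n_landmarks, H, W) -> (batch, n_landmarks, 2) pixel coordinates.
          b, k, h, w = heatmaps.shape
          flat = heatmaps.reshape(b, k, -1).argmax(dim=-1)
          return torch.stack((flat % w, flat // w), dim=-1)   # (x, y)

      # Usage on a single-channel 256x256 cephalometric x-ray tensor:
      points = heatmaps_to_points(LandmarkNet()(torch.rand(1, 1, 256, 256)))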
  • Patent number: 10043056
    Abstract: The present invention provides a computer implemented method, a system, and a computer program product for verifying a writing of a user. In an exemplary embodiment, the present invention includes, in response to receiving a writing on a pressure sensing touchpad logically coupled to a computer system, recording a position and a pressure of one or more points of the writing via a pressure sensing touchscreen, executing a set of logical operations normalizing the writing, comparing the normalized writing to one or more stored writing parameters, executing a set of logical operations determining the normalized writing is within a tolerance of writing parameter deviation limits, thereby verifying the writing, and, in response to determining the writing is within the tolerance of writing parameter deviation limits, storing, by the computer system, a value indicating that the writing is valid.
    Type: Grant
    Filed: February 16, 2018
    Date of Patent: August 7, 2018
    Assignee: International Business Machines Corporation
    Inventors: Nicholas G. Danyluk, Eli M. Dow, Kavita Sehgal, Diane M. Stamboni, Sneha M. Varghese, John S. Werner, Sarah Wu
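    Illustrative sketch (not part of the patent): a minimal normalization-and-tolerance check. The particular normalization (translation and scale removal), the chosen writing parameters, and the tolerance vector are illustrative assumptions; the abstract does not enumerate them.
      import numpy as np

      def normalize_writing(points, pressures):
          # points: Nx2 array of recorded (x, y); pressures: length-N array.
          pts = points - points.mean(axis=0)                  # remove position
          scale = np.linalg.norm(pts, axis=1).max() or 1.0    # remove size
          return pts / scale, pressures / (pressures.max() or 1.0)

      def writing_parameters(pts, pr):
          # Example parameters: normalized extents plus pressure statistics.
          return np.array([np.ptp(pts[:, 0]), np.ptp(pts[:, 1]), pr.mean(), pr.std()])

      def verify(points, pressures, stored_params, tolerances):
          params = writing_parameters(*normalize_writing(points, pressures))
          return bool(np.all(np.abs(params - stored_params) <= tolerances))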
  • Patent number: 10043076
    Abstract: The described positional awareness techniques employ visual-inertial sensory data gathering and analysis hardware and, with reference to specific example implementations, implement improvements in the use of sensors, techniques, and hardware design that enable specific embodiments to provide positional awareness to machines with improved speed and accuracy.
    Type: Grant
    Filed: August 29, 2016
    Date of Patent: August 7, 2018
    Assignee: PerceptIn, Inc.
    Inventors: Zhe Zhang, Grace Tsai, Shaoshan Liu
  • Patent number: 10037616
    Abstract: Provided is an apparatus for reconstructing an image using a microwave. The apparatus includes: a microwave measurement unit configured to obtain a microwave measurement value for a microwave measurement object; and an image reconstruction unit configured to perform an image reconstruction by using the microwave measurement value and shape boundary information of the object.
    Type: Grant
    Filed: August 29, 2016
    Date of Patent: July 31, 2018
    Assignee: ELECTRONICS AND TELECOMMUNICATIONS RESEARCH INSTITUTE
    Inventors: Bo Ra Kim, Seong Ho Son, Simonov Nikolai, Soon Ik Jeon
  • Patent number: 10028646
    Abstract: Computerized information acquisition and processing apparatus for use in endoscopic analysis. In one embodiment, the apparatus includes a swallowable apparatus having a digital processor apparatus and video apparatus with image capture and digitization capability, and a wireless interface for accomplishing various purposes, including e.g., transferring the digital image data to a second computerized apparatus having various algorithmic analysis functions operative thereon, including for example pixel intensity analysis, shape recognition, and preview functions, so as to enhance efficiency of the review process. In another embodiment, portions of the algorithmic analysis (and other functions) are performed by the digital processor apparatus on the swallowable apparatus, including selective deletion or omission of data prior to transmission.
    Type: Grant
    Filed: February 14, 2018
    Date of Patent: July 24, 2018
    Assignee: West View Research, LLC
    Inventor: Robert F. Gazdzinski
  • Patent number: 10032276
    Abstract: The described positional awareness techniques employ visual-inertial sensory data gathering and analysis hardware and, with reference to specific example implementations, implement improvements in the use of sensors, techniques, and hardware design that enable specific embodiments to provide positional awareness to machines with improved speed and accuracy.
    Type: Grant
    Filed: August 29, 2016
    Date of Patent: July 24, 2018
    Assignee: PerceptIn, Inc.
    Inventors: Shaoshan Liu, Zhe Zhang, Grace Tsai
  • Patent number: 10032074
    Abstract: A system mounted within eyewear or headwear to unobtrusively produce and track reference locations on the surface of one or both eyes of an observer is provided to improve the accuracy of gaze tracking. The system utilizes multiple illumination sources and/or multiple cameras to generate and observe glints from multiple directions. The use of multiple illumination sources and cameras can compensate for the complex, three-dimensional geometry of the head and the significant anatomical variations of the head and eye region that occurs among individuals. The system continuously tracks the initial placement and any slippage of eyewear or headwear. In addition, the use of multiple illumination sources and cameras can maintain high-precision, dynamic eye tracking as an eye moves through its full physiological range.
    Type: Grant
    Filed: July 11, 2016
    Date of Patent: July 24, 2018
    Assignee: Google LLC
    Inventors: Nelson G. Publicover, William C. Torch, Christopher N. Spitler
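    Illustrative sketch (not part of the patent): glints from each illumination source appear as small saturated blobs in each camera image, so a simple detector thresholds the image and returns the centroid of every bright blob. The threshold, minimum blob size, and use of SciPy labelling are assumptions.
      import numpy as np
      from scipy import ndimage

      def glint_centroids(gray, threshold=240, min_pixels=3):
          # gray: 2-D uint8 eye image from one camera; returns one (y, x) per glint.
          labels, n = ndimage.label(gray >= threshold)
          centroids = []
          for i in range(1, n + 1):
              ys, xs = np.nonzero(labels == i)
              if ys.size >= min_pixels:
                  centroids.append((float(ys.mean()), float(xs.mean())))
          return centroids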