For Television Cameras (epo) Patents (Class 348/E17.002)
-
Patent number: 11912513
Abstract: Systems, methods, and computer-readable media are disclosed for robotic system camera calibration and localization using robot-mounted registered patterns. In one embodiment, an example robotic system may include a robotic manipulator, and a picking assembly coupled to the robotic manipulator, where the picking assembly is configured to grasp and release items, and where the picking assembly includes a housing having a first flat surface. Example robotic systems may include a first calibration pattern disposed on the first flat surface of the housing, a first camera configured to image the first calibration pattern, and a controller configured to calibrate the robotic system.
Type: Grant
Filed: April 14, 2021
Date of Patent: February 27, 2024
Assignee: Amazon Technologies, Inc.
Inventors: Felipe De Arruda Camargo Polido, Sanjeev Khanna, Steven Alexander Viola
-
Patent number: 11871098
Abstract: Methods and apparatus are disclosed for selecting a lens control mode for a camera in external magnetic fields or other types of non-magnetic interference. In embodiments, when the camera is activated, a test of a position sensor for the camera lens is performed by moving the camera lens through a range of positions and collecting values from the sensor. In embodiments, the sensor readings are analyzed to determine conditions such as (a) whether the sensor is saturated by an external magnetic field or non-magnetic interference, (b) whether the sensor's readings are within an error margin, and (c) whether a computed position offset for the sensor is valid. Based on the analysis, the camera is placed into a first control mode where movement of the lens is controlled using the position sensor, or a second control mode where lens movement is controlled without the position sensor.
Type: Grant
Filed: May 21, 2021
Date of Patent: January 9, 2024
Assignee: Apple Inc.
Inventors: Abhishek Dhanda, Andrew D. Fernandez, Arathi S. Bale, David C. Beard, Santiago Alban
-
Patent number: 11838698
Abstract: A display apparatus includes a display unit, an acquiring unit configured to acquire a captured image generated by capturing a displayed image, which is displayed by the display unit, by an image pickup apparatus, and a control unit configured to cause the display unit to display the captured image together with or instead of the displayed image. The control unit causes the display unit to stop displaying the captured image when the image pickup apparatus captures the displayed image.
Type: Grant
Filed: May 3, 2022
Date of Patent: December 5, 2023
Assignee: CANON KABUSHIKI KAISHA
Inventor: Hirofumi Kuroda
-
Patent number: 11698641
Abstract: Systems and methods are provided for controlling a vehicle. In one embodiment, a method includes: initiating, by a controller onboard the vehicle, a first laser pulse from a first laser device; initiating, by the controller onboard the vehicle, a second laser pulse from a second laser device, wherein the initiating of the second laser pulse is based on a phase shift angle; receiving, by the controller onboard the vehicle, first return data and second return data as a result of the first laser pulse and the second laser pulse; interleaving, by the controller onboard the vehicle, the first return data and the second return data to form a point cloud; and controlling, by the controller onboard the vehicle, the vehicle based on the point cloud.
Type: Grant
Filed: April 26, 2019
Date of Patent: July 11, 2023
Assignee: GM GLOBAL TECHNOLOGY OPERATIONS LLC
Inventors: Brian J. Hufnagel, Robert D. Sims, III, Nathaniel W. Hart
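The interleaving step above — merging two phase-shifted pulse trains into one time-ordered point cloud — can be sketched as follows. This is a minimal illustration only; the array layout (timestamp in column 0, coordinates after it) is an assumption, not something the patent specifies.

```python
import numpy as np

def interleave_returns(returns_a, returns_b):
    """Merge two lidar return sets (each an (N, 4) array of
    [t, x, y, z] rows) into a single point cloud ordered by time.
    Assumed layout: column 0 is the timestamp."""
    merged = np.vstack([returns_a, returns_b])
    return merged[np.argsort(merged[:, 0], kind="stable")]

# Two phase-shifted pulse trains: B fires half a period after A.
a = np.array([[0.0, 1, 0, 0], [1.0, 2, 0, 0]])
b = np.array([[0.5, 3, 0, 0], [1.5, 4, 0, 0]])
cloud = interleave_returns(a, b)
# Rows now alternate A, B, A, B in time order.
```

Because the phase shift offsets the second device's samples between the first device's samples, the merged cloud has twice the effective temporal density of either source alone.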
-
Patent number: 11380014
Abstract: A control module comprises: a first interface to a camera provided in a vehicle; a second interface to a serializer for communication with an electronic control unit of the vehicle; and a processing unit configured to process input data received via the first interface or via the second interface to obtain output data, and to output the output data via the first interface or via the second interface.
Type: Grant
Filed: February 17, 2021
Date of Patent: July 5, 2022
Assignee: Aptiv Technologies Limited
Inventor: Waldemar Dworakowski
-
Patent number: 11321872
Abstract: Methods, systems, and techniques for automatic camera calibration. One or more calibration images are captured using a camera. The calibration images depict one or more bounding boxes, and each of the bounding boxes bounds a person. For each of the bounding boxes, the person is modeled using a rectangle or a parallelepiped, and a projection of the rectangle or parallelepiped is determined. A mapping that maps foot vertices of the projection to head vertices of the projection is determined, and using the foot vertices and the mapping, estimates of the head vertices and distances between the head vertices and the estimates of the head vertices are determined. The camera is calibrated by iteratively updating, using an objective function, the camera parameters so as to reduce those distances.
Type: Grant
Filed: May 20, 2020
Date of Patent: May 3, 2022
Assignee: AVIGILON CORPORATION
Inventors: Aleksey Lipchin, Sergey Veselkov
-
Patent number: 11218692
Abstract: A method for testing and/or calibrating a camera is provided. An optical test standard is produced. A plurality of subareas, with different grey tones, are applied to a substrate. At least one reference image of the substrate is captured using a reference camera, using a predetermined setting comprising at least one parameter. Reference greyscale values for the subareas are determined from the reference image and assigned to the substrate. At least one image of the optical test standard is captured by the camera using a predetermined setting of the camera, the setting comprising at least one parameter. Actual greyscale values for the subareas are determined from this image. The calibration of the camera is judged sufficient or insufficient based on a comparison of the actual greyscale values with the reference greyscale values or with a predetermined range of permissible actual greyscale values.
Type: Grant
Filed: December 20, 2017
Date of Patent: January 4, 2022
Assignee: WIPOTEC GMBH
Inventors: Stefan Schulz, Rene Elspass
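The final comparison step — actual greyscale values per subarea against reference values — reduces to a per-patch tolerance check. A minimal sketch, where the tolerance band is an assumption (the patent only speaks of a "predetermined range"):

```python
import numpy as np

def check_calibration(actual, reference, tolerance=2.0):
    """Compare per-subarea grey values from the camera under test
    against reference-camera values; return (ok, worst_deviation).
    The tolerance of 2 grey levels is an assumed acceptance band."""
    actual = np.asarray(actual, dtype=float)
    reference = np.asarray(reference, dtype=float)
    deviation = np.abs(actual - reference)
    return bool(np.all(deviation <= tolerance)), float(deviation.max())

# Four grey patches, measured twice.
ok, worst = check_calibration([10, 50, 128, 200], [11, 49, 127, 201])
# ok: every patch is within 2 grey levels (worst deviation 1.0)
bad, worst2 = check_calibration([10, 50, 140, 200], [11, 49, 127, 201])
# bad: the third patch deviates by 13 grey levels
```

Reporting the worst deviation alongside the pass/fail flag makes it easy to see how close a camera is to the acceptance boundary.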
-
Patent number: 10915782
Abstract: An image parameter calculating method comprising: (a) transforming a spatial domain target image to a frequency domain target image; (b) multiplying the frequency domain target image with a frequency domain reference image to acquire a frequency domain multiplying result; and (c) calculating at least one peak location of the spatial domain target image according to the frequency domain multiplying result.
Type: Grant
Filed: December 14, 2017
Date of Patent: February 9, 2021
Assignee: PixArt Imaging Inc.
Inventors: Kok Sing Yap, Tong Sen Liew
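Steps (a)–(c) describe correlation via the frequency domain: multiply the two spectra, invert the transform, and locate the peak. A sketch of that pipeline; the abstract leaves open whether the product is normalised (as in phase correlation), so plain cross-correlation is used here:

```python
import numpy as np

def correlation_peak(target, reference):
    """Cross-correlate two images via the frequency domain and
    return the peak location, i.e. the circular shift of `target`
    relative to `reference`."""
    # (a)/(b): transform both images and multiply the spectra
    # (conjugating the reference gives correlation, not convolution).
    product = np.fft.fft2(target) * np.conj(np.fft.fft2(reference))
    # (c): invert and locate the peak of the correlation surface.
    corr = np.fft.ifft2(product)
    peak = np.unravel_index(np.argmax(np.abs(corr)), corr.shape)
    return tuple(int(i) for i in peak)

ref = np.zeros((8, 8))
ref[2, 3] = 1.0                                       # one bright feature
target = np.roll(np.roll(ref, 2, axis=0), 1, axis=1)  # shift down 2, right 1
shift = correlation_peak(target, ref)
# shift == (2, 1)
```

Computing the correlation this way costs O(N log N) in the image size, versus O(N²) for direct spatial correlation, which is why the frequency-domain route is standard for displacement estimation.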
-
Patent number: 10789465
Abstract: In a feature extraction and pattern matching system, image sharpening can enable vascular point detection (VPD) for detecting points of interest from visible vasculature of the eye. Pattern Histograms of Extended Multi-Radii Local Binary Patterns and/or Pattern Histograms of Extended Multi-Radii Center Symmetric Local Binary Patterns can provide description of portions of images surrounding a point of interest, and enrollment and verification templates can be generated using points detected via VPD and the corresponding descriptors. Inlier point pairs can be selected from the enrollment and verification templates, and a first match score indicating similarity of the two templates can be computed based on the number of inlier point pairs and one or more parameters of a transform selected by the inlier detection. A second match score can be computed by applying the selected transform, and either or both scores can be used to authenticate the user.
Type: Grant
Filed: February 4, 2020
Date of Patent: September 29, 2020
Assignee: EyeVerify Inc.
Inventors: Vikas Gottemukkula, Reza R. Derakhshani, Sashi K. Saripalle
-
Patent number: 10726260
Abstract: In a feature extraction and pattern matching system, image sharpening can enable vascular point detection (VPD) for detecting points of interest from visible vasculature of the eye. Pattern Histograms of Extended Multi-Radii Local Binary Patterns and/or Pattern Histograms of Extended Multi-Radii Center Symmetric Local Binary Patterns can provide description of portions of images surrounding a point of interest, and enrollment and verification templates can be generated using points detected via VPD and the corresponding descriptors. Inlier point pairs can be selected from the enrollment and verification templates, and a first match score indicating similarity of the two templates can be computed based on the number of inlier point pairs and one or more parameters of a transform selected by the inlier detection. A second match score can be computed by applying the selected transform, and either or both scores can be used to authenticate the user.
Type: Grant
Filed: February 4, 2020
Date of Patent: July 28, 2020
Assignee: EyeVerify Inc.
Inventors: Vikas Gottemukkula, Reza R. Derakhshani, Sashi K. Saripalle
-
Patent number: 10725884
Abstract: A method and computer device for storage and retrieval of a data object on a storage medium. The method includes steps of disassembling the data object into a predetermined number of redundant sub blocks, storing the redundant sub blocks on the storage medium, retrieving at least a predetermined multiple of a minimal spreading requirement of the redundant sub blocks from the storage medium, and assembling the data object from any combination of a particular number of the redundant sub blocks, the particular number corresponding to a predetermined multiple of a minimal spreading requirement. The computer device includes modules for performing the steps.
Type: Grant
Filed: December 13, 2017
Date of Patent: July 28, 2020
Assignee: Western Digital Technologies, Inc.
Inventors: Frederik De Schrijver, Romain Raymond Agnes Slootmaekers, Bastiaan Stougie, Joost Yervante Damad, Wim De Wispelaere, Wouter Van Eetvelde, Bart De Vylder
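The disassemble/assemble cycle above is the classic erasure-coding pattern: any sufficiently large subset of redundant sub-blocks reconstructs the object. A toy stand-in using a single XOR parity block (so any k of k+1 blocks suffice); real systems of this kind use stronger codes that tolerate multiple losses, and nothing here is taken from the patent's actual scheme:

```python
from functools import reduce

def disassemble(data: bytes, k: int):
    """Split `data` into k equal sub-blocks plus one XOR parity
    block (k+1 redundant sub-blocks in total)."""
    assert len(data) % k == 0
    size = len(data) // k
    blocks = [data[i * size:(i + 1) * size] for i in range(k)]
    # Parity byte j is the XOR of byte j across all data blocks.
    parity = bytes(reduce(lambda a, b: a ^ b, col) for col in zip(*blocks))
    return blocks + [parity]

def assemble(blocks, k):
    """Rebuild the object from any k of the k+1 blocks; a missing
    block (marked None) is recovered as the XOR of the survivors."""
    missing = [i for i, b in enumerate(blocks) if b is None]
    assert len(missing) <= 1, "single-parity toy tolerates one loss"
    if missing:
        survivors = [b for b in blocks if b is not None]
        blocks[missing[0]] = bytes(
            reduce(lambda a, b: a ^ b, col) for col in zip(*survivors))
    return b"".join(blocks[:k])

parts = disassemble(b"abcdef", 3)   # three 2-byte blocks + parity
parts[1] = None                     # lose one data block
restored = assemble(parts, 3)
# restored == b"abcdef"
```

The "minimal spreading requirement" in the abstract corresponds to k here: any combination of k surviving blocks is enough to reassemble the object.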
-
Patent number: 10664700
Abstract: In a feature extraction and pattern matching system, image sharpening can enable vascular point detection (VPD) for detecting points of interest from visible vasculature of the eye. Pattern Histograms of Extended Multi-Radii Local Binary Patterns and/or Pattern Histograms of Extended Multi-Radii Center Symmetric Local Binary Patterns can provide description of portions of images surrounding a point of interest, and enrollment and verification templates can be generated using points detected via VPD and the corresponding descriptors. Inlier point pairs can be selected from the enrollment and verification templates, and a first match score indicating similarity of the two templates can be computed based on the number of inlier point pairs and one or more parameters of a transform selected by the inlier detection. A second match score can be computed by applying the selected transform, and either or both scores can be used to authenticate the user.
Type: Grant
Filed: February 27, 2018
Date of Patent: May 26, 2020
Assignee: EyeVerify Inc.
Inventors: Vikas Gottemukkula, Reza R. Derakhshani, Sashi K. Saripalle
-
Patent number: 9875390
Abstract: A method of recognizing an object includes controlling an event-based vision sensor to perform sampling in a first mode and to output first event signals based on the sampling in the first mode, determining whether object recognition is to be performed based on the first event signals, controlling the event-based vision sensor to perform sampling in a second mode and to output second event signals based on the sampling in the second mode in response to the determining indicating that the object recognition is to be performed, and performing the object recognition based on the second event signals.
Type: Grant
Filed: May 18, 2015
Date of Patent: January 23, 2018
Assignee: SAMSUNG ELECTRONICS CO., LTD.
Inventors: Kyoobin Lee, Keun Joo Park, Eric Hyunsurk Ryu, Jun Haeng Lee
-
Patent number: 8766808
Abstract: An imager array may be provided as part of an imaging system. The imager array may include a plurality of sensor arrays (e.g., also referred to as lenslets or optical elements). Each sensor array may include a plurality of sensors (e.g., pixels) associated with a lens. The sensor arrays may be oriented, for example, substantially in a plane facing the same direction and configured to detect images from the same scene (e.g., target area). Such images may be processed in accordance with various techniques to provide images of electromagnetic radiation. The sensor arrays may include filters or lens coatings to selectively detect desired ranges of electromagnetic radiation. Such arrangements of sensor arrays in an imager array may be used to advantageous effect in a variety of different applications.
Type: Grant
Filed: March 8, 2011
Date of Patent: July 1, 2014
Assignee: FLIR Systems, Inc.
Inventor: Nicholas Hogasten
-
Publication number: 20140125818
Abstract: A method for calibrating a camera and a display monitor is provided. The method includes identifying a parameter for optimization, assigning to a test color a target color relevant to the parameter, repeatedly performing, two or more times, a set of steps, determining a direction and timing of color divergence for the target color from obtained images, and adjusting the parameter based on the direction and rate of color divergence for the target color. The set of steps includes instructing the display monitor to display the test color on a portion of the display monitor, obtaining an image captured by the camera while the display is executing the instruction, and reassigning, to the test color, a color obtained from a portion of the image in which the portion of the display monitor was captured. The obtained image includes the portion of the display monitor.
Type: Application
Filed: November 6, 2012
Publication date: May 8, 2014
Applicant: CATERPILLAR INC.
Inventor: Paul Russell Friend
-
Publication number: 20140104437
Abstract: Disclosed are systems, apparatus, devices, methods, computer program products, and other implementations, including a method that includes capturing an image of a scene by an image capturing unit of a device that includes at least one sensor, determining relative device orientation of the device based, at least in part, on determined location of at least one vanishing point in the captured image of the scene, and performing one or more calibration operations for the at least one sensor based, at least in part, on the determined relative device orientation.
Type: Application
Filed: November 7, 2012
Publication date: April 17, 2014
Applicant: QUALCOMM INCORPORATED
Inventors: Hui CHAO, Sameera PODURI, Saumitra Mohan DAS, Ayman Fawzy NAGUIB, Faraz Mohammad MIRZAEI
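The vanishing-point step above requires intersecting the image projections of parallel scene lines. A common estimator, offered here as one possible realisation (the publication does not commit to a particular algorithm), is the least-squares intersection of the detected line segments:

```python
import numpy as np

def vanishing_point(points, directions):
    """Least-squares intersection of 2D image lines, each given by
    a point on the line and a direction vector. Minimises the sum
    of squared perpendicular distances from the candidate vanishing
    point to every line."""
    A = np.zeros((2, 2))
    b = np.zeros(2)
    for p, d in zip(points, directions):
        d = np.asarray(d, float)
        d = d / np.linalg.norm(d)
        # Projector onto the line's normal: the residual it produces
        # is the perpendicular offset of a point from the line.
        P = np.eye(2) - np.outer(d, d)
        A += P
        b += P @ np.asarray(p, float)
    return np.linalg.solve(A, b)

# Two lines that meet at the origin: the x-axis and the diagonal y = x.
vp = vanishing_point([(1, 0), (2, 2)], [(1, 0), (1, 1)])
# vp is (0, 0) up to floating-point error
```

With noisy real detections one would feed in many segments per direction family and possibly weight by segment length, but the normal-equation structure stays the same.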
-
Patent number: 8698901
Abstract: An automatic calibration method for a projector-camera system including a semi-transparent screen is disclosed herein. An image sequence is caused to be captured from the semi-transparent screen and through the semi-transparent screen while a calibration pattern having features is displayed and not displayed in an alternating succession on the semi-transparent screen. A temporal correlation image is created from the image sequence and a discrete binary signal. Peaks are identified in a spatial cross correlation image generated from the temporal correlation image, where a pattern of the identified peaks corresponds to a pattern of the features in the calibration pattern. The peaks are transformed to coordinates of corrected feature points. A comparison of the corrected feature points and a ground truth set of coordinates for the features is used to determine whether the projector-camera system is calibrated.
Type: Grant
Filed: April 19, 2012
Date of Patent: April 15, 2014
Assignee: Hewlett-Packard Development Company, L.P.
Inventor: Wei Hong
-
Publication number: 20140071281
Abstract: A method for determining a response to misalignment of a camera monitoring a desired area includes acquiring temporal related frames from the camera including a reference frame. A pixel location is determined of a reference object from the frames. Using the pixel location of the reference object, a displacement of the camera between a current frame and the reference frame is determined. For the displacement exceeding a first threshold, a new displacement of the camera is measured by introducing at least one additional object to a camera field of view and comparing the new displacement to a second threshold. For the new displacement not exceeding the second threshold, the camera is recalibrated using a determined pixel location and a physical location of the at least one additional object. For the new displacement exceeding the second threshold, notification is provided of a misalignment to an associated user device.
Type: Application
Filed: September 12, 2012
Publication date: March 13, 2014
Applicant: XEROX CORPORATION
Inventors: Wencheng Wu, Edul N. Dalal
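The decision chain in this abstract (ignore small drift, recalibrate on moderate drift, notify on large drift) can be sketched as a small dispatch function. The threshold values and callback names here are placeholders, not taken from the publication:

```python
def misalignment_response(displacement, new_displacement_fn,
                          recalibrate_fn, notify_fn,
                          t1=5.0, t2=20.0):
    """Sketch of the two-threshold response: below t1 do nothing;
    above t1, re-measure with an added reference object; if the new
    displacement stays below t2 recalibrate, otherwise notify.
    t1/t2 are illustrative pixel values."""
    if displacement <= t1:
        return "ok"
    new_disp = new_displacement_fn()  # re-measure using the added object
    if new_disp <= t2:
        recalibrate_fn()
        return "recalibrated"
    notify_fn()
    return "notified"

log = []
result = misalignment_response(
    12.0,                                   # moderate drift: exceeds t1
    new_displacement_fn=lambda: 8.0,        # re-measured drift below t2
    recalibrate_fn=lambda: log.append("recal"),
    notify_fn=lambda: log.append("notify"))
# result == "recalibrated"; only the recalibration callback fired
```

Keeping the measurement and the responses as injected callables makes the escalation policy itself trivial to unit-test, which matters for a monitoring pipeline that runs unattended.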
-
Publication number: 20140063265
Abstract: An electronic device may include a camera module. Control circuitry within the electronic device may use an image sensor within the camera module to acquire digital images. The camera module may have lens structures that are supported by lens support structures such as a lens barrel and lens carrier. An actuator such as a voice coil motor may control the position of the lens support structures relative to internal support structures such as upper and lower spacer members. Springs may be used to couple the lens support structures to the internal support structures. Outer wall structures in the camera module such as ferromagnetic shield structures may surround and enclose at least some of the internal support structures. The outer wall structures may have openings. The internal support structures may have pins or other alignment structures that protrude through the openings.
Type: Application
Filed: September 6, 2012
Publication date: March 6, 2014
Inventors: Ashutosh Y. Shukla, Kenta K. Williams, Shashikant G. Hegde, Tang Yew Tan, David A. Pakula
-
Publication number: 20140055591
Abstract: Embodiments are disclosed that relate to calibrating an eye tracking system for a computing device. For example, one disclosed embodiment provides, in a computing device comprising a gaze estimation system, a method of calibrating the gaze estimation system. The method includes receiving a request to log a user onto the computing device, outputting a passcode entry display image to a display device, receiving image data from one or more eye tracking cameras, and from the image data, determining a gaze scanpath representing a path of a user's gaze on the passcode entry display image. The method further includes comparing the gaze scanpath to a stored scanpath for the user, and calibrating the gaze estimation system based upon a result of comparing the gaze scanpath to the stored scanpath for the user.
Type: Application
Filed: August 24, 2012
Publication date: February 27, 2014
Inventor: Sagi Katz
-
Patent number: 8654873
Abstract: In one embodiment, a Television (TV) receiver to perform a method of synchronizing a demodulator at a Viterbi decode input in the TV receiver using one or more bit de-interleaved even and odd Orthogonal Frequency Division Multiplexing (OFDM) symbols is provided. The method includes (i) performing a Viterbi decoding on the bit de-interleaved even and odd OFDM symbols when a frame boundary does not exist for the bit de-interleaved even and odd OFDM symbols, (ii) performing a convolutional encoding on a decoded data output of the Viterbi decoding, (iii) determining whether an output of the convolutional encoding of the bit de-interleaved OFDM symbols matches an input at a Viterbi decode, and (iv) determining whether the output of the convolutional encoding of the bit de-interleaved even and odd OFDM symbols matches a SYNC pattern or an inverted SYNC pattern to obtain a RS packet align boundary.
Type: Grant
Filed: March 30, 2012
Date of Patent: February 18, 2014
Inventors: Gururaj Padaki, Sunil Hosur Rames, Rakesh A Joshi, Raghavendra Raichur, Rajendra Hegde
-
Publication number: 20140036095
Abstract: A method and system for determining a camera-to-display latency of an electronic device (100) having a camera (134) and touch-sensitive display (108) are disclosed. In one example embodiment, the method (500) includes receiving (511) first light (136) at the camera, and essentially simultaneously receiving second light (138) at a first photosensitive structural portion (102, 602). The method (500) further includes detecting (512) a first simulated touch input at the display (108) in response to a first actuation of the first photosensitive structural portion (102, 602), receiving third light (140) at a second photosensitive structural portion (104, 604), the third light being generated based at least indirectly upon the received first light (136), detecting (514) a second simulated touch input at the display (108) as a result of the receiving of the third light (140), and determining the camera-to-display latency based at least indirectly upon the touch inputs (516).
Type: Application
Filed: August 1, 2012
Publication date: February 6, 2014
Applicant: Motorola Mobility LLC
Inventors: John W. Kaehler, Alexander Klement, Mark F. Valentine, Sandeep Vuppu, Daniel H. Wagner
-
Publication number: 20140028819
Abstract: An imaging section images a subject to generate an image. A display displays an image. A measuring section measures the size of the subject on the basis of the image. A recording section associates calibration data of a measuring endoscope apparatus including a value obtained when the measuring section measures the size of the calibrator on the basis of an image obtained when the imaging section has imaged the calibrator, and calibration data of the calibrator including a value obtained when the calibrator is measured by a higher-rank measuring instrument of a traceability system, and a value obtained when the measuring section measures the size of a measurement object different from the calibrator on the basis of an image of the measurement object, with the image of the measurement object, and records the associated data on a recording medium.
Type: Application
Filed: October 26, 2012
Publication date: January 30, 2014
Applicant: OLYMPUS CORPORATION
Inventor: Sumito NAKANO
-
Publication number: 20140002661
Abstract: A method for detecting camera degradation and faults comprises identifying a plurality of cameras comprising a camera network, collecting at least one system metric indicative of the camera's performance, analyzing the system metrics according to at least one of a plurality of diagnostic layers comprising an individual diagnostic layer, a network diagnostic layer, and a pair diagnostic layer, and identifying a fault condition indicative of a faulty camera in the camera network according to the diagnostic layers.
Type: Application
Filed: June 29, 2012
Publication date: January 2, 2014
Applicant: XEROX CORPORATION
Inventors: Wencheng Wu, Edul N. Dalal
-
Publication number: 20130286219
Abstract: An image calibration system and an image calibration method thereof. The image calibration system includes an image capture module capturing a calibration image, first and second calibration tools comprising first and second calibration graphs respectively, and a processing module calculating first and second calibration templates in the calibration image based on the first and second calibration graphs, and comparing the calibration image and the first and second calibration templates so as to obtain at least one locating point.
Type: Application
Filed: June 8, 2012
Publication date: October 31, 2013
Applicant: ALTEK AUTOTRONICS CORP.
Inventors: Ching-Sung Yeh, Chung-Fang Chien
-
Publication number: 20130278779
Abstract: An automatic calibration method for a projector-camera system including a semi-transparent screen is disclosed herein. An image sequence is caused to be captured from the semi-transparent screen and through the semi-transparent screen while a calibration pattern having features is displayed and not displayed in an alternating succession on the semi-transparent screen. A temporal correlation image is created from the image sequence and a discrete binary signal. Peaks are identified in a spatial cross correlation image generated from the temporal correlation image, where a pattern of the identified peaks corresponds to a pattern of the features in the calibration pattern. The peaks are transformed to coordinates of corrected feature points. A comparison of the corrected feature points and a ground truth set of coordinates for the features is used to determine whether the projector-camera system is calibrated.
Type: Application
Filed: April 19, 2012
Publication date: October 24, 2013
Inventor: Wei Hong
-
Publication number: 20130258044
Abstract: A camera with multiple lenses and multiple sensors wherein each lens/sensor pair generates a sub-image of a final photograph or video. Different embodiments include: manufacturing all lenses as a single component; manufacturing all sensors as one piece of silicon; different lenses incorporate filters for different wavelengths, including IR and UV; non-circular lenses; different lenses are different focal lengths; different lenses focus at different distances; selection of sharpest sub-image; blurring of selected sub-images; different lens/sensor pairs have different exposures; selection of optimum exposure sub-images; identification of distinct objects based on distance; stereo imaging in more than one axis; and dynamic optical center-line calibration.
Type: Application
Filed: March 30, 2012
Publication date: October 3, 2013
Applicant: ZETTA RESEARCH AND DEVELOPMENT LLC - FORC SERIES
Inventor: Jonathan N. Betts-Lacroix
-
Publication number: 20130250127
Abstract: A method for checking a camera includes the steps of capturing an image of an object using a photo-sensitive element, converting the color level value of each of a plurality of pixels of the image into image gray level values, and when one of the image gray level values is higher than a predetermined gray level threshold value, displaying an alarm message on the screen of the camera. A camera is also disclosed herein.
Type: Application
Filed: August 23, 2012
Publication date: September 26, 2013
Inventor: Wen-Lung Huang
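The check described above is a colour-to-grey conversion followed by a threshold test. A minimal sketch; the 250-grey-level cut-off and the Rec. 601 luma weights are assumptions, since the abstract only says "a predetermined gray level threshold value":

```python
import numpy as np

def overexposure_alarm(image_rgb, threshold=250):
    """Convert colour values to grey levels and flag the frame when
    any pixel's grey level exceeds the threshold."""
    gray = (0.299 * image_rgb[..., 0]
            + 0.587 * image_rgb[..., 1]
            + 0.114 * image_rgb[..., 2])
    return bool(np.any(gray > threshold))

frame = np.zeros((4, 4, 3))
clean = overexposure_alarm(frame)   # no pixel exceeds the threshold
frame[0, 0] = [255, 255, 255]       # one blown-out pixel
blown = overexposure_alarm(frame)   # the alarm condition now triggers
```

In a camera firmware the returned flag would drive the on-screen alarm message; the conversion-plus-threshold core is the same.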
-
Publication number: 20130235213
Abstract: An accurate camera pose is determined by pairing a first camera with a second camera in proximity to one another, and by developing a known spatial relationship between them. An image from the first camera and an image from the second camera are analyzed to determine corresponding features in both images, and a relative homography is calculated from these corresponding features. A relative parameter, such as a focal length or an extrinsic parameter, is used to calculate a first camera's parameter based on a second camera's parameter and the relative homography.
Type: Application
Filed: March 9, 2012
Publication date: September 12, 2013
Inventors: Howard J. Kennedy, Smadar Gefen
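The "relative homography from corresponding features" step is conventionally computed with the direct linear transform (DLT). The sketch below is one way to realise that step from four or more point correspondences; the publication does not prescribe an algorithm, so treat this as an illustration:

```python
import numpy as np

def homography_dlt(src, dst):
    """Direct linear transform: estimate the 3x3 homography H with
    dst ~ H @ src from >= 4 point correspondences. Each pair yields
    two linear constraints on the 9 entries of H; the solution is
    the right singular vector for the smallest singular value."""
    rows = []
    for (x, y), (u, v) in zip(src, dst):
        rows.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        rows.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    _, _, vh = np.linalg.svd(np.asarray(rows, float))
    H = vh[-1].reshape(3, 3)
    return H / H[2, 2]  # normalise so the bottom-right entry is 1

# A pure 2x scaling between the two views: H should be diag(2, 2, 1).
src = [(0, 0), (1, 0), (0, 1), (1, 1)]
dst = [(0, 0), (2, 0), (0, 2), (2, 2)]
H = homography_dlt(src, dst)
```

With H in hand, the publication's final step of transferring a parameter (such as focal length) from the second camera to the first amounts to composing H with the second camera's known calibration.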
-
Publication number: 20130222605
Abstract: A camera module under test is signaled to capture an image of a target. A group of pixels of the image that represent portions of several objects in the target are low pass filtered and then analyzed to compute a pixel distance between different subgroups of pixels that represent portions of the different objects. The computed pixel distance is then converted into a true distance using a predetermined math relationship that relates a pixel distance variable with a true distance variable.
Type: Application
Filed: February 23, 2012
Publication date: August 29, 2013
Applicant: Apple Inc.
Inventor: Jason R. Rukes
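The final conversion step needs a "predetermined math relationship" between pixel distance and true distance. A first-order polynomial fit to calibration samples is assumed below purely for illustration (the abstract does not state the form of the relationship):

```python
import numpy as np

def fit_pixel_to_true(pixel_d, true_d):
    """Fit a linear pixel-distance -> true-distance relationship
    from calibration samples and return it as a callable."""
    slope, intercept = np.polyfit(pixel_d, true_d, 1)
    return lambda px: slope * px + intercept

# Hypothetical calibration samples: 100 px corresponds to 1 mm.
to_mm = fit_pixel_to_true([100, 200, 400], [1.0, 2.0, 4.0])
measured_px = 300
true_mm = to_mm(measured_px)
# true_mm is 3.0 mm (up to floating-point error)
```

Once fitted at manufacturing time, the same callable converts every subsequent pixel-distance measurement from the module under test.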
-
Publication number: 20130208121
Abstract: A method, system, and computer-usable tangible storage device for traffic camera diagnostics via strategic use of moving test targets are disclosed. The disclosed embodiments can comprise four modules: Moving test target management module, Moving test target detection and identification module, Image/video feature extraction module, and Sensor characterization and diagnostics module. A first test vehicle can travel periodically through traffic camera(s) of interest. The traffic camera(s) would then identify these test vehicles via matching of license plate numbers and then identify test targets in video frames through pattern matching or barcode reading. The identified test targets are then analyzed to extract image and video features that can be used for sensor characterization, sensor health assessment, and sensor diagnostics. The disclosed embodiments provide for a non-traffic-stop (i.e., non-traffic-interruption) traffic camera diagnostics.
Type: Application
Filed: February 10, 2012
Publication date: August 15, 2013
Inventors: Wencheng Wu, Martin E. Hoover
-
Publication number: 20130182080
Abstract: A method for testing a 3D camera is provided. The method includes: making a first camera of the 3D camera align with a first picture of a reference picture, wherein the reference picture includes a second picture, and the central points of the first picture and the second picture bear a first label and a second label, respectively; obtaining an image captured by a second camera of the 3D camera; identifying the first label, the second label, and a central point of the image; calculating an actual angle difference and an actual distance difference according to coordinates of the first label, the second label, and the central point; and determining whether the 3D camera is installed appropriately by comparing the actual distance difference with the reference distance difference and the actual angle difference with the reference angle difference, respectively.
Type: Application
Filed: March 27, 2012
Publication date: July 18, 2013
Applicant: HON HAI PRECISION INDUSTRY CO., LTD.
Inventors: YU-CHENG LIN, CHI-HSIANG PENG, CHIEH-WEN HSUEH
-
Publication number: 20130162813
Abstract: A sensor event assessor trainer and integrator is disclosed. In one embodiment, a sensor event assessor assesses any event detected by at least two sensors and provides assessment information about the event. A different event detection system provides an input regarding an event. A collaborator combines the assessment information with the different event detection system input to generate a user recognizable output for presenting the integrated event information.
Type: Application
Filed: December 22, 2011
Publication date: June 27, 2013
Inventor: Cory J. Stephanson
-
Publication number: 20130147968
Abstract: A testing method includes: providing a camera under test and a planar light source having a first mark, wherein the camera includes a voice coil motor (VCM) and a lens module is fixed on the VCM by glue, the VCM moves the lens module, and the VCM includes an elastic tab for limiting and restoring the movement of the lens module; taking an image of the light source using the camera during the movement of the VCM; displaying the image having the first mark; determining whether the first mark tilts using a detector; displaying a first message indicating that the tab is not stuck by glue when the detector determines the first mark does not tilt; and displaying a second message indicating that the tab is stuck by glue when the detector determines the first mark tilts.
Type: Application
Filed: August 15, 2012
Publication date: June 13, 2013
Applicant: HON HAI PRECISION INDUSTRY CO., LTD.
Inventor: YUNG-FENG LIN
-
Publication number: 20130127999
Abstract: According to one embodiment, an apparatus for calibrating a camera while a vehicle is moving includes a rearward monitoring camera mounted in a vehicle to acquire image information, a memory that stores a camera calibration program for computationally determining and calibrating mounting parameters of the camera while the vehicle is moving, using the image information acquired by the camera, a gear position detector that detects gear positions of the vehicle and generates position signals according to the gear positions, and a control unit that receives the image information and, when the gear position detector outputs a position signal for other than a reverse gear position, reads out and executes the camera calibration program stored in the memory.
Type: Application
Filed: February 29, 2012
Publication date: May 23, 2013
Applicant: Toshiba Alpine Automotive Technology Corporation
Inventor: Kiyoyuki KAWAI
-
Publication number: 20130128040Abstract: A method is provided for determining a position where a reference point should be located on a display (24) of an alignment device (20). The reference point corresponds to a target located within a region to be monitored by a camera (10) being aligned with the alignment device (20). The method includes the steps of: determining a minimum Field of View (FoV) such that the camera (10) will view a substantial entirety of the region; determining a first bearing for the camera (10), the first bearing substantially bisecting the FoV; determining a second bearing to the target; determining a difference between the first and second bearings; determining a scaling factor (A); and, determining a position where a reference point corresponding to the target should be located on the display (24) of the alignment device (20) based on the scaling factor (A) and the difference between the first and second bearings.Type: ApplicationFiled: November 23, 2011Publication date: May 23, 2013Applicant: ALCATEL-LUCENT USA INC.Inventors: Karl A. STOUGH, George P. WILKIN, Dean W. CRAIG
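Reading the steps of this abstract together, one plausible formulation is that the reference point's offset on the display is the scaling factor A times the bearing difference. This is a minimal sketch under that assumption; the units and the linear form are not confirmed by the abstract.

```python
# Assumed formulation: display offset = A * (bearing-to-target minus
# camera boresight bearing), where the boresight bisects the FoV.

def reference_point_offset(first_bearing_deg, second_bearing_deg, scale_a):
    """Offset (in display units) from the display centre to the reference point."""
    return scale_a * (second_bearing_deg - first_bearing_deg)

# Camera boresight at 90 deg, target at 100 deg, 4 display pixels per degree:
offset = reference_point_offset(90.0, 100.0, 4.0)
```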
-
Publication number: 20130107059Abstract: A noise detecting device includes an image acquiring unit and a parallel processing unit. The image acquiring unit acquires an image from a camera that captures the image of a test pattern including a plurality of lines tilted at a specific angle relative to a specific direction. The parallel processing unit sequentially extracts image signals for two lines extending in parallel in the specific direction of the image, the two parallel lines being away from each other by a specific number of lines, and performs, in a parallel manner for the image signals of the two parallel lines, processing for detecting noise generated in the image on the basis of a difference in pixel values calculated from the image signals for the two parallel lines.Type: ApplicationFiled: October 23, 2012Publication date: May 2, 2013Applicant: SONY CORPORATIONInventor: Sony Corporation
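The core comparison can be sketched as follows: take two image rows a fixed number of lines apart and flag noise where their pixel-value difference exceeds a threshold. The function name, threshold, and row-based (rather than column-based) orientation are illustrative assumptions.

```python
# Hedged sketch of the parallel-line comparison: columns where the two
# extracted lines differ by more than a threshold are flagged as noise.

def detect_noise_rows(image, row_a, row_b, threshold):
    """Return column indices where |image[row_a][c] - image[row_b][c]| > threshold."""
    noisy = []
    for col, (pa, pb) in enumerate(zip(image[row_a], image[row_b])):
        if abs(pa - pb) > threshold:
            noisy.append(col)
    return noisy

# A tilted-line test pattern makes nearby rows similar, so an impulse
# stands out in the difference.
frame = [
    [10, 10, 10, 10],
    [11, 10, 90, 10],   # impulse noise at column 2
    [10, 11, 10, 10],
]
hits = detect_noise_rows(frame, 1, 2, threshold=20)
```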
-
Publication number: 20130083204Abstract: An imager including a self test mode. The imager includes a pixel array for providing multiple pixel output signals via multiple columns; and a test switch for (a) receiving a test signal from a test generator and (b) disconnecting a pixel output signal from a column of the pixel array. The test switch provides the test signal to the column of the pixel array. The test signal includes a test voltage that replaces the pixel output signal. The test signal is digitized by an analog-to-digital converter (ADC) and provided to a processor. The processor compares the digitized test signal to an expected pixel output signal. The processor also interpolates the output signal from a corresponding pixel using adjacent pixels, when the test switch disconnects the pixel output signal from the column of the pixel array.Type: ApplicationFiled: September 29, 2011Publication date: April 4, 2013Applicant: APTINA IMAGING CORPORATIONInventors: JOHANNES SOLHUSVIK, TORE MARTINUSSEN
-
Publication number: 20130083205Abstract: A test chart can be used to test sharpness performance of an imaging system's full image field by having a sharpness inspection area formed of a plurality of identical visual elements that abut each other leaving no gaps to thereby form a mosaic. Each visual element includes a plurality of groups of differently oriented contrasting lines. The mosaic may fill an entire image captured by an imager. Thus, a test system can image the chart to objectively assess the performance of the imaging system in terms of image quality (e.g., sharpness, tilt, etc.) throughout the entire spatial area of the captured image. The size of the chart and the spatial frequency spacing of the visual element lines can be selected to test an imaging system's full field sharpness at selected spatial frequencies. The full field sharpness results more quickly and accurately determine different aspects of a given imaging system.Type: ApplicationFiled: September 30, 2011Publication date: April 4, 2013Inventors: Mark N. Gamadia, Fei Wu, Shizhe Shen, Jason Rukes
-
Publication number: 20130083168Abstract: There is provided a calibration apparatus for a camera module capable of calibrating the difference in optical characteristics between left and right images of a binocular camera module in real time by capturing images of a plurality of rotating test boards. The calibration apparatus of a camera module includes: a test unit including two or more mutually connected test boards, the test boards having images captured by a camera module and rotating at a pre-set angle; and a calibration unit receiving the images of the test boards captured by the camera module and calibrating optical characteristics thereof.Type: ApplicationFiled: December 29, 2011Publication date: April 4, 2013Applicant: SAMSUNG ELECTRO-MECHANICS CO., LTD.Inventors: Joo Hyun KIM, Jagarlamudi Veera Venkata PRASAD, Nagaraj AVINASH, Soon Seok KANG
-
Publication number: 20130076914Abstract: A method for assessing performance of an operator of an imaging system is provided. The method comprises acquiring operator identification information, and acquiring time taken by the operator for initializing the imaging system for an image acquisition, wherein initializing the imaging system comprises at least one activity, and wherein initializing the imaging system is accomplished before a subject to be imaged is exposed for imaging.Type: ApplicationFiled: September 27, 2012Publication date: March 28, 2013Applicant: GENERAL ELECTRIC COMPANYInventor: GENERAL ELECTRIC COMPANY
-
Publication number: 20130070108Abstract: A method for determining calibration data for at least two cameras (camera1, camera2) in a multi-view position, includes a step of determining respective parameters ((h100, . . . , h122), (h200, . . . , h222)) for identifying at least one respective homographic transformation on respective images (image1, image2) taken by said cameras of a same scene, by performing respective geometry analyses on said respective images (image1, image2), a step of performing at least one respective combined homographic transformation/feature detection step on said respective images thereby obtaining respective sets (feature set1, feature set2) of features on respective transformed images, such that said calibration data are obtained from matches (m1, . . . , mk) determined between said respective sets of features.Type: ApplicationFiled: March 8, 2011Publication date: March 21, 2013Inventors: Maarten Aerts, Donny Tytgat, Jean-Francois Macq, Sammy Lievens
-
Publication number: 20130063607Abstract: A camera system that transfers static image data while reducing power consumption is provided. A static image application issues a frame transfer command at a certain transfer period to a camera module. In each transfer period, the camera module wakes up from a suspend state and generates static image data. After the end of the transfer, the camera module transitions to the suspend state. The camera module is able to transition to the suspend state at each transfer period and thus is able to reduce power consumption.Type: ApplicationFiled: September 10, 2012Publication date: March 14, 2013Applicant: LENOVO (SINGAPORE) PTE. LTD.Inventors: Susumu Shimotono, Jun Sugiyama, Hideki Kashiyama
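The duty-cycled behaviour in this abstract reduces to a small state machine: wake on a transfer command, produce a frame, suspend again. The class, state names, and command interface below are assumptions sketching that cycle, not the Lenovo implementation.

```python
# Sketch of the per-period wake/transfer/suspend cycle: the module is in
# its low-power state except while servicing a frame transfer command.

class CameraModule:
    def __init__(self):
        self.state = "suspend"
        self.frames = []

    def on_transfer_command(self, frame_id):
        self.state = "active"          # wake from suspend
        self.frames.append(frame_id)   # generate and transfer static image data
        self.state = "suspend"         # return to the low-power state

cam = CameraModule()
for n in range(3):                     # three transfer periods
    cam.on_transfer_command(n)
```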
-
Publication number: 20130057706Abstract: Methods, systems and software are disclosed for automated Measurement of Video Quality parameters. The system includes a static Test Pattern provided either in form of a Test Pattern File, converted via a standard playout device (test source) into analog or digital test signal and supplied to the input of a System Under Test, or in form of a Reflectance Chart installed before the front-end device of the System Under Test, such as TV camera. The system also includes a video capture device connected to the back-end device of the System Under Test, e.g. to the output of system decoder/player. A Video Quality Analyzer processes the captured video data and generates a detailed Analysis Report.Type: ApplicationFiled: September 4, 2011Publication date: March 7, 2013Inventors: Victor Steinberg, Michael Shinsky
-
Publication number: 20130050505Abstract: Embodiments of a system and method for blur-calibration of an imaging sensor using a moving constellation are generally described herein. In some embodiments, blur-calibration of an imaging sensor includes moving a known target pattern across the field-of-view (FOV) of the imaging sensor to present the target pattern across different frames at different pixel phases. Frames of images of the moving target pattern as seen in the FOV of the imaging sensor are captured to generate image data output. The image data output may be subsequently processed to generate data products representative of a shape of a point-spread function (PSF) from a high-resolution composite image generated from the captured frames. A chopper modulation may be applied to the moving target sequence and separate chopper-open and chopper-closed composite images are created. The PSF may be determined based on the difference between the chopper-open and chopper-closed composite images.Type: ApplicationFiled: August 22, 2011Publication date: February 28, 2013Applicant: Raytheon CompanyInventor: Darin Williams
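The final step of this abstract, differencing the two composites, can be shown directly. The per-pixel subtraction and the toy image values are illustrative assumptions; the patent's composite-image construction is far more involved.

```python
# Assumed-notation sketch of the last step: the PSF estimate is the
# per-pixel difference between the chopper-open and chopper-closed
# high-resolution composite images.

def psf_estimate(open_img, closed_img):
    """Subtract the chopper-closed composite from the chopper-open one."""
    return [[o - c for o, c in zip(orow, crow)]
            for orow, crow in zip(open_img, closed_img)]

chopper_open = [[5, 9], [9, 5]]
chopper_closed = [[4, 4], [4, 4]]   # background with the target modulated out
psf = psf_estimate(chopper_open, chopper_closed)
```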
-
Publication number: 20130044223Abstract: A method for testing an image capturing device with quality parameters includes providing predetermined criteria for the variations in images and quality parameters corresponding to a specific target object, capturing a test image corresponding to the same target object by an image capturing device to be tested, and retrieving a standardized image and a test image located at the same position by a standard image unit and a test image unit, respectively, thereby comparing quality parameters of the standard image unit and the test image unit at the same position to accurately detect and determine any abnormality in the image capturing device.Type: ApplicationFiled: November 9, 2011Publication date: February 21, 2013Applicants: ASKEY COMPUTER CORPORATION, ASKEY TECHNOLOGY (JIANGSU) LTD.Inventors: Yi-Ju Chen, Ching-Feng Hsieh
-
Publication number: 20130044222Abstract: Calculating a gain setting for a primary image sensor includes receiving a test-matrix of pixels from a test image sensor, and receiving a first-frame matrix of pixels from a primary image sensor. A gain setting is calculated for the primary image sensor using the first-frame matrix of pixels except those pixels imaging one or more exclusion regions identified from the test matrix of pixels.Type: ApplicationFiled: August 18, 2011Publication date: February 21, 2013Applicant: Microsoft CorporationInventors: John Tardif, Seshagiri Panchapagesan
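The exclusion logic in this abstract can be sketched as masking out first-frame pixels that fall in the exclusion regions before computing the gain. The target level and the gain formula (target divided by the mean of the remaining pixels) are illustrative assumptions, not Microsoft's actual algorithm.

```python
# Hedged sketch: derive a gain from the first-frame pixels, skipping
# any pixel flagged as belonging to an exclusion region identified
# from the test matrix.

TARGET_LEVEL = 128  # assumed desired mean brightness

def compute_gain(first_frame, exclusion_mask):
    """Gain = target level / mean of the non-excluded pixels."""
    kept = [p for p, excluded in zip(first_frame, exclusion_mask) if not excluded]
    return TARGET_LEVEL / (sum(kept) / len(kept))

frame = [64, 64, 255, 64]           # flattened first-frame pixels
mask = [False, False, True, False]  # bright emitter excluded via the test matrix
gain = compute_gain(frame, mask)
```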
-
Publication number: 20130033608Abstract: A method for predicting whether a test image (318) is sharp or blurred includes the steps of: providing a sharpness classifier (316) that is trained to discriminate between sharp and blurred images; computing a set of sharpness features (322) for the test image (318) by (i) generating a high pass image (404) from the test image (318), (ii) generating a band pass image (406) from the test image (318), (iii) identifying textured regions (408) in the high pass image, (iv) identifying textured regions (410) in the band pass image, and (v) evaluating the identified textured regions in the high pass image and the band pass image to compute the set of test sharpness features (412); and evaluating the sharpness features using the sharpness classifier (324) to estimate if the test image (318) is sharp or blurry (20).Type: ApplicationFiled: June 24, 2010Publication date: February 7, 2013Applicant: NIKON CORPORATIONInventor: Li Hong
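A drastically simplified 1-D version of this pipeline helps show why high-pass energy in textured regions separates sharp from blurred images. The difference filter, texture threshold, and score comparison below are stand-ins; the patent uses a trained classifier over both high-pass and band-pass features.

```python
# Simplified 1-D sketch: a sharp edge concentrates energy in the
# high-pass response, while a blurred edge spreads it out.

def high_pass(signal):
    """First-difference filter as a crude high-pass stand-in."""
    return [b - a for a, b in zip(signal, signal[1:])]

def texture_energy(hp, threshold=5):
    """Sum |response| over 'textured' samples exceeding the threshold."""
    return sum(abs(v) for v in hp if abs(v) > threshold)

sharp_edge = [0, 0, 100, 100, 0, 0]
blurred_edge = [0, 25, 50, 50, 25, 0]
sharp_score = texture_energy(high_pass(sharp_edge))
blur_score = texture_energy(high_pass(blurred_edge))
is_sharp = sharp_score > blur_score   # crude stand-in for the trained classifier
```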
-
Publication number: 20130027565Abstract: An imaging system may include an array of image pixels. The array of image pixels may be provided with one or more rows and columns of optically shielded dark image pixels. The dark image pixels may be used to produce verification image data that follows the same pixel-to-output data path of light-receiving pixels. The output signals from dark pixels may be continuously or intermittently compared with a set of expected output signals to verify that the imaging system is functioning properly. In some arrangements, verification image data may include a current frame number that is encoded into the dark pixels. The encoded current frame number may be compared with an expected current frame number. In other arrangements, dark pixels may be configured to have a predetermined pattern of conversion gain levels. The output signals may be compared with a “golden” image or other predetermined set of expected output signals.Type: ApplicationFiled: February 17, 2012Publication date: January 31, 2013Inventors: Johannes Solhusvik, Neal Crook
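The frame-number arrangement in this abstract can be illustrated with a toy encoding: write the counter as bits into a row of dark-pixel values, then verify the readout path by decoding and comparing against the expected frame number. The bit-per-pixel encoding and bit width are assumptions; the patent does not specify the scheme.

```python
# Illustrative sketch (encoding assumed): a frame counter is written
# into dark pixels, and the pixel-to-output path is verified by
# decoding it and comparing against the expected current frame number.

def encode_frame_number(n, width=8):
    """Encode the counter as one bit per dark pixel (MSB first)."""
    return [(n >> (width - 1 - i)) & 1 for i in range(width)]

def verify_frame(dark_row, expected_n):
    """Decode the dark-pixel bits and compare with the expected counter."""
    decoded = 0
    for bit in dark_row:
        decoded = (decoded << 1) | bit
    return decoded == expected_n

row = encode_frame_number(42)
ok = verify_frame(row, 42)
bad = verify_frame(row, 43)
```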
-
Publication number: 20130027566Abstract: Imaging systems may be provided with image sensors having verification circuitry. Verification circuitry may be configured to continuously or occasionally verify that the image sensor is functioning properly. For example, verification circuitry may be configured to monitor levels of leakage current during standby mode. Verification circuitry may be coupled between a power supply and circuitry that is powered by that power supply. When the imaging system is in standby mode, circuitry associated with the imaging system such as pixel circuitry may draw a standby leakage current. Verification circuitry may be configured to measure the amount of standby leakage current drawn by associated imaging system circuitry. If the measured level of standby leakage current exceeds a maximum acceptable level of standby leakage current, a warning signal may be generated. Standby leakage current levels on multiple power supply lines may be monitored with associated verification circuitry.Type: ApplicationFiled: February 17, 2012Publication date: January 31, 2013Inventors: Johannes Solhusvik, Steffen Skaug
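The standby check in this abstract amounts to a threshold comparison per supply line. The current limit, units, and line names below are assumptions used to sketch the warning logic.

```python
# Minimal sketch: compare measured standby leakage current on each
# supply line against a maximum acceptable level and report the lines
# that should raise a warning signal.

MAX_STANDBY_LEAKAGE_UA = 50.0  # assumed limit, in microamps

def check_standby_leakage(readings):
    """Return the supply lines whose standby leakage exceeds the limit."""
    return [line for line, ua in readings.items() if ua > MAX_STANDBY_LEAKAGE_UA]

warnings = check_standby_leakage({"vdd_pix": 12.0, "vdd_io": 75.5})
```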