Picture Signal Generators (epo) Patents (Class 348/E13.074)
  • Publication number: 20130010080
    Abstract: A method for registering a first imaging detector to a surface projects a sequence of k images toward the surface, wherein k ≥ 4, wherein each of the k images has a pattern of lines that extend in a direction that is orthogonal to a movement direction. The pattern encodes an ordered sequence of labels, each label having k binary elements, such that, in the movement direction, any portion of the pattern that is k equal increments long encodes one label of the ordered sequence. The method obtains, for at least a first pixel in the first imaging detector, along at least one line that is parallel to the movement direction, a first sequence of k signal values indicative of the k binary elements of a first label from the ordered sequence of labels and correlates the at least the first pixel in the first imaging detector to the surface.
    Type: Application
    Filed: July 8, 2011
    Publication date: January 10, 2013
    Inventors: Lawrence A. Ray, Richard A. Simon
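The unique-window property this abstract relies on is exactly that of a binary de Bruijn sequence: every run of k consecutive bits is a distinct k-bit label, so reading k samples along the movement direction localizes a pixel uniquely. A minimal sketch (an illustration of the property, not the patented method):

```python
# Illustrative sketch: a binary de Bruijn sequence B(2, k) gives a pattern
# in which any window of k consecutive bits is a unique label.

def de_bruijn(k):
    """Binary de Bruijn sequence of order k (standard Lyndon-word construction)."""
    a = [0] * (k + 1)
    sequence = []

    def db(t, p):
        if t > k:
            if k % p == 0:
                sequence.extend(a[1:p + 1])
        else:
            a[t] = a[t - p]
            db(t + 1, p)
            for j in range(a[t - p] + 1, 2):
                a[t] = j
                db(t + 1, t)

    db(1, 1)
    return sequence

def label_positions(seq, k):
    """Map each k-bit window (read cyclically) to its start position."""
    n = len(seq)
    return {tuple(seq[(i + j) % n] for j in range(k)): i for i in range(n)}

k = 4
seq = de_bruijn(k)            # length 2**k = 16
table = label_positions(seq, k)

# Any k consecutive signal values identify a unique position along the pattern:
window = tuple(seq[3:3 + k])
print(table[window])          # -> 3
```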
  • Publication number: 20130010093
    Abstract: A method and system for presenting stereoscopic images are described, in which a portion of at least one stereoscopic image of a stereoscopic pair is blocked while displaying the stereoscopic images. The blocked portion has a width at least equal to a magnitude of a minimum disparity associated with a region of the image near a vertical edge of the image or near a vertical edge of an area for displaying the image. By blocking the portion of the image during content display, one can avoid depth cue conflicts near the edge of the images or the display area.
    Type: Application
    Filed: April 1, 2011
    Publication date: January 10, 2013
    Applicant: Thomson Licensing LLC
    Inventor: William Gibbens Redmann
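The blocking rule in the abstract (a "floating window") is concrete enough to sketch: the blocked strip must be at least as wide as the magnitude of the minimum disparity near the edge. A minimal illustration, assuming the common sign convention that negative disparity places an object in front of the screen plane; function names are hypothetical:

```python
# Hedged sketch of the edge-blocking idea: block a strip at least as wide
# as the magnitude of the minimum (front-of-screen) disparity near a
# vertical edge, so no object is both in front of the screen and cut off
# by the frame.

def blocking_width(edge_region_disparities):
    """Width (in pixels) of the strip to black out at one vertical edge."""
    min_disparity = min(edge_region_disparities)
    return max(0, -min_disparity)  # only negative disparity causes a depth cue conflict

def apply_left_edge_block(image_row, width):
    """Black out `width` pixels at the left edge of one scanline."""
    return [0] * width + image_row[width:]

# Disparities (pixels) measured near the left edge; -6 means 6 px in front of screen.
disparities = [-6, -2, 0, 3]
w = blocking_width(disparities)       # -> 6
row = [255] * 10
print(apply_left_edge_block(row, w))  # first 6 samples blacked out
```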
  • Publication number: 20130010071
    Abstract: Disclosed are methods for determining and tracking a current location of a handheld pointing device, such as a remote control for an entertainment system, on a depth map generated by a gesture recognition control system. The methods disclosed herein enable identifying a user's hand gesture and generating corresponding motion data. Further, the handheld pointing device may send motion data, such as acceleration or velocity, and/or orientation data, such as pitch, roll, and yaw angles. The motion data of the user's hand gesture and the motion data (orientation data) received from the handheld pointing device are then compared, and if they correspond to each other, it is determined that the handheld pointing device is in active use by the user, as it is held by a particular hand. Accordingly, a location of the handheld pointing device on the depth map can be determined.
    Type: Application
    Filed: July 4, 2012
    Publication date: January 10, 2013
    Applicant: 3DIVI
    Inventors: Andrey Valik, Pavel Zaitsev, Dmitry Morozov, Alexander Argutin
  • Publication number: 20130010068
    Abstract: Methods and systems for providing an augmented reality system are disclosed. In one instance, an augmented reality system may: identify a feature within a three-dimensional environment; project information into the three-dimensional environment; collect an image of the three-dimensional environment and the projected information; determine at least one of distance and orientation of the feature from the projected information; identify an object within the three-dimensional environment; and perform markerless tracking of the object.
    Type: Application
    Filed: April 12, 2012
    Publication date: January 10, 2013
    Applicant: Radiation Monitoring Devices, Inc.
    Inventors: Timothy C. Tiernan, Kevin Grant Osborn, Thomas Anthony Keemon, JR., Robert Vinci
  • Publication number: 20130010083
    Abstract: A portable device that has first and second image sensors and a central processor. The central processor has four processing units, a first image sensor interface, and a second image sensor interface for receiving data from the first and second image sensors, respectively. The four processing units and the first and second sensor interfaces are integrated onto a single chip such that the four processing units are configured to simultaneously process the data from the first and second image interfaces to generate stereoscopic image data.
    Type: Application
    Filed: September 15, 2012
    Publication date: January 10, 2013
    Inventor: Kia Silverbrook
  • Publication number: 20130010084
    Abstract: A camera processor image-processes an image for the left eye captured by a left-eye image capturing unit and an image for the right eye captured by a right-eye image capturing unit to generate a parallax image of the left-eye image and the right-eye image. A vertical timing adjuster adjusts the timings of driving a left-eye image sensor and a right-eye image sensor using a frame synchronizing drive controller to reduce, as close to zero as possible, the amount of vertical displacement between a framing position of a target object image in the left-eye image and a framing position of a target object image in the right-eye image.
    Type: Application
    Filed: September 15, 2012
    Publication date: January 10, 2013
    Applicant: Panasonic Corporation
    Inventor: Toshinobu HATANO
  • Publication number: 20130010070
    Abstract: An information processing apparatus configured to estimate a position and orientation of a measuring object using an imaging apparatus includes an approximate position and orientation input unit configured to input a relative approximate position and orientation between the imaging apparatus and the measuring object, a first position and orientation updating unit configured to update the approximate position and orientation by matching a three-dimensional shape model to a captured image, a position and orientation difference information input unit configured to calculate and acquire a position and orientation difference amount of the imaging apparatus relative to the measuring object having moved after the imaging apparatus has captured an image of the measuring object or after last position and orientation difference information has been acquired, and a second position and orientation updating unit configured to update the approximate position and orientation based on the position and orientation difference information.
    Type: Application
    Filed: June 29, 2012
    Publication date: January 10, 2013
    Applicant: CANON KABUSHIKI KAISHA
    Inventors: Keisuke Tateno, Daisuke Kotake, Shinji Uchiyama
  • Publication number: 20130010085
    Abstract: A right-eye image n+½ is an image that interpolates between a left-eye image n and a left-eye image n+1. This makes it possible to achieve a stereoscopic image display device capable of carrying out a stereoscopic display with improved display quality of moving images and improved viewability of moving images.
    Type: Application
    Filed: January 19, 2011
    Publication date: January 10, 2013
    Inventor: Yoshiki Takata
  • Publication number: 20130010066
    Abstract: A robot is provided that includes a processor executing instructions that generate an image. The robot also includes a depth sensor that captures depth data about an environment of the robot. Additionally, the robot includes a software component executed by the processor configured to generate a depth map of the environment based on the depth data. The software component is also configured to generate the image based on the depth map and red-green-blue (RGB) data about the environment.
    Type: Application
    Filed: July 5, 2011
    Publication date: January 10, 2013
    Applicant: Microsoft Corporation
    Inventors: Charles F. Olivier, III, Jean Sebastien Fouillade, Ashley Feniello, Jordan Correa, Russell Sanchez, Malek Chalabi
  • Publication number: 20130002811
    Abstract: A three-dimensional (3D) imaging method using one single-lens image-capture apparatus, comprising: deriving a first two-dimensional (2D) image with the single-lens image-capture apparatus; deriving a depth map corresponding to the first 2D image; synthesizing a view synthesized image according to the depth map and the first 2D image; and deriving a second 2D image with the single-lens image-capture apparatus according to the view synthesized image, wherein the first 2D image and the second 2D image are utilized for 3D image display.
    Type: Application
    Filed: June 28, 2011
    Publication date: January 3, 2013
    Inventors: Lin-Kai Bu, Chi-Chia Lin
  • Publication number: 20130002825
    Abstract: This 3D image capture device includes a light-transmitting section 2 with m transmitting areas (where m is an integer and m ≥ 2) and a solid-state image sensor 1. The sensor 1 has unit elements, each of which includes n photosensitive cells (where n is an integer and n ≥ m) and n transmitting filters that face those photosensitive cells. If the wavelength is identified by λ, the transmittances of transmitting areas C1 and C2 are identified by Tc1(λ) and Tc2(λ), the transmittances of two transmitting filters are identified by Td1(λ) and Td2(λ), and the interval of integration is the entire visible radiation wavelength range, then ∫Tc1(λ)Td1(λ)dλ > 0, ∫Tc1(λ)Td2(λ)dλ > 0, ∫Tc2(λ)Td1(λ)dλ > 0, ∫Tc2(λ)Td2(λ)dλ > 0, and ∫Tc1(λ)Td1(λ)dλ × ∫Tc2(λ)Td2(λ)dλ ≠ ∫Tc2(λ)Td1(λ)dλ × ∫Tc1(λ)Td2(λ)dλ are satisfied.
    Type: Application
    Filed: October 19, 2011
    Publication date: January 3, 2013
    Applicant: PANASONIC CORPORATION
    Inventors: Yasunori Ishii, Masao Hiramoto
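The integral conditions in the abstract above amount to requiring that the 2×2 matrix of cross integrals between the transmitting areas and the filters be positive and invertible, so the two light paths can be separated from the sensor outputs. A numerical illustration with made-up transmittance curves (every curve and parameter here is assumed for illustration, not from the patent):

```python
# Numerical check (toy curves) of the abstract's conditions: all four
# cross integrals positive, and the product of the diagonal integrals
# different from the product of the off-diagonal ones (non-singular
# mixing matrix).

# Wavelength grid over the visible range (nm), 5 nm steps.
lambdas = [380 + i for i in range(0, 401, 5)]

def tc1(lam): return 0.2 + 0.6 * (lam - 380) / 400   # rising with wavelength
def tc2(lam): return 0.8 - 0.6 * (lam - 380) / 400   # falling with wavelength
def td1(lam): return 0.9 if lam < 550 else 0.3       # "blue-ish" filter
def td2(lam): return 0.3 if lam < 550 else 0.9       # "red-ish" filter

def integral(f, g):
    """Simple Riemann sum of f(lam) * g(lam) over the visible range."""
    return sum(f(l) * g(l) * 5 for l in lambdas)

mix = [[integral(tc1, td1), integral(tc1, td2)],
       [integral(tc2, td1), integral(tc2, td2)]]

det = mix[0][0] * mix[1][1] - mix[0][1] * mix[1][0]
print("all cross integrals positive:", all(v > 0 for row in mix for v in row))  # -> True
print("mixing matrix invertible:", det != 0)                                    # -> True
```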
  • Publication number: 20130002829
    Abstract: Disclosed is a method of capturing 3D data of one or more airborne particles. At least one image of the one or more airborne particles is taken by a plenoptic camera whose optics have known geometry and optical properties, and the distance of a plane of focus containing at least one selected particle of the one or more airborne particles from a defined reference location is determined by use of the captured image together with the known optical properties and the known geometry of the optics of the plenoptic camera.
    Type: Application
    Filed: June 26, 2012
    Publication date: January 3, 2013
    Inventor: Emil Hedevang Lohse Soerensen
  • Publication number: 20130000252
    Abstract: Methods, apparatus, assemblies, and systems relate to producing on-demand packaging. For example, packaging can be automatically produced on-demand and be sized and configured for use with a customized set of items and/or a customized arrangement of items. In one aspect, items are arranged on a resting device and an imaging component obtains image or other data related to such arrangement. Based on the image, dimensions of the arrangement may be determined and a packaging template, such as a box template, may be designed. The designed template may have dimensions suitable to enclose the items when arranged in the manner provided on the resting device. A packaging production machine may produce a box template, on-demand, after the box template has been designed based on the physical arrangement of items on the resting device.
    Type: Application
    Filed: December 10, 2010
    Publication date: January 3, 2013
    Applicant: PACKSIZE, LLC
    Inventors: Niklas Pettersson, Ryan Osterhout
  • Publication number: 20130002794
    Abstract: A system that incorporates teachings of the present disclosure may, for example, receive a request for a telepresence seat at an event, obtain media content comprising event images of the event that are captured by an event camera system, receive images that are captured by a camera system at a user location, and provide the media content and video content representative of the images to a processor for presentation at a display device utilizing a telepresence configuration that simulates the first and second users being present at the event, where the providing of the first and second video content establishes a communication session between the first and second users. Other embodiments are disclosed.
    Type: Application
    Filed: June 30, 2011
    Publication date: January 3, 2013
    Applicant: AT&T Intellectual Property I, LP
    Inventors: Tara Hines, Andrea Basso, Aleksey Ivanov, Jeffrey Mikan, Nadia Morris
  • Publication number: 20130002830
    Abstract: A stereoscopic imaging device comprising: a second focus adjusting unit that operates a second focus lens to carry out a search within a second search range and searches for a second lens position at which a subject to be imaged is brought into focus; and a photographing unit that performs a photographing of a first viewpoint image and a second viewpoint image when a photographing instruction is inputted after a process by a first focus adjusting unit and a second focus adjusting unit, wherein the second focus adjusting unit calculates the second lens position based upon a first lens position and a focus positional deviation amount stored in a storage unit, and shifts the second focus lens to the second lens position, if it is not possible to acquire the second lens position within the second search range as a result of the search.
    Type: Application
    Filed: November 11, 2010
    Publication date: January 3, 2013
    Inventor: Yi Pan
  • Publication number: 20130003128
    Abstract: An image processing device 29 for generating plural virtual view image data according to L and R view image data I(L) and I(R), and includes an imaging error detection circuit 32, a disparity map generation circuit 33, and an image generation circuit 34 for a virtual view image. The imaging error detection circuit 32 detects whether an imaging error has occurred with L and R view image data I(L) and I(R) or not. If one of the L and R view image data I(L) and I(R) is abnormal image data, the disparity map generation circuit 33 extracts a corresponding point in the abnormal image data corresponding respectively to a pixel in remaining normal image data, and generates a disparity map. The image generation circuit 34 generates the virtual view image data according to the disparity map and the normal image data.
    Type: Application
    Filed: March 18, 2011
    Publication date: January 3, 2013
    Inventor: Mikio Watanabe
  • Publication number: 20130002822
    Abstract: A product ordering system stores information on 3D product models, including data for forming each of the 3D product models and the scale of each of the 3D product models. A method for ordering a product using the system includes first capturing an image; then sensing the distance between the image capturing unit and the user; next, obtaining specific image data from the captured image according to one selected 3D product model; then converting it to the life size data needed to form a 3D model of the user according to the focus of the image capturing unit and the sensed distance; after that, generating a 3D model of the user according to the scale of the 3D product model; and finally overlaying the selected 3D product model with the 3D model of the user and displaying the combination for viewing by the user.
    Type: Application
    Filed: October 20, 2011
    Publication date: January 3, 2013
    Applicant: HON HAI PRECISION INDUSTRY CO., LTD.
    Inventors: YING-CHUAN YU, YING-XIONG HUANG, SHIH-PIN WU, HSING-CHU WU
  • Publication number: 20130002828
    Abstract: A catadioptric camera having a perspective camera and multiple curved mirrors, images the multiple curved mirrors and uses the epsilon constraint to establish a vertical parallax between points in one mirror and their corresponding reflection in another. An ASIFT transform is applied to all the mirror images to establish a collection of corresponding feature points, and edge detection is applied on mirror images to identify edge pixels. A first edge pixel in a first imaged mirror is selected, its 25 nearest feature points are identified, and a rigid transform is applied to them. The rigid transform is fitted to 25 corresponding feature points in a second imaged mirror. The closest edge pixel to the expected location, as determined by the fitted rigid transform, is identified, and its distance to the vertical parallax is determined. If the distance is not greater than a predefined maximum, it is deemed to correlate with the edge pixel in the first imaged mirror.
    Type: Application
    Filed: July 1, 2011
    Publication date: January 3, 2013
    Inventors: Yuanyuan Ding, Jing Xiao
  • Publication number: 20130002824
    Abstract: A multi-purpose device that can be used as, among other things, an otoscope and a three dimensional scanning system is disclosed.
    Type: Application
    Filed: June 27, 2012
    Publication date: January 3, 2013
    Inventors: Douglas P. Hart, Federico Frigerio, Douglas M. Johnston, Manas C. Menon, Daniel Vlasic
  • Publication number: 20130002815
    Abstract: A method for providing a three dimensional (3D) drawing experience. The method includes capturing a 3D image of a participant and then processing this image to key the participant's image from a background. The keyed participant's image is mixed with a 3D background image such as frames or scenes from a 3D movie, and the mixed 3D image is projected on a projection screen. For example, left and right eye images may be projected from a pair of projectors with polarization films over the lenses, and the projection screen may be a polarization-maintaining surface such as a silver screen. The user moves a drawing instrument in space in front of the projection screen, and spatial tracking is performed to generate a locus of 3D positions. These 3D positions are used to create a 3D drawing image that is projected with the 3D background and participant images in real time.
    Type: Application
    Filed: July 1, 2011
    Publication date: January 3, 2013
    Applicant: DISNEY ENTERPRISES, INC.
    Inventors: LANNY S. SMOOT, QUINN SMITHWICK, DANIEL M. REETZ, MICHAEL J. ILARDI
  • Publication number: 20130002827
    Abstract: Disclosed are an apparatus and method for capturing a light field geometry using a multi-view camera, which may refine the light field geometry that varies depending on light within images acquired from a plurality of cameras with different viewpoints, and may restore a three-dimensional (3D) image.
    Type: Application
    Filed: May 30, 2012
    Publication date: January 3, 2013
    Applicant: Samsung Electronics Co., LTD.
    Inventors: Seung Kyu Lee, Do Kyoon Kim, Hyun Jung Shim
  • Publication number: 20120327188
    Abstract: A vehicle-mounted environment recognition apparatus including a simple pattern matching unit which extracts an object candidate from an image acquired from a vehicle-mounted image capturing apparatus by using a pattern shape stored in advance and outputs a position of the object candidate, an area change amount prediction unit which calculates a change amount prediction of the extracted object candidate on the basis of an object change amount prediction calculation method set differently for each area of a plurality of areas obtained by dividing the acquired image, detected vehicle behavior information, and an inputted position of the object candidate, and outputs a predicted position of an object, and a tracking unit which tracks the object on the basis of an inputted predicted position of the object.
    Type: Application
    Filed: August 13, 2010
    Publication date: December 27, 2012
    Applicant: Hitachi Automotive Systems, Ltd.
    Inventors: Masayuki Takemura, Shoji Muramatsu, Isao Furusawa, Shinya Ohtsuji, Takeshi Shima, Yuji Otsuka, Tatsuhiko Monji
  • Publication number: 20120327193
    Abstract: A system and method are disclosed for tracking image and audio data over time to automatically identify a person based on a correlation of their voice with their body in a multi-user game or multimedia setting.
    Type: Application
    Filed: September 10, 2012
    Publication date: December 27, 2012
    Applicant: MICROSOFT CORPORATION
    Inventors: Mitchell Dernis, Tommer Leyvand, Christian Klein, Jinyu Li
  • Publication number: 20120327190
    Abstract: A monitoring system includes at least one three-dimensional (3D) time-of-flight (TOF) camera configured to monitor a safety-critical area. An evaluation unit is configured to activate a safety function upon an entrance of at least one of an object and a person into the monitored area and to suppress the activation of the safety function where at least one clearance element is recognized as being present on the at least one of the object and the person.
    Type: Application
    Filed: February 21, 2011
    Publication date: December 27, 2012
    Applicant: IFM ELECTRONIC GMBH
    Inventors: Javier Massanell, Thomas May, Manfred Strobel
  • Publication number: 20120327192
    Abstract: A dental 3D camera for optically scanning a three-dimensional object, and a method for operating a dental 3D camera. The camera operates in accordance with a triangulation procedure to acquire a plurality of images of the object. The method comprises forming at least one comparative signal based on at least two images of the object acquired by the camera while at least one pattern is projected on the object, and determining at least one camera shake index based on the at least one comparative signal.
    Type: Application
    Filed: September 6, 2012
    Publication date: December 27, 2012
    Applicant: SIRONA DENTAL SYSTEMS GMBH
    Inventors: Joachim Pfeiffer, Konrad Klein
  • Publication number: 20120327187
    Abstract: A system for inspecting a test article incorporates a diagnostic imaging system for a test article. A command controller receives two dimensional (2D) images from the diagnostic imaging system. A three dimensional (3D) computer aided design (CAD) model visualization system and an alignment system for determining local 3D coordinates are connected to the command controller. Computer software modules incorporated in the command controller are employed in aligning the 2D images and 3D CAD model responsive to the local 3D coordinates. The 2D images and 3D CAD model are displayed with reciprocal registration. The alignment system is then directed to selected coordinates in the 2D images or 3D CAD model.
    Type: Application
    Filed: June 22, 2011
    Publication date: December 27, 2012
    Applicant: THE BOEING COMPANY
    Inventors: James J. Troy, Scott W. Lea, Gary E. Georgeson, William P. Motzer, Peter J. Hellenbrand, Kevin Puterbaugh
  • Publication number: 20120327186
    Abstract: A virtual endoscopic image generating unit for generating, from a 3D medical image representing an interior of a body cavity of a subject formed by a 3D medical image forming unit and inputted thereto, a virtual endoscopic image, in which a position of a structure of interest identified by a position of interest identifying unit is a view point of the virtual endoscopic image, a real-time position of at least one of an endoscope and a surgical tool detected by an endoscope position detecting unit or a surgical tool position detecting unit is contained in a field of view of the virtual endoscopic image, and the position of at least one of the endoscope and the surgical tool is shown in an identifiable manner in the virtual endoscopic image, is provided. A display control unit causes a WS display to display the generated virtual endoscopic image.
    Type: Application
    Filed: March 16, 2011
    Publication date: December 27, 2012
    Applicant: FUJIFILM CORPORATION
    Inventors: Yoshiro Kitamura, Keigo Nakamura
  • Publication number: 20120327189
    Abstract: A stereo camera apparatus which carries out distance measurement stably and with high accuracy by varying the distance-measuring resolution according to the distance to an object is provided. A stereo camera apparatus 1 takes in two images, changes the resolution of a partial area of each image that is taken in, and calculates the distance from a vehicle to an object imaged in the partial area, based on the disparity of the partial area of each image in which the resolution is changed. Thus, even when the object is at a long distance and small in size, distance measuring processing can be carried out stably.
    Type: Application
    Filed: August 13, 2010
    Publication date: December 27, 2012
    Applicant: Hitachi Automotive Systems, Ltd.
    Inventors: Shoji Muramatsu, Mirai Higuchi, Tatsuhiko Monji, Soichiro Yokota, Morihiko Sakano, Takeshi Shima
  • Publication number: 20120327197
    Abstract: A 3D imaging device determines, during imaging, whether the captured images will be perceived three-dimensionally without causing fatigue while simulating actual human perception. In a 3D imaging device, a display information setting unit obtains a display parameter associated with an environment in which a 3D video is viewed, and a control unit determines during 3D imaging whether a scene to be imaged three-dimensionally will be perceived three-dimensionally based on the viewing environment.
    Type: Application
    Filed: March 3, 2011
    Publication date: December 27, 2012
    Applicant: PANASONIC CORPORATION
    Inventors: Haruo Yamashita, Takeshi Ito, Hiromichi Ono
  • Publication number: 20120327194
    Abstract: Body-mounted cameras are used to accurately reconstruct the motion of a subject. Outward-looking cameras are attached to the limbs of the subject, and the joint angles and root pose that define the subject's configuration are estimated through a non-linear optimization, which can incorporate image matching error and temporal continuity of motion. Instrumentation of the environment is not required, allowing for motion capture over extended areas and in outdoor settings.
    Type: Application
    Filed: June 21, 2011
    Publication date: December 27, 2012
    Inventors: Takaaki Shiratori, Hyun Soo Park, Leonid Sigal, Yaser Sheikh, Jessica K. Hodgins
  • Publication number: 20120327185
    Abstract: The digital 3D/360° camera system is an omnidirectional stereoscopic device for capturing image data that may be used to create a 3-dimensional model for presenting a 3D image, a 3D movie, or 3D animation. The device uses multiple digital cameras, arranged with overlapping fields of view, to capture image data covering an entire 360° scene. The data collected by one, or several, digital 3D/360° camera systems can be used to create a 3D model of a 360° scene by using triangulation of the image data within the overlapping fields of view.
    Type: Application
    Filed: September 7, 2012
    Publication date: December 27, 2012
    Inventor: Leonard P. Steuart III
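The triangulation step such a multi-camera system relies on can be sketched with the generic midpoint method for two rays (a standard technique, assumed here for illustration; the patent does not specify this particular solver):

```python
# Midpoint triangulation: estimate a 3D point as the midpoint of the shortest
# segment between the two camera rays through that point. Pure Python, no deps.

def sub(a, b): return [x - y for x, y in zip(a, b)]
def dot(a, b): return sum(x * y for x, y in zip(a, b))

def triangulate_midpoint(o1, d1, o2, d2):
    """Midpoint of closest approach of rays o1 + s*d1 and o2 + t*d2."""
    r = sub(o1, o2)
    a, b, c = dot(d1, d1), dot(d1, d2), dot(d2, d2)
    d, e = dot(d1, r), dot(d2, r)
    denom = a * c - b * b                  # ~0 when the rays are parallel
    s = (b * e - c * d) / denom
    t = (a * e - b * d) / denom
    p1 = [o + s * k for o, k in zip(o1, d1)]
    p2 = [o + t * k for o, k in zip(o2, d2)]
    return [(x + y) / 2 for x, y in zip(p1, p2)]

# Two cameras 1 m apart on the x axis, both seeing a point at (0.5, 0, 4).
p = triangulate_midpoint([0, 0, 0], [0.5, 0, 4], [1, 0, 0], [-0.5, 0, 4])
print(p)   # -> [0.5, 0.0, 4.0]
```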
  • Publication number: 20120327196
    Abstract: In an image processing apparatus, an image pickup unit takes images of an object including the face of a person wearing the glasses by which to observe a stereoscopic image that contains a first parallax image and a second parallax image obtained when the object in a three-dimensional (3D) space is viewed from different viewpoints. A glasses identifying unit identifies the glasses included in the image of the object taken by the image pickup unit. A face detector detects a facial region of the face of the person included in the image of the object taken by the image pickup unit, based on the glasses identified by the glasses identifying unit. An augmented-reality special rendering unit adds a virtual feature to the facial region of the face of the person detected by the face detector.
    Type: Application
    Filed: September 6, 2012
    Publication date: December 27, 2012
    Applicant: SONY COMPUTER ENTERTAINMENT INC.
    Inventors: Akio Ohba, Hiroyuki Segawa, Tetsugo Inada
  • Publication number: 20120327191
    Abstract: A 3D imaging device obtains a 3D image (3D video) that achieves an appropriate 3D effect and/or intended placement. The 3D imaging device (1000) estimates (calculates) an ideal disparity of a main subject based on a preset viewing environment (display condition) and a distance to the main subject, and obtains a disparity (an actual disparity) of the subject (main subject) actually occurring on a virtual screen. The 3D imaging device (1000) obtains a correction disparity using the ideal disparity and the actual disparity, and adds the calculated correction disparity to the 3D image (horizontally shifts the image) to perform appropriate disparity correction. The 3D imaging device (3000) obtains a 3D image (3D video) that achieves an appropriate 3D effect and/or intended placement without being affected by a disparity occurring in the horizontal direction caused by insufficient precision (in particular, insufficient optical precision).
    Type: Application
    Filed: March 3, 2011
    Publication date: December 27, 2012
    Applicant: Panasonic Corporation
    Inventors: Haruo Yamashita, Takeshi Ito, Hiromichi Ono
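The correction described in the abstract above can be sketched as ideal-minus-actual disparity applied as a uniform horizontal shift of one view (sign conventions and function names assumed for illustration):

```python
# Hedged sketch: correction disparity = ideal disparity - actual disparity,
# applied by horizontally shifting one view's scanlines.

def correction_disparity(ideal_px, actual_px):
    """Pixels by which one view must be shifted to reach the ideal disparity."""
    return ideal_px - actual_px

def shift_row(row, shift, fill=0):
    """Horizontally shift one scanline; positive shifts to the right."""
    if shift >= 0:
        return [fill] * shift + row[:len(row) - shift]
    return row[-shift:] + [fill] * (-shift)

corr = correction_disparity(ideal_px=12, actual_px=9)   # -> 3
right_row = [1, 2, 3, 4, 5, 6]
print(shift_row(right_row, corr))   # -> [0, 0, 0, 1, 2, 3]
```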
  • Publication number: 20120320162
    Abstract: An efficient 3D object localization method using multiple cameras is provided. The proposed method comprises a three-dimensional object localization process that firstly generates a plurality of two-dimensional line samples originated from a pre-calibrated vanishing point in each camera view for representing foreground video objects, secondly constructs a plurality of three-dimensional line samples from the two-dimensional line samples in all the multiple camera views, and thirdly determines three-dimensional object locations by clustering the three-dimensional line samples into object groups.
    Type: Application
    Filed: June 4, 2012
    Publication date: December 20, 2012
    Applicant: NATIONAL CHIAO TUNG UNIVERSITY
    Inventors: Kuo-Hua LO, Jen-Hui CHUANG, Horng-Horng LIN
  • Publication number: 20120320161
    Abstract: A method disclosed herein relates to displaying three-dimensional images. The method comprises projecting integral images to a display device, and displaying three-dimensional images with the display device. Further disclosed herein is an apparatus for displaying orthoscopic 3-D images. The apparatus comprises a projector for projecting integral images, and a micro-convex-mirror array for displaying the projected images.
    Type: Application
    Filed: August 28, 2012
    Publication date: December 20, 2012
    Applicant: UNIVERSITY OF CONNECTICUT
    Inventors: Bahram Javidi, Ju-Seog Jang, Hyunju Ha
  • Publication number: 20120320157
    Abstract: A “Concurrent Projector-Camera” uses an image projection device in combination with one or more cameras to enable various techniques that provide visually flicker-free projection of images or video, while real-time image or video capture is occurring in that same space. The Concurrent Projector-Camera provides this projection in a manner that eliminates video feedback into the real-time image or video capture. More specifically, the Concurrent Projector-Camera dynamically synchronizes a combination of projector lighting (or light-control points) on-state temporal compression in combination with on-state temporal shifting during each image frame projection to open a “capture time slot” for image capture during which no image is being projected. This capture time slot represents a tradeoff between image capture time and decreased brightness of the projected image.
    Type: Application
    Filed: June 14, 2011
    Publication date: December 20, 2012
    Applicant: MICROSOFT CORPORATION
    Inventors: Sasa Junuzovic, William Thomas Blank, Steven Bathiche, Anoop Gupta, Andrew D. Wilson
  • Publication number: 20120320165
    Abstract: Disclosed herein are apparatuses and methods for reclaiming the full field of view (FOV) of the original camera lens in a stereoscopic image capture system using an anamorphic attachment. Also disclosed are apparatuses and methods of projecting stereoscopic images on a fixed size screen from a single projector that was initially designed primarily for 2D operation. An exemplary apparatus may comprise an anamorphic afocal converter configured to halve a FOV of a camera or projector into two optical paths, and convert the halved FOVs into two full FOVs of the camera or projector. Such an apparatus may further comprise reflecting elements cooperatively arranged to direct two rectified images at a camera sensor or projection screen, where one or more reflecting elements receive the first of the two full FOVs and one or more reflecting elements receive the second of the two full FOVs.
    Type: Application
    Filed: June 18, 2012
    Publication date: December 20, 2012
    Applicant: REALD INC.
    Inventor: Miller H. Schuck
  • Publication number: 20120320159
    Abstract: An inspection apparatus includes an imaging unit producing image signals; a processing unit for receiving the image signals; the imaging unit producing a stack of images of an article at different focal lengths in response to the processing unit; the processing unit generating a depth map from the stack of images; the processing unit analyzing the depth map to derive a depth profile of an object of interest; the processing unit determining a surface mean for the article from the stack of images; and the processing unit characterizing the article as degraded or contaminated in response to the depth profile and the surface mean.
    Type: Application
    Filed: June 15, 2012
    Publication date: December 20, 2012
    Applicant: Sikorsky Aircraft Corporation
    Inventors: Myra A. Torres, Abhijit Bhoite, Nikhil Beke, Kevin Patrick McCormick, Timothy Duffy, Jeremy W. Sheaffer, Maksim Bobrov, Michael Moore
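Generating a depth map from a focal stack, as this abstract describes, is commonly done with a per-pixel focus measure. Below is a minimal sketch of that general technique (a squared-Laplacian focus measure with argmax over slices), assumed for illustration — the patent's actual depth-map method and its surface-mean step are not reproduced here.

```python
def depth_from_focus(stack):
    """Sketch of depth-from-focus: for each interior pixel, compute a
    simple focus measure (squared Laplacian) in every slice of the
    focal stack and take the index of the sharpest slice as depth.
    `stack` is a list of same-sized 2-D lists, ordered by focal length."""
    h, w = len(stack[0]), len(stack[0][0])
    depth = [[0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            best, best_k = -1.0, 0
            for k, img in enumerate(stack):
                # Discrete Laplacian as a local sharpness estimate
                lap = (img[y - 1][x] + img[y + 1][x] +
                       img[y][x - 1] + img[y][x + 1] - 4 * img[y][x])
                if lap * lap > best:
                    best, best_k = lap * lap, k
            depth[y][x] = best_k
    return depth
```

A pixel that is in focus (high local contrast) in slice k gets depth index k; mapping indices back to physical distance requires the focal length of each slice.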
  • Publication number: 20120320164
    Abstract: A new technology using polarizing apertures for imaging both left and right perspective views onto a single sensor is presented. The polarizing apertures can be used to image each perspective view onto one or more sensors. Polarizing apertures within a single lens or within two lenses may be employed. The apertures' areas may be changed to control exposure and the design allows for interaxial separations to be varied to the reduced values required for stereoscopic cinematography.
    Type: Application
    Filed: June 15, 2012
    Publication date: December 20, 2012
    Inventor: Lenny Lipton
  • Publication number: 20120320158
    Abstract: The interactive and shared surface technique described herein employs hardware that can project on any surface, capture color video of that surface, and get depth information of and above the surface while preventing visual feedback (also known as video feedback, video echo, or visual echo). The technique provides N-way sharing of a surface using video compositing. It also provides for automatic calibration of hardware components, including calibration of any projector, RGB camera, depth camera and any microphones employed by the technique. The technique provides object manipulation with physical, visual, audio, and hover gestures and interaction between digital objects displayed on the surface and physical objects placed on or above the surface. It can capture and scan the surface in a manner that captures or scans exactly what the user sees, which includes both local and remote objects, drawings, annotations, hands, and so forth.
    Type: Application
    Filed: June 14, 2011
    Publication date: December 20, 2012
    Applicant: MICROSOFT CORPORATION
    Inventors: Sasa Junuzovic, William Thomas Blank, Bruce Arnold Cleary, III, Anoop Gupta, Andrew D. Wilson
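The N-way surface sharing via video compositing mentioned in the abstract can be illustrated with a minimal layered composite. This is an assumed simplification for illustration only — the patent's compositing pipeline is not specified in the abstract.

```python
def composite_surfaces(layers, background=0):
    """Sketch of N-way sharing by video compositing: each participant's
    view of the surface is a 2-D list where `background` marks empty
    pixels; later layers paint over earlier ones, producing the shared
    surface that every participant sees."""
    h, w = len(layers[0]), len(layers[0][0])
    out = [[background] * w for _ in range(h)]
    for layer in layers:
        for y in range(h):
            for x in range(w):
                if layer[y][x] != background:
                    out[y][x] = layer[y][x]
    return out
```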
  • Publication number: 20120320156
    Abstract: An image recording apparatus includes an image-sensing unit, a video-processing unit, and a power module. The image-sensing unit further includes a frame, a first connector, and a first image-sensing module for capturing an image. The video-processing unit further includes a body, a second connector, a video-processing circuit, a video-output module, and a memory module. Through the electrical engagement of the first connector and the second connector, image data captured by the first image-sensing module can be transmitted to the video-processing unit, where the video-processing circuit processes it into an electronic image file that an ordinary computer apparatus can read and store. The electronic image file can be stored in the memory module, and the video-processing unit can also forward it to an external video-displaying apparatus through the video-output module. The power module provides electricity to the image-sensing unit and the video-processing unit.
    Type: Application
    Filed: June 17, 2011
    Publication date: December 20, 2012
    Inventors: Chia Ching Chen, Cheng Hao Chan
  • Publication number: 20120320163
    Abstract: Two or more images having a parallax therebetween are obtained by imaging a subject from different positions using imaging units. Three-dimensional processing for three-dimensional display is applied to the two or more images, and the two or more images are displayed on a display unit. While the imaging units carry out a zoom operation, three-dimensional display with a reduced parallax between the two or more images or two-dimensional display is performed.
    Type: Application
    Filed: August 30, 2012
    Publication date: December 20, 2012
    Applicant: FUJIFILM CORPORATION
    Inventor: Kouichi YAHAGI
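The core idea above — reducing parallax (or falling back to 2D) while a zoom operation is in progress — can be sketched as a display-side scaling of disparity. Function and parameter names are illustrative assumptions, not the patent's terms.

```python
def display_disparity(base_disparity_px, zooming, reduction_factor=0.0):
    """Sketch: while the imaging units are zooming, scale down the
    parallax used for 3-D display. With reduction_factor=0.0 the display
    collapses to 2-D during the zoom; when not zooming, the full
    disparity between the two images is kept."""
    return base_disparity_px * (reduction_factor if zooming else 1.0)
```

During a zoom, the two views' magnifications can momentarily differ, so suppressing disparity avoids presenting an uncomfortable stereo pair.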
  • Publication number: 20120320160
    Abstract: A device comprising: an optical system, itself comprising a light sensor with a plurality of pixels and a lens able to image the elements of the scene onto the pixels of said light sensor; means to adjust the focus of the optical system onto any one of the elements of said scene by maximizing the light flux coming from that element and captured by one of the pixels of said bit-mapped light sensor; and means for deducing, from the adjustment of said focus, the depth of said element.
    Type: Application
    Filed: June 15, 2012
    Publication date: December 20, 2012
    Inventor: Valter Drazic
  • Publication number: 20120314037
    Abstract: The present invention relates to a system and method for providing 3D imaging. The system comprises: a) two or more cameras for generating stereoscopic information only for selected regions in a common or coincident field of view captured simultaneously by both of said two cameras, in order to provide distance-measurement information for object(s) located in said selected regions; b) at least one 3DLR module for providing a 3D image, wherein said two cameras and said 3DLR module are positioned in such a way that, at least partially, they are able to capture similar or coincident field-of-view information; and c) a processing unit for generating said 3D imaging from said stereoscopic information and said 3D image, using image-processing algorithm(s), in order to provide 3D imaging information in real time.
    Type: Application
    Filed: February 22, 2011
    Publication date: December 13, 2012
    Applicant: Ben-Gurion University of the Negev
    Inventors: Youval Nehmadi, Hugo Guterman
  • Publication number: 20120314040
    Abstract: A 3D model of an object is rendered using centered images of the object. An algorithm executed locally or in a distributed manner calculates camera positions for the images and determines a virtual camera path based on the camera positions. The application adjusts the images to fit the plane of the virtual camera path and fills in the gaps between the images using transition renderings. To improve user experience, the application also calculates resting positions for navigation stop points using a spring system. Upon constructing the 3D model, the application can transmit the 3D model to a variety of user devices including the network connected device having a camera module that captured the images.
    Type: Application
    Filed: June 9, 2011
    Publication date: December 13, 2012
    Applicant: MICROSOFT CORPORATION
    Inventors: Johannes Kopf, Eric Stollnitz, Sudipta Sinha, Rick Szeliski
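The spring system for calculating resting positions can be illustrated with a simple 1-D relaxation along the virtual camera path: each interior stop point is pulled toward the midpoint of its neighbors until the spacing evens out. This is an assumed interpretation of the abstract's spring system, not the patent's exact formulation.

```python
def relax_stop_points(positions, iterations=200, stiffness=0.5):
    """Sketch of a spring system for navigation resting positions:
    each interior stop point is repeatedly pulled toward the midpoint
    of its neighbors, evening out the spacing along the path.
    Endpoints stay fixed; positions are path parameters in [0, 1]."""
    pts = list(positions)
    for _ in range(iterations):
        for i in range(1, len(pts) - 1):
            mid = (pts[i - 1] + pts[i + 1]) / 2.0
            pts[i] += stiffness * (mid - pts[i])
    return pts
```

At convergence the interior points are evenly spaced between the endpoints, which gives navigation a uniform feel regardless of where the photos were actually taken.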
  • Publication number: 20120314035
    Abstract: A human-perspective stereoscopic camera is an apparatus that is used to capture stereoscopic viewing material that is from the perspective of one person's eyes. The apparatus mainly comprises a left telescopic tube, a right telescopic tube, a fixed pivoting mount, a lateral movement mechanism, and a platform with a channel as a support structure. The left telescopic tube and the right telescopic tube each comprise a first cylinder, a second cylinder, and an optics assembly. The first cylinder and the second cylinder are the means for the telescopic movement. The optics assembly comprises a lens, a digital single-lens reflex camera, a camera mount, a plate, and a pole.
    Type: Application
    Filed: June 13, 2012
    Publication date: December 13, 2012
    Inventor: John Peter Hall
  • Publication number: 20120314033
    Abstract: The present invention relates to an apparatus and a method for processing three-dimensional (3D) image data of a portable terminal, and particularly, to an apparatus and a method for enabling content sharing and reproduction (playback) between various 3D devices using a file structure that effectively stores a 3D image (for example, a stereo image) obtained using a plurality of cameras, together with stored 3D-related parameters.
    Type: Application
    Filed: February 23, 2011
    Publication date: December 13, 2012
    Inventors: Gun-Ill Lee, Ha-Joong Park, Yong-Tae Kim, Houng-Sog Min, Sung-Bin Hong, Kwang-Cheol Choi
  • Publication number: 20120314036
    Abstract: Disclosed herein are primary and auxiliary image capture devices for image processing and related methods. According to an aspect, a method may include using primary and auxiliary image capture devices to perform image processing. The method may include using the primary image capture device to capture a first image of a scene, the first image having a first quality characteristic. Further, the method may include using the auxiliary image capture device to capture a second image of the scene. The second image may have a second quality characteristic. The second quality characteristic may be of lower quality than the first quality characteristic. The method may also include adjusting at least one parameter of one of the captured images to create a plurality of adjusted images for one of approximating and matching the first quality characteristic. Further, the method may include utilizing the adjusted images for image processing.
    Type: Application
    Filed: August 13, 2012
    Publication date: December 13, 2012
    Applicant: 3DMEDIA CORPORATION
    Inventors: Bahram Dahi, Michael McNamer, Izzat H. Izzat, Tassos Markas
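One common way to "adjust at least one parameter" of a lower-quality auxiliary image toward a primary image's quality characteristic is to match intensity statistics. The sketch below uses simple mean/standard-deviation matching as an assumed stand-in; the patent does not specify this particular adjustment.

```python
import statistics

def match_quality(aux, primary):
    """Sketch: shift and scale the auxiliary image's intensities so its
    mean and contrast (population std. dev.) approximate those of the
    primary image. Images are given as flat lists of intensities."""
    mu_a, mu_p = statistics.mean(aux), statistics.mean(primary)
    sd_a = statistics.pstdev(aux) or 1.0   # guard against flat images
    sd_p = statistics.pstdev(primary)
    return [(v - mu_a) * sd_p / sd_a + mu_p for v in aux]
```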
  • Publication number: 20120314031
    Abstract: Technology is described for determining and using invariant features for computer vision. A local orientation may be determined for each depth pixel in a subset of the depth pixels in a depth map. The local orientation may be an in-plane orientation, an out-of-plane orientation, or both. A local coordinate system is determined for each of the depth pixels in the subset based on the local orientation of the corresponding depth pixel. A feature region is defined relative to the local coordinate system for each of the depth pixels in the subset. The feature region for each of the depth pixels in the subset is transformed from the local coordinate system to an image coordinate system of the depth map. The transformed feature regions are used to process the depth map.
    Type: Application
    Filed: June 7, 2011
    Publication date: December 13, 2012
    Applicant: MICROSOFT CORPORATION
    Inventors: Jamie D. J. Shotton, Mark J. Finocchio, Richard E. Moore, Alexandru O. Balan, Kyungsuk David Lee
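The transform from a pixel's local coordinate system into the depth map's image coordinates can be illustrated with a 2-D rotation of feature-region offsets by the local in-plane orientation. This is an assumed 2-D illustration only; the patent also covers out-of-plane orientation.

```python
import math

def to_image_coords(px, py, theta, offsets):
    """Sketch: rotate feature-region offsets (dx, dy), defined in the
    local coordinate system of the depth pixel at (px, py) with in-plane
    orientation theta (radians), into the image coordinate system."""
    c, s = math.cos(theta), math.sin(theta)
    return [(px + c * dx - s * dy, py + s * dx + c * dy)
            for dx, dy in offsets]
```

Because the offsets rotate with the local orientation, the same physical feature region is sampled regardless of how the subject is oriented in the image — which is what makes the features invariant.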
  • Publication number: 20120314032
    Abstract: Method for pilot assistance in landing an aircraft with restricted visibility, in which the position of a landing point is defined during a landing approach by at least one of a motion-compensated, aircraft-based helmet sight system and a remotely controlled camera, and the landing point is displayed on a ground surface in the at least one of the helmet sight system and the remotely controlled camera by producing symbols that conform with the outside view. The method includes producing or calculating, during an approach, a ground surface based on measurement data from an aircraft-based 3D sensor, and providing both the 3D measurement data of the ground surface and a definition of the landing point with reference to the same aircraft-fixed coordinate system.
    Type: Application
    Filed: May 25, 2012
    Publication date: December 13, 2012
    Applicant: EADS DEUTSCHLAND GMBH
    Inventors: Thomas MUENSTERER, Peter KIELHORN, Matthias WEGNER
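Referring the 3D sensor measurements and the landing point to the same aircraft-fixed coordinate system amounts to a rigid coordinate transform. The sketch below is an assumed illustration of that step; the rotation/translation values and function name are not from the patent.

```python
def sensor_to_aircraft(point, rotation, translation):
    """Sketch: express a 3-D point measured in the sensor frame in the
    aircraft-fixed coordinate system via a rigid transform, so the
    ground surface and the landing point share one reference frame.
    `rotation` is a 3x3 row-major matrix, `translation` a 3-vector."""
    return [sum(rotation[i][j] * point[j] for j in range(3)) + translation[i]
            for i in range(3)]
```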