Abstract: A device and system automatically release scents in an immersive environment. When an audio signal includes a sound associated with an event correlated to one of the scents, the device releases that scent, enhancing the immersive experience by adding the sense of smell to the scene. The occurrence of an event may be predicted by an artificial intelligence engine. The artificial intelligence engine may determine when an event associated with a scent to be dispensed is going to occur, so that the timing of the release is more accurate and better correlated with the timing of the action in the audio signal.
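The trigger-and-dispense behavior described above can be sketched as follows; the event labels, the lookup table, the dispenser callback, and the look-ahead parameter are hypothetical placeholders standing in for the patent's audio analysis and hardware, not its actual implementation.

```python
# Hypothetical sketch: map audio-detected events to scent dispensing.
# The classifier output, scent map, and dispenser API are illustrative placeholders.
from dataclasses import dataclass
from typing import Callable, Dict

@dataclass
class ScentChannel:
    name: str
    dispense: Callable[[float], None]  # argument: release duration in seconds

# Correlation table: audio event label -> scent channel (assumed labels).
EVENT_TO_SCENT: Dict[str, ScentChannel] = {
    "campfire": ScentChannel("smoke", lambda s: print(f"release smoke for {s}s")),
    "ocean_waves": ScentChannel("sea_breeze", lambda s: print(f"release sea breeze for {s}s")),
}

def on_audio_event(label: str, predicted_lead_time_s: float) -> None:
    """Dispense the scent correlated with a detected (or predicted) audio event.

    predicted_lead_time_s models the AI engine's look-ahead, so the scent can be
    released early enough to coincide with the action in the audio signal.
    """
    channel = EVENT_TO_SCENT.get(label)
    if channel is None:
        return
    print(f"event '{label}' expected in {predicted_lead_time_s:.1f}s")
    channel.dispense(2.0)  # release ahead of the event to allow for diffusion time

on_audio_event("campfire", predicted_lead_time_s=1.5)
```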
Abstract: Systems and methods for monitoring and controlling dynamic multi-phase flow phenomena, capable of sensing, detecting, quantifying, and inferring characteristics, properties, and compositions (both static and dynamic). The systems combine machine vision and mathematical models, which enables direct observation and detection of static and dynamic multi-phase fluid flow properties and phenomena (e.g., voids, waves, shadows, dimples, wrinkles, foam, bubbles, particulates, discrete materials, collections of materials, and position) and inference of other properties and phenomena (e.g., flow regimes, bubble velocities and accelerations, material deposition rates, erosion rates, phasic critical behavioral points related to heat transfer, and the volumetric and mass flow rates of the phases), all of which are used to monitor and control systems applied to a multi-phase fluid flow system.
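As an illustration of the observe-then-infer pattern in this abstract, the sketch below locates a bubble-like feature in two synthetic frames with OpenCV and infers a velocity from its displacement. The Hough-circle detector, frame interval, and synthetic imagery are assumptions for illustration, not the patented models.

```python
# Minimal sketch (not the patented system): detect a bubble-like feature in two
# successive frames, then infer a dynamic property (velocity) from the observations.
import numpy as np
import cv2

def synthetic_frame(cx: int) -> np.ndarray:
    """Grayscale frame with one dark circular 'bubble' at horizontal position cx."""
    frame = np.full((240, 320), 255, dtype=np.uint8)
    cv2.circle(frame, (cx, 120), 15, 40, thickness=-1)
    return frame

def detect_bubble(frame: np.ndarray):
    """Return (x, y, r) of the strongest circular feature, or None."""
    blurred = cv2.GaussianBlur(frame, (9, 9), 2)
    circles = cv2.HoughCircles(blurred, cv2.HOUGH_GRADIENT, 1.2, 50,
                               param1=100, param2=20, minRadius=5, maxRadius=40)
    if circles is None:
        return None
    x, y, r = circles[0][0]
    return float(x), float(y), float(r)

# Direct observation: locate the bubble in frames captured dt seconds apart.
dt = 0.02  # assumed interval between frames, in seconds
p0 = detect_bubble(synthetic_frame(cx=100))
p1 = detect_bubble(synthetic_frame(cx=130))

# Inference: derive velocity (pixels/s) from the observed displacement.
if p0 and p1:
    vx = (p1[0] - p0[0]) / dt
    print(f"estimated bubble velocity: {vx:.0f} px/s")
```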
Abstract: Novel disposable pipette tips that enable spectroscopic analysis of analytes held within the tip while it is attached to a microspectrometer, i.e., a micropipette with the functional capability to irradiate an attached tip with light of a defined wavelength and to measure the impact of the sample within the tip on that light as the modified light is directed back to sensors on or within the instrument. Spectroscopic sample analysis is integral to a wide range of research sciences, including microbiology, molecular biology, medicine, chemistry, environmental science, food science, and forensics.
Type:
Grant
Filed:
November 15, 2017
Date of Patent:
July 14, 2020
Assignee:
Spectrum Perception LLC
Inventors:
Bradley Lynn Postier, Thomas Michael Spudich, Jr.
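A worked example of the kind of measurement the pipette-tip instrument pairing in the abstract above supports: converting blank and sample intensity readings into absorbance and then concentration via the Beer-Lambert law. The intensity values, molar absorptivity, and path length below are placeholders, not values from the patent.

```python
# Illustrative only: convert photodetector intensity readings to absorbance and
# estimate concentration via the Beer-Lambert law (A = epsilon * l * c).
import math

def absorbance(i_blank: float, i_sample: float) -> float:
    """A = -log10(I_sample / I_blank), intensities from the instrument's sensor."""
    return -math.log10(i_sample / i_blank)

def concentration(a: float, epsilon: float, path_length_cm: float) -> float:
    """c = A / (epsilon * l); epsilon in L/(mol*cm), path length in cm."""
    return a / (epsilon * path_length_cm)

a = absorbance(i_blank=1000.0, i_sample=250.0)           # ~0.602
c = concentration(a, epsilon=6220.0, path_length_cm=0.1)  # placeholder molar absorptivity
print(f"A = {a:.3f}, c = {c * 1e6:.1f} umol/L")
```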
Abstract: Embodiments of the invention provide apparatuses, methods, and systems for projecting images into a projection zone, while having the capability to detect the presence and movement of objects in the projection zone and to interact with those objects, according to programmed interactions. One of the programmed interactions may be to detect objects in the projection zone and avoid projecting light onto them. The capability to detect and avoid objects in the projection zone may allow for the use of high intensity light images including laser light images around people and animals without the risk of eye injury. Another programmed interaction may be to project an illuminated image around people and objects in the projection zone to emphasize their presence and movement.
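The detect-and-avoid interaction can be illustrated with a short sketch that blacks out projector pixels falling on detected objects so high-intensity light never reaches them; the bounding-box detection and frame dimensions are stand-ins, not the patent's detection method.

```python
# Sketch of the "detect and avoid" interaction: zero out outgoing projector pixels
# that fall on detected objects. Detection here is a placeholder bounding box.
import numpy as np
from typing import List, Tuple

def mask_projection(frame: np.ndarray,
                    detections: List[Tuple[int, int, int, int]]) -> np.ndarray:
    """Black out (x, y, w, h) regions of the outgoing projector frame."""
    safe = frame.copy()
    for x, y, w, h in detections:
        safe[y:y + h, x:x + w] = 0  # do not project onto the detected object
    return safe

projector_frame = np.full((480, 640, 3), 200, dtype=np.uint8)  # bright test image
detected_person = [(250, 100, 120, 300)]                        # hypothetical detection
safe_frame = mask_projection(projector_frame, detected_person)
print("masked pixels:", int((safe_frame == 0).all(axis=2).sum()))
```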
Abstract: An apparatus and method in which a light control signal comprising successive computer generated images is retrieved from a storage medium and played back in an image space, the point of view with respect to a frame of reference of the image space being synchronized with the changing point of view of images acquired in an object space by an image acquisition device whose point of view is sensed as it changes with respect to a frame of reference of the object space, the frame of reference of the image space corresponding to the frame of reference of the object space.
Abstract: Methods and an apparatus responsive to the sensed orientation and translatory position of an image acquisition device with respect to a three-dimensional reference frame of an object space, for providing and storing successive computer generated images with respect to a three-dimensional frame of reference of an image space synchronized with successive images acquired by the image acquisition device with respect to the three-dimensional reference frame of the object space, the successive computer generated images having a changing point of view that changes direction between images and being stored on a storage medium for playback and presentation to a viewer.
Abstract: Apparatus and method of (1) navigating an image acquisition device with both translatory and attitudinal movements while acquiring successive images in an object space and at the same time sensing the translatory and attitudinal movements of the device with respect to a three-dimensional reference frame of the object space, (2) providing successive computer generated (CG) images produced by a computer workstation from the successive images captured by the device in synchronization with the sensed translatory and attitudinal movements of the device with respect to the three-dimensional reference frame, and (3) storing the successive CG images on a non-transitory storage medium for later retrieval by a playback device for presentation by the playback device of said successive CG images to at least one eye of a viewer in an image space for perception of the successive CG images. A non-transitory storage medium storing the successive CG images is also provided.
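A minimal sketch of the synchronization idea running through these related abstracts: each sensed pose sample (translatory position plus attitude) drives the virtual camera of a computer generated frame, and the resulting sequence is recorded for later playback. The pose format and the stand-in renderer are assumptions for illustration only.

```python
# Sketch: generate and record CG frames in lockstep with sensed device poses.
# Pose values and the render step are placeholders, not the patented workflow.
from dataclasses import dataclass
from typing import List

@dataclass
class Pose:
    x: float; y: float; z: float           # translatory position (object-space frame)
    yaw: float; pitch: float; roll: float  # attitude in degrees

def render_cg_frame(pose: Pose) -> dict:
    """Stand-in for a renderer: tag each CG frame with the camera pose used."""
    return {"camera_pose": pose}

def record_sequence(sensed_poses: List[Pose]) -> List[dict]:
    """CG frames generated in lockstep with the sensed device poses,
    ready to be written to a storage medium for later playback."""
    return [render_cg_frame(p) for p in sensed_poses]

sensed = [Pose(0, 0, 0, 0, 0, 0), Pose(0.1, 0, 0, 5, 0, 0), Pose(0.2, 0, 0, 10, 2, 0)]
sequence = record_sequence(sensed)
print(len(sequence), "CG frames recorded, point of view matching the sensed path")
```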
Abstract: A method, and an apparatus performing the method, transmit an encoded light control signal comprising computer generated images over a network to a device that decodes the received signal and controls light in an image space so as to present the computer generated images with a point of view that changes in translatory position and orientation with respect to an image space frame of reference, corresponding to the changing point of view of images acquired in an object space by an image acquisition device whose translatory position and orientation are sensed as they change during navigation of the image acquisition device in the object space with respect to a frame of reference of the object space.
Abstract: A portable undercarriage vehicle inspection system (UVIS) (100) uses an under vehicle imaging (UVI) module (110) to capture an image of the undercarriage of a vehicle. The UVIS also includes multiple scene cameras (120) that capture the associated vehicle scene images, which are easy to view and manipulate. The undercarriage image and the associated vehicle scene images are provided to a power and communications unit (PCU) (140) through a network (130), such as Ethernet. These images may be stored in a database repository connected to the network. A notebook computer serves as an operator workstation (150, 152, 154) for displaying real-time as well as historical vehicular records. An operator viewing the images can enter additional information related to the images, such as comments and remarks, and archive all of the information for future reference and comparison.
Type:
Grant
Filed:
August 17, 2007
Date of Patent:
November 29, 2011
Assignee:
Perceptics, LLC
Inventors:
Juan A. Herrera, Charles A. Cruey, George E. Deichert, Alfred L Marston, III, Anthony S. Nelms, Christopher C. Richardson, Kent A. Rinehart, Richard P. Williams, Charles L. Guffey, Augustin L. Manolache
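The record flow in the UVIS abstract above (undercarriage image, scene images, and operator remarks archived for later comparison) might be bundled roughly as in the sketch below; the SQLite store, file paths, and field names are illustrative, not the UVIS design.

```python
# Sketch of the archival step: bundle the undercarriage image, the scene images,
# and operator remarks into one timestamped record. Paths and schema are assumed.
import json
import sqlite3
from datetime import datetime, timezone

def archive_record(db_path: str, undercarriage_img: str,
                   scene_imgs: list, remarks: str) -> None:
    conn = sqlite3.connect(db_path)
    conn.execute("""CREATE TABLE IF NOT EXISTS inspections (
                        ts TEXT, undercarriage TEXT, scenes TEXT, remarks TEXT)""")
    conn.execute("INSERT INTO inspections VALUES (?, ?, ?, ?)",
                 (datetime.now(timezone.utc).isoformat(), undercarriage_img,
                  json.dumps(scene_imgs), remarks))
    conn.commit()
    conn.close()

archive_record("uvis.db",
               undercarriage_img="captures/uvi_0001.png",
               scene_imgs=["captures/front.png", "captures/rear.png"],
               remarks="No anomalies; compare against prior visit.")
```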