WEARABLE IMAGING SENSOR WITH WIRELESS REMOTE COMMUNICATIONS
A wearable image sensor is described. In one example an apparatus includes a camera attached to a garment to capture an image of a view of an area surrounding a user that is wearing the garment, the image including an item. A data interface is attached to the garment and coupled to the camera to send the camera image to an external device and to receive description information about the item from the external device. A power supply is attached to the garment and coupled to the camera and the data interface to power the camera and the data interface. The apparatus presents the received description information to a user of the garment.
The present application is a continuation of prior U.S. patent application Ser. No. 13/717,254, filed Dec. 17, 2012, entitled Wearable Imaging Sensor for Communications, by Ravi Pillarisetty, et al., the priority of which is hereby claimed and the contents of which are hereby incorporated by reference herein.
FIELD

The present description relates to wearable sensors and in particular to a wearable camera capable of connection to display and information systems.
BACKGROUND

A variety of applications have developed that allow smartphone users to use the built-in cameras that are included in many such phones. In some cases, data from the camera is sent by the smartphone to servers for some purpose. As examples of using a camera, the phone may be used to send images to friends, to upload pictures to social networking or photo sharing sites, or to find product information using a camera image of a Quick Response Code.
Embodiments of the invention are illustrated by way of example, and not by way of limitation, in the figures of the accompanying drawings in which like reference numerals refer to similar elements.
Wearable technology can extend mobile computer usage beyond its current state. When a sensor is integrated into a mobile device, the user may be required to hold the device one way to image an object and a different way to use the device. In addition, because sensors such as cameras are directional, with a limited field of view of typically 30 to 70 degrees horizontal, the camera can capture only a limited view of the area surrounding a user, not a full 180 degrees around the user. Even this limited directional view can be captured only when the camera is held in a particular way.
By mounting, attaching, integrating, or embedding the camera into a shirt, hat, or pants, all objects in front of the user may be sensed, imaged, and captured in a panoramic 180 degree visual plane. The sensor element may be integrated into clothing either as an attachable element, such as a pin, or as a semi-permanently attached element.
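As a minimal sketch of how such a panoramic capture might be assembled, the following example assumes several directional camera modules spread across the garment whose same-height frames are simply concatenated left to right; the module layout and the use of plain concatenation rather than true image stitching are illustrative assumptions, not details taken from this disclosure.

```python
# Illustrative sketch: combine frames from several directional camera modules
# spread across the garment into one strip covering roughly 180 degrees in
# front of the user. Horizontal concatenation is an assumed simplification;
# a real system might use proper image stitching with overlap blending.

import numpy as np


def combine_panorama(frames: list[np.ndarray]) -> np.ndarray:
    """Concatenate same-height grayscale frames ordered left to right across the garment."""
    if not frames:
        raise ValueError("at least one camera frame is required")
    height = min(f.shape[0] for f in frames)
    cropped = [f[:height, :] for f in frames]  # crop to a common height before joining
    return np.hstack(cropped)


# Example with three stand-in frames from left, center, and right camera modules.
left, center, right = (np.zeros((480, 640), dtype=np.uint8) for _ in range(3))
panorama = combine_panorama([left, center, right])  # resulting shape: (480, 1920)
```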
The cameras or sensors may then transfer the image to the user's handheld or other type of mobile device. In one embodiment, the sensor may be equipped with circuitry and a wireless antenna to transmit data to a mobile device. In another example, the sensor may be equipped with fiber optic ports to seamlessly transfer data at a high rate of speed to the mobile device. The captured images may then be used with image recognition software to help improve the overall mobile experience of the user.
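A minimal sketch of the wearable-side image hand-off follows. The transport (a plain TCP socket over a local link), the address, the length-prefix framing, and the stand-in image file are all illustrative assumptions; the garment sensor could equally use Bluetooth or the optical-fiber port mentioned above.

```python
# Sketch of sending one captured frame from the garment sensor to the user's
# mobile device. Transport, address, and framing are assumptions made for
# illustration only.

import socket
import struct

PHONE_ADDR = ("192.168.1.50", 5600)  # assumed address of the paired mobile device


def send_frame(jpeg_bytes: bytes, addr=PHONE_ADDR) -> None:
    """Send one JPEG-encoded frame, prefixed with its 4-byte big-endian length."""
    with socket.create_connection(addr, timeout=5.0) as sock:
        sock.sendall(struct.pack(">I", len(jpeg_bytes)) + jpeg_bytes)


if __name__ == "__main__":
    # In a real garment sensor these bytes would come from the camera module;
    # here a stored test image stands in for a captured frame.
    with open("captured_frame.jpg", "rb") as f:
        send_frame(f.read())
```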
The data interface of the system on a chip 16 receives the images and processes them depending on the particular application. The external antenna 20 allows the processor to communicate with external devices 22 such as a smartphone, tablet, external computer, wearable computer, or data relay terminal. In the example of
While the processing resources 16 and power supply 18 are shown as being separate and apart from the camera module 12, this is not required. All components may be integrated into a single camera module which transmits information directly to an external device. The images captured by the camera may be further processed by the camera module 12 or by the connected processor 16. Alternatively, raw image data may be sent directly to an external device 22 for processing.
The processing module is also coupled to a display 42 which, in this case, is connected to or embedded in a sleeve of the shirt. The display may be a touch screen display or it may include user interface buttons that allow the user to view the display and send commands and instructions from the display or associated components 42 back to the processing system 36. In this example, the shirt 30 is a wearable computer with an awareness of its surrounding environment through the camera module 32.
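The disclosure does not define a format for the commands sent from the display back to the processing system, so the following sketch assumes a simple JSON-encoded command message; the command names and fields are hypothetical and chosen only to illustrate the idea of user controls on the sleeve display.

```python
# Illustrative sketch of command messages a sleeve display might send back to
# the processing system (element 36). Command names and encoding are assumed.

import json
from dataclasses import dataclass
from enum import Enum


class DisplayCommand(str, Enum):
    CAPTURE_NOW = "capture_now"    # user taps to capture an image immediately
    REQUEST_INFO = "request_info"  # ask for description info about the last image
    SET_INTERVAL = "set_interval"  # change the periodic capture interval


@dataclass
class CommandMessage:
    command: DisplayCommand
    argument: float | None = None  # e.g. interval in seconds for SET_INTERVAL

    def encode(self) -> bytes:
        return json.dumps({"command": self.command.value, "argument": self.argument}).encode()

    @staticmethod
    def decode(raw: bytes) -> "CommandMessage":
        data = json.loads(raw)
        return CommandMessage(DisplayCommand(data["command"]), data.get("argument"))


# Example: the touch screen asks the processing system for a new capture.
msg = CommandMessage(DisplayCommand.CAPTURE_NOW)
assert CommandMessage.decode(msg.encode()).command is DisplayCommand.CAPTURE_NOW
```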
While only one camera module is shown in the examples of
The camera 12, 32 and SOC 16, 36 may be embedded in, incorporated into, or attached to the garment in any of a variety of different ways. They may be connected using a pin through the fabric of the garment so that the camera may easily be removed and then attached to other garments. Straps and belts may alternatively be used. Similarly, hook and loop fasteners may be used to hold the camera sensor, SOC, and screen to the garment. They may be held in some type of holder incorporated into the garment such as a special pocket, flap, or tab. They may be sewn into the garment as a separate structure such as the button camera 32 of
A variety of different kinds of communications are possible using the wearable camera system shown in
The communication of
The tourist may obtain information about specific items displayed on store shelves or about monuments in a city park. Similarly, the maintenance worker may obtain information about large systems or detailed service information about a specific piece of equipment at a facility.
When used with a smart phone, the wearable camera sensor system requires only a low power, short range connection to the smart phone, such as Bluetooth, Ultra-Low Power WiFi, or NFC (Near Field Communications). This allows for a lighter, smaller system with less need for recharging. The smart phone may then use a higher power, longer range radio, such as mobile cellular. Alternatively, a wearable system may be used in the same way except that the user views the information on the sleeve display screen 42. With the camera mounted in a separate, independent position, the sleeve display may be held in any position and yet the system obtains information about the environment in front of the user.
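A sketch of the relay role the smart phone plays here follows: frames arrive over the low-power short-range link and are forwarded over the phone's long-range radio to a remote recognition service, whose description text is then returned for presentation. The service URL and the plain-text response format are hypothetical assumptions, not details of the disclosure.

```python
# Sketch of the phone-side relay: forward a frame received from the wearable
# sensor to a remote recognition service and return its description text.
# The URL and response format are assumptions made for illustration.

import urllib.request

RECOGNITION_URL = "https://example.com/recognize"  # hypothetical remote server


def relay_frame(jpeg_bytes: bytes, url: str = RECOGNITION_URL) -> str:
    """Forward one frame to the remote server and return its description text."""
    request = urllib.request.Request(
        url,
        data=jpeg_bytes,
        headers={"Content-Type": "image/jpeg"},
        method="POST",
    )
    with urllib.request.urlopen(request, timeout=10.0) as response:
        return response.read().decode("utf-8")
```

The returned description could then be shown on the sleeve display 42 or on the phone itself, consistent with the alternatives described above.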
In the described examples the camera system 12, 32 may take many different configurations. It may be attached to the garment as an accessory or integrated into the garment as a button or a nonfunctional item.
Each camera module of
Any type of camera module may be attached to a garment 10, 30 using a clip or a pin. The camera module may also be permanently attached by being sewn on such as the example of the camera sensor 32 of
The processing system 16, 36 may take a variety of different forms. A simple example is shown in the block diagram of
The controller 96 may also be connected to a communications interface 98 with, for example, an antenna 99 with which to send and receive data with external devices. Depending on the particular implementation, data from the camera 92 may be delivered through the data interface 94 directly to the communications interface 98 to be sent through the antenna 99 to other devices. Similarly, information may be received through the communications interface 98 and passed to the controller for communication to the user. As shown, a common bus connects the data interface, communications interface, and controller to each other to exchange data, signaling, and commands. The memory may be connected directly to the bus or connected through the controller, depending on the implementation.
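The following is a functional sketch of that data flow, not firmware: the controller moves frames from the camera's data interface either out through the communications interface or into a small memory buffer. The class and method names, and the use of a callable to stand in for the communications interface, are assumptions for illustration.

```python
# Functional model of the block diagram: controller (96) routes frames from the
# data interface (94) to the communications interface (98/99) and buffers them
# in memory. Names and structure are illustrative assumptions.

from collections import deque
from typing import Callable


class Controller:
    def __init__(self, transmit: Callable[[bytes], None], memory_frames: int = 8):
        self.transmit = transmit                   # stands in for comms interface 98
        self.memory = deque(maxlen=memory_frames)  # small frame buffer (memory block)

    def on_frame(self, frame: bytes, send_now: bool = True) -> None:
        """Called by the data interface (94) whenever the camera delivers a frame."""
        self.memory.append(frame)
        if send_now:
            self.transmit(frame)                   # hand the frame to the antenna path (99)

    def on_received(self, info: bytes) -> None:
        """Called when description information arrives back from an external device."""
        print("Description received:", info.decode("utf-8", errors="replace"))
```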
This process may be repeated as the camera sensor continues to capture images. The process may be timed so that a new image is sent at specific time intervals, such as once a second, once a minute, once every ten minutes, etc. The process may be triggered by user command, by a remote command, or by the system. For example, the camera sensor may capture images and perform an analysis to determine whether the scene has significantly changed. A significant change may be used as a trigger to send a new image. In addition, the image may be sent with additional information or commands based on an application currently in use or a request from the user.
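One simple way such a scene-change trigger could work is sketched below: a new frame is sent only when it differs enough from the last transmitted frame. The mean absolute pixel difference metric and the threshold value are illustrative choices; the disclosure does not specify a particular change measure.

```python
# Sketch of a scene-change trigger: send a frame only when it differs enough
# from the last frame that was sent. Metric and threshold are assumptions.

import numpy as np

CHANGE_THRESHOLD = 12.0  # assumed mean per-pixel difference on a 0-255 scale


class ChangeTrigger:
    def __init__(self, threshold: float = CHANGE_THRESHOLD):
        self.threshold = threshold
        self.last_sent: np.ndarray | None = None

    def should_send(self, frame: np.ndarray) -> bool:
        """Return True if this grayscale frame changed significantly since the last send."""
        if self.last_sent is None:
            self.last_sent = frame
            return True  # always send the first captured frame
        diff = np.mean(np.abs(frame.astype(np.float32) - self.last_sent.astype(np.float32)))
        if diff > self.threshold:
            self.last_sent = frame
            return True
        return False
```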
Depending on its applications, computing device 500 may include other components that may or may not be physically and electrically coupled to the board 502. These other components include, but are not limited to, volatile memory (e.g., DRAM) 508, non-volatile memory (e.g., ROM) 509, flash memory (not shown), a graphics processor 512, a digital signal processor (not shown), a crypto processor (not shown), a chipset 514, an antenna 516, a display 518 such as a touchscreen display, a touchscreen controller 520, a battery 522, an audio codec (not shown), a video codec (not shown), a power amplifier 524, a global positioning system (GPS) device 526, a compass 528, an accelerometer (not shown), a gyroscope (not shown), a speaker 530, a camera 532, and a mass storage device (such as hard disk drive) 510. These components may be connected to the system board 502, mounted to the system board, or combined with any of the other components.
The communication chip 506 enables wireless and/or wired communications for the transfer of data to and from the computing device 500. The term "wireless" and its derivatives may be used to describe circuits, devices, systems, methods, techniques, communications channels, etc., that may communicate data through the use of modulated electromagnetic radiation through a non-solid medium. The term does not imply that the associated devices do not contain any wires, although in some embodiments they might not. The communication chip 506 may implement any of a number of wireless or wired standards or protocols, including but not limited to Wi-Fi (IEEE 802.11 family), WiMAX (IEEE 802.16 family), IEEE 802.20, long term evolution (LTE), Ev-DO, HSPA+, HSDPA+, HSUPA+, EDGE, GSM, GPRS, CDMA, TDMA, DECT, Bluetooth, Ethernet, derivatives thereof, as well as any other wireless and wired protocols that are designated as 3G, 4G, 5G, and beyond. The computing device 500 may include a plurality of communication chips 506. For instance, a first communication chip 506 may be dedicated to shorter range wireless communications such as Wi-Fi and Bluetooth and a second communication chip 506 may be dedicated to longer range wireless communications such as GPS, EDGE, GPRS, CDMA, WiMAX, LTE, Ev-DO, and others.
The processor 504 of the computing device 500 includes an integrated circuit die packaged within the processor 504. The term “processor” may refer to any device or portion of a device that processes electronic data from registers and/or memory to transform that electronic data into other electronic data that may be stored in registers and/or memory.
In various implementations, the computing device 500 may be a laptop, a netbook, a notebook, an ultrabook, a smartphone, a tablet, a personal digital assistant (PDA), an ultra mobile PC, a mobile phone, a desktop computer, a server, a printer, a scanner, a monitor, a set-top box, an entertainment control unit, a digital camera, a portable music player, or a digital video recorder. In further implementations, the computing device 500 may be any other electronic device that processes data.
Embodiments may be implemented as a part of one or more memory chips, controllers, CPUs (Central Processing Unit), microchips or integrated circuits interconnected using a motherboard, an application specific integrated circuit (ASIC), and/or a field programmable gate array (FPGA).
References to “one embodiment”, “an embodiment”, “example embodiment”, “various embodiments”, etc., indicate that the embodiment(s) of the invention so described may include particular features, structures, or characteristics, but not every embodiment necessarily includes the particular features, structures, or characteristics. Further, some embodiments may have some, all, or none of the features described for other embodiments.
In the following description and claims, the term “coupled” along with its derivatives, may be used. “Coupled” is used to indicate that two or more elements co-operate or interact with each other, but they may or may not have intervening physical or electrical components between them.
As used in the claims, unless otherwise specified, the use of the ordinal adjectives "first", "second", "third", etc., to describe a common element merely indicates that different instances of like elements are being referred to, and is not intended to imply that the elements so described must be in a given sequence, either temporally, spatially, in ranking, or in any other manner.
The drawings and the foregoing description give examples of embodiments. Those skilled in the art will appreciate that one or more of the described elements may well be combined into a single functional element. Alternatively, certain elements may be split into multiple functional elements. Elements from one embodiment may be added to another embodiment. For example, orders of processes described herein may be changed and are not limited to the manner described herein. Moreover, the actions of any flow diagram need not be implemented in the order shown; nor do all of the acts necessarily need to be performed. Also, those acts that are not dependent on other acts may be performed in parallel with the other acts. The scope of embodiments is by no means limited by these specific examples. Numerous variations, whether explicitly given in the specification or not, such as differences in structure, dimension, and use of material, are possible. The scope of embodiments is at least as broad as given by the following claims.
The following examples pertain to further embodiments. The various features of the different embodiments may be variously combined, with some features included and others excluded, to suit a variety of different applications. Some embodiments pertain to an apparatus that comprises a camera to capture images with a wide field of view, a data interface to send camera images to an external device, and a power supply to power the camera and the data interface. The camera, data interface, and power supply are attached to a garment. In further embodiments, the camera is integrated into the garment, such as by using a pin or by being sewn to the garment.
In further embodiments, the camera has a panoramic field of view or a 180 degree horizontal field of view. In further embodiments, the data interface is an optic fiber interface or a wireless interface, and the external device is a cellular telephone. In further embodiments, the apparatus further includes a processor and image recognition software to process images captured by the camera before they are sent by the data interface.
In another embodiment an imaging and communication system comprises a camera to capture images and to send the images through a short range wireless interface, a data interface to receive the images from the camera through the short range wireless interface, a processor coupled to the data interface to process the images for analysis, and a long range wireless interface coupled to the processor to send the processed images to a remote device.
In further embodiments, the long range wireless interface is further to receive information about the sent images from the remote device, a display is coupled to the processor to display the received information about the images, and a control interface is coupled to the display to allow user control of the displayed information.
In another embodiment, a method comprises capturing an image in a camera attached to a garment, sending the image to an external device for analysis, receiving the analysis from the external device, and presenting the analysis to a user of the garment.
In further embodiments, sending the image comprises sending the image to a local portable device and the local portable device sending the image to a remote server. In further embodiments, the local portable device is also attached to the garment, or the local portable device is a cellular telephone. In further embodiments, sending the image further comprises sending additional information about the image, including time and location. In further embodiments, presenting the analysis comprises presenting information about the image on a display that is movable independently of the camera.
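A minimal sketch of that last variation, packaging the captured image with time and location metadata before it is sent for analysis, appears below. The JSON/base64 envelope is an assumed format, and the coordinates would typically come from the paired phone's GPS device rather than from the garment sensor itself.

```python
# Sketch of wrapping one frame with time and location metadata before sending.
# Envelope format and GPS source are assumptions made for illustration.

import base64
import json
import time


def package_image(jpeg_bytes: bytes, latitude: float, longitude: float) -> bytes:
    """Wrap one frame and its metadata into a JSON envelope ready to send."""
    envelope = {
        "timestamp": time.time(),                        # capture time, seconds since epoch
        "location": {"lat": latitude, "lon": longitude}, # e.g. from the phone's GPS
        "image": base64.b64encode(jpeg_bytes).decode("ascii"),
    }
    return json.dumps(envelope).encode("utf-8")
```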
Claims
1. An apparatus comprising:
- a camera attached to a garment to capture an image of a view of an area surrounding a user that is wearing the garment, the image including an item;
- a data interface attached to the garment and coupled to the camera to send the camera image to an external device and to receive description information about the item from the external device; and
- a power supply attached to the garment and coupled to the camera and the data interface to power the camera and the data interface,
- wherein the apparatus presents the received description information to a user of the garment.
2. The apparatus of claim 1, wherein the camera has a panoramic field of view.
3. The apparatus of claim 2, wherein the camera has a 180 degree horizontal field of view.
4. The apparatus of claim 1, wherein the data interface is an optic fiber interface coupled to a local processing device.
5. The apparatus of claim 4, wherein the local processing device is a separate mobile device.
6. The apparatus of claim 1, wherein the data interface is a wireless interface through a mobile device and the external device is a remote server.
7. The apparatus of claim 1, further comprising a processor and image recognition software to process a sequence of images captured by the camera and to select an image in the sequence that shows a significant change from an image before the selected image and wherein the data interface is to send the selected image to the external device.
8. The apparatus of claim 1, further comprising a display coupled to the data interface to present the received description information to the user.
9. The apparatus of claim 8, further comprising a handheld device with a wireless connection to the data interface and wherein the handheld device comprises the display.
10. The apparatus of claim 9, wherein the handheld device comprises a long range radio and wherein the data interface sends to the external device through the long range radio of the handheld device.
11. The apparatus of claim 1, wherein the apparatus presents the received description information as sound.
12. The apparatus of claim 1, wherein the description information is historical and status information about the item.
13. An imaging and communication system comprising:
- a camera attached to a garment to capture images of a view of an area surrounding a user that is wearing the garment, the images including an item, and to send the images through a short range wireless interface;
- a handheld device having a data interface to receive the images from the camera through the short range wireless interface;
- a processor of the handheld device coupled to the data interface;
- a long range wireless interface of the handheld coupled to the processor to send the images to an external device and to receive description information about the item from the external device; and
- a display of the handheld device to present the received description information.
14. The system of claim 13, further comprising a control interface of the handheld device coupled to the display to allow user control of the displayed information.
15. The system of claim 13, wherein the description information is historical and status information about the item.
16. A method comprising:
- capturing an image in a camera attached to a garment, the image including an item;
- sending the image to an external device for analysis;
- receiving description information about the item from the external device; and
- presenting the received description information to a user of the garment.
17. The method of claim 16, wherein sending the image comprises sending the image to a local portable device and the local portable device sending the image to a remote server.
18. The method of claim 17, wherein the local portable device is also attached to the garment.
19. The method of claim 18, wherein presenting the received description information comprises presenting information about the item on a display of the local portable device.
20. The method of claim 16, wherein sending the image further comprises sending additional information about the image including time and location.
Type: Application
Filed: Mar 7, 2016
Publication Date: Jun 30, 2016
Applicant: Intel Corporation (Santa Clara, CA)
Inventors: Ravi Pillarisetty (Portland, OR), Sairam Agraharam (Chandler, AZ), John S. Guzek (Chandler, AZ), Christopher J. Jezewski (Hillsboro, OR)
Application Number: 15/062,867