HANDS-FREE BODY-MOUNTED PRESENTATION SCANNING

Symbol Technologies, Inc.

A mobile device is described. The mobile device includes a housing having a pistol grip portion. A data acquisition device is positioned in the housing adjacent to the pistol grip portion. The data acquisition device is adapted to acquire data from an object. The data acquisition device generates an electrical signal that is representative of the data. A control activates the data acquisition device. A processor receives the electrical signal from the data acquisition device. A support is worn on a torso of a user. The support supports the housing by the pistol grip portion such that the data acquisition device is positioned to acquire data from the object when the object is located in front of the torso of the user. The support enables hands-free operation of the mobile device.

Description
CROSS-REFERENCE TO RELATED APPLICATIONS

This application claims priority to U.S. Provisional Patent Application Ser. No. 61/741,163 filed Jul. 13, 2012, entitled, “MOBILE COMPUTING DEVICE INCLUDING AN ERGONOMIC HANDLE,” the contents of which are expressly incorporated herein by reference in their entirety.

TECHNICAL FIELD

The present disclosure relates generally to hands-free data acquisition for mobile devices.

BACKGROUND

Hands-free presentation scanning is conventionally achieved using a scanning device that is mounted to a surface, such as a table, shelf or a cart. The scanning device is usually activated in an always-on mode of operation. An object having a barcode symbol is moved through a field-of-view of the scanning device and data is acquired when the scanning device recognizes and reads the barcode symbol.

SUMMARY

In one aspect, the invention is embodied in a mobile device. The mobile device includes a housing having a pistol grip portion. A data acquisition device is positioned in the housing adjacent to the pistol grip portion. The data acquisition device acquires data from an object. The data acquisition device generates an electrical signal representative of the data. A control activates the data acquisition device. A processor receives the electrical signal from the data acquisition device. A support is worn on a torso of a user. The support supports the housing by the pistol grip portion such that the data acquisition device is positioned to acquire data from the object when the object is located in front of the torso of the user. The support enables hands-free operation of the mobile device.

In one embodiment, the data acquisition device is one of a laser scanner, an imager, and a radio-frequency identification (RFID) reader. An illumination system can be aligned with the data acquisition device for illuminating at least a portion of the object.

In one embodiment, the control for activating the data acquisition device includes a sensor for automatically sensing a presence of the object proximate to the data acquisition device and activating the data acquisition device upon sensing the presence of the object. The control for activating the data acquisition device can include a switch for activating the data acquisition device in an always-on presentation mode. The control for activating the data acquisition device can include a timer that periodically activates the data acquisition device.

In one embodiment, the control for activating the data acquisition device includes a motion sensor coupled to the housing that activates the data acquisition device upon sensing an impact to the housing. The control for activating the data acquisition device can include a speech recognition module that activates the data acquisition device upon receiving a command spoken into a microphone of the mobile device. In one embodiment, the control for activating the data acquisition device includes a video analytic module that activates the data acquisition device upon recognizing an image related to the object that is captured by a camera of the mobile device.

In one embodiment, the support can include a holster coupled to a strap that is adapted to surround at least a portion of the torso of the user. Alternatively, the support can include a clip for coupling the mobile device to one of a belt, a lanyard and an article of clothing of the user.

In another aspect, the invention is embodied in a torso-mountable mobile device for capturing a barcode symbol. The torso-mountable mobile device includes a housing having a pistol grip portion. An imager is positioned in the housing adjacent to the pistol grip portion for capturing the barcode symbol. The imager generates image data corresponding to the barcode symbol. A control activates the imager. A processor receives the image data from the imager. A support is worn on a torso of a user for supporting the housing by the pistol grip portion such that the imager is positioned to capture the barcode symbol when the barcode symbol is located in front of the torso of the user. The support enables hands-free operation of the mobile device.

In one embodiment, the mobile device includes an illumination system aligned with the imager for illuminating the barcode symbol. The mobile device can also include a radio-frequency identification (RFID) reader.

In one embodiment, the control for activating the imager includes a sensor for automatically sensing a presence of an object having the barcode symbol proximate to the imager and activating the imager upon sensing the presence of the object. The control for activating the imager can include a switch for activating the imager in an always-on presentation mode. The control for activating the imager can include a timer that periodically activates the imager.

In one embodiment, the control for activating the imager includes a motion sensor coupled to the housing that activates the imager upon sensing an impact to the housing. The control for activating the imager can include a speech recognition module that activates the imager upon receiving a command spoken into a microphone of the mobile device. In one embodiment, the control for activating the imager includes a video analytic module that activates the imager upon recognizing an image corresponding to the object that is captured by the imager of the mobile device.

In one embodiment, the support includes a holster coupled to a strap that is adapted to surround at least a portion of the torso of the user. The support can alternatively include a clip for coupling the mobile device to one of a belt, a lanyard and an article of clothing of the user.

BRIEF DESCRIPTION OF THE FIGURES

Skilled artisans will appreciate that elements in the figures are illustrated for simplicity and clarity and have not necessarily been drawn to scale. For example, the dimensions of some of the elements in the figures may be exaggerated relative to other elements to help improve understanding of various embodiments. In addition, the description and drawings do not necessarily require the order illustrated. It will be further appreciated that certain actions and/or steps may be described or depicted in a particular order of occurrence while those skilled in the art will understand that such specificity with respect to sequence is not actually required. Apparatus and method components have been represented where appropriate by conventional symbols in the drawings, showing only those specific details that are pertinent to understanding the various embodiments so as not to obscure the disclosure with details that will be readily apparent to those of ordinary skill in the art having the benefit of the description herein. Thus, it will be appreciated that for simplicity and clarity of illustration, common and well-understood elements that are useful or necessary in a commercially feasible embodiment may not be depicted in order to facilitate a less obstructed view of these various embodiments.

The above and further advantages of this invention may be better understood by referring to the following description in conjunction with the accompanying drawings, in which like numerals indicate like structural elements and features in the various figures. Skilled artisans will appreciate that reference designators shown herein in parentheses indicate components shown in a figure other than the one under discussion. For example, referring to a device (10) while discussing Figure A would refer to an element 10 shown in a figure other than Figure A.

FIG. 1 illustrates a block diagram of the components of a mobile device according to one embodiment of the invention.

FIG. 2 illustrates a perspective view of a mobile device according to one embodiment of the invention.

FIG. 3 illustrates the mobile device positioned in a holster according to one embodiment of the invention.

FIG. 4 illustrates a graphical representation of various locations on a torso of a user for positioning a mobile device according to embodiments of the invention.

FIG. 5 illustrates a range of scanning fields according to one embodiment of the invention.

FIG. 6A illustrates a mobile device in operation according to one embodiment of the invention.

FIG. 6B illustrates a mobile device in operation according to one embodiment of the invention.

FIG. 7A illustrates an aiming pattern for a data capture module of the mobile device according to one embodiment of the invention.

FIG. 7B illustrates another aiming pattern for a data capture module of the mobile device according to one embodiment of the invention.

DETAILED DESCRIPTION

The following detailed description is merely illustrative in nature and is not intended to limit the invention or the application and uses of the invention. Furthermore, there is no intention to be bound by any express or implied theory presented in the preceding technical field, background, brief summary or the following detailed description. For the purposes of conciseness, many conventional techniques and principles related to acquiring data from an object need not be, and are not, described in detail herein.

Techniques and technologies may be described herein in terms of functional and/or logical block components and various processing steps. It should be appreciated that such block components may be realized by any number of hardware, software, and/or firmware components configured to perform the specified functions. For example, an embodiment of a system or a component may employ various integrated circuit components, e.g., memory elements, digital signal processing elements, logic elements, look-up tables, or the like, which may carry out a variety of functions under the control of one or more microprocessors or other control devices.

The following description may refer to elements or nodes or features being “connected” or “coupled” together. As used herein, unless expressly stated otherwise, “connected” means that one element/node/feature is directly joined to (or directly communicates with) another element/node/feature, and not necessarily mechanically. Likewise, unless expressly stated otherwise, “coupled” means that one element/node/feature is directly or indirectly joined to (or directly or indirectly communicates with) another element/node/feature, and not necessarily mechanically. The term “exemplary” is used in the sense of “example, instance, or illustration” rather than “model,” or “deserving imitation.”

Technologies and concepts discussed herein relate to presentation scanning for hands-free operation of mobile devices. The mobile device includes a housing having a pistol grip portion. The mobile device can also include a data acquisition device positioned in the housing adjacent to the pistol grip portion for acquiring data from an object. The data acquisition device can generate an electrical signal representative of the data acquired from the object. The mobile device can also include a control for activating the data acquisition device. A processor receives the electrical signal from the data acquisition device.

A support is worn on a torso of a user. The support is adapted to support the housing by the pistol grip portion such that the data acquisition device is positioned to acquire data from the object when the object is located in front of the torso of the user. The support enables hands-free operation of the mobile device.

The mobile device can also include a display coupled to the housing such that the display is viewable by a user of the mobile device when the pistol grip portion is held in a hand of the user. A trigger-like control switch is located on the pistol grip portion of the mobile device.

FIG. 1 illustrates a block diagram of the components of a mobile device 100 according to one embodiment of the invention. The mobile device 100 includes a data acquisition device 102 supported by a housing 104. The data acquisition device 102 can include an imaging device, a laser scanning device, a radio-frequency identification (RFID) device, or a combination of devices. In practice, any suitable data acquisition device 102 can be used. In one embodiment, an optional light source 106 can be supported by the housing 104. The light source 106 can illuminate a target for data acquisition.

When the data acquisition device 102 includes an imager, the imager can be any component configured to capture image data. For example, the imager can include any type of image sensor or sensors. The imager can capture an image in a field of view (FoV) of the imager. In one embodiment, the image captured in the FoV of the imager can be displayed on a display (not shown).

The mobile device 100 can also include a processor 110, a memory 112, a trigger switch 114, a battery 116, a transceiver 118, a motion sensor 120, a charging connector 122, a microphone 124, a loudspeaker 126, a proximity sensor 128 and other optional components (not shown), such as a volume control, and/or control switches, for example.

The display can be any component configured to display data to a user. The display can include, for example, a liquid crystal display (LCD) at least partially disposed within the housing 104 of the mobile device 100. The display can include touch screen capability. The display can display a graphical user interface (GUI). The GUI can be programmed to activate different functions of the mobile device 100. For example, the processor 110 can generate the GUI on the display to provide icons corresponding to certain functionality of the mobile device 100.

The trigger switch 114 can activate different functions of the mobile device 100. For example, the trigger switch 114 can activate the data acquisition device 102 of the mobile device 100 in a handheld mode of operation.

The processor 110 can provide conventional functionalities for the mobile device 100. In a specific example according to the exemplary embodiments of the present invention and as will be described in further detail below, the mobile device 100 can include a plurality of software applications that are executed on the processor 110 such as a software application related to capturing and processing images, documents and video. The memory 112 can also provide conventional functionalities for the mobile device 100. For example, the memory 112 can store data and software applications related to operations performed by the processor 110.

The microphone 124 can be coupled to the processor 110 and used as an input device to control functions of the mobile device 100. For example, the processor 110 can perform speech recognition on data received from the microphone 124. In one embodiment, the user commands the mobile device 100 to activate the data acquisition device 102 by speaking into the microphone 124. The loudspeaker 126 can provide audio signals to a user. For example, the loudspeaker 126 can emit an audio signal indicating that data was successfully acquired. In one embodiment, the mobile device 100 includes an audio jack (not shown) that couples to an audio connector of a headset. The audio signal can be transmitted to the headset through the audio jack.
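
For illustration only, and not as part of the original disclosure, the following Python sketch shows one way a spoken command could activate the data acquisition device as described above; the recognizer and device interfaces, the method names, and the "scan" keyword are all hypothetical assumptions.

    # Illustrative sketch only; recognizer, device, and command are hypothetical.
    def on_microphone_audio(audio, recognizer, device, command="scan"):
        """Transcribe microphone audio and activate acquisition when the
        assumed wake phrase is heard."""
        text = recognizer.transcribe(audio)
        if command in text.lower():
            device.activate()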

A transceiver 118 can provide the mobile device 100 with a method of exchanging data with a communications network and/or other mobile devices. For example, the transceiver 118 can be a Bluetooth transceiver that wirelessly transmits audio signals to a Bluetooth-enabled headset.

The motion sensor 120 can detect a motion of the mobile device 100. The motion sensor 120 can be any device capable of detecting motion, such as an accelerometer or a gyroscopic device.

The battery 116 can be a rechargeable battery. The charging connector 122 is accessible to a corresponding connector on one end of a charging cable or in a charging cradle (not shown). In practice, the charging connector 122 can be a universal serial bus (USB) connector that conveys data as well as electrical current.

The proximity sensor 128 detects when an object is positioned proximate to the mobile device 100. The proximity sensor 128 is a sensor able to detect the presence of a nearby object without requiring any physical contact with the object. In one embodiment, the proximity sensor 128 can include an emitter and a detector. For example, the emitter can emit an electromagnetic field or a beam of electromagnetic radiation (such as infrared radiation). The detector can detect changes in the electromagnetic field or in a return signal.
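
As a hedged illustration of the emitter/detector principle described above (not part of the original disclosure), a proximity decision can be sketched as a simple comparison of the detected return signal against a no-object baseline; the function name, units, and threshold value are hypothetical.

    # Illustrative sketch only; threshold and units are assumptions.
    def object_is_proximate(detector_reading, baseline, threshold=0.2):
        """Return True when the return signal deviates from the no-object
        baseline by more than the assumed threshold."""
        return abs(detector_reading - baseline) > threshold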

The mobile device 100 can include additional components conventionally found in electronic devices, such as control switches, charging circuitry and one or more antennas, for example.

In operation, the mobile device 100 can be operated in a hands-free presentation mode. In one embodiment, a strap or belt (not shown) can be worn on a torso of a user. A holster (not shown) can be attached to the strap at a desired location. The location of the holster determines the position of a field of view of the data acquisition device 102 of the mobile device 100 when the mobile device 100 is supported by the holster. A control 130 activates the data acquisition device 102. In one embodiment, the data acquisition device 102 operates in an always-on presentation mode. In another embodiment, the data acquisition device 102 operates in a periodic mode. For example, the data acquisition device 102 can be switched on and off periodically. In practice, any desired operating mode can be used.
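
The following minimal Python sketch is offered only as an editorial illustration of the always-on and periodic modes described above; it assumes a hypothetical device object with activate() and deactivate() methods, and the period and cycle count are arbitrary.

    import time

    # Illustrative sketch only; the device interface and timing are assumptions.
    def run_presentation_mode(device, mode="always_on", period_s=1.0, cycles=10):
        """Keep the acquisition device active, or toggle it on a fixed period."""
        if mode == "always_on":
            device.activate()
        elif mode == "periodic":
            for _ in range(cycles):
                device.activate()
                time.sleep(period_s)
                device.deactivate()
                time.sleep(period_s)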

A user moves the desired object into the field of view of the data acquisition device 102 when the control 130 activates the data acquisition device 102. The data acquisition device 102 is positioned to acquire data from the object when the object is located in front of the torso of the user. The data acquisition device 102 generates an electrical signal representative of the data. The processor 110 receives the electrical signal from the data acquisition device 102. The processor 110 can decode the electrical signal. In one embodiment, the transceiver 118 of the mobile device 100 can transmit object data to a remote device, such as a network server.
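
As an illustration of the acquire, decode, and transmit flow described above (not part of the original disclosure), the steps can be sketched as a short pipeline; all object interfaces and method names are hypothetical.

    # Illustrative sketch only; all interfaces are hypothetical.
    def handle_scan(acquisition_device, decoder, transceiver):
        """Read the electrical signal, decode it, and forward the result."""
        signal = acquisition_device.read_signal()  # electrical signal from the device
        data = decoder.decode(signal)              # e.g. decoded barcode payload
        transceiver.send(data)                     # e.g. to a remote network server
        return data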

In one embodiment, the control 130 includes the proximity sensor 128 for automatically sensing a presence of the object proximate to the mobile device 100 and activating the data acquisition device 102 upon sensing the presence of the object. In one embodiment, the control 130 includes a timer 132 that periodically activates the data acquisition device 102.

In one embodiment, the control 130 includes a motion sensor 120 coupled to the housing 104 that activates the data acquisition device 102 upon sensing an impact to the housing 104. In one embodiment, the control 130 includes a speech recognition module 134 that activates the data acquisition device 102 upon receiving a command spoken into the microphone 124 of the mobile device 100. In one embodiment, the control 130 includes a video analytic module 136 that activates the data acquisition device 102 upon recognizing an image related to the object that is captured by a camera of the mobile device 100.
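
To illustrate, without asserting that this is the patented implementation, how the control 130 could arbitrate among the activation sources listed above, the sources can be modeled as simple predicates polled in turn; the callable interface and example names are assumptions.

    # Illustrative sketch only; each source is a zero-argument callable that
    # returns True when its activation condition (proximity, timer expiry,
    # impact, speech command, video analytics match) is met.
    def control_loop(device, sources):
        for triggered in sources:
            if triggered():
                device.activate()
                break

    # Example (hypothetical predicates):
    # control_loop(device, [proximity_fired, timer_expired, impact_sensed])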

FIG. 2 illustrates a perspective view of a mobile device 200 according to one embodiment of the invention. The mobile device 200 includes a housing 202 supporting a display 204. The display 204 can be a touch screen display. The housing 202 includes a handle portion 206 in the shape of a pistol grip. The handle portion 206 is configured to be held in a hand of a user. The handle portion 206 can be supported by a holster (not shown) to enable hands-free operation. The holster can be attached to a belt, a lanyard, a strap, or an article of clothing, such as a vest of a user.

A trigger switch 208 is located on the handle portion 206 of the mobile device 200. The trigger switch 208 is positioned on the handle portion 206 such that the trigger switch 208 is accessible to an index finger of a hand of a user when the handle portion 206 is held in the hand of the user in a hand-held mode of operation.

In one embodiment, the trigger switch 208 can be a momentary switch. Alternatively, the trigger switch 208 can be a “toggle” switch for continuous “on” or “off” operation. In one embodiment, the trigger switch 208 actuates at least one function of the mobile device 200. For example, the trigger switch 208 can activate a data capture function of the mobile device 200.

The trigger switch 208 can be a bifurcated switch (not shown) for controlling two or more functions of the mobile device 200. In one example, a bifurcated switch can have a top switch that activates a first function of the mobile device 200 and a bottom switch that activates a second function of the mobile device 200. In practice, any suitable switch can be used for the trigger switch 208.
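
As a hypothetical sketch only, the bifurcated-switch behavior described above could be handled by dispatching on the switch position; the position labels and the two functions are assumptions, since the disclosure does not name them.

    # Illustrative sketch only; position labels and actions are hypothetical.
    def on_trigger(position, first_function, second_function):
        """Invoke one of two functions depending on which half of the
        bifurcated switch was pressed."""
        if position == "top":
            first_function()
        elif position == "bottom":
            second_function()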

The mobile device 200 can also include a rotary switch 210 for controlling an audio function of the mobile device 200. For example, the rotary switch 210 can control a volume level of a speaker 212 or a gain level of a microphone 214 of the mobile device 200. In one embodiment, the rotary switch 210 controls a volume level of a headset coupled to the mobile device 200 through an audio jack 216 or a Bluetooth connection.

The mobile device 200 can also include a proximity sensor 218. The proximity sensor 218 can detect when an object is positioned proximate to the mobile device 200. A processor of the mobile device 200 can activate a data acquisition device 220 in response to the detection of the object by the proximity sensor 218.

The mobile device 200 can also include a motion sensor 222. The motion sensor 222 is coupled to the housing 202. In one embodiment, the motion sensor 222 activates the data acquisition device 220 upon sensing an impact to the housing 202. The motion sensor 222 can also determine when the mobile device 200 is in a stationary or moving state. For example, the motion sensor 222 can activate the data acquisition device 220 upon sensing that the mobile device 200 is in a stationary state for a predetermined time period. This can reduce performance degradation due to blur when the data acquisition device 220 is an imaging device, for example.
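
The stationary-state check described above can be illustrated with a short sketch (not part of the original disclosure) that treats the device as stationary when recent accelerometer magnitudes stay below a small threshold for an assumed window; the window length and threshold values are hypothetical.

    # Illustrative sketch only; window and threshold values are assumptions.
    def is_stationary(readings, window=10, motion_threshold=0.05):
        """readings: recent accelerometer magnitudes, newest last.
        Returns True when the device has been still for the whole window."""
        recent = readings[-window:]
        return len(recent) == window and all(r < motion_threshold for r in recent)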

In one embodiment, the hands-free presentation mode of the mobile device 200 is activated using a video analytic module 136 (FIG. 1) that activates the data acquisition device 220 upon recognizing an image related to an object that is captured by a camera 224 of the mobile device 200. In the embodiment shown, the camera 224 is a component that is separate from the data acquisition device 220. Alternatively, the camera 224 can be integrated with the data acquisition device 220.

In one embodiment, the mobile device 200 can be fabricated by forming the housing 202 having the handle portion 206 from a single piece of material. Alternatively, the housing 202 can be formed from several pieces of material, such as a front section and a back section. The display 204 is coupled to the housing 202 such that it is viewable by a user of the mobile device 200 when the handle portion 206 is held in the hand of the user.

The data capture module 220, such as an imaging device, can be coupled to the housing 202. The imaging device can capture images upon activation of the trigger switch 208. Alternatively, the data capture module 220 can be a laser scanning device and/or a radio-frequency identification (RFID) device.

FIG. 3 illustrates the mobile device 300 positioned in a holster 302 according to one embodiment of the invention. The holster 302 can be fabricated from any suitable material, such as plastic, nylon, leather, metal or cloth, for example. The holster 302 is coupled to a strap 304 with a coupling 306. The coupling 306 can include a joint (not shown) to allow the holster 302 to pivot relative to the strap 304. In one embodiment, the joint can pivot the holster 302 vertically and/or horizontally relative to the strap 304. For example, the joint can be a hinge joint, a universal joint, a ball joint or any other suitable joint.

The strap 304 can be fabricated from a flexible material, such as nylon fabric, plastic, leather, or any other suitable material. The strap 304 can include a buckle 308 for adjusting the length of the strap 304. The strap 304 can be worn around a hip, a torso, or a shoulder of a user. In one embodiment, the holster 302 can be used to support the mobile device 300. The coupling 306 that couples the holster 302 to the strap 304 can allow the mobile device 300 to be oriented such that a data acquisition device 310 is positioned to capture data in a hands-free presentation mode of operation.

In one embodiment, the mobile device 300 can include a sensor 310. For example, the sensor 310 can be a magnetic sensor, such as a Hall effect sensor, that senses the presence of a nearby magnet 312. The magnet 312 can be positioned in the holster 302 at a location that activates a switch in the sensor 310 when the mobile device 300 is positioned in the holster 302. For example, the sensor 310 can automatically detect when the mobile device 300 is holstered to activate a presentation mode of operation. Similarly, the sensor 310 can automatically detect when the mobile device 300 is removed from the holster 302 to initiate a handheld mode of operation.
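
As an editorial illustration of the holster-detection behavior described above, a mode switch keyed to the magnetic-sensor reading could look like the following; the set_mode interface and the mode names are hypothetical.

    # Illustrative sketch only; the device interface is an assumption.
    def update_mode(device, magnet_detected):
        """Enter the hands-free presentation mode when holstered; return to
        the handheld mode when the magnet is no longer sensed."""
        if magnet_detected:
            device.set_mode("presentation")
        else:
            device.set_mode("handheld")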

FIG. 4 illustrates a graphical representation 400 of various locations 402 on a torso 404 of a user 406 for positioning a mobile device according to embodiments of the invention. A strap 408 can be worn around the torso 404 in different configurations based on user requirements.

In one embodiment, the strap 408 can be worn across the chest 410 of the user 406 to position the mobile device such that a field of view 412 of the data acquisition device (not shown) is above the waist 414 and in front of the user 406.

In another embodiment, the strap 408 can be worn across the waist 414 of the user 406 to position the mobile device such that a midpoint of a field of view 416 of the data acquisition device (not shown) is approximately level to the waist 414 and in front of the user 406.

FIG. 5 illustrates a range of scanning fields 500 according to one embodiment of the invention. A holster (not shown) includes a coupling 502 having a joint that pivots about a vertical axis relative to a torso 504 of a user 506. In one embodiment, the holster (not shown) includes a coupling 502 having a joint that pivots about a horizontal axis relative to the torso 504 of the user 506. In practice, the coupling 502 can include a universal joint that can pivot in multiple dimensions.

FIG. 6A illustrates a mobile device 600 in operation according to one embodiment of the invention. The mobile device 600 is positioned in a holster 602. The holster 602 is supported by a belt 604. The belt 604 can include a buckle 606 for adjusting the size of the belt 604. The holster 602 can be positioned at any location along the length of the belt 604.

A user 608 activates a hands-free presentation mode of the mobile device 600 by any number of techniques. For example, the hands-free presentation mode can be activated via a control switch. The control switch can activate a data acquisition device 610 of the mobile device 600 such that the data acquisition device 610 operates in an always-on presentation mode. In one embodiment, the control switch can activate a timer that periodically activates the data acquisition device 610.

In one embodiment, the hands-free presentation mode is activated using a motion sensor (not shown) coupled to a housing of the mobile device 600 that activates the data acquisition device 610 upon sensing an impact to the housing.

In one embodiment, the hands-free presentation mode is activated using spoken commands via a speech recognition application. For example, the mobile device 600 can include a speech recognition module that activates the data acquisition device 610 upon receiving a command spoken into a microphone of the mobile device 600.

In one embodiment, the hands-free presentation mode of the mobile device 600 is activated using a video analytic module that activates the data acquisition device 610 upon recognizing an image related to an object 612 that is captured by a camera of the mobile device 600.

In one embodiment, the hands-free presentation mode can be activated using a proximity sensor (not shown) that detects when an object 612 of interest is positioned proximate to the mobile device 600. The processor of the mobile device 600 can activate the data acquisition device 610 upon receiving proximity data from the proximity sensor.

FIG. 6B illustrates a mobile device 600 in operation according to one embodiment of the invention. The mobile device 600 is positioned in the holster 602. The holster 602 is supported by a strap 614. The strap 614 can be positioned over the shoulder 616 and across the chest of the user 608. The holster 602 can be positioned at any location along the length of the strap 614.

FIG. 7A illustrates an aiming pattern 700 for a data capture module of a mobile device according to one embodiment of the invention. The aiming pattern 700 can be a grid-like pattern that is projected in front of the user 702. The grid-like pattern can be projected as a two-dimensional or three-dimensional pattern. The aiming pattern 700 can assist in positioning an object 704 so as to capture data from the object 704. For example, the aiming pattern 700 can provide a visual scan zone to help guide a user by projecting an effective scanning area. In one embodiment, the aiming pattern 700 can illuminate a barcode symbol 706 attached to the object 704 such that a data capture module of the mobile device can capture the barcode symbol 706.

FIG. 7B illustrates another aiming pattern 710 for a data capture module of the mobile device according to one embodiment of the invention. The aiming pattern 710 can be a spot pattern that is projected in front of the user 702. The spot pattern can be projected as a two-dimensional or three-dimensional pattern. The aiming pattern 710 can assist in positioning the object 704 so as to capture data from the object 704. For example, the aiming pattern 710 can provide a visual scan zone to help guide a user by projecting an effective scanning area. In one embodiment, the aiming pattern 710 can illuminate the barcode symbol 706 attached to the object 704 such that a data capture module of the mobile device can capture the barcode symbol 706.

In the foregoing specification, specific embodiments have been described. However, one of ordinary skill in the art appreciates that various modifications and changes can be made without departing from the scope of the invention as set forth in the claims below. Accordingly, the specification and figures are to be regarded in an illustrative rather than a restrictive sense, and all such modifications are intended to be included within the scope of present teachings. The benefits, advantages, solutions to problems, and any element(s) that may cause any benefit, advantage, or solution to occur or become more pronounced are not to be construed as critical, required, or essential features or elements of any or all the claims. The invention is defined solely by the appended claims including any amendments made during the pendency of this application and all equivalents of those claims as issued.

Moreover, in this document, relational terms such as first and second, top and bottom, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. The terms “comprises,” “comprising,” “has”, “having,” “includes”, “including,” “contains”, “containing” or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises, has, includes, contains a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. An element preceded by “comprises . . . a”, “has . . . a”, “includes . . . a”, “contains . . . a” does not, without more constraints, preclude the existence of additional identical elements in the process, method, article, or apparatus that comprises, has, includes, contains the element. The terms “a” and “an” are defined as one or more unless explicitly stated otherwise herein. The terms “substantially”, “essentially”, “approximately”, “about” or any other version thereof, are defined as being close to as understood by one of ordinary skill in the art, and in one non-limiting embodiment the term is defined to be within 10%, in another embodiment within 5%, in another embodiment within 1% and in another embodiment within 0.5%. A device or structure that is “configured” in a certain way is configured in at least that way, but may also be configured in ways that are not listed.

It will be appreciated that some embodiments may be comprised of one or more generic or specialized processors (or “processing devices”) such as microprocessors, digital signal processors, customized processors and field programmable gate arrays (FPGAs) and unique stored program instructions (including both software and firmware) that control the one or more processors to implement, in conjunction with certain non-processor circuits, some, most, or all of the functions of the methods and apparatus described herein. The non-processor circuits may include, but are not limited to, a radio receiver, a radio transmitter, signal drivers, clock circuits, power source circuits, and user input devices. As such, these functions may be interpreted as steps of a method described herein. Alternatively, some or all functions could be implemented by a state machine that has no stored program instructions, or in one or more application specific integrated circuits (ASICs), in which each function or some combinations of certain of the functions are implemented as custom logic. Of course, a combination of the two approaches could be used. Both the state machine and ASIC are considered herein as a “processing device” for purposes of the foregoing discussion and claim language.

Moreover, an embodiment can be implemented as a computer-readable storage element or medium having computer readable code stored thereon for programming a computer (e.g., comprising a processing device) to perform a method as described and claimed herein. Examples of such computer-readable storage elements include, but are not limited to, a hard disk, a CD-ROM, an optical storage device, a magnetic storage device, a ROM (Read Only Memory), a PROM (Programmable Read Only Memory), an EPROM (Erasable Programmable Read Only Memory), an EEPROM (Electrically Erasable Programmable Read Only Memory) and a Flash memory. Further, it is expected that one of ordinary skill, notwithstanding possibly significant effort and many design choices motivated by, for example, available time, current technology, and economic considerations, when guided by the concepts and principles disclosed herein will be readily capable of generating such software instructions and programs and ICs with minimal experimentation.

The Abstract of the Disclosure is provided to allow the reader to quickly ascertain the nature of the technical disclosure. It is submitted with the understanding that it will not be used to interpret or limit the scope or meaning of the claims. In addition, in the foregoing Detailed Description, it can be seen that various features are grouped together in various embodiments for the purpose of streamlining the disclosure. This method of disclosure is not to be interpreted as reflecting an intention that the claimed embodiments require more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive subject matter lies in less than all features of a single disclosed embodiment. Thus the following claims are hereby incorporated into the Detailed Description, with each claim standing on its own as a separately claimed subject matter.

While at least one example embodiment has been presented in the foregoing detailed description, it should be appreciated that a vast number of variations exist. It should also be appreciated that the example embodiment or embodiments described herein are not intended to limit the scope, applicability, or configuration of the claimed subject matter in any way. Rather, the foregoing detailed description will provide those skilled in the art with a convenient road map for implementing the described embodiment or embodiments. It should be understood that various changes can be made in the function and arrangement of elements without departing from the scope defined by the claims, which includes known equivalents and foreseeable equivalents at the time of filing this patent application.

In addition, the section headings included herein are intended to facilitate a review but are not intended to limit the scope of the present invention. Accordingly, the specification and drawings are to be regarded in an illustrative manner and are not intended to limit the scope of the appended claims.

In interpreting the appended claims, it should be understood that:

    • a) the word “comprising” does not exclude the presence of other elements or acts than those listed in a given claim;
    • b) the word “a” or “an” preceding an element does not exclude the presence of a plurality of such elements;
    • c) any reference signs in the claims do not limit their scope;
    • d) several “means” may be represented by the same item or hardware or software implemented structure or function;
    • e) any of the disclosed elements may be comprised of hardware portions (e.g., including discrete and integrated electronic circuitry), software portions (e.g., computer programming), and any combination thereof;
    • f) hardware portions may be comprised of one or both of analog and digital portions;
    • g) any of the disclosed devices or portions thereof may be combined together or separated into further portions unless specifically stated otherwise; and
    • h) no specific sequence of acts or steps is intended to be required unless specifically indicated.

Claims

1. A mobile device, comprising:

a housing having a pistol grip portion;
a data acquisition device positioned in the housing adjacent to the pistol grip portion for acquiring data from an object, the data acquisition device generating an electrical signal representative of the data;
a control for activating the data acquisition device;
a processor for receiving the electrical signal from the data acquisition device; and
a support worn on a torso of a user for supporting the housing by the pistol grip portion such that the data acquisition device is positioned to acquire data from the object when the object is located in front of the torso of the user, the support enabling hands-free operation of the mobile device.

2. The mobile device of claim 1 wherein the data acquisition device is chosen from the group comprising a laser scanner, an imager, and a radio-frequency identification (RFID) reader.

3. The mobile device of claim 1 further comprising an illumination system aligned with the data acquisition device for illuminating at least a portion of the object.

4. The mobile device of claim 1 wherein the control for activating the data acquisition device comprises a sensor for automatically sensing a presence of the object proximate to the data acquisition device and activating the data acquisition device upon sensing the presence of the object.

5. The mobile device of claim 1 wherein the control for activating the data acquisition device comprises a switch for activating the data acquisition device in an always-on presentation mode.

6. The mobile device of claim 1 wherein the control for activating the data acquisition device comprises a timer that periodically activates the data acquisition device.

7. The mobile device of claim 1 wherein the control for activating the data acquisition device comprises a motion sensor coupled to the housing that activates the data acquisition device upon sensing an impact to the housing.

8. The mobile device of claim 1 wherein the control for activating the data acquisition device comprises a speech recognition module that activates the data acquisition device upon receiving a command spoken into a microphone of the mobile device.

9. The mobile device of claim 1 wherein the control for activating the data acquisition device comprises a video analytic module that activates the data acquisition device upon recognizing an image related to the object that is captured by a camera of the mobile device.

10. The mobile device of claim 1 wherein the support comprises a holster coupled to a strap adapted to surround at least a portion of the torso of the user.

11. The mobile device of claim 1 wherein the support comprises a clip for coupling the mobile device to one of a belt, a lanyard and an article of clothing of the user.

12. A torso-mountable mobile device for capturing a barcode symbol, comprising:

a housing having a pistol grip portion;
an imager positioned in the housing adjacent to the pistol grip portion for capturing the barcode symbol, the imager generating image data corresponding to the barcode symbol;
a control for activating the imager;
a processor for receiving the image data from the imager; and
a support worn on a torso of a user for supporting the housing by the pistol grip portion such that the imager is positioned to capture the barcode symbol when the barcode symbol is located in front of the torso of the user, the support enabling hands-free operation of the mobile device.

13. The mobile device of claim 12 further comprising an illumination system aligned with the imager for illuminating the barcode symbol.

14. The mobile device of claim 12 further comprising a radio-frequency identification (RFID) reader.

15. The mobile device of claim 12 wherein the control for activating the imager comprises a sensor for automatically sensing a presence of an object including the barcode symbol proximate to the imager and activating the imager upon sensing the presence of the object.

16. The mobile device of claim 12 wherein the control for activating the imager comprises a switch for activating the imager in an always-on presentation mode.

17. The mobile device of claim 12 wherein the control for activating the imager comprises a timer that periodically activates the imager.

18. The mobile device of claim 12 wherein the control for activating the imager comprises a motion sensor coupled to the housing that activates the imager upon sensing an impact to the housing.

19. The mobile device of claim 12 wherein the control for activating the imager comprises a speech recognition module that activates the imager upon receiving a command spoken into a microphone of the mobile device.

20. The mobile device of claim 12 wherein the control for activating the imager comprises a video analytic module that activates the imager upon recognizing an image corresponding to the barcode symbol that is captured by the imager.

21. The mobile device of claim 12 wherein the support comprises a holster coupled to a strap adapted to surround at least a portion of the torso of the user.

22. The mobile device of claim 12 wherein the support comprises a clip for coupling the mobile device to one of a belt, a lanyard and an article of clothing of the user.

Patent History
Publication number: 20140014725
Type: Application
Filed: Nov 27, 2012
Publication Date: Jan 16, 2014
Applicant: Symbol Technologies, Inc. (Schaumburg, IL)
Inventors: Richard M. Martin (New Hyde Park, NY), Jaeho Choi (Whitestone, NY), Ian R. Jenkins (Stony Brook, NY), Chandra M. Nair (Mount Sinai, NY), Konstantinos D. Tsiopanos (Selden, NY)
Application Number: 13/685,750