CUSTOMIZED FACE MASK
ABSTRACT
A face mask includes a covering member configured to cover facial features of a face of a user, the covering member including an air-permeable filter member; and a facial image provided on an outward facing surface of the covering member, the facial image being generated based on an image of the user and representing covered facial features of the user. The face mask further includes a fastening member configured to secure the covering member over the face of the user.
BACKGROUND
Face masks such as surgical masks (sometimes referred to as hygiene masks, procedure masks, etc.) are often worn by users to, for example, protect the user's mouth and nose from undesirable airborne particles such as bacteria, airborne diseases, and the like. Typically, a face mask covers the user's mouth and nose and is held in place by a strap, band, or a similar fastening device.
SUMMARY
One embodiment relates to a face mask, including a covering member configured to cover facial features of a face of a user, the covering member including an air-permeable filter member; and a facial image provided on an outward facing surface of the covering member, the facial image being generated based on an image of the user and representing covered facial features of the user; and a fastening member configured to secure the covering member over the face of the user.
Another embodiment relates to a face mask, including a covering member configured to cover facial features of a face of a user, the covering member including a filter layer; and a display layer configured to provide a changeable image; and a fastening member configured to secure the covering member over the face of the user.
Another embodiment relates to a clothing item, including a covering member configured to cover a portion of a user's head, at least a portion of the covering member being an air-permeable material; and a facial image provided on an outward-facing surface of the covering member, the facial image being based on an image of the user and representing covered facial features of the user.
Another embodiment relates to a method of producing a customized face mask, including receiving a user image of a face of a user; determining a covering area based on the image, the covering area including a portion of the user image; and printing a covering image including a representation of the covering area onto a display layer of an air-permeable face mask.
Another embodiment relates to a method of producing a customized face mask, including acquiring a user image of a face of a user using an image capture device; determining a covering area corresponding to a portion of the image of the face of the user; and providing a covering image representing the portion of the image using a display layer of an air-permeable face mask such that when the covering member is worn by a user, the covering image provides a visual representation of at least a portion of the underlying portions of the face of the user.
The foregoing summary is illustrative only and is not intended to be in any way limiting. In addition to the illustrative aspects, embodiments, and features described above, further aspects, embodiments, and features will become apparent by reference to the drawings and the following detailed description.
DETAILED DESCRIPTION
In the following detailed description, reference is made to the accompanying drawings, which form a part hereof. In the drawings, similar symbols typically identify similar components, unless context dictates otherwise. The illustrative embodiments described in the detailed description, drawings, and claims are not meant to be limiting. Other embodiments may be utilized, and other changes may be made, without departing from the spirit or scope of the subject matter presented here.
Referring to the figures generally, various embodiments disclosed herein relate to face masks, and more specifically, to face masks that are customizable to provide unique imagery for users. A mask, such as a surgical mask or similar mask, may provide filtering features to prevent the spread of infection from airborne particles. One or more portions of the mask may include an image provided on an outward-facing surface such that others can view the image. The image may be a representation of underlying facial features of the user, modified facial features of the user, and the like. Furthermore, the mask may include portions that are opaque, translucent, or substantially transparent, or that can dynamically change between any or all of these states.
In one embodiment, mask 14 includes covering member 16 and one or more fastening members 18. Covering image 20 is provided on an outward-facing surface of covering member 16. Covering member 16 is in one embodiment made at least partially from an air-permeable material (e.g., a filter material, a paper or other non-woven or woven material, etc.) to enable a user to breathe through covering member 16. Fastening members 18 are configured to secure mask 14 to the user (e.g., around the ears, around the neck, around the head, etc.) to maintain mask 14 in a desired position.
Covering member 16 is in one embodiment a generally rectangular piece of material intended to cover all or a portion of a user's face. In one embodiment, covering member 16 is configured to cover a mouth and a nose of a user, and conform to the contour of the user's face in order to minimize the amount of air travelling to/from the user's nose and mouth without passing through covering member 16. In one embodiment, edge portions 22 provide a perimeter edge for covering member 16 (e.g., to provide a cleaner appearance to the edges of mask 14). Edge portions 22 may be adhered, stitched, sewn, or otherwise secured to covering member 16. In some embodiments covering image 20 extends up to and throughout edge portion 22 to the outer edge of covering member 16.
Fastening members 18 are in one embodiment elastic members configured to resiliently retain mask 14 on a face of a user. Any suitable straps, bands, strings, etc. may be used.
Covering image 20 is in one embodiment an image provided on an outward facing surface of covering member 16. Covering image 20 includes depictions of various facial features, such as a nose, a mouth, skin features or imperfections, and the like. Covering image 20 may be configured to provide a substantially homogenous transition between covering image 20 and adjacent uncovered portions of a user's face.
In one embodiment, covering image 20 is generated based on images (e.g., user images) of a user's face, such as digital photographs, scanned images, and the like. Based on the user images, various facial features can then be printed onto mask 14 as part of covering image 20. In one embodiment, covering image 20 is intended to substantially replicate the appearance of the user's face (e.g., to replicate facial features underlying mask 14). In other embodiments, one or more portions of covering image 20 may be a modified version of the user's face, or further yet, include customized and/or changeable features.
In one embodiment, covering image 20 includes a first image portion 24 and a second image portion 26, the second image portion 26 providing a transitional image between the first image portion 24 and adjacent uncovered portions of the face of the user.
According to one embodiment, one or both of first image 24 and second image 26 are provided on an air-permeable material, such as a filter material or similar material. One or both of first image 24 and second image 26 may include portions that are transparent, translucent, or opaque. For example, a user may desire to have modified facial features, such as a different mouth or nose, provided as part of covering image 20, yet have the image be translucent such that others can see facial movements such as lip movements while the user speaks, as this often helps others interpret what is being said by the user. As such, first or second images 24, 26 may both provide images visible to others and enable others to see or partially see the underlying facial features.
In one embodiment, covering member 16 includes multiple layers, such as layers 28, 30, and 32.
Layers 28, 30, and 32 can be made of any suitable material and arranged in any suitable manner. For example, in one embodiment, layer 30 is a display layer captured between layers 28, 32. In such a configuration, one or both of layers 28, 32 can be a filter member configured to filter particles during use of mask 14. Any of layers 28, 30, 32 may be opaque, transparent, or translucent to provide a desired appearance to others. For example, layer 32 in one embodiment is the outer-most layer of mask 14, and is provided as a transparent layer such that images provided by an underlying display layer, such as layer 30, are visible to others.
In one embodiment, control unit 34 is configured to control a display layer of mask 14, such as layer 30 discussed above, to provide changeable images.
In one embodiment, control unit 34 stores image data regarding images of a user, and controls the display layer to display images based on the image data. For example, the display layer may display images intended to replicate or modify facial features of the user, or alternatively, provide facial movements by dynamically modifying the displayed image over time. In one embodiment, control unit 34 is configured to operate based on user inputs to provide one or more desired display images. For example, control unit 34 may store multiple different images for display on the display layer, and a user may specify which images are to be displayed based on any of a number of factors (e.g., user selection, time of day, the location of the user, etc.).
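By way of non-limiting illustration, the image-selection logic described above might be sketched as follows in Python; the image identifiers, selection rules, and factor values are hypothetical placeholders rather than part of any described embodiment.

```python
from datetime import datetime, time

# Illustrative image library keyed by hypothetical identifiers.
IMAGE_LIBRARY = {
    "neutral": "neutral_face.png",
    "smile": "smiling_face.png",
    "formal": "formal_face.png",
}

def select_image(user_choice=None, now=None, location=None):
    """Return the image a control unit might display, given simple factors."""
    now = now or datetime.now()
    if user_choice in IMAGE_LIBRARY:                 # explicit user selection wins
        return IMAGE_LIBRARY[user_choice]
    if location == "office":                         # hypothetical location-based rule
        return IMAGE_LIBRARY["formal"]
    if time(8, 0) <= now.time() <= time(18, 0):      # hypothetical time-of-day rule
        return IMAGE_LIBRARY["smile"]
    return IMAGE_LIBRARY["neutral"]                  # default image

if __name__ == "__main__":
    print(select_image(location="office"))
    print(select_image(user_choice="smile"))
```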
In one embodiment, control unit 34 controls the display layer to provide different image portions, such as first image portion 24 and second image portion 26 discussed above.
According to various alternative embodiments, control unit 34 is configured to provide customized images that transition into the surrounding facial features of the user. For example, a user may wish to have an image of an animal face (e.g., a tiger, a bear, etc.) displayed. Control unit 34 may store such custom images in memory, and control operation of the display layer to provide the appropriate custom imagery. In one embodiment, control unit 34 is configured to determine a transitional image, such as second image portion 26, configured to transition the custom image into surrounding facial features of the user (e.g., by providing gradual changes in color, facial features, etc.). Any custom image may be used according to various alternative embodiments, and the custom images may be provided in the form of image data (e.g., digital photographs, electronic scans, etc.) from a user.
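The "gradual changes in color" used to transition a custom image into the surrounding facial features may be illustrated with a simple alpha blend toward a surrounding skin tone. The following sketch assumes NumPy arrays with placeholder dimensions and colors and is illustrative only; it is not a description of any particular image-processing pipeline.

```python
import numpy as np

def add_transitional_border(custom_img, surround_color, border=40):
    """Blend the edges of a custom covering image toward a surrounding skin tone,
    approximating the transitional region described above.

    custom_img: H x W x 3 float array in [0, 1] (e.g., a tiger-face graphic).
    surround_color: length-3 array approximating the adjacent uncovered skin tone.
    border: width in pixels over which the blend ramps from the surround color
            at the edge to the custom image in the interior.
    """
    h, w, _ = custom_img.shape
    ys, xs = np.mgrid[0:h, 0:w]
    # Distance of each pixel to the nearest image edge.
    edge_dist = np.minimum.reduce([ys, xs, h - 1 - ys, w - 1 - xs])
    # Alpha ramps from 0 at the edge to 1 at `border` pixels inward.
    alpha = np.clip(edge_dist / border, 0.0, 1.0)[..., None]
    surround = np.broadcast_to(np.asarray(surround_color, dtype=float), custom_img.shape)
    return alpha * custom_img + (1.0 - alpha) * surround

if __name__ == "__main__":
    tiger = np.random.rand(200, 300, 3)   # placeholder for a custom image
    skin = np.array([0.85, 0.7, 0.6])     # placeholder skin tone
    blended = add_transitional_border(tiger, skin)
    print(blended.shape, float(blended.min()), float(blended.max()))
```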
In some embodiments, covering member 16 includes actuation device 43 coupled to control unit 34. Actuation device 43 is a movable member (e.g., a flexible member, etc.) configured to provide movement of covering member 16 between collapsed configuration 39 and expanded configuration 41 by folding/unfolding folds 40. Control unit 34 is configured to control operation of actuation device 43 to enable control of the amount of expansion of face mask 14. For example, in some embodiments, covering member 16 provides a first image in collapsed configuration 39 that generally corresponds to the user's face, while in expanded configuration 41 covering member 16 provides a second image that is modified (e.g., elongated, distorted, etc.) relative to the first image. To minimize distortion while near other people, control unit 34 may be configured to collapse covering member 16 based on detecting nearby people. In some embodiments, control unit 34 is or includes a short wave range finder configured to detect the presence of nearby people. In other embodiments, control unit 34 is configured to operate device 43 (and therefore control the degree of expansion of covering member 16) based on other factors (e.g., time, location, user inputs, etc.).
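A minimal sketch of the proximity-based behavior described above is shown below; the range-finder interface, actuator methods, and the two-meter threshold are assumptions introduced solely for illustration, not part of any described embodiment.

```python
# Minimal sketch of collapsing the covering member when a person is nearby.
PROXIMITY_THRESHOLD_M = 2.0  # hypothetical distance below which the mask collapses

class MaskController:
    def __init__(self, actuator, range_finder):
        self.actuator = actuator          # object exposing expand() / collapse()
        self.range_finder = range_finder  # callable returning distance in meters, or None

    def update(self):
        """Collapse the covering member when someone is nearby; otherwise expand."""
        distance = self.range_finder()
        if distance is not None and distance < PROXIMITY_THRESHOLD_M:
            self.actuator.collapse()      # show the undistorted (collapsed) image
        else:
            self.actuator.expand()        # allow the expanded, modified image

class PrintActuator:
    """Stand-in actuator that just reports what it would do."""
    def collapse(self): print("collapsing covering member")
    def expand(self): print("expanding covering member")

if __name__ == "__main__":
    controller = MaskController(PrintActuator(), range_finder=lambda: 1.2)
    controller.update()   # -> collapsing covering member
```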
In one embodiment, mask 14 further includes one or more armatures 58, 60 (e.g., structural members or inserts, etc.) configured to provide structural support for mask 14 and enable modification of the appearance of various facial features of a user.
In one embodiment, one or both of armatures 58, 60 are movable (e.g., in a similar manner to movable member 43 discussed above) to change the outward physical contour of covering member 16.
In one embodiment, a system for producing a customized face mask includes image capture device 72, image processor 74, and printer 76.
Image capture device 72 is configured to capture one or more images of a user. In one embodiment, image capture device 72 is or includes a digital camera (e.g., a dedicated camera, a cellular phone or other device with camera capabilities, etc.). Alternatively, image capture device 72 may be or include a video recorder, a scanning device, or other suitable image capture device. Image capture device 72 captures one or more images of a user and provides user images/user image data to image processor 74.
Image processor 74 is configured to receive user images/image data (e.g., from image capture device 72 or another suitable image capture device) and generate a covering image such as covering image 20 for printing by printer 76. Image processor 74 includes processor 80 and memory 82. Processor 80 may be implemented as a general-purpose processor, an application specific integrated circuit (ASIC), one or more field programmable gate arrays (FPGAs), a digital-signal-processor (DSP), a group of processing components, or other suitable electronic processing components. Memory 82 is one or more devices (e.g., RAM, ROM, Flash Memory, hard disk storage, etc.) for storing data and/or computer code for facilitating the various processes described herein. Memory 82 may be or include non-transient volatile memory or non-volatile memory. Memory 82 may include database components, object code components, script components, or any other type of information structure for supporting the various activities and information structures described herein. Memory 82 may be communicably connected to processor 80 and provide computer code or instructions to processor 80 for executing the processes described herein.
According to one embodiment, image processor 74 is configured to generate a covering image based on a covering member and one or more user images. For example, based on known dimensions of a covering member of a particular mask and image data of a particular user, image processor 74 determines the likely portions of the user's face that will be covered by the mask during use. As such, image processor 74 may crop a portion of an image of the user and, after any necessary or desired modifications, print the cropped image onto the covering member.
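As a non-limiting illustration of determining the likely covered portion of a user image, the following sketch assumes that facial landmark coordinates (e.g., nose bridge and chin) are already available from a separate detection step not shown here, and uses placeholder covering-member dimensions.

```python
import numpy as np

def covering_area(nose, chin, mask_w_mm=175, mask_h_mm=95):
    """Estimate the rectangular region of a user photo covered by the mask.

    nose, chin: (x, y) pixel coordinates of the nose bridge and chin; in practice
    these would come from a face-landmark detector (not shown).
    mask_w_mm, mask_h_mm: nominal covering-member dimensions; placeholder values.
    """
    height_px = chin[1] - nose[1]                        # covered height in pixels
    width_px = int(height_px * mask_w_mm / mask_h_mm)    # scale width by mask aspect ratio
    cx = nose[0]
    left, right = cx - width_px // 2, cx + width_px // 2
    top, bottom = nose[1], chin[1]
    return left, top, right, bottom

def crop_covering_image(photo, box):
    """Crop the covering area out of the user photo (H x W x 3 array)."""
    left, top, right, bottom = box
    return photo[top:bottom, left:right]

if __name__ == "__main__":
    photo = np.zeros((800, 600, 3), dtype=np.uint8)      # placeholder user photo
    box = covering_area(nose=(300, 380), chin=(300, 640))
    print(box, crop_covering_image(photo, box).shape)
```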
As noted above, the printed covering image may be a modified version of the user image to provide, for example, modified facial features, corrected blemishes, customized images (e.g., animal features, etc.), different facial expressions (e.g., smiles, frowns, etc.), and the like. In one embodiment, the covering image is configured to provide a generally homogenous transition between the covering image and adjacent uncovered portions of the user's face.
Image processor 74 controls operation of printer 76 to print the appropriate covering image onto a covering member or other appropriate surface. The covering image may be any suitable image, such as any of those discussed with respect to mask 14 and covering member 16. In one embodiment, printer 76 is an ink jet printer. In other embodiments, printer 76 is another suitable printing device. Printer 76 may be located locally or remotely from image processor 74 and/or image capture device 72. For example, in some embodiments, a user captures one or more images of his or her face (e.g., using a digital camera) and provides the images to a mask vendor, who in turn processes the images and prints one or more masks for use by the user. In one embodiment, the one or more images may be taken from different perspectives, capture different facial expressions, or the like. In other embodiments, image processor 74 is accessible via a web-based application and/or resides on a user device (e.g., a cellular phone, laptop computer, desktop computer, etc.), such that the user can take one or more digital photographs, upload the photographs to the image processor, and subsequently print one or more masks (e.g., on a personal printer, etc.).
According to some embodiments, printer 76 is or includes a three-dimensional printer and is configured to print one or more structural components for mask 14. For example, based on the user image data, one or more armatures or other components can be designed (e.g., by image processor 74) to enable mask 14 to conform to a user's facial contour and/or to provide customized facial structural features. As such, based on one or more factors, such as user images, a particular covering member, and/or one or more user inputs, printer 76 may print one or more structural components for mask 14.
Once the covering image is complete, the covering image is printed onto one or more face masks (98). In one embodiment, a printer such as printer 76 is used to print the covering image onto the face masks. The printer may be located remotely from, or alternatively integrated with or located locally with, an image processor and/or an image capture device. For example, in one embodiment, a user uploads images from a digital camera (e.g., a cellular phone with camera capabilities), processes the images using a personal computer (e.g., a program residing on the personal computer, a program accessible via a web-based application, etc.), and prints one or more face masks using a personal printer. In alternative embodiments, a vendor can perform one or more of the image processing and printing steps and provide the user with one or more sets of face masks. In further embodiments, a user can select different image variations for printing (e.g., with different facial expressions, facial features, skin tones, etc.), and receive sets of each type of customized face mask.
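One possible arrangement of the overall workflow is sketched below; the function names, placeholder covering area, and the file-based hand-off standing in for a printer are assumptions for illustration rather than a description of any particular printer interface.

```python
from pathlib import Path

# Illustrative end-to-end workflow for producing a customized face mask.
# Each step is a hypothetical placeholder; a real implementation would
# substitute actual image-processing and printer-driver calls.

def receive_user_image(path):
    return Path(path).read_bytes()             # raw photo bytes from the user

def determine_covering_area(image_bytes):
    # Placeholder: a landmark detector would compute this from the photo.
    return {"left": 61, "top": 380, "right": 539, "bottom": 640}

def generate_covering_image(image_bytes, area, variation="smile"):
    # Placeholder: crop, blend a transitional border, apply the chosen variation.
    return b"covering-image-data:" + variation.encode()

def print_covering_image(covering_image, out_path="mask_print_job.bin"):
    # Stand-in for sending the job to an ink-jet or other printing device.
    Path(out_path).write_bytes(covering_image)
    return out_path

if __name__ == "__main__":
    photo = b"..."                              # would be receive_user_image("user.jpg")
    area = determine_covering_area(photo)
    job = print_covering_image(generate_covering_image(photo, area))
    print("print job written to", job)
```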
According to an alternative embodiment, rather than printing a covering image onto a face mask, one or more covering image data files are created for use in generating electronic displays on an electronic display member of a face mask. For example, the covering image data may be stored by a control unit, such as control unit 34 discussed above, and displayed using a display layer of mask 14.
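A covering image data file of this kind might, for example, bundle frame images with a simple display schedule. The JSON layout, field names, and frame filenames below are assumptions introduced only for illustration, not a defined file format.

```python
import json
from pathlib import Path

def build_display_package(frames, schedule, out_path="covering_display.json"):
    """Write a hypothetical display package a mask control unit could consume.

    frames: mapping of frame name -> image filename.
    schedule: list of {"frame": name, "duration_s": seconds} entries.
    """
    package = {"version": 1, "frames": frames, "schedule": schedule}
    Path(out_path).write_text(json.dumps(package, indent=2))
    return out_path

if __name__ == "__main__":
    path = build_display_package(
        frames={"neutral": "neutral.png", "speaking": "speaking.png"},
        schedule=[{"frame": "neutral", "duration_s": 5},
                  {"frame": "speaking", "duration_s": 1}],
    )
    print("wrote", path)
```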
The present disclosure contemplates methods, systems, and program products on any machine-readable media for accomplishing various operations. The embodiments of the present disclosure may be implemented using existing computer processors, or by a special purpose computer processor for an appropriate system, incorporated for this or another purpose, or by a hardwired system. Embodiments within the scope of the present disclosure include program products comprising machine-readable media for carrying or having machine-executable instructions or data structures stored thereon. Such machine-readable media can be any available media that can be accessed by a general purpose or special purpose computer or other machine with a processor. By way of example, such machine-readable media can comprise RAM, ROM, EPROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to carry or store desired program code in the form of machine-executable instructions or data structures and which can be accessed by a general purpose or special purpose computer or other machine with a processor. When information is transferred or provided over a network or another communications connection (either hardwired, wireless, or a combination of hardwired or wireless) to a machine, the machine properly views the connection as a machine-readable medium. Thus, any such connection is properly termed a machine-readable medium. Combinations of the above are also included within the scope of machine-readable media. Machine-executable instructions include, for example, instructions and data which cause a general purpose computer, special purpose computer, or special purpose processing machines to perform a certain function or group of functions.
Although the figures may show a specific order of method steps, the order of the steps may differ from what is depicted. Also two or more steps may be performed concurrently or with partial concurrence. Such variation will depend on the software and hardware systems chosen and on designer choice. All such variations are within the scope of the disclosure. Likewise, software implementations could be accomplished with standard programming techniques with rule based logic and other logic to accomplish the various connection steps, processing steps, comparison steps and decision steps.
While various aspects and embodiments have been disclosed herein, other aspects and embodiments will be apparent to those skilled in the art. The various aspects and embodiments disclosed herein are for purposes of illustration and are not intended to be limiting, with the true scope and spirit being indicated by the following claims.
Claims
1. A face mask, comprising:
- a covering member configured to cover facial features of a face of a user, the covering member including: an air-permeable filter member; and a facial image provided on an outward facing surface of the covering member, the facial image being generated based on an image of the user and representing covered facial features of the user; and
- a fastening member configured to secure the covering member over the face of the user.
2. The mask of claim 1, wherein a color of the fastening member is based on a facial feature of the user.
3-6. (canceled)
7. The mask of claim 1, wherein the facial image includes a first image portion and a second image portion, the second image portion providing a transitional image between the first image portion and the face of a user.
8. The mask of claim 7, wherein the second image portion provides a substantially homogenous visual transition between adjacent regions of the first image portion and adjacent uncovered portions of the face of the user.
9. The mask of claim 7, wherein the first image portion includes a modified facial feature of a user.
10. (canceled)
11. The mask of claim 7, wherein a border region of the second image portion provides a substantially homogeneous visual transition between an adjacent interior region of the second image portion and adjacent uncovered portions of the face of the user.
12. (canceled)
13. The mask of claim 1, further comprising a removable insert including the facial image.
14-16. (canceled)
17. The mask of claim 1, further comprising a display layer configured to provide the facial image and enable dynamic changing of the facial image.
18-19. (canceled)
20. The mask of claim 17, wherein the display layer is configured to change the facial image based on movement of the user's face.
21-23. (canceled)
24. The mask of claim 17, wherein the display layer is configured to be changeable based on an input from the user.
25. (canceled)
26. The mask of claim 1, further comprising an armature movable to change the outward physical contour of the covering member.
27. The mask of claim 26, further comprising a control unit coupled to the armature and configured to control operation of the armature.
28. The mask of claim 27, wherein the control unit includes a ranging device configured to detect a person proximate the user, and wherein the control unit is configured to control operation of the armature based on detection of the person.
29-41. (canceled)
42. A face mask, comprising:
- a covering member configured to cover facial features of a face of a user, the covering member including: a filter layer; and a display layer configured to provide a changeable image; and
- a fastening member configured to secure the covering member over the face of the user.
43-44. (canceled)
45. The mask of claim 42, wherein the display layer is configured to provide an image representing covered facial features of the user.
46. The mask of claim 42, wherein the display layer is configured to provide a first image portion and a second image portion, the second image portion providing a transitional image between the first image portion and the face of a user.
47. (canceled)
48. The mask of claim 46, wherein the first image portion includes a modified facial feature of a user.
49-58. (canceled)
59. The mask of claim 42, wherein the display layer is configured to change the facial image based on movement of the user's face.
60-65. (canceled)
66. The mask of claim 42, wherein the covering member is movable between a folded state and an unfolded state.
67. The mask of claim 66, wherein the facial image is configured to provide a continuous image in one or both of the folded and unfolded states.
68. The mask of claim 66, further comprising an actuator configured to move the covering member between the folded state and the unfolded state.
69. The mask of claim 68, further comprising a control unit configured to control operation of the actuator.
70. The mask of claim 68, wherein the actuator is at least partially powered by breathing of the user.
71. (canceled)
72. The mask of claim 69, wherein the control unit is configured to control operation of the actuator based on detection of a person proximate the user.
73-78. (canceled)
79. A clothing item, comprising:
- a covering member configured to cover a portion of a user's head, at least a portion of the covering member being an air-permeable material; and
- a facial image provided on an outward-facing surface of the covering member, the facial image being based on an image of the user and representing covered facial features of the user.
80. (canceled)
81. The clothing item of claim 79, wherein the facial image includes a first image portion and a second image portion, the second image portion providing a transitional image between the first image portion and remaining portions of the covering member.
82. The clothing item of claim 81, wherein the first image portion includes a modified facial feature of a user.
83-84. (canceled)
85. The clothing item of claim 79, further comprising a removable insert including the facial image.
86. The clothing item of claim 85, wherein the covering member includes a receptacle configured to receive the removable insert.
87-88. (canceled)
89. The clothing item of claim 79, further comprising a display layer configured to provide the facial image and enable dynamic changing of the facial image.
90. The clothing item of claim 89, wherein the display layer includes e-ink.
91. The clothing item of claim 89, wherein the display layer includes an OLED.
92. The clothing item of claim 89, wherein the display layer is configured to change the facial image based on movement of the user's face.
93-100. (canceled)
101. The clothing item of claim 79, wherein the covering member is movable between a folded state and an unfolded state.
102. The clothing item of claim 101, wherein the facial image is configured to provide a continuous image in one or both of the folded and unfolded states.
103-182. (canceled)
Type: Application
Filed: Jul 30, 2014
Publication Date: Feb 4, 2016
Inventors: William D. Duncan (Mill Creek, WA), Roderick A. Hyde (Redmond, WA), Jordin T. Kare (Seattle, WA), Tony S. Pan (Bellevue, WA), Yaroslav A. Urzhumov (Bellevue, WA), Lowell L. Wood, JR. (Bellevue, WA)
Application Number: 14/447,490