COMPUTER-IMPLEMENTED TECHNIQUES FOR REMOTELY INTERACTING WITH PERFORMANCE OF FOOD QUALITY, FOOD SAFETY, AND WORKPLACE SAFETY TASKS

Techniques for remotely interacting with physical performance of a task relating to at least one of food safety, food quality, and workplace safety are provided. A first user physically performing the task at a first location utilizes a head wearable device. A computing system is utilized by an operator to remotely interact with physical performance of the task from a remote second location. Audio and first-person visual data captured by the device are transmitted to the computing system during physical performance of the task. Audio and visual data captured by the computing system during remote interaction with the task are transmitted via the network to the head wearable device.

Description
CROSS REFERENCE TO RELATED APPLICATIONS

This application claims priority to and the benefit of U.S. provisional patent application No. 62/149,711, filed Apr. 20, 2015, and U.S. provisional patent application No. 62/194,448, filed Jul. 20, 2015, the disclosures of which are hereby incorporated by reference in their entirety.

TECHNICAL FIELD

U.S. Patent Classification Primary Class: 345/008 (COMPUTER GRAPHICS PROCESSING, OPERATOR INTERFACE PROCESSING, AND SELECTIVE VISUAL DISPLAY SYSTEMS/Operator body-mounted heads-up display (e.g., helmet mounted display)) and Art Unit: 2692.

The disclosure relates to data processing techniques for specific applications relating to remotely interacting with physical performance of a food safety, food quality, or workplace safety task performed with a head wearable device.

BACKGROUND

In many industries, food or workplace safety and quality are of high importance. Brand protection stems from consistent food or workplace safety practices, which are a major component of success. In an attempt to achieve consistent quality and safety, in-person inspections and audits are performed at many steps of the foodservice, retail, and supply chain process.

In today's growing global economy, many factors are driving a demand for inspection and auditing services. The number of issues, such as workplace hazards, product recalls, and foodborne illness outbreaks, is increasing. This has led to new benchmarks for standards and regulations of increased complexity. These rigorous standards and regulations require more costly in-person audits and inspections. In order to reduce in-house complexities and costs, outsourcing of inspection and auditing services has become commonplace. Although common, many buyers and government regulators prefer to use their own internal inspectors to assure consistency and accuracy of inspections.

Typically, an auditor trained in food safety, for example, travels to a facility, such as a restaurant, to perform the audit. The auditor measures the effectiveness of employee training and subsequent process implementation of quality and safety standards at the facility in a manner that provides immediate feedback and interactive training on compliance. For example, when observing a hamburger being cooked, the auditor may check the final temperature but also verify that procedures are in place to cook the hamburger to the correct temperature. For instance, the procedures may check whether the grill is set to a verified proper temperature. A timer is used to cook the hamburger for a verified time, and temperature is documented.

Conventional practices surrounding food safety, food quality, and workplace safety audits or inspections are inefficient. Since conventional audits and inspections require the auditor to travel to the facility, travel expenses, such as fuel or lodging, increase the cost of audits and inspections. Travel time also limits the number of audits that can be performed in a given period, further increasing costs. Furthermore, audits and inspections have been limited to in-person assessment by the auditors and inspectors due to the high level of involvement required to perform the inspection or audit.

Therefore, there remains a need to address at least the aforementioned problems.

BRIEF DESCRIPTION OF THE DRAWINGS

Advantages of the present disclosure may become readily appreciated as the same becomes better understood by reference to the following detailed description when considered in connection with the accompanying drawings wherein:

FIG. 1 is a layout of a system for remotely performing a food safety, food quality, or workplace safety task using a head wearable device located at a first location in communication with a computing system located at a second location according to one embodiment.

FIG. 2 is a block diagram of one embodiment of the components of the system.

FIG. 3 is an example of visual data displayed at the computing system according to one embodiment.

FIG. 4 is a perspective view of the head wearable device according to one embodiment.

FIGS. 5a and 5b are example first-person views demonstrating use of the head wearable device to implement a method of performing the task according to one embodiment.

FIG. 6 is a flow diagram of the method of remotely performing the task using the computing system, according to one embodiment.

FIG. 7 is another flow diagram of steps related to performance of the task according to one embodiment.

FIG. 8 is a sample process flow diagram of performing the task at the first location.

FIGS. 9A and 9B are example reports relating to the task generated by the computing system in accordance with the system and method.

FIG. 10 is a perspective view of one example of a kit for facilitating remote performance of the task.

SUMMARY AND ADVANTAGES

One embodiment of a computing system for remotely interacting with physical performance of a task relating to at least one of food safety, food quality, and workplace safety is provided. The computing system is in communication with a head wearable device via a network. The head wearable device is utilized by a first user at a first location and comprises a first digital display, a first camera, a first microphone, a first speaker, and a first processor. An operator at a second location remote from the first location utilizes the computing system. The computing system comprises a second digital display, a second camera, a second microphone, a second speaker, and a second processor. The computing system is configured to transmit audio data captured by the second microphone to the first speaker of the head wearable device via the network to allow the operator to remotely interact with physical performance of the task from the second location using the audio data. The computing system is configured to transmit visual data captured by the second camera to the first digital display of the head wearable device via the network to allow the operator to remotely interact with physical performance of the task from the second location using the visual data. The computing system is configured to receive, with the second speaker, audio data captured by the first microphone of the head wearable device and transmitted via the network during remote interaction with physical performance of the task being conducted by the first user. The computing system is configured to receive, with the second digital display, visual data captured from a first-person perspective of the first user by the first camera of the head wearable device and transmitted via the network during remote interaction with physical performance of the task being conducted by the first user.

One embodiment of a method for utilizing a computing system to remotely interact with physical performance of the task relating to at least one of food safety, food quality, and workplace safety is provided. The computing system is in communication with a head wearable device via a network. The head wearable device is utilized by a first user at a first location and comprises a first digital display, a first camera, a first microphone, a first speaker, and a first processor. The computing system is utilized by an operator at a second location remote from the first location and comprises a second digital display, a second camera, a second microphone, a second speaker, and a second processor. The method comprises the step of transmitting, with the computing system, audio data captured by the second microphone of the computing system to the first speaker of the head wearable device via the network to allow the operator to remotely interact with physical performance of the task from the second location using the audio data. The method comprises the step of transmitting, with the computing system, visual data captured by the second camera of the computing system to the first digital display of the head wearable device via the network to allow the operator to remotely interact with physical performance of the task from the second location using the visual data. The method comprises the step of receiving, with the second speaker of the computing system, audio data captured by the first microphone of the head wearable device and transmitted via the network during physical performance of the task being conducted by the first user. The method comprises the step of receiving, with the second digital display of the computing system, visual data captured from a first-person perspective of the first user by the first camera of the head wearable device and transmitted via the network during physical performance of the task being conducted by the first user.

One embodiment of a kit for facilitating remote interaction with physical performance of the task relating to at least one of food safety, food quality, and workplace safety using a head wearable device at a first location and a computing system at a remote second location is provided. The head wearable device comprises a digital display, a camera, a microphone, a speaker, and a processor. The head wearable device and the computing system communicate via a network. The kit comprises a container comprising a base and a lid. The container is sealable and is configured to house at least the head wearable device, which is configured to transmit to the computing system, during physical performance of the task, visual data captured with the camera from a first-person perspective and audio data captured with the microphone, and to receive from the computing system, during remote interaction with physical performance of the task, audio data for the speaker and visual data for the digital display. The container is configured to house at least one other article or tool utilized in physically performing the task at the first location.

The system and method solve at least the aforementioned problems of the prior art. By allowing the task to be conducted using the head wearable device and directed and/or observed using the computing system, the system and method advantageously provide a solution to perform such tasks remotely. That is, the operator can remotely direct and/or observe the first user to perform the task in a way that conforms to rigorous standards and regulations. As such, the system and method reduce the need for costly in-person audits and inspections to comply with such rigorous standards and regulations. Using specialized technology, the operator no longer needs to travel to the first location to perform the task. As such, the techniques described herein eliminate travel expenses, such as fuel or lodging, thereby reducing the cost of performing the task. Such decrease in travel time also increases the number of tasks that can be performed in a given period, thereby meeting the demands of a growing economy and increasingly strict regulations. Furthermore, using specialized technology, the task can be directed and/or observed remotely with a high level of involvement not previously practical or conceivable.

Moreover, the system and method provide interactive communication between the operator and the first user. The first user retains a significant amount of information learned during performance of the task because the first user is actively performing the task first-hand, at the direction or observation of the operator. The user is not passively watching an auditor or inspector performing the task without user involvement. That is, the first user is forced to physically perform the task. In turn, the first user will be able to promote a better culture of safety and quality at the first location, thereby further solidifying brand protection and public health.

Furthermore, by performing the task remotely, multiple operators can interact remotely with one another from different locations. For example, one operator may be a “teacher” guiding the user to perform specific food safety, food quality, and workplace safety tasks and training, while at the same time acting as a “student” who learns from a second operator. For example, an auditor may guide the user to perform an audit while a government regulator oversees the audit for consistency and compliance. The auditor provides instruction to the user, while also receiving instructions from the government regulator.

Of course, the system and method may exhibit or provide advantages other than those described above. The advantages described above are not intended to limit the scope of the claimed invention.

DETAILED DESCRIPTION

Referring to the Figures, wherein like numerals indicate like or corresponding parts throughout the several views, a system 10 and a method 12 for remotely performing a food safety, food quality, or workplace safety task are generally shown throughout the figures.

I. Overview

One embodiment of the system 10 for remotely performing the food safety, food quality, or workplace safety task is illustrated in FIG. 1. For simplicity, the food safety, food quality, and workplace safety task may herein be referred to as the task. The system 10 includes a head wearable device 20 and a computing system 21 that is remote from the head wearable device 20. The head wearable device 20 and the computing system 21 are in communication with one another over a network 22, such as a wireless network.

Additional details of the system 10, head wearable device 20, and remote computing system 21 are described in co-filed U.S. application Ser. No. ______ (Docket number 017565.00425) the full disclosure of which is hereby incorporated by reference in its entirety.

The head wearable device 20 and the computing system 21 are collectively utilized to perform the task. Specifically, the task is physically performed using the head wearable device 20 and the task is professionally directed and/or observed using the computing system 21. As such, the task is physically performed at the first location 23 and remotely directed and/or observed from a remote second location 24.

Examples of such tasks may include an audit, an inspection, or training relating to at least one of food safety, food quality, and/or workplace safety. Details about the tasks are described in detail below. Additional details relating to food safety, food quality, and/or workplace safety tasks are described in co-filed U.S. application Ser. No. ______ (Docket number 017565.00425), the full disclosure of which is hereby incorporated by reference in its entirety.

A first user at the first location 23 wears the head wearable device 20. The first user may be an employee, manager, person-in-charge, or any other person assigned to perform the task at the first location 23. Examples of the first location 23 may include any public or private workplace, restaurant, kitchen, shipping vehicle or container, manufacturing plant, farm, market, warehouse, or vendor.

An operator at the second location 24 operates the computing system 21. The operator may be a trainer, auditor, or inspector skilled in food safety. The operator is preferably professionally trained and/or certified to conduct and remotely direct and/or observe performance of the task. In some training embodiments, as described below, the operator may also be an individual or group of students or trainees who are learning from physical performance of the task. The second location 24 is generally a facility of a party enlisted to engage in performing audits, inspections, or training relating to food safety. However, the second location 24 may be any other location where the operator is present. The operator communicates from the second location 24 with the first user at the first location 23 to remotely interact with, direct, instruct, and/or observe the first user.

Audio and visual data 26 generated during performance of the task is transmitted and received between the head wearable device 20 and the computing system 21 through the wireless network 22.

The network 22, shown in FIG. 1 and FIG. 2, may take many forms. In one example, the network 22 is a wireless network. The wireless network 22 may be any suitable network that enables wireless communication between the computing system 21 and the head wearable device 20. The wireless network 22 may be a Wi-Fi®, cellular, cloud-based, or other type of network. The network 22 may have wired aspects. For example, the computing system 21 and/or the memory 29 may be wired to the network 22.

II. Head Wearable Device

One embodiment of the head wearable device 20 is illustrated in FIG. 4. The head wearable device 20 is capable of capturing audio and visual data 26. The head wearable device 20 transmits the audio and visual data 26 using the wireless network 22. The head wearable device 20 is worn by the user in such a way that it is supported by the head of the user leaving the hands and arms available to perform the task. More preferably, the head wearable device 20 is eyewear. In a preferred embodiment, the head wearable device 20 is a Google Glass® wearable device. Using Google Glass®, the user is able to partake in audio and visual communication with the operator. The head wearable device 20 may have various other configurations. For example, the head wearable device may be a digital contact lens, a head mountable personal camera system, and the like.

The head wearable device 20 is arranged on the body of the user to allow the operator to view visual data 26 representative of a first-person perspective of the user.

As shown in FIGS. 2 and 4, the head wearable device 20 comprises a speaker 30, a camera 32, a digital display 33, a microphone 62, and a processor 66. These components may be integrated into the head wearable device 20. The processor 66 may be in communication with a memory 67 having computer-executable instructions stored in the memory wherein the processor 66 executes the instructions to perform various methods described herein.

The camera 32 is configured to capture visual data 26 at the first location 23 from a first-person perspective of the first user. The microphone 62 is configured to capture audio data 26 from the first location 23. The audio data 26 is transmitted from the head wearable device 20 to the computing system 21 to allow the operator to hear what the user is saying during performance of the task. The visual data 26 is transmitted from the head wearable device 20 to the computing system 21 to allow the operator to see what the user is seeing in first-person during performance of the task. The head wearable device 20 may perform various other functions, such as initiating or terminating communication with the operator.

The digital display 33 may be presented on a lens of the head wearable device 20 such that the digital display 33 is presented directly in the line of sight of the user. The user may view and control items shown on the digital display 33. The digital display 33 may be an augmented reality display that projects multimedia relating to the task over real world, physical objects or views. Additionally or alternatively, the digital display 33 may be a holographic display that utilizes light diffraction to create a virtual three-dimensional image of objects relating to performance of the task.

The computing system 21 is configured to transmit digital media 81 relating to performance of the task to the digital display 33 of the head wearable device 20. The digital media 81 may comprise any suitable digital media, such as a video, animation, photograph, and the like. The digital media 81 may be stored in the memory 29 of the computing system 21 or the memory 67 of the head wearable device 20. In some embodiments, the computing system 21 and/or head wearable device 20 may be in communication with a database 82 that stores the digital media 81. The computing system 21 and/or head wearable device 20 may communicate with the database 82 via the network 22. The database 82 may be hosted by the entity employing or in charge of the operator or the second location 24.

In some embodiments, the computing system 21 may be configured to recognize a predetermined event relating to performance of the task from at least one of the audio or visual data received by the computing system 21 from the head wearable device 20. In such instances, the computing system 21 may transmit the digital media 81 to the digital display 33 of the head wearable device 20 in response to recognition of the predetermined event. Alternatively, the operator may control the computing system 21 to access specific digital media 81 based on what the operator perceives via the audio and visual data 26.

Items or digital media 81 presented on the digital display 33 relate to performance of the task. FIGS. 5a and 5b show two example views 34a, 34b of what the user may see while wearing the head wearable device 20. In view 34a, the digital display 33 of the head wearable device 20 projects a digital video 65a relating to the task. For example, the video 65a may provide visual instructions on proper procedures relating to maintaining food safety. In other examples, the video 65a may provide a live-feed display of the operator at the second location 24. Additionally, as shown in view 34b, the digital display 33 may present information obtained during performance of the task. For example, if the user is instructed to take a temperature measurement of food, the digital display 33 may present the actual temperature measurement and measurement duration. The digital display 33 may also present compliance information, such as a required temperature and duration. In another example, the digital display 33 of the head wearable device 20 projects a sample of the visual data 26 being transmitted to the computing system 21, allowing the user to verify proper operation of the head wearable device 20. Those skilled in the art realize that various other forms of digital media and/or visual data 26 may be presented on the digital display 33 of the device 20 besides those specifically described herein.
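The compliance display described above can be illustrated with a brief sketch (Python is used purely for illustration; the function name, threshold values, and summary format are hypothetical and not part of the disclosed system):

```python
# Illustrative sketch: compare a measured cook temperature and duration
# against hypothetical compliance requirements, producing the kind of
# pass/fail summary that could be shown on the digital display 33.

def check_compliance(measured_temp_f, measured_secs,
                     required_temp_f=155.0, required_secs=15):
    """Return a display summary indicating whether the measurement
    meets the required temperature held for the required duration."""
    compliant = (measured_temp_f >= required_temp_f
                 and measured_secs >= required_secs)
    return {
        "measured": f"{measured_temp_f:.1f}F for {measured_secs}s",
        "required": f"{required_temp_f:.1f}F for {required_secs}s",
        "compliant": compliant,
    }

result = check_compliance(158.2, 17)
print(result["compliant"])  # True: meets both temperature and duration
```

A measurement failing either criterion would yield `"compliant": False`, which the display could flag alongside the required values.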

The user may control the head wearable device 20 according to various methods. For example, the head wearable device 20 may include an auxiliary or peripheral input 31, such as a control touch pad, as shown in FIG. 4. The peripheral input 31 may detect motions from a finger of the first user contacting the touch pad and render the motions as controls for the head wearable device 20. For example, if the user is presented with a list of items, swiping the finger down on the touch pad may move a cursor on the digital display 33 to the next item, and a tap on the touch pad selects the item. Another method to control the head wearable device 20 is a voice command. Using the microphone 62, the head wearable device 20 captures and interprets the voice command from the user as a control for the head wearable device 20. For example, the user wearing the head wearable device 20 may say “take photo” while completing the task. In turn, the head wearable device 20 captures a photo of the current view of the user. Those skilled in the art may appreciate that other possibilities exist for controlling the head wearable device 20, such as a gesture command, a tactile button, a neural interface, and the like. Furthermore, the head wearable device 20 may have various other suitable configurations for allowing the user to have a head wearable experience during performance of the task other than those specifically described herein.
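Voice-command handling of the kind described above can be sketched as a simple dispatch from a recognized phrase to a device action (the command names and handler functions below are hypothetical examples, not the device's actual API):

```python
# Illustrative sketch of voice-command dispatch: a transcribed command
# phrase is normalized and mapped to a device action. The commands and
# handlers are hypothetical, chosen to mirror the "take photo" example.

def take_photo():
    return "photo captured"

def start_stream():
    return "stream started"

COMMANDS = {
    "take photo": take_photo,
    "start stream": start_stream,
}

def handle_voice_command(transcript):
    """Normalize the transcribed phrase and invoke the matching action."""
    action = COMMANDS.get(transcript.strip().lower())
    if action is None:
        return "unrecognized command"
    return action()

print(handle_voice_command("Take Photo"))  # photo captured
```

Normalizing case and whitespace before lookup keeps the dispatch tolerant of minor variations in the recognized transcript.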

The head wearable device 20 is configured to transmit the captured audio and visual data 26 to the computing system 21 via the network 22. As will be described below, the head wearable device 20 and the first user may also receive remote directions from the computing system 21 and operator to conduct physical performance of the task. To do so, the head wearable device 20 is configured to present audio data 26 transmitted from the computing system 21 to the speaker 30. The head wearable device 20 is also configured to receive visual data 26 transmitted from the computing system 21 to the digital display 33.

The head wearable device 20 is configured to transmit and/or receive the audio and visual data 26 to and from the computing system 21 in a live, or essentially real-time, manner. In one example, the audio and visual data 26 is streamed to the computing system 21. To facilitate this specialized form of transmission, the processor 66 of the head wearable device 20 may be configured with a live-stream module 69, which may be implemented as software or computer-executable instructions that are executed by the processor 66 to perform the desired live-streaming function. In some embodiments, the live-stream module 69 may be provided by a mobile device app that is downloaded onto the head wearable device 20. Real-time capture of the audio and visual data 26 reduces cost and time associated with documenting or recording information relating to performance of the task. It is preferred that any live-streaming be capable of effectively rendering both live video and live audio in a high definition (HD) format. This way, with live-HD streaming, specialized actions and details may be addressed effectively and accurately during performance of the task. The head wearable device 20 may have configurations and may perform functions other than those specifically described herein.
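The chunked, low-latency character of live-streaming can be approximated with a short sketch (frame contents, chunk size, and function names are placeholders; a real implementation would encode and transport HD audio and video over the network 22):

```python
# Minimal sketch of live-streaming as chunked transmission: captured
# frames are emitted in small batches as they arrive, rather than only
# after the session ends, approximating a real-time feed.

def capture_frames(n):
    """Stand-in for the camera: produce n raw 'frames'."""
    for i in range(n):
        yield f"frame-{i}".encode()

def live_stream(frames, chunk_size=2):
    """Batch frames into small chunks so the receiver can render
    them with low latency instead of waiting for a full recording."""
    chunk = []
    for frame in frames:
        chunk.append(frame)
        if len(chunk) == chunk_size:
            yield chunk
            chunk = []
    if chunk:
        yield chunk  # flush any trailing partial chunk

chunks = list(live_stream(capture_frames(5)))
print(len(chunks))  # 3 chunks: two full, one partial
```

Smaller chunks lower latency at the cost of per-chunk overhead, which is the basic trade-off any live-streaming transport tunes.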

III. Computing System

FIG. 3 shows an example embodiment of the computing system 21 used by the operator for remotely directing or observing the task. The computing system 21 is located at the second location 24, which is remote from the first location 23. The computing system 21 may be an electronic device or workstation capable of receiving the audio and visual data 26 from the head wearable device 20. The computing system 21 may be any suitable device for allowing communication with the head wearable device 20 and presentation of the audio and visual data 26. For example, the computing system 21 may be a desktop computer or workstation, a laptop, a tablet device, a mobile device, a smart device, and the like.

Referring back to FIG. 2, the computing system 21 is connected to the network 22. In one embodiment, the computing system 21 comprises a digital display 27, a processor 28, a memory 29, a camera 77, a microphone 78, and a speaker 79. These components may be integrated with the computing system 21 or connected to the computing system 21. The operator may interact with the computing system 21 using various other input and output (I/O) devices, such as a mouse, a keyboard, and the like.

The audio and visual data 26 presented by the computing system 21 may be saved to the memory 29 for later access. The memory 29 may be connected via the wireless network 22, for example, as a dedicated file server as shown in FIG. 2. Alternatively, the memory 29 may be integrated as a part of the computing system 21.

The camera 77 is configured to capture visual data 26 from the second location 24 and the microphone 78 is configured to capture audio data 26 from the second location 24.

The audio data 26 captured by the microphone 62 of the head wearable device 20 during performance of the task is transmittable via the network 22 to the speaker 79 of the computing system 21. The visual data 26 captured by the camera 32 of the head wearable device 20 is transmittable via the network 22 to the digital display 27 of the computing system 21.

The computing system 21 is utilized by the operator to remotely direct and/or observe physical performance of the task from the second location 24. As such, audio data 26 captured by the microphone 78 of the computing system 21 during remote directing or observing of physical performance of the task is transmittable via the network 22 to the speaker 30 of the head wearable device 20. Additionally, visual data 26 captured by the camera 77 of the computing system 21 during remote directing or observing of physical performance of the task is transmittable via the network 22 to the digital display 33 of the head wearable device 20.

FIG. 6 shows a flow diagram of the method 12 for utilizing the computing system 21 to remotely direct and/or observe performance of the task. Step 35 includes transmitting, with the computing system 21, audio data 26 captured by the microphone 78 of the computing system 21 to the speaker 30 of the head wearable device 20 via the network 22 to allow the operator to remotely interact with physical performance of the task from the second location 24 using the audio data 26. Step 36 includes transmitting, with the computing system 21, visual data 26 captured by the camera 77 of the computing system 21 to the digital display 33 of the head wearable device 20 via the network 22 to allow the operator to remotely interact with physical performance of the task from the second location 24 using the visual data 26. Step 37 includes receiving, with the speaker 79 of the computing system 21, audio data 26 captured by the microphone 62 of the head wearable device 20 and transmitted via the network 22 during remote interaction with physical performance of the task being conducted by the first user. Step 38 includes receiving, with the digital display 27 of the computing system 21, visual data 26 captured from a first-person perspective of the first user by the camera 32 of the head wearable device 20 and transmitted via the network 22 during remote interaction with physical performance of the task being conducted by the first user.
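The four steps of method 12 can be sketched with in-memory queues standing in for the network 22 (the queue names and message payloads are illustrative only, not part of the disclosed method):

```python
# Sketch of steps 35-38: the operator side sends audio and visual data
# to the device, and receives the device's audio and first-person video.
# In-memory queues simulate the two directions of the network link.

from queue import Queue

to_device = Queue()    # operator -> head wearable device
to_operator = Queue()  # head wearable device -> operator

# Steps 35-36: operator transmits captured audio and visual data.
to_device.put(("audio", "check the grill temperature"))
to_device.put(("visual", "operator-camera-frame"))

# The device side responds during performance of the task.
to_operator.put(("audio", "grill reads 350F"))
to_operator.put(("visual", "first-person-frame"))

# Steps 37-38: operator receives the device's audio and visual data.
received = [to_operator.get() for _ in range(2)]
print(received[0][0], received[1][0])  # audio visual
```

The two independent queues reflect that transmission and reception proceed concurrently over the same network session.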

FIG. 3 shows a sample of the items that may be presented to the operator on the digital display 27 of the computing system 21 during performance of the task. The visual data 26 captured from a first-person perspective by the head wearable device 20 while worn by the user is presented on the digital display 27. This way, the operator can see in high detail visual information relating to performance of the task. For example, a real-time HD video stream of what the user is seeing may be presented here.

The computing system 21 may simultaneously display a data entry system 63 alongside the presentation of the visual data 26. The data entry system 63 allows the operator to enter data obtained during performance of the task. For example, such data may include questions, notes, observations, and/or outcomes relating to the audit. The operator may populate the data entry system 63 based on the audio and visual data 26 presented by the computing system 21.

Furthermore, the computing system 21 may be configured to automatically save some or all of the obtained audio and visual data 26 from the head mounted device 20 and automatically populate the data entry system 63 without operator involvement. For example, the computing system 21 may capture a picture or a video from the visual data 26 and save into the data entry system 63 to attribute the picture or video to a specific event relating to the performance of the task. The computing system 21 may have any other configuration suitable for communicating with the head wearable device 20 and presenting audio and visual data 26.

In other embodiments, the computing system 21 may be configured to automatically evaluate at least some of the audio and visual data 26 transmitted from the head wearable device 20 during performance of the task. For example, the computing system 21 may process visual data 26 to recognize patterns, trends, or objects that are relevant to performance of the task and evaluate such data 26 to make an observation about the task. For instance, the computing system 21 may visually recognize the presence of a workplace hazard at a particular location in the first location 23 and electronically report this finding to populate the data entry system 63. Similar implementations may be performed using the audio data 26. For example, questions and responses may be recorded and analyzed using voice recognition algorithms, or the like.
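A rule-based form of the automatic evaluation described above can be sketched as follows (the hazard keyword list and recognized-label format are assumptions for illustration; a deployed system might instead use trained recognition models):

```python
# Hypothetical rule-based evaluation: scan labels recognized in the
# visual data for known hazard terms and produce data-entry findings
# automatically, without operator involvement.

HAZARD_KEYWORDS = {"spill", "blocked exit", "exposed wiring"}

def evaluate_visual_labels(labels):
    """Return data-entry findings for any recognized hazards."""
    findings = []
    for label in labels:
        if label.lower() in HAZARD_KEYWORDS:
            findings.append({"finding": label,
                             "category": "workplace hazard"})
    return findings

entries = evaluate_visual_labels(["Spill", "clean counter", "blocked exit"])
print(len(entries))  # 2 hazards flagged
```

Each finding could then be appended to the data entry system alongside the frame in which it was detected.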

When the head wearable device 20 is in communication with or attached to the peripheral input 31 controlled by the first user during performance of the task, the head wearable device 20 may generate control data conveying information related to the task in response to control of the peripheral input 31 by the first user. The computing system 21 receives this control data from the head wearable device 20 via the network 22. One example of the control data may include input from the user initiated in response to the user selecting certain options that are presented on the digital display 33 of the head wearable device 20 as provided from the computing system 21 during the task. Another example of control data may include gestures performed by the user, wherein such gestures are recognized by an auxiliary gesture device attached, for example, to the user's forearm. Control data from such gestures may be generated in conjunction with performing the task.

In such instances, the computing system 21 may automatically evaluate the control data received from the head wearable device 20 during performance of the task. The control data may be evaluated in conjunction with any of the audio and visual data.
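As a non-limiting illustration of the control data described above, a simple record-and-tally scheme may be sketched as follows. The record structure and field names are assumptions for illustration only; any suitable encoding of the control data may be used.

```python
# Illustrative sketch: control data generated by the head wearable device
# in response to the peripheral input (e.g., a menu selection or gesture),
# tallied on the computing-system side for evaluation.

def make_control_data(source, action, detail=None):
    """Build one control-data record; `source` might be 'gesture' or
    'menu_selection', `action` the option or gesture recognized."""
    return {"source": source, "action": action, "detail": detail}

def evaluate_control_data(records):
    """Computing-system side: count how often each action occurred, for
    evaluation alongside the audio and visual data."""
    counts = {}
    for record in records:
        counts[record["action"]] = counts.get(record["action"], 0) + 1
    return counts
```

Such tallies could then be cross-referenced with the audio and visual data 26 when the digital report 60 is generated.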

The computing system 21 may also automatically generate a digital report 60 relating to an outcome of the task in response to automatically evaluating at least some of the audio and visual data 26 received from the head wearable device 20 during performance of the task. For example, the digital report 60 may document measurements, conditions, or other outcomes resulting from the inspection or audit. Details about the digital report 60 are provided below.

When the peripheral input 31 is utilized at the first location 23 to provide control data, the computing system 21 may also automatically generate the digital report 60 in response to automatically evaluating the control data individually or in conjunction with at least some of the audio and visual data 26. The computing system 21 may have configurations and may perform functions other than those specifically described herein.

IV. Tasks

Details relating to the tasks are provided herein. Many examples of the tasks and implementations of the tasks are provided below. These examples and implementations may at times be described generally for simplicity. However, it is to be understood that, where not explicitly stated, each of the examples and implementations relates to at least one of food safety, food quality, or workplace safety.

Food quality tasks involve assessing characteristics of specific food products in an effort to determine whether such characteristics are suitable for or compliant with predetermined standards, and the like. Food quality may involve assessing a degree of excellence of the food product and may include characteristics of the food, such as taste, appearance, nutritional content, and the like.

Food safety tasks involve the safe handling, preparing, and storing of food in an effort to prevent foodborne illnesses and ensure that the food is fit for consumption.

Workplace safety tasks involve the safety of the workplace from work-related illness, hazards, risks, and/or injury in an effort to protect the safety and health of individuals (e.g., employees) at the workplace. Workplace safety, for example, may be concerned with how, when, and where equipment is utilized (e.g., how to use a food slicer), using protective articles (e.g., wearing a helmet), proper techniques relating to physical activity at the workplace (e.g., how to lift heavy items), proper techniques relating to alerting to the presence of or eliminating hazards (e.g., how to clean up a spill and/or where or how to place a sign when there is a spill), and the like. The techniques described herein may be utilized to remotely interact with physical performance of any task relating to food quality, food safety, or workplace safety.

As described, the task may be performed at any suitable first location 23. Such locations may include, but are not limited to, any public or private workplace, restaurant, kitchen, shipping vehicle or container, manufacturing plant, farm, dietary supplement (DS) facility, pharma and medical device facility, bottling plant, market, slaughterhouse, warehouse, or vendor.

The task may include, by way of non-limiting example, any auditing, inspecting, grading, rating, testing, measuring, certifying, training, teaching, and any combination thereof. The task is preferably performed in view of, or in pursuit of, compliance with governmental regulations relating to food safety, food quality, or workplace safety. Such governmental regulations may include FDA or OSHA regulations, or the like.

The task may be performed to assess any suitable objects or environments relating to food quality, food safety, or workplace safety. For example, such objects or environments may include, but are not limited to dietary supplements, medical devices, plastics, sustainability and environment, water and wastewater, building products and interiors, biohazard management equipment, and items or environments relating to pharma biotech. For instance, the task may relate to water programs: certification programs for products, certification programs for cooling towers and buildings, pre-audits for cooling towers and buildings, and the like.

Those skilled in the art appreciate that the system 10 and method 12 may be utilized in various other applications relating to food quality, food safety, or workplace safety not specifically described herein. The system 10 and method 12 are preferably utilized in any one or more of three specific areas of food quality, food safety, or workplace safety: inspecting, auditing, and training, each of which is described in detail below. It is to be understood that any of the details below regarding inspecting, auditing, and training may be performed separately, or in combination, using the techniques of the system 10 and method 12.

A. Inspecting

One example of the task is an inspection. The inspection is the process of examining the facility and food product against specific safety and quality standards. The inspection is largely associated with evaluations of specific food products, like seafood, and the corresponding quality characteristics of the food products. Thus, one example of a food quality task is to inspect specific food products to determine compliance of the specific food products with predetermined standards, and the like. The inspection process aims to identify food safety issues and correct them before they propagate into recalls and illnesses. Inspections may be a formal event, where an independent inspector evaluates food quality and safety, or an informal event, where an internal source performs routine monitoring.

Referring to FIG. 7, several steps may be performed when the inspection is conducted at the first location 23. At step 39, an opening meeting is conducted. During the opening meeting 39, introductions are made and a plan is defined. At step 40, data is collected during performance of the task. The data collection 40 takes place throughout the first location 23. The collected data is analyzed at step 41. The collected data may be analyzed manually and/or automatically by the operator and the computing system 21 at the second location 24. The report 60 is generated about the inspection at step 42. The report 60 may take the form of the digital report described herein, which can be generated manually and/or automatically by the operator and the computing system 21 at the second location 24. Similar steps may be performed during performance of other tasks, such as the audit.
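By way of non-limiting illustration, the four steps of FIG. 7 may be represented as an ordered workflow, as sketched below. The class and step names are assumptions for illustration; as noted herein, the steps may in practice be reordered or omitted without restriction.

```python
from dataclasses import dataclass, field

# Illustrative ordering of the FIG. 7 inspection steps (39-42).
INSPECTION_STEPS = ["opening meeting", "data collection", "analysis", "report"]

@dataclass
class Inspection:
    """Tracks progress through one nominal pass of the inspection steps."""
    completed: list = field(default_factory=list)

    def perform(self, step):
        # Enforce the nominal FIG. 7 ordering for this sketch only.
        expected = INSPECTION_STEPS[len(self.completed)]
        if step != expected:
            raise ValueError(f"expected '{expected}', got '{step}'")
        self.completed.append(step)

    @property
    def done(self):
        return self.completed == INSPECTION_STEPS
```

A similar workflow object could track the steps of the audit described below.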

FIG. 8 illustrates a sample layout of the first location 23 where the task, and more specifically, the inspection, may take place. The inspection may start in a receiving area 49, progressing to a walk-in freezer 50 and walk-in cooler 51, a dry storage area 52, a prep area 53, a dish area 54, a cook area 55, and finally a service area 56. The inspection may optionally include areas on the outside of the foodservice facility such as a dumpster 57. The task may also take place in an office 58.

Data collection 40 may take any form known in the art, such as observing, testing, sampling, smelling, photographing, etc. Samples may be taken of the first location 23 and food product, such as temperatures during storage or cooking, chlorine and quaternary ammonium sanitizer levels, and other properties. Following the data collection 40, the inspector performs the analysis 41 of the collected data and records the results in a report 60. The report 60 is optionally provided to the user at the first location 23. Those skilled in the art may appreciate that the steps may be reordered or omitted without restriction.

Data collection 40 may be transmitted through the head wearable device 20. The head wearable device 20 may include input peripherals for electronically receiving sampling data. For example, a testing device for measuring temperature or chemical properties may be connected to the head wearable device 20. The testing device may transmit the results of the measurements to the computing system 21 through the head wearable device 20.
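As a non-limiting illustration of relaying a testing-device measurement through the head wearable device 20 to the computing system 21, a simple serialized payload may be sketched as follows. The JSON encoding and field names are assumptions for illustration; any suitable transport format may be used.

```python
import json
import time

def package_measurement(sensor, value, units):
    """Wrap a testing-device reading (e.g., temperature or sanitizer level)
    for relay through the head wearable device to the computing system."""
    return json.dumps({
        "sensor": sensor,
        "value": value,
        "units": units,
        "timestamp": time.time(),
    })

def receive_measurement(payload):
    """Computing-system side: decode the relayed reading for recording in
    the report or data entry system."""
    return json.loads(payload)
```

The decoded reading could then be saved into the data entry system 63 and attributed to the area of the first location 23 being inspected.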

During the inspection, the data 26 that the inspector collects may come from at least any of the following categories: sanitation, housekeeping, and hygiene; regulatory compliance; temperature controls; allergen management; menu detail and labeling; recordkeeping; quality of food supplies; cook time and temperatures; food storage, proper rotation, and date marking; food identification and labeling; staff training; food defense; rodent and pest management control; and building management, such as garbage, ventilation, lighting, and plumbing.

The user wears the head wearable device 20 while performing steps of the inspection. The head wearable device 20 captures audio and visual data 26, such as the first-person perspective of the user shown in FIGS. 5a and 5b, and transmits the audio and visual data 26 to the computing system 21. Using the computing system 21, the operator is able to view and record the audio and visual data 26 captured by the head wearable device 20. The results of the steps are captured by the head wearable device 20 and simultaneously viewed by the operator on the computing system 21. The operator analyzes the results to create the report 60, concluding the inspection. The report 60 is optionally sent to the user. The report 60 may additionally or alternatively be generated in an automated fashion by the computing system 21 based on automatic analysis of the captured audio and visual data 26, among other things.

FIGS. 9A and 9B show examples of the report 60 that may be created following the task. The report may have multiple sections for data entry. The sections may include a column of items that were examined as part of the task, a column showing a grade for each examined item, a column showing a weighted average contribution of the grade, and a row showing an overall grade. The grades may be in a numerical or letter format.

A column with comments regarding each item may also be included. The comments may further elaborate on the examined items. One example of the comments is a note of the specific temperatures at which food items were stored. Another example of the comments is a note identifying an area of an examined item that could be improved.

The report may be further broken down into sections corresponding to specific categories of tasks. The sections may be specific areas of the first location 23.
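By way of non-limiting illustration, the overall grade of the report 60 may be computed as the weighted average of the item grades, as sketched below. The formula and function name are assumptions for illustration; the report format does not mandate any particular computation.

```python
def overall_grade(items):
    """Compute an overall grade as the weighted average of item grades.

    `items` is a list of (grade, weight) pairs, mirroring the grade and
    weighted-contribution columns of the report in FIGS. 9A and 9B.
    """
    total_weight = sum(weight for _, weight in items)
    if total_weight == 0:
        raise ValueError("weights must not all be zero")
    return sum(grade * weight for grade, weight in items) / total_weight
```

A letter-grade report could apply the same computation after mapping letters to a numerical scale.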

B. Auditing

Another example of the task is an audit relating to food quality, food safety, or workplace safety. The audit is a process of verifying that the systems and training for food safety put in place at the first location 23 are functioning correctly. In contrast to the inspection, the objective of which is to identify and correct food safety issues, the objective of the audit is to evaluate items that may prevent a food safety issue from materializing. The audit establishes that the results of the food safety inspections are both accurate and repeatable.

Some of the steps that may take place during the audit process may include identifying what is supposed to be happening at the first location 23, observing what is happening, collecting information during the audit, sorting and analyzing evidence in support of observations, reporting what is found, and following up to verify issues have been corrected. Those skilled in the art may appreciate that the audit may also verify or test any items indicated as part of the inspection process.

Identifying what is supposed to be happening involves a working knowledge of what policies and procedures are being audited to verify compliance, as well as what training and quality systems have been implemented in order to evaluate effectiveness. Observing what is happening is viewing the implementation of or adherence to the policies, procedures, training, or standards that have been identified. Evidence in support of the observations is collected allowing objective evaluation of the compliance to the aforementioned policies, procedures, training, or standards. From the observations and evidence, the report 60 can optionally be generated to rank or grade the first location 23, or to provide reference for future audits. The follow-up may be conducted a predetermined length of time after the completion of the first audit to verify any issues that were observed have been resolved.

One type of audit is a manufacturing and processing audit. Manufacturing and processing audits address the adherence to multiple standards, such as local, state, and federal regulations. Additionally, good manufacturing practices (GMPs), process control procedures, and hazard analysis and critical control points (HACCP) may be analyzed. During the audit, a systematic examination of activities is conducted. The examination substantiates that the plant complies with standard operating procedures (SOPs), work instructions to maintain efficient production, and the implementation of management systems. Manufacturing and process audits may be scheduled in advance to take place over several days due to the time required to assess the audit criteria. Manufacturing and processing audits may require that the auditor have experience working in a manufacturing and processing facility to be eligible to conduct the audit.

An additional type of audit is a retail audit. Retail audits may be shorter in duration than the manufacturing and processing audit and may be conducted without warning to observe a snapshot in time of overall operations. The standards that are audited against in the retail audit are provided by an entity requesting the audit. The standards may be based on local or national food safety regulations and internal brand standards and policies. Prior to the audit, foodservice employees are trained on required food safety and quality procedures that are to be implemented at the first location 23.

The audit is conducted when the user wears the head wearable device 20 while performing the steps of the audit. The head wearable device 20 captures audio and visual data 26 from the audit and transmits the audio and visual data 26 to the computing system 21. Using the computing system 21, the operator is able to remotely view and record the audio and visual data 26 captured by the head wearable device 20. The operator guides the user to perform the steps of the audit as described. The results of the steps of the audit are captured by the head wearable device 20 and simultaneously viewed by the operator on the computing system 21. The operator analyzes the results to create the report 60, concluding the audit.

C. Training

Yet another example of the task is food or workplace safety related training. To ensure that the tasks are consistently performed correctly, several methods may be used. Training audits may be conducted where the auditor in training follows a lead auditor as the lead auditor conducts the audit. The auditor in training learns the process for conducting the audit from the lead auditor. A calibration audit is later performed to ensure that the auditor in training is following a process that is consistent with other auditors conducting the same type of assessments. In the calibration audit, the lead auditor shadows the auditor in training at the audit. The lead auditor assesses the performance of the auditor in training and advises the auditor in training as necessary to ensure consistency.

The head wearable device 20 is used for training and calibration purposes. During training, the lead auditor wears the head wearable device 20 while conducting the audit. The auditor in training is able to view and listen to the audit while taking notes and asking questions to the lead auditor during the audit. Alternatively, when conducting a calibration audit, the auditor in training wears the head wearable device 20 while performing the audit. The lead auditor views and listens to the audit while documenting strengths and areas of opportunity of the auditor in training in a report without interaction. At the conclusion of the calibration audit, the lead auditor reviews the report with the auditor in training using the head wearable device 20. Both the training audit and calibration audit may optionally be recorded for future use.

Another embodiment of training auditors or inspectors would be for the auditor in training to wear the head wearable device 20 while conducting the audit. Instructions for performing the audit would be displayed on the digital display 33. The instructions allow the auditor in training to perform the audit with or without the supervision of the lead auditor.

An additional embodiment provides training during the audit or inspection. The remote operator provides interactive training to the user, while at the same time completing the audit report and documenting audit observations. For example, the interactive training may take place when the remote operator notices that the user did not wash their hands using proper procedure. The remote operator may interject a short video onto the digital display 33 showing proper hand washing technique. A library of the short videos in the digital media 81 of the database 82 may be available to the operator for many types of interactive training.

In a further embodiment, the head wearable device 20 may also be utilized for training of foodservice or workplace employees. Safety and quality assurance training in the form of instruction, tasks, or tests would be displayed on the head wearable device 20. The training may consist of self-paced learning or active instruction from a teacher. Results from tests, or task completion data may be automatically downloaded from the head wearable device 20 for further analysis.

D. Other Examples

A non-limiting list of other examples and uses of the tasks is provided below. The techniques described herein may be utilized to provide witness sessions to approve auditors and trainers, calibration sessions to solve specific issues with auditors and trainers, training of auditors on site, recording of real examples for auditor calibration exercises, and the like; to perform remote audits, pre-audits, witness testing, and follow-up inspections and audits for review of corrective actions; to provide practical training in new product inspections, in which a trained expert who knows the new process performs an inspection while wearing the head wearable device 20 and other inspectors watch the process remotely; and to provide remote report writing to drive efficiency, whereby the auditor speaks the audit report during the audit while a lower-cost staff member utilizes the audio and visual data 26 to generate the report by entering data and/or capturing screenshots into the report, and the like. Such implementations save valuable time of the auditor and cut report turnaround time. Additionally, the system and method may allow the entire task to be recorded such that clients can observe performance of the entire task, or portions of the task, as desired.

Other examples of the task or implementations of the task include witness sessions to approve auditors or trainers; calibration sessions to solve specific issues with auditors or trainers; training of auditors on site; live sessions at a plant, restaurant or field during a training session of trainers or auditors in a class to show examples; follow up sessions for consulting after first visit to review progress; training of crews or managers on the job; remote inspections for consulting when contamination occurs and help is needed to find the problem; recording of real examples for auditor calibration exercises; consulting for restaurants, plants and possibly fields and harvest crews; beverage audits.

Additional examples of the task or uses of the task include calibration audits; providing specification details on-screen during product inspections to help the inspectors check products against detailed specs; product inspector calibration; providing expert support with an office-based specialist supporting generalist auditors in the field; practical training in new product inspections using a trained expert knowledgeable of the new process who conducts an inspection wearing the head wearable device 20 while other inspectors watch the process remotely; practical training for new audit standards by having trainees observe a new audit process being performed by an experienced, trained auditor; remote translation for the auditor and for employees in foreign countries to avoid sending a local auditor abroad for translation; providing expert support for food safety certification programs; providing customers or clients with real-time access to a deep-dive audit as it is performed; practical webinars in new audit techniques; production line supervision; self audits; assisted self audits; allowing customers to observe audits; and training government inspectors in audit techniques.

Other examples and uses of the task relate to supply chains. One example includes confirmation checks when a site has a crisis or recall, with an auditor at the first location carrying out the confirmation check and a technical team at the second location working with and directing the auditor. The auditor remotely provides information to the technical team on the actions taken by the site to support the continued use of the certificate and logos. Evidence remotely provided by the techniques described herein can be used to provide verbal and written assurance that corrective actions are in place. Other examples include confirmation of species/substance for sustainability claims. In remote or difficult-to-access areas, such as fishing trawlers, the techniques described herein could be used to confirm that the product is from a sustainable source or the correct species. Other examples include loading checks for shipping containers (Cargo Tally Verification) by providing confirmation that the correct number and type of product is loaded into the container before shipping; pre-shipment inspection for import eligibility or tariff classification; and assistance with complaint investigation, such as helping decide whether a recall is required by confirming the source or cause of the issue, or providing 'expert witness' evidence in insurance claims.

Additional examples and uses of the task relate to dietary supplements (DS). Examples relating to DS include good manufacturing practices (GMPs) audits; training in GMP; inspections relating to GMP; DS consulting; low-risk audits and 6-month program audits; and on- and off-site training.

In relation to food, the task is generally concerned with the prevention, management, and elimination of food safety issues to minimize risk of illness or product recalls. Production, preparation, and/or distribution of food may occur at the first location 23. Food may encompass any consumable substance that is ingested by a living organism. Production or preparation of food may include any of the following activities involving food such as, storing, transporting, distributing, selling, packaging, processing, serving, or consuming without restriction.

The techniques described herein may also be utilized in relation to food equipment. In food equipment, the focus of the assessment is on the design and construction of the equipment to ensure hygienic design, ease of cleanability, and performance as it relates to food safety. Segments of the industry include dishwashers, dispensing equipment, ice-making equipment, etc. Such large equipment types present a significant challenge because evaluations must take place at the manufacturer locations due to the complexity, size, and set-up requirements of the equipment. With hundreds of distinct equipment categories, it is a significant challenge to build 'local' expertise for carrying out onsite evaluations and testing internationally. The techniques can be utilized to deploy real-time, shadow trainings and equipment evaluations for global teams for a fraction of the cost and time. Further examples include conducting a real-time coached training, witness test, or equipment evaluation guided by a locally-based 'expert' while international staff in foreign country locations carry out such tasks; conducting a real-time coached equipment evaluation carried out by an auditor at a client location, while guided by a locally-based 'expert'; and offering evaluations, witness testing, consultative exploratory sessions, and key account 'on demand' support anywhere around the world.

Further examples and implementations of the task in relation to audits include performing observational behavior-style audits to capture images of staff performing tasks against set standards, to improve internal quality standards of a client and share such observations with the client organization. In such instances, images or video captured may be embedded into audit reports. Client organizations will be able to use the information captured to develop impactful company-wide corrective actions using the real-time examples. Furthermore, the techniques described herein could be performed for critical site food safety non-compliance issues. If there is disagreement about resolution, a certification line manager can see what the auditor has encountered on-site. Additionally, remote auditing can be performed by an in-house staff member for a specific audit in a remote geographical location, or in a no-go animal welfare bio-security environment, with guidance by a qualified third-party auditor.

In further relation to witness audits, an evaluator can take notes and review them later for any issues to address with a trainee auditor as part of corrective actions. Also, the techniques may be used to increase the number of witness audits completed at client locations and reduce the need for verifiers to travel to the site to verify. Furthermore, integrity audits can be performed where an on-site scheme manager may effectively be present as well to overcome any conflicts.

In further relation to self-audits, the techniques described herein may be used as part of technical support during self-audits with clients. For example, this may be useful for retailers during troubleshooting audits. In factory applications, staff can wear the head wearable device at key CCPs to remotely transmit video data relating to, e.g., temperature controls, to ensure the process is being accurately followed. Furthermore, the techniques may be utilized at slaughterhouses and/or for animal welfare tasks. In addition, remote technical coaching may be provided to client teams while they are undertaking their self-audit.

In further relation to training, trainers can use the techniques described herein so that quality control can be applied to the delivery of training courses and to gauge client interaction. Additionally, training organizations can assess effectiveness of training sessions and improve training delivery through coaching of trainers and review of materials, either in real time or retrospectively, enhancing training sessions. Training for admin processes can also be performed remotely. Delegates can be in a classroom (second location) while an instructor on-farm or on-site (first location) demonstrates the compliance criteria in place.

The system and method may additionally be utilized in water programs, such as certification programs for products, certification programs for cooling towers and buildings, pre-audits for cooling towers and buildings, and the like.

V. Food Safety Kit

As part of the system 10 for remotely performing the task, a kit 100, as shown in FIG. 10, is provided which houses the items used at the first location 23 for physically carrying out the task.

Articles useful in physically performing the task at the first location 23 may be bundled into the kit 100. The kit 100 may include a container 102 that is sealed and has a lid 104. The kit 100 is sent to the first location 23 to provide the user with the items to complete the task. The kit 100 is sent to the first location 23 either directly from the second location 24, or forwarded from a previous "first location". The auditor or inspector may send the kit 100 to the first location 23. The user at the first location 23 opens the kit 100. The kit 100 houses the head wearable device 20. The kit 100 also houses at least one other article or tool 106 utilized in physically performing the task at the first location 23. For example, the at least one other article or tool 106 may comprise connection devices for connecting the head wearable device 20 to the network 22, a charging unit for the head wearable device 20, an external battery pack, a portable hotspot device, and measurement tools such as a digital thermometer, chemical test strips, and chemical testing devices, as well as alcohol swabs, quaternary ammonium cation (quats) testing supplies, a flashlight, etc. The kit 100 may include a utility belt to easily hold all the necessary audit equipment and the head wearable device 20.

The container 102 is adapted to hold the items during sending to protect and organize the items of the kit. The container 102 can be reusable and may be fitted with a liner that has cutouts for the items in the kit. The liner may be made of foam or energy absorbing material.

Several embodiments have been discussed in the foregoing description. However, the embodiments discussed herein are not intended to be exhaustive or limit the invention to any particular form. The terminology that has been used is intended to be in the nature of words of description rather than of limitation. Many modifications and variations are possible in light of the above teachings and the invention may be practiced otherwise than as specifically described.

The many features and advantages of the invention are apparent from the detailed specification, and thus, it is intended by the appended claims to cover all such features and advantages of the invention that fall within the true spirit and scope of the invention. Further, since numerous modifications and variations will readily occur to those skilled in the art, it is not desired to limit the invention to the exact construction and operation illustrated and described, and accordingly, all suitable modifications and equivalents may be resorted to, falling within the scope of the invention.

For simplicity in description, each and every possible combination of the claims recited herein are not expressly and separately listed for formalistic purposes. However, it is to be understood that substantive support is intended and inherently provided for combining any combination of the claims recited herein regardless of the specific claim dependency provided upon filing. As such, there is no intention to limit claim support solely to the specific claim dependency provided upon filing. Rights are reserved to change dependency of the claims in any suitable manner after filing.

Claims

1. A computing system for remotely interacting with physical performance of a task relating to at least one of food quality, food safety, and workplace safety, with the computing system being in communication with a head wearable device via a network and with the head wearable device being utilized by a first user at a first location and comprising a first digital display, a first camera, a first microphone, a first speaker, and a first processor and with the computing system being utilized by an operator at a second location remote from the first location and with the computing system comprising a second digital display, a second camera, a second microphone, a second speaker, and a second processor, and wherein said computing system is configured to:

transmit audio data captured by the second microphone of the computing system to the first speaker of the head wearable device via the network to allow the operator to remotely interact with physical performance of the task from the second location using the audio data;
transmit visual data captured by the second camera of the computing system to the first digital display of the head wearable device via the network to allow the operator to remotely interact with physical performance of the task from the second location using the visual data;
receive with the second speaker of the computing system audio data captured by the first microphone of the head wearable device and transmitted via the network during remote interaction with physical performance of the task being conducted by the first user; and
receive with the second digital display of the computing system visual data captured from a first-person perspective of the first user by the first camera of the head wearable device and transmitted via the network during remote interaction with physical performance of the task being conducted by the first user.

2. The computing system of claim 1 further being configured to evaluate at least some of the visual data received by the computing system during remote interaction with physical performance of the task.

3. The computing system of claim 2 further being configured to evaluate at least some of the audio data received by the computing system during remote interaction with physical performance of the task.

4. The computing system of claim 2 further being configured to generate a digital report relating to an outcome of the task in response to evaluating at least some of the visual data received by the computing system during remote interaction with physical performance of the task.

5. The computing system of claim 1 further being configured to receive control data from the head wearable device via the network, wherein the control data conveys information related to physical performance of the task and is generated in response to control of a peripheral input in communication with or attached to the head wearable device.

6. The computing system of claim 5 further being configured to evaluate at least some of the control data received by the computing system during remote interaction with physical performance of the task.

7. The computing system of claim 6 further being configured to generate a digital report relating to an outcome of the task in response to evaluating at least some of the control data received by the computing system during remote interaction with physical performance of the task.

8. The computing system of claim 1 further being configured to transmit digital media relating to performance of the task to the first digital display of the head wearable device.

9. The computing system of claim 8 further being configured to:

recognize a predetermined event relating to performance of the task from at least one of the audio data and visual data received by the computing system;
access the digital media from a server in communication with at least one of said computing system and the head wearable device; and
transmit the digital media to the first digital display of the head wearable device in response to recognition of the predetermined event by the computing system.

10. The computing system of claim 1 wherein the audio data and visual data transmitted from and received by the computing system are streamed concurrently and in essentially real-time.

11. The computing system of claim 1 wherein the task is further defined as at least one of an audit and inspection relating to at least one of food quality, food safety, and workplace safety, and wherein the first location is further defined as a workplace that is subject to at least one of the audit and inspection and wherein the second location is further defined as a facility of a party responsible for interacting with or managing execution of at least one of the audit and inspection.

12. The computing system of claim 1 wherein the first user is further defined as at least one of an employee, trainee, manager, and person-in-charge at the first location responsible for physically performing the task and wherein the operator is further defined as at least one of auditor, inspector, and trainer skilled in food quality, food safety, or workplace safety.

13. A computer-implemented method for utilizing a computing system to remotely interact with physical performance of a task relating to at least one of food quality, food safety, and workplace safety, with the computing system being in communication with a head wearable device via a network and with the head wearable device being utilized by a first user at a first location and comprising a first digital display, a first camera, a first microphone, a first speaker, and a first processor and with the computing system being utilized by an operator at a second location remote from the first location and with the computing system comprising a second digital display, a second camera, a second microphone, a second speaker, and a second processor, said method comprising the steps of:

transmitting, with the computing system, audio data captured by the second microphone of the computing system to the first speaker of the head wearable device via the network to allow the operator to remotely interact with physical performance of the task from the second location using the audio data;
transmitting, with the computing system, visual data captured by the second camera of the computing system to the first digital display of the head wearable device via the network to allow the operator to remotely interact with physical performance of the task from the second location using the visual data;
receiving, with the second speaker of the computing system, audio data captured by the first microphone of the head wearable device and transmitted via the network during remote interaction with physical performance of the task being conducted by the first user; and
receiving, with the second digital display of the computing system, visual data captured from a first-person perspective of the first user by the first camera of the head wearable device and transmitted via the network during remote interaction with physical performance of the task being conducted by the first user.

14. The computer-implemented method of claim 13 further comprising the step of evaluating with the computing system at least some of the visual data received by the computing system during performance of the task.

15. The computer-implemented method of claim 14 further comprising the step of generating with the computing system a digital report relating to an outcome of the task in response to evaluating at least some of the visual data received by the computing system during remote interaction with physical performance of the task.

16. The computer-implemented method of claim 13 further comprising the step of receiving, with the computing system, control data from the head wearable device via the network, wherein the control data conveys information related to physical performance of the task and is generated in response to control of a peripheral input in communication with or attached to the head wearable device.

17. The computer-implemented method of claim 16 further comprising the step of evaluating with the computing system at least some of each of the audio data, visual data, and control data received by the computing system during remote interaction with physical performance of the task.

18. The computer-implemented method of claim 17 further comprising the step of generating with the computing system a digital report relating to an outcome of the task in response to evaluating at least some of the control data received by the computing system during remote interaction with physical performance of the task.

19. The computer-implemented method of claim 13 further comprising the step of transmitting, from the computing system, digital media relating to performance of the task to the first digital display of the head wearable device.

20. The computer-implemented method of claim 19 further comprising the step of:

recognizing, with the computing system, a predetermined event relating to performance of the task from at least one of the audio data and visual data received by the computing system from the head wearable device;
accessing the digital media from a server in communication with at least one of said computing system and the head wearable device; and
wherein transmitting, with the computing system, digital media relating to performance of the task to the first digital display of the head wearable device is further defined as occurring in response to recognition of the predetermined event by the computing system.

21. The computer-implemented method of claim 13 wherein the steps of transmitting the audio data and transmitting the visual data with the computing system are further defined as streaming the audio data and the visual data concurrently and in essentially real-time, and wherein the steps of receiving the audio data and receiving the visual data with the computing system are further defined as streaming the audio data and the visual data concurrently and in essentially real-time.

22. The computer-implemented method of claim 13 wherein the task is further defined as at least one of an audit and inspection relating to at least one of food quality, food safety, and workplace safety, and wherein the first location is further defined as a workplace that is subject to at least one of the audit and inspection and wherein the second location is further defined as a facility of a party responsible for interacting with or managing execution of at least one of the audit and inspection.

23. A kit for facilitating remote interaction with physical performance of a task relating to at least one of food quality, food safety, and workplace safety using a head wearable device at a first location and a computing system at a remote second location, wherein the head wearable device comprises a digital display, a camera, a microphone, a speaker, and a processor, and wherein the head wearable device and the computing system communicate via a network, said kit comprising:

a container comprising a base and a lid, with said container being sealable and being configured to house at least:
the head wearable device, with the head wearable device being configured to: transmit to the computing system, during physical performance of the task, visual data captured with the camera from a first-person perspective and audio data captured with the microphone; and receive from the computing system, during remote interaction with physical performance of the task, audio data for the speaker and visual data for the digital display; and
at least one other article or tool utilized in physically performing the task at the first location.
Patent History
Publication number: 20160306172
Type: Application
Filed: Apr 5, 2016
Publication Date: Oct 20, 2016
Inventors: Thomas Chestnut (Saline, MI), Jennifer Leighanne Tong (Troy, OH)
Application Number: 15/091,303
Classifications
International Classification: G02B 27/01 (20060101); G06Q 30/00 (20060101); G06F 3/14 (20060101); G06F 3/16 (20060101); G09G 5/12 (20060101);