PROCESS DIGITALIZATION TECHNOLOGY
A system and method for tracking actions of mobile assets used to perform a process within a facility includes a plurality of object trackers positioned throughout the facility to monitor, detect and digitize actions, including movement, of a mobile asset within the facility. The mobile asset includes an identifier which is detectable by each object tracker to track the movement and location of the detected asset in real time. Each object tracker includes at least one sensor for monitoring and detecting the asset and its identifier, where the input sensed by the sensor is transmitted to a computer within the object tracker for time stamping with a detected time, and processing of the sensor input using one or more algorithms to identify the asset ID and the asset type associated with the detected identifier, and the asset's location in the facility and interactions at the detected time.
This Application claims the benefit of U.S. Provisional Application 62/621,623 filed Jan. 25, 2018, and U.S. Provisional Application 62/621,709 filed Jan. 25, 2018, which are each hereby incorporated by reference in their entirety.
TECHNICAL FIELD
The present disclosure relates to a system and method for tracking actions, including movement, of mobile assets which are used to perform a process within a facility.
BACKGROUND
Material flow of component parts required to perform a process within a facility is one of the largest sources of down time in a manufacturing environment. Material flow of component parts is also one of the least digitized aspects of a process, as the dynamic nature of movement of component parts within a facility is complex and variable. It requires tracking not only the directly productive parts, such as workpieces and raw materials, as they are moved and processed within the facility, but also the carriers used to transport the workpieces and raw materials, which can include movement of the component parts by vehicles and/or human operators. Digitization of such an open-ended process with many component parts, carriers, and human interactions is very complex and can be inherently abstract, for example, due to variability in the travel path of a component part through the facility, the variety of carriers used to transport the part, variability in human interaction in the movement process, etc. As such, it can be very difficult to collect data on material flow within a facility in a meaningful way. Without meaningful data collection, there is relatively little quantifiable analysis that can be done to identify sources of defects and delays and to identify opportunities for improvement in the movement and actioning of component parts within the facility. Variation in movement of component parts within a facility is therefore generally tolerated or compensated for by adding additional and/or unnecessary lead time into the planned processing time of processes performed within the facility.
SUMMARY
A system and method described herein provides a means for tracking and analyzing actions, including movements, of mobile assets used to perform a process within a facility, by utilizing a plurality of object trackers positioned throughout the facility to monitor, detect and digitize actions of the mobile asset within the facility. In a non-limiting example, the mobile asset can be identified by an identifier which is unique to that mobile asset and is detectable by each of the object trackers, such that an object tracker, upon detecting the mobile asset, can track the movement and location of the asset in real time. Each object tracker includes at least one sensor for monitoring and detecting the asset and asset identifier, where the sensor input sensed by the sensor is transmitted to a computer within the object tracker for time stamping with a detected time, and for processing of the sensor input using one or more algorithms to identify the asset, including the asset ID and asset type associated with the identifier, the location of the asset in the facility at the detected time, and interactions of the asset at the detected time. Each object tracker is in communication via a facility network with a data broker such that the information detected by the object tracker, including the asset ID, asset type, detected time, detected location and detected interaction, can be transmitted to the data broker as an action entry for that detection event and stored in an action list data structure associated with the detected asset. The computer within the object tracker is referred to herein as a tracker computer. The sensor input can include, for example, sensed images, RFID signals, location input, etc., which is processed by the tracker computer to generate the action entry, where the action entry, in an illustrative example, is generated in JavaScript Object Notation (JSON) as a JSON string for transmission via the facility network to the data broker. Advantageously, by digitizing the sensor input for each detection event using the tracker computer, it is not necessary to transmit the raw sensor input over the facility network, and the amount of data transmitted via the facility network to the data broker for each detection event is substantially reduced.
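For illustration only, the following sketch shows how a tracker computer of the kind described above might assemble and serialize an action entry as a JSON string. The class and field names (ActionEntry, asset_id, detected_time, location, interaction) are hypothetical stand-ins for the data fields described in this disclosure, not names taken from it.

```python
import json
import time
from dataclasses import dataclass, asdict
from typing import Optional

@dataclass
class ActionEntry:
    """One detection event: asset ID, asset type, detected time,
    detected location, and any detected interaction."""
    asset_id: int
    asset_type: str
    detected_time: float                 # epoch seconds, stamped by the tracker computer
    location: tuple                      # (x, y) position within the facility, in meters
    interaction: Optional[dict] = None   # e.g. {"carried_by": {"asset_id": 7, "asset_type": "carrier-AGV"}}

def to_json_string(entry: ActionEntry) -> str:
    """Serialize the entry to a compact JSON string for transmission to the data broker."""
    return json.dumps(asdict(entry), separators=(",", ":"))

# Example: a part tray detected at (12.4 m, 3.1 m) while being carried by an AGV.
entry = ActionEntry(
    asset_id=1042,
    asset_type="carrier-tray",
    detected_time=time.time(),
    location=(12.4, 3.1),
    interaction={"carried_by": {"asset_id": 7, "asset_type": "carrier-AGV"}},
)
print(to_json_string(entry))
```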
As the asset is moved and/or acted upon within the facility through a sequence of actions, the object trackers continue to detect the asset and report the information collected during each detection event to the data broker, such that the collected data can be analyzed by a data analyzer, also referred to herein as an analyst, for example, to determine an actual duration of each movement and/or action of the mobile asset during processing within the facility, to identify a sequence of movements and/or actions, to map the location of the asset at the detected time and/or over time to a facility map, to compare the actual duration with a baseline duration, and/or to identify opportunities for improving asset flow in the facility, including opportunities to reduce the duration of each movement and/or action so as to reduce processing time and/or increase throughput and productivity of the process. Advantageously, the system and method can use the collected data to generate visualization outputs, including, for example, a detailed map of the facility tracking the movement of assets over time, and a heartbeat for the asset built from the actual and/or baseline durations of sequential movements and actions of the asset within the facility. The visualization outputs can be displayed, for example, via a user device in communication with the analyst.
By way of illustration, the system and method are described herein using a non-limiting example where the mobile assets being tracked and analyzed include part carriers and component parts. In a non-limiting example, the actions of a mobile asset which are detected and tracked by the object trackers can include movement, e.g., motion, of the mobile asset, including transporting, lifting, and placing a mobile asset. In the illustrative example, the actions detected can include removing a component part from a part carrier and/or moving a component part to a part carrier. A component part, as that term is used herein, refers to a component which is used to perform a process within a facility. In a non-limiting illustrative example, a component part, also referred to herein as a part, can be configured as one or more of a workpiece, an assembly including the workpiece, raw material used in forming the workpiece or assembly, and/or a tool, a gage, a fixture, or other component which is used in the process performed within the facility. A part carrier refers to a carrier which is used to move a component part within the facility. In a non-limiting illustrative example, a part carrier, also referred to herein as a carrier, can include any asset used to move or action a component part, including, for example, containers, bins, pallets, trays, etc., which are used to contain or support a component part during movement or actioning of the component part in the facility. A part carrier further includes any mobile asset used to transport the container, bin, pallet, tray, etc. and/or the component part or parts, including, for example, vehicles such as lift trucks, forklifts, pallet jacks, automatically guided vehicles (AGVs), and carts, and people such as machine operators and material handling personnel who move and/or action a component part and/or a carrier for transporting a component part.
In one example, the sensor input can be used by the tracker computer to determine one or more interactions of the detected asset. For example, where the detected asset is a first part carrier being conveyed by a second part carrier, an interaction determined by the tracker computer can be the asset ID and the asset type of the second part carrier being used to convey the first part carrier. For example, the first part carrier can be a part tray being transported by an AGV, where the detected asset is the part tray, and the interaction is the asset ID and asset type of the AGV. Another interaction can be, for example, a quantification of the number, type, and/or condition of parts being transported on the part tray, using image sensor input of the first part carrier received by the object tracker, where the part condition, in one example, can include a part parameter such as a dimension, feature, or other parameter determinable by the object tracker from the image sensor input. Advantageously, using the action list entries of the sequenced actions of an asset, including location over time and interaction data, block chain traceability of component parts through processing can be determined from the action list data structure for that asset.
A method for tracking actions of mobile assets used to perform a process within a facility is provided. The method can include positioning an object tracker at a tracker location within the facility, and providing a plurality of mobile assets to the facility, where each mobile asset includes an identifier which is unique to the mobile asset. The mobile asset is associated in a database with the identifier, an asset ID and an asset type. The object tracker defines a detection zone relative to the tracker location. The object tracker includes a sensor configured to collect sensor input within the detection zone, where collecting the sensor input includes detecting the identifier when the mobile asset is located in the detection zone. The object tracker further includes a tracker computer in communication with the sensor to receive the sensor input, and at least one algorithm for performing time stamping of the sensor input with a detection time, processing the sensor input to identify the identifier, processing the identifier to identify the asset ID and the asset type associated with the identifier and generating an asset entry including the asset ID, the asset type, and the detection time.
The method further includes collecting, via the sensor, the sensor input, receiving, via the tracker computer, the sensor input, time stamping, via the tracker computer, the sensor input with a detection time, processing, via the tracker computer, the sensor input to identify the identifier, processing, via the tracker computer, the identifier to identify the asset ID and the asset type associated with the identifier, and generating, via the tracker computer, the asset entry. Where the tracker computer of the object tracker is in communication with a central data broker via a network, the method can further include digitizing the asset entry using the object tracker, transmitting the asset entry to the central data broker via the network, mapping the asset entry to an asset action list using the central data broker, and storing the asset action list to the database, where the asset entry and the asset action list are each associated with the asset ID and asset type associated with the identifier. The method can include analyzing, via an analyst in communication with the database, the asset action list, where analyzing the asset action list can include determining an action event defined by the asset action list and determining an action event duration of the action event. The method can further include generating, via the analyst, one or more visualization outputs. For example, the method can include generating, via the analyst, a tracking map defined by the asset action list, wherein the tracking map visually displays at least one action performed by the mobile asset associated via the asset ID and asset type with the asset action list. The method can further include generating, via the analyst, a heartbeat defined by the asset action list, where the heartbeat visually displays the action event duration and the action event. In one example, analyzing the asset action list includes determining a plurality of action events defined by the asset action list, determining a respective action event duration for each action event of the plurality of action events, ordering the plurality of action events in a sequence according to time of occurrence, and generating, via the analyst, the heartbeat, where the heartbeat visually displays the respective action event duration and the action event of each of the plurality of action events in the sequence.
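As a minimal sketch of the heartbeat-generation step described above, assuming action events have already been determined and reduced to simple name/start/end records (the names here are hypothetical, not from this disclosure), the following code orders the events by time of occurrence and pairs each with its duration, which is the information a heartbeat display would present.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class ActionEvent:
    name: str      # e.g. "transport pallet C2 to line"
    start: float   # epoch seconds
    end: float     # epoch seconds

    @property
    def duration(self) -> float:
        return self.end - self.start

def build_heartbeat(events: List[ActionEvent]) -> List[Tuple[str, float]]:
    """Order action events by time of occurrence and return (event, duration)
    pairs in sequence -- the data a heartbeat display would plot."""
    ordered = sorted(events, key=lambda e: e.start)
    return [(e.name, e.duration) for e in ordered]

events = [
    ActionEvent("lift pallet", 100.0, 112.5),
    ActionEvent("place pallet at line", 260.0, 271.0),
    ActionEvent("transport pallet", 112.5, 260.0),
]
for name, duration in build_heartbeat(events):
    print(f"{name}: {duration:.1f} s")
```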
The above features and advantages, and other features and advantages, of the present teachings are readily apparent from the following detailed description of some of the best modes and other embodiments for carrying out the present teachings, as defined in the appended claims, when taken in connection with the accompanying drawings.
The elements of the disclosed embodiments, as described and illustrated herein, may be arranged and designed in a variety of different configurations. Thus, the following detailed description is not intended to limit the scope of the disclosure, as claimed, but is merely representative of possible embodiments thereof. In addition, while numerous specific details are set forth in the following description in order to provide a thorough understanding of the embodiments disclosed herein, some embodiments can be practiced without some of these details. Moreover, for the purpose of clarity, certain technical material that is understood in the related art has not been described in detail in order to avoid unnecessarily obscuring the disclosure. Furthermore, the disclosure, as illustrated and described herein, may be practiced in the absence of an element that is not specifically disclosed herein. In the drawings, like reference numbers represent like components throughout the several figures.
As the mobile asset 24 is moved through a sequence of actions 114 within the facility 10, the various object trackers 12 positioned within the facility 10 continue to detect the mobile asset 24, collect sensor input during each additional detection event, process the sensor input to generate an additional action entry 90 for the detection event, and transmit the additional action entry 90 to the central data broker 28. The central data broker 28, upon receiving the additional action entry 90, deserializes the action entry data, which includes an asset ID 86 identifying the mobile asset 24, and maps the data retrieved from the additional action entry 90 to a data structure configured as an asset action list 102 associated with the mobile asset 24 identified in the action entry 90.
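A minimal sketch of the broker behavior described above, reusing the hypothetical JSON field names from the earlier example: the central data broker deserializes each received action entry and appends it to the action list kept for the asset ID named in the entry.

```python
import json
from collections import defaultdict

class CentralDataBroker:
    """Toy data broker: deserializes incoming JSON action entries and maps each
    one onto the action list of the asset identified in the entry."""

    def __init__(self):
        # asset_id -> list of action entries, appended in arrival order
        self.asset_action_lists = defaultdict(list)

    def receive(self, json_entry: str) -> None:
        entry = json.loads(json_entry)        # deserialize the action entry data
        asset_id = entry["asset_id"]          # identify the mobile asset
        self.asset_action_lists[asset_id].append(entry)

broker = CentralDataBroker()
broker.receive('{"asset_id": 1042, "asset_type": "carrier-tray", "detected_time": 1706200000.0, "location": [12.4, 3.1]}')
broker.receive('{"asset_id": 1042, "asset_type": "carrier-tray", "detected_time": 1706200045.0, "location": [18.0, 3.2]}')
print(len(broker.asset_action_lists[1042]))   # -> 2
```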
In one example, the remote server 46 is configured as a cloud server accessible via a network 48 in communication with the remote server 46 and the central data broker 28. In one example, the network 48 is the Internet. The server 46, 56 can be configured to receive and store asset data and action data to the database 122, including for example, identifier 30 data, asset instance 104 data, asset entry 90 data, and asset action list 102 data for each mobile asset 24, in a data structure as described herein. The server 46 can be configured to receive and store visualization outputs including, for example, tracking maps 116 and mobile asset heartbeats 110 generated by an analyst 54 in communication with the server 46, 56, using the action data.
The analyst 54 includes a central processing unit (CPU) 66 for executing one or more algorithms for analyzing the data stored in the database 122, and a memory. The analyst 54 can include, for example, algorithms for analyzing the asset action lists 102, for determining asset event durations 108, for generating and analyzing visualization outputs including asset event heartbeats 110 and tracking maps 116, etc. The memory, at least some of which is tangible and non-transitory, may include, by way of example, ROM, RAM, EEPROM, etc., of a size and speed sufficient, for example, for executing the algorithms, storing a database, and/or communicating with the central data broker 28, the servers 46, 56, the network 48, one or more user devices 50 and/or one or more output displays 52.
The server 46, 56 includes one or more applications and a memory for receiving, storing, and/or providing the asset data, action data and data derived therefrom including visualization data, heartbeat data, map data, etc. within the system 100, and a central processing unit (CPU) for executing the applications. The memory, at least some of which is tangible and non-transitory, may include, by way of example, ROM, RAM, EEPROM, etc., of a size and speed sufficient, for example, for executing the applications, storing a database, which can be the database 122, and/or communicating with the central data broker 28, the analyst 54, the network 48, one or more user devices 50 and/or one or more output displays 52.
The analyst 54, also referred to herein as a data analyzer, is in communication with the server 46, 56, and analyzes the data stored to the asset action list 102, for example, to determine an actual duration 108 of each action and/or movement of the mobile asset 24 during processing within the facility 10, to identify a sequence 114 of action events 40 defined by the movements and/or actions, to map the location of the mobile asset 24 at the detected time 92 and/or over time to a facility map 116, to compare the actual action event duration 108 with a baseline action event duration, and/or to identify opportunities for improving asset movement efficiency and flow in the facility 10, including opportunities to reduce the action duration 108 of each movement and/or action and thereby improve the effectiveness of the process by, for example, reducing processing time and/or increasing throughput and productivity of the process. Advantageously, the system 100 and method 200 can use the data stored in the database 122 to generate visualization outputs, including, for example, a detailed map 116 of the facility 10 showing the tracked movement of the mobile assets 24 over time, and a heartbeat 110 for action events 40 of an asset 24, using the action durations 108 of sequential movements and actions of the asset 24 within the facility 10. The visualization outputs can be displayed, for example, via a user device 50 and/or an output display 52 in communication with the analyst 54.
The system 100 includes a plurality of object trackers 12 positioned throughout the facility 10 to monitor, detect and digitize the actions of one or more of the mobile assets 24 used in performing at least one process within the facility 10. Each object tracker 12 is characterized by a detection zone 42.
Each of the object trackers 12 includes a communication module 80 such that each structural object tracker Sx, each line object tracker Lx, and each mobile object tracker Mx can communicate wirelessly with each other object tracker 12, for example, using WiFi and/or Bluetooth®. Each of the object trackers 12 includes a connector for connecting via a PoE cable 62 such that each structural object tracker Sx, each line object tracker Lx, and each mobile object tracker Mx can, when connected to the facility network 20, communicate via the facility network 20 with each other object tracker 12 connected to the facility network 20.
Each structural object tracker Sx is connected to one of the structural enclosure 14 or the exterior structure 16, such that each structural object tracker Sx is in a fixed position in a known location relative to the facility 10 when in operation.
Each line object tracker Lx is connected to one of the processing lines 18, such that each line object tracker Lx is in a fixed position in a known location relative to the processing line 18 when in operation.
Each mobile object tracker Mx is connected to one of the mobile assets 24, such that each mobile object tracker Mx is mobile and is moved through the facility 10 by the mobile asset 24 to which the mobile object tracker Mx is connected. Each mobile object tracker Mx defines a detection zone 42 which moves with movement of the mobile object tracker Mx in the facility 10. In a non-limiting example, the location of each mobile object tracker Mx in the facility 10 is determined by the mobile object tracker Mx at any time, using, for example, its location module 82 and a SLAM algorithm 70, where the mobile object tracker Mx can communicate with other object trackers 12 having a fixed location to provide input for determining its own location. The example is non-limiting, and other methods can be used. For example, the location module 82 can be configured to determine the GPS coordinates of the mobile object tracker Mx to determine location. In the illustrative example, each mobile object tracker Mx communicates with the facility network 20, for example, via one of the structural object trackers Sx, by sending signals and/or data, including digitized action entry 90 data, to the structural object tracker Sx via the communication modules 80 of the respective mobile object tracker Mx sending the data and the respective structural object tracker Sx receiving the data. The data received by the structural object tracker Sx from the mobile object tracker Mx can include, in one example, the tracker ID of the mobile object tracker Mx transmitting the data, such that the structural object tracker Sx can transmit the tracker ID along with the received data to the central data broker 28. As the mobile object tracker Mx identifies mobile assets 24 detected in its detection zone 42 and generates asset entries 90 for each detected mobile asset 24, the mobile object tracker Mx transmits the generated asset entries 90 in real time to a structural object tracker Sx for retransmission to the central data broker 28 via the facility network 20, such that there is no latency or delay in the transmission of the generated asset entries 90 from the mobile object tracker Mx to the central data broker 28. By transmitting all data generated by all of the object trackers 12, including the mobile object trackers Mx, to the central data broker 28 via a single outlet, the facility network 20, data security is controlled. Each mobile object tracker Mx can be powered, for example, by a power source provided by the mobile asset to which the mobile object tracker Mx is connected, and/or by a portable and/or rechargeable power source such as a battery.
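The relay path described above might be sketched as follows; the message layout and the send_to_broker transport stand-in are assumptions for illustration, not details taken from this disclosure.

```python
import json

def relay_entry(mobile_tracker_id: str, structural_tracker_id: str,
                action_entry: dict, send_to_broker) -> None:
    """Sketch of the relay: a mobile object tracker hands its generated action
    entry to a structural object tracker, which attaches the originating
    tracker ID before forwarding over the facility network. `send_to_broker`
    stands in for whatever facility-network transport is used."""
    message = {
        "tracker_id": mobile_tracker_id,       # ID of the tracker that generated the entry
        "relayed_by": structural_tracker_id,   # fixed tracker that forwards it
        "entry": action_entry,
    }
    send_to_broker(json.dumps(message))

# Example usage with a stand-in transport that just prints the payload.
relay_entry("M1", "S3",
            {"asset_id": 1042, "asset_type": "carrier-tray", "detected_time": 1706200000.0},
            send_to_broker=print)
```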
In a non-limiting example, the mobile assets 24 being tracked and analyzed include part carriers C1 . . . Cq and component parts P1 . . . Pp.
As used herein, a part carrier Cx refers generally to one of the part carriers C1 . . . Cq. A part carrier, as that term is used herein, refers to a carrier Cx which is used to move a component part Px within the facility 10. In a non-limiting illustrative example, a part carrier Cx can include any mobile asset 24 used to move or action a component part Px, including, for example, containers, bins, pallets, trays, etc., which are configured to contain or support a component part Px during movement or actioning of the component part Px in the facility 10 (see, for example, carrier C2 containing part P1).
A mobile asset 24, which in the present example is configured as a carrier Cq for transporting one or more parts Px, is shown in the figures.
In another example, the QR code 32 positioned on the carrier Cq can be detected using an image of the carrier Cq sensed by the camera 76 of the object tracker 12 and inputted to the tracker computer 60 as a sensor input, such that the tracker computer 60, by processing the image sensor input, can detect the QR code data, which is mapped in the database 122 to the asset instance 104 of the carrier Cq, and use the QR code data to identify the carrier Cq. In another example, the labels 34 can be detected using an image of the carrier Cq sensed by the camera 76 of the object tracker 12 and inputted to the tracker computer 60 as a sensor input, such that the tracker computer 60, by processing the image sensor input, can detect each label. In one example, at least one of the labels 34 can include a marking, such as a serial number or bar code, uniquely identifying the carrier Cq and mapped in the database 122 to the asset instance 104 of the carrier Cq, such that the tracker computer 60, in processing the image sensor input, can identify the marking and use the marking to identify the carrier Cq. In another example, the combination of the labels 34 can define a fiducial feature 36.
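One common, generic way to decode a QR code from a camera frame is OpenCV's QRCodeDetector, sketched below. This is offered only as an illustration of the kind of image processing involved, not as the tracker computer's actual implementation, and the image file name is hypothetical.

```python
# Generic sketch of QR-code-based identification from a camera frame using OpenCV.
import cv2

def identify_from_frame(frame):
    """Return the decoded QR payload (e.g. a carrier's identifier string) from a
    camera frame, or None if no QR code is found."""
    detector = cv2.QRCodeDetector()
    data, points, _ = detector.detectAndDecode(frame)
    return data if data else None

if __name__ == "__main__":
    frame = cv2.imread("carrier_snapshot.png")   # hypothetical saved camera frame
    if frame is not None:
        payload = identify_from_frame(frame)
        print("decoded identifier:", payload)
```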
A mobile asset 24, which in the present example is configured as a part Pp, is shown in the figures.
The tracker computer 60 receives sensor input from the various sensors 64 in the object tracker 12, which includes image input from the one or more cameras 76, and can include one or more of RFID tag data input from the RFID reader 78, location data input from the location module 82, and wireless data from the communication module 80. The sensor input is time stamped by the tracker computer 60, using a live time obtained from the facility network 20 or a live time obtained from the processor 66, where in the latter example, the processor time has been synchronized with the live time of the facility network 20. The facility network 20 time can be established, for example, by the central data broker 28 or by a server such as local server 56 in communication with the facility network 20. Each of the processors 66 of the object trackers 12 is synchronized with the facility network 20 for accuracy in time stamping of the sensor input and accuracy in determining the detected time 92 of a detected mobile asset 24.
The sensor input is processed by the tracker computer 60, using one or more of the algorithms 70, to determine if the sensor input has detected any identifiers 30 of mobile assets 24 in the detection zone 42 of the object tracker 12, where detection of an identifier 30 in the detection zone 42 is a detection event. When one or more identifiers 30 are detected, each identifier 30 is processed by the tracker computer 60 to identify the mobile asset 24 associated with the identifier 30, by determining the asset instance 104 mapped to the identifier 30 in the database 122, where the asset instance 104 of the mobile asset 24 associated with the identifier 30 includes the asset ID 86 and the asset type 88 of the identified mobile asset 24. The asset ID 86 is stored in the database 122 as a simple unique integer mapped to the mobile asset 24, such that the tracker computer 60, using the identifier 30 data, retrieves the asset ID 86 mapped to the detected mobile asset 24 for entry into an action entry 90 being populated by the tracker computer 60 for that detection event. A listing of types of assets is stored in the database 122, with each asset type 88 mapped to an integer in the database 122. The tracker computer 60 retrieves the integer mapped to the asset type 88 associated with the asset ID in the database 122, for entry into the action entry 90. The database 122, in one example, can be stored in a server 46, 56 in communication with the central data broker 28 and the analyst 54, such that the stored data is accessible by the central data broker 28, by the analyst 54, and/or by the object tracker 12 via the central data broker 28. The server can include one or more of a local server 56 and a remote server 46 such as a cloud server accessible via a network 48. The example is non-limiting, and it will be appreciated that the database 122 could be stored in the central data broker 28 or in the analyst 54, for example. In an illustrative example, an asset type can be a category of asset, such as a part carrier or component part, a specific asset type, such as a bin, pallet, tray, fastener, assembly, etc., or a combination of these, for example, a carrier-bin, carrier-pallet, part-fastener, part-assembly, etc. Non-limiting examples of various types and configurations of identifiers 30 which may be associated with a mobile asset 24 are shown in the figures.
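As an illustration of the identifier-to-asset-instance lookup described above, the sketch below uses an in-memory SQLite database as a stand-in for the database 122; the table and column names are hypothetical.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE asset_types (type_id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE assets (asset_id INTEGER PRIMARY KEY,          -- simple unique integer
                         identifier TEXT UNIQUE,                -- RFID/QR identifier data
                         type_id INTEGER REFERENCES asset_types(type_id));
    INSERT INTO asset_types VALUES (1, 'carrier-pallet'), (2, 'part-assembly');
    INSERT INTO assets VALUES (1042, 'RFID:0xA3F1', 1);
""")

def lookup_asset(identifier: str):
    """Map a detected identifier to the (asset_id, type_id) pair that a tracker
    computer would place into the action entry."""
    row = conn.execute(
        "SELECT asset_id, type_id FROM assets WHERE identifier = ?",
        (identifier,)).fetchone()
    return row  # e.g. (1042, 1), or None if the identifier is unknown

print(lookup_asset("RFID:0xA3F1"))   # -> (1042, 1)
```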
The tracker computer 60 populates an action entry 90 data structure for the detection event.
The tracker computer 60 processes the sensor input to determine the location 96 of the mobile asset 24 detected during the detection event, for entry into the corresponding field(s) in the action entry 90.
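A minimal sketch of how the location field might be resolved, under the assumption (drawn from the fixed and mobile tracker descriptions above) that the tracker contributes its own position in facility coordinates and the sensor input contributes an optional offset of the asset within the detection zone:

```python
from typing import Optional, Tuple

def resolve_location(tracker_position: Tuple[float, float],
                     offset_in_zone: Optional[Tuple[float, float]] = None) -> Tuple[float, float]:
    """Fill the location field of an action entry. `tracker_position` is the
    tracker's own position in facility coordinates -- the known installed
    position for a structural or line tracker, or the current SLAM/GPS-derived
    position for a mobile tracker. `offset_in_zone` is an optional estimate,
    from the sensor input, of where the detected asset sits within the
    detection zone."""
    x, y = tracker_position
    if offset_in_zone:
        x += offset_in_zone[0]
        y += offset_in_zone[1]
    return (x, y)

# Structural tracker installed at (10.0, 4.0) sees the asset about 2.4 m away in x.
print(resolve_location((10.0, 4.0), (2.4, 0.0)))   # -> (12.4, 4.0)
```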
In one example, the sensor input can be used by the tracker computer 60 to determine one or more interactions 98 of the detected asset 24. The type and form of the data entry into the interaction field 98 of the action entry 90 is dependent on the type of interaction which is determined for the mobile asset 24 detected during the detection event. For example, where the detected asset 24 is a second part carrier C2 being conveyed by another mobile asset 24 which is a first part carrier C1, the interaction 98 determined by the tracker computer 60 can include the asset ID 86 and asset type 88 of the first part carrier C1 conveying the second part carrier C2.
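For illustration, an interaction field of the kind described above might be populated as nested data naming the conveying asset and/or the detected part count; the field names below are hypothetical.

```python
def build_interaction(conveying_asset_id=None, conveying_asset_type=None,
                      part_count=None, part_type=None) -> dict:
    """Sketch of how an interaction field could be populated: the asset ID and
    type of the carrier conveying the detected asset, and/or a count of the
    parts the detected carrier is holding, as determined from the sensor input."""
    interaction = {}
    if conveying_asset_id is not None:
        interaction["conveyed_by"] = {"asset_id": conveying_asset_id,
                                      "asset_type": conveying_asset_type}
    if part_count is not None:
        interaction["parts"] = {"count": part_count, "type": part_type}
    return interaction

# Pallet carrier detected while carried by a forklift (asset 7), holding 12 fasteners.
print(build_interaction(conveying_asset_id=7, conveying_asset_type="carrier-forklift",
                        part_count=12, part_type="part-fastener"))
```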
In one example, the tracker computer 60 can be instructed to enter a defined interaction 98 based on one or a combination of the asset ID 86, asset type 88, action type 94, and location 96.
After the tracker computer 60 has populated the data fields 86, 88, 90, 92, 94, 96, 98 of the action entry 90 for the detected event, the action entry 90 is digitized by the tracker computer 60 and transmitted to the central data broker 28 via the facility network 20. In an illustrative example, the action entry 90 is generated in JavaScript Object Notation (JSON) by serializing the data populating the data fields 86, 88, 90, 92, 94, 96, 98 into a JSON string for transmission as an action entry 90 for the detected event.
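A sketch of the serialize-and-transmit step follows. The disclosure specifies a JSON string sent over the facility network but not the transport, so the newline-delimited TCP connection and the host and port used here are assumptions.

```python
import json
import socket

def transmit_action_entry(fields: dict, broker_host: str, broker_port: int) -> None:
    payload = json.dumps(fields).encode("utf-8")        # serialize the data fields to a JSON string
    with socket.create_connection((broker_host, broker_port), timeout=5) as sock:
        sock.sendall(payload + b"\n")                   # newline-delimited messages (assumption)

fields = {
    "asset_id": 1042, "asset_type": "carrier-pallet",
    "detected_time": 1706200000.0, "action_type": "transport",
    "location": [12.4, 3.1],
    "interaction": {"conveyed_by": {"asset_id": 7, "asset_type": "carrier-forklift"}},
}
# transmit_action_entry(fields, "databroker.facility.local", 9000)   # hypothetical host and port
```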
Over time, additional actions are detected by one or more of the object trackers 12 as the asset 24 is used in performing a process within the facility 10, and additional action entries 90 are generated by the object trackers 12 detecting the additional actions and are added to the action list 102 of the mobile asset 24.
Using the example of the asset action list 102 generated for the pallet carrier C2, the data analyst 54 analyzes the asset action list 102, including the various action entries 90 generated for actions of the pallet carrier C2 detected by the various object trackers 12 as the pallet carrier C2 was transported by the carrier C1 from the retrieval location to the destination location during the action event 40. The analysis of the asset action list 102 and the action entries 90 contained therein, performed by the analyst 54, can include using one or more algorithms to, for example, reconcile the various action entries 90 generated by the various object trackers S1, S3, S5, S7, L1 and M1 during the action event 40; to determine the actual path taken by the pallet carrier C2 during the action event 40 using, for example, the action type 94 data, the location 96 data and the time stamp 92 data from the various action entries 90 in the asset action list 102; to determine an actual action event duration 108 for the action event 40 using, for example, the action event durations 108 and time stamp 92 data from the various action entries 90 in the asset action list 102; to generate a tracking map 116 showing the actual path of the pallet carrier C2 during the action event 40; to generate a heartbeat 110 of the mobile asset 24, in this example the pallet carrier C2; to compare the actual action event 40, for example, to a baseline action event 40; and to statistically quantify the action event 40, for example, to provide comparative statistics regarding the action event duration 108. The analyst 54 can associate the action event 40 with the asset instance 104 of the mobile asset 24, in this example the pallet carrier C2, with the tracking map data (including path data identifying the path traveled by the pallet carrier C2 during the action event 40), and with the action event duration 108 determined for the action event 40, and store these associations in the database 122. In an illustrative example, the action event 40 can be associated with one or more groups of action events having a common characteristic, for comparative analysis, where the common characteristic shared by the action events associated in the group can be, for example, the event type, the action type, the mobile asset type, the interaction, etc.
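Part of the reconciliation described above can be illustrated with a short sketch that, given the action entries collected for the pallet carrier during the action event (reduced here to hypothetical detected_time and location fields), reconstructs the actual path in time order and computes the actual event duration.

```python
from typing import Dict, List, Tuple

def reconcile_action_event(action_list: List[Dict]) -> Tuple[List[Tuple[float, float]], float]:
    """Order the action entries of one action event by detection time, return the
    path actually taken and the actual event duration (last detection minus first)."""
    ordered = sorted(action_list, key=lambda e: e["detected_time"])
    path = [tuple(e["location"]) for e in ordered]
    duration = ordered[-1]["detected_time"] - ordered[0]["detected_time"]
    return path, duration

entries = [
    {"detected_time": 100.0, "location": [2.0, 1.0]},    # detected near the retrieval location
    {"detected_time": 145.0, "location": [9.5, 1.2]},    # detected en route
    {"detected_time": 190.0, "location": [14.0, 6.0]},   # detected at the destination line
]
path, duration = reconcile_action_event(entries)
print(path)       # -> [(2.0, 1.0), (9.5, 1.2), (14.0, 6.0)]
print(duration)   # -> 90.0 seconds
```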
The tracking map 116 and the mobile asset heartbeat 110 are non-limiting examples of a plurality of visualization outputs which can be generated by the analyst 54, which can be stored to the database 122 and displayed, for example, via a user device 50 or output display 52. In one example, the visualization outputs, including the tracking map 116 and mobile asset heartbeat 110, can be generated by the analyst 54 in near real time such that these visualization outputs can be used to provide alerts, show action event status, etc., to facilitate identification and implementation of corrective and/or improvement actions in real time. As used herein, an “action event” is distinguished from an “action”, in that an action event 40 includes, for example, the cumulative actions executed to complete the action event 40. In the present example, the action event 40 is the delivery of the pallet carrier C2 from the retrieval location to the destination location.
The tracking map 116 can include additional information, such as the actual time at which the pallet carrier C2 is located at various points along the actual delivery path shown for the action event 40, the actual event duration 108 for the action event 40, etc., and can be color coded or otherwise indicate comparative information. For example, the tracking map 116 can display a baseline action event 40 together with the actual action event 40, to visualize deviations of the actual action event 40 from the baseline event 40. For example, an action event 40 with an actual event duration 108 which is greater than a baseline event duration 108 for that action event can be coded red to indicate an alert or improvement opportunity. An action event 40 with an actual event duration 108 which is less than a baseline event duration 108 for that action event can be coded blue to prompt investigation of the reasons for the demonstrated improvement, for replication in future action events of that type. The tracking map 116 can include icons identifying the action type 94 of the action event 40 shown on the tracking map 116, for example, whether the action event 40 is a transport, lifting, or placement type action. In one example, each action event 40 displayed on the tracking map 116 can be linked, for example, via a user interface element (UIE), to detail information for that action event 40 including, for example, the actual event duration 108, a baseline event duration, event interactions, a comparison of the actual event 40 to a baseline event, etc.
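The red/blue comparative coding described above might be expressed as a simple rule like the following sketch; the tolerance band is an assumption.

```python
def color_code(actual_duration: float, baseline_duration: float,
               tolerance: float = 0.05) -> str:
    """Red when the actual action event duration exceeds the baseline (alert /
    improvement opportunity), blue when it beats the baseline (a candidate
    practice to replicate), neutral otherwise."""
    if actual_duration > baseline_duration * (1 + tolerance):
        return "red"
    if actual_duration < baseline_duration * (1 - tolerance):
        return "blue"
    return "neutral"

print(color_code(actual_duration=112.0, baseline_duration=90.0))   # -> "red"
print(color_code(actual_duration=80.0,  baseline_duration=90.0))   # -> "blue"
```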
In one example, the sequence of action events 114 can be comprised of action events 40 which are known action events 40 and can, for example, be included in a sequence of operations executed to perform a process within the facility 10, such that, by tracking and digitizing the actions of the mobile assets 24 in the facility 10, the total cycle time required to perform the sequence of operations of the process can be accurately quantified and analyzed for improvement opportunities, including reduction in the action event durations 108 of the action events 40. In one example, not all of the actions tracked by the object trackers 12 will be defined by a known action event 40. In this example, advantageously, the analyst 54 can analyze the action entry 90 data, for example, to identify patterns in actions of the mobile assets 24 within the facility 10, including patterns which define repetitively occurring action events 40, such that these can be analyzed, quantified, baselined, and systematically monitored for improvement.
The term “comprising” and variations thereof, as used herein, are used synonymously with the term “including” and variations thereof, and both are open, non-limiting terms. Although the terms “comprising” and “including” have been used herein to describe various embodiments, the terms “consisting essentially of” and “consisting of” can be used in place of “comprising” and “including” to provide more specific embodiments and are also disclosed. As used in this disclosure and in the appended claims, the singular forms “a”, “an” and “the” include plural referents unless the context clearly dictates otherwise.
The detailed description and the drawings or figures are supportive and descriptive of the disclosure, but the scope of the disclosure is defined solely by the claims. While some of the best modes and other embodiments for carrying out the claimed disclosure have been described in detail, various alternative designs and embodiments exist for practicing the disclosure defined in the appended claims. Furthermore, the embodiments shown in the drawings or the characteristics of various embodiments mentioned in the present description are not necessarily to be understood as embodiments independent of each other. Rather, it is possible that each of the characteristics described in one of the examples of an embodiment can be combined with one or a plurality of other desired characteristics from other embodiments, resulting in other embodiments not described in words or by reference to the drawings. Accordingly, such other embodiments fall within the framework of the scope of the appended claims.
Claims
1. A method for tracking actions of mobile assets used to perform a process within a facility, the method comprising:
- positioning an object tracker at a tracker location within the facility;
- providing a plurality of mobile assets to the facility;
- wherein each mobile asset includes an identifier which is unique to the mobile asset;
- wherein the mobile asset is associated in a database with the identifier, an asset ID and an asset type;
- wherein the object tracker defines a detection zone relative to the tracker location;
- wherein the object tracker comprises: a sensor configured to collect sensor input within the detection zone; a tracker computer in communication with the sensor to receive the sensor input; at least one algorithm for: time stamping the sensor input with a detection time; processing the sensor input to identify the identifier; processing the identifier to identify the asset ID and the asset type associated with the identifier; and generating an asset entry including the asset ID, the asset type, and the detection time;
- the method further comprising: collecting, via the sensor, the sensor input wherein collecting the sensor input includes detecting the identifier when the mobile asset is located in the detection zone; receiving, via the tracker computer, the sensor input; time stamping, via the tracker computer, the sensor input with a detection time; processing, via the tracker computer, the sensor input to identify the identifier; processing, via the tracker computer, the identifier to identify the asset ID and the asset type associated with the identifier; and generating, via the tracker computer, the asset entry.
2. The method of claim 1, wherein the tracker computer is in communication with a central data broker via a network, the method further comprising:
- digitizing the asset entry using the object tracker;
- transmitting the asset entry to the central data broker via the network;
- mapping the asset entry to an asset action list using the central data broker; and
- storing the asset action list to the database;
- wherein the asset entry and the asset action list are each associated with the asset ID and the asset type associated with the identifier.
3. The method of claim 2, further comprising:
- analyzing, via an analyst in communication with the database, the asset action list;
- wherein analyzing the asset action list comprises: determining an action event defined by the asset action list; and determining an action event duration of the action event.
4. The method of claim 3, further comprising:
- generating, via the analyst, a tracking map defined by the asset action list;
- wherein the tracking map visually displays at least one action performed by the mobile asset associated via the asset ID and asset type with the asset action list.
5. The method of claim 3, further comprising:
- generating, via the analyst, a heartbeat defined by the asset action list;
- wherein the heartbeat visually displays the action event duration and the action event.
6. The method of claim 5, wherein analyzing the asset action list comprises:
- determining a plurality of action events defined by the asset action list; and
- determining a respective action event duration for each action event of the plurality of action events;
- ordering the plurality of action events in a sequence according to time of occurrence;
- generating, via the analyst, the heartbeat;
- wherein the heartbeat visually displays the respective action event duration and the action event of each of the plurality of action events in the sequence.
7. The method of claim 1, wherein the sensor comprises a camera; and
- wherein the sensor input is an image of the detection zone collected by the camera.
8. The method of claim 7, wherein the camera is an infrared sensitive camera.
9. The method of claim 7, wherein the camera is a thermographic infrared sensitive camera.
10. The method of claim 1, wherein the sensor comprises an RFID reader;
- wherein the identifier includes an RFID tag; and
- wherein the sensor input is RFID data read from the RFID tag by the RFID reader.
11. The method of claim 1, wherein the mobile asset includes a plurality of labels arranged in a pattern;
- wherein the pattern defines the identifier.
12. The method of claim 1, wherein the object tracker is affixed to a structure of the facility, such that the object tracker is fixed in position.
13. The method of claim 1, wherein the object tracker is affixed to one of the mobile assets, such that the object tracker is mobile.
14. The method of claim 1, wherein the object tracker comprises at least one algorithm for processing the sensor input to identify an interaction of the mobile asset at the detection time;
- the method further comprising: processing, via the tracker computer, the sensor input to identify the interaction; and inputting the interaction to the asset entry.
15. The method of claim 1, wherein the object tracker comprises at least one algorithm for processing the sensor input to identify a location of the mobile asset at the detection time;
- the method further comprising: processing, via the tracker computer, the sensor input to identify the location; and inputting the location to the asset entry.
16. The method of claim 1, wherein:
- the object tracker is one of a plurality of object trackers; and
- each one of the object trackers defines a detection zone relative to the tracker location;
- the method further comprising:
- positioning each of the object trackers at a respective tracker location within the facility;
- wherein the detection zone of each one of the object trackers overlaps the detection zone of at least one other of the object trackers.
17. The method of claim 1, wherein the plurality of mobile assets comprises:
- at least one component part; and
- at least one part carrier.
18. The method of claim 1, wherein the mobile asset is a part carrier configured to carry at least one component part; and
- wherein the object tracker comprises at least one algorithm for processing the sensor input to determine a quantity of component parts carried by the at least one part carrier;
- the method further comprising: processing, via the tracker computer, the sensor input to determine the quantity of component parts; and inputting the quantity to the asset entry.
19. The method of claim 1, wherein the mobile asset is a component part; and
- wherein the object tracker comprises at least one algorithm for processing the sensor input to determine an inspection result for the component part using the sensor input;
- the method further comprising: processing, via the tracker computer, the sensor input to determine the inspection result; and inputting the inspection result to the asset entry.
20. A system for tracking actions of mobile assets used to perform a process within a facility, the system comprising:
- an object tracker positioned at a tracker location within a facility;
- a plurality of mobile assets located within the facility;
- wherein each mobile asset includes an identifier which is unique to the mobile asset;
- wherein the mobile asset is associated in a database with the identifier, an asset ID and an asset type;
- wherein the object tracker defines a detection zone relative to the tracker location;
- wherein the object tracker comprises: a sensor configured to collect sensor input within the detection zone; wherein collecting the sensor input includes detecting the identifier when the mobile asset is located in the detection zone; a tracker computer in communication with the sensor to receive the sensor input; at least one algorithm for: time stamping the sensor input with a detection time; processing the sensor input to identify the identifier; processing the identifier to identify the asset ID and the asset type associated with the identifier; and generating an asset entry including the asset ID, the asset type, and the detection time.
Type: Application
Filed: Jan 24, 2019
Publication Date: Oct 15, 2020
Applicant: BEET, Inc. (Plymouth, MI)
Inventors: David Jingqiu Wang (Northville, MI), Aaron Gregory Romain (Dearborn, MI), Daniel Philip Romain (Northville, MI)
Application Number: 16/957,604