AUTOMATED OPERATION OF RAILROAD TRAINS

A new overlay technology for the Positive Train Control and Energy Management systems used by the railroad industry today which allows for automated train-handling responses to potential on-track hazards. Various sensors, including image-capture devices, radar, and drones, are placed on or proximate to a train. These sensors are used to interface with or override the Positive Train Control and Energy Management systems where those systems activate specific actions, such as slowing or stopping a train, but hazardous conditions on the track may dictate an alternative response.

Description
CROSS-REFERENCE TO RELATED APPLICATIONS

This application claims priority to U.S. provisional patent application No. 63/330,110, filed Apr. 12, 2022. That provisional application is incorporated herein by reference in its entirety.

FIELD OF THE DISCLOSURE

The present disclosure relates to the field of automated operation of railroad trains.

BACKGROUND OF THE DISCLOSURE

The Rail Safety Improvement Act of 2008 mandated the implementation of a federal railway safety system called Positive Train Control (PTC). According to the U.S. Department of Transportation, PTC systems “are designed to prevent train-to-train collisions, over-speed derailments, incursions into established work zones, and movement of trains through switches left in the wrong position.” Positive Train Control (PTC), U.S. DEPT. OF TRANSP., FED. R.R. ADMIN., https://railroads.dot.gov/train-control/ptc/positive-train-control-ptc (last accessed Mar. 11, 2022).

PTC complies with the Rail Safety Improvement Act of 2008 by monitoring the current and predicted operation of a train and providing warnings to the locomotive engineer that action is required to maintain safe operation. If the locomotive engineer fails to adequately respond to PTC warnings, PTC will stop the train with a penalty brake application.

A typical PTC system includes a monitor for the locomotive engineer which displays distances to next speed and stop targets with which the locomotive engineer must comply. These targets are established as PTC navigates against a detailed track database using GPS and dead reckoning techniques. The detailed track database includes precision locations of wayside signal assets, precision locations of critical track features and all civil speed restrictions. Additionally, wirelessly communicated wayside signal indications and dispatch office mandatory directives are used by PTC to create speed and stop targets.

While the PTC system was being implemented, the railroad industry continued the development and implementation of Energy Management (EM) systems as well. EM systems vary by individual rail company, but all preserve train momentum to reduce fuel consumption while maintaining on-time train performance. Other benefits include reduced in-train forces and improved train handling. These automated systems can include throttle control and application of pneumatic and regenerative brake systems based on track topology, route data, operating constraints, and consist information for individual trains.

One of the ways that PTC affects the speed of trains is by creating Restricted Speed targets by rule or in response to wayside signal indications, track data or mandatory directives. When operating within a Restricted Speed target, a human engineer is required to maintain a speed such that the engineer can stop the train within one-half the range of the engineer's vision. PTC cannot enforce this one-half range of vision requirement. This leads to inefficient and slow rail operations because engineers are naturally conservative with their estimates rather than risk a collision. Yet Restricted Speed collisions do occur, and although they happen at slower speeds, they may still have catastrophic consequences.

Currently, PTC and EM rely on static track databases which contain grade, curvature, civil speed and critical feature location (i.e., switch and signal) information. PTC and EM are not equipped to handle many of the dynamic conditions that may occur on the railroad right of way. While vigilant train engineers can serve to lessen this gap, the detection capabilities of advanced sensor packages can reduce and/or mitigate human error, particularly where operator fatigue or poor visibility are factors.

Accordingly, what is needed is a step beyond PTC to allow trains to autonomously respond to dynamic and/or changing conditions on the track.

SUMMARY

This disclosure is intended to advance the automation of trains. The purpose of the devices and methods disclosed herein is to make best-practice responses to objects on or approaching the tracks which surpass human responses in both consistency and vigilance to achieve safer outcomes. The systems and methods disclosed herein may be used together with one or more human operators onboard the train or on a train with no human operator.

The present disclosure, directed toward a new train monitoring system utilizing computer vision, provides automated response to line-of-sight hazards on the right of way and provides an evolutionary path to better-than-line-of-sight detection. The present system improves computer functionality and computer vision for purposes of rail autonomous operations with or without a human operator, the underpinnings of which are centered in the creation of Baseline Navigational Data.

BRIEF DESCRIPTION OF THE DRAWINGS

FIG. 1 is a schematic view illustrating the concept of operations according to an embodiment.

FIG. 2 is a block diagram illustrating a system architecture of an embodiment.

FIG. 3 is a schematic diagram representing an exemplary Impact Zone according to an embodiment.

FIG. 4 is a schematic diagram illustrating an exemplary Unobstructed Distance in an embodiment.

FIG. 5 is an image illustrating a car in an Impact Zone in an embodiment.

FIG. 6 is an image illustrating out of tolerance pixels corresponding to the car in the Impact Zone.

FIGS. 7A-7B are images of the car of FIG. 5 at different locations relative to the tracks.

FIG. 8 is a schematic diagram illustrating a Last Trapezoid Definition according to an embodiment.

FIG. 9 is a schematic diagram illustrating a Penultimate Trapezoid Definition according to an embodiment.

FIG. 10 is a schematic diagram illustrating Trapezoidal Data Creation according to an embodiment.

FIG. 11 is a schematic diagram illustrating a Weighted Pixel Grid Box according to an embodiment.

FIG. 12 is a schematic diagram illustrating trapezoid regions along a curved track according to an embodiment.

FIG. 13 is a flow chart illustrating a Train Handling and Hazard Response process according to an embodiment.

FIG. 14 is an image used for Baseline Navigational Data creation in an embodiment.

FIG. 15 is an image of an Active View corresponding to Baseline Navigational Data in FIG. 14.

FIG. 16 is a justified active view image corresponding to the image of FIG. 15.

FIGS. 17A-17E are schematic diagrams illustrating successive trapezoids processed to create Baseline Navigational Data in an embodiment.

FIG. 18 is a schematic diagram illustrating a view of an adjacent track with a train traveling in an opposite direction.

FIG. 19 is an exemplary trapezoid with superimposed grid boxes which provide a framework for pixel analysis across consecutive images.

DETAILED DESCRIPTION

Detailed descriptions of examples and methods of the present disclosure are provided below. The description of preferred and alternative examples, while thorough, is meant to be exemplary only, and variations, modifications, and alterations may be apparent to one of ordinary skill in the art. These examples do not limit the breadth of the disclosure.

Definitions

Active Navigation: disclosed system's processes for synchronizing Baseline Navigational Data to the active view and for declaring a hazardous condition.

Active View: the current view from the locomotive image capture device and other sensors while the disclosed system is operating.

Baseline Navigational Data: disclosed system's developed data used by Active Navigation to orient the system to areas of interest in and around the track and provide data for comparison with the active view for hazardous condition detection.

Consist: the number of cars, length and weight of a train.

Energy Management (EM): An existing fuel efficiency system installed on most line of road locomotives which adjusts throttle settings and provides braking instructions to preserve momentum to conserve fuel.

Emergency Brake Application: applies maximum braking force (20% more than full service) as quickly as possible by venting brake pipe pressure to atmosphere; once applied, it is not recoverable until the train comes to a stop.

Fail-Safe: a design principle that in the event of a failure, the system takes the safest course of action, even if it is more restrictive than necessary.

Hazardous condition: a condition in which any object or obstruction in the impact zone endangers the train's safe passage and/or people and/or animals are in danger of being struck by the train.

Impact Zone: area in and around the track where a train would impact an object, obstruction, individual or animal.

Image Subtraction: computer operation whereby pixels from one image are subtracted from another to detect changes between the images.

Justification: a process for adjusting pixel values in the active view based on differences between sampled pixels represented in Baseline Navigational Data and those same pixels in the active view.

Neural Network: a subset of machine learning which must be trained to classify objects by analyzing input data using filtered layers which identify object characteristics.

Object Tracking: a computer vision activity where object movement is assessed between captured images.

Penalty Brake: a full-service or emergency brake application applied by PTC which cannot be recovered until after the train has stopped.

Performance Data and System Analytics: disclosed system's processes and methodologies to provide automated review and system learning from operational data.

Pixel Calculus: weighted and justified pixel summations for the disclosed system's prescribed regions around the track for impact zone and warning zones which in preferred embodiments are generally trapezoidal in shape; in some embodiments, some or all of the regions are further divided into analytical units referred to herein as grid boxes.

Pixel Signature: a collection of pixels identified as a hazardous condition, derived from a pixel calculus variance representing a difference between the Baseline Navigational Data and Active Navigation, which may be used by the disclosed rail computer vision system to monitor and update the status of a hazardous condition.

Positive Train Control (PTC): a federally mandated train safety system which uses full-service and emergency brake applications to stop a train that is not in compliance with prescribed operating parameters.

PTL (Positive Train Location): a rail industry initiative to enhance PTC navigational location accuracy, ensuring PTC is accurately localized for initialization and operation in multi-track areas by utilizing DGPS (Differential Global Positioning System), a locomotive tachometer and an IMU (Inertial Measurement Unit).

Recoverable brake: train brake which does not require a stop; train brake other than a penalty or emergency brake.

Restricted Speed: railroad operating rule that requires stopping the train within one-half the range of vision; also imposes a maximum operating speed, typically 20 mph or less.

Severity index: value assigned by the disclosed system to differentiate use case scenarios and facilitate appropriate train handling responses to hazardous conditions.

Subdivision: sometimes referred to as a district, a railroad organizational designation for a line segment which typically defines tracks and operations between two or more terminals or junctions.

Train Handling and Hazard Response: the disclosed system's processes for contextualizing rail computer vision and initiating speed reduction and braking events.

Trapezoidal Ranging: inherent byproduct of the disclosed system's navigation that yields persistent distance-to-hazardous-condition values.

Unobstructed distance: measured value as determined by reflective technology which captures the distance to a potential obstruction.

Warning zone: area outside the impact zone where the present system will assess the need to warn people and/or animals of an approaching train.

Wayside Equipment Detector: devices used to assess the mechanical health of a train as rolling stock passes by the device.

Exemplary Concept of Operations

FIG. 1 is a high-level exemplary diagram of the Concept of Operations (CONOPS) for the disclosed system. In an abstract example, train 101 may be approaching hazardous condition 103. Hazardous condition 103 may be anything affecting the operational safety of train 101 which potentially could cause harm to the public or rail personnel and/or disrupt rail operations. By way of non-limiting examples, a hazardous condition may include the following when located within the impact zone: humans (including adults and children), animals (small and large, domestic or wild), vehicles, fallen trees or telephone poles, trash or garbage, farm equipment, earth construction equipment (e.g., backhoes and earth movers), track faults, rockslides, buckled track, washed out track, grade crossing warning device failure, misaligned switch/switch targets, improperly positioned derail devices, downed power lines, high water, and on-track equipment including rolling stock and locomotives. These same objects located in the warning zone are not considered hazardous conditions and only elicit warning responses from the present system.

As train 101 approaches hazardous condition 103, two distances become important. The first is braking distance 102. This is the distance required for the train to come to a complete stop when a penalty brake application is applied. PTC already makes these calculations continuously based on current speed, train consist, grade, track curvature and possibly other factors.

The second distance is response distance 104. This represents the distance within which train 101 must complete the execution of its response to hazardous condition 103. For example, if hazardous condition 103 is a downed tree across the track, then response distance 104 represents the distance within which the system must evaluate the hazardous condition and bring train 101 to a stop.

If the response distance is greater than the braking distance, there exists an evaluation distance 105, an interval within which the disclosed system is afforded the opportunity to further evaluate the hazardous condition. Under such circumstances, the response distance equals the braking distance plus the evaluation distance.
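For illustration only, this distance relationship can be expressed as a short calculation. The sketch below uses hypothetical distance values and is not the disclosed system's implementation; in the actual system, PTC derives braking distance continuously from speed, consist, grade and curvature.

```python
# Illustrative sketch of the distance relationship described above.
# All values are hypothetical.

def evaluation_distance(response_distance_ft: float, braking_distance_ft: float) -> float:
    """Return the extra distance available to evaluate a hazard.

    response_distance = braking_distance + evaluation_distance, so any
    surplus of response distance over braking distance is room the
    system can use to confirm or release a hazardous condition.
    """
    surplus = response_distance_ft - braking_distance_ft
    return max(surplus, 0.0)  # no surplus means an immediate braking response

# Example: hazard detected 6,500 ft ahead; penalty braking needs 5,200 ft.
print(evaluation_distance(6500.0, 5200.0))  # -> 1300.0 ft available for evaluation
```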

The disclosed system's dynamic detection of, and responses to, the examples of hazardous conditions discussed above, which are unaddressed by PTC or EM operational scenarios, are presented herein. Set forth herein are autonomous operation processes and methodologies which address the unique requirements of rail. Introduced is highly developed navigational data which leverages the unique characteristics of rail operations and provides increased detection and response capabilities as compared to a locomotive engineer. In some embodiments, the disclosed system will interface with PTC, EM and PTL through planned Industry-defined protocols. Table 1 below represents an evolutionary path by which this system may be implemented and incrementally improved with technology.

TABLE 1

Functionality: Hazardous condition detection and classification equivalent to locomotive engineer
Operation: Manned by locomotive engineer able to intervene

Functionality: Hazardous condition detection and classification exceeding locomotive engineer capabilities
Operation: Unmanned locomotive - single monitoring session with onboard sensors, optional operator capable of intervention

Functionality: Hazardous condition detection and classification exceeding locomotive engineer capabilities / Strategic drone support
Operation: Unmanned locomotive - dual monitoring sessions for onboard and extended vision drone sensors, optional operator capable of intervention

Functionality: Hazardous condition detection with classification exceeding locomotive engineer capabilities / Continuous drone support
Operation: Unmanned locomotive - multiple monitoring sessions for onboard, extended vision drone and inspection drone sensors, optional operator capable of intervention

The present disclosure makes no claim related to image capture device technology. A combination of different image capture device technologies may well provide the best solution. Advances in technology will only serve to enhance the disclosed system's use of Baseline Navigational Data and its novel approach to a rail computer vision system which separates object detection and object classification so that object detection performed against baseline data represents a “no hazardous condition present” truth which authorizes train operations to continue uninterrupted.

FIG. 2 is a block diagram representing an embodiment of the invention. The functions inside the dashed line in FIG. 2 may be performed on a single computer system or in a distributed manner among a plurality of computer systems. In an embodiment, a computer system includes one or more processors, accompanying memory for program and data storage, and communications interfaces for communicating with, e.g., devices 209, 210, 211, 212, data storage device(s) 201, EM and PTC systems 207, 208, and performance data and system analytics 206. Referring to the exemplary embodiment of FIG. 2 for a high-level system overview, Baseline Navigational Data is inputted and synchronized with Active Navigation. Baseline Navigational Data effectively provides a roadmap which “tells” Active Navigation where to “look” and what it should “see” if no hazardous conditions are present.

Active Navigation may use multiple locomotive capture devices and sensors to support the disclosed system's unique approach to reliable hazardous condition detection, which is accomplished exclusively against Baseline Navigational Data. The disclosed system is only required to detect an out of tolerance condition with Baseline Navigational Data before declaring a hazardous condition, without incurring the processing requirements imposed by a Neural Network or image subtraction. Once declared, the hazardous condition will be treated with the highest severity unless released by the disclosed system's specifically rail-trained Neural Network. This release would typically need to occur within the recognition and reaction time of a locomotive engineer unless the current braking distance affords the system additional evaluation time. The efficiency of declaring a hazardous condition in this manner, against Baseline Navigational Data and without a need for object classification, establishes a standard of defaulting to the safest course of action which befits the interests of public safety as it relates to rail operations.

Baseline Navigational Data is organized by precise location and contains pixel calculus and unobstructed distance values consistent with operating conditions when no hazardous conditions are present. These values contained in Baseline Navigational Data can be compared with the disclosed system's active view calculated pixel calculus values and unobstructed distance values measured during Active Navigation to determine if it is safe for a train to continue unabated. Baseline Navigational Data provides the necessary information for Active Navigation to make these determinations and clear a train for continued operations on a section of track.

The components of Baseline Navigational Data which focus the active view only on the areas for which the disclosed system provides protection are the impact zone and the warning zone. FIG. 3 is a depiction of exemplary impact zones as defined by (x,y) coordinates contained in Baseline Navigational Data which form trapezoids extending in front of the train along its path when applied to the active view. These trapezoids may be further divided by the present system by superimposing grids on each trapezoid. Baseline Navigational Data contains values calculated over the pixels in each of these grids/trapezoidal divisions when no hazardous condition is present, effectively telling Active Navigation what it should “see” with no hazardous conditions present. Active Navigation can then compare its own active view pixel calculus calculations with those stored in Baseline Navigational Data to determine whether a hazardous condition should be declared. In some embodiments, this comparison constitutes determining the difference between the pixel calculus values for a region stored in the Baseline Navigational Data and those determined from the active view (the active view pixel calculus values are preferably justified, as explained further below). A proof of concept (POC) of the disclosed system was conducted, the results of which are discussed later in this text. The aim of the POC was to demonstrate that the baseline of a railway roadbed provides an extraordinarily consistent background for hazardous condition detection when leveraged by the disclosed system's pixel calculus processes. Comparatively, conventional computer vision without the Baseline Navigational Data roadmap would need to use resources to process much more of the entire image presented to its computer vision system. Conventional computer vision systems without the aid of Baseline Navigational Data would presumably need to reference stored images or identify image characteristics by passing filters over the entire image to first discover the rail and establish an area of interest around the rail, and then continue processing the image to identify any potential hazards.
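The baseline comparison described above can be sketched as follows. This is a minimal illustration assuming hypothetical grid-box identifiers, pixel calculus values and tolerance thresholds; it is not the disclosed system's actual pixel calculus.

```python
# Minimal sketch: per-grid-box pixel calculus values from the active
# view are compared against stored baseline values; any difference
# beyond the box's tolerance flags a hazardous condition. All names
# and numbers below are assumptions for illustration.

def out_of_tolerance(baseline: dict[str, float],
                     active: dict[str, float],
                     tolerance: dict[str, float]) -> list[str]:
    """Return grid-box ids whose active-view pixel calculus deviates
    from Baseline Navigational Data by more than the allowed tolerance."""
    flagged = []
    for box_id, base_value in baseline.items():
        if abs(active[box_id] - base_value) > tolerance[box_id]:
            flagged.append(box_id)
    return flagged

baseline = {"trap1_box1": 10452.0, "trap1_box2": 9980.0}
active   = {"trap1_box1": 10490.0, "trap1_box2": 14210.0}  # box2 occluded by an object
tol      = {"trap1_box1": 250.0,   "trap1_box2": 250.0}
print(out_of_tolerance(baseline, active, tol))  # -> ['trap1_box2'] => declare hazard
```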

The second component of Baseline Navigational Data used to authorize continued operation into a section of track is unobstructed distance, which represents the minimum distance the track must be clear of any obstruction. At prescribed locations indicated in the Baseline Navigational Data, Active Navigation performs these measurements for comparison with Baseline Navigational Data values. Reflective technology (e.g., radar, laser) can provide reflected distances which Active Navigation can interpret as inside or outside the confines of the impact or warning zones by comparing the current reflected measurements with those in Baseline Navigational Data. FIG. 4 depicts the application of a reflective technology device which produces reflective distance measurements and may be further enhanced by multiple antennae capable of different beam widths and multiple receivers. In this depiction, the reflective technology device is focused on the outer limits of sight distance. In some embodiments, the width of the beam 402 generated by the reflective technology is approximately one degree. Wider or narrower beam widths are used in alternative embodiments. Active Navigation calculates and compares reflective technology values for unobstructed distance with those captured in Baseline Navigational Data, again to determine if the locomotive can continue unabated. Unobstructed distance may also leverage the present system's trapezoidal framework to interpret reflected results; this may be particularly useful if the chosen technology is imaging radar.
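A minimal sketch of the unobstructed distance comparison follows, assuming a hypothetical reflected measurement in linear track feet and an illustrative margin; the disclosed system's actual interpretation logic may differ.

```python
# Sketch of the "measured return is shorter than the baseline clear
# distance => possible obstruction" idea. The margin value is an
# assumption for illustration.

def obstruction_suspected(baseline_clear_ft: float,
                          measured_ft: float,
                          margin_ft: float = 50.0) -> bool:
    """True when the reflected return arrives short of the distance the
    Baseline Navigational Data says should be clear of obstructions."""
    return measured_ft + margin_ft < baseline_clear_ft

print(obstruction_suspected(4800.0, 3100.0))  # -> True: a return inside the clear zone
```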

The above-described leveraging of Baseline Navigational Data by Active Navigation successfully shifts a significant portion of computer vision resource requirements from the active system to an offline process in which Baseline Navigational Data is created. Offline computational data from the lab embedded in the Baseline Navigational Data file facilitates hazardous condition detection by Active Navigation.

Continuing with FIG. 2, the impact of Baseline Navigational Data continues after a hazardous condition is identified by Active Navigation. The Neural Network classifies the object; for example, an object may be classified as a person, an animal or a vehicle. However, by using Baseline Navigational Data as a screening process for the Neural Network, a >95% reduction in pixel count for image classification is achievable since only the out of tolerance pixels (those pixels corresponding to a grid in an impact zone or warning zone for which the difference between the Baseline Navigational Data and justified Active Navigation pixel calculus values exceeds a tolerance threshold) contained in the current image are processed by the Neural Network. The benefits of this can be illustrated by taking the hazardous condition example of a car in the impact zone. FIG. 5 is a representation of the image from which conventional computer vision would need to declare and classify a hazardous condition due to the presence of a car 502 near the track. Comparatively, the present system has already declared a hazardous condition from Active Navigation's out-of-tolerance pixel detection within the predefined impact zone, and those out of tolerance pixels, as represented in FIG. 6, are all that is required for the Neural Network to classify that hazardous condition as an automobile. Active Navigation treats these out of tolerance pixels as a pixel signature, provided the pixel signature remains within predetermined limits for both pixel variance and location. In this manner, a hazardous condition classified as a car need not be re-submitted to the Neural Network for object classification on each successive image (record) processed by Active Navigation. Instead, a temporary list of pixel signatures is maintained from image to image until the train has passed the location of the pixel signature, in order to recognize that object classification for the pixel signature need not be performed again. However, if a change occurs, such as a human entering the impact zone at a point near the car, either or both of the predetermined limits for pixel variance and location would be exceeded, which would trigger a Neural Network classification event. The pixel signature provides a unique reference for tracking and hazardous condition response by the present system's rail computer vision components: Active Navigation, Neural Network, Object Tracking and Train Handling and Hazard Response.
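The pixel signature reuse described above might be sketched as follows; the data structure, thresholds and values are illustrative assumptions rather than the disclosed system's implementation.

```python
# Hedged sketch: once the Neural Network has classified a hazard, the
# signature is kept from image to image, and the classifier is only
# re-invoked when the signature drifts beyond predetermined variance or
# location limits. All limits below are hypothetical.

from dataclasses import dataclass

@dataclass
class PixelSignature:
    label: str                      # e.g. "automobile", from the Neural Network
    centroid: tuple[float, float]   # image position of the signature
    variance: float                 # summary of out-of-tolerance pixel values

def needs_reclassification(prev: PixelSignature,
                           centroid: tuple[float, float],
                           variance: float,
                           max_shift: float = 40.0,
                           max_var_delta: float = 0.15) -> bool:
    shift = ((centroid[0] - prev.centroid[0]) ** 2 +
             (centroid[1] - prev.centroid[1]) ** 2) ** 0.5
    var_delta = abs(variance - prev.variance) / max(prev.variance, 1e-9)
    return shift > max_shift or var_delta > max_var_delta

car = PixelSignature("automobile", (512.0, 300.0), 1.0)
# Small drift between frames: keep the existing classification.
print(needs_reclassification(car, (516.0, 303.0), 1.05))  # -> False
# A person entering near the car changes the signature: re-classify.
print(needs_reclassification(car, (530.0, 310.0), 1.40))  # -> True
```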

Continuing with a high-level overview, consider the third component of the present system's computer vision, Object Tracking. The Object Tracking processor of the onboard computer vision system consumes pixel signatures from Active Navigation and if available, the hazard classification from the Neural Network. Object Tracking monitors hazardous conditions to determine if they are moving in or out of the impact zone or failing to move at all. Using Baseline Navigational Data, grids are applied to the trapezoids in the active view to track a pixel signature within the trapezoidal framework. Grids may vary in size to optimize Object Tracking at different distances from the locomotive. Object Tracking can then determine if hazardous conditions are responding to locomotive warning devices by referencing movement against trapezoidal grid boxes. This trapezoidal referencing is depicted in FIG. 7A-7B. Initially the SUV 702 only occupies the grid boxes on the right side of the trapezoid, however in the next sampling the SUV 702 occupies grid boxes on the right, center and left side of the trapezoid allowing the present system in infer movement of the SUV from right to left as shown in FIG. 7B. Noted too in FIG. 7B, the locomotive is now closer to the SUV 702 and the reference trapezoid is proportionately larger however its orientation to the track is consistent, allowing the present system to monitor the SUV's progression across the track. Should the SUV span trapezoidal boundaries on subsequent image captures, the present system still ascertains the same number of out of tolerance grid boxes oriented to right, center or left of the trapezoidal reference. Utilizing trapezoidal reference in this manner extends to zoomed images as well in cases where a hazardous condition is detected in trapezoids at greater distances from the locomotive and the increased pixel count from a zoomed images can be deterministic. In this way, Object Tracking feature is similar to Active Navigation and Neural Network in that it directly benefits from the trapezoidal position information of Baseline Navigational Data. Conventional computer vision typically employs pixel image subtraction methods to determine if an object has moved between two images. Image subtraction becomes more onerous when both the object and locomotive are moving. Conversely, Object Tracking is constantly receiving pixel signature updates from Active Navigation within the Baseline Navigational Data framework. Consequently, the pixel signature can be tracked within the trapezoidal grid subdivisions of the active view. In FIG. 7A-7B for example, a vehicle's progression crossing the tracks can be monitored by the movement of the out of tolerance pixels through the trapezoid which is accomplished within the trapezoidal framework provided by Baseline Navigational Data and without the computational and interpretive burdens presented by image subtraction.
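The grid-box movement inference of FIGS. 7A-7B can be illustrated with a minimal sketch; the left/center/right column labels and sample data are assumptions for illustration.

```python
# Sketch of trapezoidal-grid movement inference: compare which columns
# (left/center/right) of a trapezoid contain out-of-tolerance grid
# boxes across consecutive samples.

def infer_motion(prev_flags: set[str], curr_flags: set[str]) -> str:
    """Coarse motion inference from grid-box occupancy between samples."""
    if "left" in curr_flags and "left" not in prev_flags:
        return "moving right-to-left across the track"
    if "right" in curr_flags and "right" not in prev_flags:
        return "moving left-to-right across the track"
    if curr_flags == prev_flags and curr_flags:
        return "not moving"
    return "indeterminate"

# FIG. 7A: the SUV occupies only right-side boxes; FIG. 7B: right,
# center and left boxes, so right-to-left movement is inferred.
print(infer_motion({"right"}, {"right", "center", "left"}))
```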

Object Tracking predicts hazardous condition states for Train Handling and Hazard Response facilitating responses to hazardous conditions entering or leaving the impact zone. Train Handling and Hazard Response incorporates information from Object Tracking combined with data on the current operating characteristics of the train, including stopping distances for both penalty and emergency brake applications to meet or exceed Industry best-practices for hazardous condition response.

Train Handling and Hazard Response monitors Active Navigation, Object Tracking and the Neural Network as well as current braking distance as furnished by PTC. Based upon information gathered during the development phase, when the system is logging locomotive engineers' responses to dynamic hazardous conditions, and input from railroad operations practices personnel, use cases are developed which determine Train Handling and Hazard Response's adjudication of when to apply warning devices, throttle down or apply the train brake. Train Handling and Hazard Response interfaces with EM, PTC, PTL, onboard or wayside drones, warning devices and the dispatch center to evaluate and command responses to detected hazardous conditions.

A more in-depth examination of the processes in FIG. 2 begins with the Baseline Navigational Data creation process which is the basis for the disclosed invention's unique approach to computer vision. When deploying this system, unlike conventional computer vision systems, railroads must make a significant investment in the creation and maintenance of Baseline Navigational Data. However, as will further be demonstrated, this endeavor will optimize hazardous condition identification, classification, and object tracking as well as Train Handling and Hazard Response.

Baseline Navigational Data

Unlike highway vehicles, rail trips between rail terminals are highly repetitive, occurring on line segments referred to as districts or subdivisions. These well-defined trips lend themselves to the collection and development of Baseline Navigational Data, which can then be referenced by Active Navigation to greatly reduce background and fixed object noise resulting in more robust hazardous condition detection and classification. Once developed, Baseline Navigational Data is assembled by the disclosed system's processes into a file which is downloaded by the locomotive during or before a system initialization process prior to beginning an automated trip.

The development of Baseline Navigational Data begins in the field by traversing the candidate subdivision with a data-collection locomotive equipped with the disclosed system's hardware running its data-collection software, which interfaces with the onboard PTL system. PTL comprises a GPS receiver, a ground-based correction signal receiver, wheel tachometer input and an inertial measurement unit (IMU). PTL provides sub-foot location accuracy which is leveraged by the present system for both Baseline Navigational Data creation and Active Navigation.

Onboard the data-collection locomotive, the disclosed system's software applies precision location information to captured images and unobstructed distance values, storing them at sample rates which support the post-trip development requirements for Baseline Navigational Data. The disclosed system's onboard Baseline Navigational Data development software uses inputs from PTL to collect and annotate images for the Baseline Navigational Data development process. In one exemplary embodiment of the disclosed system's Baseline Navigational Data creation process, precise location attribution will be provided for every 11 feet of locomotive travel along the track of the subdivision. For a train traveling 60 mph (88 ft/sec), this would provide Active Navigation a reference in Baseline Navigational Data every 0.125 seconds.

In the post-trip development process, images and data from the data-collection locomotive are converted to sets of numerical values for the Baseline Navigational Data file. When used by Active Navigation, these values provide the roadmap for where to “look”, i.e. (x,y) coordinates (FIG. 3), and what to “expect”, in terms of pixel calculus and unobstructed distance values, when no hazardous conditions are present. Baseline Navigational Data will be developed for all subdivisions for which automation is desired and may be organized such that each direction of travel is its own Baseline Navigational Data file. All subdivision multiple-track configurations and sidings which could be traversed in automated operations will require Baseline Navigational Data.

The present system's Baseline Navigational Data auto-creation process begins by navigating imagery collected from the data-collection locomotive. The lab version of the present system's Neural Network is utilized in this process. The lab Neural Network is similar to Active Navigation's Neural Network absent the time constraints of the active system; as such, more neural layering is possible for higher accuracy during the data development process. The lab Neural Network identifies the rail being traversed and any hazardous condition present during the collection process.

Accurate identification of the rail being traversed, which may be complicated by multi-track locations, is essential for the definition of impact and warning zone coordinates. As such, the Baseline Navigational Data development tools are able to navigate forward and backward through the collected data to maintain route integrity throughout the creation process, allowing for accurate assignment of (x,y) coordinates for the impact and warning zones irrespective of multi-track configurations. Similarly, this process aids in the establishment of the active view alignment information for inclusion in Baseline Navigational Data, which allows Active Navigation to apply Baseline Navigational Data to the active view without errors from vibration introduced by a moving locomotive.

The final application of the lab Neural Network is the identification of hazardous conditions in the impact and warning zones which were present during the data-collection process. For instance, consider a vehicle at a highway grade crossing passing in front of the data-collection locomotive. The auto-creation software tools will scrub the pixels in the impact and warning zones of any hazardous conditions before creation of the Baseline Navigational Data file by substituting pixels from adjacent imagery frames or substituting pixels from the surroundings of the affected frame. Similarly, these same tools will identify fixed objects in the warning zones which can be flagged in the Baseline Navigational Data file for special handling so that onboard classification resources are conserved. By this methodology, the auto-creation processes in the lab orient trapezoids to defined areas of concern around the track and identify fixed objects as non-hazardous conditions, greatly reducing computer vision resource demands.

After the field-collected images have been scrubbed of any hazardous conditions, the disclosed system's lab software tools convert stored images into sets of numerical values for the Baseline Navigational Data file. When used by Active Navigation, again these values provide the roadmap for where to “look”, i.e., (x,y) coordinates for the impact and warning zones, and what to “expect”, in terms of pixel calculus, if no hazardous conditions are present. Baseline Navigational Data will need to be developed for all autonomous subdivisions and may be organized such that each direction of travel is its own Baseline Navigational Data file. All multi-track configurations and sidings which could be traversed in autonomous operations would also require Baseline Navigational Data, which again may be organized as individual files for each direction of travel.

The preferred methodology for Baseline Navigational Data auto-creation begins by processing the imagery collected by the data collection device (e.g., a capture device on a locomotive) during a data collection trip in the reverse direction. The process starts by defining the coordinates of, and calculating the pixel calculus value (or values, if multiple grid boxes are used) for, the final impact zone trapezoid and warning zone trapezoids corresponding to a location at the end of the data collection trip, and ends by defining the coordinates of, and calculating the pixel calculus value(s) for, the first impact zone trapezoid and warning zone trapezoids corresponding to a location at the start of the data collection trip.

An exemplary impact zone trapezoid 802 is depicted in FIG. 8 (also shown in FIG. 8 are warning zones 804, 806 on either side of the impact zone 802). As shown in FIG. 8, the impact zone trapezoid 802 is defined by four pairs of coordinates (x1,y1; x2,y2; x3,y3; and x4,y4). A Neural Network focused solely on identifying the rail is used in addition to predetermined measured pixel counts representing an exemplary 11 ft of (1-n) linear track distance for a chosen image capture device of prescribed resolution. By this methodology, the pixel count between, for example, (x1,y1) and (x3,y3) is known for a given image capture device and can be predetermined for (1-n) occurrences of this trapezoid in view of the reference locomotive. The pixel counts between exemplary lateral coordinates (x3,y3) and (x4,y4) follow the same methodology of predetermined measured pixel counts for (1-n) increments from the reference locomotive. Thus, creation of Baseline Navigational Data for the impact and warning zones consists of the Neural Network identifying the rail followed by application of predetermined pixel counts to define trapezoid dimensions. In FIG. 9, the iterative process continues by working backward to process an image taken at a location corresponding to the penultimate trapezoid 904 of the subdivision. At this location, the Neural Network again first identifies the rail and then the (x,y) trapezoid coordinates as before. The (x3,y3) and (x4,y4) coordinates of the newly defined penultimate trapezoid 1004 are shared points with the trapezoid 802 defined in FIG. 8 (reference numeral 1002 in FIG. 10), which effectively appends the two impact zone trapezoids 1002, 1004 together as depicted in FIG. 10. Additionally, the size of the final impact zone trapezoid 1002 as depicted in FIG. 10 is now decreased by a learned pixel count to account for its increased distance from the data capture device.
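A minimal sketch of this trapezoid definition step follows; the coordinate conventions and the predetermined pixel-count tables are hypothetical values for illustration, not measured data for any actual capture device.

```python
# Illustrative-only sketch: define an impact zone trapezoid from the
# rail position identified by the Neural Network plus predetermined
# per-increment pixel counts (the "11 ft" increments discussed above).

# Hypothetical predetermined pixel counts for increments 1..n ahead of
# the reference locomotive, nearest increment first.
LONGITUDINAL_PX = [180, 120, 85, 62, 47]     # pixel height per 11-ft increment
LATERAL_PX      = [420, 300, 220, 165, 126]  # far-edge width per increment

def trapezoid_for_increment(n: int, near_left_x: int, near_y: int) -> dict:
    """Return the four (x, y) corner pairs for the nth trapezoid ahead,
    anchored at the rail position found by the Neural Network."""
    height = LONGITUDINAL_PX[n]
    far_width = LATERAL_PX[n]
    # The near edge of increment n matches the far edge of increment n-1
    # (a hypothetical scale factor is used for the very first increment).
    near_width = LATERAL_PX[n - 1] if n > 0 else int(far_width * 1.4)
    return {
        "x1y1": (near_left_x, near_y),                                        # near left
        "x2y2": (near_left_x + near_width, near_y),                           # near right
        "x3y3": (near_left_x + (near_width - far_width) // 2, near_y - height),  # far left
        "x4y4": (near_left_x + (near_width + far_width) // 2, near_y - height),  # far right
    }

print(trapezoid_for_increment(1, 400, 900))
```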

This methodology of appending previously defined trapezoids in prior images to the current images creates frame coupling between distinct images and provides the trapezoidal reference data necessary for Active Navigation to process its active view.

This process continues by iterating backward until the first impact zone trapezoid in the subdivision is reached. FIG. 17 further illustrates the first five steps of this process in one embodiment for the impact zone trapezoids (the process for the warning zone trapezoids is similar). By way of example, if the data-collection locomotive traversed the subdivision from south to north, in Step A as shown in FIG. 17A the northern-most (final) impact zone trapezoid 1702 is defined first. This definition includes trapezoidal (x,y) coordinates and pixel calculus (which again may be for the entire trapezoid or for multiple grids in the trapezoid). In Step B in FIG. 17B, the reference locomotive is now one trapezoid length south of its Step A position and the next (penultimate) impact zone trapezoid 1704 is defined. As shown in FIG. 17C at Step C, the trapezoid 1702 from FIG. 17A is appended to the newly defined trapezoid 1704 in FIG. 17B and sized to coincide with its new position relative to the reference locomotive. New pixel calculus value(s) are determined for both the final impact zone trapezoid 1702 (because its position is now farther from the reference locomotive and its dimensions have changed) and for the penultimate impact zone trapezoid 1704 (again, multiple pixel calculus values for each grid in a trapezoid may be calculated for embodiments that employ grids). In Step D in FIG. 17D, the process continues with the definition of a third-to-last trapezoid 1706 immediately in front of the locomotive. In Step E of FIG. 17E, the final and penultimate trapezoids 1702, 1704 are appended to the newly defined third-to-last trapezoid 1706 from Step D. Again, new pixel calculus values are determined for all three trapezoids. By this methodology, each time a trapezoid is originally defined it is immediately in front of the locomotive, and it then continues to be redefined with each iterative step until it is no longer in view. A typical subdivision defined by this methodology may contain a record for every 11 feet of linear track containing the data for each trapezoid in view of the “reference” data-collection locomotive.
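The reverse-direction iteration of Steps A-E might be sketched as follows; every helper function here is a hypothetical stand-in for the disclosed system's processing, shown only to make the loop structure concrete.

```python
# Hedged sketch of the reverse-direction auto-creation loop: iterate
# from the end of the subdivision back to the start, appending the
# previously defined trapezoids to each new record and recomputing
# pixel calculus for every trapezoid still in view.

def build_baseline_records(images_reverse, define_trapezoid, resize_for_distance,
                           pixel_calculus, max_in_view=9):
    records = []
    in_view = []  # trapezoids visible from the current reference position
    for image in images_reverse:                     # final location first
        new_trap = define_trapezoid(image)           # immediately ahead of locomotive
        # Previously defined trapezoids slide one increment farther away.
        in_view = [resize_for_distance(t, i + 1) for i, t in enumerate(in_view)]
        in_view.insert(0, new_trap)
        in_view = in_view[:max_in_view]              # drop trapezoids out of view
        records.append([pixel_calculus(image, t) for t in in_view])
    records.reverse()  # store records in direction-of-travel order
    return records

# Toy usage with stand-in callables (not real imagery or pixel math).
recs = build_baseline_records(
    images_reverse=["imgN", "imgN-1", "imgN-2"],
    define_trapezoid=lambda img: {"img": img, "dist": 0},
    resize_for_distance=lambda t, i: {**t, "dist": i},
    pixel_calculus=lambda img, t: (t["dist"], img),
)
print(len(recs), len(recs[0]))  # 3 records; trapezoids in view grow toward the end
```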

Each time a trapezoid is defined, so too is its pixel calculus. The final trapezoid 802 as defined in FIG. 8 has a pixel calculus which is then redefined with a new pixel calculus calculation in FIG. 9, where it is now the second trapezoid 902 from the reference locomotive. This iterative process continues for each trapezoid until it is no longer in view of the reference locomotive.

As discussed above, in some embodiments, coordinates are determined and pixel calculus values are calculated for grid boxes superimposed on the trapezoids. Grid boxes vary in size and number depending on a trapezoid's distance from the reference locomotive, with trapezoids closer to the locomotive having larger numbers of grids than trapezoids located farther from the locomotive. FIG. 11 is a depiction of a grid box within a trapezoidal definition; a bit weighting (of 1, 2, or 3) is applied to lessen the effects of vibration and any misalignment between Baseline Navigational Data and the active view. In one embodiment of the present disclosure, grid boxes located on a side of the trapezoid will have a shape with one side that aligns with the side of the trapezoid, with the other side being vertical and aligned with the neighboring square-shaped grid box, and horizontal tops and bottoms aligned with the horizontal top or bottom of the trapezoid and/or grid boxes located above or below, as depicted in FIG. 19.
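A minimal sketch of a weighted grid-box summation follows, assuming the 1/2/3 weighting of FIG. 11 is arranged so that border pixels carry the lowest weight; the exact weight layout in the disclosed system may differ.

```python
# Sketch: weighted pixel summation over one grid box, de-emphasizing
# border pixels so small misalignments between the baseline and the
# active view matter less. The ring-based layout is an assumption.

def grid_box_pixel_calculus(pixels: list[list[int]]) -> int:
    """Sum grid-box pixel values with weight 1 on the border, 2 one
    pixel in from the border, and 3 in the interior."""
    rows, cols = len(pixels), len(pixels[0])
    total = 0
    for r in range(rows):
        for c in range(cols):
            ring = min(r, c, rows - 1 - r, cols - 1 - c)  # distance to border
            weight = 1 if ring == 0 else 2 if ring == 1 else 3
            total += weight * pixels[r][c]
    return total

box = [[10] * 6 for _ in range(6)]  # uniform 6x6 grid box
print(grid_box_pixel_calculus(box))  # border pixels count less than interior ones
```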

The Baseline Navigational Data file can also contain pixel calculus for different times of day and, in some embodiments, even different times of year and/or weather conditions (e.g., rain, snow). Rail's highly repetitive routes lend themselves to acquiring extensive data on these subdivisions, and the disclosed system's lab and development processes allow every train that traverses a subdivision to validate and potentially drive data improvements for the subdivision.

Despite the efforts listed above, pixel calculus can be affected by differences between conditions during the collection process and current operating conditions. Preferred embodiments of the disclosed system address this disparity with a justification process that utilizes justification windows. The justification process allows the active system to conserve its tolerance budget and reliably identify hazardous conditions in varying operating conditions. Baseline Navigational Data will contain definitions for one or more justification windows. In some embodiments, a single justification window is used. In other embodiments, multiple justification windows, e.g., three justification windows, are used. In some embodiments, a justification window corresponds to a 20×20 pixel window. In preferred embodiments, a pixel weighting such as that depicted in FIG. 11 is applied to the pixel values in the justification window. The weighted pixel summation for the pixels in the window, or the average weighted pixel summation for multiple justification windows, is divided by the total number of pixels in the justification window(s) and stored as a baseline justification value in the Baseline Navigational Data file for each record. When active view data is received, justification window data is computed for the justification window(s) corresponding to the coordinates in the Baseline Navigational Data, including the application of the weighting scheme (if any) reflected in the Baseline Navigational Data, and used to compute an active view justification value. The difference between the baseline and active view justification values is then calculated and applied as a per-pixel justification factor to each pixel in the active view data prior to pixel calculus being performed on the active view data.
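A minimal sketch of this justification step follows, under the reading that the baseline and active view justification values are per-pixel averages whose difference is applied to every active view pixel; the window size, weights and pixel values are illustrative assumptions.

```python
# Sketch of justification: a weighted per-pixel average over the
# justification window is compared between baseline and active view,
# and the difference shifts every active-view pixel before pixel
# calculus is performed. All values are hypothetical.

def justification_value(window: list[list[int]], weights: list[list[int]]) -> float:
    """Weighted pixel summation over a justification window divided by
    its pixel count (the stored Baseline Justification Window Value)."""
    total = sum(w * p for wrow, prow in zip(weights, window)
                for w, p in zip(wrow, prow))
    count = sum(len(row) for row in window)
    return total / count

def justify(active_pixels: list[list[int]], baseline_jv: float, active_jv: float):
    """Shift every active-view pixel by the baseline/active difference so
    lighting differences do not consume the tolerance budget."""
    factor = baseline_jv - active_jv
    return [[p + factor for p in row] for row in active_pixels]

weights      = [[1, 2, 1], [2, 3, 2], [1, 2, 1]]
baseline_win = [[100, 102, 99], [101, 100, 100], [98, 100, 101]]
active_win   = [[80, 82, 79], [81, 80, 80], [78, 80, 81]]  # darker conditions
b = justification_value(baseline_win, weights)
a = justification_value(active_win, weights)
print(round(b - a, 2))  # per-pixel factor applied to the active view
patch = justify([[80, 82], [81, 80]], b, a)
print([[round(p, 2) for p in row] for row in patch])  # pixels shifted toward baseline
```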

In one exemplary embodiment, the justification window designation(s) would be in the leading edge of the impact zone immediately in front of the reference locomotive for Baseline Navigational Data. In this manner, any hazardous condition would already have been detected before being in this proximity to an active locomotive. The justification window coordinates are preferably fixed relative to a capture source and stored in the system's database. However, it is possible to employ variable justification windows, in which case the coordinates would be stored in the corresponding record in the Baseline Navigational Data. This allows for a valid comparison between Baseline Navigational Data and the active view justification windows with no hazardous conditions present. The result of this comparison is used to adjust (justify) the pixel values in the impact and warning zones of the active view, so that the justified active view may now be compared to Baseline Navigational Data without distortion caused by environmental differences between current conditions and conditions at the time the data used to derive the Baseline Navigational Data was collected.

The Baseline Navigational Data development tools use PTL precision GPS coordinates collected in the field and railroad data to interpret track curvature. Employing a rail-trained Neural Network in the lab, the tools identify the rail and create the trapezoidal definitions, orienting them correctly to the track in accordance with its rail-trained computer vision and its understanding of curvature from railroad data or bearing interpreted from precision GPS readings. Exemplary FIG. 12 is an illustration of impact zones in a 2.5 degree curve in one embodiment.

An embodiment of the disclosed system may organize the Baseline Navigational Data file by precise location such that all (x,y) trapezoid coordinate information in the active view would represent a single record for that location within the Baseline Navigational Data file. In the FIG. 12 example using this methodology, 9 trapezoids would be contained in this Baseline Navigational Data record for the location. Other records may contain many more trapezoids as dictated by the available sight distance.

The size and scope of the development of the Baseline Navigational Data using the disclosed system's tools is significant. The simplification of hazardous condition detection by using Baseline Navigational Data and its offline computational data comes with considerable effort. For perspective, the auto-creation process applied over a typical 200-mile subdivision would generate a Baseline Navigational Data file containing over 90,000 location records and upwards of 2 million trapezoid definitions in some embodiments.
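As a back-of-envelope check of these figures, the sketch below applies the 11-foot record spacing discussed earlier; the trapezoids-per-record figure is an assumed average used only for illustration.

```python
# 200 miles at one record per 11 feet of track.
FEET_PER_MILE = 5280
RECORD_SPACING_FT = 11

records = 200 * FEET_PER_MILE // RECORD_SPACING_FT
print(records)       # -> 96000 location records, consistent with "over 90,000"
print(records * 22)  # -> ~2.1 million trapezoid definitions at an assumed
                     #    average of 22 trapezoids per record
```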

Table 2 below is an exemplary representation of a Baseline Navigational Data record. The example in Table 2 assumes that the justification window coordinates are fixed relative to a capture source, so that it is not necessary to store justification window coordinates in each record. In alternate embodiments, each record also includes (x,y) coordinates for one or more justification windows.

TABLE 2: High-level description of an exemplary Baseline Navigational Data Record

Attribute: Precision Location of Capture Device
Comment: Required for Active Navigation sync to the Baseline Navigational Data file

Attribute: Trapezoidal (X,Y) Coordinates
Comment: Defines specific areas of interest for onboard resources (impact zone, warning zone); defines/optimizes trapezoidal grid size; organized by source (normal locomotive view, zoomed locomotive view, drone view, fixed wayside device view)

Attribute: Pixel Calculus
Comment: Pixel summations for each trapezoid (impact zone and warning zones), or for each grid box in each trapezoid, for a particular source/view and for a particular time of day and/or time of year. Pixel calculus values are used for hazard detection, classification and object tracking. Supports varying tolerance values (the grid sizes within a trapezoid are consistent, but grid sizes in a trapezoid further ahead of the train are larger than the grid sizes in a trapezoid closer to the train)

Attribute: Alignment Reference for Active View
Comment: One or more pixel coordinate values corresponding to one or both rails

Attribute: Unobstructed Distance
Comment: Reflective technology measured distance

Attribute: Capture Device Controls
Comment: Zoom; pan

Attribute: Drone Flight Options
Comment: Static drone coverage areas; available drone flight patterns

Attribute: Grade Crossing Attribution
Comment: Design start time in advance of train arrival; interlocked with traffic signals; gate and light malfunction detection

Attribute: Multi-Track and Fixed Object Attribution
Comment: Locations where equipment on adjacent tracks and fixed objects affect sight distance

Attribute: Platform, Bridges, Tunnel Locations
Comment: Predetermined warning areas; predetermined train handling factor areas

Attribute: Baseline Justification Window Value
Comment: Value of the weighted pixel summation for a justification window, or the average of weighted pixel summations for multiple justification windows, as applicable, divided by the number of pixels in the justification window(s); used for justifying corresponding active view data

Detailed Description of Baseline Navigational Data Attributes

Precision Location of Capture Device—an attribute in Baseline Navigational Data which allows Active Navigation to access the correct record from the file for its hazard detection analysis. The present system requires precise location information for both the Baseline Navigational Data file and for Active Navigation as will be described in detail in the Detailed Description of the Production System section of this text.

Trapezoidal (X,Y) Coordinates—attribute in the Baseline Navigational Data which, when applied to Active Navigation, defines areas of interest around the track, i.e., the impact zone and warning zone; also defined is grid sizing for trapezoids in the impact and warning zones. Trapezoidal (X,Y) coordinates may be organized by capture source, i.e., locomotive, drone, and fixed wayside. While the impact zone remains consistent in its relationship to the track, the warning zone can vary. In instances where steep embankments make access to the track impossible, the warning zone could be very small. Conversely, at grade crossings the warning zone may expand to detect approaching vehicles and proper operation of the crossing gates and lights, an activity performed by onboard train personnel today. In one embodiment of the present disclosure a (z) coordinate may be defined. For example, the z-axis may be useful in defining the elevation of the top of the rail, which provides a useful clearance reference for Active Navigation when minor debris is in the track or when operating with snow on the ground. A Baseline Navigational Data file may contain separate (x,y) coordinate information for different input devices or input device settings, including the normal locomotive image capture device(s), a zoomed image from the locomotive capture device, and images from drones or deployed fixed wayside devices. An embodiment of the disclosed system may use the same Baseline Navigational Data for both the normal locomotive view and a drone with a similar capture device whose collection flight mimics the orientation of the locomotive capture device.

Pixel Calculus—an attribute in the Baseline Navigational Data comprising weighted pixel summations performed on individual pixels or aggregations of pixels for each trapezoid and/or trapezoidal grid box. In some embodiments of the present disclosure a grid may be superimposed over a trapezoid; that grid may vary in size, such that trapezoids closer to the locomotive have a denser grid pattern than those farther away. When a grid is applied, a separate pixel calculus will be established for each grid box, providing more localized hazardous condition detection, classification and object tracking. Tolerance levels are associated with each grid box and may vary with distance from the acquisition device. Tolerance levels may also be independently applied by Active Navigation to adjust for environmental factors.

Alignment Reference for Active View—attribute containing one or more pixel coordinates for the active rail which may be applied to the active view in order to align the active view with Baseline Navigational Data for computational purposes.

Unobstructed distance—attribute collected by reflective technology at strategic locations as prescribed by design criteria and identified by precision GPS coordinates in Baseline Navigational Data, comprising a distance measurement in linear track feet for which no obstruction should be present. Active Navigation can compare this value to current values to determine the presence of a hazardous condition. FIG. 4 depicts the use of this technology which typically would occur when a locomotive encounters longer sections of tangent track.

Capture Device Controls—attribute which may be either static or dynamic. Static controls are established during the present system's locomotive data-collection process. Strategic zoom images are captured during this process at prescribed criteria for available sight distance. In some embodiments pan and tilt controls could also be defined. However, the system's reliability and availability may benefit from image capture devices having fewer moving components, such as a second capture device capable of a wide-angle view. Dynamic zoom control, while not a navigational database attribute, is also enhanced by the present system's Baseline Navigational Data. A detected hazardous condition triggering a dynamic zoom event would utilize the inherent trapezoidal ranging from an out of tolerance trapezoid to effect proper zoom controls. As will be further explored in the hazardous condition detection and response portion of this text, trapezoids provide an immediate reference for distance between the locomotive and the hazardous condition.

The drone flight option attribute designates predefined areas for drone coverage and indicates available drone flight patterns for use by Active Navigation. This attribute would also restrict drone usage in prohibited areas. Drones may be dispatched from locomotives or wayside stations.

The grade crossing attribute contains pertinent data for each grade crossing on the automated route. This includes, but is not limited to: the configuration of installed warning devices (allowing the present system to verify proper operation of gates and lights as currently performed by the train crew), the design start time in advance of the train's arrival at the crossing, and whether the warning device is interfaced with highway traffic signals designed to move vehicles off the track for an approaching train.

The multi-track attribute designates track that may have the present system's evaluation distance reduced when equipment is on an adjacent track. Based upon the curvature of the two tracks, the navigational database can so designate these areas. Dispatch center updates and/or the present system's detection and classification processes can identify equipment on adjacent tracks, triggering the present system to operate with reduced evaluation and response distances. FIG. 18 is an example of reduced impact and warning zone evaluation and response distances resulting from equipment on an adjacent track. Similarly, the fixed object attribute identifies permanent obstructions such as bridge abutments, propane tanks, and wayside signal structures which can be flagged as non-hazardous conditions, thus conserving onboard computer vision resources.

The platform, bridges and tunnels attribute designates areas where locomotive engineers typically provide warning of the approaching train. These designations in the Baseline Navigational Data allow for an auto-generated sounding of the train horn or bell at prescribed approach distances.

Detailed Description of a System According to an Embodiment

The present disclosure provides generally for a system, methodology, processes, and apparatus for automated response to line-of-sight hazardous conditions. Active Navigation, Neural Network, Object Tracking and Train Handling and Hazard Response comprise the present system's computer vision software, processes and methodologies. According to the present disclosure, enhanced hazard-detection techniques are used in connection with existing speed controls to better predict and prevent collisions or other undesirable events by regulating the speed of the train.

Active Navigation and Synchronization with Baseline Navigational Data

Autonomous train operations begin by ensuring Baseline Navigational Data is downloaded to the locomotive before trip commencement. System initialization will be executed between the lead locomotive hosting the present system's computer vision system and the railroad back office or dispatch center. Among other checks, the initialization will confirm system health, proper communication between the present system and both PTC and EM, the automated route, and the download of the correct Baseline Navigational Data file for the intended trip (if not already stored onboard). All possible candidate Baseline Navigational Data files, including multiple mainline tracks and sidings, are downloaded from the back office or dispatch center for all subdivisions in the planned trip. Active Navigation would assemble these Baseline Navigational Data files into its active memory so that while the train is operating, any updates from PTC regarding track routing are seamless.

Once positioned at the entry point of an autonomous subdivision, the locomotive can begin the trip with Energy Management controlling the throttle and routine brake applications, while PTC controls the penalty brake and emergency brake applications required to stop a train within prescribed stopping distances. The disclosed system interacting with these two systems provides responses to hazardous conditions in and around the track, a function previously performed by the locomotive engineer.

Referencing FIG. 2, at the trip's commencement, Active Navigation establishes its location by obtaining its present position from the PTL input. Once located, Active Navigation references the Baseline Navigational Data file record which corresponds to its current location. Once underway, Active Navigation navigates using PTL and anticipates the next corresponding location record in the Baseline Navigational Data file of consecutive location records. Active Navigation then captures the next corresponding image from the Locomotive Image Capture Device for analysis. In this manner, by triggering on the next available Baseline Navigational Data record, Active Navigation is able to synchronize its active view with the Baseline Navigational Data file.
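
A schematic sketch of that synchronization loop; `ptl`, `capture_device`, and `analyze` are hypothetical interfaces standing in for the real subsystems, not APIs from the disclosure.

```python
def active_navigation_loop(ptl, baseline_records, capture_device, analyze):
    """Schematic synchronization loop: as the train reaches each consecutive
    Baseline Navigational Data location record, capture the corresponding
    active view and hand it off for comparison against that record."""
    for record in baseline_records:              # consecutive location records
        # Navigate by PTL until the train reaches the record's location.
        while ptl.current_position() < record.location:
            pass                                 # a real system would wait on a position event
        active_view = capture_device.capture()   # trigger the image capture
        analyze(active_view, record)             # synchronize active view with baseline
```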

Active Navigation can synchronize multiple active views simultaneously with Baseline Navigational Data. In addition to the normal image from the Locomotive Image Capture Device, a zoomed image from this device or a dedicated device can be synchronized with Baseline Navigational Data. Active Navigation's monitoring of the Baseline Navigational Data file allows Active Navigation to send zoom controls to the Locomotive Image Capture Device and trigger an image capture to correspond to these predefined locations in the Baseline Navigational Data file. Areas of track where a locomotive coming out of a curve gains a 2500 foot view of the intended route are good examples of statically zoomed data contained within the Baseline Navigational Data file. Additionally, Active Navigation may send dynamic zoom commands to the Locomotive Image Capture Device(s) if an out of tolerance pixel calculus is detected. The trapezoid containing the pixel signature provides Active Navigation the trapezoidal range for effecting accurate zoom controls to the Locomotive Image Capture Device(s), potentially allowing the disclosed system to make a hazardous condition detection and classification from a zoomed image.

A Remote Drone Capture Device may provide another simultaneous view which can be supported by Active Navigation. Drones can be flown with a flight pattern that mimics the rail orientation of the Locomotive Image Capture Device, allowing them to use the same Baseline Navigational Data provided there is proper compatibility between the two capture devices. In embodiments of the present disclosure where this is not the case, separate drone Baseline Navigational Data can be developed by a process similar to that previously presented. Drones are not required to implement the disclosed system. When integrated into the present system's rapid screening processes (of where to "look" and what to "see") to detect hazardous conditions against Baseline Navigational Data, however, drones have the potential to further increase safety of operations, both by providing warning to the public of an approaching train and by detecting hazardous conditions at distances which allow ample time to bring the train to a safe stop.

Remote Drone Capture Device usage may be static or dynamic. Static usage is defined as areas in the Baseline Navigational Data file where drones are always dispatched. These may include areas of limited sight distance, areas prone to rockslides or washouts, tunnels or grade crossings. Weather permitting, static drone usage would always occur during automated operation for the prescribed locations. The disclosed system can determine when to dispatch the drone by calculating the time required for the drone to arrive at the start of the designated coverage area before the train comes within its PTC-calculated stopping distance of that area.
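
That dispatch timing amounts to a simple lead-distance computation. A sketch under assumed units (feet, feet per second, seconds); all names and the example figures are hypothetical.

```python
def drone_launch_distance(train_speed_fps, ptc_stopping_distance_ft, drone_transit_s):
    """Distance from the start of the static coverage area (in track feet) at
    which the drone must be dispatched: it has to be on station before the
    train is within its PTC-calculated stopping distance of that point."""
    # Ground the train covers while the drone is in transit to the area.
    lead_distance_ft = train_speed_fps * drone_transit_s
    return ptc_stopping_distance_ft + lead_distance_ft

# e.g. a 60 ft/s train, 4000 ft stopping distance, 90 s drone transit:
#   drone_launch_distance(60.0, 4000.0, 90.0)  ->  9400.0 ft ahead of the area
```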

Dynamic usage of Remote Drone Capture Device is defined as a response to an unexpected event. This might be caused by encountering a PTC mandated Restricted Speed target. In this case, a drone would be dispatched by the present system to determine what may have caused a more restrictive signal indication and augment compliance with the Restricted Speed mandate to operate at a speed capable of stopping within one-half the range of vision. Additional examples of dynamic drone usage include train inspection after an emergency braking event or a wayside equipment detector alarm calling for train inspection.

Hazardous Condition Identification, Classification and Object Tracking

The disclosed system takes a transformative approach to computer vision, leveraging Baseline Navigational Data as a "no hazardous condition present" truth from which Active Navigation can declare a hazardous condition in a manner which conserves computer resources and improves computer vision functionality for classifying hazardous conditions in less time. A better understanding of the advantages of the present system can be taken from a comparison with conventional computer vision. While conventional computer vision systems are far from monolithic and are ever-evolving, they do contain common components. Convolutional Neural Networks are currently considered necessary for the most advanced computer vision systems. These consist of convolutional layers which identify objects and classify them. Typically, this involves sliding different 3×3 filters over all the image pixels. These filters expose image features which are processed by the convolutional layers (e.g., 24 of them), and the output can then classify the object in what is termed the "connected layer," where a voting process yields a final classification with a degree of certainty, at which point, for rail computer vision purposes, a hazardous condition could be declared. Comparatively, the disclosed system declares a hazardous condition in fewer, less complex steps, by comparing Active Navigation's active view pixel calculus and unobstructed distance values with those in Baseline Navigational Data. Both active view pixel calculus and unobstructed distance are straightforward computations that can easily be returned in far less than 10 msec. Once Active Navigation declares a hazardous condition, the Neural Network can add classification context (e.g., deer, person, car). If the Neural Network fails to identify the hazardous condition with acceptable certainty or within an acceptable time period as determined by Train Handling and Hazard Response, it will be treated in a fail-safe manner by assigning the highest severity index to the hazardous condition. Transforming computer vision in this way makes sense for rail operations in that the immediate declaration of a hazardous condition before classification is a conservative and prudent approach appropriate for the physics of stopping a large freight train.
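                                       
Composing the two fast screening checks sketched earlier gives the entire declaration step. A sketch only; dictionaries keyed by grid box identifier, and the default distance tolerance, are illustrative assumptions.

```python
def declare_hazard(baseline_calc, active_calc, tolerances,
                   baseline_unobstructed_ft=None, active_unobstructed_ft=None,
                   distance_tolerance_ft=50.0):
    """Declare a hazardous condition when any trapezoid or grid box departs
    from its baseline pixel calculus beyond tolerance, or when the measured
    unobstructed distance falls short of the recorded baseline value."""
    for box_id, base_value in baseline_calc.items():
        if abs(active_calc[box_id] - base_value) > tolerances[box_id]:
            return True   # out of tolerance pixel calculus: hazard declared
    if baseline_unobstructed_ft is not None and active_unobstructed_ft is not None:
        if (baseline_unobstructed_ft - active_unobstructed_ft) > distance_tolerance_ft:
            return True   # shortfall detected by reflective ranging
    return False          # no declaration; classification never runs
```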

The Neural Network, as has been established, only consumes the pixel signatures identified by Active Navigation. Performing classification on this limited subset of pixels has several advantages. First, Neural Networks with more layers tend to produce better results, and more layers become a possibility with the present system because less time is required for the system's software to evaluate the smaller data set. Second, because the pixel signatures are associated with a given trapezoid, a scaling factor can be applied to the classification process. The present system implicitly knows how far each trapezoid lies in advance of the train. This makes the present system's classification more precise in that it inherently knows whether it is identifying a person in the track at 200 feet or 2000 feet. The disclosed system's Neural Network will be developed and trained specifically for rail operations. Image characteristics needed to train a rail-specific Neural Network are primarily focused on human beings, highway vehicles, animals, railcars, locomotives, trees, poles, rock slides, washouts, and misaligned track. The design of the disclosed system facilitates a Neural Network which concentrates onboard resources on classifying the most common hazardous conditions sooner and more accurately than conventional computer vision systems. This novel system architecture supports this by allowing Active Navigation to declare a hazardous condition which immediately imposes a pending braking event unless the Neural Network can classify the hazardous condition within a time period comparable to a locomotive engineer's recognition and reaction time for applying the brakes manually. An example of this methodology begins with Active Navigation detecting out of tolerance pixels and declaring a hazardous condition. If the Neural Network determines this to be a deer in the impact zone, then the system would merely sound the warning device. However, failure of the Neural Network to classify the hazardous condition in the prescribed time results in treating the unclassified hazardous condition as a highest severity event. This fail-safe approach to hazardous condition classification markedly distinguishes the disclosed system from conventional computer vision, which relies on Neural Networks without the level of screening provided by highly developed Baseline Navigational Data. A further distinction of the present system's approach to hazardous condition classification is that the Neural Network need not be trained to recognize random items such as sink holes, downed radio towers or even airplanes that may present as hazardous conditions. Active Navigation will declare these one-off events as hazardous conditions based on out of tolerance pixels, making it inconsequential that the present system's Neural Network cannot classify them. This approach focuses the present system's Neural Network resources on the precise and timely classification of hazardous conditions which occur more frequently and bear interpretation, such as a vehicle on the track with its hood raised or a lowboy trailer stuck on the tracks.
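
The fail-safe timeout can be expressed as a thin wrapper around any classifier. A sketch; the classifier interface, severity scale, and thresholds are all hypothetical.

```python
import time

HIGHEST_SEVERITY = 10                                        # hypothetical top of the scale
SEVERITY_BY_LABEL = {"deer": 2, "person": 9, "vehicle": 7}   # illustrative values only

def classify_fail_safe(classify, pixel_signature, deadline_s, min_certainty=0.9):
    """If the classifier cannot return a label with acceptable certainty
    before the deadline (comparable to an engineer's recognition and reaction
    time), assign the highest severity index. `classify` is a hypothetical
    callable returning (label, certainty)."""
    start = time.monotonic()
    label, certainty = classify(pixel_signature)  # sees only the pixel signature
    if (time.monotonic() - start) > deadline_s or certainty < min_certainty:
        return "unclassified", HIGHEST_SEVERITY   # fail safe: highest severity
    return label, SEVERITY_BY_LABEL.get(label, HIGHEST_SEVERITY)
```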

Object Tracking benefits from out of tolerance pixel screening in a similar fashion as the Neural Network. A hazardous condition identified by Active Navigation can be tracked by calculating the movement of the pixel signature. Active Navigation can concurrently establish multiple pixel signatures to accommodate different out of tolerance pixel clusters. Object Tracking, Neural Network, Active Navigation and Train Handling and Hazard Response can then distinguish between different simultaneous hazardous conditions and communicate effectively, providing updates on the individually identified pixel signatures. Object Tracking can associate updates from Active Navigation with the pixel signature and compare the position of the pixel signature with that from the previous update. In this manner, Object Tracking can then make predictions based on the movement of the hazardous condition as to when it would be clear of the impact zone, much the way the locomotive engineer does today. In cases where pixel signatures are outside the impact zone, Object Tracking may predict when they would enter the impact zone. Additionally, Object Tracking may determine if a hazardous condition is responding to locomotive warning devices. Accomplishing Object Tracking against the backdrop of Baseline Navigational Data becomes straightforward in that movement within the trapezoidal reference is readily apparent. Comparatively, conventional computer vision would process consecutive image updates and subtract one from the other to determine the position change of the object in question. This is complicated further by the fact that the train is moving. As such, conventional computer vision would need to identify a fixed object to reference in multiple images to perform image subtraction for a moving train. Again, FIG. 7 is a representation of the effect Baseline Navigational Data has on Object Tracking. Baseline Navigational Data's trapezoid provides an immediate reference irrespective of how fast the train is traveling.
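
A minimal sketch of such a prediction, assuming the signature's centroid is tracked in the trapezoidal pixel frame; the linear extrapolation and all names are illustrative assumptions.

```python
def predict_time_to_clear(history, impact_zone_x_bounds):
    """Estimate seconds until a tracked pixel signature exits the impact zone
    by linear extrapolation of its centroid within the trapezoidal reference.
    `history` is a list of (t_seconds, x_centroid) samples; the bounds are the
    pixel x coordinates of the impact zone edges."""
    (t0, x0), (t1, x1) = history[-2], history[-1]
    vx = (x1 - x0) / (t1 - t0)        # lateral velocity in pixels per second
    lo, hi = impact_zone_x_bounds
    if vx > 0:
        return (hi - x1) / vx         # moving toward the right edge
    if vx < 0:
        return (lo - x1) / vx         # moving toward the left edge
    return None                       # no discernible lateral movement

# e.g. predict_time_to_clear([(0.0, 410.0), (0.5, 430.0)], (300.0, 500.0)) -> 1.75
```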

Train Handling and Hazard Response

Despite impressive computer vision systems for highway navigation, most if not all are still in the realm of driver assist systems. Rail poses even greater challenges for autonomous operations, given that freight train stopping distances routinely exceed 5000 feet and, as a practical matter, trains, because of their braking physics and available sight distances, may not be able to stop short of a potentially hazardous condition even when it is immediately detected by human operators or computer vision systems. Compounding the challenges of rail operations is the unreasonableness of immediately placing a train in an emergency braking application because a hazard, even a person, is detected in the track 2500 feet in advance of a train. This would be an untenable practice for rail operations; locomotive engineers routinely encounter this situation without invoking an emergency braking response. Another common example of these types of intense events which locomotive engineers typically must endure ("ride out") occurs at highway grade crossings. When equipped with warning devices, highway grade crossings are typically designed to provide 30 seconds of warning to a motorist before the arrival of a train. However, a train traveling 30 mph would typically not activate the warning device until it was approximately 1400 feet from the crossing. Depending on track grade and train consist, this could result in a vehicle stopped momentarily on the tracks at less than the train's stopping distance before the motorist is warned of an approaching train by the highway grade crossing warning device. Again, initiating an emergency brake application in response to this common situation would be impractical due to the dramatic nature of an emergency brake application and the real prospect of creating exposure to derailment multiple times on every trip. A better response mimics the locomotive engineer, who understands when the crossing signal activates and when to expect any vehicles to be clear of the impact zone.

Unquestionably, the vast majority of locomotive engineers do an exceptional job of responding to incursions into the path of the train, but as with any human behavior, there is a wide spectrum of stimulus/response which may depend on a given engineer's skills, experience, alertness or simply his/her perception at any given moment. What is needed is a system that responds to these situations consistent with locomotive engineers' best operating practices, with the increased vigilance and consistency of a computer vision system. A system which optimizes early detection coupled with appropriate warning and braking events will improve safety of operations for the rail network. The disclosed system's development process would include comparing the hazardous condition responses of the locomotive engineer to those of the present system. During the development phase, the present system would log its response to different hazardous condition events, while simultaneously logging the actions taken by the locomotive engineer still in full control of the train. These activities would serve to refine system use cases.

Train Handling and Hazard Response, in one embodiment of the present system, can assign a severity index (or risk profile) to a variety of hazardous conditions detected by Active Navigation. Hazardous conditions are represented by a distinct pixel signature number which allows the event to be updated by Active Navigation, Neural Network and Object Tracking. The updates provided to Train Handling and Hazard Response may alter the severity index and hence the response. At a minimum, updates occur each time Active Navigation accesses the next Baseline Navigational Data record and performs pixel calculus, which will continue to identify the hazardous condition by its pixel signature, provided the signature remains within prescribed limits. The present system's POC demonstrates how the magnitude of out of tolerance pixels provides a pixel signature in the current active view that is identifiable by Active Navigation, allowing the hazardous condition to be updated and tracked when compared to the previous active view.

The Neural Network and Object Tracking provide classification and tracking updates, respectively, to Train Handling and Hazard Response. These updates are used by Train Handling and Hazard Response to contextualize hazardous conditions. Highway grade crossings are an area where the present system's rail-trained Neural Network is significant. While incidents of a vehicle on or near the track of an approaching train will always occur, Train Handling and Hazard Response would immediately change its normal response to this situation if the Neural Network detected the hood of the vehicle raised, perhaps indicating a stall, or people outside and gathered around the vehicle, including emergency response vehicles present at the highway grade crossing.

Returning to the example of a person detected in the track 2500 feet in advance of the train: use cases for the present system are aimed at protecting human life, but none more directly than those involving a person on the track. Informal polling of experienced locomotive engineers finds that placing a train in an emergency braking application is a rare occurrence, with some engineers averaging less than one occurrence per year. As one would expect, context for these decisions is critical. Like the locomotive engineer, the disclosed system would use its available tools to provide context so that Train Handling and Hazard Response can continually assess the impending severity and apply appropriate train handling. Object Tracking may, for instance, determine the person is crossing the tracks in a perpendicular fashion and will easily be out of harm's way. The Neural Network may determine the person responded to the train warning device by waving, looking up or changing course. Conversely, the Neural Network may determine that a person is lying prone on the tracks or a small child is on the tracks, which would trigger an immediate emergency brake application from Train Handling and Hazard Response. This technology allows factors such as these to provide context, allowing the disclosed system to make consistent, best-practices determinations from the same visual cues that a locomotive engineer uses today.

The present system's architecture, which allows its rail specific Neural Network and Object Tracking to focus only on the pixel signature of a hazardous condition and not the entire active view, can provide the context required to make these fractional second determinations when a human life is at risk. Again, conservation of onboard resources through the use of Baseline Navigational Data allows more resources for contextualizing and updating severity indices for these types of situations. Whether a person near the track is waving violently as if to signal the train to stop or a young child is pulling an imaginary cord to signal the engineer to blow the horn, the present system needs context and a baseline set of responses from experienced engineers to develop clearly defined use cases and provide consistent responses beyond what is humanly possible in these public safety situations.

The severity index may be mapped to train handling responses. The state of locomotive and train technology in the freight railroad industry offers a variety of responses to the unexpected appearance of hazardous conditions along the right of way. Responses may include one or a combination of the following:

    • No response to a relatively benign object
    • Ringing the bell
    • Blowing the horn sequence(s)
    • A reduction or elimination of power for propulsion
    • Application of the locomotive brakes (also known as the independent brake)
    • Application of the dynamic brakes
    • Minimum application of the train brakes (sometimes called the automatic brake)
    • Mid-range application of the train brakes
    • Penalty application of the train brakes
    • An emergency application of the train brakes

The assignment of risk concept associates a numerical risk with each of these or other possible responses. The higher the risk profile, the more restrictive the action to be prescribed for the train.
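
One way to express that mapping, drawn from the list above; the index values and the clamping behavior are hypothetical choices for illustration, not values from the disclosure.

```python
# Illustrative severity-index-to-response table: higher index, more restrictive action.
RESPONSES = {
    0: "no response",
    1: "ring bell",
    2: "blow horn sequence",
    3: "reduce or eliminate propulsion power",
    4: "apply independent (locomotive) brake",
    5: "apply dynamic brakes",
    6: "minimum automatic (train) brake application",
    7: "mid-range automatic brake application",
    8: "penalty brake application",
    9: "emergency brake application",
}

def response_for(severity_index):
    """Clamp the index into range and return the prescribed response."""
    return RESPONSES[min(max(severity_index, 0), max(RESPONSES))]
```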

FIG. 13 is a flowchart of an exemplary high-level process, including steps 901-916, by which Train Handling and Hazard Response determines appropriate actions in response to hazardous conditions in one embodiment. Those responses include sending commands to the existing EM and PTC systems at steps 913 and 914. EM controls the locomotive throttle and routine brake applications, while PTC controls penalty brake and emergency brake applications, which are not recoverable; once applied, the train is required to stop prior to a brake system reset. When a hazardous condition is detected by Active Navigation at step 901 and is reported to Train Handling and Hazard Response at step 902, to the Neural Network at step 903 and to Object Tracking at step 904, Train Handling and Hazard Response assigns a low severity index, perhaps only sounding the horn at steps 907, 908, and the countdown begins with the prospect of assigning the highest severity index. If the Neural Network fails to classify the hazardous condition within this timeframe, which is comparable to a locomotive engineer's response time, Train Handling and Hazard Response will declare the highest severity hazardous condition and provide PTC with a stop target which includes the distance to the hazardous condition at steps 911 and 913. PTC will then determine if a penalty or emergency brake application is required to satisfy Train Handling and Hazard Response's stop command. The exception to this scenario occurs when the braking distance to stop the train is less than the distance to the hazardous condition; in that case, Train Handling and Hazard Response can allow the Neural Network additional evaluation time to classify the hazardous condition before commanding a stop target to PTC.
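
The countdown logic reduces to a few conditionals. A sketch of the FIG. 13 decision path; the return values are descriptive strings rather than real commands, and all parameter names are hypothetical.

```python
def hazard_countdown_action(classified, elapsed_s, classification_deadline_s,
                            braking_distance_ft, distance_to_hazard_ft):
    """Decide the next action for an as-yet-unclassified hazardous condition."""
    if classified:
        return "respond per severity index"            # Neural Network met the deadline
    if braking_distance_ft < distance_to_hazard_ft:
        return "allow additional classification time"  # train can already stop short
    if elapsed_s >= classification_deadline_s:
        # Fail safe: highest severity; PTC receives a stop target at the hazard.
        return "send PTC stop target at %.0f ft" % distance_to_hazard_ft
    return "countdown running; warning devices sounding"
```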

Applying FIG. 13 to the exemplary hazardous condition use case of a person walking in the track, detectable at 2500 feet in advance of the train, begins with Active Navigation declaring an unclassified hazardous condition which is received by Train Handling and Hazard Response. If the current braking distance for the train with an emergency brake application is 4000 ft., Train Handling and Hazard Response immediately sounds the horn at steps 907, 908 and dispatches a drone with the trapezoidal range and pixel signature of the hazardous condition at steps 905, 906. The Neural Network then has a prescribed time to classify the hazardous condition. For this example, the Neural Network successfully classifies the hazardous condition as a person walking in the track with their back to the train. Train Handling and Hazard Response again sounds the horn, and Object Tracking confirms the person has not made a discernible effort to leave the impact zone. In some embodiments, a notification may also be sent to the dispatch center at steps 909-910 if the severity index is high enough. Train Handling and Hazard Response sends a stop command to PTC, which places the train in an emergency brake application within the time period established for locomotive engineer best practices. Train Handling and Hazard Response continues to sound the horn, and the dispatched drone advances to the hazardous condition to deliver an audible signal or recorded message in close proximity to the person as a means of alerting them of the approaching train.

Restricted Speed operation is another operating scenario addressed by Train Handling and Hazard Response. PTC limitations can be improved upon with the present system's capability to stop the train within one-half the range of vision when a hazardous condition is detected. When operating within a Restricted Speed zone, Train Handling and Hazard Response compares the train's current stopping distance with the detection range as determined by the trapezoidal range in upcoming Baseline Navigational Data records. Train Handling and Hazard Response may then dynamically furnish PTC and/or EM the detection distance and/or a safe operating speed commensurate with the trapezoidal distances obtained from Baseline Navigational Data, so that the train may be operated at a speed allowing it to stop short of a hazardous condition, including another train operating on the same track in the opposite direction at Restricted Speed. The present system can further optimize Restricted Speed operations by obtaining positive confirmation from a dispatch office that no other equipment is in operation on its route. The current industry practice of operating so as to stop in one-half the range of vision could then become obsolete or a rarely used operating option. Additionally, drone deployment in a Restricted Speed zone projects the present system's pixel calculus screening process and Neural Network beyond the human line of sight, all but rendering current Restricted Speed practices obsolete.
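
Deriving the safe operating speed from the detection range is a search against a braking model. A sketch assuming a railroad-supplied model; the toy quadratic curve in the example is illustrative only and does not represent any real braking physics.

```python
def safe_restricted_speed(detection_range_ft, stopping_distance_for, max_speed_mph=79.0):
    """Find the highest whole-mph speed whose stopping distance, per the
    supplied braking model `stopping_distance_for(speed_mph)`, still fits
    within the detection range derived from the trapezoidal distances."""
    speed = 0.0
    while speed < max_speed_mph and stopping_distance_for(speed + 1.0) <= detection_range_ft:
        speed += 1.0
    return speed

# With a toy model of 2.5 * v^2 feet and a 2500 ft detection range:
#   safe_restricted_speed(2500.0, lambda v: 2.5 * v * v)  ->  31.0 mph
```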

The present system's Train Handling and Hazard Response may provide inputs to an external communications system onboard the train. In exemplary embodiments, such an external communication system may be capable of providing prerecorded announcements to persons near the locomotive, broadcast on appropriate radio frequencies, such as at 160 MHz. These messages may include notification of autonomous operations (e.g., informing nearby persons that the train is operating autonomously), train identification, mile post location, a contact phone number for the dispatch center, the location of hazmat data for a train, and warning messages. Warning messages may be notifications relating to the impending movement of a stopped train, an indication that the train has applied its emergency brakes, or that the train needs assistance. The communications system may be able to receive communications from the dispatch center as well, allowing two-way communications with nearby persons on the track such as railroad personnel or emergency responders.

In some embodiments the present system's Train Handling and Hazard Response may also provision controls and viewing capabilities for remote operation from a dispatch center. In the event of an emergency situation, these controls would allow remote repositioning of the train at slow speed as requested by emergency responders or railroad personnel.

Inventions in accordance with the present disclosure may also address operational requirements associated with alarms from wayside equipment detectors or occasioned by emergency brake applications. In such scenarios, Train Handling and Hazard Response may deploy an onboard drone to perform prudentially or legally mandated inspections for dragging equipment, hot wheels or bearings, and other defects, providing the results to the present system and to a dispatch center, including such information as axle counts, bearing temperatures, pictures, and other pertinent data.

Performance Data and System Analytics

Hazardous condition detection events may be subject to computer learning techniques. As seen in FIG. 2, Performance Data and System Analytics collects and flags the data for this analysis. In this way, identification, classification and tracking of hazardous condition events are subject to post-trip analysis. After each trip, Performance Data and System Analytics uploads performance data to the present system's lab. Pixel calculus and tolerance values can be tuned to address consistent depletion of the tolerance budget or hazardous condition detection events which fell outside system performance parameters. This may precipitate adjustments to Baseline Navigational Data such as new pixel calculus values and/or different tolerance values for pixel calculus. A non-limiting example of this would be maintenance improvements resulting in changes to the physical plant. These improvements could drive a change in pixel calculus values if shown to decrease the tolerance budget. Similarly, the classification of a hazardous condition which fell outside system performance parameters may result in the retraining of the Neural Network with image characteristics collected by Performance Data and System Analytics. Any differences between predicted and actual outcomes as collected by Performance Data and System Analytics may be modeled and used to further optimize both safety and performance of Active Navigation, Neural Network and Object Tracking, as well as the refinement of severity index assignments and the actions of Train Handling and Hazard Response, to improve system performance standards.

These optimization techniques pair well with the disclosed system's lab development of Baseline Navigational Data. The same suite of tools used to develop Baseline Navigational Data can be used to optimize its performance as well as qualify changes to the present system's computer vision software as driven by post-trip analysis of the data collected and flagged by Performance Data and System Analytics. The present system's lab- and data-driven approach to computer vision provides the ideal environment for regression testing all changes to both the present system's computer vision software and Baseline Navigational Data, to ensure optimization in one area does not adversely affect another. A suite of lab regression tests that strategically injects hazardous conditions and interprets system responses with simulated lab navigation assures that the next release of a Baseline Navigational Data file or the present system's computer vision software contains the desired improvements without any degradation. The disclosed system's highly automated approach to the creation and maintenance of Baseline Navigational Data is instrumental in the timely release of both Baseline Navigational Data files and the present system's computer vision software. These releases may be driven by optimization efforts, track changes resulting from both planned and emergency work, or even changes to fixed structures in the warning zone.

The present system has a shared consciousness in that all trains operating the system benefit from the continuous evolution of hazardous condition recognition and response. Post-trip data is downloaded to a central back office where automated analysis and testing, as described above, drive software enhancements and ultimately new software releases. As different trains traverse the same subdivision over time, the data collected contributes to improvements which include increased object recognition data and a more refined system response to object recognition data. As a quality control check, improved software is run against data describing actual train runs of the past to verify appropriate responses are preserved in the proposed software release before it is issued. Over time, this methodology of shared intelligence ensures a system of increasingly accurate and appropriate responses to wayside hazards, optimizing the present system's response to hazardous conditions while improving safety of operations.

Proof of Concepts

An Active Navigation-centric Proof of Concept (POC) was conducted to validate the reliability of the present system. The limited-scope POC consisted of drone-captured track video from a CMOS camera in which the critical-area (x,y)-defined trapezoids were manually established. Subject trapezoids were pixel justified, and pixel summation was calculated for a simple grid superimposed over the impact zone.

FIG. 14 was the baseline image from which Baseline Navigation Data was established.

FIG. 15 represents the image captured by a simulated active locomotive operating the disclosed system on a cloud covered day for the same section of track as in FIG. 14.

Table 3 contains exemplary data used for the application of a justification window and the resultant justification value.

TABLE 3

                                          Active Navigation    Baseline Navigational Data
Justification Window Pixel Sum            64190                85059
Justification Window Avg. Pixel Value     104.886              138.985
Active Avg. Justified                     138.886
Justification Value (per-pixel)           34.0997
Active Justified Pixel Sum                84998

The above justification process successfully adjusted Active Navigation's justification window from a pixel calculus value of 64190 to 84998, closely aligning with Baseline Navigational Data's justification window value of 85059. The calculated per-pixel offset (the Justification value of 34.0997 in the table above, which is the difference between the average pixel values for justification windows in the Baseline Navigational Data and the Active Navigation data) can then be applied to the pixels in the active view before the pixel calculus values for impact zone and warning zone trapezoids are computed and compared to the pixel calculus values for the corresponding trapezoids in the Baseline Navigational Data. Applying justification to active image FIG. 15 provides the resultant image FIG. 16, which is now justified and suitable for hazardous condition detection against Baseline Navigational Data.
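
The Table 3 figures can be reproduced directly. In this sketch, the window size is inferred from the ratio of pixel sum to average, and the whole-pixel rounding of the offset is an inference from the tabulated justified sum of 84998 rather than a rule stated in the disclosure.

```python
# Values reproduced from Table 3.
baseline_sum, active_sum = 85059, 64190
n_pixels = round(baseline_sum / 138.985)                  # window size: 612 pixels

justification_value = (baseline_sum - active_sum) / n_pixels
print(round(justification_value, 4))                      # -> 34.0997 per pixel

# Applying the offset as a whole-pixel increment reproduces the justified sum:
print(active_sum + int(justification_value) * n_pixels)   # -> 84998 (vs. 85059 baseline)
```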

FIG. 15 included the introduction of a hazardous condition at 1800 ft. in advance of the locomotive: a white pickup truck. The justified version of this hazardous condition in FIG. 16 is now compared to Baseline Navigational Data in Table 4 below; also included as a reference are values for the "no truck present" condition. At 1800 ft., primarily two grid boxes contain the hazardous condition as represented by the white truck. Functionally, the disclosed system would need to interpret the no-truck values (4265, 12094) as within tolerance of the Baseline Navigational Data values (5209, 12236). Conversely, the values in Table 4 calculated by the disclosed system with the truck present (7503, 6647) would need to fall outside tolerance as compared to Baseline Navigational Data (5209, 12236) and be declared a hazardous condition.

TABLE 4

                                      Grid Box 1        Grid Box 2
                                      Pixel Calculus    Pixel Calculus
Baseline Navigational Data            5209              12236
Justified Active View (No Truck)      4265              12094
Justified Active View (Truck)         7503              6647
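
Running the Table 4 values through the tolerance comparison demonstrates the required behavior. The tolerance of 1500 is a hypothetical value chosen so the no-truck view passes and the truck view is declared; the disclosure does not specify the tolerance used.

```python
# Values reproduced from Table 4; tolerances are hypothetical.
baseline  = {"grid_box_1": 5209, "grid_box_2": 12236}
no_truck  = {"grid_box_1": 4265, "grid_box_2": 12094}
truck     = {"grid_box_1": 7503, "grid_box_2": 6647}
tolerance = {"grid_box_1": 1500, "grid_box_2": 1500}

def hazardous(active):
    return any(abs(active[k] - baseline[k]) > tolerance[k] for k in baseline)

print(hazardous(no_truck))  # False: differences of 944 and 142 are within tolerance
print(hazardous(truck))     # True: differences of 2294 and 5589 exceed tolerance
```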

While the results of the POC are encouraging, it becomes even more apparent that current CMOS camera technology is less than optimal for this application, particularly when one considers factors such as fog, snow, or even darkness. While these may be limiting factors for the locomotive engineer as well, it is apparent that more advanced technologies can bring a higher level of safety. Radar, thermal, infrared, LiDAR, or a newer advanced technology may provide the present system optimal imaging. Common to any chosen technology, however, are the benefits of Baseline Navigational Data, which provides an unconventional approach to hazardous condition detection and greatly reduces the burden on a Neural Network for object classification, which within the framework of the disclosed system serves only to prevent an unnecessary train stoppage.

Prior Art

Some prior art in the collision detection and avoidance space targets driver assist implementations. This prior art is significantly different from the methods and devices described herein because those approaches ignore the conservation of onboard computer resources required for full autonomous operations. None of that prior art demonstrates any appreciation for the existing PTC, PTL or EM systems in which the rail industry is heavily invested. The rail industry fully anticipates that any autonomous technology appliance would be integrated into these systems rather than introduce a stand-alone or replacement system.

One prior art patent in the rail vehicle collision detection and avoidance space is U.S. Pat. No. 10,654,499B2, assigned at issuance to Rail Vision Ltd. (the "Rail Vision patent"). While the claims in the Rail Vision patent use some similar terminology to describe a driver-alerter system, it cannot reasonably be extended to a platform for autonomous operations. As such, the patent makes no mention of the PTC, EM or PTL interfaces which are critical to support autonomous operations. Moreover, claim 1 of the Rail Vision patent discloses "comparing pre stored images of a section of the rails in front of the train with frames obtained during the travel of the train in order to verify changes in the rail and in the rail's close vicinity; and detecting obstacles based on this comparison." These image subtraction strategies are in sharp contrast to the current disclosure, which does not directly compare current images and historical images to each other but rather calculates current image pixel calculus values and compares them to corresponding pixel calculus values in the Baseline Navigational Data.

Moreover, the Rail Vision patent's zone of interest (16.5) differs markedly from the disclosed system's impact zone and warning zone in both its acquisition and utility. The zone of interest is discovered in each image using IR technology and comparing temperature differences between the rail and its surroundings. Conversely, the disclosed system uses data embedded in the Baseline Navigational Data file to provide (x,y) coordinates to Active Navigation to define both the impact zone and the warning zone. These off-line derived trapezoidal references conserve onboard resources and provide the utility of trapezoidal ranging; comparatively, the prior art employs triangulation methods (Section 17.5) to establish distance to an "object."

CONCLUSION

The infrastructure associated with various rail routes, rail crossings, connecting tracks, draw bridges and so forth with multiple routes can be complex. There is added complexity when new rail is being installed, because at times replacement strands of steel rail are positioned for installation within, or just outside, the two installed rails or "gauge of the track." Whether locomotive or drone image collection devices are in use, focusing system resources only on areas of interest through data development differentiates the present system's effectiveness from conventional approaches. As one non-limiting example, it should be understood that the concept of focusing system resources only on areas of interest may be used in alternative embodiments that may not use all of the concepts and techniques disclosed herein. For example, the technique of inputting only a pixel signature (rather than an entire image from an image collection device) to a classification scheme may be used regardless of whether the technique for identifying the pixel signature is performed using one of the techniques disclosed herein or an alternate technique.

Several embodiments of the present disclosure have been described. While this specification contains many specific implementation details, these should not be construed as limitations on the scope of any disclosures or of what may be claimed, but rather as descriptions of features specific to embodiments of the present disclosure.

Certain features that are described in this specification in the context of separate embodiments can also be implemented in combination in a single embodiment. Conversely, various features that are described in the context of a single embodiment can also be implemented in multiple embodiments separately or in any suitable sub-combination. Moreover, although features may be described above as acting in certain combinations and even initially claimed as such, one or more features from a claimed combination can in some cases be excised from the combination, and the claimed combination may be directed to a sub-combination or variation of a sub-combination.

Similarly, while operations are depicted in the drawings in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order, or that all illustrated operations be performed, to achieve desirable results.

Moreover, the separation of various system components in the embodiments described above should not be understood as requiring such separation in all embodiments.

Thus, particular embodiments of the subject matter have been described. Other embodiments are within the scope of the following claims. In some cases, the actions recited in the claims can be performed in a different order and still achieve desirable results. In addition, the processes depicted in the accompanying figures do not necessarily require the particular order shown, or sequential order, to achieve desirable results. In certain implementations, multitasking and parallel processing may be advantageous. Nevertheless, it will be understood that various modifications may be made without departing from the spirit and scope of the claimed disclosure.

Claims

1. A method for autonomously detecting onboard a train a potential hazard condition on a railroad track, the method comprising:

obtaining a position of the train on the track;
accessing baseline navigational data for each of a plurality of regions of track in advance of the position of the train, the baseline navigational data for each of the plurality of regions including coordinates defining the region and at least one historical pixel calculus value calculated based on pixel values of historical image data corresponding to the region, the historical pixel calculus values in the baseline navigational data having been calculated prior to a current trip of the train;
accessing a current image of the plurality of regions of track in advance of the position of the train;
justifying the current image;
calculating at least one current pixel calculus value for each of the plurality of regions based on pixel values of the justified current image corresponding to the region;
performing, for each of the plurality of regions, a comparison of a tolerance value to a difference between the at least one current pixel calculus value and the at least one historical pixel calculus value for the region; and
determining a potential hazard condition based on at least one of the comparisons.

2. The method of claim 1, wherein the historical pixel calculus values are calculated by performing a weighted summation of the pixel values of the historical image data corresponding to the region, and wherein the current pixel calculus values are calculated by performing a weighted summation of pixel values of the justified current image corresponding to the region.

3. The method of claim 2,

wherein the baseline navigational data for at least one of the regions further includes definitions for a plurality of grids that divide the region and historical pixel calculus values for each of the plurality of grids based on pixel values of a portion of historical image data corresponding to a respective grid; and
wherein calculating the at least one current pixel calculus value for the at least one of the regions comprises calculating current pixel calculus values for each of the plurality of grids based on pixel values of a portion of the justified current image data corresponding to a respective grid.

4. The method of claim 2, wherein justifying the current image comprises:

accessing, for at least one justification window, a baseline justification value based on a summation of the pixel values of historical image data corresponding to the at least one justification window;
calculating, for the at least one justification window, an active view justification value based on a summation of the pixel values of the current image data corresponding to the at least one justification window;
applying a justification factor based on a difference between the baseline justification value and the active view justification value to pixel values in at least portions of the current image corresponding to the plurality of regions.

5. The method of claim 4, wherein the at least one justification window comprises a plurality of justification windows.

6. The method of claim 4, wherein the justification factor is a per-pixel justification factor that is further based on the number of pixels in the at least one justification window.

7. The method of claim 2, further comprising the step of:

transferring the justified image data for any region corresponding to a detected hazard to a classification process for classifying an object in the region corresponding to the detected hazard, the classification process being configured to process only regions of the justified current image corresponding to a potential hazard condition.

8. The method of claim 7, wherein the classification process is optimized to classify human beings, highway vehicles, animals, railcars, locomotives, trees, poles, rock slides, washouts, and misaligned track.

9. The method of claim 7, further comprising:

in response to the classification process being unable to classify the potential hazard condition within a predetermined time period or with a predetermined certainty, assigning a highest severity index to the hazardous condition.

10. The method of claim 2, further comprising the step of tracking movement of an object corresponding to a potential hazard condition.

11. The method of claim 2, further comprising:

accessing, in the baseline navigation data, a baseline unobstructed distance corresponding to the position of the train, the baseline unobstructed distance representing a distance to a potential obstruction as measured by reflective technology prior to a current trip of the train; and
accessing, from reflective technology mounted on the train, a current unobstructed distance representing a distance to a potential obstruction as measured by reflective technology;
wherein the step of determining a potential hazard is further based on a comparison of the baseline unobstructed distance and the current unobstructed distance.

12. The method of claim 2, further comprising the steps of:

determining that the train is in a multi-track location;
receiving an indication that another rail vehicle is present on a nearby track; and
reducing the number of the plurality of regions of track in advance of the position of the train for which baseline navigation data is accessed to compensate for a reduction in visibility resulting from the presence of the other rail vehicle on the nearby track.

13. The method of claim 2, wherein the track in advance of the position of the train is divided into a plurality of sections, and the plurality of regions includes an impact zone and warning zones on each side of the impact zone for each section of track.

14. The method of claim 13, further comprising the step of issuing a warning when a hazardous condition is detected in a warning zone.

15. The method of claim 13, further comprising the step of issuing a warning when a hazardous condition is detected in an impact zone.

16. The method of claim 13, further comprising the step of issuing an indication that the train's brakes should be activated in response to a detection of a hazardous condition that corresponds to a condition requiring brake activation in an impact zone at a distance which is substantially equal to or less than a distance in which it is possible to stop the train.

17. The method of claim 2, wherein the current image is obtained from a sensor mounted on the train.

18. The method of claim 2, wherein the current image is obtained from a sensor that is not mounted on the train, the sensor being selected from the group consisting of a sensor mounted on a drone and a sensor mounted in a fixed location on a track wayside.

19. The method of claim 2, wherein the sensor is an image collection device configured to image light in the visible spectrum.

20. A method for developing baseline navigation data comprising:

determining that a rail vehicle has traveled a fixed distance along a length of track;
in response to the determination, capturing an image from a camera mounted to the rail vehicle and determining an unobstructed distance ahead of the rail vehicle using a reflective technology sensor;
detecting an unobstructed distance in the absence of any hazardous condition using reflective technology at each location corresponding to an image;
defining, for each image, coordinates for a plurality of regions of track in advance of the position of the rail vehicle;
calculating, for each region in the image, a pixel calculus value;
calculating, for at least one justification window in the image, a pixel calculus value;
identifying at least one pixel for at least one rail in the image;
storing, for each image, a record comprising the pixel calculus value for each of the plurality of regions in the image, the coordinates of each of the regions in the image, the pixel calculus value for the at least one justification window in the image, alignment data comprising coordinates for the at least one pixel of the at least one rail, and the unobstructed distance corresponding to the location of the image.
Patent History
Publication number: 20230391384
Type: Application
Filed: Apr 11, 2023
Publication Date: Dec 7, 2023
Inventors: Eric August HULLEMEYER (Sharpsburg, GA), Charles Edward TILLEY, JR. (Fort Worth, TX)
Application Number: 18/298,961
Classifications
International Classification: B61L 23/04 (20060101); B61K 9/08 (20060101); B61L 15/00 (20060101);