Ensuring the Accuracy of Traffic Monitoring Using Unmanned Aerial Vehicles Vision Systems



Introduction
The paper considers traffic monitoring performed by a monitoring system installed on board an unmanned aerial vehicle (UAV).

Automatic traffic monitoring includes:
• Patrolling: UAV flights over the route of the controlled road segment. The flight is implemented with the help of the automatic control system (ACS). The ACS generates control commands to the actuators moving the UAV control surfaces. In turn, the control commands depend on the difference between the desired (specified) and the current position of the UAV. The parameters of the current UAV position are determined by the UAV navigation system (NS).

• Obtaining video from the on-board surveillance system and analysing the observed situation to detect specific traffic situations (STS) that may arise. The analysis is based on detecting individual vehicles, tracking them, and building their trajectories. If an STS occurs, the corresponding video information is transmitted to the operator at the ground control and traffic management station.
STS that may be registered in the course of monitoring include accidents, in particular car collisions; traffic incidents reducing the capacity of the road section; and movement of a vehicle posing a threat to other road users.
Detection of such STS requires assessing the following in the received images:
• the vehicle position relative to the road markings;
• the vehicle positions relative to each other;
• the vehicle speed.
The effectiveness of achieving the UAV target tasks related to observation is largely determined by the UAV navigation system. At the same time, the conditions for fulfilling the target task form the requirements to the UAV NS.
Let the controlled road section have its own coordinate system whose origin is located at a selected point of the road. In the context of monitoring, it is necessary to determine the position and velocity of the vehicle in this coordinate system.
Let us consider two options providing assessment of the vehicle position relative to the road.
Option 1 is based on the UAV coordinate system. This option requires the following procedures to assess the vehicle position within the road coordinate system:
1.1. Determine the UAV position relative to a ground coordinate system. This procedure is implemented using SNS or a computer vision system (CVS).
1.2. Determine the position and speed of the vehicle in the UAV coordinate system using CVS.
The situation and the vehicle speed relative to the road coordinate system are then assessed using transition matrices.
Option 2 is based on assessing the vehicle speed and position relative to the road directly by image analysis. This option requires the following:
2.1. Highlight the road coordinates in the image;
2.2. Highlight the vehicle coordinates in the image;
2.3. Determine the vehicle position relative to the road with respect to the ground coordinate system.
The situation and the vehicle speed are then assessed using affine transformations.
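To make step 2.3 concrete, the sketch below (a hypothetical illustration; the point values are made up, not taken from the paper) estimates an affine map from image (pixel) coordinates to road-frame coordinates from reference points with known positions, then applies it to a detected vehicle:

```python
import numpy as np

def fit_affine(img_pts, road_pts):
    """Least-squares affine transform: road = [u, v, 1] @ params."""
    img = np.asarray(img_pts, dtype=float)
    road = np.asarray(road_pts, dtype=float)
    # Design matrix [u, v, 1] for each image point.
    M = np.hstack([img, np.ones((len(img), 1))])
    params, *_ = np.linalg.lstsq(M, road, rcond=None)  # shape (3, 2)
    return params

def apply_affine(params, img_pt):
    u, v = img_pt
    return np.array([u, v, 1.0]) @ params

# Three non-collinear reference points (pixels -> metres), illustrative values.
img_pts = [(100, 100), (500, 100), (100, 400)]
road_pts = [(0.0, 0.0), (20.0, 0.0), (0.0, 15.0)]
A = fit_affine(img_pts, road_pts)
vehicle_road = apply_affine(A, (300, 250))  # detected vehicle pixel position
```

With three non-collinear reference points, the affine parameters are determined exactly; with more points, the least-squares fit averages out reference-point detection noise.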
Let us define the requirements for the physical implementability of monitoring and then assess the two main parameters: noise immunity and performance of the software and algorithmic means (SAM).

Physical implementability of monitoring
Let the desired accuracy be given in the form of Δ, the maximum permissible error for assessing the vehicle position relative to the road (vehicle-D).
Considering the position measurement errors for the vehicle (σ_q, the mean square error of the vehicle position assessment) and for the road (σ_r, the mean square error of the road position assessment) as independent, let us assume that the mean square error of the vehicle-D position assessment is

σ_Σ = √(σ_q² + σ_r²). (1)

For a normal distribution of errors with zero mathematical expectation, the condition of physical implementability is

3σ_Σ ≤ Δ. (2)

It is obvious that the necessary conditions Δ > 3σ_q and Δ > 3σ_r shall be met.
Let us consider the implementation example of Option 1.
Since it was assumed that the vehicle coordinate assessment algorithms are given, i.e. σ_q is known, the accuracy requirement for UAV position estimation relative to the road follows from (1), (2):

σ_r ≤ √((Δ/3)² − σ_q²). (3)

For example, let Δ = 0.5 m and σ_q = 0.1 m. Then the UAV position error relative to the road must satisfy σ_r ≤ 0.133 m.
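The error-budget example above can be checked numerically with a short sketch, assuming only independent normally distributed errors as in the text:

```python
import math

def max_road_sigma(delta, sigma_q):
    """Largest admissible road-assessment error from 3*sqrt(sq^2 + sr^2) <= delta."""
    bound = (delta / 3.0) ** 2 - sigma_q ** 2
    if bound <= 0:
        # Necessary condition delta > 3*sigma_q is violated.
        raise ValueError("monitoring physically non-implementable")
    return math.sqrt(bound)

sigma_r_max = max_road_sigma(delta=0.5, sigma_q=0.1)  # the example from the text
```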
Let us consider the implementation example of Option 2.
Let us assume that the affine transformations are implemented correctly.
In this case σ_q = k·σ_q^im and σ_r = k·σ_r^im, where k is the scale factor and σ_q^im, σ_r^im are the errors of determining the vehicle and the road positions in the image (pixel) plane.
Failure to comply with condition (3) means that the assessment accuracy for the vehicle-D position will be below the permissible level and, consequently, traffic monitoring will be physically non-implementable.
Depending on the chosen option, the requirements for the ACS and NS composition and characteristics may vary, as well as the composition of the software and algorithmic support for the CVS.
Thus, the task of practical importance is choosing the option for assessing the vehicle position and speed relative to the road, and thereby shaping the structure of the UAV navigation system used in the road conditions monitoring system.

Literature Review
Traffic monitoring is based on detecting and tracking ground vehicles. The complexity of this process lies in the need to highlight moving objects under changeable observation conditions, as well as during UAV movement.
The problem of detecting and tracking ground moving objects is addressed in a large number of papers, for example (Hayman & Eklundh, 2001; Ren et al., 2003; Uemura et al., 2008; Borshukov et al.; Ke & Kanade, 2001; Tao et al., 2007). These papers consider various approaches to highlighting moving objects, in particular highlighting the vehicle with respect to the displacement of the background surface or the movement of the background image due to the UAV movement.
The issues of UAV NS construction are discussed by Brown & Hwang (1997), Biezad (1999), and Dyers & Serebryakov (2003). It is shown that such NS use information from various on-board sensors; the information is integrated (around the inertial navigation system) using the Kalman filter.
Much attention is paid to integrating into modern NS the information received from satellite navigation systems (SNS) (Turner, 2002).
However, in some cases the NS accuracy achieved with SNS cannot provide a solution to the target tasks with the required accuracy. In such cases, visual navigation methods can be used (Veremeyenko et al., 2009; Sasa et al., 2000; European Commission Information note, 2002; Kong et al., 2013; Jia et al.; Gageik, 2013).
The paper by Veremeyenko et al. (2009) discusses various options for reference points: points, lines, areas, and structures. Algorithms for detecting these reference points and assessing their coordinates are provided.
The papers by Kong et al. (2013) and Gageik (2013) explore options for using various information sensors in navigation, such as infrared or optical sensors.
A number of studies (Kong et al., 2014; Wiliams & Crump, 2012) demonstrate that visual navigation methods can meet the challenges of high-precision assessment of the UAV position relative to reference points. In particular, visual navigation provides control of UAVs during the landing phase.
The issues of highlighting ground mobile and stationary objects in images taken on board a UAV are discussed in many papers related to image processing and analysis (Hayman & Eklundh, 2003; Borshukov et al.; Veremeyenko et al., 2009; Bilbao et al., 2008; Senthilkumaran, 2009; Kim & Bodunkov, 2014).
These studies show that, depending on the task, it is possible to use algorithms highlighting various features of the desired objects. Examples of high-precision monitoring of ground vehicles show that modern surveillance systems can assess object positions with errors of ≤2 pixels.
Thus, for Option 2 the key issue is assessing the road position in the image.
The literature review demonstrates that the issues of aircraft NS construction are of high importance. The baseline UAV NS is the version based on an inertial navigation system integrated with SNS.

Methodology
The above-mentioned sources do not address the question of building a UAV NS for the tasks of traffic monitoring.
Let us consider the structure of the standard UAV NS.

Structure of the UAVs NS
The UAV NS obtains measurements from on-board navigation (information) sensors. It shall be noted that, in the general case, the raw measurement results do not allow determining the required UAV navigation parameters without further processing.
The UAV coordinates and attitude (pitch, roll, and yaw angles) are estimated from the measurement results by a digital computer (DC), i.e. the on-board computer(s).
The control commands for the actuators are also computed by the on-board computer(s).
In most cases, the conditions of achieving the UAV target tasks allow forming an NS that is not fully autonomous but has limited autonomy, since external navigation information can also be used. Centralized or specialized computers, together with autonomous and external navigation information, provide such an NS with high accuracy and reliability of measurements, as well as noise immunity under difficult aircraft movement conditions.
Inertial navigation systems (INS) are the basic element of an autonomous system. INS participation in navigation complexes provides the highest autonomy of measurements and the largest amount of information on the aircraft movement, and eliminates restrictions on the use of these systems and limitations in the measurement range of navigation parameters. An INS has the following advantages:
• high informativeness and versatility of use (the INS determines the totality of flight and navigation parameters necessary to control the aircraft);
• complete autonomy of operation;
• high noise immunity;
• possibility of high-rate delivery of information (up to 100 Hz and above).
Inertial navigation systems without gyrostabilized platforms are called strapdown INS (SINS). The potential advantages of SINS compared to platform INS include:
• smaller size, lower weight and power consumption;
• substantial simplification of the mechanical part and system configuration and, as a result, increased system reliability;
• no restrictions on turn angles;
• reduced initial alignment time;
• versatility of the system, since the transition to the determination of particular navigation parameters is done algorithmically;
• simplified redundancy and performance monitoring of the system and its elements.
Over time, the errors of a SINS (INS) grow due to systematic errors of accelerometers and gyroscopes, and because of the vertical channel instability in the SINS algorithms. Therefore, accurate determination of the aircraft height during landing and correction of the systematic errors of accelerometers and gyroscopes may require the following external sources of navigation information:
• a multichannel receiver of a satellite navigation system (SNS) such as GLONASS or GPS (Global Positioning System) (Turner, 2002; Interface Control Document);
• altimeters;
• magnetic, gyroscopic, and astronomical compasses.
Coordinate measurement errors via SNS are up to 10 m in the horizontal plane and up to 20 m in the vertical plane. These errors do not allow providing the required accuracy of aircraft position determination.
A more exact aircraft location, e.g. during landing, can be determined with a differential satellite system, a method which greatly improves the accuracy of the satellite system. However, within the framework of the problems solved by UAVs, the differential approach has significant limitations.
It shall be noted that a single-antenna SNS does not allow determining the angular position of the UAV in space, while the use of multiple antennas is restricted by the small size (relative to the SNS accuracy) of UAVs.
Implementation of Kalman filtering for light UAVs (e.g., up to several tens of kilograms) involves considerable difficulties, because the coefficients of the mathematical model can vary considerably in the course of the flight, making it difficult to extrapolate the filtered and estimated navigation parameters.
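As an illustration of the INS/SNS integration discussed above, below is a minimal one-dimensional Kalman filter sketch fusing a constant-velocity motion model with noisy SNS position fixes. All parameter values are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def kalman_step(x, P, z, dt, q=0.1, r=4.0):
    """One predict/correct cycle; state x = [position, velocity]."""
    F = np.array([[1.0, dt], [0.0, 1.0]])        # constant-velocity model
    Q = q * np.array([[dt**3 / 3, dt**2 / 2],
                      [dt**2 / 2, dt]])          # process noise (model drift)
    H = np.array([[1.0, 0.0]])                   # SNS measures position only
    R = np.array([[r]])                          # SNS noise variance, m^2
    # Predict with the motion model.
    x = F @ x
    P = F @ P @ F.T + Q
    # Correct with the SNS fix z.
    y = z - H @ x
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + (K @ y).ravel()
    P = (np.eye(2) - K @ H) @ P
    return x, P

np.random.seed(0)
x = np.array([0.0, 10.0])          # start: 0 m, 10 m/s
P = np.eye(2)
for k in range(1, 51):             # 50 SNS fixes at 1 Hz, true speed 10 m/s
    z = np.array([10.0 * k + np.random.normal(0, 2.0)])
    x, P = kalman_step(x, P, z, dt=1.0)
```

The difficulty noted in the text corresponds to q and r not being constant in flight, which degrades the quality of the extrapolation step.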

Ensuring the Required Accuracy for Assessing Road Coordinates
The issue to be solved implies that the position (coordinates) of the road is assumed known in advance in the geographic coordinate system (position on the map). The UAV position is also defined in this coordinate system using the standard on-board navigation system (NS). Therefore, in theory, with sufficiently precise determination of the UAV coordinates, it is possible to solve the general traffic monitoring problem.
Let us consider the possibility of using the standard on-board NS. In most cases the NS structure includes: computers (including microcontrollers); an inertial navigation system (INS) with angular velocity sensors (AVS) and accelerometers measuring acceleration in the associated coordinate system O_B X_B Y_B Z_B; a satellite navigation system (SNS); a magnetometer measuring the characteristics of magnetic fields; an inclinometer measuring the roll angle of the carrier relative to the gravitational field of the Earth; and altimeters. The main NS subsystem and core, combining (integrating) the work of the other subsystems, is the INS. The small-sized micromechanical (MEMS) sensors, e.g. AVS and accelerometers, used in the INS of many UAVs do not have sufficient accuracy to organize traffic monitoring.
Using sensors with higher precision characteristics significantly increases the cost, weight, and overall dimensions of the UAV NS.
However, even an INS with the best characteristics does not meet the accuracy requirements. Thus, the error growth ("drift") of an INS with so-called floating gyroscopes (for large aircraft) is about 1.7 km/h, which cannot fulfil condition (3).
SNS is used to correct INS errors in the standard NS.
Errors in determining the UAV coordinates using SNS are up to several meters in the horizontal plane and up to 10 meters in the vertical plane, which is also not enough to meet the challenges of traffic monitoring.
A significant increase in the accuracy of navigation determinations is provided by the differential measurement mode. The differential SNS mode (differential GPS, DGPS) allows consumers to reduce the position error to the level of meters or decimetres.
To implement the differential method, the control and correction station (CCS) and the consumer shall simultaneously use signals from at least four common satellites. The differences between the pseudo-range differences calculated by the consumer and those obtained from the CCS are used to calculate the consumer coordinates. The CCS, which is generally located within 150 km of the consumer, performs the calculations and transmits the obtained differences for the satellites together with its own known position.
The disadvantage of this approach is the need for ongoing monitoring by CCS.
Thus, when implementing Option 1, if there is no possibility to use a CCS at the controlled road segment, the standard UAV NS does not provide the required accuracy of UAV position assessment relative to the ground coordinate system.

Using Visual Navigation System along with the Standard NS
One possible approach to solving this problem is the use in the NS of additional information obtained from the CVS.
This approach is based on the use of map-matching navigation methods using images of the underlying surface.
Map-matching, or visual navigation, methods are based on comparing previously stored reference images (RI) of reference points with the current images (CI) taken by the CVS.
Detection and assessment of the coordinates for the reference points with known coordinates in the geographic coordinate system allows determining the UAV position relative to the road.
The complexity of implementing the visual (surveillance-and-comparison) methods in the UAV NS is that it is necessary to select processing and image analysis algorithms ensuring measurement not only of the UAV coordinates but also of the UAV orientation angles. Image processing and analysis algorithms shall be selected to detect reference points and determine their position relative to the UAV CVS at a given signal-to-noise ratio with a given reliability (i.e., probability of correct detection). It shall also be noted that such an NS will only work if there is a sufficiently informative navigation field.
With automatic traffic control, the CVS installed on the UAV shall be equipped with a processor for processing and analysing the received video, and shall have the related software installed. The CVS includes a video camera or other surveillance devices. Depending on the UAV type and the target task, these devices may be fixedly mounted on board or mounted on a gyrostabilized platform. For example, the UAV can have two cameras installed with different mounting angles of the line of sight or operating in different ranges of radiation (e.g., television or infrared).
Processing and analysis of the video taken on board makes it possible to bind the aircraft to the terrain. If the geographical coordinates of the found reference points are known in advance, the use of the camera allows organizing the UAV flight along the route without data from the SNS receiver. Moreover, there is a potential possibility to obtain the velocity vector and the UAV spatial orientation by processing the images. These problems can be solved by the target hardware installed on board the UAV.
These principles of ACS construction are the basic ones, forming the basis for various specialized NS.
Below it is demonstrated that, if the algorithms for assessing the vehicle position relative to the UAV are given, the assessment of the road position relative to the UAV shall be implemented using visual navigation methods.
Two problems shall be solved to determine the UAV position relative to the road:
• detecting straight road sections to determine the UAV position and course relative to the roadsides;
• detecting reference points to determine the longitudinal position of the UAV relative to the road.
Let us assume that the target on-board equipment and, consequently, its parameters are set. Then, for the successful work of the CVS in real conditions and in real-time mode, it is necessary to use software and algorithmic means (SAM) ensuring the implementation of certain requirements, in particular noise immunity and performance. It shall be considered that the interference situation may change, for example, due to precipitation, smoke, etc.
There are various image processing and analysis algorithms that can be used for visual navigation: correlation algorithms, descriptor-based algorithms, etc. These algorithms differ in noise immunity, accuracy, and performance.
Thus, one of the objectives of this paper is to choose the visual navigation algorithms to ensure effective monitoring in specific viewing conditions.
Let us assume that errors in determining the vehicle position relative to the road are associated with the CVS errors of assessing the vehicle position (σ_q) and the errors of assessing the road position (σ_r). Let us assume that the tracking algorithms and vehicle coordinate assessments are given, so the accuracy σ_q can be determined. In this case, the feasibility of monitoring will be determined by the SAM providing an assessment of the UAV position relative to the road. Since measurements are performed on board the UAV, it is necessary to relate the coordinates (in pixels) obtained in the observation plane of the surveillance system to the assessments of the UAV position and orientation.

Surveillance System Model
The description of a mathematical model of surveillance requires the coordinate systems and the model of the on-board observation equipment with its plane of observation.
Let us consider a fixed coordinate system O_N X_N Y_N Z_N associated with the controlled road section (D), whose origin is a selected point on the road (N). The position of the on-board observation equipment relative to the associated coordinate system is given by a constant vector, since the on-board observation equipment is rigidly connected to the UAV and is located on its axis at a distance L from the UAV centre.
If point P is an observed element on the Earth's surface (e.g. a reference point), its coordinates are defined by the vector [x_N, y_N, z_N]ᵀ in the fixed coordinate system D.
Based on these coordinate systems, the coordinates of the observed element on the Earth's surface in the coordinate system of the on-board equipment (vector [x_C, y_C, z_C]ᵀ) can be written according to the equation

[x_C, y_C, z_C]ᵀ = C_B^C · C_N^B · ([x_N, y_N, z_N]ᵀ − [x_B, y_B, z_B]ᵀ), (4)

where C_B^C is the constant transition matrix from the associated coordinate system to the coordinate system of the on-board equipment, and C_N^B is the transition matrix from the fixed coordinate system D to the associated coordinate system.

Here δ is the viewing angle of the on-board monitoring equipment; γ, ϑ, ψ are the roll, pitch, and yaw angles of the UAV.
Equation (4) shows the relationship between the UAV coordinates and orientation and the coordinates of the observed element of the Earth's surface in the on-board equipment coordinate system. Written in scalar form, the elements of the expansion of (4) contain products of the sines and cosines of the angles γ, ϑ, ψ, and the projection of the observed point onto the image plane is

u = u₀ + f·x_C/z_C,  v = v₀ + f·y_C/z_C,

where u, v are the pixel coordinates of the observed point; u₀, v₀ are the coordinates (in pixels) of the image centre of the on-board surveillance equipment relative to the origin of the on-board equipment coordinate system; f is the focal length of the on-board equipment.
Consider a vertically stabilized CCD matrix. Point 1 of line A is projected through the focal point C of the camera onto the CCD matrix. The resulting point M has coordinates x₁′, y₁′ in the coordinate system associated with the receiving matrix; x₀′, y₀′ are the coordinates of the CCD matrix centre.
Thus, the viewing angles μ (horizontal) and φ (vertical) required for calculating the camera orientation may be obtained using the relations

tan μ = (x′ − x₀′)/f,  tan φ = (y′ − y₀′)/f.

It is important that the dimensions of the focal length f and of the coordinates x′, y′ coincide.
Correct calculation requires the pixel coordinates to be translated from pixels into millimetres using the computed conversion factors. The pixel size for the camera is determined from the camera documentation.
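The relations above can be illustrated with a short sketch; the focal length, pixel size, and principal point below are assumed values for illustration only, not taken from the paper.

```python
import math

F_MM = 8.0            # focal length, mm (assumed)
PIXEL_MM = 0.005      # pixel size, mm (assumed, from camera documentation)
U0, V0 = 320, 240     # image centre (principal point), pixels (assumed)

def project(xc, yc, zc):
    """Camera-frame point (metres, zc along the optical axis) -> pixel coords."""
    u = U0 + (F_MM * xc / zc) / PIXEL_MM
    v = V0 + (F_MM * yc / zc) / PIXEL_MM
    return u, v

def viewing_angles(u, v):
    """Pixel coords -> viewing angles: tan(mu) = x'/f, tan(phi) = y'/f."""
    x_mm = (u - U0) * PIXEL_MM   # translate pixels into millimetres
    y_mm = (v - V0) * PIXEL_MM
    return math.atan2(x_mm, F_MM), math.atan2(y_mm, F_MM)

u, v = project(5.0, 0.0, 200.0)   # point 5 m off-axis observed at 200 m range
mu, phi = viewing_angles(u, v)
```

Note that the pixel-to-millimetre conversion happens before the angle computation, exactly so that the dimensions of f and x′, y′ coincide.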

Search Algorithms (Coordinates Detection and Assessment) for the Reference Points
Let us assume that the UAV CVS can use correlation algorithms and descriptors for the reference point search.

Let us consider the main types of correlation criterion functions.
The cross-correlation function has the form

K(di, dj) = Σ_{i,j} CI(i + di, j + dj) · RI(i, j),

where CI(i, j) is the current image (CI); RI(i, j) is the reference image (RI); i, j are the coordinates of the image cells; di, dj are the coordinates of the RI displacement relative to the CI. The normalized cross-correlation function has the form

K_N(di, dj) = E[(CI − E[CI]) · (RI − E[RI])] / (σ[CI] · σ[RI]),

where E[·] is the mathematical expectation symbol and σ[·] is the standard deviation operator.
In the first phase of UAV binding to the reference points, an image (the original image) is received from the camera oriented vertically downward, and data are received from the INS.
In the next step, the INS data are used for calculating the search area, scale, and orientation of the desired reference point. Affine transformations are applied to the reference image. The original image is used to obtain the current image corresponding to the search area.
Then the correlation function is computed consecutively for the different RI displacements di, dj relative to the CI (i_max, j_max are the dimensions of the compared image fragments). This approach provides making decisions on the mismatch of the compared images by comparing the computed current values of the correlation functions with the threshold for different n.
For the classical algorithm and other maximized correlation functions, if the dependence ΔK(n) is determined at the stage of preliminary studies, where n is the number of the cell under comparison, then a variable threshold T(n) (border) may be applied, providing greater savings of computational effort.
The condition for stopping the calculation and transitioning to a new relative position of the RI and CI is that the current partial value of the correlation function falls below the threshold T(n). Roughly, this approach reduces the computation time by a factor of 5-10.
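The RI/CI shift search can be sketched as follows. This is a simplified illustration on synthetic data: the search is exhaustive, and the early-exit thresholding described above is omitted for brevity.

```python
import numpy as np

def ncc(ci_patch, ri):
    """Normalized cross-correlation of two equally sized fragments."""
    a = ci_patch - ci_patch.mean()
    b = ri - ri.mean()
    denom = a.std() * b.std() * a.size
    return float((a * b).sum() / denom) if denom else 0.0

def search(ci, ri):
    """Exhaustive shift search of RI over CI; returns best (di, dj) and score."""
    h, w = ri.shape
    best, best_pos = -2.0, (0, 0)
    for di in range(ci.shape[0] - h + 1):
        for dj in range(ci.shape[1] - w + 1):
            k = ncc(ci[di:di + h, dj:dj + w], ri)
            if k > best:
                best, best_pos = k, (di, dj)
    return best_pos, best

rng = np.random.default_rng(0)
ci = rng.random((40, 40))          # synthetic current image
ri = ci[12:20, 25:33].copy()       # reference image embedded at a known shift
pos, score = search(ci, ri)
```

The displacement (di, dj) maximizing the correlation function is taken as the position of the desired object, as in the text; the early-exit threshold T(n) would simply abandon a (di, dj) cell once the partial sum can no longer reach it.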
When using correlation algorithms under variable illumination of the observed scene, it is necessary to have the RI adapted to these conditions.
The following options for adaptive RI formation are possible:
1. Use of a group of reference images (basic RI), from which the most suitable working reference image (WRI) is selected.
When forming the initial RI, it is necessary to determine how the RI can be stored in the computer memory and what differences between the CI and the RI still ensure the desired accuracy of the calculations.
If the accuracy requirements are not met, it is necessary to consider reducing the RI size or changing the computer system parameters.
2. Use of a synthesized WRI, for which models (in particular 3-D models) of the observed objects are built. These models are used to form an image based on the texture and illumination of the objects.
Thus, visual navigation based on correlation algorithms can solve the problem of binding to reference points with errors of ≤2 m, which is not enough to perform road conditions monitoring under Option 1.
Using visual navigation in Option 2 requires that the CVS field of view constantly contains reference points providing binding to the ground (road) coordinate system.
Such reference points are the roadsides.

Highlighting of the Road Straight Sections
To implement monitoring in accordance with Option 2, it is necessary to use primary image processing methods which allow selecting, in the images coming from the on-board camera, the edges of the controlled road section.
The lines in the image can be highlighted using an algorithm based on the Hough transform.
Functionally, the general algorithm used can be divided into two stages:
• implementation of algorithms for image pre-processing (contour detection, straight line segment highlighting);
• implementation of algorithms for analysing the information obtained.
The algorithms of the first group implement the procedures and functions used to highlight straight line segments in the image. These include algorithms for low-pass filtering, contour detection, and elementary line fragment highlighting.
The algorithms of the second group directly implement the road border highlighting procedures. They use methods for estimating line parameters, combining line segments, and recognizing the reference point borders.

Contour Line Highlighting
The Sobel method computes the differences between the sums of pixel luminances in the top and bottom rows and in the left and right columns of the image fragment covered by the filter mask. The absolute values of the resulting differences are summed, and the value of this sum is used as an element of a new monochrome contour image. Further, this image is turned into a binary one, i.e. one where all luminance values are equal to either "0" or "1".
The calculation of the threshold for the Sobel method is a problem which still has no unambiguous solution.
One approach to its definition is the so-called hysteresis method. The method consists in selecting a brightness range such that a pixel with brightness above the upper limit of the range is guaranteed to be attributed to the contour, and a pixel with brightness below the lower limit to the background. Special attention shall be paid to the case when the pixel brightness falls within the interval. In this case, the membership of the neighbouring pixels in the contour is checked in the direction perpendicular to the brightness gradient. If the current point (pixel) coincides with the edge direction, it is also marked as an edge.
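A minimal sketch of the contour highlighting described above, combining the Sobel gradient magnitude with a simplified hysteresis threshold (strong pixels are kept, weak pixels are kept only if connected to a strong one; the threshold values and the 4-neighbour connectivity check are illustrative assumptions, simpler than the gradient-direction check in the text):

```python
import numpy as np

KX = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
KY = KX.T

def sobel_magnitude(img):
    """Sum of |Gx| + |Gy| over a 3x3 Sobel mask (borders left at zero)."""
    h, w = img.shape
    out = np.zeros((h, w))
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            win = img[i - 1:i + 2, j - 1:j + 2]
            out[i, j] = abs((win * KX).sum()) + abs((win * KY).sum())
    return out

def hysteresis(mag, low, high):
    """Binarize: keep strong pixels, grow them into connected weak pixels."""
    strong = mag >= high
    weak = (mag >= low) & ~strong
    edges = strong.copy()
    changed = True
    while changed:
        changed = False
        grown = np.zeros_like(edges)
        grown[1:-1, 1:-1] = (edges[:-2, 1:-1] | edges[2:, 1:-1] |
                             edges[1:-1, :-2] | edges[1:-1, 2:])
        newly = weak & grown & ~edges
        if newly.any():
            edges |= newly
            changed = True
    return edges

img = np.zeros((20, 20))
img[:, 10:] = 1.0                       # synthetic vertical step edge
edges = hysteresis(sobel_magnitude(img), low=1.0, high=3.0)
```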

Hough Transform
Let us suppose that there is a segmented image containing segments of straight lines. The following condition applies to each straight line:

y_n = a_1 · x_n + a_0, (5)

where a_0, a_1 are the offset and the tilt of the line, and [x_n, y_n] are the coordinates of the n-th point of the line.
The equation can be rewritten as

a_1 = −(1/x_n) · a_0 + y_n/x_n.

This is an equation of a line in the new space, given by the axes a_1, a_0 and called the model space. The tilt and offset of the line in it are defined respectively by the relations 1/x_n and y_n/x_n. However, in practice, instead of (5) it is more convenient to use the linear equation in the normal form

d = x · cos α + y · sin α,

where d is the distance to the line from the origin of the coordinate system, and α is the tilt of the normal vector to the line with respect to the x-axis of the coordinate system (Figure 3). The contrast of a point in the model space is determined by the number of points belonging to the corresponding line in the data space. The more contrasting the point, the higher the accuracy of the line equation reconstruction.
The complexity of the Hough method in the considered problem (selection of line segments on the on-board computer) is the need for considerable computing power to build a straight line in the model space for each data-space point.
To improve performance, the algorithm can be implemented on a signal processor. The performance increase with such an approach is achieved due to the possibility of massive parallelization of operations and the use of high-speed memory for storing information.
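The normal-form Hough transform d = x·cos α + y·sin α can be sketched as follows: every contour pixel votes for all (α, d) lines passing through it, and accumulator peaks give the straight-line parameters. The grid resolutions here are assumed values.

```python
import numpy as np

def hough_lines(points, img_diag, n_alpha=180, d_step=1.0):
    """Accumulate votes in the model space [alpha, d]."""
    alphas = np.deg2rad(np.arange(n_alpha))      # alpha grid, 1 degree step
    n_d = int(2 * img_diag / d_step) + 1         # d grid covers [-diag, +diag]
    acc = np.zeros((n_alpha, n_d), dtype=int)
    for x, y in points:
        d = x * np.cos(alphas) + y * np.sin(alphas)
        idx = np.round((d + img_diag) / d_step).astype(int)
        acc[np.arange(n_alpha), idx] += 1        # one vote per alpha cell
    return acc, alphas

# Synthetic vertical line x = 30 sampled at 50 contour points.
pts = [(30, y) for y in range(50)]
acc, alphas = hough_lines(pts, img_diag=100)
a_i, d_i = np.unravel_index(acc.argmax(), acc.shape)
alpha_deg = np.degrees(alphas[a_i])   # normal direction of the detected line
d_est = d_i * 1.0 - 100               # recovered distance from the origin
```

The "contrast" of an accumulator cell in the text corresponds here to the vote count `acc[a_i, d_i]`: 50 collinear points concentrate all their votes in one cell.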

Evaluation of the Lines' Parameters
Only the lines most similar to the borders of the reference points (landing area) shall be selected from the entire array of straight lines detected in the image.
This is done by predicting the position and characteristics of the reference points' lines from the current readings of the UAV position and orientation, as well as the known characteristics of the camera. The result is a set of model parameters for the reference points' lines. The next stage requires that the detected line segments be used to restore the real border lines of the reference points.
The combining block begins by comparing the angles of the two segments to be combined. If the angle between the two segments is less than the maximum permissible value, it is decided that such segments may be combined. This step only makes the decision on the possibility of combining the lines; the actual combining and further analysis of the lines occur in the next step.
To avoid combining parallel lines, the distances from the starting and ending points of the second line to the straight line passing through the first line are calculated. If at least one of the distances is larger than the allowed value, the lines are not combined. The lines are combined in such a way that the resulting line is maximally long: the coordinates of the beginning and/or end of each of the reference lines are taken as the new start and end points so as to ensure this condition.
In addition, to avoid combining different lines lying on the same straight line, the gap between the line ends is checked; it shall not exceed the set threshold.
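The segment-combining rules above can be sketched as follows; the tolerance values are illustrative assumptions.

```python
import math

def angle(seg):
    (x1, y1), (x2, y2) = seg
    return math.atan2(y2 - y1, x2 - x1)

def point_line_dist(p, seg):
    """Distance from point p to the infinite line through segment seg."""
    (x1, y1), (x2, y2) = seg
    dx, dy = x2 - x1, y2 - y1
    return abs(dy * (p[0] - x1) - dx * (p[1] - y1)) / math.hypot(dx, dy)

def try_merge(s1, s2, max_angle=0.05, max_dist=2.0, max_gap=20.0):
    """Return the merged segment, or None if the combining rules reject it."""
    if abs(angle(s1) - angle(s2)) > max_angle:
        return None                      # directions differ too much
    if max(point_line_dist(s2[0], s1), point_line_dist(s2[1], s1)) > max_dist:
        return None                      # parallel but offset lines
    if min(math.dist(a, b) for a in s1 for b in s2) > max_gap:
        return None                      # gap between line ends is too large
    # Keep the two endpoints that give the longest resulting segment.
    pts = [s1[0], s1[1], s2[0], s2[1]]
    return max(((a, b) for a in pts for b in pts), key=lambda p: math.dist(*p))

merged = try_merge(((0, 0), (10, 0)), ((14, 0.5), (30, 0.5)))
```

The three early returns correspond one-to-one to the angle check, the parallel-offset check, and the gap check described in the text.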
The coordinates of the new line obtained in this way are stored in the vector component that held the coordinates of the first of the merged lines; the coordinates of the second line are deleted from the vector. Thus, taking into account the possible errors in determining the position of the vehicle (≤2 pixels), it can be assumed that condition (3) can be fulfilled.
Therefore, Option 2, based on the direct assessment of the vehicle position on the road in the image, is physically implementable.

Conclusion
The paper demonstrated that the organization of traffic monitoring requires an accurate assessment of the vehicle position relative to the coordinate system of the road.
Different options for assessing the vehicle position are considered. The paper demonstrates that the option of monitoring by measuring the vehicle position relative to the UAV is impractical, because the UAV navigation system (including DGPS and INS) does not provide the required accuracy of UAV coordinates relative to the coordinate system of the road.
An option for assessing the vehicle position directly in the coordinate system of the road is offered. The assessment accuracy depends on the accuracy of detecting the road edges or road-related reference points in the images of the on-board surveillance system. In particular, correlation algorithms, a contour line detection algorithm (search for ground reference points), and the Hough algorithm (highlighting of the straight road sections) are offered.
As a result, the studies demonstrated that the option based on a direct assessment of the vehicle position on the road in the image is physically implementable, and the resulting position assessments satisfy the accuracy requirements set forth above.
Thus, the main conclusion of this study is the need to assess the positions of the road and of the vehicle relative to the road in each image (from the UAV surveillance system), which will provide the required accuracy for the successful solution of the traffic monitoring problem.

Notes on coordinate systems and notation. The axis X_N of the fixed coordinate system is directed along the centre line of the road section D, the axis Y_N is directed upwards, and the transverse axis Z_N is directed to the right. Another basic system is the so-called associated coordinate system O_B X_B Y_B Z_B, whose origin is at the UAV mass centre (B). The axis X_B is directed along the longitudinal axis of the UAV in the forward direction, the axis Y_B lies in the UAV symmetry plane and is directed upwards, and the transverse axis Z_B is directed to the right; the UAV position is given by a vector in the fixed coordinate system D. The origin of the coordinate system of the on-board observation equipment is located at the centre of the equipment, at the point (C). In the correlation expressions, CI(i, j) is the current image (CI); RI(i, j) is the reference image (RI); E[·] is the mathematical expectation symbol; i, j are the coordinates of the image cells; i_max, j_max are the dimensions of the compared image fragment; di, dj are the coordinates of the RI displacement relative to the CI; comparison uses the normalized cross-correlation function.

Figure 1. An example of current (CI) and reference (RI) images

Figure 2 shows an example of the calculated correlation function surfaces (the correlation function values are plotted on the vertical axis) for different values of di, dj, using a variety of RI. The values di, dj corresponding to the correlation function maximum are taken as the true position of the desired object. The UAV position assessment error (at a flight height of 200 m, using a camera with a 20-megapixel matrix and a field of view of 30 angular degrees) is ≤2 m.

Figure 2. Examples of correlation functions

Figure 3. Description of a straight line in the coordinates [α, d]

Figure 4 shows a typical example of an image containing lines (right) and the display of these lines in the model space [α, d].

Figure 4. Straight lines, noises, and the corresponding values of the accumulator in the model space

In the line-parameter check, the parameters of the i-th line of the image are compared with the model parameters, where Δα and Δd are the tolerances on the parameters. Combining line segments: the lines obtained in the previous step are combined into longer lines for further work with them.

Figure 5 shows an example of the implemented algorithm for processing the original image, with the road edges highlighted by straight lines.