A Modular Approach to Real-time Sensorial Fusion Systems

F. Gil-Castiñeira, P.S. Rodríguez-Hernández, F.J. González-Castaño, E. Costa-Montenegro, R. Asorey-Cacheda, J.M. Pousada Carballo

Departamento de Ingeniería Telemática, Universidad de Vigo
ETSI Telecomunicación, Campus, 36200 Vigo, Spain
{xil,pedro,javier}@det.uvigo.es

Abstract. In this paper, we present a modular architecture for embedded sensorial-fusion systems. Each sensor is a task of a real-time operating system, whose core is a Deadline Monotonic scheduler. The scheduler decides if the sensor set is admissible or not, according to user-specified deadlines. Additionally, it monitors sensor behavior to check if the user model was correct and to characterize deadlines precisely (to be considered in future sensor changes/additions). We present a real development based on our concept: a low-cost platform that combines the contributions of a hyperspectrometer and a series of positioning sensors, which track the movements of the carrier vehicle.

1 Introduction

In 2003, the Spanish Ministry of Science and Technology launched a call for R&D proposals to handle sea oil spills, as a result of the sinking of the tanker Prestige. Diverse groups from the Atlantic universities of La Coruña, Santiago de Compostela, País Vasco and Vigo were granted a project for the development of a low-cost compact hyperspectrometer, to be carried in aerial vehicles (Spanish MCyT VEM200320088-C04, Desarrollo de un sistema hiperespectral multisensor para la detección, seguimiento y representación cartográfica de vertidos marinos). The project pursues several goals, including:

– Hyperspectrometer compactness, since existing commercial solutions are too bulky [1, 2].
– As a secondary objective, the development of a platform to fuse the hyperspectrometer output with the contributions of diverse sensors. The platform is open, in the sense that it is easy to add/remove/replace sensors. Consequently, it is not physically part of the hyperspectrometer.

This paper describes the sensorial fusion platform (subproject 04). Each of its sensors is a task of a real-time operating system, whose core is a Deadline Monotonic scheduler. The scheduler decides if a new sensor is admissible or not, according to user-specified deadlines. Additionally, it monitors sensor behavior
to check if the user model was correct and to characterize deadlines precisely (to be considered in future sensor changes/additions). We present its software architecture in section 2. Section 3 describes the current hardware architecture of a low-cost low-weight airborne sensorial fusion platform. Section 4 evaluates its feasibility according to the theoretical framework in section 2. Finally, section 5 concludes.

2 Software architecture

The software architecture has two layers:

– The application software layer, which serves the main goal of the system (sensorial data acquisition/fusion/storage), and
– the operating system layer, which manages system resources and hides hardware details from the programmer.

2.1 Operating system layer

Since the sensorial fusion platform is open, we have not implemented a specialized operating system (OS). Instead, we have chosen Linux, a general-purpose OS. In our context, this has two extra advantages:

– Linux is highly Posix-compliant [5]. Consequently, it is possible to migrate the application layer to different OSs fulfilling that norm (VxWorks [6], pSoS+, etc.) with minimal changes.
– Its source code is open. Thus, it will be possible to tailor the operating system layer in the future, if necessary.

As mentioned in section 1, the sensorial fusion platform relies on a real-time operating system. For this reason, the actual operating system layer is based on the rt-Linux (real-time Linux) extension [7].

2.2 Application software layer

This layer is fully modular. Each sensor is managed by a real-time periodic task, which is plugged into a software bus. A specialized task, the Deadline Monotonic scheduler, controls the software bus, as shown in Figure 1. Thus, it is possible to (de)install sensors easily. Each task is modeled by a series of parameters. The most important ones are (a) the activation period (which depends on the production rate of the corresponding source) and (b) the processing time demand (which depends on both the communication rate between the sensor and the sensorial fusion platform and the acquisition rate of the latter).


Fig. 1. Application software layer

2.3 Deadline Monotonic scheduler

This management task has three main goals:

– It determines if the sensor task set (the mission) is feasible, i.e. if it is possible to fulfil its real-time constraints.
– Accordingly, it assigns each task its priority at the operating system layer.
– It monitors the actual behavior of each task. This serves fine constraint characterization, since a completely wrong model would lead to system collapse.

On system initialization, the scheduler reads the following parameters from a configuration file:

– The number of real-time tasks connected to the software bus, n.
– The number of resources they share, m. The tasks in the current implementation (section 3) do not share resources, i.e. m = 0. However, the design considers the case of m ≠ 0.

Additionally, the model of each task τi comprises the following parameters:

– Ti: task period, which is the worst-case (minimum) elapsed time between consecutive task activations.
– Ci: task demand, or the worst-case (maximum) processor time a task activation requires to finish its current work assignment.
– Di: task response time, which is the worst-case time a task requires to finish all its work.
– For each resource rj a task uses, there exists a critical region length zij, which is the worst-case (maximum) processor time the task consumes while holding rj.
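As an illustration only (the paper does not specify the configuration file format, and the field names below are our own), the per-task model the scheduler reads could be represented as follows:

```python
from dataclasses import dataclass, field
from typing import Dict

@dataclass
class Task:
    """Model of one sensor task, as read from the configuration file."""
    name: str
    T: int   # period: worst-case (minimum) time between activations
    C: int   # demand: worst-case processor time per activation
    D: int   # deadline: worst-case allowed response time
    # resource name -> critical region length z_ij (empty when m = 0)
    z: Dict[str, int] = field(default_factory=dict)

# The mission of section 3: three tasks, no shared resources (m = 0).
mission = [
    Task("hyperspectrometer", T=10, C=2, D=10),
    Task("rotation_sensors", T=10, C=1, D=10),
    Task("gps", T=1000, C=50, D=1000),
]
```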


Then, the scheduler checks if the mission is feasible, i.e. if it is possible to guarantee that all tasks will fulfil their deadlines under the Deadline Monotonic scheduling algorithm [8] and the Priority Ceiling shared resource access synchronization protocol [9], known as the Priority Protect Protocol in real-time Posix extensions [10]. In the current implementation (section 3), all tasks must complete a given activation before the next one, i.e. Di = Ti. Thus, it would be possible to schedule tasks according to the Rate Monotonic algorithm [11], which is optimal in this case (for fixed priorities). However, the Deadline Monotonic algorithm is more adequate for our open design philosophy.

In order to check if the mission is feasible, the scheduler:

1. Assigns task priorities as a monotonically decreasing function of their deadlines: the longer the deadline, the lower the priority.
2. Assigns a priority ceiling to each shared resource, which is the maximum priority of all tasks using the resource.
3. Calculates the blocking time Bi, which task τi suffers in the worst case due to the remaining tasks, as a result of the application of the Priority Ceiling protocol:

   Bi = max_{j ∈ lp(i), k ∈ hc(i)} zjk.   (1)

   In other words, the maximum blocking time task τi may suffer is the longest critical region within the set of tasks whose priority is less than that of τi (lp(i)), regarding those resources whose ceiling is at least equal to the priority of τi (hc(i)).
4. Checks that each worst-case task response time Ri satisfies Ri ≤ Di. Since it is not possible to solve the response time equation

   Ri = Ci + Bi + Σ_{j ∈ hp(i)} ⌈Ri / Tj⌉ Cj   (2)

   analytically, one can use the recursive formula in [12] instead:

   wi^(n+1) = Ci + Bi + Σ_{j ∈ hp(i)} ⌈wi^(n) / Tj⌉ Cj   (3)

   starting with wi^(0) = Ci, until either of the following conditions holds:

   – wi^(n+1) > Di, which means that the task cannot fulfil its deadline and the mission is not feasible, or
   – wi^(n+1) = wi^(n) ≤ Di, which means that the task fulfils its deadline and Ri = wi^(n).

If the validation stage succeeds, the scheduler launches one rt-Linux task per sensor with the corresponding Deadline Monotonic priority. Then, once the
system is running, the scheduler monitors tasks periodically to check their temporal parameters. If the initial estimates were not exact, the Deadline Monotonic algorithm repeats its calculations with the new worst-case values. Finally, it generates a trusted file to be used in future system modifications, possibly combining new sensors with known ones (for example, in partial sensor upgrades).
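The whole feasibility test (Deadline Monotonic priority assignment, blocking times via eq. (1), and the response-time recursion of eq. (3)) can be sketched as follows. This is a minimal Python sketch, not the actual scheduler code; the function name and the dictionary-based task representation are our own:

```python
import math

def dm_feasible(tasks):
    """Deadline Monotonic feasibility test.
    tasks: list of dicts with keys T (period), C (demand), D (deadline)
    and z (resource name -> critical region length).
    Returns (feasible, response_times) in input order."""
    # 1. DM priority assignment: the shorter the deadline, the higher the priority.
    order = sorted(range(len(tasks)), key=lambda i: tasks[i]["D"])
    prio = {i: len(tasks) - rank for rank, i in enumerate(order)}

    # 2. Priority ceiling of each shared resource.
    ceiling = {}
    for i, t in enumerate(tasks):
        for r in t["z"]:
            ceiling[r] = max(ceiling.get(r, 0), prio[i])

    # 3. Worst-case blocking time B_i, eq. (1): longest critical region z_jk
    #    held by a lower-priority task j over a resource k whose ceiling is
    #    at least the priority of task i.
    def blocking(i):
        B = 0
        for j, t in enumerate(tasks):
            if prio[j] < prio[i]:                    # j in lp(i)
                for r, z in t["z"].items():
                    if ceiling[r] >= prio[i]:        # r in hc(i)
                        B = max(B, z)
        return B

    # 4. Response time R_i by the recursion of eq. (3).
    R = []
    for i, t in enumerate(tasks):
        hp = [tasks[j] for j in range(len(tasks)) if prio[j] > prio[i]]
        w = t["C"]                                   # w_i^(0) = C_i
        while True:
            w_next = t["C"] + blocking(i) + sum(
                math.ceil(w / h["T"]) * h["C"] for h in hp)
            if w_next > t["D"]:
                return False, R                      # deadline missed
            if w_next == w:
                break                                # fixed point: R_i = w
            w = w_next
        R.append(w)
    return True, R
```

Applied to the mission analyzed in section 4 (three independent tasks), the function reproduces the response times reported there.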

3 Hardware architecture

As mentioned in section 1, the hyperspectrometer and the sensorial fusion platform are physically independent, because future applications may require a different sensor set (mission). Our current design is a low-cost, low-weight airborne box that is fully autonomous, in the sense that it does not interact with carrier vehicle instrumentation. Its core is an x86 mini-ITX [3] EPIA board (Figure 2) with a 120 GB hard disk (Ultra-ATA/133 bus at 133 MBps), which can store up to five hours of sensor data. From the point of view of the platform, the hyperspectrometer is simply another (external) sensor, which generates 100 lines per second (0.48 Mb each) that the platform acquires via IEEE 1394a at 400 Mbps. The remaining main elements are:

– Gyroscopes and accelerometers (3+3): three Gyration MG100 mini-gyroscopes and three Analog Devices ADXL202 accelerometers to track 3-axis rotations of the carrier vehicle (accelerometer data allows us to correct gyroscopic deviations). An auxiliary ATmega128 microcontroller board captures their outputs every 10 ms through a MAX1238 A/D converter (6×12 bits per hyperspectral line) and presents a single data flow to the platform via RS232.
– GPS: a RoyalTek REB-2100 board that provides altitude, speed and direction of movement via RS232, using the NMEA 0183 protocol [4]: ∼60 Bps.

4 Feasibility analysis

This section estimates the feasibility of the platform according to the theoretical framework in section 2.

– Every 10 ms, the hyperspectrometer generates a 0.48 Mb burst that the platform acquires at 400 Mbps (the 1394 link speed). Thus, the processing time is less than 2 ms: T1 = 10 and C1 = 2. Computation must finish before the arrival of the next burst, which means that D1 = 10.
– The rotation sensor pack generates 6×12 bits every 10 ms, which reach the EPIA board via the first serial port at 100 Kbps. This means a computation time of 6 × 12/10^5 ≈ 1 ms every 10 ms. Thus, T2 = 10, C2 = 1 and, as in the previous case, D2 = 10.
– GPS data reach the EPIA board via the second serial port: 60 Bps captured at 9600 bps. This means a computation time of 60 × 8/9600 = 0.05 s each second, i.e. T3 = 1000 and C3 = 50. Again, computation must finish before the arrival of the next burst, which means that D3 = 1000.
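The estimates above can be cross-checked with a few lines of Python (a sanity check on the arithmetic only; the 2 ms value for C1 is a conservative rounding of the raw 1.2 ms transfer time):

```python
# Worst-case computation-time estimates of section 4, recomputed from
# the stated data volumes and link speeds (results in ms).
hyper_ms = 0.48e6 / 400e6 * 1e3   # 0.48 Mb burst over IEEE 1394a at 400 Mbps: ~1.2 ms
rot_ms = (6 * 12) / 100e3 * 1e3   # 6 x 12 bits over RS232 at 100 Kbps: ~0.72 ms
gps_ms = (60 * 8) / 9600 * 1e3    # 60 Bps captured at 9600 bps: ~50 ms per second
```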


Fig. 2. EPIA architecture


From these parameters, the following priorities result: P1 = 3, P2 = 2 and P3 = 1. There are no shared resources; then, ∀τi, Bi = 0, and from (3):

– R1 = 2 ⇒ R1 < D1.
– R2 = 3 ⇒ R2 < D2.
– R3 = 74 ⇒ R3 < D3.

We conclude that the tasks always satisfy their deadlines, and the mission is feasible according to the initial estimates.
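The slowest case, R3, can be reproduced by iterating eq. (3) directly. A minimal standalone sketch (the recursion converges through w = 50, 65, 71, 74):

```python
import math

# Response time of the lowest-priority (GPS) task via eq. (3), with
# B3 = 0 since no resources are shared; parameters from section 4.
C1, T1 = 2, 10       # hyperspectrometer task
C2, T2 = 1, 10       # rotation sensor task
C3, D3 = 50, 1000    # GPS task

w = C3               # w3^(0) = C3
while True:
    w_next = C3 + math.ceil(w / T1) * C1 + math.ceil(w / T2) * C2
    if w_next == w:  # fixed point reached: R3 = w
        break
    w = w_next

assert w <= D3       # R3 = 74 < 1000: the GPS task meets its deadline
```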

5 Conclusions and future work

We have presented a low-cost real-time modular platform for sensorial fusion. The platform evaluates mission feasibility from initial user estimates and corrects temporal parameters for future modifications. In the current research stage, we are considering the possibility of replacing rt-Linux by even lighter OSs like SARTL (Standalone Real-Time Linux) [13] and MaRTE OS (Minimal Real-Time Executive) [14].

References

1. HyMap, http://www.intspec.com/
2. AISA, http://www.specim.fi/products-aisa.html
3. New Mini-ITX Mainboard Specification White Paper, http://www.via.com.tw/en/VInternet/Mini-iTX.PDF
4. NMEA 0183 Standard for Interfacing Marine Electronic Devices, Version 2.30, March 1, 1998.
5. IEEE Standard 1003.1, International Standard for Information Technology – Portable Operating System Interface (POSIX) – Part 1: System Application Program Interface (API) [C Language], 1996.
6. VxWorks, http://www.windriver.com
7. Real Time Linux, http://www.rtlinux.org
8. J. Leung and J. Whitehead, "On the Complexity of Fixed-Priority Scheduling of Periodic, Real-Time Tasks," Performance Evaluation, vol. 2, no. 4, pp. 237–250, 1982.
9. R. Rajkumar, Synchronization in Real-Time Systems: A Priority Inheritance Approach. Kluwer Academic Publishers, 1991.
10. IEEE Standard 1003.1b–1993, Standard for Information Technology – Portable Operating System Interface (POSIX) – Part 1: System Application Program Interface (API) – Amendment 1: Realtime Extension [C Language], 1993.
11. C. L. Liu and J. W. Layland, "Scheduling Algorithms for Multiprogramming in a Hard-Real-Time Environment," Journal of the ACM, vol. 20, no. 1, pp. 46–61, Feb. 1973.
12. N. Audsley, A. Burns, M. Richardson, K. Tindell, and A. J. Wellings, "Applying New Scheduling Theory to Static Priority Pre-emptive Scheduling," Software Engineering Journal, vol. 8, no. 5, pp. 284–292, Sept. 1993.
13. Stand-Alone RTLinux main site, http://www.ocera.org/ vesteve
14. MaRTE OS, http://marte.unican.es