Visualization of Wireless Sensor Networks using ZigBee's Received Signal Strength Indicator (RSSI) for Indoor Localization and Tracking

Flora Salim1, Mani Williams2, Nishant Sony3, Mars Dela Pena4, Yury Petrov5, Abdelsalam Ahmed Saad6, Bo Wu7

1,3-7School of Computer Science and Information Technology, 2Spatial Information Architecture Laboratory, 1-7RMIT University, Melbourne, Australia
{1flora.salim, [email protected]}

Abstract—With the increasing popularity of Wireless Sensor Networks (WSN), indoor localization has become a key research challenge, since it is crucial to locate the sensors in order to analyze the sensor data in their spatial and temporal contexts. The paper presents a novel method to visualize the position and trajectory of a dynamic WSN using ZigBee's Received Signal Strength Indicator (RSSI) in a map-based visualization, deployed on smartphones. Existing research has demonstrated the use of RSSI to estimate the position of ZigBee tags (transmitters). The contribution of this paper is to integrate the use of sensor cloud services for managing spatio-temporal data streams and a smartphone app for visualizing the dynamic spatial and temporal aspects of the readings from wireless sensor networks.

Keywords—localization; indoor positioning; RSSI; ZigBee; visualization; wireless sensor networks; smartphone app; sensor cloud

I. INTRODUCTION

The proliferation of low-cost microcontrollers and sensors has encouraged ad-hoc wireless sensor networks to be constructed and configured for various purposes. The wealth of intelligence that can be analysed and synthesized from these sensors has the potential to increase our understanding of environmental trends and social behaviours in the city. As such, it is necessary for the sensors to be analysed in the context of their positions in time and space. Although GPS provides passable data on physical location – or, alternatively, a method for approximating the physical distance between two objects – it does so in the context of a global map. The accuracy and usefulness of GPS decrease as the contextual space inhabited by the object decreases, such that GPS-generated longitude and latitude values are not particularly helpful for differentiating the physical locations of multiple sensors in an indoor space. ZigBee [1] is a specification for communication protocols that implement IEEE 802.15.4 [2], a standard that defines the medium access control (MAC) and physical layer (PHY) protocols for low-power devices, and is hence suitable for indoor and outdoor wireless sensor networks. While the focus of existing research has mainly been on improving the accuracy of localization techniques using estimates drawn from RSSI data from ZigBee modules [3][4] and on increasing the robustness of the RSSI measurements, such as under RF interference [5], there is also a need to present the localization data to users in an accessible and intuitive manner.

The increasing number of smartphone users provides a platform for information from WSNs to be made available to a vast audience. To do this, it is essential for data from WSNs to be managed in their spatial and temporal contexts in real time. With the advances in cloud computing and the growing popularity of crowdsourcing techniques, many cloud services for managing sensor data, or sensor clouds, have been proposed. Cloud computing can alleviate issues in implementing traditional sensor networks, such as the requirement for a large number of nodes to monitor large areas and the obstructions in the environment that affect network topology [6]. However, many existing sensor data models do not consider indoor usage. Although geospatial references, consisting of latitude and longitude, are often included in existing sensor data models, these are insufficient for indoor localization of wireless sensor networks.

Several research challenges are coupled with this issue. How can data from a WSN be modelled so that the spatial and temporal aspects of sensor data are captured for indoor localisation and tracking? How can a sensor cloud framework be designed to handle multiple sensor data streams? How can spatio-temporal sensor data be managed and visualized in situ?

In this paper, we propose a sensor cloud framework that includes a data model for localizing sensors in indoor environments, together with methods to manage sensor data as services in the cloud and to visualize the data in situ in 3D on floor plans or maps. EnviS, an Android app, is an integrated toolkit prototyped to evaluate this research. EnviS can be applied to managing environmental sensors in indoor or outdoor contexts. To evaluate this research, a set of environmental sensor modules and ZigBee mesh networks were constructed. The sensor nodes are currently built with light, temperature, motion, noise, and humidity sensors, and ZigBee antennas. The existing testbed can be extended with any type of sensor on any platform by implementing the RESTful web service presented in this paper.

II. RELATED WORK

The Sensor Cloud infrastructure proposed in [6] provides service instances (virtual sensors) to end users upon request, via a user interface supported by web crawlers. The Sensor-Grid architecture proposed in [7] is not suitable for this research, as it deals with multiple wireless sensor networks through gateways in the architecture, without the use of spatial references; instead, it uses an event-matching approach for delivering published sensor data or events to users. SenseWeb [8] is one of the first software architectures for sharing sensor data by connecting Wireless Sensor Networks (WSN) to the Internet. Using the SenseWeb API, users can register and publish their own sensor data. In SenseWeb, all processes are executed at a single Coordinator, which acts as a central point of access for all sensors and for all applications using the sensor data through the WS-API. SenseWeb has been used to provide RFID-tag-based data for inferring indoor events in Washington [8]. Piyare and Lee [9] introduced three layers for the integration between a wireless sensor network and the cloud: the sensor layer, the coordinator layer, and the supervision layer. The main difference from SenseWeb is that the coordinator runs on ZigBee and, with the introduction of the supervision layer, is directly connected to the supervision layer via a REST-based API, to which all applications can also connect [9]. Piyare and Lee used an Arduino board and ZigBee to demonstrate the prototype. The coordinator layer is essentially a bridge that buffers sensor data and pushes the data to the supervision layer at a predefined interval.

Guo et al. [10] present a standardization system model to integrate different types of sensor networks, aiming for a unified data structure and service model. Their sensor data structure is composed of a sensor description model and a sensor data model, which includes a timestamp and the raw sensor data [10]. This is the standard approach in many existing sensor clouds: a timestamp is applied to every discrete sensor reading. The OpenGIS® Sensor Model Language Encoding Standard (SensorML) [11] provides a standard model and XML encoding for describing the characteristics of sensors and sensor systems. SensorML supports sensor geolocation; however, the model does not support indoor localization of sensors.

III. SENSOR CLOUD SERVICES

Existing literature generally treats 'spatial' as either a set of GPS coordinates, a mathematical concept within a model, or the parameters bounding a cluster of nodes [12]. EnviS is distinct from other sensor clouds in that it stores a representation of the sensors' spatial context, insofar as the user is able to capture it by manually drawing the map in the app, and its three-dimensional data visualization presents the sensors and their readings within their associated map.

A. Data Model
In our project, the sensor data model and the protocol for streaming the data are designed with spatio-temporal data management and visualisation in mind. Every sensor is associated with a location and an identifier, or belongs to a sensor set, which is associated with a location and an identifier. A sensor set defines a related group of sensors that need to be streamed together as a single data stream with one timestamp. If the sensors in a sensor set are collocated, it is the sensor set that is associated with a location. However, if the sensors in a sensor set are not collocated, or if the locations of the sensors vary, each sensor in the set needs to specify its own location. In this way, sensor data can be visualised "in situ". When a sensor set is plotted on a floor plan, the x, y, and z positions and the geolocation of the set are copied over to its sensors by default; the position of the set therefore defines the position of the sensors in that set. Alternatively, if the sensors belonging to a set are widely spread out, the application gives the administrator the option to plot individual sensors onto the floor plan. A sensor cannot exist without a set; a set is simply a conceptualization of a sensor module, or a collection of sensors.
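To illustrate this data model, the following minimal sketch shows how a sensor set and its readings might be streamed as a single timestamped record. The class and field names are our own illustrative assumptions, not the exact schema used by EnviS.

from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Sensor:
    sensor_id: str
    kind: str                      # e.g. "temperature", "light"
    # Optional per-sensor position, used only when the set is not collocated.
    x: Optional[float] = None
    y: Optional[float] = None
    z: Optional[float] = None

@dataclass
class SensorSet:
    set_id: str
    # Position of the whole set on the floor plan; copied to sensors by default.
    x: float = 0.0
    y: float = 0.0
    z: float = 0.0
    latitude: Optional[float] = None
    longitude: Optional[float] = None
    sensors: List[Sensor] = field(default_factory=list)

    def stream_record(self, timestamp: float, values: dict) -> dict:
        """Build one data-stream record: one timestamp for the whole set."""
        return {
            "set_id": self.set_id,
            "timestamp": timestamp,
            "readings": [
                {
                    "sensor_id": s.sensor_id,
                    "value": values.get(s.sensor_id),
                    # Fall back to the set position when the sensor has none.
                    "x": s.x if s.x is not None else self.x,
                    "y": s.y if s.y is not None else self.y,
                    "z": s.z if s.z is not None else self.z,
                }
                for s in self.sensors
            ],
        }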

Fig. 1. Data Model

The database schema in Fig. 1 illustrates the relationships between the entities and their associated attributes as they are stored in our database. Each sensor's location is stored in the x, y, and z attributes of the many-to-one relationship ('include') between the sensor and a sensor set; collocation with the sensor set is the default until the user manually places the sensor at a different location on the associated map, at which point these location attributes are updated. The sensor readings are stored as weak entities belonging to one specific sensor, since they are only meaningful in relation to the sensor from which they originated. When a sensor set is added to a map, the sensor set's location within the map is stored as attributes of the relationship ('has') between the sensor set and map entities.
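A minimal relational sketch of this schema, using SQLite for brevity, is given below; the table and column names are inferred from Fig. 1 for illustration and are not the actual EnviS schema.

import sqlite3

# Illustrative schema inferred from Fig. 1 (names are assumptions). Sensor
# coordinates represent the attributes of the many-to-one 'include'
# relationship, folded into the sensor table; readings are weak entities keyed
# by their owning sensor; the set-to-map 'has' relationship carries the set's
# position within the map.
DDL = """
CREATE TABLE map        (map_id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE sensor_set (set_id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE sensor (
    sensor_id INTEGER PRIMARY KEY,
    set_id    INTEGER NOT NULL REFERENCES sensor_set(set_id),
    kind      TEXT,
    x REAL, y REAL, z REAL
);
CREATE TABLE reading (
    sensor_id INTEGER NOT NULL REFERENCES sensor(sensor_id),
    ts        REAL NOT NULL,
    value     REAL,
    PRIMARY KEY (sensor_id, ts)       -- weak entity: identified via its sensor
);
CREATE TABLE set_on_map (
    set_id INTEGER NOT NULL REFERENCES sensor_set(set_id),
    map_id INTEGER NOT NULL REFERENCES map(map_id),
    x REAL, y REAL, z REAL,           -- the set's position within the map
    PRIMARY KEY (set_id, map_id)
);
"""

conn = sqlite3.connect(":memory:")
conn.executescript(DDL)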

B. Service Architecture
In order to accommodate the demands of multiple live data feeds and the data models for maps and sensor readings, EnviS (Fig. 2) uses five tiers in its multitier architecture. One tier contains the Wireless Sensor Network (WSN), which comprises the sensors, XBee hardware, and Arduino [13] code. The WSN for this system is sense-only, uses the ZigBee protocol, and follows a many-to-one interaction pattern, in the taxonomy described by Mottola and Picco [12]: the coordinator XBee acts as a central node and relays the sensor readings from the router XBees to the cloud. The disadvantage of this architectural decision is that it makes each coordinator a single point of failure for all the end routers in its vicinity.
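To make the many-to-one pattern concrete, the sketch below shows a coordinator-side relay loop in Python; the serial port, frame format, and cloud endpoint URL are illustrative assumptions, since the actual coordinator is realised on Arduino/XBee hardware.

import json
import serial            # pyserial: reads frames forwarded by the coordinator XBee
import requests

CLOUD_URL = "http://example.org/enviS/readings"   # hypothetical data-consumer endpoint

def relay(port: str = "/dev/ttyUSB0", baud: int = 9600) -> None:
    """Relay sensor readings from router XBees (via the coordinator) to the cloud."""
    with serial.Serial(port, baud, timeout=1) as link:
        while True:
            line = link.readline().decode("utf-8", errors="ignore").strip()
            if not line:
                continue                    # nothing received within the timeout
            try:
                reading = json.loads(line)  # assume one JSON record per line
            except ValueError:
                continue                    # skip malformed frames
            try:
                requests.post(CLOUD_URL, json=reading, timeout=5)
            except requests.RequestException:
                # The coordinator is a single point of failure; a real
                # deployment would buffer and retry here.
                pass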

The back-end system was initially hosted on Amazon Web Services (AWS). Lee, Murray, Hughes, and Joosen expounded on the suitability of Amazon EC2 for environmental monitoring [15]. The elasticity of the web service allows it to absorb much of the resource-intensive processing, which is one of the factors driving the integration of WSNs with cloud computing. However, our system exhibited greater stability on the National eResearch Collaboration Tools and Resources (NeCTAR) cloud, and the decision was subsequently made to move all the back-end tiers to NeCTAR. NeCTAR provides suitably high network bandwidth for numerous real-time data feeds and has the capacity to allow for scaling to larger feeds in the future.

IV. APPLICATIONS

Two applications of the EnviS system are presented here. The indoor localization application utilized an off-the-shelf tracking system coupled with a custom-built data recording system. The multi-sensor monitoring application interfaced a set of Arduino-controlled electronic sensors with a ZigBee mesh network.

Fig. 3. ZigBee modules (left: XBee modules attached to sensor sets for Multi-sensor Monitoring; right: individual ZigBee tags for Indoor Localization).

Fig. 2. Sensor Cloud Service Architecture

There are two web service tiers. The data consumer tier processes the sensor readings from the coordinators of multiple WSNs and transmits them to the database, while the data provider tier acts as an interface between the database and the application tier. The database tier stores the map coordinates, the sensor details, and the sensor readings, which reference both. The application tier contains a local database that is synchronized with the cloud database whenever the mobile device on which it is installed detects a network connection. Storing data locally increases the application's memory usage on the client side, but it allows the user to view visualizations of historical data and to edit maps when there is no connection to the cloud. This research had to address some of the common challenges of integrating WSNs with the cloud, as discussed in [14], such as network bandwidth, load balancing, and scalability with large real-time data feeds. Both web service tiers – the data provider and the data consumer – are clustered, allowing incoming data to be load-balanced and providing greater stability and improved performance.
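A minimal sketch of what the data consumer tier's ingestion endpoint could look like is given below, using Flask and SQLite for brevity; the route name, payload fields, and storage layer are our own assumptions rather than the actual EnviS implementation.

import sqlite3
from flask import Flask, request, jsonify

app = Flask(__name__)
DB = "envis.db"   # hypothetical database file

@app.route("/readings", methods=["POST"])
def ingest():
    """Accept one timestamped record per sensor set from a WSN coordinator."""
    record = request.get_json(force=True)
    with sqlite3.connect(DB) as conn:
        conn.executemany(
            "INSERT INTO reading (sensor_id, ts, value) VALUES (?, ?, ?)",
            [(r["sensor_id"], record["timestamp"], r["value"])
             for r in record["readings"]],
        )
    return jsonify({"stored": len(record["readings"])}), 201

if __name__ == "__main__":
    app.run()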

Fig. 4. ZigBee Sensor Network Setup for Indoor Localization

A. Indoor Localization
Using an off-the-shelf commercial system that utilised the ZigBee protocol, we implemented a wireless sensor network to evaluate EnviS's visualization functionality for indoor localization (Fig. 4). To test the development, 9 beacons (8 ZigBee receivers and 1 ZigBee coordinator) were deployed around the room where the indoor tracking was performed, to track 14 tags (ZigBee transmitters, Fig. 3 right) worn by the room occupants. Three webcams were also installed to provide contextual information to accompany the ZigBee tracking data. The graph in Fig. 9 shows the real-time approximate locations of the people wearing the tags as they move around the room.


Through testing, the ZigBee system was found to provide an average tracking performance of True Positives (TP) = 79% when recording transitions between 2 beacons placed 10 meters apart, using 2 tags (Fig. 5, line plot). The corresponding signal strength readings were also recorded (Fig. 5, circles). As the testing results in Fig. 5 show, the signal strength outputs are an unreliable representation of precise distance, as they were sensitive to the placement of obstacles (such as people and furniture). In the EnviS visualisation we therefore used the distance measure only as an indication. As noted earlier, other research provides algorithms for more accurate positioning using techniques such as Markov chain inference [3]. More accurate localization also depends on a clear line of sight (LOS) to the transmitter; for example, putting a tag in a person's back pocket reduces accuracy and can sometimes give false registrations. Although maintaining a clear LOS is a challenge, as highlighted in [16], existing work such as [17] provides techniques to preserve a clear LOS between the transmitter and the receiver.
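For reference, the indicative distance used in the visualisation can be derived from RSSI with a log-distance path-loss model; the sketch below is a generic formulation with assumed calibration constants, not the exact conversion used by the commercial tracking system.

import math

def rssi_to_distance(rssi_dbm: float,
                     rssi_at_1m: float = -45.0,   # assumed calibration at 1 m
                     path_loss_exponent: float = 2.5) -> float:
    """Estimate distance (m) from RSSI using the log-distance path-loss model:
    RSSI(d) = RSSI(1 m) - 10 * n * log10(d)."""
    return 10 ** ((rssi_at_1m - rssi_dbm) / (10.0 * path_loss_exponent))

# Example: a reading of -65 dBm maps to roughly 6.3 m with these constants,
# but obstacles such as people and furniture make this only an indication.
print(round(rssi_to_distance(-65.0), 1))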

The app allows users to draw 3D maps of the indoor spaces in which the sensors are positioned and to plot sensors in the drawing; this spatial context matters because, for example, the corners of a room may be warmer than the centre. Once the map is drawn, the administrator or user has two ways of visualizing data in the application: viewing the readings as chart visualizations, or using the 3D visualisation. The visualizations can be viewed in real time or over a specified time range, and while visualizing the sensor data the user can interact with the interface to focus on particular readings. In the 3D visualization, users can rotate the map, zoom in and out, and change perspective (top, front, and side views). The 3D visualisation presents sensor data with spatial references to the actual location in situ, as shown on the map, together with time-series visualisation. It offers two options: bar charts (Fig. 6) or spheres (Fig. 7). The height of the bars or the size of the spheres changes with the sensor data values at different times of day, and a scrollable time axis allows interactive time-series visualisation on the map. Fig. 6 and Fig. 7 demonstrate the app's capability in the multi-sensor monitoring application.
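The sketch below illustrates how such a visualisation might map a reading to a bar height or sphere radius at a given point on the time axis; the scaling function and value ranges are illustrative assumptions, not the app's actual rendering code.

def scale_reading(value: float,
                  value_min: float,
                  value_max: float,
                  size_min: float = 0.1,
                  size_max: float = 1.0) -> float:
    """Linearly map a sensor value to a bar height or sphere radius (map units)."""
    if value_max <= value_min:
        return size_min
    t = (value - value_min) / (value_max - value_min)
    t = max(0.0, min(1.0, t))          # clamp readings outside the expected range
    return size_min + t * (size_max - size_min)

# Example: a 26 degC reading on a 15-35 degC scale becomes a bar ~0.6 map units tall.
print(round(scale_reading(26.0, 15.0, 35.0), 2))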

Fig. 5. ZigBee System Testing

B. Multi-sensor Monitoring
In this research, custom maps made by users are stored with geospatial references in the database. The locations of the receivers first need to be initialised in the map, by tagging or scanning each receiver to pinpoint its location in the building plan, room layout, or map stored in the database. The location of each ZigBee tag (transmitter) is treated as the location of a sensor set, with x, y, and z references relative to the ZigBee receivers inferred from the RSSI distance calculation. Whenever the location of a tag is updated, the database is updated through the web service accessed by the ZigBee modules, and the visualisation of the ZigBee tag locations is also updated on the smartphone app, which is discussed next.
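Before turning to the app, the sketch below gives a minimal picture of such a location update pushed from the gateway side to the web service; the endpoint path and payload fields are hypothetical and only illustrate the update flow described above.

import time
import requests

UPDATE_URL = "http://example.org/enviS/locations"   # hypothetical provider endpoint

def push_location_update(set_id: str, x: float, y: float, z: float,
                         rssi_dbm: float, beacon_id: str) -> bool:
    """Report a new inferred position for a sensor set (ZigBee tag)."""
    payload = {
        "set_id": set_id,
        "x": x, "y": y, "z": z,          # position inferred from RSSI distances
        "rssi_dbm": rssi_dbm,
        "nearest_beacon": beacon_id,
        "timestamp": time.time(),
    }
    try:
        resp = requests.post(UPDATE_URL, json=payload, timeout=5)
        return resp.status_code == 201
    except requests.RequestException:
        return False                      # caller may retry or buffer the update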

Fig. 6. 3D Visualisation with Bar Charts on the Map

V. ENVIS MOBILE APPLICATION

The mobile application is prototyped for Android devices. The Android app allows the user to fetch sensor readings from the cloud-based web service and database at their convenience, and it generates intuitive charts and 3D visualizations of the data. The mobile application has two major parts. The first is the administration part, which allows the user to add a network of sensors to the EnviS system; the added sensors are then visible to all users of the application. When a sensor is added to EnviS, the application picks up the geolocation of the phone and tags the sensor with those coordinates.
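As an illustration of that registration step, the sketch below shows the kind of request the app could send when a sensor set is added, attaching the phone's geolocation; the endpoint and field names are assumptions for illustration only.

from typing import List
import requests

REGISTER_URL = "http://example.org/enviS/sensor_sets"   # hypothetical admin endpoint

def register_sensor_set(name: str, sensor_kinds: List[str],
                        phone_lat: float, phone_lon: float) -> dict:
    """Register a new sensor set, tagged with the phone's current geolocation."""
    payload = {
        "name": name,
        "sensors": [{"kind": k} for k in sensor_kinds],
        "latitude": phone_lat,            # geolocation picked up from the phone
        "longitude": phone_lon,
    }
    resp = requests.post(REGISTER_URL, json=payload, timeout=5)
    resp.raise_for_status()
    return resp.json()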

Fig. 7. 3D Visualisation with Spheres on the Map

The visualisation of the indoor localization application was implemented separately. Fig. 8 shows the visualisation of the 9 beacons (8 ZigBee receivers and 1 ZigBee coordinator) placed on the map. Fig. 9 displays the proximity of the 14 ZigBee tags around the room, visualised as spheres centred on the beacon that registered them. Each coloured circle corresponds to a unique ZigBee tag, with its radius reflecting the distance estimate inferred from the RSSI value measured at the beacon shown at the centre of the circle. Each circle was redrawn around the closest beacon whenever an updated localisation reading was detected.
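The sketch below captures this drawing rule: each tag's circle is placed on the beacon that reported the strongest signal, with a radius given by the RSSI-derived distance. The beacon positions and the rssi_to_distance helper are assumptions carried over from the earlier sketches.

from typing import Dict, Tuple

def place_tag_circle(readings: Dict[str, float],
                     beacon_xy: Dict[str, Tuple[float, float]],
                     rssi_to_distance) -> Tuple[str, Tuple[float, float], float]:
    """Choose the closest beacon (strongest RSSI) for a tag and return the
    beacon id, the circle centre (the beacon's map position), and the radius."""
    closest = max(readings, key=readings.get)      # strongest (least negative) RSSI
    radius = rssi_to_distance(readings[closest])   # indicative distance in metres
    return closest, beacon_xy[closest], radius

# Example with two beacons; the tag is drawn around beacon "B2":
# beacons = {"B1": (0.0, 0.0), "B2": (10.0, 0.0)}
# print(place_tag_circle({"B1": -78.0, "B2": -60.0}, beacons, rssi_to_distance))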

VI. CONCLUSION AND FUTURE WORK

This paper presents EnviS, a sensor cloud services platform with an ad-hoc sensor network visualization toolkit for Android mobile devices. EnviS, as a sensor analytics toolkit, was developed for visualising the spatial and temporal aspects of sensor data. The EnviS app gives users a choice of visualization options, including whether to use near-real-time or historical data, enabling time-series visualisation and 3D spatial representation of sensor locations and readings. This gives users the opportunity to see how environmental variables are influenced by the physical space around the sensors.

Future work includes data mining and pattern mining of historical and real-time localization data. Social network analysis can be performed on the indoor localization data to investigate patterns of people movement and social behaviours in a particular space, room, or building over a period of time. Fig. 10 shows an early visual analytics output, visualizing the trajectories of ZigBee tags over time in a large studio at a university. This evaluation provides useful visual cues for understanding people movement in a public space, e.g. a museum or exhibition, as well as for analyzing social networking spaces and collaboration styles in a work environment.

Fig. 8. Map View of the 9 tracking beacons (ZigBee Coordinator and Receivers)

When the location data are correlated with environmental sensor data, which are also managed on the EnviS sensor cloud, dynamic changes in the environment can be used as contextual data that may trigger changes in people movement and the dynamics of their behaviours. Conversely, the flux of people movement may provide context for changes in the indoor thermal environment. Another future direction is to test other positioning techniques, such as Markov chain inference and line-of-sight (LOS) methods, in analysing the ZigBee tracking data, and to employ activity recognition from smartphone sensor data in conjunction with positioning data to provide a more efficient and accurate indoor localization technique.

Fig. 9. 3D Visualisation of ZigBee RSSI measurement on the room layout

Fig. 10. Visual trajectories of individual ZigBee tags over time

ACKNOWLEDGMENT
The authors wish to thank Dr. Jane Burry, Daniel Prohasky, Philip Belesky, and participants of the Sense and Sustainability workshop from RMIT, Melbourne, and Universitat Politècnica de Catalunya (UPC), Barcelona, for their support and contribution to the Attractors project reported in the paper. We would also like to thank Karin Hofert and Eloi Coloma from UPC for the opportunity given to us to run the workshop and test our indoor localization and tracking system in the UPC ETSAB main hall.

REFERENCES
[1] ZigBee Specification v.1.0, ZigBee Alliance, 2013.
[2] IEEE Standard 802.15.4: Wireless Medium Access Control (MAC) and Physical Layer (PHY) Specifications for Low-Rate Wireless Personal Area Networks (LR-WPANs), IEEE, 2003.
[3] A. S.-I. Noh, W. J. Lee, and J. Y. Ye, "Comparison of the Mechanisms of the Zigbee's Indoor Localization Algorithm," in Ninth ACIS International Conference on Software Engineering, Artificial Intelligence, Networking, and Parallel/Distributed Computing (SNPD '08), 2008, pp. 13–18.
[4] M. Sugano, "Indoor localization system using RSSI measurement of wireless sensor network based on ZigBee standard," in Wireless and Optical Communications, 2006, pp. 1–6.
[5] S.-Y. Lau, T.-H. Lin, T.-Y. Huang, I.-H. Ng, and P. Huang, "A Measurement Study of Zigbee-based Indoor Localization Systems Under RF Interference," in Proceedings of the 4th ACM International Workshop on Experimental Evaluation and Characterization, New York, NY, USA, 2009, pp. 35–42.
[6] A. Alamri, W. S. Ansari, M. M. Hassan, M. S. Hossain, A. Alelaiwi, and M. A. Hossain, "A survey on sensor-cloud: architecture, applications, and approaches," International Journal of Distributed Sensor Networks, vol. 2013, Feb. 2013.
[7] M. M. Hassan, B. Song, and E.-N. Huh, "A framework of sensor-cloud integration opportunities and challenges," in Proceedings of the 3rd International Conference on Ubiquitous Information Management and Communication, New York, NY, USA, 2009, pp. 618–626.
[8] A. Kansal, S. Nath, J. Liu, and F. Zhao, "SenseWeb: An infrastructure for shared sensing," IEEE MultiMedia, vol. 14, no. 4, pp. 8–13, Oct. 2007.
[9] R. Piyare and S. R. Lee, "Towards Internet of Things (IoTs): Integration of Wireless Sensor Network to Cloud Services for Data Collection and Sharing," arXiv e-print 1310.2095, Oct. 2013.
[10] Z. Guo, C. Liu, Y. Feng, and F. Hong, "CCSA: A Cloud Computing Service Architecture for Sensor Networks," in 2012 International Conference on Cloud and Service Computing (CSC), 2012, pp. 25–31.
[11] OpenGIS Sensor Model Language (SensorML), http://www.opengeospatial.org/standards/sensorml
[12] L. Mottola and G. P. Picco, "Programming Wireless Sensor Networks: Fundamental Concepts and State of the Art," ACM Comput. Surv., vol. 43, no. 3, pp. 19:1–19:51, Apr. 2011.
[13] Arduino, http://www.arduino.cc/
[14] R. Liu and I. J. Wassell, "Opportunities and challenges of wireless sensor networks using cloud services," in Proceedings of the Workshop on Internet of Things and Service Platforms, New York, NY, USA, 2011, pp. 4:1–4:7.
[15] K. Lee, D. Murray, D. Hughes, and W. Joosen, "Extending sensor networks into the Cloud using Amazon Web Services," in 2010 IEEE International Conference on Networked Embedded Systems for Enterprise Applications (NESEA), 2010, pp. 1–7.
[16] H. Liu, H. Darabi, P. Banerjee, and J. Liu, "Survey of Wireless Indoor Positioning Techniques and Systems," IEEE Transactions on Systems, Man, and Cybernetics, Part C: Applications and Reviews, vol. 37, no. 6, pp. 1067–1080, 2007.
[17] S. A. Mitilineos, D. M. Kyriazanos, O. E. Segou, J. N. Goufas, and S. C. Thomopoulos, "Indoor localisation with wireless sensor networks," Progress In Electromagnetics Research, vol. 109, pp. 441–474, 2010.