The Digitise any Physical Space BoosterPack provides a deployable Sample Solution that allows users to observe and study how Bluetooth Low Energy Real-Time Location Systems (BLE RTLS) and Kibana solve the problem of automatically collecting and reporting data about who/what is where/how within the physical spaces of a business’s operations. The purpose of this document is to describe the Sample Solution and how it demonstrates the use of these advanced technologies.
The Internet of Things holds many promises, with one standing above all others: the ability for computers to observe the physical world – without the need for human-entered data. Data entry is a tedious task, and in many applications it is impractical or infeasible for a human to capture the information that would allow computers to create efficiencies.
If it were possible for computers themselves to identify, locate and interpret the people, products, and places within any physical space using machine learning, they could help us to reduce waste, increase productivity, eliminate nuisances, and elevate experiences.
Thanks to the recent explosion in the number of radio-identifiable products, specifically with the BLE and RAIN RFID standards, it is increasingly viable for computers to observe the human occupants of a physical space through their smartphones and wearables, as well as other products through inexpensive radio-frequency identification (RFID) tags. Moreover, the infrastructure to detect these radio signals is increasingly prevalent. In many cases, all that is missing is software and a simple integration to pull it all together so that computers can “digitize” the physical space in real-time.
While it is possible today to find products and solutions which “digitize” some aspect of a physical space in pursuit of a specific application, these leave myriad potential complementary applications unrealized. One approach that facilitates the realization of these applications is software that “digitizes” everything to the extent possible and provides APIs for application-specific software to leverage and share.
This Sample Solution showcases how BLE RTLS and Kibana can be used to address the challenge of automatically collecting and reporting data about a business’s physical operations.
This solution makes it possible to “digitize” any physical space and its occupants to the extent possible. It provides APIs to access the real-time information about who/what is where/how, facilitating the development of application-specific software.
As such, it facilitates the co-existence of multiple complementary applications, each consuming a web-standard stream of data from a shared physical infrastructure. Moreover, it frees SMEs from having to develop extensive knowledge about the underlying hardware and software involved in RTLS, RFID, and machine-to-machine communication (M2M), allowing them to focus on application development based on their specific domain expertise.
The software is easily configured to ingest real-time radio-decoding data from a variety of devices, and provides both REST and Socket.io APIs to consume the contextualized data. The software offers modular connectors to common databases and platforms, specifically Elasticsearch and Kibana integration. Finally, the software includes a variety of web applications to showcase the potential of the data stream and to facilitate exploration and understanding by end-users.
The diagram below illustrates the structure of the Sample Solution.
Significant components used in the Sample Solution are summarized in the table below:
|BLE Transmitters||Bluetooth Low Energy devices that spontaneously transmit advertisement packets. These include smartphones, wearables, key trackers, beacons, smart appliances, etc. (see BLE as Active RFID)|
|BLE Receivers||Gateway devices that listen for BLE advertisement packets and, in the case of the Sample Solution, relay these to a software instance for processing and aggregation. (e.g. Raspberry Pi 3+)|
|DAIR Cloud Platform||The DAIR Cloud Platform provides a web-based user interface for participants to access DAIR cloud resources.|
|hlc-server||Open source software package which combines all of the core reelyActive open source modules (see detailed diagram below) to implement the described solution. Ingests, processes and aggregates radio decoding data from various sources such that it may be consumed by push/pull APIs to observe in real-time who/what is where/how.|
|Elasticsearch||Open source database in which the collected data is stored.|
|Kibana||Open source software from which the collected data is retrieved, visualized and reported. See the reelyActive Kibana integration overview for examples of reporting and analytics.|
|Web Interfaces||Both hlc-server and Kibana provide user-friendly web interfaces to observe, manipulate and report the real-time and stored data.|
This section guides you through a demonstration of a Hyperlocal Context Server. Using an RTLS is compelling because it automates the collection of data about a business’s physical operations, facilitating the derivation of insights necessary for continuous improvement.
The demonstration will illustrate the deployment of the RTLS software, database, and analytics suite in the cloud using DAIR.
Configuration and Application Launch
Browse to cloud.canarie.ca and log in to the Morpheus dashboard with the credentials provided to you by DAIR.
- Select Provisioning
- Select Apps
- Click on the +ADD button
- Select the REELYACTIVE-HLC-ELASTICSEARCH-KIBANA blueprint
- Click on the NEXT button
- Select a NAME for the App
- Select AWS-Canada as the GROUP
- Select AWS-Canada as the DEFAULT CLOUD
- Select any ENVIRONMENT (this has no effect on the Sample Solution)
- Select the ubuntu part of the structure hierarchy
- Update the Configuration Options, selecting, at a minimum, a T2 Medium – 2 Core, 4GB Memory PLAN
- Update Resource Pool (select your default resource pool)
- Update Networks (select either of your availability zones “az”)
- Update Security Groups (select your default security group) and select “Assign EIP” option for Public IP
- Update the Advanced Options (see below)
- Optionally, enter any ENVIRONMENT VARIABLES (see the hlc-server documentation for advanced use cases such as custom ports)
- Select Expose Ports
- Expose hlc | 3001 | tcp and kibana | 5601 | tcp
- Confirm under Automation that the reelyActive-hlc-elasticsearch-kibana workflow is selected
- Click on the NEXT button (once all configuration is complete)
- Click on the COMPLETE button after reviewing the App configuration
Observe that the app, with the chosen name, has been added to the Provisioning/Apps page. The STATUS will be a rocket ship during provisioning. Click on the app name to browse to the provisioning view.
The provisioning view will list additional properties. Click on the reelyActive-hlc-elasticsearch-kibana instance for the instance view.
The instance view will list additional properties. Once provisioning is complete, the STATUS will display a green play button. In the VMS section, observe the LOCATION which should display the IP address assigned to the instance.
You will also need to update your default Network Security Group with new rules to allow access to the HLC web application and Kibana on ports 3001 and 5601, respectively.
Once these rules are in place, you will be able to point your web browser to this IP address, port 3001. For instance, in the example above this would be 220.127.116.11:3001.
- Select the Infrastructure menu
- Select Network option
- Click on the Security Groups tab
- Click on the Name of the Security Group for AWS cloud to edit the rules
- Click the + Add Rule button
- Fill in the Rule form as shown above to open access to TCP ports 3001 and 5601 (one rule for each)
- Click Save Changes
Now you can point your web browser to the instance IP address, port 3001, to access the HLC web application. For example, 18.104.22.168:3001 as shown below.
A landing page similar to the one shown above should be displayed. It is normal for the software to show no data at this point: no RTLS hardware is forwarding data to the instance until it has been configured.
Forwarding Data to the Sample Solution Instance
The hlc-server software listens for UDP traffic on port 50001 containing (binary) encoded raddec (RADio DECoding) packets. There are several ways to forward raddecs to the instance: with purpose-built hardware such as an Owl-in-One, with off-the-shelf hardware such as a Raspberry Pi, or simply through software.
Forwarding Data with an Owl-in-One
An Owl-in-One is a reelyActive product consisting of open hardware which runs open source software for Node.js. An Owl-in-One can easily be configured to forward BLE decoding data to the specific IP address of the Sample Solution instance by following this tutorial. CANARIE has some of these devices available “for loan”. If you would like to borrow a device for your evaluation of this BoosterPack, please contact: DAIR.Admin@canarie.ca.
Forwarding Data with a Raspberry Pi
The Raspberry Pi 3 and its contemporaries include an on-board BLE radio and can be configured to run reelyActive open source software by following this tutorial. With several lines of additional code (see the raddec package and the UDP example below), it is possible to forward the raddec stream to the Sample Solution instance.
Forwarding Data in Software
It is possible to generate and forward a raddec packet with the following Node.js code.
const dgram = require('dgram'); // Node.js built-in UDP module
// Binary-encoded raddec packet (see the raddec library for the format)
const raddec = Buffer.from('10001702aabbccddeeff013a0101001bc509408100000b', 'hex');
const client = dgram.createSocket('udp4');
// Send one raddec via UDP to port 50001 of the Sample Solution instance
client.send(raddec, 0, raddec.length, 50001, '22.214.171.124'); // Set IP address!
Paste the code above into a file called forward.js. Then from the command line run node forward. A single raddec will be sent via UDP to the Sample Solution instance.
Observing Data in Kibana
Data can be observed in Kibana under two conditions:
- At least one radio decoding (raddec) packet has been forwarded to the Sample Solution instance (see above).
- Kibana is explicitly configured to be accessible on the web (by default it is only accessible on the localhost).
Configure Kibana for Remote Access
Kibana can be configured for remote access by adding/editing one line of its configuration as follows:
- ssh into the Sample Solution instance (ex: ssh email@example.com) or use the web console available in the Console tab of the Instance View.
- Open the kibana.yml file for editing with the command sudo vi /etc/kibana/kibana.yml
- Add the following line to the bottom of the file: server.host: "0.0.0.0"
- Save the file
- Restart Kibana with the command: sudo systemctl restart kibana.service
It should now be possible to browse to Kibana on port 5601 of the Sample Solution instance (ex: 126.96.36.199:5601).
Create the raddec index pattern in Kibana
In Kibana, click on the Discover icon from the left bar to be prompted to create an index pattern. Then:
- Type raddec in the index pattern box and click Next Step to continue
- Select timestamp in the Time Filter field name dropdown and click Create index pattern to continue
- Click on the Discover icon again and observe raddec data (if necessary, adjust the time range to a period where raddec data is present)
Create Reports and Visualisations in Kibana
See the Kibana integration overview tutorials for examples of reports and visualisations.
To free up resources, from the Instances view, select the instance and from the ACTIONS menu button, select Delete. When prompted, ensure Release EIP is selected and click on the DELETE button.
This section describes considerations for usage and adaptation of the reference solution.
The hlc-server open source software package can be deployed on anything from a Raspberry Pi to a high-end cloud server, as it is designed for accessibility and versatility. Up to a given throughput of real-time location data, simply providing an adequate CPU is sufficient to ensure good performance. However, beyond a certain threshold, it is more efficient to optimize the architecture than to simply scale up the CPU. This is discussed under Scalability Considerations below.
The Elastic stack can also be deployed on a different machine than the hlc-server, allowing tailored resources for the very different needs of each.
With respect to real-time location hardware (the devices that detect and relay radio packets to the RTLS software), there are many vendors and technologies to consider. The most widespread Active RFID technology is Bluetooth Low Energy, and its Passive RFID counterpart is RAIN RFID. In the case of BLE, an off-the-shelf device such as the Raspberry Pi 3 can act as a receiver, and relay the packets with our open source software. In the case of RAIN RFID, more elaborate hardware is required, which can be sourced from a variety of vendors.
With respect to open source technology-agnostic RTLS software, we are not aware of any alternatives.
With respect to databases and analytics suites, there are many alternatives to the Elastic stack. In most cases, it would be straightforward to write a connector (equivalent to barnacles-elasticsearch) to integrate another database.
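As an illustration of what such a connector involves, the sketch below stores events in memory instead of a real database. The handleEvent(name, data) method name is modeled loosely on barnacles-elasticsearch but should be treated as an assumption; consult that package for the actual contract.

```javascript
// Hypothetical connector sketch: an in-memory stand-in for a database
// client. The handleEvent(name, data) signature is an assumption modeled
// loosely on barnacles-elasticsearch.
class BarnaclesMemoryStore {
  constructor(options) {
    this.options = options || {};
    this.events = []; // stand-in for a real database
  }

  // Called for each outbound event from barnacles
  handleEvent(name, data) {
    this.events.push({ name: name, data: data });
  }
}

const store = new BarnaclesMemoryStore({});
store.handleEvent('raddec', { transmitterId: 'aabbccddeeff' });
```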
The output of the Sample Solution is data, specifically a stream of data points representing who/what is where/how. In a purely real-time application (where no data is stored), the only consideration is to make use of the data as it is produced. In applications where data is stored, there are many more considerations.
Where to store the data?
The type of database or medium in which to store the historic data will impact the cost and the performance of retrieval and manipulation of that data. The geographic location of the computing resources on which the data is stored may also be a consideration. Legal or contractual requirements may stipulate that the data be stored in the country or region where it is produced.
How long to store the data?
A real-time location system running 24/7 can produce a significant amount of data, which, if not archived or destroyed after a given time, can lead to degradation of system performance and significant additional costs.
What data to store?
A Bluetooth Low Energy RTLS can collect real-time location data about all BLE devices present in the space(s) under observation. In the case where only specific devices should be tracked (e.g. tagged assets) and others can be ignored (e.g. smartphones and wearables), simply whitelisting the devices of interest can reduce the amount of stored data, and hence costs, significantly.
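In software, such a whitelist can be as simple as a set of transmitter signatures checked before storage. The sketch below is illustrative only; the id/idType signature format is an assumption modeled on the raddec library.

```javascript
// Hypothetical whitelist of tagged assets; the 'id/idType' signature
// format is an assumption modeled on the raddec library.
const whitelist = new Set([
  'aabbccddeeff/2',
  '112233445566/2'
]);

// Decide whether a raddec should be stored or ignored
function shouldStore(raddec) {
  return whitelist.has(raddec.transmitterId + '/' + raddec.transmitterIdType);
}

const keepAsset = shouldStore({ transmitterId: 'aabbccddeeff', transmitterIdType: 2 });
const dropPhone = shouldStore({ transmitterId: 'ffeeddccbbaa', transmitterIdType: 2 });
```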
The Sample Solution is designed for convenience of experimentation rather than for secure production deployment. By default, the software will accept input data (in the form of UDP packets) from any source and provide API access without authentication.
Security of ingress and egress data, if required, is left to the user. For ingress, the simplest solution is to apply firewall rules (e.g. ufw on Ubuntu) to accept inbound UDP packets only from specific IP addresses. For egress, a simple solution is to install and configure NGINX such that basic authentication is required to access the API and web apps.
There are no significant networking considerations aside from industry-standard best practices.
The Sample Solution’s scalability is limited by the data throughput and the available computing resources (primarily CPU). Past a certain throughput, it is more effective to parallelize the architecture rather than scale up the CPU.
With respect to the hlc-server software, in a high-throughput application, it may be more effective to run multiple instances of the barnacles sub-package, balancing the load between them based on the radio-identifiers of the incoming data stream. In other words, multiple barnacles instances can operate completely independently, provided that data from each specific radio-identified device is always routed to the same barnacles instance.
With respect to Elasticsearch and Kibana, in a high-throughput application, it is recommended to observe the best practices for the Elastic stack. Running these on the same machine as hlc-server, as in the Sample Solution, will only scale to a limited extent. The hosted Elasticsearch Service instead provides flexible scalability, albeit for a fee.
The Sample Solution is not specifically designed to maximise availability, but has nonetheless demonstrated high availability when operating within its scaling limits. Operating parallel instances, as described in the Scalability Considerations section, is recommended where availability is a critical factor.
The hlc-server software package of the Sample Solution includes a number of open source web applications which serve as a user interface. These web applications are written in HTML, CSS and vanilla JS (no frameworks) for readability and ease of modification/extension. Users are encouraged to customise and extend these web applications, and to share with the community.
Most real-time web applications are built using beaver.js, which abstracts away the interaction with the WebSocket API so that developers can focus on building the application itself.
The APIs provided by the hlc-server software package of the Sample Solution are sufficient for most use cases. Should an extended or revised API be required for data access, it is recommended to create a barnacles interface package. Should an API be required to ingest data from third-party RTLS hardware, it is recommended to create a barnowl listener package.
The APIs may also be wrapped in a layer of security and/or authentication if required.
The Sample Solution is I/O intensive and most costs are related to the continuous processing of a real-time data stream. Aside from optimizing the cloud hardware specifications to manage costs, a compelling alternative is to push as much processing as possible to the edge, offloading the cloud.
A good edge/cloud balance is often achieved by running barnowl at the edge and barnacles in the cloud. In this case, barnowl buffers data for one second (by default) resulting in significant (lossless) compression, reducing bandwidth and upstream processing requirements.
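The kind of lossless compression this buffering achieves can be illustrated with a small aggregation function: multiple decodings of the same transmitter, seen by different receivers within the window, collapse into a single raddec carrying one rssiSignature entry per receiver. The property names below mirror the raddec library but are assumptions here.

```javascript
// Aggregate individual decodings into one raddec per transmitter.
// Property names (transmitterId, rssiSignature, etc.) are assumptions
// modeled on the raddec library.
function aggregate(decodings) {
  const byTransmitter = new Map();
  for (const d of decodings) {
    if (!byTransmitter.has(d.transmitterId)) {
      byTransmitter.set(d.transmitterId, {
        transmitterId: d.transmitterId,
        rssiSignature: []
      });
    }
    byTransmitter.get(d.transmitterId).rssiSignature.push({
      receiverId: d.receiverId,
      rssi: d.rssi
    });
  }
  return Array.from(byTransmitter.values());
}

// Three decodings within one window become two raddecs
const raddecs = aggregate([
  { transmitterId: 'aabbccddeeff', receiverId: '001122334455', rssi: -70 },
  { transmitterId: 'aabbccddeeff', receiverId: '665544332211', rssi: -82 },
  { transmitterId: '112233445566', receiverId: '001122334455', rssi: -65 }
]);
```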
The hlc-server open source software package of the Sample Solution is MIT licensed. This is a permissive license where, in most cases, the only consideration for a user/developer is the following clause:
The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.
The open source versions of Elasticsearch and Kibana are licensed under the Apache License Version 2.0. Alternative versions of these products use the Elastic License.
The source code of the hlc-server and the reelyActive software packages on which it depends can be found on reelyActive’s GitHub account: github.com/reelyactive
The open source Elasticsearch and Kibana can be found on Elastic’s GitHub account: github.com/elastic
The following terminology, as defined below, may be used throughout this document.
|IoT||Internet of Things|
|BLE||Bluetooth Low Energy|
|RTLS||Real-Time Location System|
|raddec||RADio DECoding (see raddec library)|
|API||Application Programming Interface|