This story is actually one of mine. Before I moved to JUMO I was Head of Product for a company I helped build in Tanzania.
One problem with working in Tanzania is the lack of prevalent internet access. It’s getting better, yes, but there are still huge areas with no telecom coverage. Network connections are unreliable, and connectivity is not guaranteed even in the major cities. So when we decided to build a sales and distribution company, one that tracks trucks using 3G-connected tablets, we needed to make sure these things would work no matter where they were. How do you build a product whose value-add is tracking and monitoring the sales delivery channel, when you can’t track or monitor for lack of internet? Well, we thought we could do it, and what follows is a quick summary of how.
Database: PostgreSQL
Data layer: Custom-built data access layer
Proxy server: Proxy for caching data
To allow for an “offline capable” system, we moved from a client-server scenario to a more modern and lightweight API/browser-based approach. We needed to structure the JavaScript code in a way that facilitated both presentation and caching. Our “dashboards” are where users complete all their interactions with the system. To make this work, we used Backbone’s hashtag routing implementation, and then presented the data from the server using Handlebars.js templates combined with JSON data from our custom-designed data access server. We also cache the data on the local device, using a proxy that is likewise implemented in JavaScript. The same proxy sends data back to the server and includes the ability to cache requests when the device is offline.
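As a rough illustration, here is a minimal sketch of that routing-and-rendering pattern. The route name, template ID, and proxy interface are hypothetical examples for this post, not lifted from our codebase:

var DashboardRouter = Backbone.Router.extend({
  routes: {
    'customers': 'showCustomers'  // matches the #customers hash fragment
  },
  showCustomers: function () {
    // Ask the proxy (described below) for data, so cached results
    // can be served even when the device is offline.
    proxy.get('customer', function (customers) {
      var template = Handlebars.compile($('#customer-list-template').html());
      $('#dashboard').html(template({ customers: customers }));
    });
  }
});

new DashboardRouter();
Backbone.history.start();  // start listening for hash changes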
The caching works as follows:
Datasource Component: This is responsible for sending requests to the server. It speaks REST, so a request is always something like GET /customer to fetch all customers. This is far faster than what we had implemented previously. It works the same way in the other direction, where POST /customer {name: ‘etc’…} handles creating customers.
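In code, the Datasource component amounts to a thin REST wrapper. Here is an illustrative sketch assuming jQuery’s ajax API; the object and method names are our own inventions for this example:

var Datasource = {
  get: function (resource, callback) {
    // e.g. GET /customer returns all customers as JSON
    $.ajax({ url: '/' + resource, type: 'GET', dataType: 'json' })
      .done(callback);
  },
  send: function (resource, data, callback) {
    // e.g. POST /customer {name: ...} creates a customer
    $.ajax({
      url: '/' + resource,
      type: 'POST',
      contentType: 'application/json',
      data: JSON.stringify(data)
    }).done(callback);
  }
};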
Proxy Component: The application does not interact directly with the Datasource component; instead it talks to the proxy, which is responsible for caching and for fetching and sending data to the server via the Datasource component. The two main methods are “GET” and “SEND.” When getting data, the proxy first looks in the cache to see whether the resource is available on the device. If not, and the device is connected, it acquires the resource from the server and inserts it into the cache. The cache is a key-value store where each key corresponds to a resource identifier; for example, “customer/5” is one entry. The proxy’s other responsibility is to send data back to the server. This is straightforward when connectivity is available, in which case the request is simply forwarded to the Datasource component. If the device is offline, the request is instead encapsulated and pushed onto a queue, which is processed once the device is online again. We use RabbitMQ to handle auxiliary requests, such as removing keys and receiving notifications.

Client Component: The client component ties these parts together, handles initialization, and provides the basic interface used by the dashboard application. Finally, a jQuery plugin handles the rendering of Handlebars templates with the JSON data received from the proxy.
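To make the GET/SEND flow concrete, here is a hedged sketch of the proxy logic. It assumes localStorage as the on-device key-value store and navigator.onLine for connectivity checks; our actual implementation differed in the details:

var Proxy = {
  get: function (key, callback) {
    var cached = localStorage.getItem(key);  // e.g. key = 'customer/5'
    if (cached !== null) {
      callback(JSON.parse(cached));          // serve from the device cache
    } else if (navigator.onLine) {
      Datasource.get(key, function (data) {
        localStorage.setItem(key, JSON.stringify(data));  // insert into cache
        callback(data);
      });
    }
    // offline with no cached copy: nothing to return in this sketch
  },
  send: function (resource, data) {
    if (navigator.onLine) {
      Datasource.send(resource, data);       // forward immediately
    } else {
      // Offline: encapsulate the request and queue it for later replay
      var queue = JSON.parse(localStorage.getItem('outbox') || '[]');
      queue.push({ resource: resource, data: data });
      localStorage.setItem('outbox', JSON.stringify(queue));
    }
  },
  flush: function () {
    // Called when connectivity returns: replay queued requests in order
    var queue = JSON.parse(localStorage.getItem('outbox') || '[]');
    queue.forEach(function (req) { Datasource.send(req.resource, req.data); });
    localStorage.setItem('outbox', '[]');
  }
};
window.addEventListener('online', Proxy.flush);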
It was fun looking back at this and remembering those days. The larger point is that building applications for Africa means designing for unreliable connectivity from day one: offline has to be a first-class state, not an error case.