One year ago, we backed the Spark Core project on Kickstarter. When our Core arrived, we began exploring what it could do.
Like so many others, I started an automated garden project. Living in Brussels with only a balcony full of plants makes data gathering pretty easy, and having Wi-Fi close by made this a great case for testing some of the neat features the Spark system has to offer.
The Spark Core gives you the option to publish your data: the Core sends it to the Spark Cloud, which pushes it out to the web as an event.
...
char data[64]; // max amount of data you're allowed to send
sprintf(data, "[%i,%.2f,%.2f,%.2f,%i,%i]", varOne, varTwo, varThree, varFour, varFive, varSix);
Spark.publish("yourEventName", data);
...
Spark.publish() publishes an event named "yourEventName" with "data" as the payload. The 64-byte buffer leaves room for the maximum payload plus the terminating null.
On the receiving end, some Node.js client code can listen for these events and take action.
The hybris techblog wrote a nice article about this: "Consuming Spark Core SSE-events via Node.js". They tried the npm eventsource module but had no success with it, and went on to use the solution that the Spark CLI uses.
Not really satisfied with that code, I dove into the eventsource documentation and found that you can add a listener to the EventSource.
var EventSource = require('eventsource');

var eventSourceInitDict = {
  headers: { Authorization: "Bearer your_access_token" }
};
var url = "https://api.spark.io/v1/devices/your_device_id/events/";
var es = new EventSource(url, eventSourceInitDict);

// add your listener
es.addEventListener('yourEventName', function(e) {
  var rawData = JSON.parse(e.data);
  var parsedData = JSON.parse(rawData.data);
  console.log(parsedData); // result: [0,12.14,12.15,548.54,15,457]
});
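Note the double JSON.parse: the Spark Cloud wraps every published event in a JSON envelope (with fields such as ttl, published_at, and coreid), and the envelope's data field is itself a string holding exactly what you passed to Spark.publish, so it takes a second parse to get the array back.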
The next thing you could do is use Nodemailer to send an email when something is wrong.
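As a minimal sketch of that idea: assuming the fourth value in the published array is a soil-moisture reading, and using a hypothetical threshold, mail account, and addresses (none of these come from my actual project), the listener could hand off to Nodemailer like this:

var EventSource = require('eventsource');
var nodemailer = require('nodemailer');

var es = new EventSource(
  "https://api.spark.io/v1/devices/your_device_id/events/",
  { headers: { Authorization: "Bearer your_access_token" } }
);

// Hypothetical mail account; fill in your own credentials.
var transporter = nodemailer.createTransport({
  service: 'Gmail',
  auth: { user: 'you@gmail.com', pass: 'your_password' }
});

es.addEventListener('yourEventName', function(e) {
  var parsedData = JSON.parse(JSON.parse(e.data).data);
  var moisture = parsedData[3]; // assumption: fourth value is the soil-moisture reading

  if (moisture < 300) { // hypothetical "too dry" threshold
    transporter.sendMail({
      from: 'garden@example.com',
      to: 'you@example.com',
      subject: 'Balcony garden alert',
      text: 'Soil moisture is low: ' + moisture
    }, function(err, info) {
      if (err) return console.error(err);
      console.log('Alert sent: ' + info.response);
    });
  }
});

Nodemailer's createTransport/sendMail pair is all a simple alert needs; anything smarter, like rate limiting so you don't get a mail on every event, can hang off the same listener.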