Here is my actual issue and context:
I’ve configured an Arduino MKR1310 to send CO2 sensor data (CO2 + temperature + humidity) to TTN through a RAK TTN gateway I own. The gateway is correctly connected to TTN and active, and my MKR1310 device is correctly registered on TTN using OTAA. On the TTN Console I can see the received messages: the data arrive in the “Application > Data” section, and they are also shown under “Application > Device > Data”, where I can see that the payload is correct.
On the Arduino side I did this:
When I use your payload 01640000023E02670104036840 (if I copied that from your screenshot correctly) in a Simulate Uplink, then I get the same results: no fields.
But when using, e.g., 03670110056700FF I get what’s expected:
I’m afraid that addGenericSensor is not supported by TTN. So, try with some other type.
Do you have the myDevices Cayenne integration enabled as well? If yes: does Cayenne show anything? I wonder if this is only an issue in TTN Console, not in the myDevices dashboard.
Hi Arjan,
Thank you a lot for your help; you’re right, it’s an issue with a data type that TTN does not support.
After changing “lpp.addGenericSensor” to “lpp.addLuminosity” everything works fine.
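For anyone else hitting this: below is a sketch of what that Cayenne LPP frame looks like on the wire, built by hand in plain C++ so the bytes are visible (the real sketch would of course just call the CayenneLPP library). The channel numbers and the choice of type 0x65 (luminosity, 2-byte unsigned) to carry CO2 ppm are assumptions for illustration; 0x67 is temperature (signed, 0.1 °C steps) and 0x68 is relative humidity (unsigned, 0.5 % steps) per the LPP type table.

```cpp
#include <cstdint>
#include <cmath>
#include <vector>

// Hand-rolled Cayenne LPP frame: each value is prefixed with
// [channel, type]. Types used here:
//   0x65 luminosity (2 bytes, unsigned)          -- reused to carry CO2 ppm
//   0x67 temperature (2 bytes, signed, 0.1 degC)
//   0x68 relative humidity (1 byte, unsigned, 0.5 % steps)
std::vector<uint8_t> buildLppFrame(uint16_t co2ppm, float tempC, float humPct) {
    std::vector<uint8_t> buf;
    // CO2 sent as "luminosity" on channel 1 (big-endian, as LPP expects)
    buf.push_back(1); buf.push_back(0x65);
    buf.push_back(co2ppm >> 8); buf.push_back(co2ppm & 0xFF);
    // Temperature on channel 2, 0.1 degC resolution, signed
    int16_t t = static_cast<int16_t>(std::lround(tempC * 10));
    buf.push_back(2); buf.push_back(0x67);
    buf.push_back((t >> 8) & 0xFF); buf.push_back(t & 0xFF);
    // Humidity on channel 3, 0.5 % resolution
    buf.push_back(3); buf.push_back(0x68);
    buf.push_back(static_cast<uint8_t>(std::lround(humPct * 2)));
    return buf;
}
```

So a reading of 600 ppm, 27.2 °C and 45 % would yield the 11-byte payload `01 65 02 58 02 67 01 10 03 68 5A`, which is why the working test payload above contains `67 01 10` for 27.2 °C.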
And no, I don’t have the myDevices Cayenne integration. My plan is to use my own server with an InfluxDB/Grafana kind of stack. As for cloud platforms, I’m familiar with Blynk, but maybe I should give myDevices Cayenne a try…
PS: There are a lot of IoT cloud integrations with TTN and I don’t really know them, so it’s hard to compare these platforms. Do you have any quick advice for someone who’s trying to do environmental control?
If you’re not using myDevices, and if you’re always sending all sensor readings, then you could save some bytes (and hence: air time) by using your own encoding/decoding.
LPP uses 6 additional bytes to describe your payload (a channel byte and a type byte for each of the three values). There’s always a 13-byte LoRaWAN overhead, so with your own encoding the actual sensor readings would fit into a total of 13 + 2 + 2 + 1 = 18 bytes for the full LoRaWAN packet. (Currently 13 + 4 + 4 + 3 = 24 bytes, so 33% extra overhead for LPP.)
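As an illustration of such a custom encoding (a sketch under assumed resolutions, not any library’s API): pack the temperature as a signed 16-bit value in 0.01 °C steps, CO2 as an unsigned 16-bit ppm value, and humidity as an unsigned byte in 0.5 % steps, for 5 payload bytes total.

```cpp
#include <cstdint>
#include <cmath>

// Custom 5-byte payload layout: [tempHi, tempLo, co2Hi, co2Lo, hum]
//   temperature: signed 16-bit, 0.01 degC steps (big-endian)
//   CO2:         unsigned 16-bit, ppm (big-endian)
//   humidity:    unsigned 8-bit, 0.5 % steps
struct Readings { float tempC; uint16_t co2ppm; float humPct; };

void encodeReadings(const Readings& r, uint8_t out[5]) {
    int16_t t = static_cast<int16_t>(std::lround(r.tempC * 100));
    out[0] = (t >> 8) & 0xFF;
    out[1] = t & 0xFF;
    out[2] = r.co2ppm >> 8;
    out[3] = r.co2ppm & 0xFF;
    out[4] = static_cast<uint8_t>(std::lround(r.humPct * 2));
}

// The same logic, reversed, would go in the TTN payload decoder
// (there in JavaScript); shown in C++ here to keep one language.
Readings decodeReadings(const uint8_t in[5]) {
    Readings r;
    r.tempC  = static_cast<int16_t>((in[0] << 8) | in[1]) / 100.0f;
    r.co2ppm = static_cast<uint16_t>((in[2] << 8) | in[3]);
    r.humPct = in[4] / 2.0f;
    return r;
}
```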
This is even more true if you need to use “fake” types to transfer your data, like using luminosity when the sensor is not actually measuring light at all.
If you’re referring to GitHub - siara-cc/Shox96: Guaranteed Compression for Short Strings then: that’s about compressing text. I also see search results for usage in Arduino PROGMEM. Nice, that would allow for saving some program storage space for readable debug messages on its serial port and all. However, as you won’t/shouldn’t be sending text using LoRaWAN, there’s not much to expect from generic compression libraries there. Only you know what’s the expected range of numerical readings, so only you can decide how many bits are needed for each encoded value.
Like, say you know the expected readings range from 99,999,100 to 99,999,350: then you could just send 0 through 250 in a single byte and add 99,999,100 back in the decoder (and reserve 251 through 255 for error reporting). Any generic compression would not know about that. See Working with Bytes and Best practices to limit application payloads.
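That offset trick could look like the sketch below; the range bounds and the meaning of the reserved error values are just the example numbers from above, not anything standard.

```cpp
#include <cstdint>

// A reading known to lie in [99999100, 99999350] fits in one byte:
// 0..250 carry the offset from the range minimum, while 251..255
// are reserved for error reporting (here: 255 = out-of-range/failure).
const uint32_t kRangeMin   = 99999100;
const uint32_t kRangeMax   = 99999350;
const uint8_t  kSensorError = 255;

uint8_t encodeReading(uint32_t value) {
    if (value < kRangeMin || value > kRangeMax) return kSensorError;
    return static_cast<uint8_t>(value - kRangeMin);  // fits in 0..250
}

// The decoder (on the application side) simply adds the minimum back.
uint32_t decodeReading(uint8_t b) {
    return kRangeMin + b;  // only meaningful for b <= 250
}
```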
You’re totally right, thank you for the links.
I should work with bytes and buffers, and maybe some delta measurement: depending on the “size” of the delta, using a signed 16-bit or a signed 8-bit integer.
Do you think that’s the best way to start data uplink optimization ?
Deltas might not be needed if the sensor’s readings are not too large anyway. For example, I’d not use deltas for temperature measurements, even when I’d expect the readings to be in some range. You never know.
Also, not every reading will be signed: relative humidity, for example, should not be negative, assuming your sensor and the sensor’s library are okay.
But this is getting off-topic for the topic at hand; see the documentation and topics I linked to earlier. Success!