I cannot get your example to decode in the decoder. Also, it's not always going to be 48 bits; that depends on the actual value.
But even binary CBOR requires too many bytes for just a simple value, like in this example:
In the RDK example, your temperature sensor reads 23 degrees Celsius, and you'd like to see that value in Maker. You can use cbor.me to convert the payload to CBOR; e.g., {"temperature": 23} translates to A1 6B 74 65 6D 70 65 72 61 74 75 72 65 17. While the plain text {"temperature": 23} would be 19 bytes, the CBOR'd version is still 14 bytes for a single integer value!
Even when using {"t": 23}, you'd still get 4 bytes for A1 61 74 17. A fixed-position encoding, with an unrealistic temperature accuracy of 2 decimals, would allow for -327.68 through 327.67 in just 2 bytes.
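To make that concrete, here's a rough sketch of such a fixed-position encoding (shown in TypeScript/Node.js just to show the arithmetic; on the device itself this would be a few lines of C):

```typescript
// Temperature × 100 as a signed, big-endian 16-bit integer: 2 bytes,
// covering -327.68 through 327.67 with 2 decimals.
function encodeTemperature(celsius: number): Buffer {
  const buf = Buffer.alloc(2);
  buf.writeInt16BE(Math.round(celsius * 100), 0);
  return buf;
}

function decodeTemperature(bytes: Buffer): number {
  return bytes.readInt16BE(0) / 100;
}

console.log(encodeTemperature(23.4).toString('hex'));       // "0924"
console.log(decodeTemperature(Buffer.from([0x09, 0x24])));  // 23.4
```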
Adding a decimal in CBOR, {"t": 23.5} yields 6 bytes in A1 61 74 F9 4D E0. But other values might go crazy: {"t": 23.4} yields the whopping 12 bytes of A1 61 74 FB 40 37 66 66 66 66 66 66. Play around with http://cbor.me/ before considering using CBOR.
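Or check such sizes in code rather than in the browser; here's a minimal sketch, assuming Node.js with the npm cbor package installed. (The byte counts for the float values may differ per encoder, as an encoder is free to pick a wider float type than cbor.me does.)

```typescript
// Size check for the example payloads; assumes `npm install cbor`.
import * as cbor from 'cbor';

const payloads = [
  { temperature: 23 },  // 14 bytes: A1 6B 74 65 6D 70 65 72 61 74 75 72 65 17
  { t: 23 },            //  4 bytes: A1 61 74 17
  { t: 23.5 },          //  6 bytes if encoded as a half-precision float
  { t: 23.4 },          // 12 bytes: ends up as a full double-precision float
];

for (const payload of payloads) {
  const bytes = cbor.encode(payload);
  console.log(JSON.stringify(payload), bytes.length, 'bytes:', bytes.toString('hex'));
}
```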
So, even though you're not using plain JSON, I'd fix the problem at its root cause and try to get rid of CBOR. Unless your device sends a lot of different types of messages, it's just too much overhead to use in LoRaWAN. And even if it does send a lot of different messages, then see, e.g., the forum topics "Multiple sensors with different payload decoding" and "How to best write an application that contains many nodes with different measure data types".
If you really cannot get rid of it, then you'll need to copy a CBOR helper into the payload format in the TTN Console, and use that helper in the Decoder function. See Implementations on the CBOR website.
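As a rough sketch (not a working decoder on its own), the wiring could then look like the following. It assumes the helper you copied exposes a decode function; I'm calling it cborDecode here, but the real name depends on the implementation you pick. Also strip the type annotations before pasting, as the TTN Console expects plain JavaScript.

```typescript
// Hypothetical: whatever CBOR helper you pasted above the Decoder is
// assumed to expose a function that turns a byte array into an object.
declare function cborDecode(bytes: number[]): any;

function Decoder(bytes: number[], port: number): object {
  // E.g. A1 61 74 17 should come back as { t: 23 }
  return cborDecode(bytes);
}
```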