By default, Adaptive Data Rate is off and the Tx Data Rate is set to SF10 (Spreading Factor 10) with a bandwidth of 125 kHz. At SF10 the maximum payload size is 12 bytes.
With this setting, I can successfully send up to 12 bytes using an AT command such as ‘at+send=123456789012’ on my mDot module. But I get an error when I try to send 13 bytes of data using an AT command such as ‘at+send=1234567890123’.
I then turned on ‘Adaptive Data Rate’ on my mDot module to see if I could send more than 12 bytes of payload. Under this setting, if I simply try to send 13 bytes of payload using an AT command such as ‘at+send=1234567890123’, it still fails.
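For anyone scripting this from a host computer, here is a minimal sketch of driving the mDot over its serial AT interface. The port name, baud rate, and the exact behaviour of AT+ADR and AT+TXDR are assumptions to be checked against MultiTech’s AT command reference for your firmware version:

```python
import serial  # pyserial

def at(port, cmd, timeout=5.0):
    """Send one AT command and collect the reply up to OK/ERROR."""
    port.reset_input_buffer()
    port.write((cmd + "\r").encode("ascii"))  # mDot commands end with a carriage return
    port.timeout = timeout
    reply = b""
    while b"OK" not in reply and b"ERROR" not in reply:
        chunk = port.read(64)
        if not chunk:                         # timed out
            break
        reply += chunk
    return reply.decode("ascii", errors="replace")

with serial.Serial("/dev/ttyUSB0", 115200) as mdot:  # port and baud rate are assumptions
    print(at(mdot, "AT+ADR=1"))                # turn Adaptive Data Rate on
    print(at(mdot, "AT+TXDR"))                 # report the current TX data rate
    print(at(mdot, "at+send=123456789012"))    # the 12-byte payload from above
```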
Question 1.
How can I observe and verify that ADR (Adaptive Data Rate) is working with my end device (MultiTech’s mDot module)? In other words, how can I verify that the TTN network server sent a command to my mDot module to change the data rate?
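One way to check is to look at the raw downlink frames in the gateway traffic view and see whether any of them carries a LinkADRReq MAC command. A rough sketch of that check, assuming LoRaWAN 1.0.x (so FOpts are not encrypted), that the MAC command is carried in FOpts rather than on FPort 0, and a hex PHYPayload copied from the console:

```python
# Payload sizes of the downlink MAC commands we may need to skip (LoRaWAN 1.0.x).
DOWNLINK_CMD_LEN = {0x02: 2, 0x03: 4, 0x04: 1, 0x05: 4, 0x06: 0, 0x07: 5, 0x08: 1}

def find_link_adr_req(phy_payload_hex):
    """Return the data rate / TX power requested by a LinkADRReq in FOpts, or None."""
    frame = bytes.fromhex(phy_payload_hex)
    fctrl = frame[5]                     # MHDR(1) + DevAddr(4) -> FCtrl
    fopts_len = fctrl & 0x0F
    fopts = frame[8:8 + fopts_len]       # FOpts follow the 2-byte FCnt
    i = 0
    while i < len(fopts):
        cid = fopts[i]
        if cid == 0x03:                  # LinkADRReq: DataRate_TXPower, ChMask, Redundancy
            dr_txpower = fopts[i + 1]
            return {"requested_dr": dr_txpower >> 4, "tx_power_index": dr_txpower & 0x0F}
        i += 1 + DOWNLINK_CMD_LEN.get(cid, 0)
    return None

# Placeholder usage -- paste a real downlink PHYPayload from the console:
# find_link_adr_req("60...")
```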
Question 2.
For example, let’s say my end device application always sends 100 bytes to a TTN server, and let’s say I have turned on Adaptive Data Rate on my mDot module. Does my end device application simply hand 100 bytes of payload to the LoRa chip, and will the LoRa chip divide the 100 bytes into several small frames and send them to the TTN server based on the current data rate set by the Adaptive Data Rate scheme?
Or does my end device application have to detect the current data rate set on the LoRa module, divide the 100 bytes of payload into smaller data frames, and send them to the TTN server in sequence?
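For what it’s worth, the stack on the module does not fragment application payloads for you (which is why the 13-byte at+send above is rejected); the application has to split the data itself according to the maximum payload size of the current data rate. A sketch of that splitting, with illustrative limits roughly matching US915 without FOpts (check the LoRaWAN Regional Parameters for your region and what your module actually accepts):

```python
# Data rate index -> max application payload in bytes (illustrative, assumed values).
MAX_PAYLOAD = {0: 11, 1: 53, 2: 125, 3: 242, 4: 242}

def split_for_uplink(data, data_rate):
    """Split an application payload into frames the current data rate can carry."""
    limit = MAX_PAYLOAD[data_rate]
    return [data[i:i + limit] for i in range(0, len(data), limit)]

frames = split_for_uplink(b"x" * 100, data_rate=0)  # 100 bytes at DR0 (SF10/125 kHz)
print([len(f) for f in frames])                     # -> [11, 11, ..., 1]: ten uplinks
```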
Question 3.
Is there any good documentation that explains how Adaptive Data Rate works in detail (possibly using an example)?
If you are using staging, ADR is not supported. For preview it may or may not be implemented yet (the target is Q4 2016 according to the roadmap; however, it is listed below the production environment, so it depends on internal priorities at TTN).
What is the current status of the ADR implementation? I use the default “community” environment (console.thethingsnetwork.org) but still see SF12, while the RSSI is -87 and the SNR is 10. I would expect the SF to drop to 7 after a couple of hours under such conditions. My radio has ADR on.
Any ideas?
Hi, there is some documentation here: https://www.thethingsnetwork.org/wiki/LoRaWAN/ADR
After 20 uplinks with good margin, TTN will set the ADR request on the next downlink, created either by you or by the stack thanks to ADRACKReq.
Nevertheless, the LoRaWAN spec states that ADRACKReq should not be sent at SF12, so you have to send a downlink from the application to your node in order to allow TTN to add the ADR request to it.
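For reference, a rough sketch of the decision the network side makes, along the lines of Semtech’s recommended ADR algorithm described on that wiki page. The 10 dB installation margin and the 3 dB per step are assumed defaults here, not necessarily TTN’s exact numbers, and the real algorithm also spends leftover steps on lowering TX power:

```python
# SNR demodulation floor per spreading factor, in dB.
REQUIRED_SNR = {7: -7.5, 8: -10.0, 9: -12.5, 10: -15.0, 11: -17.5, 12: -20.0}

def adr_target_sf(best_snr_of_last_20, current_sf, margin_db=10.0):
    """Spreading factor the network would ask for via LinkADRReq (sketch)."""
    headroom = best_snr_of_last_20 - REQUIRED_SNR[current_sf] - margin_db
    steps = int(headroom // 3)           # one step of roughly 3 dB per spreading factor
    return max(7, current_sf - max(0, steps))

# The SF12 node above with an SNR of 10 dB has plenty of headroom,
# so the network should eventually request SF7:
print(adr_target_sf(10.0, 12))  # -> 7
```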
OK, to be clear: as long as your node does not receive any downlink messages, it will not be able to leverage the ADR functionality? At least if your node initially boots with the data rate set to SF12.
Peter
The next backend release will make it work for devices that don’t receive downlink from the application layer. Also, the release will add support for ADR in the and regions. If testing goes well, you can expect this within the next week or two.
Hi @htdvisser, did this happen yet? I’m experimenting with ADR but not having much success today. The device has been up for more than 20 messages, all with a decent SNR, and I’ve been sending downlink packets of a single byte to various FPort numbers to see if I can provoke it to change.
Yes, this is active on the network. If you look at the “trace” of the downlink messages in the gateway view of the console, you may see some messages about ADR.
But not the “The next backend release will make it work for devices that don’t receive downlink from the application layer”, right? If it is, please see ADR - not what I expected.
look at the “trace” of the downlink messages in the gateway view
Indeed, nice; see this screenshot of the trace of the ADR downlink:
(Though I didn’t see it when the node did not respond to SF9 in RX2 and TTN then repeated the ADR request at SF12.)
Well, I finally tested it on a MultiTech mDot today and found it worked. Looking good!
However, the third-party sensor we’re using still doesn’t elicit any ADR downlink messages. I used Anthony Kirby’s packet decoder and the FCtrl.ADR flag is set to true on all uplink messages. Is there any way to debug what happens when this flag is set and the message is received by TTN?
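As a first cross-check, you can recompute that flag yourself from a raw uplink: the ADR flag is the most significant bit of FCtrl in the FHDR. A minimal sketch (the argument is a hex PHYPayload copied from the gateway traffic page; the example call is only a placeholder):

```python
def uplink_adr_flag(phy_payload_hex):
    """True if the ADR bit is set in the FCtrl byte of an uplink PHYPayload."""
    frame = bytes.fromhex(phy_payload_hex)
    fctrl = frame[5]            # MHDR(1) + DevAddr(4) -> FCtrl
    return bool(fctrl & 0x80)   # bit 7 of FCtrl is the ADR flag

# Placeholder usage -- paste a real uplink PHYPayload from the gateway traffic page:
# uplink_adr_flag("40...")
```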