Hello!
I am using a LoRa Stick (LoStik) from Ronoth, which has an RN2483 module (https://ronoth.com/products/lostik).
I did some performance measurements with ADR and I am a bit confused about SNR and the spreading factor.
I found a table in an old Semtech document that shows which SNR is required for each data rate.
Unfortunately, my measurements do not match it.
My device is sending with SF12 most of the time, even though the SNR is about -5 dB. Shouldn't ADR adapt the DR in that case?
I am about 3.5 km away from the gateway.
I would be happy if someone could help me understand ADR fully.
Did you enable ADR on the device, and does it send an ADRACKReq after 64 uplinks? If you can capture a full LoRaWAN packet from the gateway, you can easily check whether an online decoder shows that FCtrl.ADR is set.
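If you'd rather check the bit yourself, here is a minimal Python sketch, assuming a plain LoRaWAN 1.0.x data uplink. The base64 payload at the bottom is just a made-up placeholder; substitute one captured from your gateway's logs:

```python
import base64

def adr_bit_set(phy_payload_b64: str) -> bool:
    """Check the FCtrl.ADR bit in a LoRaWAN 1.0.x uplink PHYPayload.

    Layout: MHDR (1 byte) | DevAddr (4 bytes) | FCtrl (1 byte) | FCnt | ...
    Bit 7 of FCtrl is the ADR flag for uplinks.
    """
    payload = base64.b64decode(phy_payload_b64)
    mtype = payload[0] >> 5
    # 2 = unconfirmed data up, 4 = confirmed data up
    if mtype not in (2, 4):
        raise ValueError("not a data uplink")
    return bool(payload[5] & 0x80)

# Hypothetical captured payload -- replace with one from your gateway.
print(adr_bit_set("QNMaASaAAAABb9aVUPr8AnY="))
```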
Well, that document is about what is theoretically possible at the radio level.
What a given network server chooses to implement as its ADR strategy is a rather different matter.
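To make "strategy" concrete: here is a rough Python sketch of the margin-based decision used in Semtech's reference ADR algorithm, which many servers derive from. The 10 dB installation margin and the 3 dB-per-step rule are taken from that reference; TTN's deployed parameters may well differ:

```python
# Required demodulation-floor SNR per spreading factor (dB), from the
# Semtech table the original post refers to.
REQUIRED_SNR = {7: -7.5, 8: -10.0, 9: -12.5, 10: -15.0, 11: -17.5, 12: -20.0}

INSTALLATION_MARGIN_DB = 10.0  # assumed safety margin; servers pick their own

def adr_suggest_sf(current_sf: int, max_snr_db: float) -> int:
    """Margin-based step, in the spirit of Semtech's reference ADR.

    margin = best recent uplink SNR - required SNR - installation margin;
    every 3 dB of spare margin allows one data-rate step (one SF lower).
    """
    margin = max_snr_db - REQUIRED_SNR[current_sf] - INSTALLATION_MARGIN_DB
    steps = int(margin // 3)
    # Clamp to the EU868 SF range (SF7..SF12); TX-power steps ignored here.
    return min(max(current_sf - steps, 7), 12)

# At SF12 with -5 dB SNR: margin = -5 - (-20) - 10 = 5 dB -> one step down.
print(adr_suggest_sf(12, -5.0))  # 11
```

Note that with a conservative installation margin, -5 dB at SF12 only justifies a single step down, which may be part of what you are seeing.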
If you want to play with high spreading factors to see how they work, you might try setting them manually instead of using ADR. But beware: high spreading factors only make sense for short packets. It's debatable how useful SF11 and SF12 are in the uplink direction, since merely sending the LoRaWAN headers, without any useful data at all, already takes quite a bit of time - in some regions more time than a single transmission is allowed to last. In other regions it is allowed and people do it, but there are definite tradeoffs: the longer a packet is on air, the more chance something interferes with it.
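To put numbers on that, the standard airtime formula from Semtech's LoRa design guide (AN1200.13) can be computed directly. A sketch assuming EU868 settings (125 kHz bandwidth, coding rate 4/5, 8-symbol preamble, explicit header, CRC on) and a 13-byte frame, which is roughly the LoRaWAN overhead alone (MHDR + DevAddr + FCtrl + FCnt + FPort + MIC) with no application payload:

```python
import math

def lora_airtime_ms(payload_bytes: int, sf: int, bw_hz: int = 125_000,
                    cr: int = 1, preamble: int = 8) -> float:
    """Packet airtime per Semtech AN1200.13 (explicit header, CRC on)."""
    t_sym = (2 ** sf) / bw_hz * 1000.0               # symbol duration in ms
    de = 1 if sf >= 11 and bw_hz == 125_000 else 0   # low-data-rate optimization
    n_payload = 8 + max(
        math.ceil((8 * payload_bytes - 4 * sf + 28 + 16) / (4 * (sf - 2 * de)))
        * (cr + 4), 0)
    return (preamble + 4.25 + n_payload) * t_sym

# 13 bytes = LoRaWAN framing alone, no application data.
for sf in (7, 12):
    print(f"SF{sf}: {lora_airtime_ms(13, sf):.1f} ms")
```

That works out to roughly 46 ms at SF7 but over 1150 ms at SF12 for the exact same empty frame, which is why dwell-time and duty-cycle limits bite so quickly at the high spreading factors.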
In terms of network servers, this is the TTN forum, so the ADR behavior of deployed and experimental versions of its software would be on topic, but the ADR behavior of other servers (ChirpStack, etc.) is not.