My hardware works fine with ABP activation, but when I program it for OTAA activation by setting OVER_THE_AIR_ACTIVATION to 1, it does not transmit. I am using the keys generated after registering the device on TTN and selecting the OTAA option. Can anyone help me, please?
Hi All,
It seems that with OTAA activation the node does transmit the Join Request, but no Join Accept comes back to the node and it then goes to sleep. On TTN, my Laird gateway is active and connected; I registered a new application under my gateway and then registered a new device, and on TTN I can see the Join Request. What am I missing? How can I debug and find out whether TTN is sending any data back to the node? Do I need to configure my gateway differently? Just a note: with ABP activation, I can see the data coming through to TTN without any loss. Any help will be really appreciated.
Thanks
Silas
You should also see the Join Accept in TTN Console, in the gateway’s Traffic page (green icon). Not seeing that might indicate you did not copy the keys correctly; beware of MSB and LSB: Hex, lsb or msb? Also click the Join Request (yellow/orange icon) to see if there are any errors, such as OTAA shows "Activation DevNonce not valid: already used".
It seems old screenshots like the one below are no longer valid? Today, I do not see the orange Join Request icon in the gateway’s Traffic page; only the green Join Accept shows there. As usual, the application’s Data page only shows the orange Join Request Activation. I’m not sure that would show there if the keys are invalid, as then TTN might not even know to which application a device belongs:
(Update: one day later, I’m seeing both the Join Request and Join Accept in gateway Traffic for a Kerlink gateway, using the old packet forwarder. But for another node at a different location I still only get the Join Accept in the Traffic of a The Things Gateway.)
Thank you so much for your reply and information!
What I found is that despite changing the DevEui in “commissioning.h”, I was seeing one particular value of it under the gateway Traffic. Going through the code in debug mode, I found that there is a function BoardGetUniqueId(DevEui) which generates the DevEui from ID1 and ID2 of the EFM32GG device. After I commented out that line, I could see data coming through under Application -> Device, with the configured value of DevEui as expected. Under Gateway -> Traffic I see the LoRa packet as a Join Request, and under Application -> Device -> Data I see the Activation.
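For anyone else hitting this: in the stock classA example (at least the LoRaMac-node 4.x version I am working from, so take the exact lines as an assumption), the override happens in DEVICE_STATE_INIT, roughly like this:

#if( OVER_THE_AIR_ACTIVATION != 0 )
    // Overwrites the DevEui configured in commissioning.h with a value
    // derived from the MCU's unique ID registers. Commenting this out
    // keeps the DevEui that was registered on TTN.
    BoardGetUniqueId( DevEui );
#endif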
But then there is no next message. Again in debug mode I found that, after sending the Join Request and the Activation being shown, the code keeps cycling through the sleep-state code lines, and it seems no timer event ever comes that would bring the code execution into transmitting the next message. Any thoughts / ideas where I can look or what to try to debug?
In main(), after the initialization, under #if (OVER_THE_AIR_ACTIVATION != 0), after it sends the LoRaMacMlmeRequest(), the code puts the node into DEVICE_STATE_SLEEP. Hence the node keeps cycling through the sleep code lines and never comes out of it. So instead of DEVICE_STATE_SLEEP I changed it to DEVICE_STATE_SEND, and the node started transmitting frames in OTAA activation mode.
But when I check under Gateway -> Traffic, I only see a series of Join Request / Join Accept sequences; there is no actual record with the counter value going up. So it looks like even after the Join Accept, the node is not recording itself as “JOINED” in the network. So the change may not really be right.
I also checked some other versions of main(), and all of them have code like the below:
if( NextTx == true )
{
    LoRaMacMlmeRequest( &mlmeReq );
}
DeviceState = DEVICE_STATE_SLEEP;
which actually puts the node into the sleep state after sending the join request. So I am sure that must be correct. But using that, the node just goes to sleep…
Can someone point me to LoRaMac-node example code that works with OVER_THE_AIR_ACTIVATION?
I would like to follow the program flow and try to fix my code accordingly…
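In the meantime, comparing with the classA example I have, I think the intended flow is that the node stays in DEVICE_STATE_SLEEP until the MlmeConfirm callback fires for the join; that callback is what should move the state machine on. A sketch of what it looks like there (names taken from the example I am reading, so treat them as an assumption for other versions):

static void MlmeConfirm( MlmeConfirm_t *mlmeConfirm )
{
    switch( mlmeConfirm->MlmeRequest )
    {
        case MLME_JOIN:
        {
            if( mlmeConfirm->Status == LORAMAC_EVENT_INFO_STATUS_OK )
            {
                // Join Accept was received and processed: leave the
                // sleep loop and start sending application frames
                DeviceState = DEVICE_STATE_SEND;
            }
            else
            {
                // Join was not successful (no Join Accept received): retry
                DeviceState = DEVICE_STATE_JOIN;
            }
            break;
        }
        default:
            break;
    }
    NextTx = true;
}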
In order for the node to send uplinks with uplink counter values after it has joined, the node has to have successfully received the Join Accept from TTN, and for that it should get an RXDONE interrupt. On the TTN console, under Gateway -> Traffic, it showed that the server sent the Join Accept.
To make sure I am getting the right digital waveforms, I tried using an oscilloscope. On DIO0 I do get an interrupt for TXDONE, since at just the same moment I can see the Join Request under Gateway -> Traffic. Then after some 4 seconds TTN sends the Join Accept, but for that Join Accept the node does not show any RXDONE interrupt.
So now I am trying to find out why there is no reception of the Join Accept.
In the LoRa DEVICE_STATE_INIT case, there is the following code, which sets the channel and DR for RX2.
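(Reproducing it here for context; it is the same MIB call as quoted later in this thread.)

// Set which channel to use for RX2
mibReq.Type = MIB_RX2_CHANNEL;
mibReq.Param.Rx2Channel = ( Rx2ChannelParams_t ){ 869525000, DR_3 };
LoRaMacMibSetRequestConfirm( &mibReq );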
I am sure those are the correct values, but it looks like that is not working. The node is sitting within 5 ft of the gateway. (Is the node too close?) Somehow the node is not receiving the Join Accept from TTN. I am looking further into my Laird gateway…
Also, using my handheld RF spectrum analyzer, I see two wave bursts: one when the Join Request is sent and one when the Join Accept is received from the gateway. So the gateway does pass on the Join Accept from TTN to the node.
Some more thoughts:
It may be possible that, for the node, the RX window after the Join Request is sent is not long enough, or maybe does not open exactly at the right time when the Join Accept arrives.
Is there any way I can check if the RTC is working fine with millisecond accuracy?
Is there any way I can stretch the RX window (see the sketch after these questions)? Where can I update this setting? I know that the value must follow the LoRa specs.
Is there any latency from the gateway? How can I check that?
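From what I can tell in the LoRaMac-node sources, there is a MIB setting that widens the RX windows to compensate for timer inaccuracy; whether it exists under this name depends on the stack version, so treat the snippet as an assumption rather than a confirmed fix:

// Tell the MAC how inaccurate our wake-up timing may be (in ms);
// the stack then opens the RX windows earlier/longer to compensate.
MibRequestConfirm_t mibReq;
mibReq.Type = MIB_SYSTEM_MAX_RX_ERROR;
mibReq.Param.SystemMaxRxError = 20;   // default is usually 10 ms
LoRaMacMibSetRequestConfirm( &mibReq );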
I am sure the Join Accept is sent using 869.525 MHz with SF12 for RX2; only ‘normal’ traffic uses SF9. When using OTAA you should not modify the RX2 default setting. The back-end will provide the node with the channel list and the RX2 settings. Only for ABP do you need to set the channels and RX2 information yourself (because the back-end is not involved in an ABP ‘join’).
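For reference, the EU868 defaults in LoRaMac-node look like this (quoting the region file from memory, so double-check your version):

// RegionEU868.h: default RX2 window parameters
#define EU868_RX_WND_2_FREQ    869525000   // 869.525 MHz
#define EU868_RX_WND_2_DR      DR_0        // SF12BW125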
Much too close. Keep at least 10 feet between the node and the gateway to avoid channel crosstalk and damaging the input circuits of the receivers.
Thank you for your suggestion. I moved the gateway into another room, about 25 ft away, but with no change in the results; the node is behaving the same, no RXDONE interrupt yet.
Another possible thought: when does the radio chip switch to RX mode? I am guessing after TXDONE, right? If the chip does not go into RX mode at all (by setting the RX/TX control pin low), then it is not going to receive anything and there will be no activity related to signal reception. Is there any way to check in the code whether it writes to any chip register to drive the RX/TX control pin low? Any ideas / input?
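One thing I am thinking of trying (assuming an SX1276-based module and the standard Radio.Read register accessor of the LoRaMac-node radio driver): read back the op-mode register shortly after TXDONE to see whether the chip ever entered RX, something like:

// SX1276 RegOpMode (0x01): the lower 3 bits hold the mode
// 0x05 = RX continuous, 0x06 = RX single, 0x03 = TX, 0x01 = standby
uint8_t opMode = Radio.Read( 0x01 ) & 0x07;
if( ( opMode == 0x05 ) || ( opMode == 0x06 ) )
{
    // Radio did switch to RX after TXDONE
}
else
{
    // Radio never entered RX: RX window timer or antenna switch issue
}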
Just to make sure that this issue is related to software only, I swapped the module for a new one, and it gives me the same result. So I am pretty sure the hardware is fine.
Hi @kersing,
Sorry for the late reply. Yes, I did change the RX2 settings to the correct values.
But it looks like there is an issue with the RTC implementation: the EFM32PG12 has a 32-bit RTCC, which I believe can be used as an RTC as well. Working on it, will post how it goes…
Thanks
Silas
If you are using OTAA, don’t configure RX2, leave it at the default, and don’t add/enable extra channels. After joining, the network server (TTN) will configure it for you.
Finally OTAA is working!! I updated the RTCC implementation, now using the 32-bit main counter, and it works perfectly fine! Also confirmed that configuring RX2 or not does not really help/break the OTAA sequence; like @nestorayuso mentioned, the node automatically gets its configuration from the server.
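In case it helps others with an EFM32 port: the essence of the RTCC change was to read time straight from the RTCC’s 32-bit main counter instead of a narrower counter that wraps too soon. Roughly (the function name follows the rtc-board.c API of the stock ports and RTCC_CounterGet() is the emlib call; adapt to your own port and tick rate):

#include "em_rtcc.h"

// Assuming the RTCC is clocked so that one tick is ~1 ms (e.g. LFXO / 32).
// Returning the full 32-bit main counter avoids early wrap-around,
// which was throwing off the RX1/RX2 window timing.
uint32_t RtcGetTimerValue( void )
{
    return RTCC_CounterGet( );
}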
Thank you all for your valuable suggestions and inputs
Sorry, I disagree. Setting RX2 to anything but the default SF12 and 869.525 MHz will break OTAA. Setting it to different values will only work if the network can respond in RX1; the node will not be able to receive the response in RX2 if the defaults are changed.
My apologies for not making it clearer!
What I mean is: there are code lines for configuring RX2, which of course configure RX2 for 869.525 MHz, as below.
// Set which channel to use for RX2
mibReq.Type = MIB_RX2_CHANNEL;
mibReq.Param.Rx2Channel = (Rx2ChannelParams_t){869525000, DR_3};
LoRaMacMibSetRequestConfirm(&mibReq);
Well, the node functions fine in OTAA even after commenting out the above lines, so it does not matter whether you configure it or not. I did not check setting RX2 to another value.
…which, like @kersing and @nestorayuso mentioned multiple times, is wrong. During the OTAA join, TTN will NOT use that value for RX2. Instead, during the join, it will use the default value. Only after the join will TTN switch to SF9 on 869.525 MHz for subsequent RX2 downlinks, but for OTAA you don’t need to program that manually, as those details are included in the Join Accept.
The fact that it works is only because you’re lucky that in your tests the Join Accept was transmitted in RX1, not in RX2. See the gateway’s Traffic page in TTN Console to confirm that the Join Accept was sent 5 seconds after receiving the Join Request, hence in RX1. You won’t always be that lucky.
For testing, you can try joining using SF11 to increase the chances of RX2 being used, and you’ll see that TTN then uses the default SF12.
For future visitors: can you please explain what you changed exactly? Thanks!
The LoRaMac-node implementation is smart enough to use the default RX2 parameters during the join, regardless of whether you change the RX2 parameters with:
// Set which channel to use for RX2
mibReq.Type = MIB_RX2_CHANNEL;
mibReq.Param.Rx2Channel = (Rx2ChannelParams_t){869525000, DR_3};
LoRaMacMibSetRequestConfirm(&mibReq);