TLDR: I think the ability to toggle an OTAA end device into a development mode, or to reset DevNonce, would be very beneficial.
I’m developing an end node using the Microchip WLR089U0, and am currently using the devboard. I’m using OTAA and have successfully connected and been able to transmit data.
However, each time I modify my firmware I need to rejoin the application. This then results in DevNonce “is already in use” and “is too small” errors. I have occasionally been able to get around this by changing the LoRaWAN version from MAC v1.0.4 back to 1.0.3, but that doesn’t always work. The LoRaWAN stack supplied by Microchip is v1.0.4 compatible.
In my searching I’ve come across a few recommended solutions for this, however I think that I’m missing information.
Is it possible to access this setting via the au1 community console? I know what I need to change, but I just can’t figure out how to do it. Is this something that could be included as an attribute option?
Currently trying to find where to ‘manually’ set the DevNonce value. A search through the demo mote example project for nonce gives no indication of where it’s set. There are three references: two error messages and an output variable within the secure application layer code. The comment is “If the output is encrypted, a 32 byte random nonce generated by the device is returned here. If output encryption is not used, this can be NULL.”
The search continues.
@descartes You make a good point. I did some quick experiments by changing over to ABP, but couldn’t get any messages to be received by the application. The gateway was receiving them; they were just not getting all the way through. My initial testing, using a Dragino Arduino Shield and an LG-01 (shh, don’t tell anyone it was on TTN), used ABP and worked fine once the frame counter checks were disabled. It was also on v2, but that shouldn’t matter.
I’ll have another crack at ABP later. The Microchip codebase is rather large, so it takes a while to get my head around where and when things are happening. Most of the settings that need changing are in conf_app.h.
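For anyone else working through the same demo, the ABP switch in conf_app.h looks roughly like the sketch below. The macro names vary between MLS SDK releases, so treat them as illustrative (only LORAWAN_ABP/LORAWAN_OTAA are from the MLS API itself) and the values as placeholders to be replaced with the keys from the TTN console:

```c
/* Illustrative only: check the conf_app.h shipped with your SDK
 * release for the exact macro names. */
#define DEMO_APP_ACTIVATION_TYPE      LORAWAN_ABP  /* was LORAWAN_OTAA */

/* ABP skips the join, so the session context that OTAA would have
 * derived must be supplied directly: DevAddr plus both session
 * keys, copied from the device page in the TTN console. */
#define DEMO_DEVICE_ADDRESS           0x26000000   /* placeholder DevAddr */
#define DEMO_NETWORK_SESSION_KEY      {0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, \
                                       0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00}
#define DEMO_APPLICATION_SESSION_KEY  {0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, \
                                       0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00}
```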
For those following this later, the function that returns the nonce is only in the certificate version of the mote example, and I’ve been unable to find the implementation of the function. I think the default when not using the certificate is NULL, so if I can figure out where to change that, that’ll be the best solution for LoRaWAN v1.0.x. However, if a change is made to v1.1, the DevNonce will need to increment instead of being random.
The MLS Migration Guide mentions in the MLS_SDK_1_0_P_5 (LoRaWAN v1.0.4) section that:
2. AppNonce and AppEUI renamed to JoinNonce and JoinEUI.
4. DevNonce incremented with every join request.
However, I can’t find a reference to any of those names in the example projects. It’s tricky, because this is the release that adds support for the WLR089U0 in the end device demo application. Maybe I just need to keep submitting join requests until DevNonce is unique.
DevNonces should not be re-used. @kersing’s recommendation to store the counter is the correct approach. This approach is also suggested by the LoRaWAN 1.0.4 specification.
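A minimal sketch of what storing the counter looks like on the device, assuming some non-volatile storage is available (the nvm_* functions below are placeholders, not the Microchip MLS API):

```c
#include <stdint.h>

/* Placeholder NVM accessors: substitute your part's EEPROM/flash
 * driver. These are not real Microchip MLS calls. */
extern uint16_t nvm_read_devnonce(void);
extern void     nvm_write_devnonce(uint16_t value);

/* LoRaWAN 1.0.4 treats DevNonce as a counter that increments with
 * every join request and must never repeat for a given JoinEUI,
 * even across resets, hence the non-volatile storage. */
uint16_t next_devnonce(void)
{
    uint16_t nonce = nvm_read_devnonce();
    nvm_write_devnonce(nonce + 1);  /* persist before transmitting the join */
    return nonce;
}
```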
For development devices (don’t do it with production devices) it is possible to disable nonce checks, just as it’s possible to disable frame counter checks for ABP development devices (again, don’t do it with production devices).
At the moment (but that may change in the future) it’s not possible to disable those checks in the Console, but if you have set up the CLI, you can try the ttn-lw-cli end-devices set command with the --resets-join-nonces flag to allow re-using DevNonces.
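For reference, the invocation looks something like this (application and device IDs are placeholders):

```
ttn-lw-cli end-devices set my-app-id my-device-id --resets-join-nonces
```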
Thanks for the info everyone. I’ll transition to ABP for testing, and back to OTAA once finished.
So, long story short, I’m trying to get ABP working on the WLR089U0 dev module using the end device example project supplied by Microchip. With OTAA I can see the traffic at the gateway; with ABP I get no data, or the occasional CRC-ERR message. I feel like there’s a problem somewhere deeper within the codebase; it just takes a while to figure out how it’s meant to work before diagnosing problems.
My findings so far, for posterity:
OTAA and ABP are set up by default to use different spreading factors.
CR changes between 4/5, 4/6, 4/8 and OFF. (Not sure what CR is yet).
At that point I was more looking for differences between what was being reported. I still haven’t managed to get a transmission through using ABP. I’ve purchased an RTL-SDR to ensure that there is actually a transmission. I’ll keep this thread updated once it arrives.
So far the impact is that nothing is reaching the gateway using ABP. Is CR likely to be Coding Rate?
I’m currently using TTNv3. I did initial testing using v2, but when setting up my new gateway I moved straight to v3.
All of the examples, guides and tutorials that I’ve seen for the WLR089U0 have been using OTAA. I guess this is the downside of using new parts. Once I get it up and running I’ll be sure to document the whole process.
I’ve now also probed the antenna pin during message transmission with an oscilloscope, and it’s ‘going high’ when sending messages using both OTAA and ABP. The serial output suggests that temperature data is being sent on both occasions, but I haven’t checked to see that it’s the same data. The OTAA transmission takes 114 ms, with the ABP transmission taking 210 ms.
It’ll be interesting to see the radio spectrum once the RTL-SDR arrives.
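For what it’s worth, 210 ms is close to double 114 ms, and one spreading factor step roughly doubles LoRa airtime, so the difference fits the earlier observation that OTAA and ABP default to different spreading factors. Here’s a sketch of the standard time-on-air formula from the Semtech SX127x datasheet (function name and parameters are mine; CRC on and explicit header are assumed):

```c
#include <math.h>
#include <stdio.h>

/* LoRa time on air per the SX127x datasheet formula, assuming an
 * 8-symbol preamble, CRC on and explicit header.
 * pl = payload bytes (a LoRaWAN frame adds 13 bytes of overhead),
 * sf = spreading factor, bw = bandwidth in Hz,
 * cr = 1..4 for coding rates 4/5..4/8,
 * de = 1 if low data rate optimisation is enabled. */
static double lora_airtime_ms(int pl, int sf, double bw, int cr, int de)
{
    double tsym = pow(2, sf) / bw * 1000.0;     /* symbol time in ms */
    double tpre = (8 + 4.25) * tsym;            /* preamble duration */
    double tmp  = ceil((8.0 * pl - 4.0 * sf + 44.0) /
                       (4.0 * (sf - 2 * de))) * (cr + 4);
    double npay = 8 + (tmp > 0 ? tmp : 0);      /* payload symbol count */
    return tpre + npay * tsym;
}

int main(void)
{
    /* e.g. a 15-byte frame at 125 kHz, coding rate 4/5 */
    printf("SF8: %.0f ms\n", lora_airtime_ms(15, 8, 125000.0, 1, 0));
    printf("SF9: %.0f ms\n", lora_airtime_ms(15, 9, 125000.0, 1, 0));
    return 0;
}
```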
Just for “fun”, I’ve registered a SAMR34 Xplained board on v3 using ABP and it just worked.
The same board was used last week on OTAA and also just worked, albeit with the expected constant rejoins being blocked either by the stack or by the network server due to duplicate DevNonces - the second I could resolve by clearing them using the CLI, the first by shouting loudly at the board until the reset button pressed itself, at which point it couldn’t remember when it last sent. (This is a joke based on the Chuck Norris meme: “@kersing’s devices don’t join on OTAA, TTN sends an open invitation.”)
Can you tell me what region you are in, ie where in the world are you?
Do not connect the SDR directly to the antenna connector - it might get blown up. Use a directional coupler with about -20 dB coupling and a 20 dB attenuator in between.
I don’t think the OP needs an SDR, but good advice none the less.
The Microchip stuff works well, but rather assumes a level of diligence & documentation reading that isn’t Facebook-friendly - ie, no instant gratification.
@descartes Ok, good to know that I’ll get it working eventually. I’m in Brisbane, Australia and using AU915 sub-band 2. I think it’s worth getting the CLI up and running. Would you recommend following this guide? https://www.thethingsnetwork.org/docs/network/cli/quick-start/index.html
I should also try using the AS923 band to see if it’s a frequency thing, given it’s a legal option here, and given that no traffic is making it through the gateway I see very little potential impact on other TTN users.
Oh absolutely, I agree regarding the instant gratification. I just assumed that if an option is available by uncommenting a line, the demo applications should still work.
@wolfp Yep, good call. My aim with the SDR is to visualise the radio spectrum; currently I have no way of doing so. It’s also a handy tool that I’ve been looking at for a while, and now work is paying for one, so it’s a win-win even without the project.
Microchip are pretty thorough, so I suspect you are over-thinking it - I’d not tried my SAMR34 board on ABP on v3, but all I did was set up a device on my test application, put the keys into the right places and bingo. Let me see if I can fake an AU combo.
Ok, so some improvement has been made.
I’ve set up my gateway to use AS923 (920-923 MHz), selected the JP923 band on the device, and now see messages using ABP.
It’s a small win, but I’ll take it. Now to figure out what’s happening for AU915.
I’ve also started the CLI process, but had to install brew etc. Are there plans for a v3 CLI guide for Windows? I also have a Mac, but saw that there were Windows guides for v2.
You can just download the binary for Windows, macOS or Linux; they are all identical, so no separate manual should be required. Scroll down the CLI install page for the link to GitHub with the binaries.
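Once you have the binary, pointing it at the au1 community cluster and logging in should be roughly this (assuming au1.cloud.thethings.network as the cluster address):

```
ttn-lw-cli use au1.cloud.thethings.network   # writes a config file for the au1 cluster
ttn-lw-cli login                             # opens a browser for OAuth login
```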
Ahh, yes of course - the executable with the new v3 documentation. Thank you again for your help.
Side note for those following on: In lorawan_multiband.h there is the following line:
#define UPSTREAM_CH64_AU (FREQ_915900KHZ)
which is used to set
uint32_t _RegParamsType1::UpStreamCh64Freq
This doesn’t make sense to me: at a spacing of 200 kHz, channel 64 should be FREQ_928000KHZ, and that constant is already defined in stack_common.h.
Solved for now. Thank you everyone for your help.
I went with the CLI route. The CH64 observation above doesn’t seem to be causing any problems, even without changing it.
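For completeness, that define is most likely correct as shipped: in the LoRaWAN Regional Parameters for AU915, channels 64-71 are the 500 kHz uplink channels, which sit on their own 1.6 MHz grid starting at 915.9 MHz; only channels 0-63 use the 200 kHz grid. A sketch of the plan (function name is mine):

```c
#include <stdint.h>

/* AU915 uplink frequencies in Hz, per the LoRaWAN Regional
 * Parameters: channels 0-63 are 125 kHz channels on a 200 kHz grid
 * from 915.2 MHz; channels 64-71 are 500 kHz channels on a 1.6 MHz
 * grid from 915.9 MHz. */
static uint32_t au915_uplink_freq_hz(uint8_t channel)
{
    if (channel < 64)
        return 915200000u + 200000u * channel;
    return 915900000u + 1600000u * (uint32_t)(channel - 64);  /* CH64 = 915.9 MHz */
}
```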
For later reference, this was the process I followed.