I usually use this lib on an ESP32 based custom board. This is all working great, but I’d like to make a much lower power device which will be used purely for GPS/RSSI tracking - this is to allow us to check out a potential site quickly for coverage.
I’m not sure if the slower speed would cause timing issues with the lib? Is there a way to calibrate it to the clock speed? I’m assuming the default assumes 16 MHz for an ATMega328x chip, such as in the Arduino. The reason I wish to use 1 or 8 MHz is the much lower power consumption both awake and sleeping, and the internal oscillator obviously doesn’t require an external crystal.
Edit: Should just add that this would never need to process downlink packets.
To be a compliant node it MUST be able to process downlink packets.
This forum is littered, if not silted up, with discussions around using a Pro Mini with an RFM95, i.e. an ATmega328 at 8 MHz with a radio module - indeed a topic just yesterday asked the exact same thing.
There might be something about that in the LMIC docs for you to read.
Obviously? Good luck with the timing …
You may get better results asking the question more generally after searching the existing information here, rather than making too many assumptions about the 328.
But as a starter for ten of an answer you might have been looking for: if you are doing coverage checks, MCU sleep current would be less of a priority for me, as the GPS is going to burn through most of the power here. Sleeping LMIC is an art (that some have mastered), and I’d only want to do coverage checks the once, so spending time & money on a super-low-cost device seems counterintuitive if you then run the risk of poor reliability.
I’d look at using an Arduino MKR WAN 1310 - which is easy to sleep with low current and lots of space for firmware. If I was making a batch, I’d use an ATmega4808 with RFM95, plenty of space to accommodate the firmware. If I was kidnapped and had to make a coverage checking device that sleeps on a Pro Mini, I’d be done by lunch time, but as I said, it’s a bit of an art and not so easy cramming it all in.
Actually the GPS module I’m using can also sleep between uses and consumes only 12 µA while maintaining its internal memory of satellite positions. In use, it consumes around 30 mA. A “hot” startup re-acquires position information within 15 seconds, sometimes in just a few seconds, so it won’t “burn through my power”.

The 8 MHz internal oscillator has been fine for me in previous applications using the ATMega parts - in fact I rarely use a crystal with them and have never had issues with serial comms, RS485/DMX/MODBUS, or anything else requiring reasonably granular timing. Unless something in the LMIC library requires millisecond precision, the drift shouldn’t be an issue in the short time the device is awake.

As for memory, 32 KB is more than enough - LMIC takes around 25 KB of flash, including lots of debugging output I won’t even be using in production, and parsing GPS packets through the UART isn’t exactly rocket science - you’d have to be pretty sloppy to need 6-7 KB. I’ve recently been writing ASM for MCUs that don’t have any RAM and have 1 KB of flash, so the mega chips are a luxury.
The reason these are low power is that once it’s sent to a site, it’ll likely be left on, sleeping for between 1 and 10 minutes, then sending an 8-byte payload. It will be left in different places within an area and gateways added where needed.
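For reference, a minimal sketch of that sleep/wake cycle, assuming the widely used Rocket Scream LowPower library and an 8 s watchdog period (neither is named in the thread; the interval and structure are illustrative only):

```cpp
#include <LowPower.h>   // Rocket Scream low-power library for AVR

void setup() {
  // radio, GPS and LMIC initialisation would go here
}

// The AVR watchdog tops out at 8 s per sleep, so longer intervals are looped.
void sleepMinutes(uint16_t minutes) {
  uint16_t cycles = (minutes * 60u) / 8u;
  for (uint16_t i = 0; i < cycles; i++) {
    LowPower.powerDown(SLEEP_8S, ADC_OFF, BOD_OFF);
  }
}

void loop() {
  // ...wake the GPS, read a fix, queue the 8-byte uplink via LMIC here...
  sleepMinutes(5);   // anywhere from 1 to 10 minutes per the post
  // Caveat: LMIC’s tick counter stops advancing during power-down, so its
  // scheduler has to be compensated after waking - the “sleeping LMIC is an
  // art” point made earlier in the thread.
}
```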
We will never be sending downlink packets back to it (I’m unsure if TTN sends anything back itself, but the device would ignore it). These will be custom boards - the ATMega328PB, RFM95W and GPS receiver are all easily available for PCBA for me (unlike the ATMega4808, which is hard to get due to the chip shortages, and which I’d consider overkill anyway), so I can get 20 boards made and delivered in around 7-10 days. Paying £35 for the MKR WAN, or under £12 delivered per custom board, isn’t a tough choice for me. These are pretty much “throw away” devices we don’t expect to get back after they’re sent out to test locations.
Anyway, thanks for the response, but you didn’t really answer my question, so I’ll just put something together on a breadboard and take a look through the code in the lib if needed.
Out of interest, why does a node need to process downlink packets to be compliant?
I think you’ll find that the sentence “discussions around using a Pro Mini with an RFM95, ie ATmega328 at 8MHz” implies a lot of activity around this area, which rather suggests the answer is yes.
But like many of these duplicate topics, we do try to encourage some prior research and we do get a bit miffed when the OP reveals a pile more detail - like wanting to go disposable - that rather changes the landscape of the original question.
We can only answer questions in context so the type, quality & detail of the answer is largely down to the type, quality & detail of the question.
The Fair Use Policy would require a minimum sleep of 3 minutes for a great signal, 5 minutes for a typical one and 20 minutes for a ropey one.
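To put rough numbers behind those intervals (the airtime figures below are approximations added here, not from the post): the Fair Use Policy budget is about 30 s of uplink airtime per device per day, and a small (~8 byte) payload costs very roughly 56 ms at SF7, 370 ms at SF10 and 1.5 s at SF12, which lands close to the intervals quoted above.

```cpp
// Back-of-the-envelope only - the airtime values are assumptions.
const float FUP_AIRTIME_PER_DAY_S = 30.0f;            // TTN Fair Use Policy budget

float minIntervalMinutes(float airtimeSeconds) {
  float msgsPerDay = FUP_AIRTIME_PER_DAY_S / airtimeSeconds;
  return (24.0f * 60.0f) / msgsPerDay;                 // minutes between uplinks
}

// minIntervalMinutes(0.056f)  -> ~2.7 min  (SF7, great signal)
// minIntervalMinutes(0.370f)  -> ~18 min   (SF10, ropey)
// minIntervalMinutes(1.500f)  -> ~72 min   (SF12)
```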
To use TTN the device should process MAC commands sent from the Network Server otherwise it has the potential & actuality to cause disruption to other community users. The details are part of the fundamentals of creating any LoRaWAN device. The learn link at the top of the page will get you started - it’s mostly to do with the Adaptive Data Rate but there are other commands coming down as well.
328s are half the price of a genuine Arduino Nano Every with a 4809 on it, but I can only get PDIP 328s, whereas I can get lots and lots of 4809s for half the price of the 328s I can’t buy.
The reason for going large on the MCU is not flash, it’s RAM. The 328 + LMIC ends up at ~1.2 KB used, and generally if you need more than a hundred bytes for your bit, you get lock-ups. Whereas the larger ATmega has three times the RAM.
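A common way to see how close you are to that limit at runtime on an AVR (a standard snippet, not something from this thread) is to measure the gap between the top of the heap and the current stack:

```cpp
// Returns the number of bytes currently free between the heap and the stack.
int freeRam() {
  extern int __heap_start, *__brkval;
  int v;
  return (int)&v - (__brkval == 0 ? (int)&__heap_start : (int)__brkval);
}
```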
When I need to optimise a solution I change the microcode for the assembly instructions I’m using …
On TTN it is now absolutely mandatory to correctly implement downlink reception and processing (as has always been part of the LoRaWAN spec).
Downlink requires either millisecond timing accuracy, or substantially re-writing the software to run the radio in receive mode for a much longer period around the spec-dictated receive time, wasting far more battery than a correctly utilized crystal oscillator would. (If you want true low power, use a system where you can use a slower and/or less accurate oscillator to wake up from low-power sleep - as long as you stick to Class A, it’s only the interval between transmission and checking for downlinks which needs to be precise.)
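On the millisecond-accuracy point: both classic and MCCI LMIC expose LMIC_setClockError() to tell the stack how inaccurate its timebase is, so it opens the receive windows early/long enough to compensate. A minimal fragment - the 1% figure is an assumption to be tuned against measured drift, and the usual pin map, key callbacks and event handler are omitted:

```cpp
#include <lmic.h>
#include <hal/hal.h>

void setup() {
  os_init();
  LMIC_reset();
  // Allow for up to 1% clock error so the RX1/RX2 windows still line up
  // with the downlink despite oscillator drift.
  LMIC_setClockError(MAX_CLOCK_ERROR * 1 / 100);
  // ...join and send scheduling as in the standard LMIC examples...
}
```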
You can probably use the ATmega328PB, though you have the same code-fit challenges as with the ATmega328P. If you want to go in that direction, look at the ATmega4808/4809 instead, or better yet an ARM board.
You can probably run at 8 MHz if it’s the crystal oscillator, without really doing anything more than using the correct board settings when compiling.
1 MHz would likely require some software changes beyond just clock settings.
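One cheap guard against a mismatch between the selected board settings and the real clock (a suggestion, not something from the LMIC docs) is a compile-time check, since millis()/micros() - and therefore LMIC’s scheduling - are scaled from F_CPU:

```cpp
// Fails the build if the board definition is not set up for an 8 MHz clock.
// (Change the value if the chip is really fused for 1 MHz.)
#if F_CPU != 8000000UL
#  error "Board settings do not match an 8 MHz clock - timing will be wrong"
#endif
```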
Beware that reporting a lot of GPS locations within human patience time limits is likely to violate TTN usage rules and quite possibly legal airtime limits as well.
I’m interested in why it’s mandatory to accept downlink packets. I did do a search on this and couldn’t find anything in any spec for TTN in this regard. Would you mind linking some suitable reading material? Aside from the required initial transfers before sending the payload, why would a node “need” to process a downlink when it serves no purpose? TTN won’t care if it’s processed or not, right?
There are no code-fit issues from my perspective - aside from the LMIC library, the rest of the code is trivial, just initialising peripherals, parsing data through the UART, and sending the 8 bytes representing the 2 floats for lat and long. I may add an LCD display, but more likely just a couple of LEDs to show success/fail in packet sends. As said, the 4808 is WAY overkill for this, besides being harder to source at short notice due to chip shortages. The 328PB is comparatively easy to source.
As I said above, one of the primary uses for downlinks is ADR which reduces inappropriately long transmissions from a device that can interfere with other well behaved devices.
And then there’s the whole FUP.
By all means stick to a 328 if you can get them, most of the support issues for LMIC on a 328 are regarding getting it to fit in to RAM, but YMMV.
But what do we know, we are just people hanging around the forum answering questions because we’ve already done this stuff and know enough about the LoRaWAN spec to advise (ps, that’s a very short Google search away).
Looking at your original question, you are looking for advice and seem to be aware there might be timing issues regarding downlinks (given your remark about them).
Next you get advice from experienced forum members who have track records to prove they know what they are talking about, and all you do is ignore it or tell them you don’t need the advice given.
Are you seriously asking for advice and willing to consider the advice given or are you just trolling?
Not really, no. There’s no implication one way or another; only that a discussion was had.
I only added this information after YOU decided to tell me the hardware you would use, for some reason, without knowing anything about the project. I’m not using off-the-shelf Arduinos for a reason. This wasn’t in any way related to the question though, which was asking whether the ATMega328 could work with the LMIC library at 1 to 8 MHz.
So now my question was “wrong” too? You started down a tangent unrelated to the question, suggesting different hardware - which I’m sure nobody asked for.
Ok good.
I see. I was referring to downlink messages sent back from the queue that can be populated by our server - not the ones used in the negotiation of the uplink etc. I didn’t realise they were all referred to as downlinks. Thanks for clarifying that.
I’m not even sure what that means. Who asked about the Arduino Nano? It’s not really relevant if you can’t source the SMD ATMega328PB - I have literally hundreds of them in inventory at my PCBA house. They’re my existing stock. Why would I use another part that’s overkill for the job and harder for me to source? It’s completely unnecessary, and the 4809 is just ridiculous - I only need SPI, UART and possibly I2C (more likely just 2 indicator LEDs). I don’t need a 48-pin part - it’s a waste of board space.
You shouldn’t get lockups when using 1.3K of RAM on an MCU with 2K unless you’re sloppy with memory allocation, using a lot of recursion, or using heavily nested functions causing excessive stack usage. Burning 700K on that suggests you’re not really being efficient. My code to parse the GPS lat+long from GNSS data on the UART uses 32 bytes of RAM. Aside from the small amount used to atof() the strings to floats, then casting the floats into the 8-byte buffer to send, there’s not really a lot needed: 1 byte for the I2C to drive a display with fixed messages if I go that route, and possibly a byte to hold flag data. It’s nowhere near 2K.
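A minimal sketch of the packing described here (the function name is illustrative, not from the post): floats on AVR are 4-byte IEEE-754 values, so two of them exactly fill the 8-byte payload, at the cost of roughly metre-scale coordinate resolution:

```cpp
#include <stdint.h>
#include <string.h>

// Copy two 32-bit floats into the 8-byte LoRaWAN payload buffer.
void packLatLon(float lat, float lon, uint8_t out[8]) {
  memcpy(out,     &lat, sizeof(float));   // bytes 0-3: latitude
  memcpy(out + 4, &lon, sizeof(float));   // bytes 4-7: longitude
}
```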
Anyway, this isn’t really getting anyone anywhere. I’m just going to sort it out myself, thanks!
The LoRaWAN protocol stack does some of those things… If you follow the issue/commit history of MCCI LMiC (which is really the only version that works correctly), managing all of the edge cases between the protocol and the “design” of the legacy IBM code to work out the long-buried bugs in it has been quite a challenge, especially the nesting of the consequences of things like the MAC downlink parsing.
Burning 700K on that suggests you’re not really being efficient.
700 bytes
My code to…
One would indeed hope that the application code were tiny, it’s the protocol stack that places a lot of demands on things.
It can fit in an ATmega328PB just as it can fit in an ATmega328P with care over the build configuration, but the only real reason for choosing that chip would be if you already have them, which you do.
The clocking issues were already explained and again present no real differences from the more common ATmega328P. Running that off an external 8 MHz crystal is fairly common, since it’s simpler to have the MCU voltage match the radio’s voltage limit.
Things went off on a complete tangent - my question was purely about whether the ATMega328 could work at 1 to 8 MHz with the LMIC library. The response about using various Arduinos etc. had nothing to do with it and was more about how that person uses off-the-shelf hardware. My question hasn’t actually been answered, despite all the time spent by people responding to explain why they don’t need to answer it, so I’ve decided to just do my own testing and modify the lib if needed (I did fork it a while back to make some changes to support SPI on different ports on the ESP chips).