Does this return 0 for analogValue, or was the 0 maybe the result of calculating the actual voltage? (I had this problem somewhere during my testing.)
I have it working now, but I have tried and tested so many things that I do not remember what exactly fixed it.
I first did some tests with a bluepill board because that allowed me to test with different cores: "Arduino STM32", "Arduino Core STM32" and STM32GENERIC.
On the bluepill I used PA1 for the measurements and added a 2x100k voltage divider, similar to the LoRaM3-D boards.
Arduino STM32 on the bluepill worked. It uses the full 12-bit ADC resolution (max value 4095).
Arduino Core STM32 on the bluepill also worked but uses only 10-bit resolution (max value 1023); its readings were around 10% too low.
STM32GENERIC did not compile once I included #include <U8x8lib.h>, so I didn't bother trying it any further.
It now also works with BSFrance-stm32 and a LoRaM3-D F103 board. BSFrance-stm32 also uses only 10-bit ADC resolution.
With no battery connected and powered via USB (from the computer), the multimeter measured a constant 2.007V on PA1, but analogRead() reported values fluctuating between 507 and 600+ (around 1.64 to 1.95V).
With a LiPo battery connected to the battery connector and no USB connected, the multimeter measured a constant 1.864V on PA1 and analogRead() is more stable, reporting values between 539 and 542 (1.74 to 1.75V).
#define ANALOG_MAX_VALUE 1023 // 1023 for 10-bit resolution, 4095 for 12-bit resolution
int analogValue = analogRead(PA1);
float voltage = analogValue * (3.3 / ANALOG_MAX_VALUE);
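Since PA1 sits behind the 2x100k divider, the battery voltage is roughly twice what the pin sees. A minimal sketch of the rest of the calculation (assuming that 1:2 divider ratio and a 3.3V reference):
#define DIVIDER_RATIO 2.0                         // 2x100k divider halves the battery voltage at PA1
float batteryVoltage = voltage * DIVIDER_RATIO;   // e.g. ~1.86V at PA1 -> ~3.73V LiPo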
Hi
New challenge that I am facing today: Get a Waveshare e-Paper (1.54") running on the F103.
Yes, I know there is an OLED, so what do I need an ePaper for …
And yes, this could even be off-topic because it has nothing specifically to do with the radio part.
Thing is: with the radio module sitting on SPI, I am not sure whether it is me, the library … why it won't work.
And I always struggle with SPI; I2C seems to be easier to handle. Fewer wires, maybe.
The soonuse library https://github.com/soonuse/epd-library-arduino looked promising, simple and straightforward, so I gave it a try. Starting with the pin description:
3.3V --> 3V3
GND --> GND
DIN --> D11
CLK --> D13
CS --> D10
DC --> D9
RST --> D8
BUSY --> D7
… it is easy to change some of them in epdif.h, like this:
#define RST_PIN PA8
#define DC_PIN PA0
#define CS_PIN PB14
#define BUSY_PIN PB9
These should be free to choose, except DIN (MOSI) and CLK (SCLK).
I am not sure if I can share it with the SPI radio pins (PA7, PA5) or not.
Code stops at epd.Init() and I have no real clue where to look next.
If I need to use SPI2 (PB15, PB13) instead, I am not sure how to do that either; my best guess (untested) is below.
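With the STM32duino-based cores ("Arduino Core STM32", which I assume the BSFrance core follows), a second SPI bus can apparently be declared by handing its pins to SPIClass; the epd library would then also have to be changed to use that instance instead of the global SPI object. Untested sketch:
#include <SPI.h>

// SPI2 on the F103: MOSI = PB15, MISO = PB14, SCLK = PB13 (untested guess)
SPIClass SPI_2(PB15, PB14, PB13);

void setup() {
  SPI_2.begin();
  SPI_2.transfer(0x00);   // the epd driver's SPI calls would need to go through SPI_2
}

void loop() {
}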
OK, as editing is no longer possible, I will reply to my own question…
SOLVED -
I moved over to the library: https://github.com/ZinggJM/GxEPD
I had a bit of trouble finding the parts and pins and all, but it runs, sharing SPI between the LoRa radio and the ePaper.
As there is only a suggested pinout for generic STM32F103 boards but not for the BSFrance board, I chose the following:
No changes are necessary in the libraries, just in the code to use it.
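For the 1.54" panel the GxEPD setup boils down to something like this (a sketch; the CS/DC/RST/BUSY pins below are just the ones from my earlier epdif.h attempt, so treat them as placeholders for whatever pinout you pick):
#include <GxEPD.h>
#include <GxGDEP015OC1/GxGDEP015OC1.h>   // 1.54" b/w panel
#include <GxIO/GxIO_SPI/GxIO_SPI.h>
#include <GxIO/GxIO.h>

// Placeholder pins - use your own choice; DIN/CLK stay on the hardware SPI pins (PA7/PA5),
// which are shared with the LoRa radio.
GxIO_Class io(SPI, /*CS*/ PB14, /*DC*/ PA0, /*RST*/ PA8);
GxEPD_Class display(io, /*RST*/ PA8, /*BUSY*/ PB9);

void setup() {
  display.init();
  display.fillScreen(GxEPD_WHITE);
  display.update();
}

void loop() {
}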
Btw, regarding I2C:
Sharing the pins for SDA (PB7) and SCL (PB6) with some other I2C devices also works fine (as it should). However, I noticed one case (a MOD-1023 from Embedded Adventures) where u8g2 sometimes seemed to interfere with the data transmission (I have no clue why or how).
Interestingly, the problem went away when using the Adafruit GFX library, which I now also use for the OLED (and which GxEPD uses for the ePaper). Having two instances running is no problem. Though I prefer u8g2 over Adafruit's lib for its fonts, I was not able to resolve the problem with the scrambled data.
With all respect, what docs?
The code doesn't contain any (and the LoRaWAN library code is rather cryptic).
Adding some documentation to the code and a description of the library would increase its usability for others tenfold.
I tried the included basic examples TTN_OTAA.ino and TTN_ABP.ino using #define REGION_EU868 (on a B-L072Z-LRWAN1 board).
Unfortunately I immediately ran into several issues:
TTN_ABP.ino
The examples use LoRaWAN keys/IDs in string format.
The only (byte array) string format that the TTN Console provides is msb-first. (The TTN Console supports both msb-first and lsb-first formats only for the array initializer notation with brackets.)
Unfortunately the TTN_ABP example expects the devAddr string variable in lsb-first format, which is not consistent with the format that the TTN Console provides, and it is also not documented that devAddr requires lsb-first instead of msb-first. nwkSKey and appSKey use msb-first format, as presented on the TTN Console, but again information about the required format is missing.
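A made-up example to illustrate the byte order (not a real address): if the TTN Console shows the device address as 26011A2B (msb-first), then
devAddr for TTN_ABP.ino: "2B1A0126"   (lsb-first: same bytes, reversed)
nwkSKey / appSKey:        copy as shown on the Console (msb-first, no reversal)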
TTN_OTAA.ino
Unlike TTN_ABP.ino, TTN_OTAA.ino expects all LoRaWAN keys/IDs in msb-first format.
That is at least the only way I can get a JOIN ACCEPT on a JOIN REQUEST.
I tried devEui, appEui and appKey in different msb-first / lsb-first combinations, but only when all were msb-first did I actually see JOIN ACCEPTs on the gateway and application consoles.
But then it stops. The JOIN ACCEPT is not followed by an upstream message from the node. I see no data arriving.
What can this be? Why doesn't the node send any data messages?
Both the OTAA and ABP examples start communication at SF12 instead of the usual SF7.
The examples do not specify a spreading factor so SF12 appears to be the default.
I think the default should be SF7.
Where can the spreading factor be specified?
Yes, as soon as this goes to beta. I still have a long list of things to address. While addressing those I really need to keep the freedom to muck around with the API.
The code is not that cryptic … it just needed to be as small as possible. I still have this notion that a decent LoRaWAN sensor application might fit into an STM32L052.
Thanx for pointing that out. I'll update the comments in the examples. Yes, 'devAddr' is LSB and the keys are MSB. There I followed the common convention (Arduino MKR 1300, Murata's own AT command set …).
Keys are MSB first as common convention, the rest is LSB first. So perhaps this is the issue. Does the gateway send out a CFList in the JOIN_ACCEPT to populate the rest of the channels? I have heard different things from users (I am US915 based, so testing EU868 is somewhat tricky).
But even if the gateway does not send a CFList, you'd still be able to use the core 3 channels. However, then there is the duty-cycle issue, which means the system will wait until it is allowed to send again. Perhaps for debugging you'd want to use:
LoRaWAN.setDutyCycle(false);
Yes, per default ADR is enabled, so the setup starts with SF12BW125 (i.e. DR_0). SF7 is unclear: there is SF7BW125 (DR_5) and SF7BW250 (DR_6).
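E.g. with setDutyCycle() and setDataRate() you can force SF7BW125 for EU868 before the join; a quick sketch:
LoRaWAN.setDutyCycle(false);   // only while debugging, not ETSI conform
LoRaWAN.setDataRate(5);        // EU868: DR_5 = SF7BW125 (call this before joining)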
The compiled form of less-cryptic code does not have to be larger.
It is more related to the brief names and lack of descriptions. (But when you have written the code yourself that will probably be less obvious.)
Does the gateway send out a CFList in the JOIN_ACCEPT to populate the rest of the channels? I have heard different things from users (I am US915 based, so testing EU868 is somewhat tricky).
I have no experience with checking the CFList in a join accept. I normally use LMIC-Arduino and didn't have to bother with that before. How would I check that?
Sure, testing EU868 is difficult when you are situated in the US.
Great. That makes it consistent with the TTN Console (when using strings), so the keys/IDs can simply be copied and pasted. But please make the user aware of it.
SF7 is unclear. There is SF7BW125 (DR_5) and SF7BW250 (DR_6).
Iâm not sure which one it should be.
The code below is from the LMIC-Arduino ttn-abp.ino example but it does not directly provide an answer.
I do know that LMIC-Arduino starts trying an OTAA join at SF7 first and then gradually steps up to higher spreading factors if the join does not succeed (it takes a long time before it finally reaches SF12).
#if defined(CFG_eu868)
// Set up the channels used by the Things Network, which corresponds
// to the defaults of most gateways. Without this, only three base
// channels from the LoRaWAN specification are used, which certainly
// works, so it is good for debugging, but can overload those
// frequencies, so be sure to configure the full frequency range of
// your network here (unless your network autoconfigures them).
// Setting up channels should happen after LMIC_setSession, as that
// configures the minimal channel set.
// NA-US channels 0-71 are configured automatically
LMIC_setupChannel(0, 868100000, DR_RANGE_MAP(DR_SF12, DR_SF7), BAND_CENTI); // g-band
LMIC_setupChannel(1, 868300000, DR_RANGE_MAP(DR_SF12, DR_SF7B), BAND_CENTI); // g-band
LMIC_setupChannel(2, 868500000, DR_RANGE_MAP(DR_SF12, DR_SF7), BAND_CENTI); // g-band
LMIC_setupChannel(3, 867100000, DR_RANGE_MAP(DR_SF12, DR_SF7), BAND_CENTI); // g-band
LMIC_setupChannel(4, 867300000, DR_RANGE_MAP(DR_SF12, DR_SF7), BAND_CENTI); // g-band
LMIC_setupChannel(5, 867500000, DR_RANGE_MAP(DR_SF12, DR_SF7), BAND_CENTI); // g-band
LMIC_setupChannel(6, 867700000, DR_RANGE_MAP(DR_SF12, DR_SF7), BAND_CENTI); // g-band
LMIC_setupChannel(7, 867900000, DR_RANGE_MAP(DR_SF12, DR_SF7), BAND_CENTI); // g-band
LMIC_setupChannel(8, 868800000, DR_RANGE_MAP(DR_FSK, DR_FSK), BAND_MILLI); // g2-band
// TTN defines an additional channel at 869.525MHz using SF9 for class B
// devices' ping slots. LMIC does not have an easy way to define this
// frequency and support for class B is spotty and untested, so this
// frequency is not configured here.
#elif defined(CFG_us915)
// NA-US channels 0-71 are configured automatically
// but only one group of 8 (a subband) should be active
// TTN recommends the second sub band, 1 in a zero based count.
// https://github.com/TheThingsNetwork/gateway-conf/blob/master/US-global_conf.json
LMIC_selectSubBand(1);
#endif //defined
I do have a bunch of gateways around here (including EU868). However, they are configured either as simple local gateways or packet forwarders. There I can see directly in the log files what happens packet by packet … But the TTN package is kind of invasive, last time I tried.
I don't think this is spec conform. Most "devices on our network" style papers from Orange/KPN/Senet/machineQ seem to imply a rather strong preference to use the lowest datarate for an OTAA join, and then subsequently (first user data packet) either ADR or a user-configured value. In fact the first non-user packet has to include ADR, or there is no way for a US915-based setup to get the channel mask set …
I can see the logic of seeding ADR during the JOIN_REQUEST. But looking at my gateways and their ADR logic, after 6 packets it has figured out the datarate and from there takes only the minimal number of steps to get to the lowest TxPower (I think that's at packet 14 … still not that great, as the gateway could have figured out that 30dBm is not legal if it has only 8+2 frequencies).
The following is copied from the LMIC-Arduino ttn-abp.ino example but does not directly give an answer.
Actually this code is only relevant for ABP. There the gateway does not know when you are joining, and hence cannot send NEW_CHANNEL_REQ commands or an ADR_REQ to add new channels and/or enable/disable them.
Again I suspect that you just got caught out by the "why does my first packet take 2 minutes before it is sent" effect. LoRaWAN.setDutyCycle(false) will fix that, although it's not ETSI conform. Perhaps I should change that in the TTN examples to avoid that trap …
Ah, that is a very good point … one that I have no good answer to.
Here is the problem. The LoRaWAN protocol works with datarates 0 to 15 (not all are used); 0 is the lowest, 15 the highest.
What a specific datarate means is region specific. So for EU868 DR_0 means SF12BW125, for US915 DR_0 means SF10BW125 … All of that is nicely documented in the LoRaWAN Regional Parameters document, which you can download upon request from the LoRa Alliance … after you register …
Not user friendly at all. TxPower has the same issue, except that there I do the "dBm" to "tx-power-index" translation (which is really power-dbm = EIRP/ERP - 2 * power-index).
So there I am stuck. Whether you use "0" or "DR_0" does not matter much; it's bad either way. If I start using enums along the lines of SF[sf]BW[bw] it becomes more intuitive, except that then the question comes up "why does SF12BW125 not work for me?" … "are you on US915?" … "yes" … "that's not supported". So you end up needing to reference the LoRaWAN Regional Parameters document again.
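For reference, the uplink datarate mappings are roughly (quoting the Regional Parameters from memory, so double-check against the document):
DR    EU868       US915 (uplink)
---   ---------   --------------
0     SF12BW125   SF10BW125
1     SF11BW125   SF9BW125
2     SF10BW125   SF8BW125
3     SF9BW125    SF7BW125
4     SF8BW125    SF8BW500
5     SF7BW125    (not used)
6     SF7BW250    (not used)
7     FSK         (not used)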
Yes, but I assume that would have to be region dependent.
A dual scheme probably causes confusion and inconsistencies in applications.
A one-size-fits-all solution is probably not feasible due to region-dependent implementation differences.
I did additional testing. These are the results:
Data Rate        Join Request   Join Accept   After Join Accept
---------------  -------------  ------------  -------------------------------------
setDataRate(0)   SF12BW125      SF12BW125     Nothing, then retry after 2:30
setDataRate(1)   SF11BW125      SF12BW125     Nothing, then retry after 1:20
setDataRate(2)   SF10BW125      SF10BW125     Recognized, starts uploading messages
setDataRate(3)   SF9BW125       SF9BW125      Recognized, starts uploading messages
setDataRate(4)   SF8BW125       SF8BW125      Recognized, starts uploading messages
setDataRate(5)   SF7BW125       SF7BW125      Recognized, starts uploading messages
setDataRate(6)   Exactly like setDataRate(0)
So when doing Join Requests at SF11BW125 or SF12BW125 the gateway sends Join Accepts, but they are not recognized / not properly handled by the node.
Notice the different behavior when doing Join Requests at SF11BW125: the Join Accept uses SF12BW125 while the request was done at SF11BW125.
Can you explain the different (accept at SF12) behavior when doing a Join Request at SF11BW125?
Why are the Join Accepts not recognized / not properly handled when doing requests at SF11BW125 and SF12BW125?
Do you have some more info about the frequency on which the gateway sees the incoming JOIN_REQUEST, and on which frequency it sends its JOIN_ACCEPT?
The RX2 window for TTN is non-standard:
869.525 MHz - SF9BW125 (RX2 downlink only)
And yes, you found that for EU868 you can set the datarate before the join, and it will use that one … I'll see what my gateway answers with. The join accept should use the same datarate, unless it's answering on the RX2 slot. This is why I think your gateway answers the SF12BW125 and SF11BW125 requests on the RX2 slot, but at 869.525 MHz with SF12BW125 (which is the EU868 standard).
The LoRaWAN_TTN_OTAA.ino contains this line:
LoRaWAN.setRX2Channel(869525000, 3);
Mind removing it and retrying the series? That would remove the TTN-specific RX2 window.
Three sequences for DR 5: (reset before each sequence)
There have been image issues with the forum software over the last months, so you will have to right-click the image and then select "Open image in new tab" (in Chrome) or whatever option your browser has.
Note: setRX2Channel(…) is still in place (not yet removed).