I just became a member of TTN and successfully set up my first LoRaWAN gateway (using an RPi and an iMST iC880A, as described here). Now I have tried to examine the communication range of this gateway. I logged all uplink packets using DR0 and noticed that no packets with an RSSI lower than -121 dBm are received by my gateway. When I move farther away from the gateway’s position, the RSSI decreases to -121 dBm and finally message reception stops… The iC880A data sheet states a minimum sensitivity of -137 dBm, though.
As a next step I inserted an attenuator between my LoRaWAN device and its antenna to add more and more attenuation to the uplink signal without changing my position - with similar results: packets are only received if the RSSI is -121 dBm or better.
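Just to show the arithmetic I am using to interpret the attenuator test - a minimal sketch with made-up example numbers (14 dBm TX power, 100 dB path loss), not my actual measurements:

```python
# Expected RSSI at the gateway: transmit power plus antenna gains minus
# path loss and the inserted attenuation (all values illustrative).
def expected_rssi_dbm(tx_power_dbm, tx_ant_gain_dbi, rx_ant_gain_dbi,
                      path_loss_db, attenuation_db):
    return (tx_power_dbm + tx_ant_gain_dbi + rx_ant_gain_dbi
            - path_loss_db - attenuation_db)

# e.g. 14 dBm node, 0 dBi node antenna, 7 dBi gateway antenna,
# 100 dB path loss, 30 dB inserted attenuation -> about -109 dBm
print(expected_rssi_dbm(14, 0, 7, 100, 30))
```

Every additional dB of attenuation should therefore push the RSSI down by one dB, until reception stops - which in my case always happens around -121 dBm.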
Before I became a member of TTN, I used a LoRa gateway (plain LoRa PHY) that I had implemented myself, with similar Semtech LoRa chips. With it I was able to receive LoRa packets down to an RSSI of -138 dBm all the time. So is there a reason I can’t use the full link budget of LoRa within TTN? What am I missing?
Sorry, I missed that. It’s a Pycom LoPy with a Pytrack board.
17 dBm? So you think there’s a problem with your iMST iC880A? Antennas?
Maybe there is. I use a Delock 12500 7 dBi outdoor antenna. But this poor performance isn’t limited to my gateway. I was in my town the other day with the LoPy device transmitting my GPS position and came very close to two other TTN gateways. One of these gateways successfully received some of my data, but never with an RSSI lower than -121 dBm, which is why I had to get pretty close to it. This is all somewhat strange, since LoRaWAN should perform quite well (and already did with my previous LoRa PHY gateway). The RSSI value in your screencap is interesting, though, since I have never seen an RSSI lower than -121 dBm. Did you have an even lower RSSI at some point? That would mean this is not an issue of TTN but of the gateway’s hardware/software.
Why do you relate the limited link budget to TTN and not to your hardware/software?
Because I see this limitation with other gateways, too. Blaming TTN was only a guess while searching for the reason for this poor performance. My devices perform fine when no TTN gateways are involved.
The data sheet may well quote that figure, but it’s likely it was measured under ideal conditions, in a Faraday shield, perhaps out by Pluto where noise does not have a significant effect.
I would ignore the data sheet specifications; if you carry out real-world tests, you will find the actual sensitivity is maybe as much as 20 dB worse due to the impact of noise.
I would also ignore the RSSI completely at low signal levels; it’s an extremely poor indication of what’s actually going on. The reported SNR, however, is a very accurate indication of what’s happening, in my experience.
If you take a quick look at the sources used for your gateway, you will notice the reported RSSI is the result of a calculation. That calculation uses a value set in the global_conf.json file as one of its parameters. So the value reported might not correspond to the values you observed earlier if the formula or input parameters differ.
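For illustration, a simplified sketch of that idea (the parameter name rssi_offset is what a typical global_conf.json uses; the real calculation lives in the gateway HAL, so treat this as a rough picture only):

```python
import json

# Sketch: the reported RSSI is the raw concentrator reading plus a
# board-specific offset taken from the packet forwarder configuration.
with open("global_conf.json") as f:
    sx1301_conf = json.load(f)["SX1301_conf"]

def reported_rssi_dbm(raw_rssi, radio="radio_0"):
    # rssi_offset is typically a large negative number (e.g. around -166)
    return raw_rssi + sx1301_conf[radio]["rssi_offset"]
```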
Drawing conclusions based on numbers observed from other gateways where you do not know the antenna setup could be very misleading. Have you checked ttnmapper.org to see if those gateways are mapped and how they perform?
Have you checked the performance of your 7 dBi outdoor antenna in combination with the antenna cable you are using?
Okay. So I took a look at the latest measurements:
RSSI (dBm),SNR (dB)
-118,-0.8
-117,-2.5
-118,-0.2
-117,0.2
-118,-2
-120,-0.5
-120,-17.25
-120,-5
-118,-9.2
-120,-2.5
-117,-0.8
-120,-3.2
-120,-2.8
So I understand that the minimum SNR is around -20 dB. But in most cases the SNR is much better than that, which should allow me to increase my path loss and still receive some packets…
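To put numbers on that, the commonly quoted LoRa demodulation SNR limits per spreading factor together with the thermal noise floor give a rough theoretical sensitivity; a quick sketch (a 5 dB receiver noise figure is assumed here):

```python
import math

# Commonly quoted LoRa demodulation SNR limits (dB) per spreading factor
SNR_LIMIT_DB = {7: -7.5, 8: -10.0, 9: -12.5, 10: -15.0, 11: -17.5, 12: -20.0}

def theoretical_sensitivity_dbm(sf, bw_hz=125_000, noise_figure_db=5.0):
    # thermal noise in the channel bandwidth plus the receiver noise figure,
    # plus the SNR the demodulator can still cope with
    noise_floor_dbm = -174 + 10 * math.log10(bw_hz) + noise_figure_db
    return noise_floor_dbm + SNR_LIMIT_DB[sf]

print(round(theoretical_sensitivity_dbm(12)))  # about -138 dBm for SF12 / 125 kHz
```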
As a next step I will check the hardware to make sure that the LoRa concentrator, cables and antenna are all fine. I will keep you guys up to date.
How meaningful are the absolute sensitivity figures? Looking at
doesn’t this mean that if the environment my antenna is in has a noise floor around -100 dBm, then the best I can expect is to receive a packet at -120 dBm RSSI? Whether the receiver has -137 dBm sensitivity doesn’t really enter the equation, does it?
Now if I could eliminate the noise by turning off some gear, say, and the noise floor dropped to -120 dBm, then I still wouldn’t be able to get packets at -140 dBm RSSI, because the internal receiver noise, and thus the sensitivity, would be the limiting factor.
Aside from the fact that I’m using somewhat idealized numbers that don’t take fading and the like into account, am I on the right track?
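Written down as a calculation, my (idealized) reasoning would be something like this sketch; the -118 dBm receiver noise floor is just an assumed example value:

```python
# Idealized reasoning: the weakest receivable packet is set by whichever noise
# floor is higher (external noise picked up by the antenna or the receiver's
# own noise) plus the SNR the demodulator still tolerates.
def weakest_receivable_rssi_dbm(external_noise_dbm, receiver_noise_dbm, snr_limit_db):
    effective_noise_dbm = max(external_noise_dbm, receiver_noise_dbm)
    return effective_noise_dbm + snr_limit_db

print(weakest_receivable_rssi_dbm(-100, -118, -20))  # -120 dBm: external noise is the limit
print(weakest_receivable_rssi_dbm(-120, -118, -20))  # -138 dBm: receiver noise is the limit
```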
Well, the receiver will typically report a no-signal RSSI of between -100 and -105 dBm, so at -20 dB SNR you would expect the real sensitivity to be -125 dBm or so.
I have never seen the receiver report noise of much less than -105 dBm, even on remote hillsides where, quite deliberately, I carried out link testing with no electronic devices within a couple of kms, phone off, watch left behind etc.
I wrote this up in a report in 2015, a 40 km hilltop-to-hilltop test. You can calculate the signal strength you are actually getting (because you know the antennas and distance) and note the transmit power at which transmissions fail. The report is here;
“One anomaly that was clear from the results of Howie’s link budget calculation was that the signal strength of a 2mW transmitter at 40km calculated as -114dBm”
This was at SF8, so at -10 dB SNR the comms are failing at -114 dBm, and not at -131 dBm as the data sheet might suggest.
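For anyone who wants to reproduce a figure like that, the free-space arithmetic looks roughly like this; a sketch only, with 434 MHz and unity-gain antennas plugged in as example assumptions (the report has the actual parameters):

```python
import math

# Free-space path loss in dB for a distance in km and a frequency in MHz
def fspl_db(distance_km, freq_mhz):
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.44

tx_dbm = 10 * math.log10(2)          # 2 mW is about 3 dBm
rx_dbm = tx_dbm - fspl_db(40, 434)   # example: 40 km at 434 MHz, 0 dBi antennas
print(round(rx_dbm, 1))              # about -114 dBm
```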
Dear all,
The RSSI reported by the radio transceiver is the total radio power measured in the bandwidth of the channel (in this case 125 kHz).
The total radio power in the channel is the sum of (1) thermal noise, (2) the LoRa signal and (3) all interferers. So the reported RSSI is very often higher than the actual LoRa signal power.
In theory the reported RSSI can NEVER be less than the thermal noise inside 125 kHz: -174 dBm/Hz + 10·log10(125 kHz) + 5 dB receiver noise figure ≈ -118 dBm.
Because the RSSI measurement process has its own noise, it can happen that you get an RSSI reported down to -120 dBm in the absence of any interference.
Very often the communication is interference-limited, therefore the reported RSSI will be higher.
However, the LoRa signal can be demodulated underneath the noise, so the actual LoRa power may be lower than the reported RSSI. The actual LoRa power is close to RSSI + SNR when the SNR is < 0. When the SNR is > 0, the LoRa signal dominates the channel power, so in that case the LoRa signal power is roughly equal to the RSSI.
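As a small sketch of that rule of thumb, applied to two of the measurements posted above:

```python
# Below 0 dB SNR the channel power is dominated by noise, so the LoRa signal
# itself is roughly RSSI + SNR; above 0 dB SNR the RSSI is already a good
# estimate of the LoRa signal power.
def estimated_lora_power_dbm(rssi_dbm, snr_db):
    return rssi_dbm + snr_db if snr_db < 0 else rssi_dbm

print(estimated_lora_power_dbm(-120, -17.25))  # about -137 dBm
print(estimated_lora_power_dbm(-117, 0.2))     # about -117 dBm
```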
Hope this helps.
I ran some tests and measurements:
I built up another gateway based on an RPi and an iC880A and measured the performance of its LoRa receiver by placing the gateway and the node in separate shielding boxes and connecting them with RF cables, with digital attenuators in between to simulate different path losses. These are the results for SF12, coding rate 4/8 and 125 kHz bandwidth:
So this is quite interesting. On the one hand, both the iC880A gateway and the DIY Pycom LoPy gateway receive data up to 135 dB of attenuation (excluding cable and adapter losses), but with two major differences:
The iC880A reports a nearly constant RSSI value once the SNR drops below 0 dB. This is the correct behaviour, as @nsornin and others already mentioned. Looking at the results of the LoPy, its RSSI keeps decreasing as the SNR drops below 0 dB. This was the behaviour I expected, as I had worked with LoPy devices before.
BUT: the LoPy behaviour is wrong! In the LoPy documentation you can find the lora.stats() function, which returns the following tuple: (rx_timestamp, rssi, snr, sftx, sfrx, tx_trials, tx_power, tx_time_on_air, tx_counter, tx_frequency). But the rssi returned here isn’t the channel RSSI, it is the signal strength of the packet. In the Semtech SX1272 data sheet (the chip used in the LoPy), the packet strength is defined as follows for SNR < 0: Packet Strength (dBm) = -139 + PacketRssi + PacketSnr * 0.25
This makes sense now: when the SNR is negative, the LoPy adds the (scaled) SNR to the RSSI value and returns the packet strength, not the channel RSSI. I guess this mystery is solved.
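Put into code, the data sheet formula quoted above (only the SNR < 0 case, which is the one relevant here):

```python
# SX1272 data sheet formula quoted above (SNR < 0 case): the value the LoPy
# returns as "rssi" is the packet signal strength, not the channel RSSI.
def lopy_reported_strength_dbm(packet_rssi_reg, packet_snr_db):
    assert packet_snr_db < 0, "the data sheet uses a different formula for SNR >= 0"
    return -139 + packet_rssi_reg + packet_snr_db * 0.25

# Example: a raw PacketRssi register value of 21 and an SNR of -16 dB
print(lopy_reported_strength_dbm(21, -16.0))  # -122.0 dBm
```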
The second result is: although the iC880A and the LoPy have a similar sensitivity, the packet delivery rate is very different. While the LoPy successfully receives nearly all packets, the iC880A starts dropping packets while the SNR is still far away from -20 dB. So this matches my experience of the iC880A’s poor performance. It is not an issue of cables or antennas, but of the LoRaWAN concentrator itself…
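For completeness, the node used in the shielded-box test was configured with the same radio parameters, roughly like the following sketch (raw LoRa mode on the LoPy assumed; details such as the sync word, which has to match the concentrator configuration, are left out):

```python
from network import LoRa
import socket
import time

# Raw LoRa (no LoRaWAN) with the radio settings used in the test:
# SF12, coding rate 4/8, 125 kHz bandwidth, EU868 band assumed
lora = LoRa(mode=LoRa.LORA, region=LoRa.EU868, frequency=868100000,
            tx_power=14, bandwidth=LoRa.BW_125KHZ, sf=12,
            coding_rate=LoRa.CODING_4_8)

s = socket.socket(socket.AF_LORA, socket.SOCK_RAW)
s.setblocking(True)

for i in range(100):
    s.send('test {}'.format(i))  # numbered payload makes counting lost packets easy
    time.sleep(10)
```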