As it stands right now I can think of two possible reasons why my setup doesn't work:
1/ Bad configuration of the spreading factor and bandwidth for each channel.
As I'm using the RAK811 I thought I'd check the RAKwireless forum for how to configure this properly (but of course, if you already know how, any hints are welcome; see also the sketch below).
2/ I do not have line of sight to any of the Gateways covering the area where I live.
I will get mobile and move my little testbed to a spot where I get LOS to the respective GW’s antenna.
If you have any other suggestion for things I can try it would be great.
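For reference while I check the configuration, here is a minimal sketch of the EU868 defaults as I understand them (just the three mandatory join channels and the data-rate table, nothing RAK811-specific):

```python
# EU868 defaults, for reference only (this is not device configuration).
# The three mandatory channels every EU868 gateway listens on:
EU868_DEFAULT_CHANNELS_MHZ = [868.1, 868.3, 868.5]

# LoRaWAN EU868 data rates DR0..DR6 and the SF / bandwidth they map to:
EU868_DATA_RATES = {
    0: ("SF12", 125),  # slowest, longest airtime, best sensitivity
    1: ("SF11", 125),
    2: ("SF10", 125),
    3: ("SF9", 125),
    4: ("SF8", 125),
    5: ("SF7", 125),   # fastest of the 125 kHz rates
    6: ("SF7", 250),   # 250 kHz bandwidth, not used by all networks
}

for dr, (sf, bw) in EU868_DATA_RATES.items():
    print(f"DR{dr}: {sf} @ {bw} kHz")
```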
Edit: @jasonreiss I just use the SDR to "see" what is going on. But given the scale in that particular snapshot I guess 125 kHz would look very narrow.
Line of sight isn't always required. It all depends on things like distance, the antennas used, what kind of obstacles are between transmitter and receiver, the RF noise floor in the area…
If you aren't near the gateway sending a return packet, chances are small you'll be able to see it using an SDR. LoRa modulation is designed to be demodulated below the noise floor, so in a visual inspection of the spectrum it will be hard to detect.
Take into account that you can't always predict which gateway will be chosen to send downlinks in an area where multiple gateways receive your node's transmissions. Usually the closest gateway will have the best signal conditions and will be chosen, but airtime and gateway use are factors as well.
Using a generic SDR dongle and Windows software I haven’t been able to scan the entire band used by EU868 with sufficient speed to be able to capture all transmissions of my test node (at least I wasn’t able to recognize them).
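To put some numbers on "detectable below the noise floor": the approximate demodulation SNR limits per spreading factor, as I recall them from the Semtech SX127x datasheet, in a small sketch:

```python
# Approximate SNR (dB) at which a LoRa receiver can still demodulate,
# per spreading factor (values as I recall from the SX127x datasheet).
# A waterfall plot only shows signals above the noise floor (SNR > 0 dB).
SNR_LIMIT_DB = {7: -7.5, 8: -10.0, 9: -12.5, 10: -15.0, 11: -17.5, 12: -20.0}

for sf, snr in SNR_LIMIT_DB.items():
    print(f"SF{sf}: demodulates down to roughly {snr} dB SNR")
```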
Thx for your answer.
Yes, antennas, radios and everything else that affects the transmission are a whole science of their own.
Of course, you're right, the signal is much too weak in the DL to be seen in the waterfall.
As it seems (see below) I don't have any Gateway close enough for me to use.
Tweaking the SDR is a project of its own. I am satisfied that I got some hints that my node doesn't have a broken radio:
The red line in the spectrum is the node transmitting.
Today I went on an excursion to some different Gateways and finally got packets through at the second Gateway:
So I have to get me a Gateway!
Thx, again all of you who have commented in this thread!
See also https://revspace.nl/DecodingLora. And surely my SDR tests easily showed chirps like that, but maybe I was always using SF12 then, to have some time to see things.
Aside: don't send text, even though that gives you a nice long payload for testing.
No, I will stop that now that I have finalised the first test to prove that the node's HW is functioning.
I'm deliberating whether to send the data in JSON format and then compact it with CBOR, or if I should just send the data as bytes in a binary format with a checksum at the end.
I guess it depends on whether I am going to integrate with some cloud service to access the data and what fits that service best, and if I do that at all.
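To get a feel for the size difference I put together a small sketch (Python, using the struct module and the cbor2 package; the field names and example values are made up for illustration):

```python
import struct
import cbor2  # pip install cbor2

# Hypothetical sensor reading: temperature in 0.01 °C and humidity in 0.1 %.
temperature_centi = -1234   # -12.34 °C
humidity_deci = 567         # 56.7 %

# Option A: fixed binary layout, two signed 16-bit big-endian integers.
binary_payload = struct.pack(">hh", temperature_centi, humidity_deci)

# Option B: a JSON-like map compacted with CBOR.
cbor_payload = cbor2.dumps({"t": temperature_centi, "h": humidity_deci})

print(len(binary_payload), "bytes as fixed binary")  # 4 bytes
print(len(cbor_payload), "bytes as a CBOR map")      # noticeably more

# Decoding the fixed binary payload on the receiving side:
t, h = struct.unpack(">hh", binary_payload)
print(t / 100, "°C,", h / 10, "%")
```

The fixed layout keeps the payload at a handful of bytes, at the cost of having to keep the node's encoder and the payload decoder in sync by hand.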
I'm thinking that I can stop the data from being forwarded from the GW to TTN while I'm developing the sensor nodes and data formats.
And also trim down TX power so the radio will only cover the lab.
So as not to disturb TTN.
The nearest GW seems not to be in operation, although it is indicated as operational on the map. The second closest, which is about 6 km from me, had a reach of a kilometre or so.
I feel CBOR is really unpredictable, hence a bad choice for LoRaWAN:
A checksum (the MIC) is already included in each LoRaWAN message. If that fails, your application won’t even see the uplink. (Even more: TTN wouldn’t even be able to tell it’s your uplink.)
Good thing that there already is a CS. One less thing to consider.
I was thinking I only need signed 16-bit integers, which would give a range of -32768 to +32767; perhaps CBOR can handle positive and negative whole numbers at least?
Perhaps I will need to deal with 32-bit sensor data at some point, but I think CBOR will be able to deal with that?
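To partly answer my own question I tried a quick sketch with the cbor2 package (assuming that library; the values are just examples). CBOR handles negative and positive whole numbers fine, but the encoded length grows with the magnitude, which is the unpredictability mentioned above:

```python
import cbor2  # pip install cbor2

# CBOR encodes integers with a variable length: small values take 1 byte,
# larger ones 2, 3 or 5 bytes (a header byte plus a big-endian value).
for value in (0, 23, 24, 255, 256, -32768, 32767, 2_000_000_000):
    encoded = cbor2.dumps(value)
    print(f"{value:>12} -> {len(encoded)} byte(s): {encoded.hex()}")

# A fixed binary layout always costs 2 bytes for an int16 and 4 for an int32,
# which is easier to budget against the LoRaWAN payload size limits.
```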
Thx for your input Arjan.
I have got answers to my questions regarding SDR and also in the process verified that my first node is working as long as I have a working GW to connect to.
Is it customary to close the ticket then?
How much can we wander off topic w/o breaking the rules?
To nitpick: as long as there is a gateway in range. In WiFi you connect to a gateway, in LoRaWAN you connect to the network using any gateway that is part of that network.
Not forwarding the data to TTN makes it hard to get to the data, as TTN decrypts the LoRaWAN packet for you. Also, you will still be using airtime, so not forwarding the data doesn't help keep the air 'free'. And if you don't forward data, other users will not be able to profit from additional TTN coverage.
I've played around with this using an RTL-SDR USB stick, which is very cheap. This picture shows the best resolution I've been able to get; that's with SF12, and any faster the waterfall plot moves too fast to make out much. I'm not sure if it would be better with a better SDR device (but those start to get expensive), or a more powerful PC that can render the live plot faster, or if I just don't know enough about GNU Radio, since this is the first and only time I've tried anything with it.
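For what it's worth, here is a rough equivalent of what I was doing, sketched with the pyrtlsdr and matplotlib packages instead of GNU Radio (assuming those packages; the centre frequency and gain are just example values):

```python
# Capture ~1 s of IQ data from an RTL-SDR and plot one spectrogram of it.
# Assumes: pip install pyrtlsdr matplotlib, and an RTL-SDR dongle attached.
from rtlsdr import RtlSdr
import matplotlib.pyplot as plt

FS = 1.024e6    # sample rate, covers about 1 MHz of spectrum
FC = 868.1e6    # example: first EU868 default channel

sdr = RtlSdr()
sdr.sample_rate = FS
sdr.center_freq = FC
sdr.gain = 'auto'

samples = sdr.read_samples(2**20)   # roughly one second of IQ data
sdr.close()

# An SF12 uplink lasts long enough (hundreds of milliseconds even for a
# small payload) to show up as slow chirps in the spectrogram.
plt.specgram(samples, NFFT=1024, Fs=FS, Fc=FC)
plt.xlabel("Time [s]")
plt.ylabel("Frequency [Hz]")
plt.show()
```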
I’d say it is good enough to be able to see when something is being sent from the node. An SDR cannot replace a spectrum analyzer to properly measure the signal.
But as such it helped me to see that my LoRa node actually sent something on the expected channels and was not broken.
Anyway, I bought a proper 8-channel gateway and have succeeded in getting some packets through.
Very good work. I would advise you to use software like SDRSharp, which is easy to configure. In addition, to improve the gain of your SDR for LoRa you can use a lambda/2 antenna matched to the frequency that LoRa uses in your country.
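For a rough idea of the antenna length, a quick sketch of the lambda/2 calculation (the 5 % shortening is a common rule of thumb, not an exact figure):

```python
# Half-wave antenna length for common LoRa frequencies.
C = 299_792_458  # speed of light, m/s

for freq_mhz in (433, 868, 915):
    wavelength_m = C / (freq_mhz * 1e6)
    half_wave_cm = wavelength_m / 2 * 100
    # Practical elements are usually ~5 % shorter than the free-space length.
    print(f"{freq_mhz} MHz: lambda/2 = {half_wave_cm:.1f} cm "
          f"(~{half_wave_cm * 0.95:.1f} cm in practice)")
```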
An RTL-SDR can capture enough data to demodulate a nearby LoRa transmission; the issues are:
it doesn’t begin to have the instantaneous dynamic range of a real LoRa receiver to be able to receive weak distant signals in the presence of strong nearby interference or noise overall
it doesn't have the bandwidth to watch an entire band, especially for something like US915 where the 500 kHz wide downlink channels move in step with the uplink ones. Thus you may need to know the frequency pair that will be used in advance and pre-tune there; fortunately people are generally debugging setups where they control the node firmware, so what it is going to do can be predicted or even customized for a test (see the sketch below for how the downlink channel follows from the uplink channel). In such a band plan you also can't cover the uplink and downlink with the same setting.
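A minimal sketch of that mapping as I understand the US915 plan (64 uplink channels at 125 kHz from 902.3 MHz in 200 kHz steps; RX1 answers on downlink channel "uplink channel modulo 8", 923.3 MHz upward in 600 kHz steps at 500 kHz bandwidth):

```python
# US915: which RX1 downlink frequency follows from a 125 kHz uplink channel.
def us915_uplink_mhz(channel: int) -> float:
    # 64 uplink channels, 902.3 MHz upward in 200 kHz steps.
    return 902.3 + 0.2 * channel

def us915_rx1_downlink_mhz(uplink_channel: int) -> float:
    # RX1 uses downlink channel (uplink channel modulo 8),
    # 923.3 MHz upward in 600 kHz steps, 500 kHz bandwidth.
    return 923.3 + 0.6 * (uplink_channel % 8)

for ch in (0, 7, 8, 63):
    print(f"uplink ch{ch}: {us915_uplink_mhz(ch):.1f} MHz -> "
          f"RX1 downlink {us915_rx1_downlink_mhz(ch):.1f} MHz")
```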
And of course for a lot of these tests you don’t actually need to demodulate, but just see that something was there.