It is part of my experiment; I have to record latency along with RSSI and SNR.
Indeed, but I know how you like playing 20 questions.
So in the absence of a specific detail, let's consider, say, wind speed measurement…
The transit time for an anemometer to measure on one side of a field compared with an anemometer on the other side of the field (or indeed in the neighbouring field) can easily be calculated, of course, but as this will normally be measured in seconds, the message sending time is almost inconsequential on that scale.
If interested in the variability of, say, temperature, this is normally measured over many minutes/hours, let alone seconds, so again the Tx time for the payload is not of consequence. And if there's a need to compare temperature (or humidity or soil moisture or anything else of consequence) at different points, this can be done by precision scheduling of Tx, but in practice you REALLY want to dither the measurements, as you want to minimise the risk of packet collisions and spectrum congestion. So, as Nick hints, we need real details to understand your concerns and what is of interest (and possibly why?)
Ok, so not quite in a vacuum, but divide the distance between points by c and it's a good approximation! And at anything on the scale of LoRaWAN range, the transit time of the payload as RF is meaninglessly small in the scheme of things!
No. Depending on clock sources, the delay here is an offset in time and will be determined by how well in sync the clocks are and how long each end takes to process and recognise the fact of Tx and Rx…
Where, and relative to what? And are the source and destination synchronised w.r.t. their clocks?
Again, coming back to the earlier post: what is it that is changing/varying that you want to measure, such that 1 sec (or fractions thereof) becomes significant (context)? Then we can perhaps better understand or advise…
It is shown in the metadata for each message received by a Gateway - click on the individual message in the TTN console and it will expand to show the full set of metadata.
Also if you know the details of your payload message you can use online estimators to predict what the time would likely be e.g.
https://avbentem.github.io/airtime-calculator/ttn/eu868/10
And actually for this one the values (given the lack of resolution) suggest it may only be 27 ms!
28.973 vs 29?!
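The figures that calculator produces can be reproduced from the LoRa time-on-air formula in the Semtech SX127x datasheet. A minimal sketch in Python, assuming typical EU868/TTN defaults (125 kHz bandwidth, coding rate 4/5, explicit header, CRC on, 8-symbol preamble); the example of 23 bytes corresponds to a 10-byte application payload plus the usual 13 bytes of LoRaWAN MAC overhead:

```python
import math

def lora_airtime_ms(payload_bytes, sf, bw_hz=125_000, cr=1,
                    preamble_syms=8, explicit_header=True, crc=True):
    """LoRa time-on-air (ms) per the Semtech SX127x datasheet formula."""
    t_sym = (2 ** sf) / bw_hz * 1000.0              # symbol duration in ms
    de = 1 if sf >= 11 and bw_hz == 125_000 else 0  # low data rate optimisation
    ih = 0 if explicit_header else 1
    num = 8 * payload_bytes - 4 * sf + 28 + 16 * (1 if crc else 0) - 20 * ih
    n_payload = 8 + max(math.ceil(num / (4 * (sf - 2 * de))) * (cr + 4), 0)
    t_preamble = (preamble_syms + 4.25) * t_sym
    return t_preamble + n_payload * t_sym

# 10-byte application payload + 13-byte MAC overhead, at SF7:
print(round(lora_airtime_ms(23, sf=7), 3))  # 61.696
```

Note how the airtime depends only on the total byte count and radio settings, not on what the bytes contain.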
I want to compare the transmission time of raw packets of data versus processed ones, i.e. whether there's a noticeable difference in the time it takes to transmit the two.
My project involves edge computing, so I want to determine whether data processing at the end devices impacts the overall transmission latency.
Then that is a different piece of string you are trying to measure, and it is not related to or impacted by LoRaWAN! What you want to know is how long that data processing takes as part of your edge computing process (perhaps detecting thresholds, tracking averages, sending an alarm vs data, running a machine learning algorithm, or whatever)… Only you can determine that, no one here on the Forum… Once you have a result and are sending it, it goes over to being a LoRaWAN question, influenced by distance and speed of light, payload/overhead size, etc…
Okay, thanks for the clarification. But to clarify, my objective is to measure the time it takes for raw data to be received, not just transmitted. So I want to test if there's a significant difference in latency between raw and processed data when using LoRaWAN.
From a LoRaWAN perspective it is all 'just data' - the system doesn't care. It doesn't matter whether the e.g. 10-byte payload you choose to send over the air is 'raw' data or the 'result of some calculation' you have done before sending… All 10-byte payloads are the same to LoRaWAN, so the time taken to send (and therefore any transmission-related 'latency') is the same and can be predicted, based on e.g. the SF used for the Tx, per the link I provided earlier!
Okay, I got it now, thanks!!
I see. Thanks for your help!!
Consider a spherical swallow: what is its average airspeed, unladen?
The whole investigation seems quite out of proportion to the system it's measuring. Recording temperature at one point in a field and getting the data to your dashboard in a few milliseconds, whilst the plants near that sensor are taking hours for cell mitosis, seems disproportionate.
And if you have many sensors in a field and are hoping to take a snapshot all at the same time, there are ways of developing the code to do that but then not clash on transmission, but the manufacturing variability & the repeatability of the sensor is likely to make this rather academic.
You may want Time-on-air?
Maybe you could include your device time in your payload, and then compare it with the time it was received on the NS?
If your device supports LoRaWAN 1.0.3 you could use DeviceTimeReq for synchronization.
How do I include device time in the payload?
I’m not sure what you mean? Just add it to your payload; it will take up 4 bytes, so you could have it as the first 4 bytes of every single payload you are sending.
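A minimal sketch of the idea, in Python for clarity (on the device itself this would be C, and the function names here are made up for illustration): pack the device's Unix time as 4 big-endian bytes in front of the sensor data, then split it back out in the payload decoder so it can be compared with the receive time recorded by the network server:

```python
import struct
import time

def build_payload(sensor_bytes: bytes) -> bytes:
    """Prefix the current Unix time as 4 big-endian bytes."""
    return struct.pack(">I", int(time.time())) + sensor_bytes

def decode_payload(payload: bytes):
    """Split the 4-byte device timestamp from the sensor data."""
    (device_time,) = struct.unpack(">I", payload[:4])
    return device_time, payload[4:]

payload = build_payload(b"\x01\x02")      # 2 sensor bytes -> 6-byte payload
sent_at, data = decode_payload(payload)
# Latency estimate = network-server receive time - sent_at,
# which is only as good as the sync between the two clocks.
```

Note that a 32-bit Unix timestamp only gives 1-second resolution, so for millisecond-level comparisons you would need a finer device clock and more payload bytes.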
Does the average agricultural LoRaWAN sensor have an accurate time source ?
Even if there is an accurate time source, there is no reason to look at the latency given the problem description. If raw data and processed data have the same length, the LoRaWAN transmission time will be the same. If the lengths differ, there are calculators to work out the difference in time; still no need to add timestamps.
And the time it takes to process the data at the edge can be found by using the serial console of the device with appropriate firmware. I would evaluate the difference in power consumption as well, to check whether a significant difference in battery lifetime is to be expected; given the use case, you don't want to go round changing batteries every couple of months…
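One way to get that processing time without any radio involvement is to bracket the processing step with a high-resolution timer and print the result over serial. A sketch of the idea in Python (on an Arduino-class device you would use `micros()` and `Serial.print` instead; the `process` function here is a hypothetical stand-in for whatever the edge computation actually is):

```python
import time

def measure_ms(fn, *args, repeats=100):
    """Time fn over several runs and return the mean duration in ms."""
    start = time.perf_counter()
    for _ in range(repeats):
        fn(*args)
    return (time.perf_counter() - start) / repeats * 1000.0

# Hypothetical edge-processing step: average a window of raw readings.
def process(readings):
    return sum(readings) / len(readings)

raw = list(range(256))
print(f"processing took {measure_ms(process, raw):.3f} ms per run")
```

Averaging over many runs smooths out timer jitter; on a real MCU you would also want to account for the overhead of the timer call itself.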