I am measuring the power consumption of my sensor node. I use an EFM32 microcontroller with Microchip's RN2483 module for LoRaWAN communication. The datasheet states that the current consumption is around 39 mA when transmitting at an output power of 14 dBm with a supply voltage of 3.3 V. However, I'm measuring a current of 28 mA when transmitting a packet with the same parameters (14 dBm and 3.3 V). I also disabled adaptive data rate, since I found it can reduce the transmit power, but that had no effect. It's a good thing that the current consumption is lower than expected, but I wonder how this happened. Are there other things that lower the current consumption when sending a packet?
You might run a quick confidence test on your current meter, for example adding a resistor between the 3.3 V rail and ground sized so it should draw 39 mA. Then perhaps drive a higher-value (lower-current) resistor from an MCU I/O pin and pulse it, to check that your meter also tracks brief current pulses correctly.
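To size that confidence-test resistor, Ohm's law can be applied (a sketch with illustrative values of my own, not from the answer above):

```python
# Sizing the confidence-test resistor with Ohm's law (illustrative values).
V_SUPPLY = 3.3    # supply voltage in volts
I_TARGET = 0.039  # target current in amps (the 39 mA datasheet figure)

# Ideal resistance for the target current
r_ideal = V_SUPPLY / I_TARGET        # ~84.6 ohms

# Nearest common E24 value and the current it would actually draw
r_e24 = 82.0
i_actual = V_SUPPLY / r_e24          # ~40.2 mA

# Power dissipated in the resistor -- check against the resistor's rating
p_resistor = V_SUPPLY ** 2 / r_e24   # ~133 mW, so a 1/4 W part is fine

print(f"ideal R = {r_ideal:.1f} ohm; 82 ohm draws {i_actual * 1000:.1f} mA, "
      f"dissipating {p_resistor * 1000:.0f} mW")
```

A reading close to 40 mA on the meter with the 82 Ω resistor would suggest the meter itself is not the source of the discrepancy.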
Presumably you don't want to do microsurgery under the module's shield can, but tapping the SPI lines between the module's internal MCU and its radio chip with a cheap USB-based logic analyzer would let you see the actual register settings being used.
Assuming the command set of the module gives you some control over the power level, one thing you can do is run experiments at various commanded levels and watch where the supply current stops increasing or decreasing - that can reveal any actual limits in the settings.
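One way to script that sweep (a sketch only: the serial port name is hypothetical, and I'm assuming the RN2483's documented UART command set at 57600 8N1, where `radio set pwr` accepts roughly -3 to 15, driven with pyserial on the host):

```python
# Sweep the RN2483 transmit power setting over its UART command interface
# while you watch the supply current. Port name and command behavior are
# assumptions to verify against the module's command reference.

def pwr_command(level: int) -> bytes:
    """Build the 'radio set pwr' command line; RN2483 commands end in CR LF."""
    if not -3 <= level <= 15:
        raise ValueError("RN2483 power level is typically -3..15")
    return f"radio set pwr {level}\r\n".encode("ascii")

def sweep(port_name: str = "/dev/ttyUSB0"):  # hypothetical port
    import serial  # pyserial, third-party dependency

    with serial.Serial(port_name, 57600, timeout=2) as port:
        port.write(b"mac pause\r\n")        # release the radio from the MAC
        port.readline()
        for level in range(-3, 16):
            port.write(pwr_command(level))
            print(level, port.readline())   # module should answer b"ok\r\n"
            port.write(b"radio tx 00\r\n")  # transmit a dummy byte...
            port.readline()                 # ...while watching the ammeter
            port.readline()                 # wait for radio_tx_ok / radio_err

if __name__ == "__main__":
    sweep()
```

If the measured current plateaus before the commanded level reaches its maximum, that plateau is the effective limit you are looking for.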
RF test equipment that could verify the actual transmitted power directly is comparatively expensive…