Controlling bitrate/output buffer

I need to push data (split into packets) more or less reliably.
After a few tens of messages land in its input buffer, the radio reorders them and loses about 10-15%.

  1. Is it possible to manage the buffer in a pull/push way and get notified "Now I can send more messages, please continue"?
  2. On connect, the MY_INFO response reports channel_utilisation and bitrate.
    I can’t find a way to request MY_INFO on demand so that I can manually keep the load within a reasonable average channel_utilisation. Is that possible?

1 - If packets are being dropped from the buffer, you are filling it faster than it can be emptied.

2 - That data comes in over the telemetry device metrics. MyInfo is a big packet and is only sent on connect.
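As a rough illustration of keeping the load within a target utilisation, one could scale the gap between outgoing packets with the most recent channel-utilisation reading from the device metrics. This pacing policy is just a sketch with made-up bounds, not part of any Meshtastic API:

```python
def pacing_delay(channel_utilization: float,
                 min_delay_s: float = 5.0,
                 max_delay_s: float = 60.0) -> float:
    """Map the last reported channel utilisation (in percent) to a delay
    between outgoing packets: idle channel -> send every 5 s, saturated
    channel -> back off to one packet per minute. The 5 s / 60 s bounds
    are arbitrary assumptions, not firmware constants."""
    frac = max(0.0, min(channel_utilization / 100.0, 1.0))
    return min_delay_s + (max_delay_s - min_delay_s) * frac
```

Feed it whatever utilisation value your telemetry handler last saw, and sleep for the returned number of seconds before the next send.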

Which firmware version are you running?
In principle the listen-before-talk feature from 1.3 should be able to handle high channel utilization quite well by waiting until the channel is not busy anymore. Even after that, it waits a random delay which is longer if the channel utilization is high.
Are you sending directly to another node or are there nodes in between that should relay it?
Did you determine the 10-15% loss by checking on the receiving node or by counting ACKs?
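The listen-before-talk idea described above can be sketched as a toy model: wait while the channel is busy, then add a random slot that stretches as channel utilisation rises. This only illustrates the principle, it is not the firmware's actual contention-window algorithm, and `base_ms` is an assumed constant:

```python
import random

def backoff_delay_ms(channel_busy: bool, channel_utilization: float,
                     base_ms: float = 100.0) -> float:
    """Toy listen-before-talk backoff. If the channel is busy, re-check
    after a fixed wait; otherwise pick a random slot that is stretched
    when utilisation (in percent) is high."""
    if channel_busy:
        return base_ms
    return random.uniform(0.0, base_ms) * (1.0 + channel_utilization / 100.0)
```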

  1. Not sure. I’m sending 10 packets. They arrive in an order like 1, 5, 4, 10, 9, 7, 3, 8, 6.
    Packet number 2 is lost,
    but number 10 (the last one) arrives in the middle.
    So it seems all packets were successfully swallowed by the buffer, but one of them was lost during sending…

Sending directly (two radios, no broadcasting, direct addressing).
Distance: 0.5 m.
Checking on the receiving node.

Perhaps that was just an example to illustrate your point, but packet #2 apparently never made it out of the sending device. Could that be the problem?

I’m sending from a laptop through the serial port with a 5-second delay between packets.
I don’t think anything gets lost in the USB cable.
If I send them one by one, each after the previous packet has arrived at the receiving radio, it works fine.
If I send them faster than about one per minute, without waiting for the previous packets to arrive, they get reordered and some of them are missed :(
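The working one-by-one case is essentially stop-and-wait: send the next packet only after the previous one was confirmed, retrying a few times on loss. A minimal sketch, where `transmit` is a hypothetical callback standing in for whatever send-and-wait-for-ACK call your radio interface provides (the lossy link below just simulates it):

```python
def send_stop_and_wait(packets, transmit, max_retries=3):
    """Send packets in order; 'transmit' returns True when the packet
    was acknowledged. Retry each packet up to max_retries times."""
    delivered = []
    for pkt in packets:
        for _ in range(max_retries):
            if transmit(pkt):
                delivered.append(pkt)
                break
    return delivered

# Simulated link that drops every 4th transmission attempt (~25% loss):
attempts = {"count": 0}
def lossy_link(pkt):
    attempts["count"] += 1
    return attempts["count"] % 4 != 0

result = send_stop_and_wait(list(range(1, 11)), lossy_link)
```

With retries, all ten packets arrive in order despite the simulated loss; the price is throughput, since each packet occupies a full round trip.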

Yes, the USB cable is unlikely to be the cause, and your test has shown that a buffer overflow is responsible. It may not necessarily be the serial input buffer, but clearly everything works correctly when you wait longer between packets. As a test, you could increase the delay between packets to, say, 5 minutes; that would most likely work correctly.

That’s OK for testing…
But in the end I need more or less reliable data “streaming” at an optimal speed.
That’s the question: how do I regulate it?

Are you using Long Slow on 1.2? If so, you are pretty bandwidth constrained.

As @garth wrote bandwidth is very low, and personally I doubt that LoRaWAN is the right technology for “streaming data” at any reasonable speed.

Would it be possible to break up your data stream into (much) smaller packets? Depending on your use case this might work out.
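Chunking the stream could look like this; note the 200-byte limit is an assumption for illustration, since the real maximum payload depends on your firmware and modem settings:

```python
def chunk(data: bytes, max_payload: int = 200) -> list[bytes]:
    """Split a byte stream into packets of at most max_payload bytes.
    200 is an assumed safe size, not a documented limit."""
    return [data[i:i + max_payload] for i in range(0, len(data), max_payload)]

parts = chunk(b"x" * 450)  # three packets: 200 + 200 + 50 bytes
```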