07 Jul 2025 09:43 PM
If you check out the Log Management and Analytics Default limits page, you'll see a table of default limits that lists, among other things, a Content size limit and a Request size limit.
What confuses me is how the Content size limit can be the same as the Request size limit. A single request can have up to 50,000 logs. If each of those records' content fields can be a maximum of 10MB, how can the entire request also be limited to 10MB?
If the entire request is really limited to only 10MB, then the max size of each log would average out to roughly 209 bytes per log record... That can't be right...
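Quick back-of-the-envelope math (a tiny Python sketch, assuming the 10MB in the table means 10 MiB):

```python
# Average bytes available per record if a full 50,000-record payload
# has to fit inside a single 10 MB (assumed here to mean 10 MiB) request.
max_request_bytes = 10 * 1024 * 1024   # 10 MiB
max_records = 50_000

print(max_request_bytes / max_records)  # ~209.7 bytes per record
```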
I assume what it's actually trying to say is that each record in a request has a max size of 10MB, but not only does that seem redundant with the 10MB Content limit, it would also mean that a single POST request could be almost 500 GB (50,000 × 10MB), which also doesn't sound right...
So, the question is: what is the max size of a single 50,000-record request sent to the OpenPipeline SaaS Log Ingest API endpoint?
09 Jul 2025 02:36 PM
I got a hold of Chat support, and they indeed confirmed that the full payload size is really only 10MB...
Not sure why the record limit is 50,000 then since we'd never be able to actually send 50,000 records in a single payload if the limit is only 10MB. Those would have to be the tiniest logs ever...
Hopefully they raise this limit soon. I feel like 50 or 100 MB would be a more reasonable limit these days.
Luckily, Cloudflare LogPush jobs can limit both the number of records and the maximum payload size sent in each POST, so we can just set those to 50,000 and 10MB there. It's a shame, though, because Cloudflare can send as many as 1 million records and as much as 1 GB in a single POST.
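For anyone shipping logs from their own code rather than LogPush, here's a rough sketch of the kind of client-side batching I mean: splitting records so each POST respects both the 50,000-record limit and the 10MB request limit. The endpoint URL, token handling, and record shape below are assumptions for illustration, not the exact API contract, so check the docs for your environment:

```python
# Rough sketch: batch log records so each POST stays under both the
# 50,000-record limit and the 10 MB request limit before sending.
# The endpoint URL, auth header, and record shape are assumptions for
# illustration only; consult the Dynatrace log ingest docs for specifics.
import json
import requests  # third-party; pip install requests

INGEST_URL = "https://{your-environment-id}.live.dynatrace.com/api/v2/logs/ingest"  # hypothetical placeholder
API_TOKEN = "dt0c01.xxxx"          # hypothetical token placeholder
MAX_RECORDS = 50_000               # records-per-request limit
MAX_BYTES = 10 * 1024 * 1024       # assuming the 10 MB limit means 10 MiB

def batches(records):
    """Yield lists of records whose JSON-encoded size stays under MAX_BYTES."""
    batch, size = [], 2  # 2 bytes for the surrounding "[]"
    for rec in records:
        rec_bytes = len(json.dumps(rec).encode("utf-8")) + 1  # +1 for the comma
        if batch and (len(batch) >= MAX_RECORDS or size + rec_bytes > MAX_BYTES):
            yield batch
            batch, size = [], 2
        batch.append(rec)
        size += rec_bytes
    if batch:
        yield batch

def send(records):
    headers = {
        "Authorization": f"Api-Token {API_TOKEN}",
        "Content-Type": "application/json",
    }
    for batch in batches(records):
        resp = requests.post(INGEST_URL, headers=headers, data=json.dumps(batch))
        resp.raise_for_status()
```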