support JSONLines #3954
We allow returning custom mimetypes: https://docs.postgrest.org/en/v12/references/api/media_type_handlers.html Since you mention "bulk inserts", I assume you'll want to POST this. For this, we'd need the inverse: support custom mimetypes for POST bodies.
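For context, the output side already works roughly like this. A minimal sketch of a media type handler per the linked docs, assuming an exposed `api` schema and a `lines` table (the `application/jsonlines` domain and function name are illustrative, not from this issue):

```sql
-- A domain named after the media type, plus a function returning it,
-- lets PostgREST serve that media type for GET /rpc/export_jsonlines
-- when the client sends Accept: application/jsonlines.
CREATE DOMAIN "application/jsonlines" AS text;

CREATE OR REPLACE FUNCTION api.export_jsonlines()
RETURNS "application/jsonlines" AS $$
  -- One JSON object per row, newline-separated.
  SELECT string_agg(row_to_json(l)::text, E'\n')
  FROM api.lines l;
$$ LANGUAGE sql;
```

The feature request is the inverse of this mechanism: letting a handler consume a custom Content-Type on the request body.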
I find JSONLines interesting, as I've also been having issues with bulk JSON insertions. Custom media types plus a pg function sound like the ideal solution. However, #2261 is perhaps also related to this.
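A rough sketch of the pg-function half, assuming a hypothetical `api.items` table with a `data jsonb` column (the function name and schema are invented here). Using a single unnamed parameter means PostgREST can pass it the raw request body when it is POSTed with `Content-Type: text/plain`:

```sql
-- Hypothetical RPC: take a JSONLines payload as raw text and bulk-insert it.
CREATE OR REPLACE FUNCTION api.bulk_insert_jsonl(text)
RETURNS void AS $$
  INSERT INTO api.items (data)
  SELECT line::jsonb
  FROM regexp_split_to_table($1, E'\n') AS line
  WHERE btrim(line) <> '';  -- tolerate blank/trailing lines
$$ LANGUAGE sql;
```

Something like `curl -X POST 'http://localhost:3000/rpc/bulk_insert_jsonl' -H 'Content-Type: text/plain' --data-binary @data.jsonl` should then work already; a proper JSONLines media type handler would make this first-class.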
So even if we offer custom media types for POST, the above failure would still occur, since the payload would fail at the PostgREST level? @steverweber What's the exact size of your JSON payload?
That is unexpected; we have these tests: postgrest/test/memory/memory-tests.sh (lines 105 to 119 in 4348cb2).
So that means a 50MB JSON should consume at most 72 MB of RAM, and so on. I thought the memory consumed decreased as the payload size increased. Maybe we need to do similar tests with different MB sizes of arrays, instead of just one big JSON object (which is what we do in memory-tests.sh).
The JSON is complex; each array item is ~7KB of data with plenty of nesting. Here is the first ~20% of what one data row looks like... yep, messy stuff.
Problem

Large JSON bulk inserts have too much cost. JSON arrays need a closing `]`, and this might be causing parser issues.

Solution

JSONLines

https://jsonlines.org - a simple JSON format, like CSV, where each line is a new record.
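For illustration (the records here are invented), a JSONLines payload is one JSON object per line, so a consumer can parse and insert each record as it arrives, with no closing `]` to wait for:

```
{"id": 1, "sensor": "a", "reading": 7.2}
{"id": 2, "sensor": "b", "reading": 3.9}
{"id": 3, "sensor": "a", "reading": 8.1}
```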