Chunked Map Uploading for Maps larger than 100MB #318
Comments
I looked into this @Gocnak, but it feels like Uppy is a bit wack with Angular; their latest package isn't even bundled correctly :) Might need to just improve on the file-upload component visually instead |
Yep that's fair! I'll rename the issue to better fit then. |
Do we really need this? Most modern internet connections should be able to handle a single POST request of ~100MB perfectly fine, and the 0.10.0 deploy shouldn't have an issue with files > 100MB, even though it still streams through the backend. As Nick and I discussed recently, if our backend instances start to use too much bandwidth and we need to do fancier upload stuff we can, but until we actually run into that problem (with the 0.10.0 deploy), we're fighting an (often extremely complex) problem before it's actually affected us. |
Note that if we're protecting our backend via Cloudflare, we will need to have multipart uploading to get any file > 100 MB uploaded, unless we use presigned URLs. |
Sorry, what Cloudflare system requires this? Is there really no way to configure this? |
We discussed this a few weeks ago; our free Cloudflare DDoS protection limits uploads to a maximum of 100MB. Chunking with the current system is really annoying given our only state atm is in Postgres. Since we don't do anything with BSPs (currently) on the backend, the easiest approach is probably presigned URLs. But this is definitely low priority; maps should really try to stay under 100MB. Not sure it's completely necessary for 0.10.0. |
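For reference, a minimal sketch of the presigned-URL idea, assuming an S3-compatible store and the AWS SDK v3; the bucket name, key scheme, and expiry below are illustrative assumptions, not the project's actual config:

```typescript
// Hedged sketch: hand the client a presigned PUT URL so the BSP goes straight
// to object storage, bypassing the Cloudflare-proxied backend and its 100MB limit.
import { S3Client, PutObjectCommand } from '@aws-sdk/client-s3';
import { getSignedUrl } from '@aws-sdk/s3-request-presigner';

const s3 = new S3Client({ region: 'us-east-1' }); // assumed region

export async function getMapUploadUrl(mapID: number): Promise<string> {
  const command = new PutObjectCommand({
    Bucket: 'momentum-maps',  // hypothetical bucket name
    Key: `maps/${mapID}.bsp`, // hypothetical key scheme
  });
  // URL is valid for 15 minutes; the client PUTs the file directly to storage.
  return getSignedUrl(s3, command, { expiresIn: 900 });
}
```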
Okay, we can do chunking and use disk, but it needs a spec; I don't have time now, ping me if interested in doing this. |
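To make the "chunk to disk" option concrete, here's a rough sketch of what the backend side might look like; the temp directory, helper name, and in-order-chunk assumption are all placeholders and not an agreed spec:

```typescript
// Hedged sketch: append each incoming chunk to a temp file keyed by map ID,
// and finalize the file once the last chunk arrives.
import { promises as fs } from 'node:fs';
import * as path from 'node:path';

const TMP_DIR = '/tmp/map-uploads'; // assumed scratch location on the instance

export async function handleChunk(
  mapID: number,
  index: number,
  total: number,
  chunk: Buffer,
): Promise<'partial' | 'complete'> {
  await fs.mkdir(TMP_DIR, { recursive: true });
  const tmpPath = path.join(TMP_DIR, `${mapID}.bsp.part`);
  // Chunks are assumed to arrive in order; out-of-order delivery would need
  // per-chunk offsets or separate part files stitched together at the end.
  await fs.appendFile(tmpPath, chunk);
  if (index + 1 < total) return 'partial';
  // Last chunk: move the assembled file to wherever the storage step expects it.
  await fs.rename(tmpPath, path.join(TMP_DIR, `${mapID}.bsp`));
  return 'complete';
}
```

The awkward part, as noted above, is that this requires per-upload state outside Postgres (or routing all chunks of one upload to the same instance), which is why presigned URLs look simpler.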
Our file uploading process is currently just a single POST request to the backend. It'd be great to get resumable uploads with multipart chunking, especially when we move to cloud storage.
The frontend should update visually to respect this!
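On the frontend side, a minimal sketch of what chunked uploading could look like in the browser; the endpoint path and 50MB chunk size are assumptions for illustration only:

```typescript
// Hedged sketch of client-side chunked upload: slice the BSP into parts and
// POST each one, so no single request exceeds the ~100MB proxy limit.
const CHUNK_SIZE = 50 * 1024 * 1024; // 50MB per request, well under 100MB

export async function uploadMapFile(mapID: number, file: File): Promise<void> {
  const totalChunks = Math.ceil(file.size / CHUNK_SIZE);
  for (let index = 0; index < totalChunks; index++) {
    const chunk = file.slice(index * CHUNK_SIZE, (index + 1) * CHUNK_SIZE);
    const res = await fetch(
      `/api/maps/${mapID}/upload-chunk?index=${index}&total=${totalChunks}`, // hypothetical endpoint
      { method: 'POST', body: chunk },
    );
    if (!res.ok) throw new Error(`Chunk ${index} failed: ${res.status}`);
    // A progress callback here could drive the file-upload component's progress bar.
  }
}
```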