Replies: 1 comment
-
You're in for a world of pain. Since none of your markdown files / bodies will be larger than 1 GB, just store them in PostgreSQL and everything will be much easier :)
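For context on the reply above: PostgreSQL's `text` type holds up to 1 GB per value, and large values are compressed and stored out-of-line via TOAST, so big markdown bodies don't bloat the main row storage. If the goal is just to keep the wide body column out of the hot `articles` table, a 1:1 side table in Postgres itself gets you most of the way; a minimal sketch (table and column names are illustrative, assuming `articles` has a `bigint` primary key `id`):

```sql
-- Hypothetical split: keep article metadata narrow,
-- move bodies to a 1:1 side table in the same database.
CREATE TABLE article_bodies (
  article_id bigint PRIMARY KEY REFERENCES articles (id) ON DELETE CASCADE,
  body       text NOT NULL  -- up to 1 GB; large values are TOASTed out-of-line
);

-- Queries that don't need the body never touch the large values:
SELECT a.id, a.title
FROM articles a;

-- Join in the body only when serving a full article:
SELECT a.id, a.title, b.body
FROM articles a
JOIN article_bodies b ON b.article_id = a.id;
```

Because everything stays in one database, PostgREST can expose `article_bodies` (or a view joining the two tables) with no extra tooling.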
-
I have a postgres schema for relating authors, posts, comments, likes, follows, and all kinds of stuff that works great for relational queries.
The articles table in particular is great, with the exception of the "body" column, where all of a sudden I'm hosting a large markdown file.
I want to have an "article_bodies" table in MongoDB that holds many [article_id:article_body] entries, which in my understanding is what MongoDB was made for. I'm thinking this will free up performance in my Postgres instance while allowing me to keep writing relational queries everywhere.
Is there a fast and easy way to connect my pg "articles" table to a Mongo "articles" table? I'm using PostgREST to host the REST API; will this have the performance impact I'm expecting? What I'm imagining is a tool that propagates a query from my existing PostgREST API call on to MongoDB as needed, the same way it's already handled for Postgres, so it seems like PostgREST is the correct level of the stack to implement this.
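One way to surface an external store through PostgREST without forking it is a foreign data wrapper, since PostgREST only ever talks to Postgres. `mongo_fdw` is an existing FDW for MongoDB; a hedged sketch of the setup (server address, credentials, database, collection, and column names are all assumptions for illustration):

```sql
-- Requires the mongo_fdw extension to be installed on the Postgres server.
CREATE EXTENSION mongo_fdw;

CREATE SERVER mongo_server
  FOREIGN DATA WRAPPER mongo_fdw
  OPTIONS (address '127.0.0.1', port '27017');

CREATE USER MAPPING FOR CURRENT_USER
  SERVER mongo_server
  OPTIONS (username 'app', password 'secret');

-- Expose the Mongo collection as a regular-looking Postgres table:
CREATE FOREIGN TABLE article_bodies (
  _id        name,
  article_id bigint,
  body       text
)
SERVER mongo_server
OPTIONS (database 'blog', collection 'article_bodies');

-- PostgREST can then serve joins across the two stores via a view:
CREATE VIEW full_articles AS
SELECT a.id, a.title, b.body
FROM articles a
JOIN article_bodies b ON b.article_id = a.id;
```

This keeps stock PostgREST unmodified: it just sees tables and views in the exposed schema, and the FDW pushes reads through to MongoDB. Whether this actually performs better than a plain Postgres side table is worth benchmarking before committing.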
Opening a discussion because this is not a bug report or feature request. I'm assuming that wanting to interface with MongoDB makes this at best a fork of PostgREST, because even if it gets built I truly do want PostgREST to focus on being the best CRUD-over-PG layer ever — it currently is, and has saved me so much time and frustration.