
Streaming large queries using a callback function #148

@rweichler

Description

I just want to know if this is possible; I can implement it myself. Straight to the question: on line 677 in init.moon, is row_desc guaranteed to be non-nil at that point? In other words, does the PostgreSQL protocol guarantee that MSG_TYPE_B.row_description is sent before MSG_TYPE_B.data_row?

If that's the case, then I can proceed with implementing what I said in the title.

My dream is this:

local websocket = ...
pg:query('SELECT * FROM table LIMIT 100000', function(row)
    websocket:send_text(require'cjson'.encode(row)..'\n')
end)

What's nice about this is that you don't have to wait for postgres to return all 100k rows before sending anything over the websocket. You can start sending as soon as you get the first row, which is a lot faster, and it plays really nicely with the coroutine paradigm.
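
For example, if query grew a callback argument like this (hypothetical, not the current API), it could be wrapped into a plain for-loop iterator with a coroutine. Assuming the callback is invoked from Lua code, the yield inside it crosses no C boundary:

-- Sketch: turning a per-row callback into an iterator with coroutines.
-- pg:query taking a callback is the hypothetical API proposed here.
local function query_rows(pg, sql)
  return coroutine.wrap(function()
    pg:query(sql, function(row)
      coroutine.yield(row)
    end)
  end)
end

-- usage:
-- for row in query_rows(pg, 'SELECT * FROM table LIMIT 100000') do
--   websocket:send_text(require'cjson'.encode(row)..'\n')
-- end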

If my hypothesis in the first paragraph is correct, then I'd implement it like this:

On line 667 (linked above) you can simply call the callback function after running this part of format_query_result on the raw string that PostgreSQL gave you over the wire.
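
Roughly, the receive loop could look like the sketch below. Everything here (read_message, decode_row_description, decode_data_row) is a hypothetical stand-in rather than pgmoon's actual internals; it's just to show where the callback would fire:

-- Sketch only: read_message, decode_row_description and decode_data_row
-- are hypothetical stand-ins, not pgmoon's real internals.
local function stream_query(read_message, decode_row_description, decode_data_row, callback)
  local row_desc -- nil until a row_description message arrives

  while true do
    local msg_type, payload = read_message()

    if msg_type == "T" then -- row_description
      row_desc = decode_row_description(payload)
    elseif msg_type == "D" then -- data_row
      -- the protocol sends row_description before any data_row for a
      -- result set, so row_desc is already populated here
      callback(decode_data_row(payload, row_desc))
    elseif msg_type == "Z" then -- ready_for_query: we're done
      break
    end
  end
end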
