
Conversation

@Chris-Schnaufer
Contributor

Large files on underpowered machines can make the database appear to hang due to locking conflicts. One cause is that loading a large file in one go uses significant resources, slowing the load down. Breaking large source files apart and loading them in a loop resolves this issue, as sketched below.
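
A minimal sketch of the looped-load idea (the directory, table name, threshold, and output file below are illustrative assumptions, not values from the actual script):

    exec 3> load_commands.sql            # fd 3 collects \COPY statements for psql, as in the snippet below

    DUMPDIR="/tmp/dump"                  # assumed dump location
    T="traits"                           # assumed table name
    SPLIT_FILE_MAX=100000                # assumed maximum lines per \COPY

    if [ "$(wc -l < "${DUMPDIR}/${T}.csv")" -gt "${SPLIT_FILE_MAX}" ]; then
        # Break the large source file into SPLIT_FILE_MAX-line chunks ...
        split -l "${SPLIT_FILE_MAX}" "${DUMPDIR}/${T}.csv" "${DUMPDIR}/${T}.part."
        # ... and emit one \COPY per chunk so no single load holds locks for long
        for PART in "${DUMPDIR}/${T}".part.*; do
            echo "\COPY ${T} FROM '${PART}' WITH (DELIMITER ' ', NULL '\\N', ESCAPE '\\', FORMAT CSV, ENCODING 'UTF-8')" >&3
        done
    else
        # Small file: load it in one go, as before
        echo "\COPY ${T} FROM '${DUMPDIR}/${T}.csv' WITH (DELIMITER ' ', NULL '\\N', ESCAPE '\\', FORMAT CSV, ENCODING 'UTF-8')" >&3
    fi

Note that split -l assumes one record per line; CSV files with quoted multi-line fields would need a splitter that respects record boundaries.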

@Chris-Schnaufer marked this pull request as ready for review November 21, 2022 22:44
@dlebauer
Member

dlebauer commented May 4, 2025

@Kooper please review

if [ -f "${DUMPDIR}/${T}.csv" ]; then
echo "\COPY ${T} FROM '${DUMPDIR}/${T}.csv' WITH (DELIMITER ' ', NULL '\\N', ESCAPE '\\', FORMAT CSV, ENCODING 'UTF-8')" >&3
T_SIZE=`cat ${DUMPDIR}/${T}.csv | wc -l`
if [ "${T_SIZE}" -gt "${SPLIT_FILE_MAX}" ]; then
Member

Can we have SPLIT_FILE_MAX == 0 mean "use the old algorithm", and make that the default value?
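
One way to read that request (a hedged sketch; only SPLIT_FILE_MAX and T_SIZE come from the snippet above, the rest is illustrative):

    T="traits"                              # assumed table name
    T_SIZE=250000                           # assumed row count of ${T}.csv

    SPLIT_FILE_MAX="${SPLIT_FILE_MAX:-0}"   # default 0 keeps the old single-\COPY path

    if [ "${SPLIT_FILE_MAX}" -gt 0 ] && [ "${T_SIZE}" -gt "${SPLIT_FILE_MAX}" ]; then
        echo "splitting ${T}.csv into ${SPLIT_FILE_MAX}-line chunks before loading"
    else
        echo "loading ${T}.csv with a single \COPY"
    fi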
