Is DBFS required when connecting to Databricks? #213
Unanswered
AnoopShahi asked this question in Q&A
Replies: 2 comments 3 replies
-
Does the user who generated the API key for Databricks have SELECT privileges on the catalog/schema used for storing the metadata? I also looked at getting Databricks working. It looks like the DBFS approach is the only way, which is unfortunate. I think I'm going to just use the REST API instead.
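For reference, that check comes down to the Unity Catalog grants on whatever catalog and schema cs_tools writes its metadata into. A minimal sketch, assuming a hypothetical `main` catalog, a `cs_tools` schema, and a principal `cs_tools_user@example.com` (substitute the real names from your connection config):

```sql
-- Hypothetical names throughout; substitute your own catalog, schema, and principal.
-- The principal behind the API token needs to see the catalog/schema and read/write its tables.
GRANT USE CATALOG ON CATALOG main TO `cs_tools_user@example.com`;
GRANT USE SCHEMA ON SCHEMA main.cs_tools TO `cs_tools_user@example.com`;
GRANT CREATE TABLE, SELECT, MODIFY ON SCHEMA main.cs_tools TO `cs_tools_user@example.com`;
```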
1 reply
-
@AnoopShahi @justin-gerolami can you try adding ...? This will fall back to JDBC-style inserts instead of using the DBFS API + upsert strategy. It's much slower, but it should be functional.
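"JDBC-style inserts" here means writing rows with ordinary INSERT statements over the existing SQL connection, rather than staging a CSV in DBFS and loading it with COPY INTO. A rough sketch of what that fallback looks like, with illustrative column names (the real ts_cluster columns may differ):

```sql
-- Fallback path: plain INSERTs over the SQL connection; no DBFS staging or file read involved.
-- Column names and values below are illustrative, not the actual cs_tools schema.
INSERT INTO ts_cluster (cluster_guid, url, timezone)
VALUES ('a36e3', 'https://example.thoughtspot.cloud', 'UTC');
```

Because nothing is read from a file path, this route sidesteps the file-read permission entirely, at the cost of throughput on large metadata loads.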
2 replies
-
Hello,
I was running cs_tools searchable metadata for Databricks in a Unity Catalog-enabled environment and ran into some permission errors. The generated SQL below (FROM 'dbfs:/FileStore/cs_tools/TMP_STAGE_ts_cluster_a36e3.csv') references DBFS, and after speaking with the person who manages Databricks on our side, I wanted to know whether DBFS access is required for cs_tools to run. If it is, where can we find out at what step in the whole process that connection is established?
DatabaseError: (databricks.sql.exc.ServerOperationError) [INSUFFICIENT_PERMISSIONS] Insufficient privileges:
User does not have permission SELECT on any file. SQLSTATE: 42501
[SQL:
COPY INTO ts_cluster
FROM 'dbfs:/FileStore/cs_tools/TMP_STAGE_ts_cluster_a36e3.csv'
FILEFORMAT = CSV
FORMAT_OPTIONS (
    'header' = 'true',
    'inferSchema' = 'true',
    'delimiter' = '|'
)
COPY_OPTIONS ('mergeSchema' = 'true')
]
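The [INSUFFICIENT_PERMISSIONS] error above appears to be about the ANY FILE securable: COPY INTO ... FROM 'dbfs:/FileStore/...' makes the warehouse read a raw file path, and on a Unity Catalog-enabled workspace that read is gated separately from table privileges. If your Databricks admin is willing to allow it, the legacy grant looks roughly like the sketch below (the principal is hypothetical, and many UC-enabled workspaces restrict or disallow ANY FILE, in which case the JDBC-style fallback mentioned earlier is the practical workaround):

```sql
-- Legacy table-ACL grant that permits reading arbitrary DBFS paths, e.g. via COPY INTO.
-- Hypothetical principal; ANY FILE may be restricted on Unity Catalog compute.
GRANT SELECT ON ANY FILE TO `cs_tools_user@example.com`;
```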