Introducing support for InterSystems IRIS #302

Merged · 24 commits · Jan 29, 2025

Commits
bb26660
initial support for InterSystems IRIS
bdeboe Feb 29, 2024
02b73bd
reran roxygen
bdeboe Feb 29, 2024
b2ffcf8
use uppercase IRIS in JDBC URL
bdeboe Jun 24, 2024
05f1f31
initial support for InterSystems IRIS
bdeboe Feb 29, 2024
0b7c045
reran roxygen
bdeboe Feb 29, 2024
05cc5db
use uppercase IRIS in JDBC URL
bdeboe Jun 24, 2024
2c4a46c
Merge branch 'main' of github.com:intersystems-community/OHDSI-Databa…
bdeboe Sep 16, 2024
6f307d9
Note InterSystems IRIS in reference
bdeboe Sep 16, 2024
4283406
Fix section formatting & rerun roxygen
bdeboe Sep 16, 2024
69f8dbb
require SqlRender 1.19.0 for new IRIS dialect support
bdeboe Oct 9, 2024
46c896b
simplify testing when test servers are undefined
bdeboe Oct 9, 2024
cc2f38d
simplify retrieving JDBC driver for InterSystems IRIS
bdeboe Oct 9, 2024
6009263
initial IRIS setup
bdeboe Oct 10, 2024
211152d
Merge branch 'develop' of github.com:OHDSI/DatabaseConnector
bdeboe Oct 10, 2024
8a6cdbd
use executeBatch() rather than executeLargeBatch() on InterSystems IRIS
bdeboe Oct 10, 2024
026d0b3
add InterSystems IRIS datatypes to insertTable unit test
bdeboe Oct 10, 2024
e2b5986
drop temp table if exists at start of test
bdeboe Oct 10, 2024
7880ca4
reran roxygen
bdeboe Jan 2, 2025
57a2469
reran pkgdown
bdeboe Jan 2, 2025
2ac302c
update IRIS driver, pull from Maven
bdeboe Jan 2, 2025
2ffbd15
merge from OHDSI/main
bdeboe Jan 27, 2025
cd08658
Merge branch 'OHDSI-main'
bdeboe Jan 27, 2025
d6cc165
remove stray test file
bdeboe Jan 27, 2025
7badb96
use SqlRender 1.19.1
bdeboe Jan 27, 2025
7 changes: 4 additions & 3 deletions DESCRIPTION
@@ -13,14 +13,15 @@ Authors@R: c(
person("Amazon Inc.", role = c("cph"), comment = "RedShift JDBC driver")
)
Description: An R 'DataBase Interface' ('DBI') compatible interface to various database platforms ('PostgreSQL', 'Oracle', 'Microsoft SQL Server',
'Amazon Redshift', 'Microsoft Parallel Database Warehouse', 'IBM Netezza', 'Apache Impala', 'Google BigQuery', 'Snowflake', 'Spark', and 'SQLite'). Also includes support for
fetching data as 'Andromeda' objects. Uses either 'Java Database Connectivity' ('JDBC') or other 'DBI' drivers to connect to databases.
'Amazon Redshift', 'Microsoft Parallel Database Warehouse', 'IBM Netezza', 'Apache Impala', 'Google BigQuery', 'Snowflake', 'Spark', 'SQLite',
and 'InterSystems IRIS'). Also includes support for fetching data as 'Andromeda' objects. Uses either 'Java Database Connectivity' ('JDBC') or
other 'DBI' drivers to connect to databases.
SystemRequirements: Java (>= 8)
Depends:
R (>= 4.0.0)
Imports:
rJava,
SqlRender (>= 1.16.0),
SqlRender (>= 1.19.1),
methods,
stringr,
readr,
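
The bumped SqlRender requirement above is what supplies the new "iris" target dialect that this package relies on. A minimal sketch of translating OHDSI SQL for IRIS, assuming SqlRender >= 1.19.1 is installed (the query and schema name are only illustrations):

library(SqlRender)

# Render a parameterized OHDSI SQL snippet, then translate it to the
# InterSystems IRIS dialect introduced in SqlRender 1.19.x.
sql <- render("SELECT TOP 10 * FROM @cdm_schema.person;", cdm_schema = "omop_cdm")
translate(sql, targetDialect = "iris")
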
35 changes: 34 additions & 1 deletion R/Connect.R
@@ -32,7 +32,8 @@ checkIfDbmsIsSupported <- function(dbms) {
"spark",
"snowflake",
"synapse",
"duckdb"
"duckdb",
"iris"
)
deprecated <- c(
"hive",
@@ -332,6 +333,8 @@ connectUsingJdbc <- function(connectionDetails) {
return(connectSpark(connectionDetails))
} else if (dbms == "snowflake") {
return(connectSnowflake(connectionDetails))
} else if (dbms == "iris") {
return(connectIris(connectionDetails))
} else {
abort("Something went wrong when trying to connect to ", dbms)
}
@@ -747,6 +750,36 @@ connectSqlite <- function(connectionDetails) {
return(connection)
}

connectIris <- function(connectionDetails) {
inform("Connecting using InterSystems IRIS driver")
jarPath <- findPathToJar("^intersystems-jdbc-.*\\.jar$", connectionDetails$pathToDriver)
driver <- getJbcDriverSingleton("com.intersystems.jdbc.IRISDriver", jarPath)
if (is.null(connectionDetails$connectionString()) || connectionDetails$connectionString() == "") {
if (is.null(connectionDetails$port())) {
port <- "1972"
} else {
port <- connectionDetails$port()
}
connectionString <- paste0("jdbc:IRIS://", connectionDetails$server(), ":", port, "/USER") # supply a full connection string to use a non-default namespace
if (!is.null(connectionDetails$extraSettings)) {
connectionString <- paste(connectionString, connectionDetails$extraSettings, sep = ";")
}
} else {
connectionString <- connectionDetails$connectionString()
}
if (is.null(connectionDetails$user())) {
connection <- connectUsingJdbcDriver(driver, connectionString, dbms = connectionDetails$dbms)
} else {
connection <- connectUsingJdbcDriver(driver,
connectionString,
user = connectionDetails$user(),
password = connectionDetails$password(),
dbms = connectionDetails$dbms
)
}
return(connection)
}

connectUsingJdbcDriver <- function(jdbcDriver,
url,
identifierQuote = "'",
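
For context, the connectIris() path above is reached through the usual createConnectionDetails()/connect() pair. A minimal sketch using the server/port form, which falls back to port 1972 and the default USER namespace; the host, credentials, and driver folder below are placeholders:

library(DatabaseConnector)

# Placeholder values; connectIris() assembles "jdbc:IRIS://<server>:<port>/USER" from them.
connectionDetails <- createConnectionDetails(
  dbms = "iris",
  server = "iris.example.com",
  port = 1972,
  user = "ohdsi",
  password = "secret",
  pathToDriver = "~/jdbc_drivers"
)

conn <- connect(connectionDetails)
querySql(conn, "SELECT COUNT(*) FROM INFORMATION_SCHEMA.TABLES")  # simple smoke test
disconnect(conn)
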
4 changes: 2 additions & 2 deletions R/DatabaseConnector.R
@@ -56,7 +56,7 @@ NULL
#' functions to point to the driver. Alternatively, you can set the 'DATABASECONNECTOR_JAR_FOLDER' environmental
#' variable, for example in your .Renviron file (recommended).
#'
#' # SQL Server, Oracle, PostgreSQL, PDW, Snowflake, Spark, RedShift, Azure Synapse, BigQuery
#' # SQL Server, Oracle, PostgreSQL, PDW, Snowflake, Spark, RedShift, Azure Synapse, BigQuery, InterSystems IRIS
#'
#' Use the [downloadJdbcDrivers()] function to download these drivers from the OHDSI GitHub pages.
#'
@@ -75,7 +75,7 @@ NULL
#'
#' For SQLite we actually don't use a JDBC driver. Instead, we use the RSQLite package, which can be installed
#' using `install.packages("RSQLite")`.
#'
#'
NULL

globalVars <- new.env()
11 changes: 7 additions & 4 deletions R/Drivers.R
@@ -33,6 +33,7 @@ jdbcDrivers <- new.env()
#' - "spark" for Spark
#' - "snowflake" for Snowflake
#' - "bigquery" for Google BigQuery
#' - "iris" for InterSystems IRIS
#' - "all" for all aforementioned platforms
#'
#' @param method The method used for downloading files. See `?download.file` for details and options.
@@ -48,6 +49,7 @@ jdbcDrivers <- new.env()
#' - Spark (Databricks): V2.6.36
#' - Snowflake: V3.16.01
#' - BigQuery: v1.3.2.1003
#' - InterSystems IRIS: v3.10.2
#'
#' @return Invisibly returns the destination if the download was successful.
#' @export
@@ -81,9 +83,9 @@ downloadJdbcDrivers <- function(dbms, pathToDriver = Sys.getenv("DATABASECONNECT
warn(paste0("The folder location '", pathToDriver, "' does not exist. Attempting to create."))
dir.create(pathToDriver, recursive = TRUE)
}
stopifnot(is.character(dbms), length(dbms) == 1, dbms %in% c("all", "postgresql", "redshift", "sql server", "oracle", "pdw", "snowflake", "spark", "bigquery"))

stopifnot(is.character(dbms), length(dbms) == 1, dbms %in% c("all", "postgresql", "redshift", "sql server", "oracle", "pdw", "snowflake", "spark", "bigquery", "iris"))

if (dbms == "pdw" || dbms == "synapse") {
dbms <- "sql server"
}
@@ -96,7 +98,8 @@ downloadJdbcDrivers <- function(dbms, pathToDriver = Sys.getenv("DATABASECONNECT
4,oracle,oracleV19.8.zip,https://ohdsi.github.io/DatabaseConnectorJars/
5,spark,DatabricksJDBC42-2.6.36.1062.zip,https://databricks-bi-artifacts.s3.us-east-2.amazonaws.com/simbaspark-drivers/jdbc/2.6.36/
6,snowflake,snowflake-jdbc-3.16.1.jar,https://repo1.maven.org/maven2/net/snowflake/snowflake-jdbc/3.16.1/
7,bigquery,SimbaJDBCDriverforGoogleBigQuery42_1.6.2.1003.zip,https://storage.googleapis.com/simba-bq-release/jdbc/"
7,bigquery,SimbaJDBCDriverforGoogleBigQuery42_1.6.2.1003.zip,https://storage.googleapis.com/simba-bq-release/jdbc/
8,iris,intersystems-jdbc-3.10.2.jar,https://repo1.maven.org/maven2/com/intersystems/intersystems-jdbc/3.10.2/"
)

if (dbms == "all") {
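
With the new table entry above, fetching the IRIS driver works like any other platform. A quick sketch (the destination folder is just an example):

library(DatabaseConnector)

# Downloads intersystems-jdbc-3.10.2.jar from Maven Central into the given folder.
downloadJdbcDrivers(dbms = "iris", pathToDriver = "~/jdbc_drivers")
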
2 changes: 1 addition & 1 deletion R/RStudio.R
@@ -76,7 +76,7 @@ unregisterWithRStudio <- function(connection) {
}

hasCatalogs <- function(connection) {
return(dbms(connection) %in% c("pdw", "postgresql", "sql server", "synapse", "redshift", "snowflake", "spark", "bigquery", "duckdb"))
return(dbms(connection) %in% c("pdw", "postgresql", "sql server", "synapse", "redshift", "snowflake", "spark", "bigquery", "duckdb", "iris"))
}

listDatabaseConnectorColumns <- function(connection,
13 changes: 10 additions & 3 deletions R/Sql.R
@@ -291,9 +291,10 @@ lowLevelExecuteSql.default <- function(connection, sql) {

statement <- rJava::.jcall(connection@jConnection, "Ljava/sql/Statement;", "createStatement")
on.exit(rJava::.jcall(statement, "V", "close"))
if (dbms(connection) == "spark") {
if ((dbms(connection) == "spark") || (dbms(connection) == "iris")) {
# For some queries the DataBricks JDBC driver will throw an error saying no ROWCOUNT is returned
# when using executeLargeUpdate, so using execute instead.
# when using executeLargeUpdate, so using execute instead.
# Also use this approach for IRIS JDBC driver, which does not support executeLargeUpdate() directly.
rJava::.jcall(statement, "Z", "execute", as.character(sql), check = FALSE)
rowsAffected <- rJava::.jcall(statement, "I", "getUpdateCount", check = FALSE)
if (rowsAffected == -1) {
@@ -435,7 +436,13 @@ executeSql <- function(connection,
tryCatch(
{
startQuery <- Sys.time()
rowsAffected <- c(rowsAffected, rJava::.jcall(statement, "[J", "executeLargeBatch"))
# InterSystems IRIS JDBC supports batch updates but does not have a separate
# executeLargeBatch() method
if (con@dbms == "iris") {
rowsAffected <- c(rowsAffected, rJava::.jcall(statement, "[I", "executeBatch"))
} else {
rowsAffected <- c(rowsAffected, rJava::.jcall(statement, "[J", "executeLargeBatch"))
}
delta <- Sys.time() - startQuery
if (profile) {
inform(paste("Statements", start, "through", end, "took", delta, attr(delta, "units")))
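
The executeBatch() branch above is exercised whenever executeSql() flushes a batch of statements against an IRIS connection. A small sketch, assuming conn is the open IRIS connection from the earlier example and the table name is made up:

# Two statements separated by ';' are sent as one batch; on IRIS the batch is
# flushed with executeBatch() (int[] row counts) rather than executeLargeBatch().
executeSql(conn, "CREATE TABLE demo_tmp (id INT); INSERT INTO demo_tmp (id) VALUES (1);")
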
1 change: 1 addition & 0 deletions README.md
@@ -27,6 +27,7 @@ Features
- IBM Netezza
- SQLite
- Spark
- InterSystems IRIS
- Statements for executing queries with
- Error reporting to file
- Progress reporting
13 changes: 9 additions & 4 deletions docs/articles/Connecting.html

Some generated files are not rendered by default.

1 change: 1 addition & 0 deletions man-roxygen/Dbms.R
@@ -11,4 +11,5 @@
#' - "sqlite extended" for SQLite with extended types (DATE and DATETIME)
#' - "spark" for Spark
#' - "snowflake" for Snowflake
#' - "iris" for InterSystems IRIS
#'
8 changes: 8 additions & 0 deletions man-roxygen/DefaultConnectionDetails.R
@@ -131,6 +131,14 @@
#' - `user`. The user name used to access the server.
#' - `password`. The password for that user.
#'
#' InterSystems IRIS:
#' - `connectionString`. The connection string (e.g. starting with
#' 'jdbc:IRIS://host:port/namespace'). Alternatively, you can provide
#' values for `server` and `port`, in which case the default `USER` namespace
#' is used to connect.
#' - `user`. The user name used to access the server.
#' - `password`. The password for that user.
#' - `pathToDriver`. The path to the folder containing the InterSystems IRIS JDBC driver JAR file.
#'
#' ## Windows authentication for SQL Server:
#'
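
A short sketch of the connection-string variant documented above, for connecting to a non-default namespace (the host, namespace, credentials, and driver folder are placeholders):

connectionDetails <- createConnectionDetails(
  dbms = "iris",
  connectionString = "jdbc:IRIS://iris.example.com:1972/OMOP",  # hypothetical OMOP namespace
  user = "ohdsi",
  password = "secret",
  pathToDriver = "~/jdbc_drivers"
)
conn <- connect(connectionDetails)
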
2 changes: 1 addition & 1 deletion man/DatabaseConnector-package.Rd

Some generated files are not rendered by default.

12 changes: 12 additions & 0 deletions man/connect.Rd

Some generated files are not rendered by default.

12 changes: 12 additions & 0 deletions man/createConnectionDetails.Rd

Some generated files are not rendered by default.

1 change: 1 addition & 0 deletions man/createDbiConnectionDetails.Rd

Some generated files are not rendered by default.

2 changes: 2 additions & 0 deletions man/downloadJdbcDrivers.Rd

Some generated files are not rendered by default.

2 changes: 1 addition & 1 deletion man/jdbcDrivers.Rd

Some generated files are not rendered by default.
