
update to 0.3.1: bump GeoSpark Scala package to 1.3.1
harryzhu committed Mar 2, 2020
1 parent bf6e596 commit 732c6ac
Showing 5 changed files with 12 additions and 9 deletions.
3 changes: 2 additions & 1 deletion .Rbuildignore
@@ -1,7 +1,8 @@
 ^CRAN-RELEASE$
 ^.*\.Rproj$
 ^\.Rproj\.user$
 ^\.travis\.yml$
 Reference.md$
 ^docs/
+^logs$
-^derby\.log$
+^derby\.log$
10 changes: 5 additions & 5 deletions DESCRIPTION
@@ -1,17 +1,17 @@
 Package: geospark
 Type: Package
 Title: Bring Local Sf to Spark
-Version: 0.2.1
+Version: 0.3.1
 Authors@R: c(
     person("Harry", "Zhu", email = "[email protected]", role = c("aut", "cre")),
     person("Javier", "Luraschi", email = "[email protected]", role = c("ctb"))
     )
 Maintainer: Harry Zhu <[email protected]>
 BugReports: https://github.com/harryprince/geospark/issues
 Description: R binds 'GeoSpark' <http://geospark.datasyslab.org/> extending 'sparklyr'
-    <https://spark.rstudio.com/> R package to make distributed geocomputing easier. Sf is a
-    package that provides [simple features](https://en.wikipedia.org/wiki/Simple_Features) access
-    for R and which is a leading geospatial data processing tool. Geospark R package bring
+    <https://spark.rstudio.com/> R package to make distributed 'geocomputing' easier. Sf is a
+    package that provides [simple features] <https://en.wikipedia.org/wiki/Simple_Features> access
+    for R and which is a leading 'geospatial' data processing tool. 'Geospark' R package bring
     the same simple features access like sf but running on Spark distributed system.
 License: Apache License (>= 2.0)
 Encoding: UTF-8
@@ -24,4 +24,4 @@ Imports:
     dbplyr (>= 1.3.0)
 RoxygenNote: 6.1.1
 Suggests:
-    testthat, knitr
+    testthat, knitr, utils
4 changes: 2 additions & 2 deletions R/dependencies.R
@@ -1,9 +1,9 @@
 spark_dependencies <- function(spark_version, scala_version, ...) {
   sparklyr::spark_dependency(
     packages = c(
-      paste0("org.datasyslab:geospark-sql_",sparklyr::spark_dependency_fallback(spark_version, c("2.1", "2.2", "2.3")),":1.2.0"),
+      paste0("org.datasyslab:geospark-sql_",sparklyr::spark_dependency_fallback(spark_version, c("2.1", "2.2", "2.3")),":1.3.1"),
       "com.vividsolutions:jts-core:1.14.0",
-      "org.datasyslab:geospark:1.2.0"
+      "org.datasyslab:geospark:1.3.1"
     ),
     initializer = function(sc, ...) {
       register_gis(sc)
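The `spark_dependencies()` change above leans on sparklyr's version fallback to pick the `geospark-sql` artifact matching the cluster's Spark line, then pins both JVM packages at 1.3.1. As a rough illustration of how that selection is assumed to behave (a sketch only — `pick_spark_line()` is a hypothetical stand-in, not sparklyr's actual `spark_dependency_fallback()` implementation):

```r
# Hypothetical sketch of the version-fallback idea: choose the newest
# supported Spark line that does not exceed the cluster's Spark version.
pick_spark_line <- function(spark_version, supported = c("2.1", "2.2", "2.3")) {
  v <- package_version(spark_version)
  ok <- Filter(function(s) package_version(s) <= v, supported)
  # if the cluster predates every supported line, fall back to the oldest
  if (length(ok) == 0) return(supported[[1]])
  as.character(max(package_version(ok)))
}

# The Maven coordinate this commit bumps to 1.3.1 would then be assembled as:
coord <- paste0("org.datasyslab:geospark-sql_", pick_spark_line("2.3.2"), ":1.3.1")
```

With Spark 2.3.2, for example, the sketch selects the "2.3" line, yielding the `geospark-sql_2.3` artifact at version 1.3.1.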
3 changes: 2 additions & 1 deletion R/st_example.R
@@ -6,6 +6,7 @@
 #' @examples
 #' library(geospark)
 #' library(sparklyr)
+#' library(utils)
 #'
 #' # use the proper master, like 'local', 'yarn', etc.
 #' sc <- spark_connect(master = "spark://HOST:PORT")
@@ -19,7 +20,7 @@
 #'
 #' @export
 st_example <- function(sc, geom = "polygons") {
-  geoms <- read.table(system.file(package="geospark",sprintf("examples/%s.txt",geom)), sep="|")
+  geoms <- utils::read.table(system.file(package="geospark",sprintf("examples/%s.txt",geom)), sep="|")
   switch (geom,
     "polygons" = {
       colnames(geoms) <- c("area","geom")
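The patched line above namespaces the call as `utils::read.table()`, which is why `utils` also joins `Suggests:` in DESCRIPTION and the roxygen example. The loading pattern `st_example()` uses — a pipe-delimited text file of label/WKT pairs — can be sketched against a temporary file (the rows below are made-up stand-ins for geospark's bundled `examples/polygons.txt`):

```r
# Sketch of st_example()'s data-loading step, using a temp file in place of
# the file shipped in the installed package; the WKT values are illustrative.
tmp <- tempfile(fileext = ".txt")
writeLines(c(
  "a|POLYGON ((0 0, 1 0, 1 1, 0 1, 0 0))",
  "b|POLYGON ((0 0, 2 0, 2 2, 0 2, 0 0))"
), tmp)

# same call shape as the patched line: pipe-separated, no header
geoms <- utils::read.table(tmp, sep = "|", stringsAsFactors = FALSE)
colnames(geoms) <- c("area", "geom")   # as the "polygons" branch does
```

In the package itself the path comes from `system.file(package = "geospark", ...)`, so the file resolves inside the installed package rather than a temp directory.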
1 change: 1 addition & 0 deletions man/st_example.Rd

Some generated files are not rendered by default.
