spark-connections {sparklyr} | R Documentation

Description

These routines allow you to manage your connections to Spark.
Usage

spark_connect(master, spark_home = Sys.getenv("SPARK_HOME"),
  method = c("shell", "livy", "test"), app_name = "sparklyr",
  version = NULL, hadoop_version = NULL, config = spark_config(),
  extensions = sparklyr::registered_extensions())

spark_connection_is_open(sc)

spark_disconnect(sc, ...)

spark_disconnect_all()
Arguments

master
    Spark cluster url to connect to. Use "local" to connect to a local
    instance of Spark installed via spark_install.

spark_home
    The path to a Spark installation. Defaults to the path provided by
    the SPARK_HOME environment variable.

method
    The method used to connect to Spark. Currently, only "shell" is
    supported.

app_name
    The application name to be used while running in the Spark cluster.

version
    The version of Spark to use. Only applicable to "local" Spark
    connections.

hadoop_version
    The version of Hadoop to use. Only applicable to "local" Spark
    connections.

config
    Custom configuration for the generated Spark connection. See
    spark_config for details; a sketch using a custom config follows
    this argument list.

extensions
    Extension packages to enable for this connection. By default, all
    packages enabled through the use of sparklyr::register_extension
    will be passed here.

sc
    A spark_connection.

...
    Optional arguments; currently unused.
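As a minimal sketch of passing a custom configuration, the following assumes a local Spark installation (for example, one set up with spark_install()); the sparklyr.shell.driver-memory option is used purely as an illustrative config key and can be adjusted or omitted.

library(sparklyr)

# Build a configuration and override one option (illustrative key, not required)
conf <- spark_config()
conf$`sparklyr.shell.driver-memory` <- "2G"

# Connect to a local Spark instance using the custom configuration
sc <- spark_connect(master = "local", config = conf)

# ... use the connection ...

spark_disconnect(sc)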
Examples

library(sparklyr)

sc <- spark_connect(master = "spark://HOST:PORT")
spark_connection_is_open(sc)
spark_disconnect(sc)
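As a further hedged sketch (again assuming a local Spark installation), spark_connection_is_open() can guard work against a connection that has been closed, and spark_disconnect_all() closes every open connection in one call, which can be convenient at the end of a script:

library(sparklyr)

sc <- spark_connect(master = "local")

if (spark_connection_is_open(sc)) {
  # run Spark work against sc here
}

# Close sc and any other connections still open in this session
spark_disconnect_all()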