Adds a file to the running remote context.
Note that the URL must be reachable by the Spark driver process. If the driver runs in cluster mode, it may reside on a different host, so "file:" URLs must point to paths that exist on that node (not on the client machine).
If the provided URI has no scheme, it's considered to be relative to the default file system configured in the Livy server.
uri: The location of the file.
Returns: A future that can be used to monitor the operation.
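A minimal sketch of the call, assuming client is an already-built LivyScalaClient (see the construction example at the end of this page); the HDFS path is illustrative:

    import java.net.URI
    import scala.concurrent.Await
    import scala.concurrent.duration._

    // Register a file that already lives on the cluster's default file system.
    val addFileFuture = client.addFile(new URI("hdfs:///user/alice/lookup.csv"))
    Await.result(addFileFuture, 30.seconds)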
Adds a jar file to the running remote context.
Note that the URL must be reachable by the Spark driver process. If the driver runs in cluster mode, it may reside on a different host, so "file:" URLs must point to paths that exist on that node (not on the client machine).
If the provided URI has no scheme, it's considered to be relative to the default file system configured in the Livy server.
uri: The location of the jar file.
Returns: A future that can be used to monitor the operation.
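A minimal sketch, again assuming client is an existing LivyScalaClient and a jar that is already on the cluster's file system (path illustrative):

    import java.net.URI
    import scala.concurrent.Await
    import scala.concurrent.duration._

    // Add a jar on the cluster's file system to the remote context's classpath.
    val addJarFuture = client.addJar(new URI("hdfs:///user/alice/udfs.jar"))
    Await.result(addJarFuture, 30.seconds)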
Asks the remote context to run a job immediately.
Normally, the remote context will queue jobs and execute them based on how many worker threads have been configured. This method will run the submitted job in the same thread processing the RPC message, so that queueing does not apply.
It's recommended that this method only be used to run code that finishes quickly, so that it does not interfere with the normal operation of the context.
fn: The job to be executed: a function that receives a ScalaJobContext and returns the result of running the job with that context.
Returns: A handle that can be used to monitor the job.
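A minimal sketch of a quick job that fits this constraint, assuming client is an existing LivyScalaClient and that ctx.sc is the SparkContext exposed by ScalaJobContext:

    import scala.concurrent.Await
    import scala.concurrent.duration._

    // A cheap lookup that returns immediately; heavier work belongs in submit().
    val versionFuture = client.run { ctx => ctx.sc.version }
    println(Await.result(versionFuture, 10.seconds))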
Stops the remote context.
Any pending jobs will be cancelled, and the remote context will be torn down.
shutdownContext: Whether to shut down the underlying Spark context. If false, the context keeps running and can still receive commands, provided the backend being used supports it.
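For example, to shut everything down once the application is finished (assuming client is an existing LivyScalaClient):

    // Cancel pending jobs and shut down the underlying Spark context as well.
    // Passing false instead would leave the Spark context running, if the
    // backend supports that.
    client.stop(true)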
Submits a job for asynchronous execution.
fn: The job to be executed: a function that receives a ScalaJobContext and returns the result of running the job with that context.
Returns: A handle that can be used to monitor the job.
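A minimal sketch, assuming client is an existing LivyScalaClient; the function runs remotely against the SparkContext exposed by ScalaJobContext, and the returned ScalaJobHandle can be awaited like a Future:

    import scala.concurrent.Await
    import scala.concurrent.duration._

    // Queue a job in the remote context and wait for its result.
    val handle = client.submit { ctx =>
      ctx.sc.parallelize(1 to 100).count()
    }
    val count = Await.result(handle, 60.seconds) // count == 100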
Upload a file to be passed to the Spark application.
file: The local file to be uploaded.
Returns: A future that can be used to monitor this operation.
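A minimal sketch, assuming client is an existing LivyScalaClient and an illustrative local path:

    import java.io.File
    import scala.concurrent.Await
    import scala.concurrent.duration._

    // Stream a local file to the server, which passes it on to the application.
    val uploadFuture = client.uploadFile(new File("/tmp/lookup.csv"))
    Await.result(uploadFuture, 30.seconds)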
Upload a jar to be added to the Spark application classpath.
jar: The local file to be uploaded.
Returns: A future that can be used to monitor this operation.
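A minimal sketch, under the same assumptions as above, with an illustrative local jar path:

    import java.io.File
    import scala.concurrent.Await
    import scala.concurrent.duration._

    // Upload a locally built jar and add it to the remote classpath.
    val uploadJarFuture = client.uploadJar(new File("target/my-udfs.jar"))
    Await.result(uploadJarFuture, 30.seconds)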
A client for submitting Spark-based jobs to a Livy backend.
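A minimal sketch of building such a client, assuming a Livy server at an illustrative URL; asScalaClient is the implicit wrapper provided by the org.apache.livy.scalaapi package object:

    import java.net.URI
    import org.apache.livy.LivyClientBuilder
    import org.apache.livy.scalaapi._

    // Build the Java LivyClient and wrap it in the Scala API.
    val client = new LivyClientBuilder()
      .setURI(new URI("http://livy-server:8998"))
      .build()
      .asScalaClient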