- Add LROs for Spark operations that require special Livy polling (usage sketch below)
- The logic in these LROs is ported from test helpers written by the client team (and removed here as well)
- This is a rebase of #17677
- I had to silence a few tests due to more nullability issues (fixing in Azure/azure-rest-api-specs#12258)
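For context, a minimal sketch of how the new LRO is consumed from caller code. It assumes the operation type is named `SparkBatchOperation` and exposes the standard `Operation<T>` polling surface; the endpoint, pool, and file values are placeholders, not taken from this PR:

```C#
using System;
using System.Threading;
using Azure.Identity;
using Azure.Analytics.Synapse.Spark;

// Placeholder workspace values; replace with your Synapse endpoint and Spark pool.
SparkBatchClient client = new SparkBatchClient(
    new Uri("https://<my-workspace>.dev.azuresynapse.net"),
    "<my-spark-pool>",
    new DefaultAzureCredential());

SparkBatchJobOptions options = new SparkBatchJobOptions(
    $"batch-{Guid.NewGuid()}",
    "abfss://<file-system>@<account>.dfs.core.windows.net/<path>/wordcount.jar");

// StartCreateSparkBatchJob submits the job and returns the new Livy-polling LRO
// (assumed here to be SparkBatchOperation).
SparkBatchOperation createOperation = client.StartCreateSparkBatchJob(options);

// Poll through the standard Operation<T> surface until the Livy job reaches a terminal state.
while (!createOperation.HasCompleted)
{
    Thread.Sleep(2000); // arbitrary polling interval for this sketch
    createOperation.UpdateStatus();
}
SparkBatchJob job = createOperation.Value;
```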
Co-authored-by: Mariana Rios Flores <[email protected]>
Co-authored-by: Christopher Scott <[email protected]>
sdk/synapse/Azure.Analytics.Synapse.ManagedPrivateEndpoints/tests/samples/Sample1_HelloManangedPrivateEndpoint.cs
sdk/synapse/Azure.Analytics.Synapse.Spark/samples/Sample1_SubmitSparkJob.md (20 additions, 5 deletions)
@@ -7,12 +7,21 @@ This sample demonstrates basic operations with two core classes in this library:

To interact with Spark jobs running on Azure Synapse, you need to instantiate a `SparkBatchClient`. It requires an endpoint URL and a `TokenCredential`.

```C# Snippet:CreateSparkBatchClient
// Replace the strings below with the spark, endpoint, and file system information
```
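The collapsed `CreateSparkBatchClient` snippet amounts to constructing the client from an endpoint URL and a `TokenCredential`; a minimal sketch with placeholder values (not taken from this diff):

```C#
using System;
using Azure.Identity;
using Azure.Analytics.Synapse.Spark;

// Placeholder endpoint and Spark pool name; replace with your workspace's values.
string endpoint = "https://<my-workspace>.dev.azuresynapse.net";
string sparkPoolName = "<my-spark-pool>";

// Any TokenCredential works; DefaultAzureCredential is the common choice in samples.
SparkBatchClient client = new SparkBatchClient(new Uri(endpoint), sparkPoolName, new DefaultAzureCredential());
```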
-To submit a Spark job, first create a `SparkBatchJob`, passing in an instance of `SparkBatchJobOptions` describing the job's parameters. Calling `CreateSparkBatchJob` with that job will submit it to Synapse.
+To submit a Spark job, first create a `SparkBatchJob`, passing in an instance of `SparkBatchJobOptions` describing the job's parameters. Calling `StartCreateSparkBatchJob` with that job will submit it to Synapse.

```C# Snippet:SubmitSparkBatchJob
string name = $"batch-{Guid.NewGuid()}";
```
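For reference, a sketch of what the `SubmitSparkBatchJob` snippet builds up to with the renamed `StartCreateSparkBatchJob`. The file path, class name, and resource sizes are placeholders, and the `SparkBatchOperation` return type is an assumption about the new LRO surface:

```C#
using System;
using Azure.Identity;
using Azure.Analytics.Synapse.Spark;

// Placeholder client; see the client-creation sketch above.
SparkBatchClient client = new SparkBatchClient(
    new Uri("https://<my-workspace>.dev.azuresynapse.net"), "<my-spark-pool>", new DefaultAzureCredential());

string name = $"batch-{Guid.NewGuid()}";

// Placeholder ABFS path to the job's main definition file.
string file = "abfss://<file-system>@<account>.dfs.core.windows.net/<path>/wordcount.jar";

SparkBatchJobOptions options = new SparkBatchJobOptions(name, file)
{
    ClassName = "WordCount",   // placeholder main class for a .jar job
    DriverMemory = "28g",
    DriverCores = 4,
    ExecutorMemory = "28g",
    ExecutorCores = 4,
    ExecutorCount = 2
};

// Submitting returns the Livy-polling LRO added by this PR (assumed SparkBatchOperation).
// It can then be polled via HasCompleted/UpdateStatus, as in the sketch in the PR description above,
// or awaited with WaitForCompletionAsync as in the async sample below.
SparkBatchOperation createOperation = client.StartCreateSparkBatchJob(options);
```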
@@ -32,15 +41,21 @@ SparkBatchJobOptions request = new SparkBatchJobOptions(name, file)

This sample demonstrates basic asynchronous operations with two core classes in this library: `SparkBatchClient` and `SparkBatchJob`. `SparkBatchClient` is used to interact with Spark jobs running on Azure Synapse - each method call sends a request to the service's REST API. `SparkBatchJob` is an entity that represents a batched Spark job within Synapse. The sample walks through the basics of creating, running, and canceling job requests. To get started, you'll need a connection endpoint to Azure Synapse. See the [README](https://github.com/Azure/azure-sdk-for-net/blob/master/sdk/synapse/Azure.Analytics.Synapse.Spark/README.md) for links and instructions.

## Create Spark batch client

To interact with Spark jobs running on Azure Synapse, you need to instantiate a `SparkBatchClient`. It requires an endpoint URL and a `TokenCredential`.

```C# Snippet:CreateSparkBatchClientAsync
// Replace the strings below with the spark, endpoint, and file system information
```
To submit a Spark job, first create a `SparkBatchJob`, passing in an instance of `SparkBatchJobOptions` describing the job's parameters. Calling `StartCreateSparkBatchJobAsync` with that job will submit it to Synapse.
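For context, a sketch of the async path ending in `StartCreateSparkBatchJobAsync`. As above, the placeholder values and the `SparkBatchOperation`/`WaitForCompletionAsync` details are assumptions about the new LRO surface rather than text from this diff:

```C#
using System;
using Azure.Identity;
using Azure.Analytics.Synapse.Spark;

// Placeholder workspace values; replace with your Synapse endpoint and Spark pool.
SparkBatchClient client = new SparkBatchClient(
    new Uri("https://<my-workspace>.dev.azuresynapse.net"),
    "<my-spark-pool>",
    new DefaultAzureCredential());

SparkBatchJobOptions options = new SparkBatchJobOptions(
    $"batch-{Guid.NewGuid()}",
    "abfss://<file-system>@<account>.dfs.core.windows.net/<path>/wordcount.jar");

// StartCreateSparkBatchJobAsync submits the job and returns the Livy-polling LRO
// (assumed SparkBatchOperation); WaitForCompletionAsync polls until a terminal state.
SparkBatchOperation createOperation = await client.StartCreateSparkBatchJobAsync(options);
SparkBatchJob jobCreated = (await createOperation.WaitForCompletionAsync()).Value;
```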