#random

Divyansh Jain

06/16/2023, 12:32 PM
Hi All, I have a few .sql files in my repo. Is there any way to execute those SQL scripts on Databricks via an Azure DevOps CI/CD pipeline? Please help, I'm very new to CI/CD. Thanks!

Gerhard Brueckl

06/19/2023, 8:04 AM
you can use the Databricks REST API to run them
you can use the DatabricksPS PowerShell module and simply authenticate in the ADO pipeline: https://github.com/gbrueckl/Databricks.API.PowerShell#azure-devops-integration Then use the command execution API to run your SQL code:
- task: AzureCLI@2
  displayName: "DatabricksPS - Execute SQL"
  inputs:
    azureSubscription: "MyServiceConnection"
    addSpnToEnvironment: true
    scriptType: ps
    scriptLocation: inlineScript
    arguments: '$(DATABRICKS_URL) $(AZURE_RESOURCE_ID)'
    inlineScript: |
      # positional arguments arrive in $args in a PowerShell inline script
      Set-DatabricksEnvironment -ApiRootUrl $args[0] -AzureResourceID $args[1] -UsingAzureDevOpsServiceConnection
      $ctx = Get-DatabricksExecutionContext -ClusterID "1202-211320-brick1" -Language SQL
      $ctx | Start-DatabricksCommand -Command "SELECT * FROM myTable"
    azurePowerShellVersion: latestVersion

Divyansh Jain

06/20/2023, 6:30 AM
@Gerhard Brueckl Thanks for your reply. Could you please help with how I can pass multiple SQL scripts, instead of writing the SQL query directly in the YAML, using the DatabricksPS PowerShell module?

Gerhard Brueckl

06/20/2023, 6:43 AM
Simply use PowerShell to read the file content into a variable and then use the variable instead of the static script. If you have multiple files, just create a loop.
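A minimal sketch of that loop, assuming an execution context has already been set up as in the YAML task above and that the .sql files live in a hypothetical `sql/` folder in the repo:

```powershell
# assumes Set-DatabricksEnvironment has already been called (see the pipeline task above)
$ctx = Get-DatabricksExecutionContext -ClusterID "1202-211320-brick1" -Language SQL

# "sql" is a placeholder path - point it at wherever the .sql files live in your repo
Get-ChildItem -Path "sql" -Filter "*.sql" | ForEach-Object {
    # read the whole file into a single string instead of an array of lines
    $query = Get-Content -Path $_.FullName -Raw
    $ctx | Start-DatabricksCommand -Command $query
}
```

This runs each file as one command; if a file contains multiple statements, you may need to split it first, depending on how your scripts are written.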
👍 1