Using Databricks CLI command output in other Unix commands


I am using the Databricks CLI to configure clusters and set up libraries in an Azure Release pipeline, as part of an Azure CLI task with /bin/bash as the interpreter. At one point, I am trying to use the databricks fs ls command to list a jar file in a DBFS path that matches a specific file name pattern, and to store the name of that file in a Unix variable.

databricks fs ls dbfs:/fs/jars/*<pattern>*.jar --profile <profile_name>

This fails because databricks fs ls expects a directory as its argument, not a file pattern.

I tried other combinations like:

grep *<pattern>*.jar < <(databricks fs ls dbfs:/fs/jars/ --profile <profile_name>)

and

grep *<pattern>*.jar | cat < <(databricks fs ls dbfs:/fs/jars/ --profile <profile_name>)

but to no avail.

However, the command

cat < <(databricks fs ls dbfs:/fs/jars/ --profile <profile_name>)

works fine on its own.

The question is: how do I use Unix commands in conjunction with Databricks CLI commands?

CodePudding user response:

Just use a pipe (|) to forward the output of the Databricks CLI to other commands:

databricks fs ls dbfs:/fs/jars/ --profile <profile_name> | grep <your-pattern>
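
For example, to capture the matching file name in a shell variable (a minimal sketch; <pattern> and <profile_name> are placeholders, and the grep expression assumes your jar names contain the pattern followed by the .jar extension):

# Store the name of the matching jar in a variable
JAR_NAME=$(databricks fs ls dbfs:/fs/jars/ --profile <profile_name> | grep '<pattern>.*\.jar$')
echo "Matched jar: ${JAR_NAME}"

Note that grep matches against the file names printed by databricks fs ls, so write the pattern as a quoted regular expression rather than an unquoted shell glob like *<pattern>*.jar.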