Azure Databricks integrates well with Azure Data Factory, from calling notebooks to installing libraries. But when it comes to getting a value or a table back from Databricks, I found myself falling back on Spark-dataset-based lookups, which have their own limitations on the queries we can write to retrieve the required information. Recently, however, I started looking beyond lookups and found an easy way to pass data directly from notebooks to Data Factory.
Databricks allows us to pass messages to the caller of notebooks using the command:
dbutils.notebook.exit('Notebook Return Value')
On calling the notebook from Azure Data…
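As a sketch of the idea: the notebook serializes whatever it wants to return (a value, a row, a small lookup table) into a JSON string and passes it to `dbutils.notebook.exit`, so the caller in Data Factory can parse it from the activity output. The `result` dictionary below is a hypothetical payload for illustration; `dbutils` only exists inside a Databricks notebook, so it is shown in a comment.

```python
import json

# Hypothetical result computed earlier in the notebook.
result = {"status": "success", "row_count": 42}

# Serialize to a string, since the notebook exit value must be a string.
payload = json.dumps(result)

# Inside a Databricks notebook, this returns the payload to the caller:
# dbutils.notebook.exit(payload)
print(payload)
```

In Data Factory, the string surfaces on the Notebook activity's output, where a subsequent expression can parse it back into structured values.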
Databricks provides a great environment for ad-hoc and interactive work with data. However, setting up a local environment for development and testing can be quite a task. This post is intended to help you set up a working local environment for developing and testing Python libraries against Databricks remotely using databricks-connect.
The first thing we need for development is an IDE. In this post, I have chosen Visual Studio Code because of the wide set of options it provides.
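As a setup sketch, the classic databricks-connect workflow installs a client version matching the cluster's Databricks Runtime, then configures and verifies the connection. The version pin below is an example only; it must match your own cluster's runtime.

```shell
# Install databricks-connect; the version should match the cluster's
# Databricks Runtime (10.4 is an example, not a recommendation).
pip install "databricks-connect==10.4.*"

# Interactive prompts for workspace host, token, cluster ID, org ID, and port.
databricks-connect configure

# Verify that the local client can reach the cluster.
databricks-connect test
```

Once `databricks-connect test` passes, code run from Visual Studio Code executes its Spark jobs on the remote Databricks cluster.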
Tech Enthusiast, Software Engineer 2, Microsoft