Databricks is a company founded by the creators of Apache Spark. Databricks uses Apache Spark to provide a Unified Analytics Platform.
Why do we need Azure Databricks?
- To use Apache Spark on our own, we would need to provision the machines, install Spark and the necessary libraries, and maintain the scaling and availability of those machines.
- With Databricks, the entire environment can be provisioned with just a few clicks.
Three main components
- Databricks tools, services, and optimizations
- Distributed computation
- DBFS (Files)
An Azure Databricks workspace lets you create clusters, each consisting of multiple nodes. Each cluster has the Spark engine and other required components installed.
The cluster contains two types of nodes:
1. Worker / Slave nodes
These nodes are responsible for actually performing the underlying tasks.
2. Driver / Master node
(i) Entry point to the cluster and the PySpark application.
(ii) This node is responsible for distributing tasks to the worker nodes.
In addition, a cluster manager is responsible for managing the cluster's resources.
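To make the driver/worker split concrete, here is a plain-Python analogy (not Spark itself, just a minimal sketch using the standard-library `concurrent.futures` module): the "driver" splits the data into partitions, hands each partition to a "worker", and combines the partial results. In a real cluster the workers are separate machines rather than threads in one process.

```python
from concurrent.futures import ThreadPoolExecutor

def worker_task(partition):
    # Each "worker" sums the squares of its own partition independently.
    return sum(x * x for x in partition)

def driver(data, num_workers=2):
    # The "driver" splits the data into one partition per worker...
    partitions = [data[i::num_workers] for i in range(num_workers)]
    # ...distributes the tasks, then combines the partial results.
    with ThreadPoolExecutor(max_workers=num_workers) as pool:
        return sum(pool.map(worker_task, partitions))

if __name__ == "__main__":
    print(driver(list(range(10))))  # 0² + 1² + ... + 9² = 285
```

This mirrors the division of labor described above: the driver never computes the squares itself; it only coordinates the workers and aggregates their results.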
In Databricks, we have two types of clusters: interactive clusters and job clusters.
1. Interactive cluster
We can analyze data with the help of an interactive cluster. Multiple users can share the cluster and collaborate.
There are two types of interactive clusters:
1. Standard cluster
Intended for a single user. If shared, a fault caused by one user can impact the whole cluster, because resources are allocated to a single workload with no fault isolation.
2. High concurrency cluster
Intended for multiple users. Resources are shared across the users while fault isolation is maintained, so one user's failure does not affect the others. Table access control is also provided.
2. Job cluster
Responsible for running jobs. When a job has to run, Azure Databricks starts the cluster; once the job is completed, the cluster is terminated.
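As a rough sketch of how this works in practice, a job cluster is described inside the job definition itself. The payload below loosely follows the shape of the Databricks Jobs API; the job name, notebook path, node type, and Spark version are placeholder values, and exact field names may vary by API version:

```json
{
  "name": "nightly-etl-job",
  "new_cluster": {
    "spark_version": "7.3.x-scala2.12",
    "node_type_id": "Standard_DS3_v2",
    "num_workers": 2
  },
  "notebook_task": {
    "notebook_path": "/Shared/nightly_etl"
  }
}
```

Because the cluster is defined inside the job rather than created up front, Databricks can provision it when the job starts and terminate it when the job finishes, so you only pay for compute while the job is actually running.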
Databricks supports languages such as R, SQL, Python, Scala, and Java.
Benefits of Azure Databricks
- Quick setup for prototyping experiments.
- Includes a notebook interface similar to the Jupyter notebook.
- Intuitive and Integrated console.
- Apache Spark core committers work at Databricks, which is one of the reasons Databricks is so performant.
- Granular, integrated security.
Databricks is built on top of Apache Spark, so it has all the features of Apache Spark. The main difference is that Databricks is a fully managed, distributed compute platform. It also supports ML and provides a notebook that makes writing code easy.
Cloud developers mainly use the PySpark API to write code in the Databricks notebook.
So, this article was a basic introduction to Databricks. The next article, on PySpark and how to write code in a Databricks notebook using it, will be published soon.
“Thanks for the read🙏🙏. Do clap👏👏 and share what you have learned from this article”.
“Keep learning and keep sharing knowledge”