
Databricks Integration

Monitor Databricks Spark applications with the New Relic Spark integration, using a notebook script

What's included

Spark - Overview v2

Dashboard

Databricks init script creator notebook

A Databricks notebook that creates an init script to run during Databricks cluster initialization

Doc

Databricks is an orchestration platform for Apache Spark. Instantly monitor Databricks Spark applications with our New Relic Spark integration quickstart. The integration provides a script, run in a notebook, that generates an installation (init) script; attach that init script to a cluster to send Spark metrics to New Relic as Insights events. Easily track the health of your Databricks clusters, fine-tune your Spark jobs for peak performance, and troubleshoot problems with this quickstart.
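To illustrate the reporting side, here is a minimal sketch of shaping a Spark job metric as a New Relic Insights custom event. The event type and attribute names (`SparkJobSample`, `jobId`, and so on) are illustrative assumptions, not the integration's actual schema; the real schema is determined by the generated init script.

```python
import json

def make_spark_event(job_id, status, duration_ms):
    """Build one custom event record.

    "SparkJobSample" and the attribute names are hypothetical placeholders;
    the actual integration defines its own event types and attributes.
    """
    return {
        "eventType": "SparkJobSample",
        "jobId": job_id,
        "status": status,          # e.g. "SUCCEEDED", "FAILED", "PENDING"
        "durationMs": duration_ms,
    }

def to_insights_payload(events):
    """Serialize a batch of events as the JSON array the Insights insert
    endpoint expects. The batch would be POSTed to
    https://insights-collector.newrelic.com/v1/accounts/<ACCOUNT_ID>/events
    with your insert key in the X-Insert-Key header."""
    return json.dumps(events)

payload = to_insights_payload([make_spark_event("job-42", "SUCCEEDED", 1250)])
```

Keeping payload construction separate from the HTTP call, as above, makes the event shape easy to unit-test before wiring it into a cluster.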

A Databricks cluster's driver node runs each job in scheduled stages. Individual stages are broken down into tasks and distributed across executor nodes. Our New Relic Spark integration collects detailed job and stage metrics so you can get granular insight into job performance at a glance. For example, break down the job metric by status (succeeded, pending, or failed) to see in real time whether a high number of jobs are failing, which could indicate a code error or a memory issue at the executor level. Real-time job counts can also inform future cluster-provisioning decisions.
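A breakdown like the one described above could be charted with an NRQL query along these lines. The event name `SparkJob` and the `status` attribute are assumptions for illustration; substitute the event types and attributes your init script actually reports.

```sql
// Hypothetical sketch: "SparkJob" and "status" are assumed names,
// not the integration's confirmed schema.
SELECT count(*) FROM SparkJob FACET status TIMESERIES
```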

Copyright © 2021 New Relic Inc.