User Interface

The Operational Insight dashboards are built using Kibana, which allows authenticated users to view and drill into system metrics and application log data. Kibana provides a rich dashboard development framework that lets users build dashboards for visual exploration and real-time analysis of data stored in Elasticsearch. The metrics and log data stored in Elasticsearch can be queried and visualized in Kibana with no coding effort. Valuable data such as user details and processing time is extracted into separate fields, which can then be used as filters in the dashboards you create.
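
Those extracted fields can also be queried directly from Elasticsearch, outside of Kibana. A minimal Python sketch, assuming Elasticsearch listens on localhost:9200 and the logs live in an index matching logs-*; the user and processingTime field names are illustrative assumptions, not the actual Operational Insight schema:

```python
import requests

# Hypothetical filter on extracted fields; swap in the field names and
# index pattern used by your deployment.
query = {
    "query": {
        "bool": {
            "filter": [
                {"term": {"user": "jsmith"}},                 # extracted user field
                {"range": {"processingTime": {"gte": 500}}},  # extracted timing field
            ]
        }
    },
    "size": 10,
}

resp = requests.post("http://localhost:9200/logs-*/_search", json=query, timeout=10)
for hit in resp.json()["hits"]["hits"]:
    print(hit["_source"])
```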

Dashboard

Dashboards are useful when consumers want a quick overview of the data and need to correlate various visualizations and logs. The Dashboard page allows users to create, modify, and view their own custom dashboards. With a dashboard, users can combine multiple visualizations onto a single page and then filter them by entering a search query or by clicking elements in a visualization to apply filters.

End User Dashboards

Users can filter results by entering a search query, changing the time filter, or clicking elements within a visualization.

For example, if you click on a color segment in the histogram, the dashboard will allow you to filter on the significant term that the segment represents.

Be sure to click the Apply Now button to filter the results and refresh the visualizations. Filters can be applied and removed as needed.

The search and time filters work just as they do on the Discover page, except that they apply only to the data subsets presented in the dashboard.
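
Under the hood, the search bar and the time filter combine into a single bool query against Elasticsearch. A rough sketch of the equivalent request, assuming an @timestamp field and a logs-* index pattern (both assumptions):

```python
import requests

# What a dashboard search plus time filter roughly amount to in the
# Elasticsearch query DSL.
query = {
    "query": {
        "bool": {
            "must": [
                # equivalent of typing `level:ERROR` into the search bar
                {"query_string": {"query": "level:ERROR"}}
            ],
            "filter": [
                # equivalent of the dashboard's time filter
                {"range": {"@timestamp": {"gte": "now-15m", "lte": "now"}}}
            ],
        }
    }
}

resp = requests.post("http://localhost:9200/logs-*/_search", json=query, timeout=10)
print(resp.json()["hits"]["total"])
```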

Navigation

When you open a dashboard, by default you will see the last 15 minutes of logs sorted by time. You can modify the time window by clicking the clock icon on the top panel. This gives you a number of options and allows you to quickly look at data for a specific time span.

Select a start and end time, then click the GO button. The log data is displayed with a bar chart at the top and a table of log events at the bottom. The histogram shows the event count for the time range at an automatically computed interval. You can change this by clicking the interval link and selecting a different time interval from the drop-down.
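
The histogram corresponds to a date_histogram aggregation over the selected range. A hedged Python equivalent, where the 1m interval stands in for a value picked from the drop-down, and the index pattern and @timestamp field name are assumptions:

```python
import requests

# Bucket event counts over a time range, as the dashboard histogram does.
query = {
    "size": 0,  # we only want the aggregation, not the documents
    "query": {
        "range": {
            "@timestamp": {
                "gte": "2021-01-01T00:00:00",
                "lte": "2021-01-01T06:00:00",
            }
        }
    },
    "aggs": {
        "events_over_time": {
            "date_histogram": {"field": "@timestamp", "fixed_interval": "1m"}
        }
    },
}

resp = requests.post("http://localhost:9200/logs-*/_search", json=query, timeout=10)
for bucket in resp.json()["aggregations"]["events_over_time"]["buckets"]:
    print(bucket["key_as_string"], bucket["doc_count"])
```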

You can use the mouse to zoom in further, which loads the events for that narrower time range. Right-click and select Back to go back. You can also click on a bar to drill into the events for that specific time.

Below the graph, the first 500 log events are shown. You can refine the search criteria or use a different time range to see additional results. Click the caret graphic (>) to expand a message and view the event details. This is useful for multi-line events, such as stack traces, where the log message is split across multiple lines.
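
For reference, the table's contents correspond to a size-limited, time-sorted search. A minimal sketch, again assuming @timestamp and message field names and a logs-* index pattern:

```python
import requests

# Fetch the same first 500 events the table shows, newest first.
query = {
    "size": 500,
    "sort": [{"@timestamp": {"order": "desc"}}],
    "query": {"range": {"@timestamp": {"gte": "now-15m"}}},
}

resp = requests.post("http://localhost:9200/logs-*/_search", json=query, timeout=10)
for hit in resp.json()["hits"]["hits"]:
    src = hit["_source"]
    # Multi-line events (e.g. stack traces) arrive as a single message string;
    # printing it whole is the command-line analogue of expanding the caret (>).
    print(src.get("@timestamp"), src.get("message"))
```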