Ingest Data of All Shapes, Sizes, and Sources
Data is often scattered or siloed across many systems in many formats. Logstash supports a variety of inputs that pull in events from a multitude of common sources, all at the same time. Easily ingest from your logs, metrics, web applications, data stores, and various AWS services, all in continuous, streaming fashion.
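As a minimal sketch of what such an input stage can look like, the config below tails a local log file and listens for Beats shippers (the file path and port are illustrative, not prescribed):

```conf
input {
  # Tail an application log file (path is illustrative)
  file {
    path => "/var/log/myapp/*.log"
    start_position => "beginning"
  }
  # Accept events from Filebeat/Metricbeat shippers on the default Beats port
  beats {
    port => 5044
  }
}
```

Additional input plugins (e.g. for TCP/UDP, Kafka, or S3) can be listed in the same `input` block and run concurrently.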
Parse & Transform Your Data On the Fly
As data travels from source to store, Logstash filters parse each event, identify named fields to build structure, and transform the data to converge on a common format for easier, accelerated analysis and business value.
Logstash dynamically transforms and prepares your data regardless of format or complexity:
Derive structure from unstructured data with grok
Decipher geo coordinates from IP addresses
Anonymize PII, or exclude sensitive fields completely
Ease overall processing independent of the data source, format, or schema.
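The three transformations above can be sketched as a single filter block. This is an illustrative example, not a drop-in config: the field names (`clientip`, `user_id`, `password`) are assumptions about the incoming events.

```conf
filter {
  # Derive structure from unstructured data with grok
  grok {
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
  # Decipher geo coordinates from the client IP address
  geoip {
    source => "clientip"
  }
  # Anonymize a PII field by replacing it with a hash
  fingerprint {
    source => "user_id"
    method => "SHA256"
  }
  # Exclude sensitive fields completely
  mutate {
    remove_field => [ "password" ]
  }
}
```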
Choose Your Stash, Transport Your Data
While Elasticsearch is our go-to output that opens up a world of search and analytics possibilities, it’s not the only one available.
Logstash has a variety of outputs that let you route data where you want, giving you the flexibility to unlock a slew of downstream use cases.
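A sketch of an output stage that sends events to Elasticsearch while conditionally routing a subset elsewhere (the `level` field and the archive path are hypothetical):

```conf
output {
  # Primary stash: an Elasticsearch cluster
  elasticsearch {
    hosts => ["https://localhost:9200"]
  }
  # Conditionally route error events to a downstream archive as well
  if [level] == "ERROR" {
    file {
      path => "/var/log/archive/errors-%{+YYYY.MM.dd}.log"
    }
  }
}
```

Multiple outputs in one block each receive every event (subject to conditionals), which is how a single pipeline can feed several downstream systems at once.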
Create and Configure Your Pipeline, Your Way
Logstash has a pluggable framework featuring over 200 plugins. Mix, match, and orchestrate different inputs, filters, and outputs to work in pipeline harmony.
Ingesting from a custom application? Don’t see a plugin you need? Logstash plugins are easy to build. We’ve got a fantastic API for plugin development and a plugin generator to help you start and share your creations.
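For example, the generator can scaffold a new plugin from the command line; the plugin name and path below are placeholders:

```
bin/logstash-plugin generate --type filter --name my_custom_filter --path ~/logstash-plugins
```

This produces a skeleton Ruby gem with the plugin class, specs, and gemspec ready to fill in and publish.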
SECURITY & MONITORING
Secure It and Monitor It
Whether you’re running tens or thousands of Logstash instances, we’ve made it possible for you to secure and keep a pulse on the status of your ingest pipelines from end to end. Incoming data from Beats and other inputs can be encrypted over the wire, and there's full integration with secured Elasticsearch clusters. Logstash also has a monitoring API that provides visibility into overall pipeline health and performance.
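For instance, a running Logstash instance exposes its monitoring API on port 9600 by default, so pipeline statistics can be pulled with a plain HTTP request (host and port assume a local default installation):

```
curl -XGET 'http://localhost:9600/_node/stats/pipelines?pretty'
```

The response reports per-pipeline event counts and per-plugin timings, which is useful for spotting slow filters or back pressure.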
The workshop covers real-world data sets, and instructors work with participants to ingest, search, and visualize them. Topics include an Elasticsearch overview, Logstash configuration, creating dashboards in Kibana, processing logs, recommended architectures for designing a system that scales, choosing hardware, and managing the life cycle of your logs.
No prior knowledge of the Elastic Stack is required
Comfort using the terminal or command line is recommended
Elastic Stack Overview
Logs and Problems
Introduction to Logstash or Why Should I Bother?
Getting Started with Logstash
Shipping Events Without the Logstash Agent
Structured Application Logging
Elasticsearch
What and Why
Terminology: Documents, Index, Shards, Node, Cluster
Installation and Configuration
Working with Data
Kibana
What and Why
Time Picker, Search, and Filters
Kibana Discover, Visualization, and Dashboard Interfaces
Build and Configure Your First Data Pipeline with ELK
Collect, Parse, and Transform Data with Logstash
Handling Back Pressure
Hardware Best Practices
Debugging and Monitoring