Filebeat: A Lightweight Log Collection Tool

The Beats platform is a collection of single-purpose data collectors. These collectors can be installed as lightweight agents on hundreds or thousands of machines to ship data to Logstash or Elasticsearch.

1. Architecture Diagram

This experiment builds on the environment set up in the previous articles in this series; complete those first.

2. Install Filebeat

  • Download and install Filebeat

    yum install ./filebeat-6.0.1-x86_64.rpm
  • Edit the Filebeat configuration file

    vim /etc/filebeat/filebeat.yml                      # Main configuration file
    filebeat.prospectors:
    - type: log                                         # Input type
      paths:
        - /var/log/httpd/access.log*                    # Where to read data from
    # Enable only one of the two outputs below: elasticsearch OR logstash
    output.elasticsearch:                               # Send data directly to Elasticsearch (comment out if using Logstash)
      hosts: ["localhost:9200"]
    output.logstash:                                    # Send data to Logstash, which must be configured with a beats input
      hosts: [""]                                       # Fill in the Logstash server address here
  • Start Filebeat

    systemctl start filebeat
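To confirm that Filebeat came up cleanly, you can check the service and validate the configuration. This is a sketch assuming Filebeat 6.x, where the `test` subcommand is available; the paths match the install above.

```shell
systemctl status filebeat                            # service state and recent log lines
filebeat test config -c /etc/filebeat/filebeat.yml   # syntax-check the configuration
filebeat test output -c /etc/filebeat/filebeat.yml   # verify the connection to the configured output
```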

3. Configure Logstash

  • Configure Logstash to receive the data collected by Filebeat

    vim /etc/logstash/conf.d/test.conf
    input {
        beats {
            port => 5044                                # Listen on 5044 for incoming Filebeat data
        }
    }
    filter {
        grok {
            match => {
                "message" => "%{COMBINEDAPACHELOG}"     # Parse HTTP (Apache combined) access-log lines
            }
            remove_field => "message"                   # Drop the raw line; keep only the parsed fields
        }
    }
    output {
        elasticsearch {
            hosts => ["localhost:9200"]                 # Cluster node IPs; list all nodes for a multi-node cluster
            index => "logstash-%{+YYYY.MM.dd}"
            action => "index"
            document_type => "apache_logs"
        }
    }
  • Start Logstash

     /usr/share/logstash/bin/logstash -f /etc/logstash/conf.d/test.conf
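Before starting Logstash with a new pipeline, the configuration file can be syntax-checked with the standard `--config.test_and_exit` flag, which parses the config and exits without processing data:

```shell
# Exits with "Configuration OK" on success; reports the offending line on failure
/usr/share/logstash/bin/logstash -f /etc/logstash/conf.d/test.conf --config.test_and_exit
```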

4. Simulate Log Access

Simulate client access with the curl command to generate access-log entries.
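A minimal sketch of the simulation: the loop below fires ten requests at the web server so that `/var/log/httpd/access.log` receives entries for Filebeat to pick up. The server address is an assumption; replace it with the httpd host configured in the earlier articles.

```shell
# Generate ten access-log entries on the web server (address is a placeholder)
for i in $(seq 1 10); do
    curl -s -o /dev/null "http://localhost/" || true   # ignore failures if the server is unreachable
done
echo "generated 10 requests"
```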


5. Verification

Clear the old data from the previous experiment (type DELETE in the confirmation dialog when removing an index), and you can see that the data collected by Filebeat is filtered by Logstash and sent on to Elasticsearch.
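Beyond the Head plugin, a quick way to verify the pipeline end to end is to query Elasticsearch directly. This assumes a node is reachable on localhost:9200, matching the configuration above; adjust the address for your cluster.

```shell
curl -s 'http://localhost:9200/_cat/indices?v'                    # the logstash-YYYY.MM.dd index should be listed
curl -s 'http://localhost:9200/logstash-*/_search?size=1&pretty'  # inspect one document with its grok-parsed fields
```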


With the gradual evolution of the ELK log system, it is now possible to collect each node's logs with Filebeat, filter and prune the data with Logstash, and finally index and tokenize it in Elasticsearch to build a search engine. The Elasticsearch Head plugin lets you view the data in a browser, but Head supports only simple browsing and is not suited to data analysis and presentation; Kibana is needed for that. Kibana will be explained in the next article, with a schematic diagram there.

Tags: Linux ElasticSearch curl RPM vim

Posted on Tue, 09 Jun 2020 12:59:38 -0400 by sycoj0ker