Filebeat: A Lightweight Log Collection Tool

The Beats platform is a collection of single-purpose data collectors. These collectors can be installed as lightweight agents on hundreds or thousands of machines to ship data to Logstash or Elasticsearch.

1. Architecture Diagram

This experiment builds on the environment set up in the previous articles in this series.

2. Install Filebeat

  • Download and install Filebeat

    wget https://artifacts.elastic.co/downloads/beats/filebeat/filebeat-6.0.1-x86_64.rpm
    yum install ./filebeat-6.0.1-x86_64.rpm
  • Modify the Filebeat configuration file

    vim /etc/filebeat/filebeat.yml                      # main configuration file
    - type: log                                         # input type
      paths:
        - /var/log/httpd/access.log*                    # files to read data from
    # Enable either the elasticsearch output or the logstash output, not both
    output.elasticsearch:                               # send data directly to Elasticsearch (alternative to the logstash output below)
      hosts: ["localhost:9200"]
    output.logstash:                                    # send data to Logstash, which must be configured with a beats input
      hosts: ["172.18.68.14:5044"]
  • Start Filebeat (a quick sanity check is sketched after this list)

    systemctl start filebeat
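
Before relying on the pipeline, it is worth confirming that Filebeat accepted the configuration. A minimal sketch, assuming Filebeat 6.x, where the test subcommand is available:

    filebeat test config -c /etc/filebeat/filebeat.yml   # validate the configuration file
    filebeat test output -c /etc/filebeat/filebeat.yml   # check connectivity to the configured output
    systemctl status filebeat                            # confirm the service is running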

3. Configure Logstash

  • Configure Logstash to receive the data collected by Filebeat

    vim /etc/logstash/conf.d/test.conf
    input {
        beats {
            port => 5044                              # listen on 5044 for data sent by Filebeat
        }
    }
    filter {
        grok {
            match => {
                "message" => "%{COMBINEDAPACHELOG}"   # parse Apache combined-format access logs
            }
            remove_field => "message"                 # drop the raw message; keep only the parsed fields
        }
    }
    output {
        elasticsearch {
            hosts => ["http://172.18.68.11:9200","http://172.18.68.12:9200","http://172.18.68.13:9200"]   # cluster nodes
            index => "logstash-%{+YYYY.MM.dd}"
            action => "index"
            document_type => "apache_logs"
        }
    }
  • Start Logstash (a configuration check is sketched after this list)

     /usr/share/logstash/bin/logstash -f /etc/logstash/conf.d/test.conf
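
Logstash can also validate a pipeline file without starting it, which catches syntax errors early. A minimal sketch, assuming the same paths as above:

    /usr/share/logstash/bin/logstash -f /etc/logstash/conf.d/test.conf --config.test_and_exit   # parse the config and exit

If the file parses cleanly, Logstash reports that the configuration is valid before exiting.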

4. Simulate Log Access

Simulate client access with the curl command to generate access-log entries; a quick check of the resulting log follows the commands below.

curl 127.0.0.1
curl 172.18.68.51
curl 172.18.68.52
curl 172.18.68.53
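
To confirm the requests were logged, tail the Apache access log on the web node. A minimal sketch, assuming the log path configured in filebeat.yml above:

tail -n 4 /var/log/httpd/access.log   # the newest entries should match the curl requests above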

5. Verify the Data

Clear the old data from the previous experiment (type DELETE in the dialog box when deleting), and you can see that the data collected by Filebeat is filtered by Logstash and sent to Elasticsearch; the sketch below shows the same check from the command line.
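
A minimal query sketch, assuming one of the cluster nodes is reachable and the logstash-%{+YYYY.MM.dd} index pattern configured above (verb is one of the fields produced by the COMBINEDAPACHELOG grok pattern):

curl 'http://172.18.68.11:9200/_cat/indices?v'                          # a logstash-YYYY.MM.dd index should be listed
curl 'http://172.18.68.11:9200/logstash-*/_search?q=verb:GET&pretty'    # sample the parsed Apache documents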

Extension

As the ELK log stack has gradually evolved, it is now possible to collect each node's logs with Filebeat, filter and prune the data with Logstash, and finally index and analyze it in Elasticsearch to build a search engine. The Elasticsearch Head plugin lets you view this data in a browser, but Head offers only simple views and is not well suited to data analysis and visualization; Kibana is needed for that. Kibana will be covered in the next article, with a schematic diagram here.
