Collecting logs with Filebeat + Kafka + Logstash into Elasticsearch, displayed with Kibana


Data flow

filebeat    ==>>    kafka    <<==    logstash    ==>>    elasticsearch    <==    kibana

 

  1. Filebeat configuration
filebeat.inputs:
- type: log
  enabled: true
  paths:
    - /var/log/messages
  # custom field referenced by the Kafka output below to pick the topic
  fields:
    log_topic: test_kafka
filebeat.config.modules:
  path: ${path.config}/modules.d/*.yml
  reload.enabled: false
setup.template.settings:
  index.number_of_shards: 3
# send events to Kafka; the topic comes from the log_topic field defined above
output.kafka:
  enabled: true
  hosts: ["log1:9092"]
  version: "2.0.0"
  topic: '%{[fields.log_topic]}'
  partition.round_robin:
    reachable_only: true
  worker: 1
  required_acks: 1
  compression: gzip
  compression_level: 4
  max_message_bytes: 10000000
# drop Filebeat metadata fields that are not needed downstream
processors:
  - drop_fields:
      fields:
        - beat
        - host
        - input
        - source
        - offset
        - prospector

Start the Filebeat service.
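For example, assuming Filebeat was installed from the official package and is managed by systemd (the unit name and config path are assumptions, adjust to your install):

systemctl start filebeat

# or, while testing, run the binary in the foreground to watch its log output:
# ./filebeat -e -c filebeat.yml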

 

  2. Logstash configuration
input {
  # consume the topic that Filebeat writes to; Filebeat events are JSON-encoded
  kafka {
    bootstrap_servers => "log1:9092"
    topics => ["test_kafka"]
    codec => "json"
  }
}

output {
  # write each day's events to a dated Elasticsearch index
  elasticsearch {
    hosts => "log1:9200"
    index => "test_kafka-%{+YYYY.MM.dd}"
  }
}

Save the pipeline above as es.conf and start the service:

nohup ./logstash -f es.conf &
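To confirm that events are actually flowing from Filebeat into Kafka, you can read the topic with the console consumer that ships with Kafka (run from the Kafka installation directory; the script path is an assumption):

bin/kafka-console-consumer.sh --bootstrap-server log1:9092 --topic test_kafka --from-beginning

Each line should be a JSON document containing the message field read from /var/log/messages.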

 

  3. Verify with Kibana

You can generate a test log entry by hand:

echo "manual echo test" >> /var/log/messages
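If the whole pipeline is working, a new daily index should appear in Elasticsearch. A quick check before opening Kibana (using the host from the Logstash config above):

curl 'http://log1:9200/_cat/indices?v' | grep test_kafka

In Kibana, create an index pattern matching test_kafka-* and the test entry echoed above should show up in Discover.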

 
