
How do I set up a simple Filebeat-to-Elasticsearch cluster?

  • Johnny · 6 years ago

    This is my first attempt at setting up a cluster in which Filebeat ships logs to Elasticsearch so that I can view them in Kibana. All I want is to see the log lines I write to /tmp/aaa.log show up in Kibana. I'm a bit lost in all the configuration. Given the config files below, can anyone tell me what I'm doing wrong?

    Here is my docker-compose.yml:

    ---
    version: '3.6'
    services:  
      elasticsearch:
        image: docker.elastic.co/elasticsearch/elasticsearch:${TAG}
        container_name: elasticsearch
        ports: ['9200:9200']
        networks: ['stack']
        environment:
          - xpack.security.enabled=false
        volumes:
          - 'es_data:/usr/share/elasticsearch/data'
    
      kibana:
        image: docker.elastic.co/kibana/kibana:${TAG}
        container_name: kibana
        ports: ['5601:5601']
        networks: ['stack']
        depends_on: ['elasticsearch']
        environment:
          - xpack.security.enabled=false
    
      logstash:
        image: docker.elastic.co/logstash/logstash:${TAG}
        container_name: logstash
        networks: ['stack']
        depends_on: ['elasticsearch']
        environment:
          - xpack.security.enabled=false
    
      filebeat:
        image: docker.elastic.co/beats/filebeat:${TAG}
        container_name: filebeat
        volumes:
          - /tmp/filebeat.yml:/usr/share/filebeat/filebeat.yml
        networks: ['stack']
        depends_on: ['elasticsearch', 'kibana']
    
    networks: {stack: {}}
    

    And here is filebeat.yml:

    filebeat.prospectors:
    - input_type: log
      paths:
        - /tmp/aaa.log
    
    output.elasticsearch:
      hosts: ['elasticsearch:9200']
    

    I bring everything up with TAG=5.6.13 docker-compose up (I have to use Elasticsearch version 5).
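
    As a side note for readers, docker-compose also reads variables from an .env file placed next to docker-compose.yml, so the tag can be pinned there instead of being set on every command. A minimal sketch (the .env file name is the Compose default, not something from the question; the value is the tag used above):

        # .env — docker-compose substitutes ${TAG} in docker-compose.yml from this file
        TAG=5.6.13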

    Here are the Filebeat logs (note the repeated "No non-zero metrics": Filebeat isn't harvesting any events):

    2018/11/27 16:20:57.165350 beat.go:297: INFO Home path: [/usr/share/filebeat] Config path: [/usr/share/filebeat] Data path: [/usr/share/filebeat/data] Logs path: [/usr/share/filebeat/logs]
    2018/11/27 16:20:57.165389 beat.go:192: INFO Setup Beat: filebeat; Version: 5.6.13
    2018/11/27 16:20:57.165502 output.go:263: INFO Loading template enabled. Reading template file: /usr/share/filebeat/filebeat.template.json
    2018/11/27 16:20:57.166247 output.go:274: INFO Loading template enabled for Elasticsearch 2.x. Reading template file: /usr/share/filebeat/filebeat.template-es2x.json
    2018/11/27 16:20:57.167063 output.go:286: INFO Loading template enabled for Elasticsearch 6.x. Reading template file: /usr/share/filebeat/filebeat.template-es6x.json
    2018/11/27 16:20:57.167554 metrics.go:23: INFO Metrics logging every 30s
    2018/11/27 16:20:57.167888 client.go:128: INFO Elasticsearch url: http://elasticsearch:9200
    2018/11/27 16:20:57.167909 outputs.go:108: INFO Activated elasticsearch as output plugin.
    2018/11/27 16:20:57.168015 publish.go:300: INFO Publisher name: 34df7198d027
    2018/11/27 16:20:57.168185 async.go:63: INFO Flush Interval set to: 1s
    2018/11/27 16:20:57.168194 async.go:64: INFO Max Bulk Size set to: 50
    2018/11/27 16:20:57.168512 beat.go:233: INFO filebeat start running.
    2018/11/27 16:20:57.168546 registrar.go:68: INFO No registry file found under: /usr/share/filebeat/data/registry. Creating a new registry file.
    2018/11/27 16:20:57.174446 registrar.go:106: INFO Loading registrar data from /usr/share/filebeat/data/registry
    2018/11/27 16:20:57.174491 registrar.go:123: INFO States Loaded from registrar: 0
    2018/11/27 16:20:57.174515 crawler.go:38: INFO Loading Prospectors: 1
    2018/11/27 16:20:57.174633 prospector_log.go:65: INFO Prospector with previous states loaded: 0
    2018/11/27 16:20:57.174715 prospector.go:124: INFO Starting prospector of type: log; id: 16715230261889747 
    2018/11/27 16:20:57.174726 crawler.go:58: INFO Loading and starting Prospectors completed. Enabled prospectors: 1
    2018/11/27 16:20:57.174735 registrar.go:236: INFO Starting Registrar
    2018/11/27 16:20:57.174754 sync.go:41: INFO Start sending events to output
    2018/11/27 16:20:57.174788 spooler.go:63: INFO Starting spooler: spool_size: 2048; idle_timeout: 5s
    2018/11/27 16:21:27.168018 metrics.go:39: INFO Non-zero metrics in the last 30s: registrar.writes=1
    2018/11/27 16:21:57.167828 metrics.go:34: INFO No non-zero metrics in the last 30s
    2018/11/27 16:22:27.167772 metrics.go:34: INFO No non-zero metrics in the last 30s
    2018/11/27 16:22:57.167974 metrics.go:34: INFO No non-zero metrics in the last 30s
    2018/11/27 16:23:27.167752 metrics.go:34: INFO No non-zero metrics in the last 30s
    2018/11/27 16:23:57.167944 metrics.go:34: INFO No non-zero metrics in the last 30s
    2018/11/27 16:24:27.167943 metrics.go:34: INFO No non-zero metrics in the last 30s
    2018/11/27 16:24:32.039122 filebeat.go:267: INFO Stopping filebeat
    2018/11/27 16:24:32.039158 crawler.go:90: INFO Stopping Crawler
    2018/11/27 16:24:32.039166 crawler.go:100: INFO Stopping 1 prospectors
    2018/11/27 16:24:32.039187 prospector.go:180: INFO Prospector ticker stopped
    2018/11/27 16:24:32.039187 prospector.go:137: INFO Prospector channel stopped because beat is stopping.
    2018/11/27 16:24:32.039198 prospector.go:232: INFO Stopping Prospector: 16715230261889747
    2018/11/27 16:24:32.039215 crawler.go:112: INFO Crawler stopped
    2018/11/27 16:24:32.039223 spooler.go:101: INFO Stopping spooler
    2018/11/27 16:24:32.039249 registrar.go:291: INFO Stopping Registrar
    2018/11/27 16:24:32.039264 registrar.go:248: INFO Ending Registrar
    2018/11/27 16:24:32.041518 metrics.go:51: INFO Total non-zero values:  registrar.writes=2
    2018/11/27 16:24:32.041533 metrics.go:52: INFO Uptime: 3m34.878904973s
    2018/11/27 16:24:32.041538 beat.go:237: INFO filebeat stopped.
    2018/11/28 08:43:17.481376 beat.go:297: INFO Home path: [/usr/share/filebeat] Config path: [/usr/share/filebeat] Data path: [/usr/share/filebeat/data] Logs path: [/usr/share/filebeat/logs]
    2018/11/28 08:43:17.481411 beat.go:192: INFO Setup Beat: filebeat; Version: 5.6.13
    2018/11/28 08:43:17.481500 output.go:263: INFO Loading template enabled. Reading template file: /usr/share/filebeat/filebeat.template.json
    2018/11/28 08:43:17.482638 output.go:274: INFO Loading template enabled for Elasticsearch 2.x. Reading template file: /usr/share/filebeat/filebeat.template-es2x.json
    2018/11/28 08:43:17.483675 metrics.go:23: INFO Metrics logging every 30s
    2018/11/28 08:43:17.483780 output.go:286: INFO Loading template enabled for Elasticsearch 6.x. Reading template file: /usr/share/filebeat/filebeat.template-es6x.json
    2018/11/28 08:43:17.484701 client.go:128: INFO Elasticsearch url: http://elasticsearch:9200
    2018/11/28 08:43:17.484745 outputs.go:108: INFO Activated elasticsearch as output plugin.
    2018/11/28 08:43:17.484844 publish.go:300: INFO Publisher name: 34df7198d027
    2018/11/28 08:43:17.484975 async.go:63: INFO Flush Interval set to: 1s
    2018/11/28 08:43:17.484982 async.go:64: INFO Max Bulk Size set to: 50
    2018/11/28 08:43:17.485563 beat.go:233: INFO filebeat start running.
    2018/11/28 08:43:17.485607 registrar.go:85: INFO Registry file set to: /usr/share/filebeat/data/registry
    2018/11/28 08:43:17.485630 registrar.go:106: INFO Loading registrar data from /usr/share/filebeat/data/registry
    2018/11/28 08:43:17.485656 registrar.go:123: INFO States Loaded from registrar: 0
    2018/11/28 08:43:17.485688 crawler.go:38: INFO Loading Prospectors: 1
    2018/11/28 08:43:17.485758 prospector_log.go:65: INFO Prospector with previous states loaded: 0
    2018/11/28 08:43:17.485840 prospector.go:124: INFO Starting prospector of type: log; id: 16715230261889747 
    2018/11/28 08:43:17.485848 crawler.go:58: INFO Loading and starting Prospectors completed. Enabled prospectors: 1
    2018/11/28 08:43:17.485881 sync.go:41: INFO Start sending events to output
    2018/11/28 08:43:17.485898 registrar.go:236: INFO Starting Registrar
    2018/11/28 08:43:17.485945 spooler.go:63: INFO Starting spooler: spool_size: 2048; idle_timeout: 5s
    2018/11/28 08:43:47.483962 metrics.go:34: INFO No non-zero metrics in the last 30s
    2018/11/28 08:44:17.484051 metrics.go:34: INFO No non-zero metrics in the last 30s
    
    1 Answer  (last activity 6 years ago)

  • Johnny · 6 years ago

    My mistake. I stupidly forgot to map the log file into the Filebeat container in docker-compose.yml.
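
    For completeness, a minimal sketch of what the corrected filebeat service could look like, assuming the log really is written to /tmp/aaa.log on the host (everything else is copied from the question's docker-compose.yml):

      filebeat:
        image: docker.elastic.co/beats/filebeat:${TAG}
        container_name: filebeat
        volumes:
          - /tmp/filebeat.yml:/usr/share/filebeat/filebeat.yml
          # The missing piece: without this mount, /tmp/aaa.log does not
          # exist inside the container, so Filebeat has nothing to harvest.
          - /tmp/aaa.log:/tmp/aaa.log
        networks: ['stack']
        depends_on: ['elasticsearch', 'kibana']

    After recreating the container (docker-compose up -d filebeat), lines appended to /tmp/aaa.log should be shipped to Elasticsearch and become visible in Kibana under the default filebeat-* index pattern.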