Friday, September 2, 2016

howto: get router syslog (Huawei / Cisco / Juniper) via Fluentd into Elasticsearch & Kibana

objective:
- visualisation of syslog in Kibana, per host... here is the final example:


what i have:
- let's say I have Kibana 4.4.x, Fluentd and its Elasticsearch output plugin installed





if you try to build a visualisation with the default setup, you will notice that the host & message fields get split into tokens (the exact algorithm depends on the Elasticsearch analyzer)

the problem with Kibana & Elasticsearch is that, by default, all fields inserted by the Elasticsearch plugin are defined as strings, i.e. automatically analysed. This makes any per-host visualisation impossible...

the solution is to have the data inserted by Fluentd mapped not as plain strings, but as multi-fields (an analysed string plus a not_analyzed raw sub-field)...
in order to do that, it is necessary to create a special template in Elasticsearch before you start sending data with Fluentd! You can't change the mapping of an existing index later, so make the template first.
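
If you want to see how the default setup mapped your fields (useful if you have already been sending data and wonder why values get split), you can ask Elasticsearch for the current mapping. This is just a sanity-check sketch; it assumes the same index prefix (syslogxx) and Elasticsearch host/port (192.168.1.1:9200) used in the examples below:

    # show how Elasticsearch mapped the syslog fields so far;
    # analysed string fields are the ones Kibana will tokenize
    curl -s 'http://192.168.1.1:9200/syslogxx-*/_mapping?pretty'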


1. install the Sense extension into your Chrome browser, so you can work with Elasticsearch
2. with Sense, insert the following template into your Elasticsearch cluster:
(https://www.elastic.co/guide/en/elasticsearch/reference/current/indices-templates.html)
PUT /_template/template_syslogxx
{
  "template": "syslogxx-*",
  "settings": {
    "number_of_shards": 1
  },
  "mappings": {
    "fluentd": {
      "properties": {
        "@timestamp": {
          "type": "date",
          "format": "strict_date_optional_time||epoch_millis"
        },
        "host": {
          "type": "multi_field",
          "fields": {
            "host": { "type": "string" },
            "raw":  { "type": "string", "index": "not_analyzed" }
          }
        },
        "ident": {
          "type": "string"
        },
        "message": {
          "type": "multi_field",
          "fields": {
            "message": { "type": "string" },
            "raw":     { "type": "string", "index": "not_analyzed" }
          }
        },
        "pid": {
          "type": "string"
        },
        "tag": {
          "type": "string"
        }
      }
    }
  }
}
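
To double-check that the template really landed in the cluster before you start Fluentd, you can read it back. A minimal check, assuming the same Elasticsearch host/port as in the Fluentd config below (the same request also works from Sense as GET _template/template_syslogxx):

    # read the stored template back; the response should contain the mapping above
    curl -s 'http://192.168.1.1:9200/_template/template_syslogxx?pretty'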

3. configuration of fluentd...
  - change host / port to match your environment
  - small note - I found it quite annoying that the following two directives are order sensitive:
    logstash_prefix syslogxx, logstash_format true
  - the multi_format / format part handles the Huawei-specific message pattern; tested on NE40 / S9xx

  <source>
    type syslog
    port 514
    bind 0.0.0.0
    tag vystupnormal
    format multi_format
    <pattern>
      # Huawei-style line: timestamp, hostname, optional module prefix, message
      format /^(?<time>[^ ]*\s*[^ ]* [^ ]* [^ ]*) (?<host>[^ ]*) ?(?:[^\:]*\:)? *(?<message>.*)$/
      time_format %b %d %Y %H:%M:%S+03:00
    </pattern>
    <pattern>
      # standard syslog lines
      format syslog
    </pattern>
    <pattern>
      # fallback: ISO-like timestamp followed by the rest of the message
      format /(?<time>\d{4}-\d{1,2}-\d{1,2} \d{1,2}:\d{1,2}:\d{1,2},\d{3}) (?<message>(.|\s)*)/
    </pattern>
  </source>



  <match vystupnormal.**>
    type copy
    <store>
      # print parsed records to stdout for debugging
      type stdout
    </store>
    <store>
      # ship records to Elasticsearch, into daily syslogxx-* indices
      type elasticsearch
      host 192.168.1.1
      port 9200
      include_tag_key true
      logstash_prefix syslogxx
      logstash_format true
      type_name fluentd
    </store>
  </match>
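
Once Fluentd is running with this config, a quick end-to-end test is to push a single message into the syslog input by hand. A rough sketch, assuming Fluentd listens on UDP 514 on the local machine and util-linux logger is available:

    # send one test syslog message over UDP to the Fluentd syslog input
    logger -n 127.0.0.1 -P 514 -d "test message for the fluentd pipeline"
    # thanks to "type copy" + stdout above, the parsed record should also show up in the Fluentd log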



4. by now, you should be able to add syslogxx-* to Kibana:
    Settings -> Indices -> put syslogxx-* into the empty index pattern field, etc.
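
If the index pattern does not show up in Kibana, it usually means no index has been created yet; you can check directly in Elasticsearch (again assuming the example 192.168.1.1:9200):

    # list the syslogxx-* indices; with logstash_format Fluentd creates one per day
    curl -s 'http://192.168.1.1:9200/_cat/indices/syslogxx-*?v'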


