Test that it works
curl "https://elasticsearch.raphaelpiccolo.com/_cat/nodes?v&pretty"
On the first install, Kibana has to be opened via the link provided in the logs.
Then create an enrollment token from Elasticsearch and enter it in Kibana:
docker exec -it swarm_elasticsearch.1.57c43ytln520ey6svypjgv7pi /usr/share/elasticsearch/bin/elasticsearch-create-enrollment-token -s kibana
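If the link or the verification code is needed again, it can usually be recovered from the Kibana container; the service name below is an assumption based on the elasticsearch service name above, adjust it to the actual stack:

docker service logs swarm_kibana 2>&1 | grep -i code   # assumed service name
docker exec -it <kibana-container> /usr/share/kibana/bin/kibana-verification-code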
At the very bottom of the main home page there is a Stack Management / Index Management block.
Data can be inserted with curl:
curl -X POST https://elasticsearch.raphaelpiccolo.com/test/_doc \
  -H "Content-Type: application/json" \
  -d '{"name": "foo2", "title": "bar2" }'
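For several documents at once the _bulk endpoint should also work; a sketch, the payload is newline-delimited JSON and must end with a newline:

curl -X POST https://elasticsearch.raphaelpiccolo.com/test/_bulk \
  -H "Content-Type: application/x-ndjson" \
  --data-binary $'{"index":{}}\n{"name":"foo3","title":"bar3"}\n{"index":{}}\n{"name":"foo4","title":"bar4"}\n'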
Then in Kibana go to Analytics > Discover
and run this query:
FROM test* | LIMIT 10
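ES|QL also allows filtering and aggregation, something like this (field names taken from the test documents inserted above):

FROM test* | WHERE name == "foo2" | LIMIT 10
FROM test* | STATS total = COUNT(*)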
OLD
java -version
apt-get update
apt-get install default-jre
apt-get install default-jdk

wget -qO - https://artifacts.elastic.co/GPG-KEY-elasticsearch | apt-key add -
echo "deb https://artifacts.elastic.co/packages/6.x/apt stable main" | tee -a /etc/apt/sources.list.d/elastic-6.x.list
apt-get update
apt-get install elasticsearch

emacs /etc/elasticsearch/elasticsearch.yml
network.host: localhost

systemctl start elasticsearch
systemctl enable elasticsearch
check working
curl "localhost:9200"
get indexes
curl 'localhost:9200/_aliases?pretty=true'
curl 'localhost:9200/_cat/indices'
get types
curl -s -XGET 'http://localhost:9200/_mapping' | jq 'to_entries | .[] | {(.key): .value.mappings | keys}'
delete indexes
curl -XDELETE 'http://localhost:9200/filebeat-*'
delete pipelines
curl -XDELETE 'http://localhost:9200/_ingest/pipeline/*'
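To see what is there before deleting, the existing pipelines can be listed:

curl 'http://localhost:9200/_ingest/pipeline?pretty'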
find something by id
curl 'http://localhost:9200/logstash-node-2019.05.16/doc/YiBswmoBrpq531GuVhD-?pretty'
search something
curl 'http://localhost:9200/logstash-node-*/doc/_search?pretty'
curl 'http://localhost:9200/logstash-node-*/_search?pretty'
curl 'http://localhost:9200/logstash-node-2019.05.16/_search?pretty'
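A full-text query can also be sent in the request body, for example on the message field used in the insert below:

curl -H 'Content-Type: application/json' 'http://localhost:9200/logstash-node-*/_search?pretty' \
  -d '{ "query": { "match": { "message": "value" } }, "size": 5 }'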
insert
curl -H 'Content-Type: application/json' -XPOST 'http://localhost:9200/logstash-node-2019.05.16/doc?pretty' -d '{ "message" : "value", "@timestamp" : "2019-05-16T20:42:10.000Z"}'
apt-get install kibana
systemctl enable kibana
systemctl start kibana
Check that it works
curl http://localhost:5601/status
Make it accessible from the outside
emacs /etc/apache2/sites-enabled/000-default.conf

ServerName kibana.raphaelpiccolo.com
ProxyRequests Off
ProxyPass / http://localhost:5601/
ProxyPassReverse / http://localhost:5601/
ProxyPreserveHost On
AuthName "Safe Zone"
AuthType Basic
AuthUserFile "/etc/apache2/.htpasswd"
Require valid-user
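This assumes the proxy modules are enabled and the htpasswd file already exists; if not, something like this should do it (the user name is just an example):

a2enmod proxy proxy_http
htpasswd -c /etc/apache2/.htpasswd someuser   # -c only the first time, example user
systemctl reload apache2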
apt-get install logstash
so that the logstash user has access to /var/log/*
usermod -a -G adm logstash
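To check that the group was picked up:

id logstash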
Existing logformat options : https://httpd.apache.org/docs/2.4/fr/mod/mod_log_config.html
Existing grok patterns : https://github.com/elastic/logstash/blob/v1.4.2/patterns/grok-patterns
emacs /etc/apache2/apache2.conf
LogFormat "%v:%p %h %l %u %t \"%r\" %>s %O \"%{Referer}i\" \"%{User-Agent}i\" %D" vhost_combined

tail -f /var/log/apache2/other_vhosts_access.log
fake.fr:80 176.159.8.38 - - [05/May/2019:16:24:48 +0200] "GET /test HTTP/1.1" 200 839 "-" "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/74.0.3729.131 Safari/537.36" 314

emacs /etc/logstash/conf.d/apache-access.conf

input {
  file {
    type => "apache-access"
    path => "/var/log/apache2/other_vhosts_access.log"
    start_position => beginning
  }
}
filter {
  if [type] == "apache-access" {
    grok {
      match => { "message" => "%{IPORHOST:vhost}:%{POSINT:port} %{IPORHOST:clientip} %{USER:ident} %{USER:auth} \[%{HTTPDATE:timestamp}\] \"(?:%{WORD:verb} %{NOTSPACE:request}(?: HTTP/%{NUMBER:httpversion})?|%{DATA:rawrequest})\" %{NUMBER:response} (?:%{NUMBER:bytes}|-) %{QS:referrer} %{QS:agent} %{NUMBER:duration}" }
    }
    date {
      match => [ "timestamp" , "dd/MMM/yyyy:HH:mm:ss Z" ]
    }
    mutate {
      remove_field => [ "timestamp" ]
    }
    if [agent] {
      useragent {
        source => "agent"
        target => "useragent"
      }
    }
    if [clientip] and [clientip] != '::1' and [clientip] != '127.0.0.1' {
      geoip {
        source => "clientip"
      }
      mutate {
        convert => [ "duration", "integer" ]
        convert => [ "bytes", "integer" ]
      }
    }
  }
}
output {
  if [type] == "apache-access" {
    elasticsearch {
      hosts => ["127.0.0.1"]
      index => "logstash-apache-%{+YYYY.MM.dd}"
    }
  }
}
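Note that the LogFormat change above only shows up in the log after Apache is reloaded:

systemctl reload apache2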
check the config
sudo -u logstash /usr/share/logstash/bin/logstash --path.settings /etc/logstash -t -f /etc/logstash/conf.d/apache-access.conf

run it in a console to test

sudo -u logstash /usr/share/logstash/bin/logstash --path.settings /etc/logstash -f /etc/logstash/conf.d/apache-access.conf
enable it as a daemon
systemctl start logstash
systemctl enable logstash
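To see whether the daemon starts cleanly (paths assume the default deb package layout):

journalctl -u logstash -f
tail -f /var/log/logstash/logstash-plain.log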
check that the index is getting filled
curl 'localhost:9200/_cat/indices'
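The document count of the apache index can also be checked directly:

curl 'localhost:9200/logstash-apache-*/_count?pretty'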
tail -f /var/log/apache2/error.log
[Sun May 05 17:49:27.410788 2019] [proxy:error] [pid 486] [client 66.249.76.120:52362] AH00898: Error reading from remote server returned by /en/activity.json, referer: https://ideaz.world/en/activities/48

emacs /etc/logstash/conf.d/apache-error.conf

input {
  file {
    type => "apache-error"
    path => "/var/log/apache2/error.log"
    start_position => beginning
  }
}
filter {
  if [type] == "apache-error" {
    grok {
      match => { "message" => "%{HTTPD_ERRORLOG}" }
    }
    date {
      match => [ "timestamp", "EEE MMM dd HH:mm:ss.SSSSSS yyyy" ]
    }
    mutate {
      remove_field => [ "timestamp" ]
    }
    if [clientip] and [clientip] != '::1' and [clientip] != '127.0.0.1' {
      geoip {
        source => "clientip"
      }
    }
  }
}
output {
  if [type] == "apache-error" {
    elasticsearch {
      hosts => ["127.0.0.1"]
      index => "logstash-apache-%{+YYYY.MM.dd}"
    }
  }
}
emacs /etc/logstash/conf.d/node.conf

input {
  tcp {
    type => "node"
    port => 12345
    codec => json
  }
}
filter {
  if [type] == "node" {
    date {
      match => ["date", "yyyy-MM-dd'T'HH:mm:ss'.'SSSZ" ]
    }
    if [clientip] and [clientip] != '::1' and [clientip] != '127.0.0.1' {
      geoip {
        source => "clientip"
      }
    }
  }
}
output {
  stdout { codec => rubydebug }
  elasticsearch {
    hosts => ["127.0.0.1"]
    index => "logstash-node-%{+YYYY.MM.dd}"
  }
}
emacs test.conf

input {
  stdin { }
}
filter {
  grok {
    match => { "message" => "%{NUMBER:numero}" }
  }
}
output {
  stdout { codec => rubydebug }
}

echo "3456789" | sudo -u logstash /usr/share/logstash/bin/logstash --path.settings /etc/logstash -f /etc/logstash/conf.d/test.conf

{
          "host" => "ideaz.world",
        "numero" => "3456789",
      "@version" => "1",
       "message" => "[3456789]",
    "@timestamp" => 2019-05-05T20:39:33.799Z
}
apt-get install filebeat

emacs /etc/filebeat/filebeat.yml
setup.kibana:
  host: "localhost:5601"

filebeat modules enable apache2
filebeat modules list
filebeat setup
filebeat setup -e

systemctl start filebeat
systemctl enable filebeat
to enable other modules:
filebeat modules enable system
filebeat modules enable redis
filebeat modules enable mysql
filebeat modules enable mongodb
filebeat setup
systemctl restart filebeat
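To confirm that the modules actually ship data:

curl 'localhost:9200/_cat/indices/filebeat-*?v'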
apt-get install metricbeat

emacs /etc/metricbeat/metricbeat.yml
setup.kibana:
  host: "localhost:5601"

metricbeat modules enable system
metricbeat modules enable apache
metricbeat modules enable mongodb
metricbeat modules enable mysql
metricbeat modules enable redis
metricbeat setup
systemctl restart metricbeat
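The config and the enabled modules can be tested before restarting (the test sub-commands exist in recent beats versions), and the resulting indexes checked:

metricbeat test config
metricbeat test modules
curl 'localhost:9200/_cat/indices/metricbeat-*?v'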