Concepts of the ELK Distributed Logging Platform
1) Elasticsearch concept:
Elasticsearch (ES for short) is a distributed engine used mainly to store data (log content). It can search and filter log content and retrieve related entries from the logs, supports automatic node discovery, index creation, and replicas, and provides a rich API so that all kinds of programs can work with ES.
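As a rough illustration of that API (a minimal sketch, assuming an ES node reachable at 10.0.0.11:9200 as in the deployment section below; the index name syslog-demo and the sample document are made up for this example), indexing and searching a log entry looks like this:

# index a sample log document into a hypothetical index "syslog-demo"
curl -XPUT 'http://10.0.0.11:9200/syslog-demo/log/1?pretty' -H 'Content-Type: application/json' -d '
{
  "host": "web01",
  "message": "nginx restarted",
  "@timestamp": "2022-04-05T09:00:00Z"
}'
# full-text search the stored log content for "nginx"
curl -XGET 'http://10.0.0.11:9200/syslog-demo/_search?q=message:nginx&pretty'
# check cluster health, which also reflects node discovery and replica allocation
curl -XGET 'http://10.0.0.11:9200/_cluster/health?pretty'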
2) Logstash concept:
Logstash is deployed on the client nodes and is mainly used to collect the client servers' logs (kernel logs, system logs, security logs, application logs). Besides collecting logs, it can also apply simple filtering (for example regular-expression filters), and it finally ships the collected logs to the ES cluster for centralized storage.
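A minimal sketch of such a pipeline (the file path, grok pattern, and field names are illustrative assumptions, not from the original text; the ES address 10.0.0.11:9200 is the one used in the deployment section below):

# Logstash pipeline sketch: collect a security log, apply a simple filter, ship it to ES
input {
  file {
    path => "/var/log/secure"            # security log on the client node; Logstash needs read permission
    start_position => "beginning"
  }
}
filter {
  grok {
    # illustrative pattern: split the raw line into timestamp, host, program, and message fields
    match => { "message" => "%{SYSLOGTIMESTAMP:logtime} %{SYSLOGHOST:loghost} %{DATA:program}: %{GREEDYDATA:msg}" }
  }
}
output {
  elasticsearch { hosts => "10.0.0.11:9200" }
}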
3) Kibana concept:
Kibana provides a web interface for ES and Logstash. Through this web interface users can conveniently and efficiently manage the ES cluster and quickly search and work with logs. The Kibana process connects to the ES distributed engine and can operate on the data and logs stored in it.
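One quick way to confirm that Kibana is talking to the ES engine (a sketch, assuming Kibana listens on its default port 5601 on 10.0.0.11 and that the status API is available in your Kibana version):

# Kibana serves its UI on port 5601; the status API reports the state of its Elasticsearch connection
curl -XGET 'http://10.0.0.11:5601/api/status'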
4) Filebeat concept:
Filebeat does the same job as Logstash, but it is lightweight and high-performance. It is deployed on the client nodes and is mainly used to collect the client servers' logs (kernel logs, system logs, security logs, application logs). However, it cannot filter the log content itself; if filtering is needed, the log data has to be sent on to Logstash. The collected logs are finally stored centrally on the ES cluster.
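A minimal sketch of a Filebeat configuration under that model (assuming Filebeat 5.x to match the rest of the stack here; the log paths and the Logstash port 5044 are illustrative and require a matching beats input on the Logstash side):

# filebeat.yml sketch: ship system and security logs to Logstash for filtering
filebeat.prospectors:
  - input_type: log
    paths:
      - /var/log/messages        # system log
      - /var/log/secure          # security log

output.logstash:
  hosts: ["10.0.0.11:5044"]      # Logstash side needs: input { beats { port => 5044 } }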
5) Logstash and Elasticsearch are written in Java, Filebeat is developed in Go, and Kibana is built on Node.js. When setting up the ELK environment, make sure the system has a Java JDK installed.
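Before starting the installation, the JDK prerequisite can be checked with (a trivial sketch; the paths follow the installation steps below):

# verify that a Java runtime is available and which version it is
java -version
echo $JAVA_HOME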
2. Deployment:
Traditional deployment steps. Configuration on the ES server:
Preparation: install JDK 1.8 and Elasticsearch.

Install the JDK:
tar xf jdk1.8.0_131.tar.gz
mv jdk1.8.0_131 /usr/java/jdk1.8.0_131
vim /etc/profile
export JAVA_HOME=/usr/java/jdk1.8.0_131
export CLASSPATH=$CLASSPATH:$JAVA_HOME/lib:$JAVA_HOME/jre/lib
export PATH=$JAVA_HOME/bin:$JAVA_HOME/jre/bin:$PATH:$HOME/bin
source /etc/profile

Install Elasticsearch:
tar xf elasticsearch-5.3.0.tar.gz
mv elasticsearch-5.3.0 /usr/local/elasticsearch
vim /usr/local/elasticsearch/config/elasticsearch.yml
network.host: 0.0.0.0
bootstrap.memory_lock: false
bootstrap.system_call_filter: false
vim /etc/security/limits.conf
* soft nofile 65536
* hard nofile 65536
vim /etc/sysctl.conf
vm.max_map_count=262144
sysctl -p
useradd elk
chown -R elk:elk /usr/local/elasticsearch/
su - elk
/usr/local/elasticsearch/bin/elasticsearch -d
ps -ef | grep elas
elk 1926 1 82 16:40 pts/0 00:00:16 /usr/java/jdk1.8.0_131/bin/java -Xms2g -Xmx2g -XX:+UseConcMarkSweepGC -XX:CMSInitiatingOccupancyFraction=75 -XX:+UseCMSInitiatingOccupancyOnly -XX:+DisableExplicitGC -XX:+AlwaysPreTouch -server -Xss1m -Djava.awt.headless=true -Dfile.encoding=UTF-8 -Djna.nosys=true -Djdk.io.permissionsUseCanonicalPath=true -Dio.netty.noUnsafe=true -Dio.netty.noKeySetOptimization=true -Dio.netty.recycler.maxCapacityPerThread=0 -Dlog4j.shutdownHookEnabled=false -Dlog4j2.disable.jmx=true -Dlog4j.skipJansi=true -XX:+HeapDumpOnOutOfMemoryError -Des.path.home=/usr/local/elasticsearch -cp /usr/local/elasticsearch/lib/elasticsearch-5.3.0.jar:/usr/local/elasticsearch/lib/* org.elasticsearch.bootstrap.Elasticsearch -d
elk 1966 1874 0 16:40 pts/0 00:00:00 grep --color=auto elas
netstat -tanpl | grep 9200
(Not all processes could be identified, non-owned process info will not be shown, you would have to be root to see it all.)
tcp6 0 0 :::9200 :::* LISTEN 1926/java

Install Kibana:
mv kibana-5.3.0-linux-x86_64 /usr/local/kibana
vim /usr/local/kibana/config/kibana.yml
server.port: 5601
server.host: "0.0.0.0"
elasticsearch.url: "http://10.0.0.11:9200"
/usr/local/kibana/bin/kibana
log [08:55:46.247] [info][status][plugin:kibana@5.3.0] Status changed from uninitialized to green - Ready
log [08:55:46.340] [info][status][plugin:elasticsearch@5.3.0] Status changed from uninitialized to yellow - Waiting for Elasticsearch
log [08:55:46.387] [info][status][plugin:console@5.3.0] Status changed from uninitialized to green - Ready
log [08:55:46.624] [info][status][plugin:timelion@5.3.0] Status changed from uninitialized to green - Ready
log [08:55:46.628] [info][listening] Server running at http://0.0.0.0:5601
log [08:55:46.629] [info][status][ui settings] Status changed from uninitialized to yellow - Elasticsearch plugin is yellow
log [08:55:51.420] [info][status][plugin:elasticsearch@5.3.0] Status changed from yellow to yellow - No existing Kibana index found
log [08:55:52.478] [info][status][plugin:elasticsearch@5.3.0] Status changed from yellow to green - Kibana index ready
log [08:55:52.478] [info][status][ui settings] Status changed from yellow to green - Ready

Access the page: if the default Kibana page comes up, Kibana is working.
(Index pattern: logstash-*; entries are retrieved by the timestamp time field.)

Deploy Logstash:
tar xzf logstash-5.3.0.tar.gz
mv logstash-5.3.0 /usr/local/logstash
mkdir -p /usr/local/logstash/config/etc/
cd /usr/local/logstash/config/etc/
vim jfedu.conf
input { stdin { } }
output {
  stdout { codec => rubydebug {} }
  elasticsearch { hosts => "10.0.0.11:9200" }
}
/usr/local/logstash/bin/logstash -f jfedu.conf
Sending Logstash's logs to /usr/local/logstash/logs which is now configured via log4j2.properties
[2022-04-05T17:04:18,848][INFO ][logstash.setting.writabledirectory] Creating directory {:setting=>"path.queue", :path=>"/usr/local/logstash/data/queue"}
[2022-04-05T17:04:18,959][INFO ][logstash.agent ] No persistent UUID file found. Generating new UUID {:uuid=>"6a99e592-bb18-4519-87bf-3b1c59a376d6", :path=>"/usr/local/logstash/data/uuid"}
[2022-04-05T17:04:20,915][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://10.0.0.11:9200/]}}
[2022-04-05T17:04:20,930][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://10.0.0.11:9200/, :path=>"/"}
[2022-04-05T17:04:21,183][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>#<URI::HTTP:0x1d43720 URL:http://10.0.0.11:9200/>}
[2022-04-05T17:04:21,232][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2022-04-05T17:04:21,411][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>50001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"_all"=>{"enabled"=>true, "norms"=>false}, "dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword"}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date", "include_in_all"=>false}, "@version"=>{"type"=>"keyword", "include_in_all"=>false}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[2022-04-05T17:04:21,455][INFO ][logstash.outputs.elasticsearch] Installing elasticsearch template to _template/logstash
[2022-04-05T17:04:21,823][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>[#<URI::Generic:0x3e644f1 URL://10.0.0.11:9200>]}
[2022-04-05T17:04:21,868][INFO ][logstash.pipeline ] Starting pipeline {"id"=>"main", "pipeline.workers"=>1, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>5, "pipeline.max_inflight"=>125}
[2022-04-05T17:04:21,941][INFO ][logstash.pipeline ] Pipeline main started
The stdin plugin is now waiting for input:
[2022-04-05T17:04:22,102][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}

Type a few test lines on stdin; each line is echoed back as a structured event and indexed into ES:
www.jf.com
{
    "@timestamp" => 2022-04-05T09:07:26.034Z,
      "@version" => "1",
          "host" => "k8s-node-2",
       "message" => "www.jf.com"
}
test
{
    "@timestamp" => 2022-04-05T09:09:13.727Z,
      "@version" => "1",
          "host" => "k8s-node-2",
       "message" => "test"
}
localhsot:9200timespace
{
    "@timestamp" => 2022-04-05T09:09:57.272Z,
      "@version" => "1",
          "host" => "k8s-node-2",
       "message" => "localhsot:9200timespace"
}
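The stdin test above only proves that the pipeline works end to end. As a possible next step (a sketch only; the file paths are illustrative and the index name is simply the Logstash default written out explicitly), jfedu.conf can be pointed at real system logs so they land in ES under a dated index that Kibana can pick up with the logstash-* pattern:

# jfedu.conf sketch: read real system logs instead of stdin
input {
  file {
    path => ["/var/log/messages", "/var/log/secure"]   # Logstash needs read permission on these files
    start_position => "beginning"
  }
}
output {
  elasticsearch {
    hosts => "10.0.0.11:9200"
    index => "logstash-%{+YYYY.MM.dd}"                 # dated index, matched by the logstash-* pattern in Kibana
  }
}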