Docker configuration
docker-compose
version: '3'
services:
  elasticsearch:
    image: elasticsearch:6.6.2
    container_name: elasticsearch
    restart: always
    environment:
      - "cluster.name=elasticsearch"
      - "discovery.type=single-node"
      - "ES_JAVA_OPTS=-Xms512m -Xmx512m"
    ports:
      - 9200:9200
      - 9300:9300
  kibana:
    image: docker.elastic.co/kibana/kibana:6.6.2
    container_name: kibana
    environment:
      - SERVER_NAME=kibana
      - ELASTICSEARCH_URL=http://elasticsearch:9200
      - XPACK_MONITORING_ENABLED=true
    ports:
      - 5601:5601
    depends_on:
      - elasticsearch
  logstash:
    image: logstash:6.6.2
    container_name: logstash
    volumes:
      - ~/docker/mydata/logstash/logstash.conf:/usr/share/logstash/pipeline/logstash.conf
    depends_on:
      - elasticsearch
    links:
      - elasticsearch:es
    ports:
      - 4560:4560
logstash.conf
In the docker-compose configuration above, logstash mounts a file that configures Logstash: it defines the input and output rules and emits data as JSON. Before starting the containers, create the ~/docker/mydata/logstash/logstash.conf file first; otherwise startup will fail.
input {
  tcp {
    mode => "server"
    host => "0.0.0.0"
    port => 4560
    codec => json_lines
    type => "business"
  }
}
output {
  elasticsearch {
    hosts => ["es:9200"]
    action => "index"
    codec => json
    index => "%{type}-%{+YYYY.MM.dd}"
    template_name => "business"
  }
}
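The tcp input above uses the json_lines codec, i.e. one JSON document per line over a plain TCP connection on port 4560. As a minimal sketch (the class and helper names here are illustrative, not from the original), an event can be hand-built and pushed to that input like this:

```java
import java.io.OutputStream;
import java.net.Socket;
import java.nio.charset.StandardCharsets;

public class JsonLinesClient {
    // Build one json_lines event: a single JSON object terminated by '\n'.
    // The "type" field is what the output's "%{type}-%{+YYYY.MM.dd}" index
    // pattern interpolates, so this event would land in e.g. business-2019.03.15.
    static String buildEvent(String type, String message) {
        return "{\"type\":\"" + type + "\",\"message\":\"" + message + "\"}\n";
    }

    // Send a single event to the logstash tcp input (port 4560 in the config).
    static void send(String host, int port, String event) throws Exception {
        try (Socket socket = new Socket(host, port);
             OutputStream out = socket.getOutputStream()) {
            out.write(event.getBytes(StandardCharsets.UTF_8));
            out.flush();
        }
    }

    public static void main(String[] args) {
        String event = buildEvent("business", "order created");
        System.out.print(event);
        // send("localhost", 4560, event); // requires the containers to be up
    }
}
```

In practice the logback appender configured below does this framing for you; the sketch only shows what travels over the wire.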
Starting the containers
Start command: docker-compose up -d

Logback configuration

Add the logstash-logback-encoder dependency to the project:
<!-- Logstash integration -->
<dependency>
    <groupId>net.logstash.logback</groupId>
    <artifactId>logstash-logback-encoder</artifactId>
    <version>5.3</version>
</dependency>
logback-spring.xml
Log output is specified in logback-spring.xml; add an appender there that ships logs to Logstash. Full configuration:
<?xml version="1.0" encoding="UTF-8" ?>
<configuration>
    <!-- Application name -->
    <springProperty scope="context" name="APP_NAME" source="spring.application.name" defaultValue="springBoot"/>
    <!-- Logstash host -->
    <springProperty name="LOG_STASH_HOST" scope="context" source="logstash.host" defaultValue="localhost"/>
    <!-- Console output -->
    <appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
        <encoder>
            <pattern>%d{yyyy-MM-dd HH:mm:ss.SSS} [%X{traceId}] [%level] [%c:%L] - %msg%n</pattern>
        </encoder>
    </appender>
    <!-- One log file per day, kept for 30 days -->
    <appender name="DayFile" class="ch.qos.logback.core.rolling.RollingFileAppender">
        <File>logs/log.log</File>
        <rollingPolicy class="ch.qos.logback.core.rolling.TimeBasedRollingPolicy">
            <fileNamePattern>logs/log.%d{yyyy-MM-dd}.log</fileNamePattern>
            <maxHistory>30</maxHistory>
        </rollingPolicy>
        <encoder>
            <pattern>%d{yyyy-MM-dd HH:mm:ss.SSS} [%X{traceId}] [%level] [%thread] [%c:%L] - %msg%n</pattern>
        </encoder>
    </appender>
    <!-- Business logs shipped to Logstash -->
    <appender name="LOG_STASH_BUSINESS" class="net.logstash.logback.appender.LogstashTcpSocketAppender">
        <destination>${LOG_STASH_HOST}:4560</destination>
        <encoder charset="UTF-8" class="net.logstash.logback.encoder.LoggingEventCompositeJsonEncoder">
            <providers>
                <timestamp>
                    <timeZone>Asia/Shanghai</timeZone>
                </timestamp>
                <!-- Custom log output format -->
                <pattern>
                    <pattern>{
                        "service": "${APP_NAME:-}",
                        "level": "%level",
                        "pid": "${PID:-}",
                        "thread": "%thread",
                        "class": "%logger",
                        "traceId": "%X{traceId:-}",
                        "message": "%message",
                        "stack_trace": "%exception"
                        }</pattern>
                </pattern>
            </providers>
        </encoder>
    </appender>
    <!-- logger name is a package or fully qualified class name; level sets the threshold; additivity controls whether events propagate to the root logger -->
    <logger name="slf4j" level="INFO" additivity="false">
        <appender-ref ref="STDOUT"/>
        <appender-ref ref="DayFile"/>
        <appender-ref ref="LOG_STASH_BUSINESS"/>
    </logger>
    <!-- Classes under the slf4j2 package propagate to the root logger at ERROR level -->
    <logger name="slf4j2" level="ERROR"/>
    <!-- Root logger -->
    <root level="INFO">
        <appender-ref ref="STDOUT"/>
        <appender-ref ref="DayFile"/>
        <appender-ref ref="LOG_STASH_BUSINESS"/>
    </root>
</configuration>
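The patterns above reference %X{traceId}, which logback resolves from slf4j's MDC (a per-thread key/value map) at log time; the application has to populate it, typically org.slf4j.MDC.put("traceId", ...) at the start of a request and MDC.remove(...) in a finally block at the end. The sketch below uses a tiny ThreadLocal stand-in instead of the real org.slf4j.MDC class, purely so the mechanism can be shown without the slf4j jar; the method names mirror the real API but the class itself is hypothetical:

```java
import java.util.HashMap;
import java.util.Map;

// Stand-in for org.slf4j.MDC: a per-thread map that %X{traceId} reads from.
public class MdcSketch {
    private static final ThreadLocal<Map<String, String>> CONTEXT =
            ThreadLocal.withInitial(HashMap::new);

    static void put(String key, String value) { CONTEXT.get().put(key, value); }
    static String get(String key) { return CONTEXT.get().getOrDefault(key, ""); }
    static void remove(String key) { CONTEXT.get().remove(key); }

    public static void main(String[] args) {
        put("traceId", "a1b2c3d4");
        // Once the id is in the context, every log statement on this thread
        // carries it automatically via the %X{traceId} conversion word.
        System.out.println("[" + get("traceId") + "] order created");
        remove("traceId"); // always clean up, or thread pools leak stale ids
    }
}
```

Because the map is per-thread, the id set by a request-handling thread shows up in all of that request's log lines (console, file, and the Logstash appender alike) without being passed around explicitly.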
