Unable to write log messages to the log file in Docker
I have a problem generating log output when I run my application in Docker. There is no issue writing logs to the log file on localhost.
I cannot see any new logs when I perform CRUD operations while running in Docker.
How can I connect the log file (Springboot-Elk.log) to Docker?
How can I fix it?
Here is the file showing the screenshots: Link
Here is my project link: My Project
Here is the docker-compose.yml, shown below:
version: '3.8'
services:
  logstash:
    image: docker.elastic.co/logstash/logstash:7.15.2
    user: root
    command: -f /etc/logstash/conf.d/
    volumes:
      - ./elk/logstash/:/etc/logstash/conf.d/
      - ./Springboot-Elk.log:/tmp/logs/Springboot-Elk.log
    ports:
      - "5000:5000"
    environment:
      LS_JAVA_OPTS: "-Xmx256m -Xms256m"
    depends_on:
      - elasticsearch
  filebeat:
    build:
      context: ./filebeat
      dockerfile: Dockerfile
    links:
      - "logstash:logstash"
    volumes:
      - /var/run/docker.sock:/host_docker/docker.sock
      - /var/lib/docker:/host_docker/var/lib/docker
    depends_on:
      - logstash
  kibana:
    image: docker.elastic.co/kibana/kibana:7.15.2
    user: root
    volumes:
      - ./elk/kibana/:/usr/share/kibana/config/
    ports:
      - "5601:5601"
    depends_on:
      - elasticsearch
    entrypoint: ["./bin/kibana", "--allow-root"]
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:7.15.2
    user: root
    volumes:
      - ./elk/elasticsearch/elasticsearch.yml:/usr/share/elasticsearch/config/elasticsearch.yml
    ports:
      - "9200:9200"
      - "9300:9300"
    environment:
      ES_JAVA_OPTS: "-Xmx256m -Xms256m"
  app:
    image: 'springbootelk:latest'
    build:
      context: .
      dockerfile: Dockerfile
    container_name: SpringBootElk
    depends_on:
      - db
      - logstash
    ports:
      - '8077:8077'
    environment:
      - SPRING_DATASOURCE_URL=jdbc:mysql://db:3306/springbootexample?useSSL=false&allowPublicKeyRetrieval=true&serverTimezone=Turkey
      - SPRING_DATASOURCE_USERNAME=springexample
      - SPRING_DATASOURCE_PASSWORD=111111
      - SPRING_JPA_HIBERNATE_DDL_AUTO=update
  db:
    container_name: db
    image: 'mysql:latest'
    ports:
      - "3366:3306"
    restart: always
    environment:
      MYSQL_DATABASE: ${MYSQL_DATABASE}
      MYSQL_USER: ${MYSQL_USER}
      MYSQL_PASSWORD: ${MYSQL_PASSWORD}
      MYSQL_ROOT_PASSWORD: ${MYSQL_ROOT_PASSWORD}
    volumes:
      - db-data:/var/lib/mysql

# Volumes
volumes:
  db-data:
Here is the logstash.conf, shown below:
input {
  beats {
    port => 5000
  }
  file {
    path => "/tmp/logs/Springboot-Elk.log"
    sincedb_path => "/dev/null"
    start_position => "beginning"
  }
}

output {
  stdout {
    codec => rubydebug
  }
  elasticsearch {
    hosts => "elasticsearch:9200"
    index => "dockerlogs"
  }
}
The filebeat.yml file is shown below:
filebeat.inputs:
- type: docker
  enabled: true
  containers:
    ids:
      - "*"
    path: "/host_docker/var/lib/docker/containers"

processors:
  - add_docker_metadata:
      host: "unix:///host_docker/docker.sock"

filebeat.config.modules:
  path: ${path.config}/modules.d/*.yml
  reload.enabled: false

output.logstash:
  hosts: ["logstash:5000"]

# log files
logging.level: info
logging.to_files: false
logging.to_syslog: false
logging.metrics.enabled: false
logging.files:
  path: /var/log/filebeat
  name: filebeat
  keepfiles: 7
  permissions: 0644

ssl.verification_mode: none
Here is the Dockerfile for Filebeat, shown below:
FROM docker.elastic.co/beats/filebeat:7.15.2
COPY filebeat.yml /usr/share/filebeat/filebeat.yml
USER root
RUN mkdir /usr/share/filebeat/dockerlogs
RUN chown -R root /usr/share/filebeat/
RUN chmod -R go-w /usr/share/filebeat/
As I want to see the logs in Logstash, I run the command docker container logs -f against the Logstash container.
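For reference, commands along the following lines can be used to check where the logs actually end up (a sketch only; the service names come from the docker-compose.yml above, and the paths are assumptions):

docker compose logs -f logstash                   # follow Logstash's own stdout (the rubydebug output)
docker compose exec logstash ls -l /tmp/logs/     # does the mounted Springboot-Elk.log exist and grow?
docker compose exec app ls -l                     # is the log file being written inside the app container instead? (path is an assumption)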
I cannot see any of the logs defined in PersonController and PersonService there.
Here is the screenshot:
Comments (4)
Here is my answer, shown below.
After I revised the logstash.conf file as shown below, my issue disappeared.
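The revised file itself is not reproduced in this post. As a minimal sketch only (not necessarily the author's exact revision), one common change is to replace the file input with a TCP input that a Spring Boot Logstash appender can write to directly; the port and codec below are assumptions:

input {
  tcp {
    port  => 5000
    codec => json_lines   # assumption: the application sends JSON log events over TCP
  }
}
# the output section can stay as in the question (stdout + elasticsearch)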
When working with Docker, it's good practice to write all logs to the console. This allows the logs to be exposed when the application runs in Kubernetes or other orchestrators. In the Spring framework, you can achieve this by switching to a ConsoleAppender. The example below shows how to do this in log4j.xml; place the file in your resources folder and add the log4j dependencies (ref: https://www.baeldung.com/spring-boot-logging):
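The example file itself is not reproduced in this post. A minimal sketch of a console-only configuration, written here in log4j2 style (file name, pattern, and log level are assumptions):

<?xml version="1.0" encoding="UTF-8"?>
<!-- log4j2.xml: route everything to stdout so `docker logs` can pick it up -->
<Configuration status="WARN">
    <Appenders>
        <Console name="Console" target="SYSTEM_OUT">
            <PatternLayout pattern="%d{yyyy-MM-dd HH:mm:ss.SSS} %-5level %logger{36} - %msg%n"/>
        </Console>
    </Appenders>
    <Loggers>
        <Root level="info">
            <AppenderRef ref="Console"/>
        </Root>
    </Loggers>
</Configuration>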
You can still configure logging to disk by adding another appender to the configuration above, but then you need to add a mount point in your docker-compose file that points to your application's logs directory.
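For instance, a hedged sketch of such a mount point on the app service from the compose file above (the host ./logs directory and the container path /tmp/logs are assumptions):

  app:
    volumes:
      # assumption: the application writes Springboot-Elk.log under /tmp/logs inside the container
      - ./logs:/tmp/logs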
It is good to note that Docker containers are ephemeral, so logs written inside the container's filesystem are lost when the container is removed or recreated.
For me it works, but my ELK stack is not running in Docker.
This is my Logstash config (the same config is used for both TCP and UDP):
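That config is not reproduced in this post; a minimal sketch of what a combined TCP/UDP Logstash pipeline could look like (port, codecs, host, and index name are assumptions, with the index chosen to match the logback index mentioned below):

input {
  tcp {
    port  => 5000
    codec => json_lines
  }
  udp {
    port  => 5000
    codec => json
  }
}

output {
  elasticsearch {
    hosts => "elasticsearch:9200"        # assumption: adjust to your Elasticsearch host
    index => "logback-%{+YYYY.MM.dd}"
  }
}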
And you have to set up the logback index in Kibana too.
This is my logback-spring.xml:
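The file itself is not reproduced in this post. A minimal sketch of what such a logback-spring.xml could look like; it uses the LogstashTcpSocketAppender class named in the next answer, since the exact UDP appender configuration is not shown, and the destination host/port and encoder are assumptions:

<?xml version="1.0" encoding="UTF-8"?>
<configuration>
    <!-- requires the logstash-logback-encoder library on the classpath (see the next answer) -->
    <appender name="LOGSTASH" class="net.logstash.logback.appender.LogstashTcpSocketAppender">
        <destination>localhost:5000</destination> <!-- assumption: adjust to your Logstash host and port -->
        <encoder class="net.logstash.logback.encoder.LogstashEncoder"/>
    </appender>
    <appender name="CONSOLE" class="ch.qos.logback.core.ConsoleAppender">
        <encoder>
            <pattern>%d{HH:mm:ss.SSS} %-5level %logger{36} - %msg%n</pattern>
        </encoder>
    </appender>
    <root level="INFO">
        <appender-ref ref="LOGSTASH"/>
        <appender-ref ref="CONSOLE"/>
    </root>
</configuration>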
I'm logging directly (over UDP) to the ELK stack.
And you need
According to your repository, you have a logback configuration for logging, but you excluded logback and added log4j support in your Maven dependencies. To make logback work, you just need to not exclude the default logging library and to add a newer version of the logback dependency (because the logback version pinned by the Spring Boot dependencies doesn't contain the Logstash appender referenced in your configuration, net.logstash.logback.appender.LogstashTcpSocketAppender). For example:
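The snippet itself is not preserved in this post. As a sketch, the dependency that actually provides net.logstash.logback.appender.LogstashTcpSocketAppender is logstash-logback-encoder, so the addition would look roughly like this (the version shown is an assumption):

<!-- provides net.logstash.logback.appender.LogstashTcpSocketAppender and LogstashEncoder -->
<dependency>
    <groupId>net.logstash.logback</groupId>
    <artifactId>logstash-logback-encoder</artifactId>
    <version>7.2</version> <!-- assumption: pick a release compatible with your logback version -->
</dependency>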
But it is better to find a good guide on the internet on how to set up logging from your Spring Boot application to Logstash properly. And I don't suggest using the configuration in your repository for production purposes.