Problem description

I can't figure out how to connect my Postgres database, which runs in a container, to Logstash so that the data can be displayed in Kibana. When I set up my logstash.conf and run the stack, I receive the following error:

[2024-04-05T06:31:40,600][ERROR][logstash.config.sourceloader] No configuration found in the configured sources.

My goal is to connect the Postgres database to Logstash using this source.
Extra information
I have not changed any file or the structure of what I cloned from this repository.
version: '3.7'

services:

  # The 'setup' service runs a one-off script which initializes users inside
  # Elasticsearch — such as 'logstash_internal' and 'kibana_system' — with the
  # values of the passwords defined in the '.env' file. It also creates the
  # roles required by some of these users.
  #
  # This task only needs to be performed once, during the *initial* startup of
  # the stack. Any subsequent run will reset the passwords of existing users to
  # the values defined inside the '.env' file, and the built-in roles to their
  # default permissions.
  #
  # By default, it is excluded from the services started by 'docker compose up'
  # due to the non-default profile it belongs to. To run it, either provide the
  # '--profile=setup' CLI flag to Compose commands, or "up" the service by name
  # such as 'docker compose up setup'.
  setup:
    profiles:
      - setup
    build:
      context: setup/
      args:
        ELASTIC_VERSION: ${ELASTIC_VERSION}
    init: true
    volumes:
      - ./setup/entrypoint.sh:/entrypoint.sh:ro,Z
      - ./setup/lib.sh:/lib.sh:ro,Z
      - ./setup/roles:/roles:ro,Z
    environment:
      ELASTIC_PASSWORD: ${ELASTIC_PASSWORD:-}
      LOGSTASH_INTERNAL_PASSWORD: ${LOGSTASH_INTERNAL_PASSWORD:-}
      KIBANA_SYSTEM_PASSWORD: ${KIBANA_SYSTEM_PASSWORD:-}
      METRICBEAT_INTERNAL_PASSWORD: ${METRICBEAT_INTERNAL_PASSWORD:-}
      FILEBEAT_INTERNAL_PASSWORD: ${FILEBEAT_INTERNAL_PASSWORD:-}
      HEARTBEAT_INTERNAL_PASSWORD: ${HEARTBEAT_INTERNAL_PASSWORD:-}
      MONITORING_INTERNAL_PASSWORD: ${MONITORING_INTERNAL_PASSWORD:-}
      BEATS_SYSTEM_PASSWORD: ${BEATS_SYSTEM_PASSWORD:-}
    networks:
      - elk
    depends_on:
      - elasticsearch

  elasticsearch:
    build:
      context: elasticsearch/
      args:
        ELASTIC_VERSION: ${ELASTIC_VERSION}
    volumes:
      - ./elasticsearch/config/elasticsearch.yml:/usr/share/elasticsearch/config/elasticsearch.yml:ro,Z
      - elasticsearch:/usr/share/elasticsearch/data:Z
    ports:
      - 9200:9200
      - 9300:9300
    environment:
      node.name: elasticsearch
      ES_JAVA_OPTS: -Xms512m -Xmx512m
      # Bootstrap password.
      # Used to initialize the keystore during the initial startup of
      # Elasticsearch. Ignored on subsequent runs.
      ELASTIC_PASSWORD: ${ELASTIC_PASSWORD:-}
      # Use single node discovery in order to disable production mode and avoid bootstrap checks.
      # see: https://www.elastic.co/guide/en/elasticsearch/reference/current/bootstrap-checks.html
      discovery.type: single-node
    networks:
      - elk
    restart: unless-stopped

  logstash:
    build:
      context: logstash/
      args:
        ELASTIC_VERSION: ${ELASTIC_VERSION}
    volumes:
      - ./logstash/config/logstash.yml:/usr/share/logstash/config/logstash.yml:ro,Z
      - ./logstash/pipeline:/usr/share/logstash/pipeline:ro,Z
    ports:
      - 5044:5044
      - 50000:50000/tcp
      - 50000:50000/udp
      - 9600:9600
    environment:
      LS_JAVA_OPTS: -Xms256m -Xmx256m
      LOGSTASH_INTERNAL_PASSWORD: ${LOGSTASH_INTERNAL_PASSWORD:-}
    networks:
      - elk
    depends_on:
      - elasticsearch
    restart: unless-stopped

  kibana:
    build:
      context: kibana/
      args:
        ELASTIC_VERSION: ${ELASTIC_VERSION}
    volumes:
      - ./kibana/config/kibana.yml:/usr/share/kibana/config/kibana.yml:ro,Z
    ports:
      - 5601:5601
    environment:
      KIBANA_SYSTEM_PASSWORD: ${KIBANA_SYSTEM_PASSWORD:-}
    networks:
      - elk
    depends_on:
      - elasticsearch
    restart: unless-stopped

networks:
  elk:
    driver: bridge

volumes:
  elasticsearch:
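Note that in this Compose file, Logstash mounts `./logstash/pipeline` to `/usr/share/logstash/pipeline`, which is where pipeline files need to live for the default image to pick them up. As a sketch only (the `usersync.conf` filename comes from the error below, but the database name, table, credentials, driver path, and the `postgres` hostname are all assumptions for illustration), a JDBC pipeline file placed in that directory could look like:

```conf
# ./logstash/pipeline/usersync.conf
# Picked up because ./logstash/pipeline is mounted to
# /usr/share/logstash/pipeline in docker-compose.yml.
input {
  jdbc {
    # 'postgres' is assumed to be the service name of the database container;
    # it is only reachable if it shares a Docker network with Logstash.
    jdbc_connection_string => "jdbc:postgresql://postgres:5432/mydb"
    jdbc_user => "myuser"
    jdbc_password => "mypassword"
    jdbc_driver_class => "org.postgresql.Driver"
    # The PostgreSQL JDBC driver jar must be present at this (assumed) path.
    jdbc_driver_library => "/usr/share/logstash/postgresql.jar"
    # Run the query every minute.
    schedule => "* * * * *"
    statement => "SELECT * FROM users"
  }
}

output {
  elasticsearch {
    hosts => "elasticsearch:9200"
    user => "logstash_internal"
    password => "${LOGSTASH_INTERNAL_PASSWORD}"
    index => "usersync"
  }
}
```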
Container logs
$ docker-compose logs
2024-04-05 15:31:39 [2024-04-05T06:31:39,954][INFO ][logstash.runner ] JVM bootstrap flags: [-Xms1g, -Xmx1g, -Djava.awt.headless=true, -Dfile.encoding=UTF-8, -Djruby.compile.invokedynamic=true, -XX:+HeapDumpOnOutOfMemoryError, -Djava.security.egd=file:/dev/urandom, -Dlog4j2.isThreadContextMapInheritable=true, -Dlogstash.jackson.stream-read-constraints.max-string-length=200000000, -Dlogstash.jackson.stream-read-constraints.max-number-length=10000, -Dls.cgroup.cpuacct.path.override=/, -Dls.cgroup.cpu.path.override=/, -Xms256m, -Xmx256m, -Djruby.regexp.interruptible=true, -Djdk.io.File.enableADS=true, --add-exports=jdk.compiler/com.sun.tools.javac.api=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.file=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.parser=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.tree=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.util=ALL-UNNAMED, --add-opens=java.base/java.security=ALL-UNNAMED, --add-opens=java.base/java.io=ALL-UNNAMED, --add-opens=java.base/java.nio.channels=ALL-UNNAMED, --add-opens=java.base/sun.nio.ch=ALL-UNNAMED, --add-opens=java.management/sun.management=ALL-UNNAMED, -Dio.netty.allocator.maxOrder=11]
2024-04-05 15:31:39 [2024-04-05T06:31:39,956][INFO ][logstash.runner ] Jackson default value override `logstash.jackson.stream-read-constraints.max-string-length` configured to `200000000`
2024-04-05 15:31:39 [2024-04-05T06:31:39,957][INFO ][logstash.runner ] Jackson default value override `logstash.jackson.stream-read-constraints.max-number-length` configured to `10000`
2024-04-05 15:31:40 [2024-04-05T06:31:40,122][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
2024-04-05 15:31:40 [2024-04-05T06:31:40,599][INFO ][logstash.config.source.local.configpathloader] No config files found in path {:path=>"/usr/share/logstash/config/usersync.conf"}
2024-04-05 15:31:40 [2024-04-05T06:31:40,600][ERROR][logstash.config.sourceloader] No configuration found in the configured sources.
2024-04-05 15:31:40 [2024-04-05T06:31:40,686][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600, :ssl_enabled=>false}
2024-04-05 15:31:40 [2024-04-05T06:31:40,696][INFO ][logstash.runner ] Logstash shut down.
2024-04-05 15:31:40 [2024-04-05T06:31:40,702][FATAL][org.logstash.Logstash ] Logstash stopped processing because of an error: (SystemExit) exit
2024-04-05 15:31:40 org.jruby.exceptions.SystemExit: (SystemExit) exit
2024-04-05 15:31:40 at org.jruby.RubyKernel.exit(org/jruby/RubyKernel.java:808) ~[jruby.jar:?]
2024-04-05 15:31:40 at org.jruby.RubyKernel.exit(org/jruby/RubyKernel.java:767) ~[jruby.jar:?]
2024-04-05 15:31:40 at usr.share.logstash.lib.bootstrap.environment.<main>(/usr/share/logstash/lib/bootstrap/environment.rb:90) ~[?:?]
2024-04-05 15:31:41 Using bundled JDK: /usr/share/logstash/jdk
2024-04-05 15:31:55 Sending Logstash logs to /usr/share/logstash/logs which is now configured via log4j2.properties
2024-04-05 15:31:55 [2024-04-05T06:31:55,764][INFO ][logstash.runner ] Log4j configuration path used is: /usr/share/logstash/config/log4j2.properties
2024-04-05 15:31:55 [2024-04-05T06:31:55,771][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"8.13.0", "jruby.version"=>"jruby 9.4.5.0 (3.1.4) 2023-11-02 1abae2700f OpenJDK 64-Bit Server VM 17.0.10+7 on 17.0.10+7 +indy +jit [x86_64-linux]"}
2024-04-05 15:31:55 [2024-04-05T06:31:55,774][INFO ][logstash.runner ] JVM bootstrap flags: [-Xms1g, -Xmx1g, -Djava.awt.headless=true, -Dfile.encoding=UTF-8, -Djruby.compile.invokedynamic=true, -XX:+HeapDumpOnOutOfMemoryError, -Djava.security.egd=file:/dev/urandom, -Dlog4j2.isThreadContextMapInheritable=true, -Dlogstash.jackson.stream-read-constraints.max-string-length=200000000, -Dlogstash.jackson.stream-read-constraints.max-number-length=10000, -Dls.cgroup.cpuacct.path.override=/, -Dls.cgroup.cpu.path.override=/, -Xms256m, -Xmx256m, -Djruby.regexp.interruptible=true, -Djdk.io.File.enableADS=true, --add-exports=jdk.compiler/com.sun.tools.javac.api=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.file=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.parser=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.tree=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.util=ALL-UNNAMED, --add-opens=java.base/java.security=ALL-UNNAMED, --add-opens=java.base/java.io=ALL-UNNAMED, --add-opens=java.base/java.nio.channels=ALL-UNNAMED, --add-opens=java.base/sun.nio.ch=ALL-UNNAMED, --add-opens=java.management/sun.management=ALL-UNNAMED, -Dio.netty.allocator.maxOrder=11]
2024-04-05 15:31:55 [2024-04-05T06:31:55,777][INFO ][logstash.runner ] Jackson default value override `logstash.jackson.stream-read-constraints.max-string-length` configured to `200000000`
2024-04-05 15:31:55 [2024-04-05T06:31:55,777][INFO ][logstash.runner ] Jackson default value override `logstash.jackson.stream-read-constraints.max-number-length` configured to `10000`
2024-04-05 15:31:55 [2024-04-05T06:31:55,991][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
2024-04-05 15:31:56 [2024-04-05T06:31:56,498][INFO ][logstash.config.source.local.configpathloader] No config files found in path {:path=>"/usr/share/logstash/config/usersync.conf"}
2024-04-05 15:31:56 [2024-04-05T06:31:56,499][ERROR][logstash.config.sourceloader] No configuration found in the configured sources.
2024-04-05 15:31:56 [2024-04-05T06:31:56,594][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600, :ssl_enabled=>false}
2024-04-05 15:31:56 [2024-04-05T06:31:56,608][INFO ][logstash.runner ] Logstash shut down.
2024-04-05 15:31:56 [2024-04-05T06:31:56,616][FATAL][org.logstash.Logstash ] Logstash stopped processing because of an error: (SystemExit) exit
2024-04-05 15:31:56 org.jruby.exceptions.SystemExit: (SystemExit) exit
2024-04-05 15:31:56 at org.jruby.RubyKernel.exit(org/jruby/RubyKernel.java:808) ~[jruby.jar:?]
2024-04-05 15:31:56 at org.jruby.RubyKernel.exit(org/jruby/RubyKernel.java:767) ~[jruby.jar:?]
2024-04-05 15:31:56 at usr.share.logstash.lib.bootstrap.environment.<main>(/usr/share/logstash/lib/bootstrap/environment.rb:90) ~[?:?]
2024-04-05 15:31:57 Using bundled JDK: /usr/share/logstash/jdk
For them to be able to communicate with other containers, it is necessary to share a network. If your postgresql container is running in a different network (e.g. the default Docker bridge network), the components of the ELK stack can't access it.
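As a minimal sketch of what sharing a network could look like (the service name, image tag, and password variable are assumptions; only the `elk` network name comes from the Compose file above), a Postgres service could be added to the same docker-compose.yml and attached to the existing `elk` network:

```yaml
  # Hypothetical Postgres service in the same docker-compose.yml, attached to
  # the stack's existing 'elk' bridge network so that Logstash can reach it
  # at the hostname 'postgres'.
  postgres:
    image: postgres:16
    environment:
      POSTGRES_PASSWORD: ${POSTGRES_PASSWORD:-}
    networks:
      - elk
```

Alternatively, an already-running container from another Compose project can be attached to the stack's network with `docker network connect <network> <container>`, where the network name is typically prefixed with the Compose project name.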
Thank you, @antoineco, for your response. By the way, I am encountering an issue when running Docker Compose for the ELK stack with the new configuration I presented (my structure is based on your ELK source, and I made sure not to change any file positions or anything else). It gives me an error indicating that it can't find my Logstash configuration for Postgres. Any insight on this? (My error: [2024-04-05T06:31:40,600][ERROR][logstash.config.sourceloader] No configuration found in the configured sources.) Additionally, apart from the jdbc_connection_string, are all the other parameters I set up above correct?