Cannot create index in Elasticsearch

I am using the ELK stack.
docker-compose:

```yaml
version: '3'
services:
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:7.17.18
    ports:
      - "9200:9200"
    environment:
      - discovery.type=single-node
      - xpack.security.enabled=false
  logstash:
    image: docker.elastic.co/logstash/logstash:7.17.18
    ports:
      - "9600:9600"
    environment:
      - discovery.type=single-node
      - xpack.security.enabled=false
      - "http.host=0.0.0.0"
      - xpack.monitoring.enabled=true
    volumes:
      - ./logstash/logstash.conf:/config-dir/logstash.conf:ro
    command: logstash -f /config-dir/logstash.conf
    depends_on:
      - elasticsearch
  kibana:
    image: docker.elastic.co/kibana/kibana:7.17.18
    environment:
      - xpack.security.enabled=false
      - ELASTICSEARCH_HOSTS=http://elasticsearch:9200
    ports:
      - "5601:5601"
    depends_on:
      - elasticsearch
volumes:
  elasticsearch_data:
    driver: local
```
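I bring the stack up and do a basic reachability check roughly like this (service names and ports are taken from the compose file above; that `curl` is available inside the Logstash image is my assumption):

```sh
# Start the stack in the background (or docker-compose, depending on the Compose version installed)
docker compose up -d

# Elasticsearch should answer on the host-mapped port
curl http://localhost:9200

# It should also be reachable from inside the Logstash container
# (assuming curl is present in the logstash image)
docker compose exec logstash curl -s http://elasticsearch:9200
```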
logstash.conf:

```
input {
  file {
    path => "/d/FastApiLearning/log.log"
    start_position => "beginning"
  }
}

filter {
}

output {
  elasticsearch {
    # hosts => ["host.docker.internal:9200"]
    hosts => ["elasticsearch:9200"]
    index => "fastapi"
  }
}
```
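While the stack is running I also append test lines to that log file on the host, so the file input has fresh data to pick up (the path is the one from the config above; the file itself is written by my FastAPI app):

```sh
# Append a test line so the file input has new data to tail
echo "test log line $(date)" >> /d/FastApiLearning/log.log
```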
All services come up; at least I didn't find any errors, and there are no critical errors in the Logstash container either. Still, the fastapi index never appears in Elasticsearch. What could be the problem?
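The way I check for the index is simply listing the indices on the host-mapped port (mapping taken from the compose file above); fastapi never shows up:

```sh
# List all indices; a "fastapi" index should appear once Logstash writes events
curl "http://localhost:9200/_cat/indices?v"
```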
Output of http://localhost:9600/_node/stats/events?pretty:
```json
{
  "host" : "1de0ee694b28",
  "version" : "7.17.18",
  "http_address" : "0.0.0.0:9600",
  "id" : "341753b3-2279-4ee5-8e0b-d651d3aadc62",
  "name" : "1de0ee694b28",
  "ephemeral_id" : "4c2b480a-1809-4839-bb2c-43b778ed076a",
  "status" : "green",
  "snapshot" : false,
  "pipeline" : {
    "workers" : 8,
    "batch_size" : 125,
    "batch_delay" : 50
  },
  "monitoring" : {
    "hosts" : [ "http://elasticsearch:9200" ],
    "username" : "logstash_system"
  },
  "events" : {
    "in" : 0,
    "filtered" : 0,
    "out" : 0,
    "duration_in_millis" : 0,
    "queue_push_duration_in_millis" : 0
  }
}
```
I understand that my in and out are zero, but why?
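My suspicion is that the file input never sees the file at all, so I also want to check from inside the container whether the configured path exists and whether the input has recorded any read position (the sincedb location below is the file input's default under the image's data directory, as far as I understand):

```sh
# Is the configured path visible inside the Logstash container at all?
docker compose exec logstash ls -l /d/FastApiLearning/log.log

# Has the file input recorded any read position yet?
# (default sincedb location for the file input in the official image, as far as I know)
docker compose exec logstash sh -c 'cat /usr/share/logstash/data/plugins/inputs/file/.sincedb_*'
```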
Logstash container logs (startup):

```
2024-03-07 17:33:46 [2024-03-07T14:33:46,888][INFO ][logstash.outputs.elasticsearchmonitoring][.monitoring-logstash] Elasticsearch version determined (7.17.18) {:es_version=>7}
2024-03-07 17:33:46 [2024-03-07T14:33:46,888][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch version determined (7.17.18) {:es_version=>7}
2024-03-07 17:33:46 [2024-03-07T14:33:46,889][WARN ][logstash.outputs.elasticsearchmonitoring][.monitoring-logstash] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>7}
2024-03-07 17:33:46 [2024-03-07T14:33:46,889][WARN ][logstash.outputs.elasticsearch][main] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>7}
2024-03-07 17:33:46 [2024-03-07T14:33:46,930][INFO ][logstash.outputs.elasticsearchmonitoring][.monitoring-logstash] Config is not compliant with data streams. `data_stream => auto` resolved to `false`
2024-03-07 17:33:46 [2024-03-07T14:33:46,933][INFO ][logstash.outputs.elasticsearch][main] Config is not compliant with data streams. `data_stream => auto` resolved to `false`
2024-03-07 17:33:46 [2024-03-07T14:33:46,937][WARN ][logstash.javapipeline ][.monitoring-logstash] 'pipeline.ordered' is enabled and is likely less efficient, consider disabling if preserving event order is not necessary
2024-03-07 17:33:46 [2024-03-07T14:33:46,952][INFO ][logstash.outputs.elasticsearch][main] Using a default mapping template {:es_version=>7, :ecs_compatibility=>:disabled}
2024-03-07 17:33:46 [2024-03-07T14:33:46,978][INFO ][logstash.javapipeline ][.monitoring-logstash] Starting pipeline {:pipeline_id=>".monitoring-logstash", "pipeline.workers"=>1, "pipeline.batch.size"=>2, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>2, "pipeline.sources"=>["monitoring pipeline"], :thread=>"#<Thread:0xf0097f7 run>"}
2024-03-07 17:33:46 [2024-03-07T14:33:46,978][INFO ][logstash.javapipeline ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>8, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>1000, "pipeline.sources"=>["/config-dir/logstash.conf"], :thread=>"#<Thread:0x1520aea7@/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:130 run>"}
2024-03-07 17:33:47 [2024-03-07T14:33:47,520][INFO ][logstash.javapipeline ][.monitoring-logstash] Pipeline Java execution initialization time {"seconds"=>0.54}
2024-03-07 17:33:47 [2024-03-07T14:33:47,521][INFO ][logstash.javapipeline ][main] Pipeline Java execution initialization time {"seconds"=>0.54}
2024-03-07 17:33:47 [2024-03-07T14:33:47,547][INFO ][logstash.javapipeline ][.monitoring-logstash] Pipeline started {"pipeline.id"=>".monitoring-logstash"}
2024-03-07 17:33:47 [2024-03-07T14:33:47,573][INFO ][logstash.javapipeline ][main] Pipeline started {"pipeline.id"=>"main"}
2024-03-07 17:33:47 [2024-03-07T14:33:47,634][INFO ][filewatch.observingtail ][main][d76c44e6f77faba1d001bfb24bb25aab3250b628376a6c26d9a60693d3d001d7] START, creating Discoverer, Watch with file and sincedb collections
2024-03-07 17:33:47 [2024-03-07T14:33:47,647][INFO ][logstash.agent ] Pipelines running {:count=>2, :running_pipelines=>[:main, :".monitoring-logstash"], :non_running_pipelines=>[]}
```