2025-01-31 Update From: SLTechnology News&Howtos
This article explains how to install Logstash and use it to import data into Elasticsearch. It should be a useful reference for anyone getting started with the ELK stack.
Dataset download
Download address: https://grouplens.org/datasets/movielens/
Select the smallest dataset (ml-latest-small) to download. Next, we install Logstash.
First, go to the official website and download the Logstash installation package: https://www.elastic.co/downloads/logstash
If the download is too slow, you can use this mirror instead: http://mirror.azk8s.cn/elastic/logstash/
After downloading and unzipping the package, enter the config directory and create the following configuration. You only need to change the path setting to point at your own copy of the dataset:
logstash.conf:

input {
    file {
        path => "/Users/tanjian/Desktop/logstash-7.6.1/movielens/ml-latest-small/movies.csv"  # specify the dataset path here
        start_position => "beginning"
        sincedb_path => "/dev/null"
    }
}
filter {
    csv {
        separator => ","
        columns => ["id", "content", "genre"]
    }
    mutate {
        split => { "genre" => "|" }
        remove_field => ["path", "host", "@timestamp", "message"]
    }
    mutate {
        split => ["content", "("]
        add_field => { "title" => "%{[content][0]}" }
        add_field => { "year" => "%{[content][1]}" }
    }
    mutate {
        convert => { "year" => "integer" }
        strip => ["title"]
        remove_field => ["path", "host", "@timestamp", "message", "content"]
    }
}
output {
    elasticsearch {
        hosts => "http://localhost:9200"
        index => "movies"
        document_id => "%{id}"
    }
    stdout {}
}
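To see what the filter chain does, here is a minimal Python sketch of the same transformations applied to one sample row of movies.csv. This is illustrative only: the real work happens inside Logstash, and the year cleanup (stripping the trailing parenthesis) is an approximation of the split/convert steps.

```python
import csv
import io

# A sample line from MovieLens movies.csv (header: movieId,title,genres).
# The csv filter maps the columns to id, content, genre.
sample = "1,Toy Story (1995),Adventure|Animation|Children|Comedy|Fantasy\n"

row = next(csv.reader(io.StringIO(sample)))
doc = dict(zip(["id", "content", "genre"], row))

# mutate: split the genre string on "|" into a list
doc["genre"] = doc["genre"].split("|")

# mutate: split content on "(" to separate the title from the year
parts = doc["content"].split("(")
doc["title"] = parts[0].strip()           # strip => ["title"]
doc["year"] = int(parts[1].rstrip(")"))   # convert => { "year" => "integer" }

del doc["content"]                        # remove_field => ["content"]
print(doc)
```

The resulting document has an id, a cleaned title, an integer year, and a genre array, which is exactly the shape the Elasticsearch output receives.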
Running Logstash
With the logstash.conf file above in place, we can start Logstash and begin importing data:

sudo ./bin/logstash -f config/logstash.conf
The Logstash log will show the dataset being imported.

Open Kibana to view the data
To view the data, open http://localhost:5601 and create an Index Pattern in Kibana.
After that, we can browse our data through Discover.
The rest is up to you: go to Dev Tools and practice searching the data with Query DSL syntax until you are familiar with it.
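For example, a simple Dev Tools request that matches movies tagged with the Comedy genre might look like this (the index and field names follow the configuration above):

```
GET movies/_search
{
  "query": {
    "match": { "genre": "Comedy" }
  }
}
```

Matching documents are returned in the hits array of the response, along with a relevance score for each.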
Thank you for reading this article carefully. I hope "How to install Logstash and import data in ELK" has been helpful to you.