2025-01-17 Update From: SLTechnology News&Howtos
Shulou (Shulou.com) 06/01 Report
This article shows you how to achieve accurate retrieval and matching of nearly 2 million images on Colab. The walkthrough is concise and easy to follow, and I hope you get something out of the detailed introduction.
OpenAI also released two neural networks that connect text and images: DALL·E and CLIP. DALL·E generates images directly from text, while CLIP matches images against text descriptions. After the release of these two models, the machine learning community made several attempts to replicate DALL·E, such as developer Phil Wang's GitHub project DALLE-pytorch, which gathered 1.9k stars in just 20 days.
Recently, another developer created a project that uses the CLIP model to find images that accurately match a text query. All of the images in the project come from the Unsplash dataset (about 2 million of them) and are processed with the CLIP model. The project can be run either in a free Google Colab notebook or on the user's own machine.
Project address: https://github.com/haltakov/natural-language-image-search#two-dogs-playing-in-the-snow
In the implementation, the project author precomputes feature vectors for all of the images in a Colab notebook, then finds the images that best match a natural-language search query (that is, the input text). The project author provides some pictures showing the results; for example, entering the search term "Two dogs playing in the snow" returns the following pictures:
Taken by: Richard Burlton, Karl Anderson and Xuecheng Chen.
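The ranking step described above (comparing a query's feature vector against the precomputed image feature vectors) can be sketched in a few lines of NumPy. This is a minimal illustration with toy vectors, not the project's actual code; in the real notebooks the vectors come from CLIP's image and text encoders:

```python
import numpy as np

def find_best_matches(text_features, photo_features, photo_ids, count=3):
    # Normalize everything so a dot product equals cosine similarity
    text_features = text_features / np.linalg.norm(text_features)
    photo_features = photo_features / np.linalg.norm(photo_features, axis=1, keepdims=True)
    similarities = photo_features @ text_features   # one score per photo
    best = np.argsort(similarities)[::-1][:count]   # indices of the highest scores
    return [photo_ids[i] for i in best]

# Toy example: 4 fake 3-dimensional feature vectors standing in for CLIP embeddings
photo_features = np.array([[1.0, 0.0, 0.0],
                           [0.0, 1.0, 0.0],
                           [0.9, 0.1, 0.0],
                           [0.0, 0.0, 1.0]])
photo_ids = ["dog1", "cat", "dog2", "clock"]
query = np.array([1.0, 0.05, 0.0])  # stands in for an encoded text query
print(find_best_matches(query, photo_features, photo_ids, count=2))  # prints ['dog1', 'dog2']
```

The real project does the same thing at scale: the expensive CLIP encoding of 2 million photos happens once, so each new query costs only one text encoding plus a matrix-vector product.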
Entering the search term "The word love written on the wall" returns the following images containing the word "love":
Taken by: Genton Damian, Anna Rozwadowska and Jude Beck.
In addition, you can choose the number of returned images: in the line search_unsplash(search_query, photo_features, photo_ids, 3), change the 3 to the desired number of search results.
How to achieve it?
Run on Google Colab notebook
The steps for the first search in a given Colab session are as follows:
1. First, open the Colab interface.
2. Log in to your Google account by clicking the sign-in button in the upper right corner. Note: being signed in to a Google account affects your privacy; for example, your Google search history is recorded in your account.
3. Click a location (other than the run triangle) in the cell whose line reads: search_query = "Two dogs playing in the snow"
4. Click the menu "Runtime → Run before" and wait for the execution to finish.
5. Find the line that reads (or initially read) search_query = "Two dogs playing in the snow" and change "Two dogs playing in the snow" to the query you want. For example: search_query = "A clock with gold-colored numbers on a black background"
6. (Optional) Find the line that reads (or initially read) search_unsplash(search_query, photo_features, photo_ids, 3) and change the 3 to the desired number of search results.
7. Click the triangle to the left of the line that originally read search_query = "Two dogs playing in the snow" and wait for the search results.
Colab interface address: https://colab.research.google.com/github/haltakov/natural-language-image-search/blob/main/colab/unsplash-image-search.ipynb#scrollTo=xbym_cYJJH6v
To perform more searches in the same Colab session, repeat steps 5-7 above. After finishing a Colab session, you may want to log out of your Google account, since staying signed in has the privacy implications noted above.
Run on your own machine
To run the project on your local machine, first install the necessary dependencies with the following command:
pip install -r requirements.txt
If you want to run all the code yourself, open the Jupyter notebooks and follow them in numbered order:
01-setup-clip.ipynb: set up the environment, and review and prepare the CLIP code
02-download-unsplash-dataset.ipynb: download the pictures from the Unsplash dataset
03-process-unsplash-dataset.ipynb: use CLIP to process all pictures in the dataset
04-search-image-dataset.ipynb: retrieve pictures from the dataset using natural-language queries
05-search-image-api.ipynb: use the Unsplash Search API to retrieve images and CLIP to filter the results
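Taken together, notebooks 02-04 form one pipeline: download the photos, encode each one into a feature vector once, save those vectors, and then rank them against each encoded query. The sketch below shows only that data flow; fake_encode is a stand-in invented here for CLIP's encode_image/encode_text, so the vectors are toy word-presence features rather than real CLIP embeddings:

```python
import numpy as np

# Toy stand-in for CLIP's encoders: a word-presence vector over a tiny
# vocabulary, normalized to unit length. Invented for illustration only.
VOCAB = ["dog", "snow", "cat", "clock", "beach"]

def fake_encode(text):
    vec = np.array([float(word in text.lower()) for word in VOCAB])
    norm = np.linalg.norm(vec)
    return vec / norm if norm else vec

# Like notebook 03: encode every photo once and save the features to disk.
photos = {
    "photo-a": "two dogs playing in the snow",
    "photo-b": "a cat sleeping on a beach towel",
    "photo-c": "a clock on a black background",
}
photo_ids = list(photos)
photo_features = np.stack([fake_encode(caption) for caption in photos.values()])
np.save("photo_features.npy", photo_features)  # reused across many searches

# Like notebook 04: load the precomputed features and rank them for a query.
features = np.load("photo_features.npy")
query_features = fake_encode("a dog running through snow")
scores = features @ query_features            # cosine similarity of unit vectors
best_id = photo_ids[int(np.argmax(scores))]
print(best_id)                                # prints photo-a
```

The design point is the same as in the real project: the per-photo encoding cost is paid once up front, and every later search only pays for encoding the query.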
Note: only a stripped-down version of the Unsplash dataset is publicly available. If you want to use the full version, you need to apply for (free) access. Using the Unsplash Search API to search for images does not require access to the Unsplash dataset, but may produce poorer results.

The above describes how to achieve accurate retrieval and matching of nearly 2 million images on Colab. I hope you have learned something useful from it.