This article explains in detail how to use TensorFlow Hub for neural style transfer. It is shared here as a reference; I hope you will have a solid understanding of the relevant concepts after reading it.
What is neural style transfer?
Neural style transfer (NST) is a technique that uses deep convolutional neural networks and supporting algorithms to extract content information from one image and style information from another, reference image. Once the style and content have been extracted, a composite image is generated in which the content and the style come from different source images. In other words, NST is an image-stylization method: it takes an input content image and produces an output image whose stylistic changes are derived from a separate reference image.
Leon A. Gatys et al. introduced the NST technique in their paper "A Neural Algorithm of Artistic Style" (https://arxiv.org/pdf/1508.06576.pdf).
How does it work?
The key feature of deep neural networks (DNNs), and more specifically of convolutional neural networks (CNNs), is that they can learn spatial representations of the content and the style present in an image. It is this feature that makes the NST technique possible.
The spatial representations a CNN produces from an input image capture the image's content and style statistics. NST combines the extracted style and content into a single output image.
The activations of intermediate layers within the CNN are what capture these content and style statistics of the input image.
Each CNN layer outputs a feature map produced by a convolution operation, in which a filter slides across the layer's input. The content of the image is effectively encoded in the feature maps generated at each layer.
Extracting content from the feature maps of intermediate layers yields the high-level structure and geometric information of the input image. The feature maps also carry the style of the input image: to derive it, the means and correlations of the feature maps across intermediate layers are evaluated. This process yields the texture-pattern information of the input image.
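To make this concrete, here is a minimal, illustrative sketch of extracting content features and Gram-matrix style statistics from intermediate layers of a pre-trained VGG19 network (the architecture used by Gatys et al.). This is not how the TensorFlow Hub module used later in this article works internally; the layer names and input size chosen here are assumptions made purely for illustration.
import tensorflow as tf
# Illustrative layer choices (assumptions, not part of this article's pipeline)
content_layer = 'block5_conv2'
style_layer = 'block1_conv1'
# Pre-trained VGG19 used purely as a fixed feature extractor
vgg = tf.keras.applications.VGG19(include_top=False, weights='imagenet')
vgg.trainable = False
feature_model = tf.keras.Model(
    inputs=vgg.input,
    outputs=[vgg.get_layer(content_layer).output,
             vgg.get_layer(style_layer).output])
def gram_matrix(feature_map):
    # Correlations between filter responses summarize the style of an image
    result = tf.einsum('bijc,bijd->bcd', feature_map, feature_map)
    num_locations = tf.cast(tf.shape(feature_map)[1] * tf.shape(feature_map)[2], tf.float32)
    return result / num_locations
# image: float32 tensor of shape [1, height, width, 3] with values in [0, 1]
image = tf.random.uniform([1, 224, 224, 3])
preprocessed = tf.keras.applications.vgg19.preprocess_input(image * 255.0)
content_features, style_features = feature_model(preprocessed)
style_statistics = gram_matrix(style_features)  # content_features holds the high-level structure
In the full optimization-based approach, a generated image is iteratively updated so that its content features match those of the content image and its Gram matrices match those of the style image. The TensorFlow Hub module used below instead performs the transfer in a single fast feed-forward pass.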
Programming
Here comes the good stuff. We will create a new image from the content and style of the two images below.
Left: content image; right: style image.
To implement neural style transfer with these two reference images, we will use a module from TensorFlow Hub (https://www.tensorflow.org/hub).
TensorFlow Hub provides a set of reusable machine learning components, such as datasets, weights, models, etc.
For the implementation part of this article, we will use a set of tools and libraries to load images and perform data conversion.
TensorFlow: an open source platform for implementing, training, and deploying machine learning models.
Matplotlib: a tool for creating visualizations in Python, such as charts and plots.
NumPy: enables mathematical computations and operations on array data structures.
TensorFlow Hub: a library of reusable machine learning components, such as models, datasets, etc.
Their addresses are as follows:
TensorFlow: https://www.tensorflow.org/
Matplotlib: https://matplotlib.org/
Numpy: https://numpy.org/
TensorFlow Hub: https://www.tensorflow.org/hub
We will use Jupyter Notebook (https://jupyter.org/) for the code implementation. A link to the notebook's GitHub repository is included at the end of this article.
First, we will import the required tools and libraries.
import tensorflow as tf
import matplotlib.pyplot as plt
import numpy as np
import PIL.Image
import tensorflow_hub as hub
Next, we declare two variables that hold the file paths of the images providing the content and the style of the output. We also display both images.
content_path = 'images/human.jpg'
style_path = 'images/painting.jpg'

content_image = plt.imread(content_path)
style_image = plt.imread(style_path)

plt.subplot(1, 2, 1)
plt.title('Content Image')
plt.axis('off')
plt.imshow(content_image)

plt.subplot(1, 2, 2)
plt.title('Style Image')
plt.axis('off')
plt.imshow(style_image)
The images need to be converted into a tensor representation. For this next step, we will use TensorFlow's image-processing methods.
We will create a function that takes an image path as its argument and uses tf.io.read_file to read the image file. We then use tf.image.decode_image to decode it into a tensor whose values are floating-point numbers between 0 and 1.
def image_to_tensor(path_to_img):
    img = tf.io.read_file(path_to_img)
    img = tf.image.decode_image(img, channels=3, dtype=tf.float32)
    # Resize the image to specific dimensions
    img = tf.image.resize(img, [720, 512])
    img = img[tf.newaxis, :]
    return img
To visualize the results from the TensorFlow Hub module, we need to do the opposite: convert the returned tensor back into an image that can be displayed.
Since the tensor holds values between 0 and 1, we simply multiply each element by 255 to undo the normalization and recover actual pixel values. The next step is to use NumPy to create an array with the data type we need (unsigned 8-bit integers).
The function returns an image object created from the tensor.
def tensor_to_image(tensor):
    tensor = tensor * 255
    tensor = np.array(tensor, dtype=np.uint8)
    tensor = tensor[0]
    plt.figure(figsize=(20, 10))
    plt.axis('off')
    return plt.imshow(tensor)
So far, we have completed the following work:
Viewed the content and reference style images
Created functions to convert an image to a tensor and a tensor back to an image
Now we convert the images to tensors and pass them to the module, which we load with the .load() method from the TensorFlow Hub package.
We expect a combination of the style and content of the reference images, so we create a variable to hold the result returned by the module.
To visualize the results, we only use the tensor_to_image function we created earlier.
content_image_tensor = image_to_tensor(content_path)
style_image_tensor = image_to_tensor(style_path)

hub_module = hub.load('https://tfhub.dev/google/magenta/arbitrary-image-stylization-v1-256/2')
combined_result = hub_module(tf.constant(content_image_tensor), tf.constant(style_image_tensor))[0]
tensor_to_image(combined_result)
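As an optional extra (not part of the original walkthrough), the PIL.Image import from earlier can be used to write the stylized result to disk; the file name here is an arbitrary choice.
# Optional: save the stylized result; its values lie in [0, 1], so scale to 0-255 first
output_array = np.array(combined_result * 255, dtype=np.uint8)
PIL.Image.fromarray(output_array[0]).save('stylized_output.png')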
That concludes this walkthrough of how to use TensorFlow Hub for neural style transfer. I hope the content above has been helpful; if you found the article useful, feel free to share it so more people can see it.