
How to create a containerized machine learning model in a Podman container


This article shows how to create a containerized machine learning model in a Podman container. The content is easy to understand and well organized; I hope it helps resolve your doubts as we study the topic together.

Preparation

First, install Podman using the following command:

sudo dnf -y install podman

Next, create a new folder for the container and change to that directory.

mkdir deployment_container && cd deployment_container

REST API of the TensorFlow model

The next step is to create a REST API for the machine learning model. The GitHub repository contains a pre-trained model and the settings needed to make the REST API work.

Clone it into the deployment_container directory using the following command:

git clone https://github.com/svenboesiger/titanic_tf_ml_model.git

prediction.py and ml_model/

prediction.py handles the TensorFlow predictions, and the weights of the 20x20x20 neural network are located in the folder ml_model/.
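The repository ships its own prediction.py, so the following is only a minimal sketch of what such a module could look like, assuming the pre-trained network can be loaded as a Keras model from ml_model/ and that the JSON field names are placeholders. Connexion routes POST requests to the post() function via the operationId prediction.post declared in swagger.yaml.

# Minimal sketch of a prediction module for Connexion (not the repository's actual code).
# Assumption: the pre-trained 20x20x20 network can be loaded as a Keras model from ml_model/.
import pandas as pd
import tensorflow as tf

model = tf.keras.models.load_model("ml_model/")

def post(passenger):
    # Connexion passes the parsed JSON request body as a dict.
    # The feature names are illustrative; the real model defines its own input columns.
    features = pd.DataFrame([passenger])
    probability = float(model.predict(features.to_numpy())[0][0])
    return {"survival_probability": probability}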

swagger.yaml

swagger.yaml defines the API of the Connexion library using the Swagger specification. This file contains all the information needed for the server to provide input parameter validation, output response data validation, and URL endpoint definitions.

In addition, Connexion provides you with a simple but useful single-page web application that demonstrates how to call the API with JavaScript and update the DOM.

Swagger: "2.0" info: description: This is the swagger file that goes with our server code version: "1.0.0" title: Tensorflow Podman Articleconsumes:-"application/json" produces:-"application/json" basePath: "/" paths: / survival_probability: post: operationId: "prediction.post" tags:-"Prediction" summary: "The prediction data structure provided by the server application" Description: "Retrieve the chance of surviving the titanic disaster" parameters:-in: body name: passenger required: true schema: $ref:'# / definitions/PredictionPost' responses: '2018: description:' Survival probability of an individual Titanic passenger' definitions: PredictionPost: type: objectserver.py and requirements.txt

server.py defines the entry point that starts the Connexion server.

import connexion

app = connexion.App(__name__, specification_dir='./')
app.add_api('swagger.yaml')

if __name__ == '__main__':
    app.run(debug=True)

requirements.txt defines the Python packages needed to run the program.

connexion
tensorflow
pandas

Containerization!

In order for Podman to build the image, create a new file called Dockerfile in the deployment_container directory created in the preparation step above:

FROM fedora:28

# File Author / Maintainer
MAINTAINER Sven Boesiger

# Update the sources
RUN dnf -y update --refresh

# Install additional dependencies
RUN dnf -y install libstdc++

RUN dnf -y autoremove

# Copy the application folder inside the container
ADD /titanic_tf_ml_model/ titanic_tf_ml_model

# Get pip to download and install requirements:
RUN pip3 install -r /titanic_tf_ml_model/requirements.txt

# Expose ports
EXPOSE 5000

# Set the default directory where CMD will execute
WORKDIR /titanic_tf_ml_model

# Set the default command to execute
# when creating a new container
CMD python3 server.py

Next, use the following command to build the container image:

podman build -t ml_deployment .

Run the container

Now that the container image is built and ready, you can run it locally using the following command:

podman run -p 5000:5000 ml_deployment

Enter http://0.0.0.0:5000/ui in your web browser to access the Swagger/Connexion UI and test the model:

Of course, you can now also access the model through the REST API from within your own applications.
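As an illustration, the snippet below posts a passenger record to the /survival_probability endpoint defined in swagger.yaml. It is only a sketch: the field names in the JSON body are hypothetical, since the exact input schema is determined by the model in the repository.

import json
import urllib.request

# Hypothetical passenger record; the real field names are defined by the model's input schema.
passenger = {"pclass": 3, "sex": 1, "age": 25, "sibsp": 0, "parch": 0, "fare": 7.25}

request = urllib.request.Request(
    "http://0.0.0.0:5000/survival_probability",
    data=json.dumps(passenger).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)

# Print the JSON response returned by the Connexion server.
with urllib.request.urlopen(request) as response:
    print(json.loads(response.read().decode("utf-8")))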

These are all the contents of the article "How to create a containerized machine learning model in a Podman container". Thank you for reading! I hope the content shared here helps you; if you want to learn more, welcome to follow the industry information channel!
