
How to create a Docker image with a Dockerfile


This article explains how to create a Docker image with a Dockerfile. It is a practical, step-by-step walkthrough, so you should be able to apply it directly after reading. Let's take a look.

Create a Docker image

There are two ways: docker commit and a Dockerfile.

The commit command creates a local image

The commit command packages a container we have already created locally into an image. It is quick and convenient, but it has a drawback: the resulting image contains many intermediate files generated while the container was in use, such as caches and junk files, and every container started from that image will carry those files along. For this reason, this method is generally not recommended.
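For comparison, a minimal sketch of the commit workflow (the container and image names here are illustrative):

docker run -it --name my_container python:3.6 /bin/bash
# ... make changes inside the container, then exit ...
docker commit my_container my_image:v1

docker commit snapshots the container's filesystem as-is, including whatever caches and temporary files have accumulated, which is exactly why the Dockerfile approach below is preferred.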

Dockerfile

Write all the operations to be performed on the image into a file named Dockerfile, then use the docker build command to create an image from that file. This approach makes image creation transparent and self-contained, and the build can be repeated at any time. A Dockerfile is processed line by line: each line begins with a Dockerfile instruction in uppercase, followed by that instruction's arguments.

First create an empty directory with mkdir /usr1/xmla_python, then create a Dockerfile in that directory, as sketched below.
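A minimal sketch of the setup commands (the three file names match the files created in the rest of this article):

mkdir -p /usr1/xmla_python
cd /usr1/xmla_python
touch Dockerfile requirements.txt scrapyd.conf

The Dockerfile: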

# base image
FROM python:3.6

# author information
MAINTAINER xingxingzaixian "942274053@qq.com"

# set the working directory
WORKDIR /app

# copy the local requirements.txt file into the image's /app directory
ADD requirements.txt /app

# copy the scrapyd configuration file to the specified path in the image
ADD scrapyd.conf /.scrapyd.conf

# create the scrapyd data directory and install the Python dependency libraries
RUN mkdir /scrapyd && \
    pip install -i http://pypi.douban.com/simple/ --trusted-host=pypi.douban.com -r requirements.txt

# open ports, mainly the scrapyd port
EXPOSE 6800 80

# start scrapyd when the container starts
CMD ["scrapyd"]

Next create requirements.txt, which pins the Python dependencies used to initialize the environment:

scrapy==1.5.0
scrapyd==1.2.0
redis==2.10.6
scrapy-redis==0.6.8
lxml==4.2.1
requests==2.18.4

Then create scrapyd.conf, which configures the scrapyd environment; it is copied into the image by the ADD instruction in our Dockerfile:

[scrapyd]
# where the project egg files are stored
eggs_dir = /scrapyd/eggs

# the directory where scrapy logs are stored; to disable log storage, set this option to empty: logs_dir=
logs_dir = /scrapyd/logs

# the directory where scraped items will be stored; disabled by default. If a value is set, it overrides scrapy's FEED_URI setting.
items_dir =

# number of finished jobs to keep per spider; the default is 5
jobs_to_keep = 5

# the directory where the project databases are stored
dbs_dir = /scrapyd/dbs

# maximum number of concurrent scrapy processes; the default is 0, meaning the number of CPUs available in the system multiplied by the value of max_proc_per_cpu
max_proc = 0

# number of processes started per CPU; the default is 4
max_proc_per_cpu = 4

# number of finished processes kept in the launcher; the default is 100
finished_to_keep = 100

# interval, in seconds, used to poll the queue; the default is 5.0
poll_interval = 5.0

# address the web service listens on
bind_address = 0.0.0.0

# default HTTP listening port
http_port = 6800

# whether to enable debug mode
debug = off

# module used to launch subprocesses; you can use your own module to customize the Scrapy processes started from Scrapyd
runner = scrapyd.runner
application = scrapyd.app.application
launcher = scrapyd.launcher.Launcher
webroot = scrapyd.website.Root

[services]
schedule.json = scrapyd.webservice.Schedule
cancel.json = scrapyd.webservice.Cancel
addversion.json = scrapyd.webservice.AddVersion
listprojects.json = scrapyd.webservice.ListProjects
listversions.json = scrapyd.webservice.ListVersions
listspiders.json = scrapyd.webservice.ListSpiders
delproject.json = scrapyd.webservice.DeleteProject
delversion.json = scrapyd.webservice.DeleteVersion
listjobs.json = scrapyd.webservice.ListJobs
daemonstatus.json = scrapyd.webservice.DaemonStatus

The important settings are the three directory options eggs_dir, logs_dir, and dbs_dir; everything else can be left at its default. All three directories live under /scrapyd, which is why mkdir /scrapyd was added to the RUN instruction in the Dockerfile.

After the above is created, we can use the docker build command to build the image.

Execute docker build -t scrapy_python . to build the image; the -t option sets the image name, and . is the path to the directory containing the Dockerfile (the build context).
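Putting it together, a hedged end-to-end check (the container name and port mapping are illustrative; daemonstatus.json is one of the services declared in scrapyd.conf above):

docker build -t scrapy_python .
docker run -d --name scrapyd_test -p 6800:6800 scrapy_python
curl http://localhost:6800/daemonstatus.json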

The above is how to create a Docker image with a Dockerfile. These are techniques you may well see or use in everyday work; hopefully this article has taught you something new.
