How to create continuous integration pipeline in Docker+Rancher

2025-02-24 Update From: SLTechnology News&Howtos shulou NAV: SLTechnology News&Howtos > Servers >


Shulou(Shulou.com)05/31 Report--

This article shows how to create a continuous integration pipeline with Docker and Rancher. It is easy to follow and well organized, and I hope it helps clear up any doubts you may have. Let me lead you through studying "how to create a continuous integration pipeline in Docker+Rancher".

In the previous article on containerizing the build environment, we took the first step toward a continuous integration pipeline: creating the build system. We analyzed three common challenges in builds — managing code dependencies, managing environment dependencies, and the long build times of complex projects — and how traditional tools and methods address them. We then showed how Docker makes it easier to solve those challenges: how to containerize the build environment, how to package the application with Docker, and how to use Docker Compose to create a build environment that is repeatable, centrally managed, well isolated, and parallelizable.

Now that we have created the build system, in this article we will create a continuous integration pipeline for the sample application. This ensures that best practices are followed and that conflicting changes do not interact and cause problems. Before we set up continuous integration for the code, though, let's take a moment to discuss how to branch the code.

Branching model

As we automate the continuous integration pipeline, a key point to consider is the development model the team follows. This model is usually determined by how the team uses its version control system. Because our application is hosted in a git repository, we use the git-flow model to branch, version, and release our application; it is one of the most widely used models for git repositories. In short, the idea is to maintain two long-lived branches: a develop branch and a master branch. Whenever we want to develop a new feature, we create a new branch from the develop branch and merge it back when the feature is complete. Feature branches are managed individually by developers. Once code is committed to the develop branch, the CI server is responsible for ensuring that the branch always compiles, passes automated tests, and is available on a server for QA testing and review. When we are ready for a release, we create a release branch from the develop branch and merge it into the master branch. The specific commit that is released is also tagged with a version number. The tagged release can then be pushed to the staging/beta or production environment.

Next we will use the git-flow tool to help manage our git branches. To install git-flow, refer to the instructions here: https://github.com/nvie/gitflow/wiki/Installation. With git-flow installed, configure your repository with the git flow init command. Git-flow will ask a few questions, and we recommend accepting the default settings. After the command runs, git-flow creates a develop branch (if it did not already exist) and checks it out as the working branch.

Now we use git-flow to create a new feature by entering the git flow feature start [feature-name] command. Common practice is to use the ticket/issue id as the feature name. For example, if you are using Jira to track tickets, the ticket id (for example, MSP-123) can be used as the feature name. You will also notice that when git-flow creates a new feature, it automatically switches to the feature branch.

At this point you can do all the work the feature requires, running the automated test suite to make sure everything passes. You can commit as many times as the feature needs. Once you are done, simply tell git-flow to finish the feature. In our example, we just update the README file and finish by typing "git flow feature finish MSP-123".

It is important to note that git-flow merges the feature branch into the develop branch, deletes the feature branch, and switches back to the develop branch. At this point you can push the develop branch to the remote repository (git push origin develop:develop). When you push the develop branch, the CI server takes over the continuous integration pipeline. A more appropriate pattern for larger teams is to push feature branches to the remote before finishing them, have them reviewed, and then merge them into the develop branch via pull requests.
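The whole feature cycle above can be sketched with plain git commands — roughly what git-flow runs for you. A minimal sketch in a throwaway repository; branch names follow git-flow defaults, and the ticket id MSP-123 and the README change are the example from the text:

```shell
# Sketch of what git-flow does under the hood, using plain git in a
# throwaway repository.
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q
git config user.email "ci@example.com"
git config user.name "ci"
git commit -q --allow-empty -m "initial commit"

# git flow init: create and check out the develop branch
git checkout -q -b develop

# git flow feature start MSP-123: branch off develop
git checkout -q -b feature/MSP-123 develop
echo "updated" >> README.md
git add README.md
git commit -q -m "MSP-123: update README"

# git flow feature finish MSP-123: merge back, delete the feature
# branch, and return to develop
git checkout -q develop
git merge -q --no-ff -m "Merge feature/MSP-123 into develop" feature/MSP-123
git branch -q -d feature/MSP-123

# develop now contains the feature; push it with:
#   git push origin develop:develop
git log --oneline -1
```

After the merge, only develop remains and it carries the feature's commits, which is exactly the state the CI server picks up.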

Create a CI pipeline using Jenkins

In this section, we assume that you already have a Jenkins cluster up and running. If not, you can use the official Jenkins image here: https://hub.docker.com/_/jenkins/, and you can read more about building a scalable Jenkins cluster here: https://rancher.com/deploying-a-scalable-jenkins-cluster-with-docker-and-rancher/. Once you have a running Jenkins cluster, install the following plug-ins and dependencies on the Jenkins server:

Jenkins 2.32.2+

Git Parameter Plugin 0.8.0+

Parameterized Trigger Plugin 2.33+

Copy Artifact Plugin 1.38.1+

Build Pipeline Plugin 1.5.6+

Mask Passwords Plugin 2.9+

Docker 1.13.1+

Docker Compose 1.11.1+

After installing the required plug-ins, we can create the first three tasks in the build pipeline: compilation, packaging, and integration testing. These will serve as the starting point of our continuous integration and deployment system.

Building the application

The first task in the sequence checks out the latest code from source control and ensures that it compiles after each commit. It also runs the unit tests. To set up the first task in our sample project, select New Item > Freestyle Project. In the project configuration view, open the General tab and check the "This project is parameterized" option. Add a git parameter called GO_AUTH_VERSION and set the parameter type to Branch or Tag. Next, expand the advanced configuration and use the Tag Filter setting to match all tags beginning with "v" (such as v2.0). Set the Default Value to develop (the develop branch). This fetches the list of version tags from git and populates the task's options menu. If the task is triggered automatically without an explicit value, GO_AUTH_VERSION defaults to the develop branch.

Next, add the repository URL in the Source Code Management section, specifying the branch as ${GO_AUTH_VERSION}, so that when you build manually, the git parameter selects the branch or tag to build. Also set the polling interval; this way, Jenkins tracks all future changes to our develop branch and automatically triggers the first task in our CI (and CD) pipeline. Note that the default value of GO_AUTH_VERSION (the develop branch) is used for automatically detected changes.

Now select Add Build Step > Execute Shell in the Build section and paste in the docker run command from earlier in this chapter. This fetches the latest code from GitHub and builds it into the go-auth executable. Docker must be installed and running on the build node. If you are using a Linux server, you may also need to prefix the docker client commands with sudo to access the docker daemon.
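As an illustration, the Execute Shell step might look like the following sketch. The Go image tag, repository paths, and build flags are assumptions, and it requires a Docker daemon on the node; substitute the actual docker run command from the earlier build-environment article:

```shell
# Hypothetical Execute Shell build step: compile and unit-test go-auth
# inside a Go build container. Image tag and paths are assumptions.
docker run --rm \
  -v "$WORKSPACE/go-auth:/go/src/github.com/usmanismail/go-messenger/go-auth" \
  -w /go/src/github.com/usmanismail/go-messenger/go-auth \
  golang:1.7 \
  sh -c "go fmt ./... && go test ./... && go build -o go-auth"
```

Running the compiler and tests inside the container keeps the Jenkins node itself free of Go toolchain dependencies, which is the point of the containerized build system from the previous article.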

After the build step, we need to add two more steps. The first, Archive the Artifacts, archives the go-auth binary and helper scripts for the project; specify the artifacts to archive in the task configuration.

The second, Trigger parameterized build, starts the next task in the pipeline. When adding the trigger, make sure Current build parameters is added via Add Parameters. This passes all parameters of the current task (such as GO_AUTH_VERSION) on to the next task. Note the names used for the downstream tasks in the Trigger parameterized build section; we will use them in the following steps.

In the build task's log output you can see that the docker container performed the build. At build time, go fmt fixes any formatting inconsistencies in the code, and our unit tests are executed. If the tests fail or the compilation fails, Jenkins detects the failure. You should also set up notifications through email or chat integrations (such as HipChat or Slack) so that your team is notified of failed builds and can fix them quickly.

Packaging the application

The code is compiled; next we package it into a Docker container. To create the packaging task, select New Item > Freestyle Project and give your second task the name you specified in the previous task's trigger. As before, this task is a parameterized build with a GO_AUTH_VERSION parameter. Note that here, and in all subsequent tasks, GO_AUTH_VERSION is just a string parameter with a default value of develop; we expect its value to come from upstream.

As before, add an Execute Shell build step. Note that you do not need to specify SCM settings here, because we extract the required binaries and scripts from the artifacts archived by the previous build. Also note that we first build an untagged usman/go-auth image and then retag it, so that the integration tests that follow can run against the untagged image.

To build the Docker container, we also need the executable built in the previous step. For this, we add a Copy Artifact build step that copies the artifacts from the upstream build. This ensures the executable is available to the Docker build command for packaging into the container. Note that we select the Flatten directories option to ensure all artifacts are copied into the root of the current workspace.

We have been using the GO_AUTH_VERSION variable to tag the image we build. By default, changes to the develop branch always build usman/go-auth:develop and overwrite the existing image. In the next chapter, we will release a new version of the application and revisit the pipeline.
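The build-then-retag scheme described above might be sketched as the following shell commands. The Dockerfile location and image name follow the go-auth example, and a Docker daemon is required, so treat this as illustrative rather than the task's exact configuration:

```shell
# Hypothetical packaging step: build an untagged image from the copied
# artifacts, then retag it with the pipeline's version parameter
# (defaulting to develop, matching the job's default value).
docker build -t usman/go-auth .
docker tag usman/go-auth "usman/go-auth:${GO_AUTH_VERSION:-develop}"
```

Building the untagged image first lets the integration-test task consume it immediately, while the retag records which branch or version tag produced it.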

As before, a Trigger parameterized build post-build action (including Current build parameters) triggers the next task in the pipeline, which performs integration tests using the Docker container we just built and the Docker Compose setup detailed in the previous section.

Performing integration tests

The next step is to perform the integration tests, so first create a new task. Like the packaging task, it is a parameterized build with the GO_AUTH_VERSION string parameter, and it copies the artifacts from the build task. This time we use the Docker Compose template described earlier to build a multi-container test environment and run integration tests against our code. Integration tests (unlike unit tests) usually run fully isolated from the code under test, so we use a shell script that issues http queries against the test environment. In the Execute Shell step, change directory to go-auth and execute integration-test.sh.

The contents of the script can be found here: https://github.com/usmanismail/go-messenger/blob/master/go-auth/integration-test.sh. We use Docker Compose to build our environment, then use curl to send http requests to the running containers. In the task's log, Compose starts a database container and links it to the goauth container. Once the database connection is established, you should see a series of "Pass: ..." lines indicating that each test ran and was validated. After the tests complete, the compose template cleans up the database and go-auth containers on its own.
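The actual Compose template lives in the go-messenger repository; a minimal sketch of such a test environment might look like the following. The service names, database image, credentials, and port are assumptions for illustration:

```yaml
# Hypothetical docker-compose.yml for the integration-test environment
# (Compose v1 syntax, matching the era of the article).
database:
  image: mysql
  environment:
    MYSQL_ROOT_PASSWORD: rootpass
goauth:
  image: usman/go-auth
  links:
    - database:db
  ports:
    - "9000:9000"
```

Because the goauth service references the untagged usman/go-auth image, the test environment always exercises the image the packaging task just produced.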

With the three tasks set up, you can select the "+" tab in the Jenkins view and choose Build Pipeline View to create a new pipeline view. In the configuration screen that appears, select the compile/build task as the initial task and click OK. You should now see the CI pipeline taking shape. This gives you a visual guide to how each commit works its way through your build and deployment pipeline.

When you change the develop branch, you will notice that Jenkins triggers the pipeline automatically. To trigger the pipeline manually, select the first (build) task and run it. You will be asked to select a value for the git parameter (GO_AUTH_VERSION); if none is specified, the default is used and the CI pipeline runs against the latest content of the develop branch. You can also click "Run" directly in the pipeline view.

Let's take a quick look at what we have done so far. We created the CI pipeline for our application by following these steps:

Use git-flow to add new features and merge them into the develop branch.

Track changes on the develop branch and build our application in a containerized environment.

Package our application into a Docker container.

Use Docker Compose to build a short-lived test environment.

Run integration tests and clean up the environment.

With the CI pipeline above, each time a new feature (or fix) is merged into the develop branch, the pipeline performs all of the steps above and produces a "usman/go-auth:develop" Docker image. In the following chapters, we will build on this to create a deeper continuous deployment pipeline. Because the view has a clear testing phase, you can also use it to promote application versions through a variety of deployment environments.

That is all the content of "how to create a continuous integration pipeline in Docker+Rancher". Thank you for reading! I hope the content shared here is helpful; if you want to learn more, you are welcome to follow the industry information channel!
