This article explains how to set up continuous integration (CI) for a database using Git, Azure DevOps, and the Redgate SQL Toolbelt.
Database CI in theory and practice
The theory behind CI is that if we integrate code into a shared repository several times a day and then verify each commit by running automated builds and subsequent tests, we will identify and eradicate problems early and improve the quality of the software.
This article is about setting up a pipeline so that you can put CI theory into practice with a database. When I published the original article on this topic in 2013, it used SVN and Jenkins and proved very popular. That is still a feasible route, but in our industry five years is an eternity, and other technologies and tools have become more popular. PowerShell has become the main scripting language for the Microsoft stack, and Git is everyone's favourite source control tool. Jenkins, or indeed TeamCity, would still make perfectly good CI servers, but last month Azure DevOps replaced Visual Studio Team Services (VSTS), and it felt like the right time to write a new version of the original 2013 post. Azure DevOps has a free tier that anyone following this tutorial can use.
Therefore, this article will use:
Git - as the source control system.
Azure DevOps - to host the remote Git repository and automate the database builds.
Redgate SQL Source Control (SoC) - to commit database changes locally. We will clone the remote repository and link it to SoC so that we can commit database changes directly from SSMS.
PowerShell - to push commits to the remote repository hosted on Azure DevOps.
Redgate SQL Change Automation (SCA) - to automate local database builds using PowerShell cmdlets (optional).
The SCA plug-in for Azure DevOps - to automate the database build and deployment process.
Of course, just having some CI tools does not make you a CI practitioner, any more than buying your first telescope makes you an astronomer. The tools are only a necessary first step. The team still needs to adopt good CI practices. Developers need to write unit tests to prove that each small change works. They must commit changes to the shared source control repository as many times a day as possible. Each commit should trigger an automated database build, to ensure that the team always has a working database build process. Every night, the team should run integration tests (not covered in this article) to prove that all the individual units work together to deliver the required process. These CI practices lead naturally towards automated deployment.
Of course, you can practice CI without these tools, but the better the tools, the easier and more valuable you will find CI, just as with telescopes and astronomy.
What do you need to start?
This article assumes you have a basic understanding of SQL Server, Git, and PowerShell. To reproduce this proof of concept, I used the following software versions:
A Windows Server 2016 VM with administrator access. You should be able to achieve similar results on Windows 10.
SQL Server 2017 Developer Edition. You can download it here.
Redgate SQL Toolbelt (October 2018 version). You can download it here; it offers a 14-day free trial. In particular, you need SQL Source Control and SQL Change Automation installed on your local workstation.
Git 2.19. You can download it here.
Azure DevOps account, where you can create an account for free.
You can use older or newer versions of the software, but since Azure DevOps is a managed service that changes over time, I make no promises about compatibility. Similarly, if you read this post months or years from now, you may find that the Azure DevOps UI has changed:
You also need a database to play with. I will use the StackOverflow database because it has a simple schema and I know how it works. For your first build, try using something simple that can be built on localDB. That rules out AdventureWorks, because localDB does not support full-text search. Also, for now, avoid databases that depend on other databases.
Step 1: create a new Azure DevOps project and clone the repository
Navigate to the Projects tab in Azure DevOps (in my case, https://dev.azure.com/DLMConsultants/_projects)
Give the project a name, such as the name of your database. Under version control, select Git, and then choose your preferred work item process. If in doubt, the default work item process (Agile) is fine, because it has no impact on this tutorial.
Clone the repo to the local workstation. To do this, copy the HTTPS link in Azure DevOps:
Now open a PowerShell window, navigate to the directory where you want to store the source code, and clone the repo. The basic command is as follows:
git clone https://yourcompany@dev.azure.com/etc
This is my full command window:
You should now have a local repository on your computer that contains a hidden .git directory where you can add database source code.
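To see the clone step in isolation, here is a minimal sketch that stands in a local bare repository for the Azure DevOps remote; the names origin.git and StackOverflow are placeholders, not anything this tutorial creates for you.

```shell
# Simulate the remote hosted on Azure DevOps with a local bare repository,
# then clone it just as you would clone over HTTPS.
# "origin.git" and "StackOverflow" are placeholder names for this sketch.
set -e
workdir=$(mktemp -d)
cd "$workdir"
git init --bare origin.git >/dev/null
git clone origin.git StackOverflow 2>/dev/null
# The clone contains the hidden .git directory that marks it as a repository.
ls -d StackOverflow/.git
```

With a real Azure DevOps remote, only the URL changes; the resulting working copy is identical.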
Step 2: link the database to source control
We will use Redgate SQL Source Control, so make sure that it is installed and that you can access your database in SSMS. Open SSMS and right-click the development or test database that you want to link to source control. Select Link Database to Source Control.
Redgate SQL Source Control opens in the query window. Select "Link to my source control system", select Git from the list of supported source control systems, and paste in the path to the local source control repository. Then click Browse, create a new folder called "state" in the root of the source control repository, and link the database to that "state" directory. The reason for this will become clear in step 2.1.
After linking succeeds, notice that a RedGate.ssc file has been added to the "state" directory in the Git repository. Then go to the Commit tab in Redgate SQL Source Control, and you will see a list of all the database objects, ready to commit to source control. Make sure all the objects are selected, type a commit message, and click Commit:
When SQL Source Control indicates that all changes have been committed, please check your git repository again. You should see that all database objects are written into various directories.
Because Git is a distributed source control system, so far you have only committed these changes to the local repository. You have not yet pushed the files to the remote repository hosted in Azure DevOps. To do that, open the PowerShell terminal again, navigate to the root directory of the Git repo, and run the command: git push.
You may see a Push button on the SQL Source Control Commit tab, but it may not work properly. This is a known problem with Redgate SQL Source Control and Git repos hosted in Azure DevOps. Although a little frustrating, opening a PowerShell terminal and typing "git push" is not too much trouble, and other features in the SQL Toolbelt make up for it. That said, if the broken Push button bothers you, you can add your vote here.
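As a sketch of that manual workflow, the following stands in a local bare repository for the Azure DevOps remote; the repository paths, file name, and commit message are illustrative only, and the hand-written commit takes the place of the commit SQL Source Control makes for you.

```shell
# Stand in a local bare repository for the remote hosted on Azure DevOps.
set -e
workdir=$(mktemp -d)
git init --bare "$workdir/remote.git" >/dev/null
git clone "$workdir/remote.git" "$workdir/db" 2>/dev/null
cd "$workdir/db"
git config user.email "you@example.com"
git config user.name "Example User"
git checkout -b main 2>/dev/null          # name the unborn branch explicitly
# SQL Source Control scripts the objects into the "state" folder and commits
# them for you; here we fake one scripted object and commit by hand.
mkdir -p state
echo "-- CREATE TABLE dbo.Posts ..." > state/Posts.sql
git add state
git commit -m "Initial database objects" >/dev/null
# The step the broken Push button skips: push from the repository root.
git push -u origin main 2>/dev/null
```

After the push, the commit is visible on the remote, which is exactly what you should see under the Code tab in Azure DevOps.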
After pushing the source code to Azure DevOps, you should be able to see it under the Code tab of the Azure DevOps Web site:
Step 2.1: script the database build using SCA and PowerShell (optional)
We now have our database in source control, so the next step is to set up a build process to check whether our source code "compiles". By this I mean that it is deployable. For example, if there is some illegal T-SQL syntax in a source file, or if a view is missing a dependency (perhaps because I refactored some tables and forgot to update my views), then SQL Server will not be able to execute my source code and my database build will fail.
This step is optional, or at least running it locally using PowerShell is optional. Another way is to skip this step and go straight to step 3 and start configuring the database build on Azure DevOps. However, I like to run my build locally first. It helps me understand what's going on behind the scenes. It also helps me understand whether my first build failed due to my source code or my Azure DevOps configuration.
In the root directory of source control, create a new directory called "build" next to the "state" directory. This explains why I did not link my database to the root of my Git repository in step 2: it is useful to keep other related files in source control, but you do not want them inside the directory that Redgate SQL Source Control manages.
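A minimal sketch of the resulting repository layout (the paths are illustrative; build.ps1 is the script created in the next section):

```shell
# The "state" folder that SQL Source Control manages, plus a sibling
# "build" folder for the build script, side by side in the repo root.
set -e
repo=$(mktemp -d)
mkdir -p "$repo/state" "$repo/build"
touch "$repo/build/build.ps1"
ls "$repo"
```

Keeping build.ps1 outside "state" means SQL Source Control never tries to treat it as a database object.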
Validate the build using localDB
Create a new file called "build.ps1" in the build directory and copy the following PowerShell code into it. You may want to change the default value of the $packageID parameter to reflect the name of your database. For now, keep the other parameters as they are. I'll explain the $packageVersion, $packagePath, and $testPath parameters shortly, and the others at the end of the article.
param(
    [string] $packageVersion = '0.1',
    [string] $packageID = 'StackOverflow',
    [string] $packagePath = 'C:\packages',
    [string] $testPath = 'C:\testResults',
    [string] $targetServerInstance = 'TARGETSERVERINSTANCE',
    [string] $targetDatabase = 'TARGETDATABASE'
)

$errorActionPreference = "stop"

Import-Module SqlChangeAutomation -ErrorAction SilentlyContinue -ErrorVariable +ImportErrors

Write-Output "***** Parameters *****"
Write-Output "packageVersion is $packageVersion"
Write-Output "packageID is $packageID"
Write-Output "packagePath is $packagePath"
Write-Output "testPath is $testPath"
Write-Output "**********************"

# Find the state folder in the parent directory
$myDir = Split-Path -Parent $MyInvocation.MyCommand.Path
$scriptsFolder = Join-Path -Path $myDir -ChildPath '..\state'
if (-not (Test-Path -PathType Container $scriptsFolder)) {
    Write-Error "The directory $scriptsFolder does not exist."
}

# Use Redgate SCA to validate the code in the state directory
try {
    $validatedScriptsFolder = Invoke-DatabaseBuild $scriptsFolder # -SQLCompareOptions 'NoTransactions'
}
catch {
    "The database could not be validated because $($_.Exception.Message)" | ForEach-Object { Write-Error $_ }
}
This PowerShell code uses SQL Change Automation to create a new database on localDB and deploy all the source code to it. Once the database has deployed successfully, SQL Change Automation deletes it immediately, because it has served its purpose. If you run the build ten times a week, you don't want fifty test databases hanging around.
You should be able to run scripts from PowerShell and override the default parameters if you prefer.
cd <root of source control>\build
.\build.ps1 -packageVersion '0.1' -packageID 'MyDatabase'
This is the complete command window, from my example:
If your build script works, that's great! If not, you may have found some broken code, which is also great! Typically, failures are the result of broken code, missing dependencies, localDB not being installed, or incorrect SQL Server credentials. The PowerShell output is your friend.
Validate the build using a temporary server
You may hit situations where the database cannot be deployed to localDB, for example because it depends on some other database, or because it uses SQL Server features that localDB does not support. For example, AdventureWorks uses full-text search, so it will not deploy to localDB. Instead, you can deploy to a database on a "temporary" SQL Server instance that has all the features, dependencies, filegroups, and so on that the database requires. (See examples 3 and 4 in the Invoke-DatabaseBuild cmdlet documentation.)
You can download a PowerShell build script (DBBuildTempServer.ps1), written in a slightly different style to demonstrate possible variations, to see how this works.
You should be able to run scripts from PowerShell and override the default parameters if you prefer.
cd <root of source control>\build
.\build.ps1 -packageVersion '0.1' -packageID 'MyDatabase' -tempServer 'MyServerInstance' -userID '' -password ''
If you do not have a SQL Server instance available for this purpose, simply remove the -TemporaryDatabaseServer $tempConnectionString parameter from the SCA Invoke-DatabaseBuild command, and it will use localDB, as mentioned earlier. Phil Factor's article also provides a lot of additional information about using PowerShell and SCA.
Create a NuGet package
At this stage, all we are doing is validating the build; in other words, proving that the database will build successfully. Once this works, we can start exploring the other SCA cmdlets that will generate and export a database build artifact from the validated build, run our test suite against it, and then deploy the tested changes to bring a target database into sync with the validated and tested source.
I'll cover the testing and synchronizing cmdlets later, in the extensions section, but for now you may want to generate a NuGet package. The NuGet package can serve as a build artifact, an important milestone towards package-based deployment and continuous delivery. You only need to uncomment a few lines of code in the build script to generate it.
Just uncomment the two lines immediately under # Export NuGet package in the comment block at the bottom of the script, then commit and push the updated build script:
git add build.ps1
git commit -m "add build script"
git push
Step 3: use Azure DevOps to build automatically
You now have the database code in source control and push it to Azure DevOps. If you follow the instructions in step 2.1, you can also use the PowerShell script to build the database in source control, and you know that the source code is "compiled". Now, we want Azure DevOps to build the database every time a new change is pushed to source control to verify that the change can be deployed and catch any errors.
We will use the Redgate Azure DevOps extension to automate the database build, since this does not require you to have followed step 2.1. If you did follow step 2.1, instead of using the Redgate plug-in you may prefer to run your build script using the vanilla PowerShell task, executing the build.ps1 file and overriding the default parameters as required. If you take that approach, you may find it useful to review the Azure DevOps predefined variables; for example, you may want to use $(Build.BuildNumber) or $(Build.BuildId) to create packages with sequential version numbers.
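As a hedged sketch of that idea: on an Azure DevOps agent, predefined variables such as $(Build.BuildNumber) are also exposed to scripts as environment variables (BUILD_BUILDNUMBER), so a script can pick up a sequential version number on the agent and fall back to a default when run locally; the fallback '0.1' here simply mirrors the build.ps1 default.

```shell
# On an Azure DevOps agent, $(Build.BuildNumber) is available to scripts
# as the environment variable BUILD_BUILDNUMBER; locally it is unset,
# so fall back to a default version.
packageVersion="${BUILD_BUILDNUMBER:-0.1}"
echo "Building package version $packageVersion"
```

The same pattern works from the PowerShell task by reading $env:BUILD_BUILDNUMBER.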
Your automated build will be performed by a build agent. If you were able to build your database on localDB, you should be able to use a hosted build agent right away. Otherwise, you will need to configure a self-hosted agent on a server that can access an appropriate target SQL Server instance to run the build.
In Azure DevOps, hover over the Pipelines tab and select Builds from the drop-down list.
Click the New pipeline button, making sure the correct source control repository and the master branch are selected. Then select Empty job as the template.
A new pipeline will be created for you with a default name (in my case, StackOverflow-CI) and a default agent queue (in my case, Hosted VS2017, the Azure DevOps default). The pipeline contains a default job named Agent job 1, which you can rename if you select it. Let's call it Build StackOverflow Database.
We can add tasks to agent jobs that will run MSBuild or execute PowerShell scripts, etc., to compile and test our code. To do this, click the + button next to the agent job.
The Redgate SQL Change Automation Azure DevOps extension is not available by default, but if you search for "Redgate" in the Marketplace tab under add tasks, you will find it. Follow the instructions to add the SQL Change Automation:Build extension.
After the installation is complete, navigate back to your build definition. You may need to repeat points 2-5 above, as your new definition may not have been saved. This time, when you search for "Redgate", you will find that the extension no longer needs to be installed from the Marketplace; you can simply click Add.
A build task is created under Build StackOverflow Database, with some settings flagged for your attention. Click the build task, and for the operation select to build a SQL Source Control project. Then, under Database folder, select "Database scripts folder is a sub-folder of the VCS root", and use the picker under Subfolder Path to select the state directory maintained by SQL Source Control (do not select the root of the Git repo, which, if you followed step 2.1, also contains your build script!).
Under Output NuGet Package ID, write a name for your build artifact. Typically, this will be the name of your database.
Click Save and queue to manually trigger the build. The default setting should be fine. You should see a yellow notification indicating that the build has been queued. If you click the number, you will go to the real-time build log.
You will be assigned a hosted agent, which downloads SQL Change Automation, builds the source code on localDB in the cloud, and reports the results.
Note that if your database cannot be built on the default localDB installation on the hosted build agent in Azure, the build will fail. If your database cannot build on localDB, you need to provide a separate SQL Server instance to run the build against. In the Build StackOverflow Database build task, change the Temporary server type from "SQL LocalDB (recommended)" to "SQL Server" and provide connection details for the server instance. If you want to host this server instance on-premises, you may need to install a self-hosted build agent that can access that server to run the build.
The build may take several minutes. In my case, using StackOverflow and a hosted build agent, it took about two and a half minutes. With a larger, more complex schema, it may take longer.
If the build fails, look for errors in the log and troubleshoot. Common problems include missing dependencies, incorrect credentials, insufficient permissions on the database, use of features that localDB does not support, and broken T-SQL code. Keep plugging away until your build goes green. If you skipped step 2.1 and run into problems with the build, consider going back and trying step 2.1, as it is an easier way to work through source code problems and can help you determine whether the issue lies with your source code or your Azure DevOps configuration.
Once the build is green, you need to set up a build trigger to ensure the build runs each time new code is pushed to the Azure DevOps server. From any page in Azure DevOps, hover over the Pipelines tab at the top and select Builds to navigate back to your build definition. Select your build definition, and then select the Edit button.
Select Triggers, make sure continuous integration is enabled, and you are done.
From now on, whenever you push new code to Azure DevOps, a build will start automatically to validate your code, so you can integrate continuously and get fast feedback on errors.
Further extensions
Automating the database build is only the first step. All it does is test whether your source code is deployable. It does not test whether your stored procedures run as designed. It does not automatically update an integration database so that it always stays in sync with source control. It does not deploy your code to test, staging, and production servers.
First, and most importantly, you need to start writing tests for the database that will run automatically on each build. Take a look at tSQLt and Redgate SQL Test.
If you used the PowerShell script written in step 2.1, and you have installed the tSQLt framework in your database and committed it to source control, you can run the tests as part of your build by uncommenting the relevant lines in the script. This will fail if you have not yet committed the tSQLt framework to source control. You should also make sure that the default value of the $testPath parameter is set to an existing directory outside the Git repo. As with packages, $packageVersion needs to be overridden each time the build runs.
If you want the build to deploy the changes to the integrated database, uncomment the line under # Sync a test database and set the default values for $targetServerInstance and $targetDatabase appropriately.
If you use the Redgate SQL Change Automation extension in Azure DevOps, you need to:
Add a SQL Change Automation: Build task with the operation set to test the build artifact from the SQL Source Control project using tSQLt tests. Again, this will not work unless you already have the tSQLt framework and tests in source control.
Add a SQL Change Automation: Build task with the operation set to sync the build artifact from the SQL Source Control project to a target database.