How to Tune Oracle SQL, Part 2: SQL Performance Methodology
2 SQL Performance Methodology
2.1 Application Design Guidelines
The key to achieving good SQL performance is to consider performance when designing your application.
2.1.1 Data Modeling Guidelines
Data modeling is very important for successful application design.
You must model the data according to actual business requirements. During this process there may be disputes about which data model is correct; what matters is devoting the greatest modeling effort to the entities affected by the most frequent business transactions.
In the modeling phase, there is a risk of spending too much time modeling non-core data elements, which increases development lead time. Modeling tools can generate schema definitions quickly and are useful when rapid prototyping is required.
2.1.2 Guidelines for Writing Efficient Applications
During the design and architecture phases of system development, ensure that application developers understand the efficiency of SQL execution.
To achieve this goal, the development environment must support the following characteristics:
Good database connection management
Connecting to a database is an expensive operation that does not scale well. Therefore, the best practice is to minimize the number of concurrent connections to the database. A simple system, in which a user connects when the application is initialized, is ideal. However, this is difficult to achieve in web-based or multi-tier applications. With these types of applications, use a database connection pool rather than re-establishing a connection for each user request.
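As an illustration, the following is a minimal sketch of connection pooling with the python-oracledb driver; the credentials, connection string (dsn), pool sizes, and the lookup_employee helper are placeholder assumptions, not part of the original guideline.

import oracledb

# Create the pool once, at application startup, instead of opening a
# new connection for every user request. All values are placeholders.
pool = oracledb.create_pool(
    user="app_user",
    password="app_password",
    dsn="dbhost.example.com/orclpdb1",
    min=2,          # connections kept open even when idle
    max=10,         # upper bound on concurrent connections
    increment=1,    # connections added as demand grows
)

def lookup_employee(last_name):
    # Borrow a connection from the pool; it is returned automatically
    # when the block exits, rather than being closed and re-created.
    with pool.acquire() as connection:
        cursor = connection.cursor()
        cursor.execute(
            "SELECT employee_id, last_name FROM employees WHERE last_name = :1",
            [last_name],
        )
        return cursor.fetchall()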
Good use and management of cursors
Maintaining user connections is equally important for minimizing parsing activity on the system. Parsing is the process of interpreting a SQL statement and creating an execution plan for it. This process has many phases, including syntax checking, security checking, execution plan generation, and loading shared structures into the shared pool. There are two types of parse operations:
Hard parsing: a SQL statement is submitted for the first time and no match is found in the shared pool. Hard parses are the most resource-intensive and least scalable, because they perform all the operations involved in a parse.
Soft parsing: a SQL statement is submitted for the first time and a match is found in the shared pool. The match can be the result of a previous execution by another user. The SQL statement is shared, which is optimal for performance. However, soft parses are not ideal, because they still require syntax and security checking, which consume system resources.
Because parsing should be minimized as much as possible, application developers should design their applications to parse SQL statements once and execute them multiple times. This is done through cursors. Experienced SQL programmers should be familiar with the concept of opening and re-executing cursors.
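The following is a minimal sketch of this parse-once, execute-many pattern with the python-oracledb driver; the connection details and the orders table are illustrative assumptions.

import oracledb

# Connection details are placeholders.
connection = oracledb.connect(user="app_user", password="app_password",
                              dsn="dbhost.example.com/orclpdb1")
cursor = connection.cursor()

# The statement text is fixed, so it is parsed once and the same cursor
# is re-executed with new bind values for each row.
sql = "INSERT INTO orders (order_id, customer_id, amount) VALUES (:1, :2, :3)"

rows = [
    (1, 101, 59.90),
    (2, 102, 12.50),
    (3, 101, 7.25),
]

# executemany() re-executes the already parsed statement for each row,
# avoiding a separate parse per insert.
cursor.executemany(sql, rows)
connection.commit()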
Effective use of bind variables
Application developers must also ensure that SQL statements are shared within the shared pool. To achieve this, write queries with bind variables rather than literals; otherwise, the SQL statement is likely to be parsed once and never reused by other users. To ensure that SQL is shared, use bind variables and not string literals in SQL statements. For example:
Statement with a string literal:
SELECT * FROM employees WHERE last_name LIKE 'KING'
Statement with a bind variable:
SELECT * FROM employees WHERE last_name LIKE :1
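From the application side, the bound form might be executed roughly as in the sketch below (python-oracledb; the connection details and search patterns are assumptions). The statement text stays constant while only the bind value changes, so the parsed representation can be shared.

import oracledb

# Connection details are placeholders.
connection = oracledb.connect(user="app_user", password="app_password",
                              dsn="dbhost.example.com/orclpdb1")
cursor = connection.cursor()

# One statement text, many bind values: the shared pool keeps a single
# shareable cursor instead of one statement per literal value.
sql = "SELECT employee_id, last_name FROM employees WHERE last_name LIKE :1"

for pattern in ("KING%", "SMITH%", "JONES%"):
    cursor.execute(sql, [pattern])
    for row in cursor:
        print(row)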
The following example shows some test results for a simple OLTP application:
Test                                     Number of users supported
No parsing of any statements             270
Soft parsing all statements              150
Hard parsing all statements              60
Re-connecting for each transaction       30
These tests were carried out on a four-CPU computer. The differences increase as the number of CPUs on the system increases.
2.2 Application Deployment Guidelines
For optimal performance, deploy your application as carefully as you design it.
2.2.1 Guidelines for Deploying in a Test Environment
The testing process mainly includes functional testing and stability testing. At some point in this process, you must perform a performance test.
The following list describes simple rules for performance testing an application. If documented correctly, this list provides important information for the production application and for the capacity planning process after the application has gone live.
Use the Automatic Database Diagnostic Monitor (ADDM) and SQL Tuning Advisor for design validation.
Test using the actual amount and distribution of data.
All testing must be done with fully populated tables. The test database should contain data representative of the production system in terms of data volume and cardinality between tables. All production indexes should be built and schema statistics should be populated correctly.
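For example, once the test tables are loaded, schema statistics could be gathered with DBMS_STATS, roughly as sketched below (python-oracledb; the connection details and the APP_SCHEMA schema name are assumptions).

import oracledb

# Connection details and schema name are placeholders.
connection = oracledb.connect(user="system", password="admin_password",
                              dsn="testhost.example.com/orclpdb1")
cursor = connection.cursor()

# Gather optimizer statistics for the whole test schema so the optimizer
# sees data volumes and cardinalities representative of production.
cursor.callproc("DBMS_STATS.GATHER_SCHEMA_STATS", ["APP_SCHEMA"])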
Use the correct optimizer mode.
Perform all testing with the optimizer mode that you plan to use in production.
Test individual user performance.
Test for acceptable performance for a single user on an idle or lightly used database. If a single user cannot achieve acceptable performance under ideal conditions, then multiple users cannot achieve acceptable performance under real conditions.
Obtain and record the plans for all SQL statements.
Obtain the execution plan for each SQL statement. Use this process to verify that the optimizer is generating an optimal execution plan, and to understand the relative cost of each SQL statement in terms of CPU time and physical I/O. This process assists in identifying the heavy-use transactions that will require the most tuning and performance work in the future.
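One common way to capture a plan is EXPLAIN PLAN followed by DBMS_XPLAN.DISPLAY, as in the sketch below (python-oracledb; the connection details and the sample query are assumptions).

import oracledb

# Connection details are placeholders.
connection = oracledb.connect(user="app_user", password="app_password",
                              dsn="testhost.example.com/orclpdb1")
cursor = connection.cursor()

# Ask the optimizer for a plan without executing the statement.
cursor.execute(
    "EXPLAIN PLAN FOR "
    "SELECT employee_id, last_name FROM employees WHERE department_id = 50"
)

# Format and print the plan that was just explained.
cursor.execute("SELECT plan_table_output FROM TABLE(DBMS_XPLAN.DISPLAY())")
for (line,) in cursor:
    print(line)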
Try a multi-user test.
This process is difficult to perform accurately because user workloads and profiles may not be fully quantified. However, you should test transactions that execute DML statements to ensure that there are no locking conflicts or serialization problems.
Test with the correct hardware configuration.
Test with a configuration as close to the production system as possible. Using a realistic system is particularly important for network latency, I/O subsystem bandwidth, and processor type and speed. Failing to test in this way could result in an incorrect analysis of potential performance problems.
Measure steady-state performance.
When benchmarking, it is important to measure performance under steady-state conditions. Each benchmark run should have a ramp-up phase, in which users connect to the application and gradually begin to perform work on it. This allows frequently cached data to be loaded into the cache and single-execution operations, such as parsing, to be completed before the steady state is reached. Likewise, it is useful to have a ramp-down period after the benchmark run, so that the system can release resources and users can stop working and disconnect.
2.2.2 Guidelines for Application Rollout
When a new application is rolled out, there are usually two strategies: the Big Bang approach, in which all users migrate to the new system at once, and the Trickle approach, in which users migrate gradually from the existing system to the new one.
Both approaches have advantages and disadvantages. The Big Bang approach relies on reliable testing of the application at the required scale, but has the advantage of minimizing data conversion and synchronization with the old system, because the old system is simply switched off. The Trickle approach allows debugging of scalability issues as the workload increases, but it may mean that data must be migrated to and from the legacy system while the transition is under way.
It is difficult to recommend one approach over the other, because each carries risks that could cause a system outage when the conversion takes place. Certainly, the Trickle approach allows actual users to be observed and analyzed as they are introduced to the new application, and the system to be reconfigured while affecting only the migrated users. This affects the work of the early adopters but limits the load on support services, so that unplanned outages affect only a small portion of the user population.
How to roll out a new application is a decision for each business. Whatever approach is adopted has its own unique pressures and stresses. The more testing and knowledge gained from the testing process, the better you can judge the best time to roll out the application.