Controlling parallel operations across nodes in Oracle RAC

2025-01-27 Update From: SLTechnology News&Howtos



One of the advantages of RAC is that parallel operations can span nodes. How, then, do you control where parallel execution takes place? That is what this article discusses.

In 10g:

To control cross-node parallelism, two parameters must be set first:

instance_groups: declares which instance groups a node's instance belongs to. Each node can belong to several groups; the group names are separated by commas.

parallel_instance_group: set to one of the values defined in instance_groups; it determines which instance group parallel operations issued on that node may span.

alter system set instance_groups='dw','dw1','dw123','dw124','dw134' scope=spfile sid='dwrac1';

alter system set instance_groups='dw','dw2','dw123','dw124','dw234' scope=spfile sid='dwrac2';

alter system set instance_groups='dw','dw3','dw123','dw134','dw234' scope=spfile sid='dwrac3';

alter system set instance_groups='dw','dw4','dw124','dw134','dw234' scope=spfile sid='dwrac4';

The settings above define five instance groups for each of the four nodes; some groups contain multiple nodes, while others contain only one.
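To confirm which groups each instance actually belongs to after the restart, you can query the standard dynamic view gv$parameter from any node (a quick sketch; the dwrac instance names are this article's examples):

```sql
-- Show the instance_groups setting of every instance in the cluster
select inst_id, value as instance_groups
  from gv$parameter
 where name = 'instance_groups'
 order by inst_id;
```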

First, execute across all nodes

We test two cases: instance_groups contains all nodes, and instance_groups contains only one node.

1. instance_groups contains all nodes

1) Node 2 parameter settings

admin@dwrac2> show parameter instance_group

NAME                      TYPE        VALUE
------------------------- ----------- ------------------------------
instance_groups           string      dw, dw2, dw123, dw124, dw234
parallel_instance_group   string

Only instance_groups is defined and parallel_instance_group is not set.

As we set up earlier, the dw instance_group contains all the nodes.

2) Issue a query on node 2

admin@dwrac2> select /*+ parallel(a, 128) */ count(1) from testuser.big_table a;

3) Open another window to query the parallel processes

SQL> select inst_id, count(1) from gv$px_session p where (p.inst_id, p.sid) in (select s.inst_id, s.sid from gv$session s where s.username='ADMIN' and module like 'SQL*%') group by inst_id;

   INST_ID   COUNT(1)
---------- ----------
         1         32
         2         33
         4         32
         3         32

As you can see, the parallel sessions (parallel slaves) are evenly distributed across the four nodes. Node 2 has one extra session because it initiated the query: that additional session is the parallel query coordinator, and the rest are parallel slave processes.

The slaves can be assigned to all 4 nodes because all 4 nodes belong to the instance group dw and parallel_instance_group is not set.

2. instance_groups contains only one node

1) Modify the parameter

admin@dwrac2> alter system set instance_groups='dw2' scope=spfile sid='dwrac2';

Restart the instance, then check:

admin@dwrac2> show parameter instance_group

NAME                      TYPE        VALUE
------------------------- ----------- ------------------------------
instance_groups           string      dw2
parallel_instance_group   string

parallel_instance_group is still unset, and instance_groups=dw2 now contains only node 2.

2) Issue a parallel query

admin@dwrac2> select /*+ parallel(a, 128) */ count(1) from testuser.big_table a;

3) Open another window to query the parallel processes

SQL> select inst_id, count(1) from gv$px_session p where (p.inst_id, p.sid) in (select s.inst_id, s.sid from gv$session s where s.username='ADMIN' and module like 'SQL*%') group by inst_id;

   INST_ID   COUNT(1)
---------- ----------
         1         32
         2         33
         4         32
         3         32

As you can see, even though instance_groups='dw2' appears only on node 2, parallelism still spans all nodes when parallel_instance_group is not set.

Second, execute across some nodes

This part covers two tests: the node issuing the command is a member of the parallel_instance_group, and the node issuing the command is not.

1. The issuing node is a member of the parallel_instance_group

1) Modify parameters on node 2

admin@dwrac2> show parameter instance_group

NAME                      TYPE        VALUE
------------------------- ----------- ------------------------------
instance_groups           string      dw, dw2, dw123, dw124, dw234
parallel_instance_group   string

admin@dwrac2> alter session set parallel_instance_group='dw2';

Session altered.

This setting forces parallel operations to be performed only on nodes that belong to dw2, the instance group.

Note: the value of parallel_instance_group is case-sensitive. If you run alter session set parallel_instance_group=dw2 without quotation marks, the value is converted to uppercase DW2, which matches no instance group, so the query gets no parallel slaves at all.

2) Issue the same query on node 2

admin@dwrac2> select /*+ parallel(a, 128) */ count(1) from testuser.big_table a;

3) Start another session and query the parallel processes

SQL> select inst_id, count(1) from gv$px_session p where (p.inst_id, p.sid) in (select s.inst_id, s.sid from gv$session s where s.username='ADMIN' and module like 'SQL*%') group by inst_id;

   INST_ID   COUNT(1)
---------- ----------
         2        129

As you can see, because only node 2 belongs to the instance group dw2, all parallel sessions are distributed on node 2.

2. The issuing node does not belong to the parallel_instance_group

1) Modify the parameter on node 2

admin@dwrac2> show parameter instance_group

NAME                      TYPE        VALUE
------------------------- ----------- ------------------------------
instance_groups           string      dw, dw2, dw123, dw124, dw234
parallel_instance_group   string

admin@dwrac2> alter session set parallel_instance_group='dw1';

Session altered.

According to our previous settings, only node 1 belongs to the instance group of dw1.

2) issue a query on node 2

admin@dwrac2> select /*+ parallel(a, 128) */ count(1) from testuser.big_table a;

3) start a new session to query parallel process information

SQL> select inst_id, count(1) from gv$px_session p where (p.inst_id, p.sid) in (select s.inst_id, s.sid from gv$session s where s.username='ADMIN' and module like 'SQL*%') group by inst_id;

   INST_ID   COUNT(1)
---------- ----------
         1        128
         2          1

As you can see, except for the parallel query coordinator (that is, the session that issued the query command), all parallel processes are running on node 1.

This also shows that even if the node issuing the parallel query does not itself belong to the parallel_instance_group, it can still use the resources of the nodes that do.
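If you want to see explicitly which instance hosts the coordinator and where its slaves landed, gv$px_session also exposes the coordinator's instance and session (qcinst_id, qcsid). A minimal sketch using those standard columns:

```sql
-- Count parallel sessions per instance, grouped by the coordinator's instance
select inst_id, qcinst_id, count(*) as sessions
  from gv$px_session
 group by inst_id, qcinst_id
 order by inst_id;
```

Rows where inst_id differs from qcinst_id are slaves running away from their coordinator's node.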

admin@dwrac2> alter session set parallel_instance_group='dw134';

Session altered.

Elapsed: 00:00:00.00

admin@dwrac2> select /*+ parallel(a, 128) */ count(1) from testuser.big_table a;

SQL> select inst_id, count(1) from gv$px_session p where (p.inst_id, p.sid) in (select s.inst_id, s.sid from gv$session s where s.username='ADMIN' and module like 'SQL*%') group by inst_id;

   INST_ID   COUNT(1)
---------- ----------
         1         42
         2          1
         4         42
         3         44

To sum up:

1. RAC can flexibly control how to perform parallel operations across nodes through instance_groups and parallel_instance_group.

2. parallel_instance_group is case-sensitive. If its value does not match any instance_groups setting anywhere in the cluster, Oracle runs the statement serially and does not enable parallelism.

3. If parallel_instance_group is not set, parallelism can span all nodes regardless of how instance_groups is set.

4. If a node sets a valid parallel_instance_group, parallel operations issued on that node run on all nodes contained in that group, whether or not the group includes the issuing node. In other words, where the parallel sessions run depends only on instance_groups and parallel_instance_group, not on which node issues the command.

5. It is generally recommended to define appropriate instance_groups but not to set parallel_instance_group at the system level; instead, set it at the session level as the situation requires.

6. Here are some examples and illustrations

dwrac1.instance_groups='dw','dw1','dw123','dw124','dw134'

dwrac2.instance_groups='dw','dw2','dw123','dw124','dw234'

dwrac3.instance_groups='dw','dw3','dw123','dw134','dw234'

dwrac4.instance_groups='dw','dw4','dw124','dw134','dw234'

dwrac1.parallel_instance_group=''      -- parallel queries issued on node 1 can run across all nodes

dwrac1.parallel_instance_group='dw'    -- parallel queries issued on node 1 can run across all nodes

dwrac1.parallel_instance_group='dw1'   -- parallel queries issued on node 1 run only on node 1

dwrac1.parallel_instance_group='dw2'   -- parallel queries issued on node 1 run only on node 2

dwrac2.parallel_instance_group='dw134' -- parallel queries issued on node 2 run only on nodes 1, 3 and 4

dwrac1.parallel_instance_group='other' -- parallelism is not enabled

Reposted from: http://www.cnblogs.com/ylqmf/archive/2012/03/16/2400126.html

In 11g:

In 11g, parallel sessions by default allocate their parallel-execution message buffers from the shared pool. With a high degree of parallelism this can exhaust the shared pool and raise ORA-4031 errors. Setting the hidden parameter _px_use_large_pool to true makes parallel execution use the large pool instead.

1. Adjust PX operations to use the large pool

alter system set "_px_use_large_pool"=true sid='*' scope=spfile;

2. Adjust the RAC parameter parallel_force_local. New in 11g, this parameter restricts parallel slave processes to the node where the parallel SQL is issued, avoiding the heavy inter-node data exchange and the performance problems that cross-node parallelism can cause. It supersedes the pre-11g pair instance_groups and parallel_instance_group.

alter system set parallel_force_local=true sid='*' scope=spfile;
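After the restart you can confirm the setting, and since parallel_force_local is also session-modifiable, a session that genuinely needs cross-node parallelism can switch it off for itself. A small sketch (standard commands; whether your workload should do this is a judgment call):

```sql
-- Check the current value
show parameter parallel_force_local

-- Allow cross-node parallelism again for this session only
alter session set parallel_force_local=false;
```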
