

How to understand LogMiner


This article explains how to understand LogMiner. The content is quite detailed; interested readers can use it as a reference, and I hope it proves helpful.

I. Basic introduction

The LogMiner tool can analyze all the DML statements executed against a database, and it can also reconstruct the SQL needed to roll specific changes back. This makes it particularly useful for debugging, auditing, and undoing specific transactions. LogMiner can analyze both online and archived (offline) redo logs, from its own database or from another database. Using this information, a DBA can:

* identify and reverse erroneous user operations (see the sketch after this list)

* determine how to perform fine-grained recovery at the transaction level

* tune and plan database capacity through trend analysis

* perform activity auditing
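For example, the undo SQL that LogMiner reconstructs can be used to reverse an accidental DML statement. The following is only a sketch: the SCOTT.EMP table is a hypothetical example, and the full mining procedure (adding logs, starting LogMiner) is covered in section III.

-- Assumed scenario: rows were deleted from SCOTT.EMP by mistake.
-- After mining the relevant logs, fetch the compensating statements:
select sql_undo
  from v$logmnr_contents
 where seg_owner = 'SCOTT'
   and seg_name  = 'EMP'
   and operation = 'DELETE';
-- For a DELETE, each SQL_UNDO row is an INSERT that restores one deleted row.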

II. Basic composition

LogMiner consists of two PL/SQL packages:

DBMS_LOGMNR_D:

Contains the procedure that extracts the data dictionary, DBMS_LOGMNR_D.build(). It also contains a procedure for relocating the LogMiner tables, DBMS_LOGMNR_D.set_tablespace(); by default the LogMiner tables are created in the SYSTEM tablespace.

How to extract data dictionary:

1. Extract as flat file

Requires setting the UTL_FILE_DIR parameter (the database must be restarted for it to take effect):

alter system set UTL_FILE_DIR='/logminer' scope=spfile;

exec DBMS_LOGMNR_D.build(dictionary_filename => 'logmnr.ora', dictionary_location => '/logminer' [, options => DBMS_LOGMNR_D.STORE_IN_FLAT_FILE]);

2. Extract to redo log

The database must be running in ARCHIVELOG mode.

exec DBMS_LOGMNR_D.build(options => DBMS_LOGMNR_D.STORE_IN_REDO_LOGS);

Once the redo logs containing the data dictionary have been archived, you can locate them through the V$ARCHIVED_LOG view:

select name from v$archived_log where dictionary_begin='YES';

select name from v$archived_log where dictionary_end='YES';

If the dictionary was extracted into the redo logs, the redo/archived files that contain the dictionary must be added with DBMS_LOGMNR.add_logfile along with the log files to be analyzed, and the DICT_FROM_REDO_LOGS option must be specified when starting the analysis with start_logmnr.
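A minimal sketch of this case follows; the archived log file names are hypothetical placeholders for the files returned by the V$ARCHIVED_LOG queries above.

-- Add the archived logs that contain the dictionary (and any other logs to analyze)
exec DBMS_LOGMNR.add_logfile(logfilename => '/arch/1_120_1122334455.dbf', options => DBMS_LOGMNR.NEW);
exec DBMS_LOGMNR.add_logfile(logfilename => '/arch/1_121_1122334455.dbf', options => DBMS_LOGMNR.ADDFILE);
-- Read the dictionary from the added redo/archived logs
exec DBMS_LOGMNR.start_logmnr(options => DBMS_LOGMNR.DICT_FROM_REDO_LOGS);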

3. Use online catalog

exec DBMS_LOGMNR.start_logmnr(options => DBMS_LOGMNR.DICT_FROM_ONLINE_CATALOG);

DBMS_LOGMNR:

add_logfile(logfilename =>, options =>): adds or removes the log files to be analyzed. Options: DBMS_LOGMNR.NEW / ADDFILE / REMOVEFILE

start_logmnr(options =>, dictfilename =>, [starttime/endtime/startscn/endscn]): starts the log analysis and sets the conditions for data extraction

end_logmnr(): terminates the analysis
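Put together, a typical call sequence using these three procedures looks like the minimal sketch below (using the online catalog as the dictionary; the log file path is a placeholder). Section III walks through each step in detail.

exec DBMS_LOGMNR.add_logfile(logfilename => '/oracle/logs/redo01.log', options => DBMS_LOGMNR.NEW);
exec DBMS_LOGMNR.start_logmnr(options => DBMS_LOGMNR.DICT_FROM_ONLINE_CATALOG);
select scn, operation, sql_redo from v$logmnr_contents;   -- inspect the results
exec DBMS_LOGMNR.end_logmnr;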

Views related to LogMiner:

V$LOGMNR_DICTIONARY: LogMiner data dictionary information

V$LOGMNR_PARAMETERS: parameter information

V$LOGMNR_LOGS: list of logs currently used for analysis

V$LOGMNR_CONTENTS: log analysis results
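For example, after registering log files you can confirm what LogMiner will analyze and with which settings (a small sketch using the views above):

select log_id, filename from v$logmnr_logs;   -- logs registered for analysis
select * from v$logmnr_parameters;            -- limits and options currently in effect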

III. Examples

1. Create the DBMS_LOGMNR packages and the related data dictionary

@$ORACLE_HOME/rdbms/admin/dbmslm.sql

@$ORACLE_HOME/rdbms/admin/dbmslmd.sql

2. Initialization settings

A: it is recommended to enable minimal supplemental logging:

alter database add supplemental log data;

B: a dedicated tablespace for the LogMiner tables can be created and assigned:

exec DBMS_LOGMNR_D.set_tablespace('logminer');
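The result of step A can be verified with a standard dictionary query, for example:

select supplemental_log_data_min from v$database;   -- shows YES once minimal supplemental logging is enabled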

3. Extract the dictionary

As mentioned earlier, the dictionary can be extracted as a flat file or into the redo logs, or the online catalog can be used directly.

A: alter system set UTL_FILE_DIR='' scope=spfile;

exec DBMS_LOGMNR_D.build(dictionary_filename => '', dictionary_location => '', options => DBMS_LOGMNR_D.STORE_IN_FLAT_FILE);

B: exec DBMS_LOGMNR_D.build(options => DBMS_LOGMNR_D.STORE_IN_REDO_LOGS);

C: when using the online catalog, no extraction is needed; instead specify DBMS_LOGMNR.DICT_FROM_ONLINE_CATALOG when starting LogMiner in step 5.

4. Add the logs to be analyzed

exec DBMS_LOGMNR.add_logfile(options => DBMS_LOGMNR.NEW, logfilename => '/oracle/logs/redo01.log');

exec DBMS_LOGMNR.add_logfile(options => DBMS_LOGMNR.ADDFILE, logfilename => '/oracle/logs/redo02.log');

exec DBMS_LOGMNR.add_logfile(options => DBMS_LOGMNR.REMOVEFILE, logfilename => '/oracle/logs/redo3.log');

5. Start LogMiner

If the dictionary was extracted as a flat file, specify the dictfilename parameter. If no flat-file dictionary is used, specify the DICT_FROM_REDO_LOGS or DICT_FROM_ONLINE_CATALOG option instead.

exec DBMS_LOGMNR.start_logmnr(dictfilename => '', options => );

Filter the data returned by LogMiner:

A: show only committed transactions

exec DBMS_LOGMNR.start_logmnr(options => DBMS_LOGMNR.COMMITTED_DATA_ONLY, dictfilename => '');

B: ignore redo errors

exec DBMS_LOGMNR.start_logmnr(options => DBMS_LOGMNR.SKIP_CORRUPTION, dictfilename => '');

C: filter by time

exec DBMS_LOGMNR.start_logmnr(dictfilename => '', starttime => to_date('',''), endtime => to_date('',''));

D: filter by SCN

exec DBMS_LOGMNR.start_logmnr(dictfilename => '', startscn => 1, endscn => 5);

E: track DDL statements (off by default; cannot be used when the dictionary is taken from the online catalog)

exec DBMS_LOGMNR.start_logmnr(options => DBMS_LOGMNR.DDL_DICT_TRACKING, dictfilename => '');
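Note that several options can be combined by adding the constants together. Below is a sketch that combines the online catalog dictionary with committed-only output and a time window; the dates are placeholders.

begin
  DBMS_LOGMNR.start_logmnr(
    starttime => to_date('2024-06-01 09:00:00', 'YYYY-MM-DD HH24:MI:SS'),
    endtime   => to_date('2024-06-01 10:00:00', 'YYYY-MM-DD HH24:MI:SS'),
    options   => DBMS_LOGMNR.DICT_FROM_ONLINE_CATALOG + DBMS_LOGMNR.COMMITTED_DATA_ONLY);
end;
/
-- For an SCN-based window, the current SCN can be read from v$database:
select current_scn from v$database;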

6. Query the results of analysis

select * from v$logmnr_contents;

SEG_NAME: object name

SCN: SCN number

SQL_REDO: the SQL of the operation

SQL_UNDO: the SQL that reverses the operation

TIMESTAMP: the time the SQL was issued

OPERATION: the type of SQL (INSERT, UPDATE, DELETE, DDL, etc.)

SESSION#: the session that issued the operation
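In practice the view is usually filtered rather than read in full. A sketch follows; SCOTT.EMP is a hypothetical table, and note that the values compared in the WHERE clause must be uppercase (see the notes below).

select scn, timestamp, operation, sql_redo, sql_undo
  from v$logmnr_contents
 where seg_owner = 'SCOTT'        -- owner/object values must be uppercase
   and seg_name  = 'EMP'
   and operation in ('INSERT', 'UPDATE', 'DELETE');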

7. End the analysis

exec DBMS_LOGMNR.end_logmnr;  -- once the analysis is ended, the results in v$logmnr_contents are no longer available, because the analysis information is held in the session's PGA.

Note:

1. The LogMiner analysis tool can only be used with Oracle 8 and later releases.

2. When querying v$logmnr_contents, the values compared in the WHERE condition (such as owner and object names) must be written in uppercase.

3. When using LogMiner to analyze redo log files generated by another database instance:

A: LogMiner must use the dictionary file generated by the database being analyzed, not a dictionary file generated by the database where LogMiner runs.

B: the character set of the database running LogMiner must be the same as that of the database being analyzed.

C: the platform of the database being analyzed must be the same as the platform of the database running LogMiner.

That covers how to understand LogMiner. I hope the above content is helpful and teaches you something new. If you found the article useful, feel free to share it so more people can see it.
