

Using LogMiner to mine archived logs and recover from DDL misoperations

2025-04-11 Update From: SLTechnology News&Howtos


Shulou(Shulou.com)06/01 Report--

Log mining: using archived logs to recover from DDL operations when the supplemental logging feature is not enabled.

In Oracle's log files, user objects such as tables are recorded by an object ID number rather than by name. The purpose of building a dictionary file is to let LogMiner translate those object IDs into familiar object names during analysis.

Before creating a dictionary file, make sure that the initialization parameter UTL_FILE_DIR of the database has been set correctly.

You can check this parameter with the command: show parameters utl_file_dir

SYS@jie> show parameters utl_file_dir

NAME          TYPE    VALUE
------------  ------  ---------------------
utl_file_dir  string  /home/oracle/logminer

This shows the current setting of the parameter. If it has no value, you must modify the database's initialization parameters so that utl_file_dir points to the path where you want to save the dictionary file. In this example, UTL_FILE_DIR=/home/oracle/logminer is used.

(Production databases generally have this option configured already.)
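If the parameter is not set, it can be changed as follows. UTL_FILE_DIR is a static parameter, so the change only takes effect after an instance restart; this is a minimal sketch, assuming the instance uses an spfile:

```sql
-- Set the dictionary directory (static parameter: requires restart)
alter system set utl_file_dir='/home/oracle/logminer' scope=spfile;

-- Restart the instance for the change to take effect
shutdown immediate
startup
```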

Create a dictionary file with the following statement:

begin
  sys.dbms_logmnr_d.build(
    dictionary_filename => 'dictionary.ora',
    dictionary_location => '/home/oracle/logminer');
end;
/

Here dictionary_filename is the name of the dictionary file to be generated (any name will do), and dictionary_location is the directory in which it is generated.

SYS@jie> begin sys.dbms_logmnr_d.build(dictionary_filename => 'dictionary.ora', dictionary_location => '/home/oracle/logminer'); end;
  2  /

PL/SQL procedure successfully completed.

The entire creation process may take from ten minutes to an hour, depending on the number of objects in the database and how busy it is. When it finishes, you will see a file called dictionary.ora in the /home/oracle/logminer directory.

[oracle@localhost logminer]$ pwd
/home/oracle/logminer
[oracle@localhost logminer]$ ls -rtl
total 40768
-rw-r--r-- 1 oracle oinstall 41742401 May 12 08:43 dictionary.ora

Second, select the log files to be analyzed

The number of online and archived log files can be very large. In the case of our company's database, a 200 MB log file is generated every 10 minutes, so the volume of data is enormous. It is therefore practically impossible to analyze all of the log files (you can, but make sure you have enough space and time, and are not worried about affecting database performance); usually you select only the logs from the time period you are interested in.

The operation of selecting a log file is as follows:

begin
  sys.dbms_logmnr.add_logfile(
    logfilename => '1_1947_937825099.dbf',
    options     => sys.dbms_logmnr.NEW);
end;
/

Only one file can be selected at a time. To add files, use the following example:

begin
  sys.dbms_logmnr.add_logfile(
    logfilename => '1_14848_937825099.dbf',
    options     => sys.dbms_logmnr.ADDFILE);
end;
/

To remove a file that has been selected or added, use REMOVEFILE:

begin
  sys.dbms_logmnr.add_logfile(
    logfilename => '1_14848_937825099.dbf',
    options     => sys.dbms_logmnr.REMOVEFILE);
end;
/

By repeating these calls, you can select all the files to be analyzed.

When testing, it is also convenient to use the following form:

exec dbms_logmnr.add_logfile('/u01/app/oracle/arch/1_36_937825099.dbf')
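The repeated calls above can also be combined into a single anonymous block. A minimal sketch, assuming three consecutive archived logs under a hypothetical /u01/app/oracle/arch directory:

```sql
-- Register several archived logs in one go:
-- the first call uses NEW to start a fresh list, the rest use ADDFILE
begin
  sys.dbms_logmnr.add_logfile(
    logfilename => '/u01/app/oracle/arch/1_100_937825099.dbf',
    options     => sys.dbms_logmnr.NEW);
  sys.dbms_logmnr.add_logfile(
    logfilename => '/u01/app/oracle/arch/1_101_937825099.dbf',
    options     => sys.dbms_logmnr.ADDFILE);
  sys.dbms_logmnr.add_logfile(
    logfilename => '/u01/app/oracle/arch/1_102_937825099.dbf',
    options     => sys.dbms_logmnr.ADDFILE);
end;
/
```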

Third, perform the analysis

After selecting all the files that need to be analyzed, execute the following command to start the analysis:

begin
  sys.dbms_logmnr.start_logmnr(dictfilename => '/home/oracle/logminer/dictionary.ora');
end;
/

Note that dictionary.ora here is the dictionary file you created earlier, specified with its full path.

Depending on the amount of data in the selected files, the analysis may take several hours. Sometimes a DBA may not need all of the data in these log files.

So can we analyze only part of the data? Oracle allows you to analyze only the data in a specified time range or SCN range. The syntax is as follows:

begin
  sys.dbms_logmnr.start_logmnr(
    dictfilename => '/home/oracle/logminer/dictionary.ora',
    starttime    => to_date('01-Aug-2001 08:30:00', 'DD-MON-YYYY HH:MI:SS'),
    endtime      => to_date('01-Aug-2001 08:45:00', 'DD-MON-YYYY HH:MI:SS'));
end;
/

Or

begin
  sys.dbms_logmnr.start_logmnr(
    dictfilename => '/home/oracle/logminer/dictionary.ora',
    startscn     => 100,
    endscn       => 500);
end;
/
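To choose sensible boundaries for a time or SCN range, you can look them up beforehand. A sketch using standard dynamic performance views (the date literal is an assumed example):

```sql
-- Current SCN of the database, useful as an upper bound
select current_scn from v$database;

-- Which archived logs cover a given period, with their SCN ranges
select name, sequence#, first_change#, next_change#, first_time
from v$archived_log
where first_time >= to_date('01-Aug-2001 08:00:00', 'DD-MON-YYYY HH:MI:SS')
order by sequence#;
```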

After the analysis completes, the results can be queried from the view V$LOGMNR_CONTENTS, and we can use the contents of this view to achieve our goal.

For example, if I mistakenly deleted a row of data from the AIR table, I could find it with the following statement:

select username, session#, session_info, operation, sql_redo, sql_undo
from v$logmnr_contents
where table_name = 'AIR';
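Once the offending row is found, the SQL_UNDO column contains the compensating statement (for a mistaken DELETE, an INSERT that restores the row), which can be executed directly. Afterwards, release the LogMiner session. A sketch, with hypothetical column names and values:

```sql
-- SQL_UNDO for a mistaken DELETE is the compensating INSERT, e.g.:
-- insert into "SCOTT"."AIR"("ID","NAME") values ('1','test');
-- Run the statement from SQL_UNDO, then commit:
commit;

-- Release LogMiner resources when finished
exec dbms_logmnr.end_logmnr;
```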
