
How to analyze sqlldr

2025-01-25 Update From: SLTechnology News&Howtos

Shulou (Shulou.com) 05/31 Report --

Today I would like to talk with you about how to analyze sqlldr. Many people may not know much about it, so I have summarized the following content to help you understand it better; I hope you get something out of this article.

SQL*Loader is Oracle's data-loading tool, typically used to load data from operating-system files into an Oracle database. It is the loading method of choice for large data warehouses because it provides the fastest paths (DIRECT, PARALLEL).

Executing sqlldr (invoked from a shell on UNIX):

$ORACLE_HOME/bin/sqlldr dwh/cognos@ORA8 control=../tmp/load.ctl

$ORACLE_HOME/bin/sqlldr dwh/cognos@ORA8 control=../tmp/load.ctl direct=true log=...

Writing the control file load.ctl

1. Control file identification

2. The name of the data file to import

3. Append records to the target table

4. Specify the field delimiter

load data
infile '/query5/Ascential/data/month/mgmid.200304M'
into table DC_RPT_T_MGMID_200304M_30 append    -- APPEND appends; use REPLACE to overwrite
fields terminated by ','
(
Userid,
Svcnum,
Brand,
SvcPlan,
Busist,
HvcFlag,
MntFlag,
UserYear,
JoinMonth,
Callfee,
Callfeefav,
Tollfee,
Tollfeefav,
Calltimes,
Callduration,
Billduration,
Tollduration,
TotalFee,
GroupID
)

Import modes

There are four ways to load a table:

APPEND -- new rows are added after the data already in the table

INSERT -- loads an empty table only; if the table already has rows, sqlldr stops (this is the default)

REPLACE -- all existing rows are deleted first (with DELETE)

TRUNCATE -- same as REPLACE, except that the existing data is removed with a TRUNCATE statement
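The effect of the four modes on existing rows can be sketched in a few lines of Python (this only illustrates the semantics; it is not sqlldr itself, and the row values are made up):

```python
def load(table_rows, new_rows, mode):
    """Mimic the effect of sqlldr's four load modes on an in-memory 'table'."""
    if mode == "APPEND":
        return table_rows + new_rows      # keep old rows, add new ones at the end
    if mode == "INSERT":
        if table_rows:                    # INSERT (the default) only loads empty tables
            raise RuntimeError("table is not empty; sqlldr would stop")
        return list(new_rows)
    if mode in ("REPLACE", "TRUNCATE"):   # both wipe the old rows first
        return list(new_rows)             # (REPLACE via DELETE, TRUNCATE via TRUNCATE)
    raise ValueError("unknown mode: " + mode)

print(load(["old"], ["new"], "APPEND"))   # ['old', 'new']
print(load(["old"], ["new"], "REPLACE"))  # ['new']
```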

Import date field

LOAD DATA
INFILE 'zlx.TXT'
APPEND INTO TABLE zlx
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(
ID,
L,
F,
M,
DATE1 date 'dd-mm-yyyy'
)
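The mask 'dd-mm-yyyy' describes how the date text in the data file is laid out. Its effect corresponds to the following strptime format (a Python illustration, not part of sqlldr; the sample value is made up):

```python
from datetime import datetime

# sqlldr date mask 'dd-mm-yyyy' corresponds to strptime format '%d-%m-%Y'
raw = "25-01-2005"                            # a sample DATE1 field value
parsed = datetime.strptime(raw, "%d-%m-%Y")
print(parsed.year, parsed.month, parsed.day)  # 2005 1 25
```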

SQL*Loader: importing Excel data into Oracle

1. Save all the files needed for the SQL*Loader input to the C: drive, and edit the control file input.ctl with Notepad. Its contents are as follows:

load data                      -- 1. control file identification
infile 'test.txt'              -- 2. the data file to import is test.txt
append into table test         -- 3. append records to table test
fields terminated by X'09'     -- 4. fields are terminated by a tab (ASCII 09)
(id, username, password, sj)   -- the order of the corresponding columns

2. There is another way: save the Excel file as CSV (comma delimited) (*.csv), and have the control file use a comma as the separator instead:

LOAD DATA
INFILE 'car.csv'
APPEND INTO TABLE t_car_temp
FIELDS TERMINATED BY ','
(phoneno, vip_car)
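For readers who prefer to script the conversion, the same comma-separated form can be produced from tab-delimited output with Python's csv module (the sample rows and values here are made up):

```python
import csv
import io

# Two sample rows as Excel would save them in tab-delimited form
tab_data = "13800000000\tY\n13900000001\tN\n"

rows = list(csv.reader(io.StringIO(tab_data), delimiter="\t"))
out = io.StringIO()
csv.writer(out, lineterminator="\n").writerows(rows)  # rewrite comma-separated
print(out.getvalue())
```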

Import data directly in the control file

1. Contents of the control file test.ctl

LOAD DATA
INFILE *
BADFILE 'C:\Documents and Settings\Jackey\Desktop\WMCOUNTRY.BAD'
INSERT INTO TABLE EMCCOUNTRY
FIELDS TERMINATED BY ';' OPTIONALLY ENCLOSED BY '"'
(
COUNTRYID,
COUNTRYCODE,
COUNTRYNAME,
CONTINENTID,
MAPID,
CREATETIME DATE 'MM/DD/YYYY HH24:MI:SS',
LASTMODIFIEDTIME DATE 'MM/DD/YYYY HH24:MI:SS'
)

BEGINDATA

1;"JP";"Japan";1;9;"09/16/2004 16:31:32";"09/16/2004 16:31:32"
2;"CN";"China";1;10;"09/16/2004 16:31:32";"09/16/2004 16:31:32"
3;"IN";"India";1;11;"09/16/2004 16:31:32";"09/16/2004 16:31:32"
4;"AU";"Australia";6;12;"09/16/2004 16:31:32";"09/16/2004 16:31:32"
5;"CA";"Canada";4;13;"09/16/2004 16:31:32";"09/16/2004 16:31:32"
6;"US";"United States";4;14;"09/16/2004 16:31:32";"09/16/2004 16:31:32"
7;"MX";"Mexico";4;15;"09/16/2004 16:31:32";"09/16/2004 16:31:32"
8;"GB";"United Kingdom";3;16;"09/16/2004 16:31:32";"09/16/2004 16:31:32"
9;"DE";"Germany";3;17;"09/16/2004 16:31:32";"09/16/2004 16:31:32"
10;"FR";"France";3;18;"09/16/2004 16:31:32";"09/16/2004 16:31:32"
11;"IT";"Italy";3;19;"09/16/2004 16:31:32";"09/16/2004 16:31:32"
12;"ES";"Spain";3;20;"09/16/2004 16:31:32";"09/16/2004 16:31:32"
13;"FI";"Finland";3;21;"09/16/2004 16:31:32";"09/16/2004 16:31:32"
14;"SE";"Sweden";3;22;"09/16/2004 16:31:32";"09/16/2004 16:31:32"
15;"IE";"Ireland";3;23;"09/16/2004 16:31:32";"09/16/2004 16:31:32"
16;"NL";"Netherlands";3;24;"09/16/2004 16:31:32";"09/16/2004 16:31:32"
17;"DK";"Denmark";3;25;"09/16/2004 16:31:32";"09/16/2004 16:31:32"
18;"BR";"Brazil";5;85;"09/30/2004 11:25:43";"09/30/2004 11:25:43"
19;"KR";"Korea, Republic of";1;88;"09/30/2004 11:25:43";"09/30/2004 11:25:43"
20;"NZ";"New Zealand";6;89;"09/30/2004 11:25:43";"09/30/2004 11:25:43"
21;"BE";"Belgium";3;79;"09/30/2004 11:25:43";"09/30/2004 11:25:43"
22;"AT";"Austria";3;78;"09/30/2004 11:25:43";"09/30/2004 11:25:43"
23;"NO";"Norway";3;82;"09/30/2004 11:25:43";"09/30/2004 11:25:43"
24;"LU";"Luxembourg";3;81;"09/30/2004 11:25:43";"09/30/2004 11:25:43"
25;"PT";"Portugal";3;83;"09/30/2004 11:25:43";"09/30/2004 11:25:43"
26;"GR";"Greece";3;80;"09/30/2004 11:25:43";"09/30/2004 11:25:43"
27;"IL";"Israel";1;86;"09/30/2004 11:25:43";"09/30/2004 11:25:43"
28;"CH";"Switzerland";3;84;"09/30/2004 11:25:43";"09/30/2004 11:25:43"
29;"A1";"Anonymous Proxy";0;0;"09/30/2004 11:25:43";"09/30/2004 11:25:43"
30;"A2";"Satellite Provider";0;0;"09/30/2004 11:25:43";"09/30/2004 11:25:43"
31;"AD";"Andorra";3;0;"09/30/2004 11:25:43";"09/30/2004 11:25:43"
32;"AE";"United Arab Emirates";1;0;"09/30/2004 11:25:43";"09/30/2004 11:25:43"
33;"AF";"Afghanistan";1;0;"09/30/2004 11:25:43";"09/30/2004 11:25:43"
34;"AG";"Antigua and Barbuda";7;0;"09/30/2004 11:25:43";"09/30/2004 11:25:43"
35;"AI";"Anguilla";7;0;"09/30/2004 11:25:43";"09/30/2004 11:25:43"
36;"AL";"Albania";3;0;"09/30/2004 11:25:43";"09/30/2004 11:25:43"
37;"AM";"Armenia";3;0;"09/30/2004 11:25:43";"09/30/2004 11:25:43"
38;"AN";"Netherlands Antilles";3;0;"09/30/2004 11:25:43";"09/30/2004 11:25:43"
39;"AO";"Angola";2;0;"09/30/2004 11:25:43";"09/30/2004 11:25:43"
40;"AP";"Asia/Pacific Region";2;0;"09/30/2004 11:25:43";"09/30/2004 11:25:43"
41;"AQ";"Antarctica";8;0;"09/30/2004 11:25:43";"09/30/2004 11:25:43"
42;"AR";"Argentina";5;0;"09/30/2004 11:25:43";"09/30/2004 11:25:43"
43;"AS";"American Samoa";6;0;"09/30/2004 11:25:43";"09/30/2004 11:25:43"
44;"AW";"Aruba";5;0;"09/30/2004 11:25:43";"09/30/2004 11:25:43"
45;"AZ";"Azerbaijan";1;0;"09/30/2004 11:25:43";"09/30/2004 11:25:43"
46;"BA";"Bosnia and Herzegovina";3;0;"09/30/2004 11:25:43";"09/30/2004 11:25:43"
47;"BB";"Barbados";5;0;"09/30/2004 11:25:43";"09/30/2004 11:25:43"
48;"BD";"Bangladesh";1;0;"09/30/2004 11:25:43";"09/30/2004 11:25:43"
49;"BF";"Burkina Faso";2;0;"09/30/2004 11:25:43";"09/30/2004 11:25:43"
50;"BG";"Bulgaria";3;0;"09/30/2004 11:25:43";"09/30/2004 11:25:43"
51;"BH";"Bahrain";1;0;"09/30/2004 11:25:43";"09/30/2004 11:25:43"
52;"BI";"Burundi";2;0;"09/30/2004 11:25:43";"09/30/2004 11:25:43"
53;"BJ";"Benin";2;0;"09/30/2004 11:25:43";"09/30/2004 11:25:43"
54;"BM";"Bermuda";4;0;"09/30/2004 11:25:43";"09/30/2004 11:25:43"
55;"BN";"Brunei Darussalam";1;0;"09/30/2004 11:25:43";"09/30/2004 11:25:43"
56;"BO";"Bolivia";5;0;"09/30/2004 11:25:43";"09/30/2004 11:25:43"
57;"BS";"Bahamas";7;0;"09/30/2004 11:25:43";"09/30/2004 11:25:43"
58;"BT";"Bhutan";1;0;"09/30/2004 11:25:43";"09/30/2004 11:25:43"
59;"BV";"Bouvet Island";5;0;"09/30/2004 11:25:43";"09/30/2004 11:25:43"
60;"BW";"Botswana";2;0;"09/30/2004 11:25:43";"09/30/2004 11:25:43"
61;"BY";"Belarus";3;0;"09/30/2004 11:25:43";"09/30/2004 11:25:43"

2. Execute the import command

C:\> sqlldr userid=system/manager control=test.ctl

Valid keywords:

userid -- ORACLE username/password

control -- control file name

log -- log file name

bad -- bad file name

data -- data file name

discard -- discard file name

discardmax -- number of discards to allow (default: all)

skip -- number of logical records to skip (default 0)

load -- number of logical records to load (default: all)

errors -- number of errors to allow (default 50)

rows -- number of rows in conventional path bind array or between direct path data saves (default: 64 for conventional path, all for direct path)

bindsize -- size of conventional path bind array in bytes (default 256000)

silent -- suppress messages during run (header, feedback, errors, discards, partitions)

direct -- use direct path (default FALSE)

parfile -- parameter file: name of file that contains parameter specifications

parallel -- do parallel load (default FALSE)

file -- file to allocate extents from

skip_unusable_indexes -- disallow/allow unusable indexes or index partitions (default FALSE)

skip_index_maintenance -- do not maintain indexes, mark affected indexes as unusable (default FALSE)

readsize -- size of read buffer (default 1048576)

external_table -- use external table for load; NOT_USED, GENERATE_ONLY, EXECUTE (default NOT_USED)

columnarrayrows -- number of rows for direct path column array (default 5000)

streamsize -- size of direct path stream buffer in bytes (default 256000)

multithreading -- use multithreading in direct path

resumable -- enable or disable resumable for current session (default FALSE)

resumable_name -- text string to help identify resumable statement

resumable_timeout -- wait time (in seconds) for RESUMABLE (default 7200)

date_cache -- size (in entries) of date conversion cache (default 1000)

PLEASE NOTE: command-line parameters may be specified either by position or by keyword. An example of the former is 'sqlldr scott/tiger foo'; an example of the latter is 'sqlldr control=foo userid=scott/tiger'. Parameters specified by position must come before, not after, parameters specified by keyword. For example, 'sqlldr scott/tiger control=foo logfile=log' is allowed, but 'sqlldr scott/tiger control=foo log' is not, even though the parameter 'log' is in the correct position.

OPTIONALLY ENCLOSED BY '"'

This removes the double quotation marks ("") enclosing a field in the data file. Without it, the quotes would be imported into the table together with the value.
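The behaviour is the same as a CSV quote character, which Python's csv module can demonstrate (the sample line is made up):

```python
import csv
import io

line = '20,Accounting,"Virginia,USA"\n'

# With the enclosure honoured, the quotes are stripped and the embedded
# comma stays inside the field:
row = next(csv.reader(io.StringIO(line), delimiter=",", quotechar='"'))
print(row)                       # ['20', 'Accounting', 'Virginia,USA']

# A naive split that ignores the enclosure keeps the quotes and breaks the field:
print(line.strip().split(","))   # ['20', 'Accounting', '"Virginia', 'USA"']
```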

Specify which column is not loaded

LOAD DATA
INFILE *
INTO TABLE DEPT
REPLACE
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(
DEPTNO,
FILLER_1 FILLER,    -- the "Something Not To Be Loaded" field below is not loaded
DNAME,
LOC
)
BEGINDATA
20,Something Not To Be Loaded,Accounting,"Virginia,USA"

Loading columns by POSITION

LOAD DATA
INFILE *
INTO TABLE DEPT
REPLACE
(
DEPTNO position(1:2),
DNAME position(*:16),    -- this field starts where the previous field ended
LOC position(*:29),
ENTIRE_LINE position(1:29)
)
BEGINDATA
10Accounting Virginia,USA
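POSITION(start:end) selects characters by offset, 1-based and inclusive at both ends. A small Python sketch of the extraction, using the sample record above (illustration only, not sqlldr itself):

```python
record = "10Accounting Virginia,USA"

def pos(rec, start, end):
    """Emulate sqlldr's POSITION(start:end): 1-based, inclusive on both ends."""
    return rec[start - 1:end]

print(pos(record, 1, 2))    # DEPTNO position(1:2) -> '10'
print(pos(record, 1, 29))   # ENTIRE_LINE position(1:29) re-reads the whole record
```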

Merging multiple physical lines into one record

LOAD DATA
INFILE *
concatenate 3    -- the keyword concatenate treats several physical lines as one logical record
INTO TABLE DEPT
REPLACE
FIELDS TERMINATED BY ','
(
DEPTNO,
DNAME "upper(:dname)",
LOC "upper(:loc)",
LAST_UPDATED date 'dd/mm/yyyy'
)
BEGINDATA
10,Sales,
Virginia,
1/5/2000

The three physical lines above are read as the single record 10,Sales,Virginia,1/5/2000. Alternatively, continueif last = ',' tells sqlldr to look for a comma at the end of each line and, when it finds one, append the next line to the current one.
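Both record-assembly rules can be sketched in Python (this only illustrates how the physical lines are combined; it is not sqlldr itself):

```python
physical = ["10,Sales,", "Virginia,", "1/5/2000"]

# concatenate 3: every 3 physical lines form one logical record
logical_fixed = ["".join(physical[i:i + 3]) for i in range(0, len(physical), 3)]
print(logical_fixed)   # ['10,Sales,Virginia,1/5/2000']

# continueif last = ',': a trailing comma means "append the next line"
logical_cont, buf = [], ""
for line in physical:
    buf += line
    if not line.endswith(","):
        logical_cont.append(buf)
        buf = ""
print(logical_cont)    # ['10,Sales,Virginia,1/5/2000']
```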

Loading the line number of each row

load data
infile *
into table t
replace
(
seqno RECNUM,    -- load the record number of each row
text position(1:1024)
)
BEGINDATA
fsdfasj
fasdjfasdfl

The seqno column of table t is filled in automatically: the first row gets 1, the second row gets 2, and so on.
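RECNUM simply numbers the records in the order they are read, as in this Python sketch (illustration only):

```python
data_lines = ["fsdfasj", "fasdjfasdfl"]

# seqno RECNUM: each record receives its 1-based position in the input
rows = [(seqno, text) for seqno, text in enumerate(data_lines, start=1)]
print(rows)   # [(1, 'fsdfasj'), (2, 'fasdjfasdfl')]
```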

Skipping data rows

The "SKIP n" option specifies how many rows of the data file to skip on import. For example:

OPTIONS (SKIP=5)
LOAD DATA
INFILE *
INTO TABLE load_positional_data
(
data1 POSITION(1:5),
data2 POSITION(6:15)
)
BEGINDATA
11111AAAAAAAAAA
22222BBBBBBBBBB
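The effect of SKIP n is just to discard the first n physical records before loading, as in this Python illustration (the header lines are made up):

```python
# Five made-up header lines followed by the two real records
lines = ["header %d" % i for i in range(5)] + ["11111AAAAAAAAAA", "22222BBBBBBBBBB"]

skip = 5                  # corresponds to SKIP 5
loaded = lines[skip:]     # the records actually handed to the loader
print(loaded)             # ['11111AAAAAAAAAA', '22222BBBBBBBBBB']
```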

Improve the performance of SQL*Loader

1) A simple and easily overlooked point is to load into a table that has no indexes and/or constraints (primary keys). Leaving them in place, even when the ROWS= parameter is used, can significantly degrade import performance.

2) Adding DIRECT=TRUE improves load performance, although in many situations this parameter cannot be used.

3) Specifying the UNRECOVERABLE option turns off database logging for the load. This option can only be used together with direct-path loads.

4) you can run multiple import tasks at the same time.
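For tip 4, one common pattern is to split the input file into chunks and start one sqlldr process per chunk. A hedged Python sketch (the file layout and the printed command line are illustrative; the commands are printed rather than executed, since they need a running database):

```python
import os
import tempfile

def split_for_parallel(path, n_chunks):
    """Split a data file into n roughly equal chunks of whole lines."""
    with open(path) as f:
        lines = f.readlines()
    step = -(-len(lines) // n_chunks)   # ceiling division
    chunks = []
    for i in range(0, len(lines), step):
        chunk_path = "%s.part%d" % (path, i // step)
        with open(chunk_path, "w") as out:
            out.writelines(lines[i:i + step])
        chunks.append(chunk_path)
    return chunks

# Build a small sample data file, split it, and print one load command per chunk
tmp = os.path.join(tempfile.mkdtemp(), "load.dat")
with open(tmp, "w") as f:
    f.writelines("%d,row%d\n" % (i, i) for i in range(10))

for part in split_for_parallel(tmp, 2):
    print("sqlldr dwh/cognos control=load.ctl data=%s direct=true parallel=true" % part)
```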

The difference between conventional and direct-path loads

A conventional-path load imports data through ordinary INSERT statements. A direct-path load (DIRECT=TRUE) bypasses most of that database logic and writes the data directly into the data files.

Having read the above, do you have a better understanding of how to analyze sqlldr? If you want to learn more, please follow the industry information channel. Thank you for your support.
