2025-03-26 Update From: SLTechnology News&Howtos
Shulou (Shulou.com) 06/01 Report --
Detailed explanation of a GoldenGate-to-Kafka configuration
Environment introduction:
- Source-side database version: 11.2.0.3
- Source-side OGG version: 12.2.0.1.1
- Target-side OGG version: 12.3.0.1.0 (Ggs_Adapters_Linux_x64; remember that the target-side OGG must be the "for Big Data" build)
- Kafka cluster
- Target database: GP
Source-side configuration:
1.1 Install the OGG software (the source-side OGG does not have to be version 12).
Configure MGR:
PORT 7810
DYNAMICPORTLIST 7811-7914
AUTORESTART EXTRACT dpe*, WAITMINUTES 1, RETRIES 5
AUTORESTART EXTRACT ext*, WAITMINUTES 1, RETRIES 5
PURGEOLDEXTRACTS /home/ogg/kafka_ogg/dirdat/kf*, USECHECKPOINTS, MINKEEPHOURS 6
Configure the ext extract process parameters:
EXTRACT extkaf
SETENV (NLS_LANG="AMERICAN_AMERICA.AL32UTF8")
SETENV (ORACLE_HOME=/u01/app/oracle/product/11.2.0/dbhome_1)
USERID goldengate@ogg, PASSWORD Golden_1230
-- GETUPDATEBEFORES
GETTRUNCATES
REPORTCOUNT EVERY 15 MINUTES, RATE
DISCARDFILE ./dirrpt/extkaf.dsc, APPEND, MEGABYTES 1024
-- THREADOPTIONS MAXCOMMITPROPAGATIONDELAY 90000 IOLATENCY 90000
DBOPTIONS ALLOWUNUSEDCOLUMN
-- WARNLONGTRANS 2h, CHECKINTERVAL 3m
EXTTRAIL ./dirdat/kf
-- TRANLOGOPTIONS CONVERTUCS2CLOBS
TRANLOGOPTIONS EXCLUDEUSER goldengate
TRANLOGOPTIONS DBLOGREADER
-- TRANLOGOPTIONS _noReadAhead
-- DYNAMICRESOLUTION
TABLE schema1.tablename1;
TABLE schema1.tablename2;
TABLE schema1.tablename3;
TABLE schema1.tablename4;
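The extract group and its local trail also have to be registered in GGSCI before this parameter file takes effect. A minimal sketch (group and trail names follow the parameter file above; run from the source-side OGG home):

```
GGSCI> DBLOGIN USERID goldengate@ogg, PASSWORD Golden_1230
GGSCI> ADD EXTRACT extkaf, TRANLOG, BEGIN NOW
GGSCI> ADD EXTTRAIL ./dirdat/kf, EXTRACT extkaf
GGSCI> START EXTRACT extkaf
```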
Configure the delivery (data-pump) process:
EXTRACT dpekaf
RMTHOST 172.31.31.10, MGRPORT 7810
PASSTHRU
NUMFILES 500
RMTTRAIL /home/ogg/kafka_ogg/dirdat/kf
TABLE schema1.tablename1;
TABLE schema1.tablename2;
TABLE schema1.tablename3;
TABLE schema1.tablename4;
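The pump group reads the local trail and writes the remote trail on the Kafka-side host. A sketch of the corresponding GGSCI registration (names as in the parameter file above):

```
GGSCI> ADD EXTRACT dpekaf, EXTTRAILSOURCE ./dirdat/kf
GGSCI> ADD RMTTRAIL /home/ogg/kafka_ogg/dirdat/kf, EXTRACT dpekaf
GGSCI> START EXTRACT dpekaf
```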
Destination-side configuration (GoldenGate for Big Data)
Configure MGR on the destination side:
PORT 7810
DYNAMICPORTLIST 7811-7914
AUTORESTART REPLICAT rep*, WAITMINUTES 1, RETRIES 5
PURGEOLDEXTRACTS /home/ogg/kafka_ogg/dirdat/kf*, USECHECKPOINTS, MINKEEPHOURS 6
Configure the replicat processes:
Replicat process 1:
REPLICAT repykaf1
-- SETENV (JAVA_HOME=/home/ogg/jdk1.8.0_111)
-- SETENV (JRE_HOME=/home/ogg/jdk1.8.0_111/jre)
-- SETENV (PATH=$JAVA_HOME/bin:$JRE_HOME/bin:$PATH)
-- SETENV (CLASSPATH=.:$JAVA_HOME/lib/dt.jar:$JAVA_HOME/lib/tools.jar:$JRE_HOME/lib)
-- SETENV (LD_LIBRARY_PATH=$JAVA_HOME/jre/lib/amd64/server:/home/ogg/kafka_ogg/lib)
-- GETENV (JAVA_HOME)
-- GETENV (JRE_HOME)
-- GETENV (CLASSPATH)
-- GETENV (LD_LIBRARY_PATH)
-- GETENV (PATH)
TARGETDB LIBFILE libggjava.so SET property=dirprm/kafka1.props
GETTRUNCATES
REPORTCOUNT EVERY 1 MINUTES, RATE
GROUPTRANSOPS 1000
TABLE schema1.tablename1;
Replicat process 2:
REPLICAT repykaf2
-- SETENV (JAVA_HOME=/home/ogg/jdk1.8.0_111)
-- SETENV (JRE_HOME=/home/ogg/jdk1.8.0_111/jre)
-- SETENV (PATH=$JAVA_HOME/bin:$JRE_HOME/bin:$PATH)
-- SETENV (CLASSPATH=.:$JAVA_HOME/lib/dt.jar:$JAVA_HOME/lib/tools.jar:$JRE_HOME/lib)
-- SETENV (LD_LIBRARY_PATH=$JAVA_HOME/jre/lib/amd64/server:/home/ogg/kafka_ogg/lib)
-- GETENV (JAVA_HOME)
-- GETENV (JRE_HOME)
-- GETENV (CLASSPATH)
-- GETENV (LD_LIBRARY_PATH)
-- GETENV (PATH)
TARGETDB LIBFILE libggjava.so SET property=dirprm/kafka2.props
GETTRUNCATES
REPORTCOUNT EVERY 1 MINUTES, RATE
GROUPTRANSOPS 1000
TABLE schema1.tablename2;
Replicat process 3:
REPLICAT repykaf3
-- SETENV (JAVA_HOME=/home/ogg/jdk1.8.0_111)
-- SETENV (JRE_HOME=/home/ogg/jdk1.8.0_111/jre)
-- SETENV (PATH=$JAVA_HOME/bin:$JRE_HOME/bin:$PATH)
-- SETENV (CLASSPATH=.:$JAVA_HOME/lib/dt.jar:$JAVA_HOME/lib/tools.jar:$JRE_HOME/lib)
-- SETENV (LD_LIBRARY_PATH=$JAVA_HOME/jre/lib/amd64/server:/home/ogg/kafka_ogg/lib)
-- GETENV (JAVA_HOME)
-- GETENV (JRE_HOME)
-- GETENV (CLASSPATH)
-- GETENV (LD_LIBRARY_PATH)
-- GETENV (PATH)
TARGETDB LIBFILE libggjava.so SET property=dirprm/kafka3.props
GETTRUNCATES
REPORTCOUNT EVERY 1 MINUTES, RATE
GROUPTRANSOPS 1000
TABLE schema1.tablename3;
Replicat process 4:
REPLICAT repykaf4
-- SETENV (JAVA_HOME=/home/ogg/jdk1.8.0_111)
-- SETENV (JRE_HOME=/home/ogg/jdk1.8.0_111/jre)
-- SETENV (PATH=$JAVA_HOME/bin:$JRE_HOME/bin:$PATH)
-- SETENV (CLASSPATH=.:$JAVA_HOME/lib/dt.jar:$JAVA_HOME/lib/tools.jar:$JRE_HOME/lib)
-- SETENV (LD_LIBRARY_PATH=$JAVA_HOME/jre/lib/amd64/server:/home/ogg/kafka_ogg/lib)
-- GETENV (JAVA_HOME)
-- GETENV (JRE_HOME)
-- GETENV (CLASSPATH)
-- GETENV (LD_LIBRARY_PATH)
-- GETENV (PATH)
TARGETDB LIBFILE libggjava.so SET property=dirprm/kafka4.props
GETTRUNCATES
REPORTCOUNT EVERY 1 MINUTES, RATE
GROUPTRANSOPS 1000
TABLE schema1.tablename4;
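On the destination side each replicat is registered against the remote trail written by the pump; since there is no target database, no checkpoint table is involved. A sketch for the first group (repeat for repykaf2 through repykaf4):

```
GGSCI> ADD REPLICAT repykaf1, EXTTRAIL /home/ogg/kafka_ogg/dirdat/kf
GGSCI> START REPLICAT repykaf1
```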
Configure the parameter files for Kafka:
My OGG installation directory is kafka_ogg. Under the unpacked OGG directory there is an AdapterExamples folder; copy the Kafka examples into dirprm:
cp /home/ogg/kafka_ogg/AdapterExamples/big-data/kafka/* /home/ogg/kafka_ogg/dirprm/
Edit:
vi custom_kafka_producer.properties
# bootstrap.servers=ip:port,ip:port
bootstrap.servers=172.31.31.10:6667,172.31.31.11:6667,172.31.31.12:6667,172.31.31.13:6667
acks=1
reconnect.backoff.ms=1000
compression.type=gzip
value.serializer=org.apache.kafka.common.serialization.ByteArraySerializer
key.serializer=org.apache.kafka.common.serialization.ByteArraySerializer
# 100 KB per partition
batch.size=102400
linger.ms=10000
max.request.size=10240000
send.buffer.bytes=10240000
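Because a malformed line in this file only surfaces when the producer starts, it can be worth sanity-checking the values beforehand. A minimal sketch in Python (the parser and the inline sample are illustrative, not part of OGG):

```python
def load_properties(text):
    """Parse Java-style .properties content into a dict, skipping blanks and # comments."""
    props = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        key, _, value = line.partition("=")  # split on the first '=' only
        props[key.strip()] = value.strip()
    return props

sample = """
# bootstrap.servers=ip:port,ip:port
bootstrap.servers=172.31.31.10:6667,172.31.31.11:6667
acks=1
batch.size=102400
linger.ms=10000
"""
props = load_properties(sample)
assert int(props["batch.size"]) == 102400
assert len(props["bootstrap.servers"].split(",")) == 2  # every broker entry parsed
```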
Edit:
kafka1.props
gg.handlerlist=kafkahandler
gg.handler.kafkahandler.type=kafka
gg.handler.kafkahandler.KafkaProducerConfigFile=custom_kafka_producer.properties
gg.handler.kafkahandler.topicMappingTemplate=topic-name
(write the name of the topic you created here)
gg.handler.kafkahandler.format=json
gg.handler.kafkahandler.BlockingSend=false
gg.handler.kafkahandler.includeTokens=false
gg.handler.kafkahandler.mode=tx
goldengate.userexit.timestamp=utc
goldengate.userexit.writers=javawriter
javawriter.stats.display=TRUE
javawriter.stats.full=TRUE
gg.log=log4j
gg.log.level=INFO
gg.report.time=30sec
gg.classpath=dirprm/:/home/ogg/kafka_ogg/ggjava/resources/lib/*:/usr/hdp/2.4.0.0-169/kafka/libs/*
javawriter.bootoptions=-Xmx512m -Xms32m -Djava.class.path=ggjava/ggjava.jar
goldengate.userexit.utf8mode=true
gg.handler.kafkahandler.keyMappingTemplate=HH
gg.handler.kafkahandler.format.includePrimaryKeys=true
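With gg.handler.kafkahandler.format=json, each change record arrives on the topic as a JSON document carrying the operation type and the before/after column images. A hedged consumer-side sketch in Python (the field names follow the default OGG for Big Data JSON formatter; the sample payload itself is made up for illustration):

```python
import json

# Illustrative payload shaped like the default OGG JSON formatter output.
sample = '''{
  "table": "SCHEMA1.TABLENAME1",
  "op_type": "U",
  "op_ts": "2019-01-01 00:00:00.000000",
  "before": {"ID": 1, "NAME": "old"},
  "after":  {"ID": 1, "NAME": "new"}
}'''

record = json.loads(sample)
if record["op_type"] == "U":  # I = insert, U = update, D = delete
    # keep only the columns whose values actually changed
    changed = {k: v for k, v in record["after"].items()
               if record["before"].get(k) != v}
print(changed)  # → {'NAME': 'new'}
```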
kafka2.props, kafka3.props, and kafka4.props are edited in the same way; each one should point at its own topic name.
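Since the remaining props files differ from kafka1.props essentially only in the topic they map to, they can be derived mechanically; a small illustrative Python sketch (the per-replicat topic naming scheme here is an assumption):

```python
def derive_props(base_text, n):
    """Derive kafkaN.props content from the kafka1.props template.

    Only the topic mapping is changed; the assumed convention is that
    replicat N writes to its own topic named topic-name<N>.
    """
    return base_text.replace("topicMappingTemplate=topic-name",
                             f"topicMappingTemplate=topic-name{n}")

base = "gg.handler.kafkahandler.topicMappingTemplate=topic-name"
print(derive_props(base, 2))
# → gg.handler.kafkahandler.topicMappingTemplate=topic-name2
```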