Qlik Attunity Replicate and Kafka Sync

lichenrichard · Jun 30, 2024

About This Presentation

This is an introduction to Attunity data replication technology based on Kafka.


Slide Content

Kafka and Replicate

What is Kafka
- Distributed message queue
- High throughput: designed for large numbers of small messages (default message size limit: 1 MB)
- Scalable: the cluster can be extended easily
- Persistent: messages are stored on disk, and replicas are kept on different servers
- Runs on Java (Linux, Windows, ...)
- Apache open source
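To make the scalability and persistence points concrete, here is a minimal sketch that creates a topic with several partitions and a replication factor greater than one. It assumes the kafka-python package and a hypothetical broker address and topic name; neither appears in the slides.

from kafka.admin import KafkaAdminClient, NewTopic

# Connect to one broker of the cluster (address is an assumption).
admin = KafkaAdminClient(bootstrap_servers="broker1:9092")

topic = NewTopic(
    name="replicate.demo",   # hypothetical topic name
    num_partitions=3,        # partitions spread load across brokers
    replication_factor=2,    # replicas are stored on different servers
)
admin.create_topics([topic])
admin.close()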

Kafka Terminology
- Broker: a Kafka server
- Cluster: multiple brokers form a cluster
- Producer: a Kafka client that sends messages to brokers
- Consumer: a Kafka client that receives messages from brokers
- Topic: a logical queue, divided into partitions
- Partition: contains Kafka messages, saved as a file on disk
- Message: any buffer (text/binary); the format should be coordinated between producers and consumers
- Message Key: optional, used for partitioning and compaction
- Offset: the position of a message in a partition, used by the consumer
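The following sketch shows most of these terms in code: a producer sends a keyed message to a topic, and a consumer reads it back, seeing the partition and offset of each message. It assumes the kafka-python client plus a hypothetical broker address and topic name.

from kafka import KafkaProducer, KafkaConsumer

BROKERS = "broker1:9092"   # assumed broker address
TOPIC = "replicate.demo"   # assumed topic name

# Producer: a client that sends messages (any bytes) to brokers;
# the message key drives partition assignment.
producer = KafkaProducer(bootstrap_servers=BROKERS)
producer.send(TOPIC, key=b"4", value=b'{"ID": "4", "NAME": "xxxxx"}')
producer.flush()

# Consumer: a client that receives messages and tracks its position via offsets.
consumer = KafkaConsumer(
    TOPIC,
    bootstrap_servers=BROKERS,
    auto_offset_reset="earliest",
    consumer_timeout_ms=5000,   # stop iterating when no new messages arrive
)
for msg in consumer:
    print(msg.partition, msg.offset, msg.key, msg.value)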

Kafka Architecture

Anatomy of a Topic

Kafka Target Endpoint
- Replicate is a Kafka producer: it connects to brokers and sends messages
- Each message represents a record
- Message format is JSON (UTF-8 encoded) or Avro
- Messages can have a key
- Messages can be compressed (snappy, gzip)
- Full Load and CDC are supported
- Supported Kafka versions: 0.8, 0.9, 0.10
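Replicate itself is the producer, so the sketch below is illustrative only: it mimics the behaviours listed above with kafka-python, namely a UTF-8 JSON value, an optional message key, and gzip compression. The topic name, broker address, and record contents are assumptions.

import json
from kafka import KafkaProducer

producer = KafkaProducer(
    bootstrap_servers="broker1:9092",
    compression_type="gzip",   # snappy is also supported
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
    key_serializer=lambda k: k.encode("utf-8") if k is not None else None,
)

record = {"ID": "4", "NAME": "yyyyy"}   # one message per source record
producer.send("replicate.demo", key="4", value=record)
producer.flush()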

Kafka – Metadata Messages
- Replicate produces messages to Kafka; each message contains a data record
- Metadata messages describe:
  - Replicate server name
  - Task name
  - Source schema and source table
  - Column information
  - Table version

(new) JSON Message Structure

{
  "magic": "atMSG",
  "type": "DT",
  "message": {
    "data": { "ID": "4", "NAME": "yyyyy" },
    "beforeData": { "ID": "4", "NAME": "xxxxx" },
    "headers": {
      "operation": "UPDATE",
      "changeSequence": "20170622101040000000000000000000005",
      "timestamp": "2017-06-22T10:10:40.000",
      "streamPosition": "00000000.0095508c.00000001.0000.02.0000:237.36778.16",
      "transactionId": "000000000000000000000000001B0007"
    }
  }
}

The "operation" values include REFRESH (full load) and INSERT, UPDATE, DELETE (CDC).
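A consumer-side sketch of unpacking this envelope, again assuming kafka-python and a hypothetical topic and broker: "type" distinguishes data messages, "data" holds the current image, "beforeData" the pre-update image, and "headers" carries the operation and stream position.

import json
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "replicate.demo",                  # assumed target topic
    bootstrap_servers="broker1:9092",  # assumed broker address
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
)

for msg in consumer:
    envelope = msg.value
    if envelope.get("type") != "DT":   # skip metadata / non-data messages
        continue
    body = envelope["message"]
    op = body["headers"]["operation"]  # REFRESH (full load) or INSERT/UPDATE/DELETE (CDC)
    after = body.get("data")
    before = body.get("beforeData")    # populated for UPDATE
    print(op, before, "->", after)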

(new) JSON Message Structure

Endpoint Configuration

Troubleshooting Connection Errors
- Test the connection to the Kafka broker
- Check the broker's address and port
- Verify that the port is accessible, for example by using telnet
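As an alternative to telnet, the small check below uses only the Python standard library; the host and port are placeholders. It confirms that the broker's address and port accept TCP connections, but it does not validate the Kafka protocol itself.

import socket

def broker_reachable(host: str, port: int, timeout: float = 5.0) -> bool:
    # Attempt a plain TCP connection to the broker's listener.
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

print(broker_reachable("broker1", 9092))   # hypothetical broker host and port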

Thank You 