@@ -12,8 +12,12 @@ Kafka Sink Connector Guide
   :depth: 2
   :class: singlecol

- Apache Kafka uses a sink connector to consume records from a topic and
- save the data to a datastore.
+
+ Overview
+ --------
+
+ The MongoDB Kafka Sink Connector consumes records from a Kafka topic and
+ saves the data to a MongoDB database.

This section of the guide covers the configuration settings necessary to
set up a Kafka Sink connector.
@@ -27,6 +31,33 @@ set up a Kafka Sink connector.
:manual:`use an index to support these queries
<tutorial/create-indexes-to-support-queries/>`.

+
+ Message Delivery Guarantee
+ --------------------------
+
+ The sink connector guarantees "at-least-once" message delivery by default.
+ If there is an error while processing data from a topic, the connector
+ retries the write.
+
+ An "exactly-once" message delivery guarantee can be achieved by using an
+ idempotent write, such as an insert or update keyed on a unique ``_id``
+ value. Configure the connector to ensure messages include a value for
+ the ``_id`` field.
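Why an ``_id``-keyed idempotent write makes at-least-once delivery safe can be shown with a toy sketch. This is not connector code — the dict stands in for a MongoDB collection, and ``apply_record`` is a hypothetical helper modeling an upsert:

```python
def apply_record(store: dict, record: dict) -> None:
    """Upsert keyed on _id: replaying the same record is a no-op."""
    store[record["_id"]] = {k: v for k, v in record.items() if k != "_id"}

store = {}
# Under at-least-once delivery, the consumer may see a record twice.
records = [
    {"_id": 1, "name": "a"},
    {"_id": 2, "name": "b"},
    {"_id": 1, "name": "a"},  # redelivered after a retry
]
for rec in records:
    apply_record(store, rec)

# The duplicate leaves the final state unchanged: each _id appears once.
```

Because the write is keyed on ``_id``, a retried record overwrites itself rather than producing a second document, which is what yields the effectively exactly-once result.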
+ .. note::
+
+    You can configure the :ref:`DocumentIdAdder post processor <config-document-id-adder>`
+    to define custom behavior for generating the value of the ``_id`` field.
+    By default, the sink connector uses the ``BsonOidStrategy``, which
+    generates a new :manual:`BSON ObjectId </reference/bson-types/#objectid>`
+    for the ``_id`` field if one does not exist.
+
+ The sink connector does not support the "at-most-once" guarantee.
+
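As an illustration, the ``_id`` handling above might appear in a sink connector properties file roughly as follows. This is a minimal sketch, not a complete configuration — the topic, URI, database, and collection values are placeholders:

```properties
connector.class=com.mongodb.kafka.connect.MongoSinkConnector
topics=exampleTopic
connection.uri=mongodb://localhost:27017
database=exampleDb
collection=exampleCollection
# BsonOidStrategy is the default; shown explicitly for clarity
document.id.strategy=com.mongodb.kafka.connect.sink.processor.id.strategy.BsonOidStrategy
```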
.. toctree::
   :titlesonly:
   :hidden: