
Commit b01cc8a

DOCSP-21884 updated config uri name (#123)
* updated config uri name (cherry picked from commit 425976871af14898b1e516781a86ca4b2e5edbe0)
1 parent f27a51a commit b01cc8a

File tree

8 files changed: +43 -43 lines changed


source/configuration/read.txt

Lines changed: 11 additions & 11 deletions
@@ -31,14 +31,14 @@ You can configure the following properties to read from MongoDB:
    * - Property name
      - Description
 
-   * - ``uri``
+   * - ``connection.uri``
      - **Required.**
        The connection string in the form
        ``mongodb://host:port/``. The ``host`` can be a hostname, IP
        address, or UNIX domain socket. If the connection string doesn't
        specify a ``port``, it uses the default MongoDB port, ``27017``.
 
-       You can append the other remaining read options to the ``uri``
+       You can append the other remaining read options to the ``connection.uri``
        setting. See :ref:`configure-input-uri`.
 
    * - ``database``
@@ -342,13 +342,13 @@ Change Streams
 
 .. _configure-input-uri:
 
-``uri`` Configuration Setting
------------------------------
+``connection.uri`` Configuration Setting
+----------------------------------------
 
-You can set all :ref:`spark-input-conf` via the read ``uri`` setting.
+You can set all :ref:`spark-input-conf` via the read ``connection.uri`` setting.
 
 For example, consider the following example which sets the read
-``uri`` setting via ``SparkConf``:
+``connection.uri`` setting via ``SparkConf``:
 
 .. note::
 
@@ -357,24 +357,24 @@ For example, consider the following example which sets the read
 
 .. code:: cfg
 
-   spark.mongodb.read.uri=mongodb://127.0.0.1/databaseName.collectionName?readPreference=primaryPreferred
+   spark.mongodb.read.connection.uri=mongodb://127.0.0.1/databaseName.collectionName?readPreference=primaryPreferred
 
 The configuration corresponds to the following separate configuration
 settings:
 
 .. code:: cfg
 
-   spark.mongodb.read.uri=mongodb://127.0.0.1/
+   spark.mongodb.read.connection.uri=mongodb://127.0.0.1/
    spark.mongodb.read.database=databaseName
    spark.mongodb.read.collection=collectionName
    spark.mongodb.read.readPreference.name=primaryPreferred
 
-If you specify a setting both in the ``uri`` and in a separate
-configuration, the ``uri`` setting overrides the separate
+If you specify a setting both in the ``connection.uri`` and in a separate
+configuration, the ``connection.uri`` setting overrides the separate
 setting. For example, given the following configuration, the
 database for the connection is ``foobar``:
 
 .. code:: cfg
 
-   spark.mongodb.read.uri=mongodb://127.0.0.1/foobar
+   spark.mongodb.read.connection.uri=mongodb://127.0.0.1/foobar
    spark.mongodb.read.database=bar
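
For context on what the renamed read-side key looks like in use, here is a minimal PySpark sketch. It assumes a local MongoDB deployment on ``127.0.0.1:27017`` and the 10.x ``mongo-spark-connector`` package on the session's classpath; the database and collection names are placeholders.

.. code-block:: python

   from pyspark.sql import SparkSession

   # Session-level configuration under the renamed key. Everything after
   # the "/" in the URI (database.collection plus query options such as
   # readPreference) rides along on connection.uri, per the docs above.
   spark = (
       SparkSession.builder
       .appName("readExample")
       .config(
           "spark.mongodb.read.connection.uri",
           "mongodb://127.0.0.1/databaseName.collectionName?readPreference=primaryPreferred",
       )
       .getOrCreate()
   )

   # The read picks up the session-level connection.uri setting.
   df = spark.read.format("mongodb").load()
   df.printSchema()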

source/configuration/write.txt

Lines changed: 11 additions & 11 deletions
@@ -31,7 +31,7 @@ The following options for writing to MongoDB are available:
    * - Property name
      - Description
 
-   * - ``uri``
+   * - ``connection.uri``
      - **Required.**
        The connection string in the form
        ``mongodb://host:port/``. The ``host`` can be a hostname, IP
@@ -40,7 +40,7 @@ The following options for writing to MongoDB are available:
 
        .. note::
 
-          The other remaining options may be appended to the ``uri``
+          The other remaining options may be appended to the ``connection.uri``
           setting. See :ref:`configure-output-uri`.
 
    * - ``database``
@@ -102,13 +102,13 @@ The following options for writing to MongoDB are available:
 
 .. _configure-output-uri:
 
-``uri`` Configuration Setting
------------------------------
+``connection.uri`` Configuration Setting
+----------------------------------------
 
-You can set all :ref:`spark-output-conf` via the write ``uri``.
+You can set all :ref:`spark-output-conf` via the write ``connection.uri``.
 
 For example, consider the following example which sets the write
-``uri`` setting via ``SparkConf``:
+``connection.uri`` setting via ``SparkConf``:
 
 .. note::
 
@@ -117,23 +117,23 @@ For example, consider the following example which sets the write
 
 .. code:: cfg
 
-   spark.mongodb.write.uri=mongodb://127.0.0.1/test.myCollection
+   spark.mongodb.write.connection.uri=mongodb://127.0.0.1/test.myCollection
 
 The configuration corresponds to the following separate configuration
 settings:
 
 .. code:: cfg
 
-   spark.mongodb.write.uri=mongodb://127.0.0.1/
+   spark.mongodb.write.connection.uri=mongodb://127.0.0.1/
    spark.mongodb.write.database=test
    spark.mongodb.write.collection=myCollection
 
-If you specify a setting both in the ``uri`` and in a separate
-configuration, the ``uri`` setting overrides the separate
+If you specify a setting both in the ``connection.uri`` and in a separate
+configuration, the ``connection.uri`` setting overrides the separate
 setting. For example, given the following configuration, the
 database for the connection is ``foobar``:
 
 .. code:: cfg
 
-   spark.mongodb.write.uri=mongodb://127.0.0.1/foobar
+   spark.mongodb.write.connection.uri=mongodb://127.0.0.1/foobar
    spark.mongodb.write.database=bar
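
The same rename applies when options are supplied per operation instead of on the session. Below is a sketch under the same local-deployment assumptions as the read example; per-operation keys drop the ``spark.mongodb.write.`` prefix (``connection.uri``, ``database``, ``collection``), which matches the 10.x connector's write options but is worth confirming against your connector version.

.. code-block:: python

   from pyspark.sql import SparkSession

   spark = SparkSession.builder.appName("writeExample").getOrCreate()
   df = spark.createDataFrame([("Bilbo", 50)], ["name", "age"])

   # Per-operation write options use the bare key names; the renamed
   # connection.uri key carries only the server address here, with the
   # target database and collection given as separate options.
   (df.write.format("mongodb")
       .mode("append")
       .option("connection.uri", "mongodb://127.0.0.1/")
       .option("database", "test")
       .option("collection", "myCollection")
       .save())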

source/includes/extracts-command-line.yaml

Lines changed: 8 additions & 8 deletions
@@ -17,11 +17,11 @@ content: |
 ---
 ref: list-configuration-explanation
 content: |
-  - The :ref:`spark.mongodb.read.uri <spark-input-conf>` specifies the
+  - The :ref:`spark.mongodb.read.connection.uri <spark-input-conf>` specifies the
     MongoDB server address (``127.0.0.1``), the database to connect
     (``test``), and the collection (``myCollection``) from which to read
     data, and the read preference.
-  - The :ref:`spark.mongodb.write.uri <spark-output-conf>` specifies the
+  - The :ref:`spark.mongodb.write.connection.uri <spark-output-conf>` specifies the
     MongoDB server address (``127.0.0.1``), the database to connect
     (``test``), and the collection (``myCollection``) to which to write
     data. Connects to port ``27017`` by default.
@@ -37,8 +37,8 @@ content: |
 
   .. code-block:: sh
 
-     ./bin/spark-shell --conf "spark.mongodb.read.uri=mongodb://127.0.0.1/test.myCollection?readPreference=primaryPreferred" \
-                       --conf "spark.mongodb.write.uri=mongodb://127.0.0.1/test.myCollection" \
+     ./bin/spark-shell --conf "spark.mongodb.read.connection.uri=mongodb://127.0.0.1/test.myCollection?readPreference=primaryPreferred" \
+                       --conf "spark.mongodb.write.connection.uri=mongodb://127.0.0.1/test.myCollection" \
                        --packages org.mongodb.spark:mongo-spark-connector:{+current-version+}
 
 .. include:: /includes/extracts/list-configuration-explanation.rst
@@ -54,8 +54,8 @@ content: |
 
   .. code-block:: sh
 
-     ./bin/pyspark --conf "spark.mongodb.read.uri=mongodb://127.0.0.1/test.myCollection?readPreference=primaryPreferred" \
-                   --conf "spark.mongodb.write.uri=mongodb://127.0.0.1/test.myCollection" \
+     ./bin/pyspark --conf "spark.mongodb.read.connection.uri=mongodb://127.0.0.1/test.myCollection?readPreference=primaryPreferred" \
+                   --conf "spark.mongodb.write.connection.uri=mongodb://127.0.0.1/test.myCollection" \
                    --packages org.mongodb.spark:mongo-spark-connector:{+current-version+}
 
 .. include:: /includes/extracts/list-configuration-explanation.rst
@@ -71,8 +71,8 @@ content: |
 
  .. code-block:: sh
 
-     ./bin/sparkR --conf "spark.mongodb.read.uri=mongodb://127.0.0.1/test.myCollection?readPreference=primaryPreferred" \
-                  --conf "spark.mongodb.write.uri=mongodb://127.0.0.1/test.myCollection" \
+     ./bin/sparkR --conf "spark.mongodb.read.connection.uri=mongodb://127.0.0.1/test.myCollection?readPreference=primaryPreferred" \
+                  --conf "spark.mongodb.write.connection.uri=mongodb://127.0.0.1/test.myCollection" \
                   --packages org.mongodb.spark:mongo-spark-connector:{+current-version+}
 
 .. include:: /includes/extracts/list-configuration-explanation.rst
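
After starting a shell with the renamed ``--conf`` flags, the active values can be checked from inside the shell. A quick sanity check from ``pyspark`` (``spark.conf.get`` is standard Spark API; the expected values assume the flags shown above):

.. code-block:: python

   # Inside a pyspark shell started with the --conf flags above:
   spark.conf.get("spark.mongodb.read.connection.uri")
   # 'mongodb://127.0.0.1/test.myCollection?readPreference=primaryPreferred'
   spark.conf.get("spark.mongodb.write.connection.uri")
   # 'mongodb://127.0.0.1/test.myCollection'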

source/java/api.txt

Lines changed: 4 additions & 4 deletions
@@ -48,21 +48,21 @@ Configuration
       SparkSession spark = SparkSession.builder()
           .master("local")
           .appName("MongoSparkConnectorIntro")
-          .config("spark.mongodb.read.uri", "mongodb://127.0.0.1/test.myCollection")
-          .config("spark.mongodb.write.uri", "mongodb://127.0.0.1/test.myCollection")
+          .config("spark.mongodb.read.connection.uri", "mongodb://127.0.0.1/test.myCollection")
+          .config("spark.mongodb.write.connection.uri", "mongodb://127.0.0.1/test.myCollection")
           .getOrCreate();
 
       // Application logic
 
     }
 }
 
-- The :ref:`spark.mongodb.read.uri <spark-input-conf>` specifies the
+- The :ref:`spark.mongodb.read.connection.uri <spark-input-conf>` specifies the
   MongoDB server address (``127.0.0.1``), the database to connect
   (``test``), and the collection (``myCollection``) from which to read
   data, and the read preference.
 
-- The :ref:`spark.mongodb.write.uri <spark-output-conf>` specifies the
+- The :ref:`spark.mongodb.write.connection.uri <spark-output-conf>` specifies the
   MongoDB server address (``127.0.0.1``), the database to connect
   (``test``), and the collection (``myCollection``) to which to write
   data.

source/python/api.txt

Lines changed: 4 additions & 4 deletions
@@ -19,8 +19,8 @@ Create a ``SparkSession`` Object
    ``spark`` by default. In a standalone Python application, you need
    to create your ``SparkSession`` object explicitly, as shown below.
 
-   If you specified the ``spark.mongodb.read.uri``
-   and ``spark.mongodb.write.uri`` configuration options when you
+   If you specified the ``spark.mongodb.read.connection.uri``
+   and ``spark.mongodb.write.connection.uri`` configuration options when you
    started ``pyspark``, the default ``SparkSession`` object uses them.
    If you'd rather create your own ``SparkSession`` object from within
    ``pyspark``, you can use ``SparkSession.builder`` and specify different
@@ -33,8 +33,8 @@ configuration options.
    my_spark = SparkSession \
        .builder \
        .appName("myApp") \
-       .config("spark.mongodb.read.uri", "mongodb://127.0.0.1/test.coll") \
-       .config("spark.mongodb.write.uri", "mongodb://127.0.0.1/test.coll") \
+       .config("spark.mongodb.read.connection.uri", "mongodb://127.0.0.1/test.coll") \
+       .config("spark.mongodb.write.connection.uri", "mongodb://127.0.0.1/test.coll") \
        .getOrCreate()
 
 You can use a ``SparkSession`` object to write data to MongoDB, read

source/python/read-from-mongodb.txt

Lines changed: 1 addition & 1 deletion
@@ -1,6 +1,6 @@
 You can create a Spark DataFrame to hold data from the MongoDB
 collection specified in the
-:ref:`spark.mongodb.read.uri <pyspark-shell>` option which your
+:ref:`spark.mongodb.read.connection.uri <pyspark-shell>` option which your
 ``SparkSession`` object is using.
 
 .. include:: /includes/example-load-dataframe.rst
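
The included ``example-load-dataframe.rst`` is not part of this commit; for context, the load under the renamed option looks roughly like the sketch below, assuming a ``pyspark`` shell started with ``spark.mongodb.read.connection.uri`` set as in the shell examples earlier in this commit.

.. code-block:: python

   # Reads from the database/collection named in
   # spark.mongodb.read.connection.uri on the active SparkSession.
   df = spark.read.format("mongodb").load()
   df.printSchema()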

source/python/write-to-mongodb.txt

Lines changed: 2 additions & 2 deletions
@@ -9,15 +9,15 @@ a list of tuples containing names and ages, and a list of column names:
         ("Dwalin", 169), ("Oin", 167), ("Gloin", 158), ("Fili", 82), ("Bombur", None)], ["name", "age"])
 
 Write the ``people`` DataFrame to the MongoDB database and collection
-specified in the :ref:`spark.mongodb.write.uri<pyspark-shell>` option
+specified in the :ref:`spark.mongodb.write.connection.uri<pyspark-shell>` option
 by using the ``write`` method:
 
 .. code-block:: python
 
    people.write.format("mongodb").mode("append").save()
 
 The above operation writes to the MongoDB database and collection
-specified in the :ref:`spark.mongodb.write.uri<pyspark-shell>` option
+specified in the :ref:`spark.mongodb.write.connection.uri<pyspark-shell>` option
 when you connect to the ``pyspark`` shell.
 
 To read the contents of the DataFrame, use the ``show()`` method.
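
To close the loop on that last sentence, a read-back sketch under the same shell assumptions:

.. code-block:: python

   # Read back the collection just written and display its rows.
   df = spark.read.format("mongodb").load()
   df.show()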

source/scala/api.txt

Lines changed: 2 additions & 2 deletions
@@ -69,8 +69,8 @@ Configuration
     val spark = SparkSession.builder()
       .master("local")
       .appName("MongoSparkConnectorIntro")
-      .config("spark.mongodb.read.uri", "mongodb://127.0.0.1/test.myCollection")
-      .config("spark.mongodb.write.uri", "mongodb://127.0.0.1/test.myCollection")
+      .config("spark.mongodb.read.connection.uri", "mongodb://127.0.0.1/test.myCollection")
+      .config("spark.mongodb.write.connection.uri", "mongodb://127.0.0.1/test.myCollection")
       .getOrCreate()
 
   }
