@@ -6,7 +6,7 @@ title: Accessing OpenStack Swift from Spark
 Spark's support for Hadoop InputFormat allows it to process data in OpenStack Swift using the
 same URI formats as in Hadoop. You can specify a path in Swift as input through a
 URI of the form <code>swift://container.PROVIDER/path</code>. You will also need to set your
-Swift security credentials, through <code>core-sites.xml</code> or via
+Swift security credentials, through <code>core-site.xml</code> or via
 <code>SparkContext.hadoopConfiguration</code>.
 Current Swift driver requires Swift to use Keystone authentication method.
 
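Once credentials are configured, a Swift path can be read like any other Hadoop-compatible URI. A minimal sketch in <code>spark-shell</code>, assuming a hypothetical container <code>mycontainer</code> under a provider named <code>SparkTest</code>:

{% highlight scala %}
// Hypothetical example: "mycontainer" and "SparkTest" are placeholders.
// sc is the SparkContext already provided by spark-shell.
val rdd = sc.textFile("swift://mycontainer.SparkTest/data.txt")
rdd.count()
{% endhighlight %}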
@@ -37,7 +37,7 @@ For example, for Maven support, add the following to the <code>pom.xml</code> fi
 
 # Configuration Parameters
 
-Create <code>core-sites.xml</code> and place it inside <code>/spark/conf</code> directory.
+Create <code>core-site.xml</code> and place it inside the <code>/spark/conf</code> directory.
 There are two main categories of parameters that should to be configured: declaration of the
 Swift driver and the parameters that are required by Keystone.
 
@@ -100,7 +100,7 @@ contains a list of Keystone mandatory parameters. <code>PROVIDER</code> can be a
 </table>
 
 For example, assume <code>PROVIDER=SparkTest</code> and Keystone contains user <code>tester</code> with password <code>testing</code>
-defined for tenant <code>test</code>. Than <code>core-sites.xml</code> should include:
+defined for tenant <code>test</code>. Then <code>core-site.xml</code> should include:
 
 {% highlight xml %}
 <configuration>
@@ -146,7 +146,7 @@ Notice that
 <code>fs.swift.service.PROVIDER.tenant</code>,
 <code>fs.swift.service.PROVIDER.username</code>,
 <code>fs.swift.service.PROVIDER.password</code> contains sensitive information and keeping them in
-<code>core-sites.xml</code> is not always a good approach.
-We suggest to keep those parameters in <code>core-sites.xml</code> for testing purposes when running Spark
+<code>core-site.xml</code> is not always a good approach.
+We suggest keeping those parameters in <code>core-site.xml</code> for testing purposes when running Spark
 via <code>spark-shell</code>.
 For job submissions they should be provided via <code>sparkContext.hadoopConfiguration</code>.
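For job submissions, the same properties can be set programmatically on the Hadoop configuration instead of shipping them in <code>core-site.xml</code>. A minimal sketch, reusing the <code>SparkTest</code> example values from above (placeholders, not real credentials):

{% highlight scala %}
// Placeholder values; substitute your own Keystone credentials.
sc.hadoopConfiguration.set("fs.swift.service.SparkTest.tenant", "test")
sc.hadoopConfiguration.set("fs.swift.service.SparkTest.username", "tester")
sc.hadoopConfiguration.set("fs.swift.service.SparkTest.password", "testing")
{% endhighlight %}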