
Commit ae68d5b

Author: gp510
Updated naming throughout project per issue #117
Parent: e24583a

15 files changed: 71 additions, 70 deletions

.gitignore (1 addition, 1 deletion)

@@ -23,7 +23,7 @@ hs_err_pid*
 target/*
 .idea/*
 
-kafka-connect-splunk/
+splunk-kafka-connect/
 pom.xml.versionsBackup
 .classpath
 .project

README.md (27 additions, 27 deletions)

@@ -17,25 +17,25 @@ A Kafka Connect Sink for Splunk features:
 
 ## Build
 
-1. Clone the repo from https://github.com/splunk/kafka-connect-splunk
+1. Clone the repo from https://github.com/splunk/splunk-kafka-connect
 2. Verify that Java8 JRE or JDK is installed.
 3. Run `bash build.sh`. The build script will download all dependencies and build the Splunk Kafka Connector.
 
-Note: The resulting "kafka-connect-splunk-*.tar.gz" package is self-contained. Bundled within it are the Kafka Connect framework, all 3rd party libraries, and the Splunk Kafka Connector.
+Note: The resulting "splunk-kafka-connect-*.tar.gz" package is self-contained. Bundled within it are the Kafka Connect framework, all 3rd party libraries, and the Splunk Kafka Connector.
 
 ## Quick Start
 
 1. [Start](https://kafka.apache.org/quickstart) your Kafka Cluster and confirm it is running.
 2. If this is a new install, create a test topic (eg: `perf`). Inject events into the topic. This can be done using [Kafka data-gen-app](https://github.com/dtregonning/kafka-data-gen) or the Kafka bundle [kafka-console-producer](https://kafka.apache.org/quickstart#quickstart_send).
-3. Untar the package created from the build script: `tar xzvf kafka-connect-splunk-*.tar.gz` (Default target location is /tmp/kafka-connect-splunk-build/kafka-connect-splunk).
-4. Navigate to kafka-connect-splunk directory `cd kafka-connect-splunk`.
+3. Untar the package created from the build script: `tar xzvf splunk-kafka-connect-*.tar.gz` (Default target location is /tmp/splunk-kafka-connect-build/splunk-kafka-connect).
+4. Navigate to splunk-kafka-connect directory `cd splunk-kafka-connect`.
 5. Adjust values for `bootstrap.servers` and `plugin.path` inside `config/connect-distributed-quickstart.properties` to fit your environment. Default values should work for experimentation.
 6. Run `./bin/connect-distributed.sh config/connect-distributed-quickstart.properties` to start Kafka Connect.
 7. Run the following command to create connector tasks. Adjust `topics` to set the topic, and `splunk.hec.token` to set your HEC token.
 
 ```
 curl localhost:8083/connectors -X POST -H "Content-Type: application/json" -d '{
-"name": "kafka-connect-splunk",
+"name": "splunk-kafka-connect",
 "config": {
 "connector.class": "com.splunk.kafka.connect.SplunkSinkConnector",
 "tasks.max": "3",
@@ -71,17 +71,17 @@ Note: The resulting "kafka-connect-splunk-*.tar.gz" package is self-contained. B
 # List active connectors
 curl http://localhost:8083/connectors
 
-# Get kafka-connect-splunk connector info
-curl http://localhost:8083/connectors/kafka-connect-splunk
+# Get splunk-kafka-connect connector info
+curl http://localhost:8083/connectors/splunk-kafka-connect
 
-# Get kafka-connect-splunk connector config info
-curl http://localhost:8083/connectors/kafka-connect-splunk/config
+# Get splunk-kafka-connect connector config info
+curl http://localhost:8083/connectors/splunk-kafka-connect/config
 
-# Delete kafka-connect-splunk connector
-curl http://localhost:8083/connectors/kafka-connect-splunk -X DELETE
+# Delete splunk-kafka-connect connector
+curl http://localhost:8083/connectors/splunk-kafka-connect -X DELETE
 
-# Get kafka-connect-splunk connector task info
-curl http://localhost:8083/connectors/kafka-connect-splunk/tasks
+# Get splunk-kafka-connect connector task info
+curl http://localhost:8083/connectors/splunk-kafka-connect/tasks
 ```
 
 See the [the Confluent doucumentation](https://docs.confluent.io/current/connect/managing.html#common-rest-examples) for additional REST examples.
@@ -98,11 +98,11 @@ Use the following connector deployment options:
 ### Connector in a dedicated Kafka Connect Cluster
 Running the Splunk Kafka Connector in a dedicated Kafka Connect Cluster is recommended. Isolating the Splunk connector from other Kafka connectors results in significant performance benefits in high throughput environments.
 
-1. Untar the **kafka-connect-splunk-*.tar.gz** package and navigate to the **kafka-connect-splunk** directory.
+1. Untar the **splunk-kafka-connect-*.tar.gz** package and navigate to the **splunk-kafka-connect** directory.
 
 ```
-tar xzvf kafka-connect-splunk-*.tar.gz
-cd kafka-connect-splunk
+tar xzvf splunk-kafka-connect-*.tar.gz
+cd splunk-kafka-connect
 ```
 
 2. Update config/connect-distributed.properties to match your environment.
@@ -118,26 +118,26 @@ Running the Splunk Kafka Connector in a dedicated Kafka Connect Cluster is recom
 > Note: The below topics should be created by Kafka Connect when deploying the Splunk Connector. If the Kafka Connect cluster **does not have permission** to create these topics, create these manually before starting Kafka Connect cluster.
 
 ```
-group.id=kafka-connect-splunk-hec-sink # consumer group id of Kafka Connect, which is used to form a Kafka Connect cluster
+group.id=splunk-kafka-connect-hec-sink # consumer group id of Kafka Connect, which is used to form a Kafka Connect cluster
 
-config.storage.topic=__kafka-connect-splunk-task-configs # kafka topic used to persistent connector task configurations
+config.storage.topic=__splunk-kafka-connect-task-configs # kafka topic used to persistent connector task configurations
 config.storage.replication.factor=3
 
-offset.storage.topic=__kafka-connect-splunk-offsets # kafka topic used to persistent task checkpoints
+offset.storage.topic=__splunk-kafka-connect-offsets # kafka topic used to persistent task checkpoints
 offset.storage.replication.factor=3
 offset.storage.partitions=25
 
-status.storage.topic=__kafka-connect-splunk-statuses # kafka topic used to persistent task statuses
+status.storage.topic=__splunk-kafka-connect-statuses # kafka topic used to persistent task statuses
 status.storage.replication.factor=3
 status.storage.partitions=5
 ```
 
-4. Deploy/Copy the **kafka-connect-splunk** directory to all target hosts (virtual machines, physical machines or containers).
+4. Deploy/Copy the **splunk-kafka-connect** directory to all target hosts (virtual machines, physical machines or containers).
 5. Start Kafka Connect on all target hosts using the below commands:
 
 ```
-cd kafka-connect-splunk
-export KAFKA_HEAP_OPTS="-Xmx6G -Xms2G" && ./bin/connect-distributed.sh config/connect-distributed.properties >> kafka-connect-splunk.log 2>&1
+cd splunk-kafka-connect
+export KAFKA_HEAP_OPTS="-Xmx6G -Xms2G" && ./bin/connect-distributed.sh config/connect-distributed.properties >> splunk-kafka-connect.log 2>&1
 ```
 
 > Note: The **KAFKA\_HEAP\_OPTS** environment variable controls how much memory Kafka Connect can use. Set the **KAFKA\_HEAP\_OPTS** with the recommended value stated in the example above.
@@ -167,13 +167,13 @@ internal.value.converter.schemas.enable=false
 offset.flush.interval.ms=10000
 
 #Recommended
-group.id=kafka-connect-splunk-hec-sink
-config.storage.topic=__kafka-connect-splunk-task-configs
+group.id=splunk-kafka-connect-hec-sink
+config.storage.topic=__splunk-kafka-connect-task-configs
 config.storage.replication.factor=3
-offset.storage.topic=__kafka-connect-splunk-offsets
+offset.storage.topic=__splunk-kafka-connect-offsets
 offset.storage.replication.factor=3
 offset.storage.partitions=25
-status.storage.topic=__kafka-connect-splunk-statuses
+status.storage.topic=__splunk-kafka-connect-statuses
 status.storage.replication.factor=3
 status.storage.partitions=5
 
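The README changes above are purely mechanical name substitutions. After a rename like this, a recursive grep is a quick way to confirm no stale references remain. This is an illustrative sketch only: the temporary file tree it builds is hypothetical and not part of the commit.

```shell
# Sketch: verify a project-wide rename left no stale references behind.
# The sample tree below is fabricated for illustration.
old=kafka-connect-splunk
new=splunk-kafka-connect

dir=$(mktemp -d)
printf '4. Navigate to %s directory `cd %s`.\n' "$new" "$new" > "$dir/README.md"

# grep -rIq exits non-zero when no match exists, meaning the rename is complete
if grep -rIq "$old" "$dir"; then
  result="stale references remain"
else
  result="rename looks complete"
fi
echo "$result"
```

In a real checkout the same grep would run over the repository root (excluding `.git/`) rather than a temporary directory.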

build.sh (8 additions, 8 deletions)

@@ -2,7 +2,7 @@
 
 # variables
 kafkaversion=0.11.0.2
-builddir=/tmp/kafka-connect-splunk-build/kafka-connect-splunk
+builddir=/tmp/splunk-kafka-connect-build/splunk-kafka-connect
 
 githash=`git rev-parse --short HEAD 2>/dev/null | sed "s/\(.*\)/@\1/"` # get current git hash
 gitbranch=`git rev-parse --abbrev-ref HEAD` # get current git branch
@@ -15,7 +15,7 @@ if [[ -z "$gitversion" ]]; then
 jarversion=${gitversion}-SNAPSHOT
 fi
 
-packagename=kafka-connect-splunk-${gitversion}.tar.gz
+packagename=splunk-kafka-connect-${gitversion}.tar.gz
 
 # record git info in version.properties file under resources folder
 resourcedir='src/main/resources'
@@ -39,8 +39,8 @@ mvn versions:set -DnewVersion=${jarversion}
 mvn package > /dev/null
 
 # Copy over the pacakge
-echo "Copy over kafka-connect-splunk jar ..."
-cp target/kafka-connect-splunk-${jarversion}.jar ${builddir}/connectors
+echo "Copy over splunk-kafka-connect jar ..."
+cp target/splunk-kafka-connect-${jarversion}.jar ${builddir}/connectors
 cp config/* ${builddir}/config
 cp README.md ${builddir}
 cp LICENSE ${builddir}
@@ -64,19 +64,19 @@ echo "Clean up ..."
 
 # Package up
 echo "Package ${packagename} ..."
-cd .. && tar czf ${packagename} kafka-connect-splunk
+cd .. && tar czf ${packagename} splunk-kafka-connect
 
 echo "Copy package ${packagename} to ${curdir} ..."
 cp ${packagename} ${curdir}
 
-/bin/rm -rf kafka-connect-splunk ${packagename}
+/bin/rm -rf splunk-kafka-connect ${packagename}
 echo "Done with build & packaging"
 
 echo
 
 cat << EOP
-To run the kafka-connect-splunk, do the following steps:
-1. untar the package: tar xzf kafka-connect-splunk.tar.gz
+To run the splunk-kafka-connect, do the following steps:
+1. untar the package: tar xzf splunk-kafka-connect.tar.gz
 2. config config/connect-distributed.properties according to your env
 3. run: bash bin/connect-distributed.sh config/connect-distributed.properties
 4. Use Kafka Connect REST api to create data collection tasks
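The build.sh hunks above change the `builddir` and `packagename` variables so the tarball carries the new name. A minimal sketch of that naming scheme, with the version-fallback handling simplified relative to the real script (the `dev` fallback value here is an assumption for illustration):

```shell
# Simplified sketch of build.sh's package naming after the rename.
# gitversion would normally come from the git tag; empty shows the fallback.
gitversion=""
jarversion=${gitversion:-dev}-SNAPSHOT   # hypothetical fallback when no tag exists
packagename=splunk-kafka-connect-${jarversion}.tar.gz
echo "$packagename"   # prints splunk-kafka-connect-dev-SNAPSHOT.tar.gz
```

With a real tag such as `v1.0.0`, the same logic would yield `splunk-kafka-connect-v1.0.0-SNAPSHOT.tar.gz`.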

ci/Jenkinsfile (1 addition, 1 deletion)

@@ -8,7 +8,7 @@ def dockerReq = new DockerRequest(steps,
 env,
 [imageName: "repo.splunk.com/splunk/products/splact:1.0.9",
 userId: "10777",
-repoName: "git@github.com:splunk/kafka-connect-splunk.git",
+repoName: "git@github.com:splunk/splunk-kafka-connect.git",
 runner: "yarn",
 remotePath: "/build"])
 

ci/kafka_orca_gen.py (1 addition, 1 deletion)

@@ -6,7 +6,7 @@
 
 DATA_GEN_IMAGE = 'repo.splunk.com/kafka-data-gen:0.4'
 KAFKA_IMAGE = 'repo.splunk.com/kafka-cluster:0.12'
-KAFKA_CONNECT_IMAGE = 'repo.splunk.com/kafka-connect-splunk:1.8'
+KAFKA_CONNECT_IMAGE = 'repo.splunk.com/splunk-kafka-connect:1.8'
 KAFKA_BASTION_IMAGE = 'repo.splunk.com/kafka-bastion:1.8'
 
 

ci/run_bastion.sh (4 additions, 4 deletions)

@@ -1,15 +1,15 @@
 #!/bin/bash
 
 curdir=`pwd`
-git clone git@github.com:splunk/kafka-connect-splunk.git
+git clone git@github.com:splunk/splunk-kafka-connect.git
 branch=${KAFKA_CONNECT_BRANCH:-develop}
-cd kafka-connect-splunk && git checkout ${branch}
+cd splunk-kafka-connect && git checkout ${branch}
 
 duration=${SLEEP:-600}
 sleep ${duration}
 
-bash ${curdir}/kafka-connect-splunk/ci/fix_hosts.sh > /tmp/fixhosts 2>&1 &
+bash ${curdir}/splunk-kafka-connect/ci/fix_hosts.sh > /tmp/fixhosts 2>&1 &
 
-python ${curdir}/kafka-connect-splunk/ci/perf.py
+python ${curdir}/splunk-kafka-connect/ci/perf.py
 
 tail -f /dev/null

ci/run_kafka_connect.sh (6 additions, 6 deletions)

@@ -1,17 +1,17 @@
 #!/bin/bash
 
-# Checkout, build and run kafka-connect-splunk in the fight
+# Checkout, build and run splunk-kafka-connect in the fight
 
 curdir=`pwd`
-git clone git@github.com:splunk/kafka-connect-splunk.git
+git clone git@github.com:splunk/splunk-kafka-connect.git
 
 branch=${KAFKA_CONNECT_BRANCH:-develop}
 # build the package
-cd kafka-connect-splunk && git checkout ${branch} && bash build.sh
+cd splunk-kafka-connect && git checkout ${branch} && bash build.sh
 
 # untar the package
-tar xzf kafka-connect-splunk*.tar.gz
-cd kafka-connect-splunk
+tar xzf splunk-kafka-connect*.tar.gz
+cd splunk-kafka-connect
 
 sed -i"" "s@bootstrap.servers=.*@bootstrap.servers=$KAFKA_BOOTSTRAP_SERVERS@g" config/connect-distributed.properties
 
@@ -24,7 +24,7 @@ duration=${SLEEP:-300}
 sleep ${duration}
 
 echo "Run fix hosts"
-bash ${curdir}/kafka-connect-splunk/ci/fix_hosts.sh > /tmp/fixhosts 2>&1 &
+bash ${curdir}/splunk-kafka-connect/ci/fix_hosts.sh > /tmp/fixhosts 2>&1 &
 
 echo "Run proc monitor"
 cd proc_monitor

config/connect-distributed-quickstart.properties (4 additions, 4 deletions)

@@ -25,15 +25,15 @@ offset.flush.interval.ms=10000
 
 plugin.path=connectors/
 
-group.id=kafka-connect-splunk-hec-sink
-config.storage.topic=__kafka-connect-splunk-task-configs
+group.id=splunk-kafka-connect-hec-sink
+config.storage.topic=__splunk-kafka-connect-task-configs
 config.storage.replication.factor=1
 
-offset.storage.topic=__kafka-connect-splunk-offsets
+offset.storage.topic=__splunk-kafka-connect-offsets
 offset.storage.replication.factor=1
 offset.storage.partitions=1
 
-status.storage.topic=__kafka-connect-splunk-statuses
+status.storage.topic=__splunk-kafka-connect-statuses
 status.storage.replication.factor=1
 status.storage.partitions=1
 

config/connect-distributed.properties (4 additions, 4 deletions)

@@ -25,15 +25,15 @@ offset.flush.interval.ms=10000
 
 plugin.path=connectors/
 
-group.id=kafka-connect-splunk-hec-sink
-config.storage.topic=__kafka-connect-splunk-task-configs
+group.id=splunk-kafka-connect-hec-sink
+config.storage.topic=__splunk-kafka-connect-task-configs
 config.storage.replication.factor=3
 
-offset.storage.topic=__kafka-connect-splunk-offsets
+offset.storage.topic=__splunk-kafka-connect-offsets
 offset.storage.replication.factor=3
 offset.storage.partitions=25
 
-status.storage.topic=__kafka-connect-splunk-statuses
+status.storage.topic=__splunk-kafka-connect-statuses
 status.storage.replication.factor=3
 status.storage.partitions=5
 

dependency-reduced-pom.xml (3 additions, 3 deletions)

@@ -2,9 +2,9 @@
 <project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
 <modelVersion>4.0.0</modelVersion>
 <groupId>com.github.splunk.kafka.connect</groupId>
-<artifactId>kafka-connect-splunk</artifactId>
-<name>kafka-connect-splunk</name>
-<version>dev-SNAPSHOT</version>
+<artifactId>splunk-kafka-connect</artifactId>
+<name>splunk-kafka-connect</name>
+<version>v1.0.0-LAR</version>
 <build>
 <plugins>
 <plugin>
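Taken together, the hunks in this commit are a single old-name to new-name substitution applied across shell scripts, CI configs, properties files, and the Maven POM. A change set like that can be produced mechanically; the sketch below is hypothetical and is not necessarily how this commit was prepared.

```shell
# Hypothetical bulk-rename sketch over a fabricated sample file.
# A temp-file rewrite is used because `sed -i` syntax differs between GNU and BSD.
old=kafka-connect-splunk
new=splunk-kafka-connect

dir=$(mktemp -d)
printf 'KAFKA_CONNECT_IMAGE = %s\n' "repo.splunk.com/${old}:1.8" > "$dir/kafka_orca_gen.py"

# Substitute every occurrence of the old name in each file
for f in "$dir"/*; do
  sed "s/${old}/${new}/g" "$f" > "$f.tmp" && mv "$f.tmp" "$f"
done

grep "$new" "$dir/kafka_orca_gen.py"
```

The version bump in dependency-reduced-pom.xml (`dev-SNAPSHOT` to `v1.0.0-LAR`) would still need a separate, deliberate edit, since it is not part of the name substitution.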
