[SPARK-27687][SS] Rename Kafka consumer cache capacity conf and document caching #24590
Conversation
cc @srowen

Test build #105348 has finished for PR 24590 at commit
HeartSaVioR
left a comment
LGTM
Review comments (resolved) on external/kafka-0-10-sql/src/main/scala/org/apache/spark/sql/kafka010/package.scala
Test build #105375 has finished for PR 24590 at commit
retest this please

Test build #105377 has finished for PR 24590 at commit
dongjoon-hyun
left a comment
+1, LGTM. Thank you, @gaborgsomogyi, @srowen, and @HeartSaVioR.
Merged to master.
@gaborgsomogyi @SparkQA, can anyone help me with what's wrong here: https://stackoverflow.com/questions/58456939/how-to-set-spark-consumer-cache-to-fix-kafkaconsumer-cache-hitting-max-capaci
@BdLearnerr I've answered, but next time please use the mailing list.
What changes were proposed in this pull request?
Kafka-related Spark parameters have to start with spark.kafka. and not with spark.sql.. Because of this, I've renamed spark.sql.kafkaConsumerCache.capacity. Since Kafka consumer caching was not documented, I've added documentation for it as well.
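As a hedged illustration of the rename described above, a Spark application might set the consumer cache capacity as below. Note that the exact new key name (spark.kafka.consumer.cache.capacity), the app name, bootstrap servers, and topic are assumptions for the sketch, not quoted from this PR; verify the key against the Spark version you run.

```scala
import org.apache.spark.sql.SparkSession

// Sketch only: the key below assumes the renamed parameter follows the
// spark.kafka. prefix convention described in this PR
// (previously spark.sql.kafkaConsumerCache.capacity).
object KafkaCacheCapacityExample {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("kafka-consumer-cache-capacity")
      // Assumed renamed key; raises the soft limit on cached KafkaConsumer
      // instances pooled on the executors.
      .config("spark.kafka.consumer.cache.capacity", "128")
      .getOrCreate()

    // Read from Kafka as usual; the cache capacity applies transparently
    // to the consumers created for these partitions.
    val df = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "localhost:9092") // assumed broker
      .option("subscribe", "events")                        // assumed topic
      .load()
  }
}
```

The same setting can also be passed via spark-submit with --conf instead of hard-coding it in the application.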
How was this patch tested?
Existing and newly added unit tests, plus a manual webpage check.