[SPARK-20354][CORE][REST-API] When I request access to the 'http://ip:port/api/v1/applications' link, return 'sparkUser' is empty in REST API. #17656
Conversation
…ucceeded|failed|unknown]
…remove redundant description.
@squito do you have an opinion?
yeah, looks like the right change, I think it was just overlooked in https://issues.apache.org/jira/browse/SPARK-14245. I'd ask that you add an assertion to this unit test: https://github.com/apache/spark/blob/master/core/src/test/scala/org/apache/spark/ui/UISeleniumSuite.scala#L655 `(attempts(0) \ "sparkUser").extract[String] should not be ("")`. Aside, I'm not sure why SPARK-14245 introduced a different way of getting the user from what the history server uses, but in any case I think this change is right, to do the same thing as the UI, even if those internals should be changed.
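For reference, a minimal standalone sketch of the suggested check, not the actual UISeleniumSuite code: the object name, the `localhost:4040` endpoint, and the plain `assert` are illustrative assumptions; the real suite fetches the JSON through its own helpers and would write the check with ScalaTest Matchers exactly as quoted above.

```scala
// Standalone sketch (assumptions noted above): fetch /api/v1/applications,
// parse it with json4s, and check that the first attempt's sparkUser is
// non-empty. In UISeleniumSuite itself this would be:
//   (attempts(0) \ "sparkUser").extract[String] should not be ("")
import org.json4s._
import org.json4s.jackson.JsonMethods._

object SparkUserAssertionSketch {
  implicit val formats: Formats = DefaultFormats

  def main(args: Array[String]): Unit = {
    // Assumed endpoint of a locally running application UI.
    val body = scala.io.Source.fromURL("http://localhost:4040/api/v1/applications").mkString

    val apps     = parse(body).children                 // JSON array of applications
    val attempts = (apps.head \ "attempts").children    // attempts of the first application

    assert((attempts.head \ "sparkUser").extract[String].nonEmpty,
      "sparkUser should not be empty in the REST API response")
  }
}
```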
Thanks for catching this, somehow I totally missed it last year in my original PR. I don't think I was working on the API side as much then.
ok to test
Test build #75870 has finished for PR 17656 at commit
Merging to master / 2.2.
[SPARK-20354][CORE][REST-API] When I request access to the 'http://ip:port/api/v1/applications' link, return 'sparkUser' is empty in REST API.
## What changes were proposed in this pull request?
When I request the 'http://ip:port/api/v1/applications' link, the returned JSON always has an empty 'sparkUser' field. My Spark big data management platform needs that field to filter applications by the user who submitted them, for administration and queries, but because the value is currently empty this cannot be done: the REST API does not tell me which user submitted a given application. (A sketch of the intended filtering use case follows the JSON examples below.)
**Current returned JSON:**
[ {
"id" : "app-20170417152053-0000",
"name" : "KafkaWordCount",
"attempts" : [ {
"startTime" : "2017-04-17T07:20:51.395GMT",
"endTime" : "1969-12-31T23:59:59.999GMT",
"lastUpdated" : "2017-04-17T07:20:51.395GMT",
"duration" : 0,
**"sparkUser" : "",**
"completed" : false,
"endTimeEpoch" : -1,
"startTimeEpoch" : 1492413651395,
"lastUpdatedEpoch" : 1492413651395
} ]
} ]
**Returned JSON after the fix:**
[ {
"id" : "app-20170417154201-0000",
"name" : "KafkaWordCount",
"attempts" : [ {
"startTime" : "2017-04-17T07:41:57.335GMT",
"endTime" : "1969-12-31T23:59:59.999GMT",
"lastUpdated" : "2017-04-17T07:41:57.335GMT",
"duration" : 0,
**"sparkUser" : "mr",**
"completed" : false,
"startTimeEpoch" : 1492414917335,
"endTimeEpoch" : -1,
"lastUpdatedEpoch" : 1492414917335
} ]
} ]
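As a rough illustration of the filtering use case described above (a hedged sketch, not part of this PR): the object name and `targetUser` are assumptions, and the inlined JSON literal is the fixed example from this description; in practice the list would come from `http://ip:port/api/v1/applications`.

```scala
// Sketch: once sparkUser is populated, filter the applications list by
// submitting user. JSON is inlined here only to keep the example self-contained.
import org.json4s._
import org.json4s.jackson.JsonMethods._

object FilterAppsBySparkUser {
  implicit val formats: Formats = DefaultFormats

  val appsJson: String =
    """[ {
      |  "id" : "app-20170417154201-0000",
      |  "name" : "KafkaWordCount",
      |  "attempts" : [ { "sparkUser" : "mr", "completed" : false } ]
      |} ]""".stripMargin

  def main(args: Array[String]): Unit = {
    val targetUser = "mr"  // illustrative user to filter on

    // Keep only applications with at least one attempt submitted by targetUser.
    val byUser = parse(appsJson).children.filter { app =>
      (app \ "attempts").children.exists { attempt =>
        (attempt \ "sparkUser").extract[String] == targetUser
      }
    }

    byUser.foreach(app => println((app \ "id").extract[String]))  // app-20170417154201-0000
  }
}
```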
## How was this patch tested?
manual tests
Please review http://spark.apache.org/contributing.html before opening a pull request.
Author: 郭小龙 10207633 <[email protected]>
Author: guoxiaolong <[email protected]>
Author: guoxiaolongzte <[email protected]>
Closes #17656 from guoxiaolongzte/SPARK-20354.
(cherry picked from commit 1f81dda)
Signed-off-by: Marcelo Vanzin <[email protected]>