According to the docs, one can specify either an S3 URI or a local path as the code location when running processing jobs with SageMaker:

<img width="692" alt="Screen Shot 2019-12-12 at 11 29 32 AM" src="https://user-images.githubusercontent.com/8008982/70704632-b1356d80-1cd2-11ea-844e-21f75db5b454.png">

This does not seem to be the case. When you specify an S3 path, the SageMaker Python SDK still tries to create a default S3 bucket and upload the code to it.

I dug into the code base and can see that there is no check for whether the given path is an S3 path or not; the default behavior is always to expect a local path:

https://github.com/aws/sagemaker-python-sdk/blob/master/src/sagemaker/processing.py#L373

<img width="795" alt="Screen Shot 2019-12-12 at 11 35 28 AM" src="https://user-images.githubusercontent.com/8008982/70705029-85ff4e00-1cd3-11ea-98d7-21e0babb995b.png">

<img width="733" alt="Screen Shot 2019-12-12 at 11 38 09 AM" src="https://user-images.githubusercontent.com/8008982/70705213-e8584e80-1cd3-11ea-9584-acf21d30515a.png">

**Expected behavior:** Allow S3 paths as a code location, as stated in the docs.
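A fix could be as simple as inspecting the URI scheme before falling back to the local-upload path. A minimal sketch of such a check (the helper name `is_s3_uri` is hypothetical, not part of the SDK):

```python
from urllib.parse import urlparse


def is_s3_uri(path):
    """Return True if `path` is an S3 URI (s3://bucket/key), False for local paths."""
    return urlparse(path).scheme == "s3"


# The normalization logic could then branch: skip the upload for S3 URIs
# and pass them through to the processing job unchanged.
print(is_s3_uri("s3://my-bucket/code/preprocess.py"))  # True
print(is_s3_uri("/home/user/code/preprocess.py"))      # False
```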