After running this ScriptProcessor:

```python
processor.run(
    code='preprocess-spark.py',
    arguments=['s3_input_data', balanced_train_data_input,
               's3_output_data', balanced_train_data_tfidf_output],
    logs=True,
    wait=False,
)
```
and then running this code:

```python
preprocessing_job_description = processor.jobs[-1].describe()
processing_job_name = preprocessing_job_description['ProcessingJobName']
running_processor = sagemaker.processing.ProcessingJob.from_processing_name(
    processing_job_name=processing_job_name,
    sagemaker_session=sagemaker_session)
running_processor.describe()
```

I'm seeing this error:
```
---------------------------------------------------------------------------
KeyError                                  Traceback (most recent call last)
<ipython-input-55-5b16753426ad> in <module>()
      1 running_processor = sagemaker.processing.ProcessingJob.from_processing_name(processing_job_name=processing_job_name,
----> 2     sagemaker_session=sagemaker_session)
      3 running_processor.describe()

~/anaconda3/envs/python3/lib/python3.6/site-packages/sagemaker/processing.py in from_processing_name(cls, sagemaker_session, processing_job_name)
    668             outputs=[
    669                 ProcessingOutput(
--> 670                     source=job_desc["ProcessingOutputConfig"]["Outputs"][0]["S3Output"][
    671                         "LocalPath"
    672                     ],

KeyError: 'ProcessingOutputConfig'
```
Note: I'm not using ProcessingOutput because I am using a Spark Processor and writing directly to S3. (See aws/amazon-sagemaker-examples#994 for more info on why I'm doing this.)
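For reference, the failure can be reproduced without AWS at all, since it is just an unguarded dict lookup on the `DescribeProcessingJob` response: when a job defines no `ProcessingOutput` (because it writes straight to S3), the response carries no `ProcessingOutputConfig` key. A minimal sketch (the job-description dict below is made up, not a real API response):

```python
# Hypothetical DescribeProcessingJob response for a job with no
# ProcessingOutput configured -- the "ProcessingOutputConfig" key is absent.
job_desc = {
    "ProcessingJobName": "spark-preprocess-job",
    "ProcessingJobStatus": "InProgress",
}

# This mirrors the lookup in from_processing_name and fails the same way:
try:
    local_path = job_desc["ProcessingOutputConfig"]["Outputs"][0]["S3Output"]["LocalPath"]
except KeyError as e:
    print(f"KeyError: {e}")  # KeyError: 'ProcessingOutputConfig'

# A defensive lookup sidesteps the crash, yielding an empty output list:
outputs = job_desc.get("ProcessingOutputConfig", {}).get("Outputs", [])
print(outputs)  # []
```

As a possible workaround (an assumption about the intended use, not a confirmed fix), the job status can be polled through the low-level client instead of `from_processing_name`, e.g. `sagemaker_session.sagemaker_client.describe_processing_job(ProcessingJobName=processing_job_name)`, which returns the raw dict without touching the output config.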