@@ -45,8 +45,8 @@ def feature_processor(
 
     If the decorated function is executed without arguments then the decorated function's arguments
     are automatically loaded from the input data sources. Outputs are ingested to the output Feature
-    Group. If arguments are provided to this function, then arguments are not automatically
-    loaded (for testing).
+    Group. If arguments are provided to this function, then arguments are not automatically loaded
+    (for testing).
 
     Decorated functions must conform to the expected signature. Parameters: one parameter of type
     pyspark.sql.DataFrame for each DataSource in 'inputs'; followed by the optional parameters with
@@ -82,9 +82,9 @@ def transform(input_feature_group, input_csv):
         inputs (Sequence[Union[FeatureGroupDataSource, CSVDataSource, ParquetDataSource,
             BaseDataSource]]): A list of data sources.
         output (str): A Feature Group ARN to write results of this function to.
-        target_stores (Optional[list[str]], optional): A list containing at least one
-            of 'OnlineStore' or 'OfflineStore'. If unspecified, data will be ingested to the
-            enabled stores of the output feature group. Defaults to None.
+        target_stores (Optional[list[str]], optional): A list containing at least one of
+            'OnlineStore' or 'OfflineStore'. If unspecified, data will be ingested to the enabled
+            stores of the output feature group. Defaults to None.
         parameters (Optional[Dict[str, Union[str, Dict]]], optional): Parameters to be provided to
             the decorated function, available as the 'params' argument. Useful for parameterized
             functions. The params argument also contains the set of system provided parameters
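
For context, a usage sketch of the decorator this docstring describes (not part of the diff): it assumes the public `sagemaker.feature_store.feature_processor` imports and that the optional trailing parameters are named `params` and `spark`, as the truncated signature sentence above suggests. The ARNs, S3 URI, join key `record_id`, and parameter values are placeholders, not values from this change.

```python
from typing import Any, Dict

from pyspark.sql import DataFrame, SparkSession
from sagemaker.feature_store.feature_processor import (
    CSVDataSource,
    FeatureGroupDataSource,
    feature_processor,
)


@feature_processor(
    # One pyspark.sql.DataFrame parameter on the decorated function per data source below, in order.
    inputs=[
        FeatureGroupDataSource("arn:aws:sagemaker:us-west-2:123456789012:feature-group/input-fg"),
        CSVDataSource("s3://example-bucket/raw-data/"),
    ],
    output="arn:aws:sagemaker:us-west-2:123456789012:feature-group/output-fg",
    # Optional: restrict ingestion to one store; omit to use the output Feature Group's enabled stores.
    target_stores=["OfflineStore"],
    # Optional: surfaced to the decorated function through the 'params' argument.
    parameters={"lookback_window": "7d"},
)
def transform(
    input_feature_group: DataFrame,
    input_csv: DataFrame,
    params: Dict[str, Any],
    spark: SparkSession,
) -> DataFrame:
    # The returned DataFrame is ingested to the output Feature Group.
    return input_feature_group.join(input_csv, on="record_id", how="inner")
```

Calling `transform()` with no arguments loads the inputs automatically and ingests the result; passing DataFrames explicitly, e.g. `transform(df_a, df_b, ...)`, skips the automatic loading, which is the testing path the docstring mentions.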