tecton.SparkBatchConfig
Summary
Configuration class used to define a batch source with a user-defined Data Source Function.
The SparkBatchConfig class configures a batch source backed by a user-defined
Data Source Function. It is passed to a
BatchSource via its
batch_config parameter. Declaring this configuration class alone will not register a
Data Source; instead, pass an instance of it to a BatchSource.
Do not instantiate this class directly. Use the
tecton.spark_batch_config()
decorator instead.
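As a minimal sketch of the pattern described above (the data path, table columns, and source name are hypothetical, not part of this reference), a Data Source Function decorated with tecton.spark_batch_config and wired into a BatchSource might look like:

```python
from tecton import spark_batch_config, BatchSource

# supports_time_filtering=True means Tecton may pass a FilterContext
# so the function can push time filtering down to the source read.
@spark_batch_config(supports_time_filtering=True)
def users_data_source_function(spark, filter_context):
    # Hypothetical location and schema; replace with your own source.
    df = spark.read.parquet("s3://my-bucket/users")
    if filter_context:
        if filter_context.start_time:
            df = df.filter(df.timestamp >= filter_context.start_time)
        if filter_context.end_time:
            df = df.filter(df.timestamp < filter_context.end_time)
    return df

# The decorated function is passed as batch_config; declaring the
# BatchSource is what registers the Data Source.
users_batch = BatchSource(
    name="users_batch",
    batch_config=users_data_source_function,
)
```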
Attributes
data_delay: This attribute is the same as the data_delay parameter of the __init__ method. See below.
Methods
__init__(...)
Instantiates a new SparkBatchConfig.
Parameters
data_source_function (Union[Callable[[SparkSession], DataFrame], Callable[[SparkSession, FilterContext], DataFrame]]) – A user-defined Data Source Function that takes in a SparkSession and, if supports_time_filtering=True, an optional tecton.FilterContext. Returns a DataFrame.
data_delay (timedelta) – By default, incremental materialization jobs run immediately at the end of the batch schedule period. This parameter configures how long they wait after the end of the period before starting, typically to ensure that all data has landed. For example, if a feature view has a batch_schedule of 1 day and one of the data source inputs has a data_delay of 1 hour, then incremental materialization jobs will run at 01:00 UTC. (Default: datetime.timedelta(0))
supports_time_filtering (bool) – When set to True, the Data Source Function must take the filter_context parameter and implement time-filtering logic. supports_time_filtering must be set to True if <data source>.get_dataframe() is called with start_time or end_time. It must also be set to True when using tecton.declarative.FilteredSource with a Data Source in a FeatureView definition; the FeatureView will call the Data Source Function with a tecton.FilterContext whose start_time and end_time are set. (Default: False)
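To illustrate the time-filtering contract in isolation, here is a plain-Python sketch with a stub standing in for tecton.FilterContext (only the start_time and end_time fields named above are modeled; the half-open [start_time, end_time) interval is an assumption for illustration):

```python
from dataclasses import dataclass
from datetime import datetime
from typing import List, Optional, Tuple

@dataclass
class StubFilterContext:
    # Mirrors the two fields this reference says the FilterContext carries.
    start_time: Optional[datetime] = None
    end_time: Optional[datetime] = None

def apply_time_filter(
    rows: List[Tuple[datetime, str]],
    ctx: Optional[StubFilterContext],
) -> List[Tuple[datetime, str]]:
    """Keep rows whose timestamp falls in [start_time, end_time).

    A Data Source Function with supports_time_filtering=True would apply
    the same logic to its DataFrame before returning it.
    """
    if ctx is None:
        return rows
    out = []
    for ts, payload in rows:
        if ctx.start_time is not None and ts < ctx.start_time:
            continue
        if ctx.end_time is not None and ts >= ctx.end_time:
            continue
        out.append((ts, payload))
    return out
```

When filter_context is None (or both bounds are unset), the function returns the source unfiltered, which matches the parameter being optional.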
Returns
A SparkBatchConfig instance.