great_expectations.datasource.types.batch_spec

Module Contents

Classes

BatchMarkers(*args, **kwargs)

A BatchMarkers is a special type of BatchSpec (so that it has a batch_fingerprint) but it generally does NOT require specific keys; instead it captures information about the output of a datasource's fetch process.

PandasDatasourceBatchSpec()

This is an abstract class and should not be instantiated. It's relevant for testing whether a subclass is allowed to be used with a given datasource.

SparkDFDatasourceBatchSpec()

This is an abstract class and should not be instantiated. It's relevant for testing whether a subclass is allowed to be used with a given datasource.

SqlAlchemyDatasourceBatchSpec()

This is an abstract class and should not be instantiated. It's relevant for testing whether a subclass is allowed to be used with a given datasource.

PathBatchSpec(*args, **kwargs)

This is an abstract class and should not be instantiated. It's relevant for testing whether a subclass is allowed to be used with a given datasource.

S3BatchSpec(*args, **kwargs)

This is an abstract class and should not be instantiated. It's relevant for testing whether a subclass is allowed to be used with a given datasource.

RuntimeDataBatchSpec(*args, **kwargs)

dict() -> new empty dictionary

great_expectations.datasource.types.batch_spec.logger
class great_expectations.datasource.types.batch_spec.BatchMarkers(*args, **kwargs)

Bases: great_expectations.core.id_dict.BatchSpec

A BatchMarkers is a special type of BatchSpec (so that it has a batch_fingerprint) but it generally does NOT require specific keys and instead captures information about the OUTPUT of a datasource’s fetch process, such as the timestamp at which a query was executed.

property ge_load_time(self)
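As a minimal sketch of the pattern described above (not the library's actual implementation), a BatchMarkers-style object is a dict-like BatchSpec that records output-side metadata such as a load timestamp and exposes it through a `ge_load_time` property. The class names and the validation rule below are illustrative assumptions:

```python
from datetime import datetime, timezone


class BatchSpecSketch(dict):
    """Illustrative stand-in for great_expectations.core.id_dict.BatchSpec."""


class BatchMarkersSketch(BatchSpecSketch):
    """Captures information about the OUTPUT of a fetch, e.g. a load timestamp."""

    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        # Assumed invariant for this sketch: a load timestamp must be present.
        if "ge_load_time" not in self:
            raise ValueError("BatchMarkers requires a ge_load_time key")

    @property
    def ge_load_time(self):
        # The property is a read-only view over the underlying dict key.
        return self.get("ge_load_time")


markers = BatchMarkersSketch(
    {"ge_load_time": datetime.now(timezone.utc).strftime("%Y%m%dT%H%M%S.%fZ")}
)
print(markers.ge_load_time)
```

Because the markers object is still a dict, it can be serialized and compared like any other spec while the property gives convenient attribute-style access.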
class great_expectations.datasource.types.batch_spec.PandasDatasourceBatchSpec

Bases: great_expectations.core.id_dict.BatchSpec

This is an abstract class and should not be instantiated. It's relevant for testing whether a subclass is allowed to be used with a given datasource.

class great_expectations.datasource.types.batch_spec.SparkDFDatasourceBatchSpec

Bases: great_expectations.core.id_dict.BatchSpec

This is an abstract class and should not be instantiated. It's relevant for testing whether a subclass is allowed to be used with a given datasource.

class great_expectations.datasource.types.batch_spec.SqlAlchemyDatasourceBatchSpec

Bases: great_expectations.core.id_dict.BatchSpec

This is an abstract class and should not be instantiated. It's relevant for testing whether a subclass is allowed to be used with a given datasource.

property limit(self)
property schema(self)
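The `limit` and `schema` properties follow the same dict-backed pattern: each is a thin read-only view over a key stored in the spec. A hypothetical sketch (class and key names are assumptions for illustration, not the library's API):

```python
class SqlAlchemyBatchSpecSketch(dict):
    """Sketch of a dict-backed batch spec whose properties read stored keys."""

    @property
    def limit(self):
        # Optional row cap applied when fetching the batch; None if unset.
        return self.get("limit")

    @property
    def schema(self):
        # Optional database schema qualifying the table; None if unset.
        return self.get("schema")


spec = SqlAlchemyBatchSpecSketch({"table_name": "events", "limit": 1000})
print(spec.limit, spec.schema)
```

Unset keys simply yield `None`, so callers can probe a spec without try/except around missing fields.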
class great_expectations.datasource.types.batch_spec.PathBatchSpec(*args, **kwargs)

Bases: great_expectations.datasource.types.batch_spec.PandasDatasourceBatchSpec, great_expectations.datasource.types.batch_spec.SparkDFDatasourceBatchSpec

This is an abstract class and should not be instantiated. It's relevant for testing whether a subclass is allowed to be used with a given datasource.

property path(self)
property reader_method(self)
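Note that PathBatchSpec inherits from both PandasDatasourceBatchSpec and SparkDFDatasourceBatchSpec, so a single isinstance check can establish that a path-based spec is usable by either engine. A self-contained sketch of that marker-mixin pattern (the names below are illustrative, not the library's):

```python
class BatchSpec(dict):
    """Illustrative dict-backed base class."""


class PandasMarker(BatchSpec):
    """Marker: spec usable by a pandas-backed engine."""


class SparkMarker(BatchSpec):
    """Marker: spec usable by a Spark-backed engine."""


class PathSpecSketch(PandasMarker, SparkMarker):
    """A path-based spec valid for both engines, via multiple inheritance."""

    @property
    def path(self):
        return self.get("path")

    @property
    def reader_method(self):
        return self.get("reader_method")


spec = PathSpecSketch({"path": "/data/file.csv", "reader_method": "read_csv"})
print(isinstance(spec, PandasMarker), isinstance(spec, SparkMarker))
```

This is why the marker classes exist despite being empty: they carry no behavior of their own, only type membership that downstream code can test.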
class great_expectations.datasource.types.batch_spec.S3BatchSpec(*args, **kwargs)

Bases: great_expectations.datasource.types.batch_spec.PandasDatasourceBatchSpec, great_expectations.datasource.types.batch_spec.SparkDFDatasourceBatchSpec

This is an abstract class and should not be instantiated. It's relevant for testing whether a subclass is allowed to be used with a given datasource.

property s3(self)
property reader_method(self)
class great_expectations.datasource.types.batch_spec.RuntimeDataBatchSpec(*args, **kwargs)

Bases: great_expectations.core.id_dict.BatchSpec

dict() -> new empty dictionary
dict(mapping) -> new dictionary initialized from a mapping object's (key, value) pairs
dict(iterable) -> new dictionary initialized as if via:
    d = {}
    for k, v in iterable:
        d[k] = v
dict(**kwargs) -> new dictionary initialized with the name=value pairs in the keyword argument list. For example: dict(one=1, two=2)

_id_ignore_keys
property batch_data(self)