Version: 1.3.7

PandasAzureBlobStorageDatasource

Signature

class great_expectations.datasource.fluent.PandasAzureBlobStorageDatasource(*, type: Literal['pandas_abs'] = 'pandas_abs', name: str, id: Optional[uuid.UUID] = None, assets: List[great_expectations.datasource.fluent.data_asset.path.file_asset.FileDataAsset] = [], azure_options: Dict[str, Union[great_expectations.datasource.fluent.config_str.ConfigStr, Any]] = {})

PandasAzureBlobStorageDatasource is a PandasDatasource that uses Azure Blob Storage as a data store.
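Below is a minimal sketch of creating this datasource through a Data Context. The entry point context.data_sources.add_pandas_abs, the account URL, and the credential reference are illustrative assumptions, not values taken from this page.

```python
import great_expectations as gx

context = gx.get_context()

# Hypothetical account URL and credential reference; azure_options values may
# use ConfigStr-style references (e.g. "${AZURE_CREDENTIAL}") that are resolved
# from config variables or environment variables.
datasource = context.data_sources.add_pandas_abs(
    name="my_abs_datasource",
    azure_options={
        "account_url": "https://my_account.blob.core.windows.net",
        "credential": "${AZURE_CREDENTIAL}",
    },
)
```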

Methods

Signature

add_csv_asset(name: str, *, id=None, order_by=None, batch_metadata=None, batch_definitions=None, connect_options=None, sep: typing.Optional[str] = None, delimiter: typing.Optional[str] = None, header: Union[int, Sequence[int], None, Literal['infer']] = 'infer', names: Union[Sequence[str], None] = None, index_col: Union[IndexLabel, Literal[False], None] = None, usecols: typing.Optional[typing.Union[int, str, typing.Sequence[int]]] = None, dtype: typing.Optional[dict] = None, engine: Union[CSVEngine, None] = None, true_values: typing.Optional[typing.List] = None, false_values: typing.Optional[typing.List] = None, skipinitialspace: bool = False, skiprows: typing.Optional[typing.Union[typing.Sequence[int], int]] = None, skipfooter: int = 0, nrows: typing.Optional[int] = None, na_values: Union[Sequence[str], None] = None, keep_default_na: bool = True, na_filter: bool = True, verbose: bool = False, skip_blank_lines: bool = True, parse_dates: Union[bool, Sequence[str], None] = None, infer_datetime_format: bool = None, keep_date_col: bool = False, date_format: typing.Optional[str] = None, dayfirst: bool = False, cache_dates: bool = True, iterator: bool = False, chunksize: typing.Optional[int] = None, compression: CompressionOptions = 'infer', thousands: typing.Optional[str] = None, decimal: str = '.', lineterminator: typing.Optional[str] = None, quotechar: str = '"', quoting: int = 0, doublequote: bool = True, escapechar: typing.Optional[str] = None, comment: typing.Optional[str] = None, encoding: typing.Optional[str] = None, encoding_errors: typing.Optional[str] = 'strict', dialect: typing.Optional[str] = None, on_bad_lines: str = 'error', delim_whitespace: bool = False, low_memory: bool = True, memory_map: bool = False, float_precision: Union[Literal['high', 'legacy'], None] = None, storage_options: Union[StorageOptions, None] = None, dtype_backend: DtypeBackend = None, **extra_data: typing.Any) → pydantic.BaseModel

Add a CSV asset to the datasource.
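A hedged sketch of adding a CSV asset to the datasource created above; the connect_options keys (abs_container, abs_name_starts_with) and the container/prefix values are assumptions for illustration.

```python
# Hypothetical container and prefix; reader options such as sep and header
# are passed through to pandas.read_csv.
csv_asset = datasource.add_csv_asset(
    name="taxi_csv",
    connect_options={
        "abs_container": "my-container",       # assumed ABS connect option
        "abs_name_starts_with": "data/taxi/",  # assumed ABS connect option
    },
    sep=",",
    header="infer",
)
```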

Signature

add_excel_asset(name: str, *, id=None, order_by=None, batch_metadata=None, batch_definitions=None, connect_options=None, sheet_name: typing.Optional[typing.Union[str, int, typing.List[typing.Union[int, str]]]] = 0, header: Union[int, Sequence[int], None] = 0, names: typing.Optional[typing.List[str]] = None, index_col: Union[int, Sequence[int], None] = None, usecols: typing.Optional[typing.Union[int, str, typing.Sequence[int]]] = None, dtype: typing.Optional[dict] = None, engine: Union[Literal['xlrd', 'openpyxl', 'odf', 'pyxlsb'], None] = None, true_values: Union[Iterable[str], None] = None, false_values: Union[Iterable[str], None] = None, skiprows: typing.Optional[typing.Union[typing.Sequence[int], int]] = None, nrows: typing.Optional[int] = None, na_values: typing.Any = None, keep_default_na: bool = True, na_filter: bool = True, verbose: bool = False, parse_dates: typing.Union[typing.List, typing.Dict, bool] = False, date_format: typing.Optional[str] = None, thousands: typing.Optional[str] = None, decimal: str = '.', comment: typing.Optional[str] = None, skipfooter: int = 0, storage_options: Union[StorageOptions, None] = None, dtype_backend: DtypeBackend = None, engine_kwargs: typing.Optional[typing.Dict] = None, **extra_data: typing.Any) → pydantic.BaseModel

Add an Excel asset to the datasource.

Signature

add_feather_asset(name: str, *, id=None, order_by=None, batch_metadata=None, batch_definitions=None, connect_options=None, columns: Union[Sequence[str], None] = None, use_threads: bool = True, storage_options: Union[StorageOptions, None] = None, dtype_backend: DtypeBackend = None, **extra_data: typing.Any) → pydantic.BaseModel

Add a Feather asset to the datasource.

Signature

add_fwf_asset(name: str, *, id=None, order_by=None, batch_metadata=None, batch_definitions=None, connect_options=None, colspecs: Union[Sequence[Tuple[int, int]], str, None] = 'infer', widths: Union[Sequence[int], None] = None, infer_nrows: int = 100, dtype_backend: DtypeBackend = None, kwargs: typing.Optional[dict] = None, **extra_data: typing.Any) → pydantic.BaseModel

Add a fixed-width file (FWF) asset to the datasource.

Signature

add_hdf_asset(name: str, *, id=None, order_by=None, batch_metadata=None, batch_definitions=None, connect_options=None, key: typing.Any = None, mode: str = 'r', errors: str = 'strict', where: typing.Optional[typing.Union[str, typing.List]] = None, start: typing.Optional[int] = None, stop: typing.Optional[int] = None, columns: typing.Optional[typing.List[str]] = None, iterator: bool = False, chunksize: typing.Optional[int] = None, kwargs: typing.Optional[dict] = None, **extra_data: typing.Any) → pydantic.BaseModel

Add an HDF asset to the datasource.

Signature

add_html_asset(name: str, *, id=None, order_by=None, batch_metadata=None, batch_definitions=None, connect_options=None, match: Union[str, Pattern] = '.+', flavor: typing.Optional[str] = None, header: Union[int, Sequence[int], None] = None, index_col: Union[int, Sequence[int], None] = None, skiprows: typing.Optional[typing.Union[typing.Sequence[int], int]] = None, attrs: typing.Optional[typing.Dict[str, str]] = None, parse_dates: bool = False, thousands: typing.Optional[str] = ',', encoding: typing.Optional[str] = None, decimal: str = '.', converters: typing.Optional[typing.Dict] = None, na_values: Union[Iterable[object], None] = None, keep_default_na: bool = True, displayed_only: bool = True, extract_links: Literal[None, 'header', 'footer', 'body', 'all'] = None, dtype_backend: DtypeBackend = None, storage_options: StorageOptions = None, **extra_data: typing.Any) → pydantic.BaseModel

Add an HTML asset to the datasource.

Signature

add_json_asset(name: str, *, id=None, order_by=None, batch_metadata=None, batch_definitions=None, connect_options=None, orient: typing.Optional[str] = None, typ: Literal['frame', 'series'] = 'frame', dtype: typing.Optional[dict] = None, convert_axes: typing.Optional[bool] = None, convert_dates: typing.Union[bool, typing.List[str]] = True, keep_default_dates: bool = True, precise_float: bool = False, date_unit: typing.Optional[str] = None, encoding: typing.Optional[str] = None, encoding_errors: typing.Optional[str] = 'strict', lines: bool = False, chunksize: typing.Optional[int] = None, compression: CompressionOptions = 'infer', nrows: typing.Optional[int] = None, storage_options: Union[StorageOptions, None] = None, dtype_backend: DtypeBackend = None, **extra_data: typing.Any) → pydantic.BaseModel

Add a JSON asset to the datasource.
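For newline-delimited JSON, the pandas pass-through options lines and orient control parsing. The sketch below assumes the same datasource as above and a hypothetical logs container.

```python
# lines=True reads one JSON object per line (NDJSON); orient="records" matches
# that layout. Container and asset names are illustrative.
json_asset = datasource.add_json_asset(
    name="event_logs",
    connect_options={"abs_container": "logs"},  # assumed ABS connect option
    lines=True,
    orient="records",
)
```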

Signature

add_orc_asset(name: str, *, id=None, order_by=None, batch_metadata=None, batch_definitions=None, connect_options=None, columns: typing.Optional[typing.List[str]] = None, dtype_backend: DtypeBackend = None, kwargs: typing.Optional[dict] = None, **extra_data: typing.Any) → pydantic.BaseModel

Add an ORC asset to the datasource.

Signature

add_parquet_asset(name: str, *, id=None, order_by=None, batch_metadata=None, batch_definitions=None, connect_options=None, engine: str = 'auto', columns: typing.Optional[typing.List[str]] = None, storage_options: Union[StorageOptions, None] = None, use_nullable_dtypes: bool = None, dtype_backend: DtypeBackend = None, kwargs: typing.Optional[dict] = None, **extra_data: typing.Any) → pydantic.BaseModel

Add a Parquet asset to the datasource.
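A sketch of adding a Parquet asset that narrows the read to selected columns; the container and column names are assumptions for illustration.

```python
# engine="auto" lets pandas choose pyarrow or fastparquet; columns restricts
# the read to the listed fields. Names below are illustrative.
parquet_asset = datasource.add_parquet_asset(
    name="trips_parquet",
    connect_options={"abs_container": "warehouse"},  # assumed ABS connect option
    columns=["pickup_datetime", "fare_amount"],
)
```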

Signature

add_pickle_asset(name: str, *, id=None, order_by=None, batch_metadata=None, batch_definitions=None, connect_options=None, compression: CompressionOptions = 'infer', storage_options: Union[StorageOptions, None] = None, **extra_data: typing.Any) → pydantic.BaseModel

Add a pickle asset to the datasource.

Signature

add_sas_asset(name: str, *, id=None, order_by=None, batch_metadata=None, batch_definitions=None, connect_options=None, format: typing.Optional[str] = None, index: typing.Optional[str] = None, encoding: typing.Optional[str] = None, chunksize: typing.Optional[int] = None, iterator: bool = False, compression: CompressionOptions = 'infer', **extra_data: typing.Any) → pydantic.BaseModel

Add a SAS asset to the datasource.

Signature

add_spss_asset(name: str, *, id=None, order_by=None, batch_metadata=None, batch_definitions=None, connect_options=None, usecols: typing.Optional[typing.Union[int, str, typing.Sequence[int]]] = None, convert_categoricals: bool = True, dtype_backend: DtypeBackend = None, **extra_data: typing.Any) → pydantic.BaseModel

Add an SPSS asset to the datasource.

Signature

add_stata_asset(name: str, *, id=None, order_by=None, batch_metadata=None, batch_definitions=None, connect_options=None, convert_dates: bool = True, convert_categoricals: bool = True, index_col: typing.Optional[str] = None, convert_missing: bool = False, preserve_dtypes: bool = True, columns: Union[Sequence[str], None] = None, order_categoricals: bool = True, chunksize: typing.Optional[int] = None, iterator: bool = False, compression: CompressionOptions = 'infer', storage_options: Union[StorageOptions, None] = None, **extra_data: typing.Any) → pydantic.BaseModel

Add a Stata asset to the datasource.

Signature

add_xml_asset(name: str, *, id=None, order_by=None, batch_metadata=None, batch_definitions=None, connect_options=None, xpath: str = './*', namespaces: typing.Optional[typing.Dict[str, str]] = None, elems_only: bool = False, attrs_only: bool = False, names: Union[Sequence[str], None] = None, dtype: typing.Optional[dict] = None, encoding: typing.Optional[str] = 'utf-8', stylesheet: Union[FilePath, None] = None, iterparse: typing.Optional[typing.Dict[str, typing.List[str]]] = None, compression: CompressionOptions = 'infer', storage_options: Union[StorageOptions, None] = None, dtype_backend: DtypeBackend = None, **extra_data: typing.Any) → pydantic.BaseModel

Add an XML asset to the datasource.

Signature

delete_asset(name: str) → None

Removes the DataAsset referred to by name from the datasource's internal list of available DataAsset objects.

Parameters

Name    Description
name    The name of the DataAsset to be deleted.

Signature

get_asset(name: str) → great_expectations.datasource.fluent.interfaces._DataAssetT

Returns the DataAsset referred to by name.

Parameters

Name    Description
name    The name of the DataAsset sought.

Returns

Type    Description
great_expectations.datasource.fluent.interfaces._DataAssetT    The DataAsset with the given name, if it exists; otherwise, an exception is raised.
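A short sketch of looking up and then removing an asset by name, using the two methods documented above; the asset name matches the hypothetical CSV asset added earlier.

```python
# Retrieve a previously added asset by name, then remove it from the datasource.
asset = datasource.get_asset(name="taxi_csv")
datasource.delete_asset(name="taxi_csv")
```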