PandasGoogleCloudStorageDatasource
class great_expectations.datasource.fluent.PandasGoogleCloudStorageDatasource(*, type: Literal['pandas_gcs'] = 'pandas_gcs', name: str, id: Optional[uuid.UUID] = None, assets: List[great_expectations.datasource.fluent.data_asset.path.file_asset.FileDataAsset] = [], bucket_or_name: str, gcs_options: Dict[str, Union[great_expectations.datasource.fluent.config_str.ConfigStr, Any]] = {})#
PandasGoogleCloudStorageDatasource is a PandasDatasource that uses Google Cloud Storage as a data store.
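A datasource of this type is typically created through a Data Context rather than instantiated directly. The sketch below assumes an existing GX project and credentials available in the environment; the datasource name, bucket, and project shown are hypothetical, and depending on your GX version the factory may be exposed as `context.sources.add_pandas_gcs` instead of `context.data_sources.add_pandas_gcs`.

```python
import great_expectations as gx

context = gx.get_context()

# Register a Pandas datasource backed by a GCS bucket.
# gcs_options is forwarded to google.cloud.storage.Client; with no explicit
# credentials given, Google's default application credentials are used.
datasource = context.data_sources.add_pandas_gcs(
    name="my_gcs_datasource",            # hypothetical datasource name
    bucket_or_name="my-bucket",          # hypothetical bucket
    gcs_options={"project": "my-project"},  # hypothetical GCP project
)
```

The returned `datasource` is the `PandasGoogleCloudStorageDatasource` instance on which the `add_*_asset` methods below are called.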
- add_csv_asset(name: str, *, id=None, order_by=None, batch_metadata=None, batch_definitions=None, connect_options=None, sep: typing.Optional[str] = None, delimiter: typing.Optional[str] = None, header: Union[int, Sequence[int], None, Literal['infer']] = 'infer', names: Union[Sequence[str], None] = None, index_col: Union[IndexLabel, Literal[False], None] = None, usecols: typing.Optional[typing.Union[int, str, typing.Sequence[int]]] = None, dtype: typing.Optional[dict] = None, engine: Union[CSVEngine, None] = None, true_values: typing.Optional[typing.List] = None, false_values: typing.Optional[typing.List] = None, skipinitialspace: bool = False, skiprows: typing.Optional[typing.Union[typing.Sequence[int], int]] = None, skipfooter: int = 0, nrows: typing.Optional[int] = None, na_values: Union[Sequence[str], None] = None, keep_default_na: bool = True, na_filter: bool = True, verbose: bool = False, skip_blank_lines: bool = True, parse_dates: Union[bool, Sequence[str], None] = None, infer_datetime_format: bool = None, keep_date_col: bool = False, date_format: typing.Optional[str] = None, dayfirst: bool = False, cache_dates: bool = True, iterator: bool = False, chunksize: typing.Optional[int] = None, compression: CompressionOptions = 'infer', thousands: typing.Optional[str] = None, decimal: str = '.', lineterminator: typing.Optional[str] = None, quotechar: str = '"', quoting: int = 0, doublequote: bool = True, escapechar: typing.Optional[str] = None, comment: typing.Optional[str] = None, encoding: typing.Optional[str] = None, encoding_errors: typing.Optional[str] = 'strict', dialect: typing.Optional[str] = None, on_bad_lines: str = 'error', delim_whitespace: bool = False, low_memory: bool = True, memory_map: bool = False, float_precision: Union[Literal['high', 'legacy'], None] = None, storage_options: Union[StorageOptions, None] = None, dtype_backend: DtypeBackend = None, **extra_data: typing.Any) -> pydantic.BaseModel #
Add a csv asset to the datasource.
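As a sketch, adding a CSV asset and reading a batch from it might look like the following. The asset name and object path are hypothetical; most keyword arguments of `add_csv_asset` are passed through to `pandas.read_csv`.

```python
# Assumes `datasource` is an existing PandasGoogleCloudStorageDatasource.
csv_asset = datasource.add_csv_asset(
    name="taxi_data",     # hypothetical asset name
    sep=",",
    header="infer",
)

# A batch definition maps objects in the bucket to batches, e.g. a single file:
batch_definition = csv_asset.add_batch_definition_path(
    name="single_file",
    path="data/yellow_tripdata_2024-01.csv",  # hypothetical object path
)
batch = batch_definition.get_batch()
print(batch.head())
```

The other `add_*_asset` methods below follow the same pattern, with keyword arguments forwarded to the corresponding `pandas.read_*` reader.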
- add_excel_asset(name: str, *, id=None, order_by=None, batch_metadata=None, batch_definitions=None, connect_options=None, sheet_name: typing.Optional[typing.Union[str, int, typing.List[typing.Union[int, str]]]] = 0, header: Union[int, Sequence[int], None] = 0, names: typing.Optional[typing.List[str]] = None, index_col: Union[int, Sequence[int], None] = None, usecols: typing.Optional[typing.Union[int, str, typing.Sequence[int]]] = None, dtype: typing.Optional[dict] = None, engine: Union[Literal['xlrd', 'openpyxl', 'odf', 'pyxlsb'], None] = None, true_values: Union[Iterable[str], None] = None, false_values: Union[Iterable[str], None] = None, skiprows: typing.Optional[typing.Union[typing.Sequence[int], int]] = None, nrows: typing.Optional[int] = None, na_values: typing.Any = None, keep_default_na: bool = True, na_filter: bool = True, verbose: bool = False, parse_dates: typing.Union[typing.List, typing.Dict, bool] = False, date_format: typing.Optional[str] = None, thousands: typing.Optional[str] = None, decimal: str = '.', comment: typing.Optional[str] = None, skipfooter: int = 0, storage_options: Union[StorageOptions, None] = None, dtype_backend: DtypeBackend = None, engine_kwargs: typing.Optional[typing.Dict] = None, **extra_data: typing.Any) -> pydantic.BaseModel #
Add an excel asset to the datasource.
- add_feather_asset(name: str, *, id=None, order_by=None, batch_metadata=None, batch_definitions=None, connect_options=None, columns: Union[Sequence[str], None] = None, use_threads: bool = True, storage_options: Union[StorageOptions, None] = None, dtype_backend: DtypeBackend = None, **extra_data: typing.Any) -> pydantic.BaseModel #
Add a feather asset to the datasource.
- add_fwf_asset(name: str, *, id=None, order_by=None, batch_metadata=None, batch_definitions=None, connect_options=None, colspecs: Union[Sequence[Tuple[int, int]], str, None] = 'infer', widths: Union[Sequence[int], None] = None, infer_nrows: int = 100, dtype_backend: DtypeBackend = None, kwargs: typing.Optional[dict] = None, **extra_data: typing.Any) -> pydantic.BaseModel #
Add a fwf asset to the datasource.
- add_hdf_asset(name: str, *, id=None, order_by=None, batch_metadata=None, batch_definitions=None, connect_options=None, key: typing.Any = None, mode: str = 'r', errors: str = 'strict', where: typing.Optional[typing.Union[str, typing.List]] = None, start: typing.Optional[int] = None, stop: typing.Optional[int] = None, columns: typing.Optional[typing.List[str]] = None, iterator: bool = False, chunksize: typing.Optional[int] = None, kwargs: typing.Optional[dict] = None, **extra_data: typing.Any) -> pydantic.BaseModel #
Add an hdf asset to the datasource.
- add_html_asset(name: str, *, id=None, order_by=None, batch_metadata=None, batch_definitions=None, connect_options=None, match: Union[str, Pattern] = '.+', flavor: typing.Optional[str] = None, header: Union[int, Sequence[int], None] = None, index_col: Union[int, Sequence[int], None] = None, skiprows: typing.Optional[typing.Union[typing.Sequence[int], int]] = None, attrs: typing.Optional[typing.Dict[str, str]] = None, parse_dates: bool = False, thousands: typing.Optional[str] = ',', encoding: typing.Optional[str] = None, decimal: str = '.', converters: typing.Optional[typing.Dict] = None, na_values: Union[Iterable[object], None] = None, keep_default_na: bool = True, displayed_only: bool = True, extract_links: Literal[None, 'header', 'footer', 'body', 'all'] = None, dtype_backend: DtypeBackend = None, storage_options: StorageOptions = None, **extra_data: typing.Any) -> pydantic.BaseModel #
Add an html asset to the datasource.
- add_json_asset(name: str, *, id=None, order_by=None, batch_metadata=None, batch_definitions=None, connect_options=None, orient: typing.Optional[str] = None, typ: Literal['frame', 'series'] = 'frame', dtype: typing.Optional[dict] = None, convert_axes: typing.Optional[bool] = None, convert_dates: typing.Union[bool, typing.List[str]] = True, keep_default_dates: bool = True, precise_float: bool = False, date_unit: typing.Optional[str] = None, encoding: typing.Optional[str] = None, encoding_errors: typing.Optional[str] = 'strict', lines: bool = False, chunksize: typing.Optional[int] = None, compression: CompressionOptions = 'infer', nrows: typing.Optional[int] = None, storage_options: Union[StorageOptions, None] = None, dtype_backend: DtypeBackend = None, **extra_data: typing.Any) -> pydantic.BaseModel #
Add a json asset to the datasource.
- add_orc_asset(name: str, *, id=None, order_by=None, batch_metadata=None, batch_definitions=None, connect_options=None, columns: typing.Optional[typing.List[str]] = None, dtype_backend: DtypeBackend = None, kwargs: typing.Optional[dict] = None, **extra_data: typing.Any) -> pydantic.BaseModel #
Add an orc asset to the datasource.
- add_parquet_asset(name: str, *, id=None, order_by=None, batch_metadata=None, batch_definitions=None, connect_options=None, engine: str = 'auto', columns: typing.Optional[typing.List[str]] = None, storage_options: Union[StorageOptions, None] = None, use_nullable_dtypes: bool = None, dtype_backend: DtypeBackend = None, kwargs: typing.Optional[dict] = None, **extra_data: typing.Any) -> pydantic.BaseModel #
Add a parquet asset to the datasource.
- add_pickle_asset(name: str, *, id=None, order_by=None, batch_metadata=None, batch_definitions=None, connect_options=None, compression: CompressionOptions = 'infer', storage_options: Union[StorageOptions, None] = None, **extra_data: typing.Any) -> pydantic.BaseModel #
Add a pickle asset to the datasource.
- add_sas_asset(name: str, *, id=None, order_by=None, batch_metadata=None, batch_definitions=None, connect_options=None, format: typing.Optional[str] = None, index: typing.Optional[str] = None, encoding: typing.Optional[str] = None, chunksize: typing.Optional[int] = None, iterator: bool = False, compression: CompressionOptions = 'infer', **extra_data: typing.Any) -> pydantic.BaseModel #
Add a sas asset to the datasource.
- add_spss_asset(name: str, *, id=None, order_by=None, batch_metadata=None, batch_definitions=None, connect_options=None, usecols: typing.Optional[typing.Union[int, str, typing.Sequence[int]]] = None, convert_categoricals: bool = True, dtype_backend: DtypeBackend = None, **extra_data: typing.Any) -> pydantic.BaseModel #
Add an spss asset to the datasource.
- add_stata_asset(name: str, *, id=None, order_by=None, batch_metadata=None, batch_definitions=None, connect_options=None, convert_dates: bool = True, convert_categoricals: bool = True, index_col: typing.Optional[str] = None, convert_missing: bool = False, preserve_dtypes: bool = True, columns: Union[Sequence[str], None] = None, order_categoricals: bool = True, chunksize: typing.Optional[int] = None, iterator: bool = False, compression: CompressionOptions = 'infer', storage_options: Union[StorageOptions, None] = None, **extra_data: typing.Any) -> pydantic.BaseModel #
Add a stata asset to the datasource.
- add_xml_asset(name: str, *, id=None, order_by=None, batch_metadata=None, batch_definitions=None, connect_options=None, xpath: str = './*', namespaces: typing.Optional[typing.Dict[str, str]] = None, elems_only: bool = False, attrs_only: bool = False, names: Union[Sequence[str], None] = None, dtype: typing.Optional[dict] = None, encoding: typing.Optional[str] = 'utf-8', stylesheet: Union[FilePath, None] = None, iterparse: typing.Optional[typing.Dict[str, typing.List[str]]] = None, compression: CompressionOptions = 'infer', storage_options: Union[StorageOptions, None] = None, dtype_backend: DtypeBackend = None, **extra_data: typing.Any) -> pydantic.BaseModel #
Add an xml asset to the datasource.
- delete_asset(name: str) -> None #
Removes the DataAsset referred to by name from the internal list of available DataAsset objects.
- Parameters
name – name of DataAsset to be deleted.
- get_asset(name: str) -> great_expectations.datasource.fluent.interfaces._DataAssetT #
Returns the DataAsset referred to by name.
- Parameters
name – name of DataAsset sought.
- Returns
_DataAssetT – the DataAsset object with the given name, if it exists; otherwise, an exception is raised.
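Looking up and removing assets by name can be sketched as follows; the asset name is hypothetical and assumed to have been registered earlier on the same datasource.

```python
# Assumes `datasource` already has an asset registered as "taxi_data" (hypothetical).
asset = datasource.get_asset(name="taxi_data")

# Remove the asset; a subsequent get_asset("taxi_data") would raise,
# since the name no longer refers to a registered DataAsset.
datasource.delete_asset(name="taxi_data")
```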