great_expectations.cli.datasource

Module Contents

Classes

SupportedDatabaseBackends()

Generic enumeration.

BaseDatasourceNewYamlHelper(datasource_type: DatasourceTypes, usage_stats_payload: dict, datasource_name: Optional[str] = None)

This base class defines the interface for helpers used in the datasource new flow.

FilesYamlHelper(datasource_type: DatasourceTypes, usage_stats_payload: dict, class_name: str, context_root_dir: str, datasource_name: Optional[str] = None)

The base class for pandas/spark helpers used in the datasource new flow.

PandasYamlHelper(context_root_dir: str, datasource_name: Optional[str] = None)

The base class for pandas/spark helpers used in the datasource new flow.

SparkYamlHelper(context_root_dir: str, datasource_name: Optional[str] = None)

The base class for pandas/spark helpers used in the datasource new flow.

SQLCredentialYamlHelper(usage_stats_payload: dict, datasource_name: Optional[str] = None, driver: str = '', port: Union[int, str] = 'YOUR_PORT', host: str = 'YOUR_HOST', username: str = 'YOUR_USERNAME', password: str = 'YOUR_PASSWORD', database: str = 'YOUR_DATABASE')

The base class for SQL helpers used in the datasource new flow.

MySQLCredentialYamlHelper(datasource_name: Optional[str])

The base class for SQL helpers used in the datasource new flow.

PostgresCredentialYamlHelper(datasource_name: Optional[str])

The base class for SQL helpers used in the datasource new flow.

RedshiftCredentialYamlHelper(datasource_name: Optional[str])

The base class for SQL helpers used in the datasource new flow.

SnowflakeAuthMethod()

Enum where members are also (and must be) ints

SnowflakeCredentialYamlHelper(datasource_name: Optional[str])

The base class for SQL helpers used in the datasource new flow.

BigqueryCredentialYamlHelper(datasource_name: Optional[str])

The base class for SQL helpers used in the datasource new flow.

ConnectionStringCredentialYamlHelper(datasource_name: Optional[str])

The base class for SQL helpers used in the datasource new flow.

Functions

datasource(ctx)

Datasource operations

datasource_new(ctx, name, jupyter)

Add a new Datasource to the data context.

delete_datasource(ctx, datasource)

Delete the datasource specified as an argument.

datasource_list(ctx)

List known Datasources.

_build_datasource_intro_string(datasources)

_datasource_new_flow(context: DataContext, usage_event_end: str, datasource_name: Optional[str] = None, jupyter: bool = True)

_get_sql_yaml_helper_class(selected_database: SupportedDatabaseBackends, datasource_name: Optional[str])

_prompt_for_execution_engine()

_get_files_helper(selection: str, context_root_dir: str, datasource_name: Optional[str] = None)

_prompt_user_for_database_backend()

_prompt_for_snowflake_auth_method()

_verify_sqlalchemy_dependent_modules()

sanitize_yaml_and_save_datasource(context: DataContext, datasource_yaml: str, overwrite_existing: bool = False)

A convenience function used in notebooks to help users save secrets.

check_if_datasource_name_exists(context: DataContext, datasource_name: str)

Check if a Datasource name already exists in the on-disk version of the given DataContext and, if so, raise an error.

great_expectations.cli.datasource.logger
great_expectations.cli.datasource.sqlalchemy
great_expectations.cli.datasource.yaml
great_expectations.cli.datasource.default_flow_style = False
class great_expectations.cli.datasource.SupportedDatabaseBackends

Bases: enum.Enum

Generic enumeration.

Derive from this class to define new enumerations.

MYSQL = MySQL
POSTGRES = Postgres
REDSHIFT = Redshift
SNOWFLAKE = Snowflake
BIGQUERY = BigQuery
OTHER = other - Do you have a working SQLAlchemy connection string?
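The member values above are the human-readable strings shown to the user, so a backend prompt can be built simply by enumerating the members. A minimal, self-contained sketch of that pattern (the Backends class and build_menu helper below are illustrative stand-ins, not part of the module):

```python
from enum import Enum

class Backends(Enum):
    # Illustrative mirror of SupportedDatabaseBackends: values are the
    # strings presented in the CLI prompt.
    MYSQL = "MySQL"
    POSTGRES = "Postgres"
    REDSHIFT = "Redshift"
    SNOWFLAKE = "Snowflake"
    BIGQUERY = "BigQuery"
    OTHER = "other - Do you have a working SQLAlchemy connection string?"

def build_menu(backends) -> list:
    """Number the enum members 1..n, the way a CLI prompt would."""
    return [f"{i}. {b.value}" for i, b in enumerate(backends, start=1)]

menu = build_menu(Backends)
```

Iterating an Enum class yields members in definition order, which keeps the prompt numbering stable.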
great_expectations.cli.datasource.datasource(ctx)

Datasource operations

great_expectations.cli.datasource.datasource_new(ctx, name, jupyter)

Add a new Datasource to the data context.

great_expectations.cli.datasource.delete_datasource(ctx, datasource)

Delete the datasource specified as an argument.

great_expectations.cli.datasource.datasource_list(ctx)

List known Datasources.

great_expectations.cli.datasource._build_datasource_intro_string(datasources)
great_expectations.cli.datasource._datasource_new_flow(context: DataContext, usage_event_end: str, datasource_name: Optional[str] = None, jupyter: bool = True) → None
class great_expectations.cli.datasource.BaseDatasourceNewYamlHelper(datasource_type: DatasourceTypes, usage_stats_payload: dict, datasource_name: Optional[str] = None)

This base class defines the interface for helpers used in the datasource new flow.

abstract verify_libraries_installed(self)

Used in the interactive CLI to help users install dependencies.

create_notebook(self, context: DataContext)

Create a datasource_new notebook and save it to disk.

abstract get_notebook_renderer(self, context)

Get a renderer specifically constructed for the datasource type.

send_backend_choice_usage_message(self, context: DataContext)
prompt(self)

Optional prompt if more information is needed before making a notebook.

abstract yaml_snippet(self)

Override to create the yaml for the notebook.

class great_expectations.cli.datasource.FilesYamlHelper(datasource_type: DatasourceTypes, usage_stats_payload: dict, class_name: str, context_root_dir: str, datasource_name: Optional[str] = None)

Bases: great_expectations.cli.datasource.BaseDatasourceNewYamlHelper

The base class for pandas/spark helpers used in the datasource new flow.

get_notebook_renderer(self, context)

Get a renderer specifically constructed for the datasource type.

yaml_snippet(self)

Note the InferredAssetFilesystemDataConnector was selected to get users to data assets with minimal configuration. Other DataConnectors are available.

prompt(self)

Optional prompt if more information is needed before making a notebook.
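The yaml_snippet note above singles out InferredAssetFilesystemDataConnector. As a rough illustration, the YAML such a helper emits has roughly the shape below (held in a Python constant here; the exact keys and defaults come from yaml_snippet and may differ between releases):

```python
# Illustrative shape of a file-based datasource config; not the literal
# output of FilesYamlHelper.yaml_snippet.
FILES_DATASOURCE_YAML = """\
name: my_datasource
class_name: Datasource
execution_engine:
  class_name: PandasExecutionEngine
data_connectors:
  default_inferred_data_connector_name:
    class_name: InferredAssetFilesystemDataConnector
    base_directory: ../data
    default_regex:
      group_names:
        - data_asset_name
      pattern: (.*)
"""
```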

class great_expectations.cli.datasource.PandasYamlHelper(context_root_dir: str, datasource_name: Optional[str] = None)

Bases: great_expectations.cli.datasource.FilesYamlHelper

The base class for pandas/spark helpers used in the datasource new flow.

verify_libraries_installed(self)

Used in the interactive CLI to help users install dependencies.

class great_expectations.cli.datasource.SparkYamlHelper(context_root_dir: str, datasource_name: Optional[str] = None)

Bases: great_expectations.cli.datasource.FilesYamlHelper

The base class for pandas/spark helpers used in the datasource new flow.

verify_libraries_installed(self)

Used in the interactive CLI to help users install dependencies.

class great_expectations.cli.datasource.SQLCredentialYamlHelper(usage_stats_payload: dict, datasource_name: Optional[str] = None, driver: str = '', port: Union[int, str] = 'YOUR_PORT', host: str = 'YOUR_HOST', username: str = 'YOUR_USERNAME', password: str = 'YOUR_PASSWORD', database: str = 'YOUR_DATABASE')

Bases: great_expectations.cli.datasource.BaseDatasourceNewYamlHelper

The base class for SQL helpers used in the datasource new flow.

credentials_snippet(self)
yaml_snippet(self)

Override to create the yaml for the notebook.

_yaml_innards(self)

Override if needed.

get_notebook_renderer(self, context)

Get a renderer specifically constructed for the datasource type.
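yaml_snippet here is a template method: the base class assembles the datasource YAML while subclasses customize the credentials_snippet and _yaml_innards hooks. A self-contained sketch of that pattern, assuming illustrative class names and YAML content (only the three method names are taken from the module):

```python
class SQLHelperSketch:
    """Illustrative stand-in for SQLCredentialYamlHelper."""

    def credentials_snippet(self) -> str:
        # Rendered into the notebook so users can fill in secrets.
        return 'host = "YOUR_HOST"\nport = "YOUR_PORT"'

    def _yaml_innards(self) -> str:
        # Hook: override if needed.
        return "  credentials:\n    host: {host}\n    port: {port}"

    def yaml_snippet(self) -> str:
        # Template method: fixed skeleton plus the subclass hook.
        return (
            "name: my_datasource\n"
            "class_name: Datasource\n"
            "execution_engine:\n"
            "  class_name: SqlAlchemyExecutionEngine\n"
            + self._yaml_innards()
        )

class RedshiftSketch(SQLHelperSketch):
    def _yaml_innards(self) -> str:
        # A backend-specific twist: append an extra connection option.
        return super()._yaml_innards() + "\n    sslmode: prefer"
```

This is why most subclasses in this module only need to override verify_libraries_installed, credentials_snippet, or _yaml_innards rather than the whole snippet.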

class great_expectations.cli.datasource.MySQLCredentialYamlHelper(datasource_name: Optional[str])

Bases: great_expectations.cli.datasource.SQLCredentialYamlHelper

The base class for SQL helpers used in the datasource new flow.

verify_libraries_installed(self)

Used in the interactive CLI to help users install dependencies.

class great_expectations.cli.datasource.PostgresCredentialYamlHelper(datasource_name: Optional[str])

Bases: great_expectations.cli.datasource.SQLCredentialYamlHelper

The base class for SQL helpers used in the datasource new flow.

verify_libraries_installed(self)

Used in the interactive CLI to help users install dependencies.

class great_expectations.cli.datasource.RedshiftCredentialYamlHelper(datasource_name: Optional[str])

Bases: great_expectations.cli.datasource.SQLCredentialYamlHelper

The base class for SQL helpers used in the datasource new flow.

verify_libraries_installed(self)

Used in the interactive CLI to help users install dependencies.

_yaml_innards(self)

Override if needed.

class great_expectations.cli.datasource.SnowflakeAuthMethod

Bases: enum.IntEnum

Enum where members are also (and must be) ints

USER_AND_PASSWORD = 0
SSO = 1
KEY_PAIR = 2
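Because SnowflakeAuthMethod is an IntEnum, a numeric prompt answer converts directly to a member and members compare equal to plain ints. A minimal sketch (the parse_choice helper is illustrative, not part of the module):

```python
from enum import IntEnum

class SnowflakeAuth(IntEnum):
    # Illustrative mirror of SnowflakeAuthMethod.
    USER_AND_PASSWORD = 0
    SSO = 1
    KEY_PAIR = 2

def parse_choice(answer: str) -> SnowflakeAuth:
    """Convert a 1-based prompt answer ('1', '2', '3') to a member."""
    return SnowflakeAuth(int(answer) - 1)
```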
class great_expectations.cli.datasource.SnowflakeCredentialYamlHelper(datasource_name: Optional[str])

Bases: great_expectations.cli.datasource.SQLCredentialYamlHelper

The base class for SQL helpers used in the datasource new flow.

verify_libraries_installed(self)

Used in the interactive CLI to help users install dependencies.

prompt(self)

Optional prompt if more information is needed before making a notebook.

credentials_snippet(self)
_yaml_innards(self)

Override if needed.

class great_expectations.cli.datasource.BigqueryCredentialYamlHelper(datasource_name: Optional[str])

Bases: great_expectations.cli.datasource.SQLCredentialYamlHelper

The base class for SQL helpers used in the datasource new flow.

credentials_snippet(self)
verify_libraries_installed(self)

Used in the interactive CLI to help users install dependencies.

_yaml_innards(self)

Override if needed.

class great_expectations.cli.datasource.ConnectionStringCredentialYamlHelper(datasource_name: Optional[str])

Bases: great_expectations.cli.datasource.SQLCredentialYamlHelper

The base class for SQL helpers used in the datasource new flow.

verify_libraries_installed(self)

Used in the interactive CLI to help users install dependencies.

credentials_snippet(self)
_yaml_innards(self)

Override if needed.

great_expectations.cli.datasource._get_sql_yaml_helper_class(selected_database: SupportedDatabaseBackends, datasource_name: Optional[str]) → Union[MySQLCredentialYamlHelper, PostgresCredentialYamlHelper, RedshiftCredentialYamlHelper, SnowflakeCredentialYamlHelper, BigqueryCredentialYamlHelper, ConnectionStringCredentialYamlHelper]
great_expectations.cli.datasource._prompt_for_execution_engine() → str
great_expectations.cli.datasource._get_files_helper(selection: str, context_root_dir: str, datasource_name: Optional[str] = None) → Union[PandasYamlHelper, SparkYamlHelper]
great_expectations.cli.datasource._prompt_user_for_database_backend() → SupportedDatabaseBackends
great_expectations.cli.datasource._prompt_for_snowflake_auth_method() → SnowflakeAuthMethod
great_expectations.cli.datasource._verify_sqlalchemy_dependent_modules() → bool
great_expectations.cli.datasource.sanitize_yaml_and_save_datasource(context: DataContext, datasource_yaml: str, overwrite_existing: bool = False) → None

A convenience function used in notebooks to help users save secrets.
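The point of this function is that notebook users never persist plaintext secrets into great_expectations.yml. A rough, illustrative sketch of the sanitizing step on a parsed config dict (the key set, placeholder convention, and function name below are assumptions, not the module's actual behavior, which also writes the secrets to a separate store):

```python
# Illustrative set of credential-like keys; the real module decides this
# differently.
CREDENTIAL_KEYS = {"password", "username", "host", "port", "database"}

def sanitize(config: dict) -> dict:
    """Recursively replace credential-like values with ${VAR} placeholders."""
    clean = {}
    for key, value in config.items():
        if isinstance(value, dict):
            clean[key] = sanitize(value)
        elif key in CREDENTIAL_KEYS:
            clean[key] = f"${{{key.upper()}}}"  # e.g. password -> ${PASSWORD}
        else:
            clean[key] = value
    return clean
```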

great_expectations.cli.datasource.PROMPT_FILES_BASE_PATH =

Enter the path of the root directory where the data files are stored. If files are on local disk enter a path relative to your current working directory or an absolute path.

great_expectations.cli.datasource.CLI_ONLY_SQLALCHEMY_ORDERED_DEPENDENCY_MODULE_NAMES :list = ['great_expectations.datasource.batch_kwargs_generator.table_batch_kwargs_generator', 'great_expectations.dataset.sqlalchemy_dataset', 'great_expectations.validator.validator', 'great_expectations.datasource.sqlalchemy_datasource']
great_expectations.cli.datasource.check_if_datasource_name_exists(context: DataContext, datasource_name: str) → bool

Check if a Datasource name already exists in the on-disk version of the given DataContext and, if so, raise an error.

Parameters

context – DataContext to check for existing Datasource

datasource_name – name of the proposed Datasource

Returns

boolean: True if the datasource name exists in the on-disk config, else False
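The guard this function provides can be sketched against a plain mapping of datasource configs; the functions below are illustrative stand-ins for the DataContext-backed check and the overwrite_existing flow described above:

```python
def datasource_name_exists(config: dict, datasource_name: str) -> bool:
    """Return True if the name is already present in the config mapping."""
    return datasource_name in config.get("datasources", {})

def save_datasource(config: dict, name: str, datasource_cfg: dict,
                    overwrite_existing: bool = False) -> None:
    """Refuse to overwrite an existing datasource unless explicitly allowed."""
    if datasource_name_exists(config, name) and not overwrite_existing:
        raise ValueError(f"Datasource {name!r} already exists.")
    config.setdefault("datasources", {})[name] = datasource_cfg
```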