You can extend the functionality of Great Expectations by creating your own Custom Expectations (extensions of the `Expectation` class, developed outside of the Great Expectations library). You can also enrich Great Expectations as a shared standard for data quality by contributing new Expectations (verifiable assertions about data) to the open source project.
These processes complement each other, and their steps are streamlined so that one flows into the other. Once you have created a Custom Expectation, it is simple to contribute it to the open source project. This section will teach you how to do both.
Prerequisites: This how-to guide assumes you have:
Creating Custom Expectations
A fully-developed, Production-ready Expectation needs to do a lot of things:
- Execute consistently across many types of data infrastructure
- Render itself and its Validation Results (generated when data is Validated against an Expectation or Expectation Suite) into several formats
- Support Profiling (the act of generating Metrics and candidate Expectations from data) against new data
- Be maintainable, with good tests, documentation, linting, type hints, etc.
To make development of Expectations as easy as possible, we've broken the process of creating a Custom Expectation into a series of bite-sized steps, each of which can be completed in minutes. They can be completed (and contributed) incrementally, unlocking value at each step along the way.
Grouped together, they constitute a Definition of Done for Expectations at each Level of Maturity.

An Experimental Expectation...
- Has a valid `library_metadata` object
- Has a docstring, including a one-line short description
- Has at least one positive and negative example case, and all test cases pass
- Has core logic and passes tests on at least one Execution Engine (a system capable of processing data to compute Metrics)
- Passes all linting checks

A Beta Expectation...
- Has basic input validation and type checking
- Has both Statement Renderers: prescriptive and diagnostic
- Has core logic that passes tests for all applicable Execution Engines and SQL dialects

A Production Expectation...
- Has a robust suite of tests, as determined by a code owner
- Has passed a manual review by a code owner for code standards and style guides
How these docs are organized
The docs in Creating Custom Expectations focus on completing the five steps required for Experimental Expectations.
Completing them will leave you with a Custom Expectation that meets our linting standards, can be executed against one backend, includes a couple of tests to verify correctness, and carries a basic docstring and metadata to support diagnostics.
The code to achieve the first four steps looks somewhat different depending on the class of Expectation you're developing. Accordingly, there are separate how-to guides and templates for each class of Expectation.
Not all classes of Expectation currently have guides and templates.
If you'd like to develop a different kind of Expectation, please reach out on Slack.
Beyond the first four steps, additional features are generally similar across all Expectation classes. Accordingly, most of the remaining steps have their own how-to guide in the Adding Features to Custom Expectations section of the table of contents.
| Step | How-to guide |
|------|--------------|
| Passes all linting checks | Great Expectations Code Style Guide: Linting |
| Has basic input validation and type checking | How to add input validation and type checking for a Custom Expectation |
| Has core logic that passes tests for all applicable Execution Engines and SQL dialects | How to add SQLAlchemy support for Custom Expectations<br/>How to add Spark support for Custom Expectations |
The final two checks required for acceptance into the Great Expectations codebase at the Production level require manual review and guidance by a code owner.
Using your Expectation
You can find instructions for using your Custom Expectation in our guide: How to use a Custom Expectation.