How can we migrate existing JSON expectation suites into a Postgres database after switching to PostgreSQL as the expectation suite store?

We have implemented more than 20 expectation suites and tested them with TupleFilesystemStoreBackend, which stores all expectations as JSON files locally.
We have now upgraded to DatabaseStoreBackend, but how can we import those existing JSON suites into the new Postgres database without losing them or recreating everything from scratch?

Here are the docs for storing expectation suites in Postgres:
https://docs.greatexpectations.io/en/latest/guides/how_to_guides/configuring_metadata_stores/how_to_configure_an_expectation_store_to_postgresql.html
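
For anyone landing here, the linked guide configures a store roughly along these lines in `great_expectations.yml`. The store name and the `${my_postgres_db}` credentials variable below are placeholders (credentials would live in `config_variables.yml`); check the guide for the exact fields in your GE version:

```yaml
stores:
  expectations_postgres_store:
    class_name: ExpectationsStore
    store_backend:
      class_name: DatabaseStoreBackend
      credentials: ${my_postgres_db}  # placeholder, defined in config_variables.yml

expectations_store_name: expectations_postgres_store
```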

@jagamts1, here’s a pseudocode version.

Something along these lines should work, but I haven’t actually tried executing this.

#### Instantiate your DataContext with the new config, and get the Expectations Store
import great_expectations as ge
context = ge.DataContext()

new_store = context.stores[context.expectations_store_name]


#### Instantiate an Expectations Store outside your DataContext, using your old config
old_store_config = convert_from_yaml_to_dict("""
class_name: ExpectationsStore
store_backend:
    class_name: TupleFilesystemStoreBackend
    base_directory: expectations/
""")

old_store = instantiate_class_from_config(
    config=old_store_config,
    runtime_environment={"root_directory": context.root_directory},
    config_defaults={"module_name": "great_expectations.data_context.store"},
)

#### Loop over Expectation Suites from the old store and save them into the new DataContext Expectation Store

for key_ in old_store.list_keys():
    if new_store.has_key(key_):
        print(f"New store already has key: {key_}. Skipping...")
        continue

    new_store.set(
        key_,
        old_store.get(key_)
    )

PS: Once you get a working version, would you mind reposting here? Thanks!

@abegong Thanks, it worked like a charm. Here’s the working version:

#### Instantiate your DataContext with the new config, and get the Expectations Store
import yaml
import great_expectations as ge
from great_expectations.data_context.util import instantiate_class_from_config

context = ge.DataContext()

new_store = context.stores[context.expectations_store_name]

#### Instantiate an Expectations Store outside your DataContext, using your old config
old_store_config = yaml.safe_load("""
class_name: ExpectationsStore
store_backend:
    class_name: TupleFilesystemStoreBackend
    base_directory: expectations/
""")

old_store = instantiate_class_from_config(
    config=old_store_config,
    runtime_environment={"root_directory": context.root_directory },
    config_defaults={"module_name": "great_expectations.data_context.store"},
)

for key_ in old_store.list_keys():
    if new_store.has_key(key_):
        print(f"New store already has key: {key_}. Skipping...")
        continue

    new_store.set(
        key_,
        old_store.get(key_)
    )
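
The loop above is a generic idempotent copy between two stores: list keys in the old store, skip keys the new store already has, copy the rest. As a self-contained illustration of that pattern (the `DictStore` class here is a made-up in-memory stand-in for illustration only, not a Great Expectations class):

```python
class DictStore:
    """Hypothetical in-memory stand-in for an expectations store (illustration only)."""

    def __init__(self, items=None):
        self._items = dict(items or {})

    def list_keys(self):
        return list(self._items)

    def has_key(self, key):
        return key in self._items

    def get(self, key):
        return self._items[key]

    def set(self, key, value):
        self._items[key] = value


def migrate(old_store, new_store):
    """Copy every suite from old_store to new_store, skipping existing keys.

    Returns (copied, skipped) key lists so the migration can be re-run safely
    and audited afterwards.
    """
    copied, skipped = [], []
    for key_ in old_store.list_keys():
        if new_store.has_key(key_):
            skipped.append(key_)
            continue
        new_store.set(key_, old_store.get(key_))
        copied.append(key_)
    return copied, skipped
```

Because existing keys are skipped rather than overwritten, re-running the migration after a partial failure is safe.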

It would be great to have this as a CLI command in the future, with options for migrating between backends, e.g. PostgreSQL to GCP or S3.
