Feast is an open source feature store for both offline training and online inference. More precisely, it is a Python library (SDK) that lets you define features in Python and materialize them into your existing data store. With Feast you can query features as a batch for offline training purposes, or fetch them for single entities for online inference.
It is possible to integrate Feast with Valohai, for example to use the features in your model training. To install it for your executions, you can either add pip install feast to the command section or add feast to your requirements file.
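For instance, a minimal execution step in valohai.yaml could install Feast before running the training script (the step name, Docker image, and script name below are placeholders, not from the example project):

```yaml
- step:
    name: train-model
    image: python:3.10
    command:
      - pip install feast
      - python train.py
```

Alternatively, list feast in a requirements file and run pip install -r requirements.txt in the command section.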
Note that Feast itself is not a database or a data warehouse, so you will need a separate backend, such as Snowflake, BigQuery, or Redis. Work with your cloud administrator to make sure that the workers created by Valohai can access the database.
Depending on the backend, you might need additional dependencies. You can install them together with Feast by using an extra, for example pip install 'feast[snowflake]' or pip install 'feast[postgres]'.
The information you will need for the Feast configuration file, feature_store.yaml, depends on the backend you're using. For example, a PostgreSQL database requires a hostname, username, and password. You should save these as environment variables in your Valohai project, or at the organization level in case you need the access in several projects.
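As a sketch, a feature_store.yaml for a PostgreSQL backend might look like the following. Feast can expand ${...} references from environment variables, which keeps the credentials out of version control; the project name, registry path, database name, and variable names here are assumptions, so adjust them to your setup:

```yaml
project: my_feature_repo
provider: local
registry: data/registry.db
offline_store:
  type: postgres
  host: ${POSTGRES_HOST}
  port: 5432
  database: feast
  user: ${POSTGRES_USER}
  password: ${POSTGRES_PASSWORD}
online_store:
  type: postgres
  host: ${POSTGRES_HOST}
  port: 5432
  database: feast
  user: ${POSTGRES_USER}
  password: ${POSTGRES_PASSWORD}
```

With the credentials stored as Valohai environment variables, the same feature_store.yaml can be committed to the repository and used unchanged across executions.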
To get an overview of how the integration looks in practice, check out the Feast example we have available on GitHub. The example uses PostgreSQL as the backend, but it can be adjusted to other providers as well.