Follow our Valohai Fundamentals learning path to learn how to interact with Valohai inputs, outputs, metadata, and parameters.
valohai-utils
valohai-utils is a Python helper library for working with Valohai inputs, outputs, and parameters from Jupyter notebooks. It is installed automatically with the latest versions of jupyhai, but you can of course also add it to your Docker image or run !pip install valohai-utils in the first cell of your notebook.
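For example, the first cell of a notebook could install the library before anything else imports it (a minimal sketch, assuming your image does not already ship with valohai-utils):

# First notebook cell: install valohai-utils if the image doesn't include it.
!pip install valohai-utils
import valohai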
You will need to add the valohai.prepare() command at the beginning of your notebook to be able to handle Valohai inputs and parameters. Below you can see what the complete sample script from the Fundamentals learning path, with inputs, outputs, parameters, and metadata logging, looks like in a Jupyter notebook. Please make sure to go through the tutorial to better understand how to use the valohai-utils helper tool.
Finally, note that even though you define a step name in the valohai.prepare() command, the actual step name in the automatically generated valohai.yaml will be jupyter_execution.
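For reference, here is a minimal sketch of what the generated step could look like in valohai.yaml; the image and command are placeholders, as jupyhai fills them in based on your notebook settings:

- step:
    name: jupyter_execution
    image: <your notebook execution image>   # set by jupyhai from your settings
    command: <generated by jupyhai>          # runs the notebook on Valohai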
import numpy as np
import tensorflow as tf
import valohai

# Declare the step, its default input, and its default parameters for Valohai.
valohai.prepare(
    step='mystep',
    default_inputs={
        'dataset': 'https://valohaidemo.blob.core.windows.net/mnist/mnist.npz'
    },
    default_parameters={
        'learning_rate': 0.001,
        'epoch': 10,
    },
)


def log_metadata(epoch, logs):
    # Log training metrics as Valohai metadata at the end of each epoch.
    with valohai.logger() as logger:
        logger.log('epoch', epoch)
        logger.log('accuracy', logs['accuracy'])
        logger.log('loss', logs['loss'])


# Load the MNIST dataset from the Valohai input.
input_path = valohai.inputs('dataset').path()
with np.load(input_path, allow_pickle=True) as f:
    x_train, y_train = f['x_train'], f['y_train']
    x_test, y_test = f['x_test'], f['y_test']

# Scale pixel values to the [0, 1] range.
x_train, x_test = x_train / 255.0, x_test / 255.0

model = tf.keras.models.Sequential([
    tf.keras.layers.Flatten(input_shape=(28, 28)),
    tf.keras.layers.Dense(128, activation='relu'),
    tf.keras.layers.Dropout(0.2),
    tf.keras.layers.Dense(10),
])

optimizer = tf.keras.optimizers.Adam(learning_rate=valohai.parameters('learning_rate').value)
loss_fn = tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True)

model.compile(optimizer=optimizer,
              loss=loss_fn,
              metrics=['accuracy'])

# Emit metadata after every epoch via a Keras callback.
callback = tf.keras.callbacks.LambdaCallback(on_epoch_end=log_metadata)
model.fit(x_train, y_train, epochs=valohai.parameters('epoch').value, callbacks=[callback])
model.evaluate(x_test, y_test, verbose=2)

# Save the trained model as a Valohai output.
output_path = valohai.outputs().path('model.h5')
model.save(output_path)
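As a side note on what the valohai-utils helpers resolve to at run time (the directory layout below is the typical Valohai default; when run outside Valohai, valohai-utils instead downloads the default inputs to a local cache directory):

# Sketch: inspect the resolved paths after the valohai.prepare() call above.
# On a Valohai execution these typically point under /valohai/inputs and
# /valohai/outputs; files written under /valohai/outputs are uploaded as
# output artifacts when the execution finishes.
print(valohai.inputs('dataset').path())    # e.g. /valohai/inputs/dataset/mnist.npz
print(valohai.outputs().path('model.h5'))  # e.g. /valohai/outputs/model.h5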