You can load python_function models in Python by calling the mlflow.pyfunc.load_model() function. Note that the load_model function assumes that all dependencies are already available and will not check for or install any dependencies (see the model deployment section for tools to deploy models with automatic dependency management).

All PyFunc models will support pandas.DataFrame as an input. In addition to pandas.DataFrame, DL PyFunc models will also support tensor inputs in the form of numpy.ndarrays. To verify whether a model flavor supports tensor inputs, please check the flavor’s documentation.

For models with a column-based schema, inputs are typically provided in the form of a pandas.DataFrame. If a dictionary mapping column name to values is provided as input for schemas with named columns, or if a Python list or a numpy.ndarray is provided as input for schemas with unnamed columns, MLflow will cast the input to a DataFrame. Schema enforcement and casting with respect to the expected data types is performed against the DataFrame.
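As a rough illustration (this is not MLflow's internal code, and the column names are hypothetical), the casting described above is equivalent to wrapping the input in a pandas.DataFrame:

```python
import numpy as np
import pandas as pd

# Dictionary input for a schema with named columns ("a" and "b" are
# hypothetical column names): cast to a DataFrame with those columns.
dict_input = {"a": [1, 2, 3], "b": [4.0, 5.0, 6.0]}
df_from_dict = pd.DataFrame(dict_input)

# List or ndarray input for a schema with unnamed columns is cast the same way.
array_input = np.array([[1, 4.0], [2, 5.0], [3, 6.0]])
df_from_array = pd.DataFrame(array_input)

print(df_from_dict.shape)   # (3, 2)
print(df_from_array.shape)  # (3, 2)
```

Type checks and casts against the schema's expected data types are then applied to the resulting DataFrame.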

For models with a tensor-based schema, inputs are typically provided in the form of a numpy.ndarray or a dictionary mapping the tensor name to its np.ndarray value. Schema enforcement will check the provided input’s shape and type against the shape and type specified in the model’s schema and throw an error if they do not match.
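A minimal sketch of this kind of shape/dtype check (a hypothetical helper for illustration, not MLflow's implementation):

```python
import numpy as np

def enforce_tensor_schema(arr, expected_shape, expected_dtype):
    """Raise ValueError if arr does not match the expected shape and dtype.

    A -1 in expected_shape means "any size" for that dimension (e.g. the
    batch dimension), a common convention for variable-length dimensions
    in tensor schemas.
    """
    if len(arr.shape) != len(expected_shape) or any(
        e != -1 and a != e for a, e in zip(arr.shape, expected_shape)
    ):
        raise ValueError(f"shape {arr.shape} does not match {expected_shape}")
    if arr.dtype != np.dtype(expected_dtype):
        raise ValueError(f"dtype {arr.dtype} does not match {expected_dtype}")
    return arr

# A batch of 8 single-channel 28x28 inputs passes a (-1, 28, 28) float32 schema.
batch = np.zeros((8, 28, 28), dtype=np.float32)
enforce_tensor_schema(batch, (-1, 28, 28), np.float32)
```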

For models where no schema is defined, no changes to the model inputs and outputs are made. MLflow will propagate any errors raised by the model if the model does not accept the provided input type.

R Function ( crate )

The crate model flavor defines a generic model format for representing an arbitrary R prediction function as an MLflow model using the crate function from the carrier package. The prediction function is expected to take a dataframe as input and produce a dataframe, a vector, or a list with the predictions as output.

H2O ( h2o )

The mlflow.h2o module defines save_model() and log_model() methods in Python, and mlflow_save_model and mlflow_log_model in R, for saving H2O models in MLflow Model format. These methods produce MLflow Models with the python_function flavor, allowing you to load them as generic Python functions for inference via mlflow.pyfunc.load_model(). This loaded PyFunc model can be scored with only DataFrame input. When you load MLflow Models with the h2o flavor using mlflow.pyfunc.load_model(), the h2o.init() method is called. Therefore, the correct version of h2o(-py) must be installed in the loader’s environment. You can customize the arguments given to h2o.init() by modifying the init entry of the persisted H2O model’s YAML configuration file: model.h2o/h2o.yaml.
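For example, the init entry of model.h2o/h2o.yaml might be edited to pass extra arguments to h2o.init() (the values below are illustrative assumptions, not defaults; nthreads and max_mem_size are standard h2o.init() parameters):

```yaml
# model.h2o/h2o.yaml (sketch; only the init entry is shown)
init:
  nthreads: -1        # use all available cores
  max_mem_size: "4G"  # cap the H2O cluster's memory
```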

Keras ( keras )

The keras model flavor enables logging and loading Keras models. It is available in both Python and R clients. The mlflow.keras module defines save_model() and log_model() functions that you can use to save Keras models in MLflow Model format in Python. Similarly, in R, you can save or log the model using mlflow_save_model and mlflow_log_model. These functions serialize Keras models as HDF5 files using the Keras library’s built-in model persistence functions. MLflow Models produced by these functions also contain the python_function flavor, allowing them to be interpreted as generic Python functions for inference via mlflow.pyfunc.load_model(). This loaded PyFunc model can be scored with both DataFrame input and numpy array input. Finally, you can use the mlflow.keras.load_model() function in Python, or the mlflow_load_model function in R, to load MLflow Models with the keras flavor as Keras Model objects.

MLeap ( mleap )

The mleap model flavor supports saving Spark models in MLflow format using the MLeap persistence mechanism. MLeap is an inference-optimized format and execution engine for Spark models that does not depend on SparkContext to evaluate inputs.
