unionml.services.bentoml.BentoMLService
- class unionml.services.bentoml.BentoMLService(model, framework, name=None)
Initialize a BentoML Service for generating predictions from a unionml.model.Model.
- Parameters:
model – the unionml.model.Model instance to bind to this service.
framework – machine learning framework supported by bentoml, e.g. "sklearn". This is used to access the appropriate module bentoml.<framework>.
name – name of the service.
Methods
- configure(...) – Create the bentoml.Service API.
- load_model(...) – Load a bentoml.Model from a version.
- save_model(...) – Save the model as a bentoml.Model.
Attributes
- IO_DESCRIPTOR_MAPPING – Maps python types to BentoML IO descriptors.
- model – Get the Model bound to this service.
- name – Get the name of the service.
- svc – Get the underlying bentoml.Service instance.
- IO_DESCRIPTOR_MAPPING = {numpy.ndarray: bentoml.io.NumpyNdarray, pandas.DataFrame: bentoml.io.PandasDataFrame, list: bentoml.io.JSON, dict: bentoml.io.JSON}
Maps python types to BentoML IO descriptors
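The inference behind this attribute is a straightforward type lookup; the sketch below mirrors the mapping in plain Python with descriptor names as stand-ins, and infer_descriptor is a hypothetical helper for illustration, not part of the unionml API:

```python
import numpy as np
import pandas as pd

# Stand-in for BentoMLService.IO_DESCRIPTOR_MAPPING: maps a Python type
# to the name of a BentoML IO descriptor.
IO_DESCRIPTOR_MAPPING = {
    np.ndarray: "NumpyNdarray",
    pd.DataFrame: "PandasDataFrame",
    list: "JSON",
    dict: "JSON",
}

def infer_descriptor(python_type):
    """Look up an IO descriptor for a Python type; unsupported types raise."""
    try:
        return IO_DESCRIPTOR_MAPPING[python_type]
    except KeyError:
        raise TypeError(
            f"No IO descriptor registered for {python_type!r}; "
            "pass one explicitly to configure()."
        )

print(infer_descriptor(pd.DataFrame))  # PandasDataFrame
```

This is why feature and output descriptors only need to be passed to configure() when the feature or prediction type is not one of the supported keys.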
- configure(feature_type=None, output_type=None, enable_async=False, supported_resources=None, supports_cpu_multi_threading=False, runnable_method_kwargs=None)
Create the bentoml.Service API.
- Parameters:
feature_type – BentoML IO descriptor for the feature data. If None, the descriptor is inferred from the IO_DESCRIPTOR_MAPPING attribute; it must be provided explicitly for unsupported types.
output_type – BentoML IO descriptor for the prediction output. If None, the descriptor is inferred from the IO_DESCRIPTOR_MAPPING attribute; it must be provided explicitly for unsupported types.
enable_async (bool) – whether to expose the prediction endpoint as an async API.
supported_resources (Optional[Tuple]) – indicates which resources the bentoml.Runnable class supports; see the BentoML Runnable documentation for more details.
supports_cpu_multi_threading (bool) – indicates whether the bentoml.Runnable class supports multi-threading; see the BentoML Runnable documentation for more details.
runnable_method_kwargs (Optional[Dict[str, Any]]) – keyword arguments forwarded to bentoml.Runnable.method.
- Raises:
ModelArtifactNotFound – if the bound unionml.model.Model instance does not have a defined artifact property.
- Return type:
- load_model(tag_or_version)
Load a bentoml.Model from a version.
- Parameters:
tag_or_version – the tag or version of the bentoml.Model to load, e.g. "latest".
- save_model(model_object=None, **kwargs)
Save the model as a bentoml.Model.
- Parameters:
model_object (Optional[Any]) – model object to save. If None, this method assumes that unionml.model.Model.artifact is defined.
kwargs – keyword arguments forwarded to the save function of the machine learning framework supported by bentoml that was specified at initialization. The framework is used to access the appropriate module bentoml.<framework>, e.g. bentoml.sklearn.
- property svc: Service
Get the underlying bentoml.Service instance.