coremltools.models.MLModel

class coremltools.models.MLModel(model)

This class defines the minimal interface to a CoreML object in Python.
At a high level, the protobuf specification consists of:
- Model description: Encodes names and type information of the inputs and outputs to the model.
- Model parameters: The set of parameters required to represent a specific instance of the model.
- Metadata: Information about the origin, license, and author of the model.
With this class, you can inspect a CoreML model, modify metadata, and make predictions for the purposes of testing (on select platforms).
Examples
# Load the model
>>> model = MLModel('HousePricer.mlmodel')

# Set the model metadata
>>> model.author = 'Author'
>>> model.license = 'BSD'
>>> model.short_description = 'Predicts the price of a house in the Seattle area.'

# Get the interface to the model
>>> model.input_description
>>> model.output_description

# Set feature descriptions manually
>>> model.input_description['bedroom'] = 'Number of bedrooms'
>>> model.input_description['bathrooms'] = 'Number of bathrooms'
>>> model.input_description['size'] = 'Size (in square feet)'

# Set the output description
>>> model.output_description['price'] = 'Price of the house'

# Make predictions
>>> predictions = model.predict({'bedroom': 1.0, 'bathrooms': 1.0, 'size': 1240})

# Get the spec of the model
>>> model.get_spec()

# Save the model
>>> model.save('HousePricer.mlmodel')
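The three parts of the specification listed above can also be inspected directly on the protobuf object returned by get_spec(). A brief sketch, assuming the HousePricer model from the example and the standard Model.proto field names (description, metadata, and the model-type oneof named 'Type'):

>>> spec = model.get_spec()

# Model description: names and types of the inputs and outputs
>>> [feature.name for feature in spec.description.input]
>>> [feature.name for feature in spec.description.output]

# Metadata: origin, license, and author information
>>> spec.description.metadata.author
>>> spec.description.metadata.license

# Model parameters: which model type this specification carries
>>> spec.WhichOneof('Type')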
Methods

- __init__(model): Construct an MLModel from a .mlmodel file.
- get_spec(): Get a deep copy of the protobuf specification of the model.
- predict(data, **kwargs): Return predictions for the model.
- save(filename): Save the model in .mlmodel format.

Attributes
author
input_description
license
output_description
short_description
user_defined_metadata
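These metadata attributes can be read and assigned directly on a loaded model. A short sketch; the user_defined_metadata key and value are illustrative only:

>>> model.author = 'Author'
>>> model.short_description = 'Predicts the price of a house in the Seattle area.'

# user_defined_metadata behaves like a string-to-string dictionary
>>> model.user_defined_metadata['training_run'] = 'run-42'  # hypothetical key and value
>>> model.user_defined_metadata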
__init__(model)

Construct an MLModel from a .mlmodel file.

Parameters: model: str | Model_pb2
    If a string is given, it should be the location of the .mlmodel file to load.
Examples
>>> loaded_model = MLModel('my_model_file.mlmodel')
get_spec()

Get a deep copy of the protobuf specification of the model.

Returns: model: Model_pb2
    Protobuf specification of the model.
Examples
>>> spec = model.get_spec()
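Because get_spec() returns a deep copy, the copy can be edited and wrapped in a new MLModel (the constructor also accepts a Model_pb2 object) without touching the original model. A small sketch, assuming the shortDescription metadata field from Model.proto:

>>> spec = model.get_spec()
>>> spec.description.metadata.shortDescription = 'Updated description'
>>> updated_model = MLModel(spec)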
predict(data, **kwargs)

Return predictions for the model. Any keyword arguments are passed to the model as a dictionary.

Parameters: data: dict[str, value]
    Dictionary of data to make predictions from, where the keys are the names of the input features.

Returns: out: dict[str, value]
    Predictions as a dictionary, where each key is the output feature name.
Examples
>>> data = {'bedroom': 1.0, 'bathrooms': 1.0, 'size': 1240}
>>> predictions = model.predict(data)
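Because the result is keyed by output feature name, a predicted value can be read back directly. A short sketch, assuming the 'price' output feature from the HousePricer example above:

>>> predictions = model.predict({'bedroom': 1.0, 'bathrooms': 1.0, 'size': 1240})
>>> predictions['price']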
save(filename)

Save the model in .mlmodel format.

Parameters: filename: str
    Target filename for the model.
See also
coremltools.utils.load_model
Examples
>>> model.save('my_model_file.mlmodel')
>>> loaded_model = MLModel('my_model_file.mlmodel')
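If only the protobuf specification of a saved model is needed, it can be loaded with coremltools.utils.load_spec and, if desired, wrapped back into an MLModel; a brief sketch, assuming that utility is available in your coremltools version:

>>> import coremltools
>>> spec = coremltools.utils.load_spec('my_model_file.mlmodel')
>>> loaded_from_spec = MLModel(spec)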