interpret_community.adapter package¶
Defines adapters for converting feature importance values to an explanation.
- class interpret_community.adapter.ExplanationAdapter(features=None, classification=False, method='Adapter')¶
Bases: object
An adapter for creating an interpret-community explanation from local importance values.
- Parameters
features (list[str]) – A list of feature names.
classification (bool) – Indicates whether this is a classification or regression explanation.
method (str) – The name of the explanation method that produced the importance values.
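A minimal construction sketch. The feature names and the regression setting below are illustrative, not taken from this page:

```python
from interpret_community.adapter import ExplanationAdapter

# Hypothetical feature names for a regression model; in practice these
# describe the columns of the data the explained model was trained on.
feature_names = ['age', 'income', 'tenure']
adapter = ExplanationAdapter(features=feature_names, classification=False)
```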
- create_global(local_importance_values, evaluation_examples=None, expected_values=None, include_local=True, batch_size=100)¶
Create a global explanation from the list of local feature importance values.
- Parameters
local_importance_values (numpy.array or scipy.sparse.csr_matrix or list[scipy.sparse.csr_matrix]) – The feature importance values.
evaluation_examples (numpy.array or pandas.DataFrame or scipy.sparse.csr_matrix) – A matrix of feature vector examples (# examples x # features) on which to explain the model’s output.
expected_values (numpy.array) – The expected values of the model.
include_local (bool) – Whether to include the local explanations in the returned global explanation. If include_local is False, the local explanations are streamed in batches and aggregated into the global explanation rather than stored.
batch_size (int) – If include_local is False, specifies the batch size for aggregating local explanations to global.
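For illustration, a hedged sketch of create_global on dense importance values. The array contents, the evaluation examples, and the printed global_importance_values attribute are assumptions based on typical interpret-community usage, not shown on this page:

```python
import numpy as np
from interpret_community.adapter import ExplanationAdapter

# Hypothetical importance values: 4 examples x 3 features, as an external
# explainer (e.g. SHAP) might produce for a regression model.
local_importance_values = np.array([
    [0.20, -0.10, 0.40],
    [0.10,  0.30, -0.20],
    [-0.30, 0.20,  0.10],
    [0.00, -0.40,  0.50],
])
evaluation_examples = np.random.rand(4, 3)

adapter = ExplanationAdapter(features=['age', 'income', 'tenure'],
                             classification=False)
global_explanation = adapter.create_global(
    local_importance_values,
    evaluation_examples=evaluation_examples,
    include_local=True)

# global_importance_values holds the per-feature aggregate importances
# (a standard attribute of interpret-community global explanations).
print(global_explanation.global_importance_values)
```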
- create_local(local_importance_values, evaluation_examples=None, expected_values=None)¶
Create a local explanation from the list of local feature importance values.
- Parameters
local_importance_values (numpy.array or scipy.sparse.csr_matrix or list[scipy.sparse.csr_matrix]) – The feature importance values.
evaluation_examples (numpy.array or pandas.DataFrame or scipy.sparse.csr_matrix) – A matrix of feature vector examples (# examples x # features) on which to explain the model’s output.
expected_values (numpy.array) – The expected values of the model.
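A corresponding sketch for create_local, again with illustrative values; local_importance_values on the returned object is the standard interpret-community local explanation attribute:

```python
import numpy as np
from interpret_community.adapter import ExplanationAdapter

# Same illustrative setup as above: 4 examples x 3 features.
local_importance_values = np.array([
    [0.20, -0.10, 0.40],
    [0.10,  0.30, -0.20],
    [-0.30, 0.20,  0.10],
    [0.00, -0.40,  0.50],
])
evaluation_examples = np.random.rand(4, 3)

adapter = ExplanationAdapter(features=['age', 'income', 'tenure'],
                             classification=False)
local_explanation = adapter.create_local(
    local_importance_values,
    evaluation_examples=evaluation_examples)

# The per-example, per-feature importances round-trip through the adapter.
print(local_explanation.local_importance_values)
```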