No model could be found to perform inference – How to solve this Elasticsearch error

Opster Team

March 2022, Version: 1.7-8.0

Before you begin reading this guide, we recommend you try running the Elasticsearch Error Check-Up, which analyzes two JSON files to detect many configuration errors.

To easily locate the root cause and resolve this issue, try AutoOps for Elasticsearch & OpenSearch. It diagnoses problems by analyzing hundreds of metrics collected by a lightweight agent and offers guidance for resolving them.

Take a self-guided product tour to see for yourself (no registration required).

This guide will help you check for common problems that cause the log “No model could be found to perform inference” to appear. To understand the issues related to this log, read the explanation below about the following Elasticsearch concept: plugin.
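Before diving into the source, it is worth confirming that the trained model the job refers to actually exists in the cluster. Below is a minimal sketch, assuming a cluster reachable at localhost:9200, a hypothetical model ID my-model, and Elasticsearch 7.10 or later (where the GET _ml/trained_models endpoint is available); it uses the low-level Java REST client to query the trained models API:

import org.apache.http.HttpHost;
import org.apache.http.util.EntityUtils;
import org.elasticsearch.client.Request;
import org.elasticsearch.client.Response;
import org.elasticsearch.client.RestClient;

public class CheckTrainedModel {
    public static void main(String[] args) throws Exception {
        // Assumes a local, unsecured cluster; adjust host, port and credentials as needed.
        try (RestClient client = RestClient.builder(new HttpHost("localhost", 9200, "http")).build()) {
            // "my-model" is a hypothetical model ID. A 404 response (thrown as a
            // ResponseException) means the model does not exist, which is consistent
            // with the "No model could be found to perform inference" error.
            Request request = new Request("GET", "/_ml/trained_models/my-model");
            Response response = client.performRequest(request);
            System.out.println(EntityUtils.toString(response.getEntity()));
        }
    }
}

If the request returns a 404, the model is missing and inference cannot run until it has been created or retrained.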

Log Context

The log “No model could be found to perform inference” is generated by the class InferenceStep.java. We extracted the following snippet from the Elasticsearch source code for those seeking in-depth context:

searchRequest.source(searchSourceBuilder);
executeAsyncWithOrigin(client, ML_ORIGIN, SearchAction.INSTANCE, searchRequest, ActionListener.wrap(searchResponse -> {
    SearchHit[] hits = searchResponse.getHits().getHits();
    if (hits.length == 0) {
        listener.onFailure(new ResourceNotFoundException("No model could be found to perform inference"));
    } else {
        listener.onResponse(hits[0].getId());
    }
}, listener::onFailure));
}
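The snippet shows the inference step searching for the trained model document that belongs to the job: if the search returns no hits, the step fails with a ResourceNotFoundException carrying this message. In practice this usually means the model was never created (for example, the training phase of the data frame analytics job did not complete) or it has since been deleted, so verifying that the job finished successfully and that the model still exists is a good first step.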

 
