InputError: Model HuggingFaceH4/zephyr-7b-beta is not supported for task text-generation and provider featherless-ai

I’m trying to use the model HuggingFaceH4/zephyr-7b-beta for text generation.
When I run it, I get this message:

Defaulting to 'auto' which will select the first provider available for the model, sorted by the user's order in https://hf.co/settings/inference-providers.
Auto selected provider: featherless-ai
InputError: Model HuggingFaceH4/zephyr-7b-beta is not supported for task text-generation and provider featherless-ai. Supported task: conversational.

Hey @Sancha_Man_Subba,

I think that message means the provider that was auto-selected for you (featherless-ai) doesn't serve Zephyr 7B for the text-generation task, only for the conversational task.

For example, a conversational-style call would look like this:

import { HfInference } from "@huggingface/inference";

const hf = new HfInference(process.env.HF_TOKEN); // your Hugging Face access token

// Conversational task: send the new message along with any previous turns
const response = await hf.conversational({
  model: "HuggingFaceH4/zephyr-7b-beta",
  inputs: {
    past_user_inputs: ["Hello!"],
    generated_responses: ["Hi! How can I help?"], // each past user input pairs with a response
    text: "Explain how transformers work"
  }
});
console.log(response);

rather than the plain text-generation call, which is what triggers the error:

// This call fails because featherless-ai only exposes the model for the conversational task
await hf.textGeneration({
  model: "HuggingFaceH4/zephyr-7b-beta",
  inputs: "Explain how transformers work"
});

Check whether your code looks like the second snippet. I haven't used this model myself, but from the error it seems this provider only serves it for conversational use, not plain text generation.
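
If your version of @huggingface/inference doesn't expose conversational() any more, the chat-completion route drives the same conversational task the error message refers to. Here's a minimal sketch, assuming a recent @huggingface/inference and an access token in an HF_TOKEN environment variable (adjust names to your setup):

import { HfInference } from "@huggingface/inference";

const hf = new HfInference(process.env.HF_TOKEN); // assumed env var holding your HF access token

// chatCompletion sends OpenAI-style messages (the chat/conversational route)
const out = await hf.chatCompletion({
  model: "HuggingFaceH4/zephyr-7b-beta",
  messages: [{ role: "user", content: "Explain how transformers work" }],
  max_tokens: 500
});

console.log(out.choices[0].message.content);

Either way, the important bit is that the request goes through the conversational/chat path for this provider rather than textGeneration.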