zen-eco-instruct

Efficient instruction-following

Model Details

Base model: Qwen/Qwen3-4B-Base, fine-tuned via Qwen/Qwen3-4B
Model size: 4B params
Tensor type: F16 (Safetensors)
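
At 4B parameters in F16, the weights alone are roughly 8 GB, so the model fits on a single consumer GPU in half precision. A minimal loading sketch (torch_dtype and device_map are standard transformers options; device_map="auto" assumes the accelerate package is installed):

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the F16 checkpoint directly in half precision and place it on available devices
model = AutoModelForCausalLM.from_pretrained(
    "zenlm/zen-eco-instruct",
    torch_dtype=torch.float16,
    device_map="auto",  # requires `pip install accelerate`
)
tokenizer = AutoTokenizer.from_pretrained("zenlm/zen-eco-instruct")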

Quick Start

from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the model and tokenizer from the Hugging Face Hub
model = AutoModelForCausalLM.from_pretrained("zenlm/zen-eco-instruct")
tokenizer = AutoTokenizer.from_pretrained("zenlm/zen-eco-instruct")

# Tokenize a prompt, generate up to 100 new tokens, and decode the result
inputs = tokenizer("Hello!", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=100)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
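
For instruction-style prompts, the tokenizer's chat template is the more natural entry point. A minimal sketch, assuming the repository inherits a chat template from its Qwen3 base (check tokenizer.chat_template if unsure):

from transformers import AutoModelForCausalLM, AutoTokenizer

model = AutoModelForCausalLM.from_pretrained("zenlm/zen-eco-instruct")
tokenizer = AutoTokenizer.from_pretrained("zenlm/zen-eco-instruct")

# Wrap the user message in the model's chat template before generating
messages = [{"role": "user", "content": "Summarize what an instruction-tuned model is."}]
input_ids = tokenizer.apply_chat_template(messages, add_generation_prompt=True, return_tensors="pt")
outputs = model.generate(input_ids, max_new_tokens=200)

# Decode only the newly generated tokens, skipping the prompt
print(tokenizer.decode(outputs[0][input_ids.shape[-1]:], skip_special_tokens=True))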

Links


Zen LM • Building AI that's local, private, and free
