Zenith-GPT2-124M
Open-weight conversational model.
Built on GPT-2 and fine-tuned on 400k conversational messages. Small, quick to run, and simple to use.
• 124M Parameters
• GPT-2 Backbone
• Open-weight Distribution
• License: Apache-2.0
• Best Use: Lightweight roleplay and conversational tasks
• Limitation: Weak reasoning performance, typical of models at this scale
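A quick sanity check on the parameter count: the "124M" figure follows directly from the standard GPT-2 small hyperparameters (vocab 50257, context 1024, width 768, 12 layers). The sketch below uses GPT-2's published architecture sizes, not values stated on this card.

```python
# Reconstruct GPT-2 small's parameter count from its architecture.
V, T, D, L = 50257, 1024, 768, 12  # vocab, context length, width, layers

embeddings = V * D + T * D                    # token + position embeddings
ln = 2 * D                                    # one LayerNorm (scale + bias)
attn = (D * 3 * D + 3 * D) + (D * D + D)      # fused QKV proj + output proj
mlp = (D * 4 * D + 4 * D) + (4 * D * D + D)   # up-projection + down-projection
block = 2 * ln + attn + mlp                   # two LayerNorms per block

total = embeddings + L * block + ln           # final LayerNorm; lm_head is tied
print(total)  # 124439808, i.e. the "124M" in the model name
```

The output head shares weights with the token embedding, so it adds no extra parameters.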