Technical RFC
Variables
{{problem}}
{{context}}
Model Settings
Model: Llama 3.3 70B (Fast) (Meta)
Temperature: 0.7
Max Tokens: 4096