- Based on Next.js 14 (App Router, Server Actions), shadcn/ui, TailwindCSS, Vercel AI SDK.
- Streaming in the UI.
- Can install and use any package from npm or pip.
- Supported stacks (add your own):
  - Python interpreter
  - Next.js
  - Vue.js
  - Streamlit
  - Gradio
- Supported LLM Providers (add your own):
  - OpenAI
  - Anthropic
  - Google AI
  - Mistral
  - Groq
  - Fireworks
  - Together AI
  - Ollama
This is an open-source version of apps like Anthropic's Claude Artifacts, Vercel v0, or GPT Engineer. Make sure to give us a star!
- git
- Recent version of Node.js and npm package manager
- E2B API Key
- LLM Provider API Key
In your terminal:
git clone https://github.com/e2b-dev/fragments.git
Enter the repository:
cd fragments
Run the following to install the required dependencies:
npm i
Create a `.env.local` file and set the following:
# Get your API key here - https://e2b.dev/
E2B_API_KEY="your-e2b-api-key"
# OpenAI API Key
OPENAI_API_KEY=
# Other providers
ANTHROPIC_API_KEY=
GROQ_API_KEY=
FIREWORKS_API_KEY=
TOGETHER_API_KEY=
GOOGLE_AI_API_KEY=
GOOGLE_VERTEX_CREDENTIALS=
MISTRAL_API_KEY=
XAI_API_KEY=
### Optional env vars
# Domain of the site
NEXT_PUBLIC_SITE_URL=
# Rate limit
RATE_LIMIT_MAX_REQUESTS=
RATE_LIMIT_WINDOW=
# Vercel/Upstash KV (short URLs, rate limiting)
KV_REST_API_URL=
KV_REST_API_TOKEN=
# Supabase (auth)
SUPABASE_URL=
SUPABASE_ANON_KEY=
# PostHog (analytics)
NEXT_PUBLIC_POSTHOG_KEY=
NEXT_PUBLIC_POSTHOG_HOST=
### Disabling functionality (when uncommented)
# Disable API key and base URL input in the chat
# NEXT_PUBLIC_NO_API_KEY_INPUT=
# NEXT_PUBLIC_NO_BASE_URL_INPUT=
# Hide local models from the list of available models
# NEXT_PUBLIC_HIDE_LOCAL_MODELS=
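For intuition, the two rate-limit variables can drive a simple fixed-window limiter like the sketch below. This is not the repo's implementation (rate limiting in the app goes through Vercel/Upstash KV); the function and map names are illustrative:

```typescript
// Illustrative fixed-window rate limiter driven by the env vars above.
// NOT the repo's actual implementation; for intuition only.
const maxRequests = Number(process.env.RATE_LIMIT_MAX_REQUESTS ?? 10)
const windowSeconds = Number(process.env.RATE_LIMIT_WINDOW ?? 60)

const hits = new Map<string, { count: number; windowStart: number }>()

function allowRequest(
  clientId: string,
  now: number = Date.now(),
  limit: number = maxRequests,
  windowMs: number = windowSeconds * 1000,
): boolean {
  const entry = hits.get(clientId)
  // Start a fresh window if none exists or the current one has expired.
  if (!entry || now - entry.windowStart >= windowMs) {
    hits.set(clientId, { count: 1, windowStart: now })
    return true
  }
  entry.count += 1
  return entry.count <= limit
}
```

Leaving both variables unset falls back to the defaults above; the real limiter shares its counters across instances via KV rather than an in-process map.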
Start the development server: `npm run dev`
Build the web app: `npm run build`
- Make sure the E2B CLI is installed and you're logged in.
- Add a new folder under `sandbox-templates/`.
- Initialize a new template using the E2B CLI: `e2b template init`. This will create a new file called `e2b.Dockerfile`.
- Adjust the `e2b.Dockerfile`. Here's an example Streamlit template:

  ```dockerfile
  # You can use most Debian-based base images
  FROM python:3.11-slim

  RUN pip3 install --no-cache-dir streamlit pandas numpy matplotlib requests seaborn plotly

  # Copy the code to the container
  WORKDIR /home/user
  COPY . /home/user
  ```

- Specify a custom start command in `e2b.toml`: `start_cmd = "cd /home/user && streamlit run app.py"`
- Deploy the template with the E2B CLI: `e2b template build --name <template-name>`. After the build has finished, you should get the following message: `✅ Building sandbox template <template-id> <template-name> finished.`
- Open `lib/templates.json` in your code editor and add your new template to the list. Here's an example for Streamlit:

  ```json
  "streamlit-developer": {
    "name": "Streamlit developer",
    "lib": ["streamlit", "pandas", "numpy", "matplotlib", "requests", "seaborn", "plotly"],
    "file": "app.py",
    "instructions": "A streamlit app that reloads automatically.",
    "port": 8501 // can be null
  },
  ```

  Provide a template id (as the key), a name, a list of dependencies, an entrypoint, and a port (optional). You can also add additional instructions that will be given to the LLM.
- Optionally, add a new logo under `public/thirdparty/templates`.
- Open `lib/models.json` in your code editor and add a new entry to the models list:

  ```json
  {
    "id": "mistral-large",
    "name": "Mistral Large",
    "provider": "Ollama",
    "providerId": "ollama"
  }
  ```

  Here `id` is the model id, `name` is the model name (visible in the UI), `provider` is the provider name, and `providerId` is the provider tag (see adding providers below).
- Open `lib/models.ts` in your code editor and add a new entry to the `providerConfigs` list. Example for Fireworks:

  ```ts
  fireworks: () =>
    createOpenAI({
      apiKey: apiKey || process.env.FIREWORKS_API_KEY,
      baseURL: baseURL || 'https://api.fireworks.ai/inference/v1',
    })(modelNameString),
  ```

- Optionally, adjust the default structured output mode in the `getDefaultMode` function:

  ```ts
  if (providerId === 'fireworks') {
    return 'json'
  }
  ```

- Optionally, add a new logo under `public/thirdparty/logos`.
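Putting the last few steps together: the `providerId` from `lib/models.json` is the key that selects both the client factory in `providerConfigs` and the structured-output mode. Here is a minimal self-contained sketch of that dispatch (the real factories call `createOpenAI(...)` and return SDK clients; here they just tag the model name, and the `'tool'` fallback is an assumption of this sketch):

```typescript
// Simplified sketch of how providerId selects a client factory and an
// output mode. The real providerConfigs entries return SDK model clients.
type OutputMode = 'json' | 'tool' | 'object'

const providerConfigs: Record<string, (model: string) => string> = {
  fireworks: (model) => `fireworks:${model}`,
  ollama: (model) => `ollama:${model}`,
}

function getDefaultMode(providerId: string): OutputMode {
  if (providerId === 'fireworks') {
    return 'json'
  }
  return 'tool' // assumed fallback for this sketch
}

function resolveModel(providerId: string, modelName: string) {
  const factory = providerConfigs[providerId]
  if (!factory) throw new Error(`Unknown provider: ${providerId}`)
  return { client: factory(modelName), mode: getDefaultMode(providerId) }
}
```

A model entry whose `providerId` has no matching `providerConfigs` key fails at resolution time, which is why the two files must stay in sync.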
As an open-source project, we welcome contributions from the community. If you are experiencing any bugs or want to add some improvements, please feel free to open an issue or pull request.
- Make sure the E2B CLI is installed and you're logged in.
- Add a new folder under `sandbox-templates/`.
- Initialize a new template using the E2B CLI: `e2b template init`. This will create a new file called `e2b.Dockerfile`.
- Configure the Dockerfile. Example Streamlit template:

  ```dockerfile
  # Use a Debian-based base image
  FROM python:3.11-slim

  # Install dependencies
  RUN pip3 install --no-cache-dir streamlit pandas numpy matplotlib requests seaborn plotly

  # Set the working directory
  WORKDIR /home/user
  COPY . /home/user
  ```
- Set the start command in `e2b.toml`: `start_cmd = "cd /home/user && streamlit run app.py --server.port 8501 --server.address 0.0.0.0"`
- Deploy the template: `e2b template build --name <template-name>`. On success you should see: `✅ Building sandbox template <template-id> <template-name> finished.`
- Register the template in `lib/templates.json`:

  ```json
  "custom-template": {
    "name": "Custom Template",
    "lib": ["dependency1", "dependency2"],
    "file": "main.py",
    "instructions": "Template-specific instructions for the AI.",
    "port": 8080
  }
  ```

- Add a template logo (optional): place an SVG logo in `public/thirdparty/templates/`.
- Register the model in `lib/models.json`:

  ```json
  {
    "id": "custom-model-id",
    "name": "Custom Model Name",
    "provider": "Provider Name",
    "providerId": "provider-id",
    "multiModal": true
  }
  ```

  Parameters:
  - `id`: unique model identifier
  - `name`: display name in the UI
  - `provider`: human-readable provider name
  - `providerId`: provider configuration key
  - `multiModal`: whether the model supports images/vision
- Configure the provider in `lib/models.ts` by adding an entry to the `providerConfigs` object:

  ```ts
  'custom-provider': () =>
    createOpenAI({
      apiKey: apiKey || process.env.CUSTOM_PROVIDER_API_KEY,
      baseURL: baseURL || 'https://api.customprovider.com/v1',
    })(modelNameString),
  ```

- Set the structured output mode (optional) in `getDefaultMode`:

  ```ts
  if (providerId === 'custom-provider') {
    return 'json' // or 'tool' or 'object'
  }
  ```

- Add the environment variable: `CUSTOM_PROVIDER_API_KEY="your-api-key"`
- Add a provider logo (optional): place an SVG logo in `public/thirdparty/logos/`.
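To sanity-check a `lib/models.json` entry before it reaches the UI, a small type and guard could look like this. The field names mirror the example above; the guard itself is a hypothetical helper, not part of the repo:

```typescript
// Hypothetical guard for models.json entries; field names mirror the
// registration example above. Not part of the repository.
interface ModelEntry {
  id: string
  name: string
  provider: string
  providerId: string
  multiModal?: boolean // optional: whether the model accepts images
}

function isValidModelEntry(entry: Partial<ModelEntry>): entry is ModelEntry {
  return (
    typeof entry.id === 'string' && entry.id.length > 0 &&
    typeof entry.name === 'string' && entry.name.length > 0 &&
    typeof entry.provider === 'string' &&
    typeof entry.providerId === 'string'
  )
}
```

`multiModal` stays optional since older entries omit it; the four string fields are the minimum a model needs to show up and resolve to a provider.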
CodinIT.dev supports complex multi-step workflows:
```ts
// Example workflow definition
const workflow = {
  name: 'Data Analysis Pipeline',
  fragments: [
    { type: 'data-import', config: { source: 'csv' } },
    { type: 'data-cleaning', config: { method: 'pandas' } },
    { type: 'visualization', config: { charts: ['scatter', 'histogram'] } }
  ],
  connections: [
    { from: 'data-import', to: 'data-cleaning' },
    { from: 'data-cleaning', to: 'visualization' }
  ]
}
```
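Executing such a workflow means running each fragment only after everything that feeds into it. A minimal scheduler sketch under that assumption (the `connections` shape matches the example above; the scheduler itself is illustrative, using Kahn's topological sort):

```typescript
interface Connection { from: string; to: string }

// Illustrative scheduler: order fragment types so every fragment runs
// after all fragments that feed into it (Kahn's topological sort).
function executionOrder(types: string[], connections: Connection[]): string[] {
  const indegree = new Map(types.map((t) => [t, 0]))
  for (const { to } of connections) indegree.set(to, (indegree.get(to) ?? 0) + 1)

  const queue = types.filter((t) => indegree.get(t) === 0)
  const order: string[] = []
  while (queue.length > 0) {
    const current = queue.shift()!
    order.push(current)
    for (const { from, to } of connections) {
      if (from !== current) continue
      indegree.set(to, indegree.get(to)! - 1)
      if (indegree.get(to) === 0) queue.push(to)
    }
  }
  return order
}

const order = executionOrder(
  ['data-import', 'data-cleaning', 'visualization'],
  [
    { from: 'data-import', to: 'data-cleaning' },
    { from: 'data-cleaning', to: 'visualization' },
  ],
)
// order: ['data-import', 'data-cleaning', 'visualization']
```

A cycle in `connections` would leave some fragments out of the returned order, which a real runner would want to detect and reject.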
Setup team workspaces:
- Configure team billing in Stripe dashboard
- Invite team members via settings
- Set role-based permissions
- Share projects and workflows
Access CodinIT.dev programmatically:
```ts
// Execute code via the API
const response = await fetch('/api/code/execute', {
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
    'Authorization': `Bearer ${token}`
  },
  body: JSON.stringify({
    code: 'print("Hello from API")',
    template: 'code-interpreter-v1'
  })
})
const result = await response.json()
```
CodinIT.dev provides a comprehensive REST API. Key endpoints:

- `POST /api/chat` - AI code generation
- `POST /api/sandbox` - Create execution environments
- `POST /api/code/execute` - Execute code in sandboxes
- `GET /api/workflows` - List workflows
- `POST /api/workflows/{id}/execute` - Execute workflows
- `GET /api/files/sandbox/list` - Browse sandbox files
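A thin request builder over the endpoints listed above might look like the sketch below. The endpoint paths come from the list; the wrapper, its types, and the `{id}` substitution convention are illustrative assumptions:

```typescript
// Hypothetical request builder for the REST endpoints listed above.
type Method = 'GET' | 'POST'

interface PreparedRequest {
  url: string
  method: Method
  headers: Record<string, string>
}

function buildRequest(
  endpoint: string,
  method: Method,
  token: string,
  params: Record<string, string> = {},
): PreparedRequest {
  // Substitute path parameters such as {id}; unknown keys are left as-is.
  const url = endpoint.replace(/\{(\w+)\}/g, (_, key) => params[key] ?? `{${key}}`)
  return {
    url,
    method,
    headers: {
      'Content-Type': 'application/json',
      Authorization: `Bearer ${token}`,
    },
  }
}

const { url } = buildRequest('/api/workflows/{id}/execute', 'POST', 'my-token', { id: 'wf_123' })
// url: '/api/workflows/wf_123/execute'
```

The returned object can be passed straight to `fetch(url, { method, headers, ... })`, keeping auth and content-type handling in one place.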
For detailed API documentation, see `openapi.yaml` or import the Postman collection from `postman-collection.json`.
We welcome contributions to CodinIT.dev! Please see our contributing guidelines:
- Fork the repository
- Create a feature branch: `git checkout -b feature/amazing-feature`
- Make your changes and test thoroughly
- Run linting: `npm run lint`
- Commit your changes: `git commit -m 'Add amazing feature'`
- Push to the branch: `git push origin feature/amazing-feature`
- Open a Pull Request
- New AI provider integrations
- Additional sandbox templates
- Workflow automation improvements
- Documentation and tutorials
- Bug fixes and performance optimizations
- UI/UX enhancements
- Documentation: comprehensive guides in `docs/`
- GitHub Issues: Report bugs and request features
- Community: Join our Discord server
- Email: Contact [email protected]
This project is licensed under the MIT License - see the LICENSE file for details.
- E2B for secure code execution environments
- Supabase for database and authentication
- Vercel for deployment and hosting
- shadcn/ui for beautiful UI components
- All the amazing AI providers making this possible
Built with ❤️ by the CodinIT.dev team