Merged
File renamed without changes.
70 changes: 70 additions & 0 deletions .cerebras/skill/sparc/SKILL.md
@@ -0,0 +1,70 @@
---
name: sparc
description: Use for ANY complex feature, refactor, or multi-step development task. SPARC walks through Specification, Pseudocode, Architecture, Refinement, and Completion — a structured methodology that turns vague requests into solid implementations. Triggers on keywords like build, implement, create, design, refactor, add feature, architect, plan and build, or any task that benefits from thinking before coding.
---

Systematic development methodology: **Specification, Pseudocode, Architecture, Refinement, Completion**. Each phase produces a concrete artifact before moving to the next. Cerebras inference speed makes this feel instant — you get the rigor of structured engineering without the wait.

The user provides a feature request, bug description, or development task. It may be vague or detailed.

## Phase 1: Specification

Before touching code, define what you're building:

- **Problem**: What exactly needs to change? What's broken or missing?
- **Constraints**: What can't change? Performance requirements? Backwards compatibility?
- **Success criteria**: How do we know it's done? What should a reviewer see?
- **Scope boundary**: What is explicitly NOT part of this task?

Output a brief spec (5-10 lines). If requirements are ambiguous, ask the user before proceeding.
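For a vague request like "make the client handle rate limits", the spec output might look like this (the task and numbers here are hypothetical, purely for illustration):

```
Problem: API client retries 429 responses immediately, hammering the server.
Constraints: public client API unchanged; no new runtime dependencies.
Success criteria: retries back off exponentially; existing tests pass; a new
  test covers the 429 path.
Out of scope: per-endpoint limits, any server-side changes.
```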

## Phase 2: Pseudocode

Think through the logic in plain language before writing real code:

- Outline the control flow, data transformations, and edge cases
- Identify the key functions/components and their signatures
- Note where existing code will be touched vs new code created
- Flag any risky areas (concurrency, error handling, migrations)

Keep it short — pseudocode is a thinking tool, not documentation. Use numbered steps or bullet points, not full syntax.
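As a sketch, pseudocode for a hypothetical retry-with-backoff change (not part of this PR) might read:

```
1. On a 429 response: honor the Retry-After header if present
2. Otherwise: delay = base * 2^attempt, capped at a maximum
3. Sleep, retry the request, increment attempt
4. Past the attempt limit: surface the original error unchanged
Edge cases: Retry-After may be seconds or an HTTP date; aborted
requests must not retry; flag concurrency around shared attempt state
```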

## Phase 3: Architecture

Map the pseudocode onto the actual codebase:

- **Read first**: Examine the files you'll modify. Understand existing patterns before proposing new ones.
- **File plan**: List each file to create or modify, with a one-line summary of the change
- **Interface contracts**: Define function signatures, types, or API shapes that connect components
- **Dependencies**: What does this touch? What could break?

Follow existing conventions. Don't introduce new patterns unless the existing ones are clearly wrong for this use case.

## Phase 4: Refinement (TDD)

Implement with a test-first approach:

1. **Write a failing test** that captures the core behavior from your spec
2. **Write the minimum code** to make it pass — no more
3. **Run the test** to confirm it passes
4. **Refactor** if the code is unclear, but don't add features
5. **Repeat** for the next behavior

If the project doesn't have a test framework set up, skip to direct implementation but still work incrementally — one function at a time, verifying each before moving on.
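One red-green cycle might look like the following sketch. The `slugify` helper and the plain-assertion harness are illustrative assumptions, not code from this PR; in a real project the test would live in the project's own runner (bun:test, vitest, etc.):

```typescript
// Step 1: a failing test that captures the core behavior from the spec.
function assertEqual(actual: string, expected: string, label: string): void {
  if (actual !== expected) throw new Error(`${label}: expected "${expected}", got "${actual}"`)
}

// Step 2: the minimum code to make it pass, no options, no caching, no extras.
function slugify(input: string): string {
  return input
    .trim()
    .toLowerCase()
    .replace(/[^a-z0-9]+/g, "-") // collapse runs of non-alphanumerics into one dash
    .replace(/^-+|-+$/g, "") // strip leading/trailing dashes
}

// Step 3: run the tests to confirm they pass before starting the next behavior.
assertEqual(slugify("  Hello, World!  "), "hello-world", "basic slug")
assertEqual(slugify("Already-Slugged"), "already-slugged", "near-slug input")
```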

## Phase 5: Completion

Wrap up with confidence:

- Run the full test suite (not just your new tests)
- Check for regressions in related functionality
- Remove any temporary code, debug logs, or commented-out blocks
- Verify the original success criteria from Phase 1 are met

If anything fails, loop back to the relevant phase rather than patching blindly.

## How to Use This

Don't announce each phase — just follow the progression naturally. For small tasks, phases collapse (a 3-line fix doesn't need pseudocode). For large tasks, each phase should produce visible output the user can react to before you continue.

The value is thinking before coding. Cerebras makes the thinking fast enough that it doesn't feel like overhead.
File renamed without changes.
2 changes: 1 addition & 1 deletion package.json
@@ -4,7 +4,7 @@
"description": "AI-powered development tool",
"private": true,
"type": "module",
"packageManager": "bun@1.3.4",
"packageManager": "bun@1.3.5",
"scripts": {
"dev": "bun run --cwd packages/opencode --conditions=browser src/index.ts",
"typecheck": "bun turbo typecheck",
3 changes: 2 additions & 1 deletion packages/opencode/src/agent/agent.ts
@@ -1,7 +1,8 @@
import { Config } from "../config/config"
import z from "zod"
import { Provider } from "../provider/provider"
import { generateObject, type ModelMessage } from "ai"
import { generateObject } from "ai"
import type { ModelMessage } from "@ai-sdk/provider-utils"
import PROMPT_GENERATE from "./generate.txt"
import { SystemPrompt } from "../session/system"
import { Instance } from "../project/instance"
2 changes: 1 addition & 1 deletion packages/opencode/src/cli/cmd/agent.ts
@@ -120,7 +120,7 @@ const AgentCreateCommand = cmd({

const content = matter.stringify(generated.systemPrompt, frontmatter)
const filePath = path.join(
scope === "global" ? Global.Path.config : path.join(Instance.worktree, ".opencode"),
scope === "global" ? Global.Path.config : path.join(Instance.worktree, ".cerebras"),
`agent`,
`${generated.identifier}.md`,
)
@@ -18,6 +18,7 @@ export function FullscreenNotification(props: { notification: Notification; onCl
// Any key dismisses (after ready)
if (ready()) {
evt.preventDefault?.()
// @ts-expect-error stopPropagation exists at runtime
evt.stopPropagation?.()
// Small delay to prevent key from reaching prompt
setReady(false)
@@ -60,10 +61,12 @@ export function FullscreenNotification(props: { notification: Notification; onCl
flexDirection="column"
gap={2}
>
{/* @ts-expect-error fontSize exists at runtime */}
<text attributes={TextAttributes.BOLD} fg={theme.backgroundPanel} style={{ fontSize: 2 }}>
{icon()} {props.notification.title}
</text>
<box maxWidth={Math.min(80, dimensions().width - 10)} paddingLeft={2} paddingRight={2}>
{/* @ts-expect-error textAlign exists at runtime */}
<text fg={theme.backgroundPanel} wrapMode="word" textAlign="center">
{props.notification.message}
</text>
2 changes: 1 addition & 1 deletion packages/opencode/src/cli/cmd/tui/context/theme.tsx
@@ -343,7 +343,7 @@ async function getCustomThemes() {
Global.Path.config,
...(await Array.fromAsync(
Filesystem.up({
targets: [".opencode"],
targets: [".cerebras", ".opencode"],
start: process.cwd(),
}),
)),
28 changes: 16 additions & 12 deletions packages/opencode/src/config/config.ts
@@ -45,7 +45,7 @@ export namespace Config {
log.debug("loaded custom config", { path: Flag.OPENCODE_CONFIG })
}

for (const file of ["opencode.jsonc", "opencode.json"]) {
for (const file of ["cerebras.jsonc", "cerebras.json", "opencode.jsonc", "opencode.json"]) {
const found = await Filesystem.findUp(file, Instance.directory, Instance.worktree)
for (const resolved of found.toReversed()) {
result = mergeConfigWithPlugins(result, await loadFile(resolved))
@@ -73,7 +73,7 @@
Global.Path.config,
...(await Array.fromAsync(
Filesystem.up({
targets: [".opencode"],
targets: [".cerebras", ".opencode"],
start: Instance.directory,
stop: Instance.worktree,
}),
@@ -89,8 +89,8 @@
for (const dir of directories) {
await assertValid(dir)

if (dir.endsWith(".opencode") || dir === Flag.OPENCODE_CONFIG_DIR) {
for (const file of ["opencode.jsonc", "opencode.json"]) {
if (dir.endsWith(".cerebras") || dir.endsWith(".opencode") || dir === Flag.OPENCODE_CONFIG_DIR) {
for (const file of ["cerebras.jsonc", "cerebras.json", "opencode.jsonc", "opencode.json"]) {
log.debug(`loading config from ${path.join(dir, file)}`)
result = mergeConfigWithPlugins(result, await loadFile(path.join(dir, file)))
// to satisfy the type checker
@@ -193,7 +193,7 @@
if (!md.data) continue

const name = (() => {
const patterns = ["/.opencode/command/", "/command/"]
const patterns = ["/.cerebras/command/", "/.opencode/command/", "/command/"]
const pattern = patterns.find((p) => item.includes(p))

if (pattern) {
@@ -233,11 +233,13 @@

// Extract relative path from agent folder for nested agents
let agentName = path.basename(item, ".md")
const agentFolderPath = item.includes("/.opencode/agent/")
? item.split("/.opencode/agent/")[1]
: item.includes("/agent/")
? item.split("/agent/")[1]
: agentName + ".md"
const agentFolderPath = item.includes("/.cerebras/agent/")
? item.split("/.cerebras/agent/")[1]
: item.includes("/.opencode/agent/")
? item.split("/.opencode/agent/")[1]
: item.includes("/agent/")
? item.split("/agent/")[1]
: agentName + ".md"

// If agent is in a subfolder, include folder path in name
if (agentFolderPath.includes("/")) {
@@ -747,6 +749,8 @@
mergeDeep(await loadFile(path.join(Global.Path.config, "config.json"))),
mergeDeep(await loadFile(path.join(Global.Path.config, "opencode.json"))),
mergeDeep(await loadFile(path.join(Global.Path.config, "opencode.jsonc"))),
mergeDeep(await loadFile(path.join(Global.Path.config, "cerebras.json"))),
mergeDeep(await loadFile(path.join(Global.Path.config, "cerebras.jsonc"))),
)

await import(path.join(Global.Path.config, "config"), {
@@ -906,8 +910,8 @@
}

function globalConfigFile() {
const candidates = ["opencode.jsonc", "opencode.json", "config.json"].map((file) =>
path.join(Global.Path.config, file),
const candidates = ["cerebras.jsonc", "cerebras.json", "opencode.jsonc", "opencode.json", "config.json"].map(
(file) => path.join(Global.Path.config, file),
)
for (const file of candidates) {
if (existsSync(file)) return file
2 changes: 1 addition & 1 deletion packages/opencode/src/file/ripgrep.ts
@@ -286,7 +286,7 @@ export namespace Ripgrep {
children: [],
}
for (const file of files) {
if (file.includes(".opencode")) continue
if (file.includes(".cerebras") || file.includes(".opencode")) continue
const parts = file.split(path.sep)
getPath(root, parts, true)
}
67 changes: 50 additions & 17 deletions packages/opencode/src/flag/flag.ts
@@ -1,29 +1,62 @@
export namespace Flag {
export const OPENCODE_AUTO_SHARE = truthy("OPENCODE_AUTO_SHARE")
export const OPENCODE_CONFIG = process.env["OPENCODE_CONFIG"]
export const OPENCODE_CONFIG_DIR = process.env["OPENCODE_CONFIG_DIR"]
export const OPENCODE_CONFIG_CONTENT = process.env["OPENCODE_CONFIG_CONTENT"]
export const OPENCODE_DISABLE_AUTOUPDATE = truthy("OPENCODE_DISABLE_AUTOUPDATE")
export const OPENCODE_DISABLE_PRUNE = truthy("OPENCODE_DISABLE_PRUNE")
export const OPENCODE_PERMISSION = process.env["OPENCODE_PERMISSION"]
export const OPENCODE_DISABLE_DEFAULT_PLUGINS = truthy("OPENCODE_DISABLE_DEFAULT_PLUGINS")
export const OPENCODE_DISABLE_CLAUDE_CODE_SKILLS = truthy("OPENCODE_DISABLE_CLAUDE_CODE_SKILLS")
export const OPENCODE_DISABLE_LSP_DOWNLOAD = truthy("OPENCODE_DISABLE_LSP_DOWNLOAD")
export const OPENCODE_ENABLE_EXPERIMENTAL_MODELS = truthy("OPENCODE_ENABLE_EXPERIMENTAL_MODELS")
export const OPENCODE_DISABLE_AUTOCOMPACT = truthy("OPENCODE_DISABLE_AUTOCOMPACT")
export const OPENCODE_FAKE_VCS = process.env["OPENCODE_FAKE_VCS"]
export const OPENCODE_AUTO_SHARE = truthyWithFallback("CEREBRAS_CODE_AUTO_SHARE", "OPENCODE_AUTO_SHARE")
export const OPENCODE_CONFIG = process.env["CEREBRAS_CODE_CONFIG"] || process.env["OPENCODE_CONFIG"]
export const OPENCODE_CONFIG_DIR = process.env["CEREBRAS_CODE_CONFIG_DIR"] || process.env["OPENCODE_CONFIG_DIR"]
export const OPENCODE_CONFIG_CONTENT =
process.env["CEREBRAS_CODE_CONFIG_CONTENT"] || process.env["OPENCODE_CONFIG_CONTENT"]
export const OPENCODE_DISABLE_AUTOUPDATE = truthyWithFallback(
"CEREBRAS_CODE_DISABLE_AUTOUPDATE",
"OPENCODE_DISABLE_AUTOUPDATE",
)
export const OPENCODE_DISABLE_PRUNE = truthyWithFallback("CEREBRAS_CODE_DISABLE_PRUNE", "OPENCODE_DISABLE_PRUNE")
export const OPENCODE_PERMISSION =
process.env["CEREBRAS_CODE_PERMISSION"] || process.env["OPENCODE_PERMISSION"]
export const OPENCODE_DISABLE_DEFAULT_PLUGINS = truthyWithFallback(
"CEREBRAS_CODE_DISABLE_DEFAULT_PLUGINS",
"OPENCODE_DISABLE_DEFAULT_PLUGINS",
)
export const OPENCODE_DISABLE_CLAUDE_CODE_SKILLS = truthyWithFallback(
"CEREBRAS_CODE_DISABLE_CLAUDE_CODE_SKILLS",
"OPENCODE_DISABLE_CLAUDE_CODE_SKILLS",
)
export const OPENCODE_DISABLE_LSP_DOWNLOAD = truthyWithFallback(
"CEREBRAS_CODE_DISABLE_LSP_DOWNLOAD",
"OPENCODE_DISABLE_LSP_DOWNLOAD",
)
export const OPENCODE_ENABLE_EXPERIMENTAL_MODELS = truthyWithFallback(
"CEREBRAS_CODE_ENABLE_EXPERIMENTAL_MODELS",
"OPENCODE_ENABLE_EXPERIMENTAL_MODELS",
)
export const OPENCODE_DISABLE_AUTOCOMPACT = truthyWithFallback(
"CEREBRAS_CODE_DISABLE_AUTOCOMPACT",
"OPENCODE_DISABLE_AUTOCOMPACT",
)
export const OPENCODE_FAKE_VCS =
process.env["CEREBRAS_CODE_FAKE_VCS"] || process.env["OPENCODE_FAKE_VCS"]
export const OPENCODE_EXPERIMENTAL_BASH_MAX_OUTPUT_LENGTH =
process.env["CEREBRAS_CODE_EXPERIMENTAL_BASH_MAX_OUTPUT_LENGTH"] ||
process.env["OPENCODE_EXPERIMENTAL_BASH_MAX_OUTPUT_LENGTH"]

// Experimental
export const OPENCODE_EXPERIMENTAL = truthy("OPENCODE_EXPERIMENTAL")
export const OPENCODE_EXPERIMENTAL_WATCHER = OPENCODE_EXPERIMENTAL || truthy("OPENCODE_EXPERIMENTAL_WATCHER")
export const OPENCODE_EXPERIMENTAL_DISABLE_COPY_ON_SELECT = truthy("OPENCODE_EXPERIMENTAL_DISABLE_COPY_ON_SELECT")
export const OPENCODE_EXPERIMENTAL = truthyWithFallback("CEREBRAS_CODE_EXPERIMENTAL", "OPENCODE_EXPERIMENTAL")
export const OPENCODE_EXPERIMENTAL_WATCHER =
OPENCODE_EXPERIMENTAL ||
truthyWithFallback("CEREBRAS_CODE_EXPERIMENTAL_WATCHER", "OPENCODE_EXPERIMENTAL_WATCHER")
export const OPENCODE_EXPERIMENTAL_DISABLE_COPY_ON_SELECT = truthyWithFallback(
"CEREBRAS_CODE_EXPERIMENTAL_DISABLE_COPY_ON_SELECT",
"OPENCODE_EXPERIMENTAL_DISABLE_COPY_ON_SELECT",
)
export const OPENCODE_ENABLE_EXA =
truthy("OPENCODE_ENABLE_EXA") || OPENCODE_EXPERIMENTAL || truthy("OPENCODE_EXPERIMENTAL_EXA")
truthyWithFallback("CEREBRAS_CODE_ENABLE_EXA", "OPENCODE_ENABLE_EXA") ||
OPENCODE_EXPERIMENTAL ||
truthyWithFallback("CEREBRAS_CODE_EXPERIMENTAL_EXA", "OPENCODE_EXPERIMENTAL_EXA")

function truthy(key: string) {
const value = process.env[key]?.toLowerCase()
return value === "true" || value === "1"
}

function truthyWithFallback(primary: string, fallback: string) {
return truthy(primary) || truthy(fallback)
}
}
2 changes: 1 addition & 1 deletion packages/opencode/src/global/index.ts
@@ -3,7 +3,7 @@ import { xdgData, xdgCache, xdgConfig, xdgState } from "xdg-basedir"
import path from "path"
import os from "os"

const app = "opencode"
const app = "cerebras"

const data = path.join(xdgData!, app)
const cache = path.join(xdgCache!, app)
8 changes: 5 additions & 3 deletions packages/opencode/src/mcp/index.ts
@@ -1,4 +1,4 @@
import { type Tool } from "ai"
import type { Tool } from "@ai-sdk/provider-utils"
import { experimental_createMCPClient } from "@ai-sdk/mcp"
import { StreamableHTTPClientTransport } from "@modelcontextprotocol/sdk/client/streamableHttp.js"
import { SSEClientTransport } from "@modelcontextprotocol/sdk/client/sse.js"
@@ -382,7 +382,7 @@ export namespace MCP {
}

export async function tools() {
const result: Record<string, Tool> = {}
// eslint-disable-next-line @typescript-eslint/no-explicit-any
const result: Record<string, Tool<any, any>> = {}
const s = await state()
const clientsSnapshot = await clients()

@@ -407,7 +408,8 @@
for (const [toolName, tool] of Object.entries(tools)) {
const sanitizedClientName = clientName.replace(/[^a-zA-Z0-9_-]/g, "_")
const sanitizedToolName = toolName.replace(/[^a-zA-Z0-9_-]/g, "_")
result[sanitizedClientName + "_" + sanitizedToolName] = tool
// Cast to avoid strict type variance issues with FlexibleSchema<unknown> vs FlexibleSchema<any>
result[sanitizedClientName + "_" + sanitizedToolName] = tool as Tool<any, any>
}
}
return result
3 changes: 2 additions & 1 deletion packages/opencode/src/provider/provider.ts
@@ -2,7 +2,8 @@ import z from "zod"
import fuzzysort from "fuzzysort"
import { Config } from "../config/config"
import { mapValues, mergeDeep, sortBy } from "remeda"
import { NoSuchModelError, type Provider as SDK } from "ai"
import { NoSuchModelError } from "ai"
import type { ProviderV2 as SDK } from "@ai-sdk/provider"
import { Log } from "../util/log"
import { BunProc } from "../bun"
import { Plugin } from "../plugin"
@@ -69,7 +69,7 @@ export function createOpenaiCompatible(options: OpenaiCompatibleProviderSettings
return new OpenAICompatibleChatLanguageModel(modelId, {
provider: `${options.name ?? "openai-compatible"}.chat`,
headers: getHeaders,
url: ({ path }) => `${baseURL}${path}`,
url: ({ path }: { path: string }) => `${baseURL}${path}`,
fetch: options.fetch,
})
}
@@ -78,7 +78,7 @@ export function createOpenaiCompatible(options: OpenaiCompatibleProviderSettings
return new OpenAIResponsesLanguageModel(modelId, {
provider: `${options.name ?? "openai-compatible"}.responses`,
headers: getHeaders,
url: ({ path }) => `${baseURL}${path}`,
url: ({ path }: { path: string }) => `${baseURL}${path}`,
fetch: options.fetch,
})
}
@@ -488,7 +488,7 @@ export class OpenAIResponsesLanguageModel implements LanguageModelV2 {
]),
),
service_tier: z.string().nullish(),
incomplete_details: z.object({ reason: z.union([z.string(), z.record(z.any())]) }).nullish(),
incomplete_details: z.object({ reason: z.union([z.string(), z.record(z.string(), z.any())]) }).nullish(),
usage: usageSchema,
}),
),
@@ -1322,7 +1322,7 @@ const errorChunkSchema = z.object({
const responseFinishedChunkSchema = z.object({
type: z.enum(["response.completed", "response.incomplete"]),
response: z.object({
incomplete_details: z.object({ reason: z.union([z.string(), z.record(z.any())]) }).nullish(),
incomplete_details: z.object({ reason: z.union([z.string(), z.record(z.string(), z.any())]) }).nullish(),
usage: usageSchema,
service_tier: z.string().nullish(),
}),