Merged
20 changes: 0 additions & 20 deletions .github/workflows/deploy-search.yml

This file was deleted.

18 changes: 0 additions & 18 deletions .github/workflows/update-indexes.yml

This file was deleted.

3 changes: 3 additions & 0 deletions .gitignore
@@ -8,6 +8,9 @@ node_modules
# testing
coverage

# vite
vite.config.ts.timestamp-*

# next.js
.next/
out/
41 changes: 25 additions & 16 deletions CLAUDE.md
@@ -49,30 +49,41 @@ Bahar is an Arabic language learning application with these key features:

## Development Environment

### Docker Development Setup (Recommended)
### Local Development

- Start all services: `docker compose up`
- Front-end available at: http://localhost:4000
- API available at: http://localhost:3000
- Drizzle Studio available at: http://localhost:4983
- LibSQL Server available at: http://localhost:8080 (local development database)
Start each service in a separate terminal:

### Local Development
```bash
# Terminal 1: Local database (port 8080)
make local-db

# Terminal 2: API server (port 3000)
pnpm run dev --filter api

# Terminal 3: Web app (port 4000)
pnpm run dev --filter web

# Terminal 4 (optional): Database UI (port 4983)
pnpm run --filter api drizzle:studio
```

First-time setup:
```bash
pnpm install
pnpm run --filter api drizzle:migrate
```

### Other Commands

- Install packages: `pnpm install`
- Build all: `pnpm run build` or `turbo build`
- Dev mode: `pnpm run dev` or `turbo dev`
- Lint all: `pnpm run lint` or `turbo lint`
- Type check: `pnpm run type-check` or `turbo type-check`
- Local database: `turso dev --db-file apps/api/local.db` or `make local-db`

### Production Setup

- Build and run production: `make prod`
- Build production web app: `make build`
- Run production containers: `docker compose -f docker-compose.prod.yaml up -d`
- Serve production web app: `make serve`
- Cleanup production environment: `make cleanup`
- Serve production web app locally: `make serve`
- Delete local data (databases): `make delete-local-data`

## Testing

@@ -101,7 +112,6 @@ The monorepo includes shared packages in `/packages`:
- `@bahar/fsrs`: FSRS spaced repetition algorithm utilities
- `@bahar/i18n`: Internationalization with Lingui
- `@bahar/result`: Result type for explicit error handling
- `@bahar/schemas`: Shared Zod validation schemas
- `@bahar/search`: Orama search configuration and utilities

## Web App Data Flow
@@ -126,7 +136,6 @@ The web app uses a hybrid approach with local-first data handling:
3. **Writing Data**:

- Create/update/delete operations write to local database immediately
- API also writes to remote user database (dual-write pattern during migration)
- Background sync pushes local changes to remote every 60 seconds

4. **Synchronization**:
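The 60-second background push described in step 3 could be sketched like this (a hypothetical sketch — `getPending`, `push`, and the scheduling details are assumptions, not this repo's actual sync code):

```typescript
// Hypothetical sketch of the periodic local-to-remote sync loop.
const SYNC_INTERVAL_MS = 60_000;

// Pure helper: has enough time passed since the last successful push?
function shouldSync(lastSyncAt: number, now: number): boolean {
  return now - lastSyncAt >= SYNC_INTERVAL_MS;
}

// Assumed shape: collect pending local writes, push them to the remote DB.
function startBackgroundSync(
  getPending: () => string[],
  push: (changes: string[]) => Promise<void>,
): () => void {
  const timer = setInterval(() => {
    const pending = getPending();
    if (pending.length > 0) void push(pending);
  }, SYNC_INTERVAL_MS);
  return () => clearInterval(timer); // call the returned function to stop syncing
}
```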
15 changes: 3 additions & 12 deletions Makefile
@@ -10,20 +10,11 @@ serve:
build:
NODE_ENV=production pnpm turbo build --filter=web

# Run production application
prod: build
docker compose -f docker-compose.prod.yaml up -d --remove-orphans
make serve || (make cleanup && exit 1)

cleanup:
docker compose -f docker-compose.prod.yaml down

# Also make sure to clean up the "testing" db group
# in turso as that will have any user dbs that were
# created even locally.
delete-local-data:
sudo rm -rf ./libsql
sudo rm -rf ./meili_data
sudo rm -rf ./apps/api/local.db*

.PHONY: serve build prod cleanup
.PHONY: local-db serve build delete-local-data
2 changes: 1 addition & 1 deletion README.md
@@ -19,7 +19,7 @@ Bahar is an Arabic language learning application built as a monorepo using pnpm
3. Access the web app at `http://localhost:4000`
4. Access the API at `http://localhost:3000`

To run a production setup locally, use `make prod`.
To build and serve the production web app locally, use `make build` followed by `make serve`.

## Projects

5 changes: 1 addition & 4 deletions apps/api/.example.env
@@ -12,7 +12,7 @@ APP_DOMAIN=""
GITHUB_CLIENT_SECRET=""
GITHUB_CLIENT_ID=""

SENDGRID_API_KEY=""
RESEND_API_KEY=""

UPSTASH_REDIS_REST_URL=""
UPSTASH_REDIS_REST_TOKEN=""
@@ -23,9 +23,6 @@ TURSO_PLATFORM_API_KEY=""
TURSO_ORG_SLUG=""
TURSO_DB_GROUP="testing"

MEILISEARCH_HOST="http://meilisearch:7700"
MEILISEARCH_API_KEY="MASTER_KEY"

BETTER_AUTH_SECRET=""

SENTRY_DSN=""
46 changes: 43 additions & 3 deletions apps/api/README.md
@@ -22,6 +22,7 @@ The API uses a database-per-user architecture:
- **Per-User Databases**: Each user has their own Turso database for personal data (dictionary entries, flashcards, decks)

This design provides:

- Data isolation between users
- Independent scaling per user
- Simplified backup and restore
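In code, the user-to-database lookup might look roughly like this (a hedged sketch — the URL naming scheme, the in-memory registry, and the field names are assumptions, not this repo's actual implementation):

```typescript
// Sketch: resolving a per-user Turso/libSQL database.
// The naming scheme and registry below are illustrative assumptions.
interface UserDbRecord {
  userId: string;
  dbUrl: string; // e.g. "libsql://user-abc123-myorg.turso.io"
}

const registry = new Map<string, UserDbRecord>();

function registerUserDb(userId: string, org: string): UserDbRecord {
  const record: UserDbRecord = {
    userId,
    dbUrl: `libsql://user-${userId}-${org}.turso.io`,
  };
  registry.set(userId, record);
  return record;
}

function resolveUserDb(userId: string): UserDbRecord {
  const record = registry.get(userId);
  if (!record) throw new Error(`no database provisioned for user ${userId}`);
  return record;
}
```

Each request for personal data (dictionary entries, flashcards, decks) would route through such a lookup before touching a database client.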
@@ -53,24 +54,63 @@ pnpm install
### Local Development

1. Start the local database:

```bash
turso dev --db-file local.db
# or from project root:
make local-db
```

2. Run migrations:

```bash
pnpm run drizzle:migrate
```

3. Start the dev server:
3. Start the dev servers (from project root):
```bash
pnpm dev
pnpm run dev
```

The API runs on `http://localhost:3000`.

### Setting Up a New User (Fresh Local DB)

When working with a fresh local database, you need to manually seed the user database migrations:

1. Start all dev servers from the project root:

```bash
pnpm run dev
```

2. In a separate terminal, start the local database:

```bash
make local-db
```

3. Run the (local) central database migrations:

```bash
pnpm run --filter api drizzle:migrate
```

4. Create a new user through the web app (sign-up flow)

5. Manually convert the user to an admin in the local database:

- Open Drizzle Studio: `pnpm run --filter api drizzle:studio`
- Find the user in the `users` table and set the role to `admin`

6. Seed the user database migrations:

- Copy the migration SQL from `packages/drizzle-user-db-schemas/drizzle/*.sql`
- Remove the `--> statement-breakpoint` markers and replace with newlines
- Manually insert records into the `migrations` table in the user's database

7. Log out and log back in for changes to take effect
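Step 6's marker cleanup is mechanical enough to script. A minimal sketch (the helper name is hypothetical; the `--> statement-breakpoint` markers are what drizzle-kit emits between statements):

```typescript
// Split a drizzle-kit migration file into individual SQL statements
// by removing the "--> statement-breakpoint" markers it inserts.
function splitStatements(sql: string): string[] {
  return sql
    .split("--> statement-breakpoint")
    .map((s) => s.trim())
    .filter((s) => s.length > 0);
}

// Example input mirroring apps/api/drizzle/0011_dazzling_jackal.sql:
const migration =
  "DROP TABLE `decks`;--> statement-breakpoint\nDROP TABLE `settings`;";
// splitStatements(migration) → ["DROP TABLE `decks`;", "DROP TABLE `settings`;"]
```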

## Database Commands

```bash
@@ -96,9 +136,9 @@ Utility scripts for database management are in `/scripts`:

- `create-user-dbs.ts` - Create individual Turso databases for users
- `apply-user-db-migrations.ts` - Apply schema migrations to user databases
- `migrate-settings-decks-to-user-db.ts` - Migrate data from central to user databases

Run scripts with:

```bash
npx tsx --env-file=.env scripts/<script-name>.ts
```
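For context, the core of `create-user-dbs.ts` presumably boils down to one platform-API call per user. A hedged sketch (the endpoint shape follows Turso's public v1 REST API; the function and option names here are made up, not the script's real code):

```typescript
// Build the HTTP request that provisions one Turso database for a user.
// Endpoint shape per Turso's platform API:
//   POST /v1/organizations/{orgSlug}/databases  with body { name, group }
interface HttpRequestSpec {
  url: string;
  method: string;
  headers: Record<string, string>;
  body: string;
}

function buildCreateDbRequest(opts: {
  orgSlug: string; // TURSO_ORG_SLUG
  apiKey: string; // TURSO_PLATFORM_API_KEY
  dbName: string;
  group: string; // e.g. TURSO_DB_GROUP="testing"
}): HttpRequestSpec {
  return {
    url: `https://api.turso.tech/v1/organizations/${opts.orgSlug}/databases`,
    method: "POST",
    headers: {
      Authorization: `Bearer ${opts.apiKey}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({ name: opts.dbName, group: opts.group }),
  };
}
```

The real script would then send this request with `fetch` and, presumably, record the new database in the central `databases` table.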
2 changes: 2 additions & 0 deletions apps/api/drizzle/0011_dazzling_jackal.sql
@@ -0,0 +1,2 @@
DROP TABLE `decks`;--> statement-breakpoint
DROP TABLE `settings`;
20 changes: 20 additions & 0 deletions apps/api/drizzle/0012_conscious_rhodey.sql
@@ -0,0 +1,20 @@
DROP INDEX "accounts_userId_idx";--> statement-breakpoint
DROP INDEX "sessions_token_unique";--> statement-breakpoint
DROP INDEX "sessions_userId_idx";--> statement-breakpoint
DROP INDEX "users_email_unique";--> statement-breakpoint
DROP INDEX "verifications_identifier_idx";--> statement-breakpoint
DROP INDEX "databases_user_id_unique";--> statement-breakpoint
ALTER TABLE `accounts` ALTER COLUMN "created_at" TO "created_at" integer NOT NULL DEFAULT (cast(unixepoch('subsecond') * 1000 as integer));--> statement-breakpoint
CREATE INDEX `accounts_userId_idx` ON `accounts` (`user_id`);--> statement-breakpoint
CREATE UNIQUE INDEX `sessions_token_unique` ON `sessions` (`token`);--> statement-breakpoint
CREATE INDEX `sessions_userId_idx` ON `sessions` (`user_id`);--> statement-breakpoint
CREATE UNIQUE INDEX `users_email_unique` ON `users` (`email`);--> statement-breakpoint
CREATE INDEX `verifications_identifier_idx` ON `verifications` (`identifier`);--> statement-breakpoint
CREATE UNIQUE INDEX `databases_user_id_unique` ON `databases` (`user_id`);--> statement-breakpoint
ALTER TABLE `sessions` ALTER COLUMN "created_at" TO "created_at" integer NOT NULL DEFAULT (cast(unixepoch('subsecond') * 1000 as integer));--> statement-breakpoint
ALTER TABLE `users` ALTER COLUMN "email_verified" TO "email_verified" integer NOT NULL DEFAULT false;--> statement-breakpoint
ALTER TABLE `users` ALTER COLUMN "created_at" TO "created_at" integer NOT NULL DEFAULT (cast(unixepoch('subsecond') * 1000 as integer));--> statement-breakpoint
ALTER TABLE `users` ALTER COLUMN "updated_at" TO "updated_at" integer NOT NULL DEFAULT (cast(unixepoch('subsecond') * 1000 as integer));--> statement-breakpoint
ALTER TABLE `users` ALTER COLUMN "banned" TO "banned" integer DEFAULT false;--> statement-breakpoint
ALTER TABLE `verifications` ALTER COLUMN "created_at" TO "created_at" integer NOT NULL DEFAULT (cast(unixepoch('subsecond') * 1000 as integer));--> statement-breakpoint
ALTER TABLE `verifications` ALTER COLUMN "updated_at" TO "updated_at" integer NOT NULL DEFAULT (cast(unixepoch('subsecond') * 1000 as integer));