Apply rabbit suggestions on the e2e feature branch #3389
arkid15r merged 4 commits into OWASP:feature/e2e-backend
Conversation
PR validation failed: No linked issue and no valid closing issue reference in PR description
Summary by CodeRabbit
Walkthrough

Refactors SQL in the `dump_data` management command to use `psycopg2.sql` compositions, updates tests accordingly, adjusts several GraphQL docstrings and resolver behaviors, changes ORM relation loading and a safe lookup, and reorders two frontend env variables.
Estimated code review effort: 🎯 3 (Moderate) | ⏱️ ~25 minutes
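The walkthrough's move from string interpolation to `psycopg2.sql` compositions can be illustrated without a database connection. The sketch below uses a simplified, hypothetical `quote_ident` helper as a stand-in for `sql.Identifier` (the real command composes `sql.SQL(...).format(db=sql.Identifier(...))`); the database name `nest_temp` is an assumption for illustration.

```python
# Minimal stand-in for psycopg2.sql-style composition (illustration only;
# the actual dump_data command uses psycopg2.sql.SQL / sql.Identifier).

def quote_ident(name: str) -> str:
    """Quote a PostgreSQL identifier, doubling any embedded double quotes."""
    return '"' + name.replace('"', '""') + '"'

def drop_database_sql(temp_db: str) -> str:
    # psycopg2 equivalent:
    #   sql.SQL("DROP DATABASE IF EXISTS {db};").format(db=sql.Identifier(temp_db))
    return f"DROP DATABASE IF EXISTS {quote_ident(temp_db)};"

print(drop_database_sql("nest_temp"))  # → DROP DATABASE IF EXISTS "nest_temp";
print(drop_database_sql('evil"db'))    # embedded quote is doubled, not injected
```

Quoting identifiers this way is what makes the composed statements safe when a table or database name comes from runtime data rather than a literal.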
🚥 Pre-merge checks: ❌ 3 failed checks (2 warnings, 1 inconclusive)
Actionable comments posted: 1
🤖 Fix all issues with AI agents
In `@backend/apps/common/management/commands/dump_data.py`:
- Around lines 96-104: The DROP DATABASE cleanup currently catches only
  OperationalError around the `self._execute_sql` call that runs the
  `DROP DATABASE IF EXISTS {temp_db}` SQL. Broaden the handler to
  `except (OperationalError, ProgrammingError):` and import ProgrammingError
  where OperationalError is imported, so this best-effort cleanup cannot raise
  an unhandled ProgrammingError from `_execute_sql`.
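A rough sketch of the suggested broadened handler follows. The exception classes here are local stubs standing in for psycopg2's `OperationalError` / `ProgrammingError` (so the snippet runs standalone), and `execute_sql` / `cleanup_temp_db` are hypothetical names, not the command's real API.

```python
# Stubs standing in for psycopg2's error classes (the real code imports them).
class OperationalError(Exception): ...
class ProgrammingError(Exception): ...

def execute_sql(db: str, statements: list[str]) -> None:
    # Stand-in for the command's _execute_sql; simulate one failure mode
    # that the narrower handler would have let escape.
    raise ProgrammingError("cannot drop the currently open database")

def cleanup_temp_db(temp_db: str) -> None:
    """Best-effort DROP DATABASE: swallow both error classes."""
    try:
        execute_sql("postgres", [f'DROP DATABASE IF EXISTS "{temp_db}";'])
    except (OperationalError, ProgrammingError):
        pass  # cleanup is best-effort; never mask the original failure

cleanup_temp_db("nest_temp")  # does not raise
```

With only `except OperationalError:`, the simulated `ProgrammingError` would propagate out of the cleanup path, which is exactly what the review comment flags.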
Actionable comments posted: 0
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (1)
backend/apps/common/management/commands/dump_data.py (1)
109-120: Schema‑qualify email updates to prevent targeting wrong table if search_path is modified.
`_table_list_query()` explicitly filters `WHERE table_schema = 'public'`, but `_remove_emails()` generates `UPDATE {table}` without schema qualification. If `search_path` is customized or contains a schema with a same-named table, the UPDATE could affect the wrong schema, leaving emails unmasked in the public tables being dumped. Select `table_schema` alongside `table_name` and qualify updates with `{schema}.{table}`:

Proposed fix
```diff
     table_list = self._execute_sql(temp_db, [self._table_list_query()])
-    self._execute_sql(temp_db, self._remove_emails([row[0] for row in table_list]))
+    self._execute_sql(
+        temp_db,
+        self._remove_emails([(row[0], row[1]) for row in table_list]),
+    )

 def _table_list_query(self) -> sql.Composable:
     return sql.SQL("""
-        SELECT table_name
+        SELECT table_schema, table_name
         FROM information_schema.columns
         WHERE table_schema = 'public'
         AND column_name = 'email';
     """)

-def _remove_emails(self, tables: list[str]) -> list[sql.Composable]:
+def _remove_emails(self, tables: list[tuple[str, str]]) -> list[sql.Composable]:
     return [
-        sql.SQL("UPDATE {table} SET email = '';").format(table=sql.Identifier(table))
-        for table in tables
+        sql.SQL("UPDATE {schema}.{table} SET email = '';").format(
+            schema=sql.Identifier(schema), table=sql.Identifier(table)
+        )
+        for schema, table in tables
     ]
```
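The effect of the proposed fix can be sketched in plain Python. The hypothetical `quote_ident` helper below is a simplified stand-in for `sql.Identifier` (the real fix composes `sql.SQL("UPDATE {schema}.{table} SET email = '';")`), and the table names `auth_user` / `github_user` are illustrative, not taken from the repository.

```python
# Schema-qualified email masking, mirroring the proposed fix with a
# plain-string stand-in for psycopg2's sql.Identifier.

def quote_ident(name: str) -> str:
    """Simplified identifier quoting (sql.Identifier equivalent)."""
    return '"' + name.replace('"', '""') + '"'

def remove_emails(tables: list[tuple[str, str]]) -> list[str]:
    # Each (schema, table) pair yields one UPDATE targeting exactly that
    # schema, so a customized search_path cannot redirect the statement.
    return [
        f"UPDATE {quote_ident(schema)}.{quote_ident(table)} SET email = '';"
        for schema, table in tables
    ]

stmts = remove_emails([("public", "auth_user"), ("public", "github_user")])
print(stmts[0])  # → UPDATE "public"."auth_user" SET email = '';
```

Because the schema is quoted and explicit, the statement is resolved against `public` regardless of what `search_path` contains, which is the point of the review comment.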
* Update dump_data
* Apply rabbit suggestions
* Update tests and dump_data command
* Update update-nest-test-images.yaml



Proposed change
Resolves #(put the issue number here)
Add the PR description here.
Checklist
`make check-test` locally: all warnings addressed, tests passed