Mocking, faking and making your tests independent
=================================================

Examples of the mocking and faking functionality can be found in my fork [on GitHub](https://github.com/v-maliutin/pgtap/tree/examples_forking_and_mocking/examples).

Faking
------

Let's say we have a main table and a subordinate table.

    CREATE TABLE master(
        some_id int primary key
    );

    CREATE TABLE child(
        master_id int,
        some_data numeric,
        CONSTRAINT child_fk FOREIGN KEY(master_id) REFERENCES master(some_id)
    );

    -- Your function simply sums up the values of the some_data column,
    -- filtering them by the master_id field.
    SELECT SUM(some_data) AS sm FROM child WHERE master_id = $1;

The main point here is that the master table is intended exclusively for user input,
and you cannot make any assumptions about the values stored in the some_id field.
Consequently, you will not be able to insert any data into the child table until
the corresponding data exists in the master table.

What you can do is drop the foreign key constraint, insert some data into the child table only,
and perform any necessary checks.

Do not worry that you've lost the relationship between the tables and thus compromised relational integrity:
you always know which data are invalid for your system, so simply don't insert them. That's it.
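
A minimal sketch of that idea, assuming the master/child tables above and a plain pgTAP test script:

    -- Drop the constraint so child rows no longer need a matching master row.
    ALTER TABLE child DROP CONSTRAINT child_fk;

    -- Insert only the rows this scenario needs.
    INSERT INTO child(master_id, some_data) VALUES (1, 10), (1, 32.5);

    -- Check the summing query with pgTAP's is().
    SELECT is(
        (SELECT SUM(some_data) FROM child WHERE master_id = 1),
        42.5::numeric,
        'sums some_data for the given master_id'
    );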

### `fake_table()` ###

    PERFORM fake_table( :_table_ident, :_make_table_empty, :_leave_primary_key, :_drop_not_null, :_drop_collation, :_drop_partitions );
    PERFORM fake_table( :_table_ident, :_make_table_empty, :_leave_primary_key, :_drop_not_null, :_drop_collation );
    PERFORM fake_table( :_table_ident, :_make_table_empty, :_leave_primary_key, :_drop_not_null );
    PERFORM fake_table( :_table_ident, :_make_table_empty, :_leave_primary_key );
    PERFORM fake_table( :_table_ident, :_make_table_empty );
    PERFORM fake_table( :_table_ident );

**Parameters**

`:_table_ident`
: Text array of tables to be faked in the format 'my_schema.my_table'. Required.

`:_make_table_empty`
: Boolean. If TRUE the desired tables will be truncated. Optional. FALSE by default.

`:_leave_primary_key`
: Boolean. TRUE if you want the primary key to stay as is. Optional. FALSE by default.

`:_drop_not_null`
: Boolean. TRUE if you want NOT NULL constraints to be dropped so that all columns become nullable. Optional. FALSE by default.

`:_drop_collation`
: Boolean. Not yet implemented. Optional. FALSE by default.

`:_drop_partitions`
: Boolean. If TRUE, any declarative partitions of the table will be dropped. Supported starting with PostgreSQL 10.
Optional. FALSE by default.

Sometimes a table contains a lot of junk data, especially in a development environment.
However, to ensure that your test runs on completely valid data, you clear the table before testing,
insert valid data, and then run the check. This is why the `_make_table_empty` parameter is useful.
If you follow these steps, you can be sure that your test does not rely on
junk data and can be reproduced at any time. Example:

    PERFORM fake_table(
        _table_ident => '{pgconf.account, pgconf.analytic, pgconf.osv, pgconf.transactions}'::text[],
        _make_table_empty => true,
        _leave_primary_key => false,
        _drop_not_null => false,
        _drop_collation => false
    );

One note: the parameters `_leave_primary_key` and `_drop_not_null` currently contradict each other.
If you'd like to keep the primary key while making all columns nullable
(`_leave_primary_key => true, _drop_not_null => true`), you'll receive a runtime error
(depending on whether the table actually has a primary key). I'm addressing this issue; meanwhile,
the workaround is to set `_leave_primary_key => false, _drop_not_null => true` and then, immediately
after 'fake_table' has run, simply execute a command to recreate your primary key,
e.g., alter table my_table add constraint...
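
A sketch of that workaround; the table `my_schema.my_table`, its `id` column, and the constraint name are made up for illustration:

    -- Fake the table without keeping the primary key, making all columns nullable.
    PERFORM fake_table(
        _table_ident => '{my_schema.my_table}'::text[],
        _make_table_empty => true,
        _leave_primary_key => false,
        _drop_not_null => true
    );

    -- Immediately recreate the primary key by hand
    -- (PostgreSQL puts NOT NULL back on the key column automatically).
    ALTER TABLE my_schema.my_table ADD CONSTRAINT my_table_pk PRIMARY KEY (id);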

In general, it's important to remember that there's no way to create a primary key consisting
of nullable columns. Furthermore, when you create a primary key,
PostgreSQL automatically enforces a NOT NULL constraint on each of its columns.

Mocking
-------

Imagine a situation where you have one function that is called by another function.
The first function (the inner one) has complex logic and depends on multiple tables.
The second function (the outer one) simply performs a few actions using the result
returned by the first function. Should you populate the tables required by the inner function
with test case data for every test scenario when testing the outer function? Good news: you shouldn't.
Simply create a mock for each test scenario and call it like the regular inner function.
The mock bears the same name as your inner function, but acts in place of your real function.

    CREATE OR REPLACE FUNCTION pgconf.time_machine_now()
    RETURNS time
    LANGUAGE sql
    AS $$
        SELECT now()::time;
    $$;

    PERFORM mock_func('pgconf', 'time_machine_now', '()'
        , _return_scalar_value => '13:00'::time);

The script above shows how to create a mock for the 'time_machine_now()' function.
In production, 'time_machine_now()' returns the current time; in a test, once the function has been mocked,
it always returns 13:00. This is a scalar mock.
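
A quick check of the scalar mock, sketched with pgTAP's own 'is()':

    SELECT is(
        pgconf.time_machine_now(),
        '13:00'::time,
        'mocked time_machine_now() returns the fixed time'
    );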

Another kind of mock returns a set defined by an SQL string.

    CREATE OR REPLACE FUNCTION pgconf.time_machine_now()
    RETURNS TABLE ( t time )
    LANGUAGE sql
    AS $$
        SELECT now()::time;
    $$;

    PERFORM tap.mock_func('pgconf', 'time_machine_now', '()'
        , _return_set_value => 'select ''13:00''::time as t');

And my favorite case is a prepared statement:

    PREPARE mock_time_machine_now AS SELECT '14:00'::time AS t;
    PERFORM tap.mock_func('pgconf', 'time_machine_now', '()'
        , _return_set_value => 'mock_time_machine_now');
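
To verify a set-returning mock, here is a sketch comparing the mocked function with the same prepared statement using pgTAP's 'results_eq()':

    SELECT results_eq(
        'SELECT * FROM pgconf.time_machine_now()',
        'mock_time_machine_now',
        'mocked time_machine_now() returns the prepared dataset'
    );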

### `mock_func()` ###

    PERFORM mock_func( :_func_schema, :_func_name, :_func_args, :_return_scalar_value );
    PERFORM mock_func( :_func_schema, :_func_name, :_func_args, :_return_set_value );
    PERFORM mock_func( :_func_schema, :_func_name, :_func_args );

**Parameters**

`:_func_schema`
: Text. This is the schema where your function is declared. Required.

`:_func_name`
: Text. Function name. Required.

`:_func_args`
: Text. PG supports something similar to polymorphism (function overloading). That’s why you always have to provide
a specific signature, so that PG can find the function you want to mock.
If there are no parameters, just give '()'.
If your function has parameters, you must specify them as follows:
(_int_param int, _text_param text = null, _ts_param timestamptz = now()).
The simplest way to find the correct signature of your function is to call 'get_routine_signature'. Required.
A sketch of mocking a parameterized function follows this parameter list.

`:_return_scalar_value`
: Text. A scalar value that your mock has to return in the test context. Optional. Default NULL.
Obviously, you have to provide either '_return_set_value' or '_return_scalar_value'.

`:_return_set_value`
: Text. SQL code forming the dataset that your mock should return in the test context.
You may also provide the name of a prepared statement. There is a convention to name a prepared statement
using the following pattern: 'mock_my_dataset'. Optional. Default NULL.
Obviously, you have to provide either '_return_set_value' or '_return_scalar_value'.
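
The sketch mentioned above, mocking a function that takes parameters; the function 'pgconf.exchange_rate' and its signature are made up for illustration:

    -- The argument string must match the real signature, defaults included.
    PERFORM mock_func('pgconf', 'exchange_rate', '(_currency text, _on_date date = current_date)'
        , _return_scalar_value => '1.25');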

### `get_routine_signature()` ###

    PERFORM get_routine_signature( :_routine_schema, :_routine_name );
    PERFORM get_routine_signature( :_routine_name );

**Parameters**

`:_routine_schema`
: Text. This is the schema where your routine is declared. Required.

`:_routine_name`
: Text. Routine name. Required.

This is a simple way to find out how PostgreSQL stores the signature of your routine.
Use this function to select at least the arguments-with-defaults column ('args_with_defs') and pass that value to 'mock_func'.
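
A usage sketch, assuming the function returns a row that includes the 'args_with_defs' column mentioned above:

    SELECT args_with_defs
    FROM get_routine_signature('pgconf', 'time_machine_now');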

### `mock_view()` ###

    PERFORM mock_view( :_view_schema, :_view_name, :_return_set_sql );

**Parameters**

`:_view_schema`
: Text. This is the schema where your view is declared. Required.

`:_view_name`
: Text. View name. Required.

`:_return_set_sql`
: Text. SQL code forming the dataset that your mock should return in the test context. Required.

Creates a mock replacement for a real view. See the 'mock_func' function above for more details about mocking.
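
A sketch; the view 'pgconf.account_balances' and its columns are made up for illustration:

    PERFORM mock_view('pgconf', 'account_balances'
        , _return_set_sql => 'SELECT 1 AS account_id, 100.00::numeric AS balance');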

Asserting on the number of calls
--------------------------------

### `call_count()` ###

    SELECT call_count( :_call_count, :_func_schema, :_func_name, :_func_args );

To be able to count function calls, make sure the 'track_functions' setting is set to 'all'.
A usage sketch follows the parameter list below.

**Parameters**

`:_call_count`
: Int. How many calls you are expecting. Required.

`:_func_schema`
: Text. This is the schema where your function is declared. Required.

`:_func_name`
: Text. Function name. Required.

`:_func_args`
: Text. PG supports something similar to polymorphism (function overloading). That’s why you always have to provide
a specific signature, so that PG can find the function whose calls you want to count.
If there are no parameters, just give '()'.
If your function has parameters, you must specify them as follows:
(_int_param int, _text_param text = null, _ts_param timestamptz = now()). Required.
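
The sketch mentioned above, reusing 'time_machine_now()' from the mocking examples:

    -- track_functions must be 'all' for call statistics to be gathered (see above).
    SELECT pgconf.time_machine_now();
    SELECT pgconf.time_machine_now();

    SELECT call_count(2, 'pgconf', 'time_machine_now', '()');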

Helper functions
----------------

### `print_table_as_json()` ###

The greatest feature of pgTAP is that every test run happens within its own transaction. That's awesome,
but it complicates tracking what's happening internally. By calling the 'print_table_as_json'
function, you can see what data was present in a table (or query) at the time the test was running.

**Parameters**

`:_table_schema`
: Text. This is the schema where your table is declared. Required.

`:_table_name`
: Text. Table name. Required.

In the current version, all we need to do is create a table using some SQL statements in a certain schema
and pass the table and schema names to the function. The function will then issue a 'RAISE NOTICE' command
containing SQL code that you can copy-paste and execute. The result will appear as a regular table.
If you work in DBeaver, look for the output of the 'RAISE NOTICE' command in the Output window.
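
A usage sketch; the call style is assumed here, and 'pgconf.account' is just an example table:

    SELECT print_table_as_json('pgconf', 'account');
    -- Copy the SQL from the RAISE NOTICE output and run it to see the rows as a regular table.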

### `print_query_as_json()` ###

It performs precisely the same action as 'print_table_as_json', except that you don't have to create a table.
Simply prepare a statement and supply its name to the function.
This lets you inspect any data you're working with during testing:
construct a query returning your variables, settings, or any other relevant information,
and output it via the 'print_query_as_json' function. A sketch follows the parameter list below.

**Parameters**

`:_statements`
: Text. This is the name of the prepared statement whose data you're looking to explore. Required.
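
The sketch mentioned above; the prepared-statement name and the selected columns are made up:

    PREPARE debug_state AS
        SELECT current_setting('search_path') AS search_path, now() AS executed_at;

    SELECT print_query_as_json('debug_state');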

### `drop_prepared_statement()` ###

Any prepared statement is a session-level object. Thus, if you reuse the same names across tests,
as I do, you must drop previously created prepared statements before you can use the names again.
It's advisable to leverage the 'setup' and 'teardown' features provided by pgTAP; a sketch follows the parameter list below.

**Parameters**

`:_statements`
: Array of text. Those are names of the prepared statements you want to drop. Required.
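
A sketch of a teardown function for use with runtests(); the statement name comes from the mocking examples above, and this body is only one way to wire it up:

    CREATE OR REPLACE FUNCTION teardown_statements()
    RETURNS SETOF TEXT
    LANGUAGE plpgsql
    AS $$
    BEGIN
        -- Drop prepared statements that are reused across tests.
        PERFORM drop_prepared_statement('{mock_time_machine_now}'::text[]);
        RETURN;
    END;
    $$;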

Compose Yourself
================
