using connection.query(...).stream() with promise wrapper #677
maybe we need to wait for async generators to land: http://node.green/#ESNEXT-candidate--stage-3--Asynchronous-Iterators-for-await-of-loops

```js
for await (const row of connection.query(veryBiqSqlResult).stream()) {
  console.log(row);
}
```

( not sure if I can mix buffering

You can access the non-promise API from the wrapper, as in Line 55 in 97a8853:

```js
const [rows] = await connection.query(smallSqlResult);
await new Promise((accept, reject) => {
  const s1 = connection.connection.query(veryBiqSqlResult);
  s1.on('result', function(row) { /* ... */ });
  s1.on('end', accept);
  s1.on('error', reject);
});
```

---
One note on the connection.connection.query(...).stream() part:

---
Oh, thanks @derN3rd, I'll update my example here

---
yes,

---
If you shut down the database while processing a large rowset, after the 'result' handler has been called at least once, neither the 'end' nor the 'error' event ever fires. I'll try to create a test program.

---
@terrisgit try setting `enableKeepAlive: true`. Maybe we should change that to be true by default, but I'm not sure if there might be some undesired side effects.
---

This promisified example writes a result set to a CSV file. It handles all possible errors (mysql connection, SQL, stream write) without crashing or hanging. Feedback wanted.

```js
const csvstringify = require('csv-stringify');
const fs = require('fs');

const outputStream = fs.createWriteStream('output.csv', { encoding: 'utf8' });
const finishedWriting = new Promise((resolve, reject) =>
  outputStream.on('finish', resolve).on('error', reject)); // note: 'finish', not 'finished'

const BOM = '\ufeff'; // Microsoft Excel needs this
outputStream.write(BOM);

const connection = __Create a mysql2 Connection object here__;
const generator = connection.connection.query('SELECT...');
let recordsProcessed = 0;
let onConnectionError;

try {
  await new Promise((resolve, reject) => {
    // When using a connection pool, the 'error' connection event is emitted only when
    // enableKeepAlive is true. See:
    // https://github.com/sidorares/node-mysql2/issues/677#issuecomment-588530194
    // Without this handler, this code will hang if the database connection is broken
    // while reading the result set.
    onConnectionError = reject;
    connection.on('error', onConnectionError);
    generator
      .on('result', row => ++recordsProcessed) // Counting rows just as an example
      .stream({ highWaterMark: 10 })
      .on('error', reject)
      .on('end', resolve)
      .pipe(csvstringify({ header: true }))
      .pipe(outputStream)
      .on('error', error => {
        // Handle stream write error
        // See also https://github.com/sidorares/node-mysql2/issues/664
        // Data is being sent from server to client; without calling destroy(), the
        // connection will go back to the pool and will be unusable. The
        // callback provided to destroy() is never called.
        connection.destroy(); // This appears to cause file descriptor leaks
        reject(error);
      });
  });
} finally {
  connection.removeListener('error', onConnectionError); // Remove the 'error' handler
}
await finishedWriting;
```

---
I've been doing a lot of error testing. The above code can be forced to fail in many ways, but I've been focusing on write errors, which you can force via outputStream.close() after you create it. It seems to leak memory if you run the above code in a loop while closing the outputStream, but that's not a typical runtime error, so it's not a concern.

---
Update: Do not use the following workaround; it is unsafe. See nodejs/node#39722

I ended up with this workaround, if someone's interested:

```js
const unpromisedConnection = connection.connection;
const queryUnpromised = unpromisedConnection.query.bind(unpromisedConnection);

connection.streamQuery = async (sql, values, streamOptions) =>
  new Promise((resolve, reject) => {
    const query = queryUnpromised(sql, values);
    const stream = query.stream(streamOptions);
    stream.once('error', reject);
    query.once('fields', () => {
      stream.removeListener('error', reject);
      resolve(stream);
    });
    query.once('error', reject);
  });
```

This adds the method `streamQuery` to the promise-wrapper connection. The resolved stream may still emit 'error' events:

```js
connection.streamQuery('SELECT blabla malformed sql'); // promise rejects

const stream = await connection.streamQuery('SELECT name FROM employees');
stream.on('error', console.log);
stream.pipe(...);
```

---
Following the problems with error handling that I've encountered while trying to use stream composition with promises/async functions (described in nodejs/node#39722), I feel it would be unsafe to add any kind of support for a promisified stream API, other than perhaps injecting the

---
Just a note for anyone running into this issue: I had to actually include .stream() to make it work, as opposed to not calling it, as suggested in this comment.

My code ended up looking something like this:

note:

---
Interesting. I wonder whether this works:

```js
connection.connection.query('SELECT * FROM (SELECT column_a FROM my_table)')
  .stream()
  .pipe(someStream)
```

---
@eyalroth I have no idea... I think the docs are a little unclear on this. In case you're interested, I have asked a somewhat similar question here: mysqljs/mysql#2523

---
@eyalroth I just wanted to let you (and others) know that I misinterpreted the docs. I have edited my comment accordingly. It is indeed possible to stream individual rows. However, it is not possible to stream individual fields. So a large field/entry cannot be streamed and will be buffered in its entirety. I hope this clarifies it. More info

---
why can't I use

```js
connection.query(...).stream()
```

with the promise wrapper? (you get the error: TypeError: connection.query(...).stream is not a function)

I'd quite like to be able to do:

```js
let q = await connection.query(someSQL);
```

as well as

```js
connection.query(someSQL).stream()
```

in the same script, but atm I need the promise API for the first and the standard API for the second...