I don't know whether it would be relevant. Some benchmarking might be needed to determine whether the original json_encode fails on big arrays of big objects; if it does, accepting iterators could become useful. Compared to the original function, special constants would be needed, such as FORCE_OBJECT (which already exists) and a new FORCE_LIST.
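To make the flag idea concrete, here is a minimal, purely illustrative sketch. `json_encode_iterable()` and `JSON_FORCE_LIST` are hypothetical names (only `JSON_FORCE_OBJECT` exists in ext/json today), and this version still buffers the whole input, so it only shows the intended semantics of the flags, not a memory win:

```php
<?php

// Hypothetical sketch only: json_encode_iterable() and JSON_FORCE_LIST do not
// exist in PHP; JSON_FORCE_OBJECT is the real ext/json flag.
const JSON_FORCE_LIST = 1 << 30; // placeholder bit, purely illustrative

function json_encode_iterable(iterable $input, int $flags = 0): string
{
    $data = [];
    foreach ($input as $key => $value) {
        $data[$key] = $value; // still buffers everything; see the streaming discussion below
    }

    if ($flags & JSON_FORCE_LIST) {
        $data = array_values($data);   // drop keys so the output is a JSON array ([...])
        $flags &= ~JSON_FORCE_LIST;    // strip the fake flag before calling ext/json
    }

    return json_encode($data, $flags | JSON_THROW_ON_ERROR);
}
```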
I think this would be out of scope. Effectively this calls for a streaming JSON encoder, as you'd probably want not just the input to be an iterator, but the output to be incremental as well. A quick google turns up https://github.com/violet-php/streaming-json-encoder, though I have no further familiarity with that library.
Thank you for this library. I was not talking about streaming per se (I already did something using StreamResponse + IteratorAggregate + __invoke), but rather a generic iterable->toJsonString that we could then store, for example, in a database or elsewhere. I don't know whether there are use cases where json_encode(iterator_to_array($myIterator)) would create a massive memory overhead that we could avoid with this new function.
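For the memory question, a rough sketch of what an incremental approach could look like (this is not this library's or ext/json's API, just a generator-based illustration): instead of materializing the iterator with iterator_to_array() and then encoding, each element is encoded on its own and the pieces are concatenated or written out as they are produced.

```php
<?php

// Generic illustration, not an existing API: encode an iterable as a JSON
// array piece by piece, so the decoded values never have to sit in one big
// array the way json_encode(iterator_to_array($myIterator)) requires.
function json_chunks(iterable $rows): \Generator
{
    yield '[';
    $first = true;
    foreach ($rows as $row) {
        if (!$first) {
            yield ',';
        }
        $first = false;
        yield json_encode($row, JSON_THROW_ON_ERROR);
    }
    yield ']';
}

// $myIterator is the iterable from the comment above.
// Concatenate the chunks into one string to store in a database...
$json = '';
foreach (json_chunks($myIterator) as $chunk) {
    $json .= $chunk; // ...or fwrite() each chunk to a stream to keep peak memory flat
}
```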