Caveat: this might be a dumb question.

I'm using SQLKit in a semi-low-level, ORM-ish thing I rolled myself with a limited scope, and it's been working very well for a few years. I have a convenience decoding function that I use to decode my own models, which works great, and in my actual Postgres client I map over the returned rows and decode each one individually.
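Roughly along these lines (a simplified sketch from memory; the `Model` protocol here stands in for `DuetSQL.Model`, which has more requirements than `Decodable`, and `Thing`/`decodeThings` are just illustrative names):

```swift
import SQLKit

// Stand-in for my real DuetSQL.Model protocol; Decodable is the only
// requirement that matters for this sketch.
public protocol Model: Decodable {}

public extension SQLRow {
  // Convenience wrapper over SQLKit's built-in Decodable row support.
  func decode<M: Model>(_ type: M.Type) throws -> M {
    try self.decode(model: M.self)
  }
}

// Illustrative model, plus the per-row decoding my client does today.
struct Thing: Model {
  var id: Int
  var name: String
}

func decodeThings(from rows: [any SQLRow]) throws -> [Thing] {
  try rows.map { try $0.decode(Thing.self) }
}
```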
I've noticed, while doing some optimization on a few slower queries/endpoints, that decoding models for certain large requests is a non-trivial part of the overall request time. I was wondering whether the iteration and individual decoding could be improved by decoding in bulk. I wanted to write something like this:
```swift
public extension Collection where Element == SQLRow {
  func decodeAll<M: DuetSQL.Model>(_: M.Type) throws -> [M] {
    ??? // <-- 🤔
  }
}
```
But (as indicated by the `???`) I can't figure out how to implement this. Is it possible? I poked around with `SQLRowDecoder` and `SQLQueryFetcher` but couldn't figure out how to hold it right.
Would greatly appreciate any insight. Thanks so much for the library!
You could try using `SQLQueryFetcher`'s `run(decoding:prefix:keyDecodingStrategy:userInfo:_:)` method to get decoding that doesn't rely on the gather-then-decode semantics that the `all()` methods use, although you then have to deal with unwrapping the `Result`s passed to the handler.
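Shape-wise, something like this untested sketch (`Thing`, `things`, and `db` are placeholders, and it assumes the async variant; the `EventLoopFuture`-returning form is analogous):

```swift
import SQLKit

// Untested sketch: `Thing` and the "things" table are placeholders.
struct Thing: Decodable {
  var id: Int
  var name: String
}

func streamThings(from db: any SQLDatabase) async throws {
  try await db.select()
    .column("*")
    .from("things")
    .run(decoding: Thing.self) { result in
      // Each row arrives wrapped in a Result; unwrap it per row.
      switch result {
      case .success(let thing):
        print("decoded:", thing)            // handle each model as it arrives
      case .failure(let error):
        print("row failed to decode:", error)
      }
    }
}
```

If you want the decoded models collected into an array rather than handled one at a time, you'd accumulate them yourself inside that handler.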
Have you been able to narrow down any further where the overhead is coming from? I'd be very interested if so.
@gwynne Thanks so much for the response. I'll take a look at that function and get back to you.

I don't have specific info about the overhead yet, but next time I'm profiling I'll see if I can gather more. All I know is that I was troubleshooting a slow API endpoint, assuming Postgres was the main part of the slowdown, and when I captured some timings I was surprised to see that I seemed to be spending maybe half the request time (roughly 600ms, IIRC) decoding the rows once they came back from Postgres. I could easily have made a mistake, though. It just got me thinking: I have a `[SQLRow]` collection, and if it were a vanilla Codable thing I could decode the whole array at once instead of mapping over each one, so that seemed like a promising thing to try.
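(For reference, this is the kind of one-shot decode I had in mind, with a made-up `Thing` type and vanilla Codable + JSON:)

```swift
import Foundation

// Made-up Thing type, just to show the "vanilla Codable" comparison.
struct Thing: Codable {
  var id: Int
  var name: String
}

// With plain Codable + JSON, the whole array decodes in one call:
let json = Data(#"[{"id": 1, "name": "a"}, {"id": 2, "name": "b"}]"#.utf8)
let things = try JSONDecoder().decode([Thing].self, from: json)
// ...whereas with a [SQLRow] I'm currently decoding row by row in a map.
```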