
question about decoding collections of SQLRow #186

Open
jaredh159 opened this issue Dec 18, 2024 · 2 comments

@jaredh159

caveat: this might be a dumb question.

I'm using SQLKit in a semi-low-level, ORM-ish layer I rolled myself with a limited scope; it's been working very well for a few years. I have a convenience decoding function defined like this:

public extension SQLRow {
  func decode<M: DuetSQL.Model>(_: M.Type) throws -> M {
    try self.decode(model: M.self, prefix: nil, keyDecodingStrategy: .convertFromSnakeCase)
  }
}

To decode my own models. Works great. In my actual postgres client, I decode rows like so:

try rows.compactMap { row in try row.decode(M.self) }

While doing some optimization on a few slower queries/endpoints, I've noticed that decoding models for certain large requests is a non-trivial part of the overall request time. I was wondering if the iteration and individual decoding could be improved by decoding in bulk. I wanted to write something like this:

public extension Collection where Element == SQLRow {
  func decodeAll<M: DuetSQL.Model>(_: M.Type) throws -> [M] {
    ??? // <-- 🤔
  }
}

But (as indicated by the ???) I can't figure out how to implement this. Is it possible? I poked around with SQLRowDecoder and SQLQueryFetcher but couldn't figure out how to hold it right.

Would greatly appreciate any insight. Thanks so much for the library!

@gwynne
Member

gwynne commented Dec 20, 2024

You could try using SQLQueryFetcher's run(decoding:prefix:keyDecodingStrategy:userInfo:_:) method to get decoding that doesn't rely on the gather-then-decode semantics that the all() methods use, although you then have to deal with unwrapping the Results passed to the handler.
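For anyone landing here later, the handler-unwrapping pattern described above might look roughly like this. Note that fakeRun and User are stand-ins invented to keep the sketch self-contained and runnable; the real call would be SQLQueryFetcher's run(decoding:prefix:keyDecodingStrategy:userInfo:_:) against your own model type:

```swift
// Self-contained sketch of the "unwrap the Result passed to the handler"
// pattern. `fakeRun` is a made-up stand-in for SQLKit's streaming
// run(decoding:) call; it invokes the handler once per decoded row.
struct User: Decodable {
    let id: Int
}

func fakeRun(_ handler: (Result<User, Error>) -> Void) {
    // Simulate three successfully decoded rows.
    for id in 1...3 {
        handler(.success(User(id: id)))
    }
}

var users: [User] = []
var firstError: Error? = nil

fakeRun { result in
    switch result {
    case .success(let user):
        users.append(user)
    case .failure(let error):
        // Remember the first failure; real code might bail out instead.
        if firstError == nil { firstError = error }
    }
}
```

The upside of this shape is that each row is decoded as it arrives, so you never hold the whole undecoded result set and the decoded array at the same time; the cost is the bookkeeping around partial failures shown above.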

Have you been able to narrow down any further where the overhead is coming from? I'd be very interested if so.

@jaredh159
Author

@gwynne thanks so much for the response. i'll take a look at that function and get back to you.

i don't have specific info about the overhead yet, but next time i'm profiling i'll see if i can gather more. all i know is that i was troubleshooting a slow API endpoint and assumed postgres was the main part of the slowdown, but when i captured some times i was surprised to see that i was spending maybe half the request time (roughly 600ms iirc) decoding the rows once they came back from postgres. i could easily have made a mistake though. i just got to thinking: i have an [SQLRow] collection, and if it were a vanilla codable thing i could decode the whole array at once instead of mapping over each one, so it seemed like a promising thing to try.
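To make the "vanilla codable" comparison concrete, one-shot array decoding with plain Foundation types looks like this. The User type and JSON payload are made up for illustration; this only shows the shape of the idea (one configured decoder, one decode call over the whole collection), not a claim about how SQLRow decoding works internally:

```swift
import Foundation

// One JSONDecoder, configured once, decoding an entire array in a
// single call -- as opposed to building decoding machinery per element.
struct User: Codable, Equatable {
    let id: Int
    let firstName: String
}

let json = Data("""
[{"id": 1, "first_name": "Ada"}, {"id": 2, "first_name": "Grace"}]
""".utf8)

let decoder = JSONDecoder()
decoder.keyDecodingStrategy = .convertFromSnakeCase

// Decode the whole collection at once.
let users = try decoder.decode([User].self, from: json)
```

With SQLRow each row decodes independently, so a bulk API may not remove per-row work; whether the win comes from decoding in one pass or from reusing a single configured decoder across rows is exactly the kind of thing profiling would need to confirm.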
