Creating giant schemas to query for large JSON objects

Hi there,

I’ve been a fan of GraphQL since it came out, but never had a chance to use it in a real / commercial project.

Now I may have a use case at work for which GraphQL could be well suited. After presenting a quick prototype of a server and client (with Apollo), one of the critical questions that came up is:

Do we really need to define every single field in our schema that we might want to be able to query for?

The background of that question is that our data source may be very large JSON objects. I’m talking dozens of root-level fields, several of which contain nested arrays of yet more very complex JSON objects with lots of fields of their own.

The easiest way I came up with is to expose a “raw” field that just resolves to a stringified version of the whole massive object (requiring additional parsing on the client, of course).
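To illustrate, here is roughly what that escape hatch looks like on the server. This is just a minimal sketch assuming the classic apollo-server package; rawDocument and loadDocument are placeholder names I made up, not part of any real API:

```ts
import { ApolloServer, gql } from 'apollo-server';

// Hypothetical loader standing in for however the large JSON documents
// are actually fetched (file, database, upstream API, ...).
async function loadDocument(id: string): Promise<unknown> {
  return { id, lotsOfFields: '...' };
}

const typeDefs = gql`
  type Query {
    "The entire source object as a JSON string; the client has to JSON.parse it."
    rawDocument(id: ID!): String
  }
`;

const resolvers = {
  Query: {
    // Resolve the whole object and hand it back as one opaque string.
    rawDocument: async (_parent: unknown, args: { id: string }) => {
      const doc = await loadDocument(args.id);
      return JSON.stringify(doc);
    },
  },
};

new ApolloServer({ typeDefs, resolvers })
  .listen()
  .then(({ url }) => console.log(`raw-field server ready at ${url}`));
```

The obvious downsides are that the client loses all typing and field selection (it always gets the whole blob and has to JSON.parse it), so it only really works as a stopgap.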

But I was wondering: is there a way to dynamically identify fields in my query that haven’t been explicitly defined in the schema?

(Note: I do like the json-to-graphql package on npm, although it only sidesteps the issue by auto-generating standard graphql-js schemas rather than Apollo-style plain GraphQL schema strings.)

Any suggestions?

You can use any type of server with Apollo - I think using json-to-graphql is the right approach for this case!

json-to-graphql is not actually saving me that much time, it turns out.
It creates types and resolvers, but all of them return null by default, some of the type names are malformed (they include spaces), and no parameters are taken into account. Overall, cleaning up the output is probably not worth the time.

I think our best bet may be to just spend a few days designing gql strings.

I think I’ll have to consolidate some of those data structures, parameterize quite a lot of them and then see if I can extract fragments wherever possible.
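To make that plan a bit more concrete, here is the rough shape I have in mind: a trimmed, hand-written schema slice with one parameterized nested field, plus a client-side fragment that several queries can share. All of the type and field names (Document, Section, sections(first:), and so on) are invented for illustration, not our actual data model:

```ts
import gql from 'graphql-tag';

// Server side: a consolidated, hand-written slice of the schema.
// "sections" is parameterized instead of dumping the whole nested array.
export const typeDefs = gql`
  type Document {
    id: ID!
    title: String
    sections(first: Int = 10, kind: String): [Section!]!
  }

  type Section {
    id: ID!
    heading: String
    body: String
  }

  type Query {
    document(id: ID!): Document
  }
`;

// Client side: a fragment extracted once and reused across several queries.
export const SECTION_FIELDS = gql`
  fragment SectionFields on Section {
    id
    heading
    body
  }
`;

export const GET_DOCUMENT = gql`
  query GetDocument($id: ID!) {
    document(id: $id) {
      id
      title
      sections(first: 5) {
        ...SectionFields
      }
    }
  }
  ${SECTION_FIELDS}
`;
```

Tedious to write, but at least the result is typed and lets clients ask only for what they need.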

Anyway, thanks for the response, @sashko!