Meteor remains a great way to build applications but has two major limitations out of the box: rendering issues on the client and hard scaling limitations on the server.
On the client, the main problem lies in the data management tools and how they notify Blaze (and other view layers) about changes. On the server, the main problem lies in how closely Meteor's tools are coupled with MongoDB and LiveQuery.
While the Meteor Development Group is solving the latter problem with GraphQL/Apollo, I believe the initial solution could be far simpler, and most of the parts for it are already within Meteor.
Fetch and Carry are a sketch of how that solution might look. Together, they reinforce what Meteor is best known for: a fast, simple, and practical way to build applications.
Fetch: a Database Agnostic Query Tool
The idea behind fetch is to create a simple way for you to query any database and return the results to the client.
On the server, you would define your queries, along with a permission enforcer and processor that would run each time you fetch a query:
Fetcher.register({
  admin: {
    // runs before any query in this group; only admins get through
    permission: function () {
      return Meteor.user() && Meteor.user().isAdmin;
    },
    // runs on every fetch and can reshape the query or its result
    processor: function (query) {
      var data = mutateYourQueryAsYoudLike(query);
      return data;
    },
    queries: {
      adminAccounts: function () {
        return Database.query({...});
      },
      normalAccounts: function () {
        return Database.query({...});
      },
      billingHistory: function (userId) {
        return Database.query({...});
      },
      incomeHistory: function (from, until) {
        return Database.query({...});
      }
    }
  }
});
On the client, you would run the respective query and get back the data, which you would then store in your data layer. Here's one idea of how it could work:
var myQuery = Fetch({
  from: "admin",
  query: "incomeHistory",
  parameters: ["June", "August"],
  callback: function () {
    // ...
  }
});
myQuery.state()      // returns one of "notLoaded", "loading", "loaded", "reloading"
myQuery.reload()     // re-runs the query
myQuery.lastUpdate() // returns the timestamp of the last update
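If state() were backed by a Tracker dependency (an assumption on my part, not something spelled out above), a Blaze helper could react to loading and reloading without any extra wiring. A minimal sketch:

// Hypothetical: assumes myQuery.state() reads a Tracker-backed value,
// so this helper re-runs whenever the query's state changes.
Template.incomeHistory.helpers({
  isLoading: function () {
    var state = myQuery.state();
    return state === "loading" || state === "reloading";
  }
});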
Compared to Pub/Sub, you would be able to run multiple similar queries while keeping them completely separate. Compared to GraphQL, you would have far less ground to cover to get information from your database into your client.
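To make that concrete, here is a hypothetical sketch of the same registered query fetched twice with different parameters (the user IDs are invented for illustration). Unlike overlapping subscriptions, each instance keeps its own state and can be reloaded on its own:

// Two instances of the same registered query, tracked separately
var aliceBilling = Fetch({ from: "admin", query: "billingHistory", parameters: ["aliceId"] });
var bobBilling   = Fetch({ from: "admin", query: "billingHistory", parameters: ["bobId"] });

aliceBilling.reload(); // refreshing one has no effect on the other
bobBilling.state();    // e.g. "loaded" while aliceBilling is "reloading"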
Perhaps Fetch can be designed in a way that makes it easy to transition to Apollo/GraphQL when the time is right?
Carry: a Versatile Data Layer
Currently, Meteor's data layer is in a state of flux (pun intended). The situation is:
- minimongo has nice filtering and sorting abilities, but it is too closely coupled to MongoDB and LiveQuery
- reactive-var and reactive-dict have performance issues with objects
- reactive-var and reactive-dict do not support sorting or filtering arrays of objects (the sketch after this list shows the manual work that implies)
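For example, sorting an array of objects held in a reactive-var today means re-sorting it by hand inside a computation. A rough sketch of the status quo (the names are invented for illustration):

// Status quo with reactive-var: no built-in sort/filter, so every change
// to the array forces us to re-sort the whole thing ourselves.
var toys = new ReactiveVar([]);

Tracker.autorun(function () {
  var sorted = _.sortBy(toys.get(), "age").reverse(); // manual descending sort
  console.log(sorted);
});

toys.set([{ name: "Car", age: 3 }, { name: "Ball", age: 1 }]);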
I'll go out on a limb and say that what we all love about Meteor is the reactive programming style, and it is Tracker that makes it possible, not the libraries built on top of it. What Meteor needs next is a new Tracker-based data layer that leapfrogs the limitations of the current libraries.
If Fetch and Carry were integrated, we could perhaps pick up our Fetch query in the client like this:
Template.incomeHistory.helpers({
  data: function () {
    return myQuery.get().sort({...}).filter({...});
  }
});
In that case, Fetch would automatically create a new instance of Carry. We'd also be able to do it ourselves, perhaps with an API similar to this:
var MyData = new Carry("myNewData");

MyData.set({
  toys: [{...}, {...}, {...}],
  lastUpdated: "04:45 PM..",
  otherItems: {
    idk: ["Chicken"]
  }
});

MyData.get("toys").sort({age: -1});
The Overall Concept
The overall concept stems from two problems.
First, we now have three ways to obtain data within Meteor:
1. Using Pub/Sub with the classic Mongo stack
2. Using Apollo
3. Using Method calls
Second, we have a data layer that is becoming outdated:
- it is not well adapted to how we retrieve data in Meteor
- it causes poor rendering in some cases
The big takeaway is that we need a new data layer built on Tracker that can perform well and adapt to all these new situations. If we get that right, it would also solve the rendering issues associated with Blaze.
What do you think of this idea? Are you focused on using Apollo next, or are you looking for alternatives? It would be great to hear what you think.