I have a survey table that looks like so:
{
  id: Id,
  date: Date,
  clients: [{
    client_id: Id,
    contacts: [{
      contact_id: Id,
      score: Number,
      feedback: String,
    }]
  }]
}
Creating a query to update a JSON array of objects in-place is a rather complicated process in RethinkDB (and most query languages). The best (and only) solution in ReQL that I know of is to use a combination of the update, offsetsOf, do, changeAt, and merge functions. This solution retains the order of objects in the array, and only modifies values on objects matched by the offsetsOf calls.
The following code (or something similar) can be used to update an array of objects (i.e. clients) which contain an array of objects (i.e. contacts).
Where
'%_databaseName_%', '%_tableName_%', '%_documentUUID_%', '%_clientValue_%', and '%_contactValue_%' must be provided.
r.db('%_databaseName_%').table('%_tableName_%').get('%_documentUUID_%').update(row =>
  row('clients')
    .offsetsOf(client => client('client_id').eq('%_clientValue_%'))(0)
    .do(clientIndex =>
      row('clients')(clientIndex)('contacts')
        .offsetsOf(contact => contact('contact_id').eq('%_contactValue_%'))(0)
        .do(contactIndex => ({
          clients: row('clients').changeAt(clientIndex,
            row('clients')(clientIndex).merge({
              contacts: row('clients')(clientIndex)('contacts').changeAt(contactIndex,
                row('clients')(clientIndex)('contacts')(contactIndex).merge({
                  'score': 0,
                  'feedback': 'xyz'
                }))
            }))
        }))
    )
)
survey
.pluck({ clients: 'contacts' }).run()
.then(results => {
results.clients.forEach((item, outerIndex) => {
item.contacts.forEach((item, index, array) => {
if(Number(item.contact_id) === Number(obj.contact_id)) {
array[index].score = obj.score;
console.log(outerIndex, index);
}
});
});
return survey.update(results).run()
})
While the code provided by Jacob (the user who asked the question here on Stack Overflow, shown above) might look simpler to write, its performance is probably not as good as the ReQL solution's.
1) The ReQL solution runs on the query server (i.e. the database side), so the work is done during the database write (higher performance). The code above does not make full use of the query server: it issues both a read and a write request (pluck().run() and update().run()), and the data is processed on the client side (i.e. the Node.js side) after the pluck() query runs (lower performance).
2) The code above also requires the query server to send all of the data back to the client side (i.e. the Node.js side), so the response payload (bandwidth usage / download size) can be several megabytes. The ReQL solution is processed entirely on the query server, so the response typically just confirms that the write completed; only a few bytes are sent back to the client, and everything happens in a single request (see the example write summary below).
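For reference, a successful in-place ReQL update comes back as a small write summary rather than the document itself, along these lines:
// Typical response from .update() -- this summary is all that travels back
// to the client, not the (potentially large) survey document.
{ "deleted": 0, "errors": 0, "inserted": 0, "replaced": 1, "skipped": 0, "unchanged": 0 }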
However, ReQL (and especially SQL) seems overly complicated for working with JSON, and it seems to me that JSON itself should be used when working with JSON.
I've also proposed that the RethinkDB community adopt an alternative to ReQL that uses JSON instead (https://github.com/rethinkdb/rethinkdb/issues/6736).
The solution to updating nested JSON arrays should be as simple as...
r('database.table').update({
clients: [{
client_id: 0,
contacts: [{
contact_id: 0,
score: 0,
feedback: 'xyz',
}]
}]
});
You might need to get the array, filter out the element you want to change, and then append the updated version back onto the array. Then you can pass the updated array to the update method.
Example
Let's say you have a document with two clients that both have a name and a score and you want to update the score in one of them:
{
"clients": [
{
"name": "jacob" ,
"score": 200
} ,
{
"name": "jorge" ,
"score": 57
}
] ,
"id": "70589f08-284c-495a-b089-005812ec589f"
}
You can get that specific document, run the update command with an anonymous function, and then pass the new, updated array in as the clients property.
r.table('jacob').get("70589f08-284c-495a-b089-005812ec589f")
.update(function (row) {
return {
// Get all the clients, except the one we want to update
clients: row('clients').filter(function (client) {
return client('name').ne('jorge')
})
// Append the client again, with the updated information
.append({ name: 'jorge', score: 57 })
};
});
I do think this is a bit cumbersome and there's probably a nicer, more elegant way of doing this, but this should solve your problem.
Database Schema
It might be worth creating a contacts table for all your contacts and then doing some sort of join on your data (see the sketch after the schema below). Your survey document would then look something like this, with each client's contacts array replaced by score and feedback maps keyed by contact id:
{
id: Id,
date: Date,
clients: [{
client_id: Id,
contact_scores: {
Id: score(Number)
},
contact_feedbacks: {
Id: feedback(String)
}
}]
}
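To make the "some sort of join" concrete, here is a hedged sketch of reassembling full contact objects at read time. The contacts table name, the survey table name, and the assumption that each contact id is the primary key of that table are illustrative, not part of the answer above.
// Hedged sketch: rebuild each client's contacts by joining against a separate
// 'contacts' table (assumed to be keyed by contact id).
r.table('survey').get('%_documentUUID_%').merge(doc => ({
  clients: doc('clients').map(client => ({
    client_id: client('client_id'),
    contacts: client('contact_scores').keys().map(contactId =>
      // fetch the contact document and attach this client's score/feedback to it
      r.table('contacts').get(contactId).merge({
        score: client('contact_scores')(contactId),
        feedback: client('contact_feedbacks')(contactId)
      })
    )
  }))
}))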
I had the same problem, and I was able to solve it in two ways.
1. Targeting a specific client_id:
r.db('nameDB').table('nameTable').get('idRegister')
.update({'clients': r.row('clients')
.map(elem=>{
return r.branch(
elem('client_id').eq('your_specific_client_id'),
elem.merge({
contacts: elem('contacts').map(elem2=>
r.branch(
elem2('contact_id').eq('idContact'),
elem2.merge({
score: 99999,
feedback: 'yourString'
}),
elem2
)
)
}),
elem
)
})
})
2. Updating the matching contact across all clients (no client_id filter):
r.db('nameDB').table('nameTable').get('idRegister')
.update({'clients': r.row('clients')
.map(elem=>
elem.merge({
contacts: elem('contacts').map(elem2=>
r.branch(
elem2('contact_id').eq('idContact'),
elem2.merge({
score: 99999,
feedback: 'yourString'
}),
elem2
)
)
})
)
})
I hope this works for you, even though the question was asked a long time ago.
This works for me:
r.table(...).get(...).update({
  contacts: r.row('contacts').changeAt(0,
    r.row('contacts').nth(0).merge({ feedback: "NICE" }))
})
tfmontague is on the right path, but I think his answer can be improved a lot. Because he indexes straight into the offsetsOf result with ...(0), his query will throw an error whenever no match is found.
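One way to guard against that (my own sketch, not part of either answer, and shown at the client level only for brevity; the same guard would wrap the inner contacts lookup too) is to branch on the offsetsOf result before indexing into it:
r.db('%_databaseName_%').table('%_tableName_%').get('%_documentUUID_%').update(row =>
  row('clients')
    .offsetsOf(client => client('client_id').eq('%_clientValue_%'))
    .do(offsets =>
      r.branch(
        offsets.isEmpty(),
        {},  // no matching client: empty patch, the document is left unchanged
        { clients: row('clients').changeAt(offsets(0),
            row('clients')(offsets(0)).merge({ feedback: 'xyz' })) }  // illustrative patch on the matched client
      )
    )
)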
zabusa also provides a ReQL solution using map and branch but doesn't show the complete nested update. I will expand on this technique.
ReQL expressions are composable so we can isolate complexity and avoid repetition. This keeps the code flat and clean.
First write a simple function mapIf
const mapIf = (rexpr, test, f) =>
rexpr.map(x => r.branch(test(x), f(x), x));
Now we can write the simplified updateClientContact function
const updateClientContact = (doc, clientId, contactId, patch) =>
  doc.merge
    ( { clients:
          mapIf
            ( doc('clients')
            , c => c('client_id').eq(clientId)
            , c =>
                c.merge
                  ( { contacts:
                        mapIf
                          ( c('contacts')
                          , c => c('contact_id').eq(contactId)
                          , c => c.merge(patch)
                          )
                    }
                  )
            )
      }
    );
Use it like this
// fetch the document to update
const someDoc =
r.db(...).table(...).get(...);
// create patch for client id [1] and contact id [12]
const patch =
updateClientContact(someDoc, 1, 12, { name: 'x', feedback: 'z' });
// apply the patch
someDoc.update(patch);
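Depending on your driver you still need to execute the query; with the official JavaScript driver that would look roughly like this (conn being an already-open connection is my assumption):
// Run the patch against the database and log the write summary.
someDoc.update(patch).run(conn)
  .then(result => console.log(result))   // e.g. { replaced: 1, ... }
  .catch(console.error);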
Here's a concrete example you can run in reql> ...
const testDoc =
{ clients:
[ { client_id: 1
, contacts:
[ { contact_id: 11, name: 'a' }
, { contact_id: 12, name: 'b' }
, { contact_id: 13, name: 'c' }
]
}
, { client_id: 2
, contacts:
[ { contact_id: 21, name: 'd' }
, { contact_id: 22, name: 'e' }
, { contact_id: 23, name: 'f' }
]
}
, { client_id: 3
, contacts:
[ { contact_id: 31, name: 'g' }
, { contact_id: 32, name: 'h' }
, { contact_id: 33, name: 'i' }
]
}
]
};
updateClientContact(r.expr(testDoc), 2, 23, { name: 'x', feedback: 'z' });
The result will be
{ clients:
[ { client_id: 1
, contacts:
[ { contact_id: 11, name: 'a' }
, { contact_id: 12, name: 'b' }
, { contact_id: 13, name: 'c' }
]
}
, { client_id: 2
, contacts:
[ { contact_id: 21, name: 'd' }
, { contact_id: 22, name: 'e' }
, { contact_id: 23, name: 'x', feedback: 'z' } // <--
]
}
, { client_id: 3
, contacts:
[ { contact_id: 31, name: 'g' }
, { contact_id: 32, name: 'h' }
, { contact_id: 33, name: 'i' }
]
}
]
}
Database Schema
{
"clients": [
{
"name": "jacob" ,
"score": 200
} ,
{
"name": "jorge" ,
"score": 57
}
] ,
"id": "70589f08-284c-495a-b089-005812ec589f"
}
Then you can do it like this, using a map and branch query:
r.db('users').table('participants').get('70589f08-284c-495a-b089-005812ec589f')
  .update({ "clients": r.row('clients').map(function (elem) {
    return r.branch(
      elem('name').eq("jacob"),
      elem.merge({ "score": 100 }),
      elem
    );
  }) })