Bulk API
Design notes for the Gitana Bulk API
```javascript
var gitana = require("gitana");

gitana.createTransaction()
    .for("branch://<platformId>/<repositoryId>/<branchId>") // using a reference string
    //.for(branch) // using an object
    .insert({
        "title": "My first article",
        "_type": "custom:article"
    })
    .insert({
        "title": "My second page",
        "_type": "custom:page"
    })
    .remove("GUID1")
    .remove({
        "_doc": "GUID2"
    })
    .read("GUID4") // could we provide a way to bulk read objects?
    .retryCount(3)
    .fail(function(result) {
        // called if the commit fails
    })
    .success(function(result) {
        // called if the commit succeeds
    })
    .commit();
```
As a first pass, the created transaction should store all JSON objects in memory until commit() is called. When commit() is called, the transaction runs through all objects and sends individual requests: for N objects, there could be N requests to add items to the transaction.
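The first pass can be sketched as a queue of operations that is flushed one request at a time on commit(). This is a hypothetical illustration, not the Gitana driver's actual implementation; the injected `send` function stands in for the real HTTP call.

```javascript
// First-pass sketch: operations are held in memory and commit() issues
// one request per operation (N operations -> N requests).
function createTransaction(send) {
    const ops = [];
    return {
        insert(obj) { ops.push({ op: "insert", data: obj }); return this; },
        remove(id)  { ops.push({ op: "remove", data: id  }); return this; },
        commit() {
            // send each queued operation as its own request
            return Promise.all(ops.map(function(op) { return send(op); }));
        }
    };
}
```

The chainable `insert`/`remove` methods mirror the fluent style of the API example above; `retryCount`, `fail`, and `success` are omitted for brevity.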
As a second pass, the created transaction could still store all JSON objects in memory until commit() is called. However, when commit() is called, the objects could be grouped and sent in batches of 10, 100, or more, depending on the total size of the JSON payload.
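The grouping step of the second pass amounts to slicing the queued operations into fixed-size batches so that commit() issues one request per batch rather than one per operation. A minimal sketch (the batch size of 100 is an assumed tuning knob, not a documented default):

```javascript
// Split a list of queued operations into batches of at most `batchSize`,
// so commit() can send ceil(N / batchSize) requests instead of N.
function toBatches(ops, batchSize) {
    const batches = [];
    for (let i = 0; i < ops.length; i += batchSize) {
        batches.push(ops.slice(i, i + batchSize));
    }
    return batches;
}
```

A real implementation would likely also cap batches by serialized payload size, as noted above, not just by item count.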
As a third pass, the created transaction could start chunking sends as things are added, removing the requirement to hold everything in memory. Local Storage could also be used to queue work. This is not required for the moment but is an intended optimization down the road.
- Create a Transaction: `POST /bulk/transactions?scope=branch://<platformId>/<repositoryId>/<branchId>`
- Add work to a Transaction: `POST /bulk/transactions/<transactionId>/assign` (with payload)
- Cancel a Transaction: `DELETE /bulk/transactions/<transactionId>`
- Commit a Transaction: `POST /bulk/transactions/<transactionId>/commit`
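The endpoint paths above can be derived from a transaction id with a small helper. This is an illustrative sketch only; the path segments come from the endpoint list, and nothing else about the server is assumed.

```javascript
// Build the Bulk API URLs for a given transaction id.
function bulkUrls(transactionId) {
    const base = "/bulk/transactions";
    return {
        create: base,                              // POST, with ?scope=branch://...
        assign: base + "/" + transactionId + "/assign", // POST, with payload
        cancel: base + "/" + transactionId,        // DELETE
        commit: base + "/" + transactionId + "/commit"  // POST
    };
}
```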