jsforce / jsforce
I would appreciate it if you could add an example of how to structure a Bulk API insert that uses one job with multiple batches. My use case is to break a large CSV file into chunks and process the chunks with the Bulk API. While I can create a single job with multiple batches, I can't figure out how to monitor all of the batches for completion so I can close the job. I assume the solution is promise-based, but I have not implemented anything like that before. Thanks.
stomita added the question label Jan 5, 2015

jimrae commented Jan 6, 2015

Yes, I need to create multiple batches on one job. I was hoping to get an example of how to monitor one job with multiple batches. If you don't have an example, I will try to get it working and will provide an example.
Contributor

stomita commented Jan 7, 2015

How about this? (I've not actually tried it, though.)
```js
var fs = require('fs');
var Promise = require('promise');

var files = [ "acc01.csv", "acc02.csv", "acc03.csv" ];
var job = conn.bulk.createJob("Account", "insert");
Promise.all(
  files.map(function(file) {
    var batch = job.createBatch();
    batch.execute(fs.createReadStream(file));
    batch.poll(5*1000, 30*1000); // poll interval = 5 sec, timeout = 30 sec
    return new Promise(function(resolve, reject) {
      batch.on('response', function(res) { resolve(res); });
      batch.on('error', function(err) { reject(err); });
    });
  })
).then(function(rets) {
  console.log('batch finished');
  console.log(rets);
}, function(err) {
  console.error(err);
});
```

jimrae commented Jan 7, 2015
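One thing the sketch above leaves out is the original question's final step: closing the job once every batch has reported back. Since `Promise.all` resolves only after all of the per-batch promises resolve, `job.close()` can simply be chained after it. A self-contained illustration of that pattern, with plain promises and a hypothetical `job` object standing in for the jsforce batches and job:

```javascript
// Pattern sketch: close the job only after Promise.all resolves.
// runBatch and job are stand-ins, not jsforce APIs.
function runBatch(name) {
  // Stand-in for batch.execute() + batch.on('response', ...):
  // resolves asynchronously with a result.
  return new Promise(function(resolve) {
    setTimeout(function() { resolve(name + ' done'); }, 10);
  });
}

var job = {
  closed: false,
  close: function() { this.closed = true; } // stand-in for job.close()
};

Promise.all(['batch1', 'batch2', 'batch3'].map(runBatch))
  .then(function(rets) {
    job.close(); // safe here: every batch has resolved
    console.log(rets.length, job.closed); // prints: 3 true
  });
```

The same chaining applies to the jsforce version: add a final `.then(function(rets) { job.close(); ... })` after the `Promise.all(...)` call.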
I will check that out, thank you so much!!
jimrae commented Jan 12, 2015

This is what I ended up with as my final version. The full code also includes a file reader that reads in a large CSV file using the HTML5 FileReader, and chunks it into smaller sets of records, which get added to the filearray variable.
```js
var job = conn.bulk.createJob("Account", "insert");
Promise.all(
  filearray.map(function(file) {
    var batch = job.createBatch();
    batch.execute(file);
    return new Promise(function(resolve, reject) {
      batch.on("queue", function(batchInfo) {
        var batchId = batchInfo.id;
        var batch = job.batch(batchId);
        batch.on('response', function(res) { resolve(res); });
        batch.on('error', function(err) { reject(err); });
        batch.poll(5*1000, 60*1000); // poll interval = 5 sec, timeout = 60 sec
      });
    });
  })
).then(function(rets) {
  var successrec = 0;
  var errorrec = 0;
  var procrec = 0;
  var errormsg = '';
  for (var i = 0; i < rets.length; i++) {
    for (var ii = 0; ii < rets[i].length; ii++) {
      procrec++;
      if (rets[i][ii].success == true) {
        successrec++;
      } else {
        errorrec++;
        if (errormsg.length == 0) {
          errormsg = rets[i][ii].errors[0];
        } else {
          errormsg += '\n' + rets[i][ii].errors[0];
        }
      }
    }
  }
  document.getElementById("loadresult").innerHTML =
    'Processed : ' + procrec + ' with ' + errorrec + ' errors ' + errormsg;
  cursor_clear();
  job.close();
}, function(err) {
  alert(err);
});
```
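The chunking step that populates `filearray` isn't shown above. A minimal sketch of the idea, assuming the CSV is already available as a string (the hypothetical `splitCsv` helper below is not part of jsforce; it just repeats the header line in every chunk so each chunk is a valid CSV on its own):

```javascript
// Split a CSV string into batch-sized chunks, repeating the header
// line in each chunk so every chunk parses as standalone CSV.
// splitCsv is a hypothetical helper for illustration, not a jsforce API.
function splitCsv(csvText, chunkSize) {
  var lines = csvText.trim().split('\n');
  var header = lines[0];
  var chunks = [];
  for (var i = 1; i < lines.length; i += chunkSize) {
    chunks.push([header].concat(lines.slice(i, i + chunkSize)).join('\n'));
  }
  return chunks;
}

var csv = 'Name,Industry\nAcme,Tech\nGlobex,Energy\nInitech,Software';
var filearray = splitCsv(csv, 2); // 2 data rows per chunk
console.log(filearray.length); // prints: 2
```

Each resulting chunk can then be passed to `batch.execute(...)` as in the code above.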