[Stable]

This function is a convenience wrapper for submitting bulk API jobs.

sf_run_bulk_operation(
  input_data,
  object_name,
  operation = c("insert", "delete", "upsert", "update", "hardDelete"),
  external_id_fieldname = NULL,
  guess_types = TRUE,
  api_type = c("Bulk 1.0", "Bulk 2.0"),
  batch_size = NULL,
  interval_seconds = 3,
  max_attempts = 200,
  wait_for_results = TRUE,
  record_types = c("successfulResults", "failedResults", "unprocessedRecords"),
  combine_record_types = TRUE,
  control = list(...),
  ...,
  verbose = FALSE
)

sf_bulk_operation(
  input_data,
  object_name,
  operation = c("insert", "delete", "upsert", "update", "hardDelete"),
  external_id_fieldname = NULL,
  guess_types = TRUE,
  api_type = c("Bulk 1.0", "Bulk 2.0"),
  batch_size = NULL,
  interval_seconds = 3,
  max_attempts = 200,
  wait_for_results = TRUE,
  record_types = c("successfulResults", "failedResults", "unprocessedRecords"),
  combine_record_types = TRUE,
  control = list(...),
  ...,
  verbose = FALSE
)

Arguments

input_data

named vector, matrix, data.frame, or tbl_df; data that can be coerced into a CSV file for submission as a batch request.

object_name

character; the name of the Salesforce object that the function is operating against (e.g. "Account", "Contact", "CustomObject__c").

operation

character; a string defining the type of operation being performed, one of "insert", "delete", "upsert", "update", or "hardDelete".

external_id_fieldname

character; a string identifying a custom field on the object that has been set as an "External ID" field. This field is used to reference objects during upserts to determine whether the record already exists in Salesforce (see the upsert sketch under Examples).

guess_types

logical; indicating whether or not to use col_guess() to try to cast the data returned in the recordset. If TRUE, col_guess() is used; if FALSE, all fields are returned as character. Setting this to FALSE is helpful when col_guess() would mangle field values coming from Salesforce that you'd like to preserve during translation into a tbl_df, such as numeric-looking values that must remain strings (e.g. "48.0"). See the guess_types sketch under Examples.

api_type

character; either "Bulk 1.0" or "Bulk 2.0", indicating which Bulk API to use when making the request.

batch_size

integer; the number of individual records to be included in a single batch uploaded to the Bulk APIs (1.0 or 2.0).

interval_seconds

integer; the number of seconds to wait between attempts to check for job completion.

max_attempts

integer; the maximum number of attempts to check for job completion before stopping.

wait_for_results

logical; indicating whether to wait for the operation to complete so that the batch results of individual records can be obtained.

record_types

character; one or more of "successfulResults", "failedResults", or "unprocessedRecords", indicating which types of records to retrieve from the results of the job.

combine_record_types

logical; for Bulk 2.0 jobs, indicating whether the successfulResults, failedResults, and unprocessedRecords should be stacked together by binding their rows (see the Bulk 2.0 sketch under Examples).

control

list; a list of parameters for controlling the behavior of the API call being used. For more information on which parameters are available, see the documentation for sf_control.

...

other arguments passed on to sf_control or sf_create_job_bulk to specify the content_type, concurrency_mode, and/or column_delimiter (see the delete sketch under Examples).

verbose

logical; an indicator of whether to print additional detail for each API call, which is useful for debugging. More specifically, when set to TRUE the URL, header, and body will be printed for each request, along with additional diagnostic information where available.

Value

A tbl_df of the results of the bulk job.

Note

With Bulk 2.0 the order of records in the response is not guaranteed to match the ordering of records in the original job data.

Examples

if (FALSE) {
n <- 20
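# create a sample dataset of new Contact records to submit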
new_contacts <- tibble(FirstName = rep("Test", n), 
                       LastName = paste0("Contact", 1:n))
# insert new records into the Contact object
inserts <- sf_bulk_operation(input_data = new_contacts, 
                             object_name = "Contact", 
                             operation = "insert")
}
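
The upsert operation relies on external_id_fieldname to decide whether each incoming record already exists. Below is a minimal sketch; the external ID field name "My_External_Id__c" is an assumption for illustration and must exist on the target object in your org. The sketches that follow all assume the salesforcer and tibble packages are attached and that you have an authenticated session (e.g. via sf_auth()).

if (FALSE) {
# upsert records matched on an assumed External ID field ("My_External_Id__c")
n <- 20
upsert_data <- tibble(My_External_Id__c = paste0("KEY-", 1:n),
                      FirstName = rep("Test", n),
                      LastName = paste0("Contact", 1:n))
upserts <- sf_bulk_operation(input_data = upsert_data,
                             object_name = "Contact",
                             operation = "upsert",
                             external_id_fieldname = "My_External_Id__c")
}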
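
When col_guess() would coerce returned values in unwanted ways (for example, turning identifiers that look numeric into numbers), guess_types = FALSE keeps every field in the results as character. A brief sketch:

if (FALSE) {
# return all fields in the job results as character instead of
# letting col_guess() cast them
new_contacts <- tibble(FirstName = rep("Test", 3),
                       LastName = paste0("Contact", 1:3))
inserts_chr <- sf_bulk_operation(input_data = new_contacts,
                                 object_name = "Contact",
                                 operation = "insert",
                                 guess_types = FALSE)
}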
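
For Bulk 2.0 jobs, combine_record_types = FALSE keeps the successfulResults, failedResults, and unprocessedRecords separate rather than row-binding them, which makes it easier to inspect each record type on its own given the ordering caveat in the Note. A hedged sketch (the exact shape of the uncombined result is not documented here, so inspect it interactively):

if (FALSE) {
# run the insert through Bulk 2.0 and keep the three record types separate
new_contacts <- tibble(FirstName = rep("Test", 3),
                       LastName = paste0("Contact", 1:3))
res <- sf_bulk_operation(input_data = new_contacts,
                         object_name = "Contact",
                         operation = "insert",
                         api_type = "Bulk 2.0",
                         combine_record_types = FALSE)
str(res)  # inspect the structure of the separated record types
}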
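
Arguments named in ..., such as concurrency_mode, are passed through to sf_create_job_bulk when the job is created. The sketch below deletes the records created by the first example; it assumes that "Serial" is an accepted concurrency_mode value for Bulk 1.0 jobs and that the insert results contain an Id column, so check sf_create_job_bulk's documentation before relying on either assumption.

if (FALSE) {
# delete the records created by the insert example above, creating the
# Bulk 1.0 job in serial concurrency mode ("Serial" is an assumption;
# see sf_create_job_bulk for the accepted values)
ids_to_delete <- inserts["Id"]
deletes <- sf_bulk_operation(input_data = ids_to_delete,
                             object_name = "Contact",
                             operation = "delete",
                             api_type = "Bulk 1.0",
                             concurrency_mode = "Serial")
}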