Bulk Uploads to CRMOD
Client applications often need to insert or update millions of records, such as leads, orders, etc., into CRMOD. The regular CRMOD web services allow a maximum of 20 records per request, so if you want to push 1M activities, you have to make 50,000 calls, which can be painful!
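To make the call-count arithmetic concrete, here is a small Python sketch. The `chunk` helper is purely illustrative, not part of any CRMOD SDK:

```python
from math import ceil

def chunk(records, batch_size):
    """Split a record list into batches of at most batch_size."""
    for i in range(0, len(records), batch_size):
        yield records[i:i + batch_size]

# With the regular web services (20 records per request),
# 1,000,000 activities need 50,000 calls:
total_records = 1_000_000
print(ceil(total_records / 20))  # 50000
```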
As an example, consider StoryPulse, our flagship product that supports all popular CRM platforms. The platform collects millions of assessments from thousands of CRMOD users via their mobile devices. Once the assessments reach the middle tier, the platform pushes them to the respective backend; CRMOD in this case.
However, there are a few problems with the regular web services, if we were to use them:
- It takes a long time to push tasks to CRMOD.
- Thousands of CRMOD calls need to be made.
- There could be session issues, such as:
- the maximum number of sessions being reached
- a limit of 20 web service calls in a given timeframe, etc.
So we decided to use CRMOD batch processing to push millions of records in a faster and better way.
CRMOD offers several utilities to import bulk data: the online import assistant, Data Loader, the migration tool, etc. Data Loader is a client utility that accepts large data sets in a CSV file and inserts them into the CRMOD environment.
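Since the import file must be CSV, preparing it programmatically is straightforward. A minimal Python sketch follows; the column names here are hypothetical and must match the field mapping you actually configure for the import:

```python
import csv

# Hypothetical column names -- replace with the fields mapped
# in your own Data Loader import configuration.
FIELDS = ["ExternalUniqueId", "Subject", "Type", "DueDate"]

def write_import_file(path, records):
    """Write records (a list of dicts) to a CSV file for bulk import."""
    with open(path, "w", newline="", encoding="utf-8") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        writer.writeheader()
        writer.writerows(records)

write_import_file("tasks.csv", [
    {"ExternalUniqueId": "SP-0001", "Subject": "Store assessment",
     "Type": "Task", "DueDate": "2015-06-30"},
])
```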
Client applications can also take advantage of this API and invoke web service calls to push bulk data. While customers are moving to Oracle Sales Cloud, Salesforce, etc., I thought I would take a moment to explain how to set up this API and push data to CRMOD.
Bulk API Setup and Access
Any admin user can download Data Loader from the CRMOD UI. The utility ships with a WSDL file, "OracleDataLoaderOnDemandImportServices.wsdl". All you need to do is use this WSDL to set up your application for bulk inserts. It exposes the following methods:
- Create import request: this method creates a basic bulk import request object, in which you specify:
- the name of the object, such as "Account" or "Task";
- the operation you want to perform (insert or update);
- the name of the CSV file that holds the data.
This method must be called before you make one or more send-data calls.
- Send data: this method pushes the data to the server. Each call carries the actual data set to be imported.
- Get request status: using this method, you can check the request status, how many records failed to process, error messages, etc.
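The three-call flow above can be sketched as a small Python driver. The method and parameter names on the `client` object are illustrative only; the authoritative operation names come from OracleDataLoaderOnDemandImportServices.wsdl, and `client` stands in for whatever SOAP stub you generate from it:

```python
def bulk_import(client, object_name, operation, csv_rows,
                csv_file_name, batch_size=30000):
    """Drive a bulk import: create the request, send data in
    batches, then poll for status.

    `client` is assumed to be a SOAP client generated from the
    Data Loader WSDL; its method names here are placeholders.
    """
    # 1. Create the import request, naming the object, the
    #    operation, and the CSV file that holds the data.
    request_id = client.create_import_request(
        object_name=object_name,        # e.g. "Account" or "Task"
        operation=operation,            # "insert" or "update"
        csv_file_name=csv_file_name,
    )
    # 2. Send the actual data, one batch per send-data call.
    for i in range(0, len(csv_rows), batch_size):
        client.send_data(request_id, csv_rows[i:i + batch_size])
    # 3. Check how the request fared (failed records, errors, etc.).
    return client.get_request_status(request_id)
```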
Please refer to the following Data Loading API URLs for more information on each of these methods.
Using this bulk import, we can push 30,000 records (e.g., Assessment Activities) per request. This number may vary for different record types and industries. If we run this process using multiple threads and a proper session pool, we can insert millions of records in far fewer requests and much less time. There are, however, a few limitations:
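The multi-threaded, session-pooled approach can be sketched as follows. Here `sessions` is assumed to be a small list of pre-authenticated CRMOD sessions (kept small to stay under the server's session limits), and `push(session, batch)` stands in for one bulk-import request; both are assumptions for illustration:

```python
import queue
from concurrent.futures import ThreadPoolExecutor

def parallel_push(batches, sessions, push):
    """Push batches concurrently, reusing a fixed pool of sessions.

    Each worker borrows a session from the pool, performs one
    bulk-import request via `push`, and returns the session.
    """
    pool = queue.Queue()
    for session in sessions:
        pool.put(session)

    def worker(batch):
        session = pool.get()       # block until a session is free
        try:
            return push(session, batch)
        finally:
            pool.put(session)      # return the session to the pool

    # Cap concurrency at the number of sessions we are allowed to hold.
    with ThreadPoolExecutor(max_workers=len(sessions)) as executor:
        return list(executor.map(worker, batches))
```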
- The imported file must be in CSV format only.
- If any record in a request causes an error, the records inserted earlier in that request cannot be rolled back.
- Only one type of record can be inserted at a time.
- After the batch job completes, client applications do not receive the Row Ids of the inserted records; they need to be queried in a separate call using the External Unique Id as a reference.
In this blog, I have explored and explained bulk insertion with respect to our problem in the StoryPulse application. Your use case may be different, but you can refer to the links below.