Limitation with Eloqua Bulk API

Hi,
We are planning to develop a connector to extract data from Eloqua and load it into Birst, and would like to understand any limitations of the Bulk API.
Thanks,
Anil

Hi Anil,
The following limitations are documented in the Bulk API Guide:
4.12 Bulk API Limits
4.12.1 General Limits
Number of syncs permitted per 24-hour period: 2,000. This limit is not currently enforced, but is logged and monitored.
4.12.2 Import and Export Limits
Maximum import size: 32MB. The import will fail if the data source is larger than 32MB. To import a data source larger than 32MB, you can perform multiple POST requests. For extremely large data sources, the import should be synced periodically, i.e. once 250 MB of data has been POSTed.
Maximum export size: 50,000 records per page. The export will fail if you attempt to export more than 50,000 records in a single page at one time. If you need to export more than 50,000 records at a time, you can filter your request by date range and make separate requests for different date ranges.
Maximum fields that can be included in one export: 100 fields. The export will fail if you include more than 100 fields in the export definition.
Maximum sync size: 5,000,000 records. The sync will fail if you attempt to sync more than 5 million records.
If you are planning specific large exports, you can create filter statements in your export definition to stay under these limits.
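One practical way to apply that advice is to generate one date-range filter per calendar month, so that no single export or sync approaches the limits above. The filter syntax below mirrors the '{{Activity.CreatedAt}}' comparisons used later in this thread; the helper itself (monthly_filters) is a hypothetical sketch, not part of the API.

```python
from datetime import date, timedelta

def monthly_filters(start, end, activity_type="EmailOpen"):
    """Build one Bulk API filter string per calendar month in [start, end],
    so no single export/sync approaches the documented limits (sketch)."""
    filters = []
    cur = start
    while cur <= end:
        # first day of the following month
        next_month = (cur.replace(day=28) + timedelta(days=4)).replace(day=1)
        last_day = min(next_month - timedelta(days=1), end)
        filters.append(
            "'{{Activity.Type}}'='%s' and "
            "'{{Activity.CreatedAt}}'>='%s' and "
            "'{{Activity.CreatedAt}}'<='%s'"
            % (activity_type, cur.isoformat(), last_day.isoformat())
        )
        cur = next_month
    return filters
```

Each returned string can then be POSTed as the "filter" of its own export definition, one sync per month.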
-Syed.

Similar Messages

  • ELOQUA Bulk API: error at first attempt

    Hello.
    I am trying to export Eloqua data into an enterprise data warehouse.
    For this purpose I am simply following the tutorial in the
    "Oracle Eloqua Bulk API v2.0 Developer Guide".
    After establishing my ID, I tried to POST the first sample from the tutorial:
    POST /accounts/exports
    {
        "name": "Account Export Example",
        "fields": {
            "accountId": "{{Account.Id}}",
            "FirstName": "{{Account.Field(C_FirstName)}}"
        }
    }
    And I got an error message:
    "ML Statements must have an Activity root (see http://topliners.eloqua.com/docs/DOC-4298 for details)"
    What is the reason? Where is this DOC-4298 (the URL is not working)? And why doesn't the tutorial correspond to reality?

    Hi,
    I got a similar error for an EmailOpen activity export. It seems to be a field-related issue; you can try the following:
    Make sure the given fields are available in Eloqua (Eloqua also supports custom fields).
    Make sure each field name exactly matches what is specified in your request definition.
    It worked for me after I identified and removed one of the suspect fields.
    Hope it helps.
    ~Manan
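Manan's checklist can be partly automated on the client side. The sketch below (a hypothetical helper, not an Eloqua API) extracts the inner field name from each '{{Entity.Field(...)}}' statement in an export definition and flags any alias whose field name isn't in a set of names you have already retrieved from your Eloqua instance:

```python
import re

def find_unknown_fields(export_fields, available_names):
    """Flag export-definition statements whose inner field name isn't in
    the set of names known to exist in Eloqua (client-side sketch only)."""
    unknown = []
    for alias, statement in export_fields.items():
        m = re.search(r"\{\{\w+\.Field\((\w+)\)\}\}", statement)
        if m and m.group(1) not in available_names:
            unknown.append(alias)
    return unknown
```

Statements like {{Account.Id}} that don't reference a named field pass through unflagged.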

  • Has anyone used Scribe Online to interact with Eloqua's API?

    I've been experimenting with Scribe Online via an ON24 integration and was interested if anyone has queried the activity and contact tables with Scribe Online. So far it's an easy drag and drop between entities, although the complex part is formulas within the logic.
    A current test I'm working on is to pull email activity (Email Name that was Opened and the last activity date), link it to a contact with a SFDC Lead ID and then populate that via a field in SFDC. I'd eventually like to experiment with advanced visual reporting to get a true idea on the activity of our database.

    Does anyone have an answer for this? I'm having the same problem!

  • Getting Data Types of Eloqua - any REST BULK API?

    Is there an API that lists all the data types used for the fields of Eloqua entities?
    From the examples that list entity metadata I can see that there are data types like "YESNO", "Date" and "string" for an entity's fields.
    I need a list of all the data types that Eloqua can have for any of its fields.
    I am going to be using the Bulk API to import and export Eloqua elements through Java.

    There is currently no endpoint for this.
    There are four data types available for contacts and accounts: Date, Large Text, Numeric (float), and Text. Custom objects can use these four data types plus a fifth, Number (integer).
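A client consuming export data could map those five type names to native parsers. The key names below are an assumption about how the types appear in field metadata (the reply only gives display names), so adjust them to whatever your instance actually returns:

```python
from datetime import datetime

# Hypothetical client-side mapping of the five Eloqua data types listed
# above to Python parsers; the key spellings are assumptions.
ELOQUA_TYPE_PARSERS = {
    "Date": datetime.fromisoformat,
    "LargeText": str,
    "Numeric": float,   # float
    "Text": str,
    "Number": int,      # integer; custom objects only
}

def parse_value(data_type, raw):
    """Convert a raw exported string into a native Python value."""
    return ELOQUA_TYPE_PARSERS[data_type](raw)
```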

  • Bulk API issue with contact imports

    Is the bulk API having validation issues? I can't update any existing imports or create any new ones.
    Simply posting the content below from the tutorial now results in a validation error:
    {
        "name": "Docs Import Example",
        "fields": {
            "firstName": "{{Contact.Field(C_FirstName)}}",
            "lastName": "{{Contact.Field(C_LastName)}}",
            "emailAddress": "{{Contact.Field(C_EmailAddress)}}"
        },
        "identifierFieldName": "emailAddress",
        "isSyncTriggeredOnImport": "false"
    }
    Here is the error:
    {"failures":[{"field":"identifierFieldName","constraint":"Must be a string value, at least 1 character and at most 100 characters long."},{"field":"name","constraint":"Must be a string value, at least 1 character and at most 100 characters long."},{"field":"fields","constraint":"Is required."}]}

    Seems like an issue with UD_ADUSER_LOCKED field value. Change it to non null value and retry.

  • Bulk API exports with 5m records

    Hi,
    I've just started using the bulk API. For our activity exports, we have >5 million records so when I run it (unfiltered apart from activity type) the sync stops at 5 million.
    I am looking to get all of the records out, so I was wondering if anyone knew how best to do this.
    'Offset' seems to work with the GETs but not when POSTing the export definition.
    I have tried filtering on activity id > (the highest activity id from the first 5 million records returned), but this seems to leave gaps.
    Likewise date > (the highest date in the first 5 million).
    It seems as though activity id and date only roughly tie up.
    Any advice appreciated!
    Cheers
    Stephen

    Hi Wouter,
    Have you considered making a standard contact filter that does this, and simply referencing that in your export definition? Admittedly, it's either going to be a manual process from the UI, or extra code in your app to automate the creation of a shared filter via REST API.
    For reference, Bulk API 2.0 can do this via:
    "filter": "'{{Contact.Field(C_FirstName)}}' = ''"
    Regards,
    Bojan
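Building on the filter idea: since a sync fails past 5,000,000 records, one approach is to bisect the date window until every window fits. In the sketch below, count_records is a hypothetical callback that would run a small filtered export (or a trial sync) and report how many records the window contains:

```python
from datetime import date, timedelta

def split_until_under_limit(start, end, count_records, limit=5_000_000):
    """Recursively halve the [start, end] date window until each window's
    record count fits under the sync limit; returns (start, end) pairs."""
    if count_records(start, end) <= limit:
        return [(start, end)]
    mid = start + timedelta(days=(end - start).days // 2)
    return (split_until_under_limit(start, mid, count_records, limit)
            + split_until_under_limit(mid + timedelta(days=1), end,
                                      count_records, limit))
```

Each resulting window becomes its own '{{Activity.CreatedAt}}' range filter, which avoids the gaps seen when paging on activity id.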

  • How to get data of more than 100 fields by Bulk API V2.0?

    Hi all,
    I'm trying to get data from Eloqua via the Bulk API because of the large data volume.
    But my Contact has 186 fields (more than the Bulk export limit of 100), so I think I need to get all the data with 2 exports.
    How can I match the 2 parts of a row and join them together?
    I'm afraid that any edit to the data between the 2 syncs for the 2 exports would change the order.
    E.g.:
        1. If any record is deleted or edited (so it no longer matches the filter) after getting the first part and before getting the second part, then every record behind it would move up in the second part's results.
        2. If data in some fields (included in both parts) changes between the 2 syncs, then the values in the second part are the newest but the values in the first part are old.
    Any suggestions would be appreciated.
    Thanks,
    Biao

    bhuang -
    I don't know that you're ever going to get around the fact that things will change in your database while you're syncing the data. You're going to have to have some way to create exceptions on the sync side.
    If I was pushing data from Eloqua to another database and had to deal with the problem of matches changing while I was syncing, I'd create a few extra columns in my database to keep track of sync status for that record. Or create another small table to track the data mapping. Here's how I'd do it.
    I'd have two extra columns:  'mapped fields 1' and 'mapped fields 2'. They'd both be datetime fields.
    I would only do one set of syncs at a time. First, sync every record for the email + 99 fields. Do the entire list. For each batch, set the datetime of the batch in 'mapped fields 1' column.
    I'd then sync all records for email + other 86 fields. Do the entire list again. For this batch, set the datetime of each batch in their 'mapped fields 2' column to now().
    For any records that had only 'mapped fields 1' filled, but 'mapped fields 2' was empty, I'd re-run the second Eloqua API query using the email as the lookup value. If no results were returned, I'd delete the row. Otherwise, update and set 'mapped fields 2' to now
    For any records that had only 'mapped fields 2', I'd re-run the emails against the first Eloqua API query, fill in the missing data, and set 'mapped fields 1' to the current datetime. If the record wasn't returned, delete the row as it's probably not in the search any longer.
    Finally, set 'mapped fields 1' and 'mapped fields 2' to empty for all records, since you know the data is in sync. This will let you use the same logic above on your next sync.
    Does that make sense? It's not super clean, but it'll get the job done, unless your syncs are taking a ridiculous amount of time and your data changes super often. 
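Once both partial exports have landed, the reconciliation above reduces to a join on the identifier column. A minimal sketch, assuming each export row is a dict keyed by field name with emailAddress as the shared identifier:

```python
def merge_exports(part1_rows, part2_rows, key="emailAddress"):
    """Join two partial exports on the identifier column; records present
    in only one part are returned separately for re-query or deletion."""
    second = {row[key]: row for row in part2_rows}
    merged, orphans = [], []
    for row in part1_rows:
        other = second.pop(row[key], None)
        if other is None:
            orphans.append(row[key])          # only in part 1: re-query or drop
        else:
            merged.append({**row, **other})   # part 2 wins on shared fields
    orphans.extend(second)                    # only in part 2
    return merged, orphans
```

The orphans list plays the role of the half-filled 'mapped fields' rows: those are the keys to re-run against the other export before deciding to keep or delete them.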

  • Steps to integrate Salesforce through the Bulk API

    Hi,
    Could someone please let me know the steps to make a call from BPEL (SOA 11.1.1.6) to Salesforce using the Bulk API? Currently we are doing this with the SOAP API (partner WSDL), which has a limitation on the number of records in the input, so we would like to migrate to the Bulk API.
    Thanks & Regards
    Venkat

    In OBIEE 11g which includes BIP the application roles are applied to LDAP users and groups using the Enterprise Manager Fusion Control.
    During the upgrade process from OBIEE 10g to OBIEE 11g the groups do get assigned to these roles transparently so there must be some API to leverage this functionality.
    I would start there, http://download.oracle.com/docs/cd/E14571_01/bi.1111/e10541/admin_api.htm
    There are no specific instructions on accomplishing what you seek but if you have some WLST or Java Skills you should be able to get something prototyped.
    Let me know if that helps.

  • How to get activity data by Bulk APIs v2.0?

    Hi all,
    I have successfully retrieved Account data with Bulk API v2.0 (create an export for Account, then create a sync for that export).
    But when I try the same integration for Activity data, I always get an Error status on the sync.
    So, could you please show me any suggestions for this issue?
    Is there any option different between exports of Account and Activity?
    Thanks,
    Biao
    PS: I found an example request for activity data:
    POST https://<host>.eloqua.com/api/bulk/2.0/activities/exports
    {
        "name": "Example Activity Export",
        "fields": {
            "ActivityId": "{{Activity.Id}}",
            "AssetName": "{{Activity.Asset.Name}}",
            "ActivityType": "{{Activity.Type}}",
            "Id": "{{Activity.Id}}",
            "ActivityDate": "{{Activity.CreatedAt}}",
            "EmailAddress": "{{Activity.Field(EmailAddress)}}",
            "ContactId": "{{Activity.Contact.Id}}",
            "VisitorId": "{{Activity.Visitor.Id}}",
            "AssetType": "{{Activity.Asset.Type}}",
            "AssetId": "{{Activity.Asset.Id}}",
            "RawData": "{{Activity.Field(RawData)}}"
        },
        "filter": "'{{Activity.Type}}'='FormSubmit'"
    }
    I get an Error status on the sync for this export, but when I remove the 'Id' and 'RawData' fields, I get success.
    I guess there are some rules for the field settings, e.g. one statement should not appear more than once, and so on.
    So, where can I find these rules? Or is there an API that returns the valid field list?
    Thank you very much.

    Hi Biao,
    I am able to pull the data for Activity Export for Type Form Submit.
    I am adding details below :
    1) Create Activity Export Structure
    URL : https://www02.secure.eloqua.com/API/Bulk/2.0/activities/exports
    Input JSON : {"name":"Bulk Activity Export","dataRetentionDuration":"PT1H","fields":{"activityId":"{{Activity.Id}}","assetId":"{{Activity.Asset.Id}}","activityType":"{{Activity.Type}}","activityDate":"{{Activity.CreatedAt}}","contactId":"{{Activity.Contact.Id}}","visitorId":"{{Activity.Visitor.Id}}","visitorExternalId":"{{Activity.Visitor.ExternalId}}","assetType":"{{Activity.Asset.Type}}","assetName":"{{Activity.Asset.Name}}","rawData":"{{Activity.Field(RawData)}}"},"filter":"'{{Activity.Type}}'='FormSubmit' and '{{Activity.CreatedAt}}'>='2013-06-01' and '{{Activity.CreatedAt}}'<='2013-06-30'"}
    Status Code : 201
    Output JSON : {"name":"Bulk Activity Export","fields":{"activityId":"{{Activity.Id}}","assetId":"{{Activity.Asset.Id}}","activityType":"{{Activity.Type}}","activityDate":"{{Activity.CreatedAt}}","contactId":"{{Activity.Contact.Id}}","visitorId":"{{Activity.Visitor.Id}}","visitorExternalId":"{{Activity.Visitor.ExternalId}}","assetType":"{{Activity.Asset.Type}}","assetName":"{{Activity.Asset.Name}}","rawData":"{{Activity.Field(RawData)}}"},"filter":"'{{Activity.Type}}'='FormSubmit' and '{{Activity.CreatedAt}}'>='2013-06-01' and '{{Activity.CreatedAt}}'<='2013-06-30'","dataRetentionDuration":"PT1H","uri":"/activities/exports/74","createdBy":"CXDELQ.API","createdAt":"2014-11-03T09:40:00.9397930Z","updatedBy":"CXDELQ.API","updatedAt":"2014-11-03T09:40:00.9397930Z"}
    2) Create Sync with status pending
    URL : https://www02.secure.eloqua.com/API/Bulk/2.0/syncs
    Input JSON : {"syncedInstanceUri":"/activities/exports/74","status":"pending"}
    Status Code : 201
    Output JSON : {"syncedInstanceUri":"/activities/exports/74","status":"pending","createdAt":"2014-11-03T09:41:18.2503138Z","createdBy":"CXDELQ.API","uri":"/syncs/90640"}
    3) Get sync Result
    URL : https://www02.secure.eloqua.com/API/Bulk/2.0/syncs/90640
    Status Code : 200
    Output JSON : {"syncedInstanceUri":"/activities/exports/74","syncStartedAt":"2014-11-03T09:41:20.6670000Z","syncEndedAt":"2014-11-03T09:41:23.1970000Z","status":"success","createdAt":"2014-11-03T09:41:18.1670000Z","createdBy":"CXDELQ.API","uri":"/syncs/90640"}
    4) Get the data from API
    URL : https://www02.secure.eloqua.com/API/Bulk/2.0/activities/exports/74/data?limit=1000&offset=0&count=1&depth=complete
    Output JSON : "Output is high volume, so I am not attaching it here."
    Please let me know if this helps you resolve the issue.
    Thanks,
    Deepak Chincholi
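Deepak's four steps amount to a fixed sequence of requests. The sketch below only builds the method/URL/body for each step; the base URL mirrors his post, but your pod's host, authentication, and the HTTP layer that actually sends these tuples are all left as assumptions:

```python
BASE = "https://www02.secure.eloqua.com/API/Bulk/2.0"

def export_request(name, fields, filter_statement):
    # Step 1: create the export definition
    return ("POST", BASE + "/activities/exports",
            {"name": name, "dataRetentionDuration": "PT1H",
             "fields": fields, "filter": filter_statement})

def sync_request(export_uri):
    # Step 2: create a sync in "pending" state for the export's uri
    return ("POST", BASE + "/syncs",
            {"syncedInstanceUri": export_uri, "status": "pending"})

def sync_status_request(sync_uri):
    # Step 3: poll until the returned status is "success" (or an error)
    return ("GET", BASE + sync_uri, None)

def data_request(export_uri, limit=1000, offset=0):
    # Step 4: page through the exported rows
    return ("GET", "%s%s/data?limit=%d&offset=%d"
            % (BASE, export_uri, limit, offset), None)
```

The "uri" values returned by steps 1 and 2 (e.g. /activities/exports/74, /syncs/90640) are what feed the later steps.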

  • How to test Bulk API against testing environment?

    Hi,
    I need to be able to do testing of my code to import data and would like to do it without messing with live data. Is there any way to be able to have a testing environment for Eloqua that I would use to test my Bulk API imports?
    Thanks,
    Rami

    Rami, you would need to get either a sandbox (if you are an existing client) or join the technology partner program and get a development instance.  See Getting Started With the Eloqua Platform for more details.

  • Bulk API V2.0 Data extract support for additional objects (Campaign,Email,Form,FormData,LandingPage)?

    allison.moore
    Are there any plans to add the following objects to Bulk API V2.0 for data extraction from Eloqua? Extracting the data for these objects using the REST API is complicated.

    Thanks for the quick response. Extracting these objects using the REST API with depth=complete poses lots of complications from the code perspective, since these objects contain multiple nested or embedded objects. Is there any guideline on how to extract these objects using REST so that we can get all the data required for analysis/reporting?
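There's no official guideline I'm aware of, but one common workaround for deeply nested responses is to flatten each depth=complete payload into dotted-path columns before loading it for reporting. A generic sketch, not specific to any one Eloqua object:

```python
def flatten(obj, prefix=""):
    """Flatten an arbitrarily nested dict/list (as returned by a
    depth=complete REST call) into a single dict of dotted-path columns."""
    flat = {}
    if isinstance(obj, dict):
        for key, value in obj.items():
            flat.update(flatten(value, "%s%s." % (prefix, key)))
    elif isinstance(obj, list):
        for i, value in enumerate(obj):
            flat.update(flatten(value, "%s%d." % (prefix, i)))
    else:
        flat[prefix.rstrip(".")] = obj
    return flat
```

The dotted paths then map naturally onto column names in a reporting table.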

  • Issues with Check In API

    I'm using Adobe Drive 3.0.1.104 on Windows 7.  When I drag and drop multiple assets into the Explorer Window, right-click those assets and choose "Check In..", none of the handlers associated with my custom checkin dialog (i.e., CanEnableCustomCheckInUIHandler, GetCustomCheckInUIContextInfoHandler and GetCustomCheckInUIHandler) are called.  Only the default check in comments dialog appears.  However, if I select a single asset for check in, then my handlers are correctly called.  Is this the expected behavior?  I want the user to be able to determine the type of object and some metadata in my CMS on import, but don't have this capability when multiple assets are selected.
    The second issue I have is that the CanEnableCustomCheckInUIHandler, GetCustomCheckInUIContextInfoHandler do not have any visibility into non-remote assets through the API (i.e., assets with a null Asset ID).  The getAssetIdentities() call only returns information about assets that already have an assetId.  In these two handlers, I would like to make decisions and set context based on the the entire list of files, not just those that already live in the remote CMS.
    The third issue I have is that I would like capture ContextInfo for each asset individually.  For example, if a user checks in an InDesign file and a linked Image at the same time, I would like capture separate context info for each asset that I can store in my CMS.  I might also want to capture global context that applies to all assets (i.e., the comments are a good example of this, but there could be others).
    Based on these three issues, I am limited to allowing only one asset to be checked in at a time.  However, with the current API, I can't even prevent a check in until my CheckInHandler is called, which is the first opportunity for me to throw an Exception to stop the checkin.  In my opinion, this is not a good user experience because the user will be prompted to enter the comments and then get an error, which means they will lose the information they typed in the comments.
    I'm mostly trying to provide feedback to improve the API, but if there is something I'm overlooking here, please let me know.
    Rich

    Hi,
    For the first and second issues: yes, this is a design limitation. We will try to resolve it in a coming dot release; thanks for pointing them out.
    For the third issue, I think you can manage the context info for each item by using IGetCustomCheckInUIContextInfoHandler. You can establish a data structure for the requested items; this structure will be passed to your custom UI module, which understands it and returns the context info to ICheckInHandler in the custom parameters. In short, you should establish a communication protocol between IGetCustomCheckInUIContextInfoHandler, the custom Flex UI module, and ICheckInHandler for how to pass and manage the context info.
    Regarding how to prevent a check-in, please share your detailed thoughts on it and we can discuss it further.

  • Limitations with the free Informatica Cloud Data Loader

    Hello,
    Can you please help me understand the limitations of the free data loader? In this link - http://www.informaticacloud.com/editions-integration.html# - I see the features listed below:
    No-code, wizard-driven cloud integration
    Multi-tenant SaaS solution
    Database and file connectivity
    Flexible scheduling
    Bulk API support (for Salesforce.com)
    Unlimited rows/day
    24 jobs/day
    1 Secure Agent
    Limited to 1 user
    Community support
    Cloud Data Masking
    Questions:
    When I view licenses in my free data loader, under Feature Licenses, it shows the license type for Salesforce Connectivity/Bulk API as "Trial". Can't I create a scheduled Data Synch task to upsert records in Salesforce using Bulk API mode?
    Is the email notification option (for success, warning, and failure of a data synch task) available in the free version (and not as a trial)?
    I understand there is a limit of 24 jobs/day, but is there a limit on the number of scheduled data synch tasks that can be created?
    Data Masking is listed as a feature above for the free edition. However, when I view the licenses in my free data loader, Data Masking is shown as "Trial". Can you please clarify this?
    Is there a limit on the number of Connections that can be created?
    Thanks
    Sanjay

    Hi,
    The present project has a requirement to delete data from Salesforce objects. We have the following set up:
    1. Parent objects
    2. Child objects
    3. Cloud Data Synchronization tasks to delete these objects
    Parent and child have LOOKUP relationships between them. Deleting data from the child objects did not give any error. We tried 2 scenarios to delete data from the parent object:
    Scenario 1: Tried to delete data from PARENT first, before deleting CHILD. Result: Failed
    Scenario 2: Tried to delete data from PARENT after deleting CHILD. Result: Failed
    Error message received in both cases: "Error loading into target [SF_Object] : Error received from salesforce.com. Fields []. Status code [DUPLICATE_COMM_NICKNAME]. Message [Too many records to cascade delete or set null]."
    Kindly help to resolve this error and suggest a method to delete data from PARENT Salesforce objects. Please feel free to ask for more inputs, if required.

  • How do I create a Campaign Entry via the Bulk API

    I am importing contacts into Eloqua via an Audience App and the Bulk API.  When I look in the Campaign Entry Report, those contacts don't appear.  If I examine a contact that does appear in the report, I see that it has Campaign Entry and Campaign Membership under Recent Activity.  What do I need to do to make that happen for the contacts I'm importing via the Bulk API?  I've looked through the API documentation but haven't found what I need.

    In the Style Editor you only see the CSS files used by the website in the currently selected tab.
    You won't see any references to the userContent.css file via the built-in Inspector.
    Only the DOM Inspector shows such CSS rules, if you inspect an element that has a matching selector.
    You do not need to import the userContent.css file, as Firefox does this automatically each time it starts, and the rules that apply will automatically be added.
    *DOM Inspector: https://addons.mozilla.org/firefox/addon/dom-inspector-6622/
    *https://developer.mozilla.org/DOM_Inspector
    *https://developer.mozilla.org/Introduction_to_DOM_Inspector

  • Does the Bulk API accept gzipped request content?

    I've been trying to send gzipped requests to the Bulk API with the headers Content-Encoding: gzip and Content-Type: application/json and it returns an HTTP 400 response with the message "There was a serialization error."
    I've also tried changing the content type to application/x-gzip and it returns an HTTP 400 response with the message:
    { "failures":[{"field":"name","constraint":"Must be a string value, at least 1 character and at most 100 characters long."},{"field":"fields","constraint":"Is required."}]}
    These requests work perfectly when uncompressed; it is only when I compress the data and send it to the API that I run into issues. Will the Bulk API accept gzipped request content?

    I don't believe this is currently possible.
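For anyone who wants to experiment anyway, compressing the body itself is straightforward; whether the Bulk API will decompress it is exactly the open question above, so treat this as client-side only:

```python
import gzip
import json

def gzip_body(payload):
    """Serialize a request payload to JSON and gzip-compress it."""
    return gzip.compress(json.dumps(payload).encode("utf-8"))

# headers you would send alongside the compressed bytes
headers = {"Content-Encoding": "gzip", "Content-Type": "application/json"}
```

The serialization error reported above suggests the server is trying to parse the compressed bytes as JSON directly, i.e. it is ignoring the Content-Encoding header rather than rejecting it.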

    Hello: I have this simple ABAP program, but I don't seem to make it work: FUNCTION Z_SAP_GET_CREDIT. ""Interfase local *"  IMPORTING *"     VALUE(CLIENT_ID) TYPE  Z_CLIENT_ID OPTIONAL *"  EXPORTING *"     VALUE(CREDIT_LIMIT) TYPE  STRING *"     VALUE