Bulk API exports with 5m records

Hi,
I've just started using the Bulk API. For our activity exports we have more than 5 million records, so when I run it (unfiltered apart from activity type) the sync stops at 5 million.
I am looking to get all of the records out, so I was wondering if anyone knew how best to do this.
'Offset' seems to work with the GETs but not with POSTing the export definition.
I have tried filtering on activity id > (the highest activity id from the first 5 million records returned), but this seems to leave gaps.
Likewise date > (the highest date in the first 5 million).
It seems as though activity id and date only roughly tie up.
Any advice appreciated!
Cheers
Stephen

Hi Stephen,
Have you considered making a standard contact filter that does this, and simply referencing that in your export definition? Admittedly, it's either going to be a manual process from the UI, or extra code in your app to automate the creation of a shared filter via the REST API.
For reference, Bulk API 2.0 can do this via:
"filter": "'{{Contact.Field(C_FirstName)}}' = ''"
Regards,
Bojan

Similar Messages

  • Bulk API issue with contact imports

    Is the Bulk API having validation issues? I can't update any existing imports or create any new ones.
    Simply posting the content below from the tutorial now results in a validation error:
    {
      "name": "Docs Import Example",
      "fields": {
        "firstName": "{{Contact.Field(C_FirstName)}}",
        "lastName": "{{Contact.Field(C_LastName)}}",
        "emailAddress": "{{Contact.Field(C_EmailAddress)}}"
      },
      "identifierFieldName": "emailAddress",
      "isSyncTriggeredOnImport": "false"
    }
    Here is the error:
    {"failures":[
      {"field":"identifierFieldName","constraint":"Must be a string value, at least 1 character and at most 100 characters long."},
      {"field":"name","constraint":"Must be a string value, at least 1 character and at most 100 characters long."},
      {"field":"fields","constraint":"Is required."}
    ]}


  • Retrieve Email Group Status via Bulk API Exports

    What methods are needed in order to retrieve email group status and subscription date for contacts (alongside contact information) via Bulk API 2.0?
    Thank you,
    SR.

    After further testing: you can subscribe contacts to two groups, but the log message will be misleading. I created a bulk import with 10 contacts to be subscribed to two email groups. The log message indicated that 20 contacts had been subscribed to the first email group. Checking the contacts via the web interface showed that they were properly subscribed to both groups. The count was correct; just the text of the message was overly specific.

  • Limitation with Eloqua Bulk API

    Hi,
    We are planning to develop a connector for data extraction from Eloqua to load into Birst, and would like to understand any limitations of the Bulk API.
    Thanks,
    Anil

    Hi Anil,
    The limitations listed below are documented in the Bulk API Guide:
    4.12 Bulk API Limits
    4.12.1 General Limits
    Number of syncs permitted per 24-hour period: 2,000. This limit is not currently enforced, but is logged and monitored.
    4.12.2 Import and Export Limits
    Maximum import size: 32 MB. The import will fail if the data source is larger than 32 MB. To import a data source larger than 32 MB, you can perform multiple POST requests. For extremely large data sources, the import should be synced periodically, e.g. once 250 MB of data has been POSTed.
    Maximum export size: 50,000 records per page. The export will fail if you attempt to export more than 50,000 records in a single page at one time. If you need to export more than 50,000 records, you can filter your request by date range and make separate requests for different date ranges.
    Maximum fields that can be included in one export: 100. The export will fail if you include more than 100 fields in the export definition.
    Maximum sync size: 5,000,000 records. The sync will fail if you attempt to sync more than 5 million records.
    If you are planning specific large exports, you can create filter statements in your export definition.
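    For example, once a sync has completed, you would page through the staged results in chunks of at most 50,000 records. A minimal Python sketch, assuming the host, credentials, and export URI below (all placeholders) and the limit/offset/hasMore paging fields of the Bulk API data responses:
    import requests

    BASE = "https://secure.eloqua.com/API/Bulk/2.0"  # placeholder pod host
    AUTH = ("SiteName\\user.name", "password")       # placeholder credentials

    offset, limit = 0, 50000                         # 50,000 = max page size
    while True:
        resp = requests.get(BASE + "/contacts/exports/1/data",  # placeholder URI
                            params={"limit": limit, "offset": offset},
                            auth=AUTH)
        resp.raise_for_status()
        page = resp.json()
        for record in page.get("items", []):
            pass  # hand each record to your loader here
        if not page.get("hasMore"):
            break
        offset += limit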
    -Syed.

  • How to filter CDO export on Date Modified using Bulk API

    I'm using the Bulk API 1.0 and E10, and I'm trying to export custom data object (aka CDO, Data Card) records that have been modified recently. I can get it to work using another date field, but I can't figure out what the field name or ID is for the modified date. I've tried C_DateModified, but that returned a generic error message. I tried to query the custom object to get all fields, but that just returns the custom fields. Any ideas?
    By the way, the filter I'm using is something like this: "filter":{"filterRule":"valueGreaterThanOrEqualToComparisonValue","value":"{{CustomObject[53].Field[???]}}","comparisonValue":"2014-02-01 00:00:00"}
    Thanks!

    Hello,
    I was trying all the names in this post and kept getting the
    constraint: "Must be a reference to an existing object."
    error message. What exactly is the syntax to filter by modified date for custom objects in Bulk API 2.0? I'm using E10.
    I tried both
    "filter":"'{{CustomObject[5].Field(Date_Modified1)}}' >= '2014-09-20'"
    and
    "filter":{"filterRule":"valueGreaterThanOrEqualToComparisonValue","value":"{{CustomObject[5].Field(Date_Modified1)}}","comparisonValue":"2014-09-01 00:00:00"}

  • Flat file export with a “Constant” header record???

    I have a mapping that exports the records of a table to a flat file.
    But I need to add a constant as the first line of the export file (header record),
    for example 'AAABBBCCCDDDEEEFFFGGG'.
    How can I put a “constant” header record in?

    Marcelo,
    When you use flat file as a target you can set the header to be your field names. Now... it looks like this is not exactly what you are looking for.
    An alternative is to use a pre-mapping procedure to create your file with the one line in it, and have the mapping produce the additional records that are appended after that first line. Whether you create a new file or append to an existing file is a property on the target file, if I am not mistaken (it would be a configuration property otherwise).
    Thanks,
    Mark.

  • How to get data of more than 100 fields by Bulk API V2.0?

    Hi all,
    I'm trying to get data out of Eloqua with the Bulk APIs because of the data volume.
    But my Contact has 186 fields (more than the Bulk export limit of 100), so I think I need to get all the data in two exports.
    How can I match the two parts of a row and join them together?
    I'm afraid that any edit to the data between the two syncs for the two exports would make the ordering differ.
    E.g.:
        1. A record is deleted or edited (so it no longer matches the filter) after getting the first part and before getting the second; then every record behind it moves up in the second part's results.
        2. Data in some fields (included in both parts) changes between the two syncs; then the values in the second part are current but the values in the first part are stale.
    Any suggestions would be appreciated.
    Thanks,
    Biao

    bhuang -
    I don't know that you're ever going to get around the fact that things will change in your database while you're syncing the data. You're going to have to have some way to create exceptions on the sync side.
    If I was pushing data from Eloqua to another database and had to deal with the problem of matches changing while I was syncing, I'd create a few extra columns in my database to keep track of sync status for that record. Or create another small table to track the data mapping. Here's how I'd do it.
    I'd have two extra columns: 'mapped fields 1' and 'mapped fields 2'. They'd both be datetime fields.
    I would only do one set of syncs at a time. First, sync every record for the email + 99 fields. Do the entire list. For each batch, set the datetime of the batch in the 'mapped fields 1' column.
    I'd then sync all records for the email + the other 86 fields. Do the entire list again. For each of these batches, set the 'mapped fields 2' column to now().
    For any records that had only 'mapped fields 1' filled but 'mapped fields 2' empty, I'd re-run the second Eloqua API query using the email as the lookup value. If no results were returned, I'd delete the row. Otherwise, update it and set 'mapped fields 2' to now().
    For any records that had only 'mapped fields 2', I'd re-run the emails against the first Eloqua API query, fill in the missing data, and set 'mapped fields 1' to the current datetime. If the record wasn't returned, delete the row, as it's probably not in the search any longer.
    Finally, set 'mapped fields 1' and 'mapped fields 2' to empty for all records, since you know the data is in sync. This will let you use the same logic above on your next sync.
    Does that make sense? It's not super clean, but it'll get the job done, unless your syncs are taking a ridiculous amount of time and your data changes super often. 
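    A rough sketch of that bookkeeping in Python, using SQLite as a stand-in for the destination database (the emails fed to each pass would come from your two Bulk API exports; all names here are made up):
    import sqlite3
    from datetime import datetime, timezone

    db = sqlite3.connect(":memory:")
    db.execute("""CREATE TABLE contacts (
        email TEXT PRIMARY KEY,
        mapped_fields_1 TEXT,  -- stamped by the email + 99-field sync
        mapped_fields_2 TEXT   -- stamped by the email + 86-field sync
    )""")

    def stamp(emails, column):
        # Upsert each record and stamp the given column with the batch datetime.
        when = datetime.now(timezone.utc).isoformat()
        for email in emails:
            db.execute(
                f"INSERT INTO contacts(email, {column}) VALUES(?, ?) "
                f"ON CONFLICT(email) DO UPDATE SET {column} = excluded.{column}",
                (email, when))

    stamp(["a@x.com", "b@x.com"], "mapped_fields_1")  # pass 1: email + 99 fields
    stamp(["b@x.com", "c@x.com"], "mapped_fields_2")  # pass 2: email + 86 fields

    # Rows stamped by only one pass changed between the two syncs: re-query
    # Eloqua for these by email, or delete them if they are no longer returned.
    stragglers = db.execute(
        "SELECT email FROM contacts "
        "WHERE mapped_fields_1 IS NULL OR mapped_fields_2 IS NULL").fetchall()

    # Once reconciled, clear both stamps so the same logic works next run.
    db.execute("UPDATE contacts SET mapped_fields_1 = NULL, mapped_fields_2 = NULL")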

  • Steps to integrate Salesforce through Bulk API

    Hi,
    Could someone please let me know the steps to make a call from BPEL (SOA 11.1.1.6) to Salesforce using the Bulk API? Currently we are doing this with the SOAP API (partner WSDL), which has a limitation on the number of records in the input, so we would like to migrate to the Bulk API.
    Thanks & Regards
    Venkat


  • How to get activity data by Bulk APIs v2.0?

    Hi all,
    I have managed to get Account data via Bulk API v2.0 (create an export for Account, then create a sync for that export).
    But when trying the same integration for Activity data, I always get an Error status for the sync.
    Could you please give me any suggestions on this issue?
    Is there any difference between exports of Account and Activity?
    Thanks,
    Biao
    PS: I found an example request for activity data:
    POST https://<host>.eloqua.com/api/bulk/2.0/activities/exports
    "name":"Example Activity Export",
    "fields":{
    "ActivityId":"{{Activity.Id}}",
    "AssetName":"{{Activity.Asset.Name}}",
    "ActivityType":"{{Activity.Type}}",
    "Id":"{{Activity.Id}}",
    "ActivityDate":"{{Activity.CreatedAt}}",
    "EmailAddress":"{{Activity.Field(EmailAddress)}}",
    "ContactId":"{{Activity.Contact.Id}}",
    "VisitorId":"{{Activity.Visitor.Id}}",
    "AssetType":"{{Activity.Asset.Type}}",
    "AssetId":"{{Activity.Asset.Id}}",
    "RawData":"{{Activity.Field(RawData)}}"
    "filter":"’{{Activity.Type}}’=’FormSubmit’"
    I get an Error status for the sync of this export. But when I remove the 'Id' and 'RawData' fields, I get success.
    I guess there are some rules for the field settings, e.g. one statement should not appear more than once, and so on.
    So, where could I find those rules? Or is there an API that returns the valid field list?
    Thank you very much.

    Hi Biao,
    I am able to pull the data for an Activity export of type FormSubmit.
    I am adding the details below:
    1) Create Activity Export Structure
    URL : https://www02.secure.eloqua.com/API/Bulk/2.0/activities/exports
    Input JSON : {"name":"Bulk Activity Export","dataRetentionDuration":"PT1H","fields":{"activityId":"{{Activity.Id}}","assetId":"{{Activity.Asset.Id}}","activityType":"{{Activity.Type}}","activityDate":"{{Activity.CreatedAt}}","contactId":"{{Activity.Contact.Id}}","visitorId":"{{Activity.Visitor.Id}}","visitorExternalId":"{{Activity.Visitor.ExternalId}}","assetType":"{{Activity.Asset.Type}}","assetName":"{{Activity.Asset.Name}}","rawData":"{{Activity.Field(RawData)}}"},"filter":"'{{Activity.Type}}'='FormSubmit' and '{{Activity.CreatedAt}}'>='2013-06-01' and '{{Activity.CreatedAt}}'<='2013-06-30'"}
    Status Code : 201
    Output JSON : {"name":"Bulk Activity Export","fields":{"activityId":"{{Activity.Id}}","assetId":"{{Activity.Asset.Id}}","activityType":"{{Activity.Type}}","activityDate":"{{Activity.CreatedAt}}","contactId":"{{Activity.Contact.Id}}","visitorId":"{{Activity.Visitor.Id}}","visitorExternalId":"{{Activity.Visitor.ExternalId}}","assetType":"{{Activity.Asset.Type}}","assetName":"{{Activity.Asset.Name}}","rawData":"{{Activity.Field(RawData)}}"},"filter":"'{{Activity.Type}}'='FormSubmit' and '{{Activity.CreatedAt}}'>='2013-06-01' and '{{Activity.CreatedAt}}'<='2013-06-30'","dataRetentionDuration":"PT1H","uri":"/activities/exports/74","createdBy":"CXDELQ.API","createdAt":"2014-11-03T09:40:00.9397930Z","updatedBy":"CXDELQ.API","updatedAt":"2014-11-03T09:40:00.9397930Z"}
    2) Create Sync with status pending
    URL : https://www02.secure.eloqua.com/API/Bulk/2.0/syncs
    Input JSON : {"syncedInstanceUri":"/activities/exports/74","status":"pending"}
    Status Code : 201
    Output JSON : {"syncedInstanceUri":"/activities/exports/74","status":"pending","createdAt":"2014-11-03T09:41:18.2503138Z","createdBy":"CXDELQ.API","uri":"/syncs/90640"}
    3) Get sync Result
    URL : https://www02.secure.eloqua.com/API/Bulk/2.0/syncs/90640
    Status Code : 200
    Output JSON : {"syncedInstanceUri":"/activities/exports/74","syncStartedAt":"2014-11-03T09:41:20.6670000Z","syncEndedAt":"2014-11-03T09:41:23.1970000Z","status":"success","createdAt":"2014-11-03T09:41:18.1670000Z","createdBy":"CXDELQ.API","uri":"/syncs/90640"}
    4) Get the data from API
    URL : https://www02.secure.eloqua.com/API/Bulk/2.0/activities/exports/74/data?limit=1000&offset=0&count=1&depth=complete
    Output JSON : "Output is high voulme, am not attaching here".
    Please let me know whether this helps you resolve the issue.
    Thanks,
    Deepak Chincholi

  • Export VCS call records

    Hi,
    I want to export VCS call records to a billing system. What do I need to do to get these records in a raw format?
    I can see the records in the Call\History log, but they are not in a raw format.
    Thanks in advance for your answer,
    Philippe

    You can see the history in its XML format.
    Open the history XML document: https://IPAddress/history.xml
    You can also use XPath expressions to interface with the XML via the web or a shell.
    If the VCS documents don't cover it well, look at the API docs for the integration codecs; the way you interface with the data is basically the same for call history, just with more fields on the VCS.
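    For example, something along these lines could pull the history document and pick fields out of it. An untested Python sketch; the element names are guesses, so check your history.xml for the real ones:
    import requests
    import xml.etree.ElementTree as ET

    # Self-signed certificates are common on a VCS, hence verify=False.
    resp = requests.get("https://IPAddress/history.xml",
                        auth=("admin", "password"), verify=False)
    root = ET.fromstring(resp.content)

    # Walk the per-call elements and print a couple of fields for each
    # (the "Call", "SerialNumber" and "Duration" names are assumptions).
    for call in root.iter("Call"):
        print(call.findtext("SerialNumber"), call.findtext("Duration"))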

  • How to create a custom function module with the records in SAP R/3?

    Hi All,
    How do I create a custom function module that returns records in SAP R/3? I have to fetch the custom function module's records using the RFC Adapter.
    Regards
    Sara

    Hi
    Go to SE37. There you need to create a function group, then create a function module. Inside it, assign import/export parameters, assign tables/exceptions, and activate it. Now write your code within the function module.
    http://help.sap.com/saphelp_nw04/helpdata/en/9f/db98fc35c111d1829f0000e829fbfe/content.htm
    Look at the SAP Help links below; they will show you the way to create a function module:
    http://help.sap.com/saphelp_nw04/helpdata/en/26/64f623fa8911d386e70000e82011b8/content.htm

  • Getting Data Types of Eloqua - any REST BULK API?

    Is there an API which lists all the data types used for the fields of Eloqua entities?
    From the examples that list the entity metadata, I can see that there are data types like "YESNO", "Date", and "string" for an entity's fields.
    I need a list of all the data types that Eloqua can have for any of its fields.
    I am going to be using the Bulk API to import and export Eloqua elements through Java.

    There is currently no endpoint for this.
    There are four data types available for contacts and accounts - Date, Large Text, Numeric (float), and Text. Custom objects can use these four data types plus a fifth - Number (integer).
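    With no endpoint to query, a hard-coded client-side map is the usual workaround. A sketch in Python; the right-hand sides are assumptions, so substitute whatever types your Java consumer actually uses:
    # The four contact/account data types plus the custom-object-only
    # integer type, mapped to illustrative Java type names.
    ELOQUA_DATA_TYPES = {
        "Date": "java.util.Date",
        "Large Text": "String",
        "Numeric (float)": "Double",
        "Text": "String",
        "Number (integer)": "Long",  # custom objects only
    }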

  • How to write only the new records, not the existing records

    hi,
    I have a script. If I execute the script it writes the records, but it writes them along with the existing records: not only the new records but also the old ones.
    For example, the existing records are:
    1111115-2,USD,DINESH,1,1,,,,9,,,,123456184001,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,9999,,,,
    1111116-2,USD,DINESH,1,1,,,,9,,,,123456184001,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,9999,,,,
    1111117-2,USD,DINESH,1,1,,,,9,,,,123456184001,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,9999,,,,
    1111118-2,USD,DINESH,1,1,,,,9,,,,123456184001,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,9999,,,,
    1111119-2,USD,DINESH,1,1,,,,9,,,,123456184001,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,9999,,,,
    But now what happens is that the new records, such as
    1111116-2,USD,DINESHBABU,1,1,,,,9,,,,123456184003,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,9999,,,,
    1111117-2,USD,DINESHBABU,1,1,,,,9,,,,123456184003,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,9999,,,,
    1111118-2,USD,DINESHBABU,1,1,,,,9,,,,123456184003,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,9999,,,,
    1111119-2,USD,DINESHBABU,1,1,,,,9,,,,123456184003,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,9999,,,,
    get appended to the old existing records:
    1111117-2,USD,DINESH,1,1,,,,9,,,,123456184001,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,9999,,,,
    1111118-2,USD,DINESH,1,1,,,,9,,,,123456184001,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,9999,,,,
    1111119-2,USD,DINESH,1,1,,,,9,,,,123456184001,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,9999,,,,
    1111113-2,USD,DINESHBABU,1,1,,,,9,,,,123456184003,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,9999,,,,
    1111114-2,USD,DINESHBABU,1,1,,,,9,,,,123456184003,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,9999,,,,
    1111115-2,USD,DINESHBABU,1,1,,,,9,,,,123456184003,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,9999,,,,
    1111116-2,USD,DINESHBABU,1,1,,,,9,,,,123456184003,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,9999,,,,
    1111117-2,USD,DINESHBABU,1,1,,,,9,,,,123456184003,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,9999,,,,
    1111118-2,USD,DINESHBABU,1,1,,,,9,,,,123456184003,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,9999,,,,
    1111119-2,USD,DINESHBABU,1,1,,,,9,,,,123456184003,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,9999,,,,
    A problem will occur when we upload bulk data of, say, 25,000 records. If it writes 20,000 records, the rest (5,000 records) has to be done in a next step; when that happens, the first 20,000 records are written again every time, because the script writes everything its cursor selects, and the next 5,000 records are never written on their own. So there is a chance of redundancy. How can I avoid this redundancy while running this script?
    My script is:
    create or replace procedure input_tables(table1 in varchar2)
    is
         str varchar2(32767);
         cnt number(2);
         cursor c1(tname in varchar2)
         is
           select column_name
           from all_tab_columns
           where table_name = tname
           order by column_id;
         rec c1%rowtype;
    begin
         cnt := 1;
         str := 'declare '||
                'cursor c2 '||
                'is '||
                'select ';
         open c1(table1);
         loop
              fetch c1 into rec;
              exit when c1%notfound;
              if cnt = 1 then -- Added New
                   str := str||rec.column_name; -- Added New
                   cnt := cnt + 1; -- Added New
              else -- Added New
                   str := str||'||'',''||'||rec.column_name; -- Added New
              end if; -- Added New
         end loop;
         close c1;
         str := str||' SRC from '||table1||';'||
                ' r2 c2%rowtype;'||
                ' ft UTL_FILE.file_type;'||
                ' str2 varchar2(200);'|| -- Added New
                ' begin '||
                ' ft := UTL_FILE.fopen(''SAMPLE'',''OUTPUT.csv'',''w'');'||
                ' for r2 in c2'||
                ' loop '||
                ' UTL_FILE.put_line(ft,r2.SRC);'||
                ' end loop;'||
                ' UTL_FILE.fclose(ft);'||
                ' end;';
         execute immediate(str);
    end;
    thanks,
    Ratheesh.

    Hi!
    You can check the following script:
    create or replace procedure input_tables(table1 in varchar2, start_row in number, last_row in number)
    is
         str varchar2(32767);
         cnt number(2);
         cursor c1(tname in varchar2)
         is
           select column_name
           from all_tab_columns
           where table_name = tname
           order by column_id;
         rec c1%rowtype;
    begin
         cnt := 1;
         str := 'declare '||
                'cursor c2 '||
                'is '||
                'select ';
         open c1(table1);
         loop
              fetch c1 into rec;
              exit when c1%notfound;
              if cnt = 1 then -- Added New
                   str := str||rec.column_name; -- Added New
                   cnt := cnt + 1; -- Added New
              else -- Added New
                   str := str||'||'',''||'||rec.column_name; -- Added New
              end if; -- Added New
         end loop;
         close c1;
         -- rownum must be assigned in an inline view before a lower bound is
         -- applied; a plain "where rownum between 11 and 15" returns no rows.
         str := str||' SRC from (select t.*, rownum rn from '||table1||' t)'|| -- Added New
                ' where rn between '||start_row||' and '||last_row||';'|| -- Added New
                ' r2 c2%rowtype;'||
                ' ft UTL_FILE.file_type;'||
                ' begin '||
                -- ''a'' appends, so later chunks do not wipe out the first one;
                -- remove any stale OUTPUT.csv before writing the first chunk.
                ' ft := UTL_FILE.fopen(''SAMPLE'',''OUTPUT.csv'',''a'');'||
                ' for r2 in c2'||
                ' loop '||
                ' UTL_FILE.put_line(ft,r2.SRC);'||
                ' end loop;'||
                ' UTL_FILE.fclose(ft);'||
                ' end;';
         execute immediate(str);
    end;
    /
    To print the first ten rows:
    exec input_tables('EMP',1,10);
    To print the next remaining rows:
    exec input_tables('EMP',11,15);
    N.B.: Not tested.
    Regards,
    Satyaki De.

  • Videos export with purple/green artifacts/splotches/flashes with Media Encoder CC but export fine with Premiere Pro CC?

    I use Premiere Pro CC to edit videos, and they render fine when I export through Premiere Pro, but when I try to use Media Encoder CC, the video exports with weird splotches of green/purple artifacts all throughout. It almost looks like the video is glitching and having a seizure of green and purple. No idea why this happens. Could anybody help me? I use Media Encoder to export batches of videos at a time through its queue feature, which is why I prefer it for video exporting.
    Please and thanks.
    Specs:
    Motherboard:  MSI® Z87-G45 Gaming [Red on Black color] 2x SLI/CrossFire
    Processor:  Intel® Core™ i7 4770K 3.5GHz/3.9GHz Turbo 8MB L3 Cache HD 4600
    Memory:  16GB Corsair® Vengeance™ DDR3-1600 1.5V (2x8GB)
    Graphics and GPGPU Accelerator: AMD R9 295x2
    Power Supply:  1300W EVGA
    Operating System:  Microsoft Windows 7 Home Premium 64-bit

    Mitch,
    I was hoping that you and others might have some insight as to whether the purple and green artifacts/splotches seen in the attached screenshots (taken from 4 consecutive frames, but occurring through the entirety of the clip) could possibly have resulted from exporting video with Premiere Pro CS5 or CS6. As seen below, these splotches alternate back and forth between purple and green in every consecutive frame. I was also hoping that someone might know whether this is a relatively common type of artifact and how it is caused.

  • Export with crystal reports as web service

    In order to create a 3-tier architecture for security enhancements, I am using the Crystal feature to publish a report as a web service. As far as displaying these reports in the viewer is concerned, I have that part working by setting the datasource to the asmx and then using the viewer to set the parameters and logon information.
    However, I also currently have an export button on the UI which used the ReportDocument and wrote the Excel output to the HTTP stream. This is the part where I am now struggling with Crystal as a web service, as it does not expose ReportDocument. I have scanned all the forums and have been googling for a while, but am getting nowhere.
    Please help if someone has already dealt with similar problem.
    Thank you.

    I don't think you will be able to do this from the button. The reason is that the web service only works with the viewer, and there are no exposed export APIs associated with the viewer. The only export APIs are based off the report document object.
    Ludek
    Follow us on Twitter
    http://twitter.com/SAPCRNetSup
