Bulk Load API Status

Hi,
I'm using Oracle Endeca 2.3.
I encountered a problem in Data Integrator: a batch of records was missing from the front end, yet when I checked the status of the graph it showed "Graph executed successfully".
So I connected the Bulk Loader to a "Universal data writer" to see the data domain status of the bulk load.
I've listed the results below. However, I'm not able to interpret the status information, and I've looked through the documentation but found nothing useful.
0|10000|0|In progress
0|11556|0|In progress
0|20000|0|In progress
0|30000|0|In progress
0|39891|0|In progress
0|39891|0|In progress
0|39891|0|In progress
0|39891|0|In progress
0|39891|0|In progress
0|39891|0|In progress
40009|-9|0|In progress
40009|9991|0|In progress
40009|19991|0|In progress
40009|20846|0|In progress
Could anyone enlighten me about this status?
Also, since these messages are part of the "Post load" phase, I'm wondering why it still shows "In progress".
Cheers,
Khurshid

I assume there was nothing of note in the dgraph.log?
The other option is to see what happens when you either:
A) filter your data down to the records that are missing prior to the load and see what happens
Or
B) use the regular data ingest API rather than the bulk.
Option b will definitely perform much worse on 2.3 so it may not be feasible.
The other thing to check is that your record spec is truly unique.  The only time I can remember seeing an issue like this was loading a record, then loading a different record with the same spec value.  The first record would get in and then be overwritten by the second record making it seem like the first record was dropped.  Figured it would be worth checking.
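If it helps to rule that out, here is a minimal, hypothetical sketch of such a check. It assumes the feed you send to the bulk loader can also be exported as pipe-delimited text with the record spec in the first column; the file name, delimiter and column index are placeholders rather than anything from your graph:
import java.io.BufferedReader;
import java.io.FileReader;
import java.util.HashMap;
import java.util.Map;

public class SpecUniquenessCheck {
    public static void main(String[] args) throws Exception {
        // Placeholder input: the bulk-load feed exported as pipe-delimited text,
        // with the record spec value in the first column.
        Map<String, Integer> counts = new HashMap<>();
        try (BufferedReader in = new BufferedReader(new FileReader("records.txt"))) {
            String line;
            while ((line = in.readLine()) != null) {
                String spec = line.split("\\|", 2)[0];
                counts.merge(spec, 1, Integer::sum);
            }
        }
        // Any spec that appears more than once would be overwritten during the load,
        // which looks like "missing" records on the front end.
        counts.forEach((spec, n) -> {
            if (n > 1) {
                System.out.println("Duplicate spec: " + spec + " (" + n + " occurrences)");
            }
        });
    }
}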
Patrick Rafferty
Branchbird

Similar Messages

  • Using API  to run Catalog Bulk Load - Items & Price Lists concurrent prog

    Hi everyone. I want to be able to run the concurrent program "Catalog Bulk Load - Items & Price Lists" for iProcurement. I have been able to run concurrent programs in the past using the fnd_request.submit_request API, but I seem to be having problems with the item loading concurrent program. For one thing, the program is stuck in phase code P (pending) status.
    When I run the same concurrent program using the iProcurement Administration page it runs ok.
    Has anyone been able to run this program through the backend? If so, any help is appreciated.
    Thanks

    Hello S.P,
    Basically this is what I am trying to achieve.
    1. Create a staging table. The columns available for it are category_name, item_number, item_description, supplier, supplier_site, price, uom and currency.
    So basically the user can load item details into the database from an excel sheet.
    2. Use the utl_file API to create an XML file called item_load.xml from the data in the staging table. This creates the XML file used to load items in iProcurement and saves it in the database directory /var/tmp/iprocurement. This part works great.
    3. Use the fnd_request.submit_request API to submit the concurrent program 'Catalog Bulk Load - Items & Price Lists'. This is where I am stuck. The process simply says pending or comes up with an error saying:
    oracle.apps.fnd.cp.request.FileAccessException: File /var/tmp/iprocurement is not accessable from node/machine moon1.oando-plc.com.
    I'm wondering if anyone has used my approach to load items before and if so, have they been successful?
    Thank you

  • API for bulk loading of pages into UCM

    For Oracle Universal Content Management –
    Is there an API for use in bulk loading pages?
    Where is the documentation for this?
    If there is no API, what is the best way to bulk load tens of thousands of pages into UCM?
    Thanks in advance,
    Ram

    To easily bulk load files in UCM, you can use the BatchLoader utility described in chapter 7 (release 10g) or chapter 3 (release 11g) of the 'Managing System Settings and Processes' admin guide.
    We used it on our project to load some 60000 documents in UCM with associated metadata in just over 1 hour. Worked fine.
    Brgrds,
    Bob Marien
    (PS: if you want to use an API instead, you can of course invoke the UCM web services.)
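    For reference, the BatchLoader consumes a plain text batch file of name=value pairs, one block per document, each terminated by <<EOD>>. The field names and values below are only an illustrative sketch; check the field list in the same admin guide against your metadata model before using them:
    Action=insert
    dDocName=BULK_DOC_0001
    dDocTitle=Bulk loaded document 0001
    dDocType=Document
    dDocAuthor=sysadmin
    dSecurityGroup=Public
    primaryFile=c:/bulkload/files/doc0001.pdf
    <<EOD>>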

  • HTTP Response Status Codes: GET (retrieve), POST (create), PUT (modify), and DELETE (REST, Bulk, Any API)

    Many of us who are just starting out wonder, when reading API documentation or building a first program that makes API calls, what all the error codes mean when we get different types of responses back from the program/script. The only reason I thought about sharing this is that I know how much motivation matters when dealing with the Eloqua platform and building components on top of it, extending the functionality beyond what is available out of the box.
    I put together a table that explains these in detail. I hope it helps you resolve issues as you venture on your journeys. This is a very common chart that can be found across many platforms' REST APIs. The idea was to have it here because the audience is not always the same.
    Response Code | HTTP Operation | Response Body Contents | Description
    200 | GET, PUT, DELETE | Resource | No error, operation successful.
    201 Created | POST | Resource that was created | Successful creation of a resource.
    202 Accepted | POST, PUT, DELETE | N/A | The request was received.
    204 No Content | GET, PUT, DELETE | N/A | The request was processed successfully, but no response body is needed.
    301 Moved Permanently | GET | XHTML with link | Resource has moved.
    303 See Other | GET | XHTML with link | Redirection.
    304 Not Modified | conditional GET | N/A | Resource has not been modified.
    400 Bad Request | GET, POST, PUT, DELETE | Error Message | Malformed syntax or a bad query.
    401 Unauthorized | GET, POST, PUT, DELETE | Error Message | Action requires user authentication.
    403 Forbidden | GET, POST, PUT, DELETE | Error Message | Authentication failure or invalid Application ID.
    404 Not Found | GET, POST, PUT, DELETE | Error Message | Resource not found.
    405 Not Allowed | GET, POST, PUT, DELETE | Error Message | Method not allowed on resource.
    406 Not Acceptable | GET | Error Message | Requested representation not available for the resource.
    408 Request Timeout | GET, POST | Error Message | Request has timed out.
    409 Resource Conflict | PUT, DELETE | Error Message | State of the resource doesn't permit the request.
    410 Gone | GET, PUT | Error Message | The resource is no longer available at this URI.
    411 Length Required | POST, PUT | Error Message | The server needs to know the size of the entity body, which should be specified in the Content-Length header.
    412 Precondition Failed | GET | Error Message | Operation not completed because preconditions were not met.
    413 Request Entity Too Large | POST, PUT | Error Message | The representation was too large for the server to handle.
    414 Request URI Too Long | POST, PUT | Error Message | The URI has more than 2k characters.
    415 Unsupported Media Type | POST, PUT | Error Message | Representation not supported for the resource.
    416 Requested Range Not Satisfiable | GET | Error Message | Requested range not satisfiable.
    500 Server Error | GET, POST, PUT | Error Message | Internal server error.
    501 Not Implemented | POST, PUT, DELETE | Error Message | Requested HTTP operation not supported.
    502 Bad Gateway | GET, POST, PUT, DELETE | Error Message | Backend service failure (data store failure).
    505 HTTP Version Not Supported | GET | Error Message | HTTP version not supported.
    Hope this helps. Original post: REST API Status Codes and Complete REST API Tutorial with Status Codes.
    Thanks
    Amit

    Hi, I am trying to do a PUT to update contact info and I get the following error:
    2015-01-16 11:00:17,970 INFO [main] oracle.eloqua.connector.eloqua.EloquaConnector.putWithBasicAuth(97) | accessHttpsPut.url=https://secure.eloqua.com/API/REST/2.0//data/contact/7606838, text={"id":"7606838","accountName":"openIdStr001","emailAddress":"[email protected]","type":"Contact"}
    2015-01-16 11:00:18,931 ERROR [main] oracle.eloqua.connector.eloqua.EloquaConnector.putWithBasicAuth(140) | ClientProtocolException
    org.apache.http.client.HttpResponseException: Request is malformed.
    Does anyone have any idea?
    Thanks so much.
    Sincerely.
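    One thing worth ruling out is the doubled slash in the URL above ("/2.0//data/..."); the payload may also need more fields than the ones shown, depending on the contact. Purely as a sketch, assuming Apache HttpClient 4.x (which the stack trace suggests) and basic authentication with a placeholder "SiteName\user" credential:
    import org.apache.http.HttpResponse;
    import org.apache.http.auth.AuthScope;
    import org.apache.http.auth.UsernamePasswordCredentials;
    import org.apache.http.client.methods.HttpPut;
    import org.apache.http.entity.ContentType;
    import org.apache.http.entity.StringEntity;
    import org.apache.http.impl.client.BasicCredentialsProvider;
    import org.apache.http.impl.client.CloseableHttpClient;
    import org.apache.http.impl.client.HttpClients;

    public class ContactPut {
        public static void main(String[] args) throws Exception {
            BasicCredentialsProvider creds = new BasicCredentialsProvider();
            // "SiteName\user" and the password are placeholders.
            creds.setCredentials(AuthScope.ANY,
                    new UsernamePasswordCredentials("SiteName\\user", "password"));

            // Note the single slash before "data"; the log above shows ".../2.0//data/...".
            String url = "https://secure.eloqua.com/API/REST/2.0/data/contact/7606838";
            String json = "{\"id\":\"7606838\",\"accountName\":\"openIdStr001\","
                    + "\"emailAddress\":\"[email protected]\",\"type\":\"Contact\"}";

            try (CloseableHttpClient client =
                         HttpClients.custom().setDefaultCredentialsProvider(creds).build()) {
                HttpPut put = new HttpPut(url);
                put.setEntity(new StringEntity(json, ContentType.APPLICATION_JSON));
                HttpResponse resp = client.execute(put);
                // Compare the code against the table above: 200 = updated, 400 = malformed, 401/403 = auth.
                System.out.println("HTTP " + resp.getStatusLine().getStatusCode());
            }
        }
    }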

  • How to UPDATE a big table in Oracle via Bulk Load

    Hi all,
    in a datastore target on Oracle 11g, I have a big table with 300 million records; the structure is one integer key + 10 attribute columns.
    In the IQ source I have the same table with the same size; the structure is one integer key + 1 attribute column.
    What I need to do is UPDATE that single field in Oracle from the values stored in IQ.
    Any idea on how to organize the dataflow and the target writing mode efficiently? Bulk load? API?
    thank you
    Maurizio

    Hi,
    You cannot use bulk load when you need to UPDATE a field, because all a bulk load does is add records to your table.
    Since you have to UPDATE a field, I would suggest going for SCD with
    source > TC > MO > KG > target
    Arun

  • Bulk loading of Customer data into Application

    Hi Guys,
    I am working on the development of the Teleservice module on a new instance.
    Now I need to migrate the data from the old instance to the new instance.
    Please let me know whether I have to use only APIs to create the customers in Apps, or whether I can bulk load into the seeded tables directly.
    This has to include Service Requests data as well.
    Please let me know if there is any integration violation if we go with bulk loading the data directly.

    You do not need to develop code for loading customer data anymore. Oracle has provided Bulk Import functionality in 11.5.8 for importing customer information (using the Oracle Customers Online/Oracle Data Librarian modules). If you would like to create accounts in addition to customer parties, you will have to use the TCA V2 APIs or the customer interface program. For migrating the service requests, I guess the only option is to use APIs. HTH, Venit

  • Bulk load in OIM 11g enabled with LDAP sync

    Has anyone performed a bulk load of more than 100,000 users using the bulk load utility in OIM 11g?
    The challenge here is that we have an OIM 11.1.1.5.0 environment enabled with LDAP sync.
    We are trying to figure out some performance factors and the best way to achieve our requirement:
    1. Have you done any timings around use of the Bulk Load tool? Any idea how long it will take to LDAP-sync more than 100,000 users into OID? What problems could we encounter during this flow?
    2. Is it possible to migrate users into another environment and then swap that database in as the OIM database? Also, is there any effective way to load into OID directly?
    3. We also have a custom Scheduled Task that modifies a couple of user attributes (using the update API) from a flat file. Have you tried such a scenario after the bulk load, and did you face any problems while doing so?
    Thanks
    DK

    To update a UDF you must assign a Copy Value adapter in Lookup.USR_PROCESS_TRIGGERS (Design Console / Lookup Definition).
    eg.
    CODE --------------------------DECODE
    USR_UDF_MYATTR1----- Change MYATTR1
    USR_UDF_MYATTR2----- Change MYATTR2
    Edited by: Lighting Cui on 2011-8-3 at 12:25 AM

  • Bulk Loading in SAP CRM

    Please let me know about SAP APIs that would enable a bulk loading mechanism instead of the serial method of loading records one by one.

    Hi,
    You can migrate all the data with the help of the two methods I mentioned:
    1. The function module
    CRM_ORDER_MAINTAIN
    or SAP CRM BOL programming
    2. Middleware (BDoc).
    This FM is like a BAPI for maintaining data in R/3.
    I think the FM is good for those who don't know BOL programming.
    Check the mapping fields available in the FM.
    Hope it helps you.
    Thanks and Regards
    Alok

  • Best practices for cleaning up after a Bulk REST API v2 export

    I want to make sure that I am cleaning up after my exports (not leaving anything staging, etc). So far I am
    DELETEing $ENTITY/exports/$ID and $ENTITY/exports/$ID/data as described in the Bulk REST API documentation
    Using a dataRetentionDuration when I create an export (as a safety net in case my code crashes before deleting).
    Is there anything else I should do? Should I/can I DELETE the syncs I create (syncs are not listed in the "Delete an Entity" section of the documentation)? Or are those automatically deleted when I DELETE an export?
    Thanks!
    1086203
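    For what it's worth, here is a minimal sketch of that cleanup, assuming basic auth and a placeholder base URL, entity name and export id; the two DELETE paths are the ones described above ($ENTITY/exports/$ID/data and $ENTITY/exports/$ID):
    import java.net.HttpURLConnection;
    import java.net.URL;
    import java.nio.charset.StandardCharsets;
    import java.util.Base64;

    public class ExportCleanup {
        // Issues a DELETE and prints the response code; auth is a pre-built Basic auth header value.
        static void delete(String url, String auth) throws Exception {
            HttpURLConnection conn = (HttpURLConnection) new URL(url).openConnection();
            conn.setRequestMethod("DELETE");
            conn.setRequestProperty("Authorization", "Basic " + auth);
            System.out.println("DELETE " + url + " -> HTTP " + conn.getResponseCode());
            conn.disconnect();
        }

        public static void main(String[] args) throws Exception {
            String base = "https://secure.eloqua.com/api/bulk/2.0"; // placeholder; use your pod's base URL
            String entity = "contacts";                             // placeholder entity
            String exportId = "12345";                              // placeholder export id
            String auth = Base64.getEncoder()
                    .encodeToString("SiteName\\user:password".getBytes(StandardCharsets.UTF_8));

            // Delete the staged data for the export, then the export definition itself,
            // matching the two DELETEs described above.
            delete(base + "/" + entity + "/exports/" + exportId + "/data", auth);
            delete(base + "/" + entity + "/exports/" + exportId, auth);
        }
    }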

    Hi Chris,
    I ran into the same problem as pod.
    It happens when I try to load all historical activities; one example is the same activityId being given to 2 different types (one EmailOpen, the other FormSubmit), both generated in 2013.
    Before the full load I tested my job by extracting activity records from Nov 2014, and there was no unique-ID issue.
    It seems Eloqua fixed this problem before Nov 2014, right?
    So if I start loading activities generated since 2015 there should be no PK problem; otherwise, I have to use ActivityId + ActivityType as a compound PK for the historical data.
    Please confirm and advise.
    Waiting for your feedback.
    Thanks~
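    If you do end up needing the compound key for the pre-2014 history, here is a minimal sketch of the idea; the column names activityId/activityType are assumptions about your export layout, not official Eloqua field names:
    import java.util.HashMap;
    import java.util.List;
    import java.util.Map;

    public class ActivityDedup {
        // Column names are assumptions; adjust to however your export labels them.
        static String compoundKey(Map<String, String> row) {
            return row.get("activityId") + "|" + row.get("activityType");
        }

        public static void main(String[] args) {
            // Two historical rows that share an activityId but have different types.
            Map<String, String> open = Map.of("activityId", "100", "activityType", "EmailOpen");
            Map<String, String> submit = Map.of("activityId", "100", "activityType", "FormSubmit");

            Map<String, Map<String, String>> byKey = new HashMap<>();
            for (Map<String, String> row : List.of(open, submit)) {
                byKey.put(compoundKey(row), row); // both survive because the key includes the type
            }
            System.out.println(byKey.size() + " distinct activities"); // prints 2
        }
    }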

  • Please HELP! issue with BULK LOAD in FDM 11.1.2.1

    Please assist with a solution to the following error!
    See log below
    ** Begin FDM Runtime Error Log Entry [2011-10-07 13:43:39] **
    ERROR:
    Code............................................. -2147217900
    Description...................................... You do not have permission to use the bulk load statement.
    BULK INSERT POLFDM..tWkalnic158050364335 FROM N'\\pochfm04\apps\POLFDM\Inbox\tWkalnic158050364335.tmp' WITH (FORMATFILE = N'\\pochfm04\apps\POLFDM\Inbox\tWkalnic158050364335.fmt',DATAFILETYPE = N'widechar',ROWS_PER_BATCH=221593,TABLOCK)
    Procedure........................................ clsDataManipulation.fExecuteDML
    Component........................................ upsWDataWindowDM
    Version.......................................... 1112
    Thread........................................... 5036
    IDENTIFICATION:
    User............................................. kalnickim
    Computer Name.................................... POCHFM04
    App Name......................................... POLFDM
    Client App....................................... WebClient
    CONNECTION:
    Provider......................................... SQLOLEDB
    Data Server...................................... pochfmsql01\hfm
    Database Name.................................... POLFDM
    Trusted Connect.................................. False
    Connect Status.. Connection Open
    GLOBALS:
    Location......................................... BW
    Location ID...................................... 751
    Location Seg..................................... 5
    Category......................................... ActSeg
    Category ID...................................... 38
    Period........................................... Sep - 2011
    Period ID........................................ 9/30/2011
    POV Local........................................ True
    Language......................................... 1033
    User Level....................................... 1
    All Partitions................................... True
    Is Auditor....................................... False
    ** Begin FDM Runtime Error Log Entry [2011-10-07 13:43:40] **
    ERROR:
    Code............................................. -2147217900
    Description...................................... Data access error.
    Procedure........................................ clsImpDataPump.fImportTextFile
    Component........................................ upsWObjectsDM
    Version.......................................... 1112
    Thread........................................... 5036
    IDENTIFICATION:
    User............................................. kalnickim
    Computer Name.................................... POCHFM04
    App Name......................................... POLFDM
    Client App....................................... WebClient
    CONNECTION:
    Provider......................................... SQLOLEDB
    Data Server...................................... pochfmsql01\hfm
    Database Name.................................... POLFDM
    Trusted Connect.................................. False
    Connect Status.. Connection Open
    GLOBALS:
    Location......................................... BW
    Location ID...................................... 751
    Location Seg..................................... 5
    Category......................................... ActSeg
    Category ID...................................... 38
    Period........................................... Sep - 2011
    Period ID........................................ 9/30/2011
    POV Local........................................ True
    Language......................................... 1033
    User Level....................................... 1
    All Partitions................................... True
    Is Auditor....................................... False
    ** Begin FDM Runtime Error Log Entry [2011-10-07 13:43:40] **
    ERROR:
    Code............................................. -2147217900
    Description...................................... Data access error.
    Procedure........................................ clsImpProcessMgr.fLoadAndProcessFile
    Component........................................ upsWObjectsDM
    Version.......................................... 1112
    Thread........................................... 5036
    IDENTIFICATION:
    User............................................. kalnickim
    Computer Name.................................... POCHFM04
    App Name......................................... POLFDM
    Client App....................................... WebClient
    CONNECTION:
    Provider......................................... SQLOLEDB
    Data Server...................................... pochfmsql01\hfm
    Database Name.................................... POLFDM
    Trusted Connect.................................. False
    Connect Status.. Connection Open
    GLOBALS:
    Location......................................... BW
    Location ID...................................... 751
    Location Seg..................................... 5
    Category......................................... ActSeg
    Category ID...................................... 38
    Period........................................... Sep - 2011
    Period ID........................................ 9/30/2011
    POV Local........................................ True
    Language......................................... 1033
    User Level....................................... 1
    All Partitions................................... True
    Is Auditor....................................... False
    ** Begin FDM Runtime Error Log Entry [2011-10-07 13:55:38] **
    ERROR:
    Code............................................. -2147217900
    Description...................................... You do not have permission to use the bulk load statement.
    BULK INSERT POLFDM..tWkalnic46564644597 FROM N'\\pochfm04\apps\POLFDM\Inbox\tWkalnic46564644597.tmp' WITH (FORMATFILE = N'\\pochfm04\apps\POLFDM\Inbox\tWkalnic46564644597.fmt',DATAFILETYPE = N'widechar',ROWS_PER_BATCH=221593,TABLOCK)
    Procedure........................................ clsDataManipulation.fExecuteDML
    Component........................................ upsWDataWindowDM
    Version.......................................... 1112
    Thread........................................... 4644
    IDENTIFICATION:
    User............................................. kalnickim
    Computer Name.................................... POCHFM04
    App Name......................................... POLFDM
    Client App....................................... WebClient
    CONNECTION:
    Provider......................................... SQLOLEDB
    Data Server...................................... pochfmsql01\hfm
    Database Name.................................... POLFDM
    Trusted Connect.................................. False
    Connect Status.. Connection Open
    GLOBALS:
    Location......................................... BW
    Location ID...................................... 751
    Location Seg..................................... 5
    Category......................................... ActSeg
    Category ID...................................... 38
    Period........................................... Sep - 2011
    Period ID........................................ 9/30/2011
    POV Local........................................ True
    Language......................................... 1033
    User Level....................................... 1
    All Partitions................................... True
    Is Auditor....................................... False
    ** Begin FDM Runtime Error Log Entry [2011-10-07 13:55:38] **
    ERROR:
    Code............................................. -2147217900
    Description...................................... Data access error.
    Procedure........................................ clsImpDataPump.fImportTextFile
    Component........................................ upsWObjectsDM
    Version.......................................... 1112
    Thread........................................... 4644
    IDENTIFICATION:
    User............................................. kalnickim
    Computer Name.................................... POCHFM04
    App Name......................................... POLFDM
    Client App....................................... WebClient
    CONNECTION:
    Provider......................................... SQLOLEDB
    Data Server...................................... pochfmsql01\hfm
    Database Name.................................... POLFDM
    Trusted Connect.................................. False
    Connect Status.. Connection Open
    GLOBALS:
    Location......................................... BW
    Location ID...................................... 751
    Location Seg..................................... 5
    Category......................................... ActSeg
    Category ID...................................... 38
    Period........................................... Sep - 2011
    Period ID........................................ 9/30/2011
    POV Local........................................ True
    Language......................................... 1033
    User Level....................................... 1
    All Partitions................................... True
    Is Auditor....................................... False
    ** Begin FDM Runtime Error Log Entry [2011-10-07 13:55:39] **
    ERROR:
    Code............................................. -2147217900
    Description...................................... Data access error.
    Procedure........................................ clsImpProcessMgr.fLoadAndProcessFile
    Component........................................ upsWObjectsDM
    Version.......................................... 1112
    Thread........................................... 4644
    IDENTIFICATION:
    User............................................. kalnickim
    Computer Name.................................... POCHFM04
    App Name......................................... POLFDM
    Client App....................................... WebClient
    CONNECTION:
    Provider......................................... SQLOLEDB
    Data Server...................................... pochfmsql01\hfm
    Database Name.................................... POLFDM
    Trusted Connect.................................. False
    Connect Status.. Connection Open
    GLOBALS:
    Location......................................... BW
    Location ID...................................... 751
    Location Seg..................................... 5
    Category......................................... ActSeg
    Category ID...................................... 38
    Period........................................... Sep - 2011
    Period ID........................................ 9/30/2011
    POV Local........................................ True
    Language......................................... 1033
    User Level....................................... 1
    All Partitions................................... True
    Is Auditor....................................... False

    Have you read the installation documentation? It appears that you did not take the time to do a basic level of troubleshooting. A simple Google search of the error message provides the root problem as well as the solution.
    The forums are intended to be used when you have exhausted other options. Please be mindful of this and contributors time when posting further questions.
    I have attached a google search result for the error "You do not have permission to use the bulk load statement."
    http://www.google.com/#sclient=psy-ab&hl=en&safe=off&site=&source=hp&q=+You+do+not+have+permission+to+use+the+bulk+load+statement.&pbx=1&oq=+You+do+not+have+permission+to+use+the+bulk+load+statement.&aq=f&aqi=g4&aql=&gs_sm=e&gs_upl=1556l1556l0l2633l1l1l0l0l0l0l184l184l0.1l1l0&bav=on.2,or.r_gc.r_pw.r_cp.,cf.osb&fp=ebaa3ff8b466872e&biw=1920&bih=955

  • Dump error while bulk load to Sybase IQ

    Hi ,
    I am loading data from a Sybase IQ source to a Sybase IQ target database. I can load the data in normal mode, but if I try bulk loading it throws a dump error. Earlier I loaded data many times using bulk load and didn't face any issue. We upgraded from Sybase IQ 16.0 SP3 to SP8, and now we are facing this type of difficulty. Please find the attached log file for the error.
    Please help me how to resolve this issue.
    Thanks & Regards,
    Ramana.

    Hi Vijay,
    follow the steps below (they may help you).
    Go to the Monitor screen > Status tab, then use the wizard or the menu path Environment -> Short dump -> In the warehouse.
    Select the error and double-click.
    Analyse the error from the message.
    1. Go to Monitor --> Transactional RFC --> In the Warehouse --> Execute --> Edit --> Execute LUW.
    Refresh the Transactional RFC screen, go to the data target(s) which have failed and check the status of the request --> now it should be green.
    2. In some cases the above process will not work (the bad requests still exist after the above procedure).
    In that case we need to go to the data targets, delete the bad requests and run the update again.
    Regards,
    BH

  • Portlet Preferences - Any way to define templates or bulk load ?

    Hi,
    I have a question regarding portlet preferences. We have a need to store metadata
    about a portlet, and this metadata is the same for all portlets inside the portal.
    As we will have in excess of 300 portlets which is expected to rise to around
    2000 - 3000 in the future, it will be incredibly tedious for the portal admin
    to create the preferences for each one (for such items as portlet owner, portlet
    administrator etc)
    Is there a template facility (a bit like user profiles) where we can set up the
    definition with fields and values etc, and apply to a portlet. Or failing that
    a bulk load capability where we can just populate the DB using the portlet_definition_id
    or portlet_instance_id in the preference tables ?
    We are on SP2 by the way.
    TIA
    Martin

    Thanks Subbu,
    That worked... can you keep us updated on any movement with the SPI, as this is
    something of very definite importance to us...
    Thanks for the quick response
    Martin
    Subbu Allamaraju <[email protected]> wrote:
    Martin,
    For some reason in Workshop I don't seem to be able to add preferences to the portlet
    I am creating. For example, I create a JSP and then right-click 'Generate Portlet'
    and it creates the .portlet file for me. But when I try to access the preferences
    menu it just gives me 'Remove' and the list is empty. It does not offer the 'Create'
    option. I must be missing something.
    Try dragging "Preference" from the palette.
    Also, even if we could do it at create time in Workshop, these will be the same
    for all portlets in the Portal, and thus this would need to be done every time
    manually, and that would leave room for mistakes and people missing fields etc.
    We are looking at a very large number of portlets...
    What would be really nice is to define a 'preferences' property set as user profile
    etc. and then
    just apply it... thus not leaving room for errors etc.
    At this time, the API does not support this scenario. We're considering
    an SPI for a future release, and we'll make sure this scenario gets
    addressed.
    Subbu
    Any thoughts.....?
    Subbu Allamaraju <[email protected]> wrote:
    Martin,
    Have you considered specifying preferences for each portlet while
    creating the portlet in WLW? The recommended approach is to create
    preferences while creating the portlet, as the portlet developer is more
    likely to know what preferences are required and the implications. If
    the default values entered by the portlet developer are meaningful,
    portal admins won't have to change these values.
    Subbu
    Martin Porter said the following on 12/17/2003 07:12 AM:
    Hi,
    I have a question regarding portlet preferences. We have a need to store metadata
    about a portlet, and this metadata is the same for all portlets inside the portal.
    As we will have in excess of 300 portlets, which is expected to rise to around
    2000 - 3000 in the future, it will be incredibly tedious for the portal admin
    to create the preferences for each one (for such items as portlet owner,
    portlet administrator etc.).
    Is there a template facility (a bit like user profiles) where we can set up the
    definition with fields and values etc., and apply it to a portlet? Or, failing that,
    a bulk load capability where we can just populate the DB using the portlet_definition_id
    or portlet_instance_id in the preference tables?
    We are on SP2 by the way.
    TIA
    Martin

  • Bulk Loading vs Ingest Record Adding(Updating)

    Hi
    We are integrating Oracle Endeca Server with our application. We already have set of interfaces which extracts record definitions and records from our database.
    I created the necessary classes to create record definitions in the Oracle Endeca DataSource, but I'm having difficulty choosing the right approach for loading records into the DataSource. I wonder which is better to use: Ingest or Bulk Loading.
    We have constantly changing data in our application, and from time to time we have to update existing records in DataRecords. From the point of view of our API there is no difference between adding new records and updating existing ones. As I learned from the documentation, Oracle Endeca Server updates multivalued properties by adding new data to them and ignores changes to single-valued properties. In our case we need to replace the existing values. I figured out that I will need to use addAssignments and deleteRecords in the ingestRecords operation; the existing record will be deleted and a new one created. However, what happens if the record does not exist? I guess deleteRecords will fail, but will the engine still execute addAssignments?
    Considering the usage of Bulk Loading, I also have some questions.
    What will Bulk Loading do if the record it's trying to insert already exists?
    If I have an attribute definition of type boolean, could I use
    // entryCur.getKey() is a String
    // entryCur.getValue() is a String
    assignmentBuilder.setName(entryCur.getKey()).setDataType(Data.Assignment.DataType.STRING).setStringValue(entryCur.getValue())
    instead of setting the data type to BOOLEAN?
    I guess Builder.set<Type>Value allows setting up multivalued attributes with multiple calls?
    Thanks
    Eugene.
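    Not an authoritative answer, but to make the boolean/multi-value part of the question concrete, here is a rough sketch built only from the calls quoted above. The Data.Record builder and addAssignments(...) are assumptions about the surrounding bulk-load classes (not confirmed API), and my understanding is that a multi-valued attribute is expressed as repeated assignments with the same name rather than repeated set calls on one builder:
    // Sketch only: setName / setDataType / setStringValue and Data.Assignment.DataType.STRING
    // come from the snippet above; Data.Record.newBuilder() and addAssignments() are assumptions.
    Data.Record.Builder recordBuilder = Data.Record.newBuilder();

    // Boolean attribute sent as a STRING value; whether the server coerces "true" into a
    // boolean attribute is exactly the open question here.
    recordBuilder.addAssignments(Data.Assignment.newBuilder()
            .setName("IsActive")                            // hypothetical attribute name
            .setDataType(Data.Assignment.DataType.STRING)
            .setStringValue("true")
            .build());

    // Multi-valued attribute: repeat the assignment with the same name for each value.
    for (String tag : new String[] {"red", "blue"}) {
        recordBuilder.addAssignments(Data.Assignment.newBuilder()
                .setName("Tags")                            // hypothetical attribute name
                .setDataType(Data.Assignment.DataType.STRING)
                .setStringValue(tag)
                .build());
    }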

    Hi Eugene,
    I am an Endeca developer working on ingest. The situation you describe is perfectly suited by bulk ingest; it is exactly this use pattern that bulk ingest was designed for. If the record to put duplicates the spec of an already-existing record, the record currently in the data store is replaced by the incoming record.
    If you still wish to use the data ingest web service (DIWS), this is still an option. The pattern you describe is correct. If you attempt to delete the record that does not exist, Endeca Server will do nothing; again, this is the use case it is designed for.
    Hope this helps,
    Dave

  • Can someone reply this - pre-populating (bulk load) the OID - URGENT

    Gurus,
    I'm using the following:
    Database --> Oracle 9i
    Portal --> Oracle Portal 9iAS Release 2
    There are about 10,000 portal users. I would like to pre-populate OID from the existing employee repository (the employee repository is a custom Oracle database).
    Question: is there a white paper that gives you all the APIs required to do so? I have to accomplish the following tasks:
    1. create users
    2. give them privileges
    3. assign them to groups
    4. assign a default group to users
    I need to achieve the above as part of pre-populating OID.
    Ideas, anyone?
    Thanks a bunch.

    Hero,
    I just went through an exercise where I did a bulk load of users and did exactly the four steps you're asking for. I also added the users to designated groups.
    I'm on HP-UX, but the solution can apply to any O/S.
    How do I get this to you?

  • Unable to load null status:404 when replying in Verizon webmail

    Unable to load null status:404 when replying in Verizon webmail. The error appears in its own window. It seems to be related to MIME-encoded incoming messages. I can open a new message window, copy/paste the reply message into the new message, and then send it without the error. Trying to save-as-draft also gives the same error message. This started recently, after updating to v31 of Firefox on Win7 (SP1).

    Hey guys,
    It looks like the issue has been resolved on the site. Can you please try to send an email and confirm this change? There was a bug filed and fixed this week in version 32.
    You may have to clear the cache for the site to see the changes. Thank you!
