Bulk REST API v2 Activity Data Export "Primary Key"?

After reading the documentation, I expected that ActivityId would be the primary key when exporting activity data -- i.e., that there would be no duplicate ActivityIds in the items returned by the export. However, I am seeing exported data with multiple activity records that are identical except for different CampaignIds.
Is (ActivityId, CampaignId) the primary key for activity data? If not, what is?
When will activity data with the same ActivityId have different fields (excluding CampaignId)?
Thanks!
1086203

Hi Chris,
I am hitting the same problem as pod.
It happens when I try to load all historical activities. In one sample, the same ActivityId was assigned to two different activity types (one EmailOpen, the other FormSubmit), both generated in 2013.
Before the full load I tested my job by extracting activity records from Nov 2014 onward, and there was no duplicate-ID issue there.
It seems Eloqua fixed this problem before Nov 2014, right?
So if I start loading activities generated since 2015 there should be no PK problem; otherwise I will have to use ActivityId + ActivityType as a compound PK for the historical data.
Please confirm and advise.
Waiting for your feedback.
Thanks~
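
For what it's worth, a minimal sketch of the compound-key workaround described above: de-duplicating exported rows on (ActivityId, ActivityType). It assumes each exported item has already been parsed into a dict whose keys match the export definition's field names; the helper name is mine.

def dedupe_activities(rows):
    # Keep the first occurrence of each (ActivityId, ActivityType) pair --
    # the compound key proposed above for historical data.
    seen = set()
    unique = []
    for row in rows:
        key = (row['ActivityId'], row['ActivityType'])
        if key not in seen:
            seen.add(key)
            unique.append(row)
    return unique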

Similar Messages

  • Best practices for cleaning up after a Bulk REST API v2 export

    I want to make sure that I am cleaning up after my exports (not leaving anything staged, etc.). So far I am:
    DELETEing $ENTITY/exports/$ID and $ENTITY/exports/$ID/data as described in the Bulk REST API documentation
    Using a dataRetentionDuration when I create an export (as a safety net in case my code crashes before deleting)
    Is there anything else I should do? Should I/can I DELETE the syncs I create (syncs are not listed in the "Delete an Entity" section of the documentation)? Or are those automatically deleted when I DELETE an export?
    Thanks!
    1086203
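
    To make the sequence concrete, here is a hedged sketch of the cleanup described above using Python's requests library (base_url, auth, entity, and export_id are assumed to be configured elsewhere; whether syncs also need an explicit DELETE is exactly the open question):

    import requests

    def cleanup_export(base_url, auth, entity, export_id):
        # Delete the staged data first, then the export definition itself,
        # mirroring the two DELETEs described above.
        requests.delete(f"{base_url}/{entity}/exports/{export_id}/data", auth=auth)
        requests.delete(f"{base_url}/{entity}/exports/{export_id}", auth=auth)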

  • REST API searching customObject data

    Using the tool at: https://secure.p03.eloqua.com/api/docs/Dynamic/Rest/1.0/Reference.aspx
    I'm attempting to search through customObject records using the GET command like:
    /data/customObject/{id}?count={count}&page={page}&search={search}&orderBy={orderBy}&lastUpdatedAt={lastUpdatedAt}
    This works like a charm if I leave the 'search' parameter empty, but any other value causes a 500 error like:
      There was an internal server error.
      The error has been logged with log identifier 102142565.
      Please provide this log identifier to technical support.

    Hey Chris (or anyone else),
    Do you know how to search all records of a Custom Object that contain a specific value in a specific field from the REST API (like you can in the front end of Eloqua)?
    I've tried a ton of different variants of: GET /data/customObject/{id}?count=1&page=1&search='fieldValues=id=1234567890' AND 'fieldValues=value=abcdefghi'
    But I can't get anything to work in the search parameter other than ...&search=id=789456132 (not even '...&search=contactId=456789123' would work)
    The body of the response is usually something like:
    <html><head><title>Internal Server Error</title></head><body><h1>
            Internal Server Error
        </h1><div>
            There was an internal server error.
        </div><div>
            The error has been logged with log identifier <b>666997947</b>.
        </div><div>
            Please provide this log identifier to technical support.
        </div></body></html>
    Here are the results when I make the following call: GET /data/customObject/{id}?count=1&page=1
    { elements:
       [ { type: 'CustomObjectData',
           id: '789456132',
           contactId: '456789123',
           fieldValues: [
               { type: 'FieldValue', id: '1234567890', value: 'abcdefghi' },
               ...bunch of fields
           ] } ],
      page: 1,
      pageSize: 1,
      total: 10000 }
    Questions:
    1) How can I query fields of a record? In the above response, each field is represented as an object in the fieldValues array.
    2) How can I incorporate a logical 'AND' operator into the query? I think I need this to query for both a field's ID AND the value of that field on the record: i.e. something like '...search='fieldValues=id=1234567890' AND 'fieldValues=value=abcdefghi'.
    Thanks to all in advance!
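
    One thing worth ruling out first is whether the search string reaches the server unencoded. A small sketch that lets requests do the URL encoding; the field-level query syntax itself is still the open question here ('id=789456132' is the one form reported to work above), and base_url, cdo_id, and auth are placeholders:

    import requests

    base_url = 'https://secure.p03.eloqua.com/API/REST/1.0'  # assumed base path
    cdo_id = 123                                             # your customObject id
    auth = ('SiteName\\user.name', 'password')               # placeholder creds

    # requests URL-encodes params for us; unencoded quotes and spaces in a
    # hand-built query string are a plausible source of 500 errors.
    params = {'count': 1, 'page': 1, 'search': 'id=789456132'}
    resp = requests.get(f'{base_url}/data/customObject/{cdo_id}',
                        params=params, auth=auth)
    print(resp.status_code, resp.text)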

  • Need to create a new row in table with same data as Primary key, but new PK

    Hello Gurus,
    I have a table with one column as the primary key. I need to create a new row in the table with the same data as one of the existing rows but a different primary key -- in short, a duplicate row with a different primary key.
    Any ideas of how it can be done without much complication?
    Thanks in advance for your reply.
    Regards,
    Swapneel Kale

    user9970447 wrote:
    Hello Gurus,
    I have a table with one column as the primary key. I need to create a new row in the table with the same data as one of the existing rows but a different primary key -- in short, a duplicate row with a different primary key.
    Any ideas of how it can be done without much complication?
    Thanks in advance for your reply.
    Regards,
    Swapneel Kale

    Something like:
    insert into mytable
    select 'literal for new pk',  -- new key value; remaining columns copied
           non_pk_1,
           non_pk_2,
           non_pk_n
    from mytable
    where pk_col = 'literal for existing pk';

  • ResultSet updateRow ,insertRow ,deleteRow APIs on table with no PRIMARY KEY

    Hi,
    When I use the ResultSet.insertRow, updateRow, and deleteRow APIs on a table which contains no primary key column, I get the following exception:
    java.sql.SQLException: Operation invalid. No primary key for the table
    at com.tandem.sqlmx.Messages.createSQLException(Messages.java:69)
    at com.tandem.sqlmx.SQLMXResultSet.getKeyColumns(SQLMXResultSet.java:3501)
    at com.tandem.sqlmx.SQLMXResultSet.prepareInsertStmt(SQLMXResultSet.java:3652)
    at com.tandem.sqlmx.SQLMXResultSet.insertRow(SQLMXResultSet.java:2073)
    at testdateupdate.main(testdateupdate.java:33)
    It looks like the table needs to have a primary key column to update, insert, or delete rows using the ResultSet APIs.
    I use a proprietary JDBC driver. The explanation I found for this behavior reads:
    "JDBC allows update statements on tables without a primary key defined if the statement is issued by the application. For updateable ResultSets, a primary key restriction is required to avoid updating more rows than desired."
    I don't understand this explanation, and I also don't see this behavior in some other JDBC drivers that I tried.
    Someone please clarify.
    Thanks in advance.
    Thanks and Regards,
    Jay

    Hi,
    in simple words, when a table does not have a primary key you can send UPDATE and DELETE against it only by using a Statement object. When you use ResultSet.updateRow or ResultSet.deleteRow, the JDBC driver looks for the primary key in the metadata in order to send the correct WHERE clause to the RDBMS. I think this could still work with an Oracle DBMS, which has a unique id (ROWID) for each record.
    Kiros
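
    Kiros's point can be seen self-contained with any engine that exposes a row identifier; a sketch in Python/SQLite, with SQLite's rowid playing the role Oracle's ROWID would:

    import sqlite3

    conn = sqlite3.connect(':memory:')
    conn.execute('CREATE TABLE t (name TEXT, val INTEGER)')  # no primary key
    conn.execute("INSERT INTO t VALUES ('a', 1), ('a', 1)")  # two identical rows
    # Without a primary key, UPDATE ... WHERE name='a' would hit both rows;
    # the engine's row identifier lets you pin down exactly one of them.
    rid = conn.execute('SELECT rowid FROM t LIMIT 1').fetchone()[0]
    conn.execute('UPDATE t SET val = 2 WHERE rowid = ?', (rid,))
    print(conn.execute('SELECT rowid, name, val FROM t').fetchall())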

  • Dropping data from primary Key.

    Hi ,
    I am deleting data from the foreign key side, and it drops 1900 records in 1 sec.
    Query 1
    delete from abbr_pay_dtl_stag where pif_file_id=118
    But after that, when I delete records from the primary key side, it takes more than 2 minutes.
    Query 2
    delete from line_stage e where bip_pif_file_id=118
    When I looked at the explain plans, Query 1 was using a full table scan and Query 2 a range index scan, so I put a hint in Query 2 to use a full table scan, but it is still the same.
    Please advise.

    I haven't understood which is the master table and which is the detail table, for you haven't given any indication of how the foreign key is created.
    Still, when you have foreign keys, whenever you delete from the master table the database has to check for corresponding detail records for every master record deleted, which definitely imposes scanning the index used for the foreign key.
    For best performance, you may disable the foreign key constraint and then re-enable it, possibly with NOVALIDATE. Still, disabling a constraint would be a bad thing if you are performing those deletes on a system that processes several concurrent transactions, because one of the other transactions could insert junk into the detail table in the meantime.

  • Data Dictionary-Primary keys

    Hi All,
    I have a question regarding creating primary keys in Z tables.
    When I designate a field as a primary key, it accepts NULLs as input, which contradicts the primary key constraint. How should I enforce the primary key constraint?

    Even though the Key and Initial column flags have been set, the field still accepts null values while entering records in the table.
    For example, with Num (CHAR10, or any DE attached) and Name (CHAR20), if I enter records like
    (<blank>, krishna),
    (10, chaiyanya),
    the PK field accepts the blank value, and it also accepts the record if nothing is entered at all.
    But this is not the case in any RDBMS, where the PK constraint (not null and unique) is enforced.
    How do I go about it?

  • ELQ-00107 errors when exporting activity data with Bulk REST API (2.0)

    I am following the flow described in the Bulk API v2.0 Documentation.
    I POST to https://secure.eloqua.com/api/bulk/2.0/activities/exports and get back (note: I'm working in Python, so this is all deserialized JSON):
    {u'createdAt': u'2014-08-14T07:05:17.6413979Z',
    u'createdBy': u'P D',
    u'fields': {u'ActivityDate': u'{{Activity.CreatedAt}}',
      u'ActivityId': u'{{Activity.Id}}'},
    u'filter': u"('{{Activity.CreatedAt}}' > '2014-07-31T23:43:02.080971Z' AND '{{Activity.Type}}' = 'EmailOpen')",
    u'name': u'blarg3',
    u'updatedAt': u'2014-08-14T07:05:17.6413979Z',
    u'updatedBy': u'P D',
    u'uri': u'/activities/exports/275'}
    Next I POST to /syncs and get back
    {u'createdAt': u'2014-08-14T07:05:31.6571126Z',
    u'createdBy': u'P D',
    u'status': u'pending',
    u'syncedInstanceUri': u'/activities/exports/275',
    u'uri': u'/syncs/17790'}
    Now (unfortunately) I GET /syncs/17790 and /syncs/17790/logs
    {u'createdAt': u'2014-08-14T07:05:31.9330000Z',
    u'createdBy': u'P D',
    u'status': u'error',
    u'syncStartedAt': u'2014-08-14T07:05:32.6570000Z',
    u'syncedInstanceUri': u'/activities/exports/275',
    u'uri': u'/syncs/17790'}
    {u'count': 2,
    u'hasMore': False,
    u'items': [{u'count': 0,
      u'createdAt': u'2014-08-14T07:05:33.3770000Z',
      u'message': u'There was an error processing the export.',
      u'severity': u'error',
      u'statusCode': u'ELQ-00107',
      u'syncUri': u'/syncs/17790'},
      {u'count': 0,
      u'createdAt': u'2014-08-14T07:05:33.3930000Z',
      u'message': u'Sync processed for sync 17790, resulting in Error status.',
      u'severity': u'information',
      u'statusCode': u'ELQ-00101',
      u'syncUri': u'/syncs/17790'}],
    u'limit': 1000,
    u'offset': 0,
    u'totalResults': 2}
    All I can find for ELQ-00107 is "ELQ-00107: There was an error processing the {type}."
    Any thoughts on what I may have done wrong? Pointers on how I can debug further?
    Thanks!
    Joel Rothman-Oracle allison.moore 1086203 Ryan Wheler-Oracle
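
    In case it helps others debugging ELQ-00107: a sketch of wrapping the two GETs above in a poll loop so the logs are fetched automatically on failure. base_url is assumed to be 'https://secure.eloqua.com/api/bulk/2.0'; 'active' and 'success' are assumed status names alongside the 'pending' and 'error' shown above:

    import time
    import requests

    def wait_for_sync(base_url, auth, sync_uri, interval=5):
        # Poll GET /syncs/{id} until the sync leaves its in-progress states.
        while True:
            status = requests.get(base_url + sync_uri, auth=auth).json()['status']
            if status not in ('pending', 'active'):
                break
            time.sleep(interval)
        if status != 'success':
            # Dump the log items (statusCode/message pairs like ELQ-00107).
            logs = requests.get(base_url + sync_uri + '/logs', auth=auth).json()
            for item in logs.get('items', []):
                print(item['statusCode'], item['message'])
        return status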

    Hi 1086203,
    I am facing the same issue. I am sending my request from SoapUI.
    As suggested by you, I am not using any sub-second precision.
    Response from creating the export definition:
    <Response xmlns="https://secure.eloqua.com/api/bulk/2.0/activities/exports">
       <createdAt>2014-08-28T22:00:40.5537126Z</createdAt>
       <createdBy>Eloqua</createdBy>
       <fields>
          <ActivityDate>{{Activity.CreatedAt}}</ActivityDate>
          <ActivityId>{{Activity.Id}}</ActivityId>
          <ActivityType>{{Activity.Type}}</ActivityType>
          <AssetId>{{Activity.Asset.Id}}</AssetId>
          <AssetName>{{Activity.Asset.Name}}</AssetName>
          <AssetType>{{Activity.Asset.Type}}</AssetType>
          <ContactId>{{Activity.Contact.Id}}</ContactId>
          <Id>{{Activity.Id}}</Id>
          <RawData>{{Activity.Field(RawData)}}</RawData>
          <VisitorId>{{Activity.Visitor.Id}}</VisitorId>
       </fields>
       <filter>'{{Activity.Type}}'='FormSubmit' And '{{Activity.CreatedAt}}'>'2014-08-26T00:00:00'</filter>
       <name>Example Activity Export</name>
       <updatedAt>2014-08-28T22:00:40.5537126Z</updatedAt>
       <updatedBy>Eloqua.APIUser1</updatedBy>
       <uri>/activities/exports/36</uri>
    </Response>
    Response from creating the sync for the export:
    <Response xmlns="https://secure.eloqua.com/api/bulk/2.0/activities/exports">
       <createdAt>2014-08-28T22:03:05.0953641Z</createdAt>
       <createdBy>Eloqua</createdBy>
       <status>pending</status>
       <syncedInstanceUri>/activities/exports/36</syncedInstanceUri>
       <uri>/syncs/16</uri>
    </Response>
    GET https://secure.eloqua.com/api/bulk/2.0/syncs/16 to check the status of the sync:
    <Response xmlns="https://secure.eloqua.com/api/bulk/2.0/activities/exports">
       <createdAt>2014-08-28T22:03:04.8630000Z</createdAt>
       <createdBy>Eloqua</createdBy>
       <status>error</status>
       <syncStartedAt>2014-08-28T22:03:14.4970000Z</syncStartedAt>
       <syncedInstanceUri>/activities/exports/36</syncedInstanceUri>
       <uri>/syncs/16</uri>
    </Response>
    Logs.
    <Response xmlns="https://secure.eloqua.com/api/bulk/2.0/activities/exports">
       <count>2</count>
       <hasMore>false</hasMore>
       <items>
          <e>
             <count>0</count>
             <createdAt>2014-08-28T22:03:16.4330000Z</createdAt>
             <message>There was an error processing the export.</message>
             <severity>error</severity>
             <statusCode>ELQ-00107</statusCode>
             <syncUri>/syncs/16</syncUri>
          </e>
          <e>
             <count>0</count>
             <createdAt>2014-08-28T22:03:16.4670000Z</createdAt>
             <message>Sync processed for sync 16, resulting in Error status.</message>
             <severity>information</severity>
             <statusCode>ELQ-00101</statusCode>
             <syncUri>/syncs/16</syncUri>
          </e>
       </items>
       <limit>1000</limit>
       <offset>0</offset>
       <totalResults>2</totalResults>
    </Response>
    Kindly help
    Thanks
    Vinay

  • How to update data when primary key is set through for update cursor

    Dear friends,
    I have tried to update data in the table through Forms using a FOR UPDATE cursor. The PL/SQL I used is below; please help me find where I made a mistake.
    DECLARE CURSOR EMP IS
    SELECT EMPNO,EMPNAME,FATHERNAME,COMMUNITY,SEX,BILLUNIT,BIRTHDATE,RLYJOINDATE,RETIREMENTDATE
    FROM PRMAEMP WHERE BILLUNIT=:CTRL.BILLUNIT AND SERVICESTATUS='SR' ORDER BY DESIGCODE,SCALECODE
    FOR UPDATE;
    BEGIN
    GO_BLOCK('EMP_DETAILS');
    SYNCHRONIZE;
    FOR I IN EMP
    LOOP
    I.BILLUNIT:=:EMP_DETAILS.BILLUNIT;     
    I.EMPNO:=:EMPNO;
    I.EMPNAME:=:EMPNAME;
    I.FATHERNAME:=:FATHERNAME;
    I.COMMUNITY:=:COMMUNITY;
    I.SEX:=:SEX;
    I.BIRTHDATE:=:BIRTHDATE;
    I.RLYJOINDATE:=:RLYJOINDATE;
    I.RETIREMENTDATE:=:RETIREMENTDATE;
    DOWN;
    END LOOP;
    COMMIT;
    END;
    your help is needed immediately

    DECLARE CURSOR ABC IS
       SELECT EMPNO,
              EMPNAME,
              FATHERNAME,
              COMMUNITY,
              SEX,
              BILLUNIT,
              BIRTHDATE,
              RLYJOINDATE,
              RETIREMENTDATE
    FROM PRMAEMP
    WHERE BILLUNIT=:CTRL.BILLUNIT
    AND SERVICESTATUS='SR'
    ORDER BY DESIGCODE,SCALECODE
    FOR UPDATE OF COMMUNITY;
    V_EMPNO           PRMAEMP.EMPNO%TYPE;
    V_EMPNAME         PRMAEMP.EMPNAME%TYPE;
    V_FATHERNAME      PRMAEMP.FATHERNAME%TYPE;
    V_COMMUNITY       PRMAEMP.COMMUNITY%TYPE;
    V_SEX             PRMAEMP.SEX%TYPE;
    V_BILLUNIT        PRMAEMP.BILLUNIT%TYPE;
    V_BIRTHDATE       PRMAEMP.BIRTHDATE%TYPE;
    V_RLYJOINDATE     PRMAEMP.RLYJOINDATE%TYPE;
    V_RETIREMENTDATE  PRMAEMP.RETIREMENTDATE%TYPE;
    BEGIN
       GO_BLOCK('EMP');
       SYNCHRONIZE;
       OPEN ABC;
       LOOP
          FETCH ABC INTO V_EMPNO, V_EMPNAME, V_FATHERNAME, V_COMMUNITY, V_SEX,
                         V_BILLUNIT, V_BIRTHDATE, V_RLYJOINDATE, V_RETIREMENTDATE;
          EXIT WHEN ABC%NOTFOUND; /* exit right after the fetch, before updating */
          UPDATE PRMAEMP
          SET BILLUNIT= :EMP.BILLUNIT,
              EMPNO= :EMPNO,
              EMPNAME= :EMPNAME,
              FATHERNAME= :FATHERNAME,
              COMMUNITY= :COMMUNITY,
              SEX= :SEX,
              BIRTHDATE= :BIRTHDATE,
              RLYJOINDATE= :RLYJOINDATE,
              RETIREMENTDATE= :RETIREMENTDATE
          WHERE CURRENT OF ABC;
       END LOOP;
       CLOSE ABC;
       COMMIT;
    END;
    Cheers
    Sarma.

  • Merging data without primary key

    hiii,
    I have 2 internal tables, it_final1 and it_final_new, and I want to join both tables.
    Both have the same fields, like BUKRS, GJAHR, SAKNR.
    For it_final1 I am using the BAPI_GL_GETGLACCPERIODBALANCES FM to get the data.
    Now I need to merge these.
    Please explain ...
    I am using the logic below for that.
    LOOP AT IT_FINAL_NEW.
      IF SY-SUBRC = 0.
        MOVE-CORRESPONDING IT_FINAL_NEW TO IT_FINAL_DIS.
      ENDIF.
      READ TABLE IT_FINAL1 WITH KEY BUKRS = IT_FINAL_NEW-BUKRS
                                    GJAHR = IT_FINAL_NEW-GJAHR.
      MOVE-CORRESPONDING IT_FINAL1 TO IT_FINAL_DIS.
      APPEND IT_FINAL_DIS.
    ENDLOOP.

    Hi,
    Please try the following,
    *Looping at final_new which has all fields except gjahr
    LOOP AT IT_FINAL_NEW .
    *move wa to destination wa
    MOVE-CORRESPONDING IT_FINAL_NEW to IT_FINAL_DIS.
    *read internal table having balance value using common field
    READ TABLE IT_FINAL1 WITH KEY BUKRS =  IT_FINAL_NEW-BUKRS INTO WA.
    if sy-subrc = 0.
    *assign balance value to final destination balance field
    IT_FINAL_DIS-balance = wa-balance.
    *append the it
    APPEND IT_FINAL_DIS.
    *now u got the values from 2 internal tables
    endif.
    ENDLOOP.
    Hope this solves your problem.
    Regards,
    Meera

  • Performance issue during SharePoint list data bind to html table using Ajax call(Rest API)

    Hello,
    I have multiple lists in my SharePoint site. I am using the SharePoint REST APIs to get data from these lists and bind it to an HTML table. Suppose I have 5 lists with 1000 records each; I am looping 5000 times to bind each row (record) to this HTML table. This is causing a performance issue, and the binding takes a very long time.
    Is there any way I can reduce this looping, or is there a better approach to improve the performance? Please kindly suggest. Thank you for your help :)
    Warm Regards,
    Ratan Kumar Racha

    Hi Racha,
    For handling large data binding in a page, AngularJS would be a good option if you are worried about performance.
    You can get more information about using AngularJS from the two links below:
    https://www.airpair.com/angularjs/posts/angularjs-performance-large-applications
    http://www.sitepoint.com/10-reasons-use-angularjs/
    Best regards
    Patrick Liang
    TechNet Community Support

  • Batch Rest API Upsert for CDO Field Limit?

    Currently with SOAP we can update 150 CDO fields at once; is there a 100-field limit with the Batch REST API?

    Further to Corey's question: when using the REST API, if we upsert more than 100 fields into a CDO we get an error stating that we can only update 100 fields. Is it possible to change this limitation in the Bulk REST API so that we can update more than 100 fields?
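
    Until the limit itself can be confirmed or lifted, one hedged workaround is to split a wide record into several upserts of at most 100 fields each, every chunk carrying the same key field so they all land on the same CDO record (hypothetical helper; the key-field name and the 100-field ceiling are assumptions from the reports above):

    def chunk_fields(record, key_field, limit=100):
        # Yield sub-records of at most `limit` fields, each including the
        # key field so every chunk upserts the same CDO record.
        other = [k for k in record if k != key_field]
        step = limit - 1  # reserve one slot for the key field
        for i in range(0, len(other), step):
            chunk = {key_field: record[key_field]}
            chunk.update({k: record[k] for k in other[i:i + step]})
            yield chunk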

  • REST API sync issue

    Hi expert,
    I created an Excel file in a SharePoint 2013 Excel library and posted the URL into an MS Word quick part (IncludePicture). In MS Word I can see the Excel chart, but when I update the chart and publish it to SharePoint again, the Word file does not change.
    Does anybody know how to keep the two in sync?
    the URL like this,
    http://www.sharepointsite.com/_vti_bin/excelrest.aspx/Excel%20Library/TeamTasks_data.xlsx/model/charts('Task%20Status')?$format=image
    Thanks
    James Liang

    Hi,
    There’s a setting in your Trusted File Locations (in the configuration of the Excel Service Application) that you have to check, in order to have the REST API update the connections.
    http://www.sharepointblogs.be/blogs/vandest/archive/2014/02/20/excel-rest-api-not-refreshing-data.aspx
    If the issue still exists, please check whether you have selected "Data not stored with document" in the "Field options".
    http://blogs.office.com/2009/11/09/excel-services-in-sharepoint-2010-rest-api-examples/
    Best Regards
    Dennis Guo
    TechNet Community Support

  • Does composite primary key in BMP really work?

    Here's an issue. I have a bean called CalendarEvent which uses name, startTime and endTime in composite fashion for the primary key. I have a primary key class defined as well. Now I have an ejbFindByUser method which takes a name and builds a collection of primary key class objects. This happens properly. But when the client invokes findByUser, it gets a collection of SIMILAR items. It is as if only the last primary key object added to the collection by ejbFindByUser was translated by the server, and the corresponding entity bean returned to the client.
    Please help, this thing is killing me!!!

    Thanks a ton for your response. However, I am afraid I didn't fully understand what you were trying to say.

    1) Don't use persisted data for your primary key (and certainly not composite keys). This is an old skool thing to do, and will end up hurting you later.

    I don't understand this comment. My database table needs a composite key for uniqueness, so I created my own PK class with those fields as members. The hash code returned by the PK class is computed from all of these member variables, and I believe it is unique; the equals method compares all of these member variables as well. I didn't understand your line about using persisted data for the primary key.

    2) Ok, either your ejbFindByXXX() method is not returning distinct primary key objects, or your ejbLoad() method isn't working. If you can properly retrieve an entity via the findByPrimaryKey method, then I'd say you should re-examine what your ejbFindByXXX method is returning to the container.

    This is my understanding: when the client invokes findByPrimaryKey, the container invokes ejbFindByPrimaryKey with the provided key on ALL the EJB objects of that class. Only one of these EJB objects must return the same PK it took as an argument, and the container then returns that EJB object back to the client. This is what happens in my code, as per the logging statements I have in my ejbFindByPrimaryKey method: the argument is a PK object, say pk1; I return an object pk2, and I have ascertained that pk1 and pk2 have the same hash code and pk1.equals(pk2) returns true. (I guess it would be easier to just return the argument, but I didn't see a problem with this.) The logging statements in ejbFindByPrimaryKey don't show any problem; the method returns the same primary key it receives.

    But this is what bothers me: I have two records in the database, yet the container seems to call my ejbFindByPrimaryKey only once. Shouldn't it be checking twice, once for each of the EJBs corresponding to each of the records? And ultimately, it returns the wrong EJB back to the client.

    I have no idea about this and would DEEPLY appreciate your help.
    Thanks in anticipation,
    Balan
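
    For reference, the equals/hashCode contract Balan describes, shown self-contained (Python for brevity, since the actual primary key class isn't posted; the field names follow the CalendarEvent description):

    from dataclasses import dataclass
    from datetime import datetime

    # A frozen dataclass derives equality and hashing from all members --
    # the same contract described above for the composite PK class.
    @dataclass(frozen=True)
    class CalendarEventPK:
        name: str
        start_time: datetime
        end_time: datetime

    pk1 = CalendarEventPK('standup', datetime(2004, 1, 5, 9), datetime(2004, 1, 5, 10))
    pk2 = CalendarEventPK('standup', datetime(2004, 1, 5, 9), datetime(2004, 1, 5, 10))
    assert pk1 == pk2 and hash(pk1) == hash(pk2)  # distinct objects, equal keys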

  • Primary key 1700 does not exist...

    Hello,
    I have some strange behaviour with my entity bean. I am trying to update an entity bean through a session bean, and it works only for primary keys below a certain number (maybe 1700).
    Otherwise the update throws the following error:
    javax.ejb.NoSuchEntityException: [EJB:010142]Instance of EJB 'VEtHr' with primary key '1999' does not exist.
    Do you have any idea about the problem?
    Thanks... :-)
    Do I need to change the <max-beans-in-cache> variable? (I have already tried with a very big number.)
    Regards,

    The problem is strange because, before calling the update method of the EJB, I fetch the full row by id (findByPrimaryKey), and that works: I get all the data for primary key 5002, for example.
    But when I try to update it, I get this message :
    ; nested exception is:
         javax.ejb.NoSuchEntityException: [EJB:010142]Instance of EJB 'VEtHr' with primary key '5002' does not exist.
    My update method is just like this:
    public void updateVEtHr(VEtHrDto vEtHrDto) throws RemoteException {
        if (vEtHrDto != null) {
            Integer hrId = vEtHrDto.getHrId();
            try {
                VEtHr vEtHr = vEtHrHome.findByPrimaryKey(hrId);
                setVEtHrFromVEtHrDto(vEtHr, vEtHrDto);
            } catch (FinderException e) {
                throw new RemoteException(e.getMessage());
            }
        }
    }
