Question/issue regarding querying for uncommitted objects in TopLink...

Hi, I was hoping to get some insight into a problem we are encountering.
We have a scenario where we are creating a folder hierarchy (using TopLink):
1. a parent folder is created
2. child elements are created (in the same transaction as step 1)
3. we need to look up the parent folder and assign it as the parent of these child elements
4. the transaction is ended and all data is committed (a minimal sketch of this flow follows the list)
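For context, here is a minimal sketch of that flow using the same TopLink calls that appear in the code later in this post. Folder, setParent and the mapped attribute name absolutePath are illustrative placeholders, not our real model; UnitOfWork, ReadObjectQuery and ExpressionBuilder are the usual TopLink classes.

// minimal sketch of steps 1-4, all inside one UnitOfWork
UnitOfWork uow = session.acquireUnitOfWork();

// 1. register the new parent folder
Folder parent = (Folder) uow.registerObject(new Folder("/favorites/twatson2"));
uow.assignSequenceNumbers();

// 2. register a child element in the same transaction
Folder child = (Folder) uow.registerObject(new Folder("/favorites/twatson2/reports"));

// 3. look the parent back up with a conforming query so the uncommitted
//    parent can be found in the unit of work, then assign it to the child
ExpressionBuilder eb = new ExpressionBuilder();
ReadObjectQuery query = new ReadObjectQuery(Folder.class,
        eb.get("absolutePath").equal("/favorites/twatson2"));
query.conformResultsInUnitOfWork();
Folder found = (Folder) uow.executeQuery(query);
child.setParent(found);

// 4. end the transaction and commit all data
uow.commit();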
In our system we control access to objects by appending a filter to the selection criteria, so we end up with SQL like the following example
(the t2 table is the authorization lookup part of the query):
SELECT t0.ID, t0.CLASS_NAME, t0.DESCRIPTION, t0.EDITABLE,
t0.DATE_MODIFIED, t0.DATE_CREATED,
t0.MODIFIED_BY, t0.ACL_ID, t0.NAME, t0.CREATED_BY,
t0.TYPE_ID, t0.WKSP_ID, t1.ID, t1.LINK_SRC_PATH,
t1.ABSOLUTE_PATH, t1.MIME_TYPE, t1.FSIZE,
t1.CONTENT_PATH, t1.PARENT_ID
FROM XDOOBJECT t0, ALL_OBJECT_PRIVILEGES t2,
ARCHIVEOBJECT t1
WHERE ((((t1.ABSOLUTE_PATH = '/favorites/twatson2')
AND ((t1.ID = t2.xdoobject_id)
AND ((t2.user_id = 'twatson2')
AND (bitand(t2.privilege, 2) = 2))))
AND (t1.ID = t0.ID))
AND (t0.CLASS_NAME = 'oracle.xdo.server.repository.model.Archivable'))
When creating new objects we also create the authorization lookup record (which is inserted into a different table). I can see that all of the objects are registered in the UnitOfWork identity map.
Basically, the issue is that this scenario all occurs in a single transaction. When I query for the newly created parent folder with the authorization filter appended, the parent is not found. If I remove the authorization filter, the parent is found correctly. Likewise, if I break this up into separate transactions and commit after each insert, the parent is found correctly.
I use conformResultsInUnitOfWork on the queries.
This is related to an earlier thread of mine in this forum:
Nested UnitOfWork and reading newly created objects...
Thanks for any help you can provide,
-Tim

Hi Doug, we add the authorization filter directly in the application code as the query is being set up.
Here are some code examples: 1) the code to create a new object in the system, 2) the code to create a new authorization lookup record (which also uses the first snippet to do the actual TopLink insert), 3) an example of a read query where the authorization filter is appended to the Expression, and 4) several helper methods.
I hope this is of some use, as it's difficult to show the complete flow in a simple example.
1)
// create new object example (method of DataAccess; the remaining
// parameters were elided in the original post)
public Object createObject(Object object, Authorizer authorizer /* .... */)
        throws DataAccessException
{
    Object result = null;
    boolean inTx = true;
    UnitOfWork uow = null;
    try
    {
        SessionContext sc = mScm.getCurrentSessionContext();
        uow = TLTransactionManager.getActiveTransaction(sc.getUserId());
        if (uow == null)
        {
            Session session = TLSessionFactory.getSession();
            uow = session.acquireUnitOfWork();
            inTx = false;
        }
        Object oclone = uow.registerObject(object);
        uow.assignSequenceNumbers();
        if (oclone instanceof BaseObject)
        {
            BaseObject boclone = (BaseObject) oclone;
            Date now = new Date();
            boclone.setCreated(now);
            boclone.setModified(now);
            boclone.setModifiedBy(sc.getUserId());
            boclone.setCreatedBy(sc.getUserId());
        }
        uow.printRegisteredObjects();
        uow.validateObjectSpace();
        if (inTx == false) uow.commit();
        // just temp, see above
        if (authorizer.requiresCheck(oclone))
        {
            authorizer.grantPrivilege(oclone);
        }
        result = oclone;
    }
    finally
    {
        // exception handling trimmed from the original post
    }
    return result;
}
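For completeness, this is roughly how the snippet above is driven from the folder-creation flow. Folder and the field names mDataAccess/mAuthorizer are placeholders, and since the exact createObject parameter list is elided in the snippet above, only two arguments are shown here for illustration.

// hypothetical caller illustrating steps 1 and 2 of the scenario
Folder parent = new Folder("/favorites/twatson2");
parent = (Folder) mDataAccess.createObject(parent, mAuthorizer);

Folder child = new Folder("/favorites/twatson2/reports");
child = (Folder) mDataAccess.createObject(child, mAuthorizer);
// the parent is then looked back up with the query code in snippet (3)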
2)
// Authorizer.grantPrivilege method
public void grantPrivilege(Object object) throws DataAccessException
{
    if (requiresCheck(object) == false)
    {
        throw new DataAccessException(
            "Object does not implement Securable interface.");
    }
    Securable so = (Securable) object;
    ModulePrivilege[] privs = so.getDefinedPrivileges();
    BigInteger pmask = new BigInteger("0");
    for (int i = 0; i < privs.length; i++)
    {
        BigInteger pv = PrivilegeManagerFactory.getPrivilegeValue(privs[i]);
        pmask = pmask.add(pv);
    }
    SessionContext sc = mScm.getCurrentSessionContext();
    // the authorization lookup record
    ObjectUserPrivilege oup = new ObjectUserPrivilege();
    oup.setAclId(so.getAclId());
    oup.setPrivileges(pmask);
    oup.setUserId(sc.getUserId());
    oup.setXdoObjectId(so.getId());
    try
    {
        // this recurses back to the code snippet from above
        mDataAccess.createObject(oup, this);
    }
    catch (DataAccessException dae)
    {
        Object[] args = { dae.getClass().toString(), dae.getMessage() };
        logger.severe(MessageFormat.format(EXCEPTION_MESSAGE, args));
        throw new DataAccessException("Failed to grant object privilege.", dae);
    }
}
3)
// example Query code
Object object = null;
ExpressionBuilder eb = new ExpressionBuilder();
Expression exp = eb.get(queryKeys[0]).equal(keyValues[0]);
for (int i = 1; i < queryKeys.length; i++)
{
    exp = exp.and(eb.get(queryKeys[i]).equal(keyValues[i]));
}
// check if we need to add the authorization filter
if (authorizer.requiresCheck(domainClass))
{
    // this is where the authorization filter is appended to the query
    exp = exp.and(appendReadFilter());
}
ReadObjectQuery query = new ReadObjectQuery(domainClass, exp);
SessionContext sc = mScm.getCurrentSessionContext();
if (TLTransactionManager.isInTransaction(sc.getUserId()))
{
    // part of a larger transaction scenario
    query.conformResultsInUnitOfWork();
}
else
{
    // not part of a transaction
    query.refreshIdentityMapResult();
    query.cascadePrivateParts();
}
Session session = getSession();
object = session.executeQuery(query);
4)
// builds the authorization filter
private Expression appendReadFilter()
{
    ExpressionBuilder eb = new ExpressionBuilder();
    Expression exp1 = eb.getTable("ALL_OBJECT_PRIVILEGES").getField("xdoobject_id");
    Expression exp2 = eb.getTable("ALL_OBJECT_PRIVILEGES").getField("user_id");
    Expression exp3 = eb.getTable("ALL_OBJECT_PRIVILEGES").getField("privilege");
    Vector args = new Vector();
    args.add(READ_PRIVILEGE_VALUE);
    Expression exp4 =
        exp3.getFunctionWithArguments("bitand", args).equal(READ_PRIVILEGE_VALUE);
    SessionContext sc = mScm.getCurrentSessionContext();
    return eb.get("ID").equal(exp1).and(exp2.equal(sc.getUserId()).and(exp4));
}

// helper to get the TopLink Session
private Session getSession() throws DataAccessException
{
    SessionContext sc = mScm.getCurrentSessionContext();
    Session session = TLTransactionManager.getActiveTransaction(sc.getUserId());
    if (session == null)
    {
        session = TLSessionFactory.getSession();
    }
    return session;
}

// method of TLTransactionManager; provides easy access to TLSession,
// which handles TopLink Sessions and is a singleton
public static UnitOfWork getActiveTransaction(String userId)
    throws DataAccessException
{
    TLSession tls = TLSession.getInstance();
    return tls.getTransaction(userId);
}

// the TLSession method; returns the active transaction (UOW) or null if none
public UnitOfWork getTransaction(String uid)
{
    UnitOfWork uow = null;
    UowWrapper uw = (UowWrapper) mTransactions.get(uid);
    if (uw != null)
    {
        uow = uw.getUow();
    }
    return uow;
}
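To make the failure concrete, here is a minimal sketch of what we observe inside the transaction; keyExp stands for the ABSOLUTE_PATH criteria built as in snippet (3), and uow, domainClass and appendReadFilter() are as above:

// same conforming lookup, with and without the authorization filter
ReadObjectQuery plain = new ReadObjectQuery(domainClass, keyExp);
plain.conformResultsInUnitOfWork();
Object withoutFilter = uow.executeQuery(plain);      // the new parent IS found

ReadObjectQuery filtered = new ReadObjectQuery(domainClass,
        keyExp.and(appendReadFilter()));
filtered.conformResultsInUnitOfWork();
Object withFilter = uow.executeQuery(filtered);      // returns null in our case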
Thanks!
-Tim

Similar Messages

  • Issue regarding [Work Flow] Business Object Event Raise in ABAP Program

    Hi All,
    I have one issue regarding [Work Flow] Business Object Event Raise in ABAP Program.
    Actual TDS is as below:
    If E message type written, raise Business object BUS2005 (Production order) Event PickShortage for production order passing warehouse, transfer request
    (BUS2065 Object key) in event container. Also include table of text version of error
    messages for this set of Transfer
    Request.
    Can anybody tell me how I can write this technically in ABAP code?
    Can anybody help solve this issue?
    Thanks in advance.
    Thanks,
    Deep.

    Hi,
    Can anybody solve above posted issue!
    Thanks,
    Deep.

  • How to build sql query for view object at run time

    Hi,
    I have a LOV on my form that is created from a view object.
    View object is read-only and is created from a SQL query.
    SQL query consists of few input parameters and table joins.
    My scenario is such that if input parameters are passed, I have to join extra tables; otherwise a single table can fetch the results I need.
    Can anyone please suggest how I can solve this? I want to build the query for the view object at run time based on the values passed to the input parameters.
    Thanks
    Srikanth Addanki

    As I understand it, you want to change the query at run time.
    If that is what you want, you can use the setQuery method and then call executeQuery.
    http://download.oracle.com/docs/cd/B14099_19/web.1012/b14022/oracle/jbo/server/ViewObjectImpl.html#setQuery_java_lang_String_
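    A hedged sketch of that approach (the view object name "MyLovVO", the application module variable am, the SQL text and the bind handling are all placeholders; see the javadoc linked above for setQuery itself):

    // sketch only: swap the view object's SQL at run time, then re-execute
    ViewObjectImpl vo = (ViewObjectImpl) am.findViewObject("MyLovVO");
    if (inputParamsProvided) {
        // extra table joined in only when the input parameter is supplied
        vo.setQuery("SELECT a.code, a.name FROM table_a a, table_b b "
                  + "WHERE a.id = b.a_id AND b.code = :1");
        vo.setWhereClauseParam(0, codeValue);
    } else {
        vo.setQuery("SELECT a.code, a.name FROM table_a a");
    }
    vo.executeQuery();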

  • Persistence for Java Objects Using TopLink

    Hi All, Happy New Year.
    I am trying a tutorial in JDeveloper 10.1.3.0.4 called
    "Provide Persistence for Java Objects Using TopLink".
    I have followed the instructions and get the following error.
    com.evermind.reflect.UndeclaredExceptionTypeException: oracle.oc4j.rmi.OracleRemoteException
         at __Proxy1.persistEntity(Unknown Source)
         at acme.ejb.session.EmpSessionClient.main(EmpSessionClient.java:29)
    oracle.oc4j.rmi.OracleRemoteException: Invocation error: java.lang.NoSuchMethodException: acme.ejb.session.EmpSession.persistEntity(java.lang.Object)
    The release notes mention: "TopLink POJOs Must Implement java.io.Serializable When Returned From a Session Bean's Remote Interface (4902787)". When creating a session bean facade for TopLink POJO objects, you must implement java.io.Serializable for each of the TopLink POJOs returned from the SessionBean facade through a remote interface. This is typically required when using ADF Swing, an EJB sample client, or when your EJB session bean resides on a separate application server from the client. You can also tell that you need to implement java.io.Serializable when you get the following exception:
    com.evermind.reflect.UndeclaredExceptionTypeException:
    oracle.oc4j.rmi.OracleRemoteException
    at __Proxy1.[Your Class Name Here] (Unknown Source)
    The workaround is to manually edit each POJO object to implement java.io.Serializable.
    I have only one POJO which is declared as follows:
    public class EmpInfo implements Serializable {
    Can anybody help me understand what I need to do to get it to work?
    Many Thanks in Advance

    Hi,
    can you send me your test scenario/project at anuj dot k dot jain at oracle dot com. I tried reproducing this but was unable to do so.
    Thanks,
    anuj dot k dot jain at oracle dot com

  • Picklist Query for Child Object

    Hello every one
    For querying the picklist values from On Demand via "picklist", we need to pass the object name, the picklist integration tag, and the language code.
    For standard objects like Account, Contact, and Service Request I have no problem, but I don't know what to pass as the object name for child objects like Product Revenue, Contact Role, etc.
    I have tried all possible combinations, but it keeps coming back with an exception, and there is no documentation either.
    Any one have any idea about this? or any document?
    Thanks in advance
    Dinesh

    Hi,
    The problem I mentioned with Method #1 includes the VO reverting back to the old SQL when you do things like sorting a column. I don't know of a way to force it not to revert other than calling setQuery again on each action, so I try not to use this approach (Method #1) if possible. I don't know how "dynamic" your SQL statements are, but if possible you should come up with an SQL statement general enough to cover them and then filter with ViewCriteria (Method #2).
    As for executing your custom query on page load, you can try extending your backing bean with PagePhaseListener and placing your code in the onPageLoad() method. You can check whether it is the first page load by evaluating !AdfFacesContext.getCurrentInstance().isPostback(). You will also need to register your backing bean in the PageDefinition file as the ControllerClass, such as below:
    <pageDefinition xmlns="http://xmlns.oracle.com/adfm/uimodel"
                    version="11.1.1.51.88" ...
                    ControllerClass="#{backingBean}">
    I run executeQuery in onPageLoad(), but the screen doesn't get refreshed; please help.
    Regards,
    Chan Kelwin

  • Issue with query for AR transactions posted to GL

    Hi all,
    I'm using Oracle R12.1.3.
    I have a report similar to Account Analysis Report which displays Transactions posted to GL.
    I have the following issue:
    In the result of the report, if an AR transaction has 2 lines or more, the lines get multiplied. So my question is: how can I identify which AR transaction line is linked to which GL line number?
    Here is my query:
    SELECT DISTINCT GJH.JE_HEADER_ID,
      GJL.JE_LINE_NUM,
      PARTY.PARTY_NAME CUSTOMER_VENDOR,
      RCT.TRX_NUMBER TRANS_NUMBER,
      SUBSTR(REPLACE(REPLACE(REPLACE(REPLACE(REPLACE(SUBSTR(CTL.DESCRIPTION, 1, 2000), CHR(13), ''), CHR(10), ''), CHR(9), ''), CHR(11), ''), CHR(12), ''), 1, 100) DESCRIPTION,
      NVL(xal.entered_dr, 0) - NVL(xal.entered_cr, 0) amount,
      CTL.*
    FROM GL.GL_JE_HEADERS GJH,
      GL.GL_JE_LINES GJL,
      GL.GL_CODE_COMBINATIONS GCC,
      GL.GL_PERIODS GLP,
      GL.GL_IMPORT_REFERENCES IMP,
      XLA.XLA_AE_LINES XAL,
      XLA.XLA_AE_HEADERS XAH,
      XLA.XLA_EVENTS XE,
      XLA.XLA_TRANSACTION_ENTITIES XTE,
      RA_CUSTOMER_TRX_ALL RCT,
      HZ_PARTIES PARTY,
      AR.HZ_CUST_ACCOUNTS CA,
      GL_CODE_COMBINATIONS_KFV CC,
      AR.RA_CUSTOMER_TRX_LINES_ALL CTL,
      AR.RA_CUST_TRX_LINE_GL_DIST_ALL CTLD
    WHERE 1              = 1
    AND GJH.JE_HEADER_ID = GJL.JE_HEADER_ID
      --      AND GJL.STATUS || '' = 'P'
    AND GCC.CODE_COMBINATION_ID   = CTLD.CODE_COMBINATION_ID
    AND GJH.PERIOD_NAME           = GLP.PERIOD_NAME
    AND RCT.CUSTOMER_TRX_ID       = CTLD.CUSTOMER_TRX_ID
    AND CTLD.CUSTOMER_TRX_LINE_ID = CTL.CUSTOMER_TRX_LINE_ID
    AND ctld.customer_trx_id      = RCT.CUSTOMER_TRX_ID
      --       AND GLP.ADJUSTMENT_PERIOD_FLAG <> 'Y'
    AND GJH.JE_SOURCE           = 'Receivables'
    AND GJL.JE_HEADER_ID        = IMP.JE_HEADER_ID
    AND GJL.JE_LINE_NUM         = IMP.JE_LINE_NUM
    AND IMP.GL_SL_LINK_ID       = XAL.GL_SL_LINK_ID
    AND IMP.GL_SL_LINK_TABLE    = XAL.GL_SL_LINK_TABLE
    AND XAL.APPLICATION_ID      = XAH.APPLICATION_ID
    AND XAL.AE_HEADER_ID        = XAH.AE_HEADER_ID
    AND XAH.APPLICATION_ID      = XE.APPLICATION_ID
    AND XAH.EVENT_ID            = XE.EVENT_ID
    AND XE.APPLICATION_ID       = XTE.APPLICATION_ID
    AND XTE.APPLICATION_ID      = 222
    AND XE.ENTITY_ID            = XTE.ENTITY_ID
    AND XTE.ENTITY_CODE         = 'TRANSACTIONS'
    AND XTE.SOURCE_ID_INT_1     = RCT.CUSTOMER_TRX_ID
    AND RCT.BILL_TO_CUSTOMER_ID = CA.CUST_ACCOUNT_ID
    AND CA.PARTY_ID             = PARTY.PARTY_ID
    AND rcT.CUSTOMER_TRX_ID     = ctl.CUSTOMER_TRX_Id
    AND CTL.LINE_TYPE           = 'LINE'
    AND XAL.CODE_COMBINATION_ID = CC.CODE_COMBINATION_ID
    AND RCT.CUSTOMER_TRX_ID                              = 8857929
    AND GJL.JE_LINE_NUM                                  = 8866
    Any ideas?
    Thanks in advance,
    Stoyanov.

    Hi Stoyanov,
    Please try using the table xla_distribution_links to join with ra_cust_trx_line_gl_dist_all by using
    xla_distribution_links.source_distribution_id_num_1 = ra_cust_trx_line_gl_dist_all.cust_trx_line_gl_dist_id
    The link below gives the join conditions for the various subledger types:
    Technical: R12 SLA Tables connection to AP, AR, INV, Payments, Receiving
    Hope this helps.
    Regards,
    Manjusha.

  • Issue regarding bdc for capturing error records

    Hi All,
    My requirement is to capture the error records and download them to a flat file.
    I have done a recording for the MM01 transaction.
    The problem is that no error records are downloaded into the flat file; only empty records are downloaded.
    Please see the code below, which I developed, and modify it as needed. This is urgent; please provide a solution as soon as possible.
    My flat file:
    M     FERT     X     MATL105     KG
    X     FERT     X     MATL106     KG
    In the flat file above, 'X' is an industry sector that does not exist; this is the error record that has to be captured and downloaded to the flat file.
    Source code :
    report Z_MM01_MSG_F MESSAGE-ID MSG1
           no standard page heading line-size 255.
    include bdcrecx1.
    parameters: dataset(132) lower case.
    ***    DO NOT CHANGE - the generated data section - DO NOT CHANGE    ***
    *   If it is nessesary to change the data section use the rules:
    *   1.) Each definition of a field exists of two lines
    *   2.) The first line shows exactly the comment
    *       '* data element: ' followed with the data element
    *       which describes the field.
    *       If you don't have a data element use the
    *       comment without a data element name
    *   3.) The second line shows the fieldname of the
    *       structure, the fieldname must consist of
    *       a fieldname and optional the character '_' and
    *       three numbers and the field length in brackets
    *   4.) Each field must be type C.
    *** Generated data section with specific formatting - DO NOT CHANGE  ***
    data: begin of record occurs 0,
    * data element: MBRSH
            MBRSH_001(001),
    * data element: MTART
            MTART_002(004),
    * data element: XFELD
            KZSEL_01_003(001),
    * data element: MAKTX
            MAKTX_004(040),
    * data element: MEINS
            MEINS_005(003),
    * data element: MTPOS_MARA
            MTPOS_MARA_006(004),
          end of record.
    *DECLARATION OF BDCDATA STRUCTURE
    DATA: IT_BDCDATA LIKE BDCDATA OCCURS 0 WITH HEADER LINE .
    *declaration to store the message
    DATA: IT_MESSTAB LIKE BDCMSGCOLL OCCURS 0 WITH HEADER LINE .
    *DECLARATION TO STORE THE MESSAGE
    DATA: BEGIN OF IT_STORE_MSG OCCURS 0,
          STORE(1000),
          END OF IT_STORE_MSG.
    *declaration SUCCESS MESG
    DATA: BEGIN OF IT_SUCCESS OCCURS 0,
          SUCCESS_REC(10),
          MBRSH(10),
          TABIX LIKE SY-TABIX,
          END OF IT_SUCCESS.
    *declaration ERROR MESSAGE
    DATA: BEGIN OF IT_ERROR  OCCURS  0,
          ERROR_REC(10),
          MBRSH(10),
             TABIX LIKE SY-TABIX,
          END OF IT_ERROR.
    DATA:TABIX LIKE SY-TABIX.
    *validating Material type(mtart) field data with table T134
    data : v_type like T134-mtart.
    DATA: V_INDSECT LIKE MARA-MBRSH.
    *** End generated data section ***
    start-of-selection.
    CALL FUNCTION 'UPLOAD'
    * EXPORTING
    *   CODEPAGE                      = ' '
    *   FILENAME                      = ' '
    *   FILETYPE                      = ' '
    *   ITEM                          = ' '
    *   FILEMASK_MASK                 = ' '
    *   FILEMASK_TEXT                 = ' '
    *   FILETYPE_NO_CHANGE            = ' '
    *   FILEMASK_ALL                  = ' '
    *   FILETYPE_NO_SHOW              = ' '
    *   LINE_EXIT                     = ' '
    *   USER_FORM                     = ' '
    *   USER_PROG                     = ' '
    *   SILENT                        = 'S'
    * IMPORTING
    *   FILESIZE                      =
    *   CANCEL                        =
    *   ACT_FILENAME                  =
    *   ACT_FILETYPE                  =
      TABLES
        data_tab                      = record.
    * EXCEPTIONS
    *   CONVERSION_ERROR              = 1
    *   INVALID_TABLE_WIDTH           = 2
    *   INVALID_TYPE                  = 3
    *   NO_BATCH                      = 4
    *   UNKNOWN_ERROR                 = 5
    *   GUI_REFUSE_FILETRANSFER       = 6
    *   OTHERS                        = 7
    IF sy-subrc <> 0.
    * MESSAGE ID SY-MSGID TYPE SY-MSGTY NUMBER SY-MSGNO
    *         WITH SY-MSGV1 SY-MSGV2 SY-MSGV3 SY-MSGV4.
    ENDIF.
    *perform open_dataset using dataset.
    perform open_group.
    LOOP AT RECORD.
    perform bdc_dynpro      using 'SAPLMGMM' '0060'.
    perform bdc_field       using 'BDC_CURSOR'
                                  'RMMG1-MATNR'.
    perform bdc_field       using 'BDC_OKCODE'
                                  '/00'.
    *- Validating industry sector(MBRSH) from the master table(MARA)
    select single MBRSH from T137  into V_INDSECT where MBRSH eq
    record-MBRSH_001.
    IF SY-SUBRC EQ 0.
    perform bdc_field       using 'RMMG1-MBRSH'
                                  record-MBRSH_001.
    *endif.
    perform bdc_field       using 'RMMG1-MTART'
                                  record-MTART_002.
    perform bdc_dynpro      using 'SAPLMGMM' '0070'.
    perform bdc_field       using 'BDC_CURSOR'
                                  'MSICHTAUSW-DYTXT(01)'.
    perform bdc_field       using 'BDC_OKCODE'
                                  '=ENTR'.
    perform bdc_field       using 'MSICHTAUSW-KZSEL(01)'
                                  record-KZSEL_01_003.
    perform bdc_dynpro      using 'SAPLMGMM' '4004'.
    perform bdc_field       using 'BDC_OKCODE'
                                  '/00'.
    perform bdc_field       using 'MAKT-MAKTX'
                                  record-MAKTX_004.
    perform bdc_field       using 'BDC_CURSOR'
                                  'MARA-MEINS'.
    perform bdc_field       using 'MARA-MEINS'
                                  record-MEINS_005.
    perform bdc_field       using 'MARA-MTPOS_MARA'
                                  record-MTPOS_MARA_006.
    perform bdc_dynpro      using 'SAPLSPO1' '0300'.
    perform bdc_field       using 'BDC_OKCODE'
                                  '=YES'.
    perform bdc_transaction using 'MM01'.
    *ELSE.
    *message  E000 WITH 'Industry sector does not Exist' .
    *endif.
    LOOP AT MESSTAB.
    CALL FUNCTION 'FORMAT_MESSAGE'
    EXPORTING
       ID              = MESSTAB-MSGID
       LANG            = MESSTAB-MSGSPRA
       NO              = MESSTAB-MSGNR
       V1              = MESSTAB-MSGV1
       V2              = MESSTAB-MSGV2
    *   V3              = SY-MSGV3
    *   V4              = SY-MSGV4
    IMPORTING
       MSG             = IT_STORE_MSG-STORE
       EXCEPTIONS
    *   NOT_FOUND       = 1
       OTHERS          = 0.
    IF MESSTAB-MSGTYP = 'S'.
       IT_SUCCESS-SUCCESS_REC = IT_STORE_MSG-STORE.
       IT_SUCCESS-MBRSH = record-MBRSH_001.
       IT_SUCCESS-TABIX = TABIX.
       APPEND IT_SUCCESS.
       ELSEIF  MESSTAB-MSGTYP = 'E'.
       IT_ERROR-ERROR_REC = IT_STORE_MSG-STORE.
       IT_ERROR-MBRSH = record-MBRSH_001.
       IT_ERROR-TABIX = TABIX.
      APPEND IT_ERROR.
    ENDIF.
    endloop.
    endif.
    ENDLOOP.
    CALL FUNCTION 'DOWNLOAD'
      TABLES
        DATA_TAB                      = IT_error.
    *   FIELDNAMES                    =
    * EXCEPTIONS
    *   INVALID_FILESIZE              = 1
    *   INVALID_TABLE_WIDTH           = 2
    *   INVALID_TYPE                  = 3
    *   NO_BATCH                      = 4
    *   UNKNOWN_ERROR                 = 5
    *   GUI_REFUSE_FILETRANSFER       = 6
    *   OTHERS                        = 7
    IF SY-SUBRC <> 0.
    * MESSAGE ID SY-MSGID TYPE SY-MSGTY NUMBER SY-MSGNO
    *         WITH SY-MSGV1 SY-MSGV2 SY-MSGV3 SY-MSGV4.
    ENDIF.
    perform close_group.
    *perform close_dataset using dataset.

    Hi,
    DATA: IT_MESSTAB LIKE BDCMSGCOLL OCCURS 0 WITH HEADER LINE.
    CALL FUNCTION 'DOWNLOAD'
      TABLES
        DATA_TAB                = IT_error
      EXCEPTIONS
        INVALID_FILESIZE        = 1
        INVALID_TABLE_WIDTH     = 2
        INVALID_TYPE            = 3
        NO_BATCH                = 4
        UNKNOWN_ERROR           = 5
        GUI_REFUSE_FILETRANSFER = 6
        OTHERS                  = 7.
    IF SY-SUBRC <> 0.
      MESSAGE ID SY-MSGID TYPE SY-MSGTY NUMBER SY-MSGNO
              WITH SY-MSGV1 SY-MSGV2 SY-MSGV3 SY-MSGV4.
    ENDIF.
    Instead of using IT_error in TABLES, use IT_MESSTAB.


  • String index out of bounds... issue regarding checking for non-integers

    Okay, I have been racking my brain about this for the last couple of days. It looks like everything is all right, but I keep getting StringIndexOutOfBoundsException: String index out of range: 1.
    Here's the code:
                   for (int x = 0; x < size; ++x)
                   {
                        count = x + 1;
                        System.out.println("Please enter value #" + count + ":");
                        numnum = console.nextLine();
                        if (Character.isDigit(numnum.charAt(x)))
                             goodInput = true;
                        if (!goodInput)
                             System.out.print("Please Enter Only Integers!");
                        else
                             values[x] = Integer.parseInt(numnum);
                   }
    It's probably something really stupid, but if someone can figure this out, I'd be most grateful.
    Thanks

    Be sure values contains data.
    Then make sure that size = values.length
    Either you don't have any data to iterate through, or the size variable is too big. So it's trying to search in parts of the array that don't exist.
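    For what it's worth, here is a hedged variant of the loop that avoids indexing the input string with the outer counter (charAt(x) above uses the loop index x, which quickly runs past the end of a short entry) and validates the whole entry instead; it simply re-prompts on bad input:

    for (int x = 0; x < size; ++x) {
        System.out.println("Please enter value #" + (x + 1) + ":");
        String input = console.nextLine().trim();
        try {
            values[x] = Integer.parseInt(input);   // accepts the whole entry or throws
        } catch (NumberFormatException e) {
            System.out.print("Please Enter Only Integers!");
            --x;   // ask for the same slot again
        }
    }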

  • JPA query by entity/object ?

    I am trying to write an abstract API which dynamically assigns any Entity Class that needs to be persisted and retrieved using the Entity Manager.
    Saving into the database is not a problem, I just do entityManager.save(Class) and it works for any class that needs to be persisted.
    However, when querying for the object based upon the attributes, I want to avoid naming particular attributes and want to use the Entity class's attributes against itself for querying.
    For example, the client program will say something like this to query by name and age of a Person:
    -------calling (client) program: ---
    Person p = << get from UI, not saved yet, no Id but has all other attributes like name and age etc. >>
    List<Person> persons = dao.getAllThatMatch(p);
    --- end client Program --
    --- DAO class ---
    List<T> getAllThatMatch(T t) {  //note that expectation is that returned is a list of Object which is the same as the querying object
    List<T> entityList = em.someFinderMethod(t);
    //the someFinderMethod method should automatically query for all Person objects that match the attributes provided by the object of Person supplied as criteria
    //NOTE: there is no attribute mentioned extensively like name, age etc.
    return entityList ;
    -- end DAO class --

    Query by example is not included in the JPA standard, but it is possible to do with EclipseLink.
    See http://wiki.eclipse.org/EclipseLink/Examples/JPA/ORMQueries#Query_By_Example
    for how to use query by example with native EclipseLink queries. To execute a native query through JPA, you need to call createQuery(DatabaseQuery query) on the org.eclipse.persistence.jpa.JpaEntityManager, obtained from the javax.persistence.EntityManager instance by calling getDelegate() or unwrap().
    Best Regards,
    Chris
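    A hedged sketch of the native query-by-example route described above (Person, p and em are placeholders for the entity class, the example instance and the EntityManager; a QueryByExamplePolicy can optionally be attached to tune which attribute values are ignored):

    import org.eclipse.persistence.jpa.JpaEntityManager;
    import org.eclipse.persistence.queries.ReadAllQuery;

    // build a native query-by-example query from the sample entity
    ReadAllQuery raq = new ReadAllQuery(Person.class);
    raq.setExampleObject(p);   // p carries the attribute values to match

    // execute it through JPA by unwrapping the EclipseLink entity manager
    JpaEntityManager jpaEm = em.unwrap(JpaEntityManager.class);
    List<Person> matches = jpaEm.createQuery(raq).getResultList();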

  • Are there REST APIs to retrieve entity metadata for Eloqua objects?

    There is a list of all the objects which can be accessed via REST for CRUD at this link: REST API - Documentation for Core Objects, under the Core Objects section.
    For each of the objects listed under the Core Objects section there is field metadata under the Properties section.
    For example, for the Email object (REST API - Accessing Emails), under the Properties section there are corresponding entries for the fields of the Email object under the Name, Type, Description and Validations headings.
    Is there a REST API for retrieving the same information, i.e. the field metadata for an Eloqua object, programmatically?
    If not, it is a serious hindrance to building systems that are metadata driven, especially since SOAP support is being deprecated...

    Metadata is 'top level' information on the object, and it is available whether you query an individual object (a single form or email asset) or query for multiple objects of that type (list all forms, list all emails). Consider using a depth of minimal or partial for faster performance if the specific configuration of those objects is not important.
    Example:
    GET /assets/forms?depth=minimal&count=2
    Returns:
    {
      "elements": [
        {
          "type": "Form",
          "currentStatus": "Draft",
          "id": "19",
          "createdAt": "1409623550",
          "createdBy": "8",
          "depth": "minimal",
          "folderId": "7",
          "name": "zzztestCS_3-9381543541_AutocompleteTest",
          "permissions": "fullControl",
          "updatedAt": "1409623623",
          "updatedBy": "8"
        },
        {
          "type": "Form",
          "currentStatus": "Draft",
          "id": "22",
          "createdAt": "1409781207",
          "createdBy": "11",
          "depth": "minimal",
          "folderId": "466",
          "name": "daisychain1",
          "permissions": "fullControl",
          "updatedAt": "1412779449",
          "updatedBy": "20"
        }
      ],
      "page": 1,
      "pageSize": 2,
      "total": 130
    }
    Without limiting the count to 2, this would return up to 1000 results if you had multiple forms in your system and give you a basic top level view of each. Similarly, you can use GET /assets/form/{id}?depth=minimal to get the same sort of information.
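    As a rough illustration, a minimal standalone Java call against that endpoint could look like the sketch below; the base URL, the company\user credentials and the use of HTTP basic auth are assumptions, so adapt them to your instance and authentication setup:

    import java.io.BufferedReader;
    import java.io.InputStreamReader;
    import java.net.HttpURLConnection;
    import java.net.URL;
    import java.util.Base64;

    public class FormListSketch {
        public static void main(String[] args) throws Exception {
            // placeholder endpoint and credentials
            URL url = new URL("https://secure.eloqua.com/API/REST/1.0/assets/forms?depth=minimal&count=2");
            String auth = Base64.getEncoder()
                    .encodeToString("COMPANY\\user:password".getBytes("UTF-8"));

            HttpURLConnection conn = (HttpURLConnection) url.openConnection();
            conn.setRequestMethod("GET");
            conn.setRequestProperty("Authorization", "Basic " + auth);

            // print the raw JSON response; parse it with your JSON library of choice
            try (BufferedReader in = new BufferedReader(
                    new InputStreamReader(conn.getInputStream(), "UTF-8"))) {
                String line;
                while ((line = in.readLine()) != null) {
                    System.out.println(line);
                }
            }
        }
    }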
    Other endpoints can be found on the REST livedocs page here (requires authentication):
    https://secure.eloqua.com/api/docs/Dynamic/Rest/1.0/Reference.aspx
    Regards,
    Bojan

  • Collect metrics failed because 'Query Datacenter Managed Object Reference' is not executed

    Hello,
    This is my first post on this forum; I am working for a Cisco partner.
    I am working on Tidal Enterprise Orchestrator 2.3.0.441 (hotfix1 and 2, content update 1), part of CIAC starter edition.
    The scheduled Collect Metrics fails in the 'vSphere Datacenter Sync' and 'vSphere Cluster Data Sync' parts.
    The problem is at the level of 'Query Datacenter Managed Object Reference'.
    The input is the datacenter name (well defined, and the value is not '*'), but this box is not executed (it stays white). The next box, 'Set Datacenter MOR', defines a variable with the value 'Datacenter-' instead of 'Datacenter-[output of Query Datacenter Managed Object Reference]'. Finally, 'Create Cluster table' fails because a root element (the datacenter name) is missing.
    So my question is: why is 'Query Datacenter Managed Object Reference' not executed?
    I changed nothing in the workflow, and the datacenter is normally well defined.
    thank you for your help,
    Cheers,
    Nicolas

    This particular utility workflow is set to not-archive completed instances.
    This means that, after it finishes, it is not saved to the database and you can't see the runtime information. It improves performance and saves database space, but does make troubleshooting a little more roundabout.
    You'll want to turn on archiving temporarily to see what the error message is.  Open the process, go to the Options tab, and check the "Archive completed instances" box.

  • Questions on Named Query

    We have a few questions about named queries.
    According to the TopLink documentation, we can register a named query in two places:
    1. Register the named query in the query manager at the Session level - the documentation says these are for "GLOBAL" queries.
    2. Register the named query in the query manager at the Descriptor level. These descriptor-level queries can be defined via the TopLink Workbench or defined in Java code and loaded into the descriptor via a post-project-load amendment method. The documentation also mentions that once the session is logged in, we cannot add named queries to the descriptor.
    We have a few questions.
    All queries are related to a class, so what do we mean by "GLOBAL" queries? What does "GLOBAL" mean here?
    Is there any restriction on adding a query to a session? Can we add a query to a session after the session has logged in?
    What we would like to achieve is something like this: ask the session for a query "findElectionByEmployee"; if nothing is returned, construct the query and add it to the session as a named query.
    Can we ask a ClientSession for a query using getQuery(String), or can we only ask a ServerSession for a query?
    And what about adding a query to a session: can a ClientSession add a query back to the parent ServerSession, or do we need to add the query directly to a ServerSession?
    We have hundreds of queries, and not all of them are used all the time, so adding queries to the session on demand ("just in time" addition) would save the most memory compared with starting the system with hundreds of queries.
    Another issue we have with descriptor-based queries, especially those defined via the TopLink Workbench, is that they create a sort of administrative single point of entry. For a system that has a lot of queries, funnelling their creation through a single person, or a few people, can be administratively difficult, even though we appreciate the "central and single point of control" aspect of it.
    So what is your observation on the use of named queries with your other clients? Is it a popular feature? Do your customers like to define the queries in the Workbench, and can that solution scale administratively? Do you see "just in time" adding of queries to a session, if that is at all possible?
    Currently we have a simple query mechanism: each DAO method, e.g. findElectionByEmployee, constructs the query with the expression builder, expressions, etc., and after one execution throws it away and redoes the same thing on the next method invocation.
    We want to avoid that, but we do not want to centralize the definitions with a single "TopLink mapping" person. For a fairly big system like ours, with 20-30 DAOs and hundreds of finder methods, it is too difficult to centralize.
    What are your thoughts on this? If there is any white paper or best practice you can point us to, we would be most appreciative.

    In answer to your technical questions: the only difference between descriptor-level and session-level named queries is where the query is stored. Queries stored at the session level must have unique signatures; queries stored on a descriptor need signatures unique only among the queries on that same descriptor, not among queries on other descriptors or the session.
    Queries at the session level can be on any class; queries at the descriptor level must be on that descriptor's class.
    A ServerSession's queries can be found from a ClientSession, but queries added to a ClientSession are only available to that ClientSession and are disposed of when the ClientSession is released.
    As to the questions about manageability, it really is up to you to evaluate. If it is important to your organization that queries are moderated centrally, then controlling them at the project level would provide for that. If that is not an important requirement, then moving to a dynamic model would mean less overhead, and fewer headaches from having numerous people working on the source and redeploying the same application repeatedly. In general, a combination of the two approaches, based on how often a particular query is used, would probably be best.
    --Gordon
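    To make the two registration points concrete, here is a hedged sketch (Election, the attribute name and the query name are placeholders; addQuery on the session and on the descriptor's query manager are the two mechanisms discussed above):

    // build the named query once
    ReadObjectQuery byEmployee = new ReadObjectQuery(Election.class);
    ExpressionBuilder eb = new ExpressionBuilder();
    byEmployee.setSelectionCriteria(eb.get("employeeId").equal(eb.getParameter("empId")));
    byEmployee.addArgument("empId");

    // option 1: session-level registration - the query can be on any class
    serverSession.addQuery("findElectionByEmployee", byEmployee);

    // option 2: descriptor-level registration - the query must be on the descriptor's class
    serverSession.getDescriptor(Election.class)
                 .getQueryManager()
                 .addQuery("findElectionByEmployee", byEmployee);

    // later, execute by name; a ClientSession can see the ServerSession's queries
    Object election = clientSession.executeQuery("findElectionByEmployee", employeeId);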


  • API Clearcase for Business Objects Reports including Webi

    Is there anyone out there who has developed a linked version of the following application?
    Winners of the Business Objects Community Content Contest
    I'm excited to announce the winners of the content contest that we ran earlier in the summer. The contest ran for seven weeks and had five categories, each of which offered a potential grand prize of $1,000 USD, a second prize of $300 USD, and a third prize of $200 USD. 
    Here is the list of winners, who have already been officially notified, along with a description of their submissions:
    SDK APPLICATION
    Grand prize - Sunil Abraham - "Bridge between Rational ClearCase version control and Business Object Enterprise"
    This tool enables updating the location of report files on the BOE server from the ClearCase repository without using the web interface. The tool is a stand alone application developed in VB.NET using BusinessObjects Enterprise SDK and the ClearCase Automation Library (COM Interface). The user selects the ClearCase repository and the change set that needs to be updated on the BOE server. The tool queries BOE and returns the locations of the files to be updated. The user can then update the server location with the click of a button.

    You cannot use the Query Builder to update the objects. You need to create a JSP page or Java class and add the code to perform the action. The code should:
    1. log on to Enterprise
    2. query for the objects to get the InfoObjects
    3. loop through each object and update the title using setTitle()
    4. commit
    5. log off
    Check the various [sample|http://www.sdn.sap.com/irj/boc/samples?rid=/webcontent/uuid/b0daacb7-ad82-2b10-b2b6-f3e9fa3e716a] and [Developer Guide|http://help.sap.com/businessobject/product_guides/boexir31/en/boesdk_java_dg_12_en.zip] for how to set up the environment, etc.
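    For reference, a hedged BusinessObjects Enterprise Java SDK sketch of those five steps (the CMS name, credentials, the CI_INFOOBJECTS query and the new titles are all placeholders):

    import com.crystaldecisions.sdk.framework.CrystalEnterprise;
    import com.crystaldecisions.sdk.framework.IEnterpriseSession;
    import com.crystaldecisions.sdk.occa.infostore.IInfoObject;
    import com.crystaldecisions.sdk.occa.infostore.IInfoObjects;
    import com.crystaldecisions.sdk.occa.infostore.IInfoStore;

    public class RetitleReportsSketch {
        public static void main(String[] args) throws Exception {
            // 1. logon to enterprise
            IEnterpriseSession session = CrystalEnterprise.getSessionMgr()
                    .logon("administrator", "password", "cmsname:6400", "secEnterprise");
            try {
                IInfoStore infoStore = (IInfoStore) session.getService("InfoStore");

                // 2. query for the objects to get InfoObjects
                IInfoObjects reports = infoStore.query(
                    "SELECT TOP 100 * FROM CI_INFOOBJECTS WHERE SI_KIND = 'Webi'");

                // 3. loop through each object and update the title using setTitle()
                for (Object o : reports) {
                    IInfoObject report = (IInfoObject) o;
                    report.setTitle(report.getTitle() + " (renamed)");
                }

                // 4. commit
                infoStore.commit(reports);
            } finally {
                // 5. logoff
                session.logoff();
            }
        }
    }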

    Using Premiere Pro CS4. Why is the movie that I export being squashed into the 4:3 (tall and skinny) syndrome when I export using the following settings and view in Quicktime Player 7.6? I captured the video straight from my video camera as DV AVI PA