Error Occurred During Data Selection

Hi Community,
                       I got "Error Occured During Data Selection" when loading data from ODS to Info Cube. This info cube already contained data before but first time I didnt get complete records (both in ODS and Info Cube) so I deleted the request and load the data in both ODS and Info cube.
                      I got the records in ODS but when I am loading data from ODS to Info Cube then error displayed in Monitor display called "Error Occured During Data Selection"
                      I am working on BI 7.0. Please help me
Regds
Robbie.

Hi,
Have you deleted the delta init info available in the InfoPackage that is used to upload the data from the ODS to the cube? To do this, set the request to red and then delete it.
Otherwise, delete the row in RSA7 (in BW) that was created at the time of the delta init for the ODS-to-cube upload.
Also check whether the following link is relevant to your problem:
/message/2922612#2922612 [original link is broken]
With rgds,
Anil Kumar Sharma .P

Similar Messages

  • Oracle B2B UI - Reports - "An unknown error occurred during data streaming."

    I have a brand new Oracle SOA Suite 11g R1 PS2 installation, following my own instructions even, in which everything seems to work except the B2B web console "Reports" functionality. When no messages are available for display (as in none were sent or all were purged) the display looks fine. When there are supposed to be messages to display, I get a web browser dialogue box saying "An unknown error occurred during data streaming. Ask your system administrator to inspect the server-side logs." and no data. I performed a clean install on brand new VMs 4 times already, 2-5 hours each time, and every time I get this. I used Windows XP 64-bit, Windows XP 32-bit, JDK 1.6.0_20 64-bit (on the 64-bit platform), JDK 1.6.0_20 32-bit, and the Oracle-provided JDK 1.6.0_18, all to no avail.
    Can some kind soul please tell me what I should look at / fix to have this resolved?
    The stack trace is below:
    [2010-11-01T20:57:13.256+11:00] [AdminServer] [WARNING] [] [oracle.adfinternal.view.faces.renderkit.rich.SimpleSelectOneRenderer] [tid: [ACTIVE].ExecuteThread: '0' for queue: 'weblogic.kernel.Default (self-tuning)'] [userId: weblogic] [ecid: 0000Ik6rbdg2zGWjLxfP8A1CnclD000079,0] [APP: b2bui] [dcid: f21c2821705695cf:-4862872f:12c06d54454:-7ff0-00000000000000d1] Could not find selected item matching value "" in RichSelectOneChoice[UIXEditableFacesBeanImpl, id=value40]
    [2010-11-01T20:57:13.490+11:00] [AdminServer] [ERROR] [] [oracle.adfinternal.view.faces.config.rich.RegistrationConfigurator] [tid: [ACTIVE].ExecuteThread: '1' for queue: 'weblogic.kernel.Default (self-tuning)'] [userId: weblogic] [ecid: 0000Ik6rbtn2zGWjLxfP8A1CnclD00007A,0] [APP: b2bui] [dcid: f21c2821705695cf:-4862872f:12c06d54454:-7ff0-00000000000000d5] Server Exception during PPR, #1[[
    javax.servlet.ServletException: java.lang.AssertionError
         at weblogic.servlet.internal.ServletStubImpl.execute(ServletStubImpl.java:341)
         at weblogic.servlet.internal.TailFilter.doFilter(TailFilter.java:26)
         at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56)
         at oracle.help.web.rich.OHWFilter.doFilter(Unknown Source)
         at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56)
         at oracle.adfinternal.view.faces.webapp.rich.RegistrationFilter.doFilter(RegistrationFilter.java:97)
         at org.apache.myfaces.trinidadinternal.webapp.TrinidadFilterImpl$FilterListChain.doFilter(TrinidadFilterImpl.java:420)
         at oracle.adfinternal.view.faces.activedata.AdsFilter.doFilter(AdsFilter.java:60)
         at org.apache.myfaces.trinidadinternal.webapp.TrinidadFilterImpl$FilterListChain.doFilter(TrinidadFilterImpl.java:420)
         at org.apache.myfaces.trinidadinternal.webapp.TrinidadFilterImpl._doFilterImpl(TrinidadFilterImpl.java:247)
         at org.apache.myfaces.trinidadinternal.webapp.TrinidadFilterImpl.doFilter(TrinidadFilterImpl.java:157)
         at org.apache.myfaces.trinidad.webapp.TrinidadFilter.doFilter(TrinidadFilter.java:92)
         at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56)
         at oracle.tip.b2b.ui.util.SessionTimeoutFilter.doFilter(SessionTimeoutFilter.java:232)
         at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56)
         at oracle.security.jps.ee.http.JpsAbsFilter$1.run(JpsAbsFilter.java:94)
         at java.security.AccessController.doPrivileged(Native Method)
         at oracle.security.jps.util.JpsSubject.doAsPrivileged(JpsSubject.java:313)
         at oracle.security.jps.ee.util.JpsPlatformUtil.runJaasMode(JpsPlatformUtil.java:414)
         at oracle.security.jps.ee.http.JpsAbsFilter.doFilter(JpsAbsFilter.java:138)
         at oracle.security.jps.ee.http.JpsFilter.doFilter(JpsFilter.java:71)
         at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56)
         at oracle.dms.wls.DMSServletFilter.doFilter(DMSServletFilter.java:330)
         at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56)
         at weblogic.servlet.internal.RequestEventsFilter.doFilter(RequestEventsFilter.java:27)
         at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56)
         at weblogic.servlet.internal.WebAppServletContext$ServletInvocationAction.doIt(WebAppServletContext.java:3684)
         at weblogic.servlet.internal.WebAppServletContext$ServletInvocationAction.run(WebAppServletContext.java:3650)
         at weblogic.security.acl.internal.AuthenticatedSubject.doAs(AuthenticatedSubject.java:321)
         at weblogic.security.service.SecurityManager.runAs(SecurityManager.java:121)
         at weblogic.servlet.internal.WebAppServletContext.securedExecute(WebAppServletContext.java:2268)
         at weblogic.servlet.internal.WebAppServletContext.execute(WebAppServletContext.java:2174)
         at weblogic.servlet.internal.ServletRequestImpl.run(ServletRequestImpl.java:1446)
         at weblogic.work.ExecuteThread.execute(ExecuteThread.java:201)
         at weblogic.work.ExecuteThread.run(ExecuteThread.java:173)
    Caused by: java.lang.AssertionError
         at oracle.adfinternal.view.faces.renderkit.rich.table.TableRenderingContext.isCurrentCellInRowBanded(TableRenderingContext.java:258)
         at oracle.adfinternal.view.faces.renderkit.rich.table.TableRenderingContext.isCurrentCellBanded(TableRenderingContext.java:253)
         at oracle.adfinternal.view.faces.renderkit.rich.table.BaseColumnRenderer.renderDataCellClasses(BaseColumnRenderer.java:1036)
         at oracle.adfinternal.view.faces.renderkit.rich.table.BaseColumnRenderer.renderDataCell(BaseColumnRenderer.java:1149)
         at oracle.adfinternal.view.faces.renderkit.rich.table.BaseColumnRenderer.encodeAll(BaseColumnRenderer.java:103)
         at oracle.adf.view.rich.render.RichRenderer.encodeAll(RichRenderer.java:1369)
         at org.apache.myfaces.trinidad.render.CoreRenderer.encodeEnd(CoreRenderer.java:335)
         at org.apache.myfaces.trinidad.component.UIXComponentBase.encodeEnd(UIXComponentBase.java:765)
         at org.apache.myfaces.trinidad.render.CoreRenderer.encodeChild(CoreRenderer.java:415)
         at oracle.adf.view.rich.render.RichRenderer.encodeChild(RichRenderer.java:2567)
         at oracle.adfinternal.view.faces.renderkit.rich.TableRenderer.renderDataBlockRows(TableRenderer.java:1932)
         at oracle.adfinternal.view.faces.renderkit.rich.TableRenderer._renderSingleDataBlock(TableRenderer.java:1601)
         at oracle.adfinternal.view.faces.renderkit.rich.TableRenderer._handleDataFetch(TableRenderer.java:1003)
         at oracle.adfinternal.view.faces.renderkit.rich.TableRenderer.encodeAll(TableRenderer.java:504)
         at oracle.adf.view.rich.render.RichRenderer.encodeAll(RichRenderer.java:1369)
         at org.apache.myfaces.trinidad.render.CoreRenderer.encodeEnd(CoreRenderer.java:335)
         at org.apache.myfaces.trinidad.component.UIXComponentBase.encodeEnd(UIXComponentBase.java:765)
         at org.apache.myfaces.trinidad.component.UIXCollection.encodeEnd(UIXCollection.java:529)
         at org.apache.myfaces.trinidad.component.UIXComponentBase.__encodeRecursive(UIXComponentBase.java:1515)
         at org.apache.myfaces.trinidad.component.UIXComponentBase.encodeAll(UIXComponentBase.java:785)
         at oracle.adfinternal.view.faces.util.rich.InvokeOnComponentUtils$EncodeChildVisitCallback.visit(InvokeOnComponentUtils.java:113)
         at org.apache.myfaces.trinidadinternal.context.PartialVisitContext.invokeVisitCallback(PartialVisitContext.java:222)
         at org.apache.myfaces.trinidad.component.UIXComponent.visitTree(UIXComponent.java:378)
         at org.apache.myfaces.trinidad.component.UIXComponent.visitTree(UIXComponent.java:326)
         at org.apache.myfaces.trinidad.component.UIXComponent.visitTree(UIXComponent.java:443)
         at org.apache.myfaces.trinidad.component.UIXComponent.visitTree(UIXComponent.java:326)
         at org.apache.myfaces.trinidad.component.UIXComponent.visitTree(UIXComponent.java:443)
         at org.apache.myfaces.trinidad.component.UIXComponent.visitTree(UIXComponent.java:326)
         at org.apache.myfaces.trinidad.component.UIXComponent.visitTree(UIXComponent.java:443)
         at org.apache.myfaces.trinidad.component.UIXComponent.visitTree(UIXComponent.java:326)
         at org.apache.myfaces.trinidad.component.UIXComponent.visitTree(UIXComponent.java:443)
         at org.apache.myfaces.trinidad.component.UIXComponent.visitTree(UIXComponent.java:326)
         at org.apache.myfaces.trinidad.component.UIXComponent.visitTree(UIXComponent.java:443)
         at org.apache.myfaces.trinidad.component.UIXComponent.visitTree(UIXComponent.java:326)
         at org.apache.myfaces.trinidad.component.UIXComponent.visitTree(UIXComponent.java:443)
         at org.apache.myfaces.trinidad.component.UIXComponent.visitTree(UIXComponent.java:448)
         at org.apache.myfaces.trinidad.component.UIXComponent.visitTree(UIXComponent.java:326)
         at org.apache.myfaces.trinidad.component.UIXComponent.visitTree(UIXComponent.java:443)
         at org.apache.myfaces.trinidad.component.UIXComponent.visitTree(UIXComponent.java:326)
         at org.apache.myfaces.trinidad.component.UIXComponent.visitTree(UIXComponent.java:443)
         at org.apache.myfaces.trinidad.component.UIXComponent.visitTree(UIXComponent.java:326)
         at org.apache.myfaces.trinidad.component.UIXComponent.visitTree(UIXComponent.java:443)
         at org.apache.myfaces.trinidad.component.UIXComponent.visitTree(UIXComponent.java:326)
         at org.apache.myfaces.trinidad.component.UIXComponent.visitTree(UIXComponent.java:443)
         at org.apache.myfaces.trinidad.component.UIXComponent.visitTree(UIXComponent.java:448)
         at org.apache.myfaces.trinidad.component.UIXComponent.visitTree(UIXComponent.java:326)
         at org.apache.myfaces.trinidad.component.UIXComponent.visitTree(UIXComponent.java:443)
         at org.apache.myfaces.trinidad.component.UIXComponent.visitTree(UIXComponent.java:326)
         at org.apache.myfaces.trinidad.component.UIXComponent.visitTree(UIXComponent.java:443)
         at org.apache.myfaces.trinidad.component.UIXComponent.visitTree(UIXComponent.java:326)
         at org.apache.myfaces.trinidad.component.UIXComponent.visitTree(UIXComponent.java:443)
         at org.apache.myfaces.trinidad.component.UIXComponent.visitTree(UIXComponent.java:448)
         at org.apache.myfaces.trinidad.component.UIXComponent.visitTree(UIXComponent.java:326)
         at org.apache.myfaces.trinidad.component.UIXComponent.visitTree(UIXComponent.java:443)
         at org.apache.myfaces.trinidad.component.UIXComponent.visitTree(UIXComponent.java:326)
         at org.apache.myfaces.trinidad.component.UIXComponent.visitTree(UIXComponent.java:443)
         at org.apache.myfaces.trinidad.component.UIXComponent.visitTree(UIXComponent.java:326)
         at org.apache.myfaces.trinidad.component.UIXComponent.visitTree(UIXComponent.java:443)
         at org.apache.myfaces.trinidad.component.UIXComponent.visitTree(UIXComponent.java:326)
         at org.apache.myfaces.trinidad.component.UIXComponent.visitTree(UIXComponent.java:443)
         at org.apache.myfaces.trinidad.component.UIXComponent.visitTree(UIXComponent.java:326)
         at org.apache.myfaces.trinidad.component.UIXComponent.visitTree(UIXComponent.java:443)
         at org.apache.myfaces.trinidad.component.UIXComponent.visitTree(UIXComponent.java:326)
         at org.apache.myfaces.trinidad.component.UIXComponent.visitTree(UIXComponent.java:443)
         at org.apache.myfaces.trinidad.component.UIXComponent.visitTree(UIXComponent.java:326)
         at org.apache.myfaces.trinidad.component.UIXComponent.visitTree(UIXComponent.java:443)
         at oracle.adfinternal.view.faces.util.rich.InvokeOnComponentUtils.renderChild(InvokeOnComponentUtils.java:43)
         at oracle.adfinternal.view.faces.streaming.StreamingDataManager._pprComponent(StreamingDataManager.java:611)
         at oracle.adfinternal.view.faces.streaming.StreamingDataManager.execute(StreamingDataManager.java:460)
         at oracle.adfinternal.view.faces.renderkit.rich.DocumentRenderer._encodeStreamingResponse(DocumentRenderer.java:3200)
         at oracle.adfinternal.view.faces.renderkit.rich.DocumentRenderer.encodeAll(DocumentRenderer.java:1245)
         at oracle.adf.view.rich.render.RichRenderer.encodeAll(RichRenderer.java:1369)
         at org.apache.myfaces.trinidad.render.CoreRenderer.encodeEnd(CoreRenderer.java:335)
         at org.apache.myfaces.trinidad.component.UIXComponentBase.encodeEnd(UIXComponentBase.java:765)
         at org.apache.myfaces.trinidad.component.UIXComponentBase.__encodeRecursive(UIXComponentBase.java:1515)
         at org.apache.myfaces.trinidad.component.UIXComponentBase.encodeAll(UIXComponentBase.java:785)
         at javax.faces.component.UIComponent.encodeAll(UIComponent.java:942)
         at com.sun.faces.application.ViewHandlerImpl.doRenderView(ViewHandlerImpl.java:271)
         at com.sun.faces.application.ViewHandlerImpl.renderView(ViewHandlerImpl.java:202)
         at javax.faces.application.ViewHandlerWrapper.renderView(ViewHandlerWrapper.java:189)
         at org.apache.myfaces.trinidadinternal.application.ViewHandlerImpl.renderView(ViewHandlerImpl.java:193)
         at oracle.adfinternal.view.faces.lifecycle.LifecycleImpl._renderResponse(LifecycleImpl.java:710)
         at oracle.adfinternal.view.faces.lifecycle.LifecycleImpl._executePhase(LifecycleImpl.java:273)
         at oracle.adfinternal.view.faces.lifecycle.LifecycleImpl.render(LifecycleImpl.java:205)
         at javax.faces.webapp.FacesServlet.service(FacesServlet.java:266)
         at weblogic.servlet.internal.StubSecurityHelper$ServletServiceAction.run(StubSecurityHelper.java:227)
         at weblogic.servlet.internal.StubSecurityHelper.invokeServlet(StubSecurityHelper.java:125)
         at weblogic.servlet.internal.ServletStubImpl.execute(ServletStubImpl.java:300)
         ... 34 more
    ]]

    Hello, Anuj.
    Thanks for getting back to me. I did try with another browser.
    Further investigation tells me that this issue is somehow caused by the domain creation process when I create a single-server domain with both SOA Suite and OSB. When I create a separate domain for OSB, all works as expected (as far as I can tell).
    Regards
    Michael

  • Error occurred in Data selection

    Hi All,
    We have a delta load step for 0FIGL_O02 in one of our process chains. This step failed with the error message:
    Error message from the source system
    Diagnosis
    An error occurred in the source system.
    System Response
    Caller 09 contains an error message.
    On getting into the monitor, the Status tab says "Error occurred in Data Selection".
    The message text is:
    Delta request REQU_D5XVIKL0GSIIZAWDIOB9HIOOQ is incorrect in the monitor
    A repeat needs to be requested
    Data target 0FIGL_O02 still includes delta request REQU_D5XVIKL0GSIIZAWDIOB9HIOOQ
    If the repeat is now loaded, then after the load
    there could be duplicate data in the data targets
    Request the repeat?
    Please advise.
    Thanks,
    Shweta

    Hi
    Repeat delta is only available for transactional DataSources.
    For a transactional DataSource the system stores two fields:
    1. Last delta
    2. Repeat delta
    If we set the request status to 'not OK' and save it, the BW system updates the repeat delta field; for the next data extraction the system then treats the request as a repeat delta and starts from the repeat delta value instead of the last delta.
    For a normal delta request from BW, the system takes the last delta field into consideration (see the sketch below for where these pointers can be looked up).
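    A minimal ABAP sketch of where these pointers can be inspected, assuming the standard delta administration table ROOSPRMSC and the data-mart DataSource name 80FIGL_O02 (both are assumptions to verify in your system; SE16 on the table shows the same information):
    * Hedged sketch: list the delta administration entries for one DataSource.
    * ROOSPRMSC and the export DataSource name 80FIGL_O02 are assumptions -
    * check both in your system before relying on this.
    report zshow_delta_admin.
    parameters p_ds type roosprmsc-oltpsource default '80FIGL_O02'.
    data: lt_prm type standard table of roosprmsc,
          lv_cnt type i.
    select * from roosprmsc into table lt_prm
      where oltpsource = p_ds.
    describe table lt_prm lines lv_cnt.
    write: / 'Delta administration entries for', p_ds, ':', lv_cnt.
    * The last-delta and repeat-delta request numbers are fields of these rows.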
    Hope the link below makes this clearer:
    /people/aaron.wang3/blog/2007/04/11/repair-bad-data-and-subsequent-data-targets-with-delta-update
    Many thanks
    Kiran

  • Error occurred in data selection!

    Hi,
    The data request failed in my case when I scheduled the data load.
    In the Details tab of the monitor screen, under Extraction, it says "error occurred in data selection".
    While scheduling, I had given only one selection range out of the many available fields.
    Is it mandatory that we select values for all the available selection fields?
    Can anybody here tell me what the problem could be?
    Ravi

    This would occur when the corresponding BW job is still running on the R/3 side.
    How to identify that? (A small sketch for listing such jobs programmatically follows below.)
    1] Find the job in SM37 in the BW system.
    2] Double-click the job and choose 'Job details' from the toolbar.
    3] From the pop-up screen, note down the server and the job ID.
    4] Log on to R/3, transaction SM51, and double-click the server noted in the step above.
    5] A list of running jobs appears; find the job by the job ID from step 3, double-click it and click 'Cancel program' to kill the job.
    NOTE - if the job is already finished, it will not appear in the list.
    Now you can repeat the load.
    This works, I have tried it.
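    For reference, a minimal ABAP sketch of the same check done programmatically, assuming the standard job status table TBTCO and the usual BIREQU_* naming of BW request jobs (worth verifying in your system); cancelling the job still has to be done in SM37/SM50:
    * Hedged sketch: list active (status 'R') BW extraction jobs so you can
    * decide which one to cancel in SM37/SM50. TBTCO is the standard job
    * status overview table; the BIREQU_ name pattern is an assumption.
    report zlist_active_bw_jobs.
    data: lt_jobs type standard table of tbtco,
          ls_job  type tbtco.
    select * from tbtco into table lt_jobs
      where jobname like 'BIREQU_%'
        and status = 'R'.                        "R = job is running
    loop at lt_jobs into ls_job.
      write: / ls_job-jobname, ls_job-jobcount, ls_job-status.
    endloop.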

  • ERROR: Occurred in data selection in Extractor

    Hi ,
    When I am executing a master data attributes InfoPackage from the SRM system to BI, I am getting the following error on the RSMO screen:
    Error message from the source system
    Diagnosis
    An error occurred in the source system.
    System Response
    Caller 09 contains an error message.
    Further analysis:
    The error occurred in Extractor .
    Refer to the error message.
    Procedure
    How you remove the error depends on the error message.
    Note
    If the source system is a Client Workstation, then it is possible that the file that you wanted to load was being edited at the time of the data request. Make sure that the file is in the specified directory, that it is not being processed at the moment, and restart the request.
    I have replicated the datasources from the source system and activated the Transfer rules, but still the error continues.
    But when I check the error messages, they show:
    1) You are not authorized to extract from DataSource 0BP_RELATIONS_ATTR in application component 0CA_BP-IO.
    2) Errors in source system.
    Can you please throw some light on this issue? When I consult the Basis consultant regarding authorization, he says he has given SAP_ALL authorization, which means I have all authorizations. But the error is still occurring.
    Thanks,
    Amar

    Hello!
    1) Go to transaction SM59 and check that the RFC connection between the source system and the BW system is working correctly. Check that the user ID BIWREMOTE_ is set up correctly here for RFC communication (a quick programmatic check is sketched after this list).
    2) Go to transaction SU01D and verify that user ID BIW_REMOTE is unlocked and has ALL set in the Roles tab.
    3) Go to transaction SM58 and check whether there are any pending entries for user BIW_REMOTE.
    4) Verify that the IDoc sweeper jobs (inbound and outbound) are running correctly to flush the queues.
    My gut feeling is that the background users in the R/3 system do not have sufficient access set up correctly. Please post when the issue is resolved.
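    The quick programmatic check mentioned in point 1 - a minimal ABAP sketch using the standard function module RFC_PING; the destination name SRM_SOURCE is a placeholder for your actual SM59 entry:
    * Hedged sketch: ping an RFC destination, similar to the SM59 connection
    * test. Replace SRM_SOURCE with the real destination name.
    report zrfc_quick_check.
    data: lv_dest type rfcdest value 'SRM_SOURCE',
          lv_msg  type c length 255.
    call function 'RFC_PING'
      destination lv_dest
      exceptions
        communication_failure = 1 message lv_msg
        system_failure        = 2 message lv_msg
        others                = 3.
    if sy-subrc = 0.
      write: / 'RFC destination', lv_dest, 'is reachable.'.
    else.
      write: / 'RFC check failed:', lv_msg.
    endif.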
    Thanks
    Vasu

  • Capturing the details of data for which an error occurred during upload

    Hi, can anyone tell me how to capture the details of the data for which an error occurred while uploading the file using BDC in background mode? Please do the needful.
    Thanks & Regards.
    Aniruddha

    Hi,
    This declaration is used to capture the error messages; BDCMSGCOLL is the standard structure that holds them. I have given sample code below; please check it out.
    data: err type standard table of bdcmsgcoll with header line.
    Sample code:
    report z_aru_bdc_new4
           no standard page heading line-size 255.
    include bdcrecx1.
    *** Generated data section with specific formatting - DO NOT CHANGE ***
    parameters: p_file like rlgrap-filename obligatory.
    data: err type standard table of bdcmsgcoll with header line.
    data: mess(200) type c.
    data: begin of it_err occurs 0,
    msg(200) type c,
    end of it_err.
    data: begin of record occurs 0,
    * data element:
            viewname_001(030),
    * data element: VIM_LTD_NO
            ltd_dta_no_002(001),
    * data element: ZEMPID3
            zempid3_003(004),
    * data element: ZENAME3
            zename3_008(040),
    * data element: ZEDEPID
            zedepid_009(004),
    * data element:
            zsalkey_010(005),
    * data element:
            salary_011(013),
          end of record.
    *** End generated data section ***
    start-of-selection.
    at selection-screen on value-request for p_file.
    call function 'WS_FILENAME_GET'
      exporting
    *   DEF_FILENAME           = ' '
    *   DEF_PATH               = ' '
        mask                   = '*.*,*.*.'
        mode                   = 'O'  " O -- open, S -- Save.
        title                  = 'OPEN'
      importing
        filename               = p_file
    *   RC                     =
      exceptions
        inv_winsys             = 1
        no_batch               = 2
        selection_cancel       = 3
        selection_error        = 4
        others                 = 5.
    start-of-selection.
    call function 'UPLOAD'
      exporting
    *   CODEPAGE                      = ' '
        filename                      = p_file
        filetype                      = 'DAT'
    *   ITEM                          = ' '
    *   FILEMASK_MASK                 = ' '
    *   FILEMASK_TEXT                 = ' '
    *   FILETYPE_NO_CHANGE            = ' '
    *   FILEMASK_ALL                  = ' '
    *   FILETYPE_NO_SHOW              = ' '
    *   LINE_EXIT                     = ' '
    *   USER_FORM                     = ' '
    *   USER_PROG                     = ' '
    *   SILENT                        = 'S'
    * IMPORTING
    *   FILESIZE                      =
    *   CANCEL                        =
    *   ACT_FILENAME                  =
    *   ACT_FILETYPE                  =
      tables
        data_tab                      = record
      exceptions
        conversion_error              = 1
        invalid_table_width           = 2
        invalid_type                  = 3
        no_batch                      = 4
        unknown_error                 = 5
        gui_refuse_filetransfer       = 6
        others                        = 7.
    if sy-subrc <> 0.
    MESSAGE ID SY-MSGID TYPE SY-MSGTY NUMBER SY-MSGNO
            WITH SY-MSGV1 SY-MSGV2 SY-MSGV3 SY-MSGV4.
    endif.
    *perform open_dataset using dataset.
    *perform open_group.
    delete record index 1.
    loop at record.
    *read dataset dataset into record.
    *if sy-subrc <> 0. exit. endif.
    perform bdc_dynpro      using 'SAPMSVMA' '0100'.
    perform bdc_field       using 'BDC_CURSOR'
                                  'VIEWNAME'.
    perform bdc_field       using 'BDC_OKCODE'
                                  '=UPD'.
    perform bdc_field       using 'VIEWNAME'
                                  record-viewname_001.
    perform bdc_field       using 'VIMDYNFLDS-LTD_DTA_NO'
                                  record-ltd_dta_no_002.
    perform bdc_dynpro      using 'SAPLZSHAP' '0001'.
    perform bdc_field       using 'BDC_CURSOR'
                                  'ZEMPTAB1-ZEMPID3(01)'.
    perform bdc_field       using 'BDC_OKCODE'
                                  '=NEWL'.
    perform bdc_dynpro      using 'SAPLZSHAP' '0002'.
    perform bdc_field       using 'BDC_CURSOR'
                                  'ZEMPTAB1-SALARY'.
    perform bdc_field       using 'BDC_OKCODE'
                                  '=SAVE'.
    perform bdc_field       using 'ZEMPTAB1-ZEMPID3'
                                  record-zempid3_003.
    perform bdc_field       using 'ZEMPTAB1-ZENAME3'
                                  record-zename3_008.
    perform bdc_field       using 'ZEMPTAB1-ZEDEPID'
                                  record-zedepid_009.
    perform bdc_field       using 'ZEMPTAB1-ZSALKEY'
                                  record-zsalkey_010.
    perform bdc_field       using 'ZEMPTAB1-SALARY'
                                  record-salary_011.
    perform bdc_dynpro      using 'SAPLZSHAP' '0002'.
    perform bdc_field       using 'BDC_CURSOR'
                                  'ZEMPTAB1-ZENAME3'.
    perform bdc_field       using 'BDC_OKCODE'
                                  '=UEBE'.
    perform bdc_field       using 'ZEMPTAB1-ZENAME3'
                                  record-zename3_008.
    perform bdc_field       using 'ZEMPTAB1-ZEDEPID'
                                  record-zedepid_009.
    perform bdc_field       using 'ZEMPTAB1-ZSALKEY'
                                  record-zsalkey_010.
    perform bdc_field       using 'ZEMPTAB1-SALARY'
                                  record-salary_011.
    perform bdc_dynpro      using 'SAPLZSHAP' '0001'.
    perform bdc_field       using 'BDC_CURSOR'
                                  'ZEMPTAB1-ZEMPID3(01)'.
    perform bdc_field       using 'BDC_OKCODE'
                                  '=SAVE'.
    perform bdc_dynpro      using 'SAPLZSHAP' '0001'.
    perform bdc_field       using 'BDC_CURSOR'
                                  'ZEMPTAB1-ZEMPID3(01)'.
    perform bdc_field       using 'BDC_OKCODE'
                                  '=BACK'.
    perform bdc_dynpro      using 'SAPMSVMA' '0100'.
    perform bdc_field       using 'BDC_OKCODE'
                                  '/EBACK'.
    perform bdc_field       using 'BDC_CURSOR'
                                  'VIEWNAME'.
    perform bdc_transaction using 'SM30'.
    *enddo.
    *perform close_group.
    *perform close_dataset using dataset.
    endloop.
    loop at it_err.
    write : / it_err-msg.
    endloop.
    form error.
    call function 'FORMAT_MESSAGE'
    exporting
       id              = sy-msgid
       lang            = '-D'
       no              = sy-msgno
       v1              = sy-msgv1
       v2              = sy-msgv2
       v3              = sy-msgv3
       v4              = sy-msgv4
    importing
       msg             = mess
    exceptions
       not_found       = 1
        others          = 2.
    if sy-subrc <> 0.
    MESSAGE ID SY-MSGID TYPE SY-MSGTY NUMBER SY-MSGNO
            WITH SY-MSGV1 SY-MSGV2 SY-MSGV3 SY-MSGV4.
    endif.
    it_err-msg = mess.
    append it_err.
    clear it_err.
    endform.
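    For reference, the err table above is only filled if the transaction is called with the MESSAGES INTO addition (in the standard include bdcrecx1 this happens inside bdc_transaction). A minimal sketch of that pattern, under the assumption that you call the transaction yourself with the bdcdata table declared in bdcrecx1:
    * Hedged sketch: call the transaction directly, collect its messages in
    * ERR (type BDCMSGCOLL) and format them into IT_ERR for the error log.
    call transaction 'SM30' using bdcdata
                            mode 'N'              "background, no screens
                            update 'S'            "synchronous update
                            messages into err.
    loop at err.
      call function 'FORMAT_MESSAGE'
        exporting
          id        = err-msgid
          lang      = '-D'                        "same setting as in form error
          no        = err-msgnr
          v1        = err-msgv1
          v2        = err-msgv2
          v3        = err-msgv3
          v4        = err-msgv4
        importing
          msg       = mess
        exceptions
          not_found = 1
          others    = 2.
      it_err-msg = mess.
      append it_err.
      clear it_err.
    endloop.
    refresh err.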
    If you use the session method, you can find the session in SM35, from where you can get the error log.
    Hope this helps,
    Regards,
    Arunsri

  • Error occurred during selection for the DS 0CRM_SALES_ACT_1 in RSA3.

    Hi Experts,
    I have transported the DS 0CRM_SALES_ACT_1 from the Dev to the QA system, and in RSA3 it throws an error message saying 'error occurred during selection'.
    I have checked the respective tables of this DataSource and all the records are there; I also checked the table SMOXHEAD_S, and it does not contain this DataSource.
    Any idea why the error occurs? Is it because the table SMOXHEAD_S is empty for this DataSource?
    OR
    Shall I activate this DS using transaction RSA5 in the development system, then create a new request and transport the request to the QA system? Should we do this in the development system?
    Thanks in advance for your suggestions.
    Regs,
    VACHAN

    Hi Vachan
    The entry in SMOXHEAD_S for your DataSource must match the entry in SMOXHEAD exactly. Check in your dev box to be sure - the entry is there, but it is missing in QA.
    I wrote this program, which copies the entry from SMOXHEAD to SMOXHEAD_S.
    Run the program at your own risk in your CRM QA box. If the entry is already there, it will do nothing.
    Please post to say if it works for you.
    Good luck,
    Guy
    report  zbi_fix_extractor.
    data: ld_datasource type smoxhead-datasource value '0CRM_SALES_ACT_1',
            ls_smoxhead   type smoxhead.
    * check that there is no entry in smoxhead_s for this datasource yet
      select single * from smoxhead_s into ls_smoxhead
        where datasource = ld_datasource.
      if sy-subrc ne 0.
    *..only do if there was no entry in the shadow table
        select single * from smoxhead into ls_smoxhead
          where datasource = ld_datasource.
        if sy-subrc eq 0.
           insert into smoxhead_s values ls_smoxhead.
        else.
           write 'no data available in SMOXHEAD for datasource'.
        endif.
      else.
          write 'no change made: entry already in SMOXHEAD_S'.
      endif.

  • Excel SSAS Tabular error: An error occurred during an attempt to establish a connection to the external data source

    Hello there,
    I have an Excel report I created which works perfectly fine on my dev environment, but fails on my test environment when I try to do a data refresh.
    The key difference between both dev and test environments is that in dev, everything is installed in one server:
    SharePoint 2013
    SQL 2012: Database Instance, SSAS Instance, SSRS for SharePoint, SSAS POWERPIVOT instance (Powerpivot for SharePoint).
    In my test and production environments, the architecture is different:
    SQL DB Servers in High Availability (irrelevant for this report since it is connecting to the tabular model, just FYI)
    SQL SSAS Tabular server (contains a tabular model that processes data from the SQL DBs).
    2x SharePoint Application Servers (we installed both SSRS and PowerPivot for SharePoint on these servers)
    2x SharePoint FrontEnd Servers (contain the SSRS and PowerPivot add-ins).
    Now in dev, test and production, I can run PowerPivot reports that have been created in SharePoint without any issues. Those reports can access the SSAS Tabular model without any issues, and perform data refresh and OLAP functions (slicing, dicing, etc).
    The problem is with Excel reports (i.e. .xlsx files) uploaded to SharePoint. While I can open them, I am having a hard time performing a data refresh. The error I get is:
    "An error occurred during an attempt to establish a connection to the external data source [...]"
    I ran SQL Profiler on my SSAS server where the Tabular instance is, and every time I try to perform a data refresh the trace shows two entries under the user name ANONYMOUS LOGON.
    Since things work without any issues on my single-server dev environment, I ran SQL Server Profiler there as well to see what I get.
    There, the query runs without any issues and the user name logged is in fact my username from the dev environment domain. I also have a separate user for the test domain, and another for the production domain.
    Now upon some preliminary investigation I believe this has something to do with the data connection settings in Excel and the usage (or no usage) of secure store. This is what I can vouch for so far:
    Library containing reports is configured as trusted in SharePoint Central Admin.
    Library containing data connections is configured as trusted in SharePoint Central Admin.
    The Data Provider referenced in the Excel report (MSOLAP.5) is configured as trusted in SharePoint Central Admin.
    In the Excel report, the Excel Services authentication setting is set to "use authenticated user's account". This works fine in the DEV environment.
    Concerning Secure Store, the PowerPivot Configurator has configured the PowerPivotUnattendedAccount application ID in all the environments. There is NO configuration of an application ID for Excel Services in any of the environments (dev, test or production). Although I reckon this is where the solution lies, I am not 100% sure why it fails in test and prod. But as I read what I am writing, I reckon this is because of the authentication "hops" through servers. Am I right in my assumption?
    Could someone please advise what I am doing wrong in this case? If the problem is that I am missing a Secure Store entry for Excel Services, I am wondering if someone could advise me on how to set it up? My confusion is around the "Target Application Type" setting.
    Thank you for your time.
    Regards,
    P.

    Hi Rameshwar,
    PowerPivot workbooks contain embedded data connections. To support workbook interaction through slicers and filters, Excel Services must be configured to allow external data access through embedded connection information. External data access is required
    for retrieving PowerPivot data that is loaded on PowerPivot servers in the farm. Please refer to the steps below to solve this issue:
    In Central Administration, in Application Management, click Manage service applications.
    Click Excel Services Application.
    Click Trusted File Location.
    Click http:// or the location you want to configure.
    In External Data, in Allow External Data, click Trusted data connection libraries and embedded.
    Click OK.
    For more information, please see:
    Create a trusted location for PowerPivot sites in Central Administration:
    http://msdn.microsoft.com/en-us/library/ee637428.aspx
    Another reason is Excel Services returns this error when you query PowerPivot data in an Excel workbook that is published to SharePoint, and the SharePoint environment does not have a PowerPivot for SharePoint server, or the SQL Server Analysis
    Services (PowerPivot) service is stopped. Please check this document:
    http://technet.microsoft.com/en-us/library/ff487858(v=sql.110).aspx
    Finally, here is a good article regarding how to troubleshoot PowerPivot data refresh for your reference. Please see:
    Troubleshooting PowerPivot Data Refresh:
    http://social.technet.microsoft.com/wiki/contents/articles/3870.troubleshooting-powerpivot-data-refresh.aspx
    Hope this helps.
    Elvis Long
    TechNet Community Support

  • Processing data for a remote command failed with the following error message: Error occurred during the Kerberos response.

    Hi!
    We have 5 Exchange 2013 servers, and when I'm trying to run a script that includes the cmdlet Get-MessageTrackingLog with StartDate = the first of the month and EndDate = the end of the month, I get the following error message after
    a couple of hours. The script is run on the server SERVER01 and goes through the message tracking logs of all Exchange servers. I have tried the script on other servers and get the same result.
    Processing data for a remote command failed with the following error message: Error occurred during the Kerberos response.
    [Server=SERVER01, TimeStamp = 918/2014 19:32:34]
    For more information, see the about_Remote_Troubleshooting Help topic.
        + CategoryInfo         
    : OperationStopped: (server01.domain.com:String) [], PSRemotingTransportException
        + FullyQualifiedErrorId : JobFailure
        + PSComputerName       
    : server01.domain.com
    I have gone through “about_Remote_Troubleshooting Help topic”, but can’t find anything related to my issue. There is nothing in the Application or the Windows PowerShell log either.
    /Henrik

    Hi Henado 
    Check the time on your Exchange server(s) relative to the DCs and ensure they are correctly in sync.
    Use another account which is assigned the Organization Management permission, log on to the server, run the command in the shell and see the results.
    Run:
    Add-PSSnapin -Name Microsoft.Exchange.Management.PowerShell.E2010
    Set-AdServerSettings -ViewEntireForest $True
    and then run the message tracking command and see the results.
    If none of the above helps, you may need to remove and re-install WinRM and see the results.
    Good Luck :)
    Remember to mark as helpful if you find my contribution useful or as an answer if it does answer your question.That will encourage me - and others - to take time out to help you Check out my latest blog posts on http://exchangequery.com

  • Errors occurred during the extraction for CFM data sources in RSA3

    Hi Friends,
    When I try to execute the DataSources below in the Extractor Checker (RSA3):
    0CFM_DELTA_POSITIONS
    0CFM_INIT_POSITIONS
    I'm getting the error message "Errors occurred during the extraction".
    This is my first error; the second problem I'm having is this:
    When I try to execute the DataSources given below:
    0CFM_RDB_AN_1
    0CFM_RDB_AN_2
    0CFM_RDB_AN_3
    0CFM_RDB_AN_4
    0CFM_RDB_AN_5
    0CFM_RDB_AN_6
             It is asking "Portfolio Hierarchy" in RSA3 selections, even though i've activated portfolio hierarchy, i can't able to get that hierarchy in Extractor Checker Selections (RSA3), unless i get that hierarchy, i can't be able to excute any datasource in extractor checker.
    Awaiting for your inputs with curiousity.
    Thanx in advance,
    BalajiReddy

    Have a look at the following OSS Notes :
    1029605
    808387
    995660
    594272

  • Errors occurred during the extraction -  CRM Data Source

    Hello Experts
    I am working on the CRM DataSource 0CRM_SRV_PROCESS_H. When I try to run the InfoPackage for the init, I get a short dump "MESSAGE_TYPE_X" in "SAPLRSATREE" / "LRSATREEF28", form "RSATREE_START_ACTIVATION", and the log says "Source System Error". To cross-verify I tried the RSA3 extractor checker and found the error "Errors occurred during the extraction". This is happening in the production system, whereas the same works perfectly in Dev.
    Basically, I am using a BI instance within the Solution Manager system. I need your help urgently.
    Regards
    Aftab

    HI,
    Can you check whether the source system connection is active: go to RSA1 > Source Systems tab > right-click the CRM source system > Check. Is the connection active?
    Also check whether an init has already been done. If you have transported the InfoPackage as well, go into the InfoPackage scheduling screen and, in the Scheduler menu, check the initialization options for the source system to see whether there is already an init request; if there is, try deleting it. If you cannot follow this, check in the InfoPackage scheduling screen, under the Update tab, whether you are able to see the delta option.
    Let me know if it works.
    Cheers,
    Vikram

  • An error occurred during olap API metadata retrieval. This is probably caused by inconsistent metadata.

    This is what I have done so far. I really need some help ASAP!
    1. Install Oracle 9i Release 2 (9.2.0.1) Enterprise Edition with the General purpose database configuration (Data warehouse works as well). At the end of the installation I chose the password management button to change passwords for the few necessary accounts: SYS, SYSTEM, OLAPSYS, SH.
    2. Download the p3948480_9206_WINNT.zip from metalink, the 9.2.0.6 patchset
    3. Shut down any existing Oracle9i database instances with normal or immediate priority. Stop all listener and other services running in the Oracle home directory where you want to install the patch set.
    4. Unzip the content of the patch to a temp directory
    5. start setup.exe under the temp directory (it will start the Oracle Universal installer 10.1.0.3)
    6. install the patchset to your Oracle home, selecting the source in the temp_dir\stage\products.jar file
    7. review carefully the post-installation tasks for the patchset:
    Review the following sections before upgrading a database (quote from the patchset html readme):
    8.2.1.1 If JServer is part of the installation, ensure that there is at least 10 MB of free space allocated to the SYSTEM tablespace.
    8.2.1.2 Check XDB Tablespace Size
    For RAC installations, ensure that there is at least 50 MB of free space allocated to the XDB tablespace.
    8.2.1.3 Set the SHARED_POOL_SIZE and JAVA_POOL_SIZE Initialization Parameters
    Set the value of the SHARED_POOL_SIZE and the JAVA_POOL_SIZE initialization parameters as follows:
    Start the database:
    SQL> STARTUP
    If necessary, enter the following command to determine whether the system uses an initialization parameter file (initsid.ora) or a server parameter file (spfiledbname.ora):
    SQL> SHOW PARAMETER PFILE;
    This command displays the name and location of the server parameter file or the initialization parameter file.
    Determine the current values of these parameters:
    SQL> SHOW PARAMETER SHARED_POOL_SIZE
    SQL> SHOW PARAMETER JAVA_POOL_SIZE
    If the system is using a server parameter file:
    If necessary, set the value of the SHARED_POOL_SIZE initialization parameter to at least 150 MB:
    SQL> ALTER SYSTEM SET SHARED_POOL_SIZE='150M' SCOPE=spfile;
    If necessary, set the value of the JAVA_POOL_SIZE initialization parameter to at least 150 MB:
    SQL> ALTER SYSTEM SET JAVA_POOL_SIZE='150M' SCOPE=spfile;
    If the system uses an initialization parameter file, if necessary, change the values of the SHARED_POOL_SIZE and the JAVA_POOL_SIZE initialization parameters to at least 150 MB in the initialization parameter file (initsid.ora).
    Shut down the database:
    SQL> SHUTDOWN
    8.2.2 Upgrade the Database
    After you install the patch set, you must complete the following steps on every database associated with the upgraded Oracle home:
    Log on as a member of the Administrators group to the computer where the Oracle components are installed.
    Use SQL*Plus to login to the database as the SYS user with SYSDBA privileges:
    sqlplus /NOLOG
    CONNECT SYS/password AS SYSDBA
    Enter the following SQL*Plus commands:
    SQL> STARTUP MIGRATE
    SQL> SPOOL patch.log
    SQL> @ORACLE_BASE\ORACLE_HOME\rdbms\admin\catpatch.sql
    SQL> SPOOL OFF
    Restart the database:
    SQL> SHUTDOWN
    SQL> STARTUP
    Run the utlrp.sql script to recompile all invalid PL/SQL packages now instead of when the packages are accessed for the first time. This step is optional but recommended.
    SQL> @ORACLE_BASE\ORACLE_HOME\rdbms\admin\utlrp.sql
    12. Install JDeveloper 9.0.4 (download it from OTN and just unzip it in a directory ... it doesn't require an oracle home)
    13. Install BI Beans 9.0.4 (download it from OTN as well), run the setup.exe that comes with it and in the destination oracle home select the directory where you installed JDeveloper and give an oracle home name to it)
    14. Install the BIBDEMO schema:
    Create a directory on the computer that is running the Oracle9i database. This install_home directory is the location to which you will upload the data files that are required to build the BIBDEMO schema.
    On the computer where BI Beans is installed, locate the bibeans_home\bibdemo_schema folder (where bibeans_home is the root folder of your BI Beans installation). Copy all of the files found in this folder to the install_home folder on your server machine.
    Open a DOS prompt and navigate to the install_home folder.
    Run bibdemo.bat to install the schema, using the following syntax:
    bibdemo.bat <path to Oracle database files >
    For example, for an instance named my9iService, enter the following:
    bibdemo.bat D:\OraHome1\oradata\my9iService
    You are prompted for the password for the sys as sysdba user.
    The script takes approximately 15 minutes to run, depending on the machine specifications. It is normal to see some error messages while the script is running. In addition, when materialized views are being created in the database, the script will appear to stop; this is also normal. A clear message will tell you when the script has completed.
    The log files (*.log) that are generated by the installation script are stored in the folder from which you ran the script.
    Here's what
    bi_checkconfig.bat -h ana -po 1521 -sid proiect -u bibdemo -p bibdemo -q
    returned:
    BI Beans Diagnostics(v1.0.2.0) 2/28/05
    ===============================================================================
    JDEV_ORACLE_HOME .......................... = E:\OraDS
    JAVA_HOME ................................. = E:\OraDS\jdk
    JDeveloper version ........................ = 9.0.4.1.1.1436
    BI Beans release description .............. = BI Beans 9.0.4 Production Release
    BI Beans component number ................. = 9.0.4.23.0
    BI Beans internal version ................. = 2.7.5.32
    Connect to database ....................... = Successful
    JDBC driver version ....................... = 9.2.0.4.0
    JDBC JAR file location .................... = E:\OraDS\jdev\lib\patches
    Database version .......................... = 9.2.0.6.0
    OLAP Catalog version ...................... = 9.2.0.1.0
    OLAP AW Engine version .................... = 9.2.0.1.0
    OLAP API Server version ................... = 9.2.0.1.0
    BI Beans Catalog version .................. = N/A; not installed in bibdemo
    OLAP API JAR file version ................. = 9.2
    OLAP API JAR file location ................ = E:\OraDS\jdev\lib\ext
    Load OLAP API metadata .................... = Successful
    Number of metadata folders ................ = 2
    Number of metadata measures ............... = 12
    Number of metadata dimensions ............. = 8
    Testing sample query for measures and dimensions
    (S=Schema, C=Cube, M=Measure, D=Dimension)
    1/21) Measure Budget ................... = Successful
    S=BIBDEMO, C=BIBDEMO_BUDGET_CUBE, M=BUDGET
    2/21) Measure Actual ................... = Successful
    S=BIBDEMO, C=BIBDEMO_ACTUAL_CUBE, M=ACTUAL
    3/21) Measure Close Price .............. = Successful
    S=BIBDEMO, C=BIBDEMO_STKPRICE_CUBE, M=STKPRICE_CLOSE
    4/21) Measure Open Price ............... = Successful
    S=BIBDEMO, C=BIBDEMO_STKPRICE_CUBE, M=STKPRICE_OPEN
    5/21) Measure Low Price ................ = Successful
    S=BIBDEMO, C=BIBDEMO_STKPRICE_CUBE, M=STKPRICE_LOW
    6/21) Measure High Price ............... = Successful
    S=BIBDEMO, C=BIBDEMO_STKPRICE_CUBE, M=STKPRICE_HIGH
    7/21) Measure Stock Volume ............. = Successful
    S=BIBDEMO, C=BIBDEMO_STKPRICE_CUBE, M=STKPRICE_VOLUME
    8/21) Dimension Division ............... = Successful
    S=BIBDEMO, D=DIVISION
    9/21) Dimension Line Items ............. = Successful
    S=BIBDEMO, D=LINE
    10/21) Dimension Time ................... = Successful
    S=BIBDEMO, D=TIME
    11/21) Dimension Day .................... = Successful
    S=BIBDEMO, D=DAY
    12/21) Dimension Stock .................. = Successful
    S=BIBDEMO, D=STOCK
    13/21) Measure Costs .................... = Successful
    S=BIBDEMO, C=ANALYTIC_CUBE, M=F.COSTS
    14/21) Measure Promotion ................ = Successful
    S=BIBDEMO, C=ANALYTIC_CUBE, M=F.PROMO
    15/21) Measure Quota .................... = Successful
    S=BIBDEMO, C=ANALYTIC_CUBE, M=F.QUOTA
    16/21) Measure Units .................... = Successful
    S=BIBDEMO, C=ANALYTIC_CUBE, M=F.UNITS
    17/21) Measure Sales .................... = Successful
    S=BIBDEMO, C=ANALYTIC_CUBE, M=F.SALES
    18/21) Dimension Channel ................ = Successful
    S=BIBDEMO, D=CHANNEL
    19/21) Dimension Geography .............. = Successful
    S=BIBDEMO, D=GEOGRAPHY
    20/21) Dimension Product ................ = Successful
    S=BIBDEMO, D=PRODUCT
    21/21) Dimension Time ................... = Successful
    S=BIBDEMO, D=TIME
    Metadata output location .................. = E:\OraDS\bibeans\bi_checkconfig\bi_metadata.txt
    To interpret this output, see the "Displaying Information about your Oracle9i Business Intelligence Beans Client Configuration" technical note, whose file name is bi_checkconfig_tn.html
    These diagnostics are captured in: E:\OraDS\bibeans\bi_checkconfig\bi_checkconfig.xml
    Now I have created some new things:
    1) User ana with roles:
    - dba
    - olap_dba
    - connect
    - resource
    (same roles as bibdemo)
    2) Schema ana; tablespace ana (permanent), tablespace anatemp (temporary)
    3) I have created some relational tables and inserted some data into them:
    agent, aparat (coffee machines), beneficiar (clients), locatii (city), raport (report), timp (time), tipaparat (types of coffee machines), tipbautura (products: types of coffee made by all the coffee machines), zone (state)
    4) One fact table with:
    - sold quantity (measure)
    - id_bautura (id_product) primary key
    - id_timp (id_time) primary key
    - id_beneficiar (id_client) primary key
    - id_agent primary key
    - id_locatie (id_city) primary key
    - id_aparat (id_coffee_machine) primary key
    I have inserted some data as well.
    5) Dimensions:
    AGENT_DIM: levels codag (id_agent), numeag (agent name), telefag (agent phone number), from the relational table agent
    BENEFICIAR_DIM: levels codben (id_client), denumire (client name), adresa (address), codl (id_city), etc., from the relational table beneficiar (clients)
    TIMP_DIM: levels id, year, month, from the relational table timp (time)
    TIPBAUTURA_DIM: levels codbautura (id_product), numebautura (product name), from the relational table tipbautura (products)
    ZONA_DIM: levels codzona (id_state), numezona (state name), codoras (id_city), numeoras (city name), with hierarchy id_state -> id_city, built from two relational tables, city and state. Am I allowed to do that? Do I need to create a dimension from only one table?
    APARAT_DIM: levels codben (id_client), codtip (id_machine_type), denumireap (machine type name), matricolap (machine id), also built from two relational tables (types of coffee machines and coffee machines)
    6) I have now created the cube from the fact table and with all the dimensions.
    7) The Summary Advisor wizard is not working; it never stops.
    8) I have also created one materialized view for the cube. If I compile it, there are no errors.
    9) The Cube Viewer is not working; only a bell appears.
    Now if I run bi_checkconfig on ANA and also on BIBDEMO, it says:
    1) An error occurred during olap API metadata retrieval. This is probably caused by inconsistent metadata.
    ============================================================================
    oracle.express.ExpressServerExceptionError class: Unknown Error
    Server error descriptions:
    INI: System failure, Generic at TxsOqConnection::getDefaultDatabase
    at oracle.express.olapi.data.full.ExpressDataProvider.getMetadataProviderInterface(ExpressDataProvider.java:1003)
    at oracle.olapi.metadata.MetadataFetcher.initialize(MetadataFetcher.java:73)
    at oracle.olapi.metadata.MetadataFetcher.<init>(MetadataFetcher.java:45)
    at oracle.olapi.metadata.BaseMetadataProvider.<init>(BaseMetadataProvider.java:47)
    at oracle.olapi.metadata.mdm.MdmMetadataProvider.<init>(MdmMetadataProvider.java:130)
    at oracle.express.olapi.data.full.ExpressDataProvider.getDefaultMetadataProvider(ExpressDataProvider.java:964)
    at oracle.dss.metadataManager.server.drivers.mdm._92.MDMMetadataDriverImpl_92.getMdmMetadataProvider(MDMMetadataDriverImpl_92.java:1133)
    at oracle.dss.metadataManager.server.drivers.mdm._92.MDMMetadataDriverImpl_92.attach(MDMMetadataDriverImpl_92.java:810)
    at oracle.dss.metadataManager.server.drivers.mdm.MDMMetadataDriverImpl.attach(MDMMetadataDriverImpl.java:125)
    at oracle.dss.metadataManager.server.MetadataManagerImpl.buildObjectModel(MetadataManagerImpl.java:1092)
    at oracle.dss.metadataManager.server.MetadataManagerImpl.attach(MetadataManagerImpl.java:969)
    at oracle.dss.metadataManager.client.MetadataManager.attach(MetadataManager.java:876)
    at oracle.dss.metadataManager.client.MetadataManager.attach(MetadataManager.java:799)
    at BICheckConfig.checkConnection(BICheckConfig.java:277)
    at BICheckConfig.main(BICheckConfig.java:1348)
    I also tried with user ANA with roles:
    - DBA
    - CONNECT
    - RESOURCE
    - OLAP_USER
    Not working, and BIBDEMO is not working either.
    What am I missing? Should I use AW Manager, or do I need to create an analytic workspace?
    What are the steps to create good metadata?

    Hi,
    The issue here is if the whole catalog is corrupt or just one schema. So to try and determine the status of the catalog I would try:
    1) Using OEM remove all the objects you created
    2) I presume you created your database using the Database Configuration Assistant? You should have used the warehouse template
    3) Make sure the following accounts are unlocked and also not expired : SH, OLAPSYS
    4) Make sure the password for the SH schema is SH
    5) Make sure the password for the OLAPSYS account is manager
    6) Install the BIBDEMO schema that is shipped with BI Beans. This is in the jdev_home/bibeans/bibdemo_schema folder.
    The installation process will remove the SH schema from the OLAP catalog.
    7) Once this is installed, use JDeveloper to see if you can create a crosstab or graph.
    8) If the BIBDEMO schema works, try creating your new schemas one at a time.
    9) Make sure that if you define a dimension as type time, it has END_DATE (column type DATE) and TIME_SPAN (column type NUMBER) defined. Otherwise, don't define the dimension as type time.
    Hope this helps
    Keith Laker
    Product Manager
    Oracle Business Intelligence Beans

  • Error occurred during extraction from APO to BI 7.0

    Hi Experts,
    I am in the process of extracting data from SCM (APO Demand Planning) to BI. I have selected the relevant planning area 9ADP01, chosen Generate Export DataSource in the APO system, and replicated it in BI. But before scheduling the InfoPackage, when I checked the DS 9A0APO_DP_ORDERANALYTICS_1, it displays "Error occurred during extraction". I am in doubt whether I should activate the DS in RSA5 before replication (but I am not able to find the DS under DP in RSA5).
    I have checked the forums, but the solutions mentioned did not resolve my issue.
    Thanks,
    Meera

    Hi,
    The issue is resolved...
    Regards,
    Meera

  • The import was stopped, an error occurred during the phase DDIC_ACTIVATION

    Dear all,
    When I import the SAP_BASIS support package SAPKB70019, the following error appears:
    The import was stopped, since an error occurred during the phase
    DDIC_ACTIVATION, which the Support Package Manager is unable to resolve
    without your input.
    After you have corrected the cause of the error, continue with the
    import by choosing Support Package -> Import queue from the initial
    screen of the Support Package Manager.
    The following details help you to analyze the problem:
         -   Error in phase: DDIC_ACTIVATION
         -   Reason for error: TP_STEP_FAILURE
         -   Return code: 0008
         -   Error message: OCS Package SAPKB70019, tp step A, return code
             0008
    Notes on phase DDIC_ACTIVATION
    In this phase the imported Dictionary objects are activated. This phase
    can terminate due to the following reasons:
    o   TP_INTERFACE_FAILURE: The tp interface could not be called.
    o   TP_FAILURE: The program tp could not be executed. For more
         information, see the SLOG or ALOG log file.
    o   TP_STEP_FAILURE: The tp step DDIC activation could not be performed
         successfully. To see the cause of the error in the activation log,
         choose  Goto -> Log -> Queue.
         If you import two or more Support Packages in one queue, and you do
         not activate them in the correct sequence, activation errors may
         occur. In this case, the activation errors disappear when you repeat
         the activation run. To do this, choose Support Package -> Import
         Support Package Queue.
        A prerequisite of the Support Package Manager is that the Change and
        Transport System (CTS) is configured correctly. For more detailed
        information, read the online documentation available from Help -> SAP
        Library -> mySAP Technology Components -> SAP Web Application Server ->
        Change and Transport System .
        A list of the most important SAP Notes for Online Correction Support
        (OCS) is available in SAP Note 97620, which is updated regularly.
    I have used SE11 to activate the domain SEFS_CRAWL_ALGORITHM, but the error still exists.
    Hope anyone can help. Thanks!
    B.R
    Michael

    P762713 wrote:>
    > Hi ,
    >
    >     This is a general issue occurring nowadays in Support Packages. The only solution for this issue is to delete the structures:
    >
    > these were the structure deleted from system afterwards patches were installed successfully.
    >
    > /1PYXXFO/SAP_PAYSLIP_____L0001
    > /1PYXXFO/SAP_PAYSLIP_____T0001
    > /1PYXXFO/SAP_PAYSLIP
    >
    > /1PYXXFO/SAP_PAYSLIP_____L0020
    > /1PYXXFO/SAP_PAYSLIP_____T0020
    >
    > You can do this via transaction SE11: enter these names under Data type, and in turn you have to delete /1PYXXFO/SAP_PAYSLIP also. This solved my issue, and after this the support package went fine.
    >
    > Thanks..
    > Mohit
    >
    > Edited by: mohit gupta on Sep 23, 2009 9:15 AM
    Hi Mohit,
    I have tried to use transaction SE11 to delete the structures, but the system told me that none of them exist.
    Part of the queue log is below:
       Field name KEY is reserved (Do not use structure as include in DB table)
       Field name PRIOR is reserved (Do not use structure as include in DB table)
       Field KEY: Component type or domain used not active or does not exist
       Exactly one field SPRAS of type LANG: Selection as text language possible
      Nametab for table /1PYXXFO/SAP_PAYSLIP_____L0001 cannot be generated
       Enhancement category for table missing
       Enhancement category for include or subtype missing
       Field name KEY is reserved (Do not use structure as include in DB table)
       Field name PRIOR is reserved (Do not use structure as include in DB table)
       Field KEY: Component type or domain used not active or does not exist
       Nametab for table /1PYXXFO/SAP_PAYSLIP_____L0020 cannot be generated
       Enhancement category for table missing
       Enhancement category for include or subtype missing
       Field name KEY is reserved (Do not use structure as include in DB table)
       Field name PRIOR is reserved (Do not use structure as include in DB table)
       Exactly one field SPRAS of type LANG: Selection as text language possible
       Enhancement category for table missing
    Who can help me?
    Thank you very much.
    BR.
    Michael
    Edited by: Michael Hong on Sep 23, 2009 12:36 PM

  • MAIL: The upgrade failed - error occurred during upgrade. Launching Mail Import Assistant to repair the damaged index did not work

    After installing Yosemite, Mail required an upgrade. The upgrade completed but resulted in a message: the upgrade failed; an error occurred during the upgrade; check Continue to launch Mail Import Assistant.
    This resulted in an instruction: your mail index has been damaged; to repair it, quit Mail. Mail will repair the index the next time you open Mail.
    Following this instruction resulted in the same failure message being repeated continuously.
    How do I proceed to upgrade the Mail app?

    If doing as the alert directs (relaunching Mail) has no effect, see below.
    The new version of Mail stores its database in a different format than the old one, so the database has to be converted before Mail can use it. Sometimes the conversion fails because the old database is corrupt. The easy way to recover is to discard the old database and start afresh. You can do that without losing any messages, provided that:
    ☞ All your incoming mail accounts are on an IMAP or Exchange server
    ☞ You store sent messages on the server
    ☞ You don't have any "On My Mac" mailboxes
    Most well-known independent mail services such as iCloud and Gmail are based on IMAP. On the other hand, ISP-hosted mail services almost always use POP. If your ISP is one of your mail service providers (or the only one), you probably can't use this procedure. Ask for further instructions in that case.
    If you know that the conditions above are satisfied, continue as follows.
    Quit Mail if it's running. Back up all data.
    Open the Library folder in your home folder by holding down the option key and selecting
              Go ▹ Library
    from the Finder menu bar. Inside it is a subfolder named "Mail." Move that folder to the Desktop. You're not moving the Mail application; you're moving a folder named "Mail."
    Open the Internet Accounts pane in System Preferences and recreate your mail accounts other than iCloud with the same settings as before. To recreate an iCloud mail account, all you have to do is open the iCloud preference pane and check the box marked Mail.
    Launch Mail. If all goes well, your mailboxes should be restored automatically. The messages will be downloaded from the servers, so it may take a long time if you have very large mailboxes. Some people have mailboxes in the gigabyte range, and that may be a problem if your bandwidth usage is metered.
    If the mailboxes are created successfully, you can delete the folder you moved to the Desktop. You may have to recreate your Mail rules, signatures, and custom stationery. If it's very important to you not to have to do all that, ask for instructions before deleting the folder.
