No data in R/3 data source after extraction!

Hello All,
I am extracting data from an R/3 source system (a 4.7 IDES system).
After successfully transferring the DataSource and replicating it to BW, I created an InfoCube and scheduled the InfoPackage so that the R/3 DataSource data gets loaded into the InfoCube.
After scheduling, the header tab of the monitor screen shows the error message "no data in R/3 data source, check your data source in R/3".
Does this mean that no data exists in the R/3 Business Content DataSource, or that the data has not been loaded into BW?
Can anybody here throw some light on it?
Ravi

Hi Ravi,
Please check the tables associated with your DataSource and see whether data is available in the respective tables.
You can also check the extracted records for your DataSource with transaction RSA3 in R/3.
Hope it helps!
Regards,
Amit
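
A quick first check, as a sketch: count the records in the table behind the DataSource. VBAK (sales order headers) is only an assumed example here; substitute whichever table your DataSource actually reads from, and note that logistics (2LIS_*) DataSources also need filled setup tables before a full/init extraction returns anything.

REPORT zcheck_source_data.

* Minimal sketch: is there anything in the application table at all?
* VBAK is an assumed example - use the table behind your DataSource.
DATA lv_count TYPE i.

SELECT COUNT(*) FROM vbak INTO lv_count.

IF lv_count = 0.
  WRITE / 'No data in the application table - nothing to extract.'.
ELSE.
  WRITE: / lv_count, 'records found; check RSA3 and the setup tables next.'.
ENDIF.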

Similar Messages

  • DS-BW:Data load called from BW failed after Source System program_id change

    Hi,
    In my BW 7.0 instance I have Data Services 3.2 as source system.
    Some time ago I changed the Data Services instance I am connecting to - I just changed the PROGRAM_ID in the RFC connection parameters for that source system.
    Since then, when I try to execute an InfoPackage to call a job, I get the following error:
         OCI call <OCISessionBegin> for connection <xxxxxxx> failed: <ORA-01017: inv
    where <xxxxxx> is the Oracle database connection name of the Data Services instance I connected to originally.
    What can be the reason for that?
    Regards,
    ak

    In the Data Services Server Manager on the DS box there were configurations both for the job server I connected to originally and for the other job server. After removing the original config, it works fine.
    ak

  • How to change the Data sources after deploying the application ??

    Hi All,
    I want to know how to change the data sources after deploying the application to the application server.
    I'm using Oracle Application Server 10g Release 3 (10.1.3.1.0)

    Can you access the Enterprise Manager website of the target Application Server from your location? If so, you can change the datasource there. If not, you can bundle the datasource definition in your archive and use that one instead of the one configured in the target OC4J container. Or this could simply be your customer's responsibility: whenever you send a new WAR file, they modify the datasource if needed and deploy the application.

  • Issue with data source after deploying

    We are experiencing an issue with our data source after deployment of a cube. In the datasource properties in Visual Studio 2012, we have the max connections set to 0 before the deployment. Once the cube is deployed, I can navigate to the <name>.0.ds.xml file, open it, and see that <MaxActiveConnections>0</MaxActiveConnections> is indeed set to 0. At some point over the next couple of days, a process of the cube or some other action causes that value to get updated to some number too large to be converted to an int, which makes the datasource invalid. At that point we cannot view the datasource properties in SSMS, we cannot open the cube project in Visual Studio, and we have even had failures when trying to process the cube. Is there a config somewhere that would cause this value to get overwritten, or some other behind-the-scenes process that we can look at?
    Our server information is:
    Microsoft SQL Server 2012 (SP1) - 11.0.3153.0 (X64)
                    Jul 22 2014 15:26:36
                    Copyright (c) Microsoft Corporation
                    Enterprise Edition: Core-based Licensing (64-bit) on Windows NT 6.2 <X64> (Build 9200: ) (Hypervisor)
    Chad Dotzenrod SWC | TECHNOLOGY PARTNERS 1420 Kensington Road, Suite 110 Oak Brook, Illinois 60523-2144 http://www.swc.com

    Typically you would import the metadata from the source location and either use that location as the data source (and so not need to redeploy), or deploy it to a separate target location.
    The replace action is destructive as you've found, and effectively performs a drop table followed by create table. Hence any data in the table is lost.
    If you just want the Control Center Manager to correctly display that the table is deployed, try setting the action to "Upgrade". This will try to upgrade the deployed object to match the definition in OWB, but as the two are identical this will result in no changes. However, it will update the deployment records to indicate that the object is deployed.
    Nigel.

  • Data source and Extract Structure

    Hi all,
    I have a doubt about the DataSource and extract structure.
    I used an InfoCube with some characteristics and key figures.
    After the extraction, should we change the extract structure frequently, or is it better to leave the unneeded fields of the predefined extract structure hidden in the DataSource?
    If I hide fields in the DataSource and want to use some of those hidden fields after the extraction, can I still use them? If so, how do we extract the data for those hidden fields to the BW side?

    If you create a generic DataSource, you can create a cube with some characteristics and key figures.
    You want to modify the structure after extraction? If you modify the structure, you have to replicate the DataSource, delete the data, and upload the data from R/3 to BW again - so modifying the structure every time is not the correct way.
    If you set a field to Hide, it does not come over to the BW side. To unhide a field, first go to RSA6, edit the DataSource, remove the Hide checkbox, and save. Then go to BW, replicate the DataSource, delete the data, and load the data again (run a full load if you want the complete history for the previously hidden field).
    If you do not run a full load, only data loaded from then on will carry values for the field you changed from hidden to unhidden.

  • Data source doesn't show proper data while loading flat file data

    Hi Experts,
    I am trying to load data from a flat file for InfoObject 0GL_ACCOUNT. I have created an application component and a DataSource.
    My question: after loading the text data into the corresponding DataSource, it shows only the first few characters of the actual data.
    For example, in my original data I have the G/L account name "Comp Hardware Purchase", but after loading into the DataSource it shows only "Comp Hardwa".
    I want to see the whole name "Comp Hardware Purchase". I know there is some text length problem, but I couldn't find it. What steps do I need to follow?
    Please help.
    Full points will be given to useful answers.
    Thanks and Regards,
    Niranjan Chechani

    Hi Niranjan,
    Go to your DataSource and check whether the field and the InfoObject have the same length. If the InfoObject length is less than the length of the field coming from the flat file, this issue occurs.
    Search for your flat-file DataSource, double-click on it, and check the Fields tab; there you will see the mapping. You can also check the transfer or update rules of the target.
    Regards,
    Nanda.S
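
    The effect is easy to reproduce. A minimal sketch (the lengths here are assumptions for illustration): assigning a longer value to a shorter CHAR field silently truncates it, which is exactly what happens when the InfoObject is defined shorter than the file field.

    REPORT ztrunc_demo.

    DATA: lv_short TYPE c LENGTH 11,                       "too short
          lv_full  TYPE c LENGTH 60 VALUE 'Comp Hardware Purchase'.

    lv_short = lv_full.
    WRITE / lv_short.                                      "prints: Comp Hardwa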

  • How do we use Data rules/error table for source validation?

    How do we use Data rules/error table for source validation?
    We are using OWB repository 10.2.0.3.0 and OWB client 10.2.0.3.33. The Oracle version is 10g (10.2.0.3.0). OWB is installed on Linux.
    I reviewed the posting
    Re: Using Data Rules
    Thanks for this forum.
    I want to apply data rules to a source table/view, and rows that violate a rule should go to a defined error table. Here is an example.
    Table ProjectA
    Pro_ID Number(10)
    Project_name Varchar(50)
    Pro_date Date
    As per the above posting, I created the table in the object editor and created the data rule
    NAME_NOT_NULL (i.e. project name not null). I specified the shadow table name as ProjectA_ERR.
    In the mapping editor, I have ProjectA as the source. I did not find the error table name and the defined data rules in the table properties, and the ERR group is not showing up in the source table.
    How do we bring the defined data rules and error table into the mapping?
    Are there any additional steps or processes?
    Any idea?
    Thanks in advance.
    RI

    Hi,
    Thanks for your reply/pointer. I reviewed the blog. It is interesting.
    What is the version of OWB used in this blog?
    After defining the data rule/shadow table, I deployed the table via the Control Center. It created an error table with all the source columns in alphabetical order. If the primary key is the first column of my source (and it does not start with 'A'), it will appear somewhere in the middle of the error table's columns.
    How do we prevent or work around this?
    If I have the source (view) in schema A, how do we create the error table in schema B for that source (view)?
    Is it feasible?
    I brought the error table details into the mapping and configured the data rules/error tables.
    If I pick the 'MOVE TO ERROR' option, I get "VLD-2802 Missing delete matching criteria in table. The condition is needed because the operator contains at least one data rule with a MOVE TO ERROR action".
    Under conditional loading, I have 'All constraints' as the matching criteria.
    I changed it to 'No constraints' and still get the above error.
    If I choose the 'REPORT' option instead of 'MOVE TO ERROR', the error goes away.
    Any idea?
    Thanks in advance.
    RI

  • How to retain data in Delta after extraction

    Hi Experts,
    We are supposed to extract a delta from the same DataSource for multiple cubes from the source system. After we extract the delta for cube A, another cube B is supposed to extract the delta from the same DataSource at a different time. Is there any setting we need to make so that the delta is retained after the first extraction and can be used for the other cube?
    thanks.
    Sunil.

    Hi,
    A DTP satisfies your requirement.
    A DTP updates delta data with respect to its own target.
    Load the delta only up to the PSA, then update it to the targets using delta-enabled DTPs.
    You can create any number of delta DTPs, one per target.
    Regards,
    rvc

  • Deleting duplicate records from different data packets in BI data source.

    Hi,
    I am getting the same (duplicate) records from different data packets in a BI DataSource after the extraction completes.
    I tried to store the key fields of the first data packet in an internal table, but this internal table does not keep the previous data when the second data packet is extracted.
    Is there any other way to remove duplicate records after the extraction completes?
    Thanks in advance.

    I have not worked extensively on BI routines, but I reckon there is a routine that gets executed before the data mapping part - a start routine - in which you can check whether the data already exists before it is passed from the DataSource to the cube.
    Hope this helps,
    Regards,
    Murthy.
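
    For the original problem - the key buffer being empty again for each packet - one option is to give the buffer a lifetime beyond a single call. The sketch below assumes a customer exit such as EXIT_SAPLRSAP_001 (include ZXRSAU01); c_t_data and the key vbeln stand in for your actual packet table and key field, and the extract structure is an assumed example. Unlike a plain local internal table, a STATICS table keeps its contents between calls in the same session, i.e. across the data packets of one extraction run.

    STATICS st_seen TYPE SORTED TABLE OF vbeln WITH UNIQUE KEY table_line.
    DATA ls_data TYPE mc11va0hdr.                "assumed extract structure

    LOOP AT c_t_data INTO ls_data.               "runs once per packet
      READ TABLE st_seen TRANSPORTING NO FIELDS
           WITH TABLE KEY table_line = ls_data-vbeln.
      IF sy-subrc = 0.
        DELETE c_t_data.                         "seen in an earlier packet
      ELSE.
        INSERT ls_data-vbeln INTO TABLE st_seen.
      ENDIF.
    ENDLOOP.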

  • IDCS6 MACOSX JS: Data Merging many txt files one after another

    Hello everybody.
    I have a situation where I have to data-merge many text files into one InDesign "base" file. The formats of the databases are all the same; the point of difference is a code in the second field of the database that refers to a mailing zone (not a zip/postcode, but a "distribution centre" code with 56 or so possibilities). I would prefer to data merge one file and then somehow split the resulting PDF via Acrobat, but the lengths of the resulting mailing zones are inconsistent, so it can't be done through the "split document" feature in Acrobat.
    The database starts out as one massive file but, using one line of code, can be split by mailing zone into separate text files.
    My ultimate question: is there any way to data merge more than one file at once (e.g. one after another) and name the output files based on the names of the input text files?
    Ole Kvern had a script to data merge one file without the UI. I have altered the last line slightly so that it outputs a PDF based on the [High Quality Print] preset:
    if (app.documents.length != 0) {
            var myDocument = app.activeDocument;
            app.dataMergeOptions.removeBlankLines = true;
            // Select a source file.
            var myDataFile = File.openDialog("Select a data file");
            var exported = "file location(redacted for the sake of this post)";
            if (myDataFile != "") {
                    myDocument.dataMergeProperties.selectDataSource(myDataFile);
                    myDocument.dataMergeProperties.exportFile(exported, "[High Quality Print]");
            }
    }
    I figure the answer has to do with the variable "myDataFile": rather than simply opening a dialog, it would somehow select an array of text files, but this is where I am out of my depth (see the sketch below).
    Any insights or other thoughts that may not involve trying to do many merges at once but may somehow revolve around one large merged PDF?
    Many thanks
    Colin
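
    A minimal sketch of that array idea, assuming ExtendScript's File.openDialog accepts a third multi-select argument and then returns an array of File objects (the names and the per-file PDF naming are illustrative, not tested):

    if (app.documents.length != 0) {
        var myDocument = app.activeDocument;
        app.dataMergeOptions.removeBlankLines = true;
        // Third argument enables multi-select; returns an array of Files (or null on cancel).
        var myDataFiles = File.openDialog("Select data files", undefined, true);
        if (myDataFiles != null) {
            for (var i = 0; i < myDataFiles.length; i++) {
                myDocument.dataMergeProperties.selectDataSource(myDataFiles[i]);
                // Name each PDF after its source text file.
                var pdfName = myDataFiles[i].name.replace(/\.[^.]+$/, "") + ".pdf";
                var outFile = new File(myDataFiles[i].path + "/" + pdfName);
                myDocument.dataMergeProperties.exportFile(outFile, "[High Quality Print]");
            }
        }
    }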

    I've answered my own question, but the answer lies outside of Adobe InDesign – it uses bookmarks within Adobe Acrobat.
    1) Data Merge the art/data to one large PDF file.
    2) In the resulting PDF file, manually search out the mailing zones and apply bookmarks to each first instance only of each mailing zone. In this case, it is a manual process of find/replace to find one of 70 possible mailing zones that may be used in any database. Each bookmark should be named with its respective code. It is a manual process, but it would take 5 minutes so this is acceptable.
    3) Using "Split Document" from the pages panel, make sure the "top level bookmarks" radio button is checked, and in the Output Options make sure that the "Use bookmark names for files" radio button is checked.
    I'm aware that this was a rather obscure question that only a fraction of users on this forum may wish to know an answer for, but hopefully this workaround helps someone else in a similar situation.
    Colin

  • Data load problem - BW and Source System on the same AS

    Hi experts,
    I’m starting with BW (7.0) in a sandbox environment where BW and the source system are installed on the same server (same AS). The source system is SRM (Supplier Relationship Management) 5.0.
    BW is working on client 001 while SRM is on client 100 and I want to load data from the SRM into BW.
    I’ve configured the RFC connections and the BWREMOTE users with their corresponding profiles in both clients, added a SAP source system (named SRMCLNT100), installed SRM Business Content, replicated the data sources from this source system and everything worked fine.
    Now I want to load data from SRM (client 100) into BW (client 001) using standard DataSources and extractors. To do this, I’ve created an InfoPackage for one standard metadata DataSource (with data, checked through RSA3 on client 100, the source system). I’ve started the data load process, but the monitor says that “no IDocs arrived from the source system” and keeps the status yellow forever.
    Additional information:
    BW Monitor Status:
    Request still running
    Diagnosis
    No errors could be found. The current process has probably not finished yet.
    System Response
    The ALE inbox of the SAP BW is identical to the ALE outbox of the source system
    and/or
    the maximum wait time for this request has not yet run out
    and/or
    the batch job in the source system has not yet ended.
    Current status
    No Idocs arrived from the source system.
    BW Monitor Details:
    0 from 0 records - but there are 2 records in RSA3 for this DataSource
    Overall status: Missing messages or warnings
    - Requests (messages): Everything OK
      o Data request arranged
      o Confirmed with: OK
    - Extraction (messages): Missing messages
      o Missing message: Request received
      o Missing message: Number of sent records
      o Missing message: Selection completed
    - Transfer (IDocs and TRFC): Missing messages or warnings
      o Request IDoc: sent, not arrived; data passed to port OK
    - Processing (data packet): No data
    Transactional RFC (sm58):
    Function Module: IDOC_INBOUND_ASYNCHRONOUS
    Target System: SRMCLNT100
    Date Time: 08.03.2006 14:55:56
    Status text: No service for system SAPSRM, client 001 in Integration Directory
    Transaction ID: C8C415C718DC440F1AAC064E
    Host: srm
    Program: SAPMSSY1
    Client: 001
    Rpts: 0000
    System Log (sm21):
    14:55:56 DIA  000 100 BWREMOTE  D0  1 Transaction Canceled IDOC_ADAPTER 601 ( SAPSRM 001 )
    Documentation for system log message D0 1 :
    The transaction has been terminated.  This may be caused by a termination message from the application (MESSAGE Axxx) or by an error detected by the SAP System due to which it makes no sense to proceed with the transaction.  The actual reason for the termination is indicated by the T100 message and the parameters.
    Additional documentation for message IDOC_ADAPTER 601, "No service for system &1, client &2 in Integration Directory": no further documentation exists for this message.
    RFC Destinations (sm59):
    Both RFC destinations look fine, with connection and authorization tests successful.
    RFC Users (su01):
    BW: BWREMOTE with profile S_BI-WHM_RFC (plus SAP_ALL and SAP_NEW temporarily)
    Source System: BWREMOTE with profile S_BI-WX_RFCA (plus SAP_ALL and SAP_NEW temporarily)
    Someone could help ?
    Thanks,
    Guilherme

    Guilherme
    I don't see any reason why it's not bringing the data over. Are you doing a full extraction or a delta? If a delta extraction, please check whether the extractor is delta-enabled; sometimes this causes problems.
    Also check this weblog on basic checks for data load errors; it may help:
    /people/siegfried.szameitat/blog/2005/07/28/data-load-errors--basic-checks
    Thanks
    Sat

  • Excel Table with Data Connection Manual Text Entry Misaligned After Refresh

    Greetings!
    I have an Excel 2010 workbook that includes a table linked to my SharePoint 2013 site by a data connection. The SharePoint list feeds the table standard information that's managed on the SharePoint site, but I need the user of the Excel workbook to be able to enter text manually in the workbook to associate local information with the line items coming from the SharePoint list. To do this, I've added extra columns to the end of the table.
    The user can enter information in the appropriate cells in the "extra" columns at the end of the table, but when I refresh the data connection, the addition of a new list item on the SharePoint side results in the user's manually entered text getting out of alignment with the row it's supposed to be associated with.
    Example (before the refresh):
    Column 1 (SP)         Column 2 (Extra)
    Row 1: Item 1
    Row 2: Item 2         Text entered for Item 2
    Row 3: Item 3
    Then, if I add a new item to the list in SharePoint (for example, something that would appear between the original items 1 and 2), after refreshing the table I get the following:
    Column 1 (SP)         Column 2 (Extra)
    Row 1: Item 1
    Row 2: New Item 1.5   Text entered for Item 2
    Row 3: Item 2
    Row 4: Item 3
    The table's data connection is set to insert rows for new items, and I could swear I had this working properly once upon a time...but I can't seem to make it work now.
    Any thoughts on what would cause this?
    Thanks in advance!

    Yes, it is. I realized after posting the first time that I'd assigned the question to the Visio forum. I wasn't sure how to reassign it to the correct (Excel) forum, so I re-posted over there:
    http://social.msdn.microsoft.com/Forums/en-US/b3bbe00c-94c0-48d4-bed9-fbd08d707b1d/excel-table-with-sharepoint-data-connection-manual-text-entry-misaligned-after-refresh?forum=exceldev

  • Data from a new customertable lost after conntrans ... Activity_Object

    Hello!
    We're working with CRM/Mobile Sales 4.0.
    We've inserted a new segment (for a customer table) into the BDoc Activity_Object.
    We modeled it on the segment for smocondocu.
    We plan new activities on the notebook, entering data in our customer table and in the smocondocu tile. After saving, the current smocondocu data and the customer data are in the local tables.
    Then we run ConnTrans once, with the inbound queue on the server stopped. All data arrives. We start the inbound queue again.
    Then we run ConnTrans a second time, and something comes back to the local machine:
    the current smocondocu data is now in the local table, but the current data from the customer table is gone!
    The data from our customer table doesn't arrive in the server table either.
    And we can't imagine why. No error messages on the server... nothing.
    The inserted segment in Activity_Object looks like the other segments there.
    We've run smoggen again; the results remain the same.
    Are there any hints for us?
    Is there some small flag we have to set?
    Please help. Thanks a lot in advance.
    Best regards,
    Ingo

    Hi Anusha,
    My colleagues had send-bit problems months ago, so we know that problem.
    We've run smoggen again on the server, and all tables/metadata were regenerated on the client.
    The problem must be somewhere on the CRM server, perhaps directly behind the inbound queue. Perhaps it has to do with the mBDoc. It seems that somewhere on the server one structure doesn't match another structure, or a flag is missing?
    The big problem is that we don't get any error messages; an empty structure (of the new segment) comes back in Mobile Sales, that's all.
    Best regards,
    Ingo

  • To open & Edit the XLS file in edit mode after Extracting SAP data into it

    Hello Experts,
    I have a requirement to open and edit an XLS file immediately after downloading SAP data into it. The XLS file is saved on the presentation server (e.g. the desktop or C: drive).
    I have used the function module GUI_DOWNLOAD or DOWNLOAD to download the data from an SAP table into the XLS file. Now I need this XLS file to be opened automatically after the download finishes, so that the user can make changes to the XLS file and save them. After saving, I have to upload the modified data into the SAP table again.
    I am really not sure how to get this done, but I believe you experts will definitely help me out. Waiting for your reply.
    Thanks,

    Hi,
    Is your problem solved? If not, check this code.
    I just tried it on my system; it opens an Excel file and saves all the data in it.
    DATA: ZKNA1 LIKE STANDARD TABLE OF KNA1 WITH HEADER LINE.

    SELECT * FROM KNA1 INTO TABLE ZKNA1.

    "Unused optional parameters are commented out; the call needs
    "only the file name and the data table.
    CALL FUNCTION 'MS_EXCEL_OLE_STANDARD_DAT'
      EXPORTING
        FILE_NAME                 = 'C:\Documents and Settings\rajesh.NACL\Desktop\XLSSDSDS.XLS'
    "   CREATE_PIVOT              = 0
    "   DATA_SHEET_NAME           = ' '
    "   PIVOT_SHEET_NAME          = ' '
    "   PASSWORD                  = ' '
    "   PASSWORD_OPTION           = 0
      TABLES
    "   PIVOT_FIELD_TAB           =
        DATA_TAB                  = ZKNA1[]
    "   FIELDNAMES                =
      EXCEPTIONS
        FILE_NOT_EXIST            = 1
        FILENAME_EXPECTED         = 2
        COMMUNICATION_ERROR       = 3
        OLE_OBJECT_METHOD_ERROR   = 4
        OLE_OBJECT_PROPERTY_ERROR = 5
        INVALID_PIVOT_FIELDS      = 6
        DOWNLOAD_PROBLEM          = 7
        OTHERS                    = 8.

    IF SY-SUBRC <> 0.
      MESSAGE ID SY-MSGID TYPE SY-MSGTY NUMBER SY-MSGNO
              WITH SY-MSGV1 SY-MSGV2 SY-MSGV3 SY-MSGV4.
    ENDIF.
    Thanks,
    rajesh.k
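
    To the original question - opening the file automatically after the download - one option, as a sketch, is the frontend services class: CL_GUI_FRONTEND_SERVICES=>EXECUTE launches a file with its associated application on the presentation server. The path here is only an example.

    DATA lv_file TYPE string VALUE 'C:\temp\kna1.xls'.     "example path

    CALL METHOD cl_gui_frontend_services=>execute
      EXPORTING
        document             = lv_file
      EXCEPTIONS
        cntl_error           = 1
        error_no_gui         = 2
        bad_parameter        = 3
        file_not_found       = 4
        path_not_found       = 5
        error_execute_failed = 6
        OTHERS               = 7.
    IF sy-subrc <> 0.
      MESSAGE 'Could not open the downloaded file' TYPE 'I'.
    ENDIF.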

  • How to check data records in R/3 (source system)

    Hello,
    I need to check the data records in R/3 (the source system). Is transaction RSA3 the only option, or is there another way? When I use RSA3, all I see is the hourglass and the cursor; nothing seems to happen.
    Pls help.
    SD

    Hi Sebastian,
    To some extent this works: compare the tables in R/3 against the ODS objects in BW.
    Tables in R/3, each with its matching DataSource/PSA/ODS in BW:
    VBAK (orders) - 2LIS_11_VAITM
    LIKP (deliveries) - 2LIS_12_VCITM
    VBRK (billing) - 2LIS_13_VDHDR
    MKPF (inventory management) - 2LIS_03_BF
    The total number of document numbers, the quantities, or the values should match.
    Hope it helps!
    Rgds
    SVU123
