Some records missing during full repair from DataSource 2LIS_11_VAITM

Hi friends,
  We are using the DataSource 2LIS_11_VAITM. If we run a full repair with a selection, say for a particular dealer, some document numbers are missing. The same thing happens even if we run a full repair without any selection.
We are not able to spot where the records are getting filtered out. Is there any way to check the DataSource for any filtering?
Can anyone tell us from which tables the data flows into this DataSource?

Hi,
First fill the setup tables for application 11 with the selections you want pulled into BW.
A full repair load for an LO DataSource is read from the setup tables, so a full repair will pull whatever is in them.
So fill the setup tables for the values you want pulled into BW first; right now your setup tables probably do not contain the records you want in BW.
Then verify the selection in transaction RSA3 and see whether those records show up there.
If they show up in RSA3, schedule a full repair and they will arrive in BW.
The major underlying tables are VBAK and VBAP; you can verify the values there.
Do not forget to delete the setup tables before filling them.
Thanks,
Ajeet

Similar Messages

  • Duplicate records in InfoCube: is a full repair without a DSO possible?

    Hello Gurus,
    I have a critical situation in BI production. We have more than 3 years of data in the inventory InfoCubes (PSA -> ODS -> CUBE). Everything was working fine till December 2009; after that, in January 2010 and March 2010, we were getting double records in the InfoCubes. In this scenario we don't have an ODS in between.
    The solution I found is to delete all data from Jan 2010 till today in the InfoCube and also from the PSA, and on the R/3 side to delete the setup tables and refill them for Jan 2010 to Mar 2010, then create an InfoPackage for a full repair request. But I have the questions below about this solution:
    (1) For a full repair InfoPackage (full repair request) we do not need to delete the init request; correct me if I am wrong.
    (2) For a full repair request, do we need to run "Initialize Delta Without Data Transfer" first and then do the full repair?
    (3) If we don't have a DSO in this scenario, can we still solve this with a full repair request?
    Regards,
    Komik Shah

    Hi Venu,
    We have had data in the PSA since 13/04/2010, because the process chain had been failing for the last 15 days, so we didn't get new records into the PSA either. We are using BI 7.0.
    The whole scenario is like this:
    Data has been in the inventory cube for the last 3 years, but nobody monitored the process chain for the last 15 days, and nobody analyzed any reports after Dec-09. Now they analyzed the reports in April-10, and for some months they are getting double records. The process chain had been failing for the last 15 days, and the reasons were indexing as well as wrong records in the PSA.
    So my plan was to delete the data from Jan-2010 to April-2010 and fill the setup tables for Jan-2010 to April-2010. I will get the data into the PSA, but when I load the data to the cube I will get double records, whether it is a full repair or not.
    Regards,
    Komik Shah
    Edited by: komik shah on Apr 30, 2010 12:38 PM

  • Error while loading data from DataSource to InfoCube/DSO through DTP

    Hi Friends,
    I am trying to load data from a DataSource to an InfoCube/DSO through a DTP and I am getting the following error:
    Exception in substep: Extraction completed
    Processing terminated
    Data package 1 / 30.04.2009 22:38:30 / Status 'Processed with Errors'
    When I look at the detailed message, it says:
    Syntax error in routine "convert_to_applstru", line 25 of generated program
    I checked all my transformations and they look OK. I was able to load data earlier and am getting this error all of a sudden. Does someone know what this routine "convert_to_applstru" is?
    Thanks,
    Amit

    Hi Arun,
    Where do I see this generated program in RSRV? I don't see it in RSRV. Please guide me.
    Thanks,
    Amit

  • Record missing during insertion

    Hi,
    In my JSP code I am checking nearly 15 conditions in a loop. Each loop iteration inserts a record for whichever of the 15 conditions matches. This goes smoothly, but sometimes some records are missing from the inserts. What is happening, and why?
    I am using MS-SQL Server 2000 and a stored procedure to insert the records.
    Please advise how to make this work reliably.

    Yes, all 15 conditions belong to a single transaction, and all of these conditions are in one JSP file.
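    Since all 15 inserts run in one transaction, a likely cause of silently missing rows is an insert whose outcome is never checked (or a transaction that rolls everything back on a later error). A minimal sketch of the defensive pattern, in Python with SQLite standing in for the JSP/MS-SQL stored-procedure setup (table name, condition logic, and values are all hypothetical):

```python
import sqlite3

# Sketch (not the original JSP/MS-SQL code): run all conditional
# inserts in one transaction, check the result of every insert, and
# roll back the whole batch if any check fails, so no row can
# disappear silently.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE audit (cond_no INTEGER, val TEXT)")

def insert_checked(conn, cond_no, val):
    cur = conn.execute("INSERT INTO audit VALUES (?, ?)", (cond_no, val))
    if cur.rowcount != 1:           # the check the original code is missing
        raise RuntimeError(f"insert for condition {cond_no} failed")

with conn:                          # commits on success, rolls back on error
    for cond_no in range(1, 16):    # the 15 conditions, all hypothetical
        if cond_no % 2 == 1:        # stand-in for the real condition check
            insert_checked(conn, cond_no, f"value-{cond_no}")

count = conn.execute("SELECT COUNT(*) FROM audit").fetchone()[0]
print(count)  # 8 rows: conditions 1, 3, 5, 7, 9, 11, 13, 15
```

    The same idea applies to the stored procedure: return the rowcount (or an error code) to the caller and fail the transaction loudly instead of continuing.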

  • How to avoid duplicates when doing a Full Update from DataSource to InfoCube

    Hi,
    I am loading data from the 0HR_PA_OS_1 DataSource to the InfoCube 0PAOS_C01, implementing the standard Business Content.
    The plan is to load the data once a month.
    If by mistake the data is loaded more than once in a month, duplicates will appear in the InfoCube. I need to avoid this situation.
    I have 2 ways to avoid it, but I am looking for a more effective way to handle the upload:
    1) Use the overlapping-request deletion option in a process chain.
    2) Give selections for the calendar month on the InfoPackage selection tab.
    The drawback of the 1st method is that it amounts to deleting the existing records and loading a new request.
    Say, for example, in June I loaded around 1,000 records. If I then load the data for July, which is around 10,000 records, with the 1st method I need to delete the previous request containing the 1,000 records and in effect do a full update of 11,000 records (10,000 of July + 1,000 of June).
    I don't want to do this, since as the months go by the number of records in a single request keeps increasing.
    Regarding the 2nd method, I am not quite sure how effective it will be.
    Please let me know any possible ways other than these.
    Thanks,
    Rajesh Janardanan

    Hi Ravi,
    The problem with the 1st method is that as time goes by, the number of records in a single request increases, which in turn makes the load take quite a long time and sometimes fail.
    Is there any other way to handle this situation?
    Thanks for the response,
    Rajesh Janardanan

  • Records missing while extracting 0account hierarchy

    BI experts,
    Two records have been added by the business to a node in the 0ACCOUNT hierarchy, which can be seen in transaction KAH3.
    While extracting the MAE 0ACCOUNT hierarchy using a full InfoPackage, no errors occur, but when checking the hierarchy in BW both records are missing. The Activate checkbox was selected in the InfoPackage.
    Has anyone experienced something similar in the past?
    Regards,

    Hi,
    Once you have loaded the hierarchy, run the change run for it.
    If the load was done manually, go to:
    RSA1 -> Tools -> Apply Hierarchy/Attribute Change Run -> click the Hierarchy List pushbutton.
    Then select your hierarchy load, click Save, and then click Execute Hierarchy/Attribute Change Run.
    Hope this solves your problem.
    With Regards,
    ARUN
    Edited by: Arun Aravapalli on Mar 16, 2010 11:59 PM

  • Error while extracting data from datasource 0GLACCEXT_T011_HIER in RSA3

    Hi Experts,
    While trying to extract data for the DataSource 0GLACCEXT_T011_HIER in RSA3, I am getting the dump below.
    Runtime Errors         SAPSQL_ARRAY_INSERT_DUPREC
    Except.                CX_SY_OPEN_SQL_DB
    Short text
        The ABAP/4 Open SQL array insert results in duplicate database records.
    What happened?
        Error in the ABAP Application Program
        The current ABAP program "SAPLFAGL_FSV" had to be terminated because it has
        come across a statement that unfortunately cannot be executed.
    Error analysis
        An exception occurred that is explained in detail below.
        The exception, which is assigned to class 'CX_SY_OPEN_SQL_DB', was not caught in procedure "FSV_TEXTS_CONVERT" "(FORM)", nor was it propagated by a RAISING clause.
        Since the caller of the procedure could not have anticipated that the exception would occur, the current program is terminated.
        The reason for the exception is:
        If you use an ABAP/4 Open SQL array insert to insert a record in the database and that record already exists with the same key, this results in a termination.
        (With an ABAP/4 Open SQL single record insert in the same error situation, processing does not terminate, but SY-SUBRC is set to 4.)
    Kindly help me out in resolving this error.
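    As an aside, the distinction the dump text draws between an array insert and a single-record insert can be illustrated outside ABAP. A sketch in Python with SQLite standing in for the SAP database (the table and values are hypothetical, not the real FSV tables):

```python
import sqlite3

# Analogy for SAPSQL_ARRAY_INSERT_DUPREC: a bulk (array) insert that
# hits an existing key raises an exception and terminates the whole
# statement, while inserting row by row lets you detect the duplicate
# per record (ABAP sets SY-SUBRC = 4 instead of terminating).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE fsv_texts (node_key TEXT PRIMARY KEY, txt TEXT)")
conn.execute("INSERT INTO fsv_texts VALUES ('0001', 'Assets')")

rows = [("0001", "Assets"), ("0002", "Liabilities")]  # first key already exists

# Bulk insert: the statement fails on the duplicate key.
try:
    conn.executemany("INSERT INTO fsv_texts VALUES (?, ?)", rows)
    bulk_failed = False
except sqlite3.IntegrityError:
    bulk_failed = True

# Row-by-row insert: duplicates can be caught and skipped individually.
inserted = 0
for row in rows:
    try:
        conn.execute("INSERT INTO fsv_texts VALUES (?, ?)", row)
        inserted += 1
    except sqlite3.IntegrityError:
        pass  # equivalent of checking SY-SUBRC = 4 and continuing

print(bulk_failed, inserted)  # True 1
```

    In the ABAP case the fix is on the data side (the duplicate hierarchy texts), not on switching to single-record inserts, but the mechanics of the dump are the same.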

    Hi,
    This dump occurs if the storage of original files is not happening in a Content Server, which is what SAP recommends. You can check this easily by going to transaction DC10 and checking whether the flag 'Use KPRO' is set for the document type used.
    SAP does not recommend storing such large files in the SAP DB. If you try to store an original in the SAP DB, the file is stored in the table DRAO, which means you are consuming table space. One original can occupy at most 9999 rows in that table; a larger original would need more than 9999 rows, which is not possible. That is the reason behind this dump. Storing such huge files in the SAP DB also affects the performance of the system while accessing the original.
    Refer to the thread:
    "http://wiki.sdn.sap.com/wiki/display/PLM/SAPSQL_ARRAY_INSERT_DUPRECwhilesaving+original"
    Thanks,

  • Some tables missing while taking dump

    Hi,
    Version 11.2.1.
    When I take a dump using exp, some tables are missing, but they actually exist in the schema.
    What could be wrong?

    If a table has 0 rows and was created with the instance parameter deferred_segment_creation=TRUE, it will be a segment-less table. Export (exp) expects a segment to be present.
    See Oracle Support Doc #960216.1.
    If you attempt a schema export, the table is silently ignored.
    If you attempt a table export, export raises the error EXP-00011 "<tablename> does not exist".
    You need to either:
    1. insert at least 1 row in the table,
    OR
    2. set DEFERRED_SEGMENT_CREATION to FALSE and recreate the table,
    OR
    3. use Data Pump (expdp),
    OR
    4. see the workaround using ALTER TABLE <tablename> MOVE that I mention in the discussion "Import the table with 0 rows".
    (Note: you need to ALTER INDEX <indexname> REBUILD if you MOVE a table.)
    Hemant K Chitale
    Edited by: Hemant K Chitale on Feb 1, 2011 1:14 PM

  • Full Repair/Init/Delta & LO Cockpit Information required

    Hello,
    I'm pretty new to BW and I'm starting to dig into the deeper stuff now. I've created cubes with extractors from R/3, and now I'm interested in understanding the various specifics of the LO Cockpit.
    For a better understanding, let's establish a scenario: activation of R/3 SD Billing --> BW --> DSO --> InfoCube.
    First, let me explain my current understanding. In R/3, transaction LBWE, under "13: SD Billing BW", I activate the extractors I'm interested in (Header & Item, as an example in my scenario). I'm also assuming that the default fields within the extractors suit my needs, so no adjustments are needed for now. I should also schedule a job here that creates records in the waiting queue on a daily basis.
    Next, I need to fill the setup tables from transaction SBIW; I enter a date in the future, it runs for a while, and then my data is prepared. The setup tables now include all the existing R/3 invoices as of now, right? (1) Let's say new invoices are created minutes after I've filled the setup tables: will they be sent to the waiting queue as soon as the documents are created, or will the job scheduled earlier post them to that queue? (2) If the job sends them to the waiting queue, where do the pending documents sit before being sent? (3)
    Now, on my BW system, once the DataSource has been replicated, what do I need to create and execute to be able to proceed with a delta process? (4) I know I have to create an "Init" package first, but I have no clue why, except that it's needed before I can create a "Delta" package afterwards. Technically, what does the "Init" package really do? (5) What is the reason to have an "Init without data" and an "Init with data"? I mean, I know literally what they mean, but why would I choose one over the other? (6)
    I'd also like to understand the concept behind a "Full Repair" request, whose purpose I have no clue about. I guess some of these properties (Full Repair, Init) determine where the data is read from on the R/3 side: setup tables or waiting queue? (7)
    Please provide clear responses to my questions (identified by a number) and don't assume I know what you're thinking of; don't hesitate to provide a long answer if needed, since it's better to provide more information than not enough.
    Thank you in advance for all of your help!
    P.S. I'm working with BI 7.0.

    Hi,
    So if I understand properly, whenever I run the init, it toggles a flag on my source system (R/3), which means that all changes performed on invoices are going to be sent to the delta queue? Until I run an init, changes are not written anywhere (other than the internal SAP tables/structures required by the R/3 invoice processes).
    If so, does this mean that I could lose some transactions between the time I fill the setup tables and the time I execute the init on my BW system? Let's say I fill the setup tables today and execute the init only tomorrow; all invoices created in between won't be transferred to BW?
    Yes. Suppose you fill the setup tables today and run the init tomorrow; in between, many new records may come, but those new records will not be in the setup tables, since those transactions happened after the setup tables were filled, and new records only go to the delta queue after you run the init.
    I have to admit that it's hard for me to suppose I did not want to do an init; what would be a good reason not to? I understand that both init types toggle the init flag on, but I still can't figure out appropriate business scenarios for both types.
    Suppose your delta mechanism is corrupted, for which you need to do a full repair. First, you delete the existing init flag; second, you fill the setup tables; third, you load the records with a full repair; fourth, you run an init without data transfer, only to set the init flag (it will not pick up any records); then deltas.
    Init with data transfer picks up all the previous records and cannot be given a selection; for selections we have to use a full repair followed by an init without data transfer. Generally, for transaction data, we go for a full repair, because otherwise the number of records will be very large, and there may be duplicate records.
    Another thing: we mainly use a full repair for an ODS. It is just like a full upload; for an InfoCube we can use a full upload instead of a full repair, but for an ODS a full repair is a must, since an ODS doesn't support full upload and delta upload in parallel; otherwise the ODS activation will fail.
    You then have to use the program RSSM_SET_REPAIR_FULL_FLAG to convert a full upload request into a full repair request.
    Hope this helps.
    Regards,
    Debjani
    Edited by: Debjani Mukherjee on Nov 7, 2008 10:45 PM

  • Error while creating DTP from DS to Infoobject

    Dear All,
    I am getting an error while creating a DTP from a DataSource to an InfoObject. Can anyone give a suggestion regarding this issue?
    Issue: Enter a valid value
    Msg No: 00002
    Procedure: Display the allowed values with F4 and correct your entry.
    Regards,
    Satya

    Hi Satya,
    Try logging off the system and logging in again, then activate the DataSource, InfoObject, and transformation again, and try creating the DTP, selecting the source system with F4. I hope that works.
    I don't see any specific problem, or any patch problem, behind the error you are getting.
    Hope it helps!
    Regards,
    Rose.

  • Processing overdue error during delta extraction from datasource 0CO_OM_CCA

    Hi,
    I'm getting a "Processing overdue" error in BW while extracting a delta from DataSource 0CO_OM_CCA_9. All other extraction jobs from R/3 -> BW are successful. Even the delta init on this DataSource is successful; the problem is only with the delta package.
    I would appreciate it if someone could provide information based on the following error details.
    Here are the extraction steps we followed:
    Full load of fiscal years 2006 & 2007 into the transactional cube.
    Load budget data into the transactional cube.
    Compression of the cube with zero elimination.
    Delta initialization with fiscal period selections 1/2008 to 12/9999.
    All the above steps were successful, but when the delta package is scheduled, we get the following errors.
    BW system log:
    The BW monitoring job turns red with the following message.
    Technical: Processing is overdue
    Processing Step: Call to BW
    Sending packages from OLTP to BW lead to errors
    Diagnosis
    No IDocs could be sent to the SAP BW using RFC.
    System response
    There are IDocs in the source system ALE outbox that did not arrive in
    the ALE inbox of the SAP BW.
    Further analysis:
    Check the TRFC log.
    You can get to this log using the wizard or the menu path "Environment -> Transact. RFC -> In source system".
    Removing errors:
    If the TRFC is incorrect, check whether the source system is completely
    connected to the SAP BW. Check especially the authorizations of the
    background user in the source system.
    R/3 job log:
    Even after the BW job turns red, the R/3 job continues to run for 2 hours and eventually gets cancelled with an ABAP dump. Here is the log:
    Job started
    Step 001 started (program SBIE0001, variant &0000000110473, user ID BWREMOTE) DATASOURCE = 0CO_OM_CCA_9
    Current Values for Selected Profile Parameters
    abap/heap_area_nondia.........2000000000 *
    abap/heap_area_total..........2000000000 *
    abap/heaplimit................20000000 *
    zcsa/installed_languages......EDJIM13 *
    zcsa/system_language..........E *
    ztta/max_memreq_MB...........2047 *
    ztta/roll_area................6500000 *
    ztta/roll_extension...........2000000000 *
    ABAP/4 processor: SYSTEM_CANCELED
    Job cancelled
    Thanks,
    Hari Immadi
    http://immadi.com
    SEM BW Analyst

    Hi Hari,
    We were recently having similar problems with the delta for CCA_9, and activating index 4 on table COEP resolved our issues.
    Yes, by default there is a 2-hour safety interval for the CCA_9 DataSource. You could run this extractor hourly, but at extraction time you will only be pulling postings through 2 hours prior to the extraction time. This can be changed for testing purposes, but SAP strongly discourages changing this interval in a production environment. SAP does provide an alternative, described in OSS note 553561; you may check that note to see whether it would work for your scenario.
    Regards,
    Todd
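    For what it's worth, the effect of the safety interval Todd describes can be sketched in a few lines. The 2-hour figure is taken from the reply above, and the function name is purely illustrative:

```python
from datetime import datetime, timedelta

# Sketch of the 2-hour safety interval on 0CO_OM_CCA_9: a delta run at
# time t only pulls postings with a timestamp <= t - 2 hours; anything
# newer stays behind for the next run. The interval length is an
# assumption taken from the reply above (OSS note 553561 discusses
# alternatives).
SAFETY_INTERVAL = timedelta(hours=2)

def delta_upper_bound(extraction_time):
    """Latest posting timestamp a delta run will pick up."""
    return extraction_time - SAFETY_INTERVAL

run = datetime(2008, 5, 1, 14, 0)
print(delta_upper_bound(run))  # 2008-05-01 12:00:00
```

    This is why an hourly schedule still only ever sees postings that are at least 2 hours old, which is "processing overdue" territory if a monitor expects fresher data.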

  • Records missed in an SQL statement

    Hi experts,
    I run an Open SQL statement like:
        SELECT UC_POD_EXT
               UC_DATEFRO
               UC_DATETO
               /BIC/ZKWH_MR
               /BIC/ZKWH_BILL
               /BIC/ZBI_MRCON
          FROM /BIC/AZFC_D9900 INTO TABLE LT_BILL
          FOR ALL ENTRIES IN SOURCE_PACKAGE
         WHERE UC_POD_EXT = SOURCE_PACKAGE-UC_POD_EXT.
    SOURCE_PACKAGE has only 1 record, and that record's UC_POD_EXT is 10032789420750421. It returns 31 records.
    Then I test another statement:
        SELECT UC_POD_EXT
               UC_DATEFRO
               UC_DATETO
               /BIC/ZKWH_MR
               /BIC/ZKWH_BILL
               /BIC/ZBI_MRCON
          FROM /BIC/AZFC_D9900
          INTO CORRESPONDING FIELDS OF TABLE LT_BILL
         WHERE UC_POD_EXT = '10032789420750421'.
    It returns 34 records.
    So some records are missed by the first statement, and I don't know why.

    Please retrieve all the key fields of table /BIC/AZFC_D9900. This happens because of the FOR ALL ENTRIES clause (as mentioned by Neil), which deletes duplicate records across the fields you are selecting. Once you retrieve all the key fields of /BIC/AZFC_D9900, you are assured of getting unique records.
    Regards
    Ranganath
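    Ranganath's point, that FOR ALL ENTRIES implicitly de-duplicates the result across exactly the columns you select, can be simulated outside ABAP. A sketch in Python (field names borrowed from the SELECT above; the row values and the extra key field KEY2 are made up for illustration):

```python
# Simulate ABAP's FOR ALL ENTRIES behaviour: the result set is
# de-duplicated across exactly the columns you SELECT, so rows that
# differ only in an unselected key field collapse into one.
def for_all_entries(rows, selected_cols):
    seen = set()
    result = []
    for row in rows:
        projected = tuple(row[c] for c in selected_cols)
        if projected not in seen:       # implicit "SELECT DISTINCT"
            seen.add(projected)
            result.append(projected)
    return result

# Hypothetical rows from /BIC/AZFC_D9900: the first two rows share all
# selected values and differ only in a key field that is not selected.
rows = [
    {"UC_POD_EXT": "1003", "UC_DATEFRO": "20100101", "ZKWH_MR": 10, "KEY2": "A"},
    {"UC_POD_EXT": "1003", "UC_DATEFRO": "20100101", "ZKWH_MR": 10, "KEY2": "B"},
    {"UC_POD_EXT": "1003", "UC_DATEFRO": "20100201", "ZKWH_MR": 12, "KEY2": "A"},
]

without_keys = for_all_entries(rows, ["UC_POD_EXT", "UC_DATEFRO", "ZKWH_MR"])
with_keys = for_all_entries(rows, ["UC_POD_EXT", "UC_DATEFRO", "ZKWH_MR", "KEY2"])
print(len(without_keys), len(with_keys))  # 2 3
```

    Adding the full key to the field list is exactly what turns the 31-record result back into 34: no two rows can then project to the same tuple.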

  • Missing records while fetching data from Data mart

    Hi,
    I have some missing records in the ODS. The data is fetched from another BW system.
    It is a delta load and all the loads have been successful so far, but still some records are missing.
    When I look in the reconstruction tab, some requests show the transfer structure status as a clock ("the transfer structure has changed since the last request"). Is it because this timestamp status changed?
    I have done a re-initialization and the missing data was fetched, but I would like to know the cause of the missing records.
    Regards,
    Anita

    Hi Kedar,
    If there was a timestamp difference, the data load should have failed, yet all the delta loads were successful. But now they have realised that some records are missing. Just for analysis purposes I was looking at the reconstruction tab, and the transfer structure status was displayed as a clock, although there was actually no change in the transfer structure.
    Sometimes we used to get a timestamp error, and we would replicate the DataSource and activate the transfer structure.
    What needs to be done to avoid these things in future? This load is triggered through a process chain, and each time the data load status was successful. We can't replicate the DataSource on every load through the process chain unless the transfer structure has changed.
    My concern is whether this is the cause of the missing records or something else.
    Regards,
    Anita

  • Few records are missing while downloading to a Spreadsheet  from a Report

    Dear Gurus,
    A few records are missing when downloading a Z report to a spreadsheet. There are around 300 records, out of which 11 do not appear in the spreadsheet file after saving. The funny thing is that when I try to save in another format, like HTML or to the clipboard, all the records come through.
    When asked, the ABAPer said:
        Your report is coming out correctly. If the report were wrong, we could check the code in the Z report, but saving it to a spreadsheet is a standard program provided by SAP.
    He is also facing this problem for the first time.
    Can anybody help?
    Thanks in advance; you will get points if this is useful.
    Regards

    Hi,
    A few days back we got this kind of error when I tried to download the asset balances in Excel format.
    1) It was observed that one of the asset descriptions ended with a comma (","). Because of this, all the other details ended up stored in a single cell. Once we changed the master data of that asset, we were able to get the report properly.
    2) Another time, when we tried to download material master details, the description was not maintained for one of the materials. After maintaining the description, the problem was resolved.
    Hope this information is helpful to you.
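    The comma-in-description problem above is the classic unquoted-CSV failure: a comma inside a field shifts every following column. A small Python sketch (with hypothetical asset rows) showing the breakage and how proper CSV quoting avoids it:

```python
import csv
import io

# Hypothetical asset rows; one description contains embedded commas,
# which corrupts a naive comma-join but survives proper CSV quoting.
rows = [["100001", "Laptop, 15 inch,", "2500.00"],
        ["100002", "Printer", "400.00"]]

# Naive export: the embedded commas split the field into extra columns.
naive = "\n".join(",".join(r) for r in rows)
first_naive = naive.splitlines()[0].split(",")
print(len(first_naive))  # 5 columns instead of 3

# Quoted export via the csv module keeps exactly 3 columns per row.
buf = io.StringIO()
csv.writer(buf, quoting=csv.QUOTE_MINIMAL).writerows(rows)
parsed = list(csv.reader(io.StringIO(buf.getvalue())))
print([len(r) for r in parsed])  # [3, 3]
```

    Cleaning the master data (as in the reply above) works around the symptom; an export that quotes fields, as any standard CSV writer does, removes the cause.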

  • Some fields are missing while recording through shdb

    Dear ABAPers,
    I am doing a BDC for transaction F-27. Some fields, COBL-GSBER (business area) and COBL-KOSTL (cost center), are missing while recording through SHDB, but when I run the transaction manually the fields are displayed. I have also added those fields manually in the program, but they are still not captured. Can anyone please help me understand why this problem happens, and what the solution is?
    Thanks in advance.

    Hi,
    I can tell you that a few transactions cannot be handled with BDC, so for such transactions we do the BDC with a similar transaction. For example, the transactions FB60 and F-63 are both meant for parking, but we can't do a BDC for FB60 parking, so we do the BDC for F-63 instead.
    Similarly, when we do a BDC for vendors, a few fields don't appear; in that case we use a BAPI. So try to find an appropriate BAPI or a similar transaction that meets your need.
    Hope it's clear.
    Regards
    Sajid
