Problem in Data extraction for NEW GL DSO 0FIGL_O10

Hi,
I am facing a problem in the extraction of records from SAP to BW.
I have installed the Business Content for the New GL DSO 0FIGL_O10.
When I extract the data from SAP R/3 into this DSO (0FIGL_O10), records get overwritten.
For example, when I go to the Manage option (InfoProvider administration), the transferred records and the added records are not the same: the added records are fewer than the transferred records.
This is happening because of the key field definitions.
I have 16 characteristics in the key fields, which is the maximum I can have, but in some cases the data coming from R/3 is unique only beyond those 16 characteristics.
As a result, distinct records collapse into one in the DSO, and hence my balances for G/L accounts do not match SAP R/3.
There are 31 characteristics in total in the DataSource (0FI_GL_10), of which I can include only 16 in the key field area.
Please suggest a solution.
Regards,
Nilesh Labde

Hi,
For safety, the delta process uses a lower interval setting of one hour (this is the default setting). In this way, the system always transfers all postings made between one hour before the last delta upload and the current time. The overlap of the first hour of a delta upload causes any records that are extracted twice to be overwritten by the after image process in the ODS object with the MOVE update. This ensures 100% data consistency in BW.
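To see why the one-hour overlap is harmless, think of the MOVE update as a keyed overwrite: a record that arrives twice simply replaces itself. A minimal ABAP sketch of that behaviour (types and values are illustrative):

  TYPES: BEGIN OF ty_ods_rec,
           doc_no TYPE belnr_d,               " key field of the ODS object
           amount TYPE p LENGTH 13 DECIMALS 2,
         END OF ty_ods_rec.

  DATA: lt_active TYPE SORTED TABLE OF ty_ods_rec WITH UNIQUE KEY doc_no,
        ls_rec    TYPE ty_ods_rec.

  ls_rec-doc_no = '0000000100'.
  ls_rec-amount = '50.00'.
  INSERT ls_rec INTO TABLE lt_active.   " first arrival of the record

  " The overlap hour delivers the same after image a second time:
  MODIFY TABLE lt_active FROM ls_rec.   " overwritten, not duplicated
  " lt_active still contains exactly one row for doc_no '0000000100'.

The flip side is that records which differ only in non-key fields collapse the same way, which is exactly the problem you are seeing.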
But you can achieve your objective in a different manner:
Make a custom InfoObject ZDISTINCT and populate it in the transformation using ABAP code. In the ABAP routine, concatenate the values of several characteristics so that one compounded characteristic results, and use ZDISTINCT in your DSO as a key field.
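A hedged sketch of such a field routine, written in the style of the generated transformation routine method (the SOURCE_FIELDS components are illustrative, and ZDISTINCT must be long enough, e.g. CHAR60, to hold the concatenation):

  METHOD compute_ZDISTINCT.
  * Compound several characteristics into one value so that a single
  * key field restores the granularity lost to the 16-key-field limit.
    CONCATENATE source_fields-comp_code
                source_fields-gl_account
                source_fields-profit_ctr
                source_fields-segment
      INTO result SEPARATED BY '/'.
  ENDMETHOD.

Remember that changing the DSO key changes its granularity, so the data has to be dropped and reloaded afterwards.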
Just a thought; maybe it can solve your problem.
Ravish.

Similar Messages

  • Legacy data transfer for new depreciation area for already existing assets

    Hi all,
    I am doing a legacy data transfer for a new depreciation area for already existing assets through AS92.
    I would like to enter the ordinary depreciation amount posted in the current FY for these assets in AS92, but I find this field is not editable in AS92.
    Please guide me on how to upload the posted depreciation amount (for the current FY) for these assets. I have assets that were acquired in the last FY as well as in this FY.
    Regards,
    Srinivas

    Hi,
    I have to migrate posted depreciation to the new depreciation area for nearly 3000 assets. In transaction 147 I would also need to enter the acquisition value; please note that these are not new assets but prior-year assets with opening balances. If we use 147, the system will treat it as a current-year acquisition.
    Please note that my transfer date is 28.02.2010, and I have already run AFAB for this company code for Feb 2010 for the other depreciation areas.
    Now I am trying to update the opening APC cost and the opening accumulated depreciation for these assets, as well as the depreciation posted in this FY.
    In AS92 I am able to update the opening APC cost and opening accumulated depreciation successfully for the new depreciation area for the already existing assets, but the posted depreciation field is not editable.
    Please guide me on how to update the posted depreciation for a new depreciation area in AS92, or in some other way, for already existing assets.
    Regards,
    Srinivas

  • Error in data extraction for 0TCT_DS22

    Hi All,
    I am facing an issue with data extraction for 0TCT_DS22: it gives the message 'error in source system' and also produces a dump.
    Runtime Errors TSV_TNEW_PAGE_ALLOC_FAILED
    Date and Time 24.05.2010 11:22:53
    Short text
    No more storage space available for extending an internal table.
    What happened?
    You attempted to extend an internal table, but the required space was
    not available.
    As per the following thread I updated the table, but the data load is still failing (our system is at EHP1):
    DataSource 0TCT_DS22 not extracting Data
    Any pointers on what could be the solution for this data load failure?
    Thanks.

    Hi Neetika,
    This type of error occurs due to a lack of memory on the server, or because no work process is available to handle the request.
    Please refer to the forum thread below.
    Re: Runtime Errors  TSV_TNEW_PAGE_ALLOC_FAILED
    Have you done any custom coding? If yes, check your LOOP ... ENDLOOP statements (e.g. a loop that never terminates, or an internal table that is never released).
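    For illustration, the kind of coding that triggers TSV_TNEW_PAGE_ALLOC_FAILED is an internal table growing without bound, e.g. (a generic sketch, not taken from the 0TCT_DS22 extractor):

      DATA: lt_buffer TYPE STANDARD TABLE OF string,
            lv_line   TYPE string VALUE 'some payload'.

      DO.                              " loop with no EXIT condition
        APPEND lv_line TO lt_buffer.   " grows until memory is exhausted
      ENDDO.
      " Fix: give the loop a termination condition, process the data
      " in packages, and FREE lt_buffer once it is no longer needed.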
    Thanks & Regards,
    Ashish

  • E-Recruiting Data Transfer for New Employee

    Hi all,
    We are implementing E-Recruiting 3.0; ECC 6.0 and the E-Recruiting system are in the same system. We are currently processing internal recruitment: we receive the internal applications and are able to view and process them in the recruiter role. I can execute the sequence of activities of the application process group for the received applications; we have not integrated any XI system into the E-Recruiting process.
    After processing the "to be hired" activity, the next activity is "Data transfer for new employee"; this also executed successfully and transferred the employee's position, job, org unit details, etc.
    1. Is the transferred applicant data processed via transaction PA48 (hiring of transferred employees) for internal recruitment in the R/3 HR system, after the data transfer activity on the E-Recruiting portal screen?
    2. Is there any other role or application URL available for hiring an internal candidate on the E-Recruiting portal screen itself, without going over to the R/3 HR transactions, i.e. without going into PA48 for the hiring?
    3. Are there any specific roles we have to assign to the recruiter?
    4. Is any part of the process carried out in Manager Self-Service? (In my understanding, MSS is a different instance for processing internal and external recruitment for vacancy filling; please correct my view of MSS.)
    Your answers and suggestions will be highly appreciated!
    Thanks from,
    Venkatesh Mani.

    1. Is the transferred applicant data processed via transaction PA48 (hiring of transferred employees) for internal recruitment in the R/3 HR system, after the data transfer activity on the E-Recruiting portal screen?
    If you are selecting an internal candidate, you need not hire the employee again using PA48. You can trigger the transfer / change-of-position workflow using a PCR form.
    2. Is there any other role or application URL available for hiring an internal candidate on the E-Recruiting portal screen itself, without going into PA48 for the hiring?
    I don't think there is any specific action on the E-Recruiting side to run the hiring action. This needs to be done from R/3 only.
    3. Are there any specific roles we have to assign to the recruiter? Not sure about this.
    4. Is any part of the process carried out in Manager Self-Service?
    If it is an integrated portal with the Requestor role attached to the manager profile, managers will be able to carry out the selection of both internal and external candidates. I don't think you would require a different instance.
    We can also wait for other input from the forum gurus.
    Thanks.
    Gautham.

  • Master Data Load for New Attribute

    Hi Users,
    We had to implement a separate load flow for a new field coming from R/3. This field was to be added to an existing master data object.
    I added a new display attribute to the existing master data object 0GL_ACCOUNT.
    This new attribute, along with some other existing fields, gets its data from another master data object, with an InfoSource in between, because two transformations cannot be created for the same source and target.
    When I load the data, I don't see any data being populated for this new field. I ran the attribute change run, checked the keys, etc.
    The source object has data, but after executing the DTP no data arrives in this attribute. There are no routines or anything.
    Please suggest.
    Regards
    Zabi

    Hi,
    The situation is:
    Field x from the source maps to
    1. field y (the existing field), and also to
    2. field z, the new attribute.
    Field y has to be updated for company code 10, for example,
    and field z for company code 30.
    Now, if I use the same flow and map field x to both y and z, overwriting happens: if code 10 has no value for x but code 30 does, the result is not good.
    So if I use a separate flow with an InfoSource, I map only x to z. After the loads this means: if for code 10 no value went to y in the first DTP, and code 30 has a value for x, then only z is updated and y remains empty...
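    If you have to stay with a single flow instead, one hedged option is an end routine that refuses to overwrite with empty values; all names below are illustrative (ZNEWATTR stands in for the new attribute):

      * End routine sketch: keep the existing attribute value whenever
      * the current load delivers an initial one.
      FIELD-SYMBOLS <result_fields> TYPE tys_tg_1.   " generated type
      LOOP AT result_package ASSIGNING <result_fields>.
        IF <result_fields>-/bic/znewattr IS INITIAL.
          SELECT SINGLE /bic/znewattr
            FROM /bi0/pgl_account                " P table of 0GL_ACCOUNT
            INTO <result_fields>-/bic/znewattr
            WHERE chrt_accts = <result_fields>-chrt_accts
              AND gl_account = <result_fields>-gl_account.
        ENDIF.
      ENDLOOP.

    With your separate InfoSource flow, the same guard could go into that transformation instead.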

  • Data Structure for New General Ledger for mySAP 2004

    Guys,
    We are preparing to upgrade to mySAP 2004.
    mySAP 2004 does not provide a data migration tool for the upgrade from the "classic" ledger to the new ledger with its advanced functionality. To utilize all the new functionality, the data has to be migrated to the new ledger, so a data migration tool would have to be developed in ABAP. What is the difference in the data structure of the new ledger?
    If anybody has already been through this upgrade and could share some example programs for the mySAP 2004 upgrade, my e-mail is [email protected].
    Thanks in advance,
    Mike

    Now I have the same problem. Does anyone have a solution for this matter?

  • HFM data extract for Multiload

    I am very new to this and need some help, please. When extracting HFM data for an FDM multiload, what is the best way to get the data for the periods into columns rather than rows, without formatting the data manually?

    The data extract interface is limited and does not allow you to have multiple columns for the periods. What you can try is to create a Smart View retrieve (if the number of rows is not too large) and save the file as a CSV, then replace all commas with semicolons in a text editor. Otherwise, if you can write Excel VBA code, that could also be an alternative; again, the number of lines could be a potential issue.
    If you don't have to run it through translation tables, you can probably accomplish most of the changes using a strong text editor such as K-edit.

  • Problem in data extraction using 2LIS_11_VAITM

    Hi,
    For sales order item details I am using the standard DataSource 2LIS_11_VAITM,
    and for this I have not written any code in an exit.
    But for certain dates, some of the sales document item data was pulled to BI by the delta upload and some was not.
    Since some sales order items are missing in BI, I identified those sales orders in R/3.
    The problem occurs only for records created on 04.10.2008, 27.10.2008 and 01.11.2008.
    The process chains of 05.10.2008, 28.10.2008 and 02.11.2008, which were supposed to pull the data for 04.10.2008, 27.10.2008 and 01.11.2008, did not execute: the error is in the delta upload.
    On 05.10.2008, 28.10.2008 and 02.11.2008 the server was down at the scheduled time of the process chain,
    so the data was supposed to come with the next delta upload.
    On 06.10.2008, 29.10.2008 and 03.11.2008 the deltas executed successfully,
    but not all of the records created on 04.10.2008, 27.10.2008 and 01.11.2008 were pulled to BI, as I checked in the PSA.
    Only some of the records created on 04.10.2008, 27.10.2008 and 01.11.2008 are in the PSA.
    Please suggest your ideas.
    Points will be awarded.

    Hi,
    Are you loading to a DSO?
    If so:
    1) Fill the setup table only for the dates for which you missed records.
    2) Once the setup table is filled, do a full repair load to the targets for these dates.
    3) Schedule a normal delta to the further targets from this DSO.
    If you are loading to the cube directly:
    1) Do a selective deletion from the cube for these dates.
    2) Fill the setup table only for these dates.
    3) Do a full repair load to the targets for these dates.
    In either case there is no need to delete or touch the existing init.
    Thanks
    Ajeet

  • Data extraction for BW/BI

    Hi,
    I am new to BW. Can anyone send me material on data extraction in BW? I mainly want material on LO extraction. If anyone could provide material on extractors such as the LO Cockpit, generic DataSources, etc., I would be really thankful. Please send the material to "baljinder4u_gmail.com".
    Thanks in advance

    Hi Rakesh
    Step-by-step control flow for a successful data extraction with SAP BW:
       1.  An InfoPackage is scheduled for execution at a specific point of time or for a certain system- or user-defined event.
       2.  Once the defined point of time is reached, the SAP BW system starts a batch job that sends a request IDoc to the SAP source system.
       3.  The request IDoc arrives in the source system and is processed by the IDoc dispatcher, which calls the BI Service API to process the request.
       4.  The BI Service API checks the request for technical consistency. Possible error conditions include specification of DataSources unavailable in the source system and changes in the DataSource setup or the extraction process that have not yet been replicated to the SAP BW system.
       5.  The BI Service API calls the extractor in initialization mode to allow for extractor-specific initializations before actually starting the extraction process. The generic extractor, for example, opens an SQL cursor based on the specified DataSource and selection criteria.
       6.  The BI Service API calls the extractor in extraction mode. One data package per call is returned to the BI Service API, and customer exits are called for possible enhancements. The extractor takes care of splitting the complete result set into data packages according to the IDoc control parameters. The BI Service API continues to call the extractor until no more data can be fetched.
       7.  Finally, the BI Service API sends a status IDoc notifying the target system that request processing has finished (successfully, or with errors specified in the status IDoc).
    Note
    Control parameters specify the frequency of intermediate status IDocs, the maximum size (either in kilobytes or number of lines) of each individual data package, the maximum number of parallel processes for data transfer, and the name of the application server to run the extraction process on.
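    To make step 6 concrete: for transaction data the customer-exit hook is enhancement RSAP0001, function module EXIT_SAPLRSAP_001, include ZXRSAU01. A hedged sketch for 2LIS_11_VAITM (ZZFIELD is a hypothetical append field in the extract structure):

      * Include ZXRSAU01: enrich extracted data packages before they
      * are handed back to the BI Service API.
      DATA ls_vaitm TYPE mc11va0itm.   " extract structure of 2LIS_11_VAITM
      CASE i_datasource.
        WHEN '2LIS_11_VAITM'.
          LOOP AT c_t_data INTO ls_vaitm.
            ls_vaitm-zzfield = 'X'.    " hypothetical append field
            MODIFY c_t_data FROM ls_vaitm.
          ENDLOOP.
      ENDCASE.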
    Here is the LO Cockpit procedure, step by step:
    LO EXTRACTION
    - Go to Transaction LBWE (LO Customizing Cockpit)
    1). Select Logistics Application
           SD Sales BW
                Extract Structures
    2). Select the desired Extract Structure and deactivate it first.
    3). Give the Transport Request number and continue
    4). Click on 'Maintenance' to maintain the extract structure
           Select the fields of your choice and continue
                 Maintain DataSource if needed
    5). Activate the extract structure
    6). Give the Transport Request number and continue
    - The next step is to delete the setup tables
    7). Go to T-Code SBIW
    8). Select Business Information Warehouse
    i. Setting for Application-Specific Datasources
    ii. Logistics
    iii. Managing Extract Structures
    iv. Initialization
    v. Delete the content of Setup tables (T-Code LBWG)
    vi. Select the application (01 – Sales & Distribution) and execute
    - Now, Fill the Setup tables
    9). Select Business Information Warehouse
    i. Setting for Application-Specific Datasources
    ii. Logistics
    iii. Managing Extract Structures
    iv. Initialization
    v. Filling the Setup tables
    vi. Application-Specific Setup of statistical data
    vii. SD Sales Orders – Perform Setup (T-Code OLI7BW)
            Specify a run name, time, and date (use a future date)
                 Execute
    - Check the data in Setup tables at RSA3
    - Replicate the DataSource
    Use of setup tables:
    You fill the setup tables in the R/3 system and extract the data to BW (setup table maintenance is under SBIW); after that you can do delta extractions by initializing the extractor.
    Full loads are always taken from the setup tables.
    Please follow the link:
    https://www.sdn.sap.com/irj/sdn/advancedsearch?query=dataEXTRACTIONIN+BW&cat=sdn_all
    Regards
    Tapashi
    Edited by: Tapashi Saha on Aug 18, 2008 11:03 AM

  • Problem with data exchange for sales orders from CRM to R/3

    Dear Friends:
    I ran the data exchange for sales orders from CRM to R/3 today; the problem in detail is as follows:
    When I save a sales order in CRM (version 5.0), it automatically generates a BDoc of type BUS_TRANS_MSG, but the BDoc status is always "Sent to receivers (not all have confirmed)", and the original order in CRM cannot be changed; it reports "Document is being distributed - changes are not possible". So I checked the order status analysis in detail; it presents two error messages: "Event 'BEFORE_CHANGE', attribute 'FINI': Error code for function module 'CRM_STATUS_BEFORE_COMPLETED_EC'" and "Item is not yet completed in OLTP system". I then checked the order in R/3: it has already been created, without any error messages.
    Could you tell me how to solve this? Thanks for any ideas.

    Hi Benjamin,
    When performing uploads from CRM to R/3, there is a response from the OLTP system that is sent back to the CRM Middleware to confirm that the data records were received and processed correctly.
    Here is a checklist you can run through to verify that the connections, systems and objects that are needed are all in place:
    <b>On R/3 system:</b>
    - Check R/3 outbound queue (transaction SMQ1) for any entries that are not reaching CRM.
    - Check that all RFC destinations on R/3 are defined correctly and are pointing to CRM
    - Check the CRMCONSUM table in R/3 to ensure CRM is registered as a consumer
    - Check the CRMRFCPAR table in R/3 to ensure that order objects are valid for exchange between R/3 and CRM
    - Check for any short dumps in R/3 (ST22/ST21)
    <b>On CRM:</b>
    - Are there entries stuck in the inbound queue (SMQ2) with R3AU* names?
    - What does the CRM Middleware Trace show (SMWT)?  Sometimes this has more detail than the specific BDoc overview (SMW01)
    - Check for short dumps in CRM (ST22)
    Let us know what else you uncover and we can work from there.
    Brad

  • Data archiving for Write Optimized DSO

    Hi Gurus,
    I am trying to archive data in a write-optimized DSO.
    It allows me to archive on a request basis, but it archives entire requests in the DSO (meaning all the data).
    However, I want to select a range of requests to archive (a request selection of my own).
    Please guide me.
    I got the details below from SDN; kindly check.
    Archiving for write-optimized DSOs follows request-based archiving, as opposed to the time-slice archiving in a standard DSO. This means that partial request activation is not possible; only complete requests can be archived.
    The characteristic for the time slice can be a time characteristic present in the WDSO, or the request creation date/request loading date. You are not allowed to add additional InfoObjects for semantic groups; the default is 0REQUEST & 0DATAPAKID.
    The actual process of archiving remains the same, i.e.:
    Create a Data Archiving Process
    Create and schedule archiving requests
    Restore archiving requests (optional)
    Regards,
    kiruthika

    Hi,
    Please check the OSS note below:
    http://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/com.sap.km.cm.docs/oss_notes/sdn_oss_bw_whm/~form/handler%7b5f4150503d3030323030363832353030303030303031393732265f4556454e543d444953504c4159265f4e4e554d3d31313338303830%7d
    -Vikram

  • Problem in data sources for transaction data through flat file

    Hello Friends,
    While creating the DataSource for transaction data from a flat file, I am getting the following error: "Error 'The argument '1519,05' cannot be interpreted as a number' while assigning character to application structure", message no. RSDS016.
    If anyone has come across this issue, please provide me with the solution.
    Thanks in Advance.
    Regards
    Ravi

    Hello,
    just for information:
    I had the same problem.
    I changed the field type from CURR to DEC and set the format to external instead of internal.
    Then the import from the flat file worked fine.
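    The root cause is the decimal comma in the file ('1519,05') arriving in a field that expects internal format. If you prefer to repair the value in a routine instead, a minimal sketch (assuming a comma decimal separator and no thousands separators in the file):

      DATA: lv_raw    TYPE string VALUE '1519,05',
            lv_amount TYPE p LENGTH 13 DECIMALS 2.

      REPLACE ',' IN lv_raw WITH '.'.   " decimal comma -> period
      lv_amount = lv_raw.               " the string now converts cleanly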
    Thank you.

  • Data source for new FIGL

    Hi:
    I generated the DataSource for the totals records of FIGL (new).
    But how do I get a DataSource for the line items?
    Should I still use 0FI_GL_4 for that?
    Thanks.
    Eric

    No, you need to create your own generic DataSource for the line items - it is not delivered.
    0FI_GL_4 reads BSEG and BKPF,
    whereas the new G/L line items are based on table FAGLFLEXA.
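    For a quick sanity check of what such a generic DataSource would read (e.g. one built in RSO2 on FAGLFLEXA, or on a view joining it with BKPF), a hedged sketch, with '0L' assumed to be the leading ledger:

      DATA lt_items TYPE STANDARD TABLE OF faglflexa.

      SELECT * FROM faglflexa
        INTO TABLE lt_items
        UP TO 10 ROWS
        WHERE rldnr = '0L'.   " leading ledger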

  • IDOC: Problem with data filter for IDOC extension field

    Hello!
    I've created an IDoc extension for the basic type DEBMAS06 that works fine. Now I want to use a data filter for one field (company code) of my segment. Every segment with a company code other than 100 should be filtered out and not sent to the other client. But what happens is that for all customers that have at least one company code other than 100, all segments, including the one with company code 100, are deleted, and the error "Segment ... does not exist for message type DEBMAS" appears on the screen.
    Does anyone have any ideas about this problem?

    I am not sure about the changes to be made to the filtering options.
    An alternative would be sending the data to XI as it is and performing a mapping there to remove the unnecessary segments.
    Disadvantage: unnecessary processing of the segments is done in XI.
    Advantage: the integration logic is handled completely by XI.
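    If the filtering has to stay in R/3, one hedged alternative is to drop the unwanted segments in an outbound IDoc user exit before dispatch. Everything below (the segment name Z1KNB1M, its structure, and the table name T_IDOC_DATA) is illustrative:

      * Sketch: remove extension segments for foreign company codes.
      DATA ls_ext TYPE z1knb1m.            " hypothetical extension segment
      FIELD-SYMBOLS <ls_edidd> TYPE edidd.
      LOOP AT t_idoc_data ASSIGNING <ls_edidd>
           WHERE segnam = 'Z1KNB1M'.
        ls_ext = <ls_edidd>-sdata.
        IF ls_ext-bukrs <> '100'.
          DELETE t_idoc_data.              " drops the current loop line
        ENDIF.
      ENDLOOP.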
    Regards,
    Prateek

  • Problems with Delta Extraction for 0CRM_OPPT_H (no data found)

    Hi,
    I have some problems with the delta extraction for the InfoSource 0crm_oppt_h (CRM Opportunities Header). After initialization I get no delta data from the CRM system.
    What I have already done:
    Activated the 0crm_oppt_h DataSource (checked its functionality with RSA3)
    Started the InfoPackage (init) on the BW side (worked fine)
    Checked the status of the DataSource on the CRM system using BWA7 ("initial upload" is unmarked, "delta active" is marked, and what worries me is that the column "Queue exists" is unmarked...)
    If I change anything in an opportunity (such as phase or expected sales volume), the delta extraction picks up no changes.
    Could you help me out, please?
    Best regards,
    Markus Svec

    Hi Markus,
    try checking OSS note 788172:
    Release Status Released for Customer
    Released on 23.03.2005
    Priority Correction with high priority
    Category Program error
    Symptom
    No data exists in the delta extraction from the CRM server to the BW system for business transactions if parallel processing is applied as per note 639072. Data is extracted if parallel processing is switched off, i.e. when BWA_NUMBER_OFF_PROCESSES is set to 1 there is data during the delta. This applies to the following DataSources:
    0BBP_TD_CONTR_1
    0CRM_COMPLAINTS_I
    0CRM_LEAD_ATTR
    0CRM_LEAD_H
    0CRM_LEAD_I
    0CRM_OPPT_ATTR
    0CRM_OPPT_H
    0CRM_OPPT_I
    0CRM_QUOTATION_I
    0CRM_QUOTA_ORDER_I
    0CRM_SALES_ACT_1
    0CRM_SALES_CONTR_I
    0CRM_SALES_ORDER_I
    0CRM_SRV_CODES
    0CRM_SRV_CONFIRM_H
    0CRM_SRV_CONFIRM_I
    0CRM_SRV_CONTRACT_H
    0CRM_SRV_PROCESS_H
    0CRM_SRV_PROCESS_I
    Other terms
    DataSources, BWA, initial extraction, delta init, parallel processing, no data in delta.
    Reason and Prerequisites
    There is an update on the generated delta table which causes data corruption in running delta initializations, as the changed delta sets are deleted with every further update on documents. There is an OPEN CURSOR statement without a FETCH in SMOX3_GET_DATA.
    Solution
    The problem is solved with the attached corrections. After applying the corrections, a new initialization of the affected DataSources is necessary.
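    For context on the "open cursor without fetch" remark: the usual package-wise extraction pattern pairs every OPEN CURSOR with FETCH calls, e.g. (a generic sketch, not the corrected SMOX3_GET_DATA code; CRMD_ORDERADM_H is just an example table):

      DATA: lv_cursor TYPE cursor,
            lt_pkg    TYPE STANDARD TABLE OF crmd_orderadm_h.

      OPEN CURSOR WITH HOLD lv_cursor FOR
        SELECT * FROM crmd_orderadm_h.
      DO.
        FETCH NEXT CURSOR lv_cursor
          INTO TABLE lt_pkg
          PACKAGE SIZE 1000.
        IF sy-subrc <> 0.
          EXIT.                        " no more data
        ENDIF.
        " ... hand lt_pkg over as one data package ...
      ENDDO.
      CLOSE CURSOR lv_cursor.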
