Delta Upload Problem For 0PM_CO1 Cube

Hi Friends,
I have a problem with a delta upload. In RSA3 the data is OK, and in RSA7 the data is also OK, but new records are not coming through with the delta upload. I did the first initialization and that was fine, so why is the delta upload not bringing anything?
Please help me with this.
Thanks in Advance.

Hi Praksh,
This is Chava. Can you send your mobile number to my mail ID?
Mail:[email protected]

Similar Messages

  • Delta Upload Problem

    Hi All,
    I have an ODS which feeds the cube; I designed it this way to give the cube delta capability. I loaded 17,000 records (delta upload) to the ODS and it says transferred 17,000 and added 17,000, but only 12,000 records were transferred to the cube, and there are no routines in the cube's update rules from the ODS. I am not sure why only 12,000 records reached the cube when 17,000 were added to the ODS.
    Thanks,
    Shetty.

    Hi Shetty,
    If your keys are key1, key2, key3, key4, with Qty and Value as key figures, and you want new records to replace existing ones, you have to load in overwrite mode.
    It works like this:
    key1 key2 key3 key4 Qty value
    A     B   C    D    10  100  -->> already there
    If you then load the following new data from a flat file in overwrite mode:
    key1 key2 key3 key4 Qty value
    A     B   C    D    15  150 
    A     B   C    E    20  200
    After this load your ODS data will be:
    key1 key2 key3 key4 Qty value
    A     B   C    D    15  150 
    A     B   C    E    20  200
    Hope it Helps
    Srini
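    The overwrite behaviour Srini describes can be sketched in a few lines of Python. This is only a conceptual model of ODS overwrite mode, not actual BW code; the key and field names mirror the example above.

```python
# Minimal sketch of ODS overwrite mode: records sharing the same key
# replace the stored record; records with a new key are appended.
def load_overwrite(ods, records):
    """ods: dict mapping key tuple -> key figures; records: list of dicts."""
    for rec in records:
        key = (rec["key1"], rec["key2"], rec["key3"], rec["key4"])
        ods[key] = {"qty": rec["qty"], "value": rec["value"]}  # overwrite or insert
    return ods

ods = {("A", "B", "C", "D"): {"qty": 10, "value": 100}}  # already there
delta = [
    {"key1": "A", "key2": "B", "key3": "C", "key4": "D", "qty": 15, "value": 150},
    {"key1": "A", "key2": "B", "key3": "C", "key4": "E", "qty": 20, "value": 200},
]
load_overwrite(ods, delta)
# ods now holds the overwritten D record and the new E record
```

    In additive mode, by contrast, the Qty and Value for the D key would be summed (10 + 15) instead of replaced.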

  • Delta update for InfoCube: how does it work, and which modes are available?

    Hi all,
    How does a delta update work for an InfoCube? The generic DataSource delta update options are:
    1) Time stamp
    2) Calendar day
    3) Numeric pointer
    Please briefly explain the delta update for an InfoCube, and delta init versus delta update.
    Thanks,
    Gowda

    Hi,
    How the delta update works for an InfoCube:
    When you update data to the cube from an ODS, the data is extracted from the change log, and the delta pointer is set on the source request (in the source you can check the option beside the request that says "updated to target object" - it means those records have already been updated to the target, so the same request is not loaded again).
    In the case of PSA to cube, the PSA request is likewise stamped as last updated.
    Regarding generic DataSources:
    When no standard extractor fits, you can go for a generic one (for example, if you want sales information together with finance information in one DataSource, there is generally no standard one, so you create a generic DS).
    Check the link below for more about generic DataSources:
    http://wiki.sdn.sap.com/wiki/display/BI/Generic+Extraction
    For delta capturing you can use:
    Time stamp - if the table has a time stamp field, the last changed records can be identified from it, so the delta is derived from the time stamp.
    Calendar day - if the table has no time stamp field, look for a calendar-day field, so the delta can be derived from the date on which documents were changed.
    Numeric pointer - if the table has neither of the above, use this option, where the delta is derived from an ever-increasing numeric value.
    Re: Extraction, Flat File, Generic Data Source, Delta Initialization
    Regards,
    Satya
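    All three generic delta options work the same way conceptually: remember the highest pointer value extracted so far and select only rows beyond it, minus a lower safety interval that re-reads a small overlap so late commits are not lost. A rough Python sketch of a time stamp delta; the row structure and the `changed_at` field are invented for illustration, not real extractor structures.

```python
from datetime import datetime, timedelta

def extract_delta(rows, last_pointer, safety_seconds=10):
    """Select rows changed after the stored delta pointer, shifted back
    by a safety interval so records committed late are still caught."""
    lower = last_pointer - timedelta(seconds=safety_seconds)
    delta = [r for r in rows if r["changed_at"] > lower]
    new_pointer = max((r["changed_at"] for r in rows), default=last_pointer)
    return delta, new_pointer

rows = [
    {"doc": 1, "changed_at": datetime(2024, 1, 1, 9, 59, 0)},
    {"doc": 2, "changed_at": datetime(2024, 1, 1, 10, 0, 30)},
]
delta, pointer = extract_delta(rows, last_pointer=datetime(2024, 1, 1, 10, 0, 5))
# only doc 2 falls after the safety window; the pointer advances to 10:00:30
```

    The overlap means the same record can occasionally be selected twice, which is why generic time stamp deltas are usually loaded into an ODS in overwrite mode rather than additively into a cube.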

  • After deleting the ODS change log, is delta upload OK?

    Hi all,
    I am currently reducing data on the BW server. For this purpose, I will delete the ODS change log. Both the loads from the ODS and into the ODS use delta upload.
    If I delete the change log after the last delta initialization, will there be any problem with the next delta upload?
    For example, the last delta initialization was done on April 1, and I delete the change log up to April 30. After this operation, can any problem be expected with the delta upload?
    Bamboo Shampoo

    I don't think it will cause any issue with the next delta loads.
    Data goes from the ODS to its targets from the change log table.
    But it is better to have a strategy like "delete change log data only for requests older than, say, one or two months",
    so that if you have problems with recent data loads, you can reload easily.
    You can also try cleaning up old requests from the PSA tables in the system.
    cheers,
    Vishvesh

  • Uploading problems with Safari 4.1.3 -

    Hello, I just thought this was a funny question to ask -
    On my PowerPC G5 tower running OS 10.4.11, my question is about uploading problems in Safari version 4.1.3 with Facebook and YouTube. I always have to use Firefox version 3.6.13 instead, and not without trouble either.
    I get it: I need a new computer. But I thought it fascinating that one browser works 90% of the time and the other never works.

    Firefox is far more up to date than Safari on 10.4.11, but TenFourFox is the most up-to-date browser for our PPCs; they even have G4 and G5 optimized versions:
    http://www.floodgap.com/software/tenfourfox/
    Also, I think those sites require Flash and/or Java, which may be a problem on 10.4.

  • Problem in Upload for HR Cube

    Hello,
    I am facing one problem when i am uploading data in HR Cube Qualification (0PAPD_C01).
    When I upload from R3 to BW it brings 0 records, yet the request shows a green status.
    But in my R3 extractor, 7,350 records are available.
    Using the same connection, I have loaded data for FI and Sales, and everything works fine.
    Even in HR, the other functionalities like
    Headcount and Personnel Actions and
    Applications and Applicant Actions
    are working fine. Only Qualification is giving problem.
    I have replicated and activated properly for Qualification info source.
    Monitor Details:
    No data available
    Diagnosis
    The data request was a full update.
    In this case, the corresponding table in the source system does not
    contain any data.
    System response
    Info IDoc received with status 8.
    Procedure
    Check the data basis in the source system.
    Can anyone help? Are there any special settings I have to make for Qualification?
    Regards,
    SGK

    Hi,
    Check the following:
    1. That you have no filters in the InfoPackage, transformations, or update rules.
    2. That you have authorization to extract the data.
    3. That you uploaded the master data first - HR is centred on master data.

  • Delta Upload for Flat File

    Hello Everyone
    I am Srikanth. I would like to know whether there is a facility for delta upload from a flat file. If yes, can you please give me the steps?
    thanks in advance
    srikanth

    Hi Sabrina, thank you for your help. I loaded the data from the cube to the ODS with these steps:
    1. I generated an export DataSource on the cube.
    2. I found the name of the cube, with the prefix 8<infocube name>, in the InfoSource under the DM application component.
    3. The communication structure and transfer rules were already activated, but when creating the update rules for the ODS I got the message that 0RECORDMODE is missing in the InfoSource.
    4. So I went to the InfoSource, added 0RECORDMODE to the communication structure, and activated it, but the transfer rules stayed yellow; there was no object assigned to 0RECORDMODE, but I activated anyway.
    5. I went back to the ODS and created and activated the update rules (this time I got no message about 0RECORDMODE).
    6. I created an InfoPackage and loaded.
    a) Now my question: without a green status on the transfer rules, how did data get populated into the ODS? In your answer you mentioned creating the communication structure and transfer rules, but I did not do anything there.
    b) Will I face any problems if I keep loading data into the ODS from the cube with the transfer rules in yellow? Is this the correct procedure? Please correct me. Thanks in advance.

  • RSA7 shows zero records for delta upload of 2LIS_03_BF Datasource

    Dear Professionals,
    I am having constant issues with the inventory management DataSource (2LIS_03_BF). I scheduled the delta upload in the R3 production system using LBWE, keeping in mind that LBWE has no options for posting dates, etc. In RSA3 there are records, but when I check RSA7 it shows 0 records. What could be the problem? Do I need to do any activation in LBWE or RSA7 to see the records? Please advise.

    Hi,
    Normally there should be a collective run which transfers data from the application-specific queues to the BW delta queue (RSA7), just like this workflow:
    MCEX03 queue==> RMBWV03 (Collective run job) ==> RSA7-Delta queue => BW
    It is necessary to schedule the job control regularly - see point 3 of SAP Note 505700.
    Check also these SAP Notes:
    869628  Constant WAITUPDA in SMQ1 and performance optimization
    728687  Delta queued: No data in RSA7
    527481  tRFC or qRFC calls are not processed
    739863  Repairing data in BW
    Rgds,
    Colum
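    Conceptually, the collective run just drains the application queue into the BW delta queue; if it never runs, RSA7 stays at zero even though postings exist. A toy Python model of the workflow Colum describes (the queue names come from his post; the list-based queues are purely illustrative):

```python
def collective_run(app_queue, delta_queue):
    """Model of RMBWV03: move all queued LUWs from the application
    queue (MCEX03) into the BW delta queue (RSA7)."""
    delta_queue.extend(app_queue)
    app_queue.clear()
    return len(delta_queue)

mcex03 = ["luw1", "luw2", "luw3"]  # postings collected by the V3 update
rsa7 = []                          # empty until the collective run executes
collective_run(mcex03, rsa7)
# rsa7 now holds all 3 LUWs; mcex03 is drained
```

    This is why RSA3 (which reads the application tables directly) can show records while RSA7 shows none: the transfer step in the middle has simply not been scheduled.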

  • Delta upload  for generic data sources.

    Hi All,
    I tried it as per the SAP online documentation, but I am still not getting the result, i.e. a delta upload on the BW side. Here I explain everything I did.
    In R/3:
    1. I created a table with 3 fields: SNO, SNAME, DOB.
    2. I created some entries in that table.
    3. With transaction RSO2 I created a DataSource; I chose a master data attribute DataSource.
    4. In the generic delta I chose DOB as the delta field, type Time stamp, with an upper limit of 10 seconds.
    On the BW side:
    1. I replicated the DataSource under the application component in which I created it in R/3.
    2. I activated the DataSource and created an InfoPackage for it.
    3. In the selection I specified 01.01.1900 to 01.01.9999.
    4. First I ran a full update and got all the records from R/3.
    5. In R/3 I created 2 more entries.
    6. In the InfoPackage Update tab I selected Initialize Delta Process (Initialization with Data Transfer).
    For this I get the following error message:
    Error message from the source system
    Diagnosis
    An error occurred in the source system.
    System Response
    Caller 09 contains an error message.
    Further analysis:
    The error occurred in Extractor .
    Refer to the error message.
    Procedure
    How you remove the error depends on the error message.
    Note
    If the source system is a Client Workstation, then it is possible that the file that you wanted to load was being edited at the time of the data request. Make sure that the file is in the specified directory, that it is not being processed at the moment, and restart the request.
    I closed everything in R/3 as per the note above, but it still gives the error message.
    If I select Initialize Delta Process (Initialization without Data Transfer), I then get the Delta Update option. I selected this delta update and scheduled it, but no data comes.
    Please help.
    Regards
    Prashanth K

    Hi Sachin,
    I am getting the problem at the PSA itself. It is connected to an ODS object only. Unless we get the delta data into the PSA we cannot proceed further, so please help. I am working on NW 2004s.
    Regards
    Prashanth K

  • Delta upload for DB connect.

    hi all,
    I have a small query: I have to load data into a cube every month, and my data source is a SQL table.
    I will load the data using DB Connect.
    Is there a delta update option for DB Connect? In my InfoPackage I can see only the "Full upload" option.
    If delta upload is available, how do I get this option in the InfoPackage?
    thanks a lot.
    Regards,
    Shruti

    Hi Shruti,
    In the InfoPackage, in the data selection tab, you can see some fields for selection. On those fields you can write an ABAP routine with logic that finds the last update date and then populates the selection field with that last update date + 1. The InfoPackage will then dynamically fetch only the required records.
    Regards,
    Amit Mittal.
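    Amit's workaround is a "pseudo-delta": the load is technically full, but the selection routine restricts it to rows newer than the last successful load. A minimal Python sketch of the idea; the field name LOAD_DATE and the selection-dict shape are assumptions for illustration, not the real InfoPackage routine interface.

```python
from datetime import date, timedelta

def pseudo_delta_selection(last_update: date, today: date) -> dict:
    """Restrict a full load to the window (last load date + 1) .. today,
    so only rows added since the previous run are fetched."""
    low = last_update + timedelta(days=1)
    return {"field": "LOAD_DATE", "low": low, "high": today}

sel = pseudo_delta_selection(date(2024, 5, 31), date(2024, 6, 15))
# selection covers 2024-06-01 through 2024-06-15
```

    Note that this only works if the source table has a reliable date (or similar monotonically increasing) column, and it will miss rows whose date column is updated retroactively.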

  • Delta upload for the FI-SL totals extractor

    Hello,
    It is said that the BALANCE field in the transfer structure is not supplied with data during a delta upload.
    Would you know the reason for this?
    I assume this is for the totals extractor only since balance is not available otherwise.
    Regards
    Pascal

    Because the change record updates the movements, not the balance.
    You can also back-post, i.e. post to a period already gone, so if you wanted the balance to be provided, the extractor would have to change every period's balance brought forward from the back-posted period onwards.
    The easiest way is to derive it in the cube (as per the OSS note) and let compression deal with getting rid of the extra records created.
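    The reasoning above can be shown with a small worked example (the figures are invented). Each period's balance is the opening balance rolled forward through the movements, so a single back-posted movement in an early period shifts every later period's balance:

```python
def balances_from_movements(opening, movements):
    """Roll an opening balance forward through per-period movements.
    A back-posted change to an early period shifts all later balances."""
    balances, running = [], opening
    for m in movements:
        running += m
        balances.append(running)
    return balances

print(balances_from_movements(100, [10, -5, 20]))  # [110, 105, 125]
# back-post +7 into period 1: every subsequent balance moves too
print(balances_from_movements(100, [17, -5, 20]))  # [117, 112, 132]
```

    A delta record can carry the movement (+7) cheaply, but keeping BALANCE correct would mean restating three periods, which is why the extractor leaves it empty and the cube derives balances instead.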

  • Delta Missing for Billing Cube

    Data is missing only for 25th Dec 08 in the billing cube; the rest is available in the cube.
    Data modeling:
    DataSource: 2LIS_13_VDITM & 2LIS_13_VDHDR
    Load type : Delta
    Target : Infocube
    System: BW3.1
    A possible solution is to delete the deltas from 25th Dec 08 until today and perform a repair full request for the period concerned (since the target is a cube, we have to delete the previous loads to avoid duplicate records),
    then:
    delete the init,
    initialize without data transfer,
    schedule the load back to delta.
    Before going with this method, do we have to delete/fill the setup table in R/3? I don't think so, as we are performing a full repair load in BW for the given date range.
    Guys, let me know whether I am right or not. Will it be OK to delete the delta loads from the 25th until today?
    Do you have any better method to get the missing billing data for the 25th?

    Alok,
    1. Delete the data in BW for the period from 25th Dec. One doubt: do we have to delete all the requests from the 25th until today, or just the 25th's data?
    2. Don't touch the existing init, as that adds unnecessary work.
    3. Delete the setup table in R3 first (in case there is some old data).
    4. Fill the setup table for application 13 for the date range you deleted; it will be filled for the missing documents.
    5. Do a repair full request for this selection. (P.S.: Stop the billing delta until you complete this correction.) I think we have to borrow help from a functional consultant to stop the postings.
    6. Restart the billing delta.
    --> Alok, if the delta queue is being updated while the setup run is also capturing postings up to the current date, there is a chance the same record ends up in both the setup table and the delta queue.
    As you are loading into a cube (additive), after loading the setup-table data through a repair full, the same record may come again via delta (due to new postings). That creates redundant data.
    So, as your problem is with one day's data on 25th Dec, it is better to delete the data for that day, or for that week, and reload it selectively. As you are loading historic data, no postings happen on those records, so there is no need to worry about the delta queue. Or:
    Find the missing records, run the setup for just those, and load. A bit tedious, but 100% safe.
    P.S.: Deleting all data from the 25th onwards and reloading is not a wise decision, as you are loading into a cube.
    Hope it's clear
    Srini

  • Need a fix for Muse upload problems to GoDaddy????  I found one!

    GoDaddy now has a new cpanel named "Plesk".  In that panel you will find File Manager.  (I use the Windows version).  In the left panel of File Manager, you will see the root directory and all of its subfolders.  When you click on any of these folders, you will see their contents in the right panel.  You will also see a folder labelled "httpdocs".  This is the default folder that is set up in GoDaddy to be your "home" folder.  So, to fix your upload problems, do these steps:
    1.  Click on the httpdocs folder to see it open in the right panel.  In the toolbar above the right panel, you will see a green plus sign with the word "New".  Click on this drop-down menu.
    2.  Choose "directory"
    3.  Type in the name of one of the folders that your site will need, e.g. images, css, assets, scripts, etc.  This basically creates a subfolder within the httpdocs folder, right?  (There is a way to see how your Muse site sets up these folders prior to publish or upload.  You do this by choosing "Export as HTML" in the File menu of Muse, creating a folder on your desktop (or wherever) for the HTML export, then inspecting its contents after your save it.)
    4.  Continue to create all the subfolders within httpdocs that you will need.  Unfortunately, you have to do this one folder at a time.
    5.  After you have them set up, begin the process of uploading your site, like this:
    6.  In the File menu in Muse, you'll of course see "Upload to FTP Host ...".  Click on it.
    7.  A dialog box will open.  In the top text field, type in your website name ... you must use www. with your entry
    8.  In the "FTP Host" field, type in your domain name again like this ... "ftp.yourdomain.com" ... NO FORWARD SLASHES!!! (even if you find instructions to the contrary, like in Filezilla)  (Also, your domain might be a .org or .net, instead of a .com)
    9.  In the "Host directory" field, YOU MUST ENTER "httpdocs"!!!!  Despite what you learn, DO NOT LEAVE THIS FIELD BLANK!!!
    10.  Then enter your GoDaddy username and password.  (If you set up a different username for your GoDaddy account and your Plesk account, like I did, then use your Plesk username and password.)
    11.  Click on "Upload:  All Files"
    The next few steps are tedious, but necessary to keep your site organized on GoDaddy and to avoid confusing things. You can probably get away with skipping them, but if your site doesn't load properly in a browser after doing the above steps, you'll need to click on each folder in the root directory that you copied as a subfolder in the "httpdocs" folder and delete the files. For example, in the left panel of File Manager, you'll see the root directory. Click on the "images" folder. If there are any images that you need for your site in that folder, delete them. Remember: you've already copied them over to the images subfolder in httpdocs during your last upload described in the instructions above. If you don't trust that you can delete them, click on the "httpdocs" folder, then click on the "images" folder and check the files. Same files, right? So go ahead and delete the site files that you find in the appropriate folders in the root directory, making sure that you don't delete the ones that are NOT duplicated within the "httpdocs" folder. Also remember that your .html pages will not require a subfolder to be set up in the "httpdocs" folder. They can just sit there, looking pretty. But you will have to delete them from the root directory to keep things tidy. Just click on the folder icon next to "root directory", and you'll see those html pages in the right panel. Again, delete them.
    REMEMBER:  To keep your Muse uploads to your GoDaddy site error free, you must ALWAYS make sure that "httpdocs" appears in the "Host directory" field in the "Upload to FTP Host ..." dialog box in Muse.  And all files and folders that your site needs MUST go in the "httpdocs" folder in the root directory of your site in GoDaddy's File Manager (now found in your Plesk Panel).
    I hope this wasn't too confusing.  If so, call GoDaddy.  I called them with this fix and they are using it in their phone support.

    Thank you! Thank you! Thank you! I was going out of my mind. And GoDaddy was no help (although they're usually really good.) I put all the folders in "httpdocs" and the website worked perfectly!

  • Migration Problem in 2lis_11_vaitm for info cube 0SD_C03(Sales Overview)

    Dear Experts
    I am facing an issue with the DataSources 2lis_11_vaitm and 2lis_11_vcitm for InfoCube 0SD_C03. When I create a transformation from the transfer rules, it gets created with warnings but no errors. When I try creating a transformation from the update rules, it gives errors about ABAP routines and start routines. I would then migrate 2lis_11_vaitm, but that cannot be done because of the above error. I have migrated other DataSources the same way. The reason I am doing this is to get transformations identical to the transfer and update rules.
    My objective is to migrate 2lis_11_vaitm and 2lis_11_vcitm to the 7.x ETL flow. Am I right about the sequence of steps? My main issue is that when my InfoPackage starts writing data into the PSA, it errors out for both 2lis_11_vaitm and 2lis_11_vcitm. The IDoc has the same problem, so migrating seems to be the only solution. Please help. FYI, my patch levels on the BI system:
    Component   Rel.  SP level  Support package        Description
    SAP_ABA     701   0010      SAPKA70110             Cross-Application Component
    SAP_BASIS   701   0010      SAPKB70110             SAP Basis Component
    PI_BASIS    701   0010      SAPK-70110INPIBASIS    Basis Plug-In
    SAP_BW      701   0010      SAPKW70110             SAP Business Warehouse
    BI_CONT     706   0000      -                      Business Intelligence Content

    The errors I get during transformation activation, when creating the transformation from the update rules, are the following:
    Start Routine: Syntax error in routine
    Rule (target: 0SUBTOT_1S, group: 01 Standard Group): Syntax error in routine
    Rule (target: 0SUBTOT_2S, group: 01 Standard Group): Syntax error in routine
    Rule (target: 0SUBTOT_3S, group: 01 Standard Group): Syntax error in routine
    Rule (target: 0SUBTOT_4S, group: 01 Standard Group): Syntax error in routine
    Rule (target: 0SUBTOT_5S, group: 01 Standard Group): Syntax error in routine
    Rule (target: 0SUBTOT_6S, group: 01 Standard Group): Syntax error in routine
    Rule (target: 0QUANT_B, group: 01 Standard Group): Syntax error in routine
    Rule (target: 0DOC_ITEMS, group: 01 Standard Group): Syntax error in routine
    Rule (target: 0NET_VAL_S, group: 01 Standard Group): Syntax error in routine
    Rule (target: 0COST_VAL_S, group: 01 Standard Group): Syntax error in routine
    Rule (target: 0GR_WT_KG, group: 01 Standard Group): Syntax error in routine
    Rule (target: 0NT_WT_KG, group: 01 Standard Group): Syntax error in routine
    Rule (target: 0VOLUME_CDM, group: 01 Standard Group): Syntax error in routine
    Rule (target: 0HDCNT_LAST, group: 01 Standard Group): Syntax error in routine
    Rule (target: 0BP_GRP, group: 01 Standard Group): Syntax error in routine
    Rule (target: 0CP_CATEG, group: 01 Standard Group): Syntax error in routine
    Rule 39 (target field: 0CRM_PROD Group: 01 Standard Group): incomplete master data rule
    Rule (target: 0DEB_CRED, group: 01 Standard Group): Syntax error in routine
    Rule (target: 0DOC_CLASS, group: 01 Standard Group): Syntax error in routine
    Rule (target: 0PROD_CATEG, group: 01 Standard Group): Syntax error in routine
    Key rule 44 (target field: 0VERSION): Initial update set
    There is one syntax error for each routine.

  • Problems of using of Aggregates for Transactional Cube

    Hi,
    Are there problems or disadvantages in using aggregates for a transactional cube?
    Tx.
    AndyML

    Hi,
    have a look at SAP's docu: http://help.sap.com/saphelp_nw04/helpdata/en/c0/99663b3e916a78e10000000a11402f/frameset.htm
    /manfred
