SAP BO Rapidmart for sales: Data loading issue

I could successfully load data into the Sales Rapid Mart [XI 3.2] from SAP R/3 up to QA [our DEV and QA are the same]. I imported the same ATL to PRD and could successfully load two months of data. For the subsequent loads, it is throwing the following error.
11252     8464     DBS-070401     5/31/2010 7:38:40 AM     |Data flow DF_DeliveryFact_SAP|Loader Rename_DELIVERY_FACT
11252     8464     DBS-070401     5/31/2010 7:38:40 AM     ODBC data source <LIONSAPBI02> error message for operation    <SQLExecute>: <[Microsoft][ODBC SQL Server Driver]Datetime field
11252     8464     DBS-070401     5/31/2010 7:38:40 AM     overflow>.|<Regular Load Operations>|<DELIVERY_FACT>
Can somebody help me with this?

Hi,
Refer to the link below and the note mentioned in it.
Re: Problem when storing master data 0PLANT_ATTR-Full (i.e. Full Update)
With rgds,
Anil Kumar Sharma .P
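For reference, SQL Server's DATETIME type only accepts values between 1753-01-01 and 9999-12-31, so a "Datetime field overflow" normally means a date outside that range (for example an initial/placeholder date coming from R/3) reached the DELIVERY_FACT loader. Below is a small, hypothetical Python sketch, not part of the Rapid Mart itself, that profiles an extract for such values before the Data Services load; the column names are assumptions.

    from datetime import datetime

    # SQL Server DATETIME bounds; dates outside this window raise
    # "Datetime field overflow" on insert.
    SQL_DATETIME_MIN = datetime(1753, 1, 1)
    SQL_DATETIME_MAX = datetime(9999, 12, 31, 23, 59, 59)

    def find_overflow_rows(rows, date_fields):
        """Return (row, field, value) triples whose dates SQL Server would reject."""
        bad = []
        for row in rows:
            for field in date_fields:
                value = row.get(field)
                if value is not None and not (SQL_DATETIME_MIN <= value <= SQL_DATETIME_MAX):
                    bad.append((row, field, value))
        return bad

    # Hypothetical sample resembling delivery staging rows.
    sample = [
        {"DELIVERY_ID": "80001", "GOODS_ISSUE_DATE": datetime(2010, 5, 28)},
        {"DELIVERY_ID": "80002", "GOODS_ISSUE_DATE": datetime(1, 1, 1)},  # placeholder date
    ]

    for row, field, value in find_overflow_rows(sample, ["GOODS_ISSUE_DATE"]):
        print(f"Delivery {row['DELIVERY_ID']}: {field} = {value} would overflow DATETIME")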

Similar Messages

  • SAP BODS and SAP ECC integration for Sales data

    Hi  Folks,
    I have a requirement to load sales data from BODS into SAP ECC. Could you please suggest the best approach for this?
    I really appreciate your prompt response.
    Thanks,
    Daniel.

    Hi Daniel
      What sales data do you want to migrate? For example:
    1) Customer master
    2) Pricing Condition
    3) Sales Order
    4) Customer Sales Area
    5) Scheduling Agreement
    6) Schedule Lines
    7) Customer Material Info Records
      Whatever you want to migrate, everything is possible via IDocs if the volume of data is small (< 100,000 records). If it is larger, the best option is the BAPI method (see the sketch after this reply).
      Please provide the exact requirement and then we can throw some light on it.
    Regards
    Vijay Mukunthan
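    A minimal sketch of the BAPI route, assuming the standard sales order BAPI BAPI_SALESORDER_CREATEFROMDAT2 and the pyrfc library; all logon values and field contents are placeholders. In BODS itself you would call the BAPI through a function-call transform rather than Python; this only illustrates the call shape and the explicit commit.

        from decimal import Decimal
        from pyrfc import Connection

        # Placeholder logon data for the ECC system.
        conn = Connection(ashost="ecc-host", sysnr="00", client="100",
                          user="RFC_USER", passwd="secret")

        # One sales order per call; header, partner and item structures mirror the BAPI interface.
        result = conn.call(
            "BAPI_SALESORDER_CREATEFROMDAT2",
            ORDER_HEADER_IN={"DOC_TYPE": "OR", "SALES_ORG": "1000",
                             "DISTR_CHAN": "10", "DIVISION": "00"},
            ORDER_PARTNERS=[{"PARTN_ROLE": "AG", "PARTN_NUMB": "0000100001"}],
            ORDER_ITEMS_IN=[{"MATERIAL": "MAT-001", "TARGET_QTY": Decimal("5")}],
        )

        # BAPIs do not commit on their own; check RETURN and commit explicitly.
        errors = [m for m in result["RETURN"] if m["TYPE"] in ("E", "A")]
        if errors:
            conn.call("BAPI_TRANSACTION_ROLLBACK")
        else:
            conn.call("BAPI_TRANSACTION_COMMIT", WAIT="X")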

  • TileList data load issue

    I am having an issue where the data that drives a TileList works correctly when the TileList is not loaded on the first page of the application. When it is put on a second page in a ViewStack, the TileList displays correctly when you navigate to it. When the TileList is placed on the first page of the application, I get the correct number of items in the TileList, but the information the item renderer is supposed to display (i.e. a picture, caption and title) does not appear. The strange thing is that a Tree populates correctly in the same situation. Here is the sequence of events:
    // get_tree loads the data for the tree and get_groups loads the data for the TileList
    creationComplete="get_tree.send();get_groups.send();"
    <mx:HTTPService showBusyCursor="true" id="get_groups"
        url="[some xml doc]" resultFormat="e4x"/>
    <mx:XMLListCollection id="myXMlist"
        source="{get_groups.lastResult.groups}"/>
    <mx:HTTPService showBusyCursor="true" id="get_tree"
        url="[some xml doc]" resultFormat="e4x" />
    <mx:XMLListCollection id="myTreeXMlist"
        source="{get_tree.lastResult.groups}"/>
    And then the data providers of the TileList and Tree are set accordingly. I tried moving the data calls from the creationComplete to the initialize event, thinking that it would fire earlier in the process and be done by the time final creation came about, but that didn't help either. I guess I'm just at a loss as to why the Tree works fine no matter where I put it but the TileList does not. It's almost as if the Tree and the TileList will sit and wait for the data but the item renderer in the TileList will not. That would explain why clicking on the TileList still produces the correct sequence of events but the visual component of the TileList is just not working right. Anyone have any ideas?

    OK, so if the ASO value is wrong, then it's a data load issue and there is no point messing around with the BSO app. You are loading two transactions to the exact same intersection. Make sure your data load is set to aggregate values and not overwrite.

  • Master data loading issue

    Hi gurus,
        Presently I am working on BI 7.0 and I have a small issue regarding master data loading.
        I have a generic DataSource for master data loading and I have to fetch this data to the BW side. I always have to do a full data load to the master data object. The first time, I scheduled the InfoPackage and ran the DTP to load data to the master data object; no issues, the data got loaded successfully. Whenever I run the InfoPackage a second time and run the DTP, I get an error about duplicate records.
       How can I handle this?
    Best Regards
    Prasad

    Hi Prasad,
    Following is what is happening in your case:
    Loading 1st time:
    1. Data is loaded to PSA through the InfoPackage. It is a full load.
    2. Data is loaded to the InfoObject through the DTP.
    Loading 2nd time:
    1. Data is again loaded to PSA. It is a full load.
    2. At this point, the data in the PSA itself is duplicated. So when you run the DTP, it picks up the data of both requests that were loaded to the PSA, and hence you get the duplicate record error.
    Please clear the PSA after the data is loaded to the InfoObject (a small sketch of the duplication follows this reply).
    Assign points if helpful.
    Regards,
    Tej Trivedi
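    A minimal sketch, in Python rather than BW, of why two full loads double the records in the PSA and what a keep-the-latest-record-per-key rule (the effect of "Handle Duplicate Record Keys" in the DTP) does; the attribute names are invented.

        # Two full loads of the same source land in the PSA as two requests.
        request_1 = [{"PLANT": "1000", "NAME": "Hamburg"}]
        request_2 = [{"PLANT": "1000", "NAME": "Hamburg"}]
        psa = request_1 + request_2            # two rows for key 1000 -> duplicate key error

        # Keeping only the latest record per key is what resolves the duplicates.
        latest = {}
        for row in psa:                        # later requests overwrite earlier ones
            latest[row["PLANT"]] = row
        print(list(latest.values()))           # one row per PLANT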

  • Sap bw 7.3 master data load error

    SAP BW 7.3 master data load error
    Error: Exception in Substep Rules

    Hi Jayram,
    I am assuming that you are getting this error when you are loading data from PSA to IO. If so, the error might be because of
    1. Duplicate records
    The PSA might have duplicate records, but you will be able to load only one record per key to the master data InfoObject. In the DTP Update tab there is an option "Handle Duplicate Record Keys"; just enable this and try to load again.
    2. Erroneous records.
    Some special records are allowed in the PSA whereas they might not be allowed in master data/data targets, e.g. lower-case letters or some special characters. If the error is because of this, then you might need to correct the data at PSA level and reload it to master data, or get it corrected in the source system itself and fetch the data to BW again, or write some code to take care of these special characters (see the sketch after this reply).
    Hope it helps!
    Regards,
    Pavan
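    A minimal sketch of such a pre-load check, assuming the usual BW rule that characteristic values must be upper case and restricted to an allowed character set (plus whatever extra characters are maintained in RSKC); the field name and character list are placeholders.

        import string

        # Upper-case letters, digits and a few separators; extend with the extra
        # characters your system permits (transaction RSKC).
        ALLOWED = set(string.ascii_uppercase + string.digits + " _-./")

        def invalid_values(rows, field):
            """Return the values BW master data would reject."""
            return [r[field] for r in rows
                    if r[field] != r[field].upper()
                    or any(ch not in ALLOWED for ch in r[field])]

        rows = [{"CUSTOMER": "ACME-01"}, {"CUSTOMER": "acme#02"}]
        print(invalid_values(rows, "CUSTOMER"))    # ['acme#02']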

  • Last Data packet for the data load.

    Hi All,
    How do I find the last data packet loaded to the target for a data load in SAP BI? In which table?
    Thank you,
    Adhvi

    Hi Adhvirao,
    When you are loading data from ECC, go to the monitor ---> Details ---> check the data packets which have failed, double-click on such a data packet, and at the bottom you will be able to see the IDoc numbers on the ECC side which are still pending in ECC and contain data. You can process them manually to the BI side through BD87.
    Thanks,
    Deepak

  • SAP Best Practices for Sales Support(CAS)

    Hi,
    Please provide me the SAP Best Practices for Sales Support (CAS).
    Regards
    Ravi

    Dear Ravi
    Please check these links
    [Sales Support: Computer-Aided Selling |http://help.sap.com/saphelp_40b/helpdata/en/10/748534c4603c34e10000009b38f83b/frameset.htm]
    [Sales Support : Computer Aided Selling|http://help.sap.com/printdocu/core/Print46c/EN/data/pdf/SDCAS/SDCAS.pdf]
    thanks
    G. Lakshmipathi

  • Data Load Issue "Request is in obsolete version of DataSource"

    Hello,
    I am getting a very strange data load issue in production. I am able to load the data up to the PSA, but when I run the DTP to load the data into 0EMPLOYEE (master data object) I get the below message:
    Request REQU_1IGEUD6M8EZH8V65JTENZGQHD not extracted; request is in obsolete version of DataSource
    The request REQU_1IGEUD6M8EZH8V65JTENZGQHD was loaded into the PSA table when the DataSource had a different structure to the current one. Incompatible changes have been made to the DataSource since then and the request cannot be extracted with the DTP anymore.
    I have taken the following actions:
    1. Replicated the DataSource
    2. Deleted all requests from the PSA
    3. Activated the DataSource using RSDS_DATASOURCE_ACTIVATE_ALL
    4. Re-transported the DataSource, transformation and DTP
    Still getting the same issue.
    If you have any idea please reply asap.
    Samit

    Hi
    Generate your DataSource in R/3, then replicate it and activate the transfer rules.
    Regards,
    Chandu.

  • Demantra Data Load Issue

    I am new to Demantra. I have installed a standalone Demantra system on our server. In order to load data, I created a new model, defined item and location levels, and then clicked on 'Build Model'. The data is loaded into 3 custom tables created by me. After creating the model, I cannot log in to 'Collaborator Workbench'; it gives the message 'There are system errors. Please contact your System Administrator'. Can anyone please tell me what I am doing wrong and how to resolve the issue?
    Thanks

    OK, so if the ASO value is wrong, then it's a data load issue and there is no point messing around with the BSO app. You are loading two transactions to the exact same intersection. Make sure your data load is set to aggregate values and not overwrite.

  • Table for Sales datas assinged to Equip/Floc

    Hi
          I need a table for sales data such as sales organization, distribution channel, division, sales office and sales group assigned to the equipment master.
    Points assured
    Thanks
    Seenu

    hi seenu
    Yes, you are right: the sales organization, distribution channel and division are stored in the ILOA table only, since in the equipment they are picked up from the view table. You can relate them to the equipment via the functional location (see the sketch after this reply).
    regards
    thyagarajan
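    A minimal sketch of reading those ILOA fields remotely with pyrfc and RFC_READ_TABLE, assuming the equipment time segment (EQUZ) carries the ILOAN key that points into ILOA; the logon values, the equipment number and the sales field names (VKORG/VTWEG/SPART) are assumptions to be checked against your system.

        from pyrfc import Connection

        conn = Connection(ashost="ecc-host", sysnr="00", client="100",
                          user="RFC_USER", passwd="secret")    # placeholder logon

        def read_table(table, fields, where):
            """Small RFC_READ_TABLE wrapper returning a list of dicts."""
            result = conn.call("RFC_READ_TABLE", QUERY_TABLE=table, DELIMITER="|",
                               FIELDS=[{"FIELDNAME": f} for f in fields],
                               OPTIONS=[{"TEXT": where}])
            return [dict(zip(fields, row["WA"].split("|"))) for row in result["DATA"]]

        # 1) The equipment's time segment gives the location/account assignment key ILOAN.
        equz = read_table("EQUZ", ["EQUNR", "ILOAN"], "EQUNR = '000000000010000001'")
        # 2) ILOA holds the sales data assigned to that key.
        if equz:
            iloa = read_table("ILOA", ["ILOAN", "VKORG", "VTWEG", "SPART"],
                              f"ILOAN = '{equz[0]['ILOAN'].strip()}'")
            print(iloa)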

  • SAP BPC RDS FOR SALES AND OPERATION

    Hi
    I just need a link to download the SAP BPC RDS (rapid-deployment solution) for sales and operations planning.

    Hi Elhoussiny,
    Refer to the below link; it may be helpful.
    SAP Service Marketplace - Analytics - EPM
    Shrikant

  • Open Hub (SAP BW) to SAP HANA through DB Connection data loading: "Delete Data from Table" option is not working - please help

    Issue:
    I have an SAP BW system and an SAP HANA system.
    SAP BW connects to SAP HANA through a DB connection (named HANA).
    Whenever I create an Open Hub destination of type DB Table with the help of that DB connection, the table is created at the HANA schema level (L_F50800_D).
    I executed the Open Hub service without checking the "Delete Data from Table" option.
    Data was loaded from BW to HANA with 16 records, the same on both sides.
    The second time I executed it from BW to HANA, 32 records arrived (it appends).
    Then I executed the Open Hub service with the "Delete Data from Table" option checked.
    Now I am getting the short dump DBIF_RSQL_TABLE_KNOWN.
    If I check SAP BW system to SAP BW system, it works fine.
    Does this option work through a DB connection or not?
    Please see the attachment along with this discussion and help me resolve this.
    From
    Santhosh Kumar

    Hi Ramanjaneyulu ,
    First of all thanks for the reply ,
    Here the issue is at the Open Hub level (definition level, Destination tab and field definition):
    there is a check box there which I have already selected; that is exactly my issue, because even though it is selected
    the deletion at target level is not performed.
    SAP BW to SAP HANA via DB connection:
    1. First time from BW, say 16 records - DTP executed - loaded up to HANA - 16 on both sides
    2. Second time executed again from BW - now the HANA side has appended, i.e. 16 + 16 = 32
    3. So I selected the check box at Open Hub level, "Delete Data from Table"
    4. Now when I execute the DTP it throws a short dump - DBIF_RSQL_TABLE_KNOWN
    Now please tell me how to resolve this. Is the "Delete Data from Table" option applicable for HANA at all? (One possible workaround is sketched after this post.)
    Thanks
    Santhosh Kumar
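    One hedged workaround sketch, outside the Open Hub itself: clear the generated HANA target table with the hdbcli client (or an equivalent SQL step in the process chain) before each load and leave the "Delete Data from Table" option off, so the append no longer matters; the schema, table name and logon values are placeholders.

        from hdbcli import dbapi

        # Placeholder HANA connection data.
        conn = dbapi.connect(address="hana-host", port=30015,
                             user="LOAD_USER", password="secret")
        try:
            cursor = conn.cursor()
            # Empty the Open Hub target table before the next full load from BW.
            cursor.execute('TRUNCATE TABLE "HANA"."L_F50800_D"')
            conn.commit()
        finally:
            conn.close()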

  • BI 7.0 data load issue: InfoPackage can only load data to PSA?

    BI 7.0 backend extraction gurus,
    We created a generic datasource on R3 and replicated it to our BI system, created an InfoSource, the Transformation from the datasource to the InfoSource, an ODS, the transformation from the InfoSource to the ODS. 
    After the transformation between the InfoSource and the ODS is created on this BI system, a new folder called "Data Transfer Process" is also created under this ODS in the InfoProvider view. In the Data Transfer Process, in the Extraction tab, we pick 'Full' in the Extraction Mode field; in the Execute tab there is an 'Execute' button, and clicking it (note: so far we have not created an InfoPackage yet) sounds like it should conduct the data load, but we find there is no data available even though all the statuses show green (we do have a couple of records in the R/3 table).
    Then we tried to create an InfoPackage. In the Processing tab, we find the 'Only PSA' radio button is checked and all others like 'PSA and then into Data Targets (Package by Package)' are dimmed! In the Data Target tab, the ODS cannot be selected as a target! Also there are some new columns in this tab: 'Maintain Old Update Rule' is marked with a red 'X', and under another column, 'DTP(s) are active and load to this target', there is an inactive icon, which is weird since we have already activated the Data Transfer Process! Anyway, we started the data load in the InfoPackage, and the monitor shows the records are brought in, but since 'Only PSA' is checked in the Processing tab with all others dimmed, no data goes to this ODS! Why, in BI 7.0, can the 'Only PSA' radio button be checked with all others dimmed?
    Many new features with BI 7.0! Anyone's ideas/experience on how to load data in BI 7.0 are greatly appreciated!

    You don't have to select anything.
    Once the data is loaded to the PSA, in the DTP you have the option of FULL or DELTA: full loads all the data from the PSA, and delta loads only the last load of the PSA (see the small sketch after this reply).
    Go through the links for lucid explanations:
    Infopackage -
    http://help.sap.com/saphelp_nw2004s/helpdata/en/43/03808225cf5167e10000000a1553f6/content.htm
    DTP
    http://help.sap.com/saphelp_nw2004s/helpdata/en/42/f98e07cc483255e10000000a1553f7/frameset.htm
    Creating DTP
    http://help.sap.com/saphelp_nw2004s/helpdata/en/42/fa50e40f501a77e10000000a422035/content.htm
    Pre-requisite:
    You have used transformations to define the data flow between the source and target object.
    Creating transformations-
    http://help.sap.com/saphelp_nw2004s/helpdata/en/f8/7913426e48db2ce10000000a1550b0/content.htm
    Hope it Helps
    Chetan
    @CP..
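    A tiny illustration, in Python only, of that full-versus-delta behaviour against PSA requests; the request contents are invented.

        # PSA as a list of requests, oldest first; each request is a list of records.
        psa_requests = [
            ["rec1", "rec2"],            # request 1
            ["rec3"],                    # request 2 (latest)
        ]

        def dtp_full(psa):
            """Full extraction mode: read every request in the PSA."""
            return [rec for request in psa for rec in request]

        def dtp_delta(psa, already_transferred):
            """Delta mode: read only the requests this DTP has not transferred yet."""
            return [rec for request in psa[already_transferred:] for rec in request]

        print(dtp_full(psa_requests))         # ['rec1', 'rec2', 'rec3']
        print(dtp_delta(psa_requests, 1))     # ['rec3'] - only the new request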

  • Error in Setting lock for Master Data Load

    Hi Team,
    I encounter the following error while uploading the Masterdata
    "Lock NOT set for: Loading master data attributes
    Attributes of characteristic 0BPARTNER are locked by terminated change run 465451241"
    I Performed the following steps
    1) Checked whether any active jobs (change run jobs) are running in the system - no jobs running.
    2) RSATTR -> InfoObject list -> no InfoObject found
    3) Tried again after some time, but the load continues to fail.
    Could anyone help to solve the problem
    thanks
    Bala

    Hi Bala,
    Check in SM12 whether any locks exist with that change run number.
    First check whether any master data loads related to it are still running.
    Check in the attribute change run screen whether any change run steps are running.
    After checking the above, if no locks exist and no master data load is running, then follow the steps below:
    1. Run transaction RSA1
    2. Tools -> Apply Hierarchy/Attributes Change (or transaction RSATTR directly)
    3. Click the "Monitor and Start Terminated Change Runs" button (at the bottom of the screen)
    4. Click the "Reset Status" button for run ID 465451241
    NOTE: this is not recommended; it is better to wait until the locks are released on their own.
    If the error message still persists then, in serious cases, we can use a function module in SE37, which is not recommended by SAP:
    use RSDDS_AGGR_MOD_CLOSE in SE37 and execute; there, give your failed change run number (465451241) for I_CNSID and execute (a hedged sketch of calling it remotely follows this reply).
    It releases all the locks held by that change run, and you can continue with your further work/jobs.
    Regards
    KP
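    For completeness, a minimal pyrfc sketch of that last-resort call, assuming RSDDS_AGGR_MOD_CLOSE is remote-enabled in your system and that I_CNSID takes the change run number in character form; the logon values are placeholders and the same caution as above applies.

        from pyrfc import Connection

        conn = Connection(ashost="bw-host", sysnr="00", client="100",
                          user="RFC_USER", passwd="secret")    # placeholder logon

        # Close the terminated change run so the locks it still holds are released.
        # Pass the change run number in whatever type the function module interface expects.
        conn.call("RSDDS_AGGR_MOD_CLOSE", I_CNSID="465451241")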

  • DATA LOAD ISSUE /NO ROLL UP MEMORY

    Hello Team,
    I have a master data load failure for FIS_BELNR. The problem, I think, is that every time it tries to load it runs out of internal table space in the back end; I really don't know what that means. This data load has been failing every day because of the same problem.
    I have attached all the screenshots, including the ABAP short dump analysis. One can read in detail why it is failing - it tells exactly what the problem is - but how do we fix it?
    Any more details needed please let me know.
    ABAP runtime errors    TSV_TNEW_BLOCKS_NO_ROLL_MEMORY                 
           Occurred on     25.10.2007 at 02:53:55                                                                               
    >> Short dump has not been completely stored. It is too big.                                                                               
    No roll storage space of length 2097424 available for internal storage.
                                                                                    What happened?                                                                               
    Each transaction requires some main memory space to process           
    application data. If the operating system cannot provide any more     
    space, the transaction is terminated.                                                                               
    What can you do?                                                                               
    Try to find out (e.g. by targetted data selection) whether the        
    transaction will run with less main memory.                                                                               
    If there is a temporary bottleneck, execute the transaction again.    
    If the error persists, ask your system administrator to check the     
    following profile parameters:                                                                               
    o  ztta/roll_area            (1.000.000 - 15.000.000)                 
           Classic roll area per user and internal mode                   
           usual amount of roll area per user and internal mode           
    o  ztta/roll_extension       (10.000.000 - 500.000.000)               
           Amount of memory per user in extended memory (EM)              
    o  abap/heap_area_total      (100.000.000 - 1.500.000.000)            
           Amount of memory (malloc) for all users of an application      
           server. If several background processes are running on         
           one server, temporary bottlenecks may occur.                   
           Of course, the amount of memory (in bytes) must also be        
           available on the machine (main memory or file system swap).    
           Caution:                                                       
           The operating system must be set up so that there is also      
           enough memory for each process. Usually, the maximum address   
           space is too small.                                            
           Ask your hardware manufacturer or your competence center       
           about this.                                                    
           In this case, consult your hardware vendor                     
    abap/heap_area_dia:        (10.000.000 - 1.000.000.000)               
           Restriction of memory allocated to the heap with malloc        
           for each dialog process.                                       
    Parameters for background processes:                                                                               
    Error analysis                                                                               
    The internal table "IT_62" could not be enlarged further.                
    You attempted to create a block table of length 2097424 for the internal 
    table "IT_62". This happens whenever the OCCURS area of the internal table
    is exceeded. The requested storage space was not available in the roll   
    area.                                                                    
    The amount of memory requested is no longer available.                                                                               
    How to correct the error                                                                               
    Please try to decide by analysis whether this request is                 
    reasonable or whether there is a program error. You should pay           
    particular attention to the internal table entries listed below.                                                                               
    The amount of storage space (in bytes) filled at termination time was:                                                                               
    Roll area...................... 2595024                                  
    Extended memory (EM)........... 2001898416                               
    Assigned memory (HEAP)......... 1886409776                               
    Short area..................... 16639                                    
    Paging area.................... 24576                                    
    Maximum address space.......... "-1"                                                                               
    If the error occurred in a non-modified SAP program, you may be          
    able to find a solution in the SAP note system.                          
    If you have access to the note system yourself, use the following        
    search criteria:                                                                               
    "TSV_TNEW_BLOCKS_NO_ROLL_MEMORY"                                        
    "SAPLZ_BW_EXTRACTORS " or "LZ_BW_EXTRACTORSU24 "                        
    "Z_BW_AP_GL_BELNR"                                                      
    If you cannot solve the problem yourself, please send the                
    following documents to SAP:                                                                               
    1. A hard copy print describing the problem.                             
       To obtain this, select the "Print" function on the current screen.    
    Thanks

    Hello,
    The memory requested for your internal table went beyond the system-configured threshold.
    Decrease your package size (see the sketch after this reply) or extend the mentioned parameters (a Basis task):
    ztta/roll_area (1.000.000 - 15.000.000)
    Classic roll area per user and internal mode
    usual amount of roll area per user and internal mode
    ztta/roll_extension (10.000.000 - 500.000.000)
    Amount of memory per user in extended memory (EM)
    abap/heap_area_total (100.000.000 - 1.500.000.000)
    Regards, Patrick Rieken
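    A minimal illustration, in Python only, of what reducing the package size buys you: the extractor then holds one package in memory at a time instead of one huge internal table; the numbers are invented.

        def packages(records, package_size):
            """Yield the data in fixed-size packages instead of one big table."""
            for start in range(0, len(records), package_size):
                yield records[start:start + package_size]

        all_records = list(range(1_000_000))    # stand-in for the FIS_BELNR extract

        # With a smaller package size only `package_size` records are held at once,
        # which is the effect of lowering the package size of the DataSource/InfoPackage.
        for package in packages(all_records, package_size=50_000):
            pass  # transform and send this package, then release it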
