Data load - Customer and quotation

Hi
I need to load customer data from the legacy system into SAP using LSMW. I found the standard program RFBIDE00 for this. Will this program let me load data for all account groups?
My second question: is there any standard program, BAPI, or IDoc to load quotations (VA21) into SAP? How would this be accomplished?
Thanks

1) You enter the account group in BKN00-KTOKT, so I don't see any restriction on the account group.
2) See if:
BAPI_QUOTATION_CREATEFROMDATA  Customer quotation: Create customer quotation
BAPI_QUOTATION_CREATEFROMDATA2 Customer Quotation: Create Customer Quotation
is what you want.
Rob
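For reference, here is a minimal sketch of how the input for BAPI_QUOTATION_CREATEFROMDATA2 could be assembled, modelled as plain Python dicts. The parameter and field names (QUOTATION_HEADER_IN, DOC_TYPE, PARTN_ROLE, etc.) are assumptions based on the usual BAPISDHD1/BAPISDITM structures; verify them in SE37 before relying on this, and note the sample org data is invented:

```python
# Illustrative only: shape of the call parameters for the quotation BAPI.
# Field and parameter names are assumptions - check them in SE37.

def build_quotation_payload(sold_to, material, qty):
    """Assemble header/item/partner parameters for the BAPI call."""
    header = {
        "DOC_TYPE": "QT",        # quotation document type
        "SALES_ORG": "1000",     # assumption: adjust to your org data
        "DISTR_CHAN": "10",
        "DIVISION": "00",
    }
    items = [{
        "ITM_NUMBER": "000010",
        "MATERIAL": material,
        "TARGET_QTY": qty,
    }]
    partners = [{
        "PARTN_ROLE": "AG",      # sold-to party
        "PARTN_NUMB": sold_to,
    }]
    return {
        "QUOTATION_HEADER_IN": header,
        "QUOTATION_ITEMS_IN": items,
        "QUOTATION_PARTNERS": partners,
    }

payload = build_quotation_payload("0000001000", "MAT-001", "10")
```

From here the payload would be passed to the function module via RFC (and followed by BAPI_TRANSACTION_COMMIT), which is outside the scope of this sketch.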

Similar Messages

  • Data load management and CTS

    hi sdn,
    can anyone explain data load management and CTS?
    regards
    andrea

    Hi Dipika,
      Whenever a delta fails, we have to do a repeat delta to bring the earlier delta records along with the current delta.
    If your data load fails in the ODS, it has no impact on your cube: manually turn the status of the request to red, delete the failed request from the target, and trigger a repeat delta.
    If the datamart load from this ODS to the cube fails, first analyze the reason for the failure. If the DSO updates a single cube, you can follow the same steps. If it updates multiple targets, reset the datamart status in the ODS for that request after deleting the bad request from the targets, and trigger the load again. If it fails in only a single target, you can load the same delta by creating a separate InfoPackage restricted to the target to be loaded for that instance.
    Thanks,
    Sathish.
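The recovery logic above can be sketched as a small decision helper (illustrative only; the returned strings are the manual steps to perform, not API calls):

```python
# Encode the delta-failure recovery steps described above as a lookup helper.

def delta_failure_steps(failed_stage, dso_updates_multiple_targets=False):
    """Return the manual recovery steps for a failed delta load."""
    if failed_stage == "ods":
        # Failure before the cube: the cube is unaffected.
        return ["set request status to red",
                "delete failed request from target",
                "trigger repeat delta"]
    if failed_stage == "datamart":  # the ODS -> cube load failed
        if dso_updates_multiple_targets:
            return ["delete bad request from targets",
                    "reset datamart status in the ODS for that request",
                    "trigger the load again"]
        # single cube: same handling as an ODS failure
        return ["set request status to red",
                "delete failed request from target",
                "trigger repeat delta"]
    raise ValueError("unknown stage")
```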

  • GL Data Load - INIT and FULL

    Hi,
    I am getting different record counts for the Full and Init GL data loads.
    The count for Init is about 300 records fewer than the count for the Full load.
    What could be the reason?
    Thanks,

    When posting a question, be clear about which cube/DataSource you are using, whether you are on old or new GL, and what the background is; otherwise it is just speculation and beating around the bush.
    http://help.sap.com/saphelp_erp2005vp/helpdata/en/45/757140723d990ae10000000a155106/content.htm
    New Gl data flow-
    Re: New GL cubes 0FIGL_V10, 0FIGL_V11
    Hope it Helps
    Chetan
    @CP..

  • New ledger data about customer and vendor extraction

    Dear colleagues:
    I have a requirement: the FI system implements the new ledger, so the detailed information is stored in table FAGLFLEXA. I can use 0FI_GL_14 to get line items, but it has no vendor or customer information; 0FI_AR/AP_04 have customer and vendor information, but no detailed information (e.g. profit center).
    Is there any smart solution to get all the data together?
    Thanks a lot

    Hi,
    You can enhance the DataSource to include the CUSTOMER and VENDOR fields if they appear in the ECC table.

  • Data Load file and Rule file Creation

    Hi,
    I used to create rule files and data files for loading data into Essbase version 7.1.6.
    I haven't worked in Essbase for the past two years and have forgotten the options, field properties, and data load file creation.
    Could you please advise me, or point me to a demo, on creating rule files as well as data files?

    Two things I could suggest:
    1. Look at the Sample.Basic application; it has dimension and data load rules for all sorts of scenarios.
    2. Come to my session at Kaleidoscope, "Rules Files Beginning to Advanced", where I go over some of the more interesting things with rules files.

  • Data Loading(Before and After Image)

    I heard that if a DataSource delivers both before and after images, the data can be sent directly to the InfoCube, or from the DSO to the InfoCube; whereas if a DataSource supports only after images, it first has to be loaded to the DSO and then to the InfoCube.
    My question is: how do I find out the image type of a DataSource?

    Hi Ravi,
    Check the ROOSOURCE table in ECC. You can find the delta behavior in the DELTA field; based on this, you can tell whether the DataSource supports a cube or an ODS directly.
    (blank)  Delta Only Via Full Upload (ODS or InfoPackage Selection)
    A        ALE Update Pointer (Master Data)
    ABR      Complete Delta with Deletion Flag Via Delta Queue (Cube-Compatible)
    ABR1     Like Method 'ABR', But Serialization Only by Requests
    ADD      Additive Extraction Via Extractor (e.g. LIS Info Structures)
    ADDD     Like 'ADD', But Via Delta Queue (Cube-Compatible)
    AIE      After-Images Via Extractor (FI-GL/AP/AR)
    AIED     After-Images with Deletion Flag Via Extractor (FI-GL/AP/AR)
    AIM      After-Images Via Delta Queue (e.g. FI-AP/AR)
    AIMD     After-Images with Deletion Flag Via Delta Queue (e.g. BtB)
    CUBE     InfoCube Extraction
    D        Unspecific Delta Via Delta Queue (Not ODS-Compatible)
    E        Unspecific Delta Via Extractor (Not ODS-Compatible)
    FIL0     Delta Via File Import with After-Images
    FIL1     Delta Via File Import with Delta Images
    NEWD     Only New Records (Inserts) Via Delta Queue (Cube-Compatible)
    NEWE     Only New Records (Inserts) Via Extractor (Cube-Compatible)
    O
    ODS      ODS Extraction
    X        Delta Unspecified (Do Not Use!)
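As an illustration, the list can be turned into a small lookup: given the DELTA value read from ROOSOURCE, report whether the DataSource can feed an InfoCube directly. The classification below follows the "(Cube-Compatible)" annotations in the list and the after-image rule from the question, and covers only a few representative entries:

```python
# Delta method -> (description, cube-compatible?), taken from the ROOSOURCE
# DELTA value list above. After-image methods need a DSO first.

DELTA_METHODS = {
    "ABR":  ("Complete delta with deletion flag via delta queue", True),
    "ABR1": ("Like ABR, serialization only by requests", True),
    "AIE":  ("After-images via extractor", False),
    "AIM":  ("After-images via delta queue", False),
    "NEWD": ("Only new records via delta queue", True),
    "NEWE": ("Only new records via extractor", True),
}

def cube_compatible(delta):
    """True if the delta method can update an InfoCube directly."""
    return DELTA_METHODS[delta][1]
```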

  • CS Repair at Plant with Collection from Customer and Quotation

    I'd like to incorporate the creation of a quotation into a pick-up-and-repair-from-customer process, but am encountering a few obstacles. Can anyone suggest a process that incorporates what I'm trying to do?
    1.     IW51 Create notification
    2.     IW52 Create repair (sales) order from notification (Sales Order type YDAS) allowing receipt into plant and delivery back to customer of material.
    3.     VL01N Receive material against sales order in a Returns delivery (Delivery type YDLR)
    4.     Generate Service Order from Repair Sales Order (Order type SM03) , strip and inspect the material.
    5.       Record Time and Materials to be used IW32 and generate planned costs.     
    At this point I'd like to generate a quotation in DP80 for approval but get an error:
    "Quotation creation only relevant for revenue bearing service orders"
    OK. So if you change order type SM03 to allow this, then you can't do step 4, i.e. generate a service order from the repair sales order (order type SM03).
    I can get around this by creating a service order with order type SM02 and then creating a quotation from DP80 before returning to step 4 and completing the process below, but this is not ideal!
    6.     Confirm Operations IW42
    7.     Technically complete order IW32
    8.     Send equipment back to customer VL01N with order type YDLF
    9.     Collate Time and material used on repair order DP90
    10.     Create billing document VF01

    Use best practice scenario H77.

  • Two issues: activation of transfer rules and data load performance

    hi,
    I have two problems I face very often and would like to get some more info on that topics:
    1. Transfer rules activation. I just finished transporting my cubes, ETL etc. to the production system and started filling the cubes with data. Very often during data load it turns out that the transfer rules need to be activated, even though I transported them active and (I think) did not change anything after transport. Then I again create transfer rule transports on dev, transport the changes to prod, and have to execute the data load again.
    It is very annoying. What do you suggest to do about this problem? Activate all transfer rules again before executing the process chain?
    2. Differences between dev and prod systems in data load time.
    On the dev system (a copy of production made about 8 months ago) I checked how long it takes to extract data from the source system: about 0.5 h for 50,000 records. But when I executed the load on production it took 2 h for 200,000 records, so it was twice as slow as dev!
    I thought it would be at least as fast as the dev system. What can influence data load performance, and how can I predict it?
    Regards,
    Andrzej

    Aksik
    1. How frequently does this activation problem occur? If it is a one-time issue, replicate the DataSource and activate the transfer structure (in general, as you know, activation of the transfer structure should happen automatically after transport of the object).
    2. One reason for the time difference is environmental: in a production system many jobs run at the same time, so system performance will obviously be slower compared to the dev system. In your case, though, both systems are performing equally. You said the dev system took half an hour for 50,000 records and production took 2 hours for 200,000 records, so there are more records in production and it took proportionally longer. If it is really causing a problem, then you have to do some performance tuning.
    Hope this helps
    Thanks
    Sat
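A quick sanity check of the figures quoted above confirms the two systems are loading at the same rate, despite the longer wall-clock time on production:

```python
# Throughput comparison using the numbers from the thread.

dev_rate  = 50_000 / 0.5     # records per hour on dev
prod_rate = 200_000 / 2.0    # records per hour on production

assert dev_rate == prod_rate == 100_000  # identical throughput
```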

  • Essbase studio data loading

    Hi Friends,
    I need some help with cube deployment using Essbase Studio with Oracle as my source. I am trying to deploy a cube; my dimensions load properly but the data load fails with the following error:
    *"Cannot get async process state. Essbase Error(1021001): Failed to Establish Connection With SQL Database Server. See log for more information"*
    Kindly advise.
    Thanks
    Andy

    I used custom data load settings with the Group By function.

  • Demantra Data Load Issue

    I am new to Demantra. I have installed a stand-alone Demantra system on our server. In order to load data, I created a new model, defined item and location levels, then clicked on 'Build Model'. The data is loaded into 3 custom tables created by me. After creating the model, I cannot log in to 'Collaborator Workbench'; it gives the message 'There are system errors. Please contact your System Administrator'. Can anyone please tell me what I am doing wrong and how to resolve the issue?
    Thanks

    OK, so if the ASO value is wrong, then it's a data load issue and there is no point messing around with the BSO app. You are loading two transactions to the exact same intersection. Make sure your data load is set to aggregate values and not overwrite.

  • Data load error - error 4 in the update

    Hi guys,
    I am trying to load master data for customers, and it has 32 data packages in total. Data package 8 has an incorrect character and fails. The problem is that after that, all data packages give errors even though there is no incorrect data in them. From data package 9 onward, I get a message with red status - Processing Ends: Errors occurred.
          (Green light) Processing 2 finished
          (Red square) Error 4 in the update.
    This is the only message I get. I wonder what the problem is.

    Hi,
    Check in RSA1 --> Monitoring --> PSA --> the respective InfoSource, and click on PSA. Now go to that package, check record by record to find the erroneous record, double-click on it, and check whether any other special character is there.
    If you find one and want to EDIT it, first delete the request from the respective data targets; the system will then allow you to change it.
    Remove the character and reload from the PSA.
    hope this helps,
    Sudhakar.

  • Data Load Check Status

    Hello Guru's,
    I am working on BI 3.5. I want to check the data load status in an InfoCube and ODS: where should I check data load failures and
    successes in detail?
    Data is uploaded by data requests, and I want to check which request failed.
    Please also help me with how to create master data sources in R/3.
    Regards
    Shivaraj
    Edited by: Shvai Patil on Feb 1, 2008 4:45 AM

    Hi,
    You can monitor all the requests in RSMO and also check in SM37.
    For creating custom master data sources in R/3 you can use transaction RSO2.
    Hope it helps you.
    Let us know if you still have any issues.
    Reg
    Pra

  • For a Sales order, what is Actual Delivery Date to Customer?

    Hi Experts,
    I am an ABAP consultant. In a particular report, for a Sales order I am asked to display:
    ‘Delivery Create Date (LIKP-ERDAT)’,
    ‘PGI Date’,
    ‘Requested Delivery Date (VBAK-VDATU)’ and
    ‘Actual Delivery Date to Customer’.
    And in addition, I also have to show:
    ‘No: of days between Order Create Date to Delivery Create Date’,
    ‘No: of days between Delivery Create Date to PGI Date’,
    ‘No: of days between Requested Delivery Date to Actual Ship Date’ and
    ‘No: of days between Order Create Date to Actual Ship Date’.
    I’ve searched SCN for similar questions but couldn’t get clarity. I’ll be very grateful if somebody can explain how to find the ‘Actual delivery date to customer’ and what the difference is between this date and the ‘Billing date’. Also, please explain the difference between ‘Delivery Create Date’ and ‘PGI’.

    Hi Rashmith,
    It seems the report is related to delivery. Below is the explanation for the different terms you mentioned in your question.
    Delivery Create Date (LIKP-ERDAT) ---- When a delivery is created, with or without reference to an order, the system writes the date of creation, time of creation, and created-by into the header data. Creation of a delivery does not mean that goods are dispatched; there are many further steps after a delivery is created before goods are dispatched. For example, if I created an order and created the delivery on 01.01.2014, the delivery create date will always be 01.01.2014.
    PGI Date (LIKP-WADAT_IST) ------ The post goods issue date is the date on which goods move out of the company to the carrier. This is the last step of delivery processing. When the delivery is created, the system derives various planning dates; at this stage it determines the planned PGI date (LIKP-WADAT), and when the actual PGI happens the system writes the date into LIKP-WADAT_IST, the actual goods issue date.
    Requested Delivery Date (VBAK-VDATU) ------ When an order is created, the customer asks for a material, a quantity, and a date on which he wants the goods. This date is called the requested delivery date. Based on it, the system checks the feasibility of delivering the goods on that date using delivery scheduling.
    You can get the RDD based on the following logic:
    Input the delivery number in VBFA-VBELN (with the appropriate VBFA-VBTYP_V) and get VBFA-VBELV.
    Then look up VBFA-VBELV in the VBAK table and read the value of VDATU.
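The two-step lookup can be simulated with in-memory tables (sample document numbers are invented; VBTYP_V = 'C' is assumed here as the preceding-document category for a sales order):

```python
# Simulate the VBFA -> VBAK lookup for the requested delivery date.

VBFA = [  # document flow: subsequent document -> preceding document
    {"VBELN": "8000000001", "VBELV": "4000000007", "VBTYP_V": "C"},
]
VBAK = {  # sales order headers, keyed by order number
    "4000000007": {"VDATU": "2014-01-01"},
}

def requested_delivery_date(delivery):
    """Find the RDD for a delivery via the document flow."""
    for row in VBFA:
        if row["VBELN"] == delivery and row["VBTYP_V"] == "C":
            return VBAK[row["VBELV"]]["VDATU"]
    return None
```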
    Actual Delivery Date to Customer ----- The actual delivery date is the actual post goods issue date (LIKP-WADAT_IST). This is the date on which goods are issued to the customer, and the customer becomes liable for billing for the goods dispatched.
    Difference between the actual delivery date (actual goods issue date) and the billing date:
    Normally, as per standard SAP, once goods are moved out of the company the customer is liable for billing for the goods dispatched. So by default, standard SAP copies the actual goods issue date (LIKP-WADAT_IST) into the billing date (VBRK-FKDAT), irrespective of the date of creation of the invoice (if the delivery is goods-issued on 01.01.2014 and the billing document/invoice is created today, i.e. 14.04.2014, the system will by default take 01.01.2014 as the billing date). And this is the correct practice.
    However, if you want the current date as the invoice creation date instead of the actual goods issue date, you can control this via the copy control feature in SAP.
    So, based on the copy control settings, the billing date may differ from the actual goods issue date / actual delivery date.
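The day-count columns from the original question then reduce to simple date arithmetic, e.g. (sample dates invented):

```python
# Day counts between the milestone dates discussed above.
from datetime import date

order_created    = date(2013, 12, 28)  # VBAK-ERDAT (sample)
delivery_created = date(2014, 1, 1)    # LIKP-ERDAT (sample)
actual_gi        = date(2014, 1, 3)    # LIKP-WADAT_IST (sample)

days_order_to_delivery = (delivery_created - order_created).days
days_delivery_to_pgi   = (actual_gi - delivery_created).days
```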

  • Error 8 when starting the extraction program - data load error: status 51

    Dear all,
    <b>I am facing a data extraction problem while extracting data from an SAP source system (Development Client 220).</b> The scenario and related settings are as follows:
    A. Setting:
    We have created two source systems, one for development and another for quality, in the BW development client.
    1. BW server: SAP NetWeaver 2004s BI 7
    PI_BASIS: 2005_1_700 Level: 12
    SAP_BW: 700 Level:13
    Source system (Development Client 220)
    2. SAP ERP: SAP ERP Central Component 6.0
    PI_BASIS: 2005_1_700 Level: 12
    OS: SunOS
    Source system (Quality Client 300)
    2. SAP ERP: SAP ERP Central Component 6.0
    PI_BASIS: 2005_1_700 Level: 12
    OS: HP-UX
    B. The scenario:
    I was able to load the InfoProvider from the source system (Development Client 220). Later we created another source system (Quality Client 300) and were able to load the InfoProvider from that as well.
    After creating the second source system (Quality Client 300), I was initially able to load the InfoProvider from both source systems, but now I am unable to load the InfoProvider from the Development Client 220.
    Source system Creation:
    For both 220 and 300, the background user in the source system is the same system-type user with the same authorizations (SAP_ALL, SAP_NEW, S_BI-WX_RFC), and the user for the source-system-to-BW connection is a dialog-type user (S_BI-WX_RFC, S_BI-WHM_RFC, SAP_ALL, SAP_NEW).
    1. Now, when I start the InfoPackage with "Start data load immediately", I get an
    e-mail sent by user RFCUSER with the following content:
    Error message from the source system
    Diagnosis
    An error occurred in the source system.
    System Response
    Caller 09 contains an error message.
    Further analysis:
    The error occurred in Service API .
    2. Error in the detail tab of the call monitor, as under:
    <b>BI data upload error: status 51 - error: Error 8 when starting the extraction program</b>
    Extraction (messages): Errors occurred
      Error occurred in the data selection
    Transfer (IDocs and TRFC): Errors occurred
    Request IDoc : Application document not posted (red)
      bw side: status 03: "IDoc: 0000000000007088 Status: Data passed to port OK,
                                    IDoc sent to SAP system or external program"
      r<b>/3 side: status 51:  IDoc: 0000000000012140 Status: Application document not posted
                                   Error 8 when starting the extraction program</b>
    Info IDoc 1 : Application document posted (green)
       r/3 side: "IDoc: 0000000000012141 Status: Data passed to port OK
                     IDoc sent to SAP system or external program"
       bw side: "IDoc: 0000000000007089 Status: Application document posted,
                     IDoc was successfully transferred to the monitor updating"
    Have attached screen shots showing error at BW side.
    •     Checked that the connection is OK; tried to restore the settings for the BW-R/3 connection - still the same problem.
    •     Have checked partner profile.
    <b>What is wrong with the process?</b>
    Best regards,
    dushyant.

    Hi,
       Refer to note 140147.
    Regards,
    Meiy

  • 4.2.3/.4 Data load wizard - slow when loading large files

    Hi,
    I am using the data load wizard to load CSV files into an existing table. It works fine with small files of up to a few thousand rows. When loading 20k rows or more, the loading process becomes very slow. The table has a single numeric column as primary key.
    The primary key is declared at "Shared Components" -> Logic -> "Data Load Tables" and is recognized as "pk(number)" with "Case Sensitive" set to "No".
    While loading data, this configuration leads to the execution of the following query for each row:
    select 1 from "KLAUS"."PD_IF_CSV_ROW" where upper("PK") = upper(:uk_1)
    which can be found in the v$sql view while loading.
    It makes the loading process slow because, due to the UPPER function, no index can be used.
    It seems that the "Case Sensitive" setting is not evaluated.
    Dropping the numeric index for the primary key and using a function-based index does not help.
    Explain plan shows an implicit TO_CHAR conversion:
    UPPER(TO_CHAR(PK))=UPPER(:UK_1)
    This is missing in the query, but maybe it is necessary for the function-based index to work.
    Please provide a solution or workaround for the data load wizard to work with large files in an acceptable amount of time.
    Best regards
    Klaus

    Nevertheless, a bulk loading process is what I would really like to have as part of the wizard.
    If all of the CSV files are identical:
    use the Excel2Collection plugin ( - Process Type Plugin - EXCEL2COLLECTIONS )
    create a VIEW on the collection (makes it easier elsewhere)
    create a procedure (in a Package) to bulk process it.
    The most important thing is to have, somewhere in the package (i.e. your code that is not part of APEX), information that clearly states which columns in the collection map to which columns in the table or view, and to the variables (APEX_APPLICATION.g_fxx()) used for tabular forms.
    MK
