Error in loading transaction data from flat file

Hi everybody,
While I try to load data from a flat file, at the time of scheduling it shows the error "Check load from InfoSource", and while monitoring the load it shows the error "The corresponding data packets were not updated using the PSA". What can I do to rectify the problem?
Thanks in advance

Hi,
Please check whether the data is in the correct format using the "Preview" option in the InfoPackage.
Check whether all the related AWB objects are active.
Regards,
K.Manikandan.

Similar Messages

  • Loading transaction data from flat file to SNP order series objects

    Hi,
    I am a BW developer and I need to provide data to my SNP team.
    Can you please let me know more about loading transaction data (sales orders, purchase orders, etc.) from external systems into order-based SNP objects/structures? There is a third-party tool called webconnect that gets data from external systems and can provide it as flat files or database tables, in whatever format we require.
    I know we can use BAPIs, but I don't know how. Can you please send any sample ABAP program code that calls a BAPI to read a flat file and write to SNP order series objects?
    Please let me know ASAP how to get data from a flat file into SNP order-based objects, and with which options; I will be very grateful.
    thanks in advance
    Rahul

    Hi,
    Please go through the following links:
    https://forums.sdn.sap.com/click.jspa?searchID=6094493&messageID=4180456
    https://forums.sdn.sap.com/click.jspa?searchID=6094493&messageID=4040057
    https://forums.sdn.sap.com/click.jspa?searchID=6094493&messageID=3832922
    https://forums.sdn.sap.com/click.jspa?searchID=6094493&messageID=4067999
    Hope this helps...
    Regards,
    Habeeb

  • Error loading master data from flat file to InfoCube: "No SID found for val"

    Hi gurus,
    While loading data from a flat file to an InfoCube, I get the error message "No SID found for value 'HOT ' of characteristic 0UNIT" for the material name field.
    In the flat file one field has lowercase characters, so I have defined a formula for upper case in the transfer rule.
    After scheduling, the message is "data was requested", but in Monitor -> Detail -> Request the extraction status is green while the transfer and processing statuses are red.
    The overall status is red.

    It appears your transfer structure and source file are not mapped correctly. Make sure your source file is in the correct format; try the "Preview" function to confirm.
    Also, are the file and your logon language the same? For instance, is the file from a German user while you are logged in in English? You will see errors like this when the file uses ST (Stück = pieces) instead of PC (pieces in English).
    To test for a bad mapping, hard-code the unit to a fixed value (PC, for instance) and see whether the load goes through.
    Also, the error message you are getting points to the transfer structure, so please re-activate it. If the error persists, try skipping the PSA and loading straight into the data target.

  • Error while loading Transactional data from NW BW Infoprovider

    Hi,
    I am trying to load transactional data using the delivered "Transactional data from NW BW InfoProvider" package and am getting the error message "Error occurs when loading transaction data from other cube".
    Below is the transformation file content:
    *OPTIONS
    FORMAT = DELIMITED
    HEADER = YES
    DELIMITER = TAB
    AMOUNTDECIMALPOINT = .
    SKIP = 0
    SKIPIF =
    VALIDATERECORDS=NO
    CREDITPOSITIVE=YES
    MAXREJECTCOUNT= 9999999999999
    ROUNDAMOUNT=
    *MAPPING
    ACCOUNT = 0ACCOUNT
    BUSINESSTYPE = *NEWCOL(NOBTYPE)
    BWSRC = *NEWCOL(R3)
    COMPANYCODE = 0COMP_CODE
    DATASRC = *NEWCOL(R3)
    FUNCTIONALAREA = *IF(0FUNC_AREA = *STR() then *STR(NOFA);0FUNC_AREA)
    GAME = *NEWCOL(NOGAME)
    INPUTCURRENCY = *IF(0CURRENCY = *STR() then *STR(NOCURR) ; 0CURRENCY)
    PROBABILITY = *NEWCOL(NOPROB)
    PRODUCT = *NEWCOL(NOPROD)
    PROFITCENTER = 0PROFIT_CTR
    PROJECT = *NEWCOL(NOPROJECT)
    TIME = 0FISCPER
    VERSION = *NEWCOL(REV0)
    WEEKS = *NEWCOL(NOWEEK)
    SIGNEDDATA= 0AMOUNT
    *CONVERSION
    Below is the Error Log
    /CPMB/MODIFY completed in 0 seconds
    /CPMB/INFOPROVIDER_CONVERT completed in 0 seconds
    /CPMB/CLEAR completed in 0 seconds
    [Selection]
    InforProvide=ZPCA_C01
    TRANSFORMATION= DATAMANAGER\TRANSFORMATIONFILES\R3_TD_P&L.xls
    CLEARDATA= Yes
    RUNLOGIC= No
    CHECKLCK= Yes
    [Messages]
    Task name CONVERT:
    No 1 Round:
    Error occurs when loading transaction data from other cube
    Application: ACTUALREPORTING Package status: ERROR
    Has anyone had this issue before, or is there something that needs to be looked at in the BW-delivered task?
    Any input will be appreciated.
    Sanjay

    Hello Gurus,
    Currently I am getting the below error while loading cost center master data from BW to BPC.
    Task name MASTER DATA SOURCE:
    Record count: 189
    Task name TEXT SOURCE:
    Record count: 189
    Task name CONVERT:
    No 1 Round:
    Info provider  is not available
    Application: ZRB_SALES_CMB Package status: ERROR
    Can anybody tell me if I have missed anything?
    Regards,
    BI NEW

  • Can I load transaction data through flat files without loading master data?

    Hi all,
    Is it possible to load history data using flat files into a sales cube without having loaded any master data?
    thanks

    Hi,
    You can load the transaction data without loading the master data. You have to check the option in the update rules ("Update also if no master data exists").
    In reporting you will not be able to see the attribute and text data. It is always advisable to load the master data first, run the apply hierarchy/attribute change, and then load the transaction data.
    AHP: Nice to see you after a long time.
    Regards
    Rak

  • Deleting master data after loading transactional data using flat file

    Dear All,
    I have loaded transaction data into an InfoCube using a flat file. While loading via the DTP I checked the option "Load transactional data without master data exists", so transactional data is loaded even when no master data is present in BW.
    While loading the flat file I made a mistake for the DIVISION characteristic: the original master data value is '04000', but I loaded the transactional data with the value '4000'. I realized this later after seeing the data in the InfoCube, deleted the request, and then reloaded the data with the value '04000'. Up to this point everything is fine.
    But when I look at the master data for DIVISION, I can see a new entry with the value '4000'.
    My question is how to delete the entry '4000' from DIVISION. I tried deleting this entry manually from "Maintain master data", but it is not allowing me to do so.
    I have also checked whether any transactional data exists for the value '4000'; as I said earlier, I have deleted the transactional data with that value. I even tried to delete the entries from the master data table, but I do not see an option to delete entries there.
    Please suggest me on this.
    Regards,
    Veera

    Hi,
    Go to RSA1, right-click on the InfoObject and select "Delete Master Data". This will delete the unused master data existing in the table.
    If this master data is not used anywhere else, delete the master data completely with the SID option.
    If even this does not work, you can delete the entire table in SE14. But this will wipe out the whole table, so be sure that is really what you want to do.
    Hope this helps
    Akhan.

  • How to load data from a flat file (e.g. Excel) into a planning area directly

    Hi all,
    How can I load data from a flat file directly into a planning area?
    Please help me with this.
    Regards,
    Chandu .

    You want to download data for one key figure from the planning book (interactive demand plan), make some changes, and upload the data back to the same planning book.
    But may I know why you are thinking of downloading, changing and uploading just to change the figures for a particular key figure? You can do it in the planning book itself.
    However, not all key figures can be changed. What type of key figure are you speaking of here? Is it like 'Forecast', for which the value is based on other key figures, or is it a key figure where some manual adjustments are to be made, so that it can be manually edited? In both cases the data can be changed in the planning book itself: in the first case you change the values of the dependent key figures, and in the second case you change the key figure directly.
    Please also note that you can change the values of the key figures only at the detailed level. So, after loading the data in the book, use the drill-down option, maintain the data at the detailed level and change the figures; this automatically gets reflected at the higher level.
    In case you are unable to change the values, go to the 'Design' mode of the book, right-click your key figure and, under "Selected Rows", uncheck the "Output Only" option. If you cannot see that option, you are not authorised to change it; see if you can change the authorisation by going to the "Data View" tab in the planning book configuration (/n/sapapo/sdp8b) and changing the value of Status to 3.
    Hope your query is answered by the different solutions offered by the SDN colleagues here.
    Regards,
    Guru Charan.

  • Error while loading transaction data from an Infoprovider Mapping C_ACCT

    Hi Gurus,
    We are implementing BPC Consolidation for NetWeaver 7.5, and we're facing the following situation:
    We have loaded master data from 0RC_ACCOUNT (group account) into our dimension C_ACCT without a problem, but when we run the loading package from the InfoProvider 0FIGL_C10 (transactional data), the following error message appears:
    Dimension: Member C_ACCT: 000 not valid
    We have already checked the data in the source InfoProvider and there is no record with an account value of 000.
    Any suggestions?
    Thanks in advance
    Best Regards
    Abraham Méndez

    Rad,
    See if SAP Notes 492647 and 849501 are of some help in your scenario.

  • Error while loading the data from text file

    Hi,
    I got the error "Data Value Encountered before all Dimensions selected" while loading data from a text file.
    Can anyone please suggest a solution?

    Possible Solutions
    Make sure that the data source is valid.
    Is a member from each dimension specified correctly in the data source or rules file?
    Is the numeric data field at the end of the record? If not, move the numeric data field in the data source or move the numeric data field in the rules file.
    Are all members that might contain numbers (such as "100") enclosed in quotation marks in the data source?
    If you are using a header, is the header set up correctly? Remember that you can add missing dimension names to the header.
    Does the data source contain extra spaces or tabs?
    Has the updated outline been saved?

  • How to Load Arabic Data from flat file using SQL Loader ?

    Hi All,
    We need to load Arabic data from an XLS file into an Oracle database. Could you please provide a good note/steps to achieve this?
    Below are the database parameters used:
    NLS_CHARACTERSET AR8ISO8859P6
    NLS_LANGUAGE AMERICAN
    DB version: 10g Release 2
    OS: RHEL 5
    Thanks in advance,
    Satish

    Try saving your XLS file in CSV format and either set NLS_LANG to the right value or use the SQL*Loader control file parameter CHARACTERSET.
    See http://download.oracle.com/docs/cd/B19306_01/server.102/b14215/ldr_control_file.htm#i1005287
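    For reference, a minimal control file along those lines could look like the sketch below. It assumes the XLS has been saved as a comma-separated file in the Windows Arabic code page (AR8MSWIN1256), and the table ARABIC_NAMES and its columns are only placeholders for your own objects:
    -- arabic_names.ctl (hypothetical name); CHARACTERSET tells SQL*Loader the
    -- encoding of the data file so it can be converted to the database character set.
    LOAD DATA
    CHARACTERSET AR8MSWIN1256
    INFILE 'arabic_names.csv'
    APPEND INTO TABLE arabic_names
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    (id, name_ar)
    You would then run it with something like: sqlldr userid=scott/tiger control=arabic_names.ctl log=arabic_names.log. Alternatively, leave CHARACTERSET out and set NLS_LANG (for example AMERICAN_AMERICA.AR8MSWIN1256) in the environment before calling sqlldr.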

  • Error loading transaction data from csv file

    Hello all
    I am getting the following error when loading from a .csv file:
    Error 'The argument '-65.025,60' cannot be interpreted as a number' on assignment field BALANCE record 1 value -65.025,60
    The BALANCE field is defined as 0D_BALANCE (from the SAP demo cubes).
    Any help on this is appreciated.
    TIA

    Thank you all for your help.
    I made the changes, but I was still getting the error until I changed one more item:
    Under the FIELDS tab, I set the "FORMAT" to "External".
    After doing this, all worked well.
    Thanks again!

  • How to load the data from flat file

    Hi,
    I'm new to Oracle 10g. I have to upload a flat file (a notepad file with "," as the separator) into a table in the database.
    Can you please tell me how to do it?
    Thanks
    vivi

    Hi vivi,
    Sure, read about the wonderful SQL*Loader. This tool's purpose is exactly that.
    Here's a link: the Tahiti start page. Just type in sql*loader and off you go.
    Since you did not mention your database version, I could not post a more specific one.
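    To give you a rough idea, a minimal SQL*Loader control file for a comma-separated file could look like the sketch below; the table EMPLOYEES and its columns are just placeholders, so substitute your own names:
    -- load_employees.ctl (hypothetical); loads a comma-separated text file
    -- into an existing table, appending to whatever rows are already there.
    LOAD DATA
    INFILE 'employees.txt'
    APPEND INTO TABLE employees
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    TRAILING NULLCOLS
    (emp_id, emp_name, salary)
    You would then call it from the command line with something like: sqlldr userid=scott/tiger control=load_employees.ctl log=load_employees.log.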
    Regards,
    Guido

  • How to pass parameters to the Control file while loading the data from flat file to staging table

    Thanks in advance

    Hi,
    The LOAD DATA statement is required at the beginning of the control file.
    INFILE: the INFILE keyword is used to specify the location of the data file or files.
    INFILE * specifies that the data is found in the control file itself and not in an external file. INFILE '$FILE' can be used to pass the file path and file name as a parameter when the loader is registered as a concurrent program.
    INFILE '/home/vision/kap/import2.csv' specifies the file path and file name explicitly.
    Hope this will help you.
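    Outside of a concurrent program, another common way to parameterise the file name is to keep a placeholder INFILE in the control file and override it with the DATA parameter on the sqlldr command line. A sketch is below; the table and column names are made up for illustration:
    -- load_staging.ctl (hypothetical); the INFILE value is only a placeholder,
    -- since the DATA command-line parameter overrides the first INFILE clause.
    LOAD DATA
    INFILE 'placeholder.dat'
    APPEND INTO TABLE staging_table
    FIELDS TERMINATED BY ','
    TRAILING NULLCOLS
    (col1, col2, col3)
    Each run can then point at a different file, for example: sqlldr userid=scott/tiger control=load_staging.ctl data=/home/vision/kap/import2.csv log=import2.log.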

  • Given data (10,20,.,30), how to load the data from a flat file into a base table and eliminate the dot from the data using SQL*Loader

    Please send an answer for this.

    1b5595eb-fcfc-48cc-90d2-43ba913ea79f wrote:
    Please send an answer for this.
    Use any text editor to eliminate the dot before loading.
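    If editing the file is not an option, SQL*Loader itself can treat the lone dot as a null during the load. Here is a hedged sketch; the table BASE_TABLE and columns COL1 to COL4 are placeholders, and the third field is assumed to be the one that sometimes contains only a dot:
    -- strip_dot.ctl (hypothetical); NULLIF loads the field as NULL whenever the
    -- incoming value is just a dot, so the record 10,20,.,30 becomes 10, 20, NULL, 30.
    LOAD DATA
    INFILE 'numbers.dat'
    APPEND INTO TABLE base_table
    FIELDS TERMINATED BY ','
    (col1,
     col2,
     col3 CHAR NULLIF (col3 = '.'),
     col4)
    If you would rather strip the dot character out of otherwise valid values instead of nulling the field, an alternative (again an assumption about your data, so test it first) is a SQL expression on the field, e.g. col3 CHAR "REPLACE(:col3, '.', '')".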

  • Error in loading data from flat file....

    Hi,
    While loading data from a flat file, only one column is getting populated in the ODS, and after that the sixth column is getting populated with null values. Also, when I save changes in the CSV file, they are not getting saved.
    Could you please tell me what the probable reason for this could be?
    Also, what points should we keep in mind while loading from flat files?
    Regards,
    Jeetu

    Hi,
    You need to take care of the following:
    1. Your flat file structure (columns left to right) should match the DataSource (fields top to down) column by column. If you don't wish to load some column, leave it as a blank column in the flat file.
    2. Your file should not be open while loading data.
    3. Make sure your transfer rules and update rules are defined properly.
    4. Do the simulation in the InfoPackage first; that will give you a fair idea.
    5. To start with, load the data into the PSA only, check it, and then take it forward.
    Hope it helps
    Regards
    Vikash
