Error while uploading data into cube

I am trying to upload data into my content cube, but I get an error that says:
"Time conversion from 0CALDAY to 0FISCPER (fiscal year S1) failed with value 20040303"
I checked the data in the PSA and it is there, but the first record does not have a green light; it has a red light. Could you please give me some idea how to solve this problem?
Thank you in advance
sajita

The problem is probably in the fiscal year variant. You can transfer the global settings from the source system; if you do not want to take over all settings (exchange rates in particular may be critical), just take over the fiscal year variants.
If the problem remains you could check the following things:
In SPRO -> Global Settings -> Fiscal Year Variants (or similar) check:
Does a fiscal year variant S1 exist?
Is it time-dependent? If yes, is it valid for March 3rd, 2004?
If it is a self-defined variant, check whether a period is defined for March 3rd, 2004. You can also test the conversion directly, as in the sketch below.
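For a quick test, a minimal ABAP sketch like the following shows whether the date converts under variant S1. It assumes the standard function module DATE_TO_PERIOD_CONVERT and the fiscal year variant tables (T009/T009B) are available in the system where you run it; treat it as an illustration, not the official check.

* Sketch: check whether 2004-03-03 maps to a fiscal period under variant S1.
DATA: lv_buper TYPE poper,
      lv_gjahr TYPE gjahr.

CALL FUNCTION 'DATE_TO_PERIOD_CONVERT'
  EXPORTING
    i_date         = '20040303'
    i_periv        = 'S1'
  IMPORTING
    e_buper        = lv_buper
    e_gjahr        = lv_gjahr
  EXCEPTIONS
    input_false    = 1
    t009_notfound  = 2
    t009b_notfound = 3
    OTHERS         = 4.

IF sy-subrc <> 0.
  " No period could be determined: the variant is missing or not maintained
  " for this date, which matches the load error above.
  WRITE: / 'No period found for 20040303 under S1, sy-subrc =', sy-subrc.
ELSE.
  WRITE: / 'Period:', lv_buper, 'Fiscal year:', lv_gjahr.
ENDIF.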
Best regards
   Dirk

Similar Messages

  • Dead lock error while updating data into cube

    We have a scenario of a daily truncate and upload of data into the cube, and volumes arrive at about 2 million records per day. We use the parallel processing setting (PSA and data targets in parallel) in the InfoPackage to speed up the data load. This entire process runs through a process chain.
    We are facing a deadlock issue every day. How can we avoid this?
    In general, deadlocks occur because of degenerated indexes when the volumes are very high. So my question is: does deleting the indexes of the cube every day, along with the 'delete data target content' process, help to avoid the deadlock?
    We have also observed that updating values into one InfoObject takes a long time, approximately 3 minutes for each data packet. That InfoObject is placed in a dimension defined as a line-item dimension, because the volumes are very high for that specific object.
    So this is the overall scenario.
    Two things:
    1) Will deletion and recreation of the indexes help to avoid the deadlock?
    2) Any idea why the insertion into the InfoObject is taking so long (there is a direct read on the SID table of that object, as observed in the SQL statement)?
    Regards.

    Hello,
    1) Will deletion and recreation of the indexes help to avoid the deadlock?
    Ans:
    To avoid this problem, drop the indexes of the cube before uploading the data and rebuild them afterwards (a sketch follows below).
    Also:
    Find out in SM12 which process is holding the lock and delete it.
    Find out in SM66 which process has been running for a very long time and stop it.
    Check transaction SM50 for the number of work processes available in the system. If they are not adequate, you have to increase them with the help of the Basis team.
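    For the drop/rebuild step, a process chain normally uses the 'Delete Index' and 'Generate Index' process types. If you need to do it in ABAP, something along the following lines is commonly used; the function module names and the cube name are assumptions for illustration, so please verify them in SE37 in your system.

    * Sketch (assumed FM names): drop the cube's secondary indexes before the
    * load and rebuild them afterwards.
    CALL FUNCTION 'RSDU_INFOCUBE_INDEXES_DROP'
      EXPORTING
        i_infocube = 'ZSALES_C01'.          " hypothetical cube name

    * ... load the data here ...

    CALL FUNCTION 'RSDU_INFOCUBE_INDEXES_REPAIR'
      EXPORTING
        i_infocube = 'ZSALES_C01'.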
    2) Any idea why the insertion into the InfoObject is taking so long (there is a direct read on the SID table of that object, as observed in the SQL statement)?
    Ans:
    A line-item dimension is one of the ways to improve data load as well as query performance, by eliminating the need for a dimension table. So while loading/reading, there is one less table to deal with.
    Check in the transformation mapping of that characteristic whether any routine/formula is written. If so, this can lead to more processing time for that InfoObject.
    Storing mass data in InfoCubes at document level is generally not recommended, because when data is loaded, a huge SID table is created for the document number line-item dimension.
    Check whether your InfoObject is similar to a document number.
    Regards,
    Dhanya

  • Error while loading data into cube 0calday to 0fiscper (2lis_13_vdcon)

    Hi all,
    I am getting the following error while loading data into the cube.
    "Time conversion from 0CALDAY to 0FISCPER (fiscal year V3 ) failed with value 10081031"
    amit shetye

    Hi Amit,
    This is a conversion problem: the calendar is not maintained for fiscal year variant "V3" for the year 1008.
    Maintain the calendar for year 1008 and transfer the global settings from the source system (R/3):
    RSA1 --> Source Systems --> context menu --> Transfer Global Settings --> choose fiscal year variants and calendar --> execute
    Hope it Helps
    Srini

  • Error while loading data into cube

    Hi BW gurus,
    Whenever I try to load data into the cube from a flat file, after scheduling I get a short dump in the BW system. I checked it in ST22 and it gives the exception ADD_PARTITION_FAILED. Please help me sort out this problem. If you know the error recovery, please give me the answer in detail.
    I will assign points for good answers.

    This is what the note says:
    Symptom
    The process of loading transaction data fails because a new partition cannot be added to the F fact table. The loading process terminates with a short dump.
    Other terms
    RSDU_TABLE_ADD_PARTITION_ORA, RSDU_TABLE_ADD_PARTITION_FAILED, TABART_INCONSITENCY, TSORA, TAORA , CATALOG
    Reason and Prerequisites
    The possible causes are:
    SQL errors when creating the partition
    Inconsistencies in the Data Dictionary control tables TAORA and TSORA
    Solution
    BW 3.0A & BW 3.0B
    In the case of SQL errors: analyze the SQL code in the system log or short dump and, if possible, eliminate the cause. The cause is often a disk space problem or lock situations on the database catalog, or, less frequently, that the partitioning option is not installed in the ORACLE database.
    The most common cause of the problem is inconsistencies in the TAORA and TSORA tables. As of Support Package 14 for BW 3.0B/Support Package 8 for BW 3.1C, the TABART_INCONSITENCY exception is issued in this case. The reason is almost always missing entries in TSORA for the tablespaces of the DDIM, DFACT and DODS data classes.
    The TAORA table contains the assignment of data classes to data tablespaces and their attributes, for example:
    Data class   Tablespace
    DDIM         PSAPDIMD    ...
    DFACT        PSAPFACTD   ...
    DODS         PSAPODSD    ...
    For each data tablespace, the TSORA table must contain an entry for the corresponding index tablespace, for example:
    TABSPACE INDSPACE
    PSAPDIMD PSAPDIMD
    PSAPFACTD PSAPFACTD
    PSAPODSD PSAPODSD
    In most cases, these entries are missing and have to be added. See also notes 502989 and 46272.
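    To check quickly what is currently maintained, a small read-only ABAP sketch like the one below lists the TSORA entries; the TABSPACE/INDSPACE field names are taken from the note text above and should be verified in SE11. Missing rows then have to be added as described in notes 502989 and 46272.

    * Read-only sketch: list the data/index tablespace assignments in TSORA.
    DATA: lt_tsora TYPE STANDARD TABLE OF tsora,
          ls_tsora TYPE tsora.

    SELECT * FROM tsora INTO TABLE lt_tsora.

    LOOP AT lt_tsora INTO ls_tsora.
      " Field names as shown in the note above (assumption; verify in SE11)
      WRITE: / ls_tsora-tabspace, ls_tsora-indspace.
    ENDLOOP.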

  • Error -- While Uploading Data into Planning Book

    Hi all...
    working on SCM 4.1...
    In the planning book, while loading the data of one SKU-location into the PB, I am getting the error below:
    "Overflow during propagation calculation: Number too large"
    Message no. /SAPAPO/OM_TS078
    Diagnosis
    An overflow occurred while propagating a change to a key figure within the LCA routine.
    In this case an internal error in the LCA routine is concerned.
    Procedure: Create an OSS message as detailed in SAP Note 167280.
    Can anybody help me resolve this? I am not able to view the data in the planning book for one of the product-locations.
    Regards,
    Rajesh Patil

    Hi Senthil,
    For some of the SKUs it gives the error because abrupt values are stored in liveCache for these SKUs (CVCs); that is why it gives the error.
    The only solution is that I have to delete the CVCs and create new ones.
    But I will lose the data, as we cannot take a backup of the old CVC data (it gives a COM error).
    The reason for the abrupt values in liveCache could be:
    Macros/Alerts
    How can we fix these types of errors, as I am getting them regularly?
    Is there any note/patch to overcome this issue?
    It would be great to get a solution.
    Regards,
    Rajesh Patil

  • While uploading data into the r/3 using call transaction or session method

    Hi experts,
    While uploading data into R/3 using the call transaction or session method, if an error occurs in the middle of processing, how do these methods behave? Do they transfer the next records or not?

    Hi,
    Session method: The records are not added to the database until the session is processed. sy-subrc is not returned. Error logs are created for error records. Updation in database table is always Synchronous.
    Call Transaction method: The records are immediately added to the database table. sy-subrc is returned to 0 if successful. Error logs are not created and hence the errors need to be handled explicitly. Updation in database table is either Synchronous or Asynchronous.
    With the session method, if any errors occur, the data is not transferred to the SAP system until the errors are corrected.
    The system always shows the errors; they are stored in the error log of the session (transaction SM35).
    So the session method does not return a value.
    With the call transaction method, the data is passed directly to the SAP system, so a return value is always available:
    CALL TRANSACTION behaves like a function call and always sets sy-subrc.
    In the session method, errors are stored in the system-generated error log.
    In call transaction, to capture the errors we should do the following (see the sketch below):
    First, declare an internal table with the structure of BDCMSGCOLL.
    Then, when writing the CALL TRANSACTION statement, use the 'E' mode and collect all the messages into that internal table.
    Finally, the captured messages can be turned into readable text using the function module "FORMAT_MESSAGE".
    The error messages are thus available in the internal table that we declared at the beginning.
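    As an illustration of the steps above, here is a minimal sketch. The transaction code, the BDC data, and the processing options are assumptions for the example; only the BDCMSGCOLL structure and the FORMAT_MESSAGE function module are the standard pieces mentioned in the text.

    * Sketch: capture BDC messages from CALL TRANSACTION and format them.
    DATA: it_bdcdata   TYPE STANDARD TABLE OF bdcdata,
          it_msg       TYPE STANDARD TABLE OF bdcmsgcoll,
          wa_msg       TYPE bdcmsgcoll,
          lv_text(200) TYPE c.

    * ... fill it_bdcdata with the screens and fields of the transaction ...

    CALL TRANSACTION 'XK01'          " hypothetical transaction code
      USING it_bdcdata
      MODE 'E'                       " show screens only when an error occurs
      UPDATE 'S'
      MESSAGES INTO it_msg.

    LOOP AT it_msg INTO wa_msg WHERE msgtyp = 'E' OR msgtyp = 'A'.
      " Convert the raw message variables into a readable text
      CALL FUNCTION 'FORMAT_MESSAGE'
        EXPORTING
          id        = wa_msg-msgid
          lang      = sy-langu
          no        = wa_msg-msgnr
          v1        = wa_msg-msgv1
          v2        = wa_msg-msgv2
          v3        = wa_msg-msgv3
          v4        = wa_msg-msgv4
        IMPORTING
          msg       = lv_text
        EXCEPTIONS
          not_found = 1
          OTHERS    = 2.
      WRITE: / lv_text.
    ENDLOOP.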

  • DB Connect Load - "Unknow error while uploading data from the DB Table"

    Hi Experts,
    We have our BI7 system connected to an Oracle DB-based third-party tool. The loads are performing quite well in the DEV environment.
    I would like to know how we transport DB Connect DataSources to Quality systems. Is there any different process to be followed for DB Connect DataSources?
    At present, the connections between BI Quality and the third-party quality systems are established. We transported the DataSource from the BI DEV system to the BI Quality system, but on triggering an InfoPackage we are not able to perform loads. It prompts: "Unknow error while uploading data from the DB Table".
    Also, on comparing the DataSources in the DEV and Quality systems, there are no fields in the "Proposal" tab of the DataSource in the Quality system. I also cannot change or activate the DataSource in the Quality system, as we don't have change access there.
    Please advise.
    Thanks,
    Abhijit

    Hi,
    Sorry for bumping an old thread ....
    Did this issue ever get resolved?
    I am facing the same one. The loads work successfully in Dev, and the transport for the DB Connect DS also moved in successfully.
    One strange thing is that the DB user for Dev did not automatically change to the DB user for Quality when I transported the DB Connect DataSource; the DS still shows me the DB user from Dev in the Quality system.
    I get "Unknown Error" whenever I trigger the data package.
    Advait

  • Error while uploading file into KM

    Hi Experts,
    I am getting an error while uploading a file into KM. It throws an error message like "Syngenta-POC.doc does not exist, or file is empty; you cannot upload empty files".
    Please assist me.

    Shantanu,
    Please check whether the file you are uploading is empty or in an unknown format. If everything seems valid, then look for a corresponding SAP Note on service.sap.com.
    Ram

  • Ways to Upload data into Cube,

    1. What are the methods to upload data into a cube, apart from flat file mode (.csv)?
    2. I have some data in the planning book in a key figure, e.g. Sales Order. I need to upload this sales order data to another cube, say a forecast cube. How can I do it?
    How do I upload the data from the planning book to another cube?
    Thanks!

    Please follow the steps:
    1.      If necessary, replicate the DataSource. To do so, you can use the following options in the Data Warehousing Workbench:
    ○     Select the source system in the source system overview and choose Replicate DataSources in the context menu. This replicates all data sources in the source system.
    ○     Select the data source in the DataSource overview and choose Replicate Metadata in the context menu. This replicates just the one DataSource.
      2.      Create an InfoSource and assign the data source to it. To do so, choose your application component on the InfoSource page of the Data Warehousing Workbench. In the context menu, choose Create InfoSource. On the next dialog box, select Transactional Data. Another dialog box appears. Enter a name and description for the new InfoSource and choose Enter. In the tree, select the new InfoSource. In the context menu, choose Assign DataSource. On the dialog box that appears, enter the source system. A list of DataSources appears. Select the required DataSource. Choose Enter.
    Alternatively, you can remain in the DataSource overview. An icon indicates that no InfoSource has been assigned yet. Either click the icon or choose Assign InfoSource in the context menu. On the dialog box that appears, enter a name for the InfoSource. Choose . On the next dialog box, enter a description and choose . Confirm the following dialog box. You can now maintain the InfoSource.
    You can assign a DataSource to one InfoSource only.
    3.      Create a DataStore object in the InfoProvider overview of the Data Warehousing Workbench.
    a.      Select the InfoArea and then choose Create DataStore Object in the context menu. The Edit DataStore Object dialog box appears.
      b.      Enter a name and a short description. If required, you can also specify a DataStore object to use as a template. Choose . The Edit DataStore Object dialog box appears.
       c.      On the left-hand side of the screen, you can select InfoObjects, for example, InfoCubes or InfoObjectCatalogs. You can copy characteristics or key figures from these InfoObjects to the DataStore object. We suggest that you select either the InfoCube to which you want to copy the data, or the InfoSource.
         d.      Copy the characteristics to the key fields in the right-hand tree in the DataStore object and copy the key figures to the data fields. In both cases, use drag and drop. You might have to transfer the 0RECORDMODE InfoObject from the Business Content.
         e.      In the Settings branch of the DataStore tree, set the following indicators:
    ■      Set quality status to 'OK' automatically
    ■      Activate DataStore object data automatically
    ■      Update data targets from DataStore object automatically
      f.      Activate the DataStore object.
    For more information, see DataStore Object.
           4.      Create update rules for the DataStore object.
    a.      Select the DataStore object in the data targets page (Data Warehousing Workbench).
    b.      Choose Create update rules from the context menu. The Create Update Rules screen appears.
    c.      Enter the InfoSource that you created in step 2. Choose . Edit the update rules as necessary.
    d.      Activate the update rules by choosing .
           5.      Create update rules for the InfoCube as above, but with the DataStore object as the data source.
           6.      Create an InfoPackage for the InfoSource. In contrast to the normal procedure, on the Processing tab page, set the Only PSA and Update Subsequently in Data Targets indicators. Start or schedule the data load.

  • Getting error while loading  Data into ASO cube by flat file.

    Hi All,
    I am getting the error "Essbase error 1270040: Data load buffer [1] does not exist" while loading data into an ASO cube.
    Does anyone have a solution?
    Regards,
    VM

    Are you using ODI to load the data, or MaxL? If you are using an ODI interface, are you also using a load rule? And which versions of Essbase and ODI are you using?
    Cheers
    John
    http://john-goodwin.blogspot.com/

  • Error while loading  data into External table from the flat files

    Hi,
    We have a data load in our project which feeds Oracle external tables with data from flat files (.bcp files) on Unix.
    While loading the data, we are encountering the following error.
    Error occurred (Error Code : -29913 and Error Message : ORA-29913: error in executing ODCIEXTTABLEOPEN callout
    ORA-29400: data cartridge error
    KUP-04063: un) while loading data into table_ext
    Please let us know what needs to be done in this case to solve this problem.
    Thanks,
    Kartheek

    Kartheek,
    I used Google (mine still works).... please check those links:
    http://oraclequirks.blogspot.com/2008/07/ora-29400-data-cartridge-error.html
    http://jonathanlewis.wordpress.com/2011/02/15/ora-29913/
    HTH,
    Thierry

  • Error while Inserting data into flow table

    Hi All,
    I am very new to ODI and I am facing a lot of problems with my first interface. So I have many questions here; please forgive me if this irritates you.
    ========================
    I am developing a simple project to load data from an input source (csv) file into a staging table.
    My plan is to achieve this in 3 interfaces:
    1. Interface-1 : Load the data from an input source (csv) file into a staging table (say Stg_1)
    2. Interface-2 : Read the data from the staging table (stg_1) apply the business rules to it and copy the processed records into another staging table (say stg_2)
    3. Interface-3 : Copy the data from staging table (stg_2) into the target table (say Target) in the target database.
    Question-1 : Is this approach correct?
    ========================
    I don't have any key columns in the staging table (stg_1). When I tried to execute the Flow Control of this I got an error:
    Flow Control not possible if no Key is declared in your Target Datastore
    Based on one of the responses in this forum (the response was: "Flow control requires a KEY in the target table"), I introduced a column called "Record_ID", made it a primary key column in my staging table (stg_1), and my problem was resolved.
    Question-2 : Is a key column compulsory in the target table? I have worked in BO Data Integrator, where there is no such compulsion ... I am a little confused.
    ========================
    Next, I defined one project-level sequence. I mapped the newly introduced key column Record_Id (primary key) to the project-level sequence. Now I got another error: "CKM not selected".
    For this, I have inserted "Insert Check (CKM)" knowledge module in my Project. With this the above problem of "CKM not selected" has been resolved.
    Question-3 : When is this CKM knowledge module required?
    ========================
    After this, the flow/interface is failing while loading data into the intermediate ODI-created flow table (I$):
    1 - Loading - SS_0 - Drop work table
    2 - Loading - SS_0 - Create work table
    3 - Loading - SS_0 - Load data
    5 - Integration - FTE Actual data to Staging table - Drop flow table
    6 - Integration - FTE Actual data to Staging table - Create flow table I$
    7 - Integration - FTE Actual data to Staging table - Delete target table
    8 - Integration - FTE Actual data to Staging table - Insert flow into I$ table
    The error is at step 8 above. When I opened the "Execution" tab for this step, I found the message "Missing parameter Project_1.FTE_Actual_Data_seq_NEXTVAL RECORD_ID".
    Question-4 : What/why is this error? Did I make any mistake while creating the sequence?

    Everyone is new and starts somewhere. And the community is there to help you.
    1.) What is the idea of moving data to stg_1 and then to stg_2? Do you really need it for any purpose other than moving data from the source file to the target DB?
    Otherwise, it is simpler to move data directly from SourceFile -> Target Table.
    2.) Does your Target table have a Key ?
    3.) CKM (Check KM) is required when you want to do constraint validation (checking) on your data. You can define constraints (business rules) on the target table, and Flow Control will check the data that is flowing from the source file to the target table using the CKM. All the records that do not satisfy the constraints will be added to E$ (the error table) and will not be added to the target table.
    4.) Try to avoid ODI sequences. They are slow and aren't scalable. Try to use a database sequence wherever possible, and use the DB sequence in the target mapping as
    <%=odiRef.getObjectName( "L" , "MY_DB_Sequence_Row" , "D" )%>.nextval
    where MY_DB_Sequence_Row is the oracle sequence in the target schema.
    HTH

  • Error  while uploading data in table t_499s through BDC Prog

    Hi
    I am facing a problem while uploading data into table T499S through a BDC program. If there are more than 15 records in the file, it does not allow the upload. Kindly suggest what to do.
    Thanx
    Mukesh s

    Hi,
    If you only want to update a single table which has user maintenance allowed,
    you can use the MODIFY statement.
    Example:
    TABLES: t499s.

    LOOP AT itab INTO wa_tab.
      " Copy the fields with matching names into the table work area
      MOVE-CORRESPONDING wa_tab TO t499s.
      " MODIFY inserts the record, or updates it if the key already exists
      MODIFY t499s.
      CLEAR t499s.
    ENDLOOP.
    It will update the table; to check, go to SM30 and view V_T499S.
    Rgds
    Aeda

  • Error while uploading data to ztable from excel file

    Hi,
    I have a requirement where I have to upload data from an Excel file to a Z table. I have used the FM 'ALSM_EXCEL_TO_INTERNAL_TABLE' for reading the Excel file. After reading the Excel file, I used INSERT zrb_hdr FROM TABLE t_zrb_hdr to update the Z table with the data.
    Here it gives the error that the database table zrb_hdr and the internal table t_zrb_hdr should be declared of the same type.
    I get this error because I changed the date and time fields in the t_zrb_hdr table to character type, so the structures of zrb_hdr and t_zrb_hdr are not the same. If I don't change the date and time fields, I do not get proper date and time formats in the output.
    Now how can I upload the data into the Z table?

    Hi,
    Try this.
    DATA: itab    TYPE STANDARD TABLE OF ztable,
          wa_itab TYPE ztable.

    LOOP AT t_zrb_hdr INTO wa_t_zrb_hdr.
      " Move all fields with matching names first ...
      MOVE-CORRESPONDING wa_t_zrb_hdr TO wa_itab.
      " ... then convert the character date/time back into the internal
      " formats of ztable (assuming DD.MM.YYYY and HH:MM:SS in the file).
      CONCATENATE wa_t_zrb_hdr-date+6(4) wa_t_zrb_hdr-date+3(2)
                  wa_t_zrb_hdr-date(2)   INTO wa_itab-date.
      CONCATENATE wa_t_zrb_hdr-time(2) wa_t_zrb_hdr-time+3(2)
                  wa_t_zrb_hdr-time+6(2) INTO wa_itab-time.
      APPEND wa_itab TO itab.
    ENDLOOP.
    Now insert the records from itab into the database table with INSERT ztable FROM TABLE itab.
    Thanks,
    Muthu.

  • Error while loading data into clob data type.

    Hi,
    I have created an interface to load data from an Oracle table into another Oracle table. In the target table we have an attribute with the CLOB data type. While loading data into the CLOB field, ODI gave the error below. I use ODI 10.1.3.6.0.
    java.lang.NumberFormatException: For input string: "4294967295"
         at java.lang.NumberFormatException.forInputString(Unknown Source)
         at java.lang.Integer.parseInt(Unknown Source)
         at java.lang.Integer.parseInt(Unknown Source)
    Let me know if anyone has come across and resolved this kind of issue.
    Thanks much,
    Nishit Gajjar

    Mr. Gajjar,
    You didn't mention which KMs you are using.
    have a read of
    Re: Facing issues while using BLOB
    and
    Load BLOB column in Oracle to Image column in MS SQL Server
    Try again.
    And can you please mark the Correct/Helpful points to the answers too.
    Edited by: actdi on Jan 10, 2012 10:45 AM
