Cost Data Load

I am in the process of loading cost data into a v5.2.2 database. We are using a SQL procedure to load the data into the Cost Item table. An extract of the Cost Item table is attached.
We are unable to see the Cost Type in the DWB Spec section. Please help!
I have a screenshot, but I do not see any way to attach it to this thread...

I don't see your extract, but this may help.
Cost Items need to be tied to a cross reference, so a cross reference system selection is required before you will see the available cost types.
1. Add cross reference to a specification
2. Make sure there is an entry in the database that associates a cost, currency, facility, type, and effective date with that cross reference
3. Make sure to select a cross reference system and currency; the Type drop-down should then populate.
You can find all the data that is needed in chapter 3 of the 5.2 DWB guide.
Data entered into the cost library must have the following information (an illustrative sketch follows the list):
ERP/Cross Reference System—The code associated with the system that sources the cost data.
Equivalent—The equivalent number of the specification that the cost is being applied to.
Cost Type—A classification assigned to the cost.
Cost Set (Facility)—The facility that the cost is tied to. The same material can have different costs across facilities.
Effective Date—The date that the cost information becomes effective in the library.
UOM—The unit of measure in which the cost is specified.
Cost—The cost value, in the currency specified.
Currency—The currency of the entered cost.
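
To make the shape of one cost-library entry concrete, here is a minimal sketch of those fields as a Java record. The field names are illustrative assumptions, not the actual P4P column names; chapter 3 of the DWB guide has the real definitions.

import java.math.BigDecimal;
import java.time.LocalDate;

// Illustrative only: names are assumptions, not the actual P4P schema.
public record CostItem(
        String crossReferenceSystem, // ERP/cross reference system code
        String equivalent,           // equivalent number of the target spec
        String costType,             // classification assigned to the cost
        String costSetFacility,      // facility the cost is tied to
        LocalDate effectiveDate,     // date the cost becomes effective
        String uom,                  // unit of measure the cost is specified in
        BigDecimal cost,             // cost value
        String currency) {}          // currency of the entered cost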

Similar Messages

  • Master data load for 0COSTCENTER (Cost Center) failing

    Hi Experts
    I have a master data load for 0COSTCENTER (Cost Center). The load has been failing at the DTP for the last couple of days.
    (R) Filter Out New Records with the Same Key
          (G) Starting Processing...
          (R) Dump: ABAP/4 processor: DBIF_RSQL_SQL_ERROR
    I am unable to understand the reason for this failure. I tried loading the data one cost center at a time and it still fails, so I doubt it is an internal table storage issue as suggested by the dump.
    Could you help me on this one please.
    Regards
    Akshay Chonkar

    Thanks All
    I have got the issue resolved.
    The Error Stack for the DTP had accumulated so many entries that it was unable to process further.
    So I deleted the entries in the Error Stack using SE14, then executed my DTP again.
    Everything is fine now. Thanks for your help.
    Regards
    Akshay Chonkar

  • GL Account data load - struggling, please help

    Hi all,
    I am mapping the standard InfoObject 0GL_ACCOUNT to an R/3 field of type HKONT. When I try to activate the data loaded in the ODS, I get an error saying no SID exists for '   0003223311'. When I check the contents of the ODS, I can see the GL account '0003223311'. In the R/3 table (the data source), I see the GL account as '0003223311' when I select Standard list in user parameters. However, when I tick the option 'Check conversion exits', I see the GL account as '3223311   '.
    Please help me resolve this issue.
    Thanks all

    When I look at the cost center SID table, I see that the key for the table is a combination of Cost Center and Controlling Area. Does that mean that the corresponding SIDs are for the combination value? For example: in the cost center table I have a SID of 'xxxx' for the combination of cost center '9999' and CO area 'abc'.
    Now I have an ODS which has only the cost center (9999) filled, so when I try to activate the data in the ODS it tells me 'SID not found for cost center 9999'. Is that because the key to the SID table is a combination of Cost Center and CO Area? Do I need to have both populated in my ODS?

  • Design pattern / data loading solution

    Hello all!
    I have been working on a few projects which involve loading data, sometimes remotely, sometimes locally, sometimes JSON, sometimes XML. The problem I am having is that, due to the speed of development and the changing minds of various clients, I am finding my designs are too rigid and I would like them to be more flexible. I have been trying to think of a reusable solution to data loading, and would like some advice, as I imagine many of you out there have had the same problem.
    What I would like to do is create a generic LoadingOperation abstract class, which has member variables of type Parser and Loader, which have parse() and loadData() methods respectively. Parser and Loader are interfaces, and classes that implement them could be XMLParser and JSONParser, LocalLoader and RemoteLoader, etc. With something like this I would like to have a new class extending LoadingOperation for each thing to be loaded, whether that's a local XML file, remote JSON, or whatever.
    The problem is that a specific Parser implementation cannot return custom data types without breaking the polymorphic behaviour of the LoadingOperation class. I have been messing around with generics, declaring subclasses of LoadingOperation like
    class SpecificLoader extends LoadingOperation<CustomDataType>
    and doing similar things with Parser classes, but this seems a bit weird.
    Does anyone have any suggestions on what I'm doing wrong / could be doing better? I want to be able to react quickly to changing specifications (ignoring the fact that they shouldn't be changing that much!) and have a logical separation of code etc...
    thanks for any help!
    PS: I have also asked this question here [http://stackoverflow.com/questions/4329087/design-pattern-data-loading-solution]

    rackham wrote:
    Does anyone have any suggestions on what I'm doing wrong / could be doing better?
    That depends on the specifics.
    The fact that two processes seem similar doesn't mean that they are in fact the same. My code editor and Word both seem to be basically the same, but I am rather sure that generalizing between the two would be a big mistake.
    And I speak from experience (parsing customer data and attempting to generalize the process).
    The problem with attempting to generalize is that you may generalize functionality that is not in fact the same. You then end up with conditional logic all over the place to deal with differences dependent on the users; rather than saving time, that actually costs time, because the code becomes more fragile.
    That doesn't mean it isn't possible, just that you should ensure it is in fact common behavior before implementing anything.
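
    For what it's worth, the generic design rackham describes does hang together in Java. Here is a minimal sketch using only the names from the post; everything beyond those names is an assumption, not a prescription:

    import java.io.IOException;

    // Produces raw text from somewhere: a local file, a URL, etc.
    interface Loader {
        String loadData() throws IOException;
    }

    // Turns raw text into a typed result.
    interface Parser<T> {
        T parse(String raw);
    }

    // Pairs a Loader with a Parser<T>; each concrete operation just picks the two.
    abstract class LoadingOperation<T> {
        private final Loader loader;
        private final Parser<T> parser;

        protected LoadingOperation(Loader loader, Parser<T> parser) {
            this.loader = loader;
            this.parser = parser;
        }

        public T run() throws IOException {
            return parser.parse(loader.loadData());
        }
    }

    // Example specialisation, as in the post:
    // class SpecificLoader extends LoadingOperation<CustomDataType> { ... }

    The type parameter is doing real work here: it lets each operation return its own data type while callers still share one LoadingOperation API. Whether the Loader/Parser split holds up in practice is exactly the "depends on the specifics" point above.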

  • How to best reduce data load on a Mac due to duplicate Adobe files?

    I just got hired at a small business. I don't have a lot of experience with Macs, so I need to know some best practices here.
    I am working with CS3: Ai, Ps, Id, and later, Dw.
    It's a magazine publishing company. I am organizing it so each magazine has its own folder, and I want to have an "old editions" folder and a "working edition" folder. Within each, I want to break it down into "Ads this issue", "Links", and "Stories".
    The Ads and Links are where I'm concerned. I want to have a copy of each ad's file within that folder, and a copy of all the other files it's linked to, so that if the original ads/images get moved, the links won't be disturbed.
    I'm wondering if there is a way to do this without bogging down the machine's HD with duplicates of really large files. The machine moves slowly enough as it is.
    I've theorized that I could:
    A) Keep the main "Ads" folder along with the subfolders compressed, and the "old editions" compressed, and have a regular copy in the working folder only. This also works because the ads sometimes get edited for different editions.
    or
    B) Is there a way to do this with aliases? Being unfamiliar with aliases, or even shortcuts, because I haven't worked in an actual production environment yet, I don't know the functionality of linking an alias into an ID file. I read a couple of previous posts and the outlook isn't very good for it.
    or
    C) Just place a PDF (or whatever you think is the best quality-preserving file type) in with the magazine itself? Then each company could have its own ad folder with all the rest of the files...
    What do you all think? If you can link me to a post that goes into further detail on whichever option you think is best, or if you have a different solution, that would be wonderful. I am open to answers.
    I want to be sure to leave a cleaner computer/work environment than the last few punks who were here... That's my "best practice". Documentation and file organization got drilled into me at uni.

    Sorry, I am overcaffeinated today, so this response is kind of long.
    "Data load?" Do you mean that:
    a) handling lots of large files is too much for your computer to handle, or
    b) simply having lots of large files on your hard drive (even if they are not currently in use) slows your computer down?
    Because b) is pretty much impossible, unless you are almost out of space on your system drive. Which can be ameliorated by... buying another drive.
    I once set up an install of InDesign on a Mac for a friend of mine who is chipping away at a big-data math PhD and who is sick to death of LaTeX. (Can't blame her, really.) Because we are both BSD nerds from way back, she wanted to do what you are suggesting - but instead of thinking about aliases, which you are correct to regard with dubiousness, she wanted to do it with hardlinks. Which worked, more or less. She liked it. Seemed like overkill to me.
    I suspect that this is because she is a highfalutin' academic whereas I am a production wonk in a business. I have to compare the cost of my time resolving a broken-link issue due to a complicated archiving scheme versus Just Buying Another Drive. Having clocked myself on solving problems induced by complicated archival schemes (or failure of overworked project managers to correctly follow the rules for same), I know that it doesn't take many hours of my work invested in combing through archives or rebuilding lost image files to equal Another Drive that I can go out and Just Buy.
    If you set up a reasonable method of file organization, and document it clearly, then you have already saved your organization (and your successors!) significant amounts of time and cash. Hard drive space is cheap. Don't spend your time on figuring out a way to save a few terabytes here and there. In fact, what I'd suggest is to try to figure out how many terabytes you've already spent on this question: work out today's ratio of easily purchasable, reliable external hard drives to your unit of preferred currency, then figure out how many hours you've already spent on the question.
    The only reason I can make this argument is that the price per unit of magnetic data storage has, with remarkably few exceptions, been constantly plummeting for decades, while the space requirements for documentation have been going up comparatively slowly. If you need a faster computer to do your job more efficiently, then price out an SSD for your OS, applications and jobs-on-deck, and then show your higher-ups the math that proves that the SSD pays for itself in your saved time within n weeks. My gut feeling these days is that, unless you are seriously underpaid, n is between two and six.
    Finally: I didn't really address your suggested possibilities. Procedure C (placing PDFs) usually works, but you do need to figure out how to make PDFs in such a way as to ensure they play nicely with your print method. Procedure A (compress stuff you don't need anymore) probably works okay, but I hope that you have some sort of command-line scripting ability to be able to quickly route stuff into and out of archives.
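
    On procedure A specifically: archiving doesn't have to mean hand-compressing folders in the Finder; any small script will do. As a minimal sketch (the paths are made up, and the language is incidental), something like this zips an "old editions" folder:

    import java.io.IOException;
    import java.io.OutputStream;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.nio.file.Paths;
    import java.util.List;
    import java.util.stream.Stream;
    import java.util.zip.ZipEntry;
    import java.util.zip.ZipOutputStream;

    public class ArchiveOldEditions {
        public static void main(String[] args) throws IOException {
            Path src = Paths.get("Magazines/SomeTitle/old editions");     // made-up path
            Path zip = Paths.get("Magazines/SomeTitle/old-editions.zip"); // made-up path
            try (Stream<Path> walk = Files.walk(src);
                 OutputStream out = Files.newOutputStream(zip);
                 ZipOutputStream zos = new ZipOutputStream(out)) {
                List<Path> files = walk.filter(Files::isRegularFile).toList();
                for (Path p : files) {
                    // store each entry relative to the folder being archived
                    zos.putNextEntry(new ZipEntry(src.relativize(p).toString()));
                    Files.copy(p, zos);
                    zos.closeEntry();
                }
            }
        }
    }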

  • Cost Data Sets

    What is the best way to upload the Cost Data Sets into P4P?
    If we choose to use a SQL Load directly, which tables in the database contain the cost information?

    PLM for Process provides a web service that does this: the Cost Services web service. If you are doing this through the database directly, though, the tables it populates are:
    - costmessages - used to group cost items into one load
    - costitems - the individual spec cost, by cost type, currency, SCRM facility, etc.
    Please see the Web Services guide for details about the data expected in these tables. Note, however, that you will also have to provide a Sequence Number, which should just be the next highest number (see the sketch below).
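
    If you do go the direct-SQL route, the flow is roughly: insert one costmessages row to group the load, then one costitems row per spec cost, using the next highest sequence number. Here is a minimal JDBC sketch of that flow; only the two table names and the sequence-number rule come from the reply above, and every column name and value is a placeholder assumption to be mapped to your actual schema:

    import java.math.BigDecimal;
    import java.sql.Connection;
    import java.sql.Date;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;
    import java.sql.SQLException;
    import java.sql.Statement;

    public class CostLoadSketch {
        public static void main(String[] args) throws SQLException {
            // connection details elided
            try (Connection con = DriverManager.getConnection("jdbc:...", "user", "pass")) {
                con.setAutoCommit(false);

                // Next highest sequence number, as described above
                // (assumed column name: sequencenumber).
                long seq;
                try (Statement st = con.createStatement();
                     ResultSet rs = st.executeQuery(
                         "SELECT COALESCE(MAX(sequencenumber), 0) + 1 FROM costmessages")) {
                    rs.next();
                    seq = rs.getLong(1);
                }

                // One costmessages row groups the cost items of this load.
                try (PreparedStatement ps = con.prepareStatement(
                        "INSERT INTO costmessages (sequencenumber) VALUES (?)")) {
                    ps.setLong(1, seq);
                    ps.executeUpdate();
                }

                // One costitems row per spec cost (placeholder columns and values).
                try (PreparedStatement ps = con.prepareStatement(
                        "INSERT INTO costitems (sequencenumber, equivalent, costtype, "
                        + "facility, effectivedate, uom, cost, currency) "
                        + "VALUES (?, ?, ?, ?, ?, ?, ?, ?)")) {
                    ps.setLong(1, seq);
                    ps.setString(2, "EQUIV-001");
                    ps.setString(3, "STANDARD");
                    ps.setString(4, "PLANT-01");
                    ps.setDate(5, Date.valueOf("2012-01-01"));
                    ps.setString(6, "EA");
                    ps.setBigDecimal(7, new BigDecimal("1.25"));
                    ps.setString(8, "USD");
                    ps.executeUpdate();
                }
                con.commit();
            }
        }
    }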

  • 0CO_PC_PCP_30 - Data Load

    Hi,
    We are using the 0CO_PC_PCP_30 datasource to get itemization data. I was able to load one plant/material combination to BW.
    What is the general selection condition that we use to pull the data from R/3 to BI? We need data for all plants in the BI system.
    When we run for a plant, costing variant and costing key date, it runs for a long time.
    Please share your experiences on using this datasource.
    Thanks,
    Arun

    Hi,
    We have that note already installed in the system. The problem is that when I run for a range of plants with costing date and costing variant, the job gets terminated in the source system.
    Costing item 00001 is not assigned to a cost component
    Job cancelled after system exception ERROR_MESSAGE
    Thanks,
    Arun

  • Costing data in Accounting Tab

    Hi,
    When I create a task and have a role assignment, it calculates the costing based on the rates defined, but as soon as I assign a resource to the role the cost data goes away.
    Can anyone help me understand this costing process? It works fine at the task level and role level as long as resources are not assigned to them.
    We need to assign resources in order to use CATS for time booking. It seems I'm missing something here.
    Please help.
    Thx/DS

    Hello DS,
    As I mentioned earlier, there are two ways of doing accounting in cProjects: task based and role based.
    For task-based accounting you need to define a cost/revenue rate for the task type in customizing. When that task type is used while defining a task, you will get this default cost/revenue rate on the Additional Data tab of the task. Also, at the project header level, on the Additional Data tab, you will need to define the Org. Unit.
    There are, in turn, two ways to do role-based accounting.
    a) Without staffing: in customizing, you define a cost/revenue rate for the particular role type in Define Project Role Type. So when you create a project role using that project role type, you get this default cost/revenue rate on the Costing tab page.
    b) With staffing: in Define Project Role Type you need not define a cost/revenue rate; instead you define the same for a resource in transaction BP. So whenever a project role is staffed with the resource, accounting is done.
    For both scenarios you need to define the Org. Unit as mentioned earlier.
    Hope this clarifies your doubt.
    Regards,
    Niraj

  • Data load problem - BW and Source System on the same AS

    Hi experts,
    I’m starting with BW (7.0) in a sandbox environment where BW and the source system are installed on the same server (same AS). The source system is the SRM (Supplier Relationship Management) 5.0.
    BW is working on client 001 while SRM is on client 100 and I want to load data from the SRM into BW.
    I’ve configured the RFC connections and the BWREMOTE users with their corresponding profiles in both clients, added an SAP source system (named SRMCLNT100), installed SRM Business Content, replicated the datasources from this source system, and everything worked fine.
    Now I want to load data from SRM (client 100) into BW (client 001) using standard datasources and extractors. To do this, I’ve created an InfoPackage on one standard metadata datasource (with data, checked through RSA3 on client 100, the source system). I’ve started the data load process, but the monitor says that “no IDocs arrived from the source system” and keeps the status yellow forever.
    Additional information:
    BW Monitor Status:
    Request still running
    Diagnosis
    No errors could be found. The current process has probably not finished yet.
    System Response
    The ALE inbox of the SAP BW is identical to the ALE outbox of the source system
    and/or
    the maximum wait time for this request has not yet run out
    and/or
    the batch job in the source system has not yet ended.
    Current status
    No Idocs arrived from the source system.
    BW Monitor Details:
    0 from 0 records
    – but there are 2 records on RSA3 for this data source
    Overall status: Missing messages or warnings
    -     Requests (messages): Everything OK
    o     Data request arranged
    o     Confirmed with: OK
    -     Extraction (messages): Missing messages
    o     Missing message: Request received
    o     Missing message: Number of sent records
    o     Missing message: Selection completed
    -     Transfer (IDocs and TRFC): Missing messages or warnings
    o     Request IDoc: sent, not arrived ; Data passed to port OK
    -     Processing (data packet): No data
    Transactional RFC (SM58):
    Function Module: IDOC_INBOUND_ASYNCHRONOUS
    Target System: SRMCLNT100
    Date Time: 08.03.2006 14:55:56
    Status text: No service for system SAPSRM, client 001 in Integration Directory
    Transaction ID: C8C415C718DC440F1AAC064E
    Host: srm
    Program: SAPMSSY1
    Client: 001
    Rpts: 0000
    System Log (SM21):
    14:55:56 DIA  000 100 BWREMOTE  D0  1 Transaction Canceled IDOC_ADAPTER 601 ( SAPSRM 001 )
    Documentation for system log message D0 1 :
    The transaction has been terminated.  This may be caused by a termination message from the application (MESSAGE Axxx) or by an error detected by the SAP System due to which it makes no sense to proceed with the transaction.  The actual reason for the termination is indicated by the T100 message and the parameters.
    Additional documentation for message IDOC_ADAPTER 601: "No service for system &1, client &2 in Integration Directory". No documentation exists for message ID 601.
    RFC Destinations (SM59):
    Both RFC destinations look fine, with connection and authorization tests successful.
    RFC Users (SU01):
    BW: BWREMOTE with profile S_BI-WHM_RFC (plus SAP_ALL and SAP_NEW temporarily)
    Source System: BWREMOTE with profile S_BI-WX_RFCA (plus SAP_ALL and SAP_NEW temporarily)
    Could someone help?
    Thanks,
    Guilherme

    Guilherme
    I don't see any reason why it's not bringing the data in. Are you doing a full extraction or a delta? If delta extraction, please check whether the extractor is delta enabled; sometimes this can cause problems.
    Also check this weblog on data load errors and basic checks; it may help:
    /people/siegfried.szameitat/blog/2005/07/28/data-load-errors--basic-checks
    Thanks
    Sat

  • BI 7.0 data load issue: InfoPackage can only load data to PSA?

    BI 7.0 backend extraction gurus,
    We created a generic datasource on R/3 and replicated it to our BI system, created an InfoSource, the transformation from the datasource to the InfoSource, an ODS, and the transformation from the InfoSource to the ODS.
    After the transformation between the InfoSource and the ODS is created on this BI system, a new folder called "Data Transfer Process" is also created under this ODS in the InfoProvider view. In the Data Transfer Process, on the Extraction tab, we pick 'Full' in the Extraction Mode field; on the Execute tab there is an 'Execute' button. Clicking this button (note: so far we have not created an InfoPackage yet) appears to conduct the data load, but we find there is no data available even though all the statuses show green (we do have a couple of records in the R/3 table).
    Then we tried to create an InfoPackage. On the Processing tab, we find the 'Only PSA' radio button is checked and all others, like 'PSA and then into Data Targets (Package by Package)', are dimmed! On the Data Target tab, the ODS can't be selected as a target! Also there are some new columns in this tab: 'Maintain Old Update Rule' is marked with a red 'X', and under another column, 'DTP(S) are active and load to this target', there is an inactive picture icon; that's weird, since we have already activated the Data Transfer Process! Anyway, we started the data load in the InfoPackage, and the monitor shows the records are brought in; but since 'Only PSA' is checked on the Processing tab with all others dimmed, no data goes to this ODS! Why, in BI 7.0, can the 'Only PSA' radio button be checked with all others dimmed?
    Many new features in BI 7.0! Anyone's idea/experience on how to load data in BI 7.0 is greatly appreciated!

    You don't have to select anything.
    Once data is loaded to the PSA, in the DTP you have the option of FULL or DELTA: full loads all the data from the PSA, and delta loads only the last load of the PSA.
    Go through the links for lucid explanations:
    Infopackage -
    http://help.sap.com/saphelp_nw2004s/helpdata/en/43/03808225cf5167e10000000a1553f6/content.htm
    DTP
    http://help.sap.com/saphelp_nw2004s/helpdata/en/42/f98e07cc483255e10000000a1553f7/frameset.htm
    Creating DTP
    http://help.sap.com/saphelp_nw2004s/helpdata/en/42/fa50e40f501a77e10000000a422035/content.htm
    Pre-requisite:
    You have used transformations to define the data flow between the source and target object.
    Creating transformations-
    http://help.sap.com/saphelp_nw2004s/helpdata/en/f8/7913426e48db2ce10000000a1550b0/content.htm
    Hope it Helps
    Chetan
    @CP..

  • Open Hub (SAP BW) to SAP HANA through DB Connection data loading: "Delete data from table" option is not working. Please help, anyone on this forum

    Issue:
    I have an SAP BW system and an SAP HANA system.
    SAP BW connects to SAP HANA through a DB Connection (named HANA).
    Whenever I create an Open Hub destination as a DB table with the help of the DB Connection, the table is created at the HANA schema level (L_F50800_D).
    I executed the Open Hub service without checking the 'Deleting Data from Table' option.
    Data loaded with 16 records from BW to HANA, the same on both sides.
    The second time it was executed from BW to HANA, 32 records came over (it is appending).
    Then I executed the Open Hub service with the 'Deleting Data from Table' option checked.
    Now I am getting the short dump DBIF_RSQL_TABLE_KNOWN.
    If checking from SAP BW system to SAP BW system, it works fine.
    Does this option work through a DB Connection or not?
    Please see the attachment along with this discussion and help me resolve this.
    From
    Santhosh Kumar

    Hi Ramanjaneyulu,
    First of all, thanks for the reply.
    The issue here is at the OH level (definition level: DESTINATION tab and FIELD DEFINITION).
    There is a check box there which I have already selected; that is exactly my issue: even though it is selected, the deletion is not performed at the target level.
    SAP BW to SAP HANA via DB connection:
    1. First time from BW, say 16 records - DTP executed - loaded up to HANA - 16, the same.
    2. Second time executed from BW again - now the HANA side appended, i.e. 16+16 = 32.
    3. So I selected the check box at the OH level, 'Deleting data from table'.
    4. Now executing the DTP throws a short dump - DBIF_RSQL_TABLE_KNOWN.
    Now please tell me how to resolve this. Does this option ('deleting data from table') apply to HANA?
    Thanks
    Santhosh Kumar

  • Data load times

    Hi,
    I have a question regarding data loads. We have a process chain which includes 3 ODSs and a cube.
    Basically, ODS A gets loaded from R/3, and then from ODS A it loads into 2 other ODSs (ODS B, ODS C) and CUBE A.
    When I went to the monitor screen of this load (ODS A -> ODS B, ODS C, CUBE A), the total time shows as 24 minutes.
    We have some other steps in the process chain where ODS B -> ODS C and ODS C -> CUBE 1.
    When I go to the monitor screen of these data loads, the total time for the data loads shows as 40 minutes.
    I am surprised, because the total run time for the chain itself is 40 minutes, and the chain includes data extraction from R/3 and ODS activations, indexes, ...
    Can anybody throw me some light?
    Thank you all
    Edited by: amrutha pal on Sep 30, 2008 4:23 PM

    Hi All,
    I am not asking which steps need to be included in which chain.
    My question is: the process chain run time says the total time is 40 minutes, and when I go to RSMO to check the time taken for the data load from one ODS to 3 other data targets, it also shows 40 minutes.
    The process chain also includes ODS activation, building indexes, and extracting data from R/3.
    So what are the times we see when we click on a step in the process chain and display messages, and what is the time we see in RSMO?
    Let's take an example:
    In process chain A there is a step LOAD DATA, from ODS A -> ODS B, ODS C, Cube A.
    When I right-click to display the messages for the successful load, it shows all the messages like:
    Job started...
    Job ended...
    The total time here shows as 15 minutes.
    When I go to RSMO for the same step, it shows 30 minutes...
    I am confused...
    Please help me!

  • Master Data loading got failed: error "Update mode R is not supported by the extraction API"

    Hello Experts,
    I load master data for 0Customer_Attr through a daily process chain, and it was running successfully.
    For the last 2 days the master data load for 0Customer_Attr has failed with the following error message:
    "Update mode R is not supported by the extraction API"
    Can anyone tell me what that error means and how to resolve this issue?
    Regards,
    Nirav

    Hi
    The update mode R error comes up in the following case:
    You are running a delta (for master data) which fails due to some error. To resolve that error, you set the load to red and try to repeat the load.
    This time the load fails with update mode R, because a repeat delta is not supported.
    So now the only thing you can do is re-init the delta (as told in the above posts) and then proceed. The earlier problem has nothing to do with update mode R.
    For example, say your first delta failed with a replication issue. Replicating and repeating alone will not solve the update mode R error; you will have to do both: replicate the datasource and re-init to clear the update mode R.
    One more thing I would like to add:
    if the delta that failed the first time (not with update mode R) picked up records, you have to do an init with data transfer;
    if it failed without picking any records, then do an init without data transfer.
    Hope this helps
    Regards
    Shilpa
    Edited by: Shilpa Vinayak on Oct 14, 2008 12:48 PM

  • CALL_FUNCTION_CONFLICT_TYPE Standard Data loading

    Hi,
    I am facing a data loading problem using Business Content on the CPS_DATE InfoCube (0PS_DAT_MLS datasource).
    The R/3 extraction processes without any error, but the problem occurs in the update rules while updating the milestone date. Please find below the log from ST22.
    The really weird thing is that the process works perfectly in the development environment but not in the integration one (the patch levels are exactly the same: BW 3.5 Patch #16).
    I apologise for the long message below... this is a part of the system log.
    For information, routine_0004 is a standard one.
    Thanks a lot in advance!
    Cheers.
    CALL_FUNCTION_CONFLICT_TYPE
    Except.: CX_SY_DYN_CALL_ILLEGAL_TYPE
    Symptoms: Type conflict when calling a function module
    Causes: Error in ABAP application program.
    The current ABAP program "GP420EQ35FHFOCVEBCR6RWPVQBR" had to be terminated because one of the statements could not be executed.
    This is probably due to an error in the ABAP program.
    A function module was called incorrectly.
    Error analysis
    An exception occurred. This exception is dealt with in more detail below. The exception, which is assigned to the class 'CX_SY_DYN_CALL_ILLEGAL_TYPE', was neither caught nor passed along using a RAISING clause, in the procedure "ROUTINE_0004" "(FORM)".
    Since the caller of the procedure could not have expected this exception to occur, the running program was terminated. The reason for the exception is:
    The call to the function module "RS_BCT_TIMCONV_PS_CONV" is incorrect:
    The function module interface allows you to specify only fields of a particular type under "E_FISCPER".
    The field "RESULT" specified here is a different field type.
    How to correct the error:
    You may be able to find an interim solution to the problem in the SAP note system. If you have access to the note system yourself, use the following search criteria:
    "CALL_FUNCTION_CONFLICT_TYPE" CX_SY_DYN_CALL_ILLEGAL_TYPEC
    "GP420EQ35FHFOCVEBCR6RWPVQBR" or "GP420EQ35FHFOCVEBCR6RWPVQBR"
    "ROUTINE_0004"
    If you cannot solve the problem yourself and you wish to send an error message to SAP, include the following documents:
    1. A printout of the problem description (short dump). To obtain this, select in the current display "System->List->Save->Local File (unconverted)".
    2. A suitable printout of the system log. To obtain this, call the system log through transaction SM21. Limit the time interval to 10 minutes before and 5 minutes after the short dump. In the display, then select the function "System->List->Save->Local File (unconverted)".
    3. If the programs are your own programs or modified SAP programs, supply the source code. To do this, select the Editor function "Further Utilities->Upload/Download->Download".
    4. Details regarding the conditions under which the error occurred or which actions and input led to the error.
    The exception must either be prevented, caught within the procedure "ROUTINE_0004" "(FORM)", or declared in the procedure's RAISING clause.
    To prevent the exception, note the following:
    Environment:
    SAP Release.............. "640"
    Operating system......... "SunOS"
    Release.................. "5.9"
    Hardware type............ "sun4u"
    Character length......... 8 bits
    Pointer length........... 64 bits
    Work process number...... 2
    Short dump setting....... "full"
    Database type............ "ORACLE"
    Database name............ "BWI"
    Database owner........... "SAPTB1"
    Character set............ "fr"
    SAP kernel............... "640"
    Created on............... "Jan 15 2006 21:42:36"
    Created in............... "SunOS 5.8 Generic_108528-16 sun4u"
    Database version......... "OCI_920"
    Patch level.............. "109"
    Patch text............... " "
    Supported environment....
    Database................. "ORACLE 9.2.0.., ORACLE 10.1.0.., ORACLE 10.2.0.."
    SAP database version..... "640"
    Operating system......... "SunOS 5.8, SunOS 5.9, SunOS 5.10"
    SAP Release.............. "640"
    The termination occurred in the ABAP program "GP420EQ35FHFOCVEBCR6RWPVQBR" in "ROUTINE_0004". The main program was "RSMO1_RSM2".
    The termination occurred in line 702 of the source code of the (Include) program "GP420EQ35FHFOCVEBCR6RWPVQBR" (when calling the editor 7020).
    Processing was terminated because the exception "CX_SY_DYN_CALL_ILLEGAL_TYPE" occurred in the procedure "ROUTINE_0004" "(FORM)" but was neither handled locally nor declared in the RAISING clause of the procedure.
    The procedure is in the program "GP420EQ35FHFOCVEBCR6RWPVQBR". Its source code starts in line 685 of the (Include) program "GP420EQ35FHFOCVEBCR6RWPVQBR".
    672      'ROUTINE_0003' g_s_is-recno 
    673      rs_c_false rs_c_false g_s_is-recno  
    674      changing c_abort.   
    675     catch cx_foev_error_in_function. 
    676     perform error_message using 'RSAU' 'E' '510'  
    677             'ROUTINE_0003' g_s_is-recno
    678             rs_c_false rs_c_false g_s_is-recno
    679             changing c_abort.
    680   endtry.              
    681 endform.
    682 ************************************************************************ 
    683 * routine no.: 0004
    684 ************************************************************************ 
    685 form routine_0004 
    686   changing 
    687   result  type g_s_hashed_cube-FISCPER3
    688   returncode     like sy-subrc 
    689     c_t_idocstate  type rsarr_t_idocstate
    690     c_subrc        like sy-subrc 
    691     c_abort        like sy-subrc. "#EC *  
    692   data:
    693     l_t_rsmondata like rsmonview occurs 0 with header line. "#EC * 
    694                    
    695  try.             
    696 * init variables
    697   move-corresponding g_s_is to comm_structure.
    698                     
    699 * fill the internal table "MONITOR", to make monitor entries  
    700                     
    701 * result value of the routine
    >>>>    CALL FUNCTION 'RS_BCT_TIMCONV_PS_CONV'  
    703          EXPORTING      
    704               I_TIMNM_FROM       = '0CALDAY'  
    705               I_TIMNM_TO         = '0FISCPER'  
    706               I_TIMVL            = COMM_STRUCTURE-CALDAY
    707               I_FISCVARNT        = gd_fiscvarnt
    708          IMPORTING 
    709               E_FISCPER          = RESULT.                               
    710 * if the returncode is not equal zero, the result will not be updated 
    711   RETURNCODE = 0. 
    712 * if abort is not equal zero, the update process will be canceled
    713   ABORT = 0.
    714              
    715   catch cx_sy_conversion_error   
    716         cx_sy_arithmetic_error.
    717     perform error_message using 'RSAU' 'E' '507'
    718             'ROUTINE_0004' g_s_is-recno
    719             rs_c_false rs_c_false g_s_is-recno
    720             changing c_abort.
    721   catch cx_foev_error_in_function.
    System zones content
    Name                Val.                                                                               
    SY-SUBRC           0                                         
    SY-INDEX           2                                         
    SY-TABIX           0                                         
    SY-DBCNT           0                                         
    SY-FDPOS           65                                        
    SY-LSIND           0                                         
    SY-PAGNO           0                                         
    SY-LINNO           1                                         
    SY-COLNO           1                                         
    SY-PFKEY           0400                                      
    SY-UCOMM           OK                                        
    SY-TITLE           Moniteur - Atelier d'administration       
    SY-MSGTY           E                                         
    SY-MSGID           RSAU                                      
    SY-MSGNO           583                                       
    SY-MSGV1           BATVC  0000000000                         
    SY-MSGV2           0PROJECT                                  
    SY-MSGV3                                           
    SY-MSGV4                                           
    Selected variables
    No. 23, type FORM, name ROUTINE_0004
    GD_FISCVARNT (value dump: 22 / 00)
    RS_C_INFO = I (value dump: 4 / 9)
    COMM_STRUCTURE-CALDAY = 20060303
    SYST-REPID = GP420EQ35FHFOCVEBCR6RWPVQBR
    RESULT = 000

    You have an update routine in which you are calling the FM 'RS_BCT_TIMCONV_PS_CONV'. The parameter E_FISCPER can only be passed a field of the type given in the FM definition (you can see the data type in transaction SE37), so the receiving variable must have that same type. You should do something like the following:
    DATA: var TYPE <the same type as E_FISCPER in the FM definition>.
    CALL FUNCTION 'RS_BCT_TIMCONV_PS_CONV'
      EXPORTING
        I_TIMNM_FROM = '0CALDAY'
        I_TIMNM_TO   = '0FISCPER'
        I_TIMVL      = COMM_STRUCTURE-CALDAY
        I_FISCVARNT  = GD_FISCVARNT
      IMPORTING
        E_FISCPER    = var.
    result = var.
    Assigning points is useful.

  • Data load stuck from DSO to Master data Infoobject

    Hello Experts,
    We have an issue where a data load is stuck between a DSO and a master data InfoObject.
    Data uploads from the DSO (standard) to the master data InfoObject.
    This InfoObject has display and navigation attributes which are mapped from the DSO to the InfoObject.
    Now we have added a new InfoObject as an attribute to the master data InfoObject and made it a navigation attribute.
    Now, when we do a full load via DTP, the load is stuck and is not processing.
    Earlier it took only 5 minutes to complete the full load.
    Please advise what could be the reason and cause behind this.
    Regards,
    santhosh.

    Hello guys,
    Thanks for the quick response.
    But it's not proceeding further.
    The request is still running.
    Earlier this same data loaded in 5 minutes.
    Please find the screen shot.
    Master data for the InfoObjects is loaded as well.
    I can see in SM50 that the process is at the P table of the InfoObject.
    Please advise.
    Please find the details:
    Updating attributes for InfoObject YCVGUID
    Start of Master Data Update
    Check Duplicate Key Values
    Check Data Values
    Process Time-Dependent Attributes - green
    No Message: Process Time-Dependent Attributes - yellow
    No Message: Generates Navigation Data - yellow
    No Message: Update Master Data Attributes - yellow
    No Message: End of Master Data Update - yellow
    and nothing is going further in SM37
    Thanks,
    Santhosh.
