Flexible upload log shows no values

Hi All,
We are uploading transaction data through the flexible upload method.
After the update, the log shows no values under Local Currency and Quantity.
Please let me know why the system is behaving like this, and how to resolve it.
Regards
SVNR

Hi Dan,
Thanks for your immediate reply.
Key figures for LC and Quantity are already mapped and included in the flexible upload method, and Quantity is used in the breakdown categories.
But the log still shows no values for the two consolidation units in question. All other cons units show values.

Similar Messages

  • Flexible upload of profitcenter nodes from CSV is restructured after succes

    Hi;
    I'm having a problem using flexible upload to load profit center nodes into an existing structure. The result is that inside the profit center hierarchy a node 1000 is created, under which the nodes specified in the CSV file are placed. The hierarchy structure is not maintained: the newly created nodes are nested under each other.
    Since a hierarchy with exactly the same structure could be created as a profit center group, it is not understood why this is not possible for the profit center. It also demonstrates that the flexible upload function is understood and configured correctly, and that the file is set up properly.
    This website was used: http://help.sap.com/saphelp_sem60/helpdata/en/62/f7e73ac6e7ec28e10000000a114084/frameset.htm
    I'm looking to understand why the nodes are created differently for a profit center group than for a profit center. It is suggested that the problem lies in the fact that the profit center hierarchy was created under a specific configuration that does not allow me to add nodes to it.
    This is the structure:
    H
    --N1
       --N2
          -- The node that is added here and one level below is placed under "N3 1000"
    --N3 1000
       --The node is added here
       --The child is added as a sibling
    It was noticed that N3 is labeled 1000, which is the controlling area. The log shows that the CO area was automatically populated with 1000, despite this not having been done in the CSV file. Therefore it is assumed that there is a relationship between the nodes being added under the 1000 node and the controlling area.
    Any suggestions are welcome. If more detail about the problem is required, please say so.

    Hi,
    IMHO, the culprit is the CO area set as an external attribute in the hierarchies of PC and PCG.
    AFAIR, the CO area might/must be a linked attribute of the PC (see the last tabstrip in the PC InfoObject screen), and that's why it should be fixed in the ConsArea settings (and populated automatically during the data load).
    If I were you I would try the following:
    - ask the basis guys make a backup copy of the system
    - delete CO Area from the external char of PC & PCG hierarchies
    - set CO Area as a linked attribute of PC & PCG (if it is really needed - very often it's not, if the CO area has only one value, which is fixed in the ConsArea settings). NB: these changes are very significant for BW, and it's not always possible to make them without data deletion.
    - regenerate the BCS data basis (and probably the ConsArea).
    The 1st step is needed because SEM-BCS very often does not regenerate the data basis properly (it simply doesn't see the changes). In forum topics I have several times explained how to force the system to see these changes (just drag and drop any role in the definition of the BCS data basis and then save it).
    This (a significant change in the underlying properties of BW InfoObjects) is the main pain point when changing BW for SEM-BCS. In my practice, unfortunately, it took no fewer than several attempts (even with dumps). And unfortunately it may require full data deletion in the cubes that play a role in the data basis. It might be the main charge against the bad design of the BW structures for SEM-BCS.
    Hopefully, you'll avoid it.
    Good luck!
    Linked attribute = compound attribute.
    Edited by: Eugene Khusainov on Jan 31, 2011 4:25 AM

  • Short Dump - GETWA_NOT_ASSIGNED - Flexible upload of RFD

    Hi,
    SEM 6.0 / BI 7.0
    When I try to do a flexible upload for a sample RFD, I get the following dump:
    Runtime Errors         GETWA_NOT_ASSIGNED
    Date and Time          11/25/2008 16:22:01
    Short dump has not been completely stored (too big)
    Short text
        Field symbol has not yet been assigned.
    What happened?
        Error in the ABAP Application Program
        The current ABAP program "CL_UC_TASK_EXECUTION==========CP" had to be
         terminated because it has
        come across a statement that unfortunately cannot be executed.
    Error analysis
        You attempted to access an unassigned field symbol
        (data segment 32776).
        This error may occur if
        - You address a typed field symbol before it has been set with
          ASSIGN
        - You address a field symbol that pointed to the line of an
          internal table that was deleted
        - You address a field symbol that was previously reset using
          UNASSIGN or that pointed to a local field that no
          longer exists
        - You address a global function interface, although the
          respective function module is not active - that is, is
          not in the list of active calls. The list of active calls
          can be taken from this short dump.
    Trigger Location of Runtime Error
        Program                               CL_UC_TASK_EXECUTION==========CP
        Include                                 CL_UC_TASK_EXECUTION==========CM02V
        Row                                     155
        Module type                          (METHOD)
        Module Name                        COMPARE_OLD_AND_NEW_DOCS
    Flexible upload - Delete all / Cumulative
    Has anyone encountered this type of dump before?
    My Observation
    (Data Rows contain the following columns - Item / Company / Trading partner / PV LC)
    When I give a trading partner which is not defined in the system, the system interprets the header row & data rows, and gives an error message for each row.
    But,
    when I give a trading partner which is defined in the system, or when I remove the trading partner column entirely, it throws the above short dump.
    The breakdown categories defined in the system are Trading partner & Movement type.
    Both have breakdown type 1 (optional breakdown, initialized value allowed).
    Please note that after several permutations & combinations to trace the error, I have removed all the columns & kept the minimum required (Item / Company / Trading partner / PV LC).
    Appreciate your comments / inputs.
    Thanks!
    Kumar

    Hi Kumar,
    The note refers to the FINBASIS 300 release.
    The system we have at present is on FINBASIS 600.
    I raised an OSS message.
    Anyway, implementing the note helped me on all releases.
    But first of all, I mentioned another remedy that you may see here:
    SPRO: SEM/Business Analytics -> Fin. Basis -> Master Data Framework -> System Settings -> Profile Parameter Setting.
    It's about setting the parameter for ABAP shared memory on the server. If you do not set this parameter to 200-300 MB, you will constantly get errors like the ones I mentioned while trying to save master data. Did you read my previous message carefully and look at this hint?
    In case of breakdown type 1, the upload should happen even if we don't specify any values for the breakdown category. Pls confirm.
    - Confirmed. The system will accept any value, including null.
    Is it true that we cannot load RFD data (with both movement type & trading partner information) in one go? If yes, what is the reason behind it?
    - Not confirmed. Where is that from? I always do such a load.
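    To make the behaviour of breakdown type 1 (optional breakdown, initialized value allowed) concrete, here is a small sketch. This is illustrative pseudologic only, not SAP code, and the trading partner values are made up:

```javascript
// Illustrative sketch (not SAP code) of breakdown type 1 behaviour:
// optional breakdown, initialized (empty) value allowed.
function validateBreakdown(breakdownType, value, validValues) {
    if (breakdownType === 1) {
        // Optional: an empty (initialized) value passes; otherwise the
        // value must exist in master data.
        return value === '' || validValues.indexOf(value) !== -1;
    }
    // Other breakdown types (e.g. required breakdown) would reject empty
    // values; simplified away here.
    return validValues.indexOf(value) !== -1;
}

var tradingPartners = ['TP01', 'TP02'];
console.log(validateBreakdown(1, '', tradingPartners));     // true
console.log(validateBreakdown(1, 'TP01', tradingPartners)); // true
console.log(validateBreakdown(1, 'XXXX', tradingPartners)); // false
```

    In other words, with breakdown type 1 a missing trading partner is accepted, which matches the confirmation above.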

  • Flexible Upload of Data for several periods/years

    Hey Colleagues,
    my client wants me to develop new functionality to upload data for several periods/years. The function they are currently using, and using correctly, is the flexible upload.
    As we have already developed several custom tasks and enhancements/modifications, we try not to modify SAP standard any further or to post directly (back-door solution).
    But:
    It is no problem to start the flexible upload for different selections and to provide the values in an Excel sheet. But I have problems avoiding the popup for selecting the Excel file. It would be no problem if, for example, batch input worked here, or if we posted directly using standard functionality (posting method with data change), but in this case we try to use the flex upload.
    Has anyone experienced a similar problem? Can we use the flex upload and suppress the popup to select the file WITHOUT an enhancement?
    Thanks in advance for your help,
    Yavuz

    Hi Yavuz,
    Yes, this can work for every period in one year - use the multiperiod processing option, available since EHP2 (BCS 6.02).
    Multiperiod processing requires you to define the consolidation cycle and start period, so you could define it to start with p1 and include all 12+ periods of the year.
    NB: multiperiod processing is initiated from the selection screen - right-click and select "run for remaining periods of the year".
    You can run either a task or a task group for the rest of the consolidation cycle, but data collection with flexible upload MUST use the presentation server settings (already mentioned above) or the data collection task will not work.
    So:
    - you make the configuration settings
    - you save your multi-period file on the presentation server
    - you go to the data collection (flexible upload) task in p1 and "run for remaining periods"
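    Conceptually, "run for remaining periods" just executes the same task once per period until the end of the consolidation cycle. A rough sketch (not SAP code; the task function and period numbers are invented):

```javascript
// Conceptual sketch (not SAP code) of "run for remaining periods": the same
// data collection task is executed once per period up to the cycle end.
function runForRemainingPeriods(task, startPeriod, cycleEnd) {
    var results = [];
    for (var period = startPeriod; period <= cycleEnd; period++) {
        results.push(task(period));
    }
    return results;
}

// Hypothetical task: one flexible upload execution per period p1..p12.
var log = runForRemainingPeriods(function (p) {
    return 'flexible upload executed for period ' + p;
}, 1, 12);
console.log(log.length); // 12
```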

  • Flexible Upload of master data including attributes - error at new attrib.

    Dear all,
    trying to use the flexible upload for loading master data with attributes, the system aborts if new values have to be created for attributes, e.g. a new Business Unit has region FR, and region FR does not yet exist as a key. Therefore the value for Region first has to be created within BCS, and after that the load of the new Business Unit will be successful.
    Does anybody know whether it is possible, or which setting within the upload customizing must be chosen, so that the new attribute value is automatically created during upload of the main characteristic?
    Kind regards
    Dieter

    Hello Dieter,
    your first way was the right way - before you can upload Business Units with attribute FR, you have to create the attribute value in BCS.
    It is not possible to create master data for two different characteristics in one flexible upload method.
    In the definition of the upload method you have to choose which characteristic you want to upload...
    best regards
    Thomas

  • Simple question on uploading reported Fin. data with Flexible upload

    Hi.
    I am trying to upload reported financial data with flexible upload. I found that all credit items must have their numbers entered with a - sign. At first I thought the Dr/Cr sign field in the FS item would decide the sign of the numbers, but it doesn't seem to. Is my finding right,
    or did I miss something?
    I just followed the help as below
    Data Entry
    Debit/Credit Sign
    You define a fixed debit/credit sign so that item values can be entered without the need for specifying the debit/credit sign (for example, in manual data entry or in flexible upload).
    ●     You enter a debit (plus/+) sign for asset items and expense items.
    ●     You enter a credit (minus/-) sign for equity and liability items and revenue items.
    When you enter item values, you only enter absolute amounts or quantities. When the values are written to the database, the system multiplies each entered value by the debit/credit sign of the item (and when applicable, by the debit/credit sign of the item's subassignment).
    If I missed anything, please advise me.
    If my finding is right, please let me know what is the use of DR/CR sign in FS item.

    Hi,
    Your findings are correct.
    You may have (and may use) the sign of the FS item and the sign of the movement type. They work during data load to place the data into the database with the proper sign. That's all.
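    The sign logic described in the help text can be sketched as follows. This is illustrative only; the item keys, names and signs are made-up examples, not SAP code:

```javascript
// Sketch of the debit/credit sign logic described above (illustrative only;
// the item keys and signs are hypothetical, not SAP master data).
var fsItems = {
    '100000': { text: 'Cash',    sign: +1 }, // asset item: debit (+)
    '200000': { text: 'Equity',  sign: -1 }, // equity item: credit (-)
    '300000': { text: 'Revenue', sign: -1 }  // revenue item: credit (-)
};

// Users enter absolute amounts; the stored value is amount * item sign.
function storedValue(item, enteredAmount) {
    return Math.abs(enteredAmount) * fsItems[item].sign;
}

console.log(storedValue('100000', 500)); // 500  (asset stays positive)
console.log(storedValue('300000', 500)); // -500 (revenue stored negative)
```

    So the Dr/Cr sign is applied by the system when values are written to the database, which is why credit items end up negative even though you enter absolute amounts.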

  • Flexible Upload

    Hi,
    I have done the configuration for flexible upload and am able to bring the data into the BCS cube. This upload brings the content data from the file, but we also need details like the file name to be brought into the cube.
    Request to provide inputs on the same.
    Thanks in advance,
    Abdul

    Hi Rajesh,
    Thanks for your reply,
    By saying brought to the InfoCube, I mean the data is coming into the totals records cube.
    I need to bring the file name into the same totals records cube. For example:
    the file uploaded through flexible upload is like C:// Test folder/Test1.
    Now this Test1 name should get displayed in the InfoCube. In the last log we can see the details of the file location etc., but the need is to bring that same file name into the cube.
    Appreciate your advice.
    Thanks in advance,
    Abdul

  • How to upload a document with values related to document properties in to document set at same time using Javascript object model

    Hi,
          Problem Description: Need to upload a document with values related to document properties using custom form in to document set using JavaScript Object Model.
        Kindly let me know any solutions.
    Thanks
    Razvi444

    The following code shows how to use REST/Ajax to upload a document to a document set.
    function uploadToDocumentSet(filename, content) {
        var appweburl = decodeURIComponent(getQueryStringParameter('SPAppWebUrl'));
        var hostweburl = decodeURIComponent(getQueryStringParameter('SPHostUrl'));
        var restSource = appweburl +
            "/_api/SP.AppContextSite(@target)/web/GetFolderByServerRelativeUrl('/restdocuments/testds')/files/add(url='" + filename + "',overwrite=true)?@target='" + hostweburl + "'";
        var dfd = $.Deferred();
        $.ajax({
            'url': restSource,
            'method': 'POST',
            'data': content,
            processData: false,
            timeout: 1000000,
            'headers': {
                'accept': 'application/json;odata=verbose',
                'X-RequestDigest': $('#__REQUESTDIGEST').val(),
                'content-length': content.byteLength
            },
            'success': function (data) {
                dfd.resolve(data);
            },
            'error': function (err, textStatus, errorThrown) {
                dfd.reject(err);
            }
        });
        return dfd;
    }
    Then when this code returns you can use the following to update the metadata of the new document.
    function updateMetadataNoVersion(fileUrl) {
        var appweburl = decodeURIComponent(getQueryStringParameter('SPAppWebUrl'));
        var hostweburl = decodeURIComponent(getQueryStringParameter('SPHostUrl'));
        var restSource = appweburl +
            "/_api/SP.AppContextSite(@target)/web/GetFolderByServerRelativeUrl('/restdocuments/testds')/files/getbyurl(url='" + fileUrl + "')/listitemallfields/validateupdatelistitem?@target='" + hostweburl + "'";
        var dfd = $.Deferred();
        $.ajax({
            'url': restSource,
            'method': 'POST',
            'data': JSON.stringify({
                'formValues': [{
                    '__metadata': { 'type': 'SP.ListItemFormUpdateValue' },
                    'FieldName': 'Title',
                    'FieldValue': 'My Title2'
                }],
                'bNewDocumentUpdate': true,
                'checkInComment': ''
            }),
            'headers': {
                'accept': 'application/json;odata=verbose',
                'content-type': 'application/json;odata=verbose',
                'X-RequestDigest': $('#__REQUESTDIGEST').val()
            },
            'success': function (data) {
                dfd.resolve(data);
            },
            'error': function (err) {
                dfd.reject(err);
            }
        });
        return dfd;
    }

  • How to upload two different asset values for Book & Tax Depre in FA

    Hello all
    kindly let me know how to resolve a situation where we have different methods of calculating Depreciation. For Example SLM for Book Depreciation and Diminishing method for Tax Depreciation.
    To calculate depreciation we use straight line method for book depreciation and diminishing value method for Tax depreciation. In this case, we have 2 different asset values to calculate book and tax depreciation.
    Since SAP allows to upload only one asset value (Gross book value), I get correct book depreciation value but incorrect value for tax depreciation as the tax depreciation is calculated based on gross book value instead of net book value (gross book value less accumulated depreciation value).
    For Eg
    FOR THE FIRST YEAR 
    BOOK DEPRECIATION:
    Asset Value = 1000
    Depreciation Method : SLM
    Depreciation rate is 10%
    Depreciation Amount = 100
    TAX DEPRECIATION:
    Asset Value = 1000
    Depreciation Method : DIMINISHING METHOD
    Depreciation rate is 15%
    Depreciation Amount = 150
    FOR THE SECOND YEAR
    BOOK DEPRECIATION:
    Asset Value = 1000
    Depreciation Method : SLM
    Depreciation rate is 10%
    Depreciation Amount = 100
    TAX DEPRECIATION:
    Asset Value = 850
    Depreciation Method : DIMINISHING METHOD
    Depreciation rate is 15%
    Depreciation Amount = 127.5
    KINDLY LET ME KNOW HOW TO UPLOAD THE PAST DATA WITH TWO DIFFERENT ASSET VALUES (IE 1000 & 850)  FOR THE ABOVE.
    Regards
    Sushil Yadav
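    The figures above follow directly from the two methods. A small sketch (not SAP code; the values simply mirror the example) reproducing them:

```javascript
// Sketch of the two depreciation methods described above (not SAP code;
// cost/rate values mirror the example figures).
function straightLine(cost, rate, years) {
    // Book depreciation: a fixed percentage of the original cost every year.
    var amounts = [];
    for (var i = 0; i < years; i++) {
        amounts.push(cost * rate);
    }
    return amounts;
}

function diminishingBalance(cost, rate, years) {
    // Tax depreciation: the rate is applied to the remaining net book value.
    var amounts = [];
    var nbv = cost;
    for (var i = 0; i < years; i++) {
        var dep = nbv * rate;
        amounts.push(dep);
        nbv -= dep;
    }
    return amounts;
}

console.log(straightLine(1000, 0.10, 2));       // book: 100 and 100
console.log(diminishingBalance(1000, 0.15, 2)); // tax: 150 and 127.5
```

    The second-year tax base of 850 (1000 less 150 accumulated depreciation) is exactly the net book value the diminishing method needs, which is why uploading only the gross value breaks the tax calculation.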

    Hi K,
    I'm having the same problem...did you find a solution yet?
    Greets,
    Martin.

  • How to get the logged in userId value in adf task flow OIM11g R2

    Hi,
    I have created an adf task flow. Now I want to run some query in that based on the logged in userId.
    Could you please help me in knowing how to get the logged in userID value in adf TaskFlow so that I can run a parameterized query.
    Thanks

    3 different ways to retrieve the username (not sure what you mean by user ID) :
    http://mahmoudoracle.blogspot.be/2012/06/adf-get-current-logged-user-name.html#.USI_c-h8zIo
    Also provide your JDev version.
    Basically, you should use a Groovy expression in a view criteria (it's the fastest and easiest way) and call that view criteria whenever you need it.
    That's if you are using ADF BC of course.

  • Bulk uploading the Pick List Values

    Hi,
    I have to create a cascading picklist for states and then cities, which I have to use across the application in Leads, Accounts and Contacts. Can anyone please tell me whether there is a way of uploading the pick list values, and how I can use the same thing in different places?
    Regards
    Nisman

    Hi Alex,
    I tried this option. It uploads the values into the City and State picklist fields (in the account view), but when I drill down into the picklist field, I don't find any other value in it. I checked the field-level setup in the admin section; no value was uploaded into the pick list values of that field.
    Any other idea of doing the same?
    Regards
    Nisman

  • Upload FS Items master using flexible upload-SEM-BCS

    I want to upload FS Items master data using flexible upload in SEM-BCS. I was wondering if I could have a sample upload format for the file. Kindly help.
    Ramanathan
    [email protected]

    Hi Ramanathan and welcome to SDN!
    Everything depends on your settings in a flexible upload method.
    You may use a comment character (like an asterisk) to show fields names and choose a fields delimiter (say, ';').
    In this case your file for upload might look something like the following.
    * These are the fields in the header:
    * Field1;Field2;Field3;Field4
    Field1Val;Field2Val;Field3Val;Field4Val
    * These are the fields in the data rows:
    * Field5;Field6;Field7;Field8
    Field5Val;Field6Val;Field7Val;Field8Val
    Field5Val;Field6Val;Field7Val;Field8Val
    Field5Val;Field6Val;Field7Val;Field8Val
    Best regards,
    Eugene
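    Assuming an asterisk comment character and a ';' field delimiter as in the reply above, parsing such a file can be sketched like this (illustrative only; the field values are invented):

```javascript
// Sketch of parsing the upload format above, assuming '*' as the comment
// character and ';' as the field delimiter (both taken from the reply).
function parseUploadFile(text) {
    return text.split('\n')
        .map(function (line) { return line.trim(); })
        // drop empty lines and comment lines
        .filter(function (line) { return line !== '' && line.charAt(0) !== '*'; })
        // split each remaining data line on the delimiter
        .map(function (line) { return line.split(';'); });
}

var sample = [
    '* Field1;Field2;Field3;Field4',
    'Val1;Val2;Val3;Val4',
    'Val5;Val6;Val7;Val8'
].join('\n');

console.log(parseUploadFile(sample));
```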

  • Need to send the logging tag's value as On in diagnostics

    Hi Experts,
                I read somewhere: "we recommend that you deactivate the logging function because you can activate it, if necessary, via the diagnostics header". I want to send the logging tag's value as "ON" in diagnostics. How is that possible? Please guide me, and also clarify what LOGGING and TRACE-LEVEL are.
    Thanks & Regards,
    Srihari.

    Check this:
    Re: PI Integration Engine RUNTIME LOGGING_SYNC for individual Interfaces
    Synchronous Scenario : Logging in XI
    Logging and tracing:
    http://help.sap.com/saphelp_nw04/helpdata/en/d6/49543b1e49bc1fe10000000a114084/frameset.htm

  • External field catalog & Info object catalogs - Role - in Flexible upload

    1) What is the role of the InfoObject catalogs maintained in the data basis & the source data basis?
    Please be kind enough to mention a scenario - underlining their utility in the consolidation process.
    Like....
    Are they used for loading master data from the source system into the BCS system?
    Can they be used in the flexible upload - data collection function?
    Are they used as a source of AFD data?
    I came across the below documentation on flexible upload - in SAP material.
    *When uploading from a field catalog, you also have the option of using mapping. In this case, the file structure no longer has to correspond with the structure of the data basis.*
    Understood the above 2 points.
    *Rather, you can assign a BW InfoObjectCatalog that acts as the data structure description for the file here. You specify this InfoObjectCatalog in Customizing for the data basis.*
    2) Does the above statement imply that we need not do any settings in the field catalog tab of the flexible upload? If yes, what do we do?
    In the flexible upload - field catalog tab - we define the data structure for the file we upload (correct me if I'm wrong).
    If you use an external field catalog, you have to specify how you want to map the data structure of the file to the structure of the data basis.
    3) Please give an example of an external field catalog.
    4) Can we use an external field catalog in the upload of RFD?
    Many Thanks
    Kind Regards,
    Kumar

    There are two possibilities for using an InfoObject catalog in SEM-BCS (this is true for both catalogs, characteristics and key figures):
    - In a data basis. The system adds the chars and KFs sitting in the catalogs defined in the data basis to the "Additional Fields" of each data stream (the tab strip "Data Stream Fields" in the data basis). If you check some of these fields and generate the data basis, these additional InfoObjects will be placed into the appropriate ODS/DSO objects and you will be able to use them for uploading some extra information.
    Without indicating the InfoObject catalog for chars, you'll not be able to configure the new "assets/liabilities" functionality at all.
    - In a source data basis.
    After including the source data basis in your data basis, you'll be able to use the external InfoObject catalog in a method of the category Flexible Upload. Tick the flag for using the external catalog and choose the SDB. In the mapping tab you'll have the possibility to choose from those chars and KFs that are located in the catalogs.
    This might be used for the upload of ANY data: RFD, AFD or master data.
    The scenarios, I guess, are rather obvious.
    Hope this helps.
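    The mapping tab essentially assigns each file field to a data basis InfoObject. A conceptual sketch (the field and InfoObject names here are hypothetical, chosen only for illustration):

```javascript
// Conceptual sketch of mapping a file structure to the data basis structure
// via an external field catalog (field and InfoObject names are hypothetical).
var mapping = {
    COMP:  'COMPANY_CHAR',  // file field -> data basis field
    ITEM:  'FS_ITEM_CHAR',
    PV_LC: 'PV_LC_KF'
};

function mapRow(fileRow) {
    var target = {};
    Object.keys(fileRow).forEach(function (field) {
        target[mapping[field]] = fileRow[field];
    });
    return target;
}

console.log(mapRow({ COMP: 'C1000', ITEM: '100000', PV_LC: 500 }));
```

    With such a mapping in place, the file structure no longer has to match the data basis structure, which is exactly the point of the external field catalog.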

  • I/U profit and loss in Inventory : Flexible upload of Inventory Data

    Hello people! gurus!!
    In data collection, I have defined a flexible upload with Inventory Data as the data type;
    then I tried to connect this in a method; however, among the options belonging to Data Stream, "totals records" and "documents" are my only options.
    Shouldn't options appear to select Inventory Data and Supplier?
    Thanks in advance
    Ismael Lozano

    Ooph, Ismael,
    DO the search BEFORE you post the question!
    Many such questions were answered a long time ago.
    For example:
    Re: Missing Elimination of Interunit Profit/Loss Functions
    or
    Re: SEM BCS - Interunit Profit/Loss in Transferred Asset / Inventory
