Consolidation Group in Target Cube (Data Collection Task)

Dear Experts,
In the Consolidation Monitor, while running the data collection task via Load from Data Stream, the update completes, but when I look at the content of the target InfoCube in RSA1, the GL account line items have no Consolidation Group value.
For example:

GL account  Company  CCode  Cons Group  Currency  PV LC  PV GC
100000      3000     3000   (blank)     USD       1000   44000
1) Is it necessary to have a value in Consolidation Group?
2) If yes, what is its purpose, and how do I get a value into this column?
Regards
Ritesh M.

No, consolidation groups are determined later, during the consolidation-group-dependent tasks.

Similar Messages

  • Data Collection task error (Item  is not defined in Cons Chart of Accts 01)

    Dear Experts,
    While executing the data collection task, I am getting the error "Item is not defined in Cons Chart of Accts 01". What could be the reason?
    (Many items are collected in the target, but this error is shown at the top.)
    Regards
    Ritesh M

    Thanks Dan,
    I have run the program UGMDSYNC and found a few GL accounts that have no cons chart of accounts assigned.
    Could you tell me how to assign a chart of accounts to these GL accounts?

  • Data collection task execution in UCMON

    Dear Friends
    I am using the flexible upload method and load from data stream for data upload. Both methods are assigned to my data collection method, which in turn is assigned to the task. When I execute the data collection task in the Consolidation Monitor, the system executes only the load from data stream and does not give me the option of selecting which method to execute.
    To isolate the problem, I created a new task and assigned only the flexible upload method, but when I execute this task the system still executes the load from data stream method.
    Can anyone guide me as to where I am going wrong?
    Cheers
    Shivkumar

    Hi Mani
    I could not follow what you meant to say:
    "I have done the same as what you describe, but do not have the issue. I created separate methods for data stream and flexible upload, then merged both into one method and assigned that method to one task, and I do not face the issue you described. Can you assign the method to the task, starting from the same starting period, and check again?"
    Do you mean that you did whatever I am doing and you too faced the same problem? The strange behaviour of the system is that when I execute it in "Execute with selection screen" mode the system shows all the methods, but if I run it in update mode the system executes only "Load from data streams", even if the "Flexible upload" method is assigned.
    BTW, where do you set the validity for a task?
    Cheers
    Shivkumar

  • ECCS : Data Collection tasks

    Hello Gurus,
    I want to customize my system so that manual entry tasks, flexible upload and rollup tasks are the ONLY tasks available in the "Data Collection" task group.
    However, the system shows by default an option called "Additional financial data", where "supplier data" can be imported, even though I did not customize anything for this option. I cannot find a way to remove this option from my data collection task layout.
    Do you have any solution for me?
    Thank you in advance for your help
    Best regards
    Pascal

    Hello Dan and thank you for this first reply,
    - For the "data collection" task group, I have no way to define which tasks are used; the system applies its own logic depending on various customizing parameters.
    - Regarding the "profit-in-inventory" option in the dimension, the system only offers to set the product group usage as an independent characteristic or as an existing custom characteristic; there is no option to indicate that profit-in-inventory is not implemented...
    Is there anything else I should check? If yes, could you please give me more details?
    Thank you in advance for your help
    best regards
    Pascal

  • Data collection task in Update mode is not possible with Original list

    Hi Friends,
    When I try to execute the data collection task in SA38, I get the following error within 9 seconds:
    UPDATE MODE IS NOT POSSIBLE WITH ORIGINAL LIST
    Message No. UCS0111
    Diagnosis: If you select the "Original list" indicator, the current task status is ignored. In this case, the system simulates a task run in which none of the organizational units are blocked yet. Thus, setting this indicator is meaningful and valid only if the task is executed in test mode at the same time.
    Procedure: select Original List and Test Run together.
    My problem is that I cannot select Test Run on the production server.
    Can anyone please help me solve this issue?

    Thank you all for responding.
    Hi Dan,
    SA38 provides all the needed parameters: Cons Group, Company, Version, Fiscal Year, Period, Group Currency.
    I am trying to run the data collection task with SA38.
    Hi Lucio,
    When I check both the Log and Original List fields, the program gives the error message "UPDATE MODE IS NOT POSSIBLE WITH ORIGINAL LIST".
    Even when I tried Log, Test Run and Original List together, I get the same error message "UPDATE MODE IS NOT POSSIBLE WITH ORIGINAL LIST".
    I can run the task successfully with only the Log option selected, but my client needs to use both Log and Original List (as they used to in BW 3.5; now we are using BI 7).
    Hi Christopher,
    We are running the program UCBATCH01 with transaction code SA38.
    Hi Eugene,
    I will look into the link and discuss it with the Basis team.
    Once again, thank you all for responding.
    Suraj.

  • How to extract data from BCS consolidation cube to Group BW extraction cube

    Gurus,
    I have to figure out a way to extract data from a local BCS consolidation totals cube to a group BW extraction cube via a virtual cube in between the two. How can I do the extraction, and what are the different ways it can be done?
    Detailed steps would be appreciated. Also, how should I proceed with this?
    Thanks
    Cheers:
    Sam

    Hi Sam,
    Instead of extracting, you can also consider reporting on the data with a MultiProvider.
    There is a document here that might be useful:
    http://www.affine.co.uk/files/How%20to%20Create%20a%20MultiProvider%20over%20BW%20and%20BCS.pdf
    Kevin

  • Missing Data Target in Infopackage for Update ODS Data in Data Target Cube

    Hello & Best Wishes for the New Year to all of you,
    I have 3 ODS objects (1 on full update and 2 with delta updates). All 3 ODS objects update data to a single cube. In my development system this works correctly: data loads from PSA to ODS to cube.
    Now I have transported this to my QA and production systems. In QA and production, I am able to load data up to all 3 ODS objects and activate the data in them.
    When I try "Update ODS Data in Data Target" to load data from ODS to cube in QA/PD, the system-created InfoPackage (ODS to CUBE) does not get the data target details (Initial Upload / Full Upload). The Data Target tab is blank (I expected the cube details).
    I have tried transporting again after deleting the update rules.
    Can you suggest what the problem could be?
    regards - Rajesh Sarin

    Thanks Dinesh,
    I have the ODS-to-CUBE update rules active in the QA and PD systems. Still, the problem exists only in QA and PD. In DV, the ODS-to-CUBE data target is available in the InfoPackage and the data loads correctly to the cube.
    Listing all the things I have tried:
    1) Originally transported with all the collected objects. Despite having active update rules, the data target cube was missing in QA and PD.
    2) After this problem, I again transported only the update rules through the Transport Connection; the problem still was not solved in QA and PD.
    3) I then sent a transport to delete the update rules, which deleted the ODS-to-CUBE update rules in (DV), QA and PD. After that I sent another request to create the ODS-to-CUBE update rules in (DV), QA and PD. The data target is still missing, despite active update rules in QA and PD.
    In DV, the ODS-to-CUBE data target is available and the data still loads correctly to the cube, even now.
    4) I have also tried "Generate Export DataSource" for the 3 ODS objects in QA and PD. It still does not help.
    Can you please suggest anything?
    regards - Rajesh Sarin

  • COI not reading equity data after consolidation group change

    Dear experts,
    I am running the consolidation of investments task for a new group that was created in 012.2009. This new group, CG2, is a child of an existing group, CG1, and all the consolidation units that are under CG2 in 012.2009 were under CG1 in 012.2008.
    In my new group CG2, all the units have 012.2009 as their first consolidation period, and in the parent group they have 012.2009 as their last consolidation period.
    The problem is that when I run the COI task, the system does not read the equity data for these consolidation units. I have both equity data and investment data defined to be read from additional financial data. The system correctly reads the investment data for the parent consolidation unit and tries to eliminate it, but since it does not read the equity data of the child, it posts the value as a consolidation difference, which is incorrect.
    Does anyone know what I am missing?
    Thanks in advance.
    Kind regards,
    Sónia

    Hi Sonia
    I don't want to confuse you, as Dan is guiding you so far, but let me put some of my thoughts here.
    CG1  12/2008
       - CU-A   Parent   12/2008 to 999/9999
       - CU-B            12/2008 to 999/9999
       - CU-C            12/2008 to 999/9999
    Do you have any other sub-node for CU-B and CU-C, since B has an investment in C?
    New group: 12/2009
      CG1
       - CU-A   12/08 to 999/9999
       - CU-B   12/08 to 12/9999
       - CU-C   12/08 to 12/2009
    The CG2 group is supposed to look like this:
      CG1
        - CU-A
        - CU-B
           I--- CG2
                 CU-B   Parent   12/09 to 999/9999
                 CU-C            12/09 to 999/9999
        - CU-C
    Now, with the above scenario, you can invest in C from B and the system will treat it as FIRST CONSOLIDATION based on the AFD investments/equity.
    How did you maintain B's investment in C at the CG1 12/08 level?
    And/or, as per Dan: you have to run the Preparation for Consolidation Group Change task before the CT task, and after the elimination tasks but before the COI task.
    thx

  • Data collection was switched from an AI Config task writing to an hsdl file to synchronized DAQmx tasks logging to TDMS files. Why are different readings produced for the same test?

    A software application was developed to collect and process readings from capacitance sensors and a tachometer in a running spin rig. The sensors were connected to an Aerogate Model HP-04 H1 Band Preamp connected to an NI PXI-6115. The sensors were read using the AI Config and AI Start VIs, and the data was saved to a file using the hsdlConfig and hsdlFileWriter VIs. In order to add the capability of collecting synchronized data from two eddy-current position sensors (which will be connected to a BNC-2144 connected to an NI PXI-4495) in addition to the existing sensors, the AI and HSDL VIs were replaced with DAQmx VIs logging to TDMS. When running identical tests, the new file format (TDMS) produces readings that are higher than, and inconsistent with, the readings from the older file format (HSDL).
    The main VIs are SpinLab 2.4 and SpinLab 3.8, in the folders "SpinLab old format" and "Spinlab 3.8" respectively. SpinLab 3.8 requires the Sound and Vibration suite to run correctly, but that part is used after the part that is causing the problem. The problem is occurring during data collection in the Logger segment of the code, or during processing in the Reader/Converter segment. I could send the readings from the identical tests if they would be helpful, but the data takes up approximately 500 MB.
    Attachments:
    SpinLab 3.8.zip ‏1509 KB
    SpinLab 2.4.zip ‏3753 KB
    SpinLab Screenshots.doc ‏795 KB

    First of all, how different is the data?  You say that the reads are higher and inconsistent.  How much higher?  Is every point inconsistent, or is it just parts of your file?  If it's just in parts of the file, does there seem to be a consistent pattern as to when the data is different?
    Secondly, here are a couple things to try:
    Currently, you are not calling DAQmx Stop Task outside of the loop; you're just calling DAQmx Clear Task.  This means that if any errors occurred in the logging thread, you might not be seeing them (as DAQmx Clear Task clears outstanding errors within the task).  Add a DAQmx Stop Task before DAQmx Clear Task to make sure that you're not missing an error.
    Try "Log and Read" mode.  "Log and Read" is probably going to be fast enough for your application (as it's pretty fast), so you might just try it and see if you get any different result.  All that you would need to do is change the enum to "Log and Read", then add a DAQmx Read in the loop (you can just use Raw format since you don't care about the output).  I'd recommend that you read in even multiples of the sector size (normally 512) for optimal performance.  For example, your rate is 1MHz, perhaps read in sizes of 122880 samples per channel (something like 1/8 of the buffer size rounded down to the nearest multiple of 4096).  Note: This is a troubleshooting step to try and narrow down the problem.
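    The chunk-size arithmetic suggested above can be sketched in a few lines. The 4096-sample multiple and the 122880 figure come from the reply itself; the 1,000,000-sample buffer is an assumption for illustration (one second of data at the stated 1 MHz rate), not something read from the attached VIs:

```python
def read_chunk(buffer_size: int, multiple: int = 4096) -> int:
    """1/8 of the buffer, rounded down to the nearest multiple of `multiple`.

    Reading in even multiples of the disk sector size keeps the
    DAQmx Read sizes aligned for optimal logging performance.
    """
    return (buffer_size // 8) // multiple * multiple

# Assumed buffer of 1,000,000 samples (1 s at a 1 MHz rate)
print(read_chunk(1_000_000))  # 122880 samples per channel
```

    The same helper can be reused if the buffer size changes; the result is always sector-aligned.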
    Finally, how confident are you in the results from the previous HSDL test?  Which readings make more sense?  I look forward to hearing more detail about how the data is inconsistent (all data, how different, any patterns).  As well, I'll be looking forward to hearing the result of test #2 above.
    Thanks,
    Andy McRorie
    NI R&D

  • Target Data Collection in ASCP

    Hi,
    I was NOT able to see the "Target Data Collection" parameter in the LOV of the Data Collection concurrent program; I could see ONLY the Complete Refresh parameter in the LOV.
    Is there a profile option for this?
    Is there any Profile Option?
    Regards
    NTGR

    Set "Purge Previously Collected Data" to 'No' in the parameter list. You will then be able to see "Targeted Collection" and "Net Change Collection" in the Collection Method parameter LOV.
    Regards
    Abhishek

  • Planning & Consolidation Questionnaires for Data Collection

    Hi Friends,
    I need some help with the following documents:
    1. Does anybody have Planning & Consolidation questionnaires for data collection (information gathering)?
    2. A document on BPC integration with other systems?
    Regards,
    Mcs.Chowdary

    VenK7337,
    Could you show your Python code, so we know what you are "writing" to the Ethernet port? That way we can see what you are receiving.
    Parsing the incoming data (from the TCP read) depends heavily on the device that sends it, and cannot be described generically. LabVIEW has many byte (and even bit) manipulation functions to convert many different data formats into its own built-in formats.
    So after the TCP listener is connected, you are constantly reading from the established connection (until it gets broken, of course). A more advanced example would be the Internet Toolkit, if you have it.
    From the read characters (and I hope you designed a protocol with a clear starting character, an ending character, and maybe even a built-in checksum) you parse the data, perform your action, and of course generate a reply. Again, the Internet Toolkit is a good example: it parses the input as it comes in, based on the HTTP format, then generates the reply based on the request received.
    These days I would suggest not using binary-encoded numerics. Try to use XML-formatted data. Yes, it causes a lot of overhead, but typically this is not an issue, and it makes the code a lot more portable and maintainable. It also makes it easier to interface with other languages/platforms.
    Unless, of course, you are looking at kHz data rates; then XML is not the preferred choice.
    Hope this helps...
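    To make the framing idea concrete, here is a minimal Python sketch. The STX/ETX delimiters and the XOR checksum are assumptions for illustration, not a description of any particular device's protocol:

```python
from functools import reduce
from operator import xor

STX, ETX = 0x02, 0x03  # assumed start/end delimiter bytes


def parse_frame(buf: bytes):
    """Return (payload, leftover) for one complete frame, or (None, buf) if incomplete.

    Assumed frame layout: STX, payload bytes, XOR checksum of payload, ETX.
    """
    start = buf.find(bytes([STX]))
    end = buf.find(bytes([ETX]), start + 2)
    if start == -1 or end == -1:
        return None, buf  # no full frame yet: keep buffering reads
    payload, checksum = buf[start + 1:end - 1], buf[end - 1]
    if checksum != reduce(xor, payload, 0):
        raise ValueError("checksum mismatch")
    return payload, buf[end + 1:]  # leftover bytes belong to the next frame
```

    In a real receive loop you would append each TCP read to a buffer and call `parse_frame` until it returns `None`, carrying the leftover bytes forward.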

  • Group By Date and Task

    Hi Guys,
    I have a report, and it took a long query to get the data, but now I am stuck: I tried to sort it by date with no luck.
    I want to group it so that each date appears only once, with all the tasks on that date summed.

    Hi Mo Yusuf,
    Per my understanding, you have sorted by the ansactionDate field and want to group by this field to sum all the tasks of the same date, but it failed, right?
    Please find detailed information below about how to add the row group for the ansaction date and sum the tasks in the report design:
    Right-click the task field and select Add Group > Row Group to add a parent group, like below:
    Click to select the task under the task field and you will get "=Sum(Fields!task.Value)", which sums the task based on the scope of ansactiondate. You can also use the expression "=Sum(Fields!task.Value,"ansactiondate")"
    to state the scope of the Sum function explicitly:
    If I have misunderstood, please provide some sample data from the table, the query you are currently using in the dataset, and more details about the expected result you want to get.
    Regards,
    Vicky Liu
    TechNet Community Support
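    The aggregation the Sum expression performs (one row per date, tasks summed within that date) can be mimicked outside the report to sanity-check the expected totals. This Python sketch uses made-up sample rows, since the thread does not include the actual dataset:

```python
from collections import defaultdict

# Made-up sample rows: (transaction date, task value)
rows = [
    ("2014-05-01", 2),
    ("2014-05-01", 3),
    ("2014-05-02", 4),
]

# Group by date and sum task values, the same aggregation that
# =Sum(Fields!task.Value) performs within the date row group.
totals = defaultdict(int)
for date, task in rows:
    totals[date] += task

print(dict(totals))  # {'2014-05-01': 5, '2014-05-02': 4}
```

    Comparing these totals against the rendered report is a quick way to confirm the row group scope is set correctly.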

  • SP2013 alternative for Collect Data workflow task?

    SP2010 offered a workflow action named Collect Data, allowing easy data collection from users without having to create custom ASPX forms. Apparently, SP2013 no longer offers this feature.
    I'm doing an SP2013 project now that absolutely prevents me from doing any custom development, so I'm trying to find an easy way to collect data from users in SP2013 without custom coding. Are there alternatives?

    Hi,
    I understand that there is a big list of workflow actions that are not available in SharePoint 2013. However, SharePoint 2013 still allows users to create workflows on the SharePoint 2010 platform. If you need these legacy actions, make sure you select the "Platform Type" as SharePoint 2010 when you create the workflow. You cannot change the platform type once the workflow is created.
    Source link:
    What's new in workflow in SharePoint Server 2013
    http://technet.microsoft.com/en-us/library/jj219638.aspx
    Tracy Cai
    TechNet Community Support

  • BCS - Data collection issue

    Hi,
    I'm using BCS 4.0. I'm now in final testing and have some questions about the data collection process using the load from data stream method. I ran the task in the Consolidation Monitor for period 007.2007 and for all companies without any errors or warnings, but we have differences in the financial information for that period.
    I reviewed the content of my BCS cube (RSA1) and we don't have any data for those accounts; the only thing I found was that all documents were created on the same date.
    I deleted the request ID in RSA1 in my BCS cube and executed the task again in the Consolidation Monitor, but the result was the same.
    Looking at the log / source data, the rows listed are not taken from the InfoProvider.
    Any idea what the problem could be?
    Thanks
    Nayeli

    Hi Nayeli,
    I had to do this kind of job (reconciling data between the source basis cube and the totals cube) during final testing a lot of times, together with an accountant.
    The only way to get a first clue about what is happening is to compare every particular amount in both cubes, comparing and trying to figure out any dependencies in the data.
    The difference might arise for ANY reason. Only you can analyze the data and decide what to do.
    AFAIU, you compared only reported data and did no currency translation or eliminations?
    Then I'm afraid you've made a very big mistake by deleting the request from the totals cube. You have broken the consistency between the totals and the document data. For a transactional cube, a request stays yellow until the number of records in it reaches 50,000; then it is closed and becomes green. As you may understand, those 50,000 records may contain a lot of data for different posting periods, for example reported data for the previous closed period. The documents, though, still contain the eliminations for that period: an inconsistency.

  • Can I filter the table being profiled in the Data Profiling task, or know of a work around? SSIS2008r2

    Hi,
    I import several files into a staging table.  Inside a foreach loop, after the data flow task, I have a Data Profiling task that profiles the table so we can give feedback to those who sent us the file.
    I just realized that after the first loop there is more than one file in the staging table, hence more than one file is being profiled.
    At first glance it doesn't appear I can add a "Where" clause to the table.  I do have a column that states which file the data came from, but since I can only profile a view or a table, I can't filter.
    Anyone have any ideas?
    Mike

    Catching outliers, seeing how much of what kind of data exists, deviations, distribution: you get a sort of histogram, in essence, with Data Profiling.
    It is normally a preamble to data ingress (or egress) into a new target.
    Patterns, on the other hand, are handled with Fuzzy Lookup/matching or grouping. Using RegEx in SSIS is hard! I admit, I proposed having it included in the next version of SSIS. I remember there were a component or two on CodePlex:
    http://ssisctc.codeplex.com/ - see if any makes sense; I recall I tried one just for fun.
    Arthur My Blog
