SEM-BCS - Steps for Load from Data Stream

Hi,
What are the steps to configure "Load from Data Stream"?
What kind of InfoCube should I use?
Thanks in advance.

Hi,
There are quite a lot of threads regarding this issue.
For example:
Re: BCS configuration
Re: How is data loaded into the BCS cubes?
Re: Which info object should be used for Financial statement line (BPS and BCS)
Re: SEM-BCS: Load from data stream
Feel free to ask if something is not clear.

Similar Messages

  • EHP 2 Multiperiod consolidation for load from data stream

    Does anyone have experience of using multiperiod consolidation with the load from data stream functionality, as opposed to flexible upload?

    Not with LFDS, but I have used several other tasks, and I don't see any problems with the multiperiod monitor as long as no manual intervention is required
    (e.g. if the task requires choosing a file or entering manual entries, the multiperiod monitor will ignore it).
    Have you got a specific error to tell us about?

  • What are the steps for loading master data

    Hello,
    What are the steps for loading master data? I want to learn about loading all master data and how to choose the best way to load the data.
    If anyone has documents, please send them to me; I will be really grateful.
    [email protected] Thanks, everyone.
    Evion

    Hi Heng,
    Download the data into a CSV file.
    Write a program using GUI_UPLOAD to upload the CSV file and insert the records. Check the link below for an example; a minimal sketch also follows below.
    http://www.sap-img.com/abap/vendor-master-upload-program.htm
    Reward points for useful solutions.
    Regards,
    Harini.S
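    A minimal sketch of such an upload program (an addition to the reply above, not part of it): it reads the CSV from the presentation server with GUI_UPLOAD, splits each line and writes the rows to a staging table. The structure TY_VENDOR, the file path and the table ZVENDOR_STAGE are illustrative placeholders only.

    REPORT z_upload_master_data.

    " Illustrative line structure for the CSV file - placeholder fields
    TYPES: BEGIN OF ty_vendor,
             lifnr TYPE c LENGTH 10,   " vendor number
             name1 TYPE c LENGTH 35,   " vendor name
             ort01 TYPE c LENGTH 35,   " city
           END OF ty_vendor.

    DATA: lt_raw    TYPE TABLE OF string,
          lv_line   TYPE string,
          ls_vendor TYPE ty_vendor,
          lt_vendor TYPE STANDARD TABLE OF ty_vendor,
          lv_count  TYPE i,
          lv_file   TYPE string VALUE 'C:\temp\vendors.csv'.  " placeholder path

    START-OF-SELECTION.

      " Read the CSV file from the presentation server, one line per row
      CALL FUNCTION 'GUI_UPLOAD'
        EXPORTING
          filename = lv_file
          filetype = 'ASC'
        TABLES
          data_tab = lt_raw
        EXCEPTIONS
          OTHERS   = 1.
      IF sy-subrc <> 0.
        MESSAGE 'Error uploading file' TYPE 'E'.
      ENDIF.

      " Split each comma-separated line into the structured work area
      LOOP AT lt_raw INTO lv_line.
        CLEAR ls_vendor.
        SPLIT lv_line AT ',' INTO ls_vendor-lifnr ls_vendor-name1 ls_vendor-ort01.
        APPEND ls_vendor TO lt_vendor.
      ENDLOOP.

      " Insert the records into a hypothetical staging table ZVENDOR_STAGE
      MODIFY zvendor_stage FROM TABLE lt_vendor.
      COMMIT WORK.
      DESCRIBE TABLE lt_vendor LINES lv_count.
      WRITE: / lv_count, 'records processed.'.

    For real master data you would normally post through the proper API (a BAPI, batch input or LSMW) rather than writing to a table directly; the direct MODIFY here only keeps the sketch short.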

  • Steps for Migration from SharePoint 2007 to SharePoint 2010?

    Hi All,
    Can anyone tell me the steps for migration from SharePoint 2007 to SharePoint 2010?
    Thanks in advance!

    This is the official communication:
    http://technet.microsoft.com/en-us/library/cc263212.aspx
    and this is a step by step guide:
    Step by Step inplace upgrade to SP2010
    Hope it helps!
    Thanks, Ransher Singh, MCP, MCTS | Click Vote As Helpful if you think this post is helpful in responding to your question; click Mark As Answer if you think it is the answer to your question.

  • SEM-BCS Step COI

    Hi,
    Can anyone explain, or forward a document on, handling step consolidation of investments?
    Thanks and regards
    Naveen.KV

    CoI is the most complicated part of consolidation.
    There is still no good documentation on SEM-BCS, and on CoI in particular.
    Do you expect that someone will write a document on this topic for you and send it to you?
    In the SAP Mentor forum there were some suggestions for 2008. One of them was tighter connections with SAP employees. Don't you have these connections? You are from SAP, aren't you?
    Can't you get this information from your colleagues?

  • Need info on Upgrade / Conversion Steps for PO from 11.0.3 to  11.5.10

    Any info on the steps required to do the Upgrade / Conversion for PO from 11.0.3 to 11.5.10 will be appreciated.
    Thanks

    Do you just want to convert POs, or all open items such as AP invoices etc.? If you are just talking about PO information, there is an interface table available in 11.5.10 that was not available in 11.0.3 to convert PO information. Please look at the Open Interface Manual for Manufacturing for details.

  • Steps for loading data into the infocube in BI7, with dso in between

    Dear All,
    I am loading data into the InfoCube in BI7, with a DSO in between. My data flow looks like this:
    Top to bottom:
    InfoCube (Customized)
    Transformation
    DSO (Customized)
    Transformation
    DataSource (Customized).
    The mapping and everything else look fine, and the data is also seen in the cube on the FULL load.
    But due to some minor error (I guess), I am unable to see the DELTA data in the DSO, although it is loaded into the DataSource through process chains.
    Kindly advise me where I went wrong.
    Or, step-by-step instructions for loading data into the InfoCube in BI7, with a DSO in between, would be really helpful.
    Regards,

    Hi,
    My first impulse would be to check whether the DSO is set to "direct update". In that case no delta is possible, because the change log is not maintained.
    My second thought would be to check the DTP moving data between the DSO and the target cube. If it is set to Full, you will not get a delta. It is only possible to have one DTP, and an existing one cannot be switched from Full to Delta, so delete the Full DTP and create it again in Delta mode.
    Hope this helps.
    Kind regards,
    Jürgen

  • Index for loads from ODS To Cube

    The load from the ODS to the cube is taking a long time. In the start routine another ODS is being looked up; the keys for the look-up are, say, X and Y.
    There is already an index on keys X, Y & Z.
    Will this index be used when doing the select on that ODS, or do I need to create a new index with only the X and Y keys?
    Thanks

    When you are running the start routine, run an SQL trace (transaction ST05); that will tell you if the index is being used. Since X and Y are the leading fields of the existing index, a select that restricts only on X and Y can generally still use it; see the sketch below.
    Arun
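    To make the advice above concrete, here is a minimal look-up sketch, assuming a BI7 transformation start routine (SOURCE_PACKAGE). The active-table name ZODS_LOOKUP and the field names X, Y and AMOUNT are placeholders; the point is that the WHERE clause restricts only on the two leading fields of the existing (X, Y, Z) index, so the database can still use that index.

    " Placeholder line type for the look-up ODS
    TYPES: BEGIN OF ty_lookup,
             x      TYPE c LENGTH 10,
             y      TYPE c LENGTH 10,
             amount TYPE p LENGTH 15 DECIMALS 2,
           END OF ty_lookup.

    DATA: lt_lookup TYPE STANDARD TABLE OF ty_lookup,
          ls_lookup TYPE ty_lookup.

    IF source_package IS NOT INITIAL.
      " One selection per data package instead of one SELECT per record;
      " only the leading index fields X and Y appear in the WHERE clause
      SELECT x y amount
        FROM zods_lookup
        INTO TABLE lt_lookup
        FOR ALL ENTRIES IN source_package
        WHERE x = source_package-x
          AND y = source_package-y.
      SORT lt_lookup BY x y.
    ENDIF.

    " Later, inside the loop over the records of the data package:
    " READ TABLE lt_lookup INTO ls_lookup
    "      WITH KEY x = <record>-x y = <record>-y BINARY SEARCH.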

  • Steps for loading approvers, solicitors & workflows to the CUP

    Hello,
    I want to know whether there is a way, or a set of steps, to load the approvers, solicitors & workflows into CUP in SAP GRC AC 5.3.
    What I mean is: can I load the workflows first, then the approvers & solicitors, and then join the approvers to the workflows, or do I have to set up the approvers & solicitors first, then the workflows, and then join them?
    Please I need help.
    Best regards...
    Pablo Mortera.

    Sorry, Pablo. Just one question:
    What do you mean by solicitors?
    Setting up approvers and creating workflows are two separate tasks with no dependency, so you don't have to worry about the order. You won't be able to create a request without having both of them configured in CUP.
    Regards,
    Alpesh

  • Steps for migrating from Oracle 10g to Oracle 11g

    Hi,
    Please send me the steps to migrate from Oracle 10g to Oracle 11g on Red Hat Linux x86_64 on RAC with ASM.
    Regards,
    Tushar

    http://download.oracle.com/docs/cd/B28359_01/server.111/b28300/toc.htm

  • BAdI UC_DATATRANSFER for BCS Mapping in "Load from Data Stream" method

    Hello Everyone,
    I need some help on finishing up the code for the UC_DATATRANSFER BAdI.
    I have looked on SDN and elsewhere, but could not find a comprehensive breakdown of documentation apart from the F1 documentation available on the BAdI.
    So, any help would be appreciated.
    The steps completed so far:
    1. Activated the BAdI and created the filter value for it.
    2. After the BAdI was activated, I was able to go into the MAP method and wrote the logic for profit center derivation from the consolidation hierarchy.
    The issue: there are four parameters for the MAP method,
    IT_DATA_SOURCE
    IS_DATA_TARGET
    ES_DATA_TARGET
    ET_DATA_TARGET
    The data from the source system is available in the table IT_DATA_SOURCE.
    But this is not changeable, as it is an importing parameter, whereas the actual ET_DATA_TARGET, which is passed on to the FINALIZE method of the BAdI, is not filled initially.
    When I try to do a MOVE-CORRESPONDING from IT_DATA_SOURCE into ET_DATA_TARGET, I keep getting short dumps because the two tables' line lengths are not the same.
    Did anyone else face the same issue when trying to implement the BAdI for the mapping?
    I would really appreciate it if anyone could provide sample code if possible.
    Let me know if you need additional information.
    Thanks
    Dharma.

    Hello,
    Thanks for looking into the question.
    I had already tried doing that; I get a short dump stating that the object tables are not convertible.
    When I looked into the table structures, I found that "IS_DATA_TARGET", "ES_DATA_TARGET" & "ET_DATA_TARGET" belong to the same category: flat structures/tables with a line length of 484 as per the debugger.
    The structure "IT_DATA_SOURCE", however, has a length of 404.
    For this reason, when I write
    ET_DATA_TARGET = IT_DATA_SOURCE, I keep getting the short dumps.
    Also, is your consolidation process legal or managerial?
    Our consolidation process is legal, and we have the Company and Profit Center fields assigned to the Consolidation Unit role in the Data Basis definition.
    Can you please let me know what the structure lengths are in your system?
    Thanks
    Dharma.
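    Since no resolution appears in the thread, here is a hedged sketch of the usual way around the length mismatch described above: because the source and target line types differ (404 vs. 484 bytes here), the tables cannot be assigned to each other as a whole; instead each source row is moved field by field into a work area created with the target's line type. The method and parameter names follow the post above; the profit-centre derivation itself is only indicated by a placeholder comment.

    METHOD if_ex_uc_datatransfer~map.

      DATA: lr_target TYPE REF TO data.

      FIELD-SYMBOLS: <ls_source> TYPE any,
                     <ls_target> TYPE any.

      " Work area with the line type of the target table, even if the
      " parameter is typed generically
      CREATE DATA lr_target LIKE LINE OF et_data_target.
      ASSIGN lr_target->* TO <ls_target>.

      CLEAR et_data_target.

      LOOP AT it_data_source ASSIGNING <ls_source>.
        CLEAR <ls_target>.
        " Copy all identically named components; extra target fields stay initial
        MOVE-CORRESPONDING <ls_source> TO <ls_target>.

        " Placeholder: derive the profit centre from the consolidation
        " hierarchy here, e.g. after
        " ASSIGN COMPONENT 'PROFIT_CTR' OF STRUCTURE <ls_target> TO <lv_pc>.

        INSERT <ls_target> INTO TABLE et_data_target.
      ENDLOOP.

    ENDMETHOD.

    If MOVE-CORRESPONDING is not accepted for the generically typed work areas in your release, the same copy can be done component by component with ASSIGN COMPONENT.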

  • Best practice for loading from mysql into oracle?

    Hi!
    We're planning to migrate our software from MySQL to Oracle. Therefore we need a migration path for moving the customers' data from MySQL to Oracle. The installation and the data migration/transfer have to run in different customer environments, so options like installing the Oracle gateway and connecting to MySQL via ODBC, for example, are ruled out because they make the installation process more complicated... Also, the installation with the preconfigured Oracle database has to fit on a 4.6 GB DVD...
    I would prefer the following:
    - spool mysql table data into flat files
    - create oracle external tables on the flat files
    - load data with insert into from external tables
    Are there other "easy" ways of doing such migrations, or what do you think about the preferred way above?
    Thanks
    Markus

    Hi!
    Didn't anyone else have this requirement for migrations? I have tested with the MySQL SELECT ... INTO OUTFILE clause. It seems to work for simple data types; we're now testing with BLOBs...
    Markus

  • Settings for loading from flat file in BI7

    Hi guys
    I am trying to load data from a CSV file. I used MS Excel to create this CSV file. Now, in BI 7, what settings do I need to use to load the data? I need info like "Separator" and "Replacement for Blank".
    I have already tried ; and , as the separator, but the preview is all messed up.
    thanks

    Hi Adnan,
    First, create a source system for flat file loading.
    Do this in RSA1 - Source Systems; via the context menu, create it under the node "File".
    Then go to the RSA1 - DataSources view and choose the correct source system.
    Create a DataSource and assign fields.
    Use your DataSource to load data into an InfoObject or DataStore, using a transformation between the DataSource and the target.
    Create and schedule an InfoPackage to load to the PSA.
    Create and schedule a DTP to load into the target.
    Settings for flat file loading in SPRO:
    Thousand separator
    Decimal point separator
    Field separator
    Field delimiter
    In the InfoPackage, under External Data, we can specify:
    the file path and where the file exists (whether on the application server or the client workstation)
    file type
    data separator
    escape sign
    thousand separator
    number of header rows to be ignored
    Hope this will help you.
    For more info, please refer to the links below:
    [http://help.sap.com/saphelp_nw04s/helpdata/en/43/03450525ee517be10000000a1553f6/frameset.htm]
    [http://help.sap.com/saphelp_nw04s/helpdata/en/fc/1251421705be30e10000000a155106/content.htm]
    BI 7.0 flat file extraction
    [http://help.sap.com/saphelp_nw04/helpdata/en/8e/dbe92341c84242be2c7d3917f1c197/content.htm]
    [https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/com.sap.km.cm.docs/library/biw/g-i/how%20to%20load%20a%20flat%20file%20into%20bw-bps%20using%20sapgui.pdf]
    Regards,
    NR
    Assign points if useful...

  • Restrict authorizations for loads from HR to BW for certain data

    Hi,
    our customer wants to protect some data in the HR productive system. This data is defined/restricted by certain personnel areas.
    It is not enough to use reporting authorizations in BW to restrict presentation in queries, or to use filters in InfoPackages during the load, to exclude this data.
    The requirement is to make the load of such data from HR to BW absolutely impossible; even the BW administrator must not be able to see or load it.
    We will probably have to limit the ALEREMOTE user's authorizations in BW somehow. I do not know how, and I even doubt that the extractors in the HR source system perform authorization checks on fields.
    Is there any way to do this?
    Thank you very much,
    Petr

    Hi Petr,
    Create a general enhancement routine (for the restricted authorization) with a generic name, which can be called dynamically for every DataSource; see the sketch after this post.
    Refer to:
    http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/2d99121a-0e01-0010-e78c-b1ae566a2413?overridelayout=true
    Not personally tested, but check the following.
    In that routine, you may try applying the following logic:
    1) You may need to use TYPE ANY field symbols.
    2) In a WHILE loop until all relevant fields of C_T_DATA have been checked (perhaps a counter based on the total number of fields):
        DELETE C_T_DATA where <type_any1> EQ (or use IN) specific value(s) of the personnel area
        DELETE C_T_DATA where <type_any1> CS (contains, check pattern) specific value(s) of the personnel area
    ENDWHILE.
    Optionally, for standard DataSources you can add logic in the same program based on the standard field "WERKS" only.
    Note: you may need to research dynamic addressing of each field using field symbols.
    Thanks
    Arun Purohit
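    A hedged sketch of that idea, assuming the filtering sits in the transaction-data customer exit of the extractors (enhancement RSAP0001, include ZXRSAU01), so the rows are dropped in the HR source system before they ever reach BW or ALEREMOTE. The component name 'WERKS' and the personnel-area value '1000' are illustrative assumptions; adjust them per DataSource.

    " C_T_DATA is the generic data table handed to the customer exit;
    " which component holds the personnel area depends on the DataSource.
    FIELD-SYMBOLS: <ls_data>  TYPE any,
                   <lv_persa> TYPE any.

    DATA: lr_restricted TYPE RANGE OF persa,
          ls_restricted LIKE LINE OF lr_restricted.

    ls_restricted-sign   = 'I'.
    ls_restricted-option = 'EQ'.
    ls_restricted-low    = '1000'.      " restricted personnel area (example)
    APPEND ls_restricted TO lr_restricted.

    LOOP AT c_t_data ASSIGNING <ls_data>.
      " Address the personnel-area field dynamically, because the extract
      " structure differs per DataSource
      ASSIGN COMPONENT 'WERKS' OF STRUCTURE <ls_data> TO <lv_persa>.
      IF sy-subrc = 0 AND <lv_persa> IN lr_restricted.
        DELETE c_t_data.                " drop the row before it leaves HR
      ENDIF.
    ENDLOOP.

    Because the rows are removed in the extractor itself, InfoPackage filters and BW reporting authorizations are no longer the only line of defence.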

  • SEM-BCS - Copying consolidated data from one version to another

    Hi,
    We have a situation where we consolidate forecast data. However, this involves mixing in actuals data. For example:
    1) The Q3 forecast would contain Q1 and Q2 as actuals and Q3 and Q4 as forecast
    2) The system has already done the consolidation for Q1 and Q2, let's say in version 0
    3) Forecast is a different version, e.g. F1.
    4) I want to be able to copy the consolidated data of Q1 and Q2 from version 0 to version F1, then load the data for Q3 and Q4 for the forecast and perform consolidations for Q3 and Q4.
    What I have seen in the system is:
    a) You can copy only the non-consolidated data from one version to another, not, for example, the elimination entries etc.
    b) This means that I have to perform consolidations in version F1 for quarters Q1 and Q2 again, even though this has already been done in the actuals version (0).
    c) Is there any way out of this? There is always the possibility that the consolidated data for Q1 and Q2 does not match between actuals (version 0) and forecast (F1).

    Hi,
    We had the same issue. In your solution you would have to delete the old (Q1 and Q2) forecast data and replace it with actuals. That is too problematic.
    And you are absolutely right about the possible inconsistency between actual and forecast data.
    So we've chosen another way.
    We run separate consolidations in the actual, budget, plan and forecast versions.
    In the plan and forecast versions we do everything that is possible periodically (including currency translation), to avoid the possible influence of historical data.
    At the BEx query (and Excel workbook) level we retrieve N months of actuals, forecast data for month N+1 and plan data for month N+2. After summing up all the figures we have predicted (call it forecast) data for N+2 months.
    Of course, the actual, forecast and plan components are hidden in the report.
    This is possible because in plan and forecast we use turnovers (not balances) only. Even a balance sheet is produced without problems, not to mention P&L and Cash Flow.
    Hope it helps.
    Eugene
