Data source to Virtual Cube

Hello friends,
I have a virtual cube with services based on a function module.
How can I find out where the data in this virtual cube is coming from?
I would like to see the source data that feeds this virtual cube.
Thanks
Tony

Hi Tony,
For a VirtualProvider with services, there is a function module in the BW system itself. All the connection handling and data selection is done in that FM, so you do not have a separate DataSource for it. Just look into the FM for the connection and extraction logic.
Thanks and Regards
Subray Hegde

Similar Messages

  • How to load data from a virtual cube with services

    Hello all,
    we have set up a virtual cube with services and created a BEx report to get the data from an external database. That works fine. The question is now:
    Is it somehow possible to "load" the data from this virtual cube with services (I know there is not really any data in it...) into another InfoCube?
    If that is possible, can you please give me some guidance on how to set up this scenario?
    Thanks in advance
    Jürgen

    Hi:
    I don't have a system in front of me, so try this. I know it works for a RemoteCube.
    Right-click on the cube and select Generate Export DataSource.
    If you can do this successfully, go to the Source Systems tab and select the BW system itself. There, right-click and select Replicate DataSources.
    Next, go to InfoSources and click Refresh. Take the name of the virtual cube, add 8 as a prefix, and search for that InfoSource.
    If you can see it, you can load data from this cube to any target you want, just as you would from an ODS.
    ELSE:
    See if you can create an InfoSpoke on the virtual cube (transaction RSBO). There you can load into a database table, and from this table you can then create a DataSource, etc.
    ELSE:
    Create a query, save the result as a CSV file, and load it wherever you want. This is the most cumbersome option.
    Good luck
    Ram Chamarthy

  • What are the data sources for the 0FIGL_V10 cube (Balance Sheet and Profit & Loss)?

    What are the DataSources for the 0FIGL_V10 (G/L (New): Balance Sheet and Profit and Loss) cube, and can we install Business Content for this?
    Please help me out.
    thanks,
    sapsdnhelp

    Hi,
    Check this:-------
    Urgent: Relevant Master Data Data Sources for FI-GL  & FI-AP
    http://help.sap.com/saphelp_nw04/helpdata/en/04/47a46e4e81ab4281bfb3bbd14825ca/frameset.htm
    Regards,
    Suman
    Edited by: Suman Chakravarthy on Sep 28, 2011 7:33 AM

  • Service Management: contract data sources, extractors and cubes

    Hi fellows: I have never worked in Service Management before, and I have been asked to locate the extract structures, DataSources and cubes for some reports. The reports cover contracts, contract renewals, contract repairs, contract units end of service, contract customers, the contract customer report card, etc. Are there any standard extractors or DataSources for contracts or Service Management?
    Contracts     PLACE ID
    Contracts     
    Contracts     MODEL ID
    Contracts     SERIAL ID
    Contracts     Contract Start Date
    Contracts     Contract End Date
    Contracts     LINE VALUE
    Contracts     Contract ID
    Contracts     Contract Version
    Contracts     Invoice Date
    Contracts     Invoice ID
    Contracts     NAME
    Contracts     LOCATION
    Contracts     Address
    Contracts     City
    Contracts     Global Name
    Contracts     Name
    Contracts     Place ID
    Contracts     State
    Contracts     Zip
    Contracts     Sales Group B
    Contracts     Sales Group C
    Contracts     Contract ID
    Contracts     Version
    Contracts     Term of Contract
    Contracts     PO Number
    Contracts     Product Location
    Contracts     Phone
    Contracts     Model ID
    Contracts     Serial ID
    Contracts     Part ID
    Contracts     Quantity
    Contracts     Service Plan
    ***Contract Renewals:
    Contract Version
    Status of New Contract
    New PG Description
    Event Code
    Event Created
    New Price
    New Travel Price
    Old Price
    Old Travel Price
    ****Contract repairs:
    Problem Resolution
    Received Date
    Repair Center Info: name, address, city, state, postal code, country
    Scheduled Ship Date
    Serial # Received
    Serial # Registered
    Serial # Shipped
    Service Repair (Work) Order #
    Technician’s name
    Tracking #
    Account Indicator

    This may help you; open the folders on the left:
    http://help.sap.com/saphelp_nw70/helpdata/en/ed/9dfb3b699bde74e10000000a114084/frameset.htm

  • Max data pull from Virtual Cube - is this a setting?

    We have a user running a query against a RemoteCube in our BW system, and they are hitting a "maximum data" limit on data from this remote cube. Is this a setting for this cube or a global one, and can it be modified?
    Thanks,
    Ken Little
    RJ Reynolds Tobacco

    Hi,
    MAXSIZE = maximum size of an individual data packet in KB.
    The individual records are sent in packages of varying sizes during the data transfer to the Business Information Warehouse. With this parameter you determine the maximum size of such a package, and therefore how much main memory may be used to build the data package. SAP recommends a data package size between 10 and 50 MB.
    https://www.sdn.sap.com/irj/sdn/directforumsearch?threadid=&q=cube+size&objid=c4&daterange=all&numresults=15
    MAXLINES = upper limit for the number of records per data packet. The default setting is MAXLINES = 100,000.
    The maximum main memory requirement per data packet is around:
    memory requirement = 2 * MAXLINES * 1000 bytes,
    i.e. 200 MB with the default setting.
    The formula for calculating the number of records in a data packet is:
    packet size = MAXSIZE * 1000 / transfer structure size (ABAP length), but not more than MAXLINES.
    That is, if MAXLINES is smaller than the result of the formula, then only MAXLINES records per packet are transferred into BW. The size of the data packet is the lower of MAXSIZE * 1000 / transfer structure size (ABAP length) and MAXLINES.
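    As a worked example, the packet-size rule can be sketched in a few lines. This is an illustrative calculation only, not SAP code, and the sample parameter values are made up:

```python
# Illustrative sketch of the MAXSIZE/MAXLINES packet-size rule:
# records per packet = MAXSIZE * 1000 / transfer structure size,
# capped at MAXLINES.

def packet_records(maxsize_kb: int, maxlines: int, ts_size_bytes: int) -> int:
    """Number of records per data packet under the rule above."""
    return min(maxsize_kb * 1000 // ts_size_bytes, maxlines)

# Example: MAXSIZE = 20000 KB, MAXLINES = 100000, a 500-byte
# transfer structure -> 40000 records per packet.
print(packet_records(20000, 100000, 500))   # 40000

# Approximate main-memory bound per packet from the note above:
# 2 * MAXLINES * 1000 bytes = 200 MB at the default MAXLINES.
print(2 * 100000 * 1000)                    # 200000000 bytes
```

    With a wider transfer structure the MAXSIZE term shrinks, so either bound can end up being the effective limit.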
    Go to transaction RSCUSTV6 and set the package size there (press F1 on the field for more information, and see OSS Notes 409641 'Examples of packet size dependency on ROIDOCPRMS' and 417307 'Extractor Package Size Collective Note'). Also check SAP Note 919694.
    Alternatively, in your InfoPackage, open the data packet settings from the Scheduler menu; here you can specify your data packet size.
    On the R/3 side, go to transaction SBIW -> General Settings -> Maintain Control Parameters for Data Transfer and maintain the entries in the table. This setting applies regardless of source system, i.e. to all DataSources. The values can be reduced in BW via the InfoPackage scheduler settings, but they cannot be increased beyond what is set on the R/3 side.
    If you want to change the values at DataSource level instead, use the InfoPackage scheduler's data transfer defaults and change the values there. Before changing anything, keep the SAP-recommended parameters in mind.
    Hope this helps.
    Best Regards,
    VVenkat..

  • Query Data cached? (Virtual Cube)

    Hi folks,
    I have some problems with a query object that gets its data from a virtual cube. The virtual cube is based on a 3.x InfoSource and reads data from a table in the ERP system.
    When I call the query from VC, everything works fine and current data is shown. But if I manipulate data in the table the virtual cube points to and send a refresh event to the query object, the manipulated data is not shown; it always returns the data fetched on the first call. If I refresh the whole application in the browser (via F5), the manipulated data is shown. I disabled the cache mode in RSRT for this query, but it doesn't help.
    Is there any way to get the current data by just sending a refresh action and calling the query object again, without reloading the whole application? Any ideas?
    Points will be awarded for useful information.

    Hello,
    The reason the manipulated data does not show up in the query even after a refresh is that the cache for the VirtualProvider does not get reset the way it would for a normal InfoCube. The provider does not know when to reset its cache, even when the underlying data changes.
    The way we have worked around this is a process chain that runs on a frequent basis and executes the function module RSDMD_SET_DTA_TIMESTAMP for the virtual cube in question.
    Thanks
    Dharma.

  • PP DataSource 2LIS_04_PEARBPL: which cube/DSO?

    Hi Everyone,
    In PP, which InfoCube/DSO does the data from DataSource 2LIS_04_PEARBPL (Reporting Points for Repetitive Manufacturing) flow into?
    Could you please guide me.
    Thanks,
    Bhima

    Hi,
    There is no standard DSO/cube available in BI for the mentioned DataSource, so you need to create custom data targets.
    Regards
    Ram.

  • DataSource on top of virtual cube: s_r_infoprov exception

    hi experts,
    I have generated an export DataSource on a virtual cube. The virtual cube is based on a function module.
    When I display data in the virtual cube, it shows all the data. When I extract the data from it through a DTP, it works well.
    But when I use the export DataSource to pull the data from the virtual cube, it does not pull any data.
    It throws an exception at the following place:
    CALL FUNCTION 'RSDRI_INFOPROV_READ'
           EXPORTING
                                                            "begin qyh030513
             i_infoprov             = l_infoprov  "'/CPMB/ASFIN'
    *        i_infoprov             = do_util->do_appl_info_m-MULTIPROV  "'/CPMB/ASFIN'
                                                              "end qyh030513
             i_th_sfc               = lth_sfc
             i_th_sfk               = lth_sfk
             i_t_range              = lt_range
             i_rollup_only          = space
             "i_use_db_aggregation   = abap_true "abap_true "RS_C_TRUE
             i_use_db_aggregation   = IF_DB_AGGREGATE "abap_true "RS_C_TRUE
             i_use_aggregates       = abap_true "abap_true "RS_C_TRUE
             i_packagesize          = i_packagesize
             i_authority_check      = space
             i_currency_conversion  = space
           IMPORTING
             e_t_data               = et_data
             e_split_occurred       = e_split_occurred
             e_end_of_data          = e_end_of_data
             e_stepuid              = l_stepuid
           CHANGING
             c_first_call           = c_first_call
           EXCEPTIONS
             illegal_input          = 1
             illegal_input_sfc      = 2
             illegal_input_sfk      = 3
             illegal_input_range    = 4
             illegal_input_tablesel = 5
             no_authorization       = 6
             illegal_download       = 8
             illegal_tablename      = 9
             OTHERS                 = 11.
    When we step into this function module in the debugger, the following code is there:
    STATICS: s_r_infoprov   TYPE REF TO cl_rsdri_infoprov.
       CLEAR: e_t_msg.
       IF i_clear = rs_c_true.
         CLEAR s_r_infoprov.
         RETURN.
       ENDIF.
    IF s_r_infoprov IS NOT INITIAL AND c_first_call = rs_c_true.
    *   nested call of RSDRI_INFOPROV_READ
         MESSAGE e882(dbman) RAISING illegal_input.
       ENDIF.
    The problem is with the value of s_r_infoprov. When I execute the DataSource in RSA3, the value is {O:222*\CLASS=CL_RSDRI_INFOPROV}, but when I execute the virtual cube directly, s_r_infoprov is initial.
    Because of the value {O:222*\CLASS=CL_RSDRI_INFOPROV}, the code above is executed and raises the exception.
    Experts, please guide me on how to solve this. Does a DataSource on top of a virtual cube simply not work?
    thank you
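    The guard in the snippet above can be pictured with a plain sketch outside ABAP. This is illustrative only, not SAP code: a module-level variable plays the role of the STATICS reference, and a new "first call" that arrives while a previous read is still registered is rejected as a nested call, which is consistent with s_r_infoprov already being bound on the export DataSource path.

```python
# Illustrative sketch of the nested-call guard. _s_r_infoprov stands in
# for the ABAP STATICS reference; names mirror the snippet above, not
# real SAP internals.

class NestedCallError(Exception):
    pass

_s_r_infoprov = None   # persists across calls, like a STATICS variable

def infoprov_read(first_call: bool, clear: bool = False):
    global _s_r_infoprov
    if clear:                       # i_clear = rs_c_true: reset the state
        _s_r_infoprov = None
        return
    if _s_r_infoprov is not None and first_call:
        # a fresh "first call" while state is still held -> nested call
        raise NestedCallError("nested call of RSDRI_INFOPROV_READ")
    _s_r_infoprov = object()        # register the open read

infoprov_read(first_call=True)      # first read: fine, state registered
try:
    infoprov_read(first_call=True)  # second "first call": rejected
except NestedCallError as e:
    print(e)                        # nested call of RSDRI_INFOPROV_READ
```

    In this model, only clearing the state (or continuing with first_call false) lets a subsequent read through.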

    Hi Sander,
    Thank you for the reply.
    I am trying to understand why the virtual cube does not work when it is generated as an export DataSource, yet works perfectly when connected directly as a source to a DSO or cube.
    The problem is with the s_r_infoprov value: it is populated differently, which is why there is an exception when using the DataSource (the virtual cube generated as an export DataSource).
    If this functionality is not supposed to work, why does SAP offer an option to generate an export DataSource from a virtual cube?
    Thank you all for the valuable input. Somehow, something seems to be missing...

  • Objects are not referenced in cube data source view. Large model.

    We are looking at a large model: 100+ measure groups, 200+ dimensions, one cube, deployed to multiple customers in production as part of custom-developed software.
    In Data Tools 2010 I am looking at the cube on SQL 2008 and SQL 2008 R2 (EE, DE).
    In the cube's data source diagram we observe 50+ objects that have no links to or from them.
    1) Some of these I was unable to find in the cube's DMVs at all; it turns out they are present in the data source that is in the Data Source Views section. How did these end up in the cube's data source model? They are not available in the Dimension Usage window at all.
    2) Others have no links from or to them in the cube's data source diagram, yet Dimension Usage clearly shows these objects take part in relationships, both regular and many-to-many, depending on the object in question.
    We are trying to identify which objects we can trim from the model. I am not sure what I am looking at in this case: Data Tools has trouble rendering a diagram for 200+ dimensions and 100+ measure groups (it is a large existing model), and this artifact should not be used to decide which dimensions are obsolete and can be decommissioned. We are looking at a cube that has over 40 N:M relationships to just one fact, not counting regular relationships.
    Am I looking at a feature, or at an issue with the model itself that has somehow developed in this particular cube? It is not something I have come across before.
    thanks

    Hi,
    According to your description, you are designing a large SQL Server Analysis Services cube, and the problem is that you are unable to view the relationships in the data source view diagram, right?
    You said that you can see the relationships in the Dimension Usage window. In your scenario, can you see the relationships under the Tables pane when expanding a table's Relationships folder?
    In general, relationships are defined automatically if they were defined in the cube's underlying data source, while the dimension relationships between cube dimensions and measure groups are defined manually in Dimension Usage. You can also define relationships manually in the data source view; please refer to the link below for details.
    http://technet.microsoft.com/en-us/library/ms178448.aspx
    Regards,
    Charlie Liao
    TechNet Community Support

  • FI, CO-PA, PS, HR cubes and data sources?

    hi experts,
    Could you please tell me what the standard DataSources, DSOs and cubes are for the FI, CO-PA, PS and HR modules?
    Please point me to some documentation on these modules.
    thanks in advance
    regards

    Hi Venu,
    Check below links which may help you...
    Differences between LIS,LO,CO/PA and FI-SL extractors
    Re: how to find standard reports for Modules like MM,SD,FI
    CO-PA
    http://help.sap.com/saphelp_nw04/helpdata/en/53/c1143c26b8bc00e10000000a114084/frameset.htm
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/fb07ab90-0201-0010-c489-d527d39cc0c6
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/1910ab90-0201-0010-eea3-c4ac84080806
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/ff61152b-0301-0010-849f-839fec3771f3
    http://help.sap.com/saphelp_nw04/helpdata/en/c9/fe943b2bcbd11ee10000000a114084/frameset.htm
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/com.sap.km.cm.docs/library/biw/g-i/how%20to%20co-pa%20extraction%203.0x
    CO-PA - Retraction
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/com.sap.km.cm.docs/library/biw/g-i/how%20to%20co-pa%20-%20retraction
    FI-SL
    http://help.sap.com/saphelp_nw04/helpdata/en/c9/fe943b2bcbd11ee10000000a114084/frameset.htm
    For HR tables and Transactions
    http://www.planetsap.com
    http://www.atomhr.com/
    http://www.sapcookbook.com/preview_hr_questions.htm
    http://sap.ittoolbox.com/topics/t.asp?t=302&p=302&h1=302
    http://help.sap.com/saphelp_470/helpdata/EN/8a/6a46347969e94be10000009b38f83b/frameset.htm
    Re: What is ABAP-HR?
    HR_Reports
    Regards,
    KK.

  • Virtual cube with services read from Multicube?

    Hello All.
    We have logically partitioned the balance sheet cube 0FIGL_C01 into 3 new cubes. I then found that I also had to partition the virtual cube 0FIGL_VC1 into 3 new virtual cubes (I copied the standard function module to change the data origin).
    Then I included the 4 virtual cubes in one single MultiCube.
    When I run a query on the multicube, I get a short Dump.
    My questions are:
    1. Has anyone done the same logical partitioning for the balance sheet before?
    2. Is it possible to use a MultiCube as the source of data for a virtual cube with services, using the balance sheet virtual cube function module?
    Thank you all for your help.
    Regards,
    Alfonso.

    A basic cube can only be a source of data for a virtual cube or for a MultiCube.
    All cubes should have at least one characteristic in common when you add them to a MultiCube.
    check this

  • Error reading a virtual cube using the LISTCUBE transaction

    Hi ,
    I am trying to execute the LISTCUBE transaction to get the data of the virtual cube, but it gives me an error message:
    1) Exception 'CX_RSDAI_ACCESS_ERROR' caught
    2) Error reading the data of InfoProvider XXXXXinfoprovider
    Any suggestions to resolve this issue are appreciated.
    Thanks in advance

    Hi,
    Just to share some thoughts on VirtualProviders (VP): sometimes there are selection criteria that are not marked as required fields, but that the VP needs in order to pull data successfully.
    For example, if the VP is based on a planning area, you always need to include the planning version value, even if it is not set as a required field on the selection screen.
    You also need to check all of the output fields before getting the result of your LISTCUBE run.
    That said, your error looks more like an authorization issue for your ID on the source system.
    Can you try pulling the data again and, after you hit the error, run SU53 to check for authorization failures?
    Hope this helps.
    - jeff

  • Real-time cubes, RDA and virtual cubes

    Hi Experts,
    Can anybody explain real-time cubes, real-time data acquisition and virtual cubes, and the differences between them?
    Points will be assigned.
    In advance thanks
    Regards
    Tg

    Hi Tg
    Real-Time Data Acquisition (BI 2004s)
    Real-time data acquisition supports tactical decision-making. It also supports operational reporting by allowing you to send data to the delta queue or PSA table in real time.
    You might have complex reports in your BI system that drive decisions based on data from your transactional system. Sometimes (quarter closure, month end, year end...) a single change in the transactional data can change your decision, so it is important that every record of the company's transactional data is reflected in the BI system at the same time as it is updated in the transactional system.
    Using the new Real-time Data Acquisition (RDA) functionality in the NetWeaver BI 2004s system, you can now load transactional data into the SAP BI system every single minute. If your business demands real-time data in SAP BI, you should start exploring RDA.
    The source system for RDA can be an SAP system or any non-SAP system. SAP delivers most of the standard DataSources as real-time enabled.
    The other alternative for RDA is Web services; although Web services are usually intended for non-SAP systems, for testing purposes a Web service (RFC) can be implemented in an SAP source system.
    An example would be a production line where the business wants information about defective products in real time, so that production can be stopped before more defective goods are produced.
    In the source system, the BI service API must be at least version Plug-In-Basis 2005.1, or Plug-In 2004.1 SP10 for 4.6C source systems.
    Refer:
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/230d95df-0801-0010-4abb-ace1b3d197fd
    http://help.sap.com/saphelp_nw2004s/helpdata/en/42/f80a3f6a983ee4e10000000a1553f7/content.htm
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/230d95df-0801-0010-4abb-ace1b3d197fd
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/3db14666-0901-0010-99bd-c14a93493e9c
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/3cf6a212-0b01-0010-8e8b-fc3dc8e0f5f7
    http://help.sap.com/saphelp_nw04s/helpdata/en/52/777e403566c65de10000000a155106/content.htm
    https://www.sdn.sap.com/irj/sdn/webinar?rid=/library/uuid/230d95df-0801-0010-4abb-ace1b3d197fd
    Real time cube
    Definition
    Real-time InfoCubes differ from standard InfoCubes in their ability to support parallel write accesses. Standard InfoCubes are technically optimized for read accesses to the detriment of write accesses.
    Use
    Real-time InfoCubes are used in connection with the entry of planning data. For more information, see:
    ●      BI Integrated Planning: InfoProvider
    ●      Overview of Planning with BW-BPS
    The data is written to the InfoCube by multiple users simultaneously. Standard InfoCubes are not suitable for this; use standard InfoCubes for read-only access (for example, when reading reference data).
    Structure
    Real-time InfoCubes can be filled with data using two different methods: using the transaction for entering planning data, and using BI staging, whereby planning data cannot be loaded simultaneously. You have the option to convert a real-time InfoCube. To do this, in the context menu of your real-time InfoCube in the InfoProvider tree, choose Convert Real-Time InfoCube. By default, Real-Time Cube Can Be Planned, Data Loading Not Permitted is selected. Switch this setting to Real-Time Cube Can Be Loaded With Data; Planning Not Permitted if you want to fill the cube with data using BI staging.
    When you enter planning data, the data is written to a data request of the real-time InfoCube. As soon as the number of records in a data request exceeds a threshold value, the request is closed and a rollup is carried out for this request in defined aggregates (asynchronously). You can still rollup and define aggregates, collapse, and so on, as before.
    Depending on the database on which they are based, real-time InfoCubes differ from standard InfoCubes in the way they are indexed and partitioned. For an Oracle DBMS, this means, for example, no bitmap indexes for the fact table and no partitioning (initiated by BI) of the fact table according to the package dimension.
    Reduced read-only performance is accepted as a drawback of real-time InfoCubes, in favor of the option of parallel (transactional) writing and improved write performance.
    Creating a Real-Time InfoCube
    When creating a new InfoCube in the Data Warehousing Workbench, select the Real-Time indicator.
    Converting a Standard InfoCube into a Real-Time InfoCube
    Conversion with Loss of Transaction Data
    If the standard InfoCube already contains transaction data that you no longer need (for example, test data from the implementation phase of the system), proceed as follows:
           1.      In the InfoCube maintenance in the Data Warehousing Workbench, from the main menu, choose InfoCube -> Delete Data Content. The transaction data is deleted and the InfoCube is set to inactive.
           2.      Continue with the same procedure as with creating a real-time InfoCube.
    Conversion with Retention of Transaction Data
    If the standard InfoCube already contains transaction data from the production operation that you still need, proceed as follows:
    Execute ABAP report SAP_CONVERT_NORMAL_TRANS under the name of the corresponding InfoCube. Schedule this report as a background job for InfoCubes with more than 10,000 data records because the runtime could potentially be long.
    Integration
    The following typical scenarios arise for the use of real-time InfoCubes in planning:
    1st Scenario:
    Actual data (read-only access) and planning data (read-only and write access) have to be held in different InfoCubes. Therefore, use a standard InfoCube for actual data and a real-time InfoCube for planning data. Data integration is achieved using a multi-planning area that contains the areas that are assigned to the InfoCubes. Access to the two different InfoCubes is controlled by the Planning area characteristic that is automatically added.
    2nd Scenario:
    In this scenario, the plan and actual data have to be together in one InfoCube. This is the case, for example, with special rolling forecast variants. You have to use a real-time InfoCube, since both read-only and write accesses take place. You can no longer directly load data that has already arrived in the InfoCube by means of an upload or import source. To be able to load data nevertheless, you have to make a copy of the real-time InfoCube and flag it as a standard InfoCube and not as real-time. Data is loaded as usual and is subsequently updated to the real-time InfoCube.
    Assign points if helpful
    Thnx
    Anil.V

  • Data load issue with export data source - BW 3.5

    Hi,
    We are facing an issue loading data via an export DataSource.
    We have created an export DataSource for the 0PCA_C01 cube and use it to load data into another, custom cube. The scenario works fine on the development server.
    But after transporting the objects to the quality server, no data is loaded into the custom target cube; the extraction returns zero records. All transports are OK, and we generated the export DataSource in quality before the transports. We also regenerated the export DataSource after the transport and activated the InfoSource and update rules via the RS* programs. Every object is active, but no data is extracted.
    RSA3 for the 80PCA_C01 DataSource does not extract any records in quality, while records are extracted in development. We are on BW 3.5, patch level 19.
    Please guide us to resolve the issue.
    Thanks,
    Aditya

    Hi
    Make sure that you have the relevant roles and authorizations in the quality system.
    You have to transport the source cube first and then generate the export DataSource in QAS. Then replicate the DataSources for the BW QAS source system and make sure the replicated DataSource exists in QAS. Only then can you transport the update rules for the second cube.
    Hope this helps and is clear.

  • Error while loading data from DS into cube

    Hello All
    I am getting the error below while loading data from a DataSource into the cube:
    Record 1: Time conversion from 0CALDAY to 0FISCPER (fiscal year) failed with value 20070331
    What am I supposed to do? I need your input in this regard.
    Regards
    Rohit

    Hi
    Simply map 0CALDAY to 0FISCPER (you have already done this). You might have forgotten to map the fiscal year variant in the update rules. If it comes from the source, just map it; if it does not come from the source, set it as a constant in the update rules and give it a value:
    If your business year runs April to March, use 'V3'.
    If your business year runs January to December, use 'K4'.
    Activate your update rules, delete the old data and upload again.
    Hope it helps
    Thanks
    Teja
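    The conversion the reply describes can be sketched outside SAP. This is illustrative only, not SAP code; in particular, the V3 year-naming convention used here (the fiscal year is named after the calendar year in which it starts) is an assumption of the sketch, so verify it against your system's fiscal year variant settings:

```python
# Illustrative sketch: mapping a 0CALDAY value (YYYYMMDD) to 0FISCPER
# (YYYYPPP) for two common fiscal year variants. K4 = calendar year
# (Jan-Dec); V3 = April-March, with the fiscal year assumed to be
# named after the calendar year in which it starts.

def calday_to_fiscper(calday: str, variant: str) -> str:
    year, month = int(calday[:4]), int(calday[4:6])
    if variant == "K4":              # calendar year: period = month
        return f"{year}{month:03d}"
    if variant == "V3":              # April-March fiscal year
        if month >= 4:               # Apr..Dec -> periods 1..9
            return f"{year}{month - 3:03d}"
        return f"{year - 1}{month + 9:03d}"  # Jan..Mar -> 10..12, prior FY
    raise ValueError(f"variant {variant} not modelled in this sketch")

# The value from the error message:
print(calday_to_fiscper("20070331", "V3"))  # 2006012 (period 12 of FY 2006)
print(calday_to_fiscper("20070331", "K4"))  # 2007003 (period 3 of 2007)
```

    The point is that without a variant, BW cannot decide which fiscal year and period 20070331 belongs to, which is why the conversion fails.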
