Checking the Data Loaded in the InfoCube

Hi all,
With the 3.x version, and also with RSA1OLD, it's possible to see the InfoSources that can be loaded for a given data target, and their status, via the InfoSources overview.
Now, with the 2004s version, there are no InfoSources. The overview was really helpful for seeing whether all master data had been loaded.
Is there any new functionality, or is RSA1OLD still necessary?
Thanks
Best regards

Hi Andreas and Michael,
Thanks for the replies.
Andreas, not exactly; it's more about DTPs than the PSA.
I know how to view all the information one by one, but an InfoCube can depend on x master data objects (texts, attributes and hierarchies).
How can I see whether all data has been loaded (date and status)?
As Michael said, we can use the process chains to check whether all init, delta or full loads completed successfully, but we would need to check more than 5 process chains.
With the "InfoSources overview" it was possible to check with just one click whether all data had been loaded.
You're right, Michael, it's clearly more about ergonomics.
Best regards

Similar Messages

  • Need to skip the data load for one of the sub chain.

    Hi All,
In our project, we have one meta chain with many sub chains inside it. Today we don't want to run one of the sub chains. As we are on BW 7.0, the Skip option is not available. Can you please help with how to stop the data load for that particular sub chain? Please suggest.
    Thanks.

    Hi Jalina,
If this is a frequent request, you can create a custom ABAP program and use the simple logic below to skip or run parts of the meta chain. You will also have to change the links between the sub chains to "Always" instead of "Successful", so that the next chain proceeds whether or not the dependent chain was successful. If you are comfortable with events, you can also use an event to achieve this.
Program logic: create a new table in which you maintain meaningful values (indicator/description). The program should read the data from this table and, based on the values found, finish successfully or fail.
    Thanks
    Abhishek Shanbhogue
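A minimal sketch of that control-table idea, illustrated in Python rather than ABAP; the table name zchain_control and its columns are hypothetical stand-ins for the custom table the reply describes:

```python
import sqlite3

# Hedged sketch of the suggested skip/run logic. In the real scenario this
# would be an ABAP program reading a custom Z-table; here an in-memory SQLite
# table plays that role, with made-up names.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE zchain_control (chain_id TEXT PRIMARY KEY, run_flag TEXT)")
conn.execute("INSERT INTO zchain_control VALUES ('SUBCHAIN_SALES', 'X')")  # 'X' = run
conn.execute("INSERT INTO zchain_control VALUES ('SUBCHAIN_STOCK', ' ')")  # ' ' = skip

def should_run(conn, chain_id):
    """Succeed (run the sub chain) only when the maintained flag is set."""
    row = conn.execute(
        "SELECT run_flag FROM zchain_control WHERE chain_id = ?", (chain_id,)
    ).fetchone()
    return row is not None and row[0] == "X"

# In the process chain, the program's success or failure plays the role of this
# return value; with the links set to "Always", the next chain runs either way.
print(should_run(conn, "SUBCHAIN_SALES"))  # True
print(should_run(conn, "SUBCHAIN_STOCK"))  # False
```

The point of the "Always" link is visible here: the gate program may deliberately "fail" for a skipped chain, and the meta chain must still carry on.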

  • How to delete the data loaded into MySQL target table using Scripts

    Hi Experts
I created a job with a validation transformation. If the validation fails, the data that passes validation is loaded into a Pass table and the data that fails is loaded into a Failed table.
My requirement is: if any data was loaded into the Failed database table, then I have to delete the data loaded into the Passed table using a script.
In the script I have written the code as
    sql('database','delete from <tablename>');
but as it is an SQL query execution, it raises an exception for the query.
How can I delete the data loaded into the MySQL target table using scripts?
Please guide me with this error.
    Thanks in Advance
    PrasannaKumar

    Hi Dirk Venken
I got the solution: my mistake was that the query was not correct for MySQL. This works:
sql('MySQL', 'truncate world.customer_salesfact_details')
whereas this query raises the error:
sql('MySQL', 'delete table world.customer_salesfact_details')
    Thanks for your concern
    PrasannaKumar
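The syntax point is easy to demonstrate. A self-contained illustration with SQLite (MySQL parses DELETE the same way for this purpose): DELETE requires FROM, and "DELETE TABLE ..." is not valid SQL. The table name mirrors the thread, but the data is made up:

```python
import sqlite3

# Demonstrate valid vs. invalid DELETE syntax on a throwaway in-memory table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customer_salesfact_details (id INTEGER)")
conn.executemany("INSERT INTO customer_salesfact_details VALUES (?)", [(1,), (2,)])

conn.execute("DELETE FROM customer_salesfact_details")  # valid: removes all rows
print(conn.execute("SELECT COUNT(*) FROM customer_salesfact_details").fetchone()[0])  # 0

try:
    conn.execute("DELETE TABLE customer_salesfact_details")  # the failing variant
except sqlite3.OperationalError as exc:
    print("rejected:", exc)
```

Note that TRUNCATE (which the poster used) also deletes all rows in MySQL, but it is DDL rather than DML, so it cannot be rolled back in the same way as DELETE FROM.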

  • System Abort in the middle of the Data load

    Hi,
when the system aborts in the middle of a data load in Essbase 9.x, some of the data has already been loaded into the cube. I want to bring the cube back to its original state with the previous data. Please help.

    Hi MK2,
Request you to refer to Chapter 47, 'Ensuring Data Integrity'; it provides information on isolation levels (committed and uncommitted).
This will answer your question.
If you want to retain data, you need to find out the isolation levels of your cubes/applications.
    Sandeep Reddy Enti
    HCC
    http://analtyiks.blogspot.com

  • Unable to load the data from PSA to INFOCUBE

    Hi BI Experts, good afternoon.
I am loading 3 years of data (full load) from R/3 to an InfoCube.
I loaded the data month-wise, so I created 36 InfoPackages.
Everything was fine, but I got an error in Jan 2005 and Mar 2005. It is the same error in both months: Caller 01 and Caller 02 errors (meaning there are invalid characteristics in the PSA data).
So I deleted both the PSA and data target requests and loaded the data again, only into the PSA.
The data arrived in the PSA without failure.
Then I tried to load the data from the PSA to the InfoCube manually,
but it's not happening.
This message came up:
     SID 60,758 is smaller than the compress SID of cube ZIC_C03; no request booking.
Please give me a solution to this problem.
      Thanks & Regards
         Anjali

    Hi Teja,
Thanks for the good response.
How can I check whether it is already compressed or not?
Please give me a reply.
      Thanks
              Anjali

  • Error while loading the data from ODS to InfoCube

Hi,
I'm trying to load the data from an ODS to an InfoCube for a particular year,
but it says there is a source system problem.
Why is that?
Please tell me; I'll assign the points.
    rizwan

    Hi Rizwan,
you didn't mention the error message in detail. There are a few places to check:
- check whether the BW "myself" source system is active and intact, and reactivate it if necessary
- check whether the update rules are active, and reactivate them if necessary
- check whether the ODS is active, and reactivate it if necessary
    Regards,
    Lilly

  • Modify the Data Load wizard to remove the check for unique columns

    Hi,
I'm trying to build a Data Load Wizard into my application, which basically works nicely except for one thing:
I want to remove the check for unique columns and insert ALL the data the user wants to upload
(like the Data Load utility under development).
The reason is that the data I want to upload is an export of my bank statements (in XLS / CSV format),
and these exports don't have any primary key.
Using the Data Upload pages created by the APEX wizard, I would lose data whenever there were two identical payments in one day.
Is there a way to do this while keeping the pages created by the wizard for the Data Loading pages?
APEX 4.2.1

I would suggest loading the data into a view and processing the records into the real table(s) with INSTEAD OF triggers. When you add a "1=2" WHERE clause to the view, the view itself never contains any data, so everything will be loaded.
Otherwise you have to have a "real" PK value, like a timestamp or a statement_nr + line_nr.

  • Is there any setting in ODS to accelerate the data loads to ODS?

Someone mentioned that there is a setting somewhere in the ODS that makes data loads to that ODS much faster. I don't know whether such a setting exists.
    Any idea?
    Thanks

hi Kevin,
I think you are looking for transaction RSCUSTA2.
See Note 565725 - Optimizing the performance of ODS objects in BW 3.0B.
Also check Note 670208 - Package size with delta extraction of ODS data, and Note 567747 - Composite note BW 3.x performance: Extraction & loading.
Hope this helps.
    565725 - Optimizing the performance of ODS objects in BW 3.0B
    Solution
    To obtain a good load performance for ODS objects, we recommend that you note the following:
    1. Activating data in the ODS object
               In the Implementation Guide in BW Customizing, you can make various settings under Business Information Warehouse -> General BW settings -> Settings for the ODS object that improve performance when activating data in the ODS object.
    2. Creating SIDs
                   The creation of SIDs is time-consuming and may be avoided in the following cases:
a) You should not set the BEx Reporting indicator if you are only using the ODS object as a data store. Otherwise, setting this indicator causes SIDs to be created for all new characteristic values.
b) If you are using line items (for example, document number, time stamp and so on) as characteristics in the ODS object, you should mark these as 'Attribute only' in the characteristics maintenance.
               SIDs are created at the same time if parallel activation is activated (see above). They are then created using the same number of parallel processes as set for the activation. However: if you specify a server group or a special server in the Customizing, these specifications apply only to the activation and not to the creation of SIDs. The creation of SIDs runs on the application server on which the batch job is running.
3. DB partitioning on the table for active data (technical name:
               The process of deleting data from the ODS object may be accelerated by partitioning at the database level. Select the characteristic by which you want deletion to occur as the partitioning criterion. For more details on partitioning database tables, see the database documentation (DBMS CD). Partitioning is supported by the following databases: Oracle, DB2/390, Informix.
4. Indexing
               Selection criteria should be used for queries on ODS objects. The existing primary index is used if the key fields are specified. As a result, the characteristic that is accessed most frequently should be left-justified. If the key fields are only partially specified in the selection criteria (recognizable in the SQL trace), the query runtime may be optimized by creating additional indexes. You can create these secondary indexes in the ODS object maintenance.
5. Loading unique data records
               If you only load unique data records (that is, data records with a one-time key combination) into the ODS object, the load performance will improve if you set the 'Unique data record' indicator in the ODS object maintenance.

  • How to identify the status of the data load

    Hi All,
    Here is my requiremenet,
I have a process called etl_pf2 which loads data into staging tables. I have another process that does a partition exchange and moves the data from the staging tables to the online tables.
In one procedure, I need to verify whether any data load into the staging tables is happening. If yes, the partition exchange should not occur; if no, the movement should happen.
Any idea of a parameter that can be used in the procedure to check the status of the data load into the staging tables?
    Please help me.
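One common pattern (a sketch only, not the poster's actual system) is to have the load process flag its own start and end in a small control table, and have the partition-exchange procedure check that flag first. All table and column names below are hypothetical, and SQLite stands in for the real database:

```python
import sqlite3

# Gate the partition exchange on a status flag maintained by the load process.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE etl_control (process TEXT PRIMARY KEY, status TEXT)")
conn.execute("INSERT INTO etl_control VALUES ('etl_pf2', 'RUNNING')")  # set at load start

def staging_load_in_progress(conn, process="etl_pf2"):
    """True while the staging load is flagged as RUNNING."""
    row = conn.execute(
        "SELECT status FROM etl_control WHERE process = ?", (process,)
    ).fetchone()
    return row is not None and row[0] == "RUNNING"

print(staging_load_in_progress(conn))  # True: skip the partition exchange

# The load process clears the flag when it finishes:
conn.execute("UPDATE etl_control SET status = 'DONE' WHERE process = 'etl_pf2'")
print(staging_load_in_progress(conn))  # False: safe to do the exchange
```

The same check translates directly into a PL/SQL IF at the top of the exchange procedure; the essential design choice is that only the load process writes the flag and only the exchange procedure reads it.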

Thanks for the reply,
but I think the problem is with NQSServer, which crashes but does not disappear from the process list.
I tried searching for obiee+USER_MEM_ARGS and nqsserver+USER_MEM_ARGS but found only this thread.
How can JVM settings help nqsserver.exe to work properly?

  • Data loaded in the target are not reportable.

    Dear all,
In one of our cubes, data is loaded daily. A couple of days back a load failed with the ABAP dump "RAISE_EXCEPTION"; the subsequent days' loads have brought data in, but the requests are found to be not reportable.
How do we handle this?
Can anyone provide reasons for the error, and the steps to make the requests reportable again?
Experts' help will be appreciated.
Note: further details of the error show "ADD_PARTITION_FAILED".
    Regards,
    M.M

    Hi,
can you have a look in ST22 for short dumps on the day this load failed? There you can get some more details of the error; that will help us understand it.
Have a look at SAP Note 539757:
    Summary
    Symptom
    The process of loading transaction data fails because a new partition cannot be added to the F fact table. The loading process terminates with a short dump.
    Other terms
RSDU_TABLE_ADD_PARTITION_ORA, RSDU_TABLE_ADD_PARTITION_FAILED, TABART_INCONSITENCY, TSORA, TAORA, CATALOG
    Reason and Prerequisites
    The possible causes are:
    SQL errors when creating the partition.
    Inconsistencies in the Data Dictionary control tables TAORA and TSORA. There must be valid entries for the data type of the table in these tables.
    Solution
    BW 3.0A & BW 3.0B
           If there are SQL errors: analyze the SQL code in the system log or short dump and, if possible, eliminate the cause. The cause is often a disk-space problem or lock situations on the database catalog or, less frequently, the partitioning option of the ORACLE database not being installed.
               In most cases, there are inconsistencies in the Data Dictionary control tables TAORA and TSORA. As of Support Package 14 for BW 3.0B/Support Package 8 for BW 3.1C, the TABART_INCONSITENCY exception is issued in this case.
           1.) Check whether there is an entry for the corresponding data class of the table in the TAORA table.
           The TAORA table contains the assignment of data classes to data tablespaces and their attributes, for example:
           Data class   Tablespace
           DDIM         PSAPDIMD
           DFACT        PSAPFACTD
           DODS         PSAPODSD
           2.) Check whether the TSORA table contains an entry for the tablespace from the TAORA table.
           The TSORA table contains the assignment of the data tablespace to the index tablespace, for example:
           TABSPACE     INDSPACE
           PSAPDIMD     PSAPDIMD
           PSAPFACTD    PSAPFACTD
           PSAPODSD     PSAPODSD
               For more information, see Notes 502989 and 46272.

  • Last Data packet for the data load.

    Hi All,
How can I find the last data packet loaded to the target for a data load in SAP BI? In which table?
    Thank you,
    Adhvi

    Hi Adhvirao,
When you are loading data from ECC, go to the monitor -> Details -> check the data packets that failed. Double-click on a failed data packet and at the bottom you will be able to see the IDoc number for the ECC side; the pending IDocs in ECC contain the data. You can execute them manually to the BI side through BD87.
    Thanks,
    Deepak

  • Optimize the data load process into BPC Cubes on BW

    Hello Gurus,
We would like to know how to optimize the data load process. Our scenario is that we have the ECC classic ledger, and we are looking for the best way to load data into the BW InfoCubes from an ECC source.
To complement the question above: from which tables must the data be extracted and then parsed to BW so that the consolidation is done? Also, do any other modules, like FI or EC-CS, have to be considered for this?
    Best Regards,
    Rodrigo

    Hi Rodrigo,
    Have you looked at the BW Business Content extractors available for the classic GL? If not, I suggest you take a look. BW business content provides all the business logic you will normally need to get data out of ECC and into BW for pretty much every ECC application component in existence: [http://help.sap.com/saphelp_nw70/helpdata/en/17/cdfb637ca5436fa07f1fdc0123aaf8/frameset.htm]
    Ethan

  • HT1386 I have an older iPhone (3gs) and need to upgrade to a newer phone (4S).  I need to get my NOTES, CALENDAR, CONTACTS, PICTURES, etc backed up on iTunes so I can get that data loaded onto the new phone.  But not sure how to do that.

I have an older iPhone (3GS) and need to upgrade to a newer phone (4S). I need to get my NOTES, CALENDAR, CONTACTS, PICTURES, etc. backed up in iTunes so I can get that data loaded onto the new phone, but I'm not sure how to do that. When I open iTunes it has a button that says "Back Up iPhone", but I'm not sure what that does. When I go into the sync options it says I have another user account and asks me if I want to merge or replace. I'm assuming it's trying to tell me I have an older iTunes library, but I don't know that. Geez, maybe people over 60 shouldn't have iPhones; iTunes just baffles me.

    http://manuals.info.apple.com/en_US/iphone_user_guide.pdf

  • How to find the data loaded from r/3 to bw

hi,
how can I find out whether the data loaded from R/3 to BW is correct? I am not able to find which field in the query is connected to which field in R/3, i.e. where the data comes from in R/3. Is there any process to find out which field and table the data is coming from? Please help.
Thanks in advance to you all

Hi Veda ... the mapping between R/3 fields and BW InfoObjects takes place in the transfer rules. Other transformations could take place in the update rules.
So you could proceed this way: look at the InfoProvider data model and see whether the query performs any calculations (even with virtual key figures / characteristics). Then go back to the update rules and search for other calculations / transformations. Finally, there are the transfer rules and possibly DataSource / extraction enhancements.
As you can easily see, there are many points where you have to look; it's quite complex work, but very useful.
Once you have identified all the mappings / transformations, see whether the BW data matches R/3 (taking the calculations into account).
    Good job
    GFV

  • How to automate the data load process using data load file & task Scheduler

    Hi,
I am automating the loading of data into a Hyperion Planning application with the help of a Data_Load.bat file and Task Scheduler.
I have created the Data_Load.bat file, but I am unable to complete the rest of the process.
Could you help me automate the data load process using the Data_Load.bat file and Task Scheduler, or tell me what other files are required to achieve this?
    Thanks

To follow up on your question: are you using MaxL scripts for the data load?
If so, I have seen an issue where, if the batch file (e.g. load_data.bat) does not contain the full path to the MaxL script, running it through Task Scheduler will appear to work but the log and/or error file will not be created; in other words, the batch claims it ran from Task Scheduler although it didn't do what you needed it to.
If you are using MaxL, use this in the batch:
"essmsh C:\data\DataLoad.mxl" (or you can also use the full path for essmsh; either way works). The only reason the MaxL might then not work is if the batch is not updated to reflect all the MaxL PATH changes, or if you need to update your environment variables so that the essmsh command works in a command prompt.
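A minimal sketch of such a wrapper batch file; the essmsh install path and the log path below are examples only, so substitute your own:

```
@echo off
rem Use full paths to both essmsh and the MaxL script so Task Scheduler
rem needs no working-directory or PATH setup. Paths are hypothetical examples.
"C:\Hyperion\products\Essbase\EssbaseClient\bin\essmsh.exe" "C:\data\DataLoad.mxl" > "C:\data\DataLoad.log" 2>&1
```

Redirecting stdout and stderr into an explicit log file also makes the "ran but produced nothing" failure mode visible the next morning.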
