How to manage a huge amount of data in OBIEE 11g?

Hi all Experts,
I have a business requirement for a bank where I need to report on around 2 crore (20 million) accounts across different product lines, with 50 columns, from a staging table generated from the data warehouse.
** I don't need any modeling or business model criteria (dimensions and facts); the report runs as a direct database request.
How can I handle this and make the report output faster?
*** If I create the same report from an RPD-based subject area and presentation tables (with filters to reduce the number of rows), it never returns any result and fails with errors.
Any suggestion will help a lot.
Thanks in advance,
Raj M

"if the product does not peform"...
Let's put the problem into a perspective that guys (which I assume we all are for the sake of the argument): cars.
Using your direct database request as a starting point it's a bit like trying to use a Ferrari to pull a camper. Yes, it will work, but it's the wrong approach. Likewise (or conversely) it's a pretty bad idea to take a Land Rover Defender around the Nurburg Ring.
In both cases "the product" (i.e. the respective cars) "will not perform" (i.e. not fulfill the intended duties the way you may want them to).
I never get why everyone always bows to the most bizarre requests like "I MUST be able to export 2 million rows through an analysis exposed on a dashboard" or "This list report must allow scrolling through 500k records per day across 300 columns".

Similar Messages

  • How to Extract Huge SAP HR Data into Excel?

    Dear Experts & Gurus,
    Need your Valuable suggestions regarding this scenario.
    I need to extract the entire SAP HR data set (modules OM, PA, TM, PY and PD) into Excel sheets because our client is migrating to another ERP. The data volume is very large, as they have been using SAP for 14 years. Can we use SE16 to extract the data? Will it be able to handle extracting from the HR tables, or are there other standard ways to extract this much data without dumps or errors?
    Appreciate if you provide some useful suggestions or tips  for this scenario.
    Thank you.
    Regards
    Vicky

    Dear Liran,
    Thanks a lot for your valuable suggestions; they have really given me a clearer idea now.
    First:
    What's the purpose of this data download?
    Do you wish to transfer the data from SAP into the new ERP system? Or do you need the download as a backup?
    Yes, transfer the data from SAP to the new ERP system; it is not for backup.
    IDocs/ALE from SAP to the new ERP?
    Second:
    I think payroll results are stored in cluster tables, so here we need an ABAPer to write a program. For OM, PA and TM, can we extract via SE16/SE16N?
    Thank you very much in advance.
    Regards
    Vicky

  • How can we transfer a huge amount of data from a database server to XML format?

    hi guru
    How can we transfer a huge amount of data from a database server into XML format?
    regards
    subhasis.

    Create the ABAP coding
    First we create the internal table TYPES and DATA definitions that we want to fill with the XML data. I have declared the table "it_airplus" like the structure from the XML file definition for a better overview, because it is a long XML definition (see the XSD file in the sample ZIP container from airplus.com).
    *the declarations
    TYPES: BEGIN OF t_sum_vat_sum,
             a_rate(5),
             net_value(15),
             vat_value(15),
           END OF t_sum_vat_sum.
    TYPES: BEGIN OF t_sum_total_sale,
             a_currency(3),
             net_total(15),
             vat_total(15),
             vat_sum TYPE REF TO t_sum_vat_sum,
           END OF t_sum_total_sale.
    TYPES: BEGIN OF t_sum_total_bill,
             net_total(15),
             vat_total(15),
             vat_sum TYPE t_sum_vat_sum,
             add_ins_val(15),
             total_bill_amount(15),
           END OF t_sum_total_bill.
    TYPES: BEGIN OF t_ap_summary,
             a_num_inv_det(5),
             total_sale_values TYPE t_sum_total_sale,
             total_bill_values TYPE t_sum_total_bill,
           END OF t_ap_summary.
    TYPES: BEGIN OF t_ap,
             head    TYPE t_ap_head,
             details TYPE t_ap_details,
             summary TYPE t_ap_summary,
           END OF t_ap.
    DATA: it_airplus TYPE STANDARD TABLE OF t_ap.
    *call the transformation
    CALL TRANSFORMATION zfi_airplus
         SOURCE XML l_xml_x1
         RESULT xml_output = it_airplus.
    See the complete report: Read data from XML file via XSLT program
    Create the XSLT program
    There are two options to create an XSLT program:
    Tcode SE80 -> create/choose a package -> right-click on it -> Create -> Others -> XSL Transformation
    Tcode XSLT_TOOL
    For a quick overview you can look at the SXSLTDEMO* programs.
    In this example we already use the three XSLT options explained later.
    As you can see, we define XSL and ASX (ABAP) tags to handle the ABAP and XML variables/tags.
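Stepping back from the SAP-specific CALL TRANSFORMATION machinery, the underlying pattern — read rows from a database and serialize them as XML — is generic. Here is a minimal sketch in Python, using an in-memory SQLite table and invented column names purely for illustration:

```python
import sqlite3
import xml.etree.ElementTree as ET

def rows_to_xml(conn, query, root_tag="rows", row_tag="row"):
    """Run a query and serialize the result set as a simple XML document."""
    cur = conn.execute(query)
    cols = [d[0] for d in cur.description]   # column names become element tags
    root = ET.Element(root_tag)
    for row in cur:
        elem = ET.SubElement(root, row_tag)
        for col, val in zip(cols, row):
            ET.SubElement(elem, col).text = "" if val is None else str(val)
    return ET.tostring(root, encoding="unicode")

# demo with an in-memory database
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE flights (num TEXT, price REAL)")
conn.executemany("INSERT INTO flights VALUES (?, ?)",
                 [("LH400", 420.0), ("BA117", 380.5)])
xml_doc = rows_to_xml(conn, "SELECT num, price FROM flights")
print(xml_doc)
```

For a truly huge result set you would stream row by row with an incremental XML writer instead of building the whole tree in memory, but the mapping of columns to tags stays the same.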

  • Data Transfer Process (several data packages due to a huge amount of data)

    Hi,
    a)
    I've been uploading data from ERP via PSA, ODS and InfoCube.
    Due to the huge amount of data in ERP, BI splits the data into two data packages.
    When processing the data into the ODS, the system deletes a few records.
    This happens not in the "Filter" step but in "Transformation".
    General question: how can this be?
    b)
    As described in a), the data is split by BI into two data packages because of the data volume.
    To avoid this behaviour, I entered a few more selection criteria in the InfoPackage.
    As a result, I upload the data several times, each time with different selection criteria in the InfoPackage.
    Finally I have the same data in the ODS as in a), but this time without data being deleted in the "Transformation" step.
    Question: what is the general behaviour of BI when splitting data into several data packages?
    BR,
    Thorsten

    Hi All,
    Thanks a million for your help.
    My conclusions from your answers are the following:
    a) Since the ODS is a standard ODS, no records are deleted in the transformation; records with the same key are aggregated.
    b) Uploading a huge number of records is possible in two ways:
       b1) with selection criteria in the InfoPackage and several uploads
       b2) without selection criteria in the InfoPackage, and therefore with an automatic split of the records into data packages
    c) Both ways should give the same result in the ODS.
    OK, thanks for that.
    So far I have only checked the data in the PSA. In the PSA, the number of records is not equal for variants b1 and b2.
    I guess this is normal technical behaviour of BI.
    I am fine as long as the results in the ODS are the same for b1 and b2.
    Have a nice day.
    BR,
    Thorsten
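The conclusion in a) — a standard ODS aggregates records with the same key rather than deleting them — is also why b1 and b2 end up with the same ODS content. A toy sketch in plain Python (invented keys and amounts, not BI code) of why additive aggregation does not depend on how the records are packaged:

```python
def load_to_ods(packages):
    """Aggregate amounts by key, the way a standard ODS adds up key figures
    for records with the same semantic key, whatever the package boundaries."""
    ods = {}
    for package in packages:
        for key, amount in package:
            ods[key] = ods.get(key, 0) + amount
    return ods

records = [("C1", 10), ("C2", 5), ("C1", 7), ("C3", 2)]

# b2) one big load that BI splits into two data packages
split = load_to_ods([records[:2], records[2:]])

# b1) several selective loads, one per "selection criterion"
selective = load_to_ods([[r for r in records if r[0] == k]
                         for k in ("C1", "C2", "C3")])

assert split == selective == {"C1": 17, "C2": 5, "C3": 2}
```

The PSA can still show different record counts per variant (packages are stored as delivered), which matches what was observed; only the aggregated ODS result is identical.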

  • Handling a Huge Amount of Data in the Browser

    Some information regarding large data handling in a web browser: browser data is downloaded to the
    cache of the local machine. So when the browser needs to download data in the range of MBs or GBs, how can we
    handle that?
    The requirement is as follows.
    A performance monitoring application collects performance data of a system every 30 seconds.
    The data collected is around 10 KB per interval, and it is logged to a database. If this application
    runs for one day, the statistical data size will be around 30 MB (28.8 MB). If it runs for one week, the data size will be
    about 210 MB. There is no limit on the number of days from the software perspective.
    The user needs to see this statistical data in the browser. We are not sure whether transferring this huge amount of data to the
    browser in one go is feasible. The user should be able to get the overall picture of the logged data for a
    particular period and, if needed, should be able to drill down step by step to smaller ranges.
    For example, if the user queries data between 10 Nov and 20 Nov, the user expects to get an overall idea of
    the 11 days of data. Note that it is not possible to show each 30-second sample when displaying 11 days of data, so some
    logic has to be applied to present the 11 days of data in a reasonably digestible form. The user can then select a
    particular date in the graph, and the data for that day alone should be shown at a better granularity than in the overall
    graph.
    Note: the applet may not be a signed applet.

    How do you download gigabytes of data to a browser? The answer is simple: you don't. A data analysis package like the one you describe should run on the server and send only the requested summary views to the browser.
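The server-side summarization suggested above can be sketched in a few lines. This is a Python illustration (the bucket size and the min/avg/max summary are assumptions, not part of the original application):

```python
def downsample(samples, bucket_seconds):
    """Aggregate (second_offset, value) samples into fixed-size time buckets,
    keeping only (bucket_start, min, avg, max) per bucket -- the compact
    summary view that is actually sent to the browser."""
    buckets = {}
    for sec, value in samples:
        buckets.setdefault(sec // bucket_seconds, []).append(value)
    return [(b * bucket_seconds, min(v), sum(v) / len(v), max(v))
            for b, v in sorted(buckets.items())]

# one day of 30-second samples (2880 points), summarized into 24 hourly buckets
samples = [(30 * i, (i * 7) % 100) for i in range(2880)]
hourly = downsample(samples, 3600)
print(len(samples), "->", len(hourly))  # 2880 -> 24
```

Drill-down then just means re-running the same aggregation over a narrower time range with a smaller bucket size; the raw 30-second data never leaves the server in bulk.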

  • Report in Excel format fails for a huge amount of data with headers

    Hi All,
    I have developed an Oracle report which fetches up to 5,000 records.
    The requirement is to fetch up to 100,000 records.
    This report fetches data if the headers are removed; if headers are included, it is not able to fetch the data.
    Has anyone faced this issue?
    Any ideas on fetching a huge amount of data with an Oracle report in Excel format?
    Thanks & Regards,
    KP.

    Hi Manikant,
    According to your description, performance is slow when displaying a huge amount of data with more than 3 measures in PowerPivot, and you want to know the hardware requirements for building a PowerPivot workbook that can handle this, right?
    PowerPivot benefits from multi-core processors, large memory and storage capacities, and a 64-bit operating system on the client computer.
    Based on my experience, large memory, multiple processors and even solid-state drives benefit PowerPivot performance. Here is a blog post about memory considerations for PowerPivot for Excel for your reference.
    http://sqlblog.com/blogs/marco_russo/archive/2010/01/26/memory-considerations-about-powerpivot-for-excel.aspx
    Besides, you can identify which query is taking the time by using tracing; please refer to the link below.
    http://blogs.msdn.com/b/jtarquino/archive/2013/12/27/troubleshooting-slow-queries-in-excel-powerpivot.aspx
    Regards,
    Charlie Liao
    TechNet Community Support

  • Changes to a write-optimized DSO containing a huge amount of data

    Hi Experts,
    We have appended two new fields to a DSO containing a huge amount of data (the new InfoObjects are an amount and a currency).
    We were able to make the changes in Development (with the DSO containing data). But when we tried to
    transport the changes to our QA system, the transport hung. The transport triggers a job which
    filled up the logs, so we had to kill the job, which aborted the transport.
    Has anyone of you had the same experience? Do we need to empty the DSO so we can transport
    successfully? We really don't want to empty the DSOs, as it will take time to reload them.
    Any help?
    Thank you very much for your help.
    Best regards,
    Rose

    Emptying the DSO should not be necessary, neither for a normal DSO nor for a write-optimized DSO.
    What is in the logs - some sort of conversion for all the records?
    Marco

  • I have a huge amount of data on a Windows external drive and want to transfer it to a Mac drive. Does anyone know an easy way to do this? I have almost 2 TB of data to transfer. Thanks.

    I have a huge amount of data (2 TB) on a Windows Fantom external drive and want to transfer it to a Mac drive. Does anyone know an easy way to do this? Thanks. I have an iMac 3.5 GHz Intel Core i7. I haven't bought a Mac external drive yet.

    Move your data to a new Mac - Apple Support

  • How to manage huge (3 gb+) files in photoshop

    I have started creating 3 GB+ files in Photoshop CS2, and my computer is taking 3 minutes to open them and 10 minutes to save them, etc., driving me mad with the delays. My system (3.16 GHz Core Duo, ASUS P5K SE/EPU motherboard, 4 GB Kingston DDR2-800 RAM, Quadro FX540 video card) copes well with 300 MB files but not with these.
    Recently I moved my OS to Windows 7 Professional 64-bit in the hope that things would improve, but any change was marginal.
    The files are multi-layered designs, 150 dpi, about 16 feet by 10 feet, and 1.8 GB flattened when they are printed.
    While I know that the designs are pushing the boundaries/restrictions of Photoshop, I would appreciate the views of any members who have figured out how to manage huge files. Any suggestions are welcome, whether hardware/software upgrades or Photoshop hints (but the dpi and size cannot change).

    Thanks,
    you've all been helpful. My files were being saved as Photoshop files, not TIFFs, but I found a 30-day free trial of Photoshop CS4 and that seems to be making a difference. It looks like I'll have to purchase it along with some more RAM. My computer is a money pit!
    Thanks again

  • How to get PL/SQL table data into an output cursor

    Hi,
    Could anybody please help me.
    Below is an example of the scenario..
    CREATE OR REPLACE PACKAGE chck IS
      PROCEDURE getdata(dept_no IN VARCHAR2, oc_result_cursor OUT SYS_REFCURSOR);
      TYPE get_rec IS RECORD (ename VARCHAR2(20),
                              eno   NUMBER(12));
      TYPE t_recs IS TABLE OF get_rec INDEX BY BINARY_INTEGER;
      emp_tab t_recs;
    END chck;
    CREATE OR REPLACE PACKAGE BODY chck AS
      PROCEDURE getdata(dept_no IN VARCHAR2, oc_result_cursor OUT SYS_REFCURSOR)
      IS
      BEGIN
        SELECT ename, eno
          BULK COLLECT INTO emp_tab
          FROM emp;
        OPEN oc_result_cursor FOR SELECT * FROM TABLE(emp_tab); -- I believe something is wrong here ....
      END;
    END chck;
    The above package gives me the following error:
    LINE/COL ERROR
    10/29 PL/SQL: SQL Statement ignored
    10/43 PL/SQL: ORA-22905: cannot access rows from a non-nested table item
    Please let me know what needs to be changed.
    Thanks
    Manju

    manjukn wrote:
    once I get the data into a PL/SQL table, how do I get this PL/SQL table data into the cursor?
    There is no such thing as a "PL/SQL table" - it is an array.
    It is nothing at all like a table. It cannot be indexed, partitioned, clustered, etc. It does not exist in the SQL engine as an object that can be referenced. It resides in expensive PGA memory and needs to be copied (lock, stock and barrel) to the SQL engine as a bind variable.
    It is an extremely primitive structure and should never be confused with a table.
    Its use in SQL statements is also an exception to the rule. There need to be sound and valid technical reasons to justify pushing a PL/SQL array to the SQL engine in order to run a SELECT against it.

  • How to store flat file data in a custom table?

    Hi,
    I am working on an inbound interface. Can anyone tell me how to store flat file data in a custom table? What is the procedure?
    Regards,
    Sujan

    Hi,
    You can use the function module F4_FILENAME to pick the file from the front end, and then the function module WS_UPLOAD to upload it into an internal table:
    AT SELECTION-SCREEN ON VALUE-REQUEST FOR p_file.
      CALL FUNCTION 'F4_FILENAME'   "Function to pick the file
        EXPORTING
          field_name = 'p_file'
        IMPORTING
          file_name  = p_file.
    START-OF-SELECTION.
      CALL FUNCTION 'WS_UPLOAD'
        EXPORTING
          filename = p_file
        TABLES
          data_tab = it_line.
    * Then loop at it_line, splitting each line into the fields of your custom table.
      LOOP AT it_line.
        SPLIT it_line AT ',' INTO itab-name
                                  itab-surname.
        APPEND itab.
      ENDLOOP.
    Then you can insert the values into your custom table from the itab internal table.
    regards
    Isaac Prince
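Outside ABAP, the same pick-upload-split-insert pattern looks like this. A Python sketch using the standard csv module and an in-memory SQLite table (the table and column names are invented for illustration):

```python
import csv
import io
import sqlite3

def load_flat_file(conn, fileobj, table):
    """Split each comma-separated line into fields and insert them into the table."""
    rows = [(name, surname) for name, surname in csv.reader(fileobj)]
    conn.executemany(f"INSERT INTO {table} (name, surname) VALUES (?, ?)", rows)
    conn.commit()
    return len(rows)

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE zcustomers (name TEXT, surname TEXT)")
flat_file = io.StringIO("John,Smith\nMary,Jones\n")
n = load_flat_file(conn, flat_file, "zcustomers")
print(n, "rows loaded")  # 2 rows loaded
```

Using parameter markers (`?`) for the values, rather than string concatenation, keeps the insert safe for arbitrary file contents.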

  • How to avoid the Amount and Date values for VOID Cheques

    Hi All,
    I have created two windows, i.e. for Amount and Date. When I process a cheque, the Amount and Date values should not be printed for VOID cheques.
    Can anyone tell me how to suppress the Amount and Date values for VOID cheques?
    Your help will be greatly appreciated.
    Regards
    Yathish

    Hi,
    I don't know which table you are referring to - is it the PAYR table and the field VOIDR?
    If a cheque is voided, it has a void reason, which is stored in the VOIDR field of the PAYR table.
    Check whether the field VOIDR is filled; if it is, do not print the amount and date.
    Regards
    Subramanian
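The suppression rule described above — print the amount and date only when the void-reason field PAYR-VOIDR is empty — boils down to a simple guard. A Python sketch of the logic (not SAPscript; the function name is invented):

```python
def cheque_window_values(amount, date, voidr):
    """Return the (amount, date) strings to print in the cheque windows;
    suppress both when the void reason (PAYR-VOIDR) is filled."""
    if voidr:  # a non-empty void reason means the cheque was voided
        return ("", "")
    return (amount, date)

print(cheque_window_values("1,500.00", "2011-06-01", ""))    # printed as-is
print(cheque_window_values("1,500.00", "2011-06-01", "01"))  # both suppressed
```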

  • How can I read the trace data into LabVIEW for the E5071B?

    HI 
    I am setting up a measurement using a vector network analyzer (VNA), an E5071B controlled via NI 488.2. How can I read the trace data into LabVIEW and display it on a graph? If anyone has an idea or knows this process well, please give me a suggestion; I will much appreciate it.
    Many Thanks

    You want to start with the driver.
    In case you do not know it, you can search for instrument drivers in LabVIEW from Tools > Instrumentation > Find Instrument Drivers. You might also want to bookmark the Instrument Driver Network for information on what a driver is and how to use it.

  • TS3992 My iCloud backup shows a huge amount of data stored from backup, yet it is listed as incomplete. Unable to access the data. Is the data lost?

    My iCloud backup shows a huge amount of data stored, yet it is listed as an incomplete backup. I am unable to access the backed-up data. Is the data lost?
    Thank you in advance for your assistance.

    Unfortunately, if an iCloud backup is incomplete you can't access any of the data in it.  The only way to access anything in the backup is to restore the entire backup, which can't be done if it is incomplete.

  • How to implement real-time data refresh in OBIEE?

    How can I implement real-time data refresh in OBIEE?

    Can you elaborate more?
    If you want to see refreshed data in OBIEE reports, you need to design your caching mechanism around how often you refresh the warehouse.
    [http://download.oracle.com/docs/cd/E05553_01/books/admintool/admintool_QueryCaching6.html]

Maybe you are looking for

  • Data federator

    Hello Experts, I am just wondering how is Data federator to create Web Intelligence reports on top of SAP BW Infoproviders using Relational universe instead of OLAP universe. My requirement is that we want to create universes with all the custom obje

  • Header and trailer pages in report templates

    Hi: I wonder how can i make header and trailer pages in the tamplate reports , so that the objects in the template header & footer report would be inherited to the generated reports. I use Oracle reports 6.0.8 and there is no header section, niether

  • Absolute Font vs. Relative Fonts

    Hi, I am a fairly new user to Dreamweaver. Is there a way to set the font size in Dreamweaver and it will always appear at this size in the browser - Safari, IE and Netscape??

  • I have sluggish video and audio when mirroring ipad 3 or iphone.  Any solution?

    I have the new ipad and also the latest iphone.  I really want to use mirroring to stream a particular video each day that i subscribe to using the mirroring feature.  I have found that the audio keeps dropping and playing catch up.   The audio drops

  • Automated Intercompany Process

    Hi, Can someone clarify if I can use this BAPI MB_CREATE_GOODS_MOVEMENT for make automatic receptions in intercompany process? The process should work via EDI. EDI generates IDOC that calls BAPI and make recepction Could be posible? Thanks in advance