Daily Incremental Load

Hi All,
I have a situation where I need to load about 9 GB (50 million+ rows) of data daily into my warehouse from a 10g source database. This is just the daily incremental volume. Using PL/SQL, what would be the fastest and best way of doing this without putting undue load on the source?
Thanks in Advance.

bi_man wrote:
Using PL/SQL, what would be the fastest and best way of doing this without putting undue load on the source?
With PL/SQL? There is no fast and best way. I can already see a clunky and somewhat flawed PL/SQL cursor loop selecting from the remote server over a db-link, inserting locally, and doing incremental commits.
So forget PL/SQL. Data problems need to be addressed first and foremost using SQL. PL/SQL is used to manage the overall processing flow of that SQL, and deal with exceptions and so on.
The basic method for "moving rows" between source and destination tables is via an INSERT..SELECT statement.
Due to the nature and size of the move, it can be done as smaller discrete steps, allowing you to better manage the movement of rows and even run it in parallel. You can do this via the DBMS_PARALLEL_EXECUTE package, or via custom criteria of your own.
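As a rough illustration, here is a minimal DBMS_PARALLEL_EXECUTE sketch. The names (WH_ORDERS, ORDERS@src_link, the numeric ORDER_ID key) and the chunk size are assumptions for the example, not from the original post; each chunk runs its own INSERT..SELECT across the db-link and commits independently:

DECLARE
  l_task VARCHAR2(30) := 'copy_orders';
BEGIN
  DBMS_PARALLEL_EXECUTE.CREATE_TASK(l_task);

  -- Build (start_id, end_id) chunks covering yesterday's rows on the source.
  DBMS_PARALLEL_EXECUTE.CREATE_CHUNKS_BY_SQL(
    task_name => l_task,
    sql_stmt  => 'SELECT MIN(order_id), MAX(order_id)
                    FROM orders@src_link
                   WHERE order_date >= TRUNC(SYSDATE) - 1
                     AND order_date <  TRUNC(SYSDATE)
                   GROUP BY TRUNC(order_id / 100000)',
    by_rowid  => FALSE);

  -- Each chunk executes this statement with its own :start_id/:end_id binds
  -- and its own commit; up to 4 chunks run concurrently as scheduler jobs.
  DBMS_PARALLEL_EXECUTE.RUN_TASK(
    task_name      => l_task,
    sql_stmt       => 'INSERT INTO wh_orders
                         SELECT * FROM orders@src_link
                          WHERE order_id BETWEEN :start_id AND :end_id',
    language_flag  => DBMS_SQL.NATIVE,
    parallel_level => 4);

  DBMS_PARALLEL_EXECUTE.DROP_TASK(l_task);
END;
/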
For example, every day you move yesterday's orders from the OLTP system into the warehouse. Orders are received 24x7. So you can create a procedure that takes an hour (date/time) as input and copies all the orders submitted during that hour. As part of this process, the procedure can log the hourly copy in a table, listing the hour and the number of rows copied. Likewise, it can first check that log table to ensure that the hour has not already been successfully copied.
You can now manage the copy process at the hourly level, have restart capability should a specific copy fail, and have the ability to start several copy processes at the same time for parallel processing.
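A skeletal version of such a procedure could look like the sketch below. The COPY_LOG table, WH_ORDERS and ORDERS@src_link are illustrative names, not from the original post:

CREATE OR REPLACE PROCEDURE copy_orders_hour(p_hour IN DATE) AS
  l_hour    DATE := TRUNC(p_hour, 'HH24');
  l_already NUMBER;
  l_rows    NUMBER;
BEGIN
  -- Skip hours that the log says have already been copied successfully.
  SELECT COUNT(*) INTO l_already FROM copy_log WHERE copy_hour = l_hour;
  IF l_already > 0 THEN
    RETURN;
  END IF;

  -- Move one hour's worth of rows across the db-link.
  INSERT INTO wh_orders
  SELECT *
    FROM orders@src_link
   WHERE order_date >= l_hour
     AND order_date <  l_hour + 1/24;
  l_rows := SQL%ROWCOUNT;

  -- Record the hour and the row count so the overall copy can be restarted safely.
  INSERT INTO copy_log (copy_hour, rows_copied) VALUES (l_hour, l_rows);
  COMMIT;
END copy_orders_hour;
/

Each hourly call can then be scheduled (for example via DBMS_SCHEDULER), and several hours can be run side by side.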
One factor that will have an impact on the performance of this is the size and speed of the network pipe between the remote database and the local database. If this is a major factor, then you could decide that an INSERT..SELECT across that network pipe is simply too expensive and slow, and instead look at options like unloading the source data into a CSV file, compressing it, and then transferring it across the network. On the destination server, the file can then be uncompressed and loaded using SQL*Loader or an external table.
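On the destination side, the external table route could look something like this sketch. The directory, file name and column list are assumptions; the date is read as text and converted in the INSERT to avoid NLS surprises:

CREATE OR REPLACE DIRECTORY load_dir AS '/u01/app/loads';

CREATE TABLE orders_ext (
  order_id    NUMBER,
  customer_id NUMBER,
  order_date  VARCHAR2(19),
  amount      NUMBER
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY load_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY ','
    MISSING FIELD VALUES ARE NULL
  )
  LOCATION ('orders_20110825.csv')
)
REJECT LIMIT UNLIMITED;

-- Direct-path load from the uncompressed file into the warehouse table.
INSERT /*+ APPEND */ INTO wh_orders (order_id, customer_id, order_date, amount)
SELECT order_id,
       customer_id,
       TO_DATE(order_date, 'YYYY-MM-DD HH24:MI:SS'),
       amount
  FROM orders_ext;
COMMIT;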
If the source and target tables are identical and partitioned, you can use DataPump to export the specific day's partition from the source table, and then import it into the empty partition for that day in the target table.
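If you go the Data Pump route, the export side can be driven from PL/SQL with DBMS_DATAPUMP, roughly as below. The directory, file, table and partition names are illustrative; the import on the warehouse side mirrors this with an IMPORT job, or you can simply use the expdp/impdp clients with TABLES=owner.table:partition_name:

DECLARE
  h NUMBER;
  j VARCHAR2(30);
BEGIN
  h := DBMS_DATAPUMP.OPEN(operation => 'EXPORT', job_mode => 'TABLE');
  DBMS_DATAPUMP.ADD_FILE(h, 'orders_p20110825.dmp', 'DP_DIR');
  -- Restrict the job to one table and one of its partitions.
  DBMS_DATAPUMP.METADATA_FILTER(h, 'NAME_EXPR', 'IN (''ORDERS'')');
  DBMS_DATAPUMP.DATA_FILTER(h, 'PARTITION_EXPR', 'IN (''P_20110825'')', 'ORDERS');
  DBMS_DATAPUMP.START_JOB(h);
  DBMS_DATAPUMP.WAIT_FOR_JOB(h, j);  -- j returns the final job state
END;
/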

Similar Messages

  • How to setup daily incremental loads

    Hi:
    OBIEE 11.1.1.6
    OBIA 7.9.6.3
    I've been building and configuring OBIA in a test environment, and I'm planning the go-live process. When I move everything to production, I'm sure I will need to do a full load. My question is: what do I need to do to change the DAC process from a full load to a nightly incremental load?
    Thanks for any suggestions.

    Go to DAC -> Setup -> Physical Data Sources and select the connection (Source or Target).
    Look at the 'Refresh Dates' for the list of tables and make sure each table has a date entry (yesterday, or any date); DAC runs a full load for tables whose refresh date is null and an incremental load once a refresh date is recorded.
    Do the same for both the Source and Target connections.
    Please mark as helpful if this helps.
    Question for you:
    1) Do you have the Production environment up and running daily loads? If yes, what are you trying to do?

  • DAC Incremental load with new instance

    We have our daily incremental load running from one instance, but now we would like to run the load from another instance (not a full load). How can this be achieved?

    One possible way to do this is to create, in AWM, a cube script with a Load command that has a WHERE clause filter restricting which records are loaded into the cube. That cube script can then be run to load only partial data from the instance.

  • Incremental Loads like daily or weekly runs in IOP

    Generally when we do incremental loads in production we opt for either load replace or load update.
    I think in the case of rowsources (RS) we opt for load update, while for dimensions we opt for load replace. Is that the correct understanding?
    Also, should we run the stage clear rowsource and stage clear dimension commands before loading the RS and dimensions, in order to be on the safe side and clean up any leftovers from the previous run in the stage area?

    Integrated Operational Planning uses update when the input data stream is incremental; for
    example, inventory at the end of the current week. Replace is used when the data stream is a
    complete snapshot of the data in the external system.
    Whether to run stage clear rowsource usually depends on whether you need to keep the earlier data present in the staging area. If the data in the rowsource is not reused, it is usually preferable to run stage clear rowsource before updating the staging area with new data.
    This can also be achieved in one line using stage replace, which is equivalent to doing a stage clear followed by a stage update.

  • Incremental Loads and Refresh Date

    Hi all,
    Thank you for taking the time to review this post.
    Environment
    Oracle BI Applications 7.9.6 (Financial & Project Analytics)
    Oracle E-Business Suite 11.5.10
    Question
    I have BI Apps 7.9.6 in a Test environment that is connected to a static EBS 11.5.10 data source. As part of my testing phase I'd like to do multiple Incremental Loads to get an accurate performance impact and timing study for the final pre-approval before migrating to Production. I can get a refresh of EBS which has a week's worth of transactions after my Initial Full Load. What I'd like to do is change the Refresh Dates to "trick" the Incremental Load into loading only one day's worth of data at a time, rather than the full week's worth of data in a single Incremental Load. Is this possible, and if so, how?
    Example timeline:
    Today - Initial Full load using Test EBS as of today
    1 week later - Refresh static Test EBS from Production with a week of transactions
    Post Refresh - Run daily Incremental jobs using static Test EBS
    First Incremental Load - today's position + 1 day
    Second Incremental Load - today's position + 2 days
    Third Incremental Load - today's position + 3 days, etc.
    As always all comments and solutions greatly appreciated.
    Kind Regards,
    Gary.

    Say on the 01st of the month, you did a Load.
    Then on the 08th of the month, the source EBS system was itself refreshed.
    What you want to do is run a single-day refresh on the 08th for all data from the 01st to the 02nd of the month, and then another single-day refresh -- whether on the 08th or on the 09th, you don't care -- for all data from the 03rd to the 04th.
    Unfortunately, the refresh runs from the last refresh date to the current date; you can't define a "refresh up to" date. Therefore, your first 'incremental' refresh on the 08th would refresh all data from the 02nd to the 08th in one shot. What you could try to do is:
    a. After the first load on the 01st, shutdown the BI DWH.
    b. When the EBS test source is refreshed on the 08th, reset your SYSTEM CLOCK ON THE BI DWH DATABASE SERVER to the 2nd (or 3rd) of the month.
    c. Now, when you run a refresh, BI will extract all data from the 01st to the 02nd or 03rd (even though EBS is as of the 08th).
    d. Once this is done, shutdown BI DWH.
    e. Reset the SYSTEM CLOCK ON THE BI DWH DATABASE SERVER to the 3rd or 4th of the month.
    f. Run another Incremental Refresh.
    ... and so on ...
    Hemant K Chitale
    http://hemantoracledba.blogspot.com

  • Incrementally Loading Data Using a Data Load Buffer

    Hi
    I am using Essbase 9.3.1 and am trying to use the "Replacing Database or Incremental Data Slice Contents" feature for my data loads to the ASO cube.
    I have 2 data sets: one is 2 years of history, and the other is the last 3 months, which changes on a daily basis. I looked at the DBAG and it has exactly this scenario as an example for this feature, but I am not able to overwrite my volatile data set with my new file.
    Here is what I do
    alter database ${1}.${2} initialize load_buffer with buffer_id ${6} resource_usage 0.3 property ignore_missing_values, ignore_zero_values ;
    import database ${1}.${2} data from data_file '${3}' using server rules_file '${4}' to load_buffer with buffer_id ${6} add values create slice on error write to '${5}' ;
    alter database ${1}.${2} merge incremental data;
    alter database ${1}.${2} clear aggregates ;
    execute aggregate process on database ${1}.${2} ;
    In fact the data from my new (incremental) file does not even make it to the database; I checked and it does get rejected.
    Am I doing something wrong here? How do I use the concept of a "data slice" and its incremental load feature?
    Can anyone please explain?
    Thanks
    Mandar Joshi

    Hi,
    Just wondering if anyone has any input or feedback on my query, or whether my question is a really stupid one that does not deserve any attention :)
    Can someone explain how the "data slice" concept works?
    Thanks
    Mandar Joshi.

  • Compare data in R/3 with data in a BW Cube after the daily delta loads

    Hi Friends,
    How can I compare data in R/3 with data in a BW Cube after the daily delta loads? Are there any standard procedures for checking them or matching the number of records?

    Hi Sunil,
    If you want to check the records daily instead of checking the data in R/3 manually, you can try this:
    If you have a staging DSO (level 1), load whatever data is in the source system into that staging DSO without any routines or modifications.
    Then load the DSO data into the Cube or DSO (level 2) as per your requirement, with routines etc.
    Now the staging DSO contains the source system data, and the level 2 Cube or DSO contains the BW data with its modifications.
    Create a MultiProvider based on the level 1 and level 2 data targets, and create a report on the key figures you want to test.
    In the MultiProvider there is a field called 0INFOPROVIDER in the data packet dimension. Drag this InfoProvider into the columns and restrict your key figures by the level 1 and level 2 data targets.
    In the first column you will see the level 1 DSO data (source system data); in the second column you will see the BW data.
    Now create a formula that gives the difference between level 1 and level 2, i.e. R/3 data - BW data.
    If the difference is zero, the R/3 and BW data match; if the difference is not zero, check whether a routine is changing the data.

  • Duplicate rows in Hierarchy Table created when running incremental load

    I copied an out-of-the-box dimension and hierarchy mapping to my custom folders (task hierarchy). This should create the same WIDs from the dimension to the hierarchy table, and on a full load it does, using the sequence generator. The problem I am getting is that whenever I run an incremental load, a new record is created instead of an update. What would be the best place to start looking at this and testing? A full load runs with no issues. I have also checked the DAC, and the SDE runs with truncate always while the SIL truncates for full load only.
    Help appreciated

    Provide the query used for populating the child records. The issue might be due to caching.
    Thanks
    Shree

  • OBIA Financial Analytics - ETL Incremental Load issue

    Hi guys
    I have an issue while doing an ETL Incremental load in DEV. Source and Target are Oracle.
    The issue is with these two tasks: SDE_ORA_GL_JOURNALS and SDE_ORA_ImportReferenceExtract.
    The incremental load gets stuck at SDE_ORA_GL_JOURNALS: on the database side the query is completed and the session is done, but there is no update in the Informatica session log for that session. It just says 'SQL Query issued to database' and makes no progress from there, and the task in both the Informatica session monitor and DAC says running and keeps running forever. No errors are seen in any of the log files.
    Any idea what is happening? I checked session logs, DAC server logs, database alert logs and exception logs on the source and found nothing.
    I tried to run these Informatica-generated queries in SQL Developer; they ran well and I did see the results. Moreover, the weird thing is that over the past three days and about 10 runs, both these tasks run in parallel and most of the time ImportReferenceExtract completes first and then GL_JOURNALS runs forever. In one run GL_JOURNALS finished but then ImportReferenceExtract ran forever. I don't exactly understand what is happening; I see both queries running in parallel on the source database.
    Please give me some ideas on this. The same stuff runs fine on QA and I don't know how that is working. Any idea on this is appreciated. Thank you, and let me know of any questions.

    Please refer to this:
    http://gerardnico.com/wiki/obia/installation_7961

  • Need help on: Automation of Daily Data Load

    Hi all,
    We currently start our Daily Data Load from DAC manually, and my client has asked us to automate the Daily Data Load.
    Starting the Daily Data Load manually in DAC: first we have to check whether the ASCP plans have been updated or not.
    Right now we check whether the plans got updated with the following query:
    SELECT LTRIM(RTRIM(compile_designator)),
           data_completion_date,
           TO_CHAR(data_completion_date, 'DD-MON-YYYY HH24:MI:SS')
      FROM apps.msc_plans
     WHERE LTRIM(RTRIM(compile_designator)) IN ('Plan01', 'Plan02', 'Plan03', 'Plan04')
     ORDER BY 2 DESC;
    From this query we can see whether all the plans got updated. Of the four plans, two get updated as of sysdate (for example Plan01 08/25/2011 11:20:08 PM, Plan02 08/25/2011 11:45:06 PM) and the other two get updated on sysdate+1 (for example Plan03 08/26/2011 12:20:05 AM, Plan04 08/26/2011 12:45:08 AM).
    So after checking the plans, we start the Daily Load in DAC manually.
    How should I convert the above SQL query, which I use for checking whether the plans have been updated, into Informatica, so as to automate the Daily Load at the Informatica level?
    Need help.

    You cannot replicate what is done by DAC at the Informatica level. DAC is a separate Oracle product that orchestrates and manages the ETL load (including index management, etc.). The reason Oracle developed DAC is that it allows you to manage a large-scale DW load for a large ERP system. As suggested, you can invoke the DAC execution plan via a command, but you cannot replicate everything the DAC does at the Informatica level.

  • Incremental Loading of a Tree Component

    I'm working on an explorer-type interface in Flex 2 for
    browsing a remote file repository. It's a standard split-pane
    affair with a directory tree on the left and a listing on the
    right.
    Because the entire directory tree can be rather large, I need
    to load it incrementally as the user expands nodes rather than all
    at once. I failed to find a relevant example so I wrote my own. It
    works, but I'm new to Flex and am not sure if there's an easier way
    or if there are any pitfalls to the way I did it.
    I posted my code here:
    http://xocoatl.blogspot.com/2007/01/incremental-loading-of-tree-in-flex-2.html
    Any comments here or on the blog are appreciated; I'm
    guessing that having a good example of the "right" way to do this
    will be useful to many others.
    Thanks.

    I am also using another workaround, a CSS trick (inlineStyle): if #{node.children} is null, I place an empty 10x10 white PNG image over the expand icon on the tree node.
    The following code is the nodeStamp facet of the tree component.
    <f:facet name="nodeStamp">
      <af:group id="g1">
        <af:image source="/images/spacer.png" id="i1" inlineStyle="border: 2px solid white; position: absolute; margin-left:-14px;" rendered="#{node.children == null}"/>
        <af:commandLink text="#{node.name}" id="cl1" partialSubmit="true"/>
      </af:group>
    </f:facet>

  • 11g (11.2.0.1) - dimension operator very slow on incremental load

    The dimension operator is very slow processing incremental loads on 11.2.0.1. Have applied the cumulative patch - still the same issue.
    Statistics have also been gathered.
    The initial load into the empty dimension performs fine (thousands of records in < 1 min); the incremental load has been running over 10 minutes and still has not loaded 165 records from the staging table.
    Any ideas?
    Saw this in 10.2.0.4 and applied the patch which cured that issue.

    Hi,
    Thanks for the excellent suggestion.
    Other mappings which maintain SCD Type 2 using the dimension operator behave similarly to 10g. Have raised the issue with this particular mapping with Oracle - awaiting a response.
    One question - the mappings which maintain SCD Type 2s appear to join on the dimension key and the surrogate IDs.
    What is best practice regarding indexing of such a dimension, e.g. is it recommended to index the dimension key and the surrogate IDs along with the natural/business keys?
    Thanks

  • Incremental load not capturing data in SSIS 2012

    Hi,
    I am having an issue with Oracle CDC for SSIS, which is new in 2012. I developed SSIS packages with full load and incremental load logic to load data into ODS - STAGE - DWH. The problem is that whenever I do a full load followed by an incremental load, the incremental load does not capture the updated data; if I then do a second incremental load, it captures the data.
    Is there any solution to get the data in the first incremental load?

    Are you sure it picks up the LSN correctly? I suspect the CDC service is not picking up the correct LSN value, which it uses to identify the changes.
    It should be in the cdc.lsn_time_mapping table, I guess - a quick check is sketched below.
    http://msdn.microsoft.com/en-IN/library/bb510494.aspx
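    As a quick way to check this (a hedged sketch - cdc.lsn_time_mapping and its columns are as documented for SQL Server change data capture, not taken from the original post), look at the most recent LSN-to-time mappings after the full load and after the first incremental run to see whether new LSNs were captured in between:

    SELECT TOP (10)
           start_lsn,        -- LSN recorded by the capture process
           tran_begin_time,  -- transaction start time
           tran_end_time     -- commit time the LSN maps to
    FROM   cdc.lsn_time_mapping
    ORDER  BY tran_end_time DESC;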
    Visakh

  • Incremental load into the Dimension table

    Hi,
    I have a problem doing the incremental load of a dimension table. Before loading into the dimension table, I would like to check the data already in the dimension table.
    In my dimension table I have one NOT NULL surrogate key and the other, nullable, dimension columns. The NOT NULL surrogate key I populate with the Sequence Generator.
    To do the incremental load I have done the following: I made a lookup into the dimension table and looked for a key. The key from the lookup table I passed to an Expression operator. In the Expression operator I created one field and hard-coded a flag based on the key from the lookup table. I passed this flag to a Filter operator along with the rest of the fields from the source.
    By doing this I am not able to pass the new records to the dimension table.
    Can you please help me?
    I have another question as well: how do I update a NOT NULL key in the fact table?
    Thanks
    Vinay

    Hi Mark,
    Thanks for your help in solving my problem. I thought I would share more information by giving the SQL.
    Below are the 2 SQL statements I would like to achieve through OWB.
    Both of the following tasks need to be accomplished after loading the fact table.
    task1:
    UPDATE fact_table c
    SET c.dimension_table_key =
    (SELECT nvl(dimension_table.dimension_table_key,0)
    FROM src_dimension_table t,
    dimension_table dimension_table
    WHERE c.ssn = t.ssn(+)
    AND c.date_src_key = to_number(t.date_src(+), '99999999')
    AND c.time_src_key = to_number(substr(t.time_src(+), 1, 4), '99999999')
    AND c.wk_src = to_number(concat(t.wk_src_year(+), concat(t.wk_src_month(+), t.wk_src_day(+))), '99999999')
    AND nvl(t.field1, 'Y') = nvl(dimension_table.field1, 'Y')
    AND nvl(t.field2, 'Y') = nvl(dimension_table.field2, 'Y')
    AND nvl(t.field3, 'Y') = nvl(dimension_table.field3, 'Y')
    AND nvl(t.field4, 'Y') = nvl(dimension_table.field4, 'Y')
    AND nvl(t.field5, 'Y') = nvl(dimension_table.field5, 'Y')
    AND nvl(t.field6, 'Y') = nvl(dimension_table.field6, 'Y')
    AND nvl(t.field7, 'Y') = nvl(dimension_table.field7, 'Y')
    AND nvl(t.field8, 'Y') = nvl(dimension_table.field8, 'Y')
    AND nvl(t.field9, 'Y') = nvl(dimension_table.field9, 'Y')
    WHERE c.dimension_table_key = 0;
    The fact table in the above SQL is fact_table, the dimension table is dimension_table, and the source table for the dimension table is src_dimension_table.
    dimension_table_key is a NOT NULL key in the fact table.
    task2:
    update fact_table cf
    set cf.key_1 =
    (select nvl(max(p.key_1),0) from dimension_table p
         where p.field1 = cf.field1
    and p.source='YY')
    where cf.key_1 = 0;
    The fact table in the above SQL is fact_table and the dimension table is dimension_table.
    key_1 is a NOT NULL key in the fact table.
    Is it possible to achieve the above tasks through Oracle Warehouse Builder (OWB)? I created the mappings for loading the dimension table and the fact table and they are working fine, but I am not able to achieve the above two queries through OWB. I would be thankful if you can help me out.
    Thanks
    Vinay
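    For what it's worth, task2 can also be expressed as a single set-based MERGE (a hedged rewrite using the same illustrative table and column names as the UPDATE above), which is sometimes easier to reproduce in a mapping tool:

    MERGE INTO fact_table cf
    USING (SELECT field1, MAX(key_1) AS key_1
             FROM dimension_table
            WHERE source = 'YY'
            GROUP BY field1) p
    ON (p.field1 = cf.field1)
    WHEN MATCHED THEN
      UPDATE SET cf.key_1 = p.key_1
      WHERE cf.key_1 = 0;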

  • Business Objects Data Services Incremental Load Help

    Hi, this is my first time creating an incremental load for a batch job. My batch job consists of a try - initialization script - data flow - catch. When I validate my initialization script I get an error; could you review the script and identify the error? My data flow consists of the datastore table I imported, with a Query, then a Table Comparison, then a Key Generation, then the table I am updating.
    # Set Todays Date
    $SYSDATE = cast ( sysdate (), 'date' );
    print ('Today\'s date: ' || cast($SYSDATE, 'varchar(10)'));
    # SET CDC DATE
    $CDC_DATE = nvl (cast(sql('Target', 'SELECT MAX(BATCH_END_DATE) FROM BATCH_CONTROL WHERE BATCH_NAME = {$BATCH_NAME}
    AND BATCH_STATUS = \'SUCESS\' '), 'date'), cast(to_date('1900-01-01', 'YYYY-MM-DD'), 'date'));
    #Mark an entry in Batch_Control
    # Batch_Name    BATCH_STATUS   BATCH_START_DATE   BATCH_END_DATE Load_DATE
    sql('Target', 'INSERT INTO BATCH_CONTROL VALUES ( {$BATCH_NAME}, \'STARTED\', {to_char($CDC_DATE, \'YYYY-MM-DD\')}, NULL, {to_char($SYSDATE, \'YYYY-MM-DD\')} )');

    So I resolved the first error; now I am receiving this long error. Any ideas?
    All log entries below are from process 13388, thread 15908, error SYS-170101, at 5/22/2014 10:39:54 AM:
    |Session Table_Incramental_Load
    System Exception <ACCESS_VIOLATION> occurred. Process dump is written to <C:\Program Files (x86)\Business Objects\BusinessObjects Data Services\log\BODI_MINI20140522103951_13388.DMP> and <C:\Program Files (x86)\Business Objects\BusinessObjects Data Services\log\BODI_FULL20140522103951_13388.DMP>
    Process dump is written to <C:\Program Files (x86)\Business Objects\BusinessObjects Data Services\log\BODI_MINI20140522103951_13388.DMP> and <C:\Program Files (x86)\Business Objects\BusinessObjects Data Services\log\BODI_FULL20140522103951_13388.DMP>
    Call stack:
    0023:00D305BA, TrCallStatement::process_dbdiff_xform_new()+6666 byte(s), x:\src\parser\process_predef_xform.cpp, line 7281
    0023:00D3128E, TrCallStatement::process_diff_xform()+1422 byte(s), x:\src\parser\process_predef_xform.cpp, line 0432
    0023:00D356EE, TrCallStatement::process_predef_xform_options()+0286 byte(s), x:\src\parser\process_predef_xform.cpp, line 0067+0017 byte(s)
    0023:00C313A5, TrCallStatement::processStatement()+0789 byte(s), x:\src\parser\dataflowstm.cpp, line 3307
    0023:00C310FC, TrCallStatement::processStatement()+0108 byte(s), x:\src\parser\dataflowstm.cpp, line 3201+0012 byte(s)
    0023:00C0FB55, DataFlowDef::processStatements()+0101 byte(s), x:\src\parser\dataflow.cpp, line 2331+0014 byte(s)
    0023:00C110D5, DataFlowDef::buildGraph()+1621 byte(s), x:\src\parser\dataflow.cpp, line 1723
    0023:00C12D99, DataFlowDef::processObjectDef()+2793 byte(s), x:\src\parser\dataflow.cpp, line 1290
    0023:00CB9DC5, CallStep::processStep()+2037 byte(s), x:\src\parser\planstep.cpp, line 1050
    0023:FFFFFFFF, NsiAllocateAndGetPersistentDataWithMaskTable()+-1997676757 byte(s)
    0023:00CB443C, CompoundStep::processStep()+0044 byte(s), x:\src\parser\planstep.cpp, line 4460+0044 byte(s)
    0023:00CB406F, TryStep::processStep()+0335 byte(s), x:\src\parser\planstep.cpp, line 3634
    0023:00CB33A6, Step::processStepBlock()+0134 byte(s), x:\src\parser\planstep.cpp, line 0377+0018 byte(s)
    0023:00CB443C, CompoundStep::processStep()+0044 byte(s), x:\src\parser\planstep.cpp, line 4460+0044 byte(s)
    0023:00C8A78E, PlanDef::processObjectDef()+2718 byte(s), x:\src\parser\plandef.cpp, line 0689
    0023:00ABB806, AE_Main_Process_Options()+32534 byte(s), x:\src\xterniface\actamainexp.cpp, line 3622
    0023:00ABFAB1, AE_Main()+1505 byte(s), x:\src\xterniface\actamainexp.cpp, line 0830+0030 byte(s)
    0023:00402AE9
    Registers:
    EAX=056E85F0  EBX=00000000  ECX=00000010  EDX=02250048  ESI=056E85F0
    EDI=056E85A8  EBP=04A7C590  ESP=002700F0  EIP=00D305BA  FLG=00010206
    CS=0023   DS=002B  SS=002B  ES=002B   FS=0053  GS=002B
    Exception code: C0000005 ACCESS_VIOLATION
    Fault address:  00D305BA 01:0029A5BA C:\Program Files (x86)\Business Objects\BusinessObjects Data Services\bin\acta.dll
