ERPi Data load mapping Issue

Hi,
We are facing an issue with ERPi data load mappings. The mapping file (a .txt file) has 36k records, and whenever we try to load the mappings it takes a very long time, nearly 1 hour 30 minutes. We want to reduce that time. Is there any way to reduce the data load mapping time?
Hyperion version: 11.1.2.2.300
Please help, thanks in advance!

Has anyone faced the same kind of issue?

Similar Messages

  • ERPI: Data Loading problem Hyperion Planning & Oracle EBS

    Hi
    I am trying to load data from Oracle EBS to Hyperion Planning.
    When I push data, zero rows are inserted in the target.
    When I look at the table (SELECT * FROM TDATASEG),
    it shows me data, but the data is not committed to the target application.
    The reason is a data difference between the source (EBS) and the target.
    In my source the year is 2013 but in the target it is 'FY14'; likewise, for Entity the source is '21' but the target is '2143213'.
    Can you please let me know how to solve this issue?
    Can I place a lookup table for this in ERPi?
    I am using ERPi and ODI to push the data.
    Regards
    Sher

    Have you set up the data load mapping correctly to map the source value to the proper target value? Based on what you are describing, it seems that the system-generated * to * map is being used; if you are mapping a source to a different target, this needs to be added to the data load mapping.
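    If it is useful, the staging table the poster already queries can show whether each source member actually received a target member before the export. The following is only a hedged sketch: the ACCOUNTX/ENTITYX target columns and the LOADID filter are assumptions about a typical ERPi staging layout, not a confirmed schema.

    -- Hedged sketch: ACCOUNT/ACCOUNTX, ENTITY/ENTITYX and LOADID are assumed
    -- column names; adjust them to match your ERPi repository.
    SELECT account, accountx, entity, entityx, amount
    FROM   tdataseg
    WHERE  loadid = :load_id                        -- the load you are investigating
    AND    (accountx IS NULL OR entityx IS NULL);   -- source members with no mapped target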

  • Regarding ERPI Data Loading

    Dear All,
    I have a few doubts on ERP Integrator.
    1) What is required from Oracle GL to Planning for data loading using ERP Integrator? (Is a Trial Balance enough, or do we require some other file from Oracle GL?)
    2) Are there any scheduling options available for data loading using ERP Integrator?
    3) What is the process for loading data to Planning using ERP Integrator?
    4) How do we load the data to Planning? (i.e. monthly load, hourly load)
    Anyone please guide me in this situation.
    Thanks,
    PC

    1) What is required from Oracle GL to Planning for data loading using ERP Integrator? (Is a Trial Balance enough, or do we require some other file from Oracle GL?)
    Assuming you have the right version of Oracle EBS, ERP Integrator queries the tables within the Oracle EBS database to get the appropriate information. In my case, the trial balance file was enough. Within the trial balance file you will have the appropriate dimension intersection (account, entity, period, etc.), the type of account (asset vs. liability, etc.) and finally the dollar amount.
    2) Are there any scheduling options available for data loading using ERP Integrator?
    Yes. You can use FDQM to map and validate the data, then load the data either through command-line automation or through the FDQM batch scheduler.
    3) What is the process for loading data to Planning using ERP Integrator?
    I'll do my best to summarize (assuming you are using FDQM): create rules in ERPi -> configure the adapters in the Workbench Client for the ERPi rules -> configure the FDQM Web Client to call the adapters set in the Workbench Client -> import the data into FDQM. From there you can call your command-line automation for batching if you wish.
    4) How do we load the data to Planning? (i.e. monthly load, hourly load)
    This depends on your business. Assuming you are going to load the data for budget and planning purposes, your business is probably happy with a monthly load (and most of the time this is the case). An hourly load might be helpful if you deal with users who need up-to-date actuals. Loading actuals hourly might be overkill for a budget or planning application, but I have run into situations where it is needed, and then found myself worried about speeding up the calculations after the data is loaded. Long story short, you can load monthly or hourly.

  • QUERY PERFORMANCE AND DATA LOADING PERFORMANCE ISSUES

    What are the query performance issues we need to take care of? Please explain and let me know the T-codes. This is urgent.
    What are the data loading performance issues we need to take care of? Please explain and let me know the T-codes. This is urgent.
    Will reward full points.
    Regards
    Guru

    BW Back end
    Some Tips -
    1)Identify long-running extraction processes on the source system. Extraction processes are performed by several extraction jobs running on the source system. The run-time of these jobs affects the performance. Use transaction code SM37 — Background Processing Job Management — to analyze the run-times of these jobs. If the run-time of data collection jobs lasts for several hours, schedule these jobs to run more frequently. This way, less data is written into update tables for each run and extraction performance increases.
    2)Identify high run-times for ABAP code, especially for user exits. The quality of any custom ABAP programs used in data extraction affects the extraction performance. Use transaction code SE30 — ABAP/4 Run-time Analysis — and then run the analysis for the transaction code RSA3 — Extractor Checker. The system then records the activities of the extraction program so you can review them to identify time-consuming activities. Eliminate those long-running activities or substitute them with alternative program logic.
    3)Identify expensive SQL statements. If database run-time is high for extraction jobs, use transaction code ST05 — Performance Trace. On this screen, select ALEREMOTE user and then select SQL trace to record the SQL statements. Identify the time-consuming sections from the results. If the data-selection times are high on a particular SQL statement, index the DataSource tables to increase the performance of selection (see no. 6 below). While using ST05, make sure that no other extraction job is running with ALEREMOTE user.
    4)Balance loads by distributing processes onto different servers if possible. If your site uses more than one BW application server, distribute the extraction processes to different servers using transaction code SM59 — Maintain RFC Destination. Load balancing is possible only if the extraction program allows the option.
    5)Set optimum parameters for data-packet size. Packet size affects the number of data requests to the database. Set the data-packet size to optimum values for an efficient data-extraction mechanism. To find the optimum value, start with a packet size in the range of 50,000 to 100,000 and gradually increase it. At some point, you will reach the threshold at which increasing packet size further does not provide any performance increase. To set the packet size, use transaction code SBIW — BW IMG Menu — on the source system. To set the data load parameters for flat-file uploads, use transaction code RSCUSTV6 in BW.
    6)Build indexes on DataSource tables based on selection criteria. Indexing DataSource tables improves the extraction performance, because it reduces the read times of those tables.
    7)Execute collection jobs in parallel. Like the Business Content extractors, generic extractors have a number of collection jobs to retrieve relevant data from DataSource tables. Scheduling these collection jobs to run in parallel reduces the total extraction time, and they can be scheduled via transaction code SM37 in the source system.
    8)Break up your data selections for InfoPackages and schedule the portions to run in parallel. This parallel upload mechanism sends different portions of the data to BW at the same time, and as a result the total upload time is reduced. You can schedule InfoPackages in the Administrator Workbench.
    You can upload data from a data target (InfoCube and ODS) to another data target within the BW system. While uploading, you can schedule more than one InfoPackage with different selection options in each one. For example, fiscal year or fiscal year period can be used as selection options. Avoid using parallel uploads for high volumes of data if hardware resources are constrained. Each InfoPackage uses one background process (if scheduled to run in the background) or dialog process (if scheduled to run online) of the application server, and too many processes could overwhelm a slow server.
    9)Build secondary indexes on the tables for the selection fields; this optimizes those tables for reading and reduces extraction time. If your selection fields are not key fields on the table, the primary index is not much help when accessing data. In this case it is better to create secondary indexes with the selection fields on the associated table using the ABAP Dictionary to improve selection performance (a hedged SQL sketch follows this list).
    10)Analyze upload times to the PSA and identify long-running uploads. When you extract the data using PSA method, data is written into PSA tables in the BW system. If your data is on the order of tens of millions, consider partitioning these PSA tables for better performance, but pay attention to the partition sizes. Partitioning PSA tables improves data-load performance because it's faster to insert data into smaller database tables. Partitioning also provides increased performance for maintenance of PSA tables — for example, you can delete a portion of data faster. You can set the size of each partition in the PSA parameters screen, in transaction code SPRO or RSCUSTV6, so that BW creates a new partition automatically when a threshold value is reached.
    11)Debug any routines in the transfer and update rules and eliminate single selects from the routines. Using single selects in custom ABAP routines for selecting data from database tables reduces performance considerably. It is better to use buffers and array operations. When you use buffers or array operations, the system reads data from the database tables and stores it in the memory for manipulation, improving performance. If you do not use buffers or array operations, the whole reading process is performed on the database with many table accesses, and performance deteriorates. Also, extensive use of library transformations in the ABAP code reduces performance; since these transformations are not compiled in advance, they are carried out during run-time.
    12)Before uploading a high volume of transaction data into InfoCubes, activate the number-range buffer for dimension IDs. The number-range buffer is a parameter that identifies the number of sequential dimension IDs stored in the memory. If you increase the number range before high-volume data upload, you reduce the number of reads from the dimension tables and hence increase the upload performance. Do not forget to set the number-range values back to their original values after the upload. Use transaction code SNRO to maintain the number range buffer values for InfoCubes.
    13)Drop the indexes before uploading high-volume data into InfoCubes. Regenerate them after the upload. Indexes on InfoCubes are optimized for reading data from the InfoCubes. If the indexes exist during the upload, BW reads the indexes and tries to insert the records according to the indexes, resulting in poor upload performance. You can automate the dropping and regeneration of the indexes through InfoPackage scheduling. You can drop indexes in the Manage InfoCube screen in the Administrator Workbench.
    14)IDoc (intermediate document) archiving improves the extraction and loading performance and can be applied on both BW and R/3 systems. In addition to IDoc archiving, data archiving is available for InfoCubes and ODS objects.
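    As a hedged illustration of tips 6 and 9, a secondary index on the extractor's selection fields might look like the sketch below. The table and column names are hypothetical placeholders; a real index would normally be defined in the ABAP Dictionary (SE11) so that it is transported with the system rather than created directly in the database.

    -- Hypothetical example only: index the selection fields the extractor filters on.
    CREATE INDEX zsrc_sel_idx
        ON zsrc_extract_tab (fiscal_year, company_code);

    If the index is needed only during extraction, it can be dropped again afterwards, which mirrors tip 13's advice of dropping and regenerating InfoCube indexes around a high-volume upload.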
    Hope it Helps
    Chetan
    @CP..

  • Problem converting static data load mapping to MOLAP

    Hi
    As a prototyping exercise I am converting some of our ROLAP dimensions and the corresponding data load mappings (one for static data, i.e. a "-1" ID to handle unknowns in the fact data, and one for real data coming from a table) to MOLAP.
    The dimension itself converts and deploys correctly and the real data mapping also redeploys and executes correctly.
    HOWEVER
    My static data mapping will not execute successfully.
    The mapping uses constants (ID = -1, NAME = 'UNKNOWN', etc.); not all attributes are linked (this has been tried). My column WH_ID, which was the ROLAP surrogate key, gets converted to VARCHAR2 as expected. The mapping does deploy cleanly.
    The error I get is below. I have been banging my head against this for a couple of days and tried searching the net and Metalink to no avail. I'm hoping someone out there can help.
    LOAD_STATIC_D_TRADER_IU
    Warning
    ORA-20101: 15:48:04 ***Error Occured in BUILD_DRIVER: In __XML_SEQUENTIAL_LOADER: In __XML_LOAD_ATTRS: Error loading attributes for hierarchy, D_TRADER.AW$NONE.HIERARCHY, level D_TRADER.TRADER.LEVEL, mapping group D_TRADER.TRADER.MAPGROUP1.DIMENSIONMAPGROUP. In __XML_LOAD_ATTRS_ITEM: In ___XML_LOAD_TEMPPRG: The SQL IMPORT command cannot convert from the TEXT type to the DECIMAL type.
    TRUNCATE_LOAD=false
    AW Execution status: Success
    15:48:00 Started Build(Refresh) of MARTS Analytic Workspace.
    15:48:00 Attached AW MARTS in RW Mode.
    15:48:01 Started Loading Dimensions.
    15:48:01 Started Loading Dimension Members.
    15:48:01 Started Loading Dimension Members for D_TRADER.DIMENSION (1 out of 1 Dimensions).
    15:48:03 Finished Loading Members for D_TRADER.DIMENSION. Added: 1. No Longer Present: 885.
    15:48:03 Finished Loading Dimension Members.
    15:48:03 Started Loading Hierarchies.
    15:48:03 Started Loading Hierarchies for D_TRADER.DIMENSION (1 out of 1 Dimensions).
    15:48:03 Finished Loading Hierarchies for D_TRADER.DIMENSION. 1 hierarchy(s) STANDARD Processed.
    15:48:03 Finished Loading Hierarchies.
    15:48:03 Started Loading Attributes.
    15:48:03 Started Loading Attributes for D_TRADER.DIMENSION (1 out of 1 Dimensions).
    15:48:04 Failed to Build(Refresh) MARTS Analytic Workspace.
    15:48:04 ***Error Occured in BUILD_DRIVER: In __XML_SEQUENTIAL_LOADER: In __XML_LOAD_ATTRS: Error loading attributes for hierarchy, D_TRADER.AW$NONE.HIERARCHY, level D_TRADER.TRADER.LEVEL, mapping group D_TRADER.TRADER.MAPGROUP1.DIMENSIONMAPGROUP. In __XML_LOAD_ATTRS_ITEM: In ___XML_LOAD_TEMPPRG: The SQL IMPORT command cannot convert from the TEXT type to the DECIMAL type.

    Hi, this looks like a bug in set-based mode when using numeric dimension attributes and loading them from a constant. Row-based mode, which stages the data before loading the AW, works, but you probably don't want that.
    A workaround is to add an expression operator in the map. You will have to add a link from a source table/constant into the expression operator to satisfy the map analyser. You can then add expressions for your numeric attributes in the expression operator's output group, define the values for each expression, and map these expression outputs (not the numeric constants) into your dimension. Hopefully this makes sense.
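    To make that a little more concrete, the expressions placed on the expression operator's output group are just literals or explicit conversions. Written as plain SQL for illustration (the output names below are hypothetical, and this is only a sketch of the idea, not OWB syntax):

    -- Feed the numeric attribute from an expression output instead of a raw constant,
    -- so the generated load no longer tries to convert TEXT into a DECIMAL target.
    SELECT TO_NUMBER('-1') AS wh_id_expr,   -- numeric surrogate key expression
           'UNKNOWN'        AS name_expr    -- text attribute expression
    FROM   dual;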
    Cheers
    David

  • ERPI data load error

    Hello,
    We are loading one month of data from PeopleSoft to a Planning application using ERPI 11.1.2.1. It takes about 1 hour to load one month of data. Is that normal? Looking at ODI, PS_GL_LOAD_BALANCES_DATA is always successful, but COMM_LOAD_BALANCES fails at the end with this error. Did anyone have this error before? The message in the HyS9aifWeb-sysout file is:
    FROM AIF_HS_BALANCES
    WHERE LOADID = 132
    [EssbaseRuleFile] Locking rules file AIFData
    [EssbaseRuleFile] Successfully locked rules file AIFData
    [EssbaseRuleFile] Copying rules file OWBPLAND for data load as AIFData
    [EssbaseRuleFile] Unlocking rules file AIFData
    [EssbaseRuleFile] Successfully unlocked rules file AIFData
    [EssbaseRuleFile] The data rules file has been created successfully.
    [EssbaseRuleFile] Locking rules file AIFData
    [EssbaseRuleFile] Successfully locked rules file AIFData
    [EssbaseRuleFile] Load data into the cube by launching rules file...
    <Mar 1, 2012 1:09:30 PM PST> <Error> <WebLogicServer> <BEA-000337> <[STUCK] ExecuteThread: '5' for queue: 'weblogic.kernel.Default (self-tuning)' has been busy for "625" seconds working on the request "weblogic.servlet.internal.ServletRequestImpl@3bc2189[
    POST /aif/services/HPLService HTTP/1.1
    Content-type: text/xml;charset="utf-8"
    Accept: text/xml, multipart/related, text/html, image/gif, image/jpeg, *; q=.2, */*; q=.2
    Soapaction: "http://DEVAPPHPN001.indymacdev.biz:19000/aif/services/HPLServiceexecuteDataLoad"
    User-Agent: JAX-WS RI 2.1.7-b01-
    Content-Length: 638
    Connection: Keep-Alive
    Proxy-Client-IP: 10.205.101.108
    X-Forwarded-For: 10.205.101.108
    X-WebLogic-KeepAliveSecs: 30
    X-WebLogic-Force-JVMID: 2124105652
    ]", which is more than the configured time (StuckThreadMaxTime) of "600" seconds. Stack trace:
    Thread-176 "[STUCK] ExecuteThread: '5' for queue: 'weblogic.kernel.Default (self-tuning)'" <alive, suspended, priority=1, DAEMON> {
    jrockit.net.SocketNativeIO.readBytesPinned(SocketNativeIO.java:???)
    jrockit.net.SocketNativeIO.socketRead(SocketNativeIO.java:24)
    java.net.SocketInputStream.socketRead0(SocketInputStream.java:???)
    java.net.SocketInputStream.read(SocketInputStream.java:107)
    com.essbase.services.olap.main.main_direct.EssNetClient.readFrom(Unknown Source)
    com.essbase.services.olap.main.main_direct.EssNetClient.readPackage(Unknown Source)
    com.essbase.services.olap.main.main_direct.EssNetClient.receiveResponse2(Unknown Source)
    com.essbase.services.olap.main.main_direct.EssNetClient.adNetReceiveResponse(Unknown Source)
    com.essbase.services.olap.main.main_direct.EssAPIData._adImport(Unknown Source)
    com.essbase.services.olap.main.main_direct.EssAPIData.ImportToEssbaseASO(Unknown Source)
    com.essbase.services.olap.main.main_direct.EssMAPIDir.BeginDataload(Unknown Source)
    com.essbase.server.framework.EssOlapMainService.BeginDataload(Unknown Source)
    com.essbase.server.framework.EssOrbPluginDirect.essMainBeginDataload(Unknown Source)
    com.essbase.api.session.EssOrbPlugin._invokeMainMethod(Unknown Source)
    com.essbase.api.session.EssOrbPlugin._invokeMethod2(Unknown Source)
    com.essbase.api.session.EssOrbPlugin._invokeMethod(Unknown Source)
    com.essbase.server.framework.EssOrbPluginDirect._invokeProtected(Unknown Source)
    com.essbase.api.session.EssOrbPluginEmbedded.invokeMethod(Unknown Source)
    com.essbase.api.session.EssOrbPluginEmbedded.invokeMethod(Unknown Source)
    com.essbase.api.session.EssOrbPlugin.essMainBeginDataload(Unknown Source)
    com.essbase.api.datasource.EssCube.beginDataload(Unknown Source)
    com.hyperion.aif.essbase.EssbaseRuleFile.executeDataRuleFile(EssbaseRuleFile.java:314)
    com.hyperion.aif.webservices.HPLService.executeDataLoad(HPLService.java:167)
    sun.reflect.GeneratedMethodAccessor7310.invoke(Unknown Source)
    sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    java.lang.reflect.Method.invoke(Method.java:575)
    org.apache.axis.providers.java.RPCProvider.invokeMethod(RPCProvider.java:397)
    org.apache.axis.providers.java.RPCProvider.processMessage(RPCProvider.java:71)
    org.apache.axis.providers.java.JavaProvider.invoke(JavaProvider.java:265)
    org.apache.axis.strategies.InvocationStrategy.visit(InvocationStrategy.java:32)
    org.apache.axis.SimpleChain.doVisiting(SimpleChain.java:113)
    org.apache.axis.SimpleChain.invoke(SimpleChain.java:78)
    org.apache.axis.handlers.soap.SOAPService.invoke(SOAPService.java:435)
    org.apache.axis.server.AxisServer.invoke(AxisServer.java:132)
    org.apache.axis.transport.http.AxisServlet.doPost(AxisServlet.java:586)
    javax.servlet.http.HttpServlet.service(HttpServlet.java:700)
    org.apache.axis.transport.http.AxisServletBase.service(AxisServletBase.java:325)
    javax.servlet.http.HttpServlet.service(HttpServlet.java:815)
    weblogic.servlet.internal.StubSecurityHelper$ServletServiceAction.run(StubSecurityHelper.java:224)
    weblogic.servlet.internal.StubSecurityHelper.invokeServlet(StubSecurityHelper.java:108)
    weblogic.servlet.internal.ServletStubImpl.execute(ServletStubImpl.java:206)
    weblogic.servlet.internal.TailFilter.doFilter(TailFilter.java:26)
    weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:55)
    oracle.security.jps.ee.http.JpsAbsFilter$1.run(JpsAbsFilter.java:111)
    oracle.security.jps.util.JpsSubject.doAsPrivileged(JpsSubject.java:299)
    oracle.security.jps.ee.util.JpsPlatformUtil.runJaasMode(JpsPlatformUtil.java:405)
    oracle.security.jps.ee.http.JpsAbsFilter.runJaasMode(JpsAbsFilter.java:94)
    oracle.security.jps.ee.http.JpsAbsFilter.doFilter(JpsAbsFilter.java:98)
    oracle.security.jps.ee.http.JpsFilter.doFilter(JpsFilter.java:70)
    weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:55)
    oracle.dms.servlet.DMSServletFilter.doFilter(DMSServletFilter.java:86)
    weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:55)
    weblogic.servlet.internal.RequestEventsFilter.doFilter(RequestEventsFilter.java:25)
    weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:55)
    weblogic.servlet.internal.WebAppServletContext$ServletInvocationAction.wrapRun(WebAppServletContext.java:3687)
    weblogic.servlet.internal.WebAppServletContext$ServletInvocationAction.run(WebAppServletContext.java:3681)
    weblogic.security.acl.internal.AuthenticatedSubject.doAs(AuthenticatedSubject.java:308)
    weblogic.security.service.SecurityManager.runAs(SecurityManager.java:116)
    weblogic.servlet.internal.WebAppServletContext.securedExecute(WebAppServletContext.java:2213)
    weblogic.servlet.internal.WebAppServletContext.execute(WebAppServletContext.java:2135)
    weblogic.servlet.internal.ServletRequestImpl.run(ServletRequestImpl.java:1420)
    weblogic.work.ExecuteThread.execute(ExecuteThread.java:201)
    weblogic.work.ExecuteThread.run(ExecuteThread.java:168)
    Any feedback is appreciated. Thank you for taking time to read through.
    Regards.

    Hello,
    Part of the message you are getting is because the WebLogic process is a bit intuitive: it knows when something is running and waits, and it flags anything running longer than a certain period of time as "stalled/stuck" and reports it. The time it waits is dependent on the WebLogic configuration.
    As for the time it takes to load the data, that is very vague. Without knowing more about your environment it is very hard to say; we know nothing about the hardware, platform, dataset size, etc. You would need to review ODI to find out which step is taking the most time, and try to replicate it outside of ERPi/ODI to see if you get the same results.
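    As a starting point for that review, the staging table that already appears in the log above can be queried directly to gauge how much data the failing load carries. Only the table name and the LOADID column are taken from the log; anything beyond that would be an assumption.

    -- How many staged rows does the failing load contain?
    SELECT COUNT(*)
    FROM   aif_hs_balances
    WHERE  loadid = 132;   -- the LOADID shown in the HyS9aifWeb-sysout snippet

    Timing that query, and comparing it with how long the Essbase data load of the same rows takes, helps separate database time from the Essbase import that the stuck thread is waiting on.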
    Thank you,

  • Poor Data Load Performance Issue - BP Default Addr (0BP_DEF_ADDR_ATTR)

    Hello Experts:
    We are having a significant performance issue with the Business Partner Default Address extractor (0BP_DEF_ADDRESS_ATTR).  Our extract is exceeding 20 hours for about 2 million BP records.  This was loading the data from R/3 to BI -- Full Load to PSA only. 
    We are currently on BI 3.5 with a PI_BASIS level of SAPKIPYJ7E on the R/3 system. 
    We have applied the following notes from later support packs in hopes of resolving the problem, as well as doubling our data packet MAXSIZE. Both changes did have a positive effect on the data load, but not enough to get the extract down to an acceptable time.
    These are the notes we have applied:
    From Support Pack SAPKIPYJ7F
    Note 1107061     0BP_DEF_ADDRESS_ATTR delivers incorrect Address validities
    Note 1121137     0BP_DEF_ADDRESS_ATTR Returns less records - Extraction RSA3
    From Support Pack SAPKIPYJ7H
    Note 1129755     0BP_DEF_ADDRESS_ATTR Performance Problems
    Note 1156467     BUPTDTRANSMIT not Updating Delta queue for Address Changes
    And the correction noted in:
    SAP Note 1146037 - 0BP_DEF_ADDRESS_ATTR Performance Problems
    We have also executed re-orgs on the ADRC and BUT0* tables and verified the appropriate indexes are in place.  However, the data load is still taking many hours.  My expectations were that the 2M BP address records would load in an hour or less; seems reasonable to me.
    If anyone has additional ideas, I would much appreciate it. 
    Thanks.
    Brian

  • Data Load Performance Issue

    Hi Gurus,
    We are loading data from one source into two BW systems (A and B). The loads of system A only takes 2 hours, while the same loads in system B takes more than 6 hours usually. System B is the upgraded version, which is supposed to be faster than system A.
    One finding is that in system B, BW picks only 5,000 records at a time from all source tables, while in system A it picks 50,000 records at a time. I checked the system setting in RSCUSTV6; it says the data package size is 50,000. Isn't that strange?
    Any idea why this is happening? Please advise! Points will be generously awarded.
    Thanks!
    Linda

    That is 50,000 records for 1 info-idoc...
    For more info...
    Maximum size of a data packet in kilobytes:
    The individual records are sent in packages of varying sizes in the data transfer to the Business Information Warehouse. Using these parameters you determine the maximum size of such a package and therefore how much of the main memory may be used for the creation of the data package.
    SAP recommends a data package size between 10 and 50 MB (see the worked example below).
    Frequency with which status IDocs are sent:
    With this frequency you establish how many data IDocs should be sent in an Info IDoc.
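    As a rough, purely illustrative calculation (the numbers are assumptions, not your system's values): with a maximum packet size of 20,000 KB and an average record width of 2 KB, one data packet holds roughly 20,000 / 2 = 10,000 records; if the setting effectively in force for system B caps the packet at a fifth of that, the same extraction arrives in packets of about 5,000 records, which is the symptom described above. Comparing the packet-size and IDoc-frequency settings maintained for each target system (SBIW on the source, RSCUSTV6 in BW) should show where the 5,000-record limit comes from.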
    Hope this helps.

  • " Data Load error issues in  DTP "

    Hi Friends,
    When I load data from R/3 to BI,
    my data is loaded into the PSA.
    After that, when I run the DTP, the data does not reach the MultiProvider.
    Please let me know the solution so that I can run the DTP.

    Hi,
    The DTP loads the data into the data targets.
    So, after checking the PSA, check the data target (InfoCube or DSO) that is being loaded.
    If the data is there, then check the connection between the data targets and the MultiProvider.
    Or
    give me a clearer and more complete picture.
    Hope you got the idea.

  • FDM ERPI data load rule error

    Hi,
    When I am extracting data from EBS to Planning (running the data rule), I get the error below:
    Error: Unable to launch process. Creation of Data Rule File failed.
    Can you please tell me why I am getting this error?
    Thanks in advance

    Please do not post your question in existing threads.
    Please take a moment to review this posting:
    Forum Posting Tips & Etiquette

  • Issues with ondemand Data loader

    Hello,
    We are facing two issues with the On Demand data loader.
    Issue 1
    While inserting 'Contacts' and 'Assets', if the 'Account' information is wrong the records are created without accounts, even though "Account" is a required field.
    Issue 2
    While inserting records, the data loader does not check for duplicates, so duplicate records are getting created.
    Kindly advise if anyone has come across similar issues. Thanks
    Dipu

  • Load monitoring issue

    Hi,
    In the load monitoring process chain we have an issue with one master data load. The issue shows the message "Error message when processing in the Business Warehouse", and in the monitor there are 2 duplicate records. How can I solve the issue and continue the process chain? Could anyone please suggest how to resolve this?
    Thanks!
    regards,
    Buvana.

    Hi Buvana,
    There is one option you can try, though I am not sure whether it is correct. Go to the DTP, then to the Update tab, where you can see an option called "Handle Duplicate Record Keys"; check that option. It should work. In the past, when we had a similar error with duplicate records, we solved it the same way.
    Regards,
    Raghu.

  • Unable to Define EBS R12 Data Load Mappings in FDMEE

    Hi,
    I am trying to integrate EBS R12 and HFM using FDMEE 11.1.2.3.500. Here are the steps that I followed:
    a. Configured the EBS_DATA_SERVERS and PHYSICAL SCHEMAS in ODI to point to EBS Instance using APPS User
    b. Defined EBS R12 Source System and Clicked INITIALIZE. (Validated that the Initialization Process is completed without any Errors)
    c. Selected the Appropriate SOURCE ACCOUNTING ENTITIES for the R12 Source System
    Now, when I define a new IMPORT LOCATION, I am unable to see any EBS-related SOURCE SEGMENTS in the DATA LOAD MAPPING grid of the Import Format.
    Have I done something incorrectly? Also, strangely, I don't even see any RESPONSIBILITIES in the SOURCE ACCOUNTING ENTITIES when I want to define the Drill Through feature.
    Thanks,
    Mohit

    One reason that you might not see entries in the segments drop down is due to the entries being filtered according to browser language or FDMEE user language settings for localization purposes. You might want to try adjusting the Setup > User Settings > User Language in FDMEE to see if that can make the segments appear.
    We've got a US English EBS 12 source system to test with. If I change Setup > User Settings > User Language in FDMEE to something other than English (United States), then save and go back to the import format screen, the segment drop-down boxes are completely empty; if I change back to en-US, they're back again. I believe that either the browser accept language or the User Language setting in FDMEE needs to match the 'base language' on the source system registration screen.
    With a bit of luck that might be it.

  • 'Select a measure:' stuck on 'Loading...' in Dashboard Designer KPI Dimensional Data Source Mapping

    [using SharePoint 2013 Enterprise SP1]
    I am trying to create a KPI in Dashboard Designer, but am getting a timeout. I have been doing this for a while on my site; this is not the first. I haven't had this problem before.
    I created a new KPI and clicked on the Data Mappings column value, which is a hyperlink, to bring up the Dimensional Data Source Mapping dialog. I switched to a Data Connection in the site I just created (DC works perfectly and can retrieve sample data).
    When I click the "Select a measure:" drop-down menu, I get the message "Loading..." and after a while (a minute? two?) a dialog pops up with:
    The request took too long to complete. SharePoint is currently unavailable or experiencing heavy traffic. Try again later.
    This is a test SP server and I'm the only one on it, so there is no load. Also, as mentioned, I am able to verify the Data Connection without problem. I am not having any issue with any of my other few dozen KPIs/Data Connections. Any suggestions as to how to troubleshoot?

    Hi cgtyoder,
    According to your description, my understanding is that you got an error when you created a KPI in Dashboard Designer.
    Please try recycling the PerformancePoint Services application pool account and compare the result.
    Please go to C:\inetpub\wwwroot\wss\VirtualDirectories\<port of the web application> and adjust the HttpRuntime executionTimeout for the web application by modifying the web.config; PerformancePoint report stability should then be much better:
    <httpRuntime executionTimeout="600" maxRequestLength="51200" />
    Note: before you change the web.config file, please make a backup of the file.
    If this issue still exists, please go to the log file to find more information about this issue.
    I hope this helps.
    Thanks,
    Wendy
    Forum Support
    Wendy Li
    TechNet Community Support

  • DSO data Load issue

    Hi all,
    I have an issue with a DSO data load. While loading, the data comes into the PSA perfectly (238 records arrive), but when I trigger the DTP, I am getting only 6 records.
    Please can anyone suggest a solution?
    Thanks,
    Gayatri.

    Hi Gayatri,
    If you have already loaded some data to the DSO and you are now trying to do a delta, it is possible that it is picking up only the delta data.
    (or)
    You may have start/end routines or rule routines that delete records based on some conditions.
    (or)
    It also depends on the key fields you have selected in the DSO. If the key field you selected has repeated values, the records will be aggregated while loading into the DSO, i.e. if you have 10 rows for a key field value of, say, 101, then the DSO will be loaded with only one row for value 101 (10 rows becoming 1 row), with the key figure either summed or overwritten depending on what you selected in the rule details for the key figure (you can check this by right-clicking the key figure mapping > Rule Details; there you can see whether it is Overwrite or Summation). A small SQL sketch of this follows below.
    Also, as mentioned in the posts above, you can check the DSO > Manage screen for the number of rows transferred and the number of rows added.
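    As a rough SQL illustration of that aggregation behaviour (the table and column names are hypothetical, and BW does not literally execute this statement):

    -- 10 incoming rows that share the same key value collapse into a single DSO row.
    -- "Summation" behaves like SUM(amount); "Overwrite" keeps the last value loaded.
    SELECT doc_key,
           SUM(amount) AS amount
    FROM   incoming_package
    GROUP BY doc_key;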
    Hope it is clear & helpful!
    Regards,
    Pavan
