Bulk load Supporting details

Hi All,
I was looking for a way to bulk load and extract supporting details into/from Hyperion Planning. The version is 4.0.2.2.
Would Smart View do it??
Thanks

You will need to use SQL to bulk load and/or extract supporting detail.
The relevant tables in your specific Planning application database are listed below:
Table name:      HSP_COLUMN_DETAIL
Description:     One row exists in this table per cell that has supporting detail. It stores the dimension member IDs that identify the cell, plus the DETAIL_ID used to locate that cell's detail lines.
Table name:      HSP_OBJECT
Description:     This table is used to translate the DIM1 - DIM22 IDs into member names. OBJECT_NAME is the member name. If the member has an alias, the HSP_ALIAS table must also be queried.
Table name:      HSP_COLUMN_DETAIL_ITEM
Description:     One row exists in this table per supporting detail line. It stores each line's label, operator, value and position, linked back to HSP_COLUMN_DETAIL through DETAIL_ID.
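As a rough illustration of the extract side (the HSP_COLUMN_DETAIL_ITEM column names below - LABEL, OPERATOR, VALUE, POSITION - are from memory and can vary by release, so describe the tables in your own schema first), a query along these lines joins the detail header to its line items and resolves one dimension ID to a member name:

-- Sketch only: repeat the HSP_OBJECT join for DIM2..DIM22 as required by your application
SELECT o1.OBJECT_NAME   AS dim1_member,
       i.LABEL          AS detail_label,
       i.OPERATOR       AS detail_operator,
       i.VALUE          AS detail_value
FROM   HSP_COLUMN_DETAIL d,
       HSP_COLUMN_DETAIL_ITEM i,
       HSP_OBJECT o1
WHERE  i.DETAIL_ID = d.DETAIL_ID
AND    o1.OBJECT_ID = d.DIM1
ORDER  BY d.DETAIL_ID, i.POSITION;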
Regards,
-John

Similar Messages

  • Does ODI Load Hyperion Planning Supporting Detail?

    Does the Hyperion Planning KM allow loading supporting detail into Planning?

    Hi,
    None of the Planning KMs currently allow loading of supporting detail.
    Cheers
    John
    http://john-goodwin.blogspot.com/

  • Report on Supporting Details

    Hi,
    Our client has purchased a tool to load supporting details directly into Planning, in order to have the granularity of the cost details. Is there any way we can generate a report of the supporting details? Would that be restricted to that particular combination, or can we have a report of the complete P&L where the accounts that have supporting details also show their detail lines listed below them?
    Is this view possible? Please let me know.
    Thanks,

    hi,
    A data form (which from a business point of view might be any report) can have supporting details, which are easily spotted by their green colour.
    For example, if a cell holds an expense of some kind and it has supporting details, you can dig into it to find its constituents, e.g. travel expense, hotel expense, etc.
    It is like a built-in calculator, and it does display the list. Revert if you need any further info.
    Sandeep Reddy Enti
    HCC
    http://hyperionconsultancy.com/

  • Forgot to check supporting detail box on Copy Process ... need workaround

    We have rolled over our budget year via the Planning Copy Process (ver. 11.1.2.1). In the rush I did the copy without checking the copy supporting details box.
    Users in the new budget have already made their adjustments to the budget, therefore I cannot simply rerun the copy process (or can I, by exporting the existing data, doing the copy, then reimporting the new data?).
    Looking for ideas/thoughts on how to roll over the supporting detail.
    JTS

    John,
    That worked, but I accidentally duplicated a scenario twice. In the supporting detail for that column's cells, the detail lines show up duplicated and in red as errors.
    I removed the error lines and hit save, and got the infamous 'An error has occurred, check the log'.
    I tried to delete all supporting detail and still got the error.
    Before going through Oracle, I am thinking something in the SQL tables is duplicated and needs to be removed.
    Just wanted to see if anyone else has accidentally loaded supporting detail into a scenario twice, and what steps they took to resolve it.
    JTS

  • Load Data to supporting details Hyperion Planning

    Hi All,
    I want to load data into supporting details in Hyperion Planning; does anyone know how to do it?
    As far as I know, supporting details are only available in Planning's repository.
    I would really appreciate it if anyone could help me.
    I am using Hyperion 11 and ODI.
    Thx

    Hi,
    You could look at using LCM; you will need to export supporting details first to understand the format of the XML.
    You can also use ODI with LCM. I wrote a blog about using LCM with ODI which should give you some ideas; have a read here
    Cheers
    John
    http://john-goodwin.blogspot.com/

  • Bulk load of user security details

    Hi,
    Any tips on how to do a bulk load of user details in BPC? We have a list of users and their required authorisations. Maybe using scripts or something else? I know using DTS is one way, but I am trying to find out whether there is any other workaround.

    My advice to customers and partners is to always build a security matrix in Excel to determine all the assignments. The matrix helps you check that you have captured all the correct teams, task assignments, and access. Try to first set up the users only, without any access, then set up the Member Access profiles and Task profiles. Bring this all together via the TEAM assignment, since a team may have only one Task profile but multiple Member Access profiles. While building security may take time, there are methods to minimize current and future maintenance after the initial setup. Also, once you set up the process via the admin console, you will see in the table structures just how complex the assignments are for each of the components. Once the tables are set, I still believe (and I may be wrong) that an admin needs to process at least the TEAMS from the admin console to establish the connections required for users to get access.
    Hope this helps.

  • Error when doing a   ATGOrder Bulk load

    Hi
    Getting the below error when trying to do a bulk load of ATGOrder in CSC.
    Machine details: Linux 64-bit
    ATG version: 10.1
    17:44:07,487 INFO [OrderOutputConfig] Starting bulk load
    17:44:11,482 WARN [loggerI18N] [com.arjuna.ats.internal.jta.recovery.xarecovery1] Local XARecoveryModule.xaRecovery got XA exception javax.transaction.xa.XAException, XAException.XAER_RMERR
    17:44:11,488 WARN [loggerI18N] [com.arjuna.ats.internal.jta.recovery.xarecovery1] Local XARecoveryModule.xaRecovery got XA exception javax.transaction.xa.XAException, XAException.XAER_RMERR
    17:44:11,495 WARN [loggerI18N] [com.arjuna.ats.internal.jta.recovery.xarecovery1] Local XARecoveryModule.xaRecovery got XA exception javax.transaction.xa.XAException, XAException.XAER_RMERR
    17:44:17,651 WARN [LiveIndexingService] Current hosts for environment ATGOrderBulk cannot support requested engine count
    17:44:17,652 WARN [LiveIndexingService] Allocate more hosts or increase the maximum number of search engines for one of its hosts
    17:44:17,656 ERROR [LiveIndexingService] Unable to release lock: __routingLiveIndexingLock:ATGOrder
    atg.service.lockmanager.LockManagerException: Attempt to release a write lock when not the owner: key=__routingLiveIndexingLock:ATGOrder Owner=Thread[http-0.0.0.0-8580-1:ipaddr=172.21.21.49;path=/dyn/admin/nucleus/atg/commerce/search/OrderOutputConfig/;sessionid=B0DC1551B81ACFD6B7C987E59116D825,5,jboss]
    at atg.service.lockmanager.ClientLockEntry.releaseWriteLock(ClientLockEntry.java:713)
    at atg.service.lockmanager.ClientLockManager.releaseWriteLock(ClientLockManager.java:1386)
    at atg.service.lockmanager.ClientLockManager.releaseWriteLock(ClientLockManager.java:1415)
    at atg.search.routing.LiveIndexingService.releaseLock(LiveIndexingService.java:1843)
    at atg.search.routing.LiveIndexingService.prepareIndexing(LiveIndexingService.java:1455)
    at atg.repository.search.indexing.submitter.LiveDocumentSubmitter.beginSession(LiveDocumentSubmitter.java:193)
    at atg.repository.search.indexing.BulkLoaderImpl.bulkLoad(BulkLoaderImpl.java:921)
    at atg.repository.search.indexing.IndexingOutputConfig.bulkLoad(IndexingOutputConfig.java:1610)
    at atg.repository.search.indexing.IndexingOutputConfig.bulkLoad(IndexingOutputConfig.java:1563)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at atg.nucleus.ServiceAdminServlet.printMethodInvocation(ServiceAdminServlet.java:1463)
    at atg.nucleus.ServiceAdminServlet.service(ServiceAdminServlet.java:251)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:717)
    at atg.nucleus.Nucleus.service(Nucleus.java:2967)
    at atg.nucleus.Nucleus.service(Nucleus.java:2867)
    at atg.servlet.pipeline.DispatcherPipelineServletImpl.service(DispatcherPipelineServletImpl.java:253)
    at atg.servlet.pipeline.PipelineableServletImpl.passRequest(PipelineableServletImpl.java:157)
    at atg.servlet.pipeline.ServletPathPipelineServlet.service(ServletPathPipelineServlet.java:208)
    at atg.servlet.pipeline.PipelineableServletImpl.passRequest(PipelineableServletImpl.java:157)
    at atg.security.ExpiredPasswordAdminServlet.service(ExpiredPasswordAdminServlet.java:312)
    at atg.servlet.pipeline.PipelineableServletImpl.passRequest(PipelineableServletImpl.java:157)
    at atg.servlet.pipeline.BasicAuthenticationPipelineServlet.service(BasicAuthenticationPipelineServlet.java:513)
    at atg.servlet.pipeline.PipelineableServletImpl.passRequest(PipelineableServletImpl.java:157)
    at atg.servlet.pipeline.DynamoPipelineServlet.service(DynamoPipelineServlet.java:491)
    at atg.servlet.pipeline.PipelineableServletImpl.passRequest(PipelineableServletImpl.java:157)
    at atg.dtm.TransactionPipelineServlet.service(TransactionPipelineServlet.java:249)
    at atg.servlet.pipeline.PipelineableServletImpl.passRequest(PipelineableServletImpl.java:157)
    at atg.servlet.pipeline.HeadPipelineServlet.passRequest(HeadPipelineServlet.java:1271)
    at atg.servlet.pipeline.HeadPipelineServlet.service(HeadPipelineServlet.java:952)
    at atg.servlet.pipeline.PipelineableServletImpl.service(PipelineableServletImpl.java:272)
    at atg.nucleus.servlet.NucleusProxyServlet.service(NucleusProxyServlet.java:237)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:717)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:290)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
    at org.jboss.web.tomcat.filters.ReplyHeaderFilter.doFilter(ReplyHeaderFilter.java:96)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
    at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:235)
    at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:191)
    at org.jboss.web.tomcat.security.SecurityAssociationValve.invoke(SecurityAssociationValve.java:183)
    at org.jboss.web.tomcat.security.JaccContextValve.invoke(JaccContextValve.java:95)
    at org.jboss.web.tomcat.security.SecurityContextEstablishmentValve.process(SecurityContextEstablishmentValve.java:126)
    at org.jboss.web.tomcat.security.SecurityContextEstablishmentValve.invoke(SecurityContextEstablishmentValve.java:70)
    at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:127)
    at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:102)
    at org.jboss.web.tomcat.service.jca.CachedConnectionValve.invoke(CachedConnectionValve.java:158)
    at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:109)
    at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:330)
    at org.apache.coyote.http11.Http11Processor.process(Http11Processor.java:829)
    at org.apache.coyote.http11.Http11Protocol$Http11ConnectionHandler.process(Http11Protocol.java:598)
    at org.apache.tomcat.util.net.JIoEndpoint$Worker.run(JIoEndpoint.java:451)
    at java.lang.Thread.run(Thread.java:662)
    17:44:17,658 ERROR [BulkLoader]
    atg.repository.search.indexing.IndexingException: atg.search.routing.LiveIndexException: Unable to prepare engines for live indexing.
    at atg.repository.search.indexing.submitter.LiveDocumentSubmitter.beginSession(LiveDocumentSubmitter.java:209)
    at atg.repository.search.indexing.BulkLoaderImpl.bulkLoad(BulkLoaderImpl.java:921)
    at atg.repository.search.indexing.IndexingOutputConfig.bulkLoad(IndexingOutputConfig.java:1610)
    at atg.repository.search.indexing.IndexingOutputConfig.bulkLoad(IndexingOutputConfig.java:1563)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at atg.nucleus.ServiceAdminServlet.printMethodInvocation(ServiceAdminServlet.java:1463)
    at atg.nucleus.ServiceAdminServlet.service(ServiceAdminServlet.java:251)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:717)
    at atg.nucleus.Nucleus.service(Nucleus.java:2967)
    at atg.nucleus.Nucleus.service(Nucleus.java:2867)
    at atg.servlet.pipeline.DispatcherPipelineServletImpl.service(DispatcherPipelineServletImpl.java:253)
    at atg.servlet.pipeline.PipelineableServletImpl.passRequest(PipelineableServletImpl.java:157)
    at atg.servlet.pipeline.ServletPathPipelineServlet.service(ServletPathPipelineServlet.java:208)
    at atg.servlet.pipeline.PipelineableServletImpl.passRequest(PipelineableServletImpl.java:157)
    at atg.security.ExpiredPasswordAdminServlet.service(ExpiredPasswordAdminServlet.java:312)
    at atg.servlet.pipeline.PipelineableServletImpl.passRequest(PipelineableServletImpl.java:157)
    at atg.servlet.pipeline.BasicAuthenticationPipelineServlet.service(BasicAuthenticationPipelineServlet.java:513)
    at atg.servlet.pipeline.PipelineableServletImpl.passRequest(PipelineableServletImpl.java:157)
    at atg.servlet.pipeline.DynamoPipelineServlet.service(DynamoPipelineServlet.java:491)
    at atg.servlet.pipeline.PipelineableServletImpl.passRequest(PipelineableServletImpl.java:157)
    at atg.dtm.TransactionPipelineServlet.service(TransactionPipelineServlet.java:249)
    at atg.servlet.pipeline.PipelineableServletImpl.passRequest(PipelineableServletImpl.java:157)
    at atg.servlet.pipeline.HeadPipelineServlet.passRequest(HeadPipelineServlet.java:1271)
    at atg.servlet.pipeline.HeadPipelineServlet.service(HeadPipelineServlet.java:952)
    at atg.servlet.pipeline.PipelineableServletImpl.service(PipelineableServletImpl.java:272)
    at atg.nucleus.servlet.NucleusProxyServlet.service(NucleusProxyServlet.java:237)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:717)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:290)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
    at org.jboss.web.tomcat.filters.ReplyHeaderFilter.doFilter(ReplyHeaderFilter.java:96)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
    at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:235)
    at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:191)
    at org.jboss.web.tomcat.security.SecurityAssociationValve.invoke(SecurityAssociationValve.java:183)
    at org.jboss.web.tomcat.security.JaccContextValve.invoke(JaccContextValve.java:95)
    at org.jboss.web.tomcat.security.SecurityContextEstablishmentValve.process(SecurityContextEstablishmentValve.java:126)
    at org.jboss.web.tomcat.security.SecurityContextEstablishmentValve.invoke(SecurityContextEstablishmentValve.java:70)
    at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:127)
    at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:102)
    at org.jboss.web.tomcat.service.jca.CachedConnectionValve.invoke(CachedConnectionValve.java:158)
    at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:109)
    at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:330)
    at org.apache.coyote.http11.Http11Processor.process(Http11Processor.java:829)
    at org.apache.coyote.http11.Http11Protocol$Http11ConnectionHandler.process(Http11Protocol.java:598)
    at org.apache.tomcat.util.net.JIoEndpoint$Worker.run(JIoEndpoint.java:451)
    at java.lang.Thread.run(Thread.java:662)
    Caused by: atg.search.routing.LiveIndexException: Unable to prepare engines for live indexing.
    at atg.search.routing.LiveIndexingService.prepareBulkIndexing(LiveIndexingService.java:1629)
    at atg.search.routing.LiveIndexingService.prepareIndexing(LiveIndexingService.java:1444)
    at atg.repository.search.indexing.submitter.LiveDocumentSubmitter.beginSession(LiveDocumentSubmitter.java:193)
    ... 49 more
    Caused by: atg.search.routing.LiveIndexException: Current supported by hosts engine count is less than required count of engines
    at atg.search.routing.LiveIndexingService.prepareEnginesForLiveIndexingOperation(LiveIndexingService.java:1161)
    at atg.search.routing.LiveIndexingService.prepareEnginesForLiveIndexingOperation(LiveIndexingService.java:1063)
    at atg.search.routing.LiveIndexingService.prepareBulkIndexing(LiveIndexingService.java:1625)
    ... 51 more
    17:44:17,675 ERROR [OrderOutputConfig]
    atg.repository.search.indexing.IndexingException: atg.repository.search.indexing.IndexingException: atg.search.routing.LiveIndexException: Unable to prepare engines for live indexing.
    at atg.repository.search.indexing.BulkLoaderImpl.bulkLoad(BulkLoaderImpl.java:1040)
    at atg.repository.search.indexing.IndexingOutputConfig.bulkLoad(IndexingOutputConfig.java:1610)
    at atg.repository.search.indexing.IndexingOutputConfig.bulkLoad(IndexingOutputConfig.java:1563)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at atg.nucleus.ServiceAdminServlet.printMethodInvocation(ServiceAdminServlet.java:1463)
    at atg.nucleus.ServiceAdminServlet.service(ServiceAdminServlet.java:251)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:717)
    at atg.nucleus.Nucleus.service(Nucleus.java:2967)
    at atg.nucleus.Nucleus.service(Nucleus.java:2867)
    at atg.servlet.pipeline.DispatcherPipelineServletImpl.service(DispatcherPipelineServletImpl.java:253)
    at atg.servlet.pipeline.PipelineableServletImpl.passRequest(PipelineableServletImpl.java:157)
    at atg.servlet.pipeline.ServletPathPipelineServlet.service(ServletPathPipelineServlet.java:208)
    at atg.servlet.pipeline.PipelineableServletImpl.passRequest(PipelineableServletImpl.java:157)
    at atg.security.ExpiredPasswordAdminServlet.service(ExpiredPasswordAdminServlet.java:312)
    at atg.servlet.pipeline.PipelineableServletImpl.passRequest(PipelineableServletImpl.java:157)
    at atg.servlet.pipeline.BasicAuthenticationPipelineServlet.service(BasicAuthenticationPipelineServlet.java:513)
    at atg.servlet.pipeline.PipelineableServletImpl.passRequest(PipelineableServletImpl.java:157)
    at atg.servlet.pipeline.DynamoPipelineServlet.service(DynamoPipelineServlet.java:491)
    at atg.servlet.pipeline.PipelineableServletImpl.passRequest(PipelineableServletImpl.java:157)
    at atg.dtm.TransactionPipelineServlet.service(TransactionPipelineServlet.java:249)
    at atg.servlet.pipeline.PipelineableServletImpl.passRequest(PipelineableServletImpl.java:157)
    at atg.servlet.pipeline.HeadPipelineServlet.passRequest(HeadPipelineServlet.java:1271)
    at atg.servlet.pipeline.HeadPipelineServlet.service(HeadPipelineServlet.java:952)
    at atg.servlet.pipeline.PipelineableServletImpl.service(PipelineableServletImpl.java:272)
    at atg.nucleus.servlet.NucleusProxyServlet.service(NucleusProxyServlet.java:237)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:717)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:290)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
    at org.jboss.web.tomcat.filters.ReplyHeaderFilter.doFilter(ReplyHeaderFilter.java:96)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
    at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:235)
    at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:191)
    at org.jboss.web.tomcat.security.SecurityAssociationValve.invoke(SecurityAssociationValve.java:183)
    at org.jboss.web.tomcat.security.JaccContextValve.invoke(JaccContextValve.java:95)
    at org.jboss.web.tomcat.security.SecurityContextEstablishmentValve.process(SecurityContextEstablishmentValve.java:126)
    at org.jboss.web.tomcat.security.SecurityContextEstablishmentValve.invoke(SecurityContextEstablishmentValve.java:70)
    at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:127)
    at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:102)
    at org.jboss.web.tomcat.service.jca.CachedConnectionValve.invoke(CachedConnectionValve.java:158)
    at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:109)
    at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:330)
    at org.apache.coyote.http11.Http11Processor.process(Http11Processor.java:829)
    at org.apache.coyote.http11.Http11Protocol$Http11ConnectionHandler.process(Http11Protocol.java:598)
    at org.apache.tomcat.util.net.JIoEndpoint$Worker.run(JIoEndpoint.java:451)
    at java.lang.Thread.run(Thread.java:662)
    Caused by: atg.repository.search.indexing.IndexingException: atg.search.routing.LiveIndexException: Unable to prepare engines for live indexing.
    at atg.repository.search.indexing.submitter.LiveDocumentSubmitter.beginSession(LiveDocumentSubmitter.java:209)
    at atg.repository.search.indexing.BulkLoaderImpl.bulkLoad(BulkLoaderImpl.java:921)
    ... 48 more
    Caused by: atg.search.routing.LiveIndexException: Unable to prepare engines for live indexing.
    at atg.search.routing.LiveIndexingService.prepareBulkIndexing(LiveIndexingService.java:1629)
    at atg.search.routing.LiveIndexingService.prepareIndexing(LiveIndexingService.java:1444)
    at atg.repository.search.indexing.submitter.LiveDocumentSubmitter.beginSession(LiveDocumentSubmitter.java:193)
    ... 49 more
    Caused by: atg.search.routing.LiveIndexException: Current supported by hosts engine count is less than required count of engines
    at atg.search.routing.LiveIndexingService.prepareEnginesForLiveIndexingOperation(LiveIndexingService.java:1161)
    at atg.search.routing.LiveIndexingService.prepareEnginesForLiveIndexingOperation(LiveIndexingService.java:1063)
    at atg.search.routing.LiveIndexingService.prepareBulkIndexing(LiveIndexingService.java:1625)
    ... 51 more

    In my /atg/search/routing/LiveIndexingService/ component i have the following values.
    ATGProfile      running      yes      yes      8000001      null      1      1      1      start stop cycle delete
    backup restore disable
    ATGProfileBulk      stopped      NO      yes      null      null      1      0      0      start stop cycle delete
    backup restore disable
    ATGOrder      running      yes      yes      8000002      null      1      4      4      start stop cycle delete
    backup restore disable
    ATGOrderBulk      stopped      NO      yes      null      null      1      0      0      start stop cycle delete
    backup restore disable
    Why are there 4 engines running for ATGOrder? I think this is what is causing the problem, but I am unable to find where these 4 engines are being created.

  • PL/SQL Bulk Loading

    Hello,
    I have one question regarding bulk loading. I have done a lot of bulk loading.
    My requirement is to call a function that performs some DML and returns a reference key so that I can insert it into a fact table.
    I can't call a DML function in a SELECT statement (that raises an error). The other way is to use an autonomous transaction, which I tried and it works, but performance is very slow.
    How do I call this function inside the bulk loading process?
    Help !!
    xx_f is the function that uses the autonomous transaction.
    See my sample code:
    declare
    cursor c1 is select a, b, c from xx;
    type l_a is table of xx.a%type;
    type l_b is table of xx.b%type;
    type l_c is table of xx.c%type;
    v_a l_a;
    v_b l_b;
    v_c l_c;
    begin
    open c1;
    loop
    fetch c1 bulk collect into v_a, v_b, v_c limit 1000;
    exit when v_a.count = 0;
    forall i in 1..v_a.count
    insert into xxyy (a, b, c)
    values (xx_f(v_a(i)), xx_f(v_b(i)), xx_f(v_c(i)));
    commit;
    end loop;
    close c1;
    end;
    I just want to call xx_f function without autonoumous transaction.
    but with bulk loading. Please let me if you need more details
    Thanks
    yreddyr

    Can you show the code for xx_f? Does it do DML, or just transformations on the columns?
    Depending on what it does, an alternative could be something like:
    DECLARE
       CURSOR c1 IS
          SELECT xx_f(a), xx_f(b), xx_f(c) FROM xx;
       TYPE l_a IS TABLE OF whatever xx_f returns;
       TYPE l_b IS TABLE OF whatever xx_f returns;
       TYPE l_c IS TABLE OF whatever xx_f returns;
       v_a l_a;
       v_b l_b;
       v_c l_c;
    BEGIN
       OPEN c1;
       LOOP
          FETCH c1 BULK COLLECT INTO v_a, v_b, v_c LIMIT 1000;
          BEGIN
             FORALL i IN 1..v_a.COUNT
                INSERT INTO xxyy (a, b, c)
                VALUES (v_a(i), v_b(i), v_c(i));
          END;
          EXIT WHEN c1%NOTFOUND;
       END LOOP;
       CLOSE c1;
    END;

    John

  • Using API  to run Catalog Bulk Load - Items & Price Lists concurrent prog

    Hi everyone. I want to be able to run the concurrent program "Catalog Bulk Load - Items & Price Lists" for iProcurement. I have been able to run concurrent programs in the past using the fnd_request.submit_request API, but I seem to be having problems with this item loading concurrent program; for one thing, it is stuck at phase code P (Pending).
    When I run the same concurrent program from the iProcurement Administration page it runs fine.
    Has anyone been able to run this program from the back end? If so, any help is appreciated.
    Thanks

    Hello S.P,
    Basically this is what I am trying to achieve.
    1. Create a staging table. The columns available for it are category_name, item_number, item_description, supplier, supplier_site, price, uom and currency.
    So basically the user can load item details into the database from an Excel sheet.
    2. Using the UTL_FILE API, create an XML file called item_load.xml from the data in the staging table. This creates the XML file used to load items into iProcurement and saves it in the database directory /var/tmp/iprocurement. This part works great.
    3. Use the fnd_request.submit_request API to submit the concurrent program 'Catalog Bulk Load - Items & Price Lists'. This is where I am stuck. The process simply stays pending, or comes up with an error saying:
    oracle.apps.fnd.cp.request.FileAccessException: File /var/tmp/iprocurement is not accessable from node/machine moon1.oando-plc.com.
    I'm wondering if anyone has used my approach to load items before and, if so, whether they were successful?
    Thank you
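    For reference, a minimal sketch of a back-end submission via fnd_request is below. The user/responsibility IDs, the 'ICX' application short name and the 'POXBULKLOAD' program short name are placeholders made up for illustration - look up the real short name in FND_CONCURRENT_PROGRAMS - and the argument list depends entirely on the program definition. Also note that the request row is not picked up by the concurrent manager until the submitting session commits.
    DECLARE
       l_request_id NUMBER;
    BEGIN
       -- Placeholder IDs: look these up in FND_USER / FND_RESPONSIBILITY for your instance
       fnd_global.apps_initialize(user_id => 1234, resp_id => 5678, resp_appl_id => 201);
       l_request_id := fnd_request.submit_request(
                          application => 'ICX',          -- assumed application short name
                          program     => 'POXBULKLOAD',  -- hypothetical program short name, verify it
                          description => NULL,
                          start_time  => NULL,
                          sub_request => FALSE,
                          argument1   => '/var/tmp/iprocurement/item_load.xml'); -- arguments depend on the program
       COMMIT;  -- the concurrent manager cannot pick the request up until it is committed
       dbms_output.put_line('Request id: ' || l_request_id);
    END;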

  • JavaScript:Modify Data form to expose multiple periods of Supporting Detail

    Good Evening,
    I am currently on a project that has a requirement that users can enter Supporting Detail for multiple periods, regardless of what intersection of data you enter.
    I know that one approach is to not store the Supporting Detail in Hyperion Planning and just have Excel "store" the supporting detail and load the totals via SV VBA.
    Example:
    Detail 1
    Detail 2
    Detail 3
    Total Amount  (gets loaded into Essbase)
    However, I went down the path of exploring approaches other than the one mentioned above, and came up with either creating a process that copies the SD in the Planning tables via a stored procedure (although I know this isn't exactly best practice) or creating custom JavaScript, as a second approach. I have seen the examples of "Forcing SD" on a data form using JavaScript, but is it possible to expose multiple months of Supporting Detail in a customized form?
    Any thoughts/inputs/best approaches would be greatly appreciated.
    Thanks!

    A quick simple example
    function customCellEnterPre(row, col, cell) {
         if (row == 3 && col == 1)
              doSupportingDetail();
         return true;
    }
    Cheers
    John
    http://john-goodwin.blogspot.com/

  • Smartview / Supporting Details Functionality / VBA Excel

    Question Summary: Within Excel, using VBA, is it possible to control the "Supporting Details" functionality (read / add / delete children, paste values command)?
    Detail:
    --- Within Excel, using VBA, I am able to control the POV parameters using something like this:
    X = HypSetPOV("Sheet1", "Entity#" & theE, "Project#" & theP)
    where theE and theP are variables I've preassigned --- works flawlessly
    --- Within Excel, using VBA, I am able to load / show the "Supporting Details" form using something like this:
    Application.CommandBars(1).Controls("Hyperion").Controls("Supporting Details").Execute
    I then manually (takes too long) enter data to the form in order to itemize --- within a specific GL expense account --- the budget line item.
    --- Request: I have a pre-completed Excel worksheet with all the details. I want to be able to read / add / delete children within a specific line item (SUPPORTING DETAILS) using VBA.
    Can someone provide a specific example?
    I've tried using HypGetChildren, but is this the wrong method? From the documentation, this method appears to be associated with member children, not SUPPORTING DETAILS.
    I'm using this for Supporting Details nodes (children and siblings) --- it is required that I enter "Supporting Details" children (can have anywhere from 1 to 20 children) for this GL expense item (row).
    Here's my set-up...
    Oracle Hyperion Smartview Version = 9.3.1.5.0.025
    MS Excel Version = 2003 (Service Pack 3)
    Declarations Made = \SmartView\bin\smartview.bas (successfully able to use - call - these methods and functions)
    Form Structure Row = One GL Expense Account (called "Meals Expense")
    Form Structure Column = Twelve --- with each representing a month (Jan, Feb, Mar...)
    Form Structure POV = Many --- and these are already set, I do not need help setting the POV programmatically
    Thank you for your consideration and guidance.

    Dear Mike,
    Thanks so much for your quick response -- this helps a lot.
    Follow-Up Question: Could a modified request be accomplished (as follows)?
    Modified Request: Rather than programmatically reading / deleting child nodes within the Supporting Details form, is there a way to programmatically add a child node and paste values within the Supporting Details form, in the same fashion that I'm able to load / show the Supporting Details form?
    Load / Show Supporting Details Form:
    ---- This works --> Application.CommandBars(1).Controls("Hyperion").Controls("Supporting Details").Execute
    Click on the Add Child button programmatically:
    ---- This does not work --> Application.CommandBars(1).Controls("Hyperion").Controls("Supporting Details/AddChild").Execute
    Click on the Paste Values button programmatically:
    ---- This does not work --> Application.CommandBars(1).Controls("Hyperion").Controls("Supporting Details/PasteValues").Execute
    Click on the OK button programmatically:
    ---- This does not work --> Application.CommandBars(1).Controls("Hyperion").Controls("Supporting Details/OK").Execute
    Thanks for your help and consideration.

  • Anyone know setting primary key deferred help in the bulk loading

    Hi,
    Does anyone know whether setting the primary key constraint to deferred helps bulk loading in terms of performance? I do not want to disable the index, because when users query the existing records in the table it would affect their search queries.
    Thank You...

    In the Oracle 8.0 documentation, when deferred constraints were introduced, Oracle stated that deferring testing of the PK constraint until commit time was more efficient than testing the constraint at the time of each insert.
    I have never tested this assertion.
    In order to create a deferred PK constraint the index used to support the PK must be created as non-unique.
    HTH -- Mark D Powell --
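    As a minimal sketch of the syntax being discussed (table and column names are invented for illustration):
    -- The PK is declared deferrable; Oracle backs it with a non-unique index
    ALTER TABLE sales_fact
      ADD CONSTRAINT sales_fact_pk PRIMARY KEY (sale_id)
      DEFERRABLE INITIALLY IMMEDIATE;
    -- In the bulk load session, postpone the check until commit time
    SET CONSTRAINT sales_fact_pk DEFERRED;
    -- ... bulk inserts here ...
    COMMIT;  -- the primary key is validated at this point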

  • Bulk load issue

    Hi there
    Just wanted to know whether it is a bug or a feature: if a column-store table has non-capital (lower-case) letters in its name, bulk load does not work and performance is ruined?
    Mike

    It looks like we're having performance issues here, and bulk load is failing because of the connection method being used with Sybase RS. If we use standard ODBC then everything is as it should be, but as soon as we switch to the .NET world then nothing happens; single inserts/updates are OK.
    So, we have an application written in mixed J2EE/.NET and we use a HANA appliance as the host for tables, procedures and views.
    This issue has been sent to support; I will update as soon as I get something from them.
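    As a side note on the lower-case name point: in HANA SQL, unquoted identifiers are folded to upper case, while a name created inside double quotes keeps its exact case and must always be referenced with the same quoting. A quick illustration (made-up table names):
    -- Created without quotes: stored as MYTABLE and reachable as mytable, MyTable or MYTABLE
    CREATE COLUMN TABLE mytable (id INTEGER, txt NVARCHAR(100));
    -- Created with quotes: stored exactly as "myTable" and must always be quoted
    CREATE COLUMN TABLE "myTable" (id INTEGER, txt NVARCHAR(100));
    SELECT * FROM "myTable";  -- works
    SELECT * FROM myTable;    -- resolves to MYTABLE, not "myTable"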

  • Bulk loader for SCCM 2012 collections

    I am looking for a tool which will add users / computer names as direct memberships to collections. Bulk Loader existed for SCCM 2007, but it does not work with SCCM 2012. Please suggest.

    Hi,
    Here is a PowerShell script with GUI for your reference.
    Configuration Manager 2012 Direct Membership Collection Manager
    http://blogs.msdn.com/b/rslaten/archive/2014/03/10/configuration-manager-direct-membership-collection-manager.aspx

  • Is there a way to upload Supporting Details into a cell in Planning?

    Is there a way to upload Supporting Details into a cell in Planning from a couple of rows of data in Excel?
    We have a lot of supporting details in Excel rows that we want to load into one cell in a Planning web form (11.1.1.3).
    I know how to extract Supporting Details from Planning's RDBMS repository, but has anyone successfully reversed that process and updated Planning's supporting detail tables with data?

    You can reverse the process you used to export the information, using SQL to import the new information.
    I think you will need to restart the Planning service for it to take effect; it is best to test on a dev environment until you are comfortable with the SQL.
    Or you can use LCM, use LCM to export supporting details, then look at the xml document for the format.
    You can then create your own XML document and use LCM to import it in.
    I would probably try the LCM route first.
    Cheers
    John
    http://john-goodwin.blogspot.com/
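    A very rough sketch of the SQL route (the HSP_COLUMN_DETAIL_ITEM column names and the numeric operator codes below are from memory and vary by release - export an existing cell's detail first and copy its pattern, and back the tables up before trying this on anything real):
    -- Assume a cell already has a header row in HSP_COLUMN_DETAIL with DETAIL_ID = 101.
    -- POSITION orders the lines, GENERATION is the indent level, OPERATOR is the numeric
    -- code Planning stores for +, -, *, / (copy the value from an exported row).
    INSERT INTO HSP_COLUMN_DETAIL_ITEM (DETAIL_ID, POSITION, GENERATION, OPERATOR, LABEL, VALUE)
    VALUES (101, 0, 1, 1, 'Travel expense', 2500);
    COMMIT;
    -- Restart the Planning service afterwards, as suggested above, so the change is picked up.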
