Urgent: Issue in the dimension tables

Hi Experts,
Question 1:
I have a flat file load to a cube. The flat file has 1.5 million records. One of the dimensions created has two dates (original date and current date) assigned to it.
When I look at the dimension table, the number of entries is 30 million, and when I browse the table I see the SIDs as 0,0 (for the two dates) while DIM IDs are still being created.
When I searched the dimension table for records where current date and original date are not equal to 0, I found only 76,000 records.
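(For reference, a minimal SQL sketch of the check described above; the dimension table name /BIC/DZSALES_C012 and the two SID column names are hypothetical placeholders, so substitute the actual names from SE11:)
SELECT COUNT(*)
FROM   "/BIC/DZSALES_C012"
WHERE  sid_origdate <> 0
  AND  sid_currdate <> 0;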
Question 2:
We have an ODS which loads to the cube. In the process chain we have a program that deletes the data in the ODS which does not match certain conditions and then loads it to the cube.
My question is: since we are not deleting the contents of the cube before reloading it from the ODS (full update), will I not see the same records coming in again with the full update and getting aggregated in the cube?
Ex: I have a record in the ODS:
A   X  Z  100  1000
After a full update to the cube, the cube would have:
A   X  Z  100  1000
When I run the process chain, data is deleted from the ODS based on some condition, but I still have the same record in the ODS. When this loads into the cube, won't it be aggregated with the previous record, giving:
A  X   Z  200  2000
I would appreciate it if anyone could explain whether I am missing anything.

Hello,
If you can't see the SIDs, it means you have not loaded the master data; that is why there is no reference to the SID table and the values are 0.
An InfoCube aggregates by default: when duplicate records arrive for the same combination of characteristics, the key figures are added up.
For example I have a Material Dimension and Customer Dimension
In the fact table, the data will conceptually look like this:
DIM1     DIM2    KF1   KF2
Mat001  Cust1  100    10
Mat001  Cust2  200    5
For this there will be one entry in the Material DIM table for Mat001 and two entries in the Customer DIM table, for Cust1 and Cust2.
Material Dimension
DIM ID    SID
1             Mat001  (Here it will be the SID from the Material master)
Customer Dimension
DIM ID    SID
1             Cust1  (Here it will be the SID from the Customer master)
2             Cust2  (Here it will be the SID from the Customer master)
Note: a DIM ID represents a combination of one or more SIDs in the dimension table.
So the exact fact table will look like
MATDIM    CUSDIM       AMT      QTY
1               1                  100        10
1               2                  200        5
If you load data again with the same characteristic values, then the key figures will be aggregated.
For example, if you load
Mat001 Cust2 25 5
then the fact table will not get a new entry; instead the existing row is aggregated, so it looks like this (see the second row):
MATDIM    CUSDIM       AMT      QTY
1               1                  100        10
1               2                  225        10
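(To make the aggregation concrete, here is a minimal SQL sketch with hypothetical table and column names rather than the generated BW fact-table DDL; whether the rows are physically compressed or rolled up at query time, the net effect is equivalent to:)
SELECT matdim, cusdim,
       SUM(amt) AS amt,
       SUM(qty) AS qty
FROM   fact_table            -- holds the original row (200, 5) and the newly loaded row (25, 5)
GROUP  BY matdim, cusdim;    -- rolled up to 225 and 10 for Mat001 / Cust2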
Hope it's clear.
thanks
Chandran

Similar Messages

  • None of the dimension tables are compatible with the query request

    Hi,
    I am experiencing the error below while querying columns only from the employee dimension (W_EMPLOYEE_D) in the Workforce Profile SA. There is only one column in my report, Employee Number, coming from the employee dimension. When I query other information like job, region, location, etc., I do not get any error; the error below appears only when querying columns from the employee dimension. The content tab level for the LTS of the employee dimension is set to Employee Detail.
         View Display Error
    Odbc driver returned an error (SQLExecDirectW).
    Error Details
    Error Codes: OPR4ONWY:U9IM8TAC:OI2DL65P
    State: HY000. Code: 10058. [NQODBC] [SQL_STATE: HY000] [nQSError: 10058] A general error has occurred. [nQSError: 43113] Message returned from OBIS. [nQSError: 43119] Query Failed: [nQSError: 14077] None of the dimension tables are compatible with the query request Dim - Employee.Employee Number. (HY000)
    SQL Issued: SELECT 0 s_0, "Human Resources - Workforce Profile"."Employee Attributes"."Employee Number" s_1 FROM "Human Resources - Workforce Profile" FETCH FIRST 65001 ROWS ONLY.
    I couldn't figure out the exact reason. Any suggestions would be highly appreciated.
    Regards.

    Hi user582149,
    It is difficult to answer your question with so few details. Could you specify:
    - how many facts/dimensions are you using in the query?
    - what is the structure of your Business Model?
    - which version of OBI are you using?
    - what does your log say?
    With the information above, I hope I can tell you more.
    Cheers

  • Detailed analysis about the dimension tables of a cube

    Hi Experts,
    How can I get a detailed analysis of the dimension tables of a cube? (E.g., how many records do the fact table and the dimension tables contain, and what percentage do these records represent compared with the total records of the cube?)
    Thx in advance for your answers!

    Hi,
    You will get most of the information in transaction LISTSCHEMA. If you want to look further, you can see the records in SE11.
    Regards
    Githen

  • Incremental load into the Dimension table

    Hi,
    I have a problem doing an incremental load of the dimension table. Before loading into the dimension table, I would like to check the data already in the dimension table.
    In my dimension table I have one not-null surrogate key and the other dimension columns are nullable. The not-null surrogate key I am populating with a Sequence Generator.
    To do the incremental load I have done the following:
    I made a lookup into the dimension table and looked for a key. The key from the lookup I passed to an Expression operator. In the Expression operator I created one field and hard-coded a flag based on the key from the lookup. I passed this flag to a Filter operator along with the rest of the fields from the source.
    By doing this I am not able to pass the new records to the dimension table.
    Can you please help me.
    I have another question as well:
    how do I update a not-null key in the fact table?
    Thanks
    Vinay
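    (For what it's worth, a minimal set-based sketch of the "insert new records only" step described above; it reuses the table and column names from the SQL later in this thread, the sequence dim_seq is hypothetical, and it is only an illustration, not OWB-generated code:)
    INSERT INTO dimension_table (dimension_table_key, field1, field2)
    SELECT dim_seq.NEXTVAL, s.field1, s.field2
    FROM   src_dimension_table s
           LEFT JOIN dimension_table d
           ON  nvl(s.field1, 'Y') = nvl(d.field1, 'Y')
           AND nvl(s.field2, 'Y') = nvl(d.field2, 'Y')
    WHERE  d.dimension_table_key IS NULL;   -- no match in the dimension means it is a new record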

    Hi Mark,
    Thanks for your help in solving my problem. I thought I would share more information by giving the SQL.
    Below are the two SQL statements I would like to implement through OWB.
    Both of the following tasks need to be accomplished after loading the fact table.
    task1:
    UPDATE fact_table c
    SET    c.dimension_table_key =
           (SELECT nvl(dimension_table.dimension_table_key, 0)
            FROM   src_dimension_table t,
                   dimension_table dimension_table
            WHERE  c.ssn = t.ssn(+)
            AND    c.date_src_key = to_number(t.date_src(+), '99999999')
            AND    c.time_src_key = to_number(substr(t.time_src(+), 1, 4), '99999999')
            AND    c.wk_src = to_number(concat(t.wk_src_year(+), concat(t.wk_src_month(+), t.wk_src_day(+))), '99999999')
            AND    nvl(t.field1, 'Y') = nvl(dimension_table.field1, 'Y')
            AND    nvl(t.field2, 'Y') = nvl(dimension_table.field2, 'Y')
            AND    nvl(t.field3, 'Y') = nvl(dimension_table.field3, 'Y')
            AND    nvl(t.field4, 'Y') = nvl(dimension_table.field4, 'Y')
            AND    nvl(t.field5, 'Y') = nvl(dimension_table.field5, 'Y')
            AND    nvl(t.field6, 'Y') = nvl(dimension_table.field6, 'Y')
            AND    nvl(t.field7, 'Y') = nvl(dimension_table.field7, 'Y')
            AND    nvl(t.field8, 'Y') = nvl(dimension_table.field8, 'Y')
            AND    nvl(t.field9, 'Y') = nvl(dimension_table.field9, 'Y'))
    WHERE  c.dimension_table_key = 0;
    The fact table in the above SQL is fact_table, the dimension table is dimension_table, and the source table for the dimension table is src_dimension_table.
    dimension_table_key is a not-null key in the fact table.
    task2:
    UPDATE fact_table cf
    SET    cf.key_1 =
           (SELECT nvl(max(p.key_1), 0)
            FROM   dimension_table p
            WHERE  p.field1 = cf.field1
            AND    p.source = 'YY')
    WHERE  cf.key_1 = 0;
    The fact table in the above SQL is fact_table and the dimension table is dimension_table.
    key_1 is a not-null key in the fact table.
    Is it possible to achieve the above tasks through Oracle Warehouse Builder (OWB)? I created the mappings for loading the dimension table and the fact table and they are working fine, but I am not able to achieve the above two queries through OWB. I would be thankful if you could help me out.
    Thanks
    Vinay
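    (As a side note, task2 can also be written as a single MERGE; this is a hand-written sketch using the names above, not actual OWB output, and note that rows with no matching dimension row simply keep key_1 = 0 instead of being re-set to 0:)
    MERGE INTO fact_table cf
    USING (SELECT field1, MAX(key_1) AS key_1
           FROM   dimension_table
           WHERE  source = 'YY'
           GROUP  BY field1) p
    ON    (p.field1 = cf.field1)
    WHEN MATCHED THEN
      UPDATE SET cf.key_1 = p.key_1
      WHERE  cf.key_1 = 0;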

  • (Urgent) issue of Using Aggregate table in the BMM layer (Thanks)

    Hi,
    Let's say, I have 2 fact tables, "F_Region_Goal" and "F_Store_Goal"; and 2 dimension tables, "Dim_Date" and "Dim_Store_info".
    "Dim_Date" has hierarchy as "day-month-year" and "Dim_Store_info" has hierarchy as "Division-Region-Store".
    "F_Store_Goal" has the most detailed goal info data in it;
    "F_Region_Goal" aggregate all the goal on "Region" level. (For Data dimension point of view, these 2 Fact tables are the same, "day-month-year")
    We create the BM, I have set the "Logic level" info in LTS, to let BIEE server to point to "Region_Goal" when we choose the Level: Region.
    However, i have tried several times, in the log, it always goes to "F_Store_Goal".
    Can anyone here tell me what I missed in the configuration?
    You help will be greatly appreciated.
    BTW: What if I want the BIEE server go to "F_Region_Goal" to do SUM() function to get the Goal for "Division" level, when I choose the criteria "Division" and "Month", How can i realize this ?
    Many many thanks for any reply.

    It seems to me that you are missing the aggregate dimension "Dim_Region_info" ... currently you are trying to get OBIEE to use a lower-level dimension with a higher-level fact, and it is designed to take what it believes is the most efficient way to do things.

  • Truncate always checked for the some of the dimension tables

    Hi,
    I found common dimension tables across Finance,Supply Chain and Sales by querying in DAC.
    At the task level, most of the target tables are marked as Truncate for Full Load,
    but some of the tables are defined as Truncate Always. Why are they defined as Truncate Always; is there any specific reason for that?
    Truncate Always Dimension Tables List:
    W_FSCL_MONTH_D
    W_FSCL_QTR_D
    W_FSCL_WEEK_D
    W_FSCL_YEAR_D
    W_HOUR_OF_DAY_D
    W_MONTH_D
    W_QTR_D
    W_TIME_OF_DAY_D
    W_WEEK_D
    W_YEAR_D
    Thanks and Regards,
    Partha.

    Hi,
    If they are defined as Truncate Always, it is because the mapping which loads the table always fills it with a full set of data, not an incremental one. If the table were not truncated every time, the data would not be correct after an ETL.
    You can see from the list that they are all time-related tables being truncated. These can be truncated and refilled without worrying about breaking data integrity because the ROW_WIDs for these tables are not true surrogate keys; they hold some meaning. For instance, the ROW_WID of the row for Jan 1980 in W_MONTH_D is 198001, and it will always have this same ROW_WID after each ETL, meaning the dimensions can be truncated and fully rebuilt without invalidating any joins to them from the fact tables.
    Regards,
    Matt
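    (Illustration only, not the actual DAC/Informatica mapping logic: because the month key is derived from the calendar value itself, a full rebuild regenerates exactly the same ROW_WID, so fact rows joining on it remain valid.)
    SELECT TO_NUMBER(TO_CHAR(DATE '1980-01-15', 'YYYYMM')) AS row_wid   -- always 198001, on every reload
    FROM   dual;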

  • An urgent issue of the XMLEncoder.

    Hi, all.
    I have a very urgent problem with the XMLEncoder.
    Background:
    My project is a web project, and the flow is as follows:
    1. Get the data entity from the DB using TopLink.
    2. Set the data into a domain entity for a page.
    3. Use XMLEncoder to convert the domain entity into an XML stream.
    4. Use Transformer to transform the XML stream into the HTTP response, which is displayed in the client browser with the XSL files.
    Issue:
    When two or more users visit the same page in their browsers at the same time, the WebLogic server throws this exception:
    java.lang.NullPointerException
    Continuing ...
    java.lang.Exception: discarding statement XMLEncoder0.writeObject(SearchWkOrdTyStDomainEntity0);
    Continuing ...
    I've found and read these two articles
    http://forum.java.sun.com/thread.jsp?thread=262946&forum=62&message=1002333
    and
    http://forum.java.sun.com/thread.jsp?forum=63&thread=532252&tstart=30&trange=15
    and learned that it may be caused by the serialization of the classes, but I cannot find a way to solve this problem.
    The code which I use to convert the domain entity object to an XML document is as follows:
      public static Document object2Document(Object o)
          throws ReeferMnRException {
        if (o == null) return null;
        Logger logger = LoggerFactory.getInstance().getLogger(XMLUtil.class.getName());
        logger.debug("Get into XMLUtil.object2Document===========================");
        try{
          ByteArrayOutputStream byteArrayOutputStream = new ByteArrayOutputStream();
          XMLEncoder e = new XMLEncoder(
              new BufferedOutputStream(
              byteArrayOutputStream));
          e.writeObject(o);
          e.close();
          DocumentBuilderFactory documentBuilderFactory = DocumentBuilderFactory.
              newInstance();
          documentBuilderFactory.setNamespaceAware(true);
          StringBufferInputStream in = new StringBufferInputStream(new String(
              byteArrayOutputStream.toByteArray()));
          Document document = documentBuilderFactory.newDocumentBuilder().parse(in);
          return document;
        }catch(SAXException e){
          logger.error("XMLUtil-->object2Document-->SAXException:" + e.getMessage());
          throw new ReeferMnRException(
              ReeferMnRException.APPLICATION_SERVER_ERROR, ReeferMnRException.EXCEPTION);
        }catch(ParserConfigurationException e){
          logger.error("XMLUtil-->object2Document-->ParserConfigurationException:" + e.getMessage());
          throw new ReeferMnRException(
              ReeferMnRException.APPLICATION_SERVER_ERROR, ReeferMnRException.EXCEPTION);
        }catch(IOException e){
          logger.error("XMLUtil-->object2Document-->IOException:" + e.getMessage());
          throw new ReeferMnRException(
              ReeferMnRException.APPLICATION_SERVER_ERROR, ReeferMnRException.EXCEPTION);
        }catch(Exception e){
          logger.debug("XMLUtil-->object2Document-->Exception:" + e.getMessage());
          logger.error("XMLUtil-->object2Document-->Exception:" + e.getMessage());
          throw new ReeferMnRException(
              ReeferMnRException.APPLICATION_SERVER_ERROR,
              ReeferMnRException.EXCEPTION);
        }
      }
    An example of the domain entity to be converted into an XML document is as follows:
    package reefer.mnr.domain.entity;
    public class AbnInfoDomainEntity {
      private java.lang.String abnC;
      private java.lang.String abnId;
      private java.lang.String abnSubC;
      private java.lang.String abnSubDesc;
      private java.lang.String crtDt;
      public java.lang.String getAbnC() {
        return abnC;
      }
      public java.lang.String getAbnId() {
        return abnId;
      }
      public java.lang.String getAbnSubC() {
        return abnSubC;
      }
      public java.lang.String getAbnSubDesc() {
        return abnSubDesc;
      }
      public java.lang.String getCrtDt() {
        return crtDt;
      }
      public void setAbnC(java.lang.String abnC) {
        this.abnC = abnC;
      }
      public void setAbnId(java.lang.String abnId) {
        this.abnId = abnId;
      }
      public void setAbnSubC(java.lang.String abnSubC) {
        this.abnSubC = abnSubC;
      }
      public void setAbnSubDesc(java.lang.String abnSubDesc) {
        this.abnSubDesc = abnSubDesc;
      }
      public void setCrtDt(java.lang.String crtDt) {
        this.crtDt = crtDt;
      }
    }
    Does anyone know how to solve this problem?
    Thanks in advance.

    Regarding public static Document object2Document(Object o): is XMLUtil implemented as a singleton? Is this method in a servlet class? Like the doctor says, remove the "static" keyword and try it.
    public Document object2Document(Object o) { ... }
    After that, I would try it with the "synchronized" keyword.
    public synchronized Document object2Document (Object o) { ... }
    And then try it with a synchronized code block:
    // This code writes the XML file of the object parameter
    synchronized (this) {
        ByteArrayOutputStream byteArrayOutputStream = new ByteArrayOutputStream();
        XMLEncoder e = new XMLEncoder(
            new BufferedOutputStream(byteArrayOutputStream));
        e.writeObject(o);
        e.close();
        // This code returns the XML document to the calling object
        DocumentBuilderFactory documentBuilderFactory = DocumentBuilderFactory.newInstance();
        documentBuilderFactory.setNamespaceAware(true);
        StringBufferInputStream in = new StringBufferInputStream(new String(byteArrayOutputStream.toByteArray()));
        Document document = documentBuilderFactory.newDocumentBuilder().parse(in);
        return document;
    }

  • Issue with the shadow table

    I am in the process of understanding the shadow table and the error log table.
    I have a table created with a shadow table in place.
    1. Ex: table emp(empno, ename), emp_err(...., empno, ename)
    It contains the values (1,'A').
    Now I place a data rule on empno, unique not null ... and configure the operator
    as MOVE TO ERROR.
    When I try to insert a row with (1,'A'), it moves not only the new row being inserted but also the existing row in the table emp, i.e. two rows end up in the error table emp_err.
    2. I have a scenario where I want to update a row in the fact table from an incoming row.
    If there is no match of the incoming row against the fact table, how do I put that row into the
    error table?
    Any ideas or tricks appreciated.
    Thanks
    Narayana.

    Hi,
      Free the internal tables' memory by using the FREE statement after processing the internal table.
      Also, you can ask your Basis person to increase the paging area.
    Reward if helpful.
    Regards,
    Umasankar.

  • Very urgent Issue - Convert the hours into Minutes

    Hello Guys,
    I have an InfoObject ZKABC which holds hour information as a decimal value,
    e.g. 24.588, which means 24 hrs and about 35 min (0.588 hrs).
    I want to display the InfoObject value as 24.35 (24 hrs 35 minutes) instead of 24.588. Is this possible in BEx?
    Issue critical.
    Thanks,
    Pratap
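    (For reference, a minimal SQL sketch of the decimal-hours to HH.MM conversion being asked about; in BEx the same idea is expressed as a formula on the integer and fractional parts, so this is only an illustration:)
    SELECT TRUNC(hrs)                                        AS hours,    -- 24
           ROUND((hrs - TRUNC(hrs)) * 60)                    AS minutes,  -- 35
           TRUNC(hrs) + ROUND((hrs - TRUNC(hrs)) * 60) / 100 AS hh_mm     -- 24.35
    FROM   (SELECT 24.588 AS hrs FROM dual);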

    Hello Kapadia,
    I have a small issue with the solution you gave me.
    I am getting the correct values when I implement the formula you gave me,
    but I am getting incorrect values when I want to show average values.
    Let's say I have 3 records in 2007 05 (the month of May).
    I get the correct data for the 3 records, but the result row for 2007 05 shows different values.
    e.g.:
    K1 (plant)   2007 (yr)   May   5/29/2007   9.06   15.29   24.35
                                   5/30/2007   7.50   21.43   29.33
                                   5/31/2007   7.36   15.25   23.01
                                   Result      3.02   7.14    9.78
    The result should be (the sum of the 3 records / the number of records).
    Could you please explain how to solve this issue? It is very critical.
    Thanks,
    Pratap.

  • Issue in the network table

    Hi all,
    After deleting the network from the WBS element, I am not able to open that WBS element; I am getting the following ABAP runtime error.
    Long text of the error message:
    Diagnosis                                                           
        A necessary database entry for one or more than one networks is 
        missing in the project. 
    Message class....... "CNPB"
    Number.............. 012  
    Variable 1.......... " "  
    Variable 2.......... " "  
    Variable 3.......... " "  
    Variable 4.......... " "  
    Program                                 SAPLCNPB_H       
    Include                                 LCNPB_HF0Z       
    Row                                     147              
    Module type                             (FORM)           
    Module Name                             NETWORK_DO_SELECT
    When I try to create a new network with the same name, I am not able to, and when I go into change mode I am also not able to find that network,
    but the same network exists in the table AUFK.
    Please help me solve the above issue.
    Regards
    Sen

    Please refer to SAP Note 457818.
    Hope it helps.

  • Urgent issues involving the invokeDDX operation

    Hi all,
    I have been developing a process which takes XML from client input, converts that XML into a DDX document, and then uses that DDX document to render a form. This had been working for months, but all of a sudden I started getting this error:
    com.thoughtworks.xstream.converters.ConversionException: takeOwnership : Class name takeOwnership from package  not found. : takeOwnership : Class name takeOwnership from package  not found.
    ---- Debugging information ----
    message             : takeOwnership : Class name takeOwnership from package  not found.
    cause-exception     : com.thoughtworks.xstream.mapper.CannotResolveClassException
    cause-message       : takeOwnership : Class name takeOwnership from package  not found.
    class               : com.adobe.livecycle.assembler.client.AssemblerOptionSpec
    required-type       : com.adobe.livecycle.assembler.client.AssemblerOptionSpec
    path                : /com.adobe.livecycle.assembler.client.AssemblerOptionSpec/takeOwnership
        at com.thoughtworks.xstream.core.TreeUnmarshaller.convert(TreeUnmarshaller.java:89)
        at com.thoughtworks.xstream.core.AbstractReferenceUnmarshaller.convert(AbstractReferenceUnma rshaller.java:63)
        at com.thoughtworks.xstream.core.TreeUnmarshaller.convertAnother(TreeUnmarshaller.java:76)
        at com.thoughtworks.xstream.core.TreeUnmarshaller.convertAnother(TreeUnmarshaller.java:60)
        at com.thoughtworks.xstream.core.TreeUnmarshaller.start(TreeUnmarshaller.java:137)
        at com.thoughtworks.xstream.core.AbstractTreeMarshallingStrategy.unmarshal(AbstractTreeMarsh allingStrategy.java:33)
        at com.thoughtworks.xstream.XStream.unmarshal(XStream.java:923)
        at com.thoughtworks.xstream.XStream.unmarshal(XStream.java:909)
        at com.thoughtworks.xstream.XStream.fromXML(XStream.java:853)
        at com.thoughtworks.xstream.XStream.fromXML(XStream.java:845)
        at com.adobe.idp.dsc.datatype.impl.DefaultTextSerializer.deserializeValue(DefaultTextSeriali zer.java:59)
        at com.adobe.workflow.engine.PEUtil.processMapping(PEUtil.java:1051)
        at com.adobe.workflow.engine.PEUtil.invokeAction(PEUtil.java:798)
        at com.adobe.idp.workflow.dsc.invoker.WorkflowDSCInvoker.transientInvoke(WorkflowDSCInvoker. java:347)
        at com.adobe.idp.workflow.dsc.invoker.WorkflowDSCInvoker.invoke(WorkflowDSCInvoker.java:158)
        at com.adobe.idp.dsc.interceptor.impl.InvocationInterceptor.intercept(InvocationInterceptor. java:140)
        at com.adobe.idp.dsc.interceptor.impl.RequestInterceptorChainImpl.proceed(RequestInterceptor ChainImpl.java:60)
        at com.adobe.idp.dsc.interceptor.impl.DocumentPassivationInterceptor.intercept(DocumentPassi vationInterceptor.java:53)
        at com.adobe.idp.dsc.interceptor.impl.RequestInterceptorChainImpl.proceed(RequestInterceptor ChainImpl.java:60)
        at com.adobe.idp.dsc.transaction.interceptor.TransactionInterceptor$1.doInTransaction(Transa ctionInterceptor.java:74)
        at com.adobe.idp.dsc.transaction.impl.ejb.adapter.EjbTransactionCMTAdapterBean.execute(EjbTr ansactionCMTAdapterBean.java:357)
        at com.adobe.idp.dsc.transaction.impl.ejb.adapter.EjbTransactionCMTAdapterBean.doRequiresNew (EjbTransactionCMTAdapterBean.java:299)
        at sun.reflect.GeneratedMethodAccessor600.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
        at java.lang.reflect.Method.invoke(Unknown Source)
        at org.jboss.invocation.Invocation.performCall(Invocation.java:359)
        at org.jboss.ejb.StatelessSessionContainer$ContainerInterceptor.invoke(StatelessSessionConta iner.java:237)
        at org.jboss.resource.connectionmanager.CachedConnectionInterceptor.invoke(CachedConnectionI nterceptor.java:158)
        at org.jboss.ejb.plugins.StatelessSessionInstanceInterceptor.invoke(StatelessSessionInstance Interceptor.java:169)
        at org.jboss.ejb.plugins.CallValidationInterceptor.invoke(CallValidationInterceptor.java:63)
        at org.jboss.ejb.plugins.AbstractTxInterceptor.invokeNext(AbstractTxInterceptor.java:121)
        at org.jboss.ejb.plugins.TxInterceptorCMT.runWithTransactions(TxInterceptorCMT.java:404)
        at org.jboss.ejb.plugins.TxInterceptorCMT.invoke(TxInterceptorCMT.java:181)
        at org.jboss.ejb.plugins.SecurityInterceptor.invoke(SecurityInterceptor.java:168)
        at org.jboss.ejb.plugins.LogInterceptor.invoke(LogInterceptor.java:205)
        at org.jboss.ejb.plugins.ProxyFactoryFinderInterceptor.invoke(ProxyFactoryFinderInterceptor. java:138)
        at org.jboss.ejb.SessionContainer.internalInvoke(SessionContainer.java:648)
        at org.jboss.ejb.Container.invoke(Container.java:960)
        at org.jboss.ejb.plugins.local.BaseLocalProxyFactory.invoke(BaseLocalProxyFactory.java:430)
        at org.jboss.ejb.plugins.local.StatelessSessionProxy.invoke(StatelessSessionProxy.java:103)
        at $Proxy378.doRequiresNew(Unknown Source)
        at com.adobe.idp.dsc.transaction.impl.ejb.EjbTransactionProvider.execute(EjbTransactionProvi der.java:143)
        at com.adobe.idp.dsc.transaction.interceptor.TransactionInterceptor.intercept(TransactionInt erceptor.java:72)
        at com.adobe.idp.dsc.interceptor.impl.RequestInterceptorChainImpl.proceed(RequestInterceptor ChainImpl.java:60)
        at com.adobe.idp.dsc.interceptor.impl.InvocationStrategyInterceptor.intercept(InvocationStra tegyInterceptor.java:55)
        at com.adobe.idp.dsc.interceptor.impl.RequestInterceptorChainImpl.proceed(RequestInterceptor ChainImpl.java:60)
        at com.adobe.idp.dsc.interceptor.impl.InvalidStateInterceptor.intercept(InvalidStateIntercep tor.java:37)
        at com.adobe.idp.dsc.interceptor.impl.RequestInterceptorChainImpl.proceed(RequestInterceptor ChainImpl.java:60)
        at com.adobe.idp.dsc.interceptor.impl.AuthorizationInterceptor.intercept(AuthorizationInterc eptor.java:188)
        at com.adobe.idp.dsc.interceptor.impl.RequestInterceptorChainImpl.proceed(RequestInterceptor ChainImpl.java:60)
        at com.adobe.idp.dsc.interceptor.impl.JMXInterceptor.intercept(JMXInterceptor.java:48)
        at com.adobe.idp.dsc.interceptor.impl.RequestInterceptorChainImpl.proceed(RequestInterceptor ChainImpl.java:60)
        at com.adobe.idp.dsc.engine.impl.ServiceEngineImpl.invoke(ServiceEngineImpl.java:121)
        at com.adobe.idp.dsc.routing.Router.routeRequest(Router.java:129)
        at com.adobe.idp.dsc.provider.impl.base.AbstractMessageReceiver.invoke(AbstractMessageReceiv er.java:329)
        at com.adobe.idp.dsc.provider.impl.soap.axis.sdk.SoapSdkEndpoint.invokeCall(SoapSdkEndpoint. java:139)
        at com.adobe.idp.dsc.provider.impl.soap.axis.sdk.SoapSdkEndpoint.invoke(SoapSdkEndpoint.java :81)
        at sun.reflect.GeneratedMethodAccessor760.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
        at java.lang.reflect.Method.invoke(Unknown Source)
        at org.apache.axis.providers.java.RPCProvider.invokeMethod(RPCProvider.java:397)
        at org.apache.axis.providers.java.RPCProvider.processMessage(RPCProvider.java:186)
        at org.apache.axis.providers.java.JavaProvider.invoke(JavaProvider.java:323)
        at org.apache.axis.strategies.InvocationStrategy.visit(InvocationStrategy.java:32)
        at org.apache.axis.SimpleChain.doVisiting(SimpleChain.java:118)
        at org.apache.axis.SimpleChain.invoke(SimpleChain.java:83)
        at org.apache.axis.handlers.soap.SOAPService.invoke(SOAPService.java:454)
        at org.apache.axis.server.AxisServer.invoke(AxisServer.java:281)
        at org.apache.axis.transport.http.AxisServlet.doPost(AxisServlet.java:699)
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:710)
        at org.apache.axis.transport.http.AxisServletBase.service(AxisServletBase.java:327)
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:803)
        at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.j ava:290)
        at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
        at com.adobe.idp.dsc.provider.impl.soap.axis.InvocationFilter.doFilter(InvocationFilter.java :43)
        at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.j ava:235)
        at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
        at com.adobe.idp.um.auth.filter.CSRFFilter.doFilter(CSRFFilter.java:41)
        at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.j ava:235)
        at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
        at org.jboss.web.tomcat.filters.ReplyHeaderFilter.doFilter(ReplyHeaderFilter.java:96)
        at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.j ava:235)
        at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
        at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:230)
        at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:175)
        at org.jboss.web.tomcat.security.SecurityAssociationValve.invoke(SecurityAssociationValve.ja va:179)
        at org.jboss.web.tomcat.security.JaccContextValve.invoke(JaccContextValve.java:84)
        at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:128)
        at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:104)
        at org.jboss.web.tomcat.service.jca.CachedConnectionValve.invoke(CachedConnectionValve.java: 157)
        at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:109)
        at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:241)
        at org.apache.coyote.http11.Http11Processor.process(Http11Processor.java:844)
        at org.apache.coyote.http11.Http11Protocol$Http11ConnectionHandler.process(Http11Protocol.ja va:580)
        at org.apache.tomcat.util.net.JIoEndpoint$Worker.run(JIoEndpoint.java:447)
        at java.lang.Thread.run(Unknown Source)
    Caused by: com.thoughtworks.xstream.mapper.CannotResolveClassException: takeOwnership : Class name takeOwnership from package  not found.
        at com.thoughtworks.xstream.mapper.DefaultMapper.realClass(DefaultMapper.java:68)
        at com.thoughtworks.xstream.mapper.MapperWrapper.realClass(MapperWrapper.java:38)
        at com.thoughtworks.xstream.mapper.DynamicProxyMapper.realClass(DynamicProxyMapper.java:71)
        at com.thoughtworks.xstream.mapper.MapperWrapper.realClass(MapperWrapper.java:38)
        at com.thoughtworks.xstream.mapper.PackageAliasingMapper.realClass(PackageAliasingMapper.jav a:88)
        at com.thoughtworks.xstream.mapper.MapperWrapper.realClass(MapperWrapper.java:38)
        at com.thoughtworks.xstream.mapper.ClassAliasingMapper.realClass(ClassAliasingMapper.java:86 )
        at com.thoughtworks.xstream.mapper.MapperWrapper.realClass(MapperWrapper.java:38)
        at com.thoughtworks.xstream.mapper.MapperWrapper.realClass(MapperWrapper.java:38)
        at com.thoughtworks.xstream.mapper.MapperWrapper.realClass(MapperWrapper.java:38)
        at com.thoughtworks.xstream.mapper.MapperWrapper.realClass(MapperWrapper.java:38)
        at com.thoughtworks.xstream.mapper.MapperWrapper.realClass(MapperWrapper.java:38)
        at com.thoughtworks.xstream.mapper.MapperWrapper.realClass(MapperWrapper.java:38)
        at com.thoughtworks.xstream.mapper.ArrayMapper.realClass(ArrayMapper.java:96)
        at com.thoughtworks.xstream.mapper.MapperWrapper.realClass(MapperWrapper.java:38)
        at com.thoughtworks.xstream.mapper.MapperWrapper.realClass(MapperWrapper.java:38)
        at com.thoughtworks.xstream.mapper.MapperWrapper.realClass(MapperWrapper.java:38)
        at com.thoughtworks.xstream.mapper.MapperWrapper.realClass(MapperWrapper.java:38)
        at com.thoughtworks.xstream.mapper.MapperWrapper.realClass(MapperWrapper.java:38)
        at com.thoughtworks.xstream.mapper.MapperWrapper.realClass(MapperWrapper.java:38)
        at com.thoughtworks.xstream.mapper.MapperWrapper.realClass(MapperWrapper.java:38)
        at com.thoughtworks.xstream.mapper.CachingMapper.realClass(CachingMapper.java:52)
        at com.thoughtworks.xstream.converters.reflection.AbstractReflectionConverter.determineType( AbstractReflectionConverter.java:347)
        at com.thoughtworks.xstream.converters.reflection.AbstractReflectionConverter.doUnmarshal(Ab stractReflectionConverter.java:208)
        at com.thoughtworks.xstream.converters.reflection.AbstractReflectionConverter.unmarshal(Abst ractReflectionConverter.java:162)
        at com.thoughtworks.xstream.core.TreeUnmarshaller.convert(TreeUnmarshaller.java:82)
        ... 95 more
    The above error happens when the process hits the "invokeDDX" operation.
    I have never seen this before and need an urgent solution for this issue. I'm running LiveCycle Server 9.0 on MS Server 2008 (VM).
    Any help would be GREATLY appreciated...
    Regards
    Ross Malan

    I haven't seen this particular error or the "takeOwnership" class. The only take-ownership thing I am familiar with is the Content Space module.
    Is either the DDX or one of the files to be assembled stored in Content Space? If so, does the invokeDDX service work when the files are not in Content Space?

  • Urgent: Issue with the size of an xls file generated from an RTF template

    Hi all,
    I have a template.rtf.
    In BI, when I get the file in Excel format and save it, the size of the report is 32 MB. When I just "Save As" the same report, without changing anything, the size of the report is 8 MB.
    Do you know how to get the report directly at the 8 MB size, or do you have any guidelines to avoid this issue?
    Thank you in advance,
    Sonia.

    Thank you Kurz for your answer. I'm going to try to be more precise.
    Actually I'm not in EBS; I generate the output (xls format) directly from BIP.
    The DB of my application can't be queried directly, so I use an HTTP request.
    From BIP, I launch the report choosing Excel as the output; the template is an RTF template.
    I am prompted to open or save the xls file, which I save (32 MB).
    I then open the Excel file, save it again, and the size of the file is now 8 MB.
    Regards,
    Sonia.

  • Issues with the exporting table from BAPI_SALESORDER_SIMULATE

    Hello Experts,
    When using the BAPI BAPI_SALESORDER_SIMULATE I receive three records in the table ORDER_SCHEDULE_EX. But when we manually enter in VA01 the same data that we used for the BAPI, we can see two schedule line records for each material (which is correct).
    For some reason the standard BAPI is missing one record. The following shows the actual result from the BAPI:
    900001  0001  CT  X  20080124
    900011  0001  CP  X  20080124
    900011  0002  CP  X  20080125
    And these are the results from VA01:
    900001  0001  CT  X  20080124
    900001  0002  CT  X  20080125
    900011  0001  CP  X  20080124
    900011  0002  CP  X  20080125
    Does anyone know why this is happening? What could I be missing?
    Thanks in advance,
    CL

  • Multiple columns from the same dimension table as row labels performing slowly

    (Working with SSAS tabular)
    I'm trying to figure out what the approach should be for the following scenario:
    Lets say we have a Customer table. The table has columns such as account number, department number, name, salesperson, account manager, number of customers, delivery route, etc
    A user of the model could want to see any permutation of that information as the row labels. How should that be handled?
    What we've been doing so far is that the user adds each column they want into the "ROWS" section in Excel. This works fine with smaller tables (for example, a "Department" table with a "Department Code" and "Department Name"),
    but on large tables this quickly chokes. I understand why this is happening; I just haven't found a better way to accomplish the same thing.
    I can add a calculated column to the model through VS, but obviously this is unsupportable and unscalable when each person needs their own permutations of the data. Can something similar be done in Excel? 
    This question seems to be what I need:
    http://social.msdn.microsoft.com/Forums/en-US/97d1157a-1402-4227-b96a-79524401ddcd/mdx-query-performance-when-selecting-multiple-attributes-from-same-dimension?forum=sqlanalysisservices
    However, I can't find any information on how to add those properties (is it a multidimensional-only thing?).

    Thanks for the help. Sorry, I'm a self-taught developer and may be missing some basics :)
    Anyway, I've done what you suggested but I get this error:
    [nQSError: 15011]The dimension table source Dimension Services.DM_D_SERVIZI_SRV has an aggregate content specification that specifies the level Product. But the source mapping contains column COD_PRODUCT with a functional dependency association on a more detailed level .
    where:
    - DM_D_SERVIZI_SRV is the physical alias for the Service Dimension (and the name of the LTS too)
    - COD_PRODUCT is the leaf of the hierarchy and the physical primary key, but it does not have to be included in the hierarchy
    Do I have to add another level with the primary key and hide it from the users?
    I tried to solve this by going to the logical table source properties, on the Content tab, and setting the "Logical Level" to null for the hierarchy, but I don't know if this is correct.
    Thanks

  • How to find out the infoProvider for a given dimension table?

    Experts:
    In RSA1, I want to find out the InfoProvider for a given dimension table,
    but I am not sure how to display the tables linked to a given InfoProvider.
    Could you provide a way to display all tables linked to a given InfoProvider?
    Thanks a lot!

    The dimension tables start with /BIC/D<cubename>1, /BIC/D<cubename>2, and so on.
    Ex: if ZSD_C01 is your cube name,
    the dimension tables for it are /BIC/DZSD_C011, /BIC/DZSD_C012, ...
    Go to transaction LISTSCHEMA, derive the cube name from the given dimension table name, enter the cube name, and execute; it will show you all the tables of the cube.
