Understanding the Data Source 0CO_PC_01

I received a question from a user who was running a query on the cube 0PC_C01: the actual quantity value for a particular cost centre is doubled. The cube is loaded using the data source 0CO_PC_01.
I tracked the problem to a particular production order and ran the extractor checker on the source system for that order, selecting only actuals (value type 10) and one currency type (020). For the cost centre I can see one line that carries the total expected value for actual quantity, but also two lines that hold the breakdown of that value. All three lines have value type 10, so when the user selects the cost centre, he sees the total of all three lines?
There are also two other lines which net the values off correctly, but these lines do not carry the cost centre; they have a different cost assignment and refer to the production order rather than the cost centre. So the overall total is correct, but when the user selects the cost centre the total is inflated.
Is there a filter I should apply to avoid the duplication?

In this example there are five lines. One line has the correct (expected) actual quantity of 16 HRS; this line carries the cost centre. Then there are two negative lines, -12 and -4, which have the value 0OIT in the field 0PIOBJ and carry the production order instead of the cost centre. The next two lines are positive, +12 and +4; they have the value 0ACTY in 0PIOBJ and they carry the cost centre.
When the user selects by cost centre, he loses the negative lines and gets the three positive lines: 16, 12 and 4. So he sees an actual quantity of 32 instead of 16. All three of these lines have the same partner object value of 0ACTY, the same cost centre, currency type 020 and value type 10.
To get the query to return the correct 16 HRS for the cost centre, I either need to find a way to put the cost centre onto the negative 0OIT lines, or I must find a way to exclude the first line, which seems to be a total line?
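One option, if a distinguishing field for that total line can be found in the 0CO_PC_01 extract/transfer structure, would be to drop those records in a start routine rather than fight them in the query. The lines below are only a minimal sketch of that mechanism: the flag field ZZ_SUMMARY_FLAG is hypothetical, and the internal table name (DATA_PACKAGE in a 3.x start routine, SOURCE_PACKAGE in a 7.x transformation) depends on where the routine sits. The field that really distinguishes the total line from its breakdown lines would have to be confirmed in RSA3 first.

* Sketch only - not a tested solution.
* ZZ_SUMMARY_FLAG is a placeholder for whatever field in the 0CO_PC_01
* structure actually marks the duplicated total line.
  DELETE DATA_PACKAGE WHERE zz_summary_flag = 'X'.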

Similar Messages

  • Trying to programmatically set the data-source for a Crystal reports report.

    I've got the following existing procedure that I need to add to in order to programmatically set the data-source (server, database, username, and password) for a Crystal reports report.
    I added the connectionInfo parts, but can’t figure out how to attach this to the existing this._report object.
    This is currently getting the connection data from the report file, but I now need to populate this connection data from a 'config.xml' text file.
    Am I trying to do this all wrong?
    using System;
    using System.Collections.Generic;
    using System.Linq;
    using System.Text;
    using CrystalDecisions.CrystalReports.Engine;
    using WGS.Reports.Reports;
    using CrystalDecisions.Shared;
    using WGS.Reports.Forms;

    namespace WGS.Reports
    {
        public class ReportService
        {
            ReportClass _report;
            ParameterFields paramFields;
            ConnectionInfo connectionInfo; // <- I added this

            public ReportService()
            {
            }

            public void DisplayReport(string reportName, int allocationNo)
            {
                if (reportName.ToLower() == "allocationexceptions")
                {
                    this._report = new AllocationExceptions();
                    PrepareConnection(); // <- I added this
                    PrepareAllocationExceptionReport(allocationNo);
                    this.DisplayReport();
                }
            }

            private void PrepareConnection() // <- I added this
            {
                // test - these will come from the config.xml file
                this.connectionInfo = new ConnectionInfo();
                this.connectionInfo.ServerName = "testserv\\test";
                this.connectionInfo.DatabaseName = "testdb";
                this.connectionInfo.UserID = "testuser";
                this.connectionInfo.Password = "test";
                this.connectionInfo.Type = ConnectionInfoType.SQL;
            }

            private void PrepareAllocationExceptionReport(int allocationNo)
            {
                this.paramFields = new ParameterFields();
                this.paramFields.Clear();
                ParameterField paramField = new ParameterField { ParameterFieldName = "@AllocationNo" };
                ParameterDiscreteValue discreteVal = new ParameterDiscreteValue { Value = allocationNo };
                paramField.CurrentValues.Add(discreteVal);
                paramFields.Add(paramField);
            }

            private void DisplayReport()
            {
                frmReportViewer showReport = new frmReportViewer();
                showReport.ReportViewer.ReportSource = this._report;
                showReport.ReportViewer.ParameterFieldInfo = paramFields;
                showReport.ShowDialog();
                showReport.Dispose();
            }
        }
    }
    Any help would be much appreciated.

    Hi Garry,
    Please post SAP Crystal Reports questions in their own forums here:
    SAP Crystal Reports, version for Visual Studio
    We don't provide support for this control now. Thanks for your understanding.

  • What is an ageing report? What are the data sources used to develop an aging report?

    Hello BW gurus,
    I was going through some BW resumes and could not understand some of the points mentioned below. Kindly go through them and please explain each of them.
    Thank you.
    TR.
    • Developed AR ageing report, created invoice layout and processed invoices.
    What is an ageing report? What are the data sources used to develop an aging report?
    • Worked on month-end and year-end processes such as Balance Sheet statements and Profit and Loss accounts.
    What data sources does one use to get the Balance Sheet and P&L accounts tables and fields?
    • Involved in the end-to-end implementation of BW at Reliance Group as a team member.
    What tasks does a BW consultant normally perform when involved in an end-to-end implementation project or a full life cycle project?
    • Extensively worked on BW Statistics to optimize the performance of InfoCubes and to create Aggregates.
    What does it mean to work on BW Statistics to optimize the performance of InfoCubes? What are aggregates and why do you need them?
    • Prepared design documents from the Business Requirement documents and identified the relevant data targets for satisfying the customer requirements.
    What design documents does one prepare? Please give an example. Is a cube a data target?

    What is an ageing report? What are the data sources used to develop an aging report?
    Aging refers to grouping values into different time period ranges. For example, a customer (credit) aging report may show the open amounts for the current period, 0 to 30 days, 30 to 90 days and 90 to 120 days. This is how aging is classified.
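    As a rough illustration of that bucketing idea only (a sketch, not code from any standard extractor or data source; all names and the hard-coded due date are made up), the classification of a single open item by days overdue could look like this:
    * Sketch: classify one open item into an aging bucket by days overdue.
    * Variable names and the due date are illustrative only.
    DATA: lv_due_date TYPE d VALUE '20240101',
          lv_days     TYPE i,
          lv_bucket   TYPE string.
    lv_days = sy-datum - lv_due_date.   "days elapsed since the due date
    IF lv_days <= 0.
      lv_bucket = 'Current'.
    ELSEIF lv_days <= 30.
      lv_bucket = '0 to 30 days'.
    ELSEIF lv_days <= 90.
      lv_bucket = '30 to 90 days'.
    ELSEIF lv_days <= 120.
      lv_bucket = '90 to 120 days'.
    ELSE.
      lv_bucket = 'Over 120 days'.
    ENDIF.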
    What data sources does one use to get the Balance Sheet and P&L accounts tables and fields?
    For P&L information, you may use the 0FI_GL_6 datasource (or 0FI_GL_10 if you use the ERP 5.0 version). This datasource reads the same information used in R/3 transaction F.01 (table GLT0).
    What tasks does a BW consultant normally perform when involved in an end-to-end implementation project or a full life cycle project?
    Requirements gathering, blueprint creation, development, etc.
    Refer to posts on SAP methodology and SAP lifecycle.
    What does it mean to work on BW Statistics to optimize the performance of InfoCubes? What are aggregates and why do you need them?
    Please check these links
    http://help.sap.com/saphelp_nw04/helpdata/en/8c/131e3b9f10b904e10000000a114084/frameset.htm
    http://help.sap.com/saphelp_nw04/helpdata/en/f0/3c8c3bc6b84239e10000000a114084/plain.htm
    What design documents does one prepare? Please give an example.
    A design document basically describes how the design for a particular solution is to be developed. For example, for FI it says what the data flow is, which data targets are to be used and how the data should be stored to provide the client with the solution they need.
    Is a cube a data target?
    Yes, a cube is a data target.
    Hope this helps,
    Anand Raj

  • Import option is not showing the Data Source created using ODBC

    Hi all,
    I'm new to OBIEE. I just installed OBIEE 10g on my system with the help of a reply from this forum (thanks to Kalyan for that).
    Now when I try to create a new RPD, the Import option is not showing the data source I have created. I created the new data source in the System DSN tab of the ODBC Administrator window, and I have also tried creating it in the User DSN tab.
    I am on Windows 7 Ultimate. The new data source I created through the ODBC Administrator window is to connect to SQL Server 2008 (I'm trying to get some tables from a SQL Server 2008 database); I don't have an Oracle database.
    While creating the data source, the ODBC Administrator shows only two drivers (SQL Server and SQL Server Native Client 10.0), so I created the data source using the SQL Server Native Client 10.0 driver.
    Can someone help me understand the issue and solve it?
    Thanks,
    Mithun

    By calling your cellular data provider.

  • Trying to change the data source for a Crystal Report.

    The method below represents my best attempt to programmatically change the data source of a Crystal Report. The goal is to have a routine that will update the data source for reports after they have been distributed to production servers. So far I have not been successful in saving the report back to the CMS. No exceptions are thrown, but when I view the Database Configuration of the report in the CMC nothing has changed.
    Am I missing a step, or is there another way to accomplish this?
    Thank you.
    private void test(String reportName)
       throws SDKException, ReportSDKException, java.io.IOException
    {
       IInfoObjects newInfoObjects;
       IInfoObject reportObj;
       ReportClientDocument clientDoc = new ReportClientDocument();
       DatabaseController dc;
       PropertyBag pBag;
       PropertyBag logonProps;
       ConnectionInfo newConInfo;
       ConnectionInfo oldConInfo;
       ConnectionInfos conInfos;
       int connOptions = DBOptions._ignoreCurrentTableQualifiers + DBOptions._doNotVerifyDB; //0;
       Fields connFields = null;
       String queryStr = "Select * From CI_INFOOBJECTS " +
          "Where SI_NAME='wfr.rpt' AND SI_KIND='CrystalReport' AND SI_INSTANCE=0";
       newInfoObjects = getCms().executeQuery(queryStr);
       if (newInfoObjects.size() > 0)
       {
          reportObj = (IInfoObject)newInfoObjects.get(0);
          try
          {
             clientDoc = getCms().getReportAppFactory().openDocument(
                reportObj
                , OpenReportOptions._refreshRepositoryObjects
                , java.util.Locale.US);
             dc = clientDoc.getDatabaseController();
             conInfos = dc.getConnectionInfos(null);
             for (int i = 0; i < conInfos.size(); ++i)
             {
                oldConInfo = (ConnectionInfo)conInfos.getConnectionInfo(i);
                newConInfo = (ConnectionInfo)oldConInfo.clone(true);
                pBag = newConInfo.getAttributes();
                pBag.putStringValue("QE_ServerDescription", "alio");
                logonProps = new PropertyBag();
                logonProps.putStringValue("Trusted_Connection", "false");
                logonProps.putStringValue("Server", "alio");
                pBag.put("QE_LogonProperties", logonProps);
                newConInfo.setUserName("admin");
                newConInfo.setPassword("password");
                dc.replaceConnection(
                   oldConInfo
                   , newConInfo
                   , connFields
                   , connOptions);
             }
          }
          catch (ReportSDKServerException Ex)
          {
             String msg = "A server error occured while processing the " + reportObj.getKind()
                + " object, " + reportObj.getTitle() + " (" + reportObj.getCUID() + "), from the CMS.";
             Utility.errorOut(msg, Ex);
          }
          catch (Exception Ex)
          {
             String msg = "An error occured while processing the " + reportObj.getKind()
                + " object, " + reportObj.getTitle() + " (" + reportObj.getCUID() + "), from the CMS.";
             Utility.errorOut(msg, Ex);
          }
          finally
          {
             clientDoc.save();
             getCms().commitToInfoStore(newInfoObjects);
             clientDoc.close();
          }
       }
    }
    Edited by: Mark Young on Sep 10, 2009 2:16 PM

    Justin,
    Thank you for the reply. Time constraints have not allowed me to post back to this thread till now. I will try your suggestion. My assumption is that "Save the report back to the info store" refers to IInfoStore.commit(IInfoObjects).
    I'm afraid that I do not understand why I don't want to change the report client document, or why "successfully exporting the report with the new login/password" is not what I want to do. Any explanation on that statement would be appreciated.
    I did find a way to accomplish my goal. It involved adding the SSOKEY property to the logon property bag. Below you'll see my revised code, which modifies the report logon and server. I have no idea what this does, and SAP support has not been able to tell me why it works. However, what I discovered is that if I changed the report option Database Configuration -> "When viewing report:" in the CMS to "Use same database logon as when report is run" from "Prompt the user for database logon", then the SSOKEY property had been added to the logon property bag with an empty string as its value. This allowed me to successfully update and save the modified logon back to the CMS.
    So I took a chance and added code to always add the SSOKEY property with an empty string as its value, and I could then successfully modify and save the report's logon info and server. Again, I don't know what this means, but it has worked so far. If anyone has some insight or comments, either are welcome. Thank you in advance.
    private void changeDataSourceOfAWFCrystalReports()
       throws SDKException, ReportSDKException, java.io.IOException
    {
       IInfoObjects newInfoObjects = null;
       IInfoObject reportObj = null;
       IReport curReport = null;
       ReportClientDocument clientDoc = new ReportClientDocument();
       DatabaseController dbController;
       PropertyBag pBag;
       PropertyBag logonProps;
       ConnectionInfo newConInfo;
       ConnectionInfo oldConInfo;
       ConnectionInfos conInfos;
       int connOptions = DBOptions._ignoreCurrentTableQualifiers + DBOptions._doNotVerifyDB;
       Fields connFields = null;
       String outputStr;
       int numOfReports;
       int numOfQueryPages;
       double progressIncrementPerPage = 30;
       int progressIncrementPerReport = 0;
       // Path query to reports is in a .properties file.
       String queryStr = getAppSettingsFile().getWscAwfCrystalReportPathQuery();
       try
       {
          // Executes IInfoStore.getPageingQuery() and generates a list of queries.
          getCms().setPathQueryQueries(queryStr, 100);
          numOfQueryPages = 0;
          // Gets a List<String> of the IPageResult returned from IInfoStore.getPageingQuery().
          if (getCms().getPathQueryQueries() != null)
          {
             numOfQueryPages = getCms().getPathQueryQueries().size();
          }
          if (numOfQueryPages > 0)
          {
             // Use 30% of progress bar for the following loop.
             progressIncrementPerPage = Math.floor(30.0 / (double)numOfQueryPages);
          }
          for (int queryPageIndex = 0; queryPageIndex < numOfQueryPages; ++queryPageIndex)
          {
             // Gets the IInfoObjects returned from the current page query.
             newInfoObjects = getCms().getPathQueryResultSetPage(queryPageIndex);
             numOfReports = newInfoObjects.size();
             if (newInfoObjects != null && numOfReports > 0)
             {
                progressIncrementPerReport =
                   Math.round((float)Math.floor(progressIncrementPerPage / (double)numOfReports));
                for (int reportIndex = 0; reportIndex < numOfReports; ++reportIndex)
                {
                   reportObj = (IInfoObject)newInfoObjects.get(reportIndex);
                   curReport = (IReport)reportObj;
                   clientDoc = getCms().getReportAppFactory().openDocument(
                      reportObj
                      , OpenReportOptions._refreshRepositoryObjects
                      , java.util.Locale.US);
                   dbController = clientDoc.getDatabaseController();
                   conInfos = dbController.getConnectionInfos(null);
                   for (int conInfosIndex = 0; conInfosIndex < conInfos.size(); ++conInfosIndex)
                   {
                      oldConInfo = (ConnectionInfo)conInfos.getConnectionInfo(conInfosIndex);
                      newConInfo = (ConnectionInfo)oldConInfo.clone(true);
                      pBag = newConInfo.getAttributes();
                      pBag.putStringValue(
                         "QE_ServerDescription"
                         , getConfigFile().getDBDataSourceConnections());
                      logonProps = new PropertyBag();
                      logonProps.putStringValue("Trusted_Connection", "false");
                      logonProps.putStringValue("SSOKEY", "");  // the addition described above
                      logonProps.putStringValue(
                         "Server"
                         , getConfigFile().getDBDataSourceConnections());
                      pBag.put("QE_LogonProperties", logonProps);
                      newConInfo.setUserName(getConfigFile().getUNVConnectionUserName());
                      newConInfo.setPassword(getConfigFile().getUNVConnectionPasswordDecrypted());
                      dbController.replaceConnection(
                         oldConInfo
                         , newConInfo
                         , connFields
                         , connOptions);
                      newConInfo = (ConnectionInfo)conInfos.getConnectionInfo(conInfosIndex);
                   } // end for on conInfosIndex
                   clientDoc.save();
                } // end for on reportIndex
             } // end if on newInfoObjects
          } // end for on queryPageIndex
       } // end try
       catch (ReportSDKServerException Ex)
       {
          // handle...
       }
       catch (Exception Ex)
       {
          // handle...
       }
       finally
       {
          getCms().commitToInfoStore(newInfoObjects);
          if (clientDoc != null)
          {
             clientDoc.close();
          }
       }
    }

  • Unable to Connect to the Data Source with Report Builder when using a closed Excel file

    I am using Report Builder 3.0 and to get started I am using an Excel spreadsheet as my data source. I set this up just fine. I can connect successfully if the file is open in Excel, but if I close the file, I get the error "Unable to connect to the data source".
    This is my connection string:
    Dsn=Licenses;dbq=C:\USERS\AMEADE\DOCUMENTS\licenses.xlsx;defaultdir=C:\USERS\AMEADE\DOCUMENTS;driverid=790;fil=excel 8.0;maxbuffersize=2048;pagetimeout=5
    Works fine if the file is open but stops working when the file is closed.  Any idea what I can do?

    Hi Alice,
    Based on my understanding, when you keep the Excel file open you can connect to the data source correctly, but when you close the Excel file the error “Unable to Connect to the data source” is thrown.
    In your scenario, I would like to know if only this Excel file has the issue. Have you experienced the same issue when you use other Excel files as a data source? As we tested in our environment, we can connect to the data source whether or not the Excel file is open. You can refer to this article to create a data source again, then check whether you can connect to the data source when the Excel file is closed.
    If the issue persists, I would suggest you use Process Monitor to capture the processes during the connection to the data source, then check the result of each process to find the exact reason.
    If you have any question, please feel free to ask.
    Best regards,
    Qiuyun Yu
    TechNet Community Support

  • How to enable the Delta for the data source 0CO_OM_OPA_7

    Hi Experts,
    I have a business content data source 0CO_OM_OPA_7 and I need to activate and enable Delta for this data source.
    This data source is activated (but not delta enabled) and has been used for full loads in my system for the past three years. As this is a full load, the data source carries more than 4 million records every day, adding 2 million records every year, and the load takes 4 to 5 hours to complete each day.
    Now I need to enable delta for this data source. To do so I used transaction RSA5 and clicked on 'Activate DataSources', but delta is still not enabled for the data source.
    I have also checked in transaction RSA6; the 'Delta' checkbox is not checked for this data source.
    Can anybody help in enabling delta for this data source?
    Thanks in Advance
    Narendra

    Hi,
    DS "0CO_OM_OPA_7" won't support deltas, so load to ODS then to Cube. Else delete the overlaping request then load this you can set in InfoPackage level.
    You check any Period/Month fields in 0CO_OM_OPA_7 then write simple routine in Infopackage and the set the overlaping deletion.
    Use the followng code and change it for your requirement in Infopackage level.
    I'm using 0CO_PC_01 DataSource, so it won't have delta so I'm using the following code in InfoPackageag and set delete overlaping req. So eveyday it will delete the old req for that period and then it will load data againg for that period (Full loads).
    DATA: l_idx       LIKE sy-tabix,
          zzdate      LIKE sy-datum,
          zzbuper     LIKE t009b-poper,
          zzbdatj     LIKE t009b-bdatj,
          zzperiod(7) TYPE c.

*   Locate the FISCPER selection row in the InfoPackage range table.
    READ TABLE l_t_range WITH KEY fieldname = 'FISCPER'.
    l_idx = sy-tabix.

*   Derive the fiscal period from yesterday's date (fiscal year variant V3).
    zzdate = sy-datum - 1.
    CALL FUNCTION 'DATE_TO_PERIOD_CONVERT'
      EXPORTING
        i_date   = zzdate
*       i_monmit = 00
        i_periv  = 'V3'
      IMPORTING
        e_buper  = zzbuper
        e_gjahr  = zzbdatj.
*     EXCEPTIONS
*       input_false    = 1
*       t009_notfound  = 2
*       t009b_notfound = 3
*       OTHERS         = 4
    IF sy-subrc <> 0.
*     MESSAGE ID sy-msgid TYPE sy-msgty NUMBER sy-msgno
*             WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
    ENDIF.

*   Restrict the selection to that single fiscal period (year + period).
    CONCATENATE zzbdatj zzbuper INTO zzperiod.
    l_t_range-low    = zzperiod.
    l_t_range-option = 'EQ'.
    l_t_range-sign   = 'I'.
    MODIFY l_t_range INDEX l_idx.
    Thanks
    Reddy

  • Unable to capture the Data Source into a Transport Request

    Hi All,
    We have a product hierarchy and we are using the data source 4R_PRODH_D_LGEN_HIER for the hierarchy.
    Now we need to transport this structure to the quality environment, but we were not able to capture the data source 4R_PRODH_D_LGEN_HIER in a transport request.
    Whenever we activate the data source 4R_PRODH_D_LGEN_HIER it asks for the package and the transport request number. If we give these details and save, the data source is not captured in the request; only the "Object Directory Entry" is captured.
    Can someone please guide me on how to capture the data source under "Data Sources in BW" in a transport request?
    Regards,
    Sachin Dehey.

    Hi Sachin,
    A hierarchy DataSource is not captured the way attribute and text DataSources are, so what you have done is correct.
    What is captured in the Object Directory Entry is correct, so go ahead with your transports. Once the transport is done, check the hierarchy InfoPackage under Available OLTP Hierarchies and load the data.
    Most importantly, first make sure that all master and transactional DataSources are transported in R/3 from Dev to QA to PRD.
    In BW, DataSources themselves are not transported; only their replica is transported. Transport of the DataSource is done in R/3.
    So what you have done so far is correct; go ahead.
    When attaching a hierarchy DataSource it is captured only in the "Object Directory Entry".
    Regards,
    Vishnu.

  • Refreshing the Data Source View in Analysis Services

    I have added columns to the SQL database table that is used as a dimension in an Analysis Services cube. The new columns will be used as additional property fields for the dimension. When I attempted to refresh the Data Source View so that the additional columns are present, I was given the following error:
    System.Data
    Property not accessible because 'Parent Columns and Child Columns don't have type-matching columns'
    I have done nothing to the columns used for the parent or child, and the error message provides nothing to go on. Does anyone have any ideas on this?
    Gary

    Olga,
    Thanks for your response. I will try to answer your questions:
    1) I have not tried removing the columns yet. I will try that this afternoon but have limited hope. The two columns I added are simple text columns that will be used as attributes in the dimension. I have made no change to the parent or child columns.
    2) The table I modified is the source table for a parent-child dimension.
    3) The reference to the "check list" does not take me to any kind of check list.
    4) The parent-child dimensions I am trying to modify have been in use for months, and the parent and child columns do have the same data types.
    5) I have also checked the data types between the dimension table and the fact table. They use the same data types (smallint).
    6) I have not made a collection for the parent key; it is a single column. The remainder of your last paragraph is not clear to me. Can you give me an example?
    I am fairly inexperienced with Analysis Services, please talk slow and use small words  :-)
    Thanks again for your help!
    Gary

  • Error while activating the data source.

    Dear all,
    We are using BI version 7.00. I have a problem: while creating the InfoPackage for the data source 0PM_OM_OPA_2, the following error is displayed:
    "No active transfer rule for DS 0PM_OM_OPA_2 and ....source system CLNTDEV400; create first..."
    Moreover, the data source is found to be in version "M". While activating it, when I click on the display/change icon, the following error pops up:
    "You can only edit 3.x DataSources using transfer rule maintenance. Do you want to create the transfer rules now?"
    What could be the reason for this error? What needs to be done to replicate and pull the data?
    Expert comments would be appreciated in this regard.
    Regards,
    M.M

    Hi Magesh,
    Replicate your data source in BW, then right-click on it and choose 'Assign InfoSource'. It may take you to Business Content, where you may need to install the transfer rules and related InfoObjects. Once you have maintained this data source from the source system, you can create an InfoPackage without any extra effort.
    Hope it helps.

  • Error while deleting the Data Source

    Hi gurus,
    I am getting an error while deleting a Data Source - "Source system XXXXX  not found RSAR205".
    but that source system is no longer available.
    Is there any way to delete the Data Source?
    Thanks in advance.
    CK.

    Hi,
    That source system has been deleted.
    Using RSDS gives the same error.
    Thanks,
    CK
    Edited by: CK on Oct 3, 2008 11:37 AM

  • Unable to get the data in RSA3 for the data source 2LIS_03_BX.

    Hi Xperts,
    We are loading the data source (2LIS_03_BX) by using transaction MCNB but are not getting any data in it.
    Can anyone please tell me how to load the data into it, step by step?
    regards,
    satish

    Hi,
    To see the data in RSA3 you need to fill the setup tables for the particular data source through transaction SBIW.
    The RSA3 transaction displays the data present in the setup tables of the particular data source.

  • BOFC Unable to start the data source

    Hi there,
    I've done a standalone install using SQL Server 2008 R2 Express, but I'm unable to start the data source and get this error:
    Failed to start data source
    Failed to start server instances on machine DELL1.
    Failed to initialize server configured on machine DELL1.
    Class not registered
    Thanks,
    Anees

    Ok, problem solved.
    I had to create a separate DCOM user and assign it the following rights in Windows:
    log on as a batch job
    log on as a service
    Then I assigned this DCOM user to the Ctbroker and CTserver. As I'm not running this on a domain, there was no need to use 'packet level authentication', so I set the authentication level to none.
    There was no need to use the web-based admin interface; Ctadmin.msc worked perfectly.
    Thanks,
    Anees

  • Fields Missing in the data source

    Hello Friends,
    I have append structures in my data source, and there are a few fields under each append. Fields that exist in Dev are missing in QA under one of the append structures. I see the fields when I double-click the extract structure, but I don't see them in the data source in QA. Has anyone had this before?
    Need your help.

    Hi,
    Go to transaction RSA6 -> your datasource. You will see a number of fields (standard and appended). Find the missing field there and check whether the checkbox in the HIDE column is ticked against it. If the box is ticked, the field won't be shown in the datasource; it hides the field in the datasource.
    Hope this helps.
    Regards,
    Viren

  • Fields missing while creating the data source

    Hi All
    I am using a third-party system to pull data into BI. There was one table to which, on the other end, they recently added 2 fields. When I checked in development I was able to create the data source for the table along with the new fields, but when I check it in production those newly added 2 fields are missing. Though I am able to see the old fields in the data source, the newly added fields are missing. So is there a problem with the refresh of the table, or should we restart the connection?
    Can any one  please  let me know a solution for this...
    Regards
    Shilpa

    Hi,
    Do you have connectivity like this:
    BI Dev
    BI QA          ---> all three connected to the same 3rd party system
    BI Prod
    If yes: if it were R/3, replicating the datasource would help. In the case of a third-party system you need to regenerate the datasource.
    If it is DB Connect, go to the source system connection on the BI side, connect to the database table, and at the top you will have 'Generate DataSource'. Use that option and you will be able to solve it.
    Else, if the landscape is:
    BI Dev - 3rd party dev
    BI Prod - 3rd party prod
    try to add the 2 new fields in the 3rd party system and generate the datasource as described above.
    Thanks
    Arun
    Edited by: Arunkumar Ramasamy on Sep 14, 2009 9:12 AM
