I want to fill the DataSource 0CO_OM_WBS_3 from the COSP table.

Hi,
I am working on plan and actual data for cost controlling, and I need to fill the DataSource with data from the COSP table.
Please give me the steps for this. I checked the DataSource in RSA3 and it does not return any data, but the corresponding table COSP contains data.
Thanks
Durg

Hi,
See the following DataSources for your requirement:
0CO_OM_WBS_1                      WBS elements: Costs
0CO_OM_WBS_2                      WBS elements: Budget
0CO_OM_WBS_3                      WBS elements: Overall plan
0CO_OM_WBS_4                      WBS elements: Statistical key figures
0CO_OM_WBS_5                      WBS Elements: Selections
0CO_OM_WBS_6                      WBS Elements: Actual Costs Using Delta Extraction
0CO_OM_WBS_7                      WBS Elements: Commitment Line Items
0CO_OM_WBS_8                      WBS Elements: Budget Line Items
Use 0CO_OM_WBS_6 for actuals, because it has a delta option. For plan data you can go with 0CO_OM_WBS_2 or 0CO_OM_WBS_3. See the SAP Help on this:
http://help.sap.com/saphelp_nw04/helpdata/EN/08/47c6f8bc1db04a84c9185954208ee7/frameset.htm
In this help you can find complete details of the underlying tables, and there is help for all of the DataSources listed above.
Thanks
Reddy

Similar Messages

  • Want to check which data source is used by the composites

    HI All,
    I have a requirement in which I want to see which data source is used by the composites deployed on SOA.
    I don't have the code.
    Is there any quick way to do it (from the console or the backend, without downloading the code)?
    Thanks

    You can export the jar file from the EM console.
    1) Login to EM console
    2) Right-click on the composite, select Export, and with the default options click the Export button.
    Find the data source name in the exported code.
    Mark the posting appropriately as "helpful" or "correct answer", if your issue is solved.
    Regards
    Albin I
    [http://albinsblog.com/]
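    If you need to do this for many composites, you can also scan the exported archive for data source references instead of opening it by hand. The class below is only a rough sketch in plain Java; the archive name and the search strings are examples, not part of the original answer:

    import java.io.BufferedReader;
    import java.io.InputStreamReader;
    import java.util.Enumeration;
    import java.util.jar.JarEntry;
    import java.util.jar.JarFile;

    public class DataSourceScanner {
        public static void main(String[] args) throws Exception {
            // Archive exported from the EM console (example name; pass yours as the first argument).
            String archive = args.length > 0 ? args[0] : "sca_MyComposite_rev1.0.jar";
            try (JarFile jar = new JarFile(archive)) {
                Enumeration<JarEntry> entries = jar.entries();
                while (entries.hasMoreElements()) {
                    JarEntry entry = entries.nextElement();
                    // Adapter configuration files (.jca) and other XML artifacts usually carry the JNDI name.
                    if (!entry.getName().endsWith(".jca") && !entry.getName().endsWith(".xml")) {
                        continue;
                    }
                    try (BufferedReader reader = new BufferedReader(
                            new InputStreamReader(jar.getInputStream(entry)))) {
                        String line;
                        int lineNo = 0;
                        while ((line = reader.readLine()) != null) {
                            lineNo++;
                            // Print lines that look like data source / JNDI references.
                            if (line.contains("jndi") || line.contains("DataSource")) {
                                System.out.println(entry.getName() + ":" + lineNo + "  " + line.trim());
                            }
                        }
                    }
                }
            }
        }
    }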

  • How do I change the Data Source? Please help

    Hello Everyone,
    Can anyone provide me with the steps to change the extractor of a data source? Here is the situation: we have a "Z" data source, and in the data source display transaction RSA2 I see that the extraction method is "V" (transparent table or DB view) and the extractor is a "Z" table which we maintain in R/3 for mapping the SAP G/L accounts to Hyperion accounts. We have created a new "Z" table in R/3 for the SAP-to-Hyperion account mappings because of changes in the Hyperion reporting system, so the BW system does not have the latest data that is maintained in the new "Z" table. It therefore appears to me that once I update the data source so that the extractor uses the new "Z" table, BW will receive the mapping information from the new "Z" table. Please let me know if I am wrong and suggest the right way to deal with this issue.
    If updating the data source solves my problem, how do I go about doing that? Can I make the changes in the R/3 production system directly, or do they have to come through a transport? Please provide the steps to make those changes. Points will be awarded for the help.
    Thanks in advance,
    Kumar

    Hi Ravi,
    Here is what I meant by the change in mapping and the new mapping:
    1. For example, R/3 account 1234 is mapped to Hyperion account ABC in the old Z table, and that mapping was copied to the new Z table since we want to use the new Z table going forward; but in the new table the mapping has changed, i.e., R/3 account 1234 is now mapped to Hyperion account XYZ.
    2. All new accounts created in R/3 after we started using the new Z table are mapped to Hyperion accounts only in the new Z table, meaning all new mappings are maintained only in the new Z table.
    Please reply.
    Thanks,
    Kumar

  • Replicating the data source

    Hi,
    I have an attribute DataSource, and after I replicated the DataSource again from the source system, its type changed from attribute to hierarchy.
    Can you please tell me how this can happen and what needs to be done to revert it back to an attribute DataSource.
    Regards,
    Harikiran Gunnala

    Hi Hari,
    It cannot change to hierarchy automatically. Check whether you are looking at another DataSource; the attribute DataSource is the one whose name ends with _ATTR.
    Hope it helps
    Regards
    chandra sekhar

  • How to get a group when the data source is a backend system instead of the UME database

    Hi guys,
    How do I get a group when the data source comes from a backend system instead of the UME database?
    I tried to use
    IUMPrincipal RefGroup = WPUMFactory.getGroupFactory().getGroup(groupName);
    But I was not able to get the group that way, although I can find this group name in "UserAdministrator".
    Which API can I use?
    Thanks in advance!
    Regards,
    Liying

    Ok,
    try this:
    com.sapportals.portal.security.usermanagement.IGroupFactory ep5GroupFactory = userManagementService.getGroupFactory();
    IGroupFactory groupFactory = UMFactory.getGroupFactory();
    com.sap.security.api.IGroup group = groupFactory.getGroupByUniqueName(groupName);
    IUMPrincipal ep5Principal = ep5GroupFactory.getEP5Group(group);
    This should do the trick,
    Romano
    PS: and thanks for the stars!
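    For reference, the calls above can be wrapped into one small helper. This is only a sketch pieced together from the snippets in this thread; the EP5 group factory is assumed to come from the user management service as shown above, and the exception handling is left generic:

    // Sketch: resolve a group that comes from the backend data source and wrap it
    // as an EP5 principal for the older WPUMFactory-style APIs.
    private com.sapportals.portal.security.usermanagement.IUMPrincipal findBackendGroup(
          com.sapportals.portal.security.usermanagement.IGroupFactory ep5GroupFactory,
          String groupName) throws Exception
    {
       // The UME factory also sees groups maintained in a backend system,
       // not only those stored in the UME database.
       com.sap.security.api.IGroupFactory umeGroupFactory =
          com.sap.security.api.UMFactory.getGroupFactory();
       com.sap.security.api.IGroup group = umeGroupFactory.getGroupByUniqueName(groupName);

       // Convert the UME group into an EP5 principal.
       return ep5GroupFactory.getEP5Group(group);
    }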

  • How to find out the data source information for particular tables

    Hi,
    Can you please tell me the process to find out the data source information for particular tables,
    for example T2503, T2507, T25A1, T25A2, etc.
    I am doing reverse engineering to find out the data sources built on the above SAP tables.
    Thanks.

    Hi,
    We still haven't received the field-level information; before they send it to us, we should first give them the corresponding data sources for the COPA tables they have provided.
    I have searched help.sap.com but did not find any information on this.
    Please let me know whether there is any way to find the data source details for SAP tables.
    Thanks

  • Filling test data in SFLIGHT, SCARR, SPFLI tables

    Hi,
    Can someone please tell me whether there is any way to fill the SFLIGHT, SCARR and SPFLI tables,
    because in my system there is no data in them. I want to fill in the data to test the objects.
    Thanks,
    GAL

    The program you are looking for is called SAPBC_DATA_GENERATOR.
    Cheers
    Graham Robbo

  • Data deleted in ODS, data coming from the data source : 2LIS_11_VASCL

    Friends,
    I have situation :
    Data deleted in ODS, data coming from the data source : 2LIS_11_VASCL.
    When I tried to delete a request for the above ODS and DataSource, the whole data set got deleted.
    All the data was deleted in the foreground; no background job was generated for this.
    I am really worried about this issue. Can you please tell me what the possible causes are?
    Many Thanks
    VSM

    Hi,
    I suppose you want to know the possibility of recovering the data.
    If the entire data set was deleted, you can reload the data from the source system.
    Fill the setup table for your application, then carry out an init request.
    Please note that you would have to use a transaction-free period for this activity so that no data is missed.
    Once this is done, delta queues will again start filling up.

  • Error while activating the data source.

    Dear all,
    We are using BI version 7.00. I have a problem: while creating the InfoPackage for the DataSource
    0PM_OM_OPA_2, the following error is displayed:
    "No active transfer rule for DS 0PM_OM_OPA_2 and ....source system CLNTDEV400; create first..."
    Moreover, the DataSource is found to be in version "M". While activating it, when I click on the display/change icon, the following error pops up:
    "You can only edit 3.x DataSources using transfer rule maintenance. Do you want to create the transfer rules now?"
    What could be the reason for this error? What needs to be done to replicate the DataSource and pull data?
    Valid expert comments are expected in this regard.
    Regards,
    M.M

    Hi Magesh,
    Replicate your DataSource in BW, then right-click on it and choose Assign InfoSource. This may take you to Business Content, where you may need to install the transfer rules and related InfoObjects. Once this DataSource is maintained in the source system, you can create an InfoPackage without any extra effort.
    Hope it helps..

  • I have a VI and an attached .txt data file. Now I want to read the data from the .txt file and display it as an array on the front panel. But the result is not right. Any help?

    I have a VI and an attached .txt data file. Now I want to read the data from the .txt file and display it as an array on the front panel. But the result is not right. Any help?
    Attachments:
    try2.txt ‏2 KB
    read_array.vi ‏21 KB

    The problem is in the delimiters in your text file. By default, Read From Spreadsheet File.vi expects a tab delimited file. You can specify a delimiter (like a space), but Read From Spreadsheet File.vi has a problem with repeated delimiters: if you specify a single space as a delimiter and Read From Spreadsheet File.vi finds two spaces back-to-back, it stops reading that line. Your file (as I got it from your earlier post) is delimited by 4 spaces.
    Here are some of your choices to fix your problem.
    1. Change the source file to a tab delimited file. Your VI will then run as is.
    2. Change the source file to be delimited by a single space (rather than 4), then wire a string constant containing one space to the delimiter input of Read From Spreadsheet File.vi.
    3. Wire a string constant containing 4 spaces to the delimiter input of Read From Spreadsheet File.vi. Then your text file will run as is.
    Depending on where your text file comes from (see more comments below), I'd vote for choice 1: a tab delimited text file. It's the most common text output of spreadsheet programs.
    Comments for choices 1 and 2: Where does the text file come from? Is it automatically generated or manually generated? Will it be generated multiple times or just once? If it's manually generated or generated just once, you can use any text editor to change 4 spaces to a tab or to a single space. Note: if you want to change it to a tab delimited file, you can't enter a tab directly into a box in the search & replace dialog of many programs like notepad, but you can do a cut and paste. Before you start your search and replace (just in the text window of the editor), press tab. A tab character will be entered. Press Shift-LeftArrow (not Backspace) to highlight the tab character. Press Ctrl-X to cut the tab character. Start your search and replace (Ctrl-H in notepad in Windows 2000). Click into the Find What box. Enter four spaces. Click into the Replace With box. Press Ctrl-V to paste the tab character. And another thing: older versions of notepad don't have search and replace. Use any editor or word processor that does.
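    If you would rather script the conversion than edit the file by hand, a tiny utility along these lines turns the four-space delimiters into tabs (choice 1 above). The file names are placeholders for your own paths:

    import java.io.IOException;
    import java.nio.charset.StandardCharsets;
    import java.nio.file.Files;
    import java.nio.file.Paths;

    public class SpacesToTabs {
        public static void main(String[] args) throws IOException {
            // Placeholder file names: the original file and the tab-delimited copy to write.
            String input = args.length > 0 ? args[0] : "try2.txt";
            String output = args.length > 1 ? args[1] : "try2_tab.txt";
            // Replace every run of four spaces with a single tab so the file becomes
            // tab delimited, which Read From Spreadsheet File.vi expects by default.
            String text = new String(Files.readAllBytes(Paths.get(input)), StandardCharsets.UTF_8);
            Files.write(Paths.get(output), text.replace("    ", "\t").getBytes(StandardCharsets.UTF_8));
        }
    }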

  • Enhancing the standard DataSource without deleting setup tables

    Hi all,
    I am in a support project. My requirement is to enhance the "2LIS_13_VDITM" LO DataSource with two fields without disturbing the delta.
    Please suggest how I should do this.
    As per my knowledge:
    1. We have to delete the setup tables.
    2. Enhance the DataSource and repopulate the setup tables.
    3. Delete the data in the cube, add the two new fields to the cube, and repopulate the cube with a new init.
    4. After that, delta will be enabled through job control.
    But this process is not suitable for our requirement, because the delta was enabled long ago and has been running smoothly to date; I don't want to disturb that process.
    So please suggest whether there is any other procedure to do this.
    Thanks,
    Kiran Manyam

    Hi,
    If historical data (loaded earlier into BW) is not required for the two enhancement fields, then it is not necessary to delete the setup tables and reload them into BI.
    In this case you can simply follow this procedure:
    1. Enhance the fields and update the transfer structure (to unhide these fields). In BI, update the required data target with the respective InfoObjects, and populate the enhanced fields in the exit. There is no need to disturb the delta.
    2. Replicate the DataSource in BI and do the mappings in the transformation.
    The existing delta keeps working, and you populate the two new fields in the exit only.
    Thanks,
    Jugal.

  • Couldn't fetch the data from the data source...[nQSError: 16023]

    Hi all.
    I think that the problem I want to discuss is well-known, but still I got no answer whatever I tried ...
    I installed the BIEE on Linux (32 bit, OEL 5 - to be more precise), the complete installation was not a big deal. After that I installed the Administration tool on my laptop and created the repository. So... my tnsnames.ora on the laptop looks like this:
    TESTDB =
    (DESCRIPTION =
    (ADDRESS = (PROTOCOL = TCP)(HOST = 192.168.1.5)(PORT = 1521))
    (CONNECT_DATA =
    (SERVER = DEDICATED)
    (SERVICE_NAME = testdb)
    )
    )
    And the tnsnames.ora on server, in its turn, looks like this:
    TESTDB =
    (DESCRIPTION =
    (ADDRESS = (PROTOCOL = TCP)(HOST = localhost.localdomain)(PORT = 1521))
    (CONNECT_DATA =
    (SERVER = DEDICATED)
    (SERVICE_NAME = testdb.localdomain)
    )
    )
    The database worked normally and I created and transferred the repository to the server and started it up.
    It started without any errors, but when I tried to fetch the data via the Presentation Services I got this error:
    Odbc driver returned an error (SQLExecDirectW).
    State: HY000. Code: 10058. [NQODBC] [SQL_STATE: HY000] [nQSError: 10058] A general error has occurred.
    [nQSError: 16023] The ODBC function has returned an error. The database may not be available, or the network may be down. (HY000)
    I discovered that the ODBC connection on my laptop was not named correctly (it should have been identical to the tnsnames entry), so I corrected it, saved and replaced the repository on the server, and restarted it... and still got the same error.
    Apparently, something is wrong with the data source. So let me put here some more information...
    My user.sh looks like this:
    ORACLE_HOME=/u01/app/ora/product/11.2.0/dbhome_1
    export ORACLE_HOME
    TNS_ADMIN=$ORACLE_HOME/network/admin
    export TNS_ADMIN
    PATH=$ORACLE_HOME/bin:/opt/bin:$PATH
    export PATH
    LD_LIBRARY_PATH=$ORACLE_HOME/lib:$LD_LIBRARY_PATH
    export LD_LIBRARY_PATH
    and my odbc.ini looks like this:
    [ODBC]
    Trace=0
    TraceFile=odbctrace.out
    TraceDll=/u01/OracleBI/odbc/lib/odbctrac.so
    InstallDir=/u01/OracleBI/odbc
    UseCursorLib=0
    IANAAppCodePage=4
    [ODBC Data Sources]
    AnalyticsWeb=Oracle BI Server
    Cluster=Oracle BI Server
    SSL_Sample=Oracle BI Server
    TESTDB=Oracle BI Server
    [TESTDB]
    Driver=/u01/OracleBI/server/Bin/libnqsodbc.so
    Description=Oracle BI Server
    ServerMachine=local
    Repository=SH
    Catalog=
    UID=
    PWD=
    Port=9703
    [AnalyticsWeb]
    Driver=/u01/OracleBI/server/Bin/libnqsodbc.so
    Description=Oracle BI Server
    ServerMachine=local
    Repository=
    Catalog=
    UID=
    PWD=
    Port=9703
    [Cluster]
    Driver=/u01/OracleBI/server/Bin/libnqsodbc.so
    Description=Oracle BI Server
    ServerMachine=local
    Repository=
    FinalTimeOutForContactingCCS=60
    InitialTimeOutForContactingPrimaryCCS=5
    IsClusteredDSN=Yes
    Catalog=SnowFlakeSales
    UID=Administrator
    PWD=
    Port=9703
    PrimaryCCS=
    PrimaryCCSPort=9706
    SecondaryCCS=
    SecondaryCCSPort=9706
    Regional=No
    [SSL_Sample]
    Driver=/u01/OracleBI/server/Bin/libnqsodbc.so
    Description=Oracle BI Server
    ServerMachine=localhost
    Repository=
    Catalog=SnowflakeSales
    UID=
    PWD=
    Port=9703
    SSL=Yes
    SSLCertificateFile=/path/to/ssl/certificate.pem
    SSLPrivateKeyFile=/path/to/ssl/privatekey.pem
    SSLPassphraseFile=/path/to/ssl/passphrase.txt
    SSLCipherList=
    SSLVerifyPeer=No
    SSLCACertificateDir=/path/to/ca/certificate/dir
    SSLCACertificateFile=/path/to/ca/certificate/file.pem
    SSLTrustedPeerDNs=
    SSLCertVerificationDepth=9
    Can anybody point a finger at where the error is? According to the documentation it should work fine. Maybe the driver name is wrong? Which driver do I need then?
    I can't find it.
    I'm really sorry to bother you, guys :) Let me know if you get any ideas about it (Metalink didn't help).

    OK, several things are wrong here. First, odbc.ini is not meant to be used for Oracle databases; that is not supported on Linux. On Linux you should use OCI (the Oracle native driver), and nothing should be added to odbc.ini.
    Your user.sh seems to be pointing to your DB installation path. This is not correct: it should point to your Oracle client installation, so you need to install the full Oracle client somewhere. Typically this is done with the same OS account as the one used for OBIEE, whereas the DB normally runs under the oracle account. Once you have the client installed, test it under the OBIEE account with tnsping and sqlplus against your DB.
    Also, LD_LIBRARY_PATH should point to $ORACLE_HOME/lib32, not lib, as the lib directory holds the 64-bit libraries and OBIEE uses the 32-bit libraries even on 64-bit OSes.
    Finally, change your RPD connection to use OCI. Make all those changes and you should be good.

  • Trying to change the data source for a Crystal Report.

    The method below represents my best attempt to programmatically change the data source of a Crystal Report. The goal is to have a routine that will update the data source for reports after they have been distributed to production servers. So far I have not been successful in saving the report back to the CMS. No exceptions are thrown, but when I view the Database Configuration of the report in the CMC nothing has changed.
    Am I missing a step, or is there another way to accomplish this?
    Thank you.
    private void test(String reportName)
       throws SDKException, ReportSDKException, java.io.IOException
    {
       IInfoObjects newInfoObjects;
       IInfoObject reportObj;
       ReportClientDocument clientDoc = new ReportClientDocument();
       DatabaseController dc;
       PropertyBag pBag;
       PropertyBag logonProps;
       ConnectionInfo newConInfo;
       ConnectionInfo oldConInfo;
       ConnectionInfos conInfos;
       int connOptions = DBOptions._ignoreCurrentTableQualifiers + DBOptions._doNotVerifyDB; //0;
       Fields connFields = null;
       String queryStr = "Select * From CI_INFOOBJECTS " +
          "Where SI_NAME='wfr.rpt' AND SI_KIND='CrystalReport' AND SI_INSTANCE=0";
       newInfoObjects = getCms().executeQuery(queryStr);
       if(newInfoObjects.size() > 0)
       {
          reportObj = (IInfoObject)newInfoObjects.get(0);
          try
          {
             clientDoc = getCms().getReportAppFactory().openDocument(
                reportObj
                , OpenReportOptions._refreshRepositoryObjects
                , java.util.Locale.US);
             dc = clientDoc.getDatabaseController();
             conInfos = dc.getConnectionInfos(null);
             for(int i = 0; i < conInfos.size(); ++i)
             {
                oldConInfo = (ConnectionInfo)conInfos.getConnectionInfo(i);
                newConInfo = (ConnectionInfo)oldConInfo.clone(true);
                pBag = newConInfo.getAttributes();
                pBag.putStringValue("QE_ServerDescription", "alio");
                logonProps = new PropertyBag();
                logonProps.putStringValue("Trusted_Connection", "false");
                logonProps.putStringValue("Server", "alio");
                pBag.put("QE_LogonProperties", logonProps);
                newConInfo.setUserName("admin");
                newConInfo.setPassword("password");
                dc.replaceConnection(
                   oldConInfo
                   , newConInfo
                   , connFields
                   , connOptions);
             }
          }
          catch(ReportSDKServerException Ex)
          {
             String msg = "A server error occurred while processing the " + reportObj.getKind()
                + " object, " + reportObj.getTitle() + " (" + reportObj.getCUID() + "), from the CMS.";
             Utility.errorOut(msg, Ex);
          }
          catch(Exception Ex)
          {
             String msg = "An error occurred while processing the " + reportObj.getKind()
                + " object, " + reportObj.getTitle() + " (" + reportObj.getCUID() + "), from the CMS.";
             Utility.errorOut(msg, Ex);
          }
          finally
          {
             clientDoc.save();
             getCms().commitToInfoStore(newInfoObjects);
             clientDoc.close();
          }
       }
    }

    Justin,
    Thank you for the reply. Time constraints have not allowed me to post back to this thread till now. I will try your suggestion. My assumption is that "Save the report back to the info store" refers to IInfoStore.commit(IInfoObjects).
    I'm afraid that I do not understand why I don't want to change the report client document, or why "successfully exporting the report with the new login/password" is not what I want to do. Any explanation of that statement would be appreciated.
    I did find a way to accomplish my goal. It involved adding the SSOKEY property to the logon property bag. Below you'll see my revised code, which modifies the report logon and server. I have no idea what this does, and SAP support has not been able to tell me why it works. However, what I discovered is that if I changed the report option Database Configuration -> When viewing report: in the CMS to "Use same database logon as when report is run" (from "Prompt the user for database logon"), then the SSOKEY property was added to the logon property bag with an empty string as its value. This allowed me to successfully update and save the modified logon back to the CMS.
    So I took a chance and added code to always add the SSOKEY property with an empty string as its value, and I could then successfully modify and save the report's logon info and server. Again, I don't know what this means, but it has worked so far. If anyone has some insight or comments, either are welcome. Thank you in advance.
    private void changeDataSourceOfAWFCrystalReports()
       throws SDKException, ReportSDKException, java.io.IOException
    {
       IInfoObjects newInfoObjects = null;
       IInfoObject reportObj = null;
       IReport curReport = null;
       ReportClientDocument clientDoc = new ReportClientDocument();
       DatabaseController dbController;
       PropertyBag pBag;
       PropertyBag logonProps;
       ConnectionInfo newConInfo;
       ConnectionInfo oldConInfo;
       ConnectionInfos conInfos;
       int connOptions = DBOptions._ignoreCurrentTableQualifiers + DBOptions._doNotVerifyDB;
       Fields connFields = null;
       String outputStr;
       int numOfReports;
       int numOfQueryPages;
       double progressIncrementPerPage = 30;
       int progressIncrementPerReport = 0;
       // Path query to reports is in a .properties file.
       String queryStr = getAppSettingsFile().getWscAwfCrystalReportPathQuery();
       try
       {
          // Executes IInfoStore.getPageingQuery() and generates a list of queries.
          getCms().setPathQueryQueries(queryStr, 100);
          numOfQueryPages = 0;
          // Gets a List<String> of the IPageResult returned from IInfoStore.getPageingQuery().
          if(getCms().getPathQueryQueries() != null)
             numOfQueryPages = getCms().getPathQueryQueries().size();
          if(numOfQueryPages > 0)
             // Use 30% of progress bar for the following loop.
             progressIncrementPerPage = Math.floor(30.0/(double)numOfQueryPages);
          for(int queryPageIndex = 0; queryPageIndex < numOfQueryPages; ++queryPageIndex)
          {
             // Gets the IInfoObjects returned from the current page query.
             newInfoObjects = getCms().getPathQueryResultSetPage(queryPageIndex);
             numOfReports = newInfoObjects.size();
             if(newInfoObjects != null && numOfReports > 0)
             {
                progressIncrementPerReport =
                   Math.round((float)Math.floor(progressIncrementPerPage/(double)numOfReports));
                for(int reportIndex = 0; reportIndex < numOfReports; ++reportIndex)
                {
                   reportObj = (IInfoObject)newInfoObjects.get(reportIndex);
                   curReport = (IReport)reportObj;
                   clientDoc = getCms().getReportAppFactory().openDocument(
                      reportObj
                      , OpenReportOptions._refreshRepositoryObjects
                      , java.util.Locale.US);
                   dbController = clientDoc.getDatabaseController();
                   conInfos = dbController.getConnectionInfos(null);
                   for(int conInfosIndex = 0; conInfosIndex < conInfos.size(); ++conInfosIndex)
                   {
                      oldConInfo = (ConnectionInfo)conInfos.getConnectionInfo(conInfosIndex);
                      newConInfo = (ConnectionInfo)oldConInfo.clone(true);
                      pBag = newConInfo.getAttributes();
                      pBag.putStringValue(
                         "QE_ServerDescription"
                         ,getConfigFile().getDBDataSourceConnections());
                      logonProps = new PropertyBag();
                      logonProps.putStringValue("Trusted_Connection", "false");
                      // Adding SSOKEY with an empty value is the key change described above.
                      logonProps.putStringValue("SSOKEY", "");
                      logonProps.putStringValue(
                         "Server"
                         ,getConfigFile().getDBDataSourceConnections());
                      pBag.put("QE_LogonProperties", logonProps);
                      newConInfo.setUserName(getConfigFile().getUNVConnectionUserName());
                      newConInfo.setPassword(getConfigFile().getUNVConnectionPasswordDecrypted());
                      dbController.replaceConnection(
                         oldConInfo
                         , newConInfo
                         , connFields
                         , connOptions);
                      newConInfo = (ConnectionInfo)conInfos.getConnectionInfo(conInfosIndex);
                   } // end for on conInfosIndex
                   clientDoc.save();
                } // end for on reportIndex
             } // end if on newInfoObjects
          } // end for on queryPageIndex
       } // end try
       catch(ReportSDKServerException Ex)
       {
          // handle...
       }
       catch(Exception Ex)
       {
          // handle...
       }
       finally
       {
          getCms().commitToInfoStore(newInfoObjects);
          if(clientDoc != null)
             clientDoc.close();
       }
    }

  • Modify the data source of a data view web part

    I have a DataView web part deployed in a template in multiple site collections. This DataView doesn't have any query set up, so it loads a lot of items and this is slowing down my system.
    Now I want to programmatically set a query in place of the existing (empty) one, and I'm doing it like this:
    System.Web.UI.WebControls.WebParts.WebPart y = (System.Web.UI.WebControls.WebParts.WebPart)item;
    Microsoft.SharePoint.WebPartPages.DataFormWebPart z = (Microsoft.SharePoint.WebPartPages.DataFormWebPart)item;
    StringBuilder dataSourceString = new StringBuilder("<%@ Register TagPrefix=\"sharepoint\" Namespace=\"Microsoft.SharePoint.WebControls\" Assembly=\"Microsoft.SharePoint, Version=14.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c\" %>");
    dataSourceString.Append("<%@ Register TagPrefix=\"WebPartPages\" Namespace=\"Microsoft.SharePoint.WebPartPages\" Assembly=\"Microsoft.SharePoint, Version=12.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c\" %>");
    dataSourceString.Append("<sharepoint:SoapDataSource runat=\"server\" SelectUrl=\"http://intranet.contoso.com/sites/spc/_vti_bin/lists.asmx\" InsertUrl=\"\" UpdateUrl=\"\" DeleteUrl=\"\" SelectAction=\"http://schemas.microsoft.com/sharepoint/soap/GetListItems\" InsertAction=\"\" UpdateAction=\"\" DeleteAction=\"\" SelectPort=\"ListsSoap\" InsertPort=\"\" UpdatePort=\"\" DeletePort=\"\" SelectServiceName=\"Lists\" InsertServiceName=\"\" UpdateServiceName=\"\" DeleteServiceName=\"\" AuthType=\"Basic\" AuthUserName=\"contoso\\administrator\" AuthPassword=\"pass@word1\" WsdlPath=\"http://intranet.contoso.com/sites/spc/_vti_bin/lists.asmx?WSDL\" XPath=\"\" ID=\"SoapDataSource3\">");
    dataSourceString.Append("<SelectCommand><soap:Envelope xmlns:soap=\"http://schemas.xmlsoap.org/soap/envelope/\"><soap:Body><GetListItems xmlns=\"http://schemas.microsoft.com/sharepoint/soap/\"><listName>Jobs</listName>");
    dataSourceString.Append("<Query><Where><Eq><FieldRef Name=\"Title\" /><Value Type=\"Text\">2012_080_A_0</Value></Eq></Where></Query>");
    dataSourceString.Append("<rowLimit>9999</rowLimit></GetListItems></soap:Body></soap:Envelope></SelectCommand><InsertCommand></InsertCommand><UpdateCommand></UpdateCommand><DeleteCommand></DeleteCommand></sharepoint:SoapDataSource>");
    z.DataSourcesString = dataSourceString.ToString();
    manager.SaveChanges(z);
    In my code, I can see the DataSourceString changing, but if I refresh the page it is still loading all the data: why?!

    Hi,
    According to your description, my understanding is that you want to change the data view web part datasource programmatically.
    We need to override the DataBind method, like below:
    public class ExtendedDataFormWebPart : DataFormWebPart
    {
        public override void DataBind()
        {
            // Assign your own data source here before calling the base implementation.
            this.DataSource = yourOwnDataSource;
            base.DataBind();
        }
    }
    Here is a detailed code demo for your reference:
    http://jamestsai.net/Blog/post/How-to-query-cross-site-lists-in-DataFormWebPart-Part-1-Build-your-own-data-source-for-DataFormWebPart.aspx
    Best Regards
    Jerry Guo
    TechNet Community Support

  • Errors in the high-level relational engine. The data source view does not contain a definition for the table or view. The Source property may not have been set.

    Hi All,
    I have a cube in which I'm using the time dimension that I created in the warehouse. Now I want a new measure in the cube, an average over time, and when I tried to create the new measure I got a message that no time dimension was defined, so I created a new time dimension in SSAS using the wizard. But when I tried to process the new time dimension I got the following error message:
    "Errors in the high-level relational engine. The data source view does not contain a definition for "SSASTIMEDIM" the table or view. The Source property may not have been set."
    Can anyone please tell me why I cannot create a new measure averaging over time using my time dimension? Also, what am I doing wrong with SSASTIMEDIM that I'm getting this error?
    Thanks

    Hi PMunshi,
    According to your description, you get the above error when processing the time dimension, right?
    In this scenario, since you have updated the DSV, there should be no problem with the table's existence. One possibility is that the table has been specified for tracking in the notifications for proactive caching, but is no longer available for some reason. Please change the Proactive Caching setting to "MOLAP".
    Reference:
    How To Implement Proactive Caching in SQL Server Analysis Services SSAS
    If you have any question, please feel free to ask.
    Best Regards,
    Simon Hou
    TechNet Community Support
