DAC Physical Data Source password change - invalid username/password

We are using DAC/Informatica to load PeopleSoft data into the warehouse. The password for the PeopleSoft source was changed recently. I changed the password in DAC under Setup > Physical Data Sources (Type: Source) for the PeopleSoft DB, and when I select 'Test Connection' I receive the message "Connection established successfully". But when I restart the execution plan I still get the same error - invalid username/password for obi_infa_ps, which is the account we use to access the PeopleSoft DB. All of the DAC and Informatica services on the server have been shut down and restarted, and I am still getting the invalid username/password error.
Do I need to rebuild the execution plan? If I do, can I manually put in the needed refresh dates?

You will also need to update the username/password in the Informatica Workflow Manager, under Connections > Relational.
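If the error still appears after updating both places, it can help to rule out the account itself by testing the obi_infa_ps credentials over JDBC, outside of DAC and Informatica. A minimal sketch, assuming an Oracle-based PeopleSoft source; the host, port, and SID in the URL are placeholders, and the Oracle JDBC driver (e.g. ojdbc6.jar) must be on the classpath:

import java.sql.Connection;
import java.sql.DriverManager;

public class CredentialCheck {
    public static void main(String[] args) throws Exception {
        // Placeholder connect string -- substitute your PeopleSoft DB host/port/SID
        String url = "jdbc:oracle:thin:@psofthost:1521:PSDB";
        // Throws SQLException (e.g. ORA-01017) if the username/password is rejected
        try (Connection con = DriverManager.getConnection(url, "obi_infa_ps", "newPassword")) {
            System.out.println("Login OK as " + con.getMetaData().getUserName());
        }
    }
}

If this logon succeeds but the execution plan still fails, the stale password is being read from a connection definition somewhere (DAC or a Workflow Manager relational connection) rather than being a problem with the account.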
EDIT: First post, welcome to BI Apps!
Edited by: birchy on Nov 17, 2011 9:00 AM

Similar Messages

  • DAC Physical data source not visible in task tab

    Hello
    I successfully created a new physical data source, and it connects fine. Then I went to the Tasks tab and created a new task, but I can't see the new physical data source I just created. How do I publish or push the connection details to the tables, tasks, etc.? Without this, even the execution plan would fail.
    Any help would be helpful
    Thanks

    Hi
    If the task is of type Informatica, I think it is because you did not synchronize the task. If it is not Informatica but rather a stored procedure or similar, you need to add the target tables. Hope this helps.
    regards

  • Changing Physical Data Source in DAC

    I need to extract data from the DEV2 instance of Siebel instead of DEV1.
    So I have changed the database name, user name, and password on the Physical Data Sources tab under Setup for the SEBL_VERT_80 datasource.
    I have generated the parameters and rebuilt the execution plan. However, when I run the execution plan some tasks fail, and I notice in the session log that the ETL tasks are still trying to extract data from DEV1 instead of DEV2.
    Can someone please let me know what I may be missing? How can I make DAC run ETLs against the DEV2 database?
    Thanks

    My first thought is that the connections need to be reconfigured in another place.
    Have you changed the relational connections in the Informatica PowerCenter Workflow Manager?
    Some workflows might be configured to retrieve database connection information from there.
    - In Workflow Manager, connect to the appropriate repository
    - Select "Connections" -> Relational
    - Review the connections specified.
    - Austin

  • Data warehouse Admin Console - MySQL Physical Data Source

    We have some tables on a MySQL database that we'd like to include in our ETL process. The data is to be loaded into our Oracle 11g R1 data warehouse after the transformations are completed.
    Everything works with the exception of adding a physical data source in the DAC (Data Warehouse Administration Console) for the MySQL database. There is no connection type for this database, and I don't see how it is set up in the documentation.
    Can someone explain what needs to be done or direct me to the docs? Or, can this be done?
    Does it even have to be done? The connection details are stored in the odbc.ini file on the server, so that side is managed by Informatica. Will the workflow call from DAC be sufficient?
    Thanks,
    LWatkins
    DAC 10.1.3.4.0.20080729.2025
    Informatica PowerCenter 8.1.1
    AIX 5.3/64Bit
    OBIA 7.9.5
    Oracle Database 11g Enterprise Edition Release 11.1.0.7.0 - 64bit
    MySQL 5.5
    Edited by: LWatkins on Mar 25, 2013 12:32 PM

    I would suggest this works only if you are able to connect to MySQL through Informatica:
    1) Create the connection in Workflow Manager, use that same connection in the sessions, and avoid taking the connection parameters from DAC
    2) Create the same connection in DAC, pointing to Oracle, MSSQL, or DB2
    3) Add the tasks in DAC, but do not add source and target tables
    Another workaround is to handle full loads using DAC system variables or similar.
    If this helps, please mark the reply.
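    A quick way to confirm that the MySQL host, port, and credentials are good before wiring anything into Informatica is a direct JDBC test. This is not how Informatica connects (it goes through ODBC), but it rules out connectivity and login problems. A minimal sketch, assuming MySQL Connector/J is on the classpath; the host, schema, and credentials are placeholders:
    import java.sql.Connection;
    import java.sql.DriverManager;
    public class MySqlCheck {
        public static void main(String[] args) throws Exception {
            // Connector/J 5.x driver class for this era of MySQL
            Class.forName("com.mysql.jdbc.Driver");
            // Placeholder URL -- substitute your MySQL host, port, and schema
            Connection con = DriverManager.getConnection(
                    "jdbc:mysql://mysqlhost:3306/sourcedb", "etl_user", "etl_pass");
            System.out.println("Connected to " + con.getMetaData().getDatabaseProductVersion());
            con.close();
        }
    }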
    Edited by: Srini VEERAVALLI on Mar 25, 2013 4:49 PM

  • Setting up MS SQL Server as a physical data source in DAC

    Hi everyone,
    I'm working with the latest version of BI Apps (7.9.6) and OBIEE 10.1.3.4.1. I've installed Informatica 8.6.1 and DAC on Oracle Red Hat Enterprise Linux 5.4 64bit.
    So far, everything has gone along relatively well.
    Now I'm hitting an issue as I try to set up a DAC physical connection to our source system. We are pulling from a MS SQL Server physical source. Whenever I connect to the MS SQL Server using the DataDirect ODBC drivers from Linux, it works fine. But when I try to test the connection using DAC, it fails with "Failure connecting to "<connection name>"! MSSQL driver not available!"
    Has anyone successfully connected to MSSQL server as a source from the DAC? I checked the DAC documentation but it was definitely not helpful in this situation.
    Thanks for the advice.
    -Joe

    I've got an outstanding SR with Oracle at the moment to resolve this same problem. If I get a resolution I'll update this thread.
    The problem is that Microsoft no longer supports MSSQL JDBC Driver 1.1/1.2, so you can no longer get the three files msbase.jar, mssqlserver.jar, and msutil.jar; Microsoft now only provides the sqljdbc.jar file. I'm trying to get these files to see if it makes a difference.
    You need to have all four files in both your Windows ~DAC\lib directory and the Unix $DAC_HOME/lib directory. Then edit connection_template.xml to uncomment the SQL SERVER 2000 section and include the SQL SERVER 2005 section, and finally edit the config.sh file to include the following:
    export SQLSERVERLIB=./lib/msbase.jar:./lib/mssqlserver.jar:./lib/msutil.jar:./lib/sqljdbc.jar
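    Before restarting DAC, a quick way to sanity-check that the jars are actually visible to the JVM is to load the driver classes from the same classpath that config.sh sets up. A minimal sketch; the two class names are the standard ones for the old SQL Server 2000 driver and the newer sqljdbc.jar, and the commented connect string is a placeholder:
    public class MssqlDriverCheck {
        public static void main(String[] args) throws Exception {
            // Old SQL Server 2000 driver (msbase.jar / mssqlserver.jar / msutil.jar)
            Class.forName("com.microsoft.jdbc.sqlserver.SQLServerDriver");
            // Newer 2005 driver (sqljdbc.jar)
            Class.forName("com.microsoft.sqlserver.jdbc.SQLServerDriver");
            System.out.println("Both MSSQL driver classes loaded");
            // Optional live test against a placeholder server:
            // java.sql.DriverManager.getConnection(
            //     "jdbc:sqlserver://mssqlhost:1433;databaseName=SourceDB", "user", "pass");
        }
    }
    A ClassNotFoundException here is the same condition DAC reports as "MSSQL driver not available!".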
    Edwin
    Edited by: bebedi123 on 05-May-2010 07:36

  • DSP Web Service Data Source runtime error "Invalid xsi:type qname"

    I have created a DSP data service that uses a web service as a data source.
    The web service wraps an Oracle PL/SQL package and works when tested independently of the DSP data service.
    When I test the DSP data service using the weblogic workshop test view I get the error below:
    weblogic.xml.query.exceptions.XQueryDynamicException: {err}XQ0027: Validation failed: error: cvc-elt.4.2: Invalid xsi:type qname: 'ns0:EmpRecUser' in element getEmpResponseElement@http://hr/EmpWebService.wsdl/types/
    (The full stack trace and the WSDL imported into DSP are at the bottom of this posting)
    I am new to DSP, but I have successfully created data services from the DSP tutorials before.
    This is the first data service I have tried to create that is not from a DSP tutorial.
    I don't really understand the error message.
    It appears to me that the type EmpRecUser is being prefixed with the wrong namespace: in the WSDL the prefix is 'tns:', while in the error message it is 'ns0:'.
    <element name="getEmpResponseElement">
    <complexType>
    <sequence>
    <element name="result" type="tns:EmpRecUser" nillable="true"/>
    </sequence>
    </complexType>
    </element>
    I created the web service in Oracle J Developer from the tutorial URL below.
    http://www.oracle.com/technology/obe/obe1013jdev/10131/wsfromplsqlpackage/devwsfrom%20plsql.htm#p
    weblogic.xml.query.exceptions.XQueryDynamicException: {err}XQ0027: Validation failed: error: cvc-elt.4.2: Invalid xsi:type qname: 'ns0:EmpRecUser' in element getEmpResponseElement@http://hr/EmpWebService.wsdl/types/
         at weblogic.xml.query.schema.xmlbeans.XMLValidatorForXMLBeans.giveToken(XMLValidatorForXMLBeans.java:169)
         at weblogic.xml.query.schema.ValidatingIterator.fetchNext(ValidatingIterator.java:148)
         at weblogic.xml.query.iterators.GenericIterator.next(GenericIterator.java:113)
         at com.bea.ld.wrappers.ws.JAXRPCWebserviceIterator.fetchNext(JAXRPCWebserviceIterator.java:104)
         at weblogic.xml.query.iterators.GenericIterator.next(GenericIterator.java:113)
         at weblogic.xml.query.xdbc.iterators.ItemIterator.fetchNext(ItemIterator.java:86)
         at weblogic.xml.query.iterators.GenericIterator.hasNext(GenericIterator.java:134)
         at weblogic.xml.query.runtime.sequences.Subsequence.fetchNext(Subsequence.java:101)
         at weblogic.xml.query.iterators.GenericIterator.next(GenericIterator.java:113)
         at weblogic.xml.query.runtime.querycide.QueryAssassin.fetchNext(QueryAssassin.java:54)
         at weblogic.xml.query.iterators.GenericIterator.peekNext(GenericIterator.java:151)
         at weblogic.xml.query.runtime.qname.InsertNamespaces.fetchNext(InsertNamespaces.java:238)
         at weblogic.xml.query.iterators.GenericIterator.next(GenericIterator.java:113)
         at weblogic.xml.query.runtime.core.QueryIterator.fetchNext(QueryIterator.java:127)
         at weblogic.xml.query.iterators.GenericIterator.next(GenericIterator.java:113)
         at weblogic.xml.query.xdbc.iterators.ItemIterator.fetchNext(ItemIterator.java:86)
         at weblogic.xml.query.iterators.GenericIterator.peekNext(GenericIterator.java:151)
         at weblogic.xml.query.xdbc.util.Serializer.processNamespaces(Serializer.java:340)
         at weblogic.xml.query.xdbc.util.Serializer.processElement(Serializer.java:262)
         at weblogic.xml.query.xdbc.util.Serializer.process(Serializer.java:206)
         at weblogic.xml.query.xdbc.util.Serializer.serializeItems(Serializer.java:152)
    Caused by: weblogic.xml.query.schema.xmlbeans.XMLValidatorForXMLBeans$ValidationException: null
         at weblogic.xml.query.schema.xmlbeans.XMLValidatorForXMLBeans$ExceptionCollection.add(XMLValidatorForXMLBeans.java:340)
         at org.apache.xmlbeans.impl.validator.Validator.emitError(Validator.java:175)
         at org.apache.xmlbeans.impl.validator.Validator.emitFieldError(Validator.java:207)
         at org.apache.xmlbeans.impl.validator.Validator.emitFieldError(Validator.java:193)
         at org.apache.xmlbeans.impl.validator.Validator.beginEvent(Validator.java:458)
         at org.apache.xmlbeans.impl.validator.Validator.nextEvent(Validator.java:246)
         at weblogic.xml.query.schema.xmlbeans.XMLValidatorForXMLBeans.processBeginElementToken(XMLValidatorForXMLBeans.java:1205)
         at weblogic.xml.query.schema.xmlbeans.XMLValidatorForXMLBeans.processToken(XMLValidatorForXMLBeans.java:1322)
         at weblogic.xml.query.schema.xmlbeans.XMLValidatorForXMLBeans.processTokensOnHold(XMLValidatorForXMLBeans.java:1349)
         at weblogic.xml.query.schema.xmlbeans.XMLValidatorForXMLBeans.gotBeginElementToken(XMLValidatorForXMLBeans.java:772)
         at weblogic.xml.query.schema.xmlbeans.XMLValidatorForXMLBeans.giveToken(XMLValidatorForXMLBeans.java:100)
         at weblogic.xml.query.schema.ValidatingIterator.fetchNext(ValidatingIterator.java:148)
         at weblogic.xml.query.iterators.GenericIterator.next(GenericIterator.java:113)
         at com.bea.ld.wrappers.ws.JAXRPCWebserviceIterator.fetchNext(JAXRPCWebserviceIterator.java:104)
         at weblogic.xml.query.iterators.GenericIterator.next(GenericIterator.java:113)
         at weblogic.xml.query.xdbc.iterators.ItemIterator.fetchNext(ItemIterator.java:86)
         at weblogic.xml.query.iterators.GenericIterator.hasNext(GenericIterator.java:134)
         at weblogic.xml.query.runtime.sequences.Subsequence.fetchNext(Subsequence.java:101)
         at weblogic.xml.query.iterators.GenericIterator.next(GenericIterator.java:113)
         at weblogic.xml.query.runtime.querycide.QueryAssassin.fetchNext(QueryAssassin.java:54)
         at weblogic.xml.query.iterators.GenericIterator.peekNext(GenericIterator.java:151)
         at weblogic.xml.query.runtime.qname.InsertNamespaces.fetchNext(InsertNamespaces.java:238)
         at weblogic.xml.query.iterators.GenericIterator.next(GenericIterator.java:113)
         at weblogic.xml.query.runtime.core.QueryIterator.fetchNext(QueryIterator.java:127)
         at weblogic.xml.query.iterators.GenericIterator.next(GenericIterator.java:113)
         at weblogic.xml.query.xdbc.iterators.ItemIterator.fetchNext(ItemIterator.java:86)
         at weblogic.xml.query.iterators.GenericIterator.peekNext(GenericIterator.java:151)
         at weblogic.xml.query.xdbc.util.Serializer.processNamespaces(Serializer.java:340)
         at weblogic.xml.query.xdbc.util.Serializer.processElement(Serializer.java:262)
         at weblogic.xml.query.xdbc.util.Serializer.process(Serializer.java:206)
         at weblogic.xml.query.xdbc.util.Serializer.serializeItems(Serializer.java:152)
         at com.bea.ld.server.QueryInvocation.getResult(QueryInvocation.java:461)
    <definitions
    name="EmpWebService"
    targetNamespace="http://hr/EmpWebService.wsdl"
    xmlns="http://schemas.xmlsoap.org/wsdl/"
    xmlns:tns="http://hr/EmpWebService.wsdl"
    xmlns:soap12="http://schemas.xmlsoap.org/wsdl/soap12/"
    xmlns:mime="http://schemas.xmlsoap.org/wsdl/mime/"
    xmlns:tns0="http://hr/EmpWebService.wsdl/types/"
    xmlns:xsd="http://www.w3.org/2001/XMLSchema"
    xmlns:soap="http://schemas.xmlsoap.org/wsdl/soap/"
    >
    <types>
    <schema xmlns="http://www.w3.org/2001/XMLSchema" targetNamespace="http://hr/EmpWebService.wsdl/types/"
    elementFormDefault="qualified" xmlns:tns="http://hr/EmpWebService.wsdl/types/"
    xmlns:wsdl="http://schemas.xmlsoap.org/wsdl/" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xmlns:soap11-enc="http://schemas.xmlsoap.org/soap/encoding/">
    <element name="getEmpElement">
    <complexType>
    <sequence>
    <element name="empNo" type="decimal" nillable="true"/>
    </sequence>
    </complexType>
    </element>
    <element name="getEmpResponseElement">
    <complexType>
    <sequence>
    <element name="result" type="tns:EmpRecUser" nillable="true"/>
    </sequence>
    </complexType>
    </element>
    <complexType name="EmpRecUser">
    <complexContent>
    <extension base="tns:EmpRecBase">
    <sequence>
    <element name="departmentId" type="decimal" nillable="true"/>
    <element name="managerId" type="decimal" nillable="true"/>
    <element name="hireDate" type="dateTime" nillable="true"/>
    <element name="jobId" type="string" nillable="true"/>
    <element name="employeeId" type="decimal" nillable="true"/>
    <element name="commissionPct" type="decimal" nillable="true"/>
    <element name="salary" type="decimal" nillable="true"/>
    <element name="lastName" type="string" nillable="true"/>
    </sequence>
    </extension>
    </complexContent>
    </complexType>
    <complexType name="EmpRecBase">
    <sequence/>
    </complexType>
    </schema>
    </types>
    <message name="EmpWebService_getEmp">
    <part name="parameters" element="tns0:getEmpElement"/>
    </message>
    <message name="EmpWebService_getEmpResponse">
    <part name="parameters" element="tns0:getEmpResponseElement"/>
    </message>
    <portType name="EmpWebService">
    <operation name="getEmp">
    <input message="tns:EmpWebService_getEmp"/>
    <output message="tns:EmpWebService_getEmpResponse"/>
    </operation>
    </portType>
    <binding name="EmpWebServiceSoapHttp" type="tns:EmpWebService">
    <soap:binding style="document" transport="http://schemas.xmlsoap.org/soap/http"/>
    <operation name="getEmp">
    <soap:operation soapAction="http://hr/EmpWebService.wsdl/getEmp"/>
    <input>
    <soap:body use="literal"/>
    </input>
    <output>
    <soap:body use="literal"/>
    </output>
    </operation>
    </binding>
    <service name="EmpWebService">
    <port name="EmpWebServiceSoapHttpPort" binding="tns:EmpWebServiceSoapHttp">
    <soap:address location="http://localhost:8888/PL_SQL_WS-GetEmployee-context-root/EmpWebServiceSoapHttpPort"/>
    </port>
    </service>
    </definitions>

    DSP validates the results from web services; whatever you tested the web service with outside of DSP probably does not.
    Can you capture the output of the web service and attach it as well? You can capture the web service response by running the WLS server with the command-line property -Dweblogic.webservice.verbose=true
    Can you also open a case with customer support for this? I will not likely have much time to look at it.
    - Mike
    btw - the namespace prefixes are just placeholders that are defined with xmlns:prefix="some-uri". It is the namespace URIs that must be equivalent. If in one case you have the definition xmlns:tns="myUri" and in another case xmlns:ns0="myUri", then tns:EmpRecUser is equivalent to ns0:EmpRecUser.
    Perhaps in the web service result there is no definition, or an incorrect definition, for the namespace prefix ns0 - that would explain the complaint about the QName ns0:EmpRecUser. According to the WSDL (where EmpRecUser is declared in the types schema), there should be a definition for ns0 as:
    xmlns:ns0="http://hr/EmpWebService.wsdl/types/"
    in the web service response.
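    To see concretely that the prefix alone is harmless, javax.xml.namespace.QName compares only the namespace URI and local part, never the prefix. A small sketch using the URIs from this WSDL:
    import javax.xml.namespace.QName;
    public class QNameCheck {
        public static void main(String[] args) {
            String typesNs = "http://hr/EmpWebService.wsdl/types/";
            // Same URI bound to different prefixes -- the QNames are equal
            QName asTns = new QName(typesNs, "EmpRecUser", "tns");
            QName asNs0 = new QName(typesNs, "EmpRecUser", "ns0");
            System.out.println(asTns.equals(asNs0));   // true: prefix is ignored
            // A different URI is what actually breaks validation
            QName wrongNs = new QName("http://hr/EmpWebService.wsdl", "EmpRecUser", "ns0");
            System.out.println(asTns.equals(wrongNs)); // false
        }
    }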

  • Data Source Password

    Hi, is there any way to hide the datasource password? If I look at the file data-source.xml I can see the user, password, and database host/SID.
    Per client policies I cannot have clear-text passwords in files.
    Thanks.
    Carlos

    Thanks a lot Srini for your reply.
    I am using nQUDMLExec for the UDML execution.
    I got the below UDML after copying the connection pool for that data source:
    DECLARE CONNECTION POOL "TESTRepository"."CONPOOL" AS "CONPOOL" UPGRADE ID 163723
         DATA SOURCE  {orcl}
         TIME OUT 300
         MAX CONNECTIONS 10
         TYPE 'OCI10G'
         USER 'devschema'
         PASSWORD 'D7EDED84BC624A917F5B462A4DCA05CDCE256EEEEEDC97D5B72C10C09258A4CA5074180DF5FE12D642A16786F2ABF854'
         SHARED LOGIN
         CONNECTIONS TO SAME URI 10
         OUTPUT TYPE XML
         HEADER PATH {D:\\OracleBI\\server\\config\\NQSQueryHeader.xml}
         TRAILER PATH {D:\\OracleBI\\server\\config\\NQSQueryTrailer.xml}
         BULK INSERT BUFFER SIZE 32768 TRANSACTION BOUNDARY 10
         TEMP TABLE PREFIX {TT}  OWNER {}
         PRIVILEGES ( READ);
    Here the password is encrypted. So, can I use my new decrypted password in the UDML?
    Well, I typed in my decrypted password. But while executing the script as below, I got the following error:
    D:\OracleBI\server\Bin>nQUDMLExec -I D:\abc.txt -O DevRepository_Ver121.rpd
    ---------------D:\abc.txt---------------
    ---------------ERROR(s)---------------
    [nQSError: 28006] Near Line 1, <TESTRepository>: Not defined.
    N.B.: In the actual error line there are double quotes around TESTRepository, but I removed them to show the complete error message.
    Please suggest. Thanks in advance.

  • OC4J data-source password indirection LDAP

    Hi all,
    I'm trying to set up password indirection for my OC4J data sources. If I choose JAZN XML UserManager as my user manager, then it works fine using encrypted passwords from the jazn-data.xml file. However, using LDAP I cannot get it to work. I get an exception (pasted below) when the container starts up. Does anyone have experience with this?
    Regards.
    Anton.
    Exception:
    06/01/05 14:26:38 java.lang.UnsupportedOperationException
    06/01/05 14:26:38 at oracle.security.jazn.oc4j.RealmUserAdaptor.getPassword(Unknown Source)
    06/01/05 14:26:38 at oracle.security.jazn.oc4j.FilterUser.getPassword(Unknown Source)
    06/01/05 14:26:38 at com.evermind.security.SecuritySensitive.lookup(SecuritySensitive.java:217)
    06/01/05 14:26:38 at com.evermind.security.SecuritySensitive.decode(SecuritySensitive.java:114)
    06/01/05 14:26:38 at com.evermind.security.SecuritySensitive.decode(SecuritySensitive.java:131)
    06/01/05 14:26:38 at com.evermind.server.DataSourceConfig.getPassword(DataSourceConfig.java:530)
    06/01/05 14:26:38 at com.evermind.server.Application.initDataSource(Application.java:1674)
    06/01/05 14:26:38 at com.evermind.server.Application.initDataSources(Application.java:2077)
    06/01/05 14:26:38 at com.evermind.server.Application.preInit(Application.java:517)
    06/01/05 14:26:38 at com.evermind.server.Application.setConfig(Application.java:166)
    06/01/05 14:26:38 at com.evermind.server.Application.setConfig(Application.java:145)
    06/01/05 14:26:38 at com.evermind.server.ApplicationServer.initializeApplications(ApplicationServer.java:1756)

    Hi Toby,
    Password indirection with JAZN-LDAP is not supported; please review this link.
    Cheers,
    Deepak

  • How to edit CO-PA data source or change delta method. Need T-code/procedure

    Hi gurus,
    I have a CO-PA extractor in production. The CO-PA extraction is costing-based.
    I need to get an extra field from the CE1**** table. When I go to KEB0 and display, I see my required field under the characteristics from the segment table. It is unchecked.
    I want to know the transaction where I can go and edit the datasource to check that field and bring that data in.
    I tried going to T-code KEDV, but I couldn't figure out what to do there.
    I also tried deleting and recreating the datasource; then I am able to check that field. The problem is that the earlier existing datasource used to say
    Delta Method     Time Stamp Management in Profitability Analysis
    and now it says
    Delta Method     Generic Delta.
    How do I change the delta method from Generic Delta back to Time Stamp Management in Profitability Analysis? During creation in KEB0 it doesn't give any option to change it.
    And what exactly is the difference between the two delta methods?
    Please post your inputs.
    Thanks in advance,
    > Points will be assigned for inputs
    Message was edited by:
            ravi a

    Hi Ravi,
    I would expect you are communicating to your users that you will need some downtime to change this. To add characteristics to CO-PA you actually have to first delete the datasource in KEB0 and then re-create it. I would work with your CO-PA config team to get the T-codes necessary to assign those objects to PA.
    Please see OSS note 392635 for further information on your CO-PA timestamp question. It should answer your questions here.
    Please assign points if this helps.
    Thanks,
    -Alex

  • DAC - Reset Data Sources does not work properly

    Hi,
    I am trying to run a full load by selecting Reset Data Sources in DAC. However, when I select this option, only data after Jan 11, 2014 gets picked up in the ETL. I have data starting from the year 2010 in my database.
    How do I ensure that the full set of data gets picked up for the full load?

    Hi Srini,
    The value of INITIAL EXTRACT DATE is as follows:
    $$INITIAL_EXTRACT_DATE
    Timestamp
    Value=19700101 at 00:00:00:00_dac_sep_Formatter=MM/DD/YYYY_dac_sep_Function=TO_CUSTOM_dac_sep_Runtime=Static
    How does this work? Actually, this was working fine in DAC 10g, and the client was able to perform full loads by resetting the data sources.
    What could be the issue with DAC 11g?

  • Data Source path in Pivot Table changes to absolute on its own

    Hello.
    I have a .XLSX file that was created a long time ago (I don't even know in which Office version, but definitely not 2013), and it may even have been a .XLS file at first.
    So it's a 4 MB file with 16 Sheets and 8 Pivot Tables.
    All of the Pivot Tables use other sheets from the same file as Data Source.
    Data Source for some of them look like this: 'Sheet3'!$A:$E
    Everything is fine when I save the file and open it from the saved file.
    But as soon as I try to move the file elsewhere, or rename it, or email it - all Data Source paths change to something like this: '\Users\Sergii_Litnevskyi\Desktop\New folder\[FileName.xlsx]Sheet3'!$A:$E
    And it happens with all Pivot Tables. The problem is that they link to the old file path, where the file does not exist anymore, and they link to an external file, which is not what I want.
    If I Save As and select a different path and filename, then it works fine. So that's a workaround for renaming and moving files, but not for sending them to other people.
    I've read some threads where people recommend disabling "Save external link values", but it does not help. It is already turned off in my Office, but the file keeps acting weird.
    So what I need is: save the file, close it, rename it, move it to another place, send it over email as an attachment, and then have the same Data Source paths in my Pivot Tables as I had before I saved the file. How can I do it?
    My Office version: Microsoft Excel 2013 (15.0.4454.1503) MSO (15.0.4517.1005) 32-bit

    Hi,
    According to your description, I suppose the issue may have a few causes.
    Are you linking to an outside data source?
    If the file is moved elsewhere, renamed, or emailed, the data source paths should not change; in your case, however, the data source paths are being turned into absolute paths.
    I recommend you zip the file and send it as an email attachment.
    If the issue persists, save the file under a new name and test it on another computer.
    Regards,
    George Zhao
    TechNet Community Support
    I am pretty sure that I don't have any external links in the document.
    However, even if I did - why would it change the Data Source path for all of the Pivot Tables when I did not request it?
    I tried zipping it and sending it to another person over email, but he got the file with the changed data source paths.
    I can even record a short video to show what happens.
    Actually, I just did it. You can see the video here: http://screencast.com/t/qMBild3ck9b
    It is rather big - 23.8 MB.
    Let me explain what I showed there:
    I opened my original file. I showed that there are Pivot Tables whose Data Sources are in the same file, on various other sheets.
    I showed this for all of the Pivot Tables in the document.
    I saved the file using Save As in a different folder and under a different name (TEST.xlsx).
    I then opened that saved file to show that it is fine, and that the Data Source path for one of the Pivot Tables is the same as it was in the original file. It is the same for all of the other Pivot Tables.
    Then I closed it and simply renamed the file to TEST123.xlsx.
    Opened it, and the first thing wrong was a security warning.
    Then I got 'Cannot open PivotTable source file ….' messages. And, as I showed, all Data Source paths had now been changed to the full path of the file that was created by Save As (TEST.xlsx) from the original file.

  • Recurring Crystal schedules fail if original data source changes

    We're using BO 3.1 (SP3 plus FP 3.3 and 3.5) with an Oracle 11g repository and both 10g and 11g clients installed on the BO servers (Windows Server 2008 R2). We've been trying to resolve an issue with recurring schedules of Crystal reports. We always configure them to use the Custom Data Source settings, but if the Original Data Source name changes, report schedules fail with the error message below. So if the parent report's original data source doesn't match the schedule's original data source, the schedule begins to fail even though it's configured to use the custom data source settings. I've had a ticket open with support for over a month now but am not getting anywhere. Any feedback would be greatly appreciated.
    Error in File ~tmp6406e9398df050.rpt: Unable to connect: incorrect log on parameters. Details: [Database Vendor Code: 1005 ]

    Hi John,
    Thanks for the response, but rescheduling reports does NOT resolve the issue, and it should not be necessary when the reports are set to use the CUSTOM data source. We've found that the only thing you can really do is delete the existing recurring schedules and create brand new ones. Needless to say, this isn't very convenient for users, and it's difficult to prevent the original data source name from ever changing. It sure seems like a bug, but SAP will not acknowledge it as such. If you use Query Builder to look up all the detail on a recurring schedule, there's nothing to indicate where the problem is. SI_USE_ORIGINALDS will be set to "false", but if the original data source in the recurring schedule does not match that of the parent report, it will fail even though the custom data source settings are correct. Why would the recurring schedule do anything at all with the original data source when it is set to use the custom one?
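    For auditing this across all recurring instances, the same kind of Query Builder statement can also be run from the BusinessObjects Enterprise Java SDK. A rough sketch only, not a definitive implementation: the CMS name and credentials are placeholders, the property list is illustrative, and the calls should be verified against your SDK version:
    import com.crystaldecisions.sdk.framework.CrystalEnterprise;
    import com.crystaldecisions.sdk.framework.IEnterpriseSession;
    import com.crystaldecisions.sdk.occa.infostore.IInfoObject;
    import com.crystaldecisions.sdk.occa.infostore.IInfoObjects;
    import com.crystaldecisions.sdk.occa.infostore.IInfoStore;
    public class RecurringScheduleAudit {
        public static void main(String[] args) throws Exception {
            // Placeholder CMS and credentials
            IEnterpriseSession session = CrystalEnterprise.getSessionMgr()
                    .logon("Administrator", "password", "cmsserver:6400", "secEnterprise");
            IInfoStore store = (IInfoStore) session.getService("InfoStore");
            // The same query you would paste into Query Builder
            IInfoObjects results = store.query(
                    "SELECT SI_ID, SI_NAME, SI_SCHEDULEINFO FROM CI_INFOOBJECTS "
                    + "WHERE SI_RECURRING = 1 AND SI_KIND = 'CrystalReport'");
            // IInfoObjects behaves as a java.util.List of IInfoObject
            for (int i = 0; i < results.size(); i++) {
                IInfoObject obj = (IInfoObject) results.get(i);
                System.out.println(obj.getID() + "  " + obj.getTitle());
            }
            session.logoff();
        }
    }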

  • SMSY Data Source changes after performing Managed System Config

    Hello everyone,
    I am working with Solman EHP1 SPS26. I have set up my central SLD to push system data to my Solman local SLD, which is then retrieved via Landscape Fetch into SMSY. Initially, everything seemed to be working and all data sources reported SLD.
    However, once I performed the Managed System Configuration wizard for a particular technical system, that system's data source changed to TMS/RFC under SMSY. It's actually not even that consistent. For example, the server says the source is RFC, the product system says TMS/RFC, and some of the product instances still say SLD.
    Is it normal for this to happen after connecting a managed system? Will these systems still be updated via the SLD, considering SMSY_SETUP is configured to use the SLD?
    Anyone's help would be greatly appreciated.
    Alex

    Hello,
    If you notice, in SMSY this is not an editable field.
    It is reporting the last data source used.
    Therefore, when you made a change via managed system setup, that became the last data source used.
    So yes, it is normal to see this change.
    If you have configured the data source to be the SLD, it should continue to update these systems.
    Where you need to watch out is when making manual changes in SMSY: the SLD can view the result as a different system from the SID it knows, and since the SLD will not overwrite manual changes, it will create new systems instead, with the known SIDs appended with a suffix to distinguish them.
    But the field in SMSY reflects the last data source used.
    Hope this helps some.
    Regards,
    Paul

  • Change all reports data source connection

    I have numerous SQL reports whose data source connection needs to be changed. Does anyone have code to change all reports that use a data source named 'Data_Warehouse_Main' to a different shared connection?
    Thanks

    Hi BiscuitB,
    According to your description, there are numerous SQL reports whose data sources need to be changed. You used RSScripter, but it did not work.
    In Reporting Services, if we want to change the data sources of multiple SSRS reports, we can use PowerShell to set the data sources for an item in a report server database or SharePoint library. To achieve your goal, we need to use both the Reporting Services GetItemDataSources method and the SetItemDataSources method. You can refer to the thread provided by PrajapatiNeha; the PowerShell script sample there will be helpful for you.
    Reference:
    GetItemDataSources Method
    SetItemDataSources Method
    If you have any more questions, please feel free to ask.
    Thanks,
    Wendy Fu

  • Dynamic Creation of Physical Data Server / Agent cache Refresh

    Scenario:
    I have a requirement to load data from an XML source into an Oracle DB, and the XML source will change at run time, but the XSD of the XML will remain the same (so I don't have to change the logical data server, models, mappings, interfaces, and scenarios; only the physical data server will change at runtime). I have created all the ODI artifacts using ODI Studio in my work repository, and I am then using the ODI SDK to create the physical data server for the changed XML data source and invoking the agent programmatically.
    Problem:
    The data is loaded from the XML source into the Oracle DB fine the first time, but it does not work from the second time onwards. If I restart the agent, it again works fine for one more run. I think that on the first run the agent builds some sort of cache of the physical data server details, so whenever I change the data server something goes wrong, leading to the following exception. So I want to know whether there is any mechanism for handling dynamic data servers, or any way of clearing the agent cache.
    Caused By: org.apache.bsf.BSFException: exception from Jython:
    Traceback (most recent call last):
    File "<string>", line 41, in <module>
    AttributeError: 'NoneType' object has no attribute 'createStatement'
         at org.apache.bsf.engines.jython.JythonEngine.exec(JythonEngine.java:146)
         at com.sunopsis.dwg.codeinterpretor.SnpScriptingInterpretor.execInBSFEngine(SnpScriptingInterpretor.java:346)
         at com.sunopsis.dwg.codeinterpretor.SnpScriptingInterpretor.exec(SnpScriptingInterpretor.java:170)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.scripting(SnpSessTaskSql.java:2458)
         at oracle.odi.runtime.agent.execution.cmd.ScriptingExecutor.execute(ScriptingExecutor.java:48)
         at oracle.odi.runtime.agent.execution.cmd.ScriptingExecutor.execute(ScriptingExecutor.java:1)
         at oracle.odi.runtime.agent.execution.TaskExecutionHandler.handleTask(TaskExecutionHandler.java:50)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.processTask(SnpSessTaskSql.java:2906)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java:2609)
         at com.sunopsis.dwg.dbobj.SnpSessStep.treatAttachedTasks(SnpSessStep.java:540)
         at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java:453)
         at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java:1740)
         at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java:1596)
         at oracle.odi.runtime.agent.processor.impl.StartScenRequestProcessor$2.doAction(StartScenRequestProcessor.java:582)
         at oracle.odi.core.persistence.dwgobject.DwgObjectTemplate.execute(DwgObjectTemplate.java:214)
         at oracle.odi.runtime.agent.processor.impl.StartScenRequestProcessor.doProcessStartScenTask(StartScenRequestProcessor.java:513)
         at oracle.odi.runtime.agent.processor.impl.StartScenRequestProcessor$StartScenTask.doExecute(StartScenRequestProcessor.java:1070)
         at oracle.odi.runtime.agent.processor.task.AgentTask.execute(AgentTask.java:123)
         at oracle.odi.runtime.agent.support.DefaultAgentTaskExecutor$1.run(DefaultAgentTaskExecutor.java:50)
         at org.springframework.core.task.SyncTaskExecutor.execute(SyncTaskExecutor.java:50)
         at oracle.odi.runtime.agent.support.DefaultAgentTaskExecutor.executeAgentTask(DefaultAgentTaskExecutor.java:41)
         at oracle.odi.runtime.agent.processor.TaskExecutorAgentRequestProcessor.doExecuteAgentTask(TaskExecutorAgentRequestProcessor.java:93)
         at oracle.odi.runtime.agent.processor.TaskExecutorAgentRequestProcessor.process(TaskExecutorAgentRequestProcessor.java:83)
         at oracle.odi.runtime.agent.support.DefaultRuntimeAgent.execute(DefaultRuntimeAgent.java:68)
         at oracle.odi.runtime.agent.servlet.AgentServlet.processRequest(AgentServlet.java:445)
         at oracle.odi.runtime.agent.servlet.AgentServlet.doPost(AgentServlet.java:394)
         at javax.servlet.http.HttpServlet.service(HttpServlet.java:727)
         at javax.servlet.http.HttpServlet.service(HttpServlet.java:821)
         at org.mortbay.jetty.servlet.ServletHolder.handle(ServletHolder.java:503)
         at org.mortbay.jetty.servlet.ServletHandler.handle(ServletHandler.java:389)
         at org.mortbay.jetty.security.SecurityHandler.handle(SecurityHandler.java:216)
         at org.mortbay.jetty.servlet.SessionHandler.handle(SessionHandler.java:181)
         at org.mortbay.jetty.handler.ContextHandler.handle(ContextHandler.java:765)
         at org.mortbay.jetty.webapp.WebAppContext.handle(WebAppContext.java:417)
         at org.mortbay.jetty.handler.HandlerWrapper.handle(HandlerWrapper.java:152)
         at org.mortbay.jetty.Server.handle(Server.java:326)
         at org.mortbay.jetty.HttpConnection.handleRequest(HttpConnection.java:534)
         at org.mortbay.jetty.HttpConnection$RequestHandler.content(HttpConnection.java:879)
         at org.mortbay.jetty.HttpParser.parseNext(HttpParser.java:747)
         at org.mortbay.jetty.HttpParser.parseAvailable(HttpParser.java:218)
         at org.mortbay.jetty.HttpConnection.handle(HttpConnection.java:404)
         at org.mortbay.jetty.bio.SocketConnector$Connection.run(SocketConnector.java:228)
         at org.mortbay.thread.QueuedThreadPool$PoolThread.run(QueuedThreadPool.java:520)

    Hi,
    If you want to load multiple files (with the same structure) through one connection, then in Topology create M.XSD for an M.XML file.
    Create three directories:
    RAW - contains the files under their original names
    PRO - processing area where files are moved one by one and renamed to M.XML
    OUT - once a file's data has been loaded into the tables, the file is moved from PRO to OUT
    Go to odiexperts to see how to create the loop.
    Use OdiFileMove (to move and rename/mask) to move A.XML from RAW to PRO and rename it to M.XML.
    Use OdiFileMove to move M.XML to the OUT folder and then rename it back to A.XML.
    Use variables to store the file names, and refresh them on each pass.
    'NoneType' object has no attribute 'createStatement': it seems the structure of your file is different and you are trying to load different files into the same schema. If the structure is the same, then use the procedure "SYNCHRONIZE ALL" after every load...
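    For reference, the rotation described above is just a move-and-rename into a fixed file name, plus an archive step after each load. In ODI itself these would be OdiFileMove steps inside a package loop; the sketch below only illustrates the same pattern in plain Java, with the RAW/PRO/OUT directory names assumed from the post:
    import java.io.File;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.nio.file.Paths;
    import java.nio.file.StandardCopyOption;
    public class XmlFileRotation {
        public static void main(String[] args) throws Exception {
            Path raw = Paths.get("RAW"), pro = Paths.get("PRO"), out = Paths.get("OUT");
            for (File f : raw.toFile().listFiles((dir, name) -> name.toUpperCase().endsWith(".XML"))) {
                // RAW/A.XML -> PRO/M.XML: the fixed name means the ODI model never changes
                Files.move(f.toPath(), pro.resolve("M.XML"), StandardCopyOption.REPLACE_EXISTING);
                // ... run the load scenario against PRO/M.XML here ...
                // PRO/M.XML -> OUT/A.XML: archive under the original name
                Files.move(pro.resolve("M.XML"), out.resolve(f.getName()));
            }
        }
    }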
    Edited by: neeraj_singh on Feb 16, 2012 4:47 AM
