Data Sources Discrepancy in R/3 DEV and QA

I compared RSA6 in R/3 DEV with R/3 QA and there were many data sources missing from the QA system. Why is that? Is that normal? What do I need to do to match the two systems? Thanks.
Message was edited by:
        BobbySi

You may not want all of the datasources in DEV to be in your QA system.
If someone goes through the system and randomly turns on datasources to "see what's there", you may not want all of those datasources moved up the landscape.
For example, if I'm looking for sales data, I may turn on 4 or 5 datasources looking for the right data. If I only find it in 1, then that's the only datasource I will transport.
Even if you have a matching DEV and Test system, they could be knocked out of sync if Test is refreshed from Production and Production doesn't match DEV or Test.
IMHO, it's okay if they don't match, and completely expected. The second you turn on a datasource in DEV, the systems don't match.

Similar Messages

  • Source System Mapping between BW Dev and Prod system

    Hi,
    I have a DB Connect Source System in BW Dev system called DBSRCDEV which points to one of our Oracle Dev DB.
    In Production, the DB Connect Source System is called DBSRCPROD and it points to the corresponding Oracle Prod DB.
    In Development, I created a data source on this Source System (DBSRCDEV) and tried to transport it to Production. But it failed with an error saying that DBSRCDEV does not exist.
    Where and how can I maintain or configure the mapping between DBSRCDEV and DBSRCPROD Source System so that I can transport my data source to Production?
    Please help.
    Thanks,
    CH

    Hi,
    usually you need to maintain the setting for converting the logical system name (source system) in each target system of a transport. So in your Prod system go to RSA1 -> Tools -> Change Logical System Names...
    After maintaining that table, try to import your request again.
    regards
    Siggi

  • What are the data sources for 0FIGL_V10 cube Balance Sheet and Profit& Loss

    What are the data sources for the 0FIGL_V10 (G/L (New): Balance Sheet and Profit and Loss) cube, and can we install Business Content for this?
    Please help me out.
    thanks,
    sapsdnhelp

    Hi,
    Check these:
    Urgent: Relevant Master Data Data Sources for FI-GL  & FI-AP
    http://help.sap.com/saphelp_nw04/helpdata/en/04/47a46e4e81ab4281bfb3bbd14825ca/frameset.htm
    Regards,
    Suman
    Edited by: Suman Chakravarthy on Sep 28, 2011 7:33 AM

  • Data source for application using both pooled and non pooled connections

    Hi guys.
    I am integrating Oracle's connection pooling into an existing application that had formerly used dbConnectionBroker. It looks like this task should be quite straightforward. However, for consistency, I would also like to replace other Connection logic within the application to use Oracle classes. This will involve using OracleDataSource to obtain a Connection object (without pooling).
    So in this case, the application will use both pooled and non pooled Oracle connections. They will be connecting to the same database. The question I have is in regard to the use of data-sources.xml.
    Are there any special considerations for the required attribute values within data-sources.xml under this scenario ?
    Help will be greatly appreciated.
    Regards.
    Steve.

    Hi Steve -
    It should be feasible for you to define a single datasource using multiple location entries to indicate what sort of pooling behaviour you wish to use.
    If you lookup and use the "location" attribute, you will receive a javax.sql.DataSource object which will not provide connection pooling.
    If you lookup and use the "ejb-location" attribute you will receive a DataSource object that will support connection pooling operations.
    Note that this is using the emulated datasource approach, and transaction support is limited to a single resource (one database) for these datasources - you won't get 2PC support for transactions.
    If you need a transaction to span two separate resources (ie two databases in same tx) then you will need to use the non-emulated datasource approach.
    There is a chapter in the J2EE Services Guide which describes the datasource model we have with OC4J. This might provide you with some more useful information. See Chapter 11 - http://otn.oracle.com/docs/products/ias/doc_library/903doc_otn/generic.903/a97690/ds3.htm#1004903
    cheers
    -steve-
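
    To make Steve's distinction concrete, here is a minimal sketch in Java of the two lookups described above. It assumes a single data-sources.xml entry whose location is "jdbc/OracleDS" and whose ejb-location is "jdbc/OracleEjbDS"; both names are hypothetical placeholders, not values taken from this thread.

    import java.sql.Connection;
    import javax.naming.InitialContext;
    import javax.sql.DataSource;

    public class PooledVsNonPooledLookup {
        // Runs inside OC4J, where the data-sources.xml entries are bound in JNDI.
        public void useBothDataSources() throws Exception {
            InitialContext ic = new InitialContext();

            // Lookup via the "location" name: plain DataSource, no connection pooling.
            DataSource plain = (DataSource) ic.lookup("jdbc/OracleDS");
            Connection direct = plain.getConnection();

            // Lookup via the "ejb-location" name: emulated DataSource with connection pooling.
            DataSource pooled = (DataSource) ic.lookup("jdbc/OracleEjbDS");
            Connection fromPool = pooled.getConnection();

            // ... do work against the same database with either connection ...

            direct.close();   // physically closes the non-pooled connection
            fromPool.close(); // returns the pooled connection to the pool
        }
    }

    Both lookups point at the same database, so the existing non-pooled code paths can stay as they are while new code borrows connections from the pool.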

  • Active version of BW Data Source Dev Appears inactive in Q

    Hi All,
    This is urgent.. it needs to move to Prod today but I am stuck in D still!
    Does anyone know why a particular BW data source which is active in Dev might appear to be inactive in Q after the transport?
    We tried manual replication, transporting the replica and reapplying the transports, and recollecting the data source in both R/3 and BW.
    I also verified that I am collecting the active version in my transport.
    Thanks,
    DB
    Points will be assigned if I can get rid of this issue!
    Edited by: Darshana on Jul 14, 2008 7:34 PM

    Hello Darshana,
    Make sure the appropriate "Conversion of Source System Names After Transport" settings are maintained in the target systems QA and Prod.
    RSA1 > Tools >...
    Darshana,
    Follow the above path... Go to the Data Warehousing Workbench in QA (tcode RSA1); at the top you will see Tools >, and follow from there...
    Hope this helps....
    Edited by: Chetanya Thanneer on Jul 14, 2008 10:15 PM

  • Object Data Source and oracle connections.-Help please!!

    I have a detailsview with objectdatasource as a Data source.
    Every time I edit and save a row in the DetailsView, up to 10 connections to Oracle are created. And because of this, the maximum number of connections is reached quickly.
    I am closing/disposing the connection object in the update method.
    What could be causing this behaviour?
    Need help immediately. Appreciate your time.
    -Thanks.

    That helps quite a bit. I still can't get the app to retrieve data, but I am getting a more useful message in the log:
    [Error in allocating a connection. Cause: Connection could not be allocated because: ORA-01017: invalid username/password; logon denied]
    As you suggested, I removed the <default-resource-principal> stuff from sun-web.xml and modified it to match your example. Additionally, I changed the <res-ref-name> in web.xml from "jdbc/jdbc-simple" to "jdbc/oracle-dev".
    The Connection Pool "Ping" from the Admin Console is successful with the user and password I have set in the parameters. (it fails if I change them, so I am pretty sure that is set up correctly) Is there another place I should check for user/pass information? Do I need to do anything to the samples/database.properties file?
    By the way, this is the 4th quarter 2004 release of app server. Would it be beneficial to move to the Q1 2005 beta?
    Many thanks for your help so far...
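
    For reference, here is a minimal sketch of how the application code would resolve the resource reference discussed above. Only the "jdbc/oracle-dev" res-ref-name comes from this thread; the class and method names are hypothetical.

    import java.sql.Connection;
    import javax.naming.InitialContext;
    import javax.sql.DataSource;

    public class OracleDevConnectionHelper {
        // Must run inside the application server, where java:comp/env is bound.
        public Connection openConnection() throws Exception {
            InitialContext ic = new InitialContext();
            // java:comp/env plus the <res-ref-name> declared in web.xml ("jdbc/oracle-dev").
            DataSource ds = (DataSource) ic.lookup("java:comp/env/jdbc/oracle-dev");
            // If the pool's user/password are wrong, ORA-01017 surfaces on this call.
            return ds.getConnection();
        }
    }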

  • SQL Server 2008 R2 - Report Builder 3.0 - timeout using shared data source and stored procedure

    I select the shared data source from the data source properties dialog, test the connection and everything is good.
    I add a dataset by selecting "use a dataset embedded in my report" option within the Dataset properties dialog.
    I select the newly added data source, click the "Stored procedure" query type and drop down the list box and select my intended stored procedure.
    the timeout for the dataset is "0" seconds.
    I click the "OK" button and I'm presented with the parameters to the stored procedure.
    I enter valid data for the parameters and click the "OK" button.
    I then get the following error message after 30 seconds:
    The problem is, all of the timeouts that I'm aware of have values of zero (no timeout) or are high enough that 30 seconds isn't even close to the timeout.
    I think the smallest timeout we have is 120 seconds.
    I have searched this site and many others and the solutions all involve altering the stored procedure to get the fields into report builder and then revert the stored procedure back to its original form.
    To me, this is NOT a solution.  
    I have too many stored procedures that need to be brought into Report Builder.
    I need a real solution.
    Thank you for you time, Tim Caldwell.
    Timothy E Caldwell

    I don't mean to be rude, but really, check to see if the stored procedure can return data rows???
    Maybe I'm not being clear enough.
    The stored procedure runs perfectly fine.
    it runs perfectly fine in the production environment and the test environment.
    I can access the stored procedure in several ways and have it return correct data.
    I can even trick report builder into creating a dataset with parameters and run the stored procedure that way.
    What I cannot do, is to get report builder to not timeout after 30 seconds on the initial creation of a dataset with a Query type of stored procedure.
    I have seen this issue posted again and again on many different sites, and the "solution" is to simplify the stored procedure by creating a stored procedure that has a create table and a select in it and that's it.  After
    Report Builder creates the dataset, the developer then has to replace the simplified stored procedure with the actual stored procedure, and everything works fine after that.
    HOWEVER, having to go through this process for 70 or more stored procedures is ridiculous.
    It would appear that there is something within report builder itself that is causing this issue.
    The SQL script included below is an example of a stored procedure that will not create a dataset with fields and parameters in Report Builder 3.0:
    USE [CRUM_IT]
    GO
    /****** Object: StoredProcedure [dbo].[COGNOS_Level5ScriptSP] Script Date: 11/17/2014 08:02:26 ******/
    SET ANSI_NULLS ON
    GO
    SET QUOTED_IDENTIFIER ON
    GO
    ALTER procedure [dbo].[COGNOS_Level5ScriptSP]
    @CompanyCode varchar(8) = null,
    @GetSiblings varchar(1) = 'N'
    as
    Begin
    -- get emergency contact info
    select *
    into #tmp_Contacts
    from
    (select
    ConEEID,
    con.connamelast as [Emer Contact Last Name],
    con.connamefirst as [Emer Contact First Name],
    con.connamemiddle as [Emer Contact Middle Initial/Name]--,
    ,ROW_NUMBER() over (Partition by ConEEID order by ConNameLast)as rn
    ,ISNULL(
    case when con.conphonepreferred = 'H'
    then '(' + substring(con.conphonehomenumber, 1, 3) + ')' + substring(con.conphonehomenumber, 4, 3) + '-' + substring(con.conphonehomenumber, 7, 4)
    else '(' + substring(con.conphoneothernumber , 1, 3) + ')' + substring(con.conphoneothernumber , 4, 3) + '-' + substring(con.conphoneothernumber , 7, 4)
    end, ''
    ) as [Emergency Phone]
    from [ultiprosqlprod1].[ultipro_crum].dbo.Contacts con
    where con.ConIsEmergencyContact='y'
    and con.ConIsActive='y'
    ) A
    where A.rn = 1
    CREATE TABLE #tmp_CompanyCodes (CompanyCode varchar(8))
    If @GetSiblings = 'Y'
    Begin
    INSERT INTO #tmp_CompanyCodes (CompanyCode)
    EXEC [z_GetClientNumbers_For_ParentOrg_By_ClientNumber] @CompanyCode
    End
    INSERT INTO #tmp_CompanyCodes
    values (@CompanyCode)
    select *
    into #tmp_Company
    from [ultiprosqlprod1].[ultipro_crum].dbo.Company
    where cmpcompanycode in (select CompanyCode from #tmp_CompanyCodes)
    select distinct
    cmpcompanycode as [Client ID],
    CmpCompanyDBAName as [Client Name],
    eec.eecEmplStatus AS [Employment Status],
    eec.eecEmpNo AS [Employee Num],
    rtrim(eep.eepNameLast) AS [Last Name],
    rtrim(eep.eepNameFirst) AS [First Name],
    isnull(rtrim(ltrim(eep.eepNameMiddle)), '') AS [Middle Initial/Name],
    rtrim(eep.eepAddressLine1) AS [Address Line 1],
    isnull(rtrim(eep.eepAddressLine2), '') AS [Address Line 2],
    eep.eepAddressCity AS [City],
    eep.eepAddressState AS [State],
    CASE
    WHEN len(eep.eepAddressZipCode) > 5 and charindex(eep.eepAddressZipCode, '-', 1) = 0
    THEN substring(eep.eepAddressZipCode, 1, 5)
    ELSE rtrim(eep.eepAddressZipCode)
    END AS [Zip code],
    CASE
    WHEN len(eep.eepAddressZipCode) > 5 and charindex(eep.eepAddressZipCode, '-', 1) = 0
    THEN substring(eep.eepAddressZipCode, 6, 4)
    WHEN len(eep.eepAddressZipCode) > 5 and charindex(eep.eepAddressZipCode, '-', 1) > 0
    THEN substring(eep.eepAddressZipCode, charindex(eep.eepAddressZipCode, '-', 1) + 1, 4)
    WHEN len(eep.eepAddressZipCode) <= 5
    THEN ''
    END AS [ZIP + 4],
    substring(eep.eepSSN, 1, 3) + '-' + substring(eep.eepSSN, 4, 2) + '-' + substring(eep.eepSSN, 6, 4) AS [SSN],
    isnull(convert(VARCHAR(10), eep.eepDateOfBirth, 101), '') AS [Date Of Birth],
    eetFED.TAXCODE AS [FED Tax Code],
    eetFED.FILINGSTATUS AS [Fed Filing Status],
    eetFED.EXEMPTIONS AS [Fed Exemption Allowance],
    eetFED.ADDITIONAL AS [Additional Fed Withholding],
    eetSIT.TAXCODE AS [SIT Tax Code],
    eetSIT.FILINGSTATUS AS [State Filing Status],
    eetSIT.EXEMPTIONS AS [State Exemption Allowance],
    eetSIT.ADDITIONAL AS [Additional State Withholding],
    isnull('(' + substring(eep.eepPhoneHomeNumber, 1, 3) + ')' + substring(eep.eepPhoneHomeNumber, 4, 3) + '-' + substring(eep.eepPhoneHomeNumber, 7, 4), '') AS [Home Phone],
    isnull((SELECT cod.codDesc
    FROM [ultiprosqlprod1].[ultipro_crum].dbo.Codes cod WITH (NOLOCK)
    WHERE cod.codCode = eep.eepEthnicID
    AND cod.codDosTable = 'ETHNICCODE'), '') AS [Race-Origin], --eep.eepEthnicID AS [Race-Origin],
    eep.eepGender AS [Gender],
    isnull(convert(VARCHAR(10), eec.eecDateOfOriginalHire, 101), '') AS [Original Hire Date],
    isnull(convert(VARCHAR(10), eec.eecDateOfSeniority, 101), '') AS [Seniority Date],
    isnull(convert(VARCHAR(10), eec.eecDateOfTermination, 101), '') AS [Termination Date],
    isnull(eecTermType,'') as [Termination Type],
    isnull(TchDesc, '') as [Termination Reason],
    rtrim(eec.eecJobCode) AS [WC Code],
    isnull(eec.eecJobTitle, '') AS [Job Title],
    pgr.pgrPayFrequency AS [Pay Frequency],
    eec.eecFullTimeOrPartTime AS [Full/Part Time],
    eec.eecSalaryOrHourly AS [Pay Type],
    isnull(convert(MONEY, eec.eecHourlyPayRate), 0.00) AS [Hourly Rate],
    isnull(eec.eecAnnSalary, 0.00) AS [Annual Salary],
    [YTD Hours],
    isnull(eep.eepNameFormer, '') AS [Maiden Name],
    eec.eecLocation AS [Location ID],
    rtrim(eec.eecOrgLvl1) AS [Department ID],
    eec.eecorglvl2 AS [Cost Item],
    eec.eecorglvl3 as [Client Project],
    eec.eecPayGroup as [Pay Group],
    isnull(eepAddressEMail,' ') as [Email Address],
    isNull(BankName1,' ') as PrimaryBank,
    isNull(BankRoute1,' ') as PrimaryRouteNum,
    isNull(Account1,' ') as PrimaryAccount,
    isNull(AcctType1,' ') as PrimaryAcctType,
    isNull(DepositRule1,' ') as PrimaryDepositRule,
    isNull(BankName2,' ') as SecondaryBank,
    isNull(BankRoute2,' ') as SecondaryRouteNum,
    isNull(Account2,' ') as SecondaryAccount,
    isNull(AcctType2,' ') as SecondaryAcctType,
    isNull(DepositRule2,' ') as SecondaryDepositRule,
    isNull(
    CASE
    WHEN DepositRule2 = 'D'
    THEN '$' + convert(varchar, cast(EddAmtOrPct2 AS decimal(10,2)))
    WHEN DepositRule2 = 'P'
    THEN convert(varchar, cast((EddAmtOrPct2*100) AS decimal(10,0))) + '%'
    ELSE null
    END,' ') as SecondaryDepositAmount,
    isNull(BankName3,' ') as ThirdBank,
    isNull(BankRoute3,' ') as ThirdRouteNum,
    isNull(Account3,' ') as ThirdAccount,
    isNull(AcctType3,' ') as ThirdAcctType,
    isNull(DepositRule3,' ') as ThirdDepositRule,
    isNull(
    CASE
    WHEN DepositRule3 = 'D'
    THEN '$' + convert(varchar, cast(EddAmtOrPct3 AS decimal(10,2)))
    WHEN DepositRule3 = 'P'
    THEN convert(varchar, cast((EddAmtOrPct3*100) AS decimal(10,0))) + '%'
    ELSE null
    END,' ') as ThirdDepositAmount,
    Supervisor,
    eec.eecEEID AS [Employee EEID],
    eec.EecJobCode As [Job Code],
    isnull(eec.EecTimeclockID,' ') As [Time Clock ID],
    con.[Emer Contact Last Name],
    con.[Emer Contact First Name],
    con.[Emer Contact Middle Initial/Name],
    con.[Emergency Phone]
    from [ultiprosqlprod1].[ultipro_crum].dbo.empPers eep WITH (NOLOCK)
    inner join [ultiprosqlprod1].[ultipro_crum].dbo.empComp eec WITH (NOLOCK)
    ON eep.eepEEID = eec.eecEEID
    inner join #tmp_Company cmp WITH (NOLOCK)
    ON eec.eecCOID = cmp.cmpCOID
    inner join [ultiprosqlprod1].[ultipro_crum].dbo.PayGroup pgr WITH (NOLOCK)
    ON eec.eecPayGroup = pgr.pgrPayGroup
    left outer join [ultiprosqlprod1].[ultipro_crum].dbo.TrmReasn
    on tchCode = eecTermReason
    left join (select CAST(sum(isnull(eee.eeeYTDHrs,0.00))AS DECIMAL(18,2)) as [YTD Hours],
    eeeEEID,
    eeeCOID
    from [ultiprosqlprod1].[ultipro_crum].dbo.EmpEarn eee with (NOLOCK)
    group by eeeCOID,eeeEEID)eee
    on eec.eecEEID = eee.eeeEEID
    and eec.eecCOID = eee.eeeCOID
    left join (SELECT eetCOID AS COID,
    eetEEID AS EEID,
    eetTaxCode AS TAXCODE,
    eetFilingStatus AS FILINGSTATUS,
    eetExemptions AS EXEMPTIONS,
    eetExtraTaxDollars AS ADDITIONAL
    FROM [ultiprosqlprod1].[ultipro_crum].dbo.empTax WITH (NOLOCK)
    WHERE eetTaxCode = 'USFIT'
    )eetFED
    ON eec.eecCOID = eetFED.COID
    and eec.eecEEID = eetFED.EEID
    left join (SELECT eetCOID AS COID,
    eetEEID AS EEID,
    eetTaxCode AS TAXCODE,
    eetFilingStatus AS FILINGSTATUS,
    eetExemptions AS EXEMPTIONS,
    eetExtraTaxDollars AS ADDITIONAL
    FROM [ultiprosqlprod1].[ultipro_crum].dbo.empTax WITH (NOLOCK)
    WHERE eetTaxCode like '%SIT'
    AND eetIsWorkInTaxCode = 'Y'
    )eetSIT
    ON eec.eecCOID = eetSIT.COID
    and eec.eecEEID = eetSIT.EEID
    left outer join (SELECT eddCOID,
    eddEEID,
    eddEEBankName BankName1,
    eddEEBankRoute BankRoute1,
    eddAcct Account1,
    EddAcctType AcctType1,
    EddDepositRule DepositRule1,
    EddAmtOrPct EddAmtOrPct1
    FROM [ultiprosqlprod1].[ultipro_crum].dbo.EmpDirDp WITH (NOLOCK)
    WHERE eddSequence = '99')edd
    ON eec.eecCOID = edd.eddCOID
    and eec.eecEEID = edd.eddEEID
    left outer join (SELECT eddCOID,
    eddEEID,
    eddEEBankName BankName2,
    eddEEBankRoute BankRoute2,
    eddAcct Account2,
    EddAcctType AcctType2,
    EddDepositRule DepositRule2,
    EddAmtOrPct EddAmtOrPct2
    FROM [ultiprosqlprod1].[ultipro_crum].dbo.EmpDirDp WITH (NOLOCK)
    WHERE eddSequence = '01')edd2
    ON eec.eecCOID = edd2.eddCOID
    and eec.eecEEID = edd2.eddEEID
    left outer join (SELECT eddCOID,
    eddEEID,
    eddEEBankName BankName3,
    eddEEBankRoute BankRoute3,
    eddAcct Account3,
    EddAcctType AcctType3,
    EddDepositRule DepositRule3,
    EddAmtOrPct EddAmtOrPct3
    FROM [ultiprosqlprod1].[ultipro_crum].dbo.EmpDirDp WITH (NOLOCK)
    WHERE eddSequence = '02')edd3
    ON eec.eecCOID = edd3.eddCOID
    and eec.eecEEID = edd3.eddEEID
    left outer join (SELECT eecCOID,
    eecEEID,
    rtrim(eepNameLast) + ', ' +
    rtrim(eepNameFirst) + ' ' +
    isnull(rtrim(ltrim(eepNameMiddle)), '') AS [Supervisor]
    FROM [ultiprosqlprod1].[ultipro_crum].dbo.EmpComp WITH (NOLOCK)
    join [ultiprosqlprod1].[ultipro_crum].dbo.EmpPers with (NoLock)
    on eeceeid = eepeeid)eec2
    ON eec.eecSupervisorID = eec2.eecEEID
    left outer join #tmp_Contacts con
    on eep.eepEEID = con.ConEEID
    order by [Client ID],
    [Last Name],
    [First Name]
    drop table #tmp_Contacts
    END
    Timothy E Caldwell

  • Vb 2010 data source configuration wizard and sql server 2012

    I am having a problem using the Data Source Configuration Wizard in VB 2010 Express to connect to a remote SQL Server database. The weird thing is that I can connect to it just fine if I manually dimension the SqlCommand, SqlConnection and set up the SQL
    connection string. So I know the connectivity is possible, but I would like to use the functionality of the Data Source Configuration Wizard.
    Here's what happens when I attempt to add a connection via the Database Explorer:
    I only get two SQL Server options. Neither will let me connect to a remote database.
    I know that Microsoft SQL Server Compact 3.5 isn't what I want, but I have tried it anyway. It will only let me connect to a database on my computer. An 'ActiveSync' option is greyed out.
    Microsoft SQL Server Database File would appear to be what I want, but when I select this option, it informs me (as you can see) that this will only work for a local database:
    The connection string that I build manually (and which works) looks like this:
    Dim myConn As New SqlConnection(sConnect)
    sConnect = "Data Source=xxxxxx.xxxxxx.com;Server=Xxxx XXX Database;Initial Catalog=StockAnalysisProd;User ID=cadietz;Password=" & sPW
    I have attempted all variations of this string in the database file name input box, to no avail. I think it really should go in as a Data Source, but those are hard-wired and cannot be changed. I'm sure I'm doing something stupid, but I need someone
    to point out what it is. Thanks in advance for your help...
    Alex3764

    Alberto... thanks for your reply.
    Unfortunately this does not appear to work. I downloaded Visual Studio Express for the Web (2012) and attempted the workaround proposed by Sergey, and I could not make it work on the remote server. I have given up on the Data Connection Wizard and
    am simply connecting to my remote SQL Server using the manual classes (SqlConnection, SqlCommand, SqlDataAdapter, etc.). The moderator has marked this as an answer but I'm going to unmark it.
    I am now even more frustrated because I now find that I cannot connect to my LOCAL SQL database using the Data Connection Wizard and the circumstances are even weirder. Here's what happens:
    I open the Add Connection window, select 'Microsoft SQL Server Database File (SqlClient)' as the data source.
    I then browse to my local database file in the Database file name input box. When I click the 'Test Connection' button I get a message box that informs me: 'Test connection
    succeeded'. Voila!
    But, alas, when I close the message box and click OK on the Add Connection window I get this message: 'The ability to open this connection is not supported in this edition of Visual Studio'.
    I mean, this borders on the absurd. Why is the Data Connection Wizard even offered with VB 2010 Express if you can't use it for anything? Once again I can get around the problem manually, but I'm getting very frustrated with MS. Any suggestions??
    Alex3764

  • Data source for deposit advice xml and check writer xml

    Could someone please tell me where I can find the data source for the Deposit Advice XML and Check Writer XML programs. Since these executables are now a spawned process (PYUGEN), I don't know how to get to the data source. I was able before to look at the queries in PAYUSPST.rdf and PAYUSCHK.rdf. Where are these queries located now? Any help is appreciated.

    Hi All,
    I applied patch 6399100 and got the seeded Check Writer (XML) and Deposit Advice (XML) programs. I'm able to run the Check Writer (XML) with the parameter (Check Style: Seeded Archive Check Writer Template), but I am unable to run the Deposit Advice (XML) due to the error message "No entries found for list of values" for the 'Check Style' parameter.
    For the above (only for Deposit Advice), I am not able to execute the SQL script given in Doc ID 459306.1 due to the error message: cannot insert NULL into ("HR"."PAY_REPORT_CATEGORIES"."REPORT_GROUP_ID"). If I query select * from apps.pay_report_categories where short_name like 'ARCHIVE%', there is no record for ARCHIVE_DEPOSIT....
    Do we require any other patch/setup for this?
    Also, will the "Third Party Check" support XML Publisher?
    Any response will be greatly appreciated.
    Thanks,
    Madhu
    Message was edited by:
    user589951

  • Data Source and Beans

    I am working on a small project, and I am pretty unfamiliar with Beans.
    I am using NetBeans; I'm a newb really!
    I have a bean that is supposed to connect to a database using this code:
    DataSource source = (DataSource) ic.lookup("java:comp/env/jdbc/Movies");
    connection = source.getConnection();
    My question is this: what exactly does the "java:comp/env/jdbc" part mean, broken down?
    I have tried to use the following to connect:
    Class.forName("sun.jdbc.odbc.JdbcOdbcDriver");
    Connection connection = DriverManager.getConnection("jdbc:odbc:Store");
    but I can't get it to work.
    If I have a DB called "Movies" and the DSN is called "Store",
    what kind of syntax would I use for this bean to connect? The data source is what it reads now, and I don't understand it.
    Sorry if this is confusing, but I am really stuck!
    When I run the JSP to call the bean it shows the page, but doesn't populate anything from the database.
    If anyone could help I would appreciate it!

    java:comp/env/jdbc/Movies is a lookup in JNDI.
    Your server - Tomcat? has a data source configured, under that name.
    java:comp/env = standard Tomcat JNDI context that it loads resources into
    /jdbc/Movies = the name of the resource you are after - jdbc connection to database "Movies"
    Every server configures JNDI slightly differently, some have a nice interface, others don't.
    Take a look at this page to see how it is done in Tomcat: http://jakarta.apache.org/tomcat/tomcat-5.0-doc/jndi-datasource-examples-howto.html
    Somewhere in the config for your server it will map the name /jdbc/Movies to the database driver/url that you are using. That is where you will see the strings "sun.jdbc.odbc.JdbcOdbcDriver" and "jdbc:odbc:Store".
    Regarding your connection specifically:
    You are using the JDBC-ODBC driver - presumably to connect to Access?
    Check that the DSN is a SYSTEM DSN, and not just a user one.
    Basically the Datasource is a wrapper around the DriverManager, that provides useful services like connection pooling. You shouldn't really care where you get the DB connection from as long as you can get it :-)
    So it comes to the crux of the matter:
    Are there any error messages printed to your console/log?
    Is the program connecting to the database successfully?
    Can you run a simple query and print out the results?
    How are you populating the bean?
    How are you displaying it on the page?
    Hope this helps some,
    evnafets
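
    As a quick sanity check along the lines evnafets suggests, here is a minimal sketch that looks up the "jdbc/Movies" resource and runs a simple query. The table name "movie" is a hypothetical placeholder for whatever table actually exists in the Movies database.

    import java.sql.Connection;
    import java.sql.ResultSet;
    import java.sql.Statement;
    import javax.naming.InitialContext;
    import javax.sql.DataSource;

    public class MoviesConnectionCheck {
        // Call this from a servlet or JSP; java:comp/env is only bound inside the container.
        public static void printFirstColumn() throws Exception {
            InitialContext ic = new InitialContext();
            DataSource source = (DataSource) ic.lookup("java:comp/env/jdbc/Movies");
            try (Connection connection = source.getConnection();
                 Statement stmt = connection.createStatement();
                 ResultSet rs = stmt.executeQuery("SELECT * FROM movie")) {
                while (rs.next()) {
                    // Print the first column of each row to prove data is coming back.
                    System.out.println(rs.getString(1));
                }
            }
        }
    }

    If this prints rows, the JNDI configuration and DSN are fine and the problem is in how the bean is populated or displayed.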

  • Data source and Extract Structure

    Hi all,
    I have a doubt about data sources and extract structures.
    I used one InfoCube with some characteristics and key figures.
    After the extraction, should we change the extract structure frequently, or is it better to keep the predefined extract structure and hide fields in the data source?
    If I hide fields in the data source and, after extraction, I want to use some of those hidden fields, can we use them? If we can, how can we extract data for those specific hidden fields to the BW side?

    If you want to create a generic data source, you can create the cube with some characteristics and key figures.
    Do you want to modify the structure after extraction? If you modify
    the structure, you have to replicate the data source, delete the data
    and upload the data again from R/3 to BW; it is not a good practice to modify the
    structure every time. If you hide a field, that field does not come over to the
    BW side. To unhide a field, first go to RSA6, edit the data source,
    remove the Hide checkbox and save. Then, in BW, replicate the data source,
    delete the data and load the data again (if you want full data for the previously hidden field).
    If you don't run a full load, only data loaded from that point on will be available for the field you changed from
    hide to unhide.

  • What is Attr data source name for SOLD,BILL and SHIP TO PARTY

    Hi all,
    I want the master data source names for sold-to party, bill-to party and ship-to party.
    Please give me some help on this.
    I need a solution very urgently.
    Thanks,
    Guna.

    Dear Gunasekhar,
    There are no separate datasources for the 0SOLD_TO, 0SHIP_TO and 0BILL_TO parties.
    They are referenced to 0CUSTOMER in BW, and the datasources for 0CUSTOMER are 0CUSTOMER_ATTR and 0CUSTOMER_TEXT.
    Assign points if useful
    Regards
    Venkata Devaraj!!!

  • Updated Direct 7mode And CDOT Script Data Sources

    Here are updated versions of my direct data sources I put together (communities doesn't seem to let me edit the existing article). The 20150703 update adds support for WFA 3.0 and provides much more complete cDOT (cm_storage) coverage, including cDOT 8.3 features. The 20150728 update fixes storage pool support. The 20150802 update adds support for WFA 3.1RC1 and fixes failover_group support in WFA 3.0.
    Not everyone has pre-existing OCUM, and installing and configuring OCUM for both 7-mode and cDOT is not necessarily a small task. These datasource types populate the standard storage and cm_storage schemas, which can then be used as normal with many workflows.
    The WFA 3.0 cDOT direct data source is almost complete. It now populates the following: disk, disk_aggregate, efficiency_policy. It includes cDOT 8.3 information added in WFA 3.0: broadcast_domain, failover_group, ipspace, storage_pool*. There may be some minor gaps in some of the data - please let me know if you identify anything. (Note: the WFA 2.2* cDOT data source is missing all of these items.) The WFA 3.1RC1 cDOT data source adds support for cluster_peer, vserver_peer and SVM DR, and all new fields added in 3.1RC1 are populated. It also populates CIFS domain information.
    The 7-mode source is less complete. It contains: array, vfiler, aggregate, interface, volume, vsm, qtree, lun. It is missing: array_license, cifs_share, cifs_share_acl, dataset, disk, igroup, igroup_initiator, lunmap, nfs_export, object_comment, quota, snapshot, snapvault, user_quota.
    Configuration is simple:
    - Provide credentials for the arrays/clusters in WFA credentials.
    - Add the datasource - select the version appropriate to your WFA version (the 7-mode source works with 2.2-3.0, while the cDOT one is different for 2.2 and 3.0).
    - Set the Hostname of the datasource to a comma-separated list of array or cluster admin names or addresses, or set up one datasource per array/cluster.
    - Configure the interval to something reasonable (say 15 or 30 minutes).
    - Increase the timeout for larger environments (in my testing, collection takes 15-90s per array/cluster - it may take longer when there are large numbers of objects or if there is significant latency between the WFA system and the array/cluster).
    There are some flags that can be passed in (comma separated) through the data source's "User name" field:
    - strict - fail collection if any error occurs (default if there is only one array/cluster)
    - nostrict - don't fail collection if an error occurs (for a single array/cluster datasource)
    - debug - log more information
    - log=<logpath> - where to log (defaults to ..\..\log\direct_<schema>.log - typically C:\Program Files\NetApp\WFA\jboss\standalone\log\)
    - timeout=<timeout_in_milliseconds> - API timeout (defaults to 180000)
    The Password, Database and Port fields are unused. Reservations, etc., should work as normal. This has been developed against WFA 2.2RC1, 2.2, 2.2.1, 3.0RC1 and 3.0 and will likely require updates to work with future versions. Upgrades from previous versions should be seamless.
    Changes from the previous version:
    - Added WFA 3.0 schema support
    - Accommodate changed mysql behavior in WFA 3.0
    - Single array/cluster is now automatically strict
    - Added nostrict parameter
    - 7-mode direct data source handles pre-7.3.3 systems
    - 7-mode direct data source doesn't fail if a vfiler is stopped/inconsistent
    - Fix for storage_pools in 3.0 cDOT data source (20150728)
    - Fix for broadcast_domain (3.0/3.1RC1)
    - Added support for WFA 3.1RC1, populates all new schema tables and fields
    - No longer imports into WFA 2.2RC1

    Hi Richard,
    The dar file is directly attached, not wrapped in a zip as the previous communities site used to do. You should be able to directly import it into WFA. Note that some browsers may save the file with a zip extension (I've seen IE do this - dar files are just zip files anyway), so you may need to rename the file to .dar first.
    Regards,
    Tim

  • Error While Activating BI Web service data source based on MDM XI.

    Dear experts,
    Need help with an error that I am getting for a BI data source based on XI MDM. An extra field was added to the data source as part of a business requirement; it works fine in the dev environment, but when we transported the changes to the Q environment it always gives the following error.
    The web-service based communication structure in the data source in QA is not getting updated with the new field in the target system.
    Any inputs or pointers would be appreciated.
    Thanks,
    Abhishek

    Many thanks for your help.  This solved the issue for our .NET code; however, the leak is still present in the report designer.  I was also wondering if you could help further: because of the limits on the Java process memory, is there a way to ensure that a separate Java process is started for each report that is loaded in my report viewers collection?  Essentially, the desktop application that I have created uses a tab control to display each type of report, so each tab goes through the following code when displaying a report and closing a tab:
    Is there a way to ensure that a different Java process is kicked off each time that I display a different report?  My current code in c# always uses the same Java process so the memory ramps up.  The code to load the report and then dispose of the report through closing the tab (and now the Java process) looks like this:
        private void LoadCrystalReport(string FullReportName)
        {
          ReportDocument reportDoc = new ReportDocument();
          reportDoc.Load(FullReportName, OpenReportMethod.OpenReportByTempCopy);
          this.crystalReportViewer1.ReportSource = reportDoc;
        }
        private void DisposeCrystalReportObject()
        {
          if (crystalReportViewer1.ReportSource != null)
          {
            ReportDocument report = (ReportDocument)crystalReportViewer1.ReportSource;
            foreach (Table table in report.Database.Tables)
            {
              table.Dispose();
            }
            report.Database.Dispose();
            report.Close();
            report.Dispose();
            GC.Collect();
          }
        }
    Thanks

  • Capturing Data Source Activation in a transport

    Dear Gurus,
    I'm using the 0TR_LP_1 data source for my data loading and am facing the following issue.
    In the DEV system, this data source has been activated by one of the developers but was not captured in any transport request.
    How can I capture the activation in another transport so that this data source can be activated in the QAS and PRD systems via transport?
    Thank you
    BR

    Hi,
    You can fix this issue by following the steps below.
    1 --> Go to T-code RSA6
    2 --> Find your data source --> Click on Object Directory Entry
    3 --> Change the package --> It will ask you for a transport
    4 --> Save it --> Double-click on your data source --> Generate data source
    5 --> Then transport your data source
    On the BW side:
    1 --> Replicate the data source
    2 --> If required, reactivate the transfer and communication structures using the standard programs.
    This will solve your issue.
    -Vikram Srivastava
