Converting data source

Hi all,
I need to send a video to a remote computer over the internet. I am using RTP sockets for this transmission, but I need to convert the data source to an array of bytes. Is there a way to do it? Even if I convert it, I am not sure this will work. So if anyone has some code that can send a live stream to a remote computer through the internet, please forward it to me.
Thanks a lot
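
In JMF you normally do not have to convert the DataSource to a byte array at all: the Processor's output DataSource can be handed directly to an RTPManager, which handles the packetising. Below is a minimal transmit sketch along those lines, under stated assumptions (JMF installed, a capture device that can be encoded as JPEG/RTP; the "vfw://0" locator, the 192.0.2.10 address and port 42050 are placeholders, not values from the post):

    import java.net.InetAddress;
    import javax.media.Manager;
    import javax.media.MediaLocator;
    import javax.media.Processor;
    import javax.media.ProcessorModel;
    import javax.media.format.VideoFormat;
    import javax.media.protocol.ContentDescriptor;
    import javax.media.protocol.DataSource;
    import javax.media.rtp.RTPManager;
    import javax.media.rtp.SendStream;
    import javax.media.rtp.SessionAddress;

    public class RtpTransmitSketch {
        public static void main(String[] args) throws Exception {
            // Build a Processor whose output is RTP-ready JPEG video.
            MediaLocator locator = new MediaLocator("vfw://0"); // placeholder capture device
            ProcessorModel model = new ProcessorModel(
                    locator,
                    new VideoFormat[] { new VideoFormat(VideoFormat.JPEG_RTP) },
                    new ContentDescriptor(ContentDescriptor.RAW_RTP));
            Processor processor = Manager.createRealizedProcessor(model);

            // The output DataSource is passed to RTP as-is - no byte[] conversion.
            DataSource output = processor.getDataOutput();

            RTPManager rtp = RTPManager.newInstance();
            SessionAddress local = new SessionAddress(InetAddress.getLocalHost(), 42050);
            SessionAddress remote = new SessionAddress(InetAddress.getByName("192.0.2.10"), 42050); // placeholder target
            rtp.initialize(local);
            rtp.addTarget(remote);

            SendStream stream = rtp.createSendStream(output, 0); // first stream of the DataSource
            stream.start();
            processor.start();
        }
    }

The receiving side would use the same session address with an RTPManager and a ReceiveStreamListener, but that is a separate topic.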

There is a "Configuration" option for an application module defined under: Applications -> Workspace -> Model -> Applications Sources -> [a dac package].
Opening it up (the Configuration Manager), I have only one service defined, which I can switch between different db connections (for dev/test/prod) using a Connection Type of "JDBC URL". I've switched the type to "JDBC DataSource" and entered the JNDI Location name (defined in the portal), but once the app is deployed it fails.
The first time I switched, it worked - but this was due (I'm guessing here) to the schema and passwd being left around in the properties in the Config Mgr (even though I had changed the Connection Type). Clearing these fields and redeploying resulted in the app failure. ...but that's the whole point behind using a data source - i.e. not maintaining db passwd info in the app itself.
Another interesting thing happened - once I changed to a "JDBC DataSource" connection, wiped the passwd and schema from the properties, saved, and reopened the Config Mgr, the entire popup was blank. To get it back, I have to revert to a previously saved version of my app.
ps: Migrating to 10.1.3 or 11 (my preference) is part of my plans for 2009/2010.
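
For what it's worth, the "JDBC DataSource" connection type simply makes the framework resolve the connection through a JNDI lookup at runtime, so the container holds the schema and passwd and the application only references the JNDI location - exactly the behaviour being aimed for above. A minimal sketch of what that lookup amounts to (the name "jdbc/MyAppDS" is a placeholder; some containers expect the "java:comp/env/" prefix in front of it):

    import java.sql.Connection;
    import javax.naming.InitialContext;
    import javax.sql.DataSource;

    public class DataSourceLookupSketch {
        public static Connection getConnection() throws Exception {
            // The container owns the credentials; the app only knows the JNDI name.
            InitialContext ctx = new InitialContext();
            DataSource ds = (DataSource) ctx.lookup("jdbc/MyAppDS"); // placeholder JNDI location
            return ds.getConnection();
        }
    }

If the deployment fails only after the schema/passwd fields are cleared, the usual things to check are that the JNDI location in the Configuration Manager matches the data source name defined on the container, and that the data source itself is deployed there with working credentials.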

Similar Messages

  • Help to read a table with data source and convert time stamp

    Hi Gurus,
    I have a requirement to write an ABAP program. As soon as I execute the program it should ask me to enter a data source name; the program then has to read a table with this data source as the key, sort the time stamps from the table, and display the data source and time stamp as output.
    As follows:
    Enter Data Source Name: 
    Then user enters : 2lis_11_vahdr
    Then the output should be "Data source : 10-15-2008".
    The time stamp format in the table is 20050126031520 (YYYYMMDDhhmmss). I have to display it as 01-26-2005. Any help would be appreciated.
    Thanks,
    Ram

    Hi Jayanthi Babu Peruri,
    I tried extracting YEAR, MONTH, and DAY separately and writing them using an EDIT MASK.
    There is probably a STANDARD CONVERSION ROUTINE for this, but I have no idea which one.
    DATA : V_TS      TYPE TIMESTAMP,
           V_TS_T    TYPE CHAR16,
           V_YYYY    TYPE CHAR04,
           V_MM      TYPE CHAR02,
           V_DD      TYPE CHAR02.

    START-OF-SELECTION.
      GET TIME STAMP FIELD V_TS.
      V_TS_T = V_TS.                 " character form: YYYYMMDDhhmmss
      CONDENSE V_TS_T.
      V_YYYY = V_TS_T(4).            " year
      V_MM   = V_TS_T+4(2).          " month
      V_DD   = V_TS_T+6(2).          " day
      V_TS_T(2)   = V_MM.            " rearrange to MMDDYYYYhhmmss
      V_TS_T+2(2) = V_DD.
      V_TS_T+4(4) = V_YYYY.
      SKIP 10.
      WRITE : /10 V_TS,              " USING EDIT MASK '____-__-________'.
              /10 V_YYYY,
              /10 V_MM,
              /10 V_DD,
              /10 V_TS_T USING EDIT MASK '__-__-__________'.
    If you want DATE alone, just declare the length of V_TS_T as 10.
    Regards,
    R.Nagarajan.
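
    Outside ABAP (where CONVERT TIME STAMP or a standard conversion routine would normally handle this), the conversion is just parse-and-reformat. Purely as an illustration of the same logic, here in Java:

        import java.time.LocalDateTime;
        import java.time.format.DateTimeFormatter;

        public class TimestampReformatSketch {
            public static void main(String[] args) {
                // Sample value in YYYYMMDDhhmmss form, as stored in the table
                String raw = "20050126031520";
                LocalDateTime ts = LocalDateTime.parse(raw, DateTimeFormatter.ofPattern("yyyyMMddHHmmss"));
                // Re-emit as MM-dd-yyyy for display
                System.out.println(ts.format(DateTimeFormatter.ofPattern("MM-dd-yyyy"))); // prints 01-26-2005
            }
        }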

  • Using sql bulk copy throwing exception -The given value of type String from the data source cannot be converted to type int of the specified target column

    Hi All,
    I am reading Notepad (text) files and inserting the data into SQL tables. While performing the SQL bulk copy, the line bulkcopy.WriteToServer(dt); throws the data-type exception mentioned in the subject.
    Please go through my logic and tell me what to change to avoid this error:
    public void Main()
    {
        string[] filePaths = Directory.GetFiles(@"C:\Users\jainruc\Desktop\Sudhanshu\master_db\Archive\test\content_insert\");
        for (int k = 0; k < filePaths.Length; k++)
        {
            string[] lines = System.IO.File.ReadAllLines(filePaths[k]);
            // table name needs to be extracted after the = sign
            string[] pathArr = filePaths[0].Split('\\');
            string tablename = pathArr[9].Split('.')[0];
            DataTable dt = new DataTable(tablename);
            // second line of the file holds the pipe-delimited column names
            string[] arrColumns = lines[1].Split(new char[] { '|' });
            foreach (string col in arrColumns)
            {
                dt.Columns.Add(col);
            }
            // remaining lines hold the pipe-delimited values
            for (int i = 2; i < lines.Length; i++)
            {
                string[] columnsvals = lines[i].Split(new char[] { '|' });
                DataRow dr = dt.NewRow();
                for (int j = 0; j < columnsvals.Length; j++)
                {
                    //Console.Write(columnsvals[j]);
                    if (string.IsNullOrEmpty(columnsvals[j]))
                        dr[j] = DBNull.Value;
                    else
                        dr[j] = columnsvals[j];
                }
                dt.Rows.Add(dr);
            }
            SqlConnection conn = new SqlConnection();
            conn.ConnectionString = "Data Source=UI3DATS009X;" + "Initial Catalog=BHI_CSP_DB;" + "User Id=sa;" + "Password=3pp$erv1ce$4";
            conn.Open();
            SqlBulkCopy bulkcopy = new SqlBulkCopy(conn);
            bulkcopy.DestinationTableName = dt.TableName;
            bulkcopy.WriteToServer(dt);   // this is the line that throws
            conn.Close();
        }
        Dts.TaskResult = (int)ScriptResults.Success;
    }
    Issue 1: I am reading the Notepad file and getting all the columns and values into my DataTable. While inserting, date/time or integer fields need an explicit conversion - how do I write that for a specific column before bulkcopy.WriteToServer(dt)?
    Issue 2: The Notepad file does not contain all the destination columns, nor are they in a specific sequence. I can add a few columns myself (which I do now), but then the DataTable holds my columns plus the Notepad columns - when inserting, how do I assign them to the particular destination columns?
    sudhanshu sharma Do good and cast it into river :)

    Hi,
    I think you'll have to do an explicit column mapping if the columns are not in the exact same sequence in the source and destination.
    Have a look at this link:
    https://msdn.microsoft.com/en-us/library/system.data.sqlclient.sqlbulkcopycolumnmapping(v=vs.110).aspx
    Good Luck!
    Kaur.
    Please mark as answer if this resolves your issue.

  • Using a SQL data source and XML data source in the same template

    I am trying to develop a template for the Request for Quote report generated in Apps 11.5.10. I have loaded the data from the XML output into the template, but I am missing one field - I need the org_id from the po_headers table. Is it possible to use a SQL data source (i.e., "select org_id from po_headers_all where po_header_id = [insert header_id from xml data]...") in addition to the XML data source to populate the template at runtime? When you use the Insert > SQL functionality, is it static at the time the template is created, or does it call the database at runtime? I've looked through all the docs I could find, but this isn't clear.
    Thanks for any help or suggestions you may have.
    Rhonda

    Hi Pablo
    That's a tough one... If you go custom with a data template you will at least get support on the data template functionality, i.e. if you have a problem when you try to build one. You will not get support on the query inside the data template as you might have with the Oracle Report, although you could at least log a bug against development for a bad query.
    Eventually that Oracle Report will be converted by development anyway; there's an R12 project going on right now to switch the shipped Oracle Reports to data templates. At that point you'll be fully supported again, but:
    1. You have to have R12, and
    2. You'll need to wait for the patch.
    On reflection, if you are confident enough in the query then Oracle will support you on its implementation within a data template. Going forward you may be able to swap out your data template and put in the Oracle one without too much effort.
    Regards, Tim

  • Unable to refresh SQL Server data source through Data Management Gateway

    I just installed the version 1.1.5226.8 of Data Management Gateway and tried to refresh a simple query on a table connected to SQL Server, with no transformations in Power Query.
    This is the error I obtain:
    Errors in the high-level relational engine. The following exception occurred while the managed IDataReader interface was being used: transfer service job status is invalid.
    I am wondering whether my Power BI is still not updated to handle such a connection type, or whether there is something else not working.
    I correctly created the data source in the admin panel following the instructions in the Release Notes, and the Power Query connection test is OK.
    Marco Russo http://www.sqlbi.com http://www.powerpivotworkshop.com http://sqlblog.com/blogs/marco_russo

    I ran other tests and found important information (maybe there is a bug, but read the following).
    The functions DateTime.LocalNow and DateTime.FixedLocalNow work correctly, generating these statements to SQL Server:
        convert(datetime2, '2014-05-03 06:37:52.1135108') as [LocalNow],
        convert(datetime2, '2014-05-03 06:37:52.0525061') as [FixedLocalNow],
    The functions DateTimeZone.FixedLocalNow, DateTimeZone.FixedUtcNow, DateTimeZone.LocalNow, and DateTimeZone.UtcNow stop the scheduled refresh with the error I mentioned in my previous messages, generating these statements to SQL Server:
        '2014-05-03 06:37:52.0525061+02:00' as [TZFixedLocalNow],
        '2014-05-03 04:37:52.0525061+00:00' as [TZFixedUtcNow],
        '2014-05-03 06:37:52.1135108+02:00' as [TZLocalNow],
        '2014-05-03 04:37:52.1135108+00:00' as [TZUtcNow]
    I solved the issue by placing the DateTimeZone calls after a Table.Buffer call, so query folding does not translate these functions into SQL. However, it seems like something to fix.
    Marco Russo http://www.sqlbi.com http://www.powerpivotworkshop.com http://sqlblog.com/blogs/marco_russo

  • SSRS CatalogItem method not working for deploying a shared data source

    I have been working with the SSRS CreateCatalogItem method to deploy reports to an SSRS 2012 instance in SharePoint integrated mode with SharePoint Server Enterprise 2013. I am using PowerShell. The CreateCatalogItem method works fine when I deploy RDL files, but fails when I deploy an RDS file. I get an rsInvalidXML1400 error, whatever that is. Here is a cut-down version of my code to establish the bare essentials:
        [String] $reportserver = "server20";
        [String] $url = "http://$($reportserver)/sites/AdventureWorks/_vti_bin/reportserver/reportservice2010.asmx?WSDL";
        [String] $SPFolderPath = "http://server20/sites/AdventureWorks/BICenter/Data%20Connections/";
        [String] $fileFolder = "C:\SiteBackups\BIReports\BIReports\";
        [String] $itemName = "AdventureWorksCube.rds";
        $ssrs = New-WebServiceProxy -uri $url -UseDefaultCredential;       
        $warnings = $null; 
        $itemPath= $($fileFolder + $itemName);
        $definition = get-content $itemPath -encoding byte;      
        try
        {
            $ssrs.CreateCatalogItem("DataSource", $itemName, $SPFolderPath, $False, $definition, $null, [ref] $warnings);
        }
        catch [System.Web.Services.Protocols.SoapException]
        {
            $msg = $_.Exception.Detail.InnerText;
            Write-Error $msg;
        }
    I have a workaround whereby I read the XML of the data source file directly and extract the ConnectString and Extension elements then use the text within them to create the data source using the DataSourceDefinition class. My point is not to get a workaround.
    I want to establish that the CreateCatalogItem method indeed does not work when used with the ItemType "DataSource". In the code above, if I change the ItemType (i.e. the first parameter of CreateCatalogItem) to "Report" and change $itemName to the name of an RDL file, it deploys correctly. Has anyone else encountered this behavior, or am I doing something wrong here?
    Charles Kangai, MCT
    author of the following Microsoft Business Intelligence courses:
    http://www.learningtree.co.uk/courses/139/sql-server-analysis-services-for-business-intelligence/
    http://www.learningtree.co.uk/courses/134/sql-server-integration-services-for-business-intelligence/
    http://www.learningtree.co.uk/courses/140/sql-server-reporting-services/
    http://www.learningtree.co.uk/courses/146/sharepoint-business-intelligence/

    Hello,
    We can invoke the SSRS proxy endpoint (ReportService2006.asmx) from PowerShell to publish report definitions (.rdl) and report models (.smdl) to a SharePoint library, but this does not apply to data source (.rds) files.
    In order to deploy an .rds to a SharePoint library without using SSDT, you should convert the .rds file to its .rsds counterpart, which contains pretty much the same content but in a different schema.
    If you want to fully automate your deployment, you should write your own converter and perform the deployment by utilizing the SharePoint feature framework and the SSRS proxy endpoint (ReportService2006.asmx).
    Please refer to the following blog about this issue:
    PowerShell:Deploying SSRS Reports in Integrated Mode
    Deploying Reports in Integrated Mode
    Regards,
    Fanny Liu
    TechNet Community Support

  • Update Routine to populate 0VENDOR from either of the 2 data source fields

    Hi,
    I have a requirement to write an update routine for 0VENDOR based on the below logic :
    Create routines to populate the BW InfoObject "Vendor" (0VENDOR) based on the following logic:
    IF the field "Vendor" (ITM_VENDOR_ID) is populated from data source 0BBP_SC_TD_1, THEN populate 0VENDOR with that value;
    ELSE IF "Preferred Vendor" (ITM_PROPVEN_ID) is populated from data source 0BBP_SC_TD_1, THEN populate 0VENDOR with that value;
    ELSE IF neither "Vendor" (ITM_VENDOR_ID) nor "Preferred Vendor" (ITM_PROPVEN_ID) is populated from data source 0BBP_SC_TD_1, THEN 0VENDOR = NULL.
    Can anyone help me convert this logic into an ABAP routine?
    Thanks,
    Suchitra

    Hi Suchitra,
    In the Transfer Rules you will be mapping each field; on the mapped field, click the button with the triangle and you can see which rule type you want.
    Then select "Routine" and select the data source fields (don't forget to select both fields, ITM_VENDOR_ID and ITM_PROPVEN_ID).
    Then give the routine a name,
    and in the code just use TRANSFER_STRUCTURE instead of COMM_STRUCTURE.
    Then you can get this done.
    Regards,
    Ravi Kanth

  • Reading Values from Listbox and data source into MS Office Toolkit

    Hi,
    I've been trying to get this to work but am making no progress, and my lack of experience with LabVIEW is becoming a hindrance.
    Does anyone know how I can read the values from the attached listbox example into the MS Office Toolkit for Excel?
    The values from the listbox need to be compared to multiple values from a strain data source.
    Cheers,
    Mike.
    Attachments:
    Capture.PNG (62 KB)

    Hi,
    OK, in the attached VI I want the values from the listbox ("0kg" through "10kg") to be put into the Excel table in the Report Generation Toolkit, alongside the data from the convert strain gauge reading.
    Cheers,
    Mick.
    Attachments:
    Strain Gauge Edit2.vi (112 KB)

  • Data Source & Infosource

    Hi,
    I am working on the PM cube 0PM_C01 and have to enhance this data source to get some additional data.
    The requirement is to create a custom staging DSO for this cube 0PM_C01 and then perform the reporting from the cube.
    Should I copy the standard InfoSource ZPM_0M_0PA_2 to a custom InfoSource, say XPM_OM_0PA_2, and then add a new InfoObject to hold the enhanced data, after which I will map this InfoSource to the staging DSO?
    Using this approach I will make sure all the update rules and transfer rules are captured.
    Or
    Can I just map (transformation) the enhanced data source 0PM_0M_OPA_2 to the staging DSO using BI 7?
    If this approach is used, how can I make sure the standard update rules and transfer rules are applied?
    I need to make sure all the current data in the data source is pulled into the DSO.
    Please let me know

    Hi there,
    I would use a BI 7 transformation rather than transfer and update rules...
    When moving transfer and update rules to a transformation:
    1) Right-click on the transfer/update rules -> Additional functions -> Create transformation.
    2) Make sure that all the routines in the old rules are converted to classes in the transformation. This has to be done manually, going through the code.
    3) Enhance the DataSource and apply the necessary rules in the transformation for the enhanced objects.
    4) Load to the DSO.

  • SQL Server 2008 R2 - Report Builder 3.0 - timeout using shared data source and stored procedure

    I select the shared data source from the data source properties dialog, test the connection, and everything is good.
    I add a dataset by selecting "use a dataset embedded in my report" option within the Dataset properties dialog.
    I select the newly added data source, click the "Stored procedure" query type and drop down the list box and select my intended stored procedure.
    the timeout for the dataset is "0" seconds.
    I click the "OK" button and I'm presented with the parameters to the stored procedure.
    I enter valid data for the parameters and click the "OK" button.
    I then get the following error message after 30 seconds:
    The problem is, all of the timeouts that I'm aware of have values of zero (no timeout) or are high enough that 30 seconds isn't even close to the timeout.
    I think the smallest timeout we have is 120 seconds.
    I have searched this site and many others, and the solutions all involve altering the stored procedure to get the fields into Report Builder and then reverting the stored procedure back to its original form.
    To me, this is NOT a solution.  
    I have too many stored procedures that need to be brought into Report Builder.
    I need a real solution.
    Thank you for you time, Tim Caldwell.
    Timothy E Caldwell

    I don't mean to be rude, but really, check to see if the stored procedure can return data rows???
    Maybe I'm not being clear enough.
    The stored procedure runs perfectly fine.
    It runs perfectly fine in both the production environment and the test environment.
    I can access the stored procedure in several ways and have it return correct data.
    I can even trick report builder into creating a dataset with parameters and run the stored procedure that way.
    What I cannot do, is to get report builder to not timeout after 30 seconds on the initial creation of a dataset with a Query type of stored procedure.
    I have seen this issue posted again and again on many different sites, and the "solution" is to simplify the stored procedure by creating one that has just a CREATE TABLE and a SELECT in it, and that's it. After Report Builder creates the dataset, the developer then has to replace the simplified stored procedure with the actual stored procedure, and everything works fine after that.
    HOWEVER, having to go through this process for 70 or more stored procedures is ridiculous.
    It would appear that there is something within Report Builder itself that is causing this issue.
    The SQL script included below is an example of a stored procedure that will not create a dataset with fields and parameters in Report Builder 3.0:
    USE [CRUM_IT]
    GO
    /****** Object: StoredProcedure [dbo].[COGNOS_Level5ScriptSP] Script Date: 11/17/2014 08:02:26 ******/
    SET ANSI_NULLS ON
    GO
    SET QUOTED_IDENTIFIER ON
    GO
    ALTER procedure [dbo].[COGNOS_Level5ScriptSP]
    @CompanyCode varchar(8) = null,
    @GetSiblings varchar(1) = 'N'
    as
    Begin
    -- get emergency contact info
    select *
    into #tmp_Contacts
    from
    (select
    ConEEID,
    con.connamelast as [Emer Contact Last Name],
    con.connamefirst as [Emer Contact First Name],
    con.connamemiddle as [Emer Contact Middle Initial/Name]--,
    ,ROW_NUMBER() over (Partition by ConEEID order by ConNameLast)as rn
    ,ISNULL(
    case when con.conphonepreferred = 'H'
    then '(' + substring(con.conphonehomenumber, 1, 3) + ')' + substring(con.conphonehomenumber, 4, 3) + '-' + substring(con.conphonehomenumber, 7, 4)
    else '(' + substring(con.conphoneothernumber , 1, 3) + ')' + substring(con.conphoneothernumber , 4, 3) + '-' + substring(con.conphoneothernumber , 7, 4)
    end,
    '') as [Emergency Phone]  -- ISNULL fallback value assumed to be ''
    from [ultiprosqlprod1].[ultipro_crum].dbo.Contacts con
    where con.ConIsEmergencyContact='y'
    and con.ConIsActive='y'
    ) A
    where A.rn = 1
    CREATE TABLE #tmp_CompanyCodes (CompanyCode varchar(8))
    If @GetSiblings = 'Y'
    Begin
    INSERT INTO #tmp_CompanyCodes (CompanyCode)
    EXEC [z_GetClientNumbers_For_ParentOrg_By_ClientNumber] @CompanyCode
    End
    INSERT INTO #tmp_CompanyCodes
    values (@CompanyCode)
    select *
    into #tmp_Company
    from [ultiprosqlprod1].[ultipro_crum].dbo.Company
    where cmpcompanycode in (select CompanyCode from #tmp_CompanyCodes)
    select distinct
    cmpcompanycode as [Client ID],
    CmpCompanyDBAName as [Client Name],
    eec.eecEmplStatus AS [Employment Status],
    eec.eecEmpNo AS [Employee Num],
    rtrim(eep.eepNameLast) AS [Last Name],
    rtrim(eep.eepNameFirst) AS [First Name],
    isnull(rtrim(ltrim(eep.eepNameMiddle)), '') AS [Middle Initial/Name],
    rtrim(eep.eepAddressLine1) AS [Address Line 1],
    isnull(rtrim(eep.eepAddressLine2), '') AS [Address Line 2],
    eep.eepAddressCity AS [City],
    eep.eepAddressState AS [State],
    CASE
    WHEN len(eep.eepAddressZipCode) > 5 and charindex(eep.eepAddressZipCode, '-', 1) = 0
    THEN substring(eep.eepAddressZipCode, 1, 5)
    ELSE rtrim(eep.eepAddressZipCode)
    END AS [Zip code],
    CASE
    WHEN len(eep.eepAddressZipCode) > 5 and charindex(eep.eepAddressZipCode, '-', 1) = 0
    THEN substring(eep.eepAddressZipCode, 6, 4)
    WHEN len(eep.eepAddressZipCode) > 5 and charindex(eep.eepAddressZipCode, '-', 1) > 0
    THEN substring(eep.eepAddressZipCode, charindex(eep.eepAddressZipCode, '-', 1) + 1, 4)
    WHEN len(eep.eepAddressZipCode) <= 5
    THEN ''
    END AS [ZIP + 4],
    substring(eep.eepSSN, 1, 3) + '-' + substring(eep.eepSSN, 4, 2) + '-' + substring(eep.eepSSN, 6, 4) AS [SSN],
    isnull(convert(VARCHAR(10), eep.eepDateOfBirth, 101), '') AS [Date Of Birth],
    eetFED.TAXCODE AS [FED Tax Code],
    eetFED.FILINGSTATUS AS [Fed Filing Status],
    eetFED.EXEMPTIONS AS [Fed Exemption Allowance],
    eetFED.ADDITIONAL AS [Additional Fed Withholding],
    eetSIT.TAXCODE AS [SIT Tax Code],
    eetSIT.FILINGSTATUS AS [State Filing Status],
    eetSIT.EXEMPTIONS AS [State Exemption Allowance],
    eetSIT.ADDITIONAL AS [Additional State Withholding],
    isnull('(' + substring(eep.eepPhoneHomeNumber, 1, 3) + ')' + substring(eep.eepPhoneHomeNumber, 4, 3) + '-' + substring(eep.eepPhoneHomeNumber, 7, 4), '') AS [Home Phone],
    isnull((SELECT cod.codDesc
    FROM [ultiprosqlprod1].[ultipro_crum].dbo.Codes cod WITH (NOLOCK)
    WHERE cod.codCode = eep.eepEthnicID
    AND cod.codDosTable = 'ETHNICCODE'), '') AS [Race-Origin], --eep.eepEthnicID AS [Race-Origin],
    eep.eepGender AS [Gender],
    isnull(convert(VARCHAR(10), eec.eecDateOfOriginalHire, 101), '') AS [Original Hire Date],
    isnull(convert(VARCHAR(10), eec.eecDateOfSeniority, 101), '') AS [Seniority Date],
    isnull(convert(VARCHAR(10), eec.eecDateOfTermination, 101), '') AS [Termination Date],
    isnull(eecTermType,'') as [Termination Type],
    isnull(TchDesc, '') as [Termination Reason],
    rtrim(eec.eecJobCode) AS [WC Code],
    isnull(eec.eecJobTitle, '') AS [Job Title],
    pgr.pgrPayFrequency AS [Pay Frequency],
    eec.eecFullTimeOrPartTime AS [Full/Part Time],
    eec.eecSalaryOrHourly AS [Pay Type],
    isnull(convert(MONEY, eec.eecHourlyPayRate), 0.00) AS [Hourly Rate],
    isnull(eec.eecAnnSalary, 0.00) AS [Annual Salary],
    [YTD Hours],
    isnull(eep.eepNameFormer, '') AS [Maiden Name],
    eec.eecLocation AS [Location ID],
    rtrim(eec.eecOrgLvl1) AS [Department ID],
    eec.eecorglvl2 AS [Cost Item],
    eec.eecorglvl3 as [Client Project],
    eec.eecPayGroup as [Pay Group],
    isnull(eepAddressEMail,' ') as [Email Address],
    isNull(BankName1,' ') as PrimaryBank,
    isNull(BankRoute1,' ') as PrimaryRouteNum,
    isNull(Account1,' ') as PrimaryAccount,
    isNull(AcctType1,' ') as PrimaryAcctType,
    isNull(DepositRule1,' ') as PrimaryDepositRule,
    isNull(BankName2,' ') as SecondaryBank,
    isNull(BankRoute2,' ') as SecondaryRouteNum,
    isNull(Account2,' ') as SecondaryAccount,
    isNull(AcctType2,' ') as SecondaryAcctType,
    isNull(DepositRule2,' ') as SecondaryDepositRule,
    isNull(
    CASE
    WHEN DepositRule2 = 'D'
    THEN '$' + convert(varchar, cast(EddAmtOrPct2 AS decimal(10,2)))
    WHEN DepositRule2 = 'P'
    THEN convert(varchar, cast((EddAmtOrPct2*100) AS decimal(10,0))) + '%'
    ELSE null
    END,' ') as SecondaryDepositAmount,
    isNull(BankName3,' ') as ThirdBank,
    isNull(BankRoute3,' ') as ThirdRouteNum,
    isNull(Account3,' ') as ThirdAccount,
    isNull(AcctType3,' ') as ThirdAcctType,
    isNull(DepositRule3,' ') as ThirdDepositRule,
    isNull(
    CASE
    WHEN DepositRule3 = 'D'
    THEN '$' + convert(varchar, cast(EddAmtOrPct3 AS decimal(10,2)))
    WHEN DepositRule3 = 'P'
    THEN convert(varchar, cast((EddAmtOrPct3*100) AS decimal(10,0))) + '%'
    ELSE null
    END,' ') as ThirdDepositAmount,
    Supervisor,
    eec.eecEEID AS [Employee EEID],
    eec.EecJobCode As [Job Code],
    isnull(eec.EecTimeclockID,' ') As [Time Clock ID],
    con.[Emer Contact Last Name],
    con.[Emer Contact First Name],
    con.[Emer Contact Middle Initial/Name],
    con.[Emergency Phone]
    from [ultiprosqlprod1].[ultipro_crum].dbo.empPers eep WITH (NOLOCK)
    inner join [ultiprosqlprod1].[ultipro_crum].dbo.empComp eec WITH (NOLOCK)
    ON eep.eepEEID = eec.eecEEID
    inner join #tmp_Company cmp WITH (NOLOCK)
    ON eec.eecCOID = cmp.cmpCOID
    inner join [ultiprosqlprod1].[ultipro_crum].dbo.PayGroup pgr WITH (NOLOCK)
    ON eec.eecPayGroup = pgr.pgrPayGroup
    left outer join [ultiprosqlprod1].[ultipro_crum].dbo.TrmReasn
    on tchCode = eecTermReason
    left join (select CAST(sum(isnull(eee.eeeYTDHrs,0.00))AS DECIMAL(18,2)) as [YTD Hours],
    eeeEEID,
    eeeCOID
    from [ultiprosqlprod1].[ultipro_crum].dbo.EmpEarn eee with (NOLOCK)
    group by eeeCOID,eeeEEID)eee
    on eec.eecEEID = eee.eeeEEID
    and eec.eecCOID = eee.eeeCOID
    left join (SELECT eetCOID AS COID,
    eetEEID AS EEID,
    eetTaxCode AS TAXCODE,
    eetFilingStatus AS FILINGSTATUS,
    eetExemptions AS EXEMPTIONS,
    eetExtraTaxDollars AS ADDITIONAL
    FROM [ultiprosqlprod1].[ultipro_crum].dbo.empTax WITH (NOLOCK)
    WHERE eetTaxCode = 'USFIT'
    )eetFED
    ON eec.eecCOID = eetFED.COID
    and eec.eecEEID = eetFED.EEID
    left join (SELECT eetCOID AS COID,
    eetEEID AS EEID,
    eetTaxCode AS TAXCODE,
    eetFilingStatus AS FILINGSTATUS,
    eetExemptions AS EXEMPTIONS,
    eetExtraTaxDollars AS ADDITIONAL
    FROM [ultiprosqlprod1].[ultipro_crum].dbo.empTax WITH (NOLOCK)
    WHERE eetTaxCode like '%SIT'
    AND eetIsWorkInTaxCode = 'Y'
    )eetSIT
    ON eec.eecCOID = eetSIT.COID
    and eec.eecEEID = eetSIT.EEID
    left outer join (SELECT eddCOID,
    eddEEID,
    eddEEBankName BankName1,
    eddEEBankRoute BankRoute1,
    eddAcct Account1,
    EddAcctType AcctType1,
    EddDepositRule DepositRule1,
    EddAmtOrPct EddAmtOrPct1
    FROM [ultiprosqlprod1].[ultipro_crum].dbo.EmpDirDp WITH (NOLOCK)
    WHERE eddSequence = '99')edd
    ON eec.eecCOID = edd.eddCOID
    and eec.eecEEID = edd.eddEEID
    left outer join (SELECT eddCOID,
    eddEEID,
    eddEEBankName BankName2,
    eddEEBankRoute BankRoute2,
    eddAcct Account2,
    EddAcctType AcctType2,
    EddDepositRule DepositRule2,
    EddAmtOrPct EddAmtOrPct2
    FROM [ultiprosqlprod1].[ultipro_crum].dbo.EmpDirDp WITH (NOLOCK)
    WHERE eddSequence = '01')edd2
    ON eec.eecCOID = edd2.eddCOID
    and eec.eecEEID = edd2.eddEEID
    left outer join (SELECT eddCOID,
    eddEEID,
    eddEEBankName BankName3,
    eddEEBankRoute BankRoute3,
    eddAcct Account3,
    EddAcctType AcctType3,
    EddDepositRule DepositRule3,
    EddAmtOrPct EddAmtOrPct3
    FROM [ultiprosqlprod1].[ultipro_crum].dbo.EmpDirDp WITH (NOLOCK)
    WHERE eddSequence = '02')edd3
    ON eec.eecCOID = edd3.eddCOID
    and eec.eecEEID = edd3.eddEEID
    left outer join (SELECT eecCOID,
    eecEEID,
    rtrim(eepNameLast) + ', ' +
    rtrim(eepNameFirst) + ' ' +
    isnull(rtrim(ltrim(eepNameMiddle)), '') AS [Supervisor]
    FROM [ultiprosqlprod1].[ultipro_crum].dbo.EmpComp WITH (NOLOCK)
    join [ultiprosqlprod1].[ultipro_crum].dbo.EmpPers with (NoLock)
    on eeceeid = eepeeid)eec2
    ON eec.eecSupervisorID = eec2.eecEEID
    left outer join #tmp_Contacts con
    on eep.eepEEID = con.ConEEID
    order by [Client ID],
    [Last Name],
    [First Name]
    drop table #tmp_Contacts
    END
    Timothy E Caldwell

  • Writing and including our own customer data source to Crystal Report Design

    Hi,
    I need to create a Crystal Report based on a locally stored XML file and schema. I can do this, but the issue I'm facing is that some tags used in the schema and the format are not supported by Crystal. So my plan is to write a database driver which will take our format and convert it to the format Crystal supports, and vice versa. Before writing the driver, I need to know whether it's possible to include my custom database driver in the list shown in the Crystal Report Designer when selecting a data source to create a report. Is there an API available so that I can plug my data source into the Crystal Report Designer? Is this at least possible to do, or is there another way?
    Hope you can help me.
    Thank you in advance.
    Regards,
    Chanaka

    Hi Chanaka,
    Not directly. If you base your driver on OLE DB or ODBC then it will show up in those lists.
    If you are planning on selling a lot of packages then you may want to contact our OEM Sales team and sign up. You'll have more resources to do customized features like this. Not sure if we can but the option is there.
    Thank you
    Don

  • An error occurred accessing a data source.

    Hi,
    I have seen many posts talking about this error, but I am still facing the same issue. The process I have followed is:
    1) Created an InfoPath form with a receive data connection to a SQL DB using a plain username/password
    2) Placed the repeating table of the receive data connection; the form has full trust
    3) Converted the data connection to a .udcx file and stored it in the data connection library of the site collection
    4) Published the form for admin approval, and as I am the admin I uploaded it and activated it
    5) Approved the data connection
    6) Modified the InfoPath settings to use SQL authentication, data connection files, and cross-domain access
    But I still get this error:
    An error occurred accessing a data source.
    An entry has been added to the Windows event log of the server.
    Log ID:6932
    Regards,
    Amarnath.
    Regards, Amar.

    Hi,
    I have found the log entry below:
    DataAdapterException, Exception Message: The database returned an error. (Failed to enable constraints. One or more rows contain values violating non-null, unique, or foreign-key constraints.)
    But I am receiving data; here is my query:
    SELECT [Service Request],[Task Order],COUNT(*) as TotalTestCases,SUM(CASE WHEN [Status] = 'Completed' THEN 1 ELSE 0 END) AS Completed,
          SUM(CASE WHEN [Status] = 'Blocked' THEN 1 ELSE 0 END) AS Blocked,SUM(CASE WHEN [Status] not in('Completed','Blocked') THEN 1 ELSE 0 END) AS InProgress FROM
          DashBoardView where SRStatus = 'Active' group by [Service Request],[Task Order] 
    Are my calculated columns giving the error? When I run my query it gives the correct output.
    Regards, Amar.

  • Error occurred querying a data source in office 365.

    I have created a simple InfoPath form, and one of my dropdown boxes receives data from REST services.
    When I run the InfoPath 2013 form in preview it runs fine, but when I upload it to Office 365 it shows me an error. I converted the services into UDCX files, saved them in the Data Connection Library, and they are approved.
    Warning
    Click OK to resume filling out the form. You may want to check your form data for errors.
    The entry has been added to the Windows event log of the server.
    log ID:5566
    Correlation ID:feb08a9c-9d47-30f5-5d89-60340242d6d1.

    Hi,
    According to your post, my understanding is that you get an error when querying a data source in Office 365.
    To my knowledge, we cannot connect to SharePoint Web Services from InfoPath forms within SharePoint Online.
    Loopback protection is enabled in the SharePoint Online environment and will block calls to SOAP and REST from InfoPath Forms Services.
    For more information, you can refer to:
    Error message when you connect an InfoPath form to a SharePoint Online web service: "An error occurred while connecting to a Web Service"
    As a workaround, to look up data for some dropdowns from several external REST Web Services, we need to have a Data Connection library in your Site Collection in which to store the UDCX files.
    The steps seem to be:
    1) Create a Data Connection Library in your Site Collection.
    2) In the InfoPath ribbon, go to Data, then Data Connections. Your existing data connections should be listed.
    3) Choose the data connection which is hosted on a different domain.
    4) Click the Convert to Connection File… button.
    5) In the location selector, specify the path to the Data Connection Library you created above.
    6) Leave the Relative to site collection (recommended) radio button selected.
    7) Click OK.
    Here is a great blog for your reference:
    InfoPath 2013 Forms on Office365 with External REST DataSources
    Best Regards,
    Linda Li
    TechNet Community Support

  • Create data source in BI 7.0?

    hi,
    While creating a data source in BI 7.0 (from flat file data), under the Extraction tab the character set settings offer "Default" and "Direct Entry".
    If I select Direct Entry, I get a character set and a replacement character field.
    In what scenario do we use Direct Entry and in what scenario do we use Default? If I select Direct Entry, what is the effect on my output?
    please clarify.
    regards
    ss

    System Default or Fixed Entry
    If you choose Default Setting for the character set, in Unicode systems a file in UTF-8 format is expected; in non-Unicode systems a file in the format of the system code page is expected.
    If you choose Direct Input for the character set, you can select an SAP character set and determine a replacement character, in case errors occur while converting to the system character set.
    SAP Character Set ID
    The 4-character name of an SAP character set as defined in SAP character set maintenance.
    The following explains the naming convention in more detail:
    First digit: Code
    0 EBCDIC character sets
    1 ASCII character sets
    2 mixed single byte / double byte character sets
    4 double-byte character sets
    6 mixed character sets
    8 double byte and multibyte character sets
    9 reserved for code pages you define
    Second digit: Country
    1-3 countries that use the Latin alphabet (Western Europe, North and South America, Australia, Africa)
    4-6 countries that use non-Latin alphabets and writing systems (Eastern Europe, Asia, Arabic countries in Africa)
    7-9 reserved for special languages
    Third and fourth digits: Sequential number
    Example
    0100 IBM 00697/00273 (Latin 1 - Germany/Austria)
    0401 SNI BS2000 8859-5 EHCLC (cyrillic - multiple languages)

  • Cannot Change Data Source - Error The memory could not be "read"

    We have some reports that were created under Crystal Reports version 10 using a Sybase 12.5 database as the datasource.  We have since upgraded to Sybase 15 ASE and we are also attempting to upgrade these reports to Crystal Reports version 11.5.  The reports will convert to version 11.5 but will not work because of the following error:
    "Failed to load database information. Details: The database connector 'crdb_p2ssyb10.dll' could not be loaded"
    From previous experience we know that Sybase 15 is only supported via ODBC with Crystal Report 11.5, so we tried to change the datasource to ODBC, but when we do that we get the following error:
    The instruction at "0x0e864b11" referenced memory at "0x00000004". The memory could not be "read".
    Do we have any options at this point besides completely re-creating the reports?

    We are using CR 11.5 and trying to change the data source to ODBC.  I can create a new report and use the same stored procedures as a datasource via ODBC with no issues.  I only have an issue when trying to change the data source on these existing reports.
