Append a string while fetching data through a SQL statement

Hi, I have a table with one field that contains information about some deleted files.
When I write a SQL query to fetch that information, the output looks like this:
SELECT DISTINCT filedname  FROM TB_JOBCOMP_DBS40
Assinid
complte_date
pr_date
but I need the output to look like this:
Assinid -------------------->deleted
complte_date -------------------->deleted
pr_date -------------------->deleted
Is there any way to write the SQL to fetch the information in this form?

Are you looking for concatenation?
SQL> select concat(ename,'----- Names Of employee') from emp
  2  WHERE rownum<3
  3  ;
CONCAT(ENAME,'-----NAMESOFEMPLOYEE')
SMITH----- Names Of employee
ALLEN----- Names Of employee
SQL> select ename||'----- Names Of employee' from emp
  2  WHERE rownum<3;
ENAME||'-----NAMESOFEMPLOYEE'
SMITH----- Names Of employee
ALLEN----- Names Of employee
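Applied to the table from the original question (keeping the posted column and table names as-is; the alias is just illustrative), the same concatenation should give the requested output. A minimal sketch:
SQL> SELECT DISTINCT filedname || ' -------------------->deleted' AS deleted_field
  2  FROM TB_JOBCOMP_DBS40;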

Similar Messages

  • Slow Speed While fetching data from SQL 2008 using DoQuery.

    Hello,
    I am working on an add-on and tried to use DoQuery to fetch data from SQL 2008 in C#.
    There are around 148 records that fulfill the query condition, but it takes a long time to fetch the data.
    I want to know whether there is any problem in this code that is making my application slower.
    I used breakpoints and checked; I found that the time is being spent while connecting to the server.
    Code:
    // Get an initialized SBObob object
    oSBObob = (SAPbobsCOM.SBObob)oCompany.GetBusinessObject(SAPbobsCOM.BoObjectTypes.BoBridge);
    // Get an initialized Recordset object
    oRecordset = (SAPbobsCOM.Recordset)oCompany.GetBusinessObject(SAPbobsCOM.BoObjectTypes.BoRecordset);
    // Released production orders ('R') that do not yet have a goods-issue line in IGE1
    string sqlstring = "select DocEntry,ItemCode From OWOR  where OWOR.Status='R' and DocEntry not in ( Select distinct(BaseRef) from IGE1 where IGE1.BaseRef = OWOR.DocEntry)";
    oRecordset.DoQuery(sqlstring);
    var ProductList = new BindingList<KeyValuePair<string, string>>();
    ProductList.Add(new KeyValuePair<string, string>("", "---Please Select---"));
    while (!oRecordset.EoF)
    {
        // Key = DocEntry, Value = "DocEntry ( ItemCode )"
        ProductList.Add(new KeyValuePair<string, string>(
            oRecordset.Fields.Item(0).Value.ToString(),
            oRecordset.Fields.Item(0).Value.ToString() + " ( " + oRecordset.Fields.Item(1).Value.ToString() + " ) "));
        oRecordset.MoveNext();
    }
    cmbProductionOrder.ValueMember = "Key";
    cmbProductionOrder.DisplayMember = "Value";
    Thanks and Regards,
    Ravi Sharma

    Hi Ravi,
    Your code and query look correct, but can you elaborate a little bit?
    It seems to be a DI API program (no UI API)?
    When you say "I found that the time is being spent while connecting to the server", do you mean the recordset query or the DI API connection to SBO? The latter would be "normal", since the connection can take up to 30 seconds.
    To get data it is usually better to use direct SQL connections.
    regards,
    Maik
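    One more thing worth trying is the query itself: the correlated NOT IN can usually be rewritten as NOT EXISTS, which SQL Server often optimizes better. A sketch using the same tables and columns from the post (untested against SBO, so please verify the results match):
    SELECT DocEntry, ItemCode
      FROM OWOR
     WHERE OWOR.Status = 'R'
       AND NOT EXISTS (SELECT 1
                         FROM IGE1
                        WHERE IGE1.BaseRef = OWOR.DocEntry);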

  • Only 1st character while fetching data through get_char_property

    Hi
    I am reading an Excel file through an OLE2 object. While fetching data through this utility, it fetches only the first character of the string.
    procedure read_chr_cell( p_row number, p_col number, p_value in out varchar2 ) is
      args ole2.list_type;
      cell ole2.obj_type;
    begin
      args := ole2.create_arglist;
      ole2.add_arg(args, p_row );
      ole2.add_arg(args, p_col );
      cell := ole2.get_obj_property(worksheet,'Cells',args);
      ole2.destroy_arglist(args);
      p_value := ole2.get_char_property(cell,'Value'); -- [[[ PROBLEM AREA ]]]
      ole2.release_obj(cell);
    end;
    Any advice is appreciated.
    Thanks
    Vishal

    Hi Vishal
    I think you need to loop through the records :)
    Hope this helps...
    Regards,
    Amatu Allah

  • SSRS reports running slow while fetching data from SQL server DB

    We are currently facing issues with a lot of our SSRS reports running very slow. They are taking approximately 15-20 minutes, compared to 2-3 minutes earlier.
    Also, the database the SSRS reports fetch their data from is acting as a subscriber for a reporting server. That is, we have an OLTP DB server whose database, say 'A', is replicated onto the server hosting database 'B'. The SSRS
    reports fetch their required data from B, and these are the reports running very slow.
    As a start, I have checked that indexes and stats are properly updated, and they are.
    I am not sure now where to start analysing the issue.
    I believe replication is not the issue, as we had replication set up earlier as well and never saw this slowness.
    Is memory or IO causing a problem? Please help, as I am not sure how to find a fix for this.
    Thanks in advance!

    Hi,
    You need to first find out exactly what is slow. Is the whole instance slow? How about other queries/replication against those databases? Are they slow?
    Check the reports that are slow, find the slowest ones, and then start looking at execution plans. Check if any blocking is happening while the reports run. Also try running the reports manually and see if there is a difference
    in execution time. Take a look at the wait stats while the reports are running.
    Read this to get a better understanding:
    https://technet.microsoft.com/en-us/library/ms177500%28v=sql.105%29.aspx?f=255&MSPPError=-2147217396
    Regards, Ashwin Menon My Blog - http:\\sqllearnings.com
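    For the wait stats check mentioned above, a simple starting point is the sys.dm_os_wait_stats DMV (it is cumulative since the last restart or clear, so compare snapshots taken before and while the reports run):
    SELECT TOP 20 wait_type, waiting_tasks_count, wait_time_ms
      FROM sys.dm_os_wait_stats
     ORDER BY wait_time_ms DESC;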

  • Problem getting the selectedItem in tree while fetching data through HTTPService

    I have an MXML file in which I want to display data in a tree structure. The data supplied to the tree is fetched from a database through HTTPService.
    the mxml file content is :
    <?xml version="1.0" encoding="utf-8"?>
    <mx:Application xmlns:mx="http://www.adobe.com/2006/mxml" layout="absolute"
        creationComplete="getMenu.send();">
        <mx:HTTPService id="getMenu" method="GET"
            url="http://172.17.26.55:7001/frmwk/MenuData"
            resultFormat="e4x" result="resultHandler(event)" useProxy="false"/>
        <mx:Script>
            <![CDATA[
                import mx.controls.Alert;
                import mx.rpc.events.FaultEvent;
                import mx.rpc.events.ResultEvent;
                import mx.collections.XMLListCollection;
                import mx.events.CloseEvent;

                private var companyData:XML = new XML();
                private var Menu:XMLListCollection;

                // store the HTTPService result and feed it to the tree
                private function resultHandler(event:ResultEvent):void {
                    companyData = event.result as XML;
                    Menu = new XMLListCollection(companyData.MENU);
                    tree.dataProvider = Menu;
                }

                private function getSelected():void {
                    Alert.show("selected--->" + tree.selectedItem);
                }
            ]]>
        </mx:Script>
        <mx:Label text="Tree with XML data"/>
        <mx:Tree id="tree" top="100" left="400" labelField="@title" height="224" width="179"
            dragEnabled="false" dropEnabled="false" allowMultipleSelection="true"/>
        <mx:Button label="get Selected item" click="getSelected()" x="400" y="450"/>
    </mx:Application>
    When I try to get the selected item, it gives me null.
    Please help !!
    thanks in advance.

    How are you selecting the item? How many items are you selecting?

  • Hyperion IR : Getting out of memory error while fetching data for whole year through web client (Workspace)

    Hi,
    While fetching data through the IR web client from Workspace for a year (all 12 months), I am getting the error "Out of Memory. Advice: Close other applications or windows and try again".
    If I try the same through IR Studio, it does not give any output and just shows me the same reporting front page.
    If I select periods up to 8 months, it gives the required data in both the IR web client and IR Studio.
    Could you please suggest how we can resolve this issue?
    Thanks,
    D.N.Rana

    Issue cause:
    Sometimes this is due to excessive data, which brings the size of the BQY file up to around one gigabyte uncompressed (processing may take twice that in actual RAM, plus the memory space for the plugin, and the typical per-process memory limit on a 32-bit system is 2 gigabytes).
    Solution :
    To avoid excessive BQY size exceeding memory availability:
    Ensure that your computer has at least 2 GB of free RAM before running IR Studio.
    Put a limit on the number of rows that can be pulled down: right-click the Request label of the Query section and put a value in Return First xxx Rows (and check the check box).
    Do not pull down more than 750 MB of data (remember it may be duplicated while processing).
    Place limits or aggregations in the Query section (as opposed to the Results section) to limit the data entering the BQY.

  • How to use a FOR ALL ENTRIES clause while fetching data from archived tables

    How do I use a FOR ALL ENTRIES clause while fetching data from archived tables using the FM
    '/PBS/SELECT_INTO_TABLE'?
    I need to fetch data from an Archived table for all the entries in an internal table.
    Kindly provide some inputs for the same.
    thanks n Regards
    Ramesh

    Hi Ramesh,
    I have a query regarding accessing archived data through PBS.
    I have archived SAP FI data (object FI_DOCUMNT) using the SAP standard process through TCODE SARA.
    Now please tell me, can I access this archived data through the PBS add-on FM '/PBS/SELECT_INTO_TABLE'?
    Do I need to do something else to access data archived through the SAP standard process or not? If yes, then please tell me, as I am not able to get the data using the above FM.
    The call to the above FM is as follows :
    CALL FUNCTION '/PBS/SELECT_INTO_TABLE'
      EXPORTING
        archiv     = 'CFI'
        option     = ''
        tabname    = 'BKPF'
        schl1_name = 'BELNR'
        schl1_von  = belnr-low
        schl1_bis  = belnr-low
        schl2_name = 'GJAHR'
        schl2_von  = gjahr-low
        schl2_bis  = gjahr-low
        schl3_name = 'BUKRS'
        schl3_von  = bukrs-low
        schl3_bis  = bukrs-low
        clr_itab   = 'X'
      " optional parameters schl4_name/schl4_von/schl4_bis and max_zahl left unused
      TABLES
        i_tabelle  = t_bkpf
      " optional range tables schl1_in .. schl4_in left unused
      EXCEPTIONS
        eof        = 1
        OTHERS     = 2.
    It gives me the following error :
    Index for table not supported ! BKPF BELNR.
    Please help ASAP.
    Thanks and Regards
    Gurpreet Singh

  • Eliminate duplicate while fetching data from source

    Hi All,
    CUSTOMER TRANSACTION
    CUST_LOC     CUT_ID          TRANSACTION_DATE     TRANSACTION_TYPE
    100          12345          01-jan-2009          CREDIT
    100          23456          15-jan-2000          CREDIT
    100          12345          01-jan-2010          DEBIT
    100          12345          01-jan-2000          DEBIT
    Now, as per my requirement, I need to fetch data from the CUSTOMER_TRANSACTION table for those customers which have had a transaction in the last 10 years. In my data above, customer 12345 has transactions in the last 10 years, whereas customer 23456 does not, so it will be eliminated.
    Now, the CUSTOMER_TRANSACTION table has approximately 100 million records, so we are fetching data in batches. Batching is divided into months, 120 months in total. Below is my query.
    select *
    FROM CUSTOMER_TRANSACTION CT left outer join
    (select distinct CUST_LOC, CUT_ID FROM CUSTOMER_TRANSACTION WHERE TRANSACTION_DATE >= ADD_MONTHS(SYSDATE, -120) and TRANSACTION_DATE < ADD_MONTHS(SYSDATE, -119)) CUST
    on CT.CUST_LOC = CUST.CUST_LOC and CT.CUT_ID = CUST.CUT_ID
    Through the shell script, the month offsets will change: -120:-119, -119:-118, ..., -1:0.
    Now the problem is duplication of records.
    While fetching data for jan-2009, it will get cust_id 12345 and will fetch all 3 records and load them into the target.
    While fetching data for jan-2010, it will again get cust_id 12345 and will fetch all 3 records and load them into the target.
    So instead of having only 3 records for customer 12345, the target will have 6. Can someone help me with how I can keep duplicate records from getting in?
    As of now i have 2 ways in mind.
    1. Fetch all records at once. Which is impossible as it will give space issue.
    2. After each batch, run a procedure which will delete duplicate records based on cust_loc, cut_id and transaction_date. But again it will have performance problem.
    I want to eliminate it while fetching data from source.
    Edited by: ace_friends22 on Apr 6, 2011 10:16 AM

    You can do it this way....
    SELECT DISTINCT cust_loc,
                    cut_id
      FROM customer_transaction
    WHERE transaction_date >= ADD_MONTHS(SYSDATE, -120)
       AND transaction_date < ADD_MONTHS(SYSDATE, -119)
    However, please note that if you want to get the transactions of a calendar month, like the jan-2009 and jan-2010 you mentioned earlier, you might need to use TRUNC.
    Your date comparison could be like this... In this example I am checking if the transaction date is in the month of jan-2009
    AND transaction_date BETWEEN ADD_MONTHS(TRUNC(SYSDATE,'MONTH'), -27) AND LAST_DAY(ADD_MONTHS(TRUNC(SYSDATE,'MONTH'), -27))
    Your modified SQL:
    SELECT *
      FROM customer_transaction 
    WHERE transaction_date BETWEEN ADD_MONTHS(TRUNC(SYSDATE,'MONTH'), -27) AND LAST_DAY(ADD_MONTHS(TRUNC(SYSDATE,'MONTH'), -27))
    Testing...
    --Sample Data
    CREATE TABLE customer_transaction (
    cust_loc number,
    cut_id number,
    transaction_date date,
    transaction_type varchar2(20)
    );
    INSERT INTO customer_transaction VALUES (100,12345,TO_DATE('01-JAN-2009','dd-MON-yyyy'),'CREDIT');
    INSERT INTO customer_transaction VALUES (100,23456,TO_DATE('15-JAN-2000','dd-MON-yyyy'),'CREDIT');
    INSERT INTO customer_transaction VALUES (100,12345,TO_DATE('01-JAN-2010','dd-MON-yyyy'),'DEBIT');
    INSERT INTO customer_transaction VALUES (100,12345,TO_DATE('01-JAN-2000','dd-MON-yyyy'),'DEBIT');
    --To have three records in the month of jan-2009
    UPDATE customer_transaction
       SET transaction_date = TO_DATE('02-JAN-2009','dd-MON-yyyy')
    WHERE cut_id = 12345
       AND transaction_date = TO_DATE('01-JAN-2010','dd-MON-yyyy');
    UPDATE customer_transaction
       SET transaction_date = TO_DATE('03-JAN-2009','dd-MON-yyyy')
    WHERE cut_id = 12345
       AND transaction_date = TO_DATE('01-JAN-2000','dd-MON-yyyy');
    commit;
    --End of sample data
    SELECT *
      FROM customer_transaction 
    WHERE transaction_date BETWEEN ADD_MONTHS(TRUNC(SYSDATE,'MONTH'), -27) AND LAST_DAY(ADD_MONTHS(TRUNC(SYSDATE,'MONTH'), -27));
    Results:
    CUST_LOC     CUT_ID TRANSACTI TRANSACTION_TYPE
          100      12345 01-JAN-09 CREDIT
          100      12345 02-JAN-09 DEBIT
          100      12345 03-JAN-09 DEBIT
    As you can see, there are only 3 records for 12345.
    Regards,
    Rakesh
    Edited by: Rakesh on Apr 6, 2011 11:48 AM
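    Another option, staying with the monthly batching, is to assign each qualifying customer to exactly one batch, for example the month containing that customer's most recent transaction, so a customer's rows are only ever fetched once. A sketch with the same table and column names (untested):
    SELECT ct.*
      FROM customer_transaction ct
      JOIN (SELECT cust_loc, cut_id
              FROM customer_transaction
             GROUP BY cust_loc, cut_id
            HAVING MAX(transaction_date) >= ADD_MONTHS(SYSDATE, -120)
               AND MAX(transaction_date) <  ADD_MONTHS(SYSDATE, -119)) cust
        ON ct.cust_loc = cust.cust_loc
       AND ct.cut_id   = cust.cut_id;
    As before, the shell script would shift the -120/-119 offsets for each batch.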

  • How can we improve the performance while fetching data from RESB table.

    Hi All,
    Can anybody suggest the right way to improve performance while fetching data from the RESB table? Below is the select statement.
    SELECT aufnr posnr roms1 roanz
        INTO (itab-aufnr, itab-pposnr, itab-roms1, itab-roanz)
        FROM resb
        WHERE kdauf  = p_vbeln
        AND   ablad  = itab-sposnr+2.
    Here I am using 'KDAUF' and 'ABLAD' in the condition. Can we use a secondary index to improve the performance in this case?
    Regards,
    Himanshu

    Hi ,
    Declare an internal table with only those four fields
    and try the code below:
    SELECT aufnr posnr roms1 roanz
      INTO TABLE itab
      FROM resb
      WHERE kdauf = p_vbeln
        AND ablad = itab-sposnr+2.
    Yes, you can also use a secondary index to improve the performance in this case.
    Regards,
    Anand .
    Reward if it is useful....

  • Fatal error while fetching data from bi

    hi,
    I am getting the following error while fetching data from BI using a select statement.
    I have written the code this way:
    SELECT  [Measures].[D2GFTNHIOMI7KWV99SD7GPLTU] ON COLUMNS, NON EMPTY { [DEM_STATE].MEMBERS} ON ROWS FROM DEM_CUBE/TEST_F_8
    Error description when I click on Test:
    Fatal Error
    com.lighthammer.webservice.SoapException: The XML for Analysis provider encountered an error

    Thanks for answering. But when I tried writing the statement in transaction 'MDXTEST' and clicked on Check, I got the following error:
    Error occurred when starting the parser: timeout during allocate / CPIC-CALL: 'ThSAPCMRCV'
    Message no. BRAINOLAPAPI011
    Diagnosis
    Failed to start the MDX parser.
    System Response
    timeout during allocate / CPIC-CALL: 'ThSAPCMRCV'
    Procedure
    Check the Sys Log in Transaction SM21 and test the TCP-IP connection MDX_PARSER in Transaction SM59.
    So I went into SM59 to check the connection.
    Can you tell me what configuration I need to do to make the select statements work?

  • Facing problem in Dashboard 4.1 while fetching data from BEx Query

    Hi Experts,
    I am facing an error message "Failed to (de-)serialise data. (Xsl 000004)" while fetching data in Dashboards from a BEx query.
    The query connects, but while dragging and dropping some dimensions and measures and then going for Refresh or Run Query, I get this error.
    The same query works fine with other components like WebI and Crystal.
    If anybody has a solution for this, please let me know. I am stuck somewhere.
    Thank You

    Hi,
    Check the data in the InfoProvider. Reduce the BEx query characteristics and key figure fields. Try to identify which characteristic added in the BEx query is causing the issue, and check that characteristic's data in the InfoProvider.
    Regards,
    Venkat

  • While importing data through FF_5 I am getting an error

    hello
    While importing data through FF_5 I am getting an error as below:
    message error FV150 and FV151
    'Termination in statement no. 00009 of acct
    1101200570116; closing record 62F missing'
    So please give a solution.
    Thanks in advance
    SIRI

    Dear Siri,
    I guess you are importing an MT940 format. This format should have the closing balance at the end.
    This closing balance starts with :62F:
    A sample is
    :61:0801180118DR3835,97NTRF000000//000000//
    :86:1022  LTD CHENNAI18012008
    :61:0801180118DR69885,09NCHK850819//850819//
    :86:6101  LTD COCHIN18012008
    :62F:C080118INR7210048,86
    I guess that is missing in the import file.
    Maintain that and the import will happen.
    Assign points if useful
    regards
    Venkatesh

  • Compare date in SQL statement

    Yup... how can I compare dates in a SQL statement?
    Please give me a complete example.

    I'd think this is a formatting problem. Why not try:
    PreparedStatement ps = myConnection.prepareStatement(
    "SELECT * FROM Receipt WHERE to_date(Date) > ? ");
    ps.setDate(1,TodayDate);
    ResultSet rs = ps.executeQuery();
    HTH,
    Ken
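    If the comparison can be done entirely on the database side instead, an Oracle-style date comparison looks like this (the column name receipt_date is hypothetical, since DATE by itself is a reserved word):
    SELECT *
      FROM Receipt
     WHERE receipt_date > TO_DATE('2010-01-01', 'YYYY-MM-DD');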

  • How to parse XML data in a SQL statement??

    Hi friends, I have a table containing a CLOB column that stores data in XML format. For example, my column contains XML data like this:
    <Employees xmlns="http://TargetNamespace.com/read_emp">
       <C1>106</C1>
       <C2>Harish</C2>
       <C3>1998-05-12</C3>
       <C4>HR</C4>
       <C5>1600</C5>
       <C6>10</C6>
    </Employees>
    Then how do I extract the data from the above XML column using a SQL statement?

    Duplicate post
    How to parsing xml data in sql statement??
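    For reference, one way to pull those element values out of the CLOB in Oracle is XMLTABLE. A minimal sketch assuming a hypothetical table EMP_XML with CLOB column XML_DATA (the default namespace is the one shown in the question):
    SELECT x.*
      FROM emp_xml t,
           XMLTABLE(XMLNAMESPACES(DEFAULT 'http://TargetNamespace.com/read_emp'),
                    '/Employees'
                    PASSING XMLTYPE(t.xml_data)
                    COLUMNS c1 NUMBER       PATH 'C1',
                            c2 VARCHAR2(50) PATH 'C2',
                            c3 VARCHAR2(10) PATH 'C3',
                            c4 VARCHAR2(10) PATH 'C4',
                            c5 NUMBER       PATH 'C5',
                            c6 NUMBER       PATH 'C6') x;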

  • Error while Loading data through .csv file

    Hi,
    I am getting the date error below when loading data into OLAP tables through a .csv file.
    Data stored in .csv is 20071113121100.
    TRANSF_1_1_1> CMN_1761 Timestamp Event: [Mon Mar 29 15:06:17 2010]
    TRANSF_1_1_1> TE_7007 Transformation Evaluation Error [<<Expression Error>> [TO_DATE]: invalid string for converting to Date
    ... t:TO_DATE(u:'2.00711E+13',u:'YYYYMMDDHH24MISS')]
    TRANSF_1_1_1> CMN_1761 Timestamp Event: [Mon Mar 29 15:06:17 2010]
    TRANSF_1_1_1> TT_11132 Transformation [Exp_FILE_CHNL_TYPE] had an error evaluating output column [CREATED_ON_DT_OUT]. Error message is [<<Expression Error>> [TO_DATE]: invalid string for converting to Date
    ... t:TO_DATE(u:'2.00711E+13',u:'YYYYMMDDHH24MISS')].
    TRANSF_1_1_1> CMN_1761 Timestamp Event: [Mon Mar 29 15:06:17 2010]
    TRANSF_1_1_1> TT_11019 There is an error in the port [CREATED_ON_DT_OUT]: The default value for the port is set to: ERROR(<<Expression Error>> [ERROR]: transformation error
    ... nl:ERROR(u:'transformation error')).
    TRANSF_1_1_1> CMN_1761 Timestamp Event: [Mon Mar 29 15:06:17 2010]
    TRANSF_1_1_1> TT_11021 An error occurred moving data from the transformation Exp_FILE_CHNL_TYPE: to the transformation W_CHNL_TYPE_DS.
    TRANSF_1_1_1> CMN_1761 Timestamp Event: [Mon Mar 29 15:06:17 2010]
    TRANSF_1_1_1> CMN_1086 Exp_FILE_CHNL_TYPE: Number of errors exceeded threshold [1].
    Any help is greatly appreciated.
    Thanks,
    Poojak

    1) Wrong forum; you won't get much support for loading OLAP cubes in here, I think.
    2) Has your CSV file been anywhere near Excel by chance? The conversion of 20071113121100 to 2.00711E+13 looks very much like what I see when Excel has an invalid number mask/precision specified for a cell.
    *** We are using EBS. The file was already generated by consultants through a SQL query. We are getting the error while loading through Informatica. The target table is set up as a date datatype and the source is String(19).
    The expression in Informatica is set up as below:
    IIF(ISNULL(CREATED_ON_DT), NULL, TO_DATE(CREATED_ON_DT, 'YYYYMMDDHH24MISS'))
    Thanks,
    Poojak
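    For what it's worth, the format mask itself matches the raw value; it is only the Excel-mangled '2.00711E+13' string that fails. A quick Oracle-side sanity check of the mask (illustrative only, since the actual conversion here happens inside Informatica):
    SELECT TO_DATE('20071113121100', 'YYYYMMDDHH24MISS') AS created_on
      FROM dual;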
