Subtract date from date?

Hello,
I'm trying to subtract a date from SYSDATE and format the result like 'HH24:MI:SS'.
SELECT ID
, to_char (DLC,'DD.MM.YY HH24:MI:SS') as DateLastChange
, to_char (sysdate,'DD.MM.YY HH24:MI:SS') as ToDay
, to_char ((to_date(SYSDATE,'DD.MM.YY HH24:MI:SS') - to_date (DLC ,'DD.MM.YY HH24:MI:SS'))) as Time
FROM TEST_TABLE
result:
ID | DateLastChange | ToDay |Time
1045 | 14.02.03 17:24:04 | 27.02.03 12:29:30 | 10
but only full days are displayed.... I need days and minutes.
Thanks for the help

Hi Rainer,
if you are using Oracle9i, you can use a Day to Second interval.
SELECT sysdate, hiredate, NUMTODSINTERVAL( TO_CHAR( sysdate - hiredate ), 'day' ) FROM emp;
It even gives you the fractional-second difference, up to a precision of 9 digits.
SQL> insert into emp(empno,hiredate) values( 100, sysdate-1);
SQL> select NUMTODSINTERVAL(to_char(sysdate-hiredate),'day') from emp where empno=100;
NUMTODSINTERVAL(TO_CHAR(SYSDATE-HIREDATE),'DAY')
+000000001 00:03:13.000000000
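The same days-plus-HH24:MI:SS breakdown can also be computed outside the database; here is a minimal sketch in Java with java.time (hypothetical class name, using the timestamps from the question above):

```java
import java.time.Duration;
import java.time.LocalDateTime;

public class DateDiff {
    // Break the gap between two timestamps into days and HH:MI:SS,
    // instead of the whole-day count that the original query displays.
    static String breakdown(LocalDateTime from, LocalDateTime to) {
        Duration d = Duration.between(from, to);
        return String.format("%d %02d:%02d:%02d",
                d.toDays(), d.toHours() % 24, d.toMinutes() % 60, d.getSeconds() % 60);
    }

    public static void main(String[] args) {
        LocalDateTime dlc = LocalDateTime.of(2003, 2, 14, 17, 24, 4);   // DateLastChange
        LocalDateTime now = LocalDateTime.of(2003, 2, 27, 12, 29, 30);  // "ToDay"
        System.out.println(breakdown(dlc, now)); // prints "12 19:05:26"
    }
}
```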
Regards
Elango.

Similar Messages

  • Unable to access the data from Data Management Gateway: Query timeout expired

    Hi,
    For the last 2-3 days the data refresh has been failing on our PowerBI site. I checked the following:
    1. The gateway is in running status.
    2. Data source is also in ready status and test connection worked fine too.
    3. Below is the error in System Health -
    Failed to refresh the data source. An internal service error has occurred. Retry the operation at a later time. If the problem persists, contact Microsoft support for further assistance.        
    Error code: 4025
    4. Below is the error in Event Viewer.
    Unable to access the data from Data Management Gateway: Query timeout expired. Please check 1) whether the data source is available 2) whether the gateway on-premises service is running using Windows Event Logs.
    5. This is the correlation ID for the latest refresh failure:
    f9030dd8-af4c-4225-8674-50ce85a770d0
    6. The Refresh History error is:
    Errors in the high-level relational engine. The following exception occurred while the managed IDataReader interface was being used: The operation has timed out. Errors in the high-level relational engine. The following exception occurred while the
    managed IDataReader interface was being used: Query timeout expired. 
    Any idea what could have gone wrong suddenly? Everything was working fine for the last month.
    Thanks,
    Richa

    Never mind, I figured out there was a lock on a SQL table which caused all the problems. Once I released the lock, the PowerPivot refresh started working fine.
    Thanks.

  • Error while extracting data from data source 0RT_PA_TRAN_CONTROL, in RSA7

    Hi Gurus,
    I'm getting the below error while extracting data from data source 0RT_PA_TRAN_CONTROL in RSA7. (Actually this is an IS Retail DataSource used to push POSDM data into BI cubes.)
    The error is:
    Update mode "Full Upload" is not supported by the extraction API
    Message no. R3011
    Diagnosis
    The application program for the extraction of the data was called using update mode "Full Upload". However, this is not supported by the InfoSource.
    System Response
    The data extraction is terminated.
    Procedure
    Check for relevant OSS Notes, or send a problem message of your own.
    Your help in this regard would be highly appreciated.
    Thanks,
    David.

    Hi David,
    I have no experience with IS Retail data sources, but as the message clearly says, this DataSource is not supposed to be run in Full mode.
    Try switching your DTPs/InfoPackages to Delta mode.
    When checking the extraction in the source system with transaction RSA3 (Extractor Checker), switch the Update mode field to Delta as well.
    BR
    m./

  • Runtime error when Transfering data from data object to a file

    Hi everybody,
    I'm having a problem when I transfer data from a data object to a file. The code looks like the following:
    data : full_path(128).
    OPEN DATASET full_path FOR OUTPUT IN TEXT MODE ENCODING DEFAULT.
    and transfer data from a flat structure to the file full_path:
      move:    tab                 to c_output-tab_5,
                  tab                 to c_output-tab_4,
                  tab                 to c_output-tab_3.
      transfer c_output to full_path.      " error line
    The detail error like the following:
    For the statement
       "TRANSFER f TO ..."
    only character-type data objects are supported at the argument position
    "f".
    In this case, the operand "f" has the non-character-type "u". The
    current program is a Unicode program. In the Unicode context, the type
    'X' or structures containing not only character-type components are
    regarded as non-character-type.
    transfer c_output to full_path. " error line
    Please help me to fix this issue !
    Thank you in advance !
    Edited by: Hai Nguyen on Mar 4, 2009 10:55 AM

    Hi Mickey,
    Thanks for your answer,
    I found out that the structure c_output has a field with data type X; I think that is the cause of the issue.
    data: begin of c_output,
            vbeln(10),
            tab_5 like tab,
            posnr(6),
            tab_4 like tab,
            topmat(18),
            tab_3 like tab,
          end of c_output.
    data: tab type x value 9.
    Could you tell me how to fix it ? What I have to do in this situation ?
    Thank you very much !

  • When to refresh Servlet data from Data Base

    Hello all,
    I have a servlet that retrieves a few hundred thousand records from a database table.
    The data in the table is updated once or twice every week.
    The same servlet instance serves all users, who access the servlet many times a day.
    I would like to avoid retrieving the data from the database on each servlet access,
    and instead have all users share the data already retrieved and kept in servlet members.
    First, what is the best way to avoid a database retrieval on each servlet access?
    And how could I have some kind of trigger that refreshes the servlet data from the database every few days?
    Thanks in advance for every idea.
    Ami

    Java_A wrote:
    Thanks Saish for your reply.
    I'm not using DAO in my application but retrieve the data from a BI database using a web service; the response time querying the BI database is not quick enough,
    so I wouldn't want to query the BI server on each servlet access.
    Because the data I retrieved at the beginning using the web service contains all the required data for all servlet requests, I thought to store the data (~200K rows) once in the servlet and use it for all requests.
    Why not store the results locally in your own database after you fetch them?
    This still leave me with the questions: in which event should I query the BI data, and also when or in which event should I update the data again from BI server?
    Query at startup, on user demand, or when the data becomes stale. It depends on your requirements.
    Thanks
    Ami
    - Saish
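    The "query at startup / refresh when data becomes stale" idea can be sketched as a small holder class (hypothetical names; the Supplier stands in for the BI web-service call):

```java
import java.util.concurrent.TimeUnit;
import java.util.function.Supplier;

// Hypothetical cache holder: all requests share one snapshot of the data,
// and the loader is called again only when the snapshot is older than the TTL.
public class CachedData<T> {
    private final Supplier<T> loader;   // e.g. the BI web-service fetch
    private final long ttlMillis;
    private T snapshot;
    private long loadedAt;

    public CachedData(Supplier<T> loader, long ttl, TimeUnit unit) {
        this.loader = loader;
        this.ttlMillis = unit.toMillis(ttl);
    }

    public synchronized T get() {
        long now = System.currentTimeMillis();
        if (snapshot == null || now - loadedAt > ttlMillis) {
            snapshot = loader.get();    // refresh only when stale
            loadedAt = now;
        }
        return snapshot;
    }
}
```

    A servlet would keep one such instance in a field (created in init()) with a TTL of a few days, so every request reuses the same ~200K-row snapshot until it goes stale.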

  • Delete Transaction Data from date to date

    Hi All,
    We want to delete transactional data from one date to another.
    Is there any way to delete data for a from-date/to-date range?
    We are aware of following tcodes
    OBR1- Reset transaction data
    CXDL - Delete transaction data from ledger
    But there is no period / from-date-to-date option available.
    Example:
    We are in 2010 now and we want to delete data from 2005-2007, and we don't want to archive it.
    Thanks in advance
    Regards,
    MS

    Hi Eli,
    Thanks for the reply,
    Yes, you are right, it's not proper to delete data based on the period... but we have such a scenario.
    Let me get some other opinion
    Regards,
    MS

  • Error while loading Reported Financial Data from Data Stream

    Hi Guys,
    I'm facing the following error while loading Reported Financial Data from Data Stream:
    Message no. UCD1003: Item "Blank" is not defined in Cons Chart of Accts 01
    The message appears in the target data. Item is not filled in almost 50% of the target data records, hence the error message.
    Upon deeper analysis I found that some items are defined with a Dr./Cr. sign of + and with no breakdown. When these items appear as negative (Cr.) in the source data, they are not properly loaded to the target data: Item is not filled up, which causes the error.
    For example, item "114190 - Prepayments" is defined with a + Debit/Credit sign. When it is posted as negative/credit in the source data, it is not properly written to the target.
    Should I define a breakdown category for these items? I think there's something wrong with the item definitions, or I'm missing something...
    I would highly appreciate your quick assistance in this.
    Kind regards,
    Amir

    Found the answer in OSS Note 642591.
    Thanks

  • Web Template is not able to fetch data from Data Provider

    hi friends,
    I have created a reporting agent setting for a particular query, given all the necessary parameters, defined the variants, activated it, and created a scheduling package to which I assigned my query.
    Later, in the web template (Web Item tab), I maintained the read mode as "precalculated web template", but the final report on the web says the web template is not able to get the data from the data provider.
    Am I missing anything?
    I look forward to your help.
    regards,
    sasidhar gunturu

    Hi,
    use the correct link to call the report.
    you can check the links : Re: Concept Of Precalculated Template
    https://www.sdn.sap.com/irj/sdn/developerareas/bi?rid=/webcontent/uuid/a8cd1f71-0a01-0010-4783-f119b6132d25 [original link is broken]
    Regards
    Happy Tony

  • Error when extracting data from Data Source

    Hi All,
    I have a generic DataSource in BW which is based on a function module. When I extract data from this source in RSA3, it ends in a runtime error.
    the error message is
    The current ABAP program "SAPLYBWPU" had to be terminated because one of the
    statements could not be executed.
    and the error analysis is
    You wanted to add an entry to table "\FUNCTION-POOL=YBWPU\DATA=GT_HIERTAB2", which you declared with a UNIQUE KEY. However, there was already an entry with the same key.
    This may have been in an INSERT or MOVE statement, or within a
    SELECT ... INTO statement.
    In particular, you cannot insert more than one initial line into a
    table with a unique key using the INSERT INITIAL LINE... statement.
    Can anybody explain how I can resolve this?
    Thanks
    Sreeja

    Hello
    You can refer following link.
    http://www.sdn.sap.com/irj/scn/index?rid=/library/uuid/a0f46157-e1c4-2910-27aa-e3f4a9c8df33&overridelayout=true

  • Performance problem in select data from data base

    hello all,
    Could you please suggest which select statement is good for fetching data from the database if the table contains more than 10 lakh (1,000,000) records?
    I am using a SELECT ... PACKAGE SIZE n statement, but it's taking a lot of time.
    with best regards
    srinivas rathod

    Hi Srinivas,
    if you are selecting huge volumes of data, you can reduce the time a little with better techniques.
    I do not think SELECT ... PACKAGE SIZE by itself will give good performance.
    see the below examples :
    ABAP Code Samples for Simple Performance Tuning Techniques
    1. Query including select and sorting functionality
    Code A
    tables: mara, mast.
    data: begin of itab_new occurs 0,
            matnr like mara-matnr,
            ernam like mara-ernam,
            mtart like mara-mtart,
            matkl like mara-matkl,
            werks like mast-werks,
            aenam like mast-aenam,
            stlal like mast-stlal,
          end of itab_new.
    select f~matnr f~ernam f~mtart f~matkl g~werks g~aenam g~stlal
    into table itab_new from mara as f inner join mast as g on
    f~matnr = g~matnr where g~stlal = '01' order by f~ernam.
    Code B
    tables: mara, mast.
    data: begin of itab_new occurs 0,
          matnr like mara-matnr,
          ernam like mara-ernam,
          mtart like mara-mtart,
          matkl like mara-matkl,
          werks like mast-werks,
          aenam like mast-aenam,
          stlal like mast-stlal,
    end of itab_new.
    select f~matnr f~ernam f~mtart f~matkl g~werks g~aenam g~stlal
    into table itab_new from mara as f inner join mast as g on f~matnr =
    g~matnr where g~stlal = '01'.
    sort itab_new by ernam.
    Both the above codes essentially do the same thing, but the execution time of Code B is considerably less than that of Code A. Reason: the ORDER BY clause increases the execution time of the select statement, so it is cheaper to sort the internal table once after selecting the data.
    2. Performance Improvement Due to Identical Statements – Execution Plan
    Consider the below queries and their relative efficiency in execution time.
    Code C
    tables: mara, mast.
    data: begin of itab_new occurs 0,
          matnr like mara-matnr,
          ernam like mara-ernam,
          mtart like mara-mtart,
          matkl like mara-matkl,
          werks like mast-werks,
          aenam like mast-aenam,
          stlal like mast-stlal,
    end of itab_new.
    select f~matnr f~ernam f~mtart f~matkl g~werks g~aenam g~stlal
    into table itab_new from mara as f inner join mast as g on f~matnr =
    g~matnr where g~stlal = '01' .
    sort itab_new.
    select f~matnr f~ernam
    f~mtart f~matkl g~werks g~aenam g~stlal
    into table itab_new from mara as
    f inner join mast as g on f~matnr =
    g~matnr where g~stlal
    = '01' .
    Code D (Identical Select Statements)
    tables: mara, mast.
    data: begin of itab_new occurs 0,
          matnr like mara-matnr,
          ernam like mara-ernam,
          mtart like mara-mtart,
          matkl like mara-matkl,
          werks like mast-werks,
          aenam like mast-aenam,
          stlal like mast-stlal,
    end of itab_new.
    select f~matnr f~ernam f~mtart f~matkl g~werks g~aenam g~stlal
    into table itab_new from mara as f inner join mast as g on f~matnr =
    g~matnr where g~stlal = '01' .
    sort itab_new.
    select f~matnr f~ernam f~mtart f~matkl g~werks g~aenam g~stlal
    into table itab_new from mara as f inner join mast as g on f~matnr =
    g~matnr where g~stlal = '01' .
    Both the above codes essentially do the same thing, but the execution time of Code D is considerably less than that of Code C. Reason: during execution, each SQL statement is converted through a series of database operation phases. In the second phase (the prepare phase) an "execution plan" is determined for the current SQL statement and stored; if an identical select statement appears anywhere in the program, the stored execution plan is reused, saving time. So keep the text of a select statement identical when it is used more than once in the program.
    3. Reducing Parse Time Using Aliasing
    A statement that does not have a cached execution plan must be parsed before execution, and this parsing phase is highly time- and resource-consuming, so every SQL query should include alias names, for the following reasons:
    1. Providing an alias name enables the query engine to resolve which tables the specified fields belong to.
    2. Providing a short alias name (a single-character alias) is more efficient than providing a long one.
    Code E
    select j~matnr j~ernam j~mtart j~matkl
    g~werks g~aenam g~stlal into table itab_new from mara as
    j inner join mast as g on j~matnr = g~matnr where
    g~stlal = '01' .
    In the above code the alias name used is ‘ j ‘.
    4. Performance Tuning Using Order by Clause
    If a SQL query reads a particular database record based on some key values, the subsequent READ statement can be optimized by ordering the fields in the same order in which they are read.
    Code F
    tables: mara, mast.
    data: begin of itab_new occurs 0,
          matnr like mara-matnr,
          ernam like mara-ernam,
          mtart like mara-mtart,
          matkl like mara-matkl,
          end of itab_new.
    select MATNR ERNAM MTART MATKL from mara into table itab_new where
    MTART = 'HAWA' ORDER BY  MATNR ERNAM  MTART MATKL.
    read table itab_new with key MATNR = 'PAINT1'   ERNAM = 'RAMANUM'
    MTART = 'HAWA'   MATKL = 'OFFICE'.
    Code G
    tables: mara, mast.
    data: begin of itab_new occurs 0,
          matnr like mara-matnr,
          ernam like mara-ernam,
          mtart like mara-mtart,
          matkl like mara-matkl,
          end of itab_new.
    select MATNR ERNAM MTART MATKL from mara into table itab_new where
    MTART = 'HAWA' ORDER BY  ERNAM MATKL MATNR MTART.
    read table itab_new with key MATNR = 'PAINT1'   ERNAM = 'RAMANUM'
    MTART = 'HAWA'   MATKL = 'OFFICE'.
    In code F above, the READ statement following the select has its keys in the order MATNR, ERNAM, MTART, MATKL, so it is less time-intensive when the internal table is ordered the same way as the keys in the READ statement.
    5. Performance Tuning Using Binary Search
    A very simple but useful way to fine-tune the performance of a READ statement is to add BINARY SEARCH to it. If the internal table contains more than about 20 entries, the traditional linear search proves more time-intensive.
    Code H
    select * from mara into corresponding fields of table intab.
    sort intab.     
    read table intab with key matnr = '11530' binary search.
    Code I
    select * from mara into corresponding fields of table intab.
    sort intab.     
    read table intab with key matnr = '11530'.
    Thanks
    Seshu

  • Problem Reading Data from .dat File

    I am trying to read in the data at character columns 384-475 of each line. The data looks something like this:
    198 95 70 71 93-99-99-99-99-99-99-99-99-99-99  5 17 17 17 17-99-99-99-99-99-99-99-99-99-99 1
    78 95 70 69-99-99-99-99-99-99-99-99-99-99-99  6  2  1  1-99-99-99-99-99-99-99-99-99-99-99 2
    70 73-99-99-99-99-99-99-99-99-99-99-99-99-99  0  1-99-99-99-99-99-99-99-99-99-99-99-99-99 1
    This is only the data from columns 384-475. I keep getting an error, though, and I can't figure it out.
    import java.util.*;
    import java.io.*;
    public class ProgrammingExercise1
         public static double findAccuracy(String systemCommand) throws IOException
             Process proc = Runtime.getRuntime().exec(systemCommand);
             Scanner scan = new Scanner(proc.getInputStream());
             PrintStream logOut = new PrintStream(new FileOutputStream("log.dat"));
             while(scan.hasNextLine())
                  logOut.println(scan.nextLine());
             logOut.close();
             scan.close();
             proc.destroy();
             Scanner input = new Scanner(new File("log.dat"));
             String line = "";
             for(int c=1; c<=4; c++)
                  line = input.nextLine();
             double finalAccuracy = Double.parseDouble(line.substring(22,27));
             input.close();
             return finalAccuracy;
         public static void main (String[] args) throws IOException
              try {
                        String s;
                        int count=0;
                        String temp;
                        int aInt;
                        int med;
                        int[] daM = new int[251];
                        String[] pos = new String[251];
                        int[][] da = new int[251][507];
                        int[] daSort = new int[da.length];
                        for (int i=0; i<251; i++)
                             daM[i] = 0;
                             for (int j=0; j<507; j++)
                                  da[i][j] = 0;
              //creating an input stream
                 FileInputStream fstream = new FileInputStream("dataset.dat");
                 DataInputStream in = new DataInputStream(fstream);
                 BufferedReader br = new BufferedReader(new InputStreamReader(in));
                 while((s = br.readLine()) != null)
                //safety check, as the last line is not null but empty
                //break the procedure info out of the string/line
                      String sub = s.substring(384, 475);
                      for (int k=0; k<15; k++)
                           temp = sub.substring(k*3, (k*3)+3);
                             //removes spaces from the string to alleviate integer conversion errors
                           temp = temp.replace(" ", "");
                           aInt = Integer.parseInt(temp);
                           if (aInt > -1)
                                da[count][aInt] = 1;
                      sub = s.substring(260, 264);
                      temp = sub.replace(" ", "");
                      aInt = Integer.parseInt(temp);
                      if (aInt > -1)
                           daSort[count] = aInt;
                           daM[count] = aInt;
                      count++;
                 in.close();
                 bubbleSort(daSort);
                 med = daSort[daSort.length/2];
                 for (int t=0; t<daM.length; t++)
                      if (daM[t]<med)
                           pos[t] = "+1";
                      else
                           pos[t] = "-1";
                      for (int r=0; r<da[t].length; r++)
                           if (da[t][r]==1)
                                pos[t] +=" " + r + ":1.0";
                 Collections.shuffle(Arrays.asList(pos));
                 File file = new File("testSet.dat");
                 PrintWriter output = new PrintWriter(new FileWriter(file));
                 String tempOut = "";
                 for (int ii=0; ii<pos.length/4; ii++)
                      tempOut += pos[ii];
                      output.write(tempOut);
                      tempOut = "";
                      output.write("\n");
                 output.close();
                 file = new File("crossValidationSet.dat");
                 output = new PrintWriter(new FileWriter(file));
                 for (int jj=(pos.length/4); jj<pos.length/2; jj++)
                      tempOut += pos[jj];
                      output.write(tempOut);
                      tempOut = "";
                      output.write("\n");
                 output.close();
                 file = new File("trainingSet.dat");
                 output = new PrintWriter(new FileWriter(file));
                 for (int kk=(pos.length/2); kk<pos.length; kk++)
                      tempOut += pos[kk];
                      output.write(tempOut);
                      tempOut = "";
                      output.write("\n");
                 output.close();
                 System.out.println("Your file has been written");
              catch (Exception e)
                   System.err.println("Error: " + e.getMessage());
    } // end ProgrammingExercise1
    The main error I can't figure out is as follows: "Error: For input string: "93-""
    *Note: I had to remove some of the code due to exceeding the maximum characters allowed. I THINK the error is found at the //creating an input stream section.
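    That "For input string: "93-"" error is the classic symptom of slicing 3-character fixed-width fields at an offset that is off by a column, so one chunk straddles two fields. A minimal self-contained sketch (hypothetical class and sample values, not the original program) reproduces it:

```java
// Hypothetical illustration: the record packs 3-character fixed-width fields
// such as " 93" and "-99" with no separators between them.
public class FixedWidthDemo {
    // Parse n consecutive 3-character fields starting at column 'start'.
    static int[] parseFields(String line, int start, int n) {
        int[] out = new int[n];
        for (int k = 0; k < n; k++) {
            String field = line.substring(start + k * 3, start + k * 3 + 3);
            out[k] = Integer.parseInt(field.trim());
        }
        return out;
    }

    public static void main(String[] args) {
        String line = " 70 93 93-99-99";             // five aligned 3-char fields
        System.out.println(java.util.Arrays.toString(parseFields(line, 0, 5)));
        // prints "[70, 93, 93, -99, -99]"
        try {
            parseFields(line, 1, 3);                 // start shifted by one column
        } catch (NumberFormatException e) {
            System.out.println(e.getMessage());      // For input string: "93-"
        }
    }
}
```

    So it is worth double-checking the start offset (the 384 in substring(384, 475)) against the actual column layout of dataset.dat.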

    The input data that I included is the exact data I am trying to read... just three lines instead of all 251. There is plenty of other data surrounding it, but I am not concerned with that.
    Here is the first 3 lines of data...
    4200410577816 76-99 9 177        1 1 0  5.573125262 5 429429422MED0262      25080403915990 427322851 5789 250402859 2111                                 50  99 159 106  60 153  50  59  47-999-999-999-999-999-999                    -999-999-999-999 0 1 4025AZ   14    141010 5868 2817 9  0 04223 3 105        -9           2451644434443                                                 70 93 93-99-99-99-99-99-99-99-99-99-99-99-99  3-99-99-99-99-99-99-99-99-99-99-99-99-99-99 3     40667       40667.002004 1
    4200410581753 79-9911 177        1 1 0  5.5731252 3 5 429429422MED0262      250822851 4271 486  5789 785595990 41400V4581                                50  60 106 122 153 249 159 101 101-999-999-999-999-999-999                    -999-999-999-999 0 1 4025AZ    8     81010 2043  883 9  0 04223 2 105        -9           245134824                                                     70 77-99-99-99-99-99-99-99-99-99-99-99-99-99  4-99-99-99-99-99-99-99-99-99-99-99-99-99-99 3     25444       25444.002004 1
    4200410277559 67-99 3 177        1 0 0  5.5731252 3 5 229229221MED0236      25080707146827 4280 7318 73027041042506045981                                50 199 197 108 212 201   3  50 121-999-999-999-999-999-999                    -999-999-999-999 0 0 4047AZ   29    291010 2414 5409 9  0 04223 6 111        -9           1392745164542884288483950                                     57 70 95189190 61-99-99-99-99-99-99-99-99-99 18-99-99-99-99-99-99-99-99-99-99-99-99-99-99 1    141927      141927.002004 4
    import java.util.*;
    import java.io.*;
    public class ProgrammingExercise1 {
    public static void bubbleSort(int[] x) {
         int n = x.length;
         for (int pass = 1; pass < n; pass++)
              for (int i = 0; i < n - pass; i++)
                   if (x[i] > x[i+1]) {
                        int temp = x[i];
                        x[i] = x[i+1];
                        x[i+1] = temp;
                   }
    } // end bubbleSort
    public static void main (String[] args) throws IOException
    try {
              String s;
              int count=0;
              String temp;
              int aInt;
              int med;
              int[] daM = new int[251];
              String[] pos = new String[251];
              int[][] da = new int[251][507];
              int[] daSort = new int[da.length];
              for (int i=0; i<251; i++)
                   daM[i] = 0;
                   for (int j=0; j<507; j++)
                        da[i][j] = 0;
    FileInputStream fstream = new FileInputStream("dataset.dat");
    DataInputStream in = new DataInputStream(fstream);
    BufferedReader br = new BufferedReader(new InputStreamReader(in));
    while((s = br.readLine()) != null)
         String sub = s.substring(384, 475); //!!! Is this supposed to be (384, 475) or (384, 428)???
         for (int k=0; k<15; k++)
              temp = sub.substring(k*3, (k*3)+3);
              temp = temp.replace(" ", "");
              aInt = Integer.parseInt(temp);
              if (aInt > -1)
                   da[count][aInt] = 1;
         sub = s.substring(260, 264);
         temp = sub.replace(" ", "");
         aInt = Integer.parseInt(temp);
         if (aInt > -1)
              daSort[count] = aInt;
              daM[count] = aInt;
         count++;
    in.close();
    bubbleSort(daSort);
    med = daSort[daSort.length/2];
    for (int t=0; t<daM.length; t++)
         if (daM[t]<med)
              pos[t] = "+1";
         else
              pos[t] = "-1";
         for (int r=0; r<da[t].length; r++)
              if (da[t][r]==1)
                   pos[t] +=" " + r + ":1.0";
    Collections.shuffle(Arrays.asList(pos));
    File file = new File("testSet.dat");
    PrintWriter output = new PrintWriter(new FileWriter(file));
    String tempOut = "";
    for (int ii=0; ii<pos.length/4; ii++)
         tempOut += pos[ii];
         output.write(tempOut);
         tempOut = "";
         output.write("\n");
    output.close();
    file = new File("crossValidationSet.dat");
    output = new PrintWriter(new FileWriter(file));
    for (int jj=(pos.length/4); jj<pos.length/2; jj++)
         tempOut += pos[jj];
         output.write(tempOut);
         tempOut = "";
         output.write("\n");
    output.close();
    file = new File("trainingSet.dat");
    output = new PrintWriter(new FileWriter(file));
    for (int kk=(pos.length/2); kk<pos.length; kk++)
         tempOut += pos[kk];
         output.write(tempOut);
         tempOut = "";
         output.write("\n");
    output.close();
    System.out.println("Your file has been written");
    catch (Exception e)
         e.printStackTrace();
    Edited by: djcochran on Jan 27, 2010 5:22 PM
                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                
                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                
                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                           

  • Unique problem in pulling data from a database table into an internal table

    Dear Experts,
    I am new to ABAP. I have a very basic question, but it looks quite puzzling to me, hence I am posting it here.
    I am facing a unique problem in pulling data from a database table and populating that data into an internal table for further use.
    The data in the database table ZLT_MITA, with fields M1 (Employee Name, type CHAR20) and M2 (Employee Code, type CHAR7), is:
    Please refer to the screenshot in the attached file:
    My code:
    1) When I try to pull data from the database table taking M2 as the parameter,
       the code is successful and I am able to populate the data in internal table it_dat.

    TYPES: BEGIN OF ty_dat,
             m1 TYPE zlt_mita-m1,
             m2 TYPE zlt_mita-m2,
           END OF ty_dat.

    DATA: it_dat TYPE STANDARD TABLE OF ty_dat WITH HEADER LINE,
          wa_dat TYPE ty_dat.

    PARAMETERS: p_mitar TYPE zlt_mita-m2.

    SELECT m1 m2
      FROM zlt_mita
      INTO TABLE it_dat
      WHERE m2 = p_mitar.

    LOOP AT it_dat INTO wa_dat.
      WRITE: /2 wa_dat-m1,
             10 wa_dat-m2.
    ENDLOOP.
    2) When I try to pull data from the database table taking M1 as the parameter,
       the code is NOT successful and I am NOT able to populate the data in internal table it_dat.

    TYPES: BEGIN OF ty_dat,
             m1 TYPE zlt_mita-m1,
             m2 TYPE zlt_mita-m2,
           END OF ty_dat.

    DATA: it_dat TYPE STANDARD TABLE OF ty_dat WITH HEADER LINE,
          wa_dat TYPE ty_dat.

    PARAMETERS: p_mita TYPE zlt_mita-m1.

    SELECT m1 m2
      FROM zlt_mita
      INTO TABLE it_dat
      WHERE m1 = p_mita.

    LOOP AT it_dat INTO wa_dat.
      WRITE: /2 wa_dat-m1,
             10 wa_dat-m2.
    ENDLOOP.
    Why is this happening when both M1 and M2 are character-type fields?
    Looking forward to your replies.
    Regards
    Chandan Kumar

    Hi Chandan,
    A database fetch is case-sensitive, so the value in the WHERE condition must match the stored value exactly, including case.
    By default, the selection screen converts parameter input to upper case, so 'Chandan' arrives as 'CHANDAN' and no longer matches the mixed-case name stored in M1.
    Keep the parameter and the database contents in the same case and you need not worry about case sensitivity:
    check the Lowercase checkbox in the domain, then declare your parameter as

    PARAMETERS: p_mita TYPE zlt_mita-m1 LOWER CASE.

    You can also do the reverse: uncheck Lowercase and store the values in upper case instead of lower case.
    Regards,
    Juneed Manha
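    To see the same effect outside ABAP, here is a minimal sketch in Python with SQLite. The table name, column names, and row contents are hypothetical stand-ins for the Z-table above. It shows that an exact-match WHERE comparison on text is case-sensitive, which is why an upper-cased parameter no longer finds a mixed-case name.

    ```python
    import sqlite3

    # Hypothetical stand-in for the Z-table ZLT_MITA described above.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE zlt_mita (m1 TEXT, m2 TEXT)")
    conn.execute("INSERT INTO zlt_mita VALUES ('Chandan', 'EMP0001')")

    # A selection screen without LOWER CASE upper-cases the input,
    # so the lookup value no longer matches the stored mixed-case name.
    as_typed = conn.execute(
        "SELECT m2 FROM zlt_mita WHERE m1 = ?", ("Chandan",)).fetchall()
    uppercased = conn.execute(
        "SELECT m2 FROM zlt_mita WHERE m1 = ?", ("CHANDAN",)).fetchall()

    print(as_typed)    # one row found: [('EMP0001',)]
    print(uppercased)  # no rows: 'CHANDAN' does not equal 'Chandan'
    ```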

  • 0EQUIPMENT_ATTR: DATE FROM & DATE TO objects are not available

    Hi,
    In the equipment master I am getting the FROM DATE & TO DATE; even when I am scheduling the InfoPackage, the master data is loaded based on these dates. After loading, I am able to see the FROM DATE & TO DATE data at the object level.
    But when I want to use the same date fields at report level to check the equipment usage period based on the functional location, these objects are not even available as attributes at report level.
    Can anybody suggest an approach for how I can get these fields at report level and make use of them?
    Regards,
    Prabhakar.


  • Graph Data from Data Table

    I have a data table with 5 columns: 1 string column and 4 double columns, with header names. I try to graph the data in Excel based on an example from NI. When I use the data from the NI example instead of my data table, it works. I researched converting a 1D array to a 2D array but have not been able to do it.

    LabViewRV wrote:
    I fixed the typos. How can I convert it into a 2D array? I tried the reshape according to some old questions, but it did not work for me.
    A Reshape Array can be used. You can also simply feed a 1D array into a Build Array function. That will give you a 2D array, but I do not believe that is what you should do here:
    By the way, is there any possible way to graph data in LabVIEW?
    Again, the data table contains 5 columns; the 1st column's data type is string (a timestamp), and the 2nd, 3rd, 4th, and 5th are double.
    There are numerous ways to graph data in LabVIEW. Have you looked at the shipping examples, the LabVIEW Help, or the controls palette? You have a waveform chart, a waveform graph, and an XY graph. You can also have picture graphs. Is the source data a text file or an Excel workbook? If it's a text file, you can use Read From Spreadsheet File to read the file into a 2D array of strings. You can then convert the 1st column into time information and the remaining columns into numerics. Please take a look at the examples. If you are still having problems, post back what you have tried, and please provide an example of the data file you are trying to read and graph.
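    LabVIEW is graphical, so a text language can only mirror the data handling, not the graph itself. A rough Python sketch of the approach described above, with made-up file contents matching the described layout (one timestamp string column plus four double columns and a header row): split off the first column as strings, convert the rest to numbers, and transpose so each column becomes one plottable channel.

    ```python
    import csv
    import io

    # Hypothetical sample data matching the described layout:
    # one timestamp string column, then four double columns, with a header.
    raw = """time,ch1,ch2,ch3,ch4
    00:00:01,1.0,2.0,3.0,4.0
    00:00:02,1.5,2.5,3.5,4.5
    """

    rows = [[cell.strip() for cell in row]
            for row in csv.reader(io.StringIO(raw))]
    header = rows[0]
    timestamps = [r[0] for r in rows[1:]]                   # 1D list of strings
    values = [[float(x) for x in r[1:]] for r in rows[1:]]  # 2D list of doubles

    # Transpose so each inner tuple is one channel, ready to plot
    # against the timestamps.
    channels = list(zip(*values))
    print(timestamps)   # ['00:00:01', '00:00:02']
    print(channels[0])  # (1.0, 1.5)
    ```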

  • Copy data from data base table

    Hello friends,
    I have some data in a Z-table in QAS. I want to copy all the data from QAS to the table in SDV. Is there a way I can copy it without writing a program?
    Thanks in advance,
    Shejal.

    Hi,
    Use SE09 → Display Object List and insert a new line with:
    R3TR TABU "ZTAB01"
    Then insert the key:
    MANDT = sy-mandt
    and KEY = *
    A.
    Message was edited by:
            Andreas Mann
