How to skip certain lines for a txt file and insert into array

So here is my question:
I have a file to read, and I need to load part of it into an array starting from a certain line.
Example:
4
john 25 M
mary 22 F
lee 20 M
faye 10 F
faye john
mary john
mary faye
I want to insert the friend list, starting at the 5th line, into a 2D array; the start line is the int from the first line plus 1.
Can someone help me with it?
I believe there is a skip method or something,
but I just don't know how to use it.
Can someone tell me how to do that?

The thing is, I think that takes too long and is not efficient...
However, I just solved it with a better method:
// 'reader' and 'input' come from the surrounding code (the open file and its path).
Scanner in = new Scanner(reader);
int size = Integer.parseInt(in.next()); // first line holds the number of people
BufferedReader insert = new BufferedReader(new FileReader(new File(input)));
String line = null;
int count = 0;
int startAtLineNo = size + 1; // 0-based: skip the count line plus the 'size' people lines
while ((line = insert.readLine()) != null) {
    if (count >= startAtLineNo) {
        /* do stuff with the friend-list line */
        System.out.println(line);
    }
    // else ignore the line
    count++;
}
insert.close();
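For reference, a shorter variant of the same idea (a minimal sketch, assuming Java 8+; the file name is made up, and the size + 1 skip is the same offset as above):
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.List;
import java.util.stream.Collectors;

public class SkipDemo {
    public static void main(String[] args) throws IOException {
        List<String> all = Files.readAllLines(Paths.get("friends.txt")); // hypothetical file name
        int size = Integer.parseInt(all.get(0).trim()); // first line: number of people
        // Skip the count line and the 'size' people lines; keep only the friend pairs.
        List<String[]> friendPairs = all.stream()
                .skip(size + 1)
                .map(s -> s.split("\\s+"))
                .collect(Collectors.toList());
        friendPairs.forEach(p -> System.out.println(p[0] + " <-> " + p[1]));
    }
}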
Thanks anyway.

Similar Messages

  • How can i open a DOC or TXT file and insert the data into table?

    I have a DOC file. The doc includes some columns and some rows (for example 'ID,Name,Date,...').
    I'd like to open the DOC file and insert the data into a table with the same columns.
    Thanks.

    Use the SQL*Loader utility or the UTL_FILE package.

  • How to search a special string in txt file and return it's position in txt file?


    I just posted a solution for a similar question here:  http://forums.ni.com/ni/board/message?board.id=170&view=by_date_ascending&message.id=362699#M362699
    The top portion can search for the location of a string, while the bottom portion is to locate the position of a character.  Both can search for a character.
    The position of the character within the file is displayed in the indicator(s).
    R
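    If you are doing this in Java rather than LabVIEW, a minimal sketch of the same idea (the file name and search string are just examples):
    import java.io.IOException;
    import java.nio.file.Files;
    import java.nio.file.Paths;

    public class FindPosition {
        public static void main(String[] args) throws IOException {
            // Read the whole file into one String, then use indexOf for the character offset.
            String content = new String(Files.readAllBytes(Paths.get("data.txt")));
            int position = content.indexOf("special string"); // -1 if the string is not present
            System.out.println("Found at offset: " + position);
        }
    }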

  • How to use certain entries from a txt file as metadata?

    So during my plugin development I stumbled on a problem.
    I have created a txt file for each picture telling me who's in the photos. So the first column in the file shows all the names that appear in the photo.
    I want to add those names into LR for each photo. I thought using them as metadata would be a good idea. Then I could let the user enter a name of some person and create a smart collection having all photos with that person in them.
    But putting it into practice isn't that easy.
    So I started with the classic metadata definition:
    metadataFieldsForPhotos = {
      id = 'Person',
      title = 'Person',
      dataType = 'string',
      searchable = true,
      browsable = true,
    }
    Now I don't know how to actually read only the column entries of the txt file and display all the names appearing in it. Another problem is that each photo has its own txt file, named "nameofphoto.jpg.txt", so LR must know which txt file belongs to which photo.
    I know my question isn't very specific but it would be great if someone could help...

    Thank you for the reply. Reading the file the way I wanted worked fine, and I chose to use keywords, as you suggested.
    So now I tried to add the keywords to the pictures and I get:
    "LrCatalog:withWriteAccessDo: could not execute action 'keywordProvider'. It was blocked by another write access call, and no timeout parameters were provided."
    Here's my function so far:
    function Finder.findPerson(smartCollectionName)
         local catalog = LrApplication.activeCatalog()
         local photos = catalog:getMultipleSelectedOrAllPhotos()
           LrTasks.startAsyncTask( function()
                for i=1,#photos do
                     local photo = photos[i]
                     local photopath = photo:getRawMetadata("path")
                     local photoname = photo:getFormattedMetadata("fileName")
                     local folderpath = string.sub(photopath, 1, string.len(photopath) - string.len(photoname))
                     local metadataPath = folderpath .. "\.metaface\\" .. photoname .. ".txt"
                     local text = LrFileUtils.readFile(metadataPath)
                     local metadataValues = stringSplit(text)
                     for i = 0, #metadataValues do
                          catalog:withWriteAccessDo( "keywordProvider", function()
                               local keywords = catalog:createKeyword(metadataValues[i])
                               photos:addKeyword(keywords)
                          end)
                     end
               end
          end)
    end
    So what am I doing wrong?

  • How to read the data from XML file and insert into oracle DB

    Hi All,
    I have the below requirement:
    I will receive data in an XML file. Then I need to read that data and insert it into Oracle tables. Please let me know how this can be handled.
    Many thanks.

    Sounds a lot like this question, only with fewer details:
    how to read data from XML variable and insert into table variable
    We can only help if you provide details, as we cannot see what you are doing and only know what you tell us. Plenty of examples that cover the topics you seek abound on the forums as well.

  • Reading from a txt file and storing into a list

    I want to read a list of counters (String) from a flat txt file and store them in a List.
    Any suggestion will be helpful.
    Sanjeev

    Try this
    try {
        FileReader fr = new FileReader("C:/abc.txt");
        BufferedReader br = new BufferedReader(fr);
        String record = null;
        while ((record = br.readLine()) != null) {
            System.out.println(record); // add each line to your List here
        }
        br.close();
    } catch (IOException e) {
        e.printStackTrace();
    }
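    A shorter variant that gives you the List directly (a minimal sketch, assuming Java 8+; the path is just an example):
    import java.io.IOException;
    import java.nio.file.Files;
    import java.nio.file.Paths;
    import java.util.List;

    public class ReadCounters {
        public static void main(String[] args) throws IOException {
            // Reads every line of the file straight into a List<String>.
            List<String> counters = Files.readAllLines(Paths.get("C:/abc.txt"));
            System.out.println(counters);
        }
    }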
              

  • How to read last line from a .txt file?

    Hello
    I have a string, e.g. "my name is John",
    and I want to verify whether this string is equal to the last line of a text file.
    For example, if the txt file contains:
    asdasd
    sdgsdfgasd
    asdfgadfgadf
    sdgasdgsdf
    my name is john
    then it's OK.
    but if I have
    asdgsdfg
    dsfhsdfhsd
    sdgasdfg
    sdgsdg
    my name is Jdfgsdg
    this is not correct
    How should I do this?
    Thanks :)

    Read from the beginning and discard all lines (if you have a small file), or use RandomAccessFile and scan from the end until you find the beginning of the last line.
    Kaj
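    For the small-file case, a minimal sketch of the first suggestion (assuming Java 7+ for try-with-resources; the file name is just an example):
    import java.io.BufferedReader;
    import java.io.FileReader;
    import java.io.IOException;

    public class LastLineCheck {
        public static void main(String[] args) throws IOException {
            String target = "my name is John";
            String last = null;
            try (BufferedReader br = new BufferedReader(new FileReader("file.txt"))) {
                String line;
                while ((line = br.readLine()) != null) {
                    last = line; // keep overwriting; ends up holding the last line
                }
            }
            // Case-insensitive, matching the example above where "john" vs "John" is OK.
            System.out.println(target.equalsIgnoreCase(last) ? "OK" : "not correct");
        }
    }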

  • Reading certain lines of a text file and boundary testing ...

    Hi all, 
    This is only my second post so please be gentle ;P
    Basically, my problem is that I want to read certain chunks of a text file into different text boxes. For example, this may be 50 lines in textbox1, then 50 in textbox2, then the rest in textbox3. I know this would involve using a splitter, but I have absolutely no idea how I would go about implementing this. Any help would be greatly appreciated.
    My second problem is that I also need to carry out testing on my system. One page in this is the login page. I understand that normal data would be all correct fields and erroneous data would be blank field(s). However, I don't really know what boundary testing would be for this. The only thing I could think of is a correct username but incorrect password, but I don't think this is right. Again, any help would be appreciated.
    Thanks, 
    LTID

    I suppose there would be a reason for needing to read some amount of lines into each textbox, but you make no mention of why that would be necessary.
    There is such a thing as a delimited text file. Each line would contain a delimiter of some character between fields on each line. That file could be used for displaying information in separate controls on a Form.
    But you only mention TextBoxes and reading x amount of lines from a file into each TextBox, as if you are perhaps providing certain information for each TextBox.
    If that is what you want, then the provided replies will work. If you want to explain what you need to do with regard to the information in the file and how to display it otherwise, then please do so, so that a better method can be provided.
    Example using Text Field Parser.
    https://msdn.microsoft.com/en-us/library/microsoft.visualbasic.fileio.textfieldparser(v=vs.110).aspx
    Option Strict On

    Public Class Form1

        Private Sub Form1_Load(sender As Object, e As EventArgs) Handles MyBase.Load
            Me.Location = New Point(CInt((Screen.PrimaryScreen.WorkingArea.Width / 2) - (Me.Width / 2)), CInt((Screen.PrimaryScreen.WorkingArea.Height / 2) - (Me.Height / 2)))
            Using MyReader As New Microsoft.VisualBasic.FileIO.TextFieldParser("c:\Users\John\Desktop\DelimitedTextFile.Txt")
                MyReader.TextFieldType = Microsoft.VisualBasic.FileIO.FieldType.Delimited
                MyReader.Delimiters = New String() {"|"}
                Dim currentRow As String()
                While Not MyReader.EndOfData
                    Try
                        currentRow = MyReader.ReadFields()
                        ListBox1.Items.Add(currentRow(0))
                        ListBox2.Items.Add(currentRow(1))
                        ListBox3.Items.Add(currentRow(2))
                    Catch ex As Microsoft.VisualBasic.FileIO.MalformedLineException
                        MsgBox("Line " & ex.Message & " is invalid. Skipping")
                    End Try
                End While
            End Using
        End Sub

    End Class
    Text in text file DelimitedTextFile.Txt. Delimiter is vertical bar.
    Bill|Home Depot|Garden Department
    Cheryll|DirectTV|Installer
    Charlie|C&K Automotive|Owner
    Zorro|Television|Masked Avenger
    Samantha|Mayo Clinic|Nurse Practitioner
    Mona|Bahia Honda Key State Park|Park Ranger
    La vida loca

  • How to save the time to a .txt file and retrive it back as time?

    I have to save the time [hour:minute:second] and the date [date:month:year] to a .txt file. I am using the Calendar API to set the time and date as below
    FileOutputStream fileStream = new FileOutputStream ("what.txt");
    PrintWriter pw=new PrintWriter(fileStream, true);
    Calendar curDate = Calendar.getInstance();
    System.out.print(curDate.get(Calendar.DATE)+":");
    System.out.print(months[curDate.get(Calendar.MONTH)]+":");
    System.out.print(curDate.get(Calendar.YEAR));
    pw.println(curDate);
    I'll have to retrieve this data later to use it to set a timer:
    timer = new Timer();
    timer.schedule(new RemindTask(), time);
    I tried to retrieve the data and use the Calendar API on it with the program below.
    I get the error:
    C:\t\Retrivedate.java:41: int cannot be dereferenced
    System.out.print(t.get(Calendar.DATE)+":");
    How can I solve this so that I can use the Calendar API on my retrieved data?
    import java.io.*;
    import java.util.*;
    import java.lang.*;
    class Test {
    public static void main(String args[]) {
    try {
    File inputFile = new File("what.txt");
    FileReader in = new FileReader("what.txt");
    int t;
    char[] tmp = new char[100];
    int cnt = in.read(tmp);
    String tm="";
    for (int j=0;j<cnt;j++)
    { tm+=tmp[j];
    try {
    t=Integer.parseInt(tm,10);
    catch (NumberFormatException exception) {}
    System.out.println("jika "+tm);
    Calendar curDate = Calendar.getInstance();
    String months[]={
    "Jan","Feb","Mar","Apr",
    "May","June","July","Aug",
    "Sept","Oct","Nov","Dec"};
    Date time1 = curDate.getTime();
    System.out.print(" \n"+time1.getHours());
    System.out.print(":"+time1.getMinutes());
    System.out.println(":"+time1.getSeconds());
    System.out.print(t.get(Calendar.DATE)+":");
    System.out.print(months[t.get(Calendar.MONTH)]+":");
    System.out.print(t.get(Calendar.YEAR));
    in.close(); }
    catch (IOException ex) {
    System.out.println("IOException:"+ex.toString()); }

    These lines do not make sense:
    System.out.print(t.get(Calendar.DATE)+":");
    System.out.print(months[t.get(Calendar.MONTH)]+":");
    System.out.print(t.get(Calendar.YEAR));
    1. t is an integer, it does not have any methods.
    2. It's a bad idea to use char[] to retrieve the date. You should use the readLine() method to retrieve the whole date string, say "10/23/2001" (certainly you can have hours, minutes, seconds, etc.). Then you need to use the SimpleDateFormat class to parse this string. SimpleDateFormat gives you everything to parse a date string in any format you want.
    PC
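    As a concrete illustration of that advice, a minimal sketch (the file name and pattern are just examples):
    import java.io.BufferedReader;
    import java.io.FileReader;
    import java.io.FileWriter;
    import java.io.PrintWriter;
    import java.text.SimpleDateFormat;
    import java.util.Date;

    public class TimeFileDemo {
        public static void main(String[] args) throws Exception {
            SimpleDateFormat fmt = new SimpleDateFormat("dd:MM:yyyy HH:mm:ss");

            // Write the current date/time as one formatted line.
            PrintWriter pw = new PrintWriter(new FileWriter("what.txt"), true);
            pw.println(fmt.format(new Date()));
            pw.close();

            // Read the line back and parse it into a Date, which a Timer can use.
            BufferedReader br = new BufferedReader(new FileReader("what.txt"));
            Date restored = fmt.parse(br.readLine());
            br.close();
            System.out.println("Restored: " + restored);
            // e.g. new java.util.Timer().schedule(task, restored);
        }
    }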

  • How to calculate storage space for archive log files and database backups?

    Hi all,
    I have a 1.8 terabyte Oracle 9i database and need to plan how much additional disk space I will need for nightly backups and for archivelog files. Is there a script or formula available that can help me estimate how much disk space I will need to hold a day's worth of archived logs as well as a nightly export dump file and a full hot RMAN backup on disk?
    Thanks!

    I'm not sure how to estimate the size of your backups, especially if you use incrementals. However, the space required for archive logs will be equal to the amount of REDO your DB generates. I would count the number of log switches per day with a query like the following:
    select trunc(first_time), count(*)
    from v$log_history
    group by trunc(first_time)
    I would then take the average and multiply this count by the size of your redo log files (assuming they are all the same size).

  • How to get the data from a file and insert into a table

    Good morning,
    I need to read the file sui_facturacion_alcantarillado_15085_2011_01_76845_00A.csv, which contains the following information:
    NUID,NUMERO_DE_CUENTA_CONTRATO,CÓDIGO_DANE_DEPARTAMENTO,CÓDIGO_DANE_MUNICIPIO,ZONA_IGAC,SECTOR_IGAC,MANZANA_O_VEREDA_IGAC,NÚMERO_DEL_PREDIO_IGAC,CONDICION_DE_PROPIEDAD_DEL_PREDIO_IGAC,DIRECCIÓN_DEL_PREDIO,NÚMERO_DE_FACTURA,FECHA_DE_EXPEDICIÓN_DE_LA_FACTURA,FECHA_DE_INICIO_DEL_PERÍODO_DE_FACTURACIÓN,DIAS_FACTURADOS,CÓDIGO_CLASE_DE_USO,UNIDADES_MULTIUSUARIO_RESIDENCIAL,UNIDADES_MULTIUSUARIO_NO_RESIDENCIAL,HOGAR_COMUNITARIO_O_SUSTITUTO,USUARIO_FACTURADO_CON_AFORO,USUARIO_CUENTA_CON_CARACTERIZACIÓN,CARGO_FIJO,CARGO_POR_VERTIMIENTO_BASICO,CARGO_POR_VERTIMIENTO_COMPLEMENTARIO,CARGO_POR_VERTIMIENTOSUNTUARIO,CMT,VERTIMIENTO_DEL_PERIOD_EN_METROS_CUBICOS,VALOR_FACTURADO_POR_VERTIDO,VALOR_DEL_SUBSIDIO,VALOR_DE_LA_CONTRIBUCIÓN,FACTOR_DE_SUBSIDIO_O_CONTRIBUCIÓN_CARGO_FIJO,FACTOR_DE_SUBSIDIO_O_CONTRIBUCIÓN_VERTIMIENTO,CARGOS_POR_CONEXIÓN,PAGO_ANTICIPADO_DEL_SERVICIO,DÍAS_DE_MORA,VALOR_DE_MORA,INTERESES_POR_MORA,OTROS_COBROS,CAUSAL_DE_REFACTURACIÓN,NUMERO_DE_LA_FACTURA_OBJETO_DE_REFACTURACIÓN,VALOR_TOTAL_FACTURADO,PAGOS_DEL_CLIENTE_DURANTE_EL_PERÍODO_FACTURADO
    242602,242602,76,845,99,99,9999,9999,999,CLL 5 CRA 7 PEATONAL,24911920,12-01-2011,01-12-2010,30,01,,,0,0,0,1,0000000000.00,0000000000.00,0000000000.00,0000000000.00,0000000005,0000002200.00,0000000000,0000000000,0.000,0.000,0000000000.00,0000000000.00,0,0000000000.00,0000000000.00,0000000000.00,0,0,0000002201.00,0000000000.00
    242604,242604,76,845,99,99,9999,9999,999,CRA 4 # 6 - 13,24911846,12-01-2011,01-12-2010,30,01,,,0,0,0,1,0000000000.00,0000000000.00,0000000000.00,0000000000.00,0000000013,0000002200.00,0000000000,0000000000,0.000,0.000,0000000000.00,0000000000.00,0,0000000000.00,0000000000.00,0000000000.00,0,0,0000002201.00,0000004411.00
    242605,242605,76,845,99,99,9999,9999,999,CRA 2 CLLES 3 Y 4,24911509,12-01-2011,01-12-2010,30,01,,,0,0,0,1,0000000000.00,0000000000.00,0000000000.00,0000000000.00,0000000004,0000002200.00,0000000000,0000000000,0.000,0.000,0000000000.00,0000000000.00,0,0000000000.00,0000000000.00,0000000000.00,0,0,0000002201.00,0000002200.00
    This is the function that I have:
    <<function_test>>
    DECLARE
    TOTAL_CAR NUMBER;
    POS_1 NUMBER:= 0;
    POS_2 NUMBER:= 0;
    REST NUMBER:= 0;
    ACUM NUMBER:= 0;
    CADEN VARCHAR2(200);
    nom_archivo varchar2(80);
    v1 utl_file.file_type;
    v2 varchar2(2048);
    BEGIN
         nom_archivo := 'sui_facturacion_alcantarillado_15085_2011_01_76845_00A.csv';
         v1:= utl_file.fopen('PUBLIC_ACCESS',nom_archivo,'R',32767);
         utl_file.get_line(v1,v2);
         SELECT LENGTH(v2) INTO TOTAL_CAR FROM DUAL;
         ACUM:=1;
         POS_1:=0;
         WHILE ACUM <= 60
         LOOP
              select instr(v2, ',', 1, ACUM) PRUEBA
              INTO      POS_2
              FROM      DUAL;
              dbms_output.put_line(' TOTAL POSICION 1--> '|| POS_1);
              dbms_output.put_line(' TOTAL POSICION 2--> '|| POS_2);
              dbms_output.put_line(' TOTAL ACUMULADO --> '|| ACUM);
              REST := (POS_2-POS_1)-1;
              SELECT SUBSTR(v2,(POS_1+1),REST) PRUEBA2
                   INTO CADEN
              FROM      DUAL;     
              dbms_output.put_line(' CADENA SELECCIONADA --> '|| CADEN);
              ACUM := ACUM + 1;
              POS_1:= POS_2;
         END LOOP;
         utl_file.fclose(v1);
         dbms_output.put_line(' -->');
         dbms_output.put_line(' TOTAL POSICION 1-->'|| POS_1);
         dbms_output.put_line(' TOTAL POSICION 2-->'|| POS_2);
         dbms_output.put_line(' TOTAL ACUMULADO -->'|| ACUM);
         dbms_output.put_line(' TOTAL DE CARACTERES -->'|| TOTAL_CAR);
         dbms_output.put_line(' ');     
         EXCEPTION
              WHEN NO_DATA_FOUND THEN
                   dbms_output.put_line('NO SE ENCONTRARON MAS CARACTERES');
              WHEN OTHERS THEN
                   dbms_output.put_line('OTRO TIPO DE ERROR ');
                   dbms_output.put_line('CODIGO ERROR '|| SQLCODE ||' '||SQLERRM);
    END;
    The fields are separated by commas and must be inserted into a table. The function I have so far only reads and extracts data from row number 1, which I do not need.
    I need the information from rows 2, 3 and 4. Each row has 41 fields, which I want to insert into a table called Dato_archivos.
    How can I perform this? ...
    I appreciate the cooperation and explanation ...
    GOOD DAY ...
    REYNEL SALAZAR MARTINEZ
    COLOMBIA ...

    When you get an error with external tables (or sql*loader) look in the same folder as the data file and you should get a .log file and maybe a .bad file too.
    The log file should indicate the nature of the error it has trying to load the data.
    I've just copied your sample data from your first post to a file on my server and tried it to find that you are not specifying the required format for your dates. The below shows it now working...
    CREATE TABLE tabla_prueba
      (NUID NUMBER,
       NUM_CUENTA_CONTRATO NUMBER,
       COD_DANE_DD NUMBER,
       COD_DANE_MM NUMBER,
       ZONA_IGAC NUMBER,
       SECTOR_IGAC NUMBER,
       MANZANA_VEREDA_IGAC NUMBER,
       NUM_PREDIO_IGAC NUMBER,
       CONDICION_PREDIO_IGAC NUMBER,
       DIRECCION_PREDIO_IGAC VARCHAR2(80),
       NUM_FACTURA NUMBER,
       FECHA_EXPED_FACTURA DATE,
       FECHA_INI_PERIODO_FACTURACION DATE,
       DIAS_FACTURADOS NUMBER,
       COD_CLASE_USO NUMBER,
       UNI_MULTIUSUARIO_RESIDENCIAL NUMBER,
       UNI_MULTIUSUARIO_NORESIDENCIAL NUMBER,
       HOGAR_COMUNITARIO NUMBER,
       USUARIO_FACTURADO_AFORO NUMBER,
       USUARIO_CON_CARACTERIZACION NUMBER,
       CARGO_FIJO NUMBER,
       CARGO_VERTIMENTO_BAS NUMBER,
       CARGO_VERTIMENTO_COMP NUMBER,
       CARGO_VERTIMENTO_SUNT NUMBER,
       CMT NUMBER,
       VLR_FACTURADO_VERTIDO NUMBER,
       VLR_SUBSIDIO NUMBER,
       VLR_CONTRIBUCCION NUMBER,
       FACTOR_SUBS_CONTR_CARGO_FIJO NUMBER,
       FACTOR_SUBS_CONTR_VERTIMENTO NUMBER,
       CARGO_CONEXION NUMBER,
       PAGO_ANTICIPADO_SERVICIO NUMBER,
       DIAS_MORA NUMBER,
       VLR_MORA NUMBER,
       INTERES_MORA NUMBER,
       OTROS_COBROS NUMBER,
       CAUSAL_REFACTURACION NUMBER,
       NUM_FACTURA_OBJ_REFACTURACION NUMBER,
       VLR_TOTAL_FACTURADO NUMBER,
       PAGOS_CLIENTE_DURANTE_PERIODO NUMBER
      )
    ORGANIZATION EXTERNAL
      (TYPE ORACLE_LOADER
       DEFAULT DIRECTORY TEST_DIR
       ACCESS PARAMETERS
        (RECORDS DELIMITED BY NEWLINE
         SKIP 1
         FIELDS TERMINATED BY ","
         OPTIONALLY ENCLOSED BY '"'
          (NUID,
           NUM_CUENTA_CONTRATO,
           COD_DANE_DD,
           COD_DANE_MM,
           ZONA_IGAC,
           SECTOR_IGAC,
           MANZANA_VEREDA_IGAC,
           NUM_PREDIO_IGAC,
           CONDICION_PREDIO_IGAC,
           DIRECCION_PREDIO_IGAC,
           NUM_FACTURA,
           FECHA_EXPED_FACTURA CHAR DATE_FORMAT DATE MASK "DD-MM-YYYY",
           FECHA_INI_PERIODO_FACTURACION CHAR DATE_FORMAT DATE MASK "DD-MM-YYYY",
           DIAS_FACTURADOS,
           COD_CLASE_USO,
           UNI_MULTIUSUARIO_RESIDENCIAL ,
           UNI_MULTIUSUARIO_NORESIDENCIAL,
           HOGAR_COMUNITARIO ,
           USUARIO_FACTURADO_AFORO ,
           USUARIO_CON_CARACTERIZACION ,
           CARGO_FIJO ,
           CARGO_VERTIMENTO_BAS ,
           CARGO_VERTIMENTO_COMP,
           CARGO_VERTIMENTO_SUNT,
           CMT,
           VLR_FACTURADO_VERTIDO,
           VLR_SUBSIDIO ,
           VLR_CONTRIBUCCION ,
           FACTOR_SUBS_CONTR_CARGO_FIJO ,
           FACTOR_SUBS_CONTR_VERTIMENTO ,
           CARGO_CONEXION ,
           PAGO_ANTICIPADO_SERVICIO ,
           DIAS_MORA ,
           VLR_MORA ,
           INTERES_MORA ,
           OTROS_COBROS ,
           CAUSAL_REFACTURACION ,
           NUM_FACTURA_OBJ_REFACTURACION,
           VLR_TOTAL_FACTURADO,
            PAGOS_CLIENTE_DURANTE_PERIODO
           )
         )
       LOCATION ('test.csv')
      );
    SQL> select * from tabla_prueba;
          NUID NUM_CUENTA_CONTRATO COD_DANE_DD COD_DANE_MM  ZONA_IGAC SECTOR_IGAC MANZANA_VEREDA_IGAC NUM_PREDIO_IGAC CONDICION_PREDIO_IGAC DIRECCION_PREDIO_IGAC                                                    NUM_FACTURA FECHA_EXPE FECHA_INI_
    DIAS_FACTURADOS COD_CLASE_USO UNI_MULTIUSUARIO_RESIDENCIAL UNI_MULTIUSUARIO_NORESIDENCIAL HOGAR_COMUNITARIO USUARIO_FACTURADO_AFORO USUARIO_CON_CARACTERIZACION CARGO_FIJO CARGO_VERTIMENTO_BAS CARGO_VERTIMENTO_COMP CARGO_VERTIMENTO_SUNT        CMT
    VLR_FACTURADO_VERTIDO VLR_SUBSIDIO VLR_CONTRIBUCCION FACTOR_SUBS_CONTR_CARGO_FIJO FACTOR_SUBS_CONTR_VERTIMENTO CARGO_CONEXION PAGO_ANTICIPADO_SERVICIO  DIAS_MORA   VLR_MORA INTERES_MORA OTROS_COBROS CAUSAL_REFACTURACION NUM_FACTURA_OBJ_REFACTURACION
    VLR_TOTAL_FACTURADO PAGOS_CLIENTE_DURANTE_PERIODO
        242602              242602          76         845         99          99                9999         9999              999 CLL 5 CRA 7 PEATONAL                                                         24911920 12-01-2011 01-12-2010
                 30             1                                                                             0               0                           0          1                    0                     0                     0          0
                        5         2200                 0                            0                            0              0                        0          0          0            0            0                    0                             0
                      0                          2201
        242604              242604          76         845         99          99                9999         9999              999 CRA 4 # 6 - 13                                                               24911846 12-01-2011 01-12-2010
                 30             1                                                                             0               0                           0          1                    0                     0                     0          0
                       13         2200                 0                            0                            0              0                        0          0          0            0            0                    0                             0
                      0                          2201
        242605              242605          76         845         99          99                9999         9999              999 CRA 2 CLLES 3 Y 4                                                            24911509 12-01-2011 01-12-2010
                 30             1                                                                             0               0                           0          1                    0                     0                     0          0
                        4         2200                 0                            0                            0              0                        0          0          0            0            0                    0                             0
                      0                          2201
    SQL>

  • How to read LONG RAW data from one  table and insert into another table

    Hello everybody,
    I have a table called sound with the following attributes. In the music attribute I have stored some messages in different languages, like Hindi, English, etc. I want to concatenate all Hindi messages and store them in another table with only one attribute of type LONG RAW, and this attribute is attached to the sound item.
    When I click the play button of the sound item, all the messages recorded in Hindi should play one by one automatically. For that I'm doing the following.
    I have written the following WHEN-BUTTON-PRESSED trigger, which concatenates all the messages of any selected language from the sound table and stores them in another table called temp.
    The sound will then be played from the temp table.
    declare
         tmp sound.music%type;
         temp1 sound.music%type;
         item_id ITEM;
    cursor c1
    is select music
    from sound
    where lang=:LIST10;
    begin
         open c1;
         loop
          fetch c1 into tmp; -- THIS LINE GENERATES THE ERROR
              temp1:=temp1||tmp;
              exit when c1%notfound;
         end loop;
    CLOSE C1;
    insert into temp values(temp1);
    item_id:=Find_Item('Music');
    go_item('music');
    play_sound(item_id);
    end;
    but when i'm clicking the button it generates the following error.
    WHEN-BUTTON-PRESSED TRIGGER RAISED UNHANDLED EXCEPTION ORA-06502.
    ORA-06502: PL/SQL: numeric or value error
    SQL> desc sound;
    Name Null? Type
    SL_NO NUMBER(2)
    MUSIC LONG RAW
    LANG CHAR(10)
    If my process to solve the above problem is OK, then please tell me the solution for the error. Otherwise, please suggest any other way to solve the above problem.
    Thanks in advance.
    D. Prasad

    You can achieve this in many different ways; one is:
    1. Create another VO based on the EO which is based on the dest table.
    2. At save, copy the contents of the source VO into the dest VO (see the copy routine in the dev guide).
    3. Committing the transaction will push the data into the dest table on which the dest VO is based.
    I understand that if we attach a VO object instance to a region/page, we can only pull and put data into one table.
    If by table you mean a DB table, then no: you can have a VO based on multiple EOs, which will do DMLs accordingly. Thanks
    Tapash

  • How to read data from a file and insert into a table using UTL_FILE

    Hi,
    I have a file. I want to read the data from the file and load it into a table using UTL_FILE.
    How can I do it?
    Any reply appreciated...

    Hi,
    This is not exactly your requirement, but you can try this:
    CREATE OR REPLACE DIRECTORY text_file AS 'D:\TEXT_FILE\';
    GRANT READ ON DIRECTORY text_file TO fah;
    GRANT WRITE ON DIRECTORY text_file TO fah;
    DROP TABLE load_a;
    CREATE TABLE load_a
      (a1 varchar2(20),
       a2 varchar2(200))
    ORGANIZATION EXTERNAL
      (TYPE ORACLE_LOADER
       DEFAULT DIRECTORY text_file
       ACCESS PARAMETERS
         (FIELDS TERMINATED BY ',')
       LOCATION ('data.txt')
      );
    select * from load_a;
    CREATE TABLE A AS select * from load_a;
    SELECT * FROM A;
    Regards
    Faheem Latif

  • Upload csv file and insert into a table for user

    Hello,
    I want to create a page on which the user of my application can upload a semicolon-delimited file, and the data of the file should automatically be stored in a table.
    I can upload the file, and the file is then stored in the htmldb_application_files table.
    But I do not know how to parse the file...
    Can anyone help me, or is there anyone who has done this before?
    Thank you,
    Tim

    Tim...
    Here is what I did in a similar situation.
    Let the user download a csv file to use as an excel turnaround document.
    I check digit the primary key. They are not supposed to touch that column.
    They do their excel thing, adding data in columns next to the ones they are updating. They now have the original and new data on the same row in the excel document. They save it on a share drive as a csv. A Perl script wakes up and parses the csv, verifies the check digit, checks that the old values still exist in the table, etc., and then does the update if all is well at the row level. The csv is replaced showing the success or failure of the update on each row.
    Probably lots of other ways to accomplish this but I have gotten years of use out of the script. The original csv can come out of almost any application. Mine come from apex, discoverer and some excel queries.
    Bob

  • How to read partcicular line from a txt file

    Say a text file has 50 lines.
    While reading, if a line contains the string "name", I have to store the next two consecutive lines in a string array,
    and so on for each such line.
    How can we do this?

    You need a BufferedReader:
    http://java.sun.com/j2se/1.4.2/docs/api/java/io/BufferedReader.html
    for example:
    try {
        BufferedReader br = new BufferedReader(new FileReader("myfile.txt"));
        String line;
        while ((line = br.readLine()) != null) {
            if (line.indexOf("name") != -1) {
                // Store the next two lines.
                // You will need to call br.readLine() again.
            }
        }
        br.close();
    } catch (IOException x) {
        x.printStackTrace();
    }
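    If you also need to collect those pairs rather than just find them, a minimal sketch along the same lines (the file name is just an example):
    import java.io.BufferedReader;
    import java.io.FileReader;
    import java.io.IOException;
    import java.util.ArrayList;
    import java.util.List;

    public class NamePairs {
        public static void main(String[] args) throws IOException {
            List<String[]> pairs = new ArrayList<String[]>();
            BufferedReader br = new BufferedReader(new FileReader("myfile.txt"));
            String line;
            while ((line = br.readLine()) != null) {
                if (line.indexOf("name") != -1) {
                    // Grab the next two consecutive lines as a two-element array.
                    String first = br.readLine();
                    String second = br.readLine();
                    if (first != null && second != null) {
                        pairs.add(new String[] { first, second });
                    }
                }
            }
            br.close();
            System.out.println(pairs.size() + " pairs stored");
        }
    }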
