Fetch data from two tables and insert into one table

I have data similar to the following in two tables (@Plant, @PlantDirector) and need to consolidate it into one table (@PlantNew) as follows.
DECLARE @Plant TABLE (PlantID INT, PlantName VARCHAR(100))
INSERT INTO @Plant (PlantID, PlantName) VALUES (1, 'Name One'),(2, 'Name Two'),(3, 'Name Three'),(4, 'Name Four'),(5, 'Name Five'),(6, 'Name Six')
Director data for the plants exists in the following table. An Assistant value of 1 means Assistant Director and 0 means Director.
Data exists only for a subset of plants, and a plant may have one or both roles.
DECLARE @PlantDirector TABLE (PlantID INT, PlantDirectorID INT, Assistant bit)
INSERT INTO @PlantDirector (PlantID, PlantDirectorID, Assistant) VALUES (2, 111, 1),(2, 222, 0),(4, 333, 0),(6,444,1)
The above data needs to be inserted into one table (@PlantNew) as follows: 
PlantID in the @Plant table is an IDENTITY value and needs to be inserted as-is into this table.
PlantDirExists should get a value of 1 if at least one record exists in the @PlantDirector table for a PlantID. PlantAssistantDirID and PlantDirID should be set to the corresponding PlantDirectorID, or NULL, depending on the data.
DECLARE @PlantNew TABLE (PlantID INT, PlantName VARCHAR(100), PlantDirExists bit, PlantAssistantDirID INT, PlantDirID INT)
INSERT INTO @PlantNew (PlantID, PlantName, PlantDirExists, PlantAssistantDirID, PlantDirID)
VALUES (1, 'Name One', 0, NULL, NULL),(2, 'Name Two', 1, 111, 222),(3, 'Name Three', 0, NULL, NULL),(4, 'Name Four', 1, NULL, 333),(5, 'Name Five', 0, NULL, NULL),(6, 'Name Six',1, 444, NULL)
How do I achieve the above using SQL? Thanks.

Like this:
INSERT @PlantNew  (PlantID, PlantName, PlantDirExists, PlantAssistantDirID, PlantDirID) 
SELECT p.PlantID,
p.PlantName,
CASE WHEN pd.PlantID IS NULL THEN 0 ELSE 1 END,
PlantAssistantDirID,
PlantDirID
FROM @Plant p
LEFT JOIN (SELECT PlantID,
MAX(CASE WHEN Assistant = 1 THEN PlantDirectorID END) AS PlantAssistantDirID,
MAX(CASE WHEN Assistant = 0 THEN PlantDirectorID END) AS PlantDirID
@PlantDirector
GROUP BY PlantID)pd
ON pd.PlantID = p.PlantID
Visakh (http://visakhm.blogspot.com/)
You're missing a FROM
INSERT INTO @PlantNew (PlantID, PlantName, PlantDirExists, PlantAssistantDirID, PlantDirID)
SELECT p.PlantID,
       p.PlantName,
       CASE WHEN pd.PlantID IS NULL THEN 0 ELSE 1 END,
       pd.PlantAssistantDirID,
       pd.PlantDirID
FROM @Plant p
LEFT JOIN (SELECT PlantID,
                  MAX(CASE WHEN Assistant = 1 THEN PlantDirectorID END) AS PlantAssistantDirID,
                  MAX(CASE WHEN Assistant = 0 THEN PlantDirectorID END) AS PlantDirID
           FROM @PlantDirector
           GROUP BY PlantID) pd
  ON pd.PlantID = p.PlantID
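An equivalent formulation, if you prefer a correlated OUTER APPLY over a pre-aggregated derived table (just a sketch against the same table variables declared above; it returns the same rows):
INSERT INTO @PlantNew (PlantID, PlantName, PlantDirExists, PlantAssistantDirID, PlantDirID)
SELECT p.PlantID,
       p.PlantName,
       CASE WHEN pd.PlantID IS NULL THEN 0 ELSE 1 END,
       pd.PlantAssistantDirID,
       pd.PlantDirID
FROM @Plant p
OUTER APPLY (SELECT d.PlantID,
                    MAX(CASE WHEN d.Assistant = 1 THEN d.PlantDirectorID END) AS PlantAssistantDirID,
                    MAX(CASE WHEN d.Assistant = 0 THEN d.PlantDirectorID END) AS PlantDirID
             FROM @PlantDirector d
             WHERE d.PlantID = p.PlantID
             GROUP BY d.PlantID) pd;
With small table variables the two plans are effectively the same; the derived-table version aggregates @PlantDirector once, while the APPLY version evaluates per plant.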

Similar Messages

  • How to read the data from XML file and insert into oracle DB

    Hi All,
    I have the below requirement:
    I will receive data in an XML file; I then need to read that data and insert it into Oracle tables. Please let me know how this can be handled.
    Many Thanks.

    Sounds a lot like this question, only with fewer details:
    how to read data from XML variable and insert into table variable
    We can only help if you provide details, as we cannot see what you are doing and only know what you tell us. Plenty of examples that cover this topic abound on the forums as well.
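    A minimal sketch of the usual Oracle approach, assuming the XML arrives in a bind variable or CLOB with a simple /rows/row structure holding id and name elements (the names and target table my_table are hypothetical, since the post gives no layout), is to shred it with XMLTABLE and insert the result:
    INSERT INTO my_table (id, name)
    SELECT x.id, x.name
    FROM   XMLTABLE('/rows/row'
             PASSING XMLTYPE(:xml_doc)
             COLUMNS id   NUMBER        PATH 'id',
                     name VARCHAR2(100) PATH 'name') x;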

  • Hi, extract data from an XML file and insert into another existing XML file

    I am searching for code to extract data from an XML file and insert it into another existing XML file using a Java program. I understand it is easy to extract data from an XML file; however, we do not want to create another XML file, we want to insert the extracted data into an existing one. Suggestions?
    The first XML file (text1.xml), from which two lines are needed:
    <?xml version="1.0" encoding="iso-8859-1"?>
    <xs:PrintDataRequest xmlns:xs="http://com.unisys.com/Anid"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://com.unisys.com/Anid file:ANIDWS.xsd">
    <xs:Person>
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://com.unisys.com/Anid file:ANIDWS.xsd">
    These two lines have to be inserted into the other existing XML file (text2.xml), at lines 3 and 4.
    Regards,
    bubbly

    Jadz_Core wrote:
    RandomAccessFile? If you know where you want to insert it.
    Are you sure about this? If you use that, the receiving file would have to have bytes inserted that exactly match the number of bytes replaced. I'm thinking that you'll likely have to stream through the second XML with a SAX parser and copy information (or insert new information) as you stream, with an XML writer of some sort.

  • Extract data from an XML file and insert into another existing XML file

    hello,
    I am searching for a way to extract data from an XML file and insert it into another existing XML file using a Java program. I understand it is easy to extract data from an XML file; however, we do not want to create another XML file, we want to insert the extracted data into an existing one. Suggestions?
    Regards,
    Zhuozhi

    If the files are small, you can load the target file into a DOM document, insert the data from the source file, and persist the DOM.
    If the files are large, you probably want to use StAX or SAX.

  • Loading consolidated data from two Excel files / cubes into one InfoCube

    Hi Friends,
    I am receiving data from two sources:
    Source 1:
    Customer   Product   Location   KeyFig (Budget)
    C1         P1        L1         100
    C2         P1        L1         200
    C1         P2        L1         300
    Source 2:
    Product   Location   KeyFig (Actual)
    P1        L1         320
    P2        L1         350
    I want to combine the data from the two sources (or cubes) into one cube as follows:
    Customer   Product   Location   Budget   Actual
    C1         P1        L1         100      320
    C2         P1        L1         200      320
    C1         P2        L1         300      350
    I tried creating multiple DataSources / InfoSources / transformations / update rules, and also tried with both a DSO and a cube, but the records always end up like this:
    Customer   Product   Location   Budget   Actual
    C1         P1        L1         100      320
    C2         P1        L1         200      320
    C1         P2        L1         300      350
               P1        L1                  320
               P2        L1                  350
    Can you help me figure out whether this is possible? If yes, how can I do it?
    Thanks a lot in advance.

    Hi,
    Please use the below approach.
    Load the budget data in ODS1.
    Load the actual data in ODS2.
    Create an ODS3 with the same structure as ODS1 plus an additional key figure for Actuals, which will get its data from ODS1. There, add a lookup based on product and location to populate the actuals.
    Start Routine
    SELECT * FROM ODS2 into ITAB
    FOR ALL ENTRIES in SOURCE_PACKAGE.
    Transformation Routine:
    Read table ITAB  into WATAB
    with key location = <source_fields>-Location
    product =  <source_fields>-Product.
    If sy-subrc = 0.
    RESULT = WATAB-ACTUAL.
    ENDIF.
    -Vikram

  • Getting data from some source and insert into corresponding source

    Hi all,
    I am using DB 10g.
    I have data in the format:
    MMA : add1 : add2 : add3
    The above is the format; the actual data will be something like
    MMA : bank street : no:32 : tel: +9127546663
    add1 = bank street
    add2 = no:32
    add3 = tel: +9127546663
    My requirement is to store add1, add2 and add3 in the table.
    To extract the data I am using the SUBSTR and INSTR functions to split on the colon (:). My problem is that, for example, add2 holds no:32; I need to fetch no:32 from the line and store it in the table as no:32, and in that case I cannot split the value simply on the colon,
    since : is also the delimiter (separator) between data elements.
    How can I solve this issue?
    Thanks..
    Edited by: user13329002 on Jan 1, 2011 11:37 PM

    Your first add1 is terminated by (:).
    Your second add2 starts with (no:) and is also terminated by (:),
    and the third add3 starts with (tel:).
    Only the first and third (:) are used as delimiters, so how about this?
    WITH T AS (SELECT 'bank street:no:32:tel:+9127546663' AS STR FROM DUAL)
    SELECT SUBSTR(STR, 1, INSTR(STR, ':', 1, 1)-1) AS ADD1,
    SUBSTR(STR, INSTR(STR, ':', 1, 1)+1, INSTR(STR, ':', 1, 3)-INSTR(STR, ':', 1, 1)-1) AS ADD2,
    substr(str, instr(str, ':', -1, 2)+1) as add3 from t;
    bank street       no:32     tel:+9127546663
    Or can't you just ask the people who gave you the requirement to change the delimiter used in the format to something else, like a pipe (|)?
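    Another option, since the raw rows start with MMA and every element is wrapped by a space-padded separator (' : '), is to split on ' : ' rather than on the bare colon, so colons inside a field such as no:32 or tel: are never treated as delimiters. A sketch, assuming the format shown above:
    WITH t AS (SELECT 'MMA : bank street : no:32 : tel: +9127546663' AS str FROM dual)
    SELECT SUBSTR(str, INSTR(str, ' : ', 1, 1) + 3,
                  INSTR(str, ' : ', 1, 2) - INSTR(str, ' : ', 1, 1) - 3) AS add1,
           SUBSTR(str, INSTR(str, ' : ', 1, 2) + 3,
                  INSTR(str, ' : ', 1, 3) - INSTR(str, ' : ', 1, 2) - 3) AS add2,
           SUBSTR(str, INSTR(str, ' : ', 1, 3) + 3) AS add3
    FROM   t;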

  • Pull data from a text file and insert into a database

    Can anyone tell me how to pull data from a text file and insert it into a database?
    let's say my text file is in this format
    123456 Peter 22
    234567 Nicholas 24
    345678 Jane 20
    Then I need to insert all the values for these three columns into a table that has the three columns ID, Name, Age.
    Anyone know? I need to do this urgently... Thanks in advance.

    1. Use BufferedReader and read the file line by line.
    2. Loop through the file and do the following steps within the loop.
    3. Use StringTokenizer to separate each line into its three values (columns).
    4. Build an INSERT statement with these values and add it to the batch (using the addBatch() method of PreparedStatement or Statement).
    5. Finally (after exiting the loop), execute the batch of statements (using ps.executeBatch()).
    Sudha

  • Can I transfer Time Machine data from two separate hard drives into one new one?

    I'm using a MacBook Pro as my primary computer.  My 500 gig Time Capsule filled up a year or so ago, so I stopped using it with Time Machine for awhile so I could keep the data from those old back-ups.  There were a number of things I deleted from my computer's very limited hard drive after they were backed up to the Time Capsule.  I got a 1T external USB drive last year to use as my "filing Cabinet" to store files I didn't necessarily need all the time or that were filling up my small laptop hard drive--including my iTunes library--all organized in a way that made it relatively easy for me to find what I needed, even if I didn't remember exactly when I'd filed it or what I'd called it.  I got another 1 terabyte external (portable) drive last July and dedicated it to TimeMachine backups and labeled it "TimeMachine".
    Over the last couple of weeks, my friend has been helping me upgrade to Yosemite and clean up my laptop hard drive.  Last week he cloned my laptop hard drive to a new 1T hard drive and I exchanged it for my old drive in my computer today. All good.
    Here's the issue.  We replaced my Time Capsule hard drive with a 1 terabyte drive with the idea of transferring the data from the old Time Capsule (500G) drive and the newer USB 1T "TimeMachine" compact drive to the new 1T Time Capsule drive and beginning using the latter for my Time Machine backups going forward.  Originally he thought we could copy everything from each of my external drives (the old 500gig drive from my Time Capsule, the USB "TimeMachine" drive I've been using since July, and the "file cabinet" files) to my computer in their own folders and then start regular TimeMachine backups to the new Time Capsule drive, thus preserving all my old data and making regular backups going forward.  The "file cabinet" data was no problem at all, but when I tried to copy my USB "TimeMachine" data to my computer, I was unable to.  My friend found instructions for transferring old TimeMachine data to a new TimeCapsule, but I don't know if I can transfer the data from two separate disks to the new TimeCapsule drive. I'm afraid that one set of data will supersede the other and either my newer backups or my old ones will be lost if I try to transfer both. 
    Are my fears justified or is there a way to insure that no such problem will occur?  Of course, my data will still be on those two older drives, but that won't do me a lot of good if I can't access it when I need to. Also, the 1T drive now belongs to my friend; he used a brand new drive he'd bought for himself for my new internal hard drive and plans to take my 1T "TimeCapsule" drive in exchange, once the data has been transferred, so he will, of course, erase that drive. 

    Should we be able to bring up the old (500G) Time Capsule Drive to rename it using a SATA to FireWire harness and then copy the whole thing to the new Time Capsule drive?
    You can copy a whole sparsebundle from one drive to another. That is not a problem. Whether you can access the sparsebundle is something you should test before you even start though.
    If it's on the Desktop and I don't tell Time Machine to exclude it from backups, will it just automatically back it up?
    All drives you plug into the Mac are excluded by default.. you must include them. So no problems there.. but I hope I am understanding the question.
    Will both volumes or directories (which is the right term?) show up when I open Time Machine?
    No, Time Machine will only open what it is told to open... or its backup default location.
    You can force Time Machine to open alternative/old/no longer used backups by (now I have a problem as things have changed somewhat in Yosemite and I consider it alpha release software at this point in time). The old method was to right click on the TM icon and select a different TM backup. easy. Yosemite seems to have made easy stuff harder.
    Here it is on my current computer.. clearly not Yosemite.. Right click on the TM icon in the dock.. select Browse Other Time Machine Disks.. And supply the info of where that is located. Easy. If you cannot figure it out one of the other posters here (with more patience than me for Yosemite) will help you.
    Or will we have to partition the new drive somehow--is that even possible?
    I am getting more lost as I go down the list.. but the TC disk cannot be partitioned.
    If you have included all those USB drives in the new backup on a Time Capsule.. you have made life rather hard because now your files are stored another layer deeper than they were.
    So to open a file from a disk you need to open the sparsebundle.. then dig down to the drive in question and then dig down to the backup.. and all of this means Time Machine has to work perfectly, which in Yosemite is a very big ask.
    I thought you wanted to just backup your old drives to central location.. which means copying the files to a separate folder on the Time Capsule.
    One correction I need to make to my post, which will make my strategy make a bit more sense: my new Time Capsule drive is 4T
    It makes it much harder.. and I have to pose a real question of long term .. if you have put a 4TB drive in a Gen1 TC.. did you also replace its power supply because I can assure you the drive might be ok but the TC itself will not last forever.. and what happens when it dies. The Gen1 power supply is already well beyond its normal life span and the vast majority are dead. When the backup device is unreliable and the backups on it are made that much harder to get access to.. is this a great plan??
    If you are going to consolidate all your old files on one disk.. a task I find understandable. I have done much the same, albeit the usefulness of files made on an eMac running OS9 may be questioned. A disk lying in the bottom of a drawer is a more appropriate place for them.
    You want those files as easily accessible as possible (at the point of recovery) and not buried inside a sparsebundle.. particularly not a sparsebundle from the old TC disk buried inside a new sparsebundle.. keep files as accessible as possible as you can run searches.. and that is best done on a USB 3 (or faster ie thunderbolt) drive plugged into a new(ish) computer.. not network. And since the files are not being accessed on a daily/weekly or even monthly or yearly basis.. keeping them in actively running TC network storage.. I would say is a waste of space. That is only my opinion of course.. you might consider it highly important that files you will never look at are available any time of day or night when the urge comes to track down that elusive pimpernel email you sent 10 years ago... but I find it hard to justify. What the case.. the problem with TM and things like Mail is you cannot search it.. you must restore the whole library/files/program even before you can access it.. that makes file recovery out of a sparsebundle double step process.
    So.. summary.
    If you want to store files on the 4TB drive in the TC.. that is not a great strategy but it can work.. simply create a folder named.. OldFilesEMac for instance.. and do a simple copy and paste of all the file to that location. Do not use TM.. Since you have already used TM.. and from what I am reading you have already done the backup with the external drives included.. then you are going to end up needing to erase the TC and start over.. which you may not be prepared to do.. which is fair enough. (I am coming across as overbearing school master.. apologies).
    TM is to backup your main OS and current files.. not files from 10-20years ago.
    Please do read the issues involved in Pondini..
    See his FAQ. I recommend you read through Q14-17 so you understand what is involved in recovery.
    http://pondini.org/TM/FAQ.html
    I also recommend you read the first couple of articles here. http://pondini.org/TM/Home.html
    Particularly so you understand the complexity of Time Machine.
    And the articles here. http://pondini.org/TM/Time_Capsule.html
    Particularly Q3 on mixing data and backups on a TC.
    I wish I could spend an hour or two face to face and work it out.. the whole strategy to do this.. !!

  • How to get the data from a file and insert into a table

    Good morning,
    I need to read the file sui_facturacion_alcantarillado_15085_2011_01_76845_00A.csv, which contains the following information:
    NUID,NUMERO_DE_CUENTA_CONTRATO,CÓDIGO_DANE_DEPARTAMENTO,CÓDIGO_DANE_MUNICIPIO,ZONA_IGAC,SECTOR_IGAC,MANZANA_O_VEREDA_IGAC,NÚMERO_DEL_PREDIO_IGAC,CONDICION_DE_PROPIEDAD_DEL_PREDIO_IGAC,DIRECCIÓN_DEL_PREDIO,NÚMERO_DE_FACTURA,FECHA_DE_EXPEDICIÓN_DE_LA_FACTURA,FECHA_DE_INICIO_DEL_PERÍODO_DE_FACTURACIÓN,DIAS_FACTURADOS,CÓDIGO_CLASE_DE_USO,UNIDADES_MULTIUSUARIO_RESIDENCIAL,UNIDADES_MULTIUSUARIO_NO_RESIDENCIAL,HOGAR_COMUNITARIO_O_SUSTITUTO,USUARIO_FACTURADO_CON_AFORO,USUARIO_CUENTA_CON_CARACTERIZACIÓN,CARGO_FIJO,CARGO_POR_VERTIMIENTO_BASICO,CARGO_POR_VERTIMIENTO_COMPLEMENTARIO,CARGO_POR_VERTIMIENTOSUNTUARIO,CMT,VERTIMIENTO_DEL_PERIOD_EN_METROS_CUBICOS,VALOR_FACTURADO_POR_VERTIDO,VALOR_DEL_SUBSIDIO,VALOR_DE_LA_CONTRIBUCIÓN,FACTOR_DE_SUBSIDIO_O_CONTRIBUCIÓN_CARGO_FIJO,FACTOR_DE_SUBSIDIO_O_CONTRIBUCIÓN_VERTIMIENTO,CARGOS_POR_CONEXIÓN,PAGO_ANTICIPADO_DEL_SERVICIO,DÍAS_DE_MORA,VALOR_DE_MORA,INTERESES_POR_MORA,OTROS_COBROS,CAUSAL_DE_REFACTURACIÓN,NUMERO_DE_LA_FACTURA_OBJETO_DE_REFACTURACIÓN,VALOR_TOTAL_FACTURADO,PAGOS_DEL_CLIENTE_DURANTE_EL_PERÍODO_FACTURADO
    242602,242602,76,845,99,99,9999,9999,999,CLL 5 CRA 7 PEATONAL,24911920,12-01-2011,01-12-2010,30,01,,,0,0,0,1,0000000000.00,0000000000.00,0000000000.00,0000000000.00,0000000005,0000002200.00,0000000000,0000000000,0.000,0.000,0000000000.00,0000000000.00,0,0000000000.00,0000000000.00,0000000000.00,0,0,0000002201.00,0000000000.00
    242604,242604,76,845,99,99,9999,9999,999,CRA 4 # 6 - 13,24911846,12-01-2011,01-12-2010,30,01,,,0,0,0,1,0000000000.00,0000000000.00,0000000000.00,0000000000.00,0000000013,0000002200.00,0000000000,0000000000,0.000,0.000,0000000000.00,0000000000.00,0,0000000000.00,0000000000.00,0000000000.00,0,0,0000002201.00,0000004411.00
    242605,242605,76,845,99,99,9999,9999,999,CRA 2 CLLES 3 Y 4,24911509,12-01-2011,01-12-2010,30,01,,,0,0,0,1,0000000000.00,0000000000.00,0000000000.00,0000000000.00,0000000004,0000002200.00,0000000000,0000000000,0.000,0.000,0000000000.00,0000000000.00,0,0000000000.00,0000000000.00,0000000000.00,0,0,0000002201.00,0000002200.00
    This is the function that I have:
    <<function_test>>
    DECLARE
    TOTAL_CAR NUMBER;
    POS_1 NUMBER:= 0;
    POS_2 NUMBER:= 0;
    REST NUMBER:= 0;
    ACUM NUMBER:= 0;
    CADEN VARCHAR2(200);
    nom_archivo varchar2(80);
    v1 utl_file.file_type;
    v2 varchar2(2048);
    BEGIN
         nom_archivo := 'sui_facturacion_alcantarillado_15085_2011_01_76845_00A.csv';
         v1:= utl_file.fopen('PUBLIC_ACCESS',nom_archivo,'R',32767);
         utl_file.get_line(v1,v2);
         SELECT LENGTH(v2) INTO TOTAL_CAR FROM DUAL;
         ACUM:=1;
         POS_1:=0;
         WHILE ACUM <= 60
         LOOP
              select instr(v2, ',', 1, ACUM) PRUEBA
              INTO      POS_2
              FROM      DUAL;
              dbms_output.put_line(' TOTAL POSICION 1--> '|| POS_1);
              dbms_output.put_line(' TOTAL POSICION 2--> '|| POS_2);
              dbms_output.put_line(' TOTAL ACUMULADO --> '|| ACUM);
              REST := (POS_2-POS_1)-1;
              SELECT SUBSTR(v2,(POS_1+1),REST) PRUEBA2
                   INTO CADEN
              FROM      DUAL;     
              dbms_output.put_line(' CADENA SELECCIONADA --> '|| CADEN);
              ACUM := ACUM + 1;
              POS_1:= POS_2;
         END LOOP;
         utl_file.fclose(v1);
         dbms_output.put_line(' -->');
         dbms_output.put_line(' TOTAL POSICION 1-->'|| POS_1);
         dbms_output.put_line(' TOTAL POSICION 2-->'|| POS_2);
         dbms_output.put_line(' TOTAL ACUMULADO -->'|| ACUM);
         dbms_output.put_line(' TOTAL DE CARACTERES -->'|| TOTAL_CAR);
         dbms_output.put_line(' ');     
         EXCEPTION
              WHEN NO_DATA_FOUND THEN
                   dbms_output.put_line('NO SE ENCONTRARON MAS CARACTERES');
              WHEN OTHERS THEN
                   dbms_output.put_line('OTRO TIPO DE ERROR ');
                   dbms_output.put_line('CODIGO ERROR '|| SQLCODE ||' '||SQLERRM);
    END;
    The values are separated by commas and must be loaded into a table. The procedure I have only reads the first line, which I do not need.
    I need the information from rows 2, 3 and 4. Each row has 41 fields, which I need to insert into a table called Dato_archivos.
    How can I change this function to do that?
    I appreciate the cooperation and explanation ...
    GOOD DAY ...
    REYNEL SALAZAR MARTINEZ
    COLOMBIA ...

    When you get an error with external tables (or SQL*Loader), look in the same folder as the data file and you should find a .log file and maybe a .bad file too.
    The log file should indicate the nature of the error it hit while trying to load the data.
    I've just copied your sample data from your first post to a file on my server and tried it, and found that you are not specifying the required format for your dates. The below shows it working...
    CREATE TABLE tabla_prueba
      (NUID NUMBER,
       NUM_CUENTA_CONTRATO NUMBER,
       COD_DANE_DD NUMBER,
       COD_DANE_MM NUMBER,
       ZONA_IGAC NUMBER,
       SECTOR_IGAC NUMBER,
       MANZANA_VEREDA_IGAC NUMBER,
       NUM_PREDIO_IGAC NUMBER,
       CONDICION_PREDIO_IGAC NUMBER,
       DIRECCION_PREDIO_IGAC VARCHAR2(80),
       NUM_FACTURA NUMBER,
       FECHA_EXPED_FACTURA DATE,
       FECHA_INI_PERIODO_FACTURACION DATE,
       DIAS_FACTURADOS NUMBER,
       COD_CLASE_USO NUMBER,
       UNI_MULTIUSUARIO_RESIDENCIAL NUMBER,
       UNI_MULTIUSUARIO_NORESIDENCIAL NUMBER,
       HOGAR_COMUNITARIO NUMBER,
       USUARIO_FACTURADO_AFORO NUMBER,
       USUARIO_CON_CARACTERIZACION NUMBER,
       CARGO_FIJO NUMBER,
       CARGO_VERTIMENTO_BAS NUMBER,
       CARGO_VERTIMENTO_COMP NUMBER,
       CARGO_VERTIMENTO_SUNT NUMBER,
       CMT NUMBER,
       VLR_FACTURADO_VERTIDO NUMBER,
       VLR_SUBSIDIO NUMBER,
       VLR_CONTRIBUCCION NUMBER,
       FACTOR_SUBS_CONTR_CARGO_FIJO NUMBER,
       FACTOR_SUBS_CONTR_VERTIMENTO NUMBER,
       CARGO_CONEXION NUMBER,
       PAGO_ANTICIPADO_SERVICIO NUMBER,
       DIAS_MORA NUMBER,
       VLR_MORA NUMBER,
       INTERES_MORA NUMBER,
       OTROS_COBROS NUMBER,
       CAUSAL_REFACTURACION NUMBER,
       NUM_FACTURA_OBJ_REFACTURACION NUMBER,
       VLR_TOTAL_FACTURADO NUMBER,
       PAGOS_CLIENTE_DURANTE_PERIODO NUMBER)
    ORGANIZATION EXTERNAL
      (TYPE ORACLE_LOADER
       DEFAULT DIRECTORY TEST_DIR
       ACCESS PARAMETERS
        (RECORDS DELIMITED BY NEWLINE
         SKIP 1
         FIELDS TERMINATED BY ","
         OPTIONALLY ENCLOSED BY '"'
          (NUID,
           NUM_CUENTA_CONTRATO,
           COD_DANE_DD,
           COD_DANE_MM,
           ZONA_IGAC,
           SECTOR_IGAC,
           MANZANA_VEREDA_IGAC,
           NUM_PREDIO_IGAC,
           CONDICION_PREDIO_IGAC,
           DIRECCION_PREDIO_IGAC,
           NUM_FACTURA,
           FECHA_EXPED_FACTURA CHAR DATE_FORMAT DATE MASK "DD-MM-YYYY",
           FECHA_INI_PERIODO_FACTURACION CHAR DATE_FORMAT DATE MASK "DD-MM-YYYY",
           DIAS_FACTURADOS,
           COD_CLASE_USO,
           UNI_MULTIUSUARIO_RESIDENCIAL ,
           UNI_MULTIUSUARIO_NORESIDENCIAL,
           HOGAR_COMUNITARIO ,
           USUARIO_FACTURADO_AFORO ,
           USUARIO_CON_CARACTERIZACION ,
           CARGO_FIJO ,
           CARGO_VERTIMENTO_BAS ,
           CARGO_VERTIMENTO_COMP,
           CARGO_VERTIMENTO_SUNT,
           CMT,
           VLR_FACTURADO_VERTIDO,
           VLR_SUBSIDIO ,
           VLR_CONTRIBUCCION ,
           FACTOR_SUBS_CONTR_CARGO_FIJO ,
           FACTOR_SUBS_CONTR_VERTIMENTO ,
           CARGO_CONEXION ,
           PAGO_ANTICIPADO_SERVICIO ,
           DIAS_MORA ,
           VLR_MORA ,
           INTERES_MORA ,
           OTROS_COBROS ,
           CAUSAL_REFACTURACION ,
           NUM_FACTURA_OBJ_REFACTURACION,
           VLR_TOTAL_FACTURADO,
            PAGOS_CLIENTE_DURANTE_PERIODO))
        LOCATION ('test.csv'));
    SQL> select * from tabla_prueba;
          NUID NUM_CUENTA_CONTRATO COD_DANE_DD COD_DANE_MM  ZONA_IGAC SECTOR_IGAC MANZANA_VEREDA_IGAC NUM_PREDIO_IGAC CONDICION_PREDIO_IGAC DIRECCION_PREDIO_IGAC                                                    NUM_FACTURA FECHA_EXPE FECHA_INI_
    DIAS_FACTURADOS COD_CLASE_USO UNI_MULTIUSUARIO_RESIDENCIAL UNI_MULTIUSUARIO_NORESIDENCIAL HOGAR_COMUNITARIO USUARIO_FACTURADO_AFORO USUARIO_CON_CARACTERIZACION CARGO_FIJO CARGO_VERTIMENTO_BAS CARGO_VERTIMENTO_COMP CARGO_VERTIMENTO_SUNT        CMT
    VLR_FACTURADO_VERTIDO VLR_SUBSIDIO VLR_CONTRIBUCCION FACTOR_SUBS_CONTR_CARGO_FIJO FACTOR_SUBS_CONTR_VERTIMENTO CARGO_CONEXION PAGO_ANTICIPADO_SERVICIO  DIAS_MORA   VLR_MORA INTERES_MORA OTROS_COBROS CAUSAL_REFACTURACION NUM_FACTURA_OBJ_REFACTURACION
    VLR_TOTAL_FACTURADO PAGOS_CLIENTE_DURANTE_PERIODO
        242602              242602          76         845         99          99                9999         9999              999 CLL 5 CRA 7 PEATONAL                                                         24911920 12-01-2011 01-12-2010
                 30             1                                                                             0               0                           0          1                    0                     0                     0          0
                        5         2200                 0                            0                            0              0                        0          0          0            0            0                    0                             0
                      0                          2201
        242604              242604          76         845         99          99                9999         9999              999 CRA 4 # 6 - 13                                                               24911846 12-01-2011 01-12-2010
                 30             1                                                                             0               0                           0          1                    0                     0                     0          0
                       13         2200                 0                            0                            0              0                        0          0          0            0            0                    0                             0
                      0                          2201
        242605              242605          76         845         99          99                9999         9999              999 CRA 2 CLLES 3 Y 4                                                            24911509 12-01-2011 01-12-2010
                 30             1                                                                             0               0                           0          1                    0                     0                     0          0
                        4         2200                 0                            0                            0              0                        0          0          0            0            0                    0                             0
                      0                          2201
    SQL>
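    Once the external table selects cleanly, loading the table you mentioned is a single statement; a sketch only, assuming your Dato_archivos table has columns matching tabla_prueba in order:
    INSERT INTO dato_archivos
    SELECT * FROM tabla_prueba;
    COMMIT;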

  • How to read data from a file and insert into a table using UTL_FILE

    Hi,
    I have a file. I want to read the data from the file and load it into a table using UTL_FILE.
    How can I do it?
    Any reply appreciated...

    Hi,
    This is not exactly your requirement, but you can try this:
    CREATE OR REPLACE DIRECTORY text_file AS 'D:\TEXT_FILE\';
    GRANT READ ON DIRECTORY text_file TO fah;
    GRANT WRITE ON DIRECTORY text_file TO fah;
    DROP TABLE load_a;
    CREATE TABLE load_a
    (a1 varchar2(20),
     a2 varchar2(200))
    ORGANIZATION EXTERNAL
    (TYPE ORACLE_LOADER
     DEFAULT DIRECTORY text_file
     ACCESS PARAMETERS
     (FIELDS TERMINATED BY ',')
     LOCATION ('data.txt'));
    SELECT * FROM load_a;
    CREATE TABLE a AS SELECT * FROM load_a;
    SELECT * FROM a;
    Regards
    Faheem Latif
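    Since the question asked for UTL_FILE specifically, here is a minimal UTL_FILE sketch as well; it assumes the same text_file directory, a two-field comma-delimited line, and the load_a table from above (for large files, the external table approach is usually the better choice):
    DECLARE
      v_file utl_file.file_type;
      v_line VARCHAR2(4000);
    BEGIN
      v_file := utl_file.fopen('TEXT_FILE', 'data.txt', 'R', 4000);
      LOOP
        BEGIN
          utl_file.get_line(v_file, v_line);   -- raises NO_DATA_FOUND at end of file
        EXCEPTION
          WHEN NO_DATA_FOUND THEN EXIT;
        END;
        INSERT INTO load_a (a1, a2)
        VALUES (SUBSTR(v_line, 1, INSTR(v_line, ',') - 1),
                SUBSTR(v_line, INSTR(v_line, ',') + 1));
      END LOOP;
      utl_file.fclose(v_file);
      COMMIT;
    END;
    /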

  • Fetching data from two tables

    Hi friends,
    I have two similar tables, A and B, with one additional column in table B: an "Active Flag".
    Table A
    Rowid   Fname   Lname
    101     AF1     AL1
    102     AF2     AL2
    103     AF3     AL3
    104     AF4     AL4
    Table B
    Rowid   Fname   Lname   Flag
    101     BF1     BL1     Y
    102     BF2     BL2     N
    103     BF3     BL3     N
    104     BF4     BL4     Y
    I need to fetch data into table C with columns Rowid, Fname and Lname.
    If Flag is Y in table B, that record should go to table C; otherwise the record from table A goes to C. Rowid is the primary key.
    Table C should look like this:
    Rowid   Fname   Lname
    101     BF1     BL1
    102     AF2     AL2
    103     AF3     AL3
    104     BF4     BL4
    Thanks in advance

    Hi,
    try this
    WITH T AS
         (
            SELECT 101 Id, 'BF1' fname, 'BL1' lname, 'Y' flag FROM DUAL
            UNION ALL
            SELECT 102, 'BF2', 'BL2', 'N' FROM DUAL
            UNION ALL
            SELECT 103, 'BF3', 'BL3', 'N' FROM DUAL
            UNION ALL
            SELECT 104, 'BF4', 'BL4', 'Y' FROM DUAL),
         T1 AS
         (
            SELECT 101 Id, 'BF1' fname, 'BL1' lname FROM DUAL
            UNION ALL
            SELECT 102, 'AF2', 'AL2' FROM DUAL
            UNION ALL
            SELECT 103, 'AF3', 'AL3' FROM DUAL
            UNION ALL
            SELECT 104, 'BF4', 'BL4' FROM DUAL)
    SELECT T1.Id, T1.fname, T1.lname
      FROM T, T1
     WHERE T.Id = T1.Id
       AND T.flag = 'Y';
            ID FNA LNA
           101 BF1 BL1
           104 BF4 BL4
    Also, ROWID is an Oracle reserved word; you should be aware of Oracle reserved words. Kindly refer to V$RESERVED_WORDS.
    Edited by: user291283 on Sep 7, 2009 10:18 PM
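    Applied to the actual tables rather than the WITH test data, and returning every key (not just the flag = 'Y' rows), a sketch would be an outer join with a CASE on the flag; the table names table_a/table_b and the key column row_id are placeholders here, since ROWID itself is reserved:
    SELECT a.row_id,
           CASE WHEN b.flag = 'Y' THEN b.fname ELSE a.fname END AS fname,
           CASE WHEN b.flag = 'Y' THEN b.lname ELSE a.lname END AS lname
    FROM   table_a a
    LEFT JOIN table_b b ON b.row_id = a.row_id;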

  • PHP create array/variable for two dropdowns and insert into one column

    Hello,
    I have been trying for days to get this to work. Any help would be much appreciated.
    The dropdown code:
    <?php //Dropdowns for hours and minutes
    $hour = array (0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16);
    $min = array (.00, .25, .50 ,.75);
    echo '<span class="red">Hours :</span><select name="hour">';
    foreach ($hour as $value1) {
    echo "<option value=\"$value1\">$value1</option>\n";
    echo '</select>';
    echo '<span class="red">Quarter Hours :</span><select name="min"';
    foreach ($min as $value2) {
    echo "<option value=\"$value2\">$value2</option>\n";
    echo '</select>';
    //Dropdowns for hours and minutes
    //Create variable to send to the time field using a hidden field
    function input_time($value1, $value2) {
    $time = count($value1 + $value2);
    return $input_time;
    ?>
    <input name="time" type="hidden" value="<?php echo "$input_time";?>" />
    The Schema:
    CREATE TABLE `ND_time_sheet` (
      `time_id` int(10) unsigned NOT NULL AUTO_INCREMENT,
      `day` varchar(10) NOT NULL DEFAULT '',
      `date` varchar(8) NOT NULL DEFAULT '',
      `project_id` varchar(8) DEFAULT NULL,
      `non_bill` tinytext,
      `staff_id` varchar(5) NOT NULL DEFAULT '',
      `time` decimal(3,2) unsigned zerofill DEFAULT NULL,
      `type` varchar(30) NOT NULL DEFAULT '',
      PRIMARY KEY (`time_id`)
    ) ENGINE=MyISAM DEFAULT CHARSET=latin1
    Thanks!

    9677670421 wrote:
    Hi... I am trying to do the same, but I have a 2D array of data. The first time I click in the table it should show the first row's values, and if I click again the same value should show in the second row, and so on... but instead, in my code, when I click (using Insert Into Array) it shows two rows of data at a time. Please help me.
    This seems to be a different problem, so you should have started a new thread instead.
    Can you show us your code, tell us what you are clicking, tell us what you get and what you expect to get instead. Thanks.

  • Writing data from two different acquisition rates into one file

    Hi,
    I'm using a compactRIO, and I have an anemometer outputting data at 4Hz to the serial port on the controller, and I am using a 9237 module to read strain from two channels at 40Hz. I would like to record all this data in one file, but I'm unsure how to have the data points match up with respect to time since they are read at different rates. Basically I have 10 strain data points for every one anemometer data point, but that one data point needs to somehow be associated with the other 10 read in the same amount of time. It would be okay to "extend" the slower data to the 10 strain points, and likewise to the next ten when the anemometer refreshes its values.
    Thanks for any input!
    Andrew

    Check out the Queue Multiplexer VI from the examples: C:\Program Files\National Instruments\LabVIEW 2009\examples\general\queue.llb\Queue Multiplexer.vi
    If each data acquisition loop enqueues its data to one file-writing loop, the data is written in order. Queues are really a good way to multiplex data sources.
    Message Edited by Jeff Bohrer on 01-29-2010 09:44 AM
    Jeff

  • Fetch data from two databases

    I want to fetch data from two tables that are in two separate databases. Is it also possible to fetch similar data from two schemas of the same database?
    Please help with the query.

    Hi,
    Say you have two schemas, S1 and S2, in the same database 'xyz'. Then you can create two database links to these two schemas from any third schema as:
    CREATE database link s1link connect to S1 identified by <password of S1> using 'xyz' ;
    CREATE database link s2link connect to S2 identified by <password of S2> using 'xyz' ;
    Now, if there is a table called tab1 in schema S1, you can get data from it as:
    SELECT T1.*
    FROM tab1@s1link T1
    WHERE < you can put conditions here> ;
    Similarly for the columns of a table of schema S2.
    - Thanks
    Sandipan
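    And if you need the two tables in a single result set, you can join across both links in one query (table and column names here are only placeholders):
    SELECT t1.id, t1.col_a, t2.col_b
    FROM   tab1@s1link t1
    JOIN   tab2@s2link t2 ON t2.id = t1.id;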

  • Fetch the data from two tables

    Hello all,
    I want to fetch data from two tables: one is an internal table and the other is a database table. Which syntax should I use, FOR ALL ENTRIES or INNER JOIN?

    Hi,
    Use FOR ALL ENTRIES.
    See the sample code:
      select * into table tvbrk from vbrk
                                where fkart in ('F2', 'F3', 'RE',
                                                'ZVEC', 'ZVEM', 'ZVED',
                                                'S1')
                                and erdat in so_erdat
                                and kunag in s_kunag.
    if not tvbrk is initial.
        select * into table t_zregion from zregion
                      for all entries in tvbrk
                       where country = tvbrk-land1
                       and   region = s_regio.
      endif.
    thanks
    sitaram
