Seeking table load solution

Hello,
I'm trying to load a table with 10 fields with data from a spreadsheet. Some fields exceed 4000 characters. What would be the best way to handle this situation? A CLOB or BLOB? If so, how do I go about loading the table?
I'm familiar with sqlldr, but I'm limited to 4000 characters when loading.
Any help would be greatly appreciated.
Thanks

sqlldr can handle LOBs. See the manual for details. You have several options for how you feed the LOB data to Oracle via the utility, such as providing the LOB data in separate files.
HTH -- Mark D Powell --
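
For reference, a minimal control-file sketch for the CLOB case (the table, column and file names are invented for illustration). A delimited character field can be loaded straight into a CLOB column by raising the field length past 4000, or the LOB data can be supplied in secondary files named in the data file:

LOAD DATA
INFILE 'spreadsheet.csv'
INTO TABLE my_table
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
( id        INTEGER EXTERNAL,
  short_col CHAR(4000),
  big_text  CHAR(100000)          -- target column is a CLOB; the default field length is only 255
)

-- Alternative for LOB data kept in separate files, one file name per input record:
-- ( id       INTEGER EXTERNAL,
--   fname    FILLER CHAR(200),   -- file name read from the data file, not loaded
--   big_text LOBFILE(fname) TERMINATED BY EOF
-- )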

Similar Messages

  • Using WHERE NOT EXISTS for a Fact Table Load

I'm trying to set up a fact table load using T-SQL, and I need to use WHERE NOT EXISTS. All of the fields from the fact table are listed in the WHERE NOT EXISTS clause. What I expect is that if the value of any one of the fields is different, the whole
record is treated as a new record and inserted into the table. However, in my testing, when I 'force' a field value, new records are not inserted.
    The following is my query:
    declare 
    @Created_By nchar(50)
    ,@Created_Date datetime --do we need utc check?
    ,@Updated_By nchar(50)
    ,@Updated_Date datetime
    select @Created_By = system_user
    ,@Created_Date = getdate()
    ,@Updated_By = system_user
    ,@Updated_Date = getdate()
    insert fact.Appointment (
    Slot_ID
    , Slot_DateTime
    , Slot_StartDateTime
    , Slot_EndDateTime
    , Slot_Duration_min
    , Slot_CreateDateTime
    , Slot_CreateDate_DateKey
    , Healthcare_System_ID
    , Healthcare_Service_ID
    , Healthcare_Supervising_Service_ID
    , Healthcare_Site_ID
    , Booked_Appt_ID
    , Appt_Notification_Submission_DateKey
    , Appt_Notification_Completion_DateKey
    , Appt_Notification_Duration
    , Appt_Notification_ID
    , Patient_ID
    , Physician_ID
    , Referral_ID
    , Specialty
    , LanguageRequested
    , Created_Date
    , Created_By
    , Updated_Date
    , Updated_By
    )
    select distinct
    Slot.Slot_ID 
    , Slot.Slot_Start_DateTime  as Slot_DateTime --???
    , Slot.Slot_Start_DateTime
    , Slot.Slot_End_DateTime
    , datediff(mi,slot.Slot_Start_DateTime,slot.Slot_End_Datetime) as Slot_Duration_Min 
    , Slot.Created_Date as Slot_CreateDateTime
    , SlotCreateDate.Date_key as Slot_CreateDate_DateKey
    , HSite.Healthcare_System_ID
    , HSite.Healthcare_Service_ID
    , HSite.Healthcare_Service_ID as Healthcare_Supervising_Service_ID
    , HSite.Healthcare_Site_ID 
    , Ref.Booked_Appt_ID 
    , ApptSubmissionTime.Date_key as Appt_Notification_Submission_DateKey
    , ApptCompletionTime.Date_key as Appt_Notification_Completion_DateKey
    , datediff(mi,appt.SubmissionTime,appt.CompletionTime) as Appt_Notification_Duration
    , Appt.Appt_Notification_ID 
    , pat.Patient_ID 
    , 0 as Physician_ID
    , ref.Referral_ID
    , Hsrv.Specialty
    , appt.[Language] as LanguageRequested
    ,@Created_Date as Created_Date
    ,@Created_By as Created_By
    ,@Updated_Date as Updated_Date
    ,@Updated_By as Updated_By
    from dim.Healthcare_System HSys
    inner join dim.Healthcare_Service HSrv
    on HSys.Healthcare_System_ID = HSrv.HealthCare_System_ID 
    inner join dim.Healthcare_Site HSite
    on HSite.HealthCare_Service_ID = HSrv.Healthcare_Service_ID
    and HSite.HealthCare_System_ID = HSrv.HealthCare_System_ID 
    inner join dim.Referral Ref 
    on Ref.ReferralSite_ID = HSite.Site_ID
    and Ref.ReferralService_ID = HSite.Service_ID
    and Ref.ReferralSystem_ID = HSite.System_ID 
    right join (select distinct Slot_ID, Source_Slot_ID, Slot_Start_DateTime, Slot_End_DateTime, Created_Date from dim.slot)slot
    on ref.Source_Slot_ID = slot.Source_Slot_ID
    inner join dim.Appointment_Notification appt
    on appt.System_ID = HSys.System_ID
    inner join dim.Patient pat 
    on pat.Source_Patient_ID = appt.Source_Patient_ID
    inner join dim.SystemUser SysUser
    on SysUser.Healthcare_System_ID = HSys.Healthcare_System_ID
    left join dim.Calendar SlotCreateDate
    on SlotCreateDate.Full_DateTime = cast(Slot.Created_Date as smalldatetime)
    left join dim.Calendar ApptSubmissionTime
    on ApptSubmissionTime.Full_DateTime = cast(appt.SubmissionTime as smalldatetime)
    left join dim.Calendar ApptCompletionTime
    on ApptCompletionTime.Full_DateTime = cast(appt.CompletionTime as smalldatetime)
    where not exists
    (
    select
    Slot_ID
    , Slot_DateTime
    , Slot_StartDateTime
    , Slot_EndDateTime
    , Slot_Duration_min
    , Slot_CreateDateTime
    , Slot_CreateDate_DateKey
    , Healthcare_System_ID
    , Healthcare_Service_ID
    , Healthcare_Supervising_Service_ID
    , Healthcare_Site_ID
    , Booked_Appt_ID
    , Appt_Notification_Submission_DateKey
    , Appt_Notification_Completion_DateKey
    , Appt_Notification_Duration
    , Appt_Notification_ID
    , Patient_ID
    , Physician_ID
    , Referral_ID
    , Specialty
    , LanguageRequested
    , Created_Date
    , Created_By
    , Updated_Date
    , Updated_By
    from fact.Appointment
    )
    I don't have any issues with the initial insert, but records are not inserted on subsequent inserts when one of the WHERE NOT EXISTS field values changes.
    What am I doing wrong?
    Thank you for your help.
    cdun2

    so I set up a WHERE NOT EXISTS condition as shown below. I ran the query, then updated Slot_Duration_Min to 5. Some of the Slot_Duration_Min values resolve
    to 15. What I expect is that when I run the query again, the records where Slot_Duration_Min resolves to 15 should be inserted again, but they are not. I am using OR between the conditions in the WHERE clause because if any one of the values is different, then a new record needs to be inserted (see the sketch after the query below):
    declare 
    @Created_By nchar(50)
    ,@Created_Date datetime
    ,@Updated_By nchar(50)
    ,@Updated_Date datetime
    select
    @Created_By = system_user
    ,@Created_Date = getdate()
    ,@Updated_By = system_user
    ,@Updated_Date = getdate()
    insert fact.Appointment (
    Slot_ID
    , Slot_DateTime
    , Slot_StartDateTime
    , Slot_EndDateTime
    , Slot_Duration_min
    , Slot_CreateDateTime
    , Slot_CreateDate_DateKey
    , Healthcare_System_ID
    , Healthcare_Service_ID
    , Healthcare_Supervising_Service_ID
    , Healthcare_Site_ID
    , Booked_Appt_ID
    , Appt_Notification_Submission_DateKey
    , Appt_Notification_Completion_DateKey
    , Appt_Notification_Duration
    , Appt_Notification_ID
    , Patient_ID
    , Physician_ID
    , Referral_ID
    , Specialty
    , LanguageRequested
    , Created_Date
    , Created_By
    , Updated_Date
    , Updated_By
    )
    select distinct
    Slot.Slot_ID 
    , Slot.Slot_Start_DateTime  as Slot_DateTime --???
    , Slot.Slot_Start_DateTime
    , Slot.Slot_End_DateTime
    , datediff(mi,slot.Slot_Start_DateTime,slot.Slot_End_Datetime) as Slot_Duration_Min 
    , Slot.Created_Date as Slot_CreateDateTime
    , SlotCreateDate.Date_key as Slot_CreateDate_DateKey
    , HSite.Healthcare_System_ID
    , HSite.Healthcare_Service_ID
    , HSite.Healthcare_Service_ID as Healthcare_Supervising_Service_ID
    , HSite.Healthcare_Site_ID 
    , Ref.Booked_Appt_ID 
    , ApptSubmissionTime.Date_key as Appt_Notification_Submission_DateKey
    , ApptCompletionTime.Date_key as Appt_Notification_Completion_DateKey
    , datediff(mi,appt.SubmissionTime,appt.CompletionTime) as Appt_Notification_Duration
    , Appt.Appt_Notification_ID 
    , pat.Patient_ID 
    , 0 as Physician_ID
    , ref.Referral_ID
    , Hsrv.Specialty
    , appt.[Language] as LanguageRequested
    ,@Created_Date as Created_Date
    ,@Created_By as Created_By
    ,@Updated_Date as Updated_Date
    ,@Updated_By as Updated_By
    from dim.Healthcare_System HSys
    inner join dim.Healthcare_Service HSrv
    on HSys.Healthcare_System_ID = HSrv.HealthCare_System_ID 
    inner join dim.Healthcare_Site HSite
    on HSite.HealthCare_Service_ID = HSrv.Healthcare_Service_ID
    and HSite.HealthCare_System_ID = HSrv.HealthCare_System_ID 
    inner join dim.Referral Ref 
    on Ref.ReferralSite_ID = HSite.Site_ID
    and Ref.ReferralService_ID = HSite.Service_ID
    and Ref.ReferralSystem_ID = HSite.System_ID 
    right join (select distinct Slot_ID, Source_Slot_ID, Slot_Start_DateTime, Slot_End_DateTime, Created_Date from dim.slot)slot
    on ref.Source_Slot_ID = slot.Source_Slot_ID
    inner join dim.Appointment_Notification appt
    on appt.System_ID = HSys.System_ID
    inner join dim.Patient pat 
    on pat.Source_Patient_ID = appt.Source_Patient_ID
    inner join dim.SystemUser SysUser
    on SysUser.Healthcare_System_ID = HSys.Healthcare_System_ID
    left join dim.Calendar SlotCreateDate
    on SlotCreateDate.Full_DateTime = cast(Slot.Created_Date as smalldatetime)
    left join dim.Calendar ApptSubmissionTime
    on ApptSubmissionTime.Full_DateTime = cast(appt.SubmissionTime as smalldatetime)
    left join dim.Calendar ApptCompletionTime
    on ApptCompletionTime.Full_DateTime = cast(appt.CompletionTime as smalldatetime)
    where not exists
    (
    select
    Slot_ID
    , Slot_DateTime
    , Slot_StartDateTime
    , Slot_EndDateTime
    , Slot_Duration_min
    , Slot_CreateDateTime
    , Slot_CreateDate_DateKey
    , Healthcare_System_ID
    , Healthcare_Service_ID
    , Healthcare_Supervising_Service_ID
    , Healthcare_Site_ID
    , Booked_Appt_ID
    , Appt_Notification_Submission_DateKey
    , Appt_Notification_Completion_DateKey
    , Appt_Notification_Duration
    , Appt_Notification_ID
    , Patient_ID
    , Physician_ID
    , Referral_ID
    , Specialty
    , LanguageRequested
    , Created_Date
    , Created_By
    , Updated_Date
    , Updated_By
    from fact.Appointment fact
    where 
    Slot.Slot_ID  = fact.Slot_ID 
    or
    Slot.Slot_Start_DateTime   = fact.Slot_DateTime  
    or
    Slot.Slot_Start_DateTime = fact.Slot_StartDateTime
    or
    Slot.Slot_End_DateTime = fact.Slot_EndDateTime
    or
    datediff(mi,slot.Slot_Start_DateTime,slot.Slot_End_Datetime) =
    fact.Slot_Duration_min
    or
    Slot.Created_Date  = fact.Slot_CreateDateTime
    or
    SlotCreateDate.Date_key = fact.Slot_CreateDate_DateKey
    or
    HSite.Healthcare_System_ID = fact.Healthcare_System_ID
    or
    HSite.Healthcare_Service_ID = fact.Healthcare_Service_ID
    or
    HSite.Healthcare_Service_ID  =
    fact.Healthcare_Service_ID 
    or
    HSite.Healthcare_Site_ID  = fact.Healthcare_Site_ID 
    or
    Ref.Booked_Appt_ID  = fact.Booked_Appt_ID 
    or
    ApptSubmissionTime.Date_key =
    fact.Appt_Notification_Submission_DateKey
    or
    ApptCompletionTime.Date_key =
    fact.Appt_Notification_Completion_DateKey
    or 
    datediff(mi,appt.SubmissionTime,appt.CompletionTime)  = fact.Appt_Notification_Duration
    or
    Appt.Appt_Notification_ID = fact.Appt_Notification_ID 
    or
    pat.Patient_ID  =
    fact.Patient_ID 
    or
    0 = 0
    or
    ref.Referral_ID = fact.Referral_ID
    or
    Hsrv.Specialty = fact.Specialty
    or
    appt.[Language] = fact.LanguageRequested
    )
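
    For reference, a NOT EXISTS guard of this kind is normally correlated on all of the identifying columns with AND, so a source row is skipped only when a row matching on every column already exists; with OR, the subquery finds a match as soon as any single column agrees, which is why nothing new gets inserted. A minimal sketch using a subset of the columns from the query above (NULLable columns would need additional IS NULL handling):
    where not exists
    (
        select 1
        from fact.Appointment f
        where f.Slot_ID              = Slot.Slot_ID
          and f.Slot_StartDateTime   = Slot.Slot_Start_DateTime
          and f.Slot_EndDateTime     = Slot.Slot_End_DateTime
          and f.Slot_Duration_min    = datediff(mi, Slot.Slot_Start_DateTime, Slot.Slot_End_Datetime)
          and f.Healthcare_Site_ID   = HSite.Healthcare_Site_ID
          and f.Appt_Notification_ID = Appt.Appt_Notification_ID
          and f.Patient_ID           = pat.Patient_ID
          and f.Referral_ID          = ref.Referral_ID
    )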

  • SQL*Loader (8i): Loading variable-size fields into multiple tables (FILLER)

    Product: ORACLE SERVER
    Date written: 2004-10-29
    ==================================================================
    SQL*LOADER (8i): Loading variable-size fields into multiple tables (FILLER)
    ==================================================================
    PURPOSE
    This note shows how to use SQL*Loader to load a data file that has variable-length
    records and variable-size fields into several tables, using the FILLER clause
    introduced as a new feature in 8i.
    Explanation
    SQL*LOADER SYNTAX
    To load into several tables, the control file simply contains several INTO TABLE clauses:
    INTO TABLE emp
    INTO TABLE emp1
    To load the same data from a data file with fixed-length fields into several tables:
    INTO TABLE emp
    (empno POSITION(1:4) INTEGER EXTERNAL,
    INTO TABLE emp1
    (empno POSITION(1:4) INTEGER EXTERNAL,
    This loads positions 1-4 of each input record into the empno field of both tables.
    If the field lengths are variable, however, the POSITION clause cannot be used for
    each field in this way.
    Example
    Example 1>
    create table one (
    field_1 varchar2(20),
    field_2 varchar2(20),
    empno varchar(10) );
    create table two (
    field_3 varchar2(20),
    empno varchar(10) );
    Assume the records to be loaded are comma-separated and of variable length.
    << data.txt >> - data file to load
    "this is field 1","this is field 2",12345678,"this is field 4"
    << test.ctl >> - control file
    load data infile 'data.txt'
    discardfile 'discard.txt'
    into table one
    replace
    fields terminated by ","
    optionally enclosed by '"' (
    field_1,
    field_2,
    empno )
    into table two
    replace
    fields terminated by ","
    optionally enclosed by '"' (
    field_3,
    dummy1 filler position(1),
    dummy2 filler,
    empno )
    The dummy1 field is declared as a FILLER; a field declared as FILLER is not loaded
    into the table. Table two has no column named dummy1. position(1) means that reading
    restarts at the beginning of the current record, so the first field is assigned to the
    dummy1 filler item and the second field to the dummy2 filler item. The third field,
    the employee number that was loaded into table one, is then loaded into table two as well.
    << Run >>
    $sqlldr scott/tiger control=test.ctl data=data.txt log=test.log bindsize=300000
    $sqlplus scott/tiger
    SQL> select * from one;
    FIELD_1 FIELD_2 EMPNO
    this is field 1 this is field 2 12345678
    SQL> select * from two;
    FIELD_3 EMPNO
    this is field 4 12345678
    Example 2>
    create table testA (c1 number, c2 varchar2(10), c3 varchar2(10));
    << data1.txt >> - data file to load
    7782,SALES,CLARK
    7839,MKTG,MILLER
    7934,DEV,JONES
    << test1.ctl >>
    LOAD DATA
    INFILE 'data1.txt'
    INTO TABLE testA
    REPLACE
    FIELDS TERMINATED BY ","
    (
    c1 INTEGER EXTERNAL,
    c2 FILLER CHAR,
    c3 CHAR
    )
    << Run >>
    $ sqlldr scott/tiger control=test1.ctl data=data1.txt log=test1.log
    $ sqlplus scott/tiger
    SQL> select * from testA;
    C1 C2 C3
    7782 CLARK
    7839 MILLER
    7934 JONES
    Reference Documents
    <Note:74719.1>

  • Design pattern / data loading solution

    Hello all!
    I have been working on a few projects which involve loading data, sometimes remotely, sometimes locally, sometimes JSON, sometimes XML. The problem I am having is that due to the speed of development and the changing minds of various clients, I am finding my designs are too rigid and I would like them to be more flexible. I have been trying to think of a reusable solution to data loading, and would like some advice, as I imagine many of you out there have had the same problem.
    What I would like to do is create a generic LoadingOperation abstract class, which has member variables of type Parser and Loader, which have parse() and loadData() methods respectively. Parser and Loader are interfaces, and classes that implement them could be XMLParser and JSONParser, LocalLoader and RemoteLoader etc. With something like this I would like to have a new class which extends LoadingOperation for each thing to be loaded, whether that's a local XML file, remote JSON, or whatever.
    The problem is that a specific Parser implementation cannot return custom data types without breaking the polymorphic behavior of the LoadingOperation class. I have been messing around with generics and declaring subclasses of LoadingOperation like
    class SpecificLoader extends LoadingOperation<CustomDataType>
    and doing similar things with Parser classes, but this seems a bit weird.
    Does anyone have any suggestions on what I'm doing wrong / could be doing better? I want to be able to react quickly to changing specifications (ignoring the fact that they shouldn't be changing that much!) and have a logical separation of code etc...
    thanks for any help!
    PS: I have also asked this question here [http://stackoverflow.com/questions/4329087/design-pattern-data-loading-solution]
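
    For what it's worth, a minimal sketch of the generic split described above (all names are illustrative, nothing here comes from a real framework; Files.readString assumes Java 11+). The payload type is carried by a type parameter, so LoadingOperation stays polymorphic and each concrete operation only chooses which Loader and Parser<T> to wire together:

    interface Loader {
        String loadData() throws Exception;              // fetch raw text: local file, HTTP, ...
    }

    interface Parser<T> {
        T parse(String raw) throws Exception;            // turn raw text into a typed result
    }

    abstract class LoadingOperation<T> {
        private final Loader loader;
        private final Parser<T> parser;

        protected LoadingOperation(Loader loader, Parser<T> parser) {
            this.loader = loader;
            this.parser = parser;
        }

        public final T run() throws Exception {
            return parser.parse(loader.loadData());      // load, then parse into the typed model
        }
    }

    // Illustrative payload type and one concrete operation wired from a local file and a trivial parser.
    class CustomDataType {
        final String value;
        CustomDataType(String value) { this.value = value; }
    }

    class SpecificLoader extends LoadingOperation<CustomDataType> {
        SpecificLoader(java.nio.file.Path file) {
            super(() -> java.nio.file.Files.readString(file),    // Loader: read a local file
                  raw -> new CustomDataType(raw.trim()));         // Parser<CustomDataType>
        }
    }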

    rackham wrote:
    Does anyone have any suggestions on what I'm doing wrong / could be doing better? I want to be able to react quickly to changing specifications (ignoring the fact that they shouldn't be changing that much!) and have a logical separation of code etc...
    That depends on the specifics.
    The fact that it seems like processes are similar doesn't mean that they are in fact the same. My code editor and Word both seem to be basically the same, but I am rather sure that generalizing between the two would be a big mistake.
    And I speak from experience (parsing customer data and attempting to generalize the process).
    The problem with attempting to generalize is that you may generalize functionality that is not in fact the same. You then end up with conditional logic all over the place to deal with differences that depend on the user, and rather than saving time it actually costs time, because the code becomes more fragile.
    That doesn't mean it isn't possible, just that you should ensure that it is in fact common behavior before implementing anything.

  • Fact Table loading in parallel

    Hi
    I have 3 fact tables which load 300,000 records daily and run in sequence. Now I want to run these 3 fact table loads in parallel.
    If I run these 3 fact table loads in parallel, will there be any performance impact on running queries?
    regards
    Srinivas

    Hello,
    I had a case once where the parallel mechanism in SSIS was consuming so much memory that the package failed every now and then. I had to revert and make the tasks run sequentially. Luckily this is done simply by joining them in the BI designer.
    After that the package no longer failed. In short, the impact is not so much on the SQL Server engine running the queries, but on the system that runs the SSIS package and consumes too much memory.
    Jan D'Hondt - SQL server BI development

  • Seeking Best Solutions to move code from Dev to Test: 11.1.1.3

    Seeking best solutions to move code from the Development server to the Test server: Studio Edition Version 11.1.1.3.0
    Development: Hostname: dev; WebLogic server: WLSDev; Database: DBDev
    Testing: Hostname: test; WebLogic server: WLSTest; Database: DBTest
    Now how do we take code from the Development environment to the Test environment? Once code is taken from development, the testing team can't touch it.

    Lalitk,
    The obvious answer is just deploying the same EAR in test that you deployed to production?
    What other aspects do you need to deploy? If you have database code/data that needed to be migrated as well, I would copy the production DB to test and then apply all of your DB upgrade/patch scripts against that - in that way you can ensure your upgrade/patch scripts get tested as well.
    Are there other things you need to migrate (MDS repositories, security setups, etc)?
    John

  • ORA-26004: Tables loaded through the direct path may not be clustered

    hi,
    I am planning to upload data to an IOT table using sqlldr, but it ends with an error:
    ORA-26004: Tables loaded through the direct path may not be clustered.
    How do I resolve this? This table is going to receive a high volume of inserts, and to speed up queries on it we created an IOT table.
    table create syntax:
    create CLUSTER C_HUA
    (
    B_number number(10),
    A_number number(10),     
    ins_date date,
    account number(10),
    C_number number(10))
    SQL> CREATE TABLE HUA
    (
    A_number number(10),
    B_number number(10),
    ins_date date,
    account number(10),
    C_number number(10)
    )
    CLUSTER C_HUA
    (
    B_number,
    A_number,
    ins_date,
    account,
    C_number);
    SQL> CREATE INDEX HUA_index1 on CLUSTER C_HUA;
    Please help to resolve this.
    thanks
    ashwan

    You have to use conventional path. DIRECT=false.
    Restrictions on Using Direct Path Loads
    The following conditions must be satisfied for you to use the direct path load method:
    * Tables are not clustered.
    * Tables to be loaded do not have any active transactions pending.
    * Loading a parent table together with a child Table
    * Loading BFILE columns
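
    For example, a conventional-path run simply leaves direct path off (the control and data file names below are hypothetical); conventional path is also the default, so direct=false just makes it explicit:

    sqlldr scott/tiger control=hua.ctl data=hua.dat log=hua.log direct=false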

  • How to call a javascript method after table load on JSFF Fragment load?

    Hello,
    The use case is to invoke a JavaScript method after the table is done loading (fetching data) when the user lands on a JSFF fragment. With JSPX pages I can achieve that by using a PagePhaseListener. I have tried with a RegionController as follows, and the problem I face is that I cannot prevent multiple calls to the JavaScript when the user presses a tab or button on a screen, or changes a drop-down value with autosubmit on.
    import javax.faces.context.FacesContext;
    import oracle.adf.model.RegionBinding;
    import oracle.adf.model.RegionContext;
    import oracle.adf.model.RegionController;
    import org.apache.myfaces.trinidad.render.ExtendedRenderKitService;
    import org.apache.myfaces.trinidad.util.Service;
    public class MyListener implements RegionController {
        public MyListener() {
            super();
        }
        @Override
        public boolean refreshRegion(RegionContext regionContext) {
            int refreshFlag = regionContext.getRefreshFlag();
            System.out.println("Render flag is: " + refreshFlag);
            if (refreshFlag == RegionBinding.PREPARE_MODEL)
                initializeMethod();
            regionContext.getRegionBinding().refresh(refreshFlag);
            return false;
        }
        public boolean validateRegion(RegionContext regionContext) {
            regionContext.getRegionBinding().validate();
            return false;
        }
        public boolean isRegionViewable(RegionContext regionContext) {
            return regionContext.getRegionBinding().isViewable();
        }
        public void initializeMethod() {
            FacesContext f = FacesContext.getCurrentInstance();
            ExtendedRenderKitService service = Service.getRenderKitService(f, ExtendedRenderKitService.class);
            service.addScript(f, "myJSFunction();");
        }
        @Override
        public String getName() {
            return null;
        }
    }
    I need the JavaScript to be called only once, after the table is done loading, when the user lands on a fragment (jsff).
    Any ideas are appreciated.
    JDeveloper version is 11.1.1.5.0
    Thank you.
    Valon

    One of the requirements is to compare every row with the next row and highlight the changes. There are other requirements as well where a JavaScript solution is used.
    The question remains the same: is it doable or not without changing from a JavaScript solution to a server-side solution? Can we call a JavaScript function only once when the user lands on a jsff fragment?
    Hope that is clear.
    Thanks.
    Valon

  • Error in .dbf to table loading

    Hi all,
    I am loading a .dbf file into a table. I took the code from Google, but I don't know why it's not working when I run it.
    The error is below.. can anyone help me?
    BEGIN
    EXPRESS.DBF2ORA.LOAD_TABLE ( 'DATA_201', 'SAL_HEAD.DBF', 'dbf_tab_pc', NULL, false );
    END;
    error is ....
    ORA-22285: non-existent directory or file for FILEOPEN operation
    ORA-06512: at "EXPRESS.DBF2ORA", line 414
    ORA-06512: at line 2
    package code is ...
    package body dbase_pkg
    as
    -- Might have to change on your platform!!!
    -- Controls the byte order of binary integers read in
    -- from the dbase file
    BIG_ENDIAN constant boolean default TRUE;
    type dbf_header is RECORD
    (
    version varchar2(25), -- dBASE version number
    year int, -- 1 byte int year, add to 1900
    month int, -- 1 byte month
    day int, -- 1 byte day
    no_records int, -- number of records in file,
    -- 4 byte int
    hdr_len int, -- length of header, 2 byte int
    rec_len int, -- number of bytes in record,
    -- 2 byte int
    no_fields int -- number of fields
    );
    type field_descriptor is RECORD
    (
    name varchar2(11),
    type char(1),
    length int, -- 1 byte length
    decimals int -- 1 byte scale
    );
    type field_descriptor_array
    is table of
    field_descriptor index by binary_integer;
    type rowArray
    is table of
    varchar2(4000) index by binary_integer;
    g_cursor binary_integer default dbms_sql.open_cursor;
    -- Function to convert a binary unsigned integer
    -- into a PLSQL number
    function to_int( p_data in varchar2 ) return number
    is
    l_number number default 0;
    l_bytes number default length(p_data);
    begin
    if (big_endian)
    then
    for i in 1 .. l_bytes loop
    l_number := l_number +
    ascii(substr(p_data,i,1)) *
    power(2,8*(i-1));
    end loop;
    else
    for i in 1 .. l_bytes loop
    l_number := l_number +
    ascii(substr(p_data,l_bytes-i+1,1)) *
    power(2,8*(i-1));
    end loop;
    end if;
    return l_number;
    end;
    -- Routine to parse the DBASE header record, can get
    -- all of the details of the contents of a dbase file from
    -- this header
    procedure get_header
    (p_bfile in bfile,
    p_bfile_offset in out NUMBER,
    p_hdr in out dbf_header,
    p_flds in out field_descriptor_array )
    is
    l_data varchar2(100);
    l_hdr_size number default 32;
    l_field_desc_size number default 32;
    l_flds field_descriptor_array;
    begin
    p_flds := l_flds;
    l_data := utl_raw.cast_to_varchar2(
    dbms_lob.substr( p_bfile,
    l_hdr_size,
    p_bfile_offset ) );
    p_bfile_offset := p_bfile_offset + l_hdr_size;
    p_hdr.version := ascii( substr( l_data, 1, 1 ) );
    p_hdr.year := 1900 + ascii( substr( l_data, 2, 1 ) );
    p_hdr.month := ascii( substr( l_data, 3, 1 ) );
    p_hdr.day := ascii( substr( l_data, 4, 1 ) );
    p_hdr.no_records := to_int( substr( l_data, 5, 4 ) );
    p_hdr.hdr_len := to_int( substr( l_data, 9, 2 ) );
    p_hdr.rec_len := to_int( substr( l_data, 11, 2 ) );
    p_hdr.no_fields := trunc( (p_hdr.hdr_len - l_hdr_size)/
    l_field_desc_size );
    for i in 1 .. p_hdr.no_fields
    loop
    l_data := utl_raw.cast_to_varchar2(
    dbms_lob.substr( p_bfile,
    l_field_desc_size,
    p_bfile_offset ));
    p_bfile_offset := p_bfile_offset + l_field_desc_size;
    p_flds(i).name := rtrim(substr(l_data,1,11),chr(0));
    p_flds(i).type := substr( l_data, 12, 1 );
    p_flds(i).length := ascii( substr( l_data, 17, 1 ) );
    p_flds(i).decimals := ascii(substr(l_data,18,1) );
    end loop;
    p_bfile_offset := p_bfile_offset +
    mod( p_hdr.hdr_len - l_hdr_size,
    l_field_desc_size );
    end;
    function build_insert
    ( p_tname in varchar2,
    p_cnames in varchar2,
    p_flds in field_descriptor_array ) return varchar2
    is
    l_insert_statement long;
    begin
    l_insert_statement := 'insert into ' || p_tname || '(';
    if ( p_cnames is NOT NULL )
    then
    l_insert_statement := l_insert_statement ||
    p_cnames || ') values (';
    else
    for i in 1 .. p_flds.count
    loop
    if ( i <> 1 )
    then
    l_insert_statement := l_insert_statement||',';
    end if;
    l_insert_statement := l_insert_statement ||
    '"'|| p_flds(i).name || '"';
    end loop;
    l_insert_statement := l_insert_statement ||
    ') values (';
    end if;
    for i in 1 .. p_flds.count
    loop
    if ( i <> 1 )
    then
    l_insert_statement := l_insert_statement || ',';
    end if;
    if ( p_flds(i).type = 'D' )
    then
    l_insert_statement := l_insert_statement ||
    'to_date(:bv' || i || ',''yyyymmdd'' )';
    else
    l_insert_statement := l_insert_statement ||
    ':bv' || i;
    end if;
    end loop;
    l_insert_statement := l_insert_statement || ')';
    return l_insert_statement;
    end;
    function get_row
    ( p_bfile in bfile,
    p_bfile_offset in out number,
    p_hdr in dbf_header,
    p_flds in field_descriptor_array ) return rowArray
    is
    l_data varchar2(4000);
    l_row rowArray;
    l_n number default 2;
    begin
    l_data := utl_raw.cast_to_varchar2(
    dbms_lob.substr( p_bfile,
    p_hdr.rec_len,
    p_bfile_offset ) );
    p_bfile_offset := p_bfile_offset + p_hdr.rec_len;
    l_row(0) := substr( l_data, 1, 1 );
    for i in 1 .. p_hdr.no_fields loop
    l_row(i) := rtrim(ltrim(substr( l_data,
    l_n,
    p_flds(i).length ) ));
    if ( p_flds(i).type = 'F' and l_row(i) = '.' )
    then
    l_row(i) := NULL;
    end if;
    l_n := l_n + p_flds(i).length;
    end loop;
    return l_row;
    end get_row;
    procedure show( p_hdr in dbf_header,
    p_flds in field_descriptor_array,
    p_tname in varchar2,
    p_cnames in varchar2,
    p_bfile in bfile )
    is
    l_sep varchar2(1) default ',';
    procedure p(p_str in varchar2)
    is
    l_str long default p_str;
    begin
    while( l_str is not null )
    loop
    dbms_output.put_line( substr(l_str,1,250) );
    l_str := substr( l_str, 251 );
    end loop;
    end;
    begin
    p( 'Sizeof DBASE File: ' || dbms_lob.getlength(p_bfile) );
    p( 'DBASE Header Information: ' );
    p( chr(9)||'Version = ' || p_hdr.version );
    p( chr(9)||'Year = ' || p_hdr.year );
    p( chr(9)||'Month = ' || p_hdr.month );
    p( chr(9)||'Day = ' || p_hdr.day );
    p( chr(9)||'#Recs = ' || p_hdr.no_records);
    p( chr(9)||'Hdr Len = ' || p_hdr.hdr_len );
    p( chr(9)||'Rec Len = ' || p_hdr.rec_len );
    p( chr(9)||'#Fields = ' || p_hdr.no_fields );
    p( chr(10)||'Data Fields:' );
    for i in 1 .. p_hdr.no_fields
    loop
    p( 'Field(' || i || ') '
    || 'Name = "' || p_flds(i).name || '", '
    || 'Type = ' || p_flds(i).Type || ', '
    || 'Len = ' || p_flds(i).length || ', '
    || 'Scale= ' || p_flds(i).decimals );
    end loop;
    p( chr(10) || 'Insert We would use:' );
    p( build_insert( p_tname, p_cnames, p_flds ) );
    p( chr(10) || 'Table that could be created to hold data:');
    p( 'create table ' || p_tname );
    p( '(' );
    for i in 1 .. p_hdr.no_fields
    loop
    if ( i = p_hdr.no_fields ) then l_sep := ')'; end if;
    dbms_output.put
    ( chr(9) || '"' || p_flds(i).name || '" ');
    if ( p_flds(i).type = 'D' ) then
    p( 'date' || l_sep );
    elsif ( p_flds(i).type = 'F' ) then
    p( 'float' || l_sep );
    elsif ( p_flds(i).type = 'N' ) then
    if ( p_flds(i).decimals > 0 )
    then
    p( 'number('||p_flds(i).length||','||
    p_flds(i).decimals || ')' ||
    l_sep );
    else
    p( 'number('||p_flds(i).length||')'||l_sep );
    end if;
    else
    p( 'varchar2(' || p_flds(i).length || ')'||l_sep);
    end if;
    end loop;
    p( '/' );
    end;
    procedure load_Table( p_dir in varchar2,
    p_file in varchar2,
    p_tname in varchar2,
    p_cnames in varchar2 default NULL,
    p_show in boolean default FALSE )
    is
    l_bfile bfile;
    l_offset number default 1;
    l_hdr dbf_header;
    l_flds field_descriptor_array;
    l_row rowArray;
    begin
    l_bfile := bfilename( p_dir, p_file );
    dbms_lob.fileopen( l_bfile );
    get_header( l_bfile, l_offset, l_hdr, l_flds );
    if ( p_show )
    then
    show( l_hdr, l_flds, p_tname, p_cnames, l_bfile );
    else
    dbms_sql.parse( g_cursor,
    build_insert(p_tname,p_cnames,l_flds),
    dbms_sql.native );
    for i in 1 .. l_hdr.no_records loop
    l_row := get_row( l_bfile,
    l_offset,
    l_hdr,
    l_flds );
    if ( l_row(0) <> '*' ) -- deleted record
    then
    for i in 1..l_hdr.no_fields loop
    dbms_sql.bind_variable( g_cursor,
    ':bv'||i,
    l_row(i),
    4000 );
    end loop;
    if ( dbms_sql.execute( g_cursor ) <> 1 )
    then
    raise_application_error( -20001,
    'Insert failed ' || sqlerrm );
    end if;
    end if;
    end loop;
    end if;
    dbms_lob.fileclose( l_bfile );
    exception
    when others then
    if ( dbms_lob.isopen( l_bfile ) > 0 ) then
    dbms_lob.fileclose( l_bfile );
    end if;
    RAISE;
    end;
    end;
    I think I am making a mistake in creating and using the directory.. please direct me.
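
    For reference, ORA-22285 from DBMS_LOB.FILEOPEN usually means the first argument does not name an existing Oracle directory object that the session can read (BFILENAME takes the directory object name, not an OS path, and the name is case sensitive). A minimal sketch using the names from the post, with a placeholder OS path:

    -- as a privileged user, create the directory object and grant read access on it
    CREATE OR REPLACE DIRECTORY DATA_201 AS '/path/to/dbf/files';
    GRANT READ ON DIRECTORY DATA_201 TO express;

    -- then call the loader with the directory object name
    BEGIN
      express.dbf2ora.load_table( 'DATA_201', 'SAL_HEAD.DBF', 'dbf_tab_pc', NULL, false );
    END;
    /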

  • How can a hyperlink in a table load an image in a new page?

    Hello,
    I'm using Java Studio Creator 2 Update 1 and I have the following problem:
    In my jsp page (main.jsp) I have a table component with an image hyperlink (showOriginal) in the first column and a hyperlink to another page in the second. On this page there is also a hyperlink which refreshes the table.
    The hyperlink "showOriginal" should open a new browser window and display an image depending on the selected table row. I set the property "target" for the hyperlink to "new Window" and specified an event method (showOriginal_action()), which gets the image for the selected table row and returns a new jsp page (originalView.jsp).
    OriginalView.jsp has only one image component. The url for this image is set by the event method of the image hyperlink of main.jsp. Everything works fine, but if I press the refresh link on main.jsp to refresh the table (after opening the new window with the image), the image is also loaded in main.jsp, and every other link causes the same problem. Even when I close the window with the image, the same happens. It seems to me like I'm still on main.jsp and not on originalView.jsp (after I included a </redirect> tag in navigation.xml, I can see in the browser address line that originalView.jsp is loaded), and main.jsp "thinks" that its own current content is the image?
    I changed the return value of showOriginal_action() to null (because of the target property I always get the originalView page), and then I included an onClick method for the image hyperlink to open a new window with the image via JavaScript, but this didn't work because the onclick is performed first, and at that time I don't have the image url yet (because first I must know which row was selected in the table).
    Am I doing something wrong, or is this not the correct way to do this? I just want to click the link in the table and have a new window/page opened which displays the image, independent from anything else, and all the links in the main page must still work properly.
    I appreciate any help & suggestions.
    thanks

    hi,
    here is the code of main.jsp (with your suggestions):
    <?xml version="1.0" encoding="UTF-8"?>
    <jsp:root version="1.2" xmlns:f="http://java.sun.com/jsf/core" xmlns:h="http://java.sun.com/jsf/html" xmlns:jsp="http://java.sun.com/JSP/Page" xmlns:ui="http://www.sun.com/web/ui">
    <jsp:directive.page contentType="text/html;charset=UTF-8" pageEncoding="UTF-8"/>
    <f:view>
    <ui:page binding="#{main.page1}" id="page1">
    <ui:html binding="#{main.html1}" id="html1">
    <ui:head binding="#{main.head1}" id="head1" title="SEWM">
    <ui:link binding="#{main.link1}" id="link1" url="/resources/stylesheet.css"/>
    <ui:script binding="#{main.script1}" id="script1" url="/resources/global.js"/>
    </ui:head>
    <ui:body binding="#{main.body1}" id="body1" style="background-color: rgb(255, 255, 255); -rave-layout: grid">
    <ui:form binding="#{main.formMain}" id="formMain" target="_self">
    <div id="main">
    <h:panelGrid binding="#{main.header}" id="header" styleClass="abc_header_bar">
    <ui:image binding="#{main.image1}" id="image1" styleClass="abc_header_logo" url="/resources/abc_logo.gif"/>
    <ui:image binding="#{main.imageLine1}" id="imageLine1" styleClass="abc_header_logoLine" url="/resources/amb_leiste.gif"/>
    <ui:staticText binding="#{main.staticText2}" id="staticText2" styleClass="abc_page_title" text="IWM Control"/>
    </h:panelGrid>
    <h:panelGrid binding="#{main.menu}" columns="1" id="menu" styleClass="abc_menu_wrapper">
    <ui:image align="middle" binding="#{main.imageLeon1}" height="81" id="imageLeon1" url="/resources/leon.gif"/>
    <h:panelGrid binding="#{main.gridPanel1}" id="gridPanel1" styleClass="abc_menu_box">
    <ui:hyperlink action="#{main.changePssw_action}" binding="#{main.hyperlinkChangePassword}" id="hyperlinkChangePassword"
    style="color:#990000" text="#{main.propertyResourceProvider1.value['changePassword']}"/>
    <ui:hyperlink action="#{main.refresh_action}" binding="#{main.hyperlinkRefresh}" id="hyperlinkRefresh"
    onClick="function test() {&#xa; this.formMain.target='_self';&#xa;}" style="color:#990000" text="#{main.propertyResourceProvider1.value['refresh']}"/>
    <ui:hyperlink binding="#{main.customerHelp1}" id="customerHelp1" immediate="true" style="color:#990000" styleClass=""
    target="_blank" text="Hilfe" url="/main.html"/>
    <ui:label binding="#{main.label2}" id="label2" styleClass="abc_lbplaceholder" text="_______________________"/>
    <ui:staticText binding="#{main.staticText108}" id="staticText108" styleClass="abc_tinfo" text="TEST Info" toolTip="#{SessionBean1.testSystemInfo}"/>
    </h:panelGrid>
    </h:panelGrid>
    <h:panelGrid binding="#{main.gridPanelUserInfo1}" columns="3" id="gridPanelUserInfo1" styleClass="abc_userinfo">
    <ui:staticText binding="#{main.staticText1}" id="staticText1" text="Sie sind angemeldet als "/>
    <ui:staticText binding="#{main.registeredUser1}" id="registeredUser1" text="#{SessionBean1.webFacade.user.uname}"/>
    <ui:hyperlink action="#{main.logout_action}" binding="#{main.logout1}" id="logout1" style="color: #ffffff" text="#{main.propertyResourceProvider1.value['logout']}"/>
    </h:panelGrid>
    <h:panelGrid binding="#{main.gridPanel86}" id="gridPanel86" styleClass="abc_page_content">
    <ui:label binding="#{main.label3}" id="label3" text="#{main.propertyResourceProvider1.value['sysmessages']}"/>
    <ui:messageGroup binding="#{main.outlineTableMessageGroup}" id="outlineTableMessageGroup" styleClass="abc_messagebox"/>
    <ui:tabSet binding="#{main.tabSet1}" id="tabSet1" lite="true" mini="true" selected="tab2">
    <ui:tab action="#{main.tab2_action}" binding="#{main.tab2}" id="tab2" text="#{main.propertyResourceProvider1.value['openProcesses']}">
    <ui:panelLayout binding="#{main.layoutPanel2}" id="layoutPanel2">
    <ui:table binding="#{main.table3}" id="table3" paginateButton="true" paginationControls="true"
    styleClass="abc_main_fulltable" title="Offene Vorg�nge" width="600">
    <script><![CDATA[
    /* ----- Functions for Table Preferences Panel ----- */
    /**
     * Toggle the table preferences panel open or closed
     */
    function togglePreferencesPanel1() {
        var table = document.getElementById("form1:tabSet1:tab2:table3");
        table.toggleTblePreferencesPanel();
    }
    /* ----- Functions for Filter Panel ----- */
    /**
     * Return true if the filter menu has actually changed,
     * so the corresponding event should be allowed to continue.
     */
    function filterMenuChanged1() {
        var table = document.getElementById("form1:tabSet1:tab2:table3");
        return table.filterMenuChanged();
    }
    /**
     * Toggle the custom filter panel (if any) open or closed.
     */
    function toggleFilterPanel1() {
        var table = document.getElementById("form1:tabSet1:tab2:table3");
        return table.toggleTableFilterPanel();
    }
    /* ----- Functions for Table Actions ----- */
    /**
     * Initialize all rows of the table when the state
     * of selected rows changes.
     */
    function initAllRows1() {
        var table = document.getElementById("form1:tabSet1:tab2:table3");
        table.initAllRows();
    }
    /**
     * Set the selected state for the given row groups
     * displayed in the table. This functionality requires
     * the 'selectId' of the tableColumn to be set.
     * @param rowGroupId HTML element id of the tableRowGroup component
     * @param selected Flag indicating whether components should be selected
     */
    function selectGroupRows1(rowGroupId, selected) {
        var table = document.getElementById("form1:tabSet1:tab2:table3");
        table.selectGroupRows(rowGroupId, selected);
    }
    /**
     * Disable all table actions if no rows have been selected.
     */
    function disableActions1() {
        // Determine whether any rows are currently selected
        var table = document.getElementById("form1:tabSet1:tab2:table3");
        var disabled = (table.getAllSelectedRowsCount() > 0) ? false : true;
        // Set disabled state for top actions
        document.getElementById("form1:tabSet1:tab2:table3:tableActionsTop:deleteTop").setDisabled(disabled);
        // Set disabled state for bottom actions
        document.getElementById("form1:tabSet1:tab2:table3:tableActionsBottom:deleteBottom").setDisabled(disabled);
    }]]></script>
    <ui:tableRowGroup binding="#{main.tableRowGroup4}" emptyDataMsg="Keine Vorgänge gefunden." id="tableRowGroup4"
    rows="20" sourceData="#{SessionBean1.openProcesses}" sourceVar="currentRowTable">
    <ui:tableColumn binding="#{main.tableColumn23}" embeddedActions="true" id="tableColumn23" noWrap="true">
    <ui:panelGroup binding="#{main.groupPanel13}" id="groupPanel13">
    <ui:imageHyperlink action="#{main.showOriginal_action}"
    alt="#{main.propertyResourceProvider1.value['ttShowOriginals']}"
    binding="#{main.imageHyperlink64}" id="imageHyperlink64"
    imageURL="/resources/original_small.gif" immediate="true" target="_blank" toolTip="#{main.propertyResourceProvider1.value['ttShowOriginals']}"/>
    </ui:panelGroup>
    </ui:tableColumn>
    <ui:tableColumn binding="#{main.tableColumn15}" headerText="Kreditor" id="tableColumn15" sort="vendorName">
    <ui:hyperlink action="#{main.edit_action}" binding="#{main.hyperlink1}" id="hyperlink1" text="#{currentRowTable.value['vendorName']}"/>
    </ui:tableColumn>
    <ui:tableColumn binding="#{main.tableColumn17}" headerText="Rechnungsnummer" id="tableColumn17" sort="refDocNo">
    <ui:staticText binding="#{main.staticText101}" id="staticText101" styleClass="abc_table_celltext" text="#{currentRowTable.value['refDocNo']}"/>
    </ui:tableColumn>
    <ui:tableColumn binding="#{main.tableColumn18}" headerText="Rechnungsdatum" id="tableColumn18" sort="docDate">
    <ui:staticText binding="#{main.staticText102}" converter="#{main.dateTimeConverter1}" id="staticText102"
    styleClass="abc_table_celltext" text="#{currentRowTable.value['docDate']}"/>
    </ui:tableColumn>
    <ui:tableColumn binding="#{main.tableColumn19}" headerText="F�lligkeit" id="tableColumn19" noWrap="true" sort="dueDate">
    <ui:staticText binding="#{main.staticText103}" converter="#{main.dateTimeConverter1}" id="staticText103"
    styleClass="abc_table_celltext" text="#{currentRowTable.value['dueDate']}"/>
    </ui:tableColumn>
    <ui:tableColumn binding="#{main.tableColumn21}" headerText="Zuordnung" id="tableColumn21" sort="stapleName">
    <ui:staticText binding="#{main.staticText105}" id="staticText105" styleClass="abc_table_celltext" text="#{currentRowTable.value['stapleName']}"/>
    </ui:tableColumn>
    </ui:tableRowGroup>
    <f:facet name="actionsBottom"/>
    </ui:table>
    </ui:panelLayout>
    </ui:tab>
    </ui:tabSet>
    </h:panelGrid>
    </div>
    <div id="wait" style="visibility: hidden;">
    <h:panelGrid binding="#{main.gridPanel8}" id="gridPanel8" styleClass="abc_wait_div">
    <ui:label binding="#{main.label1}" id="label1" styleClass="abc_labelwait_align" text="#{main.propertyResourceProvider1.value['loaddata']}"/>
    <ui:image binding="#{main.image15}" id="image15" url="/resources/loading.gif"/>
    </h:panelGrid>
    </div>
    </ui:form>
    </ui:body>
    </ui:html>
    </ui:page>
    </f:view>
    </jsp:root>
    originalView.jsp:
    <?xml version="1.0" encoding="UTF-8"?>
    <jsp:root version="1.2" xmlns:f="http://java.sun.com/jsf/core" xmlns:h="http://java.sun.com/jsf/html" xmlns:jsp="http://java.sun.com/JSP/Page" xmlns:ui="http://www.sun.com/web/ui">
    <jsp:directive.page contentType="text/html;charset=UTF-8" pageEncoding="UTF-8"/>
    <f:view>
    <ui:page binding="#{originalView.page1}" id="page1">
    <ui:html binding="#{originalView.html1}" id="html1">
    <ui:head binding="#{originalView.head1}" id="head1">
    <ui:link binding="#{originalView.link1}" id="link1" url="/resources/stylesheet.css"/>
    </ui:head>
    <ui:body binding="#{originalView.body1}" id="body1" style="-rave-layout: grid">
    <ui:form binding="#{originalView.form1}" id="form1">
    <ui:image binding="#{originalView.image1}" height="#{SessionBean1.archiveObjectsFileDataProvider.value['height']}" id="image1"
    style="left: 120px; top: 48px; position: absolute" url="#{SessionBean1.archiveObjectsFileDataProvider.value['url']}" width="#{SessionBean1.archiveObjectsFileDataProvider.value['width']}"/>
    </ui:form>
    </ui:body>
    </ui:html>
    </ui:page>
    </f:view>
    </jsp:root>
    and java-methods:
    public String showOriginal_action() {
        getSelectedRowFromMainTable();
        setSelectedImage(); // the image url is then available via the session bean
        return "originalView"; // displays the image using the url in the session bean
    }
    public String refresh_action() {
        try {
            getSessionBean1().getOpenProcessTable();
            // here I tried
            // getHyperlinkRefresh().setTarget("_self");
            // or getFormMain().setTarget("_self");
        } catch (Exception ex) {
            error(ex.getMessage());
        }
        return null;
    }

  • Send the OHD database table load details as mail in Process chain

    Hi,
    We have a requirement that after the data is loaded into the Open Hub destination we need to send an email to a list of users.
    The mail should contain the Open Hub destination name, Calday, Request No, No of records, and a message that the load has ended successfully.
    I know how to send an email in the process chain, but how do I send details like the Open Hub destination name, Calday, Request No, and No of records in the mail?
    Please suggest.

    Well, we have a little idea:
    We have narrowed the requirement down to the following logic:
    select REQUID, the number of records transferred, and the OHD name from the RSBKREQUEST table
    where rsbkrequest-requid = the open hub table request.
    Also, the open hub table should first be sorted in descending order by request ID.
    Now the main issue is:
    How do we implement this logic, get the details from the RSBKREQUEST table, and send these details in the mail body?
    We want to do this in the message which is attached to the Open Hub DTP load step of the PC. The message is sent if the Open Hub load is successful.
    In the message we are able to enter the recipient email address, but how do we add the above mentioned details?
    Actually, we are trying to avoid an ABAP code step.
    Please suggest.

  • PL/SQL block to create temporary table + load via cursor for loop

    Assume I have a table that contains a subset of data that I want to load into a temporary table within a cursor FOR loop. Is it possible to have a single statement that creates the table and loads it based on the results of the fetch?
    I was thinking something like:
    Declare CURSOR xyz is
    CREATE TABLE temp_table as
    select name, rank, serial number from
    HR table where rank = 'CAPTAIN'
    BEGIN
    OPEN xyz
    for name in xyz
    LOOP
    END LOOP
    What I see wrong with this is that the table would be created multiple times which is why this syntax is not acceptable. I'd prefer not to have to define the temporary table then load in two sepearte SQL statements and am hoping a single statement can be used.
    Thanks!

    What is the goal here?
    If you're just going to iterate over the rows that are returned in a cursor, a temporary table is unnecessary and only adds complexity. If you truly need a temporary table, you would declare it exactly once, at install time when you create all your other tables. You'd INSERT data into the temp table and the data would only be visible to the session that inserted it.
    Justin
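    A minimal sketch of that pattern, assuming a hypothetical HR_TABLE with NAME, RANK and SERIAL_NUMBER columns (all object names here are illustrative, not from the original post): the temporary table is created once at install time, and each session then simply inserts into it.
    -- created once, at install time
    CREATE GLOBAL TEMPORARY TABLE temp_captains (
      name          VARCHAR2(100),
      rank          VARCHAR2(30),
      serial_number NUMBER
    ) ON COMMIT PRESERVE ROWS;
    -- run at load time; the inserted rows are visible only to the inserting session
    INSERT INTO temp_captains (name, rank, serial_number)
      SELECT name, rank, serial_number
      FROM   hr_table
      WHERE  rank = 'CAPTAIN';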

  • External Table Load KUP-04037 Error

    I was asked to repost this here. This is a follow-on question to a thread on how to load special characters (diacritics) in the Export/Import/SQL Loader/External Table forum.
    I've defined an external table, and on my instance running the WE8MSWIN1252 character set everything works fine. On my other instance, running AL32UTF8, I get the KUP-04037 error (terminator not found) on the field that contains "à" (the letter a with a grave accent). Changing it to a standard "a" avoids the error. Changing the column definition in the external table to NVARCHAR2 does NOT help.
    Any ideas anyone?
    Thanks,
    Bob Siegel

    Exactly. If you do not specify the CHARACTERSET parameter, the database character set is used to interpret the input file. As the input file is in WE8MSWIN1252, the ORACLE_LOADER driver gets confused trying to interpret single-byte WE8MSWIN1252 codes as multibyte AL32UTF8 codes.
    The character set of the input file depends on the way it was created. Even on US Windows, you can create text files in different encodings. Notepad allows you to save the file in ANSI code page (=WE8MSWIN1252 on US Windows), Unicode (=AL16UTF16LE), Unicode big endian (=AL16UTF16), and UTF-8 (=AL32UTF8). The Command Prompt edit.exe editor saves the files in the OEM code page (=US8PC437 on US Windows).
    -- Sergiusz
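    For illustration, a minimal sketch of an external table definition that declares the character set of the input file explicitly (the directory, file and column names are placeholders, not from the original post):
    CREATE TABLE ext_names (
      name VARCHAR2(100 CHAR)
    )
    ORGANIZATION EXTERNAL (
      TYPE ORACLE_LOADER
      DEFAULT DIRECTORY ext_dir
      ACCESS PARAMETERS (
        RECORDS DELIMITED BY NEWLINE
        CHARACTERSET WE8MSWIN1252
        FIELDS TERMINATED BY ','
      )
      LOCATION ('names.txt')
    );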

  • Table loading with empty selected row

    Hi, I have the table below. When the page loads it has an empty row created; I want it to show the message emptyText="No data to display" instead. The table is based on a bean, and I am on JDeveloper 11.1.1.6.0. I don't want to use #{!adfFacesContext.initialRender} because it does not allow me to add values to the table.
    <af:table
                              var="row" rows="#{bindings.addmemberBean.rangeSize}"
                              emptyText="#{bindings.addmemberBean.viewable ? 'No data to display.' : 'Access Denied.'}"
                              fetchSize="#{bindings.addmemberBean.rangeSize}"
                              rowBandingInterval="0" id="t1"
                              binding="#{pageFlowScope.MemberBean.tempTable}"
                              value="#{bindings.addmemberBean.collectionModel}"
                              rowSelection="multiple"
                              selectionListener="#{bindings.addmemberBean.collectionModel.makeCurrent}">
                      <af:column sortProperty="name" sortable="false"
                                 headerText="#{bindings.addmemberBean.hints.name.label}"
                                 id="c6" width="106">
                        <af:outputText id="ot2" value=" #{row.name}"/>
                      </af:column>
                      <af:column sortProperty="surname" sortable="false"
                                 headerText="#{bindings.addmemberBean.hints.surname.label}"
                                 id="c4" width="104">
                        <af:outputText value="#{row.surname}" id="ot3"/>
                      </af:column>
                      <af:column sortProperty="emailaddress" sortable="false"
                                 headerText="#{bindings.addmemberBean.hints.emailaddress.label}"
                                 id="c5" width="105">
                        <af:outputText value="#{row.emailaddress}" id="ot4"/>
                      </af:column>
                      <af:column sortProperty="firstname" sortable="false"
                                 headerText="#{bindings.addmemberBean.hints.firstname.label}"
                                 id="c3" width="105">
                        <af:outputText value="#{row.firstname}" id="ot1"/>
                      </af:column>
                      <af:column id="c7" headerText="AddUser">
                        <af:selectBooleanCheckbox
                                                  label="Label 1" id="sbc1"/>
                      </af:column>
                      <af:column headerText="#{bindings.addmemberBean.hints.name.label}"
                                 id="c8" visible="false">
                        <af:inputText value="#{row.bindings.name.inputValue}"
                                      label="#{bindings.addmemberBean.hints.name.label}"
                                      required="#{bindings.addmemberBean.hints.name.mandatory}"
                                      columns="#{bindings.addmemberBean.hints.name.displayWidth}"
                                      maximumLength="#{bindings.addmemberBean.hints.name.precision}"
                                      shortDesc="#{bindings.addmemberBean.hints.name.tooltip}"
                                      id="it2">
                          <f:validator binding="#{row.bindings.name.validator}"/>
                        </af:inputText>
                      </af:column>
                    </af:table>
    and I don't want to use
    public void emptytable() {
        ViewObjectImpl vo = this.getAddMemberVo1();
        vo.executeEmptyRowSet();
    }
    because I am not using a view; I am using a bean.

    public Beantest() {
        super();
        addmemberBean = new ArrayList<AddmemberBean>();
    }
    This way, the list will be completely empty when your bean is created, and so your table shouldn't contain any data.

  • Does table load make the view INVALID

    Hi All,
    One of our jobs failed due to an INVALID view. When I check the view's last_ddl_time, it has a timestamp more recent than the job failure. I have also checked the base table: one of the table partitions has the same last_ddl_time as the view. It looks like the partition was exchanged, or some other DDL ran on that partition, but the table structure was not altered (no columns added, no columns renamed). My question: can adding a partition, a partition exchange, or loading a partition make the views on the table INVALID? What could be the reason for this?
    Thanks in advance....

    Hi,
    The packages that use this view will become invalid, but the next time a package is accessed it will be revalidated automatically.
    Source:-
    ~~~~~
    Check http://asktom.oracle.com/pls/asktom/f?p=100:11:0::::P11_QUESTION_ID:18018481500446
    Thanks
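    For illustration (the view name is a placeholder), you can also check for and recompile an invalid view manually instead of waiting for the next access to revalidate it:
    -- list invalid objects owned by the current user
    SELECT object_name, object_type, status
    FROM   user_objects
    WHERE  status = 'INVALID';
    -- recompile the view explicitly
    ALTER VIEW my_view COMPILE;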
