Dbms_lob.getlength() returns different source and destination lengths

I am fairly new to PL/SQL, so maybe this is an obvious problem, but here goes. I am updating a CLOB field with a text file ~5KB in size. The field updates fine (as far as I can tell). Before I update the field, I open the source file as a BFILE and then get its length with dbms_lob.getlength(). I then update the CLOB field using dbms_lob.loadclobfromfile(). This seems to work fine. However, when I call dbms_lob.getlength() on the destination object returned by dbms_lob.loadclobfromfile(), I get a length 3 characters less than that of the source object (5072 vs 5075). Both the source and destination offsets are set to 1.
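Roughly, the load looks like this (a minimal sketch of what I described; the directory object, file name, and table are placeholders, not my real ones):

    DECLARE
      src    BFILE := BFILENAME('MY_DIR', 'source.txt');  -- placeholder directory/file
      dst    CLOB;
      d_off  INTEGER := 1;
      s_off  INTEGER := 1;
      lang   INTEGER := DBMS_LOB.DEFAULT_LANG_CTX;
      warn   INTEGER;
    BEGIN
      SELECT doc INTO dst FROM my_table WHERE id = 1 FOR UPDATE;  -- placeholder table
      DBMS_LOB.FILEOPEN(src, DBMS_LOB.FILE_READONLY);
      DBMS_OUTPUT.PUT_LINE('source length:      ' || DBMS_LOB.GETLENGTH(src));  -- reports 5075
      DBMS_LOB.LOADCLOBFROMFILE(dst, src, DBMS_LOB.LOBMAXSIZE,
                                d_off, s_off, DBMS_LOB.DEFAULT_CSID, lang, warn);
      DBMS_OUTPUT.PUT_LINE('destination length: ' || DBMS_LOB.GETLENGTH(dst));  -- reports 5072
      DBMS_LOB.FILECLOSE(src);
      COMMIT;
    END;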
Digging through what documentation I could find, I found this at http://download.oracle.com/docs/cd/B28359_01/appdev.111/b28419/d_lob.htm#i998484:
"The length returned for a BFILE includes the EOF, if it exists. Any 0-byte or space filler in the LOB caused by previous ERASE or WRITE operations is also included in the length count. The length of an empty internal LOB is 0."
I did not create the source file, and I believe it is a Unix-type file (because of the lack of CRs); I am running on Windows 7. I am also using 11g Express. Could the use of a Unix-type file for a CLOB on a Windows system be causing this character count difference?
Now that I've found this issue I can work around it. I just want to understand what is going on.
Thanks to all who look at this.

The EOF and the LF versus CR/LF difference could influence the count, yes.
Another explanation could be character set conversion. A BFILE, I believe, counts bytes, while a CLOB counts "characters" - so if the source happens to contain a few multibyte (UTF) characters, the byte count would be larger than the character count.
To help you find the cause for your exact file, I can suggest a couple of things you might do to explore the issue:
• Load the file into a BLOB instead of a CLOB and see what getlength() returns for the BLOB. BLOBs do byte counts and do not try to treat the source as text (see the sketch below).
• Save the CLOB back to the filesystem and compare the original file with the exported CLOB using some file-compare tool.
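For instance, the first check could look roughly like this (a minimal sketch, assuming a directory object MY_DIR pointing at the folder with your file; the names are placeholders):

    DECLARE
      src    BFILE := BFILENAME('MY_DIR', 'source.txt');  -- placeholder names
      dst    BLOB;
      d_off  INTEGER := 1;
      s_off  INTEGER := 1;
    BEGIN
      DBMS_LOB.CREATETEMPORARY(dst, TRUE);
      DBMS_LOB.FILEOPEN(src, DBMS_LOB.FILE_READONLY);
      DBMS_LOB.LOADBLOBFROMFILE(dst, src, DBMS_LOB.LOBMAXSIZE, d_off, s_off);
      -- both lengths are byte counts here, so they should match
      DBMS_OUTPUT.PUT_LINE('bfile bytes: ' || DBMS_LOB.GETLENGTH(src));
      DBMS_OUTPUT.PUT_LINE('blob bytes:  ' || DBMS_LOB.GETLENGTH(dst));
      DBMS_LOB.FILECLOSE(src);
      DBMS_LOB.FREETEMPORARY(dst);
    END;

If the two byte counts match but the CLOB character count is still 3 short, character set conversion (or line-ending translation) during the CLOB load is the likely culprit.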

Similar Messages

  • Dbms_lob.getlength() returns different values

    Hi !
I am not a developer, so I possibly cannot answer developer-specific questions.
We have two instances running on 10.2.0.4, but they give different values for the following block:
    declare
         xml varchar2(32676) :=
    'SELECT XT_STRATEGY, ACCT_DESCRIPTION,sum(MON_PL) PL_MON ,sum(TUE_PL) PL_TUE,sum(WED_PL) PL_WED ,sum(THU_PL) PL_THU, sum(FRI_PL) PL_FRI, sum(WTD_PL) PL_WTDPL,sum(WTD_PREMIUM_PL) PL_WTDPRMPL, sum(MTD_PL) PL_MARKPL,sum(MTD_PREMIUM_PL) PL_MTDPRMPL,sum(YTD_PL) PL_YTDPL,sum(YTD_PREMIUM_PL) PL_YTDPRMPL,
                   DLC_REGION,DLC_ENTITY  FROM (SELECT xt_strategy, acct_description ,case when  blankout_date  < to_char( to_date(    fv[EOD]        )+ (1 -to_char(to_date(to_date(    fv[EOD]        )),''D'')))  then 0 else MON_PL end as MON_PL, case when  blankout_date  < to_char( to_date(    fv[EOD]        )+ (1 -to_char(to_date(to_date(    fv[EOD]        )),''D''))) then 0 else tue_pl end as tue_pl,
    case when  blankout_date  < to_char( to_date(    fv[EOD]        )+ (1 -to_char(to_date(to_date(    fv[EOD]        )),''D'')))  then 0 else WED_PL end as WED_PL, case when  blankout_date  < to_char( to_date(    fv[EOD]        )+ (1 -to_char(to_date(to_date(    fv[EOD]        )),''D'')))  then 0 else THU_PL end as  THU_PL,
    case when  blankout_date  < to_char( to_date(    fv[EOD]        )+ (1 -to_char(to_date(to_date(    fv[EOD]        )),''D'')))  then 0 else FRI_PL   end as  FRI_PL, case when  blankout_date  < to_char( to_date(    fv[EOD]        )+ (1 -to_char(to_date(to_date(    fv[EOD]        )),''D'')))  then 0 else WTD_PL   end as  WTD_PL,
    case when  blankout_date  < to_char( to_date(    fv[EOD]        )+ (1 -to_char(to_date(to_date(    fv[EOD]        )),''D'')))  then 0  else WTD_PREMIUM_PL   end as  WTD_PREMIUM_PL,
    case when  blankout_date  < (select actual_date from tbl_date where date_type = ''PLMonthStart'' and to_char(actual_date, ''MMYYYY'') = to_char(to_date(    fv[EOD]        ),''MMYYYY'')) then 0 else mtd_pl end as mtd_pl,
    case when  blankout_date  < (select actual_date from tbl_date where date_type = ''PLMonthStart'' and to_char(actual_date, ''MMYYYY'') = to_char(to_date(    fv[EOD]        ),''MMYYYY'')) then 0 else MTD_PREMIUM_PL end as MTD_PREMIUM_PL,
    case when  blankout_date  < (select to_date(''01-JAN-''|| to_char(to_date((fv[EOD]),''DD-MON-YYYY''),''YYYY''),''DD-MON-YYYY'') from tbl_business_date) then 0 else ytd_PL end as ytd_PL,
    case when  blankout_date  < (select to_date(''01-JAN-''|| to_char(to_date((fv[EOD]),''DD-MON-YYYY''),''YYYY''),''DD-MON-YYYY'') from tbl_business_date) then 0 else ytd_premium_PL end as ytd_premium_PL,
    IN_STRIKECURR, IN_QUOTEDCUR, case dlc_region when ''LON'' then ''LN'' when ''NYC'' then ''NY'' when ''TOK'' then ''TK'' else dlc_region
    end as DLC_REGION, DLC_ENTITY,DTL_REFERENCE,TRADE_STATE,blankout_date
    FROM (select a.bo_book_id as xt_strategy, a.bo_book_id  as acct_description , b.mon_pl, b.tue_pl, b.wed_pl,b.thu_pl,b.fri_pl,b.wtd_pl, b.wtd_premium_pl , b.mtd_pl , b.mtd_premium_pl, b.ytd_pl, b.ytd_premium_pl,
               a.BASE_CCY as IN_STRIKECURR, A.QUOTED_CCY AS IN_QUOTEDCUR,  a.region_code as dlc_region, a.entity_acronym  as dlc_entity, TP.EBI_TRADE_ID AS DTL_REFERENCE, a.trade_state,
         case when a.trade_state in  (7,8,13) then a.amendment_date when a.maturity_date > nvl(a.premium_settlement_date, to_date(''01-JAN-1900'')) then
              Case when a.trade_state in  (4,5 ) then  trunc(a.updated_date) else a.maturity_date end else a.premium_settlement_date end as blankout_date  FROM      TBL_TRADE_HIST A
         join vw_feed_downstream_trade_id tp on a.trade_id = tp.trade_id and tp.mapping_source_name = ''ramfx''
       join tbl_pl b on a.trade_id = b.trade_id and a.version = b.trade_version      and b.pl_date = (select max(pl_date) from tbl_pl d where a.trade_id  = d.trade_id and pl_date <=     fv[EOD]        )
       join tbl_bo_book bk on a.bo_book_id = bk.bo_book_id  and bk.business_Area = ''OPTIONS''
       where b.pl_date <     fv[EOD] 
       AND      NOT EXISTS (SELECT 1 FROM TBL_EOD_TRADES D WHERE A.TRADE_ID = D.TRADE_ID AND  D.BUSINESS_DATE =     fv[EOD]        )
    UNION
        SELECT A.XT_STRATEGY,     A.ACCT_DESCRIPTION, C.MON_PL,C.TUE_PL, C.WED_PL,C.THU_PL, C.FRI_PL,C.WTD_PL,C.WTD_PREMIUM_PL,C.MTD_PL,C.MTD_PREMIUM_PL,C.YTD_PL,C.YTD_PREMIUM_PL,
             A.IN_STRIKECURR, A.IN_QUOTEDCUR,  A.DLC_REGION,A.DLC_ENTITY, A.DTL_REFERENCE , a.trade_state , case when a.trade_state in  (7,8,13) then  a.DLC_BUSDATEUPD when a.IN_EXPIRATION > nvl(a.XT_SETTLEDATE, to_date(''01-JAN-1900'')) then a.IN_EXPIRATION else a.XT_SETTLEDATE end as blankout_date
        FROM VW_EBI_FEED A , TBL_EOD_TRADES B , TBL_PL C
        WHERE A.TRADE_ID                    = B.TRADE_ID   AND      A.VERSION  = B.TRADE_VERSION   AND      A.TRADE_ID    = C.TRADE_ID
        AND      A.ACCT_RISKFEED             = ''D''    AND      A.DLC_DEALSTATE   in  (''LIVE'',''PENDING'',''CANCELLED'',''REVERSED'')    AND      A.VERSION                 = C.TRADE_VERSION
        AND      B.BUSINESS_DATE         =     to_date(    fv[EOD]          ,''DD-MON-YYYY'')    AND C.PL_DATE   =    to_date(    fv[EOD]          ,''DD-MON-YYYY'')
    UNION
        SELECT A.BO_BOOK_ID  AS XT_STRATEGY , A.BO_BOOK_ID  AS ACCT_DESCRIPTION,   a.MON_PL , a.TUE_PL , a.WED_PL, a.THU_PL , a.FRI_PL, a.WTD_PL ,a.WTD_PREMIUM_PL , a.mtd_pl,  a.MTD_PREMIUM_PL,a.ytd_PL ,  a.ytd_premium_PL,A.BASE_CCY AS IN_STRIKECURR,A.QUOTED_CCY AS IN_QUOTEDCUR,
                       A.REGION_CODE as dlc_region ,A.ENTITY_ACRONYM AS DLC_ENTITY, A.MAPPING_SOURCE_ID AS DTL_REFERENCE, 12 as trade_state , a.PL_DATE as blankout_date
                                        FROM TBL_PL_ARCH A , tbl_bo_book b  where a.bo_book_id = b.bo_book_id  and      b.business_area = ''OPTIONS''
    union
    select d.bo_book_id as xt_strategy, d.bo_book_id as acct_description, case  when to_char(b.pl_date,''D'') = 6 then (b.inception_pl + b.inception_premium_pl ) else  (b.mon_pl * -1) end as mon_pl ,
             case when to_char(b.pl_date,''D'') = 2 then  (b.inception_pl + b.inception_premium_pl )  when to_char(b.pl_date,''D'')  > 2 then  (b.tue_PL  * -1) else 0 end as tue_pl,  case when to_char(b.pl_date,''D'') = 3 then  (b.inception_pl + b.inception_premium_pl )  when to_char(b.pl_date,''D'') > 3 then  (b.wed_pl * -1) else 0 end as wed_pl ,
               case when to_char(b.pl_date,''D'') = 4 then  (b.inception_pl + b.inception_premium_pl)  when to_char(b.pl_date,''D'') > 4 then  (b.thu_pl * -1) else 0 end as thu_pl ,case when to_char(b.pl_date,''D'') = 5 then  (b.inception_pl + b.inception_premium_pl)  when to_char(b.pl_date,''D'') > 5 then  (b.fri_pl * -1) else 0 end as fri_pl ,
             (b.mon_pl + b.tue_pl + b.wed_pl  + b.thu_pl + b.fri_pl) * -1 + (b.inception_pl + b.inception_premium_pl)  as wtd_pl, case when b.wtd_premium_pl <> 0 then b.wtd_premium_pl *-1 + (b.inception_premium_pl) else b.wtd_premium_pl end as wtd_premium_pl  ,       case when b.mtd_pl <> 0 then  b.mtd_pl * -1 + (b.inception_pl) else b.mtd_pl end as mtd_pl ,
             case when b.mtd_premium_pl <> 0 then b.mtd_premium_pl * -1 +  (b.inception_premium_pl ) else b.mtd_premium_pl end as mtd_premium_pl,   case when b.ytd_pl <> 0 then b.ytd_pl * -1 + (b.inception_pl) else b.ytd_pl  end as ytd_pl,
             case when b.ytd_premium_pl <> 0 then b.ytd_premium_pl * -1 + (b.inception_premium_pl) else b.ytd_premium_pl  end as ytd_premium_pl ,
              c.base_ccy as IN_STRIKECURR,c.quoted_ccy as IN_QUOTEDCUR,c.region_code as dlc_region,c.entity_acronym as dlc_entity, to_char(c.trade_id) as dtl_reference,
            8 as trade_state,  case when to_char(b.pl_date,''D'') = 6 then b.pl_date + 2 else b.pl_date end as blankout_date
    from tbl_trade_hist a
    join tbl_pl b on a.trade_id = b.trade_id and a.version = b.trade_version
    join tbl_trade_hist c on a.tradE_id = c.trade_id
    join tbl_pl d on c.trade_id = d.trade_id and c.version = d.trade_version and d.pl_date =   (select max(e.pl_date) from tbl_pl e where c.trade_id = e.trade_id and e.pl_date <= c.amendment_date)
    join tbl_bo_book f on a.bo_book_id = f.bo_book_id and f.business_area = ''OPTIONS''
    where (b.bo_book_id <> d.bo_book_id or a.region_code <> c.region_code or a.entity_acronym <> c.entity_acronym or a.base_ccy <> c.base_ccy or a.quoted_ccy <> c.quoted_ccy)
    and      d.pl_date  -  b.pl_date =  case when to_char(d.pl_date,''DAY'') = ''MON'' then  3 else  1 end
    and      d.pl_date <= to_date(    fv[EOD]          ,''DD-MON-YYYY'')
    union all
    select b.bo_book_id as xt_strategy,  b.bo_book_id as acct_description,case  when to_char(b.pl_date,''D'') = 6 then (b.inception_pl * -1 + b.inception_premium_pl * -1) else  (b.mon_pl ) end as mon_pl ,
             case when to_char(b.pl_date,''D'') = 2 then  (b.inception_pl * -1 + b.inception_premium_pl * -1)  when to_char(b.pl_date,''D'')  > 2 then  (b.tue_PL) else 0 end as tue_pl, case when to_char(b.pl_date,''D'') = 3 then  (b.inception_pl * -1  + b.inception_premium_pl * -1 )  when to_char(b.pl_date,''D'') > 3 then  (b.wed_pl) else 0 end as wed_pl ,
               case when to_char(b.pl_date,''D'') = 4 then  (b.inception_pl * -1 +  b.inception_premium_pl * -1 )   when to_char(b.pl_date,''D'') > 4 then  (b.thu_pl ) else 0 end as thu_pl , case when to_char(b.pl_date,''D'') = 5 then  (b.inception_pl *-1 + b.inception_premium_pl * -1)  when to_char(b.pl_date,''D'') > 5 then  (b.fri_pl) else 0 end as fri_pl ,
             (b.mon_pl + b.tue_pl + b.wed_pl  + b.thu_pl + b.fri_pl)  + (b.inception_pl* -1 + b.inception_premium_pl * -1)  as wtd_pl, case when b.wtd_premium_pl <> 0 then b.wtd_premium_pl + (b.inception_premium_pl * -1) else b.wtd_premium_pl end as wtd_premium_pl  ,   case when b.mtd_pl <> 0 then  b.mtd_pl + (b.inception_pl * -1) else b.mtd_pl end as mtd_pl ,
             case when b.mtd_premium_pl <> 0 then b.mtd_premium_pl +  (b.inception_premium_pl * -1) else b.mtd_premium_pl end as mtd_premium_pl,  case when b.ytd_pl <> 0 then b.ytd_pl + (b.inception_pl * -1) else b.ytd_pl  end as ytd_pl,
             case when b.ytd_premium_pl <> 0 then b.ytd_premium_pl + (b.inception_premium_pl * -1) else b.ytd_premium_pl  end as ytd_premium_pl ,
             a.base_ccy as IN_STRIKECURR, a.quoted_ccy as IN_QUOTEDCUR, a.region_code as dlc_region,a.entity_acronym as dlc_entity, to_char(a.trade_id) as dtl_reference, 8 as trade_state,
            case when to_char(b.pl_date,''D'') = 6 then b.pl_date + 2 else b.pl_date end as blankout_date
    from tbl_trade_hist a
    join tbl_pl b on a.trade_id = b.trade_id and a.version = b.trade_version
    join tbl_trade_hist c on a.tradE_id = c.trade_id join tbl_pl d on c.trade_id = d.trade_id and c.version = d.trade_version and d.pl_date =   (select max(e.pl_date) from tbl_pl e where c.trade_id = e.trade_id and e.pl_date <= c.amendment_date)
    join tbl_bo_book f on a.bo_book_id = f.bo_book_id and f.business_area = ''OPTIONS''
    where (b.bo_book_id <> d.bo_book_id or a.region_code <> c.region_code or a.entity_acronym <> c.entity_acronym or a.base_ccy <> c.base_ccy or a.quoted_ccy <> c.quoted_ccy)
    and      d.pl_date  -  b.pl_date =  case  when to_char(d.pl_date,''DAY'') = ''MON'' then  3 else  1 end and      d.pl_date   <= to_date(  fv[EOD]  ,''DD-MON-YYYY'') ))
    group by xt_strategy,ACCT_DESCRIPTION,DLC_REGION,DLC_ENTITY';
    c clob;
    s varchar2(25000);
    i numeric := 424;
    begin
         c := xml;
       dbms_output.put_line('length of clob: ' || dbms_lob.getlength(c));
       dbms_output.put_line('split clob at: ' || i);
       dbms_output.put_line('desired substring length: ' || to_char(dbms_lob.getlength(c) - i));
       s := dbms_lob.substr(c, dbms_lob.getlength(c) - i, i + 1);
       dbms_output.put_line('length of substring: ' || length(s));
end;
I checked the database character set and it is the same on both - AL32UTF8.
    The result we get is
    Output from MD1 database
    length of clob: 10616
    split clob at: 424
    desired substring length: 10192
    length of substring: 10192
    Output from MD2 database
    length of clob: 10616
    split clob at: 424
    desired substring length: 10192
length of substring: 8191
Any idea why there is this discrepancy?


  • How to keep sort order in Shuttle item's source and destination boxes?

    I have defined a shuttle item on a page and the list of values in the source box is populated by a SQL with sort order specified. In the Settings of the shuttle item, Show Controls value is set to Moving Only instead of All. When a user moves selected values from the source box to the destination box or vice versa, the list of values does not maintain the original sort order. For example, if I have 26 values A through Z, a user can put value Z before A in the destination box. And when a user move A from the destination box to the source box, A is the last item in the source box. The reset button clears everything in the destination box and resets the source box to the original sorted list. This is not the solution that I am looking for. I would like to see the items in the source and destination boxes to maintain their original sort order all the time.
    Is it possible?
    APEX v4.0.1.00.03
    Oracle 11g (v11.2.0.1.0)

    Hi,
See if this post helps:
    Re: Sort Shuttle Right
    Regards,
    Jari

  • HPCM custom driver use source and destination

    Hi All,
I'm using HPCM 12.1.2. I need your help to create a custom driver that uses information from both the source and the destination. I need to use 1 dimension of the source and 2 of the destination.
I have the situation below:
    STAGE1 (SOURCE)
    Dim1 Machine
    Dim2 Account
    STAGE2 (DESTINATION)
    Dim1 Process
    Dim2 Operation
    Dim 3 Account
    Measure: Time used in the Machine
The information extracted is:
    Machine1 - Process1 - Operation1 = 10 Hours
    Machine1 - Process1 - Operation2 = 9 Hours
    Machine2 - Process1 - Operation1 = 13 Hours
How can I create a driver that can use this information? Is it possible?
Thanks in advance
    Diogo

    Hi Alex,
    You have to increase the Max Row size limit in Xcelsius.
Go to File menu >> Preferences >> increase the "Max Row Size"; by default it will be 500 and you can increase it as per the requirement.
    Regards,
    AnjaniKumar C.A.

Could not complete your request because source and destination files are the same

Hi, thank you for reading.
I'm having this problem and it's driving me nuts.
I'm actually following a tutorial that you can check out here: http://nightshifted.tumblr.com/post/2559360661/tutorial-paused-animations
Basically I'm trying to make an animated GIF with canvas (I'm sorry if my English is not so great). When I try to drag the layers into the canvas (step 2 of the tutorial), I get the error: "could not complete your request because source and destination are the same".
Can anybody help me? I have both CS3 and CS5 and the error appears in both.
Thank you in advance

I think they mean to select the layers and frames and, using the move tool, drag
inside the document (click inside the document window and drag) to move the
selected layers to the top half (the transparent area) - not to drag the layers from
the layers palette into the document, which would give that error.
    MTSTUNER

  • Dynamic source and destination tables

    Hi all
    I've got to import 142 tables from csv into SQL 2008 on a regular basis.
    I was looking at building a 142-part SSIS package to do this, then thought there must be a dynamic way of doing it.
    Is there any way of dynamically changing the source and and destination tables?
    The csv filenames will remain identical, the SQL tables will be the same names but with "_Staging" at the end of them (e.g. SRSection.csv will always go into SRSection_Staging).
    I can then write MERGE statements to update the main tables from the staging data.
    Any help on this would be greatly appreciated.
    I get the the feeling I would need a FOREACH LOOP container but I'd really aprreciate a step-by-step guide if you can.
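For the staging-to-main step mentioned above, a MERGE would look roughly like this (a sketch only; the key and data columns are hypothetical, since the real table definitions aren't shown):

    -- Upsert one main table from its staging twin
    -- (SectionID / SectionName are placeholder columns)
    MERGE INTO dbo.SRSection AS tgt
    USING dbo.SRSection_Staging AS src
        ON tgt.SectionID = src.SectionID
    WHEN MATCHED THEN
        UPDATE SET tgt.SectionName = src.SectionName
    WHEN NOT MATCHED BY TARGET THEN
        INSERT (SectionID, SectionName)
        VALUES (src.SectionID, src.SectionName);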

Please check this: http://sql-bi-dev.blogspot.com/2010/07/dynamic-database-connection-using-ssis.html
STEP 1:
To begin, create two tables as shown below in one of the environments:
    -- Table to store list of Sources
CREATE TABLE SourceList (
   ID [smallint],
   ServerName [varchar](128),
   DatabaseName [varchar](128),
   TableName [varchar](128),
   ConnString [nvarchar](255)
)
GO
    -- Local Table to store Results
CREATE TABLE Results (
   TableName   [varchar](128),
   ConnString  [nvarchar](255),
   RecordCount [int],
   ActionTime  [datetime]
)
GO
    STEP 2:
Insert all connection strings into the SourceList table using the script below:
INSERT INTO SourceList
SELECT 1 ID,
       '(local)' ServerName,    -- define the required server
       'TestHN' DatabaseName,   -- define the DB name
       'TestTable' TableName,
       'Data Source=(local);Initial Catalog=TestHN;Provider=SQLNCLI10.1;Integrated Security=SSPI;Auto Translate=False;' ConnString
Insert as many connections as you want.
    STEP 3:
Add a new package to your project and rename it ForEachLoopMultipleServers.dtsx. Add the following variables:

Variable: ConnString
Type: String
Value: Data Source=(local);Initial Catalog=TestHN;Provider=SQLNCLI10.1;Integrated Security=SSPI;Auto Translate=False;
Purpose: To store the default connection string

Variable: Query
Type: String
Value: SELECT '' TableName, N'' ConnString, 0 RecordCount, GETDATE() ActionTime
Purpose: Default SQL query string; this can be modified at runtime based on other variables

Variable: SourceList
Type: Object
Value: System.Object
Purpose: To store the list of connection strings

Variable: SourceTable
Type: String
Value: Any table name; it can be blank
Purpose: To store the table name of the current connection string; this table will be queried at run time
    STEP 4:
    Create two connection managers as shown below:
Local.TestHN: For the local database, which has the SourceList table. This will also be used to store the results in the Results table.
DynamicConnection: This connection will be used to set up a dynamic connection to multiple servers.
Now click on DynamicConnection in the connection manager and click on the ellipsis to set up the dynamic connection string. Map the connection string to the variable User::ConnString.
    STEP 5:
Drag and drop an Execute SQL Task and rename it "Execute SQL Task - Get List of Connection Strings". Now open its properties and set the following values as shown in the snapshot:
    Result Set: Full Result Set
    Connection: Local.TestHN
    ConnectionType: Direct Input
    SQL Statement: SELECT ConnString,TableName FROM SourceList
Now click on Result Set to store the result of the SQL Task in the variable User::SourceList.
STEP 6:
Drag and drop a ForEach Loop container from the toolbox and rename it "Foreach Loop Container - DB Tables". Double click on the ForEach Loop container to open the Foreach Loop Editor. Click on Collection and select Foreach ADO Enumerator as the Enumerator. In the Enumerator configuration, select User::SourceList as the ADO object source variable as shown below:
STEP 7: Drag and drop a Script Task inside the ForEach Loop container and double click on it to open the Script Task Editor. Select User::ConnString and User::SourceTable as ReadOnlyVariables and User::Query as ReadWriteVariables. Now click on the Edit Script button and write the following code in the Main function:
public void Main()
{
    try
    {
        String Table = Dts.Variables["User::SourceTable"].Value.ToString();
        String ConnString = Dts.Variables["User::ConnString"].Value.ToString();
        MessageBox.Show("SourceTable = " + Table +
                        "\nCurrentConnString = " + ConnString);
        // Shape: SELECT '' TableName, N'' ConnString, 0 RecordCount, GETDATE() ActionTime
        string SQL = "SELECT '" + Table +
                     "' AS TableName, N'" + ConnString +
                     "' AS ConnString, COUNT(*) AS RecordCount, GETDATE() AS ActionTime FROM " +
                     Table + " (NOLOCK)";
        Dts.Variables["User::Query"].Value = SQL;
        Dts.TaskResult = (int)ScriptResults.Success;
    }
    catch (Exception e)
    {
        Dts.Log(e.Message, 0, null);
        Dts.TaskResult = (int)ScriptResults.Failure;
    }
}
STEP 8:
Drag and drop a Data Flow Task and double click on it to open the Data Flow tab. Add an OLE DB Source and an OLE DB Destination. Double click on the OLE DB Source to configure its properties. Select DynamicConnection as the OLE DB connection manager and "SQL command from variable" as the Data access mode. Select User::Query as the variable name. Now click on Columns to generate the metadata.
Double click on the OLE DB Destination to configure its properties. Select Local.TestHN as the OLE DB connection manager and "Table or view - fast load" as the Data access mode. Select [dbo].[Results] as the name of the table or the view. Now click on Mappings to map the columns from the source. Click OK and save the changes.
Finally the DFT will look like the snapshot below:
STEP 9: We are done with package development and it's time to test the package.
Right click on the package in Solution Explorer and select Execute. The message box will display the current connection string.
Once you click OK, it will execute the Data Flow Task and load the record count into the Results table. This is an iterative process until all the connections are done. Finally the package will execute successfully.
You can check the data in the Results table:
Here is the result:
    SELECT *
    FROM SourceList
    SELECT *
    FROM Results
Regards, Pradyothana DP. http://www.dbainhouse.blogspot.in/

  • Setting up a project where source and destination are different formats

    Howdy...
    I'm a recent Vegas user who is just switching over to FCP6.
    My question is this: When your source footage and destination format are two unrelated formats (for instance, HDV for source footage, Apple-TV H264 for destination), is it better to set up the project for your source footage format, or for your destination format?
    In Vegas it didn't really matter, but it was good to use the destination format because you could preview what it'd look like at the size and frame-rate that you were going to end up in.
In FCP6, however, it seems that importing HDV into an H264 project may not work as well, because it wants to render the footage into the correct format before you can preview it.
FYI: The final destination is my gadget podcast, http://www.neo-fight.tv, which I have shot on HDV and edited on Vegas for the past year. FCP is very new to me, but I'm enjoying learning something new.
    Let me know your thoughts...
    Best,
    Benjamin
    http://www.neo-fight.tv [The TV Show for The 'Not-So-Geeky']
    MacBook, MacPro   Mac OS X (10.4.9)  

Hi Benjamin - Prior to FCS 2 I worked in whatever my capture format was, and I'll still continue to do that. I plan to use ProRes now - but if I was coming from an HDV source I'd probably opt to capture using DVCPRO HD and work in that.
"In FCP6, however, it seems that importing HDV into an H264 project may not work as well, because it wants to render the footage into the correct format before you can preview it."
As far as working in H264 is concerned, and combining other formats into it, I really don't know.
H264 for me is a delivery format - so I'd opt to still work in the native capture format and then transcode/convert at the end. More options ....

  • Source and destination clarification needed

    Hi Guys,
We are using SRM as an add-on component to ECC 6.0. We are on SRM Server 5.5. In the definition of the backend system, SAP asked us to create 4 entries:
1. ONECLNTERP: RFC tick is on
2. ONECLNTEBP:
3. ONECLNTSUS:
4. XXXCLNTYYY: Local tick is on.
So this means 'ONECLNTERP' is my backend system, i.e. ECC 6.0, and 'XXXCLNTYYY' is my local system.
Now, while defining the backend system for the product category, there is a 'Source System' and a 'Target System'. The source system is the system from which master data is replicated. So should this be my SRM system (as the product category is a part of SRM), or should it be ECC (as my product categories are copies of the material groups from ECC)?
The target system is the system into which the follow-on document of the SC is transferred, i.e. the backend ECC system.
Am I correct in saying the above? I'm confused about which system is the source and which system is the destination.
While defining the account assignment category and defining the PR document type I'm facing the same problem, as I have to define the source system.
Can anybody clarify this for me?
    Thanks
    Debashish

    Hello Deb,
    I have not worked on ECC 6.0.
But logically speaking:
Source system - has to be your ECC, as SRM would be taking the reference of the material masters from there only.
Target system - would be your same ECC again if you are using only one backend (and I think you are).
(If you are using multiple backends and you want the follow-on document in a different backend, then the SID of that backend would go here.)
    BR
    Dinesh

  • Association must have both source and destination ends.

I am getting this error when trying to develop an Oracle Applications Framework page using ICX and FND information. I am trying to find out if there is something missing. I don't see an association that is missing a source or destination end.


  • SQL Server replication and size differences of source and destination databases

I set up snapshot replication for a DB between two SQL instances. On the source instance, the DB shows as 106612.56MB with 34663.75MB as available free space. I expected that the replica would then end up being 71948.81MB (106612.56 - 34663.75), because it wouldn't replicate the white space. The resultant replica database is showing as 35522.94MB. The required data appears to be present in the replicated DB, as the SSRS reports that use it are able to find the data they look for. But why the large discrepancy in size between the source and replicated DB? The replicated DB is less than 1/2 the size of the source DB. I've searched around and can't seem to find any explanation. I realize this isn't mirroring, so the DBs will not be identical in size, but I did not expect to see such a large difference between the two. I am replicating almost all articles (tables, stored procs, etc.), with the exception of a handful of stored procedures and user-defined functions that either reference invalid column names in a table (vendor bug) or reference another DB that is not present on the replica's instance. I would not expect these 4-5 articles to account for a 37000 MB size difference between the two DBs.
Please note that this has nothing to do with transaction log size. I am specifically talking about the database size and am not looking at the size that combines both DB and TxLog size.
Any insight?

Another factor could be that on the publisher the data is distributed through pages and extents. Depending on your fill factor, the amount of deletes, and your datatypes, there could be space in the pages and extents which has not been reclaimed.
During the bcp process, which is part of the snapshot application process on the subscriber, all the data will be written into the tables in a contiguous fashion. I would suspect this is why you have the difference in space usage.
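One way to compare what is actually used rather than raw file sizes is the standard sp_spaceused procedure (a quick check; the table name below is a placeholder):

    -- Database-level totals: run in the DB on both publisher and subscriber
    EXEC sp_spaceused;
    -- Per-table breakdown for a suspect article (placeholder name)
    EXEC sp_spaceused 'dbo.MyLargeTable';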

  • Migrating schemas - Different tablespace names on source and destination

    Hi,
I am migrating database schemas with exp/imp from 8i on Solaris to 9i on Linux, and also from one 9i on Linux to another 9i on another Linux machine.
The tablespaces and (empty) schemas already exist in the destination, so the schemas now need to be filled using imp. However, the default tablespace of the schemas has a different name in the source than in the destination. Is this going to cause errors during the import?
    Thanks.

Not at all, I think. Just verify that the source tablespace is not present in the destination database. If you have a tablespace with the same name as the source, set the user's quota on it to zero, so imp will switch to the default tablespace.
Dba Z.
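For reference, adjusting the quotas would look roughly like this (a sketch; the user and tablespace names are hypothetical):

    -- Block imp from writing to the identically named tablespace ...
    ALTER USER app_user QUOTA 0 ON users_old;
    -- ... and give room in the user's default tablespace instead
    ALTER USER app_user QUOTA UNLIMITED ON users_new;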

  • Strcpy problem overlapping source and destination in 64 bit.

A simple program like the following, when compiled for 64-bit, gives a wrong result on some 64-bit machines.
/* a.c */
#include <stdio.h>
#include <string.h>

int main(void)
{
    char st[50];
    char *p = st + 2;

    strcpy(st, "XY1234");
    strcpy(st, p);   /* source and destination overlap here */
    printf("\n%s\n", st);
    return 0;
}
cc -o check_copy -m64 a.c
On running check_copy, the expected answer is "1234", but on some machines this gives "1434".
The libc.so.1 on the two machines is different.
The result is correct on older SunOS.
uname -a on the two machines (old_mc and new_mc):
SunOS old_mc 5.10 Generic_127112-05 i86pc i386 i86pc [ gives correct result ]
SunOS new_mc 5.10 Generic_147441-01 i86pc i386 i86pc [ gives wrong result ]
Both machines have the 118855-36 patch, which I believe is for libc.so.1.
As our software is a large one, a code fix will be difficult, as there will be many instances of the above scenario, triggered only in very rare use cases.
Any help regarding the patch details which can fix this issue would be appreciated.

    Hi,
Based on the error message, we should first confirm whether the Oracle client tool has been installed on the Report Server. If the Report Server is 64-bit, we should also install a 64-bit Oracle client tool on the Report Server. Besides, we can also try to bypass the tnsnames.ora file when connecting to Oracle by specifying all the information in the data source instead of using the TNS alias to connect to the Oracle database. Please see the sample below:
    Data Source=(DESCRIPTION=(CID=GTU_APP)(ADDRESS_LIST=(ADDRESS=(PROTOCOL=TCP)(HOST= server01.mydomain.com)(PORT=1521)))(CONNECT_DATA=(SID=OracleDB)(SERVER=DEDICATED)));
Reference: http://blogs.msdn.com/b/dataaccesstechnologies/archive/2010/06/30/ora-12154-tns-could-not-resolve-the-connect-identifier-specified-error-while-creating-a-linked-server-to-oracle.aspx
    Hope this helps.
    Regards,
Heidi Duan
    TechNet Community Support

Update rules, different source and target units (0BASE_UOM -> 0QVVMRA)

    Hi,
I have a problem with update rules, as I'm trying to map two key figures with different unit types together. They are both kilograms (KG), but the source figure is defined with 0BASE_UOM and the target key figure is 0QVVMRA (Amount of sales).
How can I make this connection work? Do I have to use the 'Unit calculation in routine' option, or is there a simpler way? If a routine must be written, please post an example or a link to a thread which has one.
    Thx!
    -miikka

    Hi,
The problem is that you are trying to connect a key figure of type Quantity with a key figure of type Amount.
The Amount key figure will have either 0CURRENCY or a fixed currency.
Using update rule coding you could transfer the value, but you will have problems with the currency: you can't store a unit like KG in 0CURRENCY.
    Regards.

Source and destination are deep structures - can't seem to map the detail table

    So, both my source data structure and my destination structure are very similarly built:
    They both have a HEADER and BODY component. The HEADER has the standard AIF fields (MSGGUID, NS, IFNAME, IFVER).
    I have defined the BODY as a deep structure composed of a Header Structure + a Detail Table. Each row in the BODY has a structure with a few header fields, then a table of detail data attached to it.
    In my Structure Mappings, I've created 2 rows:
    The first to map BODY-HEADER to my header structure.
    The second to map BODY-DETAILS to my details table. Here, I've chosen "Indirect Mapping".
    Selecting that row and navigating to "Define Field Mappings", I choose my destination table (DTL) and enter the Sub-table from my source structure.
    At this point, I only have 1 action - to save the data in Ztables. I'm seeing that the header data is passed in fine. But there aren't any detail lines coming through.
    Any ideas?
Thanks a lot.
    RP.

    Christoph,
    In your reply, I only have the ability to perform the First Structure Mapping (please see the below screenshots):
    Source Structure; BODY
    Destination Structure: BODY
    Field Mapping:
    1. Field in Dest.Struc=HDR, Sub-Table = SCH_HEADER
    2. Field in Dest.Struc=DTL,  Sub Table = SCH_DETAILS
    I presume I'm following your suggestion, but when I try to create the other two structure mappings, I only have HEADER or BODY as available options to choose from: I've attached screenshots of the F4 dropdown of the Source Structure and also the Destination Structure fields.
    I believe this is because my Interface Definition is defined as a HEADER and BODY:
    Source Data Structure
    Destination Data Structure

Drag & Relate Relation Editor isn't accessible for source and destination

    Dear all,
we've installed EP 7 SPS 15 and want to use the relationship editor. After importing the business objects from a system where we already had a connection with EP 6, the objects are available, but when we want to add them as source / destination in the relationship editor, a website error comes up:
mtxSelect error no object
We tried this with IE 7 and 6 and with Firefox, but Firefox didn't show any message at all.
So we need some ideas why this happens, or what we can do to fix / find the reason for this error.
Best regards
Thorsten Stracke

I am getting the same error. If anyone knows what might be causing this, please help.
    Thanks,
    Alex
