Issue with accessing members by dates

Hi,
I have an issue: based on the start and end dates, the qty and cost for a specific product line should be accessed.
Here is the description...
       START DATE   END DATE     QTY   COST
RM1    1/9/2009     30/9/2009    0.1   2
RM2    1/10/2009    31/10/2009   0.2   1.5
I need to access the products (RM1, RM2) based on the start date and end date.
Please post a detailed answer.

As per your dimension members, you want to see the numbers for a full-month period (01 Sep to 30 Sep).
For this case you don't need two dimensions; one dimension at month level is enough, and you can use alias names for reporting purposes.
If the data comes in with various start and end dates and you want to see the data for various combinations, you can maintain two calendar dimensions.
Essbase will calculate the aggregated numbers for the various combinations.
You don't need another time dimension.

Similar Messages

  • Issue with Oracle LONG RAW data type

    Hi All,
    I am facing some issues with the Oracle LONG RAW data type.
    We are using an Oracle 9iR2 database.
    I have a table with a LONG RAW column and I need to transfer its contents into another table with a LONG RAW column.
    When I tried INSERT INTO ... SELECT * (or CREATE TABLE ... AS SELECT *), it throws ORA-00997: illegal use of LONG datatype.
    I have gone through some docs and found that we should not use LONG RAW with these operations.
    So I wrote the basic PL/SQL block given below and was able to insert most of the records, but for records where the LONG RAW value is around 70 KB, the insert is failing.
    I tried converting LONG RAW to BLOB, and again, for records where the LONG RAW is big, I get an ORA-06502: PL/SQL: numeric or value error.
    I'd appreciate it if anyone can help me out here.
    DECLARE
      Y LONG RAW;  -- declared in the original post but never used
    BEGIN
      FOR REC IN (SELECT * FROM TRU_INT.TERRITORY WHERE TERRITORYSEQ = 488480 ORDER BY TERRITORYSEQ) LOOP
        INSERT INTO TRU_CMP.TERRITORY
          (BUSINESSUNITSEQ, COMPELEMENTLIFETIMEID, COMPONENTIMAGE, DESCRIPTION, ENDPERIOD, GENERATION, NAME, STARTPERIOD, TERRITORYSEQ)
        VALUES
          (REC.BUSINESSUNITSEQ, REC.COMPELEMENTLIFETIMEID, REC.COMPONENTIMAGE, REC.DESCRIPTION, REC.ENDPERIOD, REC.GENERATION, REC.NAME,
           REC.STARTPERIOD, REC.TERRITORYSEQ);
      END LOOP;
    END;
    /

    The SQL*Plus COPY command below might work:
    12:06:23 SQL> help copy
    COPY
    Copies data from a query to a table in the same or another
    database. COPY supports CHAR, DATE, LONG, NUMBER and VARCHAR2.
    COPY {FROM database | TO database | FROM database TO database}
                {APPEND|CREATE|INSERT|REPLACE} destination_table
                [(column, column, column, ...)] USING query
    where database has the following syntax:
         username[/password]@connect_identifier
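
    For reference, a hedged usage sketch of COPY based on the help text above (the connect strings, password, and service name here are hypothetical):

    COPY FROM tru_int/secret@orcl TO tru_cmp/secret@orcl -
    INSERT TERRITORY USING SELECT * FROM TERRITORY

    Note that the help text lists only CHAR, DATE, LONG, NUMBER and VARCHAR2 as supported types, so COPY may still reject a LONG RAW column. If it does, an INSERT ... SELECT using TO_LOB (for example TO_LOB(COMPONENTIMAGE), assuming COMPONENTIMAGE is the LONG RAW column) can move the data into a table whose target column is declared as BLOB; TO_LOB is usable only in the subquery of an INSERT statement.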

  • Issues with accessing HD on Time Capsule

    I have a Windows 7 32-bit machine that I just built. I have a Windows XP system and a MacBook Pro running Snow Leopard. My XP machine and my Mac are able to access the HD of my Time Capsule. My Windows 7 machine is having issues logging into the Time Capsule HD. When trying to log into the base station using the AirPort Base Station Utility I get a message stating "Unknown user, incorrect password, or login is disabled. Please retype the login information or contact the disk's administrator (86)." I have the AirPort Utility installed on my Windows 7 machine and am able to log in and modify the Time Capsule's settings. Does anyone have any insight as to how to assist me?

    Note: Mac 3 had a single user on it, my son. He was the admin. I created a new non-admin acct and moved all his stuff there (Not sure I did it the right way so it took a long time.). Now I am admin on that Mac. Not sure if that is relevant here (leading to the problem identified above) but thought I would mention it. 
    Thanks in advance for your help.
    The fact that you can now access the backup on every Mac except this one does suggest the account change is part of the issue. Did you make that change before or after the access problem appeared?
    Permissions can be pretty tricky things; doing backups, or retrieving them from an existing backup created under different user permissions, can be a mess. Pondini is perhaps the only one who can tell you the ins and outs of that.
    On a more general note, lots of people are having issues with USB drives on the TC since Lion; for many the drive stops working forever the moment the disk spins down after a reboot, making it useless.
    And it is dog slow for everyone. The reality is that USB runs at less than half its native speed on the TC compared to on the computer, and it wasn't that fast to begin with.
    Plug the hard drive directly into the computer, mount the sparsebundle, and see if Time Machine can then access it; you might need to use your admin login, not your son's.
    Personally, if the TC disk is not big enough, replace the TC with a larger-disk version, or replace the disk in the present one; do not use disks hanging off the TC. It has enough reliability issues without adding to them.

  • Issue with Enterprise Library's Data Access Application

    I do not know if I have the correct forum for this post. I have put it in what I believe is the most likely forum, but please move this if there is a more germane forum for it.
    I have inherited a solution (C#) which uses the reference Microsoft.Practices.EnterpriseLibrary.Data. The reference was included with the codebase I inherited; there are no missing references.
    When the library is called with:
    Microsoft.Practices.EnterpriseLibrary.Data.ExecuteDataSet("configConfigurationKeysSelect", parameterValues);
    It complains with the runtime error:
    [InvalidOperationException: The stored procedure 'configConfigurationKeysSelect' doesn't exist.] (I have put the full trace below)
    This is true; there IS no stored procedure 'configConfigurationKeysSelect', but there is a table named 'configConfigurationKeys'.
    I assumed this stored procedure is built on the fly to select from that table? Must I do something regarding Enterprise Library's Data Access Application Block so that this works on the system I am running it on? (I am simply using the references included with the source for this, nothing else.) I am looking for some guidance here; thanks in advance.
    Stack trace on error as:
    [InvalidOperationException: The stored procedure 'configConfigurationKeysSelect' doesn't exist.]    System.Data.SqlClient.SqlCommand.DeriveParameters() +5344249
       System.Data.SqlClient.SqlCommandBuilder.DeriveParameters(SqlCommand command) +115
       Microsoft.Practices.EnterpriseLibrary.Data.Sql.SqlDatabase.DeriveParameters(DbCommand discoveryCommand) +72
       Microsoft.Practices.EnterpriseLibrary.Data.Database.DiscoverParameters(DbCommand command) +251
       Microsoft.Practices.EnterpriseLibrary.Data.ParameterCache.SetParameters(DbCommand command, Database database) +225
       Microsoft.Practices.EnterpriseLibrary.Data.Database.AssignParameters(DbCommand command, Object[] parameterValues) +53
       Microsoft.Practices.EnterpriseLibrary.Data.Database.GetStoredProcCommand(String storedProcedureName, Object[] parameterValues) +161
       Microsoft.Practices.EnterpriseLibrary.Data.Database.ExecuteDataSet(String storedProcedureName, Object[] parameterValues) +70
    Why do you assume that a stored procedure is created on the fly? Looking at the documentation for this component, I don't see any mention of that.
    Dan Guzman, SQL Server MVP, http://www.dbdelta.com
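
    If the codebase really does expect that procedure, you may simply need to create it in the database yourself. A minimal hypothetical T-SQL sketch (the parameter and column names are assumptions, not from the original post):

    CREATE PROCEDURE dbo.configConfigurationKeysSelect
        @ConfigKey nvarchar(128) = NULL   -- hypothetical parameter
    AS
    BEGIN
        SET NOCOUNT ON;
        -- Return rows from the existing table the question mentions.
        SELECT *
        FROM dbo.configConfigurationKeys
        WHERE @ConfigKey IS NULL OR ConfigKey = @ConfigKey;
    END

    GetStoredProcCommand and DeriveParameters (visible in the stack trace) require the procedure to already exist so its parameters can be discovered; they never generate one.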

  • Issues with iMessage when switching data carrier (wifi - cellular)

    So I have had this issue with my 5S and now my 6+ in iMessage. Not always, but often enough, I run into issues sending messages via iMessage when I'm on Wi-Fi: they don't send. I get an error saying "not delivered", and it stacks up for as many outgoing messages as I try to send. However, if I force the first message to send as a text, subsequent messages go through without issue. If I go from Wi-Fi to cellular I have no issues whatsoever, and when I have these issues on Wi-Fi, my data works fine (incoming emails, Facebook notifications, etc.). I used to run cellular data only because I had unlimited data, but now I have a cap, so I intend to use Wi-Fi when possible... but having to send my first message as a text just so I can use iMessage is annoying...

    Hi there BigDogg795,
    I would recommend taking a look at the troubleshooting steps for iMessage found in the article below. 
    iOS: Troubleshooting Messages - Apple Support
    To resolve issues with sending and receiving iMessages, follow these steps:
    1. Check iMessage system status for current service issues.
    2. Go to Settings > Messages > Send & Receive and make sure that you registered iMessage with your phone number or Apple ID and that you selected iMessage for use. If the phone number or Apple ID isn't available for use, troubleshoot iMessage registration.
    3. Open Safari and navigate to www.apple.com to verify data connectivity. If a data connection isn't available, troubleshoot cellular data or a Wi-Fi connection.
    4. iMessage over cellular data might not be available while you're on a call. Only 3G and faster GSM networks support simultaneous data and voice calls. Learn which network your phone supports. If your network doesn't support simultaneous data and voice calls, go to Settings > Wi-Fi and turn Wi-Fi on to use iMessage while you're on a call.
    5. Restart your device.
    6. Tap Settings > General > Reset > Reset Network Settings on your iPhone.
    If you still can't send or receive an iMessage, follow these steps:
    1. Make sure that the contact trying to message you isn't blocked in Settings > Messages > Blocked.
    2. Make sure that the contact you're trying to send a message to is registered with iMessage.
    3. If the issue occurs with a specific contact or contacts, back up or forward important messages and delete your current messaging threads with the contact. Create a new message to the contact and try again.
    4. If the issue occurs with a specific contact or contacts, delete and recreate the contact in the Contacts app. Create a new message to the newly created contact and try again.
    5. Back up and restore your device as new.
    -Griff W.

  • CSV creation through PLSQL having issue with some rows of data

    Hi,
    I am having an issue with some rows of data in a CSV created by the PL/SQL code below.
    Issue description:
    - 50,000 rows of data in my CSV
    - around 20 fields in a row
    - in some of the 50,000 rows, some of the 20 fields come back empty in the file, even though my query actually returns data for those fields
    Hope the issue is clear.
    Code given below
    CREATE OR REPLACE FUNCTION TSC_OM_AUDIT(ERR_DESC IN OUT VARCHAR2,
    WEB_FILE_URL IN OUT VARCHAR2 ) RETURN BOOLEAN IS
    --Variable Declaration
    L_buff_line VARCHAR2(500);
    L_hdr_str VARCHAR2(32700);
    L_selling_UOM VARCHAR2(5);
    L_prim_bar_code VARCHAR2(30);
    L_status_pri_bar VARCHAR2(2);
    L_multpl_bar_exist VARCHAR2(2);
    L_multpl_prim_bar_exist VARCHAR2(2);
    L_prim_simp_pack VARCHAR2(30);
    L_staus_prim_simp_pack VARCHAR2(2);
    L_mupl_prim_pack_exist VARCHAR2(2);
    L_prim_supp_prim_simpl_pack VARCHAR2(15);
    L_prim_sim_supp_to_TPNB VARCHAR2(2);
    L_sel1 VARCHAR2(200);
    L_sel2 VARCHAR2(200);
    L_sel3 VARCHAR2(200);
    L_item_till_desc VARCHAR2(200);
    L_lengh1 VARCHAR2(200);
    L_width1 VARCHAR2(200);
    L_height1 VARCHAR2(200);
    L_lengh2 VARCHAR2(200);
    L_width2 VARCHAR2(200);
    L_height2 VARCHAR2(200);
    L_lengh3 VARCHAR2(200);
    L_width3 VARCHAR2(200);
    L_height3 VARCHAR2(200);
    --Item type
    CURSOR C_ITEM is
    select im.item L_item,
    im.status L_status,
    im.item_desc L_item_desc,
    'Regular' L_item_type,
    im.dept L_dept,
    im.class L_class,
    im.subclass L_subclass,
    'N/A' L_CW_item_order_type,
    'N/A' L_CW_item_sale_type,
    im.standard_uom L_standard_uom,
    NULL L_selling_uom,
    im.tsl_base_item L_tsl_base_item
    from item_master im
    where im.item_number_type='TPNB'
    -- and im.last_update_id = 'DATALOAD'
    -- and im.create_datetime>=to_timestamp('2010-11-05 10:57:47','YYYY-MM-DD HH24:MI:SS')
    and im.sellable_ind='Y'
    and im.orderable_ind='Y'
    and im.inventory_ind='Y'
    and im.item_xform_ind='N'
    and im.catch_weight_ind='N'
    and rownum<10;
    --order by im.item asc;
    Cursor C_Selling_UOM (tpnb varchar2) is
    select selling_uom
    from rpm_item_zone_price
    where item = tpnb;
    Cursor C_prim_bar (tpnb varchar2) is
    select item,
    status
    from item_master iem
    where item_parent = tpnb
    and iem.primary_ref_item_ind ='Y';
    Cursor C_multi_bar_exit (tpnb varchar2) is
    select count(*)
    from item_master iem
    where item_parent = tpnb;
    Cursor C_multpl_prim_bar_exist (tpnb varchar2) is
    select count(*)
    from item_master
    where item_parent = tpnb
    and primary_ref_item_ind ='Y';
    Cursor C_staus_prim_simp_pack (tpnb varchar2) is
    select piem.pack_no,
    iem.status
    from item_master iem,
    packitem piem
    where piem.item = tpnb
    and piem.pack_no = iem.item
    and iem.tsl_prim_pack_ind ='Y';
    Cursor C_multpl_prim_pack_exist (tpnb varchar2) is
    select count(*)
    from item_master iem,
    packitem piem
    where piem.item = tpnb
    and piem.pack_no = iem.item
    and iem.tsl_prim_pack_ind ='Y';
    Cursor C_prim_supp_prim_simpl_pack (tpnd varchar2) is
    select supplier
    from item_supplier
    where item = tpnd
    and primary_supp_ind= 'Y';
    Cursor C_prim_sim_supp_to_TPNB (tpnb varchar2,suppl number) is
    select 'Y'
    from item_supplier
    where item = tpnb
    and supplier = suppl;
    Cursor C_item_descretion_SEL (tpnb varchar2) is
    select sel_desc_1,
    sel_desc_2,
    sel_desc_3
    from tsl_itemdesc_sel
    where item =tpnb;
    Cursor C_item_till_descretion (tpnb varchar2) is
    select till_desc
    from tsl_itemdesc_till
    where item=tpnb;
    Cursor C_EA (tpnb varchar2,suppl number) is
    select length,
    width,
    height
    from item_supp_country_dim
    where item= tpnb
    and supplier = suppl
    and dim_object='EA';
    Cursor C_CS (tpnb varchar2,suppl number) is
    select length,
    width,
    height
    from item_supp_country_dim
    where item= tpnb
    and supplier = suppl
    and dim_object='TYUNIT';
    Cursor C_TRAY (tpnb varchar2,suppl number) is
    select length,
    width,
    height
    from item_supp_country_dim
    where item= tpnb
    and supplier = suppl
    and dim_object='TRAY';
    BEGIN
    --INITIAL
    WEB_FILE_URL := TSC_RPT_GRS_EXL_GEN.WEB_URL||TSC_RPT_GRS_EXL_GEN.OPEN_FILE;
    TSC_RPT_GRS_EXL_GEN.put_line('Poland Production Cutover');
    TSC_RPT_GRS_EXL_GEN.skip_line;
    l_hdr_str := 'L_item_TPNB,L_status,L_item_desc,L_item_type,L_section,L_class,L_subclass,L_cw_order_type,L_cw_sale_type,L_std_UOM,L_selling_UOM,'||
    'L_base_item,L_prim_bar_code,L_status_pri_bar,L_multpl_bar_exist,L_multpl_prim_bar_exist,L_prim_simple_pack,L_staus_prim_simp_pack,L_mupl_prim_pack_exist,'||
    'L_prim_supp_prim_simpl_pack,L_prim_sim_supp_to_TPNB,L_sel1,L_sel2,L_sel3,L_item_till_desc,L_lengh1,L_width1,L_height1,L_lengh2,L_width2,L_height2,L_lengh3,L_width3,L_height3';
    l_hdr_str := TSC_RPT_GRS_EXL_GEN.CONVERT_SEPARATOR(l_hdr_str);
    TSC_RPT_GRS_EXL_GEN.put_line(l_hdr_str);
    for rec_c_item in C_ITEM
    LOOP
    open C_Selling_UOM (rec_c_item.L_item);
    fetch C_Selling_UOM into L_selling_UOM;
    close C_Selling_UOM;
    open C_prim_bar (rec_c_item.L_item);
    fetch C_prim_bar into L_prim_bar_code,L_status_pri_bar;
    close C_prim_bar;
    open C_multi_bar_exit (rec_c_item.L_item);
    fetch C_multi_bar_exit into L_multpl_bar_exist;
    close C_multi_bar_exit;
    IF to_number(trim(L_multpl_bar_exist)) > 1 THEN
    L_multpl_bar_exist:='Y';
    ELSE
    L_multpl_bar_exist:='N';
    END IF;
    open C_multpl_prim_bar_exist (rec_c_item.L_item);
    fetch C_multpl_prim_bar_exist into L_multpl_prim_bar_exist;
    close C_multpl_prim_bar_exist;
    IF to_number(trim(L_multpl_prim_bar_exist)) > 1 THEN
    L_multpl_prim_bar_exist:='Y';
    ELSE
    L_multpl_prim_bar_exist:='N';
    END IF;
    open C_staus_prim_simp_pack (rec_c_item.L_item);
    fetch C_staus_prim_simp_pack into L_prim_simp_pack,L_staus_prim_simp_pack;
    close C_staus_prim_simp_pack;
    open C_multpl_prim_pack_exist (rec_c_item.L_item);
    fetch C_multpl_prim_pack_exist into L_mupl_prim_pack_exist;
    close C_multpl_prim_pack_exist ;
    IF to_number(trim(L_mupl_prim_pack_exist)) > 1 THEN
    L_mupl_prim_pack_exist:='Y';
    ELSE
    L_mupl_prim_pack_exist:='N';
    END IF;
    open C_prim_supp_prim_simpl_pack (trim(L_prim_simp_pack));
    fetch C_prim_supp_prim_simpl_pack into L_prim_supp_prim_simpl_pack;
    close C_prim_supp_prim_simpl_pack ;
    open C_prim_sim_supp_to_TPNB (rec_c_item.L_item,to_number(trim(L_prim_supp_prim_simpl_pack)));
    fetch C_prim_sim_supp_to_TPNB into L_prim_sim_supp_to_TPNB;
    close C_prim_sim_supp_to_TPNB ;
    open C_item_descretion_SEL (rec_c_item.L_item);
    fetch C_item_descretion_SEL into L_sel1,L_sel2,L_sel3;
    close C_item_descretion_SEL ;
    open C_item_till_descretion (rec_c_item.L_item);
    fetch C_item_till_descretion into L_item_till_desc;
    close C_item_till_descretion ;
    open C_EA (rec_c_item.L_item,to_number(trim(L_prim_supp_prim_simpl_pack)));
    fetch C_EA into L_lengh1,L_width1,L_height1;
    close C_EA ;
    open C_CS (rec_c_item.L_item,to_number(trim(L_prim_supp_prim_simpl_pack)));
    fetch C_CS into L_lengh2,L_width2,L_height2;
    close C_CS ;
    open C_TRAY (rec_c_item.L_item,to_number(trim(L_prim_supp_prim_simpl_pack)));
    fetch C_TRAY into L_lengh3,L_width3,L_height3;
    close C_TRAY ;
    L_buff_line := TSC_RPT_GRS_EXL_GEN.PAD_COMMA(to_char(rec_c_item.L_item), TRUE)||
    TSC_RPT_GRS_EXL_GEN.PAD_COMMA(to_char(rec_c_item.L_status))||
    TSC_RPT_GRS_EXL_GEN.PAD_COMMA(to_char(rec_c_item.L_item_desc))||
    TSC_RPT_GRS_EXL_GEN.PAD_COMMA(to_char(rec_c_item.L_item_type))||
    TSC_RPT_GRS_EXL_GEN.PAD_COMMA(to_char(rec_c_item.L_dept))||
    TSC_RPT_GRS_EXL_GEN.PAD_COMMA(to_char(rec_c_item.L_class))||
    TSC_RPT_GRS_EXL_GEN.PAD_COMMA(to_char(rec_c_item.L_subclass))||
    TSC_RPT_GRS_EXL_GEN.PAD_COMMA(to_char(rec_c_item.L_CW_item_order_type))||
    TSC_RPT_GRS_EXL_GEN.PAD_COMMA(to_char(rec_c_item.L_CW_item_sale_type))||
    TSC_RPT_GRS_EXL_GEN.PAD_COMMA(to_char(rec_c_item.L_standard_uom))||
    TSC_RPT_GRS_EXL_GEN.PAD_COMMA(to_char(trim(L_selling_UOM)))||
    TSC_RPT_GRS_EXL_GEN.PAD_COMMA(to_char(rec_c_item.L_tsl_base_item))||
    TSC_RPT_GRS_EXL_GEN.PAD_COMMA(to_char(trim(L_prim_bar_code)))||
    TSC_RPT_GRS_EXL_GEN.PAD_COMMA(to_char(trim(L_status_pri_bar)))||
    TSC_RPT_GRS_EXL_GEN.PAD_COMMA(to_char(trim(L_multpl_bar_exist)))||
    TSC_RPT_GRS_EXL_GEN.PAD_COMMA(to_char(trim(L_multpl_prim_bar_exist)))||
    TSC_RPT_GRS_EXL_GEN.PAD_COMMA(to_char(trim(L_prim_simp_pack)))||
    TSC_RPT_GRS_EXL_GEN.PAD_COMMA(to_char(trim(L_staus_prim_simp_pack)))||
    TSC_RPT_GRS_EXL_GEN.PAD_COMMA(to_char(trim(L_mupl_prim_pack_exist)))||
    TSC_RPT_GRS_EXL_GEN.PAD_COMMA(to_char(trim(L_prim_supp_prim_simpl_pack)))||
    TSC_RPT_GRS_EXL_GEN.PAD_COMMA(to_char(trim(L_prim_sim_supp_to_TPNB)))||
    TSC_RPT_GRS_EXL_GEN.PAD_COMMA(to_char(trim(L_sel1)))||
    TSC_RPT_GRS_EXL_GEN.PAD_COMMA(to_char(trim(L_sel2)))||
    TSC_RPT_GRS_EXL_GEN.PAD_COMMA(to_char(trim(L_sel3)))||
    TSC_RPT_GRS_EXL_GEN.PAD_COMMA(to_char(trim(L_item_till_desc)))||
    TSC_RPT_GRS_EXL_GEN.PAD_COMMA(to_char(trim(L_lengh1)))||
    TSC_RPT_GRS_EXL_GEN.PAD_COMMA(to_char(trim(L_width1)))||
    TSC_RPT_GRS_EXL_GEN.PAD_COMMA(to_char(trim(L_height1)))||
    TSC_RPT_GRS_EXL_GEN.PAD_COMMA(to_char(trim(L_lengh2)))||
    TSC_RPT_GRS_EXL_GEN.PAD_COMMA(to_char(trim(L_width2)))||
    TSC_RPT_GRS_EXL_GEN.PAD_COMMA(to_char(trim(L_height2)))||
    TSC_RPT_GRS_EXL_GEN.PAD_COMMA(to_char(trim(L_lengh3)))||
    TSC_RPT_GRS_EXL_GEN.PAD_COMMA(to_char(trim(L_width3)))||
    TSC_RPT_GRS_EXL_GEN.PAD_COMMA(to_char(trim(L_height3)));
    TSC_RPT_GRS_EXL_GEN.PUT_LINE(L_buff_line);
    L_selling_UOM :=NULL;
    L_prim_bar_code :=NULL;
    L_status_pri_bar :=NULL;
    L_multpl_bar_exist :=NULL;
    L_multpl_prim_bar_exist :=NULL;
    L_prim_simp_pack :=NULL;
    L_staus_prim_simp_pack :=NULL;
    L_mupl_prim_pack_exist :=NULL;
    L_prim_supp_prim_simpl_pack :=NULL;
    L_prim_sim_supp_to_TPNB :=NULL;
    L_sel1 :=NULL;
    L_sel2 :=NULL;
    L_sel3 :=NULL;
    L_item_till_desc :=NULL;
    L_lengh1 :=NULL;
    L_width1 :=NULL;
    L_height1 :=NULL;
    L_lengh2 :=NULL;
    L_width2 :=NULL;
    L_height2 :=NULL;
    L_lengh3 :=NULL;
    L_width3 :=NULL;
    L_height3 :=NULL;
    END LOOP;
    TSC_RPT_GRS_EXL_GEN.close_file;
    return TRUE;
    EXCEPTION WHEN OTHERS THEN
    ERR_DESC := '['||SQLCODE||']-'||SUBSTR(SQLERRM,1,200);
    return FALSE;
    END TSC_OM_AUDIT;
    Please suggest something on this
    Regards,
    Shamed H

    Hi Shamid,
    This forum is only for questions regarding the SQL Developer tool. Please post in Forum Home > Database > SQL and PL/SQL:
    PL/SQL
    Regards,
    Gary
    SQL Developer Team

  • Blocking of Good Issue with respect to posting date

    Hi Expert,
               Is there any way in SAP B1 to block a Goods Issue with respect to posting date? i.e., if a user receipts goods into stock today, the system should not allow the user to issue the same goods on a back date.
    regards,
    PankajK

    Hi Pankaj,
    There are several options. For example, you can check in OINM: select that particular item, sort on DocDate, and check InQty; compare that with your goods issue date, and if it is the same, send that transaction for approval if you want.
    Thanks
    Sachin
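
    As a rough sketch of that check (a hypothetical T-SQL query against OINM; the item code is a placeholder), something like this could back an approval rule or an SBO_SP_TransactionNotification check:

    -- Latest receipt date for an item, to compare against the goods issue posting date.
    SELECT MAX(T0.DocDate) AS LastReceiptDate
    FROM OINM T0
    WHERE T0.ItemCode = 'A00001'   -- hypothetical item code
      AND T0.InQty > 0

    If the goods issue's posting date is earlier than LastReceiptDate, the transaction can be rejected or routed to approval.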

  • QSIG issue with accessing a Voice System on a Meridian

    I have a Voice Gateway (2551) connecting a Meridian with the Cisco IP Telephony setup. The Callmanager is connected with QSIG to the Meridian.
    Basic call features work fine, but I do have problems with accessing the voicemail system on the Meridian. Accessing one's own voice mailbox is fine (calling from 1111 to the voice system (5959) reaches the voice box of 1111). But if I forward the calls on 1111 to the voice system and make a call from 2222 to 1111, I get to the voicemail system but not to the voice box of 1111. Seems to me that the voice system needs more information about 1111. On the QSIG trunk I am sending 1111 along as the redirecting number. Has anybody experienced such issues?
    Are there settings to change on the CallManager or on the Meridian side? The MWI also works fine.
    Thanks for any help on this issue. Regards
    Roger

    Hi,
    I do not know about the patches on the Meridian side, but I could find out. I am only responsible for the Cisco IP telephony side and do not really know the Meridian side; I'm working together with a technician who is doing the Meridian side.
    How can I find out which patches are needed and what the impact of installing them is?
    And what did you mean by "a little flaky"? Is it not supporting QSIG the way it should?
    Regards
    Roger

  • Problem with access another domain data

    Hi
    I want to access data from another domain, like an XML or HTML page. I tried crossdomain.xml,
    but it is not working. Please help me if you can.

    What's the Java console say?
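
    For reference, a minimal permissive crossdomain.xml, served from the web root of the remote domain, looks like this (allowing every domain, as here, is suitable for testing only):

    <?xml version="1.0"?>
    <cross-domain-policy>
        <!-- Testing only: lets any requesting domain read this server's data. -->
        <allow-access-from domain="*"/>
    </cross-domain-policy>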

  • Performance issue with FDM when importing data

    In the FDM Web console, we have detected a performance issue when importing data (.txt).
    In less than 10 seconds the ".txt" and ".log" files are created: the ".txt" file in the INBOX folder and the ".log" file in OUTBOX\Logs.
    At that moment the system shows the message "Processing, please wait" for 10 minutes. Eventually the information is displayed; however, if we want to see the second page, we have to wait more than 20 seconds.
    It seems to be a performance issue when the system tries to show the imported data in the web page.
    We have also noted that when a user tries to import a txt file directly by clicking on the tab "Select File From Inbox", the user also has to wait another 10 minutes before the information is displayed on the web page.
    Thx in advance!
    Cheers
    Matteo

    Hi Matteo
    How much data is being imported / displayed when users are interacting with the system?
    There is a report that may help you to analyse this but unfortunately I cannot remember what it is called and don't have access to a system to check. I do remember that it breaks down the import process into stages showing how long it takes to process each mapping step and the overall time.
    I suspect that what you are seeing is normal behaviour but that isn't to say that performance improvements are not possible.
    The copying of files is the first part of the import process before FDM then starts the import so that will be quick. The processing is then the time taken to import the records, process the mapping and write to the tables. If users are clicking 'Select file from Inbox' then they are re-importing so it will take just as long as it would for you to import it, they are not just asking to retrieve previously imported data.
    Hope this helps
    Stuart

  • SQLServer to oracle migration. Issue with table having image data.

    Hi,
    I am using the SQL Developer version 1.5.0.53 Build MAIN-53.38. I am trying to migrate from sql server database to oracle. Sql server database version is 2005 and oracle database version is 10g (10.1.0.2 and 10.2.0.3). Both Oracle and Sql server databases are on windows-xp.
    Everything (including data) migrated well except for one table having a BLOB (Oracle) / IMAGE (SQL Server) column. I am getting the following error in SQL Developer while migrating the BLOB data from SQL Server to Oracle.
    Data Move information:Rows : 497 Errors: 10278
    Commit failed: Closed Connection
    Must be logged on to server
    [POCRepository].[td].[REPOSITORY] Closed Connection
    Io exception: Software caused connection abort: socket write error
    [POCRepository].[td].[REPOSITORY] OALL8 is in an inconsistent state
    No more data to read from socket
    [POCRepository].[td].[REPOSITORY] No more data to read from socket
    Inserting ' ' into column td_POCRepository.REPOSITORY.RP_DATA (Row number 498)
    I have created the sqldeveloper.cmd file as suggested in other threads, and I am using the same Java provided with SQL Developer 1.5.0.53.
    The database alert<SID>.log shows the following error messages for this activity:
    ORA-07445: exception encountered: core dump [ACCESS_VIOLATION] [0x34EF9E5] [] [] [] []
    ORA-00600: internal error code, arguments: [kghasp1], [0x5F3B718], [], [], [], [], [], []
    ORA-07445: exception encountered: core dump [ACCESS_VIOLATION] [0x34EF9E5] [] [] [] []
    The error ORA-00600 [kghasp1] refers to a problem with heap memory. I also restarted the database and tried the data migration for only this table, but got the same error.
    I tried it on both 10g release1 and 10g release 2.
    Can someone please help me in resolving this issue.
    Thanks
    Raghavendra

    Hi Raghavendra,
    Are you saying the Microsoft SQL Server bcp dump of image data failed? It has been tested; it results in a hex dump rather than a binary dump, hence the CLOB-to-BLOB and HEXTORAW workaround.
    What are the version numbers and what is the reproducible test case?
    -Turloch
    Note that the clob to blob process is automated:
    Tools->preferences->Migration->Generation Options->General Options->Generate Stored Procedure for Migrate Blobs Offline
    From Help:
    Generate Stored Procedure for Migrate Blobs Offline: Causes a stored procedure named CLOBtoBLOB_sqldeveloper (with execute access granted to public) to be created if the schema contains a BLOB (binary large object); this procedure is automatically called if you perform an offline capture. If this option is not checked, you will need to use the manual workaround described in Populating the Destination Database Using the Data Files. (After the offline capture, you can delete the CLOBtoBLOB_sqldeveloper procedure or remove execute access from public.)
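
    For reference, the hex-dump workaround amounts to something like this hedged PL/SQL sketch (the staging table and column names are hypothetical; the generated CLOBtoBLOB_sqldeveloper procedure does the equivalent work automatically):

    DECLARE
      l_clob  CLOB;
      l_blob  BLOB;
      l_raw   RAW(32767);
      l_chunk PLS_INTEGER := 16000;   -- hex characters per pass (must be even)
      l_len   PLS_INTEGER;
      l_pos   PLS_INTEGER := 1;
    BEGIN
      -- Hypothetical staging column holding the hex text produced by the dump.
      SELECT rp_data_hex INTO l_clob FROM repository_stage WHERE ROWNUM = 1;
      DBMS_LOB.CREATETEMPORARY(l_blob, TRUE);
      l_len := DBMS_LOB.GETLENGTH(l_clob);
      WHILE l_pos <= l_len LOOP
        -- Convert one even-sized slice of hex text into binary and append it.
        l_raw := HEXTORAW(DBMS_LOB.SUBSTR(l_clob, l_chunk, l_pos));
        DBMS_LOB.WRITEAPPEND(l_blob, UTL_RAW.LENGTH(l_raw), l_raw);
        l_pos := l_pos + l_chunk;
      END LOOP;
      -- Hypothetical target column of type BLOB.
      UPDATE repository SET rp_data = l_blob WHERE ROWNUM = 1;
      COMMIT;
    END;
    /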

  • Question related to accessing members through dates

    We can access members in a report by dates in the Sample.Basic application through the attributes concept. If there is a requirement to access members specified with a start date and an end date (16/09/09 to 17/09/09), how can we proceed?

    I'm afraid you're out of luck. Attribute dims don't accept security.
    Could you handle this through the front end and custom code?
    I wonder if FR would let you limit the scope -- you'd have to try.
    Also, if 11x, you might try a Smart Slice in SmartView. Again, you'd have to try.
    Regards,
    Cameron Lackpour

  • Help with accessing same keychain data from two apps with bundled appid

    I have two iOS apps that I'd like to share the same keychain data.
    According to the Apple docs, this should be possible as long as I've set up the app ID with a wildcard, which I did from the provisioning portal:
    appid: bundleID.*
    Then, in the AIR for iOS settings of the Flash IDE, in the App ID field, I made the two apps members of this app ID: bundleID.app1 and bundleID.app2.
    But when I use the EncryptedLocalStore class to access the data from one of the apps that the other app stored, it isn't available.
    Yet each app, independently, is successfully writing to and reading data from the keychain.
    From what I've read, I'm doing everything right. But clearly I'm not, because it isn't working.

    Hi,
    The link which you have sent is really very helpful, but when I use the approach shown there, as directed, it shows the error
                 'DataSource - Table not found'
    I think this is because the temporary table we are using is neither a user-defined nor an SBO table.
    So please tell me another way.

  • Xcelsius Engage: Issue with dynamic visibility of data in dashboard

    Hi,
    We have a requirement for a dashboard where data for 5 sites needs to be displayed for 17 KPIs and 12 different rolling months.
    Raw sample data looks like this:
    SITE   KPI       ActYTD   Act(Prev Month)   PlanYTD   Plan(Prev Month)   VarianceYTD   Variance(Prev Month)
    A      On-Time   76.7     82.92             111.50    109.50             -1            -1
    B      Delay     73.70    80.00             79.75     77.75              -1             1
    There are 5 different Sites and 17 different KPIs.
    Based on the raw data that we get from BI (7.0 query output), we manipulate it in MS Excel, using some lookups and formulae to obtain certain values, store them in designated sheets, and then the Xcelsius components use these sheets as their source.
    We have three levels of navigation:
    1. Main screen listing all KPIs site-wise and months (the site and months can be selected from combo boxes at the top). The KPIs are bound to a list view; each row is a selectable KPI leading to the level two navigation.
    2. Level two contains the details for each KPI (values and trend chart). Selecting a site in the chart leads to level 3 navigation.
    3. The level 3 screen contains data by KPI and by site for 12 rolling months (the site can be changed from a drop-down).
    There are custom navigation buttons (home/back) to enable navigation between the screens.
    We are using 3 panel containers, 2 list views, 4-5 combo boxes and some hidden buttons (with dynamic visibility).
    The workbook has 8-9 different sheets, though the Xcelsius components are bound to only 3 sheets.
    Things work fine until we select the 14th KPI, but Xcelsius starts behaving awkwardly when we select the 15th KPI and beyond: for the selected KPI, the level two screen loads for a flash of a second and then control comes back to the level 1 screen. We do not face this issue up to the 14th KPI.
    In order to eliminate possibilities we did the following:
    1. Changed the order of KPIs - issue persists
    2. Changed the Excel option "Max. No. of Rows" to 4000 - issue persists
    3. Decreased the amount of data to be loaded - issue persists (though at most we are loading data for 2000 rows)
    4. Decreased the no. of KPIs to 14 in the level 1 list view - all works fine.
    We have not noticed any gradual decrease in performance/load time up to the 14th KPI, but everything goes for a toss when the 15th KPI is displayed.
    We are on the latest Xcelsius patch ( patch 3).
    We would appreciate any pointers/help to resolve this issue.
    Thanks in advance for your time.
    Best Regards,
    Bansi.

    How many total rows are you dealing with in your spreadsheet?
    -Dell

  • Issue with loading of Delta data

    Hi all,
    I have constructed a generic datasource using the tables ESLH, ESLL, and ESKL, with the delta field AEDAT (Changed On) from the ESKL table. I created an InfoCube and DSO based on the datasource, and the loading is completed. Delta loads are running daily.
    Now my question is how to delete the data from the InfoCube in the BW system if the fields ESLL-DEL (deletion indicator) and ESLL-STOKZ (reversal document) are set to active in the ESLL table in the R/3 system after the delta loading of data is completed.
    Does anyone have an idea of how to resolve this issue?
    Thanks in Advance,
    Vinay Kumar

    There are a couple of ways to do this, but I would question your logic as to why you need to delete them first:
    For records that are marked as "deleted" you should just filter these records in the Query designer. Doing it this way is non-destructive, and later if your business needs to know some KPI based around number of deleted records - then you will be able to answer this. If you permanently remove the data, then you cannot answer this KPI.
    For records marked "reversal" you should never remove these because they affect your key figures. A reversal can also be a partial reversal and so your KPIs will be wrong if you don't include them, and in any case in reporting you never see the reversal because almost all KFs are aggregated. The only time you will see a reversal record is when you include the characteristic that identifies it as a reversal, and in most reports that is unnecessary.
    Right - so now to your solution(s) - although I repeat I advise not doing any of these.
    As these are custom data sources, make sure that in your source system you create ABAP code to fill in the RECORDMODE for each record.
    This way when you load your data to the DSO, the activation step will take care of processing deletions and reversals.
    Next compress your cube and use selective deletion on the cube to remove the deleted and reversal records.
    Alternatively:
    Compress your cube and use selective deletion on the cube to remove the deleted and reversal records.
    Then in your transformation to the cube, create a start routine which removes the records from the source package. ie the records are never allowed to go to the cube.
    I hope that helps.

Maybe you are looking for

  • File --- to--- Soap(web service)

    Hi all, I am doing a File -> SOAP (web service) scenario. My requirement would be: 1. Can we re-trigger failed & successful messages? 2. Can we send the consolidated XML for the day irrespective of the individual XMLs? 3. we able to send the

  • Event Handling for Graphic Shapes

    Hi guys, I have a problem with the implementation of a piece of software that I'm making; to be more specific, I'm implementing a GUI. In this GUI I draw rectangles, lines and that kind of thing. The problem is that I want, when clicking on a rectangle, an e

  • CC desktop client doesn't display available updates

    After a rocky start I was finally able to download the desktop client. It was fine at first and allowed me to install CC trials, update trials, sync settings, etc. I'm still able to sync settings and use my software, but CC doesn't display all the ne

  • How do I get my Mac 10.6.8 to accept the newest Java update 7?

    How do I get my Mac 10.6.8 to accept the newest Java update 7?

  • Configuring SSL

    DB version: 11.1.0.6 Platform: AIX 5L I'm planning to configure SSL authentication to the client. Per Oracle documentation, I don't see 'Oracle Advanced Security Profile' in Oracle Net Manager. I'm able to see only 'Naming' and 'General' tabs in the