Issue with loading of Delta data

Hi all,
I have constructed a generic DataSource on the tables ESLH, ESLL, and ESKL, with AEDAT (Changed On) from the ESKL table as the delta field. I created an InfoCube and a DSO based on this DataSource, and the initial load is complete. Delta loads are running daily.
Now my question is how to delete data from the InfoCube in the BW system when the fields ESLL-DEL (deletion indicator) and ESLL-STOKZ (reversal document) are set to active in the ESLL table in the R/3 system after the delta load has completed.
Does anyone have an idea of how to resolve this issue?
Thanks in Advance,
Vinay Kumar

There are a couple of ways to do this, but first I would question your logic as to why you need to delete them at all:
For records that are marked as "deleted", you should just filter these records in the Query Designer. Doing it this way is non-destructive, and if the business later needs some KPI based on the number of deleted records, you will still be able to answer it. If you permanently remove the data, you cannot answer that KPI.
For records marked as "reversal", you should never remove these, because they affect your key figures. A reversal can also be a partial reversal, so your KPIs will be wrong if you don't include them; and in any case, in reporting you never see the reversal, because almost all key figures are aggregated. The only time you will see a reversal record is when you include the characteristic that identifies it as a reversal, and in most reports that is unnecessary.
Right - so now to your solution(s) - although I repeat, I advise against doing any of these.
As these are custom DataSources, make sure that in your source system you write ABAP code to fill the RECORDMODE for each record.
This way, when you load your data to the DSO, the activation step will take care of processing deletions and reversals.
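A minimal sketch of that source-system coding, assuming the extract structure of the DataSource has been extended with a RECORDMODE field and that the extractor exit sees the extracted rows in an internal table (the table name E_T_DATA and the component names DEL, STOKZ, and RECORDMODE are illustrative):

  * Derive the BW record mode from the ESLL indicators:
  * 'D' = delete the record from the target,
  * 'R' = reverse (subtract) the key figures,
  * ' ' = normal after-image.
  FIELD-SYMBOLS <ls_data> LIKE LINE OF e_t_data.
  LOOP AT e_t_data ASSIGNING <ls_data>.
    IF <ls_data>-del = 'X'.
      <ls_data>-recordmode = 'D'.
    ELSEIF <ls_data>-stokz = 'X'.
      <ls_data>-recordmode = 'R'.
    ELSE.
      <ls_data>-recordmode = ' '.
    ENDIF.
  ENDLOOP.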
Next compress your cube and use selective deletion on the cube to remove the deleted and reversal records.
Alternatively:
Compress your cube and use selective deletion on the cube to remove the deleted and reversal records.
Then, in your transformation to the cube, create a start routine which removes the records from the source package, i.e. the records are never allowed to reach the cube.
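A minimal start-routine sketch of that filter (BW 7.x transformation syntax), assuming the ESLL-DEL and ESLL-STOKZ indicators were carried through into source fields named DEL and STOKZ:

  * Drop deleted and reversed documents so they never reach the cube.
  DELETE SOURCE_PACKAGE WHERE del = 'X' OR stokz = 'X'.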
I hope that helps.

Similar Messages

  • Issues with loading master data and hierarchy from BI InfoObject

    Hi,
    I was successful in loading the master data from the BI InfoObject. It copies the text node as well, but it discards the space in the technical name when it copies it to BPC. For example, if my text node technical ID is "NODE 1", it copies it as "NODE1" on the BPC side, because of which my hierarchy load fails, as it is not able to find "NODE 1".
    Why does it delete the space from the technical ID, and how can I resolve this issue?
    Do we have a provision to write an ABAP exit where I can check this?
    (PS: If I manually enter the space on my BPC dimension, the hierarchy loads successfully.)
    Thanks in advance,
    Diksha.

    Hi,
    You can check this in a start routine, which can be called from the transformation file using a BADI:
    http://help.sap.com/saphelp_bpc75_nw/helpdata/en/28/b66863b41f47589b9943f80b63def6/content.htm
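    Purely as an illustration of the idea - the real BADI interface and table types come from the linked help page, and the table name IT_NODES, the field NODENAME, and the literal values below are all assumptions - such a routine could repair the node IDs before the hierarchy is matched:
      * Illustrative only: re-insert the space that the copy strips from
      * the technical ID, so the hierarchy load can find "NODE 1" again.
      FIELD-SYMBOLS <ls_node> LIKE LINE OF it_nodes.
      LOOP AT it_nodes ASSIGNING <ls_node>.
        IF <ls_node>-nodename = 'NODE1'.
          <ls_node>-nodename = 'NODE 1'.
        ENDIF.
      ENDLOOP.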
    Hope this helps...
    regards,
    Raju

  • Issues with loads

    There have been issues with loads to a Sales cube. The issue is that the delta loads for the first few days in January have been lost (also deleted from the PSA). I wanted to check the following before I make a suggestion:
    1. Data up until the end of last year is available and okay
    2. Create a copy of the original Sales cube
    3. Generate a DataSource from the original cube
    4. Load up until the end of last year to the new cube via the Datamart
    5. Load from Sales LO extractors, carrying out the init from the beginning of this year to 9999 into the new copy
    6. Load Deltas into the new copy
    Thanks

    Hi,
    I assume you put no restriction on the date when you did the delta init; that's why you are getting deltas for all records, and it would continue that way in future as well.
    You can easily find a sales document number that was created somewhere near the end of December (I am not saying it must be the last document of December). Use that document number as the starting document number when filling the setup table.
    Now do a selective deletion of the data from the cube for calendar day >= 01.01.2007. Then do a repair request with a selection on calendar day (or the field which is mapped to calendar day) with the range 01.01.2007 to 31.12.9999.
    The next time you do a delta load to the cube, it will bring in the records as if nothing had happened.
    Of course, as Roberto mentioned, this is possible only in a specific case: if you are sure that every sales document created before 31/12/06 cannot change anymore after that date!
    Use whatever date you feel comfortable with (i.e. you feel that records created before that date were not changed after Jan 01) instead of 01.01.2007 in the explanation above.
    With rgds,
    Anil Kumar Sharma .P

  • Issues with load from Excel

    Dear all,
    I have issues with loading data from Excel.
    My Excel file looks like this:
    Time    Store   Neto_prodaja_ACT   Neto_prodaja_TAR
    Jan-12  C1      16                 16
    Feb-12  C1      2                  2
    Jan-12  C2      1                  1
    Feb-12  C2      3                  3
    My procedure for load is:
    CLEAR STATUS 
    Across Var Down Time, Store
    Sel Neto_prodaja_ACT, Neto_prodaja_TAR
    Sel Store Input
    access lslink
         connect test1
         select * from my_list
         peek only 10
         read
    end
    When I load the data, I receive the following error:
          Time        Store  Neto_prodaja_ACT  Neto_prodaja_TAR
        1 01/01/2012  C1     16.00             16.00
        2 02/01/2012  C1     2.00              2.00
        3 01/01/2012  C2     1.00              1.00
        4 02/01/2012  C2     3.00              3.00
    4 Record(s) Read, 0 Record(s) Skipped.
    DAT096:
    Unexpected Dimensions in ACROSS/DOWN List For Variable NETO_PRODAJA_ACT
    DAT096:
    Unexpected Dimensions in ACROSS/DOWN List For Variable NETO_PRODAJA_TAR
    The loaded data is wrong: the data for store C2 is also loaded onto store C1, so everything is messed up.
    I also have a question regarding dimensions. I have a model in PAS which has data from different data sources (BW and Excel). In Excel I have dimensions which are not loaded from BW (they don't exist there). How do I create such a dimension?
    Thank you a lot in advance.
    Best regards,
    Petra

    Petra,
    The Forum isn't really designed as a training system, but rather as a place where people can share questions or get a separate pair of eyes to look afresh at issues that aren't working for some reason. This is particularly the case for something as important as creating dimensions.
    The idea of the SSM Cube Builder/Model Designer was to enable people to build models with their relevant dimensions and metrics for demos and simple initial systems using manual data entry. If you are getting into building dimensions that will exist outside BW, then you are moving into the implementation arena rather than demo creation, and you need to work carefully so that things tie up.
    I doubt people would expect to be able to set up/implement BW without training, and SSM is the same.
    If you would like training or would like to collaborate on a first project to enable skills transfer then my colleague Pedro and I would be happy to discuss this. We have done this with other people and it has worked well.
    Regards
    Colin

  • Performance issue with loading Proclarity Main Page..

    Hi All,
    I have ProClarity 6.3 installed on a Windows 2008 R2 OS. The ProClarity reports were working well until last week. For the past few days I have been seeing a slow response time in loading the ProClarity main page.
    Loading the ProClarity main page in Internet Explorer 8 takes 150 seconds, while the same page loads in Google Chrome in 30 seconds.
    Have any of you faced a similar issue?
    I have already explored the following:
    1. Cleared the cache on the PAS tool
    2. Checked the Event Viewer for any errors or warnings
    3. Tried browsing the ProClarity URL from the server itself (the performance is still slow)
    4. Validated memory consumption on the server side. MSSQLServer was consuming the most, hence restarted it. After the restart, the same issue remains (with loading the main page, in IE ONLY)
    5. Checked drive space. All drives have a minimum of 1.5 GB of free space
    6. Cleared the ProClarity event logs
    The issue is NOT ONLY with loading the main page: navigating to any further web pages in the ProClarity STANDARD and PROFESSIONAL versions also responds VERY slowly.
    The only other option I am considering now is RESTARTING THE WINDOWS SERVER, which may not be an easy deal SINCE IT'S A PRODUCTION SERVER.
    But the loading of the web page on Chrome is 30 seconds and on IE it's 150 seconds (i.e. 5 times more), so does proposing to restart the server make sense?
    Any help, suggestions, or thoughts on what I am facing? Thanks
    Regards,
    Aravind

    onInputProcessing for two pages
      DATA: event TYPE REF TO if_htmlb_data.
      event = cl_htmlb_manager=>get_event_ex( request ).
      IF event IS NOT INITIAL AND event->event_name = 'button'.
        navigation->goto_page( event->event_server_name ).
      ENDIF.
    page1.htm
      <%@page language="abap" otrTrim="true"%>
      <%@extension name="htmlb" prefix="htmlb"%>
      <htmlb:content design="design2003">
        <htmlb:page>
          <htmlb:form>
            <htmlb:button       text          = "next"
                                design        = "NEXT"
                                onClick       = "page2.htm" />
          </htmlb:form>
        </htmlb:page>
      </htmlb:content>
    page 2
    <%@page language="abap" otrTrim="true"%>
      <%@extension name="htmlb" prefix="htmlb"%>
      <htmlb:content design="design2003">
        <htmlb:page>
          <htmlb:form>
            <htmlb:button       text          = "Page 1"
                                design        = "PREVIOUS"
                                onClick       = "page1.htm" />
          </htmlb:form>
        </htmlb:page>
      </htmlb:content>
    The above will work fine.
    Another way:
    You can define a global variable in your application class and subsequently change its value, according to your requirement, to the name of the target page. Then, whenever you want to move to some page, just call the following in the onClick event handling of the button:
    navigation->goto_page( global_variable ).
    where global_variable is the variable you have defined.
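    A minimal sketch of that approach, assuming the BSP application has an application class assigned (the attribute name NEXT_PAGE is illustrative):
      * In the application class: a public attribute for the target page.
      DATA next_page TYPE string.
      * In onInputProcessing: store the target and navigate to it.
      application->next_page = event->event_server_name.
      navigation->goto_page( application->next_page ).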
    Hope this works for you. If not, reply.
    regards,
    Hemendra

  • Issue with Oracle LONG RAW data type

    Hi All,
    I am facing some issues with the Oracle LONG RAW data type.
    We are using an Oracle 9iR2 database.
    I have a table with a LONG RAW column and I need to transfer its contents into another table with a LONG RAW column.
    When I tried INSERT INTO ... SELECT * (or CREATE TABLE ... AS SELECT *), it throws ORA-00997: illegal use of LONG datatype.
    I have gone through some docs and found that LONG RAW cannot be used in these operations.
    So I wrote the basic PL/SQL block given below and was able to insert most of the records, but for records where the LONG RAW value is around 70 KB, the insert is failing.
    I tried to convert LONG RAW to BLOB, and again, for records where the LONG RAW is big, I am getting an ORA-06502: PL/SQL: numeric or value error.
    Appreciate it if anyone can help me out here.
    DECLARE
      Y LONG RAW;  -- declared but not used
    BEGIN
      FOR REC IN (SELECT * FROM TRU_INT.TERRITORY WHERE TERRITORYSEQ = 488480 ORDER BY TERRITORYSEQ) LOOP
        -- The column list and the VALUES list must be parenthesised,
        -- and the statement needs a terminating semicolon.
        INSERT INTO TRU_CMP.TERRITORY
          (BUSINESSUNITSEQ, COMPELEMENTLIFETIMEID, COMPONENTIMAGE, DESCRIPTION, ENDPERIOD, GENERATION, NAME, STARTPERIOD, TERRITORYSEQ)
        VALUES
          (REC.BUSINESSUNITSEQ, REC.COMPELEMENTLIFETIMEID, REC.COMPONENTIMAGE, REC.DESCRIPTION, REC.ENDPERIOD, REC.GENERATION, REC.NAME,
           REC.STARTPERIOD, REC.TERRITORYSEQ);
      END LOOP;
    END;
    /

    Maddy wrote:
    (the original question, quoted in full above)
    The SQL*Plus COPY command below might work:
    12:06:23 SQL> help copy
    COPY
    Copies data from a query to a table in the same or another
    database. COPY supports CHAR, DATE, LONG, NUMBER and VARCHAR2.
    COPY {FROM database | TO database | FROM database TO database}
                {APPEND|CREATE|INSERT|REPLACE} destination_table
                [(column, column, column, ...)] USING query
    where database has the following syntax:
         username[/password]@connect_identifier

  • Issue with load from livecache

    Dear all
    In our APO system we have encountered a strange issue with data loaded from liveCache into the InfoCube.
    The problem is in the InfoCube, where there are two requests containing the same amount of data for a forecast version.
    When I look in my InfoCube in APO, I see e.g. the two following request IDs for a given product:
    REQU_4CZ4SQ5PSM9V5PHJCHTIS24GK is the most recent load from liveCache (replacing any previous liveCache loads).
    APO_R43U3KZLM0V3WVK112YEK3KKLE cannot be located. It is not available in the InfoCube's manage section.
    Ultimately this results in doubled data for this given product.
    Do any of you have an idea as to what this APO_* request could be triggered from?
    This is the input from the BW team:
    This indicates that an APO program has modified the InfoCube data without updating the request tab of the manage InfoCube, but it has updated the request ID of the data which is transformed. With subsequent loads into the InfoCube, these records are split based on different request IDs and loaded collectively into BI.
    I hope you can help me out.
    Best regards,
    Anders Thinggaard

    Hi Visu
    It seems I made a typing error in my description of the problem. It is of course the DESTINATION combination, not the SOURCE combination, that is written into the InfoCube when realigning on the InfoCube.
    This, of course, creates the request ID starting with APO. Besides that, I get the request starting with REQU when I load from liveCache using an InfoPackage. So far so good.
    What confuses me, though, is that when I load back into liveCache (load my planning area), it seems to pick up the correct amount of data (not taking the APO* request into account), whereas my load to my external BW out of my APO InfoCube seems to pick up both the REQU* and the APO* requests, resulting in duplicate data.
    Have you had this challenge as well?
    My first idea is to have the BW team write some ABAP coding leaving out any request ID starting with APO*. However, it seems to me that this is standard functionality of APO, and I'd like to get to the bottom of this...
    Best regards,
    Anders
    P.S. As I understand it, the copy logic only determines whether data from the source combination is added to or overwrites the data for the destination combination.

  • Issues with iMessage when switching data carrier (wifi - cellular)

    So I have had this issue with my 5S and now my 6+ in iMessage. Not *always*, but often enough, I run into issues sending messages via iMessage when I'm associated with wifi, in that they don't send. I'll get an error message saying "not delivered", and it will stack for as many outgoing messages as I try to send. However, if I force the first message to send as a text, subsequent messages will go through afterwards without issue. If I go from wifi to cellular I have no issues whatsoever, and when I have these issues on wifi, my data works fine (incoming emails, Facebook notifications, etc.). I used to run cellular data only because I had unlimited data, but now I have a cap, so I intend on using wifi when possible... but to have to send my first message as a text just so I can use iMessage is annoying...

    Hi there BigDogg795,
    I would recommend taking a look at the troubleshooting steps for iMessage found in the article below. 
    iOS: Troubleshooting Messages - Apple Support
    To resolve issues with sending and receiving iMessages, follow these steps:
    1. Check iMessage system status for current service issues.
    2. Go to Settings > Messages > Send & Receive and make sure that you registered iMessage with your phone number or Apple ID and that you selected iMessage for use. If the phone number or Apple ID isn't available for use, troubleshoot iMessage registration.
    3. Open Safari and navigate to www.apple.com to verify data connectivity. If a data connection isn't available, troubleshoot cellular data or a Wi-Fi connection.
    4. iMessage over cellular data might not be available while you're on a call. Only 3G and faster GSM networks support simultaneous data and voice calls. Learn which network your phone supports. If your network doesn't support simultaneous data and voice calls, go to Settings > Wi-Fi and turn Wi-Fi on to use iMessage while you're on a call.
    5. Restart your device.
    6. Tap Settings > General > Reset > Reset Network Settings on your iPhone.
    If you still can't send or receive an iMessage, follow these steps:
    1. Make sure that the contact trying to message you isn't blocked in Settings > Messages > Blocked.
    2. Make sure that the contact you're trying to send a message to is registered with iMessage.
    3. If the issue occurs with a specific contact or contacts, back up or forward important messages and delete your current messaging threads with the contact. Create a new message to the contact and try again.
    4. If the issue occurs with a specific contact or contacts, delete and recreate the contact in the Contacts app. Create a new message to the newly created contact and try again.
    5. Back up and restore your device as new.
    -Griff W.

  • CSV creation through PLSQL having issue with some rows of data

    Hi,
    I am having an issue with some rows of data in a CSV which is created by the PL/SQL code below.
    Issue description:
    - 50,000 rows of data in my CSV
    - around 20 fields in a row
    - in some of the 50,000 rows, some of the 20 fields come out empty, even though my query actually returns data for these fields
    Hope the issue is clear.
    Code given below
    CREATE OR REPLACE FUNCTION TSC_OM_AUDIT(ERR_DESC IN OUT VARCHAR2,
    WEB_FILE_URL IN OUT VARCHAR2 ) RETURN BOOLEAN IS
    --Variable Declaration
    L_buff_line VARCHAR2(500);
    L_hdr_str VARCHAR2(32700);
    L_selling_UOM VARCHAR2(5);
    L_prim_bar_code VARCHAR2(30);
    L_status_pri_bar VARCHAR2(2);
    L_multpl_bar_exist VARCHAR2(2);
    L_multpl_prim_bar_exist VARCHAR2(2);
    L_prim_simp_pack VARCHAR2(30);
    L_staus_prim_simp_pack VARCHAR2(2);
    L_mupl_prim_pack_exist VARCHAR2(2);
    L_prim_supp_prim_simpl_pack VARCHAR2(15);
    L_prim_sim_supp_to_TPNB VARCHAR2(2);
    L_sel1 VARCHAR2(200);
    L_sel2 VARCHAR2(200);
    L_sel3 VARCHAR2(200);
    L_item_till_desc VARCHAR2(200);
    L_lengh1 VARCHAR2(200);
    L_width1 VARCHAR2(200);
    L_height1 VARCHAR2(200);
    L_lengh2 VARCHAR2(200);
    L_width2 VARCHAR2(200);
    L_height2 VARCHAR2(200);
    L_lengh3 VARCHAR2(200);
    L_width3 VARCHAR2(200);
    L_height3 VARCHAR2(200);
    --Item type
    CURSOR C_ITEM is
    select im.item L_item,
    im.status L_status,
    im.item_desc L_item_desc,
    'Regular' L_item_type,
    im.dept L_dept,
    im.class L_class,
    im.subclass L_subclass,
    'N/A' L_CW_item_order_type,
    'N/A' L_CW_item_sale_type,
    im.standard_uom L_standard_uom,
    NULL L_selling_uom,
    im.tsl_base_item L_tsl_base_item
    from item_master im
    where im.item_number_type='TPNB'
    -- and im.last_update_id = 'DATALOAD'
    -- and im.create_datetime>=to_timestamp('2010-11-05 10:57:47','YYYY-MM-DD HH24:MI:SS')
    and im.sellable_ind='Y'
    and im.orderable_ind='Y'
    and im.inventory_ind='Y'
    and im.item_xform_ind='N'
    and im.catch_weight_ind='N'
    and rownum<10;
    --order by im.item asc;
    Cursor C_Selling_UOM (tpnb varchar2) is
    select selling_uom
    from rpm_item_zone_price
    where item = tpnb;
    Cursor C_prim_bar (tpnb varchar2) is
    select item,
    status
    from item_master iem
    where item_parent = tpnb
    and iem.primary_ref_item_ind ='Y';
    Cursor C_multi_bar_exit (tpnb varchar2) is
    select count(*)
    from item_master iem
    where item_parent = tpnb;
    Cursor C_multpl_prim_bar_exist (tpnb varchar2) is
    select count(*)
    from item_master
    where item_parent = tpnb
    and primary_ref_item_ind ='Y';
    Cursor C_staus_prim_simp_pack (tpnb varchar2) is
    select piem.pack_no,
    iem.status
    from item_master iem,
    packitem piem
    where piem.item = tpnb
    and piem.pack_no = iem.item
    and iem.tsl_prim_pack_ind ='Y';
    Cursor C_multpl_prim_pack_exist (tpnb varchar2) is
    select count(*)
    from item_master iem,
    packitem piem
    where piem.item = tpnb
    and piem.pack_no = iem.item
    and iem.tsl_prim_pack_ind ='Y';
    Cursor C_prim_supp_prim_simpl_pack (tpnd varchar2) is
    select supplier
    from item_supplier
    where item = tpnd
    and primary_supp_ind= 'Y';
    Cursor C_prim_sim_supp_to_TPNB (tpnb varchar2,suppl number) is
    select 'Y'
    from item_supplier
    where item = tpnb
    and supplier = suppl;
    Cursor C_item_descretion_SEL (tpnb varchar2) is
    select sel_desc_1,
    sel_desc_2,
    sel_desc_3
    from tsl_itemdesc_sel
    where item =tpnb;
    Cursor C_item_till_descretion (tpnb varchar2) is
    select till_desc
    from tsl_itemdesc_till
    where item=tpnb;
    Cursor C_EA (tpnb varchar2,suppl number) is
    select length,
    width,
    height
    from item_supp_country_dim
    where item= tpnb
    and supplier = suppl
    and dim_object='EA';
    Cursor C_CS (tpnb varchar2,suppl number) is
    select length,
    width,
    height
    from item_supp_country_dim
    where item= tpnb
    and supplier = suppl
    and dim_object='TYUNIT';
    Cursor C_TRAY (tpnb varchar2,suppl number) is
    select length,
    width,
    height
    from item_supp_country_dim
    where item= tpnb
    and supplier = suppl
    and dim_object='TRAY';
    BEGIN
    --INITIAL
    WEB_FILE_URL := TSC_RPT_GRS_EXL_GEN.WEB_URL||TSC_RPT_GRS_EXL_GEN.OPEN_FILE;
    TSC_RPT_GRS_EXL_GEN.put_line('Poland Production Cutover');
    TSC_RPT_GRS_EXL_GEN.skip_line;
    l_hdr_str := 'L_item_TPNB,L_status,L_item_desc,L_item_type,L_section,L_class,L_subclass,L_cw_order_type,L_cw_sale_type,L_std_UOM,L_selling_UOM,'||
    'L_base_item,L_prim_bar_code,L_status_pri_bar,L_multpl_bar_exist,L_multpl_prim_bar_exist,L_prim_simple_pack,L_staus_prim_simp_pack,L_mupl_prim_pack_exist,'||
    'L_prim_supp_prim_simpl_pack,L_prim_sim_supp_to_TPNB,L_sel1,L_sel2,L_sel3,L_item_till_desc,L_lengh1,L_width1,L_height1,L_lengh2,L_width2,L_height2,L_lengh3,L_width3,L_height3';
    l_hdr_str := TSC_RPT_GRS_EXL_GEN.CONVERT_SEPARATOR(l_hdr_str);
    TSC_RPT_GRS_EXL_GEN.put_line(l_hdr_str);
    for rec_c_item in C_ITEM
    LOOP
    open C_Selling_UOM (rec_c_item.L_item);
    fetch C_Selling_UOM into L_selling_UOM;
    close C_Selling_UOM;
    open C_prim_bar (rec_c_item.L_item);
    fetch C_prim_bar into L_prim_bar_code,L_status_pri_bar;
    close C_prim_bar;
    open C_multi_bar_exit (rec_c_item.L_item);
    fetch C_multi_bar_exit into L_multpl_bar_exist;
    close C_multi_bar_exit;
    IF to_number(trim(L_multpl_bar_exist)) > 1 THEN
    L_multpl_bar_exist:='Y';
    ELSE
    L_multpl_bar_exist:='N';
    END IF;
    open C_multpl_prim_bar_exist (rec_c_item.L_item);
    fetch C_multpl_prim_bar_exist into L_multpl_prim_bar_exist;
    close C_multpl_prim_bar_exist;
    IF to_number(trim(L_multpl_prim_bar_exist)) > 1 THEN
    L_multpl_prim_bar_exist:='Y';
    ELSE
    L_multpl_prim_bar_exist:='N';
    END IF;
    open C_staus_prim_simp_pack (rec_c_item.L_item);
    fetch C_staus_prim_simp_pack into L_prim_simp_pack,L_staus_prim_simp_pack;
    close C_staus_prim_simp_pack;
    open C_multpl_prim_pack_exist (rec_c_item.L_item);
    fetch C_multpl_prim_pack_exist into L_mupl_prim_pack_exist;
    close C_multpl_prim_pack_exist ;
    IF to_number(trim(L_mupl_prim_pack_exist)) > 1 THEN
    L_mupl_prim_pack_exist:='Y';
    ELSE
    L_mupl_prim_pack_exist:='N';
    END IF;
    open C_prim_supp_prim_simpl_pack (trim(L_prim_simp_pack));
    fetch C_prim_supp_prim_simpl_pack into L_prim_supp_prim_simpl_pack;
    close C_prim_supp_prim_simpl_pack ;
    open C_prim_sim_supp_to_TPNB (rec_c_item.L_item,to_number(trim(L_prim_supp_prim_simpl_pack)));
    fetch C_prim_sim_supp_to_TPNB into L_prim_sim_supp_to_TPNB;
    close C_prim_sim_supp_to_TPNB ;
    open C_item_descretion_SEL (rec_c_item.L_item);
    fetch C_item_descretion_SEL into L_sel1,L_sel2,L_sel3;
    close C_item_descretion_SEL ;
    open C_item_till_descretion (rec_c_item.L_item);
    fetch C_item_till_descretion into L_item_till_desc;
    close C_item_till_descretion ;
    open C_EA (rec_c_item.L_item,to_number(trim(L_prim_supp_prim_simpl_pack)));
    fetch C_EA into L_lengh1,L_width1,L_height1;
    close C_EA ;
    open C_CS (rec_c_item.L_item,to_number(trim(L_prim_supp_prim_simpl_pack)));
    fetch C_CS into L_lengh2,L_width2,L_height2;
    close C_CS ;
    open C_TRAY (rec_c_item.L_item,to_number(trim(L_prim_supp_prim_simpl_pack)));
    fetch C_TRAY into L_lengh3,L_width3,L_height3;
    close C_TRAY ;
    L_buff_line := TSC_RPT_GRS_EXL_GEN.PAD_COMMA(to_char(rec_c_item.L_item), TRUE)||
    TSC_RPT_GRS_EXL_GEN.PAD_COMMA(to_char(rec_c_item.L_status))||
    TSC_RPT_GRS_EXL_GEN.PAD_COMMA(to_char(rec_c_item.L_item_desc))||
    TSC_RPT_GRS_EXL_GEN.PAD_COMMA(to_char(rec_c_item.L_item_type))||
    TSC_RPT_GRS_EXL_GEN.PAD_COMMA(to_char(rec_c_item.L_dept))||
    TSC_RPT_GRS_EXL_GEN.PAD_COMMA(to_char(rec_c_item.L_class))||
    TSC_RPT_GRS_EXL_GEN.PAD_COMMA(to_char(rec_c_item.L_subclass))||
    TSC_RPT_GRS_EXL_GEN.PAD_COMMA(to_char(rec_c_item.L_CW_item_order_type))||
    TSC_RPT_GRS_EXL_GEN.PAD_COMMA(to_char(rec_c_item.L_CW_item_sale_type))||
    TSC_RPT_GRS_EXL_GEN.PAD_COMMA(to_char(rec_c_item.L_standard_uom))||
    TSC_RPT_GRS_EXL_GEN.PAD_COMMA(to_char(trim(L_selling_UOM)))||
    TSC_RPT_GRS_EXL_GEN.PAD_COMMA(to_char(rec_c_item.L_tsl_base_item))||
    TSC_RPT_GRS_EXL_GEN.PAD_COMMA(to_char(trim(L_prim_bar_code)))||
    TSC_RPT_GRS_EXL_GEN.PAD_COMMA(to_char(trim(L_status_pri_bar)))||
    TSC_RPT_GRS_EXL_GEN.PAD_COMMA(to_char(trim(L_multpl_bar_exist)))||
    TSC_RPT_GRS_EXL_GEN.PAD_COMMA(to_char(trim(L_multpl_prim_bar_exist)))||
    TSC_RPT_GRS_EXL_GEN.PAD_COMMA(to_char(trim(L_prim_simp_pack)))||
    TSC_RPT_GRS_EXL_GEN.PAD_COMMA(to_char(trim(L_staus_prim_simp_pack)))||
    TSC_RPT_GRS_EXL_GEN.PAD_COMMA(to_char(trim(L_mupl_prim_pack_exist)))||
    TSC_RPT_GRS_EXL_GEN.PAD_COMMA(to_char(trim(L_prim_supp_prim_simpl_pack)))||
    TSC_RPT_GRS_EXL_GEN.PAD_COMMA(to_char(trim(L_prim_sim_supp_to_TPNB)))||
    TSC_RPT_GRS_EXL_GEN.PAD_COMMA(to_char(trim(L_sel1)))||
    TSC_RPT_GRS_EXL_GEN.PAD_COMMA(to_char(trim(L_sel2)))||
    TSC_RPT_GRS_EXL_GEN.PAD_COMMA(to_char(trim(L_sel3)))||
    TSC_RPT_GRS_EXL_GEN.PAD_COMMA(to_char(trim(L_item_till_desc)))||
    TSC_RPT_GRS_EXL_GEN.PAD_COMMA(to_char(trim(L_lengh1)))||
    TSC_RPT_GRS_EXL_GEN.PAD_COMMA(to_char(trim(L_width1)))||
    TSC_RPT_GRS_EXL_GEN.PAD_COMMA(to_char(trim(L_height1)))||
    TSC_RPT_GRS_EXL_GEN.PAD_COMMA(to_char(trim(L_lengh2)))||
    TSC_RPT_GRS_EXL_GEN.PAD_COMMA(to_char(trim(L_width2)))||
    TSC_RPT_GRS_EXL_GEN.PAD_COMMA(to_char(trim(L_height2)))||
    TSC_RPT_GRS_EXL_GEN.PAD_COMMA(to_char(trim(L_lengh3)))||
    TSC_RPT_GRS_EXL_GEN.PAD_COMMA(to_char(trim(L_width3)))||
    TSC_RPT_GRS_EXL_GEN.PAD_COMMA(to_char(trim(L_height3)));
    TSC_RPT_GRS_EXL_GEN.PUT_LINE(L_buff_line);
    L_selling_UOM :=NULL;
    L_prim_bar_code :=NULL;
    L_status_pri_bar :=NULL;
    L_multpl_bar_exist :=NULL;
    L_multpl_prim_bar_exist :=NULL;
    L_prim_simp_pack :=NULL;
    L_staus_prim_simp_pack :=NULL;
    L_mupl_prim_pack_exist :=NULL;
    L_prim_supp_prim_simpl_pack :=NULL;
    L_prim_sim_supp_to_TPNB :=NULL;
    L_sel1 :=NULL;
    L_sel2 :=NULL;
    L_sel3 :=NULL;
    L_item_till_desc :=NULL;
    L_lengh1 :=NULL;
    L_width1 :=NULL;
    L_height1 :=NULL;
    L_lengh2 :=NULL;
    L_width2 :=NULL;
    L_height2 :=NULL;
    L_lengh3 :=NULL;
    L_width3 :=NULL;
    L_height3 :=NULL;
    END LOOP;
    TSC_RPT_GRS_EXL_GEN.close_file;
    return TRUE;
    EXCEPTION WHEN OTHERS THEN
    ERR_DESC := '['||SQLCODE||']-'||SUBSTR(SQLERRM,1,200);
    return FALSE;
    END TSC_OM_AUDIT;
    Please suggest something on this.
    Regards,
    Shamed H

    Hi Shamid,
    This forum is only for questions regarding the SQL Developer tool. Please post in Forum Home > Database > SQL and PL/SQL:
    PL/SQL
    Regards,
    Gary
    SQL Developer Team

  • Issues with loading slideshows

    I have built a site for a client using Muse, and the client is experiencing problems with a couple of slideshows that won't load. I'm not having this issue using either Safari or Firefox, and there doesn't appear to be any problem on separate desktops (mates online are not having any issues).
    The site in question has a number of slideshows on different pages. This issue is with the Gallery page slideshow (apparently it doesn't load images at all, but the captions still rotate), and the same slideshow issue occurs on the Covers & Cloths page.
    No-one else is experiencing this; the client's site renders fine on her tablet (running Safari), no problems there. It's just the desktop version... and there is no separate version for tablet and desktop; I only created two versions of the site, a desktop and a mobile. The site in question is www.finishingtouchuk.co.uk
    Off the top of my head I was wondering if perhaps the client has loaded an image into the offending slideshows through the Dashboard and it hasn't been optimized, or perhaps the file names are conflicting, but because no-one else is having this issue I'm not sure how to advise my client. Could it be some specific settings within their version of Safari?
    Thanks
    Penny

    Hi Penny,
    This could very well be as simple as their browser cache or a plugin/extension/add-on causing this.
    You could advise them to try a different browser or try resetting their current browser to default settings. This seems pretty much a browser/machine specific issue.
    Cheers,
    Vikas

  • Blocking of Goods Issue with respect to posting date

    Hi Expert,
    Is there any way in SAP B1 to block goods issue with respect to the posting date? I.e., if a user receipts goods into stock today, the system should not allow the user to issue the same goods on a back date.
    regards,
    PankajK

    Hi Pankaj,
    There are many options. E.g., you can check in OINM: select that particular item, sort on DocDate, and check InQty; compare that with your goods issue date, and if it is the same, then send that transaction for approval if you want.
    Thanks
    Sachin

  • SAP Best Practice: Problems with Loading of Transaction Data

    Hi!
    I am about to implement SAP Best Practices scenario "B34: Accounts Receivable Analysis".
    Therefore I load the data from an SAP ERP IDES system into an SAP NetWeaver 2004s system.
    My problems are:
    When I try to load the transaction data for InfoSources 0FI_AR_4 and 0FI_AR_6, I get the following errors/warnings:
    When I start the "schedule process", the status stays "yellow 14:27:31 (194 from 0 records)".
    On the right side I see some actions that are also "yellow", e.g. "DataStore Activation (Change Log): not yet activated".
    As a result, I cannot see any data in tcode RSRT when executing the queries "0FIAR_C03/...".
    The problems there:
    1) The input help of the query's web template doesn't contain any values
    2) No data is shown
    Can some one help me to solve this problem?
    Thank you very much!
    Jürgen

    Be in the monitor window where you got the issue below:
    when you start the "schedule process", the status stays "yellow 14:27:31 (194 from 0 records)"
    Go to Environment in the menu options > Transact. RFC > In the source system...
    Give the logon details, press Enter, and from there give the correct target destination as your BI server and execute.
    If you find some IDocs pending there, push them manually using F6.
    Then come back to your load and refresh.
    If it still doesn't turn green, you can manually change the status to red on the Status tab, then go to the Processing tab, expand your processing details, right-click on the data packet which was not yet updated, and select manual update.
    It will show a busy status; when it comes out of that, refresh once again.
    rgds,

  • Illustrator CS5/CS6 on OSX 10.9.1 - Issue with loading fonts.

    I'm using Illustrator CS6 on OS X 10.9.1. Ever since I upgraded to OS X Mavericks, Illustrator has stopped loading some of the document fonts, even though they are enabled and available to other applications. I get the missing-font dialog every time I open the artboard after a system restart. I usually resolve this by opening the Font Book app and disabling and re-enabling the fonts; this refreshes the artboard and the fonts then appear to work as normal. Initially I suspected a compatibility issue between Illustrator CS5 and OS X Mavericks, but after upgrading to CS6 I continue to have the same issue. The fonts in question are TTFs, and they worked fine previously.
    How can I fix this?

    Do you use any third-party font management tool?
    Which fonts are affected?

  • Issues with loading some webpages and dropbox. OS X Yosemite Wifi

    I am experiencing issues with my iMac. As I was browsing in Chrome this morning, I went to load a page and suddenly it would not load, giving a "page cannot be displayed" error. I've been having issues since then.
    My wifi says it's connected and everything is working fine. I can load some pages but not others (this happens across all browsers I've tried: Chrome, Firefox, Safari). My Dropbox desktop client will not load, nor can I load my Dropbox account online. Also, when I quit Dropbox from the menu tray and try to reopen it from Applications, it does nothing.
    My Google Drive desktop client and Google Drive online are working perfectly, though.
    I have shut down and restarted my computer. I have also reset the modem. Unfortunately, nothing changed.
    I updated to OS X version 10.10.1 three days ago, so maybe the update is partly responsible?
    Any ideas?

    Hello workingcars,
    I would start your troubleshooting with the first article below and use Wireless Diagnostics to help isolate and resolve the issue. You may also want to create a new network location to see if that resolves the issue.
    Wi-Fi: How to troubleshoot Wi-Fi connectivity
    http://support.apple.com/kb/ht4628
    Using network locations (Mac OS X v10.6 and later)
    http://support.apple.com/kb/ht5289
    Regards,
    -Norm G. 

  • Having issues with loading pages with large web articles..

    I'm having issues with pages loading all the way when I read articles off Wikipedia, mainly pages that have tons and tons of text, but it doesn't make sense. The browser indicates that the page is loaded all the way, but when I make it halfway down the page, everything becomes completely white. Like LITERALLY everything becomes white. This prevents me from reading any further down the page; even if I force the page to scroll all the way down, it doesn't help one bit.
    Does anyone have a solution or a fix to this? I would like to continue reading my web articles, without being interrupted with these white pages/bugs.
    Post relates to: HP TouchPad (WiFi)

    Try clearing the cache for the web browser. While the browser is open, go up to Web/Preferences and choose "Clear cache".
    Also try rebooting the TouchPad via Device Info/Reset Options/Restart.
    WyreNut
    I am a Volunteer here, not employed by HP.
