Issue with deltas

Hello, I am having problems with the RSA7 queue. I have to delete the init and recreate it every day. It works fine all day long, but then it stops working the next day. In my monitor all I see is a yellow request. Can someone tell me how to fix this issue? Thanks.

Why are you deleting the init every day and recreating it? What error message do you see in the monitor?
Is the present load an init or a delta? What do you see on the monitor screen? Is it still running?
You can check the jobs in SM37 in BW and R/3, and also check SM58 / WE02 for any stuck IDocs.

Similar Messages

  • Issue with Delta in Function Module

    Hi Team,
    I have an issue with the delta in a generic extraction using a function module. The full load is working fine, and I have taken POST_DATE as the delta field. Please check the code in case any delta-related statements are missing.
    FUNCTION ZRSAX_BIW_MANGEMENT_RAT.
    *"----------------------------------------------------------------------
    *"*"Local interface:
    *"  IMPORTING
    *"     VALUE(I_REQUNR) TYPE  SRSC_S_IF_SIMPLE-REQUNR
    *"     VALUE(I_DSOURCE) TYPE  SRSC_S_IF_SIMPLE-DSOURCE OPTIONAL
    *"     VALUE(I_MAXSIZE) TYPE  SRSC_S_IF_SIMPLE-MAXSIZE OPTIONAL
    *"     VALUE(I_INITFLAG) TYPE  SRSC_S_IF_SIMPLE-INITFLAG OPTIONAL
    *"     VALUE(I_READ_ONLY) TYPE  SRSC_S_IF_SIMPLE-READONLY OPTIONAL
    *"  TABLES
    *"      I_T_SELECT TYPE  SRSC_S_IF_SIMPLE-T_SELECT OPTIONAL
    *"      I_T_FIELDS TYPE  SRSC_S_IF_SIMPLE-T_FIELDS OPTIONAL
    *"      E_T_DATA STRUCTURE  ZQMBW_FUJ_MANAGEMENT OPTIONAL
    *"  EXCEPTIONS
    *"      NO_MORE_DATA
    *"      ERROR_PASSED_TO_MESS_HANDLER
    *"----------------------------------------------------------------------

    * Example: DataSource for table MANAGEMENT RATING
      TABLES: ZQMBW_MANAGEMENT.

    * Auxiliary selection criteria structure
      DATA: L_S_SELECT TYPE SRSC_S_SELECT.

      STATICS:
    * Parameter buffer, incl. maximum number of lines per package
        S_S_IF TYPE SRSC_S_IF_SIMPLE,
    * Data package counter
        S_COUNTER_DATAPAKID LIKE SY-TABIX,
    * Database cursor
        S_CURSOR TYPE CURSOR.

      RANGES: POST_DATE FOR ZMMTVEND_RATING-POST_DATE,
              VENDOR FOR ZMMTVEND_RATING-VENDOR.

    * Initialization mode (first call by SAPI) or data transfer mode
    * (following calls)?
      IF I_INITFLAG = SBIWA_C_FLAG_ON.

    * Initialization: check input parameters,
    *                 buffer input parameters,
    *                 prepare data selection

    * Check DataSource validity
        CASE I_DSOURCE.
          WHEN 'ZQMMANAGEMENT_DS'.
          WHEN OTHERS.
            IF 1 = 2. MESSAGE E009(R3). ENDIF.
    * This is a typical log call; write every error message like this
            LOG_WRITE 'E'                  "message type
                      'R3'                 "message class
                      '009'                "message number
                      I_DSOURCE            "message variable 1
                      ' '.                 "message variable 2
            RAISE ERROR_PASSED_TO_MESS_HANDLER.
        ENDCASE.

        APPEND LINES OF I_T_SELECT TO S_S_IF-T_SELECT.

    * Fill parameter buffer for data extraction calls
        S_S_IF-REQUNR  = I_REQUNR.
        S_S_IF-DSOURCE = I_DSOURCE.
        S_S_IF-MAXSIZE = I_MAXSIZE.

    * Fill field list table for an optimized SELECT statement
    * (in case there is no 1:1 relation between InfoSource fields
    * and database table fields this may be far from being trivial)
        APPEND LINES OF I_T_FIELDS TO S_S_IF-T_FIELDS.

      ELSE.                 "Initialization mode or data extraction?

        IF S_COUNTER_DATAPAKID = 0.

    * Fill range tables. BW will only pass down simple selection criteria
    * of the type SIGN = 'I' and OPTION = 'EQ' or OPTION = 'BT'.
          LOOP AT S_S_IF-T_SELECT INTO L_S_SELECT WHERE FIELDNM = 'VENDOR'.
            MOVE-CORRESPONDING L_S_SELECT TO VENDOR.
            CALL FUNCTION 'CONVERSION_EXIT_ALPHA_INPUT'
              EXPORTING
                INPUT  = VENDOR-LOW
              IMPORTING
                OUTPUT = VENDOR-LOW.
            CALL FUNCTION 'CONVERSION_EXIT_ALPHA_INPUT'
              EXPORTING
                INPUT  = VENDOR-HIGH
              IMPORTING
                OUTPUT = VENDOR-HIGH.
            APPEND VENDOR.
          ENDLOOP.

          LOOP AT S_S_IF-T_SELECT INTO L_S_SELECT WHERE FIELDNM = 'POST_DATE'.
            MOVE-CORRESPONDING L_S_SELECT TO POST_DATE.
    * Reorder external DD.MM.YYYY values into internal YYYYMMDD
            CONCATENATE L_S_SELECT-LOW+6(4) L_S_SELECT-LOW+3(2) L_S_SELECT-LOW+0(2) INTO POST_DATE-LOW.
            CONCATENATE L_S_SELECT-HIGH+6(4) L_S_SELECT-HIGH+3(2) L_S_SELECT-HIGH+0(2) INTO POST_DATE-HIGH.
            APPEND POST_DATE.
          ENDLOOP.

    * Get management rating details
          OPEN CURSOR WITH HOLD S_CURSOR FOR
            SELECT VENDOR POST_DATE OVERALL_MNGT_RAT OVERALL_DEV_RAT
              FROM ZMMTVEND_RATING
              WHERE VENDOR IN VENDOR AND POST_DATE IN POST_DATE.
        ENDIF.

    * Fetch records into the interface table
        FETCH NEXT CURSOR S_CURSOR
                   APPENDING CORRESPONDING FIELDS
                   OF TABLE E_T_DATA
                   PACKAGE SIZE S_S_IF-MAXSIZE.
        S_COUNTER_DATAPAKID = S_COUNTER_DATAPAKID + 1.

        IF SY-SUBRC <> 0.
          CLOSE CURSOR S_CURSOR.
          RAISE NO_MORE_DATA.
        ENDIF.

      ENDIF.              "Initialization mode or data extraction?

    ENDFUNCTION.

    Hi
    Check URLs:
    How to populate the ranges using FM for the SELECTs
    Re: Generic Delta Function Module
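    For context on why a generic delta FM usually needs no extra "delta statements": the SAPI passes the delta interval for the delta field (here POST_DATE) in I_T_SELECT like any ordinary selection, so the code above already covers the delta case as long as that range is applied in the WHERE clause. A hedged debugging sketch, reusing the names from the posted code:

    ```abap
    * Sketch only: the names (S_S_IF, L_S_SELECT, POST_DATE) come from the
    * posted FM. On a delta run, BW sends the interval computed from the
    * delta pointer as a normal selection row for the delta field.
      LOOP AT S_S_IF-T_SELECT INTO L_S_SELECT
           WHERE FIELDNM = 'POST_DATE'.
        MOVE-CORRESPONDING L_S_SELECT TO POST_DATE.
        APPEND POST_DATE.
      ENDLOOP.

    * If POST_DATE stays empty here during a delta run, check in RSO2 that
    * POST_DATE is flagged as the delta-relevant field and as a selection
    * field; otherwise the FM receives no interval and returns nothing.
      IF POST_DATE[] IS INITIAL.
        "delta interval missing - inspect RSO2 / RSA7 settings
      ENDIF.
    ```

    One nuance to watch: the posted code reorders L_S_SELECT-LOW assuming an external DD.MM.YYYY value, which may corrupt a delta interval that already arrives in internal YYYYMMDD format.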

  • Issue with Delta Load in BI 7.0... Need resolution

    Hi
    I am having difficulty with a delta load which uses a generic extractor. The generic extractor is based on a view over two tables. I use the system date to perform the delta load: when the system date advances by a day, the load is expected to pick up the extra records. One of the tables used in the view, for master data, does not contain the system date.
    The data does not even reach the PSA. It keeps saying there are no records. Is it because I loaded the data for yesterday and am manually adding today's data?
    I am not sure what the cause of the delta failing is.
    Appreciate any suggestions to take care of the issue.
    Thanks.... SMaa

    Hi
    A generic DataSource supports the following delta types:
    1. Calendar day
    2. Numeric pointer
    3. Time stamp
    Calendar day – the delta is based on the calendar day, so it can be run only once per day, preferably at the end of the day, to minimize missed delta records.
    Numeric pointer – this type of delta is suitable only when extracting from a table that supports only the creation of new records / changes to existing records.
    It supports:
    Additive delta: with delta type additive delta, each record to be loaded only returns the respective changes to key figures, for key figures that can be aggregated. The extracted data is added up in BI (targets: DSO and InfoCube).
    New status for changed records: with delta type new status for changed records, each record to be loaded returns the new status for all key figures and characteristics. The values in BI are overwritten (targets: DSO and master data).
    Time stamp – using a timestamp we can run the delta multiple times per day, but we need to use the safety lower limit and safety upper limit, with a minimum of 5 minutes.
    As you specified, the DataSource is based on a VIEW (of two tables, one containing a date and the other not).
    Kindly check the points above and verify whether the view (primary key) could be used to determine the delta for your DataSource.
    Also let us know if any standard SAP tables are used in creating the VIEW, and if so, what the size of the DataSource load is every day.
    Thanks.
    Nazeer
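    To make the timestamp variant above concrete, here is a hedged sketch of what a delta run effectively selects; the table ZMY_DOC and field ZTSTAMP are invented for illustration:

    ```abap
    * Illustration only: ZMY_DOC and ZTSTAMP are hypothetical names.
    * With a safety upper limit of 300 seconds, BW asks the extractor for
    *   ZTSTAMP >  <pointer saved by the previous run>
    *   ZTSTAMP <= <now minus 300 seconds>
    * so rows committed while the extraction runs are caught next time.
      DATA: lt_ts   TYPE RANGE OF timestamp,
            lt_docs TYPE STANDARD TABLE OF zmy_doc.

    * The range is filled from I_T_SELECT (FIELDNM = 'ZTSTAMP'); BW has
    * already applied the safety limits when computing LOW and HIGH.
      SELECT * FROM zmy_doc INTO TABLE lt_docs
        WHERE ztstamp IN lt_ts.
    ```

    The same shape applies to calendar-day deltas, except the interval is whole days, which is why they are safe to run only once per day.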

  • Issue with Delta Request

    Hi Friends,
    I have an issue with loading.
    1. From source 2LIS_13_VDITM, data loads to two targets:
    ZIC_SEG and 0SD_C03, and from 0SD_C03 it loads again to ZSD_C03W and ZSD_C03M through DTPs.
    I did a repair full load on 08.08.2011 to the PSA and loaded this request manually to cube ZIC_SEG.
    I forgot to delete this request from the PSA, as I did not want it loaded to the other targets.
    Before I noticed, delta requests had already been loaded and pulled my repair full request into the other targets as well.
    As I have not done any selective deletions on the other cubes, there may be double entries.
    I am planning the steps below in order to rectify the issue:
    1. Do a selective deletion of the delta request in all three targets which got loaded on the 8th along with the repair full.
    2. Delete the repair full request from the PSA.
    So now the delta request, which got loaded after the repair full request, is left in the PSA.
    My question is: if my PC runs today, will this delta request in the PSA also be pulled again into the cube through the DTP?
    Kindly share any other ideas; this is urgent.
    Regards,
    Banu

    Hi Banu,
    If the data in the cubes is not compressed, then follow the steps below:
    1) Delete the latest request in cube ZIC_SEG.
    2) Delete the latest request from cubes ZSD_C03W and ZSD_C03M, and then from 0SD_C03.
    3) Load the delta request manually using DTPs to cubes ZIC_SEG and 0SD_C03 (in a DTP you can load by giving the request number).
    4) Now run the delta DTPs to load the delta request from 0SD_C03 to ZSD_C03W and ZSD_C03M.
    The next time your PC runs, it will load only that particular day's delta request.
    It will work
    Regards,
    Venkatesh

  • Issue with delta load from R/3 to BW

    Hi frnds,
    There is a standard DataSource, 2LIS_05_ITEM, in R/3. Every day we extract data from R/3 to BW based on this standard DataSource with a delta load. There are two fields, ZX and ZY, in BW; sometimes the data is not extracted for these two fields with the delta update, but at other times these two fields are extracted properly into BW with the same delta update.
    Why is it happen like this ? Please give some inputs on this.
    Regards,
    Satya.

    Hello,
    If it is a standard field, then it is getting populated in the correct way.
    If it is a custom field, then you need to analyze the records for which it is getting populated and the ones for which it is not.
    It is quite possible that some customization in CMOD results in this behaviour.
    Also,check underlying tables to see the correct values.
    Regards
    Ajeet

  • Issue with delta queue initialization

    Hello
    I am having a problem with the initialization of the delta queue for a certain extractor. The initialization was successful, and I can see the extractor in the delta queue maintenance (RSA7) in ECC, but when users post transactions in ECC, they are not coming into RSA7.
    The extractor is 0CO_OM_WBS_6, so it is not the standard LO cockpit, where V3 (background) jobs have to be triggered.
    One more thing: the record which was posted was backdated, i.e. the date is sometime in the past. But I think the delta queue should still pick up such records. Can someone explain how the delta queue functions based on the timestamp?
    Please advise.

    Here is the OSS note for the FI DataSources.
    OSS 485958:
    Symptom
    The FI line item delta DataSources can identify new and changed data only to the day because the source tables only contain the CPU date and not the time as time characteristic for the change.
    This results in a safety interval of at least one day.
    This can cause problems if the batch processing of the period-end closing is still running on the day following the period-end closing and all data of the closed period is to be extracted into BW on this day.
    Example:
    The fiscal year of the company ends on December 31, 2002.
    On January 01, 2003, all data of the closed fiscal year is to be extracted into BW. In this case, the FI line item delta DataSources only extract the data with a safety interval of at least one day.
    This is the data that was posted with a CPU date up to December 31, 2002.
    However, the batch programs for closing the 2002 fiscal year are still running on January 01, 2003.
    The postings created by these programs with a January 01 date for the 2002 fiscal year can therefore only be extracted with delta extraction on January 02, 2003. However, extraction should already be possible on January 01, 2003.
    Other terms
    FI line item delta DataSources
    0FI_AP_4; 0FI_AR_4; 0FI_GL_4
    Reason and Prerequisites
    This problem is caused by the program design.
    Solution
    The functions are enhanced as of PI 2002.1, standard version.
    Due to a customized setting, it is now possible for the FI line item extractors to read the data up to the current date during the delta runs.
    During the next delta run, the data is then read again from the day when the previous extraction was called, up to the current day of the call for the new extraction. This means that the delta extraction of the data always occurs with time intervals overlapping by one day.
    The delta calculation by the ODS objects ensures that the repeatedly extracted data does not cause duplicate values in the InfoCube.
    Activate the new functions as follows:
    1. PI 2001.1 or PI 2001.2: implement the new functions with the correction instructions in this note. As of PI 2002.1, these changes are already part of the standard version.
    2. Start a test run for an FI line item DataSource via transaction RSA3.
    3. Two new parameters are now available in the BWOM_SETTINGS table.
    4. Parameter BWFIOVERLA (default ' '): set this parameter to "X" using transaction SE16 to activate the new function of overlapping time intervals.
    Caution
    The new function also uses a safety interval of one day to determine the To value of the selection if the time limit set by BWFITIMBOR has not been reached. This means that only data up to the day before the extraction was called is selected if the extraction starts before 02:00:00 (the default for BWFITIMBOR).
    Parameter BWFITIMBOR (format HH:MM:SS, default 020000):
    The default value for this parameter is 02:00:00. Do not change this value.
    If this time limit has not been reached, only data up to the previous day (with BWFIOVERLA = 'X') or the day before that (with BWFIOVERLA = ' ') is selected.
    This logic prevents records with a CPU date of the previous day from being included in the posting while the extraction is executed.
    Check any OSS notes for your data source.
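    The note's date logic can be paraphrased as pseudo-ABAP (a reading of the note's description, not SAP's shipped code; the variable names are invented):

    ```abap
    * l_overla / l_timbor mirror the BWOM_SETTINGS parameters BWFIOVERLA
    * and BWFITIMBOR; l_to_date is the upper CPU-date limit of the selection.
      DATA: l_to_date TYPE sy-datum.

      IF l_overla = 'X'.              "overlapping intervals active
        IF sy-uzeit >= l_timbor.      "past 02:00:00 (default BWFITIMBOR)
          l_to_date = sy-datum.       "read up to today; the next run
                                      "re-reads the last day, and the ODS
                                      "delta calculation removes duplicates
        ELSE.
          l_to_date = sy-datum - 1.   "safety: only up to yesterday
        ENDIF.
      ELSE.                           "classic one-day safety interval
        IF sy-uzeit >= l_timbor.
          l_to_date = sy-datum - 1.
        ELSE.
          l_to_date = sy-datum - 2.   "the day before yesterday
        ENDIF.
      ENDIF.
    ```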

  • Issue with Delta upload in Extractor

    Hi all,
    We are using the extractor 2LIS_11_VAITM to load the Sales Overview cube through the delta process. Whenever a new record is entered (to be specific, whenever a new sales order is created), data is extracted and loaded properly into the cube.
    But when an existing record is changed (an existing sales order), the record is not picked up by the R/3 extractor (2LIS_11_VAITM), which means CHANGED RECORDS are not being extracted as per the schedule.
    Any hints to solve the issue would be helpful.
    Regards,
    Ram.

    Hello,
    Is the load using Direct Delta, Queued Delta, or Unserialized V3 Update?
    Is data from InfoSource 2LIS_11_VAITM directly feeding the InfoCube, or does it go into an ODS? If it is directly feeding the InfoCube, do you have a document number InfoObject in the cube? How are you checking whether existing data has been updated or a new record has arrived in your cube? Take any document number which you think got changed in R/3 and check it in the PSA too.
    Was the problem of existing documents not being updated always there, or did it just happen today?
    GSM.

  • Issue with Delta Queue

    Dear All,
    We have scheduled the delta queue on development and it was working fine.
    But on the quality system, all the jobs for application 11 are failing, while those for application 13 are running successfully.
    Please help with this immediately.
    Regards,
    Sohil Shah.

    Hi Sohil,
    It seems that you have changed the structure of the DataSource. When you change the structure of a DataSource, you have to make sure that there is no data in the delta queue or the SMQ1 queue of the target system before you transport it. Delete the data from the delta queue and the SMQ1 queue and reinitialize the DataSource.
    Let me know if you have any doubt.
    Regards,
    Viren

  • Issues with Delta Loads.

    Hi All,
    I hope you don't mind providing a suggestion about the delta load of purchase orders.
    The Procurement team complained that the data is not looking right.
    For example, for a given purchase order the exemption flag is displayed as ADE.
    Before the change:
    Purchase Doc Number   Purchase Exemption Flag
    90002106              ADE
    They want it displayed as follows.
    After the change:
    Purchase Doc Number   Purchase Exemption Flag
    90002106
    So the ABAPer has included logic in the ECC function module that populates the extract structure for the DataSource, to produce the result above. But the overnight run didn't pick up the changes to the above record in BW. The loads are delta loads.
    So my questions are:
    ·         The load for the purchase orders InfoCube is a delta load. Will the change to the function module pick up the changes to old records as well? I mean, with a delta load, do the old records get changed and updated in the CUBE, or should I delete the data from the PSA and the InfoCube and reload the cube to pick up the changes? I am a bit worried about deleting the data from the InfoCube, as it may hold history data (because of the delta load). The cube holds 800,000 records in total.
    ·         Another question: does the process chain trigger the function module that loads the data into the extract structure of the DataSource in ECC (I am a bit confused about this step in ECC)?
    Please suggest how I can capture those changed records.
    Any help would be much appreciated. Please let me know if you need more info.
    Thanks in advance
    Sandeep

    Hi Sandeep,
    I am a little confused by your post. Are you using 2LIS_02_SCL or the generic FM DataSource?
    Also, it looks like a new custom field was added in your client. I have never seen a purchase exemption flag, even though I have been working on the purchasing flow for the last two years.
    W.r.t. purchasing, there is a deletion flag, which comes from EKPO and has the values S or L (deleted or blocked).
    You can block or delete a PO as long as no GR has happened; otherwise you cannot. So use EKPO-LOEKZ for your purpose.
    Regards,
    Rajesh

  • Issue with loading of Delta data

    Hi all,
    I have constructed a generic DataSource using the tables ESLH, ESLL and ESKL, with AEDAT (Changed On) from the ESKL table as the delta field. I created an InfoCube and a DSO based on the DataSource, and the loading is completed. Delta loads are running daily.
    Now my question is how to delete the data from the InfoCube in the BW system when the fields ESLL-DEL (deletion indicator) and ESLL-STOKZ (reversal document) are set to active in the ESLL table in the R/3 system after the delta loading of data has completed.
    can anyone have an idea of how to resolve this issue.
    Thanks in Advance,
    Vinay Kumar

    There are a couple of ways to do this, but I would question your logic as to why you need to delete them first:
    For records that are marked as "deleted" you should just filter these records in the Query designer. Doing it this way is non-destructive, and later if your business needs to know some KPI based around number of deleted records - then you will be able to answer this. If you permanently remove the data, then you cannot answer this KPI.
    For records marked "reversal" you should never remove these because they affect your key figures. A reversal can also be a partial reversal and so your KPIs will be wrong if you don't include them, and in any case in reporting you never see the reversal because almost all KFs are aggregated. The only time you will see a reversal record is when you include the characteristic that identifies it as a reversal, and in most reports that is unnecessary.
    Right - so now to your solution(s) - although I repeat I advise not doing any of these.
    As these are custom DataSources, make sure that in your source system you write ABAP code to fill in the RECORDMODE for each record.
    This way when you load your data to the DSO, the activation step will take care of processing deletions and reversals.
    Next compress your cube and use selective deletion on the cube to remove the deleted and reversal records.
    Alternatively:
    Compress your cube and use selective deletion on the cube to remove the deleted and reversal records.
    Then in your transformation to the cube, create a start routine which removes the records from the source package. ie the records are never allowed to go to the cube.
    I hope that helps.
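    Both options above can be sketched briefly; the field names DEL and STOKZ are taken from the question, and the RECORDMODE handling is a hedged outline rather than a drop-in solution:

    ```abap
    * Option 1, source system: have the extractor FM mark deletions so
    * that DSO activation processes them (0RECORDMODE 'D' = delete image;
    * the extract structure needs a ROCANCEL-like field mapped to
    * 0RECORDMODE in the transfer structure).
      FIELD-SYMBOLS: <ls_data> LIKE LINE OF e_t_data.
      LOOP AT e_t_data ASSIGNING <ls_data>.
        IF <ls_data>-del = 'X'.
          <ls_data>-recordmode = 'D'.
        ENDIF.
      ENDLOOP.

    * Option 2, BW side: a start routine in the transformation to the
    * cube that drops deleted/reversed rows before they reach the target.
      DELETE SOURCE_PACKAGE WHERE del = 'X' OR stokz = 'X'.
    ```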

  • Issue with generic extraction (Generic Delta)

    I have one more issue with generic delta extraction. I selected the field relevant to my requirement, but when I try to save I get the error "Still OLTP have errors".
    I need some help on what the settings should be when we select a generic delta. I am currently working on Inventory Management.
    Please, someone help me as soon as possible.
    Thanks In advance.....
    Regds
    SDR.

    Hi ,
    I think there are some key figure fields which need a 0UNIT or 0CURRENCY field to be present in the extract structure, but they are not present. Check that all unit of measure and currency unit fields are not hidden.
    Regards,
    Vijay.

  • Issue with R/3 Extractors 2LIS_13_VDITM and 2LIS_11_VAITM during Delta Update

    Hi all,
    We are using the 2LIS_13_VDITM and 2LIS_11_VAITM DataSources to load the Sales Overview cube directly, as well as a custom ODS separately, using the Direct Delta method.
    The first upload (initialization) was perfect, but if any changes are made to the existing data, or if any new entries are entered in the source system, the R/3 extractors (2LIS_13_VDITM and 2LIS_11_VAITM) are not pulling those records.
    Once we tried deleting the setup tables and reloading them, the extractors were able to pull the records, but the issue crops up again for any new entries or changes made in the source system.
    This is stopping us from completing the project, so any suggestions that would help us solve the issue are most welcome.
    Regards,
    Surendhar.

    Hi,
    If you have selected the 'Queued delta' update method for applications 11 and 13, then make sure that the "update" job is scheduled to move the data from LBWQ to RSA7. Once the data is available in RSA7, you can execute the InfoPackage with a delta load.
    If you have selected the 'Direct delta' update method for applications 11 and 13, then there is no need for an "update" job to move the data from LBWQ to RSA7. Once the data is available in RSA7, you can execute the InfoPackage with a delta load.
    With rgds,
    Anil Kumar Sharma .P

  • Issue with 0SRM_TD_IV data source delta

    Hi experts,
    I have an issue with invoice numbers not being updated correctly when one of the invoice items is deleted in SRM.
    For example:
    In SRM:
    Invoice number: 50001168
    Invoice item : 1, quantity: 30 - deleted ( deletion indicator - x)
    Invoice item : 2, quantity: 4
    Invoice item : 3, quantity: 7
    In standard extractor (0SRM_TD_IV) RSA3 : exactly same as SRM data.
    In the standard DSO (0SRIV_D3): only the invoice number and invoice item 1 are present, which is supposed to be deleted in BI.
    Invoice number: 50001168
    Invoice item : 1, quantity: 30
    When I did a full repair for this invoice number from SRM, I got all 3 invoice items. It did not overwrite invoice item 1, which is deleted in SRM. The way it works in SRM is that it doesn't delete the invoice item but flags it with the deletion indicator X.
    Now everything looks good up to the extractor (RSA3). What should I do now to fix this issue? I see that we don't have transformation rules in BI 7.0; instead we are using transfer rules and update rules. Please let me know.
    Thanks,
    Chandra

    Hi Chandu,
    The way I understand your issue is that,
    when you do a full upload, you get the information correctly in RSA3 and DSO.
    when you run a delta, you are not getting all the line items.
    1) Please check whether there are any relevant SAP notes for this extractor to fix this delta issue.
    2) Please check whether there is any code in the update rule or transfer rule that deletes the missing line item
    Thanks,
    Krishnan

  • Performance issues with 0CO_OM_WBS_1

    We use BW 3.5 and R/3 4.7 and encounter huge performance issues with 0CO_OM_WBS_1: we always have to do a full load involving approx. 15M records, even though there are on average only 100k new records since the previous load. This takes a long time.
    Is there a way to delta-enable this datasource?

    Hi,
    This DS is not delta enabled and you can only do a full load.  For a delta enabled one, you need to use 0CO_OM_WBS_6.  This works as other Financials extractors, as it has a safety delta (configurable, default 2 hours, in table BWOM_SETTINGS).
    What you could do is use WBS_6 as a delta and extract full loads for WBS_1 only for shorter durations.
    As you must have an ODS for WBS_1 at the first stage, I would suggest do a full load only for posting periods that are open.  This will reduce the data load.
    You may also look at creating your own generic data source with delta; if you are clear on the tables and logic used.
    cheers...

  • Issue with 0HRPOSITION_ATTR extractor

    We are live with BW HR application and having issue with 0HRPOSITION master data. We are extracting data from SAP using 0HRPOSITION_ATTR datasource. I noticed that data is not maintained correctly in master data tables in BW and giving us incorrect results in reporting. Consider below mentioned scenario:
    Position A created as vacant on 04/01/2006 with start date (BEGDA/ Valid from) as 04/01/2006 and End date (ENDDA/ Valid to) as 12/31/9999 and is vacant. Below mentioned entry is shown under maintain master data for 0HRPOSITION in BW:
    Position Valid To   Valid From  Position Vacant
    A        03/31/2006 01/01/1000
    A        12/31/9999 04/01/2006        X
    Position A is now delimited on 09/15/2006 as it is no longer required. In SAP, the position has a record only from 04/01/2006 till 09/15/2006 as vacant. When the record is extracted into BW, it creates the entries below in the master data table.
    Position  Valid To    Valid From  Position Vacant
    A         03/31/2006  01/01/1000
    A         09/15/2006  04/01/2006        X
    A         12/31/9999  09/16/2006        X
    The entry 09/16 - 12/31 is incorrect, as the position doesn't exist for this duration. If we report on 0HRPOSITION with key date 09/30/2006, it shows position A as vacant even though the position no longer exists.
    Has anyone come across this situation. any help is greatly appreciated.
    Thanks,
    Milind

    Hi
    I have not worked with this DataSource, but:
    1) Is your DataSource delta enabled?
    2) Try running RSA3 and check the record which is not coming through correctly. If the record is fine in RSA3, then check the calculations (if any) in the update/transfer rules.
    Let me know if it helps..
    Sorabh
    Assign points if it helps.
    Message was edited by: Sorabh Arora
