APO-DP Datasource - Delta

We have a requirement to enable delta for a DataSource that is based on a planning area, but it seems SAP has deactivated this functionality for this kind of DataSource...
Has anyone tried to enable delta for a planning-area-based DataSource? If so, can you please share your experience?
Thanks,
Raman

We are using full loads, as SAP said a delta would take more time than a full load...
The delta functionality may have been improved in later versions. Check the OSS notes/help.

Similar Messages

  • APO- BI Datasource from Planning Area

    Hi All,
I need help with an APO-BI DataSource generated from a Planning Area.
    In the Dev environment we had two clients:
    DCLNT020 (Holds APO part) DCLNT010 (Holds BI workbench).
    So a datasource was generated from the Planning area in DCLNT020 --> it was replicated in DCLNT010 --> data from Planning Area was extracted to BI cube using this.
    Now we transported this datasource to the Test environment which has only one client (TCLNT010). I maintained the Source to target mapping there such that DCLNT020 -- TCLNT010 and DCLNT010 -- TCLNT010.
    However the Transport fails and the error message is:
    Cannot replicate DataSource
    Errors occurred during post-handling RS_AFTER_IMPORT for ISFS L
    If I go to the Test system and try to generate the transported Datasource directly from the Planning area again, it says this DataSource already exists. However I cannot see this datasource in the system even after replicating and refreshing multiple times.
Please provide your inputs as to what might be wrong and what I need to do to solve this.
    TIA
    Amrita

    Hi   Amrita Goswami
Based on the above post, it seems you maintain two clients in Dev (one for creation and another for testing), while the Test environment has only one client; maintaining the DataSource in a single client should not by itself cause any problem.
    Based on the error
> Cannot replicate DataSource
> Errors occurred during post-handling RS_AFTER_IMPORT for ISFS L
There could be two reasons:
1) You need to replicate the DataSource once you have imported it into the Test environment, and then run the program RSDS_DATASOURCE_ACTIVATE_ALL, giving the source system and DataSource name, if it is BI 7.0.
If it is 3.x, you have to execute the program RS_TRANSTRU_ACTIVATE_ALL, specifying the transfer structure name.
2) RS_AFTER_IMPORT errors are in some cases caused by an improper transport of the update rules.
The solution would be to recollect the transports: release the DataSource transport first, execute the activities in (1), and then transport the rest. You can also verify directly whether the DataSource exists in the Test system; see the sketch below.
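A minimal check sketch, assuming a BI 7.0 system where DataSources are stored in table RSDS; the DataSource and source system names below are placeholders:

    " Check whether the DataSource exists in the target system.
    " Assumption: BI 7.0, table RSDS (key DATASOURCE, LOGSYS, OBJVERS).
    DATA lv_objvers TYPE rsds-objvers.

    SELECT SINGLE objvers FROM rsds
      INTO lv_objvers
      WHERE datasource = '9ADP_MY_PLAREA'   "placeholder DataSource
        AND logsys     = 'TCLNT010'         "placeholder source system
        AND objvers    = 'A'.               "active version

    IF sy-subrc = 0.
      WRITE / 'DataSource exists in active version.'.
    ELSE.
      WRITE / 'No active version found - replicate and activate first.'.
    ENDIF.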
    Hope its clear a little..!
    Thanks
    K M R

  • Red Traffic Light for Datasources Delta Queues

    Hi all,
    Our R/3 source system is connected to our BW system.
Between the two systems we set up delta queues for the standard logistics DataSources, by filling the setup tables and performing an init update.
The DataSources' delta queues were created and have been in use for over a month (in RSA7 they were all marked with a green traffic light).
Now we have copied the R/3 source system to a new one.
After doing so, all the delta queue traffic lights turned red.
Can anyone provide a technical explanation for this problem?
Also, is there something we can do to "save" these delta queues, without needing to delete them and set them up all over again?
    Thanks ahead,
    Shani Bashkin

    Hi Eddo,
    Thanks for your help.
Yes, I'm using the same system name and system ID: the newly copied system has the same name and ID as the productive system.
    Also, it seems like the RFC connection to the BW is lost.
    The question is why?
    Thanks,
    Shani Bashkin

  • Master datasource delta problem.

    Hello,
We are in the process of implementing the logistics module for our client. We are on BI 7 and using the BI 7 data flow for master and transaction data.
While working with the delta-capable master data DataSources, we have faced the following issue. These DataSources are all based on ALE change pointers, and somehow we are getting the same records in every delta load.
E.g. 0CUSTOMER_ATTR: the first delta load brought 1200 records, and since that day every delta load has been bringing 1000+ records, even if I start the next delta one minute later. I am quite confident that not much is changing in ECC; the extractor is reading from the change pointer table but not updating that table once the extraction is completed.
When we dug into this issue, we realized that the change pointers are somehow not being flagged as processed in the change pointer table. We haven't changed anything in the standard extractor, but still this happens with all delta-capable master data DataSources.
Can you please help me solve this issue?
    Regards,

    Hi Maasum
    Not sure whether the below will help.. but give it a try:
    /people/simon.turnbull/blog/2010/04/08/bw-master-data-deltas-demystified
    Thanks
    Kalyan
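One more thing worth checking on the ECC side is whether the change pointers are piling up unprocessed. A minimal check sketch, assuming change pointers are held in table BDCP2 (older releases use BDCP/BDCPS) with PROCESS = 'X' marking processed pointers; the message type is a placeholder:

    " Count open vs. processed change pointers for one message type.
    " Assumption: table BDCP2, fields MESTYPE and PROCESS - verify
    " on your release. The message type below is a placeholder.
    DATA: lv_open TYPE i,
          lv_done TYPE i.

    SELECT COUNT(*) FROM bdcp2 INTO lv_open
      WHERE mestype = 'ZBW_CUST'   "placeholder message type
        AND process = ' '.

    SELECT COUNT(*) FROM bdcp2 INTO lv_done
      WHERE mestype = 'ZBW_CUST'
        AND process = 'X'.

    WRITE: / 'Unprocessed pointers:', lv_open,
           / 'Processed pointers:  ', lv_done.

If the unprocessed count only ever grows, the extractor is not flagging the pointers after extraction; transactions BD61 (global activation) and BD50 (activation per message type) are also worth a look.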

  • 0FI_AR_4 Datasource, Delta

    Hi Experts,
we are using the 0FI_AR_4 DataSource, which is delta-enabled, but the problem is that we can run the delta only once a day.
Can anyone please let me know how to change this so that I can run the delta more than once a day?
Any document or link would be of great help.
Thanks in advance.
    Ananth

    hi Ananth,
take a look at Note 991429 (Minute Based extraction enhancement for 0FI_*_4 extractors):
    https://websmp203.sap-ag.de/~form/handler?_APP=01100107900000000342&_EVENT=REDIR&_NNUM=991429&_NLANG=E
    Symptom
    You would like to implement a 'minute based' extraction logic for the data sources 0FI_GL_4, 0FI_AR_4 and 0FI_AP_4.
    Currently the extraction logic allows only for an extraction once per day without overlap.
    Other terms
    general ledger  0FI_GL_4  0FI_AP_4  0FI_AR_4  extraction  performance
    Reason and Prerequisites
1. There is a huge volume of data to be extracted daily from FI to BW, and this requires a lot of time.
2. You would like to extract the data at more frequent intervals during the day, e.g. 3-4 times a day, without re-extracting all the data you have already extracted that day.
Where there is a huge volume of data, extracting once a day takes a lot of time. Minute-based extraction splits the extraction into convenient intervals that can be run multiple times a day. Each extraction then moves less data, so it runs more efficiently; this should also reduce the risk of extractor failures caused by large data volumes in the system.
    Solution
    Implement the relevant source code changes and follow the instructions in order to enable minute based extraction logic for the extraction of GL data. The applicable data sources are:
                            0FI_GL_4
                            0FI_AR_4
                            0FI_AP_4
All changes below have to be implemented first in a standard test system. The new extractor logic must be tested very carefully before it is used in a production environment. Test cases must include all relevant processes that would be carried out in the normal course of extraction.
    Manual changes are to be carried out before the source code changes in the correction instructions of this note.
    1. Manual changes
a) Add the following parameters to the table BWOM_SETTINGS (see the sketch at the end of this note for the same entries created programmatically):
                         MANDT  OLTPSOURCE    PARAM_NAME          PARAM_VALUE
                         XXX                  BWFINEXT
                         XXX                  BWFINSAF            3600
              Note: XXX refers to the specific client (e.g. 300) under use/test.
              This can be done using transaction SE16 for table BWOM_SETTINGS:
                          Menu --> Table Entry --> Create
                          --> Add the above two parameters one after another
b) To the views BKPF_BSAD, BKPF_BSAK, BKPF_BSID, and BKPF_BSIK,
                       add the following field under the view fields:
                           View Field  Table    Field      Data Element  DType  Length
                           CPUTM       BKPF    CPUTM          CPUTM      TIMS   6
                           This can be achieved using transaction 'SE11' for views
                           BKPF_BSAD, BKPF_BSAK , BKPF_BSID , BKPF_BSIK (one after another)
                               --> Change --> View Fields
                               --> Add the above mentioned field with exact details
c) For the table BWFI_AEDAT, add the field AETIM to index-1 for extractors
                       (in addition to the existing MANDT, BUKRS, and AEDAT)
                       and activate this non-unique index on all database systems (or at least on the database in use).
                       This can be achieved using transaction SE11 for table BWFI_AEDAT
                           --> Display --> Indexes --> Index-1 for extractors
                           --> Change
                           --> Add the field AETIM in the last position (after the AEDAT field)
                           --> Activate the index on the database
    2. Implement the source code changes as in the note correction instructions.
3. After implementing the source code changes via SNOTE, add the following parameters to the respective function modules and activate them.
    a) Function Module: BWFIT_GET_TIMESTAMPS
                        1. Export Parameter
                        a. Parameter Name  : E_TIME_LOW
                        b. Type Spec       : LIKE
                        c. Associated Type : BKPF-CPUTM
                        d. Pass Value      : Ticked/checked (yes)
                        2. Export Parameter
                        a. Parameter Name  : E_TIME_HIGH
                        b. Type Spec       : LIKE
                        c. Associated Type : BKPF-CPUTM
                        d. Pass Value      : Ticked/checked (yes)
    b) Function Module: BWFIT_UPDATE_TIMESTAMPS
                        1. Import Parameter (add after I_DATE_HIGH)
                        a. Parameter Name  : I_TIME_LOW
                        b. Type Spec       : LIKE
                        c. Associated Type : BKPF-CPUTM
                        d. Optional        : Ticked/checked (yes)
                        e. Pass Value      : Ticked/checked (yes)
                        2. Import Parameter (add after I_TIME_LOW)
                        a. Parameter Name  : I_TIME_HIGH
                        b. Type Spec       : LIKE
                        c. Associated Type : BKPF-CPUTM
                        d. Optional        : Ticked/checked (yes)
                        e. Pass Value      : Ticked/checked (yes)
    4. Working of minute based extraction logic:
The minute-based extraction considers the time of each document (in addition to the date of the changed or new document, as in the earlier logic) when selecting data. The code is modified so that it evaluates the new flags in the BWOM_SETTINGS table (BWFINEXT and BWFINSAF); without these flags set, the earlier extraction logic remains in effect.
The safety interval now depends on the flag BWFINSAF (in seconds; default 3600 = 1 hour). It is meant to catch documents whose posting is delayed, for example by slow update modules. There is also specific coding that writes an entry to BWFI_AEDAT with the details of any document that failed to post within the safety limit, so that such documents are at least extracted as changed documents if they were missed as new ones. If a large number of documents fail to post within the safety limit, the volume of BWFI_AEDAT grows correspondingly.
The flag BWFINSAF can be set to a specific value (in seconds, but at least 3600 = 1 hour) to match particular requirements, e.g. 24 hours = 24 * 3600 = 86400 seconds. With the new logic switched on (BWFINEXT = 'X'), the flags BWFIOVERLA, BWFISAFETY, and BWFITIMBOR are ignored, while BWFILOWLIM and DELTIMEST work as before.
As per the instructions above, index-1 on table BWFI_AEDAT now includes the field AETIM, which lets the new logic select faster, since AETIM is also evaluated. The field can be removed from the index if the standard logic is restored.
With the new extractor logic implemented, you can switch back to the standard logic at any time by resetting BWFINEXT from 'X' to ' ' and extracting as before. Ensure, however, that no extraction is running (for any of the 0FI_*_4 extractors/DataSources) while switching.
As with the earlier logic, to restore a previous timestamp in table BWOM2_TIMEST (in order to re-extract the data of a previous run), LAST_TS can be set to the previous extraction timestamp, provided no extraction is currently running for that particular extractor or DataSource.
With an increased extraction frequency (say 3 times a day), each extraction moves less data and therefore takes less time.
You should optimize the interval between extractor runs by testing which intervals give the best performance. A definite recommendation cannot be given, as this varies from system to system, depending on the data volume, the number of postings per day, and other factors.
To switch on the new logic, set BWFINEXT to 'X'; reset it to ' ' to revert. Make this change only when no extractions are running, considering all the points above.
              With the new minute-based extraction logic switched on:
a) Ensure that BWFI_AEDAT index-1 is enhanced with AETIM and is active on the database.
b) Ensure that BWFINSAF is at least 3600 (1 hour) in BWOM_SETTINGS.
c) Maintain an optimal value of DELTIMEST (the recommended/default value is 60).
d) Perform proper functional and performance testing in a standard test system whose settings and data match production, and move the changes to production only once all results are positive.
    http://help.sap.com/saphelp_bw33/helpdata/en/af/16533bbb15b762e10000000a114084/frameset.htm
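For step 1a, the two entries can also be created programmatically instead of via SE16. A minimal sketch using the BWOM_SETTINGS columns listed in the note (MANDT, OLTPSOURCE, PARAM_NAME, PARAM_VALUE); run it in the client you are testing in:

    " Create the two BWOM_SETTINGS entries from note 991429.
    " Column names are taken from the note; OLTPSOURCE stays blank.
    DATA ls_setting TYPE bwom_settings.

    ls_setting-mandt       = sy-mandt.
    ls_setting-param_name  = 'BWFINEXT'.
    ls_setting-param_value = ''.
    INSERT bwom_settings FROM ls_setting.

    ls_setting-param_name  = 'BWFINSAF'.
    ls_setting-param_value = '3600'.
    INSERT bwom_settings FROM ls_setting.

    COMMIT WORK.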

  • BW - APO Datamart datasource migration issue

    Hi,
Question: Can we migrate DataSources exported as a data mart from an APO-DP system into BW from 3.x to 7.0, or does it have to stay a 3.x DataSource with transfer rules, transfer structure, update rules, etc.?
1) I know we cannot migrate data marts beginning with 8* or //8, but this data mart is replicated from our APO-DP system and begins with 9*.
    Thanks .
    HS

    Hi Geetanjali,
    Thanks for the reply.
My problem is that I am getting the error message "No transfer structure available for infosource 9* in source system" when I run the InfoPackage to load into the PSA (which is the default for 7.0 DataSources).
This is the data flow I want to have, which replaces the old model, as shown:
    NEW 7.0 dataflow : Datasource -->infopkg --> psa --> dtp -->transformation --> cubes ( same source feeds 3 cubes ) .
    OLD 3.x dataflow : DS >Infosource>Transfer rules--> infopkg --> update rules --> cubes.
    Thanks .
    HS

  • APO - MALORE datasource

    Hi
Can someone explain the logic of extracting data from the APO DataSource 9AMALORE?
I really need an explanation of how this extractor works. The problem relates to "Labour resources" and "Machine resources".
When I check a material in the APO planning book, I can see data for both resources:
    Example: Key figure (standard 9AFPROD)
    In APO for cal/month: 11.2010
    Labour resource      16.000
    Machine resource     16.000
However, in the BW report:
Labour resource      16.000
no data exists for the Machine resource.
When checking the extractor in APO, the same picture appears: only the Labour resource contains data (16.000); the Machine resource is 0.
Can anyone explain?
    Br
    Thomas

    Hi Thomas,
I notice this post is now several years old, but did you obtain an offline answer to your question? I have observed something similar and am also looking for details of how this extractor works for a quantity-related key figure.
    Many thanks,
    Daniel.

  • Generic Datasource Delta Settings Change

    Hi Experts,
Currently the delta field of our generic DataSource is set to timestamp (local), and the upper limit is blank.
I need to change the delta settings to timestamp (UTC), or to set the upper limit to 1000 seconds.
Can anyone tell me how to change this in production? Do I need to make the change in Dev and transport it to production?
If I change the generic DataSource, do I have to initialise the delta again?
Please provide a step-by-step solution.
    Regards,
    Anand

Ananda, what about the delta initialization? After transporting the DataSource from R/3 Dev to R/3 production, I have to replicate the DataSource, right?
In that case, what will happen to my delta?
One more thing: in case I need to change only the upper limit value, do I also have to change it in Dev and transport it to production?
    Regards,
    Anand
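Before and after such a change it helps to note down the current delta pointer. A minimal check sketch, assuming the generic-delta status per DataSource is held in table ROOSGENDLM in the source system (verify this table and its role on your release; the DataSource name is a placeholder):

    " Read the stored generic-delta status for a DataSource.
    " Assumption: table ROOSGENDLM, keyed by OLTPSOURCE - verify
    " on your release. The DataSource name is a placeholder.
    DATA ls_dlm TYPE roosgendlm.

    SELECT SINGLE * FROM roosgendlm
      INTO ls_dlm
      WHERE oltpsource = 'ZMY_GENERIC_DS'.   "placeholder

    IF sy-subrc = 0.
      WRITE / 'Delta entry found - note the pointer before changing.'.
    ELSE.
      WRITE / 'No delta entry - delta not yet initialised.'.
    ENDIF.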

  • FI datasources delta query

    Hi ALL
I want to know the difference between delta in FI and delta in LO.
1. I know that LO DataSources send after and before images (ABR) and can feed an ODS or a cube.
2. I know that delta-enabled FI DataSources send an after image only, and in that case cannot feed a cube directly; they need an ODS in between.
3. I know that the RSA7 queue holds the delta records. This is true for both FI and LO DataSources.
But in the case of the LO cockpit, we select queued delta/direct delta options, and we run LBWQ collector jobs to push data into the RSA7 queue.
What needs to be done for FI DataSources? I do see them in the RSA7 queue. Does this mean they pull data directly, so we don't need to run any jobs for FI DataSources?
Please clarify my doubts.
    SP

    Hi
Why don't you take a look at this link?
http://help.sap.com/saphelp_nw04/helpdata/en/af/16533bbb15b762e10000000a114084/content.htm
This should answer your question.
    Prakash

  • Generic Datasource Delta upload Issue

    Hi all,
I have created a generic DataSource for the Solution Manager ODS 0crm_process_srv_h, which contains only these fields: the user status, transaction number, transaction description, and the GUID of the CRM Order object (the ODS key field).
I have taken the transaction number as the delta field, treated as a timestamp, but the delta upload is not happening. If I use the GUID or the status instead, the delta upload is not happening either. Please suggest the best field to use for the delta upload.
Thanks and Regards,
SGK.

    Hi,
in your case you need to create a function module as the extractor. Create an extract structure as well and add a timestamp field (let's call it ZDELTAHELP) to be used as the delta-relevant field of the DataSource. After a successful init, the last extraction timestamp will be passed to the FM, and you can select all entries for orders created and/or changed from that time on, as well as the deleted ones (for those you need to read the change documents as well). For more information about generic extraction using a function module, search the forum or check out my business card; there you will find a link to a weblog about this issue. A minimal skeleton of such an extractor is sketched below.
    regards
    Siggi
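To make that concrete, here is a minimal skeleton of such a delta-capable extractor, modelled on RSAX_BIW_GET_DATA_SIMPLE. The extract structure ZBW_S_ORDER, its delta field ZDELTAHELP, and the source table ZORDERS are assumptions; the interface follows the standard simple-extractor pattern:

    FUNCTION z_bw_get_orders.
    *"  IMPORTING
    *"     VALUE(I_REQUNR) TYPE  SRSC_S_IF_SIMPLE-REQUNR
    *"     VALUE(I_DSOURCE) TYPE  SRSC_S_IF_SIMPLE-DSOURCE OPTIONAL
    *"     VALUE(I_MAXSIZE) TYPE  SRSC_S_IF_SIMPLE-MAXSIZE OPTIONAL
    *"     VALUE(I_INITFLAG) TYPE  SRSC_S_IF_SIMPLE-INITFLAG OPTIONAL
    *"  TABLES
    *"      I_T_SELECT TYPE  SRSC_S_IF_SIMPLE-T_SELECT OPTIONAL
    *"      I_T_FIELDS TYPE  SRSC_S_IF_SIMPLE-T_FIELDS OPTIONAL
    *"      E_T_DATA STRUCTURE  ZBW_S_ORDER OPTIONAL
    *"  EXCEPTIONS
    *"      NO_MORE_DATA
    *"      ERROR_PASSED_TO_MESS_HANDLER

      STATICS: s_cursor TYPE cursor,
               s_first  TYPE abap_bool VALUE abap_true.
      DATA ls_select TYPE srsc_s_select.

      IF i_initflag = 'X'.        "initialisation call - nothing to do
        RETURN.
      ENDIF.

      IF s_first = abap_true.
        s_first = abap_false.
        " After a successful init, BW passes the delta window on the
        " delta-relevant field (ZDELTAHELP) via I_T_SELECT.
        READ TABLE i_t_select INTO ls_select
             WITH KEY fieldnm = 'ZDELTAHELP'.
        IF sy-subrc <> 0.
          ls_select-low = '00000000000000'.   "full load - no restriction
        ENDIF.
        OPEN CURSOR WITH HOLD s_cursor FOR
          SELECT * FROM zorders               "assumed source table
            WHERE zdeltahelp >= ls_select-low.
      ENDIF.

      FETCH NEXT CURSOR s_cursor
        INTO CORRESPONDING FIELDS OF TABLE e_t_data
        PACKAGE SIZE i_maxsize.
      IF sy-subrc <> 0.
        CLOSE CURSOR s_cursor.
        RAISE no_more_data.
      ENDIF.
    ENDFUNCTION.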

  • CRM Datasource-Delta not Working

    Folks,
I am using the CRM standard DataSource 0CMS_RTCM_TRAN (Resale Tracking and Claims Management data).
I have enhanced this DataSource with four fields and transported it to the Quality system.
There is no problem with the new additions; the DataSource extracts properly.
But the delta update is not working: the delta queue is not picking up any records, even though new transactions are being updated in the respective source tables.
When I checked in BWA1, I got these messages:
Error message:
(121) MS_RTCM_TRAN: Table SMOXHEAD contains no BDOC name.
Warning messages:
(111) MS_RTCM_TRAN: No mapping found for extract field SLSR_ADDRESS_I, and the same "no mapping" warning for all the other DataSource fields.
It works fine with full update, but for delta update I am facing these issues.
Can you please explain what could be the reason for this and suggest a proper solution?
    Thanks and Regards,
    Yog

    Hi,
Are there any transactions to change the BDoc name for standard DataSources?
In BWA1, when I tried to change these adapter settings, I got this message:
"DataSource 0CMS_RTCM_TRAN cannot be edited with this transaction".
I hope that if I specify the BDoc name, the extraction will happen,
so please suggest where I can perform these changes.
Note: currently no extraction is happening because of this.
    Regards,
    Yoga

  • Customized SD Datasource deltas

    Hi all,
I have a requirement to append some fields to the Billing Item DataSource. These fields generally come from other VB** tables.
If I add them and then write an exit in EXIT_SAPLRSAP_001, is that enough?
Or will I have an issue with deltas for changes to those appended fields, as described in OSS note 576886?
Will I have to write complex code to build the before and after images for all the fields?
Does anyone have more detailed steps towards a solution?

    Hi,
a generic DataSource is one you create yourself via transaction RSO2.
A generic DataSource can be based on a table, a view, or an extract structure.
In the latter case you also have to write a function module (the extractor); you can see an example in SE37, FM RSAX_BIW_GET_DATA_SIMPLE.
You can delta-enable your generic DataSource; the challenge is to find a suitable field. The easiest case is when your source table has a timestamp field telling you which records have been changed; otherwise a calendar day or a sequential number is also supported. In your case, since you don't have any information about when the document was changed, you will have to use at least a view joining VBPA and VBUK (the header) in order to get the "changed on" date, and base your delta on this field.
This means that you will potentially extract duplicate information (if you request a delta twice during the same day); therefore you will have to model an ODS with overwrite update, so that this data is not cumulated.
Hoping this will shed some light,
Olivier.
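For completeness on the append/user-exit route from the original question: a minimal sketch of the include ZXRSAU01 called from EXIT_SAPLRSAP_001, assuming DataSource 2LIS_13_VDITM with extract structure MC13VD0ITM and an appended field ZZKUNNR (structure, field, and parameter names are assumptions to verify in your system). Note that, per OSS note 576886, a change to only such appended fields does not by itself trigger a delta record:

    " Include ZXRSAU01 - transaction data user exit.
    " Assumptions: DataSource 2LIS_13_VDITM, extract structure
    " MC13VD0ITM, appended field ZZKUNNR (ship-to party).
    DATA: ls_vditm TYPE mc13vd0itm,
          ls_vbpa  TYPE vbpa.

    CASE i_datasource.
      WHEN '2LIS_13_VDITM'.
        LOOP AT c_t_data INTO ls_vditm.
          " Fill the appended field from the partner table VBPA.
          SELECT SINGLE * FROM vbpa INTO ls_vbpa
            WHERE vbeln = ls_vditm-vbeln
              AND posnr = ls_vditm-posnr
              AND parvw = 'WE'.            "ship-to party
          IF sy-subrc = 0.
            ls_vditm-zzkunnr = ls_vbpa-kunnr.
            MODIFY c_t_data FROM ls_vditm.
          ENDIF.
        ENDLOOP.
    ENDCASE.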

  • Generic datasource delta enabling  against S973 & S966

Hello gurus,
I am working on BW 3.5.
I have to create a generic DataSource for warranty FOC against sales.
For that I need to create a DataSource using the S966 and S973 views.
Could you please tell me
on which field I should enable delta in the generic DataSource,
what I should select as the delta-specific field,
which radio button to choose:
Time Stamp,
Calendar Day, or
Numeric Pointer,
and what safety interval upper and lower limits to use for the same.
Points will be awarded as my gesture; thanks for your efforts.
Regards

    Hi,
The following are the criteria to consider when delta-enabling a generic DataSource; see the sketch after this list for an illustration of the timestamp case.
Delta-Specific Field:
Time Stamp - If you are extracting data for critical modules like FICO, go for a timestamp.
Calendar Day - If data is extracted from modules that do not have huge posting volumes in a day, like Sales or Delivery/Billing, go for calendar day.
Numeric Pointer - For sequentially generated records, as in GLPCA, go for a numeric pointer.
Safety Interval (Upper & Lower Limits):
(1) If the delta field is a date (record creation or change date), use an upper limit of 1 day. This loads the delta into BW as of yesterday. Leave the lower limit blank.
(2) If the delta field is a timestamp, use an upper limit of 1800 seconds (30 minutes). This loads the delta into BW as of 30 minutes ago. Leave the lower limit blank.
(3) If the delta field is a numeric pointer, i.e. a generated record number as in the GLPCA table, use the lower limit with a count of 10-100, and leave the upper limit blank. If the value 10 is used, the last 10 records will be loaded again.
If a record is created while a load is running, it may get lost. To prevent this, the lower limit can be used to back up the starting sequence number.
This may result in some records being processed more than once; therefore, make sure such a DataSource only feeds an ODS object.
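A small illustration of case (2): with a timestamp delta field and an upper limit of 1800 seconds, the selection window runs from the last stored pointer up to "now minus 30 minutes". A sketch of the arithmetic only (the stored pointer value is an assumption, not read from the real delta tables):

    " Illustrate how a 1800-second upper safety limit shifts the
    " delta selection window. Arithmetic only - not extractor code.
    DATA: lv_now  TYPE timestamp,
          lv_high TYPE timestamp,
          lv_last TYPE timestamp VALUE '20240101120000'.  "assumed pointer

    GET TIME STAMP FIELD lv_now.

    " Upper bound = now minus the safety interval; records newer than
    " this are left for the next delta run.
    lv_high = cl_abap_tstmp=>subtractsecs( tstmp = lv_now
                                           secs  = 1800 ).

    WRITE: / 'Selection low  =', lv_last TIME ZONE 'UTC',
           / 'Selection high =', lv_high TIME ZONE 'UTC'.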
    Regards,
    Balaji V

  • Generic DataSource & Delta setup

    Hello Folks,
I am trying to get my generic DataSource and delta concepts straight, and got more confused after reading a white paper titled "How to... create generic delta". I am on SAP R/3 4.6C and BW 3.0B.
This white paper talks about creating a generic DataSource and delta using RSO2 in BW. It explains how to specify the delta-specific field and then select a delta type (AIE or ADD), which sets the "delta update" flag after generation.
My first question:
1) Should we create a generic DataSource and delta from BW, as suggested by this white paper, rather than directly in SAP R/3 using RSO2 and RSA3? What are the pros and cons?
I tried creating a generic DataSource in SAP R/3 using RSO2, since these transactions are available in SAP R/3. I noticed that the "delta update" checkbox was always (at least in the cases I tried) blank and protected. I also noticed that a menu item under "DataSource" called "Setup delta" was greyed out. So it looks like a delta can be set up in SAP R/3, but the system does not let me do it.
My second question:
2) Why can't I set up a delta?

    Hi Jay,
First, you do not create a DataSource from BW. A DataSource is created in R/3; you replicate it to BW afterwards so that you can create/map Transfer Rules for it.
DataSource creation happens in R/3 only. You can of course set up BW as a source of data for other systems, but that is something else.
A generic delta is based only on a timestamp, a numeric pointer, or a calendar day. The "delta update" checkbox really is greyed out for generic DataSources, because once you specify that your generic DataSource is delta-capable using any of the three options above, the checkbox is checked automatically.
Which "Setup delta" are you referring to that you cannot perform?
--Jkyle

  • Datamart - Generate Datasource DELTA flag check

We are on BW 3.5 (source system) and trying to send data across to a target system (SCM / BI 7.0).
When I perform "generate DataSource" for a cube, the resulting 8* DataSource is generated correctly, but with the delta flag checked (transaction RSA6) for that DataSource.
With this flag set, we are forced to run an init and then deltas from the target system to extract the data.
Questions:
1) How do we uncheck the delta flag setting? Is it possible?
2) If we cannot uncheck the flag, how do we run full loads from our target system?
Thanks in advance.
    Hitesh

Hi,
Even if the DataSource is delta-enabled, that only means you can run delta loads for it; it does not mean that you cannot run full loads.
To uncheck the delta flag for a DataSource, you would have to disable the delta method assigned to that particular DataSource (the delta methods themselves are defined in table RODELTAM). This can be done with a custom ABAP program; see the check sketch below.
Hope it helps!
    Regards,
    Rose.
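For question 1, a minimal check sketch first (not an update program), assuming the delta method of a DataSource is recorded in field DELTA of table ROOSOURCE in the source system, with the methods themselves defined in RODELTAM; verify this on your release before changing anything:

    " Show the delta method assigned to an export DataSource.
    " Assumption: ROOSOURCE-DELTA holds the delta method key defined
    " in RODELTAM. The DataSource name is a placeholder.
    DATA lv_delta TYPE roosource-delta.

    SELECT SINGLE delta FROM roosource
      INTO lv_delta
      WHERE oltpsource = '8ZMYCUBE'   "placeholder export DataSource
        AND objvers    = 'A'.

    IF sy-subrc = 0.
      WRITE: / 'Delta method:', lv_delta.
    ELSE.
      WRITE / 'DataSource not found in active version.'.
    ENDIF.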
