Data Load in Production

Hi Gurus!
I need to perform the data load activities for the years 2007, 2008, 2009 & 2010. I have already performed deletion & filling of the setup tables, but the load is pulling 0 records into BI.
Please reply at the earliest; it is very critical. Points will be assigned when it gets resolved. Thanks.
With Regards
Rekha

Hi,
Since you are loading data for 2007, 2008, 2009 & 2010, I assume this is a selective init load by year.
Check the R/3 - BW connection.
Check the data in RSA3.
Replicate the DataSource.
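To verify that the setup-table fill actually produced data before testing in RSA3, you can also check the record count of the setup table itself. A minimal sketch (assuming, purely for illustration, the SD item DataSource 2LIS_11_VAITM, whose setup table is MC11VA0ITMSETUP; substitute your own application's setup table):

    * Read-only sketch: verify the setup table contains records after the
    * restructuring run. MC11VA0ITMSETUP (for 2LIS_11_VAITM) is an assumption
    * here; use the setup table of your own application.
    DATA lv_cnt TYPE i.
    SELECT COUNT( * ) INTO lv_cnt FROM mc11va0itmsetup.
    WRITE: / 'Records in setup table:', lv_cnt.

If this count is 0, the setup run wrote nothing, which would explain RSA3 returning 0 records.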
Regards,
Sultan.

Similar Messages

  • Data load in Production Issue

    Hi All!
    1) Loaded data from DataSource 2LIS_11_VAITM to the PSA successfully.
    2) While loading from the PSA to the data target, the activation fails due to invalid characters.
    I have a few options:
    1) It has 200 data packets; correcting them manually is not possible due to the huge volume.
    2) Add the invalid character in RSKC. I don't see any other approach.
    3) Writing a routine is also not possible, as this is high priority.
    Is the RSKC approach OK to use? Please advise.
    With Regards
    Rekha

    Hi Rekha,
    I faced the same issue earlier and resolved it with RSKC.
    I think you can go ahead with option 2); not a problem at all.
    Coding or an FM is a somewhat time-consuming approach for such a small issue (though for completeness, a routine sketch follows below).
    Thanks-RK
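    For reference, if a cleansing routine does become worthwhile later, a minimal transfer-routine sketch could look like this (the RESULT output field and the character whitelist are illustrative assumptions, not the actual RSKC setting):

        * Minimal sketch of a cleansing routine: blank out any character that
        * is not in an allowed set. RESULT is assumed to be the routine's
        * CHAR-type output field; the whitelist is illustrative only.
        CONSTANTS gc_allowed(41) TYPE c VALUE
          ' ABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789-_./'.
        DATA: lv_len TYPE i,
              lv_off TYPE i.
        lv_len = strlen( result ).
        DO lv_len TIMES.
          lv_off = sy-index - 1.
          " CN is true if the character is NOT contained in gc_allowed
          IF result+lv_off(1) CN gc_allowed.
            result+lv_off(1) = ' '.
          ENDIF.
        ENDDO.

    The same idea can be moved into a start routine if whole packages need cleansing.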

  • Master Data Load to APO from SAP MDM through SAP PI

    Hi SDners,
    This is a parts master data solution for one of the largest auto manufacturers, where SAP MDM will be the central hub for all the global and local attributes of parts. The subscribing system is SAP SCM APO. Please advise whether there is any possibility of direct integration between APO and MDM through PI. Does PI have standard IDoc types to communicate with APO?
    Also, does APO have an RFC/BAPI to perform the master data load for the Product Master (/n/sapapo/mat1)?
    Thanks,
    Prabuddha

    Hi,
    Check the LUWs in SM58 in the source system and then execute them. Otherwise, check in BD87 and push the IDocs through manually in ECC.
    Thanks
    Reddy

  • Most common BW data load errors in production and how to solve them ..

    Hi All,
    Most common BW data load errors in production and how to solve them...
    Is there any doc on this? If so, please send it across to this ID: [email protected]
    Thanks in advance.
    Rgrds
    shoba

    hi
    1) RFC connection lost.
    2) Invalid characters while loading.
    3) ALEREMOTE user is locked.
    4) Lowercase letters not allowed.
    5) While loading data, a message starting 'Record ...': the field mentioned in the error message is not mapped to any InfoObject in the transfer rules.
    6) Object locked.
    7) "Non-updated IDocs found in source system".
    8) While loading master data, one of the data packages has a red-light error message: master data/text of characteristic 'so and so' already deleted.
    9) Extraction job aborted in R/3.
    10) Request could not be activated because there is another request in the PSA with a smaller SID.
    11) Repeat of last delta not possible.
    12) DataSource not replicated.
    13) DataSource/transfer structure not active.
    14) IDoc or tRFC error.
    15) ODS activation error.

  • Loading master data in BI production

    Dear Experts,
    While loading master data in BI production, we are facing an error when updating IDocs.
    MONITOR (Administrator Workbench)
    STATUS:
    ERROR: Error when updating IDocs in the source system.
    DIAGNOSIS: Errors have been reported in the source system during the IDoc update.
    SYSTEM RESPONSE: Some IDocs have error status.
    PROCEDURE: Check the IDocs in the source system. You do this using the extraction monitor. How you resolve the errors depends on the error message you get.
    So can anyone tell me why this type of error occurs? What could be the reason behind it?
    Thanks in advance.
    Warm Regards,
    Bunty.

    Hi Bunty,
    Your IDoc error is a generic one, so the first step is to find the concrete error behind it. Please do the following checks and let us know the results:
    1.) Execute transactions SM58, BD87 and WE05, and tell us for each one whether any errors are shown (for SM58, see also the sketch after this message).
    2.) Do you see an error if you go to transaction SM59, select the RFC destination "Source system", and run the authorization test?
    3.) Please ensure that your remote user has the correct authorization profiles in both BW and R/3. You need to set the profiles for ALEREMOTE as follows:
       1) in BW, the remote user (usually 'ALEREMOTE') must have profile 'S_BI-WHM_RFC';
       2) in R/3, the remote user (usually 'ALEREMOTE') must have profile 'S_BI-WX_RFC'.
    This is explained in more detail in note 150315.
    Also, you must ensure that you have sufficient DIA work processes in both your BW and R/3 systems. DIA work processes are necessary for processing IDocs, and if there are not enough of them the load may hang in status 'yellow'. Note 561880 gives more information about this:
      1. Make sure there are always sufficient DIA, that is, at least 1 DIA more than all other work processes together, for example 8 DIA for a total of 15 work processes (see also note 74141).
    4.) Please identify whether the IDocs get stuck in the source system; if they cannot be processed manually in transaction SM58, please implement the correction from note 1280898.
    I hope it helps you. Best regards, Fran.
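    As a complementary read-only check on point 1, here is a minimal sketch that counts failed tRFC LUWs directly (assumption: the standard tRFC state table ARFCSSTATE, which is what SM58 displays, with status 'SYSFAIL' marking failed calls):

        * Read-only sketch: count tRFC LUWs in error state (what SM58 lists).
        * Assumes standard table ARFCSSTATE; 'SYSFAIL' marks failed calls.
        DATA lv_failed TYPE i.
        SELECT COUNT( * ) INTO lv_failed
          FROM arfcsstate
          WHERE arfcstate = 'SYSFAIL'.
        WRITE: / 'tRFC LUWs in SYSFAIL:', lv_failed.

    A non-zero count confirms stuck LUWs; SM58 then shows the error text for each one.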

  • What are the frequent data load errors in production support?

    what are the frequent data load errors in production support?

    It is a long list. Here are some of them:
    1. IDoc not arriving from the source system.
    2. Previous processes failed.
    3. Change run unsuccessful or did not run properly, leaving the master data in a mess.
    4. Invalid characters in the data.
    5. Duplicate records found.
    ...and on and on.
    Ravi Thotahdri

  • BW Production Data Loads

    Today our BW production data load failed.
    When I right-click the data target and choose Manage, I can find the request with a red traffic light, but when I try to hit the TV-screen (monitor) icon the system hangs and does not let me into the request.
    When I try to access the request from RSMON, the same thing happens, and the same again when I check the logs in the process chain for today.
    I think this is something related to a background job.
    Please help, urgent. Please give me a detailed procedure, as this is a production system.
    Thanks

    ST22 ERRORS (each row: date, time, server, user, client, ..., runtime error, program):
    21.03.2006     09:17:57     sbcpap01     SKELLY2     400     C     MESSAGE_TYPE_X               CL_GUI_CFW====================CP     1
    21.03.2006     04:08:09     sbcpcibw     ALEREMOTE     400     C     TIME_OUT               SAPLSENA     1
    21.03.2006     04:03:18     sbcpcibw     ALEREMOTE     400     C     TIME_OUT               SAPLSENA     1
    21.03.2006     03:57:22     sbcpcibw     ALEREMOTE     400     C     TIME_OUT               SAPLSENA     1
    21.03.2006     03:51:37     sbcpcibw     ALEREMOTE     400     C     TIME_OUT               SAPLSENA     1
    21.03.2006     03:45:55     sbcpcibw     ALEREMOTE     400     C     TIME_OUT               SAPLSENA     1

  • Interface for loading data into customer products

    I am trying to find out whether there is any interface for loading customer products into Install Base, apart from entering them manually. I understand that this exists in Oracle Apps version 11.5.7, but I need it for version 11.5.4.

    Hi,
    In 11.5.4, you have to write the loader yourself, using the standard Oracle APIs.
    I've done it; it works fine.
    Hugues

  • Data load from DSO to Cube in BI7?

    Hi All,
    We just migrated a dataflow from 3.5 to 7 in development and moved it to production. Until now, the data loads in production happened using InfoPackages:
    a. InfoPackage1 from the DataSource to the ODS, and
    b. InfoPackage2 from the ODS to the cube.
    After we transported the migrated dataflow to production, to load the same InfoProviders I use:
    1. An InfoPackage to load the PSA.
    2. DTP1 to load from PSA to DSO.
    3. DTP2 to load from DSO to cube.
    Steps 1 and 2 work fine, but when I run DTP2 it gets terminated. Yet when I retry step b (above), the InfoPackage loads the cube fine. So I am unable to understand why the DTP fails while the InfoPackage load succeeds. In order to use the DTP, do we need to do any cleanup when using it for the first time? Please let me know if you have any suggestions.
    Please note that the DSO already has data loaded via InfoPackage. (Is this causing the problem?)
    Thanks,
    Sirish.

    Hi Naveen,
    Thanks for the reply. The creation of a DTP is not possible without a transformation, and the transformation has been moved to production successfully.

  • Data loading after field enhancement.

    Dear all,
    We are using BI 7.00, and in one of our DataSources a new field has to be enabled. Our people are under the impression that, without downtime, the previous data already available in the target and the PSA can be given values for the new field as well.
    I cannot see how that would be possible. Experts' suggestions are required in this regard. Can you kindly provide answers to the following questions?
    1) Can the enhancement be done to the DataSource without deleting the setup tables?
    2) Can the delta queue stay as it is, without stopping the delta pull process (i.e., the process chain and the background jobs)?
    3) If the field is enhanced, can values for the field be loaded into all the data previously loaded to the PSA and the target?
    I request the experts to provide an apt solution so that the field enhancement can take place without disturbing any of the data loads.
    I went through the forum posts and found something about export DataSources and loop-back principles; these suggest that my requirement is possible, but I do not know the process. Can experts provide step-by-step suggestions for my query?
    Regards,
    M.M

    Hello Magesh,
    1) The enhancement cannot be done if there are records in the setup tables.
    2) When an enhancement is done, the delta queue also needs to be empty, so you will have to stop the collective run jobs, lock the system and empty the delta queue by scheduling the delta package twice. Only then will the transports to production go through successfully.
    3) Until you fill the setup tables again and do a historical load, the old records will show no values for the newly added field.
    If you just do an init without data transfer and schedule new delta loads, the added field will carry values from that day onward; data previously loaded to BW remains as it is. To get historical values for the newly added field, you need to load the history through full repair loads after filling the setup tables.
    Follow these steps to load only the new values for the added field:
    1) Lock the system.
    2) Schedule the collective update job through job control so that all records reach the delta queue and no LUWs are left in LBWQ for that DataSource (see the note after this message).
    3) Schedule the delta InfoPackage twice so that the repeat-delta queue is also empty.
    4) Do the transports, then delete the old init and do a new init without data transfer.
    5) Schedule the normal delta.
    To have history for the added field:
    1) Lock the system.
    2) Delete the old init and clear the LUWs from LBWQ.
    3) Do the transports.
    4) Fill the setup tables and do an init without data transfer for the DataSource.
    5) Unlock the system.
    6) Do the full repair loads to the BW data targets.
    7) Schedule the delta loads.
    Thanks
    Ajeet
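    A note on the collective update job mentioned in step 2 above: in LBWE job control this comes down to scheduling the V3 collective run report of the logistics application. A minimal sketch, assuming application 11 (the report name RMBWV311 is an assumption; verify the report name in LBWE for your own application before use):

        * Minimal sketch: trigger the collective run for logistics application 11
        * (report name RMBWV311 is an assumption; LBWE job control schedules the
        * same report). It moves LUWs from the extraction queue to the delta queue.
        SUBMIT rmbwv311 AND RETURN.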

  • Problem with Master Data Load

    Dear Experts,
    If somebody can help me with the following case, please give me some solution. I'm working on a BI 7.0 project where I needed to delete master data for an InfoObject (material). The way I did this was through tcode "S14". After that, I tried to load the master data again, but the process broke and only half the data was loaded.
    This is the error:
    Second attempt to write record 'YY99993' to /BIC/PYYYY00006 failed
    Message no. RSDMD218
    Diagnosis
    During the master data update, the master data tables are read to determine which records of the data package that was passed have to be inserted, updated, or modified. Some records are inserted in the master data table by a concurrently running request between reading the tables at the start of package processing and the actual record insertion at the end of package processing.
    The master data update tries to overwrite the records inserted by the concurrently running process, but the database record modification returns an unexpected error.
    Procedure
    • Check whether the values of the master data record with the key specified in this message are updated correctly.
    • Run the RSRV master data test "Time Overlaps of Load Requests" and enter the current request to analyze which requests are running concurrently and may have affected the master data update process.
    • Re-schedule the master data load process to avoid such situations in future.
    • Read SAP note 668466 for more information about master data update scheduling.
    On top of that, the SID table of the product master data is empty.
    Thanks for your help!
    Luis

    Dear Daya,
    Thanks for your help; I applied your suggestion. I sent an OSS message with the following details:
    We are on BI 7.0 (system ID DXX).
    While loading master data for InfoObject XXXX00001 (the main characteristic in our system, like material) we are facing the following error:
    Yellow warning "Second attempt to write record '2.347.263' to /BIC/XXXX00001 was successful"
    We are loading the master data from DataSource ZD_BW_XXXXXXX (from the APO system) through the DTP ZD_BW_XXXXX / XXX130 -> XXXX00001.
    The master data tables (S, P, X) are not updated properly.
    The following repair actions have been taken so far:
    1. Delete all related transactional and master data, checking all relations (tcode SLG1 -> RSDMD, MD_DEL).
    2. Follow the instructions from OSS note 632931 (tcode RSRV).
    3. Run report RSDMD_CHECKPRG_ALL from tcode SE38 (using both the check and repair options).
    After deleting all data the previous tests were OK, but once we load new master data the same problem appears again, and report RSDMD_CHECKPRG_ALL gives the following error:
    "Characteristic XXXX00001: error found during this test."
    The RSRV check "Compare sizes of P and X and/or Q and Y tables for characteristic XXXX00001" shows:
    Characteristic XXXX00001: tables /BIC/PXXXX00001 and /BIC/XXXXX00001 are not consistent; 351.196 deviations.
    It seems that our problem is described in OSS note 1143433 (SP13), even though we are already on SP16.
    Could somebody please help us, and let us know how to solve the problem?
    Thanks for everything,
    Luis

  • Data Load

    We are trying to load the data of 2LIS_03_BF from SAP R/3 into SAP BW.
    The following steps were followed in the process:
    1. Delete data from the inventory queue (LBWQ, MCEX03 entries).
    2. Delete the setup tables (LBWG).
    3. Check data in the extractor (RSA3); 0 records should be there.
    4. Fill the setup tables for 2LIS_03_BX (MCNB): termination date = next day, transfer structure = 2LIS_03_BX, only valuated stock (with posting block), on 14th August.
    5. Fill the setup tables for 2LIS_03_BF (OLI1BW), with the data restricted by posting date 01.01.1999-14.08.2007.
    6. Generate the initial status for 2LIS_03_BX in RSA1 (BW); done within the posting block.
    7. Collapse the data with marker update.
    8. Start/schedule the control job on the R/3 side for BF to run every 2 hours (LBWE), as suggested by an external consultant.
    9. Initialize the delta process for 2LIS_03_BF in RSA1; started on 15th August but failed due to a termination in R/3.
    10. So we started full updates in two parallel data loads into BW, 3 months at a time. Each load took 2 days to bring in 2 million records.
    11. This load of data up to 14th August 2007 finished on 4th September 2007.
    12. An init load without data transfer was done successfully (to activate the delta for BW).
    13. The delta to BW was scheduled, and it transferred 0 from 0 records.
    14. Checked the data in the R/3 delta queue (RSA7): data records are shown from 01.09.2007 to 04.09.2007. We are unable to find data from 15.08.2007 to 31.08.2007.
    15. Performed a full data load from 15.08.2007 till date in RSA1 (in order to get the data for the missing days); 0 from 0 records are transferred.
    We are looking for any advice on recovering the data records from 15th August until today.
    This is a very critical issue, because we are unable to provide our business with any production or stock reports.
    Please, someone help us resolve the issue as early as possible.

    Hi,
    I have a suggestion you can try.
    As you said, your delta init failed on 15th Aug, but a later init (without data transfer) completed successfully as of today's date, right?
    So, if your init and delta activation were successful, the system should have started capturing data through the update mode you have set.
    Therefore, first go to RSA7 and check whether you have any delta records there.
    If you find none, go to transaction LBWQ and check the entries against 'MCEX03'; you should be able to see the number of records (see the sketch below).
    Step 2: double-click that entry and check the value in the status field;
    if it is anything other than 'Ready', change the status to 'Ready'...
    ...and revert back to me for further steps.
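    If you want to check the queue programmatically as well, here is a read-only sketch (assumption: the standard qRFC outbound queue table TRFCQOUT, which is what LBWQ displays):

        * Read-only sketch: count LUWs waiting in the inventory extraction queue
        * (what LBWQ shows under queue name MCEX03). TRFCQOUT is the standard
        * qRFC outbound queue table.
        DATA lv_luws TYPE i.
        SELECT COUNT( * ) INTO lv_luws
          FROM trfcqout
          WHERE qname = 'MCEX03'.
        WRITE: / 'LUWs waiting in MCEX03:', lv_luws.

    A non-zero count here combined with an empty RSA7 means the LUWs have not yet been moved to the delta queue by the collective run.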

  • How to tune data loading time in BSO using 14 rules files?

    Hello there,
    I'm using Hyperion Essbase Administration Services v11.1.1.2 and the BSO option.
    In a nightly process using MaxL, I load new data into one Essbase cube.
    In this nightly update process, 14 account members are updated by running 14 rules files one after another.
    These rules files connect 14 times via SQL connection to the same Oracle database and the same table.
    I use this procedure because I cannot load 2 or more data fields using one rules file.
    It takes a long time to load 14 accounts one after the other.
    Now my question: how can I minimise this data loading time?
    This is what I found on Oracle Homepage:
    What's New
    Oracle Essbase V.11.1.1 Release Highlights
    Parallel SQL Data Loads - supports up to 8 rules files via temporary load buffers.
    In an older thread, John said:
    As it is version 11, why not use parallel SQL loading? You can specify up to 8 load rules to load data in parallel.
    Example:
    import database AsoSamp.Sample data
    connect as TBC identified by 'password'
    using multiple rules_file 'rule1','rule2'
    to load_buffer_block starting with buffer_id 100
    on error write to "error.txt";
    But this is for the ASO option only.
    Can I use it in my MaxL for BSO as well? Is there a sample?
    What else can be done to tune the nightly update time?
    Thanks in advance for every tip,
    Zeljko

    Thanks a lot for your support. I’m just a little confused.
    I will use an example to illustrate my problem a bit more clearly.
    This is the basic table, in my case a view, which is queried by all 14 rules files:
    column1 --- column2 --- column3 --- column4 --- ... ---column n
    dim 1 --- dim 2 --- dim 3 --- data1 --- data2 --- data3 --- ... --- data 14
    Region -- ID --- Product --- sales --- cogs ---- discounts --- ... --- amount
    West --- D1 --- Coffee --- 11001 --- 1,322 --- 10789 --- ... --- 548
    West --- D2 --- Tea10 --- 12011 --- 1,325 --- 10548 --- ... --- 589
    West --- S1 --- Tea10 --- 14115 --- 1,699 --- 10145 --- ... --- 852
    West --- C3 --- Tea10 --- 21053 --- 1,588 --- 10998 --- ... --- 981
    East ---- S2 --- Coffee --- 15563 --- 1,458 --- 10991 --- ... --- 876
    East ---- D1 --- Tea10 --- 15894 --- 1,664 --- 11615 --- ... --- 156
    East ---- D3 --- Coffee --- 19689 --- 1,989 --- 15615 --- ... --- 986
    East ---- C1 --- Coffee --- 18897 --- 1,988 --- 11898 --- ... --- 256
    East ---- C3 --- Tea10 --- 11699 --- 1,328 --- 12156 --- ... --- 9896
    The following are 3 of the 14 load rules files used to load the data columns into the cube:
    Rules File1:
    dim 1 --- dim 2 --- dim 3 --- sales --- ignore --- ignore --- ... --- ignore
    Rules File2:
    dim 1 --- dim 2 --- dim 3 --- ignore --- cogs --- ignore --- ... --- ignore
    Rules File14:
    dim 1 --- dim 2 --- dim 3 --- ignore --- ignore --- ignore --- ... --- amount
    Is the table design above what GlennS described as a "Data" column concept, which only allows a single numeric data value?
    In this case I can't tag two or more columns as "Data fields". I can only tag one column as "Data field"; the other data fields I have to tag as "ignore fields during data load". Otherwise, when I validate the rules file, an error occurs: "only one field can contain the Data Field attribute".
    Or may I skip this error message and just try to tag all 14 fields as "Data fields" and load the data?
    Please advise.
    Am I right that the other way is to restructure the table/view (and the rules files) as follows, to load all of the data in one pass:
    dim 0 --- dim 1 --- dim 2 --- dim 3 --- data
    Account --- Region -- ID --- Product --- data
    sales --- West --- D1 --- Coffee --- 11001
    sales --- West --- D2 --- Tea10 --- 12011
    sales --- West --- S1 --- Tea10 --- 14115
    sales --- West --- C3 --- Tea10 --- 21053
    sales --- East ---- S2 --- Coffee --- 15563
    sales --- East ---- D1 --- Tea10 --- 15894
    sales --- East ---- D3 --- Coffee --- 19689
    sales --- East ---- C1 --- Coffee --- 18897
    sales --- East ---- C3 --- Tea10 --- 11699
    cogs --- West --- D1 --- Coffee --- 1,322
    cogs --- West --- D2 --- Tea10 --- 1,325
    cogs --- West --- S1 --- Tea10 --- 1,699
    cogs --- West --- C3 --- Tea10 --- 1,588
    cogs --- East ---- S2 --- Coffee --- 1,458
    cogs --- East ---- D1 --- Tea10 --- 1,664
    cogs --- East ---- D3 --- Coffee --- 1,989
    cogs --- East ---- C1 --- Coffee --- 1,988
    cogs --- East ---- C3 --- Tea10 --- 1,328
    discounts --- West --- D1 --- Coffee --- 10789
    discounts --- West --- D2 --- Tea10 --- 10548
    discounts --- West --- S1 --- Tea10 --- 10145
    discounts --- West --- C3 --- Tea10 --- 10998
    discounts --- East ---- S2 --- Coffee --- 10991
    discounts --- East ---- D1 --- Tea10 --- 11615
    discounts --- East ---- D3 --- Coffee --- 15615
    discounts --- East ---- C1 --- Coffee --- 11898
    discounts --- East ---- C3 --- Tea10 --- 12156
    amount --- West --- D1 --- Coffee --- 548
    amount --- West --- D2 --- Tea10 --- 589
    amount --- West --- S1 --- Tea10 --- 852
    amount --- West --- C3 --- Tea10 --- 981
    amount --- East ---- S2 --- Coffee --- 876
    amount --- East ---- D1 --- Tea10 --- 156
    amount --- East ---- D3 --- Coffee --- 986
    amount --- East ---- C1 --- Coffee --- 256
    amount --- East ---- C3 --- Tea10 --- 9896
    And the third way is to adjust the essbase.cfg parameters DLTHREADSPREPARE and DLTHREADSWRITE (and DLSINGLETHREADPERSTAGE).
    I just want to be sure that I understand your suggestions.
    Many thanks for the awesome help,
    Zeljko

  • Data Load Issue "Request is in obsolete version of DataSource"

    Hello,
    I am getting a very strange data load issue in production. I am able to load the data up to the PSA, but when I run the DTP to load the data into 0EMPLOYEE (a master data object) I get the message below:
    Request REQU_1IGEUD6M8EZH8V65JTENZGQHD not extracted; request is in obsolete version of DataSource
    The request REQU_1IGEUD6M8EZH8V65JTENZGQHD was loaded into the PSA table when the DataSource had a different structure to the current one. Incompatible changes have been made to the DataSource since then, and the request can no longer be extracted with the DTP.
    I have taken the following actions:
    1. Replicated the DataSource.
    2. Deleted all requests from the PSA.
    3. Activated the DataSource (using RSDS_DATASOURCE_ACTIVATE_ALL).
    4. Re-transported the DataSource, transformation and DTP.
    I am still getting the same issue.
    If you have any idea, please reply ASAP.
    Samit

    Hi
    Generate your DataSource in R/3, then replicate it and activate the transfer rules.
    Regards,
    Chandu.

  • Need help on: Automation of Daily Data Load

    Hi all,
    Right now we start our daily data load from DAC manually; my client has asked us to automate this daily data load.
    Starting the daily data load (DAC) manually works like this: first we have to check whether the ASCP plans have been updated or not.
    We currently check whether the plans got updated with the following query:
    SELECT LTRIM(RTRIM(compile_designator)),
           data_completion_date,
           TO_CHAR(data_completion_date, 'DD-MON-YYYY HH24:MI:SS')
    FROM   apps.msc_plans
    WHERE  LTRIM(RTRIM(compile_designator)) IN ('Plan01', 'Plan02', 'Plan03', 'Plan04')
    ORDER  BY 2 DESC
    From this query we can see whether all the plans got updated. Of the four plans, two get updated as of sysdate (mm/dd/yyyy) with a timestamp (hh:mm:ss), for example Plan01 08/25/2011 11:20:08 PM and Plan02 08/25/2011 11:45:06 PM, and the other two get updated on sysdate+1, for example Plan03 08/26/2011 12:20:05 AM and Plan04 08/26/2011 12:45:08 AM.
    So after checking the plans, we start the daily load in DAC manually.
    May I know how I should convert the above SQL query (which I use to check whether the plans are updated) into Informatica, so as to automate the daily load at the Informatica level?
    Need help.

    You cannot replicate what is done by DAC at the Informatica level. DAC is a separate Oracle product that orchestrates and manages the ETL load (including index management, etc.). The reason Oracle developed DAC is that it allows you to manage a large-scale DW load for a large ERP system. As suggested, you can invoke the DAC execution plan via a command, but you cannot replicate everything DAC does at the Informatica level. If this helps, please mark as helpful.
