Moving data using Data Mover

Hi guys,
I have installed PeopleTools 8.49 on Oracle 10.2 on Windows 2003 (32-bit) and have completed all the setup. Now I have made a production installation with PeopleTools 8.51 on Oracle 11gR2 on Windows 2008 (64-bit), and I have to move all the data from the test environment (PT 8.49) to PT 8.51. Is it possible to do this, or should I upgrade PT 8.49 to PT 8.51? If it is possible, please point me to some blogs and resources; otherwise, please guide me on what to do.
Thank you

You are again trying to move data across different PeopleTools versions, and that is rather risky without deep analysis: tables must match, columns must match, and so must the defined values.
It is not that simple, and the best advice here is to upgrade your test environment to the production level. In any case, being on the same level as production is the best way to test before going to production.
Nicolas.
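If you do decide to compare the two environments before moving anything, a quick sanity check is to confirm each database's PeopleTools release and to diff the column definitions of the records you plan to move. A sketch (SYSADM and PS_JOB are example names, not from this thread; substitute your own access ID and table):
-- Run on each database: which PeopleTools release owns this schema?
SELECT toolsrel FROM psstatus;
-- Run on each database and diff the output: do the column definitions match?
SELECT column_name, data_type, data_length
FROM   all_tab_columns
WHERE  owner = 'SYSADM'
  AND  table_name = 'PS_JOB'
ORDER  BY column_id;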

Similar Messages

  • HT5594: Why is cellular data used by System Services even though I turned off all Location Services?

    Why, even though I have turned off all Location Services, is cellular data still being used
    under Cellular >> System Services >> Mapping Services?
    How can I turn off Mapping Services completely?
    And why does it keep using data until I turn it off?

    Go to Settings > Cellular > Use Cellular Data For, and turn Maps off. If you are not using Maps for navigation, though, it is not actually using data.

  • How to get EKBE-BUDAT (GR date) using data from BSEG

    Hi,
    My requirement is to get the GR date from EKBE, which is in the field BUDAT.
    My report already has BSEG data in an internal table; using that, I want to get EKBE-BUDAT.
    One of the functional people suggested the following (shown here cleaned up so that it compiles; it_bseg stands for the internal table that already holds the report's BSEG data):
    SELECT lfbnr
           lfpos
           lfgja
      FROM ekbe
      INTO TABLE it_ekbe_temp
      FOR ALL ENTRIES IN it_bseg
      WHERE ebeln = it_bseg-ebeln
        AND ebelp = it_bseg-ebelp
        AND belnr = it_bseg-belnr
        AND buzei = it_bseg-buzei.
    Once we have these three fields, we query EKBE again to get the GR date BUDAT:
    SELECT ebeln
           ebelp
           budat
           lfbnr
           lfpos
           lfgja
      FROM ekbe
      INTO TABLE it_ekbe
      FOR ALL ENTRIES IN it_ekbe_temp
      WHERE gjahr = it_ekbe_temp-lfgja
        AND belnr = it_ekbe_temp-lfbnr
        AND buzei = it_ekbe_temp-lfpos.
    (BSEG is a cluster table, so it cannot appear in a database join; that is why the lookup goes through the internal table instead.)
    Can anyone suggest how to get the GR date from EKBE using BSEG data?

    Hi Mayank,
    You can get there by reading the MSEG table first to pick up the required key information, and then reading EKBE to get BUDAT.
    Pass EBELN and EBELP to MSEG and get the key info. ...
    Hope this helps.
    Thanks,
    Amresh

  • Issue with importing data using data pump

    Hi Guys,
    I need your expertise here. I have just exported a table using the following Data Pump command. Please note that I used %U to split the export into chunks of 2G files.
    expdp "'/ as sysdba'" dumpfile=DP_TABLES:PT_CONTROL_PS_083110_%U.dmp logfile=DP_TABLES:PT_CONTROL_PS_083110.log tables=(PT_CONTROL.pipeline_session) filesize=2G job_name=pt_ps_0831_1
    The above command produced the following files
    -rw-r----- 1 oracle oinstall 2.0G Aug 31 15:04 PT_CONTROL_PS_083110_01.dmp
    -rw-r----- 1 oracle oinstall 2.0G Aug 31 15:05 PT_CONTROL_PS_083110_02.dmp
    -rw-r----- 1 oracle oinstall 2.0G Aug 31 15:06 PT_CONTROL_PS_083110_03.dmp
    -rw-r----- 1 oracle oinstall 394M Aug 31 15:06 PT_CONTROL_PS_083110_04.dmp
    -rw-r--r-- 1 oracle oinstall 2.1K Aug 31 15:06 PT_CONTROL_PS_083110.log
    So far things are good.
    Now when I import the data using the command below, it truncates the table but does not import any data. The last line says: Job "SYS"."PT_PS_IMP_0831_1" completed with 1 error(s) at 15:14:57.
    impdp "'/ as sysdba'" dumpfile=DP_TABLES:PT_CONTROL_PS_083110_%U.dmp logfile=DP_TABLES:PT_CONTROL_PS_083110_IMP.log Tables=(PT_CONTROL.pipeline_session) TABLE_EXISTS_ACTION=Truncate job_name=PT_ps_imp_0831_1
    Import: Release 10.2.0.3.0 - Production on Tuesday, 31 August, 2010 15:14:53
    Copyright (c) 2003, 2005, Oracle. All rights reserved.
    Connected to: Oracle Database 10g Release 10.2.0.3.0 - Production
    Master table "SYS"."AT_PS_IMP_0831_1" successfully loaded/unloaded
    Starting "SYS"."AT_PS_IMP_0831_1": '/******** AS SYSDBA' dumpfile=DP_TABLES:PT_CONTROL_PS_083110_*%U*.dmp logfile=DP_TABLES:PT_CONTROL_PS_083110_IMP.log Tables=(PT_CONTROL.pipeline_session) TABLE_EXISTS_ACTION=Truncate job_name=AT_ps_imp_0831_1
    Processing object type TABLE_EXPORT/TABLE/TABLE
    ORA-39153: Table "PT_CONTROL"."PIPELINE_SESSION" exists and has been truncated. Data will be loaded but all dependent metadata will be skipped due to table_exists_action of truncate
    Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
    Processing object type TABLE_EXPORT/TABLE/GRANT/OWNER_GRANT/OBJECT_GRANT
    Processing object type TABLE_EXPORT/TABLE/INDEX/INDEX
    Processing object type TABLE_EXPORT/TABLE/CONSTRAINT/CONSTRAINT
    Processing object type TABLE_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
    Processing object type TABLE_EXPORT/TABLE/TRIGGER
    Processing object type TABLE_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
    Job "SYS"."AT_PS_IMP_0831_1" completed with 1 error(s) at 15:14:57
    I suspect that it has something to do with %U in the impdp command. Has anyone encountered this kind of situation before? What should my import command be? I just want to confirm I am using the right import command.
    Thanks
    --MM
    Edited by: UserMM on Aug 31, 2010 3:11 PM

    I also looked into the alert log but didn't find anything about the error there. Any opinion?
    --MM
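    For what it's worth, %U is valid on import too, as long as all the dump pieces are present in the directory, so the wildcard by itself is unlikely to be the problem. The import log (PT_CONTROL_PS_083110_IMP.log) should contain the actual ORA- message for the one error; note that with TABLE_EXISTS_ACTION=Truncate all dependent metadata is skipped, and the ORA-39153 message above may itself be what is counted as the error. While a job is running you can also check its state (a generic sketch, not specific to this job):
    -- List Data Pump jobs and their current state
    SELECT owner_name, job_name, operation, job_mode, state
    FROM   dba_datapump_jobs;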

  • Moving sum using date intervals - analytic functions help

    Let's say you have the following set of data:
    DATE SALES
         09/02/2012     100
         09/02/2012     50
         09/02/2012     10
         09/02/2012     1000
         09/02/2012     20
         12/02/2012     1000
         12/02/2012     1100
         14/02/2012     1000
         14/02/2012     100
         15/02/2012     112500
         15/02/2012     13500
         15/02/2012     45000
         15/02/2012     1500
         19/02/2012     1500
         20/02/2012     400
         23/02/2012     2000
         27/02/2012     4320
         27/02/2012     300000
         01/03/2012     100
         04/03/2012     17280
         06/03/2012     100
         06/03/2012     100
         06/03/2012     4320
         08/03/2012     100
         13/03/2012     1000
    For each day I need to know the sum of the sales for that day and the preceding 5 calendar days [not five rows].
    What query could I use?
    Please help!

    Hi.
    Here's one way.
    WITH data AS (
         SELECT TO_DATE('09/02/2012','DD/MM/YYYY') d,     100 n FROM DUAL UNION ALL
         SELECT TO_DATE('09/02/2012','DD/MM/YYYY') d,     50 n FROM DUAL UNION ALL
         SELECT TO_DATE('09/02/2012','DD/MM/YYYY') d,     10 n FROM DUAL UNION ALL
         SELECT TO_DATE('09/02/2012','DD/MM/YYYY') d,     1000 n FROM DUAL UNION ALL
         SELECT TO_DATE('09/02/2012','DD/MM/YYYY') d,     20 n FROM DUAL UNION ALL
         SELECT TO_DATE('12/02/2012','DD/MM/YYYY') d,     1000 n FROM DUAL UNION ALL
         SELECT TO_DATE('12/02/2012','DD/MM/YYYY') d,     1100 n FROM DUAL UNION ALL
         SELECT TO_DATE('14/02/2012','DD/MM/YYYY') d,     1000 n FROM DUAL UNION ALL
         SELECT TO_DATE('14/02/2012','DD/MM/YYYY') d,     100 n FROM DUAL UNION ALL
         SELECT TO_DATE('15/02/2012','DD/MM/YYYY') d,     112500 n FROM DUAL UNION ALL
         SELECT TO_DATE('15/02/2012','DD/MM/YYYY') d,     13500 n FROM DUAL UNION ALL
         SELECT TO_DATE('15/02/2012','DD/MM/YYYY') d,     45000 n FROM DUAL UNION ALL
         SELECT TO_DATE('15/02/2012','DD/MM/YYYY') d,     1500 n FROM DUAL UNION ALL
         SELECT TO_DATE('19/02/2012','DD/MM/YYYY') d,     1500 n FROM DUAL UNION ALL
         SELECT TO_DATE('20/02/2012','DD/MM/YYYY') d,     400 n FROM DUAL UNION ALL
         SELECT TO_DATE('23/02/2012','DD/MM/YYYY') d,     2000 n FROM DUAL UNION ALL
         SELECT TO_DATE('27/02/2012','DD/MM/YYYY') d,     4320 n FROM DUAL UNION ALL
         SELECT TO_DATE('27/02/2012','DD/MM/YYYY') d,     300000 n FROM DUAL UNION ALL
         SELECT TO_DATE('01/03/2012','DD/MM/YYYY') d,     100 n FROM DUAL UNION ALL
         SELECT TO_DATE('04/03/2012','DD/MM/YYYY') d,     17280 n FROM DUAL UNION ALL
         SELECT TO_DATE('06/03/2012','DD/MM/YYYY') d,     100 n FROM DUAL UNION ALL
         SELECT TO_DATE('06/03/2012','DD/MM/YYYY') d,     100 n FROM DUAL UNION ALL
         SELECT TO_DATE('06/03/2012','DD/MM/YYYY') d,     4320 n FROM DUAL UNION ALL
         SELECT TO_DATE('08/03/2012','DD/MM/YYYY') d,     100 n FROM DUAL UNION ALL
         SELECT TO_DATE('13/03/2012','DD/MM/YYYY') d,     1000 n FROM DUAL
    ),
    days AS (
         SELECT TO_DATE('2012-02-01','YYYY-MM-DD')+(LEVEL-1) d
         FROM DUAL
         CONNECT BY LEVEL <= 60
    ),
    totals_per_day AS (
         SELECT dy.d, SUM(NVL(dt.n,0)) total_day
         FROM
              data dt,
              days dy
         WHERE
              dy.d = dt.d(+)
         GROUP BY dy.d
    )
    SELECT
         d,
         SUM(total_day) OVER (
              ORDER BY d
              RANGE BETWEEN 5 PRECEDING AND CURRENT ROW
         ) AS five_day_total
    FROM totals_per_day
    ORDER BY d;
    2012-02-01 00:00:00     0
    2012-02-02 00:00:00     0
    2012-02-03 00:00:00     0
    2012-02-04 00:00:00     0
    2012-02-05 00:00:00     0
    2012-02-06 00:00:00     0
    2012-02-07 00:00:00     0
    2012-02-08 00:00:00     0
    2012-02-09 00:00:00     1180
    2012-02-10 00:00:00     1180
    2012-02-11 00:00:00     1180
    2012-02-12 00:00:00     3280
    2012-02-13 00:00:00     3280
    2012-02-14 00:00:00     4380
    2012-02-15 00:00:00     175700
    2012-02-16 00:00:00     175700
    2012-02-17 00:00:00     175700
    2012-02-18 00:00:00     173600
    2012-02-19 00:00:00     175100
    2012-02-20 00:00:00     174400
    2012-02-21 00:00:00     1900
    2012-02-22 00:00:00     1900
    2012-02-23 00:00:00     3900
    2012-02-24 00:00:00     3900
    2012-02-25 00:00:00     2400
    2012-02-26 00:00:00     2000
    2012-02-27 00:00:00     306320
    2012-02-28 00:00:00     306320
    2012-02-29 00:00:00     304320
    2012-03-01 00:00:00     304420
    2012-03-02 00:00:00     304420
    2012-03-03 00:00:00     304420
    2012-03-04 00:00:00     17380
    2012-03-05 00:00:00     17380
    2012-03-06 00:00:00     21900
    2012-03-07 00:00:00     21800
    2012-03-08 00:00:00     21900
    2012-03-09 00:00:00     21900
    2012-03-10 00:00:00     4620
    2012-03-11 00:00:00     4620
    2012-03-12 00:00:00     100
    2012-03-13 00:00:00     1100
    2012-03-14 00:00:00     1000
    2012-03-15 00:00:00     1000
    2012-03-16 00:00:00     1000
    2012-03-17 00:00:00     1000
    2012-03-18 00:00:00     1000
    2012-03-19 00:00:00     0
    2012-03-20 00:00:00     0
    2012-03-21 00:00:00     0
    2012-03-22 00:00:00     0
    2012-03-23 00:00:00     0
    2012-03-24 00:00:00     0
    2012-03-25 00:00:00     0
    2012-03-26 00:00:00     0
    2012-03-27 00:00:00     0
    2012-03-28 00:00:00     0
    2012-03-29 00:00:00     0
    2012-03-30 00:00:00     0
    2012-03-31 00:00:00     0
    Hope this helps.
    Regards.
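    A more compact variant, if you only need output rows for the dates that actually have sales (sales, d, and n below are placeholder names, not from the thread):
    SELECT d,
           SUM(SUM(n)) OVER (
                ORDER BY d
                RANGE BETWEEN 5 PRECEDING AND CURRENT ROW
           ) AS five_day_total
    FROM   sales
    GROUP  BY d
    ORDER  BY d;
    Because the window is RANGE-based (date values) rather than ROWS-based, gaps between dates are still accounted for correctly; the only difference from the solution above is that all-zero days do not appear in the output.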

  • How to export data using a data cluster?

    Hi,
    Hi,
    I am trying to export around 10,000 records using the memory ID concept, in the same session, to another spot.
    IF sy-subrc = 0.
    " Export the total number of records
      EXPORT g_tot_line FROM g_tot_line TO MEMORY ID c_tline.
    ENDIF.
    My issue: it throws a dump saying
    "When the SAP paging overflow occurred, the ABAP/4 memory contained entries for 8 different IDs."
    There is insufficient space, and I need to use data clusters.
    I am not sure how to use clusters in this scenario...
    Any suggestions will be appreciated.
    Regards,
    Charan

    What is so complicated about data clusters?
    Did you press F1 or read the online documentation: [http://help.sap.com/abapdocu_70/en/ABAPEXPORT_DATA_CLUSTER_MEDIUM.htm], [http://help.sap.com/abapdocu_70/en/ABAPIMPORT_MEDIUM.htm]?
    If you have any specific question, shoot!
    BR,
    Suhas

  • How to cleanse Arabic general and address data using Data Services 3.1

    I am working on a UAE project (SAP customer & vendor master data migration). The main address and customer tables are built equally in English and Arabic.
    I am able to read the Arabic data, but I have no clue how to cleanse or modify it.
    Is it possible to handle Arabic data in BusinessObjects Data Services XI 3.1?
    Is it possible to use the EMEA address directories to cleanse or standardize the Arabic data?
    Please help me out.
    Thanks in advance.

    Dear all,
    Does anyone have any input on the question above? Please advise.
    Vamshi, I am also looking for advice on the same questions about Arabic versions.
    Best regards

  • Filtering data using dates and SQL

    Post Author: Ivanbennett
    CA Forum: Data Connectivity and SQL
    Hi all
    I have been struggling with this one all morning and could do with a little help.
    I am using CR XI rel 2
    I have three tables:
    Table one - AUDIT_LOG: PositionId, DateTime, StatCode
    Table two - POSITION: PositionId, SiteID
    Table three - SITE: SiteID, Name, Town, PostCode
    I would like the user to type in a start date and an end date, and the report should then return records from the POSITION table where the PositionID does not appear in the audit log. I can already establish who has no entry at all across the entire table, but I now want a snapshot for the period typed in by the user.
    This is the SQL used when I added a command from database expert
    SELECT DISTINCT POSITION.ID, POSITION.SITE_ID, SITE.NAME, AUDIT_LOG.DATETIME
    FROM SITE INNER JOIN (POSITION LEFT OUTER JOIN AUDIT_LOG ON POSITION.ID = AUDIT_LOG.POSITION_ID) ON SITE.ID = POSITION.SITE_ID
    WHERE AUDIT_LOG.POSITION_ID IS NULL
    Current output (10/09/2007):
    ID    SITE_ID     NAME
    4     AB120002    Andy Arms
    103   AB120002    Andy Arms
    3     AB120002    Andy Arms
    104   AB120002    Andy Arms
    2     120001      Charter Court
    101   120001      Charter Court
    102   120001      Charter Court
    60    129999      Charter Court Test Site
    7     200005      Forte Jester
    48    123456789   here
    Any help appreciated

    Post Author: foghat
    CA Forum: Data Connectivity and SQL
    You need to create two command parameters, start_date_from and start_date_to, and then add to your WHERE clause:
    AND datetime >= {?start_date_from}
    AND datetime <= {?start_date_to}
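    Putting that together with the original command, one way to keep the "no audit entry" logic intact is to date-restrict the audit rows inside the join rather than in the WHERE clause (a sketch built on the command above; only the two date conditions are new):
    SELECT DISTINCT POSITION.ID, POSITION.SITE_ID, SITE.NAME, AUDIT_LOG.DATETIME
    FROM SITE INNER JOIN (POSITION LEFT OUTER JOIN AUDIT_LOG
           ON POSITION.ID = AUDIT_LOG.POSITION_ID
          AND AUDIT_LOG.DATETIME >= {?start_date_from}
          AND AUDIT_LOG.DATETIME <= {?start_date_to}) ON SITE.ID = POSITION.SITE_ID
    WHERE AUDIT_LOG.POSITION_ID IS NULL
    Filtering AUDIT_LOG.DATETIME in the WHERE clause instead would eliminate exactly the unmatched (NULL) rows the report is trying to find.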

  • Merging variable data using data merge plug-ins for InDesign

    Has anyone had any experience with merging variable data (such as articles for a newsletter) with either of these plug-ins:
    DesignMerge, www.meadowsps.com/site/marketing/productinfo/designmerge.htm
    or Pageflex Studio ID, www.bitstream.com/publishing/products/studioid/index.html

    Only Pageflex Persona (the cut-down version of Studio; it doesn't require InDesign Server).
    The variable stuff is extremely customisable, but the design stuff is !@##%. Like Quark 3 simplicity... Customer support is poor in my experience. The program is clunky - they've only just added keyboard shortcuts to access tools... woohoo. Spot colours aren't fully supported - you have to specify them as CMYK and tell your printer to substitute later... and you can't design across a spread at all - any two-page graphics have to be placed twice... I could go on!
    For a newsletter, I'd just find a way with InD CS5. Maybe with the Copyfit plugin if you want more functionality, but not with standalone software. For complex variable areas in InD, I just create pdfs or separate InD files of the variable area, and list these as you would images in the csv file. The free LayoutZone addon (Thanks Martinho!) makes region exporting very simple - beware of some occasional bugs with strokes changing.
    I did a job recently with 36 variables per entry, including 2 embedded InD files. The layout was a 3-fold A4 double-sided flyer, with each 3rd page being a few mm shorter to allow for the fold-in. CS5 multi-page size and 3-page spreads made this quite simple to set up.

  • Previous quarter from the current date using Date functions

    Hi all,
    How can I get the value of the previous quarter and the previous month using the NOW() function in a formula?
    regards,
    Rk
    Edited by: Rk on Feb 13, 2009 9:28 AM

    Hi Rk,
    "DSTR(DADD(NOW(),-1,'Q'),'YYYYQ')" will give you just the year, 2008 (if you run it today).
    But "DSTR(DADD(NOW(),-1,'Q'),'YYYYMM')" will give you the year and month of the previous quarter,
    200811 (if you run it today).
    Ola

  • Can we load data in chunks using data pump?

    We are loading data using data pump, so I want to check my understanding.
    Please correct me if I am wrong:
    ODI will fetch all the data from the source (whether it is INIT or CDC) in one go and unload it into the staging area.
    If that is true, will performance suffer with very large source volumes (50 million records), since ODI tries to load all the data in one go? I believe it would perform better if we loaded in chunks using data pump.
    Please confirm and correct.
    I would also like to know how we can configure chunked loading using data pump.
    Thanks in advance.
    Regards,
    Dinesh.

    You may consider using LKM Oracle to Oracle (datapump):
    http://docs.oracle.com/cd/E28280_01/integrate.1111/e12644/oracle_db.htm#r15c1-t2
    In 11g, ODI reads from the source and writes to the target in parallel. This is the case where you specify a SELECT query in the source command and an INSERT/UPDATE query in the target command. On the source side, ODI reads records from the source and adds them to a data queue; on the target side, a parallel thread reads data from the queue and writes it to the target. So the overall performance will be bounded by the slower of the read and write processes.
    Thanks,

  • Using Date in where clause

    Hello all,
    I am new to Oracle, currently using 10G + aspvbscript.
    I've been trying to query data using date in where clause but nothing seems to work.
    The column is in date format.
    It gets printed out like this: 5/1/2010 11:21:19 AM
    I tried using this query:
    SELECT * from table where TRUNC(user_date) > to_date('FEB-01-2010:00:00:00','mm-dd-yyyy:HH24:MI:SS') order by user_date asc.
    It does return an output but it returns everything in table and does not take WHERE clause into consideration however, it does sort the date in ascending order.
    I've tried getting rid of TRUNC tried to format date in a different way but no such luck.
    Please point me to the right direction.
    Thanks.

    Welcome to the forums!
    In cases like this it is helpful if you can provide the following information:
    1. Oracle version (SELECT * FROM V$VERSION)
    2. Sample data in the form of CREATE / INSERT statements.
    3. Expected output
    4. Explanation of expected output (a.k.a. "business logic")
    5. Use code tags for #2 and #3. See the FAQ (link on the top right side) for details.
    I'll try and take a stab at your request based on the data given. What your query says is that it will return all rows that have a date greater than 2/1/2010 (MM/DD/YYYY). If your query is returning all rows, then maybe all the dates in the table really are greater than 2/1/2010. Have you checked the dates to see whether this is the case?
    Also, one note about your TO_DATE() function: to_date('FEB-01-2010:00:00:00','mm-dd-yyyy:HH24:MI:SS'). The date format does not match the string you are using with respect to the month: your string has 'FEB', but the format is 'MM', which is the numeric representation of the month. Although Oracle was able to convert it to the proper date on my system, you should maintain consistency between the string and the date format used.
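    For example, a version where the format mask matches the literal (your_table is a placeholder for the real table name):
    SELECT *
    FROM   your_table
    WHERE  user_date > TO_DATE('01-FEB-2010 00:00:00', 'DD-MON-YYYY HH24:MI:SS')
    ORDER  BY user_date ASC;
    TRUNC(user_date) is only needed if you want to compare whole days while ignoring the time of day, and wrapping the column in it can prevent an index on user_date from being used.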

  • Selecting tables on import using the Data Pump API

    Hi,
    Sorry for the trivial question, I export the data using Data Pump API, with "TABLE" mode.
    So all tables will be exported in one .dmp file.
    My question is, then how to import few tables only using Data Pump API?, how to define "TABLES" property like command line interface?
    should I use DATA_FILTER procedures?, if yes how to do that?
    Really thanks in advance
    Regards,
    Kahlil

    Hi,
    You should use the METADATA_FILTER procedure for this.
    e.g.:
    dbms_datapump.metadata_filter(
          handle => handle1,
          name   => 'NAME_EXPR',
          value  => 'IN (''TABLE1'', ''TABLE2'')');
    Regards,
    Anurag
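    A fuller sketch of a table-mode import using that filter (the directory name DP_DIR, the dump file name, and the table names are placeholders; error handling is omitted):
    DECLARE
      h NUMBER;
    BEGIN
      -- Open a table-mode import job
      h := DBMS_DATAPUMP.OPEN(operation => 'IMPORT', job_mode => 'TABLE');
      -- Point the job at the existing dump file
      DBMS_DATAPUMP.ADD_FILE(handle    => h,
                             filename  => 'exp_tables.dmp',
                             directory => 'DP_DIR');
      -- Process only these two tables from the dump
      DBMS_DATAPUMP.METADATA_FILTER(handle => h,
                                    name   => 'NAME_EXPR',
                                    value  => 'IN (''TABLE1'', ''TABLE2'')');
      DBMS_DATAPUMP.START_JOB(h);
      DBMS_DATAPUMP.DETACH(h);
    END;
    /
    DATA_FILTER, by contrast, restricts which rows get loaded; METADATA_FILTER is the one that restricts which objects (here, tables) are processed.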

  • How do we use date and currency fields in BDC?

    How do we use date and currency fields in BDC?

    Hi
    When you have to upload using BDC, always use character-type fields.
    For the date, use date(10) TYPE c and
    WRITE sy-datum TO date. The date will then follow the user's settings, either mm/dd/yyyy or dd/mm/yyyy.
    For currency, you likewise need to declare
    data1(15) TYPE n and
    data2(15) TYPE c.
    Now suppose you have the value 12545421.91. As per the user's settings it may need to be 12545421,91.
    In this case, first push the value to the NUMC field:
    data1 = 12545421.91, and then data2 = data1 / 100.
    This changes it into the required format so it can be populated in the BDC table.
    Regards
    Navneet

  • Just bought a 2 TB external hard drive with the intention of moving my iPhoto data files to it. I have so much of my 500 GB being used up by images and video. Questions: Will this foul up my iPhoto app? Do I need to point iPhoto to this new location?

    Just bought a 2tb external hard drive with the intention of moving my iPhoto data files to it. I have so much of my 500 gigs being used up by images and video. Questions: Will this foul up my iPhoto app?
    Do I need to point iPhoto to this new location?
    Thanks!

    Are you running a Managed or a Referenced Library?
    A Managed Library is the default setting: iPhoto copies files into the iPhoto Library when importing, and the files are then stored in the Library package.
    A Referenced Library is when iPhoto does NOT copy the files into the iPhoto Library when importing, because you made a change at iPhoto -> Preferences -> Advanced (you unchecked the option to copy files into the Library on import). The files are then stored wherever you put them, not in the Library package. In this scenario you are responsible for the file management.
    Assuming a Managed Library:
    Make sure the drive is formatted Mac OS Extended (Journaled)
    1. Quit iPhoto
    2. Copy the iPhoto Library from your Pictures Folder to the External Disk.
    3. Hold down the option (or alt) key while launching iPhoto. From the resulting menu select 'Choose Library' and navigate to the new location. From that point on this will be the default location of your library.
    4. Test the library and when you're sure all is well, trash the one on your internal HD to free up space.
    Regards
    TD
