Export data for offer/rejection letter Issue

The offer letter generation functionality works fine from users' laptops; however, end users on desktops are not able to view the data, and only the .rtf template appears.
Could anyone assist me in resolving this issue, please?

Hi Sven,
Thank you for your prompt response.
I'm doing the transport from DEV to QA initially.
When I'm creating the transfer rules in the target system, I'm getting the following message:
Assignment of InfoObjects to DataSource
80PROFIT_CTRH fields incomplete
In the transfer rules, the following field is created even though it does not exist in the original DataSource, and deleting this field does not help.
0SEM_CGPRCT
I kindly need your assistance on this.
Thank you

Similar Messages

  • Exporting Data for a single set of books from Oracle Financials

    I have a multi-org setup with multiple sets of books in an Oracle instance. I want to be able to export data for a single set of books into a new Oracle instance.
    Any insight into how this can be done (without going through each of the tables and referential integrity constraints) will be of great help.
    Thanks,
    Anoop

    It is a bit unclear what you are sending these POs to the other system for. Is it for matching to invoices there? Is it for receiving into inventory there?
    Do these two entities belong to the same parent but sit on two different Oracle Apps databases?
    In essence, what do they do with the PO that you are sending to them?
    Why do you think there will be intercompany invoicing?
    Thanks,
    Nagamohan

  • Best way to export data for a migration

    Hi Oracle Community,
    What's the best way to export data from an Oracle 8i database for it to be suitable for import into an Oracle 10g database?
    What's the best way to export the data if it is to go into a different RDBMS?
    Thanks, David

    Thanks everyone for all your help. You guys are great.
    There seem to be many good ways to export your data from Oracle into a flat file format suitable for import into other RDBMSs: Oracle, MySQL, PostgreSQL, etc.
    A few tools were mentioned, but using SQL*Plus, which comes with Oracle (and SQL*Loader on the back end, which also comes with Oracle), seems the most straightforward.
    I found this script on asktom.oracle.com to work great, slightly modified here
    (to include linesize max, and pipes rather than commas):
    set echo off newpage 0 space 0 pagesize 0 feed off head off trimspool on
    set linesize 32767
    spool payment.txt
    select
    PAYMENT_ID||'|'||
    USER_ID||'|'||
    <more fields here>
    from
    payment;
    spool off
    exit;
    It works great. Rather than making one of these for each table, I wrote a Perl script called ora_export. http://crowfly.net/oracle/ora_export. It runs on Unix and only requires SQL*Plus. It creates these four files:
    <tablename>.def       # list of table columns and types (SQL*Plus DESC)
    <tablename>_dump.sql  # a script to export the data
    <tablename>.psv       # the data (e.g. name|address|etc.)
    <tablename>_load.ctl  # SQL*Loader control file for Oracle, if you need it.
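    For reference, here is a minimal sketch of the kind of control file such a script might emit for the payment table above; this is only illustrative (the real ora_export output may differ), and the column list is truncated to the fields shown earlier:
    -- Illustrative sketch only: pipe-delimited load of the spooled .psv data.
    LOAD DATA
    INFILE 'payment.psv'
    INTO TABLE payment
    FIELDS TERMINATED BY '|'
    TRAILING NULLCOLS
    (PAYMENT_ID,
    USER_ID)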

  • Export data for specific period through Data Pump

    Hi,
    I have a specific requirement to take dumps of some tables for a specific time period, e.g. the last 10 days, from 01-JAN-11 to 10-JAN-11. How can I accomplish this? From the documentation, what I read is that we can export data by setting either the FLASHBACK_SCN or FLASHBACK_TIME parameter in the expdp command, but that is a point-in-time export, not an export for a specific time range.
    Please guide me on how I can export between specific times, e.g. between 1-JAN and 10-JAN.
    Regards,
    Abbasi

    "export between the specific time. like between 1-JAN to 10-JAN"
    You need to clarify your requirements. Data is always "at a point in time". I can see data as at noon of 01-Jan. I can see data as at noon of 10-Jan. What would I mean by data "between" 01-Jan and 10-Jan?
    Say the table has 5 rows on 01-Jan:
    ID    VALUES
    1      ABC
    2      DEF
    3      TRG
    4      MXY
    5      DEW
    2 Rows "6-GGG" and "7-FRD" were inserted on 02-Jan.
    2 Rows "2" and "3" were updated from "DEF" and "TRG" to "RTU" and "GTR" on 03-Jan.
    1 Row "5-DEW" was deleted on 09-Jan.
    2 Rows "8-TFE" and "9-DZN" were inserted on 09-Jan.
    Can you tell me what is the "data between 01-Jan and 10-Jan" ?
    (the above example actually happens to have an incrementing key column "ID". Your table might not even have such an identifier column at all !)
    Hemant K Chitale
    Edited by: Hemant K Chitale on Jan 10, 2011 5:23 PM
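    One possible reading of the requirement (not from the thread itself, just a hedged sketch): if the table happens to have a creation-date column, you could combine a flashback query as of 10-Jan with a row filter on that column. The table and column names below are assumptions:
    -- Sketch only: MY_TABLE and CREATED_DATE are hypothetical names, and the UNDO
    -- retention must still cover 10-JAN-11 for the flashback query to succeed.
    SELECT *
    FROM   my_table AS OF TIMESTAMP TO_TIMESTAMP('10-JAN-2011 23:59:59', 'DD-MON-YYYY HH24:MI:SS')
    WHERE  created_date >= DATE '2011-01-01'
    AND    created_date <  DATE '2011-01-11';
    The expdp equivalent would be the FLASHBACK_TIME parameter combined with a QUERY clause carrying the same WHERE condition.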

  • Export data for domain data make wrong file

    Hi!
    If I try to export data from a table with a column of a type such as MDSYS.SDO_GEOMETRY in SQL Developer (both 1.2.0 and 1.2.1.3213), the resulting file contains information like this (for the insert clause):
    Insert into table_name (NUMB,GEOLOC) values (500949,'MDSYS.SDO_GEOMETRY');
    Also, in the previous version (1.2.0), when this column was shown in the data window it was more informative:
    MDSYS.SDO_GEOMETRY(2006, 262148, NULL, MDSYS.SDO_ELEM_INFO_ARRAY(1,2,1,95,2,1,109,2,1,133,2,1,157,2,1), MDSYS.SDO_ORDINATE_ARRAY(22847.57591,7216.21100000001,22842.04691,7217.2571,22841.44841,7218.00440000001,22843.39211,7228.31675000001,22844.13881,7232.35205000001,22845.63335,7239.52580000001,22845.63335,7240.27310000001,22845.03599,7240.72145000001,22826.05499,7244.15885000001,22814.39735,7246.10180000001,22809.01769,7246.84910000001,22807.67249,7246.40075000001,22802.44103,7222.33850000001,22799.19203,7213.03505000001,22795.8656525,7208.73815000001,22794.81386,7208.73200000001,22789.47752,7208.70080000001,22784.3570675,7209.03725000001,22758.6899675,7184.04095000001,22757.3447675,7183.59260000001,22751.9645375,7183.59245000001,22744.006055,7183.03205000001,22743.258785,7181.83640000001,22737.1684775,7181.35070000001,22736.7201725,7182.69575,22729.546295,7183.59245000001,22726.7066975,7186.58165000001,22725.9594275,7186.73105000001,22725.2121575,7186.43210000001,22723.11983,7184.56400000001,22722.29789,7184.48915000001,22721.55062,7186.28270000001,22721.326325,7186.80575000001,22717.515305,7191.36410000001,22715.7218,7193.68070000001,22710.1920875,7200.48080000001,22709.4448175,7206.90740000001,22709.370005,7214.15585000001,22709.74364,7214.52950000001,22711.6866275,7215.35150000001,22711.83611,7216.84610000001,22711.98545,7220.05925000001,22711.611815,7236.12560000001,22711.3876625,7247.63360000001,22711.4249975,7249.76345000001,22710.7523975,7250.95910000001,22710.0051275,7252.45355000001,22849.96763,7244.45780000001,22848.8559875,7243.04300000001,22848.32375,7242.36545000001,22849.51961,7243.41155000001,22848.8559875,7243.04300000001,22846.82921,7241.91710000001,22826.05499,7244.15885000001,22263.062285,7163.22935000001,22263.809555,7173.01865000001,22265.67773,7194.61475000001,22265.902025,7196.78180000001,22265.902025,7197.23015000001,22265.8272125,7197.37970000001,22265.304095,7197.97745000001,22217.9272625,7201.19075,22217.1799925,7201.56440000001,22216.8063575,7202.31170000001,22216.35791,7204.47875000001,22216.731545,7206.12275000001,22800.2381225,7220.28350000001,22798.3699475,7214.23070000001,22796.651255,7211.31620000001,22795.3061975,7209.82175000001,22794.9325625,7209.22385000001,22794.81386,7208.73200000001,22785.5170175,7170.21620000001,22777.3717175,7133.0768,22776.9234125,7130.76035000001,22775.727695,7125.90305000001,22774.6816025,7120.82150000001,22773.7100375,7115.81480000001,22774.53212,7109.98610000001,22774.4573075,7110.73340000001,22773.2617325,7111.70480000001,22773.1870625,7112.45210000001,22773.7100375,7115.81480000001,22773.11225,7113.87185000001,22767.95603,7108.93985000001))
    whereas in the new one it is just:
    MDSYS.SDO_GEOMETRY
    WBR,
    Sergey

    I'm a newbie here and not sure exactly what you want, but:
    First of all, I've created a table on Oracle 10g (10.2.0.3) Enterprise Edition as follows:
    CREATE TABLE tblnm (
         "MI_PRINX" NUMBER(11,0),
         "GEOLOC" MDSYS.SDO_GEOMETRY,
         CONSTRAINT RP_MAP_PK PRIMARY KEY (MI_PRINX)
    );
    INSERT INTO USER_SDO_GEOM_METADATA (TABLE_NAME, COLUMN_NAME, DIMINFO, SRID)
    VALUES ('tblnm','GEOLOC',MDSYS.SDO_DIM_ARRAY(mdsys.sdo_dim_element('X', -100000.0, 185000.0, 1.425E-5), mdsys.sdo_dim_element('Y', -100000.0, 200000.0, 1.5E-5)),262148);
    CREATE INDEX tblnm_SX ON tblnm (GEOLOC)
    INDEXTYPE IS MDSYS.SPATIAL_INDEX;
    insert into tblnm (MI_PRINX,GEOLOC) VALUES
    (1,MDSYS.SDO_GEOMETRY(2001, 262148, NULL, MDSYS.SDO_ELEM_INFO_ARRAY(1,1,1), MDSYS.SDO_ORDINATE_ARRAY(6946.74932,9604.25675000001)));
    After that, I exported data from this table with SQL Developer.
    As an insert clause, the result was:
    -- INSERTING into TBLNM
    Insert into TBLNM (MI_PRINX,GEOLOC) values (1,'MDSYS.SDO_GEOMETRY');
    When I tried to import the data (after deleting it) with this command, I got:
    ERROR at line 1:
    ORA-00932: inconsistent datatypes: expected MDSYS.SDO_GEOMETRY got CHAR
    For the loader clause, the file looks like:
    LOAD DATA
    INFILE *
    Truncate
    INTO TABLE "TBLNM"
    FIELDS TERMINATED BY ','
    TRAILING NULLCOLS
    (MI_PRINX,
    GEOLOC)
    begindata
    "1","MDSYS.SDO_GEOMETRY"
    And so on. The result file doesn't contain the data for the SDO_GEOMETRY column - only the name of the class.
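    Not part of the original reply, but one possible workaround sketch: convert the geometry to WKT text with SDO_UTIL.TO_WKTGEOMETRY when exporting and rebuild it with SDO_UTIL.FROM_WKTGEOMETRY when importing (note that WKT does not carry the SRID, so it would have to be reapplied separately):
    -- Workaround sketch: spool the geometry as WKT text rather than relying on
    -- SQL Developer's handling of the object type.
    set long 2000000 pagesize 0 linesize 32767 trimspool on feedback off
    spool tblnm_wkt.txt
    select MI_PRINX || '|' || SDO_UTIL.TO_WKTGEOMETRY(GEOLOC) from tblnm;
    spool off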

  • Export data for SQL Loader

    I have a table with the following 3 columns
    Help_number Number(8,0)
    Title       Varchar(100 Byte)
    Description Varchar(100 Byte)
    I would like to export all the data and import it into another table in another database. I'm using SQL Developer to export the data. I chose the "LOADER" option, but when the data is exported the format is wrong. Here is an example of the data as exported:
    "1","Error","Error 5343 - Input not recognised"
    The problem I have is that the first column is being exported in double quotes even though it is of type NUMBER. When I try to load this using sqlldr it gets rejected because it is a string.
    The other problem I have is that SQL Developer is not exporting all the rows if a table is big. I tried to export a table with 23,000 rows and it only exported the first 55 rows.
    Any help will be appreciated.

    The quotes issue I am able to replicate and have logged a bug #6732587.
    I have also logged a bug for the number of rows; however, if you click Ctrl-End and then export, you'll get all the rows. Also, if you do not want to query back all the rows but want to export them all, in the Export dialog just click the "where" clause tab and then Apply. This will also bring back all the rows. This bug is not only for Loader, but for any export format.
    Sue
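    As a workaround for the quoted numbers (not from the thread, just a sketch with assumed table and file names), telling SQL*Loader that fields may be enclosed in double quotes makes it strip the quotes before the NUMBER conversion:
    -- Sketch: HELP_TABLE and help_table.dat are assumed names; the key clause is
    -- OPTIONALLY ENCLOSED BY '"', which removes the quotes around each field.
    LOAD DATA
    INFILE 'help_table.dat'
    INTO TABLE help_table
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    TRAILING NULLCOLS
    (HELP_NUMBER,
    TITLE,
    DESCRIPTION)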

  • Financial Reporting - Exporting Batch for Command Line Scheduling Issue

    Financial Reporting 11.1.2.2
    When I export a successfully run scheduled batch for command-line scheduling, the exported XML file only has the following information in it. I think it is missing a lot of information.
    Any ideas what I am doing wrong, if anything?
    <?xml version="1.0" encoding="UTF-8" standalone="no"?>
    <JOB_OBJECT OBJECT_ID="0">
    <DATA_SOURCE_USER_CREDENTIALS DS_PASSWD="" DS_USER_NAME=""/>
    <HR_USER_CREDENTIALS HR_PASSWD="" HR_USER_NAME=""/>
    <OUTPUT_OPTIONS>
    <HTML VALUE="No"/>
    <MHTM VALUE="No"/>
    <PDF EXPORT_PDF_FOLDER_LABEL="" VALUE="Yes"/>
    <SAVE_AS_SNAPSHOT VALUE="No"/>
    <PRINT VALUE="No"/>
    </OUTPUT_OPTIONS>
    </JOB_OBJECT>

    Hi again,
    I am not sure if you can extract the formatting to Excel... I also had issues.
    Can you please try to retrieve the report into Excel via Smart View? Check page 136 of http://docs.oracle.com/cd/E40248_01/epm.1112/smart_view_user.pdf
    Regards,
    Thanos

  • Export data to an SAP table using JCo

    Good Morning,
    Sorry for any grammar errors - I am using a translator.
    I need help to solve a problem: I have to integrate a Java system with SAP using SAP JCo.
    I need to send data (Export_Parameter) into a table in SAP, but I do not know this process.
    If someone has been through this and can help, thank you in advance.
    Note: I have not found documents that help me.
    Regards,
    Elton C.

    This is a forum for problems with the Java language and the default libraries that come with it. We can't help you with any third-party applications/libraries such as SAP and JCO.
    Sorry.

  • Export Data for Prime Infrastructure

    I started an export for Prime Infrastructure on my LMS 4.2.3 box several days ago and it is still running. There are approximately 3,600 devices being exported from LMS; I am guessing it should not take this long. I would like to kill the process, but I am not sure which one to stop.

    That would be LMS he'd need to restart, Afroz.
    Depending on the platform:
    Net stop crmdmgtd, then net start crmdmgtd (Windows server)
    Application stop LMS / application start LMS (soft or physical appliance)
    There is probably an individual process you could restart but I don't have it at my fingertips. I haven't exported one that big but usually it would only be a matter of minutes to complete. It's just a simple csv file.
    Sent from Cisco Technical Support iPad App

  • Export Data in 1.1 to XLS very slow

    I'm currently running with Oracle 10g. I have been having a problem running lengthy SQL and trying to export to XLS. It appears to lock when I select Export Data --> XLS; after letting it run for a long time it eventually returns with the Browse window, and then when I select the file location it again takes a long time before it returns. Do the multiple selects still occur in 1.1 as in the prior versions? Or is there a setup parameter that I can set to improve performance?
    Any assistance would be greatly appreciated.

    This is an issue with the way the export from SQL Developer works - it doesn't matter which format you are exporting to. SQL Developer executes the query multiple times when exporting.
    On top of the initial query, it appears to execute the query again before displaying the dialog (for the columns, and maybe sample data for the where tab?) and then again to actually export the query (in case you have added conditions on the export where tab).
    Try switching on tracing with a quick statement, run it and then export it. You need to look for the "select * from ( select columns from ( <your query> ) )" selects that are generated by the export.
    With long-running queries, I find that it is far quicker to query all the rows and then do a select all, copy and then paste into the spreadsheet. Unfortunately, you don't get column headings with this, but if your query takes 10 minutes to run, you save yourself 20 minutes at the cost of manually adding the headings.
    This has been a problem since 1.0 (see "More export issues/bugs"), but at least 1.1 has reduced the number of additional queries - there was an extra query on export in 1.0.
    Note that export also still fails with an ORA-972 on exporting a query like "select 'testing query execution on export' from dual" as part of this multiple execution on export.
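    To see those extra executions for yourself, a minimal tracing sketch (plain Oracle session tracing, run from the same SQL Developer connection) is:
    -- Enable SQL trace for the current session, run the query and the export,
    -- then disable tracing and inspect the trace file in USER_DUMP_DEST
    -- (for example with tkprof).
    ALTER SESSION SET SQL_TRACE = TRUE;
    -- ... run the query and perform the export here ...
    ALTER SESSION SET SQL_TRACE = FALSE;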

  • How to extract data for two particular members of the same dimension

    As per the requirement, I need to export data for certain members of a dimension. Let's say we need data for two account members, A and B, which are in the Account dimension but are not direct children. I need the data for all the available years too. Please suggest how my DATAEXPORT command should look.
    When I use an AND statement it is not working accordingly. Say I fix on years 2007 and 2009, but the output file comes out for 2009 and 2010.
    Something else happens when I fix on OPEX_31 and OPEX_32. The values come out not only for OPEX_31 and OPEX_32 but for many more accounts too.
    Here is my dataexport statement for your reference
    SET DATAEXPORTOPTIONS
    {
    DataExportLevel "ALL";
    DataExportColFormat ON;
    DataExportDimHeader ON;
    DataExportOverwriteFile ON;
    };
    FIX("LC","Total_Year","ESB1","2009","SIERRA","COSTCENTER_NA","CELLULAR_NA","OPEX_31",
    "January","February","March","April","May","June","July","August","September","October","November","December");
    DATAEXPORT "File" "     " "D:\exports\feb.txt";
    ENDFIX;
    I need data for OPEX_31 and OPEX_32 for all the available years starting from 2001 to 2025.
    Please suggest what modifications are needed to get the desired result.
    Thanks in advance

    Hi,
    There are a few different options you can use for fixing on the months and years.
    e.g. FIX(January:December)
    or FIX(@CHILDREN(YearTotal)) <- depends on what the parent of the months is
    The same goes for the years:
    FIX(2009:2025)
    or
    FIX(@CHILDREN(Year))
    If your Period dimension is dense, you can always use it as the column header, e.g. DataExportColHeader "Period", and then fix on the accounts you require (a sketch combining these suggestions follows below).
    Cheers
    John
    http://john-goodwin.blogspot.com/
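    Putting John's suggestions together, a rough sketch (member names not already mentioned in the thread, such as "Year" as the parent of the year members, are assumptions):
    SET DATAEXPORTOPTIONS
    {
    DataExportLevel "ALL";
    DataExportColFormat ON;
    DataExportDimHeader ON;
    DataExportOverwriteFile ON;
    /* per the reply: use the dense Period dimension as the column header */
    DataExportColHeader "Period";
    };
    FIX("LC","Total_Year","ESB1","SIERRA","COSTCENTER_NA","CELLULAR_NA",
        "OPEX_31","OPEX_32",      /* both accounts listed explicitly */
        @CHILDREN("Year"),        /* assumes 2001-2025 are children of a "Year" member */
        "January":"December")
    DATAEXPORT "File" "|" "D:\exports\feb.txt";
    ENDFIX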

  • Is there any way to export data from a calendar 5 instance to 6.3?

    I've run into yet another issue with my migration from calendar 5 to 6.3.
    It turns out that one of our satellite clinics has a very large amount of data stored on our older server. Right now we're having to put the migration on hold because there are several hundred events on the old server that are repeat events and/or events scheduled relatively far in the future. This issue, along with the possibility of double booking events/appointments while both servers are active, is a serious risk that could have detrimental effects on this site.
    There are a few issues that I'm not certain how to handle in this case. First of all, the old server's database is stored in schema 1, and changes in the LDAP directory structure from cal 5 vs. 6.3 made it impossible to migrate the old database and accounts. The new server is schema 2. So migrating the whole directory, especially at this point, seems rather unlikely.
    Is there any way that I can export data for specific accounts and calendars for the people at this site in a version-independent format and import it into the new server? If nothing else, we can schedule somebody to come in on one of the weekends and manually copy the data, but if possible I'd like to avoid devoting somebody to that for the several hours it would take to copy it by hand. Also, this would not be the preferred method, because human error could result in our clients being mis-scheduled, which would obviously be bad for business.
    I'd appreciate any ideas anybody may have on this matter.
    Thanks in advance.
    -Damon

    damo.gets wrote:
    Actually I guess I was missing the obvious method of simply exporting to XML and importing on the new calendar. For some reason I thought that the exports were incompatible between versions as well.
    Hmm... whilst this may have worked on the face of it, I do wonder whether everything has indeed been moved across and, more importantly, translated into the correct form, e.g. access controls, uids => uids@domain format.
    Is there anybody that can answer definitively whether or not any data will be lost by this procedure?
    There are simply too many variables involved in your proposed migration to provide any kind of 100% iron-clad guarantees. I would suggest you perform a thorough test migration of the data and then compare the ics/xml export from the ics5/6 systems to see if any of the data has changed. Also test whether access controls are still working (can a user who could previously edit another person's calendar on ics5 now do so with ics6?).
    Things to look out for are that meeting attendee information is kept, any 'fancy' characters (i.e. 8-bit characters) and formatting information are kept, the number of tasks/meetings is consistent between the two versions, and so forth.
    Regards,
    Shane.

  • Duplicate records in exported data

    I'm trying to export the inventory data with a wildcard (%) filter on the
    Workstation Name.
    If I run the same filter in a query from ConsoleOne, I don't see any
    duplicate records.
    If I run the data export, the exported data for some workstations will have
    a duplicate DN. Just the DN is duplicated, all the other fields are either
    empty or have some default value.
    I have also run the manual duplicate removal process and have tried
    deleting the records altogether using the InventoryRemoval service.
    Any other ideas?

    Dlee,
    It appears that in the past few days you have not received a response to your
    posting. That concerns us, and has triggered this automated reply.
    Has your problem been resolved? If not, you might try one of the following options:
    - Do a search of our knowledgebase at http://support.novell.com/search/kb_index.jsp
    - Check all of the other support tools and options available at
    http://support.novell.com.
    - You could also try posting your message again. Make sure it is posted in the
    correct newsgroup. (http://support.novell.com/forums)
    Be sure to read the forum FAQ about what to expect in the way of responses:
    http://support.novell.com/forums/faq_general.html
    If this is a reply to a duplicate posting, please ignore and accept our apologies
    and rest assured we will issue a stern reprimand to our posting bot.
    Good luck!
    Your Novell Product Support Forums Team
    http://support.novell.com/forums/

  • FM to calculate pricing data for a document for today's date

    Hi
    I would like to find a function module that will calculate pricing data for an offer in CRM 5.0 and store it in an internal table.
    It must not change the pricing data on the document.
    I am writing a module to compare the pricing data on the document with what pricing would return for today's date.
    Regards
    Radek

    not solved

  • Previous month end data for report

    Hi expert,
    I have to calculate previous month-end data for my report.
    Let's say the user selects 15 Oct; then he should be able to see 30 Sept data.
    I have a calendar prompt.
    Thanks,

    Hi,
    Use a presentation variable in the date prompt.
    Apply a SQL filter (convert to SQL) on the report as: date_column = TIMESTAMPADD(SQL_TSI_DAY, -DAYOFMONTH(date 'presentation_variable'), date 'presentation_variable')
    Refer to: How to get LAST_DAY in OBIEE
    Regards,
    Srikanth
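    A quick worked example of that filter, assuming the user picked 15-Oct-2010 so the presentation variable expands to that date:
    date_column = TIMESTAMPADD(SQL_TSI_DAY, -DAYOFMONTH(date '2010-10-15'), date '2010-10-15')
    DAYOFMONTH returns 15, and 15-Oct-2010 minus 15 days is 30-Sep-2010, i.e. the previous month end.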
