Export-data command in ACS CLI v 5.5

Hi,
I would like to ask which fields of the user database the 'export-data user' command exports.
In the TACACS web interface, under System Administration > Users > Authentication Settings, I can see that the Date Exceed field is populated,
and so are the password-lifetime settings (e.g. force a change if the password is not changed within xx days, or remind after xx days). Can this data be exported?
Also, whenever a password is renewed, is that recorded in one of the logs?
Thanks a lot.


Similar Messages

  • Exporting data

    Hi all
    I am trying to export a cube to a text file for archiving purposes. I am using Application Manager 6.5.3.
    I click Database > Export, select 'All data', tick the 'Export in Column Format' box and type in the path of a location on my C:\ drive. I then get the error
    "Ascii Backup:Failed to open [C:\xxxxxxx]."
    From a quick browse of the forums it appears that you can only export in this fashion to the actual server Essbase is installed on, so I tried just putting the filename in the Server File Name field, and that seems to work.
    However, being a relative Essbase novice, I'm not sure where the file actually goes on the server! Can anyone help? Ideally I would like a way to export cubes directly to my C:\ drive as txt files, but a simple workaround would be great as well.
    Thanks in advance
    Kryz

    Hi,
    The tech ref, I believe, has enough information to get you started: http://download.oracle.com/docs/cd/E10530_01/doc/epm.931/html_esb_techref/techref.htm
    I know it is for 9.3.1, but in the MaxL section there is an overview of what is new in each version, so you don't try to use a command that is not available in the version you are running.
    Have a look at the export data command; a quick MaxL sketch follows below.
    Cheers
    John
    http://john-goodwin.blogspot.com/
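    For reference, a minimal MaxL sketch of that command, run from a MaxL session after logging in (the app/db name Sample.Basic and the file name are placeholders, not taken from the thread):
    export database Sample.Basic all data in columns to data_file 'expcols.txt';
    When only a bare file name is given like this, the file is written on the Essbase server, I believe under the ARBORPATH\app directory, which is most likely where your earlier 'Server File Name' export ended up as well.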

  • How to export data with column headers in sql server 2008 with bcp command?

    Hi all,
    I want to know how to export data with column headers in SQL Server 2008 with the bcp command. I know how to do it with the Import and Export Wizard; when I
    export data with the bcp command the data is copied, but the column names do not come across.
    I am using the below query:-
    EXEC master..xp_cmdshell
    'BCP "SELECT  * FROM   [tempdb].[dbo].[VBAS_ErrorLog] " QUERYOUT "D:\Temp\SQLServer.log" -c -t , -T -S SERVER-A'
    Thanks,
    SAAD.
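    A hedged sketch of the usual workaround (the column names here are made up, since the real columns of VBAS_ErrorLog are not shown in the thread): bcp's QUERYOUT has no switch that emits headers, so a literal header row is UNION ALL'd onto the query, with every data column cast to a string so the union types match.
    EXEC master..xp_cmdshell
    'BCP "SELECT ''ErrorID'', ''ErrorMessage'' UNION ALL SELECT CAST(ErrorID AS varchar(20)), ErrorMessage FROM [tempdb].[dbo].[VBAS_ErrorLog]" QUERYOUT "D:\Temp\SQLServer.log" -c -t , -T -S SERVER-A'
    Strictly speaking, an ORDER BY on a helper sort column would be needed to guarantee the header row stays first.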

    Hi All,
    I have done as per your suggestion, but now I face the problem below: the PRINT statement shows the correct query, but EXEC master..xp_cmdshell @BCPCMD returns the error message shown underneath.
    DECLARE @BCPCMD nvarchar(4000)
    DECLARE @BCPCMD1 nvarchar(4000)
    DECLARE @BCPCMD2 nvarchar(4000)
    DECLARE @SQLEXPRESS varchar(50)
    DECLARE @filepath nvarchar(150), @SQLServer varchar(50)
    SET @filepath = N'"D:\Temp\LDH_SQLErrorlog_' + CAST(YEAR(GETDATE()) as varchar(4))
        + RIGHT('00' + CAST(MONTH(GETDATE()) as varchar(2)), 2)
        + RIGHT('00' + CAST(DAY(GETDATE()) as varchar(2)), 2) + '.log" '
    SET @SQLServer = (SELECT @@SERVERNAME)
    SELECT @BCPCMD1 = '''BCP "SELECT * FROM [tempdb].[dbo].[wErrorLog] " QUERYOUT '
    SELECT @BCPCMD2 = '-c -t , -T -S ' + @SQLServer + ''''
    SET @BCPCMD = @BCPCMD1 + @filepath + @BCPCMD2
    PRINT @BCPCMD
    -- printed output:
    -- 'BCP "SELECT * FROM [tempdb].[dbo].[wErrorLog] " QUERYOUT "D:\Temp\LDH_SQLErrorlog_20130313.log" -c -t , -T -S servername'
    EXEC master..xp_cmdshell @BCPCMD
    -- ''BCP' is not recognized as an internal or external command,
    -- operable program or batch file.
    -- NULL
    If I copy the printed output, as below, and execute it directly, it works fine. Could you please suggest what the problem is in the query above?
    EXEC master..xp_cmdshell
    'BCP "SELECT * FROM [tempdb].[dbo].[wErrorLog] " QUERYOUT "D:\Temp\LDH_SQLErrorlog_20130313.log" -c -t , -T -S servername'
    Thanks, SAAD.
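    A sketch of the likely cause (my reading, not a confirmed answer from the thread): @BCPCMD1 starts with an embedded single quote ('''BCP ...) and @BCPCMD2 ends with one, so the string handed to xp_cmdshell begins with a literal ' and cmd.exe tries to run a program called 'BCP, which is exactly what the error reports. Those wrapping quotes are only needed when the command is written as a literal inside EXEC; when it is passed as a variable they should be dropped:
    SELECT @BCPCMD1 = 'BCP "SELECT * FROM [tempdb].[dbo].[wErrorLog] " QUERYOUT '
    SELECT @BCPCMD2 = '-c -t , -T -S ' + @SQLServer
    SET @BCPCMD = @BCPCMD1 + @filepath + @BCPCMD2
    EXEC master..xp_cmdshell @BCPCMD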

  • Data Services 4.0 - Batch Job - Export Execution Command - Error

    Hi,
    I'm new to Data Services and tried to get started with this "how to":
    http://wiki.sdn.sap.com/wiki/display/BOBJ/HowToUseBusinessObjectsDataServicesinSAPBIstagingprocess
    The only difference is that I used BW 7.0 instead of BI.
    But I got stuck at the Export Execution Command. Every time I click "Export", I get a "java.lang.NullPointerException".
    (I can execute the job manually from the Designer and it completes successfully.)
    Any idea what I can do to export it successfully?
    Or is there a workaround so that I'll be able to initiate the process from the BW system?
    Thanks!

  • Export Data to Excel with commands API

    Hello, with the commands API we want to give our users the possibility to export data to Excel, but we don't want to give them access to the right mouse button, because they could change other things that we don't want them to. How can we do that? Thanks.

  • Problem with Import and Export Data Wizard

    Downloaded and installed SQL Server Express 2008 R2 today because I want to explore how Access interacts with SQL Server (using my home computer). I'm using Access 2010 (under Windows 7), so the 2008 version of SQL Server Express seemed to be the version
    to use.
    After a couple of false starts, installation appeared to go okay. After the installation, my Start menu listed Microsoft SQL Server 2008 and Microsoft SQL Server 2008 R2. The latter listed Import and Export Data (64-bit). When I clicked that, the first Import
    and Export Data Wizard page was displayed. I wasn't ready at that time to explore the wizard, so I closed it. An hour or so later I again attempted to open the Import and Export Data Wizard. This time the wizard didn't open. Instead this error message was
    displayed: "The SSIS Runtime object could not be created. Verify that DTS.dll is available and registered."
    I found DTS.dll on my computer at C:\Program Files\Microsoft SQL Server\100\DTS\Binn, so the file is available, but don't know whether it is registered.
    How can I correct this problem?

    First, can you please post all log file errors?
    >> I can't really give you a solution or a specific recommendation since I have not seen this error myself, but at your own risk you can try:
    1. You may try to just register 'dts.dll' using regsvr32.exe, but this error may indicate a bigger problem with the setup.
    If you are running 64-bit SQL Server, then try running this at the command prompt: %windir%\syswow64\regsvr32 "%ProgramFiles(x86)%\Microsoft SQL Server\90\dts\binn\dts.dll"
    2. You can try reinstalling from scratch (in this case you have to make sure that you uninstall everything first).
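    A hedged adaptation of step 1 (untested; it simply reuses the path where you said DTS.dll was found, which is the 64-bit SQL Server 2008 R2 location, so the plain 64-bit regsvr32 is used rather than the syswow64 one):
    regsvr32 "C:\Program Files\Microsoft SQL Server\100\DTS\Binn\DTS.dll"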

  • Help needed with Export Data Pump using API

    Hi All,
    I am trying to do an export using the Data Pump API.
    While the export and the import work fine from the command line, they fail with the API.
    This is the command-line program:
    expdp pxperf/dba@APPN QUERY=dev_pool_data:\"WHERE TIME_NUM > 1204884480100\" DUMPFILE=EXP_DEV.dmp tables=PXPERF.dev_pool_data
    Could you help me with how to achieve the same as above with the Oracle Data Pump API?
    DECLARE
      h1 NUMBER;
    BEGIN
      h1 := dbms_datapump.open('EXPORT','TABLE',NULL,'DP_EXAMPLE10','LATEST');
      dbms_datapump.add_file(h1,'example3.dmp','DATA_PUMP_TEST',NULL,1);
      dbms_datapump.add_file(h1,'example3_dump.log','DATA_PUMP_TEST',NULL,3);
      dbms_datapump.metadata_filter(h1,'NAME_LIST','(''DEV_POOL_DATA'')');
    END;
    Also, in the API I want to know how to export and import multiple tables (selected tables only) using one single criterion such as "WHERE TIME_NUM > 1204884480100".

    Yes, I have read the Oracle doc.
    I was able to proceed as below, but it gives an error.
    ============================================================
    SQL> SET SERVEROUTPUT ON SIZE 1000000
    SQL> DECLARE
    2 l_dp_handle NUMBER;
    3 l_last_job_state VARCHAR2(30) := 'UNDEFINED';
    4 l_job_state VARCHAR2(30) := 'UNDEFINED';
    5 l_sts KU$_STATUS;
    6 BEGIN
    7 l_dp_handle := DBMS_DATAPUMP.open(
    8 operation => 'EXPORT',
    9 job_mode => 'TABLE',
    10 remote_link => NULL,
    11 job_name => '1835_XP_EXPORT',
    12 version => 'LATEST');
    13
    14 DBMS_DATAPUMP.add_file(
    15 handle => l_dp_handle,
    16 filename => 'x1835_XP_EXPORT.dmp',
    17 directory => 'DATA_PUMP_DIR');
    18
    19 DBMS_DATAPUMP.add_file(
    20 handle => l_dp_handle,
    21 filename => 'x1835_XP_EXPORT.log',
    22 directory => 'DATA_PUMP_DIR',
    23 filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);
    24
    25 DBMS_DATAPUMP.data_filter(
    26 handle => l_dp_handle,
    27 name => 'SUBQUERY',
    28 value => '(where "XP_TIME_NUM > 1204884480100")',
    29 table_name => 'ldev_perf_data',
    30 schema_name => 'XPSLPERF'
    31 );
    32
    33 DBMS_DATAPUMP.start_job(l_dp_handle);
    34
    35 DBMS_DATAPUMP.detach(l_dp_handle);
    36 END;
    37 /
    DECLARE
    ERROR at line 1:
    ORA-39001: invalid argument value
    ORA-06512: at "SYS.DBMS_SYS_ERROR", line 79
    ORA-06512: at "SYS.DBMS_DATAPUMP", line 3043
    ORA-06512: at "SYS.DBMS_DATAPUMP", line 3688
    ORA-06512: at line 25
    ============================================================
    I have a table called LDEV_PERF_DATA and it is in schema XPSLPERF.
    The line value => '(where "XP_TIME_NUM > 1204884480100")' above is the condition by which I want to filter the data.
    However, the below snippet works fine.
    ============================================================
    SET SERVEROUTPUT ON SIZE 1000000
    DECLARE
    l_dp_handle NUMBER;
    l_last_job_state VARCHAR2(30) := 'UNDEFINED';
    l_job_state VARCHAR2(30) := 'UNDEFINED';
    l_sts KU$_STATUS;
    BEGIN
    l_dp_handle := DBMS_DATAPUMP.open(
    operation => 'EXPORT',
    job_mode => 'SCHEMA',
    remote_link => NULL,
    job_name => 'ldev_may20',
    version => 'LATEST');
    DBMS_DATAPUMP.add_file(
    handle => l_dp_handle,
    filename => 'ldev_may20.dmp',
    directory => 'DATA_PUMP_DIR');
    DBMS_DATAPUMP.add_file(
    handle => l_dp_handle,
    filename => 'ldev_may20.log',
    directory => 'DATA_PUMP_DIR',
    filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);
    DBMS_DATAPUMP.start_job(l_dp_handle);
    DBMS_DATAPUMP.detach(l_dp_handle);
    END;
    ============================================================
    I don't want to export everything as above; I want to export data based on some conditions, and only from selected tables.
    Any help is highly appreciated.
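    A hedged sketch of a TABLE-mode job with a row filter, using the table and schema named in this thread (LDEV_PERF_DATA in XPSLPERF) but otherwise assuming things the thread does not show: that a DATA_PUMP_DIR directory object exists and that the job name is free. The object names are passed in uppercase, the SUBQUERY value is a plain WHERE clause rather than a double-quoted expression, and the NAME_EXPR filter is where additional tables would be listed to export several selected tables in one job:
    DECLARE
      h1 NUMBER;
    BEGIN
      h1 := DBMS_DATAPUMP.open(operation => 'EXPORT', job_mode => 'TABLE',
                               remote_link => NULL, job_name => 'XP_EXPORT_FILTERED',
                               version => 'LATEST');
      DBMS_DATAPUMP.add_file(h1, 'xp_export_filtered.dmp', 'DATA_PUMP_DIR');
      DBMS_DATAPUMP.add_file(h1, 'xp_export_filtered.log', 'DATA_PUMP_DIR',
                             filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);
      -- limit the job to the owning schema and the selected table(s)
      DBMS_DATAPUMP.metadata_filter(h1, 'SCHEMA_EXPR', 'IN (''XPSLPERF'')');
      DBMS_DATAPUMP.metadata_filter(h1, 'NAME_EXPR', 'IN (''LDEV_PERF_DATA'')');
      -- row filter for that table; omit table_name to apply it to every table in the job
      DBMS_DATAPUMP.data_filter(h1, 'SUBQUERY', 'WHERE XP_TIME_NUM > 1204884480100',
                                'LDEV_PERF_DATA', 'XPSLPERF');
      DBMS_DATAPUMP.start_job(h1);
      DBMS_DATAPUMP.detach(h1);
    END;
    /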

  • DCOM error "1260" on Windows 2008 R2 terminal server when SAP exports data into Office 2007 (Excel or Word)

    Hi all,
    We are experiencing an issue which happens at random whenever a user tries to export data from SAP into Excel or Word. Excel or Word is started but remains empty. We then see the errors below in the event log. The problem is very random:
    sometimes all they have to do is wait 10 minutes and the next try will be successful, without even logging off and on. Could someone tell me what the error "1260" means? Thanks for your help.
    Pete
    Log Name:      System
    Source:        Microsoft-Windows-DistributedCOM
    Date:          7/23/2013 4:07:36 PM
    Event ID:      10000
    Task Category: None
    Level:         Error
    Keywords:      Classic
    User:          N/A
    Computer:      Servername.domain.com
    Description:
    Unable to start a DCOM Server: {00020906-0000-0000-C000-000000000046}. The error:
    "1260"
    Happened while starting this command:
    "C:\Program Files (x86)\Microsoft Office\Office12\WINWORD.EXE" -Embedding
    Log Name:      System
    Source:        Microsoft-Windows-DistributedCOM
    Date:          7/23/2013 3:54:34 PM
    Event ID:      10000
    Task Category: None
    Level:         Error
    Keywords:      Classic
    User:          N/A
    Computer:      Servername.domain.com
    Description:
    Unable to start a DCOM Server: {00024500-0000-0000-C000-000000000046}. The error:
    "1260"
    Happened while starting this command:
    "C:\Program Files (x86)\Microsoft Office\Office12\EXCEL.EXE" /automation -Embedding

    Hi,
    Based on my research, please try the following:
    Click Start, click Run, type regedit in the Open box, and then click OK.
    Locate the following registry subkey:
    HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows NT\CurrentVersion\Winlogon\Notify\termsrv
    Under this subkey, add the following registry entries:
    Name      Type      Value data
    Logoff    REG_SZ    TSEventLogoff
    Logon     REG_SZ    TSEventLogon
    Restart the Terminal Server computer.
    Note: Serious problems might occur if you modify the registry incorrectly. Therefore, make sure that you follow these steps carefully. For added protection, back up the registry
    before you modify it. Then, you can restore the registry if a problem occurs. For more information about how to back up and restore the registry, please refer to
    How to back up and restore the registry in Windows.
    Hope this helps.
    Best Regards
    Jeremy Wu

  • Export data in a particular order

    Hi all,
    I want to export data in a particular dimension order with the DATAEXPORT command.
    I use the script below:
    SET DATAEXPORTOPTIONS
    {
    DataExportLevel "LEVEL0";
    DataExportDimHeader ON;
    DataExportColFormat ON;
    DataExportOverwriteFile ON;
    };
    FIX ("FY11","Submitted","Budget",@RELATIVE("Entity",0),@CHILDREN("Period"))
    DATAEXPORT "File" "," "file.txt";
    ENDFIX;
    It always gives the file data in the below order
    "Entity","Allocations","Projects","CostCentres","FinancialYear","Version","Scenario","Account","Period"
    I want in the order of Scenario,Version,Year,Entity,CostCenter,Allocation,Project,Account,Period.
    Is it possible?
    Many thanks.

    The only way to do it would be to change the order of the sparse dimensions in your outline; the export writes data in outline order.

  • How to select the "in columns" dimension on export all Command ?

    Hello,
    I'm trying to export data from a V6.5 Essbase database to a 9.2 one.
    The easiest and fastest way I found is to use 'Export database all data in columns' on the 6.5 cube and load the data with a rule file into the 9.2 one (the export is done with the MaxL export database command).
    The export-all command with the 'in columns' option exports all data and puts one dimension in columns. Previously it was always the Time dimension (M01 ... M12), and everything worked fine.
    But someone changed the order of the dimensions in the V6.5 Essbase database. Now there is another dimension in columns, one which changes more often and which has a lot of members (whereas my Time dimension only has 12).
    So, do you know how to select the dimension that goes in columns when using the export-database-all-data function? I do not see any option for it in MaxL. Maybe there is a rule to follow in the dimension order?
    Do you have any idea about it?
    Thanks for your help
    P.S.: I'd prefer to avoid workarounds such as Report Scripts or calc scripts (they are too slow).

    Why not use Essbase's native export format, bring it over onto your new environment, and reimport? Change to your heart's content on the target.
    Yes, the dimensionality (names, order, members, etc.) will have to be the same, but this approach, not the columnar export, is the fastest and most efficient way.
    Regards,
    Cameron Lackpour
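    A minimal sketch of that suggestion, with Sample.Basic standing in for your real app/db and assuming the 9.2 outline really does match the 6.5 one: run the export on the 6.5 side exactly as you do today, just without the column-format option, since a native (non-columnar) export does not care which dimension sits in columns. The native-format file can then be loaded on the 9.2 target without a rules file (with the export file copied onto the 9.2 server first), e.g. in MaxL:
    import database Sample.Basic data from server data_file 'native_export.txt'
        on error write to 'native_export.err';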

  • Exporting Data from one Server to Another server w/ Version Enabled Tables

    Hi,
    I'm currently having a problem with regard to exporting data to another server. This is the scenario:
    The source server is a production server with all of the tables in the schema version-enabled.
    The destination server is a test server.
    I exported data from the production server using the EXP command. Then on my test server I imported the data using the IMP command (I had already created the tablespace and user for the schema).
    The import is successful on my test server, but when I execute my queries, no rows are returned.
    I checked my _LT tables and they contain my data, but when I query the view created when versioning was enabled, no results are returned.
    Am I missing something in how I exported and imported my schema? Should I have included the WMSYS schema when I created the dump file?
    Thanks in advance.

    Hi Stefan,
    we tried using Export and Import using Data Pump.
    expdp system/password@orcl full=y directory=dmpdir2 dumpfile=FULL_DB.dmp
    impdp system/password@orcl full=y table_exists_action=truncate directory=dmpdir2 dumpfile=FULL_DB.dmp
    Still the same result as using exp and imp. _LT tables have data but when you query using the View, no results are found.

  • Export Data- csv Extremely Slow

    I am using SQL Developer 1.5.1 Build MAIN-5440 on Mac OS X 10.5.5 and the Export Data=>csv command is taking 20 minutes to show the file dialogue window. The query takes the expected amount of time and once I choose to save the file the remaining records are retrieved and the file is saved quickly. Anyone know of a workaround for this bug?
    Thanks,
    Jason

    As discussed above, there is a bug in the SQL Developer export (since the initial 1.0 release from what I have seen) where the query is executed again (just fetching the first 50 records - not all of them) before the export dialog is displayed. This means that if your statement took 10 minutes to query the first 50 records when you first executed it, then it will take another 10 minutes between selecting the export option and the dialog being displayed. If your statement took a second to query the first 50 records, then it will take only another second between selecting the export option and the dialog being displayed.
    Ordinarily, I would suggest that you upgrade to the current version (1.5.3) from 1.2, but 1.5.3 has other problems with the export sometimes not working. The export problems in 1.5.3 are not in 1.5.1, but even 1.5.1 has the bug where the export reruns the query before displaying the export dialog.
    theFurryOne

  • Best way to export data with r.t. prompts and have dense dim mbrs on rows?

    Hi All-
    What is the best way to export data with Run time prompts out of Essbase?
    One thought was to use Business Rules with run time variables and DATAEXPORT command, but I came across at least one limitation where I cannot have months (part of dense Time Periods dimension) on rows.
    I have only two dense dimensions, Accounts and Time Periods, and I need both of these on rows. This would come in handy when the user enters a start and end month and year for the data to be exported, e.g. if the start period is Feb 2010 and the end is Jan 2011, I get data for all months in 2010 and 2011. Currently the export looks like this:
    "CORP1","0173","FY10","Working","Budget","Local","HSP_InputValue","402000",14202.24,14341.62,14560,13557.54,11711.92,10261.58,12540.31,15307.83,16232.88,17054.62,18121.76,18236
    "CORP1","0173","FY10","Working","Budget","Local","HSP_InputValue","403000",19241,21372.84,21008.4,18952.75,23442.13,19938.18,22689.61,23729.29,22807.48,23365,23915.3,24253
    "CORP1","0173","FY11","Working","Budget","Local","HSP_InputValue","404000",21364,22970.37,23186,27302,25144.38,27847.91,27632.11,29007.39,24749.42,27183.39,26599,27112.79
    where ideally I would need to get the following:
    "CORP1","0173","FY10","Working","Budget","Local","HSP_InputValue","402000","Feb",14341.62
    "CORP1","0173","FY10","Working","Budget","Local","HSP_InputValue","402000","Mar",14560
    "CORP1","0173","FY10","Working","Budget","Local","HSP_InputValue","402000","Apr",13557.54
    "CORP1","0173","FY10","Working","Budget","Local","HSP_InputValue","402000","May",11711.92
    "CORP1","0173","FY10","Working","Budget","Local","HSP_InputValue","402000","Jun",10261.58
    "CORP1","0173","FY10","Working","Budget","Local","HSP_InputValue","402000","Jul",12540.31
    "CORP1","0173","FY10","Working","Budget","Local","HSP_InputValue","402000","Aug",15307.83
    "CORP1","0173","FY10","Working","Budget","Local","HSP_InputValue","402000","Sep",16232.88
    "CORP1","0173","FY10","Working","Budget","Local","HSP_InputValue","402000","Oct",17054.62
    "CORP1","0173","FY10","Working","Budget","Local","HSP_InputValue","402000","Nov",18121.76
    "CORP1","0173","FY10","Working","Budget","Local","HSP_InputValue","402000","Dec",18236
    "CORP1","0173","FY10","Working","Budget","Local","HSP_InputValue","403000","Feb",21372.84
    "CORP1","0173","FY10","Working","Budget","Local","HSP_InputValue","403000","Mar",21008.4,
    "CORP1","0173","FY10","Working","Budget","Local","HSP_InputValue","403000","Apr",18952.75
    "CORP1","0173","FY10","Working","Budget","Local","HSP_InputValue","403000","May",23442.13
    "CORP1","0173","FY10","Working","Budget","Local","HSP_InputValue","403000","Jun",19938.18
    "CORP1","0173","FY10","Working","Budget","Local","HSP_InputValue","403000","Jul",22689.61
    "CORP1","0173","FY10","Working","Budget","Local","HSP_InputValue","403000","Aug",23729.29
    "CORP1","0173","FY10","Working","Budget","Local","HSP_InputValue","403000","Sep",22807.48
    "CORP1","0173","FY10","Working","Budget","Local","HSP_InputValue","403000","Oct",23365
    "CORP1","0173","FY10","Working","Budget","Local","HSP_InputValue","403000","Nov",23915.3
    "CORP1","0173","FY10","Working","Budget","Local","HSP_InputValue","403000","Dec",24253
    "CORP1","0173","FY11","Working","Budget","Local","HSP_InputValue","404000","Jan",21364
    Thank you in advance for any tips.

    Have a read of the following post: export data to sql
    It may give you a further option.
    Cheers
    John
    http://john-goodwin.blogspot.com/

  • Export Data Using Escape Character

    Hi All
    I have a requirement where I need to export data from Oracle with an escape character.
    E.g. I am using delimiter 237 (í), and if the same character is present in the data it should be escaped with an escape character, e.g. /.
    Once this file is created I need to load it into a Netezza database, which supports escape characters.
    Data in oracle table
    FirstName     Lastname     Designation
    abc     xyz     mnz
    def     ghío     pqr
    Data should be exported like below
    FirstnameíLastnameíDesignation
    abcíxyzímnz
    defígh/íoípqr
    Thanks.

    943994 wrote:
    Thanks for the reply. I am new to Oracle and I am not able to find any command for exporting data in Oracle. I know we can do it manually using a select statement, but in that case we need to replace this delimiter with the escape character plus delimiter for all char fields.
    In Netezza we can do that directly, without this manual step. Please see the example below and let me know if any such thing is present in Oracle.
    SQL> CREATE EXTERNAL TABLE '/temp/test.csv' USING (REMOTESOURCE 'ODBC' DELIMITER 236 DATESTYLE 'YMD' DATEDELIM '-' TIMESTYLE '24HOUR' TIMEDELIM ':' MAXERRORS 0 ESCAPECHAR '\' NULLVALUE '' ) AS SELECT * FROM temp;
    The .csv file created by the above command:
    abcíxyzímnz
    defígh/íoípqr
    Thanks
    http://docs.oracle.com/cd/E11882_01/server.112/e22490/et_params.htm#sthref1293
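    For what it's worth, a hedged plain-SQL alternative to the external-table approach in the linked document (the table and column names simply mirror the example above, and CHR(237) assumes a single-byte character set in which í is character 237): escape any embedded delimiter first, then concatenate the columns with the delimiter, and spool the result from SQL*Plus or write it with UTL_FILE.
    SELECT REPLACE(firstname,   CHR(237), '/' || CHR(237)) || CHR(237) ||
           REPLACE(lastname,    CHR(237), '/' || CHR(237)) || CHR(237) ||
           REPLACE(designation, CHR(237), '/' || CHR(237)) AS line
    FROM   my_table;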

  • Exporting Data using exp not working

    I am trying to export data from an application, and the vendor gave me a par file to help. Here is my problem.
    This is what I have tried.
    exp \"sys/****@Fieldmgr as SYSDBA\" parfile=fm_export.par
    I get this...
    About to export specified tables via Conventional Path ...
    EXP-00011: SYS.MATSRC does not exist
    EXP-00011: SYS.FIELDS does not exist
    Export terminated successfully with warnings.
    It is prefixing SYS. before everything, and that seems to be the problem.
    This is how I have modified the par file.
    FILE="D:\exported\fm43.dmp"
    LOG="D:\exported\fm43_log.log"
    BUFFER=100000
    ROWS=yes
    GRANTS=yes
    INDEXES=yes
    TABLES =(MATSRC,
    FIELDS,
    SETTINGS,
    CHARLARG)
    What is causing the command to prefix sys before the table names and how can I prevent this?
    Thanks,
    Jim

    Too much is missing from your posts.
    1. Can you list both the export.par file and the import.par file? This will help clear things up.
    1) Create a new database (schema & instance) in 10g.
    Not sure what they mean by schema & instance. I've created a new database called Fieldmgr.
    This means to create the database and the users, specifically the user where you want the data to live:
    grant connect,resource to user1_name identified by user1_password;
    2) Modify the export par file as appropriate for your Oracle environment.
    I prefixed fmdb. to the tablenames.
    You say that you prefixed fmdb to the tablenames. I'm not sure why you did this. The dumpfile has names tab1, tab2, tab3.
    If you said tables=fmdb_tab1, fmdb_tab2, fmdb_tab3, these are not in the dumpfile. The imp utility will look for tables called fmdb_tab1, etc. and won't find any. This is probably your biggest problem. Don't change the tablenames in the parfile. If you exported
    tables=tab1, tab2, tab3
    you need to import
    tables=tab1, tab2, tab3
    or a subset like:
    tables=tab2
    3) Use the export par file on the disc to create a dump of the 9i database.
    This appeared to work and I got a 63 MB data file, which seems about right, and there were no warnings or errors.
    4) Modify the import par file for your environment.
    Not sure what to do here other than point to the files. I've already explained what they said here, which seems incorrect from your post.
    According to the application vendor I should replace these variables in the import file.
    REM * <fm_source_db> - database the dmp file was from *
    REM * <fm_dest_db> - database to load the dmp file into
    The above 2 parameters are not valid parameters for imp. I would remove them from the parfile.
    FROMUSER=<fm_source_db>
    TOUSER=<fm_dest_db>
    These 2 parameters are for remapping a schema name. If the source database has a user called source_user1 and you wanted these imported into the target database as target_user5, then you would use:
    FROMUSER=SOURCE_USER1
    TOUSER=TARGET_USER5
    This would mean that in step 1 above you would use
    grant connect,resource to target_user5 identified by my_password;
    5) Run the import par to move the 9i dump into the new 10g database.
    This never works. The commands I am trying are:
    imp \"sys/****@fieldmgr as sysdba\" parfile="filename.par"
    imp \"sys/**** as sysdba\" parfile="filename.par"
    This didn't work because you renamed the table names: you appended fmdb to the tables. These tables are not in the dumpfile, so that is why you get the table-not-found error.
    Dean
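    On the original export warnings (EXP-00011: SYS.MATSRC does not exist, etc.), a hedged observation not taken from the thread: because the export connects as SYS, an unqualified TABLES list is resolved against the SYS schema. Qualifying each table with its owning schema in the export par file avoids that; FMDB below is only a placeholder for whichever schema actually owns the application tables:
    FILE="D:\exported\fm43.dmp"
    LOG="D:\exported\fm43_log.log"
    BUFFER=100000
    ROWS=yes
    GRANTS=yes
    INDEXES=yes
    TABLES=(FMDB.MATSRC,FMDB.FIELDS,FMDB.SETTINGS,FMDB.CHARLARG)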
