Oracle Imp/exp
hi,
I am trying to import emp.dmp, which I created with the exp utility (exp vtprod/vtprod file=emp.dmp tables=(EMP) log=exp-emp.log),
by using the following command
imp vtprod/vtprod fromuser=vtprod touser=vtprod file=emp.dmp tables=(EMP) ignore=y commit=y log=imp-emp.log
Question: the data is getting appended to the EMP table. I want to truncate the table and then insert; I don't want to append the data. What needs to be done during the import?
Thanks in Advance
Hi,
The conventional Import utility has no option that will first truncate the table and then reinsert the rows. You need to do it manually before invoking Import.
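A minimal sketch of the manual approach (reusing the credentials and names from the original command; a sketch only, not tested here):

```shell
# empty the table first, then re-import the rows with IGNORE=Y
echo "TRUNCATE TABLE emp;" | sqlplus -s vtprod/vtprod
imp vtprod/vtprod fromuser=vtprod touser=vtprod file=emp.dmp tables=(EMP) ignore=y commit=y log=imp-emp.log
```

IGNORE=Y is still needed so that imp loads rows into the existing (now empty) table instead of failing on the CREATE TABLE step.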
You can check this with the Import help:
You can let Import prompt you for parameters by entering the IMP
command followed by your username/password:
Example: IMP SCOTT/TIGER
Or, you can control how Import runs by entering the IMP command followed
by various arguments. To specify parameters, you use keywords:
Format: IMP KEYWORD=value or KEYWORD=(value1,value2,...,valueN)
Example: IMP SCOTT/TIGER IGNORE=Y TABLES=(EMP,DEPT) FULL=N
or TABLES=(T1:P1,T1:P2), if T1 is partitioned table
USERID must be the first parameter on the command line.
Keyword Description (Default) Keyword Description (Default)
USERID username/password FULL import entire file (N)
BUFFER size of data buffer FROMUSER list of owner usernames
FILE input files (EXPDAT.DMP) TOUSER list of usernames
SHOW just list file contents (N) TABLES list of table names
IGNORE ignore create errors (N) RECORDLENGTH length of IO record
GRANTS import grants (Y) INCTYPE incremental import type
INDEXES import indexes (Y) COMMIT commit array insert (N)
ROWS import data rows (Y) PARFILE parameter filename
LOG log file of screen output CONSTRAINTS import constraints (Y)
DESTROY overwrite tablespace data file (N)
INDEXFILE write table/index info to specified file
SKIP_UNUSABLE_INDEXES skip maintenance of unusable indexes (N)
FEEDBACK display progress every x rows(0)
TOID_NOVALIDATE skip validation of specified type ids
FILESIZE maximum size of each dump file
STATISTICS import precomputed statistics (always)
RESUMABLE suspend when a space related error is encountered(N)
RESUMABLE_NAME text string used to identify resumable statement
RESUMABLE_TIMEOUT wait time for RESUMABLE
COMPILE compile procedures, packages, and functions (Y)
STREAMS_CONFIGURATION import streams general metadata (Y)
STREAMS_INSTANTIATION import streams instantiation metadata (N)
The following keywords only apply to transportable tablespaces
TRANSPORT_TABLESPACE import transportable tablespace metadata (N)
TABLESPACES tablespaces to be transported into database
DATAFILES datafiles to be transported into database
TTS_OWNERS users that own data in the transportable tablespace set
Import terminated successfully without warnings.
Note: DESTROY=Y is not for this purpose; it is basically for the datafile reuse option.
This feature (truncating the table on import) was introduced with Data Pump import in Oracle 10g.
You can go through this:
http://www.oracle-base.com/articles/10g/OracleDataPump10g.php
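With Data Pump the truncate behaviour is a single parameter; a sketch for the EMP case above (the directory object and file names are assumptions):

```shell
# expdp/impdp equivalent of the exp/imp pair discussed above
expdp vtprod/vtprod tables=EMP directory=DATA_PUMP_DIR dumpfile=emp.dmp logfile=exp_emp.log
impdp vtprod/vtprod tables=EMP directory=DATA_PUMP_DIR dumpfile=emp.dmp table_exists_action=truncate logfile=imp_emp.log
```

TABLE_EXISTS_ACTION also accepts SKIP, APPEND and REPLACE; APPEND matches the old imp IGNORE=Y behaviour.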
Regards,
Navneet
Similar Messages
-
imp/exp utility on Ubuntu 10.04 with Oracle Instant Client 10
Hi,
I have a problem with the imp/exp utilities on Ubuntu 10.04.
I have installed the oracle instant client 10, and I have followed this thread
[http://download.oracle.com/docs/cd/B25329_01/doc/admin.102/b25107/impexp.htm#CHDDBGDF]
So, I've set env variable in my .bashrc
ORACLE_HOME=/home/lucia/Programmi/instantclient10_1
export ORACLE_HOME
LD_LIBRARY_PATH=/home/lucia/Programmi/instantclient10_1:${LD_LIBRARY_PATH}
export LD_LIBRARY_PATH
PATH=/home/lucia/Programmi/instantclient10_1:${PATH}
export PATH
SQLPATH=/home/lucia/Programmi/instantclient10_1:${SQLPATH}
export SQLPATH
SQL*Plus works fine, but when I try to use the exp/imp utilities:
exp DB_USER/DB_PW@DB_SCHEMA owner=DB_USER buffer=300000 LOG=DB_USER.log FILE=DB_USER.dmp compress=n
I have the following error:
No command 'exp' found, did you mean:
Command 'xep' from package 'pvm-examples' (universe)
Command 'ex' from package 'vim' (main)
Command 'ex' from package 'nvi' (universe)
Command 'ex' from package 'vim-nox' (universe)
Command 'ex' from package 'vim-gnome' (main)
Command 'ex' from package 'vim-tiny' (main)
Command 'ex' from package 'vim-gtk' (universe)
Command 'axp' from package 'axp' (universe)
Command 'expr' from package 'coreutils' (main)
Command 'expn' from package 'sendmail-base' (universe)
Command 'epp' from package 'e16' (universe)
exp: command not found
(I cannot use data pump exp)
Any suggestions?
thanks in advance.
Lucia
root@laptop:~# /tmp/CVU_11.2.0.1.0_lucia/runfixup.sh
Response file being used is :/tmp/CVU_11.2.0.1.0_lucia/fixup.response
Enable file being used is :/tmp/CVU_11.2.0.1.0_lucia/fixup.enable
Log file location: /tmp/CVU_11.2.0.1.0_lucia/orarun.log
[: 845: true: unexpected operator
[: 860: unexpected operator
Any suggestions?
thanks
Lucia
Hello,
I had the same problem. It was related to the shell script interpreter being used: it is really a bash script that asks to be interpreted as sh. For the impatient, the solution was to change the first line of the script orarun.sh to read:
#!/bin/bash
Explanation:
1. The script runfixup.sh calls orarun.sh. The latter is the one that fails starting at line 845.
2. The script is really a bash script, as can be seen from the if syntax used. For example, at line 191 we see the double equals sign ==:
if [ "`echo $SET_KERNEL_PARAMETERS | tr A-Z a-z`" == "true" ]
3. The bash man page says:
string1 == string2
string1 = string2
True if the strings are equal. = should be used with the test command for POSIX conformance.
string1 != string2
True if the strings are not equal.
(End of man page excerpt)
Notice in particular that for a bash script to be POSIX-compliant it should avoid == and just use =.
4. orarun.sh has on its first line #!/bin/sh. On my Ubuntu system, sh is a softlink to dash and the dash man page says:
s1 = s2 True if the strings s1 and s2 are identical.
s1 != s2 True if the strings s1 and s2 are not identical.
5. So this script uses bash syntax but is interpreted by dash. It is possible that this worked fine for the Oracle developer if on their system sh was softlinked to bash (which is often the case). Apparently, one should then be very careful not to use idioms that are not POSIX compliant.
6. Another solution is to change the /bin/sh softlink from dash to bash. Personally, I was less comfortable doing this (the possible repercussions were wider than the proposed solution above).
After making the change, the script ran perfectly. I think the best solution is for Oracle to update the script to be POSIX compliant or to identify itself as bash.
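The difference Mark describes can be reproduced directly from the command line (a minimal sketch):

```shell
# bash accepts the non-POSIX '==' inside [ ]:
bash -c '[ "x" == "x" ] && echo bash-ok'
# the POSIX form with a single '=' works under any sh implementation:
sh -c '[ "x" = "x" ] && echo sh-ok'
# on Ubuntu, where /bin/sh is dash, the '==' form instead dies with
# "[: unexpected operator" - exactly the error orarun.sh produced
```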
Hope that helped,
Mark Gaber
Edited by: user1112514 on Jun 30, 2010 3:28 AM -
Imp/exp ORA-12899: value too large for column
imp/exp ORA-12899: value too large for column
source :
os: linux as 4 update4
.bash_profile NLS_LANG=AMERICAN_AMERICA.US7ASCII
to run exp: exp bill/admin001 file=bill0518.dmp bill rows=y
oracle: 10.2.1
NLS_LANGUAGE AMERICAN
NLS_TERRITORY AMERICA
NLS_CHARACTERSET US7ASCII
target :
os: linux as 4 update4
.bash_profile NLS_LANG=AMERICAN_AMERICA.AL32UTF8
to run:
imp bill/admin001 file=bill0518.dmp
oracle: 10.2.1
NLS_LANGUAGE AMERICAN
NLS_TERRITORY AMERICA
NLS_CHARACTERSET AL32UTF8
imp log
Import: Release 10.2.0.1.0 - Production on Wed May 16 14:57:59 2007
Copyright (c) 1982, 2005, Oracle. All rights reserved.
Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - Production
With the Partitioning, Real Application Clusters, OLAP and Data Mining options
Export file created by EXPORT:V10.02.01 via conventional path
import done in US7ASCII character set and AL16UTF16 NCHAR character set
import server uses AL32UTF8 character set (possible charset conversion)
export client uses AL32UTF8 character set (possible charset conversion)
. importing BILL's objects into BILL
. . importing table "MY_SESSION" 44 rows imported
. . importing table "T1"
IMP-00019: row rejected due to ORACLE error 12899
IMP-00003: ORACLE error 12899 encountered
ORA-12899: value too large for column "BILL"."T1"."NAME" (actual: 62, maximum: 50)
Column 1 1
Column 2 中国人. 0 rows imported
Import terminated successfully with warnings.
Yes, it's probably due to the different character sets.
A way around it is to change the setup on the new database to use CHAR as the default length semantics for VARCHAR2 columns, and then use Data Pump to do your import/export: Data Pump uses the target database's default VARCHAR2 semantics when creating tables (which is normally BYTE), while exp/imp uses the VARCHAR2 semantics of the original database.
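The length-semantics change described above can be sketched as follows (an illustration only, using the column from the error above; test before relying on it):

```sql
-- Make new VARCHAR2 columns count characters rather than bytes for this session
ALTER SESSION SET nls_length_semantics = CHAR;

-- Or declare the semantics explicitly on the column itself:
CREATE TABLE t1 (name VARCHAR2(50 CHAR));  -- holds 50 characters even in AL32UTF8
```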
Best regards
/Klaus -
Use of imp/exp to restore a database will fail ??
I have a database that uses MDSYS and other user-defined objects. This week I've had the "pleasure" of trying to restore a database from scratch using imp, but had to abandon the attempt because imp refuses to create tables if they refer to objects that, upon export, had a different ID than in the database being imported into.
Since mdsys for instance, is always excluded from exports (like sys and system) when you create a new database, the spatial objects are created with a different ID than the original database. Hence, every table you have with spatial data will not be imported.
So imp only works if the internal object IDs match.
What would be the "imp/exp" procedure for a full DB restore that bypasses the above problem? Has Data Pump solved this issue?
Correct - all schemas considered part of the dictionary are excluded (why have things become so complicated? In the old days we had ONE place where everything dictionary-related went).
That's the problem. I have tables in my export that depends on MDSYS objects. In other words, spatial data. So, to re-create my database I must first create a database and ADD a MDSYS schema. So yes, I created the schema prior to import. The problem is, that the MDSYS objects then do not get the same object-ids. And there lies the problem. IMP will not import my tables with spatial data in them because the reference to the MDSYS objects do not match the object ID that was used during export. Why this rule is applied is the mystery.
In addition, as I wrote in the Spatial forum, anything related to the MapViewer application is also placed in the MDSYS schema, excluding it from export, regardless of the fact that the MapViewer data is not related to database objects. The result is that your Oracle MapViewer application data is lost. The "official workaround" is to run pre-scripts that copy the dictionary views to fixed tables, then run your export; on import you run a post-script to copy it back. Rather silly and inflexible. -
How to redirect imp/exp or impdp/expdp output
Hello,
I'm sorry if this forum is not the right place for this posting, but I hope that someone has had this problem too.
I've an application written in C# (a Windows Forms app) that calls imp/exp or impdp/expdp (depending on the user). I need the output of these tools in my app.
Thanks a lot for any help.
user9099355 wrote:
Hello,
I'm sorry if this forum is not the right place for this posting, but I hope that someone has had this problem too.
I've an application written in C# (a Windows Forms app) that calls imp/exp or impdp/expdp (depending on the user). I need the output of these tools in my app.
Thanks a lot for any help.
EXPDP/IMPDP:-
[oracle@adg disks]$ expdp system/manager schemas=scott directory=DATA_PUMP_DIR logfile=scot_exp.log
Export: Release 11.2.0.1.0 - Production on Tue Oct 4 14:36:54 2011
Copyright (c) 1982, 2009, Oracle and/or its affiliates. All rights reserved.
Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - Production
With the Partitioning, Automatic Storage Management, OLAP, Data Mining
and Real Application Testing options
Starting "SYSTEM"."SYS_EXPORT_SCHEMA_01": system/******** schemas=scott directory=DATA_PUMP_DIR logfile=scot_exp.log
[oracle@adg disks]$ cd /u01/app/oracle/admin/adg/dpdump/
[oracle@adg dpdump]$ ls -lt scot_exp.log
-rw-r--r-- 1 oracle dba 1965 Oct 4 14:38 scot_exp.log
[oracle@adg dpdump]$
exp/imp:-
[oracle@adg dpdump]$ exp system/manager file=exp_schema.log log=exp_schema_trad.log owner=scott
Export: Release 11.2.0.1.0 - Production on Tue Oct 4 14:38:53 2011
Copyright (c) 1982, 2009, Oracle and/or its affiliates. All rights reserved.
Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - Production
With the Partitioning, Automatic Storage Management, OLAP, Data Mining
and Real Application Testing options
Export done in US7ASCII character set and AL16UTF16 NCHAR character set
server uses AL32UTF8 character set (possible charset conversion)
[oracle@adg dpdump]$ ls -lt exp_schema_trad.log
-rw-r--r-- 1 oracle dba 1774 Oct 4 14:39 exp_schema_trad.log
[oracle@adg dpdump]$
or
nohup exp system/manager file=exp_schema.dmp owner=scott > exp_schema_test.log
[oracle@adg dpdump]$ ls -ltr exp_schema_test.log
-rw-r--r-- 1 oracle dba 1922 Oct 4 14:40 exp_schema_test.log
[oracle@adg dpdump]$ -
Hi all,
I am interested in changing the language of the exp and/or imp log files. Oracle version is 10.2.0.4.0, OS version is Win 2k3 R2.
I have Oracle DB installed on a Hungarian-language OS, and when I issue an exp command, I get the log file in Hungarian. I need the stdout language for the imp/exp utilities to be English. Can I do this?
As far as I know, the stdout language for a sqlplus session can be changed with ALTER SESSION SET nls_language=ENGLISH.
A little demonstration:
current logfile:
. . a következo tábla exportálása: A 3 sor exportálva
. . a következo tábla exportálása: B 2 sor exportálva
. . a következo tábla exportálása: C 0 sor exportálva
desired logfile:
. . exporting table A 3 rows exported
. . exporting table B 2 rows exported
. . exporting table C 0 rows exported
Thanks in advance.
Prior to running exp or imp (and why still use exp and imp?), set
nls_lang=<Region>_<territory>.<characterset>
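On Windows that is an environment variable for the cmd.exe session; a minimal sketch (the user, table, and character set here are made up; match the character set to your database):

```shell
REM force English exp messages for this session only
set NLS_LANG=AMERICAN_AMERICA.EE8MSWIN1250
exp scott/tiger tables=(EMP) file=emp.dmp log=exp_emp.log
```

The language component (AMERICAN) is what controls the message language; the character set component should still match your client data.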
Don't know why but it seems it is 'Doc question day' again.
Sybrand Bakker
Senior Oracle DBA -
Can I use 9i imp/exp utility after installing 10g
Hello, all:
I have two databases co-existing on one machine: one is 9i, the other is 10.2, which was just installed recently. Now I need to imp/exp the 9i database, but I can't invoke the 9i imp/exp utilities anymore; even under the 9i ORACLE_HOME I can only get 10g's imp/exp screen. Can someone help me? Is there any way that I can use both versions of the utilities on the same machine?
Thanks a lot in advance.
wendy
I can't invoke 9i imp/exp utility anymore
That might be because the 10g ORACLE_HOME\BIN appears first in your PATH environment variable. Either put the 9i home's BIN first in your PATH or navigate to the BIN directory of the 9i home and run export/import from there.
C:\>exp
Export: Release 10.2.0.1.0 - Production on Wed Oct 25 09:31:50 2006
Copyright (c) 1982, 2005, Oracle. All rights reserved.
Username: ^C
C:\>
C:\>cd Oracle\ora92\bin
C:\oracle\ora92\bin>exp
Export: Release 9.2.0.7.0 - Production on Wed Oct 25 09:32:03 2006
Copyright (c) 1982, 2002, Oracle Corporation. All rights reserved.
Username: ^C
C:\oracle\ora92\bin> -
Minimum group membership for imp/exp for Oracle 8i (client), Windows XP users
Hi to all,
For Oracle 8i clients and Windows XP users, what is the minimum group membership required so that users can import or export dump files? Using Administrators, imp/exp works OK. Any alternative?
Thanks
Hi Thierry,
Please do not confuse the issue. Here we have the Windows operating-system privilege and then the Oracle database user privilege. In my case the Oracle database user privilege is DBA. If the user is given the Windows Administrators privilege (which I do not want to give), exp/imp creates the DMP and log file. But with any other standard Windows privilege (still with the DBA privilege), exp/imp does not create the dump and log file. I hope I am clear, and now you can suggest some alternative to OS Administrator.
Thanks again -
Can we migrate data from 7.3.4 to 10g? (imp / exp)
Can we migrate data from 7.3.4 to 10g? (imp / exp)
Please note: 7.3.4 is on Sun Solaris and 10g is on Linux x86, SuSE Linux
Enterprise Server 8.0.
But please remember, a direct upgrade is not supported from 7.3.4 to 10g. Can we do the migration without any issue?
Justin is correct, you can do that with exp 7.
Look on Metalink for note 132904.1. Here is the matrix from that note:
+-----------+------------------------------------------------------+
| EXPORT | IMPORT into: |
| from +--------+--------+--------+--------+--------+---------+
| \/ | 8.1.5 | 8.1.6 | 8.1.7 | 9.0.1 | 9.2.0 | 10.1.0 |
+-----------+--------+--------+--------+--------+--------+---------+
| 5.x 1) 2)| EXP5x | EXP5x | EXP5x | EXP5x | EXP5x | EXP5x |
| 6.x 2)| EXP6x | EXP6x | EXP6x | EXP6x | EXP6x | EXP6x |
| 7.x 3)| EXP7x | EXP7x | EXP7x | EXP7x | EXP7x | EXP7x |
+-----------+--------+--------+--------+--------+--------+---------+
| 8.0.3 | EXP803 | EXP803 | EXP803 | EXP803 | EXP803 | EXP803 |
| 8.0.4 | EXP804 | EXP804 | EXP804 | EXP804 | EXP804 | EXP804 |
| 8.0.5 | EXP805 | EXP805 | EXP805 | EXP805 | EXP805 | EXP805 |
| 8.0.6 | EXP806 | EXP806 | EXP806 | EXP806 | EXP806 | EXP806 |
+-----------+--------+--------+--------+--------+--------+---------+
| 8.1.5 | EXP815 | EXP815 | EXP815 | EXP815 | EXP815 | EXP815 |
| 8.1.6 | EXP815 | EXP816 | EXP816 | EXP816 | EXP816 | EXP816 |
| 8.1.7 | EXP815 | EXP816 | EXP817 | EXP817 | EXP817 | EXP817 |
+-----------+--------+--------+--------+--------+--------+---------+
| 9.0.1 | EXP815 | EXP816 | EXP817 | EXP901 | EXP901 | EXP901 |
| 9.2.0 | EXP815 | EXP816 | EXP817 | EXP901 | EXP920 | EXP920 |
+-----------+--------+--------+--------+--------+--------+---------+
| 10.1.0 4)| EXP815 | EXP816 | EXP817 | EXP901 | EXP920 | 4) |
+-----------+--------+--------+--------+--------+--------+---------+
Remarks:
1) IMPORT can read export dump files created by EXPORT release 5.1.22 and
higher (up to same version).
2) An Oracle5 or Oracle6 export dump and an Oracle7 IMPORT:
see the Oracle Utilities Manual, Chapter 2 "Import" for special
considerations to keep in mind.
3) To export from Oracle7 for import into an Oracle6 database: user SYS
must run script CATEXP6.SQL on the Oracle7 database first (this script
needs to be run only once to have the version6 views been created).
4) To export from Oracle8, Oracle8i for import into an Oracle7 database:
user SYS must run script CATEXP7.SQL on the Oracle8/8i database first
(this script needs to be run only once in order to create the version7
views). -
what is the primary difference between datapumps and imp/exp utilities?
Thank you
The 10g Utilities manual lists the new capabilities that Data Pump provides over imp/exp:
http://download.oracle.com/docs/cd/B19306_01/server.102/b14215/dp_overview.htm#i1010248
That is a pretty complete list. The original exp/imp chapter mentions two things that should be fairly obvious:
You need to use the original imp to import dump files generated by the original exp.
You need to use exp to generate files that can be read by imp on earlier (pre-10) versions of Oracle.
Last but not least, many of us have become accustomed to writing to and reading from named pipes with exp and imp. This is a popular way to compress or pipe across rsh/ssh on the fly. Unfortunately, this is impossible with Data Pump! I would say that is among the most strenuous complaints I hear about Data Pump.
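The named-pipe trick mentioned above looks roughly like this (a sketch; the schema, credentials, and file names are made up):

```shell
# compress an export on the fly through a named pipe -
# exp writes to the pipe while gzip reads from it
mkfifo /tmp/exp_pipe
gzip < /tmp/exp_pipe > scott.dmp.gz &
exp scott/tiger owner=scott file=/tmp/exp_pipe
wait
rm /tmp/exp_pipe
```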
Regards,
Jeremiah Wilton
ORA-600 Consulting
http://www.ora-600.net -
hi guys,
can someone please give me the complete steps on how to use the imp/exp utilities for an 8.1.7 DB running on Linux?
also, if possible, what are the requirements and steps to install DB 8.1.7 on Linux?
would really appreciate it... thanks in advance...
http://download-west.oracle.com/docs/cd/A87860_01/doc/server.817/a76955/toc.htm
http://www.oracle.com/pls/tahiti/tahiti.docindex
http://www.google.com/search?hl=en&q=installation+8i+linux&btnG=Search+v+Google&meta= -
Hi, how do I disable the imp/exp functions once they have been enabled?
Thanks for any info
Either you need to create a trigger which will prevent performing export/import, or revoke the "select on EXU8FUL" as the forum user "michaels" suggested in one of the similar posts.
Prevent the use of EXP and IMP for specified Oracle Users
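A hedged sketch of the revoke approach mentioned above (EXU8FUL is one of the SYS export views that exp relies on; whether such a grant exists to revoke depends on your setup, so verify on a test system first):

```sql
-- If the user was explicitly granted access to the export views, take it back:
REVOKE SELECT ON sys.exu8ful FROM scott;
```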
Regards,
Sabdar Syed. -
Problems compiling objects after imp/exp
I am moving an Apex database to a new machine using imp/exp. I have run into problems getting two packages (WWV_FLOW_CUSTOM_AUTH_STD and WWV_FLOW_SECURITY) to compile, due to PLS-00753: malformed or corrupted wrapped unit. I have tried using both imp/exp and Data Pump with no success. I have ensured that my SYS grants exist on the new database. I have also tried generating the package code and manually creating the package, with no luck.
Any ideas? Is there a script in the Apex distribution that can be run to recreate these packages?
Thank you
Message was edited by:
birk
Birk,
1. Connect as FLOWS_xxxxxx (depends on your version) and run:
alter package wwv_flow_security compile
alter package wwv_flow_security compile body
alter package wwv_flow_custom_auth_std compile
alter package wwv_flow_custom_auth_std compile body
/
2. If that doesn't work, go to the core directory in the distribution and, again connected in SQL*Plus as FLOWS_xxxxxx, run:
@auth.sql
@auth.pkb
@custom_auth_std.sql
@custom_auth_std.pkb
3. If that doesn't work, you may have to contact Support about the imp/exp steps that seem to have given rise to this.
Scott -
Hello!
I am a newbie in dataguard.
Until now, the only backup procedure for the data was accomplished using the exp command on specific schemas.
Now we want to create a Data Guard environment, but we want to continue using exp on the primary and imp on another server (not the standby).
Do the new configurations for Data Guard affect the exp procedure? I mean, can we continue using exp on the primary as before without any problem?
Thanks in advance!
Uwe has made me think about this some more.
If you are asking will IMP/EXP still work then yes is the correct answer.
However since you state "only backup procedure" I went with the RMAN answer.
Also I assume you are speaking about the Primary database only.
Best Regards
mseberg -
Hi Folks,
We are importing table data from Oracle 9i (9.2.0.8) into Oracle 11g (11.2.0.3) using the old Oracle imp utility, and we're hoping that you can help us speed up the process, please.
We have exported one table's data; roughly 35 million rows have been exported, generating an export dump of 7.5 GB. Please note that this is a subset of the data (one month's worth of data, to be exact). We do realize that there will be a lot of duplicates within this data subset and that the import will reject the duplicated rows through PK constraints (giving the following error):
IMP-00019: row rejected due to ORACLE error 1
IMP-00003: ORACLE error 1 encountered
ORA-00001: unique constraint (RG_SCHEMA.SSTG010P) violated
Column 1 243113850256342640
The import has been running for over 3 hrs now and is taking quite a while even though we're running it with the following parameters:
statistics=NONE
buffer=12000000
resumable=Y
ignore=Y
constraints=N
indexes=N
We do realize that rejecting more than half the rows slows down the process, as there is overhead associated with that; having said that, is there anything else we can try to speed up this dreaded process, please?
Appreciate the help.
IMPORT:
Create an indexfile so that you can create indexes AFTER you have imported the data. Do this by setting INDEXFILE to a filename and then importing. No data will be imported, but a file containing index definitions will be created. You must edit this file afterwards and supply the passwords for the schemas on all CONNECT statements.
Place the file to be imported on a separate physical disk from the oracle data files
Set the LOG_BUFFER to a big value and restart oracle.
Stop redo log archiving if it is running (ALTER DATABASE NOARCHIVELOG;)
Create a BIG tablespace with a BIG rollback segment inside. Set all other rollback segments offline (except the SYSTEM rollback segment of course). The rollback segment must be as big as your biggest table (I think?)
Use COMMIT=N in the import parameter file if you can afford it
Use STATISTICS=NONE in the import parameter file to avoid time consuming to import the statistics
Remember to run the indexfile previously created
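Putting the indexfile steps together (a sketch; the file and user names are made up, and the parameters mirror the ones already listed in the question):

```shell
# pass 1: extract the index DDL only - no rows are loaded
imp rg_user/secret file=month.dmp fromuser=RG_SCHEMA touser=RG_SCHEMA indexfile=month_indexes.sql

# pass 2: load the data without index maintenance, constraints or statistics
imp rg_user/secret file=month.dmp fromuser=RG_SCHEMA touser=RG_SCHEMA ignore=y indexes=n constraints=n statistics=none buffer=12000000

# afterwards, edit month_indexes.sql (passwords on the CONNECT lines) and run it
sqlplus rg_user/secret @month_indexes.sql
```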
Import Export FAQ - Oracle FAQ