Regarding: Problem importing RedDot Live Server .ept files
Dear All,
We are trying to import some .ept files into the portal using the import functionality under the Transport tab. While importing the .ept files we get an error.
I packaged all the .ept files into an .epa file and imported it into the portal, and I get the following error.
These .ept files are related to the RedDot Live Server integration into the portal.
Error Details
Unexpected exception during import.
[com.sapportals.portal.transport.EptFile]: transport file not found: /usr/sap/EPD/JC00/j2ee/temp/pcd/transport/IMPORT-0630_194417_454_c965d616aa4688c8/EPT/com.reddot.pct.liveserver.personalization.standalone.ept
Transport File: /usr/sap/EPD/JC00/j2ee/temp/pcd/transport/IMPORT-0630_194417_454_c965d616aa4688c8/EPT/com.reddot.pct.liveserver.personalization.standalone.ept
at com.sapportals.portal.transport.Transport.importObject(Transport.java:124)
Original exception:
java.io.FileNotFoundException: [com.sapportals.portal.transport.EptFile]: transport file not found: /usr/sap/EPD/JC00/j2ee/temp/pcd/transport/IMPORT-0630_194417_454_c965d616aa4688c8/EPT/com.reddot.pct.liveserver.personalization.standalone.ept
at com.sapportals.portal.transport.EptFile.(EptFile.java:57)
at com.sapportals.portal.transport.Transport.importObject(Transport.java:76)
at com.sapportals.portal.transport.app.ImportRunner.importObject(ImportRunner.java:333)
How can we resolve this error?
Regards,
Sita Mahalakshmi
Hi Sitara,
Please check whether the file
/usr/sap/EPD/JC00/j2ee/temp/pcd/transport/IMPORT-0630_194417_454_c965d616aa4688c8/EPT/com.reddot.pct.liveserver.personalization.standalone.ept
exists on your portal server's disk.
If not, please move it to the correct place.
If it is there, please double-check, then come back.
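A quick scripted version of this check, in case you want to run it on the server (my addition, not part of the original posts; the path is the one from the error message above, so adjust it for your installation):

```python
# Sketch: verify the transport file the importer complains about actually
# exists and is readable on the portal server's disk.
from pathlib import Path

def check_transport_file(path_str: str) -> str:
    p = Path(path_str)
    if not p.exists():
        return "missing: copy the .ept file to this location and retry the import"
    if not p.is_file():
        return "exists but is not a regular file"
    if p.stat().st_size == 0:
        return "exists but is empty; the upload may have been truncated"
    return "present ({} bytes); also check permissions for the J2EE user".format(p.stat().st_size)

print(check_transport_file(
    "/usr/sap/EPD/JC00/j2ee/temp/pcd/transport/"
    "IMPORT-0630_194417_454_c965d616aa4688c8/EPT/"
    "com.reddot.pct.liveserver.personalization.standalone.ept"))
```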
Greetings,
Carsten
Similar Messages
-
Problems with import from textfile
Hi
We are experiencing problems with importing tables from a text file. We are using Designer 6i (NT) release 2.
Some of the column sequences are lost; others are changed.
Foreign keys are lost if the referenced table comes later in alphabetical order! NOT NULL columns are nullified, and the column of the primary-key definition is lost.
Has anyone had similar problems and/or know of a workaround? -
Does anyone have a good server.xml file for tomcat?
I am trying to link Apache and Tomcat on a RedHat 9 computer. I previously posted a message stating that I could not get Tomcat to "automagically" create the configuration files (mod_jk.conf-auto) that all the manuals promised it would. However, I found that my problem was the lack of an ApacheConfig tag in the server.xml file. After looking at my server.xml file, I found that it was much more simplistic than I would have hoped: it did not have enough comments or commented-out options, and it had none of the directives that the manuals said it included by default.
In short, does anyone who has Apache and Tomcat linked have a server.xml file that I can look at, so that I can figure out what I must add to mine to get Tomcat working with Apache? I would greatly appreciate it if you could copy and paste the whole thing here.
Thank you very much in advance.
Edward S. Rice
Hi!
I have Apache-Tomcat 4.0 installed on a Win2000 machine. I am pasting my server.xml here; I hope it will be useful to you. I did not make many modifications to my server.xml apart from adding a context.
Please come back if the problem persists.
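Before dropping any borrowed server.xml into your own installation, it can be worth a quick well-formedness check so Tomcat doesn't fail to start over a mismatched tag. A small sketch (my addition, not from the original posts; parsing only catches XML errors, not Tomcat semantics):

```python
# Sketch: sanity-check that a server.xml candidate parses as well-formed XML
# before restarting Tomcat with it.
import xml.etree.ElementTree as ET

def check_server_xml(text: str) -> str:
    try:
        root = ET.fromstring(text)
    except ET.ParseError as e:
        return f"not well-formed: {e}"
    return f"ok, root element <{root.tag}>"

sample = ('<Server port="8005" shutdown="SHUTDOWN">'
          '<Service name="Tomcat-Standalone"/></Server>')
print(check_server_xml(sample))
```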
<!-- Example Server Configuration File -->
<!-- Note that component elements are nested corresponding to their
parent-child relationships with each other -->
<!-- A "Server" is a singleton element that represents the entire JVM,
which may contain one or more "Service" instances. The Server
listens for a shutdown command on the indicated port.
Note: A "Server" is not itself a "Container", so you may not
define subcomponents such as "Valves" or "Loggers" at this level.
-->
<Server port="8005" shutdown="SHUTDOWN" debug="0">
<!-- A "Service" is a collection of one or more "Connectors" that share
a single "Container" (and therefore the web applications visible
within that Container). Normally, that Container is an "Engine",
but this is not required.
Note: A "Service" is not itself a "Container", so you may not
define subcomponents such as "Valves" or "Loggers" at this level.
-->
<!-- Define the Tomcat Stand-Alone Service -->
<Service name="Tomcat-Standalone">
<!-- A "Connector" represents an endpoint by which requests are received
and responses are returned. Each Connector passes requests on to the
associated "Container" (normally an Engine) for processing.
By default, a non-SSL HTTP/1.1 Connector is established on port 8080.
You can also enable an SSL HTTP/1.1 Connector on port 8443 by
following the instructions below and uncommenting the second Connector
entry. SSL support requires the following steps (see the SSL Config
HOWTO in the Tomcat 4.0 documentation bundle for more detailed
instructions):
* Download and install JSSE 1.0.2 or later, and put the JAR files
into "$JAVA_HOME/jre/lib/ext".
* Execute:
%JAVA_HOME%\bin\keytool -genkey -alias tomcat -keyalg RSA (Windows)
$JAVA_HOME/bin/keytool -genkey -alias tomcat -keyalg RSA (Unix)
with a password value of "changeit" for both the certificate and
the keystore itself.
By default, DNS lookups are enabled when a web application calls
request.getRemoteHost(). This can have an adverse impact on
performance, so you can disable it by setting the
"enableLookups" attribute to "false". When DNS lookups are disabled,
request.getRemoteHost() will return the String version of the
IP address of the remote client.
-->
<!-- Define a non-SSL HTTP/1.1 Connector on port 8080 -->
<Connector className="org.apache.catalina.connector.http.HttpConnector"
port="8080" minProcessors="5" maxProcessors="75"
enableLookups="true" redirectPort="8443"
acceptCount="10" debug="0" connectionTimeout="60000"/>
<!-- Note : To disable connection timeouts, set connectionTimeout value
to -1 -->
<!-- Define an SSL HTTP/1.1 Connector on port 8443 -->
<!--
<Connector className="org.apache.catalina.connector.http.HttpConnector"
port="8443" minProcessors="5" maxProcessors="75"
enableLookups="true"
acceptCount="10" debug="0" scheme="https" secure="true">
<Factory className="org.apache.catalina.net.SSLServerSocketFactory"
clientAuth="false" protocol="TLS"/>
</Connector>
-->
<!-- Define an AJP 1.3 Connector on port 8009 -->
<!--
<Connector className="org.apache.ajp.tomcat4.Ajp13Connector"
port="8009" minProcessors="5" maxProcessors="75"
acceptCount="10" debug="0"/>
-->
<!-- Define a Proxied HTTP/1.1 Connector on port 8081 -->
<!-- See proxy documentation for more information about using this. -->
<!--
<Connector className="org.apache.catalina.connector.http.HttpConnector"
port="8081" minProcessors="5" maxProcessors="75"
enableLookups="true"
acceptCount="10" debug="0" connectionTimeout="60000"
proxyPort="80"/>
-->
<!-- Define a non-SSL HTTP/1.0 Test Connector on port 8082 -->
<!--
<Connector className="org.apache.catalina.connector.http10.HttpConnector"
port="8082" minProcessors="5" maxProcessors="75"
enableLookups="true" redirectPort="8443"
acceptCount="10" debug="0"/>
-->
<!-- An Engine represents the entry point (within Catalina) that processes
every request. The Engine implementation for Tomcat stand alone
analyzes the HTTP headers included with the request, and passes them
on to the appropriate Host (virtual host). -->
<!-- Define the top level container in our container hierarchy -->
<Engine name="Standalone" defaultHost="localhost" debug="0">
<!-- The request dumper valve dumps useful debugging information about
the request headers and cookies that were received, and the response
headers and cookies that were sent, for all requests received by
this instance of Tomcat. If you care only about requests to a
particular virtual host, or a particular application, nest this
element inside the corresponding <Host> or <Context> entry instead.
For a similar mechanism that is portable to all Servlet 2.3
containers, check out the "RequestDumperFilter" Filter in the
example application (the source for this filter may be found in
"$CATALINA_HOME/webapps/examples/WEB-INF/classes/filters").
Request dumping is disabled by default. Uncomment the following
element to enable it. -->
<!--
<Valve className="org.apache.catalina.valves.RequestDumperValve"/>
-->
<!-- Global logger unless overridden at lower levels -->
<Logger className="org.apache.catalina.logger.FileLogger"
prefix="catalina_log." suffix=".txt"
timestamp="true"/>
<!-- Because this Realm is here, an instance will be shared globally -->
<Realm className="org.apache.catalina.realm.MemoryRealm" />
<!-- Replace the above Realm with one of the following to get a Realm
stored in a database and accessed via JDBC -->
<!--
<Realm className="org.apache.catalina.realm.JDBCRealm" debug="99"
driverName="org.gjt.mm.mysql.Driver"
connectionURL="jdbc:mysql://localhost/authority?user=test;password=test"
userTable="users" userNameCol="user_name" userCredCol="user_pass"
userRoleTable="user_roles" roleNameCol="role_name" />
-->
<!--
<Realm className="org.apache.catalina.realm.JDBCRealm" debug="99"
driverName="oracle.jdbc.driver.OracleDriver"
connectionURL="jdbc:oracle:thin:@ntserver:1521:ORCL?user=scott;password=tiger"
userTable="users" userNameCol="user_name" userCredCol="user_pass"
userRoleTable="user_roles" roleNameCol="role_name" />
-->
<!--
<Realm className="org.apache.catalina.realm.JDBCRealm" debug="99"
driverName="sun.jdbc.odbc.JdbcOdbcDriver"
connectionURL="jdbc:odbc:CATALINA"
userTable="users" userNameCol="user_name" userCredCol="user_pass"
userRoleTable="user_roles" roleNameCol="role_name" />
-->
<!-- Define the default virtual host -->
<Host name="localhost" debug="0" appBase="webapps" unpackWARs="true">
<!-- Normally, users must authenticate themselves to each web app
individually. Uncomment the following entry if you would like
a user to be authenticated the first time they encounter a
resource protected by a security constraint, and then have that
user identity maintained across all web applications contained
in this virtual host. -->
<!--
<Valve className="org.apache.catalina.authenticator.SingleSignOn"
debug="0"/>
-->
<!-- Access log processes all requests for this virtual host. By
default, log files are created in the "logs" directory relative to
$CATALINA_HOME. If you wish, you can specify a different
directory with the "directory" attribute. Specify either a relative
(to $CATALINA_HOME) or absolute path to the desired directory.
-->
<Valve className="org.apache.catalina.valves.AccessLogValve"
directory="logs" prefix="localhost_access_log." suffix=".txt"
pattern="common"/>
<!-- Logger shared by all Contexts related to this virtual host. By
default (when using FileLogger), log files are created in the "logs"
directory relative to $CATALINA_HOME. If you wish, you can specify
a different directory with the "directory" attribute. Specify either a
relative (to $CATALINA_HOME) or absolute path to the desired
directory.-->
<Logger className="org.apache.catalina.logger.FileLogger"
directory="logs" prefix="localhost_log." suffix=".txt"
timestamp="true"/>
<!-- Define properties for each web application. This is only needed
if you want to set non-default properties, or have web application
document roots in places other than the virtual host's appBase
directory. -->
<!-- Tomcat Root Context -->
<!--
<Context path="" docBase="ROOT" debug="0"/>
-->
<Context path="/vijay" docBase="D:\Java\servlets" debug="0"/>
<!-- Tomcat Manager Context -->
<Context path="/manager" docBase="manager"
debug="0" privileged="true"/>
<!-- Tomcat Examples Context -->
<Context path="/examples" docBase="examples" debug="0"
reloadable="true">
<Logger className="org.apache.catalina.logger.FileLogger"
prefix="localhost_examples_log." suffix=".txt"
timestamp="true"/>
<Ejb name="ejb/EmplRecord" type="Entity"
home="com.wombat.empl.EmployeeRecordHome"
remote="com.wombat.empl.EmployeeRecord"/>
<!-- PersistentManager: Uncomment the section below to test Persistent
Sessions.
saveOnRestart: If true, all active sessions will be saved
to the Store when Catalina is shutdown, regardless of
other settings. All Sessions found in the Store will be
loaded on startup. Sessions past their expiration are
ignored in both cases.
maxActiveSessions: If 0 or greater, having too many active
sessions will result in some being swapped out. minIdleSwap
limits this. -1 means unlimited sessions are allowed.
0 means sessions will almost always be swapped out after
use - this will be noticeably slow for your users.
minIdleSwap: Sessions must be idle for at least this long
(in seconds) before they will be swapped out due to
maxActiveSessions. This avoids thrashing when the site is
highly active. -1 or 0 means there is no minimum - sessions
can be swapped out at any time.
maxIdleSwap: Sessions will be swapped out if idle for this
long (in seconds). If minIdleSwap is higher, then it will
override this. This isn't exact: it is checked periodically.
-1 means sessions won't be swapped out for this reason,
although they may be swapped out for maxActiveSessions.
If set to >= 0, guarantees that all sessions found in the
Store will be loaded on startup.
maxIdleBackup: Sessions will be backed up (saved to the Store,
but left in active memory) if idle for this long (in seconds),
and all sessions found in the Store will be loaded on startup.
If set to -1 sessions will not be backed up, 0 means they
should be backed up shortly after being used.
To clear sessions from the Store, set maxActiveSessions, maxIdleSwap,
and minIdleBackup all to -1, saveOnRestart to false, then restart
Catalina.
-->
<!--
<Manager className="org.apache.catalina.session.PersistentManager"
debug="0"
saveOnRestart="true"
maxActiveSessions="-1"
minIdleSwap="-1"
maxIdleSwap="-1"
maxIdleBackup="-1">
<Store className="org.apache.catalina.session.FileStore"/>
</Manager>
-->
<Environment name="maxExemptions" type="java.lang.Integer"
value="15"/>
<Parameter name="context.param.name" value="context.param.value"
override="false"/>
<Resource name="jdbc/EmployeeAppDb" auth="SERVLET"
type="javax.sql.DataSource"/>
<ResourceParams name="jdbc/EmployeeAppDb">
<parameter><name>user</name><value>sa</value></parameter>
<parameter><name>password</name><value></value></parameter>
<parameter><name>driverClassName</name>
<value>org.hsql.jdbcDriver</value></parameter>
<parameter><name>driverName</name>
<value>jdbc:HypersonicSQL:database</value></parameter>
</ResourceParams>
<Resource name="mail/Session" auth="Container"
type="javax.mail.Session"/>
<ResourceParams name="mail/Session">
<parameter>
<name>mail.smtp.host</name>
<value>localhost</value>
</parameter>
</ResourceParams>
</Context>
</Host>
</Engine>
</Service>
<!-- The MOD_WEBAPP connector is used to connect Apache 1.3 with Tomcat 4.0
as its servlet container. Please read the README.txt file coming with
the WebApp Module distribution on how to build it.
(Or check out the "jakarta-tomcat-connectors/webapp" CVS repository)
To configure the Apache side, you must ensure that you have the
"ServerName" and "Port" directives defined in "httpd.conf". Then, add
lines like these to the bottom of your "httpd.conf" file:
LoadModule webapp_module libexec/mod_webapp.so
WebAppConnection warpConnection warp localhost:8008
WebAppDeploy examples warpConnection /examples/
The next time you restart Apache (after restarting Tomcat, if needed)
the connection will be established, and all applications you make
visible via "WebAppDeploy" directives can be accessed through Apache.
-->
<!-- Define an Apache-Connector Service -->
<Service name="Tomcat-Apache">
<Connector className="org.apache.catalina.connector.warp.WarpConnector"
port="8008" minProcessors="5" maxProcessors="75"
enableLookups="true"
acceptCount="10" debug="0"/>
<!-- Replace "localhost" with what your Apache "ServerName" is set to -->
<Engine className="org.apache.catalina.connector.warp.WarpEngine"
name="Apache" debug="0" appBase="webapps">
<!-- Global logger unless overridden at lower levels -->
<Logger className="org.apache.catalina.logger.FileLogger"
prefix="apache_log." suffix=".txt"
timestamp="true"/>
<!-- Because this Realm is here, an instance will be shared globally -->
<Realm className="org.apache.catalina.realm.MemoryRealm" />
</Engine>
</Service>
</Server> -
When I import the delimited text data file manually with the Import Manager (by both the port method and the delimited-file method), 1,116 records are imported, but when I run the import-server automation with the same file and map, only 130 records are imported into the repository.
Details of server.
SAP MDM 7.1.04.122 (SP4)
Observations
1. In the SAP MDM Console, the port details show the port status as: Has Exception.
2. An exception log is created in the ImportX subfolder of the Exceptions folder.
3. In the log folder, most of the lines in the files read:
<Failure ts="2010/04/01 12:43:26.084 GMT" tid="4184" entry-no="39" operation="Import" import-action="Create" row="1">One or more field values are invalid</Failure>
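When triaging such logs it can help to tally the Failure entries so you can see which validation error dominates. A sketch (my addition, not from the thread; it assumes each failure is logged on one line in the form shown above):

```python
# Tally MDIS import failures by message text.
import re
from collections import Counter

FAILURE_RE = re.compile(r'<Failure\b[^>]*>(?P<msg>.*?)</Failure>')

def tally_failures(log_text: str) -> Counter:
    return Counter(m.group("msg") for m in FAILURE_RE.finditer(log_text))

sample = (
    '<Failure ts="2010/04/01 12:43:26.084 GMT" tid="4184" entry-no="39" '
    'operation="Import" import-action="Create" row="1">'
    'One or more field values are invalid</Failure>'
)
print(tally_failures(sample))
```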
Has anyone faced a similar issue, or did I make a mistake or miss a step?
Hi Srinivas,
As you said, the status of your import exception is empty once you delete all the records from the Exception folders.
But when you import records again through MDIS, not all records are imported, right?
It is possible that the saved map you are using in the port is missing fields or values.
So I would suggest going through this step by step; let me know if you face any issue.
Step 1: Open the file in the Import Manager with Type = Delimited Text, select your remote system and delimiter as required, and map all your fields and values. Make sure that all the desired fields and their values are mapped correctly.
Then go to the import status and check that the action item reads "Ready to import". Do not import the records here; instead save the map (File > Save As, give the map a name, say MAP1) and close the Import Manager.
Step 2: Put the file into the Ready folder and make sure the MDIS service is stopped, so that you can open the same file in the Import Manager with Type = Port. Select the remote system you defined for the port in the MDM Console, and make sure the port in the Console also uses the same map MAP1 and the same delimiter you used in the Import Manager.
Check the status on the Import Status tab: it should read "Action items: Ready to import". If not, map the missing values/fields and then go to File > Save.
Now start the MDIS service and put the same file into your Ready folder. This should import all of your records into MDM, and I believe it will solve your problem with MDIS.
Revert with the result.
Regards,
Mandeep Saini -
Problems importing MS SQL Server 2008 DDL
I'm trying to import DDL into SQL Developer DM 4.0 from an MS SQL 2008 DDL dump. It fails to recognise ANY of the statements in the (perfectly legitimate) SQL file. I'm clearly doing something wrong, but I have no idea what! Any help/guidance gratefully received.
Hi David - thanks for responding. There are no messages relating to the attempted import in the external log. The version I'm using is 4.0.3.853. Here are the first few lines of the "output" file:
Oracle SQL Developer Data Modeler 4.0.3.853
Oracle SQL Developer Data Modeler Import Log
Date and Time: 2014-11-27 11:15:55 GMT
Design Name: Untitled_1
RDBMS : SQL Server 2008
All Statements: 2004
Imported Statements: 0
Failed Statements: 0
Not Recognized Statements: 2004
<<<<< Not Recognized >>>>>
ÿþU
The odd characters in the last line seem to be a feature of importing SQL Server DDL files. I've done all sorts of things to eliminate them, but they actually seem to be added in by the parser!
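One possible cause (an assumption on my part, not confirmed in the thread): "ÿþ" are the bytes 0xFF 0xFE, the UTF-16 little-endian byte-order mark. SQL Server Management Studio often saves script files as UTF-16, and a parser expecting single-byte text then sees the BOM and the interleaved NUL bytes as garbage. Re-encoding the file to UTF-8 before importing may help; a sketch with placeholder filenames:

```python
# Sketch: re-encode a UTF-16 DDL dump to UTF-8 so a parser expecting
# single-byte text does not choke on the 0xFF 0xFE BOM.
def reencode_utf16_to_utf8(src: str, dst: str) -> None:
    with open(src, "r", encoding="utf-16") as f:  # the codec detects the BOM
        text = f.read()
    with open(dst, "w", encoding="utf-8") as f:   # written without a BOM
        f.write(text)

# demo on a small throwaway file
with open("ddl_utf16.sql", "w", encoding="utf-16") as f:
    f.write("USE [master]\nGO\n")
reencode_utf16_to_utf8("ddl_utf16.sql", "ddl_utf8.sql")
```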
Here are the first few lines of the actual DDL file:
USE [master]
GO
/****** Object: Database [MyDatabase] Script Date: 10/29/2014 11:45:32 ******/
CREATE DATABASE [MyDatabase] ON PRIMARY
( NAME = N'MyDatabasedata', FILENAME = N'M:\Databases\MyDatabasesqldb01\Program Files\Microsoft SQL Server\MSSQL10_50.MSSQLSERVER\MSSQL\DATA\MyDatabase.mdf' , SIZE = 20992512KB , MAXSIZE = UNLIMITED, FILEGROWTH = 512000KB )
LOG ON
( NAME = N'MyDatabaselog1', FILENAME = N'M:\Logs\MyDatabasesqllogs01\Program Files\Microsoft SQL Server\MSSQL10_50.MSSQLSERVER\MSSQL\DATA\MyDatabase.ldf' , SIZE = 10769408KB , MAXSIZE = UNLIMITED, FILEGROWTH = 512000KB )
GO
ALTER DATABASE [MyDatabase] SET COMPATIBILITY_LEVEL = 80
GO
IF (1 = FULLTEXTSERVICEPROPERTY('IsFullTextInstalled'))
begin
EXEC [MyDatabase].[dbo].[sp_fulltext_database] @action = 'disable'
end
GO
ALTER DATABASE [MyDatabase] SET ANSI_NULL_DEFAULT OFF
GO
ALTER DATABASE [MyDatabase] SET ANSI_NULLS OFF
GO
ALTER DATABASE [MyDatabase] SET ANSI_PADDING OFF
GO
ALTER DATABASE [MyDatabase] SET ANSI_WARNINGS OFF
GO
ALTER DATABASE [MyDatabase] SET ARITHABORT OFF
GO
ALTER DATABASE [MyDatabase] SET AUTO_CLOSE OFF
GO
ALTER DATABASE [MyDatabase] SET AUTO_CREATE_STATISTICS ON
GO
ALTER DATABASE [MyDatabase] SET AUTO_SHRINK OFF
GO
I've changed the database name. Apart from that, this is the DDL I have.
Any help, pointers, anything would be appreciated! -
Tempo problems with imported wav files
Hey everyone, sorry if there's a quick fix for this in the forums that I couldn't find, but I've been having some tempo problems with imported .wav files.
Long story short, my system couldn't handle playing all the tracks for a song while recording drums, so I bounced out an mp3 of the song and put it in a new Logic file so my drummer could just play along to that as I recorded him. Unfortunately, the original song is at 167 bpm, but I forgot to change the bpm in the new Logic file with the .mp3 file of the song to 167 bpm, so it was left at the default 120 bpm.
So, the drums were recorded at the correct 167 bpm, but Logic thinks that those new drum .wav files should be played at 120 bpm, so when I import my drum tracks back into the original file, they do not play correctly at all.
I could record it all again, but I wanted to check whether there is a way to salvage what I already have, since my drummer lives about an hour away right now and can't just come over whenever he wants.
Thanks for the help! I really appreciate it.
Hi,
First, do not use MP3 in Logic: the sound quality is lower than AIFF, WAV or CAF, and Logic has to decode it for playback, making it a heavier burden on the CPU than an uncompressed audio file such as AIFF, WAV or CAF.
Secondly, audio files are independent of Logic's tempo. If you bounce down an audio file in any format (other than Apple Loop), it will play back at the same speed, regardless of Logic's tempo setting, either at recording or playback. Logic doesn't 'think' anything. The BPM is only important to MIDI tracks, or to the spacing between audio files. The audio files themselves are not affected by the tempo setting. If you import an audio file recorded at 167 BPM into a 120 BPM project, the file will still play at 167; only Logic will indicate the wrong bar positions.
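The bar-position drift described here is easy to quantify (my illustration, not part of the original reply): a 4/4 bar lasts 4 × 60 / BPM seconds, so the recorded bars and the bars Logic draws at the project tempo have different lengths.

```python
# Why bar lines drift while the audio itself plays correctly: audio length is
# fixed by the recording tempo, but Logic draws bars from the project tempo.
def bar_length_seconds(bpm: float, beats_per_bar: int = 4) -> float:
    return beats_per_bar * 60.0 / bpm

recorded = bar_length_seconds(167)   # one bar of the drum take
displayed = bar_length_seconds(120)  # one bar as Logic shows it at 120 bpm
print(f"recorded bar: {recorded:.3f}s, displayed bar: {displayed:.3f}s")
```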
regards, Erik. -
Problem with FM SXPG_COMMAND_EXECUTE in deleting a file on server.
Hi All,
My Task is to delete a file with version number 6 and rename all other files, so that my new file will be version 0.
I am using the FM SXPG_COMMAND_EXECUTE to delete/rename a file on the server.
Though this works fine in debug mode, where I can see the file being deleted and some files then being renamed, it is NOT working in a regular run.
Somehow the deletion and renaming do not happen correctly in a regular run, but they succeed while debugging.
Am I missing anything? Is any refresh or delay needed?
Here is my code for DELETE :
CONCATENATE P_PATH '\' P_TABNAME '_6.dat' INTO LF_FILE.
CONDENSE LF_FILE NO-GAPS.
*// Check if this file exists
OPEN DATASET LF_FILE FOR INPUT IN TEXT MODE ENCODING DEFAULT.
IF SY-SUBRC EQ 0.
*// Close the dataset before deleting it - a file still held open by
*// OPEN DATASET cannot be deleted or renamed reliably
CLOSE DATASET LF_FILE.
*// Delete this file
DELETE = LF_FILE.
CALL FUNCTION 'SXPG_COMMAND_EXECUTE'
EXPORTING
ADDITIONAL_PARAMETERS = DELETE
COMMANDNAME = 'ZDELETE'
OPERATINGSYSTEM = OPSYS
STDERR = 'X'
STDOUT = 'X'
TARGETSYSTEM = EHOST
TERMINATIONWAIT = TERMWAIT
TRACE = ' '
IMPORTING
STATUS = RETCODE
TABLES
EXEC_PROTOCOL = PROT
EXCEPTIONS
COMMAND_NOT_FOUND = 01
NO_PERMISSION = 02
PARAMETERS_TOO_LONG = 03
PARAMETER_EXPECTED = 04
PROGRAM_START_ERROR = 05
PROGRAM_TERMINATION_ERROR = 06
SECURITY_RISK = 07
TOO_MANY_PARAMETERS = 08
WRONG_CHECK_CALL_INTERFACE = 09
X_ERROR = 10
OTHERS = 11.
ENDIF.
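The overall rotation this report performs (delete version 6, shift the others up, so a new file can become version 0) can be sketched outside ABAP like this. A hypothetical illustration of the described task, not the author's code; filenames follow the `<path>/<tabname>_<n>.dat` pattern from the report:

```python
# Drop version 6, then rename 5->6, 4->5, ..., 0->1, freeing up version 0.
import os

def rotate_versions(path: str, tabname: str, max_version: int = 6) -> None:
    oldest = os.path.join(path, f"{tabname}_{max_version}.dat")
    if os.path.exists(oldest):
        os.remove(oldest)                      # delete version 6
    for n in range(max_version - 1, -1, -1):   # shift remaining versions up
        src = os.path.join(path, f"{tabname}_{n}.dat")
        if os.path.exists(src):
            os.rename(src, os.path.join(path, f"{tabname}_{n + 1}.dat"))
```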
Regards
Raj
Edited by: Rajasekhar Dinavahi on Apr 14, 2010 11:45 AM
Hi All,
Problem resolved.
We need to ensure that all files opened with OPEN DATASET are CLOSED before trying any operation like DELETE or RENAME on them.
Regards
Raj -
Problem during import of 121 transport requests to productive system
Hello
We have a problem during the import of transport requests into the productive system. The import of 121 transport requests stopped very soon, in phase "N" (in TRBAT I have only one entry, and in TRJOB as well).
In SM50 there is a BGD process running under user DDIC in client 000, now for 14,453 seconds (program SAPLSDB2). This should be the import.
In SM37 I can see it as job "RDDGEN0L" with report "RDDGENBB". Based on some literature it should perform "Converting all structure changes generated by the import and recorded in table TBATG, other than the structure changes to matchcode objects." Interestingly, TBATG has only four entries: two for indexes on table "DFKKOPK", one for table "DFKKREP06", and one "ENQU" for "EFKKEXC" (only this last one does not have status error).
For the first two indexes I know they are not present as "LIMU" "INDX" objects in any transport request being imported.
Also, on the productive system there are no "VOL" and "ZOL" indexes for table "DFKKOPK" (they were instead created on the test system, i.e. not transported from development to the test system).
The last command for that process is "CREATE INDEX "DFKKOPK~HKO" ON "DFKKOPK" ("MANDT", "HKONT", "OPBEL") PCTFREE 10 INITRANS 002 TABLESPACE PSAPTR3 STORAGE (INITIAL 0000000016 K NEXT 0000000016 K MINEXTENTS 0000000001 MAXEXTENTS UNLIMITED PCTINCREAS"
There is enough space on disk and in the tablespaces (it is an Oracle/HP-UX server).
Does anyone know a workaround to solve this on production?
Are you importing these transport requests simultaneously into production?
I would suggest you try doing it in smaller groups of 5 or 10 and then see whether you are able to import the requests.
Rohit -
Problem in Importing Transport Request
Hi Everybody,
I have a problem importing a change request. Yesterday an ABAP colleague released two change requests from DEV to PRD. I imported them; one was imported successfully, but the other is still importing (status: truck symbol).
After that I imported many requests into the PRD server, except this one. How do I stop this import, and what may be the reason?
Regards,
Siva
Hi Siva Kumar,
You can reset the status of that transport.
Select the transport (the one showing the running status), go to the import monitor, right-click the truck symbol, and delete it.
Best Wishes. -
Problem in importing from dmp file
Hi ,
I am facing a problem while importing from a dmp file on a Unix server.
The error is as follows:
Export file created by EXPORT:V08.01.07 via conventional path
IMP-00013: only a DBA can import a file exported by another DBA
IMP-00000: Import terminated unsuccessfully
The export was taken with user xxx@yyy.
The import command is: imp xxx/ppp file=abc.dmp TABLES=xxx.table1 ignore=y feedback=500
Please help me out.
Atul Chougule
I tried with FROMUSER/TOUSER, but the result is the same.
$ imp uuu/ppp file=aaa.dmp fromuser=uuu touser=uuu ignore=Y
Import: Release 8.1.7.3.0 - Production on Thu Jun 15 06:44:49 2006
(c) Copyright 2000 Oracle Corporation. All rights reserved.
Connected to: Oracle8i Enterprise Edition Release 8.1.7.3.0 - Production
With the Partitioning option
JServer Release 8.1.7.3.0 - Production
Export file created by EXPORT:V08.01.07 via conventional path
IMP-00013: only a DBA can import a file exported by another DBA
IMP-00000: Import terminated unsuccessfully
I am stuck and don't know what to do.
Problem in Importing Configuration object
Dear All,
I am facing a problem while importing an ID object.
Business system INTEGRATION_SERVER_XIP is not assigned to a business system group with the ID (XISystemGroup).
So I just tried to create a group for the business system in the SLD, but I could not find the option to create the system group.
I couldn't see <b>Edit Groups</b>.
Any clue?
Regards
Danab
Hi,
First, you have to start the SLD.
To create a group, proceed as follows:
> Business Systems
> Click on the drop-down next to Group and select Edit Groups
> Create a group for each environment (eg DEV, TST, PRD). Specify for each group which Integration Server is used.
To assign business systems to a group, proceed as follows:
> Business Systems (Group = all)
> Select your Business System
> Click on the Integration tab.
> Select the related Integration Server (which is associated with a group - see step 1)
To map your DEV Business Systems to TST Business Systems etc., proceed as follows:
> Business Systems
> Select the group of your choice
> Select your Business System
> Click on Transport
> Click on Add/Change Target
> Select the Target Group and System
KR, Danny De Roovere -
Problem in importing SAP Exchange profile (BASIS settings) - Urgent
Hi,
I am currently facing problem in importing SAP Exchange profiles manually.
When I enter http://<J2EE_host>:<J2EE_port>/exchangeProfile with username PISUPER, the page loads, but
<b>1.</b> it shows an error message stating that the name or password is incorrect.
<b>2.</b> When I ignore that error and proceed to the Connections tab to edit the server settings, I cannot set the password for PILDUSER. It again says the name or password is incorrect.
What could be the reason for these errors? Please help me; this is very urgent.
Thanks in advance,
Rose
Thanks for your response, guys. I checked PISUPER and PILDSUPER in SU01 for locks, but it shows an error message saying the user does not exist.
Kindly let me know how to proceed in this case.
Thanks and Regards,
Rose. -
Problem in importing the type and jobs
Hello ,
Actually I am having a problem importing jobs and types from a dump. During the import it gives an error for both jobs and types, and after the import completes, when I check for the job and type in the schema, they do not exist.
Kindly give me any suggestion regarding this problem, or suggest a way to import these jobs and types on the same server but into a different schema. Alternatively, is there a way to create the dump using export such that the jobs and types are excluded from the dump file?
Please reply as soon as possible.
Thanks in advance.
Regards,
Omer.
What can be done in this scenario when the table names in the source and target schema are different?
What other approaches can I follow?
Please let me know. -
Problem in Importing RFC from ECQ to XIQ
Hi.....
I have a problem importing the RFCs from the ECC Quality server into the PI Quality server.
When I fill in the fields
Application Server
System Number
Username
Password
I get an error screen in which the error information is given.
One of the errors is that the username (XXXX) cannot have access to aii.sap...
Please help me.
Thanks,
Sudheer
Hi Sudheer,
Kindly check that the user with which you are trying to import the RFCs from ECQ to XIQ has all the required authorizations; check with your Basis team and have the required authorizations granted to that particular user.
Then you will be able to import the RFCs and do the design by activating the objects.
Regards
Venkat Rao .G -
Problem in importing ESS Component.
Hi,
We have problem in importing ESS component from hard drive into server.
Here is the details
Comaponent sap.com_SAP_ESS - 600 Level 6 Update ERP05VAL.09201316
We are importing it through CMS. its been 2 days since we initiate import. but still running.
Please find the log details below.
Software Component sap.com/SAP_ESS
Version MAIN_ERP05VAL_C.20060920131656
Label 600 Level 6 Update ERP05VAL.09201316
System XSSTrack-Development
Step Repository-import
Log /local/dsk/data1/sap/trans/EPS/in/CMSsapaudev06DD0/CMS/log/XSSTrack_D@DD0/[email protected]
The log file does not have any info; it is empty.
Please give your suggestions.
Cheers,
Senthil
Hi,
The problem seems to be the connection to the database, or the database URL may be incorrect.
The problem can be solved via the Visual Administrator or by a manual modification of the data-sources.xml file.
Visual Administrator --> Server --> Services --> JDBC Connector -->Select DataSource
Change the URL given in the Database URL field with the appropriate one & save it.
For step by step process refer
http://help.sap.com/saphelp_nw04/helpdata/en/5c/2f2c4142aef623e10000000a155106/frameset.htm
Thanks
Swarup