EHP4 directory-related doubts
Hi,
We are upgrading an EHP4-ready system to EHP4 and have a few basic questions about this. Can you experts please share your experiences?
1) We have set the download directory to /EHPI/EHPI (the /EHPI filesystem was created by the Unix team, and after un-CARing SAPehpi_42-10005805.SAR the EHPI directory was created) - is this OK?
2) In which location should we place stack.xml? Is there a specific location, or can we place it anywhere in the system and enter that path on the stack configuration tab during the Configuration phase?
3) For the actual support package files which we downloaded from SMP (through Solution Manager), do we need to un-CAR them and then put them in /usr/sap/trans/EPS/in, or will EHPI automatically un-CAR them during the upgrade?
Also, is there any specific location to place the component-specific (EA_APPL, EA_BASIS, etc.) .SAR files?
Because of certain errors, our upgrade has already had to be reset a couple of times in the Configuration stage, so we would like to cross-check the above points against your inputs.
Also, is there a step-by-step guide for EHP4 available? We have the "SAP ERP 6.0 Including Enhancement Package 4 Support Release 1" and "How to install..." PDF guides, but they don't give screen-by-screen guidance like the usual SAP version upgrade guides available in the market.
We are doing an EHP4 upgrade for the first time, so your help with these basic doubts will help us proceed further.
Thanks
TS
1) We have set the download directory to /EHPI/EHPI (the /EHPI filesystem was created by the Unix team, and after un-CARing SAPehpi_42-10005805.SAR the EHPI directory was created) - is this OK?
Yes. Make sure that the directory is created by <sid>adm. The EHP guide recommends creating the EHPI folder under /usr/sap/.
2) In which location should we place stack.xml? Is there a specific location, or can we place it anywhere in the system and enter that path on the stack configuration tab during the Configuration phase?
There is no fixed location, but it is good practice to put it under an EHPI/download directory. During the Configuration phase, EHPI will ask for the stack.xml file, and you can browse to the location where your XML file resides.
3) For the actual support package files which we downloaded from SMP (through Solution Manager), do we need to un-CAR them and then put them in /usr/sap/trans/EPS/in, or will EHPI automatically un-CAR them during the upgrade?
No, no un-CARing is required; EHPI will un-CAR them itself.
Also, is there any specific location to place the component-specific (EA_APPL, EA_BASIS, etc.) .SAR files?
No, you can place all files (upgrade files + SPs) in the same directory.
Similar Messages
-
Redo Log and Supplemental Logging related doubts
Hi Friends,
I am studying supplemental logging in detail. I have read lots of articles and Oracle documentation about it and about redo logs, but could not find answers to some doubts.
Please help me clear it.
Scenario: we have one table with a primary key, and we execute an update query on that table which does not use the primary key column in any clause.
Question: In this case, does the redo log entry generated for the changes made by the update query contain the primary key column values?
Question: If we have a table with a primary key, do we need to enable supplemental logging on the primary key columns of that table? If yes, under which circumstances do we need to enable it?
Question: If we have to configure Streams replication on that table (having a primary key), why do we actually need to enable its supplemental logging? (I have read documentation saying that Streams requires some more information, but what information does it actually need? This question is closely related to the first question.)
Also, please suggest any good article/site which provides inside details of redo logs and supplemental logging, if you know of one.
Regards,
Dipali

1) Assuming you are not updating the primary key column and supplemental logging is not enabled, Oracle doesn't need to log the primary key column to the redo log, just the ROWID.
2) This is rather hard to answer without being tautological. You need to enable supplemental logging if and only if you have some downstream use for additional columns in the redo logs. Streams, and the technologies built on top of Streams, are the most common reason for enabling supplemental logging.
3) If you execute an update statement like
UPDATE some_table
SET some_column = new_value
WHERE primary_key = some_key_value
AND <<other conditions as well>>
and then look at the update statement that LogMiner builds from the redo logs in the absence of supplemental logging, it would basically be something like
UPDATE some_table
SET some_column = new_value
WHERE rowid = rowid_of_the_row_you_updated
Oracle doesn't need to replay the exact SQL statement you issued (and thus it doesn't have to write the SQL statement to the redo log, and it doesn't have to worry if the UPDATE takes a long time to run; otherwise, it would take as long to apply an archived log as it did to generate the log, which would be disastrous in a recovery situation). It just needs to reconstruct the SQL statement from the information in the redo, which is just the ROWID and the column(s) that changed.
If you try to run this statement on a different database (via Streams, for example) the ROWIDs on the destination database are likely totally different (since a ROWID is just a physical address of a row on disk). So adding supplemental logging tells Oracle to log the primary key column to redo and allows LogMiner/ Streams/ etc. to reconstruct the statement using the primary key values for the changed rows, which would be the same on both the source and destination databases.
Justin -
Hi all,
What is a message in an IDoc? Please explain in layman's terms. Is it part of the data records? Please point me to related files or threads.
Thanks,
Charles

Hi,
A message type is normally assigned to an SAP document type (IDoc). A message type represents a business function (purchase order data, invoice data, etc.). The technical structure of the message type is the IDoc type.
An <b>IDoc type</b> can be implemented for various "logical" <b>messages</b>; one message can also be assigned to different IDoc types (m:n relationship).
The message is defined by the values for <b>message type (required),</b> <b>message code</b> (optional) and <b>message function</b> (optional). These three fields belong to the key fields in the partner profiles, which are used, for example, to determine inbound processing. If the 'Message type' field is not maintained (e.g. in the case of a control record from Release 2.1, in which MESTYP did not exist), the IDoc Interface uses the value from STDMES (EDI message type).
<b>Example</b>
The message ORDERS (= message type) identifies purchase orders (outbound) and sales orders (inbound).
Check transaction <b>WE81,</b> for Message type in SAP.
Also, follow this link for more information related to IDoc processing:
http://help.sap.com/saphelp_46c/helpdata/en/72/c18ee5546a11d182cc0000e829fbfe/frameset.htm
Check this link for information about "Distribution Using Message Types":
http://help.sap.com/saphelp_46c/helpdata/en/78/2174a351ce11d189570000e829fbbd/frameset.htm
Let me know if you need any other information.
Regards,
RS -
Hello Experts,
I have a few doubts (questions) regarding the upgrade from BW 3.5 to BI 7.0. I am having certain issues after the upgrade, and I want to know whether these are because of the upgrade or whether they are general issues. Can you please give detailed reasons as well?
Below are my questions
1. An event triggers a process chain in 3.5, but the same process chain did not work after the upgrade; when I checked, the event was missing from the process chain.
2. There were inconsistencies in the InfoObjects between Dev and Production.
3. Development was low on memory.
Points assured for all answers.
Thanks in advance for your inputs.

Hi Lalitha,
Migration should not affect the process chain. You should activate and schedule the process chain again, and check that the event is available.
Regarding memory: BI runs on a Java instance and may need more memory, so ask the DBA to increase it.
Cheers,
HVR -
Hi All,
I have a doubt: does the code segment of a Java class get repeated in all the objects created from that class, or is a single code segment used by all objects created from a single class?
Regards,
Sourav

No, code is associated with a class, and there is a single copy of a class within a process for each class loader that loads it (usually just one).
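For a rough illustration of the same one-copy-per-class idea outside the JVM, here is a minimal Python sketch (the Greeter class is invented for illustration): the method's code lives on the class object, and every instance merely references it rather than carrying its own copy.

```python
class Greeter:
    """Toy class; the method code below is stored once, on the class."""
    def greet(self):
        return "hi"

a = Greeter()
b = Greeter()

# Both instances refer to the same class object.
shared_class = type(a) is type(b)

# Both bound methods wrap the very same underlying function object,
# so the code segment is shared, not duplicated per instance.
shared_code = a.greet.__func__ is b.greet.__func__ is Greeter.greet

print(shared_class, shared_code)  # True True
```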
-
Directory-related functions in LrFileUtils
I've been wanting an "isDirectory()" function for LrFileUtils. Consider this a request to add such in the future.
I've been trying to fabricate something similar from the existing APIs in the meantime. In the process I have discovered the following two issues:
the recursiveDirectoryEntries/recursiveFiles functions (and maybe directoryEntries as well) do not return any error indication if the passed name is for a file, or for something non-existent. It seems like they should. Ideally two different indications for those two different scenarios, so that one could tell the difference.
isEmptyDirectory() returns true if the name is for a file. I don't think I tested passing it the name of something non-existent. Also, it would be nice to get more information from fileAttributes(), specifically around whether the named thing is a directory.

Doh! :-) Indeed it does. I must confess I saw the method in the summary but never even read its description, assuming that it would just return a true/false and wasn't useful for what I was doing.
I've just tried a few examples with it. It mostly works as I expect, with one exception, which you may very well find to be nitpicky. I've found that Lr is usually happy regardless of whether I use "/" or "\" as a separator (on Windows, at least), except that "//foo" and "//foo/bar" return false even though "\\foo" and "\\foo\bar" return true. "//foo/bar/baz" works fine. I really have no clue whether this is an Lr quirk or a Windows quirk.
Regardless, I'm satisfied and moving along. Thanks Eric! -
BC4J,iAS9i: _pages directory related problem
Hi all,
I deployed a BC4J app generated by Jdev to iAS and have the following problem:
- If I shut down iAS and restart it, the BC4J web beans in the JSP pages cannot load data from Oracle 8i.
- If I shut it down, delete the ../apache/apache/htdocs/myApp/_pages directory under the app directory, and restart iAS, everything works just fine.
Is this a known bug?
best regards

Posting this again in hope of an answer...
-
Receiving Open Interface related doubt
Hi All,
We are currently on Oracle Release 12.0.4 and need some help/information regarding the Receiving Open Interface tables: rcv_headers_interface and rcv_transactions_interface.
We import the ASN, which is sent across from the supplier via EDI, directly into the Receiving Open Interface tables, and then the Receiving Transaction Processor (RTP) program processes these into the base tables to complete the receipt process.
Now, for some reason or other (say, because of EDI data issues or some other system issue with the RTP program), the interface records end up in error status and the users have to complete the receipt manually in Oracle Forms.
Over a period of time this interface data keeps accumulating, while the corresponding PO document has already been received manually in Oracle.
Because of these errored interface records, the interface tables keep growing in data volume.
Is there a script or a program which validates these interface records and purges them automatically, based on some validations, on its own?
For example: if records for a PO are stuck in the interface table but the PO shipment has already been received manually and is in Closed status, it would automatically purge those records from the interface tables.
Could someone please help us with this as soon as possible with any suggestions or workarounds?
Thanks

Currently there is no purge program available to purge the records in RTI. Furthermore, Oracle Support won't be able to provide a delete script to delete the records in RTI (the reason being that the records are populated by the customer, and customers are the best people to validate the records and delete them on their own).
However, for ASNs:
You may write a query based on RTI.po_distribution_id, check whether the corresponding PO shipment/distribution is already received/closed for receiving, and then delete. -
Reports 3.0 - Printing in compressed mode & related doubts
Dear Friends,
I am working as an Oracle Developer/2000 programmer in Oman. As I am new to this software, I am having some problems and would very much appreciate your valuable suggestions/solutions. Please help me!
Problem 1: Printing Compress Mode
Software/Program specification:
- Oracle 8.0.5 / Developer 2000 (Forms 5.0/Reports 3.0) on
Windows N.T 4.0
- Report Layout with 250 characters (Designed in Character mode)
- Printer Epson LQ 2170
My requirement is:
* How do I print in compressed mode using 132-column width paper? Meanwhile, the user also wants to preview the output on the preview screen.
Problem 2: How to programmatically maximize the Preview Screen in
Reports 6.0.
Problem 4: Why, in Forms 6.0, does the Page Up/Down key not function by default?
Note : Reply by [email protected] / [email protected]
Thanking you.
Syed Ali.M
Hi there,
this is Jitu here.
I want to design reports in character-mode format! It seems that you are already working on this and have had some success with it.
In my case, whatever changes I make in the layout properties, like orientation or page size, I see no effect while printing or previewing.
Tell me if you can help me with this!
Thanks in advance,
best regards,
jitu :-) -
Group By -- Having Clause related doubt.
Hello,
Can we write/use a HAVING condition before a GROUP BY clause?
If yes, then how does it work? I mean, HAVING is a WHERE-like clause (a filter) on aggregate results...
So how does HAVING work before grouping the results?

Hi,
Aijaz Mallick wrote:
Hello,
Can we write/use a HAVING condition before a GROUP BY clause?

What happens when you try it?
If yes, then how does it work? I mean, HAVING is a WHERE-like clause (a filter) on aggregate results...

Right; the HAVING clause is like another WHERE clause.
The WHERE clause is applied before the GROUP BY is done and the aggregate functions are computed.
The HAVING clause is applied after the GROUP BY is done and the aggregate functions are computed, so you can use aggregate functions in the HAVING clause.
So how does HAVING work before grouping the results?

The order in which clauses appear in your code isn't necessarily the order in which they are performed. For example,
SELECT job
, COUNT (*) AS cnt
FROM scott.emp
GROUP BY job;
Does it confuse you that this query can reference COUNT (*) in the SELECT clause, which is before the GROUP BY clause?
The SELECT clause always comes before the GROUP BY clause in code. That does not mean that the SELECT clause is completed before the GROUP BY clause is begun.
If the documentation says that clauses must be in a certain order, then use that order, even if your current version of Oracle allows them to be in a different order. There's no guarantee that the next version of Oracle will allow something that was always said to be wrong. -
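The evaluation order described above is easy to see in action with any SQL engine. Below is a small sketch using Python's built-in sqlite3 module and a toy emp table (the table name and data are invented for illustration, loosely echoing scott.emp): WHERE would filter rows before grouping, while HAVING filters the groups after the aggregates are computed, which is why an aggregate like COUNT(*) is legal in HAVING but not in WHERE.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE emp (ename TEXT, job TEXT)")
conn.executemany(
    "INSERT INTO emp VALUES (?, ?)",
    [("SMITH", "CLERK"), ("ADAMS", "CLERK"),
     ("ALLEN", "SALESMAN"), ("KING", "PRESIDENT")],
)

# HAVING is evaluated after the grouping, so it can filter on the
# already-computed aggregate COUNT(*).
rows = conn.execute("""
    SELECT job, COUNT(*) AS cnt
    FROM emp
    GROUP BY job
    HAVING COUNT(*) > 1
""").fetchall()

print(rows)  # [('CLERK', 2)]
```

Only the CLERK group survives, because the other jobs have a single row each; a WHERE clause could never have expressed this condition, since no COUNT exists before the groups are formed.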
Soa Suite 11g related doubt.
Hello All,
We are currently using SOA Suite 11g, where we have an SOA composite with a BPEL process which dequeues the B2B payload from the seeded IP_IN_QUEUE in the B2B database.
Earlier, on 10g, we were able to get the B2B message ID from the Payload Header --> Msg ID field, along with other fields like FROM_PARTY, TO_PARTY, DOC_TYPE, etc., when we received the dequeued message payload using the Receive activity.
Now, in 11g, we see that this Msg ID comes back as NULL.
Could someone please explain how, in the BPEL process, we can get the B2B message ID corresponding to this successfully processed payload in B2B?
When doing outbound EDI transactions into B2B, we enqueue the payload message to IP_OUT_QUEUE and populate the property named jca.aq.Headerdocument on the Invoke node in the BPEL process with values like this in string format: <payload><doc_type>850</doc_type><sender>host party</sender>...etc., so that B2B identifies the valid agreement and deployment using the From and To party and the document name.
Now, while getting the message out of the B2B internal queues, how do we get the B2B message ID?
Could someone please help with this? We are currently stuck because of this issue.
regards

A few blogs:
http://sriworks.wordpress.com/2010/12/20/oracle-certification-oracle-soa-foundation-practitioner1z0-451/
http://oraclefusionmiddleware.blogspot.com/2011/01/oracle-soa-foundation-practitioner.html
http://www.oracle.com/partners/en/knowledge-zone/middleware/oracle-service-oriented-architecture-soa/soa-exam-page-170307.html
What helped me was reading the SOA Suite books (around 2 or 3 of them, I guess; Google for them) and the user/developer guides.
Forward all alerts related to Active Directory to a specific email
Hi All,
When monitoring with SCOM, how do we forward all Active Directory-related alerts to the administrator who is responsible for AD?
I saw that when right-clicking on a specific alert a notification subscription can be created, but this is only for that specific alert. How do we create an email subscription for all errors related to the AD management pack?
Thanks,
Kanishka

Hi,
Create a new subscription; on the criteria page, select "created by specific rules or monitors". In the rule and monitor search, select the management pack(s) you want and click Search. Then mark all rules/monitors and click Add. Repeat this step for every management pack you want to receive alerts from.
Cheers,
Stefan
Blog: http://blog.scomfaq.ch -
TP causes message in SLOG0706.SMP in log directory - SPAM Hung
Hi
I started the import of a queue of support packs in SPAM to the SMP system (Basis 7.00 , SM 4.0)
After running for a few hours overnight, there was a TCP/IP problem, and at the third-to-last package SPAM showed an error in the IMPORT_PROPER phase, the reason being TP_INTERFACE_FAILURE.
I restarted the import (obviously not the right thing to do!), and shortly afterwards, although SPAM shows it is doing the MAIN IMPORT step, nothing is happening; SLOG0706.SMP has had lots of the following entries since I restarted the SPAM transaction:
WARNING: /usr/sap/trans/tmp/SAPKKITL417.SMP is already in use (1070), I'm waiting 4 sec (20070209074827).
My name: pid 11357 on <hostname> (smp03)
Looking at the PID, I find it is the OS job tp running and complaining about SAPKITL417, which was the third-to-last package in the queue (the last one to import is SAPKITL419).
There are 2 files from last night in the tmp transport directory related to it:
SAPIITL417.SMP *STMF SMPOFR 3083687 Feb 08 23:22
SAPKKITL417.SMP *STMF SMPOFR 314 Feb 08 23:20
The log directory has the following file since the SPAM was restarted:
Browse : /sapmnt/trans/log/P070209.SMP
ETP199X######################################
ETP172 MOVE OF NAMETABS
ETP101 transport order : "ALL"
ETP102 system : "SMP"
ETP108 tp path : "tp"
ETP109 version and release : "370.00.09" "700"
ETP198
ETP399XMove Nametabs started in batch mode.
ETP399 processing modeflag 'A'
ETP301 -
ETP364 Begin: Act. of Shadow-Nametabs ("2007/02/09 06:54:07")
ETP301 -
ETP301 -
ETP338XActivation of Shadownametabs
ETP301 -
ETP301 -
ETP365 "0" Shadow-Nametabs activated
ETP362 "0" Shadow-Nametab activations failed
3 ETP362 "0" Shadow-Nametab activations failed
2 ETP364 Begin: Act. of Shadow-Nametabs ("2007/02/09 06:54:07")
2 ETP366 End: Act. of Shadow-Nametabs ("2007/02/09 06:54:07")
In SM12 there are 2 lock entries from when the import started yesterday:
000 POWEM 08.02.2007 E PAT01 #################### 1 0
000 POWEM 08.02.2007 E CWBNTHEAD 9999999999#### 0 1
In PAT01 there are entries for each package in the queue and the reason field has TP_INTERFACE_FAILURE
Table: PAT01
PATCH SEQUENCE STEPNAME
SAPKA70009 003 IMPORT_PROPER
SAPKA70010 010 IMPORT_PROPER
SAPKB70009 001 IMPORT_PROPER
SAPKB70010 002 IMPORT_PROPER
SAPKIPYJ79 004 IMPORT_PROPER
SAPKIPYJ7A 011 IMPORT_PROPER
SAPKITL417 006 IMPORT_PROPER
SAPKITL418 009 IMPORT_PROPER
SAPKITL419 015 IMPORT_PROPER
SAPKNA7006 007 IMPORT_PROPER
SAPKNA7007 013 IMPORT_PROPER
SAPKU50006 008 IMPORT_PROPER
SAPKU50007 014 IMPORT_PROPER
SAPKW70009 005 IMPORT_PROPER
SAPKW70010 012 IMPORT_PROPER
e.g.
PATCH SAPKITL417
SEQUENCE 6
STEPNAME IMPORT_PROPER
STEPCOND ?
RET CODE
MESSAGE
REASON TP_INTERFACE_FAILURE
SCENARIO S
VALID CONJ 1
What should I do to fix this?
thanks
Michael

Hi,
We are currently stuck with the same error message and SPAM is hung. The message in SLOG being: "WARNING: \\rpmsap8\sapmnt\trans\tmp\SAPKKITL414.SM1 is already in use (1200), I'm waiting 2 sec (20070222184527). My name: pid 7556 on rpmsap8 (sm1adm)"
But I am currently not able to rename the file SAPKKITL414.SM1; it says the file is already being used by another program.
Please advise at the earliest!
Antarpreet -
How to find out if a directory exists
I have a directory path as an input parameter to my stored procedure and would like to validate the path. I have seen several examples using xp_fileexist. Are there any gotchas when using this? It sounds like it will not work for UNC paths (\\asdfasdf\asfas).
And does a security setting need to be turned on? Is there another way to validate a path?

SSIS is a better fit than T-SQL for directory-related tasks.
If xp_cmdshell is active, you can use DOS commands:
http://www.sqlusa.com/bestpractices2005/filesindirectory/
You can also use PowerShell:
http://technet.microsoft.com/en-us/library/dd315304.aspx
Security considerations:
http://stackoverflow.com/questions/11740000/check-for-file-exists-or-not-in-sql-server
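If stepping outside pure T-SQL is an option (for example, from a script task or an external job step), a directory check is a one-liner in most scripting languages. Here is a minimal sketch in Python; the function name validate_dir is invented for illustration. Note that os.path.isdir distinguishes directories from plain files and, on Windows, also accepts UNC paths such as \\server\share.

```python
import os

def validate_dir(path):
    """Return True only if path names an existing directory.

    Unlike a bare existence check, this rejects plain files
    and nonexistent paths alike.
    """
    return os.path.isdir(path)

print(validate_dir("."))                  # True: the current directory exists
print(validate_dir("no_such_dir_12345"))  # False: nonexistent path
```

The same check via PowerShell's Test-Path, as linked above, gives equivalent behavior from a SQL Server Agent job step.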
Kalman Toth Database & OLAP Architect
SQL Server 2014 Design & Programming
New Book / Kindle: Exam 70-461 Bootcamp: Querying Microsoft SQL Server 2012 -
No error if source directory and package don't match
Hello,
I am using Sun's JDK 1.5.0_09 "javac" compiler, and the package name in my Test class is written as "com.hrishijoshi.asdf". This file compiles even if I put it in a directory called "com" (or anything else, for that matter).
javac doesn't throw an error even if the source file's directory structure doesn't match the package name defined in that file! It creates a directory structure matching the package name (not the source tree) in the output "classes" directory.
Is this a feature or a bug? ;-)
- Hrishi Joshi.
PS: Here is a log of my compilation attempt on Fedora Core 5 Linux:
[hrishi@hrishi test]$
[hrishi@hrishi test]$ rm -rf classes/com/
[hrishi@hrishi test]$ cat src/com/Test.java
package com.hrishi.asdf;
public class Test {
}
[hrishi@hrishi test]$ javac -Xlint:all -sourcepath src -d classes src/com/Test.java
[hrishi@hrishi test]$
[hrishi@hrishi test]$ tree classes
classes
`-- com
`-- hrishi
`-- asdf
`-- Test.class
3 directories, 1 file
[hrishi@hrishi test]$

There is no requirement that source code files be in the correct directory structure. If you want javac to be able to automatically compile dependent files, then the source code must be in an appropriate structure and findable from the classpath.
Also, when you use the -d option, javac will create the appropriate package directory structure under the directory specified by -d. That's how it is supposed to work.
So it is working as designed and specified.