Modification vs cloning
Hi,
I have 2 questions:
1. I need to make something like 50-60% code changes in standard SAP report programs such as VKM1: adding extra selection criteria, adding extra buttons to the ALV output, and then, when the user selects records and clicks those buttons, showing some additional data stored in the sales order (as append structures), etc. Which is the better option: modifying the standard programs like RVKRED01, RVKRED02 etc., or cloning them? What is the impact in each scenario?
2. The standard logical database used for the above is KMV (program SAPDBKMV), which has performance issues. So I want to add/modify some lines in it. Can I make the modifications directly in this program? Is that allowed?
Thanks
Moderator message - Please ask a specific question and search before asking - post locked
Edited by: Rob Burbank on Jun 1, 2009 2:28 PM
Hi,
Enhancements can be made via the ABAP Dictionary (table enhancements), customer exits (function module, screen and menu exits), Business Transaction Events, and Business Add-Ins (BAdIs); modification covers non-registered/assisted changes to standard objects and user exits in the SD module...
Similar Messages
-
Stored procedure modification breaks data refresh
Hello All,
I have a PowerPivot model that is based off of a number of views and stored procedures on SQL Server. The procedures and views work fine, but I'm at a point where the business users have requested modifications
that require changes (additions and deletions) of fields in these views and procedures. The views are easy to modify (I wrote select statements and selected field names rather than SELECT * ), but tedious since I have to write out all field names and do it
for each table based on a view.
The procedures are giving me more trouble, since the columns are not defined in the SQL calling the procedure. (Just procedure_name parameter1, ...). If I remove a column from the result set of the stored procedure,
refresh fails for the table fed by that procedure. When adding columns to the result-set, the new columns are not available after a normal refresh. I have to open the table itself and view the table properties. Re-saving the SQL calling the procedure seems
to be the trick for getting PowerPivot to recognize the new columns. These procedures are used in multiple workbooks.
The result of this is that I have to manually update each table in the model when I make a change like this. These views and procedures are used in multiple workbooks, making maintenance of this sort a chore.
I am using a workaround found here (http://social.msdn.microsoft.com/Forums/sqlserver/en-US/356cb739-37bd-4b43-a202-5b40ae171919/powerpivot-excel-2013-crashes-after-column-rename-in-source-datatabase?forum=sqlkjpowerpivotforexcel) to delete columns from
the stored procedure-fed tables. I manually delete the old column in PowerPivot before deleting it from the procedure.
So I am able to complete my work, but in a very circuitous manner. Is there some better way to update these tables and add/remove columns from the model?

I don't think you're going to find a better way. Versioning may help, by which I mean creating different views/procedures with versioned names (e.g. vwGetInfoV001) or some such thing, then cloning the spreadsheets with the current version name recorded in a documentation tab. At least that way you can upgrade gracefully without simply breaking the old while waiting to get all spreadsheets upgraded to the new. -
Hi friends
I am trying to clone the records present in the application using a Perl manipulator.
To the cloned records, I need to add a new property which is a combination of 2 properties: product id and group name. Product id is an existing property; group name is a new property to be created and added both to the cloned records and to the existing set of records.
I am trying to use the following code in get_records() of the Perl manipulator, but the records are being removed once processed by the baseline update.
Can anyone please help me with this?
Here is the code I am using. Please correct it if there is anything wrong; I am not very sure of the syntax.
EX:
Records before:
product_id:123
Sku_id:456
Cat_id:888
Records after passing through perl manipulator:
product_id:123
Sku_id:456
Cat_id:888
group_name:A
product_key:123_A
product_id:123
Sku_id:456
Cat_id:888
group_name:B
product_key:123_B
product_id:123
Sku_id:456
Cat_id:888
group_name:C
product_key:123_C
product_id:123
Sku_id:456
Cat_id:888
group_name:D
product_key:123_D
product_id:123
Sku_id:456
Cat_id:888
group_name:X
product_key:123_X
my @recs;
foreach my $src ( $this->record_sources ) {
    my @productData = $src->get_records($key);
    foreach my $rec ( @productData ) {
        foreach my $groupName ( qw(A B C D E X) ) {
            # clone the record
            my $clone = $rec->clone;
            # find the product_id property on the clone (value accessor assumed)
            my ($productIdPval) = grep { $_->name eq 'product_id' } @{ $clone->pvals };
            my $productId = $productIdPval->value;
            # add group_name and product_key properties to the cloned record
            $clone->add_pvals( new EDF::PVal("group_name", $groupName) );
            $clone->add_pvals( new EDF::PVal("product_key", $productId . "_" . $groupName) );
            # push the cloned record into @recs
            push @recs, $clone;
        }
    }
}
return @recs;
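For reference, the clone-and-tag logic can be sketched outside the EDF framework in plain Python, with records modeled as dicts (the group names are the example values from the post; this is only an illustration of the transformation, not the pipeline code itself):

```python
# Sketch: for each record, emit one clone per group name, adding
# group_name and a product_key of "<product_id>_<group>".
import copy

GROUP_NAMES = ["A", "B", "C", "D", "E", "X"]

def clone_records(records):
    """Return the cloned records with group_name and product_key added."""
    out = []
    for rec in records:
        for group in GROUP_NAMES:
            clone = copy.deepcopy(rec)       # clone the record
            clone["group_name"] = group
            clone["product_key"] = f"{rec['product_id']}_{group}"
            out.append(clone)
    return out

records = [{"product_id": "123", "Sku_id": "456", "Cat_id": "888"}]
cloned = clone_records(records)
print(len(cloned))                 # 6
print(cloned[0]["product_key"])    # 123_A
```

Note that the original input record is left untouched; each group gets its own deep copy, which matches the "clone then decorate" shape of the Perl code.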
Edited by: Shreyas Ram R on Nov 10, 2012 10:33 PM
Edited by: Shreyas Ram R on Nov 10, 2012 10:37 PM
Edited by: Shreyas Ram R on Nov 10, 2012 10:38 PM

I'd do this a different way. If you are cloning all records in a feed, then read the records in twice - once as normal, then another set of record adapter(s). The second set goes into an XML manipulator, and this modifies the properties (see the XML reference within Developer Studio; it looks quite complicated but is actually very easy once you get your head around an XML being used as a pseudo programming language). Finally, both sets of data go into a switch join (Record Assembler > Join Type = Switch).
Reason for this approach is it'll be a lot faster than using a perl manipulator to create these duplicates, and a lot cleaner too. Thinking about it, you may be able to read in the data once, then send it to two separate record caches, manipulate the second cache, then feed both into the switch join. Not 100% sure if that is supported, but definitely worth trying first.
Oh, and add a record spec - always a good idea, not least because it'll give you persistence of a unique id for record lookups (R=XYZ).
Michael -
Creating shell script for cloning
Hi,
I am cloning an 11.5.8 instance. For copying files from source to production I use the rcp command. I want to create a shell script that copies the files and, when it is complete, sends an email to my rediffmail account.
There are 2 problems:-
1. How to track if the rcp command has completed.
2. How to send out an email. I tried to send an email using sendmail but it gave me an error "stdin: Value too large for defined data type".
Please help.
Thanks & Regards,
Neeraj

Hi Neeraj,
I'd suggest using the exit code from the rcp command to drive your email step. For most commands in Linux, an exit code of 0 indicates success, and in most shells, the environment variable $? stores the exit code of the most recent command. So, you could do something like this:
rcp source_files target_location
rc=$?   # capture the exit code before testing it, since later commands reset $?
if [[ $rc -eq 0 ]]
then
echo "Huzzah!" | mailx -s "RCP from source to target successful" your_username@your_host
else
echo "Bummer. Got exit code $rc from rcp" | mailx -s "RCP from source to target failed" your_username@your_host
fi

The preceding advice is offered with three disclaimers:
1) I don't know which shell you're using (ksh, bash, csh?) -- my code snippet assumes ksh, but the principles should be the same in any shell
2) My morning coffee has not yet taken effect, so I may be completely wrong about something important :-)
3) The code above is untested, and will probably not work in any shell without modification.
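For comparison, here is the same check-the-exit-code pattern sketched in Python. The command and the mail step are stand-ins (sending the actual email is left to the caller, since the mailer and addresses vary by site):

```python
import subprocess

def run_and_report(cmd):
    """Run a command and return an (ok, subject) pair describing the result.
    The subject string is what you would feed to your mail command."""
    result = subprocess.run(cmd, capture_output=True, text=True)
    if result.returncode == 0:
        return True, "RCP from source to target successful"
    return False, f"RCP failed with exit code {result.returncode}"

# "true" and "false" are stand-ins for the real rcp invocation.
ok, subject = run_and_report(["true"])
print(ok, subject)
```

The key point is the same as in the shell version: capture the return code once, immediately after the command, and branch on the saved value.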
Despite all that, hopefully this will put you on the right track.
Regards,
John P.
Message was edited by:
jpiwowar (minor clarification) -
Cloning AS & DB - Definitive Guide?
All,
I am not an Oracle DBA by far, but have installed Oracle 10g DB and AS successfully a couple times. We are running on Redhat ES4 with DB 10.2.0.3 and AS 10.1.3.1 which run on separate hosts (in VMware).
I have been asked to clone these instances to two more hosts to be the production instances. I have semi-successfully cloned the DB using the instructions here (with some modification):
[http://www.dba-oracle.com/oracle_tips_db_copy.htm]
The only problem with that clone at this point is that the new host still has the old hostname when we view it in EM. (Does anyone know how to fix that also? :)
Anyway, this is just background info.
I have seen all the documentation on cloning and the many ways to do it.
_But my real question is: Is there a definitive guide on how to clone DB and AS that are installed on two separate hosts to two new hosts and maintain everything in an exact state??_
Thanks and Peace,
Tom

There are various ways of doing this.
For example if you do not want to use Rman, here is a guide for the DB piece on Metalink:
Subject: Steps to Manually Clone a Database
Doc ID: Note:458450.1
For Oracle AS, I'd refer to the OTN Docs:
Oracle® Application Server Administrator's Guide
10g Release 3 (10.1.3.2.0)
Part Number B32196-01
9 Cloning Application Server Middle-Tier Instances
http://download.oracle.com/docs/cd/B32110_01/core.1013/b32196/cloning.htm#CEGFDGCF -
Cloning a resource making my head spin
Newbie question: why is cloning a resource so complicated?! If I want to clone the resource "AD User" the modifications to my exported 1.4MB XML file are not very well explained anywhere. Despite spending a good amount of time on these forums, there is no clear explanation of the fields which need to be modified to support the cloning operation.
Rajiv's response on this post is the closest I have seen to an explanation:
Provisioning OID Account into 2 boxes
Can anyone else help?

We are in the same boat. I am trying to clone AD User and have had partial luck: I am able to create the objects for my new AD, but still have a few things to try out.
I have an SR open as well, but that will be the slowest way.
Better to develop some knowledge of the objects involved.
Try running a few iterations with the XML files.
The only details available are in the connector documentation.
From the Connector Documentation (Oracle Identity Manager Connector Guide for Microsoft Active Directory User Management - Release 9.1.1 - E11197-11):
Section: 4.15.1
To create a copy of the connector:
1. Create copies of the IT resource, resource object, process form, provisioning process, scheduled tasks, and lookup definitions that hold attribute mappings.
2. Create a copy of the Lookup.AD.Configuration lookup definition. In the copy that you create, change the values of the following entries to match the details of the process form copy that you create.
• ROUserID
• ROUserManager
• ROFormName
• ROUserGUID
See "Configuring the Lookup.AD.Configuration Lookup Definition" for information about these entries.
3. Map the new process tasks to the copy of the Lookup.AD.Configuration lookup definition. -
Hi all,
We have this situation.
We have made some modifications in our production init.ora file (source of cloning) initPROD.ora,
for example we have changed the sga_target parameter from:
sga_target = 1G
to
sga_target = 2G
After that we have finished the cloning procedure to new target instance and value of our modified parameter is still old value of sga_target (1G).
Do we need to change manually every parameter we have changed on production (source of cloning) or there is some other way to include this changes in new cloned instance?
Thanks,
Regards,
D.

After that we have finished the cloning procedure to new target instance and value of our modified parameter is still old value of sga_target (1G).
Do we need to change manually every parameter we have changed on production (source of cloning) or there is some other way to include this changes in new cloned instance?

Yes -- please see this thread for details: Re: Changes not taken context file On Db Tier in R12.1.3
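If you need to see which parameters drifted between the source init.ora and the cloned one before fixing them by hand, a small diff like this can help (a sketch only, assuming plain name=value files; real init.ora files can be more complex):

```python
def parse_init_ora(text):
    """Parse simple name=value lines from an init.ora-style file,
    skipping blanks and '#' comment lines."""
    params = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        name, _, value = line.partition("=")
        params[name.strip()] = value.strip()
    return params

def drifted(source_text, clone_text):
    """Return {param: (source_value, clone_value)} for values that differ."""
    src, dst = parse_init_ora(source_text), parse_init_ora(clone_text)
    return {k: (src[k], dst[k]) for k in src if k in dst and src[k] != dst[k]}

source = "sga_target = 2G\nprocesses = 300"
clone = "sga_target = 1G\nprocesses = 300"
print(drifted(source, clone))   # {'sga_target': ('2G', '1G')}
```

Running this against the real files would flag sga_target as the parameter that still carries the pre-change value on the clone.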
Thanks,
Hussein -
Cloning = Defrag...? How?
Hi all,
A day of idle curiosity, apparently, and I was just wondering...
I have read here that cloning to an external, wiping the original and re-cloning back effectively defragments 'everything'. How does this work? To my way of thinking a clone is a clone is a clone, fragmentation and all.
How is it that DU, CCC or other back-up utilities 'know' to put that bit with this bit (or byte ) in the back-up so that the end result is more contiguous (if that is the right word) than the original?
Anyone care to elaborate, elucidate or otherwise educate?
I've no doubt that this may get a bit technical, I promise I'll try and keep up
Adrian

If someone creates and edits (i.e. modifies and re-saves) files *larger than 20 MB* (yes, MB), such as when working on videos, that need to be accessed with maximum speed, then fragmentation can be an important issue. For the rest of us... well, instead of me rambling on, a Google search will give you masses to read about "Hot-File-Adaptive-Clustering" and related matters. As you say, there is a large disparity of views; but because many of those views reflect a lack of knowledge, some reflect an ulterior motive, a prejudice, a 'gut feeling' or a 'feel-good factor', and some reflect experience of platforms other than OS X and HFS+, I strongly recommend that you start by reading what Apple themselves say, e.g. in
this short document.
I suspect that reading that might well make you decide to forget 'defragging' for ever ( ). If you want more info there is
a huge Apple Tech Note here
but you might only need to see a small section, "Hot Files", near the end to get a sense of what is constantly going on in the background as you use your Mac.
Many thanks for your kind thanks!
Andreas -
Error while transporting modification in module pool of a specific infotype
HI experts,
While transporting some modifications made to the PAI of a module pool of a specific infotype, we get an error with return code = 8. The error is --> Original object R3TR PROG MP900730 must not be changed!
Any suggestion?
Thanks a lot.

Hi,
Nice to know that your problem is solved. If any of the replies were useful, please acknowledge their work.
And it would be great if you could add your remarks about the best solution you found for that problem.
It might be useful to other users when they encounter such a problem.
BR,
Vijay. -
Can any one tell me how can I move to a different folder pictures, that I've cloned, without them staying aggregated? They all come together to the other folder and I don't want that… thanks
There's more to it than that.
Folders in Aperture do not hold Images. They hold Projects and Albums. You cannot put an Image in a Folder without putting it in a Project or an Album inside that Folder.
The relationship between Projects and Images is special: every Image must be in a Project, and can be in only one Project.
Images can be in as many Albums as you want. Putting an Image in an Album does not move it from the Project that holds it.
You can make as many Versions from a Master as you want.
What you want to do may appear simple to you, but it still must adhere to how Aperture works. I still can't tell exactly what you are trying to do (specifically: Images don't live in Folders; moving an Image from a Folder is nonsensical).
It can be very confusing (and frustrating) to get going with Aperture -- but it does work, and can be enormously helpful. If you haven't, take a look at the video tutorials on Apple's Aperture support site.
I feel as though I haven't helped you much -- but we need to be using the same names for interface items in order to get anything done -- and my sense is that you still haven't learned the names of the parts. -
Hello all,
I ordered a CTO Mac Pro for heavy rendering and animating work, and I am planning on using bootcamp to install a windows partition (for 3DsMax). As I am now using a HP Elitebook 8770W that has several valuable files and projects on it, I have bought a Seagate 4Tb external Desktop Drive that uses USB 3.0 to use as a backup drive. Now, as I saw that Bootcamp Assistant only supports installing a x64 version of Windows 8, my question is:
Will a late 2013 Mac Pro using bootcamp assistant boot from an external USB drive with a cloned partition running x64 win7? Or do I really have to buy Windows 8 and install that to my internal SSD, and then use the migration assistant to copy over my projects?
Thanks!

Posted? Or found in new builds?
BCA should really just pull whatever the latest drivers are when run.
If it is about how it partitions and sets up the nMP and its partition for Windows, that is another matter and makes sense.
Some people want UEFI native booting in Windows, and my experience with that on PCs has been that it boots faster and runs well, but it wants different partitions. For one thing, there is now a backup "system reserved" partition, just as Apple GUID has some volume information blocks and backup areas that were once optional (and if a format erase was not able to, it would not create one) but are now mandatory.
Windows 8.1 is required; one reason a backup is a must is that the installer might overwrite and use another OS's partition table entries, especially when doing a UEFI install.
There was something about which Linux OS was safe and how they would each add entries in the table, but one would not play nice. That one had to be done first or not at all.
"Save image" button of Camera Raw does not save JPEG with modifications made.
I am using Bridge CC to process my RAW images and make modifications to some JPEG images as well. I have no need to further alter images in Photoshop at the moment, so I do all modifications in Camera Raw. After altering my images, I then press the "save image" button at the left bottom corner of Camera Raw and choose the settings to save in JPEG. However, when I upload these JPEG images to a website or when I email them, the image is uploaded without the modifications I had done! It seems to me that Camera Raw is saving JPEGs the same way it does for RAW images, in a non-destructive manner, only attaching instructions for the images to be displayed properly on Photoshop. But when I save an image in JPEG, I expect the image to be saved entirely with the modifications I made while still keeping the original RAW file. What goes on then? What is Camera Raw doing and how do I get to save a modified image in JPEG with Camera Raw? Would you please explain?
Many thanks for your help,
Beza

Hi,
What version of camera raw and operating system are you using? -
Can't view LR metadata modifications on Windows Explorer? What gives?
Hi all,
This is my first question to the community, I searched for the answer to this problem here and elsewhere but not able to find out why, hope you can help.
I'm coming to grips with managing data in LR but I've stumbled upon a snag. I'm using LR 5.7 and adding metadata to CR2 and JPG files and wanted to check whether the metadata modifications showed up on Windows Explorer. It appears as if they don't. My workflow is detailed below, could you point out if I'm missing anything?
What I did was fill some IPTC and EXIF fields on LR 5.7 and click on "save metadata to file". The fields I added data to was headline and description (IPTC), user comment (EXIF) and title and caption fields (Lightroom's own fields I presume).
However the metadata I added does not display in Windows Explorer (upon right click/Properties/Details). I'm using Windows 7 and the MS website says Explorer is able to display IPTC and EXIF fields. This however does not seem to be the case. Did anyone else come across this problem or am I doing something wrong?
To test it further I modified the Title, Subject and Comments fields in Windows Explorer (through the file Properties/Details tab). I don't know whether these fields are an official part of the IPTC or EXIF standards, but they don't display in LR either.

My Windows File Explorer (8.1) does show some of the metadata that LR 5.7 writes, but not all. I no longer have access to Windows 7, but I'm certain that it does too. LR 5.7 will also read metadata written by Windows Explorer.
Some general caveats:
- Be sure to Save Metadata To File in LR, or better, set the option Edit > Catalog Settings > Automatically Write Changes Into XMP.
- When you make changes to metadata outside of LR, be sure to Read Metadata From File to get it back into LR.
- When you make changes with the Explorer Properties window, make sure you click OK or Apply.
- Use the free Exiftool if you want to examine metadata authoritatively. Virtually every other program, including LR, Windows Explorer, and OS X Finder, has significant problems and limitations with metadata.
I can't explain all that you're seeing. If you can't figure it out using Exiftool, then I suggest taking a sample pic you've written with LR, upload it to Dropbox or similar, and post the link here; we can take a look.
The LR metadata fields you mentioned map to these industry-standard fields:
Title > XMP:Title, IPTC:ObjectName
Caption, Description > EXIF:ImageDescription, XMP:Description, IPTC:Caption-Abstract
Headline > XMP:Headline, IPTC:Headline
User Comment > EXIF:UserComment
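The mappings above can be captured as a small lookup table for scripting (a sketch only; the field names are taken verbatim from the list above):

```python
# Lightroom field -> industry-standard metadata tags, per the list above.
LR_FIELD_MAP = {
    "Title": ["XMP:Title", "IPTC:ObjectName"],
    "Caption": ["EXIF:ImageDescription", "XMP:Description", "IPTC:Caption-Abstract"],
    "Headline": ["XMP:Headline", "IPTC:Headline"],
    "User Comment": ["EXIF:UserComment"],
}

def standard_tags(lr_field):
    """Return the standard tags a given Lightroom field maps to ([] if unknown)."""
    return LR_FIELD_MAP.get(lr_field, [])

print(standard_tags("Headline"))   # ['XMP:Headline', 'IPTC:Headline']
```

A table like this is handy when cross-checking Exiftool output against what you typed into LR, since each LR field can fan out to several underlying tags.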
Windows File Explorer doesn't show Headline, and it has somewhat different mappings for Title. You can experiment for yourself with Exiftool. -
View modification in ODS after transport to production
Hi,
It's possible to see all modification in ODS after transport?
I would like to see the modification that was done in Navigation Attributes of ODS.
Thank's
Cesar G. Batista

Hi,
When you double-click on the ODS you will see an info button (i). Click on it and a popup shows the save and activate history, where you can see the last log.
Hope this helps!!
Thanks
Santosh RC -
Database Table where modifications to message long text are stored (log)
Hi,
As per manual correction mentioned in SAP note 1144291,
we have changed the long text for message XC092 by modification of the long text.
This note 1144291 is a pre-requisite for SAP Note 1310808.
After the notes were implemented, it was noticed that message CURTO1 055 has incorrect information in its long text. This is because the variables in the long text are not correctly defined.
Both the messages XC-092 and CURTO1-055 are SAP standard.
The error for CURTO1-055 can be rectified by modification of the long text and maintenance of the correct variables.
However, my question is:
Where do we check the log of modifications made to a message long text?
I have found the logs relevant to my modification in table DOKHL and DOKIL.
But in which table do we get all of the foll. data:
- Message class
- Message number
- Modification name
- Modification created by
- Modification done on
- Last changed by
- Last changed on
Kindly help. A <removed by moderator> solution would be really helpful.
Best Regards,
Smruthi
Edited by: Thomas Zloch on May 5, 2011 12:07 PM - urgency reduced

Hi Smruthi,
The modification changes would be in the SMODILOG table. Please note this is a core basis table and should not be changed, however to best find your changes search under the TRKORR field with the relevant tp request.
Best regards,
Derrick Hurley
Development Workbench