HCM Best Practice LOAD - Error in Copy Report variants Step

Dear Friends,
We are trying to load the HCM Best Practice data for testing purposes. We applied the HCM Best Practice Add-On, and it went through successfully. But when we try to execute the Preparation step (in the Copy Variant phase) using transaction /HRBPUS1/C004_K01_01, we get the following error:
<b>E00555 Make an entry in all required fields</b>
Could you please suggest a solution?
Thanks and regards,
Abhilasha

Hi Sunny and others,
The main error here was that sapinst couldn't find and read cntrlW01.dbf, because this file was not at that location (/oracle/W01/112_64/dbs).
I have already solved this issue. What I did was:
1) As the ora user, I went into sqlplus as sysdba and ran the following script (the control.sql script that was generated at the beginning of the system copy process):
SQL> @/oracle/CONTROL.SQL
Connected.
ORA-32004: obsolete or deprecated parameter(s) specified for RDBMS instance
ORA-01081: cannot start already-running ORACLE - shut it down first
Control file created.
Database altered.
Tablespace altered.
2) This step is very important: you need to know where the .CTL file created by the script ended up, so I checked the value of the control_files parameter at that moment:
SQL> show parameter control_files;
/oracle/W01/oraflash/W01/controlfile/o1_mf_6pqvl4jt_.ctl
3) Next, still logged in as ora, I copied this file to the path where sapinst expects to read it:
cp /oracle/W01/oraflash/W01/controlfile/o1_mf_6pqvl4jt_.ctl /oracle/W01/112_64/dbs/cntrlW01.dbf
4) I restarted sapinst from the point where it had stopped.
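For anyone who wants to script steps 2) and 3), here is a rough sketch in Python. It is a hypothetical helper, not part of sapinst or the system copy tools: it simply picks the newest .ctl file in the controlfile directory and copies it to the cntrl<SID>.dbf name that sapinst looks for. The function name and the throwaway demo directories are my own.

```python
import shutil
import tempfile
from pathlib import Path

def copy_latest_controlfile(controlfile_dir, oracle_home_dbs, sid):
    """Copy the most recently modified .ctl file to <dbs>/cntrl<SID>.dbf."""
    candidates = sorted(Path(controlfile_dir).glob("*.ctl"),
                        key=lambda p: p.stat().st_mtime)
    if not candidates:
        raise FileNotFoundError(f"no .ctl file found in {controlfile_dir}")
    target = Path(oracle_home_dbs) / f"cntrl{sid}.dbf"
    shutil.copy2(candidates[-1], target)
    return target

# Smoke test against a throwaway directory layout (no real database needed).
tmp = Path(tempfile.mkdtemp())
(tmp / "flash").mkdir()
(tmp / "dbs").mkdir()
(tmp / "flash" / "o1_mf_6pqvl4jt_.ctl").write_bytes(b"dummy control file")
result = copy_latest_controlfile(tmp / "flash", tmp / "dbs", "W01")
print(result)
```

On a real system you would point controlfile_dir at the directory reported by "show parameter control_files" and oracle_home_dbs at $ORACLE_HOME/dbs.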
Thank you for your help.
Kind regards,
João Dimas - Portugal

Similar Messages

  • Is there a best practice for deleting a published report?

    Post Author: matthewh
    CA Forum: General
    Is there a best practice for deleting a published report from C:\Program Files\Business Objects\BusinessObjects Enterprise 11.5\FileStore on Crystal Reports Server or can I just delete the subfolders?  Does it reference them elsewhere?  I have a load of old reports I need to shed, but can't see how to do it.

    Hi,
    You can refer to the SRND guide. As per the document (page 292), you can configure a maximum of 50 agents per team.
    http://www.cisco.com/en/US/docs/voice_ip_comm/cust_contact/contact_center/ipcc_enterprise/ippcenterprise9_0_1/design/guide/UCCE_BK_S06086EC_00_srnd-9-0-1.pdf
    Also, you can check the Bill of Materials document for UCCE, under the section "Operating Conditions, Unified ICM, Unified CC", for the numbers you should configure in UCCE.

  • Best practice to share user defined reports?

    What is the best practice to share user defined reports among team members?
    Currently I can create a folder under the Reports/User Defined Reports system folder, and any reports or subfolders under it can be exported to XML by right-clicking the folder and selecting 'Save As'. Then I send the file over, put it on a share, or check it into SVN, and the other team members can right-click the User Defined Reports system folder, choose "Open report...", and select the XML file.
    Is there any more elegant way?
    Is it a good idea to share the C:\Users\lgal\AppData\Roaming\SQL Developer\UserReports.xml on Windows, or ~/.sqldeveloper/UserReports.xml (on Linux) among users?
    Is there a standard way to bring this under standard SVN control in SD?
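    If you go the SVN route, the merge step itself can be scripted. Below is a rough Python sketch; note that the XML layout it assumes (report entries as direct children of the UserReports.xml root, identified by a name attribute) is purely illustrative, so inspect your actual UserReports.xml and adjust before relying on it.

```python
import xml.etree.ElementTree as ET

def merge_reports(mine_xml, theirs_xml):
    """Append any report entries from a teammate's export that we don't have yet."""
    mine = ET.fromstring(mine_xml)
    theirs = ET.fromstring(theirs_xml)
    existing = {r.get("name") for r in mine}
    for report in theirs:
        if report.get("name") not in existing:
            mine.append(report)
    return mine

# Tiny demo with made-up report names.
merged = merge_reports(
    '<reports><report name="Sessions"/></reports>',
    '<reports><report name="Sessions"/><report name="Locks"/></reports>',
)
merged_names = [r.get("name") for r in merged]
print(merged_names)
```

    The merged tree can then be written back with ET.tostring() and committed, which avoids hand-editing the shared file.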

    I think the best thing would be if someone set up a config sharing site!
    It would be very easy to do on a system like Drupal or e107. People could register and submit their configs to different download sections based on what sort of configs they are. My site runs e107, so it's not hard to see how easy it would be.
    It would be easy to set up - I could do it in about 30 mins, but I think I have done more than enough of that! There seems no need to officialize it to any degree and burden the server even more.
    I guess user-cb could host it, couldn't it? If you restrict people to posting only small config files with no icons, and link to big stuff offsite, it should be tiny bandwidth usage.
    That's what I would do anyway

  • Link for HCM best practices India/Asiapacific/Europe

    Hi Experts,
    Can anybody provide me a link to the HCM best practices for India/Asia-Pacific/Europe?
    Thanks in advance
    Rajeev Chhabra

    Hi Rajeev
    You can try the following links
    https://websmp202.sap-ag.de/bestpractices
    http://help.sap.com/bp/initial/index.htm
    You can order the SAP Best Practices Baseline Package (India) documentation CD from the SAP Marketplace (SAP Knowledge Shop) and the complete CD set (documentation CD and CD with configuration) from the SAP Software Catalog. SAP employees may order the documentation CD via the local B2B system.
    Hope it helps.
    Regards
    Lincoln

  • BEST PRACTICE FOR THE REPLACEMENT OF REPORTS CLUSTER

    Hi,
    I've read the note reports_guide_to_changed_functionality on OTN.
    On page 5 it is stated that Reports clustering is deprecated.
    Snippet:
    Oracle Application Server High Availability provides the industry’s most
    reliable, resilient, and fault-tolerant application server platform. Oracle
    Reports’ integration with OracleAS High Availability makes sure that your
    enterprise-reporting environment is extremely reliable and fault-tolerant.
    Since using OracleAS High Availability provides a centralized clustering
    mechanism and several cutting-edge features, Oracle Reports clustering is now
    deprecated.
    Can anyone please tell me what the best practice is to replace Reports clustering?
    It's really annoying that the clustering technology changes in every version of Reports!
    martin

    hello,
    In reality, Reports Server "clusters" were more of a load-balancing solution than true clustering (no shared queue or cache). Since it is desirable to have one load-balancing/HA approach for the whole application server, Reports Server clustering is deprecated in 10gR2.
    We understand that this frequent change can cause some level of frustration, but it is our strong belief that unifying the HA "attack plan" for all of the app server components will ultimately benefit customers by simplifying their topologies.
    The current best practice is to deploy LBRs (load-balancing routers) with sticky-routing capabilities to distribute requests across middle-tier nodes in an app-server cluster.
    Several customers in high-end environments have already used this kind of configuration to ensure optimal HA for their systems.
    thanks,
    philipp

  • What is best practice in FR? Original report access for users, or snapshot?

    Hi,
    Can anyone please let me know what the best practice in FR is? Should I give my users access to the original reports, or snapshot access only? Users are not happy with snapshot access, mainly during closing time. What are the complications if I give access to the original reports? I'm using the batch scheduler, but users are sometimes unable to see the data. What could be the reason for this?
    Thanks,
    PVR

    Hi,
    There are certainly many variables to look at; server size is one concern, report size another. Giving access to original reports is fine as long as concurrency and heavy usage don't take the servers down. That said, reports whose originals are opened up to users shouldn't run for 10 minutes per query: having 10 users connected to such reports will probably take Reporting Services down, which makes the entire system useless. These types of reports are better scheduled and shown as snapshots. However, some of these reports might contain time-dependent information which needs to be refreshed at query time. In that case you could either give access to the original reports or schedule the reports to run, say, every 2 hours.
    Cheers,
    Alp

  • Best practice for Error logging and alert emails

    I have SQL Server 2012 SSIS. Excel files are imported with an Excel Source and an OLE DB Destination, and a scheduled job runs the SSIS packages every night.
    I'm looking for advice on best practice for a production environment. The requirements are the following:
    1) If an error occurs in a task, an email is sent to the admin
    2) If an error occurs in a task, it is logged to a flat file or the DB
    Kenny_I

    Are you asking about the difference between using standard logging and event handlers? I prefer the latter, as standard logging will not always capture data in the way we want. So we've developed a framework that adds the necessary functionality inside event handlers and logs the required data, in the required format, to a set of tables that we maintain internally.
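    Outside of SSIS-specific tooling, the two requirements in the question boil down to "append to a log and build an alert mail on error". Here is a minimal, platform-neutral sketch of that pattern; in SSIS itself you would trigger it from an OnError event handler (e.g. via a Script Task), and the file name, package name, and addresses below are made up.

```python
import logging
from email.message import EmailMessage

# Requirement 2: errors go to a flat file.
logging.basicConfig(filename="etl_errors.log",
                    format="%(asctime)s %(levelname)s %(message)s")

def build_error_alert(package, error_text, admin="admin@example.com"):
    """Log the failure, then build (but do not send) the admin alert mail."""
    logging.error("package=%s error=%s", package, error_text)
    msg = EmailMessage()                     # requirement 1: email to admin
    msg["To"] = admin
    msg["Subject"] = f"[ETL FAILURE] {package}"
    msg.set_content(error_text)
    return msg  # hand this to smtplib.SMTP(...).send_message(msg) to send

alert = build_error_alert("NightlyExcelImport.dtsx",
                          "Excel Source: file not found")
print(alert["Subject"])
```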
    Visakh

  • Best practice concerning embedding script in report vs.  controlling from Java

    Hi,
    I'm faced (and probably not the only one) with adding some intelligence to my reports. In a prior post I was curious about displaying/hiding sections based on conditions found in the bean/POJO.
    Is there a best practice concerning embedding logic in the report in the form of formulas, vs. using Java to get or create a field and then creating a formula on the fly? I suspect the answer has something to do with truly dynamic fields, and perhaps a little bit of both Java and script.
    Anyone on staff care to try answering?
    Peter

    Hi,
    log into your SAP ERP system using the SAP GUI and choose in the SAP Menu the following path:
    SAP Menu -> Accounting -> Controlling -> Cost Center Controlling -> Environment -> Set Controlling Area.
    Set the desired controlling area for your user there (DO NOT FORGET TO CLICK ON THE DISKETTE ICON) and try again.
    Regards,
    Stratos

  • HCM Best Practice

    Dear Experts,
    I'm trying to use the data transfer tool delivered with Best Practices for HCM on mySAP ERP 2005. I have installed the required role; however, while trying to run the transactions from the menu attached to this role, this message is displayed:
    Transaction /HRBP/T001_K00_01 does not exist.
    There is nothing on OSS. Could anyone share information in this regard?

    Hello:
    Try   /n/HRBP/T001_K00_01
    It should work; this transaction always needs to be started with the /n prefix.
    Thanks
    Venkata

  • Best practice to work efficiently in Report Server

    In an SSAS cube I have two different roles:
     - Manager (about 10 people)
     - Salesman (about 43 people)
    Managers have full access to 4 reports and salesmen have access to 1 report; in total there are five reports.
    I have the roles and their users in the SSAS cube, and when I deploy the cube I need to set up all 53 users in the report server.
    I have created two folders, named "aa" and "bb", on the index page. Managers have access to "aa" and salesmen have access to "bb", and I have moved the documents into these folders.
    What is the best practice to give the right users the right access to folders and documents efficiently in the report server?

    Hi Sakura,
    In SQL Server Reporting Services, we should verify that sufficient permissions have been granted to the user. At a minimum, we should grant the user a role that supports both the "View reports" task and the "View folders" task, to allow viewing and folder navigation in SQL Server Management Studio. For more information about system-level and item-level permissions, please refer to the following document:
    http://technet.microsoft.com/en-us/library/aa337491(v=sql.105).aspx
    Hope this helps.
    Thanks, 
    Katherine Xiong
    TechNet Community Support

  • Any Best Practices for developing custom ABAP reports for Portal?

    Hello,
    The developers on our project are debating the best way to develop custom reports and make them available on the portal. For the options below, can you give any pros and cons, experiences, or other options?
    - Web-enabled Abap report programs
    - WebDynpro for Abap
    - WebDynpro for Abap using ALV
    - Adobe forms
    Does a "Best Practices" document or blog exist on this topic?
    Thanks,
    Colleen


  • Ehp5 and HCM best practices

    Hello Experts,
    Would any of you advise on upgrading to EhP5? We use Best Practices for HCM, which were applied on EhP3; we later upgraded to EhP4. I'm wondering if there is anything to watch for when we upgrade to EhP5.
    Thank you.

    Dear,
    By upgrading to EhP5, all the functionality of EhP4 will remain the same, but new functionality is provided as well. If you want to use the new functionality, you can activate the corresponding business functions.
    Everything else should remain similar.
    Best Regards,
    Deepak.

  • SOA Best Practices Guide - Error 404

    http://download.oracle.com/technology/tech/soa/soa_best_practices_1013x_drop3.pdf
    The above link to the SOA Best Practices guide is broken...
    Does anyone have this document? Could you please forward it to me at [email protected]?
    Thank you

    All fixed!
    This is the correct link
    http://www.oracle.com/technology/tech/soa/soa-suite-best-practices/soa_best_practices_1013x_drop1.pdf
    From the best practices page (now corrected)
    http://www.oracle.com/technology/tech/soa/soa-suite-best-practices/index.html
    Mine was apparently cached!
    Heidi.

  • Best practice for error notification

    Rather than attach individual "send email" routines (or even a reusable scenario) to the "KO" step for every potential failure, is there some way I can notify an email address (like our production support group) that something blew up, or likewise that everything succeeded?
    What about an ODI routine that polls the ODI logs for signs of errors, and then sends notification?
    How are other shops handling this?
    Thanks,
    -John

    Hi John,
    Just a tip: to capture errors, a good idea is to use the OdiExportLog tool to log the error and the output of each step.
    Pass the parameters the tool needs: target file/directory, start/end date, agent name.
    Thanks,
    Randy
    Edited by: [email protected] on Apr 1, 2009 9:11 PM
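    The polling idea from the question can also be sketched: scan the files that OdiExportLog has dumped to a directory and collect the steps marked in error. The status="ERROR" marker and the file layout below are assumptions for illustration only, so match them to your actual export format.

```python
import tempfile
from pathlib import Path

def find_failed_steps(export_dir, marker='status="ERROR"'):
    """Return (file, line number, line) for every line carrying the error marker."""
    failures = []
    for log_file in sorted(Path(export_dir).glob("*.xml")):
        for line_no, line in enumerate(log_file.read_text().splitlines(), 1):
            if marker in line:
                failures.append((log_file.name, line_no, line.strip()))
    return failures

# Demo against a fabricated export file.
tmp = Path(tempfile.mkdtemp())
(tmp / "session_1001.xml").write_text(
    '<step name="Load_DWH" status="ERROR"/>\n'
    '<step name="Cleanup" status="DONE"/>\n')
failures = find_failed_steps(tmp)
print(failures)
```

    A cron job could run this periodically and feed any hits into the notification mechanism of your choice.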

  • What is the best practice for running a long report/query against an active database?

    We are using SQL Server 2012 EE but currently do not have the option to run queries on a read-only mirror, though that is my long-term goal. I am concerned I may still run into the issue below in that scenario as well, since the mirror would also be updating the data I am querying.
    I have a view that joins several tables across two databases and is used by an invoicing program on existing data. Three of these tables are also actively updated by ongoing transactions. Running a report that uses this view did not use to be a problem, but now our database is getting larger and we have run into some timeout problems for the live transactions coming in.
    First the report query was timing out, so I set the command timeout to 0 and reran the query, which pegged all 4 CPUs at 100% for 90 minutes before I finally killed it. Strangely, there were no problems with active transactions during that time, so I'm wondering whether the query was really doing anything useful or was somehow spinning and waiting. I reviewed the view and found a field I was joining on that was not indexed, so I created an index on that field and reran the report, which then finished in three minutes with all the CPUs busy but not pegged. The same data was queried both times, so I figured the problem was solved. Of course, later my boss ran a similar invoice report, with the same amount of data, and our live transactions started timing out 100% of the time while his query was running. I did not get a chance to see the CPU usage during that time.
    I looked at the execution plan of the underlying view and added the suggested index, but that did not help. When I run just the view in SQL Server it does not seem to cause any problems and finishes in a couple of seconds, so perhaps something else is going on in the reporting tool that uses the view.
    My main question is: given that I have to use the live, active database, what is the proper way to run a long read-only query/report so that active transactions can still continue to update the tables I am querying? sp_who2 did show transactions being blocked, so I guess a long query accessing the tables blocks live transactions accessing those same tables, but I'm certainly not the only one doing this. I am considering adding "with (nolock)", but am hoping there is a better standard practice, as that hint can return dirty data and I understand why.
    Thanks,
    Dave

    Hello
    You can change the DB isolation level to READ UNCOMMITTED
    http://technet.microsoft.com/en-us/library/ms378149(v=sql.110).aspx
    or use WITH (NOLOCK).
    I use the NOLOCK option for dirty reads, to avoid taking locks on the tables.
    Javier Villegas |
    @javier_vill | http://sql-javier-villegas.blogspot.com/
