Run /VIRSA/ZVFATBAK to generate missed logs

Hello experts, we had an upgrade a couple of weeks ago and we had to put the /VIRSA/ZVFATBAK job on hold. Because of this we have a couple of Firefighter logons with missing data: we get the "BACKGROUND JOB WAS NOT SCHEDULED/LOG & FILE NOT YET GENERATED" error. Usually I can run the job immediately after the upgrade and gather the missed data, but this time we forgot, and now we are trying to go back three weeks.
I thought that since I can still see the data in STAD (ST03), I should be able to get /VIRSA/ZVFATBAK to run and find the data, but that is not the case this time. Can anyone confirm it's too late to generate these missed logs?
Thanks
Dave Wood

Hi David,
If you see that logs are missing for a certain date and time, then you can run the report /VIRSA/ZVFATBAK with the date and time for which logs were missing; see SAP Note 1142312 (read it to understand what start and read times should be given to run the report).
Please be informed that the report will fetch data from STAT and from CDHDR/CDPOS only if the data still exists there for the given date and time. If the data has been purged from STAT for that period, it is not possible to get the logs.
Also note this point: as per design, a manual FF log update considers only the Firefighter IDs assigned to the user who is triggering the update through the Firefighter dashboard, whereas the scheduled background job considers all FF IDs for a particular time period.
Best Regards,
Sirish Gullapalli.

Similar Messages

  • Log storage time of /VIRSA/ZVFATBAK

    Hi Guys,
    Can anyone let me know for how many days Firefighter stores the logs generated by the background job /VIRSA/ZVFATBAK?
    I have given the user a Firefighter ID from June 12th to June 19th. Can I generate a Firefighter log on June 28th?
    Kindly respond.

    Hello Rahul,
      The data retrieved by the /VIRSA/ZVFATBAK program is stored in the Firefighter tables. If the program was able to retrieve the data for that particular date successfully from the SAP system, then it will be available in the logs for review for as long as it remains in the Firefighter tables.
    Regards, Varun

  • ABAP report /VIRSA/ZVFATBAK run very long on backend

    Hello experts,
    For the ABAP report /VIRSA/ZVFATBAK, which runs in the backend system, how long would it normally take to finish? It has already been running for more than 2,000 seconds in our test system and is still not done, while the same report scheduled in our development system finishes in one or two seconds.
    Any idea why it is taking that long in the test system?
    Appreciate the replies, thank you in advance.

    Thank you for the note, Sabita! It's really helpful.
    One question regarding the SAP Note: it mentions a STAT collector job. Is this a standard job?
    If yes, below are the standard collector jobs scheduled in the system; which one would it be?
    SAP_COLLECTOR_FOR_JOBSTATISTIC
    SAP_COLLECTOR_FOR_NONE_R3_STAT
    SAP_COLLECTOR_FOR_PERFMONITOR

  • Activity log for bug 278860: confusing "profile in use"/"already running" error when profile is missing (not found). Did anyone find a fix for this? Thanks, Edie

    bug 278860: confusing "profile in use"/"already running" error when profile is missing (not found)
    Apparently the profile .default is missing. Therefore the bug 278860 is activated and I can't start Firefox. Is there a solution?

    I did some googling and found a lot of topics from you on this issue; to me it simply seems like one of the connectors is broken or loose.

  • /virsa/ZVFATBAK background job

    Hello,
    SAP 4.6c.
    /VIRSA/ZVFATBAK currently runs in the system at specific periods.
    Should this job be running on each application server of the system?
    Or is a one-time schedule in SM36 for the entire system enough?
    Kindly advise.
    Thanks.
    Prasanna

    Hello,
    Thanks for the reply.
    Recently, the frequency of the job was switched from hourly to once every 4 hours.
    The Firefighter logs are not getting generated.
    I don't see any other changes that have been made.
    Is SAP Note 1039144 the solution?
    thanks.
    Prasanna

  • How to generate rule log on client side

    I am looking for a solution where I can generate a rule log and make it available on the client side. I know how to generate a log during consolidation and save it on the server; however, I want to make it available on the client side.
    I have created the following scenarios in the HFM application:
    1. ACTUAL_REPORTING - to capture TB data and use it for reporting
    2. ACTUAL_ICP - to capture ICP data and run custom business rules as per requirement
    3. ACTUAL_ADDINPUT - to capture additional memorandum inputs
    During consolidation, data is copied from ACTUAL_ICP and ACTUAL_ADDINPUT to ACTUAL_REPORTING. The problem is that users should consolidate the ACTUAL_ICP and ACTUAL_ADDINPUT scenarios first; if they miss executing the consolidation in these scenarios, the copied data will be wrong. To guard against this, HFM provides a way to check the calculation status in the rule file; based on that, the calculation can be stopped and a message sent to the user. The dilemma is how to report this on the user side.
    Possible solutions I have thought of are:
    1. Generate a rule log and post it to the client - but how? The path of the rule log can only be on the server (that is C:\Hyperion\Logs\Rule).
    2. Generate a rule log and pop up a web page terminating the consolidation - but how?
    3. Raise an error from the rule file and direct it to the system messages - but how?
    Please advise; if any further clarification is required, please contact me.
    Thanks & Regards,
    Mustafa

    You can define the output folder as a share, as has already been mentioned, but I strongly advise against implementing any solution that regularly generates an external file in production.
    1) The output file is generated by the DCOM user. If you want this user to generate a file in a specific location, the location must be writeable by the DCOM user and readable by the intended human user; make sure both the NTFS and the share-level permissions on the target file and its containing folder allow this. Latency for the file write can degrade performance.
    2) Most HFM implementations have two or more HFM application servers. You have no control over which server will execute the rule, and no control over what happens if two or more servers try to write to the same file. If Sub Calculate for any entity on any server has to open a single shared file every time it runs, you effectively make HFM single-threaded, because the file object is single-threaded. It could be multi-threaded if you wrote out to a database instead, but that is certainly far more complex.
    3) Wait state: as in #2, this can single-thread the process, and you have to consider whether your process will error out while another process has the file open, or whether you will ignore that and simply proceed. Waiting until the file becomes available can have a significant impact on performance.
    While I have done implementations where the overall solution required combining data across multiple scenarios, I consistently find this a cumbersome, error-prone, and poorly performing approach. For this reason, I always try to keep all of the required data in a single scenario. This is one key reason why the "DataSource" or "DataType" (or whatever you call it) approach is so popular and successful.
    Finally, you cannot write to the HFM system messages. You can, on the other hand, use Calc Manager to write out to a system log.
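    To make the wait-or-fail tradeoff in point 3 concrete, here is a minimal sketch of exclusive file locking, written in Python purely for illustration (HFM is not involved; this assumes POSIX flock semantics, so Linux/Unix only, and the function name is made up):

```python
import fcntl

def try_exclusive_append(path, line, wait=False):
    """Append one line under an exclusive flock(); with wait=False,
    give up immediately when another writer already holds the lock."""
    f = open(path, "a")
    try:
        flags = fcntl.LOCK_EX if wait else fcntl.LOCK_EX | fcntl.LOCK_NB
        try:
            fcntl.flock(f, flags)
        except BlockingIOError:
            return False   # busy: the "error out" choice from point 3
        f.write(line + "\n")
        return True
    finally:
        f.close()          # closing the file releases the lock
```

    With wait=True the call blocks until the lock is free, which is exactly the single-threading effect described above: every writer queues up behind one file.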
    --Chris

  • Refresh Materialized Views without generating Archive Logs

    Hello Gurus,
    I'm facing an embarrassing situation.
    I have a job every night that executes a dbms_refresh on several materialized views. Two of the MViews take almost 30 minutes each, but the bigger issue is that they generate almost 10 GB of archive logs, and my file system becomes full in a minute.
    I've done an ALTER MATERIALIZED VIEW ... NOLOGGING, but it doesn't change anything.
    Does someone have an idea how to do a refresh without generating archive logs, or without logging?
    Thanks in advance
    C.

    Since a non-atomic complete refresh is done as a TRUNCATE and INSERT, any user / application querying the MV while the refresh is running will see ZERO rows. Furthermore, if the INSERT fails (e.g. insufficient space to add an extent), the MV remains with ZERO rows.
    A DELETE and INSERT (an atomic refresh) avoids such situations: even if the refresh fails, both the INSERT and the DELETE are rolled back and the MV reverts to the state it was in before the refresh began, thus still showing the data from the previous refresh.
    Hemant K Chitale
    http://hemantoracledba.blogspot.com
    Just to clarify : My paragraphs above are not about data "missing" but about precautions to be taken. A COMPLETE Refresh is done during defined outages -- users/applications are made aware that data is not available while the Refresh is running.
    Edited by: Hemant K Chitale on Sep 1, 2009 10:53 AM

  • /VIRSA/ZVFATBAK stopped after upgrade

    Hi,
    We are currently running the GRC compliance report to view the log of Firefighter users and the transactions they have been using. This report is sent to the FF owners every month.
    We upgraded our system from 4.7c to ECC 6.0 in early May, and since then the /VIRSA/ZVFATBAK job has not been running and collecting the log data.
    /VIRSA/ZVFATBAK is now set up correctly again, but we were hoping to run the job so it would collect the data from the previous month, up to the 9th of June.
    I have been trying various techniques and selection parameters but have been unsuccessful.
    Can anyone tell me a way of running the /VIRSA/ZVFATBAK job so that when I run the report next week it will have all the data back to the 9th of June?
    Any help is much appreciated,
    Colm

    Hi Michael,
    Thanks for the reply. I will talk to the Basis team about this, but I should probably mention that on Thursday evening I was able to run the job for the 5th and 6th of July successfully. I then had to go to a meeting, and when I came in on Friday morning the job was no longer working.
    I wonder if it is something to do with my selection screen?
    If I want to gather all the data for the 2nd of July, I run the job with a start date of 02.07.2010, a start time of 23:59:59 and a read time of 23:59:58.
    As I said, this worked for the 5th and the 6th, but it no longer works for me, and there were FF entries on the 2nd.
    Many Thanks,
    Colm

  • LabVIEW 2011SP1 Error building installer: LabVIEW Run-Time Engine 2013 is missing 3 dependencies???

    I'm having a problem building an installer in LabVIEW 2011SP1.
    It's been a while since I've tried to build an installer, but it used to work fine, and building executables still works fine.
    Now when I try to build the installer I get "The build was unsuccessful."
    Possible reasons: Error generating preview for My Application 3.1.7.
    Details:
    Visit the Request Support page at ni.com/ask to learn more about resolving this problem. Use the following information as a reference:
    CDK_Build_Invoke.vi.ProxyCaller >> CDK_Build_Invoke.vi >> CDK_Engine_Main.vi >> IB_MSI.lvclass:Build.vi >> IB_MSI.lvclass:Engine_InitializeDistribution.vi >> IB_MSI.lvclass:Report_Preview_Error.vi >> IB_Source_Container.lvclass:Report_Preview_Error.vi
    Loading product deployment information
    *** WARNING ***
    NI LabVIEW Run-Time Engine 2013 is missing 3 dependencies. This product, or other products that depend upon NI LabVIEW Run-Time Engine 2013, may not function properly while the dependencies are missing.  Visit ni.com/info and enter the Info Code "" for more information.
    *** Error: An internal error occurred for which the cause is unknown. (Error code -41)
    *** Error Details:
    Error in MDF API function: _MDFCommon_GetNextLogMessage
    Error in MDF::GetInstance - MDF static instance is not initialized!
    *** End Error Report
    Loading product deployment information
    *** WARNING ***
    NI LabVIEW Run-Time Engine 2013 is missing 3 dependencies. This product, or other products that depend upon NI LabVIEW Run-Time Engine 2013, may not function properly while the dependencies are missing.  Visit ni.com/info and enter the Info Code "" for more information.
    The really strange thing about this is that I'm using LabVIEW 2011SP1; I don't even have LabVIEW 2013 installed, not even the runtime.
    Where is the problem? Why is it even complaining about LabVIEW 2013? Has anyone seen this before?
    Troy
    CLD
    Each snowflake in an avalanche pleads not guilty. - Stanislaw J. Lec
    I haven't failed, I've found 10,000 ways that don't work - Thomas Edison
    Beware of the man who won't be bothered with details. - William Feather
    The greatest of faults is to be conscious of none. - Thomas Carlyle

    I do have the LabVIEW 2013 discs but didn't install it.
    I checked in Control Panel > Programs and Features > National Instruments Software, and the LabVIEW 2013 Run-Time is not listed there.
    I also checked in MAX > Software, and I must have missed it the first time I looked: the LabVIEW Run-Time 2013 is listed there after all.
    Now I remember installing VISA 5.4 to get rid of a nasty bug. It must have installed the 2013 Run-Time when I did that.
    So now I need to distribute two runtimes with my application?!
    Troy
    CLD
    Each snowflake in an avalanche pleads not guilty. - Stanislaw J. Lec
    I haven't failed, I've found 10,000 ways that don't work - Thomas Edison
    Beware of the man who won't be bothered with details. - William Feather
    The greatest of faults is to be conscious of none. - Thomas Carlyle

  • BCD_OVERFLOW CX_SY_CONVERSION_OVERFLOW /VIRSA/ZVFATBAK

    Hi guys,
    I have my FF user executing transaction SE38, and upon execution the background job /VIRSA/ZVFATBAK ended in status "Canceled". In the log, it shows the description below for the background job cancellation.
    Short text
    Overflow during an arithmetic operation (type P) in program "/VIRSA/ZVFATBAK".
    What happened?
    Error in the ABAP Application Program.
    The current ABAP program "/VIRSA/ZVFATBAK" had to be terminated because it has come across a statement that unfortunately cannot be executed. A value is too long for a calculation field.
    Error analysis
    An exception occurred that is explained in detail below.
    The exception, which is assigned to class 'CX_SY_CONVERSION_OVERFLOW', was not caught in
    procedure "GET_SE38_CHANGES" "(FORM)", nor was it propagated by a RAISING clause.
    Since the caller of the procedure could not have anticipated that the exception would occur, the current program is terminated. The reason for the exception is:
    In the current arithmetic operation with operands of type P an overflow has been detected. Possible causes are:
    1. The result field of type P is too small to store the result.
    2. The result or an intermediate result has more than 31 decimal places.
    How to correct the error
    Maybe the result field - if still possible - must be defined larger.
    Maybe the current process can be divided into separate units in a way that only smaller values occur.
    Has anyone had this experience before, and did you resolve it via the proposed way to correct the error?
    If you have successfully executed the corrections, can you please share the steps to be taken?
    Thanks.
    Raymond

    A calculation causes an exception because the result variable is too short to store the result.
    You can do two things to solve the problem:
    - add an error-handling mechanism (TRY ... CATCH) to catch the exception and act on it (a message or something else)
    - change the result variable so that it can store the resulting value
    regards,
    hans
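    hans's two options can be illustrated outside ABAP as well; the following Python sketch (the precision limits and function name are made up) shows the catch-the-exception variant, using a fixed-precision decimal context to stand in for the type P result field:

```python
from decimal import Decimal, Context, Overflow

# A fixed-precision context stands in for ABAP's packed (type P) field:
# results that no longer fit raise Overflow, much like the untrapped
# CX_SY_CONVERSION_OVERFLOW that cancelled the job.
ctx = Context(prec=5, Emax=4)   # at most 5 digits, values below 10**5

def safe_multiply(a, b):
    """Return a*b, or None when the result no longer fits the field."""
    try:
        return ctx.multiply(Decimal(a), Decimal(b))
    except Overflow:
        return None   # caught and handled, instead of terminating
```

    The other fix, enlarging the result field, corresponds to raising prec/Emax in the context (or declaring a larger type P field in the ABAP report).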

  • Generating encrypted logs in a untrusted machine

    Hi guys,
    I would like to get some ideas from you guys on how to solve a problem.
    Basically, we have an application that is installed on an untrusted computer (i.e. the person who uses this computer cannot be trusted).
    Unfortunately, the person who installs the application is untrusted as well.
    The application allows the user to perform some operations in the field (generating a log of those operations), and when the user returns to the headquarters those logs are read, so we can check whether the "field user" did any malicious action. The problem is that this log MUST be encrypted; otherwise he can edit the log file and cheat the system.
    Do you guys have any idea how to solve this?
    PS: I was thinking of shipping a public key to be used during the product installation (by using keytool) and keeping a private key on the server to read the logs; however, I don't know if that is the best solution.
    Thanks and Regards

    Those cmdlets use DPAPI (http://msdn.microsoft.com/en-us/library/ms995355.aspx). DPAPI does generate new Master Keys from time to time, but they're all stored in the user's profile, and you should not lose the ability to decrypt old data. A few questions:
    Is this script running as a domain account, or as a local user?
    Does anyone mess with the account's user profile, deleting all or parts of it and allowing them to be recreated?
    Your DPAPI master keys are stored in <Profile Dir>\AppData\Roaming\Microsoft\Protect\<Account SID>\ .  They're marked as Hidden and System, so you'll need to make sure you're viewing those files to be able to see them.
    The master key files in that directory are encrypted using a hash of the user's password; if anyone resets the password, you lose access to those keys (unless you're either in Active Directory, which has a backup mechanism in place on the domain controllers,
    or you have made a Password Recovery Disk for a local account.)  You mentioned that the account's password hasn't changed, so that's probably not an issue here, but I figured I'd mention it so you know how DPAPI works.
    If the account's password has not changed at all, my best guess is that someone has managed to delete the old master key(s) from the user profile, one of which was used to encrypt the SecureString originally.
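    On the original logging question, the poster's public-key idea is the right direction for confidentiality. As a stdlib-only illustration of the related tamper-evidence idea, here is a minimal Python sketch of a hash-chained log (function names are made up). Note this alone is not enough in this threat model: a field user who knows the scheme can rebuild the whole chain, so the final digest, or each record, still has to be signed or encrypted with a key the field machine cannot abuse, which is where the keytool-style asymmetric keys come in.

```python
import hashlib

GENESIS = b"\x00" * 32   # fixed seed digest for the first entry

def append_entry(chain, message):
    """Append message; each entry's digest also covers the previous
    digest, so editing or deleting any earlier entry breaks the
    verification of every later one."""
    prev = bytes.fromhex(chain[-1].rsplit(" ", 1)[1]) if chain else GENESIS
    digest = hashlib.sha256(prev + message.encode()).hexdigest()
    chain.append(f"{message} {digest}")

def verify(chain):
    """Recompute the chain; return False at the first broken link."""
    prev = GENESIS
    for entry in chain:
        message, digest = entry.rsplit(" ", 1)
        if hashlib.sha256(prev + message.encode()).hexdigest() != digest:
            return False
        prev = bytes.fromhex(digest)
    return True
```

    The headquarters side would run verify() on the returned file and additionally check the protected final digest.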

  • Fire Fighter job /VIRSA/ZVFATBAK is finishing in zero sec

    Hello Everyone,
    In my Solution Manager production system, the FF job /VIRSA/ZVFATBAK is finishing in zero seconds.
    Log:
    Date       Time     Message text                                                                        Message class Message no. Message type
    01.04.2014 02:30:33 Job started                                                                              00           516          S
    01.04.2014 02:30:33 Step 001 started (program /VIRSA/ZVFATBAK, variant &0000000000002, user ID DE15386)      00           550          S
    01.04.2014 02:30:33 Job finished                                                                             00           517          S
    Please suggest what the possible reason could be.
    I am not able to get FF logs. Is this job the reason for the missing logs?
    Thanks,
    Ashish Gaurav

    Hi Tang,
    Did you check the configuration settings for both FF IDs?
    Also, as a trial-and-error step to isolate the issue, can you check using only the 2nd FF ID, for which the log was not sent, and ensure that the 1st FF ID is not used? This way you can identify whether the issue is with the FF ID or with the configuration.
    Regards,
    Raghu

  • Generating rman log file

    hi
    How can I generate an RMAN log file? Oracle XE on Windows.
    My rman backup script is like:
    run{
    backup device type disk tag '%TAG' database;
    SQL 'ALTER SYSTEM ARCHIVE LOG CURRENT';
    backup device type disk tag '%TAG' archivelog all not backed up delete all input;
    delete noprompt obsolete device type disk;
    }

    In addition to what Paul mentioned, you can also consider the LOG option on the RMAN command line, for example: rman target / cmdfile=backup.rcv log=backup.log (the file names here are just placeholders).

  • Error for Generating a log file

    Hi Cezar sanos,
    I am trying to generate a log file for ODI with details like who logged in and what they are doing.
    For this I am executing a command like
    lagentscheduler.bat "-PORT=20910" "-NAME=localagent" "-V=2" > C:\OraHome_1\logs\agent1.log
    But it fails with the following output:
    A JDK is required to execute Web Services with OracleDI. You are currently using a JRE.
    OracleDI: Starting Scheduler Agent ...
    Starting Oracle Data Integrator Agent...
    Version : 10.1.3.5 - 10/11/2008
    DwgJv.main: Exit. Return code:-1

    Just in case: the following message
    A JDK is required to execute Web Services with OracleDI. You are currently using a JRE.
    is only a warning, not an error message...

  • Generating a log file from oracle db

    I want to generate a log file into which I dump some useful messages whenever anyone performs a DML operation on a table. I also want a switch like YES or NO (maybe an environment variable): if I set it to YES the log file should be generated; if NO, no log file is generated.
    Can anyone help with how to do this task?
    Thanks a lot in advance.
    srini

    You can use a trigger and UTL_FILE to write to a file (on the server).
    Example:
    create or replace trigger test_file
      after insert or delete or update on test_case
      for each row
    declare
      v_logfile utl_file.file_type;
    begin
      -- 'LOG_DIR' must name an Oracle DIRECTORY object the database can
      -- write to (on very old releases, a path listed in utl_file_dir).
      v_logfile := utl_file.fopen('LOG_DIR', 'test_file.log', 'a');
      if inserting then
        utl_file.put_line(v_logfile, 'Inserting into table');
      elsif deleting then
        utl_file.put_line(v_logfile, 'Deleting from table');
      else
        utl_file.put_line(v_logfile, 'Updating table');
      end if;
      utl_file.fclose(v_logfile);
    end test_file;
    The YES/NO switch you mention could be checked at the top of the trigger (for example, a row in a small parameter table) so that the file is only written when logging is enabled.
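    Outside the database, the same enable/disable switch can be sketched with an environment variable; a purely illustrative Python sketch (the DML_LOG variable and function name are made up):

```python
import os

def maybe_log(path, message):
    """Append message to the log file at path only when the DML_LOG
    environment variable is set to YES; otherwise write nothing."""
    if os.environ.get("DML_LOG", "NO").upper() != "YES":
        return False          # logging switched off
    with open(path, "a") as f:
        f.write(message + "\n")
    return True
```

    Inside Oracle, the equivalent of the environment variable would be a package variable or a parameter table the trigger reads before opening the file.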
