Exporting Job details

How can I export all the job definitions along with their start and end times into Excel?
When I click Jobs (Definition) --> Export, it exports the job log, not the job definitions and their details.

For the File > Export function you are limited to what is visible in the current view (use View > Preferences while you are in the tab you want). Job Activity has more columns than Job Definitions and would be a better option for the details you are after.
To get ALL the details you need to run a back-end query against the database (SQL Server or Oracle) and export from there
(ideally, get one of your runtime users granted read access to the Tidal DB for this purpose and just run a job).
The query can be complex depending on what you want, but typically you would join jobmst and jobdtl.
If you want, let me know what you are looking for and I can post the query here.
Tidal does have a Database Model help file that gives you more detail on the database tables and columns.
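A minimal sketch of such a join, assuming jobmst holds the definition and links to jobdtl via a jobdtl_id column (the table names come from the post above; the column names are illustrative, so verify them against the Database Model help file):

-- Illustrative only: verify table and column names in the Tidal Database Model help file.
select m.jobmst_id,                        -- job id (assumed column name)
       m.jobmst_name,                      -- job name (assumed column name)
       d.jobdtl_cmd                        -- command the job runs (assumed column name)
from   jobmst m
       join jobdtl d
         on d.jobdtl_id = m.jobdtl_id      -- assumed join key
order by m.jobmst_name;

From there you can export the result set from your SQL client to Excel.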

Similar Messages

  • Scheduling an Export Job Using Grid Control

    Hi,
    I got a requirement to schedule export jobs in Oracle Grid Control 10.2.0.5 (Windows). Is this possible, and has anyone done this? If so, please share the steps. The idea is to get alerts if the job fails.
    Thanks.
    GB

    Here are the easy steps (there might be slight differences, as I am posting this based on the 11g screens):
    1. On the Grid Control console, click on the relevant database.
    2. Click on the 'Data Movement' tab.
    3. Click on 'Export to Export Files'.
    4. Choose the export type (Database / Schema / Tables / Tablespace).
    5. Provide host credentials (OS username and password) and click 'Continue'. You will get a new screen called 'Export: Options'.
    6. Check the 'Generate Log File' checkbox and select the directory object where you want the dump files created. (You need to create a directory object in the database if you don't have one; see the sketch after these steps.)
    7. Choose the contents option as you require and click 'Next'. You will get a new page called 'Export: Files'.
    8. Select the directory object from the drop-down box, provide a name format for the file name, and click 'Next'.
    9. Provide the job name and description, choose the repeat options (daily, weekly, etc.), and click 'Next'.
    10. You will get a summary screen called 'Export: Schedule'. Review your job details and click 'Submit'.
    This is the solution, and it works well.
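    For step 6, a minimal sketch of creating the directory object mentioned there (the path and the grantee are placeholders for your environment; run it as a DBA):

    -- Placeholder path and grantee; adjust for your environment.
    CREATE OR REPLACE DIRECTORY dpump_dir AS 'D:\oracle\dpump';
    GRANT READ, WRITE ON DIRECTORY dpump_dir TO system;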

  • How to view ALL batch job details (SM37) at one glance ?

    Dear all,
    I am documenting all released batch job details; the information required includes the job name, client number, job frequency, etc.
    In order to see that information, I go to SM37 and click on each job to see its details. I have about 60 released jobs, so to get their details I have to click on them at least 60 times.
    Is there a report or table I can refer to that gives me the information for all the jobs in one screen?
    Thanks.
    Advice and comments will be appreciated.
    Regards,
    Kent

    Dear Prashanth,
    Thanks for the link. I managed to get the required information from the tables TBTCO or TBTCP with the fields selected below:
    JOBNAME = Background job name
    SDLSTRTDT = Planned Start Date for Background Job
    SDLSTRTTM = Planned start time for background Job
    SDLUNAME = Initiator of job/step scheduling
    PRDMINS = Duration period (in minutes) for a batch job
    PRDHOURS = Duration period (in hours) for a batch job
    PRDDAYS = Duration (in days) of DBA action
    PRDWEEKS = Duration period (in weeks) for a batch job
    PRDMONTHS = Duration period (in months) for a batch job
    PERIODIC = Periodic jobs indicator ('X')
    STATUS = State of Background Job, S = Released, F = Finished
    AUTHCKMAN = Background client for authorization check
    EVENTID = Background Processing Event
    EVENTPARM = Background Event Parameters (Such as, Jobname/Jobcount)
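    If you also have direct read access to the database (otherwise SE16 on TBTCO shows the same data), a minimal sketch pulling the fields above could look like this; it is illustrative only and assumes plain SQL access to the SAP schema:

    -- Illustrative sketch; in practice you would normally browse TBTCO via SE16/SE16N.
    select jobname, sdlstrtdt, sdlstrttm, sdluname,
           prdmins, prdhours, prddays, prdweeks, prdmonths,
           periodic, status, authckman, eventid, eventparm
    from   tbtco
    where  status in ('S', 'F');   -- S = released, F = finished (per the list above)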
    Dear Juan,
    Thanks for your reply.
    Regards,
    Kent

  • Where are device export jobs managed?

    I created a scheduled device export job from the CS/Device Management/Device summary page to run daily and create a csv file. This ran fine for several months, but then seemed to stop. I think we had an issue with the jrm process, long since resolved. During that time I created another scheduled export job. I think they are now conflicting with each other (export to the same file name). I was hoping to delete one of them, but am unable to determine where they are stored. Just for a test I created a third job, noted the jobID, but can't find that one either. They don't seem to be listed in the RME job browser. Where are these stored and how do I delete the extraneous jobs?
    Perhaps a related issue: when I go to the System View of CW, there is a panel named Job Information Status. It always contains only the string 'Loading....' in red (along with Log Space Usage and Critical Message Window). Thoughts?

    My guess is you have a lot of jobs on this system, and jrm is not returning fast enough.  I find that Firefox is a bit more tolerant to delays than IE.  If you can, try FF, and see if the job browser loads.  If so, purge some old jobs.

  • Wrong Input Parameter in Job Details

    Hi,
    I start an OWB 10.2.0.1 mapping by calling its main procedure, passing the default parameters and an individual plant-location parameter 'B' as mapping input (used later as a filter), e.g.:
    declare
      p varchar2(4000);
    begin
      fact_map.main(p, 'B', NULL, NULL, NULL, NULL, NULL, NULL);
    end;
    The mapping works fine and the 'B' data is loaded, but in the job details window of the CCM, the Input Parameters tab shows an 'A' (the default plant location) instead of 'B'.
    OK, it's not very important at first, but later, if you want to check something in your loading history, you are lost...
    Any idea, how to fix this?
    jwehner

    Please see the commented lines below:
    BEGIN
      Log(
        '6B6C6D'                       -- this is not a RAW
      , '06-Aug-12'                    -- this is not a TIMESTAMP
      , 'COM.TESt'
      , 'OH'
      , 'AUT'
      , 'NOTRANSACT'
      , '<ACORD><SignonRq>'            -- this is not an XMLType (not even valid XML)
      , '000000E0LN1D000029FNSRRGTest'
      , '000009N1D000029FNJ9OITest'
      );
    END;
    Use the correct datatypes and their constructors (if necessary).
    For example, you can build a RAW from a string with the HEXTORAW() function. An XMLType can be built via the XMLType() constructor or the XMLParse() function, etc.
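    As a minimal sketch of that point (the Log procedure's actual signature is not shown in the thread, so the variables below are only illustrative):

    DECLARE
      l_raw RAW(16)   := HEXTORAW('6B6C6D');                      -- a RAW built from a hex string
      l_ts  TIMESTAMP := TO_TIMESTAMP('06-Aug-12', 'DD-Mon-RR');  -- a real TIMESTAMP, not a string
      l_xml XMLTYPE   := XMLTYPE('<ACORD><SignonRq/></ACORD>');   -- well-formed XML via the constructor
    BEGIN
      NULL;  -- pass l_raw, l_ts and l_xml (instead of string literals) to Log(...)
    END;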

  • Interface Problems: DBA => Data Pump => Export Jobs (Job Name)

    Hello Folks,
    I need your help in troubleshooting an SQL Developer interface problem.
    DBA => Data Pump => Export Jobs (Job Name) => Data Pump Export => Job Scheduler (Step):
    -a- The Job Name and Job Description fields are not visible. Well, the fields are there, but each of them is just 1/2 character wide. I can't see/enter anything in the fields.
    Import Wizard:
    -b- The Job Name field under the wizard's first "Type" step looks exactly the same as in the Export case.
    -c- I can't see any rows under the "Choose Input Files" section (I see just ~1 mm of the first row; everything else is hidden).
    My env:
    -- Version 3.2.20.09, Build MAIN-09.87
    -- Windows 7 (64 bit)
    It could be related to the fact that I changed the fonts in Preferences. As I don't know what the default font is, I can't change it back to the default and test (let me know what the default is and I will test it).
    PS
    -- I have tried disabling all extensions except DBA Navigator (11.2.0.09.87). It didn't help.
    -- There are no messages in the console if I run SQL Developer from cmd: "sqldeveloper\bin\sqldeveloper.exe"
    Any help is appreciated,
    Yury

    Hi Yury,
    a - I see those 1/2-character-wide text boxes (in my case on Frequency) when the pop-up dialog is too small; do they go away when you make it bigger?
    b - On import, the name starts with IMPORT; if it is the half-character issue, have you tried making the dialog bigger?
    c - I think it is size again, but my dialog at its minimum size is already big enough.
    Have you tried a smaller font, or making the dialogs bigger (resizing from the corners)?
    I have a 3.2.1 version where I have not changed the fonts; Tools->Preferences->Code Editor->Fonts appears to be:
    Font Name: DialogInput
    Font size: 12
    Turloch
    -SQLDeveloper Team

  • To know the job details if you know the job name

    Hi,
    Can we find out exactly what a job's purpose is, i.e. why it was created (the basic idea behind creating it), who created it, when it was created, etc., if we know the job name?
    Thanks,
    Ravi

    Hi...
    1. Go to SM37 -- enter the job name as * and the username as *.
    Tick all job statuses (scheduled, finished, cancelled, etc.) and then execute.
    2. Double-click on any job whose details you want to view, then click on Job Details (F7) to see all the details,
    and on Job Log to see how the job was executed, etc.
    3. Place the cursor on your job and click the Step button; on the next screen, in the top menu, choose Goto -> Variant to view the job's variants.

  • Export invoice details of tables

    Hi experts,
    Please tell me which tables hold the export invoice details.
    thanks
    Rahul

    Hi,
    All the invoice details are stored in the VBRK and VBRP tables only.
    The only thing is that for an export invoice the document type is different.
    Ask your functional guy what the billing document type for that export invoice is.
    Check the fields FKART and VBTYP in the VBRK table, choose that type, and get the details from the above tables.
    reward points if useful
    regards,
    ANJI
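    A minimal sketch of that lookup, assuming direct SQL access to the tables (the column names are from memory, so verify them in SE11, and replace the hypothetical billing type 'ZF2X' with the one your functional colleague gives you):

    -- VBELN links the billing header (VBRK) to its items (VBRP); 'ZF2X' is a placeholder billing type.
    select k.vbeln,            -- billing document number
           k.fkart,            -- billing type
           k.fkdat,            -- billing date
           p.posnr,            -- item number
           p.matnr,            -- material
           p.fkimg,            -- billed quantity
           p.netwr             -- item net value
    from   vbrk k
           join vbrp p on p.vbeln = k.vbeln
    where  k.fkart = 'ZF2X';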

  • Problem Exporting Job to a Production Repository

    We're running Data Integrator 11.5.
    I'm exporting jobs from my development repo to a production repo and am seeing some weirdness when starting jobs from the Designer client. No matter which repo I connect to, if I start a job, I see it running in the Monitor tab of both repositories, same name and everything. It's really only running in the repo I start it in, using the correct System Configuration settings, etc., but I can see it in both. I cannot view the log contents from within the non-starting repo, however. But if I cancel it in the non-starting repo, it will cancel the running job. Even if I go back into my dev repo and rename the jobs, when I start the job in the prod repo, it will show the dev job, with its different name, running in the dev repo monitor (even though, in fact, that job is not running). In the prod repo monitor, I see the correct job name.
    I'm wondering if anyone else has had these kinds of issues and whether it's something I can fix, or whether I'm just going to have to live with it until we upgrade to DS 4.0 in a couple of months.

    Andy
    It has everything to do with the Format of the card.
    These cards are formatted FAT16 or FAT32, and these older Windows formats have a limit on the number of objects they can hold in the root directory of the volume. So don't copy the images out of the folder on your desktop; just drag the whole folder to the card...
    Regards
    TD

  • Import/Export Job in management portal.

    I am trying to create an Import/Export job in the portal. I read this document:
    http://azure.microsoft.com/en-us/documentation/articles/storage-import-export-service/
    and I am unable to do this. The document says to download the Import/Export tool, but I am unable to do that. Any help will be greatly appreciated.

    Hi,
    I would request you to check if you meet all these prerequisites:
    • You must have an active Azure subscription.
    • Your subscription must include a storage account with enough available space to store the files you are going to import.
    • You need at least one of the account keys for the storage account.
    • You need a computer (the "copy machine") with Windows 7, Windows Server 2008 R2, or a newer Windows operating system installed.
    • The .NET Framework 4 must be installed on the copy machine.
    • BitLocker must be enabled on the copy machine.
    • You will need one or more empty 3.5-inch SATA hard drives connected to the copy machine.
    • The files you plan to import must be accessible from the copy machine, whether they are on a network share or a local hard drive.
    Regards,
    Azam Khan

  • Import/Export Jobs Fail - ORA-01017 Invalid UserName/PW - Solved!

    Every time I try to run either an Import or Export Job from OEM it always fails with ORA-01017 Invalid UserName/PW.
    I have read numerous posts in this forum on related topics.
    Suggestions have been:
    1. Make sure OS login is a member of the ORA_DBA group.
    2. Make sure the OS user has the ability to logon as batch job.
    3. Make sure the ORACLE_SID and ORACLE_HOME are set.
    4. Make sure to set up Preferred Credentials on the Preferences page.
    5. On and on and on.
    I am using Oracle Version 10.2.0.1.0 On Windows 2003 Server Enterprise SP1.
    When I installed the DB using Oracle Universal Installer, it asked what password I would like to assign to SYS, SYSTEM, etc. I used the password "AvT_$651#JK", which it accepted without problem. The installation completed and I am able to log into OEM using SYS/AvT_$651#JK (SYSDBA) or SYSTEM/AvT_$651#JK (Normal).
    I then proceeded to "Import from Export Files" a schema from a previous "Export to Export Files" that was created on another host using the same Oracle version 10.2.0.1.0 on Windows 2003 Server Enterprise SP1. This job always fails with "ORA-01017 Invalid UserName/PW" even though the username and PW are correct!
    It turns out the password itself is the problem; apparently some part of the process (RMAN??) does not like the special characters in the PW. Changing the PW to "testpw1" worked.
    Using Oracle Universal Installer should have prevented me from using a password that violated this policy.
    BG...

    Do you provide the username under the local security policy along with the domain name, like
    abc\uo.cok? If not, please enter the username along with the domain name and let me know the result. If possible, paste the Event Viewer log for this connection.
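    For reference, a minimal sketch of the workaround described in the original post: resetting the accounts to a password without special characters while troubleshooting (run as a privileged user; "testpw1" is just the value the poster used):

    -- Temporary simple password for troubleshooting; change it back once the export/import jobs run.
    ALTER USER system IDENTIFIED BY testpw1;
    ALTER USER sys IDENTIFIED BY testpw1;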

  • Tracking RMAN Backup job details

    Version:11.2.0.3/Solaris 11
    To track RMAN backup job details, which dictionary/dynamic performance views do you use?
    In our shop , we use incrementally updated backup (incremental merge). To find the details of previous RMAN Jobs, I tried using
    v$rman_status
    and
    v$rman_backup_job_details
    But the details provided by these two views don't match. For example, on the 26th of June 2013 there was an incremental backup and an archivelog backup. As per v$rman_status, both of them together took 61 minutes (46 + 15).
    According to v$rman_backup_job_details, the backup job took 96 minutes. It doesn't seem to provide all the info on the archivelog backup, though.
    The start_time and end_time provided by these two views don't match either!
    col starttime format a25
    col endtime format a25
    select status, object_type, to_char(start_time,'dd/MON/yyyy:hh:mi:ss') as starttime,
    to_char(end_time,'dd/MON/yyyy:hh:mi:ss') as endtime ,
    to_number(end_time-start_time)*24*60 duration_minutes
    from sys.v$rman_status where start_time > trunc(sysdate) - 20 and operation = 'BACKUP'
    order by end_time desc;
    STATUS                  OBJECT_TYPE   STARTTIME                 ENDTIME                   DURATION_MINUTES
    FAILED                  ARCHIVELOG    07/JUN/2013:08:47:44
    COMPLETED               ARCHIVELOG    26/JUN/2013:06:49:16      26/JUN/2013:07:36:14            46.9666667
    COMPLETED               DB INCR       26/JUN/2013:06:33:18      26/JUN/2013:06:49:16            15.9666667
    COMPLETED               ARCHIVELOG    25/JUN/2013:06:50:55      25/JUN/2013:07:58:01                  67.1
    COMPLETED               DB INCR       25/JUN/2013:06:25:06      25/JUN/2013:06:50:55            25.8166667
    COMPLETED               ARCHIVELOG    24/JUN/2013:06:15:42      24/JUN/2013:07:07:54                  52.2
    COMPLETED               DB INCR       24/JUN/2013:06:01:09      24/JUN/2013:06:15:42                 14.55
    COMPLETED               ARCHIVELOG    23/JUN/2013:09:47:48      23/JUN/2013:10:01:19            13.5166667
    COMPLETED               DB INCR       23/JUN/2013:09:40:27      23/JUN/2013:09:47:48                  7.35
    COMPLETED               ARCHIVELOG    22/JUN/2013:07:23:18      22/JUN/2013:07:41:29            18.1833333
    COMPLETED               DB INCR       22/JUN/2013:07:15:35      22/JUN/2013:07:23:17                   7.7
    COMPLETED               ARCHIVELOG    21/JUN/2013:07:30:33      21/JUN/2013:09:05:50            95.2833333
    COMPLETED               DB INCR       21/JUN/2013:06:39:35      21/JUN/2013:07:30:33            50.9666667
    COMPLETED               ARCHIVELOG    20/JUN/2013:07:35:54      20/JUN/2013:09:25:03                109.15
    COMPLETED               DB INCR       20/JUN/2013:06:55:08      20/JUN/2013:07:35:54            40.7666667
    COMPLETED               ARCHIVELOG    19/JUN/2013:07:20:10      19/JUN/2013:08:27:28                  67.3
    COMPLETED               DB INCR       19/JUN/2013:07:00:02      19/JUN/2013:07:20:10            20.1333333
    COMPLETED               ARCHIVELOG    18/JUN/2013:07:27:30      18/JUN/2013:09:19:50            112.333333
    COMPLETED               DB INCR       18/JUN/2013:07:02:09      18/JUN/2013:07:27:30                 25.35
    COMPLETED               ARCHIVELOG    17/JUN/2013:07:42:20      17/JUN/2013:08:40:29                 58.15
    COMPLETED               DB INCR       17/JUN/2013:07:22:29      17/JUN/2013:07:42:20                 19.85
    COMPLETED               ARCHIVELOG    17/JUN/2013:06:28:16      17/JUN/2013:07:42:44            74.4666667
    COMPLETED               DB INCR       17/JUN/2013:01:57:49      17/JUN/2013:06:28:11            270.366667
    COMPLETED               ARCHIVELOG    16/JUN/2013:02:18:02      16/JUN/2013:04:22:26                 124.4
    COMPLETED               DB INCR       16/JUN/2013:01:48:18      16/JUN/2013:02:18:02            29.7333333
    COMPLETED               ARCHIVELOG    14/JUN/2013:07:27:44      14/JUN/2013:08:40:53                 73.15
    COMPLETED               DB INCR       14/JUN/2013:07:01:19      14/JUN/2013:07:27:43                  26.4
    COMPLETED               ARCHIVELOG    13/JUN/2013:06:56:13      13/JUN/2013:07:47:50            51.6166667
    COMPLETED               DB INCR       13/JUN/2013:06:42:11      13/JUN/2013:06:56:13            14.0333333
    COMPLETED               ARCHIVELOG    12/JUN/2013:07:12:43      12/JUN/2013:08:12:10                 59.45
    COMPLETED               DB INCR       12/JUN/2013:06:45:51      12/JUN/2013:07:12:43            26.8666667
    COMPLETED               ARCHIVELOG    11/JUN/2013:07:21:36      11/JUN/2013:08:46:11            84.5833333
    COMPLETED               DB INCR       11/JUN/2013:06:52:29      11/JUN/2013:07:21:36            29.1166667
    COMPLETED               ARCHIVELOG    10/JUN/2013:07:04:49      10/JUN/2013:07:55:15            50.4333333
    COMPLETED               DB INCR       10/JUN/2013:06:49:10      10/JUN/2013:07:04:49                 15.65
    COMPLETED               ARCHIVELOG    09/JUN/2013:08:10:13      09/JUN/2013:09:04:10                 53.95
    COMPLETED               DB INCR       09/JUN/2013:07:50:24      09/JUN/2013:08:10:13            19.8166667
    COMPLETED               ARCHIVELOG    08/JUN/2013:07:37:09      08/JUN/2013:08:33:58            56.8166667
    COMPLETED               DB INCR       08/JUN/2013:07:17:56      08/JUN/2013:07:37:09            19.2166667
    COMPLETED               ARCHIVELOG    07/JUN/2013:08:32:01      07/JUN/2013:09:34:11            62.1666667
    COMPLETED               DB INCR       07/JUN/2013:07:36:27      07/JUN/2013:08:32:01            55.5666667
    COMPLETED               ARCHIVELOG    07/JUN/2013:08:48:10      07/JUN/2013:11:28:14            160.066667
    42 rows selected.
    -- Output of v$rman_backup_job_details
    select status, input_type,
    to_char(start_time,'dd/mm/yyyy:hh:mi:ss') as starttime,
    to_char(end_time,'dd/mm/yyyy:hh:mi:ss') as endtime,
    to_number(end_time-start_time)*24*60 duration_minutes
    From v$rman_backup_job_details
    where start_time > trunc(sysdate) - 20
    order by end_time desc;
    STATUS                  INPUT_TYPE    STARTTIME           ENDTIME             DURATION_MINUTES
    FAILED                  ARCHIVELOG    07/06/2013:08:47:44
    COMPLETED               DB INCR       26/06/2013:06:00:09 26/06/2013:07:36:14       96.0833333
    COMPLETED               DB INCR       25/06/2013:06:00:08 25/06/2013:07:58:01       117.883333
    COMPLETED               DB INCR       24/06/2013:06:00:09 24/06/2013:07:07:54            67.75
    COMPLETED               DB INCR       23/06/2013:08:07:56 23/06/2013:10:01:19       113.383333
    COMPLETED               DB INCR       22/06/2013:06:00:10 22/06/2013:07:41:29       101.316667
    COMPLETED               DB INCR       21/06/2013:06:00:12 21/06/2013:09:05:50       185.633333
    COMPLETED               DB INCR       20/06/2013:06:00:12 20/06/2013:09:25:03           204.85
    COMPLETED               DB INCR       19/06/2013:06:00:11 19/06/2013:08:27:28       147.283333
    COMPLETED               DB INCR       18/06/2013:06:00:16 18/06/2013:09:19:50       199.566667
    COMPLETED               DB INCR       17/06/2013:06:00:13 17/06/2013:08:40:29       160.266667
    COMPLETED               DB INCR       16/06/2013:06:04:02 17/06/2013:07:42:44            818.7
    COMPLETED               DB INCR       15/06/2013:06:05:12 16/06/2013:04:22:26       617.233333
    COMPLETED               DB INCR       14/06/2013:06:00:09 14/06/2013:08:40:53       160.733333
    COMPLETED               DB INCR       13/06/2013:06:00:09 13/06/2013:07:47:50       107.683333
    COMPLETED               DB INCR       12/06/2013:06:00:10 12/06/2013:08:12:10              132
    COMPLETED               DB INCR       11/06/2013:06:00:17 11/06/2013:08:46:11            165.9
    COMPLETED               DB INCR       10/06/2013:06:00:14 10/06/2013:07:55:15       115.016667
    COMPLETED               DB INCR       09/06/2013:06:00:10 09/06/2013:09:04:10              184
    COMPLETED               DB INCR       08/06/2013:06:00:09 08/06/2013:08:33:58       153.816667
    COMPLETED               DB INCR       07/06/2013:06:00:19 07/06/2013:09:34:11       213.866667
    COMPLETED               ARCHIVELOG    07/06/2013:08:48:10 07/06/2013:11:28:14       160.066667
    22 rows selected.

    When I run a full/incremental backup with archivelog, it shows up as only one job in v$rman_backup_job_details. Only if I explicitly run just a BACKUP ARCHIVELOG ALL in RMAN does it show up separately as an ARCHIVELOG backup in v$rman_backup_job_details.
    v$rman_status shows the individual parts of the job. For example, for a full backup it shows the DB backup and the archivelog backup.
    The start and end times in v$rman_status should match up with v$rman_backup_job_details.
    Let's take the 25th as an example:
    In v$rman_backup_job_details:
    STATUS                  INPUT_TYPE    STARTTIME           ENDTIME             DURATION_MINUTES
    COMPLETED               DB INCR       25/06/2013:06:00:08 25/06/2013:07:58:01       117.883333
    In v$rman_status:
    STATUS                  OBJECT_TYPE   STARTTIME                 ENDTIME                   DURATION_MINUTES
    COMPLETED               ARCHIVELOG    25/JUN/2013:06:50:55      25/JUN/2013:07:58:01                  67.1
    COMPLETED               DB INCR       25/JUN/2013:06:25:06      25/JUN/2013:06:50:55            25.8166667
    You need to look at your log files to see what RMAN was doing between 06:00:08, when the job started, and 06:25:06, when the incremental backup started.
    You can see that the end time is the same in both views.
    Not easy to explain but I hope that helps.
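    One way to see what RMAN was doing in that gap from the dictionary itself is to rerun the earlier v$rman_status query without the operation = 'BACKUP' filter, so the enclosing command rows (and any other operations in the same window) are listed too; this sketch only reuses the columns from the query above:

    select operation, status, object_type,
           to_char(start_time, 'dd/MON/yyyy:hh24:mi:ss') as starttime,
           to_char(end_time, 'dd/MON/yyyy:hh24:mi:ss') as endtime,
           round((end_time - start_time) * 24 * 60, 2) as duration_minutes
    from   v$rman_status
    where  start_time > trunc(sysdate) - 20
    order by start_time;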

  • Canceled Status in Job Detail, Process Chain triggering w/event

    Hi All,
    I have created an event and a process chain (PC), and I trigger the event with SM64 or with ABAP; everything is OK.
    If I activate and schedule the PC, I see the BI_PROCESS_TRIGGER job in Released status, but when I trigger the event nothing happens; when I then check the job details, I see the job in Canceled status.
    I don't understand why this problem occurs.
    I can trigger the event, but before the event triggers the PC, the scheduled job goes directly to Canceled status.
    If anyone can give me a clue to solving this problem, I will appreciate it.
    Note: there is no job log!
    Thanks
    Ali

    Ali,
    Check authorizations.
    Try to run the process chain without the event (immediate scheduling) and check.
    Srini

  • How to view ALL batch job details at one glance using function module

    Hi Experts,
    I need to see all batch job details; the information required includes the job name, client number, job frequency, etc.
    But I need to do it with a FUNCTION MODULE only,
    since the information is captured by a third-party system. I am looking for a suitable function module.
    Could you please suggest any FMs from which I can get this information?
    thanks and regards
    SAM

    Hi,
    You can explore these function modules for the desired SM37 details:
    With function module BP_JOB_MAINTENANCE (transaction SM37), you can call the full job maintenance system of the background processing system, starting with the job selection screen.
    Since many users are not familiar with job maintenance and have no desire to search for their jobs, you can use the function modules BP_JOB_SELECT and BP_JOBLIST_PROCESSOR to select and display a list of jobs for the users of your program.
    Use BP_JOB_SELECT to generate an internal table of jobs. Then, with BP_JOBLIST_PROCESSOR, you can display the selected jobs in the list format used by the job maintenance system.
    You can also use BP_FIND_JOBS_WITH_PROGRAM to select jobs that run a particular program. Use this function module with BP_JOBLIST_PROCESSOR to display a job list to your users. Like BP_JOB_SELECT, BP_FIND_JOBS_WITH_PROGRAM offers interactive and silent modes.
    Regards,
    Ashutosh

  • Acrobat Review Tracker: Can I export Tracker details?

    Hello,
    I frequently send out and join a variety of PDF reviews in my day to day activities at work. The Adobe Review Tracker allows me to see the various reviews I'm participating in with details regarding the review names, number of reviewers, and number of comments per review.
    Is there any way to export this information to another application (such as excel)?
    The review tracker (available in Adobe Acrobat 9.0 under comments > Track Reviews) seems to only offer a plain list of reviews. There appears to be no ability to sort this information (by file name) or any method to export the data in the tracker to another application.
    If this is a new feature in Acrobat 10 (X) or if there is some plug-in or other solution I'd love to hear about it.
    Thanks!

    As of now (up to Acrobat X), there is no way to export details of reviews from the Tracker. You can, however, create a PDF of the review details for a selected review if you have Acrobat (this is not available in Reader). You may want to log a feature request for the next Acrobat version for exporting Tracker details, though I was wondering how useful the exported data would be.
    Thanks!
