DBCONSOLE job submission problem

Hi
I am running dbconsole against a 10.2.0.1.0 database running on SUSE sles9 Linux.
I wish to run a reorg job, but I get the following error when I reach the final step to submit the job.
The user and password specified are valid and can be used to log on to the server directly through an SSH connection. The user is the one used to install Oracle, is a member of the Linux dba group, and has write access to the entire oracle_home.
Any help would be appreciated. Does anyone know which directory dbconsole is trying to write to, so I can check permissions?
Thanks
Error
An error occurred verifying the host credentials. Make sure the credentials are valid and that they specify an account with enough privileges to write to the directory where the reorganization script will be created. Credential verification error: ERROR: Wrong password for user

Hi
root.sh has been run, as can be seen from the file permissions on nmo:
-rwsr-s--- 1 root dba 23700 Jan 19 15:24 nmo
-rwsr-s--- 1 root dba 22498 Jan 19 15:24 nmb
Thanks

Similar Messages

  • Job submission fails

    Dear all,
    When scheduling a backup in Oracle 10g using Enterprise Manager, I get the error below:
    The job submission has failed for the following reason
    ORA-04063: package body "SYSMAN.MGMT_CREDENTIAL" has errors ORA-06508: PL/SQL: could not find program unit being called: "SYSMAN.MGMT_CREDENTIAL" ORA-06512: at line 1
    I have been trying to compile the package, but it still fails. How can I resolve this problem?
    Thanks in advance

    Hello Daniel,
    I am using Windows XP as the OS.
    I have not applied any patch.
    I have compiled it using EM, and it compiled with the errors below:
    Line # = 46 Column # = 12 Error Text = PLS-00201: identifier 'DBMS_OBFUSCATION_TOOLKIT' must be declared
    Line # = 46 Column # = 5 Error Text = PL/SQL: Statement ignored
    Line # = 57 Column # = 12 Error Text = PLS-00201: identifier 'DBMS_OBFUSCATION_TOOLKIT' must be declared
    Line # = 57 Column # = 5 Error Text = PL/SQL: Statement ignored
    Line # = 68 Column # = 12 Error Text = PLS-00201: identifier 'DBMS_OBFUSCATION_TOOLKIT' must be declared
    Line # = 68 Column # = 5 Error Text = PL/SQL: Statement ignored
    Line # = 80 Column # = 12 Error Text = PLS-00201: identifier 'DBMS_OBFUSCATION_TOOLKIT' must be declared
    Line # = 80 Column # = 5 Error Text = PL/SQL: Statement ignored
    Line # = 94 Column # = 12 Error Text = PLS-00201: identifier 'DBMS_OBFUSCATION_TOOLKIT' must be declared
    Line # = 94 Column # = 5 Error Text = PL/SQL: Statement ignored
    Can you assist me, please?
    regards,
    steve

  • Background job scheduling problem

    Hi. Can anyone check the program below and correct it?
    I am unable to see the output in SP01 (spool request).
    My program:
    REPORT zh_test4.

    TABLES: mara, tbtco.

    DATA: BEGIN OF itab OCCURS 0,
            matnr LIKE mara-matnr,
          END OF itab.

    " background data declarations
    DATA: job_name LIKE tbtco-jobname.
    DATA: job_num  LIKE tbtco-jobcount,
          rep      LIKE sy-repid.

    " selection screen
    PARAMETERS: p_matnr LIKE mara-matnr DEFAULT '1500-610'.

    SELECT matnr FROM mara INTO TABLE itab WHERE matnr EQ p_matnr.

    job_name = 'HARI'.

    " Open the background job.
    CALL FUNCTION 'JOB_OPEN'
      EXPORTING
        jobname          = job_name
      IMPORTING
        jobcount         = job_num
      EXCEPTIONS
        cant_create_job  = 1
        invalid_job_data = 2
        jobname_missing  = 3
        OTHERS           = 4.
    IF sy-subrc <> 0.
      WRITE: / 'Job opening problem'.
    ELSE.
      WRITE: / 'Job successfully opened', sy-subrc.
    ENDIF.

    MOVE sy-uname TO tbtco-authcknam.
    rep = sy-repid.
    job_name = 'HARI'.

    " Add this report as a step of the job.
    CALL FUNCTION 'JOB_SUBMIT'
      EXPORTING
        authcknam               = sy-uname
        jobcount                = job_num
        jobname                 = job_name
        report                  = 'ZH_TEST4'
*       variant                 = 'VAR'
      EXCEPTIONS
        bad_priparams           = 1
        bad_xpgflags            = 2
        invalid_jobdata         = 3
        jobname_missing         = 4
        job_notex               = 5
        job_submit_failed       = 6
        lock_failed             = 7
        program_missing         = 8
        prog_abap_and_extpg_set = 9
        OTHERS                  = 10.
    IF sy-subrc <> 0.
      WRITE: / 'JOB SUBMIT PROBLEM', job_name, job_num, rep, sy-subrc.
    ELSE.
      WRITE: / 'Job successfully submitted in background', sy-subrc.
    ENDIF.

    " Close the job and release it for immediate start.
    CALL FUNCTION 'JOB_CLOSE'
      EXPORTING
        jobcount             = job_num
        jobname              = job_name
        strtimmed            = 'X'
      EXCEPTIONS
        cant_start_immediate = 1
        invalid_startdate    = 2
        jobname_missing      = 3
        job_close_failed     = 4
        job_nosteps          = 5
        job_notex            = 6
        lock_failed          = 7
        invalid_target       = 8
        OTHERS               = 9.
    IF sy-subrc <> 0.
      WRITE: / 'Unable to close the Job', rep, sy-subrc.
    ELSE.
      WRITE: / 'Successfully closed the job', sy-subrc.
    ENDIF.

    Here is an example, slightly different from your version.
    REPORT ztest.
    PARAMETERS: p_vbeln LIKE vbak-vbeln,
                p_bkrun NO-DISPLAY.
    DATA: ls_vbak LIKE vbak.
    DATA: v_answer,
          v_jobcount LIKE tbtcjob-jobcount.
      IF p_bkrun IS INITIAL.
    *-- not background processing
        CALL FUNCTION 'POPUP_TO_CONFIRM_STEP'
             EXPORTING
                  textline1      = 'This may time out.'
                  textline2      = 'Do you want to run in background?'
                  titel          = 'Warning!!!'
                  cancel_display = space
             IMPORTING
                  answer         = v_answer.
        IF v_answer = 'J'.
    *-- run in the background
          CALL FUNCTION 'JOB_OPEN'
               EXPORTING
                    jobname          = 'ZTEST'
               IMPORTING
                    jobcount         = v_jobcount
               EXCEPTIONS
                    cant_create_job  = 1
                    invalid_job_data = 2
                    jobname_missing  = 3
                    OTHERS           = 4.
          IF sy-subrc <> 0.
            MESSAGE ID sy-msgid TYPE 'E' NUMBER sy-msgno
                    WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
            EXIT.
          ENDIF.
    *-- submit the program in the background
          SUBMIT ztest
            WITH p_bkrun = 'X'
            WITH p_vbeln = p_vbeln
            USER sy-uname
            VIA JOB 'ZTEST' NUMBER v_jobcount AND RETURN.
    *-- close the job
          CALL FUNCTION 'JOB_CLOSE'
               EXPORTING
                    jobcount             = v_jobcount
                    jobname              = 'ZTEST'
                    strtimmed            = 'X'
               EXCEPTIONS
                    cant_start_immediate = 1
                    invalid_startdate    = 2
                    jobname_missing      = 3
                    job_close_failed     = 4
                    job_nosteps          = 5
                    job_notex            = 6
                    lock_failed          = 7
                    OTHERS               = 8.
          IF sy-subrc <> 0.
            MESSAGE ID sy-msgid TYPE 'W' NUMBER sy-msgno
                    WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
          ENDIF.
          EXIT.
        ELSE.
          CLEAR v_answer.
        ENDIF.
      ENDIF.
      CHECK v_answer IS INITIAL.
      SELECT SINGLE * FROM vbak
                      INTO ls_vbak
                     WHERE vbeln = p_vbeln.
      IF sy-subrc <> 0.
        WRITE:/ 'Invalid Order Id.'.
      ENDIF.
    END-OF-SELECTION.
      CHECK v_answer IS INITIAL.
      IF p_bkrun IS INITIAL.
        WRITE:/ 'Here is the result running the program in foreground.'.
      ELSE.
        WRITE:/ 'Here is the result running the program in background.'.
      ENDIF.
      WRITE:/ ls_vbak-vbeln,
              ls_vbak-vkorg.

  • Job submission failed : error occurred while scheduling the job. org.quartz.objectalreadyexistsexception: unable to store job with name

    Experts,
    Please help me out here. I am facing this issue while scheduling a job in BI Publisher.
    job submission failed : error occurred while scheduling the job. org.quartz.objectalreadyexistsexception: unable to store job with name
    Thanks,
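
    For context, org.quartz.ObjectAlreadyExistsException is what the Quartz scheduler throws when a job is stored under a name/group that is already registered. The sketch below is only a generic Quartz 2.x illustration of that condition and of one way to guard against it; the job name "ztest_report_job", the group "reports", and the trigger are made up and are not anything BI Publisher actually uses internally.

    import org.quartz.*;
    import org.quartz.impl.StdSchedulerFactory;
    import static org.quartz.JobBuilder.newJob;
    import static org.quartz.TriggerBuilder.newTrigger;

    public class QuartzDuplicateJobDemo implements Job {

        public void execute(JobExecutionContext ctx) {
            System.out.println("running " + ctx.getJobDetail().getKey());
        }

        public static void main(String[] args) throws SchedulerException {
            Scheduler scheduler = StdSchedulerFactory.getDefaultScheduler();
            scheduler.start();

            JobKey key = new JobKey("ztest_report_job", "reports");  // hypothetical name/group

            // Scheduling a job under a key that already exists raises
            // ObjectAlreadyExistsException, the error reported above.
            // Checking first (or using a different job name) avoids it.
            if (!scheduler.checkExists(key)) {
                JobDetail job = newJob(QuartzDuplicateJobDemo.class).withIdentity(key).build();
                Trigger trigger = newTrigger().startNow().build();
                scheduler.scheduleJob(job, trigger);
            }

            scheduler.shutdown(true);
        }
    }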

    You are probably installing on a hardened machine.
    The "installation guide" says that if you are doing so, you need to create a file named libx11.so.4 and update your LD_LIBRARY_PATH (see http://docs.iplanet.com/docs/manuals/messaging/ims52/ig/unix/overview.htm), but:
    1- the library name is libX11.so.4, and
    2- for the JRE a single file is not enough; you need to install at least SUNWxwplt. This will install the library in /usr/openwin/lib.
    You can check the results by running <server-root>/bin/base/jre/bin/jre
    Hope that helps, and sorry about my poor English.

  • Double submission problem in J2EE application under Weblogic 8.1 SP2 server

    Hi,
    We are facing a double submission problem in our J2EE application, which runs under WebLogic 8.1 SP2, and we have already added the preventive measures below.
    1. We disable the SUBMIT button once the user clicks it.
    2. We prevent pressing the 'F5' key and clicking the 'Refresh' button in the browser.
    3. We also tried to prevent it by declaring idempotent as 'true' in weblogic-ejb-jar.xml, as below.
    <stateless-bean-methods-are-idempotent>true</stateless-bean-methods-are-idempotent>
    Could somebody please suggest some other way to prevent this?
    Regards,
    Dinesh.

    I have no idea why you would think changing your EJB configuration would have anything to do with preventing double submission at your servlet layer.
    One technique I've seen for preventing double submission was first used in the Struts framework several years ago. When a page is "prepared" for display, a token value is created and stored in the session. The page is displayed with a hidden field containing that value. When the page is submitted, the value of the hidden field is compared with the value stored in the session. If they're not equal, the submission is ignored.
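
    As a rough illustration of that token technique (not code from the application being discussed), the Struts 1.x Action base class exposes it through saveToken(), isTokenValid(), and resetToken(); the action classes and forward names below are hypothetical.

    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;
    import org.apache.struts.action.Action;
    import org.apache.struts.action.ActionForm;
    import org.apache.struts.action.ActionForward;
    import org.apache.struts.action.ActionMapping;

    public class PrepareOrderAction extends Action {
        public ActionForward execute(ActionMapping mapping, ActionForm form,
                                     HttpServletRequest request, HttpServletResponse response) {
            saveToken(request);                 // store a fresh token in the session;
                                                // the form page renders it as a hidden field
            return mapping.findForward("showForm");
        }
    }

    class SubmitOrderAction extends Action {
        public ActionForward execute(ActionMapping mapping, ActionForm form,
                                     HttpServletRequest request, HttpServletResponse response) {
            if (!isTokenValid(request)) {
                // token missing or already consumed: treat this as a duplicate submission
                return mapping.findForward("duplicateSubmission");
            }
            resetToken(request);                // consume the token so a re-POST fails the check
            // ... perform the real work exactly once ...
            return mapping.findForward("success");
        }
    }

    The same idea works outside Struts: generate a one-time token when the form is rendered, store it in the session, echo it in a hidden field, and accept the POST only if the submitted token matches and is then invalidated.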

  • Unable to properly submit default credentials for use with Job Submission

    Greetings,
    I have several jobs running successfully by overriding the default credentials. I am having difficulty getting them to run using the default credentials. Under Preferences I choose Preferred Credentials and set the username and password for the target of interest, choose to test the connection, and it is successful. Under the database instance I set the username and password and the host username and host password, and again the connection test is successful. I then go to the job in question (which, by the way, runs successfully if I choose to override the default credentials), edit the job, and ask it to use the preferred credentials. When I click Submit I see the following message displayed: Following Preferred Credentials are not set: Database Host Credentials on Targets.... or Normal Database Credentials on targets....
    I am clearly missing something, but I am not certain what. Any thoughts would be appreciated.
    Thanks.

    I figured it out. The job is associated with a particular user but default credentials were not set for that particular user. Once I set those credentials for that user the job submission was successful. Thanks.

  • Override job ticket at job submission (New user)

    Hi! I'm new to the forum and new to FFC as well, so please bear with me. Our FFC was just updated to version 4.0.2.0 [2015.05.18 9865 01.04]. After that I no longer have the option to choose override job ticket and give an order quantity along with a paper stock name. Any idea what is going on? Is there some way to turn it on again, or was it removed in the new version?

    The option has moved, but it is still there. You can see it at the bottom of the job submission UI. If you are not seeing the same options, I would check the server and client versions in the About FreeFlow Core screen. If the version numbers do not match, flush the browser cache and restart your browser. Also, as you may have noticed above, you now have the option to select a printer destination and to define all job ticketing during upload. That new option makes job ticketing both more complete and easier (people have a hard time typing the exact stock name in the override).

  • Job Cancelled Problem

    Hi Masters,
    In SM37 I have seen that 3 jobs were cancelled.
    Please see below:
    ======
    Job started
    Step 001 started (program RSSTAT1, variant &0000000002410, user name ALEREMOTE)
    Log:Programm RSSTAT1; Request REQU_3ZAF2Q2IY0EE5RZCTYPE2FCAT; Status ; Action Start
    Deleting/reconstructing indexes for InfoCube ZSRVPUR01 is not permitted
    Deleting/reconstructing indexes for InfoCube ZSRVPUR01 is not permitted
    Log:Programm RSSTAT1; Request REQU_3ZAF2Q2IY0EE5RZCTYPE2FCAT; Status @08@; Action Callback
    Report RSSTAT1 completed with errors
    Job cancelled after system exception ERROR_MESSAGE
    How do I investigate cancelled jobs under ALEREMOTE? How can I find out which task this cancelled job's request is assigned to?
    How do I rerun a cancelled job? Please tell me the steps.
    Please suggest me.
    Thanks,
    BW26.

    Hi,
    it looks like you have an authorization problem with ALEREMOTE. But anyway, did you check the syslog (SM21) or the dump overview (ST22)? Are there any problems logged for the runtime of the job?
    regards
    Siggi
    PS: Have a look here, it might be of some help to you! /people/siegfried.szameitat/blog/2005/07/28/data-load-errors--basic-checks
    Message was edited by: Siegfried Szameitat

  • JOB Scheduling problem

    Hi All,
    My problem is the following:
    I would like to call two functions from one function module, but I cannot get the second job to run after the first. I wrote the code and used the PREDJOB_CHECKSTAT parameter of JOB_CLOSE, but when the calling function module is executed, the two jobs start immediately and in parallel. What is wrong in the following code?
    Thanks /and points :)/ for your help!
    Tamas
          L_JOBNAME = 'REQUEST_COPY'.
          CALL FUNCTION 'JOB_OPEN'
            EXPORTING
              JOBNAME  = L_JOBNAME
            IMPORTING
              JOBCOUNT = L_JOBCOUNT.
          SUBMIT RSSEM_REQUEST_COPY
                       WITH RNR = I_RNR
                       WITH S_CUBE = LS_CUBES-SOURCE_CUBE
                       WITH T_CUBE = LS_CUBES-TARGET_CUBE
                       USER SY-UNAME VIA JOB L_JOBNAME NUMBER L_JOBCOUNT
                       AND RETURN.
          CALL FUNCTION 'JOB_CLOSE'
            EXPORTING
              JOBCOUNT  = L_JOBCOUNT
              JOBNAME   = L_JOBNAME
              STRTIMMED = 'X'.
          L_JOBNAME2 = 'REQUEST_CLOSE'.
          CALL FUNCTION 'JOB_OPEN'
            EXPORTING
              JOBNAME  = L_JOBNAME2
            IMPORTING
              JOBCOUNT = L_JOBCOUNT2.
         SUBMIT Z_REQUEST_CLOSE_ZSD_P25
                   USER SY-UNAME VIA JOB L_JOBNAME2 NUMBER L_JOBCOUNT2
                   AND RETURN.
          CALL FUNCTION 'JOB_CLOSE'
            EXPORTING
              JOBCOUNT  = L_JOBCOUNT2
              JOBNAME   = L_JOBNAME2
              STRTIMMED = 'X'
              PREDJOB_CHECKSTAT = 'X'
              PRED_JOBCOUNT = L_JOBCOUNT
              PRED_JOBNAME = L_JOBNAME.

    Hi Thomas,
    Thanks for all the help, I found the solution!
    I inserted a select on the job status table into the second job definition.
    Thanks for the ideas!
    Tamás
    The final code is the following:
          L_JOBNAME = 'REQUEST_COPY'.
          CALL FUNCTION 'JOB_OPEN'
            EXPORTING
              JOBNAME  = L_JOBNAME
            IMPORTING
              JOBCOUNT = L_JOBCOUNT.
          SUBMIT RSSEM_REQUEST_COPY
                       WITH RNR = I_RNR
                       WITH S_CUBE = LS_CUBES-SOURCE_CUBE
                       WITH T_CUBE = LS_CUBES-TARGET_CUBE
                       USER SY-UNAME VIA JOB L_JOBNAME NUMBER L_JOBCOUNT
                       AND RETURN.
          CALL FUNCTION 'JOB_CLOSE'
            EXPORTING
              JOBCOUNT  = L_JOBCOUNT
              JOBNAME   = L_JOBNAME
              STRTIMMED = 'X'.
          L_JOBNAME2 = 'REQUEST_CLOSE'.
          CALL FUNCTION 'JOB_OPEN'
            EXPORTING
              JOBNAME  = L_JOBNAME2
            IMPORTING
              JOBCOUNT = L_JOBCOUNT2.
          DO.
            SELECT SINGLE STATUS FROM TBTCO INTO L_STATUS
              WHERE JOBNAME = L_JOBNAME
              AND JOBCOUNT = L_JOBCOUNT.
            IF L_STATUS = 'F'.
              EXIT.
            ELSE.
              WAIT UP TO 1 SECONDS.
            ENDIF.
          ENDDO.
          SUBMIT Z_REQUEST_CLOSE_ZSD_P25
                    USER SY-UNAME VIA JOB L_JOBNAME2 NUMBER L_JOBCOUNT2
                    AND RETURN.
          CALL FUNCTION 'JOB_CLOSE'
            EXPORTING
              JOBCOUNT          = L_JOBCOUNT2
              JOBNAME           = L_JOBNAME2
              STRTIMMED         = 'X'
              PREDJOB_CHECKSTAT = 'X'
              PRED_JOBCOUNT     = L_JOBCOUNT
              PRED_JOBNAME      = L_JOBNAME.

  • Background job scheduling problem in APO

    Hi fellow SDNers,
    I am going through this peculiar problem with background job scheduling:
    The scenario is this: I have a CSV (Excel) file on the application server that should load data into the InfoSource. I have scheduled the load to run in the background (in the InfoPackage) after an event is triggered (an option in the scheduling tab of the InfoPackage).
    Everything seems to be fine, but the data is not getting loaded. Could you please help me out: how do I load data from an Excel file in the background after an event gets triggered?
    thanks in advance,
    Rohan

    Hi Alexander,
    I am triggering the event from the BP_EVENT_RAISE FM in APO by passing the event ID; this automatically raises the event, just like SM64.
    Thanks
    Rohan

  • Urgent....please HELP....Different user Multiple submission problem

    Hi all:
    This is a Struts-related problem.
    I have posted earlier about the usage of the saveToken() and isTokenValid(request) methods, because I wanted to make sure that the user doesn't click the "submit" button twice. I managed to achieve that. However, what I want is the following:
    Suppose the user presses the "submit" button more than once. The thread that executes the first request still runs, but has no means of providing its response to the browser. Hence, the user may be left with the impression that the transaction did not complete, while in reality it may have completed successfully.
    So I would like to know if there can be a complete solution that prevents duplicate submission and still ensures the display of a response that represents the original request's outcome.
    Some of you may suggest that I look at the article on JavaWorld:
    http://www.javaworld.com/javaworld/javatips/jw-javatip136.html
    I have indeed, and I downloaded their example and ran it on my local app server; it DOESN'T do what it claims to do. I clicked the "OK" button more than once, and the second request just brings me to an empty page, because the ActionForward object returns NULL.
    So if anyone has solved this problem before, please do let me know...
    I'd truly appreciate it...
    Again, I would like to thank everyone for their time in advance.
    Many thanks....

    The solution on that site looks pretty good in theory.
    What did you do to implement it?
    Did you return a forward page from your executeSynchro method which you overrode to implement it?

  • Plz: Replication 9i Job Network Problem Hang

    Hi, this problem has been posted on this forum by other users,
    but all of them remained unanswered!
    I hope somebody will help me now:
    I have a hang problem in 9i.
    We are using bidirectional replication (materialized view replication),
    and when an mview refresh group is being refreshed, if a network problem occurs (on the VPN), our refresh job seems to hang; it never ends (once we waited a week but nothing happened) and the materialized views remain locked.
    The session cannot be killed at all; the only way out for us is to restart the database.
    I hope there is a better way, or some settings to prevent this.
    Somebody help me, please.

    Personally, I've never seen this behavior. If other reports of the same problem in the forum haven't been answered, it's likely because none of the folks that answer questions here have seen this behavior elsewhere.
    Have you logged a TAR in Metalink? It certainly sounds like a bug. Have you tried upgrading to the latest patchset?
    Justin

  • Archiving data using DART : Job lock problem in table TSP01

    Hi ,
    I'm facing a problem while archiving from the production system to UNIX using DART.
    I use transaction FTW1A for the data extract; once the data has been extracted, we need to run a verification process through transaction FTWE1 (background job RTXWCHK4) and transaction FTWD (background job RTXWCHK2).
    When I run transaction FTWD (background job RTXWCHK2) to verify, it holds an extensive lock on the TSP01 table for a long period of time, which blocks other processing on that table, so we have to terminate the job. For the time being the workaround is to run this job at weekends, but I want a proper solution for this.
    Can anybody help me with this problem?
    Regards,
    Nupur S Jaipuriyar

    Locking a row that does not exist can be difficult.
    On most databases you can lock an entire table through "LOCK TABLE <table>"; however, this may be extreme. Potentially you could also insert an empty row into the table with the id that you want to lock; then you would hold a write lock on that row until you commit the transaction.
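
    A minimal JDBC sketch of the two approaches described above, assuming an Oracle-style database; the table my_spool_table, the key value, and the connection details are hypothetical, and the LOCK TABLE syntax varies between databases.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;
    import java.sql.Statement;

    public class RowLockSketch {
        public static void main(String[] args) throws Exception {
            // Hypothetical connection details.
            try (Connection con = DriverManager.getConnection(
                    "jdbc:oracle:thin:@//dbhost:1521/ORCL", "scott", "tiger")) {
                con.setAutoCommit(false);

                // Option 1: lock the whole table (coarse, often too extreme).
                try (Statement st = con.createStatement()) {
                    st.execute("LOCK TABLE my_spool_table IN EXCLUSIVE MODE");
                }
                con.rollback();   // release the table lock again for this demo

                // Option 2: insert a placeholder row for the key you want to "lock";
                // the row stays write-locked by this transaction until commit/rollback.
                try (PreparedStatement ps = con.prepareStatement(
                        "INSERT INTO my_spool_table (id) VALUES (?)")) {
                    ps.setLong(1, 4711L);   // hypothetical key
                    ps.executeUpdate();
                }
                // ... do the work that must not run concurrently for this key ...
                con.commit();     // or rollback() to discard the placeholder row
            }
        }
    }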

  • Reporting Agent Job Scheduling problem

    Hi all,
    I am trying to schedule a Reporting Agent job. I have a few precalculated web templates in one Reporting Agent scheduling package. When I try to schedule it and go to "start condition", I want to set "After Event". I select Event and give the parameter name, but when I save and exit, the job is already scheduled (checked in SM37), so I checked the job condition again and the event I selected is not there!
    I want to create an after-event condition and schedule that parameter via the mainframe, as all our jobs are scheduled through the mainframe only. Can someone tell me why it is not working with the Reporting Agent?
    I will definitely assign the points.

    Dinesh and SB,
    Thank you for your replies; I understood your point. But what I am asking is: when I put the "after event" criteria in the start condition, shouldn't it always be shown whenever I go to Reporting Agent scheduling package --> right click --> schedule --> start condition (I mean the event name and parameter should be saved there)?
    But once I save it and leave the scheduling package, I can see the job has been scheduled, yet it does not show that it is an event-controlled job!
    Is it possible to use the "after event" option in a Reporting Agent job?
    I have a few queries under one Reporting Agent scheduling package which is added to one process chain. The process chain runs once a month; I added the RA at the end of the process chain, and the variant will be scheduled by the mainframe once the process chain has completed successfully. The second thing is: I need to run the Reporting Agent job every single day, so I need to schedule it twice. We schedule everything via the mainframe, so if I can save the "after event" criteria then I can schedule that parameter from the mainframe. The problem is that the start condition is not saving my after-event condition entries or parameter names.
    I hope I am clear. Please guide me; it is kind of urgent.

  • Database 11gR2  dbconsole job library

    I created a Refresh from MetaLink job with a schedule. I saved the job and now wish to view the job library. I can't seem to find the job I created. Where do I go to find it in dbconsole?
    Thanks,
    Bob

    Regarding "I created a Refresh from MetaLink job with a schedule. I saved the job": please post the SQL and results that show the above is true.
