Inventory stops processing status files

We are having problems with files accumulating in the status folder on our ZCM server (SLES 10.2, 64-bit). We are running ZCM 10.2.2, and for some reason the status folder fills up with .zip files, which means inventory has stopped processing them. I usually stop all the services, delete the files, and then restart the server; as files are written into the folder they are processed again. Unfortunately I am 2 days past the 2-week window in which I could reopen my previous service request. Does anyone have any suggestions or ideas as to what the problem might be? Or who to contact to reopen a call 2 days past the two-week limit?
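Until the root cause is found, a small watchdog can at least flag the backlog before inventory falls too far behind. Below is a minimal Python sketch; the status folder path and the threshold are assumptions and should be adjusted to your install. Run it from cron so any output gets mailed to you:

#!/usr/bin/env python3
# Warn when unprocessed inventory .zip files pile up in the collection status folder.
import glob, os, sys

STATUS_DIR = "/var/opt/novell/zenworks/collection"   # hypothetical path - adjust to your server
THRESHOLD = 50                                        # backlog size that counts as "stuck"

zips = sorted(glob.glob(os.path.join(STATUS_DIR, "*.zip")), key=os.path.getmtime)
if len(zips) >= THRESHOLD:
    print("WARNING: %d .zip files waiting in %s; oldest: %s"
          % (len(zips), STATUS_DIR, zips[0]))
    sys.exit(1)   # non-zero exit so cron mails the output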

helgeson,
It appears that in the past few days you have not received a response to your
posting. That concerns us, and has triggered this automated reply.
Has your problem been resolved? If not, you might try one of the following options:
- Visit http://support.novell.com and search the knowledgebase and/or check all
the other self support options and support programs available.
- You could also try posting your message again. Make sure it is posted in the
correct newsgroup. (http://forums.novell.com)
Be sure to read the forum FAQ about what to expect in the way of responses:
http://forums.novell.com/faq.php
If this is a reply to a duplicate posting, please ignore and accept our apologies
and rest assured we will issue a stern reprimand to our posting bot.
Good luck!
Your Novell Product Support Forums Team
http://support.novell.com/forums/

Similar Messages

  • Sender File Adapter stop processing all files

    Hello all,
    The file adapter picks up all files in the directory by default.
    If a large number of files are in the directory, this could slow down PI processing.
    Is there any way to process only one file per polling interval?
    Regards

    >
    Ralf Zimmerningkat wrote:
    > Hello all,
    > The file adapter picks up all files in the directory by default.
    > If a large number of files are in the directory, this could slow down PI processing.
    > Is there any way to process only one file per polling interval?
    >
    > Regards
    I don't have a problem if you have already got the answer, but this blog only explains how to exclude the other files from the same folder.
    Your case: for example, your sender communication channel wants to pick up the file ABC.txt from the /xyz directory. Now suppose there are ten thousand files with the same name pattern in that directory and you want ABC.txt to be picked up one by one. How is this blog going to help with that? Can you please explain to me and others too?
    @Sachin, maybe you can throw some light on this... maybe I am missing something.
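    One generic workaround outside the adapter configuration (a sketch, assuming you can run a scheduled script on the file server) is to stage incoming files in a holding directory and release them into the polled directory one at a time, so the adapter only ever sees one file per poll. The directory names below are assumptions:

    # release_one.py - move at most one file per run from a holding directory into the polled directory
    import os, shutil

    HOLDING = "/data/holding"   # files land here first (assumption)
    POLLED  = "/data/inbox"     # directory the sender channel actually polls (assumption)

    def release_one():
        if os.listdir(POLLED):                  # previous file not yet picked up - do nothing
            return
        pending = sorted(os.listdir(HOLDING))
        if pending:
            shutil.move(os.path.join(HOLDING, pending[0]), POLLED)

    if __name__ == "__main__":
        release_one()

    Schedule it at roughly the same interval as the adapter poll.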

  • XI file sender: filename validation to stop processing a file twice

    Hello folks,
    I have a file sender adapter for a text file with a naming convention that includes a date - e.g. orders_YYYYMMDD.txt. We have a business requirement to ensure that we don't post the same file more than once. The file can be huge, so it's not really an option to pick up the file contents in one message.
    Options I have tried:
    - I do not know of a setting in the file adapter to achieve this.
    - write a perl script to read the filename and translate [hex-encode] this into a Message GUID, then post the file via HTTP adapter
      - - if the file name is not too long (guid is 32 hex chars) this method works well for small files that are in XML format.
      - - Text files need too much perl coding to translate to XML. Large files that need to be split will fail on the 2nd chunk.
    Options I don't want to use:
    - Use BPM to call an RFC/Proxy that validates the filename - this will cause me to read the whole file, or I have to implement a 'pipe' in BPM to ensure EOIO processing. (We have this elsewhere, but it's not good for performance)
    - Actually, I don't want to manage this in ABAP/Ztables at all if possible.
    I am about to start work on a Module Processor to mangle the GUID in the file adapter, similar to the HTTP method above (I don't have any idea whether this will be possible yet)...
    Can anyone recommend another method to achieve this?

    Hi Derek,
    Initially, when the file is picked up, archive it to another directory.
    With this, the same file will not be processed twice.
    Rgds,
    Kumar
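    To complement the archive approach: if the concern is the same filename arriving again later, a small script that records processed names and only releases unseen files can enforce the rule outside the adapter. This is a generic sketch, not a PI module; the directory and log paths are assumptions:

    # skip_duplicates.py - release only filenames that have never been processed before
    import os, shutil

    INBOX     = "/data/orders/inbox"           # where orders_YYYYMMDD.txt arrives (assumption)
    RELEASE   = "/data/orders/for_adapter"     # directory the sender channel polls (assumption)
    PROCESSED = "/data/orders/processed.log"   # one filename per line

    def seen():
        if not os.path.exists(PROCESSED):
            return set()
        with open(PROCESSED) as fh:
            return {line.strip() for line in fh}

    done = seen()
    for name in sorted(os.listdir(INBOX)):
        if name in done:
            continue                           # same filename already posted once - skip it
        shutil.move(os.path.join(INBOX, name), os.path.join(RELEASE, name))
        with open(PROCESSED, "a") as fh:
            fh.write(name + "\n")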

  • How to process Idoc status file within Sap?

    Hi All,
    We would like to process the received status file to update the sent IDocs. Our current setup is that we send/push the outbound IDoc to an external EDI subsystem and pull the status file for the processed IDoc from the external EDI system to our R/3 server.
    SAP documentation explains the scenario where this process is triggered from the external system using the startrfc program, whereas our scenario is to trigger the status update process once the file is pulled to our system from Unix, using the EDI_STATUS_INCOMING function module.
    Your answers are much appreciated.
    Kind Regards,
    Sandhya

    Hi Sandhya,
    Once the IDocs are sent to the EDI subsystem, SAP will have the status that the IDoc has reached the partner subsystem. I expect the workflow should be able to trigger the status file back to the SAP system after the IDoc is processed at the EDI subsystem.
    Please check the workflow that is attached.
    Otherwise, if this is using the FM or Message Control, check the configuration under MN04.
    I'm also trying to find the solution for the same issue.
    Regards,
    -Syed.

  • Stop cf server from processing particular files

    Bit of a strange question, this.
    I want to stop the ColdFusion server from processing certain .cfm files in the root of my site; I want them to be served by IIS as plain HTML. Is this possible? I guess it's more of an IIS question.

    There are thousands of links pointing to this file, internal and external, and changing the URL would affect rankings on search engines.
    I do not want CF to process this file as it's too slow to run now. I'd rather have an HTML version served up that is updated every 15 minutes.
    The client's server is shared and the CF service seems unreliable.
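    One way to get an HTML version that refreshes every 15 minutes without renaming the URL is a scheduled job that fetches the rendered page once and writes a static copy for IIS to serve (combined with, for example, an IIS URL Rewrite rule so the existing .cfm links serve the static copy). A minimal sketch; the URL and output path are assumptions:

    # snapshot.py - fetch the rendered page and save it as static HTML for IIS to serve
    import urllib.request

    SOURCE = "http://localhost/slow_page.cfm"       # the slow ColdFusion page (assumption)
    TARGET = r"C:\inetpub\wwwroot\slow_page.html"   # static copy served by IIS (assumption)

    with urllib.request.urlopen(SOURCE, timeout=120) as resp:
        html = resp.read()
    with open(TARGET, "wb") as out:
        out.write(html)

    Schedule it with Windows Task Scheduler every 15 minutes.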

  • Output Server stops processing files (seemingly) at random

    We use Adobe Central Output Server 5.5.0.308 to process invoices/POs and the like from CommerceCenter (an Epicor/Prophet 21 product). We have three instances of it running on three separate terminal servers.
    It has been working pretty well for years.  Sometimes it would fail to process a certain file or it would stop responding and need to be restarted, but nothing serious and never very often.
    Suddenly in the last couple of months it has stopped processing files at random during the day.  To get it to start again we have to open the Control Center and go to Control -> Start Server.  Then it will process everything in the queue just fine.  It has been happening on both of our older 2003 terminal servers fairly often (at least 4 times a week) and just happened for the first time on our new 2008 R2 terminal server.
    None of us here know much about Output Server and there is nothing in the Server log or the Event Viewer that indicates any sort of issue, so we are at a loss.
    I don't really even know where to begin as the documentation of the relationship between CommerceCenter and Output Server is non-existent.
    I'm really looking for any ideas... Maybe somebody has run into this issue before? 

    cnorris63,
    It appears that in the past few days you have not received a response to your
    posting. That concerns us, and has triggered this automated reply.
    Has your problem been resolved? If not, you might try one of the following options:
    - Visit http://support.novell.com and search the knowledgebase and/or check all
    the other self support options and support programs available.
    - You could also try posting your message again. Make sure it is posted in the
    correct newsgroup. (http://forums.novell.com)
    Be sure to read the forum FAQ about what to expect in the way of responses:
    http://forums.novell.com/faq.php
    If this is a reply to a duplicate posting, please ignore and accept our apologies
    and rest assured we will issue a stern reprimand to our posting bot.
    Good luck!
    Your Novell Product Support Forums Team
    http://forums.novell.com/

  • SQL Sync group stuck on processing status

    Hi,
    The sync group below has been stuck on processing since Jan 8. This sync group syncs the hub DB with two Azure DBs (all are on Azure); one of the Azure DBs can sync normally but the other one is stuck (it stays in the processing status).
    Azure Product Subscription ID: 43051d1f-7dc4-4e4f-81f4-3e027983733c
    Sync Group ID: 4124f828-7e36-47dd-84f5-8b58a5e453a7_East Asia
    Status: Processing
    Would you please help me to reset the status from backend? Thanks!
    Regards,
    Michael Yung

    Hi Michael,
    A Microsoft support engineer will help to solve the problem from the backend. Some delay might be expected; your patience is greatly appreciated. Thank you for your understanding and support.
    Any of the following can result in a sync group being stuck in the processing state:
    a) The client agent is offline.
    Be sure that the client agent is online, then try again.
    b) The client agent is uninstalled or missing.
    If the client agent is uninstalled or otherwise missing:
    1) Remove the agent XML file from the SQL Data Sync (Preview) installation folder if the file exists.
    2) Install the agent on the same or another on-premises computer, and submit the agent key generated from the portal for the agent that is showing offline.
    c) The SQL Data Sync (Preview) service is stopped.
    1) In the Start menu, search on Services.
    2) Click Services in the Programs section of the search results.
    3) Find the SQL Data Sync (Preview) service.
    4) If the service status is Stopped, right-click the service name and select Start from the dropdown menu.
    Reference :
    http://msdn.microsoft.com/en-us/library/azure/hh667321.aspx#ProcessingError
    If you have any feedback on our support, please click
    here.
    Eric Zhang
    TechNet Community Support
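    For step c), the service check on the agent machine can also be scripted instead of done through services.msc. A rough sketch; the exact service name is an assumption and should be confirmed in services.msc first:

    # check_sync_service.py - start the client agent's Windows service if it is not running
    import subprocess

    SERVICE = "SQL Data Sync (Preview)"   # assumption: verify the real service name first

    status = subprocess.run(["sc", "query", SERVICE], capture_output=True, text=True)
    if "RUNNING" not in status.stdout:
        print(SERVICE + " is not running; attempting to start it...")
        subprocess.run(["sc", "start", SERVICE], check=False)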

  • Can't print - gstoraster stopped with status 13

    Title pretty much says it all. I have a Lexmark 2600 printer that won't work. This is what I got from the /var/log/cups/error_log:
    D [05/Sep/2012:23:11:53 -0500] [Job 21] time-at-processing=1346904713
    D [05/Sep/2012:23:11:53 -0500] [Job 21] job-sheets=none,none
    D [05/Sep/2012:23:11:53 -0500] [Job 21] argv[0]="Lexmark_2600_Series"
    D [05/Sep/2012:23:11:53 -0500] [Job 21] argv[1]="21"
    D [05/Sep/2012:23:11:53 -0500] [Job 21] argv[2]="deusdies"
    D [05/Sep/2012:23:11:53 -0500] [Job 21] argv[3]="Untitled"
    D [05/Sep/2012:23:11:53 -0500] [Job 21] argv[4]="1"
    D [05/Sep/2012:23:11:53 -0500] [Job 21] argv[5]="media=Letter job-uuid=urn:uuid:98c3e101-39ae-318d-48e4-3cad7ab517a7 job-originating-host-name=localhost time-at-creation=1346904713 time-at-processing=1346904713 PageSize=Letter"
    D [05/Sep/2012:23:11:53 -0500] [Job 21] argv[6]="/var/spool/cups/d00021-001"
    D [05/Sep/2012:23:11:53 -0500] [Job 21] envp[0]="CUPS_CACHEDIR=/var/cache/cups"
    D [05/Sep/2012:23:11:53 -0500] [Job 21] envp[1]="CUPS_DATADIR=/usr/share/cups"
    D [05/Sep/2012:23:11:53 -0500] [Job 21] envp[2]="CUPS_DOCROOT=/usr/share/cups/doc"
    D [05/Sep/2012:23:11:53 -0500] [Job 21] envp[3]="CUPS_FONTPATH=/usr/share/cups/fonts"
    D [05/Sep/2012:23:11:53 -0500] [Job 21] envp[4]="CUPS_REQUESTROOT=/var/spool/cups"
    D [05/Sep/2012:23:11:53 -0500] [Job 21] envp[5]="CUPS_SERVERBIN=/usr/lib/cups"
    D [05/Sep/2012:23:11:53 -0500] [Job 21] envp[6]="CUPS_SERVERROOT=/etc/cups"
    D [05/Sep/2012:23:11:53 -0500] [Job 21] envp[7]="CUPS_STATEDIR=/var/run/cups"
    D [05/Sep/2012:23:11:53 -0500] [Job 21] envp[8]="HOME=/var/spool/cups/tmp"
    D [05/Sep/2012:23:11:53 -0500] [Job 21] envp[9]="PATH=/usr/lib/cups/filter:/usr/bin:/usr/sbin:/bin:/usr/bin"
    D [05/Sep/2012:23:11:53 -0500] [Job 21] envp[10]="SERVER_ADMIN=root@galaxy"
    D [05/Sep/2012:23:11:53 -0500] [Job 21] envp[11]="SOFTWARE=CUPS/1.6.1"
    D [05/Sep/2012:23:11:53 -0500] [Job 21] envp[12]="TMPDIR=/var/spool/cups/tmp"
    D [05/Sep/2012:23:11:53 -0500] [Job 21] envp[13]="USER=root"
    D [05/Sep/2012:23:11:53 -0500] [Job 21] envp[14]="CUPS_MAX_MESSAGE=2047"
    D [05/Sep/2012:23:11:53 -0500] [Job 21] envp[15]="CUPS_SERVER=/var/run/cups/cups.sock"
    D [05/Sep/2012:23:11:53 -0500] [Job 21] envp[16]="CUPS_ENCRYPTION=IfRequested"
    D [05/Sep/2012:23:11:53 -0500] [Job 21] envp[17]="IPP_PORT=631"
    D [05/Sep/2012:23:11:53 -0500] [Job 21] envp[18]="CHARSET=utf-8"
    D [05/Sep/2012:23:11:53 -0500] [Job 21] envp[19]="LANG=en_US.UTF-8"
    D [05/Sep/2012:23:11:53 -0500] [Job 21] envp[20]="PPD=/etc/cups/ppd/Lexmark_2600_Series.ppd"
    D [05/Sep/2012:23:11:53 -0500] [Job 21] envp[21]="RIP_MAX_CACHE=128m"
    D [05/Sep/2012:23:11:53 -0500] [Job 21] envp[22]="CONTENT_TYPE=application/pdf"
    D [05/Sep/2012:23:11:53 -0500] [Job 21] envp[23]="DEVICE_URI=lxkusb://Lexmark/2600%20Series"
    D [05/Sep/2012:23:11:53 -0500] [Job 21] envp[24]="PRINTER_INFO=Lexmark 2600 Series"
    D [05/Sep/2012:23:11:53 -0500] [Job 21] envp[25]="PRINTER_LOCATION="
    D [05/Sep/2012:23:11:53 -0500] [Job 21] envp[26]="PRINTER=Lexmark_2600_Series"
    D [05/Sep/2012:23:11:53 -0500] [Job 21] envp[27]="PRINTER_STATE_REASONS=none"
    D [05/Sep/2012:23:11:53 -0500] [Job 21] envp[28]="CUPS_FILETYPE=document"
    D [05/Sep/2012:23:11:53 -0500] [Job 21] envp[29]="FINAL_CONTENT_TYPE=printer/Lexmark_2600_Series"
    D [05/Sep/2012:23:11:53 -0500] [Job 21] envp[30]="AUTH_I****"
    D [05/Sep/2012:23:11:53 -0500] [Job 21] Started filter /usr/lib/cups/filter/pdftopdf (PID 4033)
    D [05/Sep/2012:23:11:53 -0500] [Job 21] Started filter /usr/lib/cups/filter/gstoraster (PID 4034)
    D [05/Sep/2012:23:11:53 -0500] [Job 21] Started filter /usr/local/lexmark/lxk08/bin/printdriver (PID 4035)
    D [05/Sep/2012:23:11:53 -0500] [Job 21] Started backend /usr/lib/cups/backend/lxkusb (PID 4036)
    D [05/Sep/2012:23:11:53 -0500] [Job 21] STATE: +connecting-to-device
    D [05/Sep/2012:23:11:53 -0500] [Job 21] PID 4035 (/usr/local/lexmark/lxk08/bin/printdriver) stopped with status 113 (Permission denied)
    D [05/Sep/2012:23:11:53 -0500] [Job 21] Hint: Try setting the LogLevel to "debug" to find out more.
    D [05/Sep/2012:23:11:53 -0500] [Job 21] Printer using device file "/dev/usb/lp1"...
    D [05/Sep/2012:23:11:53 -0500] [Job 21] STATE: -connecting-to-device
    D [05/Sep/2012:23:11:53 -0500] [Job 21] backendRunLoop(print_fd=0, device_fd=5, use_bc=1)
    D [05/Sep/2012:23:11:53 -0500] [Job 21] PID 4036 (/usr/lib/cups/backend/lxkusb) exited with no errors.
    D [05/Sep/2012:23:11:53 -0500] [Job 21] PPD uses qualifier 'Color.Plain.'
    D [05/Sep/2012:23:11:53 -0500] [Job 21] PID 4033 (/usr/lib/cups/filter/pdftopdf) exited with no errors.
    D [05/Sep/2012:23:11:53 -0500] [Job 21] Calling FindDeviceById(Lexmark_2600_Series)
    D [05/Sep/2012:23:11:53 -0500] [Job 21] Failed to send: org.freedesktop.ColorManager.Failed:device id 'Lexmark_2600_Series' does not exists
    D [05/Sep/2012:23:11:53 -0500] [Job 21] Failed to get profile filename!
    D [05/Sep/2012:23:11:53 -0500] [Job 21] no profiles specified in PPD
    D [05/Sep/2012:23:11:53 -0500] [Job 21] Ghostscript command line: /usr/bin/gs -dQUIET -dPARANOIDSAFER -dNOPAUSE -dBATCH -dNOINTERPOLATE -sDEVICE=cups -sstdout=%stderr -sOutputFile=%stdout -sOutputType=2 -r600x600 -dDEVICEWIDTHPOINTS=612 -dDEVICEHEIGHTPOINTS=792 -dcupsMediaType=1 -dcupsBitsPerColor=8 -dcupsColorOrder=0 -dcupsColorSpace=1 -dcupsCompression=1 -dcupsRowStep=1 -scupsPageSizeName=Letter -I/usr/share/cups/fonts -c -f -_
    D [05/Sep/2012:23:11:53 -0500] [Job 21] envp[0]="CUPS_CACHEDIR=/var/cache/cups"
    D [05/Sep/2012:23:11:53 -0500] [Job 21] envp[1]="CUPS_DATADIR=/usr/share/cups"
    D [05/Sep/2012:23:11:53 -0500] [Job 21] envp[2]="CUPS_DOCROOT=/usr/share/cups/doc"
    D [05/Sep/2012:23:11:53 -0500] [Job 21] envp[3]="CUPS_FONTPATH=/usr/share/cups/fonts"
    D [05/Sep/2012:23:11:53 -0500] [Job 21] envp[4]="CUPS_REQUESTROOT=/var/spool/cups"
    D [05/Sep/2012:23:11:53 -0500] [Job 21] envp[5]="CUPS_SERVERBIN=/usr/lib/cups"
    D [05/Sep/2012:23:11:53 -0500] [Job 21] envp[6]="CUPS_SERVERROOT=/etc/cups"
    D [05/Sep/2012:23:11:53 -0500] [Job 21] envp[7]="CUPS_STATEDIR=/var/run/cups"
    D [05/Sep/2012:23:11:53 -0500] [Job 21] envp[8]="HOME=/var/spool/cups/tmp"
    D [05/Sep/2012:23:11:53 -0500] [Job 21] envp[9]="PATH=/usr/lib/cups/filter:/usr/bin:/usr/sbin:/bin:/usr/bin"
    D [05/Sep/2012:23:11:53 -0500] [Job 21] envp[10]="SERVER_ADMIN=root@galaxy"
    D [05/Sep/2012:23:11:53 -0500] [Job 21] envp[11]="SOFTWARE=CUPS/1.6.1"
    D [05/Sep/2012:23:11:53 -0500] [Job 21] envp[12]="TMPDIR=/var/spool/cups/tmp"
    D [05/Sep/2012:23:11:53 -0500] [Job 21] envp[13]="USER=root"
    D [05/Sep/2012:23:11:53 -0500] [Job 21] envp[14]="CUPS_MAX_MESSAGE=2047"
    D [05/Sep/2012:23:11:53 -0500] [Job 21] envp[15]="CUPS_SERVER=/var/run/cups/cups.sock"
    D [05/Sep/2012:23:11:53 -0500] [Job 21] envp[16]="CUPS_ENCRYPTION=IfRequested"
    D [05/Sep/2012:23:11:53 -0500] [Job 21] envp[17]="IPP_PORT=631"
    D [05/Sep/2012:23:11:53 -0500] [Job 21] envp[18]="CHARSET=utf-8"
    D [05/Sep/2012:23:11:53 -0500] [Job 21] envp[19]="LANG=en_US.UTF-8"
    D [05/Sep/2012:23:11:53 -0500] [Job 21] envp[20]="PPD=/etc/cups/ppd/Lexmark_2600_Series.ppd"
    D [05/Sep/2012:23:11:53 -0500] [Job 21] envp[21]="RIP_MAX_CACHE=128m"
    D [05/Sep/2012:23:11:53 -0500] [Job 21] envp[22]="CONTENT_TYPE=application/pdf"
    D [05/Sep/2012:23:11:53 -0500] [Job 21] envp[23]="DEVICE_URI=lxkusb://Lexmark/2600%20Series"
    D [05/Sep/2012:23:11:53 -0500] [Job 21] envp[24]="PRINTER_INFO=Lexmark 2600 Series"
    D [05/Sep/2012:23:11:53 -0500] [Job 21] envp[25]="PRINTER_LOCATION="
    D [05/Sep/2012:23:11:53 -0500] [Job 21] envp[26]="PRINTER=Lexmark_2600_Series"
    D [05/Sep/2012:23:11:53 -0500] [Job 21] envp[27]="PRINTER_STATE_REASONS=none"
    D [05/Sep/2012:23:11:53 -0500] [Job 21] envp[28]="CUPS_FILETYPE=document"
    D [05/Sep/2012:23:11:53 -0500] [Job 21] envp[29]="FINAL_CONTENT_TYPE=printer/Lexmark_2600_Series"
    D [05/Sep/2012:23:11:53 -0500] [Job 21] envp[30]="AUTH_INFO_REQUIRED=none"
    D [05/Sep/2012:23:11:53 -0500] [Job 21] Start rendering...
    D [05/Sep/2012:23:11:53 -0500] [Job 21] Processing page 1...
    D [05/Sep/2012:23:11:53 -0500] [Job 21] PID 4034 (/usr/lib/cups/filter/gstoraster) stopped with status 13. ************************ is this important?
    D [05/Sep/2012:23:11:53 -0500] [Job 21] Hint: Try setting the LogLevel to "debug" to find out more.
    D [05/Sep/2012:23:11:53 -0500] [Job 21] End of messages
    D [05/Sep/2012:23:11:53 -0500] [Job 21] printer-state=3(idle)
    D [05/Sep/2012:23:11:53 -0500] [Job 21] printer-state-message="Processing page 1..."
    D [05/Sep/2012:23:11:53 -0500] [Job 21] printer-state-reasons=none
    I am using CUPS 1.6.2, kernel 3.5.3, KDE 4.9 (if it makes any difference). I tried downloading the Lexmark drivers from the web, no joy. In fact, after being prompted to plug the printer into the computer, the driver installation seems to freeze (it does not respond to me connecting the printer via USB).
    I also tried adding the printer through the CUPS web interface, same thing. The "state" says "stopped: filter failed".
    Any help?

    Bump?
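    Not a definitive answer, but the log a few lines earlier also shows the vendor filter exiting with status 113 (Permission denied), which is worth ruling out before worrying about gstoraster's status 13. A quick permission check on the device node and the Lexmark filter (paths taken from the log above; Linux-only sketch):

    # perm_check.py - show ownership and mode of the printer device node and the vendor filter
    import grp, os, pwd, stat

    PATHS = ["/dev/usb/lp1", "/usr/local/lexmark/lxk08/bin/printdriver"]

    for path in PATHS:
        try:
            st = os.stat(path)
        except FileNotFoundError:
            print(path + ": missing")
            continue
        print("%s: mode=%s owner=%s group=%s" % (
            path,
            stat.filemode(st.st_mode),
            pwd.getpwuid(st.st_uid).pw_name,
            grp.getgrgid(st.st_gid).gr_name))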

  • How to process a PDF file in CloverETL

    Hi,
    I want to process a PDF document in CloverETL Data Integrator. I have created a sample project and an ETL graph with a UniversalDataReader, and I have imported the PDF file. When opening the metadata information it shows an encoding/data format and invalid delimiter error, and when running, an error appears in the console.
    Please advise how to process a PDF file with an unstructured data format.
    I am getting the error below:
    ERROR [WatchDog] - Graph execution finished with error
    ERROR [WatchDog] - Node DATA_READER0 finished with status: ERROR caused by: Parsing error: Unexpected record delimiter, probably record has too few fields. in field # 1 of record # 2, value: '<Raw record data is not available, please turn on verbose mode.>'
    ERROR [WatchDog] - Node DATA_READER0 error details:
    org.jetel.exception.BadDataFormatException: Parsing error: Unexpected record delimiter, probably record has too few fields. in field # 1 of record # 2, value: '<Raw record data is not available, please turn on verbose mode.>'
         at org.jetel.data.parser.DataParser.parsingErrorFound(DataParser.java:527)
         at org.jetel.data.parser.DataParser.parseNext(DataParser.java:437)
         at org.jetel.data.parser.DataParser.getNext(DataParser.java:168)
         at org.jetel.util.MultiFileReader.getNext(MultiFileReader.java:415)
         at org.jetel.component.DataReader.execute(DataReader.java:261)
         at org.jetel.graph.Node.run(Node.java:425)
         at java.lang.Thread.run(Thread.java:619)
    Please, can anyone help me?
    Thanks
    Rajini C
    Edited by: 954486 on Sep 19, 2012 11:19 PM

    There is a separate forum for the BI/Information Discovery application of Endeca software: Endeca Information Discovery. You should post your message there.
    Thanks.
    Sean
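    Independently of which forum this belongs in: a UniversalDataReader expects delimited or fixed-length text, so a PDF has to be converted to plain text before CloverETL can parse it. A minimal pre-processing sketch using the third-party pypdf package (install with "pip install pypdf"; the file names are placeholders):

    # pdf_to_text.py - extract plain text from a PDF so a flat-file reader can handle it
    from pypdf import PdfReader

    reader = PdfReader("invoice.pdf")                 # input PDF (placeholder name)
    with open("invoice.txt", "w", encoding="utf-8") as out:
        for page in reader.pages:
            out.write(page.extract_text() or "")      # extract_text() can return None for empty pages
            out.write("\n")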

  • File processing in FIle adapter

    Hi,
    I have configured a sender file adapter (FTP) in my scenario which picks the file from an FTP folder and sends it to JDBC.
    Now I have 2 flat files in my FTP folder (yesterday's and today's); the file names are test_07062010 and test_08062010.
    My requirement is that while processing the files, the file adapter has to process test_07062010 first and then test_08062010. But right now it is processing test_08062010 first and then test_07062010, so yesterday's data is overwriting today's data in the database.
    I know we have such an option in the file sender adapter (NFS)... but I need to do this with the FTP file sender. Please let me know how to proceed.
    Thanks in Advance.

    Hi Praveen... thanks for the reply.
    My communication channel is active every day, but sometimes, due to the PI server being down or some maintenance, we stop the channels.
    During that time the file will not be picked up by the file sender and it will stay on the FTP server; then the next day one more file is generated on the FTP server.
    So my problem is that while processing these files my sender file adapter processes today's file first and then yesterday's, and because of this yesterday's data overwrites today's data.
    Can you tell me how to handle that in the file adapter?
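    Whatever mechanism ends up doing the ordering, the logic is simply to sort by the date embedded in the filename and hand over the oldest file first. A generic sketch of that ordering (the directory and pattern are assumptions; the names test_07062010 / test_08062010 are read as DDMMYYYY):

    # oldest_first.py - order test_DDMMYYYY files by their embedded date, oldest first
    import os, re
    from datetime import datetime

    INBOX = "/ftp/inbox"                      # assumption
    PATTERN = re.compile(r"test_(\d{8})$")    # e.g. test_07062010 -> 7 June 2010

    def file_date(name):
        match = PATTERN.match(name)
        return datetime.strptime(match.group(1), "%d%m%Y") if match else datetime.max

    for name in sorted(os.listdir(INBOX), key=file_date):
        print("process", name)                # oldest first, so old data never overwrites new data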

  • How to stop process chain, if it is taking too much time than expected.

    Sometimes a process chain takes much more time to finish than expected. How can I stop the process chain and execute it again?
    Thanks in Advance.
    Harman

    How can I stop the process chain?
    If the job is running for a long time:
    1) Go to RSMO and SM37 and check the long-running job there.
    2) There you can see the status of the job.
    3) If the job is still running you can kill that job.
    4) Delete the failed request from the data target.
    For more details go to the link below:
    how to stop process chain if it yellow for long time
    How can I execute it again?
    Go to function module RSPC_API_CHAIN_START, give your process chain name there, and execute.
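    If the function module is remote-enabled (check in SE37), the chain start can also be triggered from outside the GUI. A heavily hedged sketch using the PyRFC library; the connection details are placeholders, and the importing parameter name I_CHAIN is an assumption to verify against the FM interface first:

    # start_chain.py - call RSPC_API_CHAIN_START via RFC (sketch only)
    from pyrfc import Connection   # requires the SAP NW RFC SDK and the pyrfc package

    conn = Connection(ashost="bwhost", sysnr="00", client="100",
                      user="RFC_USER", passwd="secret")              # placeholders
    result = conn.call("RSPC_API_CHAIN_START", I_CHAIN="ZMY_CHAIN")  # parameter name assumed
    print(result)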

  • Process Multiple Files in PSE 7

    I'm running PSE7.
    I'd like to take a group of images (JPEGs) and do the following to them:
    Run QuickFix on them.
    Save them as a new file with the name based on the original name plus some extra information.
    Resize the JPEGs to a size suitable for the web, or possibly a little larger.
    It would really be great if:
    I could use a 'tag' to select the images I want to process.
    The tags on the new images are the same as the originals.
    The new files are written into the same folder as the originals, even if multiple folders are needed due to the tags bringing in images from multiple folders.
    The new images would become part of a version set or stack along with the originals.
    Well, that's what I'm interested in doing.
    Any thoughts would be greatly appreciated!

    The Process Multiple Files command applies settings to a folder of files. If you have a digital camera or a scanner with a document feeder, you can also import and process multiple images. (Your scanner or digital camera may need an acquire plug‑in module that supports actions.)
    When processing files, you can leave all the files open, close and save the changes to the original files, or save modified versions of the files to a new location (leaving the originals unchanged). If you are saving the processed files to a new location, you may want to create a new folder for the processed files before starting the batch.
    Note: The Process Multiple Files command does not work on multiple page files.
    Choose File > Process Multiple Files.
    Choose the files to process from the Process Files From pop‑up menu:
    Folder
    Processes files in a folder you specify. Click Browse to locate and select the folder.
    Import
    Processes images from a digital camera or scanner.
    Opened Files
    Processes all open files.
    Select Include All Subfolders if you want to process files in subdirectories of the specified folder.
    For Destination, click Browse and select a folder location for the processed files.
    If you chose Folder as the destination, specify a file-naming convention and select file compatibility options for the processed files:
    For Rename Files, select elements from the pop‑up menus or enter text into the fields to be combined into the default names for all files. The fields let you change the order and formatting of the components of the filename. You must include at least one field that is unique for every file (for example, filename, serial number, or serial letter) to prevent files from overwriting each other. Starting Serial Number specifies the starting number for any serial number fields. If you select Serial Letter from the pop-up menu, serial letter fields always start with the letter “A” for the first file.
    For Compatibility, choose Windows, Mac OS, and UNIX® to make filenames compatible with the Windows, Mac OS, and UNIX operating systems.
    Under Image Size, select Resize Images if you want each processed file resized to a uniform size. Then type in a width and height for the photos, and choose an option from the Resolution menu. Select Constrain Proportions to keep the width and height proportional.
    To apply an automatic adjustment to the images, select an option from the Quick Fix panel.
    To attach a label to the images, choose an option from the Labels menu, then customize the text, text position, font, size, opacity, and color. (To change the text color, click the color swatch and choose a new color from the Color Picker.)
    Select Log Errors That Result From Processing Files to record each error in a file without stopping the process. If errors are logged to a file, a message appears after processing. To review the error file, open with a text editor after the Batch command has run.
    Click OK to process and save the files.

  • Generation of MMON process trace files in large file size (GB Size)

    Hi,
    I have created a database using DBCA on the Windows platform. A few days ago I found that MMON process trace files are being generated in the BDUMP directory. The files start out at MB size and grow up to GB size. I know that background process trace files cannot be disabled, so now I am forced to manually delete these files from the bdump directory. Please help me to resolve this issue.
    I have checked and verified the SGA size, shared pool size and other memory areas.
    The statistics level is Typical as well.
    But still the files are generated.
    Please help.
    Shiyas

    Hi,
    As per your instruction I have checked the alert log file. I have pasted part of the errors found in the alert log file.
    Mon Jun 07 09:30:58 2010
    Errors in file d:\oracle\product\10.2.0\admin\mir\bdump\mir_mmon_652.trc:
    ORA-00600: internal error code, arguments: [kjhn_post_ha_alert0-862], [], [], [], [], [], [], []
    Mon Jun 07 09:31:02 2010
    Errors in file d:\oracle\product\10.2.0\admin\mir\bdump\mir_mmon_652.trc:
    ORA-00600: internal error code, arguments: [kjhn_post_ha_alert0-862], [], [], [], [], [], [], []
    Mon Jun 07 09:36:00 2010
    Errors in file d:\oracle\product\10.2.0\admin\mir\bdump\mir_mmon_652.trc:
    ORA-00600: internal error code, arguments: [kjhn_post_ha_alert0-862], [], [], [], [], [], [], []
    Mon Jun 07 09:36:08 2010
    Restarting dead background process MMON
    MMON started with pid=11, OS id=656
    Mon Jun 07 09:36:11 2010
    Errors in file d:\oracle\product\10.2.0\admin\mir\bdump\mir_mmon_656.trc:
    ORA-00600: internal error code, arguments: [kjhn_post_ha_alert0-862], [], [], [], [], [], [], []
    Mon Jun 07 09:36:15 2010
    Errors in file d:\oracle\product\10.2.0\admin\mir\bdump\mir_mmon_656.trc:
    ORA-00600: internal error code, arguments: [kjhn_post_ha_alert0-862], [], [], [], [], [], [], []
    Mon Jun 07 09:41:12 2010
    Errors in file d:\oracle\product\10.2.0\admin\mir\bdump\mir_mmon_656.trc:
    ORA-00600: internal error code, arguments: [kjhn_post_ha_alert0-862], [], [], [], [], [], [], []
    Mon Jun 07 09:41:16 2010
    Errors in file d:\oracle\product\10.2.0\admin\mir\bdump\mir_mmon_656.trc:
    ORA-00600: internal error code, arguments: [kjhn_post_ha_alert0-862], [], [], [], [], [], [], []
    Mon Jun 07 09:46:13 2010
    Errors in file d:\oracle\product\10.2.0\admin\mir\bdump\mir_mmon_656.trc:
    ORA-00600: internal error code, arguments: [kjhn_post_ha_alert0-862], [], [], [], [], [], [], []
    Mon Jun 07 09:46:17 2010
    Errors in file d:\oracle\product\10.2.0\admin\mir\bdump\mir_mmon_656.trc:
    ORA-00600: internal error code, arguments: [kjhn_post_ha_alert0-862], [], [], [], [], [], [], []
    Mon Jun 07 09:50:18 2010
    Shutting down instance: further logons disabled
    Mon Jun 07 09:50:19 2010
    Stopping background process QMNC
    Mon Jun 07 09:50:19 2010
    Stopping background process CJQ0
    Mon Jun 07 09:50:20 2010
    Stopping background process MMNL
    Mon Jun 07 09:50:21 2010
    Stopping background process MMON
    Mon Jun 07 09:50:22 2010
    Shutting down instance (immediate)
    License high water mark = 4
    Mon Jun 07 09:50:22 2010
    Stopping Job queue slave processes, flags = 7
    Mon Jun 07 09:50:22 2010
    Job queue slave processes stopped
    Waiting for dispatcher 'D000' to shutdown
    All dispatchers and shared servers shutdown
    Mon Jun 07 09:50:23 2010
    alter database close normal
    Mon Jun 07 09:50:23 2010
    SMON: disabling tx recovery
    SMON: disabling cache recovery
    Mon Jun 07 09:50:23 2010
    Shutting down archive processes
    Archiving is disabled
    Archive process shutdown avoided: 0 active
    Thread 1 closed at log sequence 71
    Successful close of redo thread 1
    Mon Jun 07 09:50:23 2010
    Completed: alter database close normal
    Mon Jun 07 09:50:23 2010
    alter database dismount
    Completed: alter database dismount
    ARCH: Archival disabled due to shutdown: 1089
    Shutting down archive processes
    Archiving is disabled
    Archive process shutdown avoided: 0 active
    ARCH: Archival disabled due to shutdown: 1089
    Shutting down archive processes
    Archiving is disabled
    Archive process shutdown avoided: 0 active
    But I am not able to understand any of the above.
    And I am sorry, we don't have Metalink support or SR support.
    Is there any other way to resolve this issue?
    Shiyas
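    As a stopgap while the ORA-00600 itself is investigated (it needs a patch or Oracle Support; the cleanup below does not replace that), the bdump directory can be kept from filling up with a small scheduled purge. The path is taken from the alert log above; the age limit is an assumption:

    # purge_mmon_traces.py - delete old MMON trace files from bdump (stopgap only)
    import os, time

    BDUMP = r"d:\oracle\product\10.2.0\admin\mir\bdump"   # path from the alert log
    MAX_AGE_DAYS = 7                                      # assumption

    now = time.time()
    for name in os.listdir(BDUMP):
        if "_mmon_" not in name or not name.endswith(".trc"):
            continue
        path = os.path.join(BDUMP, name)
        if now - os.path.getmtime(path) > MAX_AGE_DAYS * 86400:
            os.remove(path)
            print("removed", path)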

  • How to process a file without any CSV, delimiter, fixed length or copybook

    Hi Team,
    I need to process the file below in OSB and send mails to the concerned IDs...
    This file will have either one mail or multiple mails.
    sample.txt file with 1 mail content
    ======================================
    START
    [email protected]
    [email protected], [email protected],
    END
    Subject : CAL IND Renege #00424523 Hse580 CTH580
    BODY:
    User_ID: LARRY014
    XXX Hse/Customer # : 580/1196310
    X12 Order Number: 580094624
    Customer E-Mail: [email protected]
    Customer E-Mail 2: [email protected]
    Customer Phone : 909312345
    Dear Salesperson,
    mysupply.com Order # : 00424523
    mysupply.com User ID : LARRY014
    Customer CALIFORNIA STEEL IND has entered order 00424523
    through mysupply.com.
    THIS ORDER HAS RENEGED for the following reason(S):
    I. ORDER LEVEL
    NOTE SEGMENTS FOUND IN INPUT - SENTRY
    CDF REQUIRED CUSTOMER - ORDER RENEGED
    II. ITEM/LINE LEVEL
    LINE # ECOM LINE NAED QTY STATUS ALLOW SUBS
    Please resolve the renege and release the order in Sentry
    01 as soon as possible. Thank you.
    EMAIL-END
    ====================================
    Please help me with how to process this file and send mail to the concerned people, as it has neither a CSV nor a fixed-length structure, nor is it a COBOL copybook, nor is there a DTD to convert it with.
    Thanks
    Reddy
    Edited by: KiranReddy on Feb 3, 2012 9:52 PM

    You shouldn't need a CSV; if you want a fixed-length file you need something like:
    read:
    <xsd:element name="C1" type="xsd:string" nxsd:style="fixedLength" nxsd:length="1" nxsd:paddedBy=" " nxsd:padStyle="tail" />
    <xsd:element name="C2" type="xsd:string" nxsd:style="fixedLength" nxsd:length="xx" nxsd:paddedBy=" " nxsd:padStyle="tail" />
    write:
    <xsd:element name="C1" type="xsd:string" nxsd:style="fixedLength" nxsd:length="1" nxsd:paddedBy=" " nxsd:padStyle="tail" />
    <xsd:element name="C2" type="xsd:string" nxsd:style="fixedLength" nxsd:length="xx" nxsd:paddedBy=" " nxsd:padStyle="tail" />
    <xsd:element name="C3" type="xsd:string" nxsd:style="fixedLength" nxsd:length="5" nxsd:paddedBy=" " nxsd:padStyle="tail" />xx stands for the length of your line
    hope this makes sense
    cheers
    James
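    For comparison, outside OSB the same file is easy to handle with a small script, since the layout in the sample is marker-based (START/END around the addresses, then Subject, BODY: and EMAIL-END). Below is a sketch that parses one block and hands it to a local SMTP relay; the layout is inferred from the sample above, it handles a single message only, and the relay host is an assumption:

    # parse_and_mail.py - parse one START/END + Subject/BODY block and send it as mail
    import smtplib
    from email.message import EmailMessage

    def parse_message(text):
        lines = [line.strip() for line in text.splitlines()]
        header = lines[lines.index("START") + 1 : lines.index("END")]
        sender = header[0]                                   # first line after START (inferred)
        recipients = " ".join(header[1:]).replace(",", " ").split()
        subject = next(l.split(":", 1)[1].strip() for l in lines if l.startswith("Subject"))
        body = "\n".join(lines[lines.index("BODY:") + 1 : lines.index("EMAIL-END")])
        return sender, recipients, subject, body

    with open("sample.txt") as fh:
        sender, recipients, subject, body = parse_message(fh.read())

    msg = EmailMessage()
    msg["From"], msg["To"], msg["Subject"] = sender, ", ".join(recipients), subject
    msg.set_content(body)
    with smtplib.SMTP("localhost") as smtp:                  # assumes a reachable SMTP relay
        smtp.send_message(msg)

    A file with multiple messages would just need the parsing wrapped in a loop over each START...EMAIL-END block.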

  • Agent Unreachable, collection status: file handles exhausted

    Hi, I have a problem with the management agent. The status of the agent in Grid Control is Agent Unreachable. Here is the output of emctl status agent:
    Oracle Enterprise Manager 10g Release 5 Grid Control 10.2.0.5.0.
    Copyright (c) 1996, 2009 Oracle Corporation. All rights reserved.
    Agent Version : 10.2.0.5.0
    OMS Version : 10.2.0.5.0
    Protocol Version : 10.2.0.5.0
    Agent Home : /opt/oracle/agent10g
    Agent binaries : /opt/oracle/agent10g
    Agent Process ID : 26832
    Parent Process ID : 26821
    Agent URL : https://bs11.xxxx.lan:3873/emd/main/
    Repository URL : https://gridcontrol.xxxx.lan:1159/em/upload
    Started at : 2010-07-06 14:24:30
    Started by user : oracle
    Last Reload : 2010-07-06 14:31:50
    Last successful upload : 2010-07-06 15:11:47
    Total Megabytes of XML files uploaded so far : 51.87
    Number of XML files pending upload : 4
    Size of XML files pending upload(MB) : 0.01
    Available disk space on upload filesystem : 60.35%
    Collection Status : File handles exhausted
    Last successful heartbeat to OMS : 2010-07-06 15:11:52
    I wonder what this message means: "Collection Status : File handles exhausted". I can't find any solution to my problem. I tried restarting the agent, clearstate, upload, resynchronization... all of this did nothing.

    Did you already check: Master Note for 10g Enterprise Manager Grid Control Agent Performance & Core Dump issues [ID 1087997.1]
    On http://support.oracle.com
    https://support.oracle.com/CSP/ui/flash.html#tab=KBHome%28page=KBHome&id=%28%29%29,%28page=KBNavigator&id=%28bmDocTitle=Master%20Note%20for%2010g%20Enterprise%20Manager%20Grid%20Control%20Agent%20Performance%20&%20Core%20Dump%20issues&bmDocDsrc=KB&bmDocType=BULLETIN&bmDocID=1087997.1&viewingMode=1143&from=BOOKMARK%29%29
    Regards
    Rob
    http://oemgc.wordpress.com
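    One quick way to see whether the agent process really is out of file handles is to compare its open descriptors against its limit (a Linux-only sketch using /proc; run it as the agent owner or root, passing the "Agent Process ID" shown by "emctl status agent"):

    # fd_usage.py - compare a process's open file descriptors against its "Max open files" limit
    import os, sys

    pid = sys.argv[1] if len(sys.argv) > 1 else "26832"   # PID taken from the status output above
    open_fds = len(os.listdir("/proc/%s/fd" % pid))
    with open("/proc/%s/limits" % pid) as fh:
        limit_line = next(l for l in fh if l.startswith("Max open files"))
    print("open fds:", open_fds)
    print(limit_line.strip())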

Maybe you are looking for

  • All Day Events shift order randomly when printing!

    When trying to print in month view, my all day events shift order within their day in a seemingly random way. This is a problem because I want certain all day events to line up next to each other day after day in a specific color coded way. This has

  • Call Webgui from WD application with only one log on

    Hi In my first application using WD ABAP, requirement is to call WEBGUI from the application. Calling WEBGUI is working fine but while calling WEBGUI the system asks for another log-on which is not required. When the application first executes it ask

  • Exchange and MobileMe on Same Phone

    I've read several topics on here and I'm confused and it sounds like some people are making it work and some are not. I have an iPhone 3G with the updated software, a mobile me account for home (e-mail, address book, and iCal on the mac that syncs to

  • Problems with ads sp14

    hello, we have programmed interactive forms with webdynpro for abap using netweaver ads sp12. Now we have upgrade to ads sp14 and have with the the same webdynpros only errors. Are there new configuration-rules on sp14 in comparance to sp12? Best reg

  • How can i rotate a drawing

    How can I rotate a drawing