No error generated if batch process load file uses wrong naming convention

Another interesting one...
When using the batch processing functionality of FDM, which can be executed by either:
- FDM Workbench (manually)
- Hyperion FDM Task Manager (scheduled)
- upsShell.exe (scheduled and executed from a batch script)
..., one has to name the data file (to be loaded) using a specific file naming convention (e.g. the "A~LOCATION~CATEGORY~PERIOD~RA.csv" format). However, if the file is not named correctly and is then processed via any of the above three methods, FDM happily moves the file out of the OpenBatch folder and into a new folder, but the file is not loaded, as FDM does not know where to map it (as expected). However, no errors are written to Outbox\Logs\<username>.err to inform the user, so one is none the wiser that anything has gone wrong!
When using FDM Workbench, an error is displayed on the screen (POV - "Batch Completed With Errors, ( 1 ) Files Contained Errors"), but this is the only indication of any error. And normally, one would be scheduling the load using upsShell.exe or Hyperion FDM Task Manager anyway...
Has anyone else noticed this, or am I doing something wrong here? :-)

Yes, as per my original post the only feedback on any POV errors appears to be when using the FDM Workbench Batch Processing GUI.
Regarding the "Batch Process Report in FDM" you mentioned, are you referring to Analysis | Timeline accessible via FDM web client? Unfortunately this does not appear to provide much in the way of detail or errors, only general events that occurred. I cannot locate any batch process report, other than the log output I defined when calling upsShell.exe. However, this contains no POV errors...
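Since neither upsShell.exe nor Hyperion FDM Task Manager surfaces these POV errors, one workaround is a pre-flight check that flags misnamed files before the batch runs. A minimal sketch, assuming only the five tilde-separated segments of the "A~LOCATION~CATEGORY~PERIOD~RA.csv" example above (the exact segment rules vary by FDM version and load method, so treat the pattern as illustrative):

```python
from pathlib import Path

def is_valid_batch_name(filename):
    """True if the name has five '~'-separated segments and a csv/txt extension."""
    stem, _, ext = filename.rpartition(".")
    return ext.lower() in ("csv", "txt") and len(stem.split("~")) == 5

def misnamed_files(openbatch_dir):
    """List files in the OpenBatch folder that break the convention."""
    return [p.name for p in sorted(Path(openbatch_dir).iterdir())
            if p.is_file() and not is_valid_batch_name(p.name)]
```

Running such a check from the same batch script that calls upsShell.exe would at least log the misnamed files before FDM silently moves them.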

Similar Messages

  • Batch processing saves files in wrong folder.

    I ran into a problem the other day, and I believe an item should be added to the Process Multiple Files window, or perhaps it's a bug that should be squashed.
    I have a folder with many subfolders in it, all full of jpegs. I wanted to use the Process Multiple Files function to convert the jpegs into png files and have the png files placed into the same folder as the jpeg it originated from.
    I have the Destination set to Same as Source, but instead of placing the files into the subfolders, all the images go directly into the parent folder. It was much more time-consuming to try to sort them out than to start over and process each folder individually. I should be able to leave the computer unattended and have the files saved into the proper folders; Instead I had to stay close so I could change the Source folder every few minutes.
    I see no reason why there isn't an option to place the converted files into their original folder. Isn't that what the Same as Source setting is supposed to do?

    Sure, if you owned all of those files and were doing it strictly for yourself you could certainly write a plugin to Acrobat (in C/C++) that could do some/all of the things you asked for.

  • Resource error. No batch process available. Process terminated

    Hi all,
    Am getting following error after scheduling the chain in activating the load from ODS(0SAL_DS01)  to another Data Targets.
    Resource error. No batch process available. Process terminated
    Activation of M records from DataStore object 0CRM_CT_I terminated
    any light on this please
    regards
    Subba reddy.

    Hi
    From the description of your problem, I understand that the
    activation has taken too long, and that is the reason for error
    RSODSO_PROCESSING 17. By making some adjustments to the
    DataStore object parameters, you should be able to activate
    the request. Please make the following adjustments in
    transaction RSODSO_SETTINGS (you can adjust this for each
    DataStore object):
    1) Maximum package size (for activation)
       Reduce this to 10000; package processing will then be less
       time-consuming.
    2) Maximum wait time for process (for activation)
       This could be increased to e.g. 600.
    3) Button "change process parameters"
       Here the activation is set to run in background (which is also
       advised by SAP).
    If you want the processes to run in background, you may have to
    increase the number of batch jobs available in your system; otherwise
    the activation cannot run in parallel when no batch jobs are
    available. This can also occur when other programs are using
    batch jobs.
    Also, please check the server group used in RSODSO_SETTINGS;
    you can use transaction RZ12 for this. If you adjust these settings,
    the activation should finish.

  • Error in DTW- Could not load file or assembly 'ADODB'.

    Hello Experts
    I am getting the below error in DTW:
    Could not load file or assembly 'ADODB', Version=7.0.3300.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a, or one of its dependencies. The system cannot find the file specified.
    Please advise.
    Thanks
    Deepak

    Hi Deepak,
    Kindly follow this checklist:
    1. Right-click DTW and run it as Administrator.
    2. Are you using the same DTW version as SAP B1?
    3. Uninstall and re-install DTW.
    4. If you are using the 64-bit DTW, try the 32-bit one.
    5. Check the template: is it of the same DTW version?
    6. Remove all unnecessary columns.
    7. Lastly, try a different template extension (e.g. CSV (comma delimited) or Text (tab delimited)).
    Hope you find this helpful.
    Regards,
    Syed Adnan

  • Processing Large Files using Chunk Mode with ICO

    Hi All,
    I am trying to process large files using ICO. I am on PI 7.3 and am using a new PI 7.3 feature to split the input file into chunks.
    And I know that we cannot use mapping while using chunk mode.
    While trying, I noticed the points below:
    1) I created a Data Type, Message Type and interfaces in ESR and used them in my scenario (no mapping was defined); sender and receiver DT were the same.
    Result: the scenario did not work. It created only one chunk file (.tmp file) and terminated.
    2) I used a dummy interface in my scenario and it worked fine.
    So, please confirm whether we should always use dummy interfaces in a scenario using chunk mode in PI 7.3, or is there something I am missing?
    Thanks in Advance,
    - Pooja.

    Hello,
    According to this blog:
    File/FTP Adapter - Large File Transfer (Chunk Mode)
    The following limitations apply to chunk mode in the File Adapter. As the blog shows, the split never considers the payload; it is just a binary split, so:
    - Only File Sender to File Receiver
    - No mapping
    - No content-based routing
    - No content conversion
    - No custom modules
    You are probably doing content conversion, which is why it is not working.
    Hope this helps,
    Mark
    Edited by: Mark Dihiansan on Mar 5, 2012 12:58 PM

  • Flatfile conversion with output file has a NAMING CONVENTION

    Dear SAP experts,
    I need some advice regarding my scenario.
    I am converting a message into a flat file (a customized .csv).
    But the output .csv flat file must follow a naming convention,
    e.g. Globus_20071020 (CustomerName_YearMonthDate).
    Can somebody give me ideas on what to configure in the File Receiver (FCC) so that the output file follows the naming convention indicated above?
    Or do I need additional configuration?
    Please advise.
    Thank you very much in advance.
    Fred

    Hi,
    You could pass this kind of file name from the mapping at runtime, or
    you could use variable substitution to create the file name per the naming convention, adding the date.
    With variable substitution you could set the file name as Globus_%payload.<Date>%
    and pass the value in the date field of the payload.
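If the date is produced in the mapping rather than via variable substitution, the target name is just the customer name plus the current date. A sketch of the naming logic only (the function name is made up for illustration; in XI this logic would live inside a UDF):

```python
from datetime import date

def flat_file_name(customer, on=None):
    """Build a name like Globus_20071020 (CustomerName_YearMonthDay)."""
    on = on or date.today()
    return f"{customer}_{on:%Y%m%d}"
```

The resulting string would then be pushed into the FileName adapter-specific attribute, as the steps below describe.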
    Refer
    Variable Substitution
    How to use Variable substituion
    /people/sameer.shadab/blog/2005/09/23/an-interesting-usage-of-variable-substitution-in-xi
    /people/sravya.talanki2/blog/2005/08/11/solution-to-the-problem-encountered-using-variable-substitution-with-xi-sp12
    how to use attributes in variable substitution???:(
    Dynamic file name
    /people/jayakrishnan.nair/blog/2005/06/20/dynamic-file-name-using-xi-30-sp12-part--i --> Dynamic File Name using XI 3.0 SP12 Part – I
    /people/jayakrishnan.nair/blog/2005/06/28/dynamic-file-namexslt-mapping-with-java-enhancement-using-xi-30-sp12-part-ii --> Dynamic file name(XSLT Mapping with Java Enhancement) using XI 3.0 SP12 Part -II
    Dynamic File name in File adapter
    /people/michal.krawczyk2/blog/2005/11/10/xi-the-same-filename-from-a-sender-to-a-receiver-file-adapter--sp14
    1. In the sender file adapter , select Adapter Specific Attributes --> FileName.
    2. Use the code in this link to read the filename inside a UDF in your mapping.
    DynamicConfiguration conf = (DynamicConfiguration) container
        .getTransformationParameters()
        .get(StreamTransformationConstants.DYNAMIC_CONFIGURATION);
    DynamicConfigurationKey key = DynamicConfigurationKey.create(
        "http://sap.com/xi/XI/System/File",
        "FileName");
    String filename = conf.get(key);
    http://help.sap.com/saphelp_nw2004s/helpdata/en/43/03612cdecc6e76e10000000a422035/content.htm
    Thanks
    Swarup

  • Open Batch Multi Load file type error

    Hi,
    I have been trying to use Open Batch Multiload via FDM Workbench, but encountered an error saying the .csv file type is unknown at the Import stage.
    A couple of things that I have tried:
    - Using Multiload via the FDM Web Interface to load the .csv file: success
    - Using Open Batch Multiload via Workbench to load the .csv file: failed
    - Renaming the .csv file to .txt (without changing the content) and using Open Batch Multiload via Workbench to load the .txt file: success
    It seems that when executing Open Batch Multiload, FDM can read the CSV format but refuses to process the .csv file type.
    Am I missing something when using Open Batch Multiload to load a .csv file?
    Thanks,
    Erico
    *[File Content]*
    LOC1
    Budget
    1/31/2008
    2
    R,M
    Center,Description,ACCouNT,UD1,UD3,UD4,DV,v,v,v,UD2
    Sunnyvale,Description,Sales,GolfBalls,,,Periodic,1001,2001,3000,Customer2
    Sunnyvale,Description,Purchases,GolfBalls,,,Periodic,2001,3001,4000,Customer2
    Sunnyvale,Description,OtherCosts,GolfBalls,,,Periodic,4001,5001,5000,Customer2
    *[Error Message Received]*
    Invalid object Node=ML40942.3981712963_P1?MULTILOADFILE.,CSV_UNKNOWN FILE TYPE IN MULTILOAD BATCH FOR FILE [MULTILOADFILE.CSV]! - 35610
    *[FDM Version]*
    FDM 11.1.2.1

    Kemp2 wrote:
    Hi Erico,
    What is the fix for this issue? I am having the same issue.
    Thx
    Kemp

    Hi Kemp,
    I didn't find a fix for this issue. Since we decided not to use Open Batch Multi Load (though not because of this issue), I stopped researching it.
    But some workaround that you might want to try:
    - Simply have the source file in the .txt file type, or
    - Since open batch uses a script, change the file type from .csv to .txt via script before executing the Multi Load script.
    Hope this helps.
    -Regards

  • Getting an error after executing batch process LTRPRT

    Hi, we are testing how the flat files are created for different customer contacts. For that we ran the LTRPRT batch process,
    and it executed and ended abnormally with an error.
    these are the parameters which were submitted during batch submission
    FILE-PATH=d:\spl
    FIELD-DELIM-SW=Y
    CNTL-REC-SW=N
    This is the following error
    ERROR (com.splwg.base.support.batch.GenericCobolBatchProgram) A non-zero code was returned from call to COBOL batch program CIPCLTPB: 2
    com.splwg.shared.common.LoggedException: A non-zero code was returned from call to COBOL batch program CIPCLTPB: 2
        at com.splwg.shared.common.LoggedException.raised(LoggedException.java:65)
        at com.splwg.base.support.batch.GenericCobolBatchProgram.callCobolInCobolThread(GenericCobolBatchProgram.java:78)
        at com.splwg.base.support.batch.GenericCobolBatchProgram.execute(GenericCobolBatchProgram.java:38)
        at com.splwg.base.support.batch.CobolBatchWork$DoExecuteWorkInSession.doBatchWorkInSession(CobolBatchWork.java:81)
        at com.splwg.base.support.batch.BatchWorkInSessionExecutable.run(BatchWorkInSessionExecutable.java:56)
        at com.splwg.base.support.batch.CobolBatchWork.doExecuteWork(CobolBatchWork.java:54)
        at com.splwg.base.support.grid.AbstractGridWork.executeWork(AbstractGridWork.java:69)
        at com.splwg.base.support.grid.node.SingleThreadedGrid.addToWorkables(SingleThreadedGrid.java:49)
        at com.splwg.base.support.grid.node.AbstractSingleThreadedGrid.processNewWork(AbstractSingleThreadedGrid.java:49)
        at com.splwg.base.api.batch.StandaloneExecuter$ProcessNewWorkExecutable.execute(StandaloneExecuter.java:590)
        at com.splwg.base.support.context.SessionExecutable.doInNewSession(SessionExecutable.java:38)
        at com.splwg.base.api.batch.StandaloneExecuter.submitProcess(StandaloneExecuter.java:188)
        at com.splwg.base.api.batch.StandaloneExecuter.runOnGrid(StandaloneExecuter.java:153)
        at com.splwg.base.api.batch.StandaloneExecuter.run(StandaloneExecuter.java:137)
        at com.splwg.base.api.batch.StandaloneExecuter.main(StandaloneExecuter.java:357)
    18:24:14,652 [main] ERROR (com.splwg.base.support.grid.node.SingleThreadedGrid) Exception trapped in the highest level of a distributed execution context.
    what could be the reason?

    You need to specify an appropriate folder on the server,
    e.g. /spl/sploutput/letterextract/

  • I am trying to restore bookmarks to Version 5 from a backup I made from Version 4.+. I get an error message "Unable to process backup file." What's wrong?

    I made a backup of my bookmarks when I was running Firefox version 4 (stored in a "json" file on a thumb drive). My primary hard disk crashed, and when I downloaded Firefox onto a new hard drive, I got version 5.0. When I tried to restore the saved bookmarks, I got an error message that says "Unable to process backup file."

    A possible cause is a problem with the file places.sqlite that stores the bookmarks and the history.
    * http://kb.mozillazine.org/Bookmarks_history_and_toolbar_buttons_not_working_-_Firefox
    * http://kb.mozillazine.org/Unable_to_process_the_backup_file_-_Firefox

  • Optimal workflow for batch processing LR files with Nik plug ins

    I have a series of 40 or so landscape images that I would like to efficiently apply a couple of Nik filters to using Color Efex.  I'd appreciate any suggestions re: the most efficient workflow to accomplish this task in a batch processing approach, as my current approach of taking each image into Photoshop, while preserving Smart Filter benefits, is terribly inefficient.  Thanks in advance.

    Hi,
    Sorry, but File->Process Multiple Files is the only functionality with which you can batch process your images in PSE. However, you can add an action in Window->Action Player and apply it to the images opened in your workspace:
    http://tv.adobe.com/watch/learn-photoshop-elements-11/adding-actions/
    http://forums.adobe.com/message/4736578
    The functionality is available in the full product, Photoshop, via File->Automate->Batch, where you can load actions in the Actions panel and they will appear in that dialog box; this is not available in Elements.
    The best approach here is to load actions in Window->Action Player, open all your images and click the play button one by one, and then use Process Multiple Files to save them all in one go per your preferences for file type, size, name, etc.
    Thanks,
    Garry

  • Batch processing different files or not?

    Hello, I would like to know if it's possible to use the batch processing feature with Acrobat Standard version 9.
    I want to secure a bunch of PDF files based on their filenames. What's the best version to deploy for this, Standard or Professional? I'd like something shell-like where I can define a password per file and establish a security policy.
    Do I need the SDK or API, or only Acrobat Professional?
    How can I accomplish this?
    How would you do it?
    Thanks

    Hello, OK, then I must acquire the Professional version. Even so, I'd like to know whether it's possible to secure the collection of files with different passwords in a single process some other way, e.g. via the SDK (which is free now), or whether the best option is batch processing with the Professional edition.
    In other words, are there other ways to do it that I haven't contemplated yet?
    Thanks!

  • Reducing file size in batch processes (multiple files)

    I am now using a Canon XTi 10.5-megapixel camera. I'm loving it, taking tons of pics, and watching the memory on my computer shrink rapidly. I'm not too worried about hard drive space, as I can simply buy another hard drive. Where I'm running into problems is on my iPod. I have an 80 GB video iPod. It is suddenly full, almost 1/2 with audio, 1/2 with photos. I'm realizing that I need to take the 5-7 megabyte photo files in iPhoto and reduce them down a bit in dimensions and/or JPEG compression.
    I use photoshop elements and am familiar with the multiple files option. I've used that a bit, nice feature.
    Heres my question:
    I know iPhoto is sensitive to manipulation of photos/changes if the changes aren't done through its interface. How can I go about taking the original pics for a number of events (talking about perhaps 2000 pics right now) and batch process them down in size a bit without weirding out iPhoto?
    Does iPhoto have an option to batch change files in such a fashion? Is it as simple as finding the originals in the Finder, simply reducing them in size, and making iPhoto update its thumbnails?
    and how do I make iPhoto update its thumbnails, I forget...
    Thanks

    If you just want displayable image files on the iPod and not the full sized files then a very good solution, IMO, is to export those photos you want on the iPod to a folder on the Desktop. Then use Resize! to batch change the pixel dimension and the jpg compression level of the files. For my iPod Nano I use 640 x 480 and have Resize! compress them to medium. That gives me files in the 100 KB size range. If you want a smaller file just increase the compressions level a bit. Resize! automatically creates a new folder for the resized images so you can try different levels on the same folder. You can get a lot of photos on your iPod at that size. I then put that folder of resized files in my Pictures folder and have iTunes use it instead of using iPhoto.
    Do you Twango?
    TIP: For insurance against the iPhoto database corruption that many users have experienced I recommend making a backup copy of the Library6.iPhoto database file and keep it current. If problems crop up where iPhoto suddenly can't see any photos or thinks there are no photos in the library, replacing the working Library6.iPhoto file with the backup will often get the library back. By keeping it current I mean backup after each import and/or any serious editing or work on books, slideshows, calendars, cards, etc. That insures that if a problem pops up and you do need to replace the database file, you'll retain all those efforts. It doesn't take long to make the backup and it's good insurance.
    I've created an Automator workflow application (requires Tiger), iPhoto dB File Backup, that will copy the selected Library6.iPhoto file from your iPhoto Library folder to the Pictures folder, replacing any previous version of it. It's compatible with iPhoto 08 libraries. iPhoto does not have to be closed to run the application, just idle. You can download it at Toad's Cellar. Be sure to read the Read Me pdf file.

  • Can't batch process opening files with certicate encryption

    Using Acrobat 9.0 pro.
    I am unable to batch process files that are certificate encrypted.
    When batch processed, a dialog message appears that reads:
    Cannot open document
    No files were processed
    I have followed Adobe's instructions for using the associated digital ID and ensuring it is logged in.
    I have also set the batch processing security-method preferences to certificate security, no password, and password security.
    FYI: I don't have LiveCycle, so I can't try that.
    Can anyone help with a solution?
    I was able to create certificate-encrypted files using batch processing, so I can't understand why I now can't open and print files created this way, since Acrobat help indicates this can be done.
    Adobe's instructions indicate this should work. What's wrong here?
    Thanks in advance
    Regards,
    smurphy

    It may have to do with the name of the folder, such as having periods or slashes in the folder name.

  • Batch-process all files in a directory

    I'm relatively new to LabVIEW. I'm using 5.1 and would like to open all the text files in a certain directory and apply my formulas to the data contained within each. How can I batch process a whole directory?

    Hi,
    First of all you need to know the names of the files in your directory: for this purpose use File I/O / Advanced / List Directory.vi. This VI takes a directory name and a pattern (*.txt, for example) and returns an array of paths to files.
    For each path use Read From Spreadsheet File.vi, which takes a file name and returns a 1- or 2-D array of data.
    Now you can apply your formula to the data.

  • Batch Process increases file size from 230kb - 5044kb

    I'm currently using Acrobat 7 Professional.
    I want to use the "Batch" process to re-convert pdf documents that were watermarked using the batch process. We're talking 1000+ documents.
    The reason we need to re-convert them is because the watermarked documents are taking more than 4min/page to print, if it prints at all.
    I've manually re-converted a few of the documents using "print to pdf", kept the file size at 230kb and with no printing issues.
    However, since we have 1000s of these documents to do, most with multiple pages, this would be a very time consuming project.
    I've "Batched" "print to pdf" and it also takes care of our printing issues, however, the file size has jumped to 5044kb. Having literally thousands of files to do - this is an issue on our file server.
    I've done multiple tests with adjusting the pdf settings in the batch process... some work (and again the file size is large) - some don't allow for printing or slow it way down again.
    I understand that it's a "layer" or "transparency" thing that's slowing the printing process and when I mess with it in the batch process, it takes care of the printing issues but not the file size.
    Can someone help me please??? I've been working on this for a couple days now and I'm at my limits.
    How can I batch process that will flatten layers but give me a manageable file size?
    Your help would be GREATLY appreciated - Michelle

    I don't know if it can help here, but don't I recall that wayyyy back in Photoshop 7.0 the last File - Save As you have executed sets the JPEG quality for batch File - Save As commands?  I am not sure when the "Override Action Save-As Commands" became functional, but try this:
    1.  Uncheck "Override Action Save-As Commands".
    2.  Remove the "File - Save As" step you have recorded in your action.  The action should NOT save the file.
    3.  Save any old JPEG file at the quality level you prefer, just to set the "memory" of the quality level.
    4.  Run the File - Automate - Batch, specifying input and output folders there.
    I'll bet it will work.
    -Noel
