How to configure for Multiple XML

Hi,
We were successful in working with a single XML file as input. Now we need to use multiple XML files as input for a single transaction. Following the references and suggestions in the Documaker documentation (Working with XML Files.pdf), we changed the configuration settings in FSISYS.INI and AFGJOB.JDT.
The changes are listed below.
The input extract file has been changed to F_Sch.DAT; its contents are:
COM_LOB_1111 .\INPUT\F_SCH1.xml
COM_LOB_1111 .\INPUT\F_SCH2.xml
COM_LOB_1111 .\INPUT\F_SCH3.xml
Each of F_SCH1.xml, F_SCH2.xml, and F_SCH3.xml contains:
<?xml version="1.0"?>
<global>
     <lm>
          <COM>COM</COM>
          <LOB>LOB</LOB>
          <TRANSACTIONID>121212</TRANSACTIONID>
     </lm>     
     <Foodcard>
          <Accname>Alex</Accname>
          <Email>[email protected]</Email>
          <Accid>1234123412341234</Accid>
          <Accno>1234123412341234</Accno>
          <Accaddr1>address1</Accaddr1>
          <Accaddr2>address2</Accaddr2>
          <Accaddr3>address3</Accaddr3>
     </Foodcard>     
</global>
Changes to FSISYS.INI
< ExtractKeyField >
Key = 1,3
SearchMask = 1,COM_LOB_1111
< TRN_FIELDS >
COM = 1,3,N
LOB = 5,3,N
PolicyNum = 9,3,N
< DATA >
ExtrFile = C:\FAP\DLL\Computer\INPUT\F_Sch.DAT
Changes to AFGJOB.JDT
;ImportXMLFile;2;SCH=1,COM_LOB_1111 15,19
In spite of the changes made, we were not able to generate output using the three XML files as input.
*** Error details ***
[04:16:29PM] Warning: Company - LOB - Transaction
[04:16:29PM] Warning: DM17115: : The DownLoadFAP option is set to Yes, this option should be set to No for optimal performance. Check the RunMode control group, DownLoadFAP option.
[04:16:29PM] Error: Company - LOB - Transaction
[04:16:29PM] Error: DM12041: : FAP library error: Transaction:<>, area:<DXMLoadXMLRecs>
code1:<48>, code2:<0>
msg:<XML Parse Error: The 1 chars before error=<C>, the 20 chars starting at error=< >>.
[04:16:29PM] Error: Company - LOB - Transaction
[04:16:29PM] Error: DM12041: : FAP library error: Transaction:<>, area:<DXMLoadXMLRecs>
code1:<48>, code2:<0>
msg:<syntax error at line 1 column 0>.
[04:16:29PM] Error: Company - LOB - Transaction
[04:16:29PM] Error: DM10292: in <RULXMLExtract()>: Unable to <DXMLoadXMLRecs()>.
[04:16:29PM] Warning: Company - LOB - Transaction
[04:16:29PM] Warning: DM13023: in RCBSendToErrBatch(): Unable to assign the transaction to the error batch. The SentToManualBatch field is not defined in the TrnDfdFile.
[04:16:29PM] Error: Company - LOB - Transaction
[04:16:29PM] Error: DM10947: in NoGenTrnTransactionProc(): Unable to RULLoadXtrRecs(pRPS).
[04:16:51PM] Error: Company - LOB - Transaction
[04:16:51PM] Error: DM12018: in RPDoBaseFormsetRulesForward(): Unable to <WINNOGENTRNTRANSACTIONPROC>().
[04:16:51PM] Error: Company - LOB - Transaction
[04:16:51PM] Error: DM12066: in RPProcessOneTransaction(): Unable to RPDoBaseFormsetRulesForward(pRPS, RP_PRE_PROC_A).
[04:16:51PM] Error: Company - LOB - Transaction
[04:16:51PM] Error: DM12064: in RPProcessTransactions(): Unable to RPProcessOneTransaction(pRPS). Skipping the rest of the transactions for this Base. See INI group:< GenDataStopOn > option:TransactionErrors.
[04:16:51PM] Error: Company - LOB - Transaction
[04:16:51PM] Error: DM12004: in RPProcessOneBase(): Unable to RPProcessTransactions(pRPS).
[04:16:51PM] Error: Company - LOB - Transaction
[04:16:51PM] Error: DM12001: in RPProcessBases(): Unable to RPProcessOneBase(pRPS). Skipping the rest of the Bases for this batch run. See INI group:< GenDataStopOn > option:BaseErrors.
[04:16:51PM] Error: Company - LOB - Transaction
[04:16:51PM] Error: DM12127: in RPRun(): Unable to RPProcessBases(pRPS).
[04:16:51PM] Error: An error occurred during processing.
Kindly let us know if there is a solution for this issue.
Regards,
Balaji R.

It looks like it is unable to load your XML files. Depending on how your configuration is laid out, you may want to list full path names in your extract file - we make use of the same technique in 11.5, which I would imagine isn't terribly different in this respect from 12.0. Our configuration is such that we run Gendata at one level, read an extract file in the deflib folder, and the input files are in an input folder, so each entry in the extract file is C:\fap\mstrres\sampco\input\input1.xml.
Not sure if that is analogous to your setup or not, but based on the errors, either it can't find the file (a pathing issue) or the file contents are invalid.
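To tell those two cases apart quickly, a small standalone check can walk the extract file and attempt to parse each referenced XML file. This is only a diagnostic sketch, not Documaker functionality; the extract-file location and the "key path" entry format are assumptions taken from the post above:

```python
# Diagnostic sketch: for each entry in the extract file, report whether the
# referenced XML file is missing, fails to parse, or is well-formed.
import os
import xml.etree.ElementTree as ET

def check_extract(extract_path):
    results = []
    with open(extract_path) as f:
        for line in f:
            line = line.strip()
            if not line:
                continue
            # Entry format from the post: "<search key> <path-to-xml>"
            parts = line.split(None, 1)
            if len(parts) < 2:
                continue
            xml_path = parts[1]
            if not os.path.isfile(xml_path):
                results.append((xml_path, "missing"))
                continue
            try:
                ET.parse(xml_path)
                results.append((xml_path, "ok"))
            except ET.ParseError as e:
                results.append((xml_path, "parse error: %s" % e))
    return results

if __name__ == "__main__":
    # Hypothetical path -- adjust to where your extract file actually lives.
    for path, status in check_extract(r".\INPUT\F_Sch.DAT"):
        print(path, status)
```

If every file reports "ok" with relative paths resolved from the directory where Gendata runs, the problem is more likely elsewhere in the configuration; a "missing" result points at pathing.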
Tony

Similar Messages

  • How to configure for multiple mixed standbys

    Hi,
    I currently have a Primary database with 2 Physical Standbys set up. It is my intention to convert one of the Standbys to a Logical Standby.
    I am following Oracle Note 738643.1 and CH4 of the 11gR2 Data Guard Concepts and Admin manual
    However, I am a bit confused as to what the log_archive_dest_1 setting should be on my primary and on each of my standbys, given a setup that includes both a physical and a logical standby.
    On my Primary
    ==========
    log_archive_dest_1=' location=USE_DB_RECOVERY_FILE_DEST'
    log_archive_dest_2= 'service=STANDBYL async valid_for=(ONLINE_LOGFILES,PRIMARY_ROLE) db_unique_name=STANDBYL'
    If I plan this primary ever to be a logical standby ( in a switchover etc ) then I need to also add on the Primary Side
    log_archive_dest_3= 'location=USE_DB_RECOVERY_FILE_DEST valid_for=(STANDBY_LOGFILES,STANDBY_ROLE) db_unique_name=PRIMARYP'
    On my first Standby ( currently Physical and will remain so )
    ============
    log_archive_dest_1=' location=USE_DB_RECOVERY_FILE_DEST'
    log_archive_dest_2 ='service=PRIMARYP async valid_for=(ONLINE_LOGFILES,PRIMARY_ROLE) db_unique_name=PRIMARYP'
    On my second Standby ( currently Physical but I am wishing to convert to logical )
    ==============
    log_archive_dest_1= 'location=USE_DB_RECOVERY_FILE_DEST'
    log_archive_dest_2 ='service=PRIMARYP async valid_for=(ONLINE_LOGFILES,PRIMARY_ROLE) db_unique_name=PRIMARYP'
    To allow this Standby to be converted to a logical standby I need to also have
    log_archive_dest_3='location=USE_DB_RECOVERY_FILE_DEST valid_for=(STANDBY_LOGFILES,STANDBY_ROLE) db_unique_name=STANDBYL'
    There appears to be some confusion in the Oracle literature as to what the log_archive_dest_1 should be on the Primary, to facilitate both a physical and a logical standby. For a Logical Standby I have been quoted the following setting ( I think for the primary - though I am not sure ? )
    log_archive_dest_1=' location=USE_DB_RECOVERY_FILE_DEST' VALID_FOR=(ONLINE_LOGFILES,ALL_ROLES) DB_UNIQUE_NAME=PRIMARYP'
    It seems that my current settings that use USE_DB_RECOVERY_FILE_DEST assume the default VALID_FOR=(ALL_LOGFILES,ALL_ROLES) which is ok for Physical Standbys ( as there is no need to separate out Online Redo Logs from Standby Redo Logs ), however this setting is not ok for Logical Standbys. Instead for Logical Standbys I have to use the line above where VALID_FOR=(ONLINE_LOGFILES,ALL_ROLES)
    I am just not precisely sure for my setup exactly where I need to make this change !
    Q1. Is the proposed 'location=USE_DB_RECOVERY_FILE_DEST' VALID_FOR=(ONLINE_LOGFILES,ALL_ROLES) DB_UNIQUE_NAME=PRIMARYP'
    to be set on log_archive_dest_1 on my Primary and/or on log_archive_dest_1 or 3 of the Standby I wish to convert to a logical ?
    Q2. If I am meant to change log_archive_dest_1 on my Primary, how do I ensure the setting is suitable for both Physical and Logical Standbys?
    any clarity greatly appreciated.
    Jim

    I have never done this, but since you posted 12 hours ago with no hits, I will add the link to the white paper on it.
    http://www.oracle.com/us/products/database/maa10gr2multiplestandbybp-1-131937.pdf
    Best Regards
    mseberg
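    For what it's worth, here is a hedged sketch of how the destinations from the question could be set on the primary, using the poster's own names (PRIMARYP, STANDBYL). It only illustrates the VALID_FOR split discussed in the question; verify against the white paper and your environment before use:

```sql
-- Sketch only; names and locations come from the question above.
-- Local archiving of the primary's online redo logs, valid in all roles
-- (replaces the implicit VALID_FOR=(ALL_LOGFILES,ALL_ROLES) default):
ALTER SYSTEM SET log_archive_dest_1='LOCATION=USE_DB_RECOVERY_FILE_DEST VALID_FOR=(ONLINE_LOGFILES,ALL_ROLES) DB_UNIQUE_NAME=PRIMARYP' SCOPE=BOTH;

-- Ship redo to the standby that will become logical:
ALTER SYSTEM SET log_archive_dest_2='SERVICE=STANDBYL ASYNC VALID_FOR=(ONLINE_LOGFILES,PRIMARY_ROLE) DB_UNIQUE_NAME=STANDBYL' SCOPE=BOTH;

-- Archive standby redo logs locally; only active if this database
-- ever runs as a standby itself (e.g. after a switchover):
ALTER SYSTEM SET log_archive_dest_3='LOCATION=USE_DB_RECOVERY_FILE_DEST VALID_FOR=(STANDBY_LOGFILES,STANDBY_ROLE) DB_UNIQUE_NAME=PRIMARYP' SCOPE=BOTH;
```

    The idea is that dest_1 no longer relies on the (ALL_LOGFILES,ALL_ROLES) default, so online redo and standby redo are archived by separate destinations, which is what logical standbys require.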

  • How to execute a MTS (Master Test Script) in SAP ECATT Test Configuration for multiple variants.

    I have a MTS (Master Test Script) which references 4 Test Scripts. As of now, I am able to run this MTS directly by opening it as a Test Script and running it for default test data.
    Now, I want to run this MTS in SAP ECATT Test Configuration for multiple variants/test data. I am not able to see any parameter if I enter MTS in the Test Configuration.
    The thread below is similar to my requirement, but I am not able to understand the solution properly.
    eCATT - how to run multiple test scripts with input variables in different test scripts?
    Any help in this case would be highly appreciated.
    Thanks & Regards,
    Vaibhav Gupta


  • How to configure for the give senario where we from APO demand planning, we provide baseline forecast into TPM

    Hello Gurus,
    Requesting information or documentation on the integration between Trade Promotion Management (TPM) and SCM Demand Planning, from an architecture perspective: how do we configure the scenario where APO Demand Planning provides the baseline forecast to TPM, and promotion values from TPM are fed back into APO Demand Planning? Can anybody give me some insight on this?
    Thanks in advance.
    Kumar

    Hi Praveen
    There are several ways you can connect a DP system to a TPM. How are your interfaces and systems set up?
    Option 1:
    Extract the baseline from APO to a BW cube, extract the data out of BW (perhaps using InfoSpokes/Open Hub), and pass it on to TPM.
    Option 2:
    Have a custom program written in APO-DP to extract the selected key figure data at the aggregation level into an interface or a .txt file in the APO directory, and use a data transfer service such as EDI to send this data to TPM.
    In my experience it is simple to define an integration process; the much larger and more complicated aspects are the date adjustments and the aggregation levels.
    Date adjustments:
    Understand the basis of the data in APO and TPM. APO is usually at the requested delivery date or material availability date (when a customer such as a store wants the product), whereas TPM is usually at the consumption date (when an end customer like you or me buys from the store shelf). Depending on the product and business style this can mean several days' difference. Say it is 25 days on average; then you need to add 25 days to the APO baseline date to get the TPM consumption date.
    Aggregation level:
    This can be by product or by customer. If the planning in DP is at product SKU level and TPM is at pack or brand level, you need to aggregate the data up before loading it into TPM. The same applies to customer.
    To me the most complex part of the integration is understanding and transforming the data in a meaningful way, and having a strategy to sustain the integration by keeping both DP and TPM in sync in terms of master data. It is also important to have a process for identifying exceptions and failures: you send the baseline from DP to TPM, and if it fails, how do you know? How do you troubleshoot?
    Hope these help.

  • How i configure the log4j.xml in weblogic server

    hi
    I wrote a new log4j.xml and put it in WEB-INF/classes, then deployed the JAR to my remote server, but the change is not reflected. [I am not talking about the server logs; we have some APIs deployed on the server, and the log files for those APIs are configured using log4j.xml.]
    I saw there is an option in WebLogic - setDomain - LOG4J_CONFIG_FILE='here we give the path for log4j.xml'. I think it only works for local machines, not for remote machines.
    I don't know how to get the log files on the WebLogic server.

    To use Quartz primarily within an application server environment, include the Quartz JAR within the enterprise application (.ear or .war file). However, if you want to make Quartz available to many applications, simply make sure it's on the classpath of your app server.
    You can also deploy it as a jar file or a library module as required.
    Quartz depends on a number of third-party libraries (in the form of jars) which are included in the distribution .zip file in the 'lib' directory. To use all the features of Quartz, all of these jars must also exist on your classpath.
    -Sandeep
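    Coming back to the original log4j.xml question: a minimal log4j 1.x XML configuration generally looks like the sketch below. The file path, appender name, and package name are placeholders for illustration, not values from the post:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE log4j:configuration SYSTEM "log4j.dtd">
<log4j:configuration xmlns:log4j="http://jakarta.apache.org/log4j/">
  <!-- File appender; the path is a placeholder for your server's log dir -->
  <appender name="API_FILE" class="org.apache.log4j.RollingFileAppender">
    <param name="File" value="/path/to/logs/api.log"/>
    <param name="MaxFileSize" value="10MB"/>
    <param name="MaxBackupIndex" value="5"/>
    <layout class="org.apache.log4j.PatternLayout">
      <param name="ConversionPattern" value="%d{ISO8601} %-5p [%c] %m%n"/>
    </layout>
  </appender>
  <!-- "com.mycompany.api" is a placeholder for the API's package -->
  <logger name="com.mycompany.api">
    <level value="DEBUG"/>
  </logger>
  <root>
    <priority value="INFO"/>
    <appender-ref ref="API_FILE"/>
  </root>
</log4j:configuration>
```

    For a remote server the file still has to end up on that server's classpath (e.g. inside WEB-INF/classes of the deployed application), so make sure the deployed archive actually contains the updated file.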

  • How to configure for remote JMS client?

    I have my own Java JMS test program for performance measurements.
              I am using the JNDI and JMS provider functionality of the WebLogic 9.1 app-server but my test program is just pure JMS 1.02 sender/receiver clients - ie it is NOT part of, or deployed as a J2EE application.
              SINGLE MACHINE TEST
              ===================
              In a single machine environment I was able to
              - configure a JMSServer
              - configure a JMSSystemModule
              - configure resources for ConnectionFactories and Queue and Topics
              I then made what I believe to be a 'standalone' application module copied from some mysystemmodule-jms.xml and with that I somehow worked out how to deploy it using the weblogic.Deployer tool.
              The deployment apparently set up the JNDI and my JMS client could gain access to the administered objects and do what it does.
              Everything works.
              TWO MACHINE TEST
              ================
              I now have a second machine.
              I want to put my JMS sender client on this new machine and I want the JMS server and JMS receiver client to be unchanged from the SINGLE MACHINE TEST.
              But I really don't know quite how to proceed from here...
              Do I need to install the WebLogic app-server on the sender machine or is the weblogic.jar all I need?
              What is necessary configuration for JNDI access on the sender machine?
              Can I in fact use my original SINGLE MACHINE server unchanged as I am hoping?
              I don't think I want a "thin" client because I read that performance is impacted (and these are performance tests)
              Remember this is NOT a J2EE application. There is no MDB; no client-container; no descriptors etc. Maybe that makes it more complicated - I don't know.
              Sorry for such basic questions but if somebody can just point me to an appropriate example or tutorial it could save me days...
               Thank you.

    Hi,
              My problem is on similar lines. I have an applet based UI working on RMI/t3 protocol.
              I am using weblogic 9.2 as my app server.
              When my applet is executed on JRE 1.5x it works fine.
              But when I use JRE1.4x it gives the following exception
              java.lang.NoClassDefFoundError: javax/management/InvalidAttributeValueException
              at weblogic.rmi.internal.Stub.<clinit>(Stub.java:21)
              at java.lang.Class.forName0(Native Method)
              at java.lang.Class.forName(Class.java:141)
              at weblogic.rmi.internal.StubInfo.class$(StubInfo.java:34)
              at weblogic.rmi.internal.StubInfo.<clinit>(StubInfo.java:34)
              at java.lang.Class.forName0(Native Method)
               I have analyzed the reason for this.
               The class javax.management.InvalidAttributeValueException was included in Java 1.5 and above, so JRE 1.4 does not have it.
               In previous versions of WebLogic this class was part of the weblogic.jar file, but in WebLogic 9.2 it is not, so when I use JRE 1.4 with WebLogic 9.2 it obviously does not find this class, hence the above exception.
               I tried to put this all together and made a custom client JAR file including the necessary classes. I got through this exception, only to land in the following exception.
              java.lang.VerifyError: class weblogic.utils.classloaders.GenericClassLoader overrides final method .
                   at java.lang.ClassLoader.defineClass0(Native Method)
                   at java.lang.ClassLoader.defineClass(Unknown Source)
                   at java.security.SecureClassLoader.defineClass(Unknown Source)
                   at sun.applet.AppletClassLoader.findClass(Unknown Source)
                   at java.lang.ClassLoader.loadClass(Unknown Source)
                   at sun.applet.AppletClassLoader.loadClass(Unknown Source)
                   at java.lang.ClassLoader.loadClass(Unknown Source)
                   at java.lang.ClassLoader.loadClassInternal(Unknown Source)
                   at weblogic.jndi.WLInitialContextFactoryDelegate.<clinit>(WLInitialContextFactoryDelegate.java:204)
                   at weblogic.jndi.spi.EnvironmentManager$DefaultFactoryMaker.<clinit>(EnvironmentManager.java:26)
                   at weblogic.jndi.spi.EnvironmentManager.getInstance(EnvironmentManager.java:48)
                   at weblogic.jndi.Environment.getContext(Environment.java:307)
                   at weblogic.jndi.Environment.getContext(Environment.java:277)
                   at weblogic.jndi.WLInitialContextFactory.getInitialContext(WLInitialContextFactory.java:117)
                   at javax.naming.spi.NamingManager.getInitialContext(Unknown Source)
                   at javax.naming.InitialContext.getDefaultInitCtx(Unknown Source)
                   at javax.naming.InitialContext.init(Unknown Source)
                   at javax.naming.InitialContext.<init>(Unknown Source)
               I really need to support clients using JRE 1.4 and JRE 1.5.
              I will really appreciate any help on this one.
              Please advise.
              Thank you all.

  • How to configure a manifest.xml in order to play dvr

    Hi,
            I want to play an application with stream type DVR.
    Is it necessary to configure a manifest.xml to play an application with stream type=dvr in Flash Media Playback? If so, please tell me how to configure manifest.xml. I didn't see any manifest.xml file in my FMS installation directory. Help me play the stream with stream type=dvr.
            When I use the dvrcast-origin application to stream, I can play it in Flash Media Playback with stream type live or recorded, but I couldn't play with stream type=dvr. What is the solution for this?

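    For reference, a Flash Media manifest (.f4m) that declares a DVR stream is generally shaped like the sketch below. The server URL, application name, and stream name are placeholders, and the exact elements supported depend on your FMS/OSMF version, so treat this as an assumption to verify against the Adobe documentation:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<manifest xmlns="http://ns.adobe.com/f4m/1.0">
  <id>myDvrStream</id>
  <!-- streamType can be "live", "recorded", or "dvr" -->
  <streamType>dvr</streamType>
  <!-- Placeholder URL: point this at your dvrcast_origin application -->
  <media url="rtmp://yourserver/dvrcast_origin/mystream"/>
</manifest>
```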

  • How to configure for servlets

    I am new to servlets. Can anybody please tell me what configuration servlets need and how to do it?

    hi,
    Have you installed a server already?
    If not, install a web server such as Java Web Server or Tomcat.
    Steps to install Tomcat & configure:
    http://forum.java.sun.com/thread.jspa?threadID=5193463
    After a successful configuration,
    create a servlet program, compile it, and store the class under tomcatx.x\webapps\root\WEB-INF\classes,
    then open web.xml under WEB-INF and add <servlet> & <servlet-mapping> entries,
    ex:
    ex:
         <servlet>
              <servlet-name>servletname</servlet-name>
              <servlet-class>package.classname</servlet-class>
         </servlet>
         <servlet-mapping>
              <servlet-name>servletname</servlet-name>
              <url-pattern>/urlvalue</url-pattern>
         </servlet-mapping>
    restart tomcat
    run : http://localhost:8080/urlvalue

  • How to configure crystal report xml file as data source in BOE in Solaris?

    Hi,
    How do I configure a Crystal Report with an XML file as the data source on Solaris? I didn't find any suitable driver for XML/Excel files on Sun Solaris.
    Which driver do I have to use to connect an XML file to a Crystal Report so that I can view my report in Solaris BOE?
    And likewise for an Excel file as a data source for a Crystal Report.
    Thanks

    Hi Don, thanks for the reply.
    In a Windows environment I do not have any problem creating a Crystal Report from an XML file or an Excel file. After creating the reports, when I publish them to the BOE server on Solaris, I get a connection failed error.
    My Solaris BOE server does not have any network connection to Windows machines, so I have to place the files on the Solaris server.
    Below are the steps I tried:
    1. Created crystal reports in CR Designer on Windows using the ADO.NET (XML) driver, and in another try with the XML Web Services driver. The reports work well as they should.
    2. Saved to the BOE repository on the Solaris server from Crystal Reports and changed the database configuration settings as follows:
        - Used custom database logon information and specified cr_xml as the custom driver.
        - Changed the database path to the file directory path according to the Solaris server file path </app/../../>
        - Tried a table prefix also.
        - Selected the radio button "use same database logon as when report is run" and saved.
    My environment :
    SAP BOXI3.1 sp3
    crystal reports 2008 sp3
    SunOS
    Cr developing in windows 7.
    For Excel I tried with ODBC in windows but I can't find any ODBC or JDBC drivers for Excel in solaris.
    Any help to solve my issues
    Thanks
    Nagalla

  • How to configure for GPlus Adapter with SAP Phone and SAP CRM

    Hi All Basis Admins,
    Please consider this urgent: I need information on how to configure the GPlus Adapter with SAP Phone and SAP CRM. My server is CRM 5.0 (kernel 7) with an Oracle database. This information is required to connect the Webclient to SAP Phone via the GPlus Adapter (Genesys Adapter 7.1). If you have any document regarding the above, that would be appreciated.
    Regds,
    Govinda

    Hi Prasad,
    In your case, it's a file-to-file scenario.
    Go thro' this link for a solution:-
    /people/venkat.donela/blog/2005/03/02/introduction-to-simplefile-xi-filescenario-and-complete-walk-through-for-starterspart1
    /people/venkat.donela/blog/2005/03/03/introduction-to-simple-file-xi-filescenario-and-complete-walk-through-for-starterspart2
    Regards.
    Praveen

  • How to configure for zimbra mail

    How do I configure the iPhone for Zimbra mail?
    Has anyone used Zimbra mail in the iPhone Mail settings?
    I'm able to access Zimbra through Safari,
    but what configuration is needed to see it in the Mail app?

    Contact Zimbra support and ask them.

  • How to search for multiple values with Bex Prompts

    Hi,
    I would like to know if it is possible to search for multiple values at the same time with BEx prompts, instead of searching one value at a time (please see the screenshot below). I searched the forum but couldn't find any relevant answers. Please let me know how to achieve this.
    Thanks,
    Charvi.

    Hi Charvi,
    You can use a wildcard search to get multiple similar values listed in the search output: use * for multiple characters and ? for a single character.
    For example, Ravi* would return all employees with the first name Ravi.
    You can use various search formats such as *Ravi*, *Ravi, Ra??, etc.
    Thanks
    Mallik

  • How to configure for the price master of the configurable item

    At my client we have configurable items, such as car doors in different colours; the price changes according to the colour.
    How do I configure this master data for pricing for all the materials?

    Hi
    This kind of process is handled by variant configuration, so the pricing is defined in condition type VA00. The variant process is as follows:
    E.g., a Ford car: if a sales order is raised for the Fiesta LXI model, the system should choose red colour, and blue colour for the VXI, respectively.
    That is, you need to define the characteristics and assign the values.
    The characteristics need to be assigned to a class,
    and the class needs to be assigned to the material master.
    1. T-code CT04 (characteristic): FORD_MODEL. Choose 'single value' and the 'entry required' tab, and give the input as 01 - fiesta_lxi & 02 - fiesta_vxi (characteristic values).
    2. T-code CT04: FORD_BODY (another characteristic).
    Choose 'single value'; don't tick 'entry required'.
    01 - Red (characteristic values)
    02 - Blue
    3. Create class T.code : CL02
    fiesta_class
    type : 300
    4. Object dependency:
    T-code CT04,
    FORD_MODEL,
    click the Values tab.
    In FIESTA_LXI click the 'O' meant for object dependency, action & extra.
    Edit the dependency:
    010 $Self.ford_body = '01'. & save
    Repeat the same for FIESTA_VXI, but instead of '01' give '02' for blue colour.
    Then create a ROH material ford_body and a KMAT material for the car.
    Then create a super BOM with usage 3 and give the component as ford_body.
    Then use T-code CU41 (create configuration profile):
    enter a profile name & class 300, choose class assignment, and choose fiesta_class.
    Use T-code CU50 to check the values.
    Then create a sales order.
    Reward if it helps
    Regards
    Prasanna R

  • How to configure for Comcast Internet

    The only thing online at Comcast is for Tiger, and it doesn't work with 10.5. Could you walk me through, step by step, how to configure my MacBook's built-in Ethernet (to be used with my Comcast cable modem)? Thanks.

    1. Shut down the Mac.
    2. Unplug the power from Comcast modem.
    3. Connect the ethernet cable between the Comcast modem and your Mac.
    4. Power on the Comcast modem and wait for the cable light to come on solid and not flash.
    5. Turn on the computer.
    6. Apple menu -> System Preferences -> network -> Show connections
    should show the Ethernet with a green circle next to it. If it is yellow or red, try to configure your Ethernet and check whether PPPoE is enabled. If it is, disable it and restart your Mac. Start step 6 again, and tell us what your settings say in the Network preferences if the Ethernet is still yellow or red.

  • Best Configuration for Multiple Video Captures

    Hello,
    What would be faster and the most efficient for multiple video captures:
    - 9 video feeds written to 3 separate internal hard drives (3 feeds per hard drive)
    - 9 video feeds written to 3 internal hard drives in a combined RAID configuration
    - 9 video feeds written to 9 separate external hard drives via eSATA
    I have a Mac Pro 3.06GHz 12-core Intel Xeon machine that will be dedicated to a security camera application (SecuritySpy). The Mac OS and SecuritySpy will be on one hard drive and send the captured video from nine network HD cameras to the other hard drives. The computer, eSATA card, and all the hard drives are 6Gb/s capable. I also have access to SoftRAID for the RAID configuration, as well as an 8-bay eSATA hard drive enclosure if I choose. There are pros and cons for each of the configurations, but I am looking for the most reliable setup in terms of efficiency and longevity.
    “Technological change is like an axe in the hands of a pathological criminal.” (Albert Einstein, 1941),
    Dr. Z. 

    Just wanted to post the details of my final set up, maybe this will be useful to others looking to do a similar setup:
    1. Connected and set up the external hard drive (I bought a Western Digital My Book Pro Edition II 1TB) via Firewire 800 on iMac #1.
    2. Set up iMac #1 to back up via Time Machine. Even though the other computers will see the drive, apparently they will not recognize it in Time Machine unless you set it up first on the computer the drive is connected to (I tried).
    3. On iMac #2, connect to iMac #1 via file sharing, this is even easier now that you can access any computer on your network under "SHARED" in the sidebar of any finder window.
    4. Double-click the external drive. If you don't see the drive, click "Connect as" in the top right of the window and login. It's not necessary to do this, but if you want to confirm you've connected to the network drive, you can go to Finder -> Preferences -> General and under "Show these items on the Desktop" check the box next to "Connected Servers". This will make an icon for the external drive appear on your desktop.
    5. Turn on Time Machine on iMac #2. Repeat for any other computers you want to back up.
    That's it. So far this has worked great for me. Obviously the backups run a little slower on the computers going through file sharing, but it's a small trade-off compared to either manually moving the drive around or getting three separate drives.
