A question regarding Synchronous, Pub-Sub non-durable model

          Scenario:
          Non-durable, publish-subscribe, synchronous model.
          Publisher1 and Publisher2 both post messages to a Topic at the same time. How do I
          design non-durable subscribers that can obtain both of these messages
          in the same thread?
          e.g. in my program I create 2 subscribers, with selectors configured for the
          2 different messages...
          program start
          1. create sub1 using selector1 for topic1
          2. create sub2 using selector2 for topic1
          3. sub1.receive(timeout)
          4. sub2.receive(timeout)
          5. further processing using both the obtained messages
          6. program end.
          In the above case, if the messages arrive at the topic at the same time, only sub1 will
          get its message, as sub2 won't be active at that time.
          Is there any way to achieve this, given that the synchronous, non-durable pub-sub model
          is to be used?
          Many thanks...
          

          Thanks for the response.
          Is a subscriber active the moment it is created?
          e.g. objSub=tsession.createSubscriber(topic,strMessSelector,true);
          Or is it said to be active only when we call
          objSub.receive() (in the case of a synchronous subscriber)?
          "Shean-Guang Chang" <[email protected]> wrote:
          >I don't know the details of the design. You can use a single subscriber
          >without a selector and then do two receives to get both messages.
          >If you have to use selectors, so that sub1 can only select messages from pub1
          >and sub2 can only select messages from pub2, then you need a sub3 without a
          >selector. sub3 will do a blocking receive first, and once sub3
          >receives a message you can use sub1 and sub2 to do receiveNoWait.
          >
          >e.g.
          >while (true) {
          >    if (sub3.receive() != null) { // have a message in the topic
          >        if (sub1.receiveNoWait() != null) { // got message from pub1
          >            // do something
          >        }
          >        if (sub2.receiveNoWait() != null) { // got message from pub2
          >            // do something
          >        }
          >    }
          >}
          >
          >"vinay s" <[email protected]> wrote in message
          >news:[email protected]...
          >>
          >> Scenario:
          >> Non-durable, publish-subscribe, syncronous model.
          >> Publisher1 and Publisher2 both post messages to a Topic at the same time. How to
          >> design the non durable Subscribers that would be able to obtain both these messages
          >> in the same thread?
          >> e.g. if i in my program I create 2 subscribers, with selectors configured for the
          >> 2 different messages...
          >>
          >> program start
          >> 1. create sub1 using selector1 for topic1
          >> 2. create sub2 using selector2 for topic1
          >> 3. sub1.receive(timeout)
          >> 4. sub2.receive(timeout)
          >> 5. further processing using both the obtained messages
          >> 6. program end.
          >>
          >> In the above case, if messages arrive at the same time in the topic, only sub1 will
          >> be able to get the message, as sub2 won't be active at that time.
          >>
          >> Is there anyway to acheive this considering Synchronous, Pub-Sub non-durable model
          >> is to be used?
          >>
          >> Many thanks...
          >
          >
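          The gating pattern Chang describes (a selector-less sub3 doing a blocking receive, then sub1/sub2 draining with receiveNoWait) can be exercised without a broker. The sketch below is only a model of the control flow: the three subscribers are simulated with in-memory queues, and all names are illustrative, not JMS API.

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

public class GatePatternDemo {
    public static void main(String[] args) throws InterruptedException {
        // Stand-ins for the three topic subscribers: "sub3" sees every
        // message (no selector); "sub1"/"sub2" see only the messages
        // matching their respective selectors.
        BlockingQueue<String> sub3 = new ArrayBlockingQueue<>(10);
        BlockingQueue<String> sub1 = new ArrayBlockingQueue<>(10);
        BlockingQueue<String> sub2 = new ArrayBlockingQueue<>(10);

        // Publisher1 and Publisher2 post "at the same time".
        sub3.add("msg-from-pub1"); sub1.add("msg-from-pub1");
        sub3.add("msg-from-pub2"); sub2.add("msg-from-pub2");

        String m1 = null, m2 = null;
        while (m1 == null || m2 == null) {
            sub3.take();                 // models sub3.receive(): blocks until a message exists
            String got1 = sub1.poll();   // models sub1.receiveNoWait()
            if (got1 != null) m1 = got1;
            String got2 = sub2.poll();   // models sub2.receiveNoWait()
            if (got2 != null) m2 = got2;
        }
        System.out.println(m1 + " / " + m2);
    }
}
```

          Because sub3 gets one gate message per published message, the loop wakes exactly as often as messages arrive, and both selector subscribers are drained in the same thread.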
          

Similar Messages

  • Please advise regarding implementing pub/sub in a BPEL process.

    Hi guys,
    Requirement
    The customer information comes as an HTTP/SOAP message and is sent to the BPEL process as input. BPEL uses ESB and database adapters to create the customer in Oracle Apps. Now, we need to send the same customer information, with the Oracle Apps customer number, to some other systems. The number of systems is not known at this time, so we need a pub/sub mechanism where we can publish the customer-creation event with the required parameters (customer name, number, etc.). Then we just create subscriptions to that event as and when required. This gives us the flexibility to add subscriptions without changing the initial BPEL process.
    My questions
    I can think of three ways to do this, please advise.
    1. Use Oracle Workflow business events and subscriptions. At the end of the BPEL process, raise the WF event, and the subscriptions to it will be processed by the Workflow engine. My doubt here is that most of the time the subscriber is another BPEL process, so does it make sense to go from BPEL to WF and then from WF to PL/SQL to invoke a new BPEL process?
    2. Use Advanced Queues (Oracle AQ). This is similar to using Oracle WF and to me the issues are also the same.
    3. Use routing rules in ESB -- I am not really sure if this can be done to replicate a pub/sub scenario, please advise and elaborate.
    As a summary, we want the ability to have pub/sub within BPEL and/or ESB. Any other suggestions are also welcome.
    Thanks!

    I have the same question, but could not find the relevant post in the ESB forum. Is the response posted? Please post the link to that thread...

  • Question regarding synchronized

    I was trying out some code to understand the working of synchronized. I wrote a small program where the objective is that multiple threads will access a method which increments a static variable. However, I am not sure where I am going wrong. This is the code I have :
    class StaticTest extends Thread {
         static int count = 0;
         synchronized void doStuff() {
              System.out.println("Thread " + getName() + " has started doing stuff");
              for (int i = 0; i < 500000; i++)
                   count++;
              System.out.println("Thread " + getName() + " has finished doing stuff and count is = " + count);
         }
         public void run() {
              doStuff();
         }
         public static void main(String[] args) {
              StaticTest st0 = new StaticTest();
              StaticTest st1 = new StaticTest();
              st1.start();
              st0.start();
         }
    }
    I am expecting that whichever thread enters the synchronized method should exit first and no other thread should be able to access that method. However, that is not what I am seeing.

    If they change their code from this: static int count = 0; to this: static Integer count = 0; and then did a synchronized(count) { ... } block, shouldn't that work? Mine isn't working. Both threads merrily enter the synchronized block and run it at the same time.
    That might be because you are changing what count is referencing. It doesn't make much sense to synchronize on an immutable object like Integer, especially if you're going to be reassigning what that variable is referencing (rather than the contents of that reference, which you can't do on an immutable object). So in your case you are still synchronizing on two different objects.
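    The fix the thread is circling around is that each StaticTest instance locks a different monitor (its own `this`), so the two threads never exclude each other. Synchronizing on one shared monitor — the Class object, or a dedicated static lock — makes the increments mutually exclusive. A minimal corrected sketch (class and field names are illustrative):

```java
class CounterTest extends Thread {
    static int count = 0;
    // One monitor shared by every instance; 'this' would differ per thread.
    private static final Object LOCK = new Object();

    void doStuff() {
        synchronized (LOCK) {
            for (int i = 0; i < 500000; i++) {
                count++;
            }
        }
    }

    public void run() { doStuff(); }

    public static void main(String[] args) throws InterruptedException {
        CounterTest t0 = new CounterTest();
        CounterTest t1 = new CounterTest();
        t0.start();
        t1.start();
        t0.join();   // wait for both threads before reading count
        t1.join();
        System.out.println("count = " + count); // always 1000000
    }
}
```

    Declaring doStuff() as `static synchronized` would work too, since a static synchronized method locks the Class object, which every instance shares.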

  • Question Regarding Mesh with 3702 and non-AC APs

    Hello!
    Quick question regarding mesh deployments with 2 different sorts of APs: AC and non-AC models. If my 3702i is my root AP and the 3602i is my MAP, will AC still work at 80 MHz, or will I have to switch to 40 MHz (thus crippling AC performance)?
    Not 100% sure on this... I *think* it should still work for the normal 802.11n connection, but I'm not sure if the 80 MHz channel width (needed?) for AC will cause the non-AC 3602i to be stranded.
    Thanks a lot for your insight!

    Currently, my network DHCP server is a software-based DHCP server. In reading over your post, if I understood correctly, it sounds like the managed switch would have its own hardware-based DHCP server to assign IP addresses to those clients identified on the "external" VLAN. Did I understand that correctly, or did I misread something?
    The DHCP server will be software based; even though you define it on your switch, it is a DHCP service running on the switch's OS.
    I am configuring this setup for a small business application and will need to purchase a managed switch with 16 or 24 ports. Do you have any recommendations on a particular managed switch that will handle the VLAN configuration and include PoE while keeping costs in mind?
    In this forum, most of us discuss Cisco enterprise-grade wireless. Here is the Catalyst 2960-X series switch detail, if you are interested:
    http://www.cisco.com/c/en/us/products/switches/catalyst-2960-x-series-switches/index.html
    You may need to check the pricing with your Cisco account manager or from a Cisco partner.
    HTH
    Rasika
    **** Pls rate all useful responses ****

  • Pub/Sub Upgrade question

    I'm in the process of upgrading a 4.x CUCM Pub/Sub to 8.5.
    The new hardware I have doesn't support anything below 8.5 so I used a "swing server" for the DMA export/import, etc.
    To upgrade the swing server I used an upgrade ISO I had previously downloaded to 8.5.1.13900-5.
    Then, I installed 8.5 from disk, upgraded it to 8.5.1.13900-5 and did a backup on the swing server and a restore on the new server.
    So far, everything is fine and dandy.
    Now, for the sub...
    At first it wouldn't install because the disk version of CUCM I have isn't the same as the active one.  So I rebooted the Pub into the inactive (downgraded) partition, and then I got the sub to install since it recognized that the Pub was on the same version.
    So now I need to get these both on 8.5.1.13900-5, where the data import is located.  I tried upgrading the sub first (to the inactive partition), figuring once that is done I'll just reboot the Pub into the correct version and then the Sub.
    However, the Sub failed twice on the upgrade with an error saying it failed reading from the source file.  So, I rebooted the Pub into the correct version and I'm trying again to upgrade the Sub.  Hopefully this will work.
    So, where did I go wrong and what should I do differently?
    All criticism accepted!
    Thanks!

    The upgrade failed again and I noticed that all the services on the Sub were deactivated and they would not start due to a database error.
    I then rebooted the Pub into the inactive partition which is the downgraded version matching the current Sub version.  This allowed me to activate the services on the Sub.
    Then, I ran the upgrade on the Sub and it seemed to take.  I then rebooted the Pub into the correct (upgraded) version.  I went to switch versions on the Sub, but it only shows the active version as the lower (incorrect) version.  I rebooted anyway, thinking it might boot into the upgraded version, but it didn't.
    When both servers came up, the Pub into the correct (upgraded) version and the Sub into the wrong version I checked the services on the Sub and they're activated and running.  I'm trying to run the upgrade again and I selected to reboot into the inactive partition automatically this time.
    Thoughts?  I feel like I'm chasing my tail...

  • CUCMv8 install pub sub validation error

    I am doing a fresh install of a CUCM Pub/Sub. The version is 8. Cisco sent a DVD with the 8.0.1 version of the CUCM installation software. The Cisco servers probably had preloaded installation software with version 8.0.2. Somehow, after I started installing the Pub, it installed the 8.0.2.40000-1 version, totally ignoring the 8.0.1.10000-26 on the DVD. Once I finished with the Pub and started installing the Sub, after the first node access configuration there was an error which said the version on the Pub does not match the one on the Sub. I guess the DVD erased the partition on the Sub which had 8.0.2.40000-1 on it and went ahead with installing 8.0.1.10000-26.
    How do I rectify this? Should I reinstall the Pub with 8.0.1.10000-26 and follow suit on the Sub? Or should I download 8.0.2.40000-1 from Cisco support and try installing 8.0.2 on the Sub? My question is, since the Sub is a blank server with nothing on it, how would I be able to tftp into it from the Pub's CLI and push the 8.0.2 ISO? Cisco does not upload bootable ISO files and at the same time does not recommend converting non-bootable to bootable files. Hence the solution of downloading non-bootable 8.0.2 ISO files, converting them into a bootable DVD and installing is out of the question.
    Cisco rep suggested I install 8.0.3. Can I just download 8.0.3, burn a DVD (he said its bootable file) and pop it into the server and commence installation?
    Thanks, appreciate your help.

    Hi
    Basically the download images from Cisco are not bootable (primarily to make life difficult for engineers, but also to discourage piracy, I suspect). This leaves you two main choices:
    1) Yes, you can boot the publisher from the 8.0.1 media you have, install 8.0.1, and then continue your subscriber build. If you like, you can then download any 8.0(x) image from Cisco and upgrade using the normal documented procedures.
    2) Alternatively, if your sub is running 8.0.1, partway through the install it will say 'do you wish to install a patch as part of this installation'... if you pick yes, you can then point the server at a downloaded version (e.g. the same version as your pub is running) and it will apply that version. You would be using your existing media to boot the server, and then applying the latest downloaded version.
    If you intend to go to 8.0.3 you may just upgrade your pub, and then do step two above on your sub (boot from your media, and then apply an 8.0.3 download). You can apply the downloaded file from an SFTP server or from a burned DVD.
    Regards
    Aaron
    Please rate helpful posts...

  • Question regarding IWDTree and context Value Node naming

    Hi,
    I have a question regarding the IWDTree / IWDTreeNodeType components.
    I have a context looking like this:
    Context
      + ResponseNode
        + PersonNode (1..1)
          + PersonAddressNode                    (empty node, placeholder)
          | + AdresNode (0..n)
          + PersonChildNode                      (empty node, placeholder)
          | + PersonNode (0..n)
          |   + PersonAddressNode                (empty node, placeholder)
          |     + AddressNode (0..n)
          + PersonParentsNode                    (empty node, placeholder)
            + PersonNode (0..n)
              + PersonAddressNode                (empty node, placeholder)
                + AddressNode (0..n)
    The context represents a person, a person's address, and a person's children and parents with their respective addresses.
    As a result, on different branches, a PersonNode and AddressNode can appear.
    And for some strange reason, all PersonNodes and AddressNodes link to the same ResponseNode.PersonNode.PersonParentsNode.PersonNode and ResponseNode.PersonNode.PersonParentsNode.PersonNode.PersonAddressNode.AddressNode respectively, regardless of their branch...
    Is it illegal to have multiple PersonNode and AddressNode node names, and should they be named uniquely?

    Generally, node names need to be unique inside the context; attributes in different nodes can have the same names. I wonder if the context structure you described will result in code without compile errors.
    The WD Tree can only be used with recursive context nodes or with a hierarchy of non-singleton child nodes.
    Can you give an example of how your tree should look at runtime?

  • Questions regarding creation of vendor in different purchase organisation

    Hi ABAP gurus,
    I have a few questions regarding data transfers.
    1) While creating a vendor, the vendor is specific to a company code, and the vendor can be present in different purchasing organisations within the same company code if the purchasing organisation is present at plant level. My client has the vendor in different purchasing organisations. How do we handle the above situation?
    2) I had a few error records while uploading MM01. How do I download the error records? I was using LSMW with predefined programs.
    3) For a few applications there are no predefined programs, so I will have to choose either a predefined BAPI or IDocs. Which is better to go with? I found that BAPIs and IDocs have the same predefined structures, so what is the difference between the two?

    Hi,
    1. Create a BDC program with Pur. orgn as a parameter on the selection screen,
        then run the same BDC program for the different Pur. organisations so that the vendors
        are created in the different Pur. orgns.
    2. Check the Action Log in LSMW and see
    3.see the doc
    BAPI - BAPIs (Business Application Programming Interfaces) are the standard SAP interfaces. They play an important role in the technical integration and in the exchange of business data between SAP components, and between SAP and non-SAP components. BAPIs enable you to integrate these components and are therefore an important part of developing integration scenarios where multiple components are connected to each other, either on a local network or on the Internet.
    BAPIs allow integration at the business level, not the technical level. This provides for greater stability of the linkage and independence from the underlying communication technology.
    LSMW - No ABAP effort is required for the SAP data migration. However, effort is required to map the data into the structure according to the pre-determined format as specified by the pre-written ABAP upload program of the LSMW.
    The Legacy System Migration Workbench (LSMW) is a tool recommended by SAP that you can use to transfer data once only or periodically from legacy systems into an R/3 System.
    More and more medium-sized firms are implementing SAP solutions, and many of them have their legacy data in desktop programs. In this case, the data is exported in a format that can be read by PC spreadsheet systems. As a result, the data transfer is mere child's play: Simply enter the field names in the first line of the table, and the LSM Workbench's import routine automatically generates the input file for your conversion program.
    The LSM Workbench lets you check the data for migration against the current settings of your customizing. The check is performed after the data migration, but before the update in your database.
    So although it was designed for uploading of legacy data it is not restricted to this use.
    We use it for mass changes, i.e. uploading new/replacement data and it is great, but there are limits on its functionality, depending on the complexity of the transaction you are trying to replicate.
    The SAP transaction code is 'LSMW' for SAP version 4.6x.
    Check your procedure using this Links.
    BAPI with LSMW
    http://esnips.com/doc/ef04c89f-f3a2-473c-beee-6db5bb3dbb0e/LSMW-with-BAPI
    For document on using BAPI with LSMW, I suggest you to visit:
    http://www.****************/Tutorials/LSMW/BAPIinLSMW/BL1.htm
    http://esnips.com/doc/1cd73c19-4263-42a4-9d6f-ac5487b0ebcb/LSMW-with-Idocs.ppt
    http://esnips.com/doc/ef04c89f-f3a2-473c-beee-6db5bb3dbb0e/LSMW-with-BAPI.ppt
    Reward points for useful answers
    Regards
    Anji

  • Question regarding placing cache-related classes into a package

    Hi all,
    I have a question regarding placing classes into packages. I am writing a cache feature which caches results that were evaluated previously. Since it is a cache, I don't want to expose it outside, because it is only for internal use. I have 10 classes related to this cache feature. All of them are used only by the cache manager (the manager class which manages the cache). So I thought it would make sense to keep all the classes in a separate package.
    But the problem I have is: since the cache-related classes are not exposed outside, I can't make them public. If they are not public, I can't access them from the other packages of my code. I can't make them either public or private. Can someone suggest a solution for my problem?

    haki2 wrote:
    But the problem I have is, since the cache related classes are not exposed outside so I can't make them public. If they are not public I can't access them in the other packages of my code.
    Well, you shouldn't access them in your non-cache code.
    As far as I understand, the only class that other code needs to access is the cache manager. That one must be public. All other classes can be package-private (a.k.a. default access). This way they can access each other and the cache manager can access them, but other code can't.
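    To make the answer concrete, here is a minimal sketch of that layout (class names are hypothetical): only the manager is public, while the helper uses default (package-private) access, so it is visible to the manager but invisible outside the cache's package.

```java
import java.util.HashMap;
import java.util.Map;

// The one public type: the facade other packages are allowed to use.
public class CacheManager {
    private final CacheStore store = new CacheStore();

    public void put(String key, Object value) { store.save(key, value); }
    public Object get(String key) { return store.load(key); }
}

// No access modifier = package-private: reachable from CacheManager and the
// other cache classes in this package, but from nowhere else.
class CacheStore {
    private final Map<String, Object> map = new HashMap<>();

    void save(String key, Object value) { map.put(key, value); }
    Object load(String key) { return map.get(key); }
}
```

    Code in a different package that tries `new CacheStore()` simply fails to compile, which is exactly the encapsulation the poster is after.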

  • Question regarding the "mcxquery" and "dscl -mcxread" commands:

    Question regarding the mcxquery and dscl -mcxread commands:
    Does anyone know why the mcxquery and the dscl . -mcxread commands don't show any info about MCX-managed login items & printers? The System Profiler's "Managed Client" section does. I'd like to see info regarding managed printers and managed login items using the MCX tools. I have Mac users running 10.5.2 with both login items and printers that are pushed out to them via MCX. The System Profiler app shows all of their policies, but the dscl . -mcxread and mcxquery tools don't. Why not?
    -D
    Message was edited by: Daniel Stranathan

    How do you "call procedures/functions" without SQL code? You need at least a call statement like
    {call myProc(?,?,?)}
    that you wrap into a CallableStatement.
    Other than that: when you switch off autocommit, you need to call commit/rollback at the end. Usually, if you don't commit/rollback a non-autocommitted connection, the transaction gets committed/rolled back when you close the connection - that depends on the JDBC driver. But it's never a good idea to omit the commit/rollback calls on a non-autocommit connection. Usually you enclose your code in a try/catch block like this:
    con.setAutoCommit(false);
    try {
       // ... do the actual JDBC work here ...
       con.commit();
    } catch (Exception e) {
       con.rollback();
    } finally {
        con.setAutoCommit(true); // or:
        con.close();
    }

  • Looking for sample code to create my own pub/sub!

    I am a newbie in JMS, so I would really appreciate it if
    someone could give me some hints to start my school project. I am looking for sample Java code that will:
    For the Publisher:
    1. Connect to a broker [create it, if it does not exist]
    2. Create a publisher/destination.
    3. Create a pub-sub queue
    4. Publish a message
    5. Ack or Nak depending on whether or not the subscriber got the message.
    For the Subscriber:
    1. Connect to a broker [create it, if it does not exist]
    2. Subscribe to the broker
    3. Subscribe to the Queue
    4. Show any received messages on the console.
    Here are the command line params for both the Publisher and subscriber:
    runPub 127.0.0.1:7676 myTestBroker myQueue "this is my message"
    runSub 127.0.0.1:7676 myTestBroker myQueue
    Please tell me if there is similar Java code that will do all this and work with ANY JMS-compatible client, i.e. I should not have to use the admin tool of any JMS server (MQSeries, iPlanet, SonicMQ, etc.). The code should follow the JMS spec and do this programmatically.
    Thank you very very much in advance for doing this great favor.
    With regards,
    Amir.

    Thanks a lot for that hint. I think that's a great tutorial for a beginner. I could compile the sample code from chapter 4 without any problem, but could not run it. I also installed j2sdkee1.3.1 and updated my classpath according to the spec. But when I tried to run the "j2ee -verbose" command it gave me the following error message:
    ERROR: Set J2EE_HOME before running this script.
    Any advice on what I should do next? Thanks again.
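    That error just means the J2EE_HOME environment variable isn't exported before the j2ee script runs. A sketch of the shell setup — the install path here is an example only, point it at wherever your j2sdkee1.3.1 directory actually lives:

```shell
# Example path only -- substitute your real j2sdkee1.3.1 location.
export J2EE_HOME="$HOME/j2sdkee1.3.1"
export PATH="$J2EE_HOME/bin:$PATH"
echo "J2EE_HOME is set to: $J2EE_HOME"
```

    On Windows the equivalent would be `set J2EE_HOME=C:\j2sdkee1.3.1`. Put the export lines in your shell profile so they survive new terminal sessions.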

  • Some questions regarding time evaluation

    Hello,
    I have two questions regarding time evaluation:
    1. Is it possible (and if yes, how) to still include an employee in time evaluation even if the employee is inactive (status P0000-STAT2 = 0)? We need this in order to calculate weeks of not working (this has to be calculated). I know that for payroll this is possible with a setting in infotype 0003.
    2. Is it possible (and how) to read data back from payroll into time? For example, in payroll you export something to the ZL table; can you then pick this up in the time evaluation schema, as there is also a ZL table there? Or is there another way to do this?
    Thanks for your answers,
    Liesbeth

    Hi Schrage,
    Why do you want to evaluate time for an inactive person?
    If you want to, you can do it.
    Process:
    First of all you have to group your employees and sub-groups (for inactive employees).
    Assign the employee sub-group grouping for the PCR in Basic Pay (IMG).
    Then come to the Time Evaluation schema. Put the day grouping nn nn nn nn in the parameters and run the time evaluation. You will get the output in the DZL table.
    For the above process you need to configure the T510S table.
    Yes, you can read the payroll data into time.
    The same concept runs in both of the modules;
    the output should appear in the ZL table only.
    Here the concept is: some companies will not use the payroll wage types, only the time wage types. These wage types have to be configured in T510S, and we have to do the wage type copying from the time management side only if they are not using payroll. So either in payroll or in time management, the evaluation of time will be the same.
    Cheers
    Vijai

  • Non-durable subscribers persist after the end of their JMS session

    Hi all,
    I'm using OAQ as my JMS Provider.
    I have created code for publishing and receiving messages to/from topics. Messaging works fine, but I have one problem:
    All my non-durable subscribers persist after the end of their JMS session. I always perform the unsubscribe operation in code and then close the session and connection.
    When I subscribe to a topic, there's one more line in the DB table (e.g. NAME: TSUB_1_24E6DB98A2EB7712E040A8C, TYPE: 65). I expected that subscribers would be deleted after performing the unsubscribe operation, but they aren't...
    Can you help me with this problem? Thanks for any answer.
    Best Regards,
    Juraj

    Have you found a solution to this yet? I face the same problem.

  • 3 questions regarding duplicate script

    3 questions regarding duplicate script
    Here is my script for copying folders from one Mac to another Mac via Ethernet:
    (This is not meant as a backup, just to automatically distribute files to the other Mac.
    For backup I'm using Time Machine.)
    cop2drop("Macintosh HD:Users:home:Desktop", "zome's Public Folder:Drop Box:")
    cop2drop("Macintosh HD:Users:home:Documents", "zome's Public Folder:Drop Box:")
    cop2drop("Macintosh HD:Users:home:Pictures", "zome's Public Folder:Drop Box:")
    cop2drop("Macintosh HD:Users:home:Sites", "zome's Public Folder:Drop Box:")
    on cop2drop(sourceFolder, destFolder)
    tell application "Finder"
    duplicate every file of folder sourceFolder to folder destFolder
    duplicate every folder of folder sourceFolder to folder destFolder
    end tell
    end cop2drop
    1. One problem I haven't sorted out yet: how can I modify this script so that
    all source folders (incl. their files and sub-folders) get copied
    as corresponding destination folders (same names) under the Drop Box?
    (At the moment the files and sub-folders arrive directly in the Drop Box
    and mix with the other destination files and sub-folders.)
    2. Every time before a duplicate starts, I have to confirm this message:
    "You can put items into "Drop Box", but you won't be able to see them. Do you want to continue?"
    How can I avoid or override this message? (This script will run at night,
    when no one is near the computer to press OK again and again.)
    3. A few minutes after the script starts running I get:
    "AppleScript Error - Finder got an error: AppleEvent timed out."
    How can I stop this?
    Thanks in advance for your help!

    Hello
    In addition to what red_menace has said...
    1) I think you may still use System Events 'duplicate' command if you wish.
    Something like SCRIPT1a below. (Handler is modified so that it requires only one parameter.)
    *Note that the 'duplicate' command of Finder and System Events duplicates the source into the destination. E.g. A statement 'duplicate folder "A:B:C:" to folder "D:E:F:"' will result in the duplicated folder "D:E:F:C:".
    --SCRIPT1a
    cop2drop("Macintosh HD:Users:home:Documents")
    on cop2drop(sourceFolder)
    set destFolder to "zome's Public Folder:Drop Box:"
    with timeout of 36000 seconds
    tell application "System Events"
    duplicate folder sourceFolder to folder destFolder
    end tell
    end timeout
    end cop2drop
    --END OF SCRIPT1a
    2) I don't know the said error -8068 thrown by Finder. It's likely a private Finder error code which is not listed in any of the public headers. And if it is a Finder thing, you may or may not see a different, more helpful error when using System Events to copy things into the Public Folder. Also, you may create a normal folder, e.g. named 'Duplicate', in the Public Folder and use it as the destination.
    3) If you use rsync(1) and want to preserve extended attributes, resource forks and ACLs, you need to use the -E option. So at least 'rsync -aE' would be required. And I remember the long thread that failed to tame rsync for your backup project...
    4) As for how to get POSIX path of file/folder in AppleScript, there're different ways.
    Strictly speaking, POSIX path is a property of alias object. So the code to get POSIX path of a folder whose HFS path is 'Macintosh HD:Users:home:Documents:' would be :
    POSIX path of ("Macintosh HD:Users:home:Documents:" as alias)
    POSIX path of ("Macintosh HD:Users:home:Documents" as alias)
    --> /Users/home/Documents/
    The first one is the cleanest code because the HFS path of a directory is supposed to end with ":". The second one also works because the 'as alias' coercion will detect whether the specified node is a file or a directory and return a proper alias object.
    And as for the code:
    set src to (sourceFolder as alias)'s POSIX Path's text 1 thru -2
    It strips the trailing '/' from the POSIX path of the directory to get '/Users/home/Documents', for example. I do this because in shell commands the trailing '/' of a directory path is not required, and indeed if it's present it makes certain commands behave differently.
    E.g.
    Provided /a/b/c and /d/e/f are both directory, cp /a/b/c /d/e/f will copy the source directory into the destination directory while cp /a/b/c/ /d/e/f will copy the contents of the source directory into the destination directory.
    The rsync(1) behaves in the same manner as cp(1) regarding the trailing '/' of source directory.
    The ditto(1) and cp(1) behave differently for the same arguments, i.e., ditto /a/b/c /d/e/f will copy the contents of the source directory into the destination directory.
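    The copy semantics described above can be checked in a few lines. One portability caveat: the bare trailing '/' behaves as described for BSD cp (Mac OS X) and rsync, but GNU cp ignores it, so the sketch below uses the POSIX-portable 'src/.' spelling for "copy the contents":

```shell
# Build a scratch tree: a/b/c contains file.txt; d1 and d2 are destinations.
demo=$(mktemp -d)
mkdir -p "$demo/a/b/c" "$demo/d1" "$demo/d2"
touch "$demo/a/b/c/file.txt"

cp -R "$demo/a/b/c"   "$demo/d1"   # no slash: copies the directory itself -> d1/c/file.txt
cp -R "$demo/a/b/c/." "$demo/d2"   # 'src/.': copies only the contents    -> d2/file.txt

ls "$demo/d1/c/file.txt" "$demo/d2/file.txt"
rm -rf "$demo"
```

    The same distinction is why the handlers strip or keep the trailing '/' so deliberately before building the cp/ditto command lines.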
    5) In case, here are revised versions of the previous SCRIPT2 and SCRIPT3, which require only one parameter. They will also append any error output to a file named 'cop2dropError.txt' on the current user's desktop.
    *These commands with the current options will preserve extended attributes, resource forks and ACLs when run under 10.5 or later.
    --SCRIPT2a - using cp(1)
    cop2drop("Macintosh HD:Users:home:Documents")
    on cop2drop(sourceFolder)
    set destFolder to "zome's Public Folder:Drop Box:"
    set src to (sourceFolder as alias)'s POSIX Path's text 1 thru -2
    set dst to (destFolder as alias)'s POSIX Path's text 1 thru -2
    set sh to "cp -pR " & quoted form of src & " " & quoted form of dst
    do shell script (sh & " 2>>~/Desktop/cop2dropError.txt")
    end cop2drop
    --END OF SCRIPT2a
    --SCRIPT3a - using ditto(1)
    cop2drop("Macintosh HD:Users:home:Documents")
    on cop2drop(sourceFolder)
    set destFolder to "zome's Public Folder:Drop Box:"
    set src to (sourceFolder as alias)'s POSIX Path's text 1 thru -2
    set dst to (destFolder as alias)'s POSIX Path's text 1 thru -2
set sh to "src=" & quoted form of src & ";dst=" & quoted form of dst & ¬
";ditto \"${src}\" \"${dst}/${src##*/}\""
    do shell script (sh & " 2>>~/Desktop/cop2dropError.txt")
    end cop2drop
    --END OF SCRIPT3a
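The ${src##*/} expansion in SCRIPT3a deletes the longest prefix matching '*/', i.e. everything up to and including the last '/', leaving the base name of the source directory; that way ditto creates a subfolder of that name inside the destination, mimicking cp's directory-into-directory behavior. A minimal sketch (the sample path is only an illustration):

```shell
#!/bin/sh
# ${var##pattern} deletes the longest matching prefix; '*/' matches
# everything through the last slash, leaving the base name.
src='/Users/home/Documents'
echo "${src##*/}"    # -> Documents
# So SCRIPT3a's shell command expands to roughly:
#   ditto "/Users/home/Documents" "$dst/Documents"
```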
    Good luck,
    H
    Message was edited by: Hiroto (fixed typo)

  • A few questions regarding SAP EWM and WM

    Hello,
    I have a few general questions regarding the differences between EWM and WM:
    1) What are the benefits of EWM-MFS compared to WM + TRM (especially in terms of SPS)?
    2) The Quality Inspection Engine (QIE) can also be used by SAP WM, right?
    3) There is RFID-support in EWM, so EWM is able to communicate directly with SAP Auto-ID, right?
         But I have heard that SAP PI is necessary in some cases, when and why?
    4) Is there something new in EWM regarding goods receipt processing?
    I have read that the splitting of inbound delivery items is possible in EWM in case of missing inbound delivery items. Is this really a new feature?
    5) EWM can easily be connected to SAP BW for reporting purposes, what about WM?
    6) What about scalability if the warehouse grows?
    7) Is there any information about the costs of using EWM compared to WM and vice versa?
    I appreciate any kind of help.
    Thank you.
    Dennis

    Hi,
1. What does SAP offer as a product for dWM? Is it a "special" installation of the SAP framework dedicated to WM, or is it a standard ECC box where only the WM module is used?
There are two versions of DWM: one is the Decentralized WM that is part of ECC, and the other is EWM, which is part of SCM. Both are decentralized.
2. My understanding is that the interfaces between ERP and dWM can support some non-real-time operations (like when the main ERP system is down, the dWM can still perform some operations). Considering that the transactional interfaces are based on BAPIs, how does SAP achieve this interfacing in non-real-time environments? I am thinking you cannot complete the delivery processing unless both systems are up.
When it comes to interfaces, DWM needs deliveries from ERP; that's it, and WM can function from there independently of the ERP system. But WM definitely needs to communicate PGI, PGR and other posting changes back. So in case ERP is down, even though PGI/PGR is done at the WM end, they may not be communicated back to ERP. But WM generates PGI/PGR IDocs which can always be reprocessed at the WM end to resend them to ERP, so that inventory levels remain accurate.
    Hope that helps
    Thanks
    Vinod.
