Best practice for putting binary data on the NMR

Hi,
We're creating a component that will consume messages off the NMR, encode them, and subsequently put them back on the NMR. What's the best practice for sending binary data over the NMR?
1. setContent()?
2. addAttachment()?
3. setProperty()?
If NormalizedMessage.setContent() is the desired approach, then how can you accomplish that?
Thanks,
Bruce

setContent() is used only for XML messages. The recommended way to accommodate binary data is to use addAttachment().
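That answer can be demonstrated with nothing but the JDK: the content of a normalized message is XML text, and arbitrary bytes do not survive a bytes-to-string-to-bytes round trip, which is exactly why they belong in an attachment rather than in setContent(). A minimal sketch (plain Java, no JBI classes; the class and method names are invented for illustration):

```java
import java.nio.charset.StandardCharsets;
import java.util.Arrays;

// Demonstrates why raw bytes must not be pushed through an XML/String
// payload: a charset round-trip silently corrupts non-text bytes.
public class BinaryInXmlDemo {

    // Decode bytes as UTF-8 text and re-encode them, the way a
    // string-based XML payload would.
    static byte[] roundTrip(byte[] raw) {
        String asText = new String(raw, StandardCharsets.UTF_8); // invalid sequences become U+FFFD
        return asText.getBytes(StandardCharsets.UTF_8);
    }

    public static void main(String[] args) {
        byte[] raw = {0x00, (byte) 0x89, 'P', 'N', 'G', (byte) 0xFF}; // binary-looking payload
        byte[] back = roundTrip(raw);
        System.out.println("survived round-trip: " + Arrays.equals(raw, back));
    }
}
```

Running it prints `survived round-trip: false`: the two invalid UTF-8 bytes are replaced by U+FFFD during decoding, so the payload silently grows and changes. An attachment added via addAttachment() carries the bytes opaquely and avoids this entirely.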

Similar Messages

  • Best Practice for Initial Load Data

    Dear Experts,
        I would like to know the best practices, and the factors to consider, when performing an initial load.
    For example:
    1) requirements from business stakeholders for data analysis
    2) age of data needed to meet tactical reporting
    3) data dependencies across SAP modules
    4) Is there any best practice for loading master data?

    Hi,
    Check these links on master data loading:
    http://searchsap.techtarget.com/guide/allInOne/category/0,296296,sid21_tax305408,00.html
    http://datasolutions.searchdatamanagement.com/document;102048/datamgmt-abstract.htm
    Regards,
    Shikha

  • Best Practice for Using Static Data in PDPs or Project Plan

    Hi There,
    I want to make custom reports using PDPs & Project Plan data.
    What is the Best Practice for using "Static/Random Data" (which is not available in MS Project 2013 columns) in PDPs & MS Project 2013?
    Should I add that data in Custom Field (in MS Project 2013) or make PDPs?
    Thanks,
    EPM Consultant
    Noman Sohail

    Hi Dale,
    I have a Project Level custom field "Supervisor Name" that is used for Project Information.
    For the purpose of viewing that Project Level custom field data in project views, I have made a Task Level custom field "SupName" and used the formula:
    [SupName] = [Supervisor Name]
    That shows the Supervisor Name in Schedule.aspx.
    ============
    Question: I want that Project Level custom field "Supervisor Name" in My Work views (Tasks.aspx).
    The field is enabled in Tasks.aspx, but the data is not present (the column is blank).
    How can I get the data into the "My Work" views?
    Noman Sohail

  • Best Practice for disparately sized data

    2 questions in about 20 minutes!
    We have a cache which holds approx. 80K objects that expire after 24 hours. It's a rolling population, so the number of objects is fairly static. We're on a 64-node cluster with high units set, giving ample space. But the data has a wide size range, from a few bytes to 30 MB and everywhere in between. This causes some very hot nodes.
    Is there a best practice for handling a wide range of object size in a single cache, or can we do anything on input to spread the load more evenly?
    Or does none of this make any sense at all?
    Cheers
    A

    Angel 1058 wrote:
    2 questions in about 20 minutes!
    We have a cache which holds approx. 80K objects that expire after 24 hours. It's a rolling population, so the number of objects is fairly static. We're on a 64-node cluster with high units set, giving ample space. But the data has a wide size range, from a few bytes to 30 MB and everywhere in between. This causes some very hot nodes.
    Is there a best practice for handling a wide range of object size in a single cache, or can we do anything on input to spread the load more evenly?
    Or does none of this make any sense at all?
    Cheers
    A
    Hi A,
    It depends... If there is a relationship between keys and sizes, e.g. if a certain part of the key means that the value will be big, then you can implement a key partitioning strategy, possibly together with key association, so that the large entries are spread evenly across the partitions (and make sure you have enough partitions).
    Unfortunately, you would likely not get a totally even distribution across nodes, because the number of entries is fairly small compared to the square of the number of nodes (by the way, which version of Coherence are you using?)...
    Best regards,
    Robert
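Robert's suggestion can be illustrated without Coherence. The sketch below (plain Java; the names are invented and this is not the Coherence KeyAssociation API) contrasts partitioning by an owner key alone, which clumps all of one owner's large blobs onto a single partition, with salting the partition key per entry so the large entries spread out:

```java
import java.util.HashSet;
import java.util.Set;

// Illustrative partitioning strategies for spreading large cache entries.
public class PartitionSpreadDemo {
    static final int PARTITIONS = 16;

    // Plain strategy: every entry for one owner hashes to one partition.
    static int ownerPartition(String ownerId) {
        return Math.floorMod(ownerId.hashCode(), PARTITIONS);
    }

    // Salted strategy for known-large entries: fold the entry id into the
    // partition key so the big values no longer land on the same node.
    static int saltedPartition(String ownerId, String entryId) {
        return Math.floorMod((ownerId + "#" + entryId).hashCode(), PARTITIONS);
    }

    public static void main(String[] args) {
        Set<Integer> plain = new HashSet<>(), salted = new HashSet<>();
        for (int i = 0; i < 100; i++) {
            plain.add(ownerPartition("bigCustomer"));
            salted.add(saltedPartition("bigCustomer", "blob-" + i));
        }
        System.out.println("plain strategy uses " + plain.size() + " partition(s)");
        System.out.println("salted strategy uses " + salted.size() + " partition(s)");
    }
}
```

The trade-off is the one Robert names: salting sacrifices co-location of an owner's entries in exchange for an even load, and with few entries relative to nodes the spread will still not be perfectly uniform.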

  • Best practice for retraction of data from BPC 10.0 to General Ledger

    Hi All,
    I have the requirement to retract data from BPC 10.0 to General Ledger.
    What is the best practice for doing this?
    I have read the "How To... Retract data from BPC 7.5 NetWeaver to SAP ERP Cost Center Accounting, Part I", but this is prepared to transfer data to Cost Center accounting. Will General Ledger be populated automatically?
    What is your opinion on this?
    Best regards,
    JA

    Hi Rich,
    In BPC 10 NW, the data entered in Input Forms has to be retracted to ECC.
    For this retraction we are following the link below:
    http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/c029accf-6d95-2d10-79aa-cc291180bf73?QuickLink=index&overridelayout=true&59180354379512
    In this document, I am unable to get the details of the class ZCL_BPC_RTRCT_SUPERCLASS.
    As it is a Z class, I am unable to create it in my system.
    Could you please help me?

  • Best practice for putting together scenes in a Flash project?

    Hi, I'm currently working on a flash project with the following characteristics:
    using a PC
    2048x1080 pixels
    30 fps
    One audio file that plays (once) continuously across the whole project
    there are actions that relate to the audio, so the timing is important
    at least 10 scenes
    about 7 minutes long total
    current intent is for it to be played in a modern theater as a surprise
    What is the best practice for working on this project and then compiling it together?
    Do it all in one project file?
    Split the work into different project (xfl) files for each scene and then put it together when all the scenes are finalized?
    Use one project file but create different "scenes" for each respective scene?  I think this is the "classic" way (?).
    Make the scenes "movie clips" and then insert them into the timeline with the audio as its own layer?
    Other?
    I'm currently working on it by having it all in one project file, but I've noticed that there's some lag (or it gets choppy) at certain parts during playback, and the SWF history shows 3.1 MB with a yellow triangle (exclamation point) warning. Thanks in advance.

    You would only do that if it makes your job easier; generally speaking, it would not.
    When trying to sync sound and animation, I think most authors find it easiest to use graphic symbols, because you can see their animation when scrubbing the main timeline. With movie clips you only see their animation when testing.
    However, if you're going to use ActionScript to control some of your symbols, those symbols should be movie clips.

  • Best practices for submitting CF data to an AJAX page?

    Hi everyone,
    I've got a project I'm working on for work and have hit a
    little problem.
    I am extracting data from my database and then after each
    piece of data (just numbers, usually 10 chunks of numbers), I tack
    a "|" onto the end of each number. Then, I output the data to the
    page. Back on my AJAX enabled page, I get the "responseText" from
    that page, and then split it up using javascript and the
    pre-inserted "|".
    This seems to work fine, but it is quite messy.
    Also, it would really be nice to be able to do sorting and various other operations on the data with JavaScript instead of having to rely on CF's icky code logic.
    Can someone please enlighten me as to best practices for this type of thing? I suspect that I'll probably be using XML somehow, but I'd like your opinion.
    Thanks!

    Check out the Samples and Documentation portions of Adobe's
    Spry website for client side use of JSON with Spry.
    http://labs.adobe.com/technologies/spry/home.html
    Here is a link to Adobe's Spry Forums:
    http://www.adobe.com/cfusion/webforums/forum/categories.cfm?forumid=72&catid=602
    If you are using CF8 you can use the SerializeJSON function
    to convert a variable to JSON. You might also be interested in the
    cfsprydataset tag. CF 8 documentation:
    http://livedocs.adobe.com/coldfusion/8/htmldocs/
    If you are using a previous version of CF there is 3rd party
    JSON support. You can find links at
    http://json.org.

  • Best Practice for Securing Web Services in the BPEL Workflow

    What is the best practice for securing web services which are part of a larger service (a business process) and are defined through BPEL?
    They are all deployed on the same oracle application server.
    Defining an agent for each?
    A gateway for all?
    The BPEL security extension?
    The top-level service that is defined as the business process is itself secured through OWSM with usernames and passwords, but what is the best practice for establishing security for each of the low-level services?
    Regards
    Farbod

    It doesn't matter whether the service is invoked as part of your larger process or not; if it performs any business-critical operation, then it should be secured.
    The idea of SOA and of designing services is to have the services available so that they can be orchestrated as part of any other business process.
    Today you may have secured your parent services, and tomorrow you could come up with a new service which uses one of the existing lower-level services.
    If all the services are in one application server, you can make the configuration/development environment a lot easier by securing them using the Gateway.
    The typical problem with any gateway architecture is that the service is available without any security enforcement when accessed directly.
    You can enforce rules at your network layer to allow access to the app server only from the Gateway.
    When you have the liberty to use OWSM or any other WS-Security product, I would stay away from any extensions. Two things to consider:
    The next BPEL developer in your project may not be aware of the security extensions.
    Centralizing security enforcement keeps your development and security operations loosely coupled and addresses scalability.
    Thanks
    Ram

  • Best Practice for Flat File Data Uploaded by Users

    Hi,
    I have the following scenario:
    1.     Users would like to upload data from a flat file and subsequently view their reports.
    2.     The SAP BW support team would not be involved in the data upload process.
    3.     Users would not go to RSA1 and use InfoPackages & DTPs. Hence, another mechanism for data upload is required.
    4.     Users consist of two groups, external and internal. External users would not have access to the SAP system; however, access via a portal is acceptable.
    What are the best practices we should adopt for this scenario?
    Thanks!

    Hi,
    I can share what we do in our project.
    We get the files from the web into a dedicated path on the application server for this process. The file placed on the server has a naming convention based on your project; you can name it. Every day a file with the same name is placed on the server with different data. The path in the InfoPackage is fixed to that location on the server. The process chain then triggers and loads the data from that particular path on the application server. After the load completes, a copy of the file is taken as a backup and the file is deleted from that path.
    So this happens everyday.
    Rgds
    SVU123
    Edited by: svu123 on Mar 25, 2011 5:46 AM
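Outside of BW, the daily pickup svu123 describes boils down to a read-copy-delete cycle on a fixed path. A minimal sketch in Java (the paths and file name are invented; the "load" step here just reads the file where the InfoPackage load would run):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardCopyOption;

// Sketch of a daily flat-file pickup: a file with a fixed name arrives in
// a fixed inbound path; after the load it is copied to a backup folder
// and removed so tomorrow's file can take its place.
public class FlatFilePickup {

    static String processOnce(Path inbound, Path backup, String fileName) throws IOException {
        Path file = inbound.resolve(fileName);
        String data = Files.readString(file);  // stands in for the actual load step
        Files.createDirectories(backup);
        Files.copy(file, backup.resolve(fileName), StandardCopyOption.REPLACE_EXISTING);
        Files.delete(file);                    // clear the inbound path for the next day
        return data;
    }

    public static void main(String[] args) throws IOException {
        Path inbound = Files.createTempDirectory("inbound");
        Path backup = Files.createTempDirectory("backup");
        Files.writeString(inbound.resolve("sales_daily.csv"), "2011-03-25,42");
        String loaded = processOnce(inbound, backup, "sales_daily.csv");
        System.out.println("loaded: " + loaded);
    }
}
```

The fixed file name and path are what let the scheduled chain stay dumb: it never has to discover files, only process whatever sits at the agreed location.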

  • Best practice for exposing internal data to external world?

    Currently we have our Internet server sitting in our corporate DMZ taking website and web service requests from the outside world.  Class libraries with compiled connection strings exist on that server.  That server then has a connection through
    the firewall to the database server.  I'm told that this is no longer the secure/recommended best practice.
    I'm told to consider having that Internet server make requests of not the database server, but rather a layer in between (application server, intranet server, whatever) that has those same Web UI methods exposed.. and then THAT server (being inside the firewall)
    connects to the database server.
    Is this the current recommended best practice to have external users interact with internal data? It seems like lots of hoops -- the outside app queries Web UI methods on the Internet server, which in turn queries the same (duplicated) methods on the intranet
    server, which then talks to the database.
    I'm just trying to determine the simplest practice, but also what is appropriately secure for our ASP.NET applications and services.
    Thanks.

    IMO this has little to do with SOA and everything to do with DMZs. What you're trying to stop is the same comm protocol accessing the database as accessed the web site. As long as you fulfil that, great. WCF can help here because it helps with configuring
    the transport of calls. Another mechanism is to use identities, but IMO it's easier to use firewalls and transports.
    http://pauliom.wordpress.com

  • Best Practices for Loading Master Data via a Process Chain

    Currently, we load attributes, text, and hierarchies before loading the transactional data.  We have one meta chain.  To load the master data it is taking more than 2 hours.  Most of the master data is full loads.  We've noticed that a lot of the master data, especially text, has not changed or changed very little since we implemented 18 months ago.  Is there a precedence or best practice to follow such as do we remove these processes from the chain?  If so, how often should it be run?  We would really like to reduce the amount of the master data loading time.  Is there any documentation that I can refer to?  What are other organizations doing to reduce the amount of time to load master data?
    Thanks!
    Debby

    Hi Debby,
    I assume you're loading master data from a BI system? The forums here are related to SAP NetWeaver MDM, so maybe you should ask this question in a BI forum?
    Nevertheless, if your data doesn't change this much, maybe you could use a delta mechanism for extraction? This would send only the changed records, and not all the unchanged ones every time. But this depends on your master data and of course on your extractors.
    Cheers
    Michael

  • Best practice for migrating eLearning data from Sol Mgr 3 to 4?

    Greetings,
    What is the recommended method for moving eLearning data when migrating from Solution Manager 3 to version 4?
    Thanks in advance,
         Ken Henderson

    948115 wrote:
    Dear All,
    This is Priya.
    We are using ODI 11.1.1.6 version.
    In my ODI project, we have separate installations for Dev, Test and Prod, i.e. master repositories are not common between the three. Now my code is ready in Dev. The Test environment has just been installed with ODI, and the Master and Work repositories are created. That's it.
    Now, I need to know and understand the simplest & best way to import the code from Dev and migrate it to the Test environment. Can someone brief the same as a step-by-step procedure in 5-6 lines?
    If this is the 1st time you are moving to QA, better export/import the complete work repositories. If it is not the 1st time, then create scenarios of the specific packages and export/import them to QA. In the case of scenarios you need not bother about models/datastores. Keep in mind that the logical schema name should be the same in QA as used in your DEV.
    Some questions on current state.
    1. Do the IDs of the master and work repositories in Dev and Test need to be the same?
    They should be different.
    2. I usually see in the export file a repository id of 999 and fail to understand what it is exactly. None of my master or work repositories are named with that id.
    It is required to ensure object uniqueness across several work repositories. For more understanding you can refer to:
    http://docs.oracle.com/cd/E14571_01/integrate.1111/e12643/export_import.htm
    http://odiexperts.com/odi-internal-id/
    3. Logical Architecture objects and contexts do not have an export option. What is the suitable alternative for this?
    If you are exporting the topology then you will get the logical connection and context details. If you are not exporting the topology then you need to manually create the context and the other physical/logical connections.
    Thanks,
    Priya
    Edited by: 948115 on Jul 23, 2012 6:19 AM

  • Best way to put binary-data into string?

    Hi there!
    What I want to do is to transfer binary data via HTTP/GET, so I have to convert the binary data into a string.
    Currently I do this the following way:
          byte[] rawSecData = new byte[4]; // any binary data
          ByteArrayOutputStream secBOS = new ByteArrayOutputStream(4);
          DataOutputStream secDOS = new DataOutputStream(secBOS);
          for (int i = 0; i < rawSecData.length; i++)
              secDOS.writeByte(rawSecData[i]); // was writeByte(rawSecData), which does not compile
          secDOS.flush();
          String secData = secBOS.toString(); // uses the platform default charset
          System.err.println("Length of resulting String: " + secData.length());
    I know that this way already worked fine; however, I have now set up my system again with another Linux distro, and now strange things happen.
    E.g. the length of the secData string differs from run to run, between 2 and 4, and I don't know why.
    Converting the binary data into a textual number representation (e.g. short binary 255 255, String: 65536) is not possible for me for various reasons.
    The funny thing is that I remember that this already worked some time ago, and I can't figure out why it now doesn't...
    Please help!

    First of all thanks a lot for your help!
    Yes, I already think it's an encoding problem, but how can I specify the encoding in my application in a portable way? I don't have an idea what to do.
    My application should run as an applet on many different 1.1+ VMs (MSJVM, Netscape 1.1.5, ...).
    Thanks again, lg Clemens
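For what it's worth, the portable fix for this kind of problem is to Base64-encode the bytes before putting them in the string: the result is pure ASCII, has a deterministic length regardless of platform charset, and survives an HTTP/GET round trip exactly. java.util.Base64 (Java 8+) does this out of the box; the 1.1-era applet VMs in this thread would need a small hand-rolled encoder, but the idea is identical. A sketch (class and method names are invented):

```java
import java.util.Base64;

// Binary data in a URL-safe string: Base64 instead of a raw charset decode.
public class BinaryOverHttpGet {

    // Encode arbitrary bytes as URL-safe ASCII text.
    static String encode(byte[] raw) {
        return Base64.getUrlEncoder().encodeToString(raw);
    }

    // Recover the exact original bytes on the receiving side.
    static byte[] decode(String text) {
        return Base64.getUrlDecoder().decode(text);
    }

    public static void main(String[] args) {
        byte[] rawSecData = {(byte) 0xFF, (byte) 0xFF, 0x00, 0x7F};
        String secData = encode(rawSecData);
        System.out.println("encoded: " + secData + " (length " + secData.length() + ")");
    }
}
```

The varying 2-to-4 length in the original code came from `ByteArrayOutputStream.toString()` using the platform default charset; Base64 removes the charset from the picture entirely, at the cost of a fixed ~33% size overhead.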

  • Best practice for R12 upgrade middle of the period

    Does anyone know where I can find an Oracle-documented best practice on when to perform the 12.1.1 upgrade? We are considering whether we can upgrade in the middle of a period or whether we should wait until the period close.

    Best practice is to read the name of a forum before you post into it.
    This forum is titled: "Oracle Database General Questions."
    My suspicion is your question relates to EBS.

  • Best Practice for attaching Meta Data, Organizing & Archiving Photos

    I am about to start the daunting task of organizing and archiving ALL of my photos so that I can access them from an external hard drive (or two) when I need them. I want to know/understand the best way to do this before I start, so that thousands of hours later I don't realize that I have missed something critical. I have already spent hours trying to find this information online, without satisfactory results. I have both iPhoto ('09 v8.1.2) and Aperture 3, which I am just learning to use. I want to back up to my WD My Book. I have already transferred some pictures and realized that, though I had them nicely organized in Albums in iPhoto, that information doesn't transfer over to the My Book, so the file names are unrecognizable as any type of descriptor. I believe that I need to assign some metadata (?) to each photo so that I can do searches to find particular people or subjects when needed.
    Here are all of the things I want/need to know:
    1. Which Mac program would be best to accomplish attaching MetaData or adjusting file names: iPhoto or Aperture 3?
    2. What is the best way to attach MetaData to pictures so that the information is permanent? Tagging (allowing multiple search options)? Batch Name Change (allowing limited search capabilities)? Other?
    a. If I TAG all of my photos and then try to use that information outside of iPhoto or Aperture 3 (from the external HD) will it still be accessible for searches?
    3. After attaching all of this information, what do I need to know about transferring the photos to an external HD? Are there multiple copies of the same photo (original, edited) and if so, are both transferred? Are they both deleted when I move the pictures to the trash?
    4. Is there anything else that I need to know??
    Your expertise would be greatly appreciated.
    Thank you.

    You are trying to defeat the photo management of iPhoto - my suggestion is to learn iPhoto and use it, or choose another program - iPhoto does not work well unless it manages the entire library and you access it using the tools built into the OS - see the discussions on the many correct and safe ways to access your iPhoto library photos.
    Aperture may or may not be a better choice - try the Aperture forum for detailed info
    Backing up iPhoto is trivial - simply drag the iPhoto library intact as a single entity to another hard drive - or use an incremental backup program to do it as changes are made - I use Time Machine to one backup drive and SuperDuper to another
    I believe that I need to assign some Meta Data (?) to each photo so that I can do searches to find particular people or subjects when needed.
    This is an integral part of iPhoto and one of its strengths
    1. Which Mac program would be best to accomplish attaching MetaData or adjusting file names: iPhoto or Aperture 3?
    iPhoto does a great job of this - file names are only adjusted when you export - and there is no need for them to be adjusted within the iPhoto library since you never should be going into it directly and never need to see the files directly
    2. What is the best way to attach MetaData to pictures so that the information is permanent?
    Exporting from iPhoto
    Tagging (allowing multiple search options)?
    Using iPhoto and the OS tools
    Batch Name Change (allowing limited search capabilities)?
    Done during export if desired
    Other?
    Depends
    a. If I TAG all of my photos and then try to use that information outside of iPhoto or Aperture 3 (from the external HD) will it still be accessible for searches?
    In iPhoto yes so long as you are not going directly into the iPhoto library but are using the provided tools
    3. After attaching all of this information, what do I need to know about transferring the photos to an external HD?
    Simply drag the iPhoto library intact as a single entity to the EHD - the database is a single entity and it will be the same there (so long as the EHD is formatted Mac OS Extended (Journaled)).
    Are there multiple copies of the same photo (original, edited) and if so, are both transferred?
    Yes - Yes
    Are they both deleted when I move the pictures to the trash?
    You do not move photos to the trash - you select them in iPhoto and press the delete key then empty the iPhoto trash (iPhoto menu ==> empty iPhoto trash)
    4. Is there anything else that I need to know??
    Lots - if you are going to use iPhoto stop thinking of managing files and start thinking of using a SQL database which is what iPhoto is -- otherwise you need to use a different program
    LN
