Integration Builder Memory Consumption

Hello,
we are experiencing very high memory consumption in the Java IR designer (not the directory), especially when loading normal graphical IDoc-to-EDI mappings, but also plain IDoc-to-IDoc mappings. Examples (RAM on the client side):
- open a normal IDoc-to-IDoc mapping: + 40 MB
- IDoc to EDI ORDERS D93A: + 70 MB
- a second IDoc to EDI ORDERS D93A: + 70 MB
- execute those mappings: no additional consumption
- a third EDI to EDI ORDERS D93A: + 100 MB
(all mappings in the same namespace)
After three more mappings, client-side RAM climbs to 580 MB and then a Java heap error occurs. Sometimes also an OutOfMemoryError, after which you have to terminate the application.
Obviously the mapping editor is not well optimized for RAM usage. It seems not to cache the in/out message structures, or it loads a lot of dedicated functionality for every mapping.
So we cannot really call that fun; working is very slow.
Do you have similar experiences? Are there workarounds? I know the JNLP memory setting parameters, but the problem is the high load of each mapping, not only the overall maximum memory.
And we are using only graphical mappings, no XSLT!
We are on XI 3.0 SP 21
CSY
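For reference, the JNLP memory parameters mentioned above look roughly like this in the designer's .jnlp file (a sketch; the attribute values are illustrative, and the actual file is generated by the XI server):

```xml
<j2se version="1.4+" initial-heap-size="128m" max-heap-size="1024m"/>
```

As noted, raising max-heap-size only lifts the ceiling; it does not reduce the per-mapping load.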

Hi,
apart from raising the tablespace, see
Note 425207 - SAP memory management, current parameter ranges
You can also configure operation modes to change work processes dynamically using RZ03/RZ04.
Please see the link below:
http://help.sap.com/saphelp_nw04s/helpdata/en/c4/3a7f53505211d189550000e829fbbd/frameset.htm
You can contact your Basis administrator for the necessary action.

Similar Messages

  • How to measure memory consumption during unit tests?

    Hello,
    I'm looking for simple tools to automate measurement of overall memory consumption during some memory-sensitive unit tests.
    I would like to apply this when running a batch of some test suite, targeting tests that exercise memory-sensitive operations.
    The intent is, to verify that a modification of code in this area does not introduce regression (raise) of memory consumption.
    I would include it in the nightly build, and monitor the evolution of the summary figure (a-ah, the "userAccount" test suite consumed 615 MB last night, compared to 500 MB the night before... What did we check in yesterday?)
    Running on Win32, the system-level info of memory consumed is known not to be accurate.
    Using perfmon is more accurate, but it seems overkill - plus it's difficult to automate; you have to attach it to an existing process...
    I've looked at the hprof tool included in Sun's JDK, but it seems to be targeted at investigating problems rather than discovering them. In particular, there isn't a "summary line" of the total memory consumed...
    What tools do you use/suggest?

    However this requires manual code in my unit test classes themselves, e.g. in my setUp/tearDown methods. I was expecting something more orthogonal to the tests, that I could activate or not depending on the purpose of the test.

    Some IDEs display memory usage and execution time for each test/group of tests.

    If I don't have another option, OK, I'll wire my own pre/post memory counting, maybe using AOP, and will activate memory measurement only when needed.

    If you need to check the memory used, I would do this. You can do the same thing with AOP, but unless you are already using an AOP library, I doubt it is worth the additional effort.

    Have you actually used your suggestion to automate memory consumption measurement as part of daily builds?

    Yes, but I have less than a dozen tests which fail if the memory consumption is significantly different. I have more tests which fail if the execution time is significantly different.
    Rather than use the setUp()/tearDown() approach, I use the testMethod() as a wrapper for the real test and add the check inside it. This is useful as different tests will use different amounts of memory.

    Plus, I did not understand your suggestion, can you elaborate?
    - I first assumed you meant freeMemory(), which, as you suggest, is not accurate, since it returns "an approximation of [available memory]"

    freeMemory() gives the free memory out of the total. The total can change, so you need to take total - free as the memory used.

    - I re-read it and now assume you do mean totalMemory(), which unfortunately will grow only when more memory than the initial heap setting is needed.

    More memory is needed when more memory is used. Unless your test uses a significant amount of memory, there is no way to measure it reliably; i.e. if a GC is performed during a test, the test can appear to use less memory than it consumes.

    - Eventually, I may need to include calls to System.gc(), but I seem to remember it is best-effort only (endless discussion) and may not help accuracy.

    If you do a System.gc() followed by a Thread.yield() at the start, it can improve things marginally.
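The approach discussed above (total minus free, with a gc() and yield() beforehand) can be sketched like this. MemoryMeter and measure() are illustrative names, not an existing API, and the numbers are best-effort approximations, as the thread warns:

```java
public class MemoryMeter {

    /** Best-effort snapshot of memory currently in use, in bytes. */
    static long usedMemory() {
        Runtime rt = Runtime.getRuntime();
        // Hint the collector and yield, as suggested above; this only
        // improves accuracy marginally and gives no guarantees.
        rt.gc();
        Thread.yield();
        // totalMemory() can grow, so used = total - free, not just free.
        return rt.totalMemory() - rt.freeMemory();
    }

    /** Runs the task and returns the approximate bytes it retained. */
    static long measure(Runnable task) {
        long before = usedMemory();
        task.run();
        return usedMemory() - before;
    }

    public static void main(String[] args) {
        // Retain an 8 MB array so the second gc() cannot reclaim it.
        byte[][] sink = new byte[1][];
        long delta = measure(() -> sink[0] = new byte[8 * 1024 * 1024]);
        System.out.println("approx bytes retained: " + delta);
    }
}
```

Wrapping the real test inside such a measure() call matches the testMethod()-as-wrapper idea above; only allocations the task actually retains show up in the delta.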

  • Memory Consumption: Start A Petition!

    I am using SQL Developer 4.0.0.13 Build MAIN 13.80. I was praying that SQL Developer 4.0 would no longer use so much memory and slow to a crawl when doing so. But that is not the case.
    Is there a way to start a "petition" to have the SQL Developer team focus on the product's memory usage? This problem has been there for years now, with many posts and no real answer.
    If there isn't a place to start a "petition" let's do something here that Oracle will respond to.
    Thank you

    Yes, at this point (after restarting) SQL Developer is functioning fine. Windows reports 1+ GB of free memory. I have 3 worksheets open, all connected to two different DB connections. Each worksheet has 1 to 3 pinned query results. My problem is that after working in SQL Developer for a day or so, with perhaps 10 worksheets open across 3 database connections, and having queried large data sets and performed large exports, it becomes unresponsive even after closing worksheets. It appears to me that it does not clean up after itself.
    I will use Java VisualVM to compare memory consumption and see if it reports that SQL Developer is releasing memory, but in the end I don't care about that. I just need a responsive SQL Developer, and if I need to close some worksheets at times I can understand doing so, but at this time that does not help.

  • Portal Session Memory Consumption

    Dear All,
    I want to see the user sessions' memory consumption for Portal 7.0, i.e. if a portal user opens a session, how much memory is consumed by him/her. How can I check this? Is there any default value associated with this?
    Will backend system memory load get added to the portal's consumption, or to that specific backend system's memory consumption?
    Thanks in Advance......
    Vinayak

    I'm seeing the exact same thing with our setup (it's essentially the same as yours). The WLS 5.1 documentation indicates that Java objects that aren't serializable aren't supported with in-memory replication. My testing has indicated that the <web_context>._SERVLET_AUTHENTICATION_ session value (which is of class type weblogic.servlet.security.ServletAuthentication) is not being replicated. From what I can tell in the WLS 5.1 API Javadocs, this class is a subclass of java.lang.Object (no mention of serializable) as of SP9.
    When <web_context>._SERVLET_AUTHENTICATION_ doesn't come up in the
    SECONDARY cluster instance, the <web_context>.SERVICEMANAGER.LOGGED.IN
    gets set to false.
    I'm wondering if WLCS3.2 can only use file or JDBC for failover.
    Either way, if you learn anything more about this, will you keep me
    informed? I'd really appreciate it.
    >
    Hi,
    We have clustered two instances of WLCS in our development environment with the properties file configured for "in-memory replication" of session data. Both instances come up properly and join the cluster properly. But the problem is with the in-memory replication: it looks like the session data of the portal is not getting replicated.
    We tried with the simplesession.jsp in this cluster and its session data is properly
    replicated.
    So, the problem seems to be with the session data put by Portal (and that is the reason why I am posting it here). Every time, the "logged in" check fails with the removal of one of the instances serving the request. Is there a known bug/patch for the session data serialization of WLCS? We are using 3.2 with Apache as the proxy.
    Your help is very much appreciated.--
    Greg
    GREGORY K. CRIDER, Emerging Digital Concepts
    Systems Integration/Enterprise Solutions/Web & Telephony Integration
    (e-mail) gcrider@[NO_SPAM]EmergingDigital.com
    (web) http://www.EmergingDigital.com
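For reference, the usual application-side fix is to make everything that goes into the session Serializable, so in-memory replication can copy it to the secondary instance. A minimal sketch (LoginInfo is an illustrative class, not a WebLogic API; the round-trip helper only approximates what replication does):

```java
import java.io.*;

// A session attribute that survives in-memory replication because
// every field is serializable. LoginInfo is an illustrative name.
public class LoginInfo implements Serializable {
    private static final long serialVersionUID = 1L;
    private final String user;
    private final long loginTime;

    public LoginInfo(String user, long loginTime) {
        this.user = user;
        this.loginTime = loginTime;
    }
    public String getUser() { return user; }
    public long getLoginTime() { return loginTime; }

    // Round-trip through serialization, roughly what the cluster does
    // when it copies the attribute to the secondary instance.
    static LoginInfo copyBySerialization(LoginInfo in)
            throws IOException, ClassNotFoundException {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        new ObjectOutputStream(bos).writeObject(in);
        ObjectInputStream ois = new ObjectInputStream(
                new ByteArrayInputStream(bos.toByteArray()));
        return (LoginInfo) ois.readObject();
    }

    public static void main(String[] args) throws Exception {
        LoginInfo copy = copyBySerialization(new LoginInfo("greg", 12345L));
        System.out.println(copy.getUser());
    }
}
```

In a servlet you would then do session.setAttribute("loginInfo", new LoginInfo(...)) with such an object instead of a non-serializable one like ServletAuthentication.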

  • NullPointerExeption in Integration Builder since SP12

    Hi,
    since we patched our XI to SP12, I get NullPointerExceptions while working with the Integration Builder (Repository and Directory). These exceptions are thrown when I try to save or activate an object. It works about 4 or 5 times and then the error appears. The error appears for every object type (Interfaces, Mappings, Receiver Determinations, etc.).
    When we restart the J2EE Engine, XI works properly until the next NullPointerException appears. Once this error occurs, I am no longer able to save or activate anything.
    What can I do to fix this problem?
    Thanks for your help
    Thomas

    Hi Thomas
    If NullPointerExceptions are occurring sporadically in your case, I guess the problem is insufficient memory. Try increasing the memory allocation; this should hopefully resolve the problem.
    Another cause could be the deletion of the work directory in the SDM (I am not sure whether this applies in your case). But just check OSS note 677977; maybe this could also resolve your problem.
    cheers
    Sameer

  • PI 7.11 sp6 Integration Builder - folder view error

    Experts,
    We recently installed SAP NW PI 7.11 (EhP1) up to SP6. This is a new install. We did all the wizards, etc. and things are basically OK.
    We execute SXMB_IFR, then go to the "Integration Builder" link and log in.
    If I change the view from "object view" (icon of 2 people) to "folder view" (icon of a folder), we get the following error:
    class com.sap.guid.GUID:library:core_lib@com.sap.engine.boot.loader.ResourceMultiParentClassLoader@2353f67e@alive incompatible with interface com.sap.aii.ib.bom.gen.XiReference:sap.com/com.sap.xi.directory@com.sap.engine.boot.loader.ResourceMultiParentClassLoader@79c008cf@alive
    I found this note, but it does not quite describe my problem:
    Note 1525457 - Previous version of Directory API and Value Mappings
    I searched all through SDN and OSS, nothing yet.  I'm going to put in a message but I was wondering if anyone out there has seen this?
    Thanks!
    NICK

    Experts,
    FYI, read my problem details above, but I wanted to add one thing. Apparently we were at SP5 for a brief time and our developer did create some objects at that level. He has reported the issue since we moved to SP6.
    --NICK

  • Unable to Launch Integration Builder: Java Web Start problem

    Hi,
    I am unable to launch the Integration Builder. I am using JDK 1.4.2_05. Java Web Start gives the following error:
    Unsigned application requesting unrestricted access to system
    Unsigned resource: http://<XIHostName>:50000/rep/repository/aii_util_rb.jar
    I have tried deleting the Java web start cache, and even reinstalling the JDK.
    Can someone suggest a workaround for this problem?
    thanks,
    Manish

    Java Web Start cannot accept unsigned jars. If the jars are signed, it will ask in a dialog box whether it should allow unrestricted access to the jar.
    In my case, I upgraded XI SP0 to SP4, which included updating the Adapter Core, the Adapter Framework, and the XI Tools to SP4.
    On launching Web Start, it seemed that the jars were not getting signed. JWS cannot deliver unsigned jars to the client.
    These jars are maintained in:
    \usr\sap\<SID>\SYS\global\xi\directory_server\javaws\directory
    I am having problems with my XI upgrade to SP4, and it is probably not signing the jars.
    Do you know why this may be happening ...
    thanks,
    Manish
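A quick way to check from the client side whether the jars in that directory are actually signed (a sketch using the standard java.util.jar API; this approximates what Java Web Start verifies, it is not the JWS code itself):

```java
import java.io.*;
import java.util.Enumeration;
import java.util.jar.*;

public class JarSignCheck {

    // Returns true only if every real entry in the jar carries a signature.
    static boolean isSigned(File jarFile) throws IOException {
        try (JarFile jar = new JarFile(jarFile, true)) { // true = verify while reading
            Enumeration<JarEntry> entries = jar.entries();
            while (entries.hasMoreElements()) {
                JarEntry entry = entries.nextElement();
                if (entry.isDirectory() || entry.getName().startsWith("META-INF/")) {
                    continue; // signature files themselves are not signed
                }
                // Certificates become available only after the entry is fully read.
                try (InputStream in = jar.getInputStream(entry)) {
                    byte[] buf = new byte[8192];
                    while (in.read(buf) != -1) { /* drain */ }
                }
                if (entry.getCertificates() == null) {
                    return false; // found an unsigned entry
                }
            }
        }
        return true;
    }

    public static void main(String[] args) throws IOException {
        System.out.println(args[0] + " signed: " + isSigned(new File(args[0])));
    }
}
```

Running this against e.g. aii_util_rb.jar downloaded from the /rep URL would confirm whether the upgrade left the jars unsigned.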

  • Please Help PI Data Dependent Integration Builder Authorizations NOT Workng

    Dear Friends / Experts,
    I have spent many days, explored all the weblogs and links on this website, and implemented all the steps required to achieve Data-Dependent Integration Builder Security, and I am not successful so far. I am just giving up now - please help me.
    As I said, I have already read all the important forum links and SAP web links and followed each and every step - service.sap.com/instguidesNW04 -> Installation -> SAP XI
    Security Requirement - Data Dependent/Object Level Authorizations in XI / PI
    In distributed teams or in a shared PI environment it might be necessary to limit authorization for a developer or a group of developers to only one Software Component or objects within a Software Component or to specific Configuration Objects.
    Our Environment - PI 7.0 SP 16
    Created a new role in the Integration Builder Design tool
    - Add object types of any Software Component and Namespace
    - Enable usage of Integration Builder roles in the Exchange Profile (Integration Builder - Integration Builder Repository): set parameter com.sap.aii.util.server.auth.activation to true
    Assigned users to the newly created Integration Builder roles
    - Create dummy roles in Web AS ABAP; these roles are then available as groups in Web AS Java
    - Assign users to these roles
    - Assign the Integration Builder roles to the above groups in Web AS Java
    - Assign unrestricted roles to super users
    Please help - How to validate whether Data Dependent Authorizations are Activated?
    I am working with the XI developers and the Basis team, and we did update all the required Exchange Profile parameters.
    Per this document - User Authorizations in Integration Builder Tools - do we need to update server.lockauth.activation in the Exchange Profile? When we updated it, it removed edit access from all XI developers in PI.
    In both the Integration Repository and the Integration Directory, you can define more detailed authorizations that restrict access to design and configuration objects.
    In both tools, you define such authorizations by choosing Tools -> User Roles from the menu bar. The authorization for this menu option is provided by role SAP_XI_ADMINISTRATOR_J2EE. Of course, this role should only be granted to a very restricted number of administrators. To activate these more detailed authorizations, you must set exchange profile parameter com.sap.aii.ib.server.lockauth.activation to true.
    The access authorizations themselves can be defined at the object-type level only (possibly restricted by a selection path), where you can specify each access action either individually as Create, Modify, or Delete for each object type, or as an overall access granting all three access actions.
    http://help.sap.com/saphelp_nw04/helpdata/en/f7/c2953fc405330ee10000000a114084/frameset.htm
    I was able to control display and maintain access via ABAP roles, but completely failed to implement Integration Builder security.
    Are there any ways to check whether Data-Dependent authorization or J2EE authorizations are activated?
    Thanks a lot
    Satish

    Hello,
    so to give you the status of our issue:
    We were able to export the missing business component.
    But we also exported some interfaces after that, and we got some return code 8 errors due to objects still present in the change list on the quality system (it seems that after the previous failed transports, the change list was not cleared completely...).
    So now we have checked that no objects are present in the change list of the quality system, and we plan to export our developments to the quality system again.
    Hopefully after that there will be no more return code 8 during imports, and all developments will be transported correctly to the quality system.
    I also recommend reading this, which is pretty good:
    http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/7078566c-72e0-2e10-2b8a-e10fcf8e1a3d?overridelayout=t…
    Thanks all,
    S.N

  • How to Launch an Integration Builder under two different java versions

    How to Launch an Integration Builder under two different Java versions
    1. Situation
    2. How To Do
    2.1 JRE preparation
    2.2 Put them into the system
    2.3 Execute Java Web Start under JRE 1.4.x
    2.4 Change Java Runtime Versions
    2.5 Launch an Integration Builder
    1. Situation
    OS: Windows 2000 Pro – English
    Java version: JDK 1.5.x was already installed. (It's not permitted to change it.)
    I don't have any authorization to install any software on the PC.
    But I need to use the Integration Builder.
    I already knew the URL of the Integration Builder (http://<hostname>:50000/rep/start/repository.jnlp).
    2. How To Do
    At this moment, the Integration Builder (XI 3.0) can be launched under a JRE 1.4.x environment (on Windows).
    2.1 jre preparation
    I downloaded j2re-1_4_2_10-windows-i586-p.exe from http://java.sun.com/j2se/1.4.2/download.html
    I installed it on my home PC and copied all files from C:\Program Files\Java\j2re1.4.2_10\ onto my USB stick.
    2.2 Put them into the system
    I pasted the j2re1.4.2_10 folder from my USB stick into the Windows 2000 Pro system.
    Finally, I could list all the javaws.exe files on this system:
    c:\j2re1.4.2_10\javaws\javaws.exe
    c:\Program Files\Java\jdk1.5.0_05\bin\javaws.exe
    c:\Program Files\Java\jdk1.5.0_05\jre\bin\javaws.exe
    c:\Program Files\Java\jre1.5.0_05\bin\javaws.exe
    2.3 Execute a Java Web Start under jre 1.4.x version
    I executed c:\j2re1.4.2_10\javaws\javaws.exe.
    2.4 Change Java Runtime Versions
    Go to File-> Preferences -> Java
    As you can see, it indicates 1.5 version.
    Click [FIND] button.
    Click [NEXT] button.
    Click the j2re1.4.2_10 folder.
    Click [NEXT] button.
    The JRE Finder is able to find javaw.exe automatically, or you can point it to C:\j2re1.4.2_10\bin\javaw.exe directly.
    Click [NEXT] button.
    Finally, there are two Java Runtime Versions listed. Now you need to uncheck the Enabled column for the 1.5 version and check the 1.4 version.
    Click [OK] button.
    Well, in the General tab, I selected None for Proxies.
    2.5 Launch an Integration Builder
    In the Location field, I typed the URL of the Integration Builder jnlp:
    http://<hostname>:50000/rep/start/repository.jnlp
    SAP Integration Builder comes up in the Applications area.
    Select it and click the [Start] button.
    If you click Environment-> Integration Builder (Configuration), you can launch Integration Builder: Configuration.
    [PDF file location] with screenshots
    http://SDN.mobilian.org/SDN/How2LaunchIB.rar

    I am not getting anywhere with deploying my application or applet.
    I have set up my BC4J project. It contains all my VO info, links, and the application module (proj a).
    I then have another project with DbInfo in it (has all my rowset info), multiple frames, and my Applet.java file.
    Actually I have an Applet.java file and an Application.java file because I was seeing if both/either worked. Anyway they seem the same, except for that extra window that comes up when you run the applet.
    I follow the steps in the Oracle directions (from an earlier post), and all seems OK. But at:
    - Select the subdirectory under myhtml where your applet's HTML file is located, and enter the directory path of the 'staging' directory you created in step 3 above, if different from the default.
    - Select the HTML files that JDeveloper created to run your applet.
    - Select all of the Java source files in your project that make up the applet.
    I have no HTML file associated with my applet, at least none that I know of. So do I need to create one, or should it have been created automatically?
    Also, I'm trying to figure out the best way to deploy my project; an applet or a stand-alone application have been my first choices so far. I have read that there are some issues with applets being served from a different server than the database, so a stand-alone application was my front runner, but I haven't gotten either way to work yet.

  • Data not Updated Automatically in SLD to XI Integration Builder

    I am trying to create a new software component in the SLD for use with our XI machine. However, when I do this, the component does not appear in the XI Integration Builder.
    I have looked at the Automatically Updated Data page of the central SLD, and noticed that our XI J2EE system has not auto-updated with the SLD for quite some time, if ever.
    Can somebody please give me some pointers on how to check/configure this? Also, is there a way to update the information between the SLD and the XI system manually?
    Kind Regards,
    Tony.

    I think the problem is the system cannot actually see the SLD for some reason.
    If I try to access the SLD using the DNS name, I am unable to do so:
    http://dnsname:52300/sld/index.html
    However, if I use the IP address direct, I can do so:
    http://192.xxx.xxx.xxx:52300/sld/index.html
    I notice that when I try to access the SLD from the XI start page, the link is using the http://dnsname:52300/sld/index.html form, which isn't working.
    Also, when I try to Import Software Component Versions from the Integration Builder menu, I get the error:
    com.sap.aii.ibrep.client.swc.ExtSwcAccessException: Unable to read software component versions from System Landscape Directory "dnsname:52300"
    So I'm thinking this may be some kind of DNS error, but I don't know how to resolve it.
    Any clues?
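To confirm it really is a name-resolution problem, a one-file check from the client machine can help (plain JDK, nothing XI-specific; "dnsname" is the placeholder from the post):

```java
import java.net.InetAddress;
import java.net.UnknownHostException;

public class DnsCheck {
    public static void main(String[] args) {
        // Pass the SLD host name, e.g. "dnsname", as the first argument.
        String host = args.length > 0 ? args[0] : "localhost";
        try {
            InetAddress addr = InetAddress.getByName(host);
            System.out.println(host + " resolves to " + addr.getHostAddress());
        } catch (UnknownHostException e) {
            System.out.println("Cannot resolve '" + host
                    + "' - check DNS or the local hosts file");
        }
    }
}
```

If the host name fails here too, fixing DNS (or adding the SLD host to the client's hosts file) should make the http://dnsname:52300 links work again.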

  • ERROR, while i importing file in Integration Builder/

    Hi everybody!
    Please help me.
    I'm integrating SAP R/3 and IBM MAXIMO via SAP XI 3.0 (SAP NetWeaver 2004s).
    I need to import a file into the Integration Builder, and while importing it I get the error:
    ResourceException in method ConnectionFactoryImpl.getConnection(): com.sap.engine.services.dbpool.exceptions.BaseResourceException: SQLException thrown by the physical connection: com.sap.sql.log.OpenSQLException: Error while accessing secure store: File "sapmaximo\sapmnt\IPC\SYS\global\security\data\SecStore.properties" does not exist although it should.
    I also took a screenshot:
    Link: [http://ipicture.ru/Gallery/Viewfull/17818229.html]
    I am a beginner in SAP NetWeaver; please tell me how to solve this step by step.

    I need to import the file (XI3_0_IMEA-INTEGRATE_IMEA71-mySAPERP2005_of_ibm.com_5.tpz); it's in another format.
    So I found the same error on this forum, Link: [Error while generating Rules in GRC5.3 RAR Component]
    The solution for this error was: "Another reason for this error can be lack of resources or the pool size of the server. The recommended settings for the pool size are:
    Max Pool Size: 50
    Max Connections: 100
    Connection Timeout: 10 sec
    Max Wait Time: 30 sec"
    Can you tell me where I can configure these settings? (step by step)

  • Problem with Integration Builder Logon

    Hi all,
    we get the following error message when logging on to the Integration Builder:
    "Authorization error. Unknown user name or incorrect password."
    We installed SDK versions 1.4.2_03 and 1.4.2_05 on our computer and have local admin rights.
    Has anyone had a similar problem, and can you help us further?
    Thanks.
    Niko

    Hi Niko,
    The userid/password must be created and be valid on the ABAP side of the XI server (via SU01), with the correct XI developer roles.
    After creating the userid in SU01, it will take a few minutes before it is migrated to the J2EE side so that you can use it to logon to the Integration Builder.
    Regards,
    Bill

  • Business Systems not appearing in the Integration Builder (Configuration)

    Hi Experts,
    I have created new business systems in the SLD for my PI system.
    They are not appearing in the Integration Builder.
    I executed a delta cache refresh in transaction SXI_CACHE, but the issue remains. Please let me know what I should do to get the business systems to appear.
    Thank you.
    Thanks
    Kasee

    Business system from SLD not visible in Integration Directory

  • Problems updating projects to new versions of Premiere (CS5 to CC and CC to CC 2014) Memory consumption during re-index and Offline MPEG Clips in CC 2014

    I have 24GB of RAM in my 64 bit Windows 7 system running on RAID 5 with an i7 CPU.
    A while ago I updated from Premiere CS5 to CC and then from Premiere CC to CC 2014. I updated all my then current projects to the new version as well.
    Most of the projects contained 1080i 25fps (1080x1440 anamorphic) MPEG clips originally imported (captured from HDV tape) from a Sony HDV camera using Premiere CS5 or CC.
    Memory consumption during re-indexing.
    When updating projects I experienced frequent crashes going from CS5 to CC and later going from CC to CC 2014. Updating projects caused all clips in the project to be re-indexed. The crashes were due to the re-indexing process causing excessive RAM consumption and I had to re-open each project several times before the re-index would eventually complete successfully. This is despite using the setting to limit the RAM consumed by Premiere to much less than the 24GB RAM in my system.
    I checked that clips played; there were no errors generated; no clips showed as Offline.
    Some clips now "Offline: Importer" in CC 2014
    Now, after some months editing one project I found some of the MPEG clips have been flagged as "Offline: Importer" and will not relink. The error reported is "An error occurred decompressing video or audio".
    The same clips play perfectly well in, for example, Windows Media Player.
    I still have the earlier Premiere CC and the project file and the clips that CC 2014 importer rejects are still OK in the Premiere CC version of the project.
    It seems that the importer in CC 2014 has a bug that causes it to reject MPEG clips with which earlier versions of Premiere had no problem.
    It's not the sort of problem expected with a premium product.
    After this experience, I will not be updating premiere mid-project ever again.
    How can I get these clips into CC 2014? I can't go back to the version of the project in Premiere CC without losing hours of work/edits in Premiere CC 2014.
    Any help appreciated. Thanks.

    To answer my own question: I could find no answer to this myself and, with there being no replies in this forum, I have resorted to re-capturing the affected HDV tapes from scratch.
    Luckily, I still had my HDV camera and the source tapes and had not already used any of the clips that became Offline in Premiere Pro CC 2014.
    It seems clear that the MPEG importer in Premiere Pro CC 2014 rejects clips that Premiere Pro CC once accepted. It's a pretty horrible bug that ought to be fixed. Whether Adobe have a workaround or at least know about this issue and are working on it is unknown.
    It also seems clear that the clip re-indexing process that occurs when upgrading a project (from CS5 to CC and also from CC to CC 2014) has a bug which causes memory consumption to grow continuously while it runs. I have 24GB RAM in my system and regardless of the amount RAM I allocated to Premiere Pro, it would eventually crash. Fortunately on restarting Premiere Pro and re-loading the project, re-indexing would resume where it left off, and, depending on the size of the project (number of clips to be indexed), after many repeated crashes and restarts re-indexing would eventually complete and the project would be OK after that.
    It also seems clear that Adobe support isn't the greatest at recognising and responding when there are technical issues, publishing "known issues" (I could find no Adobe reference to either of these issues) or publishing workarounds. I logged the re-index issue as a bug and had zero response. Surely I am not the only one who has experienced these particular issues?
    This is very poor support for what is supposed to be a premium product.
    Lesson learned: I won't be upgrading Premiere again mid project after these experiences.

  • Non-central adapter engine not visible in the Integration Builder

    Hello,
    I have a PI system and a separate non-central Adapter Engine. After installing, I did all the post-installation actions, including the following part:
    5.13 Clearing the SLD Data Cache after Installing a Non-central Advanced Adapter Engine
    When you have installed a non-central Advanced Adapter Engine, you need to manually clear the SLD Data Cache in the Integration Builder to make it visible and selectable in the communication channels.
    Procedure
    1.
    After SAPinst has finished, open the Integration Builder of your PI system at http://<host>:<port>/dir/start/index.jsp (Integration Directory) and log on as a user with the ABAP role SAP_XI_CONFIGURATOR assigned.
    2.
    In the Integration Builder, choose Environment.
    3.
    From the drop-down list, choose Clear SLD Data Cache.
    (as described in: Installation Guide - Adapter Engine (Java EE) 7.1 Including Enhancement Package 1 on Windows: MS SQL Server)
    However, when creating a communication channel in the Integration Builder, only the central Adapter Engine can be chosen on the parameters tab.
    I was hoping someone knows what else needs to be done to make the non-central Adapter Engine known.
    thanks in advance
    Peter

    Hi,
    with user PIDIRUSER I was indeed able to refresh the AF cache, unfortunately without a positive result.
    Under the Runtime Workbench (http://<host>:<port>0/rwb/index.jsp), tab SLD Registration, I don't see the AE, not even under non-central components. If I try to register, I get:
    Registration of Adapter Framework with SLD was successful
    Registration of fix AF adapter services with the SLD was successful
    Registration of dynamic CPA cache based AF adatper services with SLD was successful
    but no AAE... although it is visible in the SLD under Technical Systems...
