Is it possible to parameterize MDEX_HOST?

I am running Latitude Data Integrator Designer 2.2.2 and CloverETL Server on a Windows server.
I want to execute the same graphs against different MDEX hosts, loading the same data to three different MDEX servers.
Please advise on what the best method would be.

This sounds like a case for the Cluster Coordinator where you would set up one of your graphs as a leader and the other two as followers. After the update of the leader completes, Cluster Coordinator would then ensure that the followers are updated automatically. Architecturally, this is probably the recommended solution.
If you're having trouble with Clustering, or you'd prefer to avoid it, I can conceive of a number of techniques to load the graphs sequentially. Probably the easiest would be to have one MDEX Loader (aka BulkLoader) for each index; in your case you would have three loaders. Then place a "SimpleCopy" ETL element into your graph and connect the edge that was previously feeding into your single MDEX Loader to the SimpleCopy input. Then connect as many SimpleCopy output ports as necessary to your MDEX Loaders. Each MDEX Loader would then be configured to hit the right MDEX host, and you can load them in different phases.
You could also write something to iteratively execute the same graph multiple times with a different value for the MDEX_HOST. There's a ton of ways to manipulate that variable from editing the workspace.prm file dynamically to passing that value to the graph when you use a RunGraph component, etc.
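As a sketch of that iterative approach, you could generate one run command per MDEX host and hand each to whatever launcher you use (the `clover` executable name and the `-P:` parameter-override syntax below are illustrative assumptions, not confirmed flags for your installation):

```java
import java.util.ArrayList;
import java.util.List;

public class MdexHostFanout {
    // Build one graph-run command per MDEX host, overriding the
    // MDEX_HOST parameter on each invocation (syntax is illustrative).
    static List<String> buildCommands(String graph, List<String> hosts) {
        List<String> cmds = new ArrayList<>();
        for (String host : hosts) {
            cmds.add("clover " + graph + " -P:MDEX_HOST=" + host);
        }
        return cmds;
    }

    public static void main(String[] args) {
        buildCommands("LoadData.grf", List.of("mdexA", "mdexB", "mdexC"))
                .forEach(System.out::println);
    }
}
```

The point is only that the host is injected per run rather than baked into workspace.prm; the same idea works whether the commands are fed to a scheduler or a RunGraph component.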
Regards,
Patrick
Note: All of the "iterative" or "multi-load" approaches will cause your load times to increase since you're doing 3 separate loads as opposed to letting the Clustering handle load and distribution for you. The clustered load also has a cost associated but it should be much lower (depending on volume) than loading 3 graphs on their own.

Similar Messages

  • Is it possible to parameterize a scheduled job?

    I have the following scenario which I consider implementing using Azure scheduler:
    - Request information from an external Web service
    - Store the result in the Azure blob
    When requesting the information I need to supply an HTTP header indicating since what time to retrieve the requested information. So with each new request I need to update such header.
    I checked Azure scheduler but its configuration seems to be static, i.e. I have to predefine all request parameters. Is it so?
If it is so, then what would be the most practical way of implementing such a schedule? Should I build a simple intermediate Web service that will take care of request parameterization so I can call it using a static job configuration? It looks straightforward; I just don't want to create middleware if it can be avoided.
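If you do end up writing that thin middleware, its core job is just remembering the previous run time and emitting it as the header value on each call. A minimal sketch (the class name is made up, and a real service would persist the marker rather than keep it in memory):

```java
import java.time.Instant;
import java.time.format.DateTimeFormatter;
import java.util.concurrent.atomic.AtomicReference;

public class SinceHeaderTracker {
    // Remembers when we last polled; each call returns the previous
    // marker (the "since" header value) and advances it to `now`.
    private final AtomicReference<Instant> lastRun;

    public SinceHeaderTracker(Instant start) {
        this.lastRun = new AtomicReference<>(start);
    }

    public String nextSinceHeader(Instant now) {
        Instant prev = lastRun.getAndSet(now);
        return DateTimeFormatter.ISO_INSTANT.format(prev);
    }
}
```

The scheduler can then call this service with a static configuration, and the service forwards the request to the external Web service with the freshly computed header.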
    Thanks in advance
    Vagif Abilov

    Hi Vagif,
    Thanks for your posting!
I totally understand your requirement. You could customize the headers using the Scheduler API. Please see this blog (http://fabriccontroller.net/blog/posts/a-complete-overview-to-get-started-with-the-windows-azure-scheduler/) and code sample (https://github.com/bradygaster/maml-demo-scheduled-webjob-creator/blob/master/ScheduledWebJobCreator/Program.cs).
    Hope this helps.
    Regards,
    Will

  • Parameterize schema name in AMDP for accesing different SLT replicated schemas

    Hi All,
    We need to create an AMDP for accessing tables from different schemas with SLT replicas of different ERP instances.
Does someone know if it's possible to parameterize the schema name in an AMDP, so that the same stored procedure can be used for accessing the same tables in different schemas on the same HANA DB instance?
    Best regards
    Fernando Alvarez

    Dear Fernando Alvarez,
    Does your use case require access to multiple schemas during the same call to the AMDP method at runtime or is it the case that during a method call you would access only one specific schema?
If the latter is the case, you might want to check the possibility of using secondary database connections from AMDP. From NW AS ABAP 7.4 SP08 onwards, AMDP methods support the usage of secondary database connections (maintained from DBACOCKPIT).
    You can find more information from this help documentation.
    Additionally you can refer to the following blog as well.
    Please let us know if this information is helpful to identify the right solution for your use case.
    Best regards
    Sundar

  • How to parameterize pages or books

    Hi all,
in our project we need to handle navigation entries in different ways. Therefore we thought about customizing the skeleton JSP to behave differently when rendering menu entries for special pages or books.
    We were looking for something like:
<c:choose>
<c:when test="${pagepc.myProperty == true}">
....do something special here...
</c:when>
<c:otherwise>
....default rendering here...
</c:otherwise>
</c:choose>
    Question: Is it possible to parameterize pages or books? I hoped to find a properties view in portal console - but there is none.
    Thanks in advance.

You need to add the properties or metadata on the consumer's page, not the producer's. Even though you are remoting the page, the page is not really remoted (only the portlets on the page are).
If this is not an option you may want to look into something around custom data transfer, but I really don't think this information should be coming from the producer. The producer just produces portlets, and they shouldn't really be tightly coupled to the page they are on.
As far as the EL goes, I'm not too sure, but there are two methods on the PresentationContext. I don't think these two lend themselves very well to EL; you can always write a scriptlet.
/**
 * Return the metadata elements for this control matching the supplied name.
 * @return a MetaData element if a match is found, otherwise null.
 */
public MetaData getMetaData(String name)

/**
 * Return the metadata elements for this control.
 * @return a non-null array of <code>MetaData</code> objects.
 */
public MetaData[] getMetaDatas()

For the properties:

/**
 * Get the extra properties string of the component, if it exists. This value may be null.
 * Properties are formatted as in the following example:
 *     my-first-key: my-first-value; my-second-key: my-second-value;
 * Any number of properties may be in a properties string.
 * @return the control's extra properties, if set
 * @see #getParsedProperties
 */
public String getProperties()

/**
 * Get the extra properties of the component, if they exist. This value will not be null.
 * The <code>Properties</code> class returned by this method is the parsed view of
 * those from {@link #getProperties()}.
 * @return the control's extra properties, if set; an empty Properties instance if not
 * @see #getProperties
 */
public Properties getParsedProperties()

/**
 * Get a property of the underlying component. This is a convenience method.
 * The <code>key</code> argument should not be null.
 * @param key the property key
 * @return the value associated with the specified key, if it exists
 */
public String getProperty(String key)
    BTW: if you are doing a lot of these say > 100, I'd recommend the properties vs the Meta Data.
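As a side note, the properties-string format quoted in that documentation is simple enough to parse by hand if you ever need to outside the portal API; getParsedProperties() does this for you, but a sketch of the equivalent logic looks like this:

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class ExtraProperties {
    // Parse the "key: value; key2: value2;" format from the
    // getProperties() documentation into an ordered map.
    static Map<String, String> parse(String props) {
        Map<String, String> out = new LinkedHashMap<>();
        for (String pair : props.split(";")) {
            int colon = pair.indexOf(':');
            if (colon > 0) {
                out.put(pair.substring(0, colon).trim(),
                        pair.substring(colon + 1).trim());
            }
        }
        return out;
    }
}
```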

  • Maxl parameterizing entire path using servername

    I want to use environment (or local if environments not possible) to parameterize three paths for source files for Maxl imports and destination files for exports. Easy enough when the path is the essbase server ($Arborpath etc.) The problem arises when I want to use a path with a servername in a variable. For example:
    import database sample.basic data from data_file "$Testpath\\Calcdat.txt" on error abort;
    where Testpath = \\EssServerName\Dirname\SubDirname
At the moment TestPath = \\Forbin\MTGTools\Maxl, but I've tried many permutations of slashes, double slashes (I even tried triple slashes on the servername, which works in some circumstances), quotes, etc., along with double quotes surrounding single quotes.
    I've gotten variations of this to work fine right up to the point of using the servername with the variable.
    The expanded statement looks perfect in the console black screen but of course that's not reliable.
    Thanks much for your help.
    Ron

    Okay, how about this:
    spool on to "\\$COMPUTERNAME\\temp\\For_Ron.log" ;
    login "admin" "password" on "$COMPUTERNAME" ;
exit ;
Which produces the log file (in c:\temp):
    MAXL> login "admin" "password" on "DEMO1113" ;
    OK/INFO - 1051034 - Logging in user [admin].
    OK/INFO - 1051035 - Last login on Tuesday, March 30, 2010 5:25:13 PM.
    OK/INFO - 1241001 - Logged in to Essbase.
    I'm not seeing a need for quadruple backslashes to get UNC naming, which I think is the need (if this indeed be on Windows). *nix I couldn't say.
    I don't think import would be any different than spool, although I could always be surprised.
    Regards,
    Cameron Lackpour
    Edited by: CL on Mar 30, 2010 5:28 PM
    Of course I could be missing the whole point, again, so who knows.

  • Can the table name in a cursor be parameterized?

    I would like to do the following in pl/sql
    Procedure my_proc ( varchar2 table_name) is
    CURSOR my_cursor ( table_name VARCHAR2 ) IS SELECT some_column FROM table_name ;
i.e. parameterize the table name. I know it is possible to parameterize the SELECT clause by doing the above but, as written, my example will not compile. So I'm wondering if it is even possible to create a cursor like this.
    Thanks for any help/advice,
    -=beeky

    I wanted to add that I have no control over the database design. The reason I want to parameterize a cursor is to allow a single procedure to do exactly the same thing to six tables. These tables contain data for different geographic regions and are supplied by an outside party. My script essentially merges these tables into our relational model.
    Thanks again to all who replied,
    -=beeky

  • Running the same query against numerous databases

    My organization has 20+ clients. Each client has a separate database in the same MSSQL server*. For these purposes, the databases are structurally identical. I've been tasked to gather information from tables in all of these databases. I can use the same query on multiple clients - my issue is switching between databases.
    The naive solution would be for me to declare 20+ DataSource instances, and have some mapping from client name to DataSource. I think that's going to be a real mess and my inclination is to avoid it.
    Another possibility would be dynamically generating the PreparedStatement based on the client, something like
    "SELECT foo, bar " + "FROM " + client.getDatabaseName() + ".dbo.my_table";
    My understanding, though, is that it's a bad practice to have dynamically generated prepared statement calls and not possible to parameterize the 'from' clause.
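One common compromise (sketched below; the client and database names are made up) is to restrict the generated part of the statement to a whitelist, so only known database names can ever reach the SQL text, while any user-supplied data values still go through PreparedStatement parameters:

```java
import java.util.Map;

public class ClientQueries {
    // Whitelist of client -> database name. Only values from this map
    // are ever concatenated into SQL, so the dynamic FROM stays safe.
    static final Map<String, String> CLIENT_DBS =
            Map.of("clientX", "ClientXDb", "clientY", "ClientYDb");

    static String countQuery(String client) {
        String db = CLIENT_DBS.get(client);
        if (db == null) {
            throw new IllegalArgumentException("Unknown client: " + client);
        }
        return "SELECT COUNT(*) FROM " + db + ".dbo.my_table";
    }
}
```

Each returned string would then be prepared once per client; the dynamic part is bounded by the map, which is a much smaller maintenance burden than 20+ DataSource definitions.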
    Another thing I've been digging around at is using a transaction and an SQL Server call to alter what database calls from that transaction go to. I thought that could be done, but I'm not finding anything which suggests it's possible.
    I guess I've identified my problem, but I'm thrashing on how to solve it. I'm still newish to non-trivial database interactions, so I'd appreciate any input. If I posted this in the wrong place, please feel free to yell at me and move it.
    Thanks in advance,
    Eric
    * Not my fault. I'm new.

    I'm collection usage and storage metrics. "Client X has 200 records, Client Y has 250. Client X has 15 active users, Client Y has 12."
    The data is being read from the client databases and stored into a separate metrics database.
I anticipate it will be mostly SELECT COUNT(*) calls.
    I can. For the maintenance reasons I mentioned above, I'd rather not have to maintain all those datasources if I don't have to.
    No, the invocations will be sequential.
for (each metric type) {
    for (each client) {
        gather information;
        add to batch;
    }
    update batch in metrics database;
}
I guess I was hoping there was a Sekrit SQL way to swap which database inside a server instance the connection was pointing to. I'm guessing that's not going to happen, and I'm left with N datasources for N clients or generated PreparedStatements which swap out the database name in the FROM portion.
    The data is being written to a table in a single, separate metrics database.
    I'm trying not to, honest.
    Edited by: 919852 on Mar 12, 2012 8:38 AM

  • Parameterized Mapping

I have an object that has a one-to-many collection. It is currently a managed, privately owned, lazily initialized (indirection) mapping. Is it possible to parameterize this mapping by passing in, or calling a method to get, the parameters?
I'm currently using Java to define the mappings, so I have full control over everything. I know this is possible if I don't want it to be a managed collection, but I would want to keep it managed so I can use the TopLink cache when grabbing the attached collection.

Correct, I would like to set the date and then have the collection filled with the attributes matching that date.
So there is no way to pass in an argument that will be used as part of the selection criteria? In your example I wouldn't want to load all of the objects and then manually add the objects that match the date to the collection; that would require loading all of the objects from the database. I need a way to load only a subset of the objects from the database. The way I would envision this happening would be to set the argument by calling a method on the object, such as getDate(), which would pass that value into the SQL WHERE clause.
I've been looking at the documentation: is there a way to do this using a parameterized database call? It looks like I can set the selectionCall for a mapping; I just don't know how to pass in the arguments. I need the ability to set the method getDate as the argument for the filter clause.
    Any additional insight would be helpful. Obviously there are workarounds, but it would be helpful to have this ability natively supported within toplink/eclipselink.

  • How I can attach files on a Web Form

    Hi everybody,
When I add a document on a web form, I have to attach the document from the workspace directory. The server is on Linux. Instead of putting the files on the Linux server, is it possible to parameterize a local directory so that it is easier to attach the file? Or does someone know of a link with some information about my question?
    Regards

    Hi,
    You can attach documents at two steps.
1. Go to the Explore screen in Workspace and import the file(s) to Workspace via File -> Import. Make sure you have defined the necessary security on the folder and document alike.
    2. Go to Planning, open a form, select a cell, click on "Add/Edit document" button, follow the steps and choose the file that you want to attach from workspace.
    There is no dependency for attaching documents on the OS.
    Cheers,
    Alp

  • 2LIS_08TRFKZ NETWR splited by weight of delivery positions

    Hello experts,
In the 2LIS_08TRFKZ extractor, the cost is split proportionately according to the weight of all the positions of the delivery.
Is it possible to split only by the weight of the position of the finished product or main product of the delivery? To do that, is it necessary to use a user exit, or is it possible to parameterize the extractor specifically? How?
In the case of a user exit, where? In CMOD at the time of extraction, or through function module MCEX_BW_LO_API?
    Thanks in advanced.

    Hi
Cost splitting is standard functionality of the extractor and you cannot modify it. However, you can write CMOD code with the logic you described.
One more thing: to populate the data, create a Z-field and update it.
    Regards
    YuvaraajP

  • Modifying JCA Operation in WSDL's using ant script

    Hi
I have written an ant script which deploys BPEL processes to multiple environments based on ant-orabpel.properties. I used the customizewsdl tag for modifying the URL that points to the other environments. I need some info on how to change the JCA operation values inside the WSDL that gets created when we configure any adapter, like the file adapter or MQ adapter.
    Regards,
    Ravi.

It's possible to parameterize the JCA properties and then, at deployment time, specify which values to use via different property files. See this thread...
    Re: Partner Link properties BPEL.XML 10.1.3.1

  • Welcome to the SAP Education & Research forum!

    Welcome to the SAP Education & Research forum!
    This forum is a great way for you to share your SAP knowledge or get help from SAP experts worldwide on the range of SAP Education & Research topics and products. Before you post, take a look at our [Forum Guide|http://wiki.sdn.sap.com/wiki/display/SCNGUIDE/Forums].
    Please also check our [Wiki FAQ|http://wiki.sdn.sap.com/wiki/display/HER/FAQ] and use the search before you post. More detailed [rules of engagement|https://wiki.sdn.sap.com/wiki/display/HOME/RulesofEngagement], can be found on the Wiki.
    Implementation guidelines, cookbooks, etc, can be found on the [BPX page|http://www.sdn.sap.com/irj/bpx/highered] for SAP Education & Research
    Happy posting!
    The Community Team
    Rob and Silke

    I'm not sure that it is actually possible to parameterize the DYNR. Anyway, it would still leave you the problem that the field descriptors in the command interface would all be wrong.
    My suggestion for getting around this is to make several recordings using the SAPGUI command, creating a new command for each screen change. You can then edit the script so that the different cases between which you want to differentiate are separated by IF...ELSEIF...ENDIF  commands.
    You might then have something that looks like this (pseudocode):
    * SAPGUI command for common "opening part" of transaction
    IF (subtype = 'LA').
    *SAPGUI covering screen 2001
    ELSEIF (subtype = 'LW').
    * SAGPGUI covering screen 2000
    ENDIF.
    * SAPGUI for common "closing part" of transaction - if relevant.

  • Parameterizing the text data recorded using SAPGUI.

    Hi all,
Is it possible to record the text input we are providing at item data level, for example the packing item text for transaction VA01, using SAPGUI recording?
I am able to capture the text input after recording those values, but I am not able to parameterize it so that I can change the data to be given as input each time.
If it is possible to parameterize, what steps do we need to follow?
It is urgent.
    Regards,
    Vishwakarma.

I don't think you can use the recording method to upload your long texts, since these texts are not stored like other fields in the table. They can instead be written using the function module SAVE_TEXT, in which you need to provide the ID, object, and name. You can get an idea of these parameters by going into the SAPscript editor of the text (in the transaction itself, below the box where you enter the text, there is a DETAIL button; click it), then in the editor choose GOTO > HEADER and you will see the parameters:
Text Name
Language
Text ID
Text object
These are to be passed to the FM to upload the text properly. The parameters NAME and ID depend on the item number; the other two are fixed.
    Hope it helps.
    regards,
    Bikash

  • SetParam fails for ROWNUM

    Hi experts,
    I have SQL statement as select col1 from table1 where rownum <= :number;
I am trying to execute this SQL statement from TTClasses, and for :number I call the setParam function. But it fails for SQLINT and SQLBIGINT.
Is it possible to parameterize 'rownum' through TTClasses, and if yes, what should the type for setParam be?
    Any help would be of greatly appreciated.
    Thanks

Strictly speaking, rownum is of type NUMBER. However, normal type conversion rules should apply, so SQLINT or SQLBIGINT should work. This sounds like a bug; are you able to log an SR for this?
    Chris

  • Welcome to the SAP on DB2 UDB for i5/OS forum

    <b>Welcome to the SAP on DB2 UDB for i5/OS forum!</b>
Participate in the community of users running SAP on a System i5. This forum is intended to be the place to ask your questions concerning SAP on System i5, to answer other users' questions, and to share your experiences.
    Please read the <i>Rules of Engagement</i> before you post your first message here.
    Be part of the SAP on system i5 community and enjoy this forum!
    Kind regards,
    Jan Stallkamp

